Darrow, Lyndsey A.; Klein, Mitchel; Flanders, W. Dana; Mulholland, James A.; Tolbert, Paige E.; Strickland, Matthew J.
2014-01-01
Upper and lower respiratory infections are common in early childhood and may be exacerbated by air pollution. We investigated short-term changes in ambient air pollutant concentrations, including speciated particulate matter less than 2.5 μm in diameter (PM2.5), in relation to emergency department (ED) visits for respiratory infections in young children. Daily counts of ED visits for bronchitis and bronchiolitis (n = 80,399), pneumonia (n = 63,359), and upper respiratory infection (URI) (n = 359,246) among children 0–4 years of age were collected from hospitals in the Atlanta, Georgia, area for the period 1993–2010. Daily pollutant measurements were combined across monitoring stations using population weighting. In Poisson generalized linear models, 3-day moving average concentrations of ozone, nitrogen dioxide, and the organic carbon fraction of PM2.5 were associated with ED visits for pneumonia and URI. Ozone associations were strongest and were observed at low (cold-season) concentrations; a 1–interquartile range increase predicted a 4% increase (95% confidence interval: 2%, 6%) in visits for URI and an 8% increase (95% confidence interval: 4%, 13%) in visits for pneumonia. Rate ratios tended to be higher in the 1- to 4-year age group compared with infants. Results suggest that primary traffic pollutants, ozone, and the organic carbon fraction of PM2.5 exacerbate upper and lower respiratory infections in early life, and that the carbon fraction of PM2.5 is a particularly harmful component of the ambient particulate matter mixture. PMID:25324558
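The exposure metric in this abstract combines two simple operations: a population-weighted average across monitoring stations, then a trailing 3-day moving average, with effect sizes reported per interquartile-range increase under a log-linear Poisson model. A minimal numpy sketch (function names and the example numbers are illustrative, not from the study):

```python
import numpy as np

def population_weighted_series(monitor_series, populations):
    """Combine daily pollutant series from several monitors into one
    citywide series, weighting each monitor by the population it represents.
    monitor_series: array (n_days, n_monitors); populations: 1D array."""
    w = np.asarray(populations, dtype=float)
    w = w / w.sum()
    return np.asarray(monitor_series) @ w

def moving_average_3day(x):
    """Trailing 3-day moving average, as in the study's exposure window."""
    return np.convolve(np.asarray(x, dtype=float), np.ones(3) / 3.0, mode="valid")

def pct_increase_per_iqr(beta, iqr):
    """Percent change in visit rate per IQR increase under a log-linear
    Poisson model: 100 * (exp(beta * IQR) - 1)."""
    return (np.exp(beta * iqr) - 1.0) * 100.0

# Example: two monitors with population weights 3:1
daily = np.array([[10.0, 20.0], [12.0, 16.0], [14.0, 18.0], [20.0, 24.0]])
combined = population_weighted_series(daily, [30000, 10000])
exposure = moving_average_3day(combined)
```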
NASA Astrophysics Data System (ADS)
Loredo, Thomas
The key, central objectives of the proposed Time Series Explorer project are to develop an organized collection of software tools for analysis of time series data in current and future NASA astrophysics data archives, and to make the tools available in two ways: as a library (the Time Series Toolbox) that individual science users can use to write their own data analysis pipelines, and as an application (the Time Series Automaton) providing an accessible, data-ready interface to many Toolbox algorithms, facilitating rapid exploration and automatic processing of time series databases. A number of time series analysis methods will be implemented, including techniques that range from standard ones to state-of-the-art developments by the proposers and others. Most of the algorithms will be able to handle time series data subject to real-world problems such as data gaps, sampling that is otherwise irregular, asynchronous sampling (in multi-wavelength settings), and data with non-Gaussian measurement errors. The proposed research responds to the ADAP element supporting the development of tools for mining the vast reservoir of information residing in NASA databases. The tools that will be provided to the community of astronomers studying variability of astronomical objects (from nearby stars and extrasolar planets, through galactic and extragalactic sources) will revolutionize the quality of timing analyses that can be carried out, and greatly enhance the scientific throughput of all NASA astrophysics missions past, present, and future. The Automaton will let scientists explore time series (individual records or large databases) with the most informative and useful analysis methods available, without having to develop the tools themselves or understand the computational details. Both elements, the Toolbox and the Automaton, will enable deep but efficient exploratory time series data analysis, which is why we have named the project the Time Series Explorer.
Dugan, Jon M.
2007-11-02
TSDB is a Python module for storing large volumes of time series data. TSDB stores data in binary files indexed by a timestamp. Aggregation functions (such as rate, sum, avg, etc.) can be performed on the data, but data is never discarded. TSDB is presently best suited for SNMP data but new data types are easily added.
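The storage-plus-aggregation model described above can be sketched in a few lines. This toy class only illustrates the idea (timestamp-indexed rows, aggregation computed on read, raw data never discarded) and does not reflect TSDB's actual on-disk binary format or API:

```python
from bisect import insort

class TinyTSDB:
    """Minimal in-memory sketch of a timestamp-indexed store
    (hypothetical API; the real TSDB stores binary files on disk)."""
    def __init__(self):
        self._rows = []                 # sorted list of (timestamp, value)

    def insert(self, ts, value):
        insort(self._rows, (ts, value))  # keep rows ordered by timestamp

    def select(self, t0, t1):
        return [(t, v) for t, v in self._rows if t0 <= t <= t1]

    def aggregate(self, t0, t1, func):
        """Aggregate raw values in [t0, t1]; the raw data stays intact."""
        rows = self.select(t0, t1)
        values = [v for _, v in rows]
        if func == "sum":
            return sum(values)
        if func == "avg":
            return sum(values) / len(values)
        if func == "rate":               # change per unit time, e.g. SNMP counters
            (ta, va), (tb, vb) = rows[0], rows[-1]
            return (vb - va) / (tb - ta)
        raise ValueError(func)

db = TinyTSDB()
for ts, v in [(0, 100), (10, 150), (20, 230)]:
    db.insert(ts, v)
```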
Disaggregating times series data
Joubert, S.B.; Burr, T.; Scovel, J.C.
1997-05-01
This report describes our experiences with disaggregating time series data. Suppose we have gathered data every two seconds and want to guess the data at one-second intervals. Under certain assumptions, there are several reasonable disaggregation methods as well as several performance measures to judge their performance. Here we present results for both simulated and real data for two methods using several performance criteria.
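As a concrete instance of the setup described (data every two seconds, estimated at one-second intervals), linear interpolation is one of the simple disaggregation methods one might compare; the function below is an illustrative sketch, not the report's method:

```python
import numpy as np

def disaggregate_linear(t_coarse, x_coarse, step=1.0):
    """Guess values on a finer time grid by linear interpolation between
    the coarse samples. This is one of several reasonable methods; others
    (splines, mean-preserving schemes) may score better on some criteria."""
    t_coarse = np.asarray(t_coarse, dtype=float)
    t_fine = np.arange(t_coarse[0], t_coarse[-1] + step / 2, step)
    return t_fine, np.interp(t_fine, t_coarse, x_coarse)

# Data gathered every two seconds, disaggregated to one-second intervals
t2 = np.array([0.0, 2.0, 4.0])
x2 = np.array([1.0, 3.0, 2.0])
t1, x1 = disaggregate_linear(t2, x2)
```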
GPS Position Time Series @ JPL
NASA Technical Reports Server (NTRS)
Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen
2013-01-01
Different flavors of GPS time series analysis at JPL all use the same GPS Precise Point Positioning Analysis raw time series; variations in time series analysis/post-processing are driven by different users.
- JPL Global Time Series/Velocities: researchers studying the reference frame, combining with VLBI/SLR/DORIS
- JPL/SOPAC Combined Time Series/Velocities: crustal deformation for tectonic, volcanic, and ground water studies
- ARIA Time Series/Coseismic Data Products: hazard monitoring and response focused
- ARIA data system designed to integrate GPS and InSAR: GPS tropospheric delay is used for correcting InSAR, and Caltech's GIANT time series analysis uses GPS to correct orbital errors in InSAR
- Zhen Liu will talk on InSAR time series analysis
Ordinal analysis of time series
NASA Astrophysics Data System (ADS)
Keller, K.; Sinn, M.
2005-10-01
In order to develop fast and robust methods for extracting qualitative information from non-linear time series, Bandt and Pompe have proposed to consider time series from the purely ordinal viewpoint. On the basis of counting ordinal patterns, which describe the up-and-down patterns in a time series, they have introduced the concept of permutation entropy for quantifying the complexity of the system behind a time series. The permutation entropy provides only one detail of the ordinal structure of a time series. Here we present a method for extracting the whole ordinal information.
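A minimal sketch of the Bandt-Pompe idea: slide a window over the series, reduce each window to its ordinal pattern (the permutation that sorts it), and take the Shannon entropy of the pattern distribution. The normalisation and tie handling below are illustrative choices:

```python
import math
from collections import Counter

def permutation_entropy(x, order=3):
    """Bandt-Pompe permutation entropy: count ordinal patterns of length
    `order` and return the Shannon entropy of their distribution,
    normalised by log(order!) so the result lies in [0, 1]."""
    patterns = Counter()
    for i in range(len(x) - order + 1):
        window = x[i:i + order]
        # ordinal pattern = the index permutation that sorts the window
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        patterns[pattern] += 1
    n = sum(patterns.values())
    h = -sum((c / n) * math.log(c / n) for c in patterns.values())
    return h / math.log(math.factorial(order))

# A strictly increasing series shows a single ordinal pattern -> entropy 0
assert permutation_entropy([1, 2, 3, 4, 5, 6], order=3) == 0.0
```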
ERIC Educational Resources Information Center
Cawley, John; Spiess, C. Katharina
2008-01-01
In developed countries, obesity tends to be associated with worse labor market outcomes. One possible reason is that obesity leads to less human capital formation early in life. This paper investigates the association between obesity and the developmental functioning of children at younger ages (2-4 years) than ever previously examined. Data from…
Permutations and time series analysis.
Cánovas, Jose S; Guillamón, Antonio
2009-12-01
The main aim of this paper is to show how the use of permutations can be useful in the study of time series analysis. In particular, we introduce a test for checking the independence of a time series which is based on the number of admissible permutations on it. The main improvement in our tests is that we are able to give a theoretical distribution for independent time series.
NASA Astrophysics Data System (ADS)
Allan, Alasdair
2014-06-01
FROG performs time series analysis and display. It provides a simple user interface for astronomers wanting to do time-domain astrophysics but still offers the powerful features found in packages such as PERIOD (ascl:1406.005). FROG includes a number of tools for manipulation of time series. Among other things, the user can combine individual time series, detrend series (multiple methods) and perform basic arithmetic functions. The data can also be exported directly into the TOPCAT (ascl:1101.010) application for further manipulation if needed.
Time Keeping and Working Memory Development in Early Adolescence: A 4-Year Follow-Up
ERIC Educational Resources Information Center
Forman, Helen; Mantyla, Timo; Carelli, Maria G.
2011-01-01
In this longitudinal study, we examined time keeping in relation to working memory (WM) development. School-aged children completed two tasks of WM updating and a time monitoring task in which they indicated the passing of time every 5 min while watching a film. Children completed these tasks first when they were 8 to 12 years old and then 4 years…
Time series with tailored nonlinearities
NASA Astrophysics Data System (ADS)
Räth, C.; Laut, I.
2015-10-01
It is demonstrated how to generate time series with tailored nonlinearities by inducing well-defined constraints on the Fourier phases. Correlations between the phase information of adjacent phases and (static and dynamic) measures of nonlinearities are established and their origin is explained. By applying a set of simple constraints on the phases of an originally linear and uncorrelated Gaussian time series, the observed scaling behavior of the intensity distribution of empirical time series can be reproduced. The power law character of the intensity distributions being typical for, e.g., turbulence and financial data can thus be explained in terms of phase correlations.
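The construction starts from the Fourier representation of a linear, uncorrelated Gaussian series. The sketch below shows the unconstrained end of that construction, a standard phase-randomized surrogate that preserves the power spectrum; the paper's tailored nonlinearities come from imposing structured constraints on these phases instead of randomizing them:

```python
import numpy as np

def phase_randomized_surrogate(x, rng):
    """Return a surrogate series with the same power spectrum as x but
    fully randomized Fourier phases (the 'linear' end of the construction;
    tailored nonlinearities would constrain the phases instead)."""
    n = len(x)
    X = np.fft.rfft(x)
    amplitudes = np.abs(X)
    phases = rng.uniform(0, 2 * np.pi, size=len(X))
    phases[0] = 0.0                      # keep the DC component real
    if n % 2 == 0:
        phases[-1] = 0.0                 # the Nyquist bin must stay real
    return np.fft.irfft(amplitudes * np.exp(1j * phases), n=n)

rng = np.random.default_rng(0)
x = rng.standard_normal(256)
s = phase_randomized_surrogate(x, rng)
```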
A Measure of Inspection Time in 4-Year-Old Children: The Benny Bee IT Task
ERIC Educational Resources Information Center
Williams, Sarah E.; Turley, Christopher; Nettelbeck, Ted; Burns, Nicholas R.
2009-01-01
Inspection time (IT) measures speed of information processing without the confounding influence of motor speed. While IT has been found to relate to cognitive abilities in adults and older children, no measure of IT has been validated for use with children younger than 6 years. This study examined the validity of a new measure of IT for preschool…
Exploring the Determinants of Time-to-Degree in Public 4-Year Colleges
ERIC Educational Resources Information Center
Zhu, Lillian
2004-01-01
The study examines the factors that impact the students who attained a bachelor's degree in four-years in a public four-year college. The study focuses on students' pre-college preparation, financial aids, academic performance, work-study time arrangement, and intention of completing a bachelor degree at the entering institution. The sample…
Clustering of financial time series
NASA Astrophysics Data System (ADS)
D'Urso, Pierpaolo; Cappelli, Carmela; Di Lallo, Dario; Massari, Riccardo
2013-05-01
This paper addresses the topic of classifying financial time series in a fuzzy framework, proposing two fuzzy clustering models, both based on GARCH models. In general, clustering of financial time series, due to their peculiar features, requires the definition of suitable distance measures. To this aim, the first fuzzy clustering model exploits the autoregressive representation of GARCH models and employs, in the framework of a partitioning around medoids algorithm, the classical autoregressive metric. The second fuzzy clustering model, also based on the partitioning around medoids algorithm, uses the Caiado distance, a Mahalanobis-like distance based on estimated GARCH parameters and covariances, which takes into account the information about the volatility structure of the time series. In order to illustrate the merits of the proposed fuzzy approaches, an application to the problem of classifying 29 time series of Euro exchange rates against international currencies is presented and discussed, also comparing the fuzzy models with their crisp versions.
Entropy of electromyography time series
NASA Astrophysics Data System (ADS)
Kaufman, Miron; Zurcher, Ulrich; Sung, Paul S.
2007-12-01
A nonlinear analysis based on Renyi entropy is applied to electromyography (EMG) time series from back muscles. The time dependence of the entropy of the EMG signal exhibits a crossover from a subdiffusive regime at short times to a plateau at longer times. We argue that this behavior characterizes complex biological systems. The plateau value of the entropy can be used to differentiate between healthy and low back pain individuals.
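A histogram-based estimate of the Renyi entropy, evaluated over sliding windows, is enough to trace an entropy-versus-time curve of the kind discussed. The bin count, window length, and order q below are illustrative choices, not the paper's settings:

```python
import numpy as np

def renyi_entropy(signal, q=2.0, bins=32):
    """Renyi entropy of order q, estimated from a histogram of the
    signal's amplitude distribution (a simplified stand-in for the
    time-dependent estimate described in the abstract)."""
    counts, _ = np.histogram(signal, bins=bins)
    p = counts[counts > 0] / counts.sum()
    if q == 1.0:
        return float(-np.sum(p * np.log(p)))       # Shannon limit
    return float(np.log(np.sum(p ** q)) / (1.0 - q))

def entropy_vs_time(signal, q=2.0, window=200, step=50):
    """Entropy on sliding windows, to trace its time dependence."""
    return [renyi_entropy(signal[i:i + window], q)
            for i in range(0, len(signal) - window + 1, step)]

rng = np.random.default_rng(1)
emg_like = rng.standard_normal(2000)               # stand-in for an EMG record
curve = entropy_vs_time(emg_like)
```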
Random time series in astronomy.
Vaughan, Simon
2013-02-13
Progress in astronomy comes from interpreting the signals encoded in the light received from distant objects: the distribution of light over the sky (images), over photon wavelength (spectrum), over polarization angle and over time (usually called light curves by astronomers). In the time domain, we see transient events such as supernovae, gamma-ray bursts and other powerful explosions; we see periodic phenomena such as the orbits of planets around nearby stars, radio pulsars and pulsations of stars in nearby galaxies; and we see persistent aperiodic variations ('noise') from powerful systems such as accreting black holes. I review just a few of the recent and future challenges in the burgeoning area of time domain astrophysics, with particular attention to persistently variable sources, the recovery of reliable noise power spectra from sparsely sampled time series, higher order properties of accreting black holes, and time delays and correlations in multi-variate time series.
Introduction to Time Series Analysis
NASA Technical Reports Server (NTRS)
Hardin, J. C.
1986-01-01
The field of time series analysis is explored from its logical foundations to the most modern data analysis techniques. The presentation is developed, as far as possible, for continuous data, so that the inevitable use of discrete mathematics is postponed until the reader has gained some familiarity with the concepts. The monograph seeks to provide the reader with both the theoretical overview and the practical details necessary to correctly apply the full range of these powerful techniques. In addition, the last chapter introduces many specialized areas where research is currently in progress.
Multivariate Time Series Similarity Searching
Wang, Jimin; Zhu, Yuelong; Li, Shijin; Wan, Dingsheng; Zhang, Pengcheng
2014-01-01
Multivariate time series (MTS) datasets are very common in various financial, multimedia, and hydrological fields. In this paper, a dimension-combination method is proposed to search similar sequences for MTS. Firstly, the similarity of each single-dimension series is calculated; then the overall similarity of the MTS is obtained by synthesizing the single-dimension similarities with a weighted BORDA voting method. The dimension-combination method can use existing similarity searching methods. Several experiments, which used classification accuracy as a measure, were performed on six datasets from the UCI KDD Archive to validate the method. The results show the advantage of the approach compared to traditional similarity measures, such as Euclidean distance (ED), dynamic time warping (DTW), point distribution (PD), PCA similarity factor (SPCA), and extended Frobenius norm (Eros), for MTS datasets in some ways. Our experiments also demonstrate that no measure fits all datasets, and the proposed measure is one choice for similarity searches. PMID:24895665
Kolmogorov space in time series data
NASA Astrophysics Data System (ADS)
Kanjamapornkul, Kabin; Pinčák, Richard
2016-10-01
We provide the proof that the space of time series data is a Kolmogorov space with the $T_{0}$-separation axiom, using the loop space of time series data. In our approach we define a cyclic coordinate of the intrinsic time scale of time series data after empirical mode decomposition. A spinor field of time series data comes from the rotation of data around the price and time axes, obtained by defining a new extra dimension for time series data. We show that there exist eight hidden dimensions in the Kolmogorov space for time series data. Our concept is realized as the algorithm of empirical mode decomposition and intrinsic time scale decomposition, and it is subsequently used for preliminary analysis of real time series data.
Time series models of symptoms in schizophrenia.
Tschacher, Wolfgang; Kupper, Zeno
2002-12-15
The symptom courses of 84 schizophrenia patients (mean age: 24.4 years; mean previous admissions: 1.3; 64% males) of a community-based acute ward were examined to identify dynamic patterns of symptoms and to investigate the relation between these patterns and treatment outcome. The symptoms were monitored by systematic daily staff ratings using a scale composed of three factors: psychoticity, excitement, and withdrawal. Patients showed moderate to high symptomatic improvement, documented by effect size measures. Each of the 84 symptom trajectories was analyzed by time series methods using vector autoregression (VAR), which models the day-to-day interrelations between symptom factors. Multiple and stepwise regression analyses were then performed on the basis of the VAR models. Two VAR parameters were found to be associated significantly with favorable outcome in this exploratory study: 'withdrawal preceding a reduction of psychoticity' as well as 'excitement preceding an increase of withdrawal'. The findings were interpreted as generating hypotheses about how patients cope with psychotic episodes.
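The day-to-day interrelations can be captured by a first-order VAR fitted by least squares. The sketch below simulates a 3-factor course and recovers the lag matrix; the study's actual model order, scale, and estimation details may differ:

```python
import numpy as np

def fit_var1(y):
    """Least-squares fit of a first-order vector autoregression
    y_t = c + A @ y_{t-1} + e_t, where y has shape (n_days, n_factors).
    A[i, j] is the effect of factor j today on factor i tomorrow."""
    past = np.hstack([np.ones((len(y) - 1, 1)), y[:-1]])   # intercept + lag-1 values
    coef, *_ = np.linalg.lstsq(past, y[1:], rcond=None)
    return coef[0], coef[1:].T                              # c, A

# Simulated 3-factor course (e.g. psychoticity, excitement, withdrawal)
rng = np.random.default_rng(2)
A_true = np.array([[0.5, 0.0, -0.2],
                   [0.1, 0.4,  0.0],
                   [0.0, 0.2,  0.5]])
y = np.zeros((500, 3))
for t in range(1, 500):
    y[t] = A_true @ y[t - 1] + rng.standard_normal(3) * 0.1
c_hat, A_hat = fit_var1(y)
```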
A Review of Subsequence Time Series Clustering
Zolhavarieh, Seyedjamal; Aghabozorgi, Saeed; Teh, Ying Wah
2014-01-01
Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and backgrounds related to subsequence time series clustering. The categorization of the literature reviews is divided into three groups: preproof, interproof, and postproof period. Moreover, various state-of-the-art approaches in performing subsequence time series clustering are discussed under each of the following categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies. PMID:25140332
Nonparametric causal inference for bivariate time series.
McCracken, James M; Weigel, Robert S
2016-02-01
We introduce new quantities for exploratory causal inference between bivariate time series. The quantities, called penchants and leanings, are computationally straightforward to apply, follow directly from assumptions of probabilistic causality, do not depend on any assumed models for the time series generating process, and do not rely on any embedding procedures; these features may provide a clearer interpretation of the results than those from existing time series causality tools. The penchant and leaning are computed based on a structured method for computing probabilities.
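One simplified reading of the penchant, for boolean event series, is the difference between the probability of the effect given the lagged cause and given its absence. The function below is a toy estimator in that spirit, not the paper's exact, more structured definition:

```python
def penchant(cause, effect, lag=1):
    """Toy causal penchant for two boolean event series:
    P(effect_t | cause_{t-lag}) - P(effect_t | not cause_{t-lag}).
    Positive values lean toward 'cause drives effect'."""
    pairs = list(zip(cause[:-lag], effect[lag:]))
    with_cause = [e for c, e in pairs if c]
    without_cause = [e for c, e in pairs if not c]
    p_given_c = sum(with_cause) / len(with_cause)
    p_given_not_c = sum(without_cause) / len(without_cause)
    return p_given_c - p_given_not_c

# Effect fires exactly one step after the cause
cause  = [1, 0, 0, 1, 0, 1, 0, 0]
effect = [0, 1, 0, 0, 1, 0, 1, 0]
assert penchant(cause, effect) == 1.0
```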
Forecasting Enrollments with Fuzzy Time Series.
ERIC Educational Resources Information Center
Song, Qiang; Chissom, Brad S.
The concept of fuzzy time series is introduced and used to forecast the enrollment of a university. Fuzzy time series, an aspect of fuzzy set theory, forecasts enrollment using a first-order time-invariant model. To evaluate the model, the conventional linear regression technique is applied and the predicted values obtained are compared to the…
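A heavily simplified version of a first-order fuzzy time series forecast: partition the universe of discourse into intervals (the fuzzy sets A1..An), record which interval follows which in the history, and forecast from the successors of the current interval. The actual Song-Chissom model works with fuzzy relation matrices; this sketch keeps only the interval bookkeeping, and the enrollment figures are illustrative:

```python
import numpy as np

def fuzzy_forecast(history, n_intervals=7):
    """First-order fuzzy-time-series-style forecast, heavily simplified:
    fuzzify each observation to an interval, learn interval-to-interval
    transitions, and forecast the mean midpoint of the successors of the
    last observed interval."""
    history = np.asarray(history, dtype=float)
    lo, hi = history.min(), history.max()
    edges = np.linspace(lo, hi, n_intervals + 1)
    mids = (edges[:-1] + edges[1:]) / 2
    # fuzzify: index of the interval containing each observation
    labels = np.clip(np.searchsorted(edges, history, side="right") - 1,
                     0, n_intervals - 1)
    successors = {}
    for a, b in zip(labels[:-1], labels[1:]):
        successors.setdefault(int(a), []).append(int(b))
    current = int(labels[-1])
    nxt = successors.get(current, [current])
    return float(np.mean([mids[j] for j in nxt]))

enrollments = [13055, 13563, 13867, 14696, 15460, 15311, 15603,
               15861, 16807, 16919, 16388, 15433, 15497, 15145]
forecast = fuzzy_forecast(enrollments)
```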
Statistical criteria for characterizing irradiance time series.
Stein, Joshua S.; Ellis, Abraham; Hansen, Clifford W.
2010-10-01
We propose and examine several statistical criteria for characterizing time series of solar irradiance. Time series of irradiance are used in analyses that seek to quantify the performance of photovoltaic (PV) power systems over time. Time series of irradiance are either measured or are simulated using models. Simulations of irradiance are often calibrated to or generated from statistics for observed irradiance and simulations are validated by comparing the simulation output to the observed irradiance. Criteria used in this comparison should derive from the context of the analyses in which the simulated irradiance is to be used. We examine three statistics that characterize time series and their use as criteria for comparing time series. We demonstrate these statistics using observed irradiance data recorded in August 2007 in Las Vegas, Nevada, and in June 2009 in Albuquerque, New Mexico.
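Three statistics of the kind one might use as comparison criteria for measured versus simulated irradiance are sketched below. These are illustrative choices, not necessarily the report's criteria: mean level, ramp-rate variability, and a path-length variability index:

```python
import numpy as np

def irradiance_stats(ghi, dt_minutes=1.0):
    """Simple statistics characterizing an irradiance time series:
    mean level, standard deviation of ramp rates, and a variability
    index (path length of the series relative to a straight line)."""
    ghi = np.asarray(ghi, dtype=float)
    ramps = np.diff(ghi) / dt_minutes                  # W/m^2 per minute
    path = np.sum(np.hypot(np.diff(ghi), dt_minutes))  # length of the curve
    straight = np.hypot(ghi[-1] - ghi[0], dt_minutes * (len(ghi) - 1))
    return {"mean": ghi.mean(),
            "ramp_std": ramps.std(),
            "variability_index": path / straight}

# Smooth, clear-sky-like profile for illustration
clear_day = 800 + 50 * np.sin(np.linspace(0, np.pi, 120))
stats = irradiance_stats(clear_day)
```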
Generation of artificial helioseismic time-series
NASA Technical Reports Server (NTRS)
Schou, J.; Brown, T. M.
1993-01-01
We present an outline of an algorithm to generate artificial helioseismic time-series, taking into account as much as possible of the knowledge we have on solar oscillations. The hope is that it will be possible to find the causes of some of the systematic errors in analysis algorithms by testing them with such artificial time-series.
Reconstruction of time-delay systems from chaotic time series.
Bezruchko, B P; Karavaev, A S; Ponomarenko, V I; Prokhorov, M D
2001-11-01
We propose a method that allows one to estimate the parameters of model scalar time-delay differential equations from time series. The method is based on a statistical analysis of time intervals between extrema in the time series. We verify our method by using it for the reconstruction of time-delay differential equations from their chaotic solutions and for modeling experimental systems with delay-induced dynamics from their chaotic time series.
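The method's raw ingredient is the set of time intervals between successive extrema of the series. A minimal extractor is sketched below (sign changes of the discrete slope; real chaotic data would also need smoothing and noise handling):

```python
import numpy as np

def extrema_intervals(x):
    """Intervals (in samples) between successive local extrema of a
    series, the raw statistic the reconstruction method builds on."""
    x = np.asarray(x, dtype=float)
    d = np.diff(x)
    # an extremum sits where the discrete slope changes sign
    idx = np.where(d[:-1] * d[1:] < 0)[0] + 1
    return np.diff(idx)

# A pure sine has equally spaced extrema, half a period apart
t = np.linspace(0, 4 * np.pi, 401)
gaps = extrema_intervals(np.sin(t))
```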
Salient Segmentation of Medical Time Series Signals
Woodbridge, Jonathan; Lan, Mars; Sarrafzadeh, Majid; Bui, Alex
2016-01-01
Searching and mining medical time series databases is extremely challenging due to large, high entropy, and multidimensional datasets. Traditional time series databases are populated using segments extracted by a sliding window. The resulting database index contains an abundance of redundant time series segments with little to no alignment. This paper presents the idea of “salient segmentation”. Salient segmentation is a probabilistic segmentation technique for populating medical time series databases. Segments with the lowest probabilities are considered salient and are inserted into the index. The resulting index has little redundancy and is composed of aligned segments. This approach reduces index sizes by more than 98% over conventional sliding window techniques. Furthermore, salient segmentation can reduce redundancy in motif discovery algorithms by more than 85%, yielding a more succinct representation of a time series signal.
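A toy rendering of the salient-segmentation idea: score each sliding-window segment under a simple probabilistic model of segment features and index only the least probable ones. The Gaussian z-score model below is an illustrative stand-in for the paper's segmentation probabilities:

```python
import numpy as np

def salient_segments(signal, width=10, keep_fraction=0.05):
    """Score every sliding-window segment by how surprising its mean is
    under a Gaussian model of segment means, and keep only the least
    probable (most salient) segment start positions for the index."""
    signal = np.asarray(signal, dtype=float)
    starts = np.arange(len(signal) - width + 1)
    feats = np.array([signal[s:s + width].mean() for s in starts])
    z = np.abs(feats - feats.mean()) / feats.std()     # surprise score
    n_keep = max(1, int(keep_fraction * len(starts)))
    salient = starts[np.argsort(z)[-n_keep:]]          # highest surprise
    return np.sort(salient)

rng = np.random.default_rng(3)
x = rng.standard_normal(500)
x[200:210] += 6.0                                      # an anomalous burst
picks = salient_segments(x)
```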
Entropic Analysis of Electromyography Time Series
NASA Astrophysics Data System (ADS)
Kaufman, Miron; Sung, Paul
2005-03-01
We are in the process of assessing the effectiveness of fractal and entropic measures for the diagnostic of low back pain from surface electromyography (EMG) time series. Surface electromyography (EMG) is used to assess patients with low back pain. In a typical EMG measurement, the voltage is measured every millisecond. We observed back muscle fatiguing during one minute, which results in a time series with 60,000 entries. We characterize the complexity of time series by computing the Shannon entropy time dependence. The analysis of the time series from different relevant muscles from healthy and low back pain (LBP) individuals provides evidence that the level of variability of back muscle activities is much larger for healthy individuals than for individuals with LBP. In general the time dependence of the entropy shows a crossover from a diffusive regime to a regime characterized by long time correlations (self organization) at about 0.01s.
Network structure of multivariate time series.
Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito
2015-10-21
Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range of tools and techniques for time series analysis already exists, the increasing availability of massive data structures calls for new approaches for multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on the mapping of a multidimensional time series into a multilayer network, which allows one to extract information on a high dimensional dynamical system through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow one to extract and quantify nontrivial properties of coupled chaotic maps, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail.
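One standard way to map each component of a multivariate series to a network layer is the horizontal visibility graph; whether this matches the paper's exact construction is an assumption of this sketch:

```python
import numpy as np

def horizontal_visibility_edges(x):
    """Edges of the horizontal visibility graph of one series: nodes are
    time points, and i-j are linked if every value strictly between them
    is lower than min(x[i], x[j])."""
    edges = set()
    for i in range(len(x) - 1):
        edges.add((i, i + 1))                  # neighbours always see each other
        for j in range(i + 2, len(x)):
            if max(x[i + 1:j]) < min(x[i], x[j]):
                edges.add((i, j))
    return edges

def multiplex_from_series(Y):
    """One network layer (edge set) per column of a multivariate series Y."""
    return [horizontal_visibility_edges(Y[:, k]) for k in range(Y.shape[1])]

x = [3.0, 1.0, 2.0, 4.0]
edges = horizontal_visibility_edges(x)
```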
Network structure of multivariate time series
NASA Astrophysics Data System (ADS)
Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito
2015-10-01
Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range tools and techniques for time series analysis already exist, the increasing availability of massive data structures calls for new approaches for multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on the mapping of a multidimensional time series into a multilayer network, which allows to extract information on a high dimensional dynamical system through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow to extract and quantify nontrivial properties of coupled chaotic maps, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail.
Homogenising time series: beliefs, dogmas and facts
NASA Astrophysics Data System (ADS)
Domonkos, P.
2011-06-01
In recent decades various homogenisation methods have been developed, but the real effects of their application on time series are still not sufficiently known. The ongoing COST action HOME (COST ES0601) is devoted to revealing the real impacts of homogenisation methods in more detail and with higher confidence than before. As part of the COST activity, a benchmark dataset was built whose characteristics approximate well those of real networks of observed time series. This dataset offers a much better opportunity than ever before to test the wide variety of homogenisation methods and to analyse the real effects of selected theoretical recommendations. Empirical results show that real observed time series usually include several inhomogeneities of different sizes. Small inhomogeneities often have statistical characteristics similar to natural changes caused by climatic variability, so the naive application of the classic theory that change-points of observed time series can be found and corrected one by one is impossible. However, after homogenisation the linear trends, seasonal changes and long-term fluctuations of time series are usually much closer to reality than in raw time series. Some problems around detecting multiple structures of inhomogeneities, as well as time series comparisons within homogenisation procedures, are discussed briefly in the study.
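The break-detection step at the heart of homogenisation can be illustrated with a toy single change-point scan. This is a deliberately simplified statistic (a size-weighted difference of segment means), not one of the COST HOME methods; the function name and test series are invented for illustration.

```python
def detect_change_point(series):
    """Locate the most likely single mean shift in a series.

    Scans every split point and returns the index maximizing the absolute
    difference of segment means, weighted by segment sizes (a simplified
    two-sample statistic).
    """
    n = len(series)
    best_idx, best_stat = None, 0.0
    for k in range(2, n - 1):
        left, right = series[:k], series[k:]
        mean_l = sum(left) / len(left)
        mean_r = sum(right) / len(right)
        # Weight by segment-size balance so splits near the edges
        # are not unduly favoured.
        weight = (len(left) * len(right) / n) ** 0.5
        stat = weight * abs(mean_l - mean_r)
        if stat > best_stat:
            best_idx, best_stat = k, stat
    return best_idx, best_stat

# A series with an artificial inhomogeneity: a +5 shift at index 50.
data = [10.0] * 50 + [15.0] * 50
idx, stat = detect_change_point(data)
print(idx)  # 50
```

Real homogenisation methods additionally use reference series from neighbouring stations and handle multiple breaks, which this sketch omits.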
Modelling of nonlinear filtering Poisson time series
NASA Astrophysics Data System (ADS)
Bochkarev, Vladimir V.; Belashova, Inna A.
2016-08-01
In this article, algorithms for non-linear filtering of Poisson time series are tested using statistical modelling. The objective is to find a representation of a time series as a wavelet series with a small number of nonzero coefficients, which allows statistically significant details to be distinguished. There are well-known efficient algorithms for non-linear wavelet filtering in the case when the values of a time series have a normal distribution. However, if the distribution is not normal, better results can be expected from maximum likelihood estimation. Filtering according to the maximum likelihood criterion is studied here using the example of Poisson time series. For direct optimisation of the likelihood function, different stochastic (genetic algorithms, simulated annealing) and deterministic optimisation algorithms are used. Testing of the algorithm using both simulated series and empirical data (series of rare-word frequencies from the Google Books Ngram data) showed that filtering based on the maximum likelihood criterion has a great advantage over well-known algorithms in the case of Poisson series. The most promising optimisation methods for this problem were also identified.
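The maximum-likelihood criterion for Poisson data can be illustrated with a minimal constant-rate toy model (not the wavelet filtering of the article): the Poisson log-likelihood of a count series is maximized by the sample mean.

```python
import math

def poisson_log_likelihood(counts, rate):
    """Log-likelihood of i.i.d. Poisson counts under a constant rate."""
    if rate <= 0:
        return float("-inf")
    return sum(k * math.log(rate) - rate - math.lgamma(k + 1) for k in counts)

counts = [3, 1, 4, 1, 5, 9, 2, 6]
mle_rate = sum(counts) / len(counts)  # the sample mean maximizes the likelihood
for candidate in (2.0, mle_rate, 6.0):
    print(round(candidate, 3), round(poisson_log_likelihood(counts, candidate), 3))
```

In the article's setting the free parameters are wavelet coefficients rather than a single rate, but the objective being optimised (by genetic algorithms, annealing, etc.) is this same Poisson likelihood.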
Developing consistent time series landsat data products
Technology Transfer Automated Retrieval System (TEKTRAN)
The Landsat series of satellites has provided a continuous earth observation data record since the early 1970s. There is increasing demand for a consistent time series of Landsat data products. In this presentation, I will summarize the work supported by the USGS Landsat Science Team project from 20...
Modeling Time Series Data for Supervised Learning
ERIC Educational Resources Information Center
Baydogan, Mustafa Gokce
2012-01-01
Temporal data are increasingly prevalent and important in analytics. Time series (TS) data are chronological sequences of observations and an important class of temporal data. Fields such as medicine, finance, learning science and multimedia naturally generate TS data. Each series provides a high-dimensional data vector that challenges the learning…
Visibility Graph Based Time Series Analysis.
Stephen, Mutua; Gu, Changgui; Yang, Huijie
2015-01-01
Network-based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.
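The core series-to-network mapping can be sketched directly: in a natural visibility graph, two samples are linked when the straight line between them clears every intermediate sample. A minimal O(n²) illustration, not the authors' code:

```python
def visibility_edges(series):
    """Build the natural visibility graph of a time series.

    Nodes are time indices; i and j (i < j) are linked if every sample
    between them lies strictly below the line joining (i, y_i) and (j, y_j).
    """
    n = len(series)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            visible = all(
                series[k] < series[j] + (series[i] - series[j]) * (j - k) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                edges.add((i, j))
    return edges

edges = visibility_edges([3.0, 1.0, 2.0, 4.0])
# All pairs of this short series turn out to be mutually visible.
print(sorted(edges))
```

In the paper's scheme, each series segment is converted to such a graph, and successive segment graphs are then linked to form the temporal network of networks.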
Homogenising time series: Beliefs, dogmas and facts
NASA Astrophysics Data System (ADS)
Domonkos, P.
2010-09-01
For obtaining reliable information about climate change and climate variability, the use of high-quality data series is essential, and one basic tool of quality improvement is the statistical homogenisation of observed time series. In recent decades a large number of homogenisation methods have been developed, but the real effects of their application on time series are still not entirely known. The ongoing COST HOME project (COST ES0601) is devoted to revealing the real impacts of homogenisation methods in more detail and with higher confidence than before. As part of the COST activity, a benchmark dataset was built whose characteristics approximate well those of real networks of observed time series. This dataset offers a much better opportunity than ever to test the wide variety of homogenisation methods and to analyse the real effects of selected theoretical recommendations. The author believes that several old theoretical rules have to be re-evaluated. Some examples of the open questions: a) Can statistically detected change-points be accepted only with the confirmation of metadata information? b) Do semi-hierarchic algorithms for detecting multiple change-points in time series function effectively in practice? c) Is it advisable to limit the spatial comparison of candidate series to up to five other series in the neighbourhood? Empirical results - those from the COST benchmark, and from other experiments too - show that real observed time series usually include several inhomogeneities of different sizes. Small inhomogeneities seem like part of the climatic variability, so the naive application of the classic theory that change-points of observed time series can be found and corrected one by one is impossible. However, after homogenisation the linear trends, seasonal changes and long-term fluctuations of time series are usually much closer to reality than in raw time series. The developers and users of homogenisation methods have to bear in mind that
Complex network approach to fractional time series
Manshour, Pouya
2015-10-15
In order to extract correlation information inherited in stochastic time series, the visibility graph algorithm has recently been proposed, by which a time series can be mapped onto a complex network. We demonstrate that the visibility algorithm is not an appropriate one for studying the correlation aspects of a time series. We then employ the horizontal visibility algorithm, as a much simpler one, to map fractional processes onto complex networks. The degree distributions are shown to have parabolic exponential forms with a Hurst-dependent fitting parameter. Further, we take into account other topological properties such as the maximum eigenvalue of the adjacency matrix and the degree assortativity, and show that such topological quantities can also be used to predict the Hurst exponent, with the exception of anti-persistent fractional Gaussian noises. To solve this problem, we take into account the Spearman correlation coefficient between nodes' degrees and their corresponding data values in the original time series.
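The horizontal visibility criterion is even simpler than the natural one: two samples are linked when every value between them is lower than both endpoints. A minimal sketch on illustrative data (not the fractional processes of the paper):

```python
def horizontal_visibility_degrees(series):
    """Degree sequence of the horizontal visibility graph.

    Indices i and j are linked when every sample strictly between them
    is smaller than both endpoint values.
    """
    n = len(series)
    degree = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            if all(series[k] < min(series[i], series[j]) for k in range(i + 1, j)):
                degree[i] += 1
                degree[j] += 1
    return degree

degrees = horizontal_visibility_degrees([1.0, 3.0, 2.0, 4.0])
print(degrees)  # [1, 3, 2, 2]
```

The paper's analysis fits the resulting degree distribution and relates the fitted parameter (and other topological quantities) to the Hurst exponent of the generating process.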
Detecting nonlinear structure in time series
Theiler, J.
1991-01-01
We describe an approach for evaluating the statistical significance of evidence for nonlinearity in a time series. The formal application of our method requires the careful statement of a null hypothesis which characterizes a candidate linear process, the generation of an ensemble of "surrogate" data sets which are similar to the original time series but consistent with the null hypothesis, and the computation of a discriminating statistic for the original and for each of the surrogate data sets. The idea is to test the original time series against the null hypothesis by checking whether the discriminating statistic computed for the original time series differs significantly from the statistics computed for each of the surrogate sets. While some data sets very cleanly exhibit low-dimensional chaos, there are many cases where the evidence is sketchy and difficult to evaluate. We hope to provide a framework within which such claims of nonlinearity can be evaluated.
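The surrogate-data pipeline can be sketched in a few lines. Theiler's surrogates are typically phase-randomized to preserve the linear correlation structure; for brevity this sketch uses the simpler shuffle surrogates (null hypothesis: no temporal structure at all) with lag-1 autocorrelation as the discriminating statistic. Names and data are illustrative.

```python
import random

def lag1_autocorr(x):
    """Lag-1 autocorrelation: a simple discriminating statistic."""
    n = len(x)
    mean = sum(x) / n
    num = sum((x[i] - mean) * (x[i + 1] - mean) for i in range(n - 1))
    den = sum((v - mean) ** 2 for v in x)
    return num / den

def surrogate_test(series, n_surrogates=200, seed=1):
    """Compare a statistic on the data against an ensemble of shuffled surrogates.

    Shuffling destroys temporal structure while preserving the amplitude
    distribution; a data statistic far outside the surrogate spread is
    evidence against the null hypothesis.
    """
    rng = random.Random(seed)
    observed = lag1_autocorr(series)
    surrogate_stats = []
    for _ in range(n_surrogates):
        s = series[:]
        rng.shuffle(s)
        surrogate_stats.append(lag1_autocorr(s))
    exceed = sum(1 for v in surrogate_stats if abs(v) >= abs(observed))
    return observed, exceed / n_surrogates

# A strongly autocorrelated series (a slow ramp) versus its surrogates.
series = [0.1 * t for t in range(50)]
observed, p_value = surrogate_test(series)
print(round(observed, 3), p_value)
```

Replacing the shuffle with phase randomization and the statistic with a nonlinear measure (e.g. a prediction error or correlation dimension) recovers the test for nonlinearity the abstract describes.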
Advanced spectral methods for climatic time series
Ghil, M.; Allen, M.R.; Dettinger, M.D.; Ide, K.; Kondrashov, D.; Mann, M.E.; Robertson, A.W.; Saunders, A.; Tian, Y.; Varadi, F.; Yiou, P.
2002-01-01
The analysis of univariate or multivariate time series provides crucial information to describe, understand, and predict climatic variability. The discovery and implementation of a number of novel methods for extracting useful information from time series has recently revitalized this classical field of study. Considerable progress has also been made in interpreting the information so obtained in terms of dynamical systems theory. In this review we describe the connections between time series analysis and nonlinear dynamics, discuss signal-to-noise enhancement, and present some of the novel methods for spectral analysis. The various steps, as well as the advantages and disadvantages of these methods, are illustrated by their application to an important climatic time series, the Southern Oscillation Index. This index captures major features of interannual climate variability and is used extensively in its prediction. Regional and global sea surface temperature data sets are used to illustrate multivariate spectral methods. Open questions and further prospects conclude the review.
Nonlinear Analysis of Surface EMG Time Series
NASA Astrophysics Data System (ADS)
Zurcher, Ulrich; Kaufman, Miron; Sung, Paul
2004-04-01
Applications of nonlinear analysis of surface electromyography time series of patients with and without low back pain are presented. Limitations of the standard methods based on the power spectrum are discussed.
Detecting chaos in irregularly sampled time series
NASA Astrophysics Data System (ADS)
Kulp, C. W.
2013-09-01
Recently, Wiebe and Virgin [Chaos 22, 013136 (2012)] developed an algorithm which detects chaos by analyzing a time series' power spectrum, which is computed using the Discrete Fourier Transform (DFT). Their algorithm, like other time series characterization algorithms, requires that the time series be regularly sampled. Real-world data, however, are often irregularly sampled, thus making the detection of chaotic behavior difficult or impossible with those methods. In this paper, a characterization algorithm is presented which effectively detects chaos in irregularly sampled time series. The work presented here is a modification of Wiebe and Virgin's algorithm and uses the Lomb-Scargle Periodogram (LSP) to compute a series' power spectrum instead of the DFT. The DFT is not appropriate for irregularly sampled time series. However, the LSP is capable of computing the frequency content of irregularly sampled data. Furthermore, a new method of analyzing the power spectrum is developed, which can be useful for differentiating between chaotic and non-chaotic behavior. The new characterization algorithm is successfully applied to irregularly sampled data generated by a model as well as data consisting of observations of variable stars.
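The Lomb-Scargle periodogram handles uneven sampling directly, which is why it can replace the DFT here. A minimal pure-Python sketch of the classical LSP (not Kulp's implementation; the sampling scheme and frequency grid are illustrative):

```python
import math
import random

def lomb_scargle(t, y, freqs):
    """Classical Lomb-Scargle periodogram for irregularly sampled data."""
    mean = sum(y) / len(y)
    yc = [v - mean for v in y]
    power = []
    for f in freqs:
        w = 2.0 * math.pi * f
        # The offset tau makes the periodogram invariant to time shifts.
        tau = math.atan2(sum(math.sin(2 * w * ti) for ti in t),
                         sum(math.cos(2 * w * ti) for ti in t)) / (2 * w)
        c = sum(v * math.cos(w * (ti - tau)) for ti, v in zip(t, yc))
        s = sum(v * math.sin(w * (ti - tau)) for ti, v in zip(t, yc))
        cc = sum(math.cos(w * (ti - tau)) ** 2 for ti in t)
        ss = sum(math.sin(w * (ti - tau)) ** 2 for ti in t)
        power.append(0.5 * (c * c / cc + s * s / ss))
    return power

# Irregular sampling of a 0.5 Hz sinusoid over 20 seconds.
rng = random.Random(0)
t = sorted(rng.uniform(0.0, 20.0) for _ in range(120))
y = [math.sin(2 * math.pi * 0.5 * ti) for ti in t]
freqs = [0.05 * k for k in range(1, 40)]
power = lomb_scargle(t, y, freqs)
peak_freq = freqs[power.index(max(power))]
print(round(peak_freq, 2))  # the 0.5 Hz signal should dominate the spectrum
```

The paper's contribution then lies in how the resulting spectrum is analyzed to discriminate chaotic from non-chaotic dynamics, which this sketch does not attempt.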
Willie, Bettina M; Ashrafi, Shadi; Alajbegovic, Sanjin; Burnett, Trever; Bloebaum, Roy D
2004-06-01
Alternative sterilization methods including ethylene oxide, gas plasma, and gamma-radiation in an inert environment were implemented in the late 1990s to limit oxidative degradation of ultrahigh molecular weight polyethylene (PE). There was also a simultaneous transition to PE resins that did not contain calcium stearate. Shelf storage duration of PE inserts following gamma-irradiation in air has been correlated with poor clinical performance and increased wear. This study aimed to determine how sterilization method and resin type influenced degradation of PE after 4 years of real-time shelf aging. It was hypothesized that gamma-irradiation and stearate-containing resins would incur significantly more degradation than nonirradiated, stearate-free resins. Gamma-irradiated PE samples in air and nitrogen had a significantly increased density and oxidation index, compared to nonirradiated PE, after 4 years of shelf aging. Alternative sterilization methods such as ethylene oxide and gas plasma appeared to produce significantly less oxidation regardless of PE resin type. A partial correlation demonstrated that density and oxidation index were not correlated (r² = 0.079) when examining the influence of sterilization method. The data indicated that, after 4 years of real-time shelf aging, the type of sterilization method had a larger influence on PE degradation than resin type.
Forbidden patterns in financial time series
NASA Astrophysics Data System (ADS)
Zanin, Massimiliano
2008-03-01
The existence of forbidden patterns, i.e., certain missing sequences in a given time series, is a recently proposed instrument of potential application in the study of time series. Forbidden patterns are related to the permutation entropy, which has the basic properties of classic chaos indicators, such as the Lyapunov exponent or Kolmogorov entropy, thus allowing one to separate deterministic (usually chaotic) from random series; however, it requires fewer values of the series to be calculated, and it is suitable for use with small datasets. In this paper, the appearance of forbidden patterns is studied in different economic indicators such as stock indices (Dow Jones Industrial Average and Nasdaq Composite), NYSE stocks (IBM and Boeing), and others (the ten-year bond interest rate), to find evidence of deterministic behavior in their evolution. Moreover, the rate of appearance of the forbidden patterns is calculated, and some considerations about the underlying dynamics are suggested.
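Forbidden ordinal patterns can be counted in a few lines. The sketch below uses the fully chaotic logistic map as a standard illustration (not one of the paper's financial series): for x → 4x(1 - x), a strictly decreasing triple never occurs, because a decrease requires x > 3/4 while reaching a value above 3/4 requires a predecessor below 3/4.

```python
from itertools import permutations

def ordinal_pattern_counts(series, order=3):
    """Count order-3 ordinal patterns; patterns never seen are 'forbidden'.

    Each length-3 window is encoded by the indices of its values in
    ascending order, e.g. a strictly decreasing window maps to (2, 1, 0).
    """
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] += 1
    return counts

# Trajectory of the fully chaotic logistic map.
x, traj = 0.3, []
for _ in range(3000):
    x = 4.0 * x * (1.0 - x)
    traj.append(x)

counts = ordinal_pattern_counts(traj)
forbidden = [p for p, c in counts.items() if c == 0]
print(forbidden)  # the strictly decreasing triple (2, 1, 0)
```

A genuinely random series of the same length would realize all six patterns; the paper exploits exactly this asymmetry, together with the rate at which "missing" patterns appear under noise, to probe determinism in market data.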
Highly comparative time-series analysis: the empirical structure of time series and their methods.
Fulcher, Ben D; Little, Max A; Jones, Nick S
2013-06-01
The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
Learning time series for intelligent monitoring
NASA Technical Reports Server (NTRS)
Manganaris, Stefanos; Fisher, Doug
1994-01-01
We address the problem of classifying time series according to their morphological features in the time domain. In a supervised machine-learning framework, we induce a classification procedure from a set of preclassified examples. For each class, we infer a model that captures its morphological features using Bayesian model induction and the minimum message length approach to assign priors. In the performance task, we classify a time series in one of the learned classes when there is enough evidence to support that decision. Time series with sufficiently novel features, belonging to classes not present in the training set, are recognized as such. We report results from experiments in a monitoring domain of interest to NASA.
Development of an IUE Time Series Browser
NASA Technical Reports Server (NTRS)
Massa, Derck
2005-01-01
The International Ultraviolet Explorer (IUE) satellite operated successfully for more than 17 years. Its archive of more than 100,000 science exposures is widely acknowledged as an invaluable scientific resource that will not be duplicated in the foreseeable future. We have searched this archive for objects which were observed 10 or more times with the same spectral dispersion and wavelength coverage over the lifetime of IUE. Using this definition of a time series, we find that roughly half of the science exposures are members of such time series. This paper describes a WEB-based IUE time series browser which enables the user to visually inspect the repeated observations for variability and to examine each member spectrum individually. Further, if the researcher determines that a specific data set is worthy of further investigation, it can be easily downloaded for further, detailed analysis.
Wavelet analysis of radon time series
NASA Astrophysics Data System (ADS)
Barbosa, Susana; Pereira, Alcides; Neves, Luis
2013-04-01
Radon is a radioactive noble gas with a half-life of 3.8 days ubiquitous in both natural and indoor environments. Being produced in uranium-bearing materials by decay from radium, radon can be easily and accurately measured by nuclear methods, making it an ideal proxy for time-varying geophysical processes. Radon time series exhibit a complex temporal structure and large variability on multiple scales. Wavelets are therefore particularly suitable for the analysis on a scale-by-scale basis of time series of radon concentrations. In this study continuous and discrete wavelet analysis is applied to describe the variability structure of hourly radon time series acquired both indoors and on a granite site in central Portugal. A multi-resolution decomposition is performed for extraction of sub-series associated to specific scales. The high-frequency components are modeled in terms of stationary autoregressive / moving average (ARMA) processes. The amplitude and phase of the periodic components are estimated and tidal features of the signals are assessed. Residual radon concentrations (after removal of periodic components) are further examined and the wavelet spectrum is used for estimation of the corresponding Hurst exponent. The results for the several radon time series considered in the present study are very heterogeneous in terms of both high-frequency and long-term temporal structure indicating that radon concentrations are very site-specific and heavily influenced by local factors.
Integrated method for chaotic time series analysis
Hively, L.M.; Ng, E.G.
1998-09-29
Methods and apparatus are disclosed for automatically detecting differences between similar but different states in a nonlinear process by monitoring nonlinear data. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time-serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated.
Climate Time Series Analysis and Forecasting
NASA Astrophysics Data System (ADS)
Young, P. C.; Fildes, R.
2009-04-01
This paper will discuss various aspects of climate time series data analysis, modelling and forecasting being carried out at Lancaster. This will include state-dependent parameter, nonlinear, stochastic modelling of globally averaged atmospheric carbon dioxide; the computation of emission strategies based on modern control theory; and extrapolative time series benchmark forecasts of annual average temperature, both global and local. The key to the forecasting evaluation will be the iterative estimation of forecast error based on rolling origin comparisons, as recommended in the forecasting research literature. The presentation will conclude with a comparison of the time series forecasts with forecasts produced from global circulation models and a discussion of the implications for climate modelling research.
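Rolling-origin evaluation, the scheme recommended above, can be sketched generically: at each successive origin the model is refit on the history and scored on the next observation. The function and benchmark forecasters below are illustrative, not the Lancaster models.

```python
def rolling_origin_mae(series, forecast, min_train=20):
    """Rolling-origin evaluation of a one-step-ahead forecaster.

    `forecast` maps a training segment to a one-step-ahead prediction;
    absolute errors are averaged over all successive origins.
    """
    errors = []
    for t in range(min_train, len(series)):
        pred = forecast(series[:t])
        errors.append(abs(series[t] - pred))
    return sum(errors) / len(errors)

# Two naive benchmarks on a trending series: the random-walk
# ("last value") forecast versus the in-sample mean.
series = [0.5 * t for t in range(60)]
mae_last = rolling_origin_mae(series, lambda h: h[-1])
mae_mean = rolling_origin_mae(series, lambda h: sum(h) / len(h))
print(mae_last, mae_mean)  # the last-value forecast wins on a trend
```

Benchmark forecasts like these give the baseline against which the paper's extrapolative and circulation-model forecasts are compared.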
Haar Wavelet Analysis of Climatic Time Series
NASA Astrophysics Data System (ADS)
Zhang, Zhihua; Moore, John; Grinsted, Aslak
2014-05-01
In order to extract the intrinsic information of climatic time series from background red noise, we first give an analytic formula for the distribution of Haar wavelet power spectra of red noise in a rigorous statistical framework. The relation between scale a and Fourier period T for the Morlet wavelet is a = 0.97T. However, for the Haar wavelet, the corresponding formula is a = 0.37T. Since for any time series of time step δt and total length Nδt the range of scales in wavelet-based time series analysis runs from the smallest resolvable scale 2δt to the largest scale Nδt, by using Haar wavelet analysis one can extract more low-frequency intrinsic information. Finally, we use our method to analyze the Arctic Oscillation (AO), which is a key aspect of climate variability in the Northern Hemisphere, and discover a great change in its fundamental properties, commonly called a regime shift or tipping point. Our partial results have been published as follows: [1] Z. Zhang, J.C. Moore and A. Grinsted, Haar wavelet analysis of climatic time series, Int. J. Wavelets, Multiresol. & Inf. Process., in press, 2013. [2] Z. Zhang, J.C. Moore, Comment on "Significance tests for the wavelet power and the wavelet power spectrum", Ann. Geophys., 30:12, 2012.
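The scale-by-scale Haar power the abstract relies on can be sketched with a plain Haar transform; the toy sinusoid stands in for a climatic series and is not the AO data.

```python
import math

def haar_power_spectrum(series):
    """Haar wavelet power per dyadic scale (length must be a power of two).

    At each level, pairwise averages carry the signal to the next coarser
    scale and pairwise differences are the Haar detail coefficients; the
    mean squared detail at each level is that scale's wavelet power.
    """
    powers = []
    current = list(series)
    while len(current) > 1:
        half = len(current) // 2
        approx = [(current[2 * i] + current[2 * i + 1]) / 2 for i in range(half)]
        detail = [(current[2 * i] - current[2 * i + 1]) / 2 for i in range(half)]
        powers.append(sum(d * d for d in detail) / len(detail))
        current = approx
    return powers  # finest scale first, coarsest last

# A slow oscillation: power should concentrate at the coarse scales.
n = 64
series = [math.sin(2 * math.pi * t / n) for t in range(n)]
powers = haar_power_spectrum(series)
print([round(p, 4) for p in powers])
```

Testing an observed spectrum like this against the analytic red-noise distribution derived in the paper is what separates intrinsic climatic signal from background noise.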
Nonlinear time-series analysis revisited.
Bradley, Elizabeth; Kantz, Holger
2015-09-01
In 1980 and 1981, two pioneering papers laid the foundation for what became known as nonlinear time-series analysis: the analysis of observed data, typically univariate, via dynamical systems theory. Based on the concept of state-space reconstruction, this set of methods allows us to compute characteristic quantities such as Lyapunov exponents and fractal dimensions, to predict the future course of the time series, and even to reconstruct the equations of motion in some cases. In practice, however, there are a number of issues that restrict the power of this approach: whether the signal accurately and thoroughly samples the dynamics, for instance, and whether it contains noise. Moreover, the numerical algorithms that we use to instantiate these ideas are not perfect; they involve approximations, scale parameters, and finite-precision arithmetic, among other things. Even so, nonlinear time-series analysis has been used to great advantage on thousands of real and synthetic data sets from a wide variety of systems ranging from roulette wheels to lasers to the human heart. Even in cases where the data do not meet the mathematical or algorithmic requirements to assure full topological conjugacy, the results of nonlinear time-series analysis can be helpful in understanding, characterizing, and predicting dynamical systems.
Nonlinear Time Series Analysis via Neural Networks
NASA Astrophysics Data System (ADS)
Volná, Eva; Janošek, Michal; Kocian, Václav; Kotyrba, Martin
This article deals with time series analysis based on neural networks in order to achieve effective forex market [Moore and Roche, J. Int. Econ. 58, 387-411 (2002)] pattern recognition. Our goal is to find and recognize important patterns which repeatedly appear in the market history, in order to adapt our trading system's behaviour based on them.
Directionality volatility in electroencephalogram time series
NASA Astrophysics Data System (ADS)
Mansor, Mahayaudin M.; Green, David A.; Metcalfe, Andrew V.
2016-06-01
We compare time series of electroencephalograms (EEGs) from healthy volunteers with EEGs from subjects diagnosed with epilepsy. The EEG time series from the healthy group are recorded during the awake state with eyes open and with eyes closed, and the records from subjects with epilepsy are taken from three different recording regions of pre-surgical diagnosis: hippocampal, epileptogenic and seizure zone. The comparisons for these five categories are in terms of deviations from linear time series models with constant-variance Gaussian white noise error inputs. One feature investigated is directionality, and how this can be modelled by either non-linear threshold autoregressive models or non-Gaussian errors. A second feature is volatility, which is modelled by Generalized AutoRegressive Conditional Heteroskedasticity (GARCH) processes. Other features include the proportion of variability accounted for by time series models, and the skewness and kurtosis of the residuals. The results suggest these comparisons may have diagnostic potential for epilepsy and provide early warning of seizures.
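The GARCH volatility modelling mentioned above can be illustrated with a short simulation; the GARCH(1,1) parameters below are arbitrary illustrative values, not estimates from EEG data:

```python
import numpy as np

def simulate_garch11(n, omega=0.1, alpha=0.2, beta=0.7, seed=0):
    """Simulate a GARCH(1,1) process:
    sigma_t^2 = omega + alpha*e_{t-1}^2 + beta*sigma_{t-1}^2,
    e_t = sigma_t * z_t with z_t standard normal."""
    rng = np.random.default_rng(seed)
    e = np.zeros(n)
    var = omega / (1.0 - alpha - beta)  # start at the unconditional variance
    for t in range(n):
        e[t] = np.sqrt(var) * rng.standard_normal()
        var = omega + alpha * e[t] ** 2 + beta * var
    return e

x = simulate_garch11(5000)
# Hallmark of volatility: squared values are autocorrelated even
# though the series itself is (nearly) uncorrelated.
sq = x ** 2
ac = np.corrcoef(sq[:-1], sq[1:])[0, 1]
print(ac > 0.05)
```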
Nonlinear Time Series Analysis of Sunspot Data
NASA Astrophysics Data System (ADS)
Suyal, Vinita; Prasad, Awadhesh; Singh, Harinder P.
2009-12-01
This article deals with the analysis of sunspot number time series using the Hurst exponent. We use rescaled range (R/S) analysis to estimate the Hurst exponent for 259-year and 11,360-year sunspot data. The results show a varying degree of persistence over shorter and longer time scales, corresponding to distinct values of the Hurst exponent. We explain the presence of these multiple Hurst exponents by their resemblance to deterministic chaotic attractors having multiple centers of rotation.
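The rescaled-range (R/S) procedure can be sketched as follows; this is a textbook implementation, not the authors' code, and the dyadic window sizes and fitting details are simplified:

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Estimate the Hurst exponent by R/S analysis: for each window
    size, average the range of the cumulative demeaned series divided
    by its standard deviation, then take the log-log slope."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, rs_vals = [], []
    size = min_chunk
    while size <= n // 2:
        rs = []
        for start in range(0, n - size + 1, size):
            chunk = x[start:start + size]
            dev = np.cumsum(chunk - chunk.mean())
            r = dev.max() - dev.min()   # range of cumulative deviations
            s = chunk.std()             # scale
            if s > 0:
                rs.append(r / s)
        if rs:
            sizes.append(size)
            rs_vals.append(np.mean(rs))
        size *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return slope

rng = np.random.default_rng(0)
h = hurst_rs(rng.standard_normal(4096))
print(round(h, 2))  # uncorrelated noise should give H near 0.5
```

Persistent series (like the long-memory regimes reported for sunspot data) give H > 0.5; anti-persistent series give H < 0.5.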
Remote Sensing Time Series Product Tool
NASA Technical Reports Server (NTRS)
Predos, Don; Ryan, Robert E.; Ross, Kenton W.
2006-01-01
The TSPT (Time Series Product Tool) software was custom-designed for NASA to rapidly create and display single-band and band-combination time series, such as NDVI (Normalized Difference Vegetation Index) images, for wide-area crop surveillance and for other time-critical applications. The TSPT, developed in MATLAB, allows users to create and display various MODIS (Moderate Resolution Imaging Spectroradiometer) or simulated VIIRS (Visible/Infrared Imager Radiometer Suite) products as single images, as time series plots at a selected location, or as temporally processed image videos. Manually creating these types of products is extremely labor intensive; the TSPT makes the process simple and efficient. MODIS is ideal for monitoring large crop areas because of its wide swath (2330 km), its relatively small ground sample distance (250 m), and its frequent revisits (twice daily). Furthermore, because MODIS imagery is acquired daily, rapid changes in vegetative health can potentially be detected. The new TSPT technology provides users with the ability to temporally process high-revisit-rate satellite imagery, such as that acquired from MODIS and from its successor, the VIIRS. The TSPT features the important capability of fusing data from both MODIS instruments onboard the Terra and Aqua satellites, which drastically improves cloud statistics. With the TSPT, MODIS metadata are used to find and optionally remove bad and suspect data. Noise removal and temporal processing techniques allow users to create low-noise time series plots and image videos and to select settings and thresholds that tailor particular output products. The TSPT GUI (graphical user interface) provides an interactive environment for crafting what-if scenarios by enabling a user to repeat product generation using different settings and thresholds. The TSPT Application Programming Interface provides more fine-tuned control of product generation, allowing experienced
Estimation of Hurst Exponent for the Financial Time Series
NASA Astrophysics Data System (ADS)
Kumar, J.; Manchanda, P.
2009-07-01
Until recently, statistical methods and Fourier analysis were employed to study fluctuations in stock markets in general and the Indian stock market in particular. The current trend, however, is to apply the concepts of wavelet methodology and the Hurst exponent; see, for example, the work of Manchanda, J. Kumar and Siddiqi, Journal of the Franklin Institute 144 (2007), 613-636, and the paper of Cajueiro and Tabak, Physica A, 2003, who checked the efficiency of emerging markets by computing the Hurst exponent over a time window of 4 years of data. Our goal in the present paper is to understand the dynamics of the Indian stock market. We look for persistency in the stock market through the Hurst exponent and the fractal dimension of time series data of BSE 100 and NIFTY 50.
Algorithm for Compressing Time-Series Data
NASA Technical Reports Server (NTRS)
Hawkins, S. Edward, III; Darlington, Edward Hugo
2012-01-01
An algorithm based on Chebyshev polynomials effects lossy compression of time-series data or other one-dimensional data streams (e.g., spectral data) that are arranged in blocks for sequential transmission. The algorithm was developed for use in transmitting data from spacecraft scientific instruments to Earth stations. In spite of its lossy nature, the algorithm preserves the information needed for scientific analysis. The algorithm is computationally simple, yet compresses data streams by factors much greater than two. The algorithm is not restricted to spacecraft or scientific uses: it is applicable to time-series data in general. The algorithm can also be applied to general multidimensional data that have been converted to time-series data, a typical example being image data acquired by raster scanning. However, unlike most prior image-data-compression algorithms, this algorithm neither depends on nor exploits the two-dimensional spatial correlations that are generally present in images. In order to understand the essence of this compression algorithm, it is necessary to understand that the net effect of this algorithm and the associated decompression algorithm is to approximate the original stream of data as a sequence of finite series of Chebyshev polynomials. For the purpose of this algorithm, a block of data or interval of time for which a Chebyshev polynomial series is fitted to the original data is denoted a fitting interval. Chebyshev approximation has two properties that make it particularly effective for compressing serial data streams with minimal loss of scientific information: The errors associated with a Chebyshev approximation are nearly uniformly distributed over the fitting interval (this is known in the art as the "equal error property"); and the maximum deviations of the fitted Chebyshev polynomial from the original data have the smallest possible values (this is known in the art as the "min-max property").
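The block-wise Chebyshev fitting described above can be sketched with NumPy's Chebyshev utilities; the block length, polynomial degree, and test signal are illustrative, not the spacecraft algorithm's actual parameters:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def compress(block, degree):
    """Fit a Chebyshev series of the given degree to one block
    (a 'fitting interval'); the coefficients are the compressed form."""
    t = np.linspace(-1.0, 1.0, len(block))
    return C.chebfit(t, block, degree)

def decompress(coeffs, n):
    """Evaluate the stored Chebyshev series back onto n samples."""
    return C.chebval(np.linspace(-1.0, 1.0, n), coeffs)

# Example: 64 samples stored as 11 coefficients (~6x fewer numbers),
# with near-uniform error across the interval (the 'equal error' property).
t = np.linspace(0, 1, 64)
block = np.exp(-3 * t) * np.cos(12 * t)
coeffs = compress(block, degree=10)
recon = decompress(coeffs, len(block))
print(len(coeffs), np.max(np.abs(recon - block)) < 0.05)
```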
Pseudotime estimation: deconfounding single cell time series
Reid, John E.; Wernisch, Lorenz
2016-01-01
Motivation: Repeated cross-sectional time series single cell data confound several sources of variation, with contributions from measurement noise, stochastic cell-to-cell variation and cell progression at different rates. Time series from single cell assays are particularly susceptible to confounding as the measurements are not averaged over populations of cells. When several genes are assayed in parallel these effects can be estimated and corrected for under certain smoothness assumptions on cell progression. Results: We present a principled probabilistic model with a Bayesian inference scheme to analyse such data. We demonstrate our method’s utility on public microarray, nCounter and RNA-seq datasets from three organisms. Our method almost perfectly recovers withheld capture times in an Arabidopsis dataset, it accurately estimates cell cycle peak times in a human prostate cancer cell line and it correctly identifies two precocious cells in a study of paracrine signalling in mouse dendritic cells. Furthermore, our method compares favourably with Monocle, a state-of-the-art technique. We also show using held-out data that uncertainty in the temporal dimension is a common confounder and should be accounted for in analyses of repeated cross-sectional time series. Availability and Implementation: Our method is available on CRAN in the DeLorean package. Contact: john.reid@mrc-bsu.cam.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27318198
Hurst exponents for short time series.
Qi, Jingchao; Yang, Huijie
2011-12-01
A concept called the balanced estimator of diffusion entropy is proposed to quantitatively detect scalings in short time series. Its effectiveness is verified by successfully detecting the scaling properties of a large number of artificial fractional Brownian motions. Calculations show that this method can give reliable scalings for short time series of length ~10^2. It is also used to detect scalings in the Shanghai Stock Index, five stock catalogs, and a total of 134 stocks collected from the Shanghai Stock Exchange Market. The scaling exponent for each catalog is significantly larger than that for the stocks included in the catalog. Selecting a window of size 650 and sliding it along the series, we obtain the evolution of the scaling for the Shanghai Stock Index. Global patterns in the evolutionary process are captured from the smoothed evolutionary curve. By comparing these patterns with the list of important events in the history of the considered stock market, the evolution of the scaling is matched to the stock index series. We find that the important events fit very well with global transitions of the scaling behavior.
Renewal, modulation, and superstatistics in times series.
Allegrini, Paolo; Barbi, Francesco; Grigolini, Paolo; Paradisi, Paolo
2006-04-01
We consider two different approaches, which we refer to as renewal and modulation, to generate time series with a nonexponential distribution of waiting times. We show that different time series with the same waiting time distribution are not necessarily statistically equivalent and might generate different physical properties. Renewal generates aging and anomalous scaling, while modulation yields no significant aging and either ordinary or anomalous diffusion, according to the dynamic prescription adopted. We show, in fact, that the physical realization of modulation generates two classes of events. The events of the first class are determined by the persistent use of the same exponential time scale for an extended lapse of time, and consequently are numerous; the events of the second class are identified with the abrupt changes from one exponential prescription to another, and consequently are rare. The events of the second class, although rare, determine the scaling of the diffusion process, and for this reason we term them crucial events. According to the prescription adopted to produce modulation, the distribution density of the time distances between two consecutive crucial events may or may not have a diverging second moment. In the former case the resulting diffusion process, although going through a transition regime very extended in time, will eventually become anomalous. In conclusion, modulation, rather than ruling out the action of renewal events, produces crucial events hidden by clouds of exponential events, thereby setting the challenge for their identification.
Cluster analysis of respiratory time series.
Adams, J M; Attinger, E O; Attinger, F M
1978-03-01
We have investigated the respiratory control system with the hypothesis that, although many variables such as minute ventilation (VI), tidal volume (VT), breathing period (TT), inspiratory duration (TI), and expiratory duration (TE) may be observed, the controller functions more simply by manipulating only 2 or 3 of these. Thus, if tidal volume is the only independent variable, TI being determined by the "off-switch" threshold, these variables should have very similar time courses. Anesthetized dogs were subjected to CO2 breathing and carotid sinus perfusion to stimulate both chemoreceptors. The time series of the variables VI, VT, TT, TE, and TI as well as PACO2 were determined on a breath-by-breath basis. Derived characteristics of these time series were compared using cluster analysis, and the latent dimensionality of respiratory control was determined by factor analysis. The characteristics of the time series clustered into 4 groups: magnitude (of the response), speed, variability and relative change. One respiratory factor accounted for 86% of the variance for the variability characteristics, 2 factors for magnitude (91%) and relative change (85%), and 3 factors for speed (89%). The respiratory variables were analysed for each of the 4 groups of characteristics, with the following results: VT and TI clustered together only for the magnitude and relative change characteristics, whereas TT and TE clustered closely for all four characteristics. One latent factor was associated with the [TT-TE] group and the other usually with PACO2.
Sliced Inverse Regression for Time Series Analysis
NASA Astrophysics Data System (ADS)
Chen, Li-Sue
1995-11-01
In this thesis, general nonlinear models for time series data are considered. A basic form is x_t = f(β_1^T X_{t-1}, β_2^T X_{t-1}, ..., β_k^T X_{t-1}, ε_t), where x_t is the observed time series, X_{t-1} is the vector of the first d time lags (x_{t-1}, x_{t-2}, ..., x_{t-d}), f is an unknown function, the β_i's are unknown vectors, and the ε_t's are independently distributed. Special cases include AR and TAR models. We investigate the feasibility of applying SIR/PHD (Li 1990, 1991) (the sliced inverse regression and principal Hessian directions methods) to estimate the β_i's. PCA (principal component analysis) is brought in to check one critical condition for SIR/PHD. Through simulation and a study of three well-known data sets (Canadian lynx, U.S. unemployment rate, and sunspot numbers), we demonstrate how SIR/PHD can effectively retrieve the interesting low-dimensional structures from time series data.
Homogeneous global mean temperature time series
Peterson, T.C.; Easterling, D.R.; Vose, R.S.; Eischeid, J.K.
1993-11-01
A multi-agency effort has been underway to create a homogeneous global baseline data set suitable for studying climate change. The joint release of the Global Historical Climatology Network (GHCN; Vose et al., 1992) version 1 in 1992 by the National Climatic Data Center/NOAA and the Carbon Dioxide Information Analysis Center/DOE gave the climate research community the largest monthly land surface global climate data set available to date, with over 6,000 temperature stations, 39% of which have more than 50 years of data and 10% of which have more than 100 years of data (see Figure 1). Fifteen different global or regional data sets were merged to create GHCN version 1. Ten of these source data sets have temperature data, but only two have been tested and adjusted for inhomogeneities in the station time series. The majority of the station temperature time series in GHCN have not been systematically examined for discontinuities.
Time-Series Analysis: A Cautionary Tale
NASA Technical Reports Server (NTRS)
Damadeo, Robert
2015-01-01
Time-series analysis has often been a useful tool in atmospheric science for deriving long-term trends in various atmospherically important parameters (e.g., temperature or the concentration of trace gas species). In particular, time-series analysis has been repeatedly applied to satellite datasets in order to derive the long-term trends in stratospheric ozone, which is a critical atmospheric constituent. However, many of the potential pitfalls relating to the non-uniform sampling of the datasets were often ignored and the results presented by the scientific community have been unknowingly biased. A newly developed and more robust application of this technique is applied to the Stratospheric Aerosol and Gas Experiment (SAGE) II version 7.0 ozone dataset and the previous biases and newly derived trends are presented.
Univariate time series forecasting algorithm validation
NASA Astrophysics Data System (ADS)
Ismail, Suzilah; Zakaria, Rohaiza; Muda, Tuan Zalizam Tuan
2014-12-01
Forecasting is a complex process which requires expert tacit knowledge to produce accurate forecast values. This complexity contributes to the gap between end users and experts. Automating the process with an algorithm can act as a bridge between them. An algorithm is a well-defined rule for solving a problem. In this study a univariate time series forecasting algorithm was developed in Java and validated using SPSS and Excel. Two sets of simulated data (yearly and non-yearly), several univariate forecasting techniques (i.e. Moving Average, Decomposition, Exponential Smoothing, Time Series Regressions and ARIMA) and recent forecasting practices (such as data partitioning, several error measures, recursive evaluation, etc.) were employed. The results of the algorithm tally with the results of SPSS and Excel. This algorithm will benefit not just forecasters but also end users who lack in-depth knowledge of the forecasting process.
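The recursive (growing-window) evaluation mentioned above can be sketched with one of the listed techniques, simple exponential smoothing; the data and smoothing constant are made up for illustration:

```python
def ses(series, alpha=0.3):
    """Simple exponential smoothing:
    level_t = alpha*y_t + (1-alpha)*level_{t-1};
    the one-step-ahead forecast is the final level."""
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

def mae(actual, forecasts):
    """Mean absolute error, one of several possible error measures."""
    return sum(abs(a - f) for a, f in zip(actual, forecasts)) / len(actual)

# Recursive evaluation: refit on a growing window, forecast one step ahead.
data = [10, 12, 11, 13, 12, 14, 13, 15, 14, 16]
preds = [ses(data[:i]) for i in range(5, len(data))]
print(round(mae(data[5:], preds), 3))
```

Validating an implementation then amounts to checking that such error measures match those from a reference package (SPSS, Excel) on the same data.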
Multifractal Analysis of Sunspot Number Time Series
NASA Astrophysics Data System (ADS)
Kasde, Satish Kumar; Gwal, Ashok Kumar; Sondhiya, Deepak Kumar
2016-07-01
Multifractal-analysis-based approaches have recently been developed as an alternative framework to study the complex dynamical fluctuations in sunspot number data, including solar cycles 20 to 23 and the ascending phase of the current solar cycle 24. To reveal the multifractal nature, the time series of monthly sunspot numbers are analyzed by singularity spectrum and multiresolution wavelet analysis. Generally, the multifractality in sunspot numbers generates turbulence with the typical characteristics of the anomalous processes governing the magnetosphere and the interior of the Sun. Our analysis shows that the singularity spectrum of the sunspot data has a well-defined Gaussian shape, which clearly establishes that the monthly sunspot number has multifractal character. The multifractal analysis is able to provide a local and adaptive description of the cyclic components of the sunspot number time series, which are non-stationary and result from nonlinear processes. Keywords: sunspot numbers, magnetic field, multifractal analysis, wavelet transform techniques.
Visibility graphlet approach to chaotic time series.
Mutua, Stephen; Gu, Changgui; Yang, Huijie
2016-05-01
Many novel methods have been proposed for mapping time series into complex networks. Although some dynamical behaviors can be effectively captured by existing approaches, the preservation and tracking of the temporal behaviors of a chaotic system remains an open problem. In this work, we extended the visibility graphlet approach to investigate both discrete and continuous chaotic time series. We applied visibility graphlets to capture the reconstructed local states, so that each is treated as a node and tracked downstream to create a temporal chain link. Our empirical findings show that the approach accurately captures the dynamical properties of chaotic systems. Networks constructed from periodic dynamic phases all converge to regular networks and to unique network structures for each model in the chaotic zones. Furthermore, our results show that the characterization of chaotic and non-chaotic zones in the Lorenz system corresponds to the maximal Lyapunov exponent, thus providing a simple and straightforward way to analyze chaotic systems.
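For orientation, the basic natural visibility graph on which the graphlet approach builds maps each sample to a node and links two samples when the straight line between them clears every intermediate sample; the graphlet extension itself (reconstructed local states chained into temporal links) is not reproduced here:

```python
import numpy as np

def visibility_graph(x):
    """Natural visibility graph: nodes are samples; i and j are linked
    if the line between (i, x[i]) and (j, x[j]) passes above every
    intermediate bar (k, x[k])."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            visible = all(
                x[k] < x[i] + (x[j] - x[i]) * (k - i) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                edges.add((i, j))
    return edges

# A strictly convex series: every pair of bars sees every other
e = visibility_graph([4.0, 1.0, 0.0, 1.0, 4.0])
print(len(e))  # 10 = complete graph on 5 nodes
```

Network measures (degree distribution, structure of the resulting graph) then serve as the time series characterization.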
Detecting anomalous phase synchronization from time series
Tokuda, Isao T.; Kumar Dana, Syamal; Kurths, Juergen
2008-06-15
Modeling approaches are presented for detecting an anomalous route to phase synchronization from time series of two interacting nonlinear oscillators. The anomalous transition is characterized by an enlargement of the mean frequency difference between the oscillators with an initial increase in the coupling strength. Although such a structure is common in a large class of coupled nonisochronous oscillators, prediction of the anomalous transition is nontrivial for experimental systems, whose dynamical properties are unknown. Two approaches are examined: one is a phase equational modeling of coupled limit cycle oscillators and the other is a nonlinear predictive modeling of coupled chaotic oscillators. Application to prototypical models such as two interacting predator-prey systems in both limit cycle and chaotic regimes demonstrates the capability of detecting the anomalous structure from only a few sets of time series. Experimental data from two coupled Chua circuits show its applicability to real experimental systems.
Applying time series analysis to performance logs
NASA Astrophysics Data System (ADS)
Kubacki, Marcin; Sosnowski, Janusz
2015-09-01
Contemporary computer systems provide mechanisms for monitoring various performance parameters (e.g. processor or memory usage, disc or network transfers), which are collected and stored in performance logs. An important issue is to derive characteristic features describing normal and abnormal behavior of the systems. For this purpose we use various schemes of analyzing time series. They have been adapted to the specificity of performance logs and verified using data collected from real systems. The presented approach is useful in evaluating system dependability.
Data mining in medical time series.
Mikut, Ralf; Reischl, Markus; Burmeister, Ole; Loose, Tobias
2006-12-01
This article proposes a modular, computer-based methodology to describe and compare medical problems using data mining methods. The methodology focuses on a mathematical formulation of typical classification problems, systematic extraction of interpretable features from time series, and an evaluation adapted to problem-specific preferences and limitations (computational power, interpretability, etc.). The approach is applied to instrumented gait analysis and to the individual design of myoelectric controllers for hand prostheses.
Aggregated Indexing of Biomedical Time Series Data
Woodbridge, Jonathan; Mortazavi, Bobak; Sarrafzadeh, Majid; Bui, Alex A.T.
2016-01-01
Remote and wearable medical sensing has the potential to create very large and high dimensional datasets. Medical time series databases must be able to efficiently store, index, and mine these datasets to enable medical professionals to effectively analyze data collected from their patients. Conventional high dimensional indexing methods are a two stage process. First, a superset of the true matches is efficiently extracted from the database. Second, supersets are pruned by comparing each of their objects to the query object and rejecting any objects falling outside a predetermined radius. This pruning stage heavily dominates the computational complexity of most conventional search algorithms. Therefore, indexing algorithms can be significantly improved by reducing the amount of pruning. This paper presents an online algorithm to aggregate biomedical time series data to significantly reduce the search space (index size) without compromising the quality of search results. This algorithm is built on the observation that biomedical time series signals are composed of cyclical and often similar patterns. The algorithm takes in a stream of segments and groups them into highly concentrated collections. Locality Sensitive Hashing (LSH) is used to reduce the overall complexity of the algorithm, allowing it to run online. The output of this aggregation is used to populate an index. The proposed algorithm yields logarithmic growth of the index (with respect to the total number of objects) while keeping sensitivity and specificity simultaneously above 98%. Both memory and runtime complexities of time series search are improved when using aggregated indexes. In addition, data mining tasks, such as clustering, exhibit runtimes that are orders of magnitude faster when run on aggregated indexes.
Analysis of Polyphonic Musical Time Series
NASA Astrophysics Data System (ADS)
Sommer, Katrin; Weihs, Claus
A general model for pitch tracking of polyphonic musical time series is introduced. Based on a model of Davy and Godsill (Bayesian harmonic models for musical pitch estimation and analysis, Technical Report 431, Cambridge University Engineering Department, 2002), the different pitches of the musical sound are estimated simultaneously with MCMC methods. Additionally, a preprocessing step is designed to improve the estimation of the fundamental frequencies (A comparative study on polyphonic musical time series using MCMC methods. In C. Preisach et al., editors, Data Analysis, Machine Learning, and Applications, Springer, Berlin, 2008). The preprocessing step compares real audio data with an alphabet constructed from the McGill Master Samples (Opolko and Wapnick, McGill University Master Samples [Compact disc], McGill University, Montreal, 1987), which consists of tones of different instruments. The tones with minimal Itakura-Saito distortion (Gray et al., Transactions on Acoustics, Speech, and Signal Processing ASSP-28(4):367-376, 1980) are chosen as first estimates and as starting points for the MCMC algorithms. Furthermore, the implementation of the alphabet is an approach to the recognition of the instruments generating the musical time series. Results are presented for mixed monophonic data from McGill and for self-recorded polyphonic audio data.
Interpretation of a compositional time series
NASA Astrophysics Data System (ADS)
Tolosana-Delgado, R.; van den Boogaart, K. G.
2012-04-01
Common methods for multivariate time series analysis use linear operations, from the definition of a time-lagged covariance/correlation to the prediction of new outcomes. However, when the time series response is a composition (a vector of positive components showing the relative importance of a set of parts in a total, like percentages and proportions), linear operations suffer from several problems. For instance, it has long been recognised that (auto/cross-)correlations between raw percentages are spurious, more dependent on which other components are being considered than on any natural link between the components of interest. Also, a long-term forecast of a composition in models with a linear trend will ultimately predict negative components. In general terms, compositional data should not be treated on a raw scale, but after a log-ratio transformation (Aitchison, 1986: The Statistical Analysis of Compositional Data. Chapman and Hall). This is so because the information conveyed by compositional data is relative, as stated in their definition. The principle of working in coordinates allows any sort of multivariate analysis to be applied to a log-ratio transformed composition, as long as the transformation is invertible. This principle fully applies to time series analysis. We discuss how results (both auto/cross-correlation functions and predictions) can be back-transformed, viewed and interpreted in a meaningful way. One view is to use the exhaustive set of all possible pairwise log-ratios, which allows the results to be expressed as D(D-1)/2 separate, interpretable sets of one-dimensional models showing the behaviour of each possible pairwise log-ratio. Another view is the interpretation of estimated coefficients or correlations back-transformed in terms of compositions. These two views are compatible and complementary. These issues are illustrated with time series of seasonal precipitation patterns at different rain gauges of the USA.
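The log-ratio principle can be sketched with the centred log-ratio (clr) transform, one of Aitchison's invertible transformations; after transforming, ordinary time series machinery applies in the coordinates, and results can be mapped back to compositions:

```python
import numpy as np

def clr(comp):
    """Centred log-ratio transform: composition -> unconstrained
    real coordinates (log of each part over the geometric mean)."""
    comp = np.asarray(comp, dtype=float)
    g = np.exp(np.mean(np.log(comp), axis=-1, keepdims=True))
    return np.log(comp / g)

def clr_inv(z):
    """Inverse clr: exponentiate and renormalize to a composition
    summing to 1."""
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# A composition (proportions summing to 1) round-trips exactly,
# and its clr coordinates sum to zero.
x = np.array([0.2, 0.3, 0.5])
z = clr(x)
print(np.allclose(clr_inv(z), x), np.isclose(z.sum(), 0.0))  # True True
```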
Characterization of noisy symbolic time series
NASA Astrophysics Data System (ADS)
Kulp, Christopher W.; Smith, Suzanne
2011-02-01
The 0-1 test for chaos is a recently developed time series characterization algorithm that can determine whether a system is chaotic or nonchaotic. While the 0-1 test was designed for deterministic series, in real-world measurement situations, noise levels may not be known and the 0-1 test may have difficulty distinguishing between chaos and randomness. In this paper, we couple the 0-1 test for chaos with a test for determinism and apply these tests to noisy symbolic series generated from various model systems. We find that the pairing of the 0-1 test with a test for determinism improves the ability to correctly distinguish between chaos and randomness from a noisy series. Furthermore, we explore the modes of failure for the 0-1 test and the test for determinism so that we can better understand the effectiveness of the two tests to handle various levels of noise. We find that while the tests can handle low noise and high noise situations, moderate levels of noise can lead to inconclusive results from the two tests.
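A compact sketch of the 0-1 test in its Gottwald-Melbourne correlation form: bounded versus diffusive growth of the translation variables p and q separates regular from chaotic dynamics. A single fixed frequency c is used here for brevity, whereas the published test takes the median over many random c values to avoid resonances:

```python
import numpy as np

def zero_one_test(x, c=1.1, n_cut_frac=0.1):
    """0-1 test sketch: build translation variables p, q from the
    series, then return K, the correlation of their mean-square
    displacement M(n) with n. K near 1 suggests chaos; near 0,
    regular dynamics."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    j = np.arange(1, N + 1)
    p = np.cumsum(x * np.cos(j * c))
    q = np.cumsum(x * np.sin(j * c))
    ncut = int(N * n_cut_frac)
    M = np.array([np.mean((p[n:] - p[:-n]) ** 2 + (q[n:] - q[:-n]) ** 2)
                  for n in range(1, ncut)])
    return np.corrcoef(np.arange(1, ncut), M)[0, 1]

# Regular signal vs. iterates of the chaotic logistic map (r = 4)
t = np.arange(2000)
k_regular = zero_one_test(np.cos(0.3 * t))
x = np.empty(2000)
x[0] = 0.3
for i in range(1999):
    x[i + 1] = 4.0 * x[i] * (1.0 - x[i])
k_chaotic = zero_one_test(x - x.mean())
print(k_regular < 0.5 < k_chaotic)
```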
Weighted Dynamic Time Warping for Time Series Classification
Jeong, Young-Seon; Jeong, Myong K; Omitaomu, Olufemi A
2011-01-01
Dynamic time warping (DTW), which finds the minimum path by providing non-linear alignments between two time series, has been widely used as a distance measure for time series classification and clustering. However, DTW does not account for the relative importance regarding the phase difference between a reference point and a testing point. This may lead to misclassification especially in applications where the shape similarity between two sequences is a major consideration for an accurate recognition. Therefore, we propose a novel distance measure, called a weighted DTW (WDTW), which is a penalty-based DTW. Our approach penalizes points with higher phase difference between a reference point and a testing point in order to prevent minimum distance distortion caused by outliers. The rationale underlying the proposed distance measure is demonstrated with some illustrative examples. A new weight function, called the modified logistic weight function (MLWF), is also proposed to systematically assign weights as a function of the phase difference between a reference point and a testing point. By applying different weights to adjacent points, the proposed algorithm can enhance the detection of similarity between two time series. We show that some popular distance measures such as DTW and Euclidean distance are special cases of our proposed WDTW measure. We extend the proposed idea to other variants of DTW such as derivative dynamic time warping (DDTW) and propose the weighted version of DDTW. We have compared the performances of our proposed procedures with other popular approaches using public data sets available through the UCR Time Series Data Mining Archive for both time series classification and clustering problems. The experimental results indicate that the proposed approaches can achieve improved accuracy for time series classification and clustering problems.
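The WDTW idea can be sketched as the standard DTW dynamic program whose local cost is scaled by the modified logistic weight function of the phase difference |i - j|; the weight parameters below are illustrative choices, not values from the paper:

```python
import numpy as np

def mlwf(phase_diff, m, g=0.05, w_max=1.0):
    """Modified logistic weight function: weight grows with the
    phase difference; g controls the steepness of the penalty."""
    return w_max / (1.0 + np.exp(-g * (phase_diff - m / 2.0)))

def wdtw_distance(a, b, g=0.05):
    """Weighted DTW: classic DP recursion over the alignment grid,
    with each cell's squared difference scaled by mlwf(|i - j|)."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            w = mlwf(abs(i - j), max(n, m), g)
            cost = w * (a[i - 1] - b[j - 1]) ** 2
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Identical series align along the diagonal at zero cost
print(wdtw_distance([1, 2, 3, 2, 1], [1, 2, 3, 2, 1]))  # 0.0
```

With g = 0 the weights are constant and the measure reduces to ordinary DTW (up to a scale factor), illustrating the paper's point that DTW is a special case of WDTW.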
Synchronization Analysis of Nonstationary Bivariate Time Series
NASA Astrophysics Data System (ADS)
Kurths, J.
First, the concept of synchronization in coupled complex systems is presented, and it is shown that synchronization phenomena are abundant in science, nature, engineering, etc. We use this concept to treat the inverse problem and to reveal interactions between oscillating systems from observational data. We discuss how time-varying phases and frequencies can be estimated from time series, and then present techniques for the detection and quantification of hidden synchronization. We demonstrate that this technique is effective for analysing systems' interrelations from noisy nonstationary bivariate data and provides insights beyond traditional cross-correlation and spectral analysis. For this, model examples and geophysical data are discussed.
A Time-Series Model for Academic Library Data Using Intervention Analysis.
ERIC Educational Resources Information Center
Naylor, Maiken; Walsh, Kathleen
1994-01-01
Discussion of methods for gathering journal use information in academic libraries (for retention decisions) highlights an 8.4-year time series of weekly library journal pickup data. Use of the autocorrelation function, spectral analysis, and intervention analysis is described. (LRW)
Developmental milestones record - 4 years
Normal childhood growth milestones - 4 years; Growth milestones for children - 4 years; Childhood growth milestones - 4 years ... care provider. PHYSICAL AND MOTOR During the fourth year, a child typically: Gains weight at the rate ...
Homogenization of precipitation time series with ACMANT
NASA Astrophysics Data System (ADS)
Domonkos, Peter
2015-10-01
A new method for the homogenization of observed precipitation (PP) totals is presented; this method is a unit of the ACMANT software package. ACMANT is a relative homogenization method; a minimum of four time series with adequate spatial correlations is necessary for its use. The detection of inhomogeneities (IHs) is performed by fitting an optimal step function, while the calculation of adjustment terms is based on minimizing the residual variance in the homogenized dataset. Together with the presentation of PP homogenization with ACMANT, some peculiarities of PP homogenization are discussed, for instance, the frequency and seasonal variation of IHs in observed PP data and their relation to the performance of homogenization methods. In climatic regions with snowy winters, ACMANT distinguishes two seasons, namely a rainy season and a snowy season, and seasonal IHs are searched for with bivariate detection. ACMANT is a fully automatic method, is freely downloadable from the internet, and accepts either daily or monthly input. Series of observed data in the input dataset may cover different periods, and data gaps are allowed. False zero values in place of missing-data codes, as well as physical outliers, should be corrected before running ACMANT. Efficiency tests indicate that ACMANT is among the best-performing methods, although further comparative tests of automatic homogenization methods are needed to confirm or reject this finding.
Fractal fluctuations in cardiac time series
NASA Technical Reports Server (NTRS)
West, B. J.; Zhang, R.; Sanders, A. W.; Miniyar, S.; Zuckerman, J. H.; Levine, B. D.; Blomqvist, C. G. (Principal Investigator)
1999-01-01
Human heart rate, controlled by complex feedback mechanisms, is a vital index of systemic circulation. However, it has been shown that beat-to-beat values of heart rate fluctuate continually over a wide range of time scales. Herein we use the relative dispersion, the ratio of the standard deviation to the mean, to show, by systematically aggregating the data, that the correlation in the beat-to-beat cardiac time series is a modulated inverse power law. This scaling property indicates the existence of long-time memory in the underlying cardiac control process and supports the conclusion that heart rate variability is a temporal fractal. We argue that the cardiac control system has allometric properties that enable it to respond to a dynamical environment through scaling.
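The aggregation procedure described above can be sketched in a few lines: block-average the series at increasing scales and track the relative dispersion; an approximately straight line on log-log axes then indicates inverse power-law scaling. This is a generic illustration, not the authors' code.

```python
import numpy as np

def relative_dispersion(x):
    """Ratio of standard deviation to mean."""
    return np.std(x) / np.mean(x)

def aggregated_rd(x, sizes):
    """Block-average the series at each aggregation size and return the
    relative dispersion at each level."""
    out = []
    for m in sizes:
        nblocks = len(x) // m
        blocks = x[: nblocks * m].reshape(nblocks, m).mean(axis=1)
        out.append(relative_dispersion(blocks))
    return np.array(out)
```

For uncorrelated noise the log-log slope is -1/2; a long-memory (fractal) series decays more slowly, which is the signature the abstract exploits.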
Radar Interferometry Time Series Analysis and Tools
NASA Astrophysics Data System (ADS)
Buckley, S. M.
2006-12-01
We consider the use of several multi-interferogram analysis techniques for identifying transient ground motions. Our approaches range from specialized InSAR processing for persistent scatterer and small baseline subset methods to the post-processing of geocoded displacement maps using a linear inversion-singular value decomposition solution procedure. To better understand these approaches, we have simulated sets of interferograms spanning several deformation phenomena, including localized subsidence bowls with constant velocity and seasonal deformation fluctuations. We will present results and insights from the application of these time series analysis techniques to several land subsidence study sites with varying deformation and environmental conditions, e.g., arid Phoenix and coastal Houston-Galveston metropolitan areas and rural Texas sink holes. We consistently find that the time invested in implementing, applying and comparing multiple InSAR time series approaches for a given study site is rewarded with a deeper understanding of the techniques and deformation phenomena. To this end, and with support from NSF, we are preparing a first-version of an InSAR post-processing toolkit to be released to the InSAR science community. These studies form a baseline of results to compare against the higher spatial and temporal sampling anticipated from TerraSAR-X as well as the trade-off between spatial coverage and resolution when relying on ScanSAR interferometry.
Modeling stylized facts for financial time series
NASA Astrophysics Data System (ADS)
Krivoruchenko, M. I.; Alessio, E.; Frappietro, V.; Streckert, L. J.
2004-12-01
Multivariate probability density functions of returns are constructed in order to model the empirical behavior of returns in a financial time series. They describe the well-established deviations from the Gaussian random walk, such as an approximate scaling and heavy tails of the return distributions, long-ranged volatility-volatility correlations (volatility clustering) and return-volatility correlations (leverage effect). The model is tested successfully to fit joint distributions of the 100+ years of daily price returns of the Dow Jones 30 Industrial Average.
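Two of the stylized facts named above, fast decay of raw return correlations versus slow decay of volatility correlations, are easy to reproduce on synthetic data. The sketch below uses a simple GARCH(1,1)-style recursion as a stand-in for real returns; the coefficients are illustrative and not fitted to the Dow Jones data.

```python
import numpy as np

def acf(x, lag):
    """Sample autocorrelation of x at a given lag."""
    x = np.asarray(x, float)
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

# Illustrative GARCH(1,1)-like simulation: the variance sigma2 is
# persistent, so |r| stays correlated while r itself does not.
rng = np.random.default_rng(0)
n = 20000
sigma2 = np.empty(n)
r = np.empty(n)
sigma2[0], r[0] = 1.0, 0.0
for t in range(1, n):
    sigma2[t] = 0.05 + 0.8 * sigma2[t - 1] + 0.15 * r[t - 1] ** 2
    r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
```

Comparing acf(r, 10) with acf(np.abs(r), 10) shows volatility clustering: the absolute-return correlation is markedly larger.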
Time Series Photometry of KZ Lacertae
NASA Astrophysics Data System (ADS)
Joner, Michael D.
2016-01-01
We present BVRI time series photometry of the high amplitude delta Scuti star KZ Lacertae secured using the 0.9-meter telescope located at the Brigham Young University West Mountain Observatory. In addition to the multicolor light curves that are presented, the V data from the last six years of observations are used to plot an O-C diagram in order to determine the ephemeris and evaluate evidence for period change. We wish to thank the Brigham Young University College of Physical and Mathematical Sciences as well as the Department of Physics and Astronomy for their continued support of the research activities at the West Mountain Observatory.
Forecasting spore concentrations: A time series approach
NASA Astrophysics Data System (ADS)
Stephen, Elaine; Raftery, Adrian E.; Dowding, Paul
1990-06-01
Fungal basidiospores and Cladosporium spores are the two most numerous spore types in the air of Dublin and its surroundings. They are known to have allergenic components, and the aim of the study described here is to develop a predictive model for these spores. A very simple model, which combines an estimated diurnal rhythm with a simple, one-parameter time series model, provided good short-term forecasts. The one-step prediction error variance was reduced by 88% for Cladosporium spores and by 98% for basidiospores.
Time Series Analysis of SOLSTICE Measurements
NASA Astrophysics Data System (ADS)
Wen, G.; Cahalan, R. F.
2003-12-01
Solar radiation is the major energy source for the Earth's biosphere and atmospheric and ocean circulations. Variations of solar irradiance have been a major concern of scientists both in solar physics and atmospheric sciences. A number of missions have been carried out to monitor changes in total solar irradiance (TSI) [see Fröhlich and Lean, 1998 for review] and spectral solar irradiance (SSI) [e.g., SOLSTICE on UARS and VIRGO on SOHO]. Observations over a long time period reveal the connection between variations in solar irradiance and surface magnetic fields of the Sun [Lean, 1997]. This connection provides a guide to scientists in modeling solar irradiances [e.g., Fontenla et al., 1999; Krivova et al., 2003]. Solar spectral observations have now been made over a relatively long time period, allowing statistical analysis. This paper focuses on the predictability of solar spectral irradiance using observed SSI from SOLSTICE. The analysis of predictability is based on nonlinear dynamics, using an artificial neural network in a reconstructed phase space [Abarbanel et al., 1993]. In the analysis, we first examine the average mutual information of the observed time series and a delayed time series. The time delay that gives a local minimum of mutual information is chosen as the time delay for phase space reconstruction [Fraser and Swinney, 1986]. The embedding dimension of the reconstructed phase space is determined using the false neighbors and false strands method [Kennel and Abarbanel, 2002]. Subsequently, we use a multi-layer feed-forward network with a back-propagation scheme [e.g., Haykin, 1994] to model the time series. The predictability of solar irradiance as a function of wavelength is considered. References: Abarbanel, H. D. I., R. Brown, J. J. Sidorowich, and L. Sh. Tsimring, Rev. Mod. Phys. 65, 1331, 1993. Fraser, A. M. and H. L. Swinney, Phys. Rev. 33A, 1134, 1986. Fontenla, J., O. R. White, P. Fox, E. H. Avrett and R. L. Kurucz, The Astrophysical Journal, 518, 480
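The delay-selection step above can be sketched with a histogram estimator of the average mutual information between x(t) and x(t + τ); one scans τ and picks the first local minimum. The bin count and estimator details here are assumptions for illustration.

```python
import numpy as np

def average_mutual_information(x, delay, bins=16):
    """Histogram estimate of the mutual information between x(t) and
    x(t + delay); the first local minimum over delays is a common
    choice of embedding delay (Fraser & Swinney)."""
    a, b = x[:-delay], x[delay:]
    pxy, _, _ = np.histogram2d(a, b, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    mask = pxy > 0  # avoid log(0) on empty cells
    return float((pxy[mask] * np.log(pxy[mask] / np.outer(px, py)[mask])).sum())
```

For a clean sinusoid, the mutual information is high at very short delays and drops toward a quarter period, illustrating why the first local minimum is a sensible embedding delay.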
Time series analysis of temporal networks
NASA Astrophysics Data System (ADS)
Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh
2016-01-01
A common but important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application-oriented studies, knowing these properties alone is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without knowledge of the full network structure; an estimate of some of the properties is sufficient to launch the attack. In this paper, we show that even if the network structure at a future time point is not available, one can still estimate its properties. We propose a novel method to map a temporal network to a set of time series instances, analyze them, and, using a standard time series forecast model, predict the properties of the temporal network at a later time instance. To this aim, we consider eight properties, such as the number of active nodes, average degree, and clustering coefficient, and apply our prediction framework to them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as an Auto-Regressive Integrated Moving Average (ARIMA) process. We use cross-validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency-domain properties of the time series, obtained from spectrogram analysis, can be used to refine the prediction framework by identifying beforehand the cases where the prediction error is likely to be high. This leads to an improvement of 7.96% (for error level ≤20%) in prediction accuracy on average across all datasets. As an application, we show how such a prediction scheme can be used to launch targeted attacks on temporal networks. Contribution to the Topical Issue
Partial spectral analysis of hydrological time series
NASA Astrophysics Data System (ADS)
Jukić, D.; Denić-Jukić, V.
2011-03-01
Hydrological time series comprise the influences of numerous processes involved in the transfer of water in the hydrological cycle. This implies that an ambiguity exists with respect to the processes encoded in spectral and cross-spectral density functions. Previous studies have not paid adequate attention to this issue. Spectral and cross-spectral density functions are the Fourier transforms of auto-covariance and cross-covariance functions. Using this basic property, the ambiguity is resolved by applying a novel approach based on the spectral representation of partial correlation. The mathematical background for the partial spectral density, partial amplitude, and partial phase functions is presented. The proposed functions yield estimates of spectral density, amplitude, and phase that are not affected by a controlling process. If an input-output relation is the subject of interest, antecedent and subsequent influences of the controlling process can be distinguished by considering the input event as a reference point. The method is used to analyze the relations between rainfall, air temperature, and relative humidity, as well as the influences of air temperature and relative humidity on the discharge from a karst spring. The time series were collected in the catchment of the Jadro Spring, located in the Dinaric karst area of Croatia.
Nonparametric, nonnegative deconvolution of large time series
NASA Astrophysics Data System (ADS)
Cirpka, O. A.
2006-12-01
There is a long tradition of characterizing hydrologic systems by linear models, in which the response of the system to a time-varying stimulus is computed by convolution of a system-specific transfer function with the input signal. Despite its limitations, the transfer-function concept has proven valuable in many situations, such as the precipitation/run-off relationships of catchments and solute transport in agricultural soils and aquifers. A practical difficulty lies in the identification of the transfer function. A common approach is to fit a parametric function, enforcing a particular shape of the transfer function, which may contradict the real behavior (e.g., multimodal transfer functions, long tails, etc.). In our nonparametric deconvolution, the transfer function is assumed to be an auto-correlated random function of time, which is conditioned on the data by a Bayesian approach. Nonnegativity, a vital constraint for solute-transport applications, is enforced by the method of Lagrange multipliers. This makes the inverse problem nonlinear. In nonparametric deconvolution, identifying the auto-correlation parameters is crucial. Enforcing too much smoothness prohibits the identification of important features, whereas insufficient smoothing leads to physically meaningless transfer functions, mapping noise components in the two data series onto each other. We identify optimal smoothness parameters by the expectation-maximization method, which requires the repeated generation of many conditional realizations. The overall approach, however, is still significantly faster than Markov-chain Monte Carlo methods presented recently. We apply our approach to electric-conductivity time series measured in a river and in monitoring wells in the adjacent aquifer. The data cover 1.5 years with a temporal resolution of 1 h. The identified transfer functions have lengths of up to 60 days, making up 1440 parameters. We believe that nonparametric deconvolution is an
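The nonnegativity constraint (enforced in the abstract via Lagrange multipliers inside a Bayesian scheme) can be illustrated more simply with projected gradient descent on the discrete convolution model y = G f, f ≥ 0. This sketch omits the auto-correlation prior and the EM smoothness selection entirely.

```python
import numpy as np

def conv_matrix(g, m):
    """Toeplitz matrix G such that G @ f equals np.convolve(f, g)."""
    G = np.zeros((m + len(g) - 1, m))
    for j in range(m):
        G[j:j + len(g), j] = g
    return G

def nn_deconv(y, g, m, n_iter=5000):
    """Nonnegative least-squares deconvolution by projected gradient:
    minimize ||y - G f||^2 subject to f >= 0."""
    G = conv_matrix(g, m)
    step = 1.0 / np.linalg.norm(G.T @ G, 2)  # 1 / Lipschitz constant
    f = np.zeros(m)
    for _ in range(n_iter):
        f = np.maximum(f - step * (G.T @ (G @ f - y)), 0.0)
    return f
```

On a small well-conditioned problem this recovers a nonnegative transfer function exactly; the paper's real difficulty, choosing the smoothness of f, is not addressed here.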
Assessing burn severity using satellite time series
NASA Astrophysics Data System (ADS)
Veraverbeke, Sander; Lhermitte, Stefaan; Verstraeten, Willem; Goossens, Rudi
2010-05-01
In this study a multi-temporal differenced Normalized Burn Ratio (dNBRMT) is presented to assess burn severity of the 2007 Peloponnese (Greece) wildfires. 8-day composites were created using the daily near infrared (NIR) and mid infrared (MIR) reflectance products of the Moderate Resolution Imaging Spectroradiometer (MODIS). Prior to the calculation of the dNBRMT a pixel-based control plot selection procedure was initiated for each burned pixel based on time series similarity of the pre-fire year 2006 to estimate the spatio-temporal NBR dynamics in the case that no fire event would have occurred. The dNBRMT is defined as the one-year post-fire integrated difference between the NBR values of the control and focal pixels. Results reveal the temporal dependency of the absolute values of bi-temporal dNBR maps as the mean temporal standard deviation of the one-year post-fire bi-temporal dNBR time series equaled 0.14 (standard deviation of 0.04). The dNBRMT's integration of temporal variability into one value potentially enhances the comparability of fires across space and time. In addition, the dNBRMT is robust to random noise thanks to the averaging effect. The dNBRMT, based on coarse resolution imagery with high temporal frequency, has the potential to become either a valuable complement to fine resolution Landsat dNBR mapping or an imperative option for assessing burn severity at a continental to global scale.
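For reference, the underlying index is simple to compute. The sketch below shows the NBR from NIR/MIR reflectance and a toy version of the multi-temporal difference as a sum, over post-fire composites, of control minus focal NBR; the study's exact integration details may differ.

```python
import numpy as np

def nbr(nir, mir):
    """Normalized Burn Ratio from near- and mid-infrared reflectance."""
    nir = np.asarray(nir, float)
    mir = np.asarray(mir, float)
    return (nir - mir) / (nir + mir)

def dnbr_mt(nbr_control, nbr_burned):
    """Toy multi-temporal dNBR: integrated (summed) difference between
    the control pixel's NBR and the burned (focal) pixel's NBR."""
    return float(np.sum(np.asarray(nbr_control) - np.asarray(nbr_burned)))
```

A burned pixel has depressed NBR relative to its control, so the integrated difference is positive and grows with severity.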
Periodograms for multiband astronomical time series
NASA Astrophysics Data System (ADS)
Ivezic, Z.; VanderPlas, J. T.
2016-05-01
We summarize the multiband periodogram, a general extension of the well-known Lomb-Scargle approach for detecting periodic signals in time-domain data developed by VanderPlas & Ivezic (2015). A Python implementation of this method is available on GitHub. The multiband periodogram significantly improves period finding for randomly sampled multiband light curves (e.g., Pan-STARRS, DES, and LSST), and can treat non-uniform sampling and heteroscedastic errors. The light curves in each band are modeled as arbitrary truncated Fourier series, with the period and phase shared across all bands. The key aspect is the use of Tikhonov regularization which drives most of the variability into the so-called base model common to all bands, while fits for individual bands describe residuals relative to the base model and typically require lower-order Fourier series. We use simulated light curves and randomly subsampled SDSS Stripe 82 data to demonstrate the superiority of this method compared to other methods from the literature, and find that this method will be able to efficiently determine the correct period in the majority of LSST's bright RR Lyrae stars with as little as six months of LSST data.
Correcting and combining time series forecasters.
Firmino, Paulo Renato A; de Mattos Neto, Paulo S G; Ferreira, Tiago A E
2014-02-01
Combined forecasters have been in the vanguard of stochastic time series modeling. It has thus been usual to suppose that each single model generates a residual, or prediction error, like a white noise. However, mostly because of disturbances not captured by each model, this supposition may be violated. The present paper introduces a two-step method for correcting and combining forecasting models. First, the stochastic process underlying the bias of each predictive model is built according to a recursive ARIMA algorithm in order to achieve white-noise behavior. At each iteration of the algorithm, the best ARIMA adjustment is determined according to a given information criterion (e.g., Akaike). Then, in light of the corrected predictions, a maximum likelihood combined estimator is considered. Applications involving single ARIMA and artificial neural network models for the Dow Jones Industrial Average Index, S&P 500 Index, Google stock value, and Nasdaq Index series illustrate the usefulness of the proposed framework. PMID:24239986
Scaling laws from geomagnetic time series
Voros, Z.; Kovacs, P.; Juhasz, A.; Kormendi, A.; Green, A.W.
1998-01-01
The notion of extended self-similarity (ESS) is applied here to the X-component time series of geomagnetic field fluctuations. Plotting nth-order structure functions against the fourth-order structure function, we show that low-frequency geomagnetic fluctuations up to order n = 10 follow the same scaling laws as MHD fluctuations in the solar wind; however, for higher frequencies (f > 1/(5 h)) a clear departure from the expected universality is observed for n > 6. ESS does not allow us to make an unambiguous statement about the nontriviality of scaling laws in "geomagnetic" turbulence. However, we suggest using higher-order moments as promising diagnostic tools for mapping the contributions of various remote magnetospheric sources to local observatory data. Copyright 1998 by the American Geophysical Union.
Deconvolution of time series in the laboratory
NASA Astrophysics Data System (ADS)
John, Thomas; Pietschmann, Dirk; Becker, Volker; Wagner, Christian
2016-10-01
In this study, we present two practical applications of the deconvolution of time series in Fourier space. First, we reconstruct a filtered input signal of sound cards that has been heavily distorted by a built-in high-pass filter using a software approach. Using deconvolution, we can partially bypass the filter and extend the dynamic frequency range by two orders of magnitude. Second, we construct required input signals for a mechanical shaker in order to obtain arbitrary acceleration waveforms, referred to as feedforward control. For both situations, experimental and theoretical approaches are discussed to determine the system-dependent frequency response. Moreover, for the shaker, we propose a simple feedback loop as an extension to the feedforward control in order to handle nonlinearities of the system.
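The first application can be sketched generically: if the system's impulse response is known, dividing the measured spectrum by the frequency response (with a small regularization term so that near-zero response bins do not amplify noise) recovers the input. The eps safeguard below is a common Wiener-style choice, not necessarily the authors' exact scheme.

```python
import numpy as np

def fft_deconvolve(measured, response, eps=1e-3):
    """Deconvolution in Fourier space: divide the measured spectrum by
    the system's frequency response H, regularized so bins where |H| is
    tiny do not blow up noise."""
    M = np.fft.rfft(measured)
    H = np.fft.rfft(response, n=len(measured))
    Hc = np.conj(H)
    return np.fft.irfft(M * Hc / (H * Hc + eps), n=len(measured))
```

For a known, well-conditioned response and small eps, the circular-convolution model is inverted almost exactly; in practice eps trades reconstruction bias against noise amplification.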
Using entropy to cut complex time series
NASA Astrophysics Data System (ADS)
Mertens, David; Poncela Casasnovas, Julia; Spring, Bonnie; Amaral, L. A. N.
2013-03-01
Using techniques from statistical physics, physicists have modeled and analyzed human phenomena varying from academic citation rates to disease spreading to vehicular traffic jams. The last decade's explosion of digital information and the growing ubiquity of smartphones have led to a wealth of human self-reported data. This wealth of data comes at a cost, including non-uniform sampling and statistically significant but physically insignificant correlations. In this talk I present our work using entropy to identify stationary sub-sequences of self-reported human weight from a weight management web site. Our entropic approach, inspired by the infomap network community detection algorithm, is far less biased by rare fluctuations than more traditional time series segmentation techniques. Supported by the Howard Hughes Medical Institute.
Normalizing the causality between time series
NASA Astrophysics Data System (ADS)
Liang, X. San
2015-08-01
Recently, a rigorous yet concise formula was derived to evaluate information flow, and hence the causality in a quantitative sense, between time series. To assess the importance of a resulting causality, it needs to be normalized. The normalization is achieved through distinguishing a Lyapunov exponent-like, one-dimensional phase-space stretching rate and a noise-to-signal ratio from the rate of information flow in the balance of the marginal entropy evolution of the flow recipient. It is verified with autoregressive models and applied to a real financial analysis problem. An unusually strong one-way causality is identified from IBM (International Business Machines Corporation) to GE (General Electric Company) in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a giant for the mainframe computer market.
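A minimal sketch of the estimator is given below: the maximum-likelihood information-flow formula for a linear model, in the spirit of Liang's work. The exact coefficient structure is written from memory and should be treated as an assumption; the normalization step described in the abstract is omitted.

```python
import numpy as np

def liang_info_flow(x1, x2, dt=1.0):
    """Sketch of an (un-normalized) information-flow estimate T_{2->1}
    from two series, using sample covariances of the series and the
    Euler-forward derivative of x1. Nonzero |T| suggests x2 drives x1."""
    dx1 = (x1[1:] - x1[:-1]) / dt
    x1, x2 = x1[:-1], x2[:-1]
    C = np.cov(np.vstack([x1, x2]))
    c11, c12, c22 = C[0, 0], C[0, 1], C[1, 1]
    c1d = np.cov(x1, dx1)[0, 1]  # cov(x1, dx1/dt)
    c2d = np.cov(x2, dx1)[0, 1]  # cov(x2, dx1/dt)
    return (c11 * c12 * c2d - c12 ** 2 * c1d) / (c11 ** 2 * c22 - c11 * c12 ** 2)
```

On a coupled autoregressive pair where x2 drives x1 but not vice versa, the estimated flow is markedly asymmetric, mirroring the verification with autoregressive models mentioned above.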
Inferring phase equations from multivariate time series.
Tokuda, Isao T; Jain, Swati; Kiss, István Z; Hudson, John L
2007-08-10
An approach is presented for extracting phase equations from multivariate time series data recorded from a network of weakly coupled limit cycle oscillators. Our aim is to estimate important properties of the phase equations including natural frequencies and interaction functions between the oscillators. Our approach requires the measurement of an experimental observable of the oscillators; in contrast with previous methods it does not require measurements in isolated single or two-oscillator setups. This noninvasive technique can be advantageous in biological systems, where extraction of few oscillators may be a difficult task. The method is most efficient when data are taken from the nonsynchronized regime. Applicability to experimental systems is demonstrated by using a network of electrochemical oscillators; the obtained phase model is utilized to predict the synchronization diagram of the system.
OPTIMAL TIME-SERIES SELECTION OF QUASARS
Butler, Nathaniel R.; Bloom, Joshua S.
2011-03-15
We present a novel method for the optimal selection of quasars using time-series observations in a single photometric bandpass. Utilizing the damped random walk model of Kelly et al., we parameterize the ensemble quasar structure function in Sloan Stripe 82 as a function of observed brightness. The ensemble model fit can then be evaluated rigorously for, and calibrated with, individual light curves with no parameter fitting. This yields a classification in two statistics: one describing the fit confidence and the other describing the probability of a false alarm. Both can be tuned, a priori, to achieve high quasar detection fractions (99% completeness with default cuts), given an acceptable rate of false alarms. We establish the typical rate of false alarms due to known variable stars as ≲3% (high purity). Applying the classification, we increase the sample of potential quasars relative to those known in Stripe 82 by as much as 29%, and by nearly a factor of two in the redshift range 2.5 < z < 3, where selection by color is extremely inefficient. This represents 1875 new quasars in a 290 deg² field. The observed rates of both quasars and stars agree well with the model predictions, with >99% of quasars exhibiting the expected variability profile. We discuss the utility of the method at high redshift and in the regime of noisy and sparse data. Our time-series selection complements well independent selection based on quasar colors and has strong potential for identifying high-redshift quasars for Baryon Acoustic Oscillation and other cosmology studies in the LSST era.
PERIODOGRAMS FOR MULTIBAND ASTRONOMICAL TIME SERIES
VanderPlas, Jacob T.; Ivezic, Željko
2015-10-10
This paper introduces the multiband periodogram, a general extension of the well-known Lomb–Scargle approach for detecting periodic signals in time-domain data. In addition to advantages of the Lomb–Scargle method such as treatment of non-uniform sampling and heteroscedastic errors, the multiband periodogram significantly improves period finding for randomly sampled multiband light curves (e.g., Pan-STARRS, DES, and LSST). The light curves in each band are modeled as arbitrary truncated Fourier series, with the period and phase shared across all bands. The key aspect is the use of Tikhonov regularization which drives most of the variability into the so-called base model common to all bands, while fits for individual bands describe residuals relative to the base model and typically require lower-order Fourier series. This decrease in the effective model complexity is the main reason for improved performance. After a pedagogical development of the formalism of least-squares spectral analysis, which motivates the essential features of the multiband model, we use simulated light curves and randomly subsampled SDSS Stripe 82 data to demonstrate the superiority of this method compared to other methods from the literature and find that this method will be able to efficiently determine the correct period in the majority of LSST’s bright RR Lyrae stars with as little as six months of LSST data, a vast improvement over the years of data reported to be required by previous studies. A Python implementation of this method, along with code to fully reproduce the results reported here, is available on GitHub.
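A toy version of shared-period multiband fitting can illustrate the core idea: at each trial period, fit a truncated Fourier series per band by least squares (period shared, per-band amplitude and phase free) and score the total chi-square reduction. This omits the Tikhonov-regularized shared base model that gives the published method its edge; all names and parameters here are illustrative.

```python
import numpy as np

def multiband_power(t, y, band, periods, nterms=1):
    """Scan trial periods; at each one, fit per-band Fourier terms by
    least squares and return the fractional chi-square reduction."""
    scores = []
    for P in periods:
        omega = 2 * np.pi / P
        chi2 = 0.0
        chi2_0 = 0.0
        for b in np.unique(band):
            m = band == b
            yb = y[m] - y[m].mean()
            cols = []
            for k in range(1, nterms + 1):
                cols += [np.sin(k * omega * t[m]), np.cos(k * omega * t[m])]
            X = np.column_stack(cols)
            coef = np.linalg.lstsq(X, yb, rcond=None)[0]
            resid = yb - X @ coef
            chi2 += resid @ resid
            chi2_0 += yb @ yb
        scores.append(1.0 - chi2 / chi2_0)
    return np.array(scores)
```

On randomly sampled two-band data with band-dependent phases, the score peaks at the true shared period even though neither band alone need be well sampled.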
Financial time series: A physics perspective
NASA Astrophysics Data System (ADS)
Gopikrishnan, Parameswaran; Plerou, Vasiliki; Amaral, Luis A. N.; Rosenow, Bernd; Stanley, H. Eugene
2000-06-01
Physicists in the last few years have started applying concepts and methods of statistical physics to understand economic phenomena. The word ``econophysics'' is sometimes used to refer to this work. One reason for this interest is that economic systems such as financial markets are examples of complex interacting systems for which a huge amount of data exists, and economic problems viewed from a different perspective might yield new results. This article reviews the results of a few recent phenomenological studies focused on understanding the distinctive statistical properties of financial time series. We discuss three recent results. (i) The probability distribution of stock price fluctuations: stock price fluctuations occur in all magnitudes, in analogy to earthquakes, from tiny fluctuations to very drastic events such as market crashes, e.g., the crash of October 19, 1987, sometimes referred to as ``Black Monday''. The distribution of price fluctuations decays with a power-law tail well outside the Lévy stable regime and describes fluctuations that differ by as much as 8 orders of magnitude. In addition, this distribution preserves its functional form for fluctuations on time scales that differ by 3 orders of magnitude, from 1 min up to approximately 10 days. (ii) Correlations in financial time series: while price fluctuations themselves have rapidly decaying correlations, the magnitude of fluctuations measured by either the absolute value or the square of the price fluctuations has correlations that decay as a power law and persist for several months. (iii) Correlations among different companies: the third result bears on the application of random matrix theory to understand the correlations among price fluctuations of any two different stocks. From a study of the eigenvalue statistics of the cross-correlation matrix constructed from price fluctuations of the leading 1000 stocks, we find that the largest 5-10% of the eigenvalues and
Timing calibration and spectral cleaning of LOFAR time series data
NASA Astrophysics Data System (ADS)
Corstanje, A.; Buitink, S.; Enriquez, J. E.; Falcke, H.; Hörandel, J. R.; Krause, M.; Nelles, A.; Rachen, J. P.; Schellart, P.; Scholten, O.; ter Veen, S.; Thoudam, S.; Trinh, T. N. G.
2016-05-01
We describe a method for spectral cleaning and timing calibration of short time series data of the voltage in individual radio interferometer receivers. It makes use of phase differences in fast Fourier transform (FFT) spectra across antenna pairs. For strong, localized terrestrial sources these are stable over time, while being approximately uniform-random for a sum over many sources or for noise. Using only milliseconds-long datasets, the method finds the strongest interfering transmitters, a first-order solution for relative timing calibrations, and faulty data channels. No knowledge of gain response or quiescent noise levels of the receivers is required. With relatively small data volumes, this approach is suitable for use in an online system monitoring setup for interferometric arrays. We have applied the method to our cosmic-ray data collection, a collection of measurements of short pulses from extensive air showers, recorded by the LOFAR radio telescope. Per air shower, we have collected 2 ms of raw time series data for each receiver. The spectral cleaning has a calculated optimal sensitivity corresponding to a power signal-to-noise ratio of 0.08 (or -11 dB) in a spectral window of 25 kHz, for 2 ms of data in 48 antennas. This is well sufficient for our application. Timing calibration across individual antenna pairs has been performed at 0.4 ns precision; for calibration of signal clocks across stations of 48 antennas the precision is 0.1 ns. Monitoring differences in timing calibration per antenna pair over the course of the period 2011 to 2015 shows a precision of 0.08 ns, which is useful for monitoring and correcting drifts in signal path synchronizations. A cross-check method for timing calibration is presented, using a pulse transmitter carried by a drone flying over the array. Timing precision is similar, 0.3 ns, but is limited by transmitter position measurements, while requiring dedicated flights.
Peat conditions mapping using MODIS time series
NASA Astrophysics Data System (ADS)
Poggio, Laura; Gimona, Alessandro; Bruneau, Patricia; Johnson, Sally; McBride, Andrew; Artz, Rebekka
2016-04-01
Large areas of Scotland are covered in peatlands, which provide an important carbon sink in their near-natural state but act as a potential source of gaseous and dissolved carbon emissions when not in good condition. Data on the condition of most peatlands in Scotland are, however, scarce, largely confined to sites under nature protection designations, and often biased towards sites in better condition. The best information available at present is derived from labour-intensive field-based monitoring of relatively few designated sites (the Common Standard Monitoring dataset, CSM). In order to provide a national dataset of peat condition, the available point information from the CSM data was modelled with morphological features and information derived from the MODIS sensor. In particular, we used time series of indices describing vegetation greenness (Enhanced Vegetation Index), water availability (Normalised Difference Water Index), Land Surface Temperature, and vegetation productivity (Gross Primary Productivity). A scorpan-kriging approach was used, with Generalised Additive Models describing the trend. The model provided the probability of a site being in favourable condition, and the uncertainty of the predictions was taken into account. Internal validation (leave-one-out) gave a misclassification error of around 0.25. The derived dataset was then used, among others, in the decision-making process for the selection of sites for restoration.
NASA Astrophysics Data System (ADS)
Gao, Xiangyun; An, Haizhong; Fang, Wei; Huang, Xuan; Li, Huajiao; Zhong, Weiqiong; Ding, Yinghui
2014-07-01
The linear regression parameters between two time series can differ under different lengths of the observation period. If we study the whole period through a sliding window of a short period, the change of the linear regression parameters is a process of dynamic transmission over time. We present a simple and efficient computational scheme: a linear regression patterns transmission algorithm, which transforms linear regression patterns into directed and weighted networks. The linear regression patterns (nodes) are defined by the combination of intervals of the linear regression parameters and the results of significance testing under different sizes of the sliding window. The transmissions between adjacent patterns are defined as edges, and the weights of the edges are the frequencies of the transmissions. The major patterns, the distance, and the medium in the process of the transmission can be captured. The statistical results for weighted out-degree and betweenness centrality are mapped on timelines, which shows the features of the distribution of the results. Many measurements in different areas that involve two related time series variables could take advantage of this algorithm to characterize the dynamic relationships between the time series from a new perspective.
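A minimal sketch of the idea (the window size, slope bins, and the use of the slope alone as the pattern label are illustrative assumptions; the paper also incorporates intercept intervals and significance-test results into the pattern definition):

```python
import numpy as np
from collections import Counter

def pattern_network(x, y, window=30, step=1, slope_bins=(-0.5, 0.0, 0.5)):
    """Sliding-window OLS of y on x; each window's slope is binned into a
    pattern label, and transitions between consecutive windows become
    weighted directed edges (weight = transition frequency)."""
    labels = []
    for start in range(0, len(x) - window + 1, step):
        xs, ys = x[start:start + window], y[start:start + window]
        slope = np.polyfit(xs, ys, 1)[0]
        labels.append(int(np.digitize(slope, slope_bins)))  # pattern id
    edges = Counter(zip(labels, labels[1:]))                # weighted edges
    return labels, edges

rng = np.random.default_rng(0)
x = rng.standard_normal(300)
y = 2 * x + 0.1 * rng.standard_normal(300)   # stable relationship, slope ~ 2
labels, edges = pattern_network(x, y)
```

With a stable relationship every window falls in the same slope bin, so the network collapses to a single self-loop whose weight counts all transitions; regime changes would split the weight across several nodes.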
Long-time Behavior of Surface Electromyography Time Series
NASA Astrophysics Data System (ADS)
Vyhnalek, Brian; Zurcher, Ulrich; Kaufman, Miron; Sung, Paul
2008-10-01
We have previously reported that the mean-square displacement (msd) from the sEMG time series x_i with i = 1, 2, …, 2^16 exhibits diffusive behavior for short times, t ≲ 50 ms, followed by plateau-like behavior for intermediate times, 50 ms ≲ t ≲ 500 ms. For long times, t ≳ 500 ms, the msd increases as time t increases. We show that the long-time behavior reflects non-stationarity of the signal; we find that for a fixed time interval t = const, the displacement X_{s,t} = ∑_{i=0}^{t-1} x_{s+i} ≈ μ1 for s ∈ [s0, s1] and X_{s,t} ≈ −μ2 for s ∉ [s0, s1]. This property explains the fit of the probability distribution p_t(X) = ⟨δ(X − X_{s,t})⟩_s as a superposition of two Gaussians that we reported in Physica A 386, 698-709 (2007). Supported by a grant from the Research Corporation [UZ].
The scaling of time series size towards detrended fluctuation analysis
NASA Astrophysics Data System (ADS)
Gao, Xiaolei; Ren, Liwei; Shang, Pengjian; Feng, Guochen
2016-06-01
In this paper, we introduce a modification of detrended fluctuation analysis (DFA), called the multivariate DFA (MNDFA) method, based on the scaling of the time series size N. In the traditional DFA method we examined the influence of the sequence segmentation interval s, which inspired us to propose the MNDFA model to discuss the scaling of time series size in DFA. The effectiveness of the procedure is verified by numerical experiments with both artificial and stock return series. Results show that the proposed MNDFA method captures more significant information about a series than the traditional DFA method. The scaling of the time series size has an influence on the auto-correlation (AC) of the time series. For certain series, we obtain an exponential relationship and calculate the slope through the fitting function. Our analysis and a finite-size effect test demonstrate that an appropriate choice of the time series size can avoid unnecessary influences and make the testing results more accurate.
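For context, the traditional DFA baseline that MNDFA modifies can be sketched in a few lines (the scale choices and the white-noise check, for which the scaling exponent should be near 0.5, are illustrative assumptions):

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis: fluctuation F(s) per window size s."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    F = []
    for s in scales:
        f2 = []
        for w in range(len(y) // s):
            seg = y[w * s:(w + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)       # local linear detrending
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(f2)))
    return np.array(F)

rng = np.random.default_rng(0)
x = rng.standard_normal(4096)                  # white noise: alpha ~ 0.5
scales = np.array([16, 32, 64, 128, 256])
F = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]   # scaling exponent
```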
Practical overview of ARIMA models for time-series forecasting
Pack, D.J.
1980-01-01
Single-series analysis methodology is illustrated. The commentary summarizes the Box-Jenkins philosophy and the ARIMA model structure, with particular emphasis on practical aspects of application, forecast interpretation, strengths and weaknesses, and comparison to other time series forecasting approaches. (GHT)
From time series to complex networks: the visibility graph.
Lacasa, Lucas; Luque, Bartolo; Ballesteros, Fernando; Luque, Jordi; Nuño, Juan Carlos
2008-04-01
In this work we present a simple and fast computational method, the visibility algorithm, that converts a time series into a graph. The constructed graph inherits several properties of the series in its structure. Thereby, periodic series convert into regular graphs, and random series into random graphs. Moreover, fractal series convert into scale-free networks, reinforcing the observation that power-law degree distributions are related to fractality, something highly discussed recently. Some remarkable examples and analytical tools are outlined to test the method's reliability. Many different measures recently developed in complex network theory could, by means of this new approach, characterize time series from a new point of view.
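A minimal sketch of the visibility criterion (brute force; the convex example series, for which every pair of points sees each other and the graph is complete, is our own illustration):

```python
def visibility_graph(series):
    """Natural visibility graph: (i, j) is an edge when the straight line
    between (i, y_i) and (j, y_j) passes above every sample in between."""
    n = len(series)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            # y_k must lie below the chord from (i, y_i) to (j, y_j)
            if all(series[k] < series[j]
                   + (series[i] - series[j]) * (j - k) / (j - i)
                   for k in range(i + 1, j)):
                edges.add((i, j))
    return edges

# Strictly convex series: all 7 points are mutually visible,
# so the graph is complete (21 edges).
series = [9, 4, 1, 0, 1, 4, 9]
g = visibility_graph(series)
```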
NASA Astrophysics Data System (ADS)
Vyhnalek, Brian; Zurcher, Ulrich; O'Dwyer, Rebecca; Kaufman, Miron
2009-10-01
A wide range of heart rate irregularities have been reported in small studies of patients with temporal lobe epilepsy [TLE]. We hypothesize that patients with TLE display cardiac dysautonomia in either a subclinical or clinical manner. In a small study, we retrospectively identified (2003-8) two groups of patients from the epilepsy monitoring unit [EMU] at the Cleveland Clinic. No patients were diagnosed with cardiovascular morbidities. The control group consisted of patients with confirmed pseudoseizures, and the experimental group had confirmed right temporal lobe epilepsy through a seizure-free outcome after temporal lobectomy. We quantified the heart rate variability using the approximate entropy [ApEn]. We found similar values of the ApEn in all three states of consciousness (awake, sleep, and preceding seizure onset). In the TLE group, there is some evidence for greater variability in the awake state than in either the sleep or preceding-seizure-onset states. Here we present results for mathematically generated time series: the heart rate fluctuations ξ follow γ statistics, i.e., p(ξ) = Γ(k)^{-1} ξ^{k-1} exp(−ξ). This probability function has well-known properties, and its Shannon entropy can be expressed in terms of the Γ-function. The parameter k allows us to generate a family of heart rate time series with different statistics. The ApEn calculated for the generated time series for different values of k mimics the properties found for the TLE and pseudoseizure groups. Our results suggest that the ApEn is an effective tool to probe differences in the statistics of heart rate fluctuations.
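The approximate entropy used above can be sketched directly from its definition; the parameter choices m = 2 and r = 0.2 and the toy signals are illustrative assumptions:

```python
import math
import random

def approx_entropy(x, m=2, r=0.2):
    """ApEn(m, r): the log-likelihood that runs of length m that match
    within tolerance r also keep matching at length m + 1."""
    def phi(m):
        n = len(x) - m + 1
        templates = [x[i:i + m] for i in range(n)]
        counts = [sum(1 for u in templates
                      if max(abs(a - b) for a, b in zip(t, u)) <= r) / n
                  for t in templates]
        return sum(math.log(c) for c in counts) / n
    return phi(m) - phi(m + 1)

# A strictly periodic signal is maximally regular (ApEn near 0);
# uniform noise is irregular (much larger ApEn).
periodic = [0.0, 1.0] * 50
random.seed(1)
noisy = [random.random() for _ in range(100)]
apen_periodic = approx_entropy(periodic)
apen_noisy = approx_entropy(noisy)
```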
Scale-dependent intrinsic entropies of complex time series.
Yeh, Jia-Rong; Peng, Chung-Kang; Huang, Norden E
2016-04-13
Multi-scale entropy (MSE) was developed as a measure of complexity for complex time series, and it has been applied widely in recent years. The MSE algorithm is based on the assumption that biological systems possess the ability to adapt and function in an ever-changing environment, and these systems need to operate across multiple temporal and spatial scales, such that their complexity is also multi-scale and hierarchical. Here, we present a systematic approach to apply the empirical mode decomposition algorithm, which can detrend time series on various time scales, prior to analysing a signal's complexity by measuring the irregularity of its dynamics on multiple time scales. Simulated time series of fractal Gaussian noise and human heartbeat time series were used to study the performance of this new approach. We show that our method can successfully quantify the fractal properties of the simulated time series and can accurately distinguish modulations in human heartbeat time series in health and disease.
Efficient Algorithms for Segmentation of Item-Set Time Series
NASA Astrophysics Data System (ADS)
Chundi, Parvathi; Rosenkrantz, Daniel J.
We propose a special type of time series, which we call an item-set time series, to facilitate the temporal analysis of software version histories, email logs, stock market data, etc. In an item-set time series, each observed data value is a set of discrete items. We formalize the concept of an item-set time series and present efficient algorithms for segmenting a given item-set time series. Segmentation of a time series partitions the time series into a sequence of segments where each segment is constructed by combining consecutive time points of the time series. Each segment is associated with an item set that is computed from the item sets of the time points in that segment, using a function which we call a measure function. We then define a concept called the segment difference, which measures the difference between the item set of a segment and the item sets of the time points in that segment. The segment difference values are required to construct an optimal segmentation of the time series. We describe novel and efficient algorithms to compute segment difference values for each of the measure functions described in the paper. We outline a dynamic programming based scheme to construct an optimal segmentation of the given item-set time series. We use the item-set time series segmentation techniques to analyze the temporal content of three different data sets—Enron email, stock market data, and a synthetic data set. The experimental results show that an optimal segmentation of item-set time series data captures much more temporal content than a segmentation constructed based on the number of time points in each segment, without examining the item set data at the time points, and can be used to analyze different types of temporal data.
Visibility graph network analysis of gold price time series
NASA Astrophysics Data System (ADS)
Long, Yu
2013-08-01
Mapping time series into a visibility graph network, the characteristics of the gold price series and the return series, and the mechanism underlying gold price fluctuations, are explored from the perspective of complex network theory. The network degree distribution, which changes from a power law to an exponential law when the series is shuffled from its original sequence, and the average path length, which changes from L ∼ ln N to ln L ∼ ln N as the sequence is shuffled, demonstrate that the price series and the return series are both long-range dependent fractal series. The relation of the Hurst exponent to the power-law exponent of the degree distribution demonstrates that the logarithmic price series is a fractal Brownian series and the logarithmic return series is a fractal Gaussian series. Power-law exponents of the degree distribution in a moving time window demonstrate that the logarithmic gold price series is a multifractal series. The power-law average clustering coefficient demonstrates that the gold price visibility graph is a hierarchy network. The hierarchy character, in light of the correspondence of the graph to price fluctuations, means that gold price fluctuation has a hierarchy structure, which appears to be in agreement with Elliott's empirical Wave Theory on stock price fluctuations, and the local-rule growth theory of a hierarchy network implies that the hierarchy structure of gold price fluctuations originates from persistent, short-term factors, such as short-term speculation.
Interpretable Early Classification of Multivariate Time Series
ERIC Educational Resources Information Center
Ghalwash, Mohamed F.
2013-01-01
Recent advances in technology have led to an explosion in data collection over time rather than in a single snapshot. For example, microarray technology allows us to measure gene expression levels in different conditions over time. Such temporal data grants the opportunity for data miners to develop algorithms to address domain-related problems,…
Simulation of Ground Winds Time Series
NASA Technical Reports Server (NTRS)
Adelfang, S. I.
2008-01-01
A simulation process has been developed for generation of the longitudinal and lateral components of ground wind atmospheric turbulence as a function of mean wind speed, elevation, temporal frequency range, and distance between locations. The distance between locations influences the spectral coherence between the simulated series at adjacent locations. Short distances reduce correlation only at high frequencies; as distances increase, correlation is reduced over a wider range of frequencies. The choice of values for the constants d1 and d3 in the PSD model is the subject of work in progress. An improved knowledge of the values of z0 as a function of wind direction at the ARES-1 launch pads is necessary for the definition of d1. Results of studies at other locations may be helpful, as summarized in Fichtl's recent correspondence. Ideally, further research is needed based on measurements of ground wind turbulence with high-resolution anemometers at a number of altitudes on a new KSC tower located closer to the ARES-1 launch pad. The proposed research would be based on turbulence measurements that may be influenced by surface terrain roughness significantly different from the roughness prior to 1970 in Fichtl's measurements. Significant improvements in instrumentation, data storage and processing will greatly enhance the capability to model ground wind profiles and ground wind turbulence.
Volatility modeling of rainfall time series
NASA Astrophysics Data System (ADS)
Yusof, Fadhilah; Kane, Ibrahim Lawal
2013-07-01
Networks of rain gauges can provide a better insight into the spatial and temporal variability of rainfall, but they tend to be too widely spaced for accurate estimates. A way to estimate the spatial variability of rainfall between gauge points is to interpolate between them. This paper evaluates the spatial autocorrelation of rainfall data at some locations in Peninsular Malaysia using a geostatistical technique. The results give an insight into the spatial variability of rainfall in the area; accordingly, two rain gauges were selected for an in-depth study of the temporal dependence of the rainfall data-generating process. It could be shown that rainfall data are affected by nonlinear characteristics of the variance, often referred to as variance clustering or volatility, where large changes tend to follow large changes and small changes tend to follow small changes. The autocorrelation structure of the residuals and the squared residuals derived from autoregressive integrated moving average (ARIMA) models was inspected: the residuals are uncorrelated, but the squared residuals show autocorrelation, and the Ljung-Box test confirmed the results. A test based on the Lagrange multiplier principle was applied to the squared residuals from the ARIMA models. The results of this auxiliary test show clear evidence to reject the null hypothesis of no autoregressive conditional heteroskedasticity (ARCH) effect, indicating that generalized ARCH (GARCH) modeling is necessary. An ARIMA error model is proposed to capture the mean behavior, and a GARCH model to capture the heteroskedasticity (variance behavior) of the residuals from the ARIMA model. The composite ARIMA-GARCH model thus captures the dynamics of daily rainfall in the study area. In addition, a seasonal ARIMA model was found suitable for the monthly average rainfall series at the same locations.
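The Lagrange-multiplier (ARCH LM) test mentioned above can be sketched with plain least squares; the simulated ARCH(1) series and the lag count are illustrative assumptions:

```python
import numpy as np

def arch_lm_stat(resid, lags=5):
    """Engle's LM test for ARCH effects: regress squared residuals on
    their own lags; T * R^2 ~ chi-squared(lags) under the no-ARCH null."""
    e2 = resid ** 2
    n = len(e2)
    y = e2[lags:]
    X = np.column_stack([np.ones(n - lags)] +
                        [e2[lags - l: n - l] for l in range(1, lags + 1)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    u = y - X @ beta
    r2 = 1 - u @ u / np.sum((y - y.mean()) ** 2)
    return (n - lags) * r2

rng = np.random.default_rng(0)
e = np.zeros(2000)
for t in range(1, 2000):              # ARCH(1): volatility clustering
    e[t] = rng.standard_normal() * np.sqrt(0.2 + 0.5 * e[t - 1] ** 2)
white = rng.standard_normal(2000)     # no clustering

stat_arch = arch_lm_stat(e)           # far above the chi2(5) critical value
stat_white = arch_lm_stat(white)      # small, consistent with the null
```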
How to analyse irregularly sampled geophysical time series?
NASA Astrophysics Data System (ADS)
Eroglu, Deniz; Ozken, Ibrahim; Stemler, Thomas; Marwan, Norbert; Wyrwoll, Karl-Heinz; Kurths, Juergen
2015-04-01
One of the challenges of time series analysis is to detect changes in the dynamics of the underlying system. There are numerous methods that can detect such regime changes in regularly sampled time series. Here we present a new approach that can be applied when the time series is irregularly sampled. Such data sets occur frequently in real-world applications, as in paleoclimate proxy records. The basic idea follows Victor and Purpura [1] and considers segments of the time series. For each segment we compute the cost of transforming the segment into the following one. If the time series is from one dynamical regime, the cost of transformation should be similar for each segment of the data. Dramatic changes in the cost time series indicate a change in the underlying dynamics. Any kind of analysis can then be applied to the cost time series, since it is a regularly sampled time series. While recurrence plots are not the best choice for irregularly sampled data with some measurement noise component, we show that a recurrence plot analysis based on the cost time series can successfully identify the changes in the dynamics of the system. We tested this method using synthetically created time series and use these results to highlight the performance of our method. Furthermore, we present our analysis of a suite of calcite and aragonite stalagmites located in the eastern Kimberley region of tropical Western Australia. This oxygen isotopic data is a proxy for the monsoon activity over the last 8,000 years. In this time series our method picks up several so far undetected changes from wet to dry in the monsoon system and therefore enables us to get a better understanding of the monsoon dynamics in the north-east of Australia over the last couple of thousand years. [1] J. D. Victor and K. P. Purpura, Network: Computation in Neural Systems 8, 127 (1997)
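A sketch of a Victor-Purpura-style transformation cost between two segments of event times (unit insert/delete cost and shift rate q are our own simplifying assumptions; the exact cost terms used in the paper may differ):

```python
def transformation_cost(seg_a, seg_b, q=1.0):
    """Edit distance between two segments of event times:
    delete/insert an event at cost 1, or shift one at cost q * |dt|.
    Computed by dynamic programming, as for spike-train metrics."""
    n, m = len(seg_a), len(seg_b)
    D = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        D[i][0] = float(i)               # delete all remaining events
    for j in range(1, m + 1):
        D[0][j] = float(j)               # insert all remaining events
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i][j] = min(D[i - 1][j] + 1,
                          D[i][j - 1] + 1,
                          D[i - 1][j - 1]
                          + q * abs(seg_a[i - 1] - seg_b[j - 1]))
    return D[n][m]

# Identical segments cost nothing; a small uniform shift costs q * total shift.
cost_same = transformation_cost([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])
cost_shift = transformation_cost([1.0, 2.0, 3.0], [1.1, 2.1, 3.1])
```

Within one dynamical regime consecutive segments look alike and the cost stays low and steady; a regime change shows up as a jump in this cost series.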
Horizontal visibility graphs: exact results for random time series.
Luque, B; Lacasa, L; Ballesteros, F; Luque, J
2009-10-01
The visibility algorithm has recently been introduced as a mapping between time series and complex networks. This procedure allows us to apply methods of complex network theory for characterizing time series. In this work we present the horizontal visibility algorithm, a geometrically simpler and analytically solvable version of our former algorithm, focusing on the mapping of random series (series of independent identically distributed random variables). After presenting some properties of the algorithm, we present exact results on the topological properties of graphs associated with random series, namely, the degree distribution, the clustering coefficient, and the mean path length. We show that the horizontal visibility algorithm stands as a simple method to discriminate randomness in time series, since any random series maps to a graph with an exponential degree distribution of the form P(k) = (1/3)(2/3)^(k-2), independent of the probability distribution from which the series was generated. Accordingly, visibility graphs with other P(k) are related to nonrandom series. Numerical simulations confirm the accuracy of the theorems for finite series. In a second part, we show that the method is able to distinguish chaotic series from independent and identically distributed (i.i.d.) series, studying the following situations: (i) noise-free low-dimensional chaotic series, (ii) low-dimensional noisy chaotic series, even in the presence of large amounts of noise, and (iii) high-dimensional chaotic series (coupled map lattice), without the need for additional techniques such as surrogate data or noise reduction methods. Finally, heuristic arguments are given to explain the topological properties of chaotic series, and several sequences that are conjectured to be random are analyzed.
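The horizontal visibility criterion is simple enough to check the exponential-degree result numerically; P(k) = (1/3)(2/3)^(k-2) implies a mean degree of 4 for long i.i.d. series, and the series length and seed below are arbitrary (brute force, so kept short):

```python
import random

def horizontal_visibility_graph(series):
    """Link i and j when every sample strictly between them lies below
    both endpoints (the horizontal visibility criterion)."""
    n = len(series)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if all(series[k] < min(series[i], series[j])
                   for k in range(i + 1, j)):
                edges.add((i, j))
    return edges

random.seed(42)
series = [random.random() for _ in range(500)]
edges = horizontal_visibility_graph(series)
# For i.i.d. series the theoretical mean degree is 4 (minus edge effects).
mean_degree = 2 * len(edges) / len(series)
```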
Small Sample Properties of Bayesian Multivariate Autoregressive Time Series Models
ERIC Educational Resources Information Center
Price, Larry R.
2012-01-01
The aim of this study was to compare the small sample (N = 1, 3, 5, 10, 15) performance of a Bayesian multivariate vector autoregressive (BVAR-SEM) time series model relative to frequentist power and parameter estimation bias. A multivariate autoregressive model was developed based on correlated autoregressive time series vectors of varying…
Measurements of spatial population synchrony: influence of time series transformations.
Chevalier, Mathieu; Laffaille, Pascal; Ferdy, Jean-Baptiste; Grenouillet, Gaël
2015-09-01
Two mechanisms have been proposed to explain spatial population synchrony: dispersal among populations, and the spatial correlation of density-independent factors (the "Moran effect"). To identify which of these two mechanisms is driving spatial population synchrony, time series transformations (TSTs) of abundance data have been used to remove the signature of one mechanism, and highlight the effect of the other. However, several issues with TSTs remain, and to date no consensus has emerged about how population time series should be handled in synchrony studies. Here, by using 3131 time series involving 34 fish species found in French rivers, we computed several metrics commonly used in synchrony studies to determine whether a large-scale climatic factor (temperature) influenced fish population dynamics at the regional scale, and to test the effect of three commonly used TSTs (detrending, prewhitening and a combination of both) on these metrics. We also tested whether the influence of TSTs on time series and population synchrony levels was related to the features of the time series using both empirical and simulated time series. For several species, and regardless of the TST used, we evidenced a Moran effect on freshwater fish populations. However, these results were globally biased downward by TSTs which reduced our ability to detect significant signals. Depending on the species and the features of the time series, we found that TSTs could lead to contradictory results, regardless of the metric considered. Finally, we suggest guidelines on how population time series should be processed in synchrony studies.
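The two most common TSTs named above can be sketched as follows (least-squares detrending and AR(1) prewhitening; the trend-plus-noise example and the parameter choices are our own):

```python
import numpy as np

def detrend(x):
    """TST 1: remove a least-squares linear trend."""
    t = np.arange(len(x))
    return x - np.polyval(np.polyfit(t, x, 1), t)

def prewhiten(x):
    """TST 2: remove lag-1 autocorrelation with an AR(1) filter."""
    x0 = x - x.mean()
    rho = np.corrcoef(x0[:-1], x0[1:])[0, 1]
    return x0[1:] - rho * x0[:-1]

rng = np.random.default_rng(1)
t = np.arange(500)
x = 0.05 * t + rng.standard_normal(500)   # trend + noise
detrended = detrend(x)
combined = prewhiten(detrend(x))          # the third TST: both in sequence
residual_slope = np.polyfit(t, detrended, 1)[0]   # ~ 0 after detrending
```

Applying either filter to two synchronous series before computing a cross-correlation is exactly the step whose side effects (downward-biased synchrony estimates) the study quantifies.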
Using Time-Series Regression to Predict Academic Library Circulations.
ERIC Educational Resources Information Center
Brooks, Terrence A.
1984-01-01
Four methods were used to forecast monthly circulation totals in 15 midwestern academic libraries: dummy time-series regression, lagged time-series regression, simple average (straight-line forecasting), monthly average (naive forecasting). In tests of forecasting accuracy, dummy regression method and monthly mean method exhibited smallest average…
The Prediction of Teacher Turnover Employing Time Series Analysis.
ERIC Educational Resources Information Center
Costa, Crist H.
The purpose of this study was to combine knowledge of teacher demographic data with time-series forecasting methods to predict teacher turnover. Moving averages and exponential smoothing were used to forecast discrete time series. The study used data collected from the 22 largest school districts in Iowa, designated as FACT schools. Predictions…
Nonlinear parametric model for Granger causality of time series
NASA Astrophysics Data System (ADS)
Marinazzo, Daniele; Pellicoro, Mario; Stramaglia, Sebastiano
2006-06-01
The notion of Granger causality between two time series examines whether the prediction of one series could be improved by incorporating information from the other. In particular, if the prediction error of the first time series is reduced by including measurements from the second time series, then the second time series is said to have a causal influence on the first one. We propose a radial basis function approach to nonlinear Granger causality. The proposed model is not constrained to be additive in variables from the two time series and can approximate any function of these variables, while still being suitable for evaluating causality. The usefulness of this measure of causality is shown in two applications. In the first, physiological, application, we consider time series of heart rate and blood pressure in congestive heart failure patients and patients affected by sepsis: we find that sepsis patients, unlike congestive heart failure patients, show symmetric causal relationships between the two time series. In the second application, we consider the feedback loop in a model of excitatory and inhibitory neurons: we find that in this system causality measures the combined influence of couplings and membrane time constants.
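A linear stand-in for the Granger test described above (the paper's model is a radial-basis-function network; here ordinary least squares is used instead, and the simulated unidirectional coupling is our own example):

```python
import numpy as np

def granger_improvement(x, y, order=2):
    """Ratio rss(own-lags model) / rss(own + other-series lags model)
    for predicting x; values well above 1 suggest y Granger-causes x."""
    n = len(x)
    Y = x[order:]
    own = np.column_stack([x[order - l: n - l] for l in range(1, order + 1)])
    full = np.column_stack([own] +
                           [y[order - l: n - l] for l in range(1, order + 1)])
    def rss(M):
        M = np.column_stack([np.ones(len(Y)), M])
        beta, *_ = np.linalg.lstsq(M, Y, rcond=None)
        return float(np.sum((Y - M @ beta) ** 2))
    return rss(own) / rss(full)

rng = np.random.default_rng(3)
driver = rng.standard_normal(1000)
target = np.zeros(1000)
for t in range(1, 1000):      # unidirectional coupling: driver -> target
    target[t] = (0.5 * target[t - 1] + 0.8 * driver[t - 1]
                 + 0.1 * rng.standard_normal())

ratio_causal = granger_improvement(target, driver)   # large: y helps
ratio_reverse = granger_improvement(driver, target)  # ~ 1: no influence
```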
Complex Network from Pseudoperiodic Time Series: Topology versus Dynamics
NASA Astrophysics Data System (ADS)
Zhang, J.; Small, M.
2006-06-01
We construct complex networks from pseudoperiodic time series, with each cycle represented by a single node in the network. We investigate the statistical properties of these networks for various time series and find that time series with different dynamics exhibit distinct topological structures. Specifically, noisy periodic signals correspond to random networks, and chaotic time series generate networks that exhibit small-world and scale-free features. We show that this distinction in topological structure results from the hierarchy of unstable periodic orbits embedded in the chaotic attractor. Standard measures of structure in complex networks can therefore be applied to distinguish different dynamic regimes in time series. Application to human electrocardiograms shows that such statistical properties are able to differentiate between the sinus rhythm cardiograms of healthy volunteers and those of coronary care patients.
Sunspot Time Series: Passive and Active Intervals
NASA Astrophysics Data System (ADS)
Zięba, S.; Nieckarz, Z.
2014-07-01
Solar activity slowly and irregularly decreases from the first spotless day (FSD) in the declining phase of the old sunspot cycle and systematically, but also in an irregular way, increases to the new cycle maximum after the last spotless day (LSD). The time interval between the first and the last spotless day can be called the passive interval (PI), while the time interval from the last spotless day to the first one after the new cycle maximum is the related active interval (AI). Minima of solar cycles are inside PIs, while maxima are inside AIs. In this article, we study the properties of passive and active intervals to determine the relation between them. We have found that some properties of PIs, and related AIs, differ significantly between two groups of solar cycles; this has allowed us to classify Cycles 8 - 15 as passive cycles, and Cycles 17 - 23 as active ones. We conclude that the solar activity in the PI declining phase (a descending phase of the previous cycle) determines the strength of the approaching maximum in the case of active cycles, while the activity of the PI rising phase (a phase of the ongoing cycle early growth) determines the strength of passive cycles. This can have implications for solar dynamo models. Our approach indicates the important role of solar activity during the declining and the rising phases of the solar-cycle minimum.
Comparison of New and Old Sunspot Number Time Series
NASA Astrophysics Data System (ADS)
Cliver, E. W.
2016-06-01
Four new sunspot number time series have been published in this Topical Issue: a backbone-based group number in Svalgaard and Schatten (Solar Phys., 2016; referred to here as SS, 1610 - present), a group number series in Usoskin et al. (Solar Phys., 2016; UEA, 1749 - present) that employs active day fractions from which it derives an observational threshold in group spot area as a measure of observer merit, a provisional group number series in Cliver and Ling (Solar Phys., 2016; CL, 1841 - 1976) that removed flaws in the Hoyt and Schatten (Solar Phys. 179, 189, 1998a; 181, 491, 1998b) normalization scheme for the original relative group sunspot number ( RG, 1610 - 1995), and a corrected Wolf (international, RI) number in Clette and Lefèvre (Solar Phys., 2016; SN, 1700 - present). Despite quite different construction methods, the four new series agree well after about 1900. Before 1900, however, the UEA time series is lower than SS, CL, and SN, particularly so before about 1885. Overall, the UEA series most closely resembles the original RG series. Comparison of the UEA and SS series with a new solar wind B time series (Owens et al. in J. Geophys. Res., 2016; 1845 - present) indicates that the UEA time series is too low before 1900. We point out incongruities in the Usoskin et al. (Solar Phys., 2016) observer normalization scheme and present evidence that this method under-estimates group counts before 1900. In general, a correction factor time series, obtained by dividing an annual group count series by the corresponding yearly averages of raw group counts for all observers, can be used to assess the reliability of new sunspot number reconstructions.
Detecting unstable periodic orbits from transient chaotic time series
Dhamala; Lai; Kostelich
2000-06-01
We address the detection of unstable periodic orbits from experimentally measured transient chaotic time series. In particular, we examine recurrence times of trajectories in the vector space reconstructed from an ensemble of such time series. Numerical experiments demonstrate that this strategy can yield periodic orbits of low periods even when noise is present. We analyze the probability of finding periodic orbits from transient chaotic time series and derive a scaling law for this probability. The scaling law implies that unstable periodic orbits of high periods are practically undetectable from transient chaos.
Characterizing time series: when Granger causality triggers complex networks
NASA Astrophysics Data System (ADS)
Ge, Tian; Cui, Yindong; Lin, Wei; Kurths, Jürgen; Liu, Chong
2012-08-01
In this paper, we propose a new approach to characterize time series with noise perturbations in both the time and frequency domains by combining Granger causality and complex networks. We construct directed and weighted complex networks from time series and use representative network measures to describe their physical and topological properties. Through analyzing the typical dynamical behaviors of some physical models and the MIT-BIH (Massachusetts Institute of Technology-Beth Israel Hospital) human electrocardiogram data sets, we show that the proposed approach is able to capture and characterize various dynamics and has much potential for analyzing real-world time series of rather short length.
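The abstract above gives no formulas, but its core construction (pairwise Granger-style tests turned into a directed, weighted network) can be sketched as follows. This is a minimal illustration, not the paper's procedure: the lag order, the variance-reduction weight, and all function names are assumptions.

```python
import numpy as np

def granger_weight(x, y, lag=2):
    """Weight for 'x helps predict y': relative reduction in residual
    variance when x's lags are added to a regression of y on its own lags."""
    n = len(y)
    Y = y[lag:]
    own = np.column_stack([y[lag - k:n - k] for k in range(1, lag + 1)])
    cross = np.column_stack([x[lag - k:n - k] for k in range(1, lag + 1)])

    def rss(X):
        X = np.column_stack([np.ones(len(Y)), X])  # add intercept
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        r = Y - X @ beta
        return float(r @ r)

    rss_restricted = rss(own)                       # y's own past only
    rss_full = rss(np.hstack([own, cross]))         # plus x's past
    return max(0.0, (rss_restricted - rss_full) / rss_full)

def granger_network(series, lag=2):
    """Directed weighted adjacency matrix: W[i, j] = weight of edge i -> j."""
    m = len(series)
    W = np.zeros((m, m))
    for i in range(m):
        for j in range(m):
            if i != j:
                W[i, j] = granger_weight(series[i], series[j], lag)
    return W
```

A node pair (i, j) gets a heavy edge only when the past of series i reduces the residual variance of series j beyond what j's own past achieves, so a driver-response pair produces a strongly asymmetric adjacency matrix.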
High Performance Biomedical Time Series Indexes Using Salient Segmentation
Woodbridge, Jonathan; Mortazavi, Bobak; Bui, Alex A.T.; Sarrafzadeh, Majid
2016-01-01
The advent of remote and wearable medical sensing has created a dire need for efficient medical time series databases. Wearable medical sensing devices provide continuous patient monitoring by various types of sensors and have the potential to create massive amounts of data. Therefore, time series databases must utilize highly optimized indexes in order to efficiently search and analyze stored data. This paper presents a highly efficient technique for indexing medical time series signals using Locality Sensitive Hashing (LSH). Unlike previous work, only salient (or interesting) segments are inserted into the index. This technique reduces search times by up to 95% while yielding near identical search results. PMID:23367072
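The indexing idea in the abstract above (hash segments with Locality Sensitive Hashing so near-identical waveforms collide in the same bucket) can be illustrated with a random-hyperplane sketch. The paper's saliency detector is not described in the abstract, so this toy index simply hashes every z-normalized fixed-length window; the class name, window width, and bit count are all assumptions.

```python
import numpy as np
from collections import defaultdict

class SegmentLSH:
    """Random-hyperplane LSH index over fixed-length signal segments.
    Sketch only: the salient-segment detector is replaced by indexing
    every half-overlapping window of length `width`."""

    def __init__(self, width=32, n_bits=12, seed=0):
        rng = np.random.default_rng(seed)
        self.width = width
        self.planes = rng.normal(size=(n_bits, width))
        self.buckets = defaultdict(list)

    def _key(self, seg):
        seg = (seg - seg.mean()) / (seg.std() + 1e-12)  # z-normalize
        bits = (self.planes @ seg) > 0                   # sign of projections
        return bits.tobytes()

    def insert(self, signal, signal_id):
        for start in range(0, len(signal) - self.width + 1, self.width // 2):
            seg = np.asarray(signal[start:start + self.width], float)
            self.buckets[self._key(seg)].append((signal_id, start))

    def query(self, seg):
        """Candidate (signal_id, offset) pairs hashed to the same bucket."""
        return self.buckets.get(self._key(np.asarray(seg, float)), [])
```

Because lookup touches a single bucket rather than scanning the whole database, search cost is roughly independent of the number of stored segments, which is the source of the large speedups the abstract reports.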
DEM time series of an agricultural watershed
NASA Astrophysics Data System (ADS)
Pineux, Nathalie; Lisein, Jonathan; Swerts, Gilles; Degré, Aurore
2014-05-01
In agricultural landscapes the soil surface evolves notably due to erosion and deposition phenomena. Although most field data come from plot-scale studies, the watershed scale seems more appropriate for understanding these processes. Small unmanned aircraft systems and image-processing techniques are currently improving, allowing 3D models to be built from multiple overlapping shots. Where techniques suited to large areas would be too expensive for a watershed-level study and techniques suited to small areas too time-consuming, the unmanned aerial system appears to be a promising solution for quantifying erosion and deposition patterns. The continuing technical improvements in this growing field allow us to obtain very good data quality and very high spatial resolution with high Z accuracy. In the center of Belgium, we equipped an agricultural watershed of 124 ha. For three years (2011-2013), we have been monitoring weather (including rainfall erosivity using a spectropluviograph), discharge at three different locations, sediment in runoff water, and watershed microtopography through unmanned airborne imagery (Gatewing X100). We also collected all available historical data to try to capture the "long-term" changes in watershed morphology during the last decades: old topographic maps, historical soil descriptions, etc. An erosion model (LANDSOIL) is also used to assess the evolution of the relief. Short-term evolution of the surface is now observed through flights done at 200 m height. The pictures are taken with a side overlap equal to 80%. To precisely georeference the DEM produced, ground control points are placed on the study site and surveyed using a Leica GPS1200 (accuracy of 1 cm for x and y coordinates and 1.5 cm for the z coordinate). Flights are done each year in December to have an as bare as possible ground surface. Specific treatments are developed to counteract the vegetation effect because it is known as a key source of error in the DEM produced by small unmanned aircraft
Sensor-Generated Time Series Events: A Definition Language
Anguera, Aurea; Lara, Juan A.; Lizcano, David; Martínez, Maria Aurora; Pazos, Juan
2012-01-01
There are now a great many domains where information is recorded by sensors over a limited time period or on a permanent basis. This data flow leads to sequences of data known as time series. In many domains, like seismography or medicine, time series analysis focuses on particular regions of interest, known as events, whereas the remainder of the time series contains hardly any useful information. In these domains, there is a need for mechanisms to identify and locate such events. In this paper, we propose an event definition language that is general enough to be used to easily and naturally define events in time series recorded by sensors in any domain. The proposed language has been applied to the definition of time series events generated within the branch of medicine dealing with balance-related functions in human beings. A device, called a posturograph, is used to study balance-related functions. The platform has four sensors that record the pressure intensity being exerted on the platform, generating four interrelated time series. As opposed to the existing ad hoc proposals, the results confirm that the proposed language is valid, that is, generally applicable and accurate, for identifying the events contained in the time series.
Clustering Financial Time Series by Network Community Analysis
NASA Astrophysics Data System (ADS)
Piccardi, Carlo; Calatroni, Lisa; Bertoni, Fabio
In this paper, we describe a method for clustering financial time series which is based on community analysis, a recently developed approach for partitioning the nodes of a network (graph). A network with N nodes is associated to the set of N time series. The weight of the link (i, j), which quantifies the similarity between the two corresponding time series, is defined according to a metric based on symbolic time series analysis, which has recently proved effective in the context of financial time series. Then, searching for network communities allows one to identify groups of nodes (and then time series) with strong similarity. A quantitative assessment of the significance of the obtained partition is also provided. The method is applied to two distinct case-studies concerning the US and Italy Stock Exchange, respectively. In the US case, the stability of the partitions over time is also thoroughly investigated. The results favorably compare with those obtained with the standard tools typically used for clustering financial time series, such as the minimal spanning tree and the hierarchical tree.
Time series modeling of system self-assessment of survival
Lu, H.; Kolarik, W.J.
1999-06-01
Self-assessment of survival for a system, subsystem or component is implemented by assessing conditional performance reliability in real-time, which includes modeling and analysis of physical performance data. This paper proposes a time series analysis approach to system self-assessment (prediction) of survival. In the approach, physical performance data are modeled in a time series. The performance forecast is based on the model developed and is converted to the reliability of system survival. In contrast to a standard regression model, a time series model, using on-line data, is suitable for the real-time performance prediction. This paper illustrates an example of time series modeling and survival assessment, regarding an excessive tool edge wear failure mode for a twist drill operation.
Model-free quantification of time-series predictability
NASA Astrophysics Data System (ADS)
Garland, Joshua; James, Ryan; Bradley, Elizabeth
2014-11-01
This paper provides insight into when, why, and how forecast strategies fail when they are applied to complicated time series. We conjecture that the inherent complexity of real-world time-series data, which results from the dimension, nonlinearity, and nonstationarity of the generating process, as well as from measurement issues such as noise, aggregation, and finite data length, is both empirically quantifiable and directly correlated with predictability. In particular, we argue that redundancy is an effective way to measure complexity and predictive structure in an experimental time series and that weighted permutation entropy is an effective way to estimate that redundancy. To validate these conjectures, we study 120 different time-series data sets. For each time series, we construct predictions using a wide variety of forecast models, then compare the accuracy of the predictions with the permutation entropy of that time series. We use the results to develop a model-free heuristic that can help practitioners recognize when a particular prediction method is not well matched to the task at hand: that is, when the time series has more predictive structure than that method can capture and exploit.
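Since the abstract above leans on weighted permutation entropy as its estimate of redundancy, a minimal version is sketched below. It follows the common variance-weighted ordinal-pattern definition; the exact normalization used by the authors is an assumption.

```python
import math
import numpy as np

def weighted_permutation_entropy(x, order=3, delay=1):
    """Weighted permutation entropy in [0, 1]: ordinal patterns of embedded
    vectors, each weighted by the vector's variance, normalized by log(order!)."""
    x = np.asarray(x, float)
    n = len(x) - (order - 1) * delay
    weights = {}
    for i in range(n):
        v = x[i:i + (order - 1) * delay + 1:delay]
        pattern = tuple(np.argsort(v))            # ordinal pattern of the window
        weights[pattern] = weights.get(pattern, 0.0) + float(np.var(v))
    total = sum(weights.values())
    if total == 0:                                # constant series: no disorder
        return 0.0
    h = -sum((w / total) * math.log(w / total) for w in weights.values())
    return h / math.log(math.factorial(order))
```

A strongly trended series visits a single ordinal pattern and scores near 0, while white noise spreads weight over all order! patterns and scores near 1, which is the sense in which the statistic quantifies how much predictive structure a forecaster could hope to exploit.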
Database for Hydrological Time Series of Inland Waters (DAHITI)
NASA Astrophysics Data System (ADS)
Schwatke, Christian; Dettmering, Denise
2016-04-01
Satellite altimetry was designed for ocean applications. For some years now, however, satellite altimetry has also been used over inland water to estimate water level time series of lakes, rivers and wetlands. The resulting water level time series can help to understand the water cycle of system Earth, which makes altimetry a very useful instrument for hydrological applications. In this poster, we introduce the "Database for Hydrological Time Series of Inland Waters" (DAHITI). Currently, the database contains about 350 water level time series of lakes, reservoirs, rivers, and wetlands which are freely available after a short registration process via http://dahiti.dgfi.tum.de. We present the products of DAHITI and the functionality of the DAHITI web service. Furthermore, selected examples of inland water targets are presented in detail. DAHITI provides time series of water level heights of inland water bodies and their formal errors. These time series are available within the period 1992-2015 and have varying temporal resolutions depending on the data coverage of the investigated water body. The accuracies of the water level time series depend mainly on the extent of the investigated water body and the quality of the altimeter measurements. An external validation with in-situ data reveals RMS differences between 5 cm and 40 cm for lakes and between 10 cm and 140 cm for rivers.
Estimation of Parameters from Discrete Random Nonstationary Time Series
NASA Astrophysics Data System (ADS)
Takayasu, H.; Nakamura, T.
For the analysis of nonstationary stochastic time series we introduce a formulation to estimate the underlying time-dependent parameters. This method is designed for random events with small numbers that are out of the applicability range of the normal distribution. The method is demonstrated for numerical data generated by a known system, and applied to time series of traffic accidents, the batting average of a baseball player, and the sales volume of home electronics.
Wavelet analysis and scaling properties of time series
NASA Astrophysics Data System (ADS)
Manimaran, P.; Panigrahi, Prasanta K.; Parikh, Jitendra C.
2005-10-01
We propose a wavelet based method for the characterization of the scaling behavior of nonstationary time series. It makes use of the built-in ability of the wavelets for capturing the trends in a data set, in variable window sizes. Discrete wavelets from the Daubechies family are used to illustrate the efficacy of this procedure. After studying binomial multifractal time series with the present and earlier approaches of detrending for comparison, we analyze the time series of averaged spin density in the 2D Ising model at the critical temperature, along with several experimental data sets possessing multifractal behavior.
Time Series Analysis and Prediction of AE and Dst Data
NASA Astrophysics Data System (ADS)
Takalo, J.; Lohikiski, R.; Timonen, J.; Lehtokangas, M.; Kaski, K.
1996-12-01
A new method to analyse the structure function has been constructed and used in the analysis of the AE time series for the years 1978-85 and the Dst time series for 1957-84. The structure function (SF) was defined by S(l) = <|x(t_i + l*Δt) - x(t_i)|>, where Δt is the sampling time, l is an integer, and <|.|> denotes the average of absolute values. If a time series is self-affine, its SF should scale for small values of l as S(l) ∝ l^H, where 0 < H < 1 is called the scaling exponent. It is known that for power-law (coloured) noise, which has P ~ f^(-α), α ≈ 2H + 1 for 1 < α < 3. In this work the scaling exponent H was analysed by considering the local slopes d log(S(l))/d log(l) between two adjacent points as a function of l. For self-affine time series the local slopes should stay constant, at least for small values of l. The AE time series was found to be self-affine such that the scaling exponent changes at a time scale of 113 (±9) minutes. On the other hand, in the SF analysis, the Dst data were dominated by the 24-hour and 27-day periods. The 27-day period was further modulated by the annual variation. These differences between the two time series arise from the difference in their periodicities in relation to their respective characteristic time scales. In the AE data the dominating periods are longer than that related to the characteristic time scale, i.e. they appear in the flatter part of the power spectrum. This is why affinity is the dominating feature of the AE time series. In contrast with this, the dominating periods of the Dst data are shorter than the characteristic time scale, and appear in the steeper part of the spectrum. Consequently, periodicity is the dominating feature of the Dst data. Because of their different dynamic characteristics, prediction of the Dst and AE time series appears to presuppose rather different approaches. In principle it is easier to produce the gross features of the Dst time series correctly as it is periodicity
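The structure-function analysis described in the abstract above is straightforward to reproduce numerically. The sketch below uses illustrative function names and a plain global log-log fit rather than the paper's local-slope analysis.

```python
import numpy as np

def structure_function(x, lags):
    """S(l) = <|x(t_i + l*dt) - x(t_i)|> for integer lags l >= 1."""
    x = np.asarray(x, float)
    return np.array([np.mean(np.abs(x[l:] - x[:-l])) for l in lags])

def scaling_exponent(x, lags):
    """Slope of log S(l) vs log l; equals H for a self-affine series."""
    s = structure_function(x, lags)
    slope, _ = np.polyfit(np.log(lags), np.log(s), 1)
    return slope
```

For Brownian motion the estimated slope should be close to H = 0.5; the local slopes d log S(l)/d log(l) used in the paper would be obtained by differencing adjacent log-log points instead of fitting one global line.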
Rodgers, Joseph Lee; Beasley, William Howard; Schuelke, Matthew
2014-01-01
Many data structures, particularly time series data, are naturally seasonal, cyclical, or otherwise circular. Past graphical methods for time series have focused on linear plots. In this article, we move graphical analysis onto the circle. We focus on 2 particular methods, one old and one new. Rose diagrams are circular histograms and can be produced in several different forms using the RRose software system. In addition, we propose, develop, illustrate, and provide software support for a new circular graphical method, called Wrap-Around Time Series Plots (WATS Plots), which is a graphical method useful to support time series analyses in general but in particular in relation to interrupted time series designs. We illustrate the use of WATS Plots with an interrupted time series design evaluating the effect of the Oklahoma City bombing on birthrates in Oklahoma County during the 10 years surrounding the bombing of the Murrah Building in Oklahoma City. We compare WATS Plots with linear time series representations and overlay them with smoothing and error bands. Each method is shown to have advantages in relation to the other; in our example, the WATS Plots more clearly show the existence and effect size of the fertility differential.
The use of synthetic input sequences in time series modeling
NASA Astrophysics Data System (ADS)
de Oliveira, Dair José; Letellier, Christophe; Gomes, Murilo E. D.; Aguirre, Luis A.
2008-08-01
In many situations time series models obtained from noise-like data settle to trivial solutions under iteration. This Letter proposes a way of producing a synthetic (dummy) input, that is included to prevent the model from settling down to a trivial solution, while maintaining features of the original signal. Simulated benchmark models and a real time series of RR intervals from an ECG are used to illustrate the procedure.
Time Series Analysis of Insar Data: Methods and Trends
NASA Technical Reports Server (NTRS)
Osmanoglu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cano-Cabral, Enrique
2015-01-01
Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.
Comparison of New and Old Sunspot Number Time Series
NASA Astrophysics Data System (ADS)
Cliver, Edward W.; Clette, Frédéric; Lefévre, Laure; Svalgaard, Leif
2016-05-01
As a result of the Sunspot Number Workshops, five new sunspot series have recently been proposed: a revision of the original Wolf or international sunspot number (Lockwood et al., 2014), a backbone-based group sunspot number (Svalgaard and Schatten, 2016), a revised group number series that employs active day fractions (Usoskin et al., 2016), a provisional group sunspot number series (Cliver and Ling, 2016) that removes flaws in the normalization scheme for the original group sunspot number (Hoyt and Schatten, 1998), and a revised Wolf or international number (termed SN) published on the SILSO website as a replacement for the original Wolf number (Clette and Lefèvre, 2016; http://www.sidc.be/silso/datafiles). Despite quite different construction methods, the five new series agree reasonably well after about 1900. From 1750 to ~1875, however, the Lockwood et al. and Usoskin et al. time series are lower than the other three series. Analysis of the Hoyt and Schatten normalization factors used to scale secondary observers to their Royal Greenwich Observatory primary observer reveals a significant inhomogeneity spanning the divergence in ~1885 of the group number from the original Wolf number. In general, a correction factor time series, obtained by dividing an annual group count series by the corresponding yearly averages of raw group counts for all observers, can be used to assess the reliability of new sunspot number reconstructions.
A method for detecting changes in long time series
Downing, D.J.; Lawkins, W.F.; Morris, M.D.; Ostrouchov, G.
1995-09-01
Modern scientific activities, both physical and computational, can result in time series of many thousands or even millions of data values. Here the authors describe a statistically motivated algorithm for quick screening of very long time series data for the presence of potentially interesting but arbitrary changes. The basic data model is a stationary Gaussian stochastic process, and the approach to detecting a change is the comparison of two predictions of the series at a time point or contiguous collection of time points. One prediction is a "forecast", i.e. based on data from earlier times, while the other is a "backcast", i.e. based on data from later times. The statistic is the absolute value of the log-likelihood ratio for these two predictions, evaluated at the observed data. A conservative procedure is suggested for specifying critical values for the statistic under the null hypothesis of "no change".
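The forecast/backcast comparison described above can be caricatured with independent Gaussians fitted to the windows before and after a candidate point, in place of the paper's stationary-Gaussian-process predictions. The window length and the Gaussian-per-window simplification are assumptions made for brevity.

```python
import math

def change_statistic(x, t, window=50):
    """|log-likelihood ratio| at index t between a Gaussian 'forecast'
    (fitted to the preceding window) and a Gaussian 'backcast' (fitted
    to the following window). Large values flag a potential change."""
    def loglik(v, sample):
        m = sum(sample) / len(sample)
        var = sum((s - m) ** 2 for s in sample) / len(sample) or 1e-12
        return -0.5 * (math.log(2 * math.pi * var) + (v - m) ** 2 / var)

    fore = x[t - window:t]           # data from earlier times
    back = x[t + 1:t + 1 + window]   # data from later times
    return abs(loglik(x[t], fore) - loglik(x[t], back))
```

Scanning t over the series and thresholding this statistic gives the quick-screening behavior the abstract describes: points where past and future disagree about how surprising x[t] is are flagged for closer inspection.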
Symplectic geometry spectrum regression for prediction of noisy time series.
Xie, Hong-Bo; Dokos, Socrates; Sivakumar, Bellie; Mengersen, Kerrie
2016-05-01
We present the symplectic geometry spectrum regression (SGSR) technique as well as a regularized method based on SGSR for prediction of nonlinear time series. The main tool of analysis is the symplectic geometry spectrum analysis, which decomposes a time series into the sum of a small number of independent and interpretable components. The key to successful regularization is to damp higher order symplectic geometry spectrum components. The effectiveness of SGSR and its superiority over local approximation using ordinary least squares are demonstrated through prediction of two noisy synthetic chaotic time series (Lorenz and Rössler series), and then tested for prediction of three real-world data sets (Mississippi River flow data and electromyographic and mechanomyographic signal recorded from human body). PMID:27300890
Comparison of time series using entropy and mutual correlation
NASA Astrophysics Data System (ADS)
Madonna, Fabio; Rosoldi, Marco
2015-04-01
The potential for redundant time series to reduce uncertainty in atmospheric variables has not been investigated comprehensively for climate observations. Moreover, comparisons among time series of in situ and ground-based remote sensing measurements have been performed using several methods, but quite often relying on linear models. In this work, the concepts of entropy (H) and mutual correlation (MC), defined in the frame of information theory, are applied to the study of essential climate variables with the aim of characterizing the uncertainty of a time series and the redundancy of collocated measurements provided by different surface-based techniques. In particular, integrated water vapor (IWV) and water vapour mixing ratio time series obtained at five highly instrumented GRUAN (GCOS, Global Climate Observing System, Reference Upper-Air Network) stations with several sensors (e.g., radiosondes, GPS, microwave and infrared radiometers, Raman lidar), in the period 2010-2012, are analyzed in terms of H and MC. The comparison between the probability density functions of the time series shows that caution in using linear assumptions is needed, and the use of statistics, like entropy, that are robust to outliers is recommended to investigate measurement time series. Results reveal that the random uncertainties on the IWV measured with radiosonde, global positioning system, microwave and infrared radiometer, and Raman lidar measurements differed by less than 8% over the considered time period. Comparisons of the time series of IWV content from ground-based remote sensing instruments with in situ soundings showed that microwave radiometers have the highest redundancy with the IWV time series measured by radiosondes and therefore the highest potential to reduce the random uncertainty of the radiosonde time series. Moreover, the random uncertainty of a time series from one instrument can be reduced by 60% by constraining the measurements with those from
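The abstract's use of entropy and mutual correlation to quantify redundancy between collocated sensors can be illustrated with plain histogram estimators. The paper's exact "mutual correlation" definition is not given above, so mutual information is used here as the information-theoretic analogue, and the bin count is arbitrary.

```python
import numpy as np

def hist_entropy(x, bins=16):
    """Shannon entropy (bits) of a series, estimated from a histogram."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

def mutual_information(x, y, bins=16):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) from a joint histogram (bits)."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    hxy = float(-(pxy[nz] * np.log2(pxy[nz])).sum())
    hx = float(-(px[px > 0] * np.log2(px[px > 0])).sum())
    hy = float(-(py[py > 0] * np.log2(py[py > 0])).sum())
    return hx + hy - hxy
```

Two sensors measuring the same quantity share high mutual information, which is exactly the redundancy that lets one series constrain, and so reduce the random uncertainty of, the other.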
Correlation measure to detect time series distances, whence economy globalization
NASA Astrophysics Data System (ADS)
Miśkiewicz, Janusz; Ausloos, Marcel
2008-11-01
An instantaneous time series distance is defined through the equal-time correlation coefficient. The idea is applied to the Gross Domestic Product (GDP) yearly increments of 21 rich countries between 1950 and 2005 in order to test the process of economic globalisation. Some data discussion is first presented to decide which (EKS, GK, or derived) GDP series should be studied. Distances are then calculated from the correlation coefficient values between pairs of series. The role of time averaging of the distances over finite-size windows is discussed. Three network structures are next constructed based on the hierarchy of distances. It is shown that the mean distance between the most developed countries on several networks actually decreases in time, which we consider a proof of globalization. An empirical law is found for the evolution after 1990, similar to that found in flux creep. The optimal observation time window size is found to be ≃15 years.
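A minimal version of the correlation-based distance between two increment series might look like this. The exact mapping from correlation to distance used by Miśkiewicz and Ausloos is not given in the abstract, so the common sqrt((1 - rho)/2) choice is an assumption.

```python
import numpy as np

def correlation_distance(a, b):
    """Distance from the equal-time correlation of yearly increments:
    d = sqrt((1 - rho) / 2), so d = 0 for perfectly correlated series
    and d = 1 for perfectly anti-correlated ones (illustrative mapping)."""
    rho = np.corrcoef(np.diff(a), np.diff(b))[0, 1]
    return float(np.sqrt(max(0.0, (1.0 - rho) / 2.0)))
```

Computing this distance for every country pair yields the matrix from which minimal-spanning-tree or hierarchy-based networks can then be built, as in the abstract.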
Forecasting Smoothed Non-Stationary Time Series Using Genetic Algorithms
NASA Astrophysics Data System (ADS)
Norouzzadeh, P.; Rahmani, B.; Norouzzadeh, M. S.
We introduce a kernel smoothing method to extract the global trend of a time series and remove short-time-scale variations and fluctuations from it. A multifractal detrended fluctuation analysis (MF-DFA) shows that the multifractal nature of the TEPIX returns time series is due to both the fatness of the probability density function of returns and long-range correlations between them. The MF-DFA results help us to understand how the genetic algorithm and kernel smoothing methods act. We then utilize a recently developed genetic algorithm to carry out successful forecasts of the trend in financial time series and to derive a functional form of the Tehran price index (TEPIX) that best approximates its time variability. The final model is mainly dominated by a linear relationship with the most recent past value, while contributions from nonlinear terms to the total forecasting performance are rather small.
Exploratory Causal Analysis in Bivariate Time Series Data
NASA Astrophysics Data System (ADS)
McCracken, James M.
Many scientific disciplines rely on observational data of systems for which it is difficult (or impossible) to implement controlled experiments, and data analysis techniques are required for identifying causal information and relationships directly from observational data. This need has led to the development of many different time series causality approaches and tools including transfer entropy, convergent cross-mapping (CCM), and Granger causality statistics. In this thesis, the existing time series causality method of CCM is extended by introducing a new method called pairwise asymmetric inference (PAI). It is found that CCM may provide counter-intuitive causal inferences for simple dynamics with strong intuitive notions of causality, and the CCM causal inference can be a function of physical parameters that are seemingly unrelated to the existence of a driving relationship in the system. For example, a CCM causal inference might alternate between "voltage drives current" and "current drives voltage" as the frequency of the voltage signal is changed in a series circuit with a single resistor and inductor. PAI is introduced to address both of these limitations. Many of the current approaches in the time series causality literature are not computationally straightforward to apply, do not follow directly from assumptions of probabilistic causality, depend on assumed models for the time series generating process, or rely on embedding procedures. A new approach, called causal leaning, is introduced in this work to avoid these issues. The leaning is found to provide causal inferences that agree with intuition for both simple systems and more complicated empirical examples, including space weather data sets. The leaning may provide a clearer interpretation of the results than those from existing time series causality tools. A practicing analyst can explore the literature to find many proposals for identifying drivers and causal connections in time series data
Evaluation of scaling invariance embedded in short time series.
Pan, Xue; Hou, Lei; Stephen, Mutua; Yang, Huijie; Zhu, Chenping
2014-01-01
Scaling invariance of time series has been making great contributions in diverse research fields. But how to evaluate the scaling exponent from a real-world series is still an open problem. The finite length of a time series may induce unacceptable fluctuation and bias in statistical quantities and a consequent invalidation of currently used standard methods. In this paper a new concept called correlation-dependent balanced estimation of diffusion entropy is developed to evaluate scale invariance in very short time series with length ~10^2. Calculations with specified Hurst exponent values of 0.2, 0.3, ..., 0.9 show that by using the standard central moving average de-trending procedure this method can evaluate the scaling exponents for short time series with ignorable bias (≤0.03) and a sharp confidence interval (standard deviation ≤0.05). Considering the stride series from ten volunteers along an approximate oval path of a specified length, we observe that though the averages and deviations of scaling exponents are close, their evolutionary behaviors display rich patterns. It has potential use in analyzing physiological signals, detecting early warning signals, and so on. As an emphasis, our core contribution is that by means of the proposed method one can precisely estimate Shannon entropy from limited records.
Detection of flood events in hydrological discharge time series
NASA Astrophysics Data System (ADS)
Seibert, S. P.; Ehret, U.
2012-04-01
The shortcomings of mean-squared-error (MSE) based distance metrics are well known (Beran 1999, Schaeffli & Gupta 2007) and the development of novel distance metrics (Pappenberger & Beven 2004, Ehret & Zehe 2011) and multi-criteria approaches enjoy increasing popularity (Reusser 2009, Gupta et al. 2009). Nevertheless, the hydrological community still lacks metrics which identify specific features and thus allow signature-based evaluations of hydrological discharge time series. Signature-based information/evaluations are required wherever specific time series features, such as flood events, are of special concern. Calculation of event-based runoff coefficients or precise knowledge of flood event characteristics (such as the onset or duration of the rising limb or the volume of the falling limb, etc.) are possible applications. The same applies for flood forecasting/simulation models. Directly comparing simulated and observed flood event features may reveal thorough insights into model dynamics. Compared to continuous space-and-time-aggregated distance metrics, event-based evaluations may provide answers like the distributions of event characteristics or the percentage of the events which were actually reproduced by a hydrological model. It also may help to provide information on the simulation accuracy of small, medium and/or large events in terms of timing and magnitude. However, the number of approaches which expose time series features is small and their usage is limited to very specific questions (Merz & Blöschl 2009, Norbiato et al. 2009). We believe this is due to the following reasons: i) a generally accepted definition of the signature of interest is missing or difficult to obtain (in our case: what makes a flood event a flood event?) and/or ii) it is difficult to translate such a definition into an equation or (graphical) procedure which exposes the feature of interest in the discharge time series. We reviewed approaches which detect event starts and/or ends in hydrological discharge time
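As a point of reference for the event-detection problem discussed above, here is the simplest possible baseline in Python: a fixed-threshold detector that returns the start and end indices of each exceedance run. It is a hedged sketch, not one of the reviewed approaches, and a fixed threshold is exactly the kind of crude event definition the abstract argues is insufficient.

```python
def detect_events(q, threshold):
    """Return (start, end) index pairs of contiguous runs where q exceeds threshold."""
    events, start = [], None
    for i, v in enumerate(q):
        if v > threshold and start is None:
            start = i                      # event onset
        elif v <= threshold and start is not None:
            events.append((start, i - 1))  # event end (last index above threshold)
            start = None
    if start is not None:                  # series ends while still above threshold
        events.append((start, len(q) - 1))
    return events
```

From the (start, end) pairs, event durations and volumes (sums over the run) follow directly, which is what event-based runoff coefficients require.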
Stochastic modeling of hourly rainfall times series in Campania (Italy)
NASA Astrophysics Data System (ADS)
Giorgio, M.; Greco, R.
2009-04-01
Occurrence of flowslides and floods in small catchments is difficult to predict, since it is affected by a number of variables, such as mechanical and hydraulic soil properties, slope morphology, vegetation coverage, and rainfall spatial and temporal variability. Consequently, landslide risk assessment procedures and early warning systems still rely on simple empirical models based on correlation between recorded rainfall data and observed landslides and/or river discharges. Effectiveness of such systems could be improved by reliable quantitative rainfall prediction, which can allow gaining larger lead times. Analysis of on-site recorded rainfall height time series represents the most effective approach for a reliable prediction of the local temporal evolution of rainfall. Hydrological time series analysis is a widely studied field in hydrology, often carried out by means of autoregressive models, such as AR, ARMA, ARX, ARMAX (e.g. Salas [1992]). Such models gave the best results when applied to the analysis of autocorrelated hydrological time series, like river flow or level time series. Conversely, they are not able to model the behaviour of intermittent time series, like point rainfall height series usually are, especially when recorded with short sampling time intervals. More useful for this issue are the so-called DRIP (Disaggregated Rectangular Intensity Pulse) and NSRP (Neyman-Scott Rectangular Pulse) models [Heneker et al., 2001; Cowpertwait et al., 2002], usually adopted to generate synthetic point rainfall series. In this paper, the DRIP model approach is adopted, in which the sequence of rain storms and dry intervals constituting the structure of the rainfall time series is modeled as an alternating renewal process. The final aim of the study is to provide a useful tool to implement an early warning system for hydrogeological risk management. Model calibration has been carried out with hourly rainfall height data provided by the rain gauges of Campania Region civil
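The alternating renewal structure mentioned above can be sketched in a few lines: dry spells and storms alternate, with durations drawn from renewal distributions and each storm carrying a rectangular (constant) intensity pulse. This is a toy illustration with exponential durations and made-up parameter values, not the calibrated DRIP model of the paper.

```python
import numpy as np

def alternating_renewal_rainfall(n_hours, mean_dry=30.0, mean_wet=6.0,
                                 mean_intensity=2.0, seed=0):
    """Hourly rainfall as alternating dry spells and rectangular-intensity storms."""
    rng = np.random.default_rng(seed)
    rain = np.zeros(n_hours)
    t = 0
    while t < n_hours:
        t += int(rng.exponential(mean_dry)) + 1       # dry interval (hours)
        dur = int(rng.exponential(mean_wet)) + 1      # storm duration (hours)
        rain[t:t + dur] = rng.exponential(mean_intensity)  # constant pulse (mm/h)
        t += dur
    return rain

rain = alternating_renewal_rainfall(5000)
```

The resulting series is intermittent — mostly zeros with wet runs — which is exactly the behaviour the abstract notes that AR/ARMA-type models fail to capture.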
Compounding approach for univariate time series with nonstationary variances
NASA Astrophysics Data System (ADS)
Schäfer, Rudi; Barkhofen, Sonja; Guhr, Thomas; Stöckmann, Hans-Jürgen; Kuhl, Ulrich
2015-12-01
A defining feature of nonstationary systems is the time dependence of their statistical parameters. Measured time series may exhibit Gaussian statistics on short time horizons, due to the central limit theorem. The sample statistics for long time horizons, however, averages over the time-dependent variances. To model the long-term statistical behavior, we compound the local distribution with the distribution of its parameters. Here, we consider two concrete, but diverse, examples of such nonstationary systems: the turbulent air flow of a fan and a time series of foreign exchange rates. Our main focus is to empirically determine the appropriate parameter distribution for the compounding approach. To this end, we extract the relevant time scales by decomposing the time signals into windows and determine the distribution function of the thus obtained local variances.
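The windowing step described above can be demonstrated numerically. In this hedged sketch (synthetic data, illustrative parameters, not the authors' turbulence or FX series), a Gaussian signal with slowly varying volatility shows heavy-tailed long-horizon statistics, while rescaling each window by its local standard deviation recovers near-Gaussian behaviour:

```python
import numpy as np

rng = np.random.default_rng(0)
n, w = 20000, 100
# nonstationary signal: Gaussian increments with slowly varying volatility
sigma = 1.0 + 0.5 * np.sin(np.linspace(0, 20 * np.pi, n))
x = rng.normal(0.0, sigma)

# decompose into non-overlapping windows and estimate the local variance in each
windows = x.reshape(-1, w)
local_var = windows.var(axis=1)

# rescaling each window by its local std should restore Gaussian statistics
z = (windows / np.sqrt(local_var)[:, None]).ravel()

def excess_kurtosis(u):
    """Excess kurtosis: 0 for a Gaussian, positive for heavy tails."""
    u = u - u.mean()
    return (u ** 4).mean() / (u ** 2).mean() ** 2 - 3.0
```

The raw series x has clearly positive excess kurtosis (the compounded, long-horizon statistics), whereas the locally normalized series z is close to Gaussian, matching the picture in the abstract.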
Generalized Dynamic Factor Models for Mixed-Measurement Time Series
Cui, Kai; Dunson, David B.
2013-01-01
In this article, we propose generalized Bayesian dynamic factor models for jointly modeling mixed-measurement time series. The framework allows mixed-scale measurements associated with each time series, with different measurements having different distributions in the exponential family conditionally on time-varying latent factor(s). Efficient Bayesian computational algorithms are developed for posterior inference on both the latent factors and model parameters, based on a Metropolis-Hastings algorithm with adaptive proposals. The algorithm relies on a Greedy Density Kernel Approximation (GDKA) and parameter expansion with latent factor normalization. We tested the framework and algorithms in simulated studies and applied them to the analysis of intertwined credit and recovery risk for Moody's rated firms from 1982–2008, illustrating the importance of jointly modeling mixed-measurement time series. The article has supplemental materials available online. PMID:24791133
Orbital maneuver optimization using time-explicit power series
NASA Astrophysics Data System (ADS)
Thorne, James D.
2011-05-01
Orbital maneuver transfer time optimization is traditionally accomplished using direct numerical sampling to find the mission design with the lowest delta-v requirements. The availability of explicit time series solutions to the Lambert orbit determination problem allows for the total delta-v of a series of orbital maneuvers to be expressed as an algebraic function of only the individual transfer times. The delta-v function is then minimized for a series of maneuvers by finding the optimal transfer times for each orbital arc. Results are shown for the classical example of the Hohmann transfer, a noncoplanar transfer as well as an interplanetary fly-by mission to the asteroids Pallas and Juno.
Minimum entropy density method for the time series analysis
NASA Astrophysics Data System (ADS)
Lee, Jeong Won; Park, Joongwoo Brian; Jo, Hang-Hyun; Yang, Jae-Suk; Moon, Hie-Tae
2009-01-01
The entropy density is an intuitive and powerful concept to study the complicated nonlinear processes derived from physical systems. We develop the minimum entropy density method (MEDM) to detect the structure scale of a given time series, which is defined as the scale at which the uncertainty is minimized and hence the pattern is most clearly revealed. The MEDM is applied to the financial time series of the Standard and Poor's 500 index from February 1983 to April 2006. The temporal behavior of the structure scale is then obtained and analyzed in relation to the information delivery time and the efficient market hypothesis.
An introduction to chaotic and random time series analysis
NASA Technical Reports Server (NTRS)
Scargle, Jeffrey D.
1989-01-01
The origin of chaotic behavior and the relation of chaos to randomness are explained. Two mathematical results are described: (1) a representation theorem guarantees the existence of a specific time-domain model for chaos and addresses the relation between chaotic, random, and strictly deterministic processes; (2) a theorem assures that information on the behavior of a physical system in its complete state space can be extracted from time-series data on a single observable. Focus is placed on an important connection between the dynamical state space and an observable time series. These two results lead to a practical deconvolution technique combining standard random process modeling methods with new embedding techniques.
Wavelet analysis for non-stationary, nonlinear time series
NASA Astrophysics Data System (ADS)
Schulte, Justin A.
2016-08-01
Methods for detecting and quantifying nonlinearities in nonstationary time series are introduced and developed. In particular, higher-order wavelet analysis was applied to an ideal time series and the quasi-biennial oscillation (QBO) time series. Multiple-testing problems inherent in wavelet analysis were addressed by controlling the false discovery rate. A new local autobicoherence spectrum facilitated the detection of local nonlinearities and the quantification of cycle geometry. The local autobicoherence spectrum of the QBO time series showed that the QBO time series contained a mode with a period of 28 months that was phase coupled to a harmonic with a period of 14 months. An additional nonlinearly interacting triad was found among modes with periods of 10, 16 and 26 months. Local biphase spectra determined that the nonlinear interactions were not quadratic and that the effect of the nonlinearities was to produce non-smoothly varying oscillations. The oscillations were found to be skewed so that negative QBO regimes were preferred, and also asymmetric in the sense that phase transitions between the easterly and westerly phases occurred more rapidly than those from westerly to easterly regimes.
Multitask Gaussian processes for multivariate physiological time-series analysis.
Dürichen, Robert; Pimentel, Marco A F; Clifton, Lei; Schweikard, Achim; Clifton, David A
2015-01-01
Gaussian process (GP) models are a flexible means of performing nonparametric Bayesian regression. However, GP models in healthcare are often only used to model a single univariate output time series, denoted as single-task GPs (STGP). Due to an increasing prevalence of sensors in healthcare settings, there is an urgent need for robust multivariate time-series tools. Here, we propose a method using multitask GPs (MTGPs) which can model multiple correlated multivariate physiological time series simultaneously. The flexible MTGP framework can learn the correlation between multiple signals even though they might be sampled at different frequencies and have training sets available for different intervals. Furthermore, prior knowledge of any relationship between the time series such as delays and temporal behavior can be easily integrated. A novel normalization is proposed to allow interpretation of the various hyperparameters used in the MTGP. We investigate MTGPs for physiological monitoring with synthetic data sets and two real-world problems from the field of patient monitoring and radiotherapy. The results are compared with standard Gaussian processes and other existing methods in the respective biomedical application areas. In both cases, we show that our framework learned the correlation between physiological time series efficiently, outperforming the existing state of the art.
Nonlinear Analysis of Surface EMG Time Series of Back Muscles
NASA Astrophysics Data System (ADS)
Dolton, Donald C.; Zurcher, Ulrich; Kaufman, Miron; Sung, Paul
2004-10-01
A nonlinear analysis of surface electromyography time series of subjects with and without low back pain is presented. The mean-square displacement and entropy show anomalous diffusive behavior in the intermediate time range 10 ms < t < 1 s. This behavior implies the presence of correlations in the signal. We discuss the shape of the power spectrum of the signal.
MODIS Vegetation Indices time series improvement considering real acquisition dates
NASA Astrophysics Data System (ADS)
Testa, S.; Borgogno Mondino, E.
2013-12-01
Satellite Vegetation Indices (VI) time series images are widely used for the characterization of phenology, which requires a high temporal accuracy of the satellite data. The present work is based on the MODerate resolution Imaging Spectroradiometer (MODIS) MOD13Q1 product - Vegetation Indices 16-Day L3 Global 250m, which is generated through a maximum value compositing process that reduces the number of cloudy pixels and excludes, when possible, off-nadir ones. Because of its 16-day compositing period, the distance between two adjacent-in-time values within each pixel NDVI time series can range from 1 to 32 days, which is not acceptable for phenologic studies. Moreover, most of the available smoothing algorithms, which are widely used for phenology characterization, assume that data points are equidistant in time and contemporary over the image. The objective of this work was to assess temporal features of NDVI time series over a test area, composed of Castanea sativa (chestnut) and Fagus sylvatica (beech) pure pixels within the Piemonte region in Northwestern Italy. Firstly, NDVI, Pixel Reliability (PR) and Composite Day of the Year (CDOY) data ranging from 2000 to 2011 were extracted from MOD13Q1 and the corresponding time series were generated (in further computations, 2000 was not considered since it is not complete, because acquisition began in February, and calibration is unreliable until October). Analysis of the CDOY time series (containing the actual reference date of each NDVI value) over the selected study areas showed NDVI values to be prevalently generated from data acquired at the centre of each 16-day period (the 9th day), at least constantly along the year. This leads to considering each original NDVI value as nominally placed at the centre of its 16-day reference period. Then, a new NDVI time series was generated: a) moving each NDVI value to its actual "acquisition" date, b) interpolating the obtained temporary time series through SPLINE functions, c) sampling such
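Steps a) and b) of the procedure above can be sketched as follows. All NDVI values and composite day-of-year (CDOY) offsets here are hypothetical, and plain linear interpolation stands in for the SPLINE functions used in the paper:

```python
import numpy as np

# hypothetical NDVI composite values and their actual composite day-of-year (CDOY)
nominal_start = np.arange(1, 161, 16)                      # start of each 16-day period
cdoy = nominal_start + np.array([8, 9, 7, 9, 10, 8, 9, 6, 9, 8])
ndvi = np.array([0.31, 0.35, 0.42, 0.55, 0.63, 0.70, 0.72, 0.69, 0.60, 0.48])

# a) place each NDVI value at its actual acquisition date,
# b) interpolate the resulting unevenly spaced series onto a daily grid
daily_t = np.arange(cdoy[0], cdoy[-1] + 1)
ndvi_daily = np.interp(daily_t, cdoy, ndvi)                # linear stand-in for SPLINE
```

The key point is that the interpolation knots sit at the actual acquisition dates rather than at the nominal 16-day composite boundaries, which is what restores temporal accuracy for phenology work.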
A bag-of-features framework to classify time series.
Baydogan, Mustafa Gokce; Runger, George; Tuv, Eugene
2013-11-01
Time series classification is an important task with many challenging applications. A nearest neighbor (NN) classifier with dynamic time warping (DTW) distance is a strong solution in this context. On the other hand, feature-based approaches have been proposed as both classifiers and to provide insight into the series, but these approaches have problems handling translations and dilations in local patterns. Considering these shortcomings, we present a framework to classify time series based on a bag-of-features representation (TSBF). Multiple subsequences selected from random locations and of random lengths are partitioned into shorter intervals to capture the local information. Consequently, features computed from these subsequences measure properties at different locations and dilations when viewed from the original series. This provides a feature-based approach that can handle warping (although differently from DTW). Moreover, a supervised learner (that handles mixed data types, different units, etc.) integrates location information into a compact codebook through class probability estimates. Additionally, relevant global features can easily supplement the codebook. TSBF is compared to NN classifiers and other alternatives (bag-of-words strategies, sparse spatial sample kernels, shapelets). Our experimental results show that TSBF provides better results than competitive methods on benchmark datasets from the UCR time series database.
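The subsequence-and-interval feature extraction at the heart of TSBF can be sketched as below. This is a loose illustration of the idea (random subsequences split into intervals, simple per-interval statistics plus location information), not the authors' implementation, and all parameter values are illustrative:

```python
import numpy as np

def subsequence_features(x, n_sub=10, n_int=4, seed=0):
    """One feature row per random subsequence: (mean, var, slope) for each of
    n_int intervals, plus normalized start position and length."""
    rng = np.random.default_rng(seed)
    n, rows = len(x), []
    for _ in range(n_sub):
        length = int(rng.integers(n // 4, n // 2))     # random subsequence length
        start = int(rng.integers(0, n - length + 1))   # random location
        row = []
        for part in np.array_split(x[start:start + length], n_int):
            t = np.arange(len(part))
            slope = np.polyfit(t, part, 1)[0]          # local trend
            row += [part.mean(), part.var(), slope]
        row += [start / n, length / n]                 # location information
        rows.append(row)
    return np.array(rows)

feats = subsequence_features(np.sin(np.linspace(0, 10, 200)))
```

In the full method these rows are summarized into a per-series codebook by a supervised learner; here we stop at the raw feature matrix (10 subsequences × 14 features).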
Time series, correlation matrices and random matrix models
Vinayak; Seligman, Thomas H.
2014-01-08
In this set of five lectures the authors have presented techniques to analyze open classical and quantum systems using correlation matrices. For diverse reasons we shall see that random matrices play an important role in describing a null hypothesis or a minimum information hypothesis for the description of a quantum system or subsystem. In the former case we consider various forms of correlation matrices of time series associated with the classical observables of some system. The fact that such series are necessarily finite inevitably introduces noise, and this finite-time influence leads to a random or stochastic component in these time series. Consequently random correlation matrices have a random component, and corresponding ensembles are used. In the latter we use random matrices to describe a high-temperature environment or uncontrolled perturbations, ensembles of differing chaotic systems, etc. The common theme of the lectures is thus the importance of random matrix theory in a wide range of fields in and around physics.
On fractal analysis of cardiac interbeat time series
NASA Astrophysics Data System (ADS)
Guzmán-Vargas, L.; Calleja-Quevedo, E.; Angulo-Brown, F.
2003-09-01
In recent years the complexity of a cardiac beat-to-beat time series has been taken as an auxiliary tool to identify the health status of human hearts. Several methods have been employed to characterize the time series complexity. In this work we calculate the fractal dimension of interbeat time series arising from three groups: 10 young healthy persons, 8 elderly healthy persons and 10 patients with congestive heart failure. Our numerical results reflect evident differences in the dynamic behavior corresponding to each group. We discuss these results within the context of the neuroautonomic control of heart rate dynamics. We also propose a numerical simulation which reproduces aging effects of heart rate behavior.
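The abstract does not say which fractal dimension estimator was used; a common choice for short physiological series is Higuchi's method, sketched here as an illustration. It computes curve lengths at increasing lags k and reads the dimension off the log-log slope:

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Higuchi estimate of the fractal dimension of a 1-D time series."""
    n = len(x)
    log_inv_k, log_L = [], []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):                       # one coarse curve per offset m
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            curve = np.abs(np.diff(x[idx])).sum()
            # normalize for the number of points actually used at this lag
            lengths.append(curve * (n - 1) / ((len(idx) - 1) * k) / k)
        log_L.append(np.log(np.mean(lengths)))
        log_inv_k.append(np.log(1.0 / k))
    # slope of log L(k) versus log(1/k) estimates the fractal dimension
    return np.polyfit(log_inv_k, log_L, 1)[0]
```

A straight line gives a dimension near 1 and white noise near 2; healthy interbeat series typically fall in between, which is what allows the groups to be separated.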
Characterizing Complex Time Series from the Scaling of Prediction Error.
NASA Astrophysics Data System (ADS)
Hinrichs, Brant Eric
This thesis concerns characterizing complex time series from the scaling of prediction error. We use the global modeling technique of radial basis function approximation to build models from a state-space reconstruction of a time series that otherwise appears complicated or random (i.e. aperiodic, irregular). Prediction error as a function of prediction horizon is obtained from the model using the direct method. The relationship between the underlying dynamics of the time series and the logarithmic scaling of prediction error as a function of prediction horizon is investigated. We use this relationship to characterize the dynamics of both a model chaotic system and physical data from the optic tectum of an attentive pigeon exhibiting the important phenomena of nonstationary neuronal oscillations in response to visual stimuli.
Detection of "noisy" chaos in a time series
NASA Technical Reports Server (NTRS)
Chon, K. H.; Kanters, J. K.; Cohen, R. J.; Holstein-Rathlou, N. H.
1997-01-01
Time series from biological systems often display fluctuations in the measured variables. Much effort has been directed at determining whether this variability reflects deterministic chaos, or whether it is merely "noise". The output from most biological systems is probably the result of both the internal dynamics of the systems, and the input to the system from the surroundings. This implies that the system should be viewed as a mixed system with both stochastic and deterministic components. We present a method that appears to be useful in deciding whether determinism is present in a time series, and if this determinism has chaotic attributes. The method relies on fitting a nonlinear autoregressive model to the time series followed by an estimation of the characteristic exponents of the model over the observed probability distribution of states for the system. The method is tested by computer simulations, and applied to heart rate variability data.
Improvements in Accurate GPS Positioning Using Time Series Analysis
NASA Astrophysics Data System (ADS)
Koyama, Yuichiro; Tanaka, Toshiyuki
Although the Global Positioning System (GPS) is used widely in car navigation systems, cell phones, surveying, and other areas, several issues still exist. We focus on the continuous data received in public use of GPS, and propose a new positioning algorithm that uses time series analysis. By fitting an autoregressive model to the time series model of the pseudorange, we propose an appropriate state-space model. We apply the Kalman filter to the state-space model and use the pseudorange estimated by the filter in our positioning calculations. The results of the authors' positioning experiment show that the accuracy of the proposed method is much better than that of the standard method. In addition, as we can obtain valid values estimated by time series analysis using the state-space model, the proposed state-space model can be applied to several other fields.
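The filtering step can be illustrated with a deliberately simplified sketch: a scalar Kalman filter with a random-walk state model applied to a noisy, slowly drifting pseudorange. The paper fits an autoregressive state-space model; this toy version (illustrative units and noise parameters throughout) only shows the predict/update mechanics:

```python
import numpy as np

def kalman_filter_1d(z, q=1e-3, r=1.0):
    """Scalar Kalman filter assuming a random-walk state: x[k] = x[k-1] + noise."""
    xhat, P, out = z[0], 1.0, []
    for zi in z:
        P = P + q                        # predict: state variance grows by q
        K = P / (P + r)                  # Kalman gain
        xhat = xhat + K * (zi - xhat)    # update with measurement zi
        P = (1.0 - K) * P
        out.append(xhat)
    return np.array(out)

rng = np.random.default_rng(0)
true_range = np.linspace(100.0, 101.0, 500)        # slowly drifting pseudorange (toy units)
z = true_range + rng.normal(0, 0.5, 500)           # noisy measurements
est = kalman_filter_1d(z, q=1e-4, r=0.25)
```

Feeding the filtered pseudorange into the positioning solution, rather than the raw measurement, is what the abstract reports as improving accuracy.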
A multidisciplinary database for geophysical time series management
NASA Astrophysics Data System (ADS)
Montalto, P.; Aliotta, M.; Cassisi, C.; Prestifilippo, M.; Cannata, A.
2013-12-01
The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. With the term time series we refer to a set of observations of a given phenomenon acquired sequentially in time. When the time intervals are equally spaced one speaks of a sampling period or sampling frequency. Our work describes in detail a possible methodology for storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), in order to acquire time series from different data sources and standardize them within a relational database. The operation of standardization provides the ability to perform operations, such as query and visualization, on many measures, synchronizing them using a common time scale. The proposed architecture follows a multiple layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in performing particular operations for the reorganization and archiving of data from different sources such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the loader layer performs a security check of the working status of each running software component through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although our system has to manage huge amounts of data, performance is guaranteed by a smart table partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, which provide the possibility to query different time series on a specified time range, or to follow the real-time signal acquisition, according to a data access policy for the users.
Detecting multiple breaks in geodetic time series using indicator saturation.
NASA Astrophysics Data System (ADS)
Jackson, Luke; Pretis, Felix
2016-04-01
Identifying the timing and magnitude of breaks in geodetic time series has been the source of much discussion. Instruments recording different geophysical phenomena may record long term trends, quasi-periodic signals at a variety of time scales from days to decades, and sudden breaks due to natural or anthropogenic causes, ranging from instrument replacement to earthquakes. Records cannot always be relied upon to be continuous in time, yet one may desire to accurately bridge gaps without performing interpolation. We apply the novel Indicator Saturation (IS) method to identify breaks in a synthetic GPS time series used for the Detection of Offsets in GPS Experiments (DOGEX). The IS approach differs from alternative break detection methods by considering every point in the time series as a break, until it is demonstrated statistically that it is not. Saturating a model with a full set of break functions and removing all but the significant ones formulates the detection of breaks as a problem of model selection. This allows multiple breaks of different forms (from impulses to shifts in the mean and changing trends) to be detected without requiring a minimum break length, while simultaneously modelling any underlying variation driven by additional covariates. To address selection bias in the coefficients, we demonstrate the bias-corrected estimates of break coefficients when using step-shifts in the mean of the modelled time-series. The regimes of the time-varying mean of the time-series (the `coefficient path' of the intercept determined by the detected breaks) can be used to conduct hypothesis tests on whether subsequent shifts offset each other - for example whether a measurement change induces a temporary bias rather than a permanent one. We explore this non-classical analysis method to see if it can bring about the sub-millimetre errors in long term rates of land motion currently required by the GPS community.
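The break-as-regressor idea can be illustrated with a one-dummy-at-a-time sketch: score a step indicator at every interior point and keep the candidate with the largest |t|-statistic. This is only a toy version — the actual IS algorithm saturates the model with all indicators at once and removes them by block-wise selection — and the synthetic series below is made up for illustration:

```python
import numpy as np

def best_step_break(y):
    """Score a step dummy at every interior point; return (index, t-stat) of the
    candidate with the largest |t| (one-at-a-time sketch, not full saturation)."""
    n = len(y)
    best = (None, 0.0)
    for j in range(2, n - 2):
        d = (np.arange(n) >= j).astype(float)          # step indicator at j
        X = np.column_stack([np.ones(n), d])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        s2 = resid @ resid / (n - 2)
        t = beta[1] / np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
        if abs(t) > abs(best[1]):
            best = (j, t)
    return best

rng = np.random.default_rng(0)
# synthetic series with a mean shift of +3 at index 50
y = np.concatenate([rng.normal(0, 0.5, 50), rng.normal(3, 0.5, 50)])
j, t = best_step_break(y)
```

With a clear shift the maximizing index lands at (or next to) the true break, and its t-statistic is far beyond any conventional critical value.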
Microbial oceanography and the Hawaii Ocean Time-series programme.
Karl, David M; Church, Matthew J
2014-10-01
The Hawaii Ocean Time-series (HOT) programme has been tracking microbial and biogeochemical processes in the North Pacific Subtropical Gyre since October 1988. The near-monthly time series observations have revealed previously undocumented phenomena within a temporally dynamic ecosystem that is vulnerable to climate change. Novel microorganisms, genes and unexpected metabolic pathways have been discovered and are being integrated into our evolving ecological paradigms. Continued research, including higher-frequency observations and at-sea experimentation, will help to provide a comprehensive scientific understanding of microbial processes in the largest biome on Earth.
Application of nonlinear time series models to driven systems
Hunter, N.F. Jr.
1990-01-01
In our laboratory we have been engaged in an effort to model nonlinear systems using time series methods. Our objectives have been, first, to understand how the time series response of a nonlinear system unfolds as a function of the underlying state variables, second, to model the evolution of the state variables, and finally, to predict nonlinear system responses. We hope to address the relationship between model parameters and system parameters in the near future. Control of nonlinear systems based on experimentally derived parameters is also a planned topic of future research. 28 refs., 15 figs., 2 tabs.
Causal inference with multiple time series: principles and problems.
Eichler, Michael
2013-08-28
I review the use of the concept of Granger causality for causal inference from time-series data. First, I give a theoretical justification by relating the concept to other theoretical causality measures. Second, I outline possible problems with spurious causality and approaches to tackle these problems. Finally, I sketch an identification algorithm that learns causal time-series structures in the presence of latent variables. The description of the algorithm is non-technical and thus accessible to applied scientists who are interested in adopting the method.
Period04: Statistical analysis of large astronomical time series
NASA Astrophysics Data System (ADS)
Lenz, Patrick; Breger, Michel
2014-07-01
Period04 statistically analyzes large astronomical time series containing gaps. It calculates formal uncertainties, can extract the individual frequencies from the multiperiodic content of time series, and provides a flexible interface to perform multiple-frequency fits with a combination of least-squares fitting and the discrete Fourier transform algorithm. Period04, written in Java/C++, supports the SAMP communication protocol to provide interoperability with other applications of the Virtual Observatory. It is a reworked and extended version of Period98 (Sperl 1998) and PERIOD/PERDET (Breger 1990).
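The least-squares side of the fitting procedure can be sketched independently of the package: at a fixed trial frequency, the offset, amplitude and phase follow from an ordinary linear fit of sine and cosine terms, which works on gappy, unevenly sampled series. This is a generic illustration of the technique, not Period04's code, and the data below are synthetic:

```python
import numpy as np

def fit_frequency(t, y, freq):
    """Least-squares fit of offset + sinusoid at a fixed trial frequency.
    Handles uneven sampling and gaps because no FFT grid is required."""
    X = np.column_stack([np.ones_like(t),
                         np.sin(2 * np.pi * freq * t),
                         np.cos(2 * np.pi * freq * t)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    amplitude = np.hypot(beta[1], beta[2])   # amplitude from the two quadratures
    return beta[0], amplitude

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 50, 300))         # gappy, uneven sampling times
y = 2.0 + 1.5 * np.sin(2 * np.pi * 0.1 * t + 0.3) + rng.normal(0, 0.1, 300)
offset, amp = fit_frequency(t, y, 0.1)
```

In a multiperiodic fit the design matrix simply gains a sine/cosine column pair per frequency, with the frequencies themselves refined by nonlinear optimization.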
Adaptive median filtering for preprocessing of time series measurements
NASA Technical Reports Server (NTRS)
Paunonen, Matti
1993-01-01
A median (L1-norm) filtering program using polynomials was developed. This program was used in automatic recycling data screening. Additionally, a special adaptive program to work with asymmetric distributions was developed. Examples of adaptive median filtering of satellite laser range observations and TV satellite time measurements are given. The program proved to be versatile and time saving in data screening of time series measurements.
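A minimal running-median screening step of the kind described can be sketched as follows (a generic despiking filter with a robust MAD scale, not the polynomial L1-norm program of the report; window size and threshold are illustrative):

```python
import numpy as np

def median_despike(x, window=5, k=5.0):
    """Flag points far from a running median (robust MAD scale) and replace them."""
    half = window // 2
    med = np.array([np.median(x[max(0, i - half):i + half + 1])
                    for i in range(len(x))])
    resid = x - med
    mad = np.median(np.abs(resid)) + 1e-12        # robust scale estimate
    outliers = np.abs(resid) > k * 1.4826 * mad   # 1.4826*MAD ~ std for Gaussians
    cleaned = np.where(outliers, med, x)          # replace flagged points
    return cleaned, outliers

rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 6, 200)) + rng.normal(0, 0.05, 200)
x[50] += 5.0                                      # inject two spikes
x[120] -= 4.0
cleaned, outliers = median_despike(x)
```

Because the median and MAD are insensitive to the spikes themselves, the filter removes gross outliers without smearing them into neighbouring samples, which is why median screening suits range and timing observations.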
Kālī: Time series data modeler
NASA Astrophysics Data System (ADS)
Kasliwal, Vishal P.
2016-07-01
The fully parallelized and vectorized software package Kālī models time series data using various stochastic processes such as continuous-time ARMA (C-ARMA) processes and uses Bayesian Markov Chain Monte-Carlo (MCMC) for inferring a stochastic light curve. Kālī is written in C++ with Python language bindings for ease of use. Kālī is named jointly after the Hindu goddess of time, change, and power and also as an acronym for KArma LIbrary.
Power Computations in Time Series Analyses for Traffic Safety Interventions
McLeod, A. Ian; Vingilis, E. R.
2008-01-01
The evaluation of traffic safety interventions or other policies that can affect road safety often requires the collection of administrative time series data, such as monthly motor vehicle collision data that may be difficult and/or expensive to collect. Furthermore, since policy decisions may be based on the results found from the intervention analysis of the policy, it is important to ensure that the statistical tests have enough power, that is, that we have collected enough time series data both before and after the intervention so that a meaningful change in the series will likely be detected. In this short paper we present a simple methodology for doing this. It is expected that the methodology presented will be useful for sample size determination in a wide variety of traffic safety intervention analysis applications. Our method is illustrated with a proposed traffic safety study that was funded by NIH. PMID:18460394
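The sample-size logic can be illustrated by Monte-Carlo simulation. This hedged sketch estimates the power of a simple two-sample test for a level shift at the intervention, assuming independent observations — a deliberate simplification, since the paper's setting involves autocorrelated monthly series where an ARIMA intervention model would be used instead:

```python
import numpy as np

def intervention_power(n_pre, n_post, effect, sigma=1.0, n_sim=2000, seed=0):
    """Monte-Carlo power of a two-sample test for a post-intervention level shift,
    assuming independent observations (no autocorrelation)."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sim):
        pre = rng.normal(0.0, sigma, n_pre)
        post = rng.normal(effect, sigma, n_post)
        # pooled-variance two-sample t statistic
        sp2 = ((n_pre - 1) * pre.var(ddof=1) + (n_post - 1) * post.var(ddof=1)) \
              / (n_pre + n_post - 2)
        t = (post.mean() - pre.mean()) / np.sqrt(sp2 * (1 / n_pre + 1 / n_post))
        hits += abs(t) > 1.96          # large-sample 5% critical value
    return hits / n_sim
```

Running this over a grid of (n_pre, n_post) values shows how many months of data must be collected before and after the intervention to reach, say, 80% power for a given effect size; accounting for serial correlation would lower the effective power further.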
Learning time series evolution by unsupervised extraction of correlations
Deco, G.; Schuermann, B. )
1995-03-01
As a consequence, we are able to model chaotic and nonchaotic time series. Furthermore, one critical point in modeling time series is the determination of the dimension of the embedding vector used, i.e., the number of components of the past that are needed to predict the future. With this method we can detect the embedding dimension by extracting the influence of the past on the future, i.e., the correlation of remote past and future. Optimal embedding dimensions are obtained for the Henon map and the Mackey-Glass series. When noisy data corrupted by colored noise are used, a model is still possible. The noise will then be decorrelated by the network. In the case of modeling a chemical reaction, the most natural architecture that conserves the volume is a symplectic network which describes a system that conserves the entropy and therefore the transmitted information.
A multiscale statistical model for time series forecasting
NASA Astrophysics Data System (ADS)
Wang, W.; Pollak, I.
2007-02-01
We propose a stochastic grammar model for random-walk-like time series that has features at several temporal scales. We use a tree structure to model these multiscale features. The inside-outside algorithm is used to estimate the model parameters. We develop an algorithm to forecast the sign of the first difference of a time series. We illustrate the algorithm using log-price series of several stocks and compare with linear prediction and a neural network approach. We furthermore illustrate our algorithm using synthetic data and show that it significantly outperforms both the linear predictor and the neural network. The construction of our synthetic data indicates what types of signals our algorithm is well suited for.
Segmentation of time series with long-range fractal correlations
Bernaola-Galván, P.; Oliver, J.L.; Hackenberg, M.; Coronado, A.V.; Ivanov, P.Ch.; Carpena, P.
2012-01-01
Segmentation is a standard method of data analysis to identify change-points dividing a nonstationary time series into homogeneous segments. However, for long-range fractal correlated series, most of the segmentation techniques detect spurious change-points which are simply due to the heterogeneities induced by the correlations and not to real nonstationarities. To avoid this oversegmentation, we present a segmentation algorithm which takes as a reference for homogeneity, instead of a random i.i.d. series, a correlated series modeled by a fractional noise with the same degree of correlations as the series to be segmented. We apply our algorithm to artificial series with long-range correlations and show that it systematically detects only the change-points produced by real nonstationarities and not those created by the correlations of the signal. Further, we apply the method to the sequence of the long arm of human chromosome 21, which is known to have long-range fractal correlations. We obtain only three segments that clearly correspond to the three regions of different G + C composition revealed by means of a multi-scale wavelet plot. Similar results have been obtained when segmenting all human chromosome sequences, showing the existence of previously unknown huge compositional superstructures in the human genome. PMID:23645997
Recovery of delay time from time series based on the nearest neighbor method
NASA Astrophysics Data System (ADS)
Prokhorov, M. D.; Ponomarenko, V. I.; Khorev, V. S.
2013-12-01
We propose a method for the recovery of delay time from time series of time-delay systems. The method is based on nearest neighbor analysis and allows one to reconstruct delays in various classes of time-delay systems, including systems of high order, systems with several coexisting delays, and nonscalar time-delay systems. It can be applied to time series heavily corrupted by additive and dynamical noise.
Ocean time-series near Bermuda: Hydrostation S and the US JGOFS Bermuda Atlantic time-series study
NASA Technical Reports Server (NTRS)
Michaels, Anthony F.; Knap, Anthony H.
1992-01-01
Bermuda is the site of two ocean time-series programs. At Hydrostation S, the ongoing biweekly profiles of temperature, salinity and oxygen now span 37 years. This is one of the longest open-ocean time-series data sets and provides a view of decadal scale variability in ocean processes. In 1988, the U.S. JGOFS Bermuda Atlantic Time-series Study began a wide range of measurements at a frequency of 14-18 cruises each year to understand temporal variability in ocean biogeochemistry. On each cruise, the data range from chemical analyses of discrete water samples to data from electronic packages of hydrographic and optics sensors. In addition, a range of biological and geochemical rate measurements are conducted that integrate over time-periods of minutes to days. This sampling strategy yields a reasonable resolution of the major seasonal patterns and of decadal scale variability. The Sargasso Sea also has a variety of episodic production events on scales of days to weeks and these are only poorly resolved. In addition, there is a substantial amount of mesoscale variability in this region and some of the perceived temporal patterns are caused by the intersection of the biweekly sampling with the natural spatial variability. In the Bermuda time-series programs, we have added a series of additional cruises to begin to assess these other sources of variation and their impacts on the interpretation of the main time-series record. However, the adequate resolution of higher frequency temporal patterns will probably require the introduction of new sampling strategies and some emerging technologies such as biogeochemical moorings and autonomous underwater vehicles.
A data mining framework for time series estimation.
Hu, Xiao; Xu, Peng; Wu, Shaozhi; Asgari, Shadnaz; Bergsneider, Marvin
2010-04-01
Time series estimation techniques are usually employed in biomedical research to derive variables less accessible from a set of related and more accessible variables. These techniques are traditionally built from systems modeling approaches including simulation, blind deconvolution, and state estimation. In this work, we define target time series (TTS) and its related time series (RTS) as the output and input of a time series estimation process, respectively. We then propose a novel data mining framework for time series estimation when TTS and RTS represent different sets of observed variables from the same dynamic system. This is made possible by mining a database of instances of TTS, its simultaneously recorded RTS, and the input/output dynamic models between them. The key mining strategy is to formulate a mapping function for each TTS-RTS pair in the database that translates a feature vector extracted from RTS to the dissimilarity between true TTS and its estimate from the dynamic model associated with the same TTS-RTS pair. At run time, a feature vector is extracted from an inquiry RTS and supplied to the mapping function associated with each TTS-RTS pair to calculate a dissimilarity measure. An optimal TTS-RTS pair is then selected by analyzing these dissimilarity measures. The associated input/output model of the selected TTS-RTS pair is then used to simulate the TTS given the inquiry RTS as an input. An exemplary implementation was built to address a biomedical problem of noninvasive intracranial pressure assessment. The performance of the proposed method was superior to that of a simple training-free approach of finding the optimal TTS-RTS pair by a conventional similarity-based search on RTS features.
Complexity analysis of the turbulent environmental fluid flow time series
NASA Astrophysics Data System (ADS)
Mihailović, D. T.; Nikolić-Đorić, E.; Drešković, N.; Mimić, G.
2014-02-01
We have used the Kolmogorov complexities, sample and permutation entropies to quantify the randomness degree in river flow time series of two mountain rivers in Bosnia and Herzegovina, representing the turbulent environmental fluid, for the period 1926-1990. In particular, we have examined the monthly river flow time series from two rivers (the Miljacka and the Bosnia) in the mountain part of their flow and then calculated the Kolmogorov complexity (KL) based on the Lempel-Ziv Algorithm (LZA) (lower-KLL and upper-KLU), sample entropy (SE) and permutation entropy (PE) values for each time series. The results indicate that the KLL, KLU, SE and PE values in two rivers are close to each other regardless of the amplitude differences in their monthly flow rates. We have illustrated the changes in mountain river flow complexity by experiments using (i) the data set for the Bosnia River and (ii) anticipated human activities and projected climate changes. We have explored the sensitivity of considered measures in dependence on the length of time series. In addition, we have divided the period 1926-1990 into three subintervals: (a) 1926-1945, (b) 1946-1965, (c) 1966-1990, and calculated the KLL, KLU, SE, PE values for the various time series in these subintervals. It is found that during the period 1946-1965, there is a decrease in their complexities, and corresponding changes in the SE and PE, in comparison to the period 1926-1990. This complexity loss may be primarily attributed to (i) human interventions, after the Second World War, on these two rivers because of their use for water consumption and (ii) climate change in recent times.
ADAPTIVE DATA ANALYSIS OF COMPLEX FLUCTUATIONS IN PHYSIOLOGIC TIME SERIES
PENG, C.-K.; COSTA, MADALENA; GOLDBERGER, ARY L.
2009-01-01
We introduce a generic framework of dynamical complexity to understand and quantify fluctuations of physiologic time series. In particular, we discuss the importance of applying adaptive data analysis techniques, such as the empirical mode decomposition algorithm, to address the challenges of nonlinearity and nonstationarity that are typically exhibited in biological fluctuations. PMID:20041035
Structured Time Series Analysis for Human Action Segmentation and Recognition.
Dian Gong; Medioni, Gerard; Xuemei Zhao
2014-07-01
We address the problem of structure learning of human motion in order to recognize actions from a continuous monocular motion sequence of an arbitrary person from an arbitrary viewpoint. Human motion sequences are represented by multivariate time series in the joint-trajectories space. Under this structured time series framework, we first propose Kernelized Temporal Cut (KTC), an extension of previous works on change-point detection by incorporating Hilbert space embedding of distributions, to handle the nonparametric and high dimensionality issues of human motions. Experimental results demonstrate the effectiveness of our approach, which yields realtime segmentation, and produces high action segmentation accuracy. Second, a spatio-temporal manifold framework is proposed to model the latent structure of time series data. Then an efficient spatio-temporal alignment algorithm Dynamic Manifold Warping (DMW) is proposed for multivariate time series to calculate motion similarity between action sequences (segments). Furthermore, by combining the temporal segmentation algorithm and the alignment algorithm, online human action recognition can be performed by associating a few labeled examples from motion capture data. The results on human motion capture data and 3D depth sensor data demonstrate the effectiveness of the proposed approach in automatically segmenting and recognizing motion sequences, and its ability to handle noisy and partially occluded data, in the transfer learning module. PMID:26353312
The Relationship of Negative Affect and Thought: Time Series Analyses.
ERIC Educational Resources Information Center
Rubin, Amy; And Others
In recent years, the relationship between moods and thoughts has been the focus of much theorizing and some empirical work. A study was undertaken to examine the intraindividual relationship between negative affect and negative thoughts using a Box-Jenkins time series analysis. College students (N=33) completed a measure of negative mood and…
Analysis of Complex Intervention Effects in Time-Series Experiments.
ERIC Educational Resources Information Center
Bower, Cathleen
An iterative least squares procedure for analyzing the effect of various kinds of intervention in time-series data is described. There are numerous applications of this design in economics, education, and psychology, although until recently, no appropriate analysis techniques had been developed to deal with the model adequately. This paper…
The application of the transfer entropy to gappy time series
NASA Astrophysics Data System (ADS)
Kulp, C. W.; Tracy, E. R.
2009-03-01
The application of the transfer entropy to gappy symbolic time series is discussed. Although the transfer entropy can fail to correctly identify the drive-response relationship, it is able to robustly detect phase relationships. Hence, it might still be of use in applications requiring the detection of changes in these relationships.
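For symbolic series, the transfer entropy discussed above can be computed with plug-in probabilities. A sketch with history length 1 (the paper's embedding choices and its handling of gaps are not reproduced):

```python
import math
from collections import Counter

def transfer_entropy(x, y, base=2.0):
    """Plug-in transfer entropy X -> Y with history length 1:
    TE = sum over (y', y, x) of p(y', y, x) * log[ p(y' | y, x) / p(y' | y) ]."""
    n = len(y) - 1
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_next, y_now, x_now)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    singles = Counter(y[:-1])
    te = 0.0
    for (yp, yo, xo), c in triples.items():
        p_full = c / pairs_yx[(yo, xo)]             # p(y' | y, x)
        p_self = pairs_yy[(yp, yo)] / singles[yo]   # p(y' | y)
        te += (c / n) * math.log(p_full / p_self, base)
    return te
```

Under lagged copying y[t+1] = x[t], TE(x to y) approaches 1 bit while TE(y to x) stays near zero, which is the drive-response asymmetry the measure is meant to detect.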
Daily time series evapotranspiration maps for Oklahoma and Texas panhandle
Technology Transfer Automated Retrieval System (TEKTRAN)
Evapotranspiration (ET) is an important process in an ecosystem's water budget and is closely linked to its productivity. Therefore, regional-scale daily time series ET maps developed at high and medium resolutions have large utility in studying the carbon-energy-water nexus and managing water resources. ...
IDENTIFICATION OF REGIME SHIFTS IN TIME SERIES USING NEIGHBORHOOD STATISTICS
The identification of alternative dynamic regimes in ecological systems requires several lines of evidence. Previous work on time series analysis of dynamic regimes includes mainly model-fitting methods. We introduce two methods that do not use models. These approaches use state-...
Synchronization-based parameter estimation from time series
NASA Astrophysics Data System (ADS)
Parlitz, U.; Junge, L.; Kocarev, L.
1996-12-01
The parameters of a given (chaotic) dynamical model are estimated from scalar time series by adapting a computer model until it synchronizes with the given data. This parameter identification method is applied to numerically generated and experimental data from Chua's circuit.
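A toy version of this idea, assuming a 1-D logistic map where driving the model with the observed state (complete replacement) reduces the synchronization error to a one-step mismatch; the grid scan stands in for the paper's continuous parameter adaptation, and all values are illustrative:

```python
def logistic(r, x):
    return r * x * (1.0 - x)

def observed_series(r_true=3.7, x0=0.2, n=400):
    """Scalar data from a logistic map whose parameter we pretend not to know."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(logistic(r_true, xs[-1]))
    return xs

def sync_error(r, data):
    """Drive a model copy by complete replacement with the observed state;
    for a 1-D map the synchronization mismatch is the one-step error."""
    return sum((logistic(r, x) - x1) ** 2
               for x, x1 in zip(data, data[1:])) / (len(data) - 1)

def estimate_r(data, lo=3.5, hi=4.0, steps=2000):
    """Pick the parameter whose driven model synchronizes best (grid scan)."""
    return min((lo + (hi - lo) * k / steps for k in range(steps + 1)),
               key=lambda r: sync_error(r, data))
```

With noise-free data the synchronization error vanishes exactly at the true parameter; with noisy or experimental data (as in the Chua's circuit application) the minimum is shallow and a continuous adaptation rule is preferable.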
Ultrasound RF time series for classification of breast lesions.
Uniyal, Nishant; Eskandari, Hani; Abolmaesumi, Purang; Sojoudi, Samira; Gordon, Paula; Warren, Linda; Rohling, Robert N; Salcudean, Septimiu E; Moradi, Mehdi
2015-02-01
This work reports the use of ultrasound radio frequency (RF) time series analysis as a method for ultrasound-based classification of malignant breast lesions. The RF time series method is versatile and requires only a few seconds of raw ultrasound data with no need for additional instrumentation. Using the RF time series features, and a machine learning framework, we have generated malignancy maps, from the estimated cancer likelihood, for decision support in biopsy recommendation. These maps depict the likelihood of malignancy for regions of size 1 mm(2) within the suspicious lesions. We report an area under receiver operating characteristics curve of 0.86 (95% confidence interval [CI]: 0.84%-0.90%) using support vector machines and 0.81 (95% CI: 0.78-0.85) using Random Forests classification algorithms, on 22 subjects with leave-one-subject-out cross-validation. Changing the classification method yielded consistent results which indicates the robustness of this tissue typing method. The findings of this report suggest that ultrasound RF time series, along with the developed machine learning framework, can help in differentiating malignant from benign breast lesions, subsequently reducing the number of unnecessary biopsies after mammography screening. PMID:25350925
Time Series Data Visualization in World Wide Telescope
NASA Astrophysics Data System (ADS)
Fay, J.
WorldWide Telescope provides a rich set of time series visualizations for both archival and real-time data. WWT consists of desktop tools for interactive, immersive visualization and HTML5 web-based controls that can be utilized in customized web pages. WWT supports a range of display options including full dome, power walls, stereo, and virtual reality headsets.
What Makes a Coursebook Series Stand the Test of Time?
ERIC Educational Resources Information Center
Illes, Eva
2009-01-01
Intriguingly, at a time when the ELT market is inundated with state-of-the-art coursebooks teaching modern-day English, a 30-year-old series enjoys continuing popularity in some secondary schools in Hungary. Why would teachers, several of whom are school-based teacher-mentors in the vanguard of the profession, purposefully choose materials which…
A Time-Series Analysis of Hispanic Unemployment.
ERIC Educational Resources Information Center
Defreitas, Gregory
1986-01-01
This study undertakes the first systematic time-series research on the cyclical patterns and principal determinants of Hispanic joblessness in the United States. The principal findings indicate that Hispanics tend to bear a disproportionate share of increases in unemployment during recessions. (Author/CT)
Irreversibility of financial time series: A graph-theoretical approach
NASA Astrophysics Data System (ADS)
Flanagan, Ryan; Lacasa, Lucas
2016-04-01
The relation between time series irreversibility and entropy production has been recently investigated in thermodynamic systems operating away from equilibrium. In this work we explore this concept in the context of financial time series. We make use of visibility algorithms to quantify, in graph-theoretical terms, time irreversibility of 35 financial indices evolving over the period 1998-2012. We show that this metric is complementary to standard measures based on volatility and exploit it to both classify periods of financial stress and to rank companies accordingly. We then validate this approach by finding that a projection in principal components space of financial years, based on time irreversibility features, clusters together periods of financial stress from stable periods. Relations between irreversibility, efficiency and predictability are briefly discussed.
Metagenomics meets time series analysis: unraveling microbial community dynamics.
Faust, Karoline; Lahti, Leo; Gonze, Didier; de Vos, Willem M; Raes, Jeroen
2015-06-01
The recent increase in the number of microbial time series studies offers new insights into the stability and dynamics of microbial communities, from the world's oceans to human microbiota. Dedicated time series analysis tools allow taking full advantage of these data. Such tools can reveal periodic patterns, help to build predictive models or, on the contrary, quantify irregularities that make community behavior unpredictable. Microbial communities can change abruptly in response to small perturbations, linked to changing conditions or the presence of multiple stable states. With sufficient samples or time points, such alternative states can be detected. In addition, temporal variation of microbial interactions can be captured with time-varying networks. Here, we apply these techniques on multiple longitudinal datasets to illustrate their potential for microbiome research.
Multiple imputation for time series data with Amelia package
2016-01-01
Time series data are common in medical research. Many laboratory variables or study endpoints are measured repeatedly over time. Multiple imputation (MI) that ignores the time trend of a variable may give unreliable results. This article illustrates how to perform MI with the Amelia package in a clinical scenario. The Amelia package is powerful in that it allows MI for time series data. External information on the variable of interest can also be incorporated by using the prior or bound arguments. Such information may be based on previously published observations, academic consensus, and personal experience. Diagnostics of the imputation model can be performed by examining the distributions of imputed and observed values, or by using the over-imputation technique. PMID:26904578
Classification of time series patterns from complex dynamic systems
Schryver, J.C.; Rao, N.
1998-07-01
An increasing availability of high-performance computing and data storage media at decreasing cost is making possible the proliferation of large-scale numerical databases and data warehouses. Numeric warehousing enterprises on the order of hundreds of gigabytes to terabytes are a reality in many fields such as finance, retail sales, process systems monitoring, biomedical monitoring, surveillance and transportation. Large-scale databases are becoming more accessible to larger user communities through the internet, web-based applications and database connectivity. Consequently, most researchers now have access to a variety of massive datasets. This trend will probably only continue to grow over the next several years. Unfortunately, the availability of integrated tools to explore, analyze and understand the data warehoused in these archives is lagging far behind the ability to gain access to the same data. In particular, locating and identifying patterns of interest in numerical time series data is an increasingly important problem for which there are few available techniques. Temporal pattern recognition poses many interesting problems in classification, segmentation, prediction, diagnosis and anomaly detection. This research focuses on the problem of classification or characterization of numerical time series data. Highway vehicles and their drivers are examples of complex dynamic systems (CDS) which are being used by transportation agencies for field testing to generate large-scale time series datasets. Tools for effective analysis of numerical time series in databases generated by highway vehicle systems are not yet available, or have not been adapted to the target problem domain. However, analysis tools from similar domains may be adapted to the problem of classification of numerical time series data.
Mixed Spectrum Analysis on fMRI Time-Series.
Kumar, Arun; Lin, Feng; Rajapakse, Jagath C
2016-06-01
Temporal autocorrelation present in functional magnetic resonance image (fMRI) data poses challenges to its analysis. The existing approaches handling autocorrelation in fMRI time-series often presume a specific model of autocorrelation such as an auto-regressive model. The main limitation here is that the correlation structure of voxels is generally unknown and varies in different brain regions because of different levels of neurogenic noises and pulsatile effects. Enforcing a universal model on all brain regions leads to bias and loss of efficiency in the analysis. In this paper, we propose the mixed spectrum analysis of the voxel time-series to separate the discrete component corresponding to input stimuli and the continuous component carrying temporal autocorrelation. A mixed spectral analysis technique based on the M-spectral estimator is proposed, which effectively removes autocorrelation effects from voxel time-series and identifies significant peaks of the spectrum. As the proposed method does not assume any prior model for the autocorrelation effect in voxel time-series, varying correlation structure among the brain regions does not affect its performance. We have modified the standard M-spectral method for application to a spatial set of time-series by incorporating the contextual information related to the continuous spectrum of neighborhood voxels, thus reducing considerably the computation cost. The likelihood of activation is predicted by comparing the amplitude of the discrete component at the stimulus frequency across voxels of the brain, using a normal distribution and modeling spatial correlations among the likelihoods with a conditional random field. We also demonstrate the application of the proposed method in detecting other desired frequencies.
An Introductory Overview of Statistical Methods for Discrete Time Series
NASA Astrophysics Data System (ADS)
Meng, X.-L.; California-Harvard AstroStat Collaboration
2004-08-01
A number of statistical problems encountered in astrophysics are concerned with discrete time series, such as photon counts with variation in source intensity over time. This talk provides an introductory overview of the current state-of-the-art methods in statistics, including Bayesian methods aided by Markov chain Monte Carlo, for modeling and analyzing such data. These methods have also been successfully applied in other fields, such as economics.
Normalization methods in time series of platelet function assays
Van Poucke, Sven; Zhang, Zhongheng; Roest, Mark; Vukicevic, Milan; Beran, Maud; Lauwereins, Bart; Zheng, Ming-Hua; Henskens, Yvonne; Lancé, Marcus; Marcus, Abraham
2016-01-01
Platelet function can be quantitatively assessed by specific assays such as light-transmission aggregometry, multiple-electrode aggregometry measuring the response to adenosine diphosphate (ADP), arachidonic acid, collagen, and thrombin-receptor activating peptide, and viscoelastic tests such as rotational thromboelastometry (ROTEM). The task of extracting meaningful statistical and clinical information from high-dimensional data spaces in temporal multivariate clinical data represented in multivariate time series is complex. Building insightful visualizations for multivariate time series demands adequate usage of normalization techniques. In this article, various methods for data normalization (z-transformation, range transformation, proportion transformation, and interquartile range) are presented and visualized, discussing the most suited approach for platelet function data series. Normalization was calculated per assay (test) for all time points and per time point for all tests. Interquartile range, range transformation, and z-transformation preserved the correlation as calculated by the Spearman correlation test when normalized per assay (test) for all time points. When normalizing per time point for all tests, no correlation could be extracted from the charts, as was also the case when using all data as one dataset for normalization. PMID:27428217
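The four normalizations named in the abstract, and the per-assay versus per-time-point application they compare, can be sketched as follows (function names, the toy table, and its assay labels are illustrative, not the authors' code):

```python
import statistics

def z_transform(xs):
    mu, sd = statistics.mean(xs), statistics.pstdev(xs)
    return [(x - mu) / sd for x in xs]

def range_transform(xs):
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]

def proportion_transform(xs):
    total = sum(xs)
    return [x / total for x in xs]

def iqr_transform(xs):
    q = statistics.quantiles(xs, n=4)        # q[0] = Q1, q[2] = Q3
    med, iqr = statistics.median(xs), q[2] - q[0]
    return [(x - med) / iqr for x in xs]

def normalize_per_assay(table, fn):
    """Normalize each assay's values across all time points."""
    return {assay: fn(values) for assay, values in table.items()}

def normalize_per_timepoint(table, fn):
    """Normalize across assays separately at each time point."""
    assays = list(table)
    cols = [fn(list(col)) for col in zip(*(table[a] for a in assays))]
    return {a: [col[i] for col in cols] for i, a in enumerate(assays)}
```

Per-assay normalization preserves each assay's time profile (so rank correlations between assays survive), whereas per-time-point normalization mixes scales across assays, which is consistent with the correlation loss the authors report.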
Timely immunization series completion among children of immigrants.
Buelow, Victoria H; Van Hook, Jennifer
2008-02-01
This study examines the relationship between timely immunization series completion among children of immigrants and parental nativity, residential duration in the United States, and citizenship status. We analyzed data from the childhood immunization supplement of the 2000-2003 National Health Interview Surveys (NHIS). Combined 4:3:1:3:3 immunization series completion by 18 months of age served as the dependent variable. Nested logistic regression models were estimated to examine the relationship between parental nativity and timely immunization completion. Although socio-economic factors and health care access partially explained differences in timely completion by parental nativity, citizenship, and residential duration, having a foreign-born mother was associated with 14% reduced odds of completing the combined series on time compared with children of US-born mothers, net of covariates. Children of non-citizen mothers who had resided in the country for less than 5 years were the least likely to complete immunizations on time. The elimination of disparities in timely immunization completion among children requires special attention to children of newly arrived and non-citizen immigrants.
Scale-space analysis of time series in circulatory research.
Mortensen, Kim Erlend; Godtliebsen, Fred; Revhaug, Arthur
2006-12-01
Statistical analysis of time series is still inadequate within circulation research. With the advent of increasing computational power and real-time recordings from hemodynamic studies, one is increasingly dealing with vast amounts of data in time series. This paper aims to illustrate how statistical analysis using the significant nonstationarities (SiNoS) method may complement traditional repeated-measures ANOVA and linear mixed models. We applied these methods on a dataset of local hepatic and systemic circulatory changes induced by aortoportal shunting and graded liver resection. We found SiNoS analysis more comprehensive when compared with traditional statistical analysis in the following four ways: 1) the method allows better signal-to-noise detection; 2) including all data points from real time recordings in a statistical analysis permits better detection of significant features in the data; 3) analysis with multiple scales of resolution facilitates a more differentiated observation of the material; and 4) the method affords excellent visual presentation by combining group differences, time trends, and multiscale statistical analysis allowing the observer to quickly view and evaluate the material. It is our opinion that SiNoS analysis of time series is a very powerful statistical tool that may be used to complement conventional statistical methods.
National Ignition Campaign (NIC) Precision Tuning Series Shock Timing Experiments
Robey, H F; Celliers, P M
2011-07-19
A series of precision shock timing experiments has been performed on NIF. These experiments continue to adjust the laser pulse shape and employ the adjusted cone fraction (CF) in the picket (first 2 ns of the laser pulse) as determined from the re-emit experiment series. The NIF ignition laser pulse is precisely shaped and consists of a series of four impulses, which drive a corresponding series of shock waves of increasing strength to accelerate and compress the capsule ablator and fuel layer. To optimize the implosion, they tune not only the strength (or power) but also, to sub-nanosecond accuracy, the timing of the shock waves. In a well-tuned implosion, the shock waves work together to compress and heat the fuel. For the shock timing experiments, a re-entrant cone is inserted through both the hohlraum wall and the capsule ablator, allowing a direct optical view of the propagating shocks in the capsule interior using the VISAR (Velocity Interferometer System for Any Reflector) diagnostic from outside the hohlraum. To emulate the DT ice of an ignition capsule, the inside of the cone and the capsule are filled with liquid deuterium.
Time-dependent spectral analysis of epidemiological time-series with wavelets.
Cazelles, Bernard; Chavez, Mario; Magny, Guillaume Constantin de; Guégan, Jean-Francois; Hales, Simon
2007-08-22
In the current context of global infectious disease risks, a better understanding of the dynamics of major epidemics is urgently needed. Time-series analysis has appeared as an interesting approach to explore the dynamics of numerous diseases. Classical time-series methods can only be used for stationary time-series (in which the statistical properties do not vary with time). However, epidemiological time-series are typically noisy, complex and strongly non-stationary. Given this specific nature, wavelet analysis appears particularly attractive because it is well suited to the analysis of non-stationary signals. Here, we review the basic properties of the wavelet approach as an appropriate and elegant method for time-series analysis in epidemiological studies. The wavelet decomposition offers several advantages that are discussed in this paper based on epidemiological examples. In particular, the wavelet approach permits analysis of transient relationships between two signals and is especially suitable for gradual change in force by exogenous variables.
Autoregression of Quasi-Stationary Time Series (Invited)
NASA Astrophysics Data System (ADS)
Meier, T. M.; Küperkoch, L.
2009-12-01
Autoregression is a model based tool for spectral analysis and prediction of time series. It has the potential to increase the resolution of spectral estimates. However, the validity of the assumed model has to be tested. Here we briefly review methods for the determination of the parameters of autoregression and summarize properties of autoregressive prediction and autoregressive spectral analysis. Time series with a limited number of dominant frequencies varying slowly in time (quasi-stationary time series) may well be described by a time-dependent autoregressive model of low order. An algorithm for the estimation of the autoregression parameters in a moving window is presented. Time-varying dominant frequencies are estimated. The comparison to results obtained by Fourier transform based methods and the visualization of the time dependent normalized prediction error are essential for quality assessment of the results. The algorithm is applied to synthetic examples as well as to microseisms and tremor. The sensitivity of the results to the choice of model and filter parameters is discussed. Autoregressive forward prediction offers the opportunity to detect body wave phases in seismograms and to determine arrival times automatically. Examples are shown for P- and S-phases at local and regional distances. In order to determine S-wave arrival times, the autoregressive model is extended to multi-component recordings. For the detection of significant temporal changes in waveforms, the choice of the model appears to be less crucial compared to spectral analysis. Temporal changes in frequency, amplitude, phase, and polarisation are detectable by autoregressive prediction. Quality estimates of automatically determined onset times may be obtained from the slope of the absolute prediction error as a function of time and the signal-to-noise ratio.
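A minimal sketch of moving-window autoregression for quasi-stationary series: fit an AR(2) model by the Yule-Walker equations in each window and read a dominant frequency from the angle of the complex characteristic roots. The order, window width, and frequency read-out are illustrative simplifications of the approach described above:

```python
import math

def ar2_yule_walker(xs):
    """Fit AR(2) coefficients from the first two sample autocorrelations."""
    n = len(xs)
    mu = sum(xs) / n
    d = [x - mu for x in xs]
    def acov(k):
        return sum(d[t] * d[t + k] for t in range(n - k)) / n
    r0 = acov(0)
    rho1, rho2 = acov(1) / r0, acov(2) / r0
    phi1 = rho1 * (1 - rho2) / (1 - rho1 ** 2)
    phi2 = (rho2 - rho1 ** 2) / (1 - rho1 ** 2)
    return phi1, phi2

def dominant_frequency(phi1, phi2):
    """Angle of the complex AR(2) roots in cycles/sample (None if roots are real)."""
    if phi2 >= 0.0 or phi1 ** 2 + 4.0 * phi2 >= 0.0:
        return None
    return math.acos(phi1 / (2.0 * math.sqrt(-phi2))) / (2.0 * math.pi)

def moving_window_frequency(xs, width=80, step=20):
    """Track the dominant frequency through a sliding window."""
    return [dominant_frequency(*ar2_yule_walker(xs[s:s + width]))
            for s in range(0, len(xs) - width + 1, step)]
```

For onset detection one would instead monitor the one-step prediction error of the windowed AR model, which jumps when a new phase arrives; the frequency read-out above illustrates the spectral-tracking side.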
Wavelet transform approach for fitting financial time series data
NASA Astrophysics Data System (ADS)
Ahmed, Amel Abdoullah; Ismail, Mohd Tahir
2015-10-01
This study investigates a newly developed technique, combining wavelet filtering with a VEC model, to study the dynamic relationships among financial time series. A wavelet filter is used to remove noise from daily data of the NASDAQ stock market of the US and three stock markets of the Middle East and North Africa (MENA) region, namely Egypt, Jordan, and Istanbul. The data cover the period from 6/29/2001 to 5/5/2009. The returns of the series generated by the wavelet filter and of the original series are then analyzed by a cointegration test and a VEC model. The results show that the cointegration test affirms the existence of cointegration between the studied series, and that there is a long-term relationship between the US stock market and the MENA stock markets. A comparison between the proposed model and the traditional model demonstrates that the proposed model (DWT with VEC model) outperforms the traditional model (VEC model) in fitting the financial stock market series, and reveals real information about the relationships among the stock markets.
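A sketch of the wavelet-filtering step, assuming a one-level Haar transform with soft thresholding (the abstract does not specify the wavelet or decomposition depth, so this is only illustrative):

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar wavelet transform (len(x) must be even)."""
    x = np.asarray(x, float)
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def haar_idwt(approx, detail):
    """Inverse of haar_dwt."""
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2)
    x[1::2] = (approx - detail) / np.sqrt(2)
    return x

def denoise(x, threshold):
    """Soft-threshold the detail coefficients to suppress noise."""
    a, d = haar_dwt(x)
    d = np.sign(d) * np.maximum(np.abs(d) - threshold, 0.0)
    return haar_idwt(a, d)

rng = np.random.default_rng(1)
clean = np.sin(np.linspace(0, 4 * np.pi, 256))
noisy = clean + 0.3 * rng.normal(size=256)
smoothed = denoise(noisy, threshold=0.3)
```

The denoised series, rather than the raw one, would then feed the cointegration test and VEC estimation described above.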
Time series segmentation with shifting means hidden markov models
NASA Astrophysics Data System (ADS)
Kehagias, Ath.; Fortin, V.
2006-08-01
We present a new family of hidden Markov models and apply these to the segmentation of hydrological and environmental time series. The proposed hidden Markov models have a discrete state space, and their structure is inspired by the shifting means models introduced by Chernoff and Zacks and by Salas and Boes. An estimation method inspired by the EM algorithm is proposed, and we show that it can accurately identify multiple change-points in a time series. We also show that the solution obtained using this algorithm can serve as a starting point for a Markov chain Monte Carlo Bayesian estimation method, thus reducing the computing time needed for the Markov chain to converge to a stationary distribution.
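For intuition, a simple exact baseline for the same multiple change-point problem (least-squares segmentation by dynamic programming; this is not the authors' shifting-means HMM / EM procedure):

```python
import numpy as np

def segment(x, k):
    """Split x into k segments of approximately constant mean by dynamic
    programming, minimizing the total within-segment squared error."""
    x = np.asarray(x, float)
    n = len(x)
    s1 = np.concatenate([[0.0], np.cumsum(x)])
    s2 = np.concatenate([[0.0], np.cumsum(x * x)])
    def sse(i, j):  # squared error of segment x[i:j] around its mean
        m = j - i
        return s2[j] - s2[i] - (s1[j] - s1[i]) ** 2 / m
    D = np.full((k + 1, n + 1), np.inf)   # D[s, j]: best cost of x[:j] with s segments
    back = np.zeros((k + 1, n + 1), dtype=int)
    D[0, 0] = 0.0
    for s in range(1, k + 1):
        for j in range(s, n + 1):
            for i in range(s - 1, j):
                c = D[s - 1, i] + sse(i, j)
                if c < D[s, j]:
                    D[s, j], back[s, j] = c, i
    cps, j = [], n
    for s in range(k, 0, -1):
        j = back[s, j]
        cps.append(j)
    return sorted(cps)[1:]   # segment start indices, dropping the leading 0

rng = np.random.default_rng(2)
x = np.concatenate([np.zeros(30), np.full(30, 3.0), np.full(30, -2.0)])
x = x + 0.2 * rng.normal(size=90)
cps = segment(x, 3)   # estimated change-points
```

The HMM formulation in the paper adds a probabilistic state model on top of this kind of mean-shift structure, which is what makes the Bayesian MCMC refinement possible.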
Multiscale Stochastic Generator of Multivariate Met-Ocean Time Series
NASA Astrophysics Data System (ADS)
Guanche, Yanira; Mínguez, Roberto; Méndez, Fernando J.
2013-04-01
The design of maritime structures requires information on the sea state conditions that influence their behavior during their life cycle. In recent decades, there has been an increasing development of sea databases (buoys, reanalysis, satellite) that allow an accurate description of the marine climate and its interaction with a given structure in terms of functionality and stability. However, these databases have a limited time length, and their application entails an associated uncertainty. To avoid this limitation, engineers sample synthetically generated time series that are statistically consistent and allow the simulation of longer time periods. The present work proposes a hybrid methodology to deal with this issue. It is based on the combination of clustering algorithms (k-means) and an autoregressive logistic regression model (logit). Since the marine climate is directly related to the atmospheric conditions at a synoptic scale, the proposed methodology takes both systems into account, simultaneously generating circulation-pattern (weather-type) time series and the related sea state time series. The generation of these time series can be summarized in three steps: (1) by applying the clustering technique k-means, the atmospheric conditions are classified into a representative number of synoptic patterns; (2) taking into account the different covariates involved (such as seasonality, interannual variability, trends, or an autoregressive term), the autoregressive logistic model is adjusted; (3) once the model is able to simulate weather-type time series, the last step is to generate multivariate hourly met-ocean parameters related to these weather types. This is done by an autoregressive model (ARMA) for each variable, including the cross-correlation between them. To show the goodness of the proposed method, the following data have been used: Sea Level Pressure (SLP) databases from NCEP-NCAR and the Global Ocean Wave (GOW) reanalysis from IH Cantabria. The synthetic met-ocean hourly
Alignment of Noisy and Uniformly Scaled Time Series
NASA Astrophysics Data System (ADS)
Lipowsky, Constanze; Dranischnikow, Egor; Göttler, Herbert; Gottron, Thomas; Kemeter, Mathias; Schömer, Elmar
The alignment of noisy and uniformly scaled time series is an important but difficult task. Given two time series, one of which is a uniformly stretched subsequence of the other, we want to determine the stretching factor and the offset of the second time series within the first one. We adapted and enhanced different methods to address this problem: classical FFT-based approaches to determine the offset combined with a naïve search for the stretching factor or its direct computation in the frequency domain, bounded dynamic time warping and a new approach called shotgun analysis, which is inspired by sequencing and reassembling of genomes in bioinformatics. We thoroughly examined the strengths and weaknesses of the different methods on synthetic and real data sets. The FFT-based approaches are very accurate on high quality data, the shotgun approach is especially suitable for data with outliers. Dynamic time warping is a candidate for non-linear stretching or compression. We successfully applied the presented methods to identify steel coils via their thickness profiles.
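A sketch of the classical FFT-based offset search mentioned above, assuming the stretching factor is already known (here 1), so only the offset remains to be found:

```python
import numpy as np

def find_offset(long_series, short_series):
    """Locate short_series inside long_series via FFT-based cross-correlation."""
    x = np.asarray(long_series, float)
    y = np.asarray(short_series, float)
    n = len(x)
    X = np.fft.rfft(x - x.mean(), n)
    Y = np.fft.rfft(y - y.mean(), n)          # zero-padded to length n
    corr = np.fft.irfft(X * np.conj(Y), n)    # circular cross-correlation
    return int(np.argmax(corr[:n - len(y) + 1]))

rng = np.random.default_rng(3)
signal = rng.normal(size=1000)
snippet = signal[417:617]
offset = find_offset(signal, snippet)
```

In the paper's setting one would repeat such a search over candidate stretching factors (or work directly in the frequency domain), which is exactly where the shotgun and dynamic-time-warping alternatives become attractive for noisy or outlier-laden data.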
Exploring large scale time-series data using nested timelines
NASA Astrophysics Data System (ADS)
Xie, Zaixian; Ward, Matthew O.; Rundensteiner, Elke A.
2013-01-01
When data analysts study time-series data, an important task is to discover how data patterns change over time. If the dataset is very large, this task becomes challenging. Researchers have developed many visualization techniques to help address this problem. However, little work has been done regarding the changes of multivariate patterns, such as linear trends and clusters, on time-series data. In this paper, we describe a set of history views to fill this gap. This technique works under two modes: merge and non-merge. For the merge mode, merge algorithms were applied to selected time windows to generate a change-based hierarchy. Contiguous time windows having similar patterns are merged first. Users can choose different levels of merging with the tradeoff between more details in the data and less visual clutter in the visualizations. In the non-merge mode, the framework can use natural hierarchical time units or one defined by domain experts to represent timelines. This can help users navigate across long time periods. Gridbased views were designed to provide a compact overview for the history data. In addition, MDS pattern starfields and distance maps were developed to enable users to quickly investigate the degree of pattern similarity among different time periods. The usability evaluation demonstrated that most participants could understand the concepts of the history views correctly and finished assigned tasks with a high accuracy and relatively fast response time.
Reconstruction of ensembles of coupled time-delay systems from time series.
Sysoev, I V; Prokhorov, M D; Ponomarenko, V I; Bezruchko, B P
2014-06-01
We propose a method to recover from time series the parameters of coupled time-delay systems and the architecture of couplings between them. The method is based on a reconstruction of model delay-differential equations and estimation of statistical significance of couplings. It can be applied to networks composed of nonidentical nodes with an arbitrary number of unidirectional and bidirectional couplings. We test our method on chaotic and periodic time series produced by model equations of ensembles of diffusively coupled time-delay systems in the presence of noise, and apply it to experimental time series obtained from electronic oscillators with delayed feedback coupled by resistors.
Perception of acoustically presented time series with varied intervals.
Wackermann, Jiří; Pacer, Jakob; Wittmann, Marc
2014-03-01
Data from three experiments on serial perception of temporal intervals in the supra-second domain are reported. Sequences of short acoustic signals ("pips") separated by periods of silence were presented to the observers. Two types of time series, geometric or alternating, were used, where the modulus 1+δ of the inter-pip series and the base duration Tb (range from 1.1 to 6 s) were varied as independent parameters. The observers had to judge whether the series were accelerating, decelerating, or uniform (three-alternative paradigm), or to distinguish regular from irregular sequences (two-alternative paradigm). "Intervals of subjective uniformity" (ISUs) were obtained by fitting Gaussian psychometric functions to individual subjects' responses. Progression towards longer base durations (Tb = 4.4 or 6 s) shifts the ISUs towards negative δs, i.e., accelerating series. This finding is compatible with the phenomenon of "subjective shortening" of past temporal intervals, which is naturally accounted for by the lossy integration model of internal time representation. The opposite effect observed for short durations (Tb = 1.1 or 1.5 s) remains unexplained by the lossy integration model, and presents a challenge for further research.
The Puoko-nui CCD Time-Series Photometer
NASA Astrophysics Data System (ADS)
Chote, P.; Sullivan, D. J.
2013-01-01
Puoko-nui (te reo Māori for 'big eye') is a precision time-series photometer developed at Victoria University of Wellington, primarily for use with the 1 m McLellan telescope at Mt John University Observatory (MJUO), at Lake Tekapo, New Zealand. GPS-based timing provides excellent timing accuracy, and online reduction software processes frames as they are acquired. The user is presented with a simple interface that includes instrument control and an up-to-date light curve and Fourier amplitude spectrum of the target star. Puoko-nui has been operating in its current form since early 2011, and is primarily used to monitor pulsating white dwarf stars.
FTSPlot: fast time series visualization for large datasets.
Riss, Michael
2014-01-01
The analysis of electrophysiological recordings often involves visual inspection of time series data to locate specific experiment epochs, mask artifacts, and verify the results of signal processing steps, such as filtering or spike detection. Long-term experiments with continuous data acquisition generate large amounts of data. Rapid browsing through these massive datasets poses a challenge to conventional data plotting software because the plotting time increases proportionately to the increase in the volume of data. This paper presents FTSPlot, which is a visualization concept for large-scale time series datasets using techniques from the field of high performance computer graphics, such as hierarchic level of detail and out-of-core data handling. In a preprocessing step, time series data, event, and interval annotations are converted into an optimized data format, which then permits fast, interactive visualization. The preprocessing step has a computational complexity of O(n log N); the visualization itself can be done with a complexity of O(1) and is therefore independent of the amount of data. A demonstration prototype has been implemented and benchmarks show that the technology is capable of displaying large amounts of time series data, event, and interval annotations lag-free with < 20 ms. The current 64-bit implementation theoretically supports datasets with up to 2^64 bytes; on the x86_64 architecture currently up to 2^48 bytes are supported, and benchmarks have been conducted with 2^40 bytes/1 TiB or 1.3 × 10^11 double precision samples. The presented software is freely available and can be included as a Qt GUI component in future software projects, providing a standard visualization method for long-term electrophysiological experiments.
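A sketch of the hierarchic level-of-detail idea behind such viewers: precompute a pyramid of per-block (min, max) summaries so any zoom level can be drawn from a pixel-bounded number of summaries. (This illustrates the general technique only; FTSPlot's actual on-disk format differs.)

```python
import numpy as np

def build_lod(x, factor=4):
    """Pyramid of (min, max) block summaries; each level is `factor` times
    coarser than the previous, so drawing any zoom level needs only
    O(screen pixels) summaries instead of the raw samples."""
    lo = hi = np.asarray(x, float)
    levels = []
    while len(lo) > factor:
        n = (len(lo) // factor) * factor      # drop a ragged tail, if any
        lo = lo[:n].reshape(-1, factor).min(axis=1)
        hi = hi[:n].reshape(-1, factor).max(axis=1)
        levels.append((lo, hi))
    return levels

x = np.sin(np.arange(4096) * 0.01)
pyramid = build_lod(x)   # levels with 1024, 256, 64, 16, and 4 blocks
```

Because mins of mins and maxes of maxes are exact, each coarse level still bounds the raw signal, which is what makes lag-free rendering of min/max envelopes possible.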
Assessing spatial covariance among time series of abundance.
Jorgensen, Jeffrey C; Ward, Eric J; Scheuerell, Mark D; Zabel, Richard W
2016-04-01
For species of conservation concern, an essential part of the recovery planning process is identifying discrete population units and their location with respect to one another. A common feature among geographically proximate populations is that the number of organisms tends to covary through time as a consequence of similar responses to exogenous influences. In turn, high covariation among populations can threaten the persistence of the larger metapopulation. Historically, explorations of the covariance in population size of species with many (>10) time series have been computationally difficult. Here, we illustrate how dynamic factor analysis (DFA) can be used to characterize diversity among time series of population abundances and the degree to which all populations can be represented by a few common signals. Our application focuses on anadromous Chinook salmon (Oncorhynchus tshawytscha), a species listed under the US Endangered Species Act, that is impacted by a variety of natural and anthropogenic factors. Specifically, we fit DFA models to 24 time series of population abundance and used model selection to identify the minimum number of latent variables that explained the most temporal variation after accounting for the effects of environmental covariates. We found support for grouping the time series according to 5 common latent variables. The top model included two covariates: the Pacific Decadal Oscillation in spring and summer. The assignment of populations to the latent variables matched the currently established population structure at a broad spatial scale. At a finer scale, there was more population grouping complexity. Some relatively distant populations were grouped together, and some relatively close populations - considered to be more aligned with each other - were more associated with populations further away. These coarse- and fine-grained examinations of spatial structure are important because they reveal different structural patterns not evident
Rényi’s information transfer between financial time series
NASA Astrophysics Data System (ADS)
Jizba, Petr; Kleinert, Hagen; Shefaat, Mohammad
2012-05-01
In this paper, we quantify the statistical coherence between financial time series by means of the Rényi entropy. With the help of Campbell’s coding theorem, we show that the Rényi entropy selectively emphasizes only certain sectors of the underlying empirical distribution while strongly suppressing others. This accentuation is controlled with Rényi’s parameter q. To tackle the issue of the information flow between time series, we formulate the concept of Rényi’s transfer entropy as a measure of information that is transferred only between certain parts of underlying distributions. This is particularly pertinent in financial time series, where the knowledge of marginal events such as spikes or sudden jumps is of a crucial importance. We apply the Rényian information flow to stock market time series from 11 world stock indices as sampled at a daily rate in the time period 02.01.1990-31.12.2009. Corresponding heat maps and net information flows are represented graphically. A detailed discussion of the transfer entropy between the DAX and S&P500 indices based on minute tick data gathered in the period 02.04.2008-11.09.2009 is also provided. Our analysis shows that the bivariate information flow between world markets is strongly asymmetric with a distinct information surplus flowing from the Asia-Pacific region to both European and US markets. An important yet less dramatic excess of information also flows from Europe to the US. This is particularly clearly seen from a careful analysis of Rényi information flow between the DAX and S&P500 indices.
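For orientation, a minimal transfer-entropy estimator in the Shannon limit (q → 1 of the Rényi version discussed above), with history length 1 and plug-in histogram probabilities on quantile bins; this is a rough sketch, not the authors' estimator:

```python
import numpy as np

def transfer_entropy(x, y, bins=4):
    """Shannon transfer entropy TE_{Y->X} with history length 1,
    estimated from quantile-binned symbols and plug-in probabilities."""
    edges = lambda z: np.quantile(z, np.linspace(0, 1, bins + 1)[1:-1])
    xs = np.digitize(x, edges(x))
    ys = np.digitize(y, edges(y))
    xf, xp, yp = xs[1:], xs[:-1], ys[:-1]
    def H(*cols):
        _, counts = np.unique(np.stack(cols, axis=1), axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))
    # TE_{Y->X} = H(X_t | X_{t-1}) - H(X_t | X_{t-1}, Y_{t-1})
    return H(xf, xp) - H(xp) + H(xp, yp) - H(xf, xp, yp)

rng = np.random.default_rng(4)
y = rng.normal(size=5000)
x = np.empty(5000)
x[0] = rng.normal()
x[1:] = 0.8 * y[:-1] + 0.2 * rng.normal(size=4999)   # x is driven by the past of y
te_yx = transfer_entropy(x, y)   # clearly positive: y carries information about x
te_xy = transfer_entropy(y, x)   # near zero: x does not help predict y
```

The Rényi generalization replaces the Shannon entropies with Rényi entropies of order q, which is what lets the paper weight the tails (spikes, jumps) of the return distributions differently.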
Dynamical analysis and visualization of tornadoes time series.
Lopes, António M; Tenreiro Machado, J A
2015-01-01
In this paper we analyze the behavior of tornado time series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power-law functions and unveil characteristics found in systems with long-range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series spanning 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power-law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns.
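A sketch of the power-law fit to an amplitude spectrum described above, checked on a synthetic signal with a known 1/f amplitude spectrum (the tornado data themselves are not reproduced here):

```python
import numpy as np

def spectral_slope(x):
    """Fit |X(f)| ~ f^(-beta) by least squares in log-log coordinates."""
    amp = np.abs(np.fft.rfft(x - np.mean(x)))[1:]   # drop the DC bin
    f = np.fft.rfftfreq(len(x))[1:]
    slope, _ = np.polyfit(np.log(f), np.log(amp), 1)
    return -slope

# synthesize a signal whose amplitude spectrum is exactly k^-1 with random phases
rng = np.random.default_rng(5)
n = 4096
spec = np.zeros(n // 2 + 1, complex)
k = np.arange(1, n // 2 + 1)
spec[1:] = k ** -1.0 * np.exp(2j * np.pi * rng.random(n // 2))
x = np.fft.irfft(spec, n)
beta = spectral_slope(x)   # should recover an exponent near 1
```

The fitted exponent is the kind of "parameter read as an underlying signature of the system dynamics" that the abstract refers to.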
Financial time series analysis based on information categorization method
NASA Astrophysics Data System (ADS)
Tian, Qiang; Shang, Pengjian; Feng, Guochen
2014-12-01
The paper applies the information categorization method to analyze financial time series. The method examines the similarity of different sequences by calculating the distances between them. We apply this method to quantify the similarity of different stock markets, and report the results for the US and Chinese stock markets in the periods 1991-1998 (before the Asian currency crisis), 1999-2006 (after the Asian currency crisis and before the global financial crisis), and 2007-2013 (during and after the global financial crisis). The results show the difference in similarity between the stock markets in different time periods, and that the similarity of the two stock markets becomes larger after these two crises. We also obtain similarity results for 10 stock indices in three areas, showing that the method can distinguish the markets of different areas from the phylogenetic trees. The results show that we can get satisfactory information from financial markets by this method. The information categorization method can be used not only for physiologic time series, but also for financial time series.
Satellite time series analysis using Empirical Mode Decomposition
NASA Astrophysics Data System (ADS)
Pannimpullath, R. Renosh; Doolaeghe, Diane; Loisel, Hubert; Vantrepotte, Vincent; Schmitt, Francois G.
2016-04-01
Geophysical fields possess large fluctuations over many spatial and temporal scales. Successive satellite images provide an interesting sampling of this spatio-temporal multiscale variability. Here we propose to consider such variability by performing satellite time series analysis, pixel by pixel, using Empirical Mode Decomposition (EMD). EMD is a time series analysis technique able to decompose an original time series into a sum of modes, each one having a different mean frequency. It can be used to smooth signals and to extract trends. It is built in a data-adaptive way, and is able to extract information from nonlinear signals. Here we use MERIS Suspended Particulate Matter (SPM) data, on a weekly basis, during 10 years, giving 458 successive time steps. We have selected 5 different regions of coastal waters for the present study: Vietnam coastal waters, the Brahmaputra region, the St. Lawrence, the English Channel, and the Mackenzie. These regions have high SPM concentrations due to large-scale river runoff. Trend and Hurst exponents are derived for each pixel in each region. The energy of each mode is also extracted using Hilbert Spectral Analysis (HSA) along with the EMD method, and is normalized by the total energy over all modes in each region.
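A compact EMD sketch, with one deliberate simplification: linear interpolation of the extrema envelopes instead of the cubic splines used in standard EMD, to keep the code short:

```python
import numpy as np

def sift_once(x):
    """One sifting pass: subtract the mean of the upper and lower extrema
    envelopes (linear interpolation; real EMD uses cubic splines).
    Returns None when too few extrema remain."""
    t = np.arange(len(x))
    up = np.where((x[1:-1] > x[:-2]) & (x[1:-1] > x[2:]))[0] + 1
    lo = np.where((x[1:-1] < x[:-2]) & (x[1:-1] < x[2:]))[0] + 1
    if len(up) < 2 or len(lo) < 2:
        return None
    env = (np.interp(t, up, x[up]) + np.interp(t, lo, x[lo])) / 2
    return x - env

def emd(x, max_imfs=3, passes=10):
    """Decompose x into intrinsic mode functions plus a residual trend."""
    residual = np.asarray(x, float).copy()
    imfs = []
    for _ in range(max_imfs):
        h = residual.copy()
        if sift_once(h) is None:      # too few extrema: residual is the trend
            break
        for _ in range(passes):
            nxt = sift_once(h)
            if nxt is None:
                break
            h = nxt
        imfs.append(h)
        residual = residual - h
    return imfs, residual

t = np.linspace(0, 1, 1000)
x_sig = np.sin(2 * np.pi * 10 * t) + np.sin(2 * np.pi * t)
imfs, residual = emd(x_sig)   # first IMF captures the fast oscillation
```

By construction the modes and residual sum back to the original series exactly, which is the property that makes per-mode energy normalization (as done in the paper) meaningful.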
Graphical LASSO based Model Selection for Time Series
NASA Astrophysics Data System (ADS)
Jung, Alexander; Hannak, Gabor; Goertz, Norbert
2015-10-01
We propose a novel graphical model selection (GMS) scheme for high-dimensional stationary time series (discrete-time processes). The method is based on a natural generalization of the graphical LASSO (gLASSO), introduced originally for GMS based on i.i.d. samples, and estimates the conditional independence graph (CIG) of a time series from a finite-length observation. The gLASSO for time series is defined as the solution of an l1-regularized maximum (approximate) likelihood problem. We solve this optimization problem using the alternating direction method of multipliers (ADMM). Our approach is nonparametric as we do not assume a finite-dimensional (e.g., autoregressive) parametric model for the observed process. Instead, we require the process to be sufficiently smooth in the spectral domain. For Gaussian processes, we characterize the performance of our method theoretically by deriving an upper bound on the probability that our algorithm fails to correctly identify the CIG. Numerical experiments demonstrate the ability of our method to recover the correct CIG from a limited amount of samples.
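A sketch of the base algorithm the paper generalizes: the standard i.i.d. graphical LASSO solved with ADMM. (The paper's contribution is the extension to the spectral domain of a time series, which is not shown here.)

```python
import numpy as np

def graphical_lasso(S, lam, rho=1.0, iters=200):
    """ADMM for  min_Theta  -logdet(Theta) + tr(S Theta) + lam * ||Theta||_1
    (off-diagonal penalty only). Returns the sparse precision estimate."""
    p = S.shape[0]
    Z = np.eye(p)
    U = np.zeros((p, p))
    for _ in range(iters):
        # Theta-update: closed form via eigendecomposition of rho*(Z - U) - S
        w, Q = np.linalg.eigh(rho * (Z - U) - S)
        theta_eig = (w + np.sqrt(w ** 2 + 4 * rho)) / (2 * rho)
        Theta = (Q * theta_eig) @ Q.T
        # Z-update: soft-threshold the off-diagonal entries
        A = Theta + U
        Z = np.sign(A) * np.maximum(np.abs(A) - lam / rho, 0.0)
        np.fill_diagonal(Z, np.diag(A))     # leave the diagonal unpenalized
        U = U + Theta - Z
    return Z

# sparse tridiagonal precision matrix -> samples -> recover the sparsity pattern
rng = np.random.default_rng(6)
p = 5
P = 2.0 * np.eye(p) + 0.6 * (np.eye(p, k=1) + np.eye(p, k=-1))
X = rng.multivariate_normal(np.zeros(p), np.linalg.inv(P), size=4000)
S = X.T @ X / len(X)
Theta_hat = graphical_lasso(S, lam=0.1)
```

Zeros in the estimated precision matrix correspond to missing edges in the conditional independence graph, which is the object the paper's time-series gLASSO recovers frequency by frequency.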
Learning time series evolution by unsupervised extraction of correlations
NASA Astrophysics Data System (ADS)
Deco, Gustavo; Schürmann, Bernd
1995-03-01
We focus on the problem of modeling time series by learning statistical correlations between the past and present elements of the series in an unsupervised fashion. This kind of correlation is, in general, nonlinear, especially in the chaotic domain. Therefore the learning algorithm should be able to extract statistical correlations, i.e., higher-order correlations between the elements of the time signal. This problem can be viewed as a special case of factorial learning. Factorial learning may be formulated as an unsupervised redundancy reduction between the output components of a transformation that conserves the transmitted information. An information-theoretic-based architecture and learning paradigm are introduced. The neural architecture has only one layer and a triangular structure in order to transform elements by observing only the past and to conserve the volume. In this fashion, a transformation that guarantees transmission of information without loss is formulated. The learning rule decorrelates the output components of the network. Two methods are used: higher-order decorrelation by explicit evaluation of higher-order cumulants of the output distributions, and minimization of the sum of entropies of each output component in order to minimize the mutual information between them, assuming that the entropies have an upper bound given by Gibbs second theorem. After decorrelation between the output components, the correlation between the elements of the time series can be extracted by analyzing the trained neural architecture. As a consequence, we are able to model chaotic and nonchaotic time series. Furthermore, one critical point in modeling time series is the determination of the dimension of the embedding vector used, i.e., the number of components of the past that are needed to predict the future. With this method we can detect the embedding dimension by extracting the influence of the past on the future, i.e., the correlation of remote past and future
Simple Patterns in Fluctuations of Time Series of Economic Interest
NASA Astrophysics Data System (ADS)
Fanchiotti, H.; García Canal, C. A.; García Zúñiga, H.
Time series corresponding to nominal exchange rates between the US dollar and the currencies of Argentina, Brazil, and the European Economic Community; different financial indexes such as the Industrial Dow Jones, the British Footsie, the German DAX Composite, the Australian Share Price, and the Nikkei Cash; and different Argentine local tax revenues are analyzed, looking for the appearance of simple patterns and the possible definition of forecast evaluators. In every case, the statistical fractal dimensions are obtained from the behavior of the corresponding variance of increments at a given lag. The detrended fluctuation analysis of the data, in terms of the corresponding exponent in the resulting power law, is carried out. Finally, the frequency power spectra of all the time series considered are computed and compared.
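A minimal detrended fluctuation analysis (DFA) sketch of the kind used above, checked on white noise, whose DFA exponent should be close to 0.5:

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis: RMS fluctuation of the integrated
    profile around per-window linear fits; the log-log slope of fluctuation
    vs. scale is the DFA exponent alpha (0.5 for uncorrelated noise)."""
    y = np.cumsum(x - np.mean(x))          # integrated profile
    F = []
    for s in scales:
        n = (len(y) // s) * s
        t = np.arange(s)
        resid = []
        for w in y[:n].reshape(-1, s):
            resid.append(w - np.polyval(np.polyfit(t, w, 1), t))
        F.append(np.sqrt(np.mean(np.concatenate(resid) ** 2)))
    alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return alpha

rng = np.random.default_rng(7)
alpha_white = dfa(rng.normal(size=8192), scales=[16, 32, 64, 128, 256])
```

Exponents above 0.5 indicate persistent long-range correlations, which is the property the abstract probes in the exchange-rate and index series.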
Nonlinear modeling of chaotic time series: Theory and applications
NASA Astrophysics Data System (ADS)
Casdagli, M.; Eubank, S.; Farmer, J. D.; Gibson, J.; Desjardins, D.; Hunter, N.; Theiler, J.
We review recent developments in the modeling and prediction of nonlinear time series. In some cases, apparent randomness in time series may be due to chaotic behavior of a nonlinear but deterministic system. In such cases, it is possible to exploit the determinism to make short term forecasts that are much more accurate than one could make from a linear stochastic model. This is done by first reconstructing a state space, and then using nonlinear function approximation methods to create a dynamical model. Nonlinear models are valuable not only as short term forecasters, but also as diagnostic tools for identifying and quantifying low-dimensional chaotic behavior. During the past few years, methods for nonlinear modeling have developed rapidly, and have already led to several applications where nonlinear models motivated by chaotic dynamics provide superior predictions to linear models. These applications include prediction of fluid flows, sunspots, mechanical vibrations, ice ages, measles epidemics, and human speech.
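A minimal instance of the state-space reconstruction plus local function approximation scheme described above: delay-embed the series, find the nearest past states, and average their successors (one of the simplest nonlinear forecasters; the review covers far more sophisticated variants):

```python
import numpy as np

def nn_forecast(series, m, k=4):
    """One-step forecast by the local nearest-neighbor method in an
    m-dimensional delay-embedded state space."""
    x = np.asarray(series, float)
    # states s_j = (x[j], ..., x[j+m-1]), delay 1
    states = np.stack([x[i:len(x) - m + 1 + i] for i in range(m)], axis=1)
    targets = x[m:]                       # successor of every state but the last
    query = states[-1]                    # the current state
    d = np.linalg.norm(states[:-1] - query, axis=1)
    neighbors = np.argsort(d)[:k]
    return targets[neighbors].mean()

# logistic map: deterministic chaos that linear stochastic models predict poorly
x = np.empty(5000)
x[0] = 0.3
for t in range(4999):
    x[t + 1] = 3.9 * x[t] * (1 - x[t])
pred = nn_forecast(x[:-1], m=2)           # forecast of the held-out last value
```

The short-term accuracy of such a forecaster, compared against a linear benchmark, is one practical diagnostic for low-dimensional determinism in an apparently random series.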
Causality networks from multivariate time series and application to epilepsy.
Siggiridou, Elsa; Koutlis, Christos; Tsimpiris, Alkiviadis; Kimiskidis, Vasilios K; Kugiumtzis, Dimitris
2015-08-01
Granger causality and variants of this concept allow the study of complex dynamical systems as networks constructed from multivariate time series. In this work, a large number of Granger causality measures used to form causality networks from multivariate time series are assessed. For this, realizations of high-dimensional coupled dynamical systems are considered and the performance of the Granger causality measures is evaluated, seeking the measures that form networks closest to the true network of the dynamical system. In particular, the comparison focuses on Granger causality measures that reduce the state space dimension when many variables are observed. Further, the linear and nonlinear Granger causality measures of dimension reduction are compared to a standard Granger causality measure on electroencephalographic (EEG) recordings containing episodes of epileptiform discharges.
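As an illustration of the basic idea these measures build on (not the dimension-reduced variants the paper evaluates), a minimal bivariate Granger comparison can be sketched in Python: x is said to Granger-cause y if adding lags of x reduces the residual variance of an autoregression of y. The data and parameter choices below are invented for illustration.

```python
import numpy as np

def granger_ratio(x, y, p=2):
    """Ratio of restricted to full residual variance for predicting y:
    values well above 1 indicate that lags of x help, i.e. x -> y in
    the Granger sense."""
    n = len(y)
    Y = y[p:]
    lags_y = np.column_stack([y[p - k:n - k] for k in range(1, p + 1)])
    lags_x = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])
    ones = np.ones((n - p, 1))
    Xr = np.hstack([ones, lags_y])           # restricted: own lags only
    Xf = np.hstack([ones, lags_y, lags_x])   # full: own lags + lags of x
    rr = Y - Xr @ np.linalg.lstsq(Xr, Y, rcond=None)[0]
    rf = Y - Xf @ np.linalg.lstsq(Xf, Y, rcond=None)[0]
    return np.var(rr) / np.var(rf)

# Toy coupled system: x drives y with a one-step delay
rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()
print(granger_ratio(x, y))   # large: x Granger-causes y
print(granger_ratio(y, x))   # close to 1: no causality in reverse
```

In a network setting, this pairwise comparison is repeated over all ordered pairs of variables, which is where the dimension-reduction measures the paper compares become necessary.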
Fast Nonparametric Clustering of Structured Time-Series.
Hensman, James; Rattray, Magnus; Lawrence, Neil D
2015-02-01
In this publication, we combine two Bayesian nonparametric models: the Gaussian Process (GP) and the Dirichlet Process (DP). Our innovation in the GP model is to introduce a variation on the GP prior which enables us to model structured time-series data, i.e., data containing groups where we wish to model inter- and intra-group variability. Our innovation in the DP model is an implementation of a new fast collapsed variational inference procedure which enables us to optimize our variational approximation significantly faster than standard VB approaches. In a biological time series application we show how our model better captures salient features of the data, leading to better consistency with existing biological classifications, while the associated inference algorithm provides a significant speed-up over EM-based variational inference. PMID:26353249
Deviations from uniform power law scaling in nonstationary time series
NASA Technical Reports Server (NTRS)
Viswanathan, G. M.; Peng, C. K.; Stanley, H. E.; Goldberger, A. L.
1997-01-01
A classic problem in physics is the analysis of highly nonstationary time series that typically exhibit long-range correlations. Here we test the hypothesis that the scaling properties of the dynamics of healthy physiological systems are more stable than those of pathological systems by studying beat-to-beat fluctuations in the human heart rate. We develop techniques based on the Fano factor and Allan factor functions, as well as on detrended fluctuation analysis, for quantifying deviations from uniform power-law scaling in nonstationary time series. By analyzing extremely long data sets of up to N = 10^5 beats for 11 healthy subjects, we find that the fluctuations in the heart rate scale approximately uniformly over several temporal orders of magnitude. By contrast, we find that in data sets of comparable length for 14 subjects with heart disease, the fluctuations grow erratically, indicating a loss of scaling stability.
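Of the techniques mentioned, detrended fluctuation analysis is the easiest to sketch: integrate the mean-subtracted series, remove a linear trend within boxes of each size n, and read the scaling exponent off the slope of log F(n) versus log n. This is a minimal DFA-1 sketch on synthetic data, not the authors' Fano- and Allan-factor machinery:

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis (order 1): returns the scaling
    exponent alpha from a log-log fit of the RMS fluctuation F(n)
    against box size n."""
    y = np.cumsum(x - np.mean(x))            # integrated profile
    F = []
    for n in scales:
        nseg = len(y) // n
        ms = []
        for i in range(nseg):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)     # linear detrend in each box
            ms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(ms)))
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

rng = np.random.default_rng(1)
white = rng.normal(size=8192)
scales = [16, 32, 64, 128, 256]
print(dfa(white, scales))                # uncorrelated noise: alpha near 0.5
print(dfa(np.cumsum(white), scales))     # integrated noise: alpha near 1.5
```

Uncorrelated noise gives an exponent near 0.5 and its running sum near 1.5; deviations from a single uniform slope across scales are what the paper's methods are designed to quantify.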
Common structure in panels of short ecological time-series.
Yao, Q; Tong, H; Finkenstädt, B; Stenseth, N C
2000-01-01
Typically, in many studies in ecology, epidemiology, biomedicine and others, we are confronted with panels of short time-series for which we wish to obtain a biologically meaningful grouping. Here, we propose a bootstrap approach to test whether the regression functions or the variances of the error terms in a family of stochastic regression models are the same. Our general setting includes panels of time-series models as a special case. We rigorously justify the use of the test by investigating its asymptotic properties, both theoretically and through simulations. The latter confirm that for finite sample sizes, the bootstrap provides a better approximation than classical asymptotic theory. We then apply the proposed tests to the mink-muskrat data across 81 trapping regions in Canada. Ecologically interpretable groupings are obtained, which serve as a necessary first step before a fuller biological and statistical analysis of the food chain interaction. PMID:11133038
Nonlinear modeling of chaotic time series: Theory and applications
Casdagli, M.; Eubank, S.; Farmer, J. D.; Gibson, J. (Santa Fe Inst., NM); Des Jardins, D.; Hunter, N.; Theiler, J.
1990-01-01
We review recent developments in the modeling and prediction of nonlinear time series. In some cases apparent randomness in time series may be due to chaotic behavior of a nonlinear but deterministic system. In such cases it is possible to exploit the determinism to make short term forecasts that are much more accurate than one could make from a linear stochastic model. This is done by first reconstructing a state space, and then using nonlinear function approximation methods to create a dynamical model. Nonlinear models are valuable not only as short term forecasters, but also as diagnostic tools for identifying and quantifying low-dimensional chaotic behavior. During the past few years methods for nonlinear modeling have developed rapidly, and have already led to several applications where nonlinear models motivated by chaotic dynamics provide superior predictions to linear models. These applications include prediction of fluid flows, sunspots, mechanical vibrations, ice ages, measles epidemics and human speech. 162 refs., 13 figs.
Modeling Philippine Stock Exchange Composite Index Using Time Series Analysis
NASA Astrophysics Data System (ADS)
Gayo, W. S.; Urrutia, J. D.; Temple, J. M. F.; Sandoval, J. R. D.; Sanglay, J. E. A.
2015-06-01
This study was conducted to develop a time series model of the Philippine Stock Exchange Composite Index (PSEi) and its volatility using the finite mixture of ARIMA models with conditional variance equations such as the ARCH, GARCH, EGARCH, TARCH and PARCH models. Also, the study aimed to find out the reason behind the behavior of PSEi, that is, which of the economic variables - Consumer Price Index, crude oil price, foreign exchange rate, gold price, interest rate, money supply, price-earnings ratio, Producers’ Price Index and terms of trade - can be used in projecting future values of PSEi; this was examined using the Granger Causality Test. The findings showed that the best time series model for the Philippine Stock Exchange Composite Index is ARIMA(1,1,5) - ARCH(1). Also, Consumer Price Index, crude oil price and foreign exchange rate are factors found to Granger-cause the Philippine Stock Exchange Composite Index.
Time-series analysis in operant research
Jones, Richard R.; Vaught, Russell S.; Weinrott, Mark
1977-01-01
A time-series method is presented, nontechnically, for analysis of data generated in individual-subject operant studies, and is recommended as a supplement to visual analysis of behavior change in reversal or multiple-baseline experiments. The method can be used to identify three kinds of statistically significant behavior change: (a) changes in score levels from one experimental phase to another, (b) reliable upward or downward trends in scores, and (c) changes in trends between phases. The detection of, and reliance on, serial dependency (autocorrelation among temporally adjacent scores) in individual-subject behavioral scores is emphasized. Examples of published data from the operant literature are used to illustrate the time-series method. PMID:16795544
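The three kinds of change described above can be illustrated with an ordinary regression on hypothetical data, using a phase dummy for the level change and its interaction with time for the trend change. Note that this sketch ignores the serial dependency (autocorrelation) the paper emphasizes, which a full time-series analysis must model; all numbers here are invented.

```python
import numpy as np

# Hypothetical single-subject scores: a 20-point baseline phase followed
# by a 20-point intervention phase with a level shift and a new trend
rng = np.random.default_rng(2)
t = np.arange(40.0)
phase = (t >= 20).astype(float)          # 0 = baseline, 1 = intervention
y = 10 + 0.1 * t + 5 * phase + 0.4 * phase * (t - 20) + rng.normal(0, 0.5, 40)

# Design matrix: intercept, baseline trend, change in level, change in trend
X = np.column_stack([np.ones_like(t), t, phase, phase * (t - 20)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta[2], beta[3])   # estimated level shift and trend change
```

With autocorrelated scores the ordinary-least-squares standard errors for these coefficients are misleading, which is precisely the paper's argument for a time-series method over visual or naive statistical analysis.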
A Multiscale Approach to InSAR Time Series Analysis
NASA Astrophysics Data System (ADS)
Hetland, E. A.; Muse, P.; Simons, M.; Lin, N.; Dicaprio, C. J.
2010-12-01
We present a technique to constrain time-dependent deformation from repeated satellite-based InSAR observations of a given region. This approach, which we call MInTS (Multiscale InSAR Time Series analysis), relies on a spatial wavelet decomposition to permit the inclusion of distance-based spatial correlations in the observations while maintaining computational tractability. As opposed to single-pixel InSAR time series techniques, MInTS takes advantage of both spatial and temporal characteristics of the deformation field. We use a weighting scheme which accounts for the presence of localized holes due to decorrelation or unwrapping errors in any given interferogram. We represent time-dependent deformation using a dictionary of general basis functions, capable of detecting both steady and transient processes. The estimation is regularized using a model resolution based smoothing so as to be able to capture rapid deformation where there are temporally dense radar acquisitions and to avoid oscillations during time periods devoid of acquisitions. MInTS also has the flexibility to explicitly parametrize known time-dependent processes that are expected to contribute to a given set of observations (e.g., co-seismic steps and post-seismic transients, secular variations, seasonal oscillations, etc.). We use cross validation to choose the regularization penalty parameter in the inversion for the time-dependent deformation field. We demonstrate MInTS using a set of 63 ERS-1/2 and 29 Envisat interferograms for Long Valley Caldera.
Stratospheric ozone time series analysis using dynamical linear models
NASA Astrophysics Data System (ADS)
Laine, Marko; Kyrölä, Erkki
2013-04-01
We describe a hierarchical statistical state space model for ozone profile time series. The time series are from satellite measurements by the SAGE II and GOMOS instruments spanning the years 1984-2012. The original data sets are combined and gridded monthly using 10 degree latitude bands, covering 20-60 km with 1 km vertical spacing. Model components include level, trend, and seasonal effect, with solar activity and quasi-biennial oscillations as proxy variables. A typical feature of atmospheric time series is that they are not stationary but exhibit both slowly varying and abrupt changes in their distributional properties, caused by external forcing such as changes in solar activity or volcanic eruptions. Further, the data sampling is often nonuniform, there are data gaps, and the uncertainty of the observations can vary. When observations are combined from various sources there will be instrument- and retrieval-method-related biases, and the differences in sampling also lead to uncertainties. Standard classical ARIMA-type statistical time series methods are mostly useless for atmospheric data. A more general approach makes use of dynamical linear models and Kalman-filter-type sequential algorithms. These state space models assume a linear relationship between the unknown state of the system and the observations, and for the process evolution of the hidden states, yet they are flexible enough to model both smooth trends and sudden changes. The above-mentioned methodological challenges are discussed, together with an analysis of change points in trends related to the recovery of stratospheric ozone. This work is part of the ESA SPIN and ozone CCI projects.
One nanosecond time synchronization using SERIES and GPS
NASA Technical Reports Server (NTRS)
Buennagel, A. A.; Spitzmesser, D. J.; Young, L. E.
1983-01-01
Subnanosecond time synchronization between two remote rubidium frequency standards is verified by a traveling clock comparison. Using a novel, code-ignorant Global Positioning System (GPS) receiver developed at JPL, the SERIES geodetic baseline measurement system is applied to establish the offset between the 1 Hz outputs of the remote standards. Results of the two intercomparison experiments to date are presented, as well as experimental details.
An online novel adaptive filter for denoising time series measurements.
Willis, Andrew J
2006-04-01
A nonstationary form of the Wiener filter based on a principal components analysis is described for filtering time series data possibly derived from noisy instrumentation. The theory of the filter is developed, implementation details are presented and two examples are given. The filter operates online, approximating the maximum a posteriori optimal Bayes reconstruction of a signal with arbitrarily distributed and nonstationary statistics. PMID:16649562
Time series regression model for infectious disease and weather.
Imai, Chisato; Armstrong, Ben; Chalabi, Zaid; Mangtani, Punam; Hashizume, Masahiro
2015-10-01
Time series regression has been developed and long used to evaluate the short-term associations of air pollution and weather with mortality or morbidity of non-infectious diseases. The application of the regression approaches from this tradition to infectious diseases, however, is less well explored and raises some new issues. We discuss and present potential solutions for five issues often arising in such analyses: changes in immune population, strong autocorrelations, a wide range of plausible lag structures and association patterns, seasonality adjustments, and large overdispersion. The potential approaches are illustrated with datasets of cholera cases and rainfall from Bangladesh and influenza and temperature in Tokyo. Though this article focuses on the application of the traditional time series regression to infectious diseases and weather factors, we also briefly introduce alternative approaches, including mathematical modeling, wavelet analysis, and autoregressive integrated moving average (ARIMA) models. Modifications proposed to standard time series regression practice include using sums of past cases as proxies for the immune population, and using the logarithm of lagged disease counts to control autocorrelation due to true contagion, both of which are motivated from "susceptible-infectious-recovered" (SIR) models. The complexity of lag structures and association patterns can often be informed by biological mechanisms and explored by using distributed lag non-linear models. For overdispersed models, alternative distribution models such as quasi-Poisson and negative binomial should be considered. Time series regression can be used to investigate dependence of infectious diseases on weather, but may need modifying to allow for features specific to this context.
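A minimal sketch of the SIR-motivated modification described above (the logarithm of the lagged case count as a covariate in a log-link Poisson regression, alongside a seasonal term) can be written with a hand-rolled IRLS fit rather than any particular GLM library; the data and coefficient values below are invented for illustration.

```python
import numpy as np

def poisson_irls(X, y, n_iter=25):
    """Log-link Poisson regression fitted by iteratively reweighted
    least squares; no GLM library is assumed."""
    beta = np.linalg.lstsq(X, np.log(y + 0.5), rcond=None)[0]  # warm start
    for _ in range(n_iter):
        mu = np.exp(X @ beta)
        z = X @ beta + (y - mu) / mu        # working response
        WX = X * mu[:, None]                # Poisson weights W = mu
        beta = np.linalg.solve(X.T @ WX, X.T @ (mu * z))
    return beta

# Hypothetical weekly case counts driven by a seasonal cycle plus
# contagion, with log(previous count + 1) as an autoregressive covariate
rng = np.random.default_rng(3)
n = 520
week = np.arange(n)
y = np.empty(n)
y[0] = 5
for t in range(1, n):
    lam = np.exp(1.0 + 0.6 * np.sin(2 * np.pi * week[t] / 52)
                 + 0.3 * np.log(y[t - 1] + 1))
    y[t] = rng.poisson(lam)
X = np.column_stack([np.ones(n - 1),
                     np.sin(2 * np.pi * week[1:] / 52),
                     np.log(y[:-1] + 1)])
beta = poisson_irls(X, y[1:])
print(beta)   # roughly recovers the true values (1.0, 0.6, 0.3)
```

For the overdispersed data the abstract discusses, the same point estimates hold under quasi-Poisson, but the standard errors must be inflated by the estimated dispersion, or a negative binomial model used instead.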
Scaling features of texts, images and time series
NASA Astrophysics Data System (ADS)
Pavlov, Alexey N.; Ebeling, Werner; Molgedey, Lutz; Ziganshin, Amir R.; Anishchenko, Vadim S.
2001-11-01
In this paper, we consider the scaling features of long letter sequences such as human writings, discretized images and discretized financial data. Using several approaches we show that the symbolic strings and time series being analyzed have a complex multiscale structure and demonstrate different scalings for large and small fluctuations. We discuss complex phenomena in the scaling behavior of partition functions in the case of high-frequency DAX-future data.
Multifractal analysis of time series generated by discrete Ito equations
Telesca, Luciano; Czechowski, Zbigniew; Lovallo, Michele
2015-06-15
In this study, we show that discrete Ito equations with short-tail Gaussian marginal distribution function generate multifractal time series. The multifractality is due to the nonlinear correlations, which are hidden in Markov processes and are generated by the interrelation between the drift and the multiplicative stochastic forces in the Ito equation. A link between the range of the generalized Hurst exponents and the mean of the squares of all averaged net forces is suggested.
Time series analysis for psychological research: examining and forecasting change
Jebb, Andrew T.; Tay, Louis; Wang, Wei; Huang, Qiming
2015-01-01
Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials. PMID:26106341
Genetic programming and serial processing for time series classification.
Alfaro-Cid, Eva; Sharman, Ken; Esparcia-Alcázar, Anna I
2014-01-01
This work describes an approach devised by the authors for time series classification. In our approach genetic programming is used in combination with a serial processing of data, where the last output is the result of the classification. The use of genetic programming for classification, although still a field where more research is needed, is not new. However, the application of genetic programming to classification tasks is normally done by considering the input data as a feature vector. That is, to the best of our knowledge, there are no examples in the genetic programming literature of approaches where the time series data are processed serially and the last output is considered as the classification result. The serial processing approach presented here fills a gap in the existing literature. This approach was tested on three different problems. Two of them are real-world problems whose data were gathered for online or conference competitions. As there are published results for these two problems, this gives us the chance to compare the performance of our approach against top-performing methods. The serial processing of data in combination with genetic programming obtained competitive results in both competitions, showing its potential for solving time series classification problems. The main advantage of our serial processing approach is that it can easily handle very large datasets. PMID:24032750
Characterization of aggressive prostate cancer using ultrasound RF time series
NASA Astrophysics Data System (ADS)
Khojaste, Amir; Imani, Farhad; Moradi, Mehdi; Berman, David; Siemens, D. Robert; Sauerberi, Eric E.; Boag, Alexander H.; Abolmaesumi, Purang; Mousavi, Parvin
2015-03-01
Prostate cancer is the most frequently diagnosed cancer and the second leading cause of cancer-related death in North American men. Several approaches have been proposed to augment detection of prostate cancer using different imaging modalities. Due to the advantages of ultrasound imaging, these approaches have been the subject of several recent studies. This paper presents the results of a feasibility study on differentiating between lower and higher grade prostate cancer using ultrasound RF time series data. We also propose new spectral features of RF time series to highlight aggressive prostate cancer in small ROIs of size 1 mm × 1 mm in a cohort of 19 ex vivo specimens of human prostate tissue. In a leave-one-patient-out cross-validation strategy, an area under the accumulated ROC curve of 0.8 was achieved, with overall sensitivity and specificity of 81% and 80%, respectively. The current method shows promising results for differentiating between lower and higher grades of prostate cancer using ultrasound RF time series.
Data visualization in interactive maps and time series
NASA Astrophysics Data System (ADS)
Maigne, Vanessa; Evano, Pascal; Brockmann, Patrick; Peylin, Philippe; Ciais, Philippe
2014-05-01
State-of-the-art data visualization has nothing to do with the plots and maps we used a few years ago. Many open-source tools are now available to provide access to scientific data and implement accessible, interactive, and flexible web applications. Here we present a web site, opened in November 2013, to create custom global and regional maps and time series from research models and datasets. For maps, we explore and access data sources from a THREDDS Data Server (TDS) with the OGC WMS protocol (using the ncWMS implementation), then create interactive maps with the OpenLayers javascript library and extra information layers from a GeoServer. Maps become dynamic, zoomable, synchronously connected to each other, and exportable to Google Earth. For time series, we extract data from a TDS with the NetCDF Subset Service (NCSS), then display interactive graphs with a custom library based on the Data Driven Documents javascript library (D3.js). This time series application provides dynamic functionalities such as interpolation, interactive zoom on different axes, display of point values, and export to different formats. These tools were implemented for the Global Carbon Atlas (http://www.globalcarbonatlas.org): a web portal to explore, visualize, and interpret global and regional carbon fluxes from various model simulations arising from both human activities and natural processes, a work led by the Global Carbon Project.
Scaling detection in time series: diffusion entropy analysis.
Scafetta, Nicola; Grigolini, Paolo
2002-09-01
The methods currently used to determine the scaling exponent of a complex dynamic process described by a time series are based on the numerical evaluation of variance. This means that all of them can be safely applied only to the case where ordinary statistical properties hold true even if strange kinetics are involved. We illustrate a method of statistical analysis based on the Shannon entropy of the diffusion process generated by the time series, called diffusion entropy analysis (DEA). We adopt artificial Gaussian and Lévy time series, as prototypes of ordinary and anomalous statistics, respectively, and we analyze them with the DEA and four ordinary methods of analysis, some of which are very popular. We show that the DEA determines the correct scaling exponent even when the statistical properties, as well as the dynamic properties, are anomalous. The other four methods produce correct results in the Gaussian case but fail to detect the correct scaling in the case of Lévy statistics. PMID:12366207
Learning restricted Boolean network model by time-series data.
Ouyang, Hongjia; Fang, Jie; Shen, Liangzhong; Dougherty, Edward R; Liu, Wenbin
2014-01-01
Restricted Boolean networks are simplified Boolean networks that are required for either negative or positive regulations between genes. Higa et al. (BMC Proc 5:S5, 2011) proposed a three-rule algorithm to infer a restricted Boolean network from time-series data. However, the algorithm suffers from a major drawback, namely, it is very sensitive to noise. In this paper, we systematically analyze the regulatory relationships between genes based on the state switch of the target gene and propose an algorithm with which restricted Boolean networks may be inferred from time-series data. We compare the proposed algorithm with the three-rule algorithm and the best-fit algorithm based on both synthetic networks and a well-studied budding yeast cell cycle network. The performance of the algorithms is evaluated by three distance metrics: the normalized-edge Hamming distance [Formula: see text], the normalized Hamming distance of state transition [Formula: see text], and the steady-state distribution distance μ_ssd. Results show that the proposed algorithm outperforms the others according to both [Formula: see text] and [Formula: see text], whereas its performance according to μ_ssd is intermediate between best-fit and the three-rule algorithms. Thus, our new algorithm is more appropriate for inferring interactions between genes from time-series data.
Robust, automatic GPS station velocities and velocity time series
NASA Astrophysics Data System (ADS)
Blewitt, G.; Kreemer, C.; Hammond, W. C.
2014-12-01
Automation in GPS coordinate time series analysis makes results more objective and reproducible, but not necessarily as robust as the human eye to detect problems. Moreover, it is not a realistic option to manually scan our current load of >20,000 time series per day. This motivates us to find an automatic way to estimate station velocities that is robust to outliers, discontinuities, seasonality, and noise characteristics (e.g., heteroscedasticity). Here we present a non-parametric method based on the Theil-Sen estimator, defined as the median of velocities v_ij = (x_j - x_i)/(t_j - t_i) computed between all pairs (i, j). Theil-Sen estimators produce statistically identical solutions to ordinary least squares for normally distributed data, but they can tolerate up to 29% of data being problematic. To mitigate seasonality, our proposed estimator only uses pairs approximately separated by an integer number of years, (N - δt) < (t_j - t_i) < (N + δt), where δt is chosen to be small enough to capture seasonality, yet large enough to reduce random error. We fix N = 1 to maximally protect against discontinuities. In addition to estimating an overall velocity, we also use these pairs to estimate velocity time series. To test our methods, we process real data sets whose velocities have already been published in the NA12 reference frame. Accuracy can be tested by the scatter of horizontal velocities in the North American plate interior, which is known to be stable to ~0.3 mm/yr. This presents new opportunities for time series interpretation. For example, the pattern of velocity variations at the interannual scale can help separate tectonic from hydrological processes. Without any step detection, velocity estimates prove to be robust for stations affected by the Mw 7.2 2010 El Mayor-Cucapah earthquake, and velocity time series show a clear change after the earthquake, without any of the usual parametric constraints, such as relaxation of postseismic velocities to their preseismic values.
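The pair-selection idea is easy to sketch: compute slopes only over pairs separated by roughly one year, then take their median, so any annual cycle cancels by construction. This is a loose illustration of the estimator described above, on invented data and with a simplified selection rule, not the authors' full algorithm.

```python
import numpy as np

def integer_year_median_velocity(t, x, dt=0.01):
    """Median of pairwise velocities over pairs separated by roughly one
    year (t in years, assumed sorted); seasonal signals largely cancel
    because both ends of each pair sit at the same phase of the cycle."""
    slopes = []
    for i in range(len(t)):
        for j in range(i + 1, len(t)):
            sep = t[j] - t[i]
            if sep > 1 + dt:
                break                     # t is sorted; no later j qualifies
            if sep > 1 - dt:
                slopes.append((x[j] - x[i]) / sep)
    return np.median(slopes)

# Synthetic daily position series: 2 mm/yr secular rate plus a strong
# annual cycle and white noise
rng = np.random.default_rng(4)
t = np.arange(0.0, 6.0, 1 / 365.25)
x = 2.0 * t + 3.0 * np.sin(2 * np.pi * t) + rng.normal(0, 0.5, t.size)
print(integer_year_median_velocity(t, x))   # close to 2.0 despite seasonality
```

An ordinary least-squares line fit to the same data would be pulled around by the 3 mm annual term; the one-year median slope is insensitive to it, and, being a median, also shrugs off outliers and moderate step discontinuities.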
Estimating the Lyapunov spectrum of time delay feedback systems from scalar time series.
Hegger, R
1999-08-01
On the basis of a recently developed method for modeling time delay systems, we propose a procedure to estimate the spectrum of Lyapunov exponents from a scalar time series. It turns out that the spectrum is approximated very well and allows for good estimates of the Lyapunov dimension even if the sampling rate of the time series is so low that the infinite dimensional tangent space is spanned quite sparsely.
A Time Series Model for Assessing the Trend and Forecasting the Road Traffic Accident Mortality
Yousefzadeh-Chabok, Shahrokh; Ranjbar-Taklimie, Fatemeh; Malekpouri, Reza; Razzaghi, Alireza
2016-01-01
Background: Road traffic accident (RTA) is one of the main causes of trauma and a growing public health concern worldwide, especially in developing countries. Assessing the trend of fatalities in past years and forecasting it enables appropriate planning for prevention and control. Objectives: This study aimed to assess the trend of RTAs and forecast it for the next years using time series modeling. Materials and Methods: In this historical analytical study, RTA mortalities in Zanjan Province, Iran, were evaluated during 2007-2013. Time series analyses, including Box-Jenkins models, were used to assess the trend of accident fatalities in previous years and forecast it for the next 4 years. Results: The mean age of the victims was 37.22 years (SD = 20.01). Of a total of 2571 deaths, 77.5% (n = 1992) were males and 22.5% (n = 579) were females. The study models showed a descending trend of fatalities over the study years. The SARIMA(1,1,3)(0,1,0)12 model was recognized as the best-fit model for forecasting the trend of fatalities. The forecasting model also showed a descending trend of traffic accident mortalities over the next 4 years. Conclusions: There was a decreasing trend over the study years and the forecast years. It seems that implementation of some interventions in the recent decade has had a positive effect on the decline of RTA fatalities. Nevertheless, more attention is still needed to prevent the occurrence of, and mortalities related to, traffic accidents. PMID:27800467
STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS
Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James
2013-02-20
This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it, an improved and generalized version of Bayesian Blocks, that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.
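The optimal segmentation referred to here can be found exactly with a quadratic-time dynamic program over possible start points of the final block. The sketch below handles the binned-counts case, with a fixed per-block penalty standing in for the paper's calibrated prior on the number of blocks; the function name, penalty value, and data are assumptions for illustration.

```python
import numpy as np

def bayesian_blocks_counts(counts, widths, penalty=8.0):
    """O(N^2) dynamic program for the optimal piecewise-constant
    segmentation of binned counts: a block holding N events over total
    length T has log-likelihood fitness N*log(N/T), and each block pays
    a fixed penalty. Returns the sorted indices of the block start cells."""
    n = len(counts)
    best = np.zeros(n + 1)               # best[k]: optimum for first k cells
    last = np.zeros(n + 1, dtype=int)    # start cell of the final block
    for k in range(1, n + 1):
        N = np.cumsum(counts[:k][::-1])[::-1]   # N[r] = counts in cells r..k-1
        T = np.cumsum(widths[:k][::-1])[::-1]
        with np.errstate(divide="ignore", invalid="ignore"):
            fit = np.where(N > 0, N * np.log(N / T), 0.0) - penalty
        total = best[:k] + fit
        r = int(np.argmax(total))
        best[k], last[k] = total[r], r
    edges, k = [], n                     # backtrack the change points
    while k > 0:
        edges.append(last[k])
        k = last[k]
    return sorted(edges)

# Synthetic two-rate Poisson data with a change point at cell 50
rng = np.random.default_rng(5)
counts = np.concatenate([rng.poisson(2.0, 50), rng.poisson(8.0, 50)])
print(bayesian_blocks_counts(counts, np.ones(100)))   # block starts near 0, 50
```

The recursion evaluates every possible start of the last block and reuses previously computed optima, which is what lets the paper's algorithm run in either a retrospective mode over the full data or a real-time trigger mode as cells arrive.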
Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations
NASA Technical Reports Server (NTRS)
Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James
2013-01-01
This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it (an improved and generalized version of Bayesian Blocks [Scargle 1998]) that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by [Arias-Castro, Donoho and Huo 2003]. In the spirit of Reproducible Research [Donoho et al. 2008] all of the code and data necessary to reproduce all of the figures in this paper are included as auxiliary material.
Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations
NASA Astrophysics Data System (ADS)
Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James
2013-02-01
This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it—an improved and generalized version of Bayesian Blocks—that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.
Unraveling the cause-effect relation between time series.
Liang, X San
2014-11-01
Given two time series, can one faithfully tell, in a rigorous and quantitative way, the cause and effect between them? Based on a recently rigorized physical notion, namely, information flow, we solve an inverse problem and give this important and challenging question, which is of interest in a wide variety of disciplines, a positive answer. Here causality is measured by the time rate of information flowing from one series to the other. The resulting formula is tight in form, involving only commonly used statistics, namely, sample covariances; an immediate corollary is that causation implies correlation, but correlation does not imply causation. It has been validated with touchstone linear and nonlinear series, purportedly generated with one-way causality that evades the traditional approaches. It has also been applied successfully to the investigation of real-world problems; an example presented here is the cause-and-effect relation between the two climate modes, El Niño and the Indian Ocean Dipole (IOD), which have been linked to hazards in far-flung regions of the globe. In general, the two modes are mutually causal, but the causality is asymmetric: El Niño tends to stabilize IOD, while IOD functions to make El Niño more uncertain. To El Niño, the information flowing from IOD manifests itself as a propagation of uncertainty from the Indian Ocean. PMID:25493782
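The covariance-based estimator described above can be sketched directly, since the formula involves only sample covariances of the two series and of the finite-difference derivative of the receiving series. The coupled test pair below is synthetic; the formula follows Liang's linear estimator of T(2→1), with the Euler forward difference as an assumption about the time step.

```python
import numpy as np

def liang_information_flow(x1, x2, dt=1.0):
    """Estimate T(2->1), the rate of information flowing from series x2 to x1.

    Implements Liang's covariance formula
        T(2->1) = (C11*C12*C2d1 - C12^2*C1d1) / (C11^2*C22 - C11*C12^2),
    where Cij are sample covariances and Cid1 = Cov(xi, dx1/dt)."""
    x1, x2 = np.asarray(x1, float), np.asarray(x2, float)
    dx1 = (x1[1:] - x1[:-1]) / dt          # forward-difference derivative of x1
    x1, x2 = x1[:-1], x2[:-1]
    C = np.cov(np.vstack([x1, x2]))
    c1d1 = np.cov(x1, dx1)[0, 1]
    c2d1 = np.cov(x2, dx1)[0, 1]
    num = C[0, 0] * C[0, 1] * c2d1 - C[0, 1] ** 2 * c1d1
    den = C[0, 0] ** 2 * C[1, 1] - C[0, 0] * C[0, 1] ** 2
    return num / den

# One-way coupled pair: x2 is an independent AR(1) process, and x1 is driven
# by x2, so information should flow from x2 to x1 but not the other way.
rng = np.random.default_rng(0)
n = 20000
x1, x2 = np.zeros(n), np.zeros(n)
for k in range(n - 1):
    x2[k + 1] = 0.7 * x2[k] + 0.4 * rng.standard_normal()
    x1[k + 1] = 0.4 * x1[k] + 0.8 * x2[k] + 0.3 * rng.standard_normal()

t21 = liang_information_flow(x1, x2)   # flow x2 -> x1: substantial
t12 = liang_information_flow(x2, x1)   # flow x1 -> x2: near zero
```

The asymmetry of the two estimates is the point: a nonzero flow in one direction with a near-zero flow in the other indicates one-way causality, consistent with the corollary that causation implies correlation but not conversely.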
Time-series animation techniques for visualizing urban growth
Acevedo, W.; Masuoka, P.
1997-01-01
Time-series animation is a visually intuitive way to display urban growth. Animations of landuse change for the Baltimore-Washington region were generated by showing a series of images one after the other in sequential order. Before creating an animation, various issues which will affect the appearance of the animation should be considered, including the number of original data frames to use, the optimal animation display speed, the number of intermediate frames to create between the known frames, and the output media on which the animations will be displayed. To create new frames between the known years of data, the change in each theme (i.e. urban development, water bodies, transportation routes) must be characterized and an algorithm developed to create the in-between frames. Example time-series animations were created using a temporal GIS database of the Baltimore-Washington area. Creating the animations involved generating raster images of the urban development, water bodies, and principal transportation routes; overlaying the raster images on a background image; and importing the frames to a movie file. Three-dimensional perspective animations were created by draping each image over digital elevation data prior to importing the frames to a movie file. ?? 1997 Elsevier Science Ltd.
Seasonality of tuberculosis in delhi, India: a time series analysis.
Kumar, Varun; Singh, Abhay; Adhikary, Mrinmoy; Daral, Shailaja; Khokhar, Anita; Singh, Saudan
2014-01-01
Background. It is highly cost effective to detect a seasonal trend in tuberculosis in order to optimize disease control and intervention. Although seasonal variation of tuberculosis has been reported from different parts of the world, no definite and consistent pattern has been observed. Therefore, the study was designed to find the seasonal variation of tuberculosis in Delhi, India. Methods. A retrospective record-based study was undertaken in a Directly Observed Treatment Short course (DOTS) centre located in the south district of Delhi. Six years of data, from January 2007 to December 2012, were analyzed. The Expert Modeler of SPSS ver. 21 software was used to fit the best suitable model for the time series data. Results. The autocorrelation function (ACF) and partial autocorrelation function (PACF) at lag 12 show a significant peak, suggesting a seasonal component of the TB series. The seasonal adjusted factor (SAF) showed peak seasonal variation from March to May. Univariate modeling with the Expert Modeler in SPSS showed that Winters' multiplicative model could best fit the time series data, explaining 69.8% of the variability. The forecast shows a declining trend with seasonality. Conclusion. A seasonal pattern and a declining trend with variable amplitudes of fluctuation were observed in the incidence of tuberculosis.
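Winters' multiplicative model, selected above, can be sketched in a few lines: a level, a trend, and a set of multiplicative seasonal indices are updated recursively. The series below is synthetic with a March–May peak (the Delhi notification data are not reproduced); the smoothing constants are arbitrary illustrative choices, not the values SPSS would fit.

```python
def winters_multiplicative(y, m, alpha=0.3, beta=0.05, gamma=0.2, horizon=12):
    """Winters' multiplicative exponential smoothing (level, trend, seasonal)."""
    level = sum(y[:m]) / m                    # initial level: first-season mean
    trend = 0.0
    seasonal = [y[i] / level for i in range(m)]
    for t in range(m, len(y)):
        prev_level = level
        level = alpha * y[t] / seasonal[t % m] + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        seasonal[t % m] = gamma * y[t] / level + (1 - gamma) * seasonal[t % m]
    start = len(y)
    return [(level + (h + 1) * trend) * seasonal[(start + h) % m]
            for h in range(horizon)]

# Synthetic monthly TB notifications with a spring peak (indices 2-4 = Mar-May).
season = [0.8, 0.9, 1.3, 1.4, 1.3, 1.0, 0.9, 0.8, 0.9, 0.9, 0.9, 0.9]
y = [(100 + 0.5 * t) * season[t % 12] for t in range(72)]   # six years of data
forecast = winters_multiplicative(y, m=12)
# The 12-month forecast preserves the seasonal shape: spring months stay highest.
```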
Deriving crop calendar using NDVI time-series
NASA Astrophysics Data System (ADS)
Patel, J. H.; Oza, M. P.
2014-11-01
Agricultural intensification is defined in terms of cropping intensity, the number of crops (single, double or triple) per year in a unit cropland area. Information about the crop calendar (i.e. the number of crops in a parcel of land, their planting and harvesting dates, and the date of peak vegetative stage) is essential for proper management of agriculture. Remote sensing sensors provide a regular, consistent and reliable measurement of vegetation response at various growth stages of a crop, and are therefore ideally suited for monitoring purposes. The spectral response of vegetation, as measured by the Normalized Difference Vegetation Index (NDVI) and its profiles, can provide a new dimension for describing the vegetation growth cycle. Analysis of NDVI values at regular time intervals provides useful information about various crop growth stages and the performance of a crop in a season. However, the NDVI data series has a considerable amount of local fluctuation in the time domain and needs to be smoothed so that the dominant seasonal behavior is enhanced. Based on temporal analysis of the smoothed NDVI series, it is possible to extract the number of crop cycles per year and their crop calendar. In the present study, a methodology is developed to extract key elements of the crop growth cycle (i.e. the number of crops per year and their planting, peak and harvesting dates). This is illustrated by analysing a MODIS-NDVI data series of one agricultural year (from June 2012 to May 2013) over Gujarat. Such an analysis is very useful for analysing the dynamics of kharif and rabi crops.
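The smooth-then-count-peaks idea can be sketched as follows. This is a simplified stand-in for the paper's method: a plain moving average instead of whatever smoother the authors used, a hypothetical NDVI threshold of 0.4, and a synthetic double-crop profile rather than real MODIS data.

```python
import math

def smooth(values, window=5):
    """Centered moving average; the window shrinks near the series edges."""
    half = window // 2
    out = []
    for i in range(len(values)):
        lo, hi = max(0, i - half), min(len(values), i + half + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

def find_crop_peaks(ndvi, min_ndvi=0.4):
    """Indices of local maxima above min_ndvi: one peak per crop cycle."""
    return [i for i in range(1, len(ndvi) - 1)
            if ndvi[i] > ndvi[i - 1] and ndvi[i] >= ndvi[i + 1]
            and ndvi[i] > min_ndvi]

# Synthetic 16-day-composite NDVI for a double-cropped parcel: a kharif peak
# near step 7 and a rabi peak near step 16, over a 0.2 soil background.
ndvi = [0.2 + 0.5 * math.exp(-((t - 7) ** 2) / 6)
            + 0.45 * math.exp(-((t - 16) ** 2) / 6) for t in range(23)]
peaks = find_crop_peaks(smooth(ndvi))
# Two peaks -> two crop cycles; their positions date the peak vegetative stages.
```

Planting and harvesting dates would then be read off as the green-up and senescence points on either side of each detected peak.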
Time-Series Analysis of Supergranule Characteristics at Solar Minimum
NASA Technical Reports Server (NTRS)
Williams, Peter E.; Pesnell, W. Dean
2013-01-01
Sixty days of Doppler images from the Solar and Heliospheric Observatory (SOHO) / Michelson Doppler Imager (MDI) investigation during the 1996 and 2008 solar minima have been analyzed to show that certain supergranule characteristics (size, size range, and horizontal velocity) exhibit fluctuations of three to five days. Cross-correlating parameters showed a good, positive correlation between supergranulation size and size range, and a moderate, negative correlation between size range and velocity. The size and velocity do exhibit a moderate, negative correlation, but with a small time lag (less than 12 hours). Supergranule sizes during five days of co-temporal data from MDI and the Solar Dynamics Observatory (SDO) / Helioseismic Magnetic Imager (HMI) exhibit similar fluctuations with a high level of correlation between them. This verifies the solar origin of the fluctuations, which cannot be caused by instrumental artifacts according to these observations. Similar fluctuations are also observed in data simulations that model the evolution of the MDI Doppler pattern over a 60-day period. Correlations between the supergranule size and size range time-series derived from the simulated data are similar to those seen in MDI data. A simple toy-model using cumulative, uncorrelated exponential growth and decay patterns at random emergence times produces a time-series similar to the data simulations. The qualitative similarities between the simulated and the observed time-series suggest that the fluctuations arise from stochastic processes occurring within the solar convection zone. This behavior, propagating to surface manifestations of supergranulation, may assist our understanding of magnetic-field-line advection, evolution, and interaction.
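The lagged cross-correlation used above (a negative size-velocity correlation with a lag under 12 hours) can be sketched generically. The series below are synthetic white-noise stand-ins, not MDI data; the 6-sample delay and the inversion are constructed so the recovered lag is known.

```python
import numpy as np

def best_lag(a, b, max_lag):
    """Lag (in samples) maximizing |cross-correlation|, and that correlation.

    Positive lag means b trails a (b responds after a)."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    results = []
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            c = float(np.mean(a[:len(a) - lag] * b[lag:]))
        else:
            c = float(np.mean(a[-lag:] * b[:len(b) + lag]))
        results.append((abs(c), lag, c))
    _, lag, c = max(results)
    return lag, c

# Stand-in for two supergranule parameter series: the "velocity" series is an
# inverted copy of the "size" series delayed by 6 samples.
rng = np.random.default_rng(1)
size = rng.standard_normal(500)
velocity = np.concatenate([np.zeros(6), -size[:-6]])
lag, corr = best_lag(size, velocity, max_lag=12)
# Expect a strongly negative correlation at a small positive lag.
```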
Inverse sequential procedures for the monitoring of time series
NASA Technical Reports Server (NTRS)
Radok, Uwe; Brown, Timothy J.
1995-01-01
When one or more new values are added to a developing time series, they change its descriptive parameters (mean, variance, trend, coherence). A 'change index (CI)' is developed as a quantitative indicator that the changed parameters remain compatible with the existing 'base' data. CI formulae are derived, in terms of normalized likelihood ratios, for small samples from Poisson, Gaussian, and Chi-Square distributions, and for regression coefficients measuring linear or exponential trends. A substantial parameter change creates a rapid or abrupt CI decrease which persists when the length of the base is changed. Except for a special Gaussian case, the CI has no simple explicit regions for tests of hypotheses. However, its design ensures that the series sampled need not conform strictly to the distribution form assumed for the parameter estimates. The use of the CI is illustrated with both constructed and observed data samples, processed with a Fortran code 'Sequitor'.
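The flavor of a normalized-likelihood-ratio change indicator can be sketched for the Gaussian case. This is an illustration in the spirit of the abstract, not a reconstruction of the paper's exact CI formulae: it compares how well new values fit the base-sample Gaussian versus their own fit, normalized per observation.

```python
import math

def gaussian_loglik(xs, mu, var):
    """Log-likelihood of xs under a Normal(mu, var) model."""
    n = len(xs)
    return (-0.5 * n * math.log(2 * math.pi * var)
            - sum((x - mu) ** 2 for x in xs) / (2 * var))

def change_index(base, new):
    """Normalized log-likelihood ratio: near 0 means the new values remain
    compatible with the base data; strongly negative signals a parameter change."""
    mu0 = sum(base) / len(base)
    var0 = sum((x - mu0) ** 2 for x in base) / len(base)
    mu1 = sum(new) / len(new)
    var1 = sum((x - mu1) ** 2 for x in new) / len(new)
    ll_base = gaussian_loglik(new, mu0, var0)   # new data under base parameters
    ll_own = gaussian_loglik(new, mu1, var1)    # new data under their own fit
    return (ll_base - ll_own) / len(new)

base = [10.2, 9.8, 10.1, 9.9, 10.0, 10.3, 9.7, 10.0]
compatible = [10.1, 9.9, 10.2]      # drawn from the same regime
shifted = [12.5, 12.8, 12.4]        # a substantial mean shift
ci_ok = change_index(base, compatible)   # near zero
ci_shift = change_index(base, shifted)   # abruptly, strongly negative
```

As in the abstract's description, a genuine parameter change produces an abrupt drop in the index, while compatible additions leave it near zero.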
Multifractal analysis of validated wind speed time series
NASA Astrophysics Data System (ADS)
García-Marín, A. P.; Estévez, J.; Jiménez-Hornero, F. J.; Ayuso-Muñoz, J. L.
2013-03-01
Multifractal properties of 30 min wind data series recorded at six locations in Cadiz (Southern Spain) have been studied in this work with the aim of obtaining detailed information for a range of time scales. Wind speed records have been validated, applying various quality control tests as a pre-requisite before their use, improving the reliability of the results due to the identification of incorrect values which have been discarded in the analysis. The scaling of the wind speed moments has been analysed and empirical moments scaling exponent functions K(q) have been obtained. Although the same critical moment (qcrit) has been obtained for all the places, some differences appear in other multifractal parameters like γmax and the value of K(0). These differences have been related to the presence of extreme events and zero data values in the data series analysed, respectively.
Removing atmosphere loading effect from GPS time series
NASA Astrophysics Data System (ADS)
Tiampo, K. F.; Samadi Alinia, H.; Samsonov, S. V.; Gonzalez, P. J.
2015-12-01
The GPS time series of site position are contaminated by various sources of noise; in particular, the ionospheric and tropospheric path delays are significant [Gray et al., 2000; Meyer et al., 2006]. The GPS path delay in the ionosphere is largely dependent on the wave frequency, whereas the delay in the troposphere is dependent on the length of the travel path and therefore on site elevation. The various approaches available for compensating the ionospheric path delay cannot be used for removal of the tropospheric component. Quantifying the tropospheric delay plays an important role in determining the precision of the vertical GPS component, as tropospheric parameters over a large distance have very little correlation with each other. Several methods have been proposed for eliminating the tropospheric signal from GPS vertical time series. Here we utilize surface temperature fluctuations and seasonal variations in water vapour and air pressure data for various spatial and temporal profiles in order to more accurately remove the atmospheric path delay [Samsonov et al., 2014]. In this paper, we model the atmospheric path delay of vertical position time series by analyzing the signal in the frequency domain and study its dependency on topography in eastern Ontario for the period from January 2008 to December 2012. Characterizing the systematic dependency of the amplitude of the atmospheric path delay on height, and its temporal variations, through the development of a new physics-based model relating tropospheric effects to topography can help in determining the most accurate GPS positions.
Monitoring Forest Regrowth Using a Multi-Platform Time Series
NASA Technical Reports Server (NTRS)
Sabol, Donald E., Jr.; Smith, Milton O.; Adams, John B.; Gillespie, Alan R.; Tucker, Compton J.
1996-01-01
Over the past 50 years, the forests of western Washington and Oregon have been extensively harvested for timber. This has resulted in a heterogeneous mosaic of remaining mature forests, clear-cuts, new plantations, and second-growth stands that now occur in areas that formerly were dominated by extensive old-growth forests and younger forests resulting from fire disturbance. Traditionally, determinations of seral stage and stand condition have been made using aerial photography and spot field observations, a methodology that is not only time- and resource-intensive, but also falls short of providing current information on a regional scale. These limitations may be solved, in part, through the use of multispectral images which can cover large areas at spatial resolutions on the order of tens of meters. Multiple images comprising a time series can potentially be used to monitor land use (e.g. cutting and replanting) and to observe natural processes such as regeneration, maturation and phenologic change. These processes are more likely to be spectrally observed in a time series composed of images taken during different seasons over a long period of time. Therefore, for many areas, it may be necessary to use a variety of images taken with different imaging systems. A common framework for interpretation is needed that reduces topographic, atmospheric, and instrumental effects, as well as differences in lighting geometry between images. The present state of remote-sensing technology in general use does not realize the full potential of multispectral data in areas of high topographic relief. For example, the primary method for analyzing images of forested landscapes in the Northwest has been with statistical classifiers (e.g. parallelepiped, nearest-neighbor, maximum likelihood, etc.), often applied to uncalibrated multispectral data. Although this approach has produced useful information from individual images in some areas, landcover classes defined by these
Loading effects in GPS vertical displacement time series
NASA Astrophysics Data System (ADS)
Memin, A.; Boy, J. P.; Santamaría-Gómez, A.; Watson, C.; Gravelle, M.; Tregoning, P.
2015-12-01
Surface deformations due to loading, which still lack a comprehensive representation, account for a significant part of the variability in geodetic time series. We assess the effects of loading in GPS vertical displacement time series in several frequency bands. We compare displacements derived from up-to-date loading models to two global sets of positioning time series, and investigate how they are reduced when looking at interannual periods (> 2 months), intermediate periods (> 7 days) and the whole spectrum (> 1 day). We assess the impact of interannual loading on estimating velocities. We compute atmospheric loading effects using surface pressure fields from the ECMWF. We use the inverted barometer (IB) hypothesis, valid for periods exceeding a week, to describe the ocean response to the pressure forcing. We use general circulation ocean models (ECCO and GLORYS) to account for wind, heat and fresh-water fluxes. We separately use the Toulouse Unstructured Grid Ocean model (TUGO-m), forced by air pressure and winds, to represent the dynamics of the ocean response at high frequencies. The continental water storage is described using the GLDAS/Noah and MERRA-Land models. Non-hydrology loading reduces the variability of the observed vertical displacement differently according to the frequency band. The hydrology loading leads to a further reduction, mostly at annual periods. ECMWF+TUGO-m agrees better with vertical surface motion than the ECMWF+IB model at all frequencies. The interannual deformation is time-correlated at most of the locations. It is adequately described by a power-law process of spectral index varying from -1.5 to -0.2. Depending on the power-law parameters, the predicted non-linear deformation due to mass loading variations leads to vertical velocity biases of up to 0.7 mm/yr when estimated from 5 years of continuous observations. The maximum velocity bias can reach up to 1 mm/yr in regions around the southern Tropical band.
Dynamical recurrent neural networks--towards environmental time series prediction.
Aussem, A; Murtagh, F; Sarazin, M
1995-06-01
Dynamical Recurrent Neural Networks (DRNN) (Aussem 1995a) are a class of fully recurrent networks obtained by modeling synapses as autoregressive filters. By virtue of their internal dynamics, these networks approximate the underlying law governing the time series by a system of nonlinear difference equations of internal variables. They therefore provide history-sensitive forecasts without having to be explicitly fed with external memory. The model is trained by a local and recursive error propagation algorithm called temporal-recurrent-backpropagation. The efficiency of the procedure benefits from the exponential decay of the gradient terms backpropagated through the adjoint network. We assess the predictive ability of the DRNN model with meteorological and astronomical time series recorded around the candidate observation sites for the future VLT telescope. The hope is that reliable environmental forecasts provided by the model will allow the modern telescopes to be preset, a few hours in advance, in the most suitable instrumental mode. In this perspective, the model is first appraised on precipitation measurements against traditional nonlinear AR and ARMA techniques using feedforward networks. We then tackle a complex problem, namely the prediction of astronomical seeing, known to be a very erratic time series. A fuzzy coding approach is used to reduce the complexity of the underlying laws governing the seeing. Then, a fuzzy correspondence analysis is carried out to explore the internal relationships in the data. Based on a carefully selected set of meteorological variables at the same time-point, a nonlinear multiple regression, termed nowcasting (Murtagh et al. 1993, 1995), is carried out on the fuzzily coded seeing records. The DRNN is shown to outperform the fuzzy k-nearest neighbors method. PMID:7496587
Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool
NASA Technical Reports Server (NTRS)
McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall
2008-01-01
The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. Using the TSPT program, MODIS metadata is used to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify
Identifying multiple periodicities in sparse photon event time series
NASA Astrophysics Data System (ADS)
Koen, Chris
2016-07-01
The data considered are event times (e.g. photon arrival times, or the occurrence of sharp pulses). The source is multiperiodic, or the data could be multiperiodic because several unresolved sources contribute to the time series. Most events may be unobserved, either because the source is intermittent, or because some events are below the detection limit. The data may also be contaminated by spurious pulses. The problem considered is the determination of the periods in the data. A two-step procedure is proposed: in the first, a likely period is identified; in the second, events associated with this periodicity are removed from the time series. The steps are repeated until the remaining events do not exhibit any periodicity. A number of period-finding methods from the literature are reviewed, and a new maximum likelihood statistic is also introduced. It is shown that the latter is competitive compared to other techniques. The proposed methodology is tested on simulated data. Observations of two rotating radio transients are discussed, but contrary to claims in the literature, no evidence for multiperiodicity could be found.
Long-term time series prediction using OP-ELM.
Grigorievskiy, Alexander; Miche, Yoan; Ventelä, Anne-Mari; Séverin, Eric; Lendasse, Amaury
2014-03-01
In this paper, an Optimally Pruned Extreme Learning Machine (OP-ELM) is applied to the problem of long-term time series prediction. Three known strategies for the long-term time series prediction i.e. Recursive, Direct and DirRec are considered in combination with OP-ELM and compared with a baseline linear least squares model and Least-Squares Support Vector Machines (LS-SVM). Among these three strategies DirRec is the most time consuming and its usage with nonlinear models like LS-SVM, where several hyperparameters need to be adjusted, leads to relatively heavy computations. It is shown that OP-ELM, being also a nonlinear model, allows reasonable computational time for the DirRec strategy. In all our experiments, except one, OP-ELM with DirRec strategy outperforms the linear model with any strategy. In contrast to the proposed algorithm, LS-SVM behaves unstably without variable selection. It is also shown that there is no superior strategy for OP-ELM: any of three can be the best. In addition, the prediction accuracy of an ensemble of OP-ELM is studied and it is shown that averaging predictions of the ensemble can improve the accuracy (Mean Square Error) dramatically. PMID:24365536
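The Recursive and Direct strategies named above are model-agnostic and can be sketched with a trivially simple base learner (a one-coefficient linear autoregression here, standing in for OP-ELM). DirRec, not shown, extends Direct by also feeding earlier predictions back in as extra inputs, which is why it is the most expensive of the three.

```python
import numpy as np

def fit_ar1(y):
    """Least-squares fit of the one-step model y[t] = a * y[t-1]."""
    x, target = y[:-1], y[1:]
    return float(np.dot(x, target) / np.dot(x, x))

def forecast_recursive(y, a, horizon):
    """Recursive strategy: iterate the one-step model, feeding back predictions."""
    out, last = [], y[-1]
    for _ in range(horizon):
        last = a * last
        out.append(last)
    return out

def forecast_direct(y, horizon):
    """Direct strategy: fit a separate model y[t+h] = a_h * y[t] per horizon h."""
    out = []
    for h in range(1, horizon + 1):
        x, target = y[:-h], y[h:]
        a_h = float(np.dot(x, target) / np.dot(x, x))
        out.append(a_h * y[-1])
    return out

# Noiseless AR(1) series: both strategies recover the true continuation exactly.
y = 5.0 * 0.9 ** np.arange(50)
rec = forecast_recursive(y, fit_ar1(y), horizon=6)
dirf = forecast_direct(y, horizon=6)
truth = [5.0 * 0.9 ** (50 + h) for h in range(6)]
```

On noisy data the two strategies differ: Recursive compounds one-step errors over the horizon, while Direct trades that for fitting one model per horizon.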
Improvement in global forecast for chaotic time series
NASA Astrophysics Data System (ADS)
Alves, P. R. L.; Duarte, L. G. S.; da Mota, L. A. C. P.
2016-10-01
In the Polynomial Global Approach to Time Series Analysis, the most costly (computationally speaking) step is the finding of the fitting polynomial. Here we present two routines that improve the forecasting. In the first, an algorithm that greatly improves this situation is introduced and implemented. The heart of this procedure is a specific routine which performs the mapping with great efficiency. In comparison with the similar procedure of the TimeS package developed by Carli et al. (2014), an enormous gain in efficiency and an increase in accuracy are obtained. Another development in this work is the establishment of a level of confidence in the global prediction, with a statistical test for evaluating whether the minimization performed is suitable or not. The second program presented in this article applies the Shapiro-Wilk test to check the normality of the distribution of errors and calculates the expected deviation. The development is applied to observed and simulated time series to illustrate the performance obtained.
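The global polynomial mapping idea (fit one polynomial map for the whole series, then iterate it to forecast) can be sketched with `numpy.polyfit`. This is a minimal stand-in for the paper's optimized routines, demonstrated on logistic-map data where a degree-2 global map is exact; the normality check on the residuals mentioned above is omitted here.

```python
import numpy as np

def fit_global_map(x, degree=2):
    """Fit a polynomial one-step map x[t+1] = P(x[t]) by least squares."""
    return np.polyfit(x[:-1], x[1:], degree)

def forecast(coeffs, x_last, horizon):
    """Iterate the fitted global map to produce a multi-step forecast."""
    out, x = [], x_last
    for _ in range(horizon):
        x = float(np.polyval(coeffs, x))
        out.append(x)
    return out

# Logistic-map data x[t+1] = 3.8 x (1 - x): chaotic, yet exactly quadratic,
# so the fitted coefficients should recover (-3.8, 3.8, 0).
x = [0.3]
for _ in range(200):
    x.append(3.8 * x[-1] * (1.0 - x[-1]))
coeffs = fit_global_map(np.array(x))
pred = forecast(coeffs, x[-1], horizon=5)

actual = []
v = x[-1]
for _ in range(5):
    v = 3.8 * v * (1.0 - v)
    actual.append(v)
```

For chaotic series the usable horizon is short regardless of fit quality, which is why the paper pairs the forecast with a statistical confidence assessment.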
Temporal properties of diagnosis code time series in aggregate.
Perotte, Adler; Hripcsak, George
2013-03-01
Time series are essential to health data research and data mining. We aim to study the properties of one of the more commonly available but historically unreliable types of data: administrative diagnoses in the form of the International Classification of Diseases, Ninth Revision (ICD9) codes. We use differential entropy of ICD9 code time series as a surrogate measure for disease time course and also explore Gaussian kernel smoothing to characterize the time course of diseases in a more fine-grained way. Compared to a gold standard created by a panel of clinicians, the first model classified diseases into acute and chronic groups with a receiver operating characteristic area under curve of 0.83. In the second model, several characteristic temporal profiles were observed including permanent, chronic, and acute. In addition, condition dynamics such as the refractory period for giving birth following childbirth were observed. These models demonstrate that ICD9 codes, despite well-documented concerns, contain valid and potentially valuable temporal information.
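The two ingredients above (Gaussian kernel smoothing of code occurrences, and an entropy measure separating acute from chronic time courses) can be sketched together. The code dates below are invented examples, and Shannon entropy of the discretized profile is used as a simple stand-in for the paper's differential-entropy surrogate.

```python
import math

def kernel_intensity(event_days, grid, bandwidth):
    """Gaussian-kernel-smoothed intensity of code occurrences over a time grid."""
    out = []
    for g in grid:
        s = sum(math.exp(-0.5 * ((g - d) / bandwidth) ** 2) for d in event_days)
        out.append(s / (bandwidth * math.sqrt(2 * math.pi)))
    return out

def profile_entropy(intensity):
    """Entropy of the normalized intensity profile: low when occurrences are
    concentrated (acute), higher when they are spread out (chronic)."""
    total = sum(intensity)
    p = [v / total for v in intensity if v > 0]
    return -sum(v * math.log(v) for v in p)

grid = list(range(0, 365, 5))
acute = [100, 102, 103, 105, 108]          # e.g. codes from one pneumonia episode
chronic = list(range(10, 360, 30))         # e.g. monthly codes for diabetes
h_acute = profile_entropy(kernel_intensity(acute, grid, bandwidth=10))
h_chronic = profile_entropy(kernel_intensity(chronic, grid, bandwidth=10))
# h_acute < h_chronic: the entropy separates the two temporal profiles.
```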
Weighted statistical parameters for irregularly sampled time series
NASA Astrophysics Data System (ADS)
Rimoldini, Lorenzo
2014-01-01
Unevenly spaced time series are common in astronomy because of the day-night cycle, weather conditions, dependence on the source position in the sky, allocated telescope time and corrupt measurements, for example, or inherent to the scanning law of satellites like Hipparcos and the forthcoming Gaia. Irregular sampling often causes clumps of measurements and gaps with no data which can severely disrupt the values of estimators. This paper aims at improving the accuracy of common statistical parameters when linear interpolation (in time or phase) can be considered an acceptable approximation of a deterministic signal. A pragmatic solution is formulated in terms of a simple weighting scheme, adapting to the sampling density and noise level, applicable to large data volumes at minimal computational cost. Tests on time series from the Hipparcos periodic catalogue led to significant improvements in the overall accuracy and precision of the estimators with respect to the unweighted counterparts and those weighted by inverse-squared uncertainties. Automated classification procedures employing statistical parameters weighted by the suggested scheme confirmed the benefits of the improved input attributes. The classification of eclipsing binaries, Mira, RR Lyrae, Delta Cephei and Alpha2 Canum Venaticorum stars employing exclusively weighted descriptive statistics achieved an overall accuracy of 92 per cent, about 6 per cent higher than with unweighted estimators.
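The weighting idea above (down-weight clumped measurements, up-weight isolated ones, consistent with linear interpolation in time) can be sketched with trapezoid-style gap weights. This is a simplified illustration of the general scheme, not the paper's exact noise-adaptive formulation.

```python
import math

def interpolation_weights(times):
    """Trapezoid-style weights: each point covers half the gap to each neighbour.

    Clumped points receive small weights; isolated points receive large ones."""
    n = len(times)
    w = []
    for i in range(n):
        left = times[i] - times[i - 1] if i > 0 else 0.0
        right = times[i + 1] - times[i] if i < n - 1 else 0.0
        w.append(0.5 * (left + right))
    return w

def weighted_mean(values, weights):
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# Irregular sampling of sin(t) over one full cycle, heavily clumped near the
# maximum at t = pi/2, as telescope scheduling or a scanning law might produce.
clump = [math.pi / 2 + 0.01 * k for k in range(-20, 21)]   # 41 points at the peak
sparse = [2 * math.pi * k / 12 for k in range(13)]         # 13 points over the cycle
times = sorted(set(sparse + clump))
values = [math.sin(t) for t in times]

naive = sum(values) / len(values)                          # biased toward the clump
weighted = weighted_mean(values, interpolation_weights(times))
# True time-average of sin over a full cycle is 0: the weighted mean is close,
# while the unweighted mean is pulled strongly toward the clumped maximum.
```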
Multi-scale description and prediction of financial time series
NASA Astrophysics Data System (ADS)
Nawroth, A. P.; Friedrich, R.; Peinke, J.
2010-08-01
A new method is proposed that allows a reconstruction of time series based on higher order multi-scale statistics given by a hierarchical process. This method is able to model financial time series not only on a specific scale but for a range of scales. The method itself is based on the general n-scale joint probability density, which can be extracted directly from given data. It is shown how, based on these n-scale statistics, general n-point probabilities can be estimated, from which predictions can be achieved. Exemplary results are shown for the German DAX index. The ability to model correctly the behaviour of the original process for different scales simultaneously and in time is demonstrated. As a main result it is shown that this method is able to reproduce the known volatility clusters, although the model contains no explicit time dependence. This reveals a new mechanism by which volatility clustering can emerge in a stationary multi-scale process.
Long-term time series prediction using OP-ELM.
Grigorievskiy, Alexander; Miche, Yoan; Ventelä, Anne-Mari; Séverin, Eric; Lendasse, Amaury
2014-03-01
In this paper, an Optimally Pruned Extreme Learning Machine (OP-ELM) is applied to the problem of long-term time series prediction. Three known strategies for long-term time series prediction, i.e., Recursive, Direct and DirRec, are considered in combination with OP-ELM and compared with a baseline linear least squares model and Least-Squares Support Vector Machines (LS-SVM). Among these three strategies, DirRec is the most time-consuming, and its usage with nonlinear models like LS-SVM, where several hyperparameters need to be adjusted, leads to relatively heavy computations. It is shown that OP-ELM, being also a nonlinear model, allows reasonable computational time for the DirRec strategy. In all our experiments except one, OP-ELM with the DirRec strategy outperforms the linear model with any strategy. In contrast to the proposed algorithm, LS-SVM behaves unstably without variable selection. It is also shown that there is no superior strategy for OP-ELM: any of the three can be the best. In addition, the prediction accuracy of an ensemble of OP-ELMs is studied, and it is shown that averaging the predictions of the ensemble can improve the accuracy (Mean Square Error) dramatically. PMID:24365536
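The Recursive strategy named above can be sketched with a plain least-squares AR model standing in for OP-ELM (the model class is swapped for brevity; only the multi-step feedback logic is illustrated, and all names are ours):

```python
import numpy as np

def fit_ar(series, p):
    # Least-squares AR(p): x_t ~ sum_i a_i * x_{t-i}
    X = np.column_stack([series[p - i - 1 : len(series) - i - 1] for i in range(p)])
    y = series[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def recursive_forecast(series, p, horizon):
    # Recursive strategy: fit one one-step-ahead model, then feed its
    # own predictions back in to reach longer horizons.
    coef = fit_ar(np.asarray(series, dtype=float), p)
    hist = list(series)
    out = []
    for _ in range(horizon):
        lags = np.array(hist[-1 : -p - 1 : -1])  # x_{t-1}, ..., x_{t-p}
        nxt = float(coef @ lags)
        out.append(nxt)
        hist.append(nxt)
    return out

series = [2.0 ** (-i) for i in range(10)]  # x_t = 0.5 * x_{t-1} exactly
print(recursive_forecast(series, 1, 3))   # continues the 0.5-per-step decay
```

The Direct strategy would instead fit one model per horizon step, and DirRec combines both ideas, which is why it is the most expensive of the three.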
Periodicity detection method for small-sample time series datasets.
Tominaga, Daisuke
2010-01-01
Time series of gene expression often exhibit periodic behavior under the influence of multiple signal pathways, and are represented by a model that incorporates multiple harmonics and noise. Most of these data, which are observed using DNA microarrays, consist of few sampling points in time, but most periodicity detection methods require a relatively large number of sampling points. We have previously developed a detection algorithm based on the discrete Fourier transform and Akaike's information criterion. Here we demonstrate the performance of the algorithm for small-sample time series data through a comparison with conventional and newly proposed periodicity detection methods based on a statistical analysis of the power of harmonics. We show that this method has higher sensitivity for data consisting of multiple harmonics, and is more robust against noise than other methods. Although "combinatorial explosion" occurs for large datasets, the computational time is not a problem for small-sample datasets. The MATLAB/GNU Octave script of the algorithm is available on the author's web site: http://www.cbrc.jp/%7Etominaga/piccolo/. PMID:21151841
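A bare-bones version of the DFT step might look as follows (this omits the AIC model-selection part of the algorithm entirely and is only a sketch of the spectral idea):

```python
import numpy as np

def dominant_period(x):
    # Crude DFT-based periodicity detector: return the period (in
    # samples) of the strongest non-zero frequency bin in the power
    # spectrum of the mean-removed series.
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    power = np.abs(np.fft.rfft(x)) ** 2
    k = np.argmax(power[1:]) + 1  # skip the DC component
    return len(x) / k

# A 12-point series completing 3 cycles, i.e. period 4 samples.
x = np.sin(2 * np.pi * np.arange(12) / 4)
print(dominant_period(x))  # 4.0
```

With so few sampling points the frequency grid is coarse, which is exactly why a statistical criterion such as AIC is needed to decide how many harmonics are genuinely supported by the data.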
ERIC Educational Resources Information Center
Ngan, Chun-Kit
2013-01-01
Making decisions over multivariate time series is an important topic which has gained significant interest in the past decade. A time series is a sequence of data points which are measured and ordered over uniform time intervals. A multivariate time series is a set of multiple, related time series in a particular domain in which domain experts…
Synthesis of rainfall time series in a high temporal resolution
NASA Astrophysics Data System (ADS)
Callau Poduje, Ana Claudia; Haberlandt, Uwe
2014-05-01
In order to optimize the design and operation of urban drainage systems, long and continuous rain series in a high temporal resolution are essential. As the length of the rainfall records is often short, particularly the data available with the temporal and regional resolutions required for urban hydrology, it is necessary to find some numerical representation of the precipitation phenomenon to generate long synthetic rainfall series. An Alternating Renewal Model (ARM) is applied for this purpose, which consists of two structures: external and internal. The former is the sequence of wet and dry spells, described by their durations which are simulated stochastically. The internal structure is characterized by the amount of rain corresponding to each wet spell and its distribution within the spell. A multivariate frequency analysis is applied to analyze the internal structure of the wet spells and to generate synthetic events. The stochastic time series must reproduce the statistical characteristics of observed high resolution precipitation measurements used to generate them. The spatio-temporal interdependencies between stations are addressed by resampling the continuous synthetic series based on the Simulated Annealing (SA) procedure. The state of Lower-Saxony and surrounding areas, located in the north-west of Germany is used to develop the ARM. A total of 26 rainfall stations with high temporal resolution records, i.e. rainfall data every 5 minutes, are used to define the events, find the most suitable probability distributions, calibrate the corresponding parameters, simulate long synthetic series and evaluate the results. The length of the available data ranges from 10 to 20 years. The rainfall series involved in the different steps of calculation are compared using a rainfall-runoff model to simulate the runoff behavior in urban areas. The EPA Storm Water Management Model (SWMM) is applied for this evaluation. The results show a good representation of the
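A toy version of the alternating renewal structure can be written down directly. Exponential spell durations and rain depths are assumed here purely for illustration; the study fits suitable probability distributions to 5-minute records, and all names below are ours:

```python
import numpy as np

def simulate_spells(total_hours, mean_dry, mean_wet, mean_depth, seed=0):
    # Alternating renewal sketch: draw dry and wet spell durations in
    # turn, and assign each wet spell a total rain depth. Returns a
    # list of (start_time, duration, depth) tuples for the wet spells.
    rng = np.random.default_rng(seed)
    t, wet, events = 0.0, False, []
    while t < total_hours:
        dur = rng.exponential(mean_wet if wet else mean_dry)
        if wet:
            events.append((t, dur, rng.exponential(mean_depth)))
        t += dur
        wet = not wet
    return events

events = simulate_spells(1000.0, mean_dry=20.0, mean_wet=5.0, mean_depth=3.0)
print(len(events), events[0])
```

The full ARM additionally distributes the depth within each wet spell and resamples the synthetic series (via Simulated Annealing) to restore spatial interdependencies between stations, both of which are beyond this sketch.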
Characterizability of metabolic pathway systems from time series data.
Voit, Eberhard O
2013-12-01
Over the past decade, the biomathematical community has devoted substantial effort to the complicated challenge of estimating parameter values for biological systems models. An even more difficult issue is the characterization of functional forms for the processes that govern these systems. Most parameter estimation approaches tacitly assume that these forms are known or can be assumed with some validity. However, this assumption is not always true. The recently proposed method of Dynamic Flux Estimation (DFE) addresses this problem in a genuinely novel fashion for metabolic pathway systems. Specifically, DFE allows the characterization of fluxes within such systems through an analysis of metabolic time series data. Its main drawback is the fact that DFE can only directly be applied if the pathway system contains as many metabolites as unknown fluxes. This situation is unfortunately rare. To overcome this roadblock, earlier work in this field had proposed strategies for augmenting the set of unknown fluxes with independent kinetic information, which however is not always available. Employing Moore-Penrose pseudo-inverse methods of linear algebra, the present article discusses an approach for characterizing fluxes from metabolic time series data that is applicable even if the pathway system is underdetermined and contains more fluxes than metabolites. Intriguingly, this approach is independent of a specific modeling framework and unaffected by noise in the experimental time series data. The results reveal whether any fluxes may be characterized and, if so, which subset is characterizable. They also help with the identification of fluxes that, if they could be determined independently, would allow the application of DFE. PMID:23391489
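The pseudo-inverse step can be sketched as follows. The stoichiometric matrix and slopes are made-up numbers; the article's actual contribution is the analysis of when such a minimum-norm solution characterizes the true fluxes in an underdetermined system:

```python
import numpy as np

# DFE-style flux estimation sketch: given stoichiometry N and slopes
# dX/dt estimated from metabolite time series, solve N v = dX/dt for
# the flux vector v. With more fluxes than metabolites the system is
# underdetermined; the Moore-Penrose pseudo-inverse returns the
# minimum-norm solution consistent with the data.
N = np.array([[1.0, -1.0,  0.0],
              [0.0,  1.0, -1.0]])   # 2 metabolites, 3 fluxes (toy example)
dxdt = np.array([0.5, 0.2])          # slopes at one time point (toy values)
v = np.linalg.pinv(N) @ dxdt
print(np.allclose(N @ v, dxdt))      # True: the solution is consistent
```

Because N has full row rank here, any right-hand side is attainable; which components of v are uniquely determined (characterizable) is precisely the question the paper addresses.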
Mapping Brazilian savanna vegetation gradients with Landsat time series
NASA Astrophysics Data System (ADS)
Schwieder, Marcel; Leitão, Pedro J.; da Cunha Bustamante, Mercedes Maria; Ferreira, Laerte Guimarães; Rabe, Andreas; Hostert, Patrick
2016-10-01
Global change has tremendous impacts on savanna systems around the world. Processes related to climate change or agricultural expansion threaten the ecosystem's state, function and the services it provides. A prominent example is the Brazilian Cerrado that has an extent of around 2 million km2 and features high biodiversity with many endemic species. It is characterized by landscape patterns from open grasslands to dense forests, defining a heterogeneous gradient in vegetation structure throughout the biome. While it is undisputed that the Cerrado provides a multitude of valuable ecosystem services, it is exposed to changes, e.g. through large scale land conversions or climatic changes. Monitoring of the Cerrado is thus urgently needed to assess the state of the system as well as to analyze and further understand ecosystem responses and adaptations to ongoing changes. Therefore, we explored the potential of dense Landsat time series to derive phenological information for mapping vegetation gradients in the Cerrado. Frequent data gaps, e.g. due to cloud contamination, impose a serious challenge for such time series analyses. We synthetically filled data gaps based on Radial Basis Function convolution filters to derive continuous pixel-wise temporal profiles capable of representing Land Surface Phenology (LSP). Derived phenological parameters revealed differences in the seasonal cycle between the main Cerrado physiognomies and could thus be used to calibrate a Support Vector Classification model to map their spatial distribution. Our results show that it is possible to map the main spatial patterns of the observed physiognomies based on their phenological differences, although inaccuracies occurred, especially between similar classes and in data-scarce areas. The outcome emphasizes the need for remote sensing based time series analyses at fine scales. Mapping heterogeneous ecosystems such as savannas requires spatial detail, as well as the ability to derive important
Aerosol Climate Time Series Evaluation In ESA Aerosol_cci
NASA Astrophysics Data System (ADS)
Popp, T.; de Leeuw, G.; Pinnock, S.
2015-12-01
Within the ESA Climate Change Initiative (CCI), Aerosol_cci (2010-2017) conducts intensive work to improve algorithms for the retrieval of aerosol information from European sensors. By the end of 2015, full mission time series of 2 GCOS-required aerosol parameters are completely validated and released: Aerosol Optical Depth (AOD) from dual view ATSR-2 / AATSR radiometers (3 algorithms, 1995 - 2012), and stratospheric extinction profiles from the star occultation GOMOS spectrometer (2002 - 2012). Additionally, a 35-year multi-sensor time series of the qualitative Absorbing Aerosol Index (AAI) together with sensitivity information and an AAI model simulator is available. Complementary aerosol properties requested by GCOS are in a "round robin" phase, where various algorithms are inter-compared: fine mode AOD, mineral dust AOD (from the thermal IASI spectrometer), absorption information and aerosol layer height. As a quasi-reference for validation in a few selected regions with sparse ground-based observations, the multi-pixel GRASP algorithm for the POLDER instrument is used. Validation of first dataset versions (vs. AERONET, MAN) and inter-comparison to other satellite datasets (MODIS, MISR, SeaWIFS) proved the high quality of the available datasets comparable to other satellite retrievals and revealed needs for algorithm improvement (for example for higher AOD values) which were taken into account for a reprocessing. The datasets contain pixel level uncertainty estimates which are also validated. The paper will summarize and discuss the results of major reprocessing and validation conducted in 2015. The focus will be on the ATSR, GOMOS and IASI datasets. Pixel level uncertainty validation will be summarized and discussed, including unknown components and their potential usefulness and limitations. Opportunities for time series extension with successor instruments of the Sentinel family will be described and the complementarity of the different satellite aerosol products
Financial Time Series Prediction Using Elman Recurrent Random Neural Networks.
Wang, Jie; Wang, Jun; Fang, Wen; Niu, Hongli
2016-01-01
In recent years, financial market dynamics forecasting has been a focus of economic research. To predict the price indices of stock markets, we developed an architecture which combined Elman recurrent neural networks with stochastic time effective function. By analyzing the proposed model with the linear regression, complexity invariant distance (CID), and multiscale CID (MCID) analysis methods and taking the model compared with different models such as the backpropagation neural network (BPNN), the stochastic time effective neural network (STNN), and the Elman recurrent neural network (ERNN), the empirical results show that the proposed neural network displays the best performance among these neural networks in financial time series forecasting. Further, the empirical research is performed in testing the predictive effects of SSE, TWSE, KOSPI, and Nikkei225 with the established model, and the corresponding statistical comparisons of the above market indices are also exhibited. The experimental results show that this approach gives good performance in predicting the values from the stock market indices.
Assemblage time series reveal biodiversity change but not systematic loss.
Dornelas, Maria; Gotelli, Nicholas J; McGill, Brian; Shimadzu, Hideyasu; Moyes, Faye; Sievers, Caya; Magurran, Anne E
2014-04-18
The extent to which biodiversity change in local assemblages contributes to global biodiversity loss is poorly understood. We analyzed 100 time series from biomes across Earth to ask how diversity within assemblages is changing through time. We quantified patterns of temporal α diversity, measured as change in local diversity, and temporal β diversity, measured as change in community composition. Contrary to our expectations, we did not detect systematic loss of α diversity. However, community composition changed systematically through time, in excess of predictions from null models. Heterogeneous rates of environmental change, species range shifts associated with climate change, and biotic homogenization may explain the different patterns of temporal α and β diversity. Monitoring and understanding change in species composition should be a conservation priority. PMID:24744374
Estimation of coupling between time-delay systems from time series
NASA Astrophysics Data System (ADS)
Prokhorov, M. D.; Ponomarenko, V. I.
2005-07-01
We propose a method for estimation of coupling between the systems governed by scalar time-delay differential equations of the Mackey-Glass type from the observed time series data. The method allows one to detect the presence of certain types of linear coupling between two time-delay systems, to define the type, strength, and direction of coupling, and to recover the model equations of coupled time-delay systems from chaotic time series corrupted by noise. We verify our method using both numerical and experimental data.
Quantifying evolutionary dynamics from variant-frequency time series.
Khatri, Bhavin S
2016-01-01
From Kimura's neutral theory of protein evolution to Hubbell's neutral theory of biodiversity, quantifying the relative importance of neutrality versus selection has long been a basic question in evolutionary biology and ecology. With deep sequencing technologies, this question is taking on a new form: given a time-series of the frequency of different variants in a population, what is the likelihood that the observation has arisen due to selection or neutrality? To tackle the 2-variant case, we exploit Fisher's angular transformation, which, despite being discovered by Ronald Fisher a century ago, has remained an intellectual curiosity. We show that, together with a heuristic approach, it provides a simple solution for the transition probability density at short times, including drift, selection and mutation. Our results show that under strong selection and sufficiently frequent sampling these evolutionary parameters can be accurately determined from simulation data, and so they provide a theoretical basis for techniques to detect selection from variant or polymorphism frequency time-series. PMID:27616332
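The transformation itself is simple, and a quick simulation (sample sizes and seed are arbitrary choices of ours) shows why it is useful: after the arcsine square-root map, the sampling variance of a binomial frequency is approximately 1/(4n) regardless of p, i.e. the noise level no longer depends on the frequency itself:

```python
import numpy as np

def angular(p):
    # Fisher's angular transformation: phi = arcsin(sqrt(p)).
    return np.arcsin(np.sqrt(p))

rng = np.random.default_rng(1)
n = 200
scaled = {}
for p in (0.1, 0.5):
    phat = rng.binomial(n, p, 50000) / n
    scaled[p] = np.var(angular(phat)) * 4 * n  # ~ 1 if variance is stabilized
print(scaled)
```

This variance stabilization is what makes the short-time transition density tractable in the transformed coordinate.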
On the maximum-entropy/autoregressive modeling of time series
NASA Technical Reports Server (NTRS)
Chao, B. F.
1984-01-01
The autoregressive (AR) model of a random process is interpreted in the light of Prony's relation, which relates a complex conjugate pair of poles of the AR process in the z-plane (or the z domain) on the one hand, to the complex frequency of one complex harmonic function in the time domain on the other. Thus the AR model of a time series is one that models the time series as a linear combination of complex harmonic functions, which include pure sinusoids and real exponentials as special cases. An AR model is completely determined by its z-domain pole configuration. The maximum-entropy/autoregressive (ME/AR) spectrum, defined on the unit circle of the z-plane (or the frequency domain), is merely a convenient, though ambiguous, visual representation. It is asserted that the position and shape of a spectral peak is determined by the corresponding complex frequency, and the height of the spectral peak contains little information about the complex amplitude of the complex harmonic functions.
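The pole-to-frequency correspondence can be checked numerically in a few lines (coefficients below are constructed by us so that the pole pair is known in advance):

```python
import numpy as np

# Prony view of an AR(2) model x_t = a1*x_{t-1} + a2*x_{t-2}: the
# poles are the roots of z^2 - a1*z - a2 = 0, and a conjugate pair
# r*exp(+-i*omega) corresponds to a damped sinusoid with angular
# frequency omega and per-step decay factor r.
a1 = 2 * 0.9 * np.cos(0.5)   # chosen so the poles are 0.9*exp(+-0.5i)
a2 = -0.9 ** 2
poles = np.roots([1.0, -a1, -a2])
omega = abs(np.angle(poles[0]))
r = abs(poles[0])
print(round(omega, 3), round(r, 3))  # 0.5 0.9
```

The recovered (omega, r) pair is the "complex frequency" the abstract refers to; the ME/AR spectral peak near omega is just its image on the unit circle.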
Income inequality and mortality: time series evidence from Canada.
Laporte, Audrey; Ferguson, Brian S
2003-10-01
In this paper, we apply the standard model used in the income strand of the socio-economic status (SES)-population health literature to explain the relationship between mortality and income to pooled cross-section time-series data for Canada. The use of time-series data increases the available degrees of freedom and allows for the possibility that the effects of inequality take time to translate into poorer health outcomes. In light of recent criticisms of aggregate level studies, we do not attempt to differentiate between the absolute and relative inequality hypotheses, but test for the existence of a relationship between mortality and a measure of income inequality. We find that whether an exogenous trend is incorporated or an auto-regressive distributed lag form is used, the coefficients on mean income and the Gini are not significantly different from zero, which contradicts the findings in other parts of the literature, but which is consistent with earlier cross-section evidence for Canada. The results suggest that models that focus exclusively on income as a measure of the impact of SES on mortality are not complete and that health spending and unemployment may be even more important than income growth and dispersion.
Monthly hail time series analysis related to agricultural insurance
NASA Astrophysics Data System (ADS)
Tarquis, Ana M.; Saa, Antonio; Gascó, Gabriel; Díaz, M. C.; Garcia Moreno, M. R.; Burgaz, F.
2010-05-01
Hail is one of the most important crop insurance risks in Spain, accounting for more than 50% of total insurance in cereal crops. The purpose of the present study is to analyze hail damage in cereals. Four provinces with the highest production values were chosen: Burgos and Zaragoza for wheat, and Cuenca and Valladolid for barley. The data available for studying the evolution and intensity of hail damage include an analysis of the correlation between the agricultural insurance ratios provided by ENESA and the number of annual hail days (from 1981 to 2007). In addition, several weather stations per province were selected for having the longest and most complete records (from 1963 to 2007) in order to perform a monthly time series analysis of the number of hail days (HD). The results of the study show that the relation between the agricultural insurance ratio and the number of hail days is not clear. Several observations are discussed to explain these results, as well as whether a change in tendency can be determined in the HD time series.
Analysis of complex causal networks through time series
NASA Astrophysics Data System (ADS)
Hut, R.; van de Giesen, N.
2008-12-01
We introduce a new way of looking at (the relations between) groups of signals. In complex networks, such as in landscapes and ecosystems, multiple factors influence each other either through direct causal relations or indirectly through intermediate variables. To puzzle apart the causal relations in a complex network on the basis of measured time series is not trivial. The method developed here allows us to do exactly that. Using relations that can be derived by (classical) multiple input multiple output system identification, we construct underlying networks of linear time-invariant systems that describe the direct relations between the different signals. The structure of this underlying network can provide valuable information about which signals are dominant, which relations between signals are dominant, and which signals affect each other through another signal instead of directly. Feedback is easily identified using this approach. We show that the eigenvalues of the underlying network determine the stability of the network as a whole. Applications are foreseen in, for instance, the field of data-driven climate modeling as well as other research involving time series analysis in complex networks.
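The eigenvalue stability criterion mentioned above can be sketched for a discrete-time linear network (the coupling matrix below is an arbitrary illustrative example, not from the paper):

```python
import numpy as np

# Stability sketch for a linear interaction network x_{t+1} = A x_t:
# the network is stable iff every eigenvalue of A lies strictly
# inside the unit circle, i.e. the spectral radius is below 1.
A = np.array([[0.5,  0.2],
              [-0.3, 0.4]])          # direct couplings between two signals
rho = max(abs(np.linalg.eigvals(A))) # spectral radius
print(rho < 1)                       # True: perturbations decay
```

Here the off-diagonal entries encode the direct (possibly feedback) couplings between signals, which is exactly the information the identified network makes explicit.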
Time series as a diagnostic tool for EKG
NASA Astrophysics Data System (ADS)
Erkal, Cahit; Cecen, Aydin
2007-11-01
A preliminary analysis of heart rate variability (peak-to-peak intervals based on EKG) will be presented using the tools of nonlinear dynamics and chaos. We show that uncertainty determination of the most commonly used invariant, the correlation dimension, and a proper implementation of time series analysis tools are necessary to differentiate between the healthy and unhealthy state of the heart. We present an example analysis based on normal and atrial fibrillation EKGs and point out some pitfalls that may give rise to misleading conclusions.
Time series analysis using semiparametric regression on oil palm production
NASA Astrophysics Data System (ADS)
Yundari, Pasaribu, U. S.; Mukhaiyar, U.
2016-04-01
This paper presents a semiparametric kernel regression method, which has shown flexibility and ease of mathematical calculation, especially in estimating density and regression functions. The kernel function is continuous and produces a smooth estimate. The classical kernel density estimator is constructed by completely nonparametric analysis and works reasonably well for all forms of function. Here, we discuss parameter estimation in time series analysis. First, we assume the parameters exist; then we apply nonparametric estimation, which makes the overall approach semiparametric. The optimum bandwidth is selected by minimizing the approximate Mean Integrated Squared Error (MISE).
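A minimal Nadaraya-Watson kernel regression illustrates the nonparametric building block (this is one standard form of Gaussian-kernel smoothing; the paper's semiparametric setup additionally carries a parametric component, omitted here, and the bandwidth below is hand-picked rather than MISE-optimal):

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, h):
    # Gaussian-kernel estimate:
    #   m(x) = sum_i K((x - x_i)/h) * y_i / sum_i K((x - x_i)/h)
    d = (np.asarray(x_eval)[:, None] - np.asarray(x_train)[None, :]) / h
    k = np.exp(-0.5 * d ** 2)
    return (k @ np.asarray(y_train)) / k.sum(axis=1)

x = np.linspace(0.0, 1.0, 101)
y = x ** 2                                   # smooth, noise-free target
est = nadaraya_watson(x, y, np.array([0.5]), h=0.05)
print(est[0])                                # close to the true value 0.25
```

The bandwidth h controls the bias-variance trade-off, which is why its selection (via MISE in the paper) is the central tuning problem of kernel regression.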
Ensemble Deep Learning for Biomedical Time Series Classification
2016-01-01
Ensemble learning has been proved to improve the generalization ability effectively in both theory and practice. In this paper, we briefly outline the current status of research on it first. Then, a new deep neural network-based ensemble method that integrates filtering views, local views, distorted views, explicit training, implicit training, subview prediction, and Simple Average is proposed for biomedical time series classification. Finally, we validate its effectiveness on the Chinese Cardiovascular Disease Database containing a large number of electrocardiogram recordings. The experimental results show that the proposed method has certain advantages compared to some well-known ensemble methods, such as Bagging and AdaBoost. PMID:27725828
Best linear forecast of volatility in financial time series
NASA Astrophysics Data System (ADS)
Krivoruchenko, M. I.
2004-09-01
The autocorrelation function of volatility in financial time series is fitted well by a superposition of several exponents. This case admits an explicit analytical solution of the problem of constructing the best linear forecast of a stationary stochastic process. We describe and apply the proposed analytical method for forecasting volatility. The leverage effect and volatility clustering are taken into account. Parameters of the predictor function are determined numerically for the Dow Jones 30 Industrial Average. Connection of the proposed method to the popular autoregressive conditional heteroskedasticity models is discussed.
Chaotic time series analysis in economics: Balance and perspectives
Faggini, Marisa
2014-12-15
The aim of the paper is not to review the large body of work concerning nonlinear time series analysis in economics, about which much has been written, but rather to focus on the new techniques developed to detect chaotic behaviours in economic data. More specifically, our attention will be devoted to reviewing some of these techniques and their application to economic and financial data in order to understand why chaos theory, after a period of growing interest, appears now not to be such an interesting and promising research area.
Multifractal Time Series Analysis Based on Detrended Fluctuation Analysis
NASA Astrophysics Data System (ADS)
Kantelhardt, Jan; Stanley, H. Eugene; Zschiegner, Stephan; Bunde, Armin; Koscielny-Bunde, Eva; Havlin, Shlomo
2002-03-01
In order to develop an easily applicable method for the multifractal characterization of non-stationary time series, we generalize the detrended fluctuation analysis (DFA), which is a well-established method for the determination of the monofractal scaling properties and the detection of long-range correlations. We relate the new multifractal DFA method to the standard partition function-based multifractal formalism, and compare it to the wavelet transform modulus maxima (WTMM) method, which is a well-established, but more difficult, procedure for this purpose. We employ the multifractal DFA method to determine whether the heart rhythm during different sleep stages is characterized by different multifractal properties.
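The monofractal DFA that the paper generalizes can be sketched compactly (this is standard first-order DFA; the multifractal extension replaces the mean-square average over windows with q-th order moments):

```python
import numpy as np

def dfa_fluctuation(x, scales):
    # Detrended fluctuation analysis: integrate the mean-removed
    # series into a profile, split the profile into windows of size s,
    # detrend each window linearly, and return the RMS fluctuation
    # F(s) for every scale s.
    y = np.cumsum(x - np.mean(x))
    F = []
    for s in scales:
        n = len(y) // s
        segs = y[: n * s].reshape(n, s)
        t = np.arange(s)
        res = []
        for seg in segs:
            c = np.polyfit(t, seg, 1)
            res.append(np.mean((seg - np.polyval(c, t)) ** 2))
        F.append(np.sqrt(np.mean(res)))
    return np.array(F)

# For uncorrelated noise, F(s) ~ s^alpha with alpha close to 0.5.
rng = np.random.default_rng(0)
x = rng.standard_normal(4096)
scales = np.array([16, 32, 64, 128, 256])
F = dfa_fluctuation(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(round(alpha, 2))
```

Long-range correlated series yield alpha above 0.5; in the multifractal generalization, a whole spectrum of such exponents h(q) is obtained instead of a single alpha.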
Albedo Pattern Recognition and Time-Series Analyses in Malaysia
NASA Astrophysics Data System (ADS)
Salleh, S. A.; Abd Latif, Z.; Mohd, W. M. N. Wan; Chan, A.
2012-07-01
Pattern recognition and time-series analyses enable one to evaluate and generate predictions of specific phenomena. Albedo pattern and time-series analyses are particularly useful for climate-condition monitoring. This study was conducted to investigate changes in Malaysia's albedo pattern. The recognized patterns and changes will be useful for a variety of environmental and climate monitoring studies, such as carbon budgeting and aerosol mapping. Ten years (2000-2009) of MODIS satellite images were used for the analyses and interpretation. The images were processed using ERDAS Imagine remote sensing software, ArcGIS 9.3, the 6S code for atmospheric calibration, and several MODIS tools (MRT, HDF2GIS, albedo tools). Several methods of time-series analysis were explored; this paper demonstrates trend and seasonal time-series analyses using the HDF-format MODIS MCD43A3 albedo land product. The results revealed significant changes in albedo percentages over the past 10 years, and patterns related to Malaysia's nebulosity index (NI) and aerosol optical depth (AOD). A noticeable trend can be identified with regard to the maximum and minimum values of the albedo. The rises and falls of the line graph show a trend similar to that of the daily observations, differing in the magnitude of the rises and falls of albedo. Thus, it can be concluded that the temporal behavior of land surface albedo in Malaysia is uniform with respect to the local monsoons. However, although the average albedo shows a linear trend with the nebulosity index, the pattern changes of albedo with respect to the nebulosity index indicate that external factors influence the albedo values, as the plotted sky conditions and diffusion do not show a uniform trend over the years, especially when the trend over 5-year intervals is examined; 2000 shows a high negative linear
Bilinear System Characteristics from Nonlinear Time Series Analysis
Hunter, N.F. Jr.
1999-02-08
Detection of changes in the resonant frequencies and mode shapes of a system is a fundamental problem in dynamics. This paper describes a time series method of detecting and quantifying changes in these parameters for a ten degree-of-freedom bilinear system excited by narrow band random noise. The method partitions the state space and computes mode frequencies and mode shapes for each region. Different regions of the space may exhibit different mode shapes, allowing diagnosis of stiffness changes at structural discontinuities. The method is useful for detecting changes in the properties of joints in mechanical systems or for detection of damage as the properties of a structure change during use.
Kernel canonical-correlation Granger causality for multiple time series
NASA Astrophysics Data System (ADS)
Wu, Guorong; Duan, Xujun; Liao, Wei; Gao, Qing; Chen, Huafu
2011-04-01
Canonical-correlation analysis as a multivariate statistical technique has been applied to multivariate Granger causality analysis to infer information flow in complex systems. It shows unique appeal and great superiority over the traditional vector autoregressive method, due to the simplified procedure that detects causal interaction between multiple time series, and the avoidance of potential model estimation problems. However, it is limited to the linear case. Here, we extend the framework of canonical correlation to include the estimation of multivariate nonlinear Granger causality for drawing inference about directed interaction. Its feasibility and effectiveness are verified on simulated data.
Time series ARIMA models for daily price of palm oil
NASA Astrophysics Data System (ADS)
Ariff, Noratiqah Mohd; Zamhawari, Nor Hashimah; Bakar, Mohd Aftar Abu
2015-02-01
Palm oil is deemed one of the most important commodities forming the economic backbone of Malaysia. Modeling and forecasting the daily price of palm oil is of great interest for Malaysia's economic growth. In this study, time series ARIMA models are used to fit the daily price of palm oil. The Akaike Information Criterion (AIC), the Akaike Information Criterion corrected for finite sample sizes (AICc) and the Bayesian Information Criterion (BIC) are used to compare the different ARIMA models being considered. It is found that the ARIMA(1,2,1) model is suitable for the daily price of crude palm oil in Malaysia for the years 2010 to 2012.
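As an illustration of the information-criterion-based order selection used in studies like this one, the sketch below (hypothetical data and helper names; a pure AR fit rather than full ARIMA, for brevity) fits AR(p) models by conditional least squares and compares their AIC values:

```python
import math
import random

def fit_ar(x, p):
    """Conditional least-squares fit of x[t] ~ c + a1*x[t-1] + ... + ap*x[t-p].
    Returns (coefficients, residual variance)."""
    rows = [[1.0] + [x[t - j] for j in range(1, p + 1)] for t in range(p, len(x))]
    ys = x[p:]
    k = p + 1
    # Normal equations A^T A beta = A^T y, solved by Gaussian elimination
    ata = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    aty = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(k)]
    for col in range(k):  # forward elimination with partial pivoting
        piv = max(range(col, k), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        aty[col], aty[piv] = aty[piv], aty[col]
        for r in range(col + 1, k):
            f = ata[r][col] / ata[col][col]
            for c in range(col, k):
                ata[r][c] -= f * ata[col][c]
            aty[r] -= f * aty[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (aty[r] - sum(ata[r][c] * beta[c] for c in range(r + 1, k))) / ata[r][r]
    resid = [y - sum(b * v for b, v in zip(beta, r)) for r, y in zip(rows, ys)]
    return beta, sum(e * e for e in resid) / len(resid)

def aic(x, p):
    """Gaussian AIC: n*log(sigma^2) + 2*(number of parameters).
    Note: the effective sample differs slightly across p in this sketch."""
    n = len(x) - p
    _, var = fit_ar(x, p)
    return n * math.log(var) + 2 * (p + 2)  # p AR coefficients + intercept + variance

random.seed(7)
# Simulate an AR(2) process; AIC should penalize the misspecified AR(1)
x = [0.0, 0.0]
for _ in range(2000):
    x.append(0.6 * x[-1] - 0.3 * x[-2] + random.gauss(0, 1))
scores = {p: aic(x, p) for p in (1, 2, 4)}
print(min(scores, key=scores.get))
```

The same pattern extends to ARIMA(p,d,q): difference the series d times, fit the ARMA part, and pick the order with the lowest AIC/AICc/BIC.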
Visualizing trends and clusters in ranked time-series data
NASA Astrophysics Data System (ADS)
Gousie, Michael B.; Grady, John; Branagan, Melissa
2013-12-01
There are many systems that provide visualizations for time-oriented data. Of those, few provide the means of finding patterns in time-series data in which rankings are also important. Fewer still have the fine granularity necessary to visually follow individual data points through time. We propose the Ranking Timeline, a novel visualization method for modestly-sized multivariate data sets that include the top ten rankings over time. The system includes two main visualization components: a ranking over time and a cluster analysis. The ranking visualization, loosely based on line plots, allows the user to track individual data points so as to facilitate comparisons within a given time frame. Glyphs represent additional attributes within the framework of the overall system. The user has control over many aspects of the visualization, including viewing a subset of the data and/or focusing on a desired time frame. The cluster analysis tool shows the relative importance of individual items in conjunction with a visualization showing the connection(s) to other, similar items, while maintaining the aforementioned glyphs and user interaction. The user controls the clustering according to a similarity threshold. The system has been implemented as a Web application, and has been tested with data showing the top ten actors/actresses from 1929-2010. The experiments have revealed patterns in the data heretofore not explored.
Feature extraction for change analysis in SAR time series
NASA Astrophysics Data System (ADS)
Boldt, Markus; Thiele, Antje; Schulz, Karsten; Hinz, Stefan
2015-10-01
In remote sensing, change detection represents a broad field of research. If time series data are available, change detection can be used for monitoring applications. These applications require regular image acquisitions at an identical time of day over a defined period. Among remote sensing sensors, radar is especially well suited to applications requiring regularity, since it is independent of most weather and atmospheric influences; moreover, the time of day of an acquisition plays no role, because radar does not depend on daylight. Since 2007, the German SAR (Synthetic Aperture Radar) satellite TerraSAR-X (TSX) has permitted the acquisition of high-resolution radar images suitable for the analysis of dense built-up areas. In a former study, we presented a change analysis of the Stuttgart (Germany) airport. The aim of this study is the categorization of the changes detected in the time series. This categorization is motivated by the fact that it is of limited value to state only where and when a specific area has changed; at least as important is a statement about what caused the change. The focus is set on the analysis of so-called high activity areas (HAA), i.e., areas changing at least four times over the investigated period. As a first step in categorizing these HAAs, the matching HAA changes (blobs) have to be identified. Afterwards, operating at this object-based blob level, several features are extracted, comprising shape-based, radiometric, statistical and morphological values, and one context feature based on a segmentation of the HAAs. This segmentation builds on morphological differential attribute profiles (DAPs). Seven context classes are established: urban, infrastructure, rural stable, rural unstable, natural, water and unclassified. A specific HA blob is assigned to one of these classes by analyzing the CovAmCoh time series signature of the surrounding segments. In combination, also surrounding GIS information
Girls' Series Books: A View of Times Past.
ERIC Educational Resources Information Center
Schumacher, Mark
The Girls' Books in Series collection at the University of North Carolina at Greensboro's Jackson Library contains over 1850 volumes, with publication dates ranging from the mid-1800s to the 1980s. The library's list currently contains approximately 511 different series. The library owns all the titles for 85 of the series. For 167 of the series,…
Detection of intermittent events in atmospheric time series
NASA Astrophysics Data System (ADS)
Paradisi, P.; Cesari, R.; Palatella, L.; Contini, D.; Donateo, A.
2009-04-01
associated with the occurrence of critical events in the atmospheric dynamics. The critical events are associated with transitions between meta-stable configurations. Consequently, this approach could prove useful in the study of extreme events in meteorology and climatology and in weather classification schemes. The renewal approach could also prove useful in the modelling of non-Gaussian closures for turbulent fluxes [3]. In the proposed approach the main features that need to be estimated are: (a) the distribution of life-times of a given atmospheric meta-stable structure (waiting times between two critical events); (b) the statistical distribution of fluctuations; (c) the presence of memory in the time series. These features are related to the evaluation of memory content and scaling from the time series. In order to analyze these features, some novel statistical techniques have been developed in recent years. In particular, the analysis of Diffusion Entropy [4] was shown to be a robust method for the determination of the dynamical scaling. This property is related to the power-law behaviour of the life-time statistics and to the memory properties of the time series. The analysis of Renewal Aging [5], based on renewal theory [2], allows one to estimate the memory content of a time series, which is related to the number of critical events in the time series itself. After a brief review of the statistical techniques (Diffusion Entropy and Renewal Aging), an application to experimental atmospheric time series will be illustrated. References [1] Weiss G.H., Rubin R.J., Random Walks: theory and selected applications, Advances in Chemical Physics, 52, 363-505 (1983). [2] D.R. Cox, Renewal Theory, Methuen, London (1962). [3] P. Paradisi, R. Cesari, F. Mainardi, F. Tampieri: The fractional Fick's law for non-local transport processes, Physica A, 293, p. 130-142 (2001). [4] P. Grigolini, L. Palatella, G. Raffaelli, Fractals 9 (2001) 439. [5] P. Allegrini, F. Barbi, P.
Bayesian methods for outliers detection in GNSS time series
NASA Astrophysics Data System (ADS)
Qianqian, Zhang; Qingming, Gui
2013-07-01
This article is concerned with the problem of detecting outliers in GNSS time series based on Bayesian statistical theory. Firstly, a new model is proposed to simultaneously detect different types of outliers by introducing a classification variable for each outlier type; the problem of outlier detection is converted into the computation of the corresponding posterior probabilities, and an algorithm for computing these posterior probabilities based on a standard Gibbs sampler is designed. Secondly, we analyze in depth the causes of masking and swamping when detecting patches of additive outliers, and propose an unmasking Bayesian method for detecting additive outlier patches based on an adaptive Gibbs sampler. Thirdly, the correctness of the theories and methods proposed above is illustrated using simulated data and then by analyzing real GNSS observations, such as cycle-slip detection in carrier phase data. The examples illustrate that the Bayesian methods for outlier detection in GNSS time series proposed in this paper are capable of detecting not only isolated outliers but also additive outlier patches. Furthermore, they can be successfully used to process cycle slips in phase data, which solves the problem of small cycle slips.
Software for detection and correction of inhomogeneities in time series
NASA Astrophysics Data System (ADS)
Stepanek, Petr
2010-05-01
During the last decade, a software package consisting of the AnClim, ProClimDB and LoadData tools for processing climatological data has been created. This software offers a complete solution for processing climatological time series, from loading data from a central database (e.g. Oracle; the LoadData software), through data quality control and homogenization, to time series analysis, extreme value evaluation and the verification of model outputs (the ProClimDB and AnClim software). In recent years, tools for the correction of inhomogeneities in daily data were introduced. Methods already programmed in R (e.g. by Christine Gruber, ZAMG), such as HOM of Paul Della-Marta and the SPLIDHOM method of Olivier Mestre, as well as our own methods, are available, some of them able to apply a multi-element approach (using e.g. weather types). The available methods can be easily compared and evaluated (for both inhomogeneity detection and correction). Comparison of the available correction methods is also a current task of the ongoing COST Action ES0601 (www.homogenisation.org). Further methods, if available under R, can easily be linked with the software, and the whole processing chain can then benefit from a user-friendly environment in which all the most commonly used functions for data handling and climatological processing are available (read more at www.climahom.eu).
Automatising the analysis of stochastic biochemical time-series
2015-01-01
Background Mathematical and computational modelling of biochemical systems has seen a lot of effort devoted to the definition and implementation of high-performance mechanistic simulation frameworks. Within these frameworks it is possible to analyse complex models under a variety of configurations, eventually selecting the best setting of, e.g., parameters for a target system. Motivation This operational pipeline relies on the ability to interpret the predictions of a model, often represented as simulation time-series. Thus, an efficient data analysis pipeline is crucial to automatise time-series analyses, bearing in mind that errors in this phase might mislead the modeller's conclusions. Results For this reason we have developed an intuitive framework-independent Python tool to automate analyses common to a variety of modelling approaches. These include assessment of useful non-trivial statistics for simulation ensembles, e.g., estimation of master equations. Intuitive and domain-independent batch scripts will allow the researcher to automatically prepare reports, thus speeding up the usual model-definition, testing and refinement pipeline. PMID:26051821
A Markov switching model for annual hydrologic time series
NASA Astrophysics Data System (ADS)
Akıntuǧ, B.; Rasmussen, P. F.
2005-09-01
This paper investigates the properties of Markov switching (MS) models (also known as hidden Markov models) for generating annual time series. This type of model has been used in a number of recent studies in the water resources literature. The model considered here assumes that climate switches between M states and that the state sequence can be described by a Markov chain. Observations are assumed to be drawn from a normal distribution whose parameters depend on the state variable. We present the stochastic properties of this class of models along with procedures for model identification and parameter estimation. Although, at first glance, MS models appear to be quite different from ARMA models, we show that it is possible to find an ARMA model that has the same autocorrelation function and the same marginal distribution as any given MS model. Hence, despite the difference in model structure, there are strong similarities between MS and ARMA models. MS and ARMA models are applied to the time series of mean annual discharge of the Niagara River. Although it is difficult to draw any general conclusion from a single case study, it appears that MS models (and ARMA models derived from MS models) generally have stronger autocorrelation at higher lags than ARMA models estimated by conventional maximum likelihood. This may be an important property if the purpose of the study is the analysis of multiyear droughts.
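A minimal simulation of this model class, with 2 states and illustrative parameter values and function names chosen here (not the paper's Niagara fit), shows how state persistence alone generates autocorrelation in the observed series:

```python
import random

def simulate_ms(n, stay, means, sds, seed=0):
    """Simulate a 2-state Markov switching series: a hidden Markov chain with
    Gaussian emissions whose mean/sd depend on the current climate state."""
    rng = random.Random(seed)
    state = 0
    states, x = [], []
    for _ in range(n):
        # stay[i] is the probability of remaining in state i
        if rng.random() > stay[state]:
            state = 1 - state
        states.append(state)
        x.append(rng.gauss(means[state], sds[state]))
    return states, x

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation."""
    m = sum(x) / len(x)
    num = sum((a - m) * (b - m) for a, b in zip(x, x[1:]))
    den = sum((a - m) ** 2 for a in x)
    return num / den

# Persistent states (stay-probability 0.9) induce positive autocorrelation in
# the observations even though emissions are independent given the state.
states, flow = simulate_ms(5000, stay=(0.9, 0.9), means=(100.0, 140.0), sds=(10.0, 10.0))
print(round(lag1_autocorr(flow), 2))
```

With these parameters the theoretical lag-1 autocorrelation is (2·0.9−1) times the between-state share of the variance, roughly 0.64, which is the ARMA-like behavior the paper exploits.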
Diagnosis of nonlinear systems using time series analysis
Hunter, N.F. Jr.
1991-01-01
Diagnosis and analysis techniques for linear systems have been developed and refined to a high degree of precision. In contrast, techniques for the analysis of data from nonlinear systems are in the early stages of development. This paper describes a time series technique for the analysis of data from nonlinear systems. The input and response time series resulting from excitation of the nonlinear system are embedded in a state space. The form of the embedding is optimized using local canonical variate analysis and singular value decomposition techniques. From the state space model, future system responses are estimated. The expected degree of predictability of the system is investigated using the state transition matrix. The degree of nonlinearity present is quantified using the geometry of the transfer function poles in the z plane. Examples of application to a linear single-degree-of-freedom system, a single-degree-of-freedom Duffing Oscillator, and linear and nonlinear three degree of freedom oscillators are presented. 11 refs., 9 figs.
Coastal Atmosphere and Sea Time Series (CoASTS)
NASA Technical Reports Server (NTRS)
Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Berthon, Jean-Francoise; Zibordi, Giuseppe; Doyle, John P.; Grossi, Stefania; vanderLinde, Dirk; Targa, Cristina; McClain, Charles R. (Technical Monitor)
2002-01-01
In this document, the first three years of a time series of bio-optical marine and atmospheric measurements are presented and analyzed. These measurements were performed from an oceanographic tower in the northern Adriatic Sea within the framework of the Coastal Atmosphere and Sea Time Series (CoASTS) project, an ocean color calibration and validation activity. The data set collected includes spectral measurements of the in-water apparent (diffuse attenuation coefficient, reflectance, Q-factor, etc.) and inherent (absorption and scattering coefficients) optical properties, as well as the concentrations of the main optical components (pigment and suspended matter concentrations). Clear seasonal patterns are exhibited by the marine quantities on which an appreciable short-term variability (on the order of a half day to one day) is superimposed. This short-term variability is well correlated with the changes in salinity at the surface resulting from the southward transport of freshwater coming from the northern rivers. Concentrations of chlorophyll alpha and total suspended matter span more than two orders of magnitude. The bio-optical characteristics of the measurement site pertain to both Case-I (about 64%) and Case-II (about 36%) waters, based on a relationship between the beam attenuation coefficient at 660nm and the chlorophyll alpha concentration. Empirical algorithms relating in-water remote sensing reflectance ratios and optical components or properties of interest (chlorophyll alpha, total suspended matter, and the diffuse attenuation coefficient) are presented.
Predicting physical time series using dynamic ridge polynomial neural networks.
Al-Jumeily, Dhiya; Ghazali, Rozaida; Hussain, Abir
2014-01-01
Forecasting naturally occurring phenomena is a common problem in many domains of science, and it has been addressed and investigated by many scientists. The importance of time series prediction stems from the fact that it has a wide range of applications, including control systems, engineering processes, environmental systems and economics. From knowledge of some aspects of the previous behaviour of the system, the aim of the prediction process is to determine or predict its future behaviour. In this paper, we consider a novel application of a higher order polynomial neural network architecture called the Dynamic Ridge Polynomial Neural Network, which combines the properties of higher order and recurrent neural networks, for the prediction of physical time series. In this study, four types of signals have been used: the Lorenz attractor, the mean value of the AE index, the sunspot number, and heat wave temperature. The simulation results showed good improvements in terms of signal-to-noise ratio in comparison to a number of benchmarked higher order and feedforward neural networks.
Disentangling the stochastic behavior of complex time series
Anvari, Mehrnaz; Tabar, M. Reza Rahimi; Peinke, Joachim; Lehnertz, Klaus
2016-01-01
Complex systems involving a large number of degrees of freedom generally exhibit non-stationary dynamics, which can result in either continuous or discontinuous sample paths of the corresponding time series. The latter sample paths may be caused by discontinuous events, or jumps, with some distributed amplitudes, and disentangling effects caused by such jumps from effects caused by normal diffusion processes is a main problem for a detailed understanding of the stochastic dynamics of complex systems. Here we introduce a non-parametric method to address this general problem. By means of stochastic dynamical jump-diffusion modelling, we separate deterministic drift terms from different stochastic behaviors, namely diffusive and jumpy ones, and show that all of the unknown functions and coefficients of this modelling can be derived directly from measured time series. We demonstrate the applicability of our method to empirical observations by a data-driven inference of the deterministic drift term and of the diffusive and jumpy behavior in brain dynamics from ten epilepsy patients. In particular, these different stochastic behaviors provide extra information that can be regarded as valuable for diagnostic purposes. PMID:27759055
Financial time series prediction using spiking neural networks.
Reid, David; Hussain, Abir Jaafar; Tawfik, Hissam
2014-01-01
In this paper a novel application of a particular type of spiking neural network, a Polychronous Spiking Network, was used for financial time series prediction. It is argued that the inherent temporal capabilities of this type of network are suited to non-stationary data such as this. The performance of the spiking neural network was benchmarked against three systems: two "traditional", rate-encoded, neural networks; a Multi-Layer Perceptron neural network and a Dynamic Ridge Polynomial neural network, and a standard Linear Predictor Coefficients model. For this comparison three non-stationary and noisy time series were used: IBM stock data; US/Euro exchange rate data, and the price of Brent crude oil. The experiments demonstrated favourable prediction results for the Spiking Neural Network in terms of Annualised Return and prediction error for 5-Step ahead predictions. These results were also supported by other relevant metrics such as Maximum Drawdown and Signal-To-Noise ratio. This work demonstrated the applicability of the Polychronous Spiking Network to financial data forecasting and this in turn indicates the potential of using such networks over traditional systems in difficult to manage non-stationary environments.
A new complexity measure for time series analysis and classification
NASA Astrophysics Data System (ADS)
Nagaraj, Nithin; Balasubramanian, Karthi; Dey, Sutirth
2013-07-01
Complexity measures are used in a number of applications, including extraction of information from data such as ecological time series, detection of non-random structure in biomedical signals, testing of random number generators, language recognition, and authorship attribution. Different complexity measures proposed in the literature, like Shannon entropy, relative entropy, Lempel-Ziv, Kolmogorov and algorithmic complexity, are mostly ineffective in analyzing short sequences that are further corrupted with noise. To address this problem, we propose a new complexity measure, ETC, defined as the "Effort To Compress" the input sequence by a lossless compression algorithm. Here, we employ the lossless compression algorithm known as Non-Sequential Recursive Pair Substitution (NSRPS) and define ETC as the number of iterations needed for NSRPS to transform the input sequence into a constant sequence. We demonstrate the utility of ETC in two applications. ETC is shown to have better correlation with the Lyapunov exponent than Shannon entropy, even for relatively short and noisy time series. The measure also has a greater rate of success in automatic identification and classification of short noisy sequences, compared to entropy and a popular measure based on Lempel-Ziv compression (implemented by Gzip).
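The ETC computation is short enough to sketch directly. In the illustrative Python below, the tie-breaking rule when several pairs are equally frequent (first pair encountered wins) is an implementation choice made here, not something the abstract specifies:

```python
from collections import Counter

def nsrps_step(seq):
    """One NSRPS pass: replace non-overlapping occurrences of the most
    frequent adjacent pair with a fresh symbol."""
    pairs = Counter(zip(seq, seq[1:]))
    pair = pairs.most_common(1)[0][0]  # ties broken by first appearance
    new_sym = max(seq) + 1
    out, i = [], 0
    while i < len(seq):
        if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
            out.append(new_sym)
            i += 2
        else:
            out.append(seq[i])
            i += 1
    return out

def etc(seq):
    """Effort To Compress: NSRPS iterations until the sequence is constant."""
    seq = list(seq)
    steps = 0
    while len(seq) > 1 and len(set(seq)) > 1:
        seq = nsrps_step(seq)
        steps += 1
    return steps

print(etc([1, 1, 0, 1, 0, 0, 1, 0]))  # 5 with the first-seen tie-breaking used here
print(etc([1, 1, 1, 1, 1, 1, 1, 1]))  # 0: a constant sequence needs no compression
```

Structured input compresses in fewer iterations per symbol than random input, which is what makes the iteration count usable as a complexity measure for short sequences.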
Hybrid perturbation methods based on statistical time series models
NASA Astrophysics Data System (ADS)
San-Juan, Juan Félix; San-Martín, Montserrat; Pérez, Iván; López, Rosario
2016-04-01
In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies, derived from the fact that, in order to simplify the expressions and subsequent computations, not all the involved forces are taken into account and only low-order terms are considered, not to mention the fact that mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing the dynamics missing from the previously integrated approximation. This combination results in improved precision of conventional numerical, analytical and semianalytical theories for determining the position and velocity of any artificial satellite or space debris object. In order to validate this methodology, we present a family of three hybrid orbit propagators, each formed by the combination of one of three different orders of approximation of an analytical theory with a statistical time series model, and analyse their capability to process the effect produced by the flattening of the Earth. The three considered analytical components are the integration of the Kepler problem, a first-order analytical theory and a second-order analytical theory, whereas the prediction technique is the same in all three cases, namely an additive Holt-Winters method.
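The additive Holt-Winters predictor named as the time-series component can be sketched as follows (the initialization scheme, smoothing constants and synthetic data are illustrative choices made here, not those of the paper):

```python
import math

def holt_winters_additive(x, period, alpha=0.3, beta=0.1, gamma=0.2):
    """Additive Holt-Winters smoothing; returns one-step-ahead in-sample forecasts."""
    # Initialize level/trend from the first two seasons, seasonal indices
    # from first-season deviations around the level.
    level = sum(x[:period]) / period
    trend = (sum(x[period:2 * period]) - sum(x[:period])) / period ** 2
    season = [x[i] - level for i in range(period)]
    forecasts = []
    for t in range(len(x)):
        s = season[t % period]
        forecasts.append(level + trend + s)
        # Standard additive Holt-Winters update equations
        new_level = alpha * (x[t] - s) + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        season[t % period] = gamma * (x[t] - new_level) + (1 - gamma) * s
        level = new_level
    return forecasts

# Linear trend plus additive seasonality is exactly the model class Holt-Winters
# assumes, so one-step errors shrink after an initial burn-in.
data = [10 + 0.1 * t + 3 * math.sin(2 * math.pi * t / 12) for t in range(120)]
fc = holt_winters_additive(data, period=12)
err = sum(abs(a - b) for a, b in zip(data[24:], fc[24:])) / len(data[24:])
print(round(err, 2))
```

In the hybrid scheme, the series being smoothed would be the residual between the analytical propagation and the true dynamics, and the forecast corrects future propagations.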
Efficient Bayesian inference for natural time series using ARFIMA processes
NASA Astrophysics Data System (ADS)
Graves, T.; Gramacy, R. B.; Franzke, C. L. E.; Watkins, N. W.
2015-11-01
Many geophysical quantities, such as atmospheric temperature, water levels in rivers, and wind speeds, have shown evidence of long memory (LM). LM implies that these quantities experience non-trivial temporal memory, which potentially not only enhances their predictability, but also hampers the detection of externally forced trends. Thus, it is important to reliably identify whether or not a system exhibits LM. In this paper we present a modern and systematic approach to the inference of LM. We use the flexible autoregressive fractional integrated moving average (ARFIMA) model, which is widely used in time series analysis, and of increasing interest in climate science. Unlike most previous work on the inference of LM, which is frequentist in nature, we provide a systematic treatment of Bayesian inference. In particular, we provide a new approximate likelihood for efficient parameter inference, and show how nuisance parameters (e.g., short-memory effects) can be integrated over in order to focus on long-memory parameters and hypothesis testing more directly. We illustrate our new methodology on the Nile water level data and the central England temperature (CET) time series, with favorable comparison to the standard estimators. For CET we also extend our method to seasonal long memory.
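The "fractional integration" at the heart of ARFIMA reduces to the binomial filter (1-B)^d. A short sketch (function names chosen here; not the authors' inference code) computes its truncated weights via the standard recursion w_0 = 1, w_k = w_{k-1}(k-1-d)/k:

```python
def frac_diff_weights(d, k_max):
    """Binomial expansion of (1-B)^d: w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k."""
    w = [1.0]
    for k in range(1, k_max + 1):
        w.append(w[-1] * (k - 1 - d) / k)
    return w

def frac_diff(x, d, k_max=50):
    """Apply the (truncated) fractional difference filter to a series."""
    w = frac_diff_weights(d, k_max)
    return [sum(w[k] * x[t - k] for k in range(min(t + 1, len(w))))
            for t in range(len(x))]

w = frac_diff_weights(0.4, 5)
print([round(v, 4) for v in w])  # [1.0, -0.4, -0.12, -0.064, -0.0416, -0.03]
```

The slow (hyperbolic) decay of these weights for 0 < d < 0.5 is precisely the long-memory signature the paper's Bayesian machinery is designed to detect; at d = 1 the filter collapses to ordinary first differencing.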
Time series analysis of gold production in Malaysia
NASA Astrophysics Data System (ADS)
Muda, Nora; Hoon, Lee Yuen
2012-05-01
Gold is a soft, malleable, bright yellow metallic element that is unaffected by air and most reagents. It is highly valued as an asset or investment commodity and is extensively used in jewellery, industrial applications, dentistry and medicine. In Malaysia, gold mining is limited to several areas such as Pahang, Kelantan, Terengganu, Johor and Sarawak. The main purpose of this case study is to obtain a suitable model for the production of gold in Malaysia. The model can also be used to predict Malaysia's gold production in the future. The Box-Jenkins time series method was used to perform the analysis, with the following steps: identification, estimation, diagnostic checking and forecasting. In addition, the accuracy of the predictions is tested using the mean absolute percentage error (MAPE). From the analysis, the ARIMA(3,1,1) model was found to be the best fitted model, with a MAPE of 3.704%, indicating that the predictions are very accurate. Hence, this model can be used for forecasting. This study is expected to help the private and public sectors understand the gold production scenario and plan gold mining activities in Malaysia.
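The MAPE criterion used to judge the fitted model is straightforward to compute. The sketch below uses toy numbers, not the paper's production data, and assumes all actual values are nonzero:

```python
def mape(actual, forecast):
    """Mean absolute percentage error, in percent.
    Undefined when any actual value is zero (division by the actual)."""
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

actual = [200.0, 210.0, 190.0, 205.0]
forecast = [198.0, 214.0, 185.0, 207.0]
print(round(mape(actual, forecast), 3))  # 1.628
```

A MAPE of 3.704%, as reported for the ARIMA(3,1,1) fit, means forecasts deviate from actual production by under 4% on average.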
Time series clustering analysis of health-promoting behavior
NASA Astrophysics Data System (ADS)
Yang, Chi-Ta; Hung, Yu-Shiang; Deng, Guang-Feng
2013-10-01
Health promotion must be emphasized to achieve the World Health Organization goal of health for all. Since the global population is aging rapidly, the ComCare elder health-promoting service was developed by the Taiwan Institute for Information Industry in 2011. Based on the Pender health promotion model, the ComCare service offers five categories of health-promoting functions to address the everyday needs of seniors: nutrition management, social support, exercise management, health responsibility and stress management. To assess the overall ComCare service and to improve understanding of the health-promoting behavior of elders, this study analyzed health-promoting behavioral data automatically collected by the ComCare monitoring system. In the 30,638 session records collected for 249 elders from January 2012 to March 2013, behavior patterns were identified by a fuzzy c-means time series clustering algorithm combined with autocorrelation-based representation schemes. The analysis showed that the time series data for elder health-promoting behavior can be classified into four different clusters. Each type reveals different health-promoting needs, frequencies, function numbers and behaviors. The results of the data analysis can assist policymakers, health-care providers, and experts in medicine, public health, nursing and psychology, and have been provided to the Taiwan National Health Insurance Administration to assess elder health-promoting behavior.
Time series modelling and forecasting of emergency department overcrowding.
Kadri, Farid; Harrou, Fouzi; Chaabane, Sondès; Tahon, Christian
2014-09-01
Efficient management of patient flow (demand) in emergency departments (EDs) has become an urgent issue for many hospital administrations. Today, more and more attention is being paid to hospital management systems to optimally manage patient flow and to improve management strategies, efficiency and safety in such establishments. To this end, EDs require significant human and material resources, but unfortunately these are limited. Within such a framework, the ability to accurately forecast demand in emergency departments has considerable implications for hospitals to improve resource allocation and strategic planning. The aim of this study was to develop models for forecasting daily attendances at the hospital emergency department in Lille, France. The study demonstrates how time-series analysis can be used to forecast, at least in the short term, demand for emergency services in a hospital emergency department. The forecasts were based on daily patient attendances at the paediatric emergency department in Lille regional hospital centre, France, from January 2012 to December 2012. An autoregressive integrated moving average (ARIMA) method was applied separately to each of the two GEMSA categories and total patient attendances. Time-series analysis was shown to provide a useful, readily available tool for forecasting emergency department demand. PMID:25053208
Network inference with confidence from multivariate time series.
Kramer, Mark A; Eden, Uri T; Cash, Sydney S; Kolaczyk, Eric D
2009-06-01
Networks--collections of interacting elements or nodes--abound in the natural and manmade worlds. For many networks, complex spatiotemporal dynamics stem from patterns of physical interactions unknown to us. To infer these interactions, it is common to include edges between those nodes whose time series exhibit sufficient functional connectivity, typically defined as a measure of coupling exceeding a predetermined threshold. However, when uncertainty exists in the original network measurements, uncertainty in the inferred network is likely, and hence a statistical propagation of error is needed. In this manuscript, we describe a principled and systematic procedure for the inference of functional connectivity networks from multivariate time series data. Our procedure yields as output both the inferred network and a quantification of uncertainty of the most fundamental interest: uncertainty in the number of edges. To illustrate this approach, we apply a measure of linear coupling to simulated data and electrocorticogram data recorded from a human subject during an epileptic seizure. We demonstrate that the procedure is accurate and robust in both the determination of edges and the reporting of uncertainty associated with that determination. PMID:19658533
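The paper derives its edge-count uncertainty through a principled propagation of error; the toy sketch below substitutes a naive bootstrap over time points (which ignores serial correlation, so it is only an illustration) to show the idea of thresholding a coupling measure and attaching a confidence interval to the number of inferred edges. All names and thresholds are illustrative.

```python
import numpy as np

def edges_from_coupling(X, thresh=0.5):
    """Adjacency matrix: edge where |Pearson correlation| exceeds a threshold."""
    C = np.corrcoef(X)                       # X has shape (nodes, time)
    A = np.abs(C) > thresh
    np.fill_diagonal(A, False)
    return A

def edge_count_ci(X, thresh=0.5, n_boot=200, alpha=0.05, seed=0):
    """Bootstrap time points to put a rough confidence interval on edge count."""
    rng = np.random.default_rng(seed)
    n_t = X.shape[1]
    counts = []
    for _ in range(n_boot):
        idx = rng.integers(0, n_t, n_t)      # resample time indices
        counts.append(edges_from_coupling(X[:, idx], thresh).sum() // 2)
    lo, hi = np.percentile(counts, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return lo, hi

# Toy network: nodes 0 and 1 share a common drive, node 2 is independent.
rng = np.random.default_rng(2)
drive = rng.normal(size=500)
X = np.vstack([drive + 0.3 * rng.normal(size=500),
               drive + 0.3 * rng.normal(size=500),
               rng.normal(size=500)])
A = edges_from_coupling(X)
lo, hi = edge_count_ci(X)
```

Here the single true edge (0, 1) is recovered, and the interval on the edge count quantifies how stable that inference is under resampling.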
Two algorithms to fill cloud gaps in LST time series
NASA Astrophysics Data System (ADS)
Frey, Corinne; Kuenzer, Claudia
2013-04-01
Cloud contamination is a challenge for optical remote sensing. This is especially true for the recording of a fast-changing radiative quantity such as land surface temperature (LST). The substitution of cloud-contaminated pixels with estimated values - gap filling - is not straightforward, but is possible to a certain extent, as this research shows for medium-resolution time series of MODIS data. The area of interest is the Upper Mekong Delta (UMD). The background for this work is an analysis of the temporal development of 1-km LST in the context of the WISDOM project. The climate of the UMD is characterized by peak rainfall in the summer months, which is also when cloud contamination in the area is highest. The average number of available daytime observations per pixel can drop below five, for example in June, whereas in winter it may reach 25 observations a month. This situation is not adequate for the calculation of long-term statistics; an appropriate gap filling method should be applied beforehand. In this research, two different algorithms were tested on an 11-year time series: 1) a gradient-based algorithm and 2) a method based on ECMWF ERA-Interim re-analysis data. The first algorithm searches for stable inter-image gradients within a given environment and over a certain period of time. These gradients are then used to estimate LST for cloud-contaminated pixels in each acquisition. The estimated LSTs are clear-sky LSTs and are based solely on the MODIS LST time series. The second method estimates LST on the basis of adapted ECMWF ERA-Interim skin temperatures and creates a set of expected LSTs. The estimated values were used to fill the gaps in the original dataset, creating two new daily, 1-km datasets. The maps filled with the gradient-based method contained more than double the number of valid pixels of the original dataset. The second method (ECMWF ERA-Interim based) was able to fill all data gaps. From the gap filled data sets then monthly
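The gradient-based idea can be reduced to its simplest case: fill a cloudy observation at one pixel from a clear neighbouring pixel plus the locally stable difference (gradient) between the two over a surrounding time window. This is a single-pixel-pair simplification of the algorithm, with an illustrative function name and window size, not the authors' implementation.

```python
import numpy as np

def fill_with_gradient(target, reference, window=5):
    """Fill NaNs in `target` using a clear reference pixel series plus the
    locally stable target-minus-reference offset (a simplified 'gradient')."""
    target = np.asarray(target, float).copy()
    for i in np.flatnonzero(np.isnan(target)):
        lo, hi = max(0, i - window), i + window + 1
        diff = target[lo:hi] - reference[lo:hi]
        diff = diff[~np.isnan(diff)]         # clear-sky co-observations only
        if diff.size and not np.isnan(reference[i]):
            target[i] = reference[i] + diff.mean()
    return target

# Toy LST series: the target pixel runs a steady 2 K warmer than the reference.
reference = np.array([290.0, 291.0, 292.0, 293.0, 294.0, 295.0])
target = np.array([292.0, 293.0, np.nan, 294.0 + 1.0, 296.0, 297.0])
filled = fill_with_gradient(target, reference)
```

Because the estimate is anchored to clear-sky co-observations, the filled value is a clear-sky LST, matching the abstract's caveat that the gradient method produces clear-sky estimates only.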
Vegetation Dynamics of NW Mexico using MODIS time series data
NASA Astrophysics Data System (ADS)
Valdes, M.; Bonifaz, R.; Pelaez, G.; Leyva Contreras, A.
2010-12-01
Northwestern Mexico is an area subject to a combination of marine and continental climatic influences, which produce highly variable vegetation dynamics over time. Using Moderate Resolution Imaging Spectroradiometer (MODIS) vegetation index data (NDVI and EVI) from 2001 to 2008, mean and standard deviation images of the time series were calculated. Using these data, annual vegetation dynamics were characterized on the basis of the values for the different vegetation types. Annual mean values were compared, and interannual variations or anomalies were analyzed by calculating departures from the mean. A value was considered anomalous if it departed from the mean by more than two standard deviations. Using this procedure it was possible to determine spatio-temporal patterns over the study area and relate them to climatic conditions.
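The two-standard-deviation criterion is simple enough to state directly; the sketch below applies it to made-up index values against a hypothetical climatology (the actual MODIS statistics are per-pixel images, not scalars).

```python
import numpy as np

def anomalies(values, mean, std, k=2.0):
    """Flag values departing more than k standard deviations from the
    long-term mean (the two-sigma criterion for k=2)."""
    values = np.asarray(values, float)
    return np.abs(values - mean) > k * std

# Toy NDVI-like annual means against an illustrative 2001-2008 climatology.
clim_mean, clim_std = 0.55, 0.04
yearly = np.array([0.54, 0.57, 0.40, 0.56, 0.66])
flags = anomalies(yearly, clim_mean, clim_std)   # third and fifth years flagged
```

Applied per pixel and per year, the same test yields anomaly maps whose spatio-temporal patterns can then be compared with climatic conditions.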
A quasi-global precipitation time series for drought monitoring
Funk, Chris C.; Peterson, Pete J.; Landsfeld, Martin F.; Pedreros, Diego H.; Verdin, James P.; Rowland, James D.; Romero, Bo E.; Husak, Gregory J.; Michaelsen, Joel C.; Verdin, Andrew P.
2014-01-01
Estimating precipitation variations in space and time is an important aspect of drought early warning and environmental monitoring. An evolving drier-than-normal season must be placed in historical context so that the severity of rainfall deficits may quickly be evaluated. To this end, scientists at the U.S. Geological Survey Earth Resources Observation and Science Center, working closely with collaborators at the University of California, Santa Barbara Climate Hazards Group, have developed a quasi-global (50°S–50°N, 180°E–180°W), 0.05° resolution, 1981 to near-present gridded precipitation time series: the Climate Hazards Group InfraRed Precipitation with Stations (CHIRPS) data archive.
Behavior of road accidents: Structural time series approach
NASA Astrophysics Data System (ADS)
Junus, Noor Wahida Md; Ismail, Mohd Tahir; Arsad, Zainudin
2014-12-01
Road accidents have become a major issue, contributing to an increasing number of deaths. Several researchers suggest that road accidents occur due to road structure and road condition, which may differ according to the area and the traffic volume of the location. Therefore, this paper examines the behavior of road accidents in four main regions of Peninsular Malaysia by employing a structural time series (STS) approach. STS offers the possibility of modelling unobserved components, such as the trend and seasonal components, and allows them to vary over time. The results show that the number of road accidents in each region is described by a different model. This implies that the government, and policy makers in particular, should consider implementing different approaches in different regions to curb the increasing number of road accidents.
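Structural time series models are typically estimated with a Kalman filter over their unobserved components. A minimal local-level example (trend component only, synthetic counts rather than the Malaysian accident data) can be sketched as:

```python
import numpy as np

def local_level_filter(y, q=1.0, r=10.0):
    """Kalman filter for the local-level structural model:
       y_t = mu_t + eps_t (obs. variance r), mu_t = mu_{t-1} + eta_t (var. q)."""
    mu, P = y[0], r                 # start at the first observation
    level = []
    for obs in y:
        P = P + q                   # predict: the level is a random walk
        K = P / (P + r)             # Kalman gain
        mu = mu + K * (obs - mu)    # update with the new observation
        P = (1 - K) * P
        level.append(mu)
    return np.array(level)

# Toy monthly accident counts: noisy observations around a drifting level.
rng = np.random.default_rng(3)
true_level = 100 + np.cumsum(rng.normal(0, 1, 120))
y = true_level + rng.normal(0, 3, 120)
level = local_level_filter(y, q=1.0, r=9.0)
```

A full STS model for accident counts would add a seasonal component with its own state equations; the filtering recursion is the same.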
A new correlation coefficient for bivariate time-series data
NASA Astrophysics Data System (ADS)
Erdem, Orhan; Ceyhan, Elvan; Varli, Yusuf
2014-11-01
The correlation in time series has received considerable attention in the literature. Its use has attained an important role in the social sciences and finance. For example, pair trading in finance is concerned with the correlation between stock prices, returns, etc. In general, Pearson’s correlation coefficient is employed in these areas although it has many underlying assumptions which restrict its use. Here, we introduce a new correlation coefficient which takes into account the lag difference of data points. We investigate the properties of this new correlation coefficient. We demonstrate that it is more appropriate for showing the direction of the covariation of the two variables over time. We also compare the performance of the new correlation coefficient with Pearson’s correlation coefficient and Detrended Cross-Correlation Analysis (DCCA) via simulated examples.
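The abstract does not give the new coefficient's definition, so the sketch below shows only the standard lag-shifted Pearson coefficient for contrast: the simplest way to make a correlation measure account for a lag between two series. This is not the authors' coefficient.

```python
import numpy as np

def lagged_corr(x, y, lag=1):
    """Pearson correlation between x_t and y_{t+lag} (positive lag means
    y trails x)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    elif lag < 0:
        x, y = x[-lag:], y[:lag]
    return np.corrcoef(x, y)[0, 1]

# y follows x one step later, so the lag-1 coefficient is the strong one.
rng = np.random.default_rng(4)
x = rng.normal(size=300)
y = np.roll(x, 1) + 0.1 * rng.normal(size=300)
r0 = lagged_corr(x, y, lag=0)   # near zero: contemporaneous view misses it
r1 = lagged_corr(x, y, lag=1)   # near one: the lead-lag relation appears
```

Pair-trading applications hinge on exactly this distinction: two assets may look uncorrelated contemporaneously while one reliably leads the other.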
Adaptive Sensing of Time Series with Application to Remote Exploration
NASA Technical Reports Server (NTRS)
Thompson, David R.; Cabrol, Nathalie A.; Furlong, Michael; Hardgrove, Craig; Low, Bryan K. H.; Moersch, Jeffrey; Wettergreen, David
2013-01-01
We address the problem of adaptive, information-optimal data collection in time series. Here a remote sensor or explorer agent throttles its sampling rate in order to track anomalous events while obeying constraints on time and power. This problem is challenging because the agent has limited visibility -- all collected datapoints lie in the past, but its resource allocation decisions require predicting far into the future. Our solution is to continually fit a Gaussian process model to the latest data and optimize the sampling plan online to maximize information gain. We compare the performance characteristics of stationary and nonstationary Gaussian process models. We also describe an application based on geologic analysis during planetary rover exploration. Here adaptive sampling can improve coverage of localized anomalies and potentially benefit the mission science yield of long autonomous traverses.
Fast and Flexible Multivariate Time Series Subsequence Search
NASA Technical Reports Server (NTRS)
Bhaduri, Kanishka; Oza, Nikunj C.; Zhu, Qiang; Srivastava, Ashok N.
2010-01-01
Multivariate Time-Series (MTS) are ubiquitous and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical monitoring, and financial systems. Domain experts are often interested in searching for interesting multivariate patterns in these MTS databases, which often contain several gigabytes of data. Surprisingly, research on MTS search is very limited. Most of the existing work supports only queries with the same data length, or queries on a fixed set of variables. In this paper, we propose an efficient and flexible subsequence search framework for massive MTS databases that, for the first time, enables querying on any subset of variables with arbitrary time delays between them. We propose two algorithms to solve this problem: (1) a List Based Search (LBS) algorithm, which uses sorted lists for indexing, and (2) an R*-tree Based Search (RBS), which uses Minimum Bounding Rectangles (MBRs) to organize the subsequences. Both algorithms guarantee that all matching patterns within the specified thresholds will be returned (no false dismissals). The very few false alarms can be removed by a post-processing step. Since our framework is also capable of Univariate Time-Series (UTS) subsequence search, we first demonstrate the efficiency of our algorithms on several UTS datasets previously used in the literature. We follow this up with experiments using two large MTS databases from the aviation domain, each containing several million observations. Both of these tests show that our algorithms have very high prune rates (>99%), thus needing actual disk access for less than 1% of the observations. To the best of our knowledge, MTS subsequence search has never been attempted on datasets of the size used in this paper.
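The baseline that LBS and RBS prune against is a linear scan with early abandoning; the sketch below shows that baseline for the univariate case (the index structures themselves, and the multivariate time-delay matching, are beyond a short example).

```python
import numpy as np

def subsequence_search(series, query, thresh):
    """Brute-force UTS subsequence search with early abandoning: return the
    start indices of windows within Euclidean distance `thresh` of `query`.
    An index such as an R*-tree over MBRs prunes the same windows without
    scanning them."""
    m, t2 = len(query), thresh * thresh
    hits = []
    for start in range(len(series) - m + 1):
        acc = 0.0
        for i in range(m):                   # early abandon once over budget
            acc += (series[start + i] - query[i]) ** 2
            if acc > t2:
                break
        else:
            hits.append(start)
    return hits

series = np.array([0.0, 0.0, 1.0, 2.0, 1.0, 0.0, 0.0, 1.0, 2.0, 1.0, 0.0])
query = np.array([1.0, 2.0, 1.0])
hits = subsequence_search(series, query, thresh=0.5)   # matches at 2 and 7
```

The "no false dismissals" guarantee of the paper's algorithms means an index-based search must return exactly the windows this exhaustive scan would.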
Time series power flow analysis for distribution connected PV generation.
Broderick, Robert Joseph; Quiroz, Jimmy Edward; Ellis, Abraham; Reno, Matthew J.; Smith, Jeff; Dugan, Roger
2013-01-01
Distributed photovoltaic (PV) projects must go through an interconnection study process before connecting to the distribution grid. These studies are intended to identify the likely impacts and mitigation alternatives. In the majority of the cases, system impacts can be ruled out or mitigation can be identified without an involved study, through a screening process or a simple supplemental review study. For some proposed projects, expensive and time-consuming interconnection studies are required. The challenges to performing the studies are twofold. First, every study scenario is potentially unique, as the studies are often highly specific to the amount of PV generation capacity that varies greatly from feeder to feeder and is often unevenly distributed along the same feeder. This can cause location-specific impacts and mitigations. The second challenge is the inherent variability in PV power output which can interact with feeder operation in complex ways, by affecting the operation of voltage regulation and protection devices. The typical simulation tools and methods in use today for distribution system planning are often not adequate to accurately assess these potential impacts. This report demonstrates how quasi-static time series (QSTS) simulation and high time-resolution data can be used to assess the potential impacts in a more comprehensive manner. The QSTS simulations are applied to a set of sample feeders with high PV deployment to illustrate the usefulness of the approach. The report describes methods that can help determine how PV affects distribution system operations. The simulation results are focused on enhancing the understanding of the underlying technical issues. The examples also highlight the steps needed to perform QSTS simulation and describe the data needed to drive the simulations. The goal of this report is to make the methodology of time series power flow analysis readily accessible to utilities and others responsible for evaluating
Seasonal signals in the reprocessed GPS coordinate time series
NASA Astrophysics Data System (ADS)
Kenyeres, A.; van Dam, T.; Figurski, M.; Szafranek, K.
2008-12-01
The global (IGS) and regional (EPN) CGPS time series have already been studied in detail by several authors to analyze the periodic signals and noise present in the long-term displacement series. The comparisons indicated that the amplitude and phase of the CGPS-derived seasonal signals mostly disagree with surface mass redistribution models. The CGPS results highly overestimate the seasonal term; only about 40% of the observed annual amplitude can be explained by the joint contribution of the geophysical models (Dong et al. 2002). Additionally, the estimated amplitudes and phases are poorly coherent with the models, especially at sites close to coastal areas (van Dam et al. 2007). The conclusion of these studies was that the GPS results are distorted by analysis artifacts (e.g., ocean tide loading, aliasing of unmodeled short-period tidal signals, antenna PCV models), monument thermal effects and multipath. Additionally, the GPS series available so far are inhomogeneous in terms of processing strategy, applied models and reference frames. The introduction of absolute phase center variation (PCV) models for the satellite and ground antennae in 2006, and the related reprocessing of the GPS precise orbits, provided firm ground and a strong argument for the complete re-analysis of GPS observations from the global down to the local network level. This enormous work is in progress within the IGS, and a pilot analysis has already been carried out for the complete EPN observations from 1996 to 2007 by the MUT group (Military University of Technology, Warsaw). A quick analysis of the results confirmed expectations and the superiority of the reprocessed data. The noise level (weekly coordinate repeatability) was greatly reduced, preparing the ground for later analysis at the daily-solution level. We also observed a significant decrease of the seasonal term in the residual coordinate time series, which prompted us to repeat the comparison of the GPS-derived annual periodicity
Tidal Analysis of Very Long Gravity Time Series
NASA Astrophysics Data System (ADS)
Calvo, M.; Hinderer, J.; Rosat, S.; Legros, H.; Boy, J.; Riccardi, U.; Ducarme, B.; Zuern, W. E.
2012-12-01
We report on tidal analyses carried out on very long gravity time series collected at three European permanent gravity observatories. According to the Nyquist criterion, very long gravity series enable a high-resolution spectral analysis in the tidal bands, allowing us to separate small-amplitude waves in the major tidal groups and also to attempt to detect very long period (18.6- and 9-yr) tides that have never been observed in gravity data. For this study we use two long data sets recorded by spring gravimeters at BFO (Germany) (1980-2012) and Walferdange (Luxembourg) (1980-1995), as well as two time series (1987-1996 and 1996-2012) from two superconducting gravimeters located at the Strasbourg station (France). It is well known that temporal changes of the instrumental sensitivity may introduce a related error in the tidal analysis. Hence the sensitivity of each instrument is investigated using the temporal variations of the delta factor for the main tidal waves (O1, K1, M2, and S2) as well as the M2/O1 delta factor ratio. Our findings demonstrate that the lack of long-term stability of the spring instruments precludes more detailed spectral analysis; in contrast, promising results have been obtained from gravity data collected by the two superconducting gravimeters operating in consecutive epochs at Strasbourg. We checked the stability of the instrumental sensitivity using numerous calibration experiments carried out during the last 15 years by co-located absolute gravity measurements. It turns out that the SG stability is much better than what can be achieved by repeated SG/AG calibrations. The observed temporal evolution of the tidal delta factors in Strasbourg is also compared with results from other European SG stations. Finally, we compare the observed parameters with those theoretically estimated from solid Earth tide models. The results demonstrate that long series of precise SG observations are a powerful
Multifractal analysis of visibility graph-based Ito-related connectivity time series.
Czechowski, Zbigniew; Lovallo, Michele; Telesca, Luciano
2016-02-01
In this study, we investigate multifractal properties of connectivity time series resulting from the visibility graph applied to normally distributed time series generated by the Ito equations with multiplicative power-law noise. We show that multifractality of the connectivity time series (i.e., the series of numbers of links outgoing any node) increases with the exponent of the power-law noise. The multifractality of the connectivity time series could be due to the width of connectivity degree distribution that can be related to the exit time of the associated Ito time series. Furthermore, the connectivity time series are characterized by persistence, although the original Ito time series are random; this is due to the procedure of visibility graph that, connecting the values of the time series, generates persistence but destroys most of the nonlinear correlations. Moreover, the visibility graph is sensitive for detecting wide "depressions" in input time series.
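The connectivity series studied here is the degree sequence of the natural visibility graph; a compact sketch of its construction (quadratic-time, for illustration on short series) is:

```python
import numpy as np

def visibility_degree_series(y):
    """Natural visibility graph of a time series: points (a, y[a]) and
    (b, y[b]) are linked if every sample between them lies strictly below
    the straight line connecting them; returns each node's number of links."""
    n = len(y)
    deg = np.zeros(n, dtype=int)
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                          for c in range(a + 1, b))
            if visible:
                deg[a] += 1
                deg[b] += 1
    return deg

# A peak sees everything; samples shadowed by it see only near neighbours.
y = np.array([1.0, 0.5, 3.0, 0.5, 1.0])
deg = visibility_degree_series(y)
```

The degree (connectivity) sequence `deg` is itself a time series; the paper's multifractal analysis is then run on such sequences generated from Ito processes.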
Computer Program Recognizes Patterns in Time-Series Data
NASA Technical Reports Server (NTRS)
Hand, Charles
2003-01-01
A computer program recognizes selected patterns in time-series data like digitized samples of seismic or electrophysiological signals. The program implements an artificial neural network (ANN) and a set of N clocks for the purpose of determining whether N or more instances of a certain waveform, W, occur within a given time interval, T. The ANN must be trained to recognize W in the incoming stream of data. The first time the ANN recognizes W, it sets clock 1 to count down from T to zero; the second time it recognizes W, it sets clock 2 to count down from T to zero, and so forth through the Nth instance. On the N + 1st instance, the cycle is repeated, starting with clock 1. If any clock has not reached zero when it is reset, then N instances of W have been detected within time T, and the program so indicates. The program can readily be encoded in a field-programmable gate array or an application-specific integrated circuit that could be used, for example, to detect electroencephalographic or electrocardiographic waveforms indicative of epileptic seizures or heart attacks, respectively.
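The clock logic is independent of the ANN detector, so it can be sketched on its own; here an arbitrary stand-in predicate plays the role of the trained network, and the cyclic reset-and-check implements the scheme described above (the clock being reset was started N detections earlier, so finding it still running means a burst of detections fell inside the window T).

```python
class ClockBank:
    """N countdown clocks behind a waveform detector: report when repeated
    detections of the waveform W crowd into a window of T time steps."""
    def __init__(self, n, t):
        self.n, self.t = n, t
        self.clocks = [0] * n          # remaining time on each clock
        self.next = 0                  # clock to reset on the next detection

    def tick(self):
        self.clocks = [max(0, c - 1) for c in self.clocks]

    def detect(self):
        """Call when the detector (the ANN, in the original) recognizes W;
        returns True if the clock being reset had not yet reached zero."""
        fired = self.clocks[self.next] > 0
        self.clocks[self.next] = self.t
        self.next = (self.next + 1) % self.n
        return fired

bank = ClockBank(n=3, t=10)
alerts = []
for step in range(30):
    bank.tick()
    if step in (2, 4, 6, 8, 25):       # stand-in for ANN recognitions of W
        alerts.append((step, bank.detect()))
```

With detections at steps 2, 4, 6 and 8, the fourth detection reuses the clock started at step 2, which is still running, so the alarm fires; the isolated detection at step 25 does not.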
Adaptive Sampling of Time Series During Remote Exploration
NASA Technical Reports Server (NTRS)
Thompson, David R.
2012-01-01
This work deals with the challenge of online adaptive data collection in a time series. A remote sensor or explorer agent adapts its rate of data collection in order to track anomalous events while obeying constraints on time and power. This problem is challenging because the agent has limited visibility (all its datapoints lie in the past) and limited control (it can only decide when to collect its next datapoint). This problem is treated from an information-theoretic perspective, fitting a probabilistic model to collected data and optimizing the future sampling strategy to maximize information gain. The performance characteristics of stationary and nonstationary Gaussian process models are compared. Self-throttling sensors could benefit environmental sensor networks and monitoring as well as robotic exploration. Explorer agents can improve performance by adjusting their data collection rate, preserving scarce power or bandwidth resources during uninteresting times while fully covering anomalous events of interest. For example, a remote earthquake sensor could conserve power by limiting its measurements during normal conditions and increasing its cadence during rare earthquake events. A similar capability could improve sensor platforms traversing a fixed trajectory, such as an exploration rover transect or a deep space flyby. These agents can adapt observation times to improve sample coverage during moments of rapid change. An adaptive sampling approach couples sensor autonomy, instrument interpretation, and sampling. The challenge is addressed as an active learning problem, which already has extensive theoretical treatment in the statistics and machine learning literature. A statistical Gaussian process (GP) model is employed to guide sample decisions that maximize information gain. Nonstationary (e.g., time-varying) covariance relationships permit the system to represent and track local anomalies, in contrast with current GP approaches. Most common GP models
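The core of the information-gain criterion is choosing the next sample where the GP posterior variance is largest. A minimal stationary-kernel sketch (illustrative lengthscale and noise, not the paper's nonstationary model) is:

```python
import numpy as np

def rbf(a, b, ell=3.0, sigma=1.0):
    """Squared-exponential (stationary) covariance between two time grids."""
    d = np.subtract.outer(a, b)
    return sigma ** 2 * np.exp(-0.5 * (d / ell) ** 2)

def next_sample_time(t_obs, candidates, noise=1e-3):
    """Pick the candidate time with the largest GP posterior variance,
    i.e. the most informative next measurement under the model."""
    K = rbf(t_obs, t_obs) + noise * np.eye(len(t_obs))
    Ks = rbf(candidates, t_obs)
    # posterior variance: k(x, x) - k(x, X) K^{-1} k(X, x)
    var = rbf(candidates, candidates).diagonal() - np.einsum(
        'ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
    return candidates[int(np.argmax(var))]

t_obs = np.array([0.0, 1.0, 2.0, 9.0, 10.0])       # samples cluster at the ends
candidates = np.arange(0.0, 11.0, 1.0)
t_next = next_sample_time(t_obs, candidates)        # lands in the 3..8 gap
```

A nonstationary covariance would additionally let the lengthscale shrink around a detected anomaly, concentrating future samples there instead of spreading them uniformly.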
Established time series measure occurrence and frequency of episodic events.
NASA Astrophysics Data System (ADS)
Pebody, Corinne; Lampitt, Richard
2015-04-01
Episodic flux events occur in open oceans. Time series making measurements over significant time scales are one of the few methods that can capture these events and compare their impact with 'normal' flux. Seemingly rare events may be significant on local scales, but without the ability to measure the extent of flux on spatial and temporal scales, combined with the frequency of occurrence, it is difficult to constrain their impact. The Porcupine Abyssal Plain Sustained Observatory (PAP-SO) in the Northeast Atlantic (49 °N 16 °W, 5000 m water depth) has measured particle flux since 1989 and zooplankton swimmers since 2000. Sediment traps at 3000 m and 100 m above bottom collect material year round, and we have identified close links between zooplankton and particle flux. Some of these larger animals, for example Diacria trispinosa, make a significant contribution to carbon flux through episodic flux events. D. trispinosa is a euthecosome mollusc which occurs in the Northeast Atlantic, though the PAP-SO is towards the northern limit of its distribution. Pteropods comprise an aragonite shell containing the soft body parts, except for the muscular foot, which extends beyond the mouth of the living animal. Pteropods, both live-on-entry animals and empty shells, are found year round in the 3000 m trap. Generally their abundance varies with particle flux, but within that general pattern there are episodic events in which significant numbers of these animals, containing both organic and inorganic carbon, are captured at depth and could therefore be defined as contributing to export flux. Whether the pulse of animals results from the life cycle of D. trispinosa or from the physics of the water column is unclear, but the scope of the PAP-SO enables us not only to collect these animals but to examine them in parallel with the biogeochemical and physical elements measured by the
Acoustic thermometry time series in the North Pacific
NASA Astrophysics Data System (ADS)
Dushaw, B. D.; Howe, B. M.; Mercer, J. A.; Worcester, P. F.; NPAL Group*
2002-12-01
Acoustic measurements of large-scale, depth-averaged temperatures are continuing in the North Pacific as a follow on to the Acoustic Thermometry of Ocean Climate (ATOC) project. An acoustic source is located just north of Kauai. It transmits to six receivers to the east at 1-4-Mm ranges and one receiver to the northwest at about 4-Mm range. The transmission schedule is six times per day at four-day intervals. The time series were obtained from 1998 through 1999 and, after a two-year interruption because of permitting issues, began again in January 2002 to continue for at least another five years. The intense mesoscale thermal variability around Hawaii is evident in all time series; this variability is much greater than that observed near the California coast. The paths to the east, particularly those paths to the California coast, show cooling this year relative to the earlier data. The path to the northwest shows a modest warming. The acoustic rays sample depths below the mixed layer near Hawaii and to the surface as they near the California coast or extend north of the sub-arctic front. The temperatures measured acoustically are compared with those inferred from TOPEX altimetry, ARGO float data, and with ECCO (Estimating the Circulation and Climate of the Ocean) model output. This on-going data collection effort, to be augmented over the next years with a more complete observing array, can be used for, e.g., separating whole-basin climate change from low-mode spatial variability such as the Pacific Decadal Oscillation (PDO). [*NPAL (North Pacific Acoustic Laboratory) Group: J. A. Colosi, B. D. Cornuelle, B. D. Dushaw, M. A. Dzieciuch, B. M. Howe, J. A. Mercer, R. C. Spindel, and P. F. Worcester. Work supported by the Office of Naval Research.
United States Forest Disturbance Trends Observed Using Landsat Time Series
NASA Technical Reports Server (NTRS)
Masek, Jeffrey G.; Goward, Samuel N.; Kennedy, Robert E.; Cohen, Warren B.; Moisen, Gretchen G.; Schleeweis, Karen; Huang, Chengquan
2013-01-01
Disturbance events strongly affect the composition, structure, and function of forest ecosystems; however, existing U.S. land management inventories were not designed to monitor disturbance. To begin addressing this gap, the North American Forest Dynamics (NAFD) project has examined a geographic sample of 50 Landsat satellite image time series to assess trends in forest disturbance across the conterminous United States for 1985-2005. The geographic sample design used a probability-based scheme to encompass major forest types and maximize geographic dispersion. For each sample location, disturbance was identified in the Landsat series using the Vegetation Change Tracker (VCT) algorithm. The NAFD analysis indicates that, on average, 2.77 Mha/yr of forest was disturbed annually, representing 1.09%/yr of US forestland. These satellite-based national disturbance rate estimates tend to be lower than those derived from land management inventories, reflecting both methodological and definitional differences. In particular, the VCT approach used with a biennial time step has limited sensitivity to low-intensity disturbances. Unlike prior satellite studies, our biennial forest disturbance rates vary by nearly a factor of two between high and low years. High western US disturbance rates were associated with active fire years and insect activity, while variability in the east is more strongly related to harvest rates in managed forests. We note that generating a geographic sample based on representing forest type and variability may be problematic, since the spatial pattern of disturbance does not necessarily correlate with forest type. We also find that the prevalence of diffuse, non-stand-clearing disturbance in US forests makes the application of a biennial geographic sample problematic. Future satellite-based studies of disturbance at regional and national scales should focus on wall-to-wall analyses with an annual time step for improved accuracy.
12 years time series of SPOT/VEGETATION biophysical variables
NASA Astrophysics Data System (ADS)
Pacholczyk, P.; Makhmara, H.; Lacaze, R.
2012-04-01
geoland2 is the FP7 project that intends to prepare, validate and demonstrate pre-operational service chains and products of the GMES Land Monitoring Service. The architecture of geoland2 is made up of 3 Core Mapping Services (CMS) providing "basic" land products to 7 Core Information Services (CIS) serving various applications in spatial planning, water quality, forest monitoring, agriculture and food security, land carbon monitoring, and natural resources monitoring. We focus here on the BioPar CMS products related to soil and vegetation variables: the surface albedo, the Leaf Area Index (LAI), the Fraction of green Vegetation Cover (FCover), the Fraction of Absorbed Photosynthetically Active Radiation (FAPAR) and the Normalized Difference Vegetation Index (NDVI). These products are derived from SPOT/VEGETATION sensor data and are currently distributed on the geoland2 portal (http://www.geoland2.eu). During the last year, the French Space Agency (CNES) has processed the 12 years of VGT archive data and generated a long-term time series of biophysical variables, from 1999 to 2010. Since 2011, production has been running continuously at VITO (Belgium). The products provide global coverage with a spatial resolution of 1 km and a temporal resolution of 10 days. CNES is currently processing this archive to produce a climatology: the vegetation variables are averaged over the 12 years to obtain a reference with a 10-day step. In the coming months, the SPOT/VGT time series will be completed with consistent LAI, FAPAR and FCover products derived from the AVHRR long-term data archive covering the period from 1982 to 2000. This 30-year time series will provide a unique view of the evolution of ecosystems due to natural changes or human pressure. The poster will briefly describe the organization set up to build the BioPar CMS and the product portfolio. The emphasis will then be put on the content of the 12-year archive of vegetation products
Assimilation of LAI time-series in crop production models
NASA Astrophysics Data System (ADS)
Kooistra, Lammert; Rijk, Bert; Nannes, Louis
2014-05-01
Agriculture is a large consumer of freshwater, nutrients and land worldwide. Spatially explicit agricultural management activities (e.g., fertilization, irrigation) could significantly improve efficiency in resource use. In previous studies and operational applications, remote sensing has been shown to be a powerful method for spatio-temporal monitoring of actual crop status. As a next step, yield forecasting by assimilating remote sensing based plant variables in crop production models would improve agricultural decision support both at the farm and field level. In this study we investigated the potential of remote sensing based Leaf Area Index (LAI) time-series assimilated in the crop production model LINTUL to improve yield forecasting at field level. The effect of assimilation method and amount of assimilated observations was evaluated. The LINTUL-3 crop production model was calibrated and validated for a potato crop on two experimental fields in the south of the Netherlands. A range of data sources (e.g., in-situ soil moisture and weather sensors, destructive crop measurements) was used for calibration of the model for the experimental field in 2010. LAI from Cropscan field radiometer measurements and actual LAI measured with the LAI-2000 instrument were used as input for the LAI time-series. The LAI time-series were assimilated in the LINTUL model and validated for a second experimental field on which potatoes were grown in 2011. Yield in 2011 was simulated with an R2 of 0.82 when compared with field measured yield. Furthermore, we analysed the potential of assimilation of LAI into the LINTUL-3 model through the 'updating' assimilation technique. The deviation between measured and simulated yield decreased from 9371 kg/ha to 8729 kg/ha when assimilating weekly LAI measurements in the LINTUL model over the season of 2011. LINTUL-3 furthermore shows the main growth reducing factors, which are useful for farm decision support. The combination of crop models and sensor
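The 'updating' assimilation technique (direct insertion of observed LAI into the running model state) can be illustrated with a toy growth model; the logistic growth function below is a hypothetical stand-in for LINTUL, not the actual model:

```python
import numpy as np

def grow(lai, rate=0.08, lai_max=6.0):
    """Toy logistic LAI growth step (a stand-in for a crop model like LINTUL)."""
    return lai + rate * lai * (1.0 - lai / lai_max)

# Synthetic "truth" that grows faster than the (biased) model, plus weekly observations.
days = 120
truth = np.empty(days); truth[0] = 0.1
for t in range(1, days):
    truth[t] = grow(truth[t - 1], rate=0.10)
obs = {t: truth[t] for t in range(7, days, 7)}   # weekly LAI observations

# Open-loop model run vs. 'updating' assimilation (direct insertion of observed LAI).
open_loop = np.empty(days); open_loop[0] = 0.1
assim = np.empty(days); assim[0] = 0.1
for t in range(1, days):
    open_loop[t] = grow(open_loop[t - 1])
    assim[t] = obs[t] if t in obs else grow(assim[t - 1])

err_open = np.abs(open_loop - truth).mean()
err_assim = np.abs(assim - truth).mean()
```

Between observation dates the model drifts freely, so the residual error of the assimilation run reflects the observation interval, mirroring the study's finding that weekly LAI updates reduce the yield deviation.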
Aerosol Climate Time Series in ESA Aerosol_cci
NASA Astrophysics Data System (ADS)
Popp, Thomas; de Leeuw, Gerrit; Pinnock, Simon
2016-04-01
Within the ESA Climate Change Initiative (CCI), Aerosol_cci (2010-2017) conducts intensive work to improve algorithms for the retrieval of aerosol information from European sensors. Meanwhile, full-mission time series of 2 GCOS-required aerosol parameters have been completely validated and released: Aerosol Optical Depth (AOD) from the dual-view ATSR-2 / AATSR radiometers (3 algorithms, 1995-2012), and stratospheric extinction profiles from the star-occultation GOMOS spectrometer (2002-2012). Additionally, a 35-year multi-sensor time series of the qualitative Absorbing Aerosol Index (AAI), together with sensitivity information and an AAI model simulator, is available. Complementary aerosol properties requested by GCOS are in a "round robin" phase, where various algorithms are inter-compared: fine-mode AOD, mineral dust AOD (from the thermal IASI spectrometer, but also from ATSR instruments and the POLDER sensor), absorption information and aerosol layer height. As a quasi-reference for validation in a few selected regions with sparse ground-based observations, the multi-pixel GRASP algorithm for the POLDER instrument is used. Validation of the first dataset versions (vs. AERONET, MAN) and inter-comparison to other satellite datasets (MODIS, MISR, SeaWiFS) proved the high quality of the available datasets, comparable to other satellite retrievals, and revealed needs for algorithm improvement (for example for higher AOD values) which were taken into account for a reprocessing. The datasets contain pixel-level uncertainty estimates which were also validated and improved in the reprocessing. For the three ATSR algorithms the use of an ensemble method was tested. The paper will summarize and discuss the status of dataset reprocessing and validation. The focus will be on the ATSR, GOMOS and IASI datasets. Pixel-level uncertainty validation will be summarized and discussed, including unknown components and their potential usefulness and limitations. Opportunities for time series extension
Nonlinear time-series analysis of Hyperion's lightcurves
NASA Astrophysics Data System (ADS)
Tarnopolski, M.
2015-06-01
Hyperion is a satellite of Saturn that was predicted to remain in a chaotic rotational state. This was confirmed to some extent by the Voyager 2 and Cassini series of images and by some ground-based photometric observations. The aim of this article is to explore the conditions that potential observations should meet in order to estimate a maximal Lyapunov Exponent (mLE), which, when positive, is an indicator of chaos and allows it to be characterised quantitatively. Lightcurves existing in the literature as well as numerical simulations are examined using standard tools of chaos theory. It is found that existing datasets are too short and undersampled to detect a positive mLE, although its presence is not rejected. Analysis of simulated lightcurves leads to the assertion that observations from one site should be performed over a year-long period to detect a positive mLE, if present, in a reliable way. Another approach would be to use 2-3 telescopes spread around the world so that observations are distributed more uniformly. This may be achieved without disrupting other observational projects being conducted. The necessity for the time series to be stationary is strongly stressed.
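As a minimal illustration of what a positive mLE means (on a clean simulated series, not on real, noisy lightcurves, which would require phase-space embedding), the fully chaotic logistic map has the known exponent ln 2, recoverable from the tangent-map average:

```python
import numpy as np

# Maximal Lyapunov exponent of the fully chaotic logistic map x -> 4x(1-x),
# estimated from an iterated time series as lambda = <ln |f'(x_n)|>,
# with f'(x) = 4 - 8x.  The analytic value is ln 2 ~= 0.693 > 0 (chaos).
x = 0.3
logs = []
for n in range(100_000):
    x = 4.0 * x * (1.0 - x)
    if n > 1000:                         # discard the transient
        logs.append(np.log(abs(4.0 - 8.0 * x)))
mle = float(np.mean(logs))
```

A positive value of `mle` is the quantitative chaos indicator the abstract refers to; for short, undersampled observational series this average simply fails to converge.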
Continuous representations of time-series gene expression data.
Bar-Joseph, Ziv; Gerber, Georg K; Gifford, David K; Jaakkola, Tommi S; Simon, Itamar
2003-01-01
We present algorithms for time-series gene expression analysis that permit the principled estimation of unobserved time points, clustering, and dataset alignment. Each expression profile is modeled as a cubic spline (piecewise polynomial) that is estimated from the observed data and every time point influences the overall smooth expression curve. We constrain the spline coefficients of genes in the same class to have similar expression patterns, while also allowing for gene specific parameters. We show that unobserved time points can be reconstructed using our method with 10-15% less error when compared to previous best methods. Our clustering algorithm operates directly on the continuous representations of gene expression profiles, and we demonstrate that this is particularly effective when applied to nonuniformly sampled data. Our continuous alignment algorithm also avoids difficulties encountered by discrete approaches. In particular, our method allows for control of the number of degrees of freedom of the warp through the specification of parameterized functions, which helps to avoid overfitting. We demonstrate that our algorithm produces stable low-error alignments on real expression data and further show a specific application to yeast knock-out data that produces biologically meaningful results.
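A minimal sketch of estimating an unobserved time point with a cubic spline (SciPy's generic spline, not the paper's class-constrained spline model) on nonuniformly sampled data:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Nonuniformly sampled synthetic expression profile (values are illustrative).
t_obs = np.array([0.0, 2.0, 3.0, 7.0, 10.0, 15.0])
expr = np.sin(t_obs / 3.0) + 1.0

# Fit a smooth piecewise-cubic curve through the observations and use it to
# reconstruct an unobserved time point at t = 5.
spline = CubicSpline(t_obs, expr)
estimate = float(spline(5.0))
truth = np.sin(5.0 / 3.0) + 1.0
```

Because every observation influences the smooth curve, the reconstruction remains reasonable even with the irregular spacing above, which is the property the clustering and alignment algorithms exploit.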
VARTOOLS: A program for analyzing astronomical time-series data
NASA Astrophysics Data System (ADS)
Hartman, J. D.; Bakos, G. Á.
2016-10-01
This paper describes the VARTOOLS program, which is an open-source command-line utility, written in C, for analyzing astronomical time-series data, especially light curves. The program provides a general-purpose set of tools for processing light curves including signal identification, filtering, light curve manipulation, time conversions, and modeling and simulating light curves. Some of the routines implemented include the Generalized Lomb-Scargle periodogram, the Box-Least Squares transit search routine, the Analysis of Variance periodogram, the Discrete Fourier Transform including the CLEAN algorithm, the Weighted Wavelet Z-Transform, light curve arithmetic, linear and non-linear optimization of analytic functions including support for Markov Chain Monte Carlo analyses with non-trivial covariances, characterizing and/or simulating time-correlated noise, and the TFA and SYSREM filtering algorithms, among others. A mechanism is also provided for incorporating a user's own compiled processing routines into the program. VARTOOLS is designed especially for batch processing of light curves, including built-in support for parallel processing, making it useful for large time-domain surveys such as searches for transiting planets. Several examples are provided to illustrate the use of the program.
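A period search in the spirit of VARTOOLS' periodogram routines can be sketched with SciPy (this uses the classic Lomb-Scargle periodogram, not VARTOOLS itself, and a synthetic unevenly sampled light curve):

```python
import numpy as np
from scipy.signal import lombscargle

# Synthetic unevenly sampled light curve with a 2.5-day period.
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 30.0, 200))              # observation times (days)
period_true = 2.5
y = np.sin(2 * np.pi * t / period_true) + 0.2 * rng.normal(size=t.size)

# Scan trial periods; lombscargle expects angular frequencies and
# mean-subtracted data.
periods = np.linspace(1.0, 10.0, 2000)
ang_freqs = 2 * np.pi / periods
power = lombscargle(t, y - y.mean(), ang_freqs)
best_period = periods[np.argmax(power)]
```

Batch processing, as in VARTOOLS, would simply loop this signal-identification step over many light curves.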
SPITZER IRAC PHOTOMETRY FOR TIME SERIES IN CROWDED FIELDS
Novati, S. Calchi; Beichman, C.; Gould, A.; Fausnaugh, M.; Gaudi, B. S.; Pogge, R. W.; Wibking, B.; Zhu, W.; Poleski, R.; Yee, J. C.; Bryden, G.; Henderson, C. B.; Shvartzvald, Y.; Carey, S.; Udalski, A.; Pawlak, M.; Szymański, M. K.; Skowron, J.; Mróz, P.; Kozłowski, S.; Collaboration: Spitzer team; OGLE group; and others
2015-12-01
We develop a new photometry algorithm that is optimized for the Infrared Array Camera (IRAC) Spitzer time series in crowded fields and that is particularly adapted to faint or heavily blended targets. We apply this to the 170 targets from the 2015 Spitzer microlensing campaign and present the results of three variants of this algorithm in an online catalog. We present detailed accounts of the application of this algorithm to two difficult cases, one very faint and the other very crowded. Several of Spitzer's instrumental characteristics that drive the specific features of this algorithm are shared by Kepler and WFIRST, implying that these features may prove to be a useful starting point for algorithms designed for microlensing campaigns by these other missions.
Assessment of Time Series Complexity Using Improved Approximate Entropy
NASA Astrophysics Data System (ADS)
Kong, De-Ren; Xie, Hong-Bo
2011-09-01
Approximate entropy (ApEn), a measure quantifying complexity and/or regularity, is believed to be an effective method for analyzing diverse settings. However, the similarity definition of vectors based on the Heaviside function may cause problems for the validity and accuracy of ApEn. To overcome these problems, an improved approximate entropy (iApEn) based on the sigmoid function is proposed. The performance of iApEn is tested on independent identically distributed (IID) Gaussian noise, the MIX stochastic model, the Rössler map, the logistic map, and the high-dimensional Mackey-Glass oscillator. The results show that iApEn is superior to ApEn in several aspects, including better relative consistency, freedom in parameter selection, robustness to noise, and weaker dependence on record length when characterizing time series with different complexities.
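For reference, the classic ApEn with the Heaviside similarity kernel (the hard threshold that iApEn replaces with a sigmoid weight) can be sketched as follows; parameter choices are the conventional m = 2, r = 0.2 SD:

```python
import numpy as np

def apen(x, m=2, r=None):
    """Classic approximate entropy with the Heaviside similarity kernel.
    (iApEn would replace the hard indicator below with a sigmoid weight.)"""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()

    def phi(m):
        emb = np.lib.stride_tricks.sliding_window_view(x, m)      # embedded vectors
        dist = np.abs(emb[:, None, :] - emb[None, :, :]).max(-1)  # Chebyshev distance
        c = (dist <= r).mean(axis=1)                              # Heaviside kernel
        return np.log(c).mean()

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(2)
regular_apen = apen(np.sin(np.linspace(0, 20 * np.pi, 500)))  # predictable signal
noise_apen = apen(rng.normal(size=500))                        # white noise
```

A regular signal scores much lower than noise; the hard `dist <= r` cutoff is exactly where small parameter or noise perturbations can flip similarity decisions, motivating the sigmoid variant.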
Optimizing functional network representation of multivariate time series.
Zanin, Massimiliano; Sousa, Pedro; Papo, David; Bajo, Ricardo; García-Prieto, Juan; del Pozo, Francisco; Menasalvas, Ernestina; Boccaletti, Stefano
2012-01-01
By combining complex network theory and data mining techniques, we provide objective criteria for optimization of the functional network representation of generic multivariate time series. In particular, we propose a method for the principled selection of the threshold value for functional network reconstruction from raw data, and for proper identification of the network's indicators that unveil the most discriminative information on the system for classification purposes. We illustrate our method by analysing networks of functional brain activity of healthy subjects, and patients suffering from Mild Cognitive Impairment, an intermediate stage between the expected cognitive decline of normal aging and the more pronounced decline of dementia. We discuss extensions of the scope of the proposed methodology to network engineering purposes, and to other data mining tasks.
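The basic construction being optimized (correlate the channels, then threshold the correlation matrix into a functional network) can be sketched as follows; the threshold value here is illustrative, not the paper's principled, data-driven choice:

```python
import numpy as np

# Synthetic 8-channel recording: the first 4 channels share a common signal,
# the last 4 are independent noise.
rng = np.random.default_rng(3)
common = rng.normal(size=500)
data = np.vstack([common + 0.5 * rng.normal(size=500) for _ in range(4)]
                 + [rng.normal(size=500) for _ in range(4)])

# Functional network: link two channels when |correlation| exceeds a threshold.
corr = np.corrcoef(data)
tau = 0.5                                              # illustrative threshold
adj = (np.abs(corr) >= tau) & ~np.eye(8, dtype=bool)

# Link density is one simple indicator of how tau prunes the network.
density = adj.sum() / (8 * 7)
```

The paper's contribution is choosing `tau` (and the discriminative network indicators) objectively rather than by convention, since the resulting topology depends strongly on this value.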
Correlation filtering in financial time series (Invited Paper)
NASA Astrophysics Data System (ADS)
Aste, T.; Di Matteo, Tiziana; Tumminello, M.; Mantegna, R. N.
2005-05-01
We apply a method to filter relevant information from the correlation coefficient matrix by extracting a network of relevant interactions. This method succeeds in generating networks with the same hierarchical structure as the Minimum Spanning Tree but containing a larger number of links, resulting in a richer network topology that allows loops and cliques. In Tumminello et al.,1 we have shown that this method, applied to a financial portfolio of 100 stocks in the USA equity markets, is quite efficient at filtering relevant information about the clustering of the system and its hierarchical structure, both for the whole system and within each cluster. In particular, we have found that triangular loops and 4-element cliques have important and significant relations with the market structure and properties. Here we apply this filtering procedure to the analysis of correlation in two different kinds of interest rate time series (16 Eurodollars and 34 US interest rates).
Visual analytics for model selection in time series analysis.
Bögl, Markus; Aigner, Wolfgang; Filzmoser, Peter; Lammarsch, Tim; Miksch, Silvia; Rind, Alexander
2013-12-01
Model selection in time series analysis is a challenging task for domain experts in many application areas such as epidemiology, economy, or environmental sciences. The methodology used for this task demands a close combination of human judgement and automated computation. However, statistical software tools do not adequately support this combination through interactive visual interfaces. We propose a Visual Analytics process to guide domain experts in this task. For this purpose, we developed the TiMoVA prototype that implements this process based on user stories and iterative expert feedback on user experience. The prototype was evaluated by usage scenarios with an example dataset from epidemiology and interviews with two external domain experts in statistics. The insights from the experts' feedback and the usage scenarios show that TiMoVA is able to support domain experts in model selection tasks through interactive visual interfaces with short feedback cycles.
The Abysmal State of Abyssal Time Series: An Acoustic Challenge
NASA Astrophysics Data System (ADS)
Munk, W. H.; Worcester, P. F.; Dushaw, B. D.; Howe, B. M.; Spindel, R. C.
2001-12-01
The 20th century rise in global sea level by 18 cm has not been explained. The rise has been continuous and linear since the previous century. It cannot be predominantly the result of thermal expansion. Global ocean warming (as recently compiled by Levitus and his collaborators) started too late, is too non-linear and too weak to account for the recorded rise. It is not impossible that the global warming has been underestimated for lack of adequate observations in the southern hemisphere, and at abyssal depths. Time series of abyssal temperatures are badly lacking. Tomographic methods have the required precision, vertical resolution and horizontal integration to accomplish this task. A more likely explanation is to attribute most of the sea level rise to melting of polar ice sheets. There are two difficulties: the required melting is considerably larger than has generally been estimated, and there are serious restrictions imposed by astronomic measurements of the Earth's rotation.
Optimal estimation of recurrence structures from time series
NASA Astrophysics Data System (ADS)
beim Graben, Peter; Sellers, Kristin K.; Fröhlich, Flavio; Hutt, Axel
2016-05-01
Recurrent temporal dynamics is a phenomenon observed frequently in high-dimensional complex systems, and its detection is a challenging task. Recurrence quantification analysis utilizing recurrence plots may extract such dynamics; however, it still encounters an unsolved, pertinent problem: the optimal selection of distance thresholds for estimating the recurrence structure of dynamical systems. The present work proposes a stochastic Markov model for the recurrent dynamics that allows for the analytical derivation of a criterion for the optimal distance threshold. The goodness of fit is assessed by a utility function which assumes a local maximum for the threshold reflecting the optimal estimate of the system's recurrence structure. We validate our approach by means of the nonlinear Lorenz system and its linearized stochastic surrogates. The final application to neurophysiological time series obtained from anesthetized animals illustrates the method and reveals novel dynamic features of the underlying system. We propose the number of optimal recurrence domains as a statistic for classifying an animal's state of consciousness.
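The recurrence structure in question is encoded by the recurrence matrix R_ij = 1 when states i and j lie within a distance threshold eps; a sketch follows, where the quantile-based eps is purely illustrative, not the paper's optimal criterion:

```python
import numpy as np

# A simple periodic trajectory in the plane (four full cycles).
t = np.linspace(0, 8 * np.pi, 400)
x = np.column_stack([np.sin(t), np.cos(t)])

# Pairwise state distances and a recurrence matrix for threshold eps.
dists = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
eps = np.quantile(dists, 0.10)          # illustrative threshold choice
R = (dists <= eps).astype(int)
recurrence_rate = R.mean()
```

Everything downstream (recurrence rate, diagonal-line statistics, recurrence domains) depends on `eps`, which is why an analytically derived optimal threshold, as proposed in the abstract, matters.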
Spitzer IRAC Photometry for Time Series in Crowded Fields
NASA Astrophysics Data System (ADS)
Calchi Novati, S.; Gould, A.; Yee, J. C.; Beichman, C.; Bryden, G.; Carey, S.; Fausnaugh, M.; Gaudi, B. S.; Henderson, C. B.; Pogge, R. W.; Shvartzvald, Y.; Wibking, B.; Zhu, W.; Spitzer Team; Udalski, A.; Poleski, R.; Pawlak, M.; Szymański, M. K.; Skowron, J.; Mróz, P.; Kozłowski, S.; Wyrzykowski, Ł.; Pietrukowicz, P.; Pietrzyński, G.; Soszyński, I.; Ulaczyk, K.; OGLE Group
2015-12-01
We develop a new photometry algorithm that is optimized for the Infrared Array Camera (IRAC) Spitzer time series in crowded fields and that is particularly adapted to faint or heavily blended targets. We apply this to the 170 targets from the 2015 Spitzer microlensing campaign and present the results of three variants of this algorithm in an online catalog. We present detailed accounts of the application of this algorithm to two difficult cases, one very faint and the other very crowded. Several of Spitzer's instrumental characteristics that drive the specific features of this algorithm are shared by Kepler and WFIRST, implying that these features may prove to be a useful starting point for algorithms designed for microlensing campaigns by these other missions.
Time-series analysis of Campylobacter incidence in Switzerland.
Wei, W; Schüpbach, G; Held, L
2015-07-01
Campylobacteriosis has been the most common food-associated notifiable infectious disease in Switzerland since 1995. Contact with and ingestion of raw or undercooked broilers are considered the dominant risk factors for infection. In this study, we investigated the temporal relationship between the disease incidence in humans and the prevalence of Campylobacter in broilers in Switzerland from 2008 to 2012. We use a time-series approach to describe the pattern of the disease by incorporating seasonal effects and autocorrelation. The analysis shows that prevalence of Campylobacter in broilers, with a 2-week lag, has a significant impact on disease incidence in humans. Therefore Campylobacter cases in humans can be partly explained by contagion through broiler meat. We also found a strong autoregressive effect in human illness, and a significant increase of illness during Christmas and New Year's holidays. In a final analysis, we corrected for the sampling error of prevalence in broilers and the results gave similar conclusions.
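The 2-week lead of broiler prevalence over human incidence can be illustrated on synthetic weekly series by scanning lagged correlations; the study itself fits a full time-series regression with seasonality and autoregression, which is not reproduced here:

```python
import numpy as np

# Synthetic weekly series: human incidence follows broiler prevalence with a
# 2-week lag (seasonal signal plus noise; all values are illustrative).
rng = np.random.default_rng(5)
weeks = 260
prevalence = np.sin(2 * np.pi * np.arange(weeks) / 52) + 0.3 * rng.normal(size=weeks)
incidence = np.roll(prevalence, 2) + 0.3 * rng.normal(size=weeks)

def lagged_corr(lead, follow, lag):
    """Correlation between the leading series and the follower shifted by `lag`."""
    return np.corrcoef(lead[:-lag], follow[lag:])[0, 1]

best_lag = max(range(1, 9), key=lambda k: lagged_corr(prevalence, incidence, k))
```

Scanning lags in this way recovers the built-in 2-week delay; a proper analysis would additionally control for seasonality, holidays, and autocorrelation, as the paper does.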
Optimizing Functional Network Representation of Multivariate Time Series
NASA Astrophysics Data System (ADS)
Zanin, Massimiliano; Sousa, Pedro; Papo, David; Bajo, Ricardo; García-Prieto, Juan; Pozo, Francisco Del; Menasalvas, Ernestina; Boccaletti, Stefano
2012-09-01
By combining complex network theory and data mining techniques, we provide objective criteria for optimization of the functional network representation of generic multivariate time series. In particular, we propose a method for the principled selection of the threshold value for functional network reconstruction from raw data, and for proper identification of the network's indicators that unveil the most discriminative information on the system for classification purposes. We illustrate our method by analysing networks of functional brain activity of healthy subjects, and patients suffering from Mild Cognitive Impairment, an intermediate stage between the expected cognitive decline of normal aging and the more pronounced decline of dementia. We discuss extensions of the scope of the proposed methodology to network engineering purposes, and to other data mining tasks.
Time series analysis for minority game simulations of financial markets
NASA Astrophysics Data System (ADS)
Ferreira, Fernando F.; Francisco, Gerson; Machado, Birajara S.; Muruganandam, Paulsamy
2003-04-01
The recently introduced minority game (MG) model provides promising insights into the understanding of the evolution of prices, indices and rates in the financial markets. In this paper we perform a time series analysis of the model employing tools from statistics, dynamical systems theory and stochastic processes. Using benchmark systems and a financial index for comparison, several conclusions are obtained about the generating mechanism for this kind of evolution. The motion is deterministic, driven by occasional random external perturbations. When the interval between two successive perturbations is sufficiently large, low-dimensional chaos can be found in this regime. However, the full motion of the MG model is found to be similar to that of the first differences of the SP500 index: stochastic, nonlinear and (unit root) stationary.
SMAP Global Model Calibration Using SMOS Time-Series Observations
NASA Astrophysics Data System (ADS)
Chan, S.; Njoku, E. G.; Bindlish, R.; O'Neill, P. E.; Jackson, T. J.
2014-12-01
Within the suite of SMAP's standard data products is the Level 2 Passive Soil Moisture Product, which is derived primarily from SMAP's brightness temperature (TB) observations. The baseline retrieval algorithm uses an established microwave emission model that had been extensively tested in many past field experiments. One approach to applying the same model at a global scale with SMAP's TB observations is to use the same calibration coefficients derived from past field experiments and apply them globally. Although this approach is a simplification of reality, it resulted in accurate retrieval in several geographically limited studies. Nevertheless, significant retrieval bias may occur in areas where land cover types had not been considered in past field experiments. In this work, a time-series global model calibration approach is proposed and evaluated. One year of gridded L-band TB observations from the Soil Moisture and Ocean Salinity (SMOS) mission were used as the primary input. At each land pixel on the SMAP grid, the observed TBs were compared with the simulated TBs according to the model with unknown calibration coefficients to be determined. Because of the time-series nature of the input, the above comparison could be repeated for successive revisit dates as a system of equations until the number of known variables (TBs) exceeds the number of unknown variables (calibration coefficients and/or geophysical retrieval). Global nonlinear optimization techniques were then applied to the equations to solve for the optimal model calibration coefficients for that pixel. Following global application of this approach, soil moisture estimates were extracted and compared with in-situ ground measurement. The resulting soil moisture estimates were shown to have an accuracy comparable to what was observed in past field experiments, confirming the versatility of this global model calibration approach.
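The idea of stacking successive revisits into an overdetermined system can be sketched with a hypothetical linear toy model standing in for the real (nonlinear) microwave emission model; coefficients and noise levels are invented for illustration:

```python
import numpy as np

# Toy calibration at one pixel: TB = a + b * soil_moisture, with the two
# coefficients (a, b) playing the role of the unknown calibration constants.
rng = np.random.default_rng(6)
a_true, b_true = 270.0, -80.0                          # hypothetical values (K)
sm = rng.uniform(0.05, 0.45, size=40)                  # soil moisture, 40 revisits
tb = a_true + b_true * sm + rng.normal(0.0, 1.0, 40)   # noisy TB observations (K)

# One equation per revisit; once revisits outnumber unknowns, solve by
# least squares (the real problem uses nonlinear global optimization).
A = np.column_stack([np.ones_like(sm), sm])
(a_est, b_est), *_ = np.linalg.lstsq(A, tb, rcond=None)
```

With a nonlinear emission model the same stacking applies, but `np.linalg.lstsq` would be replaced by a nonlinear optimizer solving for the calibration coefficients per pixel.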
Controlled, distributed data management of an Antarctic time series
NASA Astrophysics Data System (ADS)
Leadbetter, Adam; Connor, David; Cunningham, Nathan; Reynolds, Sarah
2010-05-01
The Rothera Time Series (RaTS) presents over ten years of oceanographic data collected off the Antarctic Peninsula comprising conductivity, temperature, depth cast data; current meter data; and bottle sample data. The data set has been extensively analysed and is well represented in the scientific literature. However, it has never been available to browse as a coherent entity. Work has been undertaken by both the data collecting organisation (the British Antarctic Survey, BAS) and the associated national data centre (the British Oceanographic Data Centre, BODC) to describe the parameters comprising the dataset in a consistent manner. To this end, each data point in the RaTS dataset has now been ascribed a parameter usage term, selected from the appropriate controlled vocabulary of the Natural Environment Research Council's Data Grid (NDG). By marking up the dataset in this way the semantic richness of the NDG vocabularies is fully accessible, and the dataset can be then explored using the Global Change Master Directory keyword set, the International Standards Organisation topic categories, SeaDataNet disciplines and agreed parameter groups, and the NDG parameter discovery vocabulary. We present a single data discovery and exploration tool, a web portal which allows the user to drill down through the dataset using their chosen keyword set. The spatial coverage of the chosen data is displayed through a Google Earth web plugin. Finally, as the time series data are held at BODC and the discrete sample data held at BAS (which are separate physical locations), a mechanism has been established to provide metadata from one site to another. This takes the form of an Open Geospatial Consortium Web Map Service server at BODC feeding information into the portal hosted at BAS.
Efficient Bayesian inference for natural time series using ARFIMA processes
NASA Astrophysics Data System (ADS)
Graves, Timothy; Gramacy, Robert; Franzke, Christian; Watkins, Nicholas
2016-04-01
Many geophysical quantities, such as atmospheric temperature, water levels in rivers, and wind speeds, have shown evidence of long memory (LM). LM implies that these quantities experience non-trivial temporal memory, which potentially not only enhances their predictability, but also hampers the detection of externally forced trends. Thus, it is important to reliably identify whether or not a system exhibits LM. We present a modern and systematic approach to the inference of LM. We use the flexible autoregressive fractional integrated moving average (ARFIMA) model, which is widely used in time series analysis, and of increasing interest in climate science. Unlike most previous work on the inference of LM, which is frequentist in nature, we provide a systematic treatment of Bayesian inference. In particular, we provide a new approximate likelihood for efficient parameter inference, and show how nuisance parameters (e.g., short-memory effects) can be integrated over in order to focus on long-memory parameters and hypothesis testing more directly. We illustrate our new methodology on the Nile water level data and the central England temperature (CET) time series, with favorable comparison to the standard estimators [1]. In addition we show how the method can be used to perform joint inference of the stability exponent and the memory parameter when ARFIMA is extended to allow for alpha-stable innovations. Such models can be used to study systems where heavy tails and long range memory coexist. [1] Graves et al, Nonlin. Processes Geophys., 22, 679-700, 2015; doi:10.5194/npg-22-679-2015.
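Long memory enters ARFIMA through the fractional difference operator (1 - B)^d, whose binomial expansion yields slowly (hyperbolically) decaying weights computable by a simple recursion:

```python
import numpy as np

def frac_diff_weights(d, n):
    """Expansion weights of (1 - B)^d: w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k.
    For 0 < d < 1 the weights decay like k^(-d-1), the long-memory signature."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

w = frac_diff_weights(d=0.4, n=1000)
```

The slow, hyperbolic decay of these weights (in contrast to the geometric decay of a pure ARMA filter) is precisely what makes long-memory parameters hard to infer, motivating the approximate Bayesian likelihood described above.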
4-Year Cohort Graduation Rate: Overview
ERIC Educational Resources Information Center
Pennsylvania Department of Education, 2010
2010-01-01
Federal law requires Pennsylvania, and all other states, to transition to a new calculation method for determining high school graduation rates. Beginning in 2012, using graduation data from the Classes of 2010 and 2011, the "4-Year Cohort Graduation Rate" calculation will replace the "4-Year Leaver Graduation Rate" calculation. The new…
Time series of organochlorine pesticides from ice cores
NASA Astrophysics Data System (ADS)
Matthews, K. A.; Steig, E. J.; Hermanson, M. H.
2001-05-01
The transport to the Arctic from lower latitudes of organochlorine compounds (OCs), including PCBs and various biocides, is a major environmental health issue. Time series of OC deposition from the atmosphere are of considerable interest for establishing a historical pattern of Arctic inputs relative to uses and regulations. Deposition rates of particular compounds may remain high for some time after use has ceased because of long-term revolatilization from temperate soil and oceanic reservoirs. Obtaining time series of OCs from ice cores has been problematic due to the large sample sizes required for analysis and concerns that post-depositional loss from the firn may obscure the primary depositional signal. We report here results of OC measurements on a 38-m firn/ice core from Lomonosovfonna, Svalbard (78° 51'53"N, 17° 25'30"E, 1230 m asl). The core covers approximately the period 1930 to 2000 A.D., providing complete coverage of all commercial OC production and use. Among the notable results include a distinct peak in hexachlorocyclohexanes (HCH) and DDT concentrations in the 1960s, consistent with known usage histories. A decrease in use of technical (α -enriched) HCH in the United States after 1960 is known to have led to a decrease in the α /γ -HCH ratio since that time, which is precisely what we observe in the core, further supporting the reliability of these time series. We also find that the ratios of primary OCs to their decomposition products are also much higher, overall, than what has been observed in, for example, Arctic sediment cores. High DDT/DDE and cis/trans-chlordane ratios are indicative of recently-produced OCs. Our results suggest two important conclusions. First, that transport to the Arctic is remarkably fast, on the order of months rather than years. Second, that post-depositional decomposition of OCs in the ice core environment is negligible, at least at high-accumulation rate sites like Lomonosovfonna (>30 cm/year ice equiv). The
Blind source separation problem in GPS time series
NASA Astrophysics Data System (ADS)
Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.
2016-04-01
A critical point in the analysis of ground displacement time series, such as those recorded by space geodetic techniques, is the development of data-driven methods that allow the different sources of deformation to be discerned and characterized in the space and time domains. Multivariate statistics includes several approaches that can be considered part of data-driven methods. A widely used technique is principal component analysis (PCA), which allows us to reduce the dimensionality of the data space while keeping most of the variance of the dataset explained. However, PCA does not perform well in finding the solution to the so-called blind source separation (BSS) problem, i.e., in recovering and separating the original sources that generate the observed data. This is mainly because PCA minimizes the misfit calculated using an L2 norm (χ2), looking for a new Euclidean space where the projected data are uncorrelated. Independent component analysis (ICA) is a popular technique adopted to approach the BSS problem. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we test the use of a modified variational Bayesian ICA (vbICA) method to recover the multiple sources of ground deformation even in the presence of missing data. The vbICA method models the probability density function (pdf) of each source signal using a mix of Gaussian distributions, allowing for more flexibility in the description of the pdf of the sources with respect to standard ICA, and giving a more reliable estimate of them. Here we present its application to synthetic global positioning system (GPS) position time series, generated by simulating deformation near an active fault, including inter-seismic, co-seismic, and post-seismic signals, plus seasonal signals and noise, and an additional time-dependent volcanic source. We evaluate the ability of the PCA and ICA decomposition
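A PCA decomposition of a synthetic station-by-epoch data matrix can be sketched via the SVD; note that PCA only delivers uncorrelated components, which is exactly why the BSS problem calls for ICA-type methods (the sources and mixing below are invented):

```python
import numpy as np

# Synthetic GPS-like data: 20 stations observing a mix of two temporal
# sources (a seasonal signal and a post-seismic-style decay) plus noise.
rng = np.random.default_rng(7)
epochs = np.arange(365)
seasonal = np.sin(2 * np.pi * epochs / 365)
postseismic = np.exp(-epochs / 50.0)
mixing = rng.normal(size=(20, 2))                      # station response weights
data = mixing @ np.vstack([seasonal, postseismic])
data += 0.05 * rng.normal(size=data.shape)

# PCA via SVD of the demeaned data matrix.
demeaned = data - data.mean(axis=1, keepdims=True)
u, s, vt = np.linalg.svd(demeaned, full_matrices=False)
explained = s**2 / (s**2).sum()                        # variance per component
```

The leading components capture essentially all the variance, but each one is generally a rotated mixture of the physical sources, not the sources themselves; recovering those is the separation task vbICA addresses.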
Muscle segmentation in time series images of Drosophila metamorphosis.
Yadav, Kuleesha; Lin, Feng; Wasser, Martin
2015-01-01
In order to study genes associated with muscular disorders, we characterize the phenotypic changes in Drosophila muscle cells during metamorphosis caused by genetic perturbations. We collect in vivo images of muscle fibers during the remodeling of larval to adult muscles. In this paper, we focus on the new image processing pipeline designed to quantify the changes in shape and size of muscles. We propose a new two-step approach to muscle segmentation in time series images. First, we implement a watershed algorithm to divide the image into edge-preserving regions, and then we classify these regions into muscle and non-muscle classes on the basis of shape and intensity. The advantage of our method is twofold: first, better results are obtained because the classification of regions is constrained by the shape of the muscle cell at the previous time point; second, minimal user intervention results in faster processing times. The segmentation results are used to compare the changes in cell size between controls and reduction of the autophagy-related gene Atg9 during Drosophila metamorphosis.
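The second step of such a pipeline, classifying candidate regions by shape and intensity, can be sketched as follows (with simple threshold-plus-labelling standing in for the watershed, and purely illustrative size/intensity rules):

```python
import numpy as np
from scipy import ndimage

# Synthetic image: one large bright "muscle" region, a small bright speck,
# and a large but dim region (values are illustrative).
img = np.zeros((64, 64))
img[10:30, 10:22] = 0.9          # bright, elongated muscle-like region
img[40:44, 40:44] = 0.9          # small bright speck (non-muscle)
img[50:60, 5:15] = 0.2           # large but dim region (non-muscle)

# Label candidate regions, then keep only those passing size and
# mean-intensity criteria (stand-ins for the paper's shape constraints).
labels, n = ndimage.label(img > 0.1)
muscle_mask = np.zeros_like(img, dtype=bool)
for region in range(1, n + 1):
    pixels = labels == region
    if pixels.sum() >= 50 and img[pixels].mean() >= 0.5:
        muscle_mask |= pixels

n_muscle = ndimage.label(muscle_mask)[1]
```

In the actual pipeline the classification is additionally constrained by the muscle's shape at the previous time point, which is what stabilizes the segmentation across the series.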
Time series inversion of spectra from ground-based radiometers
NASA Astrophysics Data System (ADS)
Christensen, O. M.; Eriksson, P.
2013-07-01
Retrieving time series of atmospheric constituents from ground-based spectrometers often requires different temporal averaging depending on the altitude region in focus. This can lead to several datasets existing for one instrument, which complicates validation and comparisons between instruments. This paper puts forth a possible solution by incorporating the temporal domain into the maximum a posteriori (MAP) retrieval algorithm. The state vector is increased to include measurements spanning a time period, and the temporal correlations between the true atmospheric states are explicitly specified in the a priori uncertainty matrix. This allows the MAP method to effectively select the best temporal smoothing for each altitude, removing the need for several datasets to cover different altitudes. The method is compared to traditional averaging of spectra using a simulated retrieval of water vapour in the mesosphere. The simulations show that the method offers a significant advantage compared to the traditional method, extending the sensitivity an additional 10 km upwards without reducing the temporal resolution at lower altitudes. The method is also tested on the Onsala Space Observatory (OSO) water vapour microwave radiometer confirming the advantages found in the simulation. Additionally, it is shown how the method can interpolate data in time and provide diagnostic values to evaluate the interpolated data.
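The MAP retrieval with a temporally extended state vector reduces to standard linear algebra. Below is a minimal sketch under invented dimensions and an assumed exponential temporal correlation in the a priori covariance; the random forward model is a placeholder, not the instrument's actual forward model.

```python
import numpy as np

rng = np.random.default_rng(1)

n_alt, n_t = 10, 5                 # altitude grid points x time steps
n = n_alt * n_t                    # length of the stacked state vector

# A priori covariance: per-altitude variance with exponential temporal
# correlation, so the retrieval can smooth each altitude over time.
t = np.arange(n_t)
corr_t = np.exp(-np.abs(t[:, None] - t[None, :]) / 2.0)
Sa = np.kron(corr_t, 4.0 * np.eye(n_alt))          # (n, n)

# Illustrative linear forward model and measurement noise covariance.
K = rng.normal(size=(3 * n_t, n))
Se = 0.5 * np.eye(3 * n_t)
x_true = rng.multivariate_normal(np.zeros(n), Sa)
y = K @ x_true + rng.multivariate_normal(np.zeros(3 * n_t), Se)

# MAP solution with zero a priori mean:
#   x_hat = Sa K^T (K Sa K^T + Se)^{-1} y
G = Sa @ K.T @ np.linalg.inv(K @ Sa @ K.T + Se)    # gain matrix
x_hat = G @ y
print(x_hat.shape)
```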
Streamflow properties from time series of surface velocity and stage
Plant, W.J.; Keller, W.C.; Hayes, K.; Spicer, K.
2005-01-01
Time series of surface velocity and stage have been collected simultaneously. Surface velocity was measured using an array of newly developed continuous-wave microwave sensors. Stage was obtained from the standard U.S. Geological Survey (USGS) measurements. The depth of the river was measured several times during our experiments using sounding weights. The data clearly showed that the point of zero flow was not the bottom at the measurement site, indicating that a downstream control exists. Fathometer measurements confirmed this finding. A model of the surface velocity expected at a site having a downstream control was developed. The model showed that the standard form for the friction velocity does not apply to sites where a downstream control exists. This model fit our measured surface velocity versus stage plots very well with reasonable values of the parameters. Discharges computed using the surface velocities and measured depths matched the USGS rating curve for the site. Values of depth-weighted mean velocities derived from our data did not agree with those expected from Manning's equation due to the downstream control. These results suggest that if real-time surface velocities were available at a gauging station, unstable stream beds could be monitored. Journal of Hydraulic Engineering © ASCE.
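For reference, Manning's equation, which the authors found inapplicable under a downstream control, predicts a depth-averaged velocity from roughness, hydraulic radius, and energy slope. A sketch with illustrative channel parameters (not values from the paper):

```python
import numpy as np

def manning_velocity(n, R, S):
    """Depth-averaged velocity from Manning's equation (SI units):
    V = (1/n) * R**(2/3) * S**(1/2), with roughness coefficient n,
    hydraulic radius R (m), and energy slope S (dimensionless)."""
    return (1.0 / n) * R ** (2.0 / 3.0) * np.sqrt(S)

# Illustrative values: moderately rough natural channel.
V = manning_velocity(n=0.03, R=2.0, S=0.001)
Q = V * 20.0   # discharge (m^3/s) for an assumed 20 m^2 cross-section
print(V, Q)
```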
MaTSE: the gene expression time-series explorer
2013-01-01
Background High throughput gene expression time-course experiments provide a perspective on biological functioning recognized as having huge value for the diagnosis, treatment, and prevention of diseases. There are however significant challenges to properly exploiting this data due to its massive scale and complexity. In particular, existing techniques are found to be ill suited to finding patterns of changing activity over a limited interval of an experiment's time frame. The Time-Series Explorer (TSE) was developed to overcome this limitation by allowing users to explore their data by controlling an animated scatter-plot view. MaTSE improves and extends TSE by allowing users to visualize data with missing values, cross reference multiple conditions, highlight gene groupings, and collaborate by sharing their findings. Results MaTSE was developed using an iterative software development cycle that involved a high level of user feedback and evaluation. The resulting software combines a variety of visualization and interaction techniques which work together to allow biologists to explore their data and reveal temporal patterns of gene activity. These include a scatter-plot that can be animated to view different temporal intervals of the data, a multiple coordinated view framework to support the cross reference of multiple experimental conditions, a novel method for highlighting overlapping groups in the scatter-plot, and a pattern browser component that can be used with scatter-plot box queries to support cooperative visualization. A final evaluation demonstrated the tool's effectiveness in allowing users to find unexpected temporal patterns and the benefits of functionality such as the overlay of gene groupings and the ability to store patterns. Conclusions We have developed a new exploratory analysis tool, MaTSE, that allows users to find unexpected patterns of temporal activity in gene expression time-series data. Overall, the study acted well to demonstrate the
New insights into time series analysis. I. Correlated observations
NASA Astrophysics Data System (ADS)
Ferreira Lopes, C. E.; Cross, N. J. G.
2016-02-01
Context. The first step when investigating time varying data is the detection of any reliable changes in star brightness. This step is crucial to decreasing the processing time by reducing the number of sources processed in later, slower steps. Variability indices and their combinations have been used to identify variability patterns and to select non-stochastic variations, but the separation of true variables is hindered because of wavelength-correlated systematics of instrumental and atmospheric origin or due to possible data reduction anomalies. Aims: The main aim is to review the current inventory of correlation variability indices and measure their efficiency for selecting non-stochastic variations in photometric data. Methods: We test new and standard data-mining methods for correlated data using public time-domain data from the WFCAM Science Archive (WSA). This archive contains multi-wavelength calibration data (WFCAMCAL) for 216,722 point sources, with at least ten unflagged epochs in any of five filters (YZJHK), against which the different indices were tested. We improve the panchromatic variability indices and introduce a new set of variability indices for preselecting variable star candidates. Using the WFCAMCAL Variable Star Catalogue (WVSC1) we determine the efficiency of each variability index. Moreover, we test new insights into these indices to improve the efficiency of detection of time-series data dominated by correlated variations. Results: We propose five new variability indices that display high efficiency for the detection of variable stars. We determine the best way to select variable stars with these indices and the current tool inventory. In addition, we propose a universal analytical expression to select likely variables using the fraction of fluctuations on these indices (ffluc). The ffluc can be used as a universal way to analyse photometric data since it displays only a weak dependence on the instrument properties. The variability
NASA Technical Reports Server (NTRS)
He, Yuning
2015-01-01
Safety of unmanned aerial systems (UAS) is paramount, but the large number of dynamically changing controller parameters makes it hard to determine if the system is currently stable, and the time before loss of control if not. We propose a hierarchical statistical model using Treed Gaussian Processes to predict (i) whether a flight will be stable (success) or become unstable (failure), (ii) the time-to-failure if unstable, and (iii) time series outputs for flight variables. We first classify the current flight input into success or failure types, and then use separate models for each class to predict the time-to-failure and time series outputs. As different inputs may cause failures at different times, we have to model variable length output curves. We use a basis representation for curves and learn the mappings from input to basis coefficients. We demonstrate the effectiveness of our prediction methods on a NASA neuro-adaptive flight control system.
Traffic time series analysis by using multiscale time irreversibility and entropy
NASA Astrophysics Data System (ADS)
Wang, Xuejiao; Shang, Pengjian; Fang, Jintang
2014-09-01
Traffic systems, especially urban traffic systems, are regulated by different kinds of interacting mechanisms which operate across multiple spatial and temporal scales. Traditional approaches, such as the empirical probability distribution function and detrended fluctuation analysis, fail to account for the multiple time scales inherent in time series, which has led to different results. The role of multiscale analytical methods in traffic time series is a frontier area of investigation. In this paper, our main purpose is to introduce a new method, multiscale time irreversibility, which is helpful for extracting information from the traffic time series we studied. In addition, to analyse the complexity of traffic volume time series of Beijing Ring 2, 3, and 4 roads between workdays and weekends, covering August 18, 2012 to October 26, 2012, we also compare the results of this new method with those of the well-known multiscale entropy method. The results show that the higher the asymmetry index, the higher the traffic congestion level, and they agree with those obtained by multiscale entropy.
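One plausible reading of the multiscale time-irreversibility measure is sketched below. The coarse-graining follows the multiscale-entropy convention, and the asymmetry index used here (normalized excess of rises over falls) is an illustrative choice, not necessarily the authors' exact definition.

```python
import numpy as np

def coarse_grain(x, scale):
    """Non-overlapping window averages, as in multiscale entropy."""
    m = len(x) // scale
    return x[: m * scale].reshape(m, scale).mean(axis=1)

def asymmetry_index(x, scale):
    """Normalized excess of rises over falls in the coarse-grained
    series; zero for a time-reversible series (illustrative definition)."""
    d = np.diff(coarse_grain(x, scale))
    up, down = np.sum(d > 0), np.sum(d < 0)
    return (up - down) / max(up + down, 1)

# Sawtooth: slow rise, fast fall -> strongly time-irreversible, so the
# index stays well above zero across scales.
saw = np.tile(np.concatenate([np.linspace(0, 1, 9), [0.5]]), 50)
print([round(asymmetry_index(saw, s), 2) for s in (1, 2)])
```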
Indispensable finite time corrections for Fokker-Planck equations from time series data.
Ragwitz, M; Kantz, H
2001-12-17
The reconstruction of Fokker-Planck equations from observed time series data suffers strongly from finite sampling rates. We show that previously published results are degraded considerably by such effects. We present correction terms which yield a robust estimation of the diffusion terms, together with a novel method for one-dimensional problems. We apply these methods to time series data of local surface wind velocities, where the dependence of the diffusion constant on the state variable shows a different behavior than previously suggested.
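The naive finite-difference (Kramers-Moyal) drift estimate that the paper's correction terms are meant to improve can be sketched on a simulated Ornstein-Uhlenbeck process. All parameters are invented for the demo, and, as the paper stresses, the binned estimator below carries finite-sampling bias that the proposed corrections address.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate an Ornstein-Uhlenbeck process dx = -theta*x dt + sigma dW,
# whose true drift is D1(x) = -theta*x.
theta, sigma, dt, N = 1.0, 0.5, 0.01, 200_000
x = np.empty(N)
x[0] = 0.0
for i in range(N - 1):
    x[i + 1] = x[i] - theta * x[i] * dt + sigma * np.sqrt(dt) * rng.normal()

# Naive binned (Kramers-Moyal) estimate of the drift coefficient:
# conditional mean increment per unit time, as a function of x.
dx = np.diff(x)
bins = np.linspace(-0.8, 0.8, 17)
idx = np.digitize(x[:-1], bins)
centers, D1 = [], []
for b in range(1, len(bins)):
    m = idx == b
    if m.sum() > 100:
        centers.append(0.5 * (bins[b - 1] + bins[b]))
        D1.append(dx[m].mean() / dt)

# Fitted slope approximates -theta, up to finite-sampling bias.
slope = np.polyfit(centers, D1, 1)[0]
print(slope)
```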
Interglacial climate dynamics and advanced time series analysis
NASA Astrophysics Data System (ADS)
Mudelsee, Manfred; Bermejo, Miguel; Köhler, Peter; Lohmann, Gerrit
2013-04-01
Studying the climate dynamics of past interglacials (IGs) helps to better assess the anthropogenically influenced dynamics of the current IG, the Holocene. We select the IG portions from the EPICA Dome C ice core archive, which covers the past 800 ka, to apply methods of statistical time series analysis (Mudelsee 2010). The analysed variables are deuterium/H (indicating temperature) (Jouzel et al. 2007), greenhouse gases (Siegenthaler et al. 2005, Loulergue et al. 2008, Lüthi et al. 2008) and a model-co-derived climate radiative forcing (Köhler et al. 2010). We additionally select high-resolution sea-surface-temperature records from the marine sedimentary archive. The first statistical method, persistence time estimation (Mudelsee 2002), lets us infer the 'climate memory' property of IGs. Second, linear regression informs about long-term climate trends during IGs. Third, ramp function regression (Mudelsee 2000) is adapted to look at abrupt climate changes during IGs. We compare the Holocene with previous IGs in terms of these mathematical approaches, interpret the results in a climate context, assess uncertainties, and assess the requirements on data from old IGs for yielding results of 'acceptable' accuracy. This work receives financial support from the Deutsche Forschungsgemeinschaft (Project ClimSens within the DFG Research Priority Program INTERDYNAMIK) and the European Commission (Marie Curie Initial Training Network LINC, No. 289447, within the 7th Framework Programme). References: Jouzel J, Masson-Delmotte V, Cattani O, Dreyfus G, Falourd S, Hoffmann G, Minster B, Nouet J, Barnola JM, Chappellaz J, Fischer H, Gallet JC, Johnsen S, Leuenberger M, Loulergue L, Luethi D, Oerter H, Parrenin F, Raisbeck G, Raynaud D, Schilt A, Schwander J, Selmo E, Souchez R, Spahni R, Stauffer B, Steffensen JP, Stenni B, Stocker TF, Tison JL, Werner M, Wolff EW (2007) Orbital and millennial Antarctic climate variability over the past 800,000 years. Science 317:793. Köhler P, Bintanja R
Time series photometry of faint cataclysmic variables with a CCD
NASA Astrophysics Data System (ADS)
Abbott, Timothy Mark Cameron
1992-08-01
I describe a new hardware and software environment for the practice of time-series stellar photometry with the CCD systems available at McDonald Observatory. This instrument runs suitable CCD's in frame transfer mode and permits windowing on the CCD image to maximize the duty cycle of the photometer. Light curves may be extracted and analyzed in real time at the telescope and image data are stored for later, more thorough analysis. I describe a star tracking algorithm, which is optimized for a time series of images of the same stellar field. I explore the extraction of stellar brightness measures from these images using circular software apertures and develop a complete description of the noise properties of this technique. I show that scintillation and pixelization noise have a significant effect on high quality observations. I demonstrate that optimal sampling and profile fitting techniques are unnecessarily complex or detrimental methods of obtaining stellar brightness measures under conditions commonly encountered in time-series CCD photometry. I compare CCD's and photomultiplier tubes as detectors for time-series photometry using light curves of a variety of stars obtained simultaneously with both detectors and under equivalent conditions. A CCD can produce useful data under conditions when a photomultiplier tube cannot, and a CCD will often produce more reliable results even under photometric conditions. I present studies of the cataclysmic variables (CV's) AL Com, CP Eri, V Per, and DO Leo made using the time-series CCD photometer. AL Com is a very faint CV at high Galactic latitude and a bona fide Population II CV. Some of the properties of AL Com are similar to the dwarf nova WZ Sge and others are similar to the intermediate polar EX Hya, but overall AL Com is unlike any other well-studied cataclysmic variable. CP Eri is shown to be the fifth known interacting binary white dwarf. V Per was the first CV found to have an orbital period near the middle of the
A multiscale approach to InSAR time series analysis
NASA Astrophysics Data System (ADS)
Simons, M.; Hetland, E. A.; Muse, P.; Lin, Y. N.; Dicaprio, C.; Rickerby, A.
2008-12-01
We describe a new technique to constrain time-dependent deformation from repeated satellite-based InSAR observations of a given region. This approach, which we call MInTS (Multiscale analysis of InSAR Time Series), relies on a spatial wavelet decomposition to permit the inclusion of distance based spatial correlations in the observations while maintaining computational tractability. This approach also permits a consistent treatment of all data independent of the presence of localized holes in any given interferogram. In essence, MInTS allows one to consider all data at the same time (as opposed to one pixel at a time), thereby taking advantage of both spatial and temporal characteristics of the deformation field. In terms of the temporal representation, we have the flexibility to explicitly parametrize known processes that are expected to contribute to a given set of observations (e.g., co-seismic steps and post-seismic transients, secular variations, seasonal oscillations, etc.). Our approach also allows for the temporal parametrization to include a set of general functions (e.g., splines) in order to account for unexpected processes. We allow for various forms of model regularization using a cross-validation approach to select penalty parameters. The multiscale analysis allows us to consider various contributions (e.g., orbit errors) that may affect specific scales but not others. The methods described here are all embarrassingly parallel and suitable for implementation on a cluster computer. We demonstrate the use of MInTS using a large suite of ERS-1/2 and Envisat interferograms for Long Valley Caldera, and validate our results by comparing with ground-based observations.
Nonparametric directionality measures for time series and point process data.
Halliday, David M
2015-06-01
The need to determine the directionality of interactions between neural signals is a key requirement for analysis of multichannel recordings. Approaches most commonly used are parametric, typically relying on autoregressive models. A number of concerns have been expressed regarding parametric approaches; thus there is a need to consider alternatives. We present an alternative nonparametric approach for construction of directionality measures for bivariate random processes. The method combines time and frequency domain representations of bivariate data to decompose the correlation by direction. Our framework generates two sets of complementary measures: a set of scalar measures, which decompose the total product moment correlation coefficient summatively into three terms by direction, and a set of functions, which decompose the coherence summatively at each frequency into three terms by direction: forward direction, reverse direction and instantaneous interaction. It can be undertaken as an addition to a standard bivariate spectral and coherence analysis, and applied to either time series or point-process (spike train) data or mixtures of the two (hybrid data). In this paper, we demonstrate application to spike train data using simulated cortical neurone networks and application to experimental data from isolated muscle spindle sensory endings subject to random efferent stimulation. PMID:25958923
Advanced tools for astronomical time series and image analysis
NASA Astrophysics Data System (ADS)
Scargle, Jeffrey D.
The algorithms described here, which I have developed for applications in X-ray and γ-ray astronomy, will hopefully be of use in other ways, perhaps aiding in the exploration of modern astronomy's data cornucopia. The goal is to describe principled approaches to some ubiquitous problems, such as detection and characterization of periodic and aperiodic signals, estimation of time delays between multiple time series, and source detection in noisy images with noisy backgrounds. The latter problem is related to detection of clusters in data spaces of various dimensions. A goal of this work is to achieve a unifying view of several related topics: signal detection and characterization, cluster identification, classification, density estimation, and multivariate regression. In addition to being useful for analysis of data from space-based and ground-based missions, these algorithms may be a basis for a future automatic science discovery facility, and in turn provide analysis tools for the Virtual Observatory. This chapter has ties to those by Larry Bretthorst, Tom Loredo, Alanna Connors, Fionn Murtagh, Jim Berger, David van Dyk, Vicent Martinez & Enn Saar.
Urban Area Monitoring using MODIS Time Series Data
NASA Astrophysics Data System (ADS)
Devadiga, S.; Sarkar, S.; Mauoka, E.
2015-12-01
Growing urban sprawl and its impact on global climate due to urban heat island effects has been an active area of research over the recent years. This is especially significant in light of rapid urbanization that is happening in some of the fastest-developing nations across the globe. But so far study of urban area growth has been largely restricted to local and regional scales, using high to medium resolution satellite observations taken at distinct time periods. In this presentation we propose a new approach to detect and monitor urban area expansion using long time series of MODIS data. This work characterizes data points using a vector of several annual metrics computed from the MODIS 8-day and 16-day composite L3 data products, at 250 m resolution and over several years, and then uses a vector angle mapping classifier to detect and segment the urban area. The classifier is trained using a set of training points obtained from a reference vector point and polygon pre-filtered using the MODIS VI product. This work gains additional significance, given that, despite unprecedented urban growth since 2000, the area covered by the urban class in the MODIS Global Land Cover (MCD12Q1, MCDLCHKM and MCDLC1KM) product hasn't changed since the launch of Terra and Aqua. The proposed approach was applied to delineate the urban area around several cities in Asia known to have maximum growth in the last 15 years. Results were verified using high resolution Landsat data.
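The vector-angle classification step can be sketched as a spectral-angle-style comparison of per-pixel metric vectors. The metric names and values below are hypothetical, invented purely to illustrate the geometry.

```python
import numpy as np

def vector_angle(a, b):
    """Angle (radians) between two metric vectors, as used by
    spectral-angle-style classifiers: invariant to overall scaling."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

# Illustrative annual metrics per pixel (e.g. mean, max, amplitude of a
# vegetation index); the reference vector would come from training points.
urban_reference = np.array([0.15, 0.25, 0.05])
pixel_urban = np.array([0.30, 0.50, 0.10])   # same shape, brighter
pixel_forest = np.array([0.40, 0.80, 0.45])

theta_u = vector_angle(urban_reference, pixel_urban)
theta_f = vector_angle(urban_reference, pixel_forest)
# The urban-like pixel is closer in angle to the reference, so a simple
# angle threshold separates the two classes.
print(theta_u < theta_f)
```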
Time-series models for border inspection data.
Decrouez, Geoffrey; Robinson, Andrew
2013-12-01
We propose a new modeling approach for inspection data that provides a more useful interpretation of the patterns of detections of invasive pests, using cargo inspection as a motivating example. Methods that are currently in use generally classify shipments according to their likelihood of carrying biosecurity risk material, given available historical and contextual data. Ideally, decisions regarding which cargo containers to inspect should be made in real time, and the models used should be able to focus efforts when the risk is higher. In this study, we propose a dynamic approach that treats the data as a time series in order to detect periods of high risk. A regulatory organization will respond differently to evidence of systematic problems than evidence of random problems, so testing for serial correlation is of major interest. We compare three models that account for various degrees of serial dependence within the data. First is the independence model where the prediction of the arrival of a risky shipment is made solely on the basis of contextual information. We also consider a Markov chain that allows dependence between successive observations, and a hidden Markov model that allows further dependence on past data. The predictive performance of the models is then evaluated using ROC and leakage curves. We illustrate this methodology on two sets of real inspection data. PMID:23682814
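The contrast between the independence model and the Markov chain can be sketched by fitting transition probabilities to a binary inspection sequence. The sticky-state simulation below is invented to mimic "periods of high risk"; none of the numbers come from the real inspection data.

```python
import numpy as np

def fit_markov(seq):
    """Maximum-likelihood transition probabilities for a binary
    (0 = clean, 1 = risky) inspection sequence."""
    seq = np.asarray(seq)
    T = np.zeros((2, 2))
    for a, b in zip(seq[:-1], seq[1:]):
        T[a, b] += 1
    return T / T.sum(axis=1, keepdims=True)

# Synthetic sequence where risky shipments cluster in time.
rng = np.random.default_rng(3)
seq, state = [], 0
for _ in range(5000):
    # Sticky dynamics: stay in the current state with probability 0.9.
    state = state if rng.random() < 0.9 else 1 - state
    seq.append(state)

T = fit_markov(seq)
# Serial correlation shows up as P(1 -> 1) well above the marginal P(1),
# which is what distinguishes systematic from random problems.
print(T[1, 1], np.mean(seq))
```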
Impact of Sensor Degradation on the MODIS NDVI Time Series
NASA Technical Reports Server (NTRS)
Wang, Dongdong; Morton, Douglas; Masek, Jeffrey; Wu, Aisheng; Nagol, Jyoteshwar; Xiong, Xiaoxiong; Levy, Robert; Vermote, Eric; Wolfe, Robert
2011-01-01
Time series of satellite data provide unparalleled information on the response of vegetation to climate variability. Detecting subtle changes in vegetation over time requires consistent satellite-based measurements. Here, we evaluated the impact of sensor degradation on trend detection using Collection 5 data from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensors on the Terra and Aqua platforms. For Terra MODIS, the impact of blue band (Band 3, 470nm) degradation on simulated surface reflectance was most pronounced at near-nadir view angles, leading to a 0.001-0.004/yr decline in Normalized Difference Vegetation Index (NDVI) under a range of simulated aerosol conditions and surface types. Observed trends MODIS NDVI over North America were consistent with simulated results, with nearly a threefold difference in negative NDVI trends derived from Terra (17.4%) and Aqua (6.7%) MODIS sensors during 2002-2010. Planned adjustments to Terra MODIS calibration for Collection 6 data reprocessing will largely eliminate this negative bias in NDVI trends over vegetation.
Impact of Sensor Degradation on the MODIS NDVI Time Series
NASA Technical Reports Server (NTRS)
Wang, Dongdong; Morton, Douglas Christopher; Masek, Jeffrey; Wu, Aisheng; Nagol, Jyoteshwar; Xiong, Xiaoxiong; Levy, Robert; Vermote, Eric; Wolfe, Robert
2012-01-01
Time series of satellite data provide unparalleled information on the response of vegetation to climate variability. Detecting subtle changes in vegetation over time requires consistent satellite-based measurements. Here, the impact of sensor degradation on trend detection was evaluated using Collection 5 data from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensors on the Terra and Aqua platforms. For Terra MODIS, the impact of blue band (Band 3, 470 nm) degradation on simulated surface reflectance was most pronounced at near-nadir view angles, leading to a 0.001-0.004 yr-1 decline in Normalized Difference Vegetation Index (NDVI) under a range of simulated aerosol conditions and surface types. Observed trends in MODIS NDVI over North America were consistent with simulated results, with nearly a threefold difference in negative NDVI trends derived from Terra (17.4%) and Aqua (6.7%) MODIS sensors during 2002-2010. Planned adjustments to Terra MODIS calibration for Collection 6 data reprocessing will largely eliminate this negative bias in detection of NDVI trends.
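The effect described above can be reproduced in miniature: a small calibration drift imposed on a stable surface produces a spurious negative NDVI trend. The drift magnitude below is chosen from the reported 0.001-0.004/yr range; everything else is synthetic.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

# Synthetic 9-year annual NDVI series for one pixel with an artificial
# sensor-degradation drift of -0.003/yr imposed on a stable surface.
years = np.arange(2002, 2011)
true_ndvi = np.full(years.size, 0.6)
degraded = true_ndvi - 0.003 * (years - years[0])

# Per-pixel trend via least squares: a spurious negative slope appears
# even though the vegetation itself did not change.
slope = np.polyfit(years - years[0], degraded, 1)[0]
print(slope)
```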
Interrupted time-series analysis: studying trends in neurosurgery.
Wong, Ricky H; Smieliauskas, Fabrice; Pan, I-Wen; Lam, Sandi K
2015-12-01
OBJECT Neurosurgery studies traditionally have evaluated the effects of interventions on health care outcomes by studying overall changes in measured outcomes over time. Yet, this type of linear analysis is limited due to lack of consideration of the trend's effects both pre- and postintervention and the potential for confounding influences. The aim of this study was to illustrate interrupted time-series analysis (ITSA) as applied to an example in the neurosurgical literature and highlight ITSA's potential for future applications. METHODS The methods used in previous neurosurgical studies were analyzed and then compared with the methodology of ITSA. RESULTS The ITSA method was identified in the neurosurgical literature as an important technique for isolating the effect of an intervention (such as a policy change or a quality and safety initiative) on a health outcome independent of other factors driving trends in the outcome. The authors determined that ITSA allows for analysis of the intervention's immediate impact on outcome level and on subsequent trends and enables a more careful measure of the causal effects of interventions on health care outcomes. CONCLUSIONS ITSA represents a significant improvement over traditional observational study designs in quantifying the impact of an intervention. ITSA is a useful statistical procedure to understand, consider, and implement as the field of neurosurgery evolves in sophistication in big-data analytics, economics, and health services research. PMID:26621420
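The segmented-regression form of ITSA described above can be sketched with ordinary least squares: an intercept, a pre-intervention trend, a level-change term, and a slope-change term. The data are simulated with a known intervention effect; none of the numbers come from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Monthly outcome with a known level drop and slope change at month 24,
# the classic interrupted time-series (segmented regression) setup.
t = np.arange(48)
post = (t >= 24).astype(float)
y = 10 + 0.2 * t - 3.0 * post - 0.1 * post * (t - 24) + rng.normal(0, 0.3, 48)

# Design matrix: intercept, pre-intervention trend, level change,
# post-intervention slope change.
X = np.column_stack([np.ones(t.size), t, post, post * (t - 24)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# beta[2] estimates the immediate impact of the intervention on the
# outcome level; beta[3] estimates its effect on the subsequent trend,
# which is exactly the decomposition a simple before/after comparison misses.
print(beta)
```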
Coastal Atmosphere and Sea Time Series (CoASTS)
NASA Technical Reports Server (NTRS)
Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Zibordi, Giuseppe; Berthon, Jean-Francoise; Doyle, John P.; Grossi, Stefania; vanderLinde, Dirk; Targa, Cristina; Alberotanza, Luigi; McClain, Charles R. (Technical Monitor)
2002-01-01
The Coastal Atmosphere and Sea Time Series (CoASTS) Project, aimed at supporting ocean color research and applications, has ensured the collection, from 1995 up to the time of publication of this document, of a comprehensive atmospheric and marine data set from an oceanographic tower located in the northern Adriatic Sea. The instruments and the measurement methodologies used to gather quantities relevant for bio-optical modeling and for the calibration and validation of ocean color sensors are described. Particular emphasis is placed on four items: (1) the evaluation of perturbation effects in radiometric data (i.e., tower-shading, instrument self-shading, and bottom effects); (2) the intercomparison of seawater absorption coefficients from in situ measurements and from laboratory spectrometric analysis on discrete samples; (3) the intercomparison of two filter techniques for in vivo measurement of particulate absorption coefficients; and (4) the analysis of repeatability and reproducibility of the most relevant laboratory measurements carried out on seawater samples (i.e., particulate and yellow substance absorption coefficients, and pigment and total suspended matter concentrations). Sample data are also presented and discussed to illustrate the typical features characterizing the CoASTS measurement site in view of supporting the suitability of the CoASTS data set for bio-optical modeling and ocean color calibration and validation.
Time series decomposition methods were applied to meteorological and air quality data and their numerical model estimates. Decomposition techniques express a time series as the sum of a small number of independent modes which hypothetically represent identifiable forcings, thereb...
Mackenzie River Delta morphological change based on Landsat time series
NASA Astrophysics Data System (ADS)
Vesakoski, Jenni-Mari; Alho, Petteri; Gustafsson, David; Arheimer, Berit; Isberg, Kristina
2015-04-01
Arctic rivers are sensitive, yet largely unexplored, river systems on which climate change will have an impact. Research has not focused in detail on the fluvial geomorphology of Arctic rivers, mainly due to the remoteness and extent of the watersheds, problems with data availability, and difficult accessibility. Nowadays, wide collaborative spatial databases in hydrology as well as extensive remote sensing datasets over the Arctic are available, and they enable improved investigation of Arctic watersheds. It is therefore also important to develop and improve methods for detecting fluvio-morphological processes from the available data. Furthermore, it is essential to reconstruct and improve the understanding of past fluvial processes in order to better understand prevailing and future fluvial processes. In this study we sum up the fluvial geomorphological change in the Mackenzie River Delta during the last ~30 years. The Mackenzie River Delta (~13,000 km2) is situated in the Northwest Territories, Canada, where the Mackenzie River enters the Beaufort Sea (Arctic Ocean) near Inuvik. The Mackenzie River Delta is a lake-rich, productive, and ecologically sensitive environment. The research objective is achieved through two sub-objectives: 1) interpretation of deltaic river channel planform change by applying Landsat time series; 2) definition of the variables that have had the greatest impact on the detected changes, by applying statistics and long hydrological time series derived from the Arctic-HYPE model (HYdrologic Predictions for Environment) developed by the Swedish Meteorological and Hydrological Institute. According to our satellite interpretation, field observations, and statistical analyses, notable spatio-temporal changes have occurred in the morphology of the river channel and delta during the past 30 years. For example, the channels have developed in braiding and sinuosity. In addition, various linkages between the studied
A Multiscale Approach to InSAR Time Series Analysis
NASA Astrophysics Data System (ADS)
Simons, M.; Hetland, E. A.; Muse, P.; Lin, Y.; Dicaprio, C. J.
2009-12-01
We describe progress in the development of MInTS (Multiscale analysis of InSAR Time Series), an approach to constructing self-consistent time-dependent deformation observations from repeated satellite-based InSAR images of a given region. MInTS relies on a spatial wavelet decomposition to permit the inclusion of distance-based spatial correlations in the observations while maintaining computational tractability. In essence, MInTS allows one to consider all data at the same time, as opposed to one pixel at a time, thereby taking advantage of both spatial and temporal characteristics of the deformation field. This approach also permits a consistent treatment of all data independent of the presence of localized holes, due to unwrapping issues, in any given interferogram. Specifically, holes are accounted for through a weighting scheme based on the extent of actual data versus the area of holes associated with any given wavelet. In terms of the temporal representation, we have the flexibility to explicitly parametrize known processes that are expected to contribute to a given set of observations (e.g., co-seismic steps and post-seismic transients, secular variations, seasonal oscillations, etc.). Our approach also allows the temporal parametrization to include a set of general functions in order to account for unexpected processes. We allow for various forms of model regularization, using a cross-validation approach to select penalty parameters. We also experiment with the use of sparsity-inducing regularization as a way to select from a large dictionary of time functions. The multiscale analysis allows us to consider various contributions (e.g., orbit errors) that may affect specific scales but not others. The methods described here are all embarrassingly parallel and suitable for implementation on a cluster computer. We demonstrate the use of MInTS using a large suite of ERS-1/2 and Envisat interferograms for Long Valley Caldera, and validate
Time series analysis as a tool for karst water management
NASA Astrophysics Data System (ADS)
Fournier, Matthieu; Massei, Nicolas; Duran, Léa
2015-04-01
Karst hydrosystems are well known for their vulnerability to turbidity due to their complex and unique characteristics, which make them very different from other aquifers. Moreover, many parameters can affect their functioning. This makes the characterization of their vulnerability difficult and calls for statistical analyses. Time series analyses of turbidity, electrical conductivity, and water discharge datasets, such as correlation and spectral analyses, have proven useful in improving our understanding of karst systems. However, the loss of information on time localization is a major drawback of these Fourier spectral methods; this problem has been overcome by the development of wavelet analysis (continuous or discrete) for hydrosystems, offering the possibility to better characterize the complex modalities of variation inherent to non-stationary processes. Nevertheless, the wavelet transform decomposes the signal onto several continuous wavelet components, which may not be appropriate for the local-time processes frequently observed in karst aquifers. More recently, a new approach associating empirical mode decomposition and the Hilbert transform was presented for hydrosystems. It allows an orthogonal decomposition of the analyzed signal and provides a more accurate estimation of changing variability scales across time for highly transient signals. This study aims to identify the natural and anthropogenic parameters which control the turbidity released at a well used for drinking water supply. The well is located in the chalk karst aquifer near the Seine river, 40 km from the Seine estuary in the western Paris Basin. At this location, tidal variations greatly affect the level of the water in the Seine. Continuous wavelet analysis of the turbidity dataset has been used to decompose the turbidity released at the well into three components: i) rain event periods, ii) pumping periods, and iii) the tidal range of the Seine river. Time-domain reconstruction by inverse wavelet transform allows
Propagation of stage measurement uncertainties to streamflow time series
NASA Astrophysics Data System (ADS)
Horner, Ivan; Le Coz, Jérôme; Renard, Benjamin; Branger, Flora; McMillan, Hilary
2016-04-01
Streamflow uncertainties due to stage measurement errors are generally overlooked in the promising probabilistic approaches that have emerged in the last decade. We introduce an original error model for propagating stage uncertainties through a stage-discharge rating curve within a Bayesian probabilistic framework. The method takes into account both rating curve uncertainty (parametric and structural errors) and stage uncertainty (systematic and non-systematic errors). Practical ways to estimate the different types of stage errors are also presented: (1) non-systematic errors due to instrument resolution and precision and to non-stationary waves, and (2) systematic errors due to gauge calibration against the staff gauge. The method is illustrated at a site where the rating-curve-derived streamflow can be compared with an accurate streamflow reference. The agreement between the two time series is overall satisfactory. Moreover, the quantification of uncertainty is also satisfactory, since the streamflow reference is compatible with the streamflow uncertainty intervals derived from the rating curve and the stage uncertainties. Illustrations from other sites are also presented. Results vary considerably depending on the site features. In some cases, streamflow uncertainty is mainly due to stage measurement errors. The results also show the importance of discriminating between systematic and non-systematic stage errors, especially for long-term flow averages. Perspectives for improving and validating the streamflow uncertainty estimates are finally discussed.
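A common way to carry out this kind of propagation is Monte Carlo sampling of the stage errors through a power-law rating curve. A minimal sketch, assuming a hypothetical rating curve Q = a(h − b)^c and illustrative error magnitudes (none of these values are from the paper):

```python
import numpy as np

# Hypothetical power-law rating curve Q = a * (h - b)^c; coefficients and
# error magnitudes below are illustrative, not from the paper.
a, b, c = 40.0, 0.10, 1.7

rng = np.random.default_rng(1)
n = 20000
h_obs = 1.25  # observed stage (m)

# Stage error model: a systematic gauge-calibration offset (shared by all
# readings from one calibration) plus independent non-systematic noise.
systematic = rng.normal(0.0, 0.01, n)        # ~1 cm calibration error
non_systematic = rng.normal(0.0, 0.005, n)   # resolution/waves, ~0.5 cm
h = h_obs + systematic + non_systematic

q = a * (h - b) ** c                          # propagate through rating curve
lo, hi = np.percentile(q, [2.5, 97.5])        # 95% streamflow interval
print("Q interval: %.1f - %.1f m3/s" % (lo, hi))
```

The full method in the paper additionally samples the rating-curve parameters themselves from their Bayesian posterior; the sketch above isolates only the stage-error part of the propagation.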
Time series analysis of Monte Carlo neutron transport calculations
NASA Astrophysics Data System (ADS)
Nease, Brian Robert
A time series based approach is applied to the Monte Carlo (MC) fission source distribution to calculate the non-fundamental mode eigenvalues of the system. The approach applies Principal Oscillation Patterns (POPs) to the fission source distribution, transforming the problem into a simple autoregressive order one (AR(1)) process. Proof is provided that the stationary MC process is linear to first order approximation, which is a requirement for the application of POPs. The autocorrelation coefficient of the resulting AR(1) process corresponds to the ratio of the desired mode eigenvalue to the fundamental mode eigenvalue. All modern k-eigenvalue MC codes calculate the fundamental mode eigenvalue, so the desired mode eigenvalue can be easily determined. The strength of this approach is contrasted against the Fission Matrix method (FMM) in terms of accuracy versus computer memory constraints. Multi-dimensional problems are considered since the approach has strong potential for use in reactor analysis, and the implementation of the method into production codes is discussed. Lastly, the appearance of complex eigenvalues is investigated and solutions are provided.
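The key quantity in this formulation is the autocorrelation coefficient of the resulting AR(1) process, which the abstract identifies with the ratio of the desired mode eigenvalue to the fundamental mode eigenvalue. A minimal sketch of estimating that coefficient from a simulated AR(1) series (the true value 0.6 is illustrative):

```python
import numpy as np

# Simulate an AR(1) process x_t = rho * x_{t-1} + e_t; in the POPs setting,
# rho plays the role of the ratio of the desired mode eigenvalue to the
# fundamental mode eigenvalue (value illustrative).
rho_true = 0.6
rng = np.random.default_rng(2)
n = 50000
e = rng.normal(0, 1, n)
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = rho_true * x[t - 1] + e[t]

# Lag-1 autocorrelation estimate (Yule-Walker solution for AR(1)).
x0 = x - x.mean()
rho_hat = np.dot(x0[1:], x0[:-1]) / np.dot(x0, x0)
print("estimated rho: %.3f" % rho_hat)
```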
Landslide monitoring using airphotos time series and GIS
NASA Astrophysics Data System (ADS)
Kavoura, Katerina; Nikolakopoulos, Konstantinos G.; Sabatakakis, Nikolaos
2014-10-01
Western Greece suffers from landslides. The term landslide covers a wide range of ground movements, such as slides, falls, flows, etc., driven mainly by gravity with the aid of many conditioning and triggering factors. Landslides provoke enormous changes to the natural and artificial relief. The annual cost of repairing the damage amounts to millions of euros. In this paper, a combined use of airphoto time series, high resolution remote sensing data, and GIS for landslide monitoring is presented. The analog and digital airphotos used covered a period of almost 70 years, from 1945 until 2012. Classical analog airphotos covered the period from 1945 to 2000, while digital airphotos and satellite images covered the 2008-2012 period. The airphotos have been orthorectified using the Leica Photogrammetry Suite. Ground control points and a high accuracy DSM were used for the orthorectification. The 2008 digital airphoto mosaic from the Greek Cadastral, with a spatial resolution of 25 cm, and the respective DSM were used as the base map for all the other data sets. The RMS error was less than 0.5 pixel. Changes to artificial constructions provoked by the landslides were digitized and then implemented in an ArcGIS database. The results are presented in this paper.
Assessing Spontaneous Combustion Instability with Nonlinear Time Series Analysis
NASA Technical Reports Server (NTRS)
Eberhart, C. J.; Casiano, M. J.
2015-01-01
Considerable interest lies in the ability to characterize the onset of spontaneous instabilities within liquid propellant rocket engine (LPRE) combustion devices. Linear techniques, such as fast Fourier transforms, various correlation parameters, and critical damping parameters, have been used at great length for over fifty years. Recently, nonlinear time series methods have been applied to deduce information pertaining to instability incipiency hidden in seemingly stochastic combustion noise. A technique commonly used in the biological sciences, known as Multifractal Detrended Fluctuation Analysis, has been extended to the combustion dynamics field and is introduced here as a data analysis approach complementary to linear ones. Going further, a modified technique is leveraged to extract artifacts of impending combustion instability that present themselves prior to growth to limit cycle amplitudes. The analysis is demonstrated on data from J-2X gas generator testing during which a distinct spontaneous instability was observed. Comparisons are made to previous work wherein the data were characterized using linear approaches. Verification of the technique is performed by examining idealized signals and comparing two separate, independently developed tools.
Innovative techniques to analyze time series of geomagnetic activity indices
NASA Astrophysics Data System (ADS)
Balasis, Georgios; Papadimitriou, Constantinos; Daglis, Ioannis A.; Potirakis, Stelios M.; Eftaxias, Konstantinos
2016-04-01
Magnetic storms are undoubtedly among the most important phenomena in space physics and also a central subject of space weather. The non-extensive Tsallis entropy has recently been introduced as an effective complexity measure for the analysis of the geomagnetic activity Dst index. The Tsallis entropy sensitively shows the complexity dissimilarity among different "physiological" (normal) and "pathological" states (intense magnetic storms). More precisely, the Tsallis entropy implies the emergence of two distinct patterns: (i) a pattern associated with intense magnetic storms, characterized by a higher degree of organization, and (ii) a pattern associated with normal periods, characterized by a lower degree of organization. Other entropy measures, such as Block Entropy, T-Complexity, Approximate Entropy, Sample Entropy, and Fuzzy Entropy, verify this result. Importantly, wavelet spectral analysis in terms of the Hurst exponent, H, also shows the existence of two different patterns: (i) a pattern associated with intense magnetic storms, characterized by fractional Brownian persistent behavior, and (ii) a pattern associated with normal periods, characterized by fractional Brownian anti-persistent behavior. Finally, we observe universality in magnetic storm and earthquake dynamics, on the basis of a modified form of the Gutenberg-Richter law for Tsallis statistics. This finding suggests a common approach to the interpretation of both phenomena in terms of the same driving physical mechanism. Signatures of discrete scale invariance in Dst time series further support this proposal.
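The Tsallis entropy referred to above has the closed form S_q = (1 − Σ p_i^q)/(q − 1) for a discrete distribution. A minimal sketch, with an illustrative entropic index q (not a value from the paper), showing that a more organized (peaked) distribution yields lower entropy, mirroring the storm-time vs. quiet-time patterns:

```python
import numpy as np

def tsallis_entropy(p, q=1.8):
    """Non-extensive Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1).
    The entropic index q = 1.8 here is illustrative, not from the paper."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# A peaked distribution (more "organized", like storm time) has lower
# entropy than a uniform one (like quiet time).
peaked = [0.85, 0.05, 0.05, 0.05]
uniform = [0.25, 0.25, 0.25, 0.25]
print(tsallis_entropy(peaked) < tsallis_entropy(uniform))  # True
```

In the actual analysis the probabilities p_i would come from histogramming Dst fluctuations within sliding windows; the two hand-picked distributions above only illustrate the ordering of the entropy values.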
Task-Driven Evaluation of Aggregation in Time Series Visualization.
Albers, Danielle; Correll, Michael; Gleicher, Michael
2014-01-01
Many visualization tasks require the viewer to make judgments about aggregate properties of data. Recent work has shown that viewers can perform such tasks effectively, for example to efficiently compare the maximums or means over ranges of data. However, this work also shows that such effectiveness depends on the designs of the displays. In this paper, we explore this relationship between aggregation task and visualization design to provide guidance on matching tasks with designs. We combine prior results from perceptual science and graphical perception to suggest a set of design variables that influence performance on various aggregate comparison tasks. We describe how choices in these variables can lead to designs that are matched to particular tasks. We use these variables to assess a set of eight different designs, predicting how they will support a set of six aggregate time series comparison tasks. A crowd-sourced evaluation confirms these predictions. These results not only provide evidence for how the specific visualizations support various tasks, but also suggest using the identified design variables as a tool for designing visualizations well suited for various types of tasks.
Chaotic time series analysis of vision evoked EEG
NASA Astrophysics Data System (ADS)
Zhang, Ningning; Wang, Hong
2010-01-01
To investigate human brain activity during aesthetic processing, pictures of a beautiful woman's face and an ugly buffoon's face were used. Twelve subjects were assigned the aesthetic processing task while the electroencephalogram (EEG) was recorded. Event-related brain potentials (ERPs) were acquired from the 32 scalp electrodes, and the ugly buffoon picture produced larger amplitudes for the N1, P2, N2, and late slow wave components. Average ERPs from the ugly buffoon picture were larger than those from the beautiful woman picture. The ERP signals show that the ugly buffoon elicits higher emotion waves than the beautiful woman's face, because of the expression on the buffoon's face. Then, chaotic time series analysis was carried out to calculate the largest Lyapunov exponent, using the small data set method, and the correlation dimension, using the G-P algorithm. The results show that the largest Lyapunov exponents of the ERP signals are greater than zero, which indicates that the ERP signals may be chaotic. The correlation dimensions from the beautiful woman picture are larger than those from the ugly buffoon picture, showing that the beautiful face can excite the brain's nerve cells. The research in this paper provides persuasive evidence for the view that the cerebrum's activity is chaotic under certain picture stimuli.
Chaos Time Series Prediction Based on Membrane Optimization Algorithms
Li, Meng; Yi, Liangzhong; Pei, Zheng; Gao, Zhisheng
2015-01-01
This paper puts forward a prediction model for chaotic time series based on a membrane computing optimization algorithm; the model simultaneously optimizes the parameters of phase space reconstruction (τ, m) and of the least squares support vector machine (LS-SVM) (γ, σ) using the membrane computing optimization algorithm. Accurately predicting the change trend of parameters in the electromagnetic environment is an important basis for spectrum management and can help decision makers adopt optimal actions. The model presented in this paper is then used to forecast the band occupancy rate of the frequency modulation (FM) broadcasting band and the interphone band. To show the applicability and superiority of the proposed model, it is compared with conventional similar models. The experimental results show that, for both single-step and multistep prediction, the proposed model performs best on three error measures: normalized mean square error (NMSE), root mean square error (RMSE), and mean absolute percentage error (MAPE). PMID:25874249
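Phase space reconstruction with delay τ and embedding dimension m, the first pair of parameters optimized in the paper, amounts to a Takens delay embedding. A minimal sketch (the logistic-map series below is purely illustrative):

```python
import numpy as np

def delay_embed(x, m, tau):
    """Phase-space reconstruction: rows are delay vectors
    [x_t, x_{t+tau}, ..., x_{t+(m-1)tau}] (Takens embedding)."""
    x = np.asarray(x, dtype=float)
    n = x.size - (m - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])

# Illustrative chaotic series: the logistic map at r = 4.
x = np.empty(1000)
x[0] = 0.3
for t in range(1, x.size):
    x[t] = 4.0 * x[t - 1] * (1.0 - x[t - 1])

X = delay_embed(x, m=3, tau=2)
print(X.shape)  # (996, 3)
```

In the paper's scheme, each row of `X` would serve as an input vector to the LS-SVM regressor, with the next sample as its target; the optimization then searches over (τ, m, γ, σ) jointly.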
A Monte Carlo Approach to Biomedical Time Series Search
Woodbridge, Jonathan; Mortazavi, Bobak; Sarrafzadeh, Majid; Bui, Alex A.T.
2016-01-01
Time series subsequence matching (or signal searching) has importance in a variety of areas in health care informatics. These areas include case-based diagnosis and treatment as well as the discovery of trends and correlations between data. Much of the traditional research in signal searching has focused on high dimensional R-NN matching. However, the results of R-NN are often small and yield minimal information gain, especially with higher dimensional data. This paper proposes a randomized Monte Carlo sampling method to broaden search criteria such that the query results are an accurate sampling of the complete result set. The proposed method is shown both theoretically and empirically to improve information gain. The number of query results is increased by several orders of magnitude over approximate exact matching schemes and falls within a Gaussian distribution. The proposed method also shows excellent performance, as the majority of the overhead added by sampling can be mitigated through parallelization. Experiments are run on both simulated and real-world biomedical datasets.
On the Fourier and Wavelet Analysis of Coronal Time Series
NASA Astrophysics Data System (ADS)
Auchère, F.; Froment, C.; Bocchialini, K.; Buchlin, E.; Solomon, J.
2016-07-01
Using Fourier and wavelet analysis, we critically re-assess the significance of our detection of periodic pulsations in coronal loops. We show that the proper identification of the frequency dependence and statistical properties of the different components of the power spectra provides a strong argument against the common practice of data detrending, which tends to produce spurious detections around the cut-off frequency of the filter. In addition, the white and red noise models built into the widely used wavelet code of Torrence & Compo cannot, in most cases, adequately represent the power spectra of coronal time series, thus also possibly causing false positives. Both effects suggest that several reports of periodic phenomena should be re-examined. The Torrence & Compo code nonetheless effectively computes rigorous confidence levels if provided with pertinent models of mean power spectra, and we describe the appropriate manner in which to call its core routines. We recall the meaning of the default confidence levels output from the code, and we propose new Monte-Carlo-derived levels that take into account the total number of degrees of freedom in the wavelet spectra. These improvements allow us to confirm that the power peaks that we detected have a very low probability of being caused by noise.
Vegetation classification in eastern China using time series NDVI images
NASA Astrophysics Data System (ADS)
Han, Guifeng; Xu, Jianhua
2007-11-01
The SPOT/VGT NDVI (S10) time series data of eastern China (1998-2005) are smoothed with two methods, the moving average and the Savitzky-Golay filter, after being downloaded from the official website of VITO. Then the monthly maximal NDVI images (93 in total) are extracted from the 279 NDVI (S10) images, and Principal Component Analysis (PCA) is applied to the 93 images. There are 3 components that each explain more than 1% of the variance: principal components 1, 2, and 3 explain respectively 93.25%, 2.77%, and 1.21% of the variance in the original 93 maximum NDVI images. Principal component 1 is interpreted as the "climate" component, while principal components 2 and 3 are interpreted as the "growth season" and "non-growth season" components, respectively. Principal components 1, 2, and 3 are composited into a 3-band color image, which is classified into 7 classes (including 18 subclasses) by ISODATA. The overall accuracy of classification in five samples is 83.6%, and the kappa index is 0.82. Finally, the unique intra-annual NDVI curve of each vegetation class is displayed.
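The smoothing step can be sketched with a centered moving average, one of the two methods named above (the NDVI series below is synthetic; for the Savitzky-Golay alternative, `scipy.signal.savgol_filter` provides an off-the-shelf implementation):

```python
import numpy as np

# Synthetic 10-day NDVI composites over one year: a seasonal cycle plus
# noise (all values illustrative, not actual SPOT/VGT data).
rng = np.random.default_rng(3)
t = np.arange(36)
signal = 0.45 + 0.25 * np.sin(2 * np.pi * t / 36)
ndvi = signal + rng.normal(0.0, 0.05, t.size)

def moving_average(x, w=5):
    """Centered moving average; the w//2 edge samples keep original values."""
    sm = np.convolve(x, np.ones(w) / w, mode="same")
    h = w // 2
    sm[:h], sm[-h:] = x[:h], x[-h:]
    return sm

smoothed = moving_average(ndvi)
resid_before = np.std(ndvi - signal)
resid_after = np.std(smoothed - signal)
print("residual std: %.3f -> %.3f" % (resid_before, resid_after))
```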
Time series visualization tools through a Virtual Observatory in geodesy
NASA Astrophysics Data System (ADS)
Deleflie, Florent; Berthier, Jérôme; Barache, Christophe; Soudarin, Laurent; Portmann, Christophe; Lambert, Sébastien; Collilieux, Xavier
2013-04-01
This poster presents the context of the astronomical Virtual Observatory (VO), an ambitious international proposal to provide uniform, convenient access to disparate, geographically dispersed archives of astronomical data from software running on the astronomer's desktop computer. The VO could be of interest to the geodetic community: we present here some of the efforts we have recently made in this direction, concerning the visualization of time series obtained from the analysis of space geodetic techniques. Some of these products are now natively built and archived following the data format recommended by the IVOA, the VOTable format. We present this format, which is based on XML, and we list the reasons why we chose to use it. Astronomers using the Virtual Observatory are organized within an international association called the International Virtual Observatory Alliance (IVOA). As noted on the IVOA website (http://www.ivoa.net/), the IVOA was formed in June 2002 with a mission to "facilitate the international coordination and collaboration necessary for the development and deployment of the tools, systems and organizational structures necessary to enable the international utilization of astronomical archives as an integrated and interoperating virtual observatory."
Scene Context Dependency of Pattern Constancy of Time Series Imagery
NASA Technical Reports Server (NTRS)
Woodell, Glenn A.; Jobson, Daniel J.; Rahman, Zia-ur
2008-01-01
A fundamental element of future generic pattern recognition technology is the ability to extract similar patterns for the same scene despite wide-ranging extraneous variables, including lighting, turbidity, sensor exposure variations, and signal noise. In the process of demonstrating pattern constancy of this kind for retinex/visual servo (RVS) image enhancement processing, we found that the pattern constancy performance depended somewhat on scene content. Most notably, the scene topography, and in particular the scale and extent of the topography in an image, affects pattern constancy the most. This paper explores these effects in more depth and presents experimental data from several time series tests. These results further quantify the impact of topography on pattern constancy. Despite this residual inconstancy, the results of overall pattern constancy testing support the idea that RVS image processing can be a universal front end for generic visual pattern recognition. While the effects on pattern constancy were significant, RVS processing still achieves a high degree of pattern constancy over a wide spectrum of scene content diversity and wide-ranging extraneous variations in lighting, turbidity, and sensor exposure.
Permutation Entropy Analysis of Geomagnetic Indices Time Series
NASA Astrophysics Data System (ADS)
De Michelis, Paola; Consolini, Giuseppe
2013-04-01
The Earth's magnetospheric dynamics displays a very complex nature in response to solar wind changes, as widely documented in the scientific literature. This complex dynamics manifests in various physical processes occurring in different regions of the Earth's magnetosphere, as clearly revealed by previous analyses of geomagnetic indices (AE indices, Dst, Sym-H, etc.). One of the most interesting features of the geomagnetic indices, as proxies of the Earth's magnetospheric dynamics, is the multifractal nature of their time series. This aspect has been interpreted as the occurrence of intermittency and dynamical phase transitions in the Earth's magnetosphere. Here, we investigate the Markovian nature of different geomagnetic indices (AE indices, Sym-H, Asy-H) and their fluctuations by means of permutation entropy analysis. The results clearly show the non-Markovian and distinct nature of the different sets of geomagnetic indices, pointing towards diverse underlying physical processes. A discussion in connection with the nature of the physical processes responsible for each set of indices and their multifractal character is attempted.
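The permutation entropy used in analyses of this kind is the Bandt-Pompe measure: the Shannon entropy of the frequencies of ordinal patterns in delay vectors. A minimal sketch (order, delay, and test series are illustrative, not from the paper):

```python
import numpy as np
from math import factorial

def permutation_entropy(x, m=3, tau=1):
    """Normalized permutation entropy of order m (Bandt-Pompe): Shannon
    entropy of ordinal-pattern frequencies, scaled to [0, 1]."""
    x = np.asarray(x, dtype=float)
    n = x.size - (m - 1) * tau
    counts = {}
    for i in range(n):
        key = tuple(np.argsort(x[i : i + m * tau : tau]))
        counts[key] = counts.get(key, 0) + 1
    p = np.array(list(counts.values()), dtype=float) / n
    h = -np.sum(p * np.log(p))
    return h / np.log(factorial(m))

rng = np.random.default_rng(4)
noise = rng.normal(0, 1, 5000)   # fully stochastic: entropy near 1
trend = np.arange(5000.0)        # fully ordered: entropy 0
print(permutation_entropy(noise), permutation_entropy(trend))
```

For a geomagnetic index, values well below 1 indicate the presence of deterministic ordinal structure, which is what motivates the non-Markovian interpretation above.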
Presentations to Emergency Departments for COPD: A Time Series Analysis.
Rosychuk, Rhonda J; Youngson, Erik; Rowe, Brian H
2016-01-01
Background. Chronic obstructive pulmonary disease (COPD) is a common respiratory condition characterized by progressive dyspnea and acute exacerbations which may result in emergency department (ED) presentations. This study examines monthly rates of presentations to EDs in one Canadian province. Methods. Presentations for COPD made by individuals aged ≥55 years during April 1999 to March 2011 were extracted from provincial databases. Data included age, sex, and health zone of residence (North, Central, South, and urban). Crude rates were calculated. Seasonal autoregressive integrated moving average (SARIMA) time series models were developed. Results. ED presentations for COPD totalled 188,824, and the monthly rate of presentation remained relatively stable (from 197.7 to 232.6 per 100,000). Males and seniors (≥65 years) comprised 52.2% and 73.7% of presentations, respectively. The ARIMA(1,0,0)×(1,0,1)12 model was appropriate for the overall rate of presentations, as well as for each sex and for seniors. Zone-specific models showed relatively stable or decreasing rates; the North zone had an increasing trend. Conclusions. ED presentation rates for COPD have been relatively stable in Alberta during the past decade. However, the increases in northern regions deserve further exploration. The SARIMA models quantified the temporal patterns and can help in planning future health care service needs. PMID:27445514
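The fitted ARIMA(1,0,0)×(1,0,1)12 model combines a nonseasonal AR(1) term with seasonal AR(1) and MA(1) terms at lag 12. A minimal sketch simulating such a process to show the seasonal autocorrelation it implies (all coefficients are illustrative, not the fitted Alberta estimates):

```python
import numpy as np

# Simulate a SARIMA (1,0,0)x(1,0,1)_12 process:
# (1 - phi*B)(1 - PHI*B^12) y_t = (1 + THETA*B^12) e_t, i.e.
# y_t = phi*y_{t-1} + PHI*y_{t-12} - phi*PHI*y_{t-13} + e_t + THETA*e_{t-12}.
# Coefficient values are illustrative, not the fitted Alberta estimates.
phi, PHI, THETA = 0.5, 0.6, 0.3
rng = np.random.default_rng(5)
n = 240                                   # 20 years of monthly observations
e = rng.normal(0, 1, n)
y = np.zeros(n)
for t in range(13, n):
    y[t] = (phi * y[t - 1] + PHI * y[t - 12] - phi * PHI * y[t - 13]
            + e[t] + THETA * e[t - 12])

# Seasonality shows up as elevated autocorrelation at lag 12.
yc = y[13:] - y[13:].mean()
acf12 = np.dot(yc[12:], yc[:-12]) / np.dot(yc, yc)
print("lag-12 autocorrelation: %.2f" % acf12)
```

Fitting such a model to real counts would typically be done with an off-the-shelf SARIMA routine (e.g., `statsmodels.tsa.statespace.SARIMAX` with `order=(1,0,0)` and `seasonal_order=(1,0,1,12)`); the simulation above only illustrates the model structure.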
River flow time series using least squares support vector machines
NASA Astrophysics Data System (ADS)
Samsudin, R.; Saad, P.; Shabri, A.
2011-06-01
This paper proposes a novel hybrid forecasting model known as GLSSVM, which combines the group method of data handling (GMDH) and the least squares support vector machine (LSSVM). The GMDH is used to determine the useful input variables which work as the time series forecasting for the LSSVM model. Monthly river flow data from two stations, the Selangor and Bernam rivers in Selangor state of Peninsular Malaysia were taken into consideration in the development of this hybrid model. The performance of this model was compared with the conventional artificial neural network (ANN) models, Autoregressive Integrated Moving Average (ARIMA), GMDH and LSSVM models using the long term observations of monthly river flow discharge. The root mean square error (RMSE) and coefficient of correlation (R) are used to evaluate the models' performances. In both cases, the new hybrid model has been found to provide more accurate flow forecasts compared to the other models. The results of the comparison indicate that the new hybrid model is a useful tool and a promising new method for river flow forecasting.
Detrended fluctuation analysis of laser Doppler flowmetry time series.
Esen, Ferhan; Aydin, Gülsün Sönmez; Esen, Hamza
2009-12-01
Detrended fluctuation analysis (DFA) of laser Doppler flow (LDF) time series appears to yield improved prognostic power in microvascular dysfunction, through calculation of the scaling exponent, alpha. In the present study the change in microvascular function induced by long-lasting strenuous activity was evaluated by DFA in basketball players compared with sedentary controls. Forearm skin blood flow was measured at rest and during local heating. Three scaling exponents, the slopes of the three regression lines, were identified, corresponding to cardiac, cardio-respiratory and local factors. The local scaling exponent was always approximately one, alpha=1.01+/-0.15, in the control group and did not change with local heating. However, we found a broken line with two scaling exponents (alpha(1)=1.06+/-0.01 and alpha(2)=0.75+/-0.01) in basketball players. The broken line became a single line having one scaling exponent (alpha(T)=0.94+/-0.01) with local heating. The scaling exponents alpha(2) and alpha(T), smaller than 1, indicate reduced long-range correlation in blood flow due to a loss of integration in local mechanisms, and suggest endothelial dysfunction as the most likely candidate. The ability to evaluate microvascular function from a baseline LDF signal at rest is what makes DFA superior to other methods, spectral or not, that use the amplitude changes of an evoked relative signal. PMID:19660479
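The DFA procedure behind these scaling exponents is short: integrate the mean-removed signal into a profile, detrend it linearly in windows of size n, and take alpha as the log-log slope of the fluctuation function F(n). A minimal numpy sketch, with arbitrary window sizes and white noise used only to check the expected alpha ≈ 0.5:

```python
import numpy as np

def dfa_alpha(x, scales):
    """Detrended fluctuation analysis: return the scaling exponent alpha,
    the slope of log F(n) versus log n with linear detrending per window."""
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())      # integrated, mean-removed signal
    flucts = []
    for n in scales:
        n_win = len(profile) // n
        segs = profile[:n_win * n].reshape(n_win, n)
        t = np.arange(n)
        ms = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)   # linear trend in this window
            ms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(ms)))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return float(slope)

rng = np.random.default_rng(42)
white = rng.standard_normal(8192)
alpha = dfa_alpha(white, scales=[16, 32, 64, 128, 256])  # expect ~0.5
```

Uncorrelated noise gives alpha near 0.5, 1/f-type signals give alpha near 1 (as in the control group's local component), and alpha below 1 signals reduced long-range correlation as reported for the athletes.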
Reconstructing Ocean Circulation using Coral Δ14C Time Series
Kashgarian, M; Guilderson, T P
2001-02-23
the invasion of fossil fuel CO2 and bomb 14C into the atmosphere and surface oceans. Therefore the Δ14C data that are produced in this study can be used to validate the ocean uptake of fossil fuel CO2 in coupled ocean-atmosphere models. This study takes advantage of the quasi-conservative nature of 14C as a water-mass tracer by using Δ14C time series in corals to identify changes in the shallow circulation of the Pacific. Although the data themselves provide fundamental information on surface water mass movement, the true strength lies in a combined approach that is greater than the sum of its parts: the data help uncover deficiencies in ocean circulation models, and the model results place long Δ14C time series in a dynamic framework that helps to identify those locations where additional observations are most needed.
Time series analysis of collective motions in proteins
NASA Astrophysics Data System (ADS)
Alakent, Burak; Doruker, Pemra; Çamurdan, Mehmet C.
2004-01-01
The dynamics of α-amylase inhibitor tendamistat around its native state is investigated using time series analysis of the principal components of the Cα atomic displacements obtained from molecular dynamics trajectories. Collective motion along a principal component is modeled as a homogeneous nonstationary process, which is the result of damped oscillations in local minima superimposed on a random walk. The motion in local minima is described by a stationary autoregressive moving average model, consisting of the frequency, damping factor, moving average parameters and random shock terms. Frequencies for the first 50 principal components are found to be in the 3–25 cm−1 range, and are well correlated with the principal component indices and also with atomistic normal mode analysis results. Damping factors, though their correlation is less pronounced, decrease as principal component indices increase, indicating that low frequency motions are less affected by friction. The existence of a positive moving average parameter indicates that the stochastic force term is likely to disturb the mode in opposite directions at two successive sampling times, showing the mode's tendency to stay close to a minimum. All four of these parameters affect the mean square fluctuations of a principal mode within a single minimum. The inter-minima transitions are described by a random walk model, which is driven by a random shock term considerably smaller than that for the intra-minimum motion. The principal modes are classified into three subspaces based on their dynamics: essential, semiconstrained, and constrained, at least in partial consistency with previous studies. The Gaussian-type distributions of the intermediate modes, called "semiconstrained" modes, are explained by asserting that this random walk behavior is not completely free but occurs between energy barriers.
3D City Transformations by Time Series of Aerial Images
NASA Astrophysics Data System (ADS)
Adami, A.
2015-02-01
Recent photogrammetric applications, based on dense image matching algorithms, make it possible not only to use images acquired by digital cameras, amateur or professional, but also to recover the vast heritage of analogue photographs. This opens up many possibilities for the use and enhancement of the existing photographic heritage. Research into the original appearance of old buildings, the virtual reconstruction of disappeared architectures and the study of urban development are some of the application areas that exploit the great cultural heritage of photography. Nevertheless, there are some restrictions on the use of historical images for automatic reconstruction of buildings, such as image quality, availability of camera parameters and ineffective geometry of image acquisition. These constraints are very hard to overcome, and for the above reasons it is difficult to find good datasets in the case of terrestrial close-range photogrammetry. Even the photographic archives of museums and superintendencies, while retaining a wealth of documentation, offer no datasets suited to a dense image matching approach. Compared to the vast collection of historical photos, the class of aerial photos meets both criteria stated above. In this paper historical aerial photographs are used with dense image matching algorithms to build 3D models of a city in different years. The models can be used to study the urban development of the city and its changes through time. The application relates to the city centre of Verona, for which some time series of aerial photographs have been retrieved. The models obtained in this way immediately allowed observation of the urban development of the city, the places of expansion and new urban areas. But a more interesting aspect emerged from the analytical comparison between models. The difference, as the Euclidean distance, between two models gives information about new buildings or demolitions. Regarding accuracy, it is necessary to point out that the quality of the final
Optimizing the search for transiting planets in long time series
NASA Astrophysics Data System (ADS)
Ofir, Aviv
2014-01-01
Context. Transit surveys, both ground- and space-based, have already accumulated a large number of light curves that span several years. Aims: The search for transiting planets in these long time series is computationally intensive. We wish to optimize the search for both detection and computational efficiencies. Methods: We assume that the searched systems can be described well by Keplerian orbits. We then propagate the effects of different system parameters to the detection parameters. Results: We show that the frequency information content of the light curve is primarily determined by the duty cycle of the transit signal, and thus the optimal frequency sampling is found to be cubic and not linear. Further optimization is achieved by considering duty-cycle dependent binning of the phased light curve. By using the (standard) BLS, one is either fairly insensitive to long-period planets or less sensitive to short-period planets and computationally slower by a significant factor of ~330 (for a 3 yr long dataset). We also show how the physical system parameters, such as the host star's size and mass, directly affect transit detection. This understanding can then be used to optimize the search for every star individually. Conclusions: By considering Keplerian dynamics explicitly rather than implicitly one can optimally search the BLS parameter space. The presented Optimal BLS enhances the detectability of both very short and very long period planets, while allowing such searches to be done with much reduced resources and time. The Matlab/Octave source code for Optimal BLS is made available. The MATLAB code is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/561/A138
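One way to read the "cubic, not linear" sampling result is a frequency grid that is uniform in the cube root of frequency, so the frequency step grows roughly as f^(2/3) and long-period (low-frequency) signals are sampled densely. The sketch below is a hedged illustration of that idea, not the actual Optimal BLS source code; the step constant and frequency range are arbitrary:

```python
import numpy as np

def cubic_frequency_grid(f_min, f_max, step):
    """Frequency grid uniform in f**(1/3): f_k = (f_min**(1/3) + k*step)**3.
    The local frequency spacing then grows as ~3*step*f**(2/3), so fewer trial
    frequencies are spent on short periods than with a linear grid."""
    x = np.arange(f_min ** (1.0 / 3.0), f_max ** (1.0 / 3.0), step)
    return x ** 3

# Periods from 1 to 1000 days (frequencies in 1/day), illustrative only.
grid = cubic_frequency_grid(f_min=1.0 / 1000.0, f_max=1.0, step=1e-3)
```

Compared with a linear grid of the same resolution at the longest period, this concentrates trial frequencies where the transit duty cycle makes them informative, which is the source of the large computational saving quoted in the abstract.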
Nonlinear Time Series Analysis of White Dwarf Light Curves
NASA Astrophysics Data System (ADS)
Jevtic, N.; Zelechoski, S.; Feldman, H.; Peterson, C.; Schweitzer, J.
2001-12-01
We use nonlinear time series analysis methods to examine the light intensity curves of white dwarf PG1351+489 obtained by the Whole Earth Telescope (WET). Though these methods were originally introduced to study chaotic systems, when a clear signature of determinism is found for the process generating an observable and it couples the active degrees of freedom of the system, then the notion of phase space provides a framework for exploring the system dynamics of nonlinear systems in general. With a pronounced single frequency, its harmonics and other frequencies of lower amplitude on a broadband background, the PG1351 light curve lends itself to the use of time delay coordinates. Our phase space reconstruction yields a triangular, toroidal three-dimensional shape. This differs from earlier results of a circular toroidal representation. We find a morphological similarity to a magnetic dynamo model developed for fast rotators that yields a union of both results: the circular phase space structure for the ascending portion of the cycle, and the triangular structure for the declining portion. The rise and fall of the dynamo cycle yield both different phase space representations and different correlation dimensions. Since PG1351 is known to have no significant fields, these results may stimulate the observation of light curves of known magnetic white dwarfs for comparison. Using other data obtained by the WET, we compare the phase space reconstruction of DB white dwarf PG1351 with that of GD 358 which has a more complex power spectrum. We also compare these results with those for PG1159. There is some general similarity between the results of the phase space reconstruction for the DB white dwarfs. As expected, the difference between the results for the DB white dwarfs and PG1159 is great.
Nonlinear time series analysis of normal and pathological human walking
NASA Astrophysics Data System (ADS)
Dingwell, Jonathan B.; Cusumano, Joseph P.
2000-12-01
Characterizing locomotor dynamics is essential for understanding the neuromuscular control of locomotion. In particular, quantifying dynamic stability during walking is important for assessing people who have a greater risk of falling. However, traditional biomechanical methods of defining stability have not quantified the resistance of the neuromuscular system to perturbations, suggesting that more precise definitions are required. For the present study, average maximum finite-time Lyapunov exponents were estimated to quantify the local dynamic stability of human walking kinematics. Local scaling exponents, defined as the local slopes of the correlation sum curves, were also calculated to quantify the local scaling structure of each embedded time series. Comparisons were made between overground and motorized treadmill walking in young healthy subjects and between diabetic neuropathic (NP) patients and healthy controls (CO) during overground walking. A modification of the method of surrogate data was developed to examine the stochastic nature of the fluctuations overlying the nominally periodic patterns in these data sets. Results demonstrated that having subjects walk on a motorized treadmill artificially stabilized their natural locomotor kinematics by small but statistically significant amounts. Furthermore, a paradox previously present in the biomechanical literature that resulted from mistakenly equating variability with dynamic stability was resolved. By slowing their self-selected walking speeds, NP patients adopted more locally stable gait patterns, even though they simultaneously exhibited greater kinematic variability than CO subjects. Additionally, the loss of peripheral sensation in NP patients was associated with statistically significant differences in the local scaling structure of their walking kinematics at those length scales where it was anticipated that sensory feedback would play the greatest role. Lastly, stride-to-stride fluctuations in the
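A sketch of the ingredients named above: time-delay embedding of a scalar series and a Rosenstein-style average log-divergence curve, whose initial slope estimates the maximum finite-time Lyapunov exponent. The embedding parameters below are arbitrary illustrative choices, not the ones used in the study, and a pure sinusoid stands in for a nominally periodic gait signal (its divergence curve should be flat, i.e. exponent near zero):

```python
import numpy as np

def delay_embed(x, dim, tau):
    """State-space reconstruction: row i is (x[i], x[i+tau], ..., x[i+(dim-1)*tau])."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

def avg_log_divergence(x, dim=3, tau=4, horizon=20, theiler=10):
    """Rosenstein-style curve: mean log distance between each point and its
    nearest (non-temporal) neighbour, followed forward in time."""
    emb = delay_embed(np.asarray(x, float), dim, tau)
    n = len(emb) - horizon
    pts = emb[:n]
    d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
    idx = np.arange(n)
    # exclude temporally close pairs (Theiler window), including the diagonal
    d2[np.abs(idx[:, None] - idx[None, :]) <= theiler] = np.inf
    nn = d2.argmin(1)                       # nearest neighbour of each point
    curve = []
    for k in range(1, horizon + 1):
        dist = np.linalg.norm(emb[idx + k] - emb[nn + k], axis=1)
        curve.append(float(np.log(dist[dist > 0]).mean()))
    return np.array(curve)

x = np.sin(0.3 * np.arange(500))            # stand-in periodic "gait" signal
curve = avg_log_divergence(x)
```

For the locomotor data in the study, a positive initial slope of this curve quantifies local dynamic instability, which is the property the authors show is distinct from simple kinematic variability.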
Downscaled TRMM Rainfall Time-Series for Catchment Hydrology Applications
NASA Astrophysics Data System (ADS)
Tarnavsky, E.; Mulligan, M.
2009-04-01
Hydrology in semi-arid regions is controlled, to a large extent, by the spatial and temporal distribution of rainfall defined in terms of rainfall depth and intensity. Thus, appropriate representation of the space-time variability of rainfall is essential for catchment-scale hydrological models applied in semi-arid regions. While spaceborne platforms equipped with remote sensing instruments provide information on a range of variables for hydrological modelling, including rainfall, the necessary spatial and temporal detail is rarely obtained from a single dataset. This paper presents a new dynamic model of dryland hydrology, DryMOD, which makes best use of free, public-domain remote sensing data for representation of key variables with a particular focus on (a) simulation of spatial rainfall fields and (b) the hydrological response to rainfall, particularly in terms of rainfall-runoff partitioning. In DryMOD, rainfall is simulated using a novel approach combining 1-km spatial detail from a climatology derived from the TRMM 2B31 dataset (mean monthly rainfall) and 3-hourly temporal detail from time-series derived from the 0.25-degree gridded TRMM 3B42 dataset (rainfall intensity). This allows for rainfall simulation at the hourly time step, as well as accumulation of infiltration, recharge, and runoff at the monthly time step. In combination with temperature, topography, and soil data, rainfall-runoff and soil moisture dynamics are simulated over large dryland regions. In order to investigate the hydrological response to rainfall and variable catchment characteristics, the model is applied to two very different catchments in the drylands of North and West Africa. The results of the study demonstrate the use of remote sensing-based estimates of precipitation intensity and volume for the simulation of critical hydrological parameters. The model allows for better spatial planning of water harvesting activities, as well as for optimisation of agricultural activities
A time-series approach to dynamical systems from classical and quantum worlds
Fossion, Ruben
2014-01-08
This contribution discusses some recent applications of time-series analysis in Random Matrix Theory (RMT), and applications of RMT in the statistical analysis of eigenspectra of correlation matrices of multivariate time series.
Time series change detection: Algorithms for land cover change
NASA Astrophysics Data System (ADS)
Boriah, Shyam
can be used for decision making and policy planning purposes. In particular, previous change detection studies have primarily relied on examining differences between two or more satellite images acquired on different dates. Thus, a technological solution that detects global land cover change using high temporal resolution time series data will represent a paradigm-shift in the field of land cover change studies. To realize these ambitious goals, a number of computational challenges in spatio-temporal data mining need to be addressed. Specifically, analysis and discovery approaches need to be cognizant of climate and ecosystem data characteristics such as seasonality, non-stationarity/inter-region variability, multi-scale nature, spatio-temporal autocorrelation, high-dimensionality and massive data size. This dissertation, a step in that direction, translates earth science challenges to computer science problems, and provides computational solutions to address these problems. In particular, three key technical capabilities are developed: (1) Algorithms for time series change detection that are effective and can scale up to handle the large size of earth science data; (2) Change detection algorithms that can handle large numbers of missing and noisy values present in satellite data sets; and (3) Spatio-temporal analysis techniques to identify the scale and scope of disturbance events.
Faculty Employment at 4-Year Colleges and Universities
ERIC Educational Resources Information Center
Zhang, Liang; Liu, Xiangmin
2010-01-01
We examine the variation in employment levels of part-time faculty, full-time teaching faculty, and full-time professorial faculty across 4-year colleges and universities in the United States. Employment structures and practices in higher education institutions are determined by a variety of economic and institutional factors. For example, a 1%…
The Coral Data Time Series Need To Be Revisited
NASA Astrophysics Data System (ADS)
Juillet-Leclerc, A.
2004-12-01
Coral skeleton is formed under organism control and its geochemical properties are strongly influenced by biological effects embedding the environmental signal. Geochemists have been puzzled by the diversity of geochemical responses shown by colonies grown in the same area. By revisiting the Weber and Woodhead data series (1972), which gathers data from enough colonies developed in similar conditions to provide a statistical isotopic value representative of one site, we demonstrate that for Porites and Acropora, the expected isotopic thermometer is revealed when the "vital effect" is removed. On the other hand, by using Acropora cultured under controlled conditions, with temperature varied over a range between 23 and 29°C, the comparison of oxygen and carbon isotopic values revealed the role played by kinetic fractionation. This apparent paradox of two co-existing fractionations is explained by the isotopic analyses of wild and cultured corals performed at the micrometer scale, taking into account the microstructures of the skeleton. Two different crystals appear to be the growth units of the skeleton, each crystal corresponding to a specific deposition mechanism. Thus, the measurement performed with a conventional method is a "bulk" measurement, which depends upon two isotopic fractionations. Some investigations underlined the discrepancy between the meanings of the inter-annual and seasonal isotopic records, which could be illustrated by the different isotopic calibrations assessed from seasonal or annual data. It has also been explained by micrometer analyses of Porites aragonite. Smoothing of the isotopic measurements, as well as of Sr/Ca, at around 400 microns indicates that at the seasonal time scale the growth unit is the month. This is in agreement with extensive studies conducted by biologists describing the mechanism governing the formation of the Porites skeleton: every month a framework is deposited and progressively filled in. By combining biologists' and geochemists' knowledge
Time series analysis of diverse extreme phenomena: universal features
NASA Astrophysics Data System (ADS)
Eftaxias, K.; Balasis, G.
2012-04-01
The field of study of complex systems holds that the dynamics of complex systems are founded on universal principles that may be used to describe a great variety of scientific and technological approaches to different types of natural, artificial, and social systems. We suggest that earthquake, epileptic seizure, solar flare, and magnetic storm dynamics can be analyzed within similar mathematical frameworks. A central property of the generation of the aforementioned extreme events is the occurrence of coherent large-scale collective behavior with a very rich structure, resulting from repeated nonlinear interactions among the corresponding constituents. Consequently, we apply Tsallis nonextensive statistical mechanics, as it provides an appropriate framework in which to investigate universal principles of their generation. First, we examine the data in terms of Tsallis entropy, aiming to discover common "pathological" symptoms of transition to a significant shock. By monitoring the temporal evolution of the degree of organization in time series we observe similar distinctive features revealing significant reduction of complexity during their emergence. Second, a model for earthquake dynamics derived from first principles within the nonextensive Tsallis formalism has recently been introduced. This approach leads to an energy distribution function (a Gutenberg-Richter type law) for the magnitude distribution of earthquakes, providing an excellent fit to seismicities generated in various large geographic areas usually identified as seismic regions. We show that this function is able to describe the energy distribution (with a similar non-extensive q-parameter) of solar flares, magnetic storms, and epileptic and earthquake shocks. The above-mentioned evidence of a universal statistical behavior suggests the possibility of a common approach for studying space weather, earthquakes and epileptic seizures.
Evaluating mallard adaptive management models with time series
Conn, P.B.; Kendall, W.L.
2004-01-01
Wildlife practitioners concerned with midcontinent mallard (Anas platyrhynchos) management in the United States have instituted a system of adaptive harvest management (AHM) as an objective format for setting harvest regulations. Under the AHM paradigm, predictions from a set of models that reflect key uncertainties about processes underlying population dynamics are used in coordination with optimization software to determine an optimal set of harvest decisions. Managers use comparisons of the predictive abilities of these models to gauge the relative truth of different hypotheses about density-dependent recruitment and survival, with better-predicting models giving more weight to the determination of harvest regulations. We tested the effectiveness of this strategy by examining convergence rates of 'predictor' models when the true model for population dynamics was known a priori. We generated time series for cases when the a priori model was one of the predictor models, as well as for several cases when the a priori model was not in the model set. We further examined the addition of different levels of uncertainty into the variance structure of predictor models, reflecting different levels of confidence about estimated parameters. We showed that in certain situations, the model-selection process favors a predictor model that incorporates the hypotheses of additive harvest mortality and weakly density-dependent recruitment, even when the model is not used to generate data. Higher levels of predictor model variance led to decreased rates of convergence to the model that generated the data, but model weight trajectories were in general more stable. We suggest that predictive models should incorporate all sources of uncertainty about estimated parameters, that the variance structure should be similar for all predictor models, and that models with different functional forms for population dynamics should be considered for inclusion in predictor model sets. All of these
In aquatic systems, time series of dissolved oxygen (DO) have been used to compute estimates of ecosystem metabolism. Central to this open-water method is the assumption that the DO time series is a Lagrangian specification of the flow field. However, most DO time series are coll...
NASA Astrophysics Data System (ADS)
Michon, Timothée; Saulnier, Georges-Marie; Castaings, William
2013-04-01
Although hydrological models have progressed in terms of relevance and efficiency, a calibration step is still required to estimate parameter values that cannot be obtained by field experiments or other physically based reasoning. As a consequence, the robustness of hydrological model results depends on data availability and data accuracy. Furthermore, current calibration procedures often require concomitant forcing and prognostic variable time series at identical time steps (e.g. hourly rainfall and discharge time series for flood hydrological models), which also limits the applicability of hydrological models (what can be done for retrospective historical analysis, for poorly gauged catchments, etc.). This communication deals with the question of whether hydrological model calibration is possible with less information content. In particular, it will be shown that rainfall and discharge time series are redundant to some extent, at least in the case study presented here (Ardèche catchment, 2000 km2, Southern France). As a first part, it will be shown that "doing hydrology backward" (Kirchner, 2009) can be generalized to several models based on different and even contradictory assumptions, leading to a model-independent hydrology-backward approach. It will also be shown that if a model is reasonably set up on a catchment, the rainfall time series can be accurately inverted using only the discharge time series. This prefigures the idea that discharge time series contain both information on the rainfall inputs and information on the discharge-rainfall relationship, i.e. the hydrological behaviour of the considered catchment, and that these coupled pieces of information may be separately identified. In other terms, is it possible to distinguish and to quantify, within the discharge time series, the model parameter values on one side and the rainfall time series on the other? This will be illustrated as a second part. Indeed, the presented results will show that, knowing only the hourly
NASA Astrophysics Data System (ADS)
Siggiridou, Elsa; Kugiumtzis, Dimitris
2016-04-01
Granger causality has been used for the investigation of the inter-dependence structure of the underlying systems of multi-variate time series. In particular, the direct causal effects are commonly estimated by the conditional Granger causality index (CGCI). In the presence of many observed variables and relatively short time series, CGCI may fail because it is based on vector autoregressive models (VAR) involving a large number of coefficients to be estimated. In this work, the VAR is restricted by a scheme that modifies the recently developed method of backward-in-time selection (BTS) of the lagged variables and the CGCI is combined with BTS. Further, the proposed approach is compared favorably to other restricted VAR representations, such as the top-down strategy, the bottom-up strategy, and the least absolute shrinkage and selection operator (LASSO), in terms of sensitivity and specificity of CGCI. This is shown by using simulations of linear and nonlinear, low and high-dimensional systems and different time series lengths. For nonlinear systems, CGCI from the restricted VAR representations are compared with analogous nonlinear causality indices. Further, CGCI in conjunction with BTS and other restricted VAR representations is applied to multi-channel scalp electroencephalogram (EEG) recordings of epileptic patients containing epileptiform discharges. CGCI on the restricted VAR, and BTS in particular, could track the changes in brain connectivity before, during and after epileptiform discharges, which was not possible using the full VAR representation.
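In the bivariate case, the Granger causality index reduces to comparing residual variances of a restricted autoregression (the target's own lags) and a full one (own lags plus the candidate driver's lags). The least-squares sketch below illustrates that idea only; it is not the BTS-restricted VAR of the paper, and the model order and coefficients are arbitrary:

```python
import numpy as np

def ar_residual_var(target, predictors, p):
    """Least-squares fit of target[t] on p lags of each predictor series;
    returns the residual variance."""
    n = len(target)
    rows = []
    for t in range(p, n):
        row = []
        for x in predictors:
            row.extend(x[t - p:t][::-1])   # lags 1..p of this predictor
        rows.append(row)
    X = np.column_stack([np.ones(n - p), np.array(rows)])
    y = np.asarray(target[p:], float)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float((y - X @ beta).var())

def gci(x, y, p=2):
    """Granger causality index y -> x: log ratio of the restricted residual
    variance (x's own lags only) to the full one (x's and y's lags)."""
    return float(np.log(ar_residual_var(x, [x], p) / ar_residual_var(x, [x, y], p)))

# Simulated pair in which y drives x but not vice versa.
rng = np.random.default_rng(1)
n = 2000
e1, e2 = rng.standard_normal(n), rng.standard_normal(n)
x, y = np.zeros(n), np.zeros(n)
for t in range(1, n):
    y[t] = 0.6 * y[t - 1] + e1[t]
    x[t] = 0.5 * x[t - 1] + 0.8 * y[t - 1] + e2[t]
```

The conditional version (CGCI) extends this by also conditioning on the lags of all other observed variables, which is where restricted VAR schemes such as BTS become necessary for short, high-dimensional series.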
Crossing the Digital Divide: Connecting GIS, Time Series and Space-Time Arrays (Invited)
NASA Astrophysics Data System (ADS)
Maidment, D. R.; Salas, F.; Domenico, B.; Nativi, S.
2010-12-01
Hydrologic information science requires several different kinds of information: GIS coverages of water features of the land surface and subsurface; time series of observations of streamflow, water quality, groundwater levels and climate; and space-time arrays of weather, climate and remotely sensed information. Increasingly, such information is being published as web services, in standardized data structures that transmit smoothly through the internet. A large "Digital Divide" exists between the world of discrete spatial objects in GIS and associated time series, and the world of continuous space-time arrays as is used in weather and climate science. In order to cross this divide, it should be possible to search for quantities such as "precipitation" and to find the information no matter whether it comprises time series of precipitation at gage sites, or space-time arrays of precipitation from Nexrad radar rainfall measurements. This means that servers of discrete space-time hydrologic data, such as the CUAHSI HydroServer, and servers of continuous space-time weather and climate data, such as the Unidata THREDDS server, should be able to be indexed in a unified manner that will permit discovery of common information types across different classes of information services. This paper will explore options for accomplishing this goal using the CUAHSI HydroServer and the Unidata THREDDS server as representative examples of information service providers. Among the options to be explored is GI-cat, a federated, standards-based catalog service developed at the Earth and Space Science Informatics Laboratory of the University of Florence.
Detection of cavity migration risks using radar interferometric time series
NASA Astrophysics Data System (ADS)
Chang, L.; Hanssen, R. F.
2012-12-01
, ERS-2, Envisat, and Radarsat-2, to investigate the dynamics (deformation) of the area. In particular we show, for the first time, shear-stress change distribution patterns within the structure of a building, over a period of close to 20 years. Time series analysis shows that deformation rates of ~4 mm/a could be detected for about 18 years, followed by a dramatic increase of up to 20 mm/a in the last period. These results imply that the driving mechanisms of the 2011 catastrophe have a very long lead time and are therefore likely due to a long-lasting gradual motion, such as the upward migration of a cavity. The analysis shows the collocation of the deformation location with relatively shallow near-horizontal mine shafts, suggesting that cavity migration has a high likelihood to be the driving mechanism of the collapse-sinkhole.
NASA Astrophysics Data System (ADS)
Goela, Priscila Costa; Cordeiro, Clara; Danchenko, Sergei; Icely, John; Cristina, Sónia; Newton, Alice
2016-11-01
This study relates sea surface temperature (SST) to the upwelling conditions off the southwest coast of Portugal using statistical analyses of publicly available data. Optimum Interpolation (OI) daily SST data were extracted from the United States (US) National Oceanic and Atmospheric Administration (NOAA), and data for wind speed and direction were from the US National Climatic Data Center. Time series were extracted at a daily frequency for a time horizon of 26 years. Upwelling indices were estimated using westerly (Qx) and southerly (Qy) Ekman transport components. In the first part of the study, time series were inspected for trend and seasonality over the whole period. The seasonally adjusted time series revealed an increasing slope for SST (0.15 °C per decade) and decreasing slopes for Qx (−84.01 m³ s⁻¹ km⁻¹ per decade) and Qy (−25.20 m³ s⁻¹ km⁻¹ per decade) over the time horizon. Structural breaks analysis applied to the time series showed that a statistically significant incremental increase in SST was more pronounced during the last decade. Cross-correlation between upwelling indices and SST revealed time delays of 5 and 2 days between Qx and SST, and between Qy and SST, respectively. A spectral analysis combined with the previous analyses enabled the identification of four oceanographic seasons. Those seasons were later recognised over a restricted time period of 4 years, between 2008 and 2012, when there was an extensive sampling programme for the validation of ocean colour remote sensing imagery. The seasons were defined as: summer, with intense and regular events of upwelling; autumn, indicating relaxation of upwelling conditions; and spring and winter, showing high interannual variability in terms of number and intensity of upwelling events.
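The cross-correlation step described above (estimating the delay between an Ekman transport component and SST) can be sketched in a few lines. This is a minimal illustration on synthetic daily series, not the study's actual data or procedure; the series, seed, and built-in 5-day delay are all assumptions:

```python
import numpy as np

def lagged_correlation(x, y, max_lag):
    """Pearson correlation of y against x shifted by each lag in [0, max_lag]."""
    corrs = []
    for lag in range(max_lag + 1):
        if lag == 0:
            a, b = x, y
        else:
            a, b = x[:-lag], y[lag:]
        corrs.append(np.corrcoef(a, b)[0, 1])
    return np.array(corrs)

# Synthetic daily series: SST responds to the transport component q with a 5-day delay.
rng = np.random.default_rng(0)
q = rng.normal(size=2000)
sst = np.roll(q, 5) + 0.1 * rng.normal(size=2000)
sst[:5] = 0.0  # discard the wrap-around introduced by np.roll

corrs = lagged_correlation(q, sst, max_lag=10)
best_lag = int(np.argmax(np.abs(corrs)))
print(best_lag)  # 5: the built-in delay is recovered
```

On real series one would first remove trend and seasonality, as the study does, so that the lagged correlation reflects the upwelling response rather than the shared annual cycle.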
Empirical method to measure stochasticity and multifractality in nonlinear time series.
Lin, Chih-Hao; Chang, Chia-Seng; Li, Sai-Ping
2013-12-01
An empirical algorithm is used here to study the stochastic and multifractal nature of nonlinear time series. A parameter can be defined to quantitatively measure the deviation of the time series from a Wiener process, so that the stochasticity of different time series can be compared. The local volatility of the time series under study can be constructed using this algorithm, and the multifractal structure of the time series can be analyzed by using this local volatility. As an example, we employ this method to analyze financial time series from different stock markets. The results show that while developed markets evolve very much like an Itô process, the emergent markets are far from efficient. Differences in the multifractal structures and leverage effects between developed and emergent markets are discussed. The algorithm used here can be applied in a similar fashion to study time series of other complex systems. PMID:24483536
SensL B-Series and C-Series silicon photomultipliers for time-of-flight positron emission tomography
NASA Astrophysics Data System (ADS)
O'Neill, K.; Jackson, C.
2015-07-01
Silicon photomultipliers from SensL are designed for high performance, uniformity and low cost. They demonstrate a peak photon detection efficiency of 41% at 420 nm, which is matched to the output spectrum of cerium-doped lutetium orthosilicate. Coincidence resolving time of less than 220 ps is demonstrated. New process improvements have led to the development of the C-Series SiPM, which reduces the dark noise by over an order of magnitude. In this paper we show characterization test results, including photon detection efficiency, dark count rate, crosstalk probability, afterpulse probability and coincidence resolving time, comparing B-Series to the newest pre-production C-Series. Additionally, we discuss the effect of silicon photomultiplier microcell size on coincidence resolving time, allowing the optimal microcell size to be chosen for time-of-flight positron emission tomography systems.
FALSE DETERMINATIONS OF CHAOS IN SHORT NOISY TIME SERIES. (R828745)
A method (NEMG) proposed in 1992 for diagnosing chaos in noisy time series with 50 or fewer observations entails fitting the time series with an empirical function which predicts an observation in the series from previous observations, and then estimating the rate of divergenc...
Gómez-Extremera, Manuel; Carpena, Pedro; Ivanov, Plamen Ch; Bernaola-Galván, Pedro A
2016-04-01
We systematically study the scaling properties of the magnitude and sign of the fluctuations in correlated time series, which is a simple and useful approach to distinguish between systems with different dynamical properties but the same linear correlations. First, we decompose artificial long-range power-law linearly correlated time series into magnitude and sign series derived from the consecutive increments in the original series, and we study their correlation properties. We find analytical expressions for the correlation exponent of the sign series as a function of the exponent of the original series. Such expressions are necessary for modeling surrogate time series with desired scaling properties. Next, we study linear and nonlinear correlation properties of series composed as products of independent magnitude and sign series. These surrogate series can be considered as a zero-order approximation to the analysis of the coupling of magnitude and sign in real data, a problem still open in many fields. We find analytical results for the scaling behavior of the composed series as a function of the correlation exponents of the magnitude and sign series used in the composition, and we determine the ranges of magnitude and sign correlation exponents leading to either single scaling or to crossover behaviors. Finally, we obtain how the linear and nonlinear properties of the composed series depend on the correlation exponents of their magnitude and sign series. Based on this information we propose a method to generate surrogate series with controlled correlation exponent and multifractal spectrum.
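The decomposition into magnitude and sign series described above is straightforward to state in code. A minimal sketch; the example values are arbitrary:

```python
import numpy as np

def magnitude_sign_series(x):
    """Decompose a series into the magnitude and sign of its consecutive increments."""
    increments = np.diff(x)
    return np.abs(increments), np.sign(increments)

x = np.array([1.0, 3.0, 2.0, 2.0, 5.0])
mag, sign = magnitude_sign_series(x)
print(mag)   # [2. 1. 0. 3.]
print(sign)  # [ 1. -1.  0.  1.]
```

The correlation analysis in the abstract is then applied separately to `mag` (which carries the nonlinear, volatility-like information) and `sign` (which carries the linear directional information).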
Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo
2014-07-01
Reservoir computing is a recently introduced machine learning paradigm that has already shown excellent performance in the processing of empirical data. We study a particular kind of reservoir computers called time-delay reservoirs, which are constructed by sampling the solution of a time-delay differential equation, and show their good performance in forecasting the conditional covariances associated with multivariate discrete-time nonlinear stochastic processes of VEC-GARCH type, as well as in predicting actual daily market realized volatilities computed with intraday quotes, using daily log-return series of moderate size as training input. We tackle some problems associated with the lack of task-universality of individually operating reservoirs and propose a solution based on the use of parallel arrays of time-delay reservoirs. PMID:24732236
Record statistics of financial time series and geometric random walks.
Sabir, Behlool; Santhanam, M S
2014-09-01
The study of record statistics of correlated series in physics, such as random walks, is gaining momentum, and several analytical results have been obtained in the past few years. In this work, we study the record statistics of correlated empirical data for which random walk models have relevance. We obtain results for the record statistics of select stock market data and the geometric random walk, primarily through simulations. We show that the distribution of the age of records is a power law with the exponent α lying in the range 1.5≤α≤1.8. Further, the longest record ages follow the Fréchet distribution of extreme value theory. The record statistics of the geometric random walk series are in good agreement with those obtained from empirical stock data.
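The simulation approach mentioned above (record statistics of a geometric random walk) can be sketched as follows. The drift, volatility, series length, and seed are hypothetical, and the power-law claims in the abstract would require averaging over many realizations, not the single one shown:

```python
import numpy as np

def record_ages(series):
    """Times at which a new upper record is set, and the ages (gaps) between records."""
    record_times = [0]
    current_max = series[0]
    for t, v in enumerate(series[1:], start=1):
        if v > current_max:
            current_max = v
            record_times.append(t)
    ages = np.diff(record_times + [len(series)])
    return record_times, ages

rng = np.random.default_rng(1)
# Geometric random walk: multiplicative Gaussian steps, as for stock prices.
log_steps = rng.normal(loc=0.0, scale=0.02, size=1000)
prices = 100.0 * np.exp(np.cumsum(log_steps))

times, ages = record_ages(prices)
print(len(times), int(ages.max()))  # number of records, longest record age
```

Collecting `ages` over many simulated walks would give the age distribution whose tail exponent the paper estimates.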
Aerosol climate time series from ESA Aerosol_cci (Invited)
NASA Astrophysics Data System (ADS)
Holzer-Popp, T.
2013-12-01
developed further, to evaluate the datasets and their regional and seasonal merits. The validation showed that most datasets have improved significantly and that, in particular, PARASOL (ocean only) provides excellent results. The metrics for the AATSR (land and ocean) datasets are similar to those of MODIS and MISR, with AATSR better in some land regions and less good in some others (ocean). However, AATSR coverage is smaller than that of MODIS due to swath width. The MERIS dataset provides better coverage than AATSR but has lower quality (especially over land) than the other datasets. The synergetic AATSR/SCIAMACHY dataset also has lower quality. The evaluation of the pixel uncertainties shows promising first results but also reveals that more work needs to be done to provide comprehensive information for data assimilation. Users (MACC/ECMWF, AEROCOM) confirmed the relevance of this additional information and encouraged Aerosol_cci to release the current uncertainties. The paper summarizes and discusses the results of three years of work in Aerosol_cci, extracts the lessons learned, and concludes with an outlook on the work proposed for the next three years. In this second phase, a cyclic effort of algorithm evolution, dataset generation, validation and assessment will be applied to produce and further improve complete time series from all sensors under investigation; new sensors will be added (e.g. IASI), and preparations for the Sentinel missions will be made.
Evaluating the uncertainty of predicting future climate time series at the hourly time scale
NASA Astrophysics Data System (ADS)
Caporali, E.; Fatichi, S.; Ivanov, V. Y.
2011-12-01
A stochastic downscaling methodology is developed to generate hourly, point-scale time series for several meteorological variables, such as precipitation, cloud cover, shortwave radiation, air temperature, relative humidity, wind speed, and atmospheric pressure. The methodology uses multi-model General Circulation Model (GCM) realizations and an hourly weather generator, AWE-GEN. Probabilistic descriptions of factors of change (a measure of climate change with respect to historic conditions) are computed for several climate statistics and different aggregation times using a Bayesian approach that weights the individual GCM contributions. The Monte Carlo method is applied to sample the factors of change from their respective distributions, thereby permitting the generation of time series in an ensemble fashion that reflects both the uncertainty of future climate projections and the uncertainty of the downscaling procedure. Applications of the methodology and probabilistic expressions of certainty in reproducing future climates for the periods 2000-2009, 2046-2065 and 2081-2100, using the 1962-1992 period as the baseline, are discussed for the location of Firenze (Italy). The climate predictions for the period 2000-2009 are tested against observations, permitting an assessment of the reliability and uncertainties of the methodology in reproducing statistics of meteorological variables at different time scales.
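The ensemble step (sampling factors of change and applying them to a baseline series) can be illustrated with a toy additive factor on the mean. The baseline series, the N(+1.8, 0.6) factor distribution, and the ensemble size are all assumptions for illustration, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical historical hourly temperature-like series (baseline climate).
baseline = 15.0 + 5.0 * np.sin(np.linspace(0, 20 * np.pi, 5000)) + rng.normal(0, 1, 5000)

# Hypothetical posterior for the additive factor of change on the mean,
# e.g. combined from several weighted GCM projections: N(+1.8 °C, 0.6 °C).
n_ensemble = 200
factors = rng.normal(1.8, 0.6, size=n_ensemble)

# Ensemble of future series: each member applies one sampled factor of change.
ensemble = baseline[None, :] + factors[:, None]
print(ensemble.shape)  # (200, 5000): one row per sampled factor of change

# Spread of the member means reflects the projection uncertainty carried by the factors.
print(float(ensemble.mean(axis=1).std()))
```

The actual methodology perturbs many statistics at several aggregation times through the weather generator, but the ensemble logic is the same: one sampled factor set per member.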
Wang, Jin; Sun, Xiangping; Nahavandi, Saeid; Kouzani, Abbas; Wu, Yuchuan; She, Mary
2014-11-01
Biomedical time series clustering that automatically groups a collection of time series according to their internal similarity is of importance for medical record management and inspection, such as bio-signal archiving and retrieval. In this paper, a novel framework that automatically groups a set of unlabelled multichannel biomedical time series according to their internal structural similarity is proposed. Specifically, we treat a multichannel biomedical time series as a document and extract local segments from the time series as words. We extend a topic model, i.e., the Hierarchical probabilistic Latent Semantic Analysis (H-pLSA), which was originally developed for visual motion analysis, to cluster a set of unlabelled multichannel time series. The H-pLSA models each channel of the multichannel time series using a local pLSA in the first layer. The topics learned in the local pLSA are then fed to a global pLSA in the second layer to discover the categories of multichannel time series. Experiments on a dataset extracted from multichannel Electrocardiography (ECG) signals demonstrate that the proposed method performs better than previous state-of-the-art approaches and is relatively robust to variations of parameters, including the length of local segments and the dictionary size. Although the experimental evaluation used multichannel ECG signals in a biometric scenario, the proposed algorithm is a universal framework for clustering multichannel biomedical time series according to their structural similarity, which has many applications in biomedical time series management.
Studies in astronomical time series analysis: Modeling random processes in the time domain
NASA Technical Reports Server (NTRS)
Scargle, J. D.
1979-01-01
Random process models phrased in the time domain are used to analyze astrophysical time series data produced by random processes. A moving average (MA) model represents the data as a sequence of pulses occurring randomly in time, with random amplitudes. An autoregressive (AR) model represents the correlations in the process in terms of a linear function of past values. The best AR model is determined from sampled data and transformed to an MA for interpretation. The randomness of the pulse amplitudes is maximized by a FORTRAN algorithm which is relatively stable numerically. Results of test cases are given to study the effects of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the optical light curve of the quasar 3C 273 is given.
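The AR-to-MA step described above (fit an AR model to sampled data, then transform it to an MA representation for interpretation) can be sketched with a Yule-Walker fit and the AR impulse response. The AR(1) process with coefficient 0.7 is a made-up test case, not the quasar data:

```python
import numpy as np

def fit_ar_yule_walker(x, p):
    """Estimate AR(p) coefficients from sample autocovariances (Yule-Walker equations)."""
    x = x - x.mean()
    n = len(x)
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(p + 1)])
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    return np.linalg.solve(R, r[1:])

def ar_to_ma(phi, n_terms):
    """Impulse response of the fitted AR model = its equivalent MA pulse weights."""
    psi = np.zeros(n_terms)
    psi[0] = 1.0
    for k in range(1, n_terms):
        psi[k] = sum(phi[j] * psi[k - 1 - j] for j in range(min(len(phi), k)))
    return psi

# Made-up test case: an AR(1) pulse process with known coefficient 0.7.
rng = np.random.default_rng(3)
x = np.zeros(20000)
for t in range(1, 20000):
    x[t] = 0.7 * x[t - 1] + rng.normal()

phi = fit_ar_yule_walker(x, p=1)
psi = ar_to_ma(phi, n_terms=5)
print(phi)  # ≈ [0.7]
print(psi)  # ≈ [1, 0.7, 0.49, 0.34, 0.24]: exponentially decaying pulse weights
```

The MA weights `psi` are what the abstract calls the pulse representation: each innovation enters with weight 1 and decays geometrically.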
Study in the natural time domain of the entropy of dichotomic geoelectrical and chaotic time series
NASA Astrophysics Data System (ADS)
Ramírez-Rojas, A.; Telesca, L.; Angulo-Brown, F.
2010-12-01
The so-called seismo-electric signals (SES) have been considered as precursors of great earthquakes. To characterize possible SES activities, the Natural Time Domain (NTD) (Varotsos et al., 2001) was proposed as an adequate methodology. In this work we analyze two geoelectric time series measured in a very seismically active area of the South Pacific Mexican coast, and a chaotic time series obtained from the Liebovitch and Thot (LT) chaotic map. The two analyzed geoelectric signals display possible SES activities associated with the earthquakes that occurred on October 24, 1993 (M6.6, epicenter at (16.54N, 98.98W)) and on September 14, 1995 (M7.4, epicenter at (16.31N, 98.88W)). Our monitoring station was located at (16.50N, 99.47W), close to Acapulco city, and the experimental set-up was based on the VAN methodology. We found that the correlation degree of the SES geoelectric signals increases before the occurrence of the seismic events, with power spectrum and entropy calculated in NTD in good agreement with analogous studies in the field of earthquake-related phenomena. Such SES activity, analysed in NTD, can be discriminated from the LT chaotic map and from artificial noises. References: Varotsos, P.A., Sarlis, N.V., Skordas, E.S., Practica of Athens Academy 76 (2001) 294; Liebovitch, S.L. and Thot, T.I., J. Theor. Biol. 148 (1991) 243-267.
A hybrid algorithm for clustering of time series data based on affinity search technique.
Aghabozorgi, Saeed; Ying Wah, Teh; Herawan, Tutut; Jalab, Hamid A; Shaygan, Mohammad Amin; Jalali, Alireza
2014-01-01
Time series clustering is an important solution to various problems in numerous fields of research, including business, medical science, and finance. However, conventional clustering algorithms are not practical for time series data because they are essentially designed for static data. This impracticality results in poor clustering accuracy in several systems. In this paper, a new hybrid clustering algorithm is proposed based on the similarity in shape of time series data. Time series data are first grouped as subclusters based on similarity in time. The subclusters are then merged using the k-Medoids algorithm based on similarity in shape. This model has two contributions: (1) it is more accurate than other conventional and hybrid approaches, and (2) it determines the similarity in shape among time series data with a low complexity. To evaluate the accuracy of the proposed model, the model is tested extensively using synthetic and real-world time series datasets.
Piecewise aggregate representations and lower-bound distance functions for multivariate time series
NASA Astrophysics Data System (ADS)
Li, Hailin
2015-06-01
Dimensionality reduction is one of the most important methods to improve the efficiency of techniques applied to multivariate time series data mining. Because multivariate time series have both variable-based and time-based dimensions, reduction techniques must take both into consideration. To achieve this goal, we use a center sequence to represent a multivariate time series so that the new sequence can be seen as a univariate time series. Two sophisticated piecewise aggregate representations, piecewise aggregate approximation and symbolization applied to univariate time series, are then used to further represent the extended sequence derived from the center one. Furthermore, distance functions are designed to measure the similarity between two representations. Mathematical analysis proves that the proposed functions are lower bounds on Euclidean distance and dynamic time warping. In this way, false dismissals can be avoided when they are used to index the time series. In addition, multivariate time series with different lengths can be transformed into extended sequences of equal length, and their corresponding distance functions can measure the similarity between two unequal-length multivariate time series. The experimental results demonstrate that the proposed methods can reduce the dimensionality, and that their corresponding distance functions satisfy the lower-bound condition, which can speed up similarity search and indexing in multivariate time series datasets.
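The lower-bounding property of piecewise aggregate approximation (PAA) can be demonstrated directly for univariate series, which is the building block the paper extends to the multivariate case. A sketch with arbitrary random series:

```python
import numpy as np

def paa(x, n_segments):
    """Piecewise aggregate approximation: mean of each equal-length segment."""
    return x.reshape(n_segments, -1).mean(axis=1)

def lb_paa_distance(a_paa, b_paa, seg_len):
    """Lower bound on the Euclidean distance between the original series."""
    return np.sqrt(seg_len * np.sum((a_paa - b_paa) ** 2))

rng = np.random.default_rng(4)
a = rng.normal(size=128)
b = rng.normal(size=128)

seg = 8                                   # 16 segments of length 8
a_r, b_r = paa(a, 16), paa(b, 16)
lb = lb_paa_distance(a_r, b_r, seg_len=seg)
true_dist = np.sqrt(np.sum((a - b) ** 2))
print(lb <= true_dist)  # True: the PAA distance never exceeds the true distance
```

Because `lb` never exceeds the true Euclidean distance, pruning candidates whose lower bound already exceeds the current best match cannot cause false dismissals, which is exactly the indexing guarantee the abstract refers to.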
Correlated errors in geodetic time series: Implications for time-dependent deformation
Langbein, J.; Johnson, H.
1997-01-01
Analysis of frequent trilateration observations from the two-color electronic distance measuring networks in California demonstrates that the noise power spectra are dominated by white noise at higher frequencies and power law behavior at lower frequencies. In contrast, Earth scientists typically have assumed that only white noise is present in a geodetic time series, since a combination of infrequent measurements and low precision usually precludes identifying the time-correlated signature in such data. After removing a linear trend from the two-color data, it becomes evident that there are primarily two recognizable types of time-correlated noise present in the residuals. The first type is a seasonal variation in displacement, which is probably a result of measuring to shallow surface monuments installed in clayey soil that responds to seasonally occurring rainfall; this noise is significant only for a small fraction of the sites analyzed. The second type of correlated noise becomes evident only after spectral analysis of line length changes and shows a functional relation at long periods between power and frequency of the form P ∝ f^(-α), where f is frequency and α ≈ 2. With α = 2, this type of correlated noise is termed random-walk noise, and its source is mainly thought to be small random motions of geodetic monuments with respect to the Earth's crust, though other sources are possible. Because the line length changes in the two-color networks are measured at irregular intervals, power spectral techniques cannot reliably estimate the level of 1/f^α noise. Rather, we also use here a maximum likelihood estimation technique which assumes that there are only two sources of noise in the residual time series (white noise and random-walk noise) and estimates the amount of each. From this analysis we find that the random-walk noise level averages about 1.3 mm/√yr and that our estimates of the white noise component confirm theoretical limitations of the measurement technique.
Rotation in the Dynamic Factor Modeling of Multivariate Stationary Time Series.
ERIC Educational Resources Information Center
Molenaar, Peter C. M.; Nesselroade, John R.
2001-01-01
Proposes a special rotation procedure for the exploratory dynamic factor model for stationary multivariate time series. The rotation procedure applies separately to each univariate component series of a q-variate latent factor series and transforms such a component, initially represented as white noise, into a univariate moving-average.…
Scale and time dependence of serial correlations in word-length time series of written texts
NASA Astrophysics Data System (ADS)
Rodriguez, E.; Aguilar-Cornejo, M.; Femat, R.; Alvarez-Ramirez, J.
2014-11-01
This work considered the quantitative analysis of large written texts. To this end, the text was converted into a time series by taking the sequence of word lengths. Detrended fluctuation analysis (DFA) was used for characterizing long-range serial correlations of the time series. The DFA was implemented within a rolling window framework for estimating variations in correlation strength, quantified in terms of the scaling exponent, along the text. Also, a filtering derivative was used to compute the dependence of the scaling exponent on the scale. The analysis was applied to three famous English-language literary narrations: Alice in Wonderland (by Lewis Carroll), Dracula (by Bram Stoker) and Sense and Sensibility (by Jane Austen). The results showed that high correlations appear at scales of about 50-200 words, suggesting that at these scales the text contains the strongest coherence. The scaling exponent was not constant along the text, showing important variations with apparently cyclical behavior. An interesting coincidence between the scaling exponent variations and changes in narrative units (e.g., chapters) was found. This suggests that the scaling exponent obtained from the DFA is able to detect changes in narration structure as expressed by the usage of words of different lengths.
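A bare-bones DFA-1, of the kind applied above to word-length series, can be written as follows. The white-noise input (for which the scaling exponent should come out near 0.5) is a synthetic stand-in for a real word-length sequence:

```python
import numpy as np

def dfa_fluctuation(x, window):
    """RMS fluctuation around a linear detrend in non-overlapping windows (DFA-1)."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    n_win = len(y) // window
    t = np.arange(window)
    f2 = []
    for i in range(n_win):
        seg = y[i * window:(i + 1) * window]
        coef = np.polyfit(t, seg, 1)           # local linear trend
        f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
    return np.sqrt(np.mean(f2))

def dfa_exponent(x, windows):
    """Scaling exponent: slope of log F(n) versus log n."""
    f = [dfa_fluctuation(x, w) for w in windows]
    slope, _ = np.polyfit(np.log(windows), np.log(f), 1)
    return slope

# Uncorrelated "word lengths" should give an exponent near 0.5.
rng = np.random.default_rng(5)
word_lengths = rng.integers(1, 12, size=8192).astype(float)
alpha = dfa_exponent(word_lengths, windows=[16, 32, 64, 128, 256])
print(round(alpha, 2))  # ≈ 0.5 for uncorrelated input
```

The rolling-window variant in the paper simply repeats this computation on successive text segments, so that `alpha` becomes a function of position in the narration.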
Modeling PSInSAR time series without phase unwrapping
Zhang, L.; Ding, X.; Lu, Zhiming
2011-01-01
In this paper, we propose a least-squares-based method for multitemporal synthetic aperture radar interferometry that allows one to estimate deformations without the need for phase unwrapping. The method utilizes a series of multimaster wrapped differential interferograms with short baselines and focuses on arcs at which there are no phase ambiguities. An outlier detector is used to identify and remove the arcs with phase ambiguities, and a pseudoinverse of the variance-covariance matrix is used as the weight matrix of the correlated observations. The deformation rates at coherent points are estimated with a least squares model constrained by reference points. The proposed approach is verified with a set of simulated data. © 2006 IEEE.
Inverse sequential procedures for the monitoring of time series
NASA Technical Reports Server (NTRS)
Radok, Uwe; Brown, Timothy
1993-01-01
Climate changes traditionally have been detected from long series of observations and long after they happened. The 'inverse sequential' monitoring procedure is designed to detect changes as soon as they occur. Frequency distribution parameters are estimated both from the most recent existing set of observations and from the same set augmented by 1, 2, ..., j new observations. Individual-value probability products ('likelihoods') are then calculated which yield probabilities for erroneously accepting the existing parameter(s) as valid for the augmented data set and vice versa. A parameter change is signaled when these probabilities (or a more convenient and robust compound 'no change' probability) show a progressive decrease. New parameters are then estimated from the new observations alone to restart the procedure. The detailed algebra is developed and tested for Gaussian means and variances, Poisson and chi-square means, and linear or exponential trends; a comprehensive and interactive Fortran program is provided in the appendix.
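A drastically simplified version of the 'no change' likelihood idea, for a Gaussian mean with known variance only (the paper's algebra also covers variances, Poisson and chi-square means, and trends), might look like this; the data and shift size are invented:

```python
import math
import random

def normal_loglik(data, mu, sigma):
    """Gaussian log-likelihood of the data under fixed parameters."""
    return sum(-0.5 * math.log(2 * math.pi * sigma ** 2)
               - (v - mu) ** 2 / (2 * sigma ** 2) for v in data)

def no_change_logratio(baseline, new):
    """Log-likelihood ratio: new data under the baseline mean vs its own mean.
    Values near zero accept 'no change'; progressively negative values signal
    that the baseline parameters no longer describe the incoming observations."""
    mu0 = sum(baseline) / len(baseline)
    mu1 = sum(new) / len(new)
    sigma = 1.0  # noise level assumed known in this sketch
    return normal_loglik(new, mu0, sigma) - normal_loglik(new, mu1, sigma)

random.seed(6)
baseline = [random.gauss(0.0, 1.0) for _ in range(100)]
stable = [random.gauss(0.0, 1.0) for _ in range(30)]
shifted = [random.gauss(2.0, 1.0) for _ in range(30)]

r_stable = no_change_logratio(baseline, stable)
r_shift = no_change_logratio(baseline, shifted)
print(r_stable, r_shift)  # r_stable near 0, r_shift strongly negative
```

In the sequential setting this ratio would be recomputed as each new observation arrives, and a sustained decrease would trigger re-estimation of the parameters from the new observations alone.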
From time series to complex networks: The phase space coarse graining
NASA Astrophysics Data System (ADS)
Wang, Minggang; Tian, Lixin
2016-11-01
In this paper, we present a simple and fast computational method, the phase space coarse graining algorithm, that converts a time series into a directed and weighted complex network. The constructed directed and weighted complex network inherits several properties of the series in its structure: periodic series convert into regular networks, random series convert into random networks, and chaotic series convert into scale-free networks. It is shown that the phase space coarse graining algorithm allows us to distinguish, identify and describe in detail various time series. Finally, we apply the phase space coarse graining algorithm to a practical observed series, the international regular gasoline spot price series, and identify its dynamic characteristics.
Improving Post-Hurricane Katrina Forest Management with MODIS Time Series Products
NASA Technical Reports Server (NTRS)
Lewis, Mark David; Spruce, Joseph; Evans, David; Anderson, Daniel
2012-01-01
Hurricane damage to forests can be severe, causing millions of dollars of timber damage and loss. To help mitigate loss, state agencies require information on the location, intensity, and extent of damaged forests. NASA's MODerate Resolution Imaging Spectroradiometer (MODIS) Normalized Difference Vegetation Index (NDVI) time series data products offer a potential means for state agencies to monitor hurricane-induced forest damage and recovery across a broad region. In response, a project was conducted to produce and assess 250-meter forest disturbance and recovery maps for areas in southern Mississippi impacted by Hurricane Katrina. The products and capabilities from the project were compiled to aid the work of the Mississippi Institute for Forest Inventory (MIFI). A series of NDVI change detection products were computed to assess hurricane-induced damage and recovery. Hurricane-induced forest damage maps were derived by computing percent change between MODIS MOD13 16-day composited NDVI pre-hurricane "baseline" products (2003 and 2004) and post-hurricane NDVI products (2005). Recovery products were then computed in which post-storm 2006, 2007, 2008 and 2009 NDVI data were each singularly compared to the historical baseline NDVI. All percent NDVI change calculations considered the 16-day composite period of August 29 to September 13 for each year in the study. This provided percent change in the maximum NDVI for the two-week period just after the hurricane event and for each subsequent anniversary through 2009, resulting in forest disturbance products for 2005 and recovery products for the following 4 years. These disturbance and recovery products were produced for the MIFI Southeast Inventory District and also for the entire hurricane impact zone. MIFI forest inventory products were used as ground truth information for the project. Each NDVI percent change product was classified into 6 categories of forest disturbance intensity. Stand age
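The percent-change computation between baseline and post-hurricane NDVI composites reduces to simple array arithmetic. The grid values and disturbance-class thresholds below are hypothetical, not the project's actual data or class boundaries:

```python
import numpy as np

def ndvi_percent_change(baseline, post):
    """Percent change in NDVI relative to a pre-event baseline composite."""
    return 100.0 * (post - baseline) / baseline

# Hypothetical 16-day maximum-NDVI composites for a tiny 2x2 grid of pixels.
baseline = np.array([[0.80, 0.75], [0.70, 0.85]])
post     = np.array([[0.40, 0.60], [0.70, 0.68]])

change = ndvi_percent_change(baseline, post)
print(np.round(change, 1))
# [[-50. -20.]
#  [  0. -20.]]

# Classify disturbance intensity with hypothetical percent-change thresholds.
bins = [-100, -40, -15, 0]
classes = np.digitize(change, bins)
```

Each pixel's class then indicates disturbance severity; running the same computation against later anniversaries yields the recovery products.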
Inferring Time-Delayed Causal Gene Network Using Time-Series Expression Data.
Lo, Leung-Yau; Leung, Kwong-Sak; Lee, Kin-Hong
2015-01-01
Inferring a gene regulatory network (GRN) from microarray expression data is an important problem in bioinformatics, because knowing the GRN is an essential first step in understanding the inner workings of the cell and the related diseases. Time delays exist in the regulatory effects from one gene to another because of the time needed for transcription, translation, and the accumulation of a sufficient number of the needed proteins. It is also known that delays are important for oscillatory phenomena. Therefore, it is crucial to develop a causal gene network model, preferably as a function of time. In this paper, we propose an algorithm, CLINDE, to infer causal directed links in a GRN, with time delays and regulatory effects on the links, from time-series microarray gene expression data. In terms of features, it is among the most comprehensive of the state-of-the-art discrete gene network models. We have tested CLINDE on synthetic data, the in vivo IRMA (On and Off) datasets, and the [1] yeast expression data validated using KEGG pathways. Results show that CLINDE can effectively recover the links, time delays, and regulatory effects in the synthetic data, and outperforms other algorithms on the in vivo IRMA datasets.
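As a rough illustration of scoring a time-delayed link, a toy delayed-correlation scan might look like the following. This is a stand-in for the idea only, not CLINDE's actual algorithm, which also infers causal direction and prunes indirect links; all names and expression values are hypothetical:

```python
def best_lagged_link(regulator, target, max_delay):
    """Score a candidate time-delayed regulatory link by the delay d that
    maximizes |corr(regulator(t), target(t + d))|.  A toy stand-in for a
    delayed-association step; not the CLINDE algorithm itself."""
    def corr(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sxx = sum((a - mx) ** 2 for a in x)
        syy = sum((b - my) ** 2 for b in y)
        return sxy / (sxx * syy) ** 0.5
    return max(
        (abs(corr(regulator[:-d], target[d:])), d)
        for d in range(1, max_delay + 1)
    )

# The target copies its regulator two steps later, so the scan recovers
# delay 2 with correlation ~1 (expression values are illustrative):
reg = [0.0, 1.0, 0.0, 2.0, 1.0, 3.0, 0.0, 2.0, 1.0, 0.0]
tgt = [0.0, 0.0] + reg[:-2]
score, delay = best_lagged_link(reg, tgt, max_delay=3)
```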
Study of Track Irregularity Time Series Calibration and Variation Pattern at Unit Section
Jia, Chaolong; Wei, Lili; Wang, Hanning; Yang, Jiulin
2014-01-01
Focusing on data-quality problems in track irregularity time series, this paper first presents algorithms for abnormal data identification, data offset correction, local outlier identification, and noise cancellation. It then proposes decomposition and reconstruction of track irregularity time series through a wavelet decomposition and reconstruction approach. Finally, the patterns and features of the track irregularity standard deviation sequence in unit sections are studied, and the changing trend of the track irregularity time series is identified and described. PMID:25435869
NASA Technical Reports Server (NTRS)
Hailperin, Max
1993-01-01
This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that our techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. Our method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive methods.
Analysis of the temporal properties in car accident time series
NASA Astrophysics Data System (ADS)
Telesca, Luciano; Lovallo, Michele
2008-05-01
In this paper we study the time-clustering behavior of sequences of car accidents, using data from a freely available database on the internet. Allan Factor analysis, a well-suited method for investigating time-dynamical behaviors in point processes, reveals that the car accident sequences are characterized by a general time-scaling behavior, with the presence of cyclic components. These results indicate that the time dynamics of the events are not Poissonian but long-range correlated, with periodicities ranging from 12 h to 1 year.
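The Allan Factor statistic used above has a compact definition: with N_k the event count in the k-th window of duration T, AF(T) = <(N_{k+1} - N_k)^2> / (2<N_k>). A minimal sketch follows, driven by a synthetic Poisson stream (the car-accident data themselves are not reproduced here; the window length is an illustrative choice):

```python
import random

def allan_factor(event_times, window):
    """Allan Factor AF(T) = <(N_{k+1} - N_k)^2> / (2 <N_k>), with N_k the
    event count in the k-th window of length T.  AF stays near 1 for a
    Poisson process and grows as a power law for time-scaling (clustered)
    point processes."""
    n_win = int(max(event_times) // window)
    counts = [0] * n_win
    for t in event_times:
        k = int(t // window)
        if k < n_win:
            counts[k] += 1
    sq = [(counts[k + 1] - counts[k]) ** 2 for k in range(n_win - 1)]
    return (sum(sq) / len(sq)) / (2.0 * sum(counts) / n_win)

# Synthetic unit-rate Poisson stream: AF should hover near 1.
rng = random.Random(42)
t, events = 0.0, []
for _ in range(20000):
    t += rng.expovariate(1.0)   # exponential inter-event gaps
    events.append(t)
af = allan_factor(events, window=50.0)
```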
Rich, Virginia I; Pham, Vinh D; Eppley, John; Shi, Yanmei; DeLong, Edward F
2011-01-01
To investigate the temporal, spatial and phylogenetic resolution of marine microbial community structure and variability, we designed and expanded a genome proxy array (an oligonucleotide microarray targeting marine microbial genome fragments and genomes), evaluated it against metagenomic sequencing, and applied it to time-series samples from the Monterey Bay. The expanded array targeted 268 microbial genotypes across much of the known diversity of cultured and uncultured marine microbes. The target abundances measured by the array were highly correlated to pyrosequence-based abundances (linear regression R(2) = 0.85-0.91, P < 0.0001). Fifty-seven samples from ∼4 years in Monterey Bay were examined with the array, spanning the photic zone (0 m), the base of the surface mixed layer (30 m) and the subphotic zone (200 m). A significant portion of the expanded genome proxy array's targets showed signal (95 out of 268 targets present in ≥ 1 sample). The multi-year community survey showed the consistent presence of a core group of common and abundant targeted taxa at each depth in Monterey Bay, higher variability among shallow than deep samples, and episodic occurrences of more transient marine genotypes. The abundance of the most dominant genotypes peaked after strong episodic upwelling events. The genome-proxy array's ability to track populations of closely related genotypes indicated population shifts within several abundant target taxa, with specific populations in some cases clustering by depth or oceanographic season. Although 51 cultivated organisms were targeted (representing 19% of the array) the majority of targets detected and of total target signal (85% and ∼92% respectively) were from uncultivated genotypes, often those derived from Monterey Bay. The array provided a relatively cost-effective approach (∼$15 per array) for surveying the natural history of uncultivated lineages.
On the Interpretation of Running Trends as Summary Statistics for Time Series Analysis
NASA Astrophysics Data System (ADS)
Vigo, Isabel M.; Trottini, Mario; Belda, Santiago
2016-04-01
In recent years, running trends analysis (RTA) has been widely used in climate applied research as summary statistics for time series analysis. There is no doubt that RTA might be a useful descriptive tool, but despite its general use in applied research, precisely what it reveals about the underlying time series is unclear and, as a result, its interpretation is unclear too. This work contributes to such interpretation in two ways: 1) an explicit formula is obtained for the set of time series with a given series of running trends, making it possible to show that running trends, alone, perform very poorly as summary statistics for time series analysis; and 2) an equivalence is established between RTA and the estimation of a (possibly nonlinear) trend component of the underlying time series using a weighted moving average filter. Such equivalence provides a solid ground for RTA implementation and interpretation/validation.
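Running trends, as discussed above, are simply OLS slopes computed over a sliding window. A minimal sketch (the window length and input series are illustrative):

```python
def running_trends(y, window):
    """OLS slope in each sliding window of length `window`: the running-trend
    summary statistic of RTA, equivalent to a weighted moving average filter
    applied to the series."""
    x = list(range(window))
    x_mean = sum(x) / window
    sxx = sum((xi - x_mean) ** 2 for xi in x)
    trends = []
    for start in range(len(y) - window + 1):
        seg = y[start:start + window]
        y_mean = sum(seg) / window
        sxy = sum((xi - x_mean) * (yi - y_mean) for xi, yi in zip(x, seg))
        trends.append(sxy / sxx)
    return trends

# A purely linear series yields a constant running trend equal to its slope,
# one value per window position:
series = [2.0 * t + 1.0 for t in range(30)]
rt = running_trends(series, window=10)
```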
NASA Astrophysics Data System (ADS)
Muñoz-Diosdado, A.
2005-01-01
We analyzed databases with gait time series of healthy adults and persons with Parkinson's disease, Huntington's disease, and amyotrophic lateral sclerosis (ALS). We obtained staircase graphs of accumulated events that can be bounded by a straight line whose slope can be used to distinguish between gait time series from healthy and ill persons. The global Hurst exponents of these series do not show clear tendencies; we surmise this is because some gait time series have monofractal behavior while others have multifractal behavior, so they cannot be characterized with a single Hurst exponent. We calculated the multifractal spectra, obtained the spectra widths, and found that the spectra of healthy young persons are almost monofractal. The spectra of ill persons are wider than those of healthy persons. In contrast to interbeat time series, where pathology implies a loss of multifractality, in gait time series multifractal behavior emerges with the pathology. Data were collected from healthy and ill subjects as they walked a roughly circular path, with sensors on both feet, giving one time series for the left foot and another for the right. We first analyzed these time series separately and then compared the results, both directly and with a cross-correlation analysis, seeking differences between the two series that could serve as indicators of equilibrium problems.
Marwaha, Puneeta; Sunkaria, Ramesh Kumar
2016-09-01
The sample entropy (SampEn) has been widely used to quantify the complexity of RR-interval time series. It is a fact that higher complexity, and hence entropy, is associated with the RR-interval time series of healthy subjects. But SampEn suffers from the disadvantage that it assigns higher entropy to randomized surrogate time series, as well as to certain pathological time series, which is a misleading observation. This wrong estimation of the complexity of a time series may be due to the fact that the existing SampEn technique updates the threshold value as a function of the long-term standard deviation (SD) of a time series. However, the time series of certain pathologies exhibit substantial variability in beat-to-beat fluctuations, so the SD of the first-order difference (short-term SD) of the time series should be considered when updating the threshold value, to account for the period-to-period variations inherent in a time series. In the present work, improved sample entropy (I-SampEn), a new methodology, is proposed in which the threshold value is updated by considering the period-to-period variations of a time series. The I-SampEn technique assigns higher entropy values to age-matched healthy subjects than to patients suffering from atrial fibrillation (AF) and diabetes mellitus (DM). Our results are in agreement with the theory of reduced complexity of RR-interval time series in patients suffering from chronic cardiovascular and non-cardiovascular diseases.
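A minimal sketch of the proposed threshold update: the tolerance r is scaled by the SD of the first-order differences instead of the long-term SD. The template matching below is a simplified textbook SampEn, not the authors' implementation, and the scaling factor 0.2 and m = 2 are conventional choices rather than values taken from the paper:

```python
import math
import random

def sample_entropy(series, m=2, scale=0.2):
    """Simplified SampEn with an I-SampEn-style threshold: r is scaled by
    the SD of the first-order differences (short-term SD) rather than the
    long-term SD of the series."""
    diffs = [b - a for a, b in zip(series, series[1:])]
    mu = sum(diffs) / len(diffs)
    r = scale * math.sqrt(sum((d - mu) ** 2 for d in diffs) / len(diffs))

    def matches(length):
        # Count template pairs within Chebyshev distance r.
        tpl = [series[i:i + length] for i in range(len(series) - length + 1)]
        return sum(
            1
            for i in range(len(tpl))
            for j in range(i + 1, len(tpl))
            if max(abs(a - b) for a, b in zip(tpl[i], tpl[j])) <= r
        )

    b_count, a_count = matches(m), matches(m + 1)
    if a_count == 0 or b_count == 0:
        return float("inf")
    return -math.log(a_count / b_count)

# A strictly periodic series is far more regular (lower entropy) than noise:
e_reg = sample_entropy([0.0, 1.0] * 50)
rng = random.Random(7)
e_rand = sample_entropy([rng.random() for _ in range(100)])
```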
Introducing the US Ocean Carbon Biogeochemistry Subcommittee on Ocean Time-Series
NASA Astrophysics Data System (ADS)
Neuer, Susanne; Benway, Heather
2015-04-01
The objective of this presentation is to showcase activities of the Ocean Time-series Committee (OTC), a subcommittee of the scientific steering committee of the US Ocean Carbon & Biogeochemistry (OCB) Program (www.us-ocb.org). OCB is a scientific coordinating body that facilitates collaborative, interdisciplinary research opportunities and initiatives within the U.S. and with international partners. The OTC's focus is to highlight the importance of shipboard time-series as unique observing assets to the oceanographic community, and to encourage synergistic and collaborative technology and methods development, including development and validation of sensors and autonomous devices, and their possible integration into existing time-series observations. A major emphasis of the OTC has been to improve communication and collaboration among U.S. and international scientists engaged in ocean time-series science. For example, in 2012, OCB/OTC and the International Ocean Carbon Coordination Project (IOCCP) co-organized an international time-series workshop in Bermuda focused on biogeochemical time-series methods and data intercomparison. A key outcome of this workshop was a best practices guide for shipboard sampling and analytical protocols used at biogeochemical time-series sites and the development of a global time-series network to improve international coordination and communication among the operators of the >150 marine biogeochemical time-series. We hope that this presentation will stimulate a discussion of common goals and visions for the future of time-series observations and ways to enhance collaboration among the international time-series community.
Time and Learning: Scheduling for Success. Hot Topics Series.
ERIC Educational Resources Information Center
Kennedy, Robert L., Ed.; Witcher, Ann E., Ed.
This book provides information for educators considering ways to make the best use of time available for learning. Twenty-one articles are divided into 5 chapters. Chapter 1: "How Can We Make the Most of the School Day?" includes an overview and 6 articles: (1) "Block Scheduling" (Karen Irmsher); (2)"The Hybrid Schedule: Scheduling to the…
Mediating Relations: Therapeutic Discourse in American Prime Time Series.
ERIC Educational Resources Information Center
White, Mimi
Although "The Equalizer" and "Finder of Lost Loves" are different kinds of prime time fiction--urban thriller on the one hand and fantasy melodrama on the other--they share an underlying dramatic structure and symbolic problematic in their repeated enactments of a therapeutic cure overseen by a mediating, authority figure. The protagonists in both…
Pollution source analyses using 1-dimensional time series
NASA Astrophysics Data System (ADS)
Schaeffer, David J.; Corley, Charles; Chien, Harris
1983-09-01
Lake Holiday, a human-made recreational lake in northern Illinois, was threatened with closure due to high bacterial levels. A factorially designed experiment with multivariate responses was developed to study and identify the main sources of pollution. Data on total coliform, fecal coliform, fecal streptococci, dissolved oxygen, pH, ammonia-N, total phosphorus, and nitrate/nitrite-N were analyzed using regression models describing dam spillway loads as a function of source loads and time. The results suggest that the relationships among source, time, and load are complex, even though only two sources account for most of the lake's loading: Stevens Brook, which receives the discharge from the Somonauk Sewage Treatment Plant, and Somonauk Creek, which is the major drainage, contribute high loads of bacteria and nutrients to the lake. The influent loads contributed to the lake are discharged at the dam over about 4 weeks.
Series Solutions of Time-Fractional Host-Parasitoid Systems
NASA Astrophysics Data System (ADS)
Arafa, A. A. M.
2011-12-01
In this paper, Adomian's decomposition method (ADM) has been used for solving time-fractional host-parasitoid system. The derivatives are understood in the Caputo sense. The reason of using fractional order differential equations (FOD) is that FOD are naturally related to systems with memory which exists in most biological systems. Also they are closely related to fractals which are abundant in biological systems. Numerical example justifies the proposed scheme.
A new threshold selection method for peak over threshold for nonstationary time series
NASA Astrophysics Data System (ADS)
Zhou, C. R.; Chen, Y. F.; Gu, S. H.; Huang, Q.; Yuan, J. C.; Yu, S. N.
2016-08-01
In the context of global climate change, human activities dramatically damage the consistency of hydrological time series. Peak Over Threshold (POT) series have become an alternative to the traditional annual maximum series, but they are still underutilized due to their complexity. Most literature on POT tends to employ only one threshold, regardless of the non-stationarity of the whole series. Obviously, it is unwise to ignore the fact that a hydrological time series may no longer be a stationary stochastic process. Hence, in this paper, we take the daily runoff time series of the Yichang gauge station on the Yangtze River in China as an example and try to shed light on threshold selection given the non-stationarity of the time series. The Mann-Kendall test is applied to detect change points; different thresholds are then assigned to the sub-series delimited by the change points. Comparing the goodness-of-fit of the series with one threshold against the series with several thresholds clearly shows that employing different thresholds performs much better than fixing a single threshold for the selection of peak events.
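The Mann-Kendall statistic underlying the change-point step can be sketched as follows. This is the standard trend test without tie correction; the sequential change-point variant used in such studies builds on the same S statistic:

```python
import math

def mann_kendall_z(series):
    """Standard Mann-Kendall Z statistic (no tie correction): S counts
    concordant minus discordant pairs, then is normalized by its variance."""
    n = len(series)
    s = sum(
        (series[j] > series[i]) - (series[j] < series[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s == 0:
        return 0.0
    return (s - 1) / math.sqrt(var_s) if s > 0 else (s + 1) / math.sqrt(var_s)

# A monotone rise is flagged as a significant trend (|Z| > 1.96 at the 5%
# level), while a constant series is not:
z_up = mann_kendall_z([float(i) for i in range(30)])
z_flat = mann_kendall_z([5.0] * 30)
```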
Hosseini, Parsa; Tremblay, Arianne; Matthews, Benjamin F; Alkharouf, Nadim W
2012-01-01
With the advent of next-generation sequencing, -omics fields such as transcriptomics have experienced increases in data throughput of orders of magnitude. In terms of analyzing and visually representing these huge datasets, an intuitive and computationally tractable approach is to map quantified transcript expression onto biochemical pathways while employing data-mining and visualization principles to accelerate knowledge discovery. We present two cross-platform tools: MAPT (Mapping and Analysis of Pathways through Time) and PAICE (Pathway Analysis and Integrated Coloring of Experiments), an easy-to-use analysis suite to facilitate time-series and single-time-point transcriptomics analysis. In unison, MAPT and PAICE serve as a visual workbench for transcriptomics knowledge discovery, data mining, and functional annotation. PAICE and MAPT are two distinct but inextricably linked tools: the former is specifically designed to map EC accessions onto KEGG pathways while handling multiple gene copies, detection-call analysis, and UN/annotated EC accessions lacking quantifiable expression; the latter integrates PAICE datasets to drive visualization, annotation, and data mining. Availability: the tools are freely available at http://sourceforge.net/projects/paice/ and http://sourceforge.net/projects/mapt/ PMID:22493539
Use of Time-Series, ARIMA Designs to Assess Program Efficacy.
ERIC Educational Resources Information Center
Braden, Jeffery P.; And Others
1990-01-01
Illustrates use of time-series designs for determining efficacy of interventions with fictitious data describing drug-abuse prevention program. Discusses problems and procedures associated with time-series data analysis using Auto Regressive Integrated Moving Averages (ARIMA) models. Example illustrates application of ARIMA analysis for…
Transformation-cost time-series method for analyzing irregularly sampled data.
Ozken, Ibrahim; Eroglu, Deniz; Stemler, Thomas; Marwan, Norbert; Bagci, G Baris; Kurths, Jürgen
2015-06-01
Irregular sampling of data sets is one of the challenges often encountered in time-series analysis, since traditional methods cannot be applied and the frequently used interpolation approach can corrupt the data and bias subsequent analysis. Here we present the TrAnsformation-Cost Time-Series (TACTS) method, which allows us to analyze irregularly sampled data sets without degrading the quality of the data set. Instead of using interpolation, we consider time-series segments and determine how close they are to each other by determining the cost needed to transform one segment into the following one. Using a limited set of operations, each with an associated cost, to transform the time-series segments, we determine a new time series: our transformation-cost time series. This cost time series is regularly sampled and can be analyzed using standard methods. While our main interest is the analysis of paleoclimate data, we develop our method using numerical examples like the logistic map and the Rössler oscillator. The numerical data allow us to test the stability of our method against noise and for different irregular samplings. In addition, we provide guidance on how to choose the associated costs based on the time series at hand. The usefulness of the TACTS method is demonstrated using speleothem data from the Secret Cave in Borneo, which is a good proxy for paleoclimatic variability in the monsoon activity around the maritime continent.
Investigation of changes in characteristics of hydrological time series by Bayesian methods
NASA Astrophysics Data System (ADS)
Rao, A. Ramachandra; Tirtotjondro, Wahju
1996-11-01
A review of the literature reveals the inadequacy of intervention analysis and spectrum-based methods for quantifying changes in hydrologic time series. A Bayesian method is used to investigate the statistical significance of observed changes in hydrologic time series, and the results are reported herein. The Bayesian method is superior to the previous methods.
A Sandwich-Type Standard Error Estimator of SEM Models with Multivariate Time Series
ERIC Educational Resources Information Center
Zhang, Guangjian; Chow, Sy-Miin; Ong, Anthony D.
2011-01-01
Structural equation models are increasingly used as a modeling tool for multivariate time series data in the social and behavioral sciences. Standard error estimators of SEM models, originally developed for independent data, require modifications to accommodate the fact that time series data are inherently dependent. In this article, we extend a…
New Models for Forecasting Enrollments: Fuzzy Time Series and Neural Network Approaches.
ERIC Educational Resources Information Center
Song, Qiang; Chissom, Brad S.
Since university enrollment forecasting is very important, many different methods and models have been proposed by researchers. Two new methods for enrollment forecasting are introduced: (1) the fuzzy time series model; and (2) the artificial neural networks model. Fuzzy time series has been proposed to deal with forecasting problems within a…
Gradient radial basis function networks for nonlinear and nonstationary time series prediction.
Chng, E S; Chen, S; Mulgrew, B
1996-01-01
We present a method of modifying the structure of radial basis function (RBF) networks to work with nonstationary series that exhibit homogeneous nonstationary behavior. In the original RBF network, the hidden node's function is to sense the trajectory of the time series and to respond when there is a strong correlation between the input pattern and the hidden node's center. This type of response, however, is highly sensitive to changes in the level and trend of the time series. To counter these effects, the hidden node's function is modified to one which detects and reacts to the gradient of the series. We call this new network the gradient RBF (GRBF) model. Single and multistep predictive performance for the Mackey-Glass chaotic time series was evaluated using the classical RBF and GRBF models. The simulation results for the series without and with a time-varying mean confirm the superior performance of the GRBF predictor over the RBF predictor.
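The key GRBF idea, feeding the hidden nodes gradients (first differences) rather than raw levels so the response is invariant to shifts in the series' level, can be illustrated in a few lines. The function names and values are illustrative, not the paper's notation:

```python
import math

def rbf_response(x, center, width):
    """Gaussian hidden-node activation shared by RBF and GRBF networks."""
    dist_sq = sum((a - b) ** 2 for a, b in zip(x, center))
    return math.exp(-dist_sq / (2.0 * width ** 2))

def grbf_inputs(series, i, m):
    """GRBF feeds each hidden node the gradients (first differences) over
    the last m steps instead of raw levels, so the response is unchanged
    when the series shifts level."""
    window = series[i - m:i + 1]
    return [b - a for a, b in zip(window, window[1:])]

# Two windows at different levels but with identical shape give identical
# GRBF inputs (values are illustrative):
s = [1.0, 2.0, 4.0, 11.0, 12.0, 14.0]
g1 = grbf_inputs(s, 2, 2)   # from [1, 2, 4]
g2 = grbf_inputs(s, 5, 2)   # from [11, 12, 14]
```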
In situ time-series measurements of subseafloor sediment properties
Wheatcroft, R.A.; Stevens, A.W.; Johnson, R.V.
2007-01-01
The capabilities and diversity of subsurface sediment sensors lag significantly behind what is available for the water column, thereby limiting progress in understanding time-dependent seabed exchange and high-frequency acoustics. To help redress this imbalance, a new instrument, the autonomous sediment profiler (ASP), is described herein. ASP consists of a four-electrode, Wenner-type resistivity probe and a thermistor that log data at 0.1-cm vertical intervals over a 58-cm vertical profile. To avoid resampling the same spot on the seafloor, the probes are moved horizontally within a 20 × 100 cm area in one of three preselected patterns. Memory and power capacities permit sampling at hourly intervals for up to 3-mo duration. The system was tested in a laboratory tank and shown to be able to resolve high-frequency sediment consolidation, as well as changes in sediment roughness. In a field test off the southern coast of France, the system collected resistivity and temperature data at hourly intervals for 16 d. Coupled with environmental data collected on waves, currents, and suspended sediment, the ASP is shown to be useful for understanding the temporal evolution of subsurface sediment porosity, although no large depositional or erosional events occurred during the deployment. Following a rapid decrease in bottom-water temperature, the evolution of the subsurface temperature field was consistent with the 1-D thermal diffusion equation coupled with advection in the upper 3-4 cm. Collectively, the laboratory and field tests yielded promising results on time-dependent seabed change.
Analysis of time series of glacier speed: Columbia Glacier, Alaska
Walters, R.A.; Dunlap, W.W.
1987-01-01
During the summer of 1984 and 1985, laser measurements were made of the distance from a reference location to markers on the surface of the lower reach of Columbia Glacier, Alaska. The speed varies from 7 to 15 m/d and has three noteworthy components: 1) a low-frequency perturbation in speed with a time scale of days related to increased precipitation, 2) semidiurnal and diurnal variations related to sea tides, and 3) diurnal variations related to glacier surface melt. -from Authors
Alcohol Messages in Prime-Time Television Series
RUSSELL, CRISTEL ANTONIA; RUSSELL, DALE W.
2010-01-01
Alcohol messages contained in television programming serve as sources of information about drinking. To better understand the ways embedded messages about alcohol are communicated, it is crucial to objectively monitor and analyze television alcohol depictions. This article presents a content analysis of an eight-week sample of eighteen prime-time programs. Alcohol messages were coded based on modalities of presentation, level of plot connection, and valence. The analysis reveals that mixed messages about alcohol often coexist but the ways in which they are presented differ: whereas negative messages are tied to the plot and communicated verbally, positive messages are associated with subtle visual portrayals. PMID:21188281
On the Long-Term Correlations and Multifractal Properties of Electric Arc Furnace Time Series
NASA Astrophysics Data System (ADS)
Livi, Lorenzo; Maiorino, Enrico; Rizzi, Antonello; Sadeghian, Alireza
In this paper, we study long-term correlations and multifractal properties elaborated from time series of three-phase current signals from an industrial electric arc furnace. Implicit sinusoidal trends are suitably detected by considering the scaling of the fluctuation functions. Time series are then filtered via a Fourier-based analysis to remove such strong periodicities. In the filtered time series we detected long-term, positive correlations. The presence of positive correlations is in agreement with the typical V-I characteristic (hysteresis) of the electric arc furnace, thus providing a sound physical justification for the memory effects found in the current time series. The multifractal signature is strong enough in the filtered time series to be effectively classified as multifractal.
Time series analysis of the developed financial markets' integration using visibility graphs
NASA Astrophysics Data System (ADS)
Zhuang, Enyu; Small, Michael; Feng, Gang
2014-09-01
A time series representing the developed financial markets' segmentation from 1973 to 2012 is studied. The time series reveals an obvious market integration trend. To further uncover the features of this time series, we divide it into seven windows and generate seven visibility graphs. The measuring capabilities of the visibility graphs provide means to quantitatively analyze the original time series. It is found that the important historical incidents that influenced market integration coincide with variations in the measured graphical node degree. Through the measure of neighborhood span, the frequencies of the historical incidents are disclosed. Moreover, it is also found that large "cycles" and significant noise in the time series are linked to large and small communities in the generated visibility graphs. For large cycles, how historical incidents significantly affected market integration is distinguished by density and compactness of the corresponding communities.
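The natural visibility criterion used to turn a time series into a graph can be sketched directly from its definition; this is a brute-force version for clarity, not optimized for long series:

```python
def natural_visibility_graph(series):
    """Natural visibility graph: nodes are samples; (a, b) is an edge when
    every intermediate sample lies strictly below the straight line joining
    (a, series[a]) and (b, series[b])."""
    n = len(series)
    return {
        (a, b)
        for a in range(n - 1)
        for b in range(a + 1, n)
        if all(
            series[c] < series[a] + (series[b] - series[a]) * (c - a) / (b - a)
            for c in range(a + 1, b)
        )
    }

# Collinear points block each other's view; a valley does not:
g_line = natural_visibility_graph([1.0, 2.0, 3.0])    # (0, 2) is absent
g_valley = natural_visibility_graph([3.0, 1.0, 2.0])  # (0, 2) is present
```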
Evaluation of nonlinearity and validity of nonlinear modeling for complex time series
NASA Astrophysics Data System (ADS)
Suzuki, Tomoya; Ikeguchi, Tohru; Suzuki, Masuo
2007-10-01
Even if an original time series exhibits nonlinearity, it is not always effective to approximate the time series by a nonlinear model, because such nonlinear models have high complexity from the viewpoint of information criteria. Therefore, we propose two measures to evaluate both the nonlinearity of a time series and the validity of nonlinear modeling applied to it, using nonlinear predictability and information criteria. Through numerical simulations, we confirm that the proposed measures effectively detect the nonlinearity of an observed time series and evaluate the validity of the nonlinear model. The measures are also robust against observational noise. We also analyze some real time series: the difference in the numbers of chickenpox and measles patients, the number of sunspots, five Japanese vowels, and the chaotic laser. We confirm that the nonlinear model is effective for the Japanese vowel /a/, the difference in the number of measles patients, and the chaotic laser.
Time-series analysis of networks: Exploring the structure with random walks
NASA Astrophysics Data System (ADS)
Weng, Tongfeng; Zhao, Yi; Small, Michael; Huang, Defeng David
2014-08-01
We generate time series from scale-free networks based on a finite-memory random walk traversing the network. These time series reveal topological and functional properties of networks via their temporal correlations. Remarkably, networks with different node-degree mixing patterns exhibit distinct self-similar characteristics. In particular, assortative networks are transformed into time series with long-range correlation, while disassortative networks are transformed into time series exhibiting anticorrelation. These relationships are consistent across a diverse variety of real networks. Moreover, we show that multiscale analysis of these time series can describe and classify various physical networks ranging from social and technological to biological networks according to their functional origin. These results suggest that there is a unified dynamical mechanism that governs the structural organization of many seemingly different networks.
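A stripped-down version of the construction above: record the degree of each node visited by a random walk on the network. The walk here is memoryless for simplicity, whereas the paper uses a finite-memory walk; the example network is an illustrative toy:

```python
import random

def random_walk_series(adjacency, steps, seed=0):
    """Turn a network into a time series by recording the degree of each
    node visited by a random walk (memoryless sketch of the idea)."""
    rng = random.Random(seed)
    node = rng.choice(sorted(adjacency))
    series = []
    for _ in range(steps):
        series.append(len(adjacency[node]))
        node = rng.choice(sorted(adjacency[node]))
    return series

# Star network: hub 0 linked to leaves 1-4.  Every step alternates between
# the hub (degree 4) and a leaf (degree 1), so the series alternates too.
star = {0: {1, 2, 3, 4}, 1: {0}, 2: {0}, 3: {0}, 4: {0}}
ts = random_walk_series(star, steps=6)
```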
Ocean wavenumber estimation from wave-resolving time series imagery
Plant, N.G.; Holland, K.T.; Haller, M.C.
2008-01-01
We review several approaches that have been used to estimate ocean surface gravity wavenumbers from wave-resolving remotely sensed image sequences. Two fundamentally different approaches that utilize these data exist. A power spectral density approach identifies wavenumbers where image intensity variance is maximized. Alternatively, a cross-spectral correlation approach identifies wavenumbers where intensity coherence is maximized. We develop a solution to the latter approach based on a tomographic analysis that utilizes a nonlinear inverse method. The solution is tolerant to noise and other forms of sampling deficiency and can be applied to arbitrary sampling patterns, as well as to full-frame imagery. The solution includes error predictions that can be used for data retrieval quality control and for evaluating sample designs. A quantitative analysis of the intrinsic resolution of the method indicates that the cross-spectral correlation fitting improves resolution by a factor of about ten relative to the power spectral density fitting approach. The resolution analysis also provides a rule of thumb for nearshore bathymetry retrievals: short-scale cross-shore patterns may be resolved if they are about ten times longer than the average water depth over the pattern. This guidance can be applied to sample design to constrain both the sensor array (image resolution) and the analysis array (tomographic resolution). © 2008 IEEE.
Estimating Reliability of Disturbances in Satellite Time Series Data Based on Statistical Analysis
NASA Astrophysics Data System (ADS)
Zhou, Z.-G.; Tang, P.; Zhou, M.
2016-06-01
Normally, the status of land cover is inherently dynamic, changing continuously on a temporal scale. However, disturbances or abnormal changes of land cover, caused by events such as forest fires, floods, deforestation, and plant diseases, occur worldwide at unknown times and locations. Timely detection and characterization of these disturbances is important for land cover monitoring. Recently, many time-series-analysis methods have been developed for near real-time or online disturbance detection using satellite image time series. However, most present methods label the detection results only as "change/no change", and few focus on estimating the reliability (or confidence level) of the detected disturbances. To this end, this paper proposes a statistical analysis method for estimating the reliability of disturbances in newly available remote sensing image time series, through analysis of the full temporal information contained in the data. The method consists of three main steps: (1) segmenting and modelling historical time series data based on Breaks for Additive Seasonal and Trend (BFAST); (2) forecasting and detecting disturbances in new time series data; and (3) estimating the reliability of each detected disturbance using statistical analysis based on confidence intervals (CI) and confidence levels (CL). The method was validated by estimating the reliability of disturbance regions caused by a recent severe flood that occurred around the border of Russia and China. Results demonstrate that the method can estimate the reliability of disturbances detected in satellite imagery with an estimation error of less than 5% and overall accuracy of up to 90%.
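The reliability-estimation step can be caricatured with a plain normal model of the history standing in for the BFAST season/trend decomposition; the NDVI-like values and the normality assumption are illustrative only:

```python
import math

def disturbance_confidence(history, new_value):
    """Confidence level that `new_value` departs from the historical
    distribution: a two-sided normal confidence level computed from the
    z-score of the new observation against the history."""
    n = len(history)
    mu = sum(history) / n
    sd = math.sqrt(sum((h - mu) ** 2 for h in history) / (n - 1))
    z = abs(new_value - mu) / sd
    return math.erf(z / math.sqrt(2.0))

# Stable NDVI-like history, then a flood-like drop:
history = [0.71, 0.69, 0.72, 0.70, 0.68, 0.71, 0.70, 0.69]
cl_drop = disturbance_confidence(history, 0.35)   # near 1: reliable disturbance
cl_same = disturbance_confidence(history, 0.70)   # near 0: no disturbance
```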