Science.gov

Sample records for actual time series

  1. Time Series Explorer

    NASA Astrophysics Data System (ADS)

    Scargle, J.

    With the generation of long, precise, and finely sampled time series, the Age of Digital Astronomy is uncovering and elucidating energetic dynamical processes throughout the Universe. Realizing these opportunities requires effective data analysis techniques that rapidly and automatically implement advanced concepts. The Time Series Explorer, under development in collaboration with Tom Loredo, provides tools ranging from simple but optimal histograms to time- and frequency-domain analysis for arbitrary data modes with any time sampling. Examples of applying these tools to automated time series discovery will be given.

  2. Time Series Explorer

    NASA Astrophysics Data System (ADS)

    Loredo, Thomas

    The central objectives of the proposed Time Series Explorer project are to develop an organized collection of software tools for analysis of time series data in current and future NASA astrophysics data archives, and to make the tools available in two ways: as a library (the Time Series Toolbox) that individual science users can use to write their own data analysis pipelines, and as an application (the Time Series Automaton) providing an accessible, data-ready interface to many Toolbox algorithms, facilitating rapid exploration and automatic processing of time series databases. A number of time series analysis methods will be implemented, including techniques that range from standard ones to state-of-the-art developments by the proposers and others. Most of the algorithms will be able to handle time series data subject to real-world problems such as data gaps, sampling that is otherwise irregular, asynchronous sampling (in multi-wavelength settings), and data with non-Gaussian measurement errors. The proposed research responds to the ADAP element supporting the development of tools for mining the vast reservoir of information residing in NASA databases. The tools that will be provided to the community of astronomers studying variability of astronomical objects (from nearby stars and extrasolar planets through galactic and extragalactic sources) will revolutionize the quality of timing analyses that can be carried out, and greatly enhance the scientific throughput of all NASA astrophysics missions past, present, and future. The Automaton will let scientists explore time series (individual records or large databases) with the most informative and useful analysis methods available, without having to develop the tools themselves or understand the computational details. Both elements, the Toolbox and the Automaton, will enable deep but efficient exploratory time series data analysis, which is why we have named the project the Time Series Explorer.

  3. Pattern Recognition in Time Series

    NASA Astrophysics Data System (ADS)

    Lin, Jessica; Williamson, Sheri; Borne, Kirk D.; DeBarr, David

    2012-03-01

    … planetary transits), quasi-periodic variations (e.g., star spots, neutron star oscillations, active galactic nuclei), outburst events (e.g., accretion binaries, cataclysmic variable stars, symbiotic stars), transient events (e.g., gamma-ray bursts (GRB), flare stars, novae, supernovae (SNe)), stochastic variations (e.g., quasars, cosmic rays, luminous blue variables (LBVs)), and random events with precisely predictable patterns (e.g., microlensing events). Several such astrophysical phenomena are wavelength-specific cases, or were discovered as a result of wavelength-specific flux variations, such as soft gamma-ray repeaters, X-ray binaries, radio pulsars, and gravitational waves. Despite the wealth of discoveries in this space of time variability, there is still a vast unexplored region, especially at low flux levels and short time scales (see also the chapter by Bloom and Richards in this book). Figure 28.1 illustrates the gap in astronomical knowledge in this time-domain space. The LSST project aims to explore phenomena in the time gap. In addition to flux-based time series, astronomical data also include motion-based time series. These include the trajectories of planets, comets, and asteroids in the Solar System, the motions of stars around the massive black hole at the center of the Milky Way galaxy, and the motion of gas filaments in the interstellar medium (e.g., expanding supernova blast wave shells). In most cases, the motions measured in the time series correspond to the actual changing positions of the objects being studied. In other cases, the detected motions indirectly reflect other changes in the astronomical phenomenon, such as light echoes reflecting across vast gas and dust clouds, or propagating waves.

  4. Singular spectrum analysis for time series with missing data

    USGS Publications Warehouse

    Schoellhamer, D.H.

    2001-01-01

    Geophysical time series often contain missing data, which prevents analysis with many signal processing and multivariate tools. A modification of singular spectrum analysis for time series with missing data is developed and successfully tested with synthetic and actual incomplete time series of suspended-sediment concentration from San Francisco Bay. The method can also be used to low-pass filter incomplete time series.
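    The gap-filling idea can be sketched as an iterative SSA imputation: reconstruct the series from its leading singular components, overwrite only the missing entries with the reconstruction, and repeat. This is a generic variant of SSA gap filling, not necessarily Schoellhamer's exact modification; the window length, rank, and iteration count below are illustrative choices.

```python
import numpy as np

def ssa_reconstruct(x, window, rank):
    """Reconstruct a series from its leading SSA (singular spectrum) components."""
    n = len(x)
    k = n - window + 1
    # Trajectory (Hankel) matrix: each column is a lagged window of the series
    X = np.column_stack([x[i:i + window] for i in range(k)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    # Diagonal averaging maps the low-rank matrix back to a series
    rec = np.zeros(n)
    cnt = np.zeros(n)
    for j in range(k):
        rec[j:j + window] += Xr[:, j]
        cnt[j:j + window] += 1
    return rec / cnt

def ssa_fill(x, window=30, rank=2, iters=50):
    """Iteratively impute missing values (NaNs) using SSA reconstruction."""
    x = np.asarray(x, float).copy()
    miss = np.isnan(x)
    x[miss] = np.nanmean(x)            # crude initial guess for the gaps
    for _ in range(iters):
        rec = ssa_reconstruct(x, window, rank)
        x[miss] = rec[miss]            # update only the missing entries
    return x
```

    For an oscillatory signal (e.g. a tidal constituent in the sediment record), a rank-2 reconstruction captures one sinusoidal pair, and the iteration converges to a smooth interpolation across the gaps.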

  5. GPS Position Time Series @ JPL

    NASA Technical Reports Server (NTRS)

    Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen

    2013-01-01

    Different flavors of GPS time series analysis at JPL all start from the same GPS Precise Point Positioning raw time series; variations in time series analysis and post-processing are driven by different users:
    - JPL Global Time Series/Velocities: researchers studying the reference frame, combining with VLBI/SLR/DORIS
    - JPL/SOPAC Combined Time Series/Velocities: crustal deformation for tectonic, volcanic, and ground-water studies
    - ARIA Time Series/Coseismic Data Products: focused on hazard monitoring and response
    The ARIA data system is designed to integrate GPS and InSAR: GPS tropospheric delay is used for correcting InSAR, and Caltech's GIANT time series analysis uses GPS to correct orbital errors in InSAR. (Zhen Liu is talking tomorrow on InSAR time series analysis.)

  6. Permutations and time series analysis.

    PubMed

    Cánovas, Jose S; Guillamón, Antonio

    2009-12-01

    The main aim of this paper is to show how permutations can be useful in time series analysis. In particular, we introduce a test for checking the independence of a time series which is based on the number of admissible permutations on it. The main improvement in our test is that we are able to give a theoretical distribution for independent time series.
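    Counting admissible permutations can be sketched as follows (a generic ordinal-pattern count, not the authors' exact test statistic; the order m = 3 is an illustrative choice). For i.i.d. data all m! patterns should appear, while deterministic dynamics forbid some of them.

```python
import numpy as np

def ordinal_pattern_counts(x, m=3):
    """Count occurrences of each length-m ordinal pattern (permutation) in x."""
    x = np.asarray(x)
    counts = {}
    for i in range(len(x) - m + 1):
        pat = tuple(np.argsort(x[i:i + m]))   # rank ordering of the window
        counts[pat] = counts.get(pat, 0) + 1
    return counts

def num_admissible(x, m=3):
    """Number of distinct (admissible) ordinal patterns observed in x."""
    return len(ordinal_pattern_counts(x, m))
```

    For the fully chaotic logistic map x_{n+1} = 4x_n(1 - x_n), strictly decreasing triples never occur, so only 5 of the 6 order-3 patterns are admissible, whereas an i.i.d. series exhibits all 6.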

  7. FROG: Time-series analysis

    NASA Astrophysics Data System (ADS)

    Allan, Alasdair

    2014-06-01

    FROG performs time series analysis and display. It provides a simple user interface for astronomers wanting to do time-domain astrophysics but still offers the powerful features found in packages such as PERIOD (ascl:1406.005). FROG includes a number of tools for manipulation of time series. Among other things, the user can combine individual time series, detrend series (multiple methods) and perform basic arithmetic functions. The data can also be exported directly into the TOPCAT (ascl:1101.010) application for further manipulation if needed.

  8. Predicting Nonlinear Time Series

    DTIC Science & Technology

    1993-12-01

    response becomes R_j(k) = f(Σ_i W_ij V_i(k)) (2.4), where W_ij specifies the weight associated with the output of node i to the input of node j in the next layer and ... interconnections for each of these previous nodes. [Figure 5: Delay block for ATNN [9]] Thus, node j receives the ... computed values, a_j(t_n), and d_j(t_n) denotes the desired output of node j at time t_n. In this thesis, the weights and time delays update after each input

  9. Langevin equations from time series.

    PubMed

    Racca, E; Porporato, A

    2005-02-01

    We discuss the link between the approach to obtain the drift and diffusion of one-dimensional Langevin equations from time series, and Pope and Ching's relationship for stationary signals. The two approaches are based on different interpretations of conditional averages of the time derivatives of the time series at given levels. The analysis provides a useful indication for the correct application of Pope and Ching's relationship to obtain stochastic differential equations from time series and shows its validity, in a generalized sense, for nondifferentiable processes originating from Langevin equations.
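    The conditional averages that both approaches interpret can be sketched directly: bin the signal level, then average the increments (drift) and squared increments (diffusion) within each bin. This is a generic Kramers-Moyal estimation sketch, not the authors' derivation; the bin count and minimum-count threshold are illustrative choices.

```python
import numpy as np

def kramers_moyal(x, dt, nbins=20, min_count=500):
    """Estimate drift D1(x) = <dx | x>/dt and diffusion D2(x) = <dx^2 | x>/(2 dt)
    by conditional averages of increments in bins of the signal level."""
    dx = np.diff(x)
    xs = x[:-1]
    edges = np.linspace(xs.min(), xs.max(), nbins + 1)
    idx = np.digitize(xs, edges) - 1
    centers, drift, diff = [], [], []
    for b in range(nbins):
        m = idx == b
        if m.sum() >= min_count:               # skip poorly populated bins
            centers.append(0.5 * (edges[b] + edges[b + 1]))
            drift.append(dx[m].mean() / dt)
            diff.append((dx[m] ** 2).mean() / (2 * dt))
    return np.array(centers), np.array(drift), np.array(diff)
```

    Applied to a simulated Ornstein-Uhlenbeck process dx = -θx dt + σ dW, the estimated drift is linear with slope ≈ -θ and the diffusion is flat at ≈ σ²/2, recovering the generating Langevin equation.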

  10. Time series with tailored nonlinearities

    NASA Astrophysics Data System (ADS)

    Räth, C.; Laut, I.

    2015-10-01

    It is demonstrated how to generate time series with tailored nonlinearities by inducing well-defined constraints on the Fourier phases. Correlations between the phase information of adjacent phases and (static and dynamic) measures of nonlinearities are established and their origin is explained. By applying a set of simple constraints on the phases of an originally linear and uncorrelated Gaussian time series, the observed scaling behavior of the intensity distribution of empirical time series can be reproduced. The power law character of the intensity distributions being typical for, e.g., turbulence and financial data can thus be explained in terms of phase correlations.
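    The baseline that such phase manipulations start from is the standard linear surrogate: randomize all Fourier phases while keeping the power spectrum, which preserves linear correlations and destroys nonlinearities. A minimal sketch (generic, not the authors' specific phase constraints):

```python
import numpy as np

def phase_randomized_surrogate(x, rng):
    """Surrogate series with the same power spectrum but random Fourier phases."""
    n = len(x)
    X = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(X))
    phases[0] = 0.0                  # keep the DC component real
    if n % 2 == 0:
        phases[-1] = 0.0             # Nyquist bin must stay real for even n
    Xs = np.abs(X) * np.exp(1j * phases)
    return np.fft.irfft(Xs, n)
```

    Imposing correlations *between* adjacent phases, rather than drawing them independently as here, is the tailoring step the paper describes for reproducing the heavy-tailed intensity distributions of turbulence and financial data.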

  11. Economic Time-Series Page.

    ERIC Educational Resources Information Center

    Bos, Theodore; Culver, Sarah E.

    2000-01-01

    Describes the Economagic Web site, a comprehensive site of free economic time-series data that can be used for research and instruction. Explains that it contains 100,000+ economic data series from sources such as the Federal Reserve Banking System, the Census Bureau, and the Department of Commerce. (CMK)

  12. Entropy of electromyography time series

    NASA Astrophysics Data System (ADS)

    Kaufman, Miron; Zurcher, Ulrich; Sung, Paul S.

    2007-12-01

    A nonlinear analysis based on Renyi entropy is applied to electromyography (EMG) time series from back muscles. The time dependence of the entropy of the EMG signal exhibits a crossover from a subdiffusive regime at short times to a plateau at longer times. We argue that this behavior characterizes complex biological systems. The plateau value of the entropy can be used to differentiate between healthy and low back pain individuals.
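    A histogram-based Renyi entropy estimate can be sketched as follows (generic; the bin count is an illustrative choice, and the time-dependent entropy described above would evaluate this over growing or sliding windows of the EMG signal):

```python
import numpy as np

def renyi_entropy(x, q=2.0, bins=50):
    """Renyi entropy of order q from a normalized histogram of the signal values."""
    p, _ = np.histogram(x, bins=bins)
    p = p[p > 0] / p.sum()
    if q == 1.0:
        return -np.sum(p * np.log(p))        # Shannon entropy as the q -> 1 limit
    return np.log(np.sum(p ** q)) / (1.0 - q)
```

    Renyi entropy is non-increasing in q, so the q = 2 value never exceeds the Shannon (q = 1) value, and both are bounded above by log(bins).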

  13. Random time series in astronomy.

    PubMed

    Vaughan, Simon

    2013-02-13

    Progress in astronomy comes from interpreting the signals encoded in the light received from distant objects: the distribution of light over the sky (images), over photon wavelength (spectrum), over polarization angle and over time (usually called light curves by astronomers). In the time domain, we see transient events such as supernovae, gamma-ray bursts and other powerful explosions; we see periodic phenomena such as the orbits of planets around nearby stars, radio pulsars and pulsations of stars in nearby galaxies; and we see persistent aperiodic variations ('noise') from powerful systems such as accreting black holes. I review just a few of the recent and future challenges in the burgeoning area of time domain astrophysics, with particular attention to persistently variable sources, the recovery of reliable noise power spectra from sparsely sampled time series, higher order properties of accreting black holes, and time delays and correlations in multi-variate time series.

  14. Time series analysis of injuries.

    PubMed

    Martinez-Schnell, B; Zaidi, A

    1989-12-01

    We used time series models in the exploratory and confirmatory analysis of selected fatal injuries in the United States from 1972 to 1983. We built autoregressive integrated moving average (ARIMA) models for monthly, weekly, and daily series of deaths and used these models to generate hypotheses. These deaths resulted from six causes of injuries: motor vehicles, suicides, homicides, falls, drownings, and residential fires. For each cause of injury, we estimated calendar effects on the monthly death counts. We confirmed the significant effect of vehicle miles travelled on motor vehicle fatalities with a transfer function model. Finally, we applied intervention analysis to deaths due to motor vehicles.
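    ARIMA models are normally fit with standard statistical packages; the autoregressive core of such models can be sketched with Yule-Walker estimation (a minimal sketch under that simplification, not the authors' full ARIMA, transfer-function, and intervention analysis):

```python
import numpy as np

def yule_walker_ar(x, order):
    """Estimate AR(p) coefficients from sample autocovariances (Yule-Walker)."""
    x = np.asarray(x, float) - np.mean(x)
    n = len(x)
    acov = np.array([x[:n - k] @ x[k:] / n for k in range(order + 1)])
    # Toeplitz system R phi = r built from the autocovariances
    R = np.array([[acov[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, acov[1:order + 1])
```

    On a simulated AR(1) series with coefficient 0.7, the estimate recovers the coefficient to within sampling error.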

  15. Inductive time series modeling program

    SciTech Connect

    Kirk, B.L.; Rust, B.W.

    1985-10-01

    A number of features that comprise environmental time series share a common mathematical behavior. Analysis of the Mauna Loa carbon dioxide record and other time series is aimed at constructing mathematical functions which describe as many major features of the data as possible. A trend function is fit to the data, removed, and the resulting residuals analyzed for any significant behavior. This is repeated until the residuals are driven to white noise. In the following discussion, the concept of trend will include cyclic components. The mathematical tools and program packages used are VARPRO (Golub and Pereyra 1973), for the least squares fit, and a modified version of our spectral analysis program (Kirk et al. 1979), for spectrum and noise analysis. The program is written in FORTRAN. All computations are done in double precision, except for the plotting calls where the DISSPLA package is used. The core requirement varies between 600 K and 700 K. The program is implemented on the IBM 360/370. Currently, the program can analyze up to five different time series where each series contains no more than 300 points. 12 refs.

  16. Modeling North Pacific Time Series

    NASA Astrophysics Data System (ADS)

    Overland, J. E.; Percival, D. B.; Mofjeld, H. O.

    2002-05-01

    We present a case study in modeling the North Pacific (NP) index, a time series of the wintertime Aleutian low sea level pressure from 1900 to 1999. We consider three statistical models, namely, a Gaussian stationary autoregressive process, a Gaussian fractionally differenced (FD) or "long-memory" process, and a "signal plus noise" process consisting of a square wave oscillation with a pentadecadal period embedded in Gaussian white noise. Each model depends upon three parameters, so all three models are equally simple. The shortness of the time series makes it unrealistic to formally prefer one model over the others: we estimate it would take a 300-year record to differentiate between the models. Although the models fit equally well, they have quite different implications for the long-term behavior of the NP index, e.g. the generation of regimes of characteristic lengths. Additional information and physical arguments may add support for a particular model. The FD ("long-memory") process suggests multiple physical contributions with different damping constants; many North Pacific biological time series, which are influenced by atmospheric and oceanic processes, show regime-like ecosystem reorganizations.
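    A fractionally differenced process of the kind considered here can be simulated from its truncated moving-average representation, whose weights follow the standard recursion ψ_k = ψ_{k-1}(k - 1 + d)/k (the truncation length below is an illustrative choice):

```python
import numpy as np

def fd_noise(n, d, rng, ntrunc=1000):
    """Fractionally differenced ('long-memory') Gaussian noise, 0 < d < 0.5,
    generated from a truncated MA(ntrunc) representation."""
    psi = np.empty(ntrunc)
    psi[0] = 1.0
    for k in range(1, ntrunc):
        psi[k] = psi[k - 1] * (k - 1 + d) / k     # MA weights of (1-B)^{-d}
    e = rng.standard_normal(n + ntrunc)
    return np.convolve(e, psi, mode="valid")[:n]
```

    For d = 0.4 the theoretical lag-1 autocorrelation is d/(1 - d) ≈ 0.67, noticeably larger than for a short-memory AR process with comparable spectra at high frequencies.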

  17. Introduction to Time Series Analysis

    NASA Technical Reports Server (NTRS)

    Hardin, J. C.

    1986-01-01

    The field of time series analysis is explored from its logical foundations to the most modern data analysis techniques. The presentation is developed, as far as possible, for continuous data, so that the inevitable use of discrete mathematics is postponed until the reader has gained some familiarity with the concepts. The monograph seeks to provide the reader with both the theoretical overview and the practical details necessary to correctly apply the full range of these powerful techniques. In addition, the last chapter introduces many specialized areas where research is currently in progress.

  18. Multiple Indicator Stationary Time Series Models.

    ERIC Educational Resources Information Center

    Sivo, Stephen A.

    2001-01-01

    Discusses the propriety and practical advantages of specifying multivariate time series models in the context of structural equation modeling for time series and longitudinal panel data. For time series data, the multiple indicator model specification improves on classical time series analysis. For panel data, the multiple indicator model…

  19. Detecting chaos from time series

    NASA Astrophysics Data System (ADS)

    Xiaofeng, Gong; Lai, C. H.

    2000-02-01

    In this paper, an entirely data-based method to detect chaos from the time series is developed by introducing ε_p-neighbour points (the p-step ε-neighbour points). We demonstrate that for deterministic chaotic systems there exists a linear relationship between the logarithm of the average number of ε_p-neighbour points, ln n_{p,ε}, and the time step p. The coefficient can be related to the KS entropy of the system. The effects of the embedding dimension and noise are also discussed.
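    A brute-force version of the construction can be sketched directly; for the fully chaotic logistic map the slope of ln n_p versus p should approximate the negative KS entropy, -ln 2 (the ε value and series length below are illustrative choices):

```python
import numpy as np

def avg_p_neighbours(x, eps, pmax):
    """Average number of eps-p-neighbours: pairs (i, j) whose trajectories stay
    within eps of each other for all time steps 0..p."""
    m = len(x) - pmax
    close = np.ones((m, m), bool)
    np.fill_diagonal(close, False)           # exclude self-pairs
    counts = []
    for p in range(pmax + 1):
        d = np.abs(x[p:p + m, None] - x[None, p:p + m])
        close &= d < eps                     # pair must still be close at step p
        counts.append(close.sum(axis=1).mean())
    return np.array(counts)
```

    The O(n²) pairwise comparison is the simplest possible implementation; for long series a neighbour-search structure would be needed.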

  20. Sparse Representation for Time-Series Classification

    DTIC Science & Technology

    2015-02-08

    World Scientific Review Volume, February 8, 2015. Chapter 1: Sparse Representation for Time-Series Classification ... studies the problem of time-series classification and presents an overview of recent developments in the area of feature extraction and information ... problem of target classification, and more generally time-series classification, in two main directions: feature extraction and information fusion.

  1. Regenerating time series from ordinal networks.

    PubMed

    McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael

    2017-03-01

    Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discretely sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series, using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.
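    The construct-and-regenerate loop can be sketched as follows (a minimal version with order-3 ordinal patterns and unit stride; the paper's network construction and surrogate tests are more elaborate):

```python
import numpy as np
from collections import defaultdict

def ordinal_symbols(x, m=3):
    """Map a series to its sequence of length-m ordinal patterns."""
    return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

def build_ordinal_network(x, m=3):
    """Weighted transition network between successive ordinal patterns."""
    sym = ordinal_symbols(x, m)
    trans = defaultdict(lambda: defaultdict(int))
    for a, b in zip(sym[:-1], sym[1:]):
        trans[a][b] += 1                      # edge weight = transition count
    return trans

def random_walk(trans, start, steps, rng):
    """Regenerate a symbol sequence by a weighted random walk on the network."""
    seq = [start]
    node = start
    for _ in range(steps):
        nbrs = list(trans[node])
        if not nbrs:                          # dead end: no observed transition
            break
        w = np.array([trans[node][b] for b in nbrs], float)
        node = nbrs[rng.choice(len(nbrs), p=w / w.sum())]
        seq.append(node)
    return seq
```

    Every consecutive pair in the regenerated walk is, by construction, a transition that occurred in the original series, which is the sense in which the network is a stochastic approximation of the deterministic dynamics.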

  2. Regenerating time series from ordinal networks

    NASA Astrophysics Data System (ADS)

    McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael

    2017-03-01

    Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discretely sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series, using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.

  3. A Review of Subsequence Time Series Clustering

    PubMed Central

    Teh, Ying Wah

    2014-01-01

    Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and backgrounds related to subsequence time series clustering. The categorization of the literature reviews is divided into three groups: preproof, interproof, and postproof period. Moreover, various state-of-the-art approaches in performing subsequence time series clustering are discussed under each of the following categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies. PMID:25140332

  4. A review of subsequence time series clustering.

    PubMed

    Zolhavarieh, Seyedjamal; Aghabozorgi, Saeed; Teh, Ying Wah

    2014-01-01

    Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and backgrounds related to subsequence time series clustering. The categorization of the literature reviews is divided into three groups: preproof, interproof, and postproof period. Moreover, various state-of-the-art approaches in performing subsequence time series clustering are discussed under each of the following categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies.

  5. Association mining of dependency between time series

    NASA Astrophysics Data System (ADS)

    Hafez, Alaaeldin

    2001-03-01

    Time series analysis is considered a crucial component of strategic control over a broad variety of disciplines in business, science and engineering. Time series data is a sequence of observations collected over intervals of time. Each time series describes a phenomenon as a function of time. Analysis of time series data includes discovering trends (or patterns) in a time series sequence. In the last few years, data mining has emerged and been recognized as a new technology for data analysis. Data mining is the process of discovering potentially valuable patterns, associations, trends, sequences and dependencies in data. Data mining techniques can discover information that many traditional business analysis and statistical techniques fail to deliver. In this paper, we adapt data mining techniques to analyze time series data. By using data mining techniques, maximal frequent patterns are discovered and used in predicting future sequences or trends, where trends describe the behavior of a sequence. In order to include different types of time series (e.g. irregular and non-systematic), we consider past frequent patterns of the same time sequences (local patterns) and of other dependent time sequences (global patterns). We use the word 'dependent' instead of the word 'similar' to emphasize real-life time series where two time series sequences could be completely different (in values, shapes, etc.), but still react to the same conditions in a dependent way. In this paper, we propose the Dependence Mining Technique that can be used in predicting time series sequences. The proposed technique consists of three phases: (a) for all time series sequences, generate their trend sequences; (b) discover maximal frequent trend patterns and generate pattern vectors (to keep information about frequent trend patterns); and (c) use trend pattern vectors to predict future time series sequences.
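    Phase (a), turning a numeric series into a trend sequence, can be sketched as follows (a minimal interpretation; the U/D/F symbol alphabet and the flat tolerance are illustrative assumptions, not the paper's exact encoding):

```python
import numpy as np

def trend_sequence(x, flat_tol=0.0):
    """Map a numeric series to a trend sequence of symbols:
    'U' (up), 'D' (down), 'F' (flat within flat_tol)."""
    d = np.diff(np.asarray(x, float))
    return "".join(
        "F" if abs(v) <= flat_tol else ("U" if v > 0 else "D") for v in d
    )
```

    Frequent-pattern mining then operates on these symbol strings rather than on raw values, which is what makes local and global (cross-series) patterns comparable.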

  6. TSAN: a package for time series analysis.

    PubMed

    Wang, D C; Vagnucci, A H

    1980-04-01

    Many biomedical data are in the form of time series. Analyses of these data include: (1) search for any biorhythm; (2) test of homogeneity of several time series; (3) assessment of stationarity; (4) test of normality of the time series histogram; (5) evaluation of dependence between data points. In this paper we present a subroutine package called TSAN. It is developed to accomplish these tasks. Computational methods, as well as flowcharts, for these subroutines are described. Two sample runs are demonstrated.

  7. The Theory of Standardized Time Series.

    DTIC Science & Technology

    1985-04-01

    The method of standardized time series produces asymptotically valid confidence intervals for steady-state parameters. However, these intervals are ... Keywords: simulation output analysis; confidence intervals; standardized time series; functional ...

  8. Forecasting Enrollments with Fuzzy Time Series.

    ERIC Educational Resources Information Center

    Song, Qiang; Chissom, Brad S.

    The concept of fuzzy time series is introduced and used to forecast the enrollment of a university. Fuzzy time series, an aspect of fuzzy set theory, forecasts enrollment using a first-order time-invariant model. To evaluate the model, the conventional linear regression technique is applied and the predicted values obtained are compared to the…

  9. Statistical criteria for characterizing irradiance time series.

    SciTech Connect

    Stein, Joshua S.; Ellis, Abraham; Hansen, Clifford W.

    2010-10-01

    We propose and examine several statistical criteria for characterizing time series of solar irradiance. Time series of irradiance are used in analyses that seek to quantify the performance of photovoltaic (PV) power systems over time. Time series of irradiance are either measured or are simulated using models. Simulations of irradiance are often calibrated to or generated from statistics for observed irradiance and simulations are validated by comparing the simulation output to the observed irradiance. Criteria used in this comparison should derive from the context of the analyses in which the simulated irradiance is to be used. We examine three statistics that characterize time series and their use as criteria for comparing time series. We demonstrate these statistics using observed irradiance data recorded in August 2007 in Las Vegas, Nevada, and in June 2009 in Albuquerque, New Mexico.
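    The abstract does not name the three statistics; as a purely illustrative sketch, summary criteria for comparing measured and simulated irradiance might include the mean, the standard deviation, and a ramp-rate quantile (all hypothetical choices here, not the paper's):

```python
import numpy as np

def irradiance_stats(g, dt=1.0):
    """Summary statistics for comparing irradiance time series (hypothetical
    criteria: mean level, variability, and 95th-percentile absolute ramp rate)."""
    ramps = np.diff(np.asarray(g, float)) / dt    # W/m^2 per time step
    return {
        "mean": float(np.mean(g)),
        "std": float(np.std(g)),
        "ramp_p95": float(np.quantile(np.abs(ramps), 0.95)),
    }
```

    A simulated series would then be judged acceptable if its statistics fall within some tolerance of the observed ones, with the tolerance set by the PV performance analysis the series feeds.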

  10. Generation of artificial helioseismic time-series

    NASA Technical Reports Server (NTRS)

    Schou, J.; Brown, T. M.

    1993-01-01

    We present an outline of an algorithm to generate artificial helioseismic time-series, taking into account as much as possible of the knowledge we have on solar oscillations. The hope is that it will be possible to find the causes of some of the systematic errors in analysis algorithms by testing them with such artificial time-series.

  11. Linear Relations in Time Series Models. I.

    ERIC Educational Resources Information Center

    Villegas, C.

    1976-01-01

    A multiple time series is defined as the sum of an autoregressive process on a line and independent Gaussian white noise or a hyperplane that goes through the origin and intersects the line at a single point. This process is a multiple autoregressive time series in which the regression matrices satisfy suitable conditions. For a related article…

  12. On reconstruction of time series in climatology

    NASA Astrophysics Data System (ADS)

    Privalsky, V.; Gluhovsky, A.

    2015-10-01

    The approach to time series reconstruction in climatology based upon cross-correlation coefficients and regression equations is mathematically incorrect because it ignores the dependence of time series upon their past. The proper method described here for the bivariate case requires autoregressive time- and frequency-domain modeling of the time series that contains simultaneous observations of both scalar series, with subsequent application of the model to restore the shorter one into the past. The method presents a further development of previous efforts by a number of authors, starting from A. Douglass, who introduced some concepts of time series analysis into paleoclimatology. The method is applied to the monthly data of total solar irradiance (TSI), 1979-2014, and sunspot numbers (SSN), 1749-2014, to restore the TSI data over 1749-1978. The results of the reconstruction are in statistical agreement with observations.

  13. A radar image time series

    NASA Technical Reports Server (NTRS)

    Leberl, F.; Fuchs, H.; Ford, J. P.

    1981-01-01

    A set of ten side-looking radar images of a mining area in Arizona that were aquired over a period of 14 yr are studied to demonstrate the photogrammetric differential-rectification technique applied to radar images and to examine changes that occurred in the area over time. Five of the images are rectified by using ground control points and a digital height model taken from a map. Residual coordinate errors in ground control are reduced from several hundred meters in all cases to + or - 19 to 70 m. The contents of the radar images are compared with a Landsat image and with aerial photographs. Effects of radar system parameters on radar images are briefly reviewed.

  14. Entropic Analysis of Electromyography Time Series

    NASA Astrophysics Data System (ADS)

    Kaufman, Miron; Sung, Paul

    2005-03-01

    We are in the process of assessing the effectiveness of fractal and entropic measures for the diagnosis of low back pain from surface electromyography (EMG) time series. Surface EMG is used to assess patients with low back pain. In a typical EMG measurement, the voltage is measured every millisecond. We observed back muscle fatiguing during one minute, which results in a time series with 60,000 entries. We characterize the complexity of the time series by computing the time dependence of the Shannon entropy. The analysis of time series from different relevant muscles of healthy and low back pain (LBP) individuals provides evidence that the level of variability of back muscle activities is much larger for healthy individuals than for individuals with LBP. In general, the time dependence of the entropy shows a crossover from a diffusive regime to a regime characterized by long-time correlations (self-organization) at about 0.01 s.

  15. Correlation measure to detect time series distances, whence economy globalization

    NASA Astrophysics Data System (ADS)

    Miśkiewicz, Janusz; Ausloos, Marcel

    2008-11-01

    An instantaneous time series distance is defined through the equal-time correlation coefficient. The idea is applied to the Gross Domestic Product (GDP) yearly increments of 21 rich countries between 1950 and 2005 in order to test the process of economic globalisation. Some data discussion is first presented to decide which (EKS, GK, or derived) GDP series should be studied. Distances are then calculated from the correlation coefficient values between pairs of series. The role of time averaging of the distances over finite-size windows is discussed. Three network structures are next constructed based on the hierarchy of distances. It is shown that the mean distance between the most developed countries on several networks actually decreases in time, which we consider a proof of globalisation. An empirical law is found for the evolution after 1990, similar to that found in flux creep. The optimal observation time window size is found to be ≃15 years.
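
    The distance construction can be illustrated as follows, assuming the common convention d = sqrt(2(1 - c)), which maps correlation c = 1 to distance 0 and c = -1 to distance 2 (the paper's exact definition may differ):

```python
import math

def pearson(x, y):
    """Equal-time (Pearson) correlation coefficient of two series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def corr_distance(x, y):
    """Distance derived from the correlation coefficient."""
    return math.sqrt(2.0 * (1.0 - pearson(x, y)))
```

    Applied pairwise to the 21 GDP-increment series, such distances yield the matrix from which the hierarchical networks are built.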

  16. A matter of time: actual time and the production of the past.

    PubMed

    Scarfone, Dominique

    2006-07-01

    In psychoanalytic theory, space metaphors are frequently used to describe the psychic apparatus. As for time, it is traditionally invoked under the heading of timelessness of the unconscious, more aptly described as the resistance of the repressed to wearing away with time. This paper examines how the insertion of time into psychic events and structural differentiation form a single process. After looking into the parallelism between phenomenological and psychoanalytic views of time and differentiation, the author draws a distinction between two time categories: chronological versus actual. A clinical example is presented.

  17. Network structure of multivariate time series

    PubMed Central

    Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito

    2015-01-01

    Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range of tools and techniques for time series analysis already exists, the increasing availability of massive data structures calls for new approaches for multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on the mapping of a multidimensional time series into a multilayer network, which allows one to extract information on a high-dimensional dynamical system through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow us to extract and quantify nontrivial properties of coupled chaotic maps, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail. PMID:26487040

  18. Homogenising time series: beliefs, dogmas and facts

    NASA Astrophysics Data System (ADS)

    Domonkos, P.

    2011-06-01

    In recent decades various homogenisation methods have been developed, but the real effects of their application on time series are still not sufficiently known. The ongoing COST Action HOME (COST ES0601) is devoted to revealing the real impacts of homogenisation methods in more detail and with higher confidence than before. As part of the COST activity, a benchmark dataset was built whose characteristics approximate well those of real networks of observed time series. This dataset offers a much better opportunity than ever before to test the wide variety of homogenisation methods and to analyse the real effects of selected theoretical recommendations. Empirical results show that real observed time series usually include several inhomogeneities of different sizes. Small inhomogeneities often have statistical characteristics similar to those of natural changes caused by climatic variability; thus the pure application of the classic theory that change-points of observed time series can be found and corrected one by one is impossible. However, after homogenisation the linear trends, seasonal changes and long-term fluctuations of time series are usually much closer to reality than in the raw time series. Some problems around detecting multiple structures of inhomogeneities, as well as those of time series comparisons within homogenisation procedures, are discussed briefly in the study.

  19. Network structure of multivariate time series.

    PubMed

    Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito

    2015-10-21

    Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range of tools and techniques for time series analysis already exists, the increasing availability of massive data structures calls for new approaches for multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on the mapping of a multidimensional time series into a multilayer network, which allows one to extract information on a high-dimensional dynamical system through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow us to extract and quantify nontrivial properties of coupled chaotic maps, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail.

  20. Modelling of nonlinear filtering Poisson time series

    NASA Astrophysics Data System (ADS)

    Bochkarev, Vladimir V.; Belashova, Inna A.

    2016-08-01

    In this article, algorithms for non-linear filtering of Poisson time series are tested using statistical modelling. The objective is to find a representation of a time series as a wavelet series with a small number of non-zero coefficients, which allows statistically significant details to be distinguished. There are well-known efficient algorithms of non-linear wavelet filtering for the case when the values of a time series have a normal distribution. However, if the distribution is not normal, good results can be expected from maximum likelihood estimation. Filtering according to the maximum likelihood criterion is studied using the example of Poisson time series. For direct optimisation of the likelihood function, various stochastic (genetic algorithms, simulated annealing) and deterministic optimisation algorithms are used. Testing of the algorithms using both simulated series and empirical data (series of rare-word frequencies from the Google Books Ngram data) showed that filtering based on the maximum likelihood criterion has a great advantage over well-known algorithms in the case of Poisson series. The most promising optimisation methods for this problem were also identified.
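
    A minimal stand-in for likelihood-guided wavelet filtering might look like this, assuming a Haar basis and a greedy keep-the-largest rule in place of the genetic and annealing optimisers used in the paper:

```python
import math

def haar(x):
    """Full Haar decomposition of a power-of-two-length series:
    returns [overall mean, coarse details, ..., fine details]."""
    coeffs = []
    while len(x) > 1:
        coeffs = [(a - b) / 2 for a, b in zip(x[0::2], x[1::2])] + coeffs
        x = [(a + b) / 2 for a, b in zip(x[0::2], x[1::2])]
    return x + coeffs

def inverse_haar(c):
    """Exact inverse of haar()."""
    x, k = c[:1], 1
    while k < len(c):
        x = [v for s, d in zip(x, c[k:2 * k]) for v in (s + d, s - d)]
        k *= 2
    return x

def poisson_loglik(counts, rates):
    """Poisson log-likelihood (up to a rate-independent constant)."""
    return sum(k * math.log(max(r, 1e-9)) - r for k, r in zip(counts, rates))

def sparse_fit(counts, keep):
    """Reconstruction keeping only the `keep` largest detail coefficients."""
    c = haar([float(v) for v in counts])
    top = set(sorted(range(1, len(c)), key=lambda i: -abs(c[i]))[:keep])
    return inverse_haar([c[0]] + [c[i] if i in top else 0.0
                                  for i in range(1, len(c))])
```

    Candidate sparse fits can then be ranked by poisson_loglik, which is the objective the paper's stochastic optimisers search over.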

  1. Developing consistent time series landsat data products

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Landsat satellite series has provided a continuous earth-observation data record since the early 1970s. There is increasing demand for a consistent time series of Landsat data products. In this presentation, I will summarize the work supported by the USGS Landsat Science Team project from 20...

  2. Modeling Time Series Data for Supervised Learning

    ERIC Educational Resources Information Center

    Baydogan, Mustafa Gokce

    2012-01-01

    Temporal data are increasingly prevalent and important in analytics. Time series (TS) data are chronological sequences of observations and an important class of temporal data. Fields such as medicine, finance, learning science and multimedia naturally generate TS data. Each series provides a high-dimensional data vector that challenges the learning…

  3. Visibility Graph Based Time Series Analysis

    PubMed Central

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network-based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks. PMID:26571115
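
    The core visibility mapping on which such methods rest can be sketched as follows (the natural visibility criterion of Lacasa et al.; this is only the first step of the segment-wise, network-of-networks procedure described above):

```python
def visibility_edges(series):
    """Natural visibility graph: points (a, y[a]) and (b, y[b]) are linked
    when every intermediate point lies strictly below the line joining them."""
    n, edges = len(series), set()
    for a in range(n):
        for b in range(a + 1, n):
            if all(series[c] < series[a]
                   + (series[b] - series[a]) * (c - a) / (b - a)
                   for c in range(a + 1, b)):
                edges.add((a, b))
    return edges
```

    Consecutive points always see each other; a point on a monotone ramp blocks visibility past it, while a dip between two peaks does not.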

  4. Visibility Graph Based Time Series Analysis.

    PubMed

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network-based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.

  5. Real-Time Collaboration over the Internet: What Actually Works?

    ERIC Educational Resources Information Center

    Swigger, Kathleen M.; Brazile, Robert; Byron, Suzanne; Livingston, Alan; Lopez, Victor; Reynes, Josie

    In order to provide teachers and students with electronic learning environments that support mentoring and collaboration through electronic means, the authors developed software that supports same time/different place educational collaborative activities over the Internet. These activities focus on teaching students how to organize and systematize…

  6. Measuring nonlinear behavior in time series data

    NASA Astrophysics Data System (ADS)

    Wai, Phoong Seuk; Ismail, Mohd Tahir

    2014-12-01

    Stationarity testing is important for characterizing time series behavior, since financial and economic data series often contain missing data, structural changes, and jumps or breaks. Moreover, a nonstationary time series can be transformed into a stationary one through a difference-stationary or trend-stationary process. Two types of stationarity hypothesis tests, the Augmented Dickey-Fuller (ADF) test and the Kwiatkowski-Phillips-Schmidt-Shin (KPSS) test, are examined in this paper to describe the properties of time series variables in a financial model. The least squares method is used in the Augmented Dickey-Fuller test to detect changes in the series, and the Lagrange multiplier is used in the Kwiatkowski-Phillips-Schmidt-Shin test to examine the properties of oil prices, gold prices and the Malaysian stock market. Moreover, the Quandt-Andrews, Bai-Perron and Chow tests are also used to detect the existence of breaks in the data series. The monthly index data range from December 1989 until May 2012. The results show that these three series exhibit nonlinear properties but become stationary after first differencing.
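
    A toy version of the Dickey-Fuller idea (no constant, no augmentation lags, and no proper critical values, so only the qualitative contrast is meaningful, not the full ADF/KPSS machinery used in the paper) can be sketched as:

```python
import math, random

def df_stat(series):
    """t-statistic of rho in the regression  dy_t = rho * y_{t-1} + e_t.
    Strongly negative values suggest stationarity; values near zero are
    consistent with a unit root (random walk)."""
    y = series[:-1]
    dy = [series[t + 1] - series[t] for t in range(len(series) - 1)]
    syy = sum(v * v for v in y)
    rho = sum(a * b for a, b in zip(y, dy)) / syy
    resid = [d - rho * v for d, v in zip(dy, y)]
    s2 = sum(r * r for r in resid) / (len(resid) - 1)
    return rho / math.sqrt(s2 / syy)

random.seed(0)
shocks = [random.gauss(0, 1) for _ in range(500)]
walk, ar = [0.0], [0.0]
for e in shocks:
    walk.append(walk[-1] + e)        # random walk: unit root
    ar.append(0.5 * ar[-1] + e)      # stationary AR(1)
```

    On these simulated series the statistic is strongly negative for the stationary AR(1) process and close to zero for the random walk; note the random walk is exactly the first "difference-stationary" case, since its increments are white noise.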

  7. Regularization of Nutation Time Series at GSFC

    NASA Astrophysics Data System (ADS)

    Le Bail, K.; Gipson, J. M.; Bolotin, S.

    2012-12-01

    VLBI is unique in its ability to measure all five Earth orientation parameters. In this paper we focus on the two nutation parameters, which characterize the orientation of the Earth's rotation axis in space. We look at the periodicities and the spectral characteristics of these parameters for both R1 and R4 sessions independently. The most significant periodic signal with period shorter than 600 days (a period of 450 days) is common to these four time series, and the type of noise determined by the Allan variance is white noise for all four series. To investigate methods of regularizing the series, we look at a Singular Spectrum Analysis-derived method and at the Kalman filter. The two methods adequately reproduce the tendency of the nutation time series, but the resulting series are noisier with the Singular Spectrum Analysis-derived method.
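
    The Allan variance used here for noise identification can be sketched as follows (a simple non-overlapping estimator; for white noise it falls off as 1/m with the averaging window m, whereas flicker and random-walk noise decay more slowly or grow):

```python
import random

def allan_variance(series, m):
    """Non-overlapping Allan variance at averaging window m: half the
    mean squared difference of consecutive window means."""
    means = [sum(series[i:i + m]) / m
             for i in range(0, len(series) - m + 1, m)]
    diffs = [means[k + 1] - means[k] for k in range(len(means) - 1)]
    return 0.5 * sum(d * d for d in diffs) / len(diffs)

random.seed(1)
white = [random.gauss(0, 1) for _ in range(4096)]
```

    For the simulated white noise above, quadrupling m cuts the Allan variance by roughly a factor of four, which is the signature used to classify the nutation residuals as white.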

  8. Homogenising time series: Beliefs, dogmas and facts

    NASA Astrophysics Data System (ADS)

    Domonkos, P.

    2010-09-01

    For obtaining reliable information about climate change and climate variability, the use of high-quality data series is essential, and one basic tool of quality improvement is the statistical homogenisation of observed time series. In recent decades a large number of homogenisation methods have been developed, but the real effects of their application on time series are still not entirely known. The ongoing COST HOME project (COST ES0601) is devoted to revealing the real impacts of homogenisation methods in more detail and with higher confidence than before. As part of the COST activity, a benchmark dataset was built whose characteristics approximate well those of real networks of observed time series. This dataset offers a much better opportunity than ever before to test the wide variety of homogenisation methods and to analyse the real effects of selected theoretical recommendations. The author believes that several old theoretical rules have to be re-evaluated. Some examples of the open questions: a) Can statistically detected change-points be accepted only when confirmed by metadata information? b) Do semi-hierarchic algorithms for detecting multiple change-points in time series function effectively in practice? c) Is it good to limit the spatial comparison of candidate series to up to five other series in the neighbourhood? Empirical results, from the COST benchmark and other experiments, show that real observed time series usually include several inhomogeneities of different sizes. Small inhomogeneities seem like part of the climatic variability; thus the pure application of the classic theory that change-points of observed time series can be found and corrected one by one is impossible. However, after homogenisation the linear trends, seasonal changes and long-term fluctuations of time series are usually much closer to reality than in the raw time series. 
The developers and users of homogenisation methods have to bear in mind that

  9. Spectra: Time series power spectrum calculator

    NASA Astrophysics Data System (ADS)

    Gallardo, Tabaré

    2017-01-01

    Spectra calculates the power spectrum of a time series, equally spaced or not, based on the Spectral Correlation Coefficient (Ferraz-Mello 1981, Astron. Journal 86 (4), 619). It is very efficient for the detection of low frequencies.

  10. Advanced spectral methods for climatic time series

    USGS Publications Warehouse

    Ghil, M.; Allen, M.R.; Dettinger, M.D.; Ide, K.; Kondrashov, D.; Mann, M.E.; Robertson, A.W.; Saunders, A.; Tian, Y.; Varadi, F.; Yiou, P.

    2002-01-01

    The analysis of univariate or multivariate time series provides crucial information to describe, understand, and predict climatic variability. The discovery and implementation of a number of novel methods for extracting useful information from time series has recently revitalized this classical field of study. Considerable progress has also been made in interpreting the information so obtained in terms of dynamical systems theory. In this review we describe the connections between time series analysis and nonlinear dynamics, discuss signal-to-noise enhancement, and present some of the novel methods for spectral analysis. The various steps, as well as the advantages and disadvantages of these methods, are illustrated by their application to an important climatic time series, the Southern Oscillation Index. This index captures major features of interannual climate variability and is used extensively in its prediction. Regional and global sea surface temperature data sets are used to illustrate multivariate spectral methods. Open questions and further prospects conclude the review.

  11. Complex network approach to fractional time series

    SciTech Connect

    Manshour, Pouya

    2015-10-15

    In order to extract correlation information inherent in stochastic time series, the visibility graph algorithm has recently been proposed, by which a time series can be mapped onto a complex network. We demonstrate that the visibility algorithm is not an appropriate one for studying the correlation aspects of a time series. We then employ the horizontal visibility algorithm, a much simpler one, to map fractional processes onto complex networks. The degree distributions are shown to have parabolic exponential forms with a Hurst-dependent fitting parameter. Further, we take into account other topological properties such as the maximum eigenvalue of the adjacency matrix and the degree assortativity, and show that such topological quantities can also be used to predict the Hurst exponent, with an exception for anti-persistent fractional Gaussian noises. To solve this problem, we take into account the Spearman correlation coefficient between nodes' degrees and their corresponding data values in the original time series.
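
    A minimal horizontal visibility construction might look like this (degree sequence only; the Hurst-exponent fits and the Spearman correction described above are beyond this sketch):

```python
def hvg_degrees(series):
    """Degree sequence of the horizontal visibility graph: i and j are
    linked when every intermediate value lies strictly below
    min(series[i], series[j])."""
    n = len(series)
    deg = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            if all(series[k] < min(series[i], series[j])
                   for k in range(i + 1, j)):
                deg[i] += 1
                deg[j] += 1
    return deg
```

    Local maxima collect the largest degrees, which is why the degree distribution carries information about the persistence of the underlying process.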

  12. Improving Intercomparability of Marine Biogeochemical Time Series

    NASA Astrophysics Data System (ADS)

    Benway, Heather M.; Telszewski, Maciej; Lorenzoni, Laura

    2013-04-01

    Shipboard biogeochemical time series represent one of the most valuable tools scientists have to quantify marine elemental fluxes and associated biogeochemical processes and to understand their links to changing climate. They provide the long, temporally resolved data sets needed to characterize ocean climate, biogeochemistry, and ecosystem variability and change. However, to monitor and differentiate natural cycles and human-driven changes in the global oceans, time series methodologies must be transparent and intercomparable when possible. To review current shipboard biogeochemical time series sampling and analytical methods, the International Ocean Carbon Coordination Project (IOCCP; http://www.ioccp.org/) and the Ocean Carbon and Biogeochemistry Program (http://www.us-ocb.org/) convened an international ocean time series workshop at the Bermuda Institute for Ocean Sciences.

  13. Detecting nonlinear structure in time series

    SciTech Connect

    Theiler, J.

    1991-01-01

    We describe an approach for evaluating the statistical significance of evidence for nonlinearity in a time series. The formal application of our method requires the careful statement of a null hypothesis which characterizes a candidate linear process, the generation of an ensemble of "surrogate" data sets which are similar to the original time series but consistent with the null hypothesis, and the computation of a discriminating statistic for the original and for each of the surrogate data sets. The idea is to test the original time series against the null hypothesis by checking whether the discriminating statistic computed for the original time series differs significantly from the statistics computed for each of the surrogate sets. While some data sets very cleanly exhibit low-dimensional chaos, there are many cases where the evidence is sketchy and difficult to evaluate. We hope to provide a framework within which such claims of nonlinearity can be evaluated. 5 refs., 4 figs.
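
    The surrogate-testing recipe can be illustrated with the simplest null hypothesis, IID noise, using shuffle surrogates and lag-1 autocorrelation as the discriminating statistic (testing against a general linear process, as the paper does, requires phase-randomized surrogates instead):

```python
import random

def lag1_autocorr(x):
    """Lag-1 autocorrelation: the discriminating statistic used here."""
    n, m = len(x), sum(x) / len(x)
    num = sum((x[t] - m) * (x[t + 1] - m) for t in range(n - 1))
    return num / sum((v - m) ** 2 for v in x)

def surrogate_rank(series, stat=lag1_autocorr, n_surr=99, seed=0):
    """Rank of the original statistic within the surrogate ensemble.
    Values near 1.0 reject the IID null at roughly the 1/(n_surr+1) level."""
    rng = random.Random(seed)
    s0 = stat(series)
    below = 0
    for _ in range(n_surr):
        surr = list(series)
        rng.shuffle(surr)            # destroys all temporal structure
        if stat(surr) < s0:
            below += 1
    return (below + 1) / (n_surr + 1)

random.seed(42)
noise = [random.gauss(0, 1) for _ in range(200)]
correlated = [0.0]
for e in noise:
    correlated.append(0.9 * correlated[-1] + e)   # strong linear correlation
```

    For the autocorrelated series the original statistic exceeds every shuffled surrogate's, so the IID null is rejected; the paper's framework generalizes exactly this comparison to nonlinearity tests.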

  14. Nonlinear Analysis of Surface EMG Time Series

    NASA Astrophysics Data System (ADS)

    Zurcher, Ulrich; Kaufman, Miron; Sung, Paul

    2004-04-01

    Applications of nonlinear analysis of surface electromyography time series of patients with and without low back pain are presented. Limitations of the standard methods based on the power spectrum are discussed.

  15. Detecting chaos in irregularly sampled time series.

    PubMed

    Kulp, C W

    2013-09-01

    Recently, Wiebe and Virgin [Chaos 22, 013136 (2012)] developed an algorithm which detects chaos by analyzing a time series' power spectrum, computed using the Discrete Fourier Transform (DFT). Their algorithm, like other time series characterization algorithms, requires that the time series be regularly sampled. Real-world data, however, are often irregularly sampled, thus making the detection of chaotic behavior difficult or impossible with those methods. In this paper, a characterization algorithm is presented which effectively detects chaos in irregularly sampled time series. The work presented here is a modification of Wiebe and Virgin's algorithm and uses the Lomb-Scargle Periodogram (LSP) to compute a series' power spectrum instead of the DFT. The DFT is not appropriate for irregularly sampled time series, but the LSP is capable of computing the frequency content of irregularly sampled data. Furthermore, a new method of analyzing the power spectrum is developed, which can be useful for differentiating between chaotic and non-chaotic behavior. The new characterization algorithm is successfully applied to irregularly sampled data generated by a model as well as data consisting of observations of variable stars.
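
    The Lomb-Scargle periodogram at the heart of the modified algorithm can be sketched as follows (the classical unnormalized form; this is only the spectral-analysis step, not the full chaos test):

```python
import math, random

def lomb_scargle(t, y, freqs):
    """Classical Lomb-Scargle periodogram for irregularly sampled data;
    freqs are ordinary frequencies (cycles per time unit)."""
    ybar = sum(y) / len(y)
    c = [v - ybar for v in y]
    power = []
    for f in freqs:
        w = 2 * math.pi * f
        # time offset tau makes the sine and cosine terms orthogonal
        tau = math.atan2(sum(math.sin(2 * w * ti) for ti in t),
                         sum(math.cos(2 * w * ti) for ti in t)) / (2 * w)
        cc = [math.cos(w * (ti - tau)) for ti in t]
        ss = [math.sin(w * (ti - tau)) for ti in t]
        power.append(0.5 * (
            sum(a * b for a, b in zip(c, cc)) ** 2 / sum(a * a for a in cc) +
            sum(a * b for a, b in zip(c, ss)) ** 2 / sum(a * a for a in ss)))
    return power

# toy example: irregularly sampled 0.1 Hz sinusoid
random.seed(3)
times = sorted(random.uniform(0, 100) for _ in range(120))
values = [math.sin(2 * math.pi * 0.1 * ti) for ti in times]
freqs = [0.01 * k for k in range(1, 50)]
powers = lomb_scargle(times, values, freqs)
```

    Despite the irregular sampling, the periodogram peaks at the true 0.1 Hz frequency, which is exactly the capability the DFT lacks.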

  16. Highly comparative time-series analysis: the empirical structure of time series and their methods.

    PubMed

    Fulcher, Ben D; Little, Max A; Jones, Nick S

    2013-06-06

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
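
    The idea of a reduced, feature-based representation can be illustrated in miniature (four hand-picked features standing in for the thousands of automatically compared ones):

```python
import math

def features(series):
    """A four-number 'reduced representation' of a series: mean, spread,
    lag-1 autocorrelation and mean-crossing rate."""
    n = len(series)
    mean = sum(series) / n
    var = sum((v - mean) ** 2 for v in series) / n
    ac1 = (sum((series[t] - mean) * (series[t + 1] - mean)
               for t in range(n - 1)) / (var * n)) if var else 0.0
    crossings = sum(1 for t in range(n - 1)
                    if (series[t] - mean) * (series[t + 1] - mean) < 0) / (n - 1)
    return [mean, math.sqrt(var), ac1, crossings]
```

    Vectors like this place every series in a common feature space, which is what allows datasets and methods from different disciplines to be organized and compared automatically.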

  17. Highly comparative time-series analysis: the empirical structure of time series and their methods

    PubMed Central

    Fulcher, Ben D.; Little, Max A.; Jones, Nick S.

    2013-01-01

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines. PMID:23554344

  18. Turbulencelike Behavior of Seismic Time Series

    SciTech Connect

    Manshour, P.; Saberi, S.; Sahimi, Muhammad; Peinke, J.; Pacheco, Amalio F.; Rahimi Tabar, M. Reza

    2009-01-09

    We report on a stochastic analysis of Earth's vertical velocity time series by using methods originally developed for complex hierarchical systems and, in particular, for turbulent flows. Analysis of the fluctuations of the detrended increments of the series reveals a pronounced transition in their probability density function from Gaussian to non-Gaussian. The transition occurs 5-10 hours prior to a moderate or large earthquake, hence representing a new and reliable precursor for detecting such earthquakes.

  19. Learning time series for intelligent monitoring

    NASA Technical Reports Server (NTRS)

    Manganaris, Stefanos; Fisher, Doug

    1994-01-01

    We address the problem of classifying time series according to their morphological features in the time domain. In a supervised machine-learning framework, we induce a classification procedure from a set of preclassified examples. For each class, we infer a model that captures its morphological features using Bayesian model induction and the minimum message length approach to assign priors. In the performance task, we classify a time series in one of the learned classes when there is enough evidence to support that decision. Time series with sufficiently novel features, belonging to classes not present in the training set, are recognized as such. We report results from experiments in a monitoring domain of interest to NASA.

  20. Predicting road accidents: Structural time series approach

    NASA Astrophysics Data System (ADS)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-07-01

    In this paper, a model for the occurrence of road accidents in Malaysia between 1970 and 2010 was developed, and the number of road accidents was predicted using the structural time series approach. The models are developed using a stepwise method, and the residuals of each step have been analyzed. The accuracy of the model is assessed using the mean absolute percentage error (MAPE), and the best model is chosen based on the smallest Akaike information criterion (AIC) value. The structural time series approach found that the local linear trend model is the best model to represent the road accidents. This model allows the level and slope components to vary over time. In addition, this approach also provides useful information for improving the conventional time series method.
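
    The local linear trend model selected here can be sketched as a two-state Kalman filter (an illustrative implementation with arbitrary variance settings, not the paper's fitted model):

```python
def local_linear_trend(y, var_eps=1.0, var_level=0.1, var_slope=0.01):
    """Kalman filter for the local linear trend model: the level and the
    slope both evolve as random walks, and y = level + observation noise.
    Returns the filtered level estimates."""
    mu, nu = y[0], 0.0                      # state mean: (level, slope)
    P = [[1e4, 0.0], [0.0, 1e4]]            # diffuse-ish initial covariance
    levels = []
    for obs in y:
        # predict: level <- level + slope, slope unchanged
        mu, nu = mu + nu, nu
        P = [[P[0][0] + 2 * P[0][1] + P[1][1] + var_level,
              P[0][1] + P[1][1]],
             [P[0][1] + P[1][1], P[1][1] + var_slope]]
        # update with the new observation
        S = P[0][0] + var_eps
        k0, k1 = P[0][0] / S, P[0][1] / S
        innov = obs - mu
        mu, nu = mu + k0 * innov, nu + k1 * innov
        P = [[(1 - k0) * P[0][0], (1 - k0) * P[0][1]],
             [P[0][1] - k1 * P[0][0], P[1][1] - k1 * P[0][1]]]
        levels.append(mu)
    return levels
```

    Because both level and slope are time-varying states, the filter locks onto a linear trend within a few observations, which is the flexibility that made this model the best fit for the accident series.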

  1. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, L.M.; Ng, E.G.

    1998-09-29

    Methods and apparatus are disclosed for automatically detecting differences between similar but different states in a nonlinear process by monitoring nonlinear data. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated. 8 figs.

  2. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, Lee M.; Ng, Esmond G.

    1998-01-01

    Methods and apparatus for automatically detecting differences between similar but different states in a nonlinear process by monitoring nonlinear data. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated.

  3. Layered Ensemble Architecture for Time Series Forecasting.

    PubMed

    Rahman, Md Mustafizur; Islam, Md Monirul; Murase, Kazuyuki; Yao, Xin

    2016-01-01

    Time series forecasting (TSF) has been widely used in many application areas such as science, engineering, and finance. The phenomena generating time series are usually unknown and the information available for forecasting is limited to the past values of the series. It is, therefore, necessary to use an appropriate number of past values, termed the lag, for forecasting. This paper proposes a layered ensemble architecture (LEA) for TSF problems. Our LEA consists of two layers, each of which uses an ensemble of multilayer perceptron (MLP) networks. While the first ensemble layer tries to find an appropriate lag, the second ensemble layer employs the obtained lag for forecasting. Unlike most previous work on TSF, the proposed architecture considers both accuracy and diversity of the individual networks in constructing an ensemble. LEA trains different networks in the ensemble by using different training sets with the aim of maintaining diversity among the networks. However, it uses the appropriate lag and combines the best trained networks to construct the ensemble. This indicates LEA's emphasis on the accuracy of the networks. The proposed architecture has been tested extensively on time series data from the NN3 and NN5 neural network forecasting competitions. It has also been tested on several standard benchmark time series data sets. In terms of forecasting accuracy, our experimental results have revealed clearly that LEA is better than other ensemble and nonensemble methods.

  4. Complex network analysis of time series

    NASA Astrophysics Data System (ADS)

    Gao, Zhong-Ke; Small, Michael; Kurths, Jürgen

    2016-12-01

    Revealing complicated behaviors from time series is a fundamental problem of continuing interest, and it has attracted a great deal of attention from a wide variety of fields on account of its significant importance. The past decade has witnessed a rapid development of complex network studies, which make it possible to characterize many types of systems in nature and technology that contain a large number of components interacting with each other in a complicated manner. Recently, complex network theory has been incorporated into the analysis of time series, and fruitful achievements have been obtained. Complex network analysis of time series opens up new avenues to address interdisciplinary challenges in climate dynamics, multiphase flow, brain function, ECG dynamics, economics and traffic systems.
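
    One widely used mapping from a time series to a complex network (one of several the field has produced; assumed here for illustration) is the natural visibility graph, in which two samples are linked when the straight line between them passes above every sample in between:

```python
import numpy as np

def visibility_graph(y):
    """Natural visibility graph: nodes i < j are linked when the line from
    (i, y[i]) to (j, y[j]) stays strictly above all intermediate samples."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    edges = set()
    for i in range(n - 1):
        for j in range(i + 1, n):
            visible = all(
                y[k] < y[i] + (y[j] - y[i]) * (k - i) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                edges.add((i, j))
    return edges
```

The resulting edge set can then be handed to any graph library for degree, clustering or community analysis. Adjacent samples are always mutually visible, so the graph is connected by construction.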

  5. Intrinsic superstatistical components of financial time series

    NASA Astrophysics Data System (ADS)

    Vamoş, Călin; Crăciun, Maria

    2014-12-01

    Time series generated by a complex hierarchical system exhibit various types of dynamics at different time scales. A financial time series is an example of such a multiscale structure, with time scales ranging from minutes to several years. In this paper we decompose the volatility of financial indices into five intrinsic components and show that it has a heterogeneous scale structure. The small-scale components have a stochastic nature and are independent 99% of the time, becoming synchronized during financial crashes and enhancing the heavy tails of the volatility distribution. The deterministic behavior of the large-scale components is related to the nonstationarity of the financial markets' evolution. Our decomposition of the financial volatility is a superstatistical model more complex than those usually limited to a superposition of two independent statistics at well-separated time scales.

  6. Clustering Short Time-Series Microarray

    NASA Astrophysics Data System (ADS)

    Ping, Loh Wei; Hasan, Yahya Abu

    2008-01-01

    Most microarray analyses are carried out on static gene expressions. However, the dynamical study of microarrays has lately gained more attention. Most research on time-series microarrays emphasizes the bioscience and medical aspects, and little approaches them from the numerical aspect. This study attempts to analyze short time-series microarrays mathematically using the STEM clustering tool, which first preprocesses the data and then clusters it. We next introduce the Circular Mould Distance (CMD) algorithm, which combines both preprocessing and clustering analysis. The two methods are subsequently compared in terms of efficiency.

  7. MODIS Vegetation Indices time series improvement considering real acquisition dates

    NASA Astrophysics Data System (ADS)

    Testa, S.; Borgogno Mondino, E.

    2013-12-01

    Satellite Vegetation Index (VI) time series images are widely used for the characterization of phenology, which requires high temporal accuracy of the satellite data. The present work is based on the MODerate resolution Imaging Spectroradiometer (MODIS) MOD13Q1 product - Vegetation Indices 16-Day L3 Global 250m - which is generated through a maximum-value compositing process that reduces the number of cloudy pixels and excludes, when possible, off-nadir ones. Because of its 16-day compositing period, the interval between two adjacent-in-time values within each pixel's NDVI time series can range from 1 to 32 days, which is not acceptable for phenologic studies. Moreover, most of the available smoothing algorithms, which are widely used for phenology characterization, assume that data points are equally spaced in time and simultaneous across the image. The objective of this work was to assess the temporal features of NDVI time series over a test area composed of Castanea sativa (chestnut) and Fagus sylvatica (beech) pure pixels within the Piemonte region in Northwestern Italy. Firstly, NDVI, Pixel Reliability (PR) and Composite Day of the Year (CDOY) data ranging from 2000 to 2011 were extracted from MOD13Q1 and the corresponding time series were generated (in further computations, 2000 was not considered since it is incomplete: acquisition began in February and calibration is unreliable until October). Analysis of the CDOY time series (containing the actual reference date of each NDVI value) over the selected study areas showed NDVI values to be predominantly generated from data acquired at the centre of each 16-day period (the 9th day), roughly constantly throughout the year. This supports nominally placing each original NDVI value at the centre of its 16-day reference period.
Then, a new NDVI time series was generated: a) moving each NDVI value to its actual "acquisition" date, b) interpolating the obtained temporary time series through SPLINE functions, c) sampling such
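
    The repositioning-and-interpolation steps (a)-(b) can be sketched as follows. This is a simplified illustration with hypothetical input arrays: linear interpolation stands in for the SPLINE functions used in the paper.

```python
import numpy as np

def resample_ndvi(ndvi, composite_doy, step=16):
    """Move each composite NDVI value to its actual acquisition day-of-year,
    then interpolate back onto a regular `step`-day grid (linear here; the
    paper uses spline functions)."""
    order = np.argsort(composite_doy)
    days = np.asarray(composite_doy, dtype=float)[order]
    vals = np.asarray(ndvi, dtype=float)[order]
    grid = np.arange(days[0], days[-1] + 1, step)
    return grid, np.interp(grid, days, vals)
```

With three composite values whose actual acquisition days are irregular, the function returns a regular 16-day series anchored at the first acquisition date.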

  8. Circulant Matrices and Time-Series Analysis

    ERIC Educational Resources Information Center

    Pollock, D. S. G.

    2002-01-01

    This paper sets forth some salient results in the algebra of circulant matrices which can be used in time-series analysis. It provides easy derivations of some results that are central to the analysis of statistical periodograms and empirical spectral density functions. A statistical test for the stationarity or homogeneity of empirical processes…

  9. Nonlinear Time Series Analysis via Neural Networks

    NASA Astrophysics Data System (ADS)

    Volná, Eva; Janošek, Michal; Kocian, Václav; Kotyrba, Martin

    This article deals with a time series analysis based on neural networks for effective pattern recognition in the forex market [Moore and Roche, J. Int. Econ. 58, 387-411 (2002)]. Our goal is to find and recognize important patterns which appear repeatedly in the market history and to adapt our trading system's behaviour based on them.

  10. Three Analysis Examples for Time Series Data

    Technology Transfer Automated Retrieval System (TEKTRAN)

    With improvements in instrumentation and the automation of data collection, plot level repeated measures and time series data are increasingly available to monitor and assess selected variables throughout the duration of an experiment or project. Records and metadata on variables of interest alone o...

  11. Directionality and volatility in electroencephalogram time series

    NASA Astrophysics Data System (ADS)

    Mansor, Mahayaudin M.; Green, David A.; Metcalfe, Andrew V.

    2016-06-01

    We compare time series of electroencephalograms (EEGs) from healthy volunteers with EEGs from subjects diagnosed with epilepsy. The EEG time series from the healthy group are recorded during awake state with their eyes open and eyes closed, and the records from subjects with epilepsy are taken from three different recording regions of pre-surgical diagnosis: hippocampal, epileptogenic and seizure zone. The comparisons for these 5 categories are in terms of deviations from linear time series models with constant variance Gaussian white noise error inputs. One feature investigated is directionality, and how this can be modelled by either non-linear threshold autoregressive models or non-Gaussian errors. A second feature is volatility, which is modelled by Generalized AutoRegressive Conditional Heteroskedasticity (GARCH) processes. Other features include the proportion of variability accounted for by time series models, and the skewness and the kurtosis of the residuals. The results suggest these comparisons may have diagnostic potential for epilepsy and provide early warning of seizures.
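
    The volatility feature above is modelled with GARCH processes. A minimal GARCH(1,1) simulator illustrates the model; the parameter values below are illustrative, not taken from the paper.

```python
import numpy as np

def simulate_garch11(n, omega=0.1, alpha=0.1, beta=0.8, seed=0):
    """Simulate a GARCH(1,1) process:
       sigma2[t] = omega + alpha * eps[t-1]**2 + beta * sigma2[t-1]
       eps[t]    = sqrt(sigma2[t]) * z[t],  z[t] ~ N(0, 1).
    alpha + beta < 1 gives a finite unconditional variance
    omega / (1 - alpha - beta)."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n)
    eps = np.empty(n)
    sigma2 = np.empty(n)
    sigma2[0] = omega / (1.0 - alpha - beta)  # start at the unconditional variance
    eps[0] = np.sqrt(sigma2[0]) * z[0]
    for t in range(1, n):
        sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
        eps[t] = np.sqrt(sigma2[t]) * z[t]
    return eps, sigma2
```

The simulated innovations show volatility clustering while their long-run sample variance stays near omega / (1 - alpha - beta) = 1 for the defaults above.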

  12. Determinism test for very short time series.

    PubMed

    Binder, P-M; Igarashi, Ryu; Seymour, William; Takeishi, Candy

    2005-03-01

    A test for determinism suitable for time series shorter than 100 points is presented and applied to numerical and observed data. The method exploits the linear dependence of d(t) on d(0) in the expression d(t) ≈ d(0) e^(λt), which describes the growth of small separations between trajectories in chaotic systems.
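
    The growth law d(t) ≈ d(0) e^(λt) can be illustrated by recovering λ from synthetic separation data; the sketch below fits the log-separations by least squares and is only an illustration of the underlying relation, not the authors' full determinism test.

```python
import numpy as np

def estimate_lambda(d0, d_t, times):
    """Given initial separations d0 and separations d_t after `times` steps,
    fit log(d_t / d0) = lambda * t by least squares (no intercept)."""
    y = np.log(np.asarray(d_t, dtype=float) / np.asarray(d0, dtype=float))
    t = np.asarray(times, dtype=float)
    return float(np.sum(t * y) / np.sum(t * t))

# Synthetic check: separations growing with lambda = 0.5.
true_lam = 0.5
t = np.arange(1, 6)
d0 = np.full(5, 1e-6)
d_t = d0 * np.exp(true_lam * t)
lam = estimate_lambda(d0, d_t, t)
```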

  13. Offset detection in GPS coordinate time series

    NASA Astrophysics Data System (ADS)

    Gazeaux, J.; King, M. A.; Williams, S. D.

    2013-12-01

    Global Positioning System (GPS) time series are commonly affected by offsets of unknown magnitude, and the large volume of data globally warrants investigation of automated detection approaches. The Detection of Offsets in GPS Experiment (DOGEx) showed that the accuracy of GPS time series can be significantly improved by applying statistical offset detection methods (see Gazeaux et al. (2013)). However, the best of these approaches did not perform as well as manual detection by expert analysts. Many of the features of GPS coordinate time series have not yet been fully taken into account in existing methods. Here, we apply Bayesian theory in order to make use of prior knowledge of the site noise characteristics and metadata in an attempt to make the offset detection more accurate. In past decades, Bayesian theory has produced relevant results for a wide range of applications, but it has not yet been applied to GPS coordinate time series. Such methods incorporate different inputs such as a dynamic model (linear trend, periodic signal, ...) and a-priori information in a process that provides the best estimate of parameters (velocity, phase and amplitude of periodic signals, ...) based on all the available information. We test the new method on the DOGEx simulated dataset and compare it to previous solutions, and to a Monte Carlo method, to test the accuracy of the procedure. We make a preliminary extension of the DOGEx dataset to introduce metadata information, allowing us to test the value of this data type in detecting offsets. The flexibility, robustness and limitations of the new approach are discussed. Gazeaux, J., Williams, S., King, M., Bos, M., Dach, R., Deo, M., Moore, A. W., Ostini, L., Petrie, E., Roggero, M., Teferle, F. N., Olivares, G., Webb, F. H., 2013. Detecting offsets in GPS time series: First results from the detection of offsets in GPS experiment. Journal of Geophysical Research: Solid Earth, 118(5). Keywords: GPS

  14. Remote Sensing Time Series Product Tool

    NASA Technical Reports Server (NTRS)

    Predos, Don; Ryan, Robert E.; Ross, Kenton W.

    2006-01-01

    The TSPT (Time Series Product Tool) software was custom-designed for NASA to rapidly create and display single-band and band-combination time series, such as NDVI (Normalized Difference Vegetation Index) images, for wide-area crop surveillance and for other time-critical applications. The TSPT, developed in MATLAB, allows users to create and display various MODIS (Moderate Resolution Imaging Spectroradiometer) or simulated VIIRS (Visible/Infrared Imager Radiometer Suite) products as single images, as time series plots at a selected location, or as temporally processed image videos. Manually creating these types of products is extremely labor intensive; the TSPT makes the process simple and efficient. MODIS is ideal for monitoring large crop areas because of its wide swath (2330 km), its relatively small ground sample distance (250 m), and its high revisit rate (twice daily). Furthermore, because MODIS imagery is acquired daily, rapid changes in vegetative health can potentially be detected. The new TSPT technology provides users with the ability to temporally process high-revisit-rate satellite imagery, such as that acquired from MODIS and from its successor, the VIIRS. The TSPT features the important capability of fusing data from both MODIS instruments onboard the Terra and Aqua satellites, which drastically improves cloud statistics. With the TSPT, MODIS metadata are used to find and optionally remove bad and suspect data. Noise removal and temporal processing techniques allow users to create low-noise time series plots and image videos and to select settings and thresholds that tailor particular output products. The TSPT GUI (graphical user interface) provides an interactive environment for crafting what-if scenarios by enabling a user to repeat product generation using different settings and thresholds.
The TSPT Application Programming Interface provides more fine-tuned control of product generation, allowing experienced
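
    The core product computed by a tool like the TSPT can be sketched as a plain NDVI calculation with metadata-based screening. The function below is a hypothetical illustration, not TSPT code; the `reliability` flags mimic the kind of quality metadata used to remove bad and suspect data.

```python
import numpy as np

def ndvi(nir, red, reliability=None, good=(0,)):
    """NDVI = (NIR - Red) / (NIR + Red); pixels whose reliability flag is
    not in `good` are set to NaN, mimicking metadata-based screening."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    out = (nir - red) / (nir + red)
    if reliability is not None:
        out = np.where(np.isin(reliability, good), out, np.nan)
    return out
```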

  15. Delay Differential Analysis of Time Series

    PubMed Central

    Lainscsek, Claudia; Sejnowski, Terrence J.

    2015-01-01

    Nonlinear dynamical system analysis based on embedding theory has been used for modeling and prediction, but it also has applications to signal detection and classification of time series. An embedding creates a multidimensional geometrical object from a single time series. Traditionally either delay or derivative embeddings have been used. The delay embedding is composed of delayed versions of the signal, and the derivative embedding is composed of successive derivatives of the signal. The delay embedding has been extended to nonuniform embeddings to take multiple timescales into account. Both embeddings provide information on the underlying dynamical system without having direct access to all the system variables. Delay differential analysis is based on functional embeddings, a combination of the derivative embedding with nonuniform delay embeddings. Small delay differential equation (DDE) models that best represent relevant dynamic features of time series data are selected from a pool of candidate models for detection or classification. We show that the properties of DDEs support spectral analysis in the time domain where nonlinear correlation functions are used to detect frequencies, frequency and phase couplings, and bispectra. These can be efficiently computed with short time windows and are robust to noise. For frequency analysis, this framework is a multivariate extension of discrete Fourier transform (DFT), and for higher-order spectra, it is a linear and multivariate alternative to multidimensional fast Fourier transform of multidimensional correlations. This method can be applied to short or sparse time series and can be extended to cross-trial and cross-channel spectra if multiple short data segments of the same experiment are available. Together, this time-domain toolbox provides higher temporal resolution, increased frequency and phase coupling information, and it allows an easy and straightforward implementation of higher-order spectra across time

  16. Delay differential analysis of time series.

    PubMed

    Lainscsek, Claudia; Sejnowski, Terrence J

    2015-03-01

    Nonlinear dynamical system analysis based on embedding theory has been used for modeling and prediction, but it also has applications to signal detection and classification of time series. An embedding creates a multidimensional geometrical object from a single time series. Traditionally either delay or derivative embeddings have been used. The delay embedding is composed of delayed versions of the signal, and the derivative embedding is composed of successive derivatives of the signal. The delay embedding has been extended to nonuniform embeddings to take multiple timescales into account. Both embeddings provide information on the underlying dynamical system without having direct access to all the system variables. Delay differential analysis is based on functional embeddings, a combination of the derivative embedding with nonuniform delay embeddings. Small delay differential equation (DDE) models that best represent relevant dynamic features of time series data are selected from a pool of candidate models for detection or classification. We show that the properties of DDEs support spectral analysis in the time domain where nonlinear correlation functions are used to detect frequencies, frequency and phase couplings, and bispectra. These can be efficiently computed with short time windows and are robust to noise. For frequency analysis, this framework is a multivariate extension of discrete Fourier transform (DFT), and for higher-order spectra, it is a linear and multivariate alternative to multidimensional fast Fourier transform of multidimensional correlations. This method can be applied to short or sparse time series and can be extended to cross-trial and cross-channel spectra if multiple short data segments of the same experiment are available. Together, this time-domain toolbox provides higher temporal resolution, increased frequency and phase coupling information, and it allows an easy and straightforward implementation of higher-order spectra across time
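
    The delay embeddings described above can be sketched in a few lines; `delay_embed` below is a minimal illustration supporting both the classical uniform embedding and nonuniform delays.

```python
import numpy as np

def delay_embed(x, delays):
    """Nonuniform delay embedding: row t holds (x[t - d] for d in delays).
    delays = (0, tau, 2*tau, ...) recovers the classical uniform embedding."""
    x = np.asarray(x, dtype=float)
    dmax = max(delays)
    return np.column_stack([x[dmax - d:len(x) - d] for d in delays])
```

Each row is one point of the multidimensional geometrical object reconstructed from the single time series.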

  17. Time-frequency analysis of electroencephalogram series

    NASA Astrophysics Data System (ADS)

    Blanco, S.; Quiroga, R. Quian; Rosso, O. A.; Kochen, S.

    1995-03-01

    In this paper we propose a method, based on the Gabor transform, to quantify and visualize the time evolution of the traditional frequency bands defined in the analysis of electroencephalogram (EEG) series. The information obtained in this way can be used for information-transfer analyses of epileptic seizures as well as for their characterization. We found an optimal correlation between EEG visual inspection and the proposed method in the characterization of paroxysms, spikes, and other transient alterations of background activity. The dynamical changes during an epileptic seizure are shown through the phase portrait. The proposed method is exemplified with EEG series obtained with depth electrodes in refractory epileptic patients.

  18. Algorithm for Compressing Time-Series Data

    NASA Technical Reports Server (NTRS)

    Hawkins, S. Edward, III; Darlington, Edward Hugo

    2012-01-01

    An algorithm based on Chebyshev polynomials effects lossy compression of time-series data or other one-dimensional data streams (e.g., spectral data) that are arranged in blocks for sequential transmission. The algorithm was developed for use in transmitting data from spacecraft scientific instruments to Earth stations. In spite of its lossy nature, the algorithm preserves the information needed for scientific analysis. The algorithm is computationally simple, yet compresses data streams by factors much greater than two. The algorithm is not restricted to spacecraft or scientific uses: it is applicable to time-series data in general. The algorithm can also be applied to general multidimensional data that have been converted to time-series data, a typical example being image data acquired by raster scanning. However, unlike most prior image-data-compression algorithms, this algorithm neither depends on nor exploits the two-dimensional spatial correlations that are generally present in images. In order to understand the essence of this compression algorithm, it is necessary to understand that the net effect of this algorithm and the associated decompression algorithm is to approximate the original stream of data as a sequence of finite series of Chebyshev polynomials. For the purpose of this algorithm, a block of data or interval of time for which a Chebyshev polynomial series is fitted to the original data is denoted a fitting interval. Chebyshev approximation has two properties that make it particularly effective for compressing serial data streams with minimal loss of scientific information: The errors associated with a Chebyshev approximation are nearly uniformly distributed over the fitting interval (this is known in the art as the "equal error property"); and the maximum deviations of the fitted Chebyshev polynomial from the original data have the smallest possible values (this is known in the art as the "min-max property").
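
    The essence of the algorithm, fitting a Chebyshev series to each block and transmitting only the coefficients, can be sketched with NumPy's Chebyshev routines. This is a simplified stand-in for the flight implementation, with an illustrative block size and degree.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def compress(block, deg):
    """Fit a degree-`deg` Chebyshev series to one block of samples,
    mapping the sample indices (the fitting interval) onto [-1, 1].
    Returns deg + 1 coefficients."""
    x = np.linspace(-1.0, 1.0, len(block))
    return C.chebfit(x, block, deg)

def decompress(coeffs, n):
    """Evaluate the Chebyshev series back onto n sample positions."""
    x = np.linspace(-1.0, 1.0, n)
    return C.chebval(x, coeffs)
```

For a smooth 64-sample block, a degree-7 fit keeps 8 coefficients, a compression factor of 8, while the near-uniform error of the Chebyshev approximation stays tiny.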

  19. Modelling population change from time series data

    USGS Publications Warehouse

    Barker, R.J.; Sauer, J.R.; McCullough, D.R.; Barrett, R.H.

    1992-01-01

    Information on change in population size over time is among the most basic inputs for population management. Unfortunately, population changes are generally difficult to identify, and once identified difficult to explain. Sources of variation (patterns) in population data include: changes in environment that affect carrying capacity and produce trend, autocorrelative processes, irregular environmentally induced perturbations, and stochasticity arising from population processes. In addition, populations are almost never censused, and many surveys (e.g., the North American Breeding Bird Survey) produce multiple, incomplete time series of population indices, providing further sampling complications. We suggest that each source of pattern should be used to address specific hypotheses regarding population change, but that failure to correctly model each source can lead to false conclusions about the dynamics of populations. We consider hypothesis tests based on each source of pattern, and the effects of autocorrelated observations and sampling error. We identify important constraints on analyses of time series that limit their use in identifying underlying relationships.

  20. Pseudotime estimation: deconfounding single cell time series

    PubMed Central

    Reid, John E.; Wernisch, Lorenz

    2016-01-01

    Motivation: Repeated cross-sectional time series single cell data confound several sources of variation, with contributions from measurement noise, stochastic cell-to-cell variation and cell progression at different rates. Time series from single cell assays are particularly susceptible to confounding as the measurements are not averaged over populations of cells. When several genes are assayed in parallel these effects can be estimated and corrected for under certain smoothness assumptions on cell progression. Results: We present a principled probabilistic model with a Bayesian inference scheme to analyse such data. We demonstrate our method’s utility on public microarray, nCounter and RNA-seq datasets from three organisms. Our method almost perfectly recovers withheld capture times in an Arabidopsis dataset, it accurately estimates cell cycle peak times in a human prostate cancer cell line and it correctly identifies two precocious cells in a study of paracrine signalling in mouse dendritic cells. Furthermore, our method compares favourably with Monocle, a state-of-the-art technique. We also show using held-out data that uncertainty in the temporal dimension is a common confounder and should be accounted for in analyses of repeated cross-sectional time series. Availability and Implementation: Our method is available on CRAN in the DeLorean package. Contact: john.reid@mrc-bsu.cam.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27318198

  1. Hurst exponents for short time series

    NASA Astrophysics Data System (ADS)

    Qi, Jingchao; Yang, Huijie

    2011-12-01

    A concept called balanced estimator of diffusion entropy is proposed to detect quantitatively scalings in short time series. The effectiveness is verified by detecting successfully scaling properties for a large number of artificial fractional Brownian motions. Calculations show that this method can give reliable scalings for short time series with length ~10^2. It is also used to detect scalings in the Shanghai Stock Index, five stock catalogs, and a total of 134 stocks collected from the Shanghai Stock Exchange Market. The scaling exponent for each catalog is significantly larger compared with that for the stocks included in the catalog. Selecting a window with size 650, the evolution of scaling for the Shanghai Stock Index is obtained by the window's sliding along the series. Global patterns in the evolutionary process are captured from the smoothed evolutionary curve. By comparing the patterns with the important event list in the history of the considered stock market, the evolution of scaling is matched with the stock index series. We can find that the important events fit very well with global transitions of the scaling behaviors.
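
    The kind of scaling exponent discussed above can be illustrated with a crude variance-scaling estimate; this is only a minimal sketch, not the balanced diffusion-entropy estimator proposed in the paper.

```python
import numpy as np

def scaling_exponent(x, lags=(1, 2, 4, 8, 16, 32)):
    """Estimate a Hurst-like exponent H from std(x[t+lag] - x[t]) ~ lag**H
    by a log-log fit over the given lags."""
    x = np.asarray(x, dtype=float)
    s = [np.std(x[lag:] - x[:-lag]) for lag in lags]
    slope, _ = np.polyfit(np.log(lags), np.log(s), 1)
    return float(slope)

rng = np.random.default_rng(0)
walk = np.cumsum(rng.standard_normal(10000))  # ordinary Brownian motion, H = 0.5
H = scaling_exponent(walk)
```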

  2. Multivariate Voronoi Outlier Detection for Time Series.

    PubMed

    Zwilling, Chris E; Wang, Michelle Yongmei

    2014-10-01

    Outlier detection is a primary step in many data mining and analysis applications, including healthcare and medical research. This paper presents a general method to identify outliers in multivariate time series based on a Voronoi diagram, which we call Multivariate Voronoi Outlier Detection (MVOD). The approach copes with outliers in a multivariate framework, via designing and extracting effective attributes or features from the data that can take parametric or nonparametric forms. Voronoi diagrams allow for automatic configuration of the neighborhood relationship of the data points, which facilitates the differentiation of outliers and non-outliers. Experimental evaluation demonstrates that our MVOD is an accurate, sensitive, and robust method for detecting outliers in multivariate time series data.

  3. Detecting anomalous phase synchronization from time series

    SciTech Connect

    Tokuda, Isao T.; Kumar Dana, Syamal; Kurths, Juergen

    2008-06-15

    Modeling approaches are presented for detecting an anomalous route to phase synchronization from time series of two interacting nonlinear oscillators. The anomalous transition is characterized by an enlargement of the mean frequency difference between the oscillators with an initial increase in the coupling strength. Although such a structure is common in a large class of coupled nonisochronous oscillators, prediction of the anomalous transition is nontrivial for experimental systems, whose dynamical properties are unknown. Two approaches are examined: one is a phase-equation modeling of coupled limit cycle oscillators, and the other is a nonlinear predictive modeling of coupled chaotic oscillators. Application to prototypical models such as two interacting predator-prey systems in both limit cycle and chaotic regimes demonstrates the capability of detecting the anomalous structure from only a few sets of time series. Experimental data from two coupled Chua circuits show its applicability to real experimental systems.

  4. Time-Series Analysis: A Cautionary Tale

    NASA Technical Reports Server (NTRS)

    Damadeo, Robert

    2015-01-01

    Time-series analysis has often been a useful tool in atmospheric science for deriving long-term trends in various atmospherically important parameters (e.g., temperature or the concentration of trace gas species). In particular, time-series analysis has been repeatedly applied to satellite datasets in order to derive the long-term trends in stratospheric ozone, which is a critical atmospheric constituent. However, many of the potential pitfalls relating to the non-uniform sampling of the datasets were often ignored and the results presented by the scientific community have been unknowingly biased. A newly developed and more robust application of this technique is applied to the Stratospheric Aerosol and Gas Experiment (SAGE) II version 7.0 ozone dataset and the previous biases and newly derived trends are presented.

  5. Aggregated Indexing of Biomedical Time Series Data.

    PubMed

    Woodbridge, Jonathan; Mortazavi, Bobak; Sarrafzadeh, Majid; Bui, Alex A T

    2012-09-01

    Remote and wearable medical sensing has the potential to create very large and high dimensional datasets. Medical time series databases must be able to efficiently store, index, and mine these datasets to enable medical professionals to effectively analyze data collected from their patients. Conventional high dimensional indexing methods are a two stage process. First, a superset of the true matches is efficiently extracted from the database. Second, supersets are pruned by comparing each of their objects to the query object and rejecting any objects falling outside a predetermined radius. This pruning stage heavily dominates the computational complexity of most conventional search algorithms. Therefore, indexing algorithms can be significantly improved by reducing the amount of pruning. This paper presents an online algorithm to aggregate biomedical time series data to significantly reduce the search space (index size) without compromising the quality of search results. This algorithm is built on the observation that biomedical time series signals are composed of cyclical and often similar patterns. It takes in a stream of segments and groups them into highly concentrated collections. Locality Sensitive Hashing (LSH) is used to reduce the overall complexity of the algorithm, allowing it to run online. The output of this aggregation is used to populate an index. The proposed algorithm yields logarithmic growth of the index (with respect to the total number of objects) while keeping sensitivity and specificity simultaneously above 98%. Both memory and runtime complexities of time series search are improved when using aggregated indexes. In addition, data mining tasks, such as clustering, exhibit runtimes that are orders of magnitude faster when run on aggregated indexes.

  6. Analysis of Polyphonic Musical Time Series

    NASA Astrophysics Data System (ADS)

    Sommer, Katrin; Weihs, Claus

    A general model for pitch tracking of polyphonic musical time series will be introduced. Based on a model of Davy and Godsill (Bayesian harmonic models for musical pitch estimation and analysis, Technical Report 431, Cambridge University Engineering Department, 2002), the different pitches of the musical sound are estimated simultaneously with MCMC methods. Additionally, a preprocessing step is designed to improve the estimation of the fundamental frequencies (A comparative study on polyphonic musical time series using MCMC methods. In C. Preisach et al., editors, Data Analysis, Machine Learning, and Applications, Springer, Berlin, 2008). The preprocessing step compares real audio data with an alphabet constructed from the McGill Master Samples (Opolko and Wapnick, McGill University Master Samples [Compact disc], McGill University, Montreal, 1987), which consists of tones of different instruments. The tones with minimal Itakura-Saito distortion (Gray et al., Transactions on Acoustics, Speech, and Signal Processing ASSP-28(4):367-376, 1980) are chosen as first estimates and as starting points for the MCMC algorithms. Furthermore, the implementation of the alphabet is an approach to recognizing the instruments generating the musical time series. Results are presented for mixed monophonic data from McGill and for self-recorded polyphonic audio data.

  7. Does the multiple-scattering series in the pion-deuteron scattering actually converge?

    SciTech Connect

    Kudryavtsev, A. E. Romanov, A. I. Gani, V. A.

    2013-07-15

    It is demonstrated that the well-known answer for the multiple-scattering series (MSS) for a light particle interacting with a pair of static nucleons, calculated in the Fixed Centers Approximation (FCA), works well for a wide region of the two-body complex scattering length a. However, this approach is not applicable in a narrow region surrounding the real positive a half-axis, where the MSS does not converge. At the same time, for real positive a's the 3-body system forms an infinite set of bound states.

  8. Weighted Dynamic Time Warping for Time Series Classification

    SciTech Connect

    Jeong, Young-Seon; Jeong, Myong K; Omitaomu, Olufemi A

    2011-01-01

    Dynamic time warping (DTW), which finds the minimum path by providing non-linear alignments between two time series, has been widely used as a distance measure for time series classification and clustering. However, DTW does not account for the relative importance regarding the phase difference between a reference point and a testing point. This may lead to misclassification especially in applications where the shape similarity between two sequences is a major consideration for an accurate recognition. Therefore, we propose a novel distance measure, called a weighted DTW (WDTW), which is a penalty-based DTW. Our approach penalizes points with higher phase difference between a reference point and a testing point in order to prevent minimum distance distortion caused by outliers. The rationale underlying the proposed distance measure is demonstrated with some illustrative examples. A new weight function, called the modified logistic weight function (MLWF), is also proposed to systematically assign weights as a function of the phase difference between a reference point and a testing point. By applying different weights to adjacent points, the proposed algorithm can enhance the detection of similarity between two time series. We show that some popular distance measures such as DTW and Euclidean distance are special cases of our proposed WDTW measure. We extend the proposed idea to other variants of DTW such as derivative dynamic time warping (DDTW) and propose the weighted version of DDTW. We have compared the performances of our proposed procedures with other popular approaches using public data sets available through the UCR Time Series Data Mining Archive for both time series classification and clustering problems. The experimental results indicate that the proposed approaches can achieve improved accuracy for time series classification and clustering problems.
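
    The weighted DTW recursion can be sketched as below. The weight follows the modified logistic form described above, applied to the phase difference |i - j|; the parameter values here are illustrative, not those of the paper.

```python
import numpy as np

def mlwf(n, g=0.05, w_max=1.0):
    """Modified logistic weight function: a weight for each possible
    phase difference d = 0 .. n-1, centered at the series midpoint."""
    d = np.arange(n)
    return w_max / (1.0 + np.exp(-g * (d - n / 2.0)))

def wdtw(a, b, g=0.05):
    """Weighted DTW: the cost of matching a[i] with b[j] is scaled by a
    weight that grows with the phase difference |i - j|, penalizing
    alignments that warp far from the diagonal."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    n, m = len(a), len(b)
    w = mlwf(max(n, m), g)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = w[abs(i - j)] * (a[i - 1] - b[j - 1]) ** 2
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])
```

Setting the weight identically to 1 recovers plain DTW, consistent with the claim that DTW is a special case of WDTW.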

  9. Active Mining from Process Time Series by Learning Classifier System

    NASA Astrophysics Data System (ADS)

    Kurahashi, Setsuya; Terano, Takao

    Continuous processes in chemical and/or biotechnical plants generate large amounts of time series data. However, since conventional process models are described as a set of control models, it is difficult to explain complicated and active plant behaviors. Against this background, this research proposes a novel method to develop a process response model from continuous time-series data. The method consists of the following phases: 1) Collect continuous process data at each tag point in a target plant; 2) Normalize the data to the interval between zero and one; 3) Get the delay time that maximizes the correlation between two given time series; 4) Select tags with the higher correlations; 5) Develop a process response model describing the relations among the process data using the delay times and correlation values; 6) Develop a process prediction model from several tag points' data using a neural network; 7) Discover control rules from the process prediction model using a Learning Classifier System. The main contribution of the research is to establish a method to mine a set of meaningful control rules from the Learning Classifier System using the Minimal Description Length criterion. The proposed method has been applied to an actual process of a biochemical plant and has shown its validity and effectiveness.

  10. Long GPS coordinate time series: multipath and geometry effects

    NASA Astrophysics Data System (ADS)

    King, M.; Watson, C. S.

    2009-12-01

    Within analyses of Global Positioning System (GPS) observations, unmodelled sub-daily signals are known to propagate into long-period signals via a number of different mechanisms. We report on the effects of time-variable satellite geometry and the propagation of an unmodelled multipath signal. Multipath reflectors at H=0.1 m, 0.2 m and 1.5 m below the antenna are modeled and their effects on GPS coordinate time series are examined. Simulated time series at 20 global IGS sites for 2000-2008 were derived using the satellite geometry as defined by daily broadcast orbits, in addition to that defined using a perfectly repeating synthetic orbit. For the simulations generated using the broadcast orbits with a perfectly clear horizon, we observe the introduction of a time-variable bias in the time series of up to several centimeters. Considerable site-to-site variability of the frequency and magnitude of the signal is observed, in addition to variation as a function of multipath source. When adopting realistic GPS observation geometries obtained from real data (e.g., those that include the effects of tracking outages, local obstructions, etc.), we observe concerning levels of temporal coordinate variation in the presence of the multipath signals. In these cases, we observe spurious signals across the frequency domain, in addition to what appear as offsets and secular trends. Velocity biases of more than 1 mm/yr are evident at a few sites. The propagated signal in the vertical component is consistent with a noise model with a spectral index marginally above flicker noise (mean index -1.4), with some sites exhibiting power-law magnitudes at comparable levels to actual height time series generated in GIPSY. The propagated signal also shows clear spectral peaks across all coordinate components at harmonics of the draconitic year for a GPS satellite (351.2 days).
When a perfectly repeating synthetic GPS constellation is used, the simulations show near-negligible power law

  11. Fractal fluctuations in cardiac time series

    NASA Technical Reports Server (NTRS)

    West, B. J.; Zhang, R.; Sanders, A. W.; Miniyar, S.; Zuckerman, J. H.; Levine, B. D.; Blomqvist, C. G. (Principal Investigator)

    1999-01-01

    Human heart rate, controlled by complex feedback mechanisms, is a vital index of systemic circulation. However, it has been shown that beat-to-beat values of heart rate fluctuate continually over a wide range of time scales. Herein we use the relative dispersion, the ratio of the standard deviation to the mean, to show, by systematically aggregating the data, that the correlation in the beat-to-beat cardiac time series is a modulated inverse power law. This scaling property indicates the existence of long-time memory in the underlying cardiac control process and supports the conclusion that heart rate variability is a temporal fractal. We argue that the cardiac control system has allometric properties that enable it to respond to a dynamical environment through scaling.
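
    The aggregation step behind a relative-dispersion analysis can be sketched as follows (an illustrative sketch only; the block sizes and the white-noise check below are assumptions, not the cardiac data of the study):

```python
import numpy as np

def relative_dispersion(x, sizes):
    """Relative dispersion (std/mean) of the series aggregated over blocks of
    size m. For a fractal series RD(m) falls off as an inverse power law in m."""
    out = []
    for m in sizes:
        n = len(x) // m
        blocks = x[:n * m].reshape(n, m).mean(axis=1)  # aggregate m beats at a time
        out.append(blocks.std(ddof=1) / blocks.mean())
    return np.array(out)

# the slope of log RD(m) versus log m estimates the scaling exponent;
# an uncorrelated series gives slope -1/2, long-memory series decay more slowly
```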

  12. Time series modeling for automatic target recognition

    NASA Astrophysics Data System (ADS)

    Sokolnikov, Andre

    2012-05-01

    Time series modeling is proposed for the identification of targets whose images are not clearly seen. The model building takes into account air turbulence, precipitation, fog, smoke, and other factors obscuring and distorting the image. The complex of library data (images, etc.) serving as a basis for identification provides the deterministic part of the identification process, while partial image features, distorted parts, irrelevant pieces, and the absence of particular features comprise the stochastic part of the target identification. A missing-data approach is elaborated that aids the prediction process for image creation or reconstruction. Results are provided.

  13. Time series analyses of global change data.

    PubMed

    Lane, L J; Nichols, M H; Osborn, H B

    1994-01-01

    The hypothesis that statistical analyses of historical time series data can be used to separate the influences of natural variations from anthropogenic sources on global climate change is tested. Point, regional, national, and global temperature data are analyzed. Trend analyses for the period 1901-1987 suggest mean annual temperatures increased (in degrees C per century) globally at the rate of about 0.5, in the USA at about 0.3, in the south-western USA desert region at about 1.2, and at the Walnut Gulch Experimental Watershed in south-eastern Arizona at about 0.8. However, the rates of temperature change are not constant but vary within the 87-year period. Serial correlation and spectral density analysis of the temperature time series showed weak periodicities at various frequencies. The only common periodicity among the temperature series is an apparent cycle of about 43 years. The temperature time series were correlated with the Wolf sunspot index, atmospheric CO(2) concentrations interpolated from the Siple ice core data, and atmospheric CO(2) concentration data from Mauna Loa measurements. Correlation analysis of temperature data with concurrent data on atmospheric CO(2) concentrations and the Wolf sunspot index support previously reported significant correlation over the 1901-1987 period. Correlation analysis between temperature, atmospheric CO(2) concentration, and the Wolf sunspot index for the shorter period, 1958-1987, when continuous Mauna Loa CO(2) data are available, suggest significant correlation between global warming and atmospheric CO(2) concentrations but no significant correlation between global warming and the Wolf sunspot index. This may be because the Wolf sunspot index apparently increased from 1901 until about 1960 and then decreased thereafter, while global warming apparently continued to increase through 1987. 
Correlation of sunspot activity with global warming may be spurious, but additional analyses are required to test this hypothesis.

  14. Gwilym Jenkins, Experimental Design and Time Series.

    DTIC Science & Technology

    1984-04-01

    of a changing process. This led to studies of discrete dynamic models and control problems and finally to work on time series and forecasting. ...practice based on sound theory in a never-ending iteration. The results of this mode of thinking come through strongly, for example, in his book with ...arrival in Princeton marked the beginning of a long and happy collaboration between us which later resulted in much visiting to and from between England and

  15. Consistency of IVS nutation time series

    NASA Astrophysics Data System (ADS)

    Gattano, César; Lambert, Sébastien; Bizouard, Christian

    2016-04-01

    We give a review of the various VLBI-derived nutation time series provided by the different operational analysis centers of the IVS and three combination centers (IVS, IERS EOP Center, and Rapid Service/Prediction Center). We focus on the stability of small nutation amplitudes, including the free core nutation and other atmospherically-driven nutations, that are of interest for improving Earth models. We discuss the possible origins of the differences (software packages, inversion methods, analysis configuration including a priori and estimation strategy) and the consequences for scientific exploitation of the data, especially in terms of nutation modeling and inference of the Earth's internal structure.

  16. Modeling stylized facts for financial time series

    NASA Astrophysics Data System (ADS)

    Krivoruchenko, M. I.; Alessio, E.; Frappietro, V.; Streckert, L. J.

    2004-12-01

    Multivariate probability density functions of returns are constructed in order to model the empirical behavior of returns in a financial time series. They describe the well-established deviations from the Gaussian random walk, such as an approximate scaling and heavy tails of the return distributions, long-ranged volatility-volatility correlations (volatility clustering), and return-volatility correlations (leverage effect). The model is successfully tested by fitting joint distributions of 100+ years of daily price returns of the Dow Jones 30 Industrial Average.

  17. Long GPS coordinate time series: multipath and geometry effects

    NASA Astrophysics Data System (ADS)

    King, M. A.; Watson, C. S.

    2009-04-01

    Within analyses of Global Positioning System (GPS) observations, unmodelled sub-daily signals are known to propagate into long-period signals via a number of different mechanisms. In this paper, we investigate the effects of time-variable satellite geometry and the propagation of an unmodelled multipath signal that is analogous to a change in the elevation-dependent phase centre of the receiving antenna. Multipath reflectors at H=0.1 m, 0.2 m and 1.5 m below the antenna are modeled and their effects on GPS coordinate time series are examined. Simulated time series at 20 global IGS sites for 2000-2008 were derived using the satellite geometry as defined by daily broadcast orbits, in addition to that defined using a perfectly repeating synthetic orbit. For the simulations generated using the broadcast orbits with a perfectly clear horizon, we observe the introduction of a time-variable bias in the time series of up to several centimeters. Considerable site-to-site variability of the frequency and magnitude of the signal is observed, in addition to variation as a function of multipath source. When adopting realistic GPS observation geometries obtained from real data (e.g., those that include the effects of tracking outages, local obstructions, etc.), we observe concerning levels of temporal coordinate variation in the presence of the multipath signals. In these cases, we observe spurious signals across the frequency domain, in addition to what appear as offsets and secular trends. Velocity biases of more than 1 mm/yr are evident at a few sites. The propagated signal in the vertical component is consistent with a noise model with a spectral index marginally above flicker noise (mean index -1.4), with some sites exhibiting power-law magnitudes at comparable levels to actual height time series generated in GIPSY. The propagated signal also shows clear spectral peaks across all coordinate components at harmonics of the draconitic year for a GPS satellite (351.4 days).

  18. The CACAO Method for Smoothing, Gap Filling, and Characterizing Seasonal Anomalies in Satellite Time Series

    NASA Technical Reports Server (NTRS)

    Verger, Aleixandre; Baret, F.; Weiss, M.; Kandasamy, S.; Vermote, E.

    2013-01-01

    Consistent, continuous, and long time series of global biophysical variables derived from satellite data are required for global change research. A novel climatology-fitting approach called CACAO (Consistent Adjustment of the Climatology to Actual Observations) is proposed to reduce noise and fill gaps in time series by scaling and shifting the seasonal climatological patterns to the actual observations. The shift and scale CACAO parameters, adjusted for each season, allow quantifying shifts in the timing of seasonal phenology and inter-annual variations in magnitude as compared to the average climatology. CACAO was assessed first over simulated daily Leaf Area Index (LAI) time series with varying fractions of missing data and noise. Then, performances were analyzed over actual satellite LAI products derived from the AVHRR Long-Term Data Record for the 1981-2000 period over the BELMANIP2 globally representative sample of sites. Comparison with two widely used temporal filtering methods, the asymmetric Gaussian (AG) model and the Savitzky-Golay (SG) filter as implemented in TIMESAT, revealed that CACAO achieved better performance for smoothing AVHRR time series characterized by high levels of noise and frequent missing observations. The resulting smoothed time series captures the vegetation dynamics well and shows no gaps, as compared to the 50-60% of data still missing after AG or SG reconstructions. Results of simulation experiments as well as comparison with actual AVHRR time series indicate that the proposed CACAO method is more robust to noise and missing data than the AG and SG methods for phenology extraction.
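
    A toy version of the per-season shift-and-scale adjustment that CACAO performs might look like this (a hypothetical grid-search sketch under simplifying assumptions, not the published implementation):

```python
import numpy as np

def fit_shift_scale(obs, t_obs, clim_t, clim_y, shifts):
    """Fit obs ~ scale * climatology(t - shift): grid-search the temporal shift,
    closed-form least squares for the magnitude scale. Returns (shift, scale, rmse)."""
    best = None
    for dt in shifts:
        c = np.interp(t_obs - dt, clim_t, clim_y)   # shifted climatology at obs times
        denom = np.dot(c, c)
        if denom == 0:
            continue
        s = np.dot(c, obs) / denom                  # least-squares magnitude scaling
        r = np.sqrt(np.mean((obs - s * c) ** 2))
        if best is None or r < best[2]:
            best = (dt, s, r)
    return best
```

    The fitted shift quantifies a change in seasonal timing and the fitted scale an inter-annual change in magnitude, in the spirit of the abstract; gap filling then amounts to evaluating the adjusted climatology where observations are missing.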

  19. Time series analysis of temporal networks

    NASA Astrophysics Data System (ADS)

    Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh

    2016-01-01

    A common but important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application-oriented studies only knowing these properties is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without knowledge of the full network structure; rather, an estimate of some of the properties is sufficient to launch the attack. In this paper we show that even if the network structure at a future time point is not available, one can still manage to estimate its properties. We propose a novel method to map a temporal network to a set of time series instances, analyze them, and, using a standard forecast model of time series, predict the properties of a temporal network at a later time instance. To this aim, we consider eight properties such as the number of active nodes, average degree, and clustering coefficient, and apply our prediction framework to them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as Auto-Regressive Integrated Moving Average (ARIMA). We use cross-validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency-domain properties of the time series obtained from spectrogram analysis could be used to refine the prediction framework by identifying beforehand the cases where the error in prediction is likely to be high. This leads to an improvement of 7.96% (for error level ≤20%) in prediction accuracy on average across all datasets. As an application we show how such a prediction scheme can be used to launch targeted attacks on temporal networks.
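
    The map-then-forecast pipeline described above can be sketched as below; a hand-rolled AR(1) fit stands in for the paper's ARIMA models, and the `property_series` helper over per-step edge sets is an assumed toy representation:

```python
import numpy as np

def property_series(snapshots):
    """Map a temporal network (a list of edge sets, one per time step) to
    property time series: number of active nodes and average degree."""
    n_active, avg_deg = [], []
    for g in snapshots:
        nodes = {u for e in g for u in e}
        n_active.append(len(nodes))
        avg_deg.append(2.0 * len(g) / max(1, len(nodes)))
    return np.array(n_active, float), np.array(avg_deg, float)

def ar1_forecast(x, steps=1):
    """Minimal AR(1) stand-in for an ARIMA forecaster: x[t+1] = c + phi*x[t]."""
    x0, x1 = x[:-1], x[1:]
    phi = np.dot(x0 - x0.mean(), x1 - x1.mean()) / np.dot(x0 - x0.mean(), x0 - x0.mean())
    c = x1.mean() - phi * x0.mean()
    out, last = [], x[-1]
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return np.array(out)
```

    Each property series would then be forecast independently, and the forecast properties used, e.g., to plan a targeted attack without the future network structure.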

  20. Fast Algorithms for Mining Co-evolving Time Series

    DTIC Science & Technology

    2011-09-01

    resolution methods: Fourier and wavelets ... Time series forecasting ... categorical data. Our work is based on two key properties in those co-evolving time series, dynamics and correlation. Dynamics captures the temporal ... applications. 2.2 A survey on time series methods: There is a lot of work on time series analysis, on indexing, dimensionality reduction, forecasting

  1. Forecasting the Time Series of Sunspot Numbers

    NASA Astrophysics Data System (ADS)

    Aguirre, L. A.; Letellier, C.; Maquet, J.

    2008-05-01

    Forecasting the solar cycle is of great importance for weather prediction and environmental monitoring, and also constitutes a difficult scientific benchmark in nonlinear dynamical modeling. This paper describes the identification of a model and its use in forecasting the time series of Wolf's sunspot numbers. A key feature of this procedure is that the original time series is first transformed into a symmetrical space where the dynamics of the solar dynamo are better unfolded, thus improving the model. The nonlinear model obtained is parsimonious and has both deterministic and stochastic parts. Monte Carlo simulation of the whole model produces results that are very consistent with the deterministic part of the model but also allows for the determination of confidence bands. The obtained model was used to predict cycles 24 and 25, although the forecast of the latter is seen as a crude approximation, given the long prediction horizon required. As for the 24th cycle, two estimates were obtained with peaks of 65±16 and of 87±13 units of sunspot numbers. The simulated results suggest that the 24th cycle will be shorter and less active than the preceding one.

  2. Tremor classification and tremor time series analysis

    NASA Astrophysics Data System (ADS)

    Deuschl, Günther; Lauk, Michael; Timmer, Jens

    1995-03-01

    The separation between physiologic tremor (PT) in normal subjects and the pathological tremors of essential tremor (ET) or Parkinson's disease (PD) was investigated on the basis of monoaxial accelerometric recordings of 35 s hand tremor epochs. Frequency and amplitude were insufficient to separate these conditions, except for the trivial distinction between normal and pathologic tremors that is already defined on the basis of amplitude. We found that waveform analysis revealed highly significant differences between normal and pathologic tremors and, more importantly, among different forms of pathologic tremors. In our group of 25 patients with PT and 15 with ET, we found a reasonable distinction using the third moment and time-reversal invariance, and a nearly complete distinction between these two conditions on the basis of the asymmetric decay of the autocorrelation function. We conclude that time series analysis can probably be developed into a powerful tool for the objective analysis of tremors.

  3. Managing distribution changes in time series prediction

    NASA Astrophysics Data System (ADS)

    Matias, J. M.; Gonzalez-Manteiga, W.; Taboada, J.; Ordonez, C.

    2006-07-01

    When a problem is modeled statistically, a single distribution model is usually postulated that is assumed to be valid for the entire space. Nonetheless, this practice may be somewhat unrealistic in certain application areas, in which the conditions of the process that generates the data may change; as far as we are aware, however, no techniques have been developed to tackle this problem. This article proposes a technique for modeling and predicting this change in time series with a view to improving estimates and predictions. The technique is applied, among other models, to the recently proposed hypernormal distribution. When tested on real data from a range of stock market indices, the technique produces better results than when a single distribution model is assumed to be valid for the entire period of time studied. Moreover, when a global model is postulated, it is highly recommended to select the hypernormal distribution parameter in the same likelihood-maximization process.

  4. Time series for blind biosignal classification model.

    PubMed

    Wong, Derek F; Chao, Lidia S; Zeng, Xiaodong; Vai, Mang-I; Lam, Heng-Leong

    2014-11-01

    Biosignals such as electrocardiograms (ECG), electroencephalograms (EEG), and electromyograms (EMG), are important noninvasive measurements useful for making diagnostic decisions. Recently, considerable research has been conducted in order to potentially automate signal classification for assisting in disease diagnosis. However, the biosignal type (ECG, EEG, EMG or other) needs to be known prior to the classification process. If the given biosignal is of an unknown type, none of the existing methodologies can be utilized. In this paper, a blind biosignal classification model (B(2)SC Model) is proposed in order to identify the source biosignal type automatically, and thus ultimately benefit the diagnostic decision. The approach employs time series algorithms for constructing the model. It uses a dynamic time warping (DTW) algorithm with clustering to discover the similarity between two biosignals, and consequently classifies disease without prior knowledge of the source signal type. The empirical experiments presented in this paper demonstrate the effectiveness of the method as well as the scalability of the approach.

  5. Adaptive time-variant models for fuzzy-time-series forecasting.

    PubMed

    Wong, Wai-Keung; Bai, Enjian; Chu, Alice Wai-Ching

    2010-12-01

    A fuzzy time series has been applied to the prediction of enrollment, temperature, stock indices, and other domains. Related studies mainly focus on three factors, namely, the partition of discourse, the content of forecasting rules, and the methods of defuzzification, all of which greatly influence the prediction accuracy of forecasting models. These studies use fixed analysis window sizes for forecasting. In this paper, an adaptive time-variant fuzzy-time-series forecasting model (ATVF) is proposed to improve forecasting accuracy. The proposed model automatically adapts the analysis window size of fuzzy time series based on the prediction accuracy in the training phase and uses heuristic rules to generate forecasting values in the testing phase. The performance of the ATVF model is tested using both simulated and actual time series including the enrollments at the University of Alabama, Tuscaloosa, and the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX). The experiment results show that the proposed ATVF model achieves a significant improvement in forecasting accuracy as compared to other fuzzy-time-series forecasting models.

  6. Automated time series forecasting for biosurveillance.

    PubMed

    Burkom, Howard S; Murphy, Sean Patrick; Shmueli, Galit

    2007-09-30

    For robust detection performance, traditional control chart monitoring for biosurveillance is based on input data free of trends, day-of-week effects, and other systematic behaviour. Time series forecasting methods may be used to remove this behaviour by subtracting forecasts from observations to form residuals for algorithmic input. We describe three forecast methods and compare their predictive accuracy on each of 16 authentic syndromic data streams. The methods are (1) a non-adaptive regression model using a long historical baseline, (2) an adaptive regression model with a shorter, sliding baseline, and (3) the Holt-Winters method for generalized exponential smoothing. Criteria for comparing the forecasts were the root-mean-square error, the median absolute per cent error (MedAPE), and the median absolute deviation. The median-based criteria showed best overall performance for the Holt-Winters method. The MedAPE measures over the 16 test series averaged 16.5, 11.6, and 9.7 for the non-adaptive regression, adaptive regression, and Holt-Winters methods, respectively. The non-adaptive regression forecasts were degraded by changes in the data behaviour in the fixed baseline period used to compute model coefficients. The mean-based criterion was less conclusive because of the effects of poor forecasts on a small number of calendar holidays. The Holt-Winters method was also most effective at removing serial autocorrelation, with most 1-day-lag autocorrelation coefficients below 0.15. The forecast methods were compared without tuning them to the behaviour of individual series. We achieved improved predictions with such tuning of the Holt-Winters method, but practical use of such improvements for routine surveillance will require reliable data classification methods.
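
    Method (3) can be sketched as follows (additive Holt-Winters with level, trend, and day-of-week seasonality; the initialization and smoothing constants are illustrative, not those tuned in the study):

```python
import numpy as np

def holt_winters_residuals(y, period=7, alpha=0.4, beta=0.1, gamma=0.15):
    """One-step-ahead residuals from additive Holt-Winters smoothing.
    Residuals = observation - forecast, the algorithmic input described above."""
    season = y[:period] - y[:period].mean()   # crude init from the first cycle
    level, trend = y[:period].mean(), 0.0
    resid = []
    for t in range(period, len(y)):
        fcast = level + trend + season[t % period]
        resid.append(y[t] - fcast)
        last_level = level
        level = alpha * (y[t] - season[t % period]) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        season[t % period] = gamma * (y[t] - level) + (1 - gamma) * season[t % period]
    return np.array(resid)
```

    Subtracting such forecasts removes day-of-week effects and slow trends, leaving residuals closer to the trend-free input that control-chart detectors assume.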

  7. Correcting and combining time series forecasters.

    PubMed

    Firmino, Paulo Renato A; de Mattos Neto, Paulo S G; Ferreira, Tiago A E

    2014-02-01

    Combined forecasters have been in the vanguard of stochastic time series modeling. It has been usual to suppose that each single model generates a residual or prediction error like white noise. However, mostly because of disturbances not captured by each model, this supposition may be violated. The present paper introduces a two-step method for correcting and combining forecasting models. First, the stochastic process underlying the bias of each predictive model is modeled by a recursive ARIMA algorithm in order to achieve white-noise behavior. At each iteration of the algorithm the best ARIMA adjustment is determined according to a given information criterion (e.g., Akaike). Then, in the light of the corrected predictions, a maximum-likelihood combined estimator is considered. Applications involving single ARIMA and artificial neural network models for the Dow Jones Industrial Average Index, S&P 500 Index, Google stock value, and Nasdaq Index series illustrate the usefulness of the proposed framework.
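
    A simplified version of the correct-then-combine idea (a constant-bias correction stands in for the paper's recursive ARIMA bias model; the weights are the maximum-likelihood ones for independent Gaussian errors):

```python
import numpy as np

def combine_corrected(preds, actual, new_preds):
    """Correct each forecaster for its historical mean bias, then combine the
    corrected forecasts with inverse-variance (Gaussian ML) weights."""
    preds = np.asarray(preds, float)            # shape (n_models, n_obs)
    errors = preds - actual
    bias = errors.mean(axis=1)                  # per-model systematic error
    var = errors.var(axis=1, ddof=1)
    w = (1.0 / var) / np.sum(1.0 / var)         # ML weights, independent Gaussians
    corrected = np.asarray(new_preds, float) - bias
    return np.dot(w, corrected)
```

    The full method replaces the constant bias with whatever ARIMA structure the error process exhibits, iterating until the corrected residuals look like white noise.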

  8. Periodograms for multiband astronomical time series

    NASA Astrophysics Data System (ADS)

    Ivezic, Z.; VanderPlas, J. T.

    2016-05-01

    We summarize the multiband periodogram, a general extension of the well-known Lomb-Scargle approach for detecting periodic signals in time-domain data, developed by VanderPlas & Ivezic (2015). A Python implementation of this method is available on GitHub. The multiband periodogram significantly improves period finding for randomly sampled multiband light curves (e.g., Pan-STARRS, DES, and LSST), and can treat non-uniform sampling and heteroscedastic errors. The light curves in each band are modeled as arbitrary truncated Fourier series, with the period and phase shared across all bands. The key aspect is the use of Tikhonov regularization, which drives most of the variability into the so-called base model common to all bands, while fits for individual bands describe residuals relative to the base model and typically require lower-order Fourier series. We use simulated light curves and randomly subsampled SDSS Stripe 82 data to demonstrate the superiority of this method compared to other methods from the literature, and find that this method will be able to efficiently determine the correct period in the majority of LSST's bright RR Lyrae stars with as little as six months of LSST data.

  9. Normalizing the causality between time series

    NASA Astrophysics Data System (ADS)

    Liang, X. San

    2015-08-01

    Recently, a rigorous yet concise formula was derived to evaluate information flow, and hence the causality in a quantitative sense, between time series. To assess the importance of a resulting causality, it needs to be normalized. The normalization is achieved through distinguishing a Lyapunov exponent-like, one-dimensional phase-space stretching rate and a noise-to-signal ratio from the rate of information flow in the balance of the marginal entropy evolution of the flow recipient. It is verified with autoregressive models and applied to a real financial analysis problem. An unusually strong one-way causality is identified from IBM (International Business Machines Corporation) to GE (General Electric Company) in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a giant for the mainframe computer market.
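
    A sketch of a covariance-based information-flow estimate in the spirit of Liang's closed-form formula (forward differences estimate the derivative series; sign and normalization conventions vary, and the normalization that is the subject of this abstract is not reproduced here):

```python
import numpy as np

def liang_flow(x1, x2, dt=1.0):
    """Estimate the information flow T_{2->1} from series x2 to series x1 from
    sample covariances of (x1, x2, dx1/dt). A sketch, not the normalized measure."""
    d1 = (x1[1:] - x1[:-1]) / dt                # Euler forward-difference derivative
    X1, X2 = x1[:-1], x2[:-1]
    C = np.cov(np.vstack([X1, X2, d1]))
    c11, c12, c22 = C[0, 0], C[0, 1], C[1, 1]
    c1d, c2d = C[0, 2], C[1, 2]
    return (c11 * c12 * c2d - c12 ** 2 * c1d) / (c11 ** 2 * c22 - c11 * c12 ** 2)
```

    On a pair of coupled autoregressive series where only x2 drives x1, the estimated flow 2→1 should dominate the flow 1→2, which is the asymmetry that distinguishes causality from mere correlation.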

  10. Using entropy to cut complex time series

    NASA Astrophysics Data System (ADS)

    Mertens, David; Poncela Casasnovas, Julia; Spring, Bonnie; Amaral, L. A. N.

    2013-03-01

    Using techniques from statistical physics, physicists have modeled and analyzed human phenomena varying from academic citation rates to disease spreading to vehicular traffic jams. The last decade's explosion of digital information and the growing ubiquity of smartphones have led to a wealth of human self-reported data. This wealth of data comes at a cost, including non-uniform sampling and statistically significant but physically insignificant correlations. In this talk I present our work using entropy to identify stationary sub-sequences of self-reported human weight from a weight-management web site. Our entropic approach, inspired by the Infomap network community detection algorithm, is far less biased by rare fluctuations than more traditional time series segmentation techniques. Supported by the Howard Hughes Medical Institute.

  11. Scaling laws from geomagnetic time series

    USGS Publications Warehouse

    Voros, Z.; Kovacs, P.; Juhasz, A.; Kormendi, A.; Green, A.W.

    1998-01-01

    The notion of extended self-similarity (ESS) is applied here to the X-component time series of geomagnetic field fluctuations. Plotting nth-order structure functions against the fourth-order structure function, we show that low-frequency geomagnetic fluctuations up to the order n = 10 follow the same scaling laws as MHD fluctuations in the solar wind; however, for higher frequencies (f > 1/(5 h)) a clear departure from the expected universality is observed for n > 6. ESS does not allow one to make an unambiguous statement about the non-triviality of scaling laws in "geomagnetic" turbulence. However, we suggest using higher-order moments as promising diagnostic tools for mapping the contributions of various remote magnetospheric sources to local observatory data. Copyright 1998 by the American Geophysical Union.
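
    The structure functions and ESS slopes used here can be computed as follows (an illustrative sketch; the Brownian-motion check in the comment is an assumption, not geomagnetic data):

```python
import numpy as np

def structure_function(x, n, taus):
    """n-th order structure function S_n(tau) = <|x(t+tau) - x(t)|^n>."""
    return np.array([np.mean(np.abs(x[tau:] - x[:-tau]) ** n) for tau in taus])

def ess_slope(x, n, ref=4, taus=range(1, 50)):
    """Extended self-similarity: slope of log S_n versus log S_ref (ref = 4),
    i.e., the relative scaling exponent zeta_n / zeta_ref."""
    sn = np.log(structure_function(x, n, taus))
    sr = np.log(structure_function(x, ref, taus))
    return np.polyfit(sr, sn, 1)[0]

# for ordinary Brownian motion S_n(tau) ~ tau^(n/2), so ess_slope(x, n) -> n/4
```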

  12. Deconvolution of time series in the laboratory

    NASA Astrophysics Data System (ADS)

    John, Thomas; Pietschmann, Dirk; Becker, Volker; Wagner, Christian

    2016-10-01

    In this study, we present two practical applications of the deconvolution of time series in Fourier space. First, we reconstruct a filtered input signal of sound cards that has been heavily distorted by a built-in high-pass filter using a software approach. Using deconvolution, we can partially bypass the filter and extend the dynamic frequency range by two orders of magnitude. Second, we construct required input signals for a mechanical shaker in order to obtain arbitrary acceleration waveforms, referred to as feedforward control. For both situations, experimental and theoretical approaches are discussed to determine the system-dependent frequency response. Moreover, for the shaker, we propose a simple feedback loop as an extension to the feedforward control in order to handle nonlinearities of the system.
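
    The first application amounts to regularized division in Fourier space; a minimal sketch (the Tikhonov constant eps and the synthetic test setup are assumptions, not the sound-card experiment):

```python
import numpy as np

def deconvolve(measured, response, eps=1e-3):
    """Recover the input of a linear system by division in Fourier space.
    Tikhonov-style regularization (eps) avoids blow-up where the frequency
    response of the system is near zero."""
    M = np.fft.rfft(measured)
    H = np.fft.rfft(response, n=len(measured))
    Hc = np.conj(H)
    X = M * Hc / (H * Hc + eps)                 # regularized inverse filter
    return np.fft.irfft(X, n=len(measured))
```

    The same machinery, run in reverse (multiplying by the measured frequency response), yields the pre-distorted drive signals used for the shaker's feedforward control.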

  13. Phase correlation of foreign exchange time series

    NASA Astrophysics Data System (ADS)

    Wu, Ming-Chya

    2007-03-01

    Correlation of foreign exchange rates in currency markets is investigated based on empirical data of USD/DEM and USD/JPY exchange rates for the period from February 1, 1986 to December 31, 1996. The return of each exchange-rate time series is first decomposed into a number of intrinsic mode functions (IMFs) by the empirical mode decomposition method. The instantaneous phases of the resultant IMFs, calculated by the Hilbert transform, are then used to characterize the behaviors of pricing transmissions, and the correlation is probed by measuring the phase differences between two IMFs of the same order. From the distribution of phase differences, our results show explicitly that the correlations are stronger on the daily time scale than on longer time scales. The demonstration for the periods 1986-1989 and 1990-1993 indicates that the two exchange rates were more correlated in the former period than in the latter. This result is consistent with the observations from the cross-correlation calculation.
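
    The Hilbert-transform phase step can be sketched as follows (the standard FFT construction of the analytic signal; the empirical mode decomposition into IMFs is assumed to have been done separately):

```python
import numpy as np

def instantaneous_phase(x):
    """Instantaneous phase via the analytic signal: zero the negative
    frequencies, double the positive ones, and take the complex angle."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0                         # Nyquist bin kept once
    analytic = np.fft.ifft(X * h)
    return np.unwrap(np.angle(analytic))

# the correlation probe is then a histogram of
# instantaneous_phase(imf_a) - instantaneous_phase(imf_b)
# for same-order IMFs of the two exchange-rate series
```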

  14. PERIODOGRAMS FOR MULTIBAND ASTRONOMICAL TIME SERIES

    SciTech Connect

    VanderPlas, Jacob T.; Ivezic, Željko

    2015-10-10

    This paper introduces the multiband periodogram, a general extension of the well-known Lomb–Scargle approach for detecting periodic signals in time-domain data. In addition to advantages of the Lomb–Scargle method such as treatment of non-uniform sampling and heteroscedastic errors, the multiband periodogram significantly improves period finding for randomly sampled multiband light curves (e.g., Pan-STARRS, DES, and LSST). The light curves in each band are modeled as arbitrary truncated Fourier series, with the period and phase shared across all bands. The key aspect is the use of Tikhonov regularization which drives most of the variability into the so-called base model common to all bands, while fits for individual bands describe residuals relative to the base model and typically require lower-order Fourier series. This decrease in the effective model complexity is the main reason for improved performance. After a pedagogical development of the formalism of least-squares spectral analysis, which motivates the essential features of the multiband model, we use simulated light curves and randomly subsampled SDSS Stripe 82 data to demonstrate the superiority of this method compared to other methods from the literature and find that this method will be able to efficiently determine the correct period in the majority of LSST’s bright RR Lyrae stars with as little as six months of LSST data, a vast improvement over the years of data reported to be required by previous studies. A Python implementation of this method, along with code to fully reproduce the results reported here, is available on GitHub.
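
    The least-squares view of the periodogram that the paper builds on can be sketched in pure NumPy: at each trial frequency, fit a sinusoid to the irregularly sampled data and record the variance explained. This is the single-band building block only, under assumed synthetic data; the full multiband method (shared period and phase, Tikhonov-regularized base model) is available in the authors' gatspy package.

```python
import numpy as np

def ls_power(t, y, freqs):
    """Least-squares periodogram: at each trial frequency f, fit
    y ~ a*sin(2*pi*f*t) + b*cos(2*pi*f*t) and record variance explained."""
    y = y - y.mean()
    power = np.empty(len(freqs))
    for i, f in enumerate(freqs):
        X = np.column_stack([np.sin(2 * np.pi * f * t),
                             np.cos(2 * np.pi * f * t)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        power[i] = 1.0 - np.sum((y - X @ coef) ** 2) / np.sum(y ** 2)
    return power

# Irregularly sampled sinusoid with period 2.5 (synthetic light curve)
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 100, 300))
y = np.sin(2 * np.pi * t / 2.5) + 0.2 * rng.standard_normal(300)
freqs = np.linspace(0.05, 1.0, 2000)
pw = ls_power(t, y, freqs)
best_period = 1.0 / freqs[np.argmax(pw)]
```

    Note that this naive version handles non-uniform sampling but not heteroscedastic errors; weighting each residual by its inverse variance recovers the standard floating-mean Lomb-Scargle formulation.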

  15. Comparative Analysis on Time Series with Included Structural Break

    NASA Astrophysics Data System (ADS)

    Andreeski, Cvetko J.; Vasant, Pandian

    2009-08-01

    Time series analysis (ARIMA models) is a good approach for the identification of time series. However, if a time series contains a structural break, we cannot describe it with a single model. Furthermore, if there are not enough data between two structural breaks, it is impossible to create valid models for identifying the time series. This paper explores the possibility of identifying the inflation process dynamics from a system-theoretic perspective, by means of both Box-Jenkins ARIMA methodology and artificial neural networks.
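
    Why a single model fails across a structural break can be illustrated with a minimal AR(1) sketch (a simpler stand-in for the full ARIMA machinery, not the paper's method): fitting each regime separately recovers the two distinct dynamics, while a single fit over the whole series blurs them. The break location and coefficients below are assumed for the demo.

```python
import numpy as np

def fit_ar1(x):
    """Least-squares AR(1) fit: x[t] = c + phi * x[t-1] + noise."""
    X = np.column_stack([np.ones(len(x) - 1), x[:-1]])
    (c, phi), *_ = np.linalg.lstsq(X, x[1:], rcond=None)
    return c, phi

# Series with a structural break: the AR coefficient changes at t = 500
rng = np.random.default_rng(2)
x = np.empty(1000)
x[0] = 0.0
for t in range(1, 1000):
    phi = 0.9 if t < 500 else 0.2
    x[t] = phi * x[t - 1] + rng.standard_normal()

_, phi_a = fit_ar1(x[:500])    # recovers ~0.9
_, phi_b = fit_ar1(x[500:])    # recovers ~0.2
_, phi_all = fit_ar1(x)        # single model blends the two regimes
```

    With few observations between breaks, the per-regime fits become unreliable, which is the data-scarcity problem the paper addresses by bringing in neural networks alongside Box-Jenkins identification.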

  16. Timing calibration and spectral cleaning of LOFAR time series data

    NASA Astrophysics Data System (ADS)

    Corstanje, A.; Buitink, S.; Enriquez, J. E.; Falcke, H.; Hörandel, J. R.; Krause, M.; Nelles, A.; Rachen, J. P.; Schellart, P.; Scholten, O.; ter Veen, S.; Thoudam, S.; Trinh, T. N. G.

    2016-05-01

    We describe a method for spectral cleaning and timing calibration of short time series data of the voltage in individual radio interferometer receivers. It makes use of phase differences in fast Fourier transform (FFT) spectra across antenna pairs. For strong, localized terrestrial sources these are stable over time, while being approximately uniform-random for a sum over many sources or for noise. Using only milliseconds-long datasets, the method finds the strongest interfering transmitters, a first-order solution for relative timing calibrations, and faulty data channels. No knowledge of gain response or quiescent noise levels of the receivers is required. With relatively small data volumes, this approach is suitable for use in an online system monitoring setup for interferometric arrays. We have applied the method to our cosmic-ray data collection, a collection of measurements of short pulses from extensive air showers, recorded by the LOFAR radio telescope. Per air shower, we have collected 2 ms of raw time series data for each receiver. The spectral cleaning has a calculated optimal sensitivity corresponding to a power signal-to-noise ratio of 0.08 (or -11 dB) in a spectral window of 25 kHz, for 2 ms of data in 48 antennas. This is well sufficient for our application. Timing calibration across individual antenna pairs has been performed at 0.4 ns precision; for calibration of signal clocks across stations of 48 antennas the precision is 0.1 ns. Monitoring differences in timing calibration per antenna pair over the course of the period 2011 to 2015 shows a precision of 0.08 ns, which is useful for monitoring and correcting drifts in signal path synchronizations. A cross-check method for timing calibration is presented, using a pulse transmitter carried by a drone flying over the array. Timing precision is similar, 0.3 ns, but is limited by transmitter position measurements, while requiring dedicated flights.
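
    The core idea, that a localized transmitter gives a stable FFT phase difference across an antenna pair while noise gives uniform-random phases, can be sketched as follows. The block length, tone frequency, and delay below are illustrative assumptions, not LOFAR parameters; the stability measure is the magnitude of the block-averaged unit phasor per frequency bin.

```python
import numpy as np

def phase_stability(x1, x2, nblocks, blocklen):
    """Per-frequency-bin stability of the FFT phase difference between two
    receivers: ~1 for a stable narrow-band source, ~0 for noise."""
    acc = np.zeros(blocklen // 2 + 1, dtype=complex)
    for b in range(nblocks):
        s = slice(b * blocklen, (b + 1) * blocklen)
        f1, f2 = np.fft.rfft(x1[s]), np.fft.rfft(x2[s])
        dphi = np.angle(f1) - np.angle(f2)
        acc += np.exp(1j * dphi)
    return np.abs(acc) / nblocks

rng = np.random.default_rng(3)
n, blocklen, nblocks = 64 * 256, 256, 64
t = np.arange(n)
# Narrow-band "transmitter" at bin 64, seen with a fixed delay plus noise
x1 = np.sin(2 * np.pi * 0.25 * t) + rng.standard_normal(n)
x2 = np.sin(2 * np.pi * 0.25 * t - 1.0) + rng.standard_normal(n)
stab = phase_stability(x1, x2, nblocks, blocklen)
rfi_bin = int(np.argmax(stab))
```

    Flagging bins whose stability exceeds a threshold removes the interfering transmitters; the phase difference in those bins simultaneously yields the relative timing calibration of the pair, as the paper describes.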

  17. Discrepancy between estimated and actual time elapsed after death of a severed head.

    PubMed

    Kojima, T; Miyazaki, T; Yashiki, M; Sakai, K; Yamasaki, Y

    1992-09-01

    A severed head which had been wrapped in seven plastic bags and set in concrete in an airtight insulated plastic box was found approximately 22 months after the occurrence of death. Ammonium magnesium phosphate had formed and on the basis of this and other observed postmortem changes, time elapsed after death was estimated to be from 2 weeks to 6 months. The absence of oxygen is thought to have contributed significantly to the great discrepancy between estimated and actual time elapsed after death.

  18. Discovering significant evolution patterns from satellite image time series.

    PubMed

    Petitjean, François; Masseglia, Florent; Gançarski, Pierre; Forestier, Germain

    2011-12-01

    Satellite Image Time Series (SITS) provide us with precious information on land cover evolution. By studying these series of images we can both understand the changes of specific areas and discover global phenomena that spread over larger areas. Changes that occur throughout the sensing time can spread over very long periods and may have different start and end times depending on the location, which complicates the mining and the analysis of series of images. This work focuses on frequent sequential pattern mining (FSPM) methods, since this family of methods fits the above-mentioned issues. This family of methods consists of finding the most frequent evolution behaviors, and is actually able to extract long-term changes as well as short-term ones, whenever the change may start and end. However, applying FSPM methods to SITS implies confronting two main challenges, related to the characteristics of SITS and the domain's constraints. First, satellite images associate multiple measures with a single pixel (the radiometric levels of different wavelengths corresponding to infra-red, red, etc.), which makes the search space multi-dimensional and thus requires specific mining algorithms. Furthermore, the non-evolving regions, which are the vast majority and overwhelm the evolving ones, challenge the discovery of these patterns. We propose a SITS mining framework that enables discovery of these patterns despite these constraints and characteristics. Our proposal is inspired by FSPM and provides a relevant visualization principle. Experiments carried out on 35 images sensed over 20 years show the proposed approach makes it possible to extract relevant evolution behaviors.

  19. Testing two temporal upscaling schemes for the estimation of the time variability of the actual evapotranspiration

    NASA Astrophysics Data System (ADS)

    Maltese, A.; Capodici, F.; Ciraolo, G.; La Loggia, G.

    2015-10-01

    The temporal availability of actual evapotranspiration of grapes is an emerging issue, since vineyard farms are increasingly converted from rainfed to irrigated agricultural systems. The manuscript aims to verify the accuracy of the actual evapotranspiration retrieval coupling a single source energy balance approach and two different temporal upscaling schemes. The first scheme tests the temporal upscaling of the main input variables, namely the NDVI, albedo and LST; the second scheme tests the temporal upscaling of the energy balance output, the actual evapotranspiration. The temporal upscaling schemes were implemented on: i) airborne remote sensing data acquired monthly during a whole irrigation season over a Sicilian vineyard; ii) low resolution MODIS products released daily or weekly; iii) meteorological data acquired by standard gauge stations. Daily MODIS LST products (MOD11A1) were disaggregated using the DisTrad model, 8-day black and white sky albedo products (MCD43A) allowed modeling the total albedo, and 8-day NDVI products (MOD13Q1) were modeled using the Fisher approach. Results were validated both in time and space. The temporal validation was carried out using the actual evapotranspiration measured in situ by a flux tower through the eddy covariance technique. The spatial validation involved airborne images acquired at different times from June to September 2008. The results were used to test whether the upscaling of the energy balance input or output data performed better.

  20. Transmission of linear regression patterns between time series: From relationship in time series to complex networks

    NASA Astrophysics Data System (ADS)

    Gao, Xiangyun; An, Haizhong; Fang, Wei; Huang, Xuan; Li, Huajiao; Zhong, Weiqiong; Ding, Yinghui

    2014-07-01

    The linear regression parameters between two time series can be different under different lengths of observation period. If we study the whole period by the sliding window of a short period, the change of the linear regression parameters is a process of dynamic transmission over time. We tackle fundamental research that presents a simple and efficient computational scheme: a linear regression patterns transmission algorithm, which transforms linear regression patterns into directed and weighted networks. The linear regression patterns (nodes) are defined by the combination of intervals of the linear regression parameters and the results of the significance testing under different sizes of the sliding window. The transmissions between adjacent patterns are defined as edges, and the weights of the edges are the frequency of the transmissions. The major patterns, the distance, and the medium in the process of the transmission can be captured. The statistical results of weighted out-degree and betweenness centrality are mapped on timelines, which shows the features of the distribution of the results. Many measurements in different areas that involve two related time series variables could take advantage of this algorithm to characterize the dynamic relationships between the time series from a new perspective.
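
    The pattern-transmission idea can be sketched compactly: slide a window over two series, fit a regression in each window, discretize the parameters into pattern labels, and count transitions between consecutive labels as weighted directed edges. The two-label (slope sign) discretization and the synthetic regime change below are simplifying assumptions; the paper combines parameter intervals with significance tests to define its patterns.

```python
import numpy as np
from collections import Counter

def pattern_network(x, y, window=30, step=5):
    """Sliding-window regression of y on x; the slope sign in each window
    is the pattern label, and transitions between consecutive labels are
    counted as weighted directed edges."""
    labels = []
    for start in range(0, len(x) - window + 1, step):
        xs, ys = x[start:start + window], y[start:start + window]
        slope = np.polyfit(xs, ys, 1)[0]
        labels.append('pos' if slope > 0 else 'neg')
    edges = Counter(zip(labels[:-1], labels[1:]))
    return labels, edges

# Synthetic pair of series whose relationship flips sign halfway through
rng = np.random.default_rng(4)
x = rng.standard_normal(400)
y = np.where(np.arange(400) < 200, 2.0 * x, -2.0 * x) \
    + 0.1 * rng.standard_normal(400)
labels, edges = pattern_network(x, y)
```

    The resulting `edges` Counter is a directed weighted network in adjacency-list form; node-level statistics such as weighted out-degree follow directly from it.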

  1. Transmission of linear regression patterns between time series: from relationship in time series to complex networks.

    PubMed

    Gao, Xiangyun; An, Haizhong; Fang, Wei; Huang, Xuan; Li, Huajiao; Zhong, Weiqiong; Ding, Yinghui

    2014-07-01

    The linear regression parameters between two time series can be different under different lengths of observation period. If we study the whole period by the sliding window of a short period, the change of the linear regression parameters is a process of dynamic transmission over time. We tackle fundamental research that presents a simple and efficient computational scheme: a linear regression patterns transmission algorithm, which transforms linear regression patterns into directed and weighted networks. The linear regression patterns (nodes) are defined by the combination of intervals of the linear regression parameters and the results of the significance testing under different sizes of the sliding window. The transmissions between adjacent patterns are defined as edges, and the weights of the edges are the frequency of the transmissions. The major patterns, the distance, and the medium in the process of the transmission can be captured. The statistical results of weighted out-degree and betweenness centrality are mapped on timelines, which shows the features of the distribution of the results. Many measurements in different areas that involve two related time series variables could take advantage of this algorithm to characterize the dynamic relationships between the time series from a new perspective.

  2. Time Series Analysis of Mother-Infant Interaction.

    ERIC Educational Resources Information Center

    Rosenfeld, Howard M.

    A method of studying attachment behavior in infants was devised using time series and time sequence analyses. Time series analysis refers to relationships between events coded over adjacent fixed-time units. Time sequence analysis refers to the distribution of exact times at which particular events happen. Using these techniques, multivariate…

  3. Carbon time series in the Norwegian sea

    NASA Astrophysics Data System (ADS)

    Gislefoss, Jorunn S.; Nydal, Reidar; Slagstad, Dag; Sonninen, Eloni; Holmén, Kim

    1998-02-01

    Depth profiles of carbon parameters were obtained monthly from 1991 to 1994 as the first time series from the weathership station M located in the Norwegian Sea at 66°N 2°E. CO 2 was extracted from acidified seawater by a flushing procedure, with nitrogen as the carrier gas. The pure CO 2 gas was measured using a manometric technique, and the gas was further used for 13C and 14C measurements. The precision of the dissolved inorganic carbon (DIC) was better than ±6‰. Satisfactory agreement was obtained with standard seawater from Scripps Institution of Oceanography. The partial pressure of CO 2 (pCO 2) was measured in the atmosphere and surface water, beginning in October 1991. The most visible seasonal variation in DIC, 13C and pCO 2 was due to the plankton bloom in the upper 50-100 m. Typical values for surface water in the winter were: 2.140±0.012 mmol kg -1 for DIC, 1.00±0.04‰ for δ 13C and 357±15 μatm for pCO 2, and the corresponding values in the summer were as low as 2.04 mmol kg -1, greater than 2.1‰, and as low as 270-300 μatm. The values for deep water are more constant during the year, with DIC values of about 2.17±0.01 mmol kg -1, and δ 13C values between 0.97 and 1.14‰. A simple one-dimensional biological model was applied in order to investigate possible short-term variability in DIC caused by the phytoplankton growth and depth variations of the wind-mixed layer. The simulated seasonal pattern was in reasonable agreement with the observed data, but there were significant temporal variations with shorter time interval than the monthly measurements. As a supplement to the measurements at station M, some representative profiles of DIC, δ 13C, Δ 14C, salinity and temperature from other locations in the Nordic Seas and the North Atlantic Ocean are also presented. The results are also compared with some data obtained ( Δ 14C) by the TTO expedition in 1981 and the GEOSECS expedition in 1972. The carbon profiles reflect the stable deep

  4. Measuring persistence in time series of temperature anomalies

    NASA Astrophysics Data System (ADS)

    Triacca, Umberto; Pasini, Antonello; Attanasio, Alessandro

    2014-11-01

    Studies on persistence are important for the clarification of statistical properties of the analyzed time series and for understanding the dynamics of the systems which create these series. In climatology, the analysis of the autocorrelation function has been the main tool to investigate the persistence of a time series. In this paper, we propose to use a more sophisticated econometric instrument. Using this tool, we obtain an estimate of the persistence in global land and ocean and hemispheric temperature time series.

  5. Noise reduction by recycling dynamically coupled time series.

    PubMed

    Mera, M Eugenia; Morán, Manuel

    2011-12-01

    We say that several scalar time series are dynamically coupled if they record the values of measurements of the state variables of the same smooth dynamical system. We show that much of the information lost due to measurement noise in a target time series can be recovered with a noise reduction algorithm by crossing the time series with another time series with which it is dynamically coupled. The method is particularly useful for reduction of measurement noise in short length time series with high uncertainties.

  6. The scaling of time series size towards detrended fluctuation analysis

    NASA Astrophysics Data System (ADS)

    Gao, Xiaolei; Ren, Liwei; Shang, Pengjian; Feng, Guochen

    2016-06-01

    In this paper, we introduce a modification of detrended fluctuation analysis (DFA), called the multivariate DFA (MNDFA) method, based on the scaling of time series size N. In the traditional DFA method, we examined the influence of the sequence segmentation interval s, which inspired us to propose the new MNDFA model to discuss the scaling of time series size in DFA. The effectiveness of the procedure is verified by numerical experiments with both artificial and stock returns series. Results show that the proposed MNDFA method captures more significant information about the series than the traditional DFA method. The scaling of time series size has an influence on the auto-correlation (AC) in time series. For certain series, we obtain an exponential relationship, and also calculate the slope through the fitting function. Our analysis and finite-size effect test demonstrate that an appropriate choice of the time series size can avoid unnecessary influences, and also make the testing results more accurate.
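
    For reference, the traditional DFA procedure that MNDFA modifies can be sketched as follows: integrate the demeaned series, split it into windows at each scale s, remove a linear trend per window, and fit the slope of log F(s) versus log s. The scale choices below are illustrative; white noise serves as a sanity check with known exponent 0.5.

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis: returns the RMS fluctuation F(s)
    of the integrated, per-window linearly detrended series at each scale."""
    y = np.cumsum(x - np.mean(x))
    F = []
    for s in scales:
        nwin = len(y) // s
        msq = []
        for w in range(nwin):
            seg = y[w * s:(w + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            msq.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(msq)))
    return np.asarray(F)

# White noise has DFA exponent alpha ~ 0.5 (slope of log F vs log s)
rng = np.random.default_rng(5)
x = rng.standard_normal(10000)
scales = np.array([16, 32, 64, 128, 256])
F = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
```

    An exponent near 0.5 indicates uncorrelated data, above 0.5 persistent long-range correlation; the paper's MNDFA varies the series size N on top of this scale dependence.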

  7. Actual preoperative fasting time in Brazilian hospitals: the BIGFAST multicenter study

    PubMed Central

    de Aguilar-Nascimento, José E; de Almeida Dias, Ana L; Dock-Nascimento, Diana B; Correia, Maria Isabel TD; Campos, Antonio CL; Portari-Filho, Pedro Eder; Oliveira, Sergio S

    2014-01-01

    Background Prolonged fasting increases organic response to trauma. This multicenter study investigated the gap between the prescribed and the actual preoperative fasting times in Brazilian hospitals and factors associated with this gap. Methods Patients (18–90 years old) who underwent elective operations between August 2011 and September 2012 were included in the study. The actual and prescribed times for fasting were collected and correlated with sex, age, surgical disease (malignancies or benign disease), operation type, American Society of Anesthesiologists score, type of hospital (public or private), and nutritional status. Results A total of 3,715 patients (58.1% females) with a median age of 49 (18–94) years from 16 Brazilian hospitals entered the study. The median (range) preoperative fasting time was 12 (2–216) hours, and fasting time was longer (P<0.001) in hospitals using a traditional fasting protocol (13 [6–216] hours) than in others that had adopted new guidelines (8 [2–48] hours). Almost 80% (n=2,962) of the patients were operated on after 8 or more hours of fasting and 46.2% (n=1,718) after more than 12 hours. Prolonged fasting was not associated with physical score, age, sex, type of surgery, or type of hospital. Patients operated on due to a benign disease had an extended duration of preoperative fasting. Conclusion Actual preoperative fasting time is significantly longer than prescribed fasting time in Brazilian hospitals. Most of these hospitals still adopt traditional rather than modern fasting guidelines. All patients are at risk of long periods of fasting, especially those in hospitals that follow traditional practices. PMID:24627636

  8. Network-based estimation of time-dependent noise in GPS position time series

    NASA Astrophysics Data System (ADS)

    Dmitrieva, Ksenia; Segall, Paul; DeMets, Charles

    2015-06-01

    Some estimates of GPS velocity uncertainties are very low, 0.1 mm/year with 10 years of data. Yet, residual velocities relative to rigid plate models in nominally stable plate interiors can be an order of magnitude larger. This discrepancy could be caused by underestimating low-frequency time-dependent noise in position time series, such as random walk. We show that traditional estimators, based on individual time series, are insensitive to low-amplitude random walk, yet such noise significantly increases GPS velocity uncertainties. Here, we develop a method for determining representative noise parameters in GPS position time series, by analyzing an entire network simultaneously, which we refer to as the network noise estimator (NNE). We analyze data from the aseismic central-eastern USA, assuming that residual motions relative to North America, corrected for glacial isostatic adjustment (GIA), represent noise. The position time series are decomposed into signal (plate rotation and GIA) and noise components. NNE simultaneously processes multiple stations with a Kalman filter and solves for average noise components for the network by maximum likelihood estimation. Synthetic tests show that NNE correctly estimates even low-level random walk, thus providing better estimates of velocity uncertainties than conventional, single station methods. To test NNE on actual data, we analyze a heterogeneous 15 station GPS network from the central-eastern USA, assuming the noise is a sum of random walk, flicker and white noise. For the horizontal time series, NNE finds higher average random walk than the standard individual station-based method, leading to velocity uncertainties a factor of 2 higher than traditional methods.

  9. Detecting inhomogeneities in pan evaporation time series

    NASA Astrophysics Data System (ADS)

    Kirono, D. G. C.

    2009-04-01

    There is a growing demand for evaporation data for studies of surface water and energy fluxes, especially for studies which address the impacts of global warming. To serve this purpose, homogeneous evaporation data are necessary. This paper describes the use of two tests for detecting and adjusting discontinuities in Class A pan evaporation time series for 28 stations across Australia, and illustrates the benefit of using corrected records in climate studies. The two tests are the bivariate test of Maronna and Yohai (1978), also known as the Potter method (WMO 2003), and the RHTest of Wang and Feng (2004). Overall, 58 per cent of the inhomogeneities detected by the bivariate test were also identified by the RHTest. The fact that the other 42 per cent of inhomogeneities were not consistently detected is due to different sensitivities of the two methods. Ninety-two per cent of the inhomogeneities detected by the bivariate test are consistent with documented changes that can be strongly associated with the discontinuity. Having identified inhomogeneities, the adjustments were only applied to records which contained inhomogeneities that could be verified as having a non-climatic origin. The benefit of using the original and adjusted pan evaporation records in a climate study was then investigated from two points of view: correlation analyses and trend analysis. As an illustration, the results show that the trend (1970-2004) in the all-stations average was -2.8±1.7 mm/year/year for the original data but only -0.7±1.6 mm/year/year for the adjusted data, demonstrating the importance of screening the data before their use in climate studies. References Maronna, R. and Yohai, V.J. 1978. A bivariate test for the detection of a systematic change in mean. J. Amer. Statis. Assoc., 73, 640-645. Wang, X.L. and Feng, Y. 2004. RHTest User manual. Available from http://cccma.seos.uvic.ca/ETCCDMI/RHTestUserManual.doc WMO. 2003. Guidelines on climate metadata and homogenization

  10. Fast and Flexible Multivariate Time Series Subsequence Search

    NASA Technical Reports Server (NTRS)

    Bhaduri, Kanishka; Oza, Nikunj C.; Zhu, Qiang; Srivastava, Ashok N.

    2010-01-01

    Multivariate Time-Series (MTS) are ubiquitous, and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical monitoring, and financial systems. Domain experts are often interested in searching for interesting multivariate patterns from these MTS databases which often contain several gigabytes of data. Surprisingly, research on MTS search is very limited. Most of the existing work only supports queries with the same length of data, or queries on a fixed set of variables. In this paper, we propose an efficient and flexible subsequence search framework for massive MTS databases, that, for the first time, enables querying on any subset of variables with arbitrary time delays between them. We propose two algorithms to solve this problem (1) a List Based Search (LBS) algorithm which uses sorted lists for indexing, and (2) a R*-tree Based Search (RBS) which uses Minimum Bounding Rectangles (MBR) to organize the subsequences. Both algorithms guarantee that all matching patterns within the specified thresholds will be returned (no false dismissals). The very few false alarms can be removed by a post-processing step. Since our framework is also capable of Univariate Time-Series (UTS) subsequence search, we first demonstrate the efficiency of our algorithms on several UTS datasets previously used in the literature. We follow this up with experiments using two large MTS databases from the aviation domain, each containing several millions of observations. Both these tests show that our algorithms have very high prune rates (>99%) thus needing actual disk access for only less than 1% of the observations. To the best of our knowledge, MTS subsequence search has never been attempted on datasets of the size we have used in this paper.

  11. Topological diagnostics of the cyclic component of the time series associated with helium

    NASA Astrophysics Data System (ADS)

    Knyazeva, I. S.; Nagovitsyn, Yu. A.; Urt'ev, F. A.; Makarenko, N. G.

    2016-12-01

    Detection of the deterministic component from noisy time series is a common procedure in the solar-terrestrial coupling problem when climate is modeled, solar activity is analyzed, or a signal associated with helium is extracted. Such series are mostly generated by the superposition of different processes for which the concept of a noise component cannot be determined formally. A method based on the combination of time-series topological embedding in Euclidean space and the identification of a persistent cycle by homology theory methods is proposed. The application of the method is demonstrated on actual data.

  12. From time series to complex networks: the visibility graph.

    PubMed

    Lacasa, Lucas; Luque, Bartolo; Ballesteros, Fernando; Luque, Jordi; Nuño, Juan Carlos

    2008-04-01

    In this work we present a simple and fast computational method, the visibility algorithm, that converts a time series into a graph. The constructed graph inherits several properties of the series in its structure. Thereby, periodic series convert into regular graphs, and random series convert into random graphs. Moreover, fractal series convert into scale-free networks, reinforcing the fact that power-law degree distributions are related to fractality, a topic widely discussed recently. Some remarkable examples and analytical tools are outlined to test the method's reliability. Many different measures, recently developed in complex network theory, could, by means of this new approach, characterize time series from a new point of view.
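
    The natural visibility criterion is simple enough to state in code: two data points are connected if every intermediate point lies strictly below the straight line joining them. The O(n^2) brute-force sketch below (with a small made-up series) is illustrative; it is not the authors' optimized implementation.

```python
import numpy as np
from itertools import combinations

def visibility_graph(series):
    """Natural visibility graph: nodes are time points; (a, b) is an edge
    if every intermediate point c lies strictly below the line joining
    (a, y_a) and (b, y_b)."""
    n = len(series)
    edges = set()
    for a, b in combinations(range(n), 2):
        ya, yb = series[a], series[b]
        visible = all(
            series[c] < yb + (ya - yb) * (b - c) / (b - a)
            for c in range(a + 1, b)
        )
        if visible:
            edges.add((a, b))
    return edges

# Consecutive points are always mutually visible, so the chain is preserved
series = [1.0, 3.0, 2.0, 4.0, 1.5, 3.5]
edges = visibility_graph(series)
```

    The degree distribution of `edges` is then analyzed with standard complex-network measures; for fractal series it follows the power law discussed in the abstract.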

  13. Scale-dependent intrinsic entropies of complex time series.

    PubMed

    Yeh, Jia-Rong; Peng, Chung-Kang; Huang, Norden E

    2016-04-13

    Multi-scale entropy (MSE) was developed as a measure of complexity for complex time series, and it has been applied widely in recent years. The MSE algorithm is based on the assumption that biological systems possess the ability to adapt and function in an ever-changing environment, and these systems need to operate across multiple temporal and spatial scales, such that their complexity is also multi-scale and hierarchical. Here, we present a systematic approach to apply the empirical mode decomposition algorithm, which can detrend time series on various time scales, prior to analysing a signal's complexity by measuring the irregularity of its dynamics on multiple time scales. Simulated time series of fractal Gaussian noise and human heartbeat time series were used to study the performance of this new approach. We show that our method can successfully quantify the fractal properties of the simulated time series and can accurately distinguish modulations in human heartbeat time series in health and disease.
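
    The MSE baseline that this paper builds on has two ingredients: coarse-graining by non-overlapping window averages, and sample entropy on each coarse-grained series. A minimal sketch follows; the tolerance factor 0.15 and the white-noise check are assumptions for the demo (the paper's contribution, EMD-based detrending before the entropy step, is not shown here).

```python
import numpy as np

def sample_entropy(x, m, r):
    """Sample entropy: -log of the ratio of length-(m+1) template matches
    to length-m template matches, with Chebyshev tolerance r."""
    x = np.asarray(x, dtype=float)

    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        count = 0
        for i in range(len(templates) - 1):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(d <= r))
        return count

    return -np.log(count_matches(m + 1) / count_matches(m))

def coarse_grain(x, scale):
    """Non-overlapping window averages: the coarse-graining step of MSE."""
    n = len(x) // scale
    return np.asarray(x[:n * scale], dtype=float).reshape(n, scale).mean(axis=1)

# For white noise, entropy falls with scale when r is fixed from scale 1,
# as in the standard MSE recipe
rng = np.random.default_rng(6)
white = rng.standard_normal(3000)
r = 0.15 * np.std(white)
se_scale1 = sample_entropy(white, 2, r)
se_scale5 = sample_entropy(coarse_grain(white, 5), 2, r)
```

    The decrease with scale for uncorrelated noise, versus near-constant entropy for 1/f-like signals, is the signature MSE uses to distinguish complexity from randomness.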

  14. Efficient Algorithms for Segmentation of Item-Set Time Series

    NASA Astrophysics Data System (ADS)

    Chundi, Parvathi; Rosenkrantz, Daniel J.

    We propose a special type of time series, which we call an item-set time series, to facilitate the temporal analysis of software version histories, email logs, stock market data, etc. In an item-set time series, each observed data value is a set of discrete items. We formalize the concept of an item-set time series and present efficient algorithms for segmenting a given item-set time series. Segmentation of a time series partitions the time series into a sequence of segments where each segment is constructed by combining consecutive time points of the time series. Each segment is associated with an item set that is computed from the item sets of the time points in that segment, using a function which we call a measure function. We then define a concept called the segment difference, which measures the difference between the item set of a segment and the item sets of the time points in that segment. The segment difference values are required to construct an optimal segmentation of the time series. We describe novel and efficient algorithms to compute segment difference values for each of the measure functions described in the paper. We outline a dynamic programming based scheme to construct an optimal segmentation of the given item-set time series. We use the item-set time series segmentation techniques to analyze the temporal content of three different data sets—Enron email, stock market data, and a synthetic data set. The experimental results show that an optimal segmentation of item-set time series data captures much more temporal content than a segmentation constructed based on the number of time points in each segment, without examining the item set data at the time points, and can be used to analyze different types of temporal data.

  15. Geostatistical analysis as applied to two environmental radiometric time series.

    PubMed

    Dowdall, Mark; Lind, Bjørn; Gerland, Sebastian; Rudjord, Anne Liv

    2003-03-01

    This article details the results of an investigation into the application of geostatistical data analysis to two environmental radiometric time series. The data series employed consist of 99Tc values for seaweed (Fucus vesiculosus) and seawater samples taken as part of a marine monitoring program conducted on the coast of northern Norway by the Norwegian Radiation Protection Authority. Geostatistical methods were selected in order to provide information on values of the variables at unsampled times and to investigate the temporal correlation exhibited by the data sets. This information is of use in the optimisation of future sampling schemes and for providing information on the temporal behaviour of the variables in question that may not be obtained during a cursory analysis. The results indicate a high degree of temporal correlation within the data sets, the correlation for the seawater and seaweed data being modelled with an exponential and linear function, respectively. The semi-variogram for the seawater data indicates a temporal range of correlation of approximately 395 days with no apparent random component to the overall variance structure and was described best by an exponential function. The temporal structure of the seaweed data was best modelled by a linear function with a small nugget component. Evidence of drift was present in both semi-variograms. Interpolation of the data sets using the fitted models and a simple kriging procedure were compared, using a cross-validation procedure, with simple linear interpolation. Results of this exercise indicate that, for the seawater data, the kriging procedure outperformed the simple interpolation with respect to error distribution and correlation of estimates with actual values. Using the unbounded linear model with the seaweed data produced estimates that were only marginally better than those produced by the simple interpolation.
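
    The first step of such a geostatistical analysis, the empirical semivariogram, can be sketched directly: for each lag h, average half the squared differences over all sample pairs separated by approximately h. The random-walk test series and lag bins below are illustrative assumptions, not the 99Tc data.

```python
import numpy as np

def empirical_semivariogram(t, z, lags, tol):
    """Empirical semivariogram: for each lag h, average 0.5*(z_i - z_j)^2
    over all pairs whose time separation falls within h +/- tol."""
    dt = np.abs(t[:, None] - t[None, :])
    dz2 = 0.5 * (z[:, None] - z[None, :]) ** 2
    gamma = []
    for h in lags:
        mask = (np.abs(dt - h) <= tol) & (dt > 0)
        gamma.append(dz2[mask].mean())
    return np.asarray(gamma)

# Strongly autocorrelated test series: for a random walk, gamma grows with lag
rng = np.random.default_rng(7)
t = np.arange(500.0)
z = np.cumsum(rng.standard_normal(500)) * 0.1
lags = np.array([1.0, 5.0, 20.0, 50.0])
gamma = empirical_semivariogram(t, z, lags, tol=0.5)
```

    A model (exponential, linear, etc.) fitted to this empirical curve then supplies the weights for kriging interpolation at unsampled times, which is the comparison the article carries out against simple linear interpolation.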

  16. gatspy: General tools for Astronomical Time Series in Python

    NASA Astrophysics Data System (ADS)

    VanderPlas, Jake

    2016-10-01

Gatspy contains efficient, well-documented implementations of several common routines for astronomical time series analysis, including the Lomb-Scargle periodogram, the Supersmoother method, and others.
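Gatspy's own API is not reproduced here; as a hedged illustration of the first routine listed, the classic Lomb-Scargle periodogram for irregularly sampled data can be sketched in plain Python (function and variable names are illustrative):

```python
import math
import random

def lomb_scargle(t, y, freqs):
    """Classic Lomb-Scargle normalized periodogram for irregular sampling."""
    ybar = sum(y) / len(y)
    yc = [v - ybar for v in y]  # center the data
    power = []
    for f in freqs:
        w = 2.0 * math.pi * f
        # The time offset tau makes the periodogram invariant to time shifts.
        s2 = sum(math.sin(2 * w * ti) for ti in t)
        c2 = sum(math.cos(2 * w * ti) for ti in t)
        tau = math.atan2(s2, c2) / (2 * w)
        cterm = [math.cos(w * (ti - tau)) for ti in t]
        sterm = [math.sin(w * (ti - tau)) for ti in t]
        cy = sum(c * v for c, v in zip(cterm, yc))
        sy = sum(s * v for s, v in zip(sterm, yc))
        cc = sum(c * c for c in cterm)
        ss = sum(s * s for s in sterm)
        power.append(0.5 * (cy * cy / cc + sy * sy / ss))
    return power

# Irregular sampling of a 0.1 Hz sine: the peak should sit near f = 0.1.
random.seed(0)
t = sorted(random.uniform(0, 100) for _ in range(200))
y = [math.sin(2 * math.pi * 0.1 * ti) for ti in t]
freqs = [0.01 + 0.005 * k for k in range(100)]
p = lomb_scargle(t, y, freqs)
best = freqs[p.index(max(p))]
```

Gatspy (and astropy) provide much faster implementations of the same quantity; this O(N * Nfreq) form is only for intuition.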

  17. Apparatus for statistical time-series analysis of electrical signals

    NASA Technical Reports Server (NTRS)

    Stewart, C. H. (Inventor)

    1973-01-01

    An apparatus for performing statistical time-series analysis of complex electrical signal waveforms, permitting prompt and accurate determination of statistical characteristics of the signal is presented.

  18. Trend time-series modeling and forecasting with neural networks.

    PubMed

    Qi, Min; Zhang, G Peter

    2008-05-01

    Despite its great importance, there has been no general consensus on how to model the trends in time-series data. Compared to traditional approaches, neural networks (NNs) have shown some promise in time-series forecasting. This paper investigates how to best model trend time series using NNs. Four different strategies (raw data, raw data with time index, detrending, and differencing) are used to model various trend patterns (linear, nonlinear, deterministic, stochastic, and breaking trend). We find that with NNs differencing often gives meritorious results regardless of the underlying data generating processes (DGPs). This finding is also confirmed by the real gross national product (GNP) series.
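Of the four strategies compared, differencing is the simplest to illustrate; a minimal sketch (toy data, not the paper's data-generating processes):

```python
def difference(series, lag=1):
    """First-order differencing: removes a linear trend, leaving a constant
    series that is easier for a forecaster (NN or otherwise) to model."""
    return [series[t] - series[t - lag] for t in range(lag, len(series))]

# A deterministic linear trend y_t = 3 + 2t differences to a constant 2,
# which is why differencing works regardless of the underlying DGP.
trend = [3 + 2 * t for t in range(10)]
d = difference(trend)
```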

  19. Foot gait time series estimation based on support vector machine.

    PubMed

    Pant, Jeevan K; Krishnan, Sridhar

    2014-01-01

A new algorithm for the estimation of stride interval time series from foot gait signals is proposed. The algorithm detects the beginning of heel strikes in the signal using a support vector machine, with morphological operations used to enhance the accuracy of detection. Taking backward differences of the detected heel-strike onsets yields the stride interval time series. Simulation results are presented which show that the proposed algorithm yields fairly accurate estimates: the estimation error for the mean and standard deviation of the time series is of the order of 10^-4.
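The paper's detector is an SVM; as a simplified stand-in, the sketch below uses a plain threshold crossing to mark heel-strike onsets and then takes backward differences, which is the step the abstract describes (signal and threshold are toy values):

```python
def detect_onsets(signal, threshold):
    """Indices where the signal first rises above threshold.
    A crude stand-in for the paper's SVM-based heel-strike detector."""
    onsets = []
    for i in range(1, len(signal)):
        if signal[i - 1] <= threshold < signal[i]:
            onsets.append(i)
    return onsets

def stride_intervals(onsets):
    """Backward differences of onset times give the stride-interval series."""
    return [b - a for a, b in zip(onsets, onsets[1:])]

# Toy foot-pressure trace with a pulse every 10 samples.
sig = [1.0 if t % 10 in (0, 1, 2) else 0.0 for t in range(50)]
strides = stride_intervals(detect_onsets(sig, 0.5))
```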

  20. Using neural networks for dynamic light scattering time series processing

    NASA Astrophysics Data System (ADS)

    Chicea, Dan

    2017-04-01

A basic experiment to record dynamic light scattering (DLS) time series was assembled from basic components. DLS time series processing using a Lorentzian function fit was taken as the reference. A neural network was designed and trained using simulated frequency spectra for spherical particles (assumed scattering centers) in the range 0–350 nm; the network design and training procedure are described in detail. The network's output accuracy was tested on both simulated and experimental time series. The match with the reference DLS results was good, serving as a proof of concept for using neural networks in fast DLS time series processing.

  1. Interpretable Early Classification of Multivariate Time Series

    ERIC Educational Resources Information Center

    Ghalwash, Mohamed F.

    2013-01-01

    Recent advances in technology have led to an explosion in data collection over time rather than in a single snapshot. For example, microarray technology allows us to measure gene expression levels in different conditions over time. Such temporal data grants the opportunity for data miners to develop algorithms to address domain-related problems,…

  2. Simulation of Ground Winds Time Series

    NASA Technical Reports Server (NTRS)

    Adelfang, S. I.

    2008-01-01

A simulation process has been developed for generation of the longitudinal and lateral components of ground wind atmospheric turbulence as a function of mean wind speed, elevation, temporal frequency range, and distance between locations. The distance between locations influences the spectral coherence between the simulated series at adjacent locations. Short distances reduce correlation only at high frequencies; as distances increase, correlation is reduced over a wider range of frequencies. The choice of values for the constants d1 and d3 in the PSD model is the subject of work in progress. An improved knowledge of the values for z0 as a function of wind direction at the ARES-1 launch pads is necessary for definition of d1. Results of other studies at other locations may be helpful, as summarized in Fichtl's recent correspondence. Ideally, further research is needed based on measurements of ground wind turbulence with high-resolution anemometers at a number of altitudes at a new KSC tower located closer to the ARES-1 launch pad. The proposed research would be based on turbulence measurements that may be influenced by surface terrain roughness significantly different from the roughness prior to 1970 in Fichtl's measurements. Significant improvements in instrumentation, data storage and processing will greatly enhance the capability to model ground wind profiles and ground wind turbulence.

  3. Testing time series irreversibility using complex network methods

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan F.; Donner, Reik V.; Kurths, Jürgen

    2013-04-01

    The absence of time-reversal symmetry is a fundamental property of many nonlinear time series. Here, we propose a new set of statistical tests for time series irreversibility based on standard and horizontal visibility graphs. Specifically, we statistically compare the distributions of time-directed variants of the common complex network measures degree and local clustering coefficient. Our approach does not involve surrogate data and is applicable to relatively short time series. We demonstrate its performance for paradigmatic model systems with known time-reversal properties as well as for picking up signatures of nonlinearity in neuro-physiological data.

  4. Common trends in northeast Atlantic squid time series

    NASA Astrophysics Data System (ADS)

    Zuur, A. F.; Pierce, G. J.

    2004-06-01

    In this paper, dynamic factor analysis is used to estimate common trends in time series of squid catch per unit effort in Scottish (UK) waters. Results indicated that time series of most months were related to sea surface temperature measured at Millport (UK) and a few series were related to the NAO index. The DFA methodology identified three common trends in the squid time series not revealed by traditional approaches, which suggest a possible shift in relative abundance of summer- and winter-spawning populations.

  5. Distance measure with improved lower bound for multivariate time series

    NASA Astrophysics Data System (ADS)

    Li, Hailin

    2017-02-01

The lower bound function is an important technique for fast search and indexing of time series data. A multivariate time series is high-dimensional in two respects: the time-based dimension and the variable-based dimension. Because of the influence of the variable-based dimension, a novel method is proposed for computing the lower bound distance for multivariate time series. Like traditional methods, the proposed method first reduces the dimensionality of the time series rather than applying the lower bound function to the multivariate series directly: following the principle of piecewise aggregate approximation, the multivariate time series is reduced to a univariate series, termed the center sequence. In addition, an extended lower bound function is designed that achieves good tightness and fast distance computation between any two center sequences. The experimental results demonstrate that the proposed lower bound function has better tightness and improves the performance of similarity search in multivariate time series datasets.
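The reduction the abstract describes follows the principle of piecewise aggregate approximation (PAA). A hedged sketch of both steps (the paper's exact center-sequence construction may differ; here the cross-variable mean stands in for it):

```python
def paa(series, n_segments):
    """Piecewise aggregate approximation: mean of each equal-length segment."""
    n = len(series)
    out = []
    for s in range(n_segments):
        lo = s * n // n_segments
        hi = (s + 1) * n // n_segments
        out.append(sum(series[lo:hi]) / (hi - lo))
    return out

def center_sequence(mts):
    """Collapse a multivariate series (list of equal-length variables) to a
    univariate series of per-time-point means."""
    return [sum(col) / len(col) for col in zip(*mts)]

mts = [[1, 2, 3, 4, 5, 6, 7, 8],
       [3, 4, 5, 6, 7, 8, 9, 10]]
reduced = paa(center_sequence(mts), 4)
```

A lower bound function would then compare two such reduced sequences cheaply before any exact distance computation.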

  6. Multiscale structure of time series revealed by the monotony spectrum

    NASA Astrophysics Data System (ADS)

    Vamoş, Cǎlin

    2017-03-01

    Observation of complex systems produces time series with specific dynamics at different time scales. The majority of the existing numerical methods for multiscale analysis first decompose the time series into several simpler components and the multiscale structure is given by the properties of their components. We present a numerical method which describes the multiscale structure of arbitrary time series without decomposing them. It is based on the monotony spectrum defined as the variation of the mean amplitude of the monotonic segments with respect to the mean local time scale during successive averagings of the time series, the local time scales being the durations of the monotonic segments. The maxima of the monotony spectrum indicate the time scales which dominate the variations of the time series. We show that the monotony spectrum can correctly analyze a diversity of artificial time series and can discriminate the existence of deterministic variations at large time scales from the random fluctuations. As an application we analyze the multifractal structure of some hydrological time series.

  7. Short time-series microarray analysis: Methods and challenges

    PubMed Central

    Wang, Xuewei; Wu, Ming; Li, Zheng; Chan, Christina

    2008-01-01

    The detection and analysis of steady-state gene expression has become routine. Time-series microarrays are of growing interest to systems biologists for deciphering the dynamic nature and complex regulation of biosystems. Most temporal microarray data only contain a limited number of time points, giving rise to short-time-series data, which imposes challenges for traditional methods of extracting meaningful information. To obtain useful information from the wealth of short-time series data requires addressing the problems that arise due to limited sampling. Current efforts have shown promise in improving the analysis of short time-series microarray data, although challenges remain. This commentary addresses recent advances in methods for short-time series analysis including simplification-based approaches and the integration of multi-source information. Nevertheless, further studies and development of computational methods are needed to provide practical solutions to fully exploit the potential of this data. PMID:18605994

  8. Horizontal visibility graphs: exact results for random time series.

    PubMed

    Luque, B; Lacasa, L; Ballesteros, F; Luque, J

    2009-10-01

The visibility algorithm has recently been introduced as a mapping between time series and complex networks. This procedure allows us to apply methods of complex network theory to the characterization of time series. In this work we present the horizontal visibility algorithm, a geometrically simpler and analytically solvable version of our former algorithm, focusing on the mapping of random series (series of independent identically distributed random variables). After presenting some properties of the algorithm, we present exact results on the topological properties of graphs associated with random series, namely the degree distribution, the clustering coefficient, and the mean path length. We show that the horizontal visibility algorithm stands as a simple method to discriminate randomness in time series, since any random series maps to a graph with an exponential degree distribution of the form P(k) = (1/3)(2/3)^(k-2), independent of the probability distribution from which the series was generated. Accordingly, visibility graphs with other P(k) correspond to nonrandom series. Numerical simulations confirm the accuracy of the theorems for finite series. In a second part, we show that the method is able to distinguish chaotic series from independent and identically distributed (i.i.d.) ones, studying the following situations: (i) noise-free low-dimensional chaotic series, (ii) low-dimensional noisy chaotic series, even in the presence of large amounts of noise, and (iii) high-dimensional chaotic series (coupled map lattice), without the need for additional techniques such as surrogate data or noise reduction methods. Finally, heuristic arguments are given to explain the topological properties of chaotic series, and several sequences that are conjectured to be random are analyzed.
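The horizontal visibility criterion is simple enough to state directly: two samples are linked when every sample strictly between them is lower than both. A toy sketch (not the authors' code) that computes node degrees, whose distribution the theorem predicts to be P(k) = (1/3)(2/3)^(k-2) for i.i.d. input:

```python
def hvg_degrees(y):
    """Horizontal visibility graph: nodes i and j are linked iff every
    sample strictly between them is lower than both endpoints.
    Returns the degree of each node (O(n^2) for clarity, not speed)."""
    n = len(y)
    deg = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            if all(y[k] < min(y[i], y[j]) for k in range(i + 1, j)):
                deg[i] += 1
                deg[j] += 1
    return deg

# Small worked example on the series [3, 1, 2, 4]: node 1 is "hidden"
# from node 3 by node 2, so the pair (1, 3) gets no edge.
deg = hvg_degrees([3, 1, 2, 4])
```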

  9. Detection of intermittent events in atmospheric time series

    NASA Astrophysics Data System (ADS)

    Paradisi, P.; Cesari, R.; Palatella, L.; Contini, D.; Donateo, A.

    2009-04-01

The modeling approach in atmospheric sciences is based on the assumption that local fluxes of mass, momentum, heat, etc. can be described as linear functions of the local gradient of some intensive property (concentration, flow strain, temperature, ...). This is essentially associated with Gaussian statistics and short-range (exponential) correlations. However, the atmosphere is a complex dynamical system displaying a wide range of spatial and temporal scales. A global description of the atmospheric dynamics should include a great number of degrees of freedom, strongly interacting on several temporal and spatial scales, thus generating long-range (power-law) correlations and non-Gaussian distributions of fluctuations (Lévy flights, Lévy walks, Continuous Time Random Walks) [1]. This is typically associated with anomalous diffusion and scaling, non-trivial memory features and correlation decays and, especially, with the emergence of flux-gradient relationships that are non-linear and/or non-local in time and/or space. In practice, the local flux-gradient relationship is greatly preferred because it has a clearer physical meaning, allows direct comparisons with experimental data, and, especially, entails smaller computational costs in numerical models. In particular, the linearity of this relationship allows one to define a transport coefficient (e.g., turbulent diffusivity), and the modeling effort is usually focused on this coefficient. However, the validity of the local (and linear) flux-gradient model is strongly dependent on the range of spatial and temporal scales represented by the model and, consequently, on the sub-grid processes included in the flux-gradient relationship. In this work, in order to check the validity of local and linear flux-gradient relationships, an approach based on the concept of renewal critical events [2] is introduced. In fact, in renewal theory [2], the dynamical origin of anomalous behaviour and non-local flux-gradient relation is

  10. Nonlinear parametric model for Granger causality of time series

    NASA Astrophysics Data System (ADS)

    Marinazzo, Daniele; Pellicoro, Mario; Stramaglia, Sebastiano

    2006-06-01

    The notion of Granger causality between two time series examines if the prediction of one series could be improved by incorporating information of the other. In particular, if the prediction error of the first time series is reduced by including measurements from the second time series, then the second time series is said to have a causal influence on the first one. We propose a radial basis function approach to nonlinear Granger causality. The proposed model is not constrained to be additive in variables from the two time series and can approximate any function of these variables, still being suitable to evaluate causality. Usefulness of this measure of causality is shown in two applications. In the first application, a physiological one, we consider time series of heart rate and blood pressure in congestive heart failure patients and patients affected by sepsis: we find that sepsis patients, unlike congestive heart failure patients, show symmetric causal relationships between the two time series. In the second application, we consider the feedback loop in a model of excitatory and inhibitory neurons: we find that in this system causality measures the combined influence of couplings and membrane time constants.
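The paper's model is a radial basis function network; as a minimal linear stand-in, a lag-1 bivariate Granger test can be sketched as the log-ratio of restricted to full residual variances (synthetic data, illustrative only, not the paper's physiological series):

```python
import math
import random

def ols2(u, v, x):
    """Least-squares fit of x ~ a*u + b*v (no intercept) via 2x2 normal
    equations; returns the residual variance of the full model."""
    suu = sum(a * a for a in u)
    svv = sum(b * b for b in v)
    suv = sum(a * b for a, b in zip(u, v))
    sux = sum(a * c for a, c in zip(u, x))
    svx = sum(b * c for b, c in zip(v, x))
    det = suu * svv - suv * suv
    a = (svv * sux - suv * svx) / det
    b = (suu * svx - suv * sux) / det
    resid = [c - a * p - b * q for p, q, c in zip(u, v, x)]
    return sum(r * r for r in resid) / len(resid)

def granger_index(x, y):
    """log(restricted variance / full variance) for a lag-1 model of x;
    values clearly above zero suggest y Granger-causes x."""
    u, v, tgt = x[:-1], y[:-1], x[1:]
    # Restricted model: x_t ~ a * x_{t-1} only.
    sa = sum(a * c for a, c in zip(u, tgt)) / sum(a * a for a in u)
    var_restricted = sum((c - sa * a) ** 2 for a, c in zip(u, tgt)) / len(tgt)
    var_full = ols2(u, v, tgt)
    return math.log(var_restricted / var_full)

# Synthetic pair in which y drives x but not the other way around.
random.seed(1)
y = [random.gauss(0, 1) for _ in range(500)]
x = [0.0]
for t in range(1, 500):
    x.append(0.5 * x[t - 1] + 0.8 * y[t - 1] + 0.1 * random.gauss(0, 1))
gc_yx = granger_index(x, y)  # large: y helps predict x
gc_xy = granger_index(y, x)  # near zero: x does not help predict y
```

The asymmetry of the two indices is exactly the causal asymmetry the abstract exploits in its heart-rate/blood-pressure application.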

  11. Using Time-Series Regression to Predict Academic Library Circulations.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.

    1984-01-01

    Four methods were used to forecast monthly circulation totals in 15 midwestern academic libraries: dummy time-series regression, lagged time-series regression, simple average (straight-line forecasting), monthly average (naive forecasting). In tests of forecasting accuracy, dummy regression method and monthly mean method exhibited smallest average…

  12. The Prediction of Teacher Turnover Employing Time Series Analysis.

    ERIC Educational Resources Information Center

    Costa, Crist H.

    The purpose of this study was to combine knowledge of teacher demographic data with time-series forecasting methods to predict teacher turnover. Moving averages and exponential smoothing were used to forecast discrete time series. The study used data collected from the 22 largest school districts in Iowa, designated as FACT schools. Predictions…
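The two forecasting tools named, moving averages and exponential smoothing, can be sketched in a few lines (toy numbers, not the Iowa district data):

```python
def exp_smooth(series, alpha):
    """Simple exponential smoothing; the final smoothed value serves as
    the one-step-ahead forecast."""
    s = series[0]
    for x in series[1:]:
        s = alpha * x + (1 - alpha) * s
    return s

def moving_average(series, window):
    """Forecast as the mean of the last `window` observations."""
    tail = series[-window:]
    return sum(tail) / len(tail)

data = [100, 104, 102, 108, 110, 109]
f_es = exp_smooth(data, 0.3)
f_ma = moving_average(data, 3)
```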

  13. Investigation on gait time series by means of factorial moments

    NASA Astrophysics Data System (ADS)

    Yang, Huijie; Zhao, Fangcui; Zhuo, Yizhong; Wu, Xizhen; Li, Zhuxia

    2002-09-01

By means of factorial moments (FM), the fractal structures embedded in gait time series are investigated. Intermittency is found in records for healthy subjects, and this intermittency is highly sensitive to disease or outside influences. FM is found to be an effective tool for dealing with this kind of time series.

  14. Improved singular spectrum analysis for time series with missing data

    NASA Astrophysics Data System (ADS)

    Shen, Y.; Peng, F.; Li, B.

    2015-07-01

    Singular spectrum analysis (SSA) is a powerful technique for time series analysis. Based on the property that the original time series can be reproduced from its principal components, this contribution develops an improved SSA (ISSA) for processing the incomplete time series and the modified SSA (SSAM) of Schoellhamer (2001) is its special case. The approach is evaluated with the synthetic and real incomplete time series data of suspended-sediment concentration from San Francisco Bay. The result from the synthetic time series with missing data shows that the relative errors of the principal components reconstructed by ISSA are much smaller than those reconstructed by SSAM. Moreover, when the percentage of the missing data over the whole time series reaches 60 %, the improvements of relative errors are up to 19.64, 41.34, 23.27 and 50.30 % for the first four principal components, respectively. Both the mean absolute error and mean root mean squared error of the reconstructed time series by ISSA are also smaller than those by SSAM. The respective improvements are 34.45 and 33.91 % when the missing data accounts for 60 %. The results from real incomplete time series also show that the standard deviation (SD) derived by ISSA is 12.27 mg L-1, smaller than the 13.48 mg L-1 derived by SSAM.

  15. Improved singular spectrum analysis for time series with missing data

    NASA Astrophysics Data System (ADS)

    Shen, Y.; Peng, F.; Li, B.

    2014-12-01

    Singular spectrum analysis (SSA) is a powerful technique for time series analysis. Based on the property that the original time series can be reproduced from its principal components, this contribution will develop an improved SSA (ISSA) for processing the incomplete time series and the modified SSA (SSAM) of Schoellhamer (2001) is its special case. The approach was evaluated with the synthetic and real incomplete time series data of suspended-sediment concentration from San Francisco Bay. The result from the synthetic time series with missing data shows that the relative errors of the principal components reconstructed by ISSA are much smaller than those reconstructed by SSAM. Moreover, when the percentage of the missing data over the whole time series reaches 60%, the improvements of relative errors are up to 19.64, 41.34, 23.27 and 50.30% for the first four principal components, respectively. Besides, both the mean absolute errors and mean root mean squared errors of the reconstructed time series by ISSA are also much smaller than those by SSAM. The respective improvements are 34.45 and 33.91% when the missing data accounts for 60%. The results from real incomplete time series also show that the SD derived by ISSA is 12.27 mg L-1, smaller than 13.48 mg L-1 derived by SSAM.

  16. Measurements of spatial population synchrony: influence of time series transformations.

    PubMed

    Chevalier, Mathieu; Laffaille, Pascal; Ferdy, Jean-Baptiste; Grenouillet, Gaël

    2015-09-01

    Two mechanisms have been proposed to explain spatial population synchrony: dispersal among populations, and the spatial correlation of density-independent factors (the "Moran effect"). To identify which of these two mechanisms is driving spatial population synchrony, time series transformations (TSTs) of abundance data have been used to remove the signature of one mechanism, and highlight the effect of the other. However, several issues with TSTs remain, and to date no consensus has emerged about how population time series should be handled in synchrony studies. Here, by using 3131 time series involving 34 fish species found in French rivers, we computed several metrics commonly used in synchrony studies to determine whether a large-scale climatic factor (temperature) influenced fish population dynamics at the regional scale, and to test the effect of three commonly used TSTs (detrending, prewhitening and a combination of both) on these metrics. We also tested whether the influence of TSTs on time series and population synchrony levels was related to the features of the time series using both empirical and simulated time series. For several species, and regardless of the TST used, we evidenced a Moran effect on freshwater fish populations. However, these results were globally biased downward by TSTs which reduced our ability to detect significant signals. Depending on the species and the features of the time series, we found that TSTs could lead to contradictory results, regardless of the metric considered. Finally, we suggest guidelines on how population time series should be processed in synchrony studies.
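Of the three TSTs compared, detrending is the simplest to illustrate; a sketch of least-squares linear detrending (plain Python, not the authors' pipeline):

```python
def detrend(series):
    """Remove the ordinary-least-squares linear trend from an evenly
    sampled series, returning the residuals."""
    n = len(series)
    t = list(range(n))
    tbar = sum(t) / n
    ybar = sum(series) / n
    slope = (sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, series))
             / sum((ti - tbar) ** 2 for ti in t))
    intercept = ybar - slope * tbar
    return [yi - (intercept + slope * ti) for ti, yi in zip(t, series)]

# A pure trend detrends to (numerically) zero residuals.
residuals = detrend([3 + 2 * t for t in range(10)])
```

Prewhitening goes further, removing autocorrelation as well; the abstract's caution is that either transformation can also remove part of the shared climatic signal one is trying to detect.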

  17. Small Sample Properties of Bayesian Multivariate Autoregressive Time Series Models

    ERIC Educational Resources Information Center

    Price, Larry R.

    2012-01-01

    The aim of this study was to compare the small sample (N = 1, 3, 5, 10, 15) performance of a Bayesian multivariate vector autoregressive (BVAR-SEM) time series model relative to frequentist power and parameter estimation bias. A multivariate autoregressive model was developed based on correlated autoregressive time series vectors of varying…

  18. A Computer Evolution in Teaching Undergraduate Time Series

    ERIC Educational Resources Information Center

    Hodgess, Erin M.

    2004-01-01

    In teaching undergraduate time series courses, we have used a mixture of various statistical packages. We have finally been able to teach all of the applied concepts within one statistical package; R. This article describes the process that we use to conduct a thorough analysis of a time series. An example with a data set is provided. We compare…

  19. Predictability of nonstationary time series using wavelet and EMD based ARMA models

    NASA Astrophysics Data System (ADS)

    Karthikeyan, L.; Nagesh Kumar, D.

    2013-10-01

    Research has been undertaken to ascertain the predictability of non-stationary time series using wavelet and Empirical Mode Decomposition (EMD) based time series models. Methods have been developed in the past to decompose a time series into components. Forecasting of these components combined with random component could yield predictions. Using this ideology, wavelet and EMD analyses have been incorporated separately which decomposes a time series into independent orthogonal components with both time and frequency localizations. The component series are fit with specific auto-regressive models to obtain forecasts which are later combined to obtain the actual predictions. Four non-stationary streamflow sites (USGS data resources) of monthly total volumes and two non-stationary gridded rainfall sites (IMD) of monthly total rainfall are considered for the study. The predictability is checked for six and twelve months ahead forecasts across both the methodologies. Based on performance measures, it is observed that wavelet based method has better prediction capabilities over EMD based method despite some of the limitations of time series methods and the manner in which decomposition takes place. Finally, the study concludes that the wavelet based time series algorithm can be used to model events such as droughts with reasonable accuracy. Also, some modifications that can be made in the model have been discussed that could extend the scope of applicability to other areas in the field of hydrology.

  20. Scaling and Multiscaling in Financial Time Series

    DTIC Science & Technology

    2007-11-02

Outline: 1. A brief overview of financial markets: basic definitions and problems related to finance; scaling in finance. 2. Quantitative finance: rational investment and risk management; price dynamics; risk quantification and control; financial instruments (derivatives). Scaling in finance is supported by empirical observations and by practical interests: stability over time scales (by aggregation); the same model is valid over a wide

  1. Time series diagnosis of tree hydraulic characteristics.

    PubMed

    Phillips, Nathan G; Oren, Ram; Licata, Julian; Linder, Sune

    2004-08-01

An in vivo method for diagnosing hydraulic characteristics of branches and whole trees is described. The method imposes short-lived perturbations of transpiration and traces the propagation of the hydraulic response through trees. The water uptake response contains the integrated signature of hydraulic resistance and capacitance within trees. The method produces large signal to noise ratios for analysis, but does not cause damage or destruction to tree stems or branches. Based on results with two conifer tree species, we show that the method allows for the simple parameterization of bulk hydraulic resistance and capacitance of trees. Bulk tree parameterization of resistance and capacitance predicted the overall diel shape of water uptake, but did not predict the overshoot water uptake response in trees to shorter-term variations in transpiration, created by step changes in transpiration rate. Stomatal dynamics likely complicated the use of simple resistance-capacitance models of tree water transport on these short time scales. The results provide insight into dominant hydraulic and physiological factors controlling tree water flux on varying time scales, and allow for the practical assessment of necessary tree hydraulic model complexity in relation to the time step of soil-vegetation-atmosphere transport models.

  2. Time Series of North Pacific Volcanic Eruptions

    NASA Astrophysics Data System (ADS)

    Dehn, J.; Worden, A. K.; Webley, P. W.

    2011-12-01

The record of volcanic eruptions was gathered from the 1986 eruption of Augustine Volcano to the present for Alaska, Kamchatka and the Kuril Islands. In this time over 400 ash-producing eruptions were noted, and many more events that produced some other activity, e.g. lava, lahar, small explosion, seismic crisis. This represents a minimum for the volcanic activity in this region. It is thought that the records for Alaska are complete for this time period, but it is possible that activity in the Kuriles and Kamchatka could have been overlooked, particularly smaller events. For the Alaska region, 19 different volcanoes have been active in this time. Mt. Cleveland shows the most activity over the time period (40% likely to have activity in a 3-month period), followed closely by Pavlof volcano (34% likely). In Kamchatka only 7 volcanoes have been active; Shiveluch is the most active (83% likely), followed by Bezymianny and Kliuchevskoi volcanoes (tied at 60%). The Kuriles have had only 4 active volcanoes, and only 6 known eruptions. Overall this region is one of the most active in the world: in any 3-month period there is a 77% likelihood of volcanic activity. For well-instrumented volcanoes, the majority of activity is preceded by significant seismicity. For just over half of the events, explosive activity is preceded by thermal signals in infrared satellite data. Rarely (only about 5% of the time) is a stand-alone thermal signal not followed within 3 months by an explosive eruption. For the remaining events, where an ash plume begins the activity, over 90% of the cases show a thermal signal during the eruption. The volcanoes with the most activity are the least likely to produce large ash plumes. Conversely, the volcanoes that erupt rarely often begin with larger ash-producing events. Though there appears to be a recurrent progression of volcanic activity down the chain from east to west, this may be an artifact of several independent systems, each working at their own rate, that

  3. Time Series Prediction of Hurricane Landfall.

    DTIC Science & Technology

    1986-05-01

…parameters to change as the storm moves to a new region of the ocean. For test cases, operational average 72 hour prediction error is at least three… comparatively accurate for forecast times of 24 hours or less. The SANBAR model (Sanders and Burpee, 1968) has been in use at NHC since 1970. It is a

  4. Sunspot Time Series: Passive and Active Intervals

    NASA Astrophysics Data System (ADS)

    Zięba, S.; Nieckarz, Z.

    2014-07-01

Solar activity slowly and irregularly decreases from the first spotless day (FSD) in the declining phase of the old sunspot cycle and systematically, but also in an irregular way, increases to the new cycle maximum after the last spotless day (LSD). The time interval between the first and the last spotless day can be called the passive interval (PI), while the time interval from the last spotless day to the first one after the new cycle maximum is the related active interval (AI). Minima of solar cycles are inside PIs, while maxima are inside AIs. In this article, we study the properties of passive and active intervals to determine the relation between them. We have found that some properties of PIs, and related AIs, differ significantly between two groups of solar cycles; this has allowed us to classify Cycles 8 - 15 as passive cycles, and Cycles 17 - 23 as active ones. We conclude that the solar activity in the PI declining phase (a descending phase of the previous cycle) determines the strength of the approaching maximum in the case of active cycles, while the activity of the PI rising phase (a phase of the ongoing cycle early growth) determines the strength of passive cycles. This can have implications for solar dynamo models. Our approach indicates the important role of solar activity during the declining and the rising phases of the solar-cycle minimum.

  5. Comparison of New and Old Sunspot Number Time Series

    NASA Astrophysics Data System (ADS)

    Cliver, E. W.

    2016-11-01

    Four new sunspot number time series have been published in this Topical Issue: a backbone-based group number in Svalgaard and Schatten ( Solar Phys., 2016; referred to here as SS, 1610 - present), a group number series in Usoskin et al. ( Solar Phys., 2016; UEA, 1749 - present) that employs active day fractions from which it derives an observational threshold in group spot area as a measure of observer merit, a provisional group number series in Cliver and Ling ( Solar Phys., 2016; CL, 1841 - 1976) that removed flaws in the Hoyt and Schatten ( Solar Phys. 179, 189, 1998a; 181, 491, 1998b) normalization scheme for the original relative group sunspot number (RG, 1610 - 1995), and a corrected Wolf (international, RI) number in Clette and Lefèvre ( Solar Phys., 2016; SN, 1700 - present). Despite quite different construction methods, the four new series agree well after about 1900. Before 1900, however, the UEA time series is lower than SS, CL, and SN, particularly so before about 1885. Overall, the UEA series most closely resembles the original RG series. Comparison of the UEA and SS series with a new solar wind B time series (Owens et al. in J. Geophys. Res., 2016; 1845 - present) indicates that the UEA time series is too low before 1900. We point out incongruities in the Usoskin et al. ( Solar Phys., 2016) observer normalization scheme and present evidence that this method under-estimates group counts before 1900. In general, a correction factor time series, obtained by dividing an annual group count series by the corresponding yearly averages of raw group counts for all observers, can be used to assess the reliability of new sunspot number reconstructions.

  6. Detecting unstable periodic orbits from transient chaotic time series

    PubMed

    Dhamala; Lai; Kostelich

    2000-06-01

    We address the detection of unstable periodic orbits from experimentally measured transient chaotic time series. In particular, we examine recurrence times of trajectories in the vector space reconstructed from an ensemble of such time series. Numerical experiments demonstrate that this strategy can yield periodic orbits of low periods even when noise is present. We analyze the probability of finding periodic orbits from transient chaotic time series and derive a scaling law for this probability. The scaling law implies that unstable periodic orbits of high periods are practically undetectable from transient chaos.
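    The recurrence-time strategy described above can be illustrated with a minimal sketch: embed the scalar series with time delays, then flag states that return to within a tolerance eps after `period` steps as candidate points on a periodic orbit. The function names, the fixed embedding parameters, and the use of the maximum norm are illustrative choices, not the authors' implementation.

```python
def delay_embed(series, dim, tau):
    """Time-delay reconstruction of the state space."""
    return [tuple(series[i + k * tau] for k in range(dim))
            for i in range(len(series) - (dim - 1) * tau)]

def recurrent_points(series, period, dim=2, tau=1, eps=1e-3):
    """Indices whose reconstructed state recurs within eps after `period`
    steps: candidates for points on a periodic orbit of that period."""
    pts = delay_embed(series, dim, tau)
    hits = []
    for i in range(len(pts) - period):
        # maximum-norm distance between a state and its image `period` steps later
        d = max(abs(a - b) for a, b in zip(pts[i], pts[i + period]))
        if d < eps:
            hits.append(i)
    return hits

# on an exactly period-2 signal, every embedded state recurs after two steps
hits = recurrent_points([0.0, 1.0] * 10, period=2, eps=1e-6)
```

    On real transient chaotic data the tolerance eps and the noise level decide how many spurious or missed recurrences occur, which is the trade-off the scaling law in the abstract quantifies.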

  7. High performance biomedical time series indexes using salient segmentation.

    PubMed

    Woodbridge, Jonathan; Mortazavi, Bobak; Bui, Alex A T; Sarrafzadeh, Majid

    2012-01-01

    The advent of remote and wearable medical sensing has created a dire need for efficient medical time series databases. Wearable medical sensing devices provide continuous patient monitoring by various types of sensors and have the potential to create massive amounts of data. Therefore, time series databases must utilize highly optimized indexes in order to efficiently search and analyze stored data. This paper presents a highly efficient technique for indexing medical time series signals using Locality Sensitive Hashing (LSH). Unlike previous work, only salient (or interesting) segments are inserted into the index. This technique reduces search times by up to 95% while yielding near identical search results.
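    As a hedged sketch of the idea, the snippet below indexes only high-variance ("salient") segments using random-hyperplane LSH. The variance-based saliency test, the signature length, and all names here are illustrative stand-ins for the paper's actual segmentation and hashing choices.

```python
import random

def lsh_signature(segment, planes):
    """Random-hyperplane LSH: one sign bit per hyperplane."""
    return tuple(int(sum(u * v for u, v in zip(segment, p)) >= 0)
                 for p in planes)

def build_index(segments, planes, saliency_threshold):
    """Hash-bucket index over segments, skipping non-salient ones.
    Saliency is proxied here by segment variance (an assumption)."""
    index = {}
    for seg_id, seg in enumerate(segments):
        m = sum(seg) / len(seg)
        if sum((x - m) ** 2 for x in seg) / len(seg) < saliency_threshold:
            continue  # flat, uninteresting segment: not inserted
        index.setdefault(lsh_signature(seg, planes), []).append(seg_id)
    return index

rng = random.Random(42)
planes = [[rng.gauss(0, 1) for _ in range(4)] for _ in range(6)]
segments = [[0.0, 0.0, 0.0, 0.0],     # flat -> excluded from the index
            [0.0, 2.0, -2.0, 1.0],    # salient
            [0.0, 2.0, -2.0, 1.0]]    # identical copy -> same bucket
index = build_index(segments, planes, saliency_threshold=0.5)
```

    Identical salient segments always share a bucket, so a query segment only needs to be compared against the (much smaller) set of indexed candidates, which is where the reported search-time reduction comes from.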

  8. From time series to complex networks: The visibility graph

    PubMed Central

    Lacasa, Lucas; Luque, Bartolo; Ballesteros, Fernando; Luque, Jordi; Nuño, Juan Carlos

    2008-01-01

    In this work we present a simple and fast computational method, the visibility algorithm, that converts a time series into a graph. The constructed graph inherits several properties of the series in its structure: periodic series convert into regular graphs, random series into random graphs, and fractal series into scale-free networks, reinforcing the view that power-law degree distributions are related to fractality, a topic of much recent discussion. Some remarkable examples and analytical tools are outlined to test the method's reliability. By means of this new approach, many measures recently developed in complex network theory can be used to characterize time series from a new point of view. PMID:18362361
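    The natural visibility criterion is simple enough to state in a few lines: samples (i, y_i) and (j, y_j) are connected whenever every intermediate sample lies strictly below the straight line joining them. A minimal sketch:

```python
def visibility_graph(series):
    """Natural visibility algorithm (Lacasa et al. 2008): connect samples
    i < j whenever every intermediate sample k lies strictly below the
    straight line joining (i, y_i) and (j, y_j)."""
    n = len(series)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            ya, yb = series[i], series[j]
            # visibility criterion: interpolate the line at position k
            if all(series[k] < yb + (ya - yb) * (j - k) / (j - i)
                   for k in range(i + 1, j)):
                edges.add((i, j))
    return edges
```

    For the toy series [1, 3, 2, 4] the criterion links each consecutive pair plus the pair (1, 3); graph measures (degree distribution, clustering, etc.) can then be computed on the result.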

  9. Sensor-Generated Time Series Events: A Definition Language

    PubMed Central

    Anguera, Aurea; Lara, Juan A.; Lizcano, David; Martínez, Maria Aurora; Pazos, Juan

    2012-01-01

    There are now a great many domains where information is recorded by sensors over a limited time period or on a permanent basis. This data flow leads to sequences of data known as time series. In many domains, like seismography or medicine, time series analysis focuses on particular regions of interest, known as events, whereas the remainder of the time series contains hardly any useful information. In these domains, there is a need for mechanisms to identify and locate such events. In this paper, we propose an events definition language that is general enough to be used to easily and naturally define events in time series recorded by sensors in any domain. The proposed language has been applied to the definition of time series events generated within the branch of medicine dealing with balance-related functions in human beings. A device, called posturograph, is used to study balance-related functions. The platform has four sensors that record the pressure intensity being exerted on the platform, generating four interrelated time series. As opposed to the existing ad hoc proposals, the results confirm that the proposed language is valid, that is generally applicable and accurate, for identifying the events contained in the time series.

  10. Detecting and visualizing structural changes in groundwater head time series

    NASA Astrophysics Data System (ADS)

    van Geer, Frans

    2013-04-01

    Since the 1950s, the dynamic behavior of the groundwater head has been monitored at many locations throughout the Netherlands and elsewhere. The database of the Geological Survey of the Netherlands contains over 30,000 groundwater time series. Many water management purposes require characteristics of the dynamic behavior, such as the average, median, and percentiles. These characteristics are estimated from the time series; in principle, the longer the time series, the more reliable the estimate. However, due to natural as well as human-induced changes, the characteristics of a long time series often change over time as well. For water management it is important to be able to distinguish extreme values that are part of the 'normal' pattern from structural changes in the groundwater regime. Whether or not structural changes are present in a time series cannot be decided completely objectively: choices have to be made concerning the length of the period and the statistical parameters. Here a method is proposed to visualize the probability of structural changes in a time series using well-known basic statistical tests. The visualization is based on the mean values and standard deviations in a moving window. Apart from several characteristics that are calculated for each period separately, all pairs of periods are compared and their differences are statistically tested. The results of these well-known tests are combined in a visualization that supplies the user with comprehensive information for examining structural changes in time series.
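    The idea of comparing all pairs of periods with basic statistical tests can be sketched as follows. The non-overlapping windows, the Welch-style statistic, and the fixed threshold are simplifying assumptions for illustration, not the authors' exact procedure (which also visualizes the results).

```python
import math

def window_stats(series, width):
    """Mean and population standard deviation in non-overlapping windows."""
    stats = []
    for start in range(0, len(series) - width + 1, width):
        w = series[start:start + width]
        m = sum(w) / width
        sd = math.sqrt(sum((x - m) ** 2 for x in w) / width)
        stats.append((m, sd))
    return stats

def change_flags(series, width, threshold=2.0):
    """Compare every pair of windows; flag a pair when its means differ
    by more than `threshold` standard errors (a Welch-style statistic)."""
    stats = window_stats(series, width)
    flags = {}
    for a in range(len(stats)):
        for b in range(a + 1, len(stats)):
            (ma, sa), (mb, sb) = stats[a], stats[b]
            se = math.sqrt((sa ** 2 + sb ** 2) / width) or 1e-12
            flags[(a, b)] = abs(ma - mb) / se > threshold
    return flags

# a level shift between the two halves is flagged as a structural change
flags = change_flags([0.0, 0.1, -0.1, 0.05, 5.0, 5.1, 4.9, 5.05], width=4)
```

    In the visualization the boolean matrix of pairwise flags would be rendered as an image, so persistent blocks of flagged pairs stand out as structural changes rather than isolated extremes.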

  11. Automated analysis of brachial ultrasound time series

    NASA Astrophysics Data System (ADS)

    Liang, Weidong; Browning, Roger L.; Lauer, Ronald M.; Sonka, Milan

    1998-07-01

    Atherosclerosis begins in childhood with the accumulation of lipid in the intima of arteries to form fatty streaks, and advances through adult life, when occlusive vascular disease may result in coronary heart disease, stroke and peripheral vascular disease. Non-invasive B-mode ultrasound has been found useful in studying risk factors in the symptom-free population. Large amounts of data are acquired from continuous imaging of the vessels in a large study population. A high-quality brachial vessel diameter measurement method is necessary so that accurate diameters can be measured consistently in all frames of a sequence and across different observers. Though a human expert has the advantage over automated computer methods in recognizing noise during diameter measurement, manual measurement suffers from inter- and intra-observer variability and is also time-consuming. An automated measurement method is presented in this paper which utilizes quality-assurance approaches to adapt to specific image features and to recognize and minimize the noise effect. Experimental results showed the method's potential for clinical use in epidemiological studies.

  12. Performance of multifractal detrended fluctuation analysis on short time series

    NASA Astrophysics Data System (ADS)

    López, Juan Luis; Contreras, Jesús Guillermo

    2013-02-01

    The performance of multifractal detrended fluctuation analysis on short time series is evaluated for synthetic samples of several mono- and multifractal models. The reconstruction of the generalized Hurst exponents is used to determine the range of applicability of the method and the precision of its results as a function of the decreasing length of the series. As an application, the series of the daily exchange rate between the U.S. dollar and the euro is studied.
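    For the monofractal case (q = 2), the detrended fluctuation procedure that MFDFA generalizes can be sketched in a few lines: integrate the centred series, remove a least-squares line in windows of each scale, and read the scaling exponent off a log-log fit. The scale choices below are illustrative, and this sketch omits the q-dependent moments that make the analysis multifractal.

```python
import math
import random

def _detrended_rms(profile, scale):
    """RMS fluctuation after removing a least-squares line in each segment."""
    n_seg = len(profile) // scale
    total = 0.0
    for s in range(n_seg):
        seg = profile[s * scale:(s + 1) * scale]
        xs = range(scale)
        mx = (scale - 1) / 2.0
        my = sum(seg) / scale
        b = (sum((x - mx) * (y - my) for x, y in zip(xs, seg))
             / sum((x - mx) ** 2 for x in xs))
        a = my - b * mx
        total += sum((y - (a + b * x)) ** 2 for x, y in zip(xs, seg))
    return math.sqrt(total / (n_seg * scale))

def dfa_exponent(series, scales=(8, 16, 32, 64)):
    """Slope of log F(s) vs log s: the q = 2 scaling (Hurst-type) exponent."""
    mean = sum(series) / len(series)
    profile, acc = [], 0.0
    for x in series:
        acc += x - mean
        profile.append(acc)
    pts = [(math.log(s), math.log(_detrended_rms(profile, s))) for s in scales]
    mx = sum(l for l, _ in pts) / len(pts)
    my = sum(f for _, f in pts) / len(pts)
    return (sum((l - mx) * (f - my) for l, f in pts)
            / sum((l - mx) ** 2 for l, _ in pts))

random.seed(0)
white = [random.gauss(0, 1) for _ in range(1000)]
# uncorrelated noise should give an exponent near 0.5
```

    Shortening `white` and watching the estimate degrade reproduces, in miniature, the short-series performance question the paper studies.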

  13. Model-free quantification of time-series predictability.

    PubMed

    Garland, Joshua; James, Ryan; Bradley, Elizabeth

    2014-11-01

    This paper provides insight into when, why, and how forecast strategies fail when they are applied to complicated time series. We conjecture that the inherent complexity of real-world time-series data, which results from the dimension, nonlinearity, and nonstationarity of the generating process, as well as from measurement issues such as noise, aggregation, and finite data length, is both empirically quantifiable and directly correlated with predictability. In particular, we argue that redundancy is an effective way to measure complexity and predictive structure in an experimental time series and that weighted permutation entropy is an effective way to estimate that redundancy. To validate these conjectures, we study 120 different time-series data sets. For each time series, we construct predictions using a wide variety of forecast models, then compare the accuracy of the predictions with the permutation entropy of that time series. We use the results to develop a model-free heuristic that can help practitioners recognize when a particular prediction method is not well matched to the task at hand: that is, when the time series has more predictive structure than that method can capture and exploit.
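    Weighted permutation entropy, the redundancy estimator named above, can be sketched compactly: count ordinal patterns of short windows, weight each occurrence by the window's variance, and take the normalised Shannon entropy. The variance weighting follows the common weighted-permutation-entropy definition; the parameter names are illustrative.

```python
import math
from collections import defaultdict

def weighted_permutation_entropy(series, order=3):
    """Shannon entropy of ordinal patterns, each occurrence weighted by its
    window's variance, normalised to [0, 1] by log(order!)."""
    weights = defaultdict(float)
    for i in range(len(series) - order + 1):
        w = series[i:i + order]
        # ordinal pattern: indices of the window sorted by value
        pattern = tuple(sorted(range(order), key=w.__getitem__))
        m = sum(w) / order
        weights[pattern] += sum((x - m) ** 2 for x in w) / order
    total = sum(weights.values())
    if total == 0.0:
        return 0.0  # constant series: no patterns carry weight
    h = -sum((v / total) * math.log(v / total) for v in weights.values())
    return h / math.log(math.factorial(order))
```

    A monotone ramp produces a single ordinal pattern and hence zero entropy (fully predictable); values near 1 indicate little ordinal structure for a forecaster to exploit, which is the heuristic the paper builds on.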

  14. Time Series Decomposition into Oscillation Components and Phase Estimation.

    PubMed

    Matsuda, Takeru; Komaki, Fumiyasu

    2017-02-01

    Many time series are naturally considered as a superposition of several oscillation components. For example, electroencephalogram (EEG) time series include oscillation components such as alpha, beta, and gamma. We propose a method for decomposing time series into such oscillation components using state-space models. Based on the concept of random frequency modulation, Gaussian linear state-space models for oscillation components are developed; in this model, the frequency of an oscillator fluctuates by noise. Time series decomposition is accomplished with this model in a manner analogous to the Bayesian seasonal adjustment method. Since the model parameters are estimated from data by the empirical Bayes method, the amplitudes and frequencies of the oscillation components are determined in a data-driven manner. Also, the appropriate number of oscillation components is determined with the Akaike information criterion (AIC). In this way, the proposed method provides a natural decomposition of the given time series into oscillation components. In neuroscience, the phase of a neural time series plays an important role in neural information processing. The proposed method can be used to estimate the phase of each oscillation component and has several advantages over a conventional method based on the Hilbert transform. Thus, the proposed method enables an investigation of the phase dynamics of time series. Numerical results show that the proposed method succeeds in extracting intermittent oscillations like ripples and in detecting phase-reset phenomena. We apply the proposed method to real data from various fields such as astronomy, ecology, tidology, and neuroscience.

  15. Nonlinear independent component analysis and multivariate time series analysis

    NASA Astrophysics Data System (ADS)

    Storck, Jan; Deco, Gustavo

    1997-02-01

    We derive an information-theory-based unsupervised learning paradigm for nonlinear independent component analysis (NICA) with neural networks. We demonstrate that under the constraint of bounded and invertible output transfer functions the two main goals of unsupervised learning, redundancy reduction and maximization of the transmitted information between input and output (the Infomax principle), are equivalent. No assumptions are made concerning the kind of input and output distributions, i.e. the kind of nonlinearity of correlations. An adapted version of the general NICA network is used for the modeling of multivariate time series by unsupervised learning. Given time series of various observables of a dynamical system, our net learns their evolution in time by extracting statistical dependencies between past and present elements of the time series. Multivariate modeling is obtained by making the present value of each time series statistically independent not only of its own past but also of the past of the other series. Therefore, in contrast to univariate methods, the information lying in the couplings between the observables is also used and a detection of higher-order cross correlations is possible. We apply our method to time series of the two-dimensional Hénon map and to experimental time series obtained from measurements of axial velocities at different locations in weakly turbulent Taylor-Couette flow.

  16. Database for Hydrological Time Series of Inland Waters (DAHITI)

    NASA Astrophysics Data System (ADS)

    Schwatke, Christian; Dettmering, Denise

    2016-04-01

    Satellite altimetry was designed for ocean applications. However, for some years now, satellite altimetry has also been used over inland water to estimate water level time series of lakes, rivers and wetlands. The resulting water level time series can help in understanding the water cycle of system Earth, which makes altimetry a very useful instrument for hydrological applications. In this poster, we introduce the "Database for Hydrological Time Series of Inland Waters" (DAHITI). Currently, the database contains about 350 water level time series of lakes, reservoirs, rivers, and wetlands, which are freely available after a short registration process via http://dahiti.dgfi.tum.de. We present the products of DAHITI and the functionality of the DAHITI web service, and selected examples of inland water targets are presented in detail. DAHITI provides time series of water level heights of inland water bodies and their formal errors. These time series are available within the period 1992-2015 and have varying temporal resolutions depending on the data coverage of the investigated water body. The accuracies of the water level time series depend mainly on the extent of the investigated water body and the quality of the altimeter measurements. An external validation with in-situ data reveals RMS differences between 5 cm and 40 cm for lakes and between 10 cm and 140 cm for rivers, respectively.

  17. Estimation of Parameters from Discrete Random Nonstationary Time Series

    NASA Astrophysics Data System (ADS)

    Takayasu, H.; Nakamura, T.

    For the analysis of nonstationary stochastic time series we introduce a formulation to estimate the underlying time-dependent parameters. This method is designed for random events with small counts, outside the range of applicability of the normal distribution. The method is demonstrated on numerical data generated by a known system, and applied to time series of traffic accidents, the batting average of a baseball player, and the sales volume of home electronics.

  18. Clinical time series prediction: towards a hierarchical dynamical system framework

    PubMed Central

    Liu, Zitao; Hauskrecht, Milos

    2014-01-01

    Objective Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding the patient condition, the dynamics of a disease, the effect of various patient management interventions, and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Materials and methods Our hierarchical dynamical system framework for modeling clinical time series combines the advantages of two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. Results We tested our framework by first learning the time series model from data for the patients in the training set, and then applying the model to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. Conclusion A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance.

  19. Modeling Persistence In Hydrological Time Series Using Fractional Differencing

    NASA Astrophysics Data System (ADS)

    Hosking, J. R. M.

    1984-12-01

    The class of autoregressive integrated moving average (ARIMA) time series models may be generalized by permitting the degree of differencing d to take fractional values. Models including fractional differencing are capable of representing persistent series (d > 0) or short-memory series (d = 0). The class of fractionally differenced ARIMA processes provides a more flexible way than has hitherto been available of simultaneously modeling the long-term and short-term behavior of a time series. In this paper some fundamental properties of fractionally differenced ARIMA processes are presented. Methods of simulating these processes are described. Estimation of the parameters of fractionally differenced ARIMA models is discussed, and an approximate maximum likelihood method is proposed. The methodology is illustrated by fitting fractionally differenced models to time series of streamflows and annual temperatures.
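    The fractional difference operator (1-B)^d has the binomial expansion pi_0 = 1, pi_k = pi_{k-1}(k - 1 - d)/k, which makes a direct truncated implementation straightforward. This sketch is illustrative of the operator itself, not of the paper's simulation or maximum likelihood estimation methods.

```python
def fracdiff(series, d, truncation=None):
    """Apply the fractional difference operator (1-B)^d via its truncated
    binomial expansion: pi_0 = 1, pi_k = pi_{k-1} * (k - 1 - d) / k,
    treating pre-sample values as zero."""
    n = len(series)
    m = n if truncation is None else min(truncation, n)
    pi = [1.0]
    for k in range(1, m):
        pi.append(pi[-1] * (k - 1 - d) / k)
    return [sum(pi[k] * series[t - k] for k in range(min(t + 1, m)))
            for t in range(n)]
```

    With d = 1 the weights collapse to (1, -1, 0, ...), recovering the ordinary first difference; with d = 0 the series is returned unchanged; intermediate values of d give the slowly decaying long-memory filters the abstract describes.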

  20. Wavelet analysis and scaling properties of time series

    NASA Astrophysics Data System (ADS)

    Manimaran, P.; Panigrahi, Prasanta K.; Parikh, Jitendra C.

    2005-10-01

    We propose a wavelet based method for the characterization of the scaling behavior of nonstationary time series. It makes use of the built-in ability of the wavelets for capturing the trends in a data set, in variable window sizes. Discrete wavelets from the Daubechies family are used to illustrate the efficacy of this procedure. After studying binomial multifractal time series with the present and earlier approaches of detrending for comparison, we analyze the time series of averaged spin density in the 2D Ising model at the critical temperature, along with several experimental data sets possessing multifractal behavior.

  1. Analyzing multiple nonlinear time series with extended Granger causality

    NASA Astrophysics Data System (ADS)

    Chen, Yonghong; Rangarajan, Govindan; Feng, Jianfeng; Ding, Mingzhou

    2004-04-01

    Identifying causal relations among simultaneously acquired signals is an important problem in multivariate time series analysis. For linear stochastic systems Granger proposed a simple procedure called the Granger causality to detect such relations. In this work we consider nonlinear extensions of Granger's idea and refer to the result as extended Granger causality. A simple approach implementing the extended Granger causality is presented and applied to multiple chaotic time series and other types of nonlinear signals. In addition, for situations with three or more time series we propose a conditional extended Granger causality measure that enables us to determine whether the causal relation between two signals is direct or mediated by another process.
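    For reference, the linear, order-1 special case that the extended measure generalizes can be written in a few lines: predict y from its own past, then from its own past plus x's past, and compare residual variances. The nonlinear extension in the paper replaces these global linear fits with local models; this sketch, including the simulated coupled pair, is only illustrative.

```python
import math
import random

def granger_1lag(x, y):
    """log(RSS_restricted / RSS_full) for order-1 linear models: large and
    positive when x's past improves the prediction of y."""
    Y, Yl, Xl = y[1:], y[:-1], x[:-1]
    # restricted model: y_t = a * y_{t-1}
    a_r = sum(u * v for u, v in zip(Y, Yl)) / sum(v * v for v in Yl)
    rss_r = sum((u - a_r * v) ** 2 for u, v in zip(Y, Yl))
    # full model: y_t = a * y_{t-1} + b * x_{t-1}, via 2x2 normal equations
    syy = sum(v * v for v in Yl)
    sxx = sum(w * w for w in Xl)
    sxy = sum(v * w for v, w in zip(Yl, Xl))
    cy = sum(u * v for u, v in zip(Y, Yl))
    cx = sum(u * w for u, w in zip(Y, Xl))
    det = syy * sxx - sxy * sxy
    a_f = (cy * sxx - cx * sxy) / det
    b_f = (cx * syy - cy * sxy) / det
    rss_f = sum((u - a_f * v - b_f * w) ** 2 for u, v, w in zip(Y, Yl, Xl))
    return math.log(rss_r / rss_f)

# a unidirectionally coupled pair: x drives y with one step of delay
random.seed(1)
x = [random.gauss(0, 1) for _ in range(200)]
y = [0.0]
for t in range(1, 200):
    y.append(0.8 * x[t - 1] + 0.1 * random.gauss(0, 1))
```

    The statistic is markedly larger in the x-to-y direction than in the reverse, reflecting the one-way coupling; the conditional variant in the paper additionally partials out a third series to distinguish direct from mediated influence.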

  2. A Dimensionality Reduction Technique for Efficient Time Series Similarity Analysis

    PubMed Central

    Wang, Qiang; Megalooikonomou, Vasileios

    2008-01-01

    We propose a dimensionality reduction technique for time series analysis that significantly improves the efficiency and accuracy of similarity searches. In contrast to piecewise constant approximation (PCA) techniques that approximate each time series with constant value segments, the proposed method--Piecewise Vector Quantized Approximation--uses the closest (based on a distance measure) codeword from a codebook of key-sequences to represent each segment. The new representation is symbolic and it allows for the application of text-based retrieval techniques into time series similarity analysis. Experiments on real and simulated datasets show that the proposed technique generally outperforms PCA techniques in clustering and similarity searches. PMID:18496587
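    The encoding step can be sketched directly: split the series into fixed-length segments and replace each by the index of the nearest codeword. The toy codebook below is illustrative; in the paper the codebook of key-sequences is learned from the data.

```python
def pvqa_encode(series, codebook, seg_len):
    """Represent each length-`seg_len` segment by the index of the nearest
    codeword (squared Euclidean distance), yielding a symbolic sequence."""
    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    symbols = []
    for start in range(0, len(series) - seg_len + 1, seg_len):
        seg = series[start:start + seg_len]
        symbols.append(min(range(len(codebook)),
                           key=lambda c: dist2(seg, codebook[c])))
    return symbols

codebook = [[0.0, 0.0], [1.0, 1.0], [0.0, 1.0]]  # toy key-sequences
symbols = pvqa_encode([0.1, -0.1, 0.9, 1.2, 0.05, 1.1], codebook, seg_len=2)
```

    Here the three segments map to symbols 0, 1 and 2 respectively; once series are strings over a small alphabet, text-retrieval machinery applies directly, which is the point the abstract makes.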

  3. Modelling road accidents: An approach using structural time series

    NASA Astrophysics Data System (ADS)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-09-01

    In this paper, the trend of road accidents in Malaysia for the years 2001 until 2012 was modelled using a structural time series approach. The structural time series model was identified using a stepwise method, and the residuals for each model were tested. The best-fitted model was chosen based on the smallest Akaike Information Criterion (AIC) and prediction error variance. In order to check the quality of the model, a data validation procedure was performed by predicting the monthly number of road accidents for the year 2012. Results indicate that the best specification of the structural time series model to represent road accidents is the local level with a seasonal model.

  4. Scalable Prediction of Energy Consumption using Incremental Time Series Clustering

    SciTech Connect

    Simmhan, Yogesh; Noor, Muhammad Usman

    2013-10-09

    Time series datasets are a canonical form of high velocity Big Data, and often generated by pervasive sensors, such as found in smart infrastructure. Performing predictive analytics on time series data can be computationally complex, and requires approximation techniques. In this paper, we motivate this problem using a real application from the smart grid domain. We propose an incremental clustering technique, along with a novel affinity score for determining cluster similarity, which help reduce the prediction error for cumulative time series within a cluster. We evaluate this technique, along with optimizations, using real datasets from smart meters, totaling ~700,000 data points, and show the efficacy of our techniques in improving the prediction error of time series data within polynomial time.

  5. A Multiscale Approach to InSAR Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Simons, M.; Hetland, E. A.; Muse, P.; Lin, Y.; Dicaprio, C. J.

    2009-12-01

    We describe progress in the development of MInTS (Multiscale analysis of InSAR Time Series), an approach to constructing self-consistent time-dependent deformation observations from repeated satellite-based InSAR images of a given region. MInTS relies on a spatial wavelet decomposition to permit the inclusion of distance-based spatial correlations in the observations while maintaining computational tractability. In essence, MInTS allows one to consider all data at the same time, as opposed to one pixel at a time, thereby taking advantage of both the spatial and the temporal characteristics of the deformation field. This approach also permits a consistent treatment of all data independent of the presence of localized holes, due to unwrapping issues, in any given interferogram. Specifically, holes are handled through a weighting scheme that accounts for the extent of actual data versus the area of holes associated with any given wavelet. In terms of the temporal representation, we have the flexibility to explicitly parametrize known processes that are expected to contribute to a given set of observations (e.g., co-seismic steps and post-seismic transients, secular variations, seasonal oscillations, etc.). Our approach also allows the temporal parametrization to include a set of general functions in order to account for unexpected processes. We allow for various forms of model regularization, using a cross-validation approach to select penalty parameters, and we also experiment with sparsity-inducing regularization as a way to select from a large dictionary of time functions. The multiscale analysis allows us to consider various contributions (e.g., orbit errors) that may affect specific scales but not others. The methods described here are all embarrassingly parallel and suitable for implementation on a cluster computer. We demonstrate the use of MInTS using a large suite of ERS-1/2 and Envisat interferograms for Long Valley Caldera, and validate the results.

  6. Quantifying Memory in Complex Physiological Time-Series

    PubMed Central

    Shirazi, Amir H.; Raoufy, Mohammad R.; Ebadi, Haleh; De Rui, Michele; Schiff, Sami; Mazloom, Roham; Hajizadeh, Sohrab; Gharibzadeh, Shahriar; Dehpour, Ahmad R.; Amodio, Piero; Jafari, G. Reza; Montagnese, Sara; Mani, Ali R.

    2013-01-01

    In a time-series, memory is a statistical feature that lasts for a period of time and distinguishes the time-series from a random, or memory-less, process. In the present study, the concept of “memory length” was used to define the time period, or scale over which rare events within a physiological time-series do not appear randomly. The method is based on inverse statistical analysis and provides empiric evidence that rare fluctuations in cardio-respiratory time-series are ‘forgotten’ quickly in healthy subjects while the memory for such events is significantly prolonged in pathological conditions such as asthma (respiratory time-series) and liver cirrhosis (heart-beat time-series). The memory length was significantly higher in patients with uncontrolled asthma compared to healthy volunteers. Likewise, it was significantly higher in patients with decompensated cirrhosis compared to those with compensated cirrhosis and healthy volunteers. We also observed that the cardio-respiratory system has simple low order dynamics and short memory around its average, and high order dynamics around rare fluctuations. PMID:24039811

  8. Formulating and testing a method for perturbing precipitation time series to reflect anticipated climatic changes

    NASA Astrophysics Data System (ADS)

    Jomo Danielsen Sørup, Hjalte; Georgiadis, Stylianos; Bülow Gregersen, Ida; Arnbjerg-Nielsen, Karsten

    2017-01-01

    Urban water infrastructure has very long planning horizons, and planning is thus very dependent on reliable estimates of the impacts of climate change. Many urban water systems are designed using time series with a high temporal resolution. To assess the impact of climate change on these systems, similarly high-resolution precipitation time series for future climate are necessary. Climate models cannot, at their current resolutions, provide these time series at the relevant scales. Known methods for stochastic downscaling of climate change to urban hydrological scales have known shortcomings in constructing realistic climate-changed precipitation time series at the sub-hourly scale. In the present study we present a deterministic methodology to perturb historical precipitation time series at the minute scale to reflect non-linear expectations of climate change. The methodology shows good skill in reproducing the expected climate-change signal in extremes at the event scale when evaluated at timescales from minutes to days. The methodology also shows good skill with respect to representing expected changes in seasonal precipitation. It is very robust to the actual magnitude of the expected changes as well as to their direction (increase or decrease), even for situations where the extremes are increasing in seasons that in general should have a decreasing trend in precipitation. The methodology can provide planners with valuable time series representing future climate that can be used as input to urban hydrological models and give better estimates of climate change impacts on these systems.

  9. Effects of Assuming Independent Component Failure Times, If They Are Actually Dependent, in a Series System.

    DTIC Science & Technology

    1985-11-26

    [Abstract not available; only OCR fragments of the DTIC report documentation page survive. They identify the performing organization as The Ohio State University Research Foundation, Columbus, Ohio, under grant AFOSR-82-0307, and mention estimated survival functions and the area under the estimated survival function up to a time t.]

  10. Effects of Assuming Independent Component Failure Times, If They Are Actually Dependent, In a Series System

    DTIC Science & Technology

    1988-05-31

    [Only OCR fragments of this report survive. They mention a chemotherapy regimen consisting of ifosfamide, VP-16, cis-platinum, and bleomycin, with twenty-four patients entered (staggered entry), and a figure of survival estimates for RFM/UM mice in which moribund animals are treated as independently censored.]

  11. Graphical Data Analysis on the Circle: Wrap-Around Time Series Plots for (Interrupted) Time Series Designs.

    PubMed

    Rodgers, Joseph Lee; Beasley, William Howard; Schuelke, Matthew

    2014-01-01

    Many data structures, particularly time series data, are naturally seasonal, cyclical, or otherwise circular. Past graphical methods for time series have focused on linear plots. In this article, we move graphical analysis onto the circle. We focus on 2 particular methods, one old and one new. Rose diagrams are circular histograms and can be produced in several different forms using the RRose software system. In addition, we propose, develop, illustrate, and provide software support for a new circular graphical method, called Wrap-Around Time Series Plots (WATS Plots), which is a graphical method useful to support time series analyses in general but in particular in relation to interrupted time series designs. We illustrate the use of WATS Plots with an interrupted time series design evaluating the effect of the Oklahoma City bombing on birthrates in Oklahoma County during the 10 years surrounding the bombing of the Murrah Building in Oklahoma City. We compare WATS Plots with linear time series representations and overlay them with smoothing and error bands. Each method is shown to have advantages in relation to the other; in our example, the WATS Plots more clearly show the existence and effect size of the fertility differential.

  12. Searching for periodicity in weighted time point series.

    NASA Astrophysics Data System (ADS)

    Jetsu, L.; Pelt, J.

    1996-09-01

    Consistent statistics for two methods of searching for periodicity in a series of weighted time points are formulated. An approach based on the bootstrap method to estimate the accuracy of detected periodicity is presented.

  13. A probability distribution approach to synthetic turbulence time series

    NASA Astrophysics Data System (ADS)

    Sinhuber, Michael; Bodenschatz, Eberhard; Wilczek, Michael

    2016-11-01

    The statistical features of turbulence can be described in terms of multi-point probability density functions (PDFs). The complexity of these statistical objects increases rapidly with the number of points. This raises the question of how much information has to be incorporated into statistical models of turbulence to capture essential features such as inertial-range scaling and intermittency. Using high Reynolds number hot-wire data obtained at the Variable Density Turbulence Tunnel at the Max Planck Institute for Dynamics and Self-Organization, we establish a PDF-based approach to generating synthetic time series that reproduce those features. To do this, we measure three-point conditional PDFs from the experimental data and use an adaption-rejection method to draw random velocities from this distribution to produce synthetic time series. Analyzing these synthetic time series, we find that time series based on even low-dimensional conditional PDFs already capture some essential features of real turbulent flows.
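The synthesis step can be sketched in its simplest form: estimate a conditional PDF from data and repeatedly draw the next value from it. The paper conditions on three previous points of wind-tunnel data; this sketch conditions on a single point of a surrogate AR(1) signal, which is the lowest-dimensional case of the same idea.

```python
import numpy as np

rng = np.random.default_rng(1)

# Surrogate "measured" signal standing in for hot-wire velocity data.
n = 20000
u = np.zeros(n)
for t in range(1, n):
    u[t] = 0.9 * u[t - 1] + rng.normal(0, 0.5)

# Estimate the conditional PDF p(u_t | u_{t-1}) as a binned transition matrix.
nbins = 30
edges = np.linspace(u.min(), u.max(), nbins + 1)
idx = np.clip(np.digitize(u, edges) - 1, 0, nbins - 1)
counts = np.zeros((nbins, nbins))
for a, b in zip(idx[:-1], idx[1:]):
    counts[a, b] += 1
rows = counts.sum(axis=1, keepdims=True)
probs = np.where(rows > 0, counts / np.maximum(rows, 1), 1.0 / nbins)
centers = 0.5 * (edges[:-1] + edges[1:])

# Synthesis: repeatedly draw the next value from the conditional PDF.
state = idx[0]
synth = np.empty(n)
for t in range(n):
    state = rng.choice(nbins, p=probs[state])
    synth[t] = centers[state]

# The synthetic series reproduces the lag-1 correlation of the original.
r_orig = np.corrcoef(u[:-1], u[1:])[0, 1]
r_synth = np.corrcoef(synth[:-1], synth[1:])[0, 1]
print(round(r_orig, 2), round(r_synth, 2))
```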

  14. A mixed time series model of binomial counts

    NASA Astrophysics Data System (ADS)

    Khoo, Wooi Chen; Ong, Seng Huat

    2015-10-01

    Continuous time series modelling has been an active research area in the past few decades. However, time series data in terms of correlated counts appear in many situations, such as the counts of rainy days and access downloads. Therefore, the study of count data has become popular in time series modelling recently. This article introduces a new mixture model, which is a univariate non-negative stationary time series model with binomial marginal distribution, arising from the combination of the well-known binomial thinning and Pegram's operators. A brief review of important properties will be carried out and the EM algorithm is applied in parameter estimation. A numerical study is presented to show the performance of the model. Finally, a potential real application will be presented to illustrate the advantage of the new mixture model.
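A toy generator combining the two operators named in the abstract is shown below: binomial thinning (alpha o X is a Binomial(X, alpha) draw) and a Pegram-style mixture that takes the thinned past value with probability phi and a fresh binomial innovation otherwise. Parameter values are illustrative, and the published model's exact parameterisation (chosen to preserve the binomial marginal) differs from this simplified version.

```python
import numpy as np

rng = np.random.default_rng(2)

m, p, alpha, phi = 10, 0.3, 0.6, 0.4
n = 50000

x = np.empty(n, dtype=int)
x[0] = rng.binomial(m, p)
for t in range(1, n):
    if rng.random() < phi:
        x[t] = rng.binomial(x[t - 1], alpha)   # binomial thinning of the past
    else:
        x[t] = rng.binomial(m, p)              # independent innovation

# Conditional linearity gives a lag-1 autocorrelation of phi * alpha = 0.24
# for this toy chain, so the mixture weight controls the memory directly.
r1 = np.corrcoef(x[:-1], x[1:])[0, 1]
print(round(x.mean(), 2), round(r1, 2))
```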

  15. Distinguishing chaotic time series from noise: A random matrix approach

    NASA Astrophysics Data System (ADS)

    Ye, Bin; Chen, Jianxing; Ju, Chen; Li, Huijun; Wang, Xuesong

    2017-03-01

    Deterministically chaotic systems can often give rise to random and unpredictable behaviors which make the time series obtained from them almost indistinguishable from noise. Motivated by the fact that data points in a chaotic time series will have intrinsic correlations between them, we propose a random matrix theory (RMT) approach to identify the deterministic or stochastic dynamics of the system. We show that the spectral distributions of the correlation matrices, constructed from the chaotic time series, deviate significantly from the predictions of random matrix ensembles. On the contrary, the eigenvalue statistics for a noisy signal follow closely those of random matrix ensembles. Numerical results also indicate that the approach is to some extent robust to additive observational noise which pollutes the data in many practical situations. Our approach is efficient in recognizing the continuous chaotic dynamics underlying the evolution of the time series.
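The contrast the abstract describes can be sketched as follows: build correlation matrices from delay vectors of (a) white noise and (b) a chaotic signal, and compare their eigenvalue spectra with the Marchenko-Pastur bound. The Lorenz system (crude Euler integration) is used here as a convenient chaotic test signal, a stand-in for the systems studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def correlation_spectrum(series, dim):
    # Stack delay vectors as rows, then form the dim x dim correlation matrix.
    X = np.lib.stride_tricks.sliding_window_view(series, dim)
    C = np.corrcoef(X, rowvar=False)
    return np.sort(np.linalg.eigvalsh(C))

# Chaotic signal: x-component of the Lorenz system.
dt, steps = 0.01, 6000
x, y, z = 1.0, 1.0, 1.0
traj = []
for _ in range(steps):
    x, y, z = (x + dt * 10 * (y - x),
               y + dt * (x * (28 - z) - y),
               z + dt * (x * y - 8 / 3 * z))
    traj.append(x)
chaos = np.asarray(traj[1000:])          # drop the transient

dim = 10
noise = rng.normal(size=len(chaos))
lam_noise = correlation_spectrum(noise, dim)
lam_chaos = correlation_spectrum(chaos, dim)

# For pure noise the eigenvalues stay near the Marchenko-Pastur bulk;
# intrinsic correlations push the top eigenvalue far above its upper edge.
q = dim / (len(chaos) - dim + 1)
mp_upper = (1 + np.sqrt(q)) ** 2
print(round(lam_noise[-1], 2), round(lam_chaos[-1], 2), round(mp_upper, 2))
```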

  16. The use of synthetic input sequences in time series modeling

    NASA Astrophysics Data System (ADS)

    de Oliveira, Dair José; Letellier, Christophe; Gomes, Murilo E. D.; Aguirre, Luis A.

    2008-08-01

    In many situations time series models obtained from noise-like data settle to trivial solutions under iteration. This Letter proposes a way of producing a synthetic (dummy) input, that is included to prevent the model from settling down to a trivial solution, while maintaining features of the original signal. Simulated benchmark models and a real time series of RR intervals from an ECG are used to illustrate the procedure.

  17. Fluctuation complexity of agent-based financial time series model by stochastic Potts system

    NASA Astrophysics Data System (ADS)

    Hong, Weijia; Wang, Jun

    2015-03-01

    Financial markets are complex evolved dynamic systems with high volatilities and noises, and the modeling and analysis of financial time series are regarded as rather challenging tasks in financial research. In this work, by applying the Potts dynamic system, a random agent-based financial time series model is developed in an attempt to uncover the empirical laws in finance, where the Potts model is introduced to imitate the trading interactions among the investing agents. Based on computer simulation in conjunction with statistical analysis and nonlinear analysis, we present numerical research to investigate the fluctuation behaviors of the proposed time series model. Furthermore, in order to reach a robust conclusion, we consider the daily returns of the Shanghai Composite Index and the Shenzhen Component Index, and a comparison analysis of return behaviors between the simulation data and the actual data is exhibited.

  18. Prediction of Long-Memory Time Series: A Tutorial Review

    NASA Astrophysics Data System (ADS)

    Bhansali, R. J.; Kokoszka, P. S.

    Two different approaches, called Type-I and Type-II, to linear least-squares prediction of a long-memory time series are distinguished. In the former, no new theory is required and a long-memory time series is treated on par with a standard short-memory time series and its multistep predictions are obtained by using the existing modelling approaches to prediction of such time series. The latter, by contrast, seeks to model the long-memory stochastic characteristics of the observed time series by a fractional process such that its dth fractional difference, 0 < d < 0.5, follows a standard short-memory process. The various approaches to constructing long-memory stochastic models are reviewed, and the associated question of parameter estimation for these models is discussed. Having fitted a long-memory stochastic model to a time series, linear multi-step forecasts of its future values are constructed from the model itself. The question of how to evaluate the multistep prediction constants is considered and three different methods proposed for doing so are outlined; it is further noted that, under appropriate regularity conditions, these methods apply also to the class of linear long memory processes with infinite variance. In addition, a brief review of the class of non-linear chaotic maps implying long-memory is given.
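The core operation of the Type-II approach, taking the dth fractional difference (1 - B)^d of a series, can be sketched from its binomial-expansion weights. The roundtrip check exploits the fact that the weight sequences of (1 - B)^d and (1 - B)^{-d} are formal power-series inverses under causal convolution.

```python
import numpy as np

def frac_diff_weights(d, n):
    # Binomial-expansion weights of (1 - B)^d: w_0 = 1, w_k = w_{k-1}(k-1-d)/k.
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def frac_diff(x, d):
    # Causal convolution with the (1 - B)^d weights, using all past values.
    n = len(x)
    w = frac_diff_weights(d, n)
    return np.array([w[:t + 1] @ x[t::-1] for t in range(n)])

rng = np.random.default_rng(4)
d = 0.3
x = rng.normal(size=500)

# Applying (1 - B)^{-d} after (1 - B)^d recovers the original series exactly.
roundtrip = frac_diff(frac_diff(x, d), -d)
print(np.allclose(roundtrip, x))
```

In the Type-II setting the fitted d is chosen so that the differenced series behaves as a standard short-memory process, which can then be modelled and forecast by the usual machinery.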

  19. Time Series Analysis of Insar Data: Methods and Trends

    NASA Technical Reports Server (NTRS)

    Osmanoglu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cano-Cabral, Enrique

    2015-01-01

    Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.
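The wrapping problem described above can be illustrated in one dimension: phase is only observed modulo 2*pi, and the displacement trend must be recovered by unwrapping. The numbers below are illustrative, not real InSAR measurements.

```python
import numpy as np

wavelength = 0.056                     # C-band radar wavelength, metres
t = np.linspace(0, 1, 50)              # one year of acquisitions
displacement = 0.12 * t                # 12 cm/yr subsidence (illustrative)
phase = 4 * np.pi * displacement / wavelength   # two-way path difference

# The observed interferometric phase is wrapped into (-pi, pi].
wrapped = np.angle(np.exp(1j * phase))

# Temporal unwrapping restores the physically meaningful trend, provided the
# per-step motion stays below half a wavelength (the ambiguity the review
# article's algorithms resolve in the much harder spatio-temporal case).
unwrapped = np.unwrap(wrapped)
recovered = unwrapped * wavelength / (4 * np.pi)
print(np.allclose(recovered, displacement))
```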

  20. Visualizing frequent patterns in large multivariate time series

    NASA Astrophysics Data System (ADS)

    Hao, M.; Marwah, M.; Janetzko, H.; Sharma, R.; Keim, D. A.; Dayal, U.; Patnaik, D.; Ramakrishnan, N.

    2011-01-01

    The detection of previously unknown, frequently occurring patterns in time series, often called motifs, has been recognized as an important task. However, it is difficult to discover and visualize these motifs as their numbers increase, especially in large multivariate time series. To find frequent motifs, we use several temporal data mining and event encoding techniques to cluster and convert a multivariate time series to a sequence of events. Then we quantify the efficiency of the discovered motifs by linking them with a performance metric. To visualize frequent patterns in a large time series with potentially hundreds of nested motifs on a single display, we introduce three novel visual analytics methods: (1) motif layout, using colored rectangles for visualizing the occurrences and hierarchical relationships of motifs in a multivariate time series, (2) motif distortion, for enlarging or shrinking motifs as appropriate for easy analysis and (3) motif merging, to combine a number of identical adjacent motif instances without cluttering the display. Analysts can interactively optimize the degree of distortion and merging to get the best possible view. A specific motif (e.g., the most efficient or least efficient motif) can be quickly detected from a large time series for further investigation. We have applied these methods to two real-world data sets: data center cooling and oil well production. The results provide important new insights into the recurring patterns.
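The encode-then-count stage that precedes the visualization can be sketched as follows: discretise the series into symbols, then count fixed-length symbol windows as candidate motifs. The planted pattern and thresholds are illustrative, not the event-encoding scheme of the paper.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(5)

n = 600
series = rng.normal(size=n)
pattern = np.array([3.0, -3.0, 3.0, -3.0])   # planted recurring motif
for start in range(0, n - 4, 20):
    series[start:start + 4] = pattern

# Event encoding: map each value to one of three symbols by tercile.
lo, hi = np.quantile(series, [1 / 3, 2 / 3])
symbols = np.where(series < lo, "L", np.where(series > hi, "H", "M"))

# Count all length-4 symbol windows; the planted motif dominates the counts.
windows = Counter(tuple(symbols[i:i + 4]) for i in range(n - 3))
motif, count = windows.most_common(1)[0]
print(motif, count)
```

Real motif discovery must also handle multivariate series and near-matches, but the frequency table above is the raw material that the motif layout and merging views then make readable.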

  1. A method for detecting changes in long time series

    SciTech Connect

    Downing, D.J.; Lawkins, W.F.; Morris, M.D.; Ostrouchov, G.

    1995-09-01

    Modern scientific activities, both physical and computational, can result in time series of many thousands or even millions of data values. Here the authors describe a statistically motivated algorithm for quick screening of very long time series data for the presence of potentially interesting but arbitrary changes. The basic data model is a stationary Gaussian stochastic process, and the approach to detecting a change is the comparison of two predictions of the series at a time point or contiguous collection of time points. One prediction is a "forecast", i.e. based on data from earlier times, while the other is a "backcast", i.e. based on data from later times. The statistic is the absolute value of the log-likelihood ratio for these two predictions, evaluated at the observed data. A conservative procedure is suggested for specifying critical values for the statistic under the null hypothesis of "no change".
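A much-simplified sketch of the forecast/backcast comparison: predict each point from a window of earlier values and from a window of later values, and score the disagreement with a Gaussian log-likelihood ratio. The window-mean predictors, window size, and noise level are illustrative stand-ins for the paper's stationary-Gaussian-process predictions.

```python
import numpy as np

rng = np.random.default_rng(6)

n, w, sigma = 400, 20, 1.0
x = rng.normal(0, sigma, n)
change = 250
x[change:] += 6.0                      # a level shift to be detected

stat = np.zeros(n)
for t in range(w, n - w):
    f = x[t - w:t].mean()              # "forecast" from earlier data
    b = x[t + 1:t + 1 + w].mean()      # "backcast" from later data
    # |log L_f - log L_b| for Gaussian predictive densities, equal sigma.
    stat[t] = abs((x[t] - b) ** 2 - (x[t] - f) ** 2) / (2 * sigma ** 2)

print(int(np.argmax(stat)))
```

Away from the change the two predictions agree and the statistic stays near zero; the screening pass only needs to flag where it spikes.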

  2. Symplectic geometry spectrum regression for prediction of noisy time series

    NASA Astrophysics Data System (ADS)

    Xie, Hong-Bo; Dokos, Socrates; Sivakumar, Bellie; Mengersen, Kerrie

    2016-05-01

    We present the symplectic geometry spectrum regression (SGSR) technique as well as a regularized method based on SGSR for prediction of nonlinear time series. The main tool of analysis is the symplectic geometry spectrum analysis, which decomposes a time series into the sum of a small number of independent and interpretable components. The key to successful regularization is to damp higher order symplectic geometry spectrum components. The effectiveness of SGSR and its superiority over local approximation using ordinary least squares are demonstrated through prediction of two noisy synthetic chaotic time series (Lorenz and Rössler series), and then tested for prediction of three real-world data sets (Mississippi River flow data and electromyographic and mechanomyographic signals recorded from the human body).

  3. Similarity estimators for irregular and age-uncertain time series

    NASA Astrophysics Data System (ADS)

    Rehfeld, K.; Kurths, J.

    2014-01-01

    Paleoclimate time series are often irregularly sampled and age uncertain, which is an important technical challenge to overcome for successful reconstruction of past climate variability and dynamics. Visual comparison and interpolation-based linear correlation approaches have been used to infer dependencies from such proxy time series. While the first is subjective, not measurable and not suitable for the comparison of many data sets at a time, the latter introduces interpolation bias, and both face difficulties if the underlying dependencies are nonlinear. In this paper we investigate similarity estimators that could be suitable for the quantitative investigation of dependencies in irregular and age-uncertain time series. We compare the Gaussian-kernel-based cross-correlation (gXCF, Rehfeld et al., 2011) and mutual information (gMI, Rehfeld et al., 2013) against their interpolation-based counterparts and the new event synchronization function (ESF). We test the efficiency of the methods in estimating coupling strength and coupling lag numerically, using ensembles of synthetic stalagmites with short, autocorrelated, linear and nonlinearly coupled proxy time series, and in the application to real stalagmite time series. In the linear test case, coupling strength increases are identified consistently for all estimators, while in the nonlinear test case the correlation-based approaches fail. The lag at which the time series are coupled is identified correctly as the maximum of the similarity functions in around 60-55% (in the linear case) to 53-42% (for the nonlinear processes) of the cases when the dating of the synthetic stalagmite is perfectly precise. If the age uncertainty increases beyond 5% of the time series length, however, the true coupling lag is not identified more often than the others for which the similarity function was estimated. Age uncertainty contributes up to half of the uncertainty in the similarity estimation process. Time series irregularity
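The Gaussian-kernel idea can be sketched directly: every observation pair contributes to the correlation at lag k with a weight that decays with how far the pair's time difference is from k, so no interpolation onto a regular grid is needed. The signal, bandwidth, and normalization below are illustrative, not the stalagmite data or the exact gXCF estimator of the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

def gxcf(tx, x, ty, y, lags, h=0.5):
    xs = (x - x.mean()) / x.std()
    ys = (y - y.mean()) / y.std()
    dt = ty[None, :] - tx[:, None]           # all pairwise time differences
    prods = xs[:, None] * ys[None, :]
    out = []
    for k in lags:
        w = np.exp(-((dt - k) ** 2) / (2 * h ** 2))
        out.append((w * prods).sum() / w.sum())
    return np.array(out)

true_lag = 2.0
tx = np.sort(rng.uniform(0, 100, 300))       # irregular sampling times
ty = np.sort(rng.uniform(0, 100, 300))
signal = lambda t: np.sin(0.3 * t)
x = signal(tx) + 0.2 * rng.normal(size=300)
y = signal(ty - true_lag) + 0.2 * rng.normal(size=300)

lags = np.linspace(-5, 5, 101)
est = lags[np.argmax(gxcf(tx, x, ty, y, lags))]
print(round(est, 1))
```

The maximum of the similarity function over the lag grid is exactly the coupling-lag readout the abstract evaluates against age uncertainty.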

  4. Similarity estimators for irregular and age uncertain time series

    NASA Astrophysics Data System (ADS)

    Rehfeld, K.; Kurths, J.

    2013-09-01

    Paleoclimate time series are often irregularly sampled and age uncertain, which is an important technical challenge to overcome for successful reconstruction of past climate variability and dynamics. Visual comparison and interpolation-based linear correlation approaches have been used to infer dependencies from such proxy time series. While the first is subjective, not measurable and not suitable for the comparison of many datasets at a time, the latter introduces interpolation bias, and both face difficulties if the underlying dependencies are nonlinear. In this paper we investigate similarity estimators that could be suitable for the quantitative investigation of dependencies in irregular and age uncertain time series. We compare the Gaussian-kernel based cross correlation (gXCF, Rehfeld et al., 2011) and mutual information (gMI, Rehfeld et al., 2013) against their interpolation-based counterparts and the new event synchronization function (ESF). We test the efficiency of the methods in estimating coupling strength and coupling lag numerically, using ensembles of synthetic stalagmites with short, autocorrelated, linear and nonlinearly coupled proxy time series, and in the application to real stalagmite time series. In the linear test case coupling strength increases are identified consistently for all estimators, while in the nonlinear test case the correlation-based approaches fail. The lag at which the time series are coupled is identified correctly as the maximum of the similarity functions in around 60-55% (in the linear case) to 53-42% (for the nonlinear processes) of the cases when the dating of the synthetic stalagmite is perfectly precise. If the age uncertainty increases beyond 5% of the time series length, however, the true coupling lag is not identified more often than the others for which the similarity function was estimated. Age uncertainty contributes up to half of the uncertainty in the similarity estimation process. Time series irregularity

  5. Analyses of Inhomogeneities in Radiosonde Temperature and Humidity Time Series.

    NASA Astrophysics Data System (ADS)

    Zhai, Panmao; Eskridge, Robert E.

    1996-04-01

    Twice daily radiosonde data from selected stations in the United States (period 1948 to 1990) and China (period 1958 to 1990) were sorted into time series. These stations have one sounding taken in darkness and the other in sunlight. The analysis shows that the 0000 and 1200 UTC time series are highly correlated. Therefore, the Easterling and Peterson technique was tested on the 0000 and 1200 UTC time series to detect inhomogeneities and to estimate the size of the biases. Discontinuities were detected using the difference series created from the 0000 and 1200 UTC time series. To establish that the detected bias was significant, a t test was performed to confirm that the change occurs in the daytime series but not in the nighttime series. Both U.S. and Chinese radiosonde temperature and humidity data include inhomogeneities caused by changes in radiosonde sensors and observation times. The U.S. humidity data have inhomogeneities that were caused by instrument changes and the censoring of data. The practice of reporting relative humidity as 19% when it is lower than 20% or the temperature is below -40°C is called censoring. This combination of procedural and instrument changes makes the detection of biases and adjustment of the data very difficult. In the Chinese temperatures, there are inhomogeneities related to a change in the radiation correction procedure. Test results demonstrate that a modified Easterling and Peterson method is suitable for use in detecting and adjusting time series radiosonde data. Accurate station histories are very desirable. Station histories can confirm that detected inhomogeneities are related to instrument or procedural changes. Adjustments can then be made to the data with some confidence.
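The detection logic can be sketched with simulated data: a difference series between the two daily soundings isolates a radiation-dependent instrument bias, and a two-sample t test confirms that the shift appears in the daytime series only. The bias size, noise level, and change point are illustrative.

```python
import numpy as np

rng = np.random.default_rng(8)

n, change, bias = 240, 120, 1.5          # months, change index, bias in K
night = 10 + rng.normal(0, 0.8, n)       # nighttime soundings (unbiased)
day = 14 + rng.normal(0, 0.8, n)
day[change:] += bias                     # solar-radiation bias after change

def t_stat(a, b):
    # Two-sample t statistic with pooled variance.
    va, vb = a.var(ddof=1), b.var(ddof=1)
    sp = np.sqrt(((len(a) - 1) * va + (len(b) - 1) * vb)
                 / (len(a) + len(b) - 2))
    return (a.mean() - b.mean()) / (sp * np.sqrt(1 / len(a) + 1 / len(b)))

diff = day - night                       # daytime-minus-nighttime series
t_diff = t_stat(diff[change:], diff[:change])
t_night = t_stat(night[change:], night[:change])
print(round(t_diff, 1), round(t_night, 1))
```

The difference series cancels the common climate signal, so the instrument change stands out even when both raw series are dominated by real variability.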

  6. Exploratory Causal Analysis in Bivariate Time Series Data

    NASA Astrophysics Data System (ADS)

    McCracken, James M.

    Many scientific disciplines rely on observational data of systems for which it is difficult (or impossible) to implement controlled experiments, and data analysis techniques are required for identifying causal information and relationships directly from observational data. This need has led to the development of many different time series causality approaches and tools, including transfer entropy, convergent cross-mapping (CCM), and Granger causality statistics. In this thesis, the existing time series causality method of CCM is extended by introducing a new method called pairwise asymmetric inference (PAI). It is found that CCM may provide counter-intuitive causal inferences for simple dynamics with strong intuitive notions of causality, and the CCM causal inference can be a function of physical parameters that are seemingly unrelated to the existence of a driving relationship in the system. For example, a CCM causal inference might alternate between "voltage drives current" and "current drives voltage" as the frequency of the voltage signal is changed in a series circuit with a single resistor and inductor. PAI is introduced to address both of these limitations. Many of the current approaches in the time series causality literature are not computationally straightforward to apply, do not follow directly from assumptions of probabilistic causality, depend on assumed models for the time series generating process, or rely on embedding procedures. A new approach, called causal leaning, is introduced in this work to avoid these issues. The leaning is found to provide causal inferences that agree with intuition for both simple systems and more complicated empirical examples, including space weather data sets. The leaning may provide a clearer interpretation of the results than those from existing time series causality tools. A practicing analyst can explore the literature to find many proposals for identifying drivers and causal connections in time series data.
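Of the existing tools the thesis surveys, the Granger statistic is the simplest to sketch: does adding lagged x reduce the residual variance of a linear prediction of y? The coupling strength and lag structure below are illustrative; this is the model-dependence the thesis's leaning approach is designed to avoid.

```python
import numpy as np

rng = np.random.default_rng(13)

n = 2000
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.3 * rng.normal()

def resid_var(target, predictors):
    beta, *_ = np.linalg.lstsq(predictors, target, rcond=None)
    return (target - predictors @ beta).var()

yt, ylag, xlag = y[1:], y[:-1], x[:-1]
restricted = resid_var(yt, np.column_stack([np.ones(n - 1), ylag]))
full = resid_var(yt, np.column_stack([np.ones(n - 1), ylag, xlag]))
print(round(restricted / full, 1))   # ratio > 1 when x helps predict y
```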

  7. Modeling Non-Gaussian Time Series with Nonparametric Bayesian Model.

    PubMed

    Xu, Zhiguang; MacEachern, Steven; Xu, Xinyi

    2015-02-01

    We present a class of Bayesian copula models whose major components are the marginal (limiting) distribution of a stationary time series and the internal dynamics of the series. We argue that these are the two features with which an analyst is typically most familiar, and hence that these are natural components with which to work. For the marginal distribution, we use a nonparametric Bayesian prior distribution along with a cdf-inverse cdf transformation to obtain large support. For the internal dynamics, we rely on the traditionally successful techniques of normal-theory time series. Coupling the two components gives us a family of (Gaussian) copula transformed autoregressive models. The models provide coherent adjustments of time scales and are compatible with many extensions, including changes in volatility of the series. We describe basic properties of the models, show their ability to recover non-Gaussian marginal distributions, and use a GARCH modification of the basic model to analyze stock index return series. The models are found to provide better fit and improved short-range and long-range predictions than Gaussian competitors. The models are extensible to a large variety of fields, including continuous time models, spatial models, models for multiple series, models driven by external covariate streams, and non-stationary models.
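The cdf / inverse-cdf coupling can be sketched with stdlib tools: Gaussian AR(1) dynamics on a latent scale, pushed through the normal cdf and a quantile function to obtain an arbitrary marginal. The exponential marginal and fixed AR coefficient are illustrative stand-ins for the paper's nonparametric Bayesian marginal and fitted normal-theory dynamics.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(9)
nd = NormalDist()

# Latent Gaussian AR(1) with unit stationary variance.
phi, n = 0.8, 3000
z = np.zeros(n)
for t in range(1, n):
    z[t] = phi * z[t - 1] + rng.normal(0, np.sqrt(1 - phi ** 2))

# Copula map: normal cdf, then the Exp(mean=2) quantile function.
observed = np.array([-2.0 * np.log(1 - nd.cdf(zi)) for zi in z])

# Analysis direction: empirical cdf back to the normal scale, where the
# AR coefficient can be read off as a lag-1 correlation.
sorted_obs = np.sort(observed)
ranks = np.searchsorted(sorted_obs, observed, side="right")
z_hat = np.array([nd.inv_cdf(p) for p in ranks / (n + 1)])
phi_hat = np.corrcoef(z_hat[:-1], z_hat[1:])[0, 1]
print(round(observed.mean(), 2), round(phi_hat, 2))
```

Because both maps are monotone, the marginal and the dynamics can be chosen independently, which is the separation of concerns the abstract argues is natural for an analyst.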

  8. Evaluation of Scaling Invariance Embedded in Short Time Series

    PubMed Central

    Pan, Xue; Hou, Lei; Stephen, Mutua; Yang, Huijie; Zhu, Chenping

    2014-01-01

    Scaling invariance of time series has been making great contributions in diverse research fields. But how to evaluate the scaling exponent from a real-world series is still an open problem. Finite length of time series may induce unacceptable fluctuation and bias to statistical quantities and consequent invalidation of currently used standard methods. In this paper a new concept called correlation-dependent balanced estimation of diffusion entropy is developed to evaluate scale invariance in very short time series with length ~10². Calculations with specified Hurst exponent values of 0.2, 0.3, …, 0.9 show that by using the standard central moving average de-trending procedure this method can evaluate the scaling exponents for short time series with ignorable bias (≤0.03) and sharp confidence interval (standard deviation ≤0.05). Considering the stride series from ten volunteers along an approximate oval path of a specified length, we observe that though the averages and deviations of scaling exponents are close, their evolutionary behaviors display rich patterns. It has potential use in analyzing physiological signals, detecting early warning signals, and so on. As an emphasis, our core contribution is that by means of the proposed method one can precisely estimate Shannon entropy from limited records. PMID:25549356

  9. Evaluation of scaling invariance embedded in short time series.

    PubMed

    Pan, Xue; Hou, Lei; Stephen, Mutua; Yang, Huijie; Zhu, Chenping

    2014-01-01

    Scaling invariance of time series has been making great contributions in diverse research fields. But how to evaluate the scaling exponent from a real-world series is still an open problem. Finite length of time series may induce unacceptable fluctuation and bias to statistical quantities and consequent invalidation of currently used standard methods. In this paper a new concept called correlation-dependent balanced estimation of diffusion entropy is developed to evaluate scale invariance in very short time series with length ~10². Calculations with specified Hurst exponent values of 0.2, 0.3, …, 0.9 show that by using the standard central moving average de-trending procedure this method can evaluate the scaling exponents for short time series with ignorable bias (≤0.03) and sharp confidence interval (standard deviation ≤0.05). Considering the stride series from ten volunteers along an approximate oval path of a specified length, we observe that though the averages and deviations of scaling exponents are close, their evolutionary behaviors display rich patterns. It has potential use in analyzing physiological signals, detecting early warning signals, and so on. As an emphasis, our core contribution is that by means of the proposed method one can precisely estimate Shannon entropy from limited records.

  10. Statistical modelling of agrometeorological time series by exponential smoothing

    NASA Astrophysics Data System (ADS)

    Murat, Małgorzata; Malinowska, Iwona; Hoffmann, Holger; Baranowski, Piotr

    2016-01-01

    Meteorological time series are used in modelling agrophysical processes of the soil-plant-atmosphere system which determine plant growth and yield. Additionally, long-term meteorological series are used in climate change scenarios. Such studies often require forecasting or projection of meteorological variables, e.g. the projection of the occurrence of extreme events. The aim of the article was to determine the most suitable exponential smoothing models to generate forecasts of air temperature, wind speed, and precipitation time series in Jokioinen (Finland), Dikopshof (Germany), Lleida (Spain), and Lublin (Poland). These series exhibit regular additive seasonality or non-seasonality without any trend, which is confirmed by their autocorrelation functions and partial autocorrelation functions. The most suitable models were indicated by the smallest mean absolute error and the smallest root mean squared error.
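The model-selection step can be sketched for the non-seasonal core of the method: fit simple exponential smoothing over a grid of smoothing constants and keep the one with the smallest RMSE of one-step-ahead forecasts (the article also uses mean absolute error and seasonal variants). The drifting series below is illustrative, not the station data.

```python
import numpy as np

rng = np.random.default_rng(10)

def ses_one_step_errors(x, alpha):
    level = x[0]
    errors = []
    for value in x[1:]:
        errors.append(value - level)       # one-step-ahead forecast error
        level = alpha * value + (1 - alpha) * level
    return np.array(errors)

# Illustrative slowly drifting "temperature" series.
x = np.cumsum(rng.normal(0, 0.3, 500)) + 10

alphas = np.linspace(0.05, 0.95, 19)
rmse = [np.sqrt(np.mean(ses_one_step_errors(x, a) ** 2)) for a in alphas]
best = alphas[int(np.argmin(rmse))]

# Compare against a naive constant-mean forecast.
rmse_mean = np.sqrt(np.mean((x - x.mean()) ** 2))
print(round(best, 2), round(min(rmse), 3), round(rmse_mean, 3))
```

For a drifting series the selected smoothing constant is large, since recent observations carry most of the predictive information.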

  11. Self-affinity in the dengue fever time series

    NASA Astrophysics Data System (ADS)

    Azevedo, S. M.; Saba, H.; Miranda, J. G. V.; Filho, A. S. Nascimento; Moret, M. A.

    2016-06-01

    Dengue is a complex public health problem that is common in tropical and subtropical regions. This disease has risen substantially in the last three decades, and the physical symptoms depict the self-affine behavior of the occurrences of reported dengue cases in Bahia, Brazil. This study uses detrended fluctuation analysis (DFA) to verify the scale behavior in a time series of dengue cases and to evaluate the long-range correlations that are characterized by the power law α exponent for different cities in Bahia, Brazil. The scaling exponent (α) presents different long-range correlations, i.e. uncorrelated, anti-persistent, persistent and diffusive behaviors. The long-range correlations highlight the complex behavior of the time series of this disease. The findings show that there are two distinct types of scale behavior. In the first behavior, the time series presents a persistent α exponent for a one-month period. For large periods, the time series signal approaches subdiffusive behavior. The hypothesis of the long-range correlations in the time series of the occurrences of reported dengue cases was validated. The observed self-affinity is useful as a forecasting tool for future periods through extrapolation of the α exponent behavior. This complex system has a higher predictability in a relatively short time (approximately one month), and it suggests a new tool in epidemiological control strategies. However, predictions for large periods using DFA are hidden by the subdiffusive behavior.
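A minimal DFA implementation shows how the scaling exponent alpha is read off: integrate the series, detrend linearly in boxes of size n, and fit the power-law growth of the RMS fluctuation F(n). The test signals are synthetic; the dengue series of the study would be analyzed the same way.

```python
import numpy as np

rng = np.random.default_rng(11)

def dfa_alpha(x, scales):
    # Detrended fluctuation analysis with linear (first-order) detrending.
    y = np.cumsum(x - x.mean())
    logF = []
    for n in scales:
        t = np.arange(n)
        f2 = []
        for b in range(len(y) // n):
            seg = y[b * n:(b + 1) * n]
            coef = np.polyfit(t, seg, 1)
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        logF.append(0.5 * np.log(np.mean(f2)))
    slope, _ = np.polyfit(np.log(scales), logF, 1)
    return slope

scales = np.array([8, 16, 32, 64, 128])
white = rng.normal(size=5000)            # uncorrelated: expect alpha ~ 0.5
walk = np.cumsum(rng.normal(size=5000))  # diffusive: expect alpha ~ 1.5
alpha_white = dfa_alpha(white, scales)
alpha_walk = dfa_alpha(walk, scales)
print(round(alpha_white, 2), round(alpha_walk, 2))
```

Exponents between these extremes mark the persistent and anti-persistent regimes the abstract describes for the dengue counts.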

  12. Stochastic modeling of hourly rainfall times series in Campania (Italy)

    NASA Astrophysics Data System (ADS)

    Giorgio, M.; Greco, R.

    2009-04-01

    Occurrence of flowslides and floods in small catchments is difficult to predict, since it is affected by a number of variables, such as mechanical and hydraulic soil properties, slope morphology, vegetation coverage, and rainfall spatial and temporal variability. Consequently, landslide risk assessment procedures and early warning systems still rely on simple empirical models based on correlation between recorded rainfall data and observed landslides and/or river discharges. The effectiveness of such systems could be improved by reliable quantitative rainfall prediction, which can allow gaining larger lead times. Analysis of on-site recorded rainfall height time series represents the most effective approach for a reliable prediction of the local temporal evolution of rainfall. Hydrological time series analysis is a widely studied field in hydrology, often carried out by means of autoregressive models, such as AR, ARMA, ARX, ARMAX (e.g. Salas [1992]). Such models gave the best results when applied to the analysis of autocorrelated hydrological time series, like river flow or level time series. Conversely, they are not able to model the behaviour of intermittent time series, like point rainfall height series usually are, especially when recorded with short sampling time intervals. More useful for this issue are the so-called DRIP (Disaggregated Rectangular Intensity Pulse) and NSRP (Neyman-Scott Rectangular Pulse) models [Heneker et al., 2001; Cowpertwait et al., 2002], usually adopted to generate synthetic point rainfall series. In this paper, the DRIP model approach is adopted, in which the sequence of rain storms and dry intervals constituting the structure of rainfall time series is modeled as an alternating renewal process. The final aim of the study is to provide a useful tool to implement an early warning system for hydrogeological risk management. Model calibration has been carried out with hourly rainfall height data provided by the rain gauges of Campania Region civil
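An alternating-renewal rainfall generator in the spirit of the rectangular-pulse models cited above can be sketched as follows: dry and wet spell durations are drawn from exponential distributions, and each wet spell carries one constant (rectangular) intensity. All parameter values are illustrative, not the calibrated Campania values.

```python
import numpy as np

rng = np.random.default_rng(14)

mean_dry, mean_wet, mean_intensity = 30.0, 6.0, 2.0   # hours, hours, mm/h
horizon = 100000                                       # hourly steps
rain = []
while len(rain) < horizon:
    dry = int(rng.exponential(mean_dry))               # dry interval
    wet = int(rng.exponential(mean_wet))               # storm duration
    intensity = rng.exponential(mean_intensity)        # rectangular pulse
    rain.extend([0.0] * dry + [intensity] * wet)
rain = np.asarray(rain[:horizon])

# Long-run wet fraction approaches mean_wet / (mean_dry + mean_wet).
wet_frac = (rain > 0).mean()
print(round(wet_frac, 3))
```

The alternating structure reproduces the intermittency that defeats AR-family models, which is the motivation the abstract gives for the renewal-process formulation.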

  13. Compounding approach for univariate time series with nonstationary variances

    NASA Astrophysics Data System (ADS)

    Schäfer, Rudi; Barkhofen, Sonja; Guhr, Thomas; Stöckmann, Hans-Jürgen; Kuhl, Ulrich

    2015-12-01

    A defining feature of nonstationary systems is the time dependence of their statistical parameters. Measured time series may exhibit Gaussian statistics on short time horizons, due to the central limit theorem. The sample statistics for long time horizons, however, averages over the time-dependent variances. To model the long-term statistical behavior, we compound the local distribution with the distribution of its parameters. Here, we consider two concrete, but diverse, examples of such nonstationary systems: the turbulent air flow of a fan and a time series of foreign exchange rates. Our main focus is to empirically determine the appropriate parameter distribution for the compounding approach. To this end, we extract the relevant time scales by decomposing the time signals into windows and determine the distribution function of the thus obtained local variances.
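The compounding effect can be demonstrated numerically: locally Gaussian fluctuations whose variance changes from window to window produce heavy tails in the long-horizon sample statistics. The lognormal variance distribution below is an illustrative choice, not the one determined empirically in the paper.

```python
import numpy as np

rng = np.random.default_rng(12)

n_windows, window = 200, 250
local_var = rng.lognormal(mean=0.0, sigma=0.7, size=n_windows)
series = np.concatenate([
    rng.normal(0, np.sqrt(v), window) for v in local_var
])

def excess_kurtosis(x):
    z = (x - x.mean()) / x.std()
    return np.mean(z ** 4) - 3.0

# The compounded series is leptokurtic; a single-variance Gaussian is not.
gauss = rng.normal(size=n_windows * window)
k_comp = excess_kurtosis(series)
k_gauss = excess_kurtosis(gauss)
print(round(k_comp, 2), round(k_gauss, 2))
```

Extracting the per-window variances and fitting their distribution, the empirical step the abstract emphasizes, then fully specifies the compounded long-horizon distribution.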

  14. Multiscale multifractal diffusion entropy analysis of financial time series

    NASA Astrophysics Data System (ADS)

    Huang, Jingjing; Shang, Pengjian

    2015-02-01

    This paper introduces a multiscale multifractal diffusion entropy analysis (MMDEA) method to analyze long-range correlations and then applies this method to stock index series. The method combines the techniques of diffusion process and Rényi entropy to focus on the scaling behaviors of stock index series across multiple scales, which allows us to extend the description of stock index variability to include the dependence on the magnitude of the variability and time scale. Compared to multifractal diffusion entropy analysis, the MMDEA can show more details of scale properties and provide a more reliable analysis. In this paper, we concentrate not only on the fact that the stock index series has multifractal properties but also that these properties depend on the time scale in which the multifractality is measured. This time scale is related to the frequency band of the signal. We find that stock index variability appears to be far more complex than reported in studies using a fixed time scale.

  15. Generalized Dynamic Factor Models for Mixed-Measurement Time Series

    PubMed Central

    Cui, Kai; Dunson, David B.

    2013-01-01

    In this article, we propose generalized Bayesian dynamic factor models for jointly modeling mixed-measurement time series. The framework allows mixed-scale measurements associated with each time series, with different measurements having different distributions in the exponential family conditionally on time-varying latent factor(s). Efficient Bayesian computational algorithms are developed for posterior inference on both the latent factors and model parameters, based on a Metropolis-Hastings algorithm with adaptive proposals. The algorithm relies on a Greedy Density Kernel Approximation (GDKA) and parameter expansion with latent factor normalization. We tested the framework and algorithms in simulated studies and applied them to the analysis of intertwined credit and recovery risk for Moody’s rated firms from 1982–2008, illustrating the importance of jointly modeling mixed-measurement time series. The article has supplemental materials available online. PMID:24791133

  16. Generalized Dynamic Factor Models for Mixed-Measurement Time Series.

    PubMed

    Cui, Kai; Dunson, David B

    2014-02-12

    In this article, we propose generalized Bayesian dynamic factor models for jointly modeling mixed-measurement time series. The framework allows mixed-scale measurements associated with each time series, with different measurements having different distributions in the exponential family conditionally on time-varying latent factor(s). Efficient Bayesian computational algorithms are developed for posterior inference on both the latent factors and model parameters, based on a Metropolis-Hastings algorithm with adaptive proposals. The algorithm relies on a Greedy Density Kernel Approximation (GDKA) and parameter expansion with latent factor normalization. We tested the framework and algorithms in simulated studies and applied them to the analysis of intertwined credit and recovery risk for Moody's rated firms from 1982-2008, illustrating the importance of jointly modeling mixed-measurement time series. The article has supplemental materials available online.

  17. A refined fuzzy time series model for stock market forecasting

    NASA Astrophysics Data System (ADS)

    Jilani, Tahseen Ahmed; Burney, Syed Muhammad Aqil

    2008-05-01

    Time series models have been used to make predictions of stock prices, academic enrollments, weather, road accident casualties, etc. In this paper we present a simple time-variant fuzzy time series forecasting method. The proposed method uses a heuristic approach to define frequency-density-based partitions of the universe of discourse. We have proposed a fuzzy metric to use the frequency-density-based partitioning. The proposed fuzzy metric also uses a trend predictor to calculate the forecast. The new method is applied to forecasting the TAIEX and enrollments at the University of Alabama. It is shown that the proposed method works with higher accuracy than other fuzzy time series methods developed for forecasting the TAIEX and enrollments of the University of Alabama.

  18. An introduction to chaotic and random time series analysis

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey D.

    1989-01-01

    The origin of chaotic behavior and the relation of chaos to randomness are explained. Two mathematical results are described: (1) a representation theorem guarantees the existence of a specific time-domain model for chaos and addresses the relation between chaotic, random, and strictly deterministic processes; (2) a theorem assures that information on the behavior of a physical system in its complete state space can be extracted from time-series data on a single observable. Focus is placed on an important connection between the dynamical state space and an observable time series. These two results lead to a practical deconvolution technique combining standard random process modeling methods with new embedding techniques.
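    The state-space reconstruction from a single observable that this abstract refers to is usually carried out by time-delay embedding. A minimal sketch, with the dimension and delay chosen arbitrarily for illustration:

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Time-delay embedding: reconstruct a dim-dimensional state space
    from a single observable by stacking lagged copies of the series."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

x = np.sin(np.linspace(0, 20 * np.pi, 2000))   # toy observable
emb = delay_embed(x, dim=2, tau=25)
print(emb.shape)   # each row is one reconstructed state vector
```

    For a sine wave the 2-D embedding traces an ellipse; for a chaotic observable the same construction unfolds the attractor, which is the starting point for the modeling techniques the abstract describes.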

  19. Degree-Pruning Dynamic Programming Approaches to Central Time Series Minimizing Dynamic Time Warping Distance.

    PubMed

    Sun, Tao; Liu, Hongbo; Yu, Hong; Chen, C L Philip

    2016-06-28

    The central time series crystallizes the common patterns of the set it represents. In this paper, we propose a global constrained degree-pruning dynamic programming (g(dp)²) approach to obtain the central time series by minimizing the dynamic time warping (DTW) distance between two time series. The DTW matching-path theory with global constraints is proved theoretically for our degree-pruning strategy, which helps reduce the time complexity and computational cost. Our approach can achieve the optimal solution between two time series. An approximate method for the central time series of multiple time series [called m_g(dp)²] is presented based on DTW barycenter averaging and our g(dp)² approach with a hierarchical merging strategy. As illustrated by the experimental results, our approaches provide better within-group sum of squares and robustness than other relevant algorithms.
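    For reference, the quantity being minimized is the classic DTW distance, defined by the dynamic-programming recursion below. This is the textbook O(nm) version, not the paper's degree-pruning g(dp)² variant.

```python
import numpy as np

def dtw_distance(a, b):
    """Unconstrained dynamic-programming DTW distance between two
    sequences, with absolute difference as the local cost."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible predecessor paths
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Warping absorbs the repeated sample, so the distance is zero:
print(dtw_distance([1, 2, 3], [1, 2, 2, 3]))
```

    Global constraints (e.g. a Sakoe-Chiba band) restrict which (i, j) cells are filled; the paper's degree-pruning strategy exploits such constraints to cut the cost of this table.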

  20. Wavelet analysis for non-stationary, nonlinear time series

    NASA Astrophysics Data System (ADS)

    Schulte, Justin A.

    2016-08-01

    Methods for detecting and quantifying nonlinearities in nonstationary time series are introduced and developed. In particular, higher-order wavelet analysis was applied to an ideal time series and the quasi-biennial oscillation (QBO) time series. Multiple-testing problems inherent in wavelet analysis were addressed by controlling the false discovery rate. A new local autobicoherence spectrum facilitated the detection of local nonlinearities and the quantification of cycle geometry. The local autobicoherence spectrum of the QBO time series showed that the QBO time series contained a mode with a period of 28 months that was phase coupled to a harmonic with a period of 14 months. An additional nonlinearly interacting triad was found among modes with periods of 10, 16 and 26 months. Local biphase spectra determined that the nonlinear interactions were not quadratic and that the effect of the nonlinearities was to produce non-smoothly varying oscillations. The oscillations were found to be skewed so that negative QBO regimes were preferred, and also asymmetric in the sense that phase transitions between the easterly and westerly phases occurred more rapidly than those from westerly to easterly regimes.

  1. Time Series Analysis Based on Running Mann Whitney Z Statistics

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
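    The pipeline this (truncated) abstract describes — rank data in moving windows, form Mann-Whitney U statistics, normalize to Z — can be sketched as follows. The window length and change point are illustrative, and the abstract's Monte Carlo normalization is replaced here by the standard large-sample normal approximation for U.

```python
import numpy as np

def mann_whitney_z(before, after):
    """Mann-Whitney U for two samples, normalized to a Z statistic via
    the large-sample normal approximation (no tie correction)."""
    n1, n2 = len(before), len(after)
    # U = number of (before, after) pairs with after > before (+0.5 per tie)
    diff = np.subtract.outer(after, before)
    u = (diff > 0).sum() + 0.5 * (diff == 0).sum()
    mu = n1 * n2 / 2.0
    sigma = np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    return (u - mu) / sigma

def running_z(x, window):
    """Z statistic comparing each pair of adjacent windows along the series."""
    return np.array([
        mann_whitney_z(x[i:i + window], x[i + window:i + 2 * window])
        for i in range(len(x) - 2 * window + 1)
    ])

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(0, 1, 300), rng.normal(2, 1, 300)])
z = running_z(x, 50)
print(len(z), z.argmax())   # the Z series peaks near the level shift
```

    A sustained run of large |Z| values flags a persistent shift in level, which is what makes the running form of the test useful for time series.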

  2. Nonlinear Analysis of Surface EMG Time Series of Back Muscles

    NASA Astrophysics Data System (ADS)

    Dolton, Donald C.; Zurcher, Ulrich; Kaufman, Miron; Sung, Paul

    2004-10-01

    A nonlinear analysis of surface electromyography time series of subjects with and without low back pain is presented. The mean-square displacement and entropy show anomalous diffusive behavior in the intermediate time range 10 ms < t < 1 s. This behavior implies the presence of correlations in the signal. We discuss the shape of the power spectrum of the signal.

  3. Long-range correlations in time series generated by time-fractional diffusion: A numerical study

    NASA Astrophysics Data System (ADS)

    Barbieri, Davide; Vivoli, Alessandro

    2005-09-01

    Time series models showing power law tails in autocorrelation functions are common in econometrics. A special non-Markovian model for this kind of time series is provided by the random walk introduced by Gorenflo et al. as a discretization of time-fractional diffusion. The time series so obtained are analyzed here from a numerical point of view in terms of autocorrelations and covariance matrices.

  4. Improvements to surrogate data methods for nonstationary time series.

    PubMed

    Lucio, J H; Valdés, R; Rodríguez, L R

    2012-05-01

    The method of surrogate data has been extensively applied to hypothesis testing of system linearity, when only one realization of the system, a time series, is known. Normally, surrogate data should preserve the linear stochastic structure and the amplitude distribution of the original series. Classical surrogate data methods (such as random permutation, amplitude adjusted Fourier transform, or iterative amplitude adjusted Fourier transform) are successful at preserving one or both of these features in stationary cases. However, they always produce stationary surrogates, hence existing nonstationarity could be interpreted as dynamic nonlinearity. Certain modifications have been proposed that additionally preserve some nonstationarity, at the expense of reproducing a great deal of nonlinearity. However, even those methods generally fail to preserve the trend (i.e., global nonstationarity in the mean) of the original series. This is the case of time series with unit roots in their autoregressive structure. Additionally, those methods, based on Fourier transform, either need first and last values in the original series to match, or they need to select a piece of the original series with matching ends. These conditions are often inapplicable and the resulting surrogates are adversely affected by the well-known artefact problem. In this study, we propose a simple technique that, applied within existing Fourier-transform-based methods, generates surrogate data that jointly preserve the aforementioned characteristics of the original series, including (even strong) trends. Moreover, our technique avoids the negative effects of end mismatch. Several artificial and real, stationary and nonstationary, linear and nonlinear time series are examined, in order to demonstrate the advantages of the methods. Corresponding surrogate data are produced with the classical and with the proposed methods, and the results are compared.
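    For context, the classical Fourier-transform surrogate that this paper improves on works as follows: keep the amplitude spectrum (hence all linear correlations), randomize the phases. This is the baseline method, with its known limitations for nonstationary series; the paper's own end-matching technique is not reproduced here.

```python
import numpy as np

def ft_surrogate(x, rng):
    """Classical Fourier-transform surrogate: preserve the amplitude
    spectrum of x, randomize the phases. Assumes stationarity."""
    n = len(x)
    spec = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2 * np.pi, len(spec))
    new = np.abs(spec) * np.exp(1j * phases)
    new[0] = spec[0]                     # keep the mean untouched
    if n % 2 == 0:
        new[-1] = np.abs(spec[-1])       # Nyquist coefficient must stay real
    return np.fft.irfft(new, n)

rng = np.random.default_rng(3)
x = np.cumsum(rng.normal(size=1024))     # strongly correlated test series
s = ft_surrogate(x, rng)
# Same power spectrum up to floating-point error:
print(np.allclose(np.abs(np.fft.rfft(s)), np.abs(np.fft.rfft(x))))
```

    The end-mismatch artefact the abstract mentions arises because the FFT treats x as periodic: when the first and last samples differ strongly (as in a trending series), spurious high-frequency power leaks into every surrogate.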

  5. Mining approximate periodic pattern in hydrological time series

    NASA Astrophysics Data System (ADS)

    Zhu, Y. L.; Li, S. J.; Bao, N. N.; Wan, D. S.

    2012-04-01

    Long sequences of hydrological time series contain a wealth of information about the hidden laws of natural evolution and the influence of human activities on the Earth's surface. Data mining technology can help find those hidden laws, such as flood frequency and abrupt change, which is useful for the decision support of hydrological prediction and flood control scheduling. The periodic nature of hydrological time series is important for trend forecasting of drought and flood and for hydraulic engineering planning. In hydrology, full-period analysis of hydrological time series has attracted a lot of attention, through methods such as the discrete periodogram, the simple partial wave method, Fourier analysis, maximum entropy spectral analysis and wavelet analysis. In fact, the hydrological process is influenced both by deterministic factors and by stochastic ones. For example, the tidal level is also affected by the Moon circling the Earth, in addition to the Earth's revolution and rotation. Hence, there is some kind of approximate period hidden in hydrological time series, sometimes also called the cryptic period. Recently, partial period mining, which originated in the data mining domain, has emerged as a remedy for the traditional period analysis methods in hydrology, as it places looser requirements on data integrity and continuity and can find partial periods in the time series. This paper is focused on partial period mining in hydrological time series. Based on asynchronous periodic patterns and partial period mining with a suffix tree, this paper proposes to mine multi-event asynchronous periodic patterns based on a modified suffix tree representation and traversal, and introduces a dynamic method for adjusting candidate period intervals, which avoids period omissions and waste of time and space. The experimental results on synthetic data and real water level data of the Yangtze River at Nanjing station indicate that this algorithm can discover hydrological

  6. On fractal analysis of cardiac interbeat time series

    NASA Astrophysics Data System (ADS)

    Guzmán-Vargas, L.; Calleja-Quevedo, E.; Angulo-Brown, F.

    2003-09-01

    In recent years the complexity of a cardiac beat-to-beat time series has been taken as an auxiliary tool to identify the health status of human hearts. Several methods have been employed to characterize the time series complexity. In this work we calculate the fractal dimension of interbeat time series arising from three groups: 10 young healthy persons, 8 elderly healthy persons and 10 patients with congestive heart failure. Our numerical results reflect evident differences in the dynamic behavior corresponding to each group. We discuss these results within the context of the neuroautonomic control of heart rate dynamics. We also propose a numerical simulation that reproduces aging effects of heart rate behavior.
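    One standard estimator in this family is Higuchi's fractal dimension. The abstract does not specify which estimator the authors use, so the sketch below is an illustrative choice, checked against the two textbook limits (a smooth curve has dimension ~1, white noise ~2).

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Higuchi fractal dimension: slope of log curve-length L(k)
    versus log(1/k) over coarse-graining scales k = 1..kmax."""
    n = len(x)
    lk = []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)
            d = np.abs(np.diff(x[idx])).sum()
            # Higuchi's normalization for the curve length at scale k
            lengths.append(d * (n - 1) / ((len(idx) - 1) * k) / k)
        lk.append(np.mean(lengths))
    ks = np.arange(1, kmax + 1)
    slope, _ = np.polyfit(np.log(1.0 / ks), np.log(lk), 1)
    return slope

rng = np.random.default_rng(7)
white = rng.normal(size=5000)          # fractal dimension near 2
line = np.linspace(0.0, 1.0, 5000)     # fractal dimension near 1
fd_white = higuchi_fd(white)
fd_line = higuchi_fd(line)
print(fd_white, fd_line)
```

    Interbeat interval series typically fall between these two limits, which is what makes the dimension usable as a group discriminator.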

  7. Test to determine the Markov order of a time series.

    PubMed

    Racca, E; Laio, F; Poggi, D; Ridolfi, L

    2007-01-01

    The Markov order of a time series is an important measure of the "memory" of a process, and its knowledge is fundamental for the correct simulation of the characteristics of the process. For this reason, several techniques have been proposed in the past for its estimation. However, most of these methods are rather complex, and often can be applied only in the case of Markov chains. Here we propose a simple and robust test to evaluate the Markov order of a time series. Only the first-order moment of the conditional probability density function characterizing the process is used to evaluate the memory of the process itself. This measure is called the "expected value Markov (EVM) order." We show that there is good agreement between the EVM order and the known Markov order of some synthetic time series.

  8. Appropriate Algorithms for Nonlinear Time Series Analysis in Psychology

    NASA Astrophysics Data System (ADS)

    Scheier, Christian; Tschacher, Wolfgang

    Chaos theory has a strong appeal for psychology because it allows for the investigation of the dynamics and nonlinearity of psychological systems. Consequently, chaos-theoretic concepts and methods have recently gained increasing attention among psychologists, and positive claims for chaos have been published in nearly every field of psychology. Less attention, however, has been paid to the appropriateness of chaos-theoretic algorithms for psychological time series. An appropriate algorithm can deal with short, noisy data sets and yields 'objective' results. In the present paper it is argued that most of the classical nonlinear techniques do not satisfy these constraints and thus are not appropriate for psychological data. A methodological approach is introduced that is based on nonlinear forecasting and the method of surrogate data. In artificial data sets and empirical time series we show that this methodology reliably assesses nonlinearity and chaos in time series even if they are short and contaminated by noise.

  9. Time series, correlation matrices and random matrix models

    SciTech Connect

    Vinayak; Seligman, Thomas H.

    2014-01-08

    In this set of five lectures the authors have presented techniques to analyze open classical and quantum systems using correlation matrices. For diverse reasons we shall see that random matrices play an important role in describing a null hypothesis or a minimum information hypothesis for the description of a quantum system or subsystem. In the former case, we consider various forms of correlation matrices of time series associated with the classical observables of some system. The fact that such series are necessarily finite inevitably introduces noise, and this finite-time influence leads to a random or stochastic component in these time series. Consequently, random correlation matrices have a random component, and corresponding ensembles are used. In the latter case, we use random matrices to describe a high-temperature environment or uncontrolled perturbations, ensembles of differing chaotic systems, etc. The common theme of the lectures is thus the importance of random matrix theory in a wide range of fields in and around physics.

  10. Characterizing Complex Time Series from the Scaling of Prediction Error.

    NASA Astrophysics Data System (ADS)

    Hinrichs, Brant Eric

    This thesis concerns characterizing complex time series from the scaling of prediction error. We use the global modeling technique of radial basis function approximation to build models from a state-space reconstruction of a time series that otherwise appears complicated or random (i.e. aperiodic, irregular). Prediction error as a function of prediction horizon is obtained from the model using the direct method. The relationship between the underlying dynamics of the time series and the logarithmic scaling of prediction error as a function of prediction horizon is investigated. We use this relationship to characterize the dynamics of both a model chaotic system and physical data from the optic tectum of an attentive pigeon exhibiting the important phenomena of nonstationary neuronal oscillations in response to visual stimuli.

  11. Time series characterization via horizontal visibility graph and Information Theory

    NASA Astrophysics Data System (ADS)

    Gonçalves, Bruna Amin; Carpi, Laura; Rosso, Osvaldo A.; Ravetti, Martín G.

    2016-12-01

    Complex network theory has gained wider applicability since methods for the transformation of time series into networks were proposed and successfully tested. In the last few years, the horizontal visibility graph has become a popular method due to its simplicity and good results when applied to natural and artificially generated data. In this work, we explore different ways of extracting information from the network constructed from the horizontal visibility graph, evaluated by Information Theory quantifiers. Most works use the degree distribution of the network; however, we found alternative probability distributions that are more efficient than the degree distribution in characterizing dynamical systems. In particular, we find that, when using distributions based on distances and amplitude values, significantly shorter time series are required. We analyze fractional Brownian motion time series, and a paleoclimatic proxy record of ENSO from the Pallcacocha Lake to study dynamical changes during the Holocene.
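    The horizontal visibility graph itself is easy to state: samples i and j are linked when every sample strictly between them lies below min(x[i], x[j]). A reference O(n²) construction of the degree sequence follows; the distance- and amplitude-based distributions the paper prefers would be read off the same graph.

```python
import numpy as np

def hvg_degrees(x):
    """Degree sequence of the horizontal visibility graph of series x.
    Nodes i < j are linked iff all samples between them are lower than
    min(x[i], x[j]). O(n^2) reference implementation."""
    n = len(x)
    deg = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            if all(x[k] < min(x[i], x[j]) for k in range(i + 1, j)):
                deg[i] += 1
                deg[j] += 1
    return deg

# Adjacent samples always see each other; the two peaks (3 and 4) also
# see over the valley between them.
print(hvg_degrees([3, 1, 2, 4]))
```

    Normalizing the degree histogram to a probability distribution and feeding it to Shannon entropy or similar quantifiers gives the kind of Information Theory characterization the abstract describes.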

  12. A multidisciplinary database for geophysical time series management

    NASA Astrophysics Data System (ADS)

    Montalto, P.; Aliotta, M.; Cassisi, C.; Prestifilippo, M.; Cannata, A.

    2013-12-01

    The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. By the term time series we refer to a set of observations of a given phenomenon acquired sequentially in time. When the time intervals are equally spaced, one speaks of the period or sampling frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), in order to acquire time series from different data sources and standardize them within a relational database. The operation of standardization provides the ability to perform operations, such as query and visualization, on many measures, synchronizing them using a common time scale. The proposed architecture follows a multiple layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in performing particular operations for the reorganization and archiving of data from different sources such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the loader layer performs a security check of the working status of each running software component through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although our system has to manage huge amounts of data, performance is guaranteed by using a smart table partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, which provide the possibility to query different time series on a specified time range, or to follow the real-time signal acquisition, according to a data access policy for the users.

  13. Permutation test for periodicity in short time series data

    PubMed Central

    Ptitsyn, Andrey A; Zvonic, Sanjin; Gimble, Jeffrey M

    2006-01-01

    Background Periodic processes, such as the circadian rhythm, are important factors modulating and coordinating transcription of genes governing key metabolic pathways. Theoretically, even small fluctuations in the orchestration of circadian gene expression patterns among different tissues may result in functional asynchrony at the organism level and may contribute to a wide range of pathologic disorders. Identification of circadian expression patterns in time series data is important, but equally challenging. Microarray technology allows estimation of relative expression of thousands of genes at each time point. However, this estimation often lacks precision, and microarray experiments are prohibitively expensive, limiting the number of data points in a time series expression profile. The data produced in these experiments carry a high degree of stochastic variation, obscuring the periodic pattern, and a limited number of replicates, typically covering not more than two complete periods of oscillation. Results To address this issue, we have developed a simple, but effective, computational technique for the identification of a periodic pattern in relatively short time series, typical for microarray studies of circadian expression. This test is based on a random permutation of time points in order to estimate non-randomness of a periodogram. The Permutated time, or Pt-test, is able to detect oscillations within a given period in expression profiles dominated by a high degree of stochastic fluctuations or oscillations of different irrelevant frequencies. We have conducted a comprehensive study of circadian expression on a large data set produced at PBRC, representing three different peripheral murine tissues. We have also re-analyzed a number of similar time series data sets produced and published independently by other research groups over the past few years.
Conclusion The Permutated time test (Pt-test) is demonstrated to be effective for detection of periodicity in
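    The core of such a test — compare periodogram power at the candidate frequency against the same statistic after shuffling the time points — can be sketched as follows. The frequency-bin selection, permutation count, and toy 24-point "expression profile" are illustrative simplifications, not the published Pt-test procedure in full.

```python
import numpy as np

def pt_test(x, period, n_perm=2000, rng=None):
    """Permutation p-value for periodicity at a candidate period:
    periodogram power at that period's frequency bin, compared with
    the null distribution obtained by shuffling time points."""
    if rng is None:
        rng = np.random.default_rng()
    n = len(x)
    freq_idx = round(n / period)          # periodogram bin for the period

    def power(y):
        return np.abs(np.fft.rfft(y - y.mean())[freq_idx]) ** 2

    observed = power(x)
    null = np.array([power(rng.permutation(x)) for _ in range(n_perm)])
    return (null >= observed).mean()      # permutation p-value

rng = np.random.default_rng(4)
t = np.arange(24)                         # 24 time points, candidate period 12
x = np.sin(2 * np.pi * t / 12) + 0.3 * rng.normal(size=24)
p = pt_test(x, period=12, rng=rng)
print(p)                                  # small p-value: periodicity detected
```

    Shuffling destroys the temporal ordering but preserves the value distribution, which is exactly why the test copes with the short, noisy profiles the abstract describes.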

  14. Microbial oceanography and the Hawaii Ocean Time-series programme.

    PubMed

    Karl, David M; Church, Matthew J

    2014-10-01

    The Hawaii Ocean Time-series (HOT) programme has been tracking microbial and biogeochemical processes in the North Pacific Subtropical Gyre since October 1988. The near-monthly time series observations have revealed previously undocumented phenomena within a temporally dynamic ecosystem that is vulnerable to climate change. Novel microorganisms, genes and unexpected metabolic pathways have been discovered and are being integrated into our evolving ecological paradigms. Continued research, including higher-frequency observations and at-sea experimentation, will help to provide a comprehensive scientific understanding of microbial processes in the largest biome on Earth.

  15. Testing for intracycle determinism in pseudoperiodic time series

    NASA Astrophysics Data System (ADS)

    Coelho, Mara C. S.; Mendes, Eduardo M. A. M.; Aguirre, Luis A.

    2008-06-01

    A determinism test is proposed based on the well-known method of the surrogate data. Assuming predictability to be a signature of determinism, the proposed method checks for intracycle (e.g., short-term) determinism in the pseudoperiodic time series for which standard methods of surrogate analysis do not apply. The approach presented is composed of two steps. First, the data are preprocessed to reduce the effects of seasonal and trend components. Second, standard tests of surrogate analysis can then be used. The determinism test is applied to simulated and experimental pseudoperiodic time series and the results show the applicability of the proposed test.

  16. Application of nonlinear time series models to driven systems

    SciTech Connect

    Hunter, N.F. Jr.

    1990-01-01

    In our laboratory we have been engaged in an effort to model nonlinear systems using time series methods. Our objectives have been, first, to understand how the time series response of a nonlinear system unfolds as a function of the underlying state variables, second, to model the evolution of the state variables, and finally, to predict nonlinear system responses. We hope to address the relationship between model parameters and system parameters in the near future. Control of nonlinear systems based on experimentally derived parameters is also a planned topic of future research. 28 refs., 15 figs., 2 tabs.

  17. Adaptive median filtering for preprocessing of time series measurements

    NASA Technical Reports Server (NTRS)

    Paunonen, Matti

    1993-01-01

    A median (L1-norm) filtering program using polynomials was developed. This program was used in automatic recycling data screening. Additionally, a special adaptive program to work with asymmetric distributions was developed. Examples of adaptive median filtering of satellite laser range observations and TV satellite time measurements are given. The program proved to be versatile and time saving in data screening of time series measurements.
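    A minimal sketch of median-based data screening in the same spirit (running median as a robust local fit, residuals thresholded against a robust scale estimate). The window length, threshold, and test signal are illustrative, not the paper's polynomial L1 filter.

```python
import numpy as np

def median_screen(x, window=11, k=4.0):
    """Flag samples deviating from the running median by more than
    k robust standard deviations (MAD-based; thresholds illustrative)."""
    half = window // 2
    xp = np.pad(x, half, mode="edge")
    med = np.array([np.median(xp[i:i + window]) for i in range(len(x))])
    resid = x - med
    mad = np.median(np.abs(resid)) + 1e-12
    return np.abs(resid) > k * 1.4826 * mad   # 1.4826 scales MAD to sigma

rng = np.random.default_rng(5)
x = np.sin(np.linspace(0, 6, 500)) + 0.05 * rng.normal(size=500)
x[100] += 3.0                                  # inject a gross outlier
flags = median_screen(x)
print(np.flatnonzero(flags))                   # the outlier is flagged
```

    Because the median and MAD are insensitive to the outliers themselves, the screen keeps working even when several gross errors are present, which is what makes this approach attractive for automatic recycling of the screening step.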

  18. Kālī: Time series data modeler

    NASA Astrophysics Data System (ADS)

    Kasliwal, Vishal P.

    2016-07-01

    The fully parallelized and vectorized software package Kālī models time series data using various stochastic processes such as continuous-time ARMA (C-ARMA) processes and uses Bayesian Markov Chain Monte-Carlo (MCMC) for inferencing a stochastic light curve. Kālī is written in C++ with Python language bindings for ease of use. Kālī is named jointly after the Hindu goddess of time, change, and power and also as an acronym for KArma LIbrary.

  19. Fractal dimension of electroencephalographic time series and underlying brain processes.

    PubMed

    Lutzenberger, W; Preissl, H; Pulvermüller, F

    1995-10-01

    Fractal dimension has been proposed as a useful measure for the characterization of electrophysiological time series. This paper investigates what the pointwise dimension of electroencephalographic (EEG) time series can reveal about underlying neuronal generators. The following theoretical assumptions concerning brain function were made: (i) within the cortex, strongly coupled neural assemblies exist which oscillate at certain frequencies when they are active, (ii) several such assemblies can oscillate at a time, and (iii) activity flow between assemblies is minimal. If these assumptions are made, cortical activity can be considered as the weighted sum of a finite number of oscillations (plus noise). It is shown that the correlation dimension of finite time series generated by multiple oscillators increases monotonically with the number of oscillators. Furthermore, it is shown that a reliable estimate of the pointwise dimension of the raw EEG signal can be calculated from a time series as short as a few seconds. These results indicate that (i) the pointwise dimension of the EEG allows conclusions regarding the number of independently oscillating networks in the cortex, and (ii) a reliable estimate of the pointwise dimension of the EEG is possible on the basis of short raw signals.

  20. Learning time series evolution by unsupervised extraction of correlations

    SciTech Connect

    Deco, G.; Schuermann, B.

    1995-03-01

    As a consequence, we are able to model chaotic and nonchaotic time series. Furthermore, one critical point in modeling time series is the determination of the dimension of the embedding vector used, i.e., the number of components of the past that are needed to predict the future. With this method we can detect the embedding dimension by extracting the influence of the past on the future, i.e., the correlation of remote past and future. Optimal embedding dimensions are obtained for the Henon map and the Mackey-Glass series. When noisy data corrupted by colored noise are used, a model is still possible. The noise will then be decorrelated by the network. In the case of modeling a chemical reaction, the most natural architecture that conserves the volume is a symplectic network which describes a system that conserves the entropy and therefore the transmitted information.

  1. Power Computations in Time Series Analyses for Traffic Safety Interventions

    PubMed Central

    McLeod, A. Ian; Vingilis, E. R.

    2008-01-01

    The evaluation of traffic safety interventions or other policies that can affect road safety often requires the collection of administrative time series data, such as monthly motor vehicle collision data that may be difficult and/or expensive to collect. Furthermore, since policy decisions may be based on the results found from the intervention analysis of the policy, it is important to ensure that the statistical tests have enough power, that is, that we have collected enough time series data both before and after the intervention so that a meaningful change in the series will likely be detected. In this short paper we present a simple methodology for doing this. It is expected that the methodology presented will be useful for sample size determination in a wide variety of traffic safety intervention analysis applications. Our method is illustrated with a proposed traffic safety study that was funded by NIH. PMID:18460394
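    The power question the abstract raises — have we collected enough pre- and post-intervention observations to detect a meaningful change? — can be answered by simulation. The sketch below uses a two-sample z-test on white noise as a deliberately simplified stand-in for the paper's intervention-analysis setting; the effect size, series lengths, and simulation count are illustrative.

```python
import numpy as np

def power_step_intervention(n_before, n_after, delta, sigma=1.0,
                            n_sim=2000, rng=None):
    """Monte-Carlo power of a two-sided z-test (known sigma) for a step
    change of size delta between pre- and post-intervention means."""
    if rng is None:
        rng = np.random.default_rng()
    z_crit = 1.96                                  # two-sided alpha = 0.05
    se = sigma * np.sqrt(1.0 / n_before + 1.0 / n_after)
    hits = 0
    for _ in range(n_sim):
        a = rng.normal(0.0, sigma, n_before)       # before intervention
        b = rng.normal(delta, sigma, n_after)      # after intervention
        if abs(b.mean() - a.mean()) / se > z_crit:
            hits += 1
    return hits / n_sim

# Extending the post-intervention series raises the power to detect
# the same half-sigma step:
p_short = power_step_intervention(24, 12, delta=0.5, rng=np.random.default_rng(6))
p_long = power_step_intervention(24, 36, delta=0.5, rng=np.random.default_rng(6))
print(p_short, p_long)
```

    In practice the same loop would simulate the fitted ARIMA noise model rather than white noise, but the logic — simulate, test, count rejections — is unchanged.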

  2. Segmentation of time series with long-range fractal correlations

    PubMed Central

    Bernaola-Galván, P.; Oliver, J.L.; Hackenberg, M.; Coronado, A.V.; Ivanov, P.Ch.; Carpena, P.

    2012-01-01

    Segmentation is a standard method of data analysis to identify change-points dividing a nonstationary time series into homogeneous segments. However, for long-range fractal correlated series, most of the segmentation techniques detect spurious change-points which are simply due to the heterogeneities induced by the correlations and not to real nonstationarities. To avoid this oversegmentation, we present a segmentation algorithm which takes as a reference for homogeneity, instead of a random i.i.d. series, a correlated series modeled by a fractional noise with the same degree of correlations as the series to be segmented. We apply our algorithm to artificial series with long-range correlations and show that it systematically detects only the change-points produced by real nonstationarities and not those created by the correlations of the signal. Further, we apply the method to the sequence of the long arm of human chromosome 21, which is known to have long-range fractal correlations. We obtain only three segments that clearly correspond to the three regions of different G + C composition revealed by means of a multi-scale wavelet plot. Similar results have been obtained when segmenting all human chromosome sequences, showing the existence of previously unknown huge compositional superstructures in the human genome. PMID:23645997

  3. Segmentation of time series with long-range fractal correlations

    NASA Astrophysics Data System (ADS)

    Bernaola-Galván, P.; Oliver, J. L.; Hackenberg, M.; Coronado, A. V.; Ivanov, P. Ch.; Carpena, P.

    2012-06-01

    Segmentation is a standard method of data analysis to identify change-points dividing a nonstationary time series into homogeneous segments. However, for long-range fractal correlated series, most of the segmentation techniques detect spurious change-points which are simply due to the heterogeneities induced by the correlations and not to real nonstationarities. To avoid this oversegmentation, we present a segmentation algorithm which takes as a reference for homogeneity, instead of a random i.i.d. series, a correlated series modeled by a fractional noise with the same degree of correlations as the series to be segmented. We apply our algorithm to artificial series with long-range correlations and show that it systematically detects only the change-points produced by real nonstationarities and not those created by the correlations of the signal. Further, we apply the method to the sequence of the long arm of human chromosome 21, which is known to have long-range fractal correlations. We obtain only three segments that clearly correspond to the three regions of different G + C composition revealed by means of a multi-scale wavelet plot. Similar results have been obtained when segmenting all human chromosome sequences, showing the existence of previously unknown huge compositional superstructures in the human genome.

  4. Ocean time-series near Bermuda: Hydrostation S and the US JGOFS Bermuda Atlantic time-series study

    NASA Technical Reports Server (NTRS)

    Michaels, Anthony F.; Knap, Anthony H.

    1992-01-01

    Bermuda is the site of two ocean time-series programs. At Hydrostation S, the ongoing biweekly profiles of temperature, salinity and oxygen now span 37 years. This is one of the longest open-ocean time-series data sets and provides a view of decadal scale variability in ocean processes. In 1988, the U.S. JGOFS Bermuda Atlantic Time-series Study began a wide range of measurements at a frequency of 14-18 cruises each year to understand temporal variability in ocean biogeochemistry. On each cruise, the data range from chemical analyses of discrete water samples to data from electronic packages of hydrographic and optics sensors. In addition, a range of biological and geochemical rate measurements are conducted that integrate over time-periods of minutes to days. This sampling strategy yields a reasonable resolution of the major seasonal patterns and of decadal scale variability. The Sargasso Sea also has a variety of episodic production events on scales of days to weeks and these are only poorly resolved. In addition, there is a substantial amount of mesoscale variability in this region and some of the perceived temporal patterns are caused by the intersection of the biweekly sampling with the natural spatial variability. In the Bermuda time-series programs, we have added a series of additional cruises to begin to assess these other sources of variation and their impacts on the interpretation of the main time-series record. However, the adequate resolution of higher frequency temporal patterns will probably require the introduction of new sampling strategies and some emerging technologies such as biogeochemical moorings and autonomous underwater vehicles.

  5. A Time-Series Analysis of Hispanic Unemployment.

    ERIC Educational Resources Information Center

    Defreitas, Gregory

    1986-01-01

    This study undertakes the first systematic time-series research on the cyclical patterns and principal determinants of Hispanic joblessness in the United States. The principal findings indicate that Hispanics tend to bear a disproportionate share of increases in unemployment during recessions. (Author/CT)

  6. Chaotic time series prediction using artificial neural networks

    SciTech Connect

    Bartlett, E.B.

    1991-12-31

This paper describes the use of artificial neural networks to model the complex oscillations defined by a chaotic Verhulst animal population dynamic. A predictive artificial neural network model is developed and tested, and results of computer simulations are given. These results show that the artificial neural network model predicts the chaotic time series with various initial conditions, growth parameters, or noise.

  7. Chaotic time series prediction using artificial neural networks

    SciTech Connect

    Bartlett, E.B.

    1991-01-01

This paper describes the use of artificial neural networks to model the complex oscillations defined by a chaotic Verhulst animal population dynamic. A predictive artificial neural network model is developed and tested, and results of computer simulations are given. These results show that the artificial neural network model predicts the chaotic time series with various initial conditions, growth parameters, or noise.

  8. Time Series, Stochastic Processes and Completeness of Quantum Theory

    NASA Astrophysics Data System (ADS)

    Kupczynski, Marian

    2011-03-01

Most physical experiments are described as repeated measurements of some random variables. Experimental data registered by on-line computers form time series of outcomes. The frequencies of different outcomes are compared with the probabilities provided by the algorithms of quantum theory (QT). Despite the statistical character of its predictions, QT has been claimed to provide the most complete description of the data and of the underlying physical phenomena. This claim could easily be rejected if fine structures, averaged out in standard descriptive statistical analysis, were found in time series of experimental data. To search for these structures one has to use the more subtle statistical tools developed to study time series produced by various stochastic processes. In this talk we review some of these tools. As an example we show how standard descriptive statistical analysis is unable to reveal a fine structure in a simulated sample of an AR(2) stochastic process. We emphasize once again that the violation of Bell inequalities gives no information on the completeness or the nonlocality of QT. The appropriate way to test the completeness of quantum theory is to search for fine structures in time series of experimental data by means of purity tests or by studying the autocorrelation and partial autocorrelation functions.
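The point that descriptive statistics can miss structure an autocorrelation function reveals is easy to reproduce. A minimal sketch (a hypothetical AR(2) simulation in NumPy, not the talk's data): the marginal distribution of the series looks like generic noise, while the lag-1 autocorrelation clearly deviates from the white-noise value of zero.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
phi1, phi2 = 0.5, -0.3                 # AR(2) coefficients (a stationary choice)
x = np.zeros(n)
eps = rng.standard_normal(n)
for t in range(2, n):
    x[t] = phi1 * x[t - 1] + phi2 * x[t - 2] + eps[t]

def acf(y, lag):
    """Sample autocorrelation at the given lag."""
    y = y - y.mean()
    return float(np.dot(y[:-lag], y[lag:]) / np.dot(y, y))

# Descriptive statistics (mean near 0, roughly Gaussian histogram) look like
# plain noise, but the autocorrelation exposes the dependence: theory gives
# r1 = phi1 / (1 - phi2), here about 0.385, far outside the white-noise band.
r1, r2 = acf(x, 1), acf(x, 2)
```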

  9. Catchment classification and similarity using correlation in streamflow time series

    NASA Astrophysics Data System (ADS)

    Fleming, B.; Archfield, S. A.

    2012-12-01

Catchment classification is an important component of hydrologic analyses, particularly for linking changes in ecological integrity to streamflow alteration, transferring time series or model parameters from gauged to ungauged locations, and as a way to understand the similarity in the response of catchments to change. Metrics of similarity used in catchment classification have ranged from aggregate catchment properties such as geologic or climate characteristics to variables derived from the daily streamflow hydrograph; however, no one set of classification variables can fully describe similarity between catchments, as the variables used for such assessments often depend on the question being asked. We propose an alternative similarity measure for hydrologic classification: correlation between the daily streamflow time series. If one assumes that the streamflow signal is the integrated response of a catchment to both climate and geology, then the strength of correlation in streamflow between two catchments is a measure of the strength of similarity in hydrologic response between those two catchments. Using the nonparametric Spearman rho correlation coefficient between streamflow time series at 54 unregulated and unaltered streamgauges in the mid-Atlantic United States, we show that correlation is a parsimonious classification metric that results in physically interpretable classes. Using the correlation between the deseasonalized streamflow time series and reclassifying the streamgauges, we also find that seasonality plays an important role in understanding catchment flow dynamics, especially dynamics that can be linked to ecological response and similarity, although not to a large extent in this study area.
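A toy version of this classification pipeline, pairwise Spearman correlation turned into a dissimilarity and clustered, can be sketched with SciPy. The six "gauges," the two seasonal regimes, and all parameters below are hypothetical; the paper's 54-gauge data set is not reproduced here.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.stats import spearmanr

rng = np.random.default_rng(2)
t = np.arange(365)
# Hypothetical daily flows at six gauges: two seasonal regimes plus noise.
regime_a = np.exp(np.sin(2 * np.pi * t / 365))
regime_b = np.exp(np.cos(2 * np.pi * t / 365))
flows = np.array([regime_a + 0.3 * rng.standard_normal(365) for _ in range(3)] +
                 [regime_b + 0.3 * rng.standard_normal(365) for _ in range(3)])

rho, _ = spearmanr(flows.T)               # 6x6 rank-correlation matrix
dist = 1.0 - rho                          # dissimilarity = 1 - correlation
np.fill_diagonal(dist, 0.0)
condensed = dist[np.triu_indices(6, 1)]   # condensed form expected by linkage
labels = fcluster(linkage(condensed), t=2, criterion='maxclust')
# Gauges sharing a seasonal regime land in the same class.
```

Deseasonalizing before computing rho, as the paper does in its second pass, would isolate whatever similarity remains once the shared annual cycle is removed.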

  10. A Method for Comparing Multivariate Time Series with Different Dimensions

    PubMed Central

    Tapinos, Avraam; Mendes, Pedro

    2013-01-01

    In many situations it is desirable to compare dynamical systems based on their behavior. Similarity of behavior often implies similarity of internal mechanisms or dependency on common extrinsic factors. While there are widely used methods for comparing univariate time series, most dynamical systems are characterized by multivariate time series. Yet, comparison of multivariate time series has been limited to cases where they share a common dimensionality. A semi-metric is a distance function that has the properties of non-negativity, symmetry and reflexivity, but not sub-additivity. Here we develop a semi-metric – SMETS – that can be used for comparing groups of time series that may have different dimensions. To demonstrate its utility, the method is applied to dynamic models of biochemical networks and to portfolios of shares. The former is an example of a case where the dependencies between system variables are known, while in the latter the system is treated (and behaves) as a black box. PMID:23393554

  11. New Confidence Interval Estimators Using Standardized Time Series.

    DTIC Science & Technology

    1984-12-01

    We develop new confidence interval estimators for the underlying mean of a stationary simulation process. These estimators can be viewed as...generalizations of Schruben’s so-called standardized time series area confidence interval estimators. Various properties of the new estimators are given.

  12. Daily time series evapotranspiration maps for Oklahoma and Texas panhandle

    Technology Transfer Automated Retrieval System (TEKTRAN)

Evapotranspiration (ET) is an important process in an ecosystem's water budget and is closely linked to its productivity. Therefore, regional-scale daily time series ET maps developed at high and medium resolutions have large utility in studying the carbon-energy-water nexus and managing water resources. ...

  13. What Makes a Coursebook Series Stand the Test of Time?

    ERIC Educational Resources Information Center

    Illes, Eva

    2009-01-01

    Intriguingly, at a time when the ELT market is inundated with state-of-the-art coursebooks teaching modern-day English, a 30-year-old series enjoys continuing popularity in some secondary schools in Hungary. Why would teachers, several of whom are school-based teacher-mentors in the vanguard of the profession, purposefully choose materials which…

  14. IDENTIFICATION OF REGIME SHIFTS IN TIME SERIES USING NEIGHBORHOOD STATISTICS

    EPA Science Inventory

    The identification of alternative dynamic regimes in ecological systems requires several lines of evidence. Previous work on time series analysis of dynamic regimes includes mainly model-fitting methods. We introduce two methods that do not use models. These approaches use state-...

  15. Dynamic Factor Analysis of Nonstationary Multivariate Time Series.

    ERIC Educational Resources Information Center

    Molenaar, Peter C. M.; And Others

    1992-01-01

    The dynamic factor model proposed by P. C. Molenaar (1985) is exhibited, and a dynamic nonstationary factor model (DNFM) is constructed with latent factor series that have time-varying mean functions. The use of a DNFM is illustrated using data from a television viewing habits study. (SLD)

  16. Model Identification in Time-Series Analysis: Some Empirical Results.

    ERIC Educational Resources Information Center

    Padia, William L.

    Model identification of time-series data is essential to valid statistical tests of intervention effects. Model identification is, at best, inexact in the social and behavioral sciences where one is often confronted with small numbers of observations. These problems are discussed, and the results of independent identifications of 130 social and…

  17. ADAPTIVE DATA ANALYSIS OF COMPLEX FLUCTUATIONS IN PHYSIOLOGIC TIME SERIES

    PubMed Central

    PENG, C.-K.; COSTA, MADALENA; GOLDBERGER, ARY L.

    2009-01-01

    We introduce a generic framework of dynamical complexity to understand and quantify fluctuations of physiologic time series. In particular, we discuss the importance of applying adaptive data analysis techniques, such as the empirical mode decomposition algorithm, to address the challenges of nonlinearity and nonstationarity that are typically exhibited in biological fluctuations. PMID:20041035

  18. Time Series Data Visualization in World Wide Telescope

    NASA Astrophysics Data System (ADS)

    Fay, J.

WorldWide Telescope provides a rich set of time series visualizations for both archival and real-time data. WWT consists of both interactive desktop tools for immersive visualization and HTML5 web-based controls that can be utilized in customized web pages. WWT supports a range of display options including full dome, power walls, stereo, and virtual reality headsets.

  19. [Anomaly Detection of Multivariate Time Series Based on Riemannian Manifolds].

    PubMed

Xu, Yonghong; Hou, Xiaoying; Li, Shuting; Cui, Jie

    2015-06-01

Multivariate time series problems are widespread in industry and everyday life. Anomaly detection has provided valuable information in the financial, hydrological, and meteorological fields, and in research areas such as earthquake monitoring, video surveillance, and medicine. In order to find exceptions in a time sequence quickly and efficiently, and to present them intuitively, in this study we combined the Riemannian manifold with statistical process control charts, using a sliding window and describing each window of the time sequence by its covariance matrix, to achieve anomaly detection and visualization for multivariate time series. We used simulated MA data flows and abnormal electrocardiogram data from MIT-BIH as experimental objects to verify the anomaly detection method. The results showed that the method was reasonable and effective.
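The sliding-window covariance idea can be sketched briefly. This is an assumption-laden stand-in for the paper's method: windows are summarized by their covariance matrices, compared with the affine-invariant Riemannian metric on SPD matrices, and flagged control-chart style when the distance to an in-control reference exceeds a mean + 3*sigma threshold; the data and regime change are synthetic.

```python
import numpy as np
from scipy.linalg import eigvalsh

def airm_distance(A, B):
    """Affine-invariant Riemannian distance between SPD matrices:
    d(A, B) = ||log(A^-1/2 B A^-1/2)||_F, via generalized eigenvalues."""
    w = eigvalsh(B, A)                   # eigenvalues of A^-1 B
    return np.sqrt(np.sum(np.log(w) ** 2))

rng = np.random.default_rng(3)
n, d, win = 600, 3, 50
x = rng.standard_normal((n, d))
# Inject a covariance regime change at sample 400 (correlation + variance shift).
x[400:] = x[400:] @ np.array([[1, .9, 0], [0, 1, 0], [0, 0, .2]])

covs = [np.cov(x[i:i + win].T) for i in range(0, n - win, win)]
dists = np.array([airm_distance(covs[0], C) for C in covs])
mu, sd = dists[:6].mean(), dists[:6].std()   # in-control windows as baseline
alarms = np.where(dists > mu + 3 * sd)[0]    # control-chart style flagging
```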

  20. Classification of time series patterns from complex dynamic systems

    SciTech Connect

    Schryver, J.C.; Rao, N.

    1998-07-01

    An increasing availability of high-performance computing and data storage media at decreasing cost is making possible the proliferation of large-scale numerical databases and data warehouses. Numeric warehousing enterprises on the order of hundreds of gigabytes to terabytes are a reality in many fields such as finance, retail sales, process systems monitoring, biomedical monitoring, surveillance and transportation. Large-scale databases are becoming more accessible to larger user communities through the internet, web-based applications and database connectivity. Consequently, most researchers now have access to a variety of massive datasets. This trend will probably only continue to grow over the next several years. Unfortunately, the availability of integrated tools to explore, analyze and understand the data warehoused in these archives is lagging far behind the ability to gain access to the same data. In particular, locating and identifying patterns of interest in numerical time series data is an increasingly important problem for which there are few available techniques. Temporal pattern recognition poses many interesting problems in classification, segmentation, prediction, diagnosis and anomaly detection. This research focuses on the problem of classification or characterization of numerical time series data. Highway vehicles and their drivers are examples of complex dynamic systems (CDS) which are being used by transportation agencies for field testing to generate large-scale time series datasets. Tools for effective analysis of numerical time series in databases generated by highway vehicle systems are not yet available, or have not been adapted to the target problem domain. However, analysis tools from similar domains may be adapted to the problem of classification of numerical time series data.

  1. Mixed Spectrum Analysis on fMRI Time-Series.

    PubMed

    Kumar, Arun; Lin, Feng; Rajapakse, Jagath C

    2016-06-01

Temporal autocorrelation present in functional magnetic resonance image (fMRI) data poses challenges to its analysis. The existing approaches handling autocorrelation in fMRI time-series often presume a specific model of autocorrelation such as an auto-regressive model. The main limitation here is that the correlation structure of voxels is generally unknown and varies in different brain regions because of different levels of neurogenic noises and pulsatile effects. Enforcing a universal model on all brain regions leads to bias and loss of efficiency in the analysis. In this paper, we propose the mixed spectrum analysis of the voxel time-series to separate the discrete component corresponding to input stimuli and the continuous component carrying temporal autocorrelation. A mixed spectral analysis technique based on the M-spectral estimator is proposed, which effectively removes autocorrelation effects from voxel time-series and identifies significant peaks of the spectrum. As the proposed method does not assume any prior model for the autocorrelation effect in voxel time-series, varying correlation structure among the brain regions does not affect its performance. We have modified the standard M-spectral method for application to a spatial set of time-series by incorporating contextual information related to the continuous spectrum of neighborhood voxels, thus considerably reducing the computational cost. The likelihood of activation is predicted by comparing the amplitude of the discrete component at the stimulus frequency of voxels across the brain using a normal distribution, and by modeling spatial correlations among the likelihoods with a conditional random field. We also demonstrate the application of the proposed method in detecting other desired frequencies.

  2. Distinguishing quasiperiodic dynamics from chaos in short-time series.

    PubMed

    Zou, Y; Pazó, D; Romano, M C; Thiel, M; Kurths, J

    2007-07-01

    We propose a procedure to distinguish quasiperiodic from chaotic orbits in short-time series, which is based on the recurrence properties in phase space. The histogram of the return times in a recurrence plot is introduced to disclose the recurrence property consisting of only three peaks imposed by Slater's theorem. Noise effects on the statistics are studied. Our approach is demonstrated to be efficient in recognizing regular and chaotic trajectories of a Hamiltonian system with mixed phase space.
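The three-peak signature from Slater's theorem is easy to demonstrate numerically. The sketch below (a generic illustration, not the paper's recurrence-plot machinery) follows an irrational rotation and collects the return times to a small interval around the starting point: they take at most three distinct values, the largest being the sum of the other two, whereas a chaotic orbit would produce a broad return-time histogram.

```python
import numpy as np

# Orbit of an irrational rotation x_{n+1} = (x_n + alpha) mod 1, with
# recurrences counted whenever the orbit re-enters a small interval
# around the starting point x_0 = 0.
alpha = (np.sqrt(5) - 1) / 2            # golden-mean rotation number
x = np.mod(np.arange(20000) * alpha, 1.0)
dist = np.minimum(x, 1.0 - x)           # circular distance to x_0 = 0
inside = np.where(dist < 0.01)[0]       # indices of recurrences
returns = np.diff(inside)               # return times between recurrences
vals = sorted(set(returns.tolist()))    # at most three distinct values
```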

  3. Improving predictability of time series using maximum entropy methods

    NASA Astrophysics Data System (ADS)

    Chliamovitch, G.; Dupuis, A.; Golub, A.; Chopard, B.

    2015-04-01

    We discuss how maximum entropy methods may be applied to the reconstruction of Markov processes underlying empirical time series and compare this approach to usual frequency sampling. It is shown that, in low dimension, there exists a subset of the space of stochastic matrices for which the MaxEnt method is more efficient than sampling, in the sense that shorter historical samples have to be considered to reach the same accuracy. Considering short samples is of particular interest when modelling smoothly non-stationary processes, which provides, under some conditions, a powerful forecasting tool. The method is illustrated for a discretized empirical series of exchange rates.

  4. Normalization methods in time series of platelet function assays

    PubMed Central

    Van Poucke, Sven; Zhang, Zhongheng; Roest, Mark; Vukicevic, Milan; Beran, Maud; Lauwereins, Bart; Zheng, Ming-Hua; Henskens, Yvonne; Lancé, Marcus; Marcus, Abraham

    2016-01-01

Platelet function can be quantitatively assessed by specific assays such as light-transmission aggregometry, multiple-electrode aggregometry measuring the response to adenosine diphosphate (ADP), arachidonic acid, collagen, and thrombin-receptor activating peptide, and by viscoelastic tests such as rotational thromboelastometry (ROTEM). The task of extracting meaningful statistical and clinical information from the high-dimensional data spaces of temporal multivariate clinical data represented as multivariate time series is complex. Building insightful visualizations for multivariate time series demands adequate usage of normalization techniques. In this article, various methods for data normalization (z-transformation, range transformation, proportion transformation, and interquartile range) are presented and visualized, and the most suitable approach for platelet function data series is discussed. Normalization was calculated per assay (test) for all time points and per time point for all tests. Interquartile range, range transformation, and z-transformation demonstrated the correlation, as calculated by the Spearman correlation test, when normalized per assay (test) for all time points. When normalizing per time point for all tests, no correlation could be extracted from the charts, as was also the case when using all data as one dataset for normalization. PMID:27428217
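The four normalizations named in the abstract are simple to state in code. A minimal NumPy sketch, with a hypothetical assay series for illustration (the values are invented, not from the paper); "per assay for all time points" means applying one of these functions along the time axis of each test separately:

```python
import numpy as np

def z_transform(x):
    """Standardize to zero mean and unit (sample) standard deviation."""
    return (x - x.mean()) / x.std(ddof=1)

def range_transform(x):
    """Min-max scaling to the interval [0, 1]."""
    return (x - x.min()) / (x.max() - x.min())

def proportion_transform(x):
    """Each value as a share of the series total."""
    return x / x.sum()

def iqr_transform(x):
    """Robust scaling by the median and interquartile range."""
    q1, q3 = np.percentile(x, [25, 75])
    return (x - np.median(x)) / (q3 - q1)

# Hypothetical ADP-response series over six time points:
adp_response = np.array([62.0, 58.0, 41.0, 35.0, 30.0, 44.0])
z = z_transform(adp_response)
```

All four are monotone transformations of a single series, which is why the rank-based Spearman correlation is preserved under per-assay normalization.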

  5. Classifying of financial time series based on multiscale entropy and multiscale time irreversibility

    NASA Astrophysics Data System (ADS)

    Xia, Jianan; Shang, Pengjian; Wang, Jing; Shi, Wenbin

    2014-04-01

Time irreversibility is a fundamental property of many time series. We apply multiscale entropy (MSE) and multiscale time irreversibility (MSTI) to analyze financial time series, and succeed in classifying the financial markets. Interestingly, both methods give nearly the same classification results, which means that they are capable of distinguishing different series in a reliable manner. By comparing the results for shuffled data with the original results, we confirm that asymmetry is an inherent property of financial time series and that it extends over a wide range of scales. In addition, the effect of noise on the Americas and European markets is relatively more significant than on the Asian markets, and loss of time irreversibility is detected in series with high added noise.
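Multiscale entropy has a compact definition: compute the sample entropy of coarse-grained copies of the series at increasing scales. A sketch under stated assumptions (a standard SampEn variant with the tolerance fixed from the original series, as in Costa-style MSE; white noise stands in for the financial data):

```python
import numpy as np

def sample_entropy(x, m=2, tol=None):
    """SampEn: -log of the conditional probability that runs matching for m
    points (within tol, Chebyshev distance) also match for m + 1 points."""
    if tol is None:
        tol = 0.2 * np.std(x)
    def count(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        c = 0
        for i in range(len(templ)):
            d = np.max(np.abs(templ - templ[i]), axis=1)
            c += int(np.sum(d < tol)) - 1      # exclude the self-match
        return c
    return -np.log(count(m + 1) / count(m))

def coarse_grain(x, scale):
    """Non-overlapping block averages of the series at the given scale."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

rng = np.random.default_rng(4)
white = rng.standard_normal(2000)
tol0 = 0.2 * white.std()                 # tolerance fixed from the original series
mse = [sample_entropy(coarse_grain(white, s), tol=tol0) for s in (1, 2, 4)]
# For white noise the entropy falls with scale; for correlated (1/f-like)
# series it stays roughly flat, which is what lets MSE separate classes.
```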

  6. Autoregression of Quasi-Stationary Time Series (Invited)

    NASA Astrophysics Data System (ADS)

    Meier, T. M.; Küperkoch, L.

    2009-12-01

Autoregression is a model based tool for spectral analysis and prediction of time series. It has the potential to increase the resolution of spectral estimates. However, the validity of the assumed model has to be tested. Here we review shortly methods for the determination of the parameters of autoregression and summarize properties of autoregressive prediction and autoregressive spectral analysis. Time series with a limited number of dominant frequencies varying slowly in time (quasi-stationary time series) may well be described by a time-dependent autoregressive model of low order. An algorithm for the estimation of the autoregression parameters in a moving window is presented. Time-varying dominant frequencies are estimated. The comparison to results obtained by Fourier transform based methods and the visualization of the time dependent normalized prediction error are essential for quality assessment of the results. The algorithm is applied to synthetic examples as well as to microseism and tremor. The sensitivity of the results to the choice of model and filter parameters is discussed. Autoregressive forward prediction offers the opportunity to detect body wave phases in seismograms and to determine arrival times automatically. Examples are shown for P- and S-phases at local and regional distances. In order to determine S-wave arrival times the autoregressive model is extended to multi-component recordings. For the detection of significant temporal changes in waveforms, the choice of the model appears to be less crucial compared to spectral analysis. Temporal changes in frequency, amplitude, phase, and polarisation are detectable by autoregressive prediction. Quality estimates of automatically determined onset times may be obtained from the slope of the absolute prediction error as a function of time and the signal-to-noise ratio. Results are compared to manual readings.
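The phase-detection idea, fit an AR model to the pre-event noise and pick the arrival where the one-step prediction error jumps, can be sketched in a few lines. This is a toy illustration, not the authors' algorithm: the "microseism" is a hypothetical AR(1) hum, the arrival a superposed sinusoid, and the 3x-mean threshold an arbitrary choice.

```python
import numpy as np

def ar_fit(x, order):
    """Least-squares AR coefficients a with x[t] ~ sum_k a[k] * x[t-1-k]."""
    X = np.column_stack([x[order - 1 - k:len(x) - 1 - k] for k in range(order)])
    return np.linalg.lstsq(X, x[order:], rcond=None)[0]

rng = np.random.default_rng(5)
n = 1200
noise = np.zeros(n)                      # pre-event "microseism": an AR(1) hum
for t in range(1, n):
    noise[t] = 0.6 * noise[t - 1] + 0.3 * rng.standard_normal()
signal = noise.copy()
signal[800:] += 2.0 * np.sin(2 * np.pi * 0.2 * np.arange(n - 800))  # arrival

order, win = 4, 200
a = ar_fit(signal[:win], order)          # model fitted to the pre-event window
pred = np.array([signal[t - order:t][::-1] @ a for t in range(order, n)])
err = np.abs(signal[order:] - pred)      # absolute one-step prediction error
sm = np.convolve(err, np.ones(20) / 20, mode='same')   # smoothed error
onset = order + int(np.argmax(sm > 3.0 * sm[:win].mean()))
# The prediction error rises sharply at the arrival (sample 800): the pick.
```

As the abstract notes, the slope of this error curve around the pick can serve as a quality estimate for the automatically determined onset time.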

  7. The Mount Wilson Ca ii K Plage Index Time Series

    NASA Astrophysics Data System (ADS)

    Bertello, L.; Ulrich, R. K.; Boyden, J. E.

    2010-06-01

    It is well established that both total and spectral solar irradiance are modulated by variable magnetic activity on the solar surface. However, there is still disagreement about the contribution of individual solar features for changes in the solar output, in particular over decadal time scales. Ionized Ca ii K line spectroheliograms are one of the major resources for these long-term trend studies, mainly because such measurements have been available now for more than 100 years. In this paper we introduce a new Ca ii K plage and active network index time series derived from the digitization of almost 40 000 photographic solar images that were obtained at the 60-foot solar tower, between 1915 and 1985, as a part of the monitoring program of the Mount Wilson Observatory. We describe here the procedure we applied to calibrate the images and the properties of our new defined index, which is strongly correlated to the average fractional area of the visible solar disk occupied by plages and active network. We show that the long-term variation of this index is in an excellent agreement with the 11-year solar-cycle trend determined from the annual international sunspot numbers series. Our time series agrees also very well with similar indicators derived from a different reduction of the same data base and other Ca ii K spectroheliograms long-term synoptic programs, such as those at Kodaikanal Observatory (India), and at the National Solar Observatory at Sacramento Peak (USA). Finally, we show that using appropriate proxies it is possible to extend this time series up to date, making this data set one of the longest Ca ii K index series currently available.

  8. Wavelet transform approach for fitting financial time series data

    NASA Astrophysics Data System (ADS)

    Ahmed, Amel Abdoullah; Ismail, Mohd Tahir

    2015-10-01

This study investigates a newly developed technique, a combined wavelet filtering and VEC model, to study the dynamic relationship among financial time series. A wavelet filter is used to remove noise from daily data of the NASDAQ stock market of the US and three stock markets of the Middle East and North Africa (MENA) region, namely Egypt, Jordan, and Istanbul. The data cover the period from 6/29/2001 to 5/5/2009. The returns of the series generated by the wavelet filter and of the original series are then analyzed by a cointegration test and a VEC model. The results show that the cointegration test affirms the existence of cointegration between the studied series, and that there is a long-term relationship between the US and MENA stock markets. A comparison between the proposed model and the traditional model demonstrates that the proposed model (DWT with VEC model) outperforms the traditional model (VEC model) in fitting the financial stock market series, and reveals real information about the relationships among the stock markets.

  9. Examination of time series through randomly broken windows

    NASA Technical Reports Server (NTRS)

    Sturrock, P. A.; Shoub, E. C.

    1981-01-01

In order to determine the Fourier transform of a quasi-periodic time series (linear problem), or the power spectrum of a stationary random time series (quadratic problem), data should be recorded without interruption over a long time interval. The effect of regular interruption such as the day/night cycle is well known. The effect of irregular interruption of data collection (the "breaking" of the window function) is investigated under the simplifying assumption that there is a uniform probability p that each interval of length tau, of the total interval of length T = N*tau, yields no data. For the linear case it is found that the noise-to-signal ratio will have a (one-sigma) value less than epsilon if N exceeds p^(-1)*(1-p)*epsilon^(-2). For the quadratic case, the same requirement is met by the less restrictive condition that N exceed p^(-1)*(1-p)*epsilon^(-1).
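Plugging illustrative numbers into the two bounds (p and epsilon chosen arbitrarily here) shows how much weaker the quadratic requirement is:

```python
# Minimum number of intervals N needed for a target noise-to-signal level
# epsilon, following the two bounds quoted above.
p, eps = 0.1, 0.05                       # 10% data loss, 5% noise-to-signal
N_linear = (1 - p) / (p * eps ** 2)      # linear (Fourier transform) case
N_quadratic = (1 - p) / (p * eps)        # quadratic (power spectrum) case
# The quadratic requirement is smaller by a factor of epsilon: 180 vs 3600.
```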

  10. A multivariate heuristic model for fuzzy time-series forecasting.

    PubMed

    Huarng, Kun-Huang; Yu, Tiffany Hui-Kuang; Hsu, Yu Wei

    2007-08-01

Fuzzy time-series models have been widely applied due to their ability to handle nonlinear data directly and because no rigid assumptions for the data are needed. In addition, many such models have been shown to provide better forecasting results than their conventional counterparts. However, since most of these models require complicated matrix computations, this paper proposes the adoption of a multivariate heuristic function that can be integrated with univariate fuzzy time-series models to form multivariate models. Such a multivariate heuristic function can easily be extended and integrated with various univariate models. Furthermore, the integrated model can handle multiple variables to improve forecasting results and, at the same time, avoid complicated computations due to the inclusion of multiple variables.

  11. A noise model for InSAR time series

    NASA Astrophysics Data System (ADS)

    Agram, P. S.; Simons, M.

    2015-04-01

Interferometric synthetic aperture radar (InSAR) time series methods estimate the spatiotemporal evolution of surface deformation by incorporating information from multiple SAR interferograms. While various models have been developed to describe the interferometric phase and correlation statistics in individual interferograms, efforts to model the generalized covariance matrix that is directly applicable to joint analysis of networks of interferograms have been limited in scope. In this work, we build on existing decorrelation and atmospheric phase screen models and develop a covariance model for interferometric phase noise over space and time. We present arguments to show that the exploitation of the full 3-D covariance structure within conventional time series inversion techniques is computationally challenging. However, the presented covariance model can aid in designing new inversion techniques that can at least mitigate the impact of the spatially correlated nature of InSAR observations.

  12. Least Squares Time-Series Synchronization in Image Acquisition Systems.

    PubMed

    Piazzo, Lorenzo; Raguso, Maria Carmela; Calzoletti, Luca; Seu, Roberto; Altieri, Bruno

    2016-07-18

We consider an acquisition system consisting of an array of sensors scanning an image. Each sensor produces a sequence of readouts, called a time-series. In this framework, we discuss the image estimation problem when the time-series are affected by noise and by a time shift. In particular, we introduce an appropriate data model and consider the Least Squares (LS) estimate, showing that it has no closed form. However, the LS problem has a structure that can be exploited to simplify the solution. In particular, based on two known techniques, namely Separable Nonlinear Least Squares (SNLS) and Alternating Least Squares (ALS), we propose and analyze several practical estimation methods. As an additional contribution, we discuss the application of these methods to the data of the Photodetector Array Camera and Spectrometer (PACS), an infrared photometer onboard the Herschel satellite. In this context, we investigate the accuracy and the computational complexity of the methods, using both real and simulated data.

  13. Segmentation of biological multivariate time-series data

    NASA Astrophysics Data System (ADS)

    Omranian, Nooshin; Mueller-Roeber, Bernd; Nikoloski, Zoran

    2015-03-01

    Time-series data from multicomponent systems capture the dynamics of the ongoing processes and reflect the interactions between the components. The progression of processes in such systems usually involves check-points and events at which the relationships between the components are altered in response to stimuli. Detecting these events together with the implicated components can help understand the temporal aspects of complex biological systems. Here we propose a regularized regression-based approach for identifying breakpoints and corresponding segments from multivariate time-series data. In combination with techniques from clustering, the approach also allows estimating the significance of the determined breakpoints as well as the key components implicated in the emergence of the breakpoints. Comparative analysis with the existing alternatives demonstrates the power of the approach to identify biologically meaningful breakpoints in diverse time-resolved transcriptomics data sets from the yeast Saccharomyces cerevisiae and the diatom Thalassiosira pseudonana.
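
    A drastically simplified version of breakpoint detection can be sketched by exhaustively minimizing within-segment error. The paper's actual method uses regularized regression on multivariate data, so treat this single-breakpoint, mean-based sketch only as an illustration of the idea:

```python
def sse(segment):
    """Sum of squared deviations from the segment mean."""
    m = sum(segment) / len(segment)
    return sum((v - m) ** 2 for v in segment)

def best_breakpoint(series):
    """Exhaustively find the single breakpoint minimizing total within-segment
    SSE. A much-simplified stand-in for regularized-regression segmentation."""
    n = len(series)
    return min(range(1, n), key=lambda k: sse(series[:k]) + sse(series[k:]))

signal = [0.1, 0.0, 0.2, 0.1, 0.0, 5.0, 5.1, 4.9, 5.0, 5.2]
```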

  14. Factors That Have An Influence On Time Series

    NASA Astrophysics Data System (ADS)

    Notti, D.; Meisina, C.; Zucca, F.; Crosetto, M.; Montserrat, O.

    2012-01-01

    In recent years, advances in the processing of SAR persistent scatterer interferometry (PSI) data have improved time-series precision, even for data processed at regional scale. It is now possible to study the temporal behaviour of different types of natural processes. The most recent data are also processed with non-linear models, which makes it possible, albeit with many restrictions and problems, to study temporal variations in the evolution of a process. In this work we analyzed the time series (TS) of ERS (1992-2001) and RADARSAT (2003-2010) data processed with SqueeSAR™ over three study areas in NW Italy (western Piemonte, the Province of Pavia and the Province of Imperia). We compared the time series with other monitoring data in order to validate them and to identify their strengths and weaknesses in detecting natural processes. The TS were also used to understand the kinematics of selected geological processes.

  15. Reconstruction of ensembles of coupled time-delay systems from time series.

    PubMed

    Sysoev, I V; Prokhorov, M D; Ponomarenko, V I; Bezruchko, B P

    2014-06-01

    We propose a method to recover from time series the parameters of coupled time-delay systems and the architecture of couplings between them. The method is based on a reconstruction of model delay-differential equations and estimation of statistical significance of couplings. It can be applied to networks composed of nonidentical nodes with an arbitrary number of unidirectional and bidirectional couplings. We test our method on chaotic and periodic time series produced by model equations of ensembles of diffusively coupled time-delay systems in the presence of noise, and apply it to experimental time series obtained from electronic oscillators with delayed feedback coupled by resistors.

  16. Nonlinear time series analysis of solar and stellar data

    NASA Astrophysics Data System (ADS)

    Jevtic, Nada

    2003-06-01

    Nonlinear time series analysis was developed to study chaotic systems. Its utility was investigated for the study of solar and stellar time series data. Sunspot data are the longest astronomical time series, and they reflect the long-term variation of the solar magnetic field. Due to periods of low solar activity, such as the Maunder minimum, and the solar cycle's quasiperiodicity, it has been postulated that the solar dynamo is a chaotic system. We show that, due to the definition of sunspot number, it is not possible to test this postulate using nonlinear time series methods. To complement the sunspot data analysis, theoretically generated data for the α-Ω solar dynamo with meridional circulation were analyzed. Effects of stochastic fluctuations on the energy of an α-Ω dynamo with meridional circulation were investigated. This proved extremely useful in generating a clearer understanding of the effect of dynamical noise on the unperturbed system, which in turn informed the study of the light intensity curve of the white dwarf PG 1351+489. Dynamical resetting was identified for PG 1351+489, using phase space methods, and then, using nonlinear noise reduction methods, the white noise tail of the power spectrum was lowered by a factor of 40. This allowed the identification of 10 new lines in the power spectrum. Finally, using Poincaré section return times, a periodicity in the light curve of the cataclysmic variable SS Cygni was identified. We initially expected that time delay methods would be useful as a qualitative comparison tool. However, they were capable, under the proper set of constraints on the data sets, of providing quantitative information about the signal source.
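
    The phase space methods mentioned in this abstract rest on time-delay embedding. A minimal sketch, with a hypothetical function name and an artificial ramp signal:

```python
def delay_embed(series, dim, tau):
    """Time-delay (Takens) embedding: map a scalar series into
    dim-dimensional vectors [x(t), x(t - tau), ..., x(t - (dim-1)*tau)]."""
    start = (dim - 1) * tau
    return [[series[t - k * tau] for k in range(dim)]
            for t in range(start, len(series))]

x = list(range(10))            # simple ramp, purely for illustration
vectors = delay_embed(x, dim=3, tau=2)
```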

  17. Exploring large scale time-series data using nested timelines

    NASA Astrophysics Data System (ADS)

    Xie, Zaixian; Ward, Matthew O.; Rundensteiner, Elke A.

    2013-01-01

    When data analysts study time-series data, an important task is to discover how data patterns change over time. If the dataset is very large, this task becomes challenging. Researchers have developed many visualization techniques to help address this problem. However, little work has been done regarding the changes of multivariate patterns, such as linear trends and clusters, on time-series data. In this paper, we describe a set of history views to fill this gap. This technique works under two modes: merge and non-merge. For the merge mode, merge algorithms were applied to selected time windows to generate a change-based hierarchy. Contiguous time windows having similar patterns are merged first. Users can choose different levels of merging with the tradeoff between more details in the data and less visual clutter in the visualizations. In the non-merge mode, the framework can use natural hierarchical time units or one defined by domain experts to represent timelines. This can help users navigate across long time periods. Grid-based views were designed to provide a compact overview for the history data. In addition, MDS pattern starfields and distance maps were developed to enable users to quickly investigate the degree of pattern similarity among different time periods. The usability evaluation demonstrated that most participants could understand the concepts of the history views correctly and finished assigned tasks with high accuracy and relatively fast response times.
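
    The merge mode can be illustrated with a greedy sketch that merges contiguous windows whose summary statistics stay within a tolerance. The paper's actual merge algorithms and change-based hierarchy are more elaborate; everything below is a hypothetical simplification:

```python
def merge_windows(window_means, tol):
    """Greedily merge contiguous time windows whose values stay within tol
    of the running mean of the current merged group.
    Returns (start_index, end_index, group_mean) tuples."""
    merged = []
    start, total, count = 0, window_means[0], 1
    for i in range(1, len(window_means)):
        mean = total / count
        if abs(window_means[i] - mean) < tol:
            total += window_means[i]
            count += 1
        else:
            merged.append((start, i - 1, mean))
            start, total, count = i, window_means[i], 1
    merged.append((start, len(window_means) - 1, total / count))
    return merged

segments = merge_windows([1.0, 1.1, 0.9, 5.0, 5.2, 1.0], tol=0.5)
```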

  18. Time series forecasting by combining the radial basis function network and the self-organizing map

    NASA Astrophysics Data System (ADS)

    Lin, Gwo-Fong; Chen, Lu-Hsien

    2005-06-01

    Based on a combination of a radial basis function network (RBFN) and a self-organizing map (SOM), a time-series forecasting model is proposed. Traditionally, the positioning of the radial basis centres is a crucial problem for the RBFN. In the proposed model, an SOM is used to construct the two-dimensional feature map from which the number of clusters (i.e. the number of hidden units in the RBFN) can be determined directly by visual inspection, and then the radial basis centres can be determined easily. The proposed model is examined using simulated time series data. The results demonstrate that the proposed model is more effective at modelling and forecasting time series than an autoregressive integrated moving average (ARIMA) model. Finally, the proposed model is applied to actual groundwater head data. It is found that the proposed model can forecast more precisely than the ARIMA model. For time series forecasting, the proposed model is recommended as an alternative to the existing method, because it has a simple structure and can produce reasonable forecasts.
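
    The overall flow (choose radial basis centres, then solve a linear least-squares problem for the output weights) can be sketched as follows. Here the centres are fixed by hand rather than read off an SOM feature map, and all names, widths, and data are illustrative:

```python
import math

def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def rbf(x, c, w=1.0):
    """Gaussian radial basis function centred at c with width w."""
    return math.exp(-((x - c) / w) ** 2)

# In the paper the centres come from the SOM feature map; here they are fixed.
centers = [0.0, 2.0, 4.0]
xs = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0]
ys = [rbf(x, 2.0) for x in xs]          # target exactly representable by centre 2

Phi = [[rbf(x, c) for c in centers] for x in xs]
# Normal equations: (Phi^T Phi) w = Phi^T y
PtP = [[sum(Phi[i][a] * Phi[i][b] for i in range(len(xs))) for b in range(3)]
       for a in range(3)]
Pty = [sum(Phi[i][a] * ys[i] for i in range(len(xs))) for a in range(3)]
weights = gauss_solve(PtP, Pty)
pred = [sum(w * rbf(x, c) for w, c in zip(weights, centers)) for x in xs]
```

    Because the target here lies exactly in the span of the basis, the least-squares fit recovers it; in a real forecasting setting the inputs would be lagged values of the series.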

  19. A Time-Frequency Functional Model for Locally Stationary Time Series Data

    PubMed Central

    Qin, Li; Guo, Wensheng; Litt, Brian

    2009-01-01

    Unlike traditional time series analysis that focuses on one long time series, in many biomedical experiments, it is common to collect multiple time series and focus on how the design covariates impact the patterns of stochastic variation over time. In this article, we propose a time-frequency functional model for a family of time series indexed by a set of covariates. This model can be used to compare groups of time series in terms of the patterns of stochastic variation and to estimate the covariate effects. We focus our development on locally stationary time series and propose the covariate-indexed locally stationary setting, which include stationary processes as special cases. We use smoothing spline ANOVA models for the time-frequency coefficients. A two-stage procedure is introduced for estimation. To reduce the computational demand, we develop an equivalent state space model to the proposed model with an efficient algorithm. We also propose a new simulation method to generate replicated time series from their design spectra. An epileptic intracranial electroencephalogram (IEEG) dataset is analyzed for illustration. PMID:20228961

  20. Time warp edit distance with stiffness adjustment for time series matching.

    PubMed

    Marteau, Pierre-François

    2009-02-01

    In a way similar to the string-to-string correction problem, we address discrete time series similarity in light of a time-series-to-time-series-correction problem for which the similarity between two time series is measured as the minimum cost sequence of edit operations needed to transform one time series into another. To define the edit operations, we use the paradigm of a graphical editing process and end up with a dynamic programming algorithm that we call Time Warp Edit Distance (TWED). TWED is slightly different in form from Dynamic Time Warping (DTW), Longest Common Subsequence (LCSS), or Edit Distance with Real Penalty (ERP) algorithms. In particular, it highlights a parameter that controls a kind of stiffness of the elastic measure along the time axis. We show that the similarity provided by TWED is a potentially useful metric in time series retrieval applications since it could benefit from the triangle inequality property to speed up the retrieval process while tuning the parameters of the elastic measure. In that context, a lower bound is derived to link the matching of time series into downsampled representation spaces to the matching into the original space. The empirical quality of the TWED distance is evaluated on a simple classification task. Compared to Edit Distance, DTW, LCSS, and ERP, TWED has proved to be quite effective on the considered experimental task.
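
    The TWED recurrence can be sketched directly as a dynamic program over padded series. The parameter defaults below are illustrative, not the paper's recommended settings:

```python
def twed(a, ta, b, tb, nu=0.001, lam=1.0):
    """Time Warp Edit Distance sketch: nu is the stiffness parameter
    penalizing timestamp differences; lam is the constant edit penalty."""
    a, ta = [0.0] + list(a), [0.0] + list(ta)
    b, tb = [0.0] + list(b), [0.0] + list(tb)
    n, m = len(a) - 1, len(b) - 1
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            delete_a = (D[i - 1][j] + abs(a[i] - a[i - 1])
                        + nu * (ta[i] - ta[i - 1]) + lam)
            delete_b = (D[i][j - 1] + abs(b[j] - b[j - 1])
                        + nu * (tb[j] - tb[j - 1]) + lam)
            match = (D[i - 1][j - 1] + abs(a[i] - b[j]) + abs(a[i - 1] - b[j - 1])
                     + nu * (abs(ta[i] - tb[j]) + abs(ta[i - 1] - tb[j - 1])))
            D[i][j] = min(delete_a, delete_b, match)
    return D[n][m]

series = [1.0, 2.0, 3.0]
times = [1.0, 2.0, 3.0]
other = [1.0, 2.0, 4.0]
```

    For identical series the distance is zero, and the measure is symmetric in its arguments.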

  1. Effects of linear trends on estimation of noise in GNSS position time-series

    NASA Astrophysics Data System (ADS)

    Dmitrieva, K.; Segall, P.; Bradley, A. M.

    2017-01-01

    A thorough understanding of time-dependent noise in Global Navigation Satellite System (GNSS) position time-series is necessary for computing uncertainties in any signals found in the data. However, estimation of time-correlated noise is a challenging task and is complicated by the difficulty in separating noise from signal, the features of greatest interest in the time-series. In this paper, we investigate how linear trends affect the estimation of noise in daily GNSS position time-series. We use synthetic time-series to study the relationship between linear trends and estimates of time-correlated noise for the six most commonly cited noise models. We find that the effects of added linear trends, or conversely de-trending, vary depending on the noise model. The commonly adopted model of random walk (RW), flicker noise (FN) and white noise (WN) is the most severely affected by de-trending, with estimates of low-amplitude RW most severely biased. FN plus WN is least affected by adding or removing trends. Non-integer power-law noise estimates are also less affected by de-trending, but are very sensitive to the addition of trend when the spectral index is less than one. We derive an analytical relationship between linear trends and the estimated RW variance for the special case of pure RW noise. Overall, we find that to ascertain the correct noise model for GNSS position time-series and to estimate the correct noise parameters, it is important to have independent constraints on the actual trends in the data.
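
    The core effect is easy to demonstrate: a deterministic linear trend inflates a simple increment-based estimate of random-walk variance, and ordinary least-squares de-trending removes it. The estimator and names below are illustrative simplifications, not the noise-model fitting used in the paper:

```python
def ols_line(series):
    """Ordinary least-squares line fit to evenly sampled data; returns (a, b)
    for the model x_t = a + b*t."""
    n = len(series)
    tbar = (n - 1) / 2.0
    xbar = sum(series) / n
    num = sum((t - tbar) * (x - xbar) for t, x in enumerate(series))
    den = sum((t - tbar) ** 2 for t in range(n))
    slope = num / den
    return xbar - slope * tbar, slope

def rw_variance_estimate(series):
    """Estimate random-walk innovation variance as the mean squared increment
    (a common simple estimator; assumes a driftless RW)."""
    incs = [b - a for a, b in zip(series, series[1:])]
    return sum(d * d for d in incs) / len(incs)

x = [2.0 + 0.5 * t for t in range(10)]       # a pure linear trend, no noise
a, b = ols_line(x)
detrended = [v - (a + b * t) for t, v in enumerate(x)]
```

    On this noise-free trend, the raw estimator reports spurious "random walk" variance that vanishes entirely after de-trending, which is the bias mechanism the study quantifies for real noise models.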

  2. Effects of linear trends on estimation of noise in GNSS position time series

    NASA Astrophysics Data System (ADS)

    Dmitrieva, K.; Segall, P.; Bradley, A. M.

    2016-10-01

    A thorough understanding of time dependent noise in Global Navigation Satellite System (GNSS) position time series is necessary for computing uncertainties in any signals found in the data. However, estimation of time-correlated noise is a challenging task and is complicated by the difficulty in separating noise from signal, the features of greatest interest in the time series. In this paper we investigate how linear trends affect the estimation of noise in daily GNSS position time series. We use synthetic time series to study the relationship between linear trends and estimates of time-correlated noise for the six most commonly cited noise models. We find that the effects of added linear trends, or conversely de-trending, vary depending on the noise model. The commonly adopted model of random walk (RW), flicker noise (FN), and white noise (WN) is the most severely affected by de-trending, with estimates of low amplitude RW most severely biased. Flicker noise plus white noise is least affected by adding or removing trends. Non-integer power-law noise estimates are also less affected by de-trending, but are very sensitive to the addition of trend when the spectral index is less than one. We derive an analytical relationship between linear trends and the estimated random walk variance for the special case of pure random walk noise. Overall, we find that to ascertain the correct noise model for GNSS position time series and to estimate the correct noise parameters, it is important to have independent constraints on the actual trends in the data.

  3. Effects of linear trends on estimation of noise in GNSS position time-series

    DOE PAGES

    Dmitrieva, K.; Segall, P.; Bradley, A. M.

    2016-10-20

    A thorough understanding of time-dependent noise in Global Navigation Satellite System (GNSS) position time-series is necessary for computing uncertainties in any signals found in the data. However, estimation of time-correlated noise is a challenging task and is complicated by the difficulty in separating noise from signal, the features of greatest interest in the time-series. In this study, we investigate how linear trends affect the estimation of noise in daily GNSS position time-series. We use synthetic time-series to study the relationship between linear trends and estimates of time-correlated noise for the six most commonly cited noise models. We find that the effects of added linear trends, or conversely de-trending, vary depending on the noise model. The commonly adopted model of random walk (RW), flicker noise (FN) and white noise (WN) is the most severely affected by de-trending, with estimates of low-amplitude RW most severely biased. FN plus WN is least affected by adding or removing trends. Non-integer power-law noise estimates are also less affected by de-trending, but are very sensitive to the addition of trend when the spectral index is less than one. We derive an analytical relationship between linear trends and the estimated RW variance for the special case of pure RW noise. Overall, we find that to ascertain the correct noise model for GNSS position time-series and to estimate the correct noise parameters, it is important to have independent constraints on the actual trends in the data.

  4. Effects of linear trends on estimation of noise in GNSS position time-series

    SciTech Connect

    Dmitrieva, K.; Segall, P.; Bradley, A. M.

    2016-10-20

    A thorough understanding of time-dependent noise in Global Navigation Satellite System (GNSS) position time-series is necessary for computing uncertainties in any signals found in the data. However, estimation of time-correlated noise is a challenging task and is complicated by the difficulty in separating noise from signal, the features of greatest interest in the time-series. In this study, we investigate how linear trends affect the estimation of noise in daily GNSS position time-series. We use synthetic time-series to study the relationship between linear trends and estimates of time-correlated noise for the six most commonly cited noise models. We find that the effects of added linear trends, or conversely de-trending, vary depending on the noise model. The commonly adopted model of random walk (RW), flicker noise (FN) and white noise (WN) is the most severely affected by de-trending, with estimates of low-amplitude RW most severely biased. FN plus WN is least affected by adding or removing trends. Non-integer power-law noise estimates are also less affected by de-trending, but are very sensitive to the addition of trend when the spectral index is less than one. We derive an analytical relationship between linear trends and the estimated RW variance for the special case of pure RW noise. Overall, we find that to ascertain the correct noise model for GNSS position time-series and to estimate the correct noise parameters, it is important to have independent constraints on the actual trends in the data.

  5. Comparison of time series models for predicting campylobacteriosis risk in New Zealand.

    PubMed

    Al-Sakkaf, A; Jones, G

    2014-05-01

    Predicting campylobacteriosis cases is a matter of considerable concern in New Zealand, after the number of notified cases was the highest among developed countries in 2006. Thus, there is a need to develop a model or tool to predict the number of campylobacteriosis cases accurately, since the Microbial Risk Assessment Model previously used for this purpose failed to predict the actual number of cases. We explore the appropriateness of classical time series modelling approaches for predicting campylobacteriosis. Finding the most appropriate time series model for New Zealand data involves additional practical considerations given a possible structural change, that is, a specific and sudden change in response to the implemented interventions. A univariate methodological approach was used to predict monthly disease cases using New Zealand surveillance data of campylobacteriosis incidence from 1998 to 2009. The data from the years 1998 to 2008 were used to model the time series, with the year 2009 held out of the data set for model validation. The best two models were then fitted to the full 1998-2009 data and used to predict for each month of 2010. The Holt-Winters (multiplicative) and ARIMA (additive) intervention models were considered the best models for predicting campylobacteriosis in New Zealand. The prediction by the additive ARIMA model with intervention was slightly better than that of the multiplicative Holt-Winters method for the annual total in 2010, the former predicting only 23 fewer cases than the actual reported cases. It is confirmed that classical time series techniques such as ARIMA with intervention and Holt-Winters can provide good prediction performance for campylobacteriosis risk in New Zealand. The results reported by this study are useful to the New Zealand Health and Safety Authority's efforts in addressing the problem of the campylobacteriosis epidemic.
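
    For illustration, an additive Holt-Winters smoother can be sketched in a few lines. Note that the variant the paper found competitive was multiplicative; the additive form is used here only because it is the simpler sketch, and the smoothing constants and data are purely hypothetical:

```python
def holt_winters_additive(x, season, alpha=0.3, beta=0.1, gamma=0.2, horizon=1):
    """Additive Holt-Winters smoothing with an h-step-ahead forecast.
    Requires at least two full seasons of data for initialization."""
    s = season
    level = sum(x[:s]) / s
    trend = (sum(x[s:2 * s]) / s - level) / s
    seas = [x[i] - level for i in range(s)]
    for t in range(s, len(x)):
        prev = level
        level = alpha * (x[t] - seas[t % s]) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev) + (1 - beta) * trend
        seas[t % s] = gamma * (x[t] - level) + (1 - gamma) * seas[t % s]
    return level + horizon * trend + seas[(len(x) + horizon - 1) % s]

flat = [100.0] * 24                      # two "years" of a constant monthly count
forecast = holt_winters_additive(flat, season=12)
```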

  6. Rényi’s information transfer between financial time series

    NASA Astrophysics Data System (ADS)

    Jizba, Petr; Kleinert, Hagen; Shefaat, Mohammad

    2012-05-01

    In this paper, we quantify the statistical coherence between financial time series by means of the Rényi entropy. With the help of Campbell’s coding theorem, we show that the Rényi entropy selectively emphasizes only certain sectors of the underlying empirical distribution while strongly suppressing others. This accentuation is controlled with Rényi’s parameter q. To tackle the issue of the information flow between time series, we formulate the concept of Rényi’s transfer entropy as a measure of information that is transferred only between certain parts of underlying distributions. This is particularly pertinent in financial time series, where the knowledge of marginal events such as spikes or sudden jumps is of a crucial importance. We apply the Rényian information flow to stock market time series from 11 world stock indices as sampled at a daily rate in the time period 02.01.1990-31.12.2009. Corresponding heat maps and net information flows are represented graphically. A detailed discussion of the transfer entropy between the DAX and S&P500 indices based on minute tick data gathered in the period 02.04.2008-11.09.2009 is also provided. Our analysis shows that the bivariate information flow between world markets is strongly asymmetric with a distinct information surplus flowing from the Asia-Pacific region to both European and US markets. An important yet less dramatic excess of information also flows from Europe to the US. This is particularly clearly seen from a careful analysis of Rényi information flow between the DAX and S&P500 indices.
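
    The order-q Rényi entropy underlying the transfer-entropy construction can be sketched directly (the transfer entropy itself, which conditions on past states, is omitted here). The example distributions are hypothetical:

```python
import math

def renyi_entropy(p, q):
    """Rényi entropy of order q, in bits: H_q = log2(sum p_i^q) / (1 - q).
    Reduces to Shannon entropy in the limit q -> 1 (not handled here)."""
    assert abs(q - 1.0) > 1e-9, "use Shannon entropy for q = 1"
    return math.log2(sum(pi ** q for pi in p)) / (1.0 - q)

uniform = [0.25, 0.25, 0.25, 0.25]
peaked = [0.85, 0.05, 0.05, 0.05]
```

    For a uniform distribution every order gives the same value, while for a peaked distribution q > 1 emphasizes the dominant outcome and q < 1 accentuates rare events such as spikes, which is what makes the order parameter useful for tail-sensitive information flow.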

  7. FTSPlot: Fast Time Series Visualization for Large Datasets

    PubMed Central

    Riss, Michael

    2014-01-01

    The analysis of electrophysiological recordings often involves visual inspection of time series data to locate specific experiment epochs, mask artifacts, and verify the results of signal processing steps, such as filtering or spike detection. Long-term experiments with continuous data acquisition generate large amounts of data. Rapid browsing through these massive datasets poses a challenge to conventional data plotting software because the plotting time increases proportionately to the increase in the volume of data. This paper presents FTSPlot, which is a visualization concept for large-scale time series datasets using techniques from the field of high performance computer graphics, such as hierarchic level of detail and out-of-core data handling. In a preprocessing step, time series data, event, and interval annotations are converted into an optimized data format, which then permits fast, interactive visualization. The preprocessing step has a computational complexity that grows with the size of the dataset; the visualization itself is independent of the amount of data. A demonstration prototype has been implemented, and benchmarks show that the technology is capable of displaying large amounts of time series data, event, and interval annotations lag-free, with millisecond response times. The current 64-bit implementation theoretically supports very large datasets; benchmarks have been conducted with 1 TiB of double-precision samples. The presented software is freely available and can be included as a Qt GUI component in future software projects, providing a standard visualization method for long-term electrophysiological experiments. PMID:24732865
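
    The heart of a level-of-detail scheme for time series plotting is per-bucket min/max reduction: a zoomed-out view only needs the envelope of each pixel-wide bucket, not every sample. A minimal sketch (not FTSPlot's actual data format):

```python
def minmax_downsample(samples, bucket):
    """Reduce a long series to per-bucket (min, max) pairs, preserving the
    visual envelope of the signal at a coarser resolution."""
    out = []
    for i in range(0, len(samples), bucket):
        chunk = samples[i:i + bucket]
        out.append((min(chunk), max(chunk)))
    return out

data = [0, 9, 2, 7, 4, 5, 6, 3, 8, 1]
lod = minmax_downsample(data, bucket=5)
```

    Precomputing such reductions at several bucket sizes yields the hierarchic level-of-detail pyramid that makes browsing time independent of dataset size.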

  8. Deducing acidification rates based on short-term time series

    PubMed Central

    Lui, Hon-Kit; Arthur Chen, Chen-Tung

    2015-01-01

    We show that, statistically, the simple linear regression (SLR)-determined rate of temporal change in seawater pH (βpH), the so-called acidification rate, can be expressed as a linear combination of a constant (the estimated rate of temporal change in pH) and SLR-determined rates of temporal changes in other variables (deviation largely due to various sampling distributions), despite complications due to different observation durations and temporal sampling distributions. Observations show that five time series data sets worldwide, with observation times from 9 to 23 years, have yielded βpH values that vary from 1.61 × 10−3 to −2.5 × 10−3 pH unit yr−1. After correcting for the deviation, these data now all yield an acidification rate similar to what is expected under the air-sea CO2 equilibrium (−1.6 × 10−3 ~ −1.8 × 10−3 pH unit yr−1). Although long-term time series stations may have evenly distributed datasets, shorter time series may suffer large errors which are correctable by this method. PMID:26143749
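
    The SLR-determined rate of change is simply the ordinary least-squares slope of the variable against time. A minimal sketch with a hypothetical, noise-free pH record (the trend value is invented for illustration):

```python
def slr_rate(years, values):
    """Simple-linear-regression slope: the 'rate of temporal change'
    referred to in the abstract (e.g. beta_pH)."""
    n = len(years)
    ybar = sum(years) / n
    vbar = sum(values) / n
    num = sum((y - ybar) * (v - vbar) for y, v in zip(years, values))
    den = sum((y - ybar) ** 2 for y in years)
    return num / den

years = [2000 + k for k in range(10)]
ph = [8.10 - 0.0017 * k for k in range(10)]    # hypothetical declining pH record
rate = slr_rate(years, ph)
```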

  9. A multivariate time-series approach to marital interaction

    PubMed Central

    Kupfer, Jörg; Brosig, Burkhard; Brähler, Elmar

    2005-01-01

    Time-series analysis (TSA) is frequently used in order to clarify complex structures of mutually interacting panel data. The method helps in understanding how the course of a dependent variable is predicted by independent time-series with no time lag, as well as by previous observations of that dependent variable (autocorrelation) and of independent variables (cross-correlation). The study analyzes the marital interaction of a married couple under clinical conditions over a period of 144 days by means of TSA. The data were collected within a course of couple therapy. The male partner was affected by a severe condition of atopic dermatitis and the woman suffered from bulimia nervosa. Each of the partners completed a mood questionnaire and a body symptom checklist. After the determination of auto- and cross-correlations between and within the parallel data sets, multivariate time-series models were specified. Mutual and individual patterns of emotional reactions explained 14% (skin) and 33% (bulimia) of the total variance in both dependent variables (adj. R², p<0.0001 for the multivariate models). The question was discussed whether multivariate TSA-models represent a suitable approach to the empirical exploration of clinical marital interaction. PMID:19742066

  10. A multivariate time-series approach to marital interaction.

    PubMed

    Kupfer, Jörg; Brosig, Burkhard; Brähler, Elmar

    2005-08-02

    Time-series analysis (TSA) is frequently used in order to clarify complex structures of mutually interacting panel data. The method helps in understanding how the course of a dependent variable is predicted by independent time-series with no time lag, as well as by previous observations of that dependent variable (autocorrelation) and of independent variables (cross-correlation). The study analyzes the marital interaction of a married couple under clinical conditions over a period of 144 days by means of TSA. The data were collected within a course of couple therapy. The male partner was affected by a severe condition of atopic dermatitis and the woman suffered from bulimia nervosa. Each of the partners completed a mood questionnaire and a body symptom checklist. After the determination of auto- and cross-correlations between and within the parallel data sets, multivariate time-series models were specified. Mutual and individual patterns of emotional reactions explained 14% (skin) and 33% (bulimia) of the total variance in both dependent variables (adj. R², p<0.0001 for the multivariate models). The question was discussed whether multivariate TSA-models represent a suitable approach to the empirical exploration of clinical marital interaction.
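
    Lagged cross-correlation is the basic building block of such TSA models. A minimal sketch with hypothetical mood/symptom series, where the symptom tracks the mood exactly one day later:

```python
def cross_correlation(x, y, lag):
    """Pearson correlation between x[t] and y[t + lag] over the overlap.
    A positive lag asks: does x lead y?"""
    if lag >= 0:
        xs, ys = x[:len(x) - lag], y[lag:]
    else:
        xs, ys = x[-lag:], y[:len(y) + lag]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = sum((a - mx) ** 2 for a in xs)
    vy = sum((b - my) ** 2 for b in ys)
    return cov / (vx * vy) ** 0.5

mood = [1.0, 3.0, 2.0, 5.0, 4.0, 6.0, 5.0, 7.0]
symptom = [0.0] + mood[:-1]          # symptom follows mood one day later
```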

  11. Nonstationary hydrological time series forecasting using nonlinear dynamic methods

    NASA Astrophysics Data System (ADS)

    Coulibaly, Paulin; Baldwin, Connely K.

    2005-06-01

    Recent evidence of nonstationary trends in water resources time series as a result of natural and/or anthropogenic climate variability and change has raised more interest in nonlinear dynamic system modeling methods. In this study, the effectiveness of dynamically driven recurrent neural networks (RNN) for complex time-varying water resources system modeling is investigated. An optimal dynamic RNN approach is proposed to directly forecast different nonstationary hydrological time series. The proposed method automatically selects the most optimally trained network in each case. The simulation performance of the dynamic RNN-based model is compared with the results obtained from optimal multivariate adaptive regression splines (MARS) models. It is shown that the dynamically driven RNN model can be a good alternative for the modeling of complex dynamics of a hydrological system, performing better than the MARS model on the three selected hydrological time series, namely the historical storage volumes of the Great Salt Lake, the Saint-Lawrence River flows, and the Nile River flows.

  12. Dynamical Analysis and Visualization of Tornadoes Time Series

    PubMed Central

    2015-01-01

    In this paper we analyze the behavior of tornado time-series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series spanning 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns. PMID:25790281

  13. Dynamical analysis and visualization of tornadoes time series.

    PubMed

    Lopes, António M; Tenreiro Machado, J A

    2015-01-01

    In this paper we analyze the behavior of tornado time-series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series spanning 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns.

  14. Satellite time series analysis using Empirical Mode Decomposition

    NASA Astrophysics Data System (ADS)

    Pannimpullath, R. Renosh; Doolaeghe, Diane; Loisel, Hubert; Vantrepotte, Vincent; Schmitt, Francois G.

    2016-04-01

    Geophysical fields possess large fluctuations over many spatial and temporal scales. Successive satellite images provide an interesting sampling of this spatio-temporal multiscale variability. Here we propose to consider such variability by performing satellite time series analysis, pixel by pixel, using Empirical Mode Decomposition (EMD). EMD is a time series analysis technique able to decompose an original time series into a sum of modes, each one having a different mean frequency. It can be used to smooth signals and to extract trends. It is built in a data-adaptive way, and is able to extract information from nonlinear signals. Here we use MERIS Suspended Particulate Matter (SPM) data, on a weekly basis, over 10 years, giving 458 successive time steps. We have selected 5 different regions of coastal waters for the present study: Vietnam coastal waters, the Brahmaputra region, St. Lawrence, the English Channel and McKenzie. These regions have high SPM concentrations due to large-scale river run-off. Trends and Hurst exponents are derived for each pixel in each region. The energy is also extracted using Hilbert Spectral Analysis (HSA) together with the EMD method. For each region, the energy of each mode is normalised by the total energy computed over all modes.

  15. Financial time series analysis based on information categorization method

    NASA Astrophysics Data System (ADS)

    Tian, Qiang; Shang, Pengjian; Feng, Guochen

    2014-12-01

    The paper mainly applies the information categorization method to analyze financial time series. The method examines the similarity of different sequences by calculating the distances between them. We apply this method to quantify the similarity of different stock markets, and we report results for the US and Chinese stock markets in the periods 1991-1998 (before the Asian currency crisis), 1999-2006 (after the Asian currency crisis and before the global financial crisis), and 2007-2013 (during and after the global financial crisis). The results show how the similarity between different stock markets differs across time periods, and that the similarity of the two stock markets becomes larger after these two crises. We also obtain similarity results for 10 stock indices in three areas, showing that the method can distinguish the markets of different areas in the resulting phylogenetic trees. These results show that satisfactory information can be extracted from financial markets by this method, which can be applied not only to physiologic time series, but also to financial time series.

  16. The Puoko-nui CCD Time-Series Photometer

    NASA Astrophysics Data System (ADS)

    Chote, P.; Sullivan, D. J.

    2013-01-01

    Puoko-nui (te reo Maori for ‘big eye’) is a precision time series photometer developed at Victoria University of Wellington, primarily for use with the 1m McLellan telescope at Mt John University Observatory (MJUO), at Lake Tekapo, New Zealand. GPS-based timing provides excellent accuracy, and online reduction software processes frames as they are acquired. The user is presented with a simple interface that includes instrument control and an up-to-date light curve and Fourier amplitude spectrum of the target star. Puoko-nui has been operating in its current form since early 2011, and is primarily used to monitor pulsating white dwarf stars.

  17. West Africa land use and land cover time series

    USGS Publications Warehouse

    Cotillon, Suzanne E.

    2017-02-16

    Started in 1999, the West Africa Land Use Dynamics project represents an effort to map land use and land cover, characterize the trends in time and space, and understand their effects on the environment across West Africa. The outcome of the West Africa Land Use Dynamics project is the production of a three-time period (1975, 2000, and 2013) land use and land cover dataset for the Sub-Saharan region of West Africa, including the Cabo Verde archipelago. The West Africa Land Use Land Cover Time Series dataset offers a unique basis for characterizing and analyzing land changes across the region, systematically and at an unprecedented level of detail.

  18. Nonparametric autocovariance estimation from censored time series by Gaussian imputation.

    PubMed

    Park, Jung Wook; Genton, Marc G; Ghosh, Sujit K

    2009-02-01

    One of the most frequently used methods to model the autocovariance function of a second-order stationary time series is to use the parametric framework of autoregressive and moving average models developed by Box and Jenkins. However, such parametric models, though very flexible, may not always be adequate to model autocovariance functions with sharp changes. Furthermore, if the data do not follow the parametric model and are censored at a certain value, the estimation results may not be reliable. We develop a Gaussian imputation method to estimate an autocovariance structure via nonparametric estimation of the autocovariance function in order to address both censoring and incorrect model specification. We demonstrate the effectiveness of the technique in terms of bias and efficiency with simulations under various rates of censoring and underlying models. We describe its application to a time series of silicon concentrations in the Arctic.
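    The nonparametric sample autocovariance estimator at the base of such an analysis can be sketched as follows (a plain estimator on a fully observed series; the paper's censoring and Gaussian imputation machinery is not reproduced here):

```python
import random

def sample_autocovariance(x, max_lag):
    """Biased (1/n) sample autocovariance estimates gamma(0..max_lag)."""
    n = len(x)
    mean = sum(x) / n
    d = [v - mean for v in x]
    return [sum(d[t] * d[t + k] for t in range(n - k)) / n
            for k in range(max_lag + 1)]

# AR(1) check: for x[t] = phi*x[t-1] + e[t], gamma(k)/gamma(0) ~ phi^k.
rng = random.Random(1)
phi, x = 0.7, [0.0]
for _ in range(5000):
    x.append(phi * x[-1] + rng.gauss(0, 1))
gamma = sample_autocovariance(x, 3)
```

    The 1/n normalization keeps the implied autocovariance sequence positive semidefinite, which matters when the estimates feed later spectral or prediction steps.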

  19. Simple Patterns in Fluctuations of Time Series of Economic Interest

    NASA Astrophysics Data System (ADS)

    Fanchiotti, H.; García Canal, C. A.; García Zúñiga, H.

    Time series corresponding to nominal exchange rates between the US dollar and the currencies of Argentina, Brazil and the European Economic Community; different financial indexes such as the Dow Jones Industrial, the British Footsie, the German DAX Composite, the Australian Share Price and the Nikkei Cash; and different Argentine local tax revenues are analyzed, looking for the appearance of simple patterns and the possible definition of forecast evaluators. In every case, the statistical fractal dimensions are obtained from the behavior of the corresponding variance of increments at a given lag. The detrended fluctuation analysis of the data, in terms of the corresponding exponent of the resulting power law, is carried out. Finally, the frequency power spectra of all the time series considered are computed and compared.
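    The variance-of-increments estimate of the statistical fractal dimension mentioned above can be sketched as follows (a minimal version, assuming the increments at lag tau scale as tau^(2H), with fractal dimension D = 2 - H):

```python
import math, random

def hurst_from_increments(x, lags=(1, 2, 4, 8, 16, 32)):
    """Fit log Var(x[t+lag] - x[t]) vs log lag; the slope is 2H."""
    pts = []
    for lag in lags:
        inc = [x[t + lag] - x[t] for t in range(len(x) - lag)]
        m = sum(inc) / len(inc)
        var = sum((v - m) ** 2 for v in inc) / len(inc)
        pts.append((math.log(lag), math.log(var)))
    # least-squares slope of the log-log points
    n = len(pts)
    sx = sum(p[0] for p in pts); sy = sum(p[1] for p in pts)
    sxx = sum(p[0] ** 2 for p in pts); sxy = sum(p[0] * p[1] for p in pts)
    slope = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
    return slope / 2

rng = random.Random(7)
walk = [0.0]
for _ in range(5000):
    walk.append(walk[-1] + rng.gauss(0, 1))  # uncorrelated random walk
H = hurst_from_increments(walk)              # ~0.5 for such a walk
D = 2 - H                                    # statistical fractal dimension
```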

  20. Modeling Philippine Stock Exchange Composite Index Using Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Gayo, W. S.; Urrutia, J. D.; Temple, J. M. F.; Sandoval, J. R. D.; Sanglay, J. E. A.

    2015-06-01

    This study was conducted to develop a time series model of the Philippine Stock Exchange Composite Index and its volatility using the finite mixture of ARIMA models with conditional variance equations such as ARCH, GARCH, EGARCH, TARCH and PARCH models. The study also aimed to find out the reason behind the behavior of PSEi, that is, which of the economic variables - Consumer Price Index, crude oil price, foreign exchange rate, gold price, interest rate, money supply, price-earnings ratio, Producers’ Price Index and terms of trade - can be used in projecting future values of PSEi; this was examined using the Granger Causality Test. The findings showed that the best time series model for the Philippine Stock Exchange Composite Index is ARIMA(1,1,5) - ARCH(1). Also, Consumer Price Index, crude oil price and foreign exchange rate are concluded to Granger-cause the Philippine Stock Exchange Composite Index.
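    The Granger causality check used in the study can be sketched at lag 1 with ordinary least squares over normal equations (a toy bivariate version, not the paper's full ARIMA-ARCH pipeline; all names are illustrative):

```python
import random

def _solve(A, b):
    """Gauss-Jordan elimination with partial pivoting for A x = b."""
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * v for a, v in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def _rss(y, X):
    """Residual sum of squares of y on columns X (intercept added)."""
    rows = [[1.0] + list(r) for r in X]
    k = len(rows[0])
    A = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yv for r, yv in zip(rows, y)) for i in range(k)]
    beta = _solve(A, b)
    return sum((yv - sum(c * v for c, v in zip(beta, r))) ** 2
               for yv, r in zip(y, rows))

def granger_f(cause, effect):
    """F statistic for 'cause Granger-causes effect' at lag 1."""
    y = effect[1:]
    restricted = _rss(y, [(e,) for e in effect[:-1]])
    full = _rss(y, list(zip(effect[:-1], cause[:-1])))
    n = len(y)
    return (restricted - full) / (full / (n - 3))

# Simulated pair: x drives y with one step of delay, not vice versa.
rng = random.Random(3)
x, y = [0.0], [0.0]
for _ in range(1000):
    y.append(0.5 * y[-1] + 0.8 * x[-1] + rng.gauss(0, 1))
    x.append(rng.gauss(0, 1))

f_xy = granger_f(x, y)   # large: past x improves the forecast of y
f_yx = granger_f(y, x)   # small: past y adds nothing for x
```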

  1. Assessment of correlations and crossover scale in electroseismic time series

    NASA Astrophysics Data System (ADS)

    Guzman-Vargas, L.; Ramírez-Rojas, A.; Angulo-Brown, F.

    2009-04-01

    Evaluating complex fluctuations in electroseismic time series is an important task not only for earthquake prediction but also for understanding complex processes related to earthquake preparation. Previous studies have reported alterations, such as the emergence of correlated dynamics in geoelectric potentials prior to an important earthquake (EQ). In this work, we apply detrended fluctuation analysis, and introduce a statistical procedure to characterize the presence of crossovers in scaling exponents, to analyze the fluctuations of geoelectric time series monitored at two sites located in Mexico. We find a complex behavior characterized by the presence of a crossover in the correlation exponents in the vicinity of a M=7.4 EQ that occurred on Sept. 14, 1995. Finally, we apply Student's t-test to evaluate the level of significance of the difference between the scaling exponents at short and large scales.
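    Detrended fluctuation analysis itself can be sketched in a few lines (a minimal first-order DFA; the paper's crossover-detection statistics are not reproduced):

```python
import math, random

def dfa(x, scales=(8, 16, 32, 64, 128)):
    """DFA scaling exponent alpha from a log-log fit of F(s) vs s."""
    mean = sum(x) / len(x)
    profile, s = [], 0.0
    for v in x:
        s += v - mean
        profile.append(s)            # integrated (cumulative-sum) profile
    pts = []
    for win in scales:
        n_win = len(profile) // win
        t = list(range(win))
        tm = (win - 1) / 2
        stt = sum((ti - tm) ** 2 for ti in t)
        sq = 0.0
        for w in range(n_win):
            seg = profile[w * win:(w + 1) * win]
            sm = sum(seg) / win
            beta = sum((ti - tm) * (si - sm) for ti, si in zip(t, seg)) / stt
            # squared residuals about the window's linear trend
            sq += sum((si - (sm + beta * (ti - tm))) ** 2
                      for ti, si in zip(t, seg))
        pts.append((math.log(win), 0.5 * math.log(sq / (n_win * win))))
    n = len(pts)
    sx = sum(p[0] for p in pts); sy = sum(p[1] for p in pts)
    sxx = sum(p[0] ** 2 for p in pts); sxy = sum(p[0] * p[1] for p in pts)
    return (n * sxy - sx * sy) / (n * sxx - sx ** 2)

rng = random.Random(11)
white = [rng.gauss(0, 1) for _ in range(4096)]
alpha = dfa(white)   # ~0.5 for uncorrelated noise
```

    A crossover, as discussed above, would show up as two distinct slopes when the log-log points are fitted separately over short and large scales.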

  2. Nonlinear modeling of chaotic time series: Theory and applications

    NASA Astrophysics Data System (ADS)

    Casdagli, M.; Eubank, S.; Farmer, J. D.; Gibson, J.; Desjardins, D.; Hunter, N.; Theiler, J.

    We review recent developments in the modeling and prediction of nonlinear time series. In some cases, apparent randomness in time series may be due to chaotic behavior of a nonlinear but deterministic system. In such cases, it is possible to exploit the determinism to make short term forecasts that are much more accurate than one could make from a linear stochastic model. This is done by first reconstructing a state space, and then using nonlinear function approximation methods to create a dynamical model. Nonlinear models are valuable not only as short term forecasters, but also as diagnostic tools for identifying and quantifying low-dimensional chaotic behavior. During the past few years, methods for nonlinear modeling have developed rapidly, and have already led to several applications where nonlinear models motivated by chaotic dynamics provide superior predictions to linear models. These applications include prediction of fluid flows, sunspots, mechanical vibrations, ice ages, measles epidemics, and human speech.
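    The scheme described above, reconstructing a state space and approximating the dynamics locally, can be sketched for a chaotic map with nearest-neighbor prediction in a delay embedding (a toy illustration, not the authors' code):

```python
def delay_embed(x, dim=2, tau=1):
    """Map a scalar series to delay vectors (x[t], x[t-tau], ...)."""
    start = (dim - 1) * tau
    return [tuple(x[t - k * tau] for k in range(dim))
            for t in range(start, len(x))]

def nn_forecast(train, query_state):
    """Predict the successor of the nearest training state."""
    states = delay_embed(train)
    best, pred = float("inf"), None
    for i in range(len(states) - 1):
        d = sum((a - b) ** 2 for a, b in zip(states[i], query_state))
        if d < best:
            best, pred = d, states[i + 1][0]
    return pred

# Chaotic logistic map data: x[t+1] = 4 x[t] (1 - x[t]).
x, v = [], 0.3
for _ in range(2200):
    v = 4 * v * (1 - v)
    x.append(v)
train, test = x[:2000], x[2000:]

errs = []
for t in range(1, len(test) - 1):
    state = (test[t], test[t - 1])
    errs.append((nn_forecast(train, state) - test[t + 1]) ** 2)
rmse = (sum(errs) / len(errs)) ** 0.5
```

    The one-step error is tiny despite the apparent randomness of the series, because the determinism is exploited; a linear stochastic model would do far worse here, since the map's linear autocorrelation is negligible.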

  3. Nonlinear modeling of chaotic time series: Theory and applications

    SciTech Connect

    Casdagli, M.; Eubank, S.; Farmer, J. D.; Gibson, J. (Santa Fe Inst., NM); Des Jardins, D.; Hunter, N.; Theiler, J.

    1990-01-01

    We review recent developments in the modeling and prediction of nonlinear time series. In some cases apparent randomness in time series may be due to chaotic behavior of a nonlinear but deterministic system. In such cases it is possible to exploit the determinism to make short term forecasts that are much more accurate than one could make from a linear stochastic model. This is done by first reconstructing a state space, and then using nonlinear function approximation methods to create a dynamical model. Nonlinear models are valuable not only as short term forecasters, but also as diagnostic tools for identifying and quantifying low-dimensional chaotic behavior. During the past few years methods for nonlinear modeling have developed rapidly, and have already led to several applications where nonlinear models motivated by chaotic dynamics provide superior predictions to linear models. These applications include prediction of fluid flows, sunspots, mechanical vibrations, ice ages, measles epidemics and human speech. 162 refs., 13 figs.

  4. Causal Discovery from Subsampled Time Series Data by Constraint Optimization

    PubMed Central

    Hyttinen, Antti; Plis, Sergey; Järvisalo, Matti; Eberhardt, Frederick; Danks, David

    2017-01-01

    This paper focuses on causal structure estimation from time series data in which measurements are obtained at a coarser timescale than the causal timescale of the underlying system. Previous work has shown that such subsampling can lead to significant errors about the system’s causal structure if not properly taken into account. In this paper, we first consider the search for the system timescale causal structures that correspond to a given measurement timescale structure. We provide a constraint satisfaction procedure whose computational performance is several orders of magnitude better than previous approaches. We then consider finite-sample data as input, and propose the first constraint optimization approach for recovering the system timescale causal structure. This algorithm optimally recovers from possible conflicts due to statistical errors. More generally, these advances allow for a robust and non-parametric estimation of system timescale causal structures from subsampled time series data. PMID:28203316

  5. Interpreting time series of patient satisfaction: macro vs. micro components.

    PubMed

    Frank, Björn; Sudo, Shuichi; Enkawa, Takao

    2009-01-01

    Recent research discovered that economic processes influence national averages of customer satisfaction. Using time series data from Japanese and South Korean hospitals, we conducted principal component regression analyses to examine whether these findings are transferable to patient satisfaction. Our results reveal that aggregate income has a positive impact and economic expectations have a negative impact on patient satisfaction. Further analyses demonstrate that these strong economic influences make it difficult for hospital managers to use patient satisfaction scores to assess the performance impact of their customer-oriented actions. In order to improve performance evaluations based on patient surveys, we thus recommend that managers remove economic influences from time series of patient satisfaction.

  6. A Surrogate Test for Pseudo-periodic Time Series Data

    NASA Astrophysics Data System (ADS)

    Small, Michael; Harrison, Robert G.; Tse, C. K.

    2002-07-01

    Standard (linear) surrogate methods are only useful for time series exhibiting no pseudo-periodic structure. We describe a new algorithm that can distinguish between a noisy periodic orbit and deterministic non-periodic inter-cycle dynamics. Possible origins of deterministic non-periodic inter-cycle dynamics include non-periodic linear or nonlinear dynamics, or chaos. This new algorithm is based on mimicking the large-scale dynamics with a local model, but obliterating the fine-scale features with dynamic noise. We demonstrate the application of this method to artificial data and experimental time series, including human electrocardiogram (ECG) recordings during sinus rhythm and ventricular tachycardia (VT). The method is able to successfully differentiate between the chaotic Rössler system and a pseudo-periodic realization of the Rössler equations with dynamic noise. Application to ECG data demonstrates that both sinus rhythm and VT exhibit nontrivial inter-cycle dynamics.
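    The logic of a surrogate test can be sketched with the simplest (shuffle) surrogates and a time-asymmetry discriminating statistic; the pseudo-periodic surrogate algorithm of the paper is more involved, and all names here are illustrative:

```python
import random

def asymmetry(x):
    """Third-moment time-asymmetry statistic; near zero on average for
    any temporally reversible (e.g., shuffled) series."""
    return sum((b - a) ** 3 for a, b in zip(x, x[1:])) / (len(x) - 1)

def surrogate_test(x, n_surr=99, seed=0):
    """Rank the observed statistic against shuffle surrogates."""
    rng = random.Random(seed)
    obs = asymmetry(x)
    surr = []
    for _ in range(n_surr):
        s = x[:]
        rng.shuffle(s)             # destroys temporal ordering only
        surr.append(asymmetry(s))
    exceed = sum(1 for v in surr if abs(v) >= abs(obs))
    return obs, exceed             # exceed == 0 -> p < 1/(n_surr + 1)

# Noisy sawtooth: slow rise, sharp fall -- strongly time-asymmetric.
rng = random.Random(42)
x = [(t % 20) / 20 + 0.05 * rng.gauss(0, 1) for t in range(1000)]
obs, exceed = surrogate_test(x)
```

    If the observed statistic falls outside the surrogate distribution, the null hypothesis embodied by the surrogate generator is rejected; the paper's contribution is a generator whose null preserves pseudo-periodic structure.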

  7. Analyzing Single-Molecule Time Series via Nonparametric Bayesian Inference

    PubMed Central

    Hines, Keegan E.; Bankston, John R.; Aldrich, Richard W.

    2015-01-01

    The ability to measure the properties of proteins at the single-molecule level offers an unparalleled glimpse into biological systems at the molecular scale. The interpretation of single-molecule time series has often been rooted in statistical mechanics and the theory of Markov processes. While existing analysis methods have been useful, they are not without significant limitations including problems of model selection and parameter nonidentifiability. To address these challenges, we introduce the use of nonparametric Bayesian inference for the analysis of single-molecule time series. These methods provide a flexible way to extract structure from data instead of assuming models beforehand. We demonstrate these methods with applications to several diverse settings in single-molecule biophysics. This approach provides a well-constrained and rigorously grounded method for determining the number of biophysical states underlying single-molecule data. PMID:25650922

  8. The multiscale analysis between stock market time series

    NASA Astrophysics Data System (ADS)

    Shi, Wenbin; Shang, Pengjian

    2015-11-01

    This paper is devoted to multiscale cross-correlation analysis of stock market time series, applying both the multiscale DCCA cross-correlation coefficient and multiscale cross-sample entropy (MSCE). The multiscale DCCA cross-correlation coefficient is a realization of the DCCA cross-correlation coefficient on multiple scales. The results of this method present a good scaling characterization; more significantly, the method is able to group stock markets by areas. Compared to the multiscale DCCA cross-correlation coefficient, MSCE presents a more remarkable scaling characterization, and its value for each financial log-return series decreases with increasing scale factor. However, its grouping results are not as good as those of the multiscale DCCA cross-correlation coefficient.
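    The DCCA cross-correlation coefficient at a single scale can be sketched as the detrended covariance of the two integrated series normalized by their detrended variances (a minimal version; the multiscale analysis simply repeats this over window sizes):

```python
import random

def _detrended_residuals(profile, win):
    """Linearly detrend successive windows of an integrated profile."""
    res = []
    t = list(range(win))
    tm = (win - 1) / 2
    stt = sum((ti - tm) ** 2 for ti in t)
    for w in range(len(profile) // win):
        seg = profile[w * win:(w + 1) * win]
        sm = sum(seg) / win
        beta = sum((ti - tm) * (si - sm) for ti, si in zip(t, seg)) / stt
        res.extend(si - (sm + beta * (ti - tm)) for ti, si in zip(t, seg))
    return res

def dcca_coefficient(x, y, win=32):
    """DCCA cross-correlation coefficient rho_DCCA at one scale."""
    def profile(z):
        m = sum(z) / len(z)
        out, s = [], 0.0
        for v in z:
            s += v - m
            out.append(s)
        return out
    rx = _detrended_residuals(profile(x), win)
    ry = _detrended_residuals(profile(y), win)
    fxy = sum(a * b for a, b in zip(rx, ry))
    fxx = sum(a * a for a in rx)
    fyy = sum(b * b for b in ry)
    return fxy / (fxx * fyy) ** 0.5

# Two series sharing a common component vs. an independent one.
rng = random.Random(5)
common = [rng.gauss(0, 1) for _ in range(4096)]
x = [c + rng.gauss(0, 1) for c in common]
y = [c + rng.gauss(0, 1) for c in common]
z = [rng.gauss(0, 1) for _ in range(4096)]
rho_shared = dcca_coefficient(x, y)   # ~0.5: half the variance is shared
rho_indep = dcca_coefficient(x, z)    # ~0
```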

  9. Causal Discovery from Subsampled Time Series Data by Constraint Optimization.

    PubMed

    Hyttinen, Antti; Plis, Sergey; Järvisalo, Matti; Eberhardt, Frederick; Danks, David

    2016-08-01

    This paper focuses on causal structure estimation from time series data in which measurements are obtained at a coarser timescale than the causal timescale of the underlying system. Previous work has shown that such subsampling can lead to significant errors about the system's causal structure if not properly taken into account. In this paper, we first consider the search for the system timescale causal structures that correspond to a given measurement timescale structure. We provide a constraint satisfaction procedure whose computational performance is several orders of magnitude better than previous approaches. We then consider finite-sample data as input, and propose the first constraint optimization approach for recovering the system timescale causal structure. This algorithm optimally recovers from possible conflicts due to statistical errors. More generally, these advances allow for a robust and non-parametric estimation of system timescale causal structures from subsampled time series data.

  10. Deviations from uniform power law scaling in nonstationary time series

    NASA Technical Reports Server (NTRS)

    Viswanathan, G. M.; Peng, C. K.; Stanley, H. E.; Goldberger, A. L.

    1997-01-01

    A classic problem in physics is the analysis of highly nonstationary time series that typically exhibit long-range correlations. Here we test the hypothesis that the scaling properties of the dynamics of healthy physiological systems are more stable than those of pathological systems by studying beat-to-beat fluctuations in the human heart rate. We develop techniques based on the Fano factor and Allan factor functions, as well as on detrended fluctuation analysis, for quantifying deviations from uniform power-law scaling in nonstationary time series. By analyzing extremely long data sets of up to N = 10^5 beats for 11 healthy subjects, we find that the fluctuations in the heart rate scale approximately uniformly over several temporal orders of magnitude. By contrast, we find that in data sets of comparable length for 14 subjects with heart disease, the fluctuations grow erratically, indicating a loss of scaling stability.
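    The Fano factor function used above is the variance-to-mean ratio of event counts in windows of duration T; it stays near 1 for a memoryless point process and grows with T under clustered, power-law correlated firing. A minimal sketch:

```python
import random

def fano_factor(events, window):
    """Variance/mean of event counts in consecutive windows.

    `events` is a 0/1 sequence marking event occurrences per time bin.
    """
    counts = [sum(events[i:i + window])
              for i in range(0, len(events) - window + 1, window)]
    m = sum(counts) / len(counts)
    v = sum((c - m) ** 2 for c in counts) / len(counts)
    return v / m

# Memoryless event train: each bin fires independently with p = 0.05,
# so the Fano factor stays near 1 at every window size.
rng = random.Random(2)
train = [1 if rng.random() < 0.05 else 0 for _ in range(20000)]
factors = [fano_factor(train, w) for w in (10, 100, 500)]
```

    For correlated heartbeat-like trains the same curve would rise with the window size, which is the deviation from uniform scaling the paper quantifies.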

  11. A Multiscale Approach to InSAR Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Hetland, E. A.; Muse, P.; Simons, M.; Lin, N.; Dicaprio, C. J.

    2010-12-01

    We present a technique to constrain time-dependent deformation from repeated satellite-based InSAR observations of a given region. This approach, which we call MInTS (Multiscale InSAR Time Series analysis), relies on a spatial wavelet decomposition to permit the inclusion of distance based spatial correlations in the observations while maintaining computational tractability. As opposed to single pixel InSAR time series techniques, MInTS takes advantage of both spatial and temporal characteristics of the deformation field. We use a weighting scheme which accounts for the presence of localized holes due to decorrelation or unwrapping errors in any given interferogram. We represent time-dependent deformation using a dictionary of general basis functions, capable of detecting both steady and transient processes. The estimation is regularized using a model resolution based smoothing so as to be able to capture rapid deformation where there are temporally dense radar acquisitions and to avoid oscillations during time periods devoid of acquisitions. MInTS also has the flexibility to explicitly parametrize known time-dependent processes that are expected to contribute to a given set of observations (e.g., co-seismic steps and post-seismic transients, secular variations, seasonal oscillations, etc.). We use cross validation to choose the regularization penalty parameter in the inversion for the time-dependent deformation field. We demonstrate MInTS using a set of 63 ERS-1/2 and 29 Envisat interferograms for Long Valley Caldera.

  12. Mode Analysis with Autocorrelation Method (Single Time Series) in Tokamak

    NASA Astrophysics Data System (ADS)

    Saadat, Shervin; Salem, Mohammad K.; Goranneviss, Mahmoud; Khorshid, Pejman

    2010-08-01

    In this paper, plasma modes are analyzed with a statistical method based on the autocorrelation function. The autocorrelation function is computed from a single time series, so only one Mirnov coil is needed. After autocorrelation analysis of the Mirnov coil data, a spectral density diagram is plotted; from its symmetries and trends, plasma modes can be analyzed. The effects of RHF fields are investigated with this method in the IR-T1 tokamak, and the results correspond with those of multichannel methods such as SVD and FFT.
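    The single-coil procedure, an autocorrelation function followed by a spectral density estimate, can be sketched via the Wiener-Khinchin relation (a toy version with a synthetic mode signal; all parameters are illustrative):

```python
import math, random

def autocorr(x, max_lag):
    """Biased, normalized sample autocorrelation up to max_lag."""
    m = sum(x) / len(x)
    d = [v - m for v in x]
    c0 = sum(v * v for v in d) / len(d)
    return [sum(d[t] * d[t + k] for t in range(len(d) - k)) / len(d) / c0
            for k in range(max_lag + 1)]

def spectral_density(acf, n_freq):
    """Cosine transform of the autocorrelation (Wiener-Khinchin)."""
    spec = []
    for j in range(n_freq):
        f = j / (2 * n_freq)           # frequency in cycles per sample
        spec.append(acf[0] + 2 * sum(acf[k] * math.cos(2 * math.pi * f * k)
                                     for k in range(1, len(acf))))
    return spec

# Noisy oscillation standing in for a single Mirnov coil signal
# (a 0.1 cycles/sample mode plus measurement noise).
rng = random.Random(9)
x = [math.sin(2 * math.pi * 0.1 * t) + 0.5 * rng.gauss(0, 1)
     for t in range(2048)]
spec = spectral_density(autocorr(x, 128), 64)
peak = max(range(len(spec)), key=spec.__getitem__)
freq = peak / (2 * len(spec))          # recovered mode frequency
```

    The dominant peak of the spectral density recovers the injected mode frequency from one channel alone, which is the point of the single-coil approach.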

  13. Time series prediction using artificial neural network for power stabilization

    SciTech Connect

    Puranik, G.; Philip, T.; Nail, B.

    1996-12-31

    Time series prediction has been applied to many business and scientific applications, prominent among them stock market prediction and weather forecasting. Here, the technique is applied to forecasting plasma torch voltages for power stabilization, using backpropagation in an artificial neural network. The Extended-Delta-Bar-Delta algorithm is used to improve the convergence rate of the network and to avoid local minima. Results on off-line data were quite promising for on-line use.

  14. Time Series of SST Anomalies Off Western Africa

    DTIC Science & Technology

    2014-09-09

    Fragmentary record (recovered from figure captions): a feature off South Africa extends west-northwest from the vicinity of the Cape; locations of surface drifting buoys over January-April 2014 are superimposed. Assimilative ocean forecasts around South Africa are evaluated against the real ocean, with accompanying estimates of forecast uncertainty. From the GHRSST XV Proceedings, Issue 1, Revision 0, 2-6 June 2014, Cape Town, SA.

  15. The complexity of carbon flux time series in Europe

    NASA Astrophysics Data System (ADS)

    Lange, Holger; Sippel, Sebastian

    2014-05-01

    Observed geophysical time series usually exhibit pronounced variability, part of which is process-related and deterministic ("signal"), another part is due to random fluctuations ("noise"). To discern these two sources for fluctuations is notoriously difficult using conventional analysis methods, unless sophisticated model assumptions are made. Here, we present an almost parameter-free innovative approach with the potential to draw a distinction between deterministic processes and structured noise, based on ordinal pattern statistics. The method determines one measure for the information content of time series (Shannon entropy) and two complexity measures, one based on global properties of the order pattern distribution (Jensen-Shannon complexity) and one based on local (derivative) properties (Fisher information or complexity). Each time series gets classified via its location in an entropy-complexity plane; using this representation, the method draws a qualitative distinction between different types of natural processes. As a case study, we investigate Gross Primary Productivity (GPP) and respiration which are key variables in terrestrial ecosystems quantifying carbon allocation and biomass growth of vegetation. Changes in GPP and ecosystem respiration can be induced by land use change, environmental disasters or extreme events, and changing climate. Numerous attempts to quantify these variables on larger spatial scales exist. Here, we investigate gridded time series at monthly resolution for the European continent either based on upscaled measurements ("observations") or modelled with two different process-based terrestrial ecosystem models ("simulations"). The complexity analysis is either visualized as maps of Europe showing "hotspots" of complexity for GPP and respiration, or used to provide a detailed observations-simulations and model-model comparison. Values found for information and complexity will be compared to known artificial reference processes
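    The ordinal-pattern statistics underlying this approach can be sketched via normalized permutation entropy (Bandt-Pompe patterns of order 3; the Jensen-Shannon and Fisher complexity measures build on the same pattern distribution):

```python
import math, random
from collections import Counter

def permutation_entropy(x, order=3):
    """Normalized Shannon entropy of ordinal patterns of length `order`."""
    patterns = Counter()
    for t in range(len(x) - order + 1):
        window = x[t:t + order]
        # rank pattern: argsort of the window values
        patterns[tuple(sorted(range(order), key=window.__getitem__))] += 1
    total = sum(patterns.values())
    h = -sum((c / total) * math.log(c / total) for c in patterns.values())
    return h / math.log(math.factorial(order))   # normalize to [0, 1]

rng = random.Random(4)
noise = [rng.gauss(0, 1) for _ in range(3000)]
wave = [math.sin(2 * math.pi * t / 40) for t in range(3000)]
h_noise = permutation_entropy(noise)   # near 1: all patterns equally likely
h_wave = permutation_entropy(wave)     # low: few ordinal patterns occur
```

    In the entropy-complexity plane described above, each pixel's GPP or respiration series would be placed according to this entropy together with a complexity measure computed from the same pattern distribution.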

  16. Time series regression model for infectious disease and weather.

    PubMed

    Imai, Chisato; Armstrong, Ben; Chalabi, Zaid; Mangtani, Punam; Hashizume, Masahiro

    2015-10-01

    Time series regression has been developed and long used to evaluate the short-term associations of air pollution and weather with mortality or morbidity of non-infectious diseases. The application of the regression approaches from this tradition to infectious diseases, however, is less well explored and raises some new issues. We discuss and present potential solutions for five issues often arising in such analyses: changes in immune population, strong autocorrelations, a wide range of plausible lag structures and association patterns, seasonality adjustments, and large overdispersion. The potential approaches are illustrated with datasets of cholera cases and rainfall from Bangladesh and influenza and temperature in Tokyo. Though this article focuses on the application of the traditional time series regression to infectious diseases and weather factors, we also briefly introduce alternative approaches, including mathematical modeling, wavelet analysis, and autoregressive integrated moving average (ARIMA) models. Modifications proposed to standard time series regression practice include using sums of past cases as proxies for the immune population, and using the logarithm of lagged disease counts to control autocorrelation due to true contagion, both of which are motivated from "susceptible-infectious-recovered" (SIR) models. The complexity of lag structures and association patterns can often be informed by biological mechanisms and explored by using distributed lag non-linear models. For overdispersed models, alternative distribution models such as quasi-Poisson and negative binomial should be considered. Time series regression can be used to investigate dependence of infectious diseases on weather, but may need modifying to allow for features specific to this context.
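    Exploring lag structures can be illustrated with a toy scan: simulate counts driven by temperature some steps earlier, then correlate log-counts against candidate lags (a far simpler device than the distributed lag non-linear models discussed; all parameters are invented):

```python
import math, random

def poisson(lam, rng):
    """Knuth's multiplicative Poisson sampler."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

rng = random.Random(6)
N, true_lag, max_lag = 1000, 2, 5
# Seasonal temperature plus day-to-day noise (hypothetical units).
temp = [10 * math.sin(2 * math.pi * t / 50) + 5 * rng.gauss(0, 1)
        for t in range(N)]
# Disease counts respond to temperature `true_lag` steps earlier.
cases = [poisson(math.exp(0.5 + 0.08 * temp[t - true_lag]), rng)
         for t in range(true_lag, N)]

log_cases = [math.log(c + 1) for c in cases[max_lag:]]
scores = {}
for lag in range(max_lag + 1):
    shifted = [temp[true_lag + max_lag + i - lag]
               for i in range(len(log_cases))]
    scores[lag] = pearson(log_cases, shifted)
best_lag = max(scores, key=scores.__getitem__)   # should recover true_lag
```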

  17. New Comprehensive System to Construct Speleothem Fabrics Time Series

    NASA Astrophysics Data System (ADS)

    Frisia, S.; Borsato, A.

    2014-12-01

    Speleothem fabrics record processes that influence the way geochemical proxy data are encoded in speleothems, yet there has been little advance in the use of fabrics as a complement to palaeo-proxy datasets since the fabric classification proposed by us in 2010. The systematic use of fabrics documentation in speleothem science has been limited by the absence of a comprehensive, numerical system that would allow constructing fabric time series comparable with the widely used geochemical time series. Documentation of speleothem fabrics is fundamental for a robust interpretation of speleothem time series where stable isotopes and trace elements are used as proxies, because fabrics highlight depositional as well as post-depositional processes whose understanding complements reconstructions based on geochemistry. Here we propose a logic system allowing transformation of microscope observations into numbers tied to acronyms that specify each fabric type and subtype. The rationale for ascribing progressive numbers to fabrics is based on the most up-to-date growth models. In this conceptual framework, the progression reflects hydrological conditions, bio-mediation and diagenesis. The lowest numbers are given to calcite fabrics formed at relatively constant drip rates: the columnar types (compact and open). Higher numbers are ascribed to columnar fabrics characterized by presence of impurities that cause elongation or lattice distortion (Elongated, Fascicular Optic and Radiaxial calcites). The sequence progresses with the dendritic fabrics, followed by micrite (M), which has been observed in association with microbial films. Microsparite (Ms) and mosaic calcite (Mc) have the highest numbers, being considered as diagenetic. Acronyms and suffixes are intended to become universally acknowledged. Thus, fabrics can be plotted vs. age to yield time series, where numbers are replaced by the acronyms. This will result in a visual representation of climate- or environmental

  18. Multifractal analysis of time series generated by discrete Ito equations

    SciTech Connect

    Telesca, Luciano; Czechowski, Zbigniew; Lovallo, Michele

    2015-06-15

    In this study, we show that discrete Ito equations with short-tail Gaussian marginal distribution function generate multifractal time series. The multifractality is due to the nonlinear correlations, which are hidden in Markov processes and are generated by the interrelation between the drift and the multiplicative stochastic forces in the Ito equation. A link between the range of the generalized Hurst exponents and the mean of the squares of all averaged net forces is suggested.

  19. A data-fitting procedure for chaotic time series

    SciTech Connect

    McDonough, J.M.; Mukerji, S.; Chung, S.

    1998-10-01

    In this paper the authors introduce data characterizations for fitting chaotic data to linear combinations of one-dimensional maps (say, of the unit interval) for use in subgrid-scale turbulence models. They test the efficacy of these characterizations on data generated by a chaotically-forced Burgers' equation and demonstrate very satisfactory results in terms of modeled time series, power spectra and delay maps.

  20. A method for detecting complex correlation in time series

    NASA Astrophysics Data System (ADS)

    Alfi, V.; Petri, A.; Pietronero, L.

    2007-06-01

    We propose a new method for detecting complex correlations in time series of limited size. The method is derived from Spitzer's identity and proves to work successfully on different model processes, including the ARCH process, in which pairs of variables are uncorrelated but the three-point correlation function is nonzero. The application to financial data allows one to discriminate between dependent and independent stock price returns where standard statistical analysis fails.
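    A simpler diagnostic in the same spirit, not the Spitzer-identity construction of the paper: for an ARCH(1) process, the pairwise autocorrelation of returns is near zero while the dependence shows up plainly in the squared returns:

```python
import random

def autocorr_lag1(x):
    """Lag-1 sample autocorrelation."""
    m = sum(x) / len(x)
    d = [v - m for v in x]
    return sum(a * b for a, b in zip(d, d[1:])) / sum(v * v for v in d)

# ARCH(1): x[t] = sigma[t]*eps[t], sigma[t]^2 = a0 + a1*x[t-1]^2.
rng = random.Random(8)
a0, a1 = 1.0, 0.5
x = [0.0]
for _ in range(5000):
    sigma = (a0 + a1 * x[-1] ** 2) ** 0.5
    x.append(sigma * rng.gauss(0, 1))

ac_returns = autocorr_lag1(x)                   # ~0: pairs uncorrelated
ac_squares = autocorr_lag1([v * v for v in x])  # clearly positive
```

    This is the hallmark of such processes: linear (two-point) statistics see nothing, so a higher-order correlation measure is needed to detect the dependence.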

  1. Geodetic Time Series: An Overview of UNAVCO Community Resources and Examples of Time Series Analysis Using GPS and Strainmeter Data

    NASA Astrophysics Data System (ADS)

    Phillips, D. A.; Meertens, C. M.; Hodgkinson, K. M.; Puskas, C. M.; Boler, F. M.; Snett, L.; Mattioli, G. S.

    2013-12-01

    We present an overview of time series data, tools and services available from UNAVCO, along with two specific and compelling examples of geodetic time series analysis. UNAVCO provides a diverse suite of geodetic data products and cyberinfrastructure services to support community research and education. The UNAVCO archive includes data from 2500+ continuous GPS stations, borehole geophysics instruments (strainmeters, seismometers, tiltmeters, pore pressure sensors), and long baseline laser strainmeters. These data span temporal scales from seconds to decades and provide global spatial coverage, with regionally focused networks including the EarthScope Plate Boundary Observatory (PBO) and COCONet. This rich, open access dataset is a tremendous resource that enables the exploration, identification and analysis of time-varying signals associated with crustal deformation, reference frame determinations, isostatic adjustments, atmospheric phenomena, hydrologic processes and more. UNAVCO provides a suite of time series exploration and analysis resources, including static plots, dynamic plotting tools, and data products and services designed to enhance time series analysis. The PBO GPS network allows for identification of ~1 mm level deformation signals. At some GPS stations, seasonal signals and longer-term trends in both the vertical and horizontal components can be dominated by effects of hydrological loading from natural and anthropogenic sources. Modeling of hydrologic deformation using GLDAS and a variety of land surface models (NOAH, MOSAIC, VIC and CLM) shows promise for independently modeling hydrologic effects and separating them from tectonic deformation as well as anthropogenic loading sources. A major challenge is to identify where loading dominates and GLDAS corrections can apply, and where pumping is the dominant signal and corrections are not possible without some other data.
In another arena, the PBO strainmeter network was designed to capture small short

  2. Learning restricted Boolean network model by time-series data

    PubMed Central

    2014-01-01

    Restricted Boolean networks are simplified Boolean networks that are required for either negative or positive regulations between genes. Higa et al. (BMC Proc 5:S5, 2011) proposed a three-rule algorithm to infer a restricted Boolean network from time-series data. However, the algorithm suffers from a major drawback, namely, it is very sensitive to noise. In this paper, we systematically analyze the regulatory relationships between genes based on the state switch of the target gene and propose an algorithm with which restricted Boolean networks may be inferred from time-series data. We compare the proposed algorithm with the three-rule algorithm and the best-fit algorithm based on both synthetic networks and a well-studied budding yeast cell cycle network. The performance of the algorithms is evaluated by three distance metrics: the normalized-edge Hamming distance μhame, the normalized Hamming distance of state transition μhamst, and the steady-state distribution distance μssd. Results show that the proposed algorithm outperforms the others according to both μhame and μhamst, whereas its performance according to μssd is intermediate between best-fit and the three-rule algorithms. Thus, our new algorithm is more appropriate for inferring interactions between genes from time-series data. PMID:25093019
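    The evaluation metrics above compare an inferred network against the true one entry by entry. A minimal sketch of the normalized-edge Hamming distance is below; the paper's exact normalization is not reproduced here, so this uses the common convention of dividing the number of disagreeing signed regulations by the number of gene pairs, and the 3-gene networks are hypothetical.

```python
import numpy as np

def normalized_edge_hamming(true_net, inferred_net):
    """Fraction of gene pairs whose regulation (+1 activation, -1
    inhibition, 0 absent) differs between the two networks.  (The exact
    normalization in the paper may differ; this is the common form.)"""
    return float(np.mean(np.asarray(true_net) != np.asarray(inferred_net)))

# Hypothetical 3-gene networks: the inference flips one regulation sign.
true_net = np.array([[0, 1, 0],
                     [-1, 0, 0],
                     [0, 1, 0]])
inferred = np.array([[0, 1, 0],
                     [-1, 0, 0],
                     [0, -1, 0]])
mu_ham_e = normalized_edge_hamming(true_net, inferred)  # 1/9
```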

  3. AE mapping of engines for spatially located time series

    NASA Astrophysics Data System (ADS)

    Nivesrangsan, P.; Steel, J. A.; Reuben, R. L.

    2005-09-01

    This paper represents the first step towards using multiple acoustic emission (AE) sensors to produce spatially located time series signals for a running engine. By this it is meant the decomposition of a multi-source signal by acquiring it with an array of sensors and using source location to reconstitute the individual time series attributable to some or all of these signals. Internal combustion engines are a group of monitoring targets which would benefit from such an approach. A series of experiments has been carried out where AE from a standard source has been mapped for a large number of source-sensor pairs on a small diesel engine and on various cast iron blocks of simple geometry. The wave propagation on a typical diesel engine cylinder head or block is complex because of the heterogeneity of the cast iron and the complex geometry with variations in wall-thickness, boundaries and discontinuities. The AE signal distortion for a range of source-sensor pairs has been estimated using time-frequency analysis, and using a reference sensor placed close to the source. At this stage, the emphasis has been on determining a suitable processing scheme to recover a measure of the signal energy, which depends only on the distance of the source and not upon the path. Tentative recommendations are made on a suitable approach to sensor positioning and signal processing with reference to a limited set of data acquired from the running engine.

  4. An Operational Geodatabase Service for Disseminating Raster Time Series Data

    NASA Astrophysics Data System (ADS)

    Asante, K. O.

    2009-12-01

    The volume of raster time series data available for earth science applications is rapidly expanding with improvements in spatial and temporal resolution of earth imaging from remote sensing missions. Current dissemination systems are typically designed for mission efficiency rather than supporting the various needs of diverse user communities. This promotes the building of multiple archives of the same dataset by end users who acquire the skills needed to establish and maintain their own data streams. Such processing often becomes a barrier to the adoption of new datasets. This presentation describes the development of an operational geodatabase service for the dissemination of raster time series. The service combines innovative geocoding schemes with traditional database and geospatial capabilities to facilitate direct access to raster time series. It includes functionality such as search and retrieval, data segmentation, trend analysis and direct integration into third-party applications using predefined data schemas. The service allows end users to interact with data using simple web-based tools without the need for complex data processing skills. A live implementation of the service is demonstrated using sample global environmental datasets.

  5. Toward automatic time-series forecasting using neural networks.

    PubMed

    Yan, Weizhong

    2012-07-01

    Over the past few decades, application of artificial neural networks (ANN) to time-series forecasting (TSF) has been growing rapidly due to several unique features of ANN models. However, to date, a consistent ANN performance over different studies has not been achieved. Many factors contribute to the inconsistency in the performance of neural network models. One such factor is that ANN modeling involves determining a large number of design parameters, and the current design practice is essentially heuristic and ad hoc, which does not exploit the full potential of neural networks. Systematic ANN modeling processes and strategies for TSF are, therefore, greatly needed. Motivated by this need, this paper attempts to develop an automatic ANN modeling scheme. It is based on the generalized regression neural network (GRNN), a special type of neural network. By taking advantage of several GRNN properties (i.e., a single design parameter and fast learning) and by incorporating several design strategies (e.g., fusing multiple GRNNs), we have been able to make the proposed modeling scheme effective for modeling large-scale business time series. The initial model was entered into the NN3 time-series competition. It was awarded the best prediction on the reduced dataset among approximately 60 different models submitted by scholars worldwide.
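    The "single design parameter" of the GRNN is its kernel bandwidth: a prediction is just a kernel-weighted average of training targets. A minimal sketch on a synthetic series (the lag embedding, sigma value, and sine-wave data are illustrative, not from the paper):

```python
import numpy as np

def grnn_predict(X_train, y_train, x_query, sigma):
    """Generalized regression neural network: a normalized kernel-weighted
    average of the training targets; sigma is the single design parameter."""
    d2 = np.sum((X_train - x_query) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return np.dot(w, y_train) / np.sum(w)

# Embed a toy series into (lag window -> next value) pairs and forecast
# the final point from all earlier windows.
series = np.sin(np.linspace(0, 8 * np.pi, 200))
lags = 4
X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
y = series[lags:]
forecast = grnn_predict(X[:-1], y[:-1], X[-1], sigma=0.2)
```

Because the only free parameter is sigma, "automatic" model design reduces to a one-dimensional search, which is what makes schemes like this attractive for large-scale forecasting.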

  6. Time series analysis for psychological research: examining and forecasting change.

    PubMed

    Jebb, Andrew T; Tay, Louis; Wang, Wei; Huang, Qiming

    2015-01-01

    Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials.
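    Among the simplest of the modeling techniques such tutorials survey is the autoregressive model. The paper's own tutorial is in R; the sketch below is a Python stand-in on a synthetic AR(1) series, not the paper's job-search data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(1) process x_t = phi * x_{t-1} + e_t.
phi_true = 0.7
x = np.zeros(500)
for t in range(1, len(x)):
    x[t] = phi_true * x[t - 1] + rng.normal()

# Least-squares (conditional maximum likelihood) estimate of phi
# from the lagged pairs (x_{t-1}, x_t).
phi_hat = np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])
forecast = phi_hat * x[-1]  # one-step-ahead forecast
```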

  7. Time series analysis for psychological research: examining and forecasting change

    PubMed Central

    Jebb, Andrew T.; Tay, Louis; Wang, Wei; Huang, Qiming

    2015-01-01

    Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials. PMID:26106341

  8. Genetic programming and serial processing for time series classification.

    PubMed

    Alfaro-Cid, Eva; Sharman, Ken; Esparcia-Alcázar, Anna I

    2014-01-01

    This work describes an approach devised by the authors for time series classification. In our approach genetic programming is used in combination with a serial processing of data, where the last output is the result of the classification. The use of genetic programming for classification, although still a field where more research is needed, is not new. However, the application of genetic programming to classification tasks is normally done by considering the input data as a feature vector. That is, to the best of our knowledge, there are no examples in the genetic programming literature of approaches where the time series data are processed serially and the last output is considered as the classification result. The serial processing approach presented here fills a gap in the existing literature. This approach was tested in three different problems. Two of them are real world problems whose data were gathered for online or conference competitions. As there are published results of these two problems this gives us the chance to compare the performance of our approach against top performing methods. The serial processing of data in combination with genetic programming obtained competitive results in both competitions, showing its potential for solving time series classification problems. The main advantage of our serial processing approach is that it can easily handle very large datasets.
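    The serial-processing idea can be sketched independently of the evolutionary search: a program is applied to each sample in turn, carrying state, and the final state is thresholded into a class label. In the sketch below the "program" is hand-written (a leaky accumulator of squared amplitude) as a stand-in for an expression tree that genetic programming would evolve; the data and threshold are illustrative.

```python
import numpy as np

def serial_classify(series, program, threshold):
    """Process the series sample by sample; the state after the last
    sample, thresholded, is the class label."""
    state = 0.0
    for x in series:
        state = program(state, x)
    return int(state > threshold)

# Hand-written stand-in for an evolved expression tree (GP would search
# for this form itself): a leaky accumulator of squared amplitude.
program = lambda state, x: 0.9 * state + x * x

quiet = np.zeros(100)
active = np.sin(np.linspace(0.0, 20.0, 100))
label_quiet = serial_classify(quiet, program, threshold=1.0)
label_active = serial_classify(active, program, threshold=1.0)
```

Because the series is consumed one sample at a time, memory use is constant in the series length, which is the "very large datasets" advantage the abstract claims.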

  9. Cross-sample entropy of foreign exchange time series

    NASA Astrophysics Data System (ADS)

    Liu, Li-Zhi; Qian, Xi-Yuan; Lu, Heng-Yao

    2010-11-01

    The correlation of foreign exchange rates in currency markets is investigated based on the empirical data of DKK/USD, NOK/USD, CAD/USD, JPY/USD, KRW/USD, SGD/USD, THB/USD and TWD/USD for a period from 1995 to 2002. The cross-SampEn (cross-sample entropy) method is used to compare the returns of every two exchange rate time series to assess their degree of asynchrony. The calculation method of the confidence interval of SampEn is extended and applied to cross-SampEn. The cross-SampEn and its confidence interval for every two of the exchange rate time series in the periods 1995-1998 (before the Asian currency crisis) and 1999-2002 (after the Asian currency crisis) are calculated. The results show that the cross-SampEn of every two of these exchange rates becomes higher after the Asian currency crisis, indicating a higher asynchrony between the exchange rates. Especially for Singapore, Thailand and Taiwan, the cross-SampEn values after the Asian currency crisis are significantly higher than those before the Asian currency crisis. Comparison with the correlation coefficient shows that cross-SampEn is superior in describing the correlation between time series.
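    Cross-SampEn itself is a template-matching statistic: count template pairs of length m that match within tolerance r, count again at length m + 1, and take -ln of the ratio. A brute-force sketch on synthetic data (the series, m = 2 and r = 0.2 in standard-deviation units are conventional illustrative choices, not the paper's data):

```python
import numpy as np

def cross_sampen(u, v, m=2, r=0.2):
    """Cross-sample entropy -ln(A/B): B counts template pairs of length m
    whose Chebyshev distance is <= r, A the same for length m + 1.
    Both series are standardized first, as is conventional."""
    u = (u - u.mean()) / u.std()
    v = (v - v.mean()) / v.std()
    n = min(len(u), len(v)) - m

    def matches(length):
        count = 0
        for i in range(n):
            for j in range(n):
                if np.max(np.abs(u[i:i + length] - v[j:j + length])) <= r:
                    count += 1
        return count

    return -np.log(matches(m + 1) / matches(m))

# Hypothetical illustration: two series sharing a common driver are far
# more synchronous (lower cross-SampEn) than two independent ones.
rng = np.random.default_rng(1)
common = rng.normal(size=200)
x = common + 0.1 * rng.normal(size=200)
y = common + 0.1 * rng.normal(size=200)
z = rng.normal(size=200)
sync_entropy = cross_sampen(x, y)    # low: high synchrony
async_entropy = cross_sampen(x, z)   # higher: asynchronous
```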

  10. Characterization of aggressive prostate cancer using ultrasound RF time series

    NASA Astrophysics Data System (ADS)

    Khojaste, Amir; Imani, Farhad; Moradi, Mehdi; Berman, David; Siemens, D. Robert; Sauerberi, Eric E.; Boag, Alexander H.; Abolmaesumi, Purang; Mousavi, Parvin

    2015-03-01

    Prostate cancer is the most prevalently diagnosed cancer and the second leading cause of cancer-related death in North American men. Several approaches have been proposed to augment detection of prostate cancer using different imaging modalities. Due to advantages of ultrasound imaging, these approaches have been the subject of several recent studies. This paper presents the results of a feasibility study on differentiating between lower and higher grade prostate cancer using ultrasound RF time series data. We also propose new spectral features of RF time series to highlight aggressive prostate cancer in small ROIs of size 1 mm × 1 mm in a cohort of 19 ex vivo specimens of human prostate tissue. In a leave-one-patient-out cross-validation strategy, an area under the accumulated ROC curve of 0.8 has been achieved with overall sensitivity and specificity of 81% and 80%, respectively. The current method shows promising results on differentiating between lower and higher grades of prostate cancer using ultrasound RF time series.

  11. The QuakeSim System for GPS Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Granat, R. A.; Gao, X.; Pierce, M.; Wang, J.

    2010-12-01

    We present a system for analysis of GPS time series data available to geosciences users through a web services / web portal interface. The system provides two time series analysis methods, one based on hidden Markov model (HMM) segmentation, the other based on covariance descriptor analysis (CDA). In addition, it provides data pre-processing routines that perform spike noise removal, linear de-trending, sum-of-sines removal, and common mode removal using probabilistic principal components analysis (PPCA). These components can be composed by the user into the desired series of processing steps for analysis through an intuitive graphical interface. The system is accessed through a web portal that allows both micro-scale (individual station) and macro-scale (whole network) exploration of data sets and analysis results via Google Maps. Users can focus in on or scroll through particular spatial or temporal time windows, or observe dynamic behavior by creating movies that display the system state. Analysis results can be exported to KML format for easy combination with other sources of data, such as fault databases and InSAR interferograms. GPS solutions for California member stations of the Plate Boundary Observatory from both the SOPAC and JPL GIPSY context groups are automatically imported into the system as that data becomes available. We show the results of the methods as applied to these data sets for an assortment of case studies, and show how the system can be used to analyze both seismic and aseismic signals.
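    Common-mode removal rests on the observation that a signal shared by every station in a network shows up as the dominant low-rank component of the stacked residuals. The sketch below uses classical PCA via SVD rather than the probabilistic PCA the system implements, and the network, sinusoidal common mode, and noise level are all synthetic stand-ins.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic network: 10 stations, 400 days, sharing one common-mode
# signal (here a plain sinusoid as a stand-in) plus station noise.
days, stations = 400, 10
common_mode = np.sin(2.0 * np.pi * np.arange(days) / 100.0)
data = common_mode[:, None] + 0.2 * rng.normal(size=(days, stations))

# Classical PCA via SVD; the QuakeSim pipeline uses probabilistic PCA,
# but the rank-1 common-mode idea is the same.
centered = data - data.mean(axis=0)
u_mat, s, vt = np.linalg.svd(centered, full_matrices=False)
mode = np.outer(u_mat[:, 0] * s[0], vt[0])  # rank-1 common-mode estimate
cleaned = centered - mode
reduction = cleaned.std() / centered.std()  # well below 1
```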

  12. Robust, automatic GPS station velocities and velocity time series

    NASA Astrophysics Data System (ADS)

    Blewitt, G.; Kreemer, C.; Hammond, W. C.

    2014-12-01

    Automation in GPS coordinate time series analysis makes results more objective and reproducible, but not necessarily as robust as the human eye to detect problems. Moreover, it is not a realistic option to manually scan our current load of >20,000 time series per day. This motivates us to find an automatic way to estimate station velocities that is robust to outliers, discontinuities, seasonality, and noise characteristics (e.g., heteroscedasticity). Here we present a non-parametric method based on the Theil-Sen estimator, defined as the median of velocities vij = (xj − xi)/(tj − ti) computed between all pairs (i, j). Theil-Sen estimators produce statistically identical solutions to ordinary least squares for normally distributed data, but they can tolerate up to 29% of data being problematic. To mitigate seasonality, our proposed estimator only uses pairs approximately separated by an integer number of years, (N − δt) < (tj − ti) < (N + δt), where δt is chosen to be small enough to capture seasonality, yet large enough to reduce random error. We fix N = 1 to maximally protect against discontinuities. In addition to estimating an overall velocity, we also use these pairs to estimate velocity time series. To test our methods, we process real data sets that have already been used with velocities published in the NA12 reference frame. Accuracy can be tested by the scatter of horizontal velocities in the North American plate interior, which is known to be stable to ~0.3 mm/yr. This presents new opportunities for time series interpretation. For example, the pattern of velocity variations at the interannual scale can help separate tectonic from hydrological processes. Without any step detection, velocity estimates prove to be robust for stations affected by the Mw7.2 2010 El Mayor-Cucapah earthquake, and velocity time series show a clear change after the earthquake, without any of the usual parametric constraints, such as relaxation of postseismic velocities to their preseismic values.
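    The annual-pair Theil-Sen estimator described above can be sketched in a few lines: restrict the pairwise velocities to separations near one year and take their median. The synthetic station below (2 mm/yr trend, 3 mm annual cycle, δt = 0.1 yr) is illustrative, not from the NA12 data set.

```python
import numpy as np

def annual_pair_velocity(t, x, dt_win=0.1):
    """Median of pairwise velocities (x_j - x_i)/(t_j - t_i) over pairs
    separated by roughly one year (N = 1), so that seasonal signals
    largely cancel.  t is in years; dt_win is the half-width delta-t."""
    vels = []
    for i in range(len(t)):
        sep = t - t[i]
        ok = np.abs(sep - 1.0) < dt_win
        vels.extend((x[ok] - x[i]) / sep[ok])
    return np.median(vels)

# Hypothetical station: 2 mm/yr trend + 3 mm annual cycle + noise,
# sampled weekly for 6 years.
rng = np.random.default_rng(3)
t = np.arange(0.0, 6.0, 1.0 / 52.0)
x = 2.0 * t + 3.0 * np.sin(2.0 * np.pi * t) + 0.5 * rng.normal(size=t.size)
v_hat = annual_pair_velocity(t, x)  # close to 2.0 mm/yr
```

Note how the seasonal term contributes almost nothing to each pairwise velocity, because both samples sit at nearly the same phase of the annual cycle; the median then absorbs outliers and the residual spread.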

  13. Connectionist Architectures for Time Series Prediction of Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Weigend, Andreas Sebastian

    We investigate the effectiveness of connectionist networks for predicting the future continuation of temporal sequences. The problem of overfitting, particularly serious for short records of noisy data, is addressed by the method of weight-elimination: a term penalizing network complexity is added to the usual cost function in back-propagation. We describe the dynamics of the procedure and clarify the meaning of the parameters involved. From a Bayesian perspective, the complexity term can be usefully interpreted as an assumption about prior distribution of the weights. We analyze three time series. On the benchmark sunspot series, the networks outperform traditional statistical approaches. We show that the network performance does not deteriorate when there are more input units than needed. In the second example, the notoriously noisy foreign exchange rates series, we pick one weekday and one currency (DM vs. US). Given exchange rate information up to and including a Monday, the task is to predict the rate for the following Tuesday. Weight-elimination manages to extract a significant part of the dynamics and makes the solution interpretable. In the third example, the networks predict the resource utilization of a chaotic computational ecosystem for hundreds of steps forward in time.
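    The weight-elimination penalty referred to above has a standard form in the weight-elimination literature: each weight contributes (w/w0)^2 / (1 + (w/w0)^2), so small weights are driven toward zero while large weights saturate at a near-constant cost. A sketch with illustrative w0 and lambda values:

```python
import numpy as np

def weight_elimination_penalty(w, w0=1.0, lam=0.1):
    """Complexity term lam * sum (w/w0)^2 / (1 + (w/w0)^2): small weights
    are pushed toward zero, large weights cost a near-constant amount."""
    r2 = (w / w0) ** 2
    return lam * np.sum(r2 / (1.0 + r2))

def penalty_gradient(w, w0=1.0, lam=0.1):
    """d(penalty)/dw, added to the usual back-propagation gradient."""
    r2 = (w / w0) ** 2
    return lam * 2.0 * w / (w0 ** 2 * (1.0 + r2) ** 2)

w = np.array([0.01, 0.5, 5.0])
pen = weight_elimination_penalty(w)
grad = penalty_gradient(w)  # marginal push is weakest on the largest weight
```

The saturation is the point: unlike plain weight decay, a weight that has proven its usefulness by growing large is no longer pushed back toward zero, which is what lets the network keep a few strong connections while eliminating the rest.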

  14. Unraveling the cause-effect relation between time series.

    PubMed

    Liang, X San

    2014-11-01

    Given two time series, can one faithfully tell, in a rigorous and quantitative way, the cause and effect between them? Based on a recently rigorized physical notion, namely, information flow, we solve an inverse problem and give this important and challenging question, which is of interest in a wide variety of disciplines, a positive answer. Here causality is measured by the time rate of information flowing from one series to the other. The resulting formula is tight in form, involving only commonly used statistics, namely, sample covariances; an immediate corollary is that causation implies correlation, but correlation does not imply causation. It has been validated with touchstone linear and nonlinear series, purportedly generated with one-way causality that evades the traditional approaches. It has also been applied successfully to the investigation of real-world problems; an example presented here is the cause-and-effect relation between the two climate modes, El Niño and the Indian Ocean Dipole (IOD), which have been linked to hazards in far-flung regions of the globe. In general, the two modes are mutually causal, but the causality is asymmetric: El Niño tends to stabilize IOD, while IOD functions to make El Niño more uncertain. To El Niño, the information flowing from IOD manifests itself as a propagation of uncertainty from the Indian Ocean.
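    The "tight in form" covariance-based estimator can be sketched directly. The formula below is the maximum-likelihood estimator commonly quoted from Liang (2014), written with C_ij for sample covariances and C_i,d1 for the covariance of series i with a forward-difference derivative of x1; the one-way coupled toy system (coefficients 0.6, 0.5, 0.4) is our illustration, not the paper's climate data.

```python
import numpy as np

def _cov(a, b):
    """Sample covariance (population normalization)."""
    return np.mean((a - a.mean()) * (b - b.mean()))

def liang_info_flow(x1, x2, dt=1.0):
    """Rate of information flow from x2 to x1, using the covariance-based
    estimator commonly quoted from Liang (2014):
    T2->1 = (C11*C12*C2,d1 - C12^2*C1,d1) / (C11^2*C22 - C11*C12^2)."""
    dx1 = (x1[1:] - x1[:-1]) / dt   # forward-difference derivative of x1
    a1, a2 = x1[:-1], x2[:-1]
    c11, c22, c12 = _cov(a1, a1), _cov(a2, a2), _cov(a1, a2)
    c1d1, c2d1 = _cov(a1, dx1), _cov(a2, dx1)
    return (c11 * c12 * c2d1 - c12 ** 2 * c1d1) / (c11 ** 2 * c22 - c11 * c12 ** 2)

# Hypothetical one-way coupled system: x2 drives x1, never the reverse.
rng = np.random.default_rng(4)
n = 20000
x1, x2 = np.zeros(n), np.zeros(n)
for t in range(1, n):
    x2[t] = 0.6 * x2[t - 1] + rng.normal()
    x1[t] = 0.5 * x1[t - 1] + 0.4 * x2[t - 1] + rng.normal()
flow_2_to_1 = liang_info_flow(x1, x2)  # clearly nonzero
flow_1_to_2 = liang_info_flow(x2, x1)  # near zero
```

On this toy system the two correlated series yield a strongly asymmetric flow, which is the abstract's point that causation implies correlation but not conversely.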

  15. Time-series animation techniques for visualizing urban growth

    USGS Publications Warehouse

    Acevedo, W.; Masuoka, P.

    1997-01-01

    Time-series animation is a visually intuitive way to display urban growth. Animations of landuse change for the Baltimore-Washington region were generated by showing a series of images one after the other in sequential order. Before creating an animation, various issues which will affect the appearance of the animation should be considered, including the number of original data frames to use, the optimal animation display speed, the number of intermediate frames to create between the known frames, and the output media on which the animations will be displayed. To create new frames between the known years of data, the change in each theme (i.e. urban development, water bodies, transportation routes) must be characterized and an algorithm developed to create the in-between frames. Example time-series animations were created using a temporal GIS database of the Baltimore-Washington area. Creating the animations involved generating raster images of the urban development, water bodies, and principal transportation routes; overlaying the raster images on a background image; and importing the frames to a movie file. Three-dimensional perspective animations were created by draping each image over digital elevation data prior to importing the frames to a movie file. © 1997 Elsevier Science Ltd.

  16. STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS

    SciTech Connect

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-02-20

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it, an improved and generalized version of Bayesian Blocks, that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.
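    The optimal segmentation is found by dynamic programming: for each data cell, the best partition ending there is the best final block plus the best partition of everything before it. The sketch below is a simplified event-data version with a fixed per-block prior (the paper calibrates this penalty against a false-positive rate rather than fixing it); the two-rate event stream is hypothetical.

```python
import numpy as np

def bayesian_blocks(t, prior=4.0):
    """Optimal piecewise-constant segmentation of event times t
    (dynamic-programming form of Bayesian Blocks for event data).
    `prior` is a fixed per-block penalty; returns change-point times."""
    t = np.sort(t)
    n = len(t)
    # Cell edges halfway between events.
    edges = np.concatenate(([t[0]], 0.5 * (t[1:] + t[:-1]), [t[-1]]))
    best = np.zeros(n)
    last = np.zeros(n, dtype=int)
    for r in range(n):
        # Fitness of every candidate final block spanning cells k..r:
        # N * (ln N - ln T) for N events in a block of length T.
        widths = edges[r + 1] - edges[:r + 1]
        counts = np.arange(r + 1, 0, -1, dtype=float)
        fit = counts * (np.log(counts) - np.log(widths)) - prior
        fit[1:] += best[:r]
        last[r] = int(np.argmax(fit))
        best[r] = fit[last[r]]
    # Backtrack through the optimal partition to recover change points.
    cps, r = [], n - 1
    while last[r] > 0:
        cps.append(edges[last[r]])
        r = last[r] - 1
    return np.array(cps[::-1])

# Hypothetical two-rate event stream: ~5 events/unit, then ~100 events/unit.
rng = np.random.default_rng(5)
events = np.concatenate([rng.uniform(0, 10, 50), rng.uniform(10, 12, 200)])
change_points = bayesian_blocks(events)  # expect a change point near t = 10
```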

  17. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it, an improved and generalized version of Bayesian Blocks [Scargle 1998], that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by [Arias-Castro, Donoho and Huo 2003]. In the spirit of Reproducible Research [Donoho et al. (2008)] all of the code and data necessary to reproduce all of the figures in this paper are included as auxiliary material.

  18. Time-Series Analysis of Supergranule Characteristics at Solar Minimum

    NASA Technical Reports Server (NTRS)

    Williams, Peter E.; Pesnell, W. Dean

    2013-01-01

    Sixty days of Doppler images from the Solar and Heliospheric Observatory (SOHO) / Michelson Doppler Imager (MDI) investigation during the 1996 and 2008 solar minima have been analyzed to show that certain supergranule characteristics (size, size range, and horizontal velocity) exhibit fluctuations of three to five days. Cross-correlating parameters showed a good, positive correlation between supergranulation size and size range, and a moderate, negative correlation between size range and velocity. The size and velocity do exhibit a moderate, negative correlation, but with a small time lag (less than 12 hours). Supergranule sizes during five days of co-temporal data from MDI and the Solar Dynamics Observatory (SDO) / Helioseismic Magnetic Imager (HMI) exhibit similar fluctuations with a high level of correlation between them. This verifies the solar origin of the fluctuations, which cannot be caused by instrumental artifacts according to these observations. Similar fluctuations are also observed in data simulations that model the evolution of the MDI Doppler pattern over a 60-day period. Correlations between the supergranule size and size range time-series derived from the simulated data are similar to those seen in MDI data. A simple toy-model using cumulative, uncorrelated exponential growth and decay patterns at random emergence times produces a time-series similar to the data simulations. The qualitative similarities between the simulated and the observed time-series suggest that the fluctuations arise from stochastic processes occurring within the solar convection zone. This behavior, propagating to surface manifestations of supergranulation, may assist our understanding of magnetic-field-line advection, evolution, and interaction.
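    The toy model mentioned above, cumulative, uncorrelated exponential growth and decay at random emergence times, can be sketched directly. The growth and decay time scales and feature count below are illustrative choices, not values fitted to the MDI data.

```python
import numpy as np

rng = np.random.default_rng(6)

hours = np.arange(0.0, 60.0 * 24.0, 1.0)  # a 60-day hourly series
signal = np.zeros_like(hours)

# Superpose uncorrelated features: each emerges at a random time, grows
# exponentially to a unit peak, then decays.  The ~1-day growth and
# ~1.5-day decay scales are illustrative, not fitted to MDI data.
for _ in range(300):
    t0 = rng.uniform(0.0, hours[-1])
    grow = rng.exponential(24.0)
    decay = rng.exponential(36.0)
    dt = hours - t0
    rise = np.exp(np.minimum(dt, 0.0) / grow) * (dt < 0)
    fall = np.exp(-np.maximum(dt, 0.0) / decay) * (dt >= 0)
    signal += rise + fall
```

Summing many such uncorrelated life cycles produces slow stochastic fluctuations on multi-day time scales, which is the qualitative behavior the abstract attributes to processes in the convection zone.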

  19. Inverse sequential procedures for the monitoring of time series

    NASA Technical Reports Server (NTRS)

    Radok, Uwe; Brown, Timothy J.

    1995-01-01

    When one or more new values are added to a developing time series, they change its descriptive parameters (mean, variance, trend, coherence). A 'change index (CI)' is developed as a quantitative indicator that the changed parameters remain compatible with the existing 'base' data. CI formulae are derived, in terms of normalized likelihood ratios, for small samples from Poisson, Gaussian, and Chi-Square distributions, and for regression coefficients measuring linear or exponential trends. A substantial parameter change creates a rapid or abrupt CI decrease which persists when the length of the base is changed. Except for a special Gaussian case, the CI has no simple explicit regions for tests of hypotheses. However, its design ensures that the series sampled need not conform strictly to the distribution form assumed for the parameter estimates. The use of the CI is illustrated with both constructed and observed data samples, processed with a Fortran code 'Sequitor'.

  20. Removing atmosphere loading effect from GPS time series

    NASA Astrophysics Data System (ADS)

    Tiampo, K. F.; Samadi Alinia, H.; Samsonov, S. V.; Gonzalez, P. J.

    2015-12-01

    The GPS time series of site position are contaminated by various sources of noise; in particular, the ionospheric and tropospheric path delays are significant [Gray et al., 2000; Meyer et al., 2006]. The GPS path delay in the ionosphere is largely dependent on the wave frequency, whereas the delay in the troposphere is dependent on the length of the travel path and therefore on site elevation. The approaches available for compensating the ionospheric path delay cannot be used to remove the tropospheric component. Quantifying the tropospheric delay plays an important role in determining the precision of the vertical GPS component, as tropospheric parameters over a large distance have very little correlation with each other. Several methods have been proposed for eliminating the tropospheric signal from GPS vertical time series. Here we utilize surface temperature fluctuations and seasonal variations in water vapour and air pressure data for various spatial and temporal profiles in order to more accurately remove the atmospheric path delay [Samsonov et al., 2014]. In this paper, we model the atmospheric path delay of vertical position time series by analyzing the signal in the frequency domain and study its dependency on topography in eastern Ontario for the time period from January 2008 to December 2012. The systematic dependency of the amplitude of the atmospheric path delay on height, together with its temporal variations, forms the basis of a new, physics-based model relating tropospheric effects to topography, which can help in determining the most accurate GPS positions.

  1. Monitoring Forest Regrowth Using a Multi-Platform Time Series

    NASA Technical Reports Server (NTRS)

    Sabol, Donald E., Jr.; Smith, Milton O.; Adams, John B.; Gillespie, Alan R.; Tucker, Compton J.

    1996-01-01

    Over the past 50 years, the forests of western Washington and Oregon have been extensively harvested for timber. This has resulted in a heterogeneous mosaic of remaining mature forests, clear-cuts, new plantations, and second-growth stands that now occur in areas that formerly were dominated by extensive old-growth forests and younger forests resulting from fire disturbance. Traditionally, determination of seral stage and stand condition have been made using aerial photography and spot field observations, a methodology that is not only time- and resource-intensive, but falls short of providing current information on a regional scale. These limitations may be solved, in part, through the use of multispectral images which can cover large areas at spatial resolutions in the order of tens of meters. The use of multiple images comprising a time series potentially can be used to monitor land use (e.g. cutting and replanting), and to observe natural processes such as regeneration, maturation and phenologic change. These processes are more likely to be spectrally observed in a time series composed of images taken during different seasons over a long period of time. Therefore, for many areas, it may be necessary to use a variety of images taken with different imaging systems. A common framework for interpretation is needed that reduces topographic, atmospheric, and instrumental effects, as well as differences in lighting geometry between images. The present state of remote-sensing technology in general use does not realize the full potential of the multispectral data in areas of high topographic relief. For example, the primary method for analyzing images of forested landscapes in the Northwest has been with statistical classifiers (e.g. parallelepiped, nearest-neighbor, maximum likelihood, etc.), often applied to uncalibrated multispectral data. Although this approach has produced useful information from individual images in some areas, landcover classes defined by these

  2. Loading effects in GPS vertical displacement time series

    NASA Astrophysics Data System (ADS)

    Memin, A.; Boy, J. P.; Santamaría-Gómez, A.; Watson, C.; Gravelle, M.; Tregoning, P.

    2015-12-01

    Surface deformations due to loading, which still lack a comprehensive representation, account for a significant part of the variability in geodetic time series. We assess the effects of loading in GPS vertical displacement time series in several frequency bands. We compare displacements derived from up-to-date loading models to two global sets of positioning time series, and investigate how the variability is reduced at interannual periods (> 2 months), intermediate periods (> 7 days) and over the whole spectrum (> 1 day). We also assess the impact of interannual loading on velocity estimates. We compute atmospheric loading effects using surface pressure fields from the ECMWF. We use the inverted barometer (IB) hypothesis, valid for periods exceeding a week, to describe the ocean response to the pressure forcing. We use general circulation ocean models (ECCO and GLORYS) to account for wind, heat and freshwater fluxes. We separately use the Toulouse Unstructured Grid Ocean model (TUGO-m), forced by air pressure and winds, to represent the dynamics of the ocean response at high frequencies. Continental water storage is described using the GLDAS/Noah and MERRA-land models. Non-hydrology loading reduces the variability of the observed vertical displacement differently according to the frequency band. Hydrology loading leads to a further reduction, mostly at annual periods. ECMWF+TUGO-m agrees better with vertical surface motion than the ECMWF+IB model at all frequencies. The interannual deformation is time-correlated at most locations. It is adequately described by a power-law process with a spectral index varying from -1.5 to -0.2. Depending on the power-law parameters, the predicted non-linear deformation due to mass loading variations leads to vertical velocity biases of up to 0.7 mm/yr when estimated from 5 years of continuous observations. The maximum velocity bias can reach 1 mm/yr in regions around the southern tropical band.
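
    The velocity-bias effect described above can be illustrated with a minimal sketch (synthetic numbers, not the study's data or loading models): an unmodeled interannual loading signal leaks into a least-squares velocity fit, and the bias shrinks as the observation span grows.

```python
import numpy as np

# Synthetic daily vertical displacement series (mm): a true linear
# velocity plus an unmodeled interannual loading signal (a 7-year
# cycle). Amplitudes are illustrative, not values from the study.
rng = np.random.default_rng(0)
t = np.arange(0, 15 * 365) / 365.25            # time in years
true_vel = 1.0                                  # mm/yr
interannual = 2.0 * np.sin(2 * np.pi * t / 7)   # loading signal, mm
obs = true_vel * t + interannual + rng.normal(0, 1.0, t.size)

def fit_velocity(t, y):
    """Least-squares velocity (slope) of a displacement series."""
    A = np.column_stack([np.ones_like(t), t])
    return np.linalg.lstsq(A, y, rcond=None)[0][1]

mask5 = t < 5.0
bias_5yr = abs(fit_velocity(t[mask5], obs[mask5]) - true_vel)
bias_15yr = abs(fit_velocity(t, obs) - true_vel)
print(f"velocity bias from  5 yr of data: {bias_5yr:.2f} mm/yr")
print(f"velocity bias from 15 yr of data: {bias_15yr:.2f} mm/yr")
```

    With these assumed amplitudes the 5-year estimate is biased by several tenths of a mm/yr, on the order of the values quoted in the abstract, while the 15-year estimate is much closer to the truth.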

  3. Homogenization of historical time series on a subdaily scale

    NASA Astrophysics Data System (ADS)

    Kocen, Renate; Brönnimann, Stefan; Breda, Leila; Spadin, Reto; Begert, Michael; Füllemann, Christine

    2010-05-01

    Homogeneous long-term climatological time series provide useful information on climate back to the preindustrial era. High temporal resolution of climate data is desirable to address trends and variability in the mean climate and in climatic extremes. For Switzerland, three long (~250 yr) historical time series (Basel, Geneva, Gr. St. Bernhard) that were hitherto available only in the form of monthly means have recently been digitized (in cooperation with MeteoSwiss) on a subdaily scale. The digitized time series contain subdaily data (2-5 measurements per day) on temperature, precipitation/snow height, pressure and humidity, as well as subdaily descriptions of wind direction, wind speed and cloud cover. Long-term climatological records often contain inhomogeneities due to non-climatic changes such as station relocations, changes in instrumentation and instrument exposure, changes in observing schedules/practices and environmental changes in the proximity of the observation site. These disturbances can distort or hide the true climatic signal and can seriously affect the correct assessment and analysis of climate trends, variability and climatic extremes. It is therefore crucial to detect and eliminate artificial shifts and trends, to the extent possible, in the climate data prior to their application. Detailed information on the station history and instruments (metadata) can be of fundamental importance in the homogenization process, as it supports the determination of the exact time of inhomogeneities and the interpretation of statistical test results. While similar methods can be used for the detection of inhomogeneities in subdaily and monthly mean data, quite different correction methods can be chosen. The wealth of information in a high temporal resolution allows more physics-based correction methods. 
For instance, a detected radiation error in temperature can be corrected with an error model that incorporates radiation and ventilation terms using
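
    As a minimal illustration of the detection step (not the homogenization procedure used by the authors), a simple mean-shift scan can locate a station-relocation jump in a synthetic series; operational work would use established tests such as SNHT together with the metadata described above.

```python
import numpy as np

def detect_shift(x):
    """Scan all candidate breakpoints and return the index maximizing
    a two-sample mean-shift statistic (a minimal stand-in for
    homogeneity tests such as SNHT)."""
    best_k, best_stat = None, 0.0
    for k in range(2, len(x) - 2):
        a, b = x[:k], x[k:]
        se = np.sqrt(a.var(ddof=1) / a.size + b.var(ddof=1) / b.size)
        stat = abs(a.mean() - b.mean()) / se
        if stat > best_stat:
            best_k, best_stat = k, stat
    return best_k, best_stat

# Synthetic monthly anomalies with a 0.8-unit jump at index 120,
# mimicking an undocumented station relocation.
rng = np.random.default_rng(1)
x = rng.normal(0.0, 0.5, 240)
x[120:] += 0.8
k, stat = detect_shift(x)
print(k, round(stat, 1))
```

    The detected breakpoint falls at or near the true jump; in practice the candidate date would then be checked against the station metadata before any correction is applied.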

  4. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    NASA Technical Reports Server (NTRS)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

    The Time Series Product Tool (TSPT) is software, developed in MATLAB, that creates and displays high signal-to-noise vegetation index imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. The TSPT uses MODIS metadata to remove and/or correct bad and suspect data. Bad-pixel removal, multiple-satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a wide range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions become widespread and devastating. 
The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify
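
    The fusion-and-compositing idea behind such tools can be sketched as follows; the cloud model, noise levels, and window length below are illustrative assumptions, not TSPT's actual processing.

```python
import numpy as np

rng = np.random.default_rng(2)
days = 64
truth = 0.6 + 0.2 * np.sin(2 * np.pi * np.arange(days) / days)  # greenness

def noisy_view():
    """One satellite's NDVI record: clouds depress the index on ~40% of days."""
    x = truth + rng.normal(0, 0.01, days)
    cloudy = rng.random(days) < 0.4
    x[cloudy] -= rng.uniform(0.2, 0.5, cloudy.sum())
    return x

terra, aqua = noisy_view(), noisy_view()     # two-satellite acquisition
fused = np.maximum(terra, aqua)              # per-day fusion (clouds lower NDVI)
composite = fused.reshape(-1, 8).max(axis=1) # 8-day maximum-value composite
err_single = np.abs(terra - truth).mean()
err_composite = np.abs(composite - truth.reshape(-1, 8).max(axis=1)).mean()
print(round(err_single, 3), round(err_composite, 3))
```

    Because clouds only lower the index, taking the maximum across satellites and across a compositing window recovers a much cleaner greenness signal than any single daily view.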

  5. A Framework and Algorithms for Multivariate Time Series Analytics (MTSA): Learning, Monitoring, and Recommendation

    ERIC Educational Resources Information Center

    Ngan, Chun-Kit

    2013-01-01

    Making decisions over multivariate time series is an important topic which has gained significant interest in the past decade. A time series is a sequence of data points which are measured and ordered over uniform time intervals. A multivariate time series is a set of multiple, related time series in a particular domain in which domain experts…

  6. Improvement in global forecast for chaotic time series

    NASA Astrophysics Data System (ADS)

    Alves, P. R. L.; Duarte, L. G. S.; da Mota, L. A. C. P.

    2016-10-01

    In the Polynomial Global Approach to Time Series Analysis, the most computationally costly step is finding the fitting polynomial. Here we present two routines that improve the forecasting. In the first, an algorithm that greatly improves this situation is introduced and implemented. The heart of this procedure is a specific routine that performs the mapping with great efficiency. In comparison with the similar procedure of the TimeS package developed by Carli et al. (2014), an enormous gain in efficiency and an increase in accuracy are obtained. Another development in this work is the establishment of a level of confidence in the global prediction, with a statistical test for evaluating whether the minimization performed is suitable or not. The other program presented in this article applies the Shapiro-Wilk test to check the normality of the distribution of errors and calculates the expected deviation. The development is applied to observed and simulated time series to illustrate the performance obtained.
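
    The normality check described for the second routine can be sketched with SciPy's implementation of the Shapiro-Wilk test; the residual series below are synthetic stand-ins, not the article's data.

```python
import numpy as np
from scipy import stats

# Two sets of forecast residuals: one Gaussian, one clearly skewed.
rng = np.random.default_rng(3)
resid_ok = rng.normal(0.0, 1.0, 200)
resid_skew = rng.exponential(1.0, 200) - 1.0

# Shapiro-Wilk: small p-value -> reject normality of the errors.
w_ok, p_ok = stats.shapiro(resid_ok)
w_skew, p_skew = stats.shapiro(resid_skew)
print(f"gaussian residuals: W={w_ok:.3f}, p={p_ok:.3f}")
print(f"skewed residuals:   W={w_skew:.3f}, p={p_skew:.3g}")

# Expected deviation of the (acceptably normal) forecast errors.
print(f"expected deviation: {resid_ok.std(ddof=1):.3f}")
```

    The skewed residuals are decisively rejected, signalling that the minimization is not suitable, while the Gaussian residuals pass and their sample standard deviation serves as the expected deviation.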

  7. Long-term time series prediction using OP-ELM.

    PubMed

    Grigorievskiy, Alexander; Miche, Yoan; Ventelä, Anne-Mari; Séverin, Eric; Lendasse, Amaury

    2014-03-01

    In this paper, an Optimally Pruned Extreme Learning Machine (OP-ELM) is applied to the problem of long-term time series prediction. Three known strategies for the long-term time series prediction i.e. Recursive, Direct and DirRec are considered in combination with OP-ELM and compared with a baseline linear least squares model and Least-Squares Support Vector Machines (LS-SVM). Among these three strategies DirRec is the most time consuming and its usage with nonlinear models like LS-SVM, where several hyperparameters need to be adjusted, leads to relatively heavy computations. It is shown that OP-ELM, being also a nonlinear model, allows reasonable computational time for the DirRec strategy. In all our experiments, except one, OP-ELM with DirRec strategy outperforms the linear model with any strategy. In contrast to the proposed algorithm, LS-SVM behaves unstably without variable selection. It is also shown that there is no superior strategy for OP-ELM: any of three can be the best. In addition, the prediction accuracy of an ensemble of OP-ELM is studied and it is shown that averaging predictions of the ensemble can improve the accuracy (Mean Square Error) dramatically.
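
    The Recursive and Direct strategies compared in the paper can be sketched with a plain least-squares autoregressive model standing in for OP-ELM; the model choice and data are illustrative assumptions.

```python
import numpy as np

def fit_ar(y, p):
    """Least-squares AR(p): predict y[t] from the previous p values."""
    X = np.column_stack([y[i:len(y) - p + i] for i in range(p)])
    coefs, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    return coefs

def forecast_recursive(y, p, h):
    """Recursive strategy: one one-step model, fed its own predictions."""
    c = fit_ar(y, p)
    hist = list(y[-p:])
    for _ in range(h):
        hist.append(float(np.dot(c, hist[-p:])))
    return np.array(hist[p:])

def forecast_direct(y, p, h):
    """Direct strategy: a separate model per horizon k predicts y[t+k]."""
    n = len(y)
    preds = []
    for k in range(1, h + 1):
        X = np.column_stack([y[i:n - p - k + 1 + i] for i in range(p)])
        c, *_ = np.linalg.lstsq(X, y[p + k - 1:], rcond=None)
        preds.append(float(np.dot(c, y[-p:])))
    return np.array(preds)

rng = np.random.default_rng(4)
t = np.arange(400)
y = np.sin(2 * np.pi * t / 25) + 0.05 * rng.normal(size=t.size)
h = 10
future = np.sin(2 * np.pi * np.arange(400, 400 + h) / 25)
err_rec = np.abs(forecast_recursive(y, 3, h) - future).mean()
err_dir = np.abs(forecast_direct(y, 3, h) - future).mean()
print(round(err_rec, 3), round(err_dir, 3))
```

    DirRec, the third strategy, combines the two: a new model per horizon whose inputs also include the predictions of the previous horizons, which is why its training cost grows with the horizon.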

  8. Synthesis of rainfall time series in a high temporal resolution

    NASA Astrophysics Data System (ADS)

    Callau Poduje, Ana Claudia; Haberlandt, Uwe

    2014-05-01

    In order to optimize the design and operation of urban drainage systems, long and continuous rain series in a high temporal resolution are essential. As the length of the rainfall records is often short, particularly the data available with the temporal and regional resolutions required for urban hydrology, it is necessary to find some numerical representation of the precipitation phenomenon to generate long synthetic rainfall series. An Alternating Renewal Model (ARM) is applied for this purpose, which consists of two structures: external and internal. The former is the sequence of wet and dry spells, described by their durations which are simulated stochastically. The internal structure is characterized by the amount of rain corresponding to each wet spell and its distribution within the spell. A multivariate frequency analysis is applied to analyze the internal structure of the wet spells and to generate synthetic events. The stochastic time series must reproduce the statistical characteristics of observed high resolution precipitation measurements used to generate them. The spatio-temporal interdependencies between stations are addressed by resampling the continuous synthetic series based on the Simulated Annealing (SA) procedure. The state of Lower-Saxony and surrounding areas, located in the north-west of Germany is used to develop the ARM. A total of 26 rainfall stations with high temporal resolution records, i.e. rainfall data every 5 minutes, are used to define the events, find the most suitable probability distributions, calibrate the corresponding parameters, simulate long synthetic series and evaluate the results. The length of the available data ranges from 10 to 20 years. The rainfall series involved in the different steps of calculation are compared using a rainfall-runoff model to simulate the runoff behavior in urban areas. The EPA Storm Water Management Model (SWMM) is applied for this evaluation. The results show a good representation of the
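
    The alternating renewal structure can be sketched as follows; the exponential durations, random within-spell distribution, and parameter values are placeholder assumptions, not the distributions calibrated in the study.

```python
import numpy as np

rng = np.random.default_rng(5)

def simulate_arm(n_spells, mean_dry_h=30.0, mean_wet_h=4.0,
                 mean_depth_mm=3.0, dt_min=5):
    """Alternating Renewal Model sketch. External structure:
    alternating dry/wet spell durations (exponential placeholders).
    Internal structure: a wet-spell depth distributed randomly over
    the 5-min steps of the spell."""
    steps_per_h = 60 // dt_min
    series = []
    for _ in range(n_spells):
        dry = max(1, int(rng.exponential(mean_dry_h * steps_per_h)))
        wet = max(1, int(rng.exponential(mean_wet_h * steps_per_h)))
        series.extend([0.0] * dry)            # external: dry spell
        depth = rng.exponential(mean_depth_mm)
        weights = rng.random(wet)             # internal: depth profile
        series.extend(depth * weights / weights.sum())
    return np.array(series)

x = simulate_arm(200)
wet_frac = (x > 0).mean()
print(x.size, round(wet_frac, 3))
```

    A real application would replace these placeholder distributions with ones fitted to the observed spell statistics, and then resample the synthetic series (e.g. by Simulated Annealing) to restore inter-station dependence.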

  9. Comparison of statistical models for analyzing wheat yield time series.

    PubMed

    Michel, Lucie; Makowski, David

    2013-01-01

    The world's population is predicted to exceed nine billion by 2050 and there is increasing concern about the capability of agriculture to feed such a large population. Foresight studies on food security are frequently based on crop yield trends estimated from yield time series provided by national and regional statistical agencies. Various types of statistical models have been proposed for the analysis of yield time series, but the predictive performances of these models have not yet been evaluated in detail. In this study, we present eight statistical models for analyzing yield time series and compare their ability to predict wheat yield at the national and regional scales, using data provided by the Food and Agriculture Organization of the United Nations and by the French Ministry of Agriculture. The Holt-Winters and dynamic linear models performed equally well, giving the most accurate predictions of wheat yield. However, dynamic linear models have two advantages over Holt-Winters models: they can be used to reconstruct past yield trends retrospectively and to analyze uncertainty. The results obtained with dynamic linear models indicated a stagnation of wheat yields in many countries, but the estimated rate of increase of wheat yield remained above 0.06 t ha⁻¹ year⁻¹ in several countries in Europe, Asia, Africa and America, and the estimated values were highly uncertain for several major wheat producing countries. The rate of yield increase differed considerably between French regions, suggesting that efforts to identify the main causes of yield stagnation should focus on a subnational scale.
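
    Holt's linear method, the non-seasonal core of the Holt-Winters smoothing used above, can be sketched in a few lines; the yield series below is invented for illustration, not taken from the FAO data.

```python
import numpy as np

def holt(y, alpha=0.5, beta=0.1):
    """Holt's linear (double exponential) smoothing: track a level and
    a trend, and return the one-step-ahead forecast."""
    level, trend = y[0], y[1] - y[0]
    for x in y[1:]:
        prev_level = level
        level = alpha * x + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level + trend

# Illustrative national yield series (t/ha): growth, then stagnation.
yields = np.array([4.0, 4.2, 4.5, 4.6, 4.9, 5.1, 5.2, 5.2, 5.3, 5.3])
forecast = holt(yields)
print(round(float(forecast), 2))
```

    Because the trend term adapts only slowly, the forecast still extrapolates some growth after the series flattens, which is one reason the paper prefers dynamic linear models for retrospective trend reconstruction and uncertainty analysis.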

  10. Mapping Brazilian savanna vegetation gradients with Landsat time series

    NASA Astrophysics Data System (ADS)

    Schwieder, Marcel; Leitão, Pedro J.; da Cunha Bustamante, Mercedes Maria; Ferreira, Laerte Guimarães; Rabe, Andreas; Hostert, Patrick

    2016-10-01

    Global change has tremendous impacts on savanna systems around the world. Processes related to climate change or agricultural expansion threaten the ecosystem's state, function and the services it provides. A prominent example is the Brazilian Cerrado, which has an extent of around 2 million km2 and features high biodiversity with many endemic species. It is characterized by landscape patterns from open grasslands to dense forests, defining a heterogeneous gradient in vegetation structure throughout the biome. While it is undisputed that the Cerrado provides a multitude of valuable ecosystem services, it is exposed to changes, e.g. through large-scale land conversion or climatic changes. Monitoring of the Cerrado is thus urgently needed to assess the state of the system as well as to analyze and further understand ecosystem responses and adaptations to ongoing changes. We therefore explored the potential of dense Landsat time series to derive phenological information for mapping vegetation gradients in the Cerrado. Frequent data gaps, e.g. due to cloud contamination, impose a serious challenge for such time series analyses. We synthetically filled data gaps using Radial Basis Function convolution filters to derive continuous pixel-wise temporal profiles capable of representing Land Surface Phenology (LSP). The derived phenological parameters revealed differences in the seasonal cycle between the main Cerrado physiognomies and could thus be used to calibrate a Support Vector Classification model to map their spatial distribution. Our results show that it is possible to map the main spatial patterns of the observed physiognomies based on their phenological differences, although inaccuracies occurred, especially between similar classes and in data-scarce areas. The outcome emphasizes the need for remote-sensing-based time series analyses at fine scales. Mapping heterogeneous ecosystems such as savannas requires spatial detail, as well as the ability to derive important
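
    The gap-filling idea can be sketched with a Gaussian radial-basis convolution over a synthetic, cloud-gapped series; the kernel width and noise levels are illustrative assumptions, not the study's settings.

```python
import numpy as np

def rbf_fill(t, y, sigma=16.0):
    """Fill gaps by Gaussian radial-basis convolution over time:
    each output value is a distance-weighted mean of the valid
    observations, yielding a continuous temporal profile."""
    valid = ~np.isnan(y)
    w = np.exp(-0.5 * ((t[:, None] - t[valid][None, :]) / sigma) ** 2)
    return (w @ y[valid]) / w.sum(axis=1)

doy = np.arange(0, 365, 8, dtype=float)             # 8-day sampling
truth = 0.5 + 0.25 * np.sin(2 * np.pi * doy / 365)  # smooth phenology
rng = np.random.default_rng(6)
obs = truth + rng.normal(0, 0.02, doy.size)
obs[rng.random(doy.size) < 0.4] = np.nan            # cloud gaps
filled = rbf_fill(doy, obs)
print(round(np.abs(filled - truth).mean(), 3))
```

    The reconstructed profile is continuous through the gaps, so phenological parameters (e.g. timing of green-up) can be extracted even where individual acquisitions are missing.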

  11. Comparison of Statistical Models for Analyzing Wheat Yield Time Series

    PubMed Central

    Michel, Lucie; Makowski, David

    2013-01-01

    The world's population is predicted to exceed nine billion by 2050 and there is increasing concern about the capability of agriculture to feed such a large population. Foresight studies on food security are frequently based on crop yield trends estimated from yield time series provided by national and regional statistical agencies. Various types of statistical models have been proposed for the analysis of yield time series, but the predictive performances of these models have not yet been evaluated in detail. In this study, we present eight statistical models for analyzing yield time series and compare their ability to predict wheat yield at the national and regional scales, using data provided by the Food and Agriculture Organization of the United Nations and by the French Ministry of Agriculture. The Holt-Winters and dynamic linear models performed equally well, giving the most accurate predictions of wheat yield. However, dynamic linear models have two advantages over Holt-Winters models: they can be used to reconstruct past yield trends retrospectively and to analyze uncertainty. The results obtained with dynamic linear models indicated a stagnation of wheat yields in many countries, but the estimated rate of increase of wheat yield remained above 0.06 t ha−1 year−1 in several countries in Europe, Asia, Africa and America, and the estimated values were highly uncertain for several major wheat producing countries. The rate of yield increase differed considerably between French regions, suggesting that efforts to identify the main causes of yield stagnation should focus on a subnational scale. PMID:24205280

  12. Aerosol Climate Time Series Evaluation In ESA Aerosol_cci

    NASA Astrophysics Data System (ADS)

    Popp, T.; de Leeuw, G.; Pinnock, S.

    2015-12-01

    Within the ESA Climate Change Initiative (CCI), the Aerosol_cci project (2010 - 2017) conducts intensive work to improve algorithms for the retrieval of aerosol information from European sensors. By the end of 2015, full-mission time series of two GCOS-required aerosol parameters had been completely validated and released: Aerosol Optical Depth (AOD) from the dual-view ATSR-2 / AATSR radiometers (3 algorithms, 1995 - 2012), and stratospheric extinction profiles from the star-occultation GOMOS spectrometer (2002 - 2012). Additionally, a 35-year multi-sensor time series of the qualitative Absorbing Aerosol Index (AAI), together with sensitivity information and an AAI model simulator, is available. Complementary aerosol properties requested by GCOS are in a "round robin" phase, in which various algorithms are inter-compared: fine-mode AOD, mineral dust AOD (from the thermal IASI spectrometer), absorption information and aerosol layer height. As a quasi-reference for validation in a few selected regions with sparse ground-based observations, the multi-pixel GRASP algorithm for the POLDER instrument is used. Validation of the first dataset versions (vs. AERONET, MAN) and inter-comparison with other satellite datasets (MODIS, MISR, SeaWiFS) proved the quality of the available datasets to be comparable to other satellite retrievals, and revealed needs for algorithm improvement (for example at higher AOD values) which were taken into account for a reprocessing. The datasets contain pixel-level uncertainty estimates, which are also validated. The paper will summarize and discuss the results of the major reprocessing and validation conducted in 2015. The focus will be on the ATSR, GOMOS and IASI datasets. Validation of the pixel-level uncertainties will be summarized and discussed, including unknown components and their potential usefulness and limitations. 
Opportunities for time series extension with successor instruments of the Sentinel family will be described and the complementarity of the different satellite aerosol products

  13. Financial Time Series Prediction Using Elman Recurrent Random Neural Networks

    PubMed Central

    Wang, Jie; Wang, Jun; Fang, Wen; Niu, Hongli

    2016-01-01

    In recent years, forecasting financial market dynamics has been a focus of economic research. To predict the price indices of stock markets, we developed an architecture that combines an Elman recurrent neural network with a stochastic time effective function. We analyze the proposed model with linear regression, complexity invariant distance (CID), and multiscale CID (MCID) analysis methods, and compare it with different models such as the backpropagation neural network (BPNN), the stochastic time effective neural network (STNN), and the Elman recurrent neural network (ERNN); the empirical results show that the proposed neural network displays the best performance among these neural networks in financial time series forecasting. Furthermore, the predictive power of the established model is tested on the SSE, TWSE, KOSPI, and Nikkei225 indices, and the corresponding statistical comparisons of these market indices are exhibited. The experimental results show that this approach performs well in predicting values of the stock market indices. PMID:27293423
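
    The Elman recurrence at the heart of such architectures can be sketched as a forward pass in NumPy; the weights below are random and untrained, and the stochastic time effective function of the paper is omitted.

```python
import numpy as np

def elman_forward(x, Wx, Wh, b, Wo, bo):
    """Elman network forward pass: the hidden state (context layer)
    feeds back into itself at the next time step."""
    h = np.zeros(Wh.shape[0])
    out = []
    for xt in x:
        h = np.tanh(Wx @ np.atleast_1d(xt) + Wh @ h + b)
        out.append(Wo @ h + bo)
    return np.array(out)

rng = np.random.default_rng(7)
hidden = 8
Wx = rng.normal(0, 0.5, (hidden, 1))   # input -> hidden
Wh = rng.normal(0, 0.5, (hidden, hidden))  # context recurrence
b = np.zeros(hidden)
Wo = rng.normal(0, 0.5, (1, hidden))   # hidden -> output
bo = np.zeros(1)

prices = np.sin(np.arange(30) / 5.0)   # stand-in for a price index
y = elman_forward(prices, Wx, Wh, b, Wo, bo)
print(y.shape)
```

    Training would fit Wx, Wh and Wo by backpropagation through time; the context recurrence Wh is what lets the network carry information across time steps.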

  14. Assemblage time series reveal biodiversity change but not systematic loss.

    PubMed

    Dornelas, Maria; Gotelli, Nicholas J; McGill, Brian; Shimadzu, Hideyasu; Moyes, Faye; Sievers, Caya; Magurran, Anne E

    2014-04-18

    The extent to which biodiversity change in local assemblages contributes to global biodiversity loss is poorly understood. We analyzed 100 time series from biomes across Earth to ask how diversity within assemblages is changing through time. We quantified patterns of temporal α diversity, measured as change in local diversity, and temporal β diversity, measured as change in community composition. Contrary to our expectations, we did not detect systematic loss of α diversity. However, community composition changed systematically through time, in excess of predictions from null models. Heterogeneous rates of environmental change, species range shifts associated with climate change, and biotic homogenization may explain the different patterns of temporal α and β diversity. Monitoring and understanding change in species composition should be a conservation priority.
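
    The two quantities analyzed, temporal α diversity and temporal β diversity, can be sketched with toy censuses (the species lists are invented for illustration).

```python
def alpha(assemblage):
    """Species richness of one census (temporal alpha diversity
    tracks how this changes through time)."""
    return len(assemblage)

def jaccard_beta(a, b):
    """Temporal beta diversity: 1 - Jaccard similarity between two
    censuses of the same assemblage."""
    a, b = set(a), set(b)
    return 1 - len(a & b) / len(a | b)

census_1990 = {"sp1", "sp2", "sp3", "sp4"}
census_2010 = {"sp3", "sp4", "sp5", "sp6"}

print(alpha(census_1990), alpha(census_2010))  # richness unchanged
print(jaccard_beta(census_1990, census_2010))  # but high turnover
```

    This is the paper's central pattern in miniature: α diversity (richness) is unchanged between the two censuses, yet β diversity reveals substantial turnover in composition.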

  15. Detecting and characterising ramp events in wind power time series

    NASA Astrophysics Data System (ADS)

    Gallego, Cristóbal; Cuerva, Álvaro; Costa, Alexandre

    2014-12-01

    In order to implement accurate models for wind power ramp forecasting, ramps need to be characterised beforehand. This issue has typically been addressed by performing binary ramp/non-ramp classifications based on ad hoc thresholds. However, recent works question this approach. This paper presents the ramp function, an innovative wavelet-based tool which detects and characterises ramp events in wind power time series. The underlying idea is to assess a continuous index related to the ramp intensity at each time step, obtained by considering large power output gradients evaluated over different time scales (up to typical ramp durations). The ramp function overcomes some of the drawbacks of the aforementioned binary classification and permits forecasters to easily reveal specific features of the ramp behaviour observed at a wind farm. As an example, the daily profiles of the ramp-up and ramp-down intensities are obtained for a wind farm located in Spain.
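
    A minimal sketch of such a continuous ramp index follows; it uses plain multi-scale gradients rather than the paper's wavelet formulation, and the power series is synthetic.

```python
import numpy as np

def ramp_function(p, scales=(1, 2, 4, 6)):
    """Continuous ramp-intensity index: at each time step take the
    largest absolute power gradient over several time scales
    (up to typical ramp durations), instead of a binary
    ramp/non-ramp threshold."""
    r = np.zeros(len(p))
    for s in scales:
        g = np.zeros(len(p))
        g[s:] = np.abs(p[s:] - p[:-s]) / s
        r = np.maximum(r, g)
    return r

# Synthetic normalized wind-farm output with one steep ramp-up.
p = np.concatenate([np.full(20, 0.2),
                    np.linspace(0.2, 0.9, 5),
                    np.full(20, 0.9)])
r = ramp_function(p)
print(int(r.argmax()), round(float(r.max()), 3))
```

    The index peaks during the ramp and stays near zero elsewhere, so ramp events and their intensity can be read off directly without committing to a fixed threshold.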

  16. Artificial neural networks applied to forecasting time series.

    PubMed

    Montaño Moreno, Juan J; Palmer Pol, Alfonso; Muñoz Gracia, Pilar

    2011-04-01

    This study offers a description and comparison of the main models of Artificial Neural Networks (ANN) which have proved to be useful in time series forecasting, together with a standard procedure for the practical application of ANN to this type of task. The Multilayer Perceptron (MLP), Radial Basis Function (RBF), Generalized Regression Neural Network (GRNN), and Recurrent Neural Network (RNN) models are analyzed. With this aim in mind, we use a time series made up of 244 time points. A comparative study establishes that the error made by each of the four neural network models analyzed is less than 10%. According to the interpretation criteria for this performance, it can be concluded that the neural network models show a close fit in terms of forecasting capacity. The model with the best performance is the RBF, followed by the RNN and MLP. The GRNN model has the worst performance. Finally, we analyze the advantages and limitations of ANN, possible solutions to these limitations, and provide an orientation towards future research.

  17. Monthly hail time series analysis related to agricultural insurance

    NASA Astrophysics Data System (ADS)

    Tarquis, Ana M.; Saa, Antonio; Gascó, Gabriel; Díaz, M. C.; Garcia Moreno, M. R.; Burgaz, F.

    2010-05-01

    Hail is one of the most important causes of crop insurance claims in Spain, accounting for more than 50% of the total insurance in cereal crops. The purpose of the present study is to analyse hail damage in cereals. Four provinces with the highest production values were chosen: Burgos and Zaragoza for wheat, and Cuenca and Valladolid for barley. The data available for studying the evolution and intensity of hail damage include an analysis of the correlation between the agricultural insurance ratios provided by ENESA and the number of annual hail days (from 1981 to 2007). In addition, several weather stations per province were selected for their longest and most complete records (from 1963 to 2007) to perform an analysis of monthly time series of the number of hail days (HD). The results show that the relation between the agricultural insurance ratio and the number of hail days is not clear. Several observations are discussed to explain these results, as well as whether it is possible to determine a change of tendency in the HD time series.

  18. Quantifying evolutionary dynamics from variant-frequency time series

    PubMed Central

    Khatri, Bhavin S.

    2016-01-01

    From Kimura’s neutral theory of protein evolution to Hubbell’s neutral theory of biodiversity, quantifying the relative importance of neutrality versus selection has long been a basic question in evolutionary biology and ecology. With deep sequencing technologies, this question is taking on a new form: given a time series of the frequency of different variants in a population, what is the likelihood that the observation has arisen due to selection or neutrality? To tackle the 2-variant case, we exploit Fisher’s angular transformation, which, despite being discovered by Ronald Fisher a century ago, has remained an intellectual curiosity. We show that, together with a heuristic approach, it provides a simple solution for the transition probability density at short times, including drift, selection and mutation. Our results show that under strong selection and sufficiently frequent sampling, these evolutionary parameters can be accurately determined from simulation data, and so they provide a theoretical basis for techniques to detect selection from variant or polymorphism frequency time series. PMID:27616332

  19. On the maximum-entropy/autoregressive modeling of time series

    NASA Technical Reports Server (NTRS)

    Chao, B. F.

    1984-01-01

    The autoregressive (AR) model of a random process is interpreted in the light of Prony's relation, which relates a complex-conjugate pair of poles of the AR process in the z-plane (the z domain) to the complex frequency of one complex harmonic function in the time domain. Thus the AR model of a time series models the series as a linear combination of complex harmonic functions, which include pure sinusoids and real exponentials as special cases. An AR model is completely determined by its z-domain pole configuration. The maximum-entropy/autoregressive (ME/AR) spectrum, defined on the unit circle of the z-plane (the frequency domain), is merely a convenient but ambiguous visual representation. It is asserted that the position and shape of a spectral peak are determined by the corresponding complex frequency, and that the height of the spectral peak contains little information about the complex amplitude of the complex harmonic functions.
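
    Prony's relation for a conjugate pole pair can be verified numerically; the sketch below constructs an AR(2) process with an assumed frequency and damping and recovers them from its pole configuration.

```python
import numpy as np

# Prony's relation: a complex-conjugate pole pair z = r*exp(+/- i*omega)
# of an AR process corresponds to one damped sinusoid in the time domain.
# Build an AR(2) whose poles encode a 0.1 cycles/sample oscillation.
r, omega = 0.95, 2 * np.pi * 0.1
a1 = 2 * r * np.cos(omega)   # y[t] = a1*y[t-1] + a2*y[t-2] + noise
a2 = -r**2

poles = np.roots([1, -a1, -a2])     # roots of z^2 - a1*z - a2
z = poles[np.argmax(poles.imag)]    # take the upper-half-plane pole
freq = np.angle(z) / (2 * np.pi)    # cycles per sample
damping = np.abs(z)                 # per-sample decay of the sinusoid
print(round(freq, 3), round(damping, 3))  # prints: 0.1 0.95
```

    The pole angle gives the spectral peak position and the pole modulus its sharpness, illustrating the assertion that the pole configuration, not the peak height, carries the frequency information.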

  20. Quantifying evolutionary dynamics from variant-frequency time series

    NASA Astrophysics Data System (ADS)

    Khatri, Bhavin S.

    2016-09-01

    From Kimura’s neutral theory of protein evolution to Hubbell’s neutral theory of biodiversity, quantifying the relative importance of neutrality versus selection has long been a basic question in evolutionary biology and ecology. With deep sequencing technologies, this question is taking on a new form: given a time series of the frequency of different variants in a population, what is the likelihood that the observation has arisen due to selection or neutrality? To tackle the 2-variant case, we exploit Fisher’s angular transformation, which, despite being discovered by Ronald Fisher a century ago, has remained an intellectual curiosity. We show that, together with a heuristic approach, it provides a simple solution for the transition probability density at short times, including drift, selection and mutation. Our results show that under strong selection and sufficiently frequent sampling, these evolutionary parameters can be accurately determined from simulation data, and so they provide a theoretical basis for techniques to detect selection from variant or polymorphism frequency time series.

  1. Albedo Pattern Recognition and Time-Series Analyses in Malaysia

    NASA Astrophysics Data System (ADS)

    Salleh, S. A.; Abd Latif, Z.; Mohd, W. M. N. Wan; Chan, A.

    2012-07-01

    Pattern recognition and time-series analyses enable one to evaluate and generate predictions of specific phenomena. Albedo pattern and time-series analyses are particularly useful for climate-condition monitoring. This study investigates albedo pattern changes in Malaysia; the recognized patterns and changes will be useful for a variety of environmental and climate monitoring studies, such as carbon budgeting and aerosol mapping. Ten years (2000-2009) of MODIS satellite images were used for the analyses and interpretation. The images were processed using ERDAS Imagine remote sensing software, ArcGIS 9.3, the 6S code for atmospheric calibration and several MODIS tools (MRT, HDF2GIS, Albedo tools). Several methods for time-series analysis were explored; this paper demonstrates trend and seasonal time-series analyses using the HDF-format MODIS MCD43A3 albedo land product. The results revealed significant changes in albedo percentages over the past 10 years, and patterns related to Malaysia's nebulosity index (NI) and aerosol optical depth (AOD). A noticeable trend can be identified in the maximum and minimum albedo values; the rises and falls of the line graph show a similar trend in the daily observations, with differences in the magnitude of the rises and falls. It can thus be concluded that the temporal behaviour of land surface albedo in Malaysia is uniform and consistent with the local monsoons. 
However, although the average albedo shows a linear trend with the nebulosity index, the pattern changes of albedo with respect to the nebulosity index indicate that external factors influence the albedo values, as the plotted sky conditions and their diffusion do not show a uniform trend over the years, especially when 5-year intervals are examined; 2000 shows a high negative linear

  2. FROG: Time Series Analysis for the Web Service Era

    NASA Astrophysics Data System (ADS)

    Allan, A.

    2005-12-01

    The FROG application is part of the next generation Starlink{http://www.starlink.ac.uk} software work (Draper et al. 2005) and is released under the GNU Public License{http://www.gnu.org/copyleft/gpl.html} (GPL). Written in Java, it has been designed for the Web and Grid Service era as an extensible, pluggable tool for time series analysis and display. With an integrated SOAP server, the package's functionality is exposed for use in the user's own code, and can be invoked remotely over the Grid as part of the Virtual Observatory (VO).

  3. A Mixed Exponential Time Series Model. NMEARMA(p,q).

    DTIC Science & Technology

    1980-03-01

    AD-A085 316, Naval Postgraduate School, Monterey, CA (F/G 12/1). A Mixed Exponential Time Series Model, NMEARMA(p,q), March 1980, by A. J. Lawrance (University of Birmingham, Birmingham, England) and P. A. W. Lewis (Naval Postgraduate School, Monterey, California, USA).

  4. Ensemble Deep Learning for Biomedical Time Series Classification

    PubMed Central

    2016-01-01

    Ensemble learning has been shown, in both theory and practice, to improve generalization ability effectively. In this paper, we first briefly outline the current status of research on ensemble learning. Then, a new deep neural network-based ensemble method that integrates filtering views, local views, distorted views, explicit training, implicit training, subview prediction, and Simple Average is proposed for biomedical time series classification. Finally, we validate its effectiveness on the Chinese Cardiovascular Disease Database, which contains a large number of electrocardiogram recordings. The experimental results show that the proposed method has certain advantages compared to some well-known ensemble methods, such as Bagging and AdaBoost. PMID:27725828

  5. Modified correlation entropy estimation for a noisy chaotic time series.

    PubMed

    Jayawardena, A W; Xu, Pengcheng; Li, W K

    2010-06-01

    A method of estimating the Kolmogorov-Sinai (KS) entropy, herein referred to as the modified correlation entropy, is presented. The method can be applied to both noise-free and noisy chaotic time series. It has been applied to some clean and noisy data sets and the numerical results show that the modified correlation entropy is closer to the KS entropy of the nonlinear system calculated by the Lyapunov spectrum than the general correlation entropy. Moreover, the modified correlation entropy is more robust to noise than the correlation entropy.

  6. Multifractal Time Series Analysis Based on Detrended Fluctuation Analysis

    NASA Astrophysics Data System (ADS)

    Kantelhardt, Jan; Stanley, H. Eugene; Zschiegner, Stephan; Bunde, Armin; Koscielny-Bunde, Eva; Havlin, Shlomo

    2002-03-01

    In order to develop an easily applicable method for the multifractal characterization of non-stationary time series, we generalize the detrended fluctuation analysis (DFA), which is a well-established method for the determination of the monofractal scaling properties and the detection of long-range correlations. We relate the new multifractal DFA method to the standard partition function-based multifractal formalism, and compare it to the wavelet transform modulus maxima (WTMM) method, which is a well-established but more difficult procedure for this purpose. We employ the multifractal DFA method to determine whether the heart rhythm during different sleep stages is characterized by different multifractal properties.
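As an illustration of the monofractal DFA that the abstract generalizes, here is a minimal numpy sketch (not the authors' code): it computes the fluctuation function F(s) over non-overlapping windows and estimates the scaling exponent from the log-log slope.

```python
import numpy as np

def dfa(x, scales):
    """Monofractal DFA: fluctuation function F(s) from linear detrending
    of the integrated profile in non-overlapping windows of size s."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    F = []
    for s in scales:
        n = len(y) // s                        # non-overlapping windows
        sq = []
        for i in range(n):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            sq.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(sq)))
    return np.array(F)

# For white noise the scaling exponent alpha should be close to 0.5.
rng = np.random.default_rng(0)
x = rng.standard_normal(4096)
scales = [16, 32, 64, 128, 256]
F = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
```

The multifractal generalization described above replaces the mean-square average over windows with q-th order moments, yielding a whole spectrum of exponents h(q) instead of the single alpha.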

  7. On the Prediction of α-Stable Time Series

    NASA Astrophysics Data System (ADS)

    Mohammadi, Mohammad; Mohammadpour, Adel

    2016-07-01

    This paper addresses the point prediction of α-stable time series. Our key idea is to define a new Hilbert space that contains α-stable processes. Then, we apply the advantages of Hilbert space theory to find the best linear prediction. We show how to use the presented predictor in practice for α-stable linear processes. The implementation of the presented method is easier than that of the minimum dispersion method. We demonstrate the suitability of the presented method through an empirical study on predicting the natural logarithms of the trading volumes of the S&P 500 market.

  8. Time series analysis using semiparametric regression on oil palm production

    NASA Astrophysics Data System (ADS)

    Yundari, Pasaribu, U. S.; Mukhaiyar, U.

    2016-04-01

    This paper presents a semiparametric kernel regression method, which has shown flexibility and ease of mathematical calculation, especially in estimating density and regression functions. The kernel function is continuous and produces a smooth estimate. The classical kernel density estimator is constructed by completely nonparametric analysis and works reasonably well for all forms of function. Here, we discuss parameter estimation in time series analysis. First, we assume that the parameters exist; then we use nonparametric estimation, making the overall approach semiparametric. The optimal bandwidth is selected by minimizing an approximation of the Mean Integrated Squared Error (MISE).
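The nonparametric half of such a semiparametric fit is typically a kernel smoother. Below is a minimal Nadaraya-Watson kernel regression sketch with a Gaussian kernel (an illustration, not the paper's implementation; the bandwidth h here is fixed by hand rather than chosen by MISE minimization):

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, h):
    """Nadaraya-Watson regression: a kernel-weighted local average of
    y_train, with a Gaussian kernel of bandwidth h."""
    d = x_eval[:, None] - x_train[None, :]
    w = np.exp(-0.5 * (d / h) ** 2)            # Gaussian kernel weights
    return (w @ y_train) / w.sum(axis=1)

# Smooth a noisy sine; the estimate should track the true curve.
rng = np.random.default_rng(4)
x = np.sort(rng.uniform(0, 2 * np.pi, 300))
y = np.sin(x) + 0.1 * rng.standard_normal(300)
x_eval = np.array([np.pi / 2, np.pi, 3 * np.pi / 2])
y_hat = nadaraya_watson(x, y, x_eval, h=0.3)
```

Larger h gives a smoother but more biased estimate; smaller h tracks the data more closely but is noisier, which is exactly the trade-off the MISE-based bandwidth selection balances.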

  9. Surrogate-assisted network analysis of nonlinear time series

    NASA Astrophysics Data System (ADS)

    Laut, Ingo; Räth, Christoph

    2016-10-01

    The performance of recurrence networks and symbolic networks to detect weak nonlinearities in time series is compared to the nonlinear prediction error. For the synthetic data of the Lorenz system, the network measures show a comparable performance. In the case of relatively short and noisy real-world data from active galactic nuclei, the nonlinear prediction error yields more robust results than the network measures. The tests are based on surrogate data sets. The correlations in the Fourier phases of data sets from some surrogate generating algorithms are also examined. The phase correlations are shown to have an impact on the performance of the tests for nonlinearity.

  10. Best linear forecast of volatility in financial time series

    NASA Astrophysics Data System (ADS)

    Krivoruchenko, M. I.

    2004-09-01

    The autocorrelation function of volatility in financial time series is fitted well by a superposition of several exponents. This case admits an explicit analytical solution of the problem of constructing the best linear forecast of a stationary stochastic process. We describe and apply the proposed analytical method for forecasting volatility. The leverage effect and volatility clustering are taken into account. Parameters of the predictor function are determined numerically for the Dow Jones 30 Industrial Average. Connection of the proposed method to the popular autoregressive conditional heteroskedasticity models is discussed.
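The best linear forecast of a stationary process reduces to solving the normal (Yule-Walker-type) equations built from the autocovariance function. A minimal sketch follows (an illustration of that linear-algebra step only, not the paper's volatility predictor, which also accounts for leverage and clustering): with a single-exponent autocorrelation, the solution collapses onto the last observation, as expected for an AR(1) process.

```python
import numpy as np

def best_linear_weights(acov, p):
    """Weights of the best linear one-step predictor
    x[t] ~ sum_k w[k] * x[t-1-k], obtained from autocovariances
    acov[0..p] via the normal equations R w = r."""
    R = np.array([[acov[abs(i - j)] for j in range(p)] for i in range(p)])
    r = np.array(acov[1:p + 1])
    return np.linalg.solve(R, r)

# A single-exponent autocorrelation rho**k is the AR(1) case, where the
# optimal predictor uses only the most recent value, with weight rho.
rho = 0.6
acov = [rho ** k for k in range(6)]
w = best_linear_weights(acov, 3)
```

With a superposition of several exponents, as fitted in the paper, the same solve yields nonzero weights on several past values instead of just one.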

  11. Disease management with ARIMA model in time series.

    PubMed

    Sato, Renato Cesar

    2013-01-01

    The evaluation of infectious and noninfectious disease management can be done through the use of a time series analysis. In this study, we expect to measure the results and prevent intervention effects on the disease. Clinical studies have benefited from the use of these techniques, particularly for the wide applicability of the ARIMA model. This study briefly presents the process of using the ARIMA model. This analytical tool offers a great contribution for researchers and healthcare managers in the evaluation of healthcare interventions in specific populations.

  12. Kernel canonical-correlation Granger causality for multiple time series.

    PubMed

    Wu, Guorong; Duan, Xujun; Liao, Wei; Gao, Qing; Chen, Huafu

    2011-04-01

    Canonical-correlation analysis as a multivariate statistical technique has been applied to multivariate Granger causality analysis to infer information flow in complex systems. It shows unique appeal and great superiority over the traditional vector autoregressive method, due to the simplified procedure that detects causal interaction between multiple time series, and the avoidance of potential model estimation problems. However, it is limited to the linear case. Here, we extend the framework of canonical correlation to include the estimation of multivariate nonlinear Granger causality for drawing inference about directed interaction. Its feasibility and effectiveness are verified on simulated data.
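For intuition, plain linear bivariate Granger causality compares how much adding x's past reduces the residual variance of a regression of y on its own past. A minimal numpy sketch of that baseline follows (the paper's kernel canonical-correlation method generalizes this to the multivariate nonlinear case):

```python
import numpy as np

def granger_ratio(x, y, p=2):
    """Linear Granger causality x -> y: residual variance of y regressed
    on its own past, divided by that with x's past added. Ratios well
    above 1 suggest x helps predict y."""
    T = len(y)
    own = np.array([y[t - p:t] for t in range(p, T)])
    full = np.array([np.concatenate([y[t - p:t], x[t - p:t]]) for t in range(p, T)])
    targ = np.array(y[p:])
    res_own = targ - own @ np.linalg.lstsq(own, targ, rcond=None)[0]
    res_full = targ - full @ np.linalg.lstsq(full, targ, rcond=None)[0]
    return np.var(res_own) / np.var(res_full)

# y is driven by lagged x, so causality should show in one direction only.
rng = np.random.default_rng(1)
x = rng.standard_normal(2000)
y = np.zeros(2000)
for t in range(1, 2000):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()
r_xy = granger_ratio(x, y)   # should be well above 1
r_yx = granger_ratio(y, x)   # should be near 1
```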

  13. Ensemble Deep Learning for Biomedical Time Series Classification.

    PubMed

    Jin, Lin-Peng; Dong, Jun

    2016-01-01

    Ensemble learning has been shown, in both theory and practice, to improve generalization ability effectively. In this paper, we first briefly outline the current status of research on ensemble learning. Then, a new deep neural network-based ensemble method that integrates filtering views, local views, distorted views, explicit training, implicit training, subview prediction, and Simple Average is proposed for biomedical time series classification. Finally, we validate its effectiveness on the Chinese Cardiovascular Disease Database, which contains a large number of electrocardiogram recordings. The experimental results show that the proposed method has certain advantages compared to some well-known ensemble methods, such as Bagging and AdaBoost.

  14. Chaotic time series analysis in economics: Balance and perspectives

    SciTech Connect

    Faggini, Marisa

    2014-12-15

    The aim of the paper is not to review the large body of work concerning nonlinear time series analysis in economics, about which much has been written, but rather to focus on the new techniques developed to detect chaotic behaviours in economic data. More specifically, our attention will be devoted to reviewing some of these techniques and their application to economic and financial data in order to understand why chaos theory, after a period of growing interest, appears now not to be such an interesting and promising research area.

  15. Exploring the Dynamics of Personality Change with Time Series Models

    NASA Astrophysics Data System (ADS)

    Keller, Ferdinand; Storch, Maja; Bigler, Susanne

    This paper aims to show possible refinements in time series methods for evaluating the dynamics of personality change. For the study, 13 students attended a course of personality development based on Jungian theory. The course teaches how to contact one's personal self. For four months the students rated their mood, activity, tension, and feeling of inner control on visual analogue scales twice a day. Standard examination with ARIMA models shows that most subjects exhibit a low to moderate correlation with the previous time point. About one third of the cases have an additional lag-2 relation. Daytime effects are rare, and the residual tests for the ARIMA models suggest that these linear models are sufficient for describing most of the time series. To evaluate the expected smooth transformations in personality, the data from one subject are analysed and the following hypotheses are tested empirically via the time variation of parameters in successive time windows: (1) increasing stability in mood and in the feeling of inner control, via decreasing standard deviations; (2) higher intrapsychic coherence, via increasing autocorrelation coefficients; (3) dissociation between mood and the feeling of inner control, via decreasing cross-correlation coefficients between these two dimensions. Application of several statistical tests shows that hypothesis 1 can be accepted, while the other two hypotheses cannot be confirmed. Some methodological difficulties emerge when the methods are applied to `real' data, and some limitations are found in the statistical testing of time-varying parameters. Overall, though, the proposed methods for examining emotional variability have proven valuable and promising for further research.

  16. Coastal Atmosphere and Sea Time Series (CoASTS)

    NASA Technical Reports Server (NTRS)

    Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Berthon, Jean-Francoise; Zibordi, Giuseppe; Doyle, John P.; Grossi, Stefania; vanderLinde, Dirk; Targa, Cristina; McClain, Charles R. (Technical Monitor)

    2002-01-01

    In this document, the first three years of a time series of bio-optical marine and atmospheric measurements are presented and analyzed. These measurements were performed from an oceanographic tower in the northern Adriatic Sea within the framework of the Coastal Atmosphere and Sea Time Series (CoASTS) project, an ocean color calibration and validation activity. The data set collected includes spectral measurements of the in-water apparent (diffuse attenuation coefficient, reflectance, Q-factor, etc.) and inherent (absorption and scattering coefficients) optical properties, as well as the concentrations of the main optical components (pigment and suspended matter concentrations). Clear seasonal patterns are exhibited by the marine quantities on which an appreciable short-term variability (on the order of a half day to one day) is superimposed. This short-term variability is well correlated with the changes in salinity at the surface resulting from the southward transport of freshwater coming from the northern rivers. Concentrations of chlorophyll alpha and total suspended matter span more than two orders of magnitude. The bio-optical characteristics of the measurement site pertain to both Case-I (about 64%) and Case-II (about 36%) waters, based on a relationship between the beam attenuation coefficient at 660nm and the chlorophyll alpha concentration. Empirical algorithms relating in-water remote sensing reflectance ratios and optical components or properties of interest (chlorophyll alpha, total suspended matter, and the diffuse attenuation coefficient) are presented.

  17. Assessing earthquake catalogues in Venezuela by analyzing time series data

    NASA Astrophysics Data System (ADS)

    Vasquez, R.; Granado, C.

    2011-12-01

    We applied the Mann-Kendall non-parametric test to identify significant trends in time series data on seismicity patterns in Venezuela during the period 2001-2010. The entire seismic region is divided into three areas for the test: 1) West, with 12774 seismic events; 2) Center, with a total of 909 earthquakes; and 3) East, with 6382 earthquakes. We analyzed the catalogues for each subregion to obtain the b value of the Gutenberg-Richter law, based on the maximum likelihood method, and the annual magnitude of completeness (Mc), using the maximum curvature method (MAXC). We statistically assessed the Z values for the time series consisting of the b value and Mc in the three subsets of earthquakes. The confidence interval of this study was 90%. This approach is useful for analyzing the performance characteristics of the Venezuelan seismic network and the associated regional catalogues. The results lead to the conclusion that the central part of Venezuela does not show a statistically significant trend in seismicity or Mc, while the western region has a decreasing trend in the Mc estimation but no variation in seismicity. Only the eastern region presents an increasing trend in its seismicity and Mc values.
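A minimal implementation of the Mann-Kendall Z statistic can be sketched as follows (ignoring the tie corrections that real catalogue data such as the above would require):

```python
import numpy as np

def mann_kendall_z(x):
    """Mann-Kendall trend statistic Z (no tie correction); |Z| > 1.645
    rejects the no-trend null at the two-sided 90% level used above."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S counts concordant minus discordant pairs.
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        return (s - 1) / np.sqrt(var_s)
    if s < 0:
        return (s + 1) / np.sqrt(var_s)
    return 0.0

rng = np.random.default_rng(2)
z_trend = mann_kendall_z(np.arange(30) + 0.01 * rng.standard_normal(30))
z_noise = mann_kendall_z(rng.standard_normal(30))
```

A monotone series such as `z_trend` gives a large positive Z, while trendless noise stays well inside the acceptance region.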

  18. Time series clustering analysis of health-promoting behavior

    NASA Astrophysics Data System (ADS)

    Yang, Chi-Ta; Hung, Yu-Shiang; Deng, Guang-Feng

    2013-10-01

    Health promotion must be emphasized to achieve the World Health Organization goal of health for all. Since the global population is aging rapidly, the ComCare elder health-promoting service was developed by the Taiwan Institute for Information Industry in 2011. Based on the Pender health promotion model, the ComCare service offers five categories of health-promoting functions to address the everyday needs of seniors: nutrition management, social support, exercise management, health responsibility, and stress management. To assess the overall ComCare service and to improve understanding of the health-promoting behavior of elders, this study analyzed health-promoting behavioral data automatically collected by the ComCare monitoring system. In the 30638 session records collected for 249 elders from January 2012 to March 2013, behavior patterns were identified by a fuzzy c-means time series clustering algorithm combined with autocorrelation-based representation schemes. The analysis showed that the time series data for elder health-promoting behavior can be classified into four different clusters. Each type reveals different health-promoting needs, frequencies, numbers of functions and behaviors. The data analysis results can assist policymakers, health-care providers, and experts in medicine, public health, nursing and psychology, and have been provided to the Taiwan National Health Insurance Administration to assess elder health-promoting behavior.
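For reference, the fuzzy c-means step of such a pipeline can be sketched as follows (a generic implementation on feature vectors; the study applies it to autocorrelation-based representations of the behavioral series, and all parameter values here are illustrative):

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy c-means: alternate centroid and soft-membership
    updates; returns memberships U (n x c) and the centroids."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))        # random soft start
    for _ in range(n_iter):
        W = U ** m
        centroids = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))                 # standard FCM update
        U = inv / inv.sum(axis=1, keepdims=True)
    return U, centroids

# Two well-separated blobs should receive near-crisp memberships.
rng = np.random.default_rng(6)
X = np.vstack([rng.normal(0.0, 0.2, (20, 2)), rng.normal(5.0, 0.2, (20, 2))])
U, centroids = fuzzy_c_means(X, c=2)
labels = U.argmax(axis=1)
```

Unlike hard k-means, each series gets a degree of membership in every cluster, which suits behavioral data that mixes several usage patterns.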

  19. Time series modelling and forecasting of emergency department overcrowding.

    PubMed

    Kadri, Farid; Harrou, Fouzi; Chaabane, Sondès; Tahon, Christian

    2014-09-01

    Efficient management of patient flow (demand) in emergency departments (EDs) has become an urgent issue for many hospital administrations. Today, more and more attention is being paid to hospital management systems to optimally manage patient flow and to improve management strategies, efficiency and safety in such establishments. To this end, EDs require significant human and material resources, but unfortunately these are limited. Within such a framework, the ability to accurately forecast demand in emergency departments has considerable implications for hospitals to improve resource allocation and strategic planning. The aim of this study was to develop models for forecasting daily attendances at the hospital emergency department in Lille, France. The study demonstrates how time-series analysis can be used to forecast, at least in the short term, demand for emergency services in a hospital emergency department. The forecasts were based on daily patient attendances at the paediatric emergency department in Lille regional hospital centre, France, from January 2012 to December 2012. An autoregressive integrated moving average (ARIMA) method was applied separately to each of the two GEMSA categories and total patient attendances. Time-series analysis was shown to provide a useful, readily available tool for forecasting emergency department demand.

  20. A new complexity measure for time series analysis and classification

    NASA Astrophysics Data System (ADS)

    Nagaraj, Nithin; Balasubramanian, Karthi; Dey, Sutirth

    2013-07-01

    Complexity measures are used in a number of applications, including extraction of information from data such as ecological time series, detection of non-random structure in biomedical signals, testing of random number generators, language recognition, and authorship attribution. Different complexity measures proposed in the literature, like Shannon entropy, relative entropy, Lempel-Ziv, Kolmogorov and algorithmic complexity, are mostly ineffective in analyzing short sequences that are further corrupted with noise. To address this problem, we propose a new complexity measure, ETC, defined as the "Effort To Compress" the input sequence by a lossless compression algorithm. Here, we employ the lossless compression algorithm known as Non-Sequential Recursive Pair Substitution (NSRPS) and define ETC as the number of iterations needed for NSRPS to transform the input sequence into a constant sequence. We demonstrate the utility of ETC in two applications. ETC is shown to have better correlation with the Lyapunov exponent than Shannon entropy, even with relatively short and noisy time series. The measure also has a greater rate of success in automatic identification and classification of short noisy sequences, compared to entropy and a popular measure based on Lempel-Ziv compression (implemented by Gzip).
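The ETC idea is easy to sketch: repeatedly substitute the most frequent adjacent pair with a fresh symbol and count the passes until the sequence becomes constant. A minimal version follows (pair frequencies here are counted with overlap, a simplification relative to full NSRPS):

```python
from collections import Counter

def etc(seq):
    """Effort To Compress: number of NSRPS-style passes needed to reach
    a constant sequence. Each pass replaces non-overlapping occurrences
    of the most frequent adjacent pair with a fresh symbol."""
    seq = [int(s) for s in seq]
    steps = 0
    while len(seq) > 1 and len(set(seq)) > 1:
        top = Counter(zip(seq, seq[1:])).most_common(1)[0][0]
        fresh = max(seq) + 1                  # a symbol not yet in use
        out, i = [], 0
        while i < len(seq):
            if i + 1 < len(seq) and (seq[i], seq[i + 1]) == top:
                out.append(fresh)
                i += 2
            else:
                out.append(seq[i])
                i += 1
        seq = out
        steps += 1
    return steps

# A constant sequence needs no passes; a strict alternation collapses in
# one pass; irregular sequences take more effort.
```

Low ETC thus corresponds to highly regular (easily compressible) sequences, and high ETC to complex ones, even for very short inputs.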

  1. Efficient Bayesian inference for natural time series using ARFIMA processes

    NASA Astrophysics Data System (ADS)

    Graves, T.; Gramacy, R. B.; Franzke, C. L. E.; Watkins, N. W.

    2015-11-01

    Many geophysical quantities, such as atmospheric temperature, water levels in rivers, and wind speeds, have shown evidence of long memory (LM). LM implies that these quantities experience non-trivial temporal memory, which potentially not only enhances their predictability, but also hampers the detection of externally forced trends. Thus, it is important to reliably identify whether or not a system exhibits LM. In this paper we present a modern and systematic approach to the inference of LM. We use the flexible autoregressive fractional integrated moving average (ARFIMA) model, which is widely used in time series analysis, and of increasing interest in climate science. Unlike most previous work on the inference of LM, which is frequentist in nature, we provide a systematic treatment of Bayesian inference. In particular, we provide a new approximate likelihood for efficient parameter inference, and show how nuisance parameters (e.g., short-memory effects) can be integrated over in order to focus on long-memory parameters and hypothesis testing more directly. We illustrate our new methodology on the Nile water level data and the central England temperature (CET) time series, with favorable comparison to the standard estimators. For CET we also extend our method to seasonal long memory.
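The long-memory ingredient of ARFIMA is the fractional differencing operator (1 - B)^d, whose filter weights follow a simple recursion. A minimal sketch (illustrative of that one ingredient only, not of the authors' Bayesian inference machinery):

```python
import numpy as np

def frac_diff_weights(d, n):
    """Weights of the fractional differencing filter (1 - B)^d:
    w[0] = 1 and w[k] = w[k-1] * (k - 1 - d) / k."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

# d = 1 recovers ordinary first differencing: weights 1, -1, 0, 0, ...
w_int = frac_diff_weights(1.0, 5)
# For 0 < d < 0.5 the weights decay only hyperbolically (roughly
# k**(-1-d)), so the distant past never quite stops mattering --
# the signature of long memory.
w_lm = frac_diff_weights(0.3, 1000)
```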

  2. Software for detection and correction of inhomogeneities in time series

    NASA Astrophysics Data System (ADS)

    Stepanek, Petr

    2010-05-01

    During the last decade, a software package consisting of the AnClim, ProClimDB and LoadData software for processing climatological data has been created. This software offers a complete solution for processing climatological time series, from loading data from a central database (e.g. Oracle, via the LoadData software), through data quality control and homogenization, to time series analysis, extreme-value evaluation and verification of model outputs (the ProClimDB and AnClim software). In recent years, tools for the correction of inhomogeneities in daily data were introduced. Methods already programmed in R (e.g. by Christine Gruber, ZAMG), such as HOM of Paul Della-Marta and the SPLIDHOM method of Olivier Mestre, as well as our own methods, are available, some of them able to apply a multi-element approach (using e.g. weather types). The available methods can be easily compared and evaluated (for both inhomogeneity detection and correction). Comparison of the available correction methods is also a current task of the ongoing COST Action ES0601 (www.homogenisation.org). Further methods, if available under R, can easily be linked with the software, and the whole processing then benefits from a user-friendly environment in which all the most commonly used functions for data handling and climatological processing are available (read more at www.climahom.eu).

  3. On the Reconstruction of Irregularly Sampled Time Series

    NASA Astrophysics Data System (ADS)

    Vio, Roberto; Strohmer, Thomas; Wamsteker, Willem

    2000-01-01

    We consider the question of numerical treatment of irregularly sampled time series. This problem is quite common in astronomy because of factors such as the day-night alternation, weather conditions, nonobservability of the objects under study, etc. For this reason an extensive literature is available on this subject. Most of the proposed techniques, however, are based on heuristic arguments, and their usefulness is essentially in the estimation of power spectra and/or autocovariance functions. Here we propose an approach, based on the reasonable assumption that many signals of astronomical interest are the realization of band-limited processes, which can be used to fill gaps in experimental time series. By using this approach we propose several reconstruction algorithms that, because of their regularization properties, yield reliable signal reconstructions even in case of noisy data and large gaps. A detailed description of these algorithms is provided, their theoretical implications are considered, and their practical performances are tested via numerical experiments. MATLAB software implementing the methods described in this work is obtainable by request from the authors.
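One classical instance of such regularized band-limited reconstruction is the Papoulis-Gerchberg alternating-projection scheme, sketched below with numpy (an illustration of the idea, not the authors' MATLAB algorithms):

```python
import numpy as np

def papoulis_gerchberg(x, known, band, n_iter=200):
    """Fill gaps by alternating projections: enforce the measured
    samples, then keep only the lowest `band` frequency bins on each
    side (the band-limitedness assumption)."""
    y = np.where(known, x, 0.0)
    for _ in range(n_iter):
        Y = np.fft.fft(y)
        Y[band:len(y) - band] = 0.0        # project onto the band
        y = np.fft.ifft(Y).real
        y[known] = x[known]                # restore known samples
    return y

# Recover a 20-sample gap in a low-frequency sinusoid.
n = 256
t = np.arange(n)
sig = np.sin(2 * np.pi * 3 * t / n)
known = np.ones(n, dtype=bool)
known[100:120] = False                     # the gap
rec = papoulis_gerchberg(sig, known, band=8)
gap_err = np.max(np.abs(rec[~known] - sig[~known]))
```

Convergence slows as gaps grow or the band widens, which is where the regularized variants discussed in the paper become important for noisy data.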

  4. Automatising the analysis of stochastic biochemical time-series

    PubMed Central

    2015-01-01

    Background Mathematical and computational modelling of biochemical systems has seen a lot of effort devoted to the definition and implementation of high-performance mechanistic simulation frameworks. Within these frameworks it is possible to analyse complex models under a variety of configurations, eventually selecting the best setting of, e.g., parameters for a target system. Motivation This operational pipeline relies on the ability to interpret the predictions of a model, often represented as simulation time-series. Thus, an efficient data analysis pipeline is crucial to automatise time-series analyses, bearing in mind that errors in this phase might mislead the modeller's conclusions. Results For this reason we have developed an intuitive framework-independent Python tool to automate analyses common to a variety of modelling approaches. These include assessment of useful non-trivial statistics for simulation ensembles, e.g., estimation of master equations. Intuitive and domain-independent batch scripts will allow the researcher to automatically prepare reports, thus speeding up the usual model-definition, testing and refinement pipeline. PMID:26051821

  5. Financial Time Series Prediction Using Spiking Neural Networks

    PubMed Central

    Reid, David; Hussain, Abir Jaafar; Tawfik, Hissam

    2014-01-01

    In this paper a novel application of a particular type of spiking neural network, a Polychronous Spiking Network, was used for financial time series prediction. It is argued that the inherent temporal capabilities of this type of network are suited to non-stationary data such as this. The performance of the spiking neural network was benchmarked against three systems: two “traditional”, rate-encoded, neural networks; a Multi-Layer Perceptron neural network and a Dynamic Ridge Polynomial neural network, and a standard Linear Predictor Coefficients model. For this comparison three non-stationary and noisy time series were used: IBM stock data; US/Euro exchange rate data, and the price of Brent crude oil. The experiments demonstrated favourable prediction results for the Spiking Neural Network in terms of Annualised Return and prediction error for 5-Step ahead predictions. These results were also supported by other relevant metrics such as Maximum Drawdown and Signal-To-Noise ratio. This work demonstrated the applicability of the Polychronous Spiking Network to financial data forecasting and this in turn indicates the potential of using such networks over traditional systems in difficult to manage non-stationary environments. PMID:25170618

  6. Predicting physical time series using dynamic ridge polynomial neural networks.

    PubMed

    Al-Jumeily, Dhiya; Ghazali, Rozaida; Hussain, Abir

    2014-01-01

    Forecasting naturally occurring phenomena is a common problem in many domains of science, and it has been addressed and investigated by many scientists. The importance of time series prediction stems from the fact that it has a wide range of applications, including control systems, engineering processes, environmental systems and economics. From the knowledge of some aspects of the previous behaviour of the system, the aim of the prediction process is to determine or predict its future behaviour. In this paper, we consider a novel application of a higher order polynomial neural network architecture called the Dynamic Ridge Polynomial Neural Network, which combines the properties of higher order and recurrent neural networks, for the prediction of physical time series. In this study, four types of signals have been used: the Lorenz attractor, the mean value of the AE index, the sunspot number, and heat wave temperature. The simulation results showed good improvements in terms of the signal-to-noise ratio in comparison to a number of benchmarked higher order and feedforward neural networks.

  7. Disentangling the stochastic behavior of complex time series

    PubMed Central

    Anvari, Mehrnaz; Tabar, M. Reza Rahimi; Peinke, Joachim; Lehnertz, Klaus

    2016-01-01

    Complex systems involving a large number of degrees of freedom generally exhibit non-stationary dynamics, which can result in either continuous or discontinuous sample paths of the corresponding time series. The latter sample paths may be caused by discontinuous events – or jumps – with some distributed amplitudes, and disentangling effects caused by such jumps from effects caused by normal diffusion processes is a main problem for a detailed understanding of the stochastic dynamics of complex systems. Here we introduce a non-parametric method to address this general problem. By means of stochastic jump-diffusion modelling, we separate deterministic drift terms from different stochastic behaviors, namely diffusive and jumpy ones, and show that all of the unknown functions and coefficients of this modelling can be derived directly from measured time series. We demonstrate the applicability of our method to empirical observations by a data-driven inference of the deterministic drift term and of the diffusive and jumpy behavior in brain dynamics from ten epilepsy patients. In particular, these different stochastic behaviors provide extra information that can be regarded as valuable for diagnostic purposes. PMID:27759055
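The drift part of such data-driven modelling is the first Kramers-Moyal coefficient, estimable as a binned conditional mean of increments. A minimal sketch on a simulated Ornstein-Uhlenbeck series (illustrative only; the paper's method additionally separates diffusive from jumpy contributions):

```python
import numpy as np

def drift_estimate(x, dt, bins=20):
    """First Kramers-Moyal coefficient: the conditional mean increment
    per unit time, estimated in bins of the state variable."""
    inc = (x[1:] - x[:-1]) / dt
    edges = np.linspace(np.percentile(x, 5), np.percentile(x, 95), bins + 1)
    centers, drift = [], []
    for i in range(bins):
        mask = (x[:-1] >= edges[i]) & (x[:-1] < edges[i + 1])
        if mask.sum() > 50:                 # skip poorly populated bins
            centers.append(0.5 * (edges[i] + edges[i + 1]))
            drift.append(inc[mask].mean())
    return np.array(centers), np.array(drift)

# Euler-simulated Ornstein-Uhlenbeck process dx = -x dt + dW: the
# estimated drift should lie close to the line -x.
rng = np.random.default_rng(5)
dt, n = 0.01, 200000
noise = np.sqrt(dt) * rng.standard_normal(n)
x = np.zeros(n)
for t in range(n - 1):
    x[t + 1] = x[t] - x[t] * dt + noise[t]
centers, drift = drift_estimate(x, dt)
slope = np.polyfit(centers, drift, 1)[0]
```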

  8. Unsupervised Classification During Time-Series Model Building.

    PubMed

    Gates, Kathleen M; Lane, Stephanie T; Varangis, E; Giovanello, K; Guiskewicz, K

    2016-12-07

    Researchers who collect multivariate time-series data across individuals must decide whether to model the dynamic processes at the individual level or at the group level. A recent innovation, group iterative multiple model estimation (GIMME), offers one solution to this dichotomy by identifying group-level time-series models in a data-driven manner while also reliably recovering individual-level patterns of dynamic effects. GIMME is unique in that it does not assume homogeneity in processes across individuals in terms of the patterns or weights of temporal effects. However, it can be difficult to make inferences from the nuances in varied individual-level patterns. The present article introduces an algorithm that arrives at subgroups of individuals that have similar dynamic models. Importantly, the researcher does not need to decide the number of subgroups. The final models contain reliable group-, subgroup-, and individual-level patterns that enable generalizable inferences, subgroups of individuals with shared model features, and individual-level patterns and estimates. We show that integrating community detection into the GIMME algorithm improves upon current standards in two important ways: (1) providing reliable classification and (2) increasing the reliability in the recovery of individual-level effects. We demonstrate this method on functional MRI from a sample of former American football players.

  9. Financial time series prediction using spiking neural networks.

    PubMed

    Reid, David; Hussain, Abir Jaafar; Tawfik, Hissam

    2014-01-01

    In this paper a novel application of a particular type of spiking neural network, a Polychronous Spiking Network, was used for financial time series prediction. It is argued that the inherent temporal capabilities of this type of network are suited to non-stationary data such as this. The performance of the spiking neural network was benchmarked against three systems: two "traditional", rate-encoded, neural networks; a Multi-Layer Perceptron neural network and a Dynamic Ridge Polynomial neural network, and a standard Linear Predictor Coefficients model. For this comparison three non-stationary and noisy time series were used: IBM stock data; US/Euro exchange rate data, and the price of Brent crude oil. The experiments demonstrated favourable prediction results for the Spiking Neural Network in terms of Annualised Return and prediction error for 5-Step ahead predictions. These results were also supported by other relevant metrics such as Maximum Drawdown and Signal-To-Noise ratio. This work demonstrated the applicability of the Polychronous Spiking Network to financial data forecasting and this in turn indicates the potential of using such networks over traditional systems in difficult to manage non-stationary environments.

  10. Predicting Physical Time Series Using Dynamic Ridge Polynomial Neural Networks

    PubMed Central

    Al-Jumeily, Dhiya; Ghazali, Rozaida; Hussain, Abir

    2014-01-01

    Forecasting naturally occurring phenomena is a common problem in many domains of science, and it has been addressed and investigated by many scientists. The importance of time series prediction stems from the fact that it has a wide range of applications, including control systems, engineering processes, environmental systems and economics. From knowledge of some aspects of the previous behaviour of the system, the aim of the prediction process is to determine or predict its future behaviour. In this paper, we consider a novel application of a higher order polynomial neural network architecture called the Dynamic Ridge Polynomial Neural Network, which combines the properties of higher order and recurrent neural networks, for the prediction of physical time series. In this study, four types of signals have been used: the Lorenz attractor, the mean value of the AE index, the sunspot number, and heat wave temperature. The simulation results showed good improvements in terms of the signal-to-noise ratio in comparison to a number of benchmarked higher order and feedforward neural networks. PMID:25157950

  11. Hybrid perturbation methods based on statistical time series models

    NASA Astrophysics Data System (ADS)

    San-Juan, Juan Félix; San-Martín, Montserrat; Pérez, Iván; López, Rosario

    2016-04-01

    In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies, because, in order to simplify the expressions and subsequent computations, not all the involved forces are taken into account and only low-order terms are considered; moreover, mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing the dynamics missing from the previously integrated approximation. This combination improves the precision of conventional numerical, analytical and semianalytical theories for determining the position and velocity of any artificial satellite or space debris object. In order to validate this methodology, we present a family of three hybrid orbit propagators, formed by combining three different orders of approximation of an analytical theory with a statistical time series model, and analyse their capability to process the effect produced by the flattening of the Earth. The three analytical components are the integration of the Kepler problem and first-order and second-order analytical theories, whereas the prediction technique is the same in all three cases, namely an additive Holt-Winters method.
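    The additive Holt-Winters component of such a hybrid propagator can be sketched compactly. The following is a minimal, self-contained illustration, not the authors' implementation; the smoothing constants and the simple first-cycle initialisation are common textbook choices and should be treated as hypothetical.

```python
def holt_winters_additive(y, period, alpha=0.3, beta=0.05, gamma=0.2):
    """One-step-ahead forecasts of y by additive Holt-Winters smoothing."""
    # Initialise level, trend, and seasonal terms from the first two cycles.
    level = sum(y[:period]) / period
    trend = (sum(y[period:2 * period]) - sum(y[:period])) / period ** 2
    season = [v - level for v in y[:period]]
    forecasts = []
    for t, obs in enumerate(y):
        s = season[t % period]
        forecasts.append(level + trend + s)  # forecast made before seeing obs
        new_level = alpha * (obs - s) + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        season[t % period] = gamma * (obs - new_level) + (1 - gamma) * s
        level = new_level
    return forecasts
```

    In a hybrid scheme, a model of this kind would be fitted to the residuals between the analytical propagation and reference ephemerides, and its forecasts added back to the analytical approximation.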

  12. Disentangling the stochastic behavior of complex time series

    NASA Astrophysics Data System (ADS)

    Anvari, Mehrnaz; Tabar, M. Reza Rahimi; Peinke, Joachim; Lehnertz, Klaus

    2016-10-01

    Complex systems involving a large number of degrees of freedom generally exhibit non-stationary dynamics, which can result in either continuous or discontinuous sample paths of the corresponding time series. The latter sample paths may be caused by discontinuous events – or jumps – with some distributed amplitudes, and disentangling effects caused by such jumps from effects caused by normal diffusion processes is a central problem for a detailed understanding of the stochastic dynamics of complex systems. Here we introduce a non-parametric method to address this general problem. By means of stochastic dynamical jump-diffusion modelling, we separate deterministic drift terms from different stochastic behaviors, namely diffusive and jumpy ones, and show that all of the unknown functions and coefficients of this modelling can be derived directly from measured time series. We demonstrate the applicability of our method to empirical observations by a data-driven inference of the deterministic drift term and of the diffusive and jumpy behavior in brain dynamics from ten epilepsy patients. In particular, these different stochastic behaviors provide extra information that can be regarded as valuable for diagnostic purposes.
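    A first step of this kind of data-driven inference, estimating the deterministic drift as the first conditional moment of the increments (the first Kramers-Moyal coefficient), can be sketched as follows. The Ornstein-Uhlenbeck test process, the bin edges, and the sample size are illustrative choices only; the paper's full method additionally separates diffusive from jumpy contributions using higher-order conditional moments.

```python
import random

random.seed(42)

# Simulate an Ornstein-Uhlenbeck process dX = -theta*X dt + sigma dW (Euler scheme).
theta, sigma, dt, n = 1.0, 0.5, 0.01, 200_000
x = [0.0]
for _ in range(n):
    x.append(x[-1] - theta * x[-1] * dt + sigma * (dt ** 0.5) * random.gauss(0, 1))

# First Kramers-Moyal coefficient: D1(x) ~ E[X(t+dt) - X(t) | X(t) = x] / dt,
# estimated by binning the state axis and averaging the observed increments.
bins = [(-0.6, -0.2), (0.2, 0.6)]
drift = []
for lo, hi in bins:
    incs = [x[i + 1] - x[i] for i in range(n) if lo <= x[i] < hi]
    drift.append(sum(incs) / (len(incs) * dt))
```

    For this mean-reverting process the estimated drift is positive below the fixed point and negative above it, recovering the sign and rough magnitude of -theta*x without assuming a parametric form.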

  13. Integration of Remote Sensing derived Actual Evapotranspiration with Meteorological Data for Real Time Demand Forecasting in Semi-arid Regions

    NASA Astrophysics Data System (ADS)

    Ullah, M. K.; Hafeez, M. M.; Chemin, Y.; Faux, R.; Sixsmith, J.

    2010-12-01

    Irrigated agriculture is a major consumer of fresh water, but a large part of the water used for irrigation is wasted due to poor management of irrigation systems. Improving water management in irrigated areas requires the analysis of real-time water demand in order to determine the ways in which it may be modified and rationalised. Real-time water demand information in irrigated areas is key to planning the sustainable use of irrigation water. These activities are needed not only to improve water productivity, but also to increase the sustainability of irrigated agriculture by saving irrigation water. Demand forecasting entails a complete understanding of the spatial and expected temporal variability of meteorological parameters and evapotranspiration (ET). ET is the overriding factor for irrigation demand forecasting at farm to catchment scale. Many models, either empirical or functional, have been used to measure the ET rate. The major disadvantage of this approach is that most methods generate only point values, resulting in estimates that are not representative of large areas. These methods are based on crop factors under ideal conditions and therefore cannot represent actual crop ET. Satellite remote sensing is a powerful means to estimate ET over various spatial and temporal scales. For improved irrigation system management and operation, a holistic approach integrating remote sensing derived ET from the SAM-ET (spatial algorithm for mapping ET) algorithm, developed for the Australian agro-ecosystem, with forecasted meteorological data and field application loss functions for major crops was used to forecast actual water demand in the Coleambally Irrigation Area (CIA), New South Wales, Australia. The CIA covers approximately 79,000 ha of intensive irrigation and comprises a number of secondary and tertiary canals. In order to capture the spatial variability, the CIA has been divided into 22 nodes based on direction of flow and connectivity. All hydrological data of inflow (i

  14. Two algorithms to fill cloud gaps in LST time series

    NASA Astrophysics Data System (ADS)

    Frey, Corinne; Kuenzer, Claudia

    2013-04-01

    Cloud contamination is a challenge for optical remote sensing. This is especially true for the recording of a fast-changing radiative quantity like land surface temperature (LST). The substitution of cloud-contaminated pixels with estimated values - gap filling - is not straightforward, but it is possible to a certain extent, as this research shows for medium-resolution time series of MODIS data. The area of interest is the Upper Mekong Delta (UMD). The background for this work is an analysis of the temporal development of 1-km LST in the context of the WISDOM project. The climate of the UMD is characterized by peak rainfall in the summer months, which is also when cloud contamination is highest in the area. The average number of available daytime observations per pixel can drop below five in June, for example, whereas in winter it may reach 25 observations a month. This situation is not adequate for the calculation of long-term statistics; an appropriate gap filling method should be used beforehand. In this research, two different algorithms were tested on an 11-year time series: 1) a gradient-based algorithm and 2) a method based on ECMWF era interim re-analysis data. The first algorithm searches for stable inter-image gradients within a given environment and over a certain period of time. These gradients are then used to estimate LST for cloud-contaminated pixels in each acquisition. The estimated LSTs are clear-sky LSTs and are based solely on the MODIS LST time series. The second method estimates LST on the basis of adapted ECMWF era interim skin temperatures and creates a set of expected LSTs. The estimated values were used to fill the gaps in the original dataset, creating two new daily, 1 km datasets. The maps filled with the gradient-based method contained more than twice as many valid pixels as the original dataset. The second method (ECMWF era interim based) was able to fill all data gaps. From the gap filled data sets then monthly
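    The gradient-based filling can be illustrated, in drastically simplified form, by transferring a stable offset from a clear reference pixel to a cloud-gapped target pixel. The median offset and the two pixel series below are hypothetical stand-ins for the stable inter-image gradients described in the abstract, not the authors' algorithm.

```python
def fill_gaps_with_offset(target, reference):
    """Fill None (cloudy) entries of a target pixel series from a clear reference.

    The offset between the two pixels is assumed stable in time (the 'gradient'
    idea, greatly simplified): estimate it as the median difference over jointly
    clear acquisitions, then transfer it to the gapped acquisitions.
    """
    diffs = sorted(t - r for t, r in zip(target, reference)
                   if t is not None and r is not None)
    offset = diffs[len(diffs) // 2]
    return [r + offset if t is None and r is not None else t
            for t, r in zip(target, reference)]
```

    Filled values inherit the reference pixel's temporal behaviour, so they are clear-sky estimates in the same sense as described for the first algorithm.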

  15. Dependency Structures in Differentially Coded Cardiovascular Time Series

    PubMed Central

    Tasic, Tatjana; Jovanovic, Sladjana; Mohamoud, Omer; Skoric, Tamara; Japundzic-Zigon, Nina

    2017-01-01

    Objectives. This paper analyses temporal dependency in time series recorded from aging rats, both healthy ones and those with early developed hypertension. The aim is to explore the effects of age and hypertension on mutual sample relationships along the time axis. Methods. A copula method is applied to raw and to differentially coded signals. The latter were additionally binary encoded for a joint conditional entropy application. The signals were recorded from freely moving male Wistar rats and from spontaneously hypertensive rats, aged 3 months and 12 months. Results. The highest level of comonotonic behavior of pulse interval with respect to systolic blood pressure is observed at time lags τ = 0, 3, and 4, while a strong counter-monotonic behavior occurs at time lags τ = 1 and 2. Conclusion. The dynamic range of aging rats is considerably reduced in the hypertensive groups. Conditional entropy of the systolic blood pressure signal, compared to unconditional entropy, shows an increased level of discrepancy, except at time lag 1, where equality is preserved in spite of the memory of the differential coder. The antiparallel streams play an important role at the single-beat time lag. PMID:28127384

  16. Time series trends of the safety effects of pavement resurfacing.

    PubMed

    Park, Juneyoung; Abdel-Aty, Mohamed; Wang, Jung-Han

    2017-04-01

    This study evaluated the safety performance of pavement resurfacing projects on urban arterials in Florida using observational before-after approaches. The safety effects of pavement resurfacing were quantified as crash modification factors (CMFs) and estimated for different ranges of heavy vehicle traffic volume and time changes at different severity levels. In order to evaluate the variation of CMFs over time, crash modification functions (CMFunctions) were developed using nonlinear regression and time series models. The results showed that pavement resurfacing projects decrease crash frequency and are more effective at reducing severe crashes in general. Moreover, the results for the general relationship between the safety effects and time indicated that the CMFs increase over time after the resurfacing treatment. It was also found that pavement resurfacing projects on urban roadways with a higher heavy vehicle volume rate are more effective than those on roadways with a lower heavy vehicle volume rate. Based on the exploration and comparison of the developed CMFunctions, the seasonal autoregressive integrated moving average (SARIMA) and exponential forms of the nonlinear regression models can be utilized to identify the trend of CMFs over time.
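    The exponential functional form of a CMFunction, CMF(t) = a·exp(b·t), can be fitted by ordinary least squares after a log transform. The CMF observations below are hypothetical illustration data, and a SARIMA fit would in practice be done with a dedicated statistics package; this sketch only shows the nonlinear-regression alternative mentioned in the abstract.

```python
import math

# Hypothetical CMF observations at yearly intervals after resurfacing:
# the safety benefit decays, so the CMF drifts back toward 1.0 over time.
years = [1, 2, 3, 4, 5, 6]
cmf = [0.78, 0.82, 0.85, 0.89, 0.93, 0.96]

# Fit CMF(t) = a * exp(b * t) via least squares on log(CMF) (log-linear form).
n = len(years)
sx, sy = sum(years), sum(math.log(v) for v in cmf)
sxx = sum(t * t for t in years)
sxy = sum(t * math.log(v) for t, v in zip(years, cmf))
b = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # growth rate of the CMF
a = math.exp((sy - b * sx) / n)                 # scale at t = 0
```

    A positive fitted b quantifies how quickly the treatment effect wears off, which is the trend the CMFunctions are meant to capture.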

  17. Multifractal analysis of visibility graph-based Ito-related connectivity time series.

    PubMed

    Czechowski, Zbigniew; Lovallo, Michele; Telesca, Luciano

    2016-02-01

    In this study, we investigate multifractal properties of connectivity time series resulting from the visibility graph applied to normally distributed time series generated by Ito equations with multiplicative power-law noise. We show that multifractality of the connectivity time series (i.e., the series of numbers of links outgoing from each node) increases with the exponent of the power-law noise. The multifractality of the connectivity time series could be due to the width of the connectivity degree distribution, which can be related to the exit time of the associated Ito time series. Furthermore, the connectivity time series are characterized by persistence, although the original Ito time series are random; this is due to the visibility-graph procedure which, by connecting the values of the time series, generates persistence but destroys most of the nonlinear correlations. Moreover, the visibility graph is effective at detecting wide "depressions" in the input time series.
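    The connectivity series analysed here is the degree sequence of the natural visibility graph. A minimal O(n²) construction, sufficient for illustration, connects two samples whenever every intermediate sample lies strictly below the straight line joining them:

```python
def visibility_degrees(y):
    """Degree sequence of the natural visibility graph of a time series.

    Samples (a, y[a]) and (b, y[b]) are connected if every intermediate
    sample lies strictly below the straight line joining them.
    """
    n = len(y)
    deg = [0] * n
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                deg[a] += 1
                deg[b] += 1
    return deg
```

    Applying this to each window of an Ito-generated series yields the connectivity time series whose multifractal properties the study examines.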

  18. Detecting Abrupt Changes in a Piecewise Locally Stationary Time Series

    PubMed Central

    Last, Michael; Shumway, Robert

    2007-01-01

    Non-stationary time series arise in many settings, such as seismology, speech-processing, and finance. In many of these settings we are interested in points where a model of local stationarity is violated. We consider the problem of how to detect these change-points, which we identify by finding sharp changes in the time-varying power spectrum. Several different methods are considered, and we find that the symmetrized Kullback-Leibler information discrimination performs best in simulation studies. We derive asymptotic normality of our test statistic, and consistency of estimated change-point locations. We then demonstrate the technique on the problem of detecting arrival phases in earthquakes. PMID:19190715

  19. A quasi-global precipitation time series for drought monitoring

    USGS Publications Warehouse

    Funk, Chris C.; Peterson, Pete J.; Landsfeld, Martin F.; Pedreros, Diego H.; Verdin, James P.; Rowland, James D.; Romero, Bo E.; Husak, Gregory J.; Michaelsen, Joel C.; Verdin, Andrew P.

    2014-01-01

    Estimating precipitation variations in space and time is an important aspect of drought early warning and environmental monitoring. An evolving drier-than-normal season must be placed in historical context so that the severity of rainfall deficits may quickly be evaluated. To this end, scientists at the U.S. Geological Survey Earth Resources Observation and Science Center, working closely with collaborators at the University of California, Santa Barbara Climate Hazards Group, have developed a quasi-global (50°S–50°N, 180°E–180°W), 0.05° resolution, 1981 to near-present gridded precipitation time series: the Climate Hazards Group InfraRed Precipitation with Stations (CHIRPS) data archive.

  20. Estimation of Hurst Exponent for the Financial Time Series

    NASA Astrophysics Data System (ADS)

    Kumar, J.; Manchanda, P.

    2009-07-01

    Until recently, statistical methods and Fourier analysis were employed to study fluctuations in stock markets in general and the Indian stock market in particular. However, the current trend is to apply the concepts of wavelet methodology and the Hurst exponent; see for example the work of Manchanda, J. Kumar and Siddiqi, Journal of the Franklin Institute 144 (2007), 613-636, and the paper of Cajueiro and B. M. Tabak. Cajueiro and Tabak (Physica A, 2003) checked the efficiency of emerging markets by computing the Hurst exponent over a time window of 4 years of data. Our goal in the present paper is to understand the dynamics of the Indian stock market. We look for persistency in the stock market through the Hurst exponent and the fractal dimension of time series data of BSE 100 and NIFTY 50.

  1. Cluster analysis of long time-series medical datasets

    NASA Astrophysics Data System (ADS)

    Hirano, Shoji; Tsumoto, Shusaku

    2004-04-01

    This paper presents a comparative study of the characteristics of clustering methods for inhomogeneous time-series medical datasets. Using various combinations of comparison methods and grouping methods, we performed clustering experiments on the hepatitis data set and evaluated the validity of the results. The results suggested that (1) the complete-linkage (CL) criterion in agglomerative hierarchical clustering (AHC) outperformed the average-linkage (AL) criterion in terms of the interpretability of the dendrogram and clustering results, (2) the combination of dynamic time warping (DTW) and CL-AHC consistently produced interpretable results, (3) the combination of DTW and rough clustering (RC) can be used to find the core sequences of the clusters, and (4) multiscale matching may suffer from the treatment of 'no-match' pairs; however, the problem may be avoided by using RC as a subsequent grouping method.
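    The DTW comparison step can be sketched with the standard dynamic-programming recurrence; this is a generic textbook formulation, not the paper's implementation. The resulting pairwise distance matrix is what would be fed to complete-linkage AHC or rough clustering.

```python
def dtw(a, b):
    """Dynamic time warping distance between two sequences (absolute cost)."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Best of insertion, deletion, and match moves.
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]
```

    Unlike a lockstep (Euclidean-style) comparison, DTW aligns time-shifted features, which is why it pairs well with hierarchical grouping of irregular clinical series.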

  2. Vegetation Dynamics of NW Mexico using MODIS time series data

    NASA Astrophysics Data System (ADS)

    Valdes, M.; Bonifaz, R.; Pelaez, G.; Leyva Contreras, A.

    2010-12-01

    Northwestern Mexico is an area subject to a combination of marine and continental climatic influences, which produces highly variable vegetation dynamics over time. Using Moderate Resolution Imaging Spectroradiometer (MODIS) vegetation indices (NDVI and EVI) from 2001 to 2008, mean and standard deviation images of the time series were calculated. Using these data, annual vegetation dynamics were characterized based on the values for the different vegetation types. Annual mean values were compared, and interannual variations or anomalies were analyzed by calculating departures from the mean. A value was considered an anomaly if it was more than two standard deviations above or below the mean. Using this procedure it was possible to determine spatio-temporal patterns over the study area and relate them to climatic conditions.
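    The two-standard-deviation anomaly rule is straightforward to express per pixel. The NDVI values in the test are invented illustration data, not from the MODIS archive.

```python
def anomaly_mask(values, threshold=2.0):
    """Flag entries that depart from the series mean by > threshold * std.

    `values` would be, e.g., the annual mean vegetation index of one pixel.
    """
    n = len(values)
    mean = sum(values) / n
    std = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    return [abs(v - mean) > threshold * std for v in values]
```

    Applied image-wide, the mask marks the years and locations whose vegetation response departs from the multi-year norm.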

  3. Adaptive Sensing of Time Series with Application to Remote Exploration

    NASA Technical Reports Server (NTRS)

    Thompson, David R.; Cabrol, Nathalie A.; Furlong, Michael; Hardgrove, Craig; Low, Bryan K. H.; Moersch, Jeffrey; Wettergreen, David

    2013-01-01

    We address the problem of adaptive, information-optimal data collection in time series. Here a remote sensor or explorer agent throttles its sampling rate in order to track anomalous events while obeying constraints on time and power. This problem is challenging because the agent has limited visibility: all collected datapoints lie in the past, but its resource allocation decisions require predicting far into the future. Our solution is to continually fit a Gaussian process model to the latest data and optimize the sampling plan online to maximize information gain. We compare the performance characteristics of stationary and nonstationary Gaussian process models. We also describe an application based on geologic analysis during planetary rover exploration. Here adaptive sampling can improve coverage of localized anomalies and potentially benefit the mission science yield of long autonomous traverses.

  4. Behavior of road accidents: Structural time series approach

    NASA Astrophysics Data System (ADS)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir; Arsad, Zainudin

    2014-12-01

    Road accidents have become a major issue, contributing to an increasing number of deaths. Some researchers suggest that road accidents occur due to road structure and road condition, which may differ according to the area and traffic volume of the location. Therefore, this paper examines the behavior of road accidents in four main regions of Peninsular Malaysia by employing a structural time series (STS) approach. STS offers the possibility of modelling unobserved components such as the trend and seasonal components, which are allowed to vary over time. The results show that the number of road accidents in each region is described by a different model. The results imply that the government, and especially policy makers, should consider implementing different approaches in each region to curb the increasing number of road accidents.

  5. The Bermuda Atlantic Time-series Study (BATS): A Time-series Window on Sargasso Sea Ecosystem Functioning

    NASA Astrophysics Data System (ADS)

    Lomas, M. W.

    2001-12-01

    The Bermuda Atlantic Time-series Study (BATS), located in the Northwestern Sargasso Sea, was started over 12 years ago as part of the Joint Global Ocean Flux Study. The BATS sampling region lies ~82 km southeast of Bermuda in about 4600 meters of water near the Ocean Flux Program site and the Bermuda Testbed Mooring. Over this 12-year period, a suite of core measurements has been made monthly, or biweekly during the winter/spring bloom period (January to April). These measurements cover a wide range of physical, chemical and biological stock measurements. In conjunction with these stock measurements, a number of BATS core rate process measurements are made, such as primary and bacterial production, and particle mass flux. Over the 12-year record of this program, numerous ancillary projects have greatly enhanced the significance and interpretability of the core measurements. More importantly, this 12-year time-series data set has provided information that allows us to re-examine some of the dominant paradigms in biological oceanography, namely that the open ocean is an unchanging biological "desert". The past decade has seen a shift in the fate of the carbon fixed during primary production. Whereas a significant fraction of photosynthetically fixed carbon accumulated in the dissolved organic pool in the early 1990's, the late 1990's are characterized by a reduction in DOC accumulation and a commensurate >2-fold increase in particle flux from the euphotic zone. This change in the partitioning of primary production appears to be associated with significant changes in phytoplankton community structure and climatic forcing. As the BATS time-series record continues to extend, so too will our understanding of the mechanisms responsible for these apparent changes in the functioning of the Sargasso Sea ecosystem.

  6. Time series power flow analysis for distribution connected PV generation.

    SciTech Connect

    Broderick, Robert Joseph; Quiroz, Jimmy Edward; Ellis, Abraham; Reno, Matthew J.; Smith, Jeff; Dugan, Roger

    2013-01-01

    Distributed photovoltaic (PV) projects must go through an interconnection study process before connecting to the distribution grid. These studies are intended to identify the likely impacts and mitigation alternatives. In the majority of the cases, system impacts can be ruled out or mitigation can be identified without an involved study, through a screening process or a simple supplemental review study. For some proposed projects, expensive and time-consuming interconnection studies are required. The challenges to performing the studies are twofold. First, every study scenario is potentially unique, as the studies are often highly specific to the amount of PV generation capacity that varies greatly from feeder to feeder and is often unevenly distributed along the same feeder. This can cause location-specific impacts and mitigations. The second challenge is the inherent variability in PV power output which can interact with feeder operation in complex ways, by affecting the operation of voltage regulation and protection devices. The typical simulation tools and methods in use today for distribution system planning are often not adequate to accurately assess these potential impacts. This report demonstrates how quasi-static time series (QSTS) simulation and high time-resolution data can be used to assess the potential impacts in a more comprehensive manner. The QSTS simulations are applied to a set of sample feeders with high PV deployment to illustrate the usefulness of the approach. The report describes methods that can help determine how PV affects distribution system operations. The simulation results are focused on enhancing the understanding of the underlying technical issues. The examples also highlight the steps needed to perform QSTS simulation and describe the data needed to drive the simulations. The goal of this report is to make the methodology of time series power flow analysis readily accessible to utilities and others responsible for evaluating

  7. Analyzing bank filtration by deconvoluting time series of electric conductivity.

    PubMed

    Cirpka, Olaf A; Fienen, Michael N; Hofer, Markus; Hoehn, Eduard; Tessarini, Aronne; Kipfer, Rolf; Kitanidis, Peter K

    2007-01-01

    Knowing the travel-time distributions from infiltrating rivers to pumping wells is important in the management of alluvial aquifers. Commonly, travel-time distributions are determined by releasing a tracer pulse into the river and measuring the breakthrough curve in the wells. As an alternative, one may measure signals of a time-varying natural tracer in the river and in adjacent wells and infer the travel-time distributions by deconvolution. Traditionally this is done by fitting a parametric function such as the solution of the one-dimensional advection-dispersion equation to the data. By choosing a certain parameterization, it is impossible to determine features of the travel-time distribution that do not follow the general shape of the parameterization, i.e., multiple peaks. We present a method to determine travel-time distributions by nonparametric deconvolution of electric-conductivity time series. Smoothness of the inferred transfer function is achieved by a geostatistical approach, in which the transfer function is assumed as a second-order intrinsic random time variable. Nonnegativity is enforced by the method of Lagrange multipliers. We present an approach to directly compute the best nonnegative estimate and to generate sets of plausible solutions. We show how the smoothness of the transfer function can be estimated from the data. The approach is applied to electric-conductivity measurements taken at River Thur, Switzerland, and five wells in the adjacent aquifer, but the method can also be applied to other time-varying natural tracers such as temperature. At our field site, electric-conductivity fluctuations appear to be an excellent natural tracer.
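    The core numerical task, recovering a nonnegative transfer function g such that the well signal is approximately the river signal convolved with g, can be sketched with multiplicative nonnegative least-squares updates. This is a simplified stand-in: the paper's geostatistical smoothing prior and Lagrange-multiplier treatment are not reproduced, the Lee-Seung style update assumes a nonnegative input signal, and the river and transfer-function values below are synthetic.

```python
# Synthetic demonstration: a hypothetical nonnegative river input signal,
# a known nonnegative transfer function, and the resulting well signal.
inp = [1.0, 4.0, 2.0, 7.0, 3.0, 1.0, 5.0, 2.0, 6.0, 1.0,
       3.0, 8.0, 2.0, 4.0, 1.0, 5.0, 3.0, 7.0, 2.0, 4.0]
g_true = [0.0, 0.6, 0.3, 0.1, 0.0]
out = [sum(inp[t - k] * g_true[k] for k in range(5) if 0 <= t - k < len(inp))
       for t in range(len(inp) + 4)]

def nn_deconvolve(inp, out, klen, iters=3000):
    """Estimate nonnegative g with out ~ conv(inp, g), via multiplicative updates."""
    n = len(out)
    # Convolution written as a matrix: A[t][k] = inp[t - k].
    A = [[inp[t - k] if 0 <= t - k < len(inp) else 0.0 for k in range(klen)]
         for t in range(n)]
    g = [1.0 / klen] * klen
    for _ in range(iters):
        Ag = [sum(A[t][k] * g[k] for k in range(klen)) for t in range(n)]
        for k in range(klen):
            num = sum(A[t][k] * out[t] for t in range(n))
            den = sum(A[t][k] * Ag[t] for t in range(n)) + 1e-12
            g[k] *= num / den   # multiplicative step keeps g nonnegative
    return g

g_est = nn_deconvolve(inp, out, klen=5)
```

    The multiplicative form enforces nonnegativity by construction; a smoothness prior, as used in the paper, would additionally regularise the shape of g when the data are noisy.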

  8. Challenges to Deriving Climate Time Series From Satellite Observations

    NASA Astrophysics Data System (ADS)

    Wentz, F. J.; Mears, C. A.

    2005-12-01

    Satellites have been observing the Earth's weather and climate since the launch of TIROS-1 in 1960. As satellite and sensor technology advanced over the next two decades, the accuracy of the satellite observations improved to the point of being useful for climate monitoring. The launch of the first Microwave Sounding Unit (MSU) in October 1978 and the first Special Sensor Microwave Imager (SSM/I) in June 1987 mark the beginning of research-quality time series for several important climate state variables, including tropospheric temperature and water vapor, cloud and rain water, and ocean surface winds. In this talk, we will illustrate the many obstacles that must be overcome to convert raw satellite measurements into climate data records. Probably the most pivotal issue is sensor calibration. Although an on-board self-calibrating apparatus is part of each satellite sensor, the accuracy of the calibration system is limited, and in some cases unexpected calibration problems have occurred on-orbit. The lack of exact calibration leads to a second problem: merging sensors flying on many different satellites into one consistent decadal time series. Drifts in the satellites' orbits, both in altitude and local time of day, must also be carefully taken into account, or else spurious signals will enter the time series. In addition to these technical difficulties, programmatic problems present a different set of hurdles that must be overcome. The maintenance of a long-term climate record may necessitate sustaining a long-term research activity requiring continuity in both expert staffing and funding. The alternative of computing climate records in an operational rather than research environment creates a new set of problems. As satellite sensor technology continues to advance into the next decade, new challenges will arise. The new sensors will have different channel sets, viewing geometries, and orbital characteristics. Their complexity will be an order of magnitude greater than

  9. [Outlier Detection of Time Series Three-Dimensional Fluorescence Spectroscopy].

    PubMed

    Yu, Shao-hui; Zhang, Yu-jun; Zhao, Nan-jing

    2015-06-01

    The qualitative and quantitative analysis of time-series three-dimensional fluorescence spectroscopy is often interfered with by outliers. In this work, an efficient outlier detection method is proposed that takes advantage of characteristics in both the time dimension and the spectral dimension. Firstly, the wavelength points that are most likely outliers are extracted using the variance in the time dimension. Secondly, by analysing how outliers occur and the similarity score of any two samples, a cumulative similarity is introduced in the spectral dimension. Finally, the fluorescence intensity at each wavelength of all samples is modified by the correction matrix in the time dimension, and the outlier detection is completed according to the cumulative similarity scores. The application of the correction matrix in the time dimension not only improves the validity of the method but also reduces the computation through the choice of a characteristic region in the correction matrix. Numerical experiments show that the outliers can still be detected using only 50 percent of the points in the spectral dimension.
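    The cumulative-similarity idea can be loosely sketched as ranking samples by their summed pairwise similarity: a sample that resembles no other sample accumulates a low score and is flagged. Cosine similarity and the small spectra below are illustrative stand-ins; the paper's time-dimension correction matrix and characteristic-region selection are not reproduced here.

```python
def cumulative_similarity_outliers(samples, k=1):
    """Return indices of the k samples with the lowest cumulative similarity.

    Each sample is a flattened spectrum vector; similarity is cosine.
    """
    def cos(a, b):
        num = sum(x * y for x, y in zip(a, b))
        na = sum(x * x for x in a) ** 0.5
        nb = sum(y * y for y in b) ** 0.5
        return num / (na * nb)

    scores = [sum(cos(s, t) for j, t in enumerate(samples) if j != i)
              for i, s in enumerate(samples)]
    order = sorted(range(len(samples)), key=lambda i: scores[i])
    return order[:k]
```

    In the full method these scores are computed after the time-dimension correction, which is what keeps the detection reliable with fewer spectral points.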

  10. Multi-Granular Trend Detection for Time-Series Analysis.

    PubMed

    van Goethem, Arthur; Staals, Frank; Loffler, Maarten; Dykes, Jason; Speckmann, Bettina

    2017-01-01

    Time series (such as stock prices) and ensembles (such as model runs for weather forecasts) are two important types of one-dimensional time-varying data. Such data is readily available in large quantities but visual analysis of the raw data quickly becomes infeasible, even for moderately sized data sets. Trend detection is an effective way to simplify time-varying data and to summarize salient information for visual display and interactive analysis. We propose a geometric model for trend-detection in one-dimensional time-varying data, inspired by topological grouping structures for moving objects in two- or higher-dimensional space. Our model gives provable guarantees on the trends detected and uses three natural parameters: granularity, support-size, and duration. These parameters can be changed on-demand. Our system also supports a variety of selection brushes and a time-sweep to facilitate refined searches and interactive visualization of (sub-)trends. We explore different visual styles and interactions through which trends, their persistence, and evolution can be explored.

  11. Computer Program Recognizes Patterns in Time-Series Data

    NASA Technical Reports Server (NTRS)

    Hand, Charles

    2003-01-01

    A computer program recognizes selected patterns in time-series data like digitized samples of seismic or electrophysiological signals. The program implements an artificial neural network (ANN) and a set of N clocks for the purpose of determining whether N or more instances of a certain waveform, W, occur within a given time interval, T. The ANN must be trained to recognize W in the incoming stream of data. The first time the ANN recognizes W, it sets clock 1 to count down from T to zero; the second time it recognizes W, it sets clock 2 to count down from T to zero, and so forth through the Nth instance. On the N + 1st instance, the cycle is repeated, starting with clock 1. If any clock has not reached zero when it is reset, then N instances of W have been detected within time T, and the program so indicates. The program can readily be encoded in a field-programmable gate array or an application-specific integrated circuit that could be used, for example, to detect electroencephalographic or electrocardiographic waveforms indicative of epileptic seizures or heart attacks, respectively.
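    The clock scheme described above can be sketched in software. This is an illustrative reconstruction, not NASA's gate-array design: each detection claims the next clock in round-robin order, and an alarm fires when the clock set N detections earlier is still counting down, i.e. that earlier detection fell within the last T samples.

```python
class WaveformRateMonitor:
    """Alarm when detections of waveform W cluster within a window of T samples.

    Mirrors the N-clock scheme: the i-th detection sets clock i to T; clocks
    count down once per sample; if a clock is still nonzero when its turn
    comes to be reused, the detection that set it was within the last T samples.
    """

    def __init__(self, n_clocks, t):
        self.clocks = [0] * n_clocks
        self.t = t
        self.next = 0

    def step(self, detected):
        """Advance one sample; return True if the clustering condition is met."""
        self.clocks = [max(0, c - 1) for c in self.clocks]
        alarm = False
        if detected:
            if self.clocks[self.next] > 0:   # clock set N detections ago still running
                alarm = True
            self.clocks[self.next] = self.t
            self.next = (self.next + 1) % len(self.clocks)
        return alarm
```

    With 3 clocks and T = 10, detections every 2 samples trigger the alarm on the fourth detection, while detections spaced 11 samples apart never do.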

  12. Power estimation using simulations for air pollution time-series studies

    PubMed Central

    2012-01-01

    Background Estimation of power to assess associations of interest can be challenging for time-series studies of the acute health effects of air pollution because there are two dimensions of sample size (time-series length and daily outcome counts), and because these studies often use generalized linear models to control for complex patterns of covariation between pollutants and time trends, meteorology and possibly other pollutants. In general, statistical software packages for power estimation rely on simplifying assumptions that may not adequately capture this complexity. Here we examine the impact of various factors affecting power using simulations, with comparison of power estimates obtained from simulations with those obtained using statistical software. Methods Power was estimated for various analyses within a time-series study of air pollution and emergency department visits using simulations for specified scenarios. Mean daily emergency department visit counts, model parameter value estimates and daily values for air pollution and meteorological variables from actual data (8/1/98 to 7/31/99 in Atlanta) were used to generate simulated daily outcome counts with specified temporal associations with air pollutants and randomly generated error based on a Poisson distribution. Power was estimated by conducting analyses of the association between simulated daily outcome counts and air pollution in 2000 data sets for each scenario. Power estimates from simulations and statistical software (G*Power and PASS) were compared. Results In the simulation results, increasing time-series length and average daily outcome counts both increased power to a similar extent. Our results also illustrate the low power that can result from using outcomes with low daily counts or short time series, and the reduction in power that can accompany use of multipollutant models. Power estimates obtained using standard statistical software were very similar to those from the simulations
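    The simulation recipe above can be sketched compactly. In the sketch below, all names and parameter values are illustrative, and the design matrix is a single sinusoidal "pollutant" series rather than real Atlanta data: counts are drawn from a log-linear Poisson model, the model is refit to each simulated data set by Newton's method, and power is the fraction of data sets in which the pollutant coefficient is significant:

```python
import math
import random

def poisson_draw(lam, rng):
    """Knuth's multiplicative method; adequate for the modest rates here."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def poisson_fit(x, y, iters=15):
    """Fit log E[y] = b0 + b1*x by Newton's method; return (b1, se_b1)."""
    b0, b1 = math.log(max(sum(y) / len(y), 1e-9)), 0.0
    for _ in range(iters):
        mu = [math.exp(b0 + b1 * xi) for xi in x]
        g0 = sum(yi - mi for yi, mi in zip(y, mu))
        g1 = sum((yi - mi) * xi for yi, mi, xi in zip(y, mu, x))
        h00 = sum(mu)
        h01 = sum(mi * xi for mi, xi in zip(mu, x))
        h11 = sum(mi * xi * xi for mi, xi in zip(mu, x))
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det
        b1 += (h00 * g1 - h01 * g0) / det
    return b1, math.sqrt(h00 / det)

def simulated_power(x, b0, b1, n_sims=200, z_crit=1.96, seed=1):
    """Fraction of simulated data sets in which b1 is detected."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        y = [poisson_draw(math.exp(b0 + b1 * xi), rng) for xi in x]
        est, se = poisson_fit(x, y)
        hits += abs(est / se) >= z_crit
    return hits / n_sims
```

    Lengthening the series or raising the baseline count `exp(b0)` both raise power, mirroring the study's two sample-size dimensions; the real analyses additionally adjust for time trends and meteorology, which this sketch omits.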

  13. Modified superposition: A simple time series approach to closed-loop manual controller identification

    NASA Technical Reports Server (NTRS)

    Biezad, D. J.; Schmidt, D. K.; Leban, F.; Mashiko, S.

    1986-01-01

    Single-channel pilot manual control output in closed-tracking tasks is modeled in terms of linear discrete transfer functions which are parsimonious and guaranteed stable. The transfer functions are found by applying a modified superposition time series generation technique. A Levinson-Durbin algorithm is used to determine the filter which prewhitens the input, and a projective (least squares) fit of pulse response estimates is used to guarantee identified model stability. Results from two case studies are compared to previous findings, where the source data are relatively short records, approximately 25 seconds long. Time delay effects and pilot seasonalities are discussed and analyzed. It is concluded that single-channel time series controller modeling is feasible on short records, and that it is important for the analyst to determine a criterion for best time domain fit which allows association of model parameter values, such as pure time delay, with actual physical and physiological constraints. The purpose of the modeling is thus paramount.
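    For reference, the Levinson-Durbin recursion mentioned above solves the Yule-Walker equations order by order to obtain the prewhitening filter. A minimal sketch, assuming the autocovariance sequence has already been estimated (the function name is ours):

```python
def levinson_durbin(r):
    """Given autocovariances r[0..p], return the AR coefficients a[1..p]
    and the final prediction-error variance."""
    a = []
    err = r[0]
    for k in range(1, len(r)):
        acc = r[k] - sum(a[j] * r[k - 1 - j] for j in range(len(a)))
        kappa = acc / err                      # reflection coefficient
        a = [ai - kappa * aj for ai, aj in zip(a, reversed(a))] + [kappa]
        err *= (1.0 - kappa * kappa)
    return a, err
```

    The prewhitened (approximately white) input is then the residual e_t = x_t - sum_j a[j] * x_{t-j}.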

  14. Adaptive Sampling of Time Series During Remote Exploration

    NASA Technical Reports Server (NTRS)

    Thompson, David R.

    2012-01-01

    This work deals with the challenge of online adaptive data collection in a time series. A remote sensor or explorer agent adapts its rate of data collection in order to track anomalous events while obeying constraints on time and power. This problem is challenging because the agent has limited visibility (all its datapoints lie in the past) and limited control (it can only decide when to collect its next datapoint). This problem is treated from an information-theoretic perspective, fitting a probabilistic model to collected data and optimizing the future sampling strategy to maximize information gain. The performance characteristics of stationary and nonstationary Gaussian process models are compared. Self-throttling sensors could benefit environmental sensor networks and monitoring as well as robotic exploration. Explorer agents can improve performance by adjusting their data collection rate, preserving scarce power or bandwidth resources during uninteresting times while fully covering anomalous events of interest. For example, a remote earthquake sensor could conserve power by limiting its measurements during normal conditions and increasing its cadence during rare earthquake events. A similar capability could improve sensor platforms traversing a fixed trajectory, such as an exploration rover transect or a deep space flyby. These agents can adapt observation times to improve sample coverage during moments of rapid change. An adaptive sampling approach couples sensor autonomy, instrument interpretation, and sampling. The challenge is addressed as an active learning problem, which already has extensive theoretical treatment in the statistics and machine learning literature. A statistical Gaussian process (GP) model is employed to guide sample decisions that maximize information gain. Nonstationary (e.g., time-varying) covariance relationships permit the system to represent and track local anomalies, in contrast with current GP approaches.
Most common GP models
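    A minimal version of the variance-driven sampling rule can be sketched with a stationary squared-exponential kernel. The kernel choice, length scale, and function names are our illustrative assumptions; the nonstationary kernels that distinguish the actual work are omitted here:

```python
import math

def rbf(a, b, length=1.0):
    """Stationary squared-exponential covariance between two times."""
    return math.exp(-0.5 * ((a - b) / length) ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def posterior_variance(t_obs, t_star, length=1.0, noise=1e-4):
    """GP predictive variance at t_star given samples taken at t_obs."""
    K = [[rbf(a, b, length) + (noise if i == j else 0.0)
          for j, b in enumerate(t_obs)] for i, a in enumerate(t_obs)]
    k_star = [rbf(a, t_star, length) for a in t_obs]
    alpha = solve(K, k_star)
    return rbf(t_star, t_star, length) - sum(k * al for k, al in zip(k_star, alpha))

def next_sample_time(t_obs, candidates, length=1.0):
    """Greedy information-gain rule: sample where predictive variance peaks."""
    return max(candidates, key=lambda t: posterior_variance(t_obs, t, length))
```

    For a Gaussian predictive distribution, expected information gain is monotone in the predictive variance, which is why this greedy maximum-variance rule is a reasonable stand-in for the information-theoretic criterion described above.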

  15. United States Forest Disturbance Trends Observed Using Landsat Time Series

    NASA Technical Reports Server (NTRS)

    Masek, Jeffrey G.; Goward, Samuel N.; Kennedy, Robert E.; Cohen, Warren B.; Moisen, Gretchen G.; Schleeweis, Karen; Huang, Chengquan

    2013-01-01

    Disturbance events strongly affect the composition, structure, and function of forest ecosystems; however, existing U.S. land management inventories were not designed to monitor disturbance. To begin addressing this gap, the North American Forest Dynamics (NAFD) project has examined a geographic sample of 50 Landsat satellite image time series to assess trends in forest disturbance across the conterminous United States for 1985-2005. The geographic sample design used a probability-based scheme to encompass major forest types and maximize geographic dispersion. For each sample location disturbance was identified in the Landsat series using the Vegetation Change Tracker (VCT) algorithm. The NAFD analysis indicates that, on average, 2.77 Mha/yr of forests were disturbed annually, representing 1.09%/yr of US forestland. These satellite-based national disturbance rate estimates tend to be lower than those derived from land management inventories, reflecting both methodological and definitional differences. In particular, the VCT approach used with a biennial time step has limited sensitivity to low-intensity disturbances. Unlike prior satellite studies, our biennial forest disturbance rates vary by nearly a factor of two between high and low years. High western US disturbance rates were associated with active fire years and insect activity, while variability in the east is more strongly related to harvest rates in managed forests. We note that generating a geographic sample based on representing forest type and variability may be problematic since the spatial pattern of disturbance does not necessarily correlate with forest type. We also find that the prevalence of diffuse, non-stand clearing disturbance in US forests makes the application of a biennial geographic sample problematic. Future satellite-based studies of disturbance at regional and national scales should focus on wall-to-wall analyses with an annual time step for improved accuracy.

  16. Nonlinear time series analysis of epileptic human electroencephalogram (EEG)

    NASA Astrophysics Data System (ADS)

    Li, Dingzhou

    The problem of seizure anticipation in patients with epilepsy has attracted significant attention in the past few years. In this paper we discuss two approaches, using methods of nonlinear time series analysis applied to scalp electrode recordings, which are able to distinguish between epochs temporally distant from, and just prior to, the onset of a seizure in patients with temporal lobe epilepsy. First we describe a method involving a comparison of recordings taken from electrodes adjacent to and remote from the site of the seizure focus. In particular, we define a nonlinear quantity which we call marginal predictability. This quantity is computed using data from remote and from adjacent electrodes. We find that the difference between the marginal predictabilities computed for the remote and adjacent electrodes decreases several tens of minutes prior to seizure onset, compared to its value interictally. We also show that these differences in marginal predictability are independent of the behavioral state of the patient. Next we examine the phase coherence between different electrodes at both long range and short range. When the time is distant from seizure onsets ("interictally"), epileptic patients have lower long-range phase coherence in the delta (1-4 Hz) and beta (18-30 Hz) frequency bands compared to nonepileptic subjects. When seizures approach ("preictally"), we observe an increase in phase coherence in the beta band. However, interictally there is no difference in short-range phase coherence between this cohort of patients and non-epileptic subjects. Preictally, short-range phase coherence also increases in the alpha (10-13 Hz) and the beta band. Next we apply the quantity marginal predictability to the phase difference time series. Such marginal predictabilities are lower in the patients than in the non-epileptic subjects. However, when a seizure approaches, the former moves asymptotically towards the latter.

  17. SAGE: A tool for time-series analysis of Greenland

    NASA Astrophysics Data System (ADS)

    Duerr, R. E.; Gallaher, D. W.; Khalsa, S. S.; Lewis, S.

    2011-12-01

    The National Snow and Ice Data Center (NSIDC) has developed an operational tool for analysis. This production tool is known as "Services for the Analysis of the Greenland Environment" (SAGE). Using an integrated workspace approach, a researcher has the ability to find relevant data and perform various analysis functions on the data, as well as retrieve the data and analysis results. While there continues to be compelling observational evidence for increased surface melting and rapid thinning along the margins of the Greenland ice sheet, there are still uncertainties with respect to estimates of mass balance of Greenland's ice sheet as a whole. To better understand the dynamics of these issues, it is important for scientists to have access to a variety of datasets from multiple sources, and to be able to integrate and analyze the data. SAGE provides data from various sources, such as AMSR-E and AVHRR datasets, which can be analyzed individually through various time-series plots and aggregation functions; or they can be analyzed together with scatterplots or overlaid time-series plots to provide quick and useful results to support various research products. The application is available at http://nsidc.org/data/sage/. SAGE was built on top of NSIDC's existing Searchlight engine. The SAGE interface gives users access to much of NSIDC's relevant Greenland raster data holdings, as well as data from outside sources. Additionally, various web services provide access for other clients to utilize the functionality that the SAGE interface provides. Combined, these methods of accessing the tool allow scientists the ability to devote more of their time to their research, and less on trying to find and retrieve the data they need.

  18. Acoustic thermometry time series in the North Pacific

    NASA Astrophysics Data System (ADS)

    Dushaw, B. D.; Howe, B. M.; Mercer, J. A.; Worcester, P. F.; NPAL Group*

    2002-12-01

    Acoustic measurements of large-scale, depth-averaged temperatures are continuing in the North Pacific as a follow on to the Acoustic Thermometry of Ocean Climate (ATOC) project. An acoustic source is located just north of Kauai. It transmits to six receivers to the east at 1-4-Mm ranges and one receiver to the northwest at about 4-Mm range. The transmission schedule is six times per day at four-day intervals. The time series were obtained from 1998 through 1999 and, after a two-year interruption because of permitting issues, began again in January 2002 to continue for at least another five years. The intense mesoscale thermal variability around Hawaii is evident in all time series; this variability is much greater than that observed near the California coast. The paths to the east, particularly those paths to the California coast, show cooling this year relative to the earlier data. The path to the northwest shows a modest warming. The acoustic rays sample depths below the mixed layer near Hawaii and to the surface as they near the California coast or extend north of the sub-arctic front. The temperatures measured acoustically are compared with those inferred from TOPEX altimetry, ARGO float data, and with ECCO (Estimating the Circulation and Climate of the Ocean) model output. This on-going data collection effort, to be augmented over the next years with a more complete observing array, can be used for, e.g., separating whole-basin climate change from low-mode spatial variability such as the Pacific Decadal Oscillation (PDO). [*NPAL (North Pacific Acoustic Laboratory) Group: J. A. Colosi, B. D. Cornuelle, B. D. Dushaw, M. A. Dzieciuch, B. M. Howe, J. A. Mercer, R. C. Spindel, and P. F. Worcester. Work supported by the Office of Naval Research.]

  19. Established time series measure occurrence and frequency of episodic events.

    NASA Astrophysics Data System (ADS)

    Pebody, Corinne; Lampitt, Richard

    2015-04-01

    Established time series measure occurrence and frequency of episodic events. Episodic flux events occur in open oceans. Time series making measurements over significant time scales are one of the few methods that can capture these events and compare their impact with 'normal' flux. Seemingly rare events may be significant on local scales, but without the ability to measure the extent of flux on spatial and temporal scales, combined with the frequency of occurrence, it is difficult to constrain their impact. The Porcupine Abyssal Plain Sustained Observatory (PAP-SO) in the Northeast Atlantic (49 °N 16 °W, 5000m water depth) has measured particle flux since 1989 and zooplankton swimmers since 2000. Sediment traps at 3000m and 100 metres above bottom collect material year round, and we have identified close links between zooplankton and particle flux. Some of these larger animals, for example Diacria trispinosa, make a significant contribution to carbon flux through episodic flux events. D. trispinosa is a euthecosome mollusc which occurs in the Northeast Atlantic, though the PAP-SO is towards the northern limit of its distribution. Pteropods have an aragonite shell enclosing the soft body parts, except for the muscular foot, which extends beyond the mouth of the living animal. Pteropods, both live-on-entry animals and empty shells, are found year round in the 3000m trap. Generally the abundance varies with particle flux, but within that general pattern there are episodic events where significant numbers of these animals, containing both organic and inorganic carbon, are captured at depth and therefore could be defined as contributing to export flux. Whether the pulse of animals is a result of the life cycle of D. trispinosa or of the effects of the physics of the water column is unclear, but the complexity of the PAP-SO enables us not only to collect these animals but to examine them in parallel to the biogeochemical and physical elements measured by the

  20. Aerosol Climate Time Series in ESA Aerosol_cci

    NASA Astrophysics Data System (ADS)

    Popp, Thomas; de Leeuw, Gerrit; Pinnock, Simon

    2016-04-01

    Within the ESA Climate Change Initiative (CCI), Aerosol_cci (2010 - 2017) conducts intensive work to improve algorithms for the retrieval of aerosol information from European sensors. Meanwhile, full mission time series of 2 GCOS-required aerosol parameters are completely validated and released: Aerosol Optical Depth (AOD) from dual view ATSR-2 / AATSR radiometers (3 algorithms, 1995 - 2012), and stratospheric extinction profiles from star occultation GOMOS spectrometer (2002 - 2012). Additionally, a 35-year multi-sensor time series of the qualitative Absorbing Aerosol Index (AAI) together with sensitivity information and an AAI model simulator is available. Complementary aerosol properties requested by GCOS are in a "round robin" phase, where various algorithms are inter-compared: fine mode AOD, mineral dust AOD (from the thermal IASI spectrometer, but also from ATSR instruments and the POLDER sensor), absorption information and aerosol layer height. As a quasi-reference for validation in few selected regions with sparse ground-based observations the multi-pixel GRASP algorithm for the POLDER instrument is used. Validation of first dataset versions (vs. AERONET, MAN) and inter-comparison to other satellite datasets (MODIS, MISR, SeaWIFS) proved the high quality of the available datasets comparable to other satellite retrievals and revealed needs for algorithm improvement (for example for higher AOD values) which were taken into account for a reprocessing. The datasets contain pixel level uncertainty estimates which were also validated and improved in the reprocessing. For the three ATSR algorithms the use of an ensemble method was tested. The paper will summarize and discuss the status of dataset reprocessing and validation. The focus will be on the ATSR, GOMOS and IASI datasets. Pixel level uncertainties validation will be summarized and discussed including unknown components and their potential usefulness and limitations.
Opportunities for time series extension

  1. Event-sequence time series analysis in ground-based gamma-ray astronomy

    SciTech Connect

    Barres de Almeida, U.; Chadwick, P.; Daniel, M.; Nolan, S.; McComb, L.

    2008-12-24

    The recent, extreme episodes of variability detected from Blazars by the leading atmospheric Cerenkov experiments motivate the development and application of specialized statistical techniques that enable the study of this rich data set to its furthest extent. The identification of the shortest variability timescales supported by the data and the actual variability structure observed in the light curves of these sources are some of the fundamental aspects being studied, questions whose answers can bring new developments in the understanding of the physics of these objects and of the mechanisms of production of VHE gamma-rays in the Universe. Some of our efforts in studying the time variability of VHE sources involve the application of dynamic programming algorithms to the problem of detecting change-points in a Poisson sequence. In this particular paper we concentrate on the more primary issue of the applicability of counting statistics to the analysis of time series in VHE gamma-ray astronomy.
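    One standard dynamic-programming formulation of Poisson change-point detection is "optimal partitioning" with a per-segment penalty; the abstract does not state the authors' exact algorithm, so the sketch below is a generic version of the idea:

```python
import math

def seg_cost(total, n):
    """Negative Poisson log-likelihood (up to a constant) of a segment with
    `total` counts over n bins, evaluated at its MLE rate total/n."""
    return 0.0 if total == 0 else total - total * math.log(total / n)

def poisson_changepoints(counts, penalty):
    """Minimize the summed segment costs plus a penalty per segment by
    dynamic programming in O(n^2); returns change-point indices."""
    n = len(counts)
    prefix = [0] * (n + 1)
    for i, c in enumerate(counts):
        prefix[i + 1] = prefix[i] + c
    best = [0.0] * (n + 1)
    back = [0] * (n + 1)
    for t in range(1, n + 1):
        best[t], back[t] = min(
            (best[s] + seg_cost(prefix[t] - prefix[s], t - s) + penalty, s)
            for s in range(t))
    cps, t = [], n
    while t > 0:          # follow the back-pointers to recover the segments
        s = back[t]
        if s > 0:
            cps.append(s)
        t = s
    return sorted(cps)
```

    The penalty plays the role of a model-complexity prior: larger values demand a bigger likelihood gain before an extra change-point is accepted.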

  2. VARTOOLS: A program for analyzing astronomical time-series data

    NASA Astrophysics Data System (ADS)

    Hartman, J. D.; Bakos, G. Á.

    2016-10-01

    This paper describes the VARTOOLS program, which is an open-source command-line utility, written in C, for analyzing astronomical time-series data, especially light curves. The program provides a general-purpose set of tools for processing light curves including signal identification, filtering, light curve manipulation, time conversions, and modeling and simulating light curves. Some of the routines implemented include the Generalized Lomb-Scargle periodogram, the Box-Least Squares transit search routine, the Analysis of Variance periodogram, the Discrete Fourier Transform including the CLEAN algorithm, the Weighted Wavelet Z-Transform, light curve arithmetic, linear and non-linear optimization of analytic functions including support for Markov Chain Monte Carlo analyses with non-trivial covariances, characterizing and/or simulating time-correlated noise, and the TFA and SYSREM filtering algorithms, among others. A mechanism is also provided for incorporating a user's own compiled processing routines into the program. VARTOOLS is designed especially for batch processing of light curves, including built-in support for parallel processing, making it useful for large time-domain surveys such as searches for transiting planets. Several examples are provided to illustrate the use of the program.

  3. Simulation of estimating periodicity of seasonally stationary time series

    SciTech Connect

    Tian, C.J.

    1984-06-01

    Herein, some common periodicity estimation methods are considered: the periodogram analysis, the maximum entropy spectral method, the successive average method, as well as the graphic method. For comparing these methods and verifying their practical efficiency, simulations are performed on several groups of seasonally stationary time series generated by the model x(t) = v(t) + z(t), with v(t) being a seasonal component of different forms (sinusoid, unequal-amplitude oscillation, slope signal, exponential decay signal, block signal, etc.) and z(t) being an autoregressive process under different levels of signal-to-noise ratio. Computational results comprehensively illustrate that the successive average method is easier to carry out and more efficient in practice.

  4. Optimal estimation of recurrence structures from time series

    NASA Astrophysics Data System (ADS)

    beim Graben, Peter; Sellers, Kristin K.; Fröhlich, Flavio; Hutt, Axel

    2016-05-01

    Recurrent temporal dynamics is a phenomenon observed frequently in high-dimensional complex systems and its detection is a challenging task. Recurrence quantification analysis utilizing recurrence plots may extract such dynamics; however, it still encounters an unsolved pertinent problem: the optimal selection of distance thresholds for estimating the recurrence structure of dynamical systems. The present work proposes a stochastic Markov model for the recurrent dynamics that allows for the analytical derivation of a criterion for the optimal distance threshold. The goodness of fit is assessed by a utility function which assumes a local maximum for that threshold reflecting the optimal estimate of the system's recurrence structure. We validate our approach by means of the nonlinear Lorenz system and its linearized stochastic surrogates. The final application to neurophysiological time series obtained from anesthetized animals illustrates the method and reveals novel dynamic features of the underlying system. We propose the number of optimal recurrence domains as a statistic for classifying an animal's state of consciousness.
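    For context, the object whose distance threshold is being optimized is the recurrence plot. A minimal construction for a scalar series is sketched below (the paper's actual contribution, the Markov-model criterion for choosing eps, is beyond this sketch):

```python
def recurrence_matrix(x, eps):
    """R[i][j] = 1 when |x[i] - x[j]| < eps, for a scalar time series;
    for embedded vectors one would use a vector norm instead."""
    return [[1 if abs(xi - xj) < eps else 0 for xj in x] for xi in x]

def recurrence_rate(R):
    """Fraction of recurrent pairs; it grows monotonically with eps,
    which is why the threshold choice matters."""
    n = len(R)
    return sum(map(sum, R)) / (n * n)
```

    Too small a threshold yields an almost empty plot, too large a threshold an almost full one; the criterion derived in the paper picks the value in between that best reflects the underlying recurrence structure.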

  5. WENDEC: a deconvolution program for processing hormone time-series.

    PubMed

    De Nicolao, G; De Nicolao, A

    1995-08-01

    The estimation of the glandular secretory rate from time-series of hormone concentration in plasma can be formulated as a deconvolution problem. In particular, the paper addresses the analysis of frequently sampled data collected in order to study spontaneous pulsatile secretion. Standard deconvolution methods do not allow for the non-negativity constraint and the presence of high-frequency components in the secretory rate. In order to overcome the intrinsic ill-conditioning of the problem, the maximum entropy method is used to obtain a probabilistic representation of the prior knowledge concerning the unknown secretory signal, thus leading to a White Exponential Noise (WEN) model. The deconvolution problem is then posed within a Bayesian framework and solved by means of Maximum-A-Posteriori estimation. The program that implements the algorithm handles non-negativity constraints, provides confidence intervals, and is computationally and memory efficient.

  6. Time series analysis for minority game simulations of financial markets

    NASA Astrophysics Data System (ADS)

    Ferreira, Fernando F.; Francisco, Gerson; Machado, Birajara S.; Muruganandam, Paulsamy

    2003-04-01

    The minority game (MG) model introduced recently provides promising insights into the understanding of the evolution of prices, indices and rates in the financial markets. In this paper we perform a time series analysis of the model employing tools from statistics, dynamical systems theory and stochastic processes. Using benchmark systems and a financial index for comparison, several conclusions are obtained about the generating mechanism for this kind of evolution. The motion is deterministic, driven by occasional random external perturbation. When the interval between two successive perturbations is sufficiently large, one can find low dimensional chaos in this regime. However, the full motion of the MG model is found to be similar to that of the first differences of the SP500 index: stochastic, nonlinear and (unit root) stationary.

  7. SPITZER IRAC PHOTOMETRY FOR TIME SERIES IN CROWDED FIELDS

    SciTech Connect

    Novati, S. Calchi; Beichman, C.; Gould, A.; Fausnaugh, M.; Gaudi, B. S.; Pogge, R. W.; Wibking, B.; Zhu, W.; Poleski, R.; Yee, J. C.; Bryden, G.; Henderson, C. B.; Shvartzvald, Y.; Carey, S.; Udalski, A.; Pawlak, M.; Szymański, M. K.; Skowron, J.; Mróz, P.; Kozłowski, S.; Collaboration: Spitzer team; OGLE group; and others

    2015-12-01

    We develop a new photometry algorithm that is optimized for the Infrared Array Camera (IRAC) Spitzer time series in crowded fields and that is particularly adapted to faint or heavily blended targets. We apply this to the 170 targets from the 2015 Spitzer microlensing campaign and present the results of three variants of this algorithm in an online catalog. We present detailed accounts of the application of this algorithm to two difficult cases, one very faint and the other very crowded. Several of Spitzer's instrumental characteristics that drive the specific features of this algorithm are shared by Kepler and WFIRST, implying that these features may prove to be a useful starting point for algorithms designed for microlensing campaigns by these other missions.

  8. Detecting nonstationarity and state transitions in a time series

    NASA Astrophysics Data System (ADS)

    Gao, J. B.

    2001-06-01

    One cause of complexity in a time series may be nonstationarity and transience. In this paper, we analyze the nonstationarity and transience in a number of dynamical systems. We find that the nonstationarity in the metastable chaotic Lorenz system is due to nonrecurrence. The latter determines a lack of fractal structure in the signal. In 1/fα noise, we find that the associated correlation dimensions are local graph dimensions calculated from sojourn points. We also design a transient Lorenz system with a slowly oscillating controlling parameter, and a transient Rossler system with a slowly linearly increasing parameter, with parameter ranges covering a sequence of chaotic dynamics with increased phase incoherence. State transitions, from periodic to chaotic, and vice versa, are identified, together with different facets of nonstationarity in each phase.

  9. Assessment of Time Series Complexity Using Improved Approximate Entropy

    NASA Astrophysics Data System (ADS)

    Kong, De-Ren; Xie, Hong-Bo

    2011-09-01

    Approximate entropy (ApEn), a measure quantifying complexity and/or regularity, is believed to be an effective method of analyzing diverse settings. However, the similarity definition of vectors based on the Heaviside function may cause some problems in the validity and accuracy of ApEn. To overcome these problems, an improved approximate entropy (iApEn) based on the sigmoid function is proposed. The performance of iApEn is tested on independent identically distributed (IID) Gaussian noise, the MIX stochastic model, the Rossler map, the logistic map, and the high-dimensional Mackey-Glass oscillator. The results show that iApEn is superior to ApEn in several aspects, including better relative consistency, freedom of parameter selection, robustness to noise, and less dependence on record length when characterizing time series with different complexities.

  10. Forecasting electricity usage using univariate time series models

    NASA Astrophysics Data System (ADS)

    Hock-Eam, Lim; Chee-Yin, Yip

    2014-12-01

    Electricity is one of the important energy sources. A sufficient supply of electricity is vital to support a country's development and growth. Due to changing socio-economic characteristics, increasing competition and deregulation of the electricity supply industry, electricity demand forecasting is even more important than before. It is imperative to evaluate and compare the predictive performance of various forecasting methods. This will provide further insights on the weaknesses and strengths of each method. In the literature, there is mixed evidence on the best forecasting methods for electricity demand. This paper aims to compare the predictive performance of univariate time series models for forecasting electricity demand using monthly data on maximum electricity load in Malaysia from January 2003 to December 2013. Results reveal that the Box-Jenkins method produces the best out-of-sample predictive performance. On the other hand, the Holt-Winters exponential smoothing method is a good forecasting method for in-sample predictive performance.
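    A minimal additive Holt-Winters smoother, one of the two methods compared above, can be sketched as follows. The initialization scheme and smoothing constants below are common textbook choices, not the paper's settings:

```python
def holt_winters_additive(y, season, alpha, beta, gamma, horizon):
    """One-pass additive Holt-Winters forecast; level/trend/seasonals are
    initialized from the first two seasons (a common simple choice)."""
    level = sum(y[:season]) / season
    trend = (sum(y[season:2 * season]) - sum(y[:season])) / (season * season)
    seas = [y[i] - level for i in range(season)]
    for i in range(len(y)):
        s = seas[i % season]
        prev_level = level
        level = alpha * (y[i] - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        seas[i % season] = gamma * (y[i] - level) + (1 - gamma) * s
    return [level + (h + 1) * trend + seas[(len(y) + h) % season]
            for h in range(horizon)]
```

    On a noise-free trend-plus-seasonal series the recursion converges toward the true components, so out-of-sample forecasts land close to the continuation of the pattern; on real load data the smoothing constants would be tuned, e.g. by minimizing in-sample squared error.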

  11. Time-series analysis of Campylobacter incidence in Switzerland.

    PubMed

    Wei, W; Schüpbach, G; Held, L

    2015-07-01

    Campylobacteriosis has been the most common food-associated notifiable infectious disease in Switzerland since 1995. Contact with and ingestion of raw or undercooked broilers are considered the dominant risk factors for infection. In this study, we investigated the temporal relationship between the disease incidence in humans and the prevalence of Campylobacter in broilers in Switzerland from 2008 to 2012. We use a time-series approach to describe the pattern of the disease by incorporating seasonal effects and autocorrelation. The analysis shows that prevalence of Campylobacter in broilers, with a 2-week lag, has a significant impact on disease incidence in humans. Therefore Campylobacter cases in humans can be partly explained by contagion through broiler meat. We also found a strong autoregressive effect in human illness, and a significant increase of illness during Christmas and New Year's holidays. In a final analysis, we corrected for the sampling error of prevalence in broilers and the results gave similar conclusions.

  12. Optimizing functional network representation of multivariate time series.

    PubMed

    Zanin, Massimiliano; Sousa, Pedro; Papo, David; Bajo, Ricardo; García-Prieto, Juan; del Pozo, Francisco; Menasalvas, Ernestina; Boccaletti, Stefano

    2012-01-01

    By combining complex network theory and data mining techniques, we provide objective criteria for optimization of the functional network representation of generic multivariate time series. In particular, we propose a method for the principled selection of the threshold value for functional network reconstruction from raw data, and for proper identification of the network's indicators that unveil the most discriminative information on the system for classification purposes. We illustrate our method by analysing networks of functional brain activity of healthy subjects, and patients suffering from Mild Cognitive Impairment, an intermediate stage between the expected cognitive decline of normal aging and the more pronounced decline of dementia. We discuss extensions of the scope of the proposed methodology to network engineering purposes, and to other data mining tasks.

  13. Practical measures of integrated information for time-series data.

    PubMed

    Barrett, Adam B; Seth, Anil K

    2011-01-20

    A recent measure of 'integrated information', Φ(DM), quantifies the extent to which a system generates more information than the sum of its parts as it transitions between states, possibly reflecting levels of consciousness generated by neural systems. However, Φ(DM) is defined only for discrete Markov systems, which are unusual in biology; as a result, Φ(DM) can rarely be measured in practice. Here, we describe two new measures, Φ(E) and Φ(AR), that overcome these limitations and are easy to apply to time-series data. We use simulations to demonstrate the in-practice applicability of our measures, and to explore their properties. Our results provide new opportunities for examining information integration in real and model systems and carry implications for relations between integrated information, consciousness, and other neurocognitive processes. However, our findings pose challenges for theories that ascribe physical meaning to the measured quantities.

  14. Optimizing Functional Network Representation of Multivariate Time Series

    NASA Astrophysics Data System (ADS)

    Zanin, Massimiliano; Sousa, Pedro; Papo, David; Bajo, Ricardo; García-Prieto, Juan; Pozo, Francisco Del; Menasalvas, Ernestina; Boccaletti, Stefano

    2012-09-01

    By combining complex network theory and data mining techniques, we provide objective criteria for optimization of the functional network representation of generic multivariate time series. In particular, we propose a method for the principled selection of the threshold value for functional network reconstruction from raw data, and for proper identification of the network's indicators that unveil the most discriminative information on the system for classification purposes. We illustrate our method by analysing networks of functional brain activity of healthy subjects, and patients suffering from Mild Cognitive Impairment, an intermediate stage between the expected cognitive decline of normal aging and the more pronounced decline of dementia. We discuss extensions of the scope of the proposed methodology to network engineering purposes, and to other data mining tasks.

  15. Efficient Bayesian inference for natural time series using ARFIMA processes

    NASA Astrophysics Data System (ADS)

    Graves, Timothy; Gramacy, Robert; Franzke, Christian; Watkins, Nicholas

    2016-04-01

    Many geophysical quantities, such as atmospheric temperature, water levels in rivers, and wind speeds, have shown evidence of long memory (LM). LM implies that these quantities experience non-trivial temporal memory, which potentially not only enhances their predictability, but also hampers the detection of externally forced trends. Thus, it is important to reliably identify whether or not a system exhibits LM. We present a modern and systematic approach to the inference of LM. We use the flexible autoregressive fractional integrated moving average (ARFIMA) model, which is widely used in time series analysis, and of increasing interest in climate science. Unlike most previous work on the inference of LM, which is frequentist in nature, we provide a systematic treatment of Bayesian inference. In particular, we provide a new approximate likelihood for efficient parameter inference, and show how nuisance parameters (e.g., short-memory effects) can be integrated over in order to focus on long-memory parameters and hypothesis testing more directly. We illustrate our new methodology on the Nile water level data and the central England temperature (CET) time series, with favorable comparison to the standard estimators [1]. In addition we show how the method can be used to perform joint inference of the stability exponent and the memory parameter when ARFIMA is extended to allow for alpha-stable innovations. Such models can be used to study systems where heavy tails and long range memory coexist. [1] Graves et al, Nonlin. Processes Geophys., 22, 679-700, 2015; doi:10.5194/npg-22-679-2015.
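The fractional-integration core of ARFIMA can be made concrete through the binomial expansion of (1 - B)^d. The sketch below (plain NumPy, illustrative only, not the authors' Bayesian machinery) computes the differencing weights from the standard recurrence and applies them with an expanding window.

```python
import numpy as np

def frac_diff_weights(d, n):
    """Coefficients of (1 - B)^d up to lag n-1, via the recurrence
    w_0 = 1, w_j = w_{j-1} * (j - 1 - d) / j."""
    w = np.empty(n)
    w[0] = 1.0
    for j in range(1, n):
        w[j] = w[j - 1] * (j - 1 - d) / j
    return w

def frac_diff(x, d):
    """Apply fractional differencing to a series (expanding window)."""
    w = frac_diff_weights(d, len(x))
    return np.array([w[:t + 1][::-1] @ x[:t + 1] for t in range(len(x))])
```

For 0 < d < 0.5 the weights decay hyperbolically rather than geometrically, which is exactly the long-memory signature the abstract refers to; d = 1 recovers ordinary first differencing.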

  16. Controlled, distributed data management of an Antarctic time series

    NASA Astrophysics Data System (ADS)

    Leadbetter, Adam; Connor, David; Cunningham, Nathan; Reynolds, Sarah

    2010-05-01

    The Rothera Time Series (RaTS) presents over ten years of oceanographic data collected off the Antarctic Peninsula comprising conductivity, temperature, depth cast data; current meter data; and bottle sample data. The data set has been extensively analysed and is well represented in the scientific literature. However, it has never been available to browse as a coherent entity. Work has been undertaken by both the data collecting organisation (the British Antarctic Survey, BAS) and the associated national data centre (the British Oceanographic Data Centre, BODC) to describe the parameters comprising the dataset in a consistent manner. To this end, each data point in the RaTS dataset has now been ascribed a parameter usage term, selected from the appropriate controlled vocabulary of the Natural Environment Research Council's Data Grid (NDG). By marking up the dataset in this way the semantic richness of the NDG vocabularies is fully accessible, and the dataset can be then explored using the Global Change Master Directory keyword set, the International Standards Organisation topic categories, SeaDataNet disciplines and agreed parameter groups, and the NDG parameter discovery vocabulary. We present a single data discovery and exploration tool, a web portal which allows the user to drill down through the dataset using their chosen keyword set. The spatial coverage of the chosen data is displayed through a Google Earth web plugin. Finally, as the time series data are held at BODC and the discrete sample data held at BAS (which are separate physical locations), a mechanism has been established to provide metadata from one site to another. This takes the form of an Open Geospatial Consortium Web Map Service server at BODC feeding information into the portal hosted at BAS.

  17. Short-term prediction of solar irradiance using time-series analysis

    SciTech Connect

    Chowdhury, B.H. (Dept. of Electrical Engineering)

    1990-01-01

    A new statistical model for solar irradiance prediction is described. The method makes use of atmospheric parameterizations as well as a time-series model to forecast a sequence of global irradiance in the 3--10 min time frame. A survey of some of the prominent research of the recent past reveals a definite lack of irradiance models that approach subhourly intervals, especially in the range mentioned. In this article, accurate parameterizations of atmospheric phenomena are used in a prewhitening process so that a time-series model may be used effectively to forecast irradiance components up to an hour in advance at 3--10 min time intervals. The model requires only previous global horizontal irradiance measurements at a site. Results show that, when compared with actual data at two locations in the southeastern United States, the forecasts are quite accurate, and the model is site-independent. In some instances, forecasts may be inaccurate when there are sudden transitional changes in the cloud cover moving across the sun. In order for the proposed irradiance model to predict such transitional changes correctly, frequent forecast updates become necessary.
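The prewhitening-plus-time-series idea can be sketched as follows; the flat clear-sky profile, the AR(1) order, and all numeric values are assumptions for illustration, not the paper's atmospheric parameterizations. The measured irradiance is divided by a clear-sky estimate, the resulting clearness index is modeled with a least-squares AR(1), and forecasts are re-scaled back to irradiance.

```python
import numpy as np

def ar1_fit(x):
    """Least-squares AR(1) fit: x_t ~ phi * x_{t-1} + c."""
    X = np.vstack([x[:-1], np.ones(len(x) - 1)]).T
    phi, c = np.linalg.lstsq(X, x[1:], rcond=None)[0]
    return phi, c

def forecast(x, phi, c, steps):
    """Iterate the fitted AR(1) forward from the last observation."""
    out, last = [], x[-1]
    for _ in range(steps):
        last = phi * last + c
        out.append(last)
    return np.array(out)

# prewhiten measured irradiance by a clear-sky estimate (assumed given),
# leaving a clearness-index series k_t = G_t / G_clear,t
G_clear = np.full(200, 800.0)            # hypothetical flat clear-sky value, W/m^2
rng = np.random.default_rng(1)
k = 0.7 + 0.002 * rng.standard_normal(200).cumsum()
G = k * G_clear
phi, c = ar1_fit(G / G_clear)
k_hat = forecast(G / G_clear, phi, c, steps=3)
G_hat = k_hat * G_clear[:3]              # re-color forecasts back to irradiance
```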

  18. A framework for periodic outlier pattern detection in time-series sequences.

    PubMed

    Rasheed, Faraz; Alhajj, Reda

    2014-05-01

    Periodic pattern detection in time-ordered sequences is an important data mining task, which discovers in the time series all patterns that exhibit temporal regularities. Periodic pattern mining has a large number of applications in real life; it helps understanding the regular trend of the data along time, and enables the forecast and prediction of future events. An interesting related and vital problem that has not received enough attention is to discover outlier periodic patterns in a time series. Outlier patterns are defined as those which are different from the rest of the patterns; outliers are not noise. While noise does not belong to the data and it is mostly eliminated by preprocessing, outliers are actual instances in the data but have exceptional characteristics compared with the majority of the other instances. Outliers are unusual patterns that rarely occur, and, thus, have lesser support (frequency of appearance) in the data. Outlier patterns may hint toward discrepancy in the data such as fraudulent transactions, network intrusion, change in customer behavior, recession in the economy, epidemic and disease biomarkers, severe weather conditions like tornados, etc. We argue that detecting the periodicity of outlier patterns might be more important in many sequences than the periodicity of regular, more frequent patterns. In this paper, we present a robust and time efficient suffix tree-based algorithm capable of detecting the periodicity of outlier patterns in a time series by giving more significance to less frequent yet periodic patterns. Several experiments have been conducted using both real and synthetic data; all aspects of the proposed approach are compared with the existing algorithm InfoMiner; the reported results demonstrate the effectiveness and applicability of the proposed approach.
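The notion of a periodic pattern with an associated confidence can be illustrated with a toy single-symbol version (this is not the paper's suffix-tree algorithm; the sequence and parameters below are invented): a symbol is periodic with period p and phase r if it occurs at most of the positions r, r+p, r+2p, ….

```python
def periodicity_confidence(seq, symbol, period, phase):
    """Fraction of the expected positions (phase, phase+period, ...) at
    which `symbol` actually occurs -- the usual confidence of a periodic
    single-symbol pattern."""
    positions = list(range(phase, len(seq), period))
    hits = sum(seq[i] == symbol for i in positions)
    return hits / len(positions) if positions else 0.0

# 'f' is a rare (outlier) symbol, yet it recurs perfectly every 7 steps
seq = list("abcabcf" * 10)
conf = periodicity_confidence(seq, "f", period=7, phase=6)
print(conf)  # 1.0
```

Here `f` has low support (10 of 70 symbols) but perfect periodicity, which is exactly the kind of rare-yet-periodic pattern the abstract argues should not be discarded.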

  19. Blind source separation problem in GPS time series

    NASA Astrophysics Data System (ADS)

    Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.

    2016-04-01

    A critical point in the analysis of ground displacement time series, such as those recorded by space geodetic techniques, is the development of data-driven methods that allow the different sources of deformation to be discerned and characterized in the space and time domains. Multivariate statistics includes several approaches that can be considered as part of data-driven methods. A widely used technique is the principal component analysis (PCA), which allows us to reduce the dimensionality of the data space while retaining most of the explained variance of the dataset. However, PCA does not perform well in finding the solution to the so-called blind source separation (BSS) problem, i.e., in recovering and separating the original sources that generate the observed data. This is mainly due to the fact that PCA minimizes the misfit calculated using an L2 norm (χ2), looking for a new Euclidean space where the projected data are uncorrelated. The independent component analysis (ICA) is a popular technique adopted to approach the BSS problem. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we test the use of a modified variational Bayesian ICA (vbICA) method to recover the multiple sources of ground deformation even in the presence of missing data. The vbICA method models the probability density function (pdf) of each source signal using a mix of Gaussian distributions, allowing for more flexibility in the description of the pdf of the sources with respect to standard ICA, and giving a more reliable estimate of them. Here we present its application to synthetic global positioning system (GPS) position time series, generated by simulating deformation near an active fault, including inter-seismic, co-seismic, and post-seismic signals, plus seasonal signals and noise, and an additional time-dependent volcanic source. We evaluate the ability of the PCA and ICA decomposition
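The PCA step of such a decomposition can be sketched with plain NumPy (the vbICA method itself is beyond this illustration; the station geometry, source shapes, and noise level below are invented): synthetic "stations" are random mixtures of a seasonal and a post-seismic source, and the SVD of the centered data gives decorrelated components ordered by explained variance.

```python
import numpy as np

t = np.linspace(0, 4, 400)                       # time, years
seasonal = 2.0 * np.sin(2 * np.pi * t)           # mm, annual term
postseis = 3.0 * np.log1p(np.maximum(t - 2, 0))  # mm, decay after an event at t=2
rng = np.random.default_rng(2)

# 5 synthetic stations = random mixtures of the two sources + small noise
A = rng.random((5, 2))
X = A @ np.vstack([seasonal, postseis]) + 0.05 * rng.standard_normal((5, 400))

Xc = X - X.mean(axis=1, keepdims=True)           # center each station
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
var_explained = S**2 / (S**2).sum()
# The first two PCs capture nearly all variance, but each PC is generally
# still a mixture of the seasonal and post-seismic sources -- decorrelation
# alone does not solve the BSS problem, which is the abstract's point.
```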

  20. Streamflow properties from time series of surface velocity and stage

    USGS Publications Warehouse

    Plant, W.J.; Keller, W.C.; Hayes, K.; Spicer, K.

    2005-01-01

    Time series of surface velocity and stage have been collected simultaneously. Surface velocity was measured using an array of newly developed continuous-wave microwave sensors. Stage was obtained from the standard U.S. Geological Survey (USGS) measurements. The depth of the river was measured several times during our experiments using sounding weights. The data clearly showed that the point of zero flow was not the bottom at the measurement site, indicating that a downstream control exists. Fathometer measurements confirmed this finding. A model of the surface velocity expected at a site having a downstream control was developed. The model showed that the standard form for the friction velocity does not apply to sites where a downstream control exists. This model fit our measured surface velocity versus stage plots very well with reasonable values of the parameters. Discharges computed using the surface velocities and measured depths matched the USGS rating curve for the site. Values of depth-weighted mean velocities derived from our data did not agree with those expected from Manning's equation due to the downstream control. These results suggest that if real-time surface velocities were available at a gauging station, unstable stream beds could be monitored. Journal of Hydraulic Engineering © ASCE.
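The basic conversion from surface velocity to discharge can be sketched as follows; the coefficient used here is a commonly assumed surface-to-depth-mean velocity ratio, not the site-specific model the authors fit, and the numeric values are invented.

```python
def discharge(surface_velocity, area, k=0.85):
    """Estimate discharge Q = k * v_surface * A, where k is an assumed
    surface-to-depth-mean velocity coefficient (0.85 is a common default,
    not the value derived in the study)."""
    return k * surface_velocity * area

Q = discharge(surface_velocity=1.2, area=50.0)  # v in m/s, A in m^2, Q in m^3/s
```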

  1. Predicting Time Series Outputs and Time-to-Failure for an Aircraft Controller Using Bayesian Modeling

    NASA Technical Reports Server (NTRS)

    He, Yuning

    2015-01-01

    Safety of unmanned aerial systems (UAS) is paramount, but the large number of dynamically changing controller parameters makes it hard to determine if the system is currently stable, and the time before loss of control if not. We propose a hierarchical statistical model using Treed Gaussian Processes to predict (i) whether a flight will be stable (success) or become unstable (failure), (ii) the time-to-failure if unstable, and (iii) time series outputs for flight variables. We first classify the current flight input into success or failure types, and then use separate models for each class to predict the time-to-failure and time series outputs. As different inputs may cause failures at different times, we have to model variable length output curves. We use a basis representation for curves and learn the mappings from input to basis coefficients. We demonstrate the effectiveness of our prediction methods on a NASA neuro-adaptive flight control system.

  2. Traffic time series analysis by using multiscale time irreversibility and entropy.

    PubMed

    Wang, Xuejiao; Shang, Pengjian; Fang, Jintang

    2014-09-01

    Traffic systems, especially urban traffic systems, are regulated by different kinds of interacting mechanisms which operate across multiple spatial and temporal scales. Traditional approaches, such as the empirical probability distribution function and detrended fluctuation analysis, fail to account for the multiple time scales inherent in time series, which has led to conflicting results. The role of multiscale analytical methods in traffic time series is a frontier area of investigation. In this paper, our main purpose is to introduce a new method, multiscale time irreversibility, which helps extract information from the traffic time series we studied. In addition, to analyse the complexity of traffic volume time series of Beijing Ring 2, 3, and 4 roads between workdays and weekends, collected from August 18, 2012 to October 26, 2012, we also compare the results of this new method with those of the well-known multiscale entropy method. The results show that a higher asymmetry index corresponds to a higher level of traffic congestion, and they accord with the results obtained by multiscale entropy.
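One simple proxy for time irreversibility at a given scale (an illustrative definition, not necessarily the exact index used in the paper) counts rises versus falls among lagged increments; a time-reversible series gives a value near zero, while an asymmetric one does not.

```python
import numpy as np

def asymmetry_index(x, scale):
    """Normalized excess of rises over falls among increments
    x[t+scale] - x[t]; a simple time-irreversibility proxy."""
    inc = x[scale:] - x[:-scale]
    n_up, n_down = (inc > 0).sum(), (inc < 0).sum()
    return (n_up - n_down) / (n_up + n_down)

t = np.arange(0, 1000)
sawtooth = (t % 10).astype(float)   # slow rise, sudden fall: time-asymmetric
print([round(asymmetry_index(sawtooth, s), 3) for s in (1, 2, 3)])
```

Evaluating the index over a range of scales, as in the paper's multiscale analysis, reveals whether the asymmetry persists beyond the shortest time scale.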

  3. Global near real-time disturbance monitoring using MODIS satellite image time series

    NASA Astrophysics Data System (ADS)

    Verbesselt, J.; Kalomenopoulos, M.; de Jong, R.; Zeileis, A.; Herold, M.

    2012-12-01

    Global disturbance monitoring in forested ecosystems is critical to retrieve information on carbon storage dynamics, biodiversity, and other socio-ecological processes. Satellite remote sensing provides a means for cost-effective monitoring at frequent time steps over large areas. However, for information about current change processes, it is required to analyse image time series in a fast and accurate manner and to detect abnormal change in near real time. An increasing number of change detection techniques have become available that are able to process historical satellite image time series data to detect changes in the past. However, methods that detect changes near real-time, i.e. analysing newly acquired data with respect to the historical series, are lacking. We propose a statistical technique for monitoring change in near-real time by comparing current data with a seasonal-trend model fitted onto the historical time series. As such, identification of consistent and abnormal change in near-real time becomes possible as soon as new image data is captured. The method is based on the "Break For Additive Seasonal Trend" (BFAST) concept (http://bfast.r-forge.r-project.org/). Disturbances are detected by analysing 16-daily MODIS combined vegetation and temperature indices. Validation is carried out by comparing the detected disturbances with available disturbance data sets (e.g. deforestation in Brazil and MODIS fire products). Preliminary results demonstrated that abrupt changes at the end of time series can be successfully detected while the method remains robust for strong seasonality and atmospheric noise. Cloud masking, however, was identified as a critical issue since periods of persistent cloudiness can be detected as abnormal change. The proposed method is an automatic and robust change detection approach that can be applied on different types of data (e.g. future sensors like the Sentinel constellation that provide higher spatial resolution at regular time

  4. Interglacial climate dynamics and advanced time series analysis

    NASA Astrophysics Data System (ADS)

    Mudelsee, Manfred; Bermejo, Miguel; Köhler, Peter; Lohmann, Gerrit

    2013-04-01

    Studying the climate dynamics of past interglacials (IGs) helps to better assess the anthropogenically influenced dynamics of the current IG, the Holocene. We select the IG portions from the EPICA Dome C ice core archive, which covers the past 800 ka, to apply methods of statistical time series analysis (Mudelsee 2010). The analysed variables are deuterium/H (indicating temperature) (Jouzel et al. 2007), greenhouse gases (Siegenthaler et al. 2005, Loulergue et al. 2008, Lüthi et al. 2008) and a model-co-derived climate radiative forcing (Köhler et al. 2010). We additionally select high-resolution sea-surface-temperature records from the marine sedimentary archive. The first statistical method, persistence time estimation (Mudelsee 2002), lets us infer the 'climate memory' property of IGs. Second, linear regression informs about long-term climate trends during IGs. Third, ramp function regression (Mudelsee 2000) is adapted to look at abrupt climate changes during IGs. We compare the Holocene with previous IGs in terms of these mathematical approaches, interpret results in a climate context, assess uncertainties, and evaluate the data requirements for old IGs to yield results of 'acceptable' accuracy. This work receives financial support from the Deutsche Forschungsgemeinschaft (Project ClimSens within the DFG Research Priority Program INTERDYNAMIK) and the European Commission (Marie Curie Initial Training Network LINC, No. 289447, within the 7th Framework Programme). References: Jouzel J, Masson-Delmotte V, Cattani O, Dreyfus G, Falourd S, Hoffmann G, Minster B, Nouet J, Barnola JM, Chappellaz J, Fischer H, Gallet JC, Johnsen S, Leuenberger M, Loulergue L, Luethi D, Oerter H, Parrenin F, Raisbeck G, Raynaud D, Schilt A, Schwander J, Selmo E, Souchez R, Spahni R, Stauffer B, Steffensen JP, Stenni B, Stocker TF, Tison JL, Werner M, Wolff EW (2007) Orbital and millennial Antarctic climate variability over the past 800,000 years. Science 317:793. Köhler P, Bintanja R

  5. Beyond multi-fractals: surrogate time series and fields

    NASA Astrophysics Data System (ADS)

    Venema, V.; Simmer, C.

    2007-12-01

    Most natural complex systems are characterised by variability on a large range of temporal and spatial scales. The two main methodologies to generate such structures are Fourier/FARIMA based algorithms and multifractal methods. The former is restricted to Gaussian data, whereas the latter requires the structure to be self-similar. This work will present so-called surrogate data as an alternative that works with any (empirical) distribution and power spectrum. The best-known surrogate algorithm is the iterative amplitude adjusted Fourier transform (IAAFT) algorithm. We have studied six different geophysical time series (two clouds, runoff of a small and a large river, temperature and rain) and their surrogates. The power spectra and consequently the 2nd order structure functions were replicated accurately. Even the fourth order structure function was more accurately reproduced by the surrogates than would be possible with a fractal method, because the measured structure deviated too strongly from fractal scaling. Only in the case of the daily rain sums could a fractal method have been more accurate. Just as Fourier and multifractal methods, the current surrogates are not able to model the asymmetric increment distributions observed for runoff, i.e., they cannot reproduce nonlinear dynamical processes that are asymmetric in time. Furthermore, we have found differences for the structure functions on small scales. Surrogate methods are especially valuable for empirical studies, because the time series and fields that are generated are able to mimic measured variables accurately. Our main application is radiative transfer through structured clouds. Like many geophysical fields, clouds can only be sampled sparsely, e.g. with in-situ airborne instruments. However, for radiative transfer calculations we need full 3-dimensional cloud fields. A first study relating the measured properties of the cloud droplets and the radiative properties of the cloud field by generating surrogate cloud
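The IAAFT algorithm mentioned here alternates two projections: impose the target Fourier amplitudes while keeping the current phases, then restore the target value distribution by rank ordering. A compact NumPy sketch (the iteration count and test signal are arbitrary choices):

```python
import numpy as np

def iaaft(x, n_iter=100, seed=0):
    """Iterative amplitude adjusted Fourier transform surrogate:
    approximately matches the power spectrum of x while exactly
    matching its amplitude (value) distribution."""
    rng = np.random.default_rng(seed)
    amp = np.abs(np.fft.rfft(x))          # target Fourier amplitudes
    sorted_x = np.sort(x)                 # target value distribution
    s = rng.permutation(x)                # random initial surrogate
    for _ in range(n_iter):
        # impose the target power spectrum, keeping current phases
        phases = np.angle(np.fft.rfft(s))
        s = np.fft.irfft(amp * np.exp(1j * phases), n=len(x))
        # impose the target value distribution by rank ordering
        s = sorted_x[np.argsort(np.argsort(s))]
    return s

x = np.sin(np.linspace(0, 20, 256)) + 0.3 * np.random.default_rng(3).standard_normal(256)
surr = iaaft(x)
```

Because the rank-ordering step is applied last, the surrogate's value distribution matches the original exactly, while the spectrum is matched only approximately, as the abstract notes.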

  6. Large Scale Time Series Microscopy of Neovessel Growth During Angiogenesis

    PubMed Central

    Utzinger, Urs; Baggett, Brenda; Weiss, Jeffrey A.; Hoying, James B.; Edgar, Lowell T.

    2016-01-01

    During angiogenesis, growing neovessels must effectively navigate through the tissue space as they elongate and subsequently integrate into a microvascular network. While time series microscopy has provided insight into the cell activities within single growing neovessel sprouts, less is known concerning neovascular dynamics within a large angiogenic tissue bed. Here we developed a time lapse imaging technique that allowed visualization and quantification of sprouting neovessels as they form and grow away from adult parent microvessels in three dimensions over cubic millimeters of matrix volume, over the course of up to 5 days on the microscope. Using a new image acquisition procedure and novel morphometric analysis tools, we quantified the elongation dynamics of growing neovessels and found an episodic growth pattern accompanied by fluctuations in neovessel diameter. Average elongation rate was 5 microns/hour for individual vessels, but we also observed considerable dynamic variability in growth character including retraction and complete regression of entire neovessels. We observed neovessel-to-neovessel directed growth over tens to hundreds of microns preceding tip-to-tip inosculation. As we have previously described via static 3D imaging at discrete time points, we identified different collagen fibril structures associated with the growing neovessel tip and stalk, and observed the coordinated alignment of growing neovessels in a deforming matrix. Overall analysis of the entire image volumes demonstrated that although individual neovessels exhibited episodic growth and regression, there was a monotonic increase in parameters associated with the entire vascular bed such as total network length and number of branch points. This new time-lapse imaging approach corroborated morphometric changes in individual neovessels described by us and others, as well as captured dynamic neovessel behaviors unique to days-long angiogenesis within the forming neovascular network.

  7. A multiscale approach to InSAR time series analysis

    NASA Astrophysics Data System (ADS)

    Simons, M.; Hetland, E. A.; Muse, P.; Lin, Y. N.; Dicaprio, C.; Rickerby, A.

    2008-12-01

    We describe a new technique to constrain time-dependent deformation from repeated satellite-based InSAR observations of a given region. This approach, which we call MInTS (Multiscale analysis of InSAR Time Series), relies on a spatial wavelet decomposition to permit the inclusion of distance-based spatial correlations in the observations while maintaining computational tractability. This approach also permits a consistent treatment of all data independent of the presence of localized holes in any given interferogram. In essence, MInTS allows one to consider all data at the same time (as opposed to one pixel at a time), thereby taking advantage of both spatial and temporal characteristics of the deformation field. In terms of the temporal representation, we have the flexibility to explicitly parametrize known processes that are expected to contribute to a given set of observations (e.g., co-seismic steps and post-seismic transients, secular variations, seasonal oscillations, etc.). Our approach also allows the temporal parametrization to include a set of general functions (e.g., splines) in order to account for unexpected processes. We allow for various forms of model regularization using a cross-validation approach to select penalty parameters. The multiscale analysis allows us to consider various contributions (e.g., orbit errors) that may affect specific scales but not others. The methods described here are all embarrassingly parallel and suitable for implementation on a cluster computer. We demonstrate the use of MInTS using a large suite of ERS-1/2 and Envisat interferograms for Long Valley Caldera, and validate our results by comparing with ground-based observations.

  8. TEMPORAL SIGNATURES OF AIR QUALITY OBSERVATIONS AND MODEL OUTPUTS: DO TIME SERIES DECOMPOSITION METHODS CAPTURE RELEVANT TIME SCALES?

    EPA Science Inventory

    Time series decomposition methods were applied to meteorological and air quality data and their numerical model estimates. Decomposition techniques express a time series as the sum of a small number of independent modes which hypothetically represent identifiable forcings, thereb...

  9. A unified nonlinear stochastic time series analysis for climate science.

    PubMed

    Moon, Woosok; Wettlaufer, John S

    2017-03-13

    Earth's orbit and axial tilt imprint a strong seasonal cycle on climatological data. Climate variability is typically viewed in terms of fluctuations in the seasonal cycle induced by higher frequency processes. We can interpret this as a competition between the orbitally enforced monthly stability and the fluctuations/noise induced by weather. Here we introduce a new time-series method that determines these contributions from monthly-averaged data. We find that the spatio-temporal distribution of the monthly stability and the magnitude of the noise reveal key fingerprints of several important climate phenomena, including the evolution of the Arctic sea ice cover, the El Niño Southern Oscillation (ENSO), the Atlantic Niño and the Indian Dipole Mode. In analogy with the classical destabilising influence of the ice-albedo feedback on summertime sea ice, we find that during some time interval of the season a destabilising process operates in all of these climate phenomena. The interaction between the destabilisation and the accumulation of noise, which we term the memory effect, underlies phase locking to the seasonal cycle and the statistical nature of seasonal predictability.
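The idea of estimating a month-dependent stability from monthly-averaged data can be caricatured as follows; the actual method involves a full stochastic model, so the per-month regression below is a deliberately simplified stand-in, and the synthetic data are invented.

```python
import numpy as np

def monthly_stability(x):
    """For monthly anomalies x, regress the month-to-month increment on the
    current value separately for each calendar month:
    x[t+1] - x[t] ~ alpha_m * x[t].
    alpha_m < 0 is stabilising, alpha_m > 0 destabilising (a caricature of
    the seasonal-stability idea, not the published estimator)."""
    alphas = np.empty(12)
    inc = x[1:] - x[:-1]
    months = np.arange(len(x) - 1) % 12
    for m in range(12):
        sel = months == m
        alphas[m] = (inc[sel] @ x[:-1][sel]) / (x[:-1][sel] @ x[:-1][sel])
    return alphas

# synthetic anomalies with a known, month-dependent drift and no noise
true_alpha = np.linspace(-0.05, 0.05, 12)
x = np.empty(240)
x[0] = 1.0
for t in range(239):
    x[t + 1] = x[t] + true_alpha[t % 12] * x[t]
est = monthly_stability(x)
```

With noise-free synthetic data the regression recovers the month-dependent coefficients exactly; with real monthly data the scatter of the increments around the fit would estimate the seasonal noise magnitude.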

  10. A unified nonlinear stochastic time series analysis for climate science

    PubMed Central

    Moon, Woosok; Wettlaufer, John S.

    2017-01-01

    Earth’s orbit and axial tilt imprint a strong seasonal cycle on climatological data. Climate variability is typically viewed in terms of fluctuations in the seasonal cycle induced by higher frequency processes. We can interpret this as a competition between the orbitally enforced monthly stability and the fluctuations/noise induced by weather. Here we introduce a new time-series method that determines these contributions from monthly-averaged data. We find that the spatio-temporal distribution of the monthly stability and the magnitude of the noise reveal key fingerprints of several important climate phenomena, including the evolution of the Arctic sea ice cover, the El Niño Southern Oscillation (ENSO), the Atlantic Niño and the Indian Dipole Mode. In analogy with the classical destabilising influence of the ice-albedo feedback on summertime sea ice, we find that during some time interval of the season a destabilising process operates in all of these climate phenomena. The interaction between the destabilisation and the accumulation of noise, which we term the memory effect, underlies phase locking to the seasonal cycle and the statistical nature of seasonal predictability. PMID:28287128

  11. Advanced tools for astronomical time series and image analysis

    NASA Astrophysics Data System (ADS)

    Scargle, Jeffrey D.

    The algorithms described here, which I have developed for applications in X-ray and γ-ray astronomy, will hopefully be of use in other ways, perhaps aiding in the exploration of modern astronomy's data cornucopia. The goal is to describe principled approaches to some ubiquitous problems, such as detection and characterization of periodic and aperiodic signals, estimation of time delays between multiple time series, and source detection in noisy images with noisy backgrounds. The latter problem is related to detection of clusters in data spaces of various dimensions. A goal of this work is to achieve a unifying view of several related topics: signal detection and characterization, cluster identification, classification, density estimation, and multivariate regression. In addition to being useful for analysis of data from space-based and ground-based missions, these algorithms may be a basis for a future automatic science discovery facility, and in turn provide analysis tools for the Virtual Observatory. This chapter has ties to those by Larry Bretthorst, Tom Loredo, Alanna Connors, Fionn Murtagh, Jim Berger, David van Dyk, Vicent Martinez & Enn Saar.

  12. Impact of Sensor Degradation on the MODIS NDVI Time Series

    NASA Technical Reports Server (NTRS)

    Wang, Dongdong; Morton, Douglas; Masek, Jeffrey; Wu, Aisheng; Nagol, Jyoteshwar; Xiong, Xiaoxiong; Levy, Robert; Vermote, Eric; Wolfe, Robert

    2011-01-01

    Time series of satellite data provide unparalleled information on the response of vegetation to climate variability. Detecting subtle changes in vegetation over time requires consistent satellite-based measurements. Here, we evaluated the impact of sensor degradation on trend detection using Collection 5 data from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensors on the Terra and Aqua platforms. For Terra MODIS, the impact of blue band (Band 3, 470 nm) degradation on simulated surface reflectance was most pronounced at near-nadir view angles, leading to a 0.001-0.004/yr decline in Normalized Difference Vegetation Index (NDVI) under a range of simulated aerosol conditions and surface types. Observed trends in MODIS NDVI over North America were consistent with simulated results, with nearly a threefold difference in negative NDVI trends derived from Terra (17.4%) and Aqua (6.7%) MODIS sensors during 2002-2010. Planned adjustments to Terra MODIS calibration for Collection 6 data reprocessing will largely eliminate this negative bias in NDVI trends over vegetation.
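The quantities involved, NDVI and its per-pixel trend, are straightforward to compute; the sketch below uses invented values of the same magnitude as the degradation artifact discussed in the abstract, not the study's data.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def ndvi_trend(series, years):
    """Least-squares slope of an annual NDVI series, in NDVI units per year."""
    slope, _ = np.polyfit(years, series, 1)
    return slope

years = np.arange(2002, 2011)
# synthetic series declining by 0.002/yr, comparable in size to the
# sensor-degradation bias reported for Terra MODIS
series = 0.6 - 0.002 * (years - 2002)
trend = ndvi_trend(series, years)
```

A trend of this size is well within the 0.001-0.004/yr artifact range quoted above, which is exactly why calibration drift can masquerade as a vegetation signal.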

  13. High Cadence Time-Series Photometry of V1647 Orionis

    NASA Astrophysics Data System (ADS)

    Bastien, Fabienne A.; Stassun, K. G.; Weintraub, D. A.

    2010-01-01

    We present high cadence time-series photometry of the 2003-2004 and 2008-2009 FUor/EXor outbursts of V1647 Orionis, the star illuminating McNeil's Nebula. The first dataset was taken as the object was most steeply increasing in brightness while the second was presumably taken after its luminosity had plateaued. We detect two significant periods in our 2003 lightcurve superimposed on a flicker-noise spectrum, while the power spectrum of our 2009 lightcurve is devoid of significant structure. We find that neither of these periods can be attributed to the star's rotation. The dominant period is 4.3d, and we find that it may be akin to the dwarf-nova oscillations observed around cataclysmic variable stars. This 4.3d period would suggest that the inner edge of the star's Keplerian accretion disk was located 2.5 stellar radii away from the star before its luminosity had reached its peak and that, considered together with the flickering, the stellar magnetosphere was interacting with the disk during this phase of the outburst. The second period of 0.13d is consistent with the star's theoretical radial pulsation timescale, and, given that this period is not detected in 2009, we propose that the very high accretion rate at the time of our 2003 observations induced short-term radial pulsations in the star.
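Period searches like the one behind the 4.3 d detection are commonly done with a periodogram; below is a minimal classical-periodogram sketch on a synthetic, evenly sampled light curve (the paper's actual data and analysis details are not reproduced here):

```python
import math

def periodogram_peak(t, y, periods):
    """Classical periodogram sketch: power of the mean-subtracted
    series projected onto sin/cos at each trial period; returns the
    trial period with the largest power."""
    n = len(y)
    ym = sum(y) / n
    best_p, best_power = None, -1.0
    for p in periods:
        w = 2.0 * math.pi / p
        c = sum((yi - ym) * math.cos(w * ti) for ti, yi in zip(t, y))
        s = sum((yi - ym) * math.sin(w * ti) for ti, yi in zip(t, y))
        power = (c * c + s * s) / n
        if power > best_power:
            best_p, best_power = p, power
    return best_p

# Hypothetical light curve: 4.3 d modulation sampled every 0.1 d
t = [0.1 * i for i in range(600)]
y = [10.0 + 0.5 * math.sin(2 * math.pi * ti / 4.3) for ti in t]
periods = [1.0 + 0.1 * k for k in range(91)]   # trial periods 1-10 d
best = periodogram_peak(t, y, periods)
```

For irregular sampling and proper significance levels, a Lomb-Scargle periodogram would be the standard replacement for this brute-force grid search.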

  14. A unified nonlinear stochastic time series analysis for climate science

    NASA Astrophysics Data System (ADS)

    Moon, Woosok; Wettlaufer, John S.

    2017-03-01

Earth’s orbit and axial tilt imprint a strong seasonal cycle on climatological data. Climate variability is typically viewed in terms of fluctuations in the seasonal cycle induced by higher frequency processes. We can interpret this as a competition between the orbitally enforced monthly stability and the fluctuations/noise induced by weather. Here we introduce a new time-series method that determines these contributions from monthly-averaged data. We find that the spatio-temporal distribution of the monthly stability and the magnitude of the noise reveal key fingerprints of several important climate phenomena, including the evolution of the Arctic sea ice cover, the El Niño Southern Oscillation (ENSO), the Atlantic Niño and the Indian Dipole Mode. In analogy with the classical destabilising influence of the ice-albedo feedback on summertime sea ice, we find that during some time interval of the season a destabilising process operates in all of these climate phenomena. The interaction between the destabilisation and the accumulation of noise, which we term the memory effect, underlies phase locking to the seasonal cycle and the statistical nature of seasonal predictability.
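The monthly stability/noise decomposition described above can be caricatured with a simple per-calendar-month regression. This is an illustrative sketch on synthetic data, not the authors' method: the slope of the monthly increment against the anomaly plays the role of the stability coefficient, and the residual spread estimates the noise magnitude.

```python
import random

def monthly_stability(x):
    """Per-calendar-month sketch: regress the monthly increment
    x[t+1]-x[t] on the anomaly x[t] across all years. The slope is a
    crude stability coefficient (negative = damped, positive =
    destabilising); the residual spread estimates the noise level."""
    stats = []
    n = len(x) - 1
    for m in range(12):
        xs = [x[t] for t in range(m, n, 12)]
        dx = [x[t + 1] - x[t] for t in range(m, n, 12)]
        xm = sum(xs) / len(xs)
        dm = sum(dx) / len(dx)
        slope = sum((v - xm) * (d - dm) for v, d in zip(xs, dx)) \
            / sum((v - xm) ** 2 for v in xs)
        resid = [d - dm - slope * (v - xm) for v, d in zip(xs, dx)]
        noise = (sum(r * r for r in resid) / len(resid)) ** 0.5
        stats.append((slope, noise))
    return stats

# Synthetic monthly anomalies: uniform damping of 0.5 plus noise of 0.1
random.seed(0)
x = [1.0]
for _ in range(12 * 100):
    x.append(x[-1] - 0.5 * x[-1] + random.gauss(0.0, 0.1))
stats = monthly_stability(x)
```

For this synthetic series every month recovers a negative (stable) coefficient near -0.5 and a noise level near 0.1; in the authors' analysis, the interesting signal is precisely the months where the estimated coefficient turns destabilising.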

  15. Impact of Sensor Degradation on the MODIS NDVI Time Series

    NASA Technical Reports Server (NTRS)

    Wang, Dongdong; Morton, Douglas Christopher; Masek, Jeffrey; Wu, Aisheng; Nagol, Jyoteshwar; Xiong, Xiaoxiong; Levy, Robert; Vermote, Eric; Wolfe, Robert

    2012-01-01

    Time series of satellite data provide unparalleled information on the response of vegetation to climate variability. Detecting subtle changes in vegetation over time requires consistent satellite-based measurements. Here, the impact of sensor degradation on trend detection was evaluated using Collection 5 data from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensors on the Terra and Aqua platforms. For Terra MODIS, the impact of blue band (Band 3, 470 nm) degradation on simulated surface reflectance was most pronounced at near-nadir view angles, leading to a 0.001-0.004 yr-1 decline in Normalized Difference Vegetation Index (NDVI) under a range of simulated aerosol conditions and surface types. Observed trends in MODIS NDVI over North America were consistentwith simulated results,with nearly a threefold difference in negative NDVI trends derived from Terra (17.4%) and Aqua (6.7%) MODIS sensors during 2002-2010. Planned adjustments to Terra MODIS calibration for Collection 6 data reprocessing will largely eliminate this negative bias in detection of NDVI trends.

  16. Interrupted time-series analysis: studying trends in neurosurgery.

    PubMed

    Wong, Ricky H; Smieliauskas, Fabrice; Pan, I-Wen; Lam, Sandi K

    2015-12-01

    OBJECT Neurosurgery studies traditionally have evaluated the effects of interventions on health care outcomes by studying overall changes in measured outcomes over time. Yet, this type of linear analysis is limited due to lack of consideration of the trend's effects both pre- and postintervention and the potential for confounding influences. The aim of this study was to illustrate interrupted time-series analysis (ITSA) as applied to an example in the neurosurgical literature and highlight ITSA's potential for future applications. METHODS The methods used in previous neurosurgical studies were analyzed and then compared with the methodology of ITSA. RESULTS The ITSA method was identified in the neurosurgical literature as an important technique for isolating the effect of an intervention (such as a policy change or a quality and safety initiative) on a health outcome independent of other factors driving trends in the outcome. The authors determined that ITSA allows for analysis of the intervention's immediate impact on outcome level and on subsequent trends and enables a more careful measure of the causal effects of interventions on health care outcomes. CONCLUSIONS ITSA represents a significant improvement over traditional observational study designs in quantifying the impact of an intervention. ITSA is a useful statistical procedure to understand, consider, and implement as the field of neurosurgery evolves in sophistication in big-data analytics, economics, and health services research.
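A minimal ITSA sketch, assuming the simplest segmented-regression form (two separate OLS fits rather than the usual single model with interaction terms), reports the immediate level change and the slope change at the intervention:

```python
def ols(t, y):
    """OLS fit y ~ a + b*t; returns (a, b)."""
    n = len(t)
    tm, ym = sum(t) / n, sum(y) / n
    b = sum((ti - tm) * (yi - ym) for ti, yi in zip(t, y)) \
        / sum((ti - tm) ** 2 for ti in t)
    return ym - b * tm, b

def itsa(t, y, t0):
    """Fit separate trends before and after the intervention at t0 and
    report (immediate level change at t0, change in slope)."""
    pre = [(ti, yi) for ti, yi in zip(t, y) if ti < t0]
    post = [(ti, yi) for ti, yi in zip(t, y) if ti >= t0]
    a1, b1 = ols(*zip(*pre))
    a2, b2 = ols(*zip(*post))
    return (a2 + b2 * t0) - (a1 + b1 * t0), b2 - b1

# Hypothetical outcome: +1/month baseline trend, then at month 10 an
# immediate drop of 5 units and a flattened trend of +0.2/month
t = list(range(20))
y = [float(ti) for ti in t[:10]] + [5.0 + 0.2 * (ti - 10) for ti in t[10:]]
level_change, slope_change = itsa(t, y, 10)
```

Separating the level change from the slope change is what distinguishes ITSA from a naive before/after comparison, which would conflate the two.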

  17. Urban Area Monitoring using MODIS Time Series Data

    NASA Astrophysics Data System (ADS)

    Devadiga, S.; Sarkar, S.; Mauoka, E.

    2015-12-01

Growing urban sprawl and its impact on global climate due to urban heat island effects has been an active area of research in recent years. This is especially significant in light of the rapid urbanization happening in some of the fast-developing nations across the globe. So far, however, the study of urban area growth has been largely restricted to local and regional scales, using high- to medium-resolution satellite observations taken at distinct time periods. In this presentation we propose a new approach to detect and monitor urban area expansion using long time series of MODIS data. This work characterizes data points using a vector of several annual metrics computed from the MODIS 8-day and 16-day composite L3 data products, at 250 m resolution and over several years, and then uses a vector angle mapping classifier to detect and segment the urban area. The classifier is trained using a set of training points obtained from a reference vector point and polygon pre-filtered using the MODIS VI product. This work gains additional significance given that, despite unprecedented urban growth since 2000, the area covered by the urban class in the MODIS Global Land Cover (MCD12Q1, MCDLCHKM and MCDLC1KM) product hasn't changed since the launch of Terra and Aqua. The proposed approach was applied to delineate the urban area around several cities in Asia known to have experienced maximum growth in the last 15 years. Results were verified using high resolution Landsat data.
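The vector angle mapping classifier mentioned above is presumably akin to a spectral angle mapper; a minimal sketch, with hypothetical two-component annual-metric vectors, is:

```python
import math

def spectral_angle(a, b):
    """Angle (radians) between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

def classify(pixel, references, threshold):
    """Return the reference class at the smallest angle from the
    pixel's metric vector, or None if nothing is within threshold."""
    label, best = None, threshold
    for name, ref in references.items():
        ang = spectral_angle(pixel, ref)
        if ang < best:
            label, best = name, ang
    return label

# Hypothetical 2-component annual metrics: [mean NDVI, seasonal amplitude]
refs = {"urban": [0.20, 0.02], "forest": [0.50, 0.40]}
label = classify([0.22, 0.03], refs, threshold=0.2)
```

Because the angle depends only on the direction of the metric vector, this kind of classifier is relatively insensitive to overall brightness differences between scenes, which is a common reason for choosing it over Euclidean distance.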

  18. Spectrophotometric Time Series of η Carinae's Great Eruption

    NASA Astrophysics Data System (ADS)

    Rest, Armin; Bianco, Federica; Chornock, Ryan; Clocchiatti, Alejandro; James, David; Margheim, Steve; Matheson, Thomas; Prieto, Jose Luis; Smith, Chris; Smith, Nathan; Walborn, Nolan; Welch, Doug; Zenteno, Alfredo

    2014-08-01

η Car serves as our most important template for understanding non-SN transients from massive stars in external galaxies. However, until recently, no spectra were available because its historic "Great Eruption" (GE) occurred from 1838-1858, before the invention of the astronomical spectrograph, and only visual estimates of its brightness were recorded. Now we can also obtain a spectral sequence of the eruption through the light echoes we discovered, which will be of great value since spectra are our most important tool for inferring the physical properties of extragalactic transients. Subsequent spectroscopic follow-up revealed that its outburst was most similar to those of G-type supergiants, rather than the reported LBV outburst spectral types of F-type (or earlier). These differences between the GE and the extragalactic transients presumed to be its analogues raise questions about traditional scenarios for the outburst. We propose to obtain a spectrophotometric time series of the GE from different directions, allowing the original eruption of η Car to be studied as a function of time as well as latitude, something only possible with light echoes. This unique detailed spectroscopic study of the light echoes of η Car will help us understand (episodic) mass-loss in the most massive evolved stars and their connection to the most energetic core-collapse SNe.

  19. Coastal Atmosphere and Sea Time Series (CoASTS)

    NASA Technical Reports Server (NTRS)

    Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Zibordi, Giuseppe; Berthon, Jean-Francoise; Doyle, John P.; Grossi, Stefania; vanderLinde, Dirk; Targa, Cristina; Alberotanza, Luigi; McClain, Charles R. (Technical Monitor)

    2002-01-01

The Coastal Atmosphere and Sea Time Series (CoASTS) Project, aimed at supporting ocean color research and applications, has ensured, from 1995 up to the time of publication of this document, the collection of a comprehensive atmospheric and marine data set from an oceanographic tower located in the northern Adriatic Sea. The instruments and the measurement methodologies used to gather quantities relevant to bio-optical modeling and to the calibration and validation of ocean color sensors are described. Particular emphasis is placed on four items: (1) the evaluation of perturbation effects in radiometric data (i.e., tower-shading, instrument self-shading, and bottom effects); (2) the intercomparison of seawater absorption coefficients from in situ measurements and from laboratory spectrometric analysis of discrete samples; (3) the intercomparison of two filter techniques for in vivo measurement of particulate absorption coefficients; and (4) the analysis of the repeatability and reproducibility of the most relevant laboratory measurements carried out on seawater samples (i.e., particulate and yellow substance absorption coefficients, and pigment and total suspended matter concentrations). Sample data are also presented and discussed to illustrate the typical features characterizing the CoASTS measurement site, in view of supporting the suitability of the CoASTS data set for bio-optical modeling and ocean color calibration and validation.

  20. Noninvertibility and resonance in discrete-time neural networks for time-series processing

    NASA Astrophysics Data System (ADS)

    Gicquel, N.; Anderson, J. S.; Kevrekidis, I. G.

    1998-01-01

    We present a computer-assisted study emphasizing certain elements of the dynamics of artificial neural networks (ANNs) used for discrete time-series processing and nonlinear system identification. The structure of the network gives rise to the possibility of multiple inverses of a phase point backward in time; this is not possible for the continuous-time system from which the time series are obtained. Using a two-dimensional illustrative model in an oscillatory regime, we study here the interaction of attractors predicted by the discrete-time ANN model (invariant circles and periodic points locked on them) with critical curves. These curves constitute a generalization of critical points for maps of the interval (in the sense of Julia-Fatou); their interaction with the model-predicted attractors plays a crucial role in the organization of the bifurcation structure and ultimately in determining the dynamic behavior predicted by the neural network.
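The multiple backward inverses can be made concrete with the logistic map, a standard noninvertible example (not the network model of the paper): every state below the map's maximum has two distinct preimages.

```python
import math

def logistic(x, a=3.8):
    """One step of the quadratic (logistic) map."""
    return a * x * (1.0 - x)

def preimages(y, a=3.8):
    """All real preimages of y: roots of a*x*(1-x) = y. Because the
    map is noninvertible, a state can have 0, 1, or 2 ancestors."""
    disc = a * a - 4.0 * a * y
    if disc < 0:
        return []
    r = math.sqrt(disc)
    return sorted({(a - r) / (2.0 * a), (a + r) / (2.0 * a)})

ancestors = preimages(0.5)   # two distinct states map onto 0.5
```

A continuous-time flow can never merge two states in this way, which is exactly why a discrete-time ANN fitted to flow data can exhibit dynamics (organized by critical curves) that the underlying system cannot.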

  1. Mackenzie River Delta morphological change based on Landsat time series

    NASA Astrophysics Data System (ADS)

    Vesakoski, Jenni-Mari; Alho, Petteri; Gustafsson, David; Arheimer, Berit; Isberg, Kristina

    2015-04-01

Arctic rivers are sensitive, and yet largely unexplored, river systems on which climate change will have an impact. Research has not focused in detail on the fluvial geomorphology of Arctic rivers, mainly due to the remoteness and vastness of the watersheds, problems with data availability, and difficult accessibility. Nowadays, wide collaborative spatial databases in hydrology as well as extensive remote sensing datasets over the Arctic are available, and they enable improved investigation of Arctic watersheds. It is therefore important to develop and improve methods for detecting fluvio-morphological processes from the available data. Furthermore, it is essential to reconstruct and improve our understanding of past fluvial processes in order to better understand prevailing and future fluvial processes. In this study we sum up the fluvial geomorphological change in the Mackenzie River Delta during the last ~30 years. The Mackenzie River Delta (~13,000 km2) is situated in the Northwest Territories, Canada, where the Mackenzie River enters the Beaufort Sea (Arctic Ocean) near Inuvik. The Mackenzie River Delta is a lake-rich, productive ecosystem and an ecologically sensitive environment. The research objective is achieved through two sub-objectives: 1) interpretation of deltaic river channel planform change by applying Landsat time series, and 2) identification of the variables that have had the greatest impact on the detected changes, by applying statistics and long hydrological time series derived from the Arctic-HYPE model (HYdrologic Predictions for Environment) developed by the Swedish Meteorological and Hydrological Institute. According to our satellite interpretation, field observations and statistical analyses, notable spatio-temporal changes have occurred in the morphology of the river channel and delta during the past 30 years. For example, the channels have developed in braiding and sinuosity. In addition, various linkages between the studied

  2. GPS coordinate time series measurements in Ontario and Quebec, Canada

    NASA Astrophysics Data System (ADS)

    Samadi Alinia, Hadis; Tiampo, Kristy F.; James, Thomas S.

    2017-01-01

New precise network solutions for continuous GPS (cGPS) stations distributed in eastern Ontario and western Québec provide constraints on the regional three-dimensional crustal velocity field. Five years of continuous observations at fourteen cGPS sites were analyzed using Bernese GPS processing software. Several different sub-networks were chosen from these stations, and the data were processed and compared in order to select the optimal configuration to accurately estimate the vertical and horizontal station velocities and minimize the associated errors. The coordinate time series were then compared to the crustal motions from global solutions, and the optimized solution is presented here. A noise analysis model with power-law and white noise, which best describes the noise characteristics of all three components, was employed for the GPS time series analysis. The linear trend, associated uncertainties, and the spectral index of the power-law noise were calculated using a maximum likelihood estimation approach. The residual horizontal velocities, after removal of rigid plate motion, have a magnitude consistent with expected glacial isostatic adjustment (GIA). The vertical velocities increase from subsidence of almost 1.9 mm/year south of the Great Lakes to uplift near Hudson Bay, where the highest rate is approximately 10.9 mm/year. The residual horizontal velocities range from approximately 0.5 mm/year, oriented south-southeastward, at the Great Lakes to nearly 1.5 mm/year directed toward the interior of Hudson Bay at stations adjacent to its shoreline. Here, the velocity uncertainties are estimated at less than 0.6 mm/year for the horizontal component and 1.1 mm/year for the vertical component. A comparison between the observed velocities and GIA model predictions, for a limited range of Earth models, shows a better fit to the observations for the Earth model with the smallest upper mantle viscosity and the largest lower mantle viscosity. However, the

  3. Academic Workload and Working Time: Retrospective Perceptions versus Time-Series Data

    ERIC Educational Resources Information Center

    Kyvik, Svein

    2013-01-01

    The purpose of this article is to examine the validity of perceptions by academic staff about their past and present workload and working hours. Retrospective assessments are compared with time-series data. The data are drawn from four mail surveys among academic staff in Norwegian universities undertaken in the period 1982-2008. The findings show…

  4. Time-Series Monitoring of Open Star Clusters

    NASA Astrophysics Data System (ADS)

    Hojaev, A. S.; Semakov, D. G.

    2006-08-01

Star clusters, especially compact ones (with diameters of a few to ten arcmin), are suitable targets in which to search for light variability across an orchestra of stars by means of an ordinary Cassegrain telescope plus a CCD system. Special patrol observations with short, time-fixed exposures and mmag accuracy can also be used to study stellar oscillations in a group of stars simultaneously; this can be carried out both separately from one site and within international campaigns. Detection and study of the optical variability of X-ray sources, including X-ray binaries with compact objects, may also result from long-term monitoring of such clusters. We present the program of open star cluster monitoring with the Zeiss 1 m RCC telescope of Maidanak Observatory, which has recently been automated. In combination with the quite good seeing at this observatory (see, e.g., Sarazin, M. 1999, URL http://www.eso.org/gen-fac/pubs/astclim/), the automatic telescope, equipped with the available large-format (2K×2K) CCD camera AP-10, will allow the collection of homogeneous time series for analysis. We started this program in 2001 and obtained a set of patrol observations with the Zeiss 0.6 m telescope and AP-10 camera in 2003. Seven compact open clusters in the Milky Way (NGC 7801, King 1, King 13, King 18, King 20, Berkeley 55, IC 4996) have been monitored for stellar variability, and some results of the photometry will be presented. A few interesting variables were discovered, and dozens more were suspected of variability, in these clusters for the first time. We have taken steps to join the Whole Earth Telescope effort in its future campaigns.

  5. Inverse method for estimating respiration rates from decay time series

    NASA Astrophysics Data System (ADS)

    Forney, D. C.; Rothman, D. H.

    2012-09-01

Long-term organic matter decomposition experiments typically measure the mass lost from decaying organic matter as a function of time. These experiments can provide information about the dynamics of carbon dioxide input to the atmosphere and controls on natural respiration processes. Decay slows down with time, suggesting that organic matter is composed of components (pools) with varied lability. Yet it is unclear how the appropriate rates, sizes, and number of pools vary with organic matter type, climate, and ecosystem. To better understand these relations, it is necessary to properly extract the decay rates from decomposition data. Here we present a regularized inverse method to identify an optimally-fitting distribution of decay rates associated with a decay time series. We motivate our study by first evaluating a standard, direct inversion of the data. The direct inversion identifies a discrete distribution of decay rates, where mass is concentrated in just a small number of discrete pools. It is consistent with identifying the best fitting "multi-pool" model, without prior assumption of the number of pools. However we find these multi-pool solutions are not robust to noise and are over-parametrized. We therefore introduce a method of regularized inversion, which identifies the solution which best fits the data but not the noise. This method shows that the data are described by a continuous distribution of rates, which we find is well approximated by a lognormal distribution, and consistent with the idea that decomposition results from a continuum of processes at different rates. The ubiquity of the lognormal distribution suggests that decay may be simply described by just two parameters: a mean and a variance of log rates. We conclude by describing a procedure that estimates these two lognormal parameters from decay data. Matlab codes for all numerical methods and procedures are provided.

  6. Inverse method for estimating respiration rates from decay time series

    NASA Astrophysics Data System (ADS)

    Forney, D. C.; Rothman, D. H.

    2012-03-01

Long-term organic matter decomposition experiments typically measure the mass lost from decaying organic matter as a function of time. These experiments can provide information about the dynamics of carbon dioxide input to the atmosphere and controls on natural respiration processes. Decay slows down with time, suggesting that organic matter is composed of components (pools) with varied lability. Yet it is unclear how the appropriate rates, sizes, and number of pools vary with organic matter type, climate, and ecosystem. To better understand these relations, it is necessary to properly extract the decay rates from decomposition data. Here we present a regularized inverse method to identify an optimally-fitting distribution of decay rates associated with a decay time series. We motivate our study by first evaluating a standard, direct inversion of the data. The direct inversion identifies a discrete distribution of decay rates, where mass is concentrated in just a small number of discrete pools. It is consistent with identifying the best fitting "multi-pool" model, without prior assumption of the number of pools. However we find these multi-pool solutions are not robust to noise and are over-parametrized. We therefore introduce a method of regularized inversion, which identifies the solution which best fits the data but not the noise. This method shows that the data are described by a continuous distribution of rates, which we find is well approximated by a lognormal distribution, and consistent with the idea that decomposition results from a continuum of processes at different rates. The ubiquity of the lognormal distribution suggests that decay may be simply described by just two parameters: a mean and a variance of log rates. We conclude by describing a procedure that estimates these two lognormal parameters from decay data. Matlab codes for all numerical methods and procedures are provided.
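The regularized inversion described above can be sketched in miniature. This is a minimal illustration, not the authors' Matlab code: it fits weights over a fixed grid of candidate decay rates by solving the damped (Tikhonov) normal equations, and omits the non-negativity constraint and the regularization-selection machinery of the real method.

```python
import math

def solve(M, b):
    """Gauss-Jordan elimination with partial pivoting for M x = b."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(M)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col and M[r][col] != 0.0:
                f = M[r][col] / M[col][col]
                for c in range(col, n + 1):
                    M[r][c] -= f * M[col][c]
    return [M[i][n] / M[i][i] for i in range(n)]

def regularized_rate_inversion(times, mass, rates, lam=1e-4):
    """Find weights p over candidate decay rates k such that
    sum_j p[j]*exp(-k[j]*t) fits the observed mass fractions, by
    solving the damped normal equations (A^T A + lam*I) p = A^T g."""
    A = [[math.exp(-k * t) for k in rates] for t in times]
    m, n = len(rates), len(times)
    AtA = [[sum(A[i][a] * A[i][b] for i in range(n))
            + (lam if a == b else 0.0) for b in range(m)] for a in range(m)]
    Atg = [sum(A[i][a] * mass[i] for i in range(n)) for a in range(m)]
    return solve(AtA, Atg)

# Synthetic two-pool decay: 60% slow (k=0.1) and 40% fast (k=2.0)
times = [0, 1, 2, 3, 4, 6, 8, 10]
mass = [0.6 * math.exp(-0.1 * t) + 0.4 * math.exp(-2.0 * t) for t in times]
grid = [0.05, 0.1, 0.2, 0.5, 1.0, 2.0]
p = regularized_rate_inversion(times, mass, grid)
```

The exponential columns are highly collinear, which is why an unregularized direct inversion is fragile: the damping term trades a small fitting bias for a solution that does not blow up in the near-null directions.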

  7. Representative locations from time series of soil water content using time stability and wavelet analysis.

    PubMed

    Rivera, Diego; Lillo, Mario; Granda, Stalin

    2014-12-01

The concept of time stability has been widely used in the design and assessment of soil moisture monitoring networks, as well as in hydrological studies, because it is a technique that allows the identification of particular locations that represent the field-mean soil moisture. In this work, we assess how time stability calculations change as new information is added, and how they are affected over shorter periods, subsampled from the original time series, containing different amounts of precipitation. In doing so, we defined two experiments to explore the time stability behavior. The first experiment sequentially adds new data to the previous time series to investigate the long-term influence of new data on the results. The second experiment applies a windowing approach, taking sequential subsamples from the entire time series to investigate the influence of short-term changes associated with the precipitation in each window. Our results from an operating network (seven monitoring points, each equipped with four sensors, in a 2-ha blueberry field) show that as information is added to the time series, there are changes in the location of the most stable point (MSP), and that in the moving 21-day windows most of the variability of soil water content changes is associated with both the amount and intensity of rainfall. The changes of the MSP over each window depend on the amount of water entering the soil and the previous state of the soil water content. For our case study, the upper strata are proxies for hourly to daily changes in soil water content, while the deeper strata are proxies for medium-range stored water. Thus, different locations and depths are representative of processes at different time scales. This situation must be taken into account when water management depends on soil water content values from fixed locations.
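Time stability is commonly quantified through mean relative differences (MRD); a minimal sketch, with a hypothetical three-point network and one simple combined bias/spread criterion for picking the most stable point, is:

```python
def most_stable_point(theta):
    """theta[i][t]: soil water content at location i, time t.
    Mean-relative-difference time-stability sketch: the most stable
    point has MRD closest to zero with the smallest spread in time."""
    n_loc, n_t = len(theta), len(theta[0])
    field_mean = [sum(theta[i][t] for i in range(n_loc)) / n_loc
                  for t in range(n_t)]
    best, best_score = None, None
    for i in range(n_loc):
        rel = [(theta[i][t] - field_mean[t]) / field_mean[t]
               for t in range(n_t)]
        mrd = sum(rel) / n_t
        var = sum((r - mrd) ** 2 for r in rel) / n_t
        score = mrd * mrd + var       # one simple combined criterion
        if best_score is None or score < best_score:
            best, best_score = i, score
    return best

# Hypothetical network: location 1 tracks the field mean most closely
theta = [
    [0.30, 0.25, 0.20, 0.35],   # persistently wet
    [0.25, 0.20, 0.15, 0.30],   # near the field mean
    [0.15, 0.10, 0.08, 0.20],   # persistently dry
]
```

Recomputing this as observations accumulate, or over moving windows, reproduces the two experiments of the abstract: the identity of the MSP can change as the score is re-evaluated on new data.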

  8. Statistical Analysis of Sensor Network Time Series at Multiple Time Scales

    NASA Astrophysics Data System (ADS)

    Granat, R. A.; Donnellan, A.

    2013-12-01

Modern sensor networks often collect data at multiple time scales in order to observe physical phenomena that occur at different scales. Whether collected by heterogeneous or homogeneous sensor networks, measurements at different time scales are usually subject to different dynamics, noise characteristics, and error sources. We explore the impact of these effects on the results of statistical time series analysis methods applied to multi-scale time series data. As a case study, we analyze results from GPS time series position data collected in Japan and the Western United States, which produce raw observations at 1 Hz and orbit-corrected observations at time resolutions of 5 minutes, 30 minutes, and 24 hours. We utilize the GPS analysis package (GAP) software to perform three types of statistical analysis on these observations: hidden Markov modeling, probabilistic principal components analysis, and covariance distance analysis. We compare the results of these methods at the different time scales and discuss the impact on science understanding of earthquake fault systems generally and recent large seismic events specifically, including the Tohoku-Oki earthquake in Japan and the El Mayor-Cucapah earthquake in Mexico.

  9. LAI, FAPAR and FCOVER products derived from AVHRR long time series: principles and evaluation

    NASA Astrophysics Data System (ADS)

    Verger, A.; Baret, F.; Weiss, M.; Lacaze, R.; Makhmara, H.; Pacholczyk, P.; Smets, B.; Kandasamy, S.; Vermote, E.

    2012-04-01

    products. Finally, quality assessment information as well as tentative quantitative uncertainties were proposed. The comparison of the resulting AVHRR LTDR products with actual GEOLAND series derived from VEGETATION demonstrates that they are very consistent, providing continuous time series of global observations of LAI, FAPAR and FCOVER for the last 30-year period, with continuation after 2011.

  10. High-Speed Time-Series CCD Photometry with Agile

    NASA Astrophysics Data System (ADS)

    Mukadam, Anjum S.; Owen, R.; Mannery, E.; MacDonald, N.; Williams, B.; Stauffer, F.; Miller, C.

    2011-12-01

We have assembled a high-speed time-series CCD photometer named Agile for the 3.5 m telescope at Apache Point Observatory, based on the design of a photometer called Argos at McDonald Observatory. Instead of a mechanical shutter, we use the frame-transfer operation of the CCD to end an exposure and initiate the subsequent new exposure. The frame-transfer operation is triggered by the negative edge of a GPS pulse; the instrument timing is controlled directly by hardware, without any software intervention or delays. This is the central pillar in the design of Argos that we have also used in Agile; this feature makes the accuracy of the instrument timing better than a millisecond. Agile is based on a Princeton Instruments Acton VersArray camera with a frame-transfer CCD, which has 1K × 1K active pixels, each of size . Using a focal reducer at the Nasmyth focus of the 3.5 m telescope at Apache Point Observatory, we obtain a field of view of 2.2 × 2.2 arcmin2 with an unbinned plate scale of 0.13″ pixel-1. The CCD is back-illuminated and thinned for improved blue sensitivity and provides a quantum efficiency ≥80% in the wavelength range of 4500–7500 Å. The unbinned full-frame readout time can be as fast as 1.1 s; this is achieved using a low-noise amplifier operating at 1 MHz with an average read noise of the order of rms. At the slow read rate of 100 kHz, to be used for exposure times longer than a few seconds, we determine an average read noise of the order of rms. Agile is optimized to observe variability at short timescales, from one-third of a second to several hundred seconds. The variable astronomical sources routinely observed with Agile include pulsating white dwarfs, cataclysmic variables, flare stars, planetary transits, and planetary satellite occultations.

  11. Is the length-of-day time series normally distributed?

    NASA Astrophysics Data System (ADS)

    Sen, A. K.; Niedzielski, T.; Kosek, W.

    2009-09-01

The non-tidal LOD data are analysed (data span 01.01.1962-09.01.2008) in order to provide the probabilistic characteristics of the Earth rotation rate fluctuations. The skewness of the LOD probability distribution is -0.31, indicating that the probability distribution is asymmetrical. Moreover, the residual non-tidal LOD data are considered (after removal of the semiannual, annual, 9.3-year, and 18.6-year oscillations, and the linear trend). The skewness of these residual data equals -0.64 and indicates an increased asymmetry in the distribution. For the non-transformed LOD data the kurtosis is 2.31, showing that the distribution is flattened. The kurtosis for the residuals is 5.64, indicating that the distribution is more peaked than a normal distribution. For the non-transformed LOD data, the Johnson SB distribution provides the best fit. For the residual LOD data, the Johnson SU distribution is found to be the most appropriate model. Both the LOD and its residual time series are appropriately modeled by probability laws that are different from a normal distribution.
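The skewness and kurtosis statistics quoted above can be computed directly from their population-moment definitions; a minimal sketch (using the abstract's kurtosis convention, where 3 corresponds to a normal distribution) is:

```python
def skewness(x):
    """Population skewness: third central moment over sigma^3."""
    n = len(x)
    m = sum(x) / n
    s2 = sum((v - m) ** 2 for v in x) / n
    m3 = sum((v - m) ** 3 for v in x) / n
    return m3 / s2 ** 1.5

def kurtosis(x):
    """Population kurtosis: fourth central moment over sigma^4
    (equals 3 for a normal distribution)."""
    n = len(x)
    m = sum(x) / n
    s2 = sum((v - m) ** 2 for v in x) / n
    m4 = sum((v - m) ** 4 for v in x) / n
    return m4 / s2 ** 2
```

A left-skewed sample (like the LOD residuals) gives negative skewness; kurtosis below 3 indicates a flattened distribution and above 3 a more peaked one, matching the 2.31 and 5.64 values in the abstract.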

  12. Stochastic time series analysis of fetal heart-rate variability

    NASA Astrophysics Data System (ADS)

    Shariati, M. A.; Dripps, J. H.

    1990-06-01

Fetal Heart Rate (FHR) is one of the important features of fetal biophysical activity, and its long-term monitoring is used for the antepartum (the period of pregnancy before labour) assessment of fetal well-being. As yet, however, no successful method has been proposed to quantitatively represent the variety of random non-white patterns seen in FHR. The objective of this paper is to address this issue. In this study the Box-Jenkins method of model identification and diagnostic checking was used on phonocardiographically derived (averaged) FHR time series. Models remained exclusively autoregressive (AR). Kalman filtering in conjunction with a maximum likelihood estimation technique forms the parametric estimator. Diagnostics performed on the residuals indicated that a second-order model may be adequate in capturing the type of variability observed in 1 to 2 min data windows of FHR. The scheme may be viewed as a means of data reduction of a highly redundant information source. This allows much more efficient transmission of FHR information from remote locations to places with the facilities and expertise for closer analysis. The extracted parameters are intended to reflect numerically the important FHR features that are normally picked up visually by experts for their assessments. As a result, long-term FHR recorded during the antepartum period could be screened quantitatively for the detection of patterns considered normal or abnormal.
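A second-order AR model like the one found adequate here can be fit from the first two sample autocorrelations via the Yule-Walker equations; below is a minimal sketch on synthetic AR(2) data (not fetal data, and simpler than the Kalman-filter/maximum-likelihood estimator the paper actually uses):

```python
import random

def autocorr(x, lag):
    """Sample autocorrelation at the given lag."""
    n = len(x)
    m = sum(x) / n
    c0 = sum((v - m) ** 2 for v in x) / n
    c = sum((x[t] - m) * (x[t + lag] - m) for t in range(n - lag)) / n
    return c / c0

def ar2_yule_walker(x):
    """Fit x[t] = a1*x[t-1] + a2*x[t-2] + e by solving the
    Yule-Walker equations  r1 = a1 + a2*r1,  r2 = a1*r1 + a2."""
    r1, r2 = autocorr(x, 1), autocorr(x, 2)
    det = 1.0 - r1 * r1
    a1 = r1 * (1.0 - r2) / det
    a2 = (r2 - r1 * r1) / det
    return a1, a2

# Synthetic AR(2) series with known coefficients (0.6, -0.3)
random.seed(1)
x = [0.0, 0.0]
for _ in range(5000):
    x.append(0.6 * x[-1] - 0.3 * x[-2] + random.gauss(0.0, 1.0))
a1, a2 = ar2_yule_walker(x)
```

Reducing a window of heart-rate samples to two AR coefficients (plus a residual variance) is exactly the kind of data reduction the abstract proposes for transmitting FHR information from remote sites.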

  13. Optimal model-free prediction from multivariate time series

    NASA Astrophysics Data System (ADS)

    Runge, Jakob; Donner, Reik V.; Kurths, Jürgen

    2015-05-01

    Forecasting a time series from multivariate predictors constitutes a challenging problem, especially using model-free approaches. Most techniques, such as nearest-neighbor prediction, quickly suffer from the curse of dimensionality and overfitting for more than a few predictors, which has limited their application mostly to the univariate case. Therefore, selection strategies are needed that harness the available information as efficiently as possible. Since often the right combination of predictors matters, ideally all subsets of possible predictors should be tested for their predictive power, but the exponentially growing number of combinations makes such an approach computationally prohibitive. Here a prediction scheme that overcomes this strong limitation is introduced, utilizing a causal preselection step which drastically reduces the number of possible predictors to the most predictive set of causal drivers, making a globally optimal search scheme tractable. The information-theoretic optimality is derived and practical selection criteria are discussed. As demonstrated for multivariate nonlinear stochastic delay processes, the optimal scheme can even be less computationally expensive than commonly used suboptimal schemes like forward selection. The method suggests a general framework to apply the optimal model-free approach to select variables and subsequently fit a model to further improve a prediction or learn statistical dependencies. The performance of this framework is illustrated on a climatological index of the El Niño Southern Oscillation.

  14. Chaos Time Series Prediction Based on Membrane Optimization Algorithms

    PubMed Central

    Li, Meng; Yi, Liangzhong; Pei, Zheng; Gao, Zhisheng

    2015-01-01

    This paper puts forward a prediction model based on membrane computing optimization algorithm for chaos time series; the model optimizes simultaneously the parameters of phase space reconstruction (τ, m) and least squares support vector machine (LS-SVM) (γ, σ) by using membrane computing optimization algorithm. It is an important basis for spectrum management to predict accurately the change trend of parameters in the electromagnetic environment, which can help decision makers to adopt an optimal action. Then, the model presented in this paper is used to forecast band occupancy rate of frequency modulation (FM) broadcasting band and interphone band. To show the applicability and superiority of the proposed model, this paper will compare the forecast model presented in it with conventional similar models. The experimental results show that whether single-step prediction or multistep prediction, the proposed model performs best based on three error measures, namely, normalized mean square error (NMSE), root mean square error (RMSE), and mean absolute percentage error (MAPE). PMID:25874249
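
    The phase-space reconstruction whose parameters (τ, m) the model optimizes can be sketched as a delay embedding; the function below is a generic implementation, not the paper's code, and the sine input and parameter values are illustrative.

```python
import numpy as np

def delay_embed(x, m, tau):
    """Phase-space reconstruction: rows are the m-dimensional delay
    vectors [x(t), x(t + tau), ..., x(t + (m - 1) * tau)]."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(m)])

x = np.sin(np.linspace(0, 20 * np.pi, 2000))
emb = delay_embed(x, m=3, tau=25)
print(emb.shape)  # (1950, 3)
```

Each row of `emb` is one reconstructed state vector; an LS-SVM (or any regressor) can then be trained to map each row to the next scalar value.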

  15. Time series analysis of Monte Carlo neutron transport calculations

    NASA Astrophysics Data System (ADS)

    Nease, Brian Robert

    A time series based approach is applied to the Monte Carlo (MC) fission source distribution to calculate the non-fundamental mode eigenvalues of the system. The approach applies Principal Oscillation Patterns (POPs) to the fission source distribution, transforming the problem into a simple autoregressive order one (AR(1)) process. Proof is provided that the stationary MC process is linear to first order approximation, which is a requirement for the application of POPs. The autocorrelation coefficient of the resulting AR(1) process corresponds to the ratio of the desired mode eigenvalue to the fundamental mode eigenvalue. All modern k-eigenvalue MC codes calculate the fundamental mode eigenvalue, so the desired mode eigenvalue can be easily determined. The strength of this approach is contrasted against the Fission Matrix method (FMM) in terms of accuracy versus computer memory constraints. Multi-dimensional problems are considered since the approach has strong potential for use in reactor analysis, and the implementation of the method into production codes is discussed. Lastly, the appearance of complex eigenvalues is investigated and solutions are provided.
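
    The core identity, that the lag-1 autocorrelation of the resulting AR(1) process equals the ratio of the desired-mode eigenvalue to the fundamental-mode eigenvalue, can be checked on a simulated series; the ratio value below is illustrative, not from a transport calculation.

```python
import numpy as np

rng = np.random.default_rng(2)

# AR(1) stand-in for a POP-projected fission-source coefficient series:
# its lag-1 autocorrelation plays the role of the eigenvalue ratio
# (higher mode / fundamental); the value 0.8 is illustrative.
ratio = 0.8
n = 50000
x = np.zeros(n)
for t in range(1, n):
    x[t] = ratio * x[t - 1] + rng.normal()

xc = x - x.mean()
rho1 = np.dot(xc[1:], xc[:-1]) / np.dot(xc, xc)
print(rho1)  # close to 0.8
```

Multiplying the recovered ratio by the fundamental-mode eigenvalue (which any k-eigenvalue MC code provides) then gives the desired mode eigenvalue.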

  16. Time Series Analysis of Symbiotic Stars and Cataclysmic Variables

    NASA Astrophysics Data System (ADS)

    Ren, Jiaying; MacLachlan, G.; Panchmal, A.; Dhuga, K.; Morris, D.

    2010-01-01

    Symbiotic stars (SSs) and Cataclysmic Variables (CVs) are two families of binary systems which occasionally vary in brightness because of accretion from the secondary star. High frequency oscillations, also known as flickering, are thought to occur because of turbulence in the accretion disk especially in and near the vicinity of the boundary layer between the surface of the compact object and the inner edge of the disk. Lower frequency oscillations are also observed but these are typically associated with the orbital and spin motions of the binary system and may be modulated by the presence of a magnetic field. By studying these variations, we probe the emission regions in these compact systems and gain a better understanding of the accretion process. Time-ordered series of apparent magnitudes for several SSs and CVs, obtained from the American Association of Variable Star Observers (AAVSO), have been analyzed. The analysis techniques include Power Spectral Densities, Rescaled R/S Analysis, and Discrete Wavelet Transforms. The results are used to estimate a Hurst exponent which is a measure of long-range memory dependence and self-similarity.
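
    A bare-bones version of the rescaled-range (R/S) step used to estimate the Hurst exponent is sketched below on white noise, for which the exponent should come out near 0.5; the window sizes are illustrative, and the AAVSO light curves themselves are not reproduced.

```python
import numpy as np

def hurst_rs(x, window_sizes):
    """Estimate the Hurst exponent by rescaled-range (R/S) analysis:
    slope of log(mean R/S) versus log(window size)."""
    logs_n, logs_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            z = np.cumsum(w - w.mean())  # cumulative deviation from the mean
            r = z.max() - z.min()        # range of the cumulative deviation
            s = w.std()                  # standard deviation of the window
            if s > 0:
                rs_vals.append(r / s)
        logs_n.append(np.log(n))
        logs_rs.append(np.log(np.mean(rs_vals)))
    return np.polyfit(logs_n, logs_rs, 1)[0]

rng = np.random.default_rng(3)
h_white = hurst_rs(rng.normal(size=8192), [16, 32, 64, 128, 256])
print(h_white)  # near 0.5 for uncorrelated noise (small-sample bias pushes it up slightly)
```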

  17. Enhancing time-series detection algorithms for automated biosurveillance.

    PubMed

    Tokars, Jerome I; Burkom, Howard; Xing, Jian; English, Roseanne; Bloom, Steven; Cox, Kenneth; Pavlin, Julie A

    2009-04-01

    BioSense is a US national system that uses data from health information systems for automated disease surveillance. We studied 4 time-series algorithm modifications designed to improve sensitivity for detecting artificially added data. To test these modified algorithms, we used reports of daily syndrome visits from 308 Department of Defense (DoD) facilities and 340 hospital emergency departments (EDs). At a constant alert rate of 1%, sensitivity was improved for both datasets by using a minimum standard deviation (SD) of 1.0, a 14-28 day baseline duration for calculating mean and SD, and an adjustment for total clinic visits as a surrogate denominator. Stratifying baseline days into weekdays versus weekends to account for day-of-week effects increased sensitivity for the DoD data but not for the ED data. These enhanced methods may increase sensitivity without increasing the alert rate and may improve the ability to detect outbreaks by using automated surveillance system data.
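
    A minimal sketch of the kind of alerting rule the modifications target, assuming a simple mean-plus-3-SD detector: the SD is floored at a minimum value and the baseline is a trailing window, two of the modifications described above. The Poisson counts and the injected signal are synthetic, not BioSense data.

```python
import numpy as np

def alerts(counts, baseline=28, min_sd=1.0, z_thresh=3.0):
    """Flag days whose count exceeds mean + z_thresh * SD of the trailing
    `baseline` days, with the SD floored at `min_sd`."""
    flags = []
    for t in range(baseline, len(counts)):
        base = counts[t - baseline:t]
        sd = max(np.std(base), min_sd)
        flags.append(counts[t] > base.mean() + z_thresh * sd)
    return flags

rng = np.random.default_rng(4)
series = rng.poisson(5, size=60).astype(float)
series[50] += 25.0  # artificially added outbreak signal on day 50
flags = alerts(series)
print(flags[50 - 28])  # the injected day is flagged
```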

  18. Discriminating additive from dynamical noise for chaotic time series.

    PubMed

    Strumik, Marek; Macek, Wiesław M; Redaelli, Stefano

    2005-09-01

    We consider the dynamics of the Hénon and Ikeda maps in the presence of additive and dynamical noise. We show that, from the point of view of computations of some statistical quantities, dynamical noise corrupting these deterministic systems can be considered effectively as an additive "pseudonoise" with the Cauchy distribution. In the case of the Hénon and Ikeda maps, this effect occurs only for one variable of the system, while the noise corrupting the second variable is still Gaussian distributed, independently of the distribution of the dynamical noise. Based on these results and using scaling properties of the correlation entropy, we propose a simple method of discriminating additive from dynamical noise. This approach is also useful for estimating the noise level of chaotic time series. We show that the proposed method works well over a wide range of noise levels, provided that one kind of noise predominates and that we analyze the variable of the system for which the contamination follows a Cauchy-like distribution in the presence of dynamical noise.

  19. Physiological time-series analysis: what does regularity quantify?

    NASA Technical Reports Server (NTRS)

    Pincus, S. M.; Goldberger, A. L.

    1994-01-01

    Approximate entropy (ApEn) is a recently developed statistic quantifying regularity and complexity that appears to have potential application to a wide variety of physiological and clinical time-series data. The focus here is to provide a better understanding of ApEn to facilitate its proper utilization, application, and interpretation. After giving the formal mathematical description of ApEn, we provide a multistep description of the algorithm as applied to two contrasting clinical heart rate data sets. We discuss algorithm implementation and interpretation and introduce a general mathematical hypothesis of the dynamics of a wide class of diseases, indicating the utility of ApEn to test this hypothesis. We indicate the relationship of ApEn to variability measures, the Fourier spectrum, and algorithms motivated by study of chaotic dynamics. We discuss further mathematical properties of ApEn, including the choice of input parameters, statistical issues, and modeling considerations, and we conclude with a section on caveats to ensure correct ApEn utilization.
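
    A direct, self-contained implementation of the ApEn definition (template matching with Chebyshev distance and the conventional tolerance r = 0.2 × SD) illustrates the regularity contrast the abstract discusses: a regular signal scores lower than uncorrelated noise. The test signals are synthetic, not the clinical heart-rate data.

```python
import numpy as np

def approx_entropy(x, m=2, r=None):
    """Approximate entropy ApEn(m, r), computed directly from its
    definition (Chebyshev distance, self-matches included)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()  # conventional tolerance: 20% of the SD
    def phi(mm):
        n = len(x) - mm + 1
        emb = np.array([x[i:i + mm] for i in range(n)])
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        return np.log((d <= r).mean(axis=1)).mean()
    return phi(m) - phi(m + 1)

rng = np.random.default_rng(5)
apen_reg = approx_entropy(np.sin(np.linspace(0, 40 * np.pi, 1000)))
apen_noise = approx_entropy(rng.normal(size=1000))
print(apen_reg, apen_noise)  # the regular signal scores lower
```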

  20. Nonlinear Time Series Analysis in Earth Sciences - Potentials and Pitfalls

    NASA Astrophysics Data System (ADS)

    Kurths, Jürgen; Donges, Jonathan F.; Donner, Reik V.; Marwan, Norbert; Zou, Yong

    2010-05-01

    The application of methods of nonlinear time series analysis has a rich tradition in Earth sciences and has enabled substantially new insights into various complex processes there. However, some approaches and findings have been controversially discussed over the last decades. One reason is that they are often based on strong restrictions, whose violation may lead to pitfalls and misinterpretations. Here, we discuss three general concepts of nonlinear dynamics and statistical physics (synchronization, recurrence, and complex networks) and explain how to use them for data analysis. We show that the corresponding methods can be applied even to rather short and non-stationary data, which are typical in Earth sciences. References: Marwan, N., Romano, M., Thiel, M., Kurths, J.: Recurrence plots for the analysis of complex systems, Physics Reports 438, 237-329 (2007); Arenas, A., Diaz-Guilera, A., Kurths, J., Moreno, Y., Zhou, C.: Synchronization in complex networks, Physics Reports 469, 93-153 (2008); Marwan, N., Donges, J.F., Zou, Y., Donner, R. and Kurths, J., Phys. Lett. A 373, 4246 (2009); Donges, J.F., Zou, Y., Marwan, N. and Kurths, J., Europhys. Lett. 87, 48007 (2009); Donner, R., Zou, Y., Donges, J.F., Marwan, N. and Kurths, J., Phys. Rev. E 81, 015101(R) (2010)

  1. On the Fourier and Wavelet Analysis of Coronal Time Series

    NASA Astrophysics Data System (ADS)

    Auchère, F.; Froment, C.; Bocchialini, K.; Buchlin, E.; Solomon, J.

    2016-07-01

    Using Fourier and wavelet analysis, we critically re-assess the significance of our detection of periodic pulsations in coronal loops. We show that the proper identification of the frequency dependence and statistical properties of the different components of the power spectra provides a strong argument against the common practice of data detrending, which tends to produce spurious detections around the cut-off frequency of the filter. In addition, the white and red noise models built into the widely used wavelet code of Torrence & Compo cannot, in most cases, adequately represent the power spectra of coronal time series, thus also possibly causing false positives. Both effects suggest that several reports of periodic phenomena should be re-examined. The Torrence & Compo code nonetheless effectively computes rigorous confidence levels if provided with pertinent models of mean power spectra, and we describe the appropriate manner in which to call its core routines. We recall the meaning of the default confidence levels output from the code, and we propose new Monte-Carlo-derived levels that take into account the total number of degrees of freedom in the wavelet spectra. These improvements allow us to confirm that the power peaks that we detected have a very low probability of being caused by noise.

  2. A Time Series Approach for Soil Moisture Estimation

    NASA Technical Reports Server (NTRS)

    Kim, Yunjin; vanZyl, Jakob

    2006-01-01

    Soil moisture is a key parameter in understanding the global water cycle and in predicting natural hazards. Polarimetric radar measurements have been used for estimating soil moisture of bare surfaces. In order to estimate soil moisture accurately, the surface roughness effect must be compensated properly. In addition, these algorithms will not produce accurate results for vegetated surfaces. It is difficult to retrieve soil moisture of a vegetated surface since the radar backscattering cross section is sensitive to the vegetation structure and environmental conditions such as the ground slope. Therefore, it is necessary to develop a method to estimate the effect of the surface roughness and vegetation reliably. One way to remove the roughness effect and the vegetation contamination is to take advantage of the temporal variation of soil moisture. In order to understand the global hydrologic cycle, it is desirable to measure soil moisture with one- to two-days revisit. Using these frequent measurements, a time series approach can be implemented to improve the soil moisture retrieval accuracy.

  3. Financial time series analysis based on effective phase transfer entropy

    NASA Astrophysics Data System (ADS)

    Yang, Pengbo; Shang, Pengjian; Lin, Aijing

    2017-02-01

    Transfer entropy is a powerful technique which is able to quantify the impact of one dynamic system on another system. In this paper, we propose the effective phase transfer entropy method based on the transfer entropy method. We use simulated data to test the performance of this method, and the experimental results confirm that the proposed approach is capable of detecting the information transfer between the systems. We also explore the relationship between effective phase transfer entropy and some variables, such as data size, coupling strength and noise. The effective phase transfer entropy is positively correlated with the data size and the coupling strength. Even in the presence of a large amount of noise, it can detect the information transfer between systems, and it is very robust to noise. Moreover, this measure is indeed able to accurately estimate the information flow between systems compared with phase transfer entropy. In order to reflect the application of this method in practice, we apply this method to financial time series and gain new insight into the interactions between systems. It is demonstrated that the effective phase transfer entropy can be used to detect some economic fluctuations in the financial market. To summarize, the effective phase transfer entropy method is a very efficient tool to estimate the information flow between systems.
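
    As a simplified stand-in for the effective phase transfer entropy (which operates on instantaneous phases and includes a bias-correction step), the snippet below estimates plain binned transfer entropy between two coupled synthetic series; the coupling, sample size, and bin count are all illustrative.

```python
import numpy as np

def transfer_entropy(x, y, bins=4):
    """Binned transfer entropy TE(x -> y) =
    sum p(y+, y, x) * log[ p(y+ | y, x) / p(y+ | y) ]  (in nats)."""
    q = np.linspace(0, 1, bins + 1)[1:-1]
    xd = np.digitize(x, np.quantile(x, q))
    yd = np.digitize(y, np.quantile(y, q))
    trip = np.stack([yd[1:], yd[:-1], xd[:-1]]).T  # (y+, y, x)
    p_xyz, _ = np.histogramdd(trip, bins=(bins, bins, bins))
    p_xyz /= p_xyz.sum()
    p_yy = p_xyz.sum(axis=2)      # p(y+, y)
    p_yx = p_xyz.sum(axis=0)      # p(y, x)
    p_y = p_xyz.sum(axis=(0, 2))  # p(y)
    te = 0.0
    for i in range(bins):
        for j in range(bins):
            for k in range(bins):
                p = p_xyz[i, j, k]
                if p > 0:
                    te += p * np.log(p * p_y[j] / (p_yy[i, j] * p_yx[j, k]))
    return te

rng = np.random.default_rng(6)
x = rng.normal(size=5000)
y = np.roll(x, 1) + 0.5 * rng.normal(size=5000)  # y is driven by past x
te_xy = transfer_entropy(x, y)
te_yx = transfer_entropy(y, x)
print(te_xy, te_yx)  # information flows x -> y, not y -> x
```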

  4. Innovative techniques to analyze time series of geomagnetic activity indices

    NASA Astrophysics Data System (ADS)

    Balasis, Georgios; Papadimitriou, Constantinos; Daglis, Ioannis A.; Potirakis, Stelios M.; Eftaxias, Konstantinos

    2016-04-01

    Magnetic storms are undoubtedly among the most important phenomena in space physics and also a central subject of space weather. The non-extensive Tsallis entropy has recently been introduced as an effective complexity measure for the analysis of the geomagnetic activity Dst index. The Tsallis entropy sensitively shows the complexity dissimilarity among different "physiological" (normal) and "pathological" states (intense magnetic storms). More precisely, the Tsallis entropy implies the emergence of two distinct patterns: (i) a pattern associated with intense magnetic storms, which is characterized by a higher degree of organization, and (ii) a pattern associated with normal periods, which is characterized by a lower degree of organization. Other entropy measures such as Block Entropy, T-Complexity, Approximate Entropy, Sample Entropy and Fuzzy Entropy verify the above-mentioned result. Importantly, wavelet spectral analysis in terms of the Hurst exponent, H, also shows the existence of two different patterns: (i) a pattern associated with intense magnetic storms, characterized by fractional Brownian persistent behavior, and (ii) a pattern associated with normal periods, characterized by fractional Brownian anti-persistent behavior. Finally, we observe universality in magnetic storm and earthquake dynamics, on the basis of a modified form of the Gutenberg-Richter law for the Tsallis statistics. This finding suggests a common approach to the interpretation of both phenomena in terms of the same driving physical mechanism. Signatures of discrete scale invariance in Dst time series further support the aforementioned proposal.

  5. Optimal model-free prediction from multivariate time series.

    PubMed

    Runge, Jakob; Donner, Reik V; Kurths, Jürgen

    2015-05-01

    Forecasting a time series from multivariate predictors constitutes a challenging problem, especially using model-free approaches. Most techniques, such as nearest-neighbor prediction, quickly suffer from the curse of dimensionality and overfitting for more than a few predictors, which has limited their application mostly to the univariate case. Therefore, selection strategies are needed that harness the available information as efficiently as possible. Since often the right combination of predictors matters, ideally all subsets of possible predictors should be tested for their predictive power, but the exponentially growing number of combinations makes such an approach computationally prohibitive. Here a prediction scheme that overcomes this strong limitation is introduced, utilizing a causal preselection step which drastically reduces the number of possible predictors to the most predictive set of causal drivers, making a globally optimal search scheme tractable. The information-theoretic optimality is derived and practical selection criteria are discussed. As demonstrated for multivariate nonlinear stochastic delay processes, the optimal scheme can even be less computationally expensive than commonly used suboptimal schemes like forward selection. The method suggests a general framework to apply the optimal model-free approach to select variables and subsequently fit a model to further improve a prediction or learn statistical dependencies. The performance of this framework is illustrated on a climatological index of the El Niño Southern Oscillation.

  6. The PRIMAP-hist national historical emissions time series

    NASA Astrophysics Data System (ADS)

    Gütschow, Johannes; Jeffery, M. Louise; Gieseke, Robert; Gebel, Ronja; Stevens, David; Krapp, Mario; Rocha, Marcia

    2016-11-01

    To assess the history of greenhouse gas emissions and individual countries' contributions to emissions and climate change, detailed historical data are needed. We combine several published datasets to create a comprehensive set of emissions pathways for each country and Kyoto gas, covering the years 1850 to 2014 with yearly values, for all UNFCCC member states and most non-UNFCCC territories. The sectoral resolution is that of the main IPCC 1996 categories. Additional time series of CO2 are available for energy and industry subsectors. Country-resolved data are combined from different sources and supplemented using year-to-year growth rates from regionally resolved sources and numerical extrapolations to complete the dataset. Regional deforestation emissions are downscaled to country level using estimates of the deforested area obtained from potential vegetation and simulations of agricultural land. In this paper, we discuss the data sources and methods used and present the resulting dataset, including its limitations and uncertainties. The dataset is available from doi:10.5880/PIK.2016.003 and can be viewed on the website accompanying this paper (http://www.pik-potsdam.de/primap-live/primap-hist/).

  7. Scene Context Dependency of Pattern Constancy of Time Series Imagery

    NASA Technical Reports Server (NTRS)

    Woodell, Glenn A.; Jobson, Daniel J.; Rahman, Zia-ur

    2008-01-01

    A fundamental element of future generic pattern recognition technology is the ability to extract similar patterns for the same scene despite wide-ranging extraneous variables, including lighting, turbidity, sensor exposure variations, and signal noise. In the process of demonstrating pattern constancy of this kind for retinex/visual servo (RVS) image enhancement processing, we found that the pattern constancy performance depended somewhat on scene content. Most notably, the scene topography and, in particular, the scale and extent of the topography in an image, affect the pattern constancy the most. This paper explores these effects in more depth and presents experimental data from several time series tests. These results further quantify the impact of topography on pattern constancy. Despite this residual inconstancy, the results of overall pattern constancy testing support the idea that RVS image processing can be a universal front-end for generic visual pattern recognition. While the effects on pattern constancy were significant, RVS processing still achieves a high degree of pattern constancy over a wide spectrum of scene content diversity and wide-ranging extraneous variations in lighting, turbidity, and sensor exposure.

  8. Imputation of missing data in time series for air pollutants

    NASA Astrophysics Data System (ADS)

    Junger, W. L.; Ponce de Leon, A.

    2015-02-01

    Missing data are a major concern in epidemiological studies of the health effects of environmental air pollutants. This article presents an imputation-based method that is suitable for multivariate time series data, which uses the EM algorithm under the assumption of a normal distribution. Different approaches are considered for filtering the temporal component. A simulation study was performed to assess the validity and performance of the proposed method in comparison with some frequently used methods. Simulations showed that when the amount of missing data was as low as 5%, the complete data analysis yielded satisfactory results regardless of the generating mechanism of the missing data, whereas the validity began to degenerate when the proportion of missing values exceeded 10%. The proposed imputation method exhibited good accuracy and precision in different settings with respect to the patterns of missing observations. Most of the imputations yielded valid results, even under missing not at random. The methods proposed in this study are implemented as a package called mtsdi for the statistical software system R.
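
    A minimal EM-style imputation for data assumed multivariate normal, in the spirit of (but much simpler than) the mtsdi approach, can be sketched as alternating conditional-mean imputation (E-step) with mean/covariance re-estimation (M-step); the temporal filtering component is omitted, and the two correlated "pollutant" columns are synthetic.

```python
import numpy as np

def em_impute(data, n_iter=50):
    """EM-style imputation for (assumed) multivariate-normal data:
    alternate conditional-mean imputation with mean/covariance updates."""
    x = data.copy()
    miss = np.isnan(x)
    col_mean = np.nanmean(x, axis=0)
    x[miss] = np.take(col_mean, np.where(miss)[1])  # crude initial fill
    for _ in range(n_iter):
        mu = x.mean(axis=0)
        cov = np.cov(x, rowvar=False)
        for i in range(x.shape[0]):
            m = miss[i]
            if m.any() and not m.all():
                o = ~m
                # conditional mean of missing entries given observed ones
                reg = cov[np.ix_(m, o)] @ np.linalg.pinv(cov[np.ix_(o, o)])
                x[i, m] = mu[m] + reg @ (x[i, o] - mu[o])
    return x

rng = np.random.default_rng(7)
z = rng.normal(size=(500, 1))
# two strongly correlated synthetic "pollutant" series
data = np.hstack([z + 0.1 * rng.normal(size=(500, 1)),
                  z + 0.1 * rng.normal(size=(500, 1))])
truth = data[0, 1]
data[0, 1] = np.nan
err = abs(em_impute(data)[0, 1] - truth)
print(err)  # small, since each column carries most of the other's signal
```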

  9. Chaotic time series analysis of vision evoked EEG

    NASA Astrophysics Data System (ADS)

    Zhang, Ningning; Wang, Hong

    2009-12-01

    To investigate human brain activity during aesthetic processing, a beautiful woman's face picture and an ugly buffoon's face picture were used. Twelve subjects were assigned the aesthetic processing task while the electroencephalogram (EEG) was recorded. Event-related brain potentials (ERPs) were acquired from the 32 scalp electrodes, and the ugly buffoon picture produced larger amplitudes for the N1, P2, N2, and late slow wave components. Average ERPs from the ugly buffoon picture were larger than those from the beautiful woman picture. The ERP signals show that the ugly buffoon elicited higher emotion waves than the beautiful woman face, because of the expression on the buffoon's face. Chaotic time series analysis was then carried out to calculate the largest Lyapunov exponent, using the small-data-set method, and the correlation dimension, using the G-P (Grassberger-Procaccia) algorithm. The results show that the largest Lyapunov exponents of the ERP signals are greater than zero, which indicates that the ERP signals may be chaotic. The correlation dimensions obtained from the beautiful woman picture are larger than those from the ugly buffoon picture, suggesting that the beautiful face can excite the brain nerve cells. The research in this paper provides persuasive support for the view that the cerebrum's activity is chaotic under some picture stimuli.
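
    The G-P (Grassberger-Procaccia) correlation-dimension step can be sketched directly from its definition: the scaling of the correlation sum C(r) with r. As a sanity check it is run here on a point set of known dimension (a uniformly filled square, dimension 2) rather than on ERP data; the radii are illustrative.

```python
import numpy as np

def correlation_dimension(points, r_vals):
    """Grassberger-Procaccia estimate: slope of log C(r) versus log r,
    where C(r) is the fraction of point pairs closer than r."""
    diffs = points[:, None, :] - points[None, :, :]
    d = np.linalg.norm(diffs, axis=2)
    dists = d[np.triu_indices(len(points), k=1)]  # distinct pairs only
    c = np.array([(dists < r).mean() for r in r_vals])
    return np.polyfit(np.log(r_vals), np.log(c), 1)[0]

rng = np.random.default_rng(10)
# sanity check on a set of known dimension: a uniformly filled 2-D square
square = rng.uniform(size=(1500, 2))
d2 = correlation_dimension(square, np.array([0.05, 0.1, 0.2]))
print(d2)  # close to 2 (edge effects pull it slightly below)
```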

  10. Chaotic time series analysis of vision evoked EEG

    NASA Astrophysics Data System (ADS)

    Zhang, Ningning; Wang, Hong

    2010-01-01

    To investigate human brain activity during aesthetic processing, a beautiful woman's face picture and an ugly buffoon's face picture were used. Twelve subjects were assigned the aesthetic processing task while the electroencephalogram (EEG) was recorded. Event-related brain potentials (ERPs) were acquired from the 32 scalp electrodes, and the ugly buffoon picture produced larger amplitudes for the N1, P2, N2, and late slow wave components. Average ERPs from the ugly buffoon picture were larger than those from the beautiful woman picture. The ERP signals show that the ugly buffoon elicited higher emotion waves than the beautiful woman face, because of the expression on the buffoon's face. Chaotic time series analysis was then carried out to calculate the largest Lyapunov exponent, using the small-data-set method, and the correlation dimension, using the G-P (Grassberger-Procaccia) algorithm. The results show that the largest Lyapunov exponents of the ERP signals are greater than zero, which indicates that the ERP signals may be chaotic. The correlation dimensions obtained from the beautiful woman picture are larger than those from the ugly buffoon picture, suggesting that the beautiful face can excite the brain nerve cells. The research in this paper provides persuasive support for the view that the cerebrum's activity is chaotic under some picture stimuli.

  11. Time-series analysis of offshore-wind-wave groupiness

    SciTech Connect

    Liang, H.B.

    1988-01-01

    This research applies basic time-series-analysis techniques to the complex envelope function, which is of direct interest for the study of offshore-wind-wave groupiness. In constructing the complex envelope function, a phase-unwrapping technique is integrated into the algorithm for estimating the carrier frequency and preserving the phase information for further studies. The Gaussian random wave model forms the basis of the wave-group statistics by envelope-amplitude crossings. Good agreement between the theory and the analysis of field records is found. Other linear models, such as the individual-waves approach and the energy approach, are compared to the envelope approach by analyzing the same set of records. It is found that the character of the filter used in each approach dominates the wave-group statistics. Analyses indicate that deep offshore wind waves are weakly nonlinear and that the Gaussian random assumption remains appropriate for describing the sea state. Wave-group statistics derived from the Gaussian random wave model thus become applicable.

  12. Landslide monitoring using airphotos time series and GIS

    NASA Astrophysics Data System (ADS)

    Kavoura, Katerina; Nikolakopoulos, Konstantinos G.; Sabatakakis, Nikolaos

    2014-10-01

    Western Greece suffers from landslides. The term landslide covers a wide range of ground movements, such as slides, falls, and flows, driven mainly by gravity with the aid of many conditioning and triggering factors. Landslides provoke enormous changes to the natural and artificial relief, and the annual cost of repairing the damage amounts to millions of euros. In this paper a combined use of airphoto time series, high-resolution remote sensing data and GIS for landslide monitoring is presented. The analog and digital airphotos used cover a period of almost 70 years, from 1945 until 2012: classical analog airphotos cover the period from 1945 to 2000, while digital airphotos and satellite images cover the 2008-2012 period. The airphotos have been orthorectified using the Leica Photogrammetry Suite, with ground control points and a high-accuracy DSM. The 2008 digital airphoto mosaic from the Greek Cadastral, with a spatial resolution of 25 cm, and the respective DSM were used as the base map for all the other data sets. The RMS error was less than 0.5 pixel. Changes to artificial constructions provoked by the landslides were digitized and then implemented in an ArcGIS database.

  13. Assessing Spontaneous Combustion Instability with Nonlinear Time Series Analysis

    NASA Technical Reports Server (NTRS)

    Eberhart, C. J.; Casiano, M. J.

    2015-01-01

    Considerable interest lies in the ability to characterize the onset of spontaneous instabilities within liquid propellant rocket engine (LPRE) combustion devices. Linear techniques, such as fast Fourier transforms, various correlation parameters, and critical damping parameters, have been used at great length for over fifty years. Recently, nonlinear time series methods have been applied to deduce information pertaining to instability incipiency hidden in seemingly stochastic combustion noise. A technique commonly used in the biological sciences, known as Multifractal Detrended Fluctuation Analysis, has been extended to the combustion dynamics field and is introduced here as a data analysis approach complementary to linear ones. Building on this, a modified technique is leveraged to extract artifacts of impending combustion instability that present themselves prior to growth to limit-cycle amplitudes. The analysis is demonstrated on data from J-2X gas generator testing during which a distinct spontaneous instability was observed. Comparisons are made to previous work wherein the data were characterized using linear approaches. Verification of the technique is performed by examining idealized signals and comparing two separate, independently developed tools.

  14. River flow time series using least squares support vector machines

    NASA Astrophysics Data System (ADS)

    Samsudin, R.; Saad, P.; Shabri, A.

    2011-06-01

    This paper proposes a novel hybrid forecasting model known as GLSSVM, which combines the group method of data handling (GMDH) and the least squares support vector machine (LSSVM). The GMDH is used to determine the useful input variables which work as the time series forecasting for the LSSVM model. Monthly river flow data from two stations, the Selangor and Bernam rivers in Selangor state of Peninsular Malaysia were taken into consideration in the development of this hybrid model. The performance of this model was compared with the conventional artificial neural network (ANN) models, Autoregressive Integrated Moving Average (ARIMA), GMDH and LSSVM models using the long term observations of monthly river flow discharge. The root mean square error (RMSE) and coefficient of correlation (R) are used to evaluate the models' performances. In both cases, the new hybrid model has been found to provide more accurate flow forecasts compared to the other models. The results of the comparison indicate that the new hybrid model is a useful tool and a promising new method for river flow forecasting.
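
    A toy version of the lagged-regression setup can be sketched as follows, with ridge regression standing in for the LSSVM (whose linear-system solution it resembles) and a hand-picked lag set instead of GMDH input selection; the synthetic monthly series with an annual cycle is illustrative, not the Selangor or Bernam river data. The RMSE evaluation mirrors one of the criteria used above.

```python
import numpy as np

def lagged_matrix(x, lags):
    """Build (X, y): each row of X holds the `lags` values preceding y."""
    X = np.array([x[i - lags:i] for i in range(lags, len(x))])
    return X, x[lags:]

rng = np.random.default_rng(9)
t = np.arange(600)
flow = 10 + 3 * np.sin(2 * np.pi * t / 12) + 0.3 * rng.normal(size=600)

X, y = lagged_matrix(flow, lags=12)
Xtr, ytr, Xte, yte = X[:-100], y[:-100], X[-100:], y[-100:]
lam = 1.0  # ridge penalty (illustrative)
w = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(12), Xtr.T @ ytr)
rmse = np.sqrt(np.mean((Xte @ w - yte) ** 2))
print(rmse)  # small: the 12-month lag captures the annual cycle
```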

  15. Detection of a sudden change of the field time series based on the Lorenz system.

    PubMed

    Da, ChaoJiu; Li, Fang; Shen, BingLu; Yan, PengCheng; Song, Jian; Ma, DeShan

    2017-01-01

    We conducted an exploratory study of the detection of a sudden change of the field time series based on the numerical solution of the Lorenz system. First, the times when the Lorenz path jumped between the regions on the left and right of the equilibrium point of the Lorenz system were quantitatively marked, giving the sudden change times of the Lorenz system. Second, the numerical solution of the Lorenz system was regarded as a vector; thus, this solution could be considered a vector time series. We transformed the vector time series into a scalar time series using the vector inner product, considering the geometric and topological features of the Lorenz system path. Third, sudden changes in the resulting time series were detected using the sliding t-test method. Comparing the test results with the quantitatively marked times indicated that the method could detect every sudden change of the Lorenz path; thus, the method is effective. Finally, we used the method to detect sudden changes in pressure field and temperature field time series, obtaining good results for both, which indicates that the method can be applied to high-dimensional vector time series. Mathematically, there is no essential difference between field time series and vector time series; thus, we provide a new method for the detection of a sudden change of the field time series.
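
    The sliding t-test detection step can be sketched generically: compare the samples before and after each candidate point and take the index of the largest |t| statistic. The step-change series below is synthetic, not the Lorenz-derived or meteorological data used in the paper, and the window length is illustrative.

```python
import numpy as np
from scipy import stats

def sliding_t(x, window=50):
    """Sliding t-test: t statistic comparing the `window` points before
    and after each index; a large |t| marks a candidate sudden change."""
    t_vals = np.full(len(x), np.nan)
    for i in range(window, len(x) - window):
        t_vals[i] = stats.ttest_ind(x[i - window:i], x[i:i + window])[0]
    return t_vals

rng = np.random.default_rng(8)
x = np.concatenate([rng.normal(0, 1, 300), rng.normal(3, 1, 300)])
change = int(np.nanargmax(np.abs(sliding_t(x))))
print(change)  # near the true change point at index 300
```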

  16. Detection of a sudden change of the field time series based on the Lorenz system

    PubMed Central

    Li, Fang; Shen, BingLu; Yan, PengCheng; Song, Jian; Ma, DeShan

    2017-01-01

    We conducted an exploratory study of the detection of a sudden change of the field time series based on the numerical solution of the Lorenz system. First, the times at which the Lorenz path jumped between the regions to the left and right of the equilibrium point of the Lorenz system were quantitatively marked, giving the sudden change times of the Lorenz system. Second, the numerical solution of the Lorenz system was regarded as a vector, so the solution could be treated as a vector time series. We transformed the vector time series into a scalar time series using the vector inner product, taking into account the geometric and topological features of the Lorenz system path. Third, sudden changes in the resulting time series were detected using the sliding t-test method. Comparing the test results with the quantitatively marked times showed that the method detected every sudden change of the Lorenz path; the method is therefore effective. Finally, we used the method to detect sudden changes in the pressure field time series and the temperature field time series, obtaining good results for both, which indicates that the method can be applied to high-dimensional vector time series. Mathematically, there is no essential difference between a field time series and a vector time series; we thus provide a new method for the detection of a sudden change of the field time series. PMID:28141832

  17. Reconstructing Ocean Circulation using Coral Δ14C Time Series

    SciTech Connect

    Kashgarian, M; Guilderson, T P

    2001-02-23

    the invasion of fossil fuel CO₂ and bomb ¹⁴C into the atmosphere and surface oceans. Therefore the Δ¹⁴C data produced in this study can be used to validate the ocean uptake of fossil fuel CO₂ in coupled ocean-atmosphere models. This study takes advantage of the quasi-conservative nature of ¹⁴C as a water mass tracer by using Δ¹⁴C time series in corals to identify changes in the shallow circulation of the Pacific. Although the data themselves provide fundamental information on surface water mass movement, the true strength lies in a combined approach that is greater than its individual parts: the data help uncover deficiencies in ocean circulation models, and the model results place long Δ¹⁴C time series in a dynamic framework that helps identify the locations where additional observations are most needed.

  18. A time-series approach to dynamical systems from classical and quantum worlds

    SciTech Connect

    Fossion, Ruben

    2014-01-08

    This contribution discusses some recent applications of time-series analysis in Random Matrix Theory (RMT), and applications of RMT in the statistical analysis of eigenspectra of correlation matrices of multivariate time series.
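    A standard RMT analysis of the kind this abstract refers to compares the eigenvalue spectrum of an empirical correlation matrix against the Marchenko-Pastur law for uncorrelated data; eigenvalues outside the MP bounds carry genuine structure. A minimal sketch (the sizes N and T are illustrative assumptions, and the data here are pure noise):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N, T = 50, 500                      # N time series, each of length T (hypothetical sizes)
    X = rng.standard_normal((N, T))     # i.i.d. noise stands in for real multivariate data

    C = np.corrcoef(X)                  # N x N empirical correlation matrix
    eigvals = np.linalg.eigvalsh(C)

    # Marchenko-Pastur bounds for uncorrelated data with aspect ratio q = N/T
    q = N / T
    lam_min = (1 - np.sqrt(q)) ** 2
    lam_max = (1 + np.sqrt(q)) ** 2
    inside = np.mean((eigvals >= lam_min) & (eigvals <= lam_max))
    print(f"fraction of eigenvalues within MP bounds: {inside:.2f}")
    ```

    For pure noise nearly all eigenvalues fall inside [lam_min, lam_max]; for real correlated series (financial returns, physiological signals, eigenspectra of quantum systems), the eigenvalues escaping those bounds are the statistically significant ones.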

  19. A time-series approach to dynamical systems from classical and quantum worlds

    NASA Astrophysics Data System (ADS)

    Fossion, Ruben

    2014-01-01

    This contribution discusses some recent applications of time-series analysis in Random Matrix Theory (RMT), and applications of RMT in the statistical analysis of eigenspectra of correlation matrices of multivariate time series.

  20. 3D City Transformations by Time Series of Aerial Images

    NASA Astrophysics Data System (ADS)

    Adami, A.

    2015-02-01

    Recent photogrammetric applications based on dense image matching algorithms make it possible to use not only images acquired by digital cameras, amateur or not, but also to recover the vast heritage of analogue photographs. This opens up many possibilities for the use and enhancement of existing photographic heritage. The search for the original appearance of old buildings, the virtual reconstruction of vanished architecture and the study of urban development are some of the application areas that exploit the great cultural heritage of photography. Nevertheless, there are some restrictions on the use of historical images for the automatic reconstruction of buildings, such as image quality, availability of camera parameters and ineffective geometry of image acquisition. These constraints are very hard to overcome, and for the above reasons it is difficult to find good datasets for terrestrial close-range photogrammetry. Even the photographic archives of museums and heritage authorities, while holding a wealth of documentation, contain no dataset suited to a dense image matching approach. Compared to the vast collection of historical photos, the class of aerial photos meets both criteria stated above. In this paper, historical aerial photographs are used with dense image matching algorithms to build 3D models of a city in different years. The models can be used to study the urban development of the city and its changes through time. The application relates to the city centre of Verona, for which several time series of aerial photographs have been retrieved. The models obtained in this way made it possible to observe, right away, the urban development of the city, the places of expansion and new urban areas. A more interesting aspect emerged from the analytical comparison between models: the difference, computed as the Euclidean distance between two models, gives information about new buildings or demolitions. Concerning accuracy, it is necessary to point out that the quality of final
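    The model-to-model comparison described at the end, a Euclidean distance between two 3D models, can be sketched with point clouds. This is a toy illustration with hypothetical coordinates, not the Verona data; real models would use a KD-tree rather than the brute-force search below.

    ```python
    import numpy as np

    def nearest_distances(model_a, model_b):
        """For each point in model_a, the Euclidean distance to its nearest point in model_b."""
        # Brute-force nearest neighbour: fine for toy clouds, O(len(a) * len(b)) in general.
        diffs = model_a[:, None, :] - model_b[None, :, :]
        return np.sqrt((diffs ** 2).sum(axis=2)).min(axis=1)

    # Two toy "epochs" of a city model: identical ground points, plus one
    # tall point present only in the later epoch (a new building).
    year1 = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
    year2 = np.vstack([year1, [[0.5, 0.5, 3.0]]])

    d = nearest_distances(year2, year1)
    changed = d > 1.0   # distances above a threshold flag new construction or demolition
    ```

    Running the comparison in both directions distinguishes the two cases: points of the later model far from the earlier one indicate new buildings, while points of the earlier model far from the later one indicate demolitions.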