Science.gov

Sample records for accurate time series

  1. Robust and Accurate Anomaly Detection in ECG Artifacts Using Time Series Motif Discovery

    PubMed Central

    Sivaraks, Haemwaan; Ratanamahatana, Chotirat Ann

    2015-01-01

    Electrocardiogram (ECG) anomaly detection is an important technique for detecting dissimilar heartbeats, helping to identify abnormal ECGs before the diagnosis process. Currently available ECG anomaly detection methods, ranging from academic research to commercial ECG machines, still suffer from a high false alarm rate because they cannot differentiate ECG artifacts from real ECG signals, especially artifacts that are similar to ECG signals in shape and/or frequency. The problem leads to high vigilance demands on physicians and a misinterpretation risk for nonspecialists. Therefore, this work proposes a novel anomaly detection technique that is highly robust and accurate in the presence of ECG artifacts and can effectively reduce the false alarm rate. Expert knowledge from cardiologists and a motif discovery technique are utilized in our design, and every step of the algorithm conforms to the interpretation of cardiologists. Our method can be applied to both single-lead and multilead ECGs. Our experimental results on real ECG datasets were interpreted and evaluated by cardiologists. The proposed algorithm can mostly achieve 100% accuracy of detection (AoD), sensitivity, specificity, and positive predictive value with a 0% false alarm rate. The results demonstrate that our proposed method is highly accurate and robust to artifacts compared with competing anomaly detection methods. PMID:25688284

  3. Accurate estimation of entropy in very short physiological time series: the problem of atrial fibrillation detection in implanted ventricular devices.

    PubMed

    Lake, Douglas E; Moorman, J Randall

    2011-01-01

    Entropy estimation is useful but difficult in short time series. For example, automated detection of atrial fibrillation (AF) in very short heart beat interval time series would be useful in patients with cardiac implantable electronic devices that record only from the ventricle. Such devices require efficient algorithms, and the clinical situation demands accuracy. Toward these ends, we optimized the sample entropy measure, which reports the probability that short templates will match with others within the series. We developed general methods for the rational selection of the template length m and the matching tolerance r. The major innovation was to allow r to vary so that sufficient matches are found for confident entropy estimation, with conversion of the final probability to a density by dividing by the matching region volume, (2r)^m. The optimized sample entropy estimate and the mean heart beat interval each contributed to accurate detection of AF in as few as 12 heartbeats. The final algorithm, called the coefficient of sample entropy (COSEn), was developed using the canonical MIT-BIH database and validated in a new and much larger set of consecutive Holter monitor recordings from the University of Virginia. In patients over 40 yr of age, COSEn has a high degree of accuracy in distinguishing AF from normal sinus rhythm in 12-beat calculations performed hourly. The most common errors are atrial or ventricular ectopy, which increase entropy despite sinus rhythm, and atrial flutter, which can have low- or high-entropy states depending on the dynamics of atrioventricular conduction.
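
    As a rough illustration of the quantities described above, the sketch below computes a plain sample entropy estimate and then forms the COSEn combination with the ln(2r) density correction and the mean heartbeat interval, following our reading of the abstract. The parameter values (m = 1, r = 30 ms) and function names are illustrative, not taken from the paper, and this fixed-r version omits the paper's search for an r that yields sufficient matches.

    ```python
    import numpy as np

    def sample_entropy(x, m, r):
        """Sample entropy -ln(A/B): B counts template matches of length m,
        A counts matches of length m+1, with Chebyshev tolerance r."""
        x = np.asarray(x, dtype=float)
        n = len(x)

        def count_matches(length):
            templates = n - length + 1
            count = 0
            for i in range(templates):
                for j in range(i + 1, templates):
                    if np.max(np.abs(x[i:i + length] - x[j:j + length])) < r:
                        count += 1
            return count

        b = count_matches(m)
        a = count_matches(m + 1)
        if a == 0 or b == 0:
            return np.nan  # too few matches; the paper instead enlarges r
        return -np.log(a / b)

    def cosen(rr_ms, m=1, r=30.0):
        """Coefficient of sample entropy for a short RR-interval series (ms):
        the ln(2r) term converts the match probability to a density and the
        mean-RR term folds in heart rate, as the abstract describes."""
        return sample_entropy(rr_ms, m, r) + np.log(2 * r) - np.log(np.mean(rr_ms))
    ```

    With this convention, larger COSEn values indicate the more irregular RR dynamics typical of AF.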

  4. Time Series Explorer

    NASA Astrophysics Data System (ADS)

    Scargle, J.

    With the generation of long, precise, and finely sampled time series, the Age of Digital Astronomy is uncovering and elucidating energetic dynamical processes throughout the Universe. Exploiting these opportunities requires effective data analysis techniques that rapidly and automatically implement advanced concepts. The Time Series Explorer, under development in collaboration with Tom Loredo, provides tools ranging from simple but optimal histograms to time- and frequency-domain analysis for arbitrary data modes with any time sampling. Examples of the application of these tools for automated time series discovery will be given.

  5. Time Series Explorer

    NASA Astrophysics Data System (ADS)

    Loredo, Thomas

    The key, central objectives of the proposed Time Series Explorer project are to develop an organized collection of software tools for analysis of time series data in current and future NASA astrophysics data archives, and to make the tools available in two ways: as a library (the Time Series Toolbox) that individual science users can use to write their own data analysis pipelines, and as an application (the Time Series Automaton) providing an accessible, data-ready interface to many Toolbox algorithms, facilitating rapid exploration and automatic processing of time series databases. A number of time series analysis methods will be implemented, including techniques that range from standard ones to state-of-the-art developments by the proposers and others. Most of the algorithms will be able to handle time series data subject to real-world problems such as data gaps, sampling that is otherwise irregular, asynchronous sampling (in multi-wavelength settings), and data with non-Gaussian measurement errors. The proposed research responds to the ADAP element supporting the development of tools for mining the vast reservoir of information residing in NASA databases. The tools that will be provided to the community of astronomers studying variability of astronomical objects (from nearby stars and extrasolar planets through galactic and extragalactic sources) will revolutionize the quality of timing analyses that can be carried out, and greatly enhance the scientific throughput of all NASA astrophysics missions past, present, and future. The Automaton will let scientists explore time series (individual records or large databases) with the most informative and useful analysis methods available, without having to develop the tools themselves or understand the computational details. Both elements, the Toolbox and the Automaton, will enable deep but efficient exploratory time series data analysis, which is why we have named the project the Time Series Explorer.

  6. GPS Position Time Series @ JPL

    NASA Technical Reports Server (NTRS)

    Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen

    2013-01-01

    Different flavors of GPS time series analysis at JPL all use the same GPS Precise Point Positioning Analysis raw time series; variations in time series analysis and post-processing are driven by different users:
    - JPL Global Time Series/Velocities: researchers studying the reference frame, combining with VLBI/SLR/DORIS.
    - JPL/SOPAC Combined Time Series/Velocities: crustal deformation for tectonic, volcanic, and ground water studies.
    - ARIA Time Series/Coseismic Data Products: focused on hazard monitoring and response.
    The ARIA data system is designed to integrate GPS and InSAR: GPS tropospheric delay is used for correcting InSAR, and Caltech's GIANT time series analysis uses GPS to correct orbital errors in InSAR. (InSAR time series analysis is covered in a separate talk by Zhen Liu.)

  7. Permutations and time series analysis.

    PubMed

    Cánovas, Jose S; Guillamón, Antonio

    2009-12-01

    The main aim of this paper is to show how permutations can be useful in the study of time series. In particular, we introduce a test for checking the independence of a time series based on the number of admissible permutations in it. The main improvement in our test is that we are able to give a theoretical distribution for independent time series.
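
    A minimal sketch of the statistic this test is built on, assuming the usual notion of ordinal patterns: each length-L window is mapped to the permutation it realizes, and the number of distinct (admissible) permutations observed is counted. Names and the window length are illustrative.

    ```python
    import numpy as np

    def ordinal_pattern(window):
        """The permutation (tuple of ranks) realized by a window."""
        return tuple(np.argsort(window))

    def count_admissible_permutations(x, L):
        """Count the distinct ordinal patterns of length L occurring in x.
        For an i.i.d. series all L! patterns are equally likely, which is
        the property an independence test of this kind exploits."""
        return len({ordinal_pattern(x[i:i + L]) for i in range(len(x) - L + 1)})

    # An i.i.d. Gaussian series should visit essentially all 4! = 24
    # patterns, while a deterministic series visits far fewer.
    rng = np.random.default_rng(0)
    print(count_admissible_permutations(rng.normal(size=2000), L=4))
    ```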

  8. FROG: Time-series analysis

    NASA Astrophysics Data System (ADS)

    Allan, Alasdair

    2014-06-01

    FROG performs time series analysis and display. It provides a simple user interface for astronomers wanting to do time-domain astrophysics but still offers the powerful features found in packages such as PERIOD (ascl:1406.005). FROG includes a number of tools for manipulation of time series. Among other things, the user can combine individual time series, detrend series (multiple methods) and perform basic arithmetic functions. The data can also be exported directly into the TOPCAT (ascl:1101.010) application for further manipulation if needed.

  9. Predicting Nonlinear Time Series

    DTIC Science & Technology

    1993-12-01

    The response becomes R_j(k) = f(Σ_i W_ij V_i(k)) (2.4), where W_ij specifies the weight associated with the output of node i to the input of node j in the next layer, with interconnections for each of these previous nodes. ... [Figure 5: Delay block for ATNN [9]] ... Thus, node j receives the computed values a_j(t_n), and d_j(t_n) denotes the desired output of node j at time t_n. In this thesis, the weights and time delays update after each input.

  10. Apparatus for statistical time-series analysis of electrical signals

    NASA Technical Reports Server (NTRS)

    Stewart, C. H. (Inventor)

    1973-01-01

    An apparatus for performing statistical time-series analysis of complex electrical signal waveforms, permitting prompt and accurate determination of statistical characteristics of the signal is presented.

  11. Langevin equations from time series.

    PubMed

    Racca, E; Porporato, A

    2005-02-01

    We discuss the link between the approach to obtain the drift and diffusion of one-dimensional Langevin equations from time series, and Pope and Ching's relationship for stationary signals. The two approaches are based on different interpretations of conditional averages of the time derivatives of the time series at given levels. The analysis provides a useful indication for the correct application of Pope and Ching's relationship to obtain stochastic differential equations from time series and shows its validity, in a generalized sense, for nondifferentiable processes originating from Langevin equations.
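
    The conditional averages referred to above are commonly estimated by binning the series by level and averaging the increments within each bin (a Kramers-Moyal style estimate). The sketch below follows that standard recipe under the convention dX = D1(X) dt + sqrt(2 D2(X)) dW; it is not the authors' code, and the binning parameters are illustrative.

    ```python
    import numpy as np

    def drift_diffusion(x, dt, nbins=30, min_samples=10):
        """Estimate drift D1(x) and diffusion D2(x) from a sampled path by
        conditional averages of increments at given levels."""
        dx = np.diff(x)
        levels = x[:-1]
        bins = np.linspace(levels.min(), levels.max(), nbins + 1)
        idx = np.digitize(levels, bins) - 1
        centers = 0.5 * (bins[:-1] + bins[1:])
        d1 = np.full(nbins, np.nan)
        d2 = np.full(nbins, np.nan)
        for k in range(nbins):
            sel = idx == k
            if sel.sum() >= min_samples:  # require enough samples per level
                d1[k] = dx[sel].mean() / dt
                d2[k] = (dx[sel] ** 2).mean() / (2.0 * dt)
        return centers, d1, d2
    ```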

  12. Time series with tailored nonlinearities

    NASA Astrophysics Data System (ADS)

    Räth, C.; Laut, I.

    2015-10-01

    It is demonstrated how to generate time series with tailored nonlinearities by inducing well-defined constraints on the Fourier phases. Correlations between the phase information of adjacent phases and (static and dynamic) measures of nonlinearities are established and their origin is explained. By applying a set of simple constraints on the phases of an originally linear and uncorrelated Gaussian time series, the observed scaling behavior of the intensity distribution of empirical time series can be reproduced. The power law character of the intensity distributions being typical for, e.g., turbulence and financial data can thus be explained in terms of phase correlations.

  13. Economic Time-Series Page.

    ERIC Educational Resources Information Center

    Bos, Theodore; Culver, Sarah E.

    2000-01-01

    Describes the Economagic Web site, a comprehensive site of free economic time-series data that can be used for research and instruction. Explains that it contains 100,000+ economic data series from sources such as the Federal Reserve Banking System, the Census Bureau, and the Department of Commerce. (CMK)

  14. Entropy of electromyography time series

    NASA Astrophysics Data System (ADS)

    Kaufman, Miron; Zurcher, Ulrich; Sung, Paul S.

    2007-12-01

    A nonlinear analysis based on Renyi entropy is applied to electromyography (EMG) time series from back muscles. The time dependence of the entropy of the EMG signal exhibits a crossover from a subdiffusive regime at short times to a plateau at longer times. We argue that this behavior characterizes complex biological systems. The plateau value of the entropy can be used to differentiate between healthy and low back pain individuals.

  15. Random time series in astronomy.

    PubMed

    Vaughan, Simon

    2013-02-13

    Progress in astronomy comes from interpreting the signals encoded in the light received from distant objects: the distribution of light over the sky (images), over photon wavelength (spectrum), over polarization angle and over time (usually called light curves by astronomers). In the time domain, we see transient events such as supernovae, gamma-ray bursts and other powerful explosions; we see periodic phenomena such as the orbits of planets around nearby stars, radio pulsars and pulsations of stars in nearby galaxies; and we see persistent aperiodic variations ('noise') from powerful systems such as accreting black holes. I review just a few of the recent and future challenges in the burgeoning area of time domain astrophysics, with particular attention to persistently variable sources, the recovery of reliable noise power spectra from sparsely sampled time series, higher order properties of accreting black holes, and time delays and correlations in multi-variate time series.

  16. Pattern Recognition in Time Series

    NASA Astrophysics Data System (ADS)

    Lin, Jessica; Williamson, Sheri; Borne, Kirk D.; DeBarr, David

    2012-03-01

    Perhaps the most commonly encountered data type is time series, touching almost every aspect of human life, including astronomy. One obvious problem in handling time-series databases is their typically massive size—gigabytes or even terabytes are common, with more and more databases reaching the petabyte scale. For example, in telecommunication, large companies like AT&T produce several hundred million long-distance records per day [Cort00]. In astronomy, time-domain surveys are relatively new—these are surveys that cover a significant fraction of the sky with many repeat observations, thereby producing time series for millions or billions of objects. Several such time-domain sky surveys are now completed, such as the MACHO [Alco01], OGLE [Szym05], SDSS Stripe 82 [Bram08], SuperMACHO [Garg08], and Berkeley's Transients Classification Pipeline (TCP) [Star08] projects. The Pan-STARRS project is an active sky survey—it began in 2010, a 3-year survey covering three-fourths of the sky with ~60 observations of each field [Kais04]. The Large Synoptic Survey Telescope (LSST) project proposes to survey 50% of the visible sky repeatedly approximately 1000 times over a 10-year period, creating a 100-petabyte image archive and a 20-petabyte science database (http://www.lsst.org/). The LSST science database will include time series of over 100 scientific parameters for each of approximately 50 billion astronomical sources—this will be the largest data collection (and certainly the largest time series database) ever assembled in astronomy, and it rivals any other discipline's massive data collections for sheer size and complexity. More common in astronomy are time series of flux measurements. As a consequence of many decades of observations (and in some cases, hundreds of years), a large variety of flux variations have been detected in astronomical objects, including periodic variations (e.g., pulsating stars, rotators, pulsars, eclipsing binaries…

  17. Time series analysis of injuries.

    PubMed

    Martinez-Schnell, B; Zaidi, A

    1989-12-01

    We used time series models in the exploratory and confirmatory analysis of selected fatal injuries in the United States from 1972 to 1983. We built autoregressive integrated moving average (ARIMA) models for monthly, weekly, and daily series of deaths and used these models to generate hypotheses. These deaths resulted from six causes of injuries: motor vehicles, suicides, homicides, falls, drownings, and residential fires. For each cause of injury, we estimated calendar effects on the monthly death counts. We confirmed the significant effect of vehicle miles travelled on motor vehicle fatalities with a transfer function model. Finally, we applied intervention analysis to deaths due to motor vehicles.

  18. Inductive time series modeling program

    SciTech Connect

    Kirk, B.L.; Rust, B.W.

    1985-10-01

    A number of features that comprise environmental time series share a common mathematical behavior. Analysis of the Mauna Loa carbon dioxide record and other time series is aimed at constructing mathematical functions which describe as many major features of the data as possible. A trend function is fit to the data, removed, and the resulting residuals analyzed for any significant behavior. This is repeated until the residuals are driven to white noise. In the following discussion, the concept of trend will include cyclic components. The mathematical tools and program packages used are VARPRO (Golub and Pereyra 1973), for the least squares fit, and a modified version of our spectral analysis program (Kirk et al. 1979), for spectrum and noise analysis. The program is written in FORTRAN. All computations are done in double precision, except for the plotting calls where the DISSPLA package is used. The core requirement varies between 600 K and 700 K. The program is implemented on the IBM 360/370. Currently, the program can analyze up to five different time series where each series contains no more than 300 points. 12 refs.

  19. Modeling North Pacific Time Series

    NASA Astrophysics Data System (ADS)

    Overland, J. E.; Percival, D. B.; Mofjeld, H. O.

    2002-05-01

    We present a case study in modeling the North Pacific (NP) index, a time series of the wintertime Aleutian low sea level pressure from 1900 to 1999. We consider three statistical models, namely, a Gaussian stationary autoregressive process, a Gaussian fractionally differenced (FD) or "long-memory" process, and a "signal plus noise" process consisting of a square wave oscillation with a pentadecadal period embedded in Gaussian white noise. Each model depends upon three parameters, so all three models are equally simple. The shortness of the time series makes it unrealistic to formally prefer one model over the others: we estimate it would take a 300 year record to differentiate between the models. Although the models fit equally well, they have quite different implications for the long-term behavior of the NP index, e.g., the generation of regimes of characteristic lengths. Additional information and physical arguments may add support for a particular model. The FD "long-memory" process suggests multiple physical contributions with different damping constants; many North Pacific biological time series, which are influenced by atmospheric and oceanic processes, show regime-like ecosystem reorganizations.

  20. Automated utilization review is timely, accurate, efficient.

    PubMed

    Schmitz, H H; Schoenhard, W C

    1976-07-01

    Federal utilization review regulations require that hospitals establish admission and extended stay certification processes and conduct medical care evaluation studies. The computerized review system in use at Deaconess Hospital, St. Louis, has satisfied these regulations with minimum expenditure of time, effort, and money while ensuring maximum accuracy, timeliness, and consistency of reporting.

  1. Multivariate Voronoi Outlier Detection for Time Series.

    PubMed

    Zwilling, Chris E; Wang, Michelle Yongmei

    2014-10-01

    Outlier detection is a primary step in many data mining and analysis applications, including healthcare and medical research. This paper presents a general method to identify outliers in multivariate time series based on a Voronoi diagram, which we call Multivariate Voronoi Outlier Detection (MVOD). The approach copes with outliers in a multivariate framework by designing and extracting effective attributes or features from the data that can take parametric or nonparametric forms. Voronoi diagrams allow for automatic configuration of the neighborhood relationship of the data points, which facilitates the differentiation of outliers and non-outliers. Experimental evaluation demonstrates that our MVOD is an accurate, sensitive, and robust method for detecting outliers in multivariate time series data.

  2. Introduction to Time Series Analysis

    NASA Technical Reports Server (NTRS)

    Hardin, J. C.

    1986-01-01

    The field of time series analysis is explored from its logical foundations to the most modern data analysis techniques. The presentation is developed, as far as possible, for continuous data, so that the inevitable use of discrete mathematics is postponed until the reader has gained some familiarity with the concepts. The monograph seeks to provide the reader with both the theoretical overview and the practical details necessary to correctly apply the full range of these powerful techniques. In addition, the last chapter introduces many specialized areas where research is currently in progress.

  3. Multiple Indicator Stationary Time Series Models.

    ERIC Educational Resources Information Center

    Sivo, Stephen A.

    2001-01-01

    Discusses the propriety and practical advantages of specifying multivariate time series models in the context of structural equation modeling for time series and longitudinal panel data. For time series data, the multiple indicator model specification improves on classical time series analysis. For panel data, the multiple indicator model…

  4. Pseudotime estimation: deconfounding single cell time series

    PubMed Central

    Reid, John E.; Wernisch, Lorenz

    2016-01-01

    Motivation: Repeated cross-sectional time series single cell data confound several sources of variation, with contributions from measurement noise, stochastic cell-to-cell variation and cell progression at different rates. Time series from single cell assays are particularly susceptible to confounding as the measurements are not averaged over populations of cells. When several genes are assayed in parallel these effects can be estimated and corrected for under certain smoothness assumptions on cell progression. Results: We present a principled probabilistic model with a Bayesian inference scheme to analyse such data. We demonstrate our method’s utility on public microarray, nCounter and RNA-seq datasets from three organisms. Our method almost perfectly recovers withheld capture times in an Arabidopsis dataset, it accurately estimates cell cycle peak times in a human prostate cancer cell line and it correctly identifies two precocious cells in a study of paracrine signalling in mouse dendritic cells. Furthermore, our method compares favourably with Monocle, a state-of-the-art technique. We also show using held-out data that uncertainty in the temporal dimension is a common confounder and should be accounted for in analyses of repeated cross-sectional time series. Availability and Implementation: Our method is available on CRAN in the DeLorean package. Contact: john.reid@mrc-bsu.cam.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27318198

  5. Detecting chaos from time series

    NASA Astrophysics Data System (ADS)

    Xiaofeng, Gong; Lai, C. H.

    2000-02-01

    In this paper, an entirely data-based method to detect chaos from a time series is developed by introducing ε_p-neighbour points (the p-step ε-neighbour points). We demonstrate that for deterministic chaotic systems there exists a linear relationship between the logarithm of the average number of ε_p-neighbour points, ln n_{p,ε}, and the time step, p. The coefficient can be related to the KS entropy of the system. The effects of the embedding dimension and noise are also discussed.
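
    A brute-force sketch of the statistic described above, under one plausible reading: the ε_p-neighbours of a point are those points whose trajectories stay within ε of its trajectory for the next p steps, and the (negative) slope of ln n_p versus p estimates the entropy-related coefficient. Function names, ε, and p_max are illustrative.

    ```python
    import numpy as np

    def average_neighbour_count(x, eps, p):
        """Average number of eps_p-neighbours over all reference points."""
        n = len(x) - p
        counts = np.zeros(n)
        for i in range(n):
            close = np.ones(n, dtype=bool)
            for k in range(p + 1):  # stay eps-close for the next p steps
                close &= np.abs(x[k:k + n] - x[i + k]) < eps
            close[i] = False        # exclude the reference point itself
            counts[i] = close.sum()
        return counts.mean()

    def entropy_coefficient(x, eps, pmax=8):
        """Slope of ln n_p versus p; eps must be large enough that every
        p still yields a nonzero average count."""
        ps = np.arange(1, pmax + 1)
        lnn = np.log([average_neighbour_count(x, eps, p) for p in ps])
        return -np.polyfit(ps, lnn, 1)[0]
    ```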

  6. Offset detection in GPS coordinate time series

    NASA Astrophysics Data System (ADS)

    Gazeaux, J.; King, M. A.; Williams, S. D.

    2013-12-01

    Global Positioning System (GPS) time series are commonly affected by offsets of unknown magnitude, and the large volume of data globally warrants investigation of automated detection approaches. The Detection of Offsets in GPS Experiment (DOGEx) showed that the accuracy of GPS time series can be significantly improved by applying statistical offset detection methods (see Gazeaux et al. (2013)). However, the best of these approaches did not perform as well as manual detection by expert analysts, and many features of GPS coordinate time series have not yet been fully taken into account in existing methods. Here, we apply Bayesian theory in order to make use of prior knowledge of the site noise characteristics and metadata in an attempt to make offset detection more accurate. In the past decades, Bayesian theory has produced relevant results for a widespread range of applications, but it has not yet been applied to GPS coordinate time series. Such methods incorporate different inputs, such as a dynamic model (linear trend, periodic signal, ...) and a priori information, in a process that provides the best estimate of parameters (velocity, phase and amplitude of periodic signals, ...) based on all the available information. We test the new method on the DOGEx simulated dataset and compare it to previous solutions, and to a Monte Carlo method, to test the accuracy of the procedure. We make a preliminary extension of the DOGEx dataset to introduce metadata information, allowing us to test the value of this data type in detecting offsets. The flexibility, robustness and limitations of the new approach are discussed. Reference: Gazeaux, J., Williams, S., King, M., Bos, M., Dach, R., Deo, M., Moore, A. W., Ostini, L., Petrie, E., Roggero, M., Teferle, F. N., Olivares, G., and Webb, F. H. (2013). Detecting offsets in GPS time series: First results from the detection of offsets in GPS experiment. Journal of Geophysical Research: Solid Earth, 118(5).

  7. The scaling of time series size towards detrended fluctuation analysis

    NASA Astrophysics Data System (ADS)

    Gao, Xiaolei; Ren, Liwei; Shang, Pengjian; Feng, Guochen

    2016-06-01

    In this paper, we introduce a modification of detrended fluctuation analysis (DFA), called the multivariate DFA (MNDFA) method, based on the scaling of the time series size N. In the traditional DFA method, we observed the influence of the sequence segmentation interval s, which inspired us to propose the new MNDFA model to examine the scaling of time series size in DFA. The effectiveness of the procedure is verified by numerical experiments with both artificial and stock returns series. Results show that the proposed MNDFA method extracts more significant information from a series than the traditional DFA method. The scaling of time series size has an influence on the auto-correlation (AC) in time series. For certain series, we obtain an exponential relationship and also calculate the slope through the fitting function. Our analysis and finite-size effect test demonstrate that an appropriate choice of the time series size can avoid unnecessary influences and also make the testing results more accurate.
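
    For reference, a compact sketch of the standard DFA baseline that MNDFA modifies: integrate the series into a profile, remove a linear fit within each window of size s, and record the RMS fluctuation F(s); the scaling exponent is the slope of log F(s) versus log s. This is the textbook procedure, not the MNDFA variant itself.

    ```python
    import numpy as np

    def dfa(x, scales):
        """Detrended fluctuation analysis: return F(s) for each scale s."""
        y = np.cumsum(x - np.mean(x))            # integrated profile
        fs = []
        for s in scales:
            nwin = len(y) // s
            msq = []
            for w in range(nwin):
                seg = y[w * s:(w + 1) * s]
                t = np.arange(s)
                trend = np.polyval(np.polyfit(t, seg, 1), t)
                msq.append(np.mean((seg - trend) ** 2))
            fs.append(np.sqrt(np.mean(msq)))
        return np.asarray(fs)

    # The exponent alpha is the slope of log F(s) versus log s,
    # e.g. obtained via np.polyfit on the logs.
    ```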

  8. Weighted Dynamic Time Warping for Time Series Classification

    SciTech Connect

    Jeong, Young-Seon; Jeong, Myong K; Omitaomu, Olufemi A

    2011-01-01

    Dynamic time warping (DTW), which finds the minimum path by providing non-linear alignments between two time series, has been widely used as a distance measure for time series classification and clustering. However, DTW does not account for the relative importance regarding the phase difference between a reference point and a testing point. This may lead to misclassification especially in applications where the shape similarity between two sequences is a major consideration for an accurate recognition. Therefore, we propose a novel distance measure, called a weighted DTW (WDTW), which is a penalty-based DTW. Our approach penalizes points with higher phase difference between a reference point and a testing point in order to prevent minimum distance distortion caused by outliers. The rationale underlying the proposed distance measure is demonstrated with some illustrative examples. A new weight function, called the modified logistic weight function (MLWF), is also proposed to systematically assign weights as a function of the phase difference between a reference point and a testing point. By applying different weights to adjacent points, the proposed algorithm can enhance the detection of similarity between two time series. We show that some popular distance measures such as DTW and Euclidean distance are special cases of our proposed WDTW measure. We extend the proposed idea to other variants of DTW such as derivative dynamic time warping (DDTW) and propose the weighted version of DDTW. We have compared the performances of our proposed procedures with other popular approaches using public data sets available through the UCR Time Series Data Mining Archive for both time series classification and clustering problems. The experimental results indicate that the proposed approaches can achieve improved accuracy for time series classification and clustering problems.
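
    A minimal sketch of the weighting idea, assuming the modified logistic weight function takes a logistic form in the phase difference |i - j| with its midpoint at half the series length; the parameter values (g, w_max) are illustrative, not the paper's.

    ```python
    import numpy as np

    def mlwf(n, g=0.05, wmax=1.0):
        """Logistic weight that grows with the phase difference d = |i - j|;
        g sets how sharply far-apart alignments are penalized."""
        d = np.arange(n)
        return wmax / (1.0 + np.exp(-g * (d - n / 2.0)))

    def wdtw(a, b, g=0.05):
        """Weighted DTW: the classic DTW recursion, with each pointwise cost
        scaled by a weight increasing in |i - j| to discourage alignments
        between points far apart in phase."""
        n, m = len(a), len(b)
        w = mlwf(max(n, m), g)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = w[abs(i - j)] * (a[i - 1] - b[j - 1]) ** 2
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]
    ```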

  9. Scale-dependent intrinsic entropies of complex time series.

    PubMed

    Yeh, Jia-Rong; Peng, Chung-Kang; Huang, Norden E

    2016-04-13

    Multi-scale entropy (MSE) was developed as a measure of complexity for complex time series, and it has been applied widely in recent years. The MSE algorithm is based on the assumption that biological systems possess the ability to adapt and function in an ever-changing environment, and that these systems need to operate across multiple temporal and spatial scales, such that their complexity is also multi-scale and hierarchical. Here, we present a systematic approach that applies the empirical mode decomposition algorithm, which can detrend time series on various time scales, prior to analysing a signal's complexity by measuring the irregularity of its dynamics on multiple time scales. Simulated time series of fractional Gaussian noise and human heartbeat time series were used to study the performance of this new approach. We show that our method can successfully quantify the fractal properties of the simulated time series and can accurately distinguish modulations in human heartbeat time series in health and disease.

  10. Foot gait time series estimation based on support vector machine.

    PubMed

    Pant, Jeevan K; Krishnan, Sridhar

    2014-01-01

    A new algorithm for the estimation of the stride interval time series from foot gait signals is proposed. The algorithm is based on detecting the beginning of heel strikes in the signal using a support vector machine. Morphological operations are used to enhance the accuracy of detection. By taking backward differences of the detected heel-strike onsets, the stride interval time series is estimated. Simulation results are presented which show that the proposed algorithm yields a fairly accurate estimation of the stride interval time series, with estimation errors for the mean and standard deviation of the time series on the order of 10^-4.
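
    The final estimation step reduces to a backward difference of the detected heel-strike times; a tiny sketch, assuming the SVM detection stage has already produced those onset times.

    ```python
    import numpy as np

    def stride_intervals(heel_strike_times):
        """Stride-interval time series as backward differences of detected
        heel-strike onset times (the detection stage itself is not shown)."""
        return np.diff(np.asarray(heel_strike_times, dtype=float))

    # Onsets in seconds -> stride intervals of roughly 1.1 s each.
    print(stride_intervals([0.00, 1.12, 2.21, 3.33, 4.41]))
    ```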

  11. Sparse Representation for Time-Series Classification

    DTIC Science & Technology

    2015-02-08

    This chapter studies the problem of time-series classification and presents an overview of recent developments in the area of feature extraction and information fusion. It addresses the problem of target classification, and more generally time-series classification, in two main directions: feature extraction and information fusion.

  12. Regenerating time series from ordinal networks.

    PubMed

    McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael

    2017-03-01

    Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discrete sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series by using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.
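
    A minimal sketch of the construction-and-regeneration loop described above: symbolize windows as ordinal patterns, build the weighted transition network, and random-walk over it to produce a surrogate symbol sequence. Mapping regenerated patterns back to amplitudes, and the subsequent analyses, are omitted; names and the window length are illustrative.

    ```python
    import numpy as np
    from collections import defaultdict

    def ordinal_sequence(x, w=4):
        """Symbolize a series as the ordinal patterns of its length-w windows."""
        return [tuple(np.argsort(x[i:i + w])) for i in range(len(x) - w + 1)]

    def ordinal_network(symbols):
        """Directed weighted graph of transitions between successive patterns,
        a stochastic approximation of the underlying dynamics."""
        edges = defaultdict(lambda: defaultdict(int))
        for s, t in zip(symbols[:-1], symbols[1:]):
            edges[s][t] += 1
        return edges

    def regenerate(edges, start, steps, rng):
        """Random walk following the empirical transition probabilities."""
        seq, node = [start], start
        for _ in range(steps):
            targets = list(edges[node])
            if not targets:          # dead end: the last observed pattern
                break
            weights = np.array([edges[node][t] for t in targets], dtype=float)
            node = targets[rng.choice(len(targets), p=weights / weights.sum())]
            seq.append(node)
        return seq
    ```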

  14. A Review of Subsequence Time Series Clustering

    PubMed Central

    Zolhavarieh, Seyedjamal; Aghabozorgi, Saeed; Teh, Ying Wah

    2014-01-01

    Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and backgrounds related to subsequence time series clustering. The categorization of the literature reviews is divided into three groups: preproof, interproof, and postproof period. Moreover, various state-of-the-art approaches in performing subsequence time series clustering are discussed under each of the following categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies. PMID:25140332

  16. Association mining of dependency between time series

    NASA Astrophysics Data System (ADS)

    Hafez, Alaaeldin

    2001-03-01

    Time series analysis is considered a crucial component of strategic control over a broad variety of disciplines in business, science and engineering. Time series data is a sequence of observations collected over intervals of time. Each time series describes a phenomenon as a function of time. Analysis on time series data includes discovering trends (or patterns) in a time series sequence. In the last few years, data mining has emerged and been recognized as a new technology for data analysis. Data mining is the process of discovering potentially valuable patterns, associations, trends, sequences and dependencies in data. Data mining techniques can discover information that many traditional business analysis and statistical techniques fail to deliver. In this paper, we adapt and innovate data mining techniques to analyze time series data. By using data mining techniques, maximal frequent patterns are discovered and used in predicting future sequences or trends, where trends describe the behavior of a sequence. In order to include different types of time series (e.g. irregular and non-systematic), we consider past frequent patterns of the same time sequences (local patterns) and of other dependent time sequences (global patterns). We use the word 'dependent' instead of the word 'similar' for emphasis on real-life time series, where two time series sequences could be completely different (in values, shapes, etc.) but still react to the same conditions in a dependent way. In this paper, we propose the Dependence Mining Technique, which can be used in predicting time series sequences. The proposed technique consists of three phases: (a) for all time series sequences, generate their trend sequences; (b) discover maximal frequent trend patterns and generate pattern vectors (to keep information of frequent trend patterns); and (c) use trend pattern vectors to predict future time series sequences.

  17. TSAN: a package for time series analysis.

    PubMed

    Wang, D C; Vagnucci, A H

    1980-04-01

    Many biomedical data are in the form of time series. Analyses of these data include: (1) search for any biorhythm; (2) test of homogeneity of several time series; (3) assessment of stationarity; (4) test of normality of the time series histogram; (5) evaluation of dependence between data points. In this paper we present a subroutine package called TSAN. It is developed to accomplish these tasks. Computational methods, as well as flowcharts, for these subroutines are described. Two sample runs are demonstrated.

  18. The Theory of Standardized Time Series.

    DTIC Science & Technology

    1985-04-01

    The method of standardized time series produces asymptotically valid confidence intervals for steady-state parameters. However, these intervals are ... Keywords: simulation output analysis, confidence intervals, standardized time series, functional ...

  19. Forecasting Enrollments with Fuzzy Time Series.

    ERIC Educational Resources Information Center

    Song, Qiang; Chissom, Brad S.

    The concept of fuzzy time series is introduced and used to forecast the enrollment of a university. Fuzzy time series, an aspect of fuzzy set theory, forecasts enrollment using a first-order time-invariant model. To evaluate the model, the conventional linear regression technique is applied and the predicted values obtained are compared to the…

  20. Accurate Fiber Length Measurement Using Time-of-Flight Technique

    NASA Astrophysics Data System (ADS)

    Terra, Osama; Hussein, Hatem

    2016-06-01

    Fiber artifacts of very well-measured length are required for the calibration of optical time domain reflectometers (OTDR). In this paper accurate length measurement of different fiber lengths using the time-of-flight technique is performed. A setup is proposed to measure accurately lengths from 1 to 40 km at 1,550 and 1,310 nm using high-speed electro-optic modulator and photodetector. This setup offers traceability to the SI unit of time, the second (and hence to meter by definition), by locking the time interval counter to the Global Positioning System (GPS)-disciplined quartz oscillator. Additionally, the length of a recirculating loop artifact is measured and compared with the measurement made for the same fiber by the National Physical Laboratory of United Kingdom (NPL). Finally, a method is proposed to relatively correct the fiber refractive index to allow accurate fiber length measurement.
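
    The core arithmetic of a time-of-flight length measurement, assuming a single pass through the fiber and a known group index. The group-index value below is a typical figure for standard single-mode fiber near 1550 nm, not the calibrated value from this work, which instead proposes a method to correct the refractive index.

    ```python
    C = 299_792_458.0  # speed of light in vacuum, m/s

    def fiber_length(delay_s, group_index=1.4682):
        """Length from the measured propagation delay: L = c * dt / n_g
        (single pass; a reflective setup would add a factor of 1/2)."""
        return C * delay_s / group_index

    # Example: a 49.0 microsecond delay corresponds to roughly 10 km.
    print(f"{fiber_length(49.0e-6) / 1e3:.2f} km")
    ```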

  1. Multigrid time-accurate integration of Navier-Stokes equations

    NASA Technical Reports Server (NTRS)

    Arnone, Andrea; Liou, Meng-Sing; Povinelli, Louis A.

    1993-01-01

    Efficient acceleration techniques typical of explicit steady-state solvers are extended to time-accurate calculations. Stability restrictions are greatly reduced by means of a fully implicit time discretization. A four-stage Runge-Kutta scheme with local time stepping, residual smoothing, and multigridding is used instead of traditional time-expensive factorizations. Some applications to natural and forced unsteady viscous flows show the capability of the procedure.

  2. Statistical criteria for characterizing irradiance time series.

    SciTech Connect

    Stein, Joshua S.; Ellis, Abraham; Hansen, Clifford W.

    2010-10-01

    We propose and examine several statistical criteria for characterizing time series of solar irradiance. Time series of irradiance are used in analyses that seek to quantify the performance of photovoltaic (PV) power systems over time. Time series of irradiance are either measured or are simulated using models. Simulations of irradiance are often calibrated to or generated from statistics for observed irradiance and simulations are validated by comparing the simulation output to the observed irradiance. Criteria used in this comparison should derive from the context of the analyses in which the simulated irradiance is to be used. We examine three statistics that characterize time series and their use as criteria for comparing time series. We demonstrate these statistics using observed irradiance data recorded in August 2007 in Las Vegas, Nevada, and in June 2009 in Albuquerque, New Mexico.

  3. Generation of artificial helioseismic time-series

    NASA Technical Reports Server (NTRS)

    Schou, J.; Brown, T. M.

    1993-01-01

    We present an outline of an algorithm to generate artificial helioseismic time-series, taking into account as much as possible of the knowledge we have on solar oscillations. The hope is that it will be possible to find the causes of some of the systematic errors in analysis algorithms by testing them with such artificial time-series.

  4. Linear Relations in Time Series Models. I.

    ERIC Educational Resources Information Center

    Villegas, C.

    1976-01-01

    A multiple time series is defined as the sum of an autoregressive process on a line and independent Gaussian white noise or a hyperplane that goes through the origin and intersects the line at a single point. This process is a multiple autoregressive time series in which the regression matrices satisfy suitable conditions. For a related article…

  5. On reconstruction of time series in climatology

    NASA Astrophysics Data System (ADS)

    Privalsky, V.; Gluhovsky, A.

    2015-10-01

    The approach to time series reconstruction in climatology based upon cross-correlation coefficients and regression equations is mathematically incorrect because it ignores the dependence of time series upon their past. The proper method described here for the bivariate case requires autoregressive time- and frequency-domain modeling of the time series that contains simultaneous observations of both scalar series, with subsequent application of the model to restore the shorter one into the past. The method presents a further development of previous efforts by a number of authors, starting with A. Douglass, who introduced some concepts of time series analysis into paleoclimatology. The method is applied to the monthly data of total solar irradiance (TSI), 1979-2014, and sunspot numbers (SSN), 1749-2014, to restore the TSI data over 1749-1978. The results of the reconstruction are in statistical agreement with observations.

  6. A radar image time series

    NASA Technical Reports Server (NTRS)

    Leberl, F.; Fuchs, H.; Ford, J. P.

    1981-01-01

    A set of ten side-looking radar images of a mining area in Arizona that were acquired over a period of 14 yr are studied to demonstrate the photogrammetric differential-rectification technique applied to radar images and to examine changes that occurred in the area over time. Five of the images are rectified by using ground control points and a digital height model taken from a map. Residual coordinate errors in ground control are reduced from several hundred meters in all cases to + or - 19 to 70 m. The contents of the radar images are compared with a Landsat image and with aerial photographs. Effects of radar system parameters on radar images are briefly reviewed.

  7. Entropic Analysis of Electromyography Time Series

    NASA Astrophysics Data System (ADS)

    Kaufman, Miron; Sung, Paul

    2005-03-01

    We are in the process of assessing the effectiveness of fractal and entropic measures for the diagnostic of low back pain from surface electromyography (EMG) time series. Surface electromyography (EMG) is used to assess patients with low back pain. In a typical EMG measurement, the voltage is measured every millisecond. We observed back muscle fatiguing during one minute, which results in a time series with 60,000 entries. We characterize the complexity of time series by computing the Shannon entropy time dependence. The analysis of the time series from different relevant muscles from healthy and low back pain (LBP) individuals provides evidence that the level of variability of back muscle activities is much larger for healthy individuals than for individuals with LBP. In general the time dependence of the entropy shows a crossover from a diffusive regime to a regime characterized by long time correlations (self organization) at about 0.01s.

  8. Time-Accurate Numerical Simulations of Synthetic Jet Quiescent Air

    NASA Technical Reports Server (NTRS)

    Rupesh, K-A. B.; Ravi, B. R.; Mittal, R.; Raju, R.; Gallas, Q.; Cattafesta, L.

    2007-01-01

    The unsteady evolution of a three-dimensional synthetic jet into quiescent air is studied by time-accurate numerical simulations using a second-order accurate mixed explicit-implicit fractional step scheme on Cartesian grids. Both two-dimensional and three-dimensional calculations of the synthetic jet are carried out at a Reynolds number of 750 (based on the average velocity during the discharge phase of the cycle, V_j, and the jet width d) and a Stokes number of 17.02. The results obtained are assessed against PIV and hotwire measurements provided for the NASA LaRC workshop on CFD validation of synthetic jets.

  9. Network structure of multivariate time series

    PubMed Central

    Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito

    2015-01-01

    Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range of tools and techniques for time series analysis already exist, the increasing availability of massive data structures calls for new approaches for multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on the mapping of a multidimensional time series into a multilayer network, which allows one to extract information on a high-dimensional dynamical system through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow us to extract and quantify nontrivial properties of coupled chaotic maps, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail. PMID:26487040

  10. Homogenising time series: beliefs, dogmas and facts

    NASA Astrophysics Data System (ADS)

    Domonkos, P.

    2011-06-01

    In recent decades various homogenisation methods have been developed, but the real effects of their application on time series are still not sufficiently known. The ongoing COST action HOME (COST ES0601) is devoted to revealing the real impacts of homogenisation methods in more detail and with higher confidence than before. As part of the COST activity, a benchmark dataset was built whose characteristics approach well the characteristics of real networks of observed time series. This dataset offers a much better opportunity than ever before to test the wide variety of homogenisation methods and analyse the real effects of selected theoretical recommendations. Empirical results show that real observed time series usually include several inhomogeneities of different sizes. Small inhomogeneities often have statistical characteristics similar to natural changes caused by climatic variability, thus the pure application of the classic theory that change-points of observed time series can be found and corrected one-by-one is impossible. However, after homogenisation the linear trends, seasonal changes and long-term fluctuations of time series are usually much closer to reality than in raw time series. Some problems around detecting multiple structures of inhomogeneities, as well as that of time series comparisons within homogenisation procedures, are discussed briefly in the study.

  12. Sensor-Generated Time Series Events: A Definition Language

    PubMed Central

    Anguera, Aurea; Lara, Juan A.; Lizcano, David; Martínez, Maria Aurora; Pazos, Juan

    2012-01-01

    There are now a great many domains where information is recorded by sensors over a limited time period or on a permanent basis. This data flow leads to sequences of data known as time series. In many domains, like seismography or medicine, time series analysis focuses on particular regions of interest, known as events, whereas the remainder of the time series contains hardly any useful information. In these domains, there is a need for mechanisms to identify and locate such events. In this paper, we propose an event definition language that is general enough to be used to easily and naturally define events in time series recorded by sensors in any domain. The proposed language has been applied to the definition of time series events generated within the branch of medicine dealing with balance-related functions in human beings. A device, called a posturograph, is used to study balance-related functions. The platform has four sensors that record the pressure intensity being exerted on the platform, generating four interrelated time series. As opposed to the existing ad hoc proposals, the results confirm that the proposed language is valid, that is, generally applicable and accurate, for identifying the events contained in the time series.

  13. Extracting Time-Accurate Acceleration Vectors From Nontrivial Accelerometer Arrangements.

    PubMed

    Franck, Jennifer A; Blume, Janet; Crisco, Joseph J; Franck, Christian

    2015-09-01

    Sports-related concussions are of significant concern in many impact sports, and their detection relies on accurate measurements of the head kinematics during impact. Among the most prevalent recording technologies are videography, and more recently, the use of single-axis accelerometers mounted in a helmet, such as the HIT system. Successful extraction of the linear and angular impact accelerations depends on an accurate analysis methodology governed by the equations of motion. Current algorithms are able to estimate the magnitude of acceleration and hit location, but make assumptions about the hit orientation and are often limited in the position and/or orientation of the accelerometers. The newly formulated algorithm presented in this manuscript accurately extracts the full linear and rotational acceleration vectors from a broad arrangement of six single-axis accelerometers directly from the governing set of kinematic equations. The new formulation linearizes the nonlinear centripetal acceleration term with a finite-difference approximation and provides a fast and accurate solution for all six components of acceleration over long time periods (>250 ms). The approximation of the nonlinear centripetal acceleration term provides an accurate computation of the rotational velocity as a function of time and allows for reconstruction of a multiple-impact signal. Furthermore, the algorithm determines the impact location and orientation and can distinguish between glancing, high rotational velocity impacts, or direct impacts through the center of mass. Results are shown for ten simulated impact locations on a headform geometry computed with three different accelerometer configurations in varying degrees of signal noise. Since the algorithm does not require simplifications of the actual impacted geometry, the impact vector, or a specific arrangement of accelerometer orientations, it can be easily applied to many impact investigations in which accurate kinematics need to

  14. Accurate GPS Time-Linked data Acquisition System (ATLAS II) user's manual.

    SciTech Connect

    Jones, Perry L.; Zayas, Jose R.; Ortiz-Moyet, Juan

    2004-02-01

    The Accurate Time-Linked data Acquisition System (ATLAS II) is a small, lightweight, time-synchronized, robust data acquisition system that is capable of acquiring simultaneous long-term time-series data from both a wind turbine rotor and ground-based instrumentation. This document is a user's manual for the ATLAS II hardware and software. It describes the hardware and software components of ATLAS II, and explains how to install and execute the software.

  15. Modelling of nonlinear filtering Poisson time series

    NASA Astrophysics Data System (ADS)

    Bochkarev, Vladimir V.; Belashova, Inna A.

    2016-08-01

In this article, algorithms for the non-linear filtering of Poisson time series are tested using statistical modelling. The objective is to find a representation of a time series as a wavelet series with a small number of non-zero coefficients, which allows statistically significant details to be distinguished. There are well-known efficient algorithms for non-linear wavelet filtering in the case when the values of a time series have a normal distribution. However, if the distribution is not normal, good results can be expected from maximum likelihood estimation. Filtering by the criterion of maximum likelihood is studied here using Poisson time series as an example. For direct optimisation of the likelihood function, different stochastic (genetic algorithms, simulated annealing) and deterministic optimisation algorithms are used. Testing of the algorithm using both simulated series and empirical data (series of rare-word frequencies from the Google Books Ngram data) showed that filtering based on the criterion of maximum likelihood has a great advantage over well-known algorithms in the case of Poisson series. The most promising optimisation methods for this problem were also identified.
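
    The maximum-likelihood idea can be illustrated without the paper's genetic or annealing optimisers: below is a minimal sketch that greedily keeps the orthonormal Haar coefficients which raise a penalized Poisson log-likelihood. The greedy search, the AIC-style penalty, and all function names are our assumptions, not the authors' algorithm.

    ```python
    import numpy as np

    def haar(x):
        """Orthonormal Haar transform (length must be a power of two)."""
        c = x.astype(float).copy()
        n = len(c)
        while n > 1:
            a = (c[0:n:2] + c[1:n:2]) / np.sqrt(2.0)
            d = (c[0:n:2] - c[1:n:2]) / np.sqrt(2.0)
            c[:n // 2], c[n // 2:n] = a, d
            n //= 2
        return c

    def ihaar(c):
        """Inverse of haar()."""
        x = c.copy()
        n = 1
        while n < len(x):
            a, d = x[:n].copy(), x[n:2 * n].copy()
            x[0:2 * n:2] = (a + d) / np.sqrt(2.0)
            x[1:2 * n:2] = (a - d) / np.sqrt(2.0)
            n *= 2
        return x

    def poisson_loglik(counts, lam):
        lam = np.clip(lam, 1e-9, None)        # intensity must stay positive
        return np.sum(counts * np.log(lam) - lam)

    def greedy_mle_filter(counts):
        """Add Haar coefficients, largest first, while the penalized
        Poisson likelihood of the reconstruction keeps improving."""
        full = haar(counts)
        order = np.argsort(-np.abs(full))
        kept, best = np.zeros_like(full), -np.inf
        for k, idx in enumerate(order, start=1):
            trial = kept.copy()
            trial[idx] = full[idx]
            score = poisson_loglik(counts, ihaar(trial)) - k  # AIC-style penalty
            if score <= best:
                break
            kept, best = trial, score
        return np.clip(ihaar(kept), 0.0, None)

    # Toy example: smooth intensity observed through Poisson counts.
    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 256)
    lam_true = 5 + 4 * np.sin(2 * np.pi * 3 * t)
    lam_hat = greedy_mle_filter(rng.poisson(lam_true))
    ```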

  16. Developing consistent time series landsat data products

    Technology Transfer Automated Retrieval System (TEKTRAN)

The Landsat series of satellites has provided a continuous earth observation data record since the early 1970s. There is increasing demand for a consistent time series of Landsat data products. In this presentation, I will summarize the work supported by the USGS Landsat Science Team project from 20...

  17. Modeling Time Series Data for Supervised Learning

    ERIC Educational Resources Information Center

    Baydogan, Mustafa Gokce

    2012-01-01

Temporal data are increasingly prevalent and important in analytics. Time series (TS) data are chronological sequences of observations and an important class of temporal data. Fields such as medicine, finance, learning science and multimedia naturally generate TS data. Each series provides a high-dimensional data vector that challenges the learning…

  18. Visibility Graph Based Time Series Analysis

    PubMed Central

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series to a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks. PMID:26571115
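
    The segment-to-graph step rests on the standard natural visibility criterion: samples i and j are connected when the straight line between them clears every intermediate sample. A minimal O(n^2) sketch of that mapping (not the authors' full network-of-networks pipeline; function names are ours):

    ```python
    import numpy as np

    def visibility_graph(x):
        """Natural visibility graph: edge (i, j) iff every intermediate sample
        k lies strictly below the line joining (i, x[i]) and (j, x[j])."""
        n = len(x)
        edges = set()
        for i in range(n - 1):
            for j in range(i + 1, n):
                ks = np.arange(i + 1, j)
                line = x[j] + (x[i] - x[j]) * (j - ks) / (j - i)
                if np.all(x[ks] < line):   # empty range => adjacent samples connect
                    edges.add((i, j))
        return edges

    # Toy example: degree sequence of a short random walk's visibility graph.
    rng = np.random.default_rng(1)
    series = np.cumsum(rng.standard_normal(200))
    g = visibility_graph(series)
    degree = np.bincount(np.array(list(g)).ravel(), minlength=len(series))
    ```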

  19. Visibility Graph Based Time Series Analysis.

    PubMed

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series to a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.

  20. Measuring nonlinear behavior in time series data

    NASA Astrophysics Data System (ADS)

    Wai, Phoong Seuk; Ismail, Mohd Tahir

    2014-12-01

Stationarity testing is important for detecting time series behavior, since financial and economic data series often contain missing data, structural changes, and jumps or breaks. Moreover, a stationarity test makes it possible to transform a nonlinear time series variable into a stationary one through a difference-stationary or trend-stationary process. Two different types of stationarity hypothesis tests, the Augmented Dickey-Fuller (ADF) test and the Kwiatkowski-Phillips-Schmidt-Shin (KPSS) test, are examined in this paper to describe the properties of the time series variables in a financial model. The least squares method is used in the Augmented Dickey-Fuller test to detect changes in the series, and the Lagrange multiplier is used in the Kwiatkowski-Phillips-Schmidt-Shin test to examine the properties of oil price, gold price and the Malaysian stock market. Moreover, the Quandt-Andrews, Bai-Perron and Chow tests are also used to detect the existence of breaks in the data series. The monthly index data range from December 1989 until May 2012. The results show that these three series exhibit nonlinear properties but can be transformed into stationary series after taking the first difference.
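
    A minimal sketch of the two hypothesis tests named above, using statsmodels; the random-walk series here is a synthetic stand-in for the oil, gold, and stock index data:

    ```python
    import numpy as np
    from statsmodels.tsa.stattools import adfuller, kpss

    rng = np.random.default_rng(2)
    price = 100.0 + np.cumsum(rng.standard_normal(500))   # random-walk stand-in

    # ADF: null hypothesis = unit root (non-stationary).
    # KPSS: null hypothesis = stationary. The two are complementary.
    adf_p = adfuller(price)[1]
    kpss_p = kpss(price, regression='c', nlags='auto')[1]
    print(f"levels: ADF p={adf_p:.3f}  KPSS p={kpss_p:.3f}")

    # First differencing, as in the paper, usually restores stationarity.
    ret = np.diff(price)
    print(f"diff:   ADF p={adfuller(ret)[1]:.3f}  "
          f"KPSS p={kpss(ret, regression='c', nlags='auto')[1]:.3f}")
    ```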

  1. Regularization of Nutation Time Series at GSFC

    NASA Astrophysics Data System (ADS)

    Le Bail, K.; Gipson, J. M.; Bolotin, S.

    2012-12-01

VLBI is unique in its ability to measure all five Earth orientation parameters. In this paper we focus on the two nutation parameters which characterize the orientation of the Earth's rotation axis in space. We look at the periodicities and the spectral characteristics of these parameters for both R1 and R4 sessions independently. The most significant periodic signal with a period shorter than 600 days is common to all four time series (a period of about 450 days), and the type of noise determined by the Allan variance is white noise for all four series. To investigate methods of regularizing the series, we look at a Singular Spectrum Analysis-derived method and at the Kalman filter. The two methods adequately reproduce the tendency of the nutation time series, but the resulting series are noisier using the Singular Spectrum Analysis-derived method.
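
    The white-noise identification mentioned above can be reproduced in a few lines: for white noise the Allan variance of tau-sample averages falls off as 1/tau. A minimal sketch of the non-overlapping variant (function names are ours):

    ```python
    import numpy as np

    def allan_variance(x, tau):
        """Non-overlapping Allan variance at averaging length tau (in samples)."""
        m = len(x) // tau
        means = x[:m * tau].reshape(m, tau).mean(axis=1)
        return 0.5 * np.mean(np.diff(means) ** 2)

    rng = np.random.default_rng(3)
    white = rng.standard_normal(4096)            # stand-in for a nutation residual series
    for tau in (1, 4, 16, 64):
        print(tau, allan_variance(white, tau))   # scales roughly as 1/tau for white noise
    ```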

  2. Homogenising time series: Beliefs, dogmas and facts

    NASA Astrophysics Data System (ADS)

    Domonkos, P.

    2010-09-01

    For obtaining reliable information about climate change and climate variability the use of high quality data series is essentially important, and one basic tool of quality improvements is the statistical homogenisation of observed time series. In the recent decades large number of homogenisation methods has been developed, but the real effects of their application on time series are still not known entirely. The ongoing COST HOME project (COST ES0601) is devoted to reveal the real impacts of homogenisation methods more detailed and with higher confidence than earlier. As part of the COST activity, a benchmark dataset was built whose characteristics approach well the characteristics of real networks of observed time series. This dataset offers much better opportunity than ever to test the wide variety of homogenisation methods, and analyse the real effects of selected theoretical recommendations. The author believes that several old theoretical rules have to be re-evaluated. Some examples of the hot questions, a) Statistically detected change-points can be accepted only with the confirmation of metadata information? b) Do semi-hierarchic algorithms for detecting multiple change-points in time series function effectively in practise? c) Is it good to limit the spatial comparison of candidate series with up to five other series in the neighbourhood? Empirical results - those from the COST benchmark, and other experiments too - show that real observed time series usually include several inhomogeneities of different sizes. Small inhomogeneities seem like part of the climatic variability, thus the pure application of classic theory that change-points of observed time series can be found and corrected one-by-one is impossible. However, after homogenisation the linear trends, seasonal changes and long-term fluctuations of time series are usually much closer to the reality, than in raw time series. The developers and users of homogenisation methods have to bear in mind that

  3. Spectra: Time series power spectrum calculator

    NASA Astrophysics Data System (ADS)

    Gallardo, Tabaré

    2017-01-01

Spectra calculates the power spectrum of a time series, equally spaced or not, based on the Spectral Correlation Coefficient (Ferraz-Mello 1981, Astron. Journal 86(4), 619). It is very efficient for the detection of low frequencies.

  4. Advanced spectral methods for climatic time series

    USGS Publications Warehouse

    Ghil, M.; Allen, M.R.; Dettinger, M.D.; Ide, K.; Kondrashov, D.; Mann, M.E.; Robertson, A.W.; Saunders, A.; Tian, Y.; Varadi, F.; Yiou, P.

    2002-01-01

The analysis of univariate or multivariate time series provides crucial information to describe, understand, and predict climatic variability. The discovery and implementation of a number of novel methods for extracting useful information from time series has recently revitalized this classical field of study. Considerable progress has also been made in interpreting the information so obtained in terms of dynamical systems theory. In this review we describe the connections between time series analysis and nonlinear dynamics, discuss signal-to-noise enhancement, and present some of the novel methods for spectral analysis. The various steps, as well as the advantages and disadvantages of these methods, are illustrated by their application to an important climatic time series, the Southern Oscillation Index. This index captures major features of interannual climate variability and is used extensively in its prediction. Regional and global sea surface temperature data sets are used to illustrate multivariate spectral methods. Open questions and further prospects conclude the review.

  5. Complex network approach to fractional time series

    SciTech Connect

    Manshour, Pouya

    2015-10-15

In order to extract correlation information inherent in stochastic time series, the visibility graph algorithm has recently been proposed, by which a time series can be mapped onto a complex network. We demonstrate that the visibility algorithm is not an appropriate one for studying the correlation aspects of a time series. We then employ the horizontal visibility algorithm, a much simpler one, to map fractional processes onto complex networks. The degree distributions are shown to have parabolic exponential forms with a Hurst-dependent fitting parameter. Further, we take into account other topological properties such as the maximum eigenvalue of the adjacency matrix and the degree assortativity, and show that such topological quantities can also be used to predict the Hurst exponent, with an exception for anti-persistent fractional Gaussian noises. To solve this problem, we take into account the Spearman correlation coefficient between nodes' degrees and their corresponding data values in the original time series.
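
    The horizontal visibility rule is even simpler than the natural one: two samples see each other when every sample in between is lower than both. A sketch of the degree-sequence computation (plain white noise stands in for the fractional processes; names are ours):

    ```python
    import numpy as np

    def hvg_degrees(x):
        """Degree sequence of the horizontal visibility graph of series x."""
        n = len(x)
        deg = np.zeros(n, dtype=int)
        for i in range(n - 1):
            for j in range(i + 1, n):
                # Adjacent samples always connect (empty intermediate range).
                if all(x[k] < min(x[i], x[j]) for k in range(i + 1, j)):
                    deg[i] += 1
                    deg[j] += 1
        return deg

    rng = np.random.default_rng(4)
    deg = hvg_degrees(rng.standard_normal(300))
    # For i.i.d. noise the HVG degree distribution is the known exponential
    # P(k) = (1/3)(2/3)^(k-2); correlated (fractional) input bends this form.
    ```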

  6. Improving Intercomparability of Marine Biogeochemical Time Series

    NASA Astrophysics Data System (ADS)

    Benway, Heather M.; Telszewski, Maciej; Lorenzoni, Laura

    2013-04-01

    Shipboard biogeochemical time series represent one of the most valuable tools scientists have to quantify marine elemental fluxes and associated biogeochemical processes and to understand their links to changing climate. They provide the long, temporally resolved data sets needed to characterize ocean climate, biogeochemistry, and ecosystem variability and change. However, to monitor and differentiate natural cycles and human-driven changes in the global oceans, time series methodologies must be transparent and intercomparable when possible. To review current shipboard biogeochemical time series sampling and analytical methods, the International Ocean Carbon Coordination Project (IOCCP; http://www.ioccp.org/) and the Ocean Carbon and Biogeochemistry Program (http://www.us-ocb.org/) convened an international ocean time series workshop at the Bermuda Institute for Ocean Sciences.

  7. Detecting nonlinear structure in time series

    SciTech Connect

    Theiler, J.

    1991-01-01

We describe an approach for evaluating the statistical significance of evidence for nonlinearity in a time series. The formal application of our method requires the careful statement of a null hypothesis which characterizes a candidate linear process, the generation of an ensemble of "surrogate" data sets which are similar to the original time series but consistent with the null hypothesis, and the computation of a discriminating statistic for the original and for each of the surrogate data sets. The idea is to test the original time series against the null hypothesis by checking whether the discriminating statistic computed for the original time series differs significantly from the statistics computed for each of the surrogate sets. While some data sets very cleanly exhibit low-dimensional chaos, there are many cases where the evidence is sketchy and difficult to evaluate. We hope to provide a framework within which such claims of nonlinearity can be evaluated. 5 refs., 4 figs.
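
    A minimal sketch of this recipe, using phase-randomized (FFT) surrogates for a linear Gaussian null and time-reversal asymmetry as the discriminating statistic; both concrete choices are ours, not necessarily the paper's:

    ```python
    import numpy as np

    def phase_randomized_surrogate(x, rng):
        """Same power spectrum as x, randomized phases: a realization
        consistent with a linear Gaussian null hypothesis."""
        X = np.fft.rfft(x)
        phases = rng.uniform(0, 2 * np.pi, len(X))
        phases[0] = 0.0                      # keep the mean real
        if len(x) % 2 == 0:
            phases[-1] = 0.0                 # keep the Nyquist bin real
        return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=len(x))

    def trev(x, lag=1):
        """Time-reversal asymmetry, a simple nonlinearity statistic."""
        d = x[lag:] - x[:-lag]
        return np.mean(d ** 3) / np.mean(d ** 2) ** 1.5

    rng = np.random.default_rng(5)
    # Logistic map: a strongly nonlinear test signal.
    x = np.empty(1000); x[0] = 0.4
    for i in range(999):
        x[i + 1] = 3.9 * x[i] * (1 - x[i])

    t0 = trev(x)
    ts = [trev(phase_randomized_surrogate(x, rng)) for _ in range(99)]
    rank = sum(t < t0 for t in ts)   # extreme rank among surrogates => reject null
    print(t0, np.mean(ts), rank)
    ```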

  8. Nonlinear Analysis of Surface EMG Time Series

    NASA Astrophysics Data System (ADS)

    Zurcher, Ulrich; Kaufman, Miron; Sung, Paul

    2004-04-01

    Applications of nonlinear analysis of surface electromyography time series of patients with and without low back pain are presented. Limitations of the standard methods based on the power spectrum are discussed.

  9. Computational Time-Accurate Body Movement: Methodology, Validation, and Application

    DTIC Science & Technology

    1995-10-01

A wing was used that had a leading-edge sweep angle of 45 deg and a NACA 64A010 symmetrical airfoil section; a cross section of the pylon is symmetrical. Comparisons of the computational results to data for a NACA-0012 airfoil following a predefined pitching motion are presented as validation of the time-accurate store trajectory prediction process.

  10. Detecting chaos in irregularly sampled time series.

    PubMed

    Kulp, C W

    2013-09-01

Recently, Wiebe and Virgin [Chaos 22, 013136 (2012)] developed an algorithm which detects chaos by analyzing a time series' power spectrum, computed using the Discrete Fourier Transform (DFT). Their algorithm, like other time series characterization algorithms, requires that the time series be regularly sampled. Real-world data, however, are often irregularly sampled, thus making the detection of chaotic behavior difficult or impossible with those methods. In this paper, a characterization algorithm is presented which effectively detects chaos in irregularly sampled time series. The work presented here is a modification of Wiebe and Virgin's algorithm and uses the Lomb-Scargle Periodogram (LSP) to compute a series' power spectrum instead of the DFT. The DFT is not appropriate for irregularly sampled time series; the LSP, however, is capable of computing the frequency content of irregularly sampled data. Furthermore, a new method of analyzing the power spectrum is developed, which can be useful for differentiating between chaotic and non-chaotic behavior. The new characterization algorithm is successfully applied to irregularly sampled data generated by a model as well as data consisting of observations of variable stars.
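
    SciPy exposes the LSP directly, so the spectral step of such an algorithm can be sketched as follows; the irregular sampling and test signal here are synthetic:

    ```python
    import numpy as np
    from scipy.signal import lombscargle

    rng = np.random.default_rng(6)
    t = np.sort(rng.uniform(0, 100, 400))          # irregular sample times
    x = np.sin(2 * np.pi * 0.5 * t) + 0.3 * rng.standard_normal(len(t))

    freqs = np.linspace(0.01, 2.0, 2000)           # trial frequencies in Hz
    # lombscargle expects angular frequencies, hence the 2*pi factor.
    power = lombscargle(t, x - x.mean(), 2 * np.pi * freqs, normalize=True)
    print(freqs[np.argmax(power)])                 # recovers ~0.5 Hz
    ```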

  11. Highly comparative time-series analysis: the empirical structure of time series and their methods.

    PubMed

    Fulcher, Ben D; Little, Max A; Jones, Nick S

    2013-06-06

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.

  12. Highly comparative time-series analysis: the empirical structure of time series and their methods

    PubMed Central

    Fulcher, Ben D.; Little, Max A.; Jones, Nick S.

    2013-01-01

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines. PMID:23554344

  13. Turbulencelike Behavior of Seismic Time Series

    SciTech Connect

    Manshour, P.; Saberi, S.; Sahimi, Muhammad; Peinke, J.; Pacheco, Amalio F.; Rahimi Tabar, M. Reza

    2009-01-09

    We report on a stochastic analysis of Earth's vertical velocity time series by using methods originally developed for complex hierarchical systems and, in particular, for turbulent flows. Analysis of the fluctuations of the detrended increments of the series reveals a pronounced transition in their probability density function from Gaussian to non-Gaussian. The transition occurs 5-10 hours prior to a moderate or large earthquake, hence representing a new and reliable precursor for detecting such earthquakes.

  14. Learning time series for intelligent monitoring

    NASA Technical Reports Server (NTRS)

    Manganaris, Stefanos; Fisher, Doug

    1994-01-01

    We address the problem of classifying time series according to their morphological features in the time domain. In a supervised machine-learning framework, we induce a classification procedure from a set of preclassified examples. For each class, we infer a model that captures its morphological features using Bayesian model induction and the minimum message length approach to assign priors. In the performance task, we classify a time series in one of the learned classes when there is enough evidence to support that decision. Time series with sufficiently novel features, belonging to classes not present in the training set, are recognized as such. We report results from experiments in a monitoring domain of interest to NASA.

  15. Predicting road accidents: Structural time series approach

    NASA Astrophysics Data System (ADS)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-07-01

In this paper, a model for the occurrence of road accidents in Malaysia between the years 1970 and 2010 was developed, and with this model the number of road accidents was predicted using the structural time series approach. The models are developed using a stepwise method, and the residuals of each step are analyzed. The accuracy of the model is assessed using the mean absolute percentage error (MAPE), and the best model is chosen based on the smallest Akaike information criterion (AIC) value. The structural time series approach found that the local linear trend model is the best model to represent the road accidents. This model allows the level and slope components to vary over time. In addition, this approach also provides useful information for improving on the conventional time series method.
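
    Fitting the local linear trend model the authors select can be sketched with statsmodels' unobserved-components framework; the accident counts below are simulated stand-ins for the Malaysian data:

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(7)
    # Simulated yearly counts, 1970-2010, with a slowly drifting trend.
    n = 41
    slope = np.cumsum(0.5 * rng.standard_normal(n))
    y = 100 + np.cumsum(slope) + 5 * rng.standard_normal(n)

    # Local linear trend: both the level and the slope evolve as random walks.
    model = sm.tsa.UnobservedComponents(y, level='local linear trend')
    res = model.fit(disp=False)
    print(res.summary())
    forecast = res.forecast(steps=5)   # predict the next five years
    ```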

  16. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, L.M.; Ng, E.G.

    1998-09-29

Methods and apparatus are disclosed for automatically detecting differences between similar but different states in a nonlinear process by monitoring nonlinear data. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated. 8 figs.

  17. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, Lee M.; Ng, Esmond G.

    1998-01-01

Methods and apparatus for automatically detecting differences between similar but different states in a nonlinear process by monitoring nonlinear data. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated.

  18. RTbox: a device for highly accurate response time measurements.

    PubMed

    Li, Xiangrui; Liang, Zhen; Kleiner, Mario; Lu, Zhong-Lin

    2010-02-01

    Although computer keyboards and mice are frequently used in measuring response times (RTs), the accuracy of these measurements is quite low. Specialized RT collection devices must be used to obtain more accurate measurements. However, all the existing devices have some shortcomings. We have developed and implemented a new, commercially available device, the RTbox, for highly accurate RT measurements. The RTbox has its own microprocessor and high-resolution clock. It can record the identities and timing of button events with high accuracy, unaffected by potential timing uncertainty or biases during data transmission and processing in the host computer. It stores button events until the host computer chooses to retrieve them. The asynchronous storage greatly simplifies the design of user programs. The RTbox can also receive and record external signals as triggers and can measure RTs with respect to external events. The internal clock of the RTbox can be synchronized with the computer clock, so the device can be used without external triggers. A simple USB connection is sufficient to integrate the RTbox with any standard computer and operating system.

  19. Layered Ensemble Architecture for Time Series Forecasting.

    PubMed

    Rahman, Md Mustafizur; Islam, Md Monirul; Murase, Kazuyuki; Yao, Xin

    2016-01-01

Time series forecasting (TSF) has been widely used in many application areas such as science, engineering, and finance. The phenomena generating time series are usually unknown, and the information available for forecasting is limited to the past values of the series. It is, therefore, necessary to use an appropriate number of past values, termed the lag, for forecasting. This paper proposes a layered ensemble architecture (LEA) for TSF problems. Our LEA consists of two layers, each of which uses an ensemble of multilayer perceptron (MLP) networks. While the first ensemble layer tries to find an appropriate lag, the second ensemble layer employs the obtained lag for forecasting. Unlike most previous work on TSF, the proposed architecture considers both the accuracy and the diversity of the individual networks in constructing an ensemble. LEA trains different networks in the ensemble using different training sets, with the aim of maintaining diversity among the networks. However, it uses the appropriate lag and combines the best trained networks to construct the ensemble. This reflects LEA's emphasis on the accuracy of the networks. The proposed architecture has been tested extensively on time series data from the NN3 and NN5 competitions. It has also been tested on several standard benchmark time series data sets. In terms of forecasting accuracy, our experimental results clearly reveal that LEA is better than other ensemble and non-ensemble methods.

  20. Complex network analysis of time series

    NASA Astrophysics Data System (ADS)

    Gao, Zhong-Ke; Small, Michael; Kurths, Jürgen

    2016-12-01

Revealing complicated behaviors from time series constitutes a fundamental problem of continuing interest, and it has attracted a great deal of attention from a wide variety of fields on account of its significant importance. The past decade has witnessed rapid development of complex network studies, which allow one to characterize many types of systems in nature and technology that contain a large number of components interacting with each other in a complicated manner. Recently, complex network theory has been incorporated into the analysis of time series, and fruitful achievements have been obtained. Complex network analysis of time series opens up new avenues to address interdisciplinary challenges in climate dynamics, multiphase flow, brain functions, ECG dynamics, economics and traffic systems.

  1. Intrinsic superstatistical components of financial time series

    NASA Astrophysics Data System (ADS)

    Vamoş, Călin; Crăciun, Maria

    2014-12-01

    Time series generated by a complex hierarchical system exhibit various types of dynamics at different time scales. A financial time series is an example of such a multiscale structure with time scales ranging from minutes to several years. In this paper we decompose the volatility of financial indices into five intrinsic components and we show that it has a heterogeneous scale structure. The small-scale components have a stochastic nature and they are independent 99% of the time, becoming synchronized during financial crashes and enhancing the heavy tails of the volatility distribution. The deterministic behavior of the large-scale components is related to the nonstationarity of the financial markets evolution. Our decomposition of the financial volatility is a superstatistical model more complex than those usually limited to a superposition of two independent statistics at well-separated time scales.

  2. Clustering Short Time-Series Microarray

    NASA Astrophysics Data System (ADS)

    Ping, Loh Wei; Hasan, Yahya Abu

    2008-01-01

Most microarray analyses are carried out on static gene expressions. However, the dynamical study of microarrays has lately gained more attention. Most research on time-series microarrays emphasizes the bioscience and medical aspects, but little addresses the numerical aspect. This study attempts to analyze short time-series microarrays mathematically using the STEM clustering tool, which formally preprocesses the data and then clusters them. We next introduce the Circular Mould Distance (CMD) algorithm, which combines preprocessing and clustering analysis. Both methods are subsequently compared in terms of efficiency.

  3. Analyses of Inhomogeneities in Radiosonde Temperature and Humidity Time Series.

    NASA Astrophysics Data System (ADS)

    Zhai, Panmao; Eskridge, Robert E.

    1996-04-01

Twice daily radiosonde data from selected stations in the United States (period 1948 to 1990) and China (period 1958 to 1990) were sorted into time series. These stations have one sounding taken in darkness and the other in sunlight. The analysis shows that the 0000 and 1200 UTC time series are highly correlated. Therefore, the Easterling and Peterson technique was tested on the 0000 and 1200 UTC time series to detect inhomogeneities and to estimate the size of the biases. Discontinuities were detected using the difference series created from the 0000 and 1200 UTC time series. To establish that a detected bias was significant, a t test was performed to confirm that the change occurs in the daytime series but not in the nighttime series. Both U.S. and Chinese radiosonde temperature and humidity data include inhomogeneities caused by changes in radiosonde sensors and observation times. The U.S. humidity data have inhomogeneities that were caused by instrument changes and the censoring of data. The practice of reporting relative humidity as 19% when it is lower than 20% or the temperature is below -40°C is called censoring. This combination of procedural and instrument changes makes the detection of biases and adjustment of the data very difficult. In the Chinese temperatures, there are inhomogeneities related to a change in the radiation correction procedure. Test results demonstrate that a modified Easterling and Peterson method is suitable for detecting and adjusting time series radiosonde data. Accurate station histories are very desirable. Station histories can confirm that detected inhomogeneities are related to instrument or procedural changes. Adjustments can then be made to the data with some confidence.

  4. Circulant Matrices and Time-Series Analysis

    ERIC Educational Resources Information Center

    Pollock, D. S. G.

    2002-01-01

    This paper sets forth some salient results in the algebra of circulant matrices which can be used in time-series analysis. It provides easy derivations of some results that are central to the analysis of statistical periodograms and empirical spectral density functions. A statistical test for the stationarity or homogeneity of empirical processes…

  5. Nonlinear Time Series Analysis via Neural Networks

    NASA Astrophysics Data System (ADS)

    Volná, Eva; Janošek, Michal; Kocian, Václav; Kotyrba, Martin

This article deals with time series analysis based on neural networks for effective forex market pattern recognition [Moore and Roche, J. Int. Econ. 58, 387-411 (2002)]. Our goal is to find and recognize important patterns which repeatedly appear in the market history, and to adapt our trading system's behaviour based on them.

  6. Three Analysis Examples for Time Series Data

    Technology Transfer Automated Retrieval System (TEKTRAN)

    With improvements in instrumentation and the automation of data collection, plot level repeated measures and time series data are increasingly available to monitor and assess selected variables throughout the duration of an experiment or project. Records and metadata on variables of interest alone o...

  7. Directionality volatility in electroencephalogram time series

    NASA Astrophysics Data System (ADS)

    Mansor, Mahayaudin M.; Green, David A.; Metcalfe, Andrew V.

    2016-06-01

    We compare time series of electroencephalograms (EEGs) from healthy volunteers with EEGs from subjects diagnosed with epilepsy. The EEG time series from the healthy group are recorded during awake state with their eyes open and eyes closed, and the records from subjects with epilepsy are taken from three different recording regions of pre-surgical diagnosis: hippocampal, epileptogenic and seizure zone. The comparisons for these 5 categories are in terms of deviations from linear time series models with constant variance Gaussian white noise error inputs. One feature investigated is directionality, and how this can be modelled by either non-linear threshold autoregressive models or non-Gaussian errors. A second feature is volatility, which is modelled by Generalized AutoRegressive Conditional Heteroskedasticity (GARCH) processes. Other features include the proportion of variability accounted for by time series models, and the skewness and the kurtosis of the residuals. The results suggest these comparisons may have diagnostic potential for epilepsy and provide early warning of seizures.
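
    Volatility modelling of an EEG-like series as described above can be sketched with the third-party arch package; the signal is simulated and the GARCH(1,1) order is a conventional choice, not taken from the paper:

    ```python
    import numpy as np
    from arch import arch_model

    rng = np.random.default_rng(8)
    # Simulate a GARCH(1,1)-like series as an EEG stand-in.
    n, omega, alpha, beta = 2000, 0.1, 0.1, 0.85
    eps, sigma2 = np.zeros(n), np.zeros(n)
    sigma2[0] = omega / (1 - alpha - beta)
    for t in range(1, n):
        sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
        eps[t] = np.sqrt(sigma2[t]) * rng.standard_normal()

    # Fit GARCH(1,1); significant alpha/beta estimates indicate volatility clustering.
    res = arch_model(eps, vol='GARCH', p=1, q=1, mean='Zero').fit(disp='off')
    print(res.params)
    ```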

  8. Determinism test for very short time series.

    PubMed

    Binder, P-M; Igarashi, Ryu; Seymour, William; Takeishi, Candy

    2005-03-01

A test for determinism suitable for time series shorter than 100 points is presented and applied to numerical and observed data. The method exploits the linear dependence of d(t) on d(0) in the expression d(t) ≈ d(0)e^(λt), which describes the growth of small separations between trajectories in chaotic systems.

  9. Remote Sensing Time Series Product Tool

    NASA Technical Reports Server (NTRS)

    Predos, Don; Ryan, Robert E.; Ross, Kenton W.

    2006-01-01

The TSPT (Time Series Product Tool) software was custom-designed for NASA to rapidly create and display single-band and band-combination time series, such as NDVI (Normalized Difference Vegetation Index) images, for wide-area crop surveillance and for other time-critical applications. The TSPT, developed in MATLAB, allows users to create and display various MODIS (Moderate Resolution Imaging Spectroradiometer) or simulated VIIRS (Visible/Infrared Imager Radiometer Suite) products as single images, as time series plots at a selected location, or as temporally processed image videos. Manually creating these types of products is extremely labor intensive; however, the TSPT makes the process simple and efficient. MODIS is ideal for monitoring large crop areas because of its wide swath (2330 km), its relatively small ground sample distance (250 m), and its high revisit rate (twice daily). Furthermore, because MODIS imagery is acquired daily, rapid changes in vegetative health can potentially be detected. The new TSPT technology provides users with the ability to temporally process high-revisit-rate satellite imagery, such as that acquired from MODIS and from its successor, the VIIRS. The TSPT features the important capability of fusing data from both MODIS instruments onboard the Terra and Aqua satellites, which drastically improves cloud statistics. With the TSPT, MODIS metadata is used to find and optionally remove bad and suspect data. Noise removal and temporal processing techniques allow users to create low-noise time series plots and image videos and to select settings and thresholds that tailor particular output products. The TSPT GUI (graphical user interface) provides an interactive environment for crafting what-if scenarios by enabling a user to repeat product generation using different settings and thresholds. The TSPT Application Programming Interface provides more fine-tuned control of product generation, allowing experienced

  10. Delay Differential Analysis of Time Series

    PubMed Central

    Lainscsek, Claudia; Sejnowski, Terrence J.

    2015-01-01

    Nonlinear dynamical system analysis based on embedding theory has been used for modeling and prediction, but it also has applications to signal detection and classification of time series. An embedding creates a multidimensional geometrical object from a single time series. Traditionally either delay or derivative embeddings have been used. The delay embedding is composed of delayed versions of the signal, and the derivative embedding is composed of successive derivatives of the signal. The delay embedding has been extended to nonuniform embeddings to take multiple timescales into account. Both embeddings provide information on the underlying dynamical system without having direct access to all the system variables. Delay differential analysis is based on functional embeddings, a combination of the derivative embedding with nonuniform delay embeddings. Small delay differential equation (DDE) models that best represent relevant dynamic features of time series data are selected from a pool of candidate models for detection or classification. We show that the properties of DDEs support spectral analysis in the time domain where nonlinear correlation functions are used to detect frequencies, frequency and phase couplings, and bispectra. These can be efficiently computed with short time windows and are robust to noise. For frequency analysis, this framework is a multivariate extension of discrete Fourier transform (DFT), and for higher-order spectra, it is a linear and multivariate alternative to multidimensional fast Fourier transform of multidimensional correlations. This method can be applied to short or sparse time series and can be extended to cross-trial and cross-channel spectra if multiple short data segments of the same experiment are available. Together, this time-domain toolbox provides higher temporal resolution, increased frequency and phase coupling information, and it allows an easy and straightforward implementation of higher-order spectra across time

  11. Delay differential analysis of time series.

    PubMed

    Lainscsek, Claudia; Sejnowski, Terrence J

    2015-03-01

    Nonlinear dynamical system analysis based on embedding theory has been used for modeling and prediction, but it also has applications to signal detection and classification of time series. An embedding creates a multidimensional geometrical object from a single time series. Traditionally either delay or derivative embeddings have been used. The delay embedding is composed of delayed versions of the signal, and the derivative embedding is composed of successive derivatives of the signal. The delay embedding has been extended to nonuniform embeddings to take multiple timescales into account. Both embeddings provide information on the underlying dynamical system without having direct access to all the system variables. Delay differential analysis is based on functional embeddings, a combination of the derivative embedding with nonuniform delay embeddings. Small delay differential equation (DDE) models that best represent relevant dynamic features of time series data are selected from a pool of candidate models for detection or classification. We show that the properties of DDEs support spectral analysis in the time domain where nonlinear correlation functions are used to detect frequencies, frequency and phase couplings, and bispectra. These can be efficiently computed with short time windows and are robust to noise. For frequency analysis, this framework is a multivariate extension of discrete Fourier transform (DFT), and for higher-order spectra, it is a linear and multivariate alternative to multidimensional fast Fourier transform of multidimensional correlations. This method can be applied to short or sparse time series and can be extended to cross-trial and cross-channel spectra if multiple short data segments of the same experiment are available. Together, this time-domain toolbox provides higher temporal resolution, increased frequency and phase coupling information, and it allows an easy and straightforward implementation of higher-order spectra across time

  12. Time-frequency analysis of electroencephalogram series

    NASA Astrophysics Data System (ADS)

    Blanco, S.; Quiroga, R. Quian; Rosso, O. A.; Kochen, S.

    1995-03-01

In this paper we propose a method, based on the Gabor transform, to quantify and visualize the time evolution of the traditional frequency bands defined in the analysis of electroencephalogram (EEG) series. The information obtained in this way can be used for information transfer analyses of epileptic seizures as well as for their characterization. We found an optimal correlation between EEG visual inspection and the proposed method in the characterization of paroxysms, spikes, and other transient alterations of background activity. The dynamical changes during an epileptic seizure are shown through the phase portrait. The proposed method is exemplified with EEG series obtained with depth electrodes in refractory epileptic patients.
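
    The Gabor transform is a short-time Fourier transform with a Gaussian window, so the band-power evolution described above can be sketched with SciPy; the band edges, window length, and test signal are conventional illustrative choices, not taken from the paper:

    ```python
    import numpy as np
    from scipy.signal import stft

    rng = np.random.default_rng(9)
    fs = 256.0                                  # sampling rate, Hz
    t = np.arange(0, 30, 1 / fs)
    # EEG stand-in: a 10 Hz alpha burst in the middle third of noisy background.
    eeg = rng.standard_normal(len(t))
    mid = slice(len(t) // 3, 2 * len(t) // 3)
    eeg[mid] += 2 * np.sin(2 * np.pi * 10 * t[mid])

    # Gabor transform = STFT with a Gaussian window.
    f, tt, Z = stft(eeg, fs=fs, window=('gaussian', 32), nperseg=256)
    power = np.abs(Z) ** 2

    bands = {'delta': (0.5, 4), 'theta': (4, 8), 'alpha': (8, 13), 'beta': (13, 30)}
    for name, (lo, hi) in bands.items():
        sel = (f >= lo) & (f < hi)
        print(name, power[sel].sum(axis=0).round(2))   # band power vs. time
    ```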

  13. Accurate finite difference methods for time-harmonic wave propagation

    NASA Technical Reports Server (NTRS)

    Harari, Isaac; Turkel, Eli

    1994-01-01

Finite difference methods for solving problems of time-harmonic acoustics are developed and analyzed. Multidimensional inhomogeneous problems with variable, possibly discontinuous, coefficients are considered, accounting for the effects of employing nonuniform grids. A weighted-average representation is less sensitive to transition in wave resolution (due to variable wave numbers or nonuniform grids) than the standard pointwise representation. Further enhancement in method performance is obtained by basing the stencils on generalizations of Padé approximation, or generalized definitions of the derivative, reducing spurious dispersion, anisotropy and reflection, and by improving the representation of source terms. The resulting schemes have fourth-order accurate local truncation error on uniform grids and third order in the nonuniform case. Guidelines for discretization pertaining to grid orientation and resolution are presented.

  14. Algorithm for Compressing Time-Series Data

    NASA Technical Reports Server (NTRS)

    Hawkins, S. Edward, III; Darlington, Edward Hugo

    2012-01-01

    An algorithm based on Chebyshev polynomials effects lossy compression of time-series data or other one-dimensional data streams (e.g., spectral data) that are arranged in blocks for sequential transmission. The algorithm was developed for use in transmitting data from spacecraft scientific instruments to Earth stations. In spite of its lossy nature, the algorithm preserves the information needed for scientific analysis. The algorithm is computationally simple, yet compresses data streams by factors much greater than two. The algorithm is not restricted to spacecraft or scientific uses: it is applicable to time-series data in general. The algorithm can also be applied to general multidimensional data that have been converted to time-series data, a typical example being image data acquired by raster scanning. However, unlike most prior image-data-compression algorithms, this algorithm neither depends on nor exploits the two-dimensional spatial correlations that are generally present in images. In order to understand the essence of this compression algorithm, it is necessary to understand that the net effect of this algorithm and the associated decompression algorithm is to approximate the original stream of data as a sequence of finite series of Chebyshev polynomials. For the purpose of this algorithm, a block of data or interval of time for which a Chebyshev polynomial series is fitted to the original data is denoted a fitting interval. Chebyshev approximation has two properties that make it particularly effective for compressing serial data streams with minimal loss of scientific information: The errors associated with a Chebyshev approximation are nearly uniformly distributed over the fitting interval (this is known in the art as the "equal error property"); and the maximum deviations of the fitted Chebyshev polynomial from the original data have the smallest possible values (this is known in the art as the "min-max property").
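
    A minimal sketch of the fit-per-block idea using NumPy's Chebyshev routines; the block length and polynomial degree are illustrative, and the flight algorithm's quantization and error control are omitted:

    ```python
    import numpy as np
    from numpy.polynomial import chebyshev as C

    def compress(x, block=64, degree=7):
        """Fit a Chebyshev series to each block; keep only the coefficients."""
        coeffs = []
        for start in range(0, len(x) - len(x) % block, block):
            y = x[start:start + block]
            u = np.linspace(-1.0, 1.0, len(y))   # map each fitting interval to [-1, 1]
            coeffs.append(C.chebfit(u, y, degree))
        return np.array(coeffs)                  # (n_blocks, degree+1): ~8x smaller here

    def decompress(coeffs, block=64):
        u = np.linspace(-1.0, 1.0, block)
        return np.concatenate([C.chebval(u, c) for c in coeffs])

    rng = np.random.default_rng(10)
    t = np.linspace(0, 1, 1024)
    x = np.sin(2 * np.pi * 3 * t) + 0.01 * rng.standard_normal(len(t))
    xc = decompress(compress(x))
    print(np.max(np.abs(x - xc)))   # near-uniform residual, per the min-max property
    ```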

  15. Accurate expressions for solar cell fill factors including series and shunt resistances

    NASA Astrophysics Data System (ADS)

    Green, Martin A.

    2016-02-01

    Together with open-circuit voltage and short-circuit current, fill factor is a key solar cell parameter. In their classic paper on limiting efficiency, Shockley and Queisser first investigated this factor's analytical properties showing, for ideal cells, it could be expressed implicitly in terms of the maximum power point voltage. Subsequently, fill factors usually have been calculated iteratively from such implicit expressions or from analytical approximations. In the absence of detrimental series and shunt resistances, analytical fill factor expressions have recently been published in terms of the Lambert W function available in most mathematical computing software. Using a recently identified perturbative relationship, exact expressions in terms of this function are derived in technically interesting cases when both series and shunt resistances are present but have limited impact, allowing a better understanding of their effect individually and in combination. Approximate expressions for arbitrary shunt and series resistances are then deduced, which are significantly more accurate than any previously published. A method based on the insights developed is also reported for deducing one-diode fits to experimental data.
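
    For the resistance-free ideal cell mentioned above, the Lambert W form can be written down directly: with voltages normalized by the thermal voltage, the maximum power point satisfies (1 + v_mp)e^(1 + v_mp) = e^(1 + v_oc). The short derivation behind the final expression is ours and covers only the ideal single-diode case, not the paper's series/shunt formulas:

    ```python
    import numpy as np
    from scipy.special import lambertw

    def fill_factor_ideal(voc_norm):
        """Ideal-diode fill factor; voc_norm = q*Voc/(n*k*T).
        With w = W(exp(1 + voc)): v_mp = w - 1, i_mp/i_sc = v_mp/(1 + v_mp),
        hence FF = (w - 1)^2 / (w * voc)."""
        w = np.real(lambertw(np.exp(1.0 + voc_norm)))   # overflows for very large voc
        return (w - 1.0) ** 2 / (w * voc_norm)

    # Example: Voc = 0.70 V, ideality n = 1, T = 300 K  =>  voc_norm ~ 27.
    voc_norm = 0.70 / 0.02585
    print(fill_factor_ideal(voc_norm))   # ~0.84, consistent with tabulated ideal FFs
    ```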

  16. Modelling population change from time series data

    USGS Publications Warehouse

    Barker, R.J.; Sauer, J.R.; McCullough, D.R.; Barrett, R.H.

    1992-01-01

Information on change in population size over time is among the most basic inputs for population management. Unfortunately, population changes are generally difficult to identify, and once identified difficult to explain. Sources of variation (patterns) in population data include: changes in environment that affect carrying capacity and produce trend, autocorrelative processes, irregular environmentally induced perturbations, and stochasticity arising from population processes. In addition, populations are almost never censused, and many surveys (e.g., the North American Breeding Bird Survey) produce multiple, incomplete time series of population indices, providing further sampling complications. We suggest that each source of pattern should be used to address specific hypotheses regarding population change, but that failure to correctly model each source can lead to false conclusions about the dynamics of populations. We consider hypothesis tests based on each source of pattern, and the effects of autocorrelated observations and sampling error. We identify important constraints on analyses of time series that limit their use in identifying underlying relationships.

  17. Hurst exponents for short time series

    NASA Astrophysics Data System (ADS)

    Qi, Jingchao; Yang, Huijie

    2011-12-01

A concept called the balanced estimator of diffusion entropy is proposed to quantitatively detect scalings in short time series. Its effectiveness is verified by successfully detecting scaling properties for a large number of artificial fractional Brownian motions. Calculations show that this method can give reliable scalings for short time series with length ~10^2. It is also used to detect scalings in the Shanghai Stock Index, five stock catalogs, and a total of 134 stocks collected from the Shanghai Stock Exchange Market. The scaling exponent for each catalog is significantly larger compared with that for the stocks included in the catalog. Selecting a window of size 650, the evolution of scaling for the Shanghai Stock Index is obtained by sliding the window along the series. Global patterns in the evolutionary process are captured from the smoothed evolutionary curve. By comparing the patterns with the list of important events in the history of the considered stock market, the evolution of scaling is matched with the stock index series. We find that the important events fit very well with global transitions of the scaling behaviors.

  18. Quantitative proteomic analysis by accurate mass retention time pairs.

    PubMed

    Silva, Jeffrey C; Denny, Richard; Dorschel, Craig A; Gorenstein, Marc; Kass, Ignatius J; Li, Guo-Zhong; McKenna, Therese; Nold, Michael J; Richardson, Keith; Young, Phillip; Geromanos, Scott

    2005-04-01

Current methodologies for protein quantitation include 2-dimensional gel electrophoresis techniques, metabolic labeling, and stable isotope labeling methods, to name only a few. The current literature illustrates both pros and cons for each of the previously mentioned methodologies. In keeping with the teachings of William of Ockham, "with all things being equal the simplest solution tends to be correct", a simple LC/MS based methodology is presented that allows relative changes in abundance of proteins in highly complex mixtures to be determined. Utilizing a reproducible chromatographic separations system along with the high mass resolution and mass accuracy of an orthogonal time-of-flight mass spectrometer, the quantitative comparison of tens of thousands of ions emanating from identically prepared control and experimental samples can be made. Using this configuration, we can determine the change in relative abundance of a small number of ions between the two conditions solely by accurate mass and retention time. Employing standard operating procedures for both sample preparation and ESI-mass spectrometry, one typically obtains under 5 ppm mass precision and quantitative variations between 10 and 15%. The principal focus of this paper will demonstrate the quantitative aspects of the methodology and continue with a discussion of the associated, complementary qualitative capabilities.

  19. Approaches for the accurate definition of geological time boundaries

    NASA Astrophysics Data System (ADS)

    Schaltegger, Urs; Baresel, Björn; Ovtcharova, Maria; Goudemand, Nicolas; Bucher, Hugo

    2015-04-01

Which strategies lead to the most precise and accurate date of a given geological boundary? Geological units are usually defined by the occurrence of characteristic taxa, and hence boundaries between these geological units correspond to dramatic faunal and/or floral turnovers; they are primarily defined using first or last occurrences of index species, or ideally by the separation interval between two consecutive, characteristic associations of fossil taxa. These boundaries need to be defined in a way that enables their worldwide recognition and correlation across different stratigraphic successions, using tools as different as bio-, magneto-, and chemo-stratigraphy, and astrochronology. Sedimentary sequences can be dated in numerical terms by applying high-precision chemical-abrasion, isotope-dilution, thermal-ionization mass spectrometry (CA-ID-TIMS) U-Pb age determination to zircon (ZrSiO4) in intercalated volcanic ashes. However, though volcanic activity is common in geological history, ashes are not necessarily close to the boundary we would like to date precisely and accurately. In addition, U-Pb zircon data sets may be very complex and difficult to interpret in terms of the age of ash deposition. To overcome these difficulties we use a multi-proxy approach, which we applied to the precise and accurate dating of the Permo-Triassic and Early-Middle Triassic boundaries in South China. a) Dense sampling of ashes across the critical time interval and a sufficiently large number of analysed zircons per ash sample can guarantee the recognition of all system complexities. Geochronological datasets from U-Pb dating of volcanic zircon may indeed combine the effects of i) post-crystallization Pb loss from percolation of hydrothermal fluids (even using chemical abrasion), with ii) age dispersion from prolonged residence of earlier crystallized zircon in the magmatic system. As a result, U-Pb dates of individual zircons are both apparently younger and older than the depositional age

  20. Detecting anomalous phase synchronization from time series

    SciTech Connect

    Tokuda, Isao T.; Kumar Dana, Syamal; Kurths, Juergen

    2008-06-15

Modeling approaches are presented for detecting an anomalous route to phase synchronization from time series of two interacting nonlinear oscillators. The anomalous transition is characterized by an enlargement of the mean frequency difference between the oscillators with an initial increase in the coupling strength. Although such a structure is common in a large class of coupled nonisochronous oscillators, prediction of the anomalous transition is nontrivial for experimental systems, whose dynamical properties are unknown. Two approaches are examined; one is a phase equational modeling of coupled limit cycle oscillators and the other is a nonlinear predictive modeling of coupled chaotic oscillators. Application to prototypical models such as two interacting predator-prey systems in both limit cycle and chaotic regimes demonstrates the capability of detecting the anomalous structure from only a few sets of time series. Experimental data from two coupled Chua circuits show its applicability to real experimental systems.

  1. Time-Series Analysis: A Cautionary Tale

    NASA Technical Reports Server (NTRS)

    Damadeo, Robert

    2015-01-01

    Time-series analysis has often been a useful tool in atmospheric science for deriving long-term trends in various atmospherically important parameters (e.g., temperature or the concentration of trace gas species). In particular, time-series analysis has been repeatedly applied to satellite datasets in order to derive the long-term trends in stratospheric ozone, which is a critical atmospheric constituent. However, many of the potential pitfalls relating to the non-uniform sampling of the datasets were often ignored and the results presented by the scientific community have been unknowingly biased. A newly developed and more robust application of this technique is applied to the Stratospheric Aerosol and Gas Experiment (SAGE) II version 7.0 ozone dataset and the previous biases and newly derived trends are presented.

  2. Aggregated Indexing of Biomedical Time Series Data.

    PubMed

    Woodbridge, Jonathan; Mortazavi, Bobak; Sarrafzadeh, Majid; Bui, Alex A T

    2012-09-01

Remote and wearable medical sensing has the potential to create very large and high dimensional datasets. Medical time series databases must be able to efficiently store, index, and mine these datasets to enable medical professionals to effectively analyze data collected from their patients. Conventional high dimensional indexing methods are a two stage process. First, a superset of the true matches is efficiently extracted from the database. Second, supersets are pruned by comparing each of their objects to the query object and rejecting any objects falling outside a predetermined radius. This pruning stage heavily dominates the computational complexity of most conventional search algorithms. Therefore, indexing algorithms can be significantly improved by reducing the amount of pruning. This paper presents an online algorithm to aggregate biomedical time series data to significantly reduce the search space (index size) without compromising the quality of search results. The algorithm is built on the observation that biomedical time series signals are composed of cyclical and often similar patterns. It takes in a stream of segments and groups them into highly concentrated collections. Locality Sensitive Hashing (LSH) is used to reduce the overall complexity of the algorithm, allowing it to run online. The output of this aggregation is used to populate an index. The proposed algorithm yields logarithmic growth of the index (with respect to the total number of objects) while keeping sensitivity and specificity simultaneously above 98%. Both the memory and runtime complexities of time series search are improved when using aggregated indexes. In addition, data mining tasks, such as clustering, exhibit runtimes that are orders of magnitude faster when run on aggregated indexes.
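
    The core trick, hashing similar segments into the same bucket so that only one representative per bucket enters the index, can be sketched with random-projection LSH; the bucket width, hash count, class design, and all names are our assumptions, not the paper's implementation:

    ```python
    import numpy as np
    from collections import defaultdict

    class SegmentAggregator:
        """Group similar fixed-length segments via random-projection LSH."""
        def __init__(self, seg_len, n_hashes=8, width=4.0, seed=0):
            rng = np.random.default_rng(seed)
            self.planes = rng.standard_normal((n_hashes, seg_len))
            self.offsets = rng.uniform(0, width, n_hashes)
            self.width = width
            self.buckets = defaultdict(list)

        def key(self, seg):
            h = (self.planes @ seg + self.offsets) / self.width
            return tuple(np.floor(h).astype(int))

        def add(self, seg):
            self.buckets[self.key(seg)].append(seg)

        def index(self):
            """One centroid per bucket: the aggregated, much smaller index."""
            return [np.mean(b, axis=0) for b in self.buckets.values()]

    # Toy stream: noisy copies of two heartbeat-like templates.
    rng = np.random.default_rng(11)
    base = [np.sin(np.linspace(0, 2 * np.pi, 50)),
            np.sign(np.sin(np.linspace(0, 4 * np.pi, 50)))]
    agg = SegmentAggregator(seg_len=50)
    for _ in range(500):
        agg.add(base[rng.integers(2)] + 0.05 * rng.standard_normal(50))
    print(len(agg.index()))   # far fewer index entries than 500 raw segments
    ```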

  3. Analysis of Polyphonic Musical Time Series

    NASA Astrophysics Data System (ADS)

    Sommer, Katrin; Weihs, Claus

    A general model for pitch tracking of polyphonic musical time series will be introduced. Based on a model of Davy and Godsill (Bayesian harmonic models for musical pitch estimation and analysis, Technical Report 431, Cambridge University Engineering Department, 2002), the different pitches of the musical sound are estimated simultaneously with MCMC methods. Additionally, a preprocessing step is designed to improve the estimation of the fundamental frequencies (A comparative study on polyphonic musical time series using MCMC methods. In C. Preisach et al., editors, Data Analysis, Machine Learning, and Applications, Springer, Berlin, 2008). The preprocessing step compares real audio data with an alphabet constructed from the McGill Master Samples (Opolko and Wapnick, McGill University Master Samples [Compact disc], McGill University, Montreal, 1987), which consists of tones of different instruments. The tones with minimal Itakura-Saito distortion (Gray et al., Transactions on Acoustics, Speech, and Signal Processing ASSP-28(4):367-376, 1980) are chosen as first estimates and as starting points for the MCMC algorithms. Furthermore, the implementation of the alphabet is an approach to the recognition of the instruments generating the musical time series. Results are presented for mixed monophonic data from McGill and for self-recorded polyphonic audio data.
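
    For reference, the Itakura-Saito distortion used to pick the first pitch estimates has a simple closed form over discrete power spectra; a sketch (variable names ours):

      import numpy as np

      def itakura_saito(p, q, eps=1e-12):
          """Itakura-Saito distortion between two power spectra p and q:
          mean of p/q - log(p/q) - 1 over spectral bins."""
          p = np.asarray(p, dtype=float) + eps
          q = np.asarray(q, dtype=float) + eps
          r = p / q
          return float(np.mean(r - np.log(r) - 1.0))

    The alphabet tone whose spectrum minimizes this distortion against the observed frame serves as the starting point for the MCMC step.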

  4. Accurate and Timely Forecasting of CME-Driven Geomagnetic Storms

    NASA Astrophysics Data System (ADS)

    Chen, J.; Kunkel, V.; Skov, T. M.

    2015-12-01

    Wide-spread and severe geomagnetic storms are primarily caused by the ejecta of coronal mass ejections (CMEs) that impose long durations of strong southward interplanetary magnetic field (IMF) on the magnetosphere, the duration and magnitude of the southward IMF (Bs) being the main determinants of geoeffectiveness. Another important quantity to forecast is the arrival time of the expected geoeffective CME ejecta. In order to accurately forecast these quantities in a timely manner (say, 24-48 hours of advance warning time), it is necessary to calculate the evolving CME ejecta---its structure and magnetic field vector in three dimensions---using remote sensing solar data alone. We discuss a method based on the validated erupting flux rope (EFR) model of CME dynamics. It has been shown using STEREO data that the model can calculate the correct size, magnetic field, and the plasma parameters of a CME ejecta detected at 1 AU, using the observed CME position-time data alone as input (Kunkel and Chen 2010). One disparity is in the arrival time, which is attributed to the simplified geometry of the circular toroidal axis of the CME flux rope. Accordingly, the model has been extended to self-consistently include the transverse expansion of the flux rope (Kunkel 2012; Kunkel and Chen 2015). We show that the extended formulation provides a better prediction of arrival time even if the CME apex does not propagate directly toward the earth. We apply the new method to a number of CME events and compare predicted flux ropes at 1 AU to the observed ejecta structures inferred from in situ magnetic and plasma data. The EFR model also predicts the asymptotic ambient solar wind speed (Vsw) for each event, which has not been validated yet. The predicted Vsw values are tested using the ENLIL model. We discuss the minimum and sufficient required input data for an operational forecasting system for predicting the drivers of large geomagnetic storms. Kunkel, V., and Chen, J., ApJ Lett, 715, L80, 2010. Kunkel, V., Ph

  5. Time Series Prediction of Hurricane Landfall.

    DTIC Science & Technology

    1986-05-01

    ...parameters to change as the storm moves to a new region of the ocean. For test cases, operational average 72 hour prediction error is at least three... comparatively accurate for forecast times of 24 hours or less. The SANBAR model (Sanders and Burpee, 1968) has been in use at NHC since 1970. It is a

  6. Nonlinear modeling of chaotic time series: Theory and applications

    NASA Astrophysics Data System (ADS)

    Casdagli, M.; Eubank, S.; Farmer, J. D.; Gibson, J.; Desjardins, D.; Hunter, N.; Theiler, J.

    We review recent developments in the modeling and prediction of nonlinear time series. In some cases, apparent randomness in time series may be due to chaotic behavior of a nonlinear but deterministic system. In such cases, it is possible to exploit the determinism to make short term forecasts that are much more accurate than one could make from a linear stochastic model. This is done by first reconstructing a state space, and then using nonlinear function approximation methods to create a dynamical model. Nonlinear models are valuable not only as short term forecasters, but also as diagnostic tools for identifying and quantifying low-dimensional chaotic behavior. During the past few years, methods for nonlinear modeling have developed rapidly, and have already led to several applications where nonlinear models motivated by chaotic dynamics provide superior predictions to linear models. These applications include prediction of fluid flows, sunspots, mechanical vibrations, ice ages, measles epidemics, and human speech.
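
    The two-step recipe in this abstract, reconstruct a state space and then approximate the dynamics, can be sketched with a delay embedding and a nearest-neighbor predictor (embedding dimension, delay, and neighbor count below are illustrative defaults, not values from the review):

      import numpy as np

      def delay_embed(x, dim, tau):
          """Reconstruct a state space from a scalar series by time-delay embedding."""
          n = len(x) - (dim - 1) * tau
          return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

      def nn_forecast(x, dim=3, tau=1, k=5):
          """One-step forecast: average the successors of the k reconstructed
          states nearest to the current state."""
          x = np.asarray(x, dtype=float)
          Z = delay_embed(x, dim, tau)
          current, history = Z[-1], Z[:-1]
          dist = np.linalg.norm(history - current, axis=1)
          nearest = np.argsort(dist)[:k]
          # the successor of state j is x[j + (dim - 1) * tau + 1]
          return float(np.mean(x[nearest + (dim - 1) * tau + 1]))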

  7. Nonlinear modeling of chaotic time series: Theory and applications

    SciTech Connect

    Casdagli, M.; Eubank, S.; Farmer, J. D.; Gibson, J. (Santa Fe Inst., NM); Des Jardins, D.; Hunter, N.; Theiler, J.

    1990-01-01

    We review recent developments in the modeling and prediction of nonlinear time series. In some cases apparent randomness in time series may be due to chaotic behavior of a nonlinear but deterministic system. In such cases it is possible to exploit the determinism to make short term forecasts that are much more accurate than one could make from a linear stochastic model. This is done by first reconstructing a state space, and then using nonlinear function approximation methods to create a dynamical model. Nonlinear models are valuable not only as short term forecasters, but also as diagnostic tools for identifying and quantifying low-dimensional chaotic behavior. During the past few years methods for nonlinear modeling have developed rapidly, and have already led to several applications where nonlinear models motivated by chaotic dynamics provide superior predictions to linear models. These applications include prediction of fluid flows, sunspots, mechanical vibrations, ice ages, measles epidemics and human speech. 162 refs., 13 figs.

  8. Fractal fluctuations in cardiac time series

    NASA Technical Reports Server (NTRS)

    West, B. J.; Zhang, R.; Sanders, A. W.; Miniyar, S.; Zuckerman, J. H.; Levine, B. D.; Blomqvist, C. G. (Principal Investigator)

    1999-01-01

    Human heart rate, controlled by complex feedback mechanisms, is a vital index of systemic circulation. However, it has been shown that beat-to-beat values of heart rate fluctuate continually over a wide range of time scales. Herein we use the relative dispersion, the ratio of the standard deviation to the mean, to show, by systematically aggregating the data, that the correlation in the beat-to-beat cardiac time series is a modulated inverse power law. This scaling property indicates the existence of long-time memory in the underlying cardiac control process and supports the conclusion that heart rate variability is a temporal fractal. We argue that the cardiac control system has allometric properties that enable it to respond to a dynamical environment through scaling.
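
    A sketch of the aggregation procedure as we read it: compute the relative dispersion of the series after averaging over blocks of doubling size; an inverse power law decaying more slowly than m^-1/2 indicates long-time memory.

      import numpy as np

      def relative_dispersion_curve(x, max_level=6):
          """Relative dispersion (std/mean) of successively aggregated data.
          Uncorrelated data decay like m**-0.5; fractal series decay more slowly."""
          x = np.asarray(x, dtype=float)
          out = []
          for level in range(max_level):
              m = 2 ** level                      # aggregation block size
              n = (len(x) // m) * m
              agg = x[:n].reshape(-1, m).mean(axis=1)
              out.append((m, agg.std(ddof=1) / agg.mean()))
          return out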

  9. Time series modeling for automatic target recognition

    NASA Astrophysics Data System (ADS)

    Sokolnikov, Andre

    2012-05-01

    Time series modeling is proposed for identification of targets whose images are not clearly seen. The model building takes into account air turbulence, precipitation, fog, smoke, and other factors obscuring and distorting the image. The complex of library data (of images, etc.) serving as a basis for identification provides the deterministic part of the identification process, while the partial image features, distorted parts, irrelevant pieces, and absence of particular features comprise the stochastic part of the target identification. A missing-data approach is elaborated that aids the prediction process for image creation or reconstruction. The results are provided.

  10. Time series analyses of global change data.

    PubMed

    Lane, L J; Nichols, M H; Osborn, H B

    1994-01-01

    The hypothesis that statistical analyses of historical time series data can be used to separate the influences of natural variations from anthropogenic sources on global climate change is tested. Point, regional, national, and global temperature data are analyzed. Trend analyses for the period 1901-1987 suggest mean annual temperatures increased (in degrees C per century) globally at the rate of about 0.5, in the USA at about 0.3, in the south-western USA desert region at about 1.2, and at the Walnut Gulch Experimental Watershed in south-eastern Arizona at about 0.8. However, the rates of temperature change are not constant but vary within the 87-year period. Serial correlation and spectral density analysis of the temperature time series showed weak periodicities at various frequencies. The only common periodicity among the temperature series is an apparent cycle of about 43 years. The temperature time series were correlated with the Wolf sunspot index, atmospheric CO(2) concentrations interpolated from the Siple ice core data, and atmospheric CO(2) concentration data from Mauna Loa measurements. Correlation analysis of temperature data with concurrent data on atmospheric CO(2) concentrations and the Wolf sunspot index supports previously reported significant correlation over the 1901-1987 period. Correlation analysis between temperature, atmospheric CO(2) concentration, and the Wolf sunspot index for the shorter period, 1958-1987, when continuous Mauna Loa CO(2) data are available, suggests significant correlation between global warming and atmospheric CO(2) concentrations but no significant correlation between global warming and the Wolf sunspot index. This may be because the Wolf sunspot index apparently increased from 1901 until about 1960 and then decreased thereafter, while global warming apparently continued to increase through 1987. Correlation of sunspot activity with global warming may be spurious, but additional analyses are required to test this hypothesis.

  11. Gwilym Jenkins, Experimental Design and Time Series.

    DTIC Science & Technology

    1984-04-01

    of a changing process. This led to studies of discrete dynamic models and control problems and finally to work on time series and forecasting. ...practice based on sound theory in a never-ending iteration. The results of this mode of thinking come through strongly, for example, in his book with...arrival in Princeton marked the beginning of a long and happy collaboration between us which later resulted in much visiting to and from between England and

  12. Consistency of IVS nutation time series

    NASA Astrophysics Data System (ADS)

    Gattano, César; Lambert, Sébastien; Bizouard, Christian

    2016-04-01

    We give a review of the various VLBI-derived nutation time series provided by the different operational analysis centers of the IVS and three combination centers (IVS, IERS EOP Center, and Rapid Service/Prediction Center). We focus on the stability of small nutation amplitudes, including the free core nutation and other atmospherically-driven nutations, that are of interest for improving Earth models. We discuss the possible origins of the differences (software packages, inversion methods, analysis configuration including a priori information and estimation strategy) and the consequences for scientific exploitation of the data, especially in terms of nutation modeling and inference of the Earth's internal structure.

  13. Modeling stylized facts for financial time series

    NASA Astrophysics Data System (ADS)

    Krivoruchenko, M. I.; Alessio, E.; Frappietro, V.; Streckert, L. J.

    2004-12-01

    Multivariate probability density functions of returns are constructed in order to model the empirical behavior of returns in a financial time series. They describe the well-established deviations from the Gaussian random walk, such as an approximate scaling and heavy tails of the return distributions, long-ranged volatility-volatility correlations (volatility clustering) and return-volatility correlations (leverage effect). The model is tested successfully to fit joint distributions of the 100+ years of daily price returns of the Dow Jones 30 Industrial Average.

  14. Time series analysis of temporal networks

    NASA Astrophysics Data System (ADS)

    Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh

    2016-01-01

    A common but important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application-oriented studies only knowing these properties is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without the knowledge of the full network structure; rather, an estimate of some of the properties is sufficient to launch the attack. In this paper we show that even if the network structure at a future time point is not available, one can still manage to estimate its properties. We propose a novel method to map a temporal network to a set of time series instances, analyze them, and, using a standard forecast model of time series, try to predict the properties of a temporal network at a later time instance. To this aim, we consider eight properties such as number of active nodes, average degree, clustering coefficient, etc., and apply our prediction framework on them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as Auto-Regressive-Integrated-Moving-Average (ARIMA). We use cross validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency domain properties of the time series obtained from spectrogram analysis could be used to refine the prediction framework by identifying beforehand the cases where the error in prediction is likely to be high. This leads to an improvement of 7.96% (for error level ≤20%) in prediction accuracy on an average across all datasets. As an application we show how such a prediction scheme can be used to launch targeted attacks on temporal networks. Contribution to the Topical Issue
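
    A minimal sketch of the forecasting step, assuming statsmodels and an illustrative ARIMA order; the paper's property extraction and spectrogram refinement are not reproduced here.

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      def forecast_property(series, order=(2, 1, 2), steps=5):
          """Fit an ARIMA model to one network-property time series (e.g.,
          the number of active nodes per snapshot) and forecast ahead."""
          result = ARIMA(np.asarray(series, dtype=float), order=order).fit()
          return result.forecast(steps=steps)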

  15. Fast Algorithms for Mining Co-evolving Time Series

    DTIC Science & Technology

    2011-09-01

    resolution methods: Fourier and Wavelets... Time series forecasting...categorical data. Our work is based on two key properties in those co-evolving time series, dynamics and correlation. Dynamics captures the temporal...applications. A survey on time series methods: there is a lot of work on time series analysis, on indexing, dimensionality reduction, forecasting

  16. Singular spectrum analysis for time series with missing data

    USGS Publications Warehouse

    Schoellhamer, D.H.

    2001-01-01

    Geophysical time series often contain missing data, which prevents analysis with many signal processing and multivariate tools. A modification of singular spectrum analysis for time series with missing data is developed and successfully tested with synthetic and actual incomplete time series of suspended-sediment concentration from San Francisco Bay. This method also can be used to low pass filter incomplete time series.
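
    For orientation, basic SSA on a complete series looks as follows (our sketch; the record's contribution is the modification that tolerates missing data, which is not shown here):

      import numpy as np

      def ssa_reconstruct(x, window, n_components):
          """Basic singular spectrum analysis: embed the series in a
          trajectory matrix, decompose by SVD, and reconstruct from the
          leading components by diagonal averaging."""
          x = np.asarray(x, dtype=float)
          K = len(x) - window + 1
          X = np.column_stack([x[i:i + window] for i in range(K)])
          U, s, Vt = np.linalg.svd(X, full_matrices=False)
          Xr = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
          # diagonal averaging (Hankelization) back to a 1-D series
          rec = np.zeros(len(x))
          counts = np.zeros(len(x))
          for j in range(K):
              rec[j:j + window] += Xr[:, j]
              counts[j:j + window] += 1
          return rec / counts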

  17. Forecasting the Time Series of Sunspot Numbers

    NASA Astrophysics Data System (ADS)

    Aguirre, L. A.; Letellier, C.; Maquet, J.

    2008-05-01

    Forecasting the solar cycle is of great importance for weather prediction and environmental monitoring, and also constitutes a difficult scientific benchmark in nonlinear dynamical modeling. This paper describes the identification of a model and its use in forecasting the time series of Wolf's sunspot numbers. A key feature of this procedure is that the original time series is first transformed into a symmetrical space where the dynamics of the solar dynamo are unfolded in a better way, thus improving the model. The nonlinear model obtained is parsimonious and has both deterministic and stochastic parts. Monte Carlo simulation of the whole model produces results very consistent with the deterministic part of the model but allows for the determination of confidence bands. The obtained model was used to predict cycles 24 and 25, although the forecast of the latter is seen as a crude approximation, given the long prediction horizon required. As for the 24th cycle, two estimates were obtained, with peaks of 65±16 and of 87±13 units of sunspot numbers. The simulated results suggest that the 24th cycle will be shorter and less active than the preceding one.

  18. Tremor classification and tremor time series analysis

    NASA Astrophysics Data System (ADS)

    Deuschl, Günther; Lauk, Michael; Timmer, Jens

    1995-03-01

    The separation between physiologic tremor (PT) in normal subjects and the pathological tremors of essential tremor (ET) or Parkinson's disease (PD) was investigated on the basis of monoaxial accelerometric recordings of 35 s hand tremor epochs. Frequency and amplitude were insufficient to separate between these conditions, except for the trivial distinction between normal and pathologic tremors that is already defined on the basis of amplitude. We found that waveform analysis revealed highly significant differences between normal and pathologic tremors and, more importantly, among different forms of pathologic tremors. In our group of 25 patients with PT and 15 with ET, we found a reasonable distinction with the third moment and the time-reversal invariance, and a nearly complete distinction between these two conditions on the basis of the asymmetric decay of the autocorrelation function. We conclude that time series analysis can probably be developed into a powerful tool for the objective analysis of tremors.
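
    One of the discriminating waveform statistics mentioned, time-reversal asymmetry, can be sketched as a normalized third moment of increments (the lag and normalization below are our illustrative choices):

      import numpy as np

      def time_reversal_asymmetry(x, lag=1):
          """Normalized third moment of increments; near zero for
          time-reversible (e.g., linear Gaussian) signals, systematically
          nonzero for many pathological tremor waveforms."""
          x = np.asarray(x, dtype=float)
          d = x[lag:] - x[:-lag]
          return float(np.mean(d ** 3) / np.mean(d ** 2) ** 1.5)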

  19. Managing distribution changes in time series prediction

    NASA Astrophysics Data System (ADS)

    Matias, J. M.; Gonzalez-Manteiga, W.; Taboada, J.; Ordonez, C.

    2006-07-01

    When a problem is modeled statistically, a single distribution model is usually postulated that is assumed to be valid for the entire space. Nonetheless, this practice may be somewhat unrealistic in certain application areas, in which the conditions of the process that generates the data may change; as far as we are aware, however, no techniques have been developed to tackle this problem. This article proposes a technique for modeling and predicting this change in time series with a view to improving estimates and predictions. The technique is applied, among other models, to the recently proposed hypernormal distribution. When tested on real data from a range of stock market indices, the technique produces better results than when a single distribution model is assumed to be valid for the entire period of time studied. Moreover, when a global model is postulated, it is highly recommended to select the hypernormal distribution parameter in the same likelihood maximization process.

  20. Time series for blind biosignal classification model.

    PubMed

    Wong, Derek F; Chao, Lidia S; Zeng, Xiaodong; Vai, Mang-I; Lam, Heng-Leong

    2014-11-01

    Biosignals such as electrocardiograms (ECG), electroencephalograms (EEG), and electromyograms (EMG), are important noninvasive measurements useful for making diagnostic decisions. Recently, considerable research has been conducted in order to potentially automate signal classification for assisting in disease diagnosis. However, the biosignal type (ECG, EEG, EMG or other) needs to be known prior to the classification process. If the given biosignal is of an unknown type, none of the existing methodologies can be utilized. In this paper, a blind biosignal classification model (B(2)SC Model) is proposed in order to identify the source biosignal type automatically, and thus ultimately benefit the diagnostic decision. The approach employs time series algorithms for constructing the model. It uses a dynamic time warping (DTW) algorithm with clustering to discover the similarity between two biosignals, and consequently classifies disease without prior knowledge of the source signal type. The empirical experiments presented in this paper demonstrate the effectiveness of the method as well as the scalability of the approach.
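
    The DTW similarity at the heart of the model admits a compact textbook implementation (a sketch, not the authors' code; production use would normally add a warping-window constraint):

      import numpy as np

      def dtw_distance(a, b):
          """Classic O(len(a)*len(b)) dynamic time warping distance
          between two 1-D sequences."""
          a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
          n, m = len(a), len(b)
          D = np.full((n + 1, m + 1), np.inf)
          D[0, 0] = 0.0
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  cost = abs(a[i - 1] - b[j - 1])
                  D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
          return float(D[n, m])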

  1. Watershed mean residence times and travel time distributions: how accurately can they be characterized?

    NASA Astrophysics Data System (ADS)

    Godsey, S. E.; Kirchner, J. W.

    2006-12-01

    The average time that rainfall takes to reach the stream - the mean residence time - is a basic parameter used to characterize watersheds. Watersheds are also characterized by the distribution of travel times for individual parcels of precipitation that fall on different points across the catchment. This travel time distribution is an important control on catchment response to contamination events. Catchments with shorter residence times or narrower distributions will have a flashier response to contamination events, whereas catchments with longer residence times or longer-tailed distributions will have a more persistent response to those same contamination events. Catchments' travel time distributions are typically inferred from time series of passive tracers (such as water isotopes, chloride, or bromide) in rainfall and streamflow. Tracer fluctuations in streamflow are typically damped compared to those in precipitation, because precipitation inputs of different ages (and different tracer signatures) are mixed within the catchment. Mathematically, this mixing process is modeled by the convolution of the travel time distribution and the precipitation tracer inputs to generate the stream tracer outputs. The parameters describing the travel time distribution are typically estimated by maximizing the goodness of fit between the modeled and measured tracer outputs. This approach is potentially subject to at least two sources of uncertainty. First, both the input and output tracer concentrations are subject to measurement error. Second, although the catchment mixing process is continuous, the inputs and outputs are only sampled at discrete points in time. Here we test how these two sources of uncertainty may affect travel time distributions that are estimated from catchment monitoring data. We begin by generating synthetic tracer input time series, and convolve these with a specified travel-time distribution to generate a synthetic output time series. We then subsample
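
    The convolution step described here is easy to sketch; we assume a gamma-shaped travel time distribution with illustrative parameters (the abstract does not specify the distribution family):

      import numpy as np
      from scipy.stats import gamma

      def stream_tracer(rain_tracer, dt, shape=2.0, scale=30.0):
          """Model the stream tracer output as the convolution of the
          rainfall tracer input with a travel-time distribution (gamma
          shown as one common choice; shape/scale are illustrative)."""
          t = np.arange(0.0, 10.0 * shape * scale, dt)
          ttd = gamma.pdf(t, a=shape, scale=scale)
          ttd /= ttd.sum() * dt                    # normalize to unit area
          return np.convolve(rain_tracer, ttd)[:len(rain_tracer)] * dt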

  2. Automated time series forecasting for biosurveillance.

    PubMed

    Burkom, Howard S; Murphy, Sean Patrick; Shmueli, Galit

    2007-09-30

    For robust detection performance, traditional control chart monitoring for biosurveillance is based on input data free of trends, day-of-week effects, and other systematic behaviour. Time series forecasting methods may be used to remove this behaviour by subtracting forecasts from observations to form residuals for algorithmic input. We describe three forecast methods and compare their predictive accuracy on each of 16 authentic syndromic data streams. The methods are (1) a non-adaptive regression model using a long historical baseline, (2) an adaptive regression model with a shorter, sliding baseline, and (3) the Holt-Winters method for generalized exponential smoothing. Criteria for comparing the forecasts were the root-mean-square error, the median absolute per cent error (MedAPE), and the median absolute deviation. The median-based criteria showed best overall performance for the Holt-Winters method. The MedAPE measures over the 16 test series averaged 16.5, 11.6, and 9.7 for the non-adaptive regression, adaptive regression, and Holt-Winters methods, respectively. The non-adaptive regression forecasts were degraded by changes in the data behaviour in the fixed baseline period used to compute model coefficients. The mean-based criterion was less conclusive because of the effects of poor forecasts on a small number of calendar holidays. The Holt-Winters method was also most effective at removing serial autocorrelation, with most 1-day-lag autocorrelation coefficients below 0.15. The forecast methods were compared without tuning them to the behaviour of individual series. We achieved improved predictions with such tuning of the Holt-Winters method, but practical use of such improvements for routine surveillance will require reliable data classification methods.
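
    A hedged sketch of method (3), assuming statsmodels' exponential smoothing with weekly seasonality (parameters are illustrative); the residuals feed the control chart, and MedAPE is the paper's preferred comparison criterion.

      import numpy as np
      from statsmodels.tsa.holtwinters import ExponentialSmoothing

      def medape(actual, forecast):
          """Median absolute percent error."""
          actual = np.asarray(actual, dtype=float)
          forecast = np.asarray(forecast, dtype=float)
          return float(np.median(np.abs((actual - forecast) / actual)) * 100)

      def holt_winters_residuals(counts, season=7):
          """Fit Holt-Winters smoothing to a daily syndromic count series
          and return residuals for downstream control-chart input."""
          counts = np.asarray(counts, dtype=float)
          fit = ExponentialSmoothing(counts, trend="add", seasonal="add",
                                     seasonal_periods=season).fit()
          return counts - fit.fittedvalues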

  3. Correcting and combining time series forecasters.

    PubMed

    Firmino, Paulo Renato A; de Mattos Neto, Paulo S G; Ferreira, Tiago A E

    2014-02-01

    Combined forecasters have been in the vanguard of stochastic time series modeling. It has been usual to suppose that each single model generates a residual or prediction error resembling white noise. However, mostly because of disturbances not captured by each model, it is possible that this supposition is violated. The present paper introduces a two-step method for correcting and combining forecasting models. Firstly, the stochastic process underlying the bias of each predictive model is built according to a recursive ARIMA algorithm in order to achieve white noise behavior. At each iteration of the algorithm the best ARIMA adjustment is determined according to a given information criterion (e.g. Akaike). Then, in the light of the corrected predictions, a maximum likelihood combined estimator is considered. Applications involving single ARIMA and artificial neural network models for the Dow Jones Industrial Average Index, S&P500 Index, Google Stock Value, and Nasdaq Index series illustrate the usefulness of the proposed framework.

  4. Periodograms for multiband astronomical time series

    NASA Astrophysics Data System (ADS)

    Ivezic, Z.; VanderPlas, J. T.

    2016-05-01

    We summarize the multiband periodogram, a general extension of the well-known Lomb-Scargle approach for detecting periodic signals in time-domain data, developed by VanderPlas & Ivezic (2015). A Python implementation of this method is available on GitHub. The multiband periodogram significantly improves period finding for randomly sampled multiband light curves (e.g., Pan-STARRS, DES, and LSST), and can treat non-uniform sampling and heteroscedastic errors. The light curves in each band are modeled as arbitrary truncated Fourier series, with the period and phase shared across all bands. The key aspect is the use of Tikhonov regularization which drives most of the variability into the so-called base model common to all bands, while fits for individual bands describe residuals relative to the base model and typically require lower-order Fourier series. We use simulated light curves and randomly subsampled SDSS Stripe 82 data to demonstrate the superiority of this method compared to other methods from the literature, and find that this method will be able to efficiently determine the correct period in the majority of LSST's bright RR Lyrae stars with as little as six months of LSST data.

  5. Removing atmosphere loading effect from GPS time series

    NASA Astrophysics Data System (ADS)

    Tiampo, K. F.; Samadi Alinia, H.; Samsonov, S. V.; Gonzalez, P. J.

    2015-12-01

    The GPS time series of site position are contaminated by various sources of noise; in particular, the ionospheric and tropospheric path delays are significant [Gray et al., 2000; Meyer et al., 2006]. The GPS path delay in the ionosphere is largely dependent on the wave frequency, whereas the delay in the troposphere is dependent on the length of the travel path and therefore site elevation. The various approaches available for compensating the ionospheric path delay cannot be used for removal of the tropospheric component. Quantifying the tropospheric delay plays an important role in determining the precision of the vertical GPS component, as tropospheric parameters over a large distance have very little correlation with each other. Several methods have been proposed for tropospheric signal elimination from GPS vertical time series. Here we utilize surface temperature fluctuations and seasonal variations in water vapour and air pressure data for various spatial and temporal profiles in order to more accurately remove the atmospheric path delay [Samsonov et al., 2014]. In this paper, we model the atmospheric path delay of vertical position time series by analyzing the signal in the frequency domain and study its dependency on topography in eastern Ontario for the time period from January 2008 to December 2012. The systematic dependency of the amplitude of the atmospheric path delay on height, and its temporal variations, are characterized through the development of a new, physics-based model relating tropospheric/atmospheric effects to topography, which can help in determining the most accurate GPS position.

  6. Normalizing the causality between time series

    NASA Astrophysics Data System (ADS)

    Liang, X. San

    2015-08-01

    Recently, a rigorous yet concise formula was derived to evaluate information flow, and hence the causality in a quantitative sense, between time series. To assess the importance of a resulting causality, it needs to be normalized. The normalization is achieved through distinguishing a Lyapunov exponent-like, one-dimensional phase-space stretching rate and a noise-to-signal ratio from the rate of information flow in the balance of the marginal entropy evolution of the flow recipient. It is verified with autoregressive models and applied to a real financial analysis problem. An unusually strong one-way causality is identified from IBM (International Business Machines Corporation) to GE (General Electric Company) in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a giant for the mainframe computer market.
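
    Our reading of the linear estimator behind this record, offered as a hedged sketch rather than the paper's code: for two series, the rate of information flow from series 2 to series 1 can be estimated from sample covariances and covariances with the finite-difference derivative (cf. Liang, Phys. Rev. E 90, 052150, 2014); the normalization discussed in the abstract is omitted here.

      import numpy as np

      def liang_information_flow(x1, x2, dt=1.0):
          """Estimate the information flow rate from series x2 to x1
          under a linear-system assumption (our reading of the formula)."""
          x1, x2 = np.asarray(x1, dtype=float), np.asarray(x2, dtype=float)
          dx1 = (x1[1:] - x1[:-1]) / dt        # Euler-forward derivative of x1
          x1, x2 = x1[:-1], x2[:-1]
          C = np.cov(np.vstack([x1, x2]))
          c11, c12, c22 = C[0, 0], C[0, 1], C[1, 1]
          c1d1 = np.cov(x1, dx1)[0, 1]
          c2d1 = np.cov(x2, dx1)[0, 1]
          return (c11 * c12 * c2d1 - c12 ** 2 * c1d1) / \
                 (c11 ** 2 * c22 - c11 * c12 ** 2)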

  7. Using entropy to cut complex time series

    NASA Astrophysics Data System (ADS)

    Mertens, David; Poncela Casasnovas, Julia; Spring, Bonnie; Amaral, L. A. N.

    2013-03-01

    Using techniques from statistical physics, physicists have modeled and analyzed human phenomena varying from academic citation rates to disease spreading to vehicular traffic jams. The last decade's explosion of digital information and the growing ubiquity of smartphones have led to a wealth of human self-reported data. This wealth of data comes at a cost, including non-uniform sampling and statistically significant but physically insignificant correlations. In this talk I present our work using entropy to identify stationary sub-sequences of self-reported human weight from a weight management web site. Our entropic approach, inspired by the infomap network community detection algorithm, is far less biased by rare fluctuations than more traditional time series segmentation techniques. Supported by the Howard Hughes Medical Institute.

  8. Scaling laws from geomagnetic time series

    USGS Publications Warehouse

    Voros, Z.; Kovacs, P.; Juhasz, A.; Kormendi, A.; Green, A.W.

    1998-01-01

    The notion of extended self-similarity (ESS) is applied here to the X-component time series of geomagnetic field fluctuations. Plotting nth order structure functions against the fourth order structure function, we show that low-frequency geomagnetic fluctuations up to the order n = 10 follow the same scaling laws as MHD fluctuations in the solar wind; however, for higher frequencies (f > 1/(5 h)) a clear departure from the expected universality is observed for n > 6. ESS does not allow us to make an unambiguous statement about the nontriviality of scaling laws in "geomagnetic" turbulence. However, we suggest using higher order moments as promising diagnostic tools for mapping the contributions of various remote magnetospheric sources to local observatory data. Copyright 1998 by the American Geophysical Union.

  9. Deconvolution of time series in the laboratory

    NASA Astrophysics Data System (ADS)

    John, Thomas; Pietschmann, Dirk; Becker, Volker; Wagner, Christian

    2016-10-01

    In this study, we present two practical applications of the deconvolution of time series in Fourier space. First, we reconstruct a filtered input signal of sound cards that has been heavily distorted by a built-in high-pass filter using a software approach. Using deconvolution, we can partially bypass the filter and extend the dynamic frequency range by two orders of magnitude. Second, we construct required input signals for a mechanical shaker in order to obtain arbitrary acceleration waveforms, referred to as feedforward control. For both situations, experimental and theoretical approaches are discussed to determine the system-dependent frequency response. Moreover, for the shaker, we propose a simple feedback loop as an extension to the feedforward control in order to handle nonlinearities of the system.
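
    The Fourier-space deconvolution described admits a short sketch; the water-level floor below is an assumption of ours (one standard stabilization), preventing blow-up where the response, e.g. the high-pass filter, is nearly zero.

      import numpy as np

      def deconvolve(measured, response, water_level=1e-3):
          """Recover the input from a measured output and the system's
          impulse response by division in Fourier space; bins where the
          response is tiny are floored (keeping their phase) so the
          division stays stable."""
          n = len(measured)
          M = np.fft.rfft(measured, n)
          R = np.fft.rfft(response, n)
          floor = water_level * np.abs(R).max()
          small = np.abs(R) < floor
          R = np.where(small, floor * np.exp(1j * np.angle(R)), R)
          return np.fft.irfft(M / R, n)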

  10. Phase correlation of foreign exchange time series

    NASA Astrophysics Data System (ADS)

    Wu, Ming-Chya

    2007-03-01

    Correlation of foreign exchange rates in currency markets is investigated based on the empirical data of USD/DEM and USD/JPY exchange rates for the period from February 1, 1986 to December 31, 1996. The return of the exchange-rate time series is first decomposed into a number of intrinsic mode functions (IMFs) by the empirical mode decomposition method. The instantaneous phases of the resultant IMFs, calculated by the Hilbert transform, are then used to characterize the behaviors of pricing transmissions, and the correlation is probed by measuring the phase differences between two IMFs of the same order. From the distribution of phase differences, our results show explicitly that the correlations are stronger on the daily time scale than on longer time scales. The demonstration for the correlations in the periods 1986-1989 and 1990-1993 indicates that the two exchange rates were more correlated in the former period than in the latter. The result is consistent with the observations from the cross-correlation calculation.

  11. PERIODOGRAMS FOR MULTIBAND ASTRONOMICAL TIME SERIES

    SciTech Connect

    VanderPlas, Jacob T.; Ivezic, Željko

    2015-10-10

    This paper introduces the multiband periodogram, a general extension of the well-known Lomb–Scargle approach for detecting periodic signals in time-domain data. In addition to advantages of the Lomb–Scargle method such as treatment of non-uniform sampling and heteroscedastic errors, the multiband periodogram significantly improves period finding for randomly sampled multiband light curves (e.g., Pan-STARRS, DES, and LSST). The light curves in each band are modeled as arbitrary truncated Fourier series, with the period and phase shared across all bands. The key aspect is the use of Tikhonov regularization which drives most of the variability into the so-called base model common to all bands, while fits for individual bands describe residuals relative to the base model and typically require lower-order Fourier series. This decrease in the effective model complexity is the main reason for improved performance. After a pedagogical development of the formalism of least-squares spectral analysis, which motivates the essential features of the multiband model, we use simulated light curves and randomly subsampled SDSS Stripe 82 data to demonstrate the superiority of this method compared to other methods from the literature and find that this method will be able to efficiently determine the correct period in the majority of LSST’s bright RR Lyrae stars with as little as six months of LSST data, a vast improvement over the years of data reported to be required by previous studies. A Python implementation of this method, along with code to fully reproduce the results reported here, is available on GitHub.
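
    As a baseline for the method summarized here, a single-band Lomb-Scargle period search with astropy; the multiband extension, which shares period and phase across bands under Tikhonov regularization, lives in the authors' GitHub code, and the frequency grid below is illustrative.

      import numpy as np
      from astropy.timeseries import LombScargle

      def best_period(t, y, dy, min_p=0.1, max_p=1.0):
          """Return the period (in the units of t) maximizing the
          single-band Lomb-Scargle power over a dense frequency grid."""
          freq = np.linspace(1.0 / max_p, 1.0 / min_p, 20000)
          power = LombScargle(t, y, dy).power(freq)
          return 1.0 / freq[np.argmax(power)]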

  12. Detecting and characterising ramp events in wind power time series

    NASA Astrophysics Data System (ADS)

    Gallego, Cristóbal; Cuerva, Álvaro; Costa, Alexandre

    2014-12-01

    In order to implement accurate models for wind power ramp forecasting, ramps need to be previously characterised. This issue has been typically addressed by performing binary ramp/non-ramp classifications based on ad-hoc assessed thresholds. However, recent works question this approach. This paper presents the ramp function, an innovative wavelet-based tool which detects and characterises ramp events in wind power time series. The underlying idea is to assess a continuous index related to the ramp intensity at each time step, which is obtained by considering large power output gradients evaluated under different time scales (up to typical ramp durations). The ramp function overcomes some of the drawbacks shown by the aforementioned binary classification and permits forecasters to easily reveal specific features of the ramp behaviour observed at a wind farm. As an example, the daily profiles of the ramp-up and ramp-down intensities are obtained for the case of a wind farm located in Spain.

  13. Calibrating GPS With TWSTFT For Accurate Time Transfer

    DTIC Science & Technology

    2008-12-01

    and O. Koudelka, 2008, “Time transfer with nanosecond accuracy for the realization of International Atomic Time,” Metrologia, 45, 185-198. [4] Z...468-475. [7] Z. Jiang, 2008, “Towards a TWSTFT Network Time Transfer,” Metrologia, 45, S6-S11.

  14. Comparative Analysis on Time Series with Included Structural Break

    NASA Astrophysics Data System (ADS)

    Andreeski, Cvetko J.; Vasant, Pandian

    2009-08-01

    Time series analysis (ARIMA models) is a good approach for identification of time series. But if we have a structural break in the time series, we cannot create only one model of the time series. Furthermore, if we do not have enough data between two structural breaks, it is impossible to create valid time series models for identification of the time series. This paper explores the possibility of identification of the inflation process dynamics via a system-theoretic approach, by means of both Box-Jenkins ARIMA methodologies and artificial neural networks.

  15. Timing calibration and spectral cleaning of LOFAR time series data

    NASA Astrophysics Data System (ADS)

    Corstanje, A.; Buitink, S.; Enriquez, J. E.; Falcke, H.; Hörandel, J. R.; Krause, M.; Nelles, A.; Rachen, J. P.; Schellart, P.; Scholten, O.; ter Veen, S.; Thoudam, S.; Trinh, T. N. G.

    2016-05-01

    We describe a method for spectral cleaning and timing calibration of short time series data of the voltage in individual radio interferometer receivers. It makes use of phase differences in fast Fourier transform (FFT) spectra across antenna pairs. For strong, localized terrestrial sources these are stable over time, while being approximately uniform-random for a sum over many sources or for noise. Using only milliseconds-long datasets, the method finds the strongest interfering transmitters, a first-order solution for relative timing calibrations, and faulty data channels. No knowledge of gain response or quiescent noise levels of the receivers is required. With relatively small data volumes, this approach is suitable for use in an online system monitoring setup for interferometric arrays. We have applied the method to our cosmic-ray data collection, a collection of measurements of short pulses from extensive air showers, recorded by the LOFAR radio telescope. Per air shower, we have collected 2 ms of raw time series data for each receiver. The spectral cleaning has a calculated optimal sensitivity corresponding to a power signal-to-noise ratio of 0.08 (or -11 dB) in a spectral window of 25 kHz, for 2 ms of data in 48 antennas. This is well sufficient for our application. Timing calibration across individual antenna pairs has been performed at 0.4 ns precision; for calibration of signal clocks across stations of 48 antennas the precision is 0.1 ns. Monitoring differences in timing calibration per antenna pair over the course of the period 2011 to 2015 shows a precision of 0.08 ns, which is useful for monitoring and correcting drifts in signal path synchronizations. A cross-check method for timing calibration is presented, using a pulse transmitter carried by a drone flying over the array. Timing precision is similar, 0.3 ns, but is limited by transmitter position measurements, while requiring dedicated flights.

  16. Automated analysis of brachial ultrasound time series

    NASA Astrophysics Data System (ADS)

    Liang, Weidong; Browning, Roger L.; Lauer, Ronald M.; Sonka, Milan

    1998-07-01

    Atherosclerosis begins in childhood with the accumulation of lipid in the intima of arteries to form fatty streaks, and advances through adult life, when occlusive vascular disease may result in coronary heart disease, stroke, and peripheral vascular disease. Non-invasive B-mode ultrasound has been found useful in studying risk factors in the symptom-free population. A large amount of data is acquired from continuous imaging of the vessels in a large study population. A high-quality brachial vessel diameter measurement method is necessary such that accurate diameters can be measured consistently in all frames in a sequence, across different observers. Though a human expert has an advantage over automated computer methods in recognizing noise during diameter measurement, manual measurement suffers from inter- and intra-observer variability. It is also time-consuming. An automated measurement method is presented in this paper which utilizes quality assurance approaches to adapt to specific image features and to recognize and minimize the noise effect. Experimental results showed the method's potential for clinical usage in epidemiological studies.

  17. Fast and Accurate Estimates of Divergence Times from Big Data.

    PubMed

    Mello, Beatriz; Tao, Qiqing; Tamura, Koichiro; Kumar, Sudhir

    2017-01-01

    Ongoing advances in sequencing technology have led to an explosive expansion in the molecular data available for building increasingly larger and more comprehensive timetrees. However, Bayesian relaxed-clock approaches frequently used to infer these timetrees impose a large computational burden and discourage critical assessment of the robustness of inferred times to model assumptions, influence of calibrations, and selection of optimal data subsets. We analyzed eight large, recently published, empirical datasets to compare time estimates produced by RelTime (a non-Bayesian method) with those reported by using Bayesian approaches. We find that RelTime estimates are very similar to those from Bayesian approaches, yet RelTime requires orders of magnitude less computational time. This means that the use of RelTime will enable greater rigor in molecular dating, because faster computational speeds encourage more extensive testing of the robustness of inferred timetrees to prior assumptions (models and calibrations) and data subsets. Thus, RelTime provides a reliable and computationally thrifty approach for dating the tree of life using large-scale molecular datasets.

  18. A New Method for Accurate Treatment of Flow Equations in Cylindrical Coordinates Using Series Expansions

    NASA Technical Reports Server (NTRS)

    Constantinescu, G.S.; Lele, S. K.

    2000-01-01

    using these schemes is especially sensitive to the type of equation treatment at the singularity axis. The objective of this work is to develop a generally applicable numerical method for treating the singularities present at the polar axis, which is particularly suitable for highly accurate finite-difference schemes (e.g., Padé schemes) on non-staggered grids. The main idea is to reinterpret the regularity conditions developed in the context of pseudo-spectral methods. A set of exact equations at the singularity axis is derived using the appropriate series expansions for the variables in the original set of equations. The present treatment of the equations preserves the same level of accuracy as for the interior scheme. We also want to point out the wider utility of the method, proposed here in the context of the compressible flow equations, as its extension to incompressible flows, or to any other set of equations solved on a non-staggered mesh in cylindrical coordinates with finite-difference schemes of various levels of accuracy, is straightforward. The robustness and accuracy of the proposed technique are assessed by comparing results from simulations of laminar forced jets and turbulent compressible jets using LES with similar calculations in which the equations are solved in Cartesian coordinates at the polar axis, or in which the singularity is removed by employing a staggered mesh in the radial direction without a mesh point at r = 0.

  19. Time-accurate Navier-Stokes calculations with multigrid acceleration

    NASA Technical Reports Server (NTRS)

    Melson, N. D.; Sanetrik, Mark D.; Atkins, Harold L.

    1993-01-01

    An efficient method for calculating unsteady flows is presented, with emphasis on a modified version of the thin-layer Navier-Stokes equations. Fourier stability analysis is used to illustrate the effect of treating the source term implicitly instead of explicitly, as well as to illustrate other algorithmic choices. A 2D circular cylinder (with a Reynolds number of 1200 and a Mach number of 0.3) is calculated. The present scheme requires only about 10 percent of the computer time required by global minimum time stepping.

  20. Transmission of linear regression patterns between time series: From relationship in time series to complex networks

    NASA Astrophysics Data System (ADS)

    Gao, Xiangyun; An, Haizhong; Fang, Wei; Huang, Xuan; Li, Huajiao; Zhong, Weiqiong; Ding, Yinghui

    2014-07-01

    The linear regression parameters between two time series can differ under different lengths of the observation period. If we study the whole period with a sliding window of a short period, the change of the linear regression parameters is a process of dynamic transmission over time. We tackle this fundamental problem by presenting a simple and efficient computational scheme: a linear regression patterns transmission algorithm, which transforms linear regression patterns into directed and weighted networks. The linear regression patterns (nodes) are defined by the combination of intervals of the linear regression parameters and the results of significance testing under different sizes of the sliding window. The transmissions between adjacent patterns are defined as edges, and the weights of the edges are the frequencies of the transmissions. The major patterns, the distance, and the medium in the process of the transmission can be captured. The statistical results of weighted out-degree and betweenness centrality are mapped on timelines, which shows the features of the distribution of the results. Many measurements in different areas that involve two related time series variables could take advantage of this algorithm to characterize the dynamic relationships between the time series from a new perspective.
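
    A hedged sketch of the pattern-network construction: each sliding window yields a pattern from a slope interval and a significance flag, and transitions between consecutive windows accumulate as weighted directed edges (the binning and window settings below are our illustrative choices):

      import numpy as np
      from collections import Counter
      from scipy import stats

      def pattern_network(x, y, window=50, step=1,
                          slope_bins=(-np.inf, 0.0, np.inf)):
          """Sliding-window regression of y on x; transitions between
          consecutive patterns become weighted directed edges."""
          edges = Counter()
          prev = None
          for start in range(0, len(x) - window + 1, step):
              res = stats.linregress(x[start:start + window],
                                     y[start:start + window])
              pattern = (int(np.digitize(res.slope, slope_bins)),
                         res.pvalue < 0.05)
              if prev is not None:
                  edges[(prev, pattern)] += 1   # weight = transition frequency
              prev = pattern
          return edges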

  1. Transmission of linear regression patterns between time series: from relationship in time series to complex networks.

    PubMed

    Gao, Xiangyun; An, Haizhong; Fang, Wei; Huang, Xuan; Li, Huajiao; Zhong, Weiqiong; Ding, Yinghui

    2014-07-01

    The linear regression parameters between two time series can differ under different lengths of the observation period. If we study the whole period with a sliding window of a short period, the change of the linear regression parameters is a process of dynamic transmission over time. We tackle this fundamental problem by presenting a simple and efficient computational scheme: a linear regression patterns transmission algorithm, which transforms linear regression patterns into directed and weighted networks. The linear regression patterns (nodes) are defined by the combination of intervals of the linear regression parameters and the results of significance testing under different sizes of the sliding window. The transmissions between adjacent patterns are defined as edges, and the weights of the edges are the frequencies of the transmissions. The major patterns, the distance, and the medium in the process of the transmission can be captured. The statistical results of weighted out-degree and betweenness centrality are mapped on timelines, which shows the features of the distribution of the results. Many measurements in different areas that involve two related time series variables could take advantage of this algorithm to characterize the dynamic relationships between the time series from a new perspective.

  2. Simple tunnel diode circuit for accurate zero crossing timing

    NASA Technical Reports Server (NTRS)

    Metz, A. J.

    1969-01-01

    Tunnel diode circuit, capable of timing the zero crossing point of bipolar pulses, provides effective design for a fast crossing detector. It combines a nonlinear load line with the diode to detect the zero crossing of a wide range of input waveshapes.

  3. Accurate and stable time stepping in ice sheet modeling

    NASA Astrophysics Data System (ADS)

    Cheng, Gong; Lötstedt, Per; von Sydow, Lina

    2017-01-01

    In this paper we introduce adaptive time step control for simulation of the evolution of ice sheets. The discretization error in the approximations is estimated using "Milne's device" by comparing the results from the two methods of a predictor-corrector pair. With such a pair, the expensive part of the procedure, the solution of the velocity and pressure equations, is performed only once per time step, and an estimate of the local error is easily obtained. The stability of the numerical solution is maintained and the accuracy is controlled by keeping the local error below a given threshold using PI-control. Depending on the threshold, the time step Δt is bounded by stability requirements or accuracy requirements. Our method takes a shorter Δt than an implicit method but with less work in each time step, and the solver is simpler. The method is analyzed theoretically with respect to stability and applied to the simulation of a 2D ice slab and a 3D circular ice sheet. The stability bounds in the experiments are explained by and agree well with the theoretical results.
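
    A minimal sketch of the control loop described, with a forward-Euler predictor and Heun corrector standing in for the paper's ice sheet solver; the predictor-corrector difference is the Milne's-device error estimate, and a PI law adapts Δt (the gains are typical textbook values, not the authors').

      import numpy as np

      def integrate_pi(f, y0, t_end, dt=1e-2, tol=1e-4, kp=0.075, ki=0.175):
          """Predictor-corrector stepping with Milne's device: the gap
          between an Euler predictor and a Heun (trapezoidal) corrector
          estimates the local error, and a PI law keeps it near tol.
          Rejected steps are simply retried with the reduced dt."""
          t, y, err_prev = 0.0, np.asarray(y0, dtype=float), tol
          while t < t_end:
              dt = min(dt, t_end - t)
              fy = f(t, y)
              y_pred = y + dt * fy                               # predictor
              y_corr = y + 0.5 * dt * (fy + f(t + dt, y_pred))   # corrector
              err = np.linalg.norm(y_corr - y_pred) + 1e-16      # Milne estimate
              if err <= tol:                                     # accept step
                  t, y, err_prev = t + dt, y_corr, err
              dt *= (tol / err) ** ki * (err_prev / err) ** kp   # PI update
          return y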

  4. Beyond multi-fractals: surrogate time series and fields

    NASA Astrophysics Data System (ADS)

    Venema, V.; Simmer, C.

    2007-12-01

    Most natural complex systems are characterised by variability on a large range of temporal and spatial scales. The two main methodologies to generate such structures are Fourier/FARIMA-based algorithms and multifractal methods. The former is restricted to Gaussian data, whereas the latter requires the structure to be self-similar. This work will present so-called surrogate data as an alternative that works with any (empirical) distribution and power spectrum. The best-known surrogate algorithm is the iterative amplitude adjusted Fourier transform (IAAFT) algorithm. We have studied six different geophysical time series (two clouds, runoff of a small and a large river, temperature and rain) and their surrogates. The power spectra and consequently the 2nd order structure functions were replicated accurately. Even the fourth order structure function was more accurately reproduced by the surrogates than would be possible with a fractal method, because the measured structure deviated too strongly from fractal scaling. Only in the case of the daily rain sums could a fractal method have been more accurate. Just like Fourier and multifractal methods, the current surrogates are not able to model the asymmetric increment distributions observed for runoff, i.e., they cannot reproduce nonlinear dynamical processes that are asymmetric in time. Furthermore, we have found differences in the structure functions on small scales. Surrogate methods are especially valuable for empirical studies, because the time series and fields that are generated are able to mimic measured variables accurately. Our main application is radiative transfer through structured clouds. Like many geophysical fields, clouds can only be sampled sparsely, e.g. with in-situ airborne instruments. However, for radiative transfer calculations we need full 3-dimensional cloud fields. A first study relating the measured properties of the cloud droplets and the radiative properties of the cloud field by generating surrogate cloud
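
    The IAAFT algorithm named here alternates two projections, onto the target power spectrum and onto the target amplitude distribution; a compact sketch (the iteration count is illustrative):

      import numpy as np

      def iaaft(x, n_iter=100, seed=0):
          """IAAFT surrogate: alternately impose the original power
          spectrum and the original amplitude distribution on a
          shuffled copy of the series."""
          rng = np.random.default_rng(seed)
          x = np.asarray(x, dtype=float)
          target_amp = np.abs(np.fft.rfft(x))    # spectrum to preserve
          sorted_x = np.sort(x)                  # distribution to preserve
          s = rng.permutation(x)
          for _ in range(n_iter):
              # step 1: impose the target spectrum, keeping current phases
              phases = np.angle(np.fft.rfft(s))
              s = np.fft.irfft(target_amp * np.exp(1j * phases), len(x))
              # step 2: impose the target distribution by rank ordering
              s = sorted_x[np.argsort(np.argsort(s))]
          return s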

  5. Time Series Analysis of Mother-Infant Interaction.

    ERIC Educational Resources Information Center

    Rosenfeld, Howard M.

    A method of studying attachment behavior in infants was devised using time series and time sequence analyses. Time series analysis refers to relationships between events coded over adjacent fixed-time units. Time sequence analysis refers to the distribution of exact times at which particular events happen. Using these techniques, multivariate…

  6. Quantifying evolutionary dynamics from variant-frequency time series

    PubMed Central

    Khatri, Bhavin S.

    2016-01-01

    From Kimura’s neutral theory of protein evolution to Hubbell’s neutral theory of biodiversity, quantifying the relative importance of neutrality versus selection has long been a basic question in evolutionary biology and ecology. With deep sequencing technologies, this question is taking on a new form: given a time series of the frequency of different variants in a population, what is the likelihood that the observation has arisen due to selection or neutrality? To tackle the 2-variant case, we exploit Fisher’s angular transformation, which, despite being discovered by Ronald Fisher a century ago, has remained an intellectual curiosity. We show that, together with a heuristic approach, it provides a simple solution for the transition probability density at short times, including drift, selection and mutation. Our results show that under strong selection and sufficiently frequent sampling these evolutionary parameters can be accurately determined from simulation data, and so they provide a theoretical basis for techniques to detect selection from variant or polymorphism frequency time series. PMID:27616332
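
    For concreteness, Fisher's angular transformation in one common convention (the factor of 2 varies between texts, so treat this as an assumption): under pure drift the transformed frequency has approximately variance-stabilized noise, which is what makes the short-time transition density tractable.

      import numpy as np

      def fisher_angular(p):
          """Angular transform of a variant frequency p in [0, 1]:
          theta = 2 * arcsin(sqrt(p)) (one common convention)."""
          return 2.0 * np.arcsin(np.sqrt(np.asarray(p, dtype=float)))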

  7. Quantifying evolutionary dynamics from variant-frequency time series

    NASA Astrophysics Data System (ADS)

    Khatri, Bhavin S.

    2016-09-01

    From Kimura’s neutral theory of protein evolution to Hubbell’s neutral theory of biodiversity, quantifying the relative importance of neutrality versus selection has long been a basic question in evolutionary biology and ecology. With deep sequencing technologies, this question is taking on a new form: given a time series of the frequency of different variants in a population, what is the likelihood that the observation has arisen due to selection or neutrality? To tackle the 2-variant case, we exploit Fisher’s angular transformation, which, despite being discovered by Ronald Fisher a century ago, has remained an intellectual curiosity. We show that, together with a heuristic approach, it provides a simple solution for the transition probability density at short times, including drift, selection and mutation. Our results show that under strong selection and sufficiently frequent sampling these evolutionary parameters can be accurately determined from simulation data, and so they provide a theoretical basis for techniques to detect selection from variant or polymorphism frequency time series.

  8. Carbon time series in the Norwegian sea

    NASA Astrophysics Data System (ADS)

    Gislefoss, Jorunn S.; Nydal, Reidar; Slagstad, Dag; Sonninen, Eloni; Holmén, Kim

    1998-02-01

    Depth profiles of carbon parameters were obtained monthly from 1991 to 1994 as the first time series from the weathership station M located in the Norwegian Sea at 66°N 2°E. CO₂ was extracted from acidified seawater by a flushing procedure, with nitrogen as the carrier gas. The pure CO₂ gas was measured using a manometric technique, and the gas was further used for ¹³C and ¹⁴C measurements. The precision of the dissolved inorganic carbon (DIC) was better than ±6‰. Satisfactory agreement was obtained with standard seawater from Scripps Institution of Oceanography. The partial pressure of CO₂ (pCO₂) was measured in the atmosphere and surface water, beginning in October 1991. The most visible seasonal variation in DIC, ¹³C and pCO₂ was due to the plankton bloom in the upper 50-100 m. Typical values for surface water in the winter were: 2.140±0.012 mmol kg⁻¹ for DIC, 1.00±0.04‰ for δ¹³C and 357±15 μatm for pCO₂, and the corresponding values in the summer were as low as 2.04 mmol kg⁻¹, greater than 2.1‰, and as low as 270-300 μatm. The values for deep water are more constant during the year, with DIC values of about 2.17±0.01 mmol kg⁻¹, and δ¹³C values between 0.97 and 1.14‰. A simple one-dimensional biological model was applied in order to investigate possible short-term variability in DIC caused by phytoplankton growth and depth variations of the wind-mixed layer. The simulated seasonal pattern was in reasonable agreement with the observed data, but there were significant temporal variations on shorter time intervals than the monthly measurements. As a supplement to the measurements at station M, some representative profiles of DIC, δ¹³C, Δ¹⁴C, salinity and temperature from other locations in the Nordic Seas and the North Atlantic Ocean are also presented. The results are also compared with some Δ¹⁴C data obtained by the TTO expedition in 1981 and the GEOSECS expedition in 1972. The carbon profiles reflect the stable deep

  9. Measuring persistence in time series of temperature anomalies

    NASA Astrophysics Data System (ADS)

    Triacca, Umberto; Pasini, Antonello; Attanasio, Alessandro

    2014-11-01

    Studies on persistence are important for the clarification of statistical properties of the analyzed time series and for understanding the dynamics of the systems which create these series. In climatology, the analysis of the autocorrelation function has been the main tool to investigate the persistence of a time series. In this paper, we propose to use a more sophisticated econometric instrument. Using this tool, we obtain an estimate of the persistence in global land and ocean and hemispheric temperature time series.

  10. Noise reduction by recycling dynamically coupled time series.

    PubMed

    Mera, M Eugenia; Morán, Manuel

    2011-12-01

    We say that several scalar time series are dynamically coupled if they record the values of measurements of the state variables of the same smooth dynamical system. We show that much of the information lost due to measurement noise in a target time series can be recovered with a noise reduction algorithm by crossing the time series with another time series with which it is dynamically coupled. The method is particularly useful for reduction of measurement noise in short length time series with high uncertainties.

  11. A Fully Implicit Time Accurate Method for Hypersonic Combustion: Application to Shock-induced Combustion Instability

    NASA Technical Reports Server (NTRS)

    Yungster, Shaye; Radhakrishnan, Krishnan

    1994-01-01

    A new fully implicit, time accurate algorithm suitable for chemically reacting, viscous flows in the transonic-to-hypersonic regime is described. The method is based on a class of Total Variation Diminishing (TVD) schemes and uses successive Gauss-Seidel relaxation sweeps. The inversion of large matrices is avoided by partitioning the system into reacting and nonreacting parts, while still maintaining a fully coupled interaction. As a result, the matrices that have to be inverted are of the same size as those obtained with the commonly used point implicit methods. In this paper we illustrate the applicability of the new algorithm to hypervelocity unsteady combustion applications. We present a series of numerical simulations of the periodic combustion instabilities observed in ballistic-range experiments of blunt projectiles flying at subdetonative speeds through hydrogen-air mixtures. The computed frequencies of oscillation are in excellent agreement with experimental data.

  12. Detecting inhomogeneities in pan evaporation time series

    NASA Astrophysics Data System (ADS)

    Kirono, D. G. C.

    2009-04-01

    There is a growing demand for evaporation data in studies of surface water and energy fluxes, especially studies which address the impacts of global warming. To serve this purpose, homogeneous evaporation data are necessary. This paper describes the use of two tests for detecting and adjusting discontinuities in Class A pan evaporation time series for 28 stations across Australia, and illustrates the benefit of using corrected records in climate studies. The two tests are the bivariate test of Maronna and Yohai (1978), also known as the Potter method (WMO 2003), and the RHTest of Wang and Feng (2004). Overall, 58 per cent of the inhomogeneities detected by the bivariate test were also identified by the RHTest. The fact that the other 42 per cent of inhomogeneities were not consistently detected is due to the different sensitivities of the two methods. Ninety-two per cent of the inhomogeneities detected by the bivariate test are consistent with documented changes that can be strongly associated with the discontinuity. Having identified inhomogeneities, adjustments were only applied to records containing inhomogeneities that could be verified as having a non-climatic origin. The benefit of using the original versus the adjusted pan evaporation records in a climate study was then investigated from two points of view: correlation analyses and trend analysis. As an illustration, the results show that the trend (1970-2004) in the all-stations average was -2.8±1.7 mm/year/year for the original data but only -0.7±1.6 mm/year/year for the adjusted data, demonstrating the importance of screening the data before their use in climate studies. References: Maronna, R. and Yohai, V.J. 1978. A bivariate test for the detection of a systematic change in mean. J. Amer. Statis. Assoc., 73, 640-645. Wang, X.L. and Feng, Y. 2004. RHTest user manual. Available from http://cccma.seos.uvic.ca/ETCCDMI/RHTestUserManual.doc. WMO. 2003. Guidelines on climate metadata and homogenization

  13. It's About Time: How Accurate Can Geochronology Become?

    NASA Astrophysics Data System (ADS)

    Harrison, M.; Baldwin, S.; Caffee, M. W.; Gehrels, G. E.; Schoene, B.; Shuster, D. L.; Singer, B. S.

    2015-12-01

    As isotope ratio precisions have improved to as low as ±1 ppm, geochronologic precision has remained essentially unchanged. This largely reflects the nature of radioactivity, whereby the parent decays into a different chemical species, thus putting as much emphasis on determining inter-element ratios as isotopic ones. Even the best current accuracy grows into errors of >0.6 m.y. during the Paleozoic - a span of time equal to ¼ of the Pleistocene. If we are to understand the nature of Paleozoic species variation and climate change at anything like the level achieved for the Cenozoic, we need a 10x improvement in accuracy. The good news is that there is no physical impediment to realizing this. There are enough Pb* atoms in the outer few μm of a Paleozoic zircon grown moments before eruption to permit ±0.01% accuracy in the U-Pb system. What we need are the resources to synthesize the spikes, enhance ionization yields, exploit microscale sampling, and improve knowledge of λ correspondingly. Despite advances in geochronology over the past 40 years (multicollection, multi-isotope spikes, in situ dating), our ability to translate a daughter atom into a detected ion has remained at the level of 1% or so. This means that a ~10² increase in signal can be achieved before we approach a physical limit. Perhaps the most promising approach is the use of broad spectrum lasers that can ionize all neutrals. Radical new approaches to providing mass separation of such signals are emerging, including trapped ion cyclotron resonance and multi-turn, sputtered neutral TOF spectrometers capable of mass resolutions in excess of 10⁵. These innovations hold great promise in geochronology but are largely being developed for cosmochemistry. This may make sense at first glance, as cosmochemists are classically atom-limited (IDPs, stardust), but can be a misperception, as the outer few μm of a zircon may represent no more mass than a stardust mote. To reach the fundamental limits of geochronologic signals we need to

  14. Comparison of statistical models for analyzing wheat yield time series.

    PubMed

    Michel, Lucie; Makowski, David

    2013-01-01

    The world's population is predicted to exceed nine billion by 2050 and there is increasing concern about the capability of agriculture to feed such a large population. Foresight studies on food security are frequently based on crop yield trends estimated from yield time series provided by national and regional statistical agencies. Various types of statistical models have been proposed for the analysis of yield time series, but the predictive performances of these models have not yet been evaluated in detail. In this study, we present eight statistical models for analyzing yield time series and compare their ability to predict wheat yield at the national and regional scales, using data provided by the Food and Agriculture Organization of the United Nations and by the French Ministry of Agriculture. The Holt-Winters and dynamic linear models performed equally well, giving the most accurate predictions of wheat yield. However, dynamic linear models have two advantages over Holt-Winters models: they can be used to reconstruct past yield trends retrospectively and to analyze uncertainty. The results obtained with dynamic linear models indicated a stagnation of wheat yields in many countries, but the estimated rate of increase of wheat yield remained above 0.06 t ha⁻¹ year⁻¹ in several countries in Europe, Asia, Africa and America, and the estimated values were highly uncertain for several major wheat producing countries. The rate of yield increase differed considerably between French regions, suggesting that efforts to identify the main causes of yield stagnation should focus on a subnational scale.
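
    The two best-performing model families named above can both be fitted with statsmodels; the sketch below uses a stand-in synthetic yield series and illustrative specifications, not the study's data or model orders.

```python
# Holt-Winters (Holt's linear trend) vs. a dynamic linear model
# (local linear trend) on a synthetic annual yield series.
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing
from statsmodels.tsa.statespace.structural import UnobservedComponents

rng = np.random.default_rng(0)
years = np.arange(1961, 2011)
yields = 2.0 + 0.06 * (years - years[0]) + 0.2 * rng.standard_normal(len(years))

hw = ExponentialSmoothing(yields, trend="add").fit()
print("Holt-Winters 5-year forecast:", hw.forecast(5))

# The DLM additionally yields smoothed (retrospective) trend estimates and
# built-in uncertainty, the two advantages cited above.
dlm = UnobservedComponents(yields, level="local linear trend").fit(disp=False)
print("DLM 5-year forecast:", dlm.forecast(5))
print("Smoothed trend slope in final year:", dlm.smoothed_state[1, -1])
```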

  15. Comparison of Statistical Models for Analyzing Wheat Yield Time Series

    PubMed Central

    Michel, Lucie; Makowski, David

    2013-01-01

    The world's population is predicted to exceed nine billion by 2050 and there is increasing concern about the capability of agriculture to feed such a large population. Foresight studies on food security are frequently based on crop yield trends estimated from yield time series provided by national and regional statistical agencies. Various types of statistical models have been proposed for the analysis of yield time series, but the predictive performances of these models have not yet been evaluated in detail. In this study, we present eight statistical models for analyzing yield time series and compare their ability to predict wheat yield at the national and regional scales, using data provided by the Food and Agriculture Organization of the United Nations and by the French Ministry of Agriculture. The Holt-Winters and dynamic linear models performed equally well, giving the most accurate predictions of wheat yield. However, dynamic linear models have two advantages over Holt-Winters models: they can be used to reconstruct past yield trends retrospectively and to analyze uncertainty. The results obtained with dynamic linear models indicated a stagnation of wheat yields in many countries, but the estimated rate of increase of wheat yield remained above 0.06 t ha−1 year−1 in several countries in Europe, Asia, Africa and America, and the estimated values were highly uncertain for several major wheat producing countries. The rate of yield increase differed considerably between French regions, suggesting that efforts to identify the main causes of yield stagnation should focus on a subnational scale. PMID:24205280

  16. Time series modelling and forecasting of emergency department overcrowding.

    PubMed

    Kadri, Farid; Harrou, Fouzi; Chaabane, Sondès; Tahon, Christian

    2014-09-01

    Efficient management of patient flow (demand) in emergency departments (EDs) has become an urgent issue for many hospital administrations. Today, more and more attention is being paid to hospital management systems to optimally manage patient flow and to improve management strategies, efficiency and safety in such establishments. To this end, EDs require significant human and material resources, but unfortunately these are limited. Within such a framework, the ability to accurately forecast demand in emergency departments has considerable implications for hospitals to improve resource allocation and strategic planning. The aim of this study was to develop models for forecasting daily attendances at the hospital emergency department in Lille, France. The study demonstrates how time-series analysis can be used to forecast, at least in the short term, demand for emergency services in a hospital emergency department. The forecasts were based on daily patient attendances at the paediatric emergency department in Lille regional hospital centre, France, from January 2012 to December 2012. An autoregressive integrated moving average (ARIMA) method was applied separately to each of the two GEMSA categories and total patient attendances. Time-series analysis was shown to provide a useful, readily available tool for forecasting emergency department demand.
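
    For orientation, a minimal statsmodels sketch of the kind of ARIMA fit described above; the synthetic attendance series, the (1, 0, 1) order and the weekly seasonal term are illustrative choices, not those identified in the study.

```python
# Fit a seasonal ARIMA to daily attendance counts and forecast one week.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
t = np.arange(365)
attendances = 120 + 15 * np.sin(2 * np.pi * t / 7) + 5 * rng.standard_normal(365)

model = ARIMA(attendances, order=(1, 0, 1), seasonal_order=(1, 1, 1, 7))
result = model.fit()
print(result.forecast(steps=7))  # one-week-ahead demand forecast
```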

  17. From time series to complex networks: the visibility graph.

    PubMed

    Lacasa, Lucas; Luque, Bartolo; Ballesteros, Fernando; Luque, Jordi; Nuño, Juan Carlos

    2008-04-01

    In this work we present a simple and fast computational method, the visibility algorithm, that converts a time series into a graph. The constructed graph inherits several properties of the series in its structure. Thereby, periodic series convert into regular graphs, and random series do so into random graphs. Moreover, fractal series convert into scale-free networks, enhancing the fact that power law degree distributions are related to fractality, something highly discussed recently. Some remarkable examples and analytical tools are outlined to test the method's reliability. Many different measures, recently developed in the complex network theory, could by means of this new approach characterize time series from a new point of view.
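
    A minimal sketch of the (natural) visibility criterion in Python; this is an O(n²) brute-force version with names chosen for clarity, not a reference implementation.

```python
# Natural visibility graph: nodes i < j are linked iff every intermediate
# point lies strictly below the straight line joining (i, x_i) and (j, x_j).
import numpy as np

def visibility_edges(x):
    x = np.asarray(x, dtype=float)
    n, edges = len(x), []
    for i in range(n):
        for j in range(i + 1, n):
            k = np.arange(i + 1, j)
            # heights of the connecting line at the intermediate times
            line = x[j] + (x[i] - x[j]) * (j - k) / (j - i)
            if np.all(x[k] < line):
                edges.append((i, j))
    return edges

# Periodic input should give a regular graph, random input a random graph.
edges = visibility_edges(np.sin(np.linspace(0, 20, 200)))
print(len(edges), "edges")
```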

  18. Efficient Algorithms for Segmentation of Item-Set Time Series

    NASA Astrophysics Data System (ADS)

    Chundi, Parvathi; Rosenkrantz, Daniel J.

    We propose a special type of time series, which we call an item-set time series, to facilitate the temporal analysis of software version histories, email logs, stock market data, etc. In an item-set time series, each observed data value is a set of discrete items. We formalize the concept of an item-set time series and present efficient algorithms for segmenting a given item-set time series. Segmentation of a time series partitions the time series into a sequence of segments where each segment is constructed by combining consecutive time points of the time series. Each segment is associated with an item set that is computed from the item sets of the time points in that segment, using a function which we call a measure function. We then define a concept called the segment difference, which measures the difference between the item set of a segment and the item sets of the time points in that segment. The segment difference values are required to construct an optimal segmentation of the time series. We describe novel and efficient algorithms to compute segment difference values for each of the measure functions described in the paper. We outline a dynamic programming based scheme to construct an optimal segmentation of the given item-set time series. We use the item-set time series segmentation techniques to analyze the temporal content of three different data sets—Enron email, stock market data, and a synthetic data set. The experimental results show that an optimal segmentation of item-set time series data captures much more temporal content than a segmentation constructed based on the number of time points in each segment, without examining the item set data at the time points, and can be used to analyze different types of temporal data.
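
    A small sketch of the dynamic-programming scheme outlined above; the measure function (segment union) and the segment difference (size of the symmetric difference) used here are illustrative stand-ins for the paper's definitions.

```python
# Optimal segmentation of an item-set time series into k contiguous segments.
from functools import lru_cache

def optimal_segmentation(item_sets, k):
    """Split item_sets (a list of sets) into k contiguous segments,
    minimizing the total segment difference."""
    n = len(item_sets)

    def seg_cost(i, j):  # cost of one segment covering points i..j-1
        measure = set().union(*item_sets[i:j])               # measure function
        return sum(len(measure ^ s) for s in item_sets[i:j]) # segment difference

    @lru_cache(maxsize=None)
    def best(j, m):  # optimal cost of splitting the first j points into m segments
        if m == 1:
            return seg_cost(0, j), [j]
        cost, split = min((best(i, m - 1)[0] + seg_cost(i, j), i)
                          for i in range(m - 1, j))
        return cost, best(split, m - 1)[1] + [j]

    return best(n, k)

cost, ends = optimal_segmentation([{"a"}, {"a", "b"}, {"c"}, {"c", "d"}, {"d"}], 2)
print(cost, ends)  # total cost and segment end indices
```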

  19. A Time Series Approach for Soil Moisture Estimation

    NASA Technical Reports Server (NTRS)

    Kim, Yunjin; vanZyl, Jakob

    2006-01-01

    Soil moisture is a key parameter in understanding the global water cycle and in predicting natural hazards. Polarimetric radar measurements have been used for estimating the soil moisture of bare surfaces. In order to estimate soil moisture accurately, the surface roughness effect must be compensated properly. In addition, these algorithms will not produce accurate results for vegetated surfaces. It is difficult to retrieve the soil moisture of a vegetated surface since the radar backscattering cross section is sensitive to the vegetation structure and environmental conditions such as the ground slope. Therefore, it is necessary to develop a method to estimate the effect of the surface roughness and vegetation reliably. One way to remove the roughness effect and the vegetation contamination is to take advantage of the temporal variation of soil moisture. In order to understand the global hydrologic cycle, it is desirable to measure soil moisture with a one- to two-day revisit. Using these frequent measurements, a time series approach can be implemented to improve the soil moisture retrieval accuracy.

  20. Time series power flow analysis for distribution connected PV generation.

    SciTech Connect

    Broderick, Robert Joseph; Quiroz, Jimmy Edward; Ellis, Abraham; Reno, Matthew J.; Smith, Jeff; Dugan, Roger

    2013-01-01

    Distributed photovoltaic (PV) projects must go through an interconnection study process before connecting to the distribution grid. These studies are intended to identify the likely impacts and mitigation alternatives. In the majority of the cases, system impacts can be ruled out or mitigation can be identified without an involved study, through a screening process or a simple supplemental review study. For some proposed projects, expensive and time-consuming interconnection studies are required. The challenges to performing the studies are twofold. First, every study scenario is potentially unique, as the studies are often highly specific to the amount of PV generation capacity that varies greatly from feeder to feeder and is often unevenly distributed along the same feeder. This can cause location-specific impacts and mitigations. The second challenge is the inherent variability in PV power output which can interact with feeder operation in complex ways, by affecting the operation of voltage regulation and protection devices. The typical simulation tools and methods in use today for distribution system planning are often not adequate to accurately assess these potential impacts. This report demonstrates how quasi-static time series (QSTS) simulation and high time-resolution data can be used to assess the potential impacts in a more comprehensive manner. The QSTS simulations are applied to a set of sample feeders with high PV deployment to illustrate the usefulness of the approach. The report describes methods that can help determine how PV affects distribution system operations. The simulation results are focused on enhancing the understanding of the underlying technical issues. The examples also highlight the steps needed to perform QSTS simulation and describe the data needed to drive the simulations. The goal of this report is to make the methodology of time series power flow analysis readily accessible to utilities and others responsible for evaluating

  1. Global near real-time disturbance monitoring using MODIS satellite image time series

    NASA Astrophysics Data System (ADS)

    Verbesselt, J.; Kalomenopoulos, M.; de Jong, R.; Zeileis, A.; Herold, M.

    2012-12-01

    Global disturbance monitoring in forested ecosystems is critical to retrieve information on carbon storage dynamics, biodiversity, and other socio-ecological processes. Satellite remote sensing provides a means for cost-effective monitoring at frequent time steps over large areas. However, obtaining information about current change processes requires analysing image time series in a fast and accurate manner and detecting abnormal change in near real time. An increasing number of change detection techniques have become available that are able to process historical satellite image time series data to detect changes in the past. However, methods that detect changes in near real time, i.e. analysing newly acquired data with respect to the historical series, are lacking. We propose a statistical technique for monitoring change in near real time by comparing current data with a seasonal-trend model fitted to the historical time series. As such, identification of consistent and abnormal change in near real time becomes possible as soon as new image data are captured. The method is based on the "Break For Additive Seasonal Trend" (BFAST) concept (http://bfast.r-forge.r-project.org/). Disturbances are detected by analysing 16-daily MODIS combined vegetation and temperature indices. Validation is carried out by comparing the detected disturbances with available disturbance data sets (e.g. deforestation in Brazil and MODIS fire products). Preliminary results demonstrated that abrupt changes at the end of time series can be successfully detected while the method remains robust to strong seasonality and atmospheric noise. Cloud masking, however, was identified as a critical issue since periods of persistent cloudiness can be detected as abnormal change. The proposed method is an automatic and robust change detection approach that can be applied to different types of data (e.g. future sensors like the Sentinel constellation that provide higher spatial resolution at regular time

  2. gatspy: General tools for Astronomical Time Series in Python

    NASA Astrophysics Data System (ADS)

    VanderPlas, Jake

    2016-10-01

    Gatspy contains efficient, well-documented implementations of several common routines for Astronomical time series analysis, including the Lomb-Scargle periodogram, the Supersmoother method, and others.
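
    A short usage sketch; the call pattern follows gatspy's documented examples, but the argument details should be checked against the package's current docs.

```python
# Find the best period of an irregularly sampled series with gatspy.
import numpy as np
from gatspy.periodic import LombScargleFast

rng = np.random.RandomState(0)
t = 100 * rng.rand(200)                    # irregular observation times
dy = 0.1                                   # per-point uncertainty
y = np.sin(2 * np.pi * t / 0.6) + dy * rng.randn(200)

model = LombScargleFast()
model.optimizer.period_range = (0.2, 1.2)  # window to search for the period
model.fit(t, y, dy)
print(model.best_period)                   # should be close to 0.6
```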

  3. Trend time-series modeling and forecasting with neural networks.

    PubMed

    Qi, Min; Zhang, G Peter

    2008-05-01

    Despite its great importance, there has been no general consensus on how to model the trends in time-series data. Compared to traditional approaches, neural networks (NNs) have shown some promise in time-series forecasting. This paper investigates how to best model trend time series using NNs. Four different strategies (raw data, raw data with time index, detrending, and differencing) are used to model various trend patterns (linear, nonlinear, deterministic, stochastic, and breaking trend). We find that with NNs differencing often gives meritorious results regardless of the underlying data generating processes (DGPs). This finding is also confirmed by the real gross national product (GNP) series.

  4. Using neural networks for dynamic light scattering time series processing

    NASA Astrophysics Data System (ADS)

    Chicea, Dan

    2017-04-01

    A basic experiment to record dynamic light scattering (DLS) time series was assembled using basic components. DLS time series processing using the Lorentzian function fit was considered as the reference. A neural network was designed and trained using simulated frequency spectra for spherical particles in the range 0–350 nm, assumed to be scattering centers, and the neural network design and training procedure are described in detail. The neural network output accuracy was tested both on simulated and on experimental time series. The match with the DLS results, considered as the reference, was good, serving as a proof of concept for using neural networks in fast DLS time series processing.

  5. Interpretable Early Classification of Multivariate Time Series

    ERIC Educational Resources Information Center

    Ghalwash, Mohamed F.

    2013-01-01

    Recent advances in technology have led to an explosion in data collection over time rather than in a single snapshot. For example, microarray technology allows us to measure gene expression levels in different conditions over time. Such temporal data grants the opportunity for data miners to develop algorithms to address domain-related problems,…

  6. Simulation of Ground Winds Time Series

    NASA Technical Reports Server (NTRS)

    Adelfang, S. I.

    2008-01-01

    A simulation process has been developed for generation of the longitudinal and lateral components of ground wind atmospheric turbulence as a function of mean wind speed, elevation, temporal frequency range and distance between locations. The distance between locations influences the spectral coherence between the simulated series at adjacent locations. Short distances reduce correlation only at high frequencies; as distances increase, correlation is reduced over a wider range of frequencies. The choice of values for the constants d1 and d3 in the PSD model is the subject of work in progress. An improved knowledge of the values of z0 as a function of wind direction at the ARES-1 launch pads is necessary for definition of d1. Results of other studies at other locations may be helpful, as summarized in Fichtl's recent correspondence. Ideally, further research is needed based on measurements of ground wind turbulence with high resolution anemometers at a number of altitudes at a new KSC tower located closer to the ARES-1 launch pad. The proposed research would be based on turbulence measurements that may be influenced by surface terrain roughness significantly different from the roughness prior to 1970 in Fichtl's measurements. Significant improvements in instrumentation, data storage and processing will greatly enhance the capability to model ground wind profiles and ground wind turbulence.

  7. Testing time series irreversibility using complex network methods

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan F.; Donner, Reik V.; Kurths, Jürgen

    2013-04-01

    The absence of time-reversal symmetry is a fundamental property of many nonlinear time series. Here, we propose a new set of statistical tests for time series irreversibility based on standard and horizontal visibility graphs. Specifically, we statistically compare the distributions of time-directed variants of the common complex network measures degree and local clustering coefficient. Our approach does not involve surrogate data and is applicable to relatively short time series. We demonstrate its performance for paradigmatic model systems with known time-reversal properties as well as for picking up signatures of nonlinearity in neuro-physiological data.

  8. Common trends in northeast Atlantic squid time series

    NASA Astrophysics Data System (ADS)

    Zuur, A. F.; Pierce, G. J.

    2004-06-01

    In this paper, dynamic factor analysis is used to estimate common trends in time series of squid catch per unit effort in Scottish (UK) waters. Results indicated that time series of most months were related to sea surface temperature measured at Millport (UK) and a few series were related to the NAO index. The DFA methodology identified three common trends in the squid time series not revealed by traditional approaches, which suggest a possible shift in relative abundance of summer- and winter-spawning populations.

  9. Distance measure with improved lower bound for multivariate time series

    NASA Astrophysics Data System (ADS)

    Li, Hailin

    2017-02-01

    The lower bound function is one of the important techniques used for fast search and indexing of time series data. A multivariate time series is high-dimensional in two respects: the time-based dimension and the variable-based dimension. Due to the influence of the variable-based dimension, a novel method is proposed to deal with lower bound distance computation for multivariate time series. The proposed method, like the traditional ones, reduces the dimensionality of the time series in its first step and thus does not directly apply the lower bound function to the multivariate time series. In the dimensionality reduction step, the multivariate time series is reduced to a univariate time series, denoted the center sequence, according to the principle of piecewise aggregate approximation. In addition, an extended lower bound function is designed to obtain good tightness and to measure the distance between any two center sequences quickly. The experimental results demonstrate that the proposed lower bound function has better tightness and improves the performance of similarity search in multivariate time series datasets.
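
    The reduction step can be sketched in a few lines; collapsing both the time and the variable dimension to a per-segment mean, as done below, is a simplification of the center-sequence construction described above.

```python
# Piecewise aggregate approximation (PAA) of a multivariate series into a
# short univariate "center sequence".
import numpy as np

def paa_center_sequence(series, n_segments):
    """series: (length, n_variables) array -> (n_segments,) sequence."""
    series = np.asarray(series, dtype=float)
    segments = np.array_split(series, n_segments, axis=0)
    return np.array([seg.mean() for seg in segments])  # mean over time and variables

mts = np.random.default_rng(1).standard_normal((120, 3))
print(paa_center_sequence(mts, 10))  # 120x3 series reduced to length 10
```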

  10. Multiscale structure of time series revealed by the monotony spectrum

    NASA Astrophysics Data System (ADS)

    Vamoş, Cǎlin

    2017-03-01

    Observation of complex systems produces time series with specific dynamics at different time scales. The majority of the existing numerical methods for multiscale analysis first decompose the time series into several simpler components and the multiscale structure is given by the properties of their components. We present a numerical method which describes the multiscale structure of arbitrary time series without decomposing them. It is based on the monotony spectrum defined as the variation of the mean amplitude of the monotonic segments with respect to the mean local time scale during successive averagings of the time series, the local time scales being the durations of the monotonic segments. The maxima of the monotony spectrum indicate the time scales which dominate the variations of the time series. We show that the monotony spectrum can correctly analyze a diversity of artificial time series and can discriminate the existence of deterministic variations at large time scales from the random fluctuations. As an application we analyze the multifractal structure of some hydrological time series.

  11. Short time-series microarray analysis: Methods and challenges

    PubMed Central

    Wang, Xuewei; Wu, Ming; Li, Zheng; Chan, Christina

    2008-01-01

    The detection and analysis of steady-state gene expression has become routine. Time-series microarrays are of growing interest to systems biologists for deciphering the dynamic nature and complex regulation of biosystems. Most temporal microarray data only contain a limited number of time points, giving rise to short-time-series data, which imposes challenges for traditional methods of extracting meaningful information. To obtain useful information from the wealth of short-time series data requires addressing the problems that arise due to limited sampling. Current efforts have shown promise in improving the analysis of short time-series microarray data, although challenges remain. This commentary addresses recent advances in methods for short-time series analysis including simplification-based approaches and the integration of multi-source information. Nevertheless, further studies and development of computational methods are needed to provide practical solutions to fully exploit the potential of this data. PMID:18605994

  12. Horizontal visibility graphs: exact results for random time series.

    PubMed

    Luque, B; Lacasa, L; Ballesteros, F; Luque, J

    2009-10-01

    The visibility algorithm has been recently introduced as a mapping between time series and complex networks. This procedure allows us to apply methods of complex network theory for characterizing time series. In this work we present the horizontal visibility algorithm, a geometrically simpler and analytically solvable version of our former algorithm, focusing on the mapping of random series (series of independent identically distributed random variables). After presenting some properties of the algorithm, we present exact results on the topological properties of graphs associated with random series, namely, the degree distribution, the clustering coefficient, and the mean path length. We show that the horizontal visibility algorithm stands as a simple method to discriminate randomness in time series, since any random series maps to a graph with an exponential degree distribution of the form P(k) = (1/3)(2/3)^(k-2), independent of the probability distribution from which the series was generated. Accordingly, visibility graphs with other P(k) are related to nonrandom series. Numerical simulations confirm the accuracy of the theorems for finite series. In a second part, we show that the method is able to distinguish chaotic series from independent and identically distributed (i.i.d.) series, studying the following situations: (i) noise-free low-dimensional chaotic series, (ii) low-dimensional noisy chaotic series, even in the presence of large amounts of noise, and (iii) high-dimensional chaotic series (coupled map lattice), without need for additional techniques such as surrogate data or noise reduction methods. Finally, heuristic arguments are given to explain the topological properties of chaotic series, and several sequences that are conjectured to be random are analyzed.
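
    The exact degree distribution quoted above is easy to check numerically; a minimal sketch follows, where the early exit on a taller point is the standard shortcut for horizontal visibility.

```python
# Horizontal visibility: i and j are linked iff every intermediate value is
# strictly below both endpoints; a taller later point blocks everything beyond.
import numpy as np

def hvg_degrees(x):
    n = len(x)
    deg = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            if all(x[k] < min(x[i], x[j]) for k in range(i + 1, j)):
                deg[i] += 1
                deg[j] += 1
            if x[j] >= x[i]:
                break  # nothing past a taller point can see i
    return deg

x = np.random.default_rng(0).random(3000)  # i.i.d. uniform series
deg = hvg_degrees(x)
for k in range(2, 7):  # empirical P(k) vs the exact (1/3)(2/3)^(k-2)
    print(k, round(float(np.mean(deg == k)), 4), round((1/3) * (2/3) ** (k - 2), 4))
```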

  13. Nonlinear parametric model for Granger causality of time series

    NASA Astrophysics Data System (ADS)

    Marinazzo, Daniele; Pellicoro, Mario; Stramaglia, Sebastiano

    2006-06-01

    The notion of Granger causality between two time series examines if the prediction of one series could be improved by incorporating information of the other. In particular, if the prediction error of the first time series is reduced by including measurements from the second time series, then the second time series is said to have a causal influence on the first one. We propose a radial basis function approach to nonlinear Granger causality. The proposed model is not constrained to be additive in variables from the two time series and can approximate any function of these variables, still being suitable to evaluate causality. Usefulness of this measure of causality is shown in two applications. In the first application, a physiological one, we consider time series of heart rate and blood pressure in congestive heart failure patients and patients affected by sepsis: we find that sepsis patients, unlike congestive heart failure patients, show symmetric causal relationships between the two time series. In the second application, we consider the feedback loop in a model of excitatory and inhibitory neurons: we find that in this system causality measures the combined influence of couplings and membrane time constants.
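
    For comparison with the nonlinear radial-basis-function variant proposed above, the standard linear Granger test is available in statsmodels; the bivariate toy system below is illustrative.

```python
# Linear Granger causality: does x help predict y beyond y's own past?
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
n = 500
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + 0.1 * rng.standard_normal()

# Column order matters: the test asks whether the 2nd column causes the 1st.
res = grangercausalitytests(np.column_stack([y, x]), maxlag=2)
print("p-value at lag 1:", res[1][0]["ssr_ftest"][1])  # should be tiny here
```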

  14. Using Time-Series Regression to Predict Academic Library Circulations.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.

    1984-01-01

    Four methods were used to forecast monthly circulation totals in 15 midwestern academic libraries: dummy time-series regression, lagged time-series regression, simple average (straight-line forecasting), monthly average (naive forecasting). In tests of forecasting accuracy, dummy regression method and monthly mean method exhibited smallest average…

  15. The Prediction of Teacher Turnover Employing Time Series Analysis.

    ERIC Educational Resources Information Center

    Costa, Crist H.

    The purpose of this study was to combine knowledge of teacher demographic data with time-series forecasting methods to predict teacher turnover. Moving averages and exponential smoothing were used to forecast discrete time series. The study used data collected from the 22 largest school districts in Iowa, designated as FACT schools. Predictions…

  16. Investigation on gait time series by means of factorial moments

    NASA Astrophysics Data System (ADS)

    Yang, Huijie; Zhao, Fangcui; Zhuo, Yizhong; Wu, Xizhen; Li, Zhuxia

    2002-09-01

    By means of factorial moments (FM), the fractal structures embedded in gait time series are investigated. Intermittency is found in records for healthy subjects, and this kind of intermittency is highly sensitive to disease or outside influences. It is found that FM is an effective tool for dealing with this kind of time series.

  17. Improved singular spectrum analysis for time series with missing data

    NASA Astrophysics Data System (ADS)

    Shen, Y.; Peng, F.; Li, B.

    2015-07-01

    Singular spectrum analysis (SSA) is a powerful technique for time series analysis. Based on the property that the original time series can be reproduced from its principal components, this contribution develops an improved SSA (ISSA) for processing incomplete time series, of which the modified SSA (SSAM) of Schoellhamer (2001) is a special case. The approach is evaluated with synthetic and real incomplete time series data of suspended-sediment concentration from San Francisco Bay. The result from the synthetic time series with missing data shows that the relative errors of the principal components reconstructed by ISSA are much smaller than those reconstructed by SSAM. Moreover, when the percentage of missing data over the whole time series reaches 60 %, the improvements in relative errors are up to 19.64, 41.34, 23.27 and 50.30 % for the first four principal components, respectively. Both the mean absolute error and mean root mean squared error of the time series reconstructed by ISSA are also smaller than those by SSAM. The respective improvements are 34.45 and 33.91 % when the missing data accounts for 60 %. The results from the real incomplete time series also show that the standard deviation (SD) derived by ISSA is 12.27 mg L⁻¹, smaller than the 13.48 mg L⁻¹ derived by SSAM.

  18. Improved singular spectrum analysis for time series with missing data

    NASA Astrophysics Data System (ADS)

    Shen, Y.; Peng, F.; Li, B.

    2014-12-01

    Singular spectrum analysis (SSA) is a powerful technique for time series analysis. Based on the property that the original time series can be reproduced from its principal components, this contribution develops an improved SSA (ISSA) for processing incomplete time series, of which the modified SSA (SSAM) of Schoellhamer (2001) is a special case. The approach was evaluated with synthetic and real incomplete time series data of suspended-sediment concentration from San Francisco Bay. The result from the synthetic time series with missing data shows that the relative errors of the principal components reconstructed by ISSA are much smaller than those reconstructed by SSAM. Moreover, when the percentage of missing data over the whole time series reaches 60%, the improvements in relative errors are up to 19.64, 41.34, 23.27 and 50.30% for the first four principal components, respectively. In addition, both the mean absolute errors and mean root mean squared errors of the time series reconstructed by ISSA are much smaller than those by SSAM. The respective improvements are 34.45 and 33.91% when the missing data accounts for 60%. The results from the real incomplete time series also show that the SD derived by ISSA is 12.27 mg L⁻¹, smaller than the 13.48 mg L⁻¹ derived by SSAM.

  19. Measurements of spatial population synchrony: influence of time series transformations.

    PubMed

    Chevalier, Mathieu; Laffaille, Pascal; Ferdy, Jean-Baptiste; Grenouillet, Gaël

    2015-09-01

    Two mechanisms have been proposed to explain spatial population synchrony: dispersal among populations, and the spatial correlation of density-independent factors (the "Moran effect"). To identify which of these two mechanisms is driving spatial population synchrony, time series transformations (TSTs) of abundance data have been used to remove the signature of one mechanism, and highlight the effect of the other. However, several issues with TSTs remain, and to date no consensus has emerged about how population time series should be handled in synchrony studies. Here, by using 3131 time series involving 34 fish species found in French rivers, we computed several metrics commonly used in synchrony studies to determine whether a large-scale climatic factor (temperature) influenced fish population dynamics at the regional scale, and to test the effect of three commonly used TSTs (detrending, prewhitening and a combination of both) on these metrics. We also tested whether the influence of TSTs on time series and population synchrony levels was related to the features of the time series using both empirical and simulated time series. For several species, and regardless of the TST used, we evidenced a Moran effect on freshwater fish populations. However, these results were globally biased downward by TSTs which reduced our ability to detect significant signals. Depending on the species and the features of the time series, we found that TSTs could lead to contradictory results, regardless of the metric considered. Finally, we suggest guidelines on how population time series should be processed in synchrony studies.

  20. Small Sample Properties of Bayesian Multivariate Autoregressive Time Series Models

    ERIC Educational Resources Information Center

    Price, Larry R.

    2012-01-01

    The aim of this study was to compare the small sample (N = 1, 3, 5, 10, 15) performance of a Bayesian multivariate vector autoregressive (BVAR-SEM) time series model relative to frequentist power and parameter estimation bias. A multivariate autoregressive model was developed based on correlated autoregressive time series vectors of varying…

  1. A Computer Evolution in Teaching Undergraduate Time Series

    ERIC Educational Resources Information Center

    Hodgess, Erin M.

    2004-01-01

    In teaching undergraduate time series courses, we have used a mixture of various statistical packages. We have finally been able to teach all of the applied concepts within one statistical package; R. This article describes the process that we use to conduct a thorough analysis of a time series. An example with a data set is provided. We compare…

  2. Approximate but accurate quantum dynamics from the Mori formalism. II. Equilibrium time correlation functions.

    PubMed

    Montoya-Castillo, Andrés; Reichman, David R

    2017-02-28

    The ability to efficiently and accurately calculate equilibrium time correlation functions of many-body condensed phase quantum systems is one of the outstanding problems in theoretical chemistry. The Nakajima-Zwanzig-Mori formalism coupled to the self-consistent solution of the memory kernel has recently proven to be highly successful for the computation of nonequilibrium dynamical averages. Here, we extend this formalism to treat symmetrized equilibrium time correlation functions for the spin-boson model. Following the first paper in this series [A. Montoya-Castillo and D. R. Reichman, J. Chem. Phys. 144, 184104 (2016)], we use a Dyson-type expansion of the projected propagator to obtain a self-consistent solution for the memory kernel that requires only the calculation of normally evolved auxiliary kernels. We employ the approximate mean-field Ehrenfest method to demonstrate the feasibility of this approach. Via comparison with numerically exact results for the correlation function C_zz(t) = Re⟨σ_z(0)σ_z(t)⟩, we show that the current scheme affords remarkable boosts in accuracy and efficiency over bare Ehrenfest dynamics. We further explore the sensitivity of the resulting dynamics to the choice of kernel closures and the accuracy of the initial canonical density operator.

  3. Scaling and Multiscaling in Financial Time Series

    DTIC Science & Technology

    2007-11-02

    Outline: 1. A brief overview of financial markets: basic definitions and problems related to finance; scaling in finance. 2. [...] quantitative finance: rational investment and risk management (price dynamics; risk quantification and control; financial instruments: derivatives) [...] finance: supported by empirical observations; practical interests (stability over time scales, by aggregation; the same model is valid over a wide [...])

  4. Time series diagnosis of tree hydraulic characteristics.

    PubMed

    Phillips, Nathan G; Oren, Ram; Licata, Julian; Linder, Sune

    2004-08-01

    An in vivo method for diagnosing hydraulic characteristics of branches and whole trees is described. The method imposes short-lived perturbations of transpiration and traces the propagation of the hydraulic response through trees. The water uptake response contains the integrated signature of hydraulic resistance and capacitance within trees. The method produces large signal to noise ratios for analysis, but does not cause damage or destruction to tree stems or branches. Based on results with two conifer tree species, we show that the method allows for the simple parameterization of bulk hydraulic resistance and capacitance of trees. Bulk tree parameterization of resistance and capacitance predicted the overall diel shape of water uptake, but did not predict the overshoot water uptake response in trees to shorter-term variations in transpiration, created by step changes in transpiration rate. Stomatal dynamics likely complicated the use of simple resistance-capacitance models of tree water transport on these short time scales. The results provide insight into dominant hydraulic and physiological factors controlling tree water flux on varying time scales, and allow for the practical assessment of necessary tree hydraulic model complexity in relation to the time step of soil- vegetation-atmosphere transport models.

  5. Time Series of North Pacific Volcanic Eruptions

    NASA Astrophysics Data System (ADS)

    Dehn, J.; Worden, A. K.; Webley, P. W.

    2011-12-01

    The record of volcanic eruptions was gathered from the 1986 eruption of Augustine Volcano to the present for Alaska, Kamchatka and the Kuril Islands. In this time over 400 ash-producing eruptions were noted, and many more events that produced some other activity, e.g. lava, lahar, small explosion, seismic crisis. This represents a minimum for the volcanic activity in this region. It is thought that the records for Alaska are complete for this time period, but it is possible that activity in the Kuriles and Kamchatka could have been overlooked, particularly smaller events. For the Alaska region, 19 different volcanoes have been active in this time. Mt. Cleveland shows the most activity over the time period (40% likely to have activity in a 3-month period), followed closely by Pavlof volcano (34% likely). In Kamchatka only 7 volcanoes have been active; Shiveluch is the most active (83% likely), followed by Bezymianny and Kliuchevskoi volcanoes (tied at 60%). The Kuriles have had only 4 active volcanoes, and only 6 known eruptions. Overall this region is one of the most active in the world; in any 3-month period there is a 77% likelihood of volcanic activity. For well instrumented volcanoes, the majority of activity is preceded by significant seismicity. For just over half of the events, explosive activity is preceded by thermal signals in infrared satellite data. Rarely (only about 5% of the time) is a stand-alone thermal signal not followed within 3 months by an explosive eruption. For the remaining events, where an ash plume begins the activity, over 90% of the cases show a thermal signal during the eruption. The volcanoes with the most activity are the least likely to produce large ash plumes. Conversely, the volcanoes that erupt rarely often begin with larger ash-producing events. Though there appears to be a recurrent progression of volcanic activity down the chain from east to west, this may be an artifact of several independent systems, each working at their own rate, that

  6. Application of cross-sectional time series modeling for the prediction of energy expenditure from heart rate and accelerometry

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Accurate estimation of energy expenditure (EE) in children and adolescents is required for a better understanding of physiological, behavioral, and environmental factors affecting energy balance. Cross-sectional time series (CSTS) models, which account for correlation structure of repeated observati...

  7. Sunspot Time Series: Passive and Active Intervals

    NASA Astrophysics Data System (ADS)

    Zięba, S.; Nieckarz, Z.

    2014-07-01

    Solar activity slowly and irregularly decreases from the first spotless day (FSD) in the declining phase of the old sunspot cycle and systematically, but also in an irregular way, increases to the new cycle maximum after the last spotless day (LSD). The time interval between the first and the last spotless day can be called the passive interval (PI), while the time interval from the last spotless day to the first one after the new cycle maximum is the related active interval (AI). Minima of solar cycles are inside PIs, while maxima are inside AIs. In this article, we study the properties of passive and active intervals to determine the relation between them. We have found that some properties of PIs, and related AIs, differ significantly between two groups of solar cycles; this has allowed us to classify Cycles 8 - 15 as passive cycles, and Cycles 17 - 23 as active ones. We conclude that the solar activity in the PI declining phase (a descending phase of the previous cycle) determines the strength of the approaching maximum in the case of active cycles, while the activity of the PI rising phase (a phase of the ongoing cycle's early growth) determines the strength of passive cycles. This can have implications for solar dynamo models. Our approach indicates the important role of solar activity during the declining and the rising phases of the solar-cycle minimum.

  8. Forecasting Marine Corps Enlisted Manpower Inventory Levels With Univariate Time Series Models

    DTIC Science & Technology

    2006-03-01

    B. PURPOSE ... C. SCOPE AND METHODOLOGY ... The Box-Jenkins technique is a sophisticated approach by which to analyze time series data and extrapolate a forecast. This methodology provides a framework ... Model by more accurately predicting the personnel resources available for assignment to future manpower requirements. The primary research question

  9. Comparison of New and Old Sunspot Number Time Series

    NASA Astrophysics Data System (ADS)

    Cliver, E. W.

    2016-11-01

    Four new sunspot number time series have been published in this Topical Issue: a backbone-based group number in Svalgaard and Schatten ( Solar Phys., 2016; referred to here as SS, 1610 - present), a group number series in Usoskin et al. ( Solar Phys., 2016; UEA, 1749 - present) that employs active day fractions from which it derives an observational threshold in group spot area as a measure of observer merit, a provisional group number series in Cliver and Ling ( Solar Phys., 2016; CL, 1841 - 1976) that removed flaws in the Hoyt and Schatten ( Solar Phys. 179, 189, 1998a; 181, 491, 1998b) normalization scheme for the original relative group sunspot number (RG, 1610 - 1995), and a corrected Wolf (international, RI) number in Clette and Lefèvre ( Solar Phys., 2016; SN, 1700 - present). Despite quite different construction methods, the four new series agree well after about 1900. Before 1900, however, the UEA time series is lower than SS, CL, and SN, particularly so before about 1885. Overall, the UEA series most closely resembles the original RG series. Comparison of the UEA and SS series with a new solar wind B time series (Owens et al. in J. Geophys. Res., 2016; 1845 - present) indicates that the UEA time series is too low before 1900. We point out incongruities in the Usoskin et al. ( Solar Phys., 2016) observer normalization scheme and present evidence that this method under-estimates group counts before 1900. In general, a correction factor time series, obtained by dividing an annual group count series by the corresponding yearly averages of raw group counts for all observers, can be used to assess the reliability of new sunspot number reconstructions.

  10. Learning a Mahalanobis Distance-Based Dynamic Time Warping Measure for Multivariate Time Series Classification.

    PubMed

    Mei, Jiangyuan; Liu, Meizhu; Wang, Yuan-Fang; Gao, Huijun

    2016-06-01

    Multivariate time series (MTS) datasets broadly exist in numerous fields, including health care, multimedia, finance, and biometrics. How to classify MTS accurately has become a hot research topic since it is an important element in many computer vision and pattern recognition applications. In this paper, we propose a Mahalanobis distance-based dynamic time warping (DTW) measure for MTS classification. The Mahalanobis distance builds an accurate relationship between each variable and its corresponding category. It is utilized to calculate the local distance between vectors in MTS. Then we use DTW to align those MTS which are out of synchronization or with different lengths. After that, how to learn an accurate Mahalanobis distance function becomes another key problem. This paper establishes a LogDet divergence-based metric learning with triplet constraint model which can learn Mahalanobis matrix with high precision and robustness. Furthermore, the proposed method is applied on nine MTS datasets selected from the University of California, Irvine machine learning repository and Robert T. Olszewski's homepage, and the results demonstrate the improved performance of the proposed approach.
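
    A minimal sketch of the distance computation described above, with a fixed Mahalanobis matrix M; the paper's contribution is to learn M via LogDet-divergence metric learning, which is not reproduced here.

```python
# DTW with a Mahalanobis local distance between multivariate samples.
import numpy as np

def mahalanobis_dtw(a, b, M):
    """a: (n, d), b: (m, d) series; M: (d, d) positive semidefinite matrix."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diff = a[i - 1] - b[j - 1]
            local = np.sqrt(diff @ M @ diff)       # Mahalanobis local distance
            D[i, j] = local + min(D[i - 1, j],     # insertion
                                  D[i, j - 1],     # deletion
                                  D[i - 1, j - 1]) # match
    return D[n, m]

rng = np.random.default_rng(0)
a, b = rng.standard_normal((50, 3)), rng.standard_normal((60, 3))
M = np.linalg.inv(np.cov(np.vstack([a, b]).T))  # stand-in for a *learned* M
print(mahalanobis_dtw(a, b, M))
```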

  11. Detecting unstable periodic orbits from transient chaotic time series

    PubMed

    Dhamala; Lai; Kostelich

    2000-06-01

    We address the detection of unstable periodic orbits from experimentally measured transient chaotic time series. In particular, we examine recurrence times of trajectories in the vector space reconstructed from an ensemble of such time series. Numerical experiments demonstrate that this strategy can yield periodic orbits of low periods even when noise is present. We analyze the probability of finding periodic orbits from transient chaotic time series and derive a scaling law for this probability. The scaling law implies that unstable periodic orbits of high periods are practically undetectable from transient chaos.

  12. High performance biomedical time series indexes using salient segmentation.

    PubMed

    Woodbridge, Jonathan; Mortazavi, Bobak; Bui, Alex A T; Sarrafzadeh, Majid

    2012-01-01

    The advent of remote and wearable medical sensing has created a dire need for efficient medical time series databases. Wearable medical sensing devices provide continuous patient monitoring by various types of sensors and have the potential to create massive amounts of data. Therefore, time series databases must utilize highly optimized indexes in order to efficiently search and analyze stored data. This paper presents a highly efficient technique for indexing medical time series signals using Locality Sensitive Hashing (LSH). Unlike previous work, only salient (or interesting) segments are inserted into the index. This technique reduces search times by up to 95% while yielding near identical search results.
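
    A small sketch of the indexing idea: hash only "salient" segments into LSH buckets; the variance threshold used as the salience filter here is an illustrative stand-in for the paper's salient-segmentation step.

```python
# Index salient fixed-length segments with random-projection LSH.
import numpy as np
from collections import defaultdict

def build_lsh_index(signal, seg_len=32, n_planes=12, var_min=0.02, seed=0):
    rng = np.random.default_rng(seed)
    planes = rng.standard_normal((n_planes, seg_len))  # random hyperplanes
    index = defaultdict(list)
    for start in range(len(signal) - seg_len + 1):
        seg = signal[start:start + seg_len]
        if seg.var() < var_min:            # salience filter: skip flat segments
            continue
        z = (seg - seg.mean()) / seg.std() # z-normalize before hashing
        key = tuple((planes @ z) > 0)      # sign pattern = hash bucket
        index[key].append(start)
    return index, planes

sig = (np.sin(np.linspace(0, 60, 2000))
       + 0.1 * np.random.default_rng(1).standard_normal(2000))
index, planes = build_lsh_index(sig)
print(len(index), "buckets;", sum(map(len, index.values())), "salient segments")
```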

  13. From time series to complex networks: The visibility graph

    PubMed Central

    Lacasa, Lucas; Luque, Bartolo; Ballesteros, Fernando; Luque, Jordi; Nuño, Juan Carlos

    2008-01-01

    In this work we present a simple and fast computational method, the visibility algorithm, that converts a time series into a graph. The constructed graph inherits several properties of the series in its structure. Thereby, periodic series convert into regular graphs, and random series do so into random graphs. Moreover, fractal series convert into scale-free networks, enhancing the fact that power law degree distributions are related to fractality, something highly discussed recently. Some remarkable examples and analytical tools are outlined to test the method's reliability. Many different measures, recently developed in the complex network theory, could by means of this new approach characterize time series from a new point of view. PMID:18362361

  14. Detecting and visualizing structural changes in groundwater head time series

    NASA Astrophysics Data System (ADS)

    van Geer, Frans

    2013-04-01

    Since the 1950s, the dynamic behavior of the groundwater head has been monitored at many locations throughout the Netherlands and elsewhere. The database of the Geological Survey of the Netherlands contains over 30,000 groundwater time series. For many water management purposes, characteristics of the dynamic behavior are required, such as the average, median, and percentiles. These characteristics are estimated from the time series. In principle, the longer the time series, the more reliable the estimate. However, due to natural as well as human-induced changes, the characteristics of a long time series are often changing in time as well. For water management it is important to be able to distinguish extreme values that are part of the 'normal' pattern from structural changes in the groundwater regime. Whether or not structural changes are present in a time series cannot be decided entirely objectively: choices have to be made concerning the length of the period and the statistical parameters. Here a method is proposed to visualize the probability of structural changes in a time series using well-known basic statistical tests. The visualization is based on the mean values and standard deviations in a moving window. Apart from several characteristics that are calculated for each period separately, all pairs of periods are compared and their differences are statistically tested. The results of these well-known tests are combined in a visualization that supplies the user with comprehensive information for examining structural changes in time series.
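
    A bare-bones sketch of the pairwise-window comparison, assuming Welch's t-test on window means as the statistical test; the paper combines several well-known basic tests, so the specific test and the non-overlapping window scheme used here are illustrative.

      import numpy as np
      from scipy import stats

      def window_change_matrix(h, window):
          # Non-overlapping windows of the groundwater head series h.
          segs = [h[i:i + window] for i in range(0, len(h) - window + 1, window)]
          k = len(segs)
          P = np.ones((k, k))
          for a in range(k):
              for b in range(a + 1, k):
                  # Test each pair of periods for a difference in mean.
                  _, p = stats.ttest_ind(segs[a], segs[b], equal_var=False)
                  P[a, b] = P[b, a] = p
          return P  # low p-values flag candidate structural changes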

  15. Performance of multifractal detrended fluctuation analysis on short time series

    NASA Astrophysics Data System (ADS)

    López, Juan Luis; Contreras, Jesús Guillermo

    2013-02-01

    The performance of multifractal detrended fluctuation analysis on short time series is evaluated for synthetic samples of several mono- and multifractal models. The reconstruction of the generalized Hurst exponents is used to determine the range of applicability of the method and the precision of its results as a function of the decreasing length of the series. As an application, the series of the daily exchange rate between the U.S. dollar and the euro is studied.

  16. Model-free quantification of time-series predictability.

    PubMed

    Garland, Joshua; James, Ryan; Bradley, Elizabeth

    2014-11-01

    This paper provides insight into when, why, and how forecast strategies fail when they are applied to complicated time series. We conjecture that the inherent complexity of real-world time-series data, which results from the dimension, nonlinearity, and nonstationarity of the generating process, as well as from measurement issues such as noise, aggregation, and finite data length, is both empirically quantifiable and directly correlated with predictability. In particular, we argue that redundancy is an effective way to measure complexity and predictive structure in an experimental time series and that weighted permutation entropy is an effective way to estimate that redundancy. To validate these conjectures, we study 120 different time-series data sets. For each time series, we construct predictions using a wide variety of forecast models, then compare the accuracy of the predictions with the permutation entropy of that time series. We use the results to develop a model-free heuristic that can help practitioners recognize when a particular prediction method is not well matched to the task at hand: that is, when the time series has more predictive structure than that method can capture and exploit.
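
    A compact estimator of weighted permutation entropy, the redundancy proxy used in the paper: ordinal patterns of length m, each weighted by the variance of its embedding window. The defaults m = 4 and tau = 1 are illustrative, and a non-constant input series is assumed.

      import numpy as np
      from collections import defaultdict

      def weighted_permutation_entropy(x, m=4, tau=1):
          x = np.asarray(x, float)
          weights = defaultdict(float)
          total = 0.0
          for i in range(len(x) - (m - 1) * tau):
              window = x[i:i + (m - 1) * tau + 1:tau]
              pattern = tuple(np.argsort(window))  # ordinal pattern of the window
              w = np.var(window)                   # weight: window variance
              weights[pattern] += w
              total += w
          p = np.array(list(weights.values())) / total  # assumes total > 0
          p = p[p > 0]
          return float(-np.sum(p * np.log2(p)))    # entropy in bits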

  17. Time Series Decomposition into Oscillation Components and Phase Estimation.

    PubMed

    Matsuda, Takeru; Komaki, Fumiyasu

    2017-02-01

    Many time series are naturally considered as a superposition of several oscillation components. For example, electroencephalogram (EEG) time series include oscillation components such as alpha, beta, and gamma. We propose a method for decomposing time series into such oscillation components using state-space models. Based on the concept of random frequency modulation, Gaussian linear state-space models for oscillation components are developed. In this model, the frequency of an oscillator fluctuates by noise. Time series decomposition is accomplished by this model in a manner similar to the Bayesian seasonal adjustment method. Since the model parameters are estimated from data by the empirical Bayes method, the amplitudes and the frequencies of oscillation components are determined in a data-driven manner. Also, the appropriate number of oscillation components is determined with the Akaike information criterion (AIC). In this way, the proposed method provides a natural decomposition of the given time series into oscillation components. In neuroscience, the phase of neural time series plays an important role in neural information processing. The proposed method can be used to estimate the phase of each oscillation component and has several advantages over a conventional method based on the Hilbert transform. Thus, the proposed method enables an investigation of the phase dynamics of time series. Numerical results show that the proposed method succeeds in extracting intermittent oscillations like ripples and detecting the phase reset phenomena. We apply the proposed method to real data from various fields such as astronomy, ecology, tidology, and neuroscience.

  18. Nonlinear independent component analysis and multivariate time series analysis

    NASA Astrophysics Data System (ADS)

    Storck, Jan; Deco, Gustavo

    1997-02-01

    We derive an information-theory-based unsupervised learning paradigm for nonlinear independent component analysis (NICA) with neural networks. We demonstrate that, under the constraint of bounded and invertible output transfer functions, the two main goals of unsupervised learning, redundancy reduction and maximization of the transmitted information between input and output (Infomax principle), are equivalent. No assumptions are made concerning the kind of input and output distributions, i.e., the kind of nonlinearity of correlations. An adapted version of the general NICA network is used for the modeling of multivariate time series by unsupervised learning. Given time series of various observables of a dynamical system, our net learns their evolution in time by extracting statistical dependencies between past and present elements of the time series. Multivariate modeling is obtained by making the present value of each time series statistically independent not only of its own past but also of the past of the other series. Therefore, in contrast to univariate methods, the information lying in the couplings between the observables is also used, and a detection of higher-order cross correlations is possible. We apply our method to time series of the two-dimensional Hénon map and to experimental time series obtained from measurements of axial velocities at different locations in weakly turbulent Taylor-Couette flow.

  19. Database for Hydrological Time Series of Inland Waters (DAHITI)

    NASA Astrophysics Data System (ADS)

    Schwatke, Christian; Dettmering, Denise

    2016-04-01

    Satellite altimetry was designed for ocean applications. For some years now, however, satellite altimetry has also been used over inland water to estimate water level time series of lakes, rivers and wetlands. The resulting water level time series can help to understand the water cycle of the Earth system, which makes altimetry a very useful instrument for hydrological applications. In this poster, we introduce the "Database for Hydrological Time Series of Inland Waters" (DAHITI). Currently, the database contains about 350 water level time series of lakes, reservoirs, rivers, and wetlands which are freely available after a short registration process via http://dahiti.dgfi.tum.de. We present the products of DAHITI and the functionality of the DAHITI web service, and selected examples of inland water targets are discussed in detail. DAHITI provides time series of water level heights of inland water bodies together with their formal errors. These time series are available within the period 1992-2015 and have varying temporal resolutions depending on the data coverage of the investigated water body. The accuracies of the water level time series depend mainly on the extent of the investigated water body and the quality of the altimeter measurements. An external validation with in-situ data reveals RMS differences between 5 cm and 40 cm for lakes and between 10 cm and 140 cm for rivers.

  20. Chaos Time Series Prediction Based on Membrane Optimization Algorithms

    PubMed Central

    Li, Meng; Yi, Liangzhong; Pei, Zheng; Gao, Zhisheng

    2015-01-01

    This paper puts forward a prediction model for chaotic time series based on a membrane computing optimization algorithm; the model simultaneously optimizes the parameters of phase space reconstruction (τ, m) and of the least squares support vector machine (LS-SVM) (γ, σ) using the membrane computing optimization algorithm. Accurately predicting the trend of parameters in the electromagnetic environment is an important basis for spectrum management and can help decision makers adopt an optimal action. The model presented in this paper is used to forecast the band occupancy rate of the frequency modulation (FM) broadcasting band and the interphone band. To show the applicability and superiority of the proposed model, this paper compares it with conventional similar models. The experimental results show that, for both single-step and multistep prediction, the proposed model performs best on three error measures, namely, normalized mean square error (NMSE), root mean square error (RMSE), and mean absolute percentage error (MAPE). PMID:25874249

  1. Financial time series analysis based on effective phase transfer entropy

    NASA Astrophysics Data System (ADS)

    Yang, Pengbo; Shang, Pengjian; Lin, Aijing

    2017-02-01

    Transfer entropy is a powerful technique which is able to quantify the impact of one dynamic system on another. In this paper, we propose the effective phase transfer entropy method, which builds on transfer entropy. We use simulated data to test the performance of this method, and the experimental results confirm that the proposed approach is capable of detecting the information transfer between systems. We also explore the relationship between effective phase transfer entropy and variables such as data size, coupling strength and noise. The effective phase transfer entropy is positively correlated with the data size and the coupling strength. Even in the presence of a large amount of noise it can detect the information transfer between systems, so it is very robust to noise. Moreover, compared with phase transfer entropy, this measure estimates the information flow between systems more accurately. To reflect the application of this method in practice, we apply it to financial time series and gain new insight into the interactions between systems. It is demonstrated that the effective phase transfer entropy can be used to detect economic fluctuations in the financial market. To summarize, the effective phase transfer entropy method is a very efficient tool for estimating the information flow between systems.
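
    For orientation, the sketch below gives a histogram-based estimate of ordinary lag-1 transfer entropy from x to y; the paper's effective phase variant would instead apply such an estimator to instantaneous phases (e.g., obtained via a Hilbert transform) together with a surrogate-based correction, neither of which is shown here.

      import numpy as np
      from collections import Counter

      def transfer_entropy(x, y, bins=8):
          # Discretize both series into equal-width bins.
          def disc(s):
              edges = np.histogram_bin_edges(s, bins)[1:-1]
              return np.digitize(s, edges)
          xd, yd = disc(np.asarray(x)), disc(np.asarray(y))
          y_next, y_now, x_now = yd[1:], yd[:-1], xd[:-1]
          n = len(y_next)
          c_xyz = Counter(zip(y_next, y_now, x_now))
          c_yz = Counter(zip(y_now, x_now))
          c_yy = Counter(zip(y_next, y_now))
          c_y = Counter(y_now)
          te = 0.0
          for (yn, yc, xc), c in c_xyz.items():
              # ratio = p(y_next | y_now, x_now) / p(y_next | y_now)
              ratio = (c / c_yz[(yc, xc)]) / (c_yy[(yn, yc)] / c_y[yc])
              te += (c / n) * np.log2(ratio)
          return te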

  2. River flow time series using least squares support vector machines

    NASA Astrophysics Data System (ADS)

    Samsudin, R.; Saad, P.; Shabri, A.

    2011-06-01

    This paper proposes a novel hybrid forecasting model known as GLSSVM, which combines the group method of data handling (GMDH) and the least squares support vector machine (LSSVM). The GMDH is used to determine the useful input variables that serve as inputs for time series forecasting with the LSSVM model. Monthly river flow data from two stations on the Selangor and Bernam rivers in Selangor state of Peninsular Malaysia were used in the development of this hybrid model. The performance of this model was compared with conventional artificial neural network (ANN) models, Autoregressive Integrated Moving Average (ARIMA), GMDH and LSSVM models using long-term observations of monthly river flow discharge. The root mean square error (RMSE) and coefficient of correlation (R) are used to evaluate the models' performance. In both cases, the new hybrid model has been found to provide more accurate flow forecasts than the other models. The results of the comparison indicate that the new hybrid model is a useful tool and a promising new method for river flow forecasting.

  3. Estimation of Parameters from Discrete Random Nonstationary Time Series

    NASA Astrophysics Data System (ADS)

    Takayasu, H.; Nakamura, T.

    For the analysis of nonstationary stochastic time series we introduce a formulation to estimate the underlying time-dependent parameters. This method is designed for random events with small counts that are outside the applicability range of the normal distribution. The method is demonstrated for numerical data generated by a known system, and applied to time series of traffic accidents, the batting average of a baseball player, and sales volumes of home electronics.

  4. Two States Mapping Based Time Series Neural Network Model for Compensation Prediction Residual Error

    NASA Astrophysics Data System (ADS)

    Jung, Insung; Koo, Lockjo; Wang, Gi-Nam

    2008-11-01

    The objective of this paper was to design a human bio-signal prediction system that reduces prediction error using a two-state-mapping-based time series neural network with back-propagation (BP). Neural network models trained in a supervised manner with the error back-propagation algorithm have been widely applied in industry for time series prediction. However, a residual error remains between the real values and the predicted results. We therefore designed a two-state neural network model that compensates for this residual error, which could be used in the prevention of sudden death and of metabolic syndrome diseases such as hypertension and obesity. We found that most of the simulation cases were handled satisfactorily by the two-state-mapping-based time series prediction model. In particular, for small-sample time series the proposed model was more accurate than the standard MLP model.

  5. Clinical time series prediction: towards a hierarchical dynamical system framework

    PubMed Central

    Liu, Zitao; Hauskrecht, Milos

    2014-01-01

    Objective Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding the patient's condition, the dynamics of a disease, the effects of various patient management interventions, and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Materials and methods Our hierarchical dynamical system framework for modeling clinical time series combines advantages of two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. Results We tested our framework by first learning the time series model from data for the patients in the training set, and then applying the model to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. Conclusion A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance.

  6. Modeling Persistence In Hydrological Time Series Using Fractional Differencing

    NASA Astrophysics Data System (ADS)

    Hosking, J. R. M.

    1984-12-01

    The class of autoregressive integrated moving average (ARIMA) time series models may be generalized by permitting the degree of differencing d to take fractional values. Models including fractional differencing are capable of representing persistent series (d > 0) or short-memory series (d = 0). The class of fractionally differenced ARIMA processes provides a more flexible way than has hitherto been available of simultaneously modeling the long-term and short-term behavior of a time series. In this paper some fundamental properties of fractionally differenced ARIMA processes are presented. Methods of simulating these processes are described. Estimation of the parameters of fractionally differenced ARIMA models is discussed, and an approximate maximum likelihood method is proposed. The methodology is illustrated by fitting fractionally differenced models to time series of streamflows and annual temperatures.
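
    The fractional difference operator (1 - B)^d has a binomial expansion whose coefficients obey a one-term recursion, so applying it takes only a few lines; this sketch truncates the expansion at the start of the series.

      import numpy as np

      def frac_diff(x, d):
          # Coefficients of (1 - B)^d: w[0] = 1, w[k] = w[k-1] * (k - 1 - d) / k.
          x = np.asarray(x, float)
          n = len(x)
          w = np.empty(n)
          w[0] = 1.0
          for k in range(1, n):
              w[k] = w[k - 1] * (k - 1 - d) / k
          # y[t] = sum over k of w[k] * x[t - k]
          return np.array([w[:t + 1] @ x[t::-1] for t in range(n)])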

  7. Wavelet analysis and scaling properties of time series

    NASA Astrophysics Data System (ADS)

    Manimaran, P.; Panigrahi, Prasanta K.; Parikh, Jitendra C.

    2005-10-01

    We propose a wavelet based method for the characterization of the scaling behavior of nonstationary time series. It makes use of the built-in ability of the wavelets for capturing the trends in a data set, in variable window sizes. Discrete wavelets from the Daubechies family are used to illustrate the efficacy of this procedure. After studying binomial multifractal time series with the present and earlier approaches of detrending for comparison, we analyze the time series of averaged spin density in the 2D Ising model at the critical temperature, along with several experimental data sets possessing multifractal behavior.

  8. Analyzing multiple nonlinear time series with extended Granger causality

    NASA Astrophysics Data System (ADS)

    Chen, Yonghong; Rangarajan, Govindan; Feng, Jianfeng; Ding, Mingzhou

    2004-04-01

    Identifying causal relations among simultaneously acquired signals is an important problem in multivariate time series analysis. For linear stochastic systems Granger proposed a simple procedure called the Granger causality to detect such relations. In this work we consider nonlinear extensions of Granger's idea and refer to the result as extended Granger causality. A simple approach implementing the extended Granger causality is presented and applied to multiple chaotic time series and other types of nonlinear signals. In addition, for situations with three or more time series we propose a conditional extended Granger causality measure that enables us to determine whether the causal relation between two signals is direct or mediated by another process.
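
    The linear core of the idea can be sketched as follows: fit two least-squares autoregressive models for y, with and without lagged x terms, and compare residual variances. The paper's extension applies the same comparison locally in a reconstructed state space, which this global linear sketch omits.

      import numpy as np

      def granger_improvement(x, y, p=2):
          # Positive result: the past of x helps predict y (x Granger-causes y).
          x, y = np.asarray(x, float), np.asarray(y, float)
          n = len(y)
          Y = y[p:]
          own = np.column_stack([y[p - k:n - k] for k in range(1, p + 1)])
          full = np.column_stack([own] +
                                 [x[p - k:n - k] for k in range(1, p + 1)])
          def resid_var(A):
              coef, *_ = np.linalg.lstsq(A, Y, rcond=None)
              return np.var(Y - A @ coef)
          return 1.0 - resid_var(full) / resid_var(own)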

  9. A Dimensionality Reduction Technique for Efficient Time Series Similarity Analysis

    PubMed Central

    Wang, Qiang; Megalooikonomou, Vasileios

    2008-01-01

    We propose a dimensionality reduction technique for time series analysis that significantly improves the efficiency and accuracy of similarity searches. In contrast to piecewise constant approximation (PCA) techniques that approximate each time series with constant value segments, the proposed method, Piecewise Vector Quantized Approximation, uses the closest (based on a distance measure) codeword from a codebook of key-sequences to represent each segment. The new representation is symbolic and it allows for the application of text-based retrieval techniques to time series similarity analysis. Experiments on real and simulated datasets show that the proposed technique generally outperforms PCA techniques in clustering and similarity searches. PMID:18496587

  10. Modelling road accidents: An approach using structural time series

    NASA Astrophysics Data System (ADS)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-09-01

    In this paper, the trend of road accidents in Malaysia for the years 2001 until 2012 was modelled using a structural time series approach. The structural time series model was identified using a stepwise method, and the residuals for each model were tested. The best-fitted model was chosen based on the smallest Akaike Information Criterion (AIC) and prediction error variance. In order to check the quality of the model, a data validation procedure was performed by predicting the monthly number of road accidents for the year 2012. Results indicate that the best specification of the structural time series model to represent road accidents is the local level with a seasonal model.

  11. Scalable Prediction of Energy Consumption using Incremental Time Series Clustering

    SciTech Connect

    Simmhan, Yogesh; Noor, Muhammad Usman

    2013-10-09

    Time series datasets are a canonical form of high velocity Big Data, and often generated by pervasive sensors, such as found in smart infrastructure. Performing predictive analytics on time series data can be computationally complex, and requires approximation techniques. In this paper, we motivate this problem using a real application from the smart grid domain. We propose an incremental clustering technique, along with a novel affinity score for determining cluster similarity, which help reduce the prediction error for cumulative time series within a cluster. We evaluate this technique, along with optimizations, using real datasets from smart meters, totaling ~700,000 data points, and show the efficacy of our techniques in improving the prediction error of time series data within polynomial time.

  12. Load Balancing Using Time Series Analysis for Soft Real Time Systems with Statistically Periodic Loads

    NASA Technical Reports Server (NTRS)

    Hailperin, Max

    1993-01-01

    This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that our techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. Our method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive methods.

  13. Joint accurate time and stable frequency distribution infrastructure sharing fiber footprint with research network

    NASA Astrophysics Data System (ADS)

    Vojtech, Josef; Slapak, Martin; Skoda, Pavel; Radil, Jan; Havlis, Ondrej; Altmann, Michal; Munster, Petr; Smotlacha, Vladimir; Kundrat, Jan; Velc, Radek; Altmannova, Lada; Hula, Miloslav

    2016-09-01

    In this paper, we present an infrastructure for accurate time and stable frequency distribution. It is based on sharing fibers of a research and educational network carrying data traffic. The transmission of accurate time and stable frequency mainly uses dedicated dark channels amplified by special bidirectional amplifiers with the same propagation path for both directions. The paper also addresses challenges associated with bidirectional transmission, namely directional non-reciprocities and interaction with parallel data transmissions.

  14. Quantifying Memory in Complex Physiological Time-Series

    PubMed Central

    Shirazi, Amir H.; Raoufy, Mohammad R.; Ebadi, Haleh; De Rui, Michele; Schiff, Sami; Mazloom, Roham; Hajizadeh, Sohrab; Gharibzadeh, Shahriar; Dehpour, Ahmad R.; Amodio, Piero; Jafari, G. Reza; Montagnese, Sara; Mani, Ali R.

    2013-01-01

    In a time-series, memory is a statistical feature that lasts for a period of time and distinguishes the time-series from a random, or memory-less, process. In the present study, the concept of “memory length” was used to define the time period, or scale over which rare events within a physiological time-series do not appear randomly. The method is based on inverse statistical analysis and provides empiric evidence that rare fluctuations in cardio-respiratory time-series are ‘forgotten’ quickly in healthy subjects while the memory for such events is significantly prolonged in pathological conditions such as asthma (respiratory time-series) and liver cirrhosis (heart-beat time-series). The memory length was significantly higher in patients with uncontrolled asthma compared to healthy volunteers. Likewise, it was significantly higher in patients with decompensated cirrhosis compared to those with compensated cirrhosis and healthy volunteers. We also observed that the cardio-respiratory system has simple low order dynamics and short memory around its average, and high order dynamics around rare fluctuations. PMID:24039811

  15. Quantifying memory in complex physiological time-series.

    PubMed

    Shirazi, Amir H; Raoufy, Mohammad R; Ebadi, Haleh; De Rui, Michele; Schiff, Sami; Mazloom, Roham; Hajizadeh, Sohrab; Gharibzadeh, Shahriar; Dehpour, Ahmad R; Amodio, Piero; Jafari, G Reza; Montagnese, Sara; Mani, Ali R

    2013-01-01

    In a time-series, memory is a statistical feature that lasts for a period of time and distinguishes the time-series from a random, or memory-less, process. In the present study, the concept of "memory length" was used to define the time period, or scale over which rare events within a physiological time-series do not appear randomly. The method is based on inverse statistical analysis and provides empiric evidence that rare fluctuations in cardio-respiratory time-series are 'forgotten' quickly in healthy subjects while the memory for such events is significantly prolonged in pathological conditions such as asthma (respiratory time-series) and liver cirrhosis (heart-beat time-series). The memory length was significantly higher in patients with uncontrolled asthma compared to healthy volunteers. Likewise, it was significantly higher in patients with decompensated cirrhosis compared to those with compensated cirrhosis and healthy volunteers. We also observed that the cardio-respiratory system has simple low order dynamics and short memory around its average, and high order dynamics around rare fluctuations.

  16. Graphical Data Analysis on the Circle: Wrap-Around Time Series Plots for (Interrupted) Time Series Designs.

    PubMed

    Rodgers, Joseph Lee; Beasley, William Howard; Schuelke, Matthew

    2014-01-01

    Many data structures, particularly time series data, are naturally seasonal, cyclical, or otherwise circular. Past graphical methods for time series have focused on linear plots. In this article, we move graphical analysis onto the circle. We focus on 2 particular methods, one old and one new. Rose diagrams are circular histograms and can be produced in several different forms using the RRose software system. In addition, we propose, develop, illustrate, and provide software support for a new circular graphical method, called Wrap-Around Time Series Plots (WATS Plots), which is a graphical method useful to support time series analyses in general but in particular in relation to interrupted time series designs. We illustrate the use of WATS Plots with an interrupted time series design evaluating the effect of the Oklahoma City bombing on birthrates in Oklahoma County during the 10 years surrounding the bombing of the Murrah Building in Oklahoma City. We compare WATS Plots with linear time series representations and overlay them with smoothing and error bands. Each method is shown to have advantages in relation to the other; in our example, the WATS Plots more clearly show the existence and effect size of the fertility differential.

  17. Searching for periodicity in weighted time point series.

    NASA Astrophysics Data System (ADS)

    Jetsu, L.; Pelt, J.

    1996-09-01

    Consistent statistics for two methods of searching for periodicity in a series of weighted time points are formulated. An approach based on the bootstrap method to estimate the accuracy of detected periodicity is presented.

  18. A probability distribution approach to synthetic turbulence time series

    NASA Astrophysics Data System (ADS)

    Sinhuber, Michael; Bodenschatz, Eberhard; Wilczek, Michael

    2016-11-01

    The statistical features of turbulence can be described in terms of multi-point probability density functions (PDFs). The complexity of these statistical objects increases rapidly with the number of points. This raises the question of how much information has to be incorporated into statistical models of turbulence to capture essential features such as inertial-range scaling and intermittency. Using high Reynolds number hot-wire data obtained at the Variable Density Turbulence Tunnel at the Max Planck Institute for Dynamics and Self-Organization, we establish a PDF-based approach to generating synthetic time series that reproduce those features. To do this, we measure three-point conditional PDFs from the experimental data and use an adaption-rejection method to draw random velocities from this distribution to produce synthetic time series. Analyzing these synthetic time series, we find that time series based on even low-dimensional conditional PDFs already capture some essential features of real turbulent flows.

  19. A mixed time series model of binomial counts

    NASA Astrophysics Data System (ADS)

    Khoo, Wooi Chen; Ong, Seng Huat

    2015-10-01

    Continuous time series modelling has been an active research area in the past few decades. However, time series data in the form of correlated counts appear in many situations, such as counts of rainy days and access downloads. Therefore, the study of count data has recently become popular in time series modelling. This article introduces a new mixture model, which is a univariate non-negative stationary time series model with binomial marginal distribution, arising from the combination of the well-known binomial thinning and Pegram's operators. A brief review of important properties is carried out and the EM algorithm is applied in parameter estimation. A numerical study is presented to show the performance of the model. Finally, a potential real application is presented to illustrate the advantage of the new mixture model.

  20. Distinguishing chaotic time series from noise: A random matrix approach

    NASA Astrophysics Data System (ADS)

    Ye, Bin; Chen, Jianxing; Ju, Chen; Li, Huijun; Wang, Xuesong

    2017-03-01

    Deterministically chaotic systems can often give rise to random and unpredictable behaviors which make the time series obtained from them almost indistinguishable from noise. Motivated by the fact that data points in a chaotic time series will have intrinsic correlations between them, we propose a random matrix theory (RMT) approach to identify the deterministic or stochastic dynamics of the system. We show that the spectral distributions of the correlation matrices, constructed from the chaotic time series, deviate significantly from the predictions of random matrix ensembles. On the contrary, the eigenvalue statistics for a noisy signal follow closely those of random matrix ensembles. Numerical results also indicate that the approach is to some extent robust to additive observational noise which pollutes the data in many practical situations. Our approach is efficient in recognizing the continuous chaotic dynamics underlying the evolution of the time series.
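
    A sketch of the spectral test, assuming the series is long enough to be cut into N non-overlapping windows of length L (this windowing construction is an assumption, not necessarily the one used in the paper): eigenvalues of the window correlation matrix that fall outside the Marchenko-Pastur band expected for pure noise suggest deterministic structure.

      import numpy as np

      def spectrum_vs_marchenko_pastur(x, N=50, L=500):
          # Assumes len(x) >= N * L and L >= N.
          x = np.asarray(x, float)
          segs = np.array([x[i * L:(i + 1) * L] for i in range(N)])
          segs = (segs - segs.mean(axis=1, keepdims=True)) \
                 / segs.std(axis=1, keepdims=True)
          C = segs @ segs.T / L                  # N x N correlation matrix
          eig = np.linalg.eigvalsh(C)
          q = N / L
          band = ((1 - np.sqrt(q)) ** 2, (1 + np.sqrt(q)) ** 2)
          return eig, band                       # eigenvalues outside band: signal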

  1. The use of synthetic input sequences in time series modeling

    NASA Astrophysics Data System (ADS)

    de Oliveira, Dair José; Letellier, Christophe; Gomes, Murilo E. D.; Aguirre, Luis A.

    2008-08-01

    In many situations time series models obtained from noise-like data settle to trivial solutions under iteration. This Letter proposes a way of producing a synthetic (dummy) input that is included to prevent the model from settling down to a trivial solution, while maintaining features of the original signal. Simulated benchmark models and a real time series of RR intervals from an ECG are used to illustrate the procedure.

  2. Crop growth dynamics modeling using time-series satellite imagery

    NASA Astrophysics Data System (ADS)

    Zhao, Yu

    2014-11-01

    In modern agriculture, remote sensing technology plays an essential role in monitoring crop growth and predicting crop yield. Accurate and timely crop growth information is significant for this purpose, particularly in large-scale farming. Because of the high cost and low availability of high-resolution satellite images such as RapidEye, we focus on time-series low-resolution satellite imagery. In this research, NDVI curves retrieved from MODIS 8-day 250 m surface reflectance images were applied to monitor soybean yield. Conventional models and vegetation indices for yield prediction have problems describing the basic growth processes that affect yield component formation. We therefore develop a novel method to model Crop Growth Dynamics (CGD) and generate a CGD index that describes the formation of soybean yield components. Starting from the standard growth stages of soybean, the model involves two key computations. The first is normalization of the NDVI-curve coordinates and division of crop growth into the standard development stages using effective accumulated temperature (EAT). The second is modeling of biological growth at each development stage by analyzing the factors of yield component formation. The evaluation was performed through soybean yield prediction using the CGD index in the growth stage when the whole modeling dataset is available; we obtained a precision of 88.5%, about 10% higher than the conventional method. The validation results show that the prediction accuracy of the CGD model is satisfactory and that it can be applied in practice for large-scale soybean yield monitoring.

  3. Prediction of Long-Memory Time Series: A Tutorial Review

    NASA Astrophysics Data System (ADS)

    Bhansali, R. J.; Kokoszka, P. S.

    Two different approaches, called Type-I and Type-II, to linear least-squares prediction of a long-memory time series are distinguished. In the former, no new theory is required and a long-memory time series is treated on par with a standard short-memory time series and its multistep predictions are obtained by using the existing modelling approaches to prediction of such time series. The latter, by contrast, seeks to model the long-memory stochastic characteristics of the observed time series by a fractional process such that its dth fractional difference, 0 < d < 0.5, follows a standard short-memory process. The various approaches to constructing long-memory stochastic models are reviewed, and the associated question of parameter estimation for these models is discussed. Having fitted a long-memory stochastic model to a time series, linear multi-step forecasts of its future values are constructed from the model itself. The question of how to evaluate the multistep prediction constants is considered and three different methods proposed for doing so are outlined; it is further noted that, under appropriate regularity conditions, these methods apply also to the class of linear long memory processes with infinite variance. In addition, a brief review of the class of non-linear chaotic maps implying long-memory is given.

  4. Time Series Analysis of Insar Data: Methods and Trends

    NASA Technical Reports Server (NTRS)

    Osmanoglu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cano-Cabral, Enrique

    2015-01-01

    Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.

  5. Visualizing frequent patterns in large multivariate time series

    NASA Astrophysics Data System (ADS)

    Hao, M.; Marwah, M.; Janetzko, H.; Sharma, R.; Keim, D. A.; Dayal, U.; Patnaik, D.; Ramakrishnan, N.

    2011-01-01

    The detection of previously unknown, frequently occurring patterns in time series, often called motifs, has been recognized as an important task. However, it is difficult to discover and visualize these motifs as their numbers increase, especially in large multivariate time series. To find frequent motifs, we use several temporal data mining and event encoding techniques to cluster and convert a multivariate time series to a sequence of events. Then we quantify the efficiency of the discovered motifs by linking them with a performance metric. To visualize frequent patterns in a large time series with potentially hundreds of nested motifs on a single display, we introduce three novel visual analytics methods: (1) motif layout, using colored rectangles for visualizing the occurrences and hierarchical relationships of motifs in a multivariate time series, (2) motif distortion, for enlarging or shrinking motifs as appropriate for easy analysis and (3) motif merging, to combine a number of identical adjacent motif instances without cluttering the display. Analysts can interactively optimize the degree of distortion and merging to get the best possible view. A specific motif (e.g., the most efficient or least efficient motif) can be quickly detected from a large time series for further investigation. We have applied these methods to two real-world data sets: data center cooling and oil well production. The results provide important new insights into the recurring patterns.

  6. GPS coordinate time series measurements in Ontario and Quebec, Canada

    NASA Astrophysics Data System (ADS)

    Samadi Alinia, Hadis; Tiampo, Kristy F.; James, Thomas S.

    2017-01-01

    New precise network solutions for continuous GPS (cGPS) stations distributed in eastern Ontario and western Québec provide constraints on the regional three-dimensional crustal velocity field. Five years of continuous observations at fourteen cGPS sites were analyzed using Bernese GPS processing software. Several different sub-networks were chosen from these stations, and the data were processed and compared in order to select the optimal configuration to accurately estimate the vertical and horizontal station velocities and minimize the associated errors. The coordinate time series were then compared to the crustal motions from global solutions, and the optimized solution is presented here. A noise model with power-law and white noise, which best describes the noise characteristics of all three components, was employed for the GPS time series analysis. The linear trend, associated uncertainties, and the spectral index of the power-law noise were calculated using a maximum likelihood estimation approach. The residual horizontal velocities, after removal of rigid plate motion, have a magnitude consistent with expected glacial isostatic adjustment (GIA). The vertical velocities increase from subsidence of almost 1.9 mm/year south of the Great Lakes to uplift near Hudson Bay, where the highest rate is approximately 10.9 mm/year. The residual horizontal velocities range from approximately 0.5 mm/year, oriented south-southeastward, at the Great Lakes to nearly 1.5 mm/year directed toward the interior of Hudson Bay at stations adjacent to its shoreline. Here, the velocity uncertainties are estimated at less than 0.6 mm/year for the horizontal component and 1.1 mm/year for the vertical component. A comparison between the observed velocities and GIA model predictions, for a limited range of Earth models, shows a better fit to the observations for the Earth model with the smallest upper mantle viscosity and the largest lower mantle viscosity. However, the

  7. A method for detecting changes in long time series

    SciTech Connect

    Downing, D.J.; Lawkins, W.F.; Morris, M.D.; Ostrouchov, G.

    1995-09-01

    Modern scientific activities, both physical and computational, can result in time series of many thousands or even millions of data values. Here the authors describe a statistically motivated algorithm for quick screening of very long time series data for the presence of potentially interesting but arbitrary changes. The basic data model is a stationary Gaussian stochastic process, and the approach to detecting a change is the comparison of two predictions of the series at a time point or contiguous collection of time points. One prediction is a "forecast", i.e., based on data from earlier times, while the other is a "backcast", i.e., based on data from later times. The statistic is the absolute value of the log-likelihood ratio for these two predictions, evaluated at the observed data. A conservative procedure is suggested for specifying critical values for the statistic under the null hypothesis of "no change".
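
    A simplified version of the forecast/backcast comparison, assuming least-squares AR(p) fits with Gaussian residuals in place of the paper's full stationary-Gaussian-process machinery; the window length w and model order p are illustrative.

      import numpy as np
      from scipy import stats

      def ar_fit_predict(h, p):
          # Fit h[t] ~ h[t-1..t-p] by least squares; predict the next value.
          Y = h[p:]
          A = np.column_stack([h[p - k:len(h) - k] for k in range(1, p + 1)])
          coef, *_ = np.linalg.lstsq(A, Y, rcond=None)
          sigma = (Y - A @ coef).std()
          pred = coef @ h[-1:-p - 1:-1]          # last p values, newest first
          return pred, sigma

      def change_statistic(x, t, p=3, w=100):
          x = np.asarray(x, float)
          f_pred, f_sig = ar_fit_predict(x[t - w:t], p)     # forecast from the past
          b_pred, b_sig = ar_fit_predict(x[t + w:t:-1], p)  # backcast from the future
          lf = stats.norm.logpdf(x[t], f_pred, f_sig)
          lb = stats.norm.logpdf(x[t], b_pred, b_sig)
          return abs(lf - lb)                    # large values flag a change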

  8. Symplectic geometry spectrum regression for prediction of noisy time series

    NASA Astrophysics Data System (ADS)

    Xie, Hong-Bo; Dokos, Socrates; Sivakumar, Bellie; Mengersen, Kerrie

    2016-05-01

    We present the symplectic geometry spectrum regression (SGSR) technique, as well as a regularized method based on SGSR, for prediction of nonlinear time series. The main tool of analysis is the symplectic geometry spectrum analysis, which decomposes a time series into the sum of a small number of independent and interpretable components. The key to successful regularization is to damp higher order symplectic geometry spectrum components. The effectiveness of SGSR and its superiority over local approximation using ordinary least squares are demonstrated through prediction of two noisy synthetic chaotic time series (Lorenz and Rössler series), and then tested for prediction of three real-world data sets (Mississippi River flow data and electromyographic and mechanomyographic signals recorded from the human body).

  9. Similarity estimators for irregular and age-uncertain time series

    NASA Astrophysics Data System (ADS)

    Rehfeld, K.; Kurths, J.

    2014-01-01

    Paleoclimate time series are often irregularly sampled and age uncertain, which is an important technical challenge to overcome for successful reconstruction of past climate variability and dynamics. Visual comparison and interpolation-based linear correlation approaches have been used to infer dependencies from such proxy time series. While the first is subjective, not measurable and not suitable for the comparison of many data sets at a time, the latter introduces interpolation bias, and both face difficulties if the underlying dependencies are nonlinear. In this paper we investigate similarity estimators that could be suitable for the quantitative investigation of dependencies in irregular and age-uncertain time series. We compare the Gaussian-kernel-based cross-correlation (gXCF, Rehfeld et al., 2011) and mutual information (gMI, Rehfeld et al., 2013) against their interpolation-based counterparts and the new event synchronization function (ESF). We test the efficiency of the methods in estimating coupling strength and coupling lag numerically, using ensembles of synthetic stalagmites with short, autocorrelated, linear and nonlinearly coupled proxy time series, and in the application to real stalagmite time series. In the linear test case, coupling strength increases are identified consistently for all estimators, while in the nonlinear test case the correlation-based approaches fail. The lag at which the time series are coupled is identified correctly as the maximum of the similarity functions in around 60-55% (in the linear case) to 53-42% (for the nonlinear processes) of the cases when the dating of the synthetic stalagmite is perfectly precise. If the age uncertainty increases beyond 5% of the time series length, however, the true coupling lag is not identified more often than the others for which the similarity function was estimated. Age uncertainty contributes up to half of the uncertainty in the similarity estimation process. Time series irregularity

  10. Similarity estimators for irregular and age uncertain time series

    NASA Astrophysics Data System (ADS)

    Rehfeld, K.; Kurths, J.

    2013-09-01

    Paleoclimate time series are often irregularly sampled and age uncertain, which is an important technical challenge to overcome for successful reconstruction of past climate variability and dynamics. Visual comparison and interpolation-based linear correlation approaches have been used to infer dependencies from such proxy time series. While the first is subjective, not measurable and not suitable for the comparison of many datasets at a time, the latter introduces interpolation bias, and both face difficulties if the underlying dependencies are nonlinear. In this paper we investigate similarity estimators that could be suitable for the quantitative investigation of dependencies in irregular and age uncertain time series. We compare the Gaussian-kernel based cross correlation (gXCF, Rehfeld et al., 2011) and mutual information (gMI, Rehfeld et al., 2013) against their interpolation-based counterparts and the new event synchronization function (ESF). We test the efficiency of the methods in estimating coupling strength and coupling lag numerically, using ensembles of synthetic stalagmites with short, autocorrelated, linear and nonlinearly coupled proxy time series, and in the application to real stalagmite time series. In the linear test case coupling strength increases are identified consistently for all estimators, while in the nonlinear test case the correlation-based approaches fail. The lag at which the time series are coupled is identified correctly as the maximum of the similarity functions in around 60-55% (in the linear case) to 53-42% (for the nonlinear processes) of the cases when the dating of the synthetic stalagmite is perfectly precise. If the age uncertainty increases beyond 5% of the time series length, however, the true coupling lag is not identified more often than the others for which the similarity function was estimated. Age uncertainty contributes up to half of the uncertainty in the similarity estimation process. Time series irregularity

  11. Correlation measure to detect time series distances, whence economy globalization

    NASA Astrophysics Data System (ADS)

    Miśkiewicz, Janusz; Ausloos, Marcel

    2008-11-01

    An instantaneous time series distance is defined through the equal-time correlation coefficient. The idea is applied to the Gross Domestic Product (GDP) yearly increments of 21 rich countries between 1950 and 2005 in order to test the process of economic globalization. Some data discussion is first presented to decide what (EKS, GK, or derived) GDP series should be studied. Distances are then calculated from the correlation coefficient values between pairs of series. The role of time averaging of the distances over finite size windows is discussed. Three network structures are next constructed based on the hierarchy of distances. It is shown that the mean distance between the most developed countries on several networks actually decreases in time, which we consider a proof of globalization. An empirical law is found for the evolution after 1990, similar to that found in flux creep. The optimal observation time window size is found to be ≃15 years.
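
    A sketch of the windowed distance computation, assuming the common convention d = sqrt(2(1 - rho)) for mapping a correlation coefficient rho to a distance; the paper defines its own distance and studies several window sizes.

      import numpy as np

      def rolling_corr_distance(a, b, window=15):
          # Equal-time correlation in a moving window, mapped to a distance.
          a, b = np.asarray(a, float), np.asarray(b, float)
          out = []
          for i in range(len(a) - window + 1):
              rho = np.corrcoef(a[i:i + window], b[i:i + window])[0, 1]
              out.append(np.sqrt(2.0 * (1.0 - rho)))
          return np.array(out)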

  12. Exploratory Causal Analysis in Bivariate Time Series Data

    NASA Astrophysics Data System (ADS)

    McCracken, James M.

    Many scientific disciplines rely on observational data of systems for which it is difficult (or impossible) to implement controlled experiments, and data analysis techniques are required for identifying causal information and relationships directly from observational data. This need has led to the development of many different time series causality approaches and tools, including transfer entropy, convergent cross-mapping (CCM), and Granger causality statistics. In this thesis, the existing time series causality method of CCM is extended by introducing a new method called pairwise asymmetric inference (PAI). It is found that CCM may provide counter-intuitive causal inferences for simple dynamics with strong intuitive notions of causality, and the CCM causal inference can be a function of physical parameters that are seemingly unrelated to the existence of a driving relationship in the system. For example, a CCM causal inference might alternate between "voltage drives current" and "current drives voltage" as the frequency of the voltage signal is changed in a series circuit with a single resistor and inductor. PAI is introduced to address both of these limitations. Many of the current approaches in the time series causality literature are not computationally straightforward to apply, do not follow directly from assumptions of probabilistic causality, depend on assumed models for the time series generating process, or rely on embedding procedures. A new approach, called causal leaning, is introduced in this work to avoid these issues. The leaning is found to provide causal inferences that agree with intuition for both simple systems and more complicated empirical examples, including space weather data sets. The leaning may provide a clearer interpretation of the results than those from existing time series causality tools. A practicing analyst can explore the literature to find many proposals for identifying drivers and causal connections in time series data.

  13. Heart rate time series characteristics for early detection of infections in critically ill patients.

    PubMed

    Tambuyzer, T; Guiza, F; Boonen, E; Meersseman, P; Vervenne, H; Hansen, T K; Bjerre, M; Van den Berghe, G; Berckmans, D; Aerts, J M; Meyfroidt, G

    2017-04-01

    It is difficult to make a distinction between inflammation and infection, and new strategies are therefore required to allow accurate detection of infection. Here, we hypothesize that we can distinguish infected from non-infected ICU patients based on dynamic features of serum cytokine concentrations and heart rate time series. Serum cytokine profiles and heart rate time series of 39 patients were available for this study. The serum concentrations of ten cytokines were measured using blood sampled every 10 min between 2100 and 0600 hours. Heart rate was recorded every minute. Ten metrics were used to extract features from these time series to obtain an accurate classification of infected patients. The predictive power of the metrics derived from the heart rate time series was investigated using decision tree analysis. Finally, logistic regression methods were used to examine whether classification performance improved with inclusion of features derived from the cytokine time series. The AUC of a decision tree based on two heart rate features was 0.88. The model had good calibration, with a Hosmer-Lemeshow p value of 0.09. There was no significant additional value in adding static cytokine levels or cytokine time series information to the generated decision tree model. The results suggest that heart rate is a better marker for infection than information captured by cytokine time series when the exact stage of infection is not known. The predictive value of (expensive) biomarkers should always be weighed against routinely monitored data, and such biomarkers have to demonstrate added value.
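
    As a schematic of the classification step, assuming hypothetical features (standard deviation and lag-1 autocorrelation stand in for the paper's ten heart-rate metrics) and a scikit-learn decision tree; the feature choices and model settings are illustrative, not those of the study.

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier
      from sklearn.metrics import roc_auc_score

      def hr_features(hr):
          # Two illustrative stand-ins for the paper's heart-rate metrics.
          hr = np.asarray(hr, float)
          return [hr.std(), np.corrcoef(hr[:-1], hr[1:])[0, 1]]

      def fit_and_score(X_train, y_train, X_test, y_test):
          # X: one feature row per patient; y: 1 = infected, 0 = not infected.
          model = DecisionTreeClassifier(max_depth=2).fit(X_train, y_train)
          return roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])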

  14. Modeling Non-Gaussian Time Series with Nonparametric Bayesian Model.

    PubMed

    Xu, Zhiguang; MacEachern, Steven; Xu, Xinyi

    2015-02-01

    We present a class of Bayesian copula models whose major components are the marginal (limiting) distribution of a stationary time series and the internal dynamics of the series. We argue that these are the two features with which an analyst is typically most familiar, and hence that these are natural components with which to work. For the marginal distribution, we use a nonparametric Bayesian prior distribution along with a cdf-inverse cdf transformation to obtain large support. For the internal dynamics, we rely on the traditionally successful techniques of normal-theory time series. Coupling the two components gives us a family of (Gaussian) copula transformed autoregressive models. The models provide coherent adjustments of time scales and are compatible with many extensions, including changes in volatility of the series. We describe basic properties of the models, show their ability to recover non-Gaussian marginal distributions, and use a GARCH modification of the basic model to analyze stock index return series. The models are found to provide better fit and improved short-range and long-range predictions than Gaussian competitors. The models are extensible to a large variety of fields, including continuous time models, spatial models, models for multiple series, models driven by external covariate streams, and non-stationary models.

  15. Evaluation of Scaling Invariance Embedded in Short Time Series

    PubMed Central

    Pan, Xue; Hou, Lei; Stephen, Mutua; Yang, Huijie; Zhu, Chenping

    2014-01-01

    Scaling invariance of time series has been making great contributions in diverse research fields. But how to evaluate the scaling exponent from a real-world series is still an open problem. The finite length of a time series may induce unacceptable fluctuation and bias in statistical quantities and the consequent invalidation of currently used standard methods. In this paper a new concept called correlation-dependent balanced estimation of diffusion entropy is developed to evaluate scale-invariance in very short time series with length ~10^2. Calculations with specified Hurst exponent values of 0.2, 0.3, ..., 0.9 show that by using the standard central moving average de-trending procedure this method can evaluate the scaling exponents for short time series with ignorable bias (≤0.03) and a sharp confidence interval (standard deviation ≤0.05). Considering the stride series from ten volunteers along an approximate oval path of a specified length, we observe that though the averages and deviations of scaling exponents are close, their evolutionary behaviors display rich patterns. It has potential use in analyzing physiological signals, detecting early warning signals, and so on. As an emphasis, our core contribution is that by means of the proposed method one can precisely estimate Shannon entropy from limited records. PMID:25549356

  16. Evaluation of scaling invariance embedded in short time series.

    PubMed

    Pan, Xue; Hou, Lei; Stephen, Mutua; Yang, Huijie; Zhu, Chenping

    2014-01-01

    Scaling invariance of time series has been making great contributions in diverse research fields. But how to evaluate the scaling exponent from a real-world series is still an open problem. Finite length of time series may induce unacceptable fluctuation and bias to statistical quantities and consequent invalidation of currently used standard methods. In this paper a new concept called correlation-dependent balanced estimation of diffusion entropy is developed to evaluate scale-invariance in very short time series with length ~10(2). Calculations with specified Hurst exponent values of 0.2,0.3,...,0.9 show that by using the standard central moving average de-trending procedure this method can evaluate the scaling exponents for short time series with ignorable bias (≤0.03) and sharp confidential interval (standard deviation ≤0.05). Considering the stride series from ten volunteers along an approximate oval path of a specified length, we observe that though the averages and deviations of scaling exponents are close, their evolutionary behaviors display rich patterns. It has potential use in analyzing physiological signals, detecting early warning signals, and so on. As an emphasis, our core contribution is that by means of the proposed method one can precisely estimate the Shannon entropy from limited records.
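
    For orientation, a rough sketch of plain diffusion entropy analysis on synthetic data; the correlation-dependent balanced estimator and the central-moving-average de-trending that the paper actually proposes are not reproduced, and plain DEA is noisy at this series length, which is precisely what motivates the paper's method.

    ```python
    # Hedged sketch of plain diffusion entropy analysis (DEA), not the paper's
    # balanced estimator: S(l) grows like delta*ln(l) for a self-affine series.
    import numpy as np

    def diffusion_entropy(xi, scales):
        """Differential entropy of the diffusion (windowed-sum) pdf at each scale."""
        S = []
        for l in scales:
            disp = np.array([xi[i:i + l].sum() for i in range(len(xi) - l + 1)])
            hist, edges = np.histogram(disp, bins=15, density=True)
            w, mask = np.diff(edges), hist > 0
            S.append(-np.sum(hist[mask] * w[mask] * np.log(hist[mask])))
        return np.array(S)

    xi = np.random.default_rng(2).standard_normal(100)   # short series, length ~10^2
    scales = np.arange(2, 25)
    delta = np.polyfit(np.log(scales), diffusion_entropy(xi, scales), 1)[0]
    print("estimated delta:", round(delta, 2))           # ~0.5 for uncorrelated noise
    ```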

  17. Accurate marker-free alignment with simultaneous geometry determination and reconstruction of tilt series in electron tomography.

    PubMed

    Winkler, Hanspeter; Taylor, Kenneth A

    2006-02-01

    An image alignment method for electron tomography is presented which is based on cross-correlation techniques and which includes a simultaneous refinement of the tilt geometry. A coarsely aligned tilt series is iteratively refined with a procedure consisting of two steps for each cycle: area matching and subsequent geometry correction. The first step, area matching, brings into register equivalent specimen regions in all images of the tilt series. It determines four parameters of a linear two-dimensional transformation, not just translation and rotation as is done during the preceding coarse alignment with conventional methods. The refinement procedure also differs from earlier methods in that the alignment references are now computed from already aligned images by reprojection of a backprojected volume. The second step, geometry correction, refines the initially inaccurate estimates of the geometrical parameters, including the direction of the tilt axis, a tilt angle offset, and the inclination of the specimen with respect to the support film or specimen holder. The correction values serve as an indicator for the progress of the refinement. For each new iteration, the correction values are used to compute an updated set of geometry parameters by a least squares fit. Model calculations show that it is essential to refine the geometrical parameters as well as the accurate alignment of the images to obtain a faithful map of the original structure.
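
    A sketch of just the translational core of the area-matching step, under simplifying assumptions (a pure integer shift, no linear transformation, reprojection, or geometry refinement): FFT-based cross-correlation recovers the displacement between an image and its reference.

    ```python
    # Hedged sketch: FFT cross-correlation registration against a reference,
    # recovering only a translation (the paper refines a full linear 2-D
    # transformation plus the tilt geometry, which is not shown here).
    import numpy as np

    def register_translation(ref, img):
        """Return the (row, col) shift that maps ref onto img."""
        cc = np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(ref))).real
        peak = np.unravel_index(np.argmax(cc), cc.shape)
        # Wrap shifts larger than half the image size to negative offsets.
        return [p if p <= s // 2 else p - s for p, s in zip(peak, cc.shape)]

    ref = np.random.default_rng(3).random((64, 64))
    img = np.roll(ref, shift=(5, -3), axis=(0, 1))       # known displacement
    print(register_translation(ref, img))                # -> [5, -3]
    ```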

  18. Statistical modelling of agrometeorological time series by exponential smoothing

    NASA Astrophysics Data System (ADS)

    Murat, Małgorzata; Malinowska, Iwona; Hoffmann, Holger; Baranowski, Piotr

    2016-01-01

    Meteorological time series are used in modelling agrophysical processes of the soil-plant-atmosphere system which determine plant growth and yield. Additionally, long-term meteorological series are used in climate change scenarios. Such studies often require forecasting or projection of meteorological variables, e.g. the projection of the occurrence of extreme events. The aim of the article was to determine the most suitable exponential smoothing models to generate forecasts using data on air temperature, wind speed, and precipitation time series in Jokioinen (Finland), Dikopshof (Germany), Lleida (Spain), and Lublin (Poland). These series exhibit regular additive seasonality or non-seasonality without any trend, which is confirmed by their autocorrelation functions and partial autocorrelation functions. The most suitable models were indicated by the smallest mean absolute error and the smallest root mean squared error.
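
    A hedged sketch of the model-selection idea with statsmodels, where synthetic monthly temperatures stand in for the station data: fit smoothing variants and compare mean absolute error and root mean squared error.

    ```python
    # Hedged sketch: compare exponential smoothing variants by MAE and RMSE
    # on a toy seasonal temperature series (not the stations' actual data).
    import numpy as np
    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    rng = np.random.default_rng(4)
    t = np.arange(240)                                   # 20 years of monthly data
    temps = 10 + 8 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1.5, t.size)

    fits = {
        "simple": ExponentialSmoothing(temps).fit(),
        "additive seasonal": ExponentialSmoothing(
            temps, seasonal="add", seasonal_periods=12).fit(),
    }
    for name, fit in fits.items():
        resid = temps - fit.fittedvalues
        print(f"{name}: MAE={np.mean(np.abs(resid)):.2f}, "
              f"RMSE={np.sqrt(np.mean(resid**2)):.2f}")
    ```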

  19. Self-affinity in the dengue fever time series

    NASA Astrophysics Data System (ADS)

    Azevedo, S. M.; Saba, H.; Miranda, J. G. V.; Filho, A. S. Nascimento; Moret, M. A.

    2016-06-01

    Dengue is a complex public health problem that is common in tropical and subtropical regions. This disease has risen substantially in the last three decades, and the occurrences of reported dengue cases in Bahia, Brazil, display self-affine behavior. This study uses detrended fluctuation analysis (DFA) to verify the scale behavior in a time series of dengue cases and to evaluate the long-range correlations that are characterized by the power law α exponent for different cities in Bahia, Brazil. The scaling exponent (α) presents different long-range correlations, i.e. uncorrelated, anti-persistent, persistent and diffusive behaviors. The long-range correlations highlight the complex behavior of the time series of this disease. The findings show that there are two distinct types of scale behavior. In the first, the time series presents a persistent α exponent over a one-month period. For longer periods, the time series signal approaches subdiffusive behavior. The hypothesis of long-range correlations in the time series of the occurrences of reported dengue cases was validated. The observed self-affinity is useful as a forecasting tool for future periods through extrapolation of the α exponent behavior. This complex system has a higher predictability over relatively short times (approximately one month), and it suggests a new tool for epidemiological control strategies. However, predictions over longer periods using DFA are hindered by the subdiffusive behavior.
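
    A minimal sketch of standard first-order DFA for the scaling exponent α, run on synthetic data; the study's preprocessing of the dengue counts is not reproduced.

    ```python
    # Hedged sketch of first-order detrended fluctuation analysis (DFA).
    import numpy as np

    def dfa_alpha(x, scales):
        y = np.cumsum(x - np.mean(x))                    # integrated profile
        F = []
        for s in scales:
            n = len(y) // s
            segs = y[:n * s].reshape(n, s)
            t = np.arange(s)
            # Detrend each segment with a linear least-squares fit.
            rms = [np.sqrt(np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t))**2))
                   for seg in segs]
            F.append(np.mean(rms))
        return np.polyfit(np.log(scales), np.log(F), 1)[0]

    x = np.random.default_rng(5).standard_normal(4000)   # white noise -> alpha ~ 0.5
    scales = np.unique(np.logspace(1, 3, 10).astype(int))
    print("alpha:", round(dfa_alpha(x, scales), 2))
    ```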

  20. Stochastic modeling of hourly rainfall times series in Campania (Italy)

    NASA Astrophysics Data System (ADS)

    Giorgio, M.; Greco, R.

    2009-04-01

    Occurrence of flowslides and floods in small catchments is difficult to predict, since it is affected by a number of variables, such as mechanical and hydraulic soil properties, slope morphology, vegetation coverage, and rainfall spatial and temporal variability. Consequently, landslide risk assessment procedures and early warning systems still rely on simple empirical models based on correlation between recorded rainfall data and observed landslides and/or river discharges. Effectiveness of such systems could be improved by reliable quantitative rainfall prediction, which can allow gaining larger lead-times. Analysis of on-site recorded rainfall height time series represents the most effective approach for a reliable prediction of the local temporal evolution of rainfall. Hydrological time series analysis is a widely studied field in hydrology, often carried out by means of autoregressive models, such as AR, ARMA, ARX, ARMAX (e.g. Salas [1992]). Such models gave the best results when applied to the analysis of autocorrelated hydrological time series, like river flow or level time series. Conversely, they are not able to model the behaviour of intermittent time series, like point rainfall height series usually are, especially when recorded with short sampling time intervals. More useful for this issue are the so-called DRIP (Disaggregated Rectangular Intensity Pulse) and NSRP (Neyman-Scott Rectangular Pulse) models [Heneker et al., 2001; Cowpertwait et al., 2002], usually adopted to generate synthetic point rainfall series. In this paper, the DRIP model approach is adopted, in which the sequence of rain storms and dry intervals constituting the structure of the rainfall time series is modeled as an alternating renewal process. The final aim of the study is to provide a useful tool to implement an early warning system for hydrogeological risk management. Model calibration has been carried out with hourly rainfall height data provided by the rain gauges of Campania Region civil
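
    A toy sketch of the alternating renewal idea, with placeholder exponential distributions for storm/dry durations and intensities rather than the calibrated DRIP distributions.

    ```python
    # Hedged sketch of an alternating renewal process for rainfall structure;
    # all distribution choices below are placeholders, not DRIP calibration.
    import numpy as np

    rng = np.random.default_rng(6)

    def simulate_rainfall(hours, mean_wet=5.0, mean_dry=30.0, mean_intensity=2.0):
        """Hourly rainfall series built from alternating wet and dry intervals."""
        rain, wet = [], False
        while len(rain) < hours:
            length = max(1, int(rng.exponential(mean_wet if wet else mean_dry)))
            value = rng.exponential(mean_intensity) if wet else 0.0
            rain.extend([value] * length)
            wet = not wet
        return np.array(rain[:hours])

    series = simulate_rainfall(24 * 365)
    print("wet fraction:", round(np.mean(series > 0), 2))
    ```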

  1. Time-series modeling of long-term weight self-monitoring data.

    PubMed

    Helander, Elina; Pavel, Misha; Jimison, Holly; Korhonen, Ilkka

    2015-08-01

    Long-term self-monitoring of weight is beneficial for weight maintenance, especially after weight loss. Connected weight scales accumulate time series information over the long term and hence enable time series analysis of the data. The analysis can reveal individual patterns, provide more sensitive detection of significant weight trends, and enable more accurate and timely prediction of weight outcomes. However, long-term self-weighing data poses several challenges that complicate the analysis; in particular, irregular sampling, missing data, and periodic (e.g. diurnal and weekly) patterns are common. In this study, we apply a time series modeling approach to daily weight time series from two individuals and describe the information that can be extracted from this kind of data. We study the properties of weight time series data, missing data and its link to individual behavior, periodic patterns, and weight series segmentation. Being able to understand behavior through weight data and give relevant feedback is expected to lead to positive interventions on health behaviors.
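
    A small sketch of handling the irregular sampling and missing data mentioned above, using pandas on synthetic self-weighing data; the segmentation and periodic-pattern analyses are beyond this snippet.

    ```python
    # Hedged sketch: put an irregularly sampled weighing log on a daily grid,
    # exposing missing days and a smoothed trend (synthetic data throughout).
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(7)
    days = np.sort(rng.choice(pd.date_range("2015-01-01", periods=120), 80,
                              replace=False))
    weights = 82 - 0.02 * np.arange(80) + rng.normal(0, 0.4, 80)
    s = pd.Series(weights, index=pd.DatetimeIndex(days))

    daily = s.resample("D").mean()                   # daily grid, NaN where missing
    print("missing days:", int(daily.isna().sum()))
    trend = daily.interpolate().rolling(7, center=True).mean()  # weekly smoother
    print(trend.dropna().head())
    ```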

  2. Compounding approach for univariate time series with nonstationary variances

    NASA Astrophysics Data System (ADS)

    Schäfer, Rudi; Barkhofen, Sonja; Guhr, Thomas; Stöckmann, Hans-Jürgen; Kuhl, Ulrich

    2015-12-01

    A defining feature of nonstationary systems is the time dependence of their statistical parameters. Measured time series may exhibit Gaussian statistics on short time horizons, due to the central limit theorem. The sample statistics for long time horizons, however, averages over the time-dependent variances. To model the long-term statistical behavior, we compound the local distribution with the distribution of its parameters. Here, we consider two concrete, but diverse, examples of such nonstationary systems: the turbulent air flow of a fan and a time series of foreign exchange rates. Our main focus is to empirically determine the appropriate parameter distribution for the compounding approach. To this end, we extract the relevant time scales by decomposing the time signals into windows and determine the distribution function of the thus obtained local variances.
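
    A toy sketch of the compounding viewpoint on synthetic data: locally Gaussian windows with time-dependent variance produce a heavy-tailed long-horizon distribution, visible in the excess kurtosis.

    ```python
    # Hedged sketch: local variances in short windows vs. heavy tails of the
    # aggregate statistics, the effect the compounding approach models.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(8)
    # Toy nonstationary series: locally Gaussian, slowly varying volatility.
    sigma = np.exp(0.5 * np.sin(np.linspace(0, 20, 20000)))
    x = sigma * rng.standard_normal(20000)

    local_var = x.reshape(-1, 100).var(axis=1)       # variances of 100-point windows
    print("spread of local variances:", round(local_var.std(), 2))

    # Excess kurtosis > 0 signals tails heavier than a single Gaussian.
    print("excess kurtosis of full series:", round(stats.kurtosis(x), 2))
    ```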

  3. Multiscale multifractal diffusion entropy analysis of financial time series

    NASA Astrophysics Data System (ADS)

    Huang, Jingjing; Shang, Pengjian

    2015-02-01

    This paper introduces a multiscale multifractal diffusion entropy analysis (MMDEA) method to analyze long-range correlations and then applies this method to stock index series. The method combines the techniques of diffusion process and Rényi entropy to focus on the scaling behaviors of stock index series across multiple scales, which allows us to extend the description of stock index variability to include the dependence on the magnitude of the variability and the time scale. Compared to multifractal diffusion entropy analysis, the MMDEA can show more details of scale properties and provide a more reliable analysis. In this paper, we concentrate not only on the fact that the stock index series has multifractal properties but also that these properties depend on the time scale in which the multifractality is measured. This time scale is related to the frequency band of the signal. We find that stock index variability appears to be far more complex than reported in studies using a fixed time scale.

  4. Generalized Dynamic Factor Models for Mixed-Measurement Time Series

    PubMed Central

    Cui, Kai; Dunson, David B.

    2013-01-01

    In this article, we propose generalized Bayesian dynamic factor models for jointly modeling mixed-measurement time series. The framework allows mixed-scale measurements associated with each time series, with different measurements having different distributions in the exponential family conditionally on time-varying latent factor(s). Efficient Bayesian computational algorithms are developed for posterior inference on both the latent factors and model parameters, based on a Metropolis-Hastings algorithm with adaptive proposals. The algorithm relies on a Greedy Density Kernel Approximation (GDKA) and parameter expansion with latent factor normalization. We tested the framework and algorithms in simulated studies and applied them to the analysis of intertwined credit and recovery risk for Moody’s rated firms from 1982–2008, illustrating the importance of jointly modeling mixed-measurement time series. The article has supplemental materials available online. PMID:24791133

  5. Generalized Dynamic Factor Models for Mixed-Measurement Time Series.

    PubMed

    Cui, Kai; Dunson, David B

    2014-02-12

    In this article, we propose generalized Bayesian dynamic factor models for jointly modeling mixed-measurement time series. The framework allows mixed-scale measurements associated with each time series, with different measurements having different distributions in the exponential family conditionally on time-varying latent factor(s). Efficient Bayesian computational algorithms are developed for posterior inference on both the latent factors and model parameters, based on a Metropolis-Hastings algorithm with adaptive proposals. The algorithm relies on a Greedy Density Kernel Approximation (GDKA) and parameter expansion with latent factor normalization. We tested the framework and algorithms in simulated studies and applied them to the analysis of intertwined credit and recovery risk for Moody's rated firms from 1982-2008, illustrating the importance of jointly modeling mixed-measurement time series. The article has supplemental materials available online.

  6. A refined fuzzy time series model for stock market forecasting

    NASA Astrophysics Data System (ADS)

    Jilani, Tahseen Ahmed; Burney, Syed Muhammad Aqil

    2008-05-01

    Time series models have been used to make predictions of stock prices, academic enrollments, weather, road accident casualties, etc. In this paper we present a simple time-variant fuzzy time series forecasting method. The proposed method uses a heuristic approach to define frequency-density-based partitions of the universe of discourse. We have proposed a fuzzy metric to use the frequency-density-based partitioning. The proposed fuzzy metric also uses a trend predictor to calculate the forecast. The new method is applied to forecasting the TAIEX and the enrollments of the University of Alabama. It is shown that the proposed method works with higher accuracy than other fuzzy time series methods developed for forecasting the TAIEX and enrollments of the University of Alabama.
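
    For orientation, a sketch of a basic first-order fuzzy time series forecaster with equal-width partitions (Chen's classic scheme); the paper's frequency-density-based partitioning and trend predictor are not reproduced. The enrollment values are the University of Alabama benchmark as commonly reported in this literature.

    ```python
    # Hedged sketch of a first-order fuzzy time series forecast with
    # equal-width partitions, not the paper's method.
    import numpy as np

    def fuzzy_forecast(series, n_intervals=7):
        lo, hi = series.min() - 1, series.max() + 1
        edges = np.linspace(lo, hi, n_intervals + 1)
        mids = (edges[:-1] + edges[1:]) / 2
        labels = np.clip(np.digitize(series, edges) - 1, 0, n_intervals - 1)
        # Fuzzy logical relationships: which states follow each state.
        flr = {i: set() for i in range(n_intervals)}
        for a, b in zip(labels[:-1], labels[1:]):
            flr[a].add(b)
        nxt = flr[labels[-1]]
        return mids[list(nxt)].mean() if nxt else mids[labels[-1]]

    enroll = np.array([13055, 13563, 13867, 14696, 15460, 15311, 15603,
                       15861, 16807, 16919, 16388, 15433, 15497, 15145])
    print("next-step forecast:", round(fuzzy_forecast(enroll), 0))
    ```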

  7. Time-accurate unsteady aerodynamic and aeroelastic calculations for wings using Euler equations

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.

    1988-01-01

    A time-accurate approach to simultaneously solve the Euler flow equations and modal structural equations of motion is presented for computing aeroelastic responses of wings. The Euler flow equations are solved by a time-accurate finite difference scheme with dynamic grids. The coupled aeroelastic equations of motion are solved using the linear acceleration method. The aeroelastic configuration adaptive dynamic grids are time-accurately generated using the aeroelastically deformed shape of the wing. The unsteady flow calculations are validated with experiment, both for a semi-infinite wing and a wall-mounted cantilever rectangular wing. Aeroelastic responses are computed for a rectangular wing using the modal data generated by the finite-element method. The robustness of the present approach in computing unsteady flows and aeroelastic responses that are beyond the capability of earlier approaches using the potential equations is demonstrated.
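
    A sketch of the linear acceleration method (the Newmark scheme with gamma = 1/2, beta = 1/6) on a single modal equation m q'' + c q' + k q = f(t); the coupling to the Euler solver is of course not reproduced.

    ```python
    # Hedged sketch: linear acceleration (Newmark gamma=1/2, beta=1/6) time
    # integration of one structural mode, not the paper's coupled solver.
    import numpy as np

    def linear_acceleration(m, c, k, f, dt, n, q0=0.0, v0=0.0):
        beta, gamma = 1/6, 1/2
        q, v = q0, v0
        a = (f(0.0) - c * v - k * q) / m                  # initial acceleration
        keff = m / (beta * dt**2) + gamma * c / (beta * dt) + k
        out = [q]
        for i in range(1, n):
            rhs = (f(i * dt)
                   + m * (q / (beta * dt**2) + v / (beta * dt) + (1/(2*beta) - 1) * a)
                   + c * (gamma * q / (beta * dt) + (gamma/beta - 1) * v
                          + dt * (gamma/(2*beta) - 1) * a))
            qn = rhs / keff
            vn = (gamma * (qn - q) / (beta * dt) + (1 - gamma/beta) * v
                  + dt * (1 - gamma/(2*beta)) * a)
            a = (qn - q) / (beta * dt**2) - v / (beta * dt) - (1/(2*beta) - 1) * a
            q, v = qn, vn
            out.append(q)
        return np.array(out)

    # Free vibration of an undamped mode should stay close to cos(w*t).
    w = 2 * np.pi
    q = linear_acceleration(1.0, 0.0, w**2, lambda t: 0.0, dt=0.01, n=200, q0=1.0)
    print("q at t = 1 (one period):", round(q[100], 3))   # ~1.0
    ```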

  8. An introduction to chaotic and random time series analysis

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey D.

    1989-01-01

    The origin of chaotic behavior and the relation of chaos to randomness are explained. Two mathematical results are described: (1) a representation theorem guarantees the existence of a specific time-domain model for chaos and addresses the relation between chaotic, random, and strictly deterministic processes; (2) a theorem assures that information on the behavior of a physical system in its complete state space can be extracted from time-series data on a single observable. Focus is placed on an important connection between the dynamical state space and an observable time series. These two results lead to a practical deconvolution technique combining standard random process modeling methods with new embedding techniques.
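
    A minimal sketch of the state-space reconstruction licensed by the embedding theorem mentioned above: delay vectors are built from a single observable, here the logistic map as a hypothetical stand-in.

    ```python
    # Hedged sketch of time-delay embedding from one observable.
    import numpy as np

    def delay_embed(x, dim, tau):
        """Rows are delay vectors [x(t), x(t+tau), ..., x(t+(dim-1)*tau)]."""
        n = len(x) - (dim - 1) * tau
        return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

    # Observable from the logistic map in its chaotic regime.
    x = np.empty(1000); x[0] = 0.4
    for i in range(999):
        x[i + 1] = 3.9 * x[i] * (1 - x[i])

    emb = delay_embed(x, dim=3, tau=1)
    print(emb.shape)        # (998, 3) reconstructed state vectors
    ```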

  9. Degree-Pruning Dynamic Programming Approaches to Central Time Series Minimizing Dynamic Time Warping Distance.

    PubMed

    Sun, Tao; Liu, Hongbo; Yu, Hong; Chen, C L Philip

    2016-06-28

    The central time series crystallizes the common patterns of the set it represents. In this paper, we propose a global constrained degree-pruning dynamic programming (g(dp)²) approach to obtain the central time series through minimizing the dynamic time warping (DTW) distance between two time series. The DTW matching path theory with global constraints is proved theoretically for our degree-pruning strategy, which helps reduce the time complexity and computational cost. Our approach can achieve the optimal solution between two time series. An approximate method for the central time series of multiple time series [denoted m_g(dp)²] is presented based on DTW barycenter averaging and our g(dp)² approach, using a hierarchical merging strategy. As illustrated by the experimental results, our approaches provide better within-group sum of squares and robustness than other relevant algorithms.
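
    For reference, a sketch of plain dynamic-programming DTW distance; the paper's global constraints and degree pruning are not implemented here.

    ```python
    # Hedged sketch of unconstrained DTW distance between two series.
    import numpy as np

    def dtw(a, b):
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    a = np.sin(np.linspace(0, 2 * np.pi, 50))
    b = np.sin(np.linspace(0, 2 * np.pi, 60))    # same shape, different length
    print("DTW distance:", round(dtw(a, b), 3))
    ```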

  10. Wavelet analysis for non-stationary, nonlinear time series

    NASA Astrophysics Data System (ADS)

    Schulte, Justin A.

    2016-08-01

    Methods for detecting and quantifying nonlinearities in nonstationary time series are introduced and developed. In particular, higher-order wavelet analysis was applied to an ideal time series and the quasi-biennial oscillation (QBO) time series. Multiple-testing problems inherent in wavelet analysis were addressed by controlling the false discovery rate. A new local autobicoherence spectrum facilitated the detection of local nonlinearities and the quantification of cycle geometry. The local autobicoherence spectrum of the QBO time series showed that the QBO time series contained a mode with a period of 28 months that was phase coupled to a harmonic with a period of 14 months. An additional nonlinearly interacting triad was found among modes with periods of 10, 16 and 26 months. Local biphase spectra determined that the nonlinear interactions were not quadratic and that the effect of the nonlinearities was to produce non-smoothly varying oscillations. The oscillations were found to be skewed so that negative QBO regimes were preferred, and also asymmetric in the sense that phase transitions between the easterly and westerly phases occurred more rapidly than those from westerly to easterly regimes.

  11. Time Series Analysis Based on Running Mann Whitney Z Statistics

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
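
    A hedged sketch of the general idea on synthetic data: a Mann-Whitney U test over a moving window, converted to a Z-like score via the normal approximation; the TEKTRAN method's exact sampling and normalization may differ.

    ```python
    # Hedged sketch: running Mann-Whitney comparison of each window's halves,
    # expressed as |Z| scores; large values flag a change in the series.
    import numpy as np
    from scipy.stats import mannwhitneyu, norm

    def running_mw_z(x, window=40):
        half = window // 2
        z = []
        for i in range(len(x) - window + 1):
            a, b = x[i:i + half], x[i + half:i + window]
            p = mannwhitneyu(a, b, alternative="two-sided").pvalue
            z.append(norm.isf(p / 2))          # two-sided p -> |Z|
        return np.array(z)

    rng = np.random.default_rng(9)
    x = np.concatenate([rng.normal(0, 1, 100), rng.normal(1, 1, 100)])  # step change
    z = running_mw_z(x)
    print("max |Z| near the change point:", round(z.max(), 2))
    ```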

  12. Nonlinear Analysis of Surface EMG Time Series of Back Muscles

    NASA Astrophysics Data System (ADS)

    Dolton, Donald C.; Zurcher, Ulrich; Kaufman, Miron; Sung, Paul

    2004-10-01

    A nonlinear analysis of surface electromyography time series of subjects with and without low back pain is presented. The mean-square displacement and entropy show anomalous diffusive behavior in the intermediate time range 10 ms < t < 1 s. This behavior implies the presence of correlations in the signal. We discuss the shape of the power spectrum of the signal.

  13. Long-range correlations in time series generated by time-fractional diffusion: A numerical study

    NASA Astrophysics Data System (ADS)

    Barbieri, Davide; Vivoli, Alessandro

    2005-09-01

    Time series models showing power law tails in autocorrelation functions are common in econometrics. A special non-Markovian model for such kind of time series is provided by the random walk introduced by Gorenflo et al. as a discretization of time fractional diffusion. The time series so obtained are analyzed here from a numerical point of view in terms of autocorrelations and covariance matrices.

  14. MODIS Vegetation Indices time series improvement considering real acquisition dates

    NASA Astrophysics Data System (ADS)

    Testa, S.; Borgogno Mondino, E.

    2013-12-01

    Satellite Vegetation Indices (VI) time series images are widely used for the characterization of phenology, which requires a high temporal accuracy of the satellite data. The present work is based on the MODerate resolution Imaging Spectroradiometer (MODIS) MOD13Q1 product - Vegetation Indices 16-Day L3 Global 250m, which is generated through a maximum value compositing process that reduces the number of cloudy pixels and excludes, when possible, off-nadir ones. Because of its 16-day compositing period, the distance between two adjacent-in-time values within each pixel NDVI time series can range from 1 to 32 days, which is not acceptable for phenologic studies. Moreover, most of the available smoothing algorithms, which are widely used for phenology characterization, assume that data points are equidistant in time and contemporary over the image. The objective of this work was to assess temporal features of NDVI time series over a test area, composed of Castanea sativa (chestnut) and Fagus sylvatica (beech) pure pixels within the Piemonte region in Northwestern Italy. Firstly, NDVI, Pixel Reliability (PR) and Composite Day of the Year (CDOY) data ranging from 2000 to 2011 were extracted from MOD13Q1 and the corresponding time series were generated (in further computations, 2000 was not considered since it is not complete, because acquisition began in February, and calibration is unreliable until October). Analysis of the CDOY time series (containing the actual reference date of each NDVI value) over the selected study areas showed NDVI values to be prevalently generated from data acquired at the centre of each 16-day period (the 9th day), at least constantly along the year. This leads to considering each original NDVI value as nominally placed at the centre of its 16-day reference period. Then, a new NDVI time series was generated: a) moving each NDVI value to its actual "acquisition" date, b) interpolating the obtained temporary time series through SPLINE functions, c) sampling such
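
    A small sketch of steps a) and b) on synthetic data: move each composite NDVI value to a hypothetical actual acquisition day and interpolate a daily series with a spline.

    ```python
    # Hedged sketch: relocate composite values to their acquisition dates and
    # spline-interpolate, using made-up NDVI values and composite days.
    import numpy as np
    from scipy.interpolate import CubicSpline

    rng = np.random.default_rng(10)
    nominal = np.arange(9, 365, 16)                        # centre of each 16-day window
    actual = nominal + rng.integers(-7, 8, nominal.size)   # composite day of the year
    ndvi = (0.5 + 0.3 * np.sin(2 * np.pi * (actual - 120) / 365)
            + rng.normal(0, 0.02, actual.size))

    spline = CubicSpline(actual, ndvi)                     # step b) interpolation
    daily = spline(np.arange(actual.min(), actual.max() + 1))
    print("daily samples:", daily.size)
    ```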

  15. Improvements to surrogate data methods for nonstationary time series.

    PubMed

    Lucio, J H; Valdés, R; Rodríguez, L R

    2012-05-01

    The method of surrogate data has been extensively applied to hypothesis testing of system linearity, when only one realization of the system, a time series, is known. Normally, surrogate data should preserve the linear stochastic structure and the amplitude distribution of the original series. Classical surrogate data methods (such as random permutation, amplitude adjusted Fourier transform, or iterative amplitude adjusted Fourier transform) are successful at preserving one or both of these features in stationary cases. However, they always produce stationary surrogates, hence existing nonstationarity could be interpreted as dynamic nonlinearity. Certain modifications have been proposed that additionally preserve some nonstationarity, at the expense of reproducing a great deal of nonlinearity. However, even those methods generally fail to preserve the trend (i.e., global nonstationarity in the mean) of the original series. This is the case of time series with unit roots in their autoregressive structure. Additionally, those methods, based on Fourier transform, either need first and last values in the original series to match, or they need to select a piece of the original series with matching ends. These conditions are often inapplicable and the resulting surrogates are adversely affected by the well-known artefact problem. In this study, we propose a simple technique that, applied within existing Fourier-transform-based methods, generates surrogate data that jointly preserve the aforementioned characteristics of the original series, including (even strong) trends. Moreover, our technique avoids the negative effects of end mismatch. Several artificial and real, stationary and nonstationary, linear and nonlinear time series are examined, in order to demonstrate the advantages of the methods. Corresponding surrogate data are produced with the classical and with the proposed methods, and the results are compared.
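
    For context, a sketch of the classical Fourier-transform surrogate that these methods build on: phases are randomized while the amplitude spectrum is preserved. The paper's trend-preserving modification is not reproduced.

    ```python
    # Hedged sketch of a classical phase-randomized (FT) surrogate.
    import numpy as np

    def ft_surrogate(x, rng):
        X = np.fft.rfft(x)
        phases = rng.uniform(0, 2 * np.pi, X.size)
        Xs = np.abs(X) * np.exp(1j * phases)
        Xs[0] = X[0]                        # keep the mean (DC) component
        if x.size % 2 == 0:
            Xs[-1] = X[-1]                  # Nyquist bin must stay real
        return np.fft.irfft(Xs, n=x.size)

    rng = np.random.default_rng(11)
    x = np.sin(np.linspace(0, 20 * np.pi, 512)) + 0.3 * rng.standard_normal(512)
    s = ft_surrogate(x, rng)
    # Same power spectrum, scrambled temporal structure:
    print(np.allclose(np.abs(np.fft.rfft(x)), np.abs(np.fft.rfft(s))))
    ```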

  16. Temporal Resolution in Time Series and Probabilistic Models of Renewable Power Systems

    NASA Astrophysics Data System (ADS)

    Hoevenaars, Eric

    There are two main types of logistical models used for long-term performance prediction of autonomous power systems: time series and probabilistic. Time series models are more common and are more accurate for sizing storage systems because they are able to track the state of charge. However, the computational time is usually greater than for probabilistic models. It is common for time series models to perform 1-year simulations with a 1-hour time step. This is likely because of the limited availability of high resolution data and the increase in computation time with a shorter time step. Computation time is particularly important because these types of models are often used for component size optimization which requires many model runs. This thesis includes a sensitivity analysis examining the effect of the time step on these simulations. The results show that it can be significant, though it depends on the system configuration and site characteristics. Two probabilistic models are developed to estimate the temporal resolution error of a 1-hour simulation: a time series/probabilistic model and a fully probabilistic model. To demonstrate the application of and evaluate the performance of these models, two case studies are analyzed. One is for a typical residential system and one is for a system designed to provide on-site power at an aquaculture site. The results show that the time series/probabilistic model would be a useful tool if accurate distributions of the sub-hour data can be determined. Additionally, the method of cumulant arithmetic is demonstrated to be a useful technique for incorporating multiple non-Gaussian random variables into a probabilistic model, a feature other models such as Hybrid2 currently do not have. The results from the fully probabilistic model showed that some form of autocorrelation is required to account for seasonal and diurnal trends.

  17. Mining approximate periodic pattern in hydrological time series

    NASA Astrophysics Data System (ADS)

    Zhu, Y. L.; Li, S. J.; Bao, N. N.; Wan, D. S.

    2012-04-01

    Long sequences of hydrological time series contain a great deal of information about the hidden laws of natural evolution and the influence of human activities on the earth's surface. Data mining technology can help find those hidden laws, such as flood frequency and abrupt change, which is useful for the decision support of hydrological prediction and flood control scheduling. The periodic nature of hydrological time series is important for trend forecasting of drought and flood and for hydraulic engineering planning. In hydrology, the full period analysis of hydrological time series has attracted a lot of attention, including the discrete periodogram, the simple partial wave method, Fourier analysis, maximum entropy spectral analysis, and wavelet analysis. In fact, the hydrological process is influenced both by deterministic factors and stochastic ones. For example, the tidal level is affected by the Moon circling the Earth, in addition to the Earth's revolution and rotation. Hence, there is some kind of approximate period hidden in the hydrological time series, sometimes also called the cryptic period. Recently, partial period mining, which originated in the data mining domain, has emerged as a remedy for the traditional period analysis methods in hydrology, as it places loose requirements on data integrity and continuity. Such methods can find partial periods in a time series. This paper is focused on partial period mining in hydrological time series. Based on asynchronous periodic patterns and partial period mining with suffix trees, this paper proposes to mine multi-event asynchronous periodic patterns based on modified suffix tree representation and traversal, and introduces a dynamic method for adjusting candidate period intervals, which avoids period omissions and waste of time and space. The experimental results on synthetic data and real water level data of the Yangtze River at Nanjing station indicate that this algorithm can discover hydrological

  18. On fractal analysis of cardiac interbeat time series

    NASA Astrophysics Data System (ADS)

    Guzmán-Vargas, L.; Calleja-Quevedo, E.; Angulo-Brown, F.

    2003-09-01

    In recent years the complexity of a cardiac beat-to-beat time series has been taken as an auxiliary tool to identify the health status of human hearts. Several methods have been employed to characterize the time series complexity. In this work we calculate the fractal dimension of interbeat time series arising from three groups: 10 young healthy persons, 8 elderly healthy persons and 10 patients with congestive heart failure. Our numerical results reflect evident differences in the dynamic behavior corresponding to each group. We discuss these results within the context of the neuroautonomic control of heart rate dynamics. We also propose a numerical simulation which reproduces the aging effects of heart rate behavior.
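
    A sketch of one common fractal dimension estimator for such series, Higuchi's method, run on white noise; the paper's exact estimator may differ.

    ```python
    # Hedged sketch of Higuchi's fractal dimension estimator.
    import numpy as np

    def higuchi_fd(x, kmax=8):
        n = len(x)
        lk = []
        for k in range(1, kmax + 1):
            lengths = []
            for m in range(k):
                idx = np.arange(m, n, k)
                d = np.abs(np.diff(x[idx])).sum()
                # Curve length at lag k, with Higuchi's normalization.
                lengths.append(d * (n - 1) / ((len(idx) - 1) * k * k))
            lk.append(np.mean(lengths))
        # Slope of log L(k) vs log(1/k) estimates the fractal dimension.
        k = np.arange(1, kmax + 1)
        return np.polyfit(np.log(1.0 / k), np.log(lk), 1)[0]

    x = np.random.default_rng(12).standard_normal(2000)
    print("FD of white noise (~2):", round(higuchi_fd(x), 2))
    ```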

  19. Test to determine the Markov order of a time series.

    PubMed

    Racca, E; Laio, F; Poggi, D; Ridolfi, L

    2007-01-01

    The Markov order of a time series is an important measure of the "memory" of a process, and its knowledge is fundamental for the correct simulation of the characteristics of the process. For this reason, several techniques have been proposed in the past for its estimation. However, most of these methods are rather complex, and often can be applied only in the case of Markov chains. Here we propose a simple and robust test to evaluate the Markov order of a time series. Only the first-order moment of the conditional probability density function characterizing the process is used to evaluate the memory of the process itself. This measure is called the "expected value Markov (EVM) order." We show that there is good agreement between the EVM order and the known Markov order of some synthetic time series.

  20. Appropriate Algorithms for Nonlinear Time Series Analysis in Psychology

    NASA Astrophysics Data System (ADS)

    Scheier, Christian; Tschacher, Wolfgang

    Chaos theory has a strong appeal for psychology because it allows for the investigation of the dynamics and nonlinearity of psychological systems. Consequently, chaos-theoretic concepts and methods have recently gained increasing attention among psychologists and positive claims for chaos have been published in nearly every field of psychology. Less attention, however, has been paid to the appropriateness of chaos-theoretic algorithms for psychological time series. An appropriate algorithm can deal with short, noisy data sets and yields `objective' results. In the present paper it is argued that most of the classical nonlinear techniques don't satisfy these constraints and thus are not appropriate for psychological data. A methodological approach is introduced that is based on nonlinear forecasting and the method of surrogate data. In artificial data sets and empirical time series we can show that this methodology reliably assesses nonlinearity and chaos in time series even if they are short and contaminated by noise.

  1. Time series, correlation matrices and random matrix models

    SciTech Connect

    Vinayak; Seligman, Thomas H.

    2014-01-08

    In this set of five lectures the authors have presented techniques to analyze open classical and quantum systems using correlation matrices. For diverse reasons we shall see that random matrices play an important role in describing a null hypothesis or a minimum information hypothesis for the description of a quantum system or subsystem. In the former case, we consider various forms of correlation matrices of time series associated with the classical observables of some system. The fact that such series are necessarily finite inevitably introduces noise, and this finite-time influence leads to a random or stochastic component in these time series. Consequently, such correlation matrices have a random component, and corresponding random matrix ensembles are used. In the latter case, we use random matrices to describe a high-temperature environment or uncontrolled perturbations, ensembles of differing chaotic systems, etc. The common theme of the lectures is thus the importance of random matrix theory in a wide range of fields in and around physics.
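
    A minimal sketch of the random-matrix null hypothesis for correlation matrices of finite time series: eigenvalues of uncorrelated data fall within the Marchenko-Pastur bulk.

    ```python
    # Hedged sketch: eigenvalue spectrum of a sample correlation matrix of
    # N finite random series against the Marchenko-Pastur support.
    import numpy as np

    rng = np.random.default_rng(13)
    N, T = 100, 400                       # N series of length T, q = N/T
    X = rng.standard_normal((N, T))
    C = np.corrcoef(X)
    evals = np.linalg.eigvalsh(C)

    q = N / T
    lam_min, lam_max = (1 - np.sqrt(q))**2, (1 + np.sqrt(q))**2
    inside = np.mean((evals >= lam_min) & (evals <= lam_max))
    print(f"fraction of eigenvalues in MP bulk "
          f"[{lam_min:.2f}, {lam_max:.2f}]: {inside:.2f}")
    ```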

  2. Characterizing Complex Time Series from the Scaling of Prediction Error.

    NASA Astrophysics Data System (ADS)

    Hinrichs, Brant Eric

    This thesis concerns characterizing complex time series from the scaling of prediction error. We use the global modeling technique of radial basis function approximation to build models from a state-space reconstruction of a time series that otherwise appears complicated or random (i.e. aperiodic, irregular). Prediction error as a function of prediction horizon is obtained from the model using the direct method. The relationship between the underlying dynamics of the time series and the logarithmic scaling of prediction error as a function of prediction horizon is investigated. We use this relationship to characterize the dynamics of both a model chaotic system and physical data from the optic tectum of an attentive pigeon exhibiting the important phenomena of nonstationary neuronal oscillations in response to visual stimuli.

  3. Time series characterization via horizontal visibility graph and Information Theory

    NASA Astrophysics Data System (ADS)

    Gonçalves, Bruna Amin; Carpi, Laura; Rosso, Osvaldo A.; Ravetti, Martín G.

    2016-12-01

    Complex network theory has gained wider applicability since methods for the transformation of time series to networks were proposed and successfully tested. In the last few years, the horizontal visibility graph has become a popular method due to its simplicity and good results when applied to natural and artificially generated data. In this work, we explore different ways of extracting information from the network constructed from the horizontal visibility graph, evaluated by Information Theory quantifiers. Most works use the degree distribution of the network; however, we found alternative probability distributions that are more efficient than the degree distribution in characterizing dynamical systems. In particular, we find that, when using distributions based on distances and amplitude values, significantly shorter time series are required. We analyze fractional Brownian motion time series, and a paleoclimatic proxy record of ENSO from the Pallcacocha Lake to study dynamical changes during the Holocene.
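
    A small sketch of the horizontal visibility graph construction itself; the Information Theory quantifiers built on top of it are not shown.

    ```python
    # Hedged sketch: nodes are time points, and i, j are linked when every
    # sample strictly between them is lower than both endpoints.
    import numpy as np

    def hvg_edges(x):
        edges = []
        for i in range(len(x) - 1):
            top = -np.inf
            for j in range(i + 1, len(x)):
                if top < min(x[i], x[j]):      # nothing in between blocks the view
                    edges.append((i, j))
                top = max(top, x[j])
                if top >= x[i]:                # later points can no longer see i
                    break
        return edges

    x = np.random.default_rng(14).random(8)
    print(hvg_edges(x))
    ```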

  4. A multidisciplinary database for geophysical time series management

    NASA Astrophysics Data System (ADS)

    Montalto, P.; Aliotta, M.; Cassisi, C.; Prestifilippo, M.; Cannata, A.

    2013-12-01

    The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. The term time series refers to a set of observations of a given phenomenon acquired sequentially in time. When the time intervals are equally spaced, one speaks of the period or sampling frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), in order to acquire time series from different data sources and standardize them within a relational database. The operation of standardization provides the ability to perform operations, such as query and visualization, on many measures, synchronizing them using a common time scale. The proposed architecture follows a multiple layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in performing particular operations for the reorganization and archiving of data from different sources such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the loader layer performs a security check of the working status of each running software component through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although our system has to manage huge amounts of data, performance is guaranteed by using a smart table partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, which provide the possibility to query different time series over a specified time range, or to follow the real-time signal acquisition, according to a data access policy for the users.

  5. Permutation test for periodicity in short time series data

    PubMed Central

    Ptitsyn, Andrey A; Zvonic, Sanjin; Gimble, Jeffrey M

    2006-01-01

    Background Periodic processes, such as the circadian rhythm, are important factors modulating and coordinating transcription of genes governing key metabolic pathways. Theoretically, even small fluctuations in the orchestration of circadian gene expression patterns among different tissues may result in functional asynchrony at the organism level and may contribute to a wide range of pathologic disorders. Identification of circadian expression patterns in time series data is important, but equally challenging. Microarray technology allows estimation of the relative expression of thousands of genes at each time point. However, this estimation often lacks precision, and microarray experiments are prohibitively expensive, limiting the number of data points in a time series expression profile. The data produced in these experiments carry a high degree of stochastic variation, obscuring the periodic pattern, and a limited number of replicates, typically covering no more than two complete periods of oscillation. Results To address this issue, we have developed a simple, but effective, computational technique for the identification of a periodic pattern in relatively short time series, typical for microarray studies of circadian expression. This test is based on a random permutation of time points in order to estimate the non-randomness of a periodogram. The Permutated time, or Pt-test, is able to detect oscillations within a given period in expression profiles dominated by a high degree of stochastic fluctuations or oscillations of different irrelevant frequencies. We have conducted a comprehensive study of circadian expression on a large data set produced at PBRC, representing three different peripheral murine tissues. We have also re-analyzed a number of similar time series data sets produced and published independently by other research groups over the past few years. Conclusion The Permutated time test (Pt-test) is demonstrated to be effective for detection of periodicity in
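
    A hedged sketch of the permutation idea on synthetic data: compare the observed periodogram peak with peaks of time-permuted series; details of the published Pt-test may differ.

    ```python
    # Hedged sketch: permutation p-value for the dominant periodogram peak
    # of a short series, in the spirit of the Pt-test described above.
    import numpy as np

    def permutation_periodicity_test(x, n_perm=1000, seed=0):
        rng = np.random.default_rng(seed)
        def peak(v):
            return np.abs(np.fft.rfft(v - v.mean())[1:]).max()
        obs = peak(x)
        null = [peak(rng.permutation(x)) for _ in range(n_perm)]
        return np.mean([p >= obs for p in null])   # permutation p-value

    t = np.arange(12)                              # short series, ~2 periods
    x = np.sin(2 * np.pi * t / 6) + 0.4 * np.random.default_rng(1).standard_normal(12)
    print("p-value:", permutation_periodicity_test(x))
    ```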

  6. Time-Accurate Solutions of Incompressible Navier-Stokes Equations for Potential Turbopump Applications

    NASA Technical Reports Server (NTRS)

    Kiris, Cetin; Kwak, Dochan

    2001-01-01

    Two numerical procedures, one based on the artificial compressibility method and the other on the pressure projection method, are outlined for obtaining time-accurate solutions of the incompressible Navier-Stokes equations. The performance of the two methods is compared by obtaining unsteady solutions for the evolution of twin vortices behind a flat plate. Calculated results are compared with experimental and other numerical results. For an unsteady flow which requires a small physical time step, the pressure projection method was found to be computationally efficient since it does not require any subiteration procedure. It was observed that the artificial compressibility method requires a fast convergence scheme at each physical time step in order to satisfy the incompressibility condition. This was obtained by using a GMRES-ILU(0) solver in our computations. When a line-relaxation scheme was used, the time accuracy was degraded and time-accurate computations became very expensive.

  7. Microbial oceanography and the Hawaii Ocean Time-series programme.

    PubMed

    Karl, David M; Church, Matthew J

    2014-10-01

    The Hawaii Ocean Time-series (HOT) programme has been tracking microbial and biogeochemical processes in the North Pacific Subtropical Gyre since October 1988. The near-monthly time series observations have revealed previously undocumented phenomena within a temporally dynamic ecosystem that is vulnerable to climate change. Novel microorganisms, genes and unexpected metabolic pathways have been discovered and are being integrated into our evolving ecological paradigms. Continued research, including higher-frequency observations and at-sea experimentation, will help to provide a comprehensive scientific understanding of microbial processes in the largest biome on Earth.

  8. Testing for intracycle determinism in pseudoperiodic time series

    NASA Astrophysics Data System (ADS)

    Coelho, Mara C. S.; Mendes, Eduardo M. A. M.; Aguirre, Luis A.

    2008-06-01

    A determinism test is proposed based on the well-known method of the surrogate data. Assuming predictability to be a signature of determinism, the proposed method checks for intracycle (e.g., short-term) determinism in the pseudoperiodic time series for which standard methods of surrogate analysis do not apply. The approach presented is composed of two steps. First, the data are preprocessed to reduce the effects of seasonal and trend components. Second, standard tests of surrogate analysis can then be used. The determinism test is applied to simulated and experimental pseudoperiodic time series and the results show the applicability of the proposed test.

  9. Application of nonlinear time series models to driven systems

    SciTech Connect

    Hunter, N.F. Jr.

    1990-01-01

    In our laboratory we have been engaged in an effort to model nonlinear systems using time series methods. Our objectives have been, first, to understand how the time series response of a nonlinear system unfolds as a function of the underlying state variables, second, to model the evolution of the state variables, and finally, to predict nonlinear system responses. We hope to address the relationship between model parameters and system parameters in the near future. Control of nonlinear systems based on experimentally derived parameters is also a planned topic of future research. 28 refs., 15 figs., 2 tabs.

  10. Adaptive median filtering for preprocessing of time series measurements

    NASA Technical Reports Server (NTRS)

    Paunonen, Matti

    1993-01-01

    A median (L1-norm) filtering program using polynomials was developed. This program was used in automatic recycling data screening. Additionally, a special adaptive program to work with asymmetric distributions was developed. Examples of adaptive median filtering of satellite laser range observations and TV satellite time measurements are given. The program proved to be versatile and time saving in data screening of time series measurements.
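
    A sketch of the general screening idea with scipy's plain running median; the program described above fits local polynomials under the L1 norm, which is not reproduced here.

    ```python
    # Hedged sketch: flag points that deviate strongly from a running median
    # (a simple stand-in for the polynomial L1-norm filtering program).
    import numpy as np
    from scipy.signal import medfilt

    rng = np.random.default_rng(15)
    x = np.sin(np.linspace(0, 6 * np.pi, 300)) + 0.05 * rng.standard_normal(300)
    x[[50, 120, 200]] += 3.0                      # inject gross outliers

    smooth = medfilt(x, kernel_size=11)
    resid = x - smooth
    keep = np.abs(resid) < 4 * np.median(np.abs(resid))   # robust rejection rule
    # The injected outliers (and possibly a few noise spikes) are flagged.
    print("flagged points at:", np.where(~keep)[0])
    ```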

  11. Kālī: Time series data modeler

    NASA Astrophysics Data System (ADS)

    Kasliwal, Vishal P.

    2016-07-01

    The fully parallelized and vectorized software package Kālī models time series data using various stochastic processes such as continuous-time ARMA (C-ARMA) processes and uses Bayesian Markov Chain Monte-Carlo (MCMC) for inferencing a stochastic light curve. Kālī is written in c++ with Python language bindings for ease of use. Kālī is named jointly after the Hindu goddess of time, change, and power and also as an acronym for KArma LIbrary.

  12. A time-accurate implicit method for chemical non-equilibrium flows at all speeds

    NASA Technical Reports Server (NTRS)

    Shuen, Jian-Shun

    1992-01-01

    A new time accurate coupled solution procedure for solving the chemical non-equilibrium Navier-Stokes equations over a wide range of Mach numbers is described. The scheme is shown to be very efficient and robust for flows with velocities ranging from M less than or equal to 10(exp -10) to supersonic speeds.

  13. The study of coastal groundwater depth and salinity variation using time-series analysis

    SciTech Connect

    Tularam, G.A.; Keeler, H.P.

    2006-10-15

    A time-series approach is applied to study and model tidal intrusion into coastal aquifers. The authors examine the effect of tidal behaviour on groundwater level and salinity intrusion for the coastal Brisbane region using auto-correlation and spectral analyses. The results show a close relationship between tidal behaviour, groundwater depth and salinity levels for the Brisbane coast. The known effect can be quantified and incorporated into new models in order to more accurately map salinity intrusion into the coastal groundwater table.
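
    A toy sketch of the spectral-analysis step on synthetic hourly groundwater levels, assuming a semidiurnal lunar constituent of about 12.42 h (the study's actual data and analysis details are not reproduced).

    ```python
    # Hedged sketch: periodogram of a synthetic groundwater-level record with
    # an embedded tidal component plus drift and noise.
    import numpy as np
    from scipy.signal import periodogram

    rng = np.random.default_rng(16)
    t = np.arange(24 * 60)                     # 60 days of hourly levels
    level = (0.3 * np.sin(2 * np.pi * t / 12.42)   # assumed ~12.42 h constituent
             + 0.05 * t / t.size                   # slow drift
             + 0.1 * rng.standard_normal(t.size))

    f, Pxx = periodogram(level - level.mean(), fs=1.0)   # 1 sample per hour
    peak = f[np.argmax(Pxx[1:]) + 1]
    print(f"dominant period: {1 / peak:.1f} hours")      # ~12.4 h tidal signal
    ```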

  14. Energy-Based Wavelet De-Noising of Hydrologic Time Series

    PubMed Central

    Sang, Yan-Fang; Liu, Changming; Wang, Zhonggen; Wen, Jun; Shang, Lunyu

    2014-01-01

    De-noising is a substantial issue in hydrologic time series analysis, but it is a difficult task due to the shortcomings of existing methods. In this paper an energy-based wavelet de-noising method is proposed. It removes noise by comparing the energy distribution of a series with a background energy distribution established from a Monte-Carlo test. Differing from the wavelet threshold de-noising (WTD) method, which is based on wavelet coefficient thresholding, the proposed method is based on the energy distribution of the series. It can distinguish noise from deterministic components in a series, and the uncertainty of the de-noising result can be quantitatively estimated using a proper confidence interval, which WTD cannot do. Analysis of both synthetic and observed series verified the comparable power of the proposed method and WTD, but the de-noising process of the former is more easily operated. The results also indicate the influence of three key factors (wavelet choice, decomposition level choice and noise content) on wavelet de-noising. The wavelet should be carefully chosen when using the proposed method. The suitable decomposition level for wavelet de-noising should correspond to the series' deterministic sub-signal with the smallest temporal scale. If too much noise is included in a series, an accurate de-noising result cannot be obtained by the proposed method or WTD; however, such a series would show purely random rather than autocorrelated behavior, so de-noising would no longer be needed. PMID:25360533

  15. Energy-based wavelet de-noising of hydrologic time series.

    PubMed

    Sang, Yan-Fang; Liu, Changming; Wang, Zhonggen; Wen, Jun; Shang, Lunyu

    2014-01-01

    De-noising is a substantial issue in hydrologic time series analysis, but it is a difficult task due to the shortcomings of existing methods. In this paper an energy-based wavelet de-noising method is proposed. It removes noise by comparing the energy distribution of a series with a background energy distribution established from a Monte-Carlo test. Differing from the wavelet threshold de-noising (WTD) method, which is based on wavelet coefficient thresholding, the proposed method is based on the energy distribution of the series. It can distinguish noise from deterministic components in a series, and the uncertainty of the de-noising result can be quantitatively estimated using a proper confidence interval, which WTD cannot do. Analysis of both synthetic and observed series verified the comparable power of the proposed method and WTD, but the de-noising process of the former is more easily operated. The results also indicate the influence of three key factors (wavelet choice, decomposition level choice and noise content) on wavelet de-noising. The wavelet should be carefully chosen when using the proposed method. The suitable decomposition level for wavelet de-noising should correspond to the series' deterministic sub-signal with the smallest temporal scale. If too much noise is included in a series, an accurate de-noising result cannot be obtained by the proposed method or WTD; however, such a series would show purely random rather than autocorrelated behavior, so de-noising would no longer be needed.
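
    For reference, a sketch of the WTD baseline that the paper compares against, using PyWavelets with the universal threshold; the proposed energy-based method itself is not reproduced.

    ```python
    # Hedged sketch of wavelet threshold de-noising (WTD) with the universal
    # threshold, the baseline method discussed above.
    import numpy as np
    import pywt

    rng = np.random.default_rng(17)
    t = np.linspace(0, 1, 1024)
    clean = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)
    noisy = clean + 0.3 * rng.standard_normal(t.size)

    coeffs = pywt.wavedec(noisy, "db4", level=5)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise scale estimate
    thr = sigma * np.sqrt(2 * np.log(noisy.size))         # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, "soft") for c in coeffs[1:]]
    denoised = pywt.waverec(coeffs, "db4")[:t.size]       # trim possible extra sample

    print("RMSE before:", round(np.sqrt(np.mean((noisy - clean)**2)), 3))
    print("RMSE after: ", round(np.sqrt(np.mean((denoised - clean)**2)), 3))
    ```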

  16. Fractal dimension of electroencephalographic time series and underlying brain processes.

    PubMed

    Lutzenberger, W; Preissl, H; Pulvermüller, F

    1995-10-01

    Fractal dimension has been proposed as a useful measure for the characterization of electrophysiological time series. This paper investigates what the pointwise dimension of electroencephalographic (EEG) time series can reveal about underlying neuronal generators. The following theoretical assumptions concerning brain function were made: (i) within the cortex, strongly coupled neural assemblies exist which oscillate at certain frequencies when they are active, (ii) several such assemblies can oscillate at a time, and (iii) activity flow between assemblies is minimal. If these assumptions are made, cortical activity can be considered as the weighted sum of a finite number of oscillations (plus noise). It is shown that the correlation dimension of finite time series generated by multiple oscillators increases monotonically with the number of oscillators. Furthermore, it is shown that a reliable estimate of the pointwise dimension of the raw EEG signal can be calculated from a time series as short as a few seconds. These results indicate that (i) the pointwise dimension of the EEG allows conclusions regarding the number of independently oscillating networks in the cortex, and (ii) a reliable estimate of the pointwise dimension of the EEG is possible on the basis of short raw signals.

  17. Learning time series evolution by unsupervised extraction of correlations

    SciTech Connect

    Deco, G.; Schuermann, B. )

    1995-03-01

    As a consequence, we are able to model chaotic and nonchaotic time series. Furthermore, one critical point in modeling time series is the determination of the dimension of the embedding vector used, i.e., the number of components of the past that are needed to predict the future. With this method we can detect the embedding dimension by extracting the influence of the past on the future, i.e., the correlation of remote past and future. Optimal embedding dimensions are obtained for the Henon map and the Mackey-Glass series. When noisy data corrupted by colored noise are used, a model is still possible. The noise will then be decorrelated by the network. In the case of modeling a chemical reaction, the most natural architecture that conserves the volume is a symplectic network which describes a system that conserves the entropy and therefore the transmitted information.

  18. Power Computations in Time Series Analyses for Traffic Safety Interventions

    PubMed Central

    McLeod, A. Ian; Vingilis, E. R.

    2008-01-01

    The evaluation of traffic safety interventions or other policies that can affect road safety often requires the collection of administrative time series data, such as monthly motor vehicle collision data that may be difficult and/or expensive to collect. Furthermore, since policy decisions may be based on the results found from the intervention analysis of the policy, it is important to ensure that the statistical tests have enough power, that is, that we have collected enough time series data both before and after the intervention so that a meaningful change in the series will likely be detected. In this short paper we present a simple methodology for doing this. It is expected that the methodology presented will be useful for sample size determination in a wide variety of traffic safety intervention analysis applications. Our method is illustrated with a proposed traffic safety study that was funded by NIH. PMID:18460394
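
    A hedged sketch of power estimation by simulation, with a naive two-sample t-test standing in for a full intervention time series model; the t-test ignores the autocorrelation it is applied to and is only illustrative of the sample-size question.

    ```python
    # Hedged sketch: how often a mean shift of a given size is detected in an
    # AR(1) series with n months before and after the intervention.
    import numpy as np
    from scipy import stats

    def power(n_before, n_after, shift, phi=0.5, sims=2000, alpha=0.05, seed=0):
        rng = np.random.default_rng(seed)
        hits, n = 0, n_before + n_after
        for _ in range(sims):
            e = rng.standard_normal(n)
            x = np.empty(n)
            x[0] = e[0]
            for i in range(1, n):                 # AR(1) noise
                x[i] = phi * x[i - 1] + e[i]
            x[n_before:] += shift                 # intervention effect
            p = stats.ttest_ind(x[:n_before], x[n_before:]).pvalue
            hits += p < alpha
        return hits / sims

    print("power with 36+36 months:", power(36, 36, shift=1.0))
    ```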

  19. Segmentation of time series with long-range fractal correlations

    PubMed Central

    Bernaola-Galván, P.; Oliver, J.L.; Hackenberg, M.; Coronado, A.V.; Ivanov, P.Ch.; Carpena, P.

    2012-01-01

    Segmentation is a standard method of data analysis to identify change-points dividing a nonstationary time series into homogeneous segments. However, for long-range fractal correlated series, most of the segmentation techniques detect spurious change-points which are simply due to the heterogeneities induced by the correlations and not to real nonstationarities. To avoid this oversegmentation, we present a segmentation algorithm which takes as a reference for homogeneity, instead of a random i.i.d. series, a correlated series modeled by a fractional noise with the same degree of correlations as the series to be segmented. We apply our algorithm to artificial series with long-range correlations and show that it systematically detects only the change-points produced by real nonstationarities and not those created by the correlations of the signal. Further, we apply the method to the sequence of the long arm of human chromosome 21, which is known to have long-range fractal correlations. We obtain only three segments that clearly correspond to the three regions of different G + C composition revealed by means of a multi-scale wavelet plot. Similar results have been obtained when segmenting all human chromosome sequences, showing the existence of previously unknown huge compositional superstructures in the human genome. PMID:23645997

  20. Segmentation of time series with long-range fractal correlations

    NASA Astrophysics Data System (ADS)

    Bernaola-Galván, P.; Oliver, J. L.; Hackenberg, M.; Coronado, A. V.; Ivanov, P. Ch.; Carpena, P.

    2012-06-01

    Segmentation is a standard method of data analysis to identify change-points dividing a nonstationary time series into homogeneous segments. However, for long-range fractal correlated series, most of the segmentation techniques detect spurious change-points which are simply due to the heterogeneities induced by the correlations and not to real nonstationarities. To avoid this oversegmentation, we present a segmentation algorithm which takes as a reference for homogeneity, instead of a random i.i.d. series, a correlated series modeled by a fractional noise with the same degree of correlations as the series to be segmented. We apply our algorithm to artificial series with long-range correlations and show that it systematically detects only the change-points produced by real nonstationarities and not those created by the correlations of the signal. Further, we apply the method to the sequence of the long arm of human chromosome 21, which is known to have long-range fractal correlations. We obtain only three segments that clearly correspond to the three regions of different G + C composition revealed by means of a multi-scale wavelet plot. Similar results have been obtained when segmenting all human chromosome sequences, showing the existence of previously unknown huge compositional superstructures in the human genome.
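
    A sketch of the elementary segmentation step such algorithms iterate: find the split maximizing the t-statistic between segment means. The paper's contribution, judging significance against a correlated fractional-noise reference rather than an i.i.d. one, is not implemented here.

    ```python
    # Hedged sketch of the basic change-point search underlying segmentation.
    import numpy as np

    def best_split(x, margin=5):
        best_t, best_i = 0.0, None
        for i in range(margin, len(x) - margin):
            a, b = x[:i], x[i:]
            sp = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
            t = abs(a.mean() - b.mean()) / sp
            if t > best_t:
                best_t, best_i = t, i
        return best_i, best_t

    rng = np.random.default_rng(18)
    x = np.concatenate([rng.normal(0, 1, 300), rng.normal(0.8, 1, 300)])
    print(best_split(x))      # split index near 300
    ```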

  1. Long GPS coordinate time series: multipath and geometry effects

    NASA Astrophysics Data System (ADS)

    King, M.; Watson, C. S.

    2009-12-01

    Within analyses of Global Positioning System (GPS) observations, unmodelled sub-daily signals are known to propagate into long-period signals via a number of different mechanisms. We report on the effects of time-variable satellite geometry and the propagation of an unmodelled multipath signal. Multipath reflectors at H=0.1 m, 0.2 m and 1.5 m below the antenna are modeled and their effects on GPS coordinate time series are examined. Simulated time series at 20 global IGS sites for 2000-2008 were derived using the satellite geometry as defined by daily broadcast orbits, in addition to that defined using a perfectly repeating synthetic orbit. For the simulations generated using the broadcast orbits with a perfectly clear horizon, we observe the introduction of a time-variable bias in the time series of up to several centimeters. Considerable site-to-site variability of the frequency and magnitude of the signal is observed, in addition to variation as a function of multipath source. When adopting realistic GPS observation geometries obtained from real data (e.g., those that include the effects of tracking outages, local obstructions, etc.), we observe concerning levels of temporal coordinate variation in the presence of the multipath signals. In these cases, we observe spurious signals across the frequency domain, in addition to what appear as offsets and secular trends. Velocity biases of more than 1 mm/yr are evident at a few sites. The propagated signal in the vertical component is consistent with a noise model with a spectral index marginally above flicker noise (mean index -1.4), with some sites exhibiting power law magnitudes at comparable levels to actual height time series generated in GIPSY. The propagated signal also shows clear spectral peaks across all coordinate components at harmonics of the draconitic year for a GPS satellite (351.2 days). When a perfectly repeating synthetic GPS constellation is used, the simulations show near-negligible power law

  2. Ocean time-series near Bermuda: Hydrostation S and the US JGOFS Bermuda Atlantic time-series study

    NASA Technical Reports Server (NTRS)

    Michaels, Anthony F.; Knap, Anthony H.

    1992-01-01

    Bermuda is the site of two ocean time-series programs. At Hydrostation S, the ongoing biweekly profiles of temperature, salinity and oxygen now span 37 years. This is one of the longest open-ocean time-series data sets and provides a view of decadal scale variability in ocean processes. In 1988, the U.S. JGOFS Bermuda Atlantic Time-series Study began a wide range of measurements at a frequency of 14-18 cruises each year to understand temporal variability in ocean biogeochemistry. On each cruise, the data range from chemical analyses of discrete water samples to data from electronic packages of hydrographic and optics sensors. In addition, a range of biological and geochemical rate measurements are conducted that integrate over time-periods of minutes to days. This sampling strategy yields a reasonable resolution of the major seasonal patterns and of decadal scale variability. The Sargasso Sea also has a variety of episodic production events on scales of days to weeks and these are only poorly resolved. In addition, there is a substantial amount of mesoscale variability in this region and some of the perceived temporal patterns are caused by the intersection of the biweekly sampling with the natural spatial variability. In the Bermuda time-series programs, we have added a series of additional cruises to begin to assess these other sources of variation and their impacts on the interpretation of the main time-series record. However, the adequate resolution of higher frequency temporal patterns will probably require the introduction of new sampling strategies and some emerging technologies such as biogeochemical moorings and autonomous underwater vehicles.

  3. A Time-Series Analysis of Hispanic Unemployment.

    ERIC Educational Resources Information Center

    Defreitas, Gregory

    1986-01-01

    This study undertakes the first systematic time-series research on the cyclical patterns and principal determinants of Hispanic joblessness in the United States. The principal findings indicate that Hispanics tend to bear a disproportionate share of increases in unemployment during recessions. (Author/CT)

  4. Chaotic time series prediction using artificial neural networks

    SciTech Connect

    Bartlett, E.B.

    1991-12-31

    This paper describes the use of artificial neural networks to model the complex oscillations defined by a chaotic Verhulst animal population dynamic. A predictive artificial neural network model is developed and tested, and results of computer simulations are given. These results show that the artificial neural network model predicts the chaotic time series with various initial conditions, growth parameters, or noise.

  5. Chaotic time series prediction using artificial neural networks

    SciTech Connect

    Bartlett, E.B.

    1991-01-01

    This paper describes the use of artificial neural networks to model the complex oscillations defined by a chaotic Verhulst animal population dynamic. A predictive artificial neural network model is developed and tested, and results of computer simulations are given. These results show that the artificial neural network model predicts the chaotic time series with various initial conditions, growth parameters, or noise.
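
    The two records above describe the model only in outline; as a hedged illustration of the same task, the sketch below trains a small scikit-learn network to do one-step-ahead prediction of a chaotic Verhulst (logistic-map) population series. The growth parameter, lag length and network size are illustrative choices, not Bartlett's.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # Chaotic Verhulst (logistic-map) population dynamic: x_{n+1} = r x_n (1 - x_n)
    r, n = 3.9, 2000
    x = np.empty(n)
    x[0] = 0.3
    for i in range(n - 1):
        x[i + 1] = r * x[i] * (1 - x[i])

    # One-step-ahead prediction from a short window of past values
    lag = 3
    X = np.column_stack([x[i:n - lag + i] for i in range(lag)])
    y = x[lag:]
    net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
    net.fit(X[:1500], y[:1500])
    rmse = np.sqrt(np.mean((net.predict(X[1500:]) - y[1500:]) ** 2))
    print("out-of-sample RMSE:", rmse)
    ```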

  6. Time Series, Stochastic Processes and Completeness of Quantum Theory

    NASA Astrophysics Data System (ADS)

    Kupczynski, Marian

    2011-03-01

    Most physical experiments are described as repeated measurements of some random variables. Experimental data registered by on-line computers form time series of outcomes. The frequencies of different outcomes are compared with the probabilities provided by the algorithms of quantum theory (QT). In spite of the statistical character of its predictions, a claim was made that QT provides the most complete description of the data and of the underlying physical phenomena. This claim could be easily rejected if some fine structures, averaged out in the standard descriptive statistical analysis, were found in time series of experimental data. To search for these structures one has to use more subtle statistical tools which were developed to study time series produced by various stochastic processes. In this talk we review some of these tools. As an example we show how the standard descriptive statistical analysis of the data is unable to reveal a fine structure in a simulated sample of an AR(2) stochastic process. We emphasize once again that the violation of Bell inequalities gives no information on the completeness or the non-locality of QT. The appropriate way to test the completeness of quantum theory is to search for fine structures in time series of the experimental data by means of purity tests or by studying the autocorrelation and partial autocorrelation functions.
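
    To make the AR(2) example concrete, here is a minimal simulation (the coefficients 0.5 and -0.3 are arbitrary choices of mine) showing that the mean and standard deviation reveal nothing about the dependence structure, while the autocorrelation function does.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 10_000
    x = np.zeros(n)
    for t in range(2, n):            # AR(2): x_t = 0.5 x_{t-1} - 0.3 x_{t-2} + noise
        x[t] = 0.5 * x[t - 1] - 0.3 * x[t - 2] + rng.standard_normal()

    def acf(x, k):
        """Sample autocorrelation at lag k."""
        xc = x - x.mean()
        return xc[:-k] @ xc[k:] / (xc @ xc)

    # Descriptive statistics are blind to the fine structure...
    print("mean %.3f  sd %.3f" % (x.mean(), x.std()))
    # ...but the first few autocorrelations reveal it clearly.
    print([round(acf(x, k), 3) for k in range(1, 6)])
    ```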

  7. Catchment classification and similarity using correlation in streamflow time series

    NASA Astrophysics Data System (ADS)

    Fleming, B.; Archfield, S. A.

    2012-12-01

    Catchment classification is an important component of hydrologic analyses, particularly for linking changes in ecological integrity to streamflow alteration, transferring time series or model parameters from gauged to ungauged locations, and as a way to understand the similarity in the response of catchments to change. Metrics of similarity used in catchment classification have ranged from aggregate catchment properties such as geologic or climate characteristics to variables derived from the daily streamflow hydrograph; however, no one set of classification variables can fully describe similarity between catchments, as the variables used for such assessments often depend on the question being asked. We propose an alternative similarity metric for hydrologic classification: correlation between the daily streamflow time series. If one assumes that the streamflow signal is the integrated response of a catchment to both climate and geology, then the strength of correlation in streamflow between two catchments is a measure of the strength of similarity in hydrologic response between those two catchments. Using the nonparametric Spearman rho correlation coefficient between streamflow time series at 54 unregulated and unaltered streamgauges in the mid-Atlantic United States, we show that correlation is a parsimonious classification metric that results in physically interpretable classes. Using the correlation between the deseasonalized streamflow time series and reclassifying the streamgauges, we also find that seasonality plays an important role in understanding catchment flow dynamics, especially those that can be linked to ecological response and similarity, although not to a large extent in this study area.
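
    A hedged sketch of the classification step: pairwise Spearman correlation between daily-flow columns, converted to a dissimilarity and passed to hierarchical clustering. The grouping procedure and all names are mine; the abstract does not specify how classes were formed from the correlation matrix.

    ```python
    import numpy as np
    from scipy.stats import spearmanr
    from scipy.cluster.hierarchy import linkage, fcluster

    def classify_catchments(flows, n_classes=4):
        """flows: (n_days, n_gauges) array of daily streamflow, one column per site."""
        rho, _ = spearmanr(flows)             # pairwise Spearman correlation matrix
        dist = 1.0 - rho                      # similarity -> dissimilarity
        iu = np.triu_indices_from(dist, k=1)  # condensed form expected by linkage
        tree = linkage(dist[iu], method="average")
        return fcluster(tree, n_classes, criterion="maxclust")
    ```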

  8. A Method for Comparing Multivariate Time Series with Different Dimensions

    PubMed Central

    Tapinos, Avraam; Mendes, Pedro

    2013-01-01

    In many situations it is desirable to compare dynamical systems based on their behavior. Similarity of behavior often implies similarity of internal mechanisms or dependency on common extrinsic factors. While there are widely used methods for comparing univariate time series, most dynamical systems are characterized by multivariate time series. Yet, comparison of multivariate time series has been limited to cases where they share a common dimensionality. A semi-metric is a distance function that has the properties of non-negativity, symmetry and reflexivity, but not sub-additivity. Here we develop a semi-metric – SMETS – that can be used for comparing groups of time series that may have different dimensions. To demonstrate its utility, the method is applied to dynamic models of biochemical networks and to portfolios of shares. The former is an example of a case where the dependencies between system variables are known, while in the latter the system is treated (and behaves) as a black box. PMID:23393554

  9. New Confidence Interval Estimators Using Standardized Time Series.

    DTIC Science & Technology

    1984-12-01

    We develop new confidence interval estimators for the underlying mean of a stationary simulation process. These estimators can be viewed as generalizations of Schruben’s so-called standardized time series area confidence interval estimators. Various properties of the new estimators are given.

  10. Daily time series evapotranspiration maps for Oklahoma and Texas panhandle

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Evapotranspiration (ET) is an important process in ecosystems’ water budget and closely linked to its productivity. Therefore, regional scale daily time series ET maps developed at high and medium resolutions have large utility in studying the carbon-energy-water nexus and managing water resources. ...

  11. What Makes a Coursebook Series Stand the Test of Time?

    ERIC Educational Resources Information Center

    Illes, Eva

    2009-01-01

    Intriguingly, at a time when the ELT market is inundated with state-of-the-art coursebooks teaching modern-day English, a 30-year-old series enjoys continuing popularity in some secondary schools in Hungary. Why would teachers, several of whom are school-based teacher-mentors in the vanguard of the profession, purposefully choose materials which…

  12. IDENTIFICATION OF REGIME SHIFTS IN TIME SERIES USING NEIGHBORHOOD STATISTICS

    EPA Science Inventory

    The identification of alternative dynamic regimes in ecological systems requires several lines of evidence. Previous work on time series analysis of dynamic regimes includes mainly model-fitting methods. We introduce two methods that do not use models. These approaches use state-...

  13. Dynamic Factor Analysis of Nonstationary Multivariate Time Series.

    ERIC Educational Resources Information Center

    Molenaar, Peter C. M.; And Others

    1992-01-01

    The dynamic factor model proposed by P. C. Molenaar (1985) is exhibited, and a dynamic nonstationary factor model (DNFM) is constructed with latent factor series that have time-varying mean functions. The use of a DNFM is illustrated using data from a television viewing habits study. (SLD)

  14. Model Identification in Time-Series Analysis: Some Empirical Results.

    ERIC Educational Resources Information Center

    Padia, William L.

    Model identification of time-series data is essential to valid statistical tests of intervention effects. Model identification is, at best, inexact in the social and behavioral sciences where one is often confronted with small numbers of observations. These problems are discussed, and the results of independent identifications of 130 social and…

  15. ADAPTIVE DATA ANALYSIS OF COMPLEX FLUCTUATIONS IN PHYSIOLOGIC TIME SERIES

    PubMed Central

    PENG, C.-K.; COSTA, MADALENA; GOLDBERGER, ARY L.

    2009-01-01

    We introduce a generic framework of dynamical complexity to understand and quantify fluctuations of physiologic time series. In particular, we discuss the importance of applying adaptive data analysis techniques, such as the empirical mode decomposition algorithm, to address the challenges of nonlinearity and nonstationarity that are typically exhibited in biological fluctuations. PMID:20041035
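
    As a toy illustration of the empirical mode decomposition mentioned above, the sketch below performs naive sifting, repeatedly subtracting the mean of cubic-spline envelopes through local extrema, to pull out a first intrinsic mode function. Boundary handling and the proper stopping criteria of production EMD codes are deliberately omitted.

    ```python
    import numpy as np
    from scipy.interpolate import CubicSpline

    def sift(x, t, n_iter=10):
        """Crude EMD sifting: subtract the mean of the upper and lower
        cubic-spline envelopes until an IMF-like component remains."""
        h = x.copy()
        for _ in range(n_iter):
            up = np.where((h[1:-1] > h[:-2]) & (h[1:-1] > h[2:]))[0] + 1
            dn = np.where((h[1:-1] < h[:-2]) & (h[1:-1] < h[2:]))[0] + 1
            if len(up) < 4 or len(dn) < 4:    # not enough extrema to continue
                break
            mean_env = 0.5 * (CubicSpline(t[up], h[up])(t)
                              + CubicSpline(t[dn], h[dn])(t))
            h = h - mean_env
        return h                              # first IMF; the residue is x - h

    # Example: an amplitude-modulated tone riding on a slow drift
    t = np.linspace(0, 10, 2000)
    x = np.sin(2 * np.pi * 5 * t) * (1 + 0.3 * np.sin(2 * np.pi * 0.2 * t)) + 0.5 * t
    imf1 = sift(x, t)
    ```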

  16. Time Series Data Visualization in World Wide Telescope

    NASA Astrophysics Data System (ADS)

    Fay, J.

    WorldWide Telescope provides a rich set of time series visualizations for both archival and real-time data. WWT consists of desktop tools for interactive immersive visualization and HTML5 web-based controls that can be utilized in customized web pages. WWT supports a range of display options including full dome, power walls, stereo and virtual reality headsets.

  17. Long GPS coordinate time series: multipath and geometry effects

    NASA Astrophysics Data System (ADS)

    King, M. A.; Watson, C. S.

    2009-04-01

    Within analyses of Global Positioning System (GPS) observations, unmodelled sub-daily signals are known to propagate into long-period signals via a number of different mechanisms. In this paper, we investigate the effects of time-variable satellite geometry and the propagation of an unmodelled multipath signal that is analogous to a change in the elevation-dependent phase centre of the receiving antenna. Multipath reflectors at H=0.1 m, 0.2 m and 1.5 m below the antenna are modeled and their effects on GPS coordinate time series are examined. Simulated time series at 20 global IGS sites for 2000-2008 were derived using the satellite geometry as defined by daily broadcast orbits, in addition to that defined using a perfectly repeating synthetic orbit. For the simulations generated using the broadcast orbits with a perfectly clear horizon, we observe the introduction of a time-variable bias in the time series of up to several centimeters. Considerable site-to-site variability of the frequency and magnitude of the signal is observed, in addition to variation as a function of multipath source. When adopting realistic GPS observation geometries obtained from real data (e.g., those that include the effects of tracking outages, local obstructions, etc.), we observe concerning levels of temporal coordinate variation in the presence of the multipath signals. In these cases, we observe spurious signals across the frequency domain, in addition to what appears as offsets and secular trends. Velocity biases of more than 1 mm/yr are evident at a few sites. The propagated signal in the vertical component is consistent with a noise model with a spectral index marginally above flicker noise (mean index -1.4), with some sites exhibiting power law magnitudes at comparable levels to actual height time series generated in GIPSY. The propagated signal also shows clear spectral peaks across all coordinate components at harmonics of the draconitic year for a GPS satellite (351.4 days

  18. [Anomaly Detection of Multivariate Time Series Based on Riemannian Manifolds].

    PubMed

    Xu, Yonghong; Hou, Xiaoying; Li, Shuting; Cui, Jie

    2015-06-01

    Multivariate time series problems exist widely in production and everyday life. Anomaly detection has provided valuable information in financial, hydrological and meteorological fields, as well as in earthquake research, video surveillance, medicine and other areas. In order to find exceptions in time sequences quickly and efficiently, and to present them in an intuitive way, in this study we combined Riemannian manifolds with statistical process control charts, using a sliding window and describing each windowed segment by its covariance matrix, to achieve anomaly detection in multivariate time series and its visualization. We used a simulated moving-average (MA) data stream and abnormal electrocardiogram data from MIT-BIH as experimental objects, and verified the anomaly detection method. The results showed that the method was reasonable and effective.
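
    A minimal sketch along the lines the abstract describes, assuming a log-Euclidean distance between sliding-window covariance descriptors and a Shewhart-style control limit; the window length, step and 3-sigma limit are illustrative choices of mine.

    ```python
    import numpy as np
    from scipy.linalg import logm

    def window_covs(X, w, step):
        """Covariance descriptor of each sliding window of the series X (T, d)."""
        d = X.shape[1]
        return [np.cov(X[s:s + w].T) + 1e-6 * np.eye(d)   # jitter keeps SPD
                for s in range(0, len(X) - w + 1, step)]

    def log_euclidean(A, B):
        """Log-Euclidean distance between two SPD matrices."""
        return np.linalg.norm(logm(A) - logm(B), "fro")

    def anomalous_windows(X, w=64, step=16, k=3.0):
        covs = window_covs(X, w, step)
        ref = sum(covs) / len(covs)                       # crude reference point
        d = np.array([log_euclidean(C, ref) for C in covs])
        limit = d.mean() + k * d.std()                    # control-chart limit
        return np.where(d > limit)[0], d, limit
    ```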

  19. Classification of time series patterns from complex dynamic systems

    SciTech Connect

    Schryver, J.C.; Rao, N.

    1998-07-01

    An increasing availability of high-performance computing and data storage media at decreasing cost is making possible the proliferation of large-scale numerical databases and data warehouses. Numeric warehousing enterprises on the order of hundreds of gigabytes to terabytes are a reality in many fields such as finance, retail sales, process systems monitoring, biomedical monitoring, surveillance and transportation. Large-scale databases are becoming more accessible to larger user communities through the internet, web-based applications and database connectivity. Consequently, most researchers now have access to a variety of massive datasets. This trend will probably only continue to grow over the next several years. Unfortunately, the availability of integrated tools to explore, analyze and understand the data warehoused in these archives is lagging far behind the ability to gain access to the same data. In particular, locating and identifying patterns of interest in numerical time series data is an increasingly important problem for which there are few available techniques. Temporal pattern recognition poses many interesting problems in classification, segmentation, prediction, diagnosis and anomaly detection. This research focuses on the problem of classification or characterization of numerical time series data. Highway vehicles and their drivers are examples of complex dynamic systems (CDS) which are being used by transportation agencies for field testing to generate large-scale time series datasets. Tools for effective analysis of numerical time series in databases generated by highway vehicle systems are not yet available, or have not been adapted to the target problem domain. However, analysis tools from similar domains may be adapted to the problem of classification of numerical time series data.

  20. Mixed Spectrum Analysis on fMRI Time-Series.

    PubMed

    Kumar, Arun; Lin, Feng; Rajapakse, Jagath C

    2016-06-01

    Temporal autocorrelation present in functional magnetic resonance image (fMRI) data poses challenges to its analysis. The existing approaches handling autocorrelation in fMRI time-series often presume a specific model of autocorrelation such as an auto-regressive model. The main limitation here is that the correlation structure of voxels is generally unknown and varies in different brain regions because of different levels of neurogenic noises and pulsatile effects. Enforcing a universal model on all brain regions leads to bias and loss of efficiency in the analysis. In this paper, we propose the mixed spectrum analysis of the voxel time-series to separate the discrete component corresponding to input stimuli and the continuous component carrying temporal autocorrelation. A mixed spectral analysis technique based on the M-spectral estimator is proposed, which effectively removes autocorrelation effects from voxel time-series and identifies significant peaks of the spectrum. As the proposed method does not assume any prior model for the autocorrelation effect in voxel time-series, varying correlation structure among the brain regions does not affect its performance. We have modified the standard M-spectral method for application to a spatial set of time-series by incorporating the contextual information related to the continuous spectrum of neighborhood voxels, thus reducing the computation cost considerably. Likelihood of activation is predicted by comparing the amplitude of the discrete component at the stimulus frequency of voxels across the brain by using a normal distribution and modeling spatial correlations among the likelihoods with a conditional random field. We also demonstrate the application of the proposed method in detecting other desired frequencies.

  1. A fluorescence-based quantitative real-time PCR assay for accurate Pocillopora damicornis species identification

    NASA Astrophysics Data System (ADS)

    Thomas, Luke; Stat, Michael; Evans, Richard D.; Kennington, W. Jason

    2016-09-01

    Pocillopora damicornis is one of the most extensively studied coral species globally, but high levels of phenotypic plasticity within the genus make species identification based on morphology alone unreliable. As a result, there is a compelling need to develop cheap and time-effective molecular techniques capable of accurately distinguishing P. damicornis from other congeneric species. Here, we develop a fluorescence-based quantitative real-time PCR (qPCR) assay to genotype a single nucleotide polymorphism that accurately distinguishes P. damicornis from other morphologically similar Pocillopora species. We trial the assay across colonies representing multiple Pocillopora species and then apply the assay to screen samples of Pocillopora spp. collected at regional scales along the coastline of Western Australia. This assay offers a cheap and time-effective alternative to Sanger sequencing and has broad applications including studies on gene flow, dispersal, recruitment and physiological thresholds of P. damicornis.

  2. Distinguishing quasiperiodic dynamics from chaos in short-time series.

    PubMed

    Zou, Y; Pazó, D; Romano, M C; Thiel, M; Kurths, J

    2007-07-01

    We propose a procedure to distinguish quasiperiodic from chaotic orbits in short-time series, which is based on the recurrence properties in phase space. The histogram of the return times in a recurrence plot is introduced to disclose the recurrence property consisting of only three peaks imposed by Slater's theorem. Noise effects on the statistics are studied. Our approach is demonstrated to be efficient in recognizing regular and chaotic trajectories of a Hamiltonian system with mixed phase space.
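
    A small numerical check of the diagnostic on the simplest quasiperiodic system, a circle rotation with golden-mean frequency: by Slater's theorem, the return times to a small arc take at most three values, two of which sum to the third. The neighbourhood size and series length below are illustrative choices.

    ```python
    import numpy as np

    def return_times(x, eps, dist):
        """Recurrence times: gaps between successive visits to the eps-ball
        around each point (the vertical spacings in a recurrence plot)."""
        times = []
        for i in range(len(x)):
            close = np.where(dist(x, x[i]) < eps)[0]
            times.extend(int(g) for g in np.diff(close) if g > 1)
        return np.unique(times, return_counts=True)

    omega = (np.sqrt(5) - 1) / 2                   # golden-mean rotation number
    theta = (omega * np.arange(5000)) % 1.0
    circ = lambda a, b: np.minimum(np.abs(a - b), 1 - np.abs(a - b))
    vals, counts = return_times(theta, 0.01, circ)
    print(vals)                                    # three Fibonacci-related values
    ```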

  3. Improving predictability of time series using maximum entropy methods

    NASA Astrophysics Data System (ADS)

    Chliamovitch, G.; Dupuis, A.; Golub, A.; Chopard, B.

    2015-04-01

    We discuss how maximum entropy methods may be applied to the reconstruction of Markov processes underlying empirical time series and compare this approach to usual frequency sampling. It is shown that, in low dimension, there exists a subset of the space of stochastic matrices for which the MaxEnt method is more efficient than sampling, in the sense that shorter historical samples have to be considered to reach the same accuracy. Considering short samples is of particular interest when modelling smoothly non-stationary processes, which provides, under some conditions, a powerful forecasting tool. The method is illustrated for a discretized empirical series of exchange rates.

  4. Normalization methods in time series of platelet function assays

    PubMed Central

    Van Poucke, Sven; Zhang, Zhongheng; Roest, Mark; Vukicevic, Milan; Beran, Maud; Lauwereins, Bart; Zheng, Ming-Hua; Henskens, Yvonne; Lancé, Marcus; Marcus, Abraham

    2016-01-01

    Platelet function can be quantitatively assessed by specific assays such as light-transmission aggregometry, multiple-electrode aggregometry measuring the response to adenosine diphosphate (ADP), arachidonic acid, collagen, and thrombin-receptor activating peptide, and viscoelastic tests such as rotational thromboelastometry (ROTEM). The task of extracting meaningful statistical and clinical information from high-dimensional data spaces in temporal multivariate clinical data represented in multivariate time series is complex. Building insightful visualizations for multivariate time series demands adequate usage of normalization techniques. In this article, various methods for data normalization (z-transformation, range transformation, proportion transformation, and interquartile range) are presented and visualized, and the most suitable approach for platelet function data series is discussed. Normalization was calculated per assay (test) for all time points and per time point for all tests. Interquartile range, range transformation, and z-transformation demonstrated the correlation as calculated by the Spearman correlation test when normalized per assay (test) for all time points. When normalizing per time point for all tests, no correlation could be discerned from the charts, as was the case when using all data as one dataset for normalization. PMID:27428217
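
    For concreteness, hedged one-line implementations of the four normalization methods named above, applied either per assay (row) or per time point (column); the array shape and names are mine, for illustration only.

    ```python
    import numpy as np

    def z_transform(x):           # centre on the mean, scale by the sd
        return (x - x.mean()) / x.std(ddof=1)

    def range_transform(x):       # map onto [0, 1]
        return (x - x.min()) / (x.max() - x.min())

    def proportion_transform(x):  # each value as a share of the total
        return x / x.sum()

    def iqr_transform(x):         # robust: centre on the median, scale by the IQR
        q1, q3 = np.percentile(x, [25, 75])
        return (x - np.median(x)) / (q3 - q1)

    data = np.random.rand(5, 12)  # 5 assays x 12 time points (mock values)
    per_assay = np.apply_along_axis(z_transform, 1, data)      # per test, all times
    per_timepoint = np.apply_along_axis(z_transform, 0, data)  # per time, all tests
    ```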

  5. Classifying of financial time series based on multiscale entropy and multiscale time irreversibility

    NASA Astrophysics Data System (ADS)

    Xia, Jianan; Shang, Pengjian; Wang, Jing; Shi, Wenbin

    2014-04-01

    Time irreversibility is a fundamental property of many time series. We apply multiscale entropy (MSE) and multiscale time irreversibility (MSTI) to analyze financial time series, and succeed in classifying the financial markets. Interestingly, both methods give nearly the same classification results, which means that they are capable of distinguishing different series in a reliable manner. By comparing the results of shuffled data with the original results, we confirm that the asymmetry property is an inherent property of financial time series and that it extends over a wide range of scales. In addition, the effects of noise on the Americas and European markets are relatively more significant than on the Asian markets, and loss of time irreversibility has been detected in series with high added noise.
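
    A compact sketch of the two measures, assuming the usual coarse-graining construction. Unlike the standard MSE recipe, the tolerance here is re-estimated at each scale, and the irreversibility index is a simple sign asymmetry of increments rather than the authors' exact statistic.

    ```python
    import numpy as np

    def coarse_grain(x, s):
        n = len(x) // s
        return x[:n * s].reshape(n, s).mean(axis=1)

    def sample_entropy(x, m=2, r=0.15):
        """SampEn: -log of the conditional probability that sequences close for
        m points (Chebyshev distance < r*sd) stay close for m + 1 points."""
        tol = r * np.std(x)
        def matches(mm):
            w = np.lib.stride_tricks.sliding_window_view(x, mm)
            return sum(int(np.sum(np.max(np.abs(w[i + 1:] - w[i]), axis=1) < tol))
                       for i in range(len(w) - 1))
        return -np.log(matches(m + 1) / matches(m))   # assumes matches exist

    def multiscale_entropy(x, max_scale=10):
        return [sample_entropy(coarse_grain(x, s)) for s in range(1, max_scale + 1)]

    def multiscale_irreversibility(x, max_scale=10):
        """Sign asymmetry of increments at each scale; near 0 if reversible."""
        out = []
        for s in range(1, max_scale + 1):
            d = np.diff(coarse_grain(x, s))
            out.append((np.sum(d > 0) - np.sum(d < 0)) / max(np.sum(d != 0), 1))
        return out
    ```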

  6. Accurate early-time and late-time modeling of countercurrent spontaneous imbibition

    NASA Astrophysics Data System (ADS)

    March, Rafael; Doster, Florian; Geiger, Sebastian

    2016-08-01

    Spontaneous countercurrent imbibition into a finite porous medium is an important physical mechanism for many applications, including but not limited to irrigation, CO2 storage, and oil recovery. Symmetry considerations that are often valid in fractured porous media allow us to study the process in a one-dimensional domain. In 1-D, for incompressible fluids and homogeneous rocks, the onset of imbibition can be captured by self-similar solutions and the imbibed volume scales with √t. At later times, the imbibition rate decreases and the finite size of the medium has to be taken into account. This requires numerical solutions. Here we present a new approach to approximate the whole imbibition process semianalytically. The onset is captured by a semianalytical solution. We also provide an a priori estimate of the time until which the imbibed volume scales with √t. This time is significantly longer than the time it takes until the imbibition front reaches the model boundary. The remainder of the imbibition process is obtained from a self-similar solution. We test our approach against numerical solutions that employ parametrizations relevant for oil recovery and CO2 sequestration. We show that this concept improves common first-order approaches that heavily underestimate early-time behavior and note that it can be readily included into dual-porosity models.

  7. The extraction of multiple cropping index of China based on NDVI time-series

    NASA Astrophysics Data System (ADS)

    Huang, Haitao; Gao, Zhiqiang

    2011-09-01

    The multiple cropping index reflects the intensity with which arable land is used under a certain planting system. The link between the multiple cropping index and NDVI time-series is the crop cycle, which determines the progression of seeding, jointing, tasseling, ripening and harvesting. The cycle can be retrieved from NDVI time-series because peaks and valleys on the time-series curve correspond to different periods of crop growth. In this paper, we aim to extract the multiple cropping index of China from NDVI time-series. Because of cloud contamination, some NDVI values are depressed. MVC (Maximum Value Composite) synthesis is applied to the SPOT-VGT data to remove the noise, but this method alone is not sufficient. In order to accurately extract the multiple cropping index, the HANTS (Harmonic Analysis of Time Series) algorithm is employed to remove the cloud contamination. The reconstructed NDVI time-series explicitly characterizes the biophysical process of planting, seedling, elongation, heading and harvesting of crops. Based on the reconstructed curve, we calculate the multiple cropping index of arable land by counting the peaks of the curve, since one peak represents one crop season. This paper presents a method for extracting the multiple cropping index from remote sensing imagery; the multiple cropping index of China is then extracted from VEGETATION decadal composite NDVI time series for the years 2000 and 2009. From the processed data, we obtain the spatial distribution of the tillage systems of China, and changes in the cropping index between the two years are then discussed.
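
    A hedged sketch of the two steps: a HANTS-style harmonic least-squares smoother (without the iterative outlier rejection of the real HANTS algorithm) followed by peak counting. The 36-sample period assumes 10-day composites over one year, and the 0.3 NDVI peak threshold is an illustrative choice.

    ```python
    import numpy as np

    def harmonic_fit(ndvi, n_harmonics=3, period=36):
        """Least-squares fit of a truncated Fourier series to one year of NDVI."""
        t = np.arange(len(ndvi))
        cols = [np.ones_like(t, dtype=float)]
        for k in range(1, n_harmonics + 1):
            cols.append(np.cos(2 * np.pi * k * t / period))
            cols.append(np.sin(2 * np.pi * k * t / period))
        A = np.column_stack(cols)
        coef, *_ = np.linalg.lstsq(A, ndvi, rcond=None)
        return A @ coef

    def cropping_index(ndvi, min_peak=0.3):
        """Peaks of the smoothed curve = crop seasons per year."""
        s = harmonic_fit(ndvi)
        peaks = (s[1:-1] > s[:-2]) & (s[1:-1] > s[2:]) & (s[1:-1] > min_peak)
        return int(peaks.sum())
    ```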

  8. A time accurate finite volume high resolution scheme for three dimensional Navier-Stokes equations

    NASA Technical Reports Server (NTRS)

    Liou, Meng-Sing; Hsu, Andrew T.

    1989-01-01

    A time accurate, three-dimensional, finite volume, high resolution scheme for solving the compressible full Navier-Stokes equations is presented. The present derivation is based on the upwind split formulas, specifically with the application of Roe's (1981) flux difference splitting. A high-order accurate (up to third order) upwind interpolation formula for the inviscid terms is derived to account for nonuniform meshes. For the viscous terms, discretizations consistent with the finite volume concept are described. A variant of a second-order time accurate method is proposed that utilizes identical procedures in both the predictor and corrector steps. Avoiding the definition of a midpoint gives a consistent and easy procedure, in the framework of finite volume discretization, for treating viscous transport terms in the curvilinear coordinates. For the boundary cells, a new treatment is introduced that not only avoids the use of 'ghost cells' and the associated problems, but also satisfies the tangency conditions exactly and allows easy definition of viscous transport terms at the first interface next to the boundary cells. Numerical tests of steady and unsteady high speed flows show that the present scheme gives accurate solutions.

  9. Autoregression of Quasi-Stationary Time Series (Invited)

    NASA Astrophysics Data System (ADS)

    Meier, T. M.; Küperkoch, L.

    2009-12-01

    Autoregression is a model-based tool for spectral analysis and prediction of time series. It has the potential to increase the resolution of spectral estimates. However, the validity of the assumed model has to be tested. Here we review briefly methods for the determination of the parameters of autoregression and summarize properties of autoregressive prediction and autoregressive spectral analysis. Time series with a limited number of dominant frequencies varying slowly in time (quasi-stationary time series) may well be described by a time-dependent autoregressive model of low order. An algorithm for the estimation of the autoregression parameters in a moving window is presented. Time-varying dominant frequencies are estimated. The comparison to results obtained by Fourier transform based methods and the visualization of the time-dependent normalized prediction error are essential for quality assessment of the results. The algorithm is applied to synthetic examples as well as to microseisms and tremor. The sensitivity of the results to the choice of model and filter parameters is discussed. Autoregressive forward prediction offers the opportunity to detect body wave phases in seismograms and to determine arrival times automatically. Examples are shown for P- and S-phases at local and regional distances. In order to determine S-wave arrival times, the autoregressive model is extended to multi-component recordings. For the detection of significant temporal changes in waveforms, the choice of the model appears to be less crucial compared to spectral analysis. Temporal changes in frequency, amplitude, phase, and polarisation are detectable by autoregressive prediction. Quality estimates of automatically determined onset times may be obtained from the slope of the absolute prediction error as a function of time and the signal-to-noise ratio. Results are compared to manual readings.
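
    A minimal sketch of the moving-window autoregression described above, assuming a least-squares AR fit: the dominant frequency is read off the strongest pole of the AR polynomial, and the normalized one-step prediction error serves as the quality measure.

    ```python
    import numpy as np

    def ar_fit(x, p):
        """Least-squares AR(p) coefficients and normalized prediction error."""
        A = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
        a, *_ = np.linalg.lstsq(A, x[p:], rcond=None)
        err = x[p:] - A @ a
        return a, np.sqrt(np.mean(err ** 2)) / x.std()

    def dominant_freq(a, dt=1.0):
        """Frequency of the strongest pole of the AR polynomial."""
        poles = np.roots(np.r_[1.0, -a])
        z = poles[np.argmax(np.abs(poles))]
        return abs(np.angle(z)) / (2 * np.pi * dt)

    def track_frequency(x, p=4, win=256, step=64):
        """Time-varying dominant frequency of a quasi-stationary series."""
        out = []
        for s in range(0, len(x) - win + 1, step):
            a, nerr = ar_fit(x[s:s + win], p)
            out.append((s, dominant_freq(a), nerr))   # (time, frequency, quality)
        return out
    ```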

  10. The Mount Wilson Ca II K Plage Index Time Series

    NASA Astrophysics Data System (ADS)

    Bertello, L.; Ulrich, R. K.; Boyden, J. E.

    2010-06-01

    It is well established that both total and spectral solar irradiance are modulated by variable magnetic activity on the solar surface. However, there is still disagreement about the contribution of individual solar features to changes in the solar output, in particular over decadal time scales. Ionized Ca II K line spectroheliograms are one of the major resources for these long-term trend studies, mainly because such measurements have been available now for more than 100 years. In this paper we introduce a new Ca II K plage and active network index time series derived from the digitization of almost 40 000 photographic solar images that were obtained at the 60-foot solar tower between 1915 and 1985 as part of the monitoring program of the Mount Wilson Observatory. We describe here the procedure we applied to calibrate the images and the properties of our newly defined index, which is strongly correlated with the average fractional area of the visible solar disk occupied by plages and active network. We show that the long-term variation of this index is in excellent agreement with the 11-year solar-cycle trend determined from the annual international sunspot number series. Our time series also agrees very well with similar indicators derived from a different reduction of the same database and from other long-term synoptic programs of Ca II K spectroheliograms, such as those at Kodaikanal Observatory (India) and at the National Solar Observatory at Sacramento Peak (USA). Finally, we show that using appropriate proxies it is possible to extend this time series up to date, making this data set one of the longest Ca II K index series currently available.

  11. Wavelet transform approach for fitting financial time series data

    NASA Astrophysics Data System (ADS)

    Ahmed, Amel Abdoullah; Ismail, Mohd Tahir

    2015-10-01

    This study investigates a newly developed technique, a combined wavelet filtering and VEC model, to study the dynamic relationship among financial time series. A wavelet filter has been used to remove noise from daily data of the NASDAQ stock market of the US and three stock markets of the Middle East and North Africa (MENA) region, namely Egypt, Jordan, and Istanbul. The data cover the period from 6/29/2001 to 5/5/2009. The returns of the series generated by the wavelet filter and of the original series are then analyzed by a cointegration test and a VEC model. The results show that the cointegration test affirms the existence of cointegration between the studied series, and that there is a long-term relationship between the US stock market and the MENA stock markets. A comparison between the proposed model and the traditional model demonstrates that the proposed model (DWT with VEC) outperforms the traditional model (VEC alone) in fitting the financial stock market series, revealing real information about the relationships among the stock markets.
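
    A hedged sketch of the pipeline using the PyWavelets and statsmodels packages (not the authors' code): soft-threshold wavelet denoising followed by an Engle-Granger cointegration test; prices_us and prices_mena are hypothetical input arrays.

    ```python
    import numpy as np
    import pywt
    from statsmodels.tsa.stattools import coint

    def wavelet_denoise(x, wavelet="db4", level=4):
        """Soft-threshold the detail coefficients (universal threshold) and rebuild."""
        coeffs = pywt.wavedec(x, wavelet, level=level)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # noise scale, finest level
        thr = sigma * np.sqrt(2 * np.log(len(x)))
        coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft")
                                for c in coeffs[1:]]
        return pywt.waverec(coeffs, wavelet)[:len(x)]

    # prices_us, prices_mena: aligned daily series (hypothetical arrays)
    # t_stat, p_value, _ = coint(wavelet_denoise(prices_us),
    #                            wavelet_denoise(prices_mena))
    # A small p_value suggests a long-run relationship between the two markets.
    ```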

  12. The utility of accurate mass and LC elution time information in the analysis of complex proteomes

    SciTech Connect

    Norbeck, Angela D.; Monroe, Matthew E.; Adkins, Joshua N.; Anderson, Kevin K.; Daly, Don S.; Smith, Richard D.

    2005-08-01

    Theoretical tryptic digests of all predicted proteins from the genomes of three organisms of varying complexity were evaluated for specificity and possible utility of combined peptide accurate mass and predicted LC normalized elution time (NET) information. The uniqueness of each peptide was evaluated using its combined mass (+/- 5 ppm and 1 ppm) and NET value (no constraint, +/- 0.05 and 0.01 on a 0-1 NET scale). The set of peptides both underestimates actual biological complexity due to the lack of specific modifications, and overestimates the expected complexity since many proteins will not be present in the sample or observable on the mass spectrometer because of dynamic range limitations. Once a peptide is identified from an LC-MS/MS experiment, its mass and elution time is representative of a unique fingerprint for that peptide. The uniqueness of that fingerprint in comparison to that for the other peptides present is indicative of the ability to confidently identify that peptide based on accurate mass and NET measurements. These measurements can be made using HPLC coupled with high resolution MS in a high-throughput manner. Results show that for organisms with comparatively small proteomes, such as Deinococcus radiodurans, modest mass and elution time accuracies are generally adequate for peptide identifications. For more complex proteomes, increasingly accurate measurements are required. However, the majority of proteins should be uniquely identifiable by using LC-MS with mass accuracies within +/- 1 ppm and elution time measurements within +/- 0.01 NET.
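
    A small sketch of the uniqueness computation described above: given arrays of peptide masses and NET values, it returns the fraction of peptides whose (mass, NET) fingerprint is unique within the stated tolerances. The implementation is mine, not the authors'.

    ```python
    import numpy as np

    def unique_fraction(masses, nets, ppm=1.0, net_tol=0.01):
        """Fraction of peptides unique within +/- ppm in mass and
        +/- net_tol on the 0-1 normalized elution-time scale."""
        masses, nets = np.asarray(masses, float), np.asarray(nets, float)
        order = np.argsort(masses)
        m, t = masses[order], nets[order]
        unique = np.ones(len(m), dtype=bool)
        for i in range(len(m)):
            tol = m[i] * ppm * 1e-6
            j = i + 1
            while j < len(m) and m[j] - m[i] <= tol:   # masses are sorted
                if abs(t[j] - t[i]) <= net_tol:
                    unique[i] = unique[j] = False
                j += 1
        return unique.mean()
    ```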

  13. Time Accurate Unsteady Pressure Loads Simulated for the Space Launch System at a Wind Tunnel Condition

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.; Brauckmann, Gregory J.; Kleb, Bil; Streett, Craig L.; Glass, Christopher E.; Schuster, David M.

    2015-01-01

    Using the Fully Unstructured Three-Dimensional (FUN3D) computational fluid dynamics code, an unsteady, time-accurate flow field about a Space Launch System configuration was simulated at a transonic wind tunnel condition (Mach = 0.9). Delayed detached eddy simulation combined with Reynolds-Averaged Navier-Stokes and a Spalart-Allmaras turbulence model was employed for the simulation. A second-order accurate time evolution scheme was used to simulate the flow field, with a minimum of 0.2 seconds of simulated time to as much as 1.4 seconds. Data were collected at 480 pressure tap locations, 139 of which matched those on a 3% wind tunnel model tested in the Transonic Dynamics Tunnel (TDT) facility at NASA Langley Research Center. Comparisons between computation and experiment showed agreement within 5% in terms of location for peak RMS levels, and 20% for frequency and magnitude of power spectral densities. Grid resolution and time step sensitivity studies were performed to identify methods for improved accuracy comparisons to wind tunnel data. With limited computational resources, accurate trends for reduced vibratory loads on the vehicle were observed. Exploratory methods, such as determining minimized computed errors based on CFL number and sub-iterations, as well as evaluating the frequency content of the unsteady pressures and oscillatory shock structures, were used in this study to enhance computational efficiency and solution accuracy. These techniques enabled development of a set of best practices for the evaluation of future flight vehicle designs in terms of vibratory loads.

  14. Examination of time series through randomly broken windows

    NASA Technical Reports Server (NTRS)

    Sturrock, P. A.; Shoub, E. C.

    1981-01-01

    In order to determine the Fourier transform of a quasi-periodic time series (linear problem), or the power spectrum of a stationary random time series (quadratic problem), data should be recorded without interruption over a long time interval. The effect of regular interruption such as the day/night cycle is well known. The effect of irregular interruption of data collection (the "breaking" of the window function) is investigated under the simplifying assumption that there is a uniform probability p that each interval of length τ, of the total interval of length T = Nτ, yields no data. For the linear case it is found that the noise-to-signal ratio will have a (one-sigma) value less than ε if N exceeds p⁻¹(1−p)ε⁻². For the quadratic case, the same requirement is met by the less restrictive condition that N exceed p⁻¹(1−p)ε⁻¹.
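
    A quick Monte Carlo probe of the linear case, assuming a unit sinusoid placed on a DFT bin and a window broken at random with probability p per sample; the empirical one-sigma noise-to-signal ratio can then be compared with the quoted bound as N grows.

    ```python
    import numpy as np

    def broken_window_nsr(N, p, trials=300, seed=0):
        """Empirical noise-to-signal ratio of the spectrum of a unit sinusoid
        when each of the N samples is lost independently with probability p."""
        rng = np.random.default_rng(seed)
        k_sig = N // 8                           # signal exactly on a DFT bin
        x = np.cos(2 * np.pi * k_sig * np.arange(N) / N)
        ratios = []
        for _ in range(trials):
            w = rng.random(N) >= p               # the randomly broken window
            spec = np.abs(np.fft.rfft(w * x))
            noise = np.sqrt(np.mean(np.delete(spec, k_sig) ** 2))
            ratios.append(noise / spec[k_sig])
        return float(np.mean(ratios))

    for N in (128, 512, 2048):                   # the ratio shrinks as N grows
        print(N, round(broken_window_nsr(N, p=0.3), 4))
    ```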

  15. A multivariate heuristic model for fuzzy time-series forecasting.

    PubMed

    Huarng, Kun-Huang; Yu, Tiffany Hui-Kuang; Hsu, Yu Wei

    2007-08-01

    Fuzzy time-series models have been widely applied due to their ability to handle nonlinear data directly and because no rigid assumptions for the data are needed. In addition, many such models have been shown to provide better forecasting results than their conventional counterparts. However, since most of these models require complicated matrix computations, this paper proposes the adoption of a multivariate heuristic function that can be integrated with univariate fuzzy time-series models into multivariate models. Such a multivariate heuristic function can easily be extended and integrated with various univariate models. Furthermore, the integrated model can handle multiple variables to improve forecasting results and, at the same time, avoid complicated computations due to the inclusion of multiple variables.

  16. A noise model for InSAR time series

    NASA Astrophysics Data System (ADS)

    Agram, P. S.; Simons, M.

    2015-04-01

    Interferometric synthetic aperture radar (InSAR) time series methods estimate the spatiotemporal evolution of surface deformation by incorporating information from multiple SAR interferograms. While various models have been developed to describe the interferometric phase and correlation statistics in individual interferograms, efforts to model the generalized covariance matrix that is directly applicable to joint analysis of networks of interferograms have been limited in scope. In this work, we build on existing decorrelation and atmospheric phase screen models and develop a covariance model for interferometric phase noise over space and time. We present arguments to show that the exploitation of the full 3-D covariance structure within conventional time series inversion techniques is computationally challenging. However, the presented covariance model can aid in designing new inversion techniques that can at least mitigate the impact of the spatially correlated nature of InSAR observations.

  17. Least Squares Time-Series Synchronization in Image Acquisition Systems.

    PubMed

    Piazzo, Lorenzo; Raguso, Maria Carmela; Calzoletti, Luca; Seu, Roberto; Altieri, Bruno

    2016-07-18

    We consider an acquisition system constituted by an array of sensors scanning an image. Each sensor produces a sequence of readouts, called a time-series. In this framework, we discuss the image estimation problem when the time-series are affected by noise and by a time shift. In particular, we introduce an appropriate data model and consider the Least Squares (LS) estimate, showing that it has no closed form. However, the LS problem has a structure that can be exploited to simplify the solution. In particular, based on two known techniques, namely Separable Nonlinear Least Squares (SNLS) and Alternating Least Squares (ALS), we propose and analyze several practical estimation methods. As an additional contribution, we discuss the application of these methods to the data of the Photodetector Array Camera and Spectrometer (PACS), which is an infrared photometer onboard the Herschel satellite. In this context, we investigate the accuracy and the computational complexity of the methods, using both true and simulated data.

  18. Segmentation of biological multivariate time-series data

    NASA Astrophysics Data System (ADS)

    Omranian, Nooshin; Mueller-Roeber, Bernd; Nikoloski, Zoran

    2015-03-01

    Time-series data from multicomponent systems capture the dynamics of the ongoing processes and reflect the interactions between the components. The progression of processes in such systems usually involves check-points and events at which the relationships between the components are altered in response to stimuli. Detecting these events together with the implicated components can help understand the temporal aspects of complex biological systems. Here we propose a regularized regression-based approach for identifying breakpoints and corresponding segments from multivariate time-series data. In combination with techniques from clustering, the approach also allows estimating the significance of the determined breakpoints as well as the key components implicated in the emergence of the breakpoints. Comparative analysis with the existing alternatives demonstrates the power of the approach to identify biologically meaningful breakpoints in diverse time-resolved transcriptomics data sets from the yeast Saccharomyces cerevisiae and the diatom Thalassiosira pseudonana.
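
    The authors' regularized-regression formulation is not reproduced in the abstract; as a hedged stand-in, the sketch below applies the third-party ruptures change-point library to a synthetic multivariate series with two true breakpoints.

    ```python
    import numpy as np
    import ruptures as rpt

    rng = np.random.default_rng(0)
    # Three regimes of a 3-component series: a mean shift, then a variance shift
    X = np.concatenate([rng.normal(0, 1, (100, 3)),
                        rng.normal(3, 1, (80, 3)),
                        rng.normal(0, 2, (120, 3))])

    algo = rpt.Pelt(model="rbf", min_size=20).fit(X)
    print(algo.predict(pen=10))   # expected near [100, 180, 300]
    ```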

  19. Load Balancing Using Time Series Analysis for Soft Real Time Systems with Statistically Periodic Loads

    NASA Technical Reports Server (NTRS)

    Hailperin, M.

    1993-01-01

    This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that the authors' techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. The authors' method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive load-balancing methods. Results from a preliminary analysis of another system and from simulation with a synthetic load provide some evidence of more general applicability.

  20. Active Mining from Process Time Series by Learning Classifier System

    NASA Astrophysics Data System (ADS)

    Kurahashi, Setsuya; Terano, Takao

    Continuous processes in chemical and/or biotechnical plants always generate a large amount of time series data. However, since conventional process models are described as a set of control models, it is difficult to explain the complicated and active plant behaviors. Against this background, this research proposes a novel method to develop a process response model from continuous time-series data. The method consists of the following phases: 1) Collect continuous process data at each tag point in a target plant; 2) Normalize the data in the interval between zero and one; 3) Get the delay time that maximizes the correlation between two given time series; 4) Select tags with the higher correlations; 5) Develop a process response model to describe the relations among the process data using the delay times and the correlation values; 6) Develop a process prediction model from the data of several tag points using a neural network; 7) Discover control rules from the process prediction model using a Learning Classifier System. The main contribution of the research is to establish a method to mine a set of meaningful control rules from the Learning Classifier System using the Minimal Description Length criterion. The proposed method has been applied to an actual process of a biochemical plant and has demonstrated its validity and effectiveness.
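
    As a concrete reading of phases 3 and 4, the sketch below finds, for a pair of normalized tag series, the delay that maximizes their cross-correlation, returning both the delay and the correlation used for tag selection; this is my interpretation of the abstract, not the authors' code.

    ```python
    import numpy as np

    def best_delay(x, y, max_lag=200):
        """Delay of y behind x maximizing correlation (phase 3), together
        with the correlation value used to select tags (phase 4)."""
        x = (x - x.mean()) / x.std()
        y = (y - y.mean()) / y.std()
        best = (0, -np.inf)
        for lag in range(max_lag + 1):
            r = float(np.mean(x[:len(x) - lag] * y[lag:]))
            if r > best[1]:
                best = (lag, r)
        return best    # (delay in samples, correlation at that delay)
    ```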

  1. Factors That Have An Influence On Time Series

    NASA Astrophysics Data System (ADS)

    Notti, D.; Meisina, C.; Zucca, F.; Crosetto, M.; Montserrat, O.

    2012-01-01

    In recent years, developments in the processing of SAR persistent scatterer interferometry (PSI) data have improved time series precision, even for data processed at regional scale. It is now possible to study the temporal behaviour of different types of natural processes. The most recent data are also processed with non-linear models, which makes it possible, albeit with many restrictions and problems, to study the temporal variation in the evolution of a process. In this work we analyzed the time series (TS) of ERS (1992-2001) and RADARSAT (2003-2010) data processed with SqueeSAR™ over three study areas in NW Italy (Western Piemonte, Province of Pavia and Province of Imperia). We compared the time series with other monitoring data in order to validate them and to identify their strengths and weaknesses in the detection of natural processes. At the same time, the TS were used to understand the kinematics of some geological processes.

  2. Reconstruction of ensembles of coupled time-delay systems from time series.

    PubMed

    Sysoev, I V; Prokhorov, M D; Ponomarenko, V I; Bezruchko, B P

    2014-06-01

    We propose a method to recover from time series the parameters of coupled time-delay systems and the architecture of couplings between them. The method is based on a reconstruction of model delay-differential equations and estimation of statistical significance of couplings. It can be applied to networks composed of nonidentical nodes with an arbitrary number of unidirectional and bidirectional couplings. We test our method on chaotic and periodic time series produced by model equations of ensembles of diffusively coupled time-delay systems in the presence of noise, and apply it to experimental time series obtained from electronic oscillators with delayed feedback coupled by resistors.

  3. Extracting a common pulse-like signal from time series using a nonlinear Kalman filter

    NASA Astrophysics Data System (ADS)

    Gazeaux, J.; Batista, D.; Ammann, C.; Naveau, P.; Jégat, C.; Gao, C.

    2009-04-01

    To understand the nature and cause of natural climate variability, it is important to attribute past climate variations to particular forcing factors. In this work, our main focus is to introduce an automatic assimilation procedure to estimate the magnitude of strong but short-lived perturbations, such as large explosive volcanic eruptions, using climate/proxy time series. The extraction and decomposition procedure is run on real multivariate time series of sulfate from ice cores drilled at different sites in Greenland. The sulfate ejected by volcanoes is transported through the stratosphere towards the poles and deposited via sedimentation near the poles. Sulfate in Greenland is therefore a marker of large volcanic eruptions occurring all over the world. Such pulse-like processes are highly nonlinear, both in their timing and in their intensity. If they are not detected, such pulse-like signals of extreme and rare events can perturb an objective calculation of the trend. This work is thus as much an estimation procedure for such signals as a first step toward an a posteriori estimation of the trend in the time series. Our extraction algorithm handles multivariate time series with a common but unknown forcing. This statistical procedure is based on a multivariate multi-state space model and a nonlinear Kalman filter. The nonlinearity is handled via the calculation of a doubly conditional expectation and variance. It can provide an accurate estimate of the timing and duration of individual pulse-like events from a set of different series covering the same temporal span. It not only allows for a more objective estimation of the associated peak amplitude and the subsequent time evolution of the signal, but at the same time it provides a measure of confidence through the posterior probability for each pulse-like event. The flexibility, robustness and limitations of our approach are discussed by applying our method to simulated time series and to the Monte-Carlo method to test the

  4. Time-Accurate, Unstructured-Mesh Navier-Stokes Computations with the Space-Time CESE Method

    NASA Technical Reports Server (NTRS)

    Chang, Chau-Lyan

    2006-01-01

    Application of the newly emerged space-time conservation element solution element (CESE) method to compressible Navier-Stokes equations is studied. In contrast to Euler equations solvers, several issues such as boundary conditions, numerical dissipation, and grid stiffness warrant systematic investigations and validations. Non-reflecting boundary conditions applied at the truncated boundary are also investigated from the standpoint of acoustic wave propagation. Validations of the numerical solutions are performed by comparing with exact solutions for steady-state as well as time-accurate viscous flow problems. The test cases cover a broad speed regime for problems ranging from acoustic wave propagation to 3D hypersonic configurations. Model problems pertinent to hypersonic configurations demonstrate the effectiveness of the CESE method in treating flows with shocks, unsteady waves, and separations. Good agreement with exact solutions suggests that the space-time CESE method provides a viable alternative for time-accurate Navier-Stokes calculations of a broad range of problems.

  5. Nonlinear time series analysis of solar and stellar data

    NASA Astrophysics Data System (ADS)

    Jevtic, Nada

    2003-06-01

    Nonlinear time series analysis was developed to study chaotic systems. Its utility for the study of solar and stellar time series was investigated here. Sunspot data form the longest astronomical time series, and they reflect the long-term variation of the solar magnetic field. Due to periods of low solar activity, such as the Maunder minimum, and the quasiperiodicity of the solar cycle, it has been postulated that the solar dynamo is a chaotic system. We show that, due to the definition of sunspot number, it is not possible to test this postulate using nonlinear time series methods. To complement the sunspot data analysis, theoretically generated data for the α-Ω solar dynamo with meridional circulation were analyzed. Effects of stochastic fluctuations on the energy of an α-Ω dynamo with meridional circulation were investigated. This proved extremely useful in generating a clearer understanding of the effect of dynamical noise on the unperturbed system, and it informed the study of the light intensity curve of the white dwarf PG 1351+489. Dynamical resetting was identified for PG 1351+489 using phase space methods, and then, using nonlinear noise reduction methods, the white-noise tail of the power spectrum was lowered by a factor of 40. This allowed the identification of 10 new lines in the power spectrum. Finally, using Poincaré section return times, a periodicity in the light curve of the cataclysmic variable SS Cygni was identified. We initially expected that time delay methods would be useful as a qualitative comparison tool. However, under the proper set of constraints on the data sets, they were capable of providing quantitative information about the signal source.

  6. Exploring large scale time-series data using nested timelines

    NASA Astrophysics Data System (ADS)

    Xie, Zaixian; Ward, Matthew O.; Rundensteiner, Elke A.

    2013-01-01

    When data analysts study time-series data, an important task is to discover how data patterns change over time. If the dataset is very large, this task becomes challenging. Researchers have developed many visualization techniques to help address this problem. However, little work has been done regarding the changes of multivariate patterns, such as linear trends and clusters, on time-series data. In this paper, we describe a set of history views to fill this gap. This technique works under two modes: merge and non-merge. For the merge mode, merge algorithms were applied to selected time windows to generate a change-based hierarchy. Contiguous time windows having similar patterns are merged first. Users can choose different levels of merging with the tradeoff between more details in the data and less visual clutter in the visualizations. In the non-merge mode, the framework can use natural hierarchical time units or one defined by domain experts to represent timelines. This can help users navigate across long time periods. Grid-based views were designed to provide a compact overview for the history data. In addition, MDS pattern starfields and distance maps were developed to enable users to quickly investigate the degree of pattern similarity among different time periods. The usability evaluation demonstrated that most participants could understand the concepts of the history views correctly and finished assigned tasks with high accuracy and relatively fast response times.

  7. A Time-Frequency Functional Model for Locally Stationary Time Series Data

    PubMed Central

    Qin, Li; Guo, Wensheng; Litt, Brian

    2009-01-01

    Unlike traditional time series analysis that focuses on one long time series, in many biomedical experiments, it is common to collect multiple time series and focus on how the design covariates impact the patterns of stochastic variation over time. In this article, we propose a time-frequency functional model for a family of time series indexed by a set of covariates. This model can be used to compare groups of time series in terms of the patterns of stochastic variation and to estimate the covariate effects. We focus our development on locally stationary time series and propose the covariate-indexed locally stationary setting, which include stationary processes as special cases. We use smoothing spline ANOVA models for the time-frequency coefficients. A two-stage procedure is introduced for estimation. To reduce the computational demand, we develop an equivalent state space model to the proposed model with an efficient algorithm. We also propose a new simulation method to generate replicated time series from their design spectra. An epileptic intracranial electroencephalogram (IEEG) dataset is analyzed for illustration. PMID:20228961

  8. Time warp edit distance with stiffness adjustment for time series matching.

    PubMed

    Marteau, Pierre-François

    2009-02-01

    In a way similar to the string-to-string correction problem, we address discrete time series similarity in light of a time-series-to-time-series correction problem, in which the similarity between two time series is measured as the minimum cost of the sequence of edit operations needed to transform one time series into the other. To define the edit operations, we use the paradigm of a graphical editing process and arrive at a dynamic programming algorithm that we call Time Warp Edit Distance (TWED). TWED is slightly different in form from the Dynamic Time Warping (DTW), Longest Common Subsequence (LCSS), and Edit Distance with Real Penalty (ERP) algorithms. In particular, it highlights a parameter that controls a kind of stiffness of the elastic measure along the time axis. We show that the similarity provided by TWED is a potentially useful metric in time series retrieval applications, since it can benefit from the triangle inequality to speed up the retrieval process while tuning the parameters of the elastic measure. In that context, a lower bound is derived to link the matching of time series in downsampled representation spaces to matching in the original space. The empirical quality of the TWED distance is evaluated on a simple classification task. Compared with Edit Distance, DTW, LCSS, and ERP, TWED proved quite effective on the considered experimental task.
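
    The TWED recurrence is compact enough to state directly in code. The following is a minimal sketch, assuming the commonly cited formulation with an absolute-difference local cost and a dummy initial sample at time zero; the names `twed`, `nu` (stiffness), and `lam` (deletion penalty) are our own, and the double loop mirrors the dynamic programming table.

```python
import numpy as np

def twed(a, ta, b, tb, nu=0.001, lam=1.0):
    """Time Warp Edit Distance between time-stamped series (a, ta) and (b, tb).

    nu  -- stiffness parameter (>= 0) penalizing warping along the time axis
    lam -- constant penalty for the two delete operations
    """
    # Pad with a dummy initial sample at time 0, as in the reference formulation.
    a = np.concatenate(([0.0], np.asarray(a, float)))
    ta = np.concatenate(([0.0], np.asarray(ta, float)))
    b = np.concatenate(([0.0], np.asarray(b, float)))
    tb = np.concatenate(([0.0], np.asarray(tb, float)))
    n, m = len(a), len(b)

    D = np.full((n, m), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n):
        for j in range(1, m):
            del_a = D[i - 1, j] + abs(a[i] - a[i - 1]) + nu * (ta[i] - ta[i - 1]) + lam
            del_b = D[i, j - 1] + abs(b[j] - b[j - 1]) + nu * (tb[j] - tb[j - 1]) + lam
            match = (D[i - 1, j - 1] + abs(a[i] - b[j]) + abs(a[i - 1] - b[j - 1])
                     + nu * (abs(ta[i] - tb[j]) + abs(ta[i - 1] - tb[j - 1])))
            D[i, j] = min(del_a, del_b, match)
    return D[-1, -1]
```

    For equally spaced samples one can simply pass ta = np.arange(len(a)); a large nu penalizes time warping heavily, pushing TWED toward a pointwise edit distance, which is the stiffness behavior described above.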

  9. Accurate ampacity determination: Temperature-Sag Model for operational real time ratings

    SciTech Connect

    Seppa, T.O.

    1995-07-01

    This report presents a method for determining transmission line ratings based on the relationship between the conductor's temperature and its sag. The method is based on the Ruling Span principle and the use of transmission line tension monitoring systems. The report also presents a method of accurately calibrating the final sag of the conductor and determining the actual Ruling Span length of the line sections between deadend structures. Main error sources for two other real-time methods are also examined.

  10. Rényi’s information transfer between financial time series

    NASA Astrophysics Data System (ADS)

    Jizba, Petr; Kleinert, Hagen; Shefaat, Mohammad

    2012-05-01

    In this paper, we quantify the statistical coherence between financial time series by means of the Rényi entropy. With the help of Campbell’s coding theorem, we show that the Rényi entropy selectively emphasizes only certain sectors of the underlying empirical distribution while strongly suppressing others. This accentuation is controlled with Rényi’s parameter q. To tackle the issue of the information flow between time series, we formulate the concept of Rényi’s transfer entropy as a measure of information that is transferred only between certain parts of the underlying distributions. This is particularly pertinent in financial time series, where knowledge of marginal events such as spikes or sudden jumps is of crucial importance. We apply the Rényian information flow to stock market time series from 11 world stock indices, sampled at a daily rate over the period 02.01.1990-31.12.2009. The corresponding heat maps and net information flows are represented graphically. A detailed discussion of the transfer entropy between the DAX and S&P500 indices, based on minute tick data gathered over the period 02.04.2008-11.09.2009, is also provided. Our analysis shows that the bivariate information flow between world markets is strongly asymmetric, with a distinct information surplus flowing from the Asia-Pacific region to both the European and US markets. An important yet less dramatic excess of information also flows from Europe to the US. This is seen particularly clearly in a careful analysis of the Rényi information flow between the DAX and S&P500 indices.
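
    To make the role of the parameter q concrete, the short sketch below computes a plug-in Rényi entropy from a histogram of returns. The heavy-tailed sample, bin count, and base-2 logarithm are illustrative assumptions, not taken from the paper, and the full Rényi transfer entropy additionally involves conditional (escort) distributions between coupled series, which is omitted here.

```python
import numpy as np

def renyi_entropy(x, q, bins=50):
    """Plug-in Rényi entropy (in bits) of a sample, from a histogram estimate."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    if q == 1.0:                                   # Shannon limit q -> 1
        return -np.sum(p * np.log2(p))
    return np.log2(np.sum(p ** q)) / (1.0 - q)

# Heavy-tailed "returns": q < 1 weights rare, large events more strongly,
# while q > 1 is dominated by the central part of the distribution.
rng = np.random.default_rng(0)
returns = rng.standard_t(df=3, size=10_000)
for q in (0.5, 1.0, 2.0):
    print(q, renyi_entropy(returns, q))
```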

  11. FTSPlot: Fast Time Series Visualization for Large Datasets

    PubMed Central

    Riss, Michael

    2014-01-01

    The analysis of electrophysiological recordings often involves visual inspection of time series data to locate specific experiment epochs, mask artifacts, and verify the results of signal processing steps, such as filtering or spike detection. Long-term experiments with continuous data acquisition generate large amounts of data, and rapid browsing through these massive datasets poses a challenge to conventional data plotting software because the plotting time increases proportionately with the volume of data. This paper presents FTSPlot, a visualization concept for large-scale time series datasets using techniques from the field of high-performance computer graphics, such as hierarchic level of detail and out-of-core data handling. In a preprocessing step, time series data, event, and interval annotations are converted into an optimized data format, which then permits fast, interactive visualization. The preprocessing step has a computational complexity of O(n log n); the visualization itself runs in constant time per view and is therefore independent of the amount of data. A demonstration prototype has been implemented, and benchmarks show that the technology is capable of displaying large amounts of time series data, event, and interval annotations lag-free, within milliseconds. The current 64-bit implementation theoretically supports datasets of up to 2⁶⁴ bytes; on the x86_64 architecture, up to 2⁴⁸ bytes are currently supported, and benchmarks have been conducted with 2⁴⁰ bytes (1 TiB), i.e., 2³⁷ double-precision samples. The presented software is freely available and can be included as a Qt GUI component in future software projects, providing a standard visualization method for long-term electrophysiological experiments. PMID:24732865

  12. Deducing acidification rates based on short-term time series

    PubMed Central

    Lui, Hon-Kit; Arthur Chen, Chen-Tung

    2015-01-01

    We show that, statistically, the simple linear regression (SLR)-determined rate of temporal change in seawater pH (βpH), the so-called acidification rate, can be expressed as a linear combination of a constant (the estimated rate of temporal change in pH) and SLR-determined rates of temporal changes in other variables (deviation largely due to various sampling distributions), despite complications due to different observation durations and temporal sampling distributions. Observations show that five time series data sets worldwide, with observation times from 9 to 23 years, have yielded βpH values that vary from 1.61 × 10⁻³ to −2.5 × 10⁻³ pH unit yr⁻¹. After correcting for the deviation, these data now all yield an acidification rate similar to what is expected under the air-sea CO₂ equilibrium (−1.6 × 10⁻³ to −1.8 × 10⁻³ pH unit yr⁻¹). Although long-term time series stations may have evenly distributed datasets, shorter time series may suffer large errors which are correctable by this method. PMID:26143749
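
    As a concrete illustration of an SLR-determined rate, the sketch below fits a straight line to a synthetic pH record. All numbers (trend, seasonal amplitude, noise level) are invented to mimic the magnitudes quoted above; the point is simply that the fitted slope is the βpH of the text, and uneven sampling of the seasonal term in a short record shifts it.

```python
import numpy as np

# Hypothetical monthly pH record over 10 years; t in decimal years.
rng = np.random.default_rng(1)
t = 2000 + np.arange(120) / 12.0
pH = (8.10 - 1.7e-3 * (t - t[0])                   # underlying trend
      + 0.02 * np.sin(2 * np.pi * t)               # seasonal cycle
      + rng.normal(0, 0.01, t.size))               # measurement noise

# SLR-determined rate of temporal change in pH (beta_pH, pH unit per year).
beta_pH, intercept = np.polyfit(t, pH, 1)
print(f"beta_pH = {beta_pH:.2e} pH unit/yr")
```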

  13. A multivariate time-series approach to marital interaction

    PubMed Central

    Kupfer, Jörg; Brosig, Burkhard; Brähler, Elmar

    2005-01-01

    Time-series analysis (TSA) is frequently used to clarify complex structures of mutually interacting panel data. The method helps in understanding how the course of a dependent variable is predicted by independent time series with no time lag, as well as by previous observations of that dependent variable (autocorrelation) and of the independent variables (cross-correlation). The study analyzes the marital interaction of a married couple under clinical conditions over a period of 144 days by means of TSA. The data were collected during a course of couple therapy. The male partner was affected by a severe condition of atopic dermatitis, and the woman suffered from bulimia nervosa. Each of the partners completed a mood questionnaire and a body symptom checklist. After the determination of auto- and cross-correlations between and within the parallel data sets, multivariate time-series models were specified. Mutual and individual patterns of emotional reactions explained 14% (skin) and 33% (bulimia) of the total variance in the two dependent variables (adj. R², p<0.0001 for the multivariate models). Whether multivariate TSA models represent a suitable approach to the empirical exploration of clinical marital interaction is discussed. PMID:19742066

  14. A multivariate time-series approach to marital interaction.

    PubMed

    Kupfer, Jörg; Brosig, Burkhard; Brähler, Elmar

    2005-08-02

    Time-series analysis (TSA) is frequently used in order to clarify complex structures of mutually interacting panel data. The method helps in understanding how the course of a dependent variable is predicted by independent time-series with no time lag, as well as by previous observations of that dependent variable (autocorrelation) and of independent variables (cross-correlation). The study analyzes the marital interaction of a married couple under clinical conditions over a period of 144 days by means of TSA. The data were collected within a course of couple therapy. The male partner was affected by a severe condition of atopic dermatitis and the woman suffered from bulimia nervosa. Each of the partners completed a mood questionnaire and a body symptom checklist. After the determination of auto- and cross-correlations between and within the parallel data sets, multivariate time-series models were specified. Mutual and individual patterns of emotional reactions explained 14% (skin) and 33% (bulimia) of the total variance in both dependent variables (adj. R², p<0.0001 for the multivariate models). The question was discussed whether multivariate TSA-models represent a suitable approach to the empirical exploration of clinical marital interaction.

  15. Nonstationary hydrological time series forecasting using nonlinear dynamic methods

    NASA Astrophysics Data System (ADS)

    Coulibaly, Paulin; Baldwin, Connely K.

    2005-06-01

    Recent evidence of nonstationary trends in water resources time series, as a result of natural and/or anthropogenic climate variability and change, has raised interest in nonlinear dynamic system modeling methods. In this study, the effectiveness of dynamically driven recurrent neural networks (RNN) for modeling complex, time-varying water resources systems is investigated. An optimal dynamic RNN approach is proposed to directly forecast different nonstationary hydrological time series; the method automatically selects the optimally trained network in each case. The simulation performance of the dynamic RNN-based model is compared with results obtained from optimal multivariate adaptive regression splines (MARS) models. It is shown that the dynamically driven RNN model can be a good alternative for modeling the complex dynamics of a hydrological system, performing better than the MARS model on the three selected hydrological time series, namely the historical storage volumes of the Great Salt Lake, the Saint Lawrence River flows, and the Nile River flows.

  16. Dynamical Analysis and Visualization of Tornadoes Time Series

    PubMed Central

    2015-01-01

    In this paper we analyze the behavior of tornado time series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long-range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to event size. First, a collection of time series spanning 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions, and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns. PMID:25790281

  17. Dynamical analysis and visualization of tornadoes time series.

    PubMed

    Lopes, António M; Tenreiro Machado, J A

    2015-01-01

    In this paper we analyze the behavior of tornado time series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long-range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to event size. First, a collection of time series spanning 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions, and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns.

  18. Satellite time series analysis using Empirical Mode Decomposition

    NASA Astrophysics Data System (ADS)

    Pannimpullath, R. Renosh; Doolaeghe, Diane; Loisel, Hubert; Vantrepotte, Vincent; Schmitt, Francois G.

    2016-04-01

    Geophysical fields possess large fluctuations over many spatial and temporal scales. Successive satellite images provide an interesting sampling of this spatio-temporal multiscale variability. Here we propose to study this variability by performing satellite time series analysis, pixel by pixel, using Empirical Mode Decomposition (EMD). EMD is a time series analysis technique that decomposes an original time series into a sum of modes, each with a different mean frequency. It can be used to smooth signals and to extract trends; it is built in a data-adaptive way and is able to extract information from nonlinear signals. Here we use MERIS Suspended Particulate Matter (SPM) data, on a weekly basis, over 10 years, giving 458 successive time steps. We selected five different regions of coastal waters for the present study: Vietnam coastal waters, the Brahmaputra region, the St. Lawrence, the English Channel, and the Mackenzie. These regions have high SPM concentrations due to large-scale river runoff. Trend and Hurst exponents are derived for each pixel in each region. Energy is also extracted using Hilbert Spectral Analysis (HSA) together with the EMD method, and the energy of each mode is normalized by the total energy, computed over all modes extracted by the EMD method, for each region.

  19. Financial time series analysis based on information categorization method

    NASA Astrophysics Data System (ADS)

    Tian, Qiang; Shang, Pengjian; Feng, Guochen

    2014-12-01

    The paper mainly applies the information categorization method to analyze financial time series. The method examines the similarity of different sequences by calculating the distances between them. We apply it to quantify the similarity of different stock markets, and we report results on the similarity of the US and Chinese stock markets in the periods 1991-1998 (before the Asian currency crisis), 1999-2006 (after the Asian currency crisis and before the global financial crisis), and 2007-2013 (during and after the global financial crisis). The results show how the similarity between the two stock markets differs across time periods, and that their similarity became larger after these two crises. We also obtain similarity results for 10 stock indices in three regions, showing that the method can distinguish markets of different regions in the resulting phylogenetic trees. The results show that satisfactory information can be extracted from financial markets by this method, which can be applied not only to physiological time series but also to financial time series.

  20. The Puoko-nui CCD Time-Series Photometer

    NASA Astrophysics Data System (ADS)

    Chote, P.; Sullivan, D. J.

    2013-01-01

    Puoko-nui (te reo Maori for ‘big eye’) is a precision time series photometer developed at Victoria University of Wellington, primarily for use with the 1 m McLellan telescope at Mt John University Observatory (MJUO), Lake Tekapo, New Zealand. GPS-based timing provides excellent timing accuracy, and online reduction software processes frames as they are acquired. The user is presented with a simple interface that includes instrument control and an up-to-date light curve and Fourier amplitude spectrum of the target star. Puoko-nui has been operating in its current form since early 2011 and is primarily used to monitor pulsating white dwarf stars.

  1. West Africa land use and land cover time series

    USGS Publications Warehouse

    Cotillon, Suzanne E.

    2017-02-16

    Started in 1999, the West Africa Land Use Dynamics project represents an effort to map land use and land cover, characterize the trends in time and space, and understand their effects on the environment across West Africa. The outcome of the West Africa Land Use Dynamics project is the production of a three-time period (1975, 2000, and 2013) land use and land cover dataset for the Sub-Saharan region of West Africa, including the Cabo Verde archipelago. The West Africa Land Use Land Cover Time Series dataset offers a unique basis for characterizing and analyzing land changes across the region, systematically and at an unprecedented level of detail.

  2. Nonparametric autocovariance estimation from censored time series by Gaussian imputation.

    PubMed

    Park, Jung Wook; Genton, Marc G; Ghosh, Sujit K

    2009-02-01

    One of the most frequently used methods to model the autocovariance function of a second-order stationary time series is to use the parametric framework of autoregressive and moving average models developed by Box and Jenkins. However, such parametric models, though very flexible, may not always be adequate to model autocovariance functions with sharp changes. Furthermore, if the data do not follow the parametric model and are censored at a certain value, the estimation results may not be reliable. We develop a Gaussian imputation method to estimate an autocovariance structure via nonparametric estimation of the autocovariance function in order to address both censoring and incorrect model specification. We demonstrate the effectiveness of the technique in terms of bias and efficiency with simulations under various rates of censoring and underlying models. We describe its application to a time series of silicon concentrations in the Arctic.
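
    A deliberately simplified sketch of the idea follows: left-censored observations are imputed by draws from a Gaussian truncated at the censoring level, and the sample autocovariance is averaged over imputations. The censoring level, lag range, and marginal (rather than conditional) imputation are illustrative assumptions; the method described above conditions the imputation on neighboring observations, which is not reproduced here.

```python
import numpy as np
from scipy.stats import truncnorm

def autocov_censored(x, censor_level, max_lag=20, n_imp=200, seed=0):
    """Sample autocovariance up to max_lag, with left-censored values
    imputed from a Gaussian truncated above at the censoring level."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, float)
    cens = x <= censor_level                        # values stuck at the limit
    mu, sd = x[~cens].mean(), x[~cens].std(ddof=1)  # crude marginal fit
    b = (censor_level - mu) / sd                    # standardized upper bound
    gammas = np.zeros((n_imp, max_lag + 1))
    for k in range(n_imp):
        xi = x.copy()
        xi[cens] = truncnorm.rvs(-np.inf, b, loc=mu, scale=sd,
                                 size=int(cens.sum()), random_state=rng)
        xc = xi - xi.mean()
        for h in range(max_lag + 1):
            gammas[k, h] = np.mean(xc[: len(xc) - h] * xc[h:])
    return gammas.mean(axis=0)                      # average over imputations
```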

  3. Simple Patterns in Fluctuations of Time Series of Economic Interest

    NASA Astrophysics Data System (ADS)

    Fanchiotti, H.; García Canal, C. A.; García Zúñiga, H.

    Time series corresponding to nominal exchange rates between the US dollar and the currencies of Argentina, Brazil, and the European Economic Community; financial indexes such as the Industrial Dow Jones, the British Footsie, the German DAX Composite, the Australian Share Price, and the Nikkei Cash; and different Argentine local tax revenues are analyzed, looking for the appearance of simple patterns and the possible definition of forecast evaluators. In every case, the statistical fractal dimensions are obtained from the behavior of the corresponding variance of increments at a given lag. A detrended fluctuation analysis of the data, in terms of the exponent of the resulting power law, is carried out. Finally, the frequency power spectra of all the time series considered are computed and compared.

  4. Modeling Philippine Stock Exchange Composite Index Using Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Gayo, W. S.; Urrutia, J. D.; Temple, J. M. F.; Sandoval, J. R. D.; Sanglay, J. E. A.

    2015-06-01

    This study was conducted to develop a time series model of the Philippine Stock Exchange Composite Index (PSEi) and its volatility using a finite mixture of ARIMA models with conditional variance equations such as the ARCH, GARCH, EGARCH, TARCH, and PARCH models. The study also aimed to find out which economic variables - Consumer Price Index, crude oil price, foreign exchange rate, gold price, interest rate, money supply, price-earnings ratio, Producers’ Price Index, and terms of trade - can be used in projecting future values of the PSEi; this was examined using the Granger causality test. The findings showed that the best time series model for the Philippine Stock Exchange Composite Index is ARIMA(1,1,5)-ARCH(1). Consumer Price Index, crude oil price, and foreign exchange rate were concluded to Granger-cause the Philippine Stock Exchange Composite Index.

  5. Assessment of correlations and crossover scale in electroseismic time series

    NASA Astrophysics Data System (ADS)

    Guzman-Vargas, L.; Ramírez-Rojas, A.; Angulo-Brown, F.

    2009-04-01

    Evaluating complex fluctuations in electroseismic time series is an important task, not only for earthquake prediction but also for understanding complex processes related to earthquake preparation. Previous studies have reported alterations, such as the emergence of correlated dynamics in geoelectric potentials prior to an important earthquake (EQ). In this work, we apply detrended fluctuation analysis and introduce a statistical procedure to characterize the presence of crossovers in scaling exponents, in order to analyze the fluctuations of geoelectric time series monitored at two sites located in Mexico. We find complex behavior characterized by the presence of a crossover in the correlation exponents in the vicinity of a M=7.4 EQ that occurred on September 14, 1995. Finally, we apply Student's t-test to evaluate the significance of the difference between the short- and large-scale exponents.
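
    A minimal sketch of detrended fluctuation analysis with a simple crossover check follows. The window sizes, the split scale of 64, and the synthetic white-noise input are illustrative assumptions; a statistical test on the difference between the two exponents, as in the abstract, would require window-wise slope estimates rather than the single point estimates printed here.

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis: RMS fluctuation F(s) per window size s."""
    y = np.cumsum(x - np.mean(x))                  # integrated profile
    F = []
    for s in scales:
        n_win = len(y) // s
        ms = []
        for k in range(n_win):
            seg = y[k * s:(k + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrend
            ms.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(ms)))
    return np.asarray(F)

# A crossover shows up as different log-log slopes below and above some scale.
x = np.random.default_rng(2).normal(size=4096)     # white noise: alpha ~ 0.5
scales = np.unique(np.logspace(2, 9, 20, base=2).astype(int))
F = dfa(x, scales)
short = scales <= 64
alpha_short = np.polyfit(np.log(scales[short]), np.log(F[short]), 1)[0]
alpha_long = np.polyfit(np.log(scales[~short]), np.log(F[~short]), 1)[0]
print(alpha_short, alpha_long)
```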

  6. Causal Discovery from Subsampled Time Series Data by Constraint Optimization

    PubMed Central

    Hyttinen, Antti; Plis, Sergey; Järvisalo, Matti; Eberhardt, Frederick; Danks, David

    2017-01-01

    This paper focuses on causal structure estimation from time series data in which measurements are obtained at a coarser timescale than the causal timescale of the underlying system. Previous work has shown that such subsampling can lead to significant errors about the system’s causal structure if not properly taken into account. In this paper, we first consider the search for the system timescale causal structures that correspond to a given measurement timescale structure. We provide a constraint satisfaction procedure whose computational performance is several orders of magnitude better than previous approaches. We then consider finite-sample data as input, and propose the first constraint optimization approach for recovering the system timescale causal structure. This algorithm optimally recovers from possible conflicts due to statistical errors. More generally, these advances allow for a robust and non-parametric estimation of system timescale causal structures from subsampled time series data. PMID:28203316

  7. Interpreting time series of patient satisfaction: macro vs. micro components.

    PubMed

    Frank, Björn; Sudo, Shuichi; Enkawa, Takao

    2009-01-01

    Recent research discovered that economic processes influence national averages of customer satisfaction. Using time-series data from Japanese and South Korean hospitals, we conducted principal component regression analyses to examine whether these findings are transferable to patient satisfaction. Our results reveal that aggregate income has a positive impact, and economic expectations a negative impact, on patient satisfaction. Further analyses demonstrate that these strong economic influences make it difficult for hospital managers to use patient satisfaction scores to assess the performance impact of their customer-oriented actions. In order to improve performance evaluations based on patient surveys, we thus recommend that managers remove economic influences from time series of patient satisfaction.

  8. A Surrogate Test for Pseudo-periodic Time Series Data

    NASA Astrophysics Data System (ADS)

    Small, Michael; Harrison, Robert G.; Tse, C. K.

    2002-07-01

    Standard (linear) surrogate methods are only useful for time series exhibiting no pseudo-periodic structure. We describe a new algorithm that can distinguish between a noisy periodic orbit and deterministic non-periodic inter-cycle dynamics. Possible origins of deterministic non-periodic inter-cycle dynamics include non-periodic linear or nonlinear dynamics, or chaos. The new algorithm is based on mimicking the large-scale dynamics with a local model while obliterating the fine-scale features with dynamic noise. We demonstrate the application of this method to artificial data and experimental time series, including human electrocardiogram (ECG) recordings during sinus rhythm and ventricular tachycardia (VT). The method successfully differentiates between the chaotic Rössler system and a pseudo-periodic realization of the Rössler equations with dynamic noise. Application to ECG data demonstrates that both sinus rhythm and VT exhibit nontrivial inter-cycle dynamics.

  9. Analyzing Single-Molecule Time Series via Nonparametric Bayesian Inference

    PubMed Central

    Hines, Keegan E.; Bankston, John R.; Aldrich, Richard W.

    2015-01-01

    The ability to measure the properties of proteins at the single-molecule level offers an unparalleled glimpse into biological systems at the molecular scale. The interpretation of single-molecule time series has often been rooted in statistical mechanics and the theory of Markov processes. While existing analysis methods have been useful, they are not without significant limitations including problems of model selection and parameter nonidentifiability. To address these challenges, we introduce the use of nonparametric Bayesian inference for the analysis of single-molecule time series. These methods provide a flexible way to extract structure from data instead of assuming models beforehand. We demonstrate these methods with applications to several diverse settings in single-molecule biophysics. This approach provides a well-constrained and rigorously grounded method for determining the number of biophysical states underlying single-molecule data. PMID:25650922

  10. The multiscale analysis between stock market time series

    NASA Astrophysics Data System (ADS)

    Shi, Wenbin; Shang, Pengjian

    2015-11-01

    This paper is devoted to multiscale cross-correlation analysis of stock market time series, applying both the multiscale DCCA cross-correlation coefficient and multiscale cross-sample entropy (MSCE). The multiscale DCCA cross-correlation coefficient is a realization of the DCCA cross-correlation coefficient on multiple scales. The results of this method present a good scaling characterization; more significantly, the method is able to group stock markets by region. Compared to the multiscale DCCA cross-correlation coefficient, MSCE presents a more remarkable scaling characterization, and the value for each log-return series decreases with increasing scale factor; however, its grouping results are not as good as those of the multiscale DCCA cross-correlation coefficient.

  11. Causal Discovery from Subsampled Time Series Data by Constraint Optimization.

    PubMed

    Hyttinen, Antti; Plis, Sergey; Järvisalo, Matti; Eberhardt, Frederick; Danks, David

    2016-08-01

    This paper focuses on causal structure estimation from time series data in which measurements are obtained at a coarser timescale than the causal timescale of the underlying system. Previous work has shown that such subsampling can lead to significant errors about the system's causal structure if not properly taken into account. In this paper, we first consider the search for the system timescale causal structures that correspond to a given measurement timescale structure. We provide a constraint satisfaction procedure whose computational performance is several orders of magnitude better than previous approaches. We then consider finite-sample data as input, and propose the first constraint optimization approach for recovering the system timescale causal structure. This algorithm optimally recovers from possible conflicts due to statistical errors. More generally, these advances allow for a robust and non-parametric estimation of system timescale causal structures from subsampled time series data.

  12. Deviations from uniform power law scaling in nonstationary time series

    NASA Technical Reports Server (NTRS)

    Viswanathan, G. M.; Peng, C. K.; Stanley, H. E.; Goldberger, A. L.

    1997-01-01

    A classic problem in physics is the analysis of highly nonstationary time series that typically exhibit long-range correlations. Here we test the hypothesis that the scaling properties of the dynamics of healthy physiological systems are more stable than those of pathological systems by studying beat-to-beat fluctuations in the human heart rate. We develop techniques based on the Fano factor and Allan factor functions, as well as on detrended fluctuation analysis, for quantifying deviations from uniform power-law scaling in nonstationary time series. By analyzing extremely long data sets of up to N = 10⁵ beats for 11 healthy subjects, we find that the fluctuations in the heart rate scale approximately uniformly over several temporal orders of magnitude. By contrast, we find that in data sets of comparable length for 14 subjects with heart disease, the fluctuations grow erratically, indicating a loss of scaling stability.
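
    The Allan factor mentioned above has a compact definition, and the sketch below computes it for counts in consecutive windows. The renewal point process standing in for heartbeat times, the mean interval, and the window sizes are illustrative assumptions; uniform power-law scaling would appear as A(T) growing as a clean power of T across scales.

```python
import numpy as np

def allan_factor(counts):
    """Allan factor A(T) = E[(N_{k+1} - N_k)^2] / (2 E[N_k]) for event counts
    N_k in consecutive windows of fixed duration T."""
    N = np.asarray(counts, float)
    return np.mean(np.diff(N) ** 2) / (2.0 * np.mean(N))

# Renewal (Poisson-like) event times standing in for heartbeats.
rng = np.random.default_rng(9)
event_times = np.cumsum(rng.exponential(0.8, size=20_000))   # ~0.8 s intervals
for T in (1.0, 10.0, 100.0):
    edges = np.arange(0.0, event_times[-1], T)
    counts, _ = np.histogram(event_times, bins=edges)
    print(T, allan_factor(counts))
```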

  13. A Multiscale Approach to InSAR Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Hetland, E. A.; Muse, P.; Simons, M.; Lin, N.; Dicaprio, C. J.

    2010-12-01

    We present a technique to constrain time-dependent deformation from repeated satellite-based InSAR observations of a given region. This approach, which we call MInTS (Multiscale InSAR Time Series analysis), relies on a spatial wavelet decomposition to permit the inclusion of distance-based spatial correlations in the observations while maintaining computational tractability. As opposed to single-pixel InSAR time series techniques, MInTS takes advantage of both the spatial and temporal characteristics of the deformation field. We use a weighting scheme that accounts for the presence of localized holes, due to decorrelation or unwrapping errors, in any given interferogram. We represent time-dependent deformation using a dictionary of general basis functions, capable of detecting both steady and transient processes. The estimation is regularized using model-resolution-based smoothing, so as to capture rapid deformation where radar acquisitions are temporally dense and to avoid oscillations during time periods devoid of acquisitions. MInTS also has the flexibility to explicitly parametrize known time-dependent processes that are expected to contribute to a given set of observations (e.g., co-seismic steps and post-seismic transients, secular variations, seasonal oscillations, etc.). We use cross-validation to choose the regularization penalty parameter in the inversion for the time-dependent deformation field. We demonstrate MInTS using a set of 63 ERS-1/2 and 29 Envisat interferograms for Long Valley Caldera.

  14. Mode Analysis with Autocorrelation Method (Single Time Series) in Tokamak

    NASA Astrophysics Data System (ADS)

    Saadat, Shervin; Salem, Mohammad K.; Goranneviss, Mahmoud; Khorshid, Pejman

    2010-08-01

    In this paper, plasma modes are analyzed with a statistical method based on the autocorrelation function. Because the autocorrelation function is computed from a single time series, only one Mirnov coil is needed. After autocorrelation analysis of the Mirnov coil data, the spectral density diagram is plotted; its symmetries and trends allow the plasma modes to be analyzed. The effects of RHF fields are investigated with this method in the IR-T1 tokamak, and the results correspond with those of multichannel methods such as SVD and FFT.
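
    The single-sensor idea, estimating the autocorrelation of one Mirnov-coil trace and reading mode activity off its spectral density, can be sketched via the Wiener-Khinchin relation as below. The sampling rate, mode frequency, and noise level are invented for illustration.

```python
import numpy as np

def autocorr_spectrum(x, fs):
    """Power spectral density via the autocorrelation function
    (Wiener-Khinchin), as used for single-coil mode analysis."""
    x = np.asarray(x, float) - np.mean(x)
    n = len(x)
    # Biased autocorrelation estimate for lags 0..n-1.
    acf = np.correlate(x, x, mode='full')[n - 1:] / n
    psd = np.abs(np.fft.rfft(acf))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, psd

# Example: a single coil trace with a dominant mode at 12 kHz plus noise.
fs = 250_000.0
t = np.arange(8192) / fs
sig = np.sin(2 * np.pi * 12_000 * t) + np.random.default_rng(3).normal(0, 0.5, t.size)
freqs, psd = autocorr_spectrum(sig, fs)
print(freqs[np.argmax(psd[1:]) + 1])               # peak near 12 kHz
```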

  15. Time series prediction using artificial neural network for power stabilization

    SciTech Connect

    Puranik, G.; Philip, T.; Nail, B.

    1996-12-31

    Time series prediction has been applied to many business and scientific applications, prominent among them stock market prediction and weather forecasting. Here, the technique is applied to forecasting plasma torch voltages for power stabilization, using a backpropagation artificial neural network. The Extended-Delta-Bar-Delta algorithm is used to improve the convergence rate of the network and to avoid local minima. Results on off-line data were quite promising for on-line use.

  16. Time Series of SST Anomalies Off Western Africa

    DTIC Science & Technology

    2014-09-09

    [Fragmentary record: only figure captions and proceedings front matter are preserved. The paper concerns time series of SST anomalies off western Africa, including a feature extending west-northwest from the vicinity of the Cape of South Africa, locations of surface drifting buoys over January-April 2014, and the evaluation of assimilative ocean forecasts around South Africa with accompanying estimates of forecast uncertainty. Source: GHRSST XV Proceedings, 2-6 June 2014, Cape Town, SA.]

  17. The complexity of carbon flux time series in Europe

    NASA Astrophysics Data System (ADS)

    Lange, Holger; Sippel, Sebastian

    2014-05-01

    Observed geophysical time series usually exhibit pronounced variability, part of which is process-related and deterministic ("signal") while another part is due to random fluctuations ("noise"). Discerning these two sources of fluctuation is notoriously difficult with conventional analysis methods unless sophisticated model assumptions are made. Here, we present an almost parameter-free approach with the potential to distinguish deterministic processes from structured noise, based on ordinal pattern statistics. The method determines one measure of the information content of a time series (Shannon entropy) and two complexity measures, one based on global properties of the order pattern distribution (Jensen-Shannon complexity) and one based on local (derivative) properties (Fisher information or complexity). Each time series is classified via its location in an entropy-complexity plane; using this representation, the method draws a qualitative distinction between different types of natural processes. As a case study, we investigate Gross Primary Productivity (GPP) and respiration, key variables in terrestrial ecosystems that quantify carbon allocation and biomass growth of vegetation. Changes in GPP and ecosystem respiration can be induced by land use change, environmental disasters or extreme events, and changing climate, and numerous attempts to quantify these variables on larger spatial scales exist. Here, we investigate gridded time series at monthly resolution for the European continent, either based on upscaled measurements ("observations") or modelled with two different process-based terrestrial ecosystem models ("simulations"). The complexity analysis is either visualized as maps of Europe showing "hotspots" of complexity for GPP and respiration, or used to provide a detailed observations-simulations and model-model comparison. Values found for information and complexity are compared to those of known artificial reference processes.

  18. Time series regression model for infectious disease and weather.

    PubMed

    Imai, Chisato; Armstrong, Ben; Chalabi, Zaid; Mangtani, Punam; Hashizume, Masahiro

    2015-10-01

    Time series regression has been developed and long used to evaluate the short-term associations of air pollution and weather with mortality or morbidity of non-infectious diseases. The application of the regression approaches from this tradition to infectious diseases, however, is less well explored and raises some new issues. We discuss and present potential solutions for five issues often arising in such analyses: changes in immune population, strong autocorrelations, a wide range of plausible lag structures and association patterns, seasonality adjustments, and large overdispersion. The potential approaches are illustrated with datasets of cholera cases and rainfall from Bangladesh and influenza and temperature in Tokyo. Though this article focuses on the application of the traditional time series regression to infectious diseases and weather factors, we also briefly introduce alternative approaches, including mathematical modeling, wavelet analysis, and autoregressive integrated moving average (ARIMA) models. Modifications proposed to standard time series regression practice include using sums of past cases as proxies for the immune population, and using the logarithm of lagged disease counts to control autocorrelation due to true contagion, both of which are motivated from "susceptible-infectious-recovered" (SIR) models. The complexity of lag structures and association patterns can often be informed by biological mechanisms and explored by using distributed lag non-linear models. For overdispersed models, alternative distribution models such as quasi-Poisson and negative binomial should be considered. Time series regression can be used to investigate dependence of infectious diseases on weather, but may need modifying to allow for features specific to this context.
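
    As a minimal sketch of this kind of regression, the code below fits a quasi-Poisson model of weekly case counts on lagged temperature, seasonal harmonics, and the logarithm of lagged counts. The data are synthetic, and the lag set and harmonic terms are illustrative assumptions; a distributed lag non-linear model would replace the simple lag columns with a richer basis.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic weekly counts and temperature standing in for surveillance data.
rng = np.random.default_rng(4)
n = 260
week = np.arange(n)
temp = 15 + 10 * np.sin(2 * np.pi * week / 52) + rng.normal(0, 2, n)
cases = rng.poisson(np.exp(1.5 + 0.03 * temp))
df = pd.DataFrame({'cases': cases, 'temp': temp, 'week': week})

# Lagged weather terms, seasonal harmonics, and the log of lagged counts as
# an autocorrelation/immunity proxy (cf. the SIR-motivated terms above).
for lag in (1, 2):
    df[f'temp_l{lag}'] = df['temp'].shift(lag)
df['log_lag_cases'] = np.log(df['cases'].shift(1) + 1)
df['sin52'] = np.sin(2 * np.pi * df['week'] / 52)
df['cos52'] = np.cos(2 * np.pi * df['week'] / 52)
df = df.dropna()

X = sm.add_constant(df[['temp_l1', 'temp_l2', 'log_lag_cases', 'sin52', 'cos52']])
# Quasi-Poisson: Poisson mean model, dispersion estimated from Pearson chi2.
fit = sm.GLM(df['cases'], X, family=sm.families.Poisson()).fit(scale='X2')
print(fit.summary())
```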

  19. New Comprehensive System to Construct Speleothem Fabrics Time Series

    NASA Astrophysics Data System (ADS)

    Frisia, S.; Borsato, A.

    2014-12-01

    Speleothem fabrics record processes that influence the way geochemical proxy data are encoded in speleothems; yet there has been little advance in the use of fabrics as a complement to palaeo-proxy datasets since the fabric classification we proposed in 2010. The systematic use of fabric documentation in speleothem science has been limited by the absence of a comprehensive numerical system for constructing fabric time series comparable to the widely used geochemical time series. Documentation of speleothem fabrics is fundamental for a robust interpretation of speleothem time series where stable isotopes and trace elements are used as proxies, because fabrics highlight depositional as well as post-depositional processes whose understanding complements reconstructions based on geochemistry. Here we propose a logic system for transforming microscope observations into numbers tied to acronyms that specify each fabric type and subtype. The rationale for ascribing progressive numbers to fabrics is based on the most up-to-date growth models; in this conceptual framework, the progression reflects hydrological conditions, bio-mediation, and diagenesis. The lowest numbers are given to calcite fabrics formed at relatively constant drip rates: the columnar types (compact and open). Higher numbers are ascribed to columnar fabrics characterized by impurities that cause elongation or lattice distortion (Elongated, Fascicular Optic, and Radiaxial calcites). The sequence progresses with the dendritic fabrics, followed by micrite (M), which has been observed in association with microbial films. Microsparite (Ms) and mosaic calcite (Mc) have the highest numbers, being considered diagenetic. The acronyms and suffixes are intended to become universally acknowledged. Fabrics can thus be plotted against age to yield time series, in which the numbers are replaced by the acronyms, resulting in a visual representation of climate- or environmental change.

  20. Multifractal analysis of time series generated by discrete Ito equations

    SciTech Connect

    Telesca, Luciano; Czechowski, Zbigniew; Lovallo, Michele

    2015-06-15

    In this study, we show that discrete Ito equations with short-tail Gaussian marginal distribution function generate multifractal time series. The multifractality is due to the nonlinear correlations, which are hidden in Markov processes and are generated by the interrelation between the drift and the multiplicative stochastic forces in the Ito equation. A link between the range of the generalized Hurst exponents and the mean of the squares of all averaged net forces is suggested.
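
    The sketch below generates a series from a discrete Ito-type recursion x_{i+1} = x_i + a(x_i) + b(x_i) ε_i with Gaussian noise ε_i. The specific mean-reverting drift and state-dependent diffusion are illustrative stand-ins, not the forms used in the paper, but they reproduce the key ingredient: an interrelation between the drift and a multiplicative stochastic force.

```python
import numpy as np

def discrete_ito(n, drift=-0.05, sigma0=1.0, coupling=0.3, seed=8):
    """Series from a discrete Ito-type recursion
    x_{i+1} = x_i + a(x_i) + b(x_i) * eps_i with Gaussian eps_i."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    for i in range(n - 1):
        a = drift * x[i]                                   # mean-reverting drift
        b = sigma0 / np.sqrt(1.0 + coupling * x[i] ** 2)   # multiplicative force
        x[i + 1] = x[i] + a + b * rng.normal()
    return x

series = discrete_ito(10_000)
print(series.mean(), series.std())
```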

  1. A data-fitting procedure for chaotic time series

    SciTech Connect

    McDonough, J.M.; Mukerji, S.; Chung, S.

    1998-10-01

    In this paper the authors introduce data characterizations for fitting chaotic data to linear combinations of one-dimensional maps (say, of the unit interval) for use in subgrid-scale turbulence models. They test the efficacy of these characterizations on data generated by a chaotically-forced Burgers' equation and demonstrate very satisfactory results in terms of modeled time series, power spectra and delay maps.

  2. A method for detecting complex correlation in time series

    NASA Astrophysics Data System (ADS)

    Alfi, V.; Petri, A.; Pietronero, L.

    2007-06-01

    We propose a new method for detecting complex correlations in time series of limited size. The method is derived from Spitzer's identity and proves to work successfully on different model processes, including the ARCH process, in which pairs of variables are uncorrelated but the three-point correlation function is nonzero. Application to financial data makes it possible to discriminate between dependent and independent stock price returns where standard statistical analysis fails.

  3. Geodetic Time Series: An Overview of UNAVCO Community Resources and Examples of Time Series Analysis Using GPS and Strainmeter Data

    NASA Astrophysics Data System (ADS)

    Phillips, D. A.; Meertens, C. M.; Hodgkinson, K. M.; Puskas, C. M.; Boler, F. M.; Snett, L.; Mattioli, G. S.

    2013-12-01

    We present an overview of time series data, tools, and services available from UNAVCO, along with two specific and compelling examples of geodetic time series analysis. UNAVCO provides a diverse suite of geodetic data products and cyberinfrastructure services to support community research and education. The UNAVCO archive includes data from 2500+ continuous GPS stations, borehole geophysics instruments (strainmeters, seismometers, tiltmeters, pore pressure sensors), and long-baseline laser strainmeters. These data span temporal scales from seconds to decades and provide global spatial coverage, with regionally focused networks including the EarthScope Plate Boundary Observatory (PBO) and COCONet. This rich, open-access dataset is a tremendous resource that enables the exploration, identification, and analysis of time-varying signals associated with crustal deformation, reference frame determinations, isostatic adjustments, atmospheric phenomena, hydrologic processes, and more. UNAVCO provides a suite of time series exploration and analysis resources, including static plots, dynamic plotting tools, and data products and services designed to enhance time series analysis. The PBO GPS network allows for the identification of ~1 mm level deformation signals. At some GPS stations, seasonal signals and longer-term trends in both the vertical and horizontal components can be dominated by the effects of hydrological loading from natural and anthropogenic sources. Modeling of hydrologic deformation using GLDAS and a variety of land surface models (NOAH, MOSAIC, VIC, and CLM) shows promise for independently modeling hydrologic effects and separating them from tectonic deformation as well as anthropogenic loading sources. A major challenge is to identify where loading is dominant, and corrections from GLDAS apply, versus where pumping is the dominant signal and corrections are not possible without some other data. In another arena, the PBO strainmeter network was designed to capture small, short-term strain signals.

  4. Comparison of nonparametric trend analysis according to the types of time series data

    NASA Astrophysics Data System (ADS)

    Heo, J.; Shin, H.; Kim, T.; Jang, H.; Kim, H.

    2013-12-01

    In the analysis of hydrological data, determining the existence of an overall trend due to climate change has been a major concern and an important part of the design and management of water resources for the future. The existence of a trend can be identified by plotting the hydrologic time series, but statistical methods are more accurate and objective tools for performing trend analysis. Statistical methods are divided into parametric and nonparametric methods. Parametric methods assume that the population is normally distributed; however, most hydrological data tend to follow non-normal distributions, so nonparametric methods are considered more suitable. In this study, simulations were performed with different types of time series data, and four nonparametric methods commonly used in trend analysis (the Mann-Kendall test, Spearman's rho test, SEN test, and Hotelling-Pabst test) were applied to assess the power of each. The time series data were classified into three types: Trend+Random, Trend+Cycle+Random, and Trend+Non-random. To introduce a change into the data, 11 different slopes were superimposed in each simulation. The results show that the nonparametric methods have almost identical power for the Trend+Random and Trend+Non-random series, whereas the Mann-Kendall and SEN tests have slightly higher power than the Spearman's rho and Hotelling-Pabst tests for the Trend+Cycle+Random series.
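
    A minimal sketch of the Mann-Kendall test applied to a Trend+Cycle+Random series follows; the normal approximation without tie correction is used, the slope, cycle, and noise parameters are invented for illustration, and a Sen slope estimate would normally accompany the test.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall trend test: S statistic and two-sided p-value using the
    normal approximation (no tie correction, no Sen slope)."""
    x = np.asarray(x, float)
    n = len(x)
    s = 0.0
    for i in range(n - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)       # continuity correction
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2.0 * (1.0 - norm.cdf(abs(z)))
    return s, p

# A Trend+Cycle+Random series in the spirit of the simulation design above.
rng = np.random.default_rng(5)
t = np.arange(200)
x = 0.01 * t + np.sin(2 * np.pi * t / 25) + rng.normal(0, 1, t.size)
print(mann_kendall(x))
```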

  5. Learning restricted Boolean network model by time-series data

    PubMed Central

    2014-01-01

    Restricted Boolean networks are simplified Boolean networks that are required for either negative or positive regulations between genes. Higa et al. (BMC Proc 5:S5, 2011) proposed a three-rule algorithm to infer a restricted Boolean network from time-series data. However, the algorithm suffers from a major drawback, namely, it is very sensitive to noise. In this paper, we systematically analyze the regulatory relationships between genes based on the state switch of the target gene and propose an algorithm with which restricted Boolean networks may be inferred from time-series data. We compare the proposed algorithm with the three-rule algorithm and the best-fit algorithm based on both synthetic networks and a well-studied budding yeast cell cycle network. The performance of the algorithms is evaluated by three distance metrics: the normalized-edge Hamming distance μhame, the normalized Hamming distance of state transition μhamst, and the steady-state distribution distance μssd. Results show that the proposed algorithm outperforms the others according to both μhame and μhamst, whereas its performance according to μssd is intermediate between best-fit and the three-rule algorithms. Thus, our new algorithm is more appropriate for inferring interactions between genes from time-series data. PMID:25093019

  6. AE mapping of engines for spatially located time series

    NASA Astrophysics Data System (ADS)

    Nivesrangsan, P.; Steel, J. A.; Reuben, R. L.

    2005-09-01

    This paper represents the first step towards using multiple acoustic emission (AE) sensors to produce spatially located time series signals for a running engine. By this is meant the decomposition of a multi-source signal: acquiring it with an array of sensors and using source location to reconstitute the individual time series attributable to some or all of the sources. Internal combustion engines are a group of monitoring targets that would benefit from such an approach. A series of experiments has been carried out in which AE from a standard source was mapped for a large number of source-sensor pairs on a small diesel engine and on various cast iron blocks of simple geometry. Wave propagation in a typical diesel engine cylinder head or block is complex because of the heterogeneity of the cast iron and the complex geometry, with variations in wall thickness, boundaries, and discontinuities. The AE signal distortion for a range of source-sensor pairs has been estimated using time-frequency analysis and a reference sensor placed close to the source. At this stage, the emphasis has been on determining a suitable processing scheme to recover a measure of the signal energy that depends only on the distance of the source and not upon the path. Tentative recommendations are made on a suitable approach to sensor positioning and signal processing, with reference to a limited set of data acquired from the running engine.

  7. An Operational Geodatabase Service for Disseminating Raster Time Series Data

    NASA Astrophysics Data System (ADS)

    Asante, K. O.

    2009-12-01

    The volume of raster time series data available for earth science applications is rapidly expanding with improvements in spatial and temporal resolution of earth imaging from remote sensing missions. Current dissemination systems are typically designed for mission efficiency rather than supporting the various needs of diverse user communities. This promotes the building of multiple archives of the same dataset by end users who acquire the skills needed to establish and maintain their own data streams. Such processing often becomes a barrier to the adoption of new datasets. This presentation describes the development of an operational geodatabase service for the dissemination of raster time series. The service combines innovative geocoding schemes with traditional database and geospatial capabilities to facilitate direct access to raster time series. It includes functionality such as search and retrieval, data segmentation, trend analysis and direct integration into third-party applications using predefined data schemas. The service allows end users to interact with data using simple web-based tools without the need for complex data processing skills. A live implementation of the service is demonstrated using sample global environmental datasets.

  8. Toward automatic time-series forecasting using neural networks.

    PubMed

    Yan, Weizhong

    2012-07-01

    Over the past few decades, application of artificial neural networks (ANN) to time-series forecasting (TSF) has been growing rapidly due to several unique features of ANN models. However, to date, consistent ANN performance across different studies has not been achieved. Many factors contribute to this inconsistency. One such factor is that ANN modeling involves determining a large number of design parameters, and the current design practice is essentially heuristic and ad hoc, which does not exploit the full potential of neural networks. Systematic ANN modeling processes and strategies for TSF are therefore greatly needed. Motivated by this need, this paper develops an automatic ANN modeling scheme based on the generalized regression neural network (GRNN), a special type of neural network. By taking advantage of several GRNN properties (i.e., a single design parameter and fast learning) and by incorporating several design strategies (e.g., fusing multiple GRNNs), we have made the proposed modeling scheme effective for modeling large-scale business time series. The initial model was entered into the NN3 time-series competition, where it was awarded the best prediction on the reduced dataset among approximately 60 models submitted by scholars worldwide.
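
    The GRNN itself is compact: a prediction is a Gaussian-kernel weighted average of training targets, with the kernel width sigma as the single design parameter mentioned above. The sketch below applies it to one-step-ahead forecasting with a lag embedding; the embedding length, sigma value, and synthetic series are illustrative assumptions.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma):
    """GRNN prediction: a Gaussian-kernel weighted average of the training
    targets, with kernel width sigma as the single design parameter."""
    X_train = np.atleast_2d(X_train)
    preds = []
    for xq in np.atleast_2d(X_query):
        d2 = np.sum((X_train - xq) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * sigma ** 2))
        preds.append(np.dot(w, y_train) / max(w.sum(), 1e-12))
    return np.asarray(preds)

# One-step-ahead forecasting with a lag embedding of length 4.
rng = np.random.default_rng(6)
series = np.sin(0.2 * np.arange(300)) + rng.normal(0, 0.1, 300)
m = 4
X = np.array([series[i:i + m] for i in range(len(series) - m)])
y = series[m:]
yhat = grnn_predict(X[:250], y[:250], X[250:], sigma=0.5)
print(np.sqrt(np.mean((yhat - y[250:]) ** 2)))     # out-of-sample RMSE
```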

  9. Time series analysis for psychological research: examining and forecasting change.

    PubMed

    Jebb, Andrew T; Tay, Louis; Wang, Wei; Huang, Qiming

    2015-01-01

    Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials.

  10. Time series analysis for psychological research: examining and forecasting change

    PubMed Central

    Jebb, Andrew T.; Tay, Louis; Wang, Wei; Huang, Qiming

    2015-01-01

    Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials. PMID:26106341

  11. Genetic programming and serial processing for time series classification.

    PubMed

    Alfaro-Cid, Eva; Sharman, Ken; Esparcia-Alcázar, Anna I

    2014-01-01

    This work describes an approach devised by the authors for time series classification. In our approach, genetic programming is used in combination with serial processing of the data, where the last output is the result of the classification. The use of genetic programming for classification, although still a field where more research is needed, is not new. However, the application of genetic programming to classification tasks is normally done by treating the input data as a feature vector; to the best of our knowledge, there are no examples in the genetic programming literature of approaches where the time series data are processed serially and the last output is taken as the classification result. The serial processing approach presented here fills this gap in the literature. The approach was tested on three different problems, two of which are real-world problems whose data were gathered for online or conference competitions. As published results exist for these two problems, we can compare the performance of our approach against top-performing methods. The serial processing of data in combination with genetic programming obtained competitive results in both competitions, showing its potential for solving time series classification problems. The main advantage of our serial processing approach is that it can easily handle very large datasets.

  12. Cross-sample entropy of foreign exchange time series

    NASA Astrophysics Data System (ADS)

    Liu, Li-Zhi; Qian, Xi-Yuan; Lu, Heng-Yao

    2010-11-01

    The correlation of foreign exchange rates in currency markets is investigated based on empirical data for DKK/USD, NOK/USD, CAD/USD, JPY/USD, KRW/USD, SGD/USD, THB/USD, and TWD/USD for the period from 1995 to 2002. The cross-SampEn (cross-sample entropy) method is used to compare the returns of each pair of exchange rate time series and assess their degree of asynchrony. The method for calculating the confidence interval of SampEn is extended and applied to cross-SampEn. The cross-SampEn and its confidence interval for each pair of exchange rate series are calculated for the periods 1995-1998 (before the Asian currency crisis) and 1999-2002 (after the Asian currency crisis). The results show that the cross-SampEn of each pair of exchange rates is higher after the Asian currency crisis, indicating greater asynchrony between the exchange rates; for Singapore, Thailand, and Taiwan in particular, the cross-SampEn values after the crisis are significantly higher than those before. Comparison with the correlation coefficient shows that cross-SampEn is superior for describing the correlation between time series.
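
    A minimal sketch of cross-sample entropy follows, assuming the common convention of standardizing both series and counting template matches of lengths m and m+1 under a Chebyshev tolerance r; m = 2 and r = 0.2 are the usual illustrative defaults, and the confidence-interval extension described above is not reproduced.

```python
import numpy as np

def cross_sampen(u, v, m=2, r=0.2):
    """Cross-sample entropy -ln(A/B): B counts template matches of length m
    between u and v, A of length m+1, under a Chebyshev tolerance r."""
    u = (np.asarray(u, float) - np.mean(u)) / np.std(u)
    v = (np.asarray(v, float) - np.mean(v)) / np.std(v)

    def matches(mm):
        count = 0
        # Same template starts for both lengths, as is conventional.
        for i in range(len(u) - m):
            for j in range(len(v) - m):
                if np.max(np.abs(u[i:i + mm] - v[j:j + mm])) <= r:
                    count += 1
        return count

    B, A = matches(m), matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

# Higher cross-SampEn indicates greater asynchrony between the two series.
rng = np.random.default_rng(7)
x = rng.normal(size=500)
y = 0.7 * x + 0.3 * rng.normal(size=500)           # partially synchronized pair
print(cross_sampen(x, y), cross_sampen(x, rng.normal(size=500)))
```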

  13. Characterization of aggressive prostate cancer using ultrasound RF time series

    NASA Astrophysics Data System (ADS)

    Khojaste, Amir; Imani, Farhad; Moradi, Mehdi; Berman, David; Siemens, D. Robert; Sauerberi, Eric E.; Boag, Alexander H.; Abolmaesumi, Purang; Mousavi, Parvin

    2015-03-01

    Prostate cancer is the most prevalently diagnosed cancer and the second leading cause of cancer-related death in North American men. Several approaches have been proposed to augment detection of prostate cancer using different imaging modalities. Due to the advantages of ultrasound imaging, these approaches have been the subject of several recent studies. This paper presents the results of a feasibility study on differentiating between lower and higher grade prostate cancer using ultrasound RF time series data. We also propose new spectral features of RF time series to highlight aggressive prostate cancer in small ROIs of size 1 mm × 1 mm in a cohort of 19 ex vivo specimens of human prostate tissue. In a leave-one-patient-out cross-validation strategy, an area under the accumulated ROC curve of 0.8 was achieved, with overall sensitivity and specificity of 81% and 80%, respectively. The current method shows promising results in differentiating between lower and higher grade prostate cancer using ultrasound RF time series.

  14. The QuakeSim System for GPS Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Granat, R. A.; Gao, X.; Pierce, M.; Wang, J.

    2010-12-01

    We present a system for analysis of GPS time series data available to geosciences users through a web services / web portal interface. The system provides two time series analysis methods, one based on hidden Markov model (HMM) segmentation, the other based on covariance descriptor analysis (CDA). In addition, it provides data pre-processing routines that perform spike noise removal, linear de-trending, sum-of-sines removal, and common mode removal using probabilistic principal components analysis (PPCA). These components can be composed by the user into the desired series of processing steps for analysis through an intuitive graphical interface. The system is accessed through a web portal that allows both micro-scale (individual station) and macro-scale (whole network) exploration of data sets and analysis results via Google Maps. Users can focus in on or scroll through particular spatial or temporal windows, or observe dynamic behavior by creating movies that display the system state. Analysis results can be exported to KML format for easy combination with other sources of data, such as fault databases and InSAR interferograms. GPS solutions for California member stations of the Plate Boundary Observatory from both the SOPAC and JPL GIPSY processing groups are automatically imported into the system as that data becomes available. We show the results of the methods as applied to these data sets for an assortment of case studies, and show how the system can be used to analyze both seismic and aseismic signals.
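
    As one concrete illustration of the pre-processing chain described above, the sketch below removes a network common mode from detrended station residuals. A plain SVD stands in here for the probabilistic PCA (PPCA) the system uses (PPCA additionally handles missing data); the array shapes are assumptions.

        import numpy as np

        def remove_common_mode(residuals):
            """Subtract the leading principal component from a station network.

            residuals: array of shape (n_epochs, n_stations), already
            detrended. The rank-1 reconstruction from the first singular
            vector approximates the network-wide common-mode signal.
            """
            mean = residuals.mean(axis=0)
            x = residuals - mean
            u, s, vt = np.linalg.svd(x, full_matrices=False)
            common = np.outer(u[:, 0] * s[0], vt[0])
            return x - common + mean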

  15. Application of the accurate mass and time tag approach in studies of the human blood lipidome

    PubMed Central

    Ding, Jie; Sorensen, Christina M.; Jaitly, Navdeep; Jiang, Hongliang; Orton, Daniel J.; Monroe, Matthew E.; Moore, Ronald J.; Smith, Richard D.; Metz, Thomas O.

    2008-01-01

    We report a preliminary demonstration of the accurate mass and time (AMT) tag approach for lipidomics. Initial data-dependent LC-MS/MS analyses of human plasma, erythrocyte, and lymphocyte lipids were performed in order to identify lipid molecular species in conjunction with complementary accurate mass and isotopic distribution information. Identified lipids were used to populate initial lipid AMT tag databases containing 250 and 45 entries for those species detected in positive and negative electrospray ionization (ESI) modes, respectively. The positive ESI database was then utilized to identify human plasma, erythrocyte, and lymphocyte lipids in high-throughput LC-MS analyses based on the AMT tag approach. We were able to define the lipid profiles of human plasma, erythrocytes, and lymphocytes based on qualitative and quantitative differences in lipid abundance. PMID:18502191

  16. WARP: accurate retrieval of shapes using phase of fourier descriptors and time warping distance.

    PubMed

    Bartolini, Ilaria; Ciaccia, Paolo; Patella, Marco

    2005-01-01

    Effective and efficient retrieval of similar shapes from large image databases is still a challenging problem in spite of the high relevance that shape information can have in describing image contents. In this paper, we propose a novel Fourier-based approach, called WARP, for matching and retrieving similar shapes. The unique characteristics of WARP are the exploitation of the phase of Fourier coefficients and the use of the Dynamic Time Warping (DTW) distance to compare shape descriptors. While phase information provides a more accurate description of object boundaries than the amplitude of Fourier coefficients alone, the DTW distance permits us to accurately match images even in the presence of (limited) phase shifts. In terms of classical precision/recall measures, we experimentally demonstrate that WARP can gain up to 35 percent in precision at a 20 percent recall level with respect to Fourier-based techniques that use neither phase nor the DTW distance.
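
    For readers unfamiliar with the distance underlying WARP, here is a minimal dynamic-programming sketch of DTW between two one-dimensional descriptor sequences; the unconstrained warping path and absolute-difference cost are simplifying assumptions rather than WARP's exact formulation.

        import numpy as np

        def dtw_distance(a, b):
            """Classic O(len(a) * len(b)) dynamic time warping distance."""
            n, m = len(a), len(b)
            d = np.full((n + 1, m + 1), np.inf)
            d[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = abs(a[i - 1] - b[j - 1])
                    d[i, j] = cost + min(d[i - 1, j],      # insertion
                                         d[i, j - 1],      # deletion
                                         d[i - 1, j - 1])  # match
            return d[n, m]

        print(dtw_distance([0, 1, 2, 1], [0, 0, 1, 2, 1]))  # tolerates the shift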

  17. Robust, automatic GPS station velocities and velocity time series

    NASA Astrophysics Data System (ADS)

    Blewitt, G.; Kreemer, C.; Hammond, W. C.

    2014-12-01

    Automation in GPS coordinate time series analysis makes results more objective and reproducible, but not necessarily as robust as the human eye to detect problems. Moreover, it is not a realistic option to manually scan our current load of >20,000 time series per day. This motivates us to find an automatic way to estimate station velocities that is robust to outliers, discontinuities, seasonality, and noise characteristics (e.g., heteroscedasticity). Here we present a non-parametric method based on the Theil-Sen estimator, defined as the median of velocities v_ij = (x_j - x_i)/(t_j - t_i) computed between all pairs (i, j). Theil-Sen estimators produce statistically identical solutions to ordinary least squares for normally distributed data, but they can tolerate up to 29% of data being problematic. To mitigate seasonality, our proposed estimator only uses pairs approximately separated by an integer number of years, (N - δt) < (t_j - t_i) < (N + δt), where δt is chosen to be small enough to capture seasonality, yet large enough to reduce random error. We fix N=1 to maximally protect against discontinuities. In addition to estimating an overall velocity, we also use these pairs to estimate velocity time series. To test our methods, we process real data sets that have already been used with velocities published in the NA12 reference frame. Accuracy can be tested by the scatter of horizontal velocities in the North American plate interior, which is known to be stable to ~0.3 mm/yr. This presents new opportunities for time series interpretation. For example, the pattern of velocity variations at the interannual scale can help separate tectonic from hydrological processes. Without any step detection, velocity estimates prove to be robust for stations affected by the Mw7.2 2010 El Mayor-Cucapah earthquake, and velocity time series show a clear change after the earthquake, without any of the usual parametric constraints, such as relaxation of postseismic velocities to their preseismic values.
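
    The pair-restricted median the abstract describes can be sketched directly; times in decimal years, the tolerance value, and the toy interface are illustrative assumptions (cf. the MIDAS estimator later published by the same group).

        import numpy as np

        def seasonal_theil_sen(t, x, n_years=1.0, tol=0.1):
            """Median of pairwise velocities over pairs ~N years apart.

            t : observation times in decimal years
            x : coordinate series (e.g., vertical position in mm)
            Restricting pairs to |(t_j - t_i) - n_years| < tol suppresses
            seasonal signals while keeping the estimator non-parametric.
            """
            slopes = []
            for i in range(len(t)):
                for j in range(i + 1, len(t)):
                    dt = t[j] - t[i]
                    if abs(dt - n_years) < tol:
                        slopes.append((x[j] - x[i]) / dt)
            return np.median(slopes) if slopes else np.nan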

  18. A Physiological Time Series Dynamics-Based Approach to Patient Monitoring and Outcome Prediction

    PubMed Central

    Lehman, Li-Wei H.; Adams, Ryan P.; Mayaud, Louis; Moody, George B.; Malhotra, Atul; Mark, Roger G.; Nemati, Shamim

    2015-01-01

    Cardiovascular variables such as heart rate (HR) and blood pressure (BP) are regulated by an underlying control system, and therefore, the time series of these vital signs exhibit rich dynamical patterns of interaction in response to external perturbations (e.g., drug administration), as well as pathological states (e.g., onset of sepsis and hypotension). A question of interest is whether “similar” dynamical patterns can be identified across a heterogeneous patient cohort, and be used for prognosis of patients’ health and progress. In this paper, we used a switching vector autoregressive framework to systematically learn and identify a collection of vital sign time series dynamics, which are possibly recurrent within the same patient and may be shared across the entire cohort. We show that these dynamical behaviors can be used to characterize the physiological “state” of a patient. We validate our technique using simulated time series of the cardiovascular system, and human recordings of HR and BP time series from an orthostatic stress study with known postural states. Using the HR and BP dynamics of an intensive care unit (ICU) cohort of over 450 patients from the MIMIC II database, we demonstrate that the discovered cardiovascular dynamics are significantly associated with hospital mortality (dynamic modes 3 and 9, p = 0.001, p = 0.006 from logistic regression after adjusting for the APACHE scores). Combining the dynamics of BP time series and SAPS-I or APACHE-III provided a more accurate assessment of patient survival/mortality in the hospital than using SAPS-I and APACHE-III alone (p = 0.005 and p = 0.045). Our results suggest that the discovered dynamics of vital sign time series may contain additional prognostic value beyond that of the baseline acuity measures, and can potentially be used as an independent predictor of outcomes in the ICU. PMID:25014976
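
    The paper's method is a switching VAR with modes shared across patients; as a much simpler point of entry, the sketch below fits a single, non-switching VAR to a synthetic two-channel HR/BP series with statsmodels. The lag order and the data are illustrative assumptions.

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.api import VAR

        # Synthetic stand-ins for minute-resolution heart rate and blood pressure.
        rng = np.random.default_rng(0)
        n = 500
        hr = 70 + np.cumsum(rng.normal(0, 0.3, n))
        bp = 120 + 0.2 * (hr - 70) + np.cumsum(rng.normal(0, 0.2, n))
        data = pd.DataFrame({"HR": hr, "BP": bp})

        result = VAR(data).fit(2)                  # one VAR(2); the paper switches
        print(result.coefs.shape)                  # (lags, n_vars, n_vars)
        print(result.forecast(data.values[-2:], steps=5))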

  19. Measuring frequency domain granger causality for multiple blocks of interacting time series.

    PubMed

    Faes, Luca; Nollo, Giandomenico

    2013-04-01

    In the past years, several frequency-domain causality measures based on vector autoregressive time series modeling have been suggested to assess directional connectivity in neural systems. The most followed approaches are based on representing the considered set of multiple time series as a realization of two or three vector-valued processes, yielding the so-called Geweke linear feedback measures, or as a realization of multiple scalar-valued processes, yielding popular measures like the directed coherence (DC) and the partial DC (PDC). In the present study, these two approaches are unified and generalized by proposing novel frequency-domain causality measures which extend the existing measures to the analysis of multiple blocks of time series. Specifically, the block DC (bDC) and block PDC (bPDC) extend DC and PDC to vector-valued processes, while their logarithmic counterparts, denoted as multivariate total feedback [Formula: see text] and direct feedback [Formula: see text], represent in a full multivariate framework the Geweke measures. Theoretical analysis of the proposed measures shows that they: (i) possess desirable properties of causality measures; (ii) are able to reflect either direct causality (bPDC, [Formula: see text]) or total (direct + indirect) causality (bDC, [Formula: see text]) between time series blocks; (iii) reduce to the DC and PDC measures for scalar-valued processes, and to the Geweke measures for pairs of processes; (iv) are able to capture internal dependencies between the scalar constituents of the analyzed vector processes. Numerical analysis showed that the proposed measures can be efficiently estimated from short time series, allow the information derived from the causal analysis of several pairs of time series to be represented in an objective, compact way, and may detect frequency domain causality more accurately than existing measures. The proposed measures find their natural application in the evaluation of directional

  20. The utility of accurate mass and LC elution time information in the analysis of complex proteomes

    PubMed Central

    Norbeck, Angela D.; Monroe, Matthew E.; Adkins, Joshua N.; Smith, Richard D.

    2007-01-01

    Theoretical tryptic digests of all predicted proteins from the genomes of three organisms of varying complexity were evaluated for specificity and possible utility of combined peptide accurate mass and predicted LC normalized elution time (NET) information. The uniqueness of each peptide was evaluated using its combined mass (±5 ppm and ±1 ppm) and NET value (no constraint, ±0.05 and ±0.01 on a 0-1 NET scale). The set of peptides both underestimates actual biological complexity due to the lack of specific modifications, and overestimates the expected complexity since many proteins will not be present in the sample or observable on the mass spectrometer because of dynamic range limitations. Once a peptide is identified from an LC-MS/MS experiment, its mass and elution time are representative of a unique fingerprint for that peptide. The uniqueness of that fingerprint in comparison to those of the other peptides present is indicative of the ability to confidently identify that peptide based on accurate mass and NET measurements. These measurements can be made using HPLC coupled with high resolution MS in a high-throughput manner. Results show that for organisms with comparatively small proteomes, such as Deinococcus radiodurans, modest mass and elution time accuracies are generally adequate for peptide identifications. For more complex proteomes, increasingly accurate measurements are required. However, the majority of proteins should be uniquely identifiable by using LC-MS with mass accuracies within ±1 ppm and elution time measurements within ±0.01 NET. PMID:15979333

  1. Connectionist Architectures for Time Series Prediction of Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Weigend, Andreas Sebastian

    We investigate the effectiveness of connectionist networks for predicting the future continuation of temporal sequences. The problem of overfitting, particularly serious for short records of noisy data, is addressed by the method of weight-elimination: a term penalizing network complexity is added to the usual cost function in back-propagation. We describe the dynamics of the procedure and clarify the meaning of the parameters involved. From a Bayesian perspective, the complexity term can be usefully interpreted as an assumption about the prior distribution of the weights. We analyze three time series. On the benchmark sunspot series, the networks outperform traditional statistical approaches. We show that the network performance does not deteriorate when there are more input units than needed. In the second example, the notoriously noisy foreign exchange rate series, we pick one weekday and one currency (DM vs. US). Given exchange rate information up to and including a Monday, the task is to predict the rate for the following Tuesday. Weight-elimination manages to extract a significant part of the dynamics and makes the solution interpretable. In the third example, the networks predict the resource utilization of a chaotic computational ecosystem for hundreds of steps forward in time.
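
    To make the weight-elimination idea concrete, here is a minimal sketch of the complexity penalty in the form usually quoted for this method, cost = error + lambda * sum((w/w0)^2 / (1 + (w/w0)^2)); the values of lambda and w0 and the training loop this would plug into are assumptions.

        import numpy as np

        def weight_elimination_penalty(w, w0=1.0, lam=1e-4):
            """Complexity term added to the data misfit during back-propagation.

            Weights small relative to w0 are pushed toward zero (the term
            then behaves like a count of nonzero weights), while large
            weights incur a roughly constant cost. Returns the penalty and
            its gradient with respect to the weights.
            """
            r = (w / w0) ** 2
            penalty = lam * np.sum(r / (1.0 + r))
            grad = lam * (2.0 * w / w0 ** 2) / (1.0 + r) ** 2
            return penalty, grad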

  2. Unraveling the cause-effect relation between time series.

    PubMed

    Liang, X San

    2014-11-01

    Given two time series, can one faithfully tell, in a rigorous and quantitative way, the cause and effect between them? Based on a recently rigorized physical notion, namely, information flow, we solve an inverse problem and give this important and challenging question, which is of interest in a wide variety of disciplines, a positive answer. Here causality is measured by the time rate of information flowing from one series to the other. The resulting formula is tight in form, involving only commonly used statistics, namely, sample covariances; an immediate corollary is that causation implies correlation, but correlation does not imply causation. It has been validated with touchstone linear and nonlinear series, purportedly generated with one-way causality that evades the traditional approaches. It has also been applied successfully to the investigation of real-world problems; an example presented here is the cause-and-effect relation between the two climate modes, El Niño and the Indian Ocean Dipole (IOD), which have been linked to hazards in far-flung regions of the globe. In general, the two modes are mutually causal, but the causality is asymmetric: El Niño tends to stabilize IOD, while IOD functions to make El Niño more uncertain. To El Niño, the information flowing from IOD manifests itself as a propagation of uncertainty from the Indian Ocean.
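
    A minimal sketch of the information-flow rate T(2→1) as commonly implemented from the sample-covariance formula associated with this line of work is given below; the unit time step, the Euler forward difference standing in for the derivative, and the interface are assumptions, and the expression should be checked against the paper before serious use.

        import numpy as np

        def liang_information_flow(x1, x2, dt=1.0):
            """Estimate the rate of information flowing from x2 to x1.

            Uses only sample covariances; dx1 is the Euler forward
            difference of x1. A nonzero result indicates causation in the
            information-flow sense; zero is consistent with no causation.
            """
            dx1 = (x1[1:] - x1[:-1]) / dt
            x1, x2 = x1[:-1], x2[:-1]
            c11 = np.var(x1, ddof=1)
            c22 = np.var(x2, ddof=1)
            c12 = np.cov(x1, x2)[0, 1]
            c1d1 = np.cov(x1, dx1)[0, 1]
            c2d1 = np.cov(x2, dx1)[0, 1]
            return (c11 * c12 * c2d1 - c12 ** 2 * c1d1) / \
                   (c11 ** 2 * c22 - c11 * c12 ** 2)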

  3. Time-series animation techniques for visualizing urban growth

    USGS Publications Warehouse

    Acevedo, W.; Masuoka, P.

    1997-01-01

    Time-series animation is a visually intuitive way to display urban growth. Animations of land-use change for the Baltimore-Washington region were generated by showing a series of images one after the other in sequential order. Before creating an animation, various issues which will affect the appearance of the animation should be considered, including the number of original data frames to use, the optimal animation display speed, the number of intermediate frames to create between the known frames, and the output media on which the animations will be displayed. To create new frames between the known years of data, the change in each theme (i.e., urban development, water bodies, transportation routes) must be characterized and an algorithm developed to create the in-between frames. Example time-series animations were created using a temporal GIS database of the Baltimore-Washington area. Creating the animations involved generating raster images of the urban development, water bodies, and principal transportation routes; overlaying the raster images on a background image; and importing the frames to a movie file. Three-dimensional perspective animations were created by draping each image over digital elevation data prior to importing the frames to a movie file. © 1997 Elsevier Science Ltd.

  4. STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS

    SciTech Connect

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-02-20

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it, an improved and generalized version of Bayesian Blocks, that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research, all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.

  5. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it, an improved and generalized version of Bayesian Blocks [Scargle 1998], that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by [Arias-Castro, Donoho and Huo 2003]. In the spirit of Reproducible Research [Donoho et al. 2008], all of the code and data necessary to reproduce all of the figures in this paper are included as auxiliary material.
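
    The optimal segmentation is found with a simple O(N^2) dynamic program; below is a minimal sketch for binned count data under a Poisson block fitness, with the prior penalty per change point (ncp_prior) left as a tunable assumption.

        import numpy as np

        def bayesian_blocks_counts(counts, widths, ncp_prior=4.0):
            """Optimal change points for binned counts (dynamic-program sketch).

            counts : events per bin; widths : bin widths (exposure).
            A block with N events and total length T has fitness
            N * (log N - log T); each extra block costs ncp_prior.
            """
            n = len(counts)
            best = np.zeros(n)
            last = np.zeros(n, dtype=int)
            for r in range(n):
                # Events and lengths of every candidate block j..r.
                n_blk = np.cumsum(counts[:r + 1][::-1])[::-1]
                t_blk = np.cumsum(widths[:r + 1][::-1])[::-1]
                fit = n_blk * (np.log(np.maximum(n_blk, 1)) - np.log(t_blk))
                fit -= ncp_prior
                fit[1:] += best[:r]
                last[r] = int(np.argmax(fit))
                best[r] = fit[last[r]]
            # Backtrack the block start indices.
            cps, r = [], n
            while r > 0:
                cps.append(last[r - 1])
                r = last[r - 1]
            return cps[::-1]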

  6. An accurate assay for HCV based on real-time fluorescence detection of isothermal RNA amplification.

    PubMed

    Wu, Xuping; Wang, Jianfang; Song, Jinyun; Li, Jiayan; Yang, Yongfeng

    2016-09-01

    Hepatitis C virus (HCV) is one of the common causes of liver fibrosis and hepatocellular carcinoma (HCC). Early, rapid and accurate HCV RNA detection is important for preventing and controlling liver disease. A simultaneous amplification and testing (SAT) assay, which is based on isothermal amplification of RNA and real-time fluorescence detection, was designed to optimize routine HCV RNA detection. In this study, HCV RNA and an internal control (IC) were amplified and analyzed simultaneously by the SAT assay, with fluorescence detected using routine real-time PCR equipment. The assay detected as few as 10 copies of HCV RNA transcripts. We tested 705 serum samples with SAT, among which 96.4% (680/705) showed results consistent with routine real-time PCR. About 92% (23/25) of the discordant samples were confirmed to give the same results as SAT-HCV by a second real-time PCR. The sensitivity and specificity of the SAT-HCV assay were 99.6% (461/463) and 100% (242/242), respectively. In conclusion, the SAT assay is an accurate test with high specificity and sensitivity which may increase the detection rate of HCV. It is therefore a promising tool for diagnosing HCV infection.

  7. Time-Series Analysis of Supergranule Characteristics at Solar Minimum

    NASA Technical Reports Server (NTRS)

    Williams, Peter E.; Pesnell, W. Dean

    2013-01-01

    Sixty days of Doppler images from the Solar and Heliospheric Observatory (SOHO) / Michelson Doppler Imager (MDI) investigation during the 1996 and 2008 solar minima have been analyzed to show that certain supergranule characteristics (size, size range, and horizontal velocity) exhibit fluctuations of three to five days. Cross-correlating parameters showed a good, positive correlation between supergranulation size and size range, and a moderate, negative correlation between size range and velocity. The size and velocity do exhibit a moderate, negative correlation, but with a small time lag (less than 12 hours). Supergranule sizes during five days of co-temporal data from MDI and the Solar Dynamics Observatory (SDO) / Helioseismic Magnetic Imager (HMI) exhibit similar fluctuations with a high level of correlation between them. This verifies the solar origin of the fluctuations, which cannot be caused by instrumental artifacts according to these observations. Similar fluctuations are also observed in data simulations that model the evolution of the MDI Doppler pattern over a 60-day period. Correlations between the supergranule size and size range time-series derived from the simulated data are similar to those seen in MDI data. A simple toy-model using cumulative, uncorrelated exponential growth and decay patterns at random emergence times produces a time-series similar to the data simulations. The qualitative similarities between the simulated and the observed time-series suggest that the fluctuations arise from stochastic processes occurring within the solar convection zone. This behavior, propagating to surface manifestations of supergranulation, may assist our understanding of magnetic-field-line advection, evolution, and interaction.

  8. Inverse sequential procedures for the monitoring of time series

    NASA Technical Reports Server (NTRS)

    Radok, Uwe; Brown, Timothy J.

    1995-01-01

    When one or more new values are added to a developing time series, they change its descriptive parameters (mean, variance, trend, coherence). A 'change index (CI)' is developed as a quantitative indicator that the changed parameters remain compatible with the existing 'base' data. CI formulae are derived, in terms of normalized likelihood ratios, for small samples from Poisson, Gaussian, and Chi-Square distributions, and for regression coefficients measuring linear or exponential trends. A substantial parameter change creates a rapid or abrupt CI decrease which persists when the length of the base is changed. Except for a special Gaussian case, the CI has no simple explicit regions for tests of hypotheses. However, its design ensures that the series sampled need not conform strictly to the distribution form assumed for the parameter estimates. The use of the CI is illustrated with both constructed and observed data samples, processed with the Fortran code 'Sequitor'.

  9. Time accurate application of the MacCormack 2-4 scheme on massively parallel computers

    NASA Technical Reports Server (NTRS)

    Hudson, Dale A.; Long, Lyle N.

    1995-01-01

    Many recent computational efforts in turbulence and acoustics research have used higher order numerical algorithms. One popular method has been the explicit MacCormack 2-4 scheme. The MacCormack 2-4 scheme is second order accurate in time and fourth order accurate in space, and is stable for CFL numbers below 2/3. Current research has shown that the method can give accurate results but does exhibit significant Gibbs phenomena at sharp discontinuities. The impact of adding Jameson-type second, third, and fourth order artificial viscosity was examined here. Category 2 problems, the nonlinear traveling wave and the Riemann problem, were computed using a CFL number of 0.25. This research found that dispersion errors can be significantly reduced or nearly eliminated by using a combination of second and third order terms in the damping. Use of second and fourth order terms reduced the magnitude of dispersion errors but not as effectively as the second and third order combination. The program was coded using Thinking Machines' CM Fortran, a variant of Fortran 90/High Performance Fortran, and was executed on a 2K CM-200. Simple extrapolation boundary conditions were used for both problems.
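
    For reference, one step of the 2-4 scheme in the Gottlieb-Turkel form (second order in time, fourth order in space after the predictor-corrector average) can be sketched for linear advection with periodic boundaries; the wave speed, grid, and flux simplification are assumptions.

        import numpy as np

        def maccormack24_step(u, c, dx, dt):
            """One 2-4 MacCormack step for u_t + c u_x = 0 (periodic).

            Forward-biased one-sided differences in the predictor,
            backward-biased in the corrector; averaging cancels the leading
            truncation error. Stable for CFL = c*dt/dx below about 2/3.
            """
            lam = c * dt / dx
            # Predictor: derivative ~ (-u[i+2] + 8 u[i+1] - 7 u[i]) / (6 dx)
            dup = (-np.roll(u, -2) + 8 * np.roll(u, -1) - 7 * u) / 6.0
            up = u - lam * dup
            # Corrector: derivative ~ (7 u*[i] - 8 u*[i-1] + u*[i-2]) / (6 dx)
            duc = (7 * up - 8 * np.roll(up, 1) + np.roll(up, 2)) / 6.0
            return 0.5 * (u + up - lam * duc)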

  10. Monitoring Forest Regrowth Using a Multi-Platform Time Series

    NASA Technical Reports Server (NTRS)

    Sabol, Donald E., Jr.; Smith, Milton O.; Adams, John B.; Gillespie, Alan R.; Tucker, Compton J.

    1996-01-01

    Over the past 50 years, the forests of western Washington and Oregon have been extensively harvested for timber. This has resulted in a heterogeneous mosaic of remaining mature forests, clear-cuts, new plantations, and second-growth stands that now occur in areas that formerly were dominated by extensive old-growth forests and younger forests resulting from fire disturbance. Traditionally, determinations of seral stage and stand condition have been made using aerial photography and spot field observations, a methodology that is not only time- and resource-intensive, but falls short of providing current information on a regional scale. These limitations may be solved, in part, through the use of multispectral images which can cover large areas at spatial resolutions on the order of tens of meters. The use of multiple images comprising a time series potentially can be used to monitor land use (e.g., cutting and replanting), and to observe natural processes such as regeneration, maturation and phenologic change. These processes are more likely to be spectrally observed in a time series composed of images taken during different seasons over a long period of time. Therefore, for many areas, it may be necessary to use a variety of images taken with different imaging systems. A common framework for interpretation is needed that reduces topographic, atmospheric, and instrumental effects, as well as differences in lighting geometry between images. The present state of remote-sensing technology in general use does not realize the full potential of the multispectral data in areas of high topographic relief. For example, the primary method for analyzing images of forested landscapes in the Northwest has been with statistical classifiers (e.g., parallelepiped, nearest-neighbor, maximum likelihood, etc.), often applied to uncalibrated multispectral data. Although this approach has produced useful information from individual images in some areas, landcover classes defined by these

  11. Dynamical complexity of short and noisy time series - Compression-Complexity vs. Shannon entropy

    NASA Astrophysics Data System (ADS)

    Nagaraj, Nithin; Balasubramanian, Karthi

    2017-01-01

    Shannon entropy has been extensively used for characterizing the complexity of time series arising from chaotic dynamical systems and stochastic processes such as Markov chains. However, for short and noisy time series, Shannon entropy performs poorly. Complexity measures which are based on lossless compression algorithms are a good substitute in such scenarios. We evaluate the performance of two such Compression-Complexity Measures, namely Lempel-Ziv complexity (LZ) and Effort-To-Compress (ETC), on short time series from chaotic dynamical systems in the presence of noise. Both LZ and ETC outperform Shannon entropy (H) in accurately characterizing the dynamical complexity of such systems. For very short binary sequences (which arise in neuroscience applications), ETC has a higher number of distinct complexity values than LZ and H, thus enabling a finer resolution. For two-state ergodic Markov chains, we empirically show that ETC converges to a steady state value faster than LZ. Compression-Complexity measures are promising for applications which involve short and noisy time series.
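
    For reference, the LZ complexity used above can be computed for a binary sequence with the widely used Kaspar-Schuster scan; the example usage is an assumption, and normalization (e.g., by n / log2 n) is left out.

        def lz76_complexity(s):
            """Number of distinct phrases in the LZ76 parsing of s.

            s: string or list of symbols, e.g. '0110100110010110'.
            """
            n = len(s)
            if n < 2:
                return n
            c, l, i, k, k_max = 1, 1, 0, 1, 1
            while True:
                if s[i + k - 1] == s[l + k - 1]:
                    k += 1
                    if l + k > n:          # current phrase runs off the end
                        c += 1
                        break
                else:
                    k_max = max(k_max, k)
                    i += 1
                    if i == l:             # no earlier match extends: new phrase
                        c += 1
                        l += k_max
                        if l + 1 > n:
                            break
                        i, k, k_max = 0, 1, 1
                    else:
                        k = 1
            return c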

  12. Loading effects in GPS vertical displacement time series

    NASA Astrophysics Data System (ADS)

    Memin, A.; Boy, J. P.; Santamaría-Gómez, A.; Watson, C.; Gravelle, M.; Tregoning, P.

    2015-12-01

    Surface deformations due to loading, for which no comprehensive representation yet exists, account for a significant part of the variability in geodetic time series. We assess the effects of loading on GPS vertical displacement time series in several frequency bands. We compare displacements derived from up-to-date loading models to two global sets of positioning time series, and investigate how the series variability is reduced at interannual periods (> 2 months), intermediate periods (> 7 days) and over the whole spectrum (> 1 day). We assess the impact of interannual loading on estimating velocities. We compute atmospheric loading effects using surface pressure fields from the ECMWF. We use the inverted barometer (IB) hypothesis, valid for periods exceeding a week, to describe the ocean response to the pressure forcing. We use general circulation ocean models (ECCO and GLORYS) to account for wind, heat and fresh water fluxes. We separately use the Toulouse Unstructured Grid Ocean model (TUGO-m), forced by air pressure and winds, to represent the dynamics of the ocean response at high frequencies. The continental water storage is described using the GLDAS/Noah and MERRA-land models. Non-hydrology loading reduces the variability of the observed vertical displacement differently according to the frequency band. The hydrology loading leads to a further reduction, mostly at annual periods. ECMWF+TUGO-m agrees better with vertical surface motion than the ECMWF+IB model at all frequencies. The interannual deformation is time-correlated at most of the locations. It is adequately described by a power-law process with spectral index varying from -1.5 to -0.2. Depending on the power-law parameters, the predicted non-linear deformation due to mass loading variations leads to vertical velocity biases of up to 0.7 mm/yr when estimated from 5 years of continuous observations. The maximum velocity bias can reach up to 1 mm/yr in regions around the southern Tropical band.

  13. Homogenization of historical time series on a subdaily scale

    NASA Astrophysics Data System (ADS)

    Kocen, Renate; Brönnimann, Stefan; Breda, Leila; Spadin, Reto; Begert, Michael; Füllemann, Christine

    2010-05-01

    Homogeneous long-term climatological time series provide useful information on climate back to the preindustrial era. High temporal resolution of climate data is desirable to address trends and variability in the mean climate and in climatic extremes. For Switzerland, three long (~250 yr) historical time series (Basel, Geneva, Gr. St. Bernhard) that were hitherto available in the form of monthly means only have recently been digitized (in cooperation with MeteoSwiss) on a subdaily scale. The digitized time series contain subdaily data (from two to five measurements per day) on temperature, precipitation/snow height, pressure and humidity, as well as subdaily descriptions of wind direction, wind speed and cloud cover. Long-term climatological records often contain inhomogeneities due to non-climatic changes such as station relocations, changes in instrumentation and instrument exposure, changes in observing schedules/practices, and environmental changes in the proximity of the observation site. Those disturbances can distort or hide the true climatic signal and could seriously affect the correct assessment and analysis of climate trends, variability and climatic extremes. It is therefore crucial to detect and eliminate artificial shifts and trends, to the extent possible, in the climate data prior to its application. Detailed information on the station history and instruments (metadata) can be of fundamental importance in the process of homogenization in order to support the determination of the exact time of inhomogeneities and the interpretation of statistical test results. While similar methods can be used for the detection of inhomogeneities in subdaily or monthly mean data, quite different correction methods can be chosen. The wealth of information in a high temporal resolution allows more physics-based correction methods. For instance, a detected radiation error in temperature can be corrected with an error model that incorporates radiation and ventilation terms using

  14. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    NASA Technical Reports Server (NTRS)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

    The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. Using the TSPT program, MODIS metadata is used to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify

  15. Discovering significant evolution patterns from satellite image time series.

    PubMed

    Petitjean, François; Masseglia, Florent; Gançarski, Pierre; Forestier, Germain

    2011-12-01

    Satellite Image Time Series (SITS) provide us with precious information on land cover evolution. By studying these series of images we can both understand the changes of specific areas and discover global phenomena that spread over larger areas. Changes that can occur throughout the sensing time can spread over very long periods and may have different start and end times depending on the location, which complicates the mining and the analysis of series of images. This work focuses on frequent sequential pattern mining (FSPM) methods, since this family of methods fits the above-mentioned issues. This family of methods consists of finding the most frequent evolution behaviors, and is actually able to extract long-term changes as well as short-term ones, whenever the change may start and end. However, applying FSPM methods to SITS implies confronting two main challenges, related to the characteristics of SITS and the domain's constraints. First, satellite images associate multiple measures with a single pixel (the radiometric levels of different wavelengths corresponding to infra-red, red, etc.), which makes the search space multi-dimensional and thus requires specific mining algorithms. Furthermore, the non-evolving regions, which are the vast majority and overwhelm the evolving ones, challenge the discovery of these patterns. We propose a SITS mining framework that enables discovery of these patterns despite these constraints and characteristics. Our proposal is inspired by FSPM and provides a relevant visualization principle. Experiments carried out on 35 images sensed over 20 years show the proposed approach makes it possible to extract relevant evolution behaviors.

  16. A Framework and Algorithms for Multivariate Time Series Analytics (MTSA): Learning, Monitoring, and Recommendation

    ERIC Educational Resources Information Center

    Ngan, Chun-Kit

    2013-01-01

    Making decisions over multivariate time series is an important topic which has gained significant interest in the past decade. A time series is a sequence of data points which are measured and ordered over uniform time intervals. A multivariate time series is a set of multiple, related time series in a particular domain in which domain experts…

  17. Improvement in global forecast for chaotic time series

    NASA Astrophysics Data System (ADS)

    Alves, P. R. L.; Duarte, L. G. S.; da Mota, L. A. C. P.

    2016-10-01

    In the Polynomial Global Approach to Time Series Analysis, the most computationally costly step is finding the fitting polynomial. Here we present two routines that improve the forecasting. In the first, an algorithm that greatly improves this step is introduced and implemented; its core is a routine that performs the mapping with great efficiency. In comparison with the similar procedure of the TimeS package developed by Carli et al. (2014), an enormous gain in efficiency and an increase in accuracy are obtained. Another development in this work is the establishment of a level of confidence in the global prediction, with a statistical test for evaluating whether the minimization performed is suitable or not. The other program presented in this article applies the Shapiro-Wilk test for checking the normality of the distribution of errors and calculates the expected deviation. The development is applied to observed and simulated time series to illustrate the performance obtained.
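
    The normality check described above is nearly a one-liner with scipy; in the sketch below the residuals are placeholders standing in for one-step forecast errors from the fitted polynomial map.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        residuals = rng.normal(0.0, 0.05, size=200)   # placeholder forecast errors

        w_stat, p_value = stats.shapiro(residuals)
        if p_value > 0.05:
            # Errors look Gaussian, so their standard deviation is a
            # meaningful expected deviation for the forecast.
            print(f"normal errors (W={w_stat:.3f}), expected deviation "
                  f"{residuals.std(ddof=1):.4f}")
        else:
            print(f"non-normal errors (p={p_value:.3g}); interpret bands with care")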

  18. Long-term time series prediction using OP-ELM.

    PubMed

    Grigorievskiy, Alexander; Miche, Yoan; Ventelä, Anne-Mari; Séverin, Eric; Lendasse, Amaury

    2014-03-01

    In this paper, an Optimally Pruned Extreme Learning Machine (OP-ELM) is applied to the problem of long-term time series prediction. Three known strategies for long-term time series prediction, i.e., Recursive, Direct and DirRec, are considered in combination with OP-ELM and compared with a baseline linear least squares model and Least-Squares Support Vector Machines (LS-SVM). Among these three strategies, DirRec is the most time-consuming, and its usage with nonlinear models like LS-SVM, where several hyperparameters need to be adjusted, leads to relatively heavy computations. It is shown that OP-ELM, being also a nonlinear model, allows reasonable computational time for the DirRec strategy. In all our experiments except one, OP-ELM with the DirRec strategy outperforms the linear model with any strategy. In contrast to the proposed algorithm, LS-SVM behaves unstably without variable selection. It is also shown that there is no superior strategy for OP-ELM: any of the three can be the best. In addition, the prediction accuracy of an ensemble of OP-ELMs is studied and it is shown that averaging the predictions of the ensemble can improve the accuracy (Mean Square Error) dramatically.
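
    The strategy distinction is easy to make concrete. In the sketch below, a plain linear regressor stands in for OP-ELM; window length, horizon, and the interface are assumptions. Recursive trains one one-step model and feeds predictions back in; Direct trains a separate model per horizon step.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        def forecast_recursive(x, window, horizon):
            """One one-step model; feed each prediction back as an input."""
            X = np.array([x[i:i + window] for i in range(len(x) - window)])
            model = LinearRegression().fit(X, x[window:])
            hist = list(x[-window:])
            for _ in range(horizon):
                hist.append(model.predict([hist[-window:]])[0])
            return hist[-horizon:]

        def forecast_direct(x, window, horizon):
            """A separate model per step h, each trained on true values."""
            preds = []
            for h in range(1, horizon + 1):
                X = np.array([x[i:i + window]
                              for i in range(len(x) - window - h + 1)])
                model = LinearRegression().fit(X, x[window + h - 1:])
                preds.append(model.predict([x[-window:]])[0])
            return preds

    DirRec (not shown) retrains a new model at each step while appending the previous prediction to the input window, which is why it is the most expensive of the three strategies.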

  19. On the use of spring baseflow recession for a more accurate parameterization of aquifer transit time distribution functions

    NASA Astrophysics Data System (ADS)

    Farlin, J.; Maloszewski, P.

    2013-05-01

    Baseflow recession analysis and groundwater dating have up to now developed as two distinct branches of hydrogeology and have been used to solve entirely different problems. We show that by combining two classical models, namely the Boussinesq equation describing spring baseflow recession, and the exponential piston-flow model used in groundwater dating studies, the parameters describing the transit time distribution of an aquifer can in some cases be estimated to a far more accurate degree than with the latter alone. Under the assumption that the aquifer basis is sub-horizontal, the mean transit time of water in the saturated zone can be estimated from spring baseflow recession. This provides an independent estimate of groundwater transit time that can refine those obtained from tritium measurements. The approach is illustrated in a case study predicting the atrazine concentration trend in a series of springs draining the fractured-rock aquifer known as the Luxembourg Sandstone. A transport model calibrated on tritium measurements alone predicted different times to trend reversal following the nationwide ban on atrazine in 2005, with different rates of decrease. For some of the springs, the actual time of trend reversal and the rate of change agreed extremely well with the model calibrated using both tritium measurements and the recession of spring discharge during the dry season. The agreement between predicted and observed values was, however, poorer for the springs displaying the most gentle recessions, possibly indicating a stronger influence of continuous groundwater recharge during the summer months.
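
    For reference, the transit time density of the exponential piston-flow model combined here can be sketched as commonly given in the groundwater-dating literature (after Maloszewski and Zuber); the parameter values are assumptions, and eta is the ratio of total to exponentially mixed aquifer volume (eta = 1 recovers the purely exponential model).

        import numpy as np

        def epm_transit_time_density(tau, tau_mean, eta):
            """Exponential piston-flow model (EPM) transit time distribution.

            g(tau) = (eta / tau_mean) * exp(-eta * tau / tau_mean + eta - 1)
            for tau >= tau_mean * (1 - 1/eta), and 0 for earlier times
            (the piston-flow delay). The mean of g is tau_mean.
            """
            tau = np.asarray(tau, dtype=float)
            g = (eta / tau_mean) * np.exp(-eta * tau / tau_mean + eta - 1.0)
            return np.where(tau >= tau_mean * (1.0 - 1.0 / eta), g, 0.0)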

  20. On the use of spring baseflow recession for a more accurate parameterization of aquifer transit time distribution functions

    NASA Astrophysics Data System (ADS)

    Farlin, J.; Maloszewski, P.

    2012-12-01

    Baseflow recession analysis and groundwater dating have up to now developed as two distinct branches of hydrogeology and were used to solve entirely different problems. We show that by combining two classical models, namely Boussinesq's equation describing spring baseflow recession and the exponential piston-flow model used in groundwater dating studies, the parameters describing the transit time distribution of an aquifer can in some cases be estimated to a far more accurate degree than with the latter alone. Under the assumption that the aquifer basis is sub-horizontal, the mean residence time of water in the saturated zone can be estimated from spring baseflow recession. This provides an independent estimate of groundwater residence time that can refine those obtained from tritium measurements. This approach is demonstrated in a case study predicting the atrazine concentration trend in a series of springs draining the fractured-rock aquifer known as the Luxembourg Sandstone. A transport model calibrated on tritium measurements alone predicted different times to trend reversal following the nationwide ban on atrazine in 2005, with different rates of decrease. For some of the springs, the best agreement between observed and predicted time of trend reversal was reached for the model calibrated using both tritium measurements and the recession of spring discharge during the dry season. The agreement between predicted and observed values was, however, poorer for the springs displaying the most gentle recessions, possibly indicating a stronger influence of continuous groundwater recharge during the dry period.

  1. Time resolved diffuse optical spectroscopy with geometrically accurate models for bulk parameter recovery

    PubMed Central

    Guggenheim, James A.; Bargigia, Ilaria; Farina, Andrea; Pifferi, Antonio; Dehghani, Hamid

    2016-01-01

    A novel, straightforward, accessible and efficient approach is presented for performing hyperspectral time-domain diffuse optical spectroscopy to determine the optical properties of samples accurately using geometry-specific models. To allow bulk parameter recovery from measured spectra, a set of libraries based on a numerical model of the domain being investigated is developed, as opposed to the conventional approach of using an analytical semi-infinite slab approximation, which is known and shown to introduce boundary effects. Results demonstrate that the method improves the accuracy of derived spectrally varying optical properties over the use of the semi-infinite approximation. PMID:27699137

  2. Nonlinear Aeroelastic Analysis Using a Time-Accurate Navier-Stokes Equations Solver

    NASA Technical Reports Server (NTRS)

    Kuruvila, Geojoe; Bartels, Robert E.; Hong, Moeljo S.; Bhatia, G.

    2007-01-01

    A method to simulate limit cycle oscillation (LCO) due to control surface freeplay using a modified CFL3D, a time-accurate Navier-Stokes computational fluid dynamics (CFD) analysis code with structural modeling capability, is presented. This approach can be used to analyze the aeroelastic response of aircraft with structural behavior characterized by nonlinearity in the force versus displacement curve. A limited validation of the method, using very low Mach number experimental data for a three-degrees-of-freedom (pitch/plunge/flap deflection) airfoil model with flap freeplay, is also presented.

  3. Evaluating the capability of time-of-flight cameras for accurately imaging a cyclically loaded beam

    NASA Astrophysics Data System (ADS)

    Lahamy, Hervé; Lichti, Derek; El-Badry, Mamdouh; Qi, Xiaojuan; Detchev, Ivan; Steward, Jeremy; Moravvej, Mohammad

    2015-05-01

    Time-of-flight cameras are used for diverse applications ranging from human-machine interfaces and gaming to robotics and earth topography. This paper aims at evaluating the capability of the Mesa Imaging SR4000 and the Microsoft Kinect 2.0 time-of-flight cameras for accurately imaging the top surface of a concrete beam subjected to fatigue loading in laboratory conditions. Whereas previous work has demonstrated the success of such sensors for measuring the response at point locations, the aim here is to measure the entire beam surface in support of the overall objective of evaluating the effectiveness of concrete beam reinforcement with steel fibre reinforced polymer sheets. After applying corrections for lens distortions to the data and differencing images over time to remove systematic errors due to internal scattering, the periodic deflections experienced by the beam have been estimated for the entire top surface of the beam and at attached witness plates. The results have been assessed by comparison with measurements from highly accurate laser displacement transducers. This study concludes that both the Microsoft Kinect 2.0 and the Mesa Imaging SR4000 are capable of sensing a moving surface with sub-millimeter accuracy once the image distortions have been modeled and removed.

  4. Accurate Time/Frequency Transfer Method Using Bi-Directional WDM Transmission

    NASA Technical Reports Server (NTRS)

    Imaoka, Atsushi; Kihara, Masami

    1996-01-01

    An accurate time transfer method is proposed using bi-directional wavelength division multiplexing (WDM) signal transmission along a single optical fiber. This method will be used in digital telecommunication networks and yield a time synchronization accuracy of better than 1 ns for long transmission lines over several tens of kilometers. The method can accurately measure the difference in delay between the two wavelength signals, caused by the chromatic dispersion of the fiber, that arises in conventional simple bi-directional dual-wavelength frequency transfer methods. We describe the characteristics of this difference in delay and then show that a delay measurement accuracy below 0.1 ns can be obtained by transmitting 156 Mb/s time reference signals at 1.31 micrometers and 1.55 micrometers along a 50 km fiber using the proposed method. The sub-nanosecond delay measurement using simple bi-directional dual-wavelength transmission along a 100 km fiber with a wavelength spacing of 1 nm in the 1.55 micrometer range is also shown.

  5. An Accurate Timing Alignment Method with Time-to-Digital Converter Linearity Calibration for High-Resolution TOF PET.

    PubMed

    Li, Hongdi; Wang, Chao; An, Shaohui; Lu, Xingyu; Dong, Yun; Liu, Shitao; Baghaei, Hossain; Zhang, Yuxuan; Ramirez, Rocio; Wong, Wai-Hoi

    2015-06-01

    Accurate PET system timing alignment minimizes the coincidence time window and therefore reduces random events and improves image quality. It is also critical for time-of-flight (TOF) image reconstruction. Here, we use a thin annular cylinder (shell) phantom filled with a radioactive source and located axially and centrally in a PET camera for the timing alignment of a TOF PET system. This timing alignment method involves measuring the time differences between the selected coincidence detector pairs, calibrating the differential and integral nonlinearity of the time-to-digital converter (TDC) with the same raw data and deriving the intrinsic time biases for each detector using an iterative algorithm. The raw time bias for each detector is downloaded to the front-end electronics and the residual fine time bias can be applied during the TOF list-mode reconstruction. Our results showed that a timing alignment accuracy of better than ±25 ps can be achieved, and a preliminary timing resolution of 473 ps (full width at half maximum) was measured in our prototype TOF PET/CT system.

  6. An Accurate Timing Alignment Method with Time-to-Digital Converter Linearity Calibration for High-Resolution TOF PET

    PubMed Central

    Li, Hongdi; Wang, Chao; An, Shaohui; Lu, Xingyu; Dong, Yun; Liu, Shitao; Baghaei, Hossain; Zhang, Yuxuan; Ramirez, Rocio; Wong, Wai-Hoi

    2015-01-01

    Accurate PET system timing alignment minimizes the coincidence time window and therefore reduces random events and improves image quality. It is also critical for time-of-flight (TOF) image reconstruction. Here, we use a thin annular cylinder (shell) phantom filled with a radioactive source and located axially and centrally in a PET camera for the timing alignment of a TOF PET system. This timing alignment method involves measuring the time differences between the selected coincidence detector pairs, calibrating the differential and integral nonlinearity of the time-to-digital converter (TDC) with the same raw data and deriving the intrinsic time biases for each detector using an iterative algorithm. The raw time bias for each detector is downloaded to the front-end electronics and the residual fine time bias can be applied during the TOF list-mode reconstruction. Our results showed that a timing alignment accuracy of better than ±25 ps can be achieved, and a preliminary timing resolution of 473 ps (full width at half maximum) was measured in our prototype TOF PET/CT system. PMID:26543243

  7. Synthesis of rainfall time series in a high temporal resolution

    NASA Astrophysics Data System (ADS)

    Callau Poduje, Ana Claudia; Haberlandt, Uwe

    2014-05-01

    In order to optimize the design and operation of urban drainage systems, long and continuous rain series in a high temporal resolution are essential. As the length of rainfall records is often short, particularly for data available at the temporal and regional resolutions required for urban hydrology, it is necessary to find some numerical representation of the precipitation phenomenon to generate long synthetic rainfall series. An Alternating Renewal Model (ARM) is applied for this purpose, which consists of two structures: external and internal. The former is the sequence of wet and dry spells, described by their durations, which are simulated stochastically. The internal structure is characterized by the amount of rain corresponding to each wet spell and its distribution within the spell. A multivariate frequency analysis is applied to analyze the internal structure of the wet spells and to generate synthetic events. The stochastic time series must reproduce the statistical characteristics of the observed high-resolution precipitation measurements used to generate them. The spatio-temporal interdependencies between stations are addressed by resampling the continuous synthetic series based on the Simulated Annealing (SA) procedure. The state of Lower Saxony and surrounding areas, located in the north-west of Germany, is used to develop the ARM. A total of 26 rainfall stations with high temporal resolution records, i.e. rainfall data every 5 minutes, are used to define the events, find the most suitable probability distributions, calibrate the corresponding parameters, simulate long synthetic series and evaluate the results. The length of the available data ranges from 10 to 20 years. The rainfall series involved in the different steps of calculation are compared using a rainfall-runoff model to simulate the runoff behavior in urban areas. The EPA Storm Water Management Model (SWMM) is applied for this evaluation. The results show a good representation of the
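
    The external structure of such a model is simple to simulate; in the sketch below, exponential spell durations and gamma-distributed per-spell rain depths are illustrative assumptions, whereas the study fits distributions to the 5-minute records and additionally disaggregates rain within each wet spell.

        import numpy as np

        rng = np.random.default_rng(42)

        def simulate_arm_external(n_spells, mean_dry=600.0, mean_wet=60.0,
                                  mean_depth=2.0):
            """Alternating renewal model, external structure only.

            Durations are in 5-minute steps: dry spells ~ Exp(mean_dry),
            wet spells ~ Exp(mean_wet); each wet spell receives a gamma
            rain depth in mm. Returns (kind, duration, depth) tuples.
            """
            spells = []
            for k in range(n_spells):
                if k % 2 == 0:
                    spells.append(("dry", rng.exponential(mean_dry), 0.0))
                else:
                    depth = rng.gamma(shape=1.5, scale=mean_depth / 1.5)
                    spells.append(("wet", rng.exponential(mean_wet), depth))
            return spells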

  8. Time series predictions with neural nets: Application to airborne pollen forecasting

    NASA Astrophysics Data System (ADS)

    Arizmendi, C. M.; Sanchez, J. R.; Ramos, N. E.; Ramos, G. I.

    1993-09-01

    Pollen allergy is a common disease causing rhinoconjunctivitis (hay fever) in 5-10% of the population. Medical studies have indicated that pollen-related diseases could be greatly reduced if future pollen contents in the air could be predicted. In this work we have developed a new forecasting method that applies the ability of neural nets to predict the future behaviour of chaotic systems in order to make accurate predictions of the airborne pollen concentration. The method requires that the neural net be fed with non-zero values, which restricts the method's predictions to the period following the start of pollen flight. The operational method outlined here constitutes a different point of view with respect to the more generally used forecasts of time series analysis, which require the input of many meteorological parameters. Excellent forecasts were obtained by training a neural net using only the time series of pollen concentration values.

  9. Predictive time-series modeling using artificial neural networks for Linac beam symmetry: an empirical study.

    PubMed

    Li, Qiongge; Chan, Maria F

    2017-01-01

    Over half of cancer patients receive radiotherapy (RT) as partial or full cancer treatment. Daily quality assurance (QA) of RT in cancer treatment closely monitors the performance of the medical linear accelerator (Linac) and is critical for continuous improvement of patient safety and quality of care. Cumulative longitudinal QA measurements are valuable for understanding the behavior of the Linac and allow physicists to identify trends in the output and take preventive actions. In this study, artificial neural networks (ANNs) and autoregressive moving average (ARMA) time-series prediction modeling techniques were both applied to 5-year daily Linac QA data. Verification tests and other evaluations were then performed for all models. Preliminary results showed that ANN time-series predictive modeling has more advantages over ARMA techniques for accurate and effective applicability in the dosimetry and QA field.
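
    For orientation, the ARMA side of such a comparison can be reproduced with a few lines of statsmodels; the series below is simulated (not clinical QA data) and the model order is an arbitrary illustrative choice.

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      # Fit an ARIMA baseline to a synthetic daily QA-like series with slow drift.
      rng = np.random.default_rng(7)
      n = 365 * 5
      qa = 100 + 0.0005 * np.arange(n) + rng.normal(0, 0.2, n)

      train, test = qa[:-30], qa[-30:]
      fit = ARIMA(train, order=(2, 1, 1)).fit()    # order chosen for illustration
      forecast = fit.forecast(steps=30)
      print("30-day RMSE:", np.sqrt(np.mean((forecast - test) ** 2)))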

  10. Mapping Brazilian savanna vegetation gradients with Landsat time series

    NASA Astrophysics Data System (ADS)

    Schwieder, Marcel; Leitão, Pedro J.; da Cunha Bustamante, Mercedes Maria; Ferreira, Laerte Guimarães; Rabe, Andreas; Hostert, Patrick

    2016-10-01

    Global change has tremendous impacts on savanna systems around the world. Processes related to climate change or agricultural expansion threaten the ecosystem's state, function and the services it provides. A prominent example is the Brazilian Cerrado that has an extent of around 2 million km2 and features high biodiversity with many endemic species. It is characterized by landscape patterns from open grasslands to dense forests, defining a heterogeneous gradient in vegetation structure throughout the biome. While it is undisputed that the Cerrado provides a multitude of valuable ecosystem services, it is exposed to changes, e.g. through large scale land conversions or climatic changes. Monitoring of the Cerrado is thus urgently needed to assess the state of the system as well as to analyze and further understand ecosystem responses and adaptations to ongoing changes. Therefore, we explored the potential of dense Landsat time series to derive phenological information for mapping vegetation gradients in the Cerrado. Frequent data gaps, e.g. due to cloud contamination, impose a serious challenge for such time series analyses. We synthetically filled data gaps based on Radial Basis Function convolution filters to derive continuous pixel-wise temporal profiles capable of representing Land Surface Phenology (LSP). Derived phenological parameters revealed differences in the seasonal cycle between the main Cerrado physiognomies and could thus be used to calibrate a Support Vector Classification model to map their spatial distribution. Our results show that it is possible to map the main spatial patterns of the observed physiognomies based on their phenological differences, although inaccuracies occurred, especially between similar classes and in data-scarce areas. The outcome emphasizes the need for remote sensing based time series analyses at fine scales. Mapping heterogeneous ecosystems such as savannas requires spatial detail, as well as the ability to derive important
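
    The gap-filling step can be illustrated with a one-dimensional Radial Basis Function convolution filter: each missing time step receives a Gaussian-weighted average of the valid observations. This is a toy version of the pixel-wise profiles described above, with an invented seasonal signal and cloud-gap pattern.

      import numpy as np

      def rbf_fill(t, y, sigma=16.0):
          """Replace NaN gaps in y with a Gaussian-RBF-weighted mean of valid points."""
          valid = ~np.isnan(y)
          filled = y.copy()
          for i in np.flatnonzero(~valid):
              w = np.exp(-0.5 * ((t[valid] - t[i]) / sigma) ** 2)
              filled[i] = np.sum(w * y[valid]) / np.sum(w)
          return filled

      t = np.arange(0, 365, 8, dtype=float)           # 8-day compositing interval
      ndvi = 0.5 + 0.3 * np.sin(2 * np.pi * t / 365)  # synthetic seasonal profile
      ndvi[np.random.default_rng(3).random(t.size) < 0.3] = np.nan  # cloud gaps
      print(np.round(rbf_fill(t, ndvi)[:10], 3))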

  11. Aerosol Climate Time Series Evaluation In ESA Aerosol_cci

    NASA Astrophysics Data System (ADS)

    Popp, T.; de Leeuw, G.; Pinnock, S.

    2015-12-01

    Within the ESA Climate Change Initiative (CCI), Aerosol_cci (2010 - 2017) conducts intensive work to improve algorithms for the retrieval of aerosol information from European sensors. By the end of 2015 full mission time series of 2 GCOS-required aerosol parameters are completely validated and released: Aerosol Optical Depth (AOD) from dual view ATSR-2 / AATSR radiometers (3 algorithms, 1995 - 2012), and stratospheric extinction profiles from the star occultation GOMOS spectrometer (2002 - 2012). Additionally, a 35-year multi-sensor time series of the qualitative Absorbing Aerosol Index (AAI) together with sensitivity information and an AAI model simulator is available. Complementary aerosol properties requested by GCOS are in a "round robin" phase, where various algorithms are inter-compared: fine mode AOD, mineral dust AOD (from the thermal IASI spectrometer), absorption information and aerosol layer height. As a quasi-reference for validation in a few selected regions with sparse ground-based observations, the multi-pixel GRASP algorithm for the POLDER instrument is used. Validation of first dataset versions (vs. AERONET, MAN) and inter-comparison to other satellite datasets (MODIS, MISR, SeaWIFS) proved the high quality of the available datasets, comparable to other satellite retrievals, and revealed needs for algorithm improvement (for example for higher AOD values), which were taken into account for a reprocessing. The datasets contain pixel level uncertainty estimates which are also validated. The paper will summarize and discuss the results of major reprocessing and validation conducted in 2015. The focus will be on the ATSR, GOMOS and IASI datasets. Validation of pixel level uncertainties will be summarized and discussed, including unknown components and their potential usefulness and limitations. Opportunities for time series extension with successor instruments of the Sentinel family will be described, as well as the complementarity of the different satellite aerosol products

  12. Financial Time Series Prediction Using Elman Recurrent Random Neural Networks

    PubMed Central

    Wang, Jie; Wang, Jun; Fang, Wen; Niu, Hongli

    2016-01-01

    In recent years, financial market dynamics forecasting has been a focus of economic research. To predict the price indices of stock markets, we developed an architecture which combined Elman recurrent neural networks with a stochastic time effective function. By analyzing the proposed model with linear regression, complexity invariant distance (CID), and multiscale CID (MCID) analysis methods, and comparing it with different models such as the backpropagation neural network (BPNN), the stochastic time effective neural network (STNN), and the Elman recurrent neural network (ERNN), the empirical results show that the proposed neural network displays the best performance among these neural networks in financial time series forecasting. Further, the empirical research tests the predictive effects on the SSE, TWSE, KOSPI, and Nikkei225 indices with the established model, and the corresponding statistical comparisons of the above market indices are also exhibited. The experimental results show that this approach gives good performance in predicting the values of the stock market indices. PMID:27293423

  13. Assemblage time series reveal biodiversity change but not systematic loss.

    PubMed

    Dornelas, Maria; Gotelli, Nicholas J; McGill, Brian; Shimadzu, Hideyasu; Moyes, Faye; Sievers, Caya; Magurran, Anne E

    2014-04-18

    The extent to which biodiversity change in local assemblages contributes to global biodiversity loss is poorly understood. We analyzed 100 time series from biomes across Earth to ask how diversity within assemblages is changing through time. We quantified patterns of temporal α diversity, measured as change in local diversity, and temporal β diversity, measured as change in community composition. Contrary to our expectations, we did not detect systematic loss of α diversity. However, community composition changed systematically through time, in excess of predictions from null models. Heterogeneous rates of environmental change, species range shifts associated with climate change, and biotic homogenization may explain the different patterns of temporal α and β diversity. Monitoring and understanding change in species composition should be a conservation priority.

  14. Hardware and Software Developments for the Accurate Time-Linked Data Acquisition System

    SciTech Connect

    BERG,DALE E.; RUMSEY,MARK A.; ZAYAS,JOSE R.

    1999-11-09

    Wind-energy researchers at Sandia National Laboratories have developed a new, light-weight, modular data acquisition system capable of acquiring long-term, continuous, multi-channel time-series data from operating wind-turbines. New hardware features have been added to this system to make it more flexible and permit programming via telemetry. User-friendly Windows-based software has been developed for programming the hardware and acquiring, storing, analyzing, and archiving the data. This paper briefly reviews the major components of the system, summarizes the recent hardware enhancements and operating experiences, and discusses the features and capabilities of the software programs that have been developed.

  15. Artificial neural networks applied to forecasting time series.

    PubMed

    Montaño Moreno, Juan J; Palmer Pol, Alfonso; Muñoz Gracia, Pilar

    2011-04-01

    This study offers a description and comparison of the main models of Artificial Neural Networks (ANN) which have proved to be useful in time series forecasting, and also a standard procedure for the practical application of ANN in this type of task. The Multilayer Perceptron (MLP), Radial Basis Function (RBF), Generalized Regression Neural Network (GRNN), and Recurrent Neural Network (RNN) models are analyzed. With this aim in mind, we use a time series made up of 244 time points. A comparative study establishes that the error made by the four neural network models analyzed is less than 10%. In accordance with the interpretation criteria of this performance, it can be concluded that the neural network models show a close fit regarding their forecasting capacity. The model with the best performance is the RBF, followed by the RNN and MLP. The GRNN model is the one with the worst performance. Finally, we analyze the advantages and limitations of ANN, the possible solutions to these limitations, and provide an orientation towards future research.

  16. Monthly hail time series analysis related to agricultural insurance

    NASA Astrophysics Data System (ADS)

    Tarquis, Ana M.; Saa, Antonio; Gascó, Gabriel; Díaz, M. C.; Garcia Moreno, M. R.; Burgaz, F.

    2010-05-01

    Hail is one of the most important perils covered by crop insurance in Spain, accounting for more than 50% of the total insurance in cereal crops. The purpose of the present study is to analyze hail damage in cereals. Four provinces were chosen, those with the highest production values: Burgos and Zaragoza for wheat, and Cuenca and Valladolid for barley. The data available for studying the evolution and intensity of hail damage include an analysis of the correlation between the agricultural insurance ratios provided by ENESA and the number of annual hail days (from 1981 to 2007). In addition, several weather stations per province were selected for having the longest and most complete records (from 1963 to 2007) to perform an analysis of monthly time series of the number of hail days (HD). The results of the study show that the relation between the agricultural insurance ratios and the number of hail days is not clear. Several observations are discussed to explain these results, as well as whether it is possible to determine a change in tendency in the HD time series.

  17. On the maximum-entropy/autoregressive modeling of time series

    NASA Technical Reports Server (NTRS)

    Chao, B. F.

    1984-01-01

    The autoregressive (AR) model of a random process is interpreted in the light of Prony's relation, which relates a complex conjugate pair of poles of the AR process in the z-plane (or the z domain) on the one hand, to the complex frequency of one complex harmonic function in the time domain on the other. Thus the AR model of a time series is one that models the time series as a linear combination of complex harmonic functions, which include pure sinusoids and real exponentials as special cases. An AR model is completely determined by its z-domain pole configuration. The maximum-entropy/autoregressive (ME/AR) spectrum, defined on the unit circle of the z-plane (or the frequency domain), is merely a convenient, but ambiguous, visual representation. It is asserted that the position and shape of a spectral peak are determined by the corresponding complex frequency, and the height of the spectral peak contains little information about the complex amplitude of the complex harmonic functions.

  18. A four-stage hybrid model for hydrological time series forecasting.

    PubMed

    Di, Chongli; Yang, Xiaohua; Wang, Xiaochao

    2014-01-01

    Hydrological time series forecasting remains a difficult task due to its complicated nonlinear, non-stationary and multi-scale characteristics. To solve this difficulty and improve the prediction accuracy, a novel four-stage hybrid model is proposed for hydrological time series forecasting based on the principle of 'denoising, decomposition and ensemble'. The proposed model has four stages, i.e., denoising, decomposition, components prediction and ensemble. In the denoising stage, the empirical mode decomposition (EMD) method is utilized to reduce the noises in the hydrological time series. Then, an improved method of EMD, the ensemble empirical mode decomposition (EEMD), is applied to decompose the denoised series into a number of intrinsic mode function (IMF) components and one residual component. Next, the radial basis function neural network (RBFNN) is adopted to predict the trend of all of the components obtained in the decomposition stage. In the final ensemble prediction stage, the forecasting results of all of the IMF and residual components obtained in the third stage are combined to generate the final prediction results, using a linear neural network (LNN) model. For illustration and verification, six hydrological cases with different characteristics are used to test the effectiveness of the proposed model. The proposed hybrid model performs better than conventional single models, the hybrid models without denoising or decomposition and the hybrid models based on other methods, such as the wavelet analysis (WA)-based hybrid models. In addition, the denoising and decomposition strategies decrease the complexity of the series and reduce the difficulties of the forecasting. With its effective denoising and accurate decomposition ability, high prediction precision and wide applicability, the new model is very promising for complex time series forecasting. This new forecast model is an extension of nonlinear prediction models.
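
    A rough sketch of the 'decomposition and ensemble' principle is given below. It assumes the third-party PyEMD package (PyPI name EMD-signal) for EEMD, and substitutes a trivial persistence forecast for the paper's RBF neural network component predictor, so it illustrates the pipeline shape rather than the paper's full method.

      import numpy as np
      from PyEMD import EEMD            # assumption: pip install EMD-signal

      # Decompose a synthetic flow-like series, "forecast" each component with
      # persistence, and recombine the component forecasts.
      rng = np.random.default_rng(5)
      t = np.linspace(0, 10, 500)
      flow = (np.sin(2 * np.pi * t) + 0.5 * np.sin(9 * np.pi * t)
              + 0.1 * rng.standard_normal(t.size))

      imfs = EEMD(trials=50).eemd(flow)          # IMFs plus residual trend
      one_step = sum(imf[-1] for imf in imfs)    # naive component-wise forecast
      print("components:", imfs.shape[0], "one-step forecast:", round(float(one_step), 3))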

  19. Albedo Pattern Recognition and Time-Series Analyses in Malaysia

    NASA Astrophysics Data System (ADS)

    Salleh, S. A.; Abd Latif, Z.; Mohd, W. M. N. Wan; Chan, A.

    2012-07-01

    Pattern recognition and time-series analyses enable one to evaluate and generate predictions of specific phenomena. Albedo pattern and time-series analyses are particularly useful for climate condition monitoring. This study was conducted to identify albedo pattern changes in Malaysia. The patterns and their changes will be useful for a variety of environmental and climate monitoring studies, such as carbon budgeting and aerosol mapping. Ten years (2000-2009) of MODIS satellite images were used for the analyses and interpretation. These images were processed using ERDAS Imagine remote sensing software, ArcGIS 9.3, the 6S code for atmospheric calibration and several MODIS tools (MRT, HDF2GIS, Albedo tools). Several methods for time-series analyses were explored; this paper demonstrates trend and seasonal time-series analyses using the HDF-format MODIS MCD43A3 albedo land product. The results reveal significant changes in albedo percentages over the past 10 years and relate the pattern to Malaysia's nebulosity index (NI) and aerosol optical depth (AOD). A noticeable trend can be identified with regard to the maximum and minimum values of the albedo. The rise and fall of the line graph show a similar trend to the daily observations, differing in the magnitude of the rises and falls of albedo. Thus, it can be concluded that the temporal behavior of land surface albedo in Malaysia is uniform with respect to the local monsoons. However, although the average albedo shows a linear trend with the nebulosity index, the pattern changes of albedo with respect to the nebulosity index indicate that external factors also influence the albedo values, as the plotted sky conditions and diffusion do not show a uniform trend over the years, especially when the trend at 5-year intervals is examined; 2000 shows a high negative linear

  20. FROG: Time Series Analysis for the Web Service Era

    NASA Astrophysics Data System (ADS)

    Allan, A.

    2005-12-01

    The FROG application is part of the next generation Starlink{http://www.starlink.ac.uk} software work (Draper et al. 2005) and released under the GNU Public License{http://www.gnu.org/copyleft/gpl.html} (GPL). Written in Java, it has been designed for the Web and Grid Service era as an extensible, pluggable tool for time series analysis and display. With an integrated SOAP server, the package's functionality is exposed to the user for use in their own code, and can be used remotely over the Grid, as part of the Virtual Observatory (VO).

  1. A Mixed Exponential Time Series Model. NMEARMA(p,q).

    DTIC Science & Technology

    1980-03-01

    A Mixed Exponential Time Series Model, NMEARMA(p,q), by A. J. Lawrance (University of Birmingham, Birmingham, England) and P. A. W. Lewis (Naval Postgraduate School, Monterey, California, USA). Report AD-A085 316, March 1980.

  2. Ensemble Deep Learning for Biomedical Time Series Classification

    PubMed Central

    2016-01-01

    Ensemble learning has been proved to improve the generalization ability effectively in both theory and practice. In this paper, we briefly outline the current status of research on it first. Then, a new deep neural network-based ensemble method that integrates filtering views, local views, distorted views, explicit training, implicit training, subview prediction, and Simple Average is proposed for biomedical time series classification. Finally, we validate its effectiveness on the Chinese Cardiovascular Disease Database containing a large number of electrocardiogram recordings. The experimental results show that the proposed method has certain advantages compared to some well-known ensemble methods, such as Bagging and AdaBoost. PMID:27725828

  3. Modified correlation entropy estimation for a noisy chaotic time series.

    PubMed

    Jayawardena, A W; Xu, Pengcheng; Li, W K

    2010-06-01

    A method of estimating the Kolmogorov-Sinai (KS) entropy, herein referred to as the modified correlation entropy, is presented. The method can be applied to both noise-free and noisy chaotic time series. It has been applied to some clean and noisy data sets and the numerical results show that the modified correlation entropy is closer to the KS entropy of the nonlinear system calculated by the Lyapunov spectrum than the general correlation entropy. Moreover, the modified correlation entropy is more robust to noise than the correlation entropy.
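
    For orientation, the general correlation entropy that the modified estimator improves upon can be approximated from correlation sums at consecutive embedding dimensions, K2 ~ ln(C_m(r)/C_{m+1}(r)). The sketch below uses an invented stand-in series and an arbitrary tolerance r; it is not the paper's modified estimator.

      import numpy as np

      def corr_sum(x, m, r, lag=1):
          """Correlation sum C_m(r) with Chebyshev distances, self-pairs excluded."""
          emb = np.column_stack([x[i * lag: len(x) - (m - 1 - i) * lag]
                                 for i in range(m)])
          d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
          n = len(emb)
          return (np.sum(d < r) - n) / (n * (n - 1))

      rng = np.random.default_rng(2)
      x = np.cumsum(rng.standard_normal(600))   # stand-in series
      x = (x - x.mean()) / x.std()
      r = 0.5
      print("K2 estimate:", np.log(corr_sum(x, 3, r) / corr_sum(x, 4, r)))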

  4. Multifractal Time Series Analysis Based on Detrended Fluctuation Analysis

    NASA Astrophysics Data System (ADS)

    Kantelhardt, Jan; Stanley, H. Eugene; Zschiegner, Stephan; Bunde, Armin; Koscielny-Bunde, Eva; Havlin, Shlomo

    2002-03-01

    In order to develop an easily applicable method for the multifractal characterization of non-stationary time series, we generalize the detrended fluctuation analysis (DFA), which is a well-established method for the determination of monofractal scaling properties and the detection of long-range correlations. We relate the new multifractal DFA method to the standard partition function-based multifractal formalism, and compare it to the wavelet transform modulus maxima (WTMM) method, which is a well-established, but more difficult, procedure for this purpose. We employ the multifractal DFA method to determine whether the heart rhythm during different sleep stages is characterized by different multifractal properties.
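
    A compact numpy sketch of the multifractal DFA procedure follows; the generalized Hurst exponent h(q) is the slope of log F_q(s) versus log s. For a monofractal test signal such as white noise, h(q) should stay near 0.5 for all q; the window sizes and q values here are arbitrary.

      import numpy as np

      def mfdfa(x, scales, q_values, order=1):
          """q-th order fluctuation functions and generalized Hurst exponents h(q)."""
          profile = np.cumsum(x - np.mean(x))
          hq = []
          for q in q_values:
              F = []
              for s in scales:
                  n_seg = len(profile) // s
                  segs = profile[: n_seg * s].reshape(n_seg, s)
                  t = np.arange(s)
                  # variance of residuals after per-segment polynomial detrending
                  var = [np.mean((seg - np.polyval(np.polyfit(t, seg, order), t)) ** 2)
                         for seg in segs]
                  if q == 0:
                      F.append(np.exp(0.5 * np.mean(np.log(var))))
                  else:
                      F.append(np.mean(np.array(var) ** (q / 2)) ** (1 / q))
              hq.append(np.polyfit(np.log(scales), np.log(F), 1)[0])
          return np.array(hq)

      x = np.random.default_rng(9).standard_normal(4096)  # monofractal test signal
      scales = np.array([16, 32, 64, 128, 256])
      print(np.round(mfdfa(x, scales, q_values=[-2, 2]), 2))  # both near 0.5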

  5. On the Prediction of α-Stable Time Series

    NASA Astrophysics Data System (ADS)

    Mohammadi, Mohammad; Mohammadpour, Adel

    2016-07-01

    This paper addresses the point prediction of α-stable time series. Our key idea is to define a new Hilbert space that contains α-stable processes. Then, we apply the advantages of Hilbert space theory to find the best linear prediction. We show how to use the presented predictor practically for α-stable linear processes. The implementation of the presented method is easier than that of the minimum dispersion method. We reveal the appropriateness of the presented method through an empirical study on predicting the natural logarithms of the volumes of the S&P 500 market.

  6. Time series analysis using semiparametric regression on oil palm production

    NASA Astrophysics Data System (ADS)

    Yundari, Pasaribu, U. S.; Mukhaiyar, U.

    2016-04-01

    This paper presents a semiparametric kernel regression method, which has shown its flexibility and ease of mathematical calculation, especially in estimating density and regression functions. The kernel function is continuous and produces a smooth estimate. The classical kernel density estimator is constructed by a completely nonparametric analysis and works reasonably well for all forms of function. Here, we discuss parameter estimation in time series analysis. First, we assume the parameters exist; then we use nonparametric estimation, which yields a semiparametric approach. The optimum bandwidth is selected by considering an approximation of the Mean Integrated Squared Error (MISE).

  7. Surrogate-assisted network analysis of nonlinear time series

    NASA Astrophysics Data System (ADS)

    Laut, Ingo; Räth, Christoph

    2016-10-01

    The performance of recurrence networks and symbolic networks to detect weak nonlinearities in time series is compared to the nonlinear prediction error. For the synthetic data of the Lorenz system, the network measures show a comparable performance. In the case of relatively short and noisy real-world data from active galactic nuclei, the nonlinear prediction error yields more robust results than the network measures. The tests are based on surrogate data sets. The correlations in the Fourier phases of data sets from some surrogate generating algorithms are also examined. The phase correlations are shown to have an impact on the performance of the tests for nonlinearity.

  8. Best linear forecast of volatility in financial time series

    NASA Astrophysics Data System (ADS)

    Krivoruchenko, M. I.

    2004-09-01

    The autocorrelation function of volatility in financial time series is fitted well by a superposition of several exponents. This case admits an explicit analytical solution of the problem of constructing the best linear forecast of a stationary stochastic process. We describe and apply the proposed analytical method for forecasting volatility. The leverage effect and volatility clustering are taken into account. Parameters of the predictor function are determined numerically for the Dow Jones 30 Industrial Average. Connection of the proposed method to the popular autoregressive conditional heteroskedasticity models is discussed.
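
    With the autocorrelation function modelled as a sum of exponentials, the best linear predictor reduces to solving Toeplitz (Yule-Walker type) normal equations. The sketch below uses invented exponential components rather than parameters fitted to the Dow Jones data.

      import numpy as np
      from scipy.linalg import solve_toeplitz

      # Model ACF as a superposition of two exponentials (acf[0] = 1).
      lags = np.arange(0, 50)
      acf = 0.6 * np.exp(-lags / 5.0) + 0.4 * np.exp(-lags / 40.0)

      p = 20                                   # predictor memory
      # Normal equations R w = r with R_ij = acf(|i-j|) and r_i = acf(i+1).
      w = solve_toeplitz(acf[:p], acf[1:p + 1])
      print("predictor weights (first 5):", np.round(w[:5], 3))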

  9. Disease management with ARIMA model in time series.

    PubMed

    Sato, Renato Cesar

    2013-01-01

    The evaluation of infectious and noninfectious disease management can be done through the use of a time series analysis. In this study, we expect to measure the results and prevent intervention effects on the disease. Clinical studies have benefited from the use of these techniques, particularly for the wide applicability of the ARIMA model. This study briefly presents the process of using the ARIMA model. This analytical tool offers a great contribution for researchers and healthcare managers in the evaluation of healthcare interventions in specific populations.

  10. Kernel canonical-correlation Granger causality for multiple time series.

    PubMed

    Wu, Guorong; Duan, Xujun; Liao, Wei; Gao, Qing; Chen, Huafu

    2011-04-01

    Canonical-correlation analysis as a multivariate statistical technique has been applied to multivariate Granger causality analysis to infer information flow in complex systems. It shows unique appeal and great superiority over the traditional vector autoregressive method, due to the simplified procedure that detects causal interaction between multiple time series, and the avoidance of potential model estimation problems. However, it is limited to the linear case. Here, we extend the framework of canonical correlation to include the estimation of multivariate nonlinear Granger causality for drawing inference about directed interaction. Its feasibility and effectiveness are verified on simulated data.

  11. Ensemble Deep Learning for Biomedical Time Series Classification.

    PubMed

    Jin, Lin-Peng; Dong, Jun

    2016-01-01

    Ensemble learning has been proved to improve the generalization ability effectively in both theory and practice. In this paper, we briefly outline the current status of research on it first. Then, a new deep neural network-based ensemble method that integrates filtering views, local views, distorted views, explicit training, implicit training, subview prediction, and Simple Average is proposed for biomedical time series classification. Finally, we validate its effectiveness on the Chinese Cardiovascular Disease Database containing a large number of electrocardiogram recordings. The experimental results show that the proposed method has certain advantages compared to some well-known ensemble methods, such as Bagging and AdaBoost.

  12. Chaotic time series analysis in economics: Balance and perspectives

    SciTech Connect

    Faggini, Marisa

    2014-12-15

    The aim of the paper is not to review the large body of work concerning nonlinear time series analysis in economics, about which much has been written, but rather to focus on the new techniques developed to detect chaotic behaviours in economic data. More specifically, our attention will be devoted to reviewing some of these techniques and their application to economic and financial data in order to understand why chaos theory, after a period of growing interest, appears now not to be such an interesting and promising research area.

  13. Comparison of time series models for predicting campylobacteriosis risk in New Zealand.

    PubMed

    Al-Sakkaf, A; Jones, G

    2014-05-01

    Predicting campylobacteriosis cases is a matter of considerable concern in New Zealand, after the number of notified cases was the highest among developed countries in 2006. Thus, there is a need to develop a model or tool to predict the number of campylobacteriosis cases accurately, as the Microbial Risk Assessment Model previously used for this purpose failed to predict the actual number of cases accurately. We explore the appropriateness of classical time series modelling approaches for predicting campylobacteriosis. Finding the most appropriate time series model for New Zealand data has additional practical considerations given a possible structural change, that is, a specific and sudden change in response to the implemented interventions. A univariate methodological approach was used to predict monthly disease cases using New Zealand surveillance data of campylobacteriosis incidence from 1998 to 2009. The data from the years 1998 to 2008 were used to model the time series, with the year 2009 held out of the data set for model validation. The best two models were then fitted to the full 1998-2009 data and used to predict for each month of 2010. The Holt-Winters (multiplicative) and ARIMA (additive) intervention models were considered the best models for predicting campylobacteriosis in New Zealand. The prediction by the additive ARIMA with intervention was slightly better than that by the Holt-Winters multiplicative method for the annual total in 2010, the former predicting only 23 cases fewer than the actual reported cases. It is confirmed that classical time series techniques such as ARIMA with intervention and Holt-Winters can provide good prediction performance for campylobacteriosis risk in New Zealand. The results reported by this study are useful to the New Zealand Health and Safety Authority's efforts in addressing the problem of the campylobacteriosis epidemic.
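
    A minimal statsmodels sketch of the Holt-Winters (multiplicative) approach is shown below on synthetic monthly counts; the simulated trend and seasonality merely stand in for the New Zealand surveillance series.

      import numpy as np
      from statsmodels.tsa.holtwinters import ExponentialSmoothing

      # Simulate 12 years of seasonal monthly case counts with a mild trend.
      rng = np.random.default_rng(11)
      months = np.arange(144)
      level = (800 + 2 * months) * (1 + 0.3 * np.sin(2 * np.pi * months / 12))
      cases = rng.poisson(level).astype(float)

      train, test = cases[:-12], cases[-12:]
      fit = ExponentialSmoothing(train, trend="add", seasonal="mul",
                                 seasonal_periods=12).fit()
      pred = fit.forecast(12)
      print("annual total: predicted %.0f vs actual %.0f" % (pred.sum(), test.sum()))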

  14. Accurate Behavioral Simulator of All-Digital Time-Domain Smart Temperature Sensors by Using SIMULINK.

    PubMed

    Chen, Chun-Chi; Chen, Chao-Lieh; Lin, You-Ting

    2016-08-08

    This study proposes a new behavioral simulator that uses SIMULINK for all-digital CMOS time-domain smart temperature sensors (TDSTSs) for performing rapid and accurate simulations. Inverter-based TDSTSs offer the benefits of low cost and simple structure for temperature-to-digital conversion and have been developed. Typically, electronic design automation tools, such as HSPICE, are used to simulate TDSTSs for performance evaluations. However, such tools require extremely long simulation time and complex procedures to analyze the results and generate figures. In this paper, we organize simple but accurate equations into a temperature-dependent model (TDM) by which the TDSTSs evaluate temperature behavior. Furthermore, temperature-sensing models of a single CMOS NOT gate were devised using HSPICE simulations. Using the TDM and these temperature-sensing models, a novel simulator in SIMULINK environment was developed to substantially accelerate the simulation and simplify the evaluation procedures. Experiments demonstrated that the simulation results of the proposed simulator have favorable agreement with those obtained from HSPICE simulations, showing that the proposed simulator functions successfully. This is the first behavioral simulator addressing the rapid simulation of TDSTSs.

  15. Accurate Behavioral Simulator of All-Digital Time-Domain Smart Temperature Sensors by Using SIMULINK

    PubMed Central

    Chen, Chun-Chi; Chen, Chao-Lieh; Lin, You-Ting

    2016-01-01

    This study proposes a new behavioral simulator that uses SIMULINK for all-digital CMOS time-domain smart temperature sensors (TDSTSs) for performing rapid and accurate simulations. Inverter-based TDSTSs offer the benefits of low cost and simple structure for temperature-to-digital conversion and have been developed. Typically, electronic design automation tools, such as HSPICE, are used to simulate TDSTSs for performance evaluations. However, such tools require extremely long simulation time and complex procedures to analyze the results and generate figures. In this paper, we organize simple but accurate equations into a temperature-dependent model (TDM) by which the TDSTSs evaluate temperature behavior. Furthermore, temperature-sensing models of a single CMOS NOT gate were devised using HSPICE simulations. Using the TDM and these temperature-sensing models, a novel simulator in SIMULINK environment was developed to substantially accelerate the simulation and simplify the evaluation procedures. Experiments demonstrated that the simulation results of the proposed simulator have favorable agreement with those obtained from HSPICE simulations, showing that the proposed simulator functions successfully. This is the first behavioral simulator addressing the rapid simulation of TDSTSs. PMID:27509507

  16. Estimating the state of a geophysical system with sparse observations: time delay methods to achieve accurate initial states for prediction

    NASA Astrophysics Data System (ADS)

    An, Zhe; Rey, Daniel; Ye, Jingxin; Abarbanel, Henry D. I.

    2017-01-01

    The problem of forecasting the behavior of a complex dynamical system through analysis of observational time-series data becomes difficult when the system expresses chaotic behavior and the measurements are sparse, in both space and/or time. Despite the fact that this situation is quite typical across many fields, including numerical weather prediction, the issue of whether the available observations are "sufficient" for generating successful forecasts is still not well understood. An analysis by Whartenby et al. (2013) found that in the context of the nonlinear shallow water equations on a β plane, standard nudging techniques require observing approximately 70 % of the full set of state variables. Here we examine the same system using a method introduced by Rey et al. (2014a), which generalizes standard nudging methods to utilize time delayed measurements. We show that in certain circumstances, it provides a sizable reduction in the number of observations required to construct accurate estimates and high-quality predictions. In particular, we find that this estimate of 70 % can be reduced to about 33 % using time delays, and even further if Lagrangian drifter locations are also used as measurements.

  17. Bayesian inference of selection in a heterogeneous environment from genetic time-series data.

    PubMed

    Gompert, Zachariah

    2016-01-01

    Evolutionary geneticists have sought to characterize the causes and molecular targets of selection in natural populations for many years. Although this research programme has been somewhat successful, most statistical methods employed were designed to detect consistent, weak to moderate selection. In contrast, phenotypic studies in nature show that selection varies in time and that individual bouts of selection can be strong. Measurements of the genomic consequences of such fluctuating selection could help test and refine hypotheses concerning the causes of ecological specialization and the maintenance of genetic variation in populations. Herein, I propose a Bayesian nonhomogeneous hidden Markov model to estimate effective population sizes and quantify variable selection in heterogeneous environments from genetic time-series data. The model is described and then evaluated using a series of simulated data, including cases where selection occurs on a trait with a simple or polygenic molecular basis. The proposed method accurately distinguished neutral loci from non-neutral loci under strong selection, but not from those under weak selection. Selection coefficients were accurately estimated when selection was constant or when the fitness values of genotypes varied linearly with the environment, but these estimates were less accurate when fitness was polygenic or the relationship between the environment and the fitness of genotypes was nonlinear. Past studies of temporal evolutionary dynamics in laboratory populations have been remarkably successful. The proposed method makes similar analyses of genetic time-series data from natural populations more feasible and thereby could help answer fundamental questions about the causes and consequences of evolution in the wild.

  18. Time Accurate CFD Simulations of the Orion Launch Abort Vehicle in the Transonic Regime

    NASA Technical Reports Server (NTRS)

    Ruf, Joseph; Rojahn, Josh

    2011-01-01

    Significant asymmetries in the fluid dynamics were calculated for some cases in the CFD simulations of the Orion Launch Abort Vehicle through its abort trajectories. The CFD simulations were performed steady state with symmetric boundary conditions and geometries. The trajectory points at issue were in the transonic regime, at 0 and 5 degree angles of attack, with the Abort Motors with and without the Attitude Control Motors (ACM) firing. In some of the cases the asymmetric fluid dynamics resulted in aerodynamic side forces that were large enough to overcome the control authority of the ACMs. MSFC's Fluid Dynamics Group supported the investigation into the cause of the flow asymmetries with time accurate CFD simulations, utilizing a hybrid RANS-LES turbulence model. The results show that the flow over the vehicle and the subsequent interaction with the abort and ACM motor plumes were unsteady. The resulting instantaneous aerodynamic forces were oscillatory with fairly large magnitudes. Time averaged aerodynamic forces were essentially symmetric.

  19. In-Band Asymmetry Compensation for Accurate Time/Phase Transport over Optical Transport Network

    PubMed Central

    Siu, Sammy; Hu, Hsiu-fang; Lin, Shinn-Yan; Liao, Chia-Shu; Lai, Yi-Liang

    2014-01-01

    The demands of precise time/phase synchronization have been increasing recently due to the next generation of telecommunication synchronization. This paper studies the issues that are relevant to distributing accurate time/phase over optical transport network (OTN). Each node and link can introduce asymmetry, which affects the adequate time/phase accuracy over the networks. In order to achieve better accuracy, protocol level full timing support is used (e.g., Telecom-Boundary clock). Due to chromatic dispersion, the use of different wavelengths consequently causes fiber link delay asymmetry. The analytical result indicates that it introduces significant time error (i.e., phase offset) within 0.3397 ns/km in C-band or 0.3943 ns/km in L-band depending on the wavelength spacing. With the proposed scheme in this paper, the fiber link delay asymmetry can be compensated relying on the estimated mean fiber link delay by the Telecom-Boundary clock, while the OTN control plane is responsible for processing the fiber link delay asymmetry to determine the asymmetry compensation in the timing chain. PMID:24982948

  20. Time-Accurate Unsteady Pressure Loads Simulated for the Space Launch System at Wind Tunnel Conditions

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.; Brauckmann, Gregory J.; Kleb, William L.; Glass, Christopher E.; Streett, Craig L.; Schuster, David M.

    2015-01-01

    A transonic flow field about a Space Launch System (SLS) configuration was simulated with the Fully Unstructured Three-Dimensional (FUN3D) computational fluid dynamics (CFD) code at wind tunnel conditions. Unsteady, time-accurate computations were performed using second-order Delayed Detached Eddy Simulation (DDES) for up to 1.5 physical seconds. The surface pressure time history was collected at 619 locations, 169 of which matched locations on a 2.5 percent wind tunnel model that was tested in the 11 ft. x 11 ft. test section of the NASA Ames Research Center's Unitary Plan Wind Tunnel. Comparisons between computation and experiment showed that the peak surface pressure RMS level occurs behind the forward attach hardware, and good agreement for frequency and power was obtained in this region. Computational domain, grid resolution, and time step sensitivity studies were performed. These included an investigation of pseudo-time sub-iteration convergence. Using these sensitivity studies and experimental data comparisons, a set of best practices to date has been established for FUN3D simulations for SLS launch vehicle analysis. To the author's knowledge, this is the first time DDES has been used in a systematic approach to establish the simulation time needed to analyze unsteady pressure loads on a space launch vehicle such as the NASA SLS.

  1. Prediction of altimetric sea level anomalies using time series models based on spatial correlation

    NASA Astrophysics Data System (ADS)

    Miziński, Bartłomiej; Niedzielski, Tomasz

    2014-05-01

    Sea level anomaly (SLA) time series, which are time-varying gridded data, can be modelled and predicted using time series methods. This approach has been shown to provide accurate forecasts within the Prognocean system, the novel infrastructure for anticipating sea level change designed and built at the University of Wrocław (Poland), which utilizes the real-time SLA data from Archiving, Validation and Interpretation of Satellite Oceanographic data (AVISO). The system runs a few models concurrently, and our ocean prediction experiment includes both uni- and multivariate time series methods. The univariate ones are: extrapolation of a polynomial-harmonic model (PH), extrapolation of a polynomial-harmonic model with autoregressive prediction (PH+AR), and extrapolation of a polynomial-harmonic model with self-exciting threshold autoregressive prediction (PH+SETAR). The following multivariate methods are used: extrapolation of a polynomial-harmonic model with vector autoregressive prediction (PH+VAR) and extrapolation of a polynomial-harmonic model with generalized space-time autoregressive prediction (PH+GSTAR). As the aforementioned models and the corresponding forecasts are computed in real time, and hence independently and in the same computational setting, we can compare the accuracies offered by the models. The objective of this work is to verify the hypothesis that the multivariate prediction techniques, which make use of cross-correlation and spatial correlation, perform better than the univariate ones. The analysis is based on the daily-fitted and updated time series models predicting the SLA data (lead time of two weeks) over several months when El Niño/Southern Oscillation (ENSO) was in its neutral state.
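
    The univariate PH+AR scheme can be sketched as a polynomial-harmonic least-squares fit followed by an autoregression on the residuals; everything below (the synthetic series, AR order, and forecast recursion) is an illustrative simplification of the Prognocean models.

      import numpy as np

      # Synthetic daily SLA-like series: trend + annual cycle + noise (cm).
      rng = np.random.default_rng(4)
      t = np.arange(2000, dtype=float)
      sla = (0.0002 * t + 3 * np.sin(2 * np.pi * t / 365.25)
             + rng.normal(0, 1, t.size))

      def ph_design(t):
          """Polynomial-harmonic design matrix: intercept, trend, annual sin/cos."""
          return np.column_stack([np.ones_like(t), t,
                                  np.sin(2 * np.pi * t / 365.25),
                                  np.cos(2 * np.pi * t / 365.25)])

      coef, *_ = np.linalg.lstsq(ph_design(t), sla, rcond=None)
      resid = sla - ph_design(t) @ coef

      # AR(p) on the residuals, fitted by least squares.
      p = 3
      Y = resid[p:]
      R = np.column_stack([resid[p - k: len(resid) - k] for k in range(1, p + 1)])
      phi, *_ = np.linalg.lstsq(R, Y, rcond=None)

      # Two-week-ahead forecast: extrapolate the PH part, iterate the AR part.
      horizon = 14
      r = list(resid[-p:])
      for _ in range(horizon):
          r.append(float(np.dot(phi, r[-1: -p - 1: -1])))
      t_f = t[-1] + np.arange(1, horizon + 1)
      forecast = ph_design(t_f) @ coef + np.array(r[p:])
      print("day-14 SLA forecast (cm):", round(float(forecast[-1]), 2))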

  2. Exploring the Dynamics of Personality Change with Time Series Models

    NASA Astrophysics Data System (ADS)

    Keller, Ferdinand; Storch, Maja; Bigler, Susanne

    This paper aims to show possible refinements in time series methods for evaluating the dynamics of personality change. For the study, 13 students attended a course of personality development based on Jungian theory. The course teaches how to contact one's personal self. For four months the students rated their mood, activity, tension, and feeling of inner control on visual analogue scales twice a day. Standard examination with ARIMA models yields that most subjects show a low to moderate correlation to the previous time point. About one third of the cases have an additional lag-2 relation. Daytime effects are seldom, and the residual tests for the ARIMA models suggest that these linear models are sufficient for describing most of the time series. To evaluate the expected smooth transformations in personality, the data from one subject are analysed and the following hypotheses are empirically tested by the time variation of parameters in subsequent time windows: 1) increasing stability in mood and in the feeling of inner control, shown by decreasing standard deviations; 2) higher inner-psychic coherence, shown by increasing autocorrelation coefficients; 3) dissociation between mood and feeling of inner control, shown by decreasing cross-correlation coefficients between these two dimensions. Application of several statistical tests shows that hypothesis 1 can be accepted while the other two hypotheses cannot be confirmed. Some methodological difficulties emerge when the methods are applied to 'real' data, and some limitations are found in the statistical testing of time-varying parameters. Overall, though, the proposed methods for examining emotional variability have proven valuable and promising for further research.

  3. Detection of intermittent events in atmospheric time series

    NASA Astrophysics Data System (ADS)

    Paradisi, P.; Cesari, R.; Palatella, L.; Contini, D.; Donateo, A.

    2009-04-01

    associated with the occurrence of critical events in the atmospheric dynamics. The critical events are associated with transitions between meta-stable configurations. Consequently, this approach could contribute to the study of extreme events in meteorology and climatology and to weather classification schemes. The renewal approach could also contribute to the modelling of non-Gaussian closures for turbulent fluxes [3]. In the proposed approach the main features that need to be estimated are: (a) the distribution of life-times of a given atmospheric meta-stable structure (waiting times between two critical events); (b) the statistical distribution of fluctuations; (c) the presence of memory in the time series. These features are related to the evaluation of memory content and scaling from the time series. In order to analyze these features, some novel statistical techniques have been developed in recent years. In particular, the analysis of Diffusion Entropy [4] was shown to be a robust method for the determination of the dynamical scaling. This property is related to the power-law behaviour of the life-time statistics and to the memory properties of the time series. The analysis of Renewal Aging [5], based on renewal theory [2], allows one to estimate the memory content of a time series, which is related to the amount of critical events in the time series itself. After a brief review of the statistical techniques (Diffusion Entropy and Renewal Aging), an application to experimental atmospheric time series will be illustrated. References: [1] Weiss G.H., Rubin R.J., Random walks: theory and selected applications, Advances in Chemical Physics, 52, 363-505 (1983). [2] D.R. Cox, Renewal Theory, Methuen, London (1962). [3] P. Paradisi, R. Cesari, F. Mainardi, F. Tampieri, The fractional Fick's law for non-local transport processes, Physica A, 293, 130-142 (2001). [4] P. Grigolini, L. Palatella, G. Raffaelli, Fractals 9 (2001) 439. [5] P. Allegrini, F. Barbi, P

  4. A solution for measuring accurate reaction time to visual stimuli realized with a programmable microcontroller.

    PubMed

    Ohyanagi, Toshio; Sengoku, Yasuhito

    2010-02-01

    This article presents a new solution for measuring accurate reaction time (SMART) to visual stimuli. The SMART is a USB device realized with a Cypress Programmable System-on-Chip (PSoC) mixed-signal array programmable microcontroller. A brief overview of the hardware and firmware of the PSoC is provided, together with the results of three experiments. In Experiment 1, we investigated the timing accuracy of the SMART in measuring reaction time (RT) under different conditions of operating systems (OSs; Windows XP or Vista) and monitor displays (a CRT or an LCD). The results indicated that the timing error in measuring RT by the SMART was less than 2 msec, on average, under all combinations of OS and display and that the SMART was tolerant to jitter and noise. In Experiment 2, we tested the SMART with 8 participants. The results indicated that there was no significant difference among RTs obtained with the SMART under the different conditions of OS and display. In Experiment 3, we used Microsoft (MS) PowerPoint to present visual stimuli on the display. We found no significant difference in RTs obtained using MS DirectX technology versus using the PowerPoint file with the SMART. We are certain that the SMART is a simple and practical solution for measuring RTs accurately. Although there are some restrictions in using the SMART with RT paradigms, the SMART is capable of providing both researchers and health professionals working in clinical settings with new ways of using RT paradigms in their work.

  5. Short-term prediction of solar irradiance using time-series analysis

    SciTech Connect

Chowdhury, B.H. (Dept. of Electrical Engineering)

    1990-01-01

    A new statistical model for solar irradiance prediction is described. The method makes use of atmospheric parameterizations as well as a time-series model to forecast a sequence of global irradiance in the 3-10 min time frame. A survey of some of the prominent research of the recent past reveals a definite lack of irradiance models that approach subhourly intervals, especially in the range mentioned. In this article, accurate parameterizations of atmospheric phenomena are used in a prewhitening process so that a time-series model may be used effectively to forecast irradiance components up to an hour in advance at 3-10 min time intervals. The model requires only previous global horizontal irradiance measurements at a site. Results show that when compared with actual data at two locations in the southeastern United States, the forecasts are quite accurate, and the model is site-independent. In some instances, forecasts may be inaccurate when there are sudden transitional changes in the cloud cover moving across the sun. In order for the proposed irradiance model to predict such transitional changes correctly, frequent forecast updates become necessary.

  6. A 40 Year Time Series of SBUV Observations: the Version 8.6 Processing

    NASA Technical Reports Server (NTRS)

    McPeters, Richard; Bhartia, P. K.; Flynn, L.

    2012-01-01

    Under a NASA program to produce long term data records from instruments on multiple satellites (MEaSUREs), data from a series of eight SBUV and SBUV/2 instruments have been reprocessed to create a 40 year long ozone time series. Data from the Nimbus 4 BUV, Nimbus 7 SBUV, and SBUV/2 instruments on NOAA 9, 11, 14, 16, 17, and 18 were used, covering the periods 1970 to 1972 and 1979 to the present. In past analyses an ozone time series was created from these instruments by adjusting ozone itself, instrument by instrument, for consistency during overlap periods. In the version 8.6 processing, adjustments were made to the radiance calibration of each instrument to maintain a consistent calibration over the entire time series. Data for all eight instruments were then reprocessed using the adjusted radiances. Reprocessing is necessary to produce an accurate latitude dependence. Other improvements incorporated in version 8.6 included the use of the ozone cross sections of Brion, Daumont, and Malicet, and the use of a cloud height climatology derived from Aura OMI measurements. The new cross sections have a more accurate temperature dependence than the cross sections previously used. The OMI-based cloud heights account for the penetration of UV into the upper layers of clouds. The consistency of the version 8.6 time series was evaluated by intra-instrument comparisons during overlap periods, comparisons with ground-based instruments, and comparisons with measurements made by instruments on other satellites such as SAGE II and UARS MLS. These comparisons show that for the instruments on NOAA 16, 17 and 18, the instrument calibrations were remarkably stable and consistent from instrument to instrument. The data record from the Nimbus 7 SBUV was also very stable, and SAGE and ground-based comparisons show that the calibration was consistent with measurements made years later by the NOAA 16 instrument. The calibrations of the SBUV/2 instruments on NOAA 9, 11, and 14 were more of

  7. Coastal Atmosphere and Sea Time Series (CoASTS)

    NASA Technical Reports Server (NTRS)

    Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Berthon, Jean-Francoise; Zibordi, Giuseppe; Doyle, John P.; Grossi, Stefania; vanderLinde, Dirk; Targa, Cristina; McClain, Charles R. (Technical Monitor)

    2002-01-01

    In this document, the first three years of a time series of bio-optical marine and atmospheric measurements are presented and analyzed. These measurements were performed from an oceanographic tower in the northern Adriatic Sea within the framework of the Coastal Atmosphere and Sea Time Series (CoASTS) project, an ocean color calibration and validation activity. The data set collected includes spectral measurements of the in-water apparent (diffuse attenuation coefficient, reflectance, Q-factor, etc.) and inherent (absorption and scattering coefficients) optical properties, as well as the concentrations of the main optical components (pigment and suspended matter concentrations). Clear seasonal patterns are exhibited by the marine quantities on which an appreciable short-term variability (on the order of a half day to one day) is superimposed. This short-term variability is well correlated with the changes in salinity at the surface resulting from the southward transport of freshwater coming from the northern rivers. Concentrations of chlorophyll alpha and total suspended matter span more than two orders of magnitude. The bio-optical characteristics of the measurement site pertain to both Case-I (about 64%) and Case-II (about 36%) waters, based on a relationship between the beam attenuation coefficient at 660nm and the chlorophyll alpha concentration. Empirical algorithms relating in-water remote sensing reflectance ratios and optical components or properties of interest (chlorophyll alpha, total suspended matter, and the diffuse attenuation coefficient) are presented.

  8. Assessing earthquake catalogues in Venezuela by analyzing time series data

    NASA Astrophysics Data System (ADS)

    Vasquez, R.; Granado, C.

    2011-12-01

    We applied the Mann-Kendall non-parametric test to identify significant trends in time series data describing seismicity patterns in Venezuela during the period 2001-2010. The seismicity region was divided into three areas for the test: 1) West, with 12774 seismic events; 2) Center, with a total of 909 earthquakes; and 3) East, with 6382 earthquakes. We analyzed the catalogues for every subregion to obtain the b value of the Gutenberg-Richter law, based on the maximum likelihood method, and the annual magnitude of completeness (Mc), using the maximum curvature method (MAXC). We statistically assessed the Mann-Kendall Z statistic for the time series consisting of the b value and Mc in the three subsets of earthquakes. The confidence level of this study was 90%. This approach is useful for analyzing the performance characteristics of the Venezuelan seismic network and the associated regional catalogues. The results lead to the conclusion that the central part of Venezuela does not show a statistically significant trend in seismicity or Mc, while the western region has a decreasing trend in the Mc estimation but no variations in terms of seismicity. Only the eastern region presents an increasing trend in its seismicity and Mc values.
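
    The b value estimation mentioned above is commonly done with the Aki/Utsu maximum-likelihood formula, b = log10(e) / (mean(M) - (Mc - dM/2)), where dM is the magnitude binning. Below is a short numpy sketch on simulated magnitudes (not the Venezuelan catalogues).

      import numpy as np

      def b_value(mags, mc, dm=0.1):
          """Aki/Utsu maximum-likelihood b value for magnitudes binned at dm."""
          m = mags[mags >= mc]
          return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

      rng = np.random.default_rng(6)
      mc = 3.0
      # Gutenberg-Richter magnitudes above Mc are exponential with rate b*ln(10).
      mags = np.round(mc + rng.exponential(1.0 / np.log(10), 5000), 1)  # true b = 1
      print("estimated b:", round(float(b_value(mags, mc)), 2))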

  9. Time series clustering analysis of health-promoting behavior

    NASA Astrophysics Data System (ADS)

    Yang, Chi-Ta; Hung, Yu-Shiang; Deng, Guang-Feng

    2013-10-01

    Health promotion must be emphasized to achieve the World Health Organization goal of health for all. Since the global population is aging rapidly, the ComCare elder health-promoting service was developed by the Taiwan Institute for Information Industry in 2011. Based on the Pender health promotion model, the ComCare service offers five categories of health-promoting functions to address the everyday needs of seniors: nutrition management, social support, exercise management, health responsibility, and stress management. To assess the overall ComCare service and to improve understanding of the health-promoting behavior of elders, this study analyzed health-promoting behavioral data automatically collected by the ComCare monitoring system. In the 30638 session records collected for 249 elders from January 2012 to March 2013, behavior patterns were identified by a fuzzy c-means time series clustering algorithm combined with autocorrelation-based representation schemes. The analysis showed that the time series data for elder health-promoting behavior can be classified into four different clusters. Each type reveals different health-promoting needs, frequencies, function numbers and behaviors. The data analysis results can assist policymakers, health-care providers, and experts in medicine, public health, nursing and psychology, and have been provided to the Taiwan National Health Insurance Administration to assess elder health-promoting behavior.

  10. A new complexity measure for time series analysis and classification

    NASA Astrophysics Data System (ADS)

    Nagaraj, Nithin; Balasubramanian, Karthi; Dey, Sutirth

    2013-07-01

    Complexity measures are used in a number of applications, including extraction of information from data such as ecological time series, detection of non-random structure in biomedical signals, testing of random number generators, language recognition, and authorship attribution. Different complexity measures proposed in the literature, like Shannon entropy, relative entropy, Lempel-Ziv, Kolmogorov and algorithmic complexity, are mostly ineffective in analyzing short sequences that are further corrupted with noise. To address this problem, we propose a new complexity measure, ETC, and define it as the "Effort To Compress" the input sequence by a lossless compression algorithm. Here, we employ the lossless compression algorithm known as Non-Sequential Recursive Pair Substitution (NSRPS) and define ETC as the number of iterations needed for NSRPS to transform the input sequence to a constant sequence. We demonstrate the utility of ETC in two applications. ETC is shown to have better correlation with the Lyapunov exponent than Shannon entropy, even with relatively short and noisy time series. The measure also has a greater rate of success in automatic identification and classification of short noisy sequences, compared to entropy and a popular measure based on Lempel-Ziv compression (implemented by Gzip).
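
    A simplified implementation of the ETC idea is straightforward: repeatedly replace the most frequent pair of adjacent symbols with a new symbol and count the iterations until the sequence is constant. The sketch below uses a plain left-to-right scan, a simplification of NSRPS's non-overlapping pair counting.

      from collections import Counter

      def etc(seq):
          """Count pair-substitution iterations until the sequence is constant."""
          seq = list(seq)
          next_symbol = max(seq, default=0) + 1
          steps = 0
          while len(seq) > 1 and len(set(seq)) > 1:
              target = Counter(zip(seq, seq[1:])).most_common(1)[0][0]
              out, i = [], 0
              while i < len(seq):
                  if i + 1 < len(seq) and (seq[i], seq[i + 1]) == target:
                      out.append(next_symbol)   # substitute the pair
                      i += 2
                  else:
                      out.append(seq[i])
                      i += 1
              seq, next_symbol, steps = out, next_symbol + 1, steps + 1
          return steps

      print(etc([0, 1, 0, 1, 0, 1, 0, 1]))   # regular sequence: low effort
      print(etc([0, 1, 1, 0, 0, 0, 1, 0]))   # less structure: higher effort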

  11. Efficient Bayesian inference for natural time series using ARFIMA processes

    NASA Astrophysics Data System (ADS)

    Graves, T.; Gramacy, R. B.; Franzke, C. L. E.; Watkins, N. W.

    2015-11-01

    Many geophysical quantities, such as atmospheric temperature, water levels in rivers, and wind speeds, have shown evidence of long memory (LM). LM implies that these quantities experience non-trivial temporal memory, which potentially not only enhances their predictability, but also hampers the detection of externally forced trends. Thus, it is important to reliably identify whether or not a system exhibits LM. In this paper we present a modern and systematic approach to the inference of LM. We use the flexible autoregressive fractional integrated moving average (ARFIMA) model, which is widely used in time series analysis, and of increasing interest in climate science. Unlike most previous work on the inference of LM, which is frequentist in nature, we provide a systematic treatment of Bayesian inference. In particular, we provide a new approximate likelihood for efficient parameter inference, and show how nuisance parameters (e.g., short-memory effects) can be integrated over in order to focus on long-memory parameters and hypothesis testing more directly. We illustrate our new methodology on the Nile water level data and the central England temperature (CET) time series, with favorable comparison to the standard estimators. For CET we also extend our method to seasonal long memory.
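
    The long memory in ARFIMA enters through the fractional difference operator (1 - B)^d. The following numpy sketch computes its truncated binomial weights and applies them; d and the truncation length are arbitrary illustrative choices, not values from the paper.

```python
import numpy as np

def frac_diff(x, d, n_weights=100):
    """Apply the fractional difference operator (1-B)^d, truncated to
    n_weights lags -- the binomial-expansion weights at the heart of ARFIMA."""
    w = np.zeros(n_weights)
    w[0] = 1.0
    for k in range(1, n_weights):
        w[k] = w[k - 1] * (k - 1 - d) / k
    # Convolve: y_t = sum_k w_k * x_{t-k}; drop the warm-up where the window is partial
    y = np.convolve(x, w, mode="full")[: len(x)]
    return y[n_weights:]

# Long-memory check: fractionally differencing a fractionally integrated series
# by the same d should leave approximately white noise (d = 0.3 is illustrative).
rng = np.random.default_rng(0)
noise = rng.standard_normal(2000)
x = frac_diff(noise, d=-0.3, n_weights=200)   # integrate: (1-B)^{-0.3} noise
resid = frac_diff(x, d=0.3, n_weights=200)    # difference back
print(np.corrcoef(resid[:-1], resid[1:])[0, 1])  # lag-1 autocorrelation, near 0
```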

  12. Software for detection and correction of inhomogeneities in time series

    NASA Astrophysics Data System (ADS)

    Stepanek, Petr

    2010-05-01

    During the last decade, a software package consisting of the AnClim, ProClimDB and LoadData programs for processing climatological data has been created. This software offers a comprehensive solution for processing climatological time series, starting from loading data from a central database (e.g. Oracle, via the LoadData software), through data quality control and homogenization, to time series analysis, extreme value evaluation and model output verification (ProClimDB and AnClim software). In recent years, tools for the correction of inhomogeneities in daily data were introduced. Both methods already programmed in R (e.g. by Christine Gruber, ZAMG), such as HOM by Paul Della-Marta and the SPLIDHOM method of Olivier Mestre, and our own methods are available, some of them being able to apply a multi-element approach (using e.g. weather types). The available methods can be easily compared and evaluated (for both inhomogeneity detection and correction in this case). Comparison of the available correction methods is also a current task of the ongoing COST Action ES0601 (www.homogenisation.org). Further methods, if available under R, can be easily linked with the software, and the whole processing can then benefit from a user-friendly environment in which all the most commonly used functions for data handling and climatological processing are available (read more at www.climahom.eu).

  13. On the Reconstruction of Irregularly Sampled Time Series

    NASA Astrophysics Data System (ADS)

    Vio, Roberto; Strohmer, Thomas; Wamsteker, Willem

    2000-01-01

    We consider the question of numerical treatment of irregularly sampled time series. This problem is quite common in astronomy because of factors such as the day-night alternation, weather conditions, nonobservability of the objects under study, etc. For this reason an extensive literature is available on this subject. Most of the proposed techniques, however, are based on heuristic arguments, and their usefulness is essentially in the estimation of power spectra and/or autocovariance functions. Here we propose an approach, based on the reasonable assumption that many signals of astronomical interest are the realization of band-limited processes, which can be used to fill gaps in experimental time series. By using this approach we propose several reconstruction algorithms that, because of their regularization properties, yield reliable signal reconstructions even in case of noisy data and large gaps. A detailed description of these algorithms is provided, their theoretical implications are considered, and their practical performances are tested via numerical experiments. MATLAB software implementing the methods described in this work is obtainable by request from the authors.
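
    As a rough illustration of the band-limited assumption, the sketch below fills gaps by ridge-regularized least-squares fitting of a truncated Fourier basis to the irregular samples; it is a simplified stand-in for the authors' reconstruction algorithms.

```python
import numpy as np

def bandlimited_fill(t_obs, y_obs, t_grid, n_harmonics=10, ridge=1e-3):
    """Fill gaps in an irregularly sampled series by ridge-regularized
    least-squares fitting of a truncated Fourier (band-limited) basis."""
    period = t_grid.max() - t_grid.min()

    def design(t):
        cols = [np.ones_like(t)]
        for k in range(1, n_harmonics + 1):
            cols.append(np.cos(2 * np.pi * k * t / period))
            cols.append(np.sin(2 * np.pi * k * t / period))
        return np.column_stack(cols)

    A = design(np.asarray(t_obs, dtype=float))
    # Ridge regularization keeps the fit stable for noisy data and large gaps
    coef = np.linalg.solve(A.T @ A + ridge * np.eye(A.shape[1]), A.T @ y_obs)
    return design(np.asarray(t_grid, dtype=float)) @ coef

# Irregular sampling of a smooth signal with a large gap (synthetic example)
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 10, 80))
t = t[(t < 4) | (t > 6)]                      # carve out a gap
y = np.sin(2 * np.pi * 0.3 * t) + 0.05 * rng.standard_normal(len(t))
grid = np.linspace(0, 10, 200)
filled = bandlimited_fill(t, y, grid)
```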

  14. Automatising the analysis of stochastic biochemical time-series

    PubMed Central

    2015-01-01

    Background: Mathematical and computational modelling of biochemical systems has seen a lot of effort devoted to the definition and implementation of high-performance mechanistic simulation frameworks. Within these frameworks it is possible to analyse complex models under a variety of configurations, eventually selecting the best setting of, e.g., parameters for a target system. Motivation: This operational pipeline relies on the ability to interpret the predictions of a model, often represented as simulation time-series. Thus, an efficient data analysis pipeline is crucial to automatise time-series analyses, bearing in mind that errors in this phase might mislead the modeller's conclusions. Results: For this reason we have developed an intuitive framework-independent Python tool to automate analyses common to a variety of modelling approaches. These include assessment of useful non-trivial statistics for simulation ensembles, e.g., estimation of master equations. Intuitive and domain-independent batch scripts will allow the researcher to automatically prepare reports, thus speeding up the usual model-definition, testing and refinement pipeline. PMID:26051821
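
    For a flavor of the ensemble statistics such a tool automates, the generic sketch below summarizes a batch of stochastic simulation runs pointwise in time; the function name and interface are hypothetical, not the tool's actual API.

```python
import numpy as np

def ensemble_summary(runs, quantiles=(0.05, 0.5, 0.95)):
    """Pointwise summary statistics for an ensemble of stochastic simulation
    time-series, shaped (n_runs, n_timepoints) on a shared time grid."""
    runs = np.asarray(runs)
    return {
        "mean": runs.mean(axis=0),
        "std": runs.std(axis=0),
        "quantiles": {q: np.quantile(runs, q, axis=0) for q in quantiles},
    }

# 200 hypothetical stochastic runs of 500 time points each
rng = np.random.default_rng(12)
runs = np.cumsum(rng.standard_normal((200, 500)), axis=1)
summary = ensemble_summary(runs)
print(summary["mean"].shape, summary["quantiles"][0.95][-1])
```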

  15. Financial Time Series Prediction Using Spiking Neural Networks

    PubMed Central

    Reid, David; Hussain, Abir Jaafar; Tawfik, Hissam

    2014-01-01

    In this paper a novel application of a particular type of spiking neural network, a Polychronous Spiking Network, was used for financial time series prediction. It is argued that the inherent temporal capabilities of this type of network are suited to non-stationary data such as this. The performance of the spiking neural network was benchmarked against three systems: two “traditional” rate-encoded neural networks (a Multi-Layer Perceptron neural network and a Dynamic Ridge Polynomial neural network) and a standard Linear Predictor Coefficients model. For this comparison three non-stationary and noisy time series were used: IBM stock data; US/Euro exchange rate data, and the price of Brent crude oil. The experiments demonstrated favourable prediction results for the Spiking Neural Network in terms of Annualised Return and prediction error for 5-step-ahead predictions. These results were also supported by other relevant metrics such as Maximum Drawdown and Signal-To-Noise ratio. This work demonstrated the applicability of the Polychronous Spiking Network to financial data forecasting, and this in turn indicates the potential of using such networks over traditional systems in difficult-to-manage non-stationary environments. PMID:25170618

  16. Predicting physical time series using dynamic ridge polynomial neural networks.

    PubMed

    Al-Jumeily, Dhiya; Ghazali, Rozaida; Hussain, Abir

    2014-01-01

    Forecasting naturally occurring phenomena is a common problem in many domains of science, and it has been addressed and investigated by many scientists. The importance of time series prediction stems from the fact that it has a wide range of applications, including control systems, engineering processes, environmental systems and economics. From the knowledge of some aspects of the previous behaviour of the system, the aim of the prediction process is to determine or predict its future behaviour. In this paper, we consider a novel application of a higher order polynomial neural network architecture called the Dynamic Ridge Polynomial Neural Network, which combines the properties of higher order and recurrent neural networks, for the prediction of physical time series. In this study, four types of signals have been used: the Lorenz attractor, the mean value of the AE index, sunspot number, and heat wave temperature. The simulation results showed good improvements in terms of the signal-to-noise ratio in comparison to a number of higher order and feedforward neural networks.

  17. Disentangling the stochastic behavior of complex time series

    PubMed Central

    Anvari, Mehrnaz; Tabar, M. Reza Rahimi; Peinke, Joachim; Lehnertz, Klaus

    2016-01-01

    Complex systems involving a large number of degrees of freedom generally exhibit non-stationary dynamics, which can result in either continuous or discontinuous sample paths of the corresponding time series. The latter sample paths may be caused by discontinuous events – or jumps – with some distributed amplitudes, and disentangling effects caused by such jumps from effects caused by normal diffusion processes is a main problem for a detailed understanding of the stochastic dynamics of complex systems. Here we introduce a non-parametric method to address this general problem. By means of stochastic dynamical jump-diffusion modelling, we separate deterministic drift terms from different stochastic behaviors, namely diffusive and jumpy ones, and show that all of the unknown functions and coefficients of this modelling can be derived directly from measured time series. We demonstrate the applicability of our method to empirical observations by a data-driven inference of the deterministic drift term and of the diffusive and jumpy behavior in brain dynamics from ten epilepsy patients. In particular, these different stochastic behaviors provide extra information that can be regarded as valuable for diagnostic purposes. PMID:27759055
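
    A minimal, diffusion-only version of the conditional-moment estimation can be sketched as follows; the full method in the paper additionally separates the jump component, which this illustration omits.

```python
import numpy as np

def drift_diffusion(x, dt, n_bins=30):
    """Estimate drift D1(x) and diffusion D2(x) from a measured series via
    binned conditional moments of the increments (diffusion-only sketch)."""
    dx = np.diff(x)
    x0 = x[:-1]
    edges = np.linspace(x0.min(), x0.max(), n_bins + 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    d1 = np.full(n_bins, np.nan)
    d2 = np.full(n_bins, np.nan)
    for i in range(n_bins):
        mask = (x0 >= edges[i]) & (x0 < edges[i + 1])
        if mask.sum() > 10:                            # require enough samples per bin
            d1[i] = dx[mask].mean() / dt               # drift: first conditional moment
            d2[i] = (dx[mask] ** 2).mean() / (2 * dt)  # diffusion: second moment
    return centers, d1, d2

# Ornstein-Uhlenbeck test case: dX = -X dt + sqrt(2) dW, so D1(x) = -x, D2(x) = 1
rng = np.random.default_rng(2)
dt, n = 0.01, 200_000
x = np.zeros(n)
for t in range(1, n):
    x[t] = x[t - 1] - x[t - 1] * dt + np.sqrt(2 * dt) * rng.standard_normal()
centers, d1, d2 = drift_diffusion(x, dt)
```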

  18. Unsupervised Classification During Time-Series Model Building.

    PubMed

    Gates, Kathleen M; Lane, Stephanie T; Varangis, E; Giovanello, K; Guskiewicz, K

    2016-12-07

    Researchers who collect multivariate time-series data across individuals must decide whether to model the dynamic processes at the individual level or at the group level. A recent innovation, group iterative multiple model estimation (GIMME), offers one solution to this dichotomy by identifying group-level time-series models in a data-driven manner while also reliably recovering individual-level patterns of dynamic effects. GIMME is unique in that it does not assume homogeneity in processes across individuals in terms of the patterns or weights of temporal effects. However, it can be difficult to make inferences from the nuances in varied individual-level patterns. The present article introduces an algorithm that arrives at subgroups of individuals that have similar dynamic models. Importantly, the researcher does not need to decide the number of subgroups. The final models contain reliable group-, subgroup-, and individual-level patterns that enable generalizable inferences, subgroups of individuals with shared model features, and individual-level patterns and estimates. We show that integrating community detection into the GIMME algorithm improves upon current standards in two important ways: (1) providing reliable classification and (2) increasing the reliability in the recovery of individual-level effects. We demonstrate this method on functional MRI from a sample of former American football players.

  19. Financial time series prediction using spiking neural networks.

    PubMed

    Reid, David; Hussain, Abir Jaafar; Tawfik, Hissam

    2014-01-01

    In this paper a novel application of a particular type of spiking neural network, a Polychronous Spiking Network, was used for financial time series prediction. It is argued that the inherent temporal capabilities of this type of network are suited to non-stationary data such as this. The performance of the spiking neural network was benchmarked against three systems: two "traditional" rate-encoded neural networks (a Multi-Layer Perceptron neural network and a Dynamic Ridge Polynomial neural network) and a standard Linear Predictor Coefficients model. For this comparison three non-stationary and noisy time series were used: IBM stock data; US/Euro exchange rate data, and the price of Brent crude oil. The experiments demonstrated favourable prediction results for the Spiking Neural Network in terms of Annualised Return and prediction error for 5-step-ahead predictions. These results were also supported by other relevant metrics such as Maximum Drawdown and Signal-To-Noise ratio. This work demonstrated the applicability of the Polychronous Spiking Network to financial data forecasting, and this in turn indicates the potential of using such networks over traditional systems in difficult-to-manage non-stationary environments.

  20. Predicting Physical Time Series Using Dynamic Ridge Polynomial Neural Networks

    PubMed Central

    Al-Jumeily, Dhiya; Ghazali, Rozaida; Hussain, Abir

    2014-01-01

    Forecasting naturally occurring phenomena is a common problem in many domains of science, and it has been addressed and investigated by many scientists. The importance of time series prediction stems from the fact that it has a wide range of applications, including control systems, engineering processes, environmental systems and economics. From the knowledge of some aspects of the previous behaviour of the system, the aim of the prediction process is to determine or predict its future behaviour. In this paper, we consider a novel application of a higher order polynomial neural network architecture called the Dynamic Ridge Polynomial Neural Network, which combines the properties of higher order and recurrent neural networks, for the prediction of physical time series. In this study, four types of signals have been used: the Lorenz attractor, the mean value of the AE index, sunspot number, and heat wave temperature. The simulation results showed good improvements in terms of the signal-to-noise ratio in comparison to a number of higher order and feedforward neural networks. PMID:25157950

  1. Hybrid perturbation methods based on statistical time series models

    NASA Astrophysics Data System (ADS)

    San-Juan, Juan Félix; San-Martín, Montserrat; Pérez, Iván; López, Rosario

    2016-04-01

    In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies, because, in order to simplify the expressions and subsequent computations, not all the involved forces are taken into account and only low-order terms are considered; moreover, mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing the dynamics missing from the previously integrated approximation. This combination improves the precision of conventional numerical, analytical and semianalytical theories for determining the position and velocity of any artificial satellite or space debris object. In order to validate this methodology, we present a family of three hybrid orbit propagators, formed by combining three different orders of approximation of an analytical theory with a statistical time series model, and analyse their capability to process the effect produced by the flattening of the Earth. The three considered analytical components are the integration of the Kepler problem, a first-order analytical theory and a second-order analytical theory, whereas the prediction technique is the same in all three cases, namely an additive Holt-Winters method.
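
    The prediction component used here, an additive Holt-Winters method, is available in standard libraries. Below is a hedged sketch using statsmodels on a synthetic series standing in for the residual dynamics a hybrid propagator would model; all numbers are illustrative.

```python
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Synthetic "missing dynamics" with level, trend and a seasonal component,
# standing in for the along-track errors the hybrid propagator corrects.
rng = np.random.default_rng(3)
n, period = 400, 50
t = np.arange(n)
series = 0.01 * t + 0.5 * np.sin(2 * np.pi * t / period) \
         + 0.1 * rng.standard_normal(n)

# Additive Holt-Winters: additive trend and additive seasonality
model = ExponentialSmoothing(series, trend="add", seasonal="add",
                             seasonal_periods=period).fit()
forecast = model.forecast(100)   # predict the missing dynamics 100 steps ahead
```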

  2. Disentangling the stochastic behavior of complex time series

    NASA Astrophysics Data System (ADS)

    Anvari, Mehrnaz; Tabar, M. Reza Rahimi; Peinke, Joachim; Lehnertz, Klaus

    2016-10-01

    Complex systems involving a large number of degrees of freedom generally exhibit non-stationary dynamics, which can result in either continuous or discontinuous sample paths of the corresponding time series. The latter sample paths may be caused by discontinuous events – or jumps – with some distributed amplitudes, and disentangling effects caused by such jumps from effects caused by normal diffusion processes is a main problem for a detailed understanding of stochastic dynamics of complex systems. Here we introduce a non-parametric method to address this general problem. By means of stochastic dynamical jump-diffusion modelling, we separate deterministic drift terms from different stochastic behaviors, namely diffusive and jumpy ones, and show that all of the unknown functions and coefficients of this modelling can be derived directly from measured time series. We demonstrate the applicability of our method to empirical observations by a data-driven inference of the deterministic drift term and of the diffusive and jumpy behavior in brain dynamics from ten epilepsy patients. In particular, these different stochastic behaviors provide extra information that can be regarded as valuable for diagnostic purposes.

  3. Two algorithms to fill cloud gaps in LST time series

    NASA Astrophysics Data System (ADS)

    Frey, Corinne; Kuenzer, Claudia

    2013-04-01

    Cloud contamination is a challenge for optical remote sensing. This is especially true for the recording of a fast-changing radiative quantity like land surface temperature (LST). The substitution of cloud-contaminated pixels with estimated values - gap filling - is not straightforward, but it is possible to a certain extent, as this research shows for medium-resolution time series of MODIS data. The area of interest is the Upper Mekong Delta (UMD). The background for this work is an analysis of the temporal development of 1-km LST in the context of the WISDOM project. The climate of the UMD is characterized by peak rainfall in the summer months, which is also when cloud contamination is highest in the area. The average number of available daytime observations per pixel can drop below five, for example in June. In winter the average number may reach 25 observations a month. This situation is not adequate for the calculation of long-term statistics; an appropriate gap-filling method should be applied beforehand. In this research, two different algorithms were tested on an 11-year time series: 1) a gradient-based algorithm and 2) a method based on ECMWF ERA-Interim reanalysis data. The first algorithm searches for stable inter-image gradients from a given environment and for a certain period of time. These gradients are then used to estimate LST for cloud-contaminated pixels in each acquisition. The estimated LSTs are clear-sky LSTs and are based solely on the MODIS LST time series. The second method estimates LST on the basis of adapted ECMWF ERA-Interim skin temperatures and creates a set of expected LSTs. The estimated values were used to fill the gaps in the original dataset, creating two new daily, 1 km datasets. The maps filled with the gradient-based method had more than double the number of valid pixels of the original dataset. The second method (ECMWF ERA-Interim based) was able to fill all data gaps. From the gap filled data sets then monthly
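
    As a strongly simplified reading of the gradient idea, the sketch below fills cloud gaps using each pixel's mean clear-sky offset from the scene average; the real algorithm searches for stable gradients within a local environment and time period, which this toy version does not attempt.

```python
import numpy as np

def gradient_fill(stack):
    """Fill NaN (cloud) gaps in a (time, rows, cols) LST stack: each pixel's
    mean clear-sky offset (a stable inter-image gradient) from the scene
    average is added back to the scene average at its cloudy times."""
    scene_mean = np.nanmean(stack, axis=(1, 2))                     # per-scene average
    offset = np.nanmean(stack - scene_mean[:, None, None], axis=0)  # stable per-pixel gradient
    estimate = scene_mean[:, None, None] + offset[None, :, :]
    return np.where(np.isnan(stack), estimate, stack)

# Hypothetical 30-scene stack with 40% cloud contamination
rng = np.random.default_rng(13)
lst = 300 + 5 * rng.standard_normal((30, 50, 50))
lst[rng.random(lst.shape) < 0.4] = np.nan
filled = gradient_fill(lst)
print(np.isnan(filled).sum(), "gaps remain")   # only all-cloud pixels/scenes stay empty
```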

  4. Dependency Structures in Differentially Coded Cardiovascular Time Series

    PubMed Central

    Tasic, Tatjana; Jovanovic, Sladjana; Mohamoud, Omer; Skoric, Tamara; Japundzic-Zigon, Nina

    2017-01-01

    Objectives. This paper analyses temporal dependency in the time series recorded from aging rats, the healthy ones and those with early developed hypertension. The aim is to explore effects of age and hypertension on mutual sample relationship along the time axis. Methods. A copula method is applied to raw and to differentially coded signals. The latter ones were additionally binary encoded for a joint conditional entropy application. The signals were recorded from freely moving male Wistar rats and from spontaneous hypertensive rats, aged 3 months and 12 months. Results. The highest level of comonotonic behavior of pulse interval with respect to systolic blood pressure is observed at time lags τ = 0, 3, and 4, while a strong counter-monotonic behavior occurs at time lags τ = 1 and 2. Conclusion. Dynamic range of aging rats is considerably reduced in hypertensive groups. Conditional entropy of systolic blood pressure signal, compared to unconditional, shows an increased level of discrepancy, except for a time lag 1, where the equality is preserved in spite of the memory of differential coder. The antiparallel streams play an important role at single beat time lag. PMID:28127384

  5. Time series trends of the safety effects of pavement resurfacing.

    PubMed

    Park, Juneyoung; Abdel-Aty, Mohamed; Wang, Jung-Han

    2017-04-01

    This study evaluated the safety performance of pavement resurfacing projects on urban arterials in Florida using observational before-after approaches. The safety effects of pavement resurfacing were quantified in crash modification factors (CMFs) and estimated for different ranges of heavy vehicle traffic volume and time changes for different severity levels. In order to evaluate the variation of CMFs over time, crash modification functions (CMFunctions) were developed using nonlinear regression and time series models. The results showed that pavement resurfacing projects decrease crash frequency and are found to be more effective in reducing severe crashes in general. Moreover, the results of the general relationship between the safety effects and time changes indicated that the CMFs increase over time after the resurfacing treatment. It was also found that pavement resurfacing projects on urban roadways with a higher heavy vehicle volume rate are more safety effective than on roadways with a lower heavy vehicle volume rate. Based on the exploration and comparison of the developed CMFunctions, the seasonal autoregressive integrated moving average (SARIMA) and exponential functional forms of the nonlinear regression models can be utilized to identify the trend of CMFs over time.
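
    For the SARIMA side of the CMFunction modelling, a minimal statsmodels sketch is shown below on a hypothetical monthly CMF series; the model orders and data are illustrative assumptions, not the study's fitted specification.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical monthly CMF estimates after a resurfacing treatment: the safety
# benefit decays, so the CMF drifts upward with a seasonal wiggle.
rng = np.random.default_rng(10)
months = np.arange(60)
cmf = 0.7 + 0.004 * months + 0.05 * np.sin(2 * np.pi * months / 12) \
      + 0.02 * rng.standard_normal(60)

# SARIMA(1,1,1)x(1,0,1,12): one plausible spec for a trending seasonal CMFunction
model = SARIMAX(cmf, order=(1, 1, 1), seasonal_order=(1, 0, 1, 12)).fit(disp=False)
print(model.forecast(12))   # projected CMF trend for the next year
```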

  6. Accurate retention time determination of co-eluting proteins in analytical chromatography by means of spectral data.

    PubMed

    Dismer, Florian; Hansen, Sigrid; Oelmeier, Stefan Alexander; Hubbuch, Jürgen

    2013-03-01

    Chromatography is the method of choice for the separation of proteins, at both analytical and preparative scale. Orthogonal purification strategies for industrial use can easily be implemented by combining different modes of adsorption. Nevertheless, with flexibility comes the freedom of choice and optimal conditions for consecutive steps need to be identified in a robust and reproducible fashion. One way to address this issue is the use of mathematical models that allow for an in silico process optimization. Although this has been shown to work, model parameter estimation for complex feedstocks becomes the bottleneck in process development. An integral part of parameter assessment is the accurate measurement of retention times in a series of isocratic or gradient elution experiments. As high-resolution analytics that can differentiate between proteins are often not readily available, pure protein is mandatory for parameter determination. In this work, we present an approach that has the potential to solve this problem. Based on the uniqueness of UV absorption spectra of proteins, we were able to accurately measure retention times in systems of up to four co-eluting compounds. The presented approach is calibration-free, meaning that prior knowledge of pure component absorption spectra is not required. Actually, pure protein spectra can be determined from co-eluting proteins as part of the methodology. The approach was tested for size-exclusion chromatograms of 38 mixtures of co-eluting proteins. Retention times were determined with an average error of 0.6 s (1.6% of average peak width), approximated and measured pure component spectra showed an average coefficient of correlation of 0.992.

  7. Modeling pollen time series using seasonal-trend decomposition procedure based on LOESS smoothing.

    PubMed

    Rojo, Jesús; Rivero, Rosario; Romero-Morte, Jorge; Fernández-González, Federico; Pérez-Badia, Rosa

    2017-02-01

    Analysis of airborne pollen concentrations provides valuable information on plant phenology and is thus a useful tool in agriculture (for predicting harvests in crops such as the olive and for deciding when to apply phytosanitary treatments) as well as in medicine and the environmental sciences. Variations in airborne pollen concentrations, moreover, are indicators of changing plant life cycles. By modeling pollen time series, we can not only identify the variables influencing pollen levels but also predict future pollen concentrations. In this study, airborne pollen time series were modeled using a seasonal-trend decomposition procedure based on LOcally wEighted Scatterplot Smoothing (LOESS), known as STL. The data series, daily Poaceae pollen concentrations over the period 2006-2014, was broken up into seasonal and residual (stochastic) components. The seasonal component was compared with data on Poaceae flowering phenology obtained by field sampling. Residuals were fitted to a model generated from daily temperature and rainfall values, and daily pollen concentrations, using partial least squares regression (PLSR). This method was then applied to predict daily pollen concentrations for 2014 (independent validation data) using results for the seasonal component of the time series and estimates of the residual component for the period 2006-2013. The correlation between predicted and observed values was r = 0.79 for the pre-peak period (i.e., the period prior to the peak pollen concentration) and r = 0.63 for the post-peak period. Separate analysis of each of the components of the pollen data series enables the sources of variability to be identified more accurately than by analysis of the original non-decomposed data series, and for this reason this procedure has proved to be a suitable technique for analyzing the main environmental factors influencing airborne pollen concentrations.
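
    The STL step is available in standard libraries; the sketch below decomposes a hypothetical daily pollen-like series with a one-year period. The subsequent PLSR fit of the residuals on temperature and rainfall is omitted.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL

# Hypothetical daily pollen-like series: an annual cycle plus weather-driven noise
rng = np.random.default_rng(4)
idx = pd.date_range("2006-01-01", "2013-12-31", freq="D")
t = np.arange(len(idx))
pollen = np.clip(50 * (1 + np.sin(2 * np.pi * t / 365.25)) ** 2
                 + 20 * rng.standard_normal(len(idx)), 0, None)
series = pd.Series(pollen, index=idx)

# STL: LOESS-based seasonal-trend decomposition with a one-year period
result = STL(series, period=365, robust=True).fit()
seasonal, trend, residual = result.seasonal, result.trend, result.resid
```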

  8. Multifractal analysis of visibility graph-based Ito-related connectivity time series.

    PubMed

    Czechowski, Zbigniew; Lovallo, Michele; Telesca, Luciano

    2016-02-01

    In this study, we investigate multifractal properties of connectivity time series resulting from the visibility graph applied to normally distributed time series generated by Ito equations with multiplicative power-law noise. We show that the multifractality of the connectivity time series (i.e., the series of numbers of links outgoing from any node) increases with the exponent of the power-law noise. The multifractality of the connectivity time series could be due to the width of the connectivity degree distribution, which can be related to the exit time of the associated Ito time series. Furthermore, the connectivity time series are characterized by persistence, although the original Ito time series are random; this is due to the visibility graph procedure which, in connecting the values of the time series, generates persistence but destroys most of the nonlinear correlations. Moreover, the visibility graph is sensitive in detecting wide "depressions" in input time series.
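
    The natural visibility graph that generates the connectivity series can be computed directly from its geometric criterion; below is a minimal (quadratic-time) Python sketch, with a random Gaussian input standing in for the Ito-generated series.

```python
import numpy as np

def visibility_degrees(y):
    """Connectivity (degree) series of the natural visibility graph:
    nodes a < b are linked when every intermediate sample lies strictly
    below the line of sight between (a, y[a]) and (b, y[b])."""
    n = len(y)
    degree = np.zeros(n, dtype=int)
    for a in range(n - 1):
        for b in range(a + 1, n):
            # Visibility criterion: y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
            if all(y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                   for c in range(a + 1, b)):
                degree[a] += 1
                degree[b] += 1
    return degree

rng = np.random.default_rng(5)
x = rng.standard_normal(200)   # a random (Gaussian) input series
k = visibility_degrees(x)      # connectivity series analysed in the paper
```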

  9. Detecting Abrupt Changes in a Piecewise Locally Stationary Time Series

    PubMed Central

    Last, Michael; Shumway, Robert

    2007-01-01

    Non-stationary time series arise in many settings, such as seismology, speech-processing, and finance. In many of these settings we are interested in points where a model of local stationarity is violated. We consider the problem of how to detect these change-points, which we identify by finding sharp changes in the time-varying power spectrum. Several different methods are considered, and we find that the symmetrized Kullback-Leibler information discrimination performs best in simulation studies. We derive asymptotic normality of our test statistic, and consistency of estimated change-point locations. We then demonstrate the technique on the problem of detecting arrival phases in earthquakes. PMID:19190715
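
    A minimal sketch of the spectral discrepancy idea, symmetrized Kullback-Leibler divergence between normalized periodograms of adjacent windows, is given below; peak picking and the asymptotic test from the paper are omitted.

```python
import numpy as np

def spectral_discrepancy(x, window=256, step=64):
    """Symmetrized Kullback-Leibler discrimination between normalized power
    spectra of adjacent windows; sharp peaks flag candidate change-points."""
    scores, centers = [], []
    for start in range(0, len(x) - 2 * window, step):
        seg1 = x[start:start + window]
        seg2 = x[start + window:start + 2 * window]
        p = np.abs(np.fft.rfft(seg1)) ** 2
        q = np.abs(np.fft.rfft(seg2)) ** 2
        p, q = p / p.sum(), q / q.sum()
        eps = 1e-12                      # guard against zero spectral mass
        kl_pq = np.sum(p * np.log((p + eps) / (q + eps)))
        kl_qp = np.sum(q * np.log((q + eps) / (p + eps)))
        scores.append(kl_pq + kl_qp)     # symmetrized divergence
        centers.append(start + window)   # boundary between the two windows
    return np.array(centers), np.array(scores)

# Example: white noise switching to an AR(1) process halfway through
rng = np.random.default_rng(6)
a = rng.standard_normal(2000)
b = np.zeros(2000)
for t in range(1, 2000):
    b[t] = 0.9 * b[t - 1] + rng.standard_normal()
centers, scores = spectral_discrepancy(np.concatenate([a, b]))
print(centers[np.argmax(scores)])        # should be near sample 2000
```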

  10. A quasi-global precipitation time series for drought monitoring

    USGS Publications Warehouse

    Funk, Chris C.; Peterson, Pete J.; Landsfeld, Martin F.; Pedreros, Diego H.; Verdin, James P.; Rowland, James D.; Romero, Bo E.; Husak, Gregory J.; Michaelsen, Joel C.; Verdin, Andrew P.

    2014-01-01

    Estimating precipitation variations in space and time is an important aspect of drought early warning and environmental monitoring. An evolving drier-than-normal season must be placed in historical context so that the severity of rainfall deficits may quickly be evaluated. To this end, scientists at the U.S. Geological Survey Earth Resources Observation and Science Center, working closely with collaborators at the University of California, Santa Barbara Climate Hazards Group, have developed a quasi-global (50°S–50°N, 180°E–180°W), 0.05° resolution, 1981 to near-present gridded precipitation time series: the Climate Hazards Group InfraRed Precipitation with Stations (CHIRPS) data archive.

  11. Estimation of Hurst Exponent for the Financial Time Series

    NASA Astrophysics Data System (ADS)

    Kumar, J.; Manchanda, P.

    2009-07-01

    Until recently, statistical methods and Fourier analysis were employed to study fluctuations in stock markets in general and the Indian stock market in particular. The current trend, however, is to apply the concepts of wavelet methodology and the Hurst exponent; see for example the work of Manchanda, J. Kumar and Siddiqi, Journal of the Franklin Institute 144 (2007), 613-636, and the paper of Cajueiro and B. M. Tabak. Cajueiro and Tabak, Physica A, 2003, have checked the efficiency of emerging markets by computing the Hurst exponent over a time window of 4 years of data. Our goal in the present paper is to understand the dynamics of the Indian stock market. We look for persistency in the stock market through the Hurst exponent and the fractal dimension of time series data of BSE 100 and NIFTY 50.
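
    One classical way to estimate the Hurst exponent is rescaled-range (R/S) analysis, sketched below; this is an illustration of the general technique, not the wavelet-based methodology of the cited works.

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Hurst exponent via rescaled-range (R/S) analysis: the slope of
    log(R/S) against log(window size). H > 0.5 indicates persistence."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes = np.unique(np.floor(np.logspace(np.log10(min_chunk),
                                           np.log10(n // 2), 12)).astype(int))
    rs = []
    for size in sizes:
        vals = []
        for start in range(0, n - size + 1, size):
            chunk = x[start:start + size]
            dev = np.cumsum(chunk - chunk.mean())   # cumulative deviations
            r = dev.max() - dev.min()               # range
            s = chunk.std()                         # standard deviation
            if s > 0:
                vals.append(r / s)
        rs.append(np.mean(vals))
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope

rng = np.random.default_rng(7)
print(hurst_rs(rng.standard_normal(4000)))             # near 0.5 for white noise
print(hurst_rs(np.cumsum(rng.standard_normal(4000))))  # near 1.0 for a random walk
```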

  12. Cluster analysis of long time-series medical datasets

    NASA Astrophysics Data System (ADS)

    Hirano, Shoji; Tsumoto, Shusaku

    2004-04-01

    This paper presents a comparative study of the characteristics of clustering methods for inhomogeneous time-series medical datasets. Using various combinations of comparison methods and grouping methods, we performed clustering experiments on the hepatitis dataset and evaluated the validity of the results. The results suggested that (1) the complete-linkage (CL) criterion in agglomerative hierarchical clustering (AHC) outperformed the average-linkage (AL) criterion in terms of the interpretability of the dendrogram and clustering results, (2) the combination of dynamic time warping (DTW) and CL-AHC consistently produced interpretable results, (3) the combination of DTW and rough clustering (RC) can be used to find the core sequences of the clusters, and (4) multiscale matching may suffer from the treatment of 'no-match' pairs; however, this problem may be avoided by using RC as a subsequent grouping method.
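
    The DTW comparison method at the core of these experiments is easy to state as a dynamic program; a minimal Python sketch follows.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-programming Dynamic Time Warping distance between two
    1-D sequences, the sequence-comparison step used before clustering."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            # Best of insertion, deletion, and match moves
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

# DTW tolerates shifts that defeat pointwise (Euclidean) comparison
x = np.sin(np.linspace(0, 2 * np.pi, 100))
y = np.sin(np.linspace(0, 2 * np.pi, 100) - 0.5)   # phase-shifted copy
print(dtw_distance(x, y))
```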

  13. Vegetation Dynamics of NW Mexico using MODIS time series data

    NASA Astrophysics Data System (ADS)

    Valdes, M.; Bonifaz, R.; Pelaez, G.; Leyva Contreras, A.

    2010-12-01

    Northwestern Mexico is an area subject to a combination of marine and continental climatic influences, which produce highly variable vegetation dynamics over time. Using Moderate Resolution Imaging Spectroradiometer (MODIS) vegetation index data (NDVI and EVI) from 2001 to 2008, mean and standard deviation images of the time series were calculated. Using these data, annual vegetation dynamics were characterized based on the different values for the different vegetation types. Annual mean values were compared, and interannual variations or anomalies were analyzed by calculating departures from the mean. A value was considered anomalous if it was more than two standard deviations above or below the mean. Using this procedure it was possible to determine spatio-temporal patterns over the study area and relate them to climatic conditions.
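
    The anomaly rule described above reduces to a few lines of numpy; the stack dimensions and values below are hypothetical.

```python
import numpy as np

def ndvi_anomalies(ndvi_stack):
    """Flag inter-annual anomalies in a (years, rows, cols) NDVI stack:
    a pixel-year is anomalous when it departs from the multi-year pixel
    mean by more than two standard deviations."""
    mean = ndvi_stack.mean(axis=0)                # per-pixel climatology
    std = ndvi_stack.std(axis=0)
    departures = ndvi_stack - mean
    return np.abs(departures) > 2 * std           # boolean anomaly mask

# Hypothetical 8-year stack of annual-mean NDVI on a 100 x 100 grid
rng = np.random.default_rng(8)
stack = 0.4 + 0.05 * rng.standard_normal((8, 100, 100))
mask = ndvi_anomalies(stack)
print(mask.sum(), "anomalous pixel-years")
```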

  14. Adaptive Sensing of Time Series with Application to Remote Exploration

    NASA Technical Reports Server (NTRS)

    Thompson, David R.; Cabrol, Nathalie A.; Furlong, Michael; Hardgrove, Craig; Low, Bryan K. H.; Moersch, Jeffrey; Wettergreen, David

    2013-01-01

    We address the problem of adaptive information-optimal data collection in time series. Here a remote sensor or explorer agent throttles its sampling rate in order to track anomalous events while obeying constraints on time and power. This problem is challenging because the agent has limited visibility -- all collected datapoints lie in the past, but its resource allocation decisions require predicting far into the future. Our solution is to continually fit a Gaussian process model to the latest data and optimize the sampling plan online to maximize information gain. We compare the performance characteristics of stationary and nonstationary Gaussian process models. We also describe an application based on geologic analysis during planetary rover exploration. Here adaptive sampling can improve coverage of localized anomalies and potentially benefit the mission science yield of long autonomous traverses.

  15. Behavior of road accidents: Structural time series approach

    NASA Astrophysics Data System (ADS)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir; Arsad, Zainudin

    2014-12-01

    Road accidents have become a major issue contributing to the increasing number of deaths. Some researchers suggest that road accidents occur due to road structure and road condition. The road structure and condition may differ according to the area and the volume of traffic at the location. Therefore, this paper attempts to examine the behavior of road accidents in four main regions in Peninsular Malaysia by employing a structural time series (STS) approach. STS offers the possibility of modelling unobserved components such as trend and seasonal components, which are allowed to vary over time. The results show that the number of road accidents in each region is described by a different model. The results imply that the government, and policy makers especially, should consider implementing different approaches to overcome the increasing number of road accidents.

  16. The Bermuda Atlantic Time-series Study (BATS): A Time-series Window on Sargasso Sea Ecosystem Functioning

    NASA Astrophysics Data System (ADS)

    Lomas, M. W.

    2001-12-01

    The Bermuda Atlantic Time-series Study (BATS), located in the Northwestern Sargasso Sea, was started over 12 years ago as part of the Joint Global Ocean Flux Study. The BATS sampling region lies ~82 km southeast of Bermuda in about 4600 meters of water, near the Ocean Flux Program site and the Bermuda Testbed Mooring. Over this 12-year period, a suite of core measurements has been made monthly, or biweekly during the winter/spring bloom period (January to April). These measurements cover a wide range of physical, chemical and biological stock measurements. In conjunction with these stock measurements, a number of BATS core rate process measurements are made, such as primary and bacterial production and particle mass flux. Over the 12-year record of this program, numerous ancillary projects have greatly enhanced the significance and interpretability of the core measurements. More importantly, this 12-year time-series data set has provided information that allows us to re-examine some of the dominant paradigms in biological oceanography, namely that the open ocean is an unchanging biological "desert". The past decade has seen a shift in the fate of the carbon fixed during primary production. Whereas a significant fraction of photosynthetically fixed carbon accumulated in the dissolved organic pool in the early 1990's, the late 1990's are characterized by a reduction in DOC accumulation and a commensurate >2-fold increase in particle flux from the euphotic zone. This change in the partitioning of primary production appears to be associated with significant changes in phytoplankton community structure and climatic forcing. As the BATS time-series record continues to extend, so too will our understanding of the mechanisms responsible for these apparent changes in the functioning of the Sargasso Sea ecosystem.

  17. Fast and Flexible Multivariate Time Series Subsequence Search

    NASA Technical Reports Server (NTRS)

    Bhaduri, Kanishka; Oza, Nikunj C.; Zhu, Qiang; Srivastava, Ashok N.

    2010-01-01

    Multivariate Time-Series (MTS) are ubiquitous, and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical monitoring, and financial systems. Domain experts are often interested in searching for interesting multivariate patterns from these MTS databases which often contain several gigabytes of data. Surprisingly, research on MTS search is very limited. Most of the existing work only supports queries with the same length of data, or queries on a fixed set of variables. In this paper, we propose an efficient and flexible subsequence search framework for massive MTS databases, that, for the first time, enables querying on any subset of variables with arbitrary time delays between them. We propose two algorithms to solve this problem (1) a List Based Search (LBS) algorithm which uses sorted lists for indexing, and (2) a R*-tree Based Search (RBS) which uses Minimum Bounding Rectangles (MBR) to organize the subsequences. Both algorithms guarantee that all matching patterns within the specified thresholds will be returned (no false dismissals). The very few false alarms can be removed by a post-processing step. Since our framework is also capable of Univariate Time-Series (UTS) subsequence search, we first demonstrate the efficiency of our algorithms on several UTS datasets previously used in the literature. We follow this up with experiments using two large MTS databases from the aviation domain, each containing several millions of observations. Both these tests show that our algorithms have very high prune rates (>99%) thus needing actual disk access for only less than 1% of the observations. To the best of our knowledge, MTS subsequence search has never been attempted on datasets of the size we have used in this paper.

  18. The Space-Time Conservative Schemes for Large-Scale, Time-Accurate Flow Simulations with Tetrahedral Meshes

    NASA Technical Reports Server (NTRS)

    Venkatachari, Balaji Shankar; Streett, Craig L.; Chang, Chau-Lyan; Friedlander, David J.; Wang, Xiao-Yen; Chang, Sin-Chung

    2016-01-01

    Despite decades of development of unstructured mesh methods, high-fidelity time-accurate simulations are still predominantly carried out on structured, or unstructured hexahedral meshes by using high-order finite-difference, weighted essentially non-oscillatory (WENO), or hybrid schemes formed by their combinations. In this work, the space-time conservation element solution element (CESE) method is used to simulate several flow problems including supersonic jet/shock interaction and its impact on launch vehicle acoustics, and direct numerical simulations of turbulent flows using tetrahedral meshes. This paper provides a status report for the continuing development of the space-time conservation element solution element (CESE) numerical and software framework under the Revolutionary Computational Aerosciences (RCA) project. Solution accuracy and large-scale parallel performance of the numerical framework is assessed with the goal of providing a viable paradigm for future high-fidelity flow physics simulations.

  19. Salt marsh mapping based on a short-time interval NDVI time-series from HJ-1 CCD imagery

    NASA Astrophysics Data System (ADS)

    SUN, C.

    2015-12-01

    Salt marshes are regarded as among the most dynamic and valuable ecosystems in the coastal zone. It is crucial to obtain accurate and timely information on the species composition and spatial distribution of salt marshes, since they are experiencing substantial replacement and disappearance. However, discriminating various types of salt marshes is a rather difficult task because of their strong spectral similarities. In previous studies, salt marsh mapping mainly relied on high-spatial-resolution and hyperspectral imagery combined with auxiliary information, but this approach can hardly be extended to large regions. With high temporal and moderate spatial resolution, Chinese HJ-1 CCD imagery not only allows monitoring phenological changes of salt marsh vegetation at short time intervals, but also covers large areas of salt marshes. Taking the middle coast of Jiangsu (east China) as an example, our study first constructed a monthly NDVI time-series to classify various types of salt marshes. Then, we tested the idea of compressing the time-series to broaden the applicability and portability of this particular approach. The results showed that (1) the overall accuracy of salt marsh mapping based on the monthly NDVI time-series reached 90.3%, an increase of approximately 16.0% over a single-phase classification strategy; (2) a compressed time-series including NDVI from six key months (April, June to September, and November) showed very little decline (2.3%) in overall accuracy but led to obvious improvements in unstable regions; (3) Spartina alterniflora identification could be achieved with only a single NDVI image from November, which could provide an effective way to regularly monitor its distribution. Besides, by comparing the calibrated performance of HJ-1 CCD with other sensors (i.e., Landsat TM/ETM+, OLI), we confirmed the reliability of HJ-1 CCD imagery, which is expected to pave the way for the large-scale application of this imagery.

  20. Analyzing bank filtration by deconvoluting time series of electric conductivity.

    PubMed

    Cirpka, Olaf A; Fienen, Michael N; Hofer, Markus; Hoehn, Eduard; Tessarini, Aronne; Kipfer, Rolf; Kitanidis, Peter K

    2007-01-01

    Knowing the travel-time distributions from infiltrating rivers to pumping wells is important in the management of alluvial aquifers. Commonly, travel-time distributions are determined by releasing a tracer pulse into the river and measuring the breakthrough curve in the wells. As an alternative, one may measure signals of a time-varying natural tracer in the river and in adjacent wells and infer the travel-time distributions by deconvolution. Traditionally this is done by fitting a parametric function such as the solution of the one-dimensional advection-dispersion equation to the data. By choosing a certain parameterization, it is impossible to determine features of the travel-time distribution that do not follow the general shape of the parameterization, i.e., multiple peaks. We present a method to determine travel-time distributions by nonparametric deconvolution of electric-conductivity time series. Smoothness of the inferred transfer function is achieved by a geostatistical approach, in which the transfer function is assumed as a second-order intrinsic random time variable. Nonnegativity is enforced by the method of Lagrange multipliers. We present an approach to directly compute the best nonnegative estimate and to generate sets of plausible solutions. We show how the smoothness of the transfer function can be estimated from the data. The approach is applied to electric-conductivity measurements taken at River Thur, Switzerland, and five wells in the adjacent aquifer, but the method can also be applied to other time-varying natural tracers such as temperature. At our field site, electric-conductivity fluctuations appear to be an excellent natural tracer.
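
    As a simplified stand-in for the geostatistical deconvolution, the sketch below estimates a nonnegative, smoothness-penalized transfer function by nonnegative least squares on a convolution matrix; the Lagrange-multiplier machinery and uncertainty sets of the paper are not reproduced.

```python
import numpy as np
from scipy.linalg import toeplitz
from scipy.optimize import nnls

def nonneg_transfer(river, well, n_lags=60, smooth=1e-2):
    """Estimate a nonnegative transfer function g so that well ~ river * g
    (discrete convolution), with a small roughness penalty."""
    # Convolution matrix: column k holds the river signal delayed by k steps
    A = toeplitz(river, np.zeros(n_lags))
    # Second-difference penalty encourages a smooth travel-time distribution
    D = np.diff(np.eye(n_lags), n=2, axis=0)
    A_aug = np.vstack([A, smooth * D])
    b_aug = np.concatenate([well, np.zeros(D.shape[0])])
    g, _ = nnls(A_aug, b_aug)
    return g   # g[k] = weight of travel time k

# Synthetic check: a known two-peak travel-time distribution is recovered
rng = np.random.default_rng(9)
river = rng.standard_normal(500).cumsum()        # smooth EC-like input signal
true_g = np.zeros(60); true_g[5] = 0.6; true_g[25] = 0.4
well = np.convolve(river, true_g)[:500] + 0.05 * rng.standard_normal(500)
g_hat = nonneg_transfer(river, well)
```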

  1. Time-Accurate Simulations and Acoustic Analysis of Slat Free-Shear-Layer. Part 2

    NASA Technical Reports Server (NTRS)

    Khorrami, Mehdi R.; Singer, Bart A.; Lockard, David P.

    2002-01-01

    Unsteady computational simulations of a multi-element, high-lift configuration are performed. Emphasis is placed on accurate spatiotemporal resolution of the free shear layer in the slat-cove region. The excessive dissipative effects of the turbulence model, so prevalent in previous simulations, are circumvented by switching off the turbulence-production term in the slat-cove region. The justifications and physical arguments for taking such a step are explained in detail. The removal of this excess damping allows the shear layer to amplify large-scale structures, to achieve a proper non-linear saturation state, and to permit vortex merging. The large-scale disturbances are self-excited, and unlike our prior fully turbulent simulations, no external forcing of the shear layer is required. To obtain the farfield acoustics, the Ffowcs Williams and Hawkings equation is evaluated numerically using the simulated time-accurate flow data. The present comparison between the computed and measured farfield acoustic spectra shows much better agreement for the amplitude and frequency content than past calculations. The effects of the angle of attack on the slat's flow features and radiated acoustic field are also simulated and presented.

  2. Challenges to Deriving Climate Time Series From Satellite Observations

    NASA Astrophysics Data System (ADS)

    Wentz, F. J.; Mears, C. A.

    2005-12-01

    Satellites have been observing the Earth's weather and climate since the launch of TIROS-1 in 1960. As satellite and sensor technology advanced over the next two decades, the accuracy of the satellite observations improved to the point of being useful for climate monitoring. The launch of the first Microwave Sounding Unit (MSU) in October 1978 and the first Special Sensor Microwave Imager (SSM/I) in June 1987 mark the beginning of research-quality time series for several important climate state variables, including tropospheric temperature and water vapor, cloud and rain water, and ocean surface winds. In this talk, we will illustrate the many obstacles that must be overcome to convert raw satellite measurements into climate data records. Probably the most pivotal issue is sensor calibration. Although an on-board self-calibrating apparatus is part of each satellite sensor, the accuracy of the calibration system is limited, and in some cases unexpected calibration problems have occurred on-orbit. The lack of exact calibration leads to a second problem: merging sensors flying on many different satellites into one consistent decadal time series. Also, drifts in the satellites' orbits, both in altitude and local time of day, must be carefully taken into account, or else spurious signals will enter the time series. In addition to these technical difficulties, programmatic problems present a different set of hurdles that must be overcome. The maintenance of a long-term climate record may necessitate sustaining a long-term research activity requiring continuity in both expert staffing and funding. The alternative of computing climate records in an operational rather than research environment creates a new set of problems. As the satellite sensor technology continues to advance into the next decade, new challenges will arise. The new sensors will have different channel sets, viewing geometries, and orbital characteristics. Their complexity will be an order of magnitude greater than

  3. Adaptive time-variant models for fuzzy-time-series forecasting.

    PubMed

    Wong, Wai-Keung; Bai, Enjian; Chu, Alice Wai-Ching

    2010-12-01

    A fuzzy time series has been applied to the prediction of enrollment, temperature, stock indices, and other domains. Related studies mainly focus on three factors, namely, the partition of discourse, the content of forecasting rules, and the methods of defuzzification, all of which greatly influence the prediction accuracy of forecasting models. These studies use fixed analysis window sizes for forecasting. In this paper, an adaptive time-variant fuzzy-time-series forecasting model (ATVF) is proposed to improve forecasting accuracy. The proposed model automatically adapts the analysis window size of fuzzy time series based on the prediction accuracy in the training phase and uses heuristic rules to generate forecasting values in the testing phase. The performance of the ATVF model is tested using both simulated and actual time series including the enrollments at the University of Alabama, Tuscaloosa, and the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX). The experiment results show that the proposed ATVF model achieves a significant improvement in forecasting accuracy as compared to other fuzzy-time-series forecasting models.

  4. 42 CFR 414.806 - Penalties associated with the failure to submit timely and accurate ASP data.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... timely and accurate ASP data. 414.806 Section 414.806 Public Health CENTERS FOR MEDICARE & MEDICAID... Penalties associated with the failure to submit timely and accurate ASP data. Section 1847A(d)(4) specifies the penalties associated with misrepresentations associated with ASP data. If the Secretary...

  5. 42 CFR 414.806 - Penalties associated with the failure to submit timely and accurate ASP data.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... timely and accurate ASP data. 414.806 Section 414.806 Public Health CENTERS FOR MEDICARE & MEDICAID... associated with the failure to submit timely and accurate ASP data. Section 1847A(d)(4) specifies the penalties associated with misrepresentations associated with ASP data. If the Secretary determines that...

  6. 42 CFR 414.806 - Penalties associated with the failure to submit timely and accurate ASP data.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... timely and accurate ASP data. 414.806 Section 414.806 Public Health CENTERS FOR MEDICARE & MEDICAID... Penalties associated with the failure to submit timely and accurate ASP data. Section 1847A(d)(4) specifies the penalties associated with misrepresentations associated with ASP data. If the Secretary...

  7. 42 CFR 414.806 - Penalties associated with the failure to submit timely and accurate ASP data.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... timely and accurate ASP data. 414.806 Section 414.806 Public Health CENTERS FOR MEDICARE & MEDICAID... associated with the failure to submit timely and accurate ASP data. Section 1847A(d)(4) specifies the penalties associated with misrepresentations associated with ASP data. If the Secretary determines that...

  8. 42 CFR 414.806 - Penalties associated with the failure to submit timely and accurate ASP data.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... timely and accurate ASP data. 414.806 Section 414.806 Public Health CENTERS FOR MEDICARE & MEDICAID... Penalties associated with the failure to submit timely and accurate ASP data. Section 1847A(d)(4) specifies the penalties associated with misrepresentations associated with ASP data. If the Secretary...

  9. Performance of vegetation indices from Landsat time series in deforestation monitoring

    NASA Astrophysics Data System (ADS)

    Schultz, Michael; Clevers, Jan G. P. W.; Carter, Sarah; Verbesselt, Jan; Avitabile, Valerio; Quang, Hien Vu; Herold, Martin

    2016-10-01

    The performance of Landsat time series (LTS) of eight vegetation indices (VIs) was assessed for monitoring deforestation across the tropics. Three sites were selected based on differing remote sensing observation frequencies, deforestation drivers and environmental factors. The LTS of each VI was analysed using the Breaks For Additive Season and Trend (BFAST) Monitor method to identify deforestation. A robust reference database was used to evaluate the performance regarding spatial accuracy, sensitivity to observation frequency and combined use of multiple VIs. The canopy cover sensitive Normalized Difference Fraction Index (NDFI) was the most accurate. Among those tested, wetness related VIs (Normalized Difference Moisture Index (NDMI) and the Tasselled Cap wetness (TCw)) were spatially more accurate than greenness related VIs (Normalized Difference Vegetation Index (NDVI) and Tasselled Cap greenness (TCg)). When VIs were fused on feature level, spatial accuracy was improved and overestimation of change reduced. NDVI and NDFI produced the most robust results when observation frequency varies.
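
    The normalized-difference indices compared in the study share one formula template; a minimal sketch follows (NDFI, which requires spectral unmixing into endmember fractions, is omitted). The Landsat 8 band mapping in the comment is an assumption for the example.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index (greenness-related)."""
    return (nir - red) / (nir + red)

def ndmi(nir, swir1):
    """Normalized Difference Moisture Index (wetness-related)."""
    return (nir - swir1) / (nir + swir1)

# Assumed Landsat 8 OLI surface-reflectance bands: 4 = red, 5 = NIR, 6 = SWIR1
rng = np.random.default_rng(14)
red, nir, swir1 = 0.05 + 0.3 * rng.random((3, 100, 100))
print(ndvi(nir, red).mean(), ndmi(nir, swir1).mean())
```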

  10. [Outlier Detection of Time Series Three-Dimensional Fluorescence Spectroscopy].

    PubMed

    Yu, Shao-hui; Zhang, Yu-jun; Zhao, Nan-jing

    2015-06-01

    Qualitative and quantitative analyses are often hampered by outliers in time series three-dimensional fluorescence spectroscopy. In this work, an efficient outlier detection method is proposed that takes advantage of characteristics in both the time dimension and the spectral dimension. First, the wavelength points that are most likely to be outliers are extracted using the variance in the time dimension. Second, by analyzing the ways in which outliers occur and the similarity score of any two samples, a cumulative similarity is introduced in the spectral dimension. Finally, the fluorescence intensity at each wavelength of all samples is modified by the correction matrix in the time dimension, and outlier detection is completed according to the cumulative similarity scores. The application of the correction matrix in the time dimension not only improves the validity of the method but also reduces computation through the choice of a characteristic region in the correction matrix. Numerical experiments show that outliers can still be detected using only 50 percent of all points in the spectral dimension.

  11. Multi-Granular Trend Detection for Time-Series Analysis.

    PubMed

    van Goethem, Arthur; Staals, Frank; Loffler, Maarten; Dykes, Jason; Speckmann, Bettina

    2017-01-01

    Time series (such as stock prices) and ensembles (such as model runs for weather forecasts) are two important types of one-dimensional time-varying data. Such data is readily available in large quantities but visual analysis of the raw data quickly becomes infeasible, even for moderately sized data sets. Trend detection is an effective way to simplify time-varying data and to summarize salient information for visual display and interactive analysis. We propose a geometric model for trend-detection in one-dimensional time-varying data, inspired by topological grouping structures for moving objects in two- or higher-dimensional space. Our model gives provable guarantees on the trends detected and uses three natural parameters: granularity, support-size, and duration. These parameters can be changed on-demand. Our system also supports a variety of selection brushes and a time-sweep to facilitate refined searches and interactive visualization of (sub-)trends. We explore different visual styles and interactions through which trends, their persistence, and evolution can be explored.

  12. Computer Program Recognizes Patterns in Time-Series Data

    NASA Technical Reports Server (NTRS)

    Hand, Charles

    2003-01-01

    A computer program recognizes selected patterns in time-series data like digitized samples of seismic or electrophysiological signals. The program implements an artificial neural network (ANN) and a set of N clocks for the purpose of determining whether N or more instances of a certain waveform, W, occur within a given time interval, T. The ANN must be trained to recognize W in the incoming stream of data. The first time the ANN recognizes W, it sets clock 1 to count down from T to zero; the second time it recognizes W, it sets clock 2 to count down from T to zero, and so forth through the Nth instance. On the N + 1st instance, the cycle is repeated, starting with clock 1. If any clock has not reached zero when it is reset, then N instances of W have been detected within time T, and the program so indicates. The program can readily be encoded in a field-programmable gate array or an application-specific integrated circuit that could be used, for example, to detect electroencephalographic or electrocardiographic waveforms indicative of epileptic seizures or heart attacks, respectively.
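
    The clock-cycling logic is compact enough to sketch directly; the recognizer itself (the trained ANN) is outside the scope of this illustration.

```python
class WaveformCounter:
    """N-clock logic from the description above: report True when the
    required number of recognized instances of waveform W fall within a
    time interval T (a sketch of the cycling scheme, not the flight code)."""

    def __init__(self, n, t_window):
        self.n = n
        self.t_window = t_window
        self.expiry = [None] * n     # one 'clock' per required instance
        self.next_clock = 0

    def on_recognized(self, t_now):
        """Call each time the ANN recognizes W; returns True on an alarm."""
        clock = self.next_clock
        # If this clock is still counting down when reset, enough instances
        # of W fell within the interval T
        alarm = self.expiry[clock] is not None and t_now < self.expiry[clock]
        self.expiry[clock] = t_now + self.t_window   # restart the clock from T
        self.next_clock = (clock + 1) % self.n       # cycle through the N clocks
        return alarm

detector = WaveformCounter(n=3, t_window=10.0)
for t in [0.0, 2.0, 4.0, 5.0]:        # four detections within 10 time units
    print(t, detector.on_recognized(t))
```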

  13. Adaptive Sampling of Time Series During Remote Exploration

    NASA Technical Reports Server (NTRS)

    Thompson, David R.

    2012-01-01

    This work deals with the challenge of online adaptive data collection in a time series. A remote sensor or explorer agent adapts its rate of data collection in order to track anomalous events while obeying constraints on time and power. This problem is challenging because the agent has limited visibility (all its datapoints lie in the past) and limited control (it can only decide when to collect its next datapoint). This problem is treated from an information-theoretic perspective, fitting a probabilistic model to collected data and optimizing the future sampling strategy to maximize information gain. The performance characteristics of stationary and nonstationary Gaussian process models are compared. Self-throttling sensors could benefit environmental sensor networks and monitoring as well as robotic exploration. Explorer agents can improve performance by adjusting their data collection rate, preserving scarce power or bandwidth resources during uninteresting times while fully covering anomalous events of interest. For example, a remote earthquake sensor could conserve power by limiting its measurements during normal conditions and increasing its cadence during rare earthquake events. A similar capability could improve sensor platforms traversing a fixed trajectory, such as an exploration rover transect or a deep space flyby. These agents can adapt observation times to improve sample coverage during moments of rapid change. An adaptive sampling approach couples sensor autonomy, instrument interpretation, and sampling. The challenge is addressed as an active learning problem, which already has extensive theoretical treatment in the statistics and machine learning literature. A statistical Gaussian process (GP) model is employed to guide sample decisions that maximize information gain. Nonstationary (e.g., time-varying) covariance relationships permit the system to represent and track local anomalies, in contrast with current GP approaches. Most common GP models
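
    A minimal sketch of the variance-driven sampling decision under a stationary Gaussian process (the squared-exponential kernel, noise level, and candidate grid are assumptions; the paper's nonstationary covariances go beyond this): for a GP, expected information gain grows monotonically with predictive variance, so the next sample time can be chosen where the posterior variance is largest.

        import numpy as np

        def rbf(a, b, length=1.0, sigma=1.0):
            """Squared-exponential covariance between time vectors a and b."""
            d = a[:, None] - b[None, :]
            return sigma**2 * np.exp(-0.5 * (d / length) ** 2)

        def next_sample_time(t_obs, y_obs, t_candidates, noise=1e-3, length=1.0):
            """Pick the future time with maximum GP predictive variance."""
            # y_obs is unused: the variance of a fixed stationary GP does not
            # depend on observed values; a nonstationary model would use it.
            K = rbf(t_obs, t_obs, length) + noise * np.eye(len(t_obs))
            Ks = rbf(t_candidates, t_obs, length)
            # Posterior variance: k(x,x) - k(x,X) K^{-1} k(X,x)
            solve = np.linalg.solve(K, Ks.T)
            var = (rbf(t_candidates, t_candidates, length).diagonal()
                   - np.einsum('ij,ji->i', Ks, solve))
            return t_candidates[np.argmax(var)]

        # Demo: past samples cluster near t = 0..2, so the most informative
        # next sample lies where coverage is sparsest.
        t_obs = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
        y_obs = np.sin(t_obs)
        candidates = np.linspace(2.0, 6.0, 41)
        print(next_sample_time(t_obs, y_obs, candidates))  # -> 6.0, farthest out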

  14. United States Forest Disturbance Trends Observed Using Landsat Time Series

    NASA Technical Reports Server (NTRS)

    Masek, Jeffrey G.; Goward, Samuel N.; Kennedy, Robert E.; Cohen, Warren B.; Moisen, Gretchen G.; Schleeweis, Karen; Huang, Chengquan

    2013-01-01

    Disturbance events strongly affect the composition, structure, and function of forest ecosystems; however, existing U.S. land management inventories were not designed to monitor disturbance. To begin addressing this gap, the North American Forest Dynamics (NAFD) project has examined a geographic sample of 50 Landsat satellite image time series to assess trends in forest disturbance across the conterminous United States for 1985-2005. The geographic sample design used a probability-based scheme to encompass major forest types and maximize geographic dispersion. For each sample location, disturbance was identified in the Landsat series using the Vegetation Change Tracker (VCT) algorithm. The NAFD analysis indicates that, on average, 2.77 Mha of forest were disturbed annually, representing 1.09%/yr of US forestland. These satellite-based national disturbance rate estimates tend to be lower than those derived from land management inventories, reflecting both methodological and definitional differences. In particular, the VCT approach used with a biennial time step has limited sensitivity to low-intensity disturbances. Unlike prior satellite studies, our biennial forest disturbance rates vary by nearly a factor of two between high and low years. High western US disturbance rates were associated with active fire years and insect activity, while variability in the east is more strongly related to harvest rates in managed forests. We note that generating a geographic sample based on representing forest type and variability may be problematic, since the spatial pattern of disturbance does not necessarily correlate with forest type. We also find that the prevalence of diffuse, non-stand-clearing disturbance in US forests makes the application of a biennial geographic sample problematic. Future satellite-based studies of disturbance at regional and national scales should focus on wall-to-wall analyses with an annual time step for improved accuracy.

  15. Nonlinear time series analysis of epileptic human electroencephalogram (EEG)

    NASA Astrophysics Data System (ADS)

    Li, Dingzhou

    The problem of seizure anticipation in patients with epilepsy has attracted significant attention in the past few years. In this paper we discuss two approaches, using methods of nonlinear time series analysis applied to scalp electrode recordings, which are able to distinguish between epochs temporally distant from, and just prior to, the onset of a seizure in patients with temporal lobe epilepsy. First we describe a method involving a comparison of recordings taken from electrodes adjacent to and remote from the site of the seizure focus. In particular, we define a nonlinear quantity which we call marginal predictability. This quantity is computed using data from remote and from adjacent electrodes. We find that the difference between the marginal predictabilities computed for the remote and adjacent electrodes decreases several tens of minutes prior to seizure onset, compared to its value interictally. We also show that these differences in marginal predictability are independent of the behavioral state of the patient. Next we examine the phase coherence between different electrodes, both at long range and at short range. At times distant from seizure onsets ("interictally"), epileptic patients have lower long-range phase coherence in the delta (1-4 Hz) and beta (18-30 Hz) frequency bands compared to nonepileptic subjects. When seizures approach ("preictally"), we observe an increase in phase coherence in the beta band. However, interictally there is no difference in short-range phase coherence between this cohort of patients and non-epileptic subjects. Preictally, short-range phase coherence also increases in the alpha (10-13 Hz) and beta bands. Next we apply the marginal predictability to the phase-difference time series. Such marginal predictabilities are lower in the patients than in the non-epileptic subjects; however, as seizures approach, the former moves asymptotically towards the latter.
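
    The marginal predictability measure is specific to this work, but the phase-coherence computation it builds on is standard. A minimal sketch (the band edges, filter order, and phase-locking-value statistic are conventional choices, not taken from the paper): band-pass both channels, extract instantaneous phase via the Hilbert transform, and measure how consistently the phase difference holds.

        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert

        def phase_locking_value(x, y, fs, band):
            """Phase coherence of two channels in a frequency band.

            PLV = |mean(exp(i * (phi_x - phi_y)))|;
            1 = perfect phase locking, 0 = no locking.
            """
            b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)],
                          btype='band')
            phi_x = np.angle(hilbert(filtfilt(b, a, x)))
            phi_y = np.angle(hilbert(filtfilt(b, a, y)))
            return np.abs(np.mean(np.exp(1j * (phi_x - phi_y))))

        # Demo: two channels sharing a beta-band (18-30 Hz) rhythm are
        # strongly phase locked; independent noise is not.
        fs = 256
        t = np.arange(0, 10, 1 / fs)
        shared = np.sin(2 * np.pi * 22 * t)
        rng = np.random.default_rng(0)
        ch1 = shared + 0.5 * rng.standard_normal(t.size)
        ch2 = shared + 0.5 * rng.standard_normal(t.size)
        print(phase_locking_value(ch1, ch2, fs, (18, 30)))   # near 1
        print(phase_locking_value(ch1, rng.standard_normal(t.size),
                                  fs, (18, 30)))             # near 0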

  16. SAGE: A tool for time-series analysis of Greenland

    NASA Astrophysics Data System (ADS)

    Duerr, R. E.; Gallaher, D. W.; Khalsa, S. S.; Lewis, S.

    2011-12-01

    The National Snow and Ice Data Center (NSIDC) has developed an operational analysis tool known as "Services for the Analysis of the Greenland Environment" (SAGE). Using an integrated workspace approach, a researcher can find relevant data, perform various analysis functions on the data, and retrieve the data and analysis results. While there continues to be compelling observational evidence for increased surface melting and rapid thinning along the margins of the Greenland ice sheet, there are still uncertainties in estimates of the mass balance of Greenland's ice sheet as a whole. To better understand the dynamics of these issues, it is important for scientists to have access to a variety of datasets from multiple sources, and to be able to integrate and analyze the data. SAGE provides data from various sources, such as AMSR-E and AVHRR datasets, which can be analyzed individually through various time-series plots and aggregation functions, or together through scatterplots or overlaid time-series plots, to provide quick and useful results supporting various research products. The application is available at http://nsidc.org/data/sage/. SAGE was built on top of NSIDC's existing Searchlight engine. The SAGE interface gives users access to much of NSIDC's relevant Greenland raster data holdings, as well as data from outside sources. Additionally, various web services allow other clients to use the functionality that the SAGE interface provides. Combined, these methods of accessing the tool allow scientists to devote more of their time to research and less to finding and retrieving the data they need.

  17. Acoustic thermometry time series in the North Pacific

    NASA Astrophysics Data System (ADS)

    Dushaw, B. D.; Howe, B. M.; Mercer, J. A.; Worcester, P. F.; NPAL Group*

    2002-12-01

    Acoustic measurements of large-scale, depth-averaged temperatures are continuing in the North Pacific as a follow-on to the Acoustic Thermometry of Ocean Climate (ATOC) project. An acoustic source is located just north of Kauai. It transmits to six receivers to the east at 1-4-Mm ranges and one receiver to the northwest at about 4-Mm range. The transmission schedule is six times per day at four-day intervals. The time series were obtained from 1998 through 1999 and, after a two-year interruption because of permitting issues, began again in January 2002 to continue for at least another five years. The intense mesoscale thermal variability around Hawaii is evident in all time series; this variability is much greater than that observed near the California coast. The paths to the east, particularly those to the California coast, show cooling this year relative to the earlier data. The path to the northwest shows a modest warming. The acoustic rays sample depths below the mixed layer near Hawaii and to the surface as they near the California coast or extend north of the sub-arctic front. The temperatures measured acoustically are compared with those inferred from TOPEX altimetry, ARGO float data, and ECCO (Estimating the Circulation and Climate of the Ocean) model output. This ongoing data collection effort, to be augmented over the next years with a more complete observing array, can be used for, e.g., separating whole-basin climate change from low-mode spatial variability such as the Pacific Decadal Oscillation (PDO). [*NPAL (North Pacific Acoustic Laboratory) Group: J. A. Colosi, B. D. Cornuelle, B. D. Dushaw, M. A. Dzieciuch, B. M. Howe, J. A. Mercer, R. C. Spindel, and P. F. Worcester. Work supported by the Office of Naval Research.]

  18. Established time series measure occurrence and frequency of episodic events.

    NASA Astrophysics Data System (ADS)

    Pebody, Corinne; Lampitt, Richard

    2015-04-01

    Episodic flux events occur in the open ocean. Time series that make measurements over significant time scales are one of the few methods that can capture these events and compare their impact with 'normal' flux. Seemingly rare events may be significant on local scales, but without the ability to measure the extent of flux on spatial and temporal scales, combined with the frequency of occurrence, it is difficult to constrain their impact. The Porcupine Abyssal Plain Sustained Observatory (PAP-SO) in the Northeast Atlantic (49 °N 16 °W, 5000 m water depth) has measured particle flux since 1989 and zooplankton swimmers since 2000. Sediment traps at 3000 m and 100 m above bottom collect material year round, and we have identified close links between zooplankton and particle flux. Some of these larger animals, for example Diacria trispinosa, make a significant contribution to carbon flux through episodic flux events. D. trispinosa is a euthecosome mollusc that occurs in the Northeast Atlantic, though the PAP-SO is towards the northern limit of its distribution. Pteropods comprise an aragonite shell containing the soft body parts, except for the muscular foot, which extends beyond the mouth of the living animal. Both live-on-entry animals and empty shells are found year round in the 3000 m trap. Generally their abundance varies with particle flux, but within that general pattern there are episodic events in which significant numbers of these animals, containing both organic and inorganic carbon, are captured at depth and could therefore be defined as contributing to export flux. Whether the pulse of animals results from the life cycle of D. trispinosa or from the physics of the water column is unclear, but the complexity of the PAP-SO enables us not only to collect these animals but to examine them in parallel with the biogeochemical and physical elements measured by the

  19. Numerical Methodology for Coupled Time-Accurate Simulations of Primary and Secondary Flowpaths in Gas Turbines

    NASA Technical Reports Server (NTRS)

    Przekwas, A. J.; Athavale, M. M.; Hendricks, R. C.; Steinetz, B. M.

    2006-01-01

    Detailed information on the flow fields in the secondary flowpaths, and on their interaction with the primary flows in gas turbine engines, is necessary for successful designs with optimized secondary flow streams. The present work focuses on the development of a simulation methodology for coupled, time-accurate solutions of the two flowpaths. The secondary flowstream is treated using SCISEAL, an unstructured adaptive Cartesian grid code developed for secondary flows and seals, while the main-path flow is solved using TURBO, a density-based code capable of resolving rotor-stator interaction in multi-stage machines. An interface that links the two codes at the rim seal is being tested, allowing data exchange between the two codes for parallel, coupled execution. A description of the coupling methodology and the current status of the interface development is presented. Representative steady-state solutions of the secondary flow in the UTRC HP Rig disc cavity are also presented.
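
    As a schematic of such a coupling loop (purely illustrative; the class and method names below are hypothetical stand-ins for the actual SCISEAL/TURBO interface calls), each physical time step exchanges interface data at the rim seal before both solvers advance in lockstep:

        def couple(primary, secondary, interface, dt, n_steps):
            """Advance two solvers in lockstep, exchanging interface data
            each step so both see a consistent rim-seal boundary state."""
            for step in range(n_steps):
                # 1. Exchange: each solver exports its current interface state.
                bc_from_primary = primary.export_interface(interface)
                bc_from_secondary = secondary.export_interface(interface)
                primary.set_boundary(interface, bc_from_secondary)
                secondary.set_boundary(interface, bc_from_primary)
                # 2. Advance both over the same physical time step
                #    (in practice, in parallel on separate ranks).
                primary.advance(dt)
                secondary.advance(dt)

        class StubSolver:
            """Trivial stand-in: state relaxes toward the neighbor's value."""
            def __init__(self, state):
                self.state = state
                self.bc = state
            def export_interface(self, interface):
                return self.state
            def set_boundary(self, interface, value):
                self.bc = value
            def advance(self, dt):
                self.state += dt * (self.bc - self.state)

        primary, secondary = StubSolver(1.0), StubSolver(0.0)
        couple(primary, secondary, interface="rim_seal", dt=0.1, n_steps=50)
        print(primary.state, secondary.state)   # both relax toward ~0.5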

  20. Cartesian Off-Body Grid Adaption for Viscous Time-Accurate Flow Simulation

    NASA Technical Reports Server (NTRS)

    Buning, Pieter G.; Pulliam, Thomas H.

    2011-01-01

    An improved solution adaption capability has been implemented in the OVERFLOW overset grid CFD code. Building on the Cartesian off-body approach inherent in OVERFLOW and the original adaptive refinement method developed by Meakin, the new scheme provides for automated creation of multiple levels of finer Cartesian grids. Refinement can be based on the undivided second-difference of the flow solution variables, or on a specific flow quantity such as vorticity. Coupled with load-balancing and an in-memory solution interpolation procedure, the adaption process provides very good performance for time-accurate simulations on parallel compute platforms. A method of using refined, thin body-fitted grids combined with adaption in the off-body grids is presented, which maximizes the part of the domain subject to adaption. Two- and three-dimensional examples are used to illustrate the effectiveness and performance of the adaption scheme.
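
    A minimal sketch of the undivided second-difference sensor in one dimension (the threshold and the test field are illustrative; OVERFLOW applies this per flow variable across multi-level Cartesian grids): because the difference is left undivided (no 1/dx^2 scaling), it is large only where the solution varies sharply relative to the local grid spacing, making it a resolution-aware adaption sensor.

        import numpy as np

        def refinement_flags(u, threshold):
            """Flag cells whose undivided second difference
            d2[i] = u[i-1] - 2*u[i] + u[i+1] exceeds the threshold."""
            d2 = np.abs(u[:-2] - 2 * u[1:-1] + u[2:])
            flags = np.zeros(u.shape, dtype=bool)
            flags[1:-1] = d2 > threshold
            return flags

        # Demo: a smooth field with one steep layer flags only the layer.
        x = np.linspace(0.0, 1.0, 101)
        u = np.tanh((x - 0.5) / 0.01)          # sharp feature at x = 0.5
        print(np.where(refinement_flags(u, 0.05))[0])  # indices near 50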