Sample records for generated time series

  1. A general framework for time series data mining based on event analysis: application to the medical domains of electroencephalography and stabilometry.

    PubMed

    Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P

    2014-10-01

    There are now domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, like medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events and compares two time series by analyzing the events they have in common. We have applied our framework to time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed scheme has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains generating iconographic time series, such as, for example, electrocardiography (ECG). Copyright © 2014 Elsevier Inc. All rights reserved.

  2. InSAR Deformation Time Series Processed On-Demand in the Cloud

    NASA Astrophysics Data System (ADS)

    Horn, W. B.; Weeden, R.; Dimarchi, H.; Arko, S. A.; Hogenson, K.

    2017-12-01

    During this past year, ASF has developed a cloud-based on-demand processing system known as HyP3 (http://hyp3.asf.alaska.edu/), the Hybrid Pluggable Processing Pipeline, for Synthetic Aperture Radar (SAR) data. The system makes it easy for a user who doesn't have the time or inclination to install and use complex SAR processing software to leverage SAR data in their research or operations. One such processing algorithm is generation of a deformation time series product, which is a series of images representing ground displacements over time, which can be computed using a time series of interferometric SAR (InSAR) products. The set of software tools necessary to generate this useful product is difficult to install, configure, and use. Moreover, for a long time series with many images, the processing of just the interferograms can take days. Principally built by three undergraduate students at the ASF DAAC, the deformation time series processing relies on the new Amazon Batch service, which enables processing of jobs with complex interconnected dependencies in a straightforward and efficient manner. In the case of generating a deformation time series product from a stack of single-look complex SAR images, the system uses Batch to serialize the up-front processing, interferogram generation, optional tropospheric correction, and deformation time series generation. The most time consuming portion is the interferogram generation, because even for a fairly small stack of images many interferograms need to be processed. By using AWS Batch, the interferograms are all generated in parallel; the entire process completes in hours rather than days. Additionally, the individual interferograms are saved in Amazon's cloud storage, so that when new data is acquired in the stack, an updated time series product can be generated with minimal additional processing. This presentation will focus on the development techniques and enabling technologies that were used in developing the time series processing in the ASF HyP3 system. Data and process flow from job submission through to order completion will be shown, highlighting the benefits of the cloud for each step.

  3. Multifractal analysis of visibility graph-based Ito-related connectivity time series.

    PubMed

    Czechowski, Zbigniew; Lovallo, Michele; Telesca, Luciano

    2016-02-01

    In this study, we investigate multifractal properties of connectivity time series resulting from the visibility graph applied to normally distributed time series generated by the Ito equations with multiplicative power-law noise. We show that multifractality of the connectivity time series (i.e., the series of the number of links outgoing from each node) increases with the exponent of the power-law noise. The multifractality of the connectivity time series could be due to the width of the connectivity degree distribution that can be related to the exit time of the associated Ito time series. Furthermore, the connectivity time series are characterized by persistence, although the original Ito time series are random; this is due to the visibility graph procedure, which, by connecting the values of the time series, generates persistence but destroys most of the nonlinear correlations. Moreover, the visibility graph is sensitive in detecting wide "depressions" in input time series.

  4. Pseudo-random bit generator based on lag time series

    NASA Astrophysics Data System (ADS)

    García-Martínez, M.; Campos-Cantón, E.

    2014-12-01

    In this paper, we present a pseudo-random bit generator (PRBG) based on two lag time series of the logistic map using positive and negative values in the bifurcation parameter. In order to hide the map used to build the pseudo-random series, we have introduced a delay in the generation of the time series. When these new series are mapped, xn against xn+1, they present a cloud of points unrelated to the logistic map. Finally, the pseudo-random sequences have been tested with the NIST suite, giving satisfactory results for use in stream ciphers.
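
    The construction above can be illustrated with a short sketch: iterate the logistic map but record only every lag-th state before thresholding into bits. The function name, parameter values, burn-in length and threshold rule below are illustrative assumptions, not the exact choices of the paper.

      import numpy as np

      def lagged_logistic_bits(n_bits, r=3.99, lag=7, x=0.41, burn_in=1000):
          # Sketch of a PRBG from a lagged logistic map: the map x <- r*x*(1 - x)
          # is iterated, but only every `lag`-th state is observed, hiding the
          # one-step return map; bits are extracted by thresholding at 0.5
          # (an assumed extraction rule).
          for _ in range(burn_in):                  # discard the transient
              x = r * x * (1.0 - x)
          bits = np.empty(n_bits, dtype=np.uint8)
          for i in range(n_bits):
              for _ in range(lag):
                  x = r * x * (1.0 - x)
              bits[i] = 1 if x > 0.5 else 0
          return bits

      print(lagged_logistic_bits(32))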

  5. A multi-site stochastic weather generator of daily precipitation and temperature

    USDA-ARS?s Scientific Manuscript database

    Stochastic weather generators are used to generate time series of climate variables that have statistical properties similar to those of observed data. Most stochastic weather generators work for a single site, and can only generate climate data at a single point, or independent time series at sever...

  6. Semi-autonomous remote sensing time series generation tool

    NASA Astrophysics Data System (ADS)

    Babu, Dinesh Kumar; Kaufmann, Christof; Schmidt, Marco; Dhams, Thorsten; Conrad, Christopher

    2017-10-01

    High spatial and temporal resolution data is vital for crop monitoring and phenology change detection. Due to the lack of satellite architecture and frequent cloud cover issues, availability of daily high spatial resolution data is still far from reality. Remote sensing time series generation of high spatial and temporal data by data fusion seems to be a practical alternative. However, it is not an easy process, since it involves multiple steps and also requires multiple tools. In this paper, a Geographic Information System (GIS) based tool framework is presented for semi-autonomous time series generation. This tool will eliminate the difficulties by automating all the steps and enable users to generate synthetic time series data with ease. Firstly, all the steps required for the time series generation process are identified and grouped into blocks based on their functionalities. Then two main frameworks are created, one to perform all the pre-processing steps on various satellite data and the other one to perform data fusion to generate time series. The two frameworks can be used individually to perform specific tasks or they could be combined to perform both processes in one go. This tool can handle most of the known geo data formats currently available, which makes it a generic tool for time series generation of various remote sensing satellite data. This tool is developed as a common platform with a good interface that provides a lot of functionality to enable further development of more remote sensing applications. A detailed description of the capabilities and advantages of the frameworks is given in this paper.

  7. A high-fidelity weather time series generator using the Markov Chain process on a piecewise level

    NASA Astrophysics Data System (ADS)

    Hersvik, K.; Endrerud, O.-E. V.

    2017-12-01

    A method is developed for generating a set of unique weather time-series based on an existing weather series. The method allows statistically valid weather variations to take place within repeated simulations of offshore operations. The numerous generated time series need to share the same statistical qualities as the original time series. Statistical qualities here refer mainly to the distribution of weather windows available for work, including durations and frequencies of such weather windows, and seasonal characteristics. The method is based on the Markov chain process. The core new development lies in how the Markov Process is used, specifically by joining small pieces of random length time series together rather than joining individual weather states, each from a single time step, which is a common solution found in the literature. This new Markov model shows favorable characteristics with respect to the requirements set forth and all aspects of the validation performed.
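
    As a rough illustration of the piecewise idea, the sketch below resamples random-length segments of an observed series and accepts a segment only if its first value falls in the same discretized weather state as the last generated value. The segment-length bounds, the quantile-based state discretization and the toy input series are assumptions for illustration only, not the paper's configuration.

      import numpy as np

      def piecewise_markov_series(hist, n_out, min_len=6, max_len=48, n_states=5, seed=0):
          # Sketch: chain random-length pieces of the historical series `hist`
          # (e.g. significant wave height); a piece is accepted only when its
          # first value lies in the same discretized state as the last generated
          # value, i.e. a Markov-style junction rule on weather states.
          rng = np.random.default_rng(seed)
          edges = np.quantile(hist, np.linspace(0, 1, n_states + 1)[1:-1])
          state = lambda v: int(np.digitize(v, edges))
          out = [float(hist[rng.integers(0, len(hist) - max_len)])]
          while len(out) < n_out:
              length = int(rng.integers(min_len, max_len + 1))
              start = int(rng.integers(0, len(hist) - length))
              piece = hist[start:start + length]
              if state(piece[0]) == state(out[-1]):
                  out.extend(piece.tolist())
          return np.asarray(out[:n_out])

      hist = 2.0 + np.sin(np.arange(2000) / 20.0) \
             + 0.3 * np.random.default_rng(1).standard_normal(2000)   # toy weather record
      synthetic = piecewise_markov_series(hist, 500)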

  8. Sensor-Generated Time Series Events: A Definition Language

    PubMed Central

    Anguera, Aurea; Lara, Juan A.; Lizcano, David; Martínez, Maria Aurora; Pazos, Juan

    2012-01-01

    There are now a great many domains where information is recorded by sensors over a limited time period or on a permanent basis. This data flow leads to sequences of data known as time series. In many domains, like seismography or medicine, time series analysis focuses on particular regions of interest, known as events, whereas the remainder of the time series contains hardly any useful information. In these domains, there is a need for mechanisms to identify and locate such events. In this paper, we propose an events definition language that is general enough to be used to easily and naturally define events in time series recorded by sensors in any domain. The proposed language has been applied to the definition of time series events generated within the branch of medicine dealing with balance-related functions in human beings. A device, called a posturograph, is used to study balance-related functions. The platform has four sensors that record the pressure intensity being exerted on the platform, generating four interrelated time series. As opposed to the existing ad hoc proposals, the results confirm that the proposed language is valid, that is, generally applicable and accurate, for identifying the events contained in the time series.

  9. A study of sound generation in subsonic rotors, volume 2

    NASA Technical Reports Server (NTRS)

    Chalupnik, J. D.; Clark, L. T.

    1975-01-01

    Computer programs were developed for use in the analysis of sound generation by subsonic rotors. Program AIRFOIL computes the spectrum of radiated sound from a single airfoil immersed in a laminar flow field. Program ROTOR extends this to a rotating frame, and provides a model for sound generation in subsonic rotors. The program also computes tone sound generation due to steady state forces on the blades. Program TONE uses a moving source analysis to generate a time series for an array of forces moving in a circular path. The resultant time series are then Fourier transformed to render the results in spectral form. Program SDATA is a standard time series analysis package. It reads in two discrete time series and forms auto and cross covariances and normalizes these to form correlations. The program then transforms the covariances to yield auto and cross power spectra by means of a Fourier transformation.

  10. A novel weight determination method for time series data aggregation

    NASA Astrophysics Data System (ADS)

    Xu, Paiheng; Zhang, Rong; Deng, Yong

    2017-09-01

    Aggregation in time series is of great importance in time series smoothing, predicting and other time series analysis processes, which makes it crucial to address the weights in time series correctly and reasonably. In this paper, a novel method to obtain the weights in time series is proposed, in which we adopt the induced ordered weighted aggregation (IOWA) operator and the visibility graph averaging (VGA) operator and linearly combine the weights separately generated by the two operators. The IOWA operator is introduced to the weight determination of time series, through which the time decay factor is taken into consideration. The VGA operator is able to generate weights with respect to the degree distribution in the visibility graph constructed from the corresponding time series, which reflects the relative importance of vertices in the time series. The proposed method is applied to two practical datasets to illustrate its merits. The aggregation of the Construction Cost Index (CCI) demonstrates the ability of the proposed method to smooth time series, while the aggregation of the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) illustrates how the proposed method maintains the variation tendency of the original data.

  11. Volatility of linear and nonlinear time series

    NASA Astrophysics Data System (ADS)

    Kalisky, Tomer; Ashkenazy, Yosef; Havlin, Shlomo

    2005-07-01

    Previous studies indicated that nonlinear properties of Gaussian distributed time series with long-range correlations, ui , can be detected and quantified by studying the correlations in the magnitude series ∣ui∣ , the “volatility.” However, the origin for this empirical observation still remains unclear and the exact relation between the correlations in ui and the correlations in ∣ui∣ is still unknown. Here we develop analytical relations between the scaling exponent of linear series ui and its magnitude series ∣ui∣ . Moreover, we find that nonlinear time series exhibit stronger (or the same) correlations in the magnitude time series compared with linear time series with the same two-point correlations. Based on these results we propose a simple model that generates multifractal time series by explicitly inserting long range correlations in the magnitude series; the nonlinear multifractal time series is generated by multiplying a long-range correlated time series (that represents the magnitude series) with uncorrelated time series [that represents the sign series sgn(ui) ]. We apply our techniques to daily deep ocean temperature records from the equatorial Pacific, the region of the El-Niño phenomenon, and find: (i) long-range correlations from several days to several years with 1/f power spectrum, (ii) significant nonlinear behavior as expressed by long-range correlations of the volatility series, and (iii) broad multifractal spectrum.
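
    A minimal numerical sketch of the proposed construction is given below: a long-range correlated series is produced by Fourier filtering (an assumed way to obtain the magnitude series), and the multifractal-type series is then formed by multiplying its absolute values with an independent, uncorrelated random sign series. The exponent and series length are illustrative.

      import numpy as np

      def fourier_correlated(n, beta, rng):
          # Long-range correlated Gaussian series with power spectrum ~ 1/f^beta
          # (Fourier-filtering method), standardized to zero mean and unit variance.
          freqs = np.fft.rfftfreq(n)
          amp = np.where(freqs > 0, freqs ** (-beta / 2.0), 0.0)
          phases = rng.uniform(0, 2 * np.pi, len(freqs))
          x = np.fft.irfft(amp * np.exp(1j * phases), n)
          return (x - x.mean()) / x.std()

      rng = np.random.default_rng(0)
      n = 2 ** 14
      magnitude = np.abs(fourier_correlated(n, beta=0.6, rng=rng))  # correlated magnitudes
      sign = rng.choice([-1.0, 1.0], size=n)                        # uncorrelated signs
      series = magnitude * sign                                     # nonlinear (multifractal-type) series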

  12. A Comparison of Forecast Error Generators for Modeling Wind and Load Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Ning; Diao, Ruisheng; Hafen, Ryan P.

    2013-07-25

    This paper presents four algorithms to generate random forecast error time series. The performance of the four algorithms is compared. The error time series are used to create real-time (RT), hour-ahead (HA), and day-ahead (DA) wind and load forecast time series that statistically match historically observed forecasting data sets used in power grid operation to study the net load balancing need in variable generation integration studies. The four algorithms are truncated-normal distribution models, state-space based Markov models, seasonal autoregressive moving average (ARMA) models, and a stochastic-optimization based approach. The comparison is made using historical DA load forecast and actual load values to generate new sets of DA forecasts with similar statistical forecast error characteristics (i.e., mean, standard deviation, autocorrelation, and cross-correlation). The results show that all methods generate satisfactory results. One method may preserve one or two required statistical characteristics better than the other methods, but may not preserve other statistical characteristics as well as the other methods. Because the wind and load forecast error generators are used in wind integration studies to produce wind and load forecast time series for stochastic planning processes, it is sometimes critical to use multiple methods to generate the error time series to obtain a statistically robust result. Therefore, this paper discusses and compares the capabilities of each algorithm to preserve the characteristics of the historical forecast data sets.
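
    Of the four approaches, the truncated-normal error model is the simplest to sketch: bounded random errors are added to the actual series to emulate a day-ahead forecast. The helper name, bounds, mean and standard deviation below are illustrative assumptions, not the paper's fitted values.

      import numpy as np
      from scipy.stats import truncnorm

      def truncated_normal_forecast(actual, mean_err, std_err, lo, hi, seed=0):
          # Sketch: synthetic day-ahead forecast = actual series + truncated-normal
          # errors bounded between `lo` and `hi`.
          a, b = (lo - mean_err) / std_err, (hi - mean_err) / std_err
          errors = truncnorm.rvs(a, b, loc=mean_err, scale=std_err,
                                 size=len(actual), random_state=seed)
          return actual + errors

      actual_load = 1000 + 200 * np.sin(np.linspace(0, 4 * np.pi, 48))   # toy hourly load
      da_forecast = truncated_normal_forecast(actual_load, mean_err=0.0,
                                              std_err=30.0, lo=-80.0, hi=80.0)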

  13. Macroscopic Spatial Complexity of the Game of Life Cellular Automaton: A Simple Data Analysis

    NASA Astrophysics Data System (ADS)

    Hernández-Montoya, A. R.; Coronel-Brizio, H. F.; Rodríguez-Achach, M. E.

    In this chapter we present a simple data analysis of an ensemble of 20 time series, generated by averaging the spatial positions of the living cells for each state of the Game of Life Cellular Automaton (GoL). We show that at the macroscopic level described by these time series, complexity properties of GoL are also present, and the following emergent properties, typical of data extracted from complex systems such as financial or economic ones, come out: the variations of the generated time series follow an asymptotic power-law distribution; large fluctuations tend to be followed by large fluctuations, and small fluctuations by small ones; and linear correlations decay fast, whereas the correlations associated with the absolute variations exhibit long-range memory. Finally, a Detrended Fluctuation Analysis (DFA) of the generated time series indicates that the GoL spatial macro states described by the time series are neither completely ordered nor random, in a measurable and very interesting way.

  14. Time series analysis of the developed financial markets' integration using visibility graphs

    NASA Astrophysics Data System (ADS)

    Zhuang, Enyu; Small, Michael; Feng, Gang

    2014-09-01

    A time series representing the developed financial markets' segmentation from 1973 to 2012 is studied. The time series reveals an obvious market integration trend. To further uncover the features of this time series, we divide it into seven windows and generate seven visibility graphs. The measuring capabilities of the visibility graphs provide means to quantitatively analyze the original time series. It is found that the important historical incidents that influenced market integration coincide with variations in the measured graphical node degree. Through the measure of neighborhood span, the frequencies of the historical incidents are disclosed. Moreover, it is also found that large "cycles" and significant noise in the time series are linked to large and small communities in the generated visibility graphs. For large cycles, how historical incidents significantly affected market integration is distinguished by density and compactness of the corresponding communities.

  15. Recurrence plots revisited

    NASA Astrophysics Data System (ADS)

    Casdagli, M. C.

    1997-09-01

    We show that recurrence plots (RPs) give detailed characterizations of time series generated by dynamical systems driven by slowly varying external forces. For deterministic systems we show that RPs of the time series can be used to reconstruct the RP of the driving force if it varies sufficiently slowly. If the driving force is one-dimensional, its functional form can then be inferred up to an invertible coordinate transformation. The same results hold for stochastic systems if the RP of the time series is suitably averaged and transformed. These results are used to investigate the nonlinear prediction of time series generated by dynamical systems driven by slowly varying external forces. We also consider the problem of detecting a small change in the driving force, and propose a surrogate data technique for assessing statistical significance. Numerically simulated time series and a time series of respiration rates recorded from a subject with sleep apnea are used as illustrative examples.
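
    A recurrence plot is straightforward to compute from a scalar series; the sketch below uses a delay embedding and a fixed Euclidean distance threshold, with embedding parameters and the test signal chosen only for illustration.

      import numpy as np

      def recurrence_plot(x, dim=2, tau=25, eps=0.3):
          # Binary recurrence matrix: R[i, j] = 1 when the delay-embedded states
          # i and j are closer than `eps` (Euclidean distance).
          n = len(x) - (dim - 1) * tau
          emb = np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])
          dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
          return (dists < eps).astype(np.uint8)

      t = np.linspace(0, 20 * np.pi, 2000)
      x = np.sin(t) + 0.05 * np.random.default_rng(2).standard_normal(t.size)
      R = recurrence_plot(x)
      print(R.shape, R.mean())   # matrix size and recurrence rate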

  16. A Comparison of Forecast Error Generators for Modeling Wind and Load Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Ning; Diao, Ruisheng; Hafen, Ryan P.

    2013-12-18

    This paper presents four algorithms to generate random forecast error time series, including a truncated-normal distribution model, a state-space based Markov model, a seasonal autoregressive moving average (ARMA) model, and a stochastic-optimization based model. The error time series are used to create real-time (RT), hour-ahead (HA), and day-ahead (DA) wind and load forecast time series that statistically match historically observed forecasting data sets, used for variable generation integration studies. A comparison is made using historical DA load forecast and actual load values to generate new sets of DA forecasts with similar statistical forecast error characteristics. This paper discusses and compares the capabilities of each algorithm to preserve the characteristics of the historical forecast data sets.

  17. Koopman Operator Framework for Time Series Modeling and Analysis

    NASA Astrophysics Data System (ADS)

    Surana, Amit

    2018-01-01

    We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties without requiring the explicit knowledge of the generative model. We also introduce different notions of distances on the space of such model forms which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with distance in conjunction with classical machine learning techniques to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in power grid application.

  18. A statistical approach for generating synthetic tip stress data from limited CPT soundings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Basalams, M.K.

    CPT tip stress data obtained from a Uranium mill tailings impoundment are treated as time series. A statistical class of models that was developed to model time series is explored to investigate its applicability in modeling the tip stress series. These models were developed by Box and Jenkins (1970) and are known as Autoregressive Moving Average (ARMA) models. This research demonstrates how to apply the ARMA models to tip stress series. Generation of synthetic tip stress series that preserve the main statistical characteristics of the measured series is also investigated. Multiple regression analysis is used to model the regional variation of the ARMA model parameters as well as the regional variation of the mean and the standard deviation of the measured tip stress series. The reliability of the generated series is investigated from a geotechnical point of view as well as from a statistical point of view. Estimation of the total settlement using the measured and the generated series subjected to the same loading condition is performed. The variation of friction angle with depth of the impoundment materials is also investigated. This research shows that these series can be modeled by the Box and Jenkins ARMA models. A third degree Autoregressive model AR(3) is selected to represent these series. A theoretical double exponential density function is fitted to the AR(3) model residuals. Synthetic tip stress series are generated at nearby locations. The generated series are shown to be reliable in estimating the total settlement and the friction angle variation with depth for this particular site.
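
    The AR(3) modelling step can be sketched as follows: fit the coefficients by ordinary least squares and generate a synthetic series from them, here with Laplace (double-exponential) innovations as a stand-in for the fitted residual density. The toy input series, the helper names and the innovation scaling are assumptions.

      import numpy as np

      def fit_ar3(x):
          # Least-squares fit of x_t = c + a1*x_{t-1} + a2*x_{t-2} + a3*x_{t-3} + e_t.
          X = np.column_stack([np.ones(len(x) - 3), x[2:-1], x[1:-2], x[:-3]])
          y = x[3:]
          coef, *_ = np.linalg.lstsq(X, y, rcond=None)
          return coef, y - X @ coef

      def generate_ar3(coef, resid_scale, n, x_init, rng):
          # Synthetic series from the fitted AR(3) with double-exponential innovations.
          out = list(x_init)
          for _ in range(n - 3):
              e = rng.laplace(0.0, resid_scale)
              out.append(coef[0] + coef[1] * out[-1] + coef[2] * out[-2]
                         + coef[3] * out[-3] + e)
          return np.asarray(out)

      rng = np.random.default_rng(3)
      measured = 5.0 + 0.1 * np.cumsum(rng.standard_normal(500))   # toy tip-stress profile
      coef, resid = fit_ar3(measured)
      synthetic = generate_ar3(coef, resid.std() / np.sqrt(2.0), 500, measured[:3], rng)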

  19. Simulation of time series by distorted Gaussian processes

    NASA Technical Reports Server (NTRS)

    Greenhall, C. A.

    1977-01-01

    A distorted stationary Gaussian process can be used to provide computer-generated imitations of experimental time series. A method of analyzing a source time series and synthesizing an imitation is shown, and an example using X-band radiometer data is given.
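
    The idea can be sketched in a few lines: synthesize a stationary Gaussian series with some target autocorrelation (an AR(1) model is assumed here) and then apply a static, monotone distortion that maps its Gaussian marginal onto the empirical marginal of the source data. The toy source data and function name are assumptions.

      import numpy as np

      def distorted_gaussian(source, n, phi=0.9, seed=0):
          # (1) Correlated Gaussian series: AR(1) with coefficient `phi` (assumed model);
          # (2) static distortion: map the Gaussian ranks onto the empirical quantiles
          #     of `source`, so the imitation shares its marginal distribution.
          rng = np.random.default_rng(seed)
          g = np.empty(n)
          g[0] = rng.standard_normal()
          for t in range(1, n):
              g[t] = phi * g[t - 1] + np.sqrt(1.0 - phi ** 2) * rng.standard_normal()
          ranks = g.argsort().argsort() / (n - 1)
          return np.quantile(source, ranks)

      source = np.random.default_rng(4).gamma(2.0, 1.5, size=5000)   # toy skewed radiometer-like data
      imitation = distorted_gaussian(source, 2000)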

  20. A Locally Optimal Algorithm for Estimating a Generating Partition from an Observed Time Series and Its Application to Anomaly Detection.

    PubMed

    Ghalyan, Najah F; Miller, David J; Ray, Asok

    2018-06-12

    Estimation of a generating partition is critical for symbolization of measurements from discrete-time dynamical systems, where a sequence of symbols from a (finite-cardinality) alphabet may uniquely specify the underlying time series. Such symbolization is useful for computing measures (e.g., Kolmogorov-Sinai entropy) to identify or characterize the (possibly unknown) dynamical system. It is also useful for time series classification and anomaly detection. The seminal work of Hirata, Judd, and Kilminster (2004) derives a novel objective function, akin to a clustering objective, that measures the discrepancy between a set of reconstruction values and the points from the time series. They cast estimation of a generating partition via the minimization of their objective function. Unfortunately, their proposed algorithm is nonconvergent, with no guarantee of finding even locally optimal solutions with respect to their objective. The difficulty is a heuristic-nearest neighbor symbol assignment step. Alternatively, we develop a novel, locally optimal algorithm for their objective. We apply iterative nearest-neighbor symbol assignments with guaranteed discrepancy descent, by which joint, locally optimal symbolization of the entire time series is achieved. While most previous approaches frame generating partition estimation as a state-space partitioning problem, we recognize that minimizing the Hirata et al. (2004) objective function does not induce an explicit partitioning of the state space, but rather the space consisting of the entire time series (effectively, clustering in a (countably) infinite-dimensional space). Our approach also amounts to a novel type of sliding block lossy source coding. Improvement, with respect to several measures, is demonstrated over popular methods for symbolizing chaotic maps. We also apply our approach to time-series anomaly detection, considering both chaotic maps and failure application in a polycrystalline alloy material.

  1. Long-range correlations in time series generated by time-fractional diffusion: A numerical study

    NASA Astrophysics Data System (ADS)

    Barbieri, Davide; Vivoli, Alessandro

    2005-09-01

    Time series models showing power law tails in autocorrelation functions are common in econometrics. A special non-Markovian model for such kind of time series is provided by the random walk introduced by Gorenflo et al. as a discretization of time fractional diffusion. The time series so obtained are analyzed here from a numerical point of view in terms of autocorrelations and covariance matrices.

  2. Time-series-based hybrid mathematical modelling method adapted to forecast automotive and medical waste generation: Case study of Lithuania.

    PubMed

    Karpušenkaitė, Aistė; Ruzgas, Tomas; Denafas, Gintaras

    2018-05-01

    The aim of the study was to create a hybrid forecasting method that could produce higher accuracy forecasts than previously used 'pure' time series methods. The mentioned methods were already tested with total automotive waste, hazardous automotive waste, and total medical waste generation, but demonstrated at least a 6% error rate in different cases, and efforts were made to decrease it even further. The newly developed hybrid models used a random start generation method to incorporate different time-series advantages, which helped to increase the accuracy of forecasts by 3%-4% in the hazardous automotive waste and total medical waste generation cases; the new model did not increase the accuracy of total automotive waste generation forecasts. The developed models' abilities to produce short- and mid-term forecasts were tested using the prediction horizon.

  3. Association mining of dependency between time series

    NASA Astrophysics Data System (ADS)

    Hafez, Alaaeldin

    2001-03-01

    Time series analysis is considered as a crucial component of strategic control over a broad variety of disciplines in business, science and engineering. Time series data is a sequence of observations collected over intervals of time. Each time series describes a phenomenon as a function of time. Analysis on time series data includes discovering trends (or patterns) in a time series sequence. In the last few years, data mining has emerged and been recognized as a new technology for data analysis. Data Mining is the process of discovering potentially valuable patterns, associations, trends, sequences and dependencies in data. Data mining techniques can discover information that many traditional business analysis and statistical techniques fail to deliver. In this paper, we adapt and innovate data mining techniques to analyze time series data. By using data mining techniques, maximal frequent patterns are discovered and used in predicting future sequences or trends, where trends describe the behavior of a sequence. In order to include different types of time series (e.g. irregular and non-systematic), we consider past frequent patterns of the same time sequences (local patterns) and of other dependent time sequences (global patterns). We use the word 'dependent' instead of the word 'similar' to emphasize real-life time series, where two time series sequences could be completely different (in values, shapes, etc.) but still react to the same conditions in a dependent way. In this paper, we propose the Dependence Mining Technique that could be used in predicting time series sequences. The proposed technique consists of three phases: (a) for all time series sequences, generate their trend sequences; (b) discover maximal frequent trend patterns and generate pattern vectors (to keep information about frequent trend patterns); and (c) use the trend pattern vectors to predict future time series sequences.

  4. Riemannian multi-manifold modeling and clustering in brain networks

    NASA Astrophysics Data System (ADS)

    Slavakis, Konstantinos; Salsabilian, Shiva; Wack, David S.; Muldoon, Sarah F.; Baidoo-Williams, Henry E.; Vettel, Jean M.; Cieslak, Matthew; Grafton, Scott T.

    2017-08-01

    This paper introduces Riemannian multi-manifold modeling in the context of brain-network analytics: Brain-network time series yield features which are modeled as points lying in or close to a union of a finite number of submanifolds within a known Riemannian manifold. Distinguishing disparate time series amounts thus to clustering multiple Riemannian submanifolds. To this end, two feature-generation schemes for brain-network time series are put forth. The first one is motivated by Granger-causality arguments and uses an auto-regressive moving average model to map low-rank linear vector subspaces, spanned by column vectors of appropriately defined observability matrices, to points into the Grassmann manifold. The second one utilizes (non-linear) dependencies among network nodes by introducing kernel-based partial correlations to generate points in the manifold of positive-definite matrices. Based on recently developed research on clustering Riemannian submanifolds, an algorithm is provided for distinguishing time series based on their Riemannian-geometry properties. Numerical tests on time series, synthetically generated from real brain-network structural connectivity matrices, reveal that the proposed scheme outperforms classical and state-of-the-art techniques in clustering brain-network states/structures.

  5. Evaluating the temporal stability of synthetically generated time-series for crop types in central Germany

    USDA-ARS?s Scientific Manuscript database

    Synthetically generated Landsat time-series based on the STARFM algorithm are increasingly used for applications in forestry or agriculture. Although successes in classification and derivation of phenological orbiomass parameters are evident, a thorough evaluation of the limits of the method is stil...

  6. Analysis and generation of groundwater concentration time series

    NASA Astrophysics Data System (ADS)

    Crăciun, Maria; Vamoş, Călin; Suciu, Nicolae

    2018-01-01

    Concentration time series are provided by simulated concentrations of a nonreactive solute transported in groundwater, integrated over the transverse direction of a two-dimensional computational domain and recorded at the plume center of mass. The analysis of a statistical ensemble of time series reveals subtle features that are not captured by the first two moments which characterize the approximate Gaussian distribution of the two-dimensional concentration fields. The concentration time series exhibit a complex preasymptotic behavior driven by a nonstationary trend and correlated fluctuations with time-variable amplitude. Time series with almost the same statistics are generated by successively adding to a time-dependent trend a sum of linear regression terms, accounting for correlations between fluctuations around the trend and their increments in time, and terms of an amplitude modulated autoregressive noise of order one with time-varying parameter. The algorithm generalizes mixing models used in probability density function approaches. The well-known interaction by exchange with the mean mixing model is a special case consisting of a linear regression with constant coefficients.

  7. A cluster merging method for time series microarray with production values.

    PubMed

    Chira, Camelia; Sedano, Javier; Camara, Monica; Prieto, Carlos; Villar, Jose R; Corchado, Emilio

    2014-09-01

    A challenging task in time-course microarray data analysis is to cluster genes meaningfully combining the information provided by multiple replicates covering the same key time points. This paper proposes a novel cluster merging method to accomplish this goal obtaining groups with highly correlated genes. The main idea behind the proposed method is to generate a clustering starting from groups created based on individual temporal series (representing different biological replicates measured in the same time points) and merging them by taking into account the frequency by which two genes are assembled together in each clustering. The gene groups at the level of individual time series are generated using several shape-based clustering methods. This study is focused on a real-world time series microarray task with the aim to find co-expressed genes related to the production and growth of a certain bacteria. The shape-based clustering methods used at the level of individual time series rely on identifying similar gene expression patterns over time which, in some models, are further matched to the pattern of production/growth. The proposed cluster merging method is able to produce meaningful gene groups which can be naturally ranked by the level of agreement on the clustering among individual time series. The list of clusters and genes is further sorted based on the information correlation coefficient and new problem-specific relevant measures. Computational experiments and results of the cluster merging method are analyzed from a biological perspective and further compared with the clustering generated based on the mean value of time series and the same shape-based algorithm.

  8. A coupled stochastic rainfall-evapotranspiration model for hydrological impact analysis

    NASA Astrophysics Data System (ADS)

    Pham, Minh Tu; Vernieuwe, Hilde; De Baets, Bernard; Verhoest, Niko E. C.

    2018-02-01

    A hydrological impact analysis concerns the study of the consequences of certain scenarios on one or more variables or fluxes in the hydrological cycle. In such an exercise, discharge is often considered, as floods originating from extremely high discharges often cause damage. Investigating the impact of extreme discharges generally requires long time series of precipitation and evapotranspiration to be used to force a rainfall-runoff model. However, such kinds of data may not be available and one should resort to stochastically generated time series, even though the impact of using such data on the overall discharge, and especially on the extreme discharge events, is not well studied. In this paper, stochastically generated rainfall and corresponding evapotranspiration time series, generated by means of vine copulas, are used to force a simple conceptual hydrological model. The results obtained are comparable to the modelled discharge using observed forcing data. Yet, uncertainties in the modelled discharge increase with an increasing number of stochastically generated time series used. Notwithstanding this finding, it can be concluded that using a coupled stochastic rainfall-evapotranspiration model has great potential for hydrological impact analysis.

  9. Time lagged ordinal partition networks for capturing dynamics of continuous dynamical systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCullough, Michael; Iu, Herbert Ho-Ching; Small, Michael

    2015-05-15

    We investigate a generalised version of the recently proposed ordinal partition time series to network transformation algorithm. First, we introduce a fixed time lag for the elements of each partition that is selected using techniques from traditional time delay embedding. The resulting partitions define regions in the embedding phase space that are mapped to nodes in the network space. Edges are allocated between nodes based on temporal succession thus creating a Markov chain representation of the time series. We then apply this new transformation algorithm to time series generated by the Rössler system and find that periodic dynamics translate to ring structures whereas chaotic time series translate to band or tube-like structures—thereby indicating that our algorithm generates networks whose structure is sensitive to system dynamics. Furthermore, we demonstrate that simple network measures including the mean out degree and variance of out degrees can track changes in the dynamical behaviour in a manner comparable to the largest Lyapunov exponent. We also apply the same analysis to experimental time series generated by a diode resonator circuit and show that the network size, mean shortest path length, and network diameter are highly sensitive to the interior crisis captured in this particular data set.
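
    The transformation itself is compact to sketch: delay-embed the series, replace each embedded vector by its ordinal (permutation) pattern, take the patterns as nodes, and connect temporally successive patterns with directed edges. The lag, embedding dimension and test signal below are illustrative rather than the values selected by the embedding procedure in the paper.

      import numpy as np
      from collections import defaultdict

      def ordinal_partition_network(x, dim=4, lag=8):
          # Nodes are ordinal (permutation) patterns of delay-embedded vectors;
          # a directed edge a -> b records that pattern b follows pattern a in time.
          n = len(x) - (dim - 1) * lag
          emb = np.column_stack([x[i * lag: i * lag + n] for i in range(dim)])
          symbols = [tuple(np.argsort(row)) for row in emb]
          edges = defaultdict(set)
          for a, b in zip(symbols[:-1], symbols[1:]):
              if a != b:
                  edges[a].add(b)
          out_deg = np.array([len(v) for v in edges.values()])
          return edges, out_deg.mean(), out_deg.var()

      t = np.arange(0, 200, 0.05)
      x = np.sin(t) + 0.5 * np.sin(2.1 * t)          # toy quasi-periodic signal
      _, mean_out_degree, var_out_degree = ordinal_partition_network(x)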

  10. An approach to constructing a homogeneous time series of soil moisture using SMOS

    USDA-ARS?s Scientific Manuscript database

    Overlapping soil moisture time series derived from two satellite microwave radiometers (SMOS, Soil Moisture and Ocean Salinity; AMSR-E, Advanced Microwave Scanning Radiometer - Earth Observing System) are used to generate a soil moisture time series from 2003 to 2010. Two statistical methodologies f...

  11. Atmospheric turbulence simulation for Shuttle orbiter

    NASA Technical Reports Server (NTRS)

    Tatom, F. B.; Smith, S. R.

    1979-01-01

    An improved non-recursive model for atmospheric turbulence along the flight path of the Shuttle Orbiter is developed which provides for simulation of instantaneous vertical and horizontal gusts at the vehicle center-of-gravity, and also for simulation of instantaneous gust gradients. Based on this model the time series for both gusts and gust gradients are generated and stored on a series of magnetic tapes. Section 2 provides a description of the various technical considerations associated with the turbulence simulation model. Included in this section are descriptions of the digital filter simulation model, the von Karman spectra with finite upper limits, and the final non-recursive turbulence simulation model which was used to generate the time series. Section 3 provides a description of the time series as currently recorded on magnetic tape. Conclusions and recommendations are presented in Section 4.

  12. Interactive Digital Signal Processor

    NASA Technical Reports Server (NTRS)

    Mish, W. H.

    1985-01-01

    The Interactive Digital Signal Processor, IDSP, consists of a set of time series analysis "operators" based on various algorithms commonly used for digital signal analysis. Processing of digital signal time series to extract information is usually achieved by the application of a number of fairly standard operations. IDSP is an excellent teaching tool for demonstrating the application of time series operators to artificially generated signals.

  13. On Digital Simulation of Multicorrelated Random Processes and Its Applications. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Sinha, A. K.

    1973-01-01

    Two methods are described to simulate, on a digital computer, a set of correlated, stationary, and Gaussian time series with zero mean from the given matrix of power spectral densities and cross spectral densities. The first method is based upon trigonometric series with random amplitudes and deterministic phase angles. The random amplitudes are generated by using a standard random number generator subroutine. An example is given which corresponds to three components of wind velocities at two different spatial locations for a total of six correlated time series. In the second method, the whole process is carried out using the Fast Fourier Transform approach. This method gives more accurate results and works about twenty times faster for a set of six correlated time series.
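
    The FFT-based method can be sketched for a small example: at each frequency, factor the (Hermitian) cross-spectral matrix, color complex Gaussian noise with it, and inverse-transform each channel. Normalization constants are omitted, and the toy 2x2 cross-spectrum and function name below are assumptions made only to keep the sketch short.

      import numpy as np

      def simulate_from_cross_spectrum(S, n, seed=0):
          # S has shape (n_freq, m, m): one Hermitian positive semi-definite
          # cross-spectral matrix per non-negative frequency. A Cholesky factor
          # colors complex Gaussian noise, and an inverse real FFT per channel
          # yields m correlated, zero-mean Gaussian time series.
          rng = np.random.default_rng(seed)
          n_freq, m, _ = S.shape
          spec = np.zeros((n_freq, m), dtype=complex)
          for k in range(1, n_freq):                      # skip the DC component
              L = np.linalg.cholesky(S[k] + 1e-12 * np.eye(m))
              z = (rng.standard_normal(m) + 1j * rng.standard_normal(m)) / np.sqrt(2.0)
              spec[k] = L @ z
          return np.fft.irfft(spec, n, axis=0)

      n = 1024
      freqs = np.fft.rfftfreq(n)
      psd = np.exp(-((freqs - 0.1) / 0.05) ** 2) + 1e-6   # toy band-limited spectrum
      S = np.empty((len(freqs), 2, 2), dtype=complex)
      S[:, 0, 0] = S[:, 1, 1] = psd
      S[:, 0, 1] = S[:, 1, 0] = 0.8 * psd                 # coherence between the two series
      series = simulate_from_cross_spectrum(S, n)         # shape (1024, 2)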

  14. Approaches in highly parameterized inversion: TSPROC, a general time-series processor to assist in model calibration and result summarization

    USGS Publications Warehouse

    Westenbroek, Stephen M.; Doherty, John; Walker, John F.; Kelson, Victor A.; Hunt, Randall J.; Cera, Timothy B.

    2012-01-01

    The TSPROC (Time Series PROCessor) computer software uses a simple scripting language to process and analyze time series. It was developed primarily to assist in the calibration of environmental models. The software is designed to perform calculations on time-series data commonly associated with surface-water models, including calculation of flow volumes, transformation by means of basic arithmetic operations, and generation of seasonal and annual statistics and hydrologic indices. TSPROC can also be used to generate some of the key input files required to perform parameter optimization by means of the PEST (Parameter ESTimation) computer software. Through the use of TSPROC, the objective function for use in the model-calibration process can be focused on specific components of a hydrograph.

  15. Use of a Principal Components Analysis for the Generation of Daily Time Series.

    NASA Astrophysics Data System (ADS)

    Dreveton, Christine; Guillou, Yann

    2004-07-01

    A new approach for generating daily time series is considered in response to the weather-derivatives market. This approach consists of performing a principal components analysis to create independent variables, the values of which are then generated separately with a random process. Weather derivatives are financial or insurance products that give companies the opportunity to cover themselves against adverse climate conditions. The aim of a generator is to provide a wider range of feasible situations to be used in an assessment of risk. Generation of a temperature time series is required by insurers or bankers for pricing weather options. The provision of conditional probabilities and a good representation of the interannual variance are the main challenges of a generator when used for weather derivatives. The generator was developed according to this new approach using a principal components analysis and was applied to the daily average temperature time series of the Paris-Montsouris station in France. The observed dataset was homogenized and the trend was removed to represent correctly the present climate. The results obtained with the generator show that it represents correctly the interannual variance of the observed climate; this is the main result of the work, because one of the main discrepancies of other generators is their inability to represent accurately the observed interannual climate variance—this discrepancy is not acceptable for an application to weather derivatives. The generator was also tested to calculate conditional probabilities: for example, the knowledge of the aggregated value of heating degree-days in the middle of the heating season allows one to estimate the probability of reaching a threshold at the end of the heating season. This represents the main application of a climate generator for use with weather derivatives.
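
    The generation step of this approach can be sketched as follows: arrange the detrended daily values as a years-by-days matrix, decorrelate the days with a principal components analysis, draw each component score independently from a Gaussian with the observed score variance (the random process assumed here), and transform back. The toy temperature matrix and helper name are illustrative only.

      import numpy as np

      def pca_daily_generator(daily_matrix, n_years_out, seed=0):
          # `daily_matrix` holds one year of daily values per row (years x 365).
          # PCA (eigendecomposition of the day-by-day covariance) gives independent
          # components; scores are regenerated independently and years are rebuilt.
          rng = np.random.default_rng(seed)
          mean = daily_matrix.mean(axis=0)
          cov = np.cov(daily_matrix - mean, rowvar=False)
          eigval, eigvec = np.linalg.eigh(cov)
          eigval = np.clip(eigval, 0.0, None)
          scores = rng.standard_normal((n_years_out, len(eigval))) * np.sqrt(eigval)
          return scores @ eigvec.T + mean

      doy = np.arange(365)
      seasonal = 12.0 + 8.0 * np.sin(2 * np.pi * (doy - 110) / 365.0)
      obs = seasonal + 3.0 * np.random.default_rng(5).standard_normal((30, 365))  # toy 30-year record
      synthetic_years = pca_daily_generator(obs, n_years_out=100)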


  16. Multifractal analysis of time series generated by discrete Ito equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Telesca, Luciano; Czechowski, Zbigniew; Lovallo, Michele

    2015-06-15

    In this study, we show that discrete Ito equations with short-tail Gaussian marginal distribution function generate multifractal time series. The multifractality is due to the nonlinear correlations, which are hidden in Markov processes and are generated by the interrelation between the drift and the multiplicative stochastic forces in the Ito equation. A link between the range of the generalized Hurst exponents and the mean of the squares of all averaged net forces is suggested.
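
    A discrete Ito iteration of the kind analyzed above is easy to sketch with an Euler-Maruyama step; the mean-reverting drift and the power-law exponent of the multiplicative noise amplitude used below are illustrative choices, not those of the study.

      import numpy as np

      def discrete_ito(n, dt=0.01, theta=1.0, alpha=0.5, x0=1.0, seed=0):
          # Euler-Maruyama: x_{k+1} = x_k + a(x_k)*dt + b(x_k)*sqrt(dt)*xi_k with
          # drift a(x) = -theta*(x - 1) and multiplicative noise b(x) = |x|**alpha.
          rng = np.random.default_rng(seed)
          x = np.empty(n)
          x[0] = x0
          for k in range(n - 1):
              drift = -theta * (x[k] - 1.0)
              noise_amp = np.abs(x[k]) ** alpha
              x[k + 1] = x[k] + drift * dt + noise_amp * np.sqrt(dt) * rng.standard_normal()
          return x

      series = discrete_ito(20000)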

  17. Magnitude and sign of long-range correlated time series: Decomposition and surrogate signal generation.

    PubMed

    Gómez-Extremera, Manuel; Carpena, Pedro; Ivanov, Plamen Ch; Bernaola-Galván, Pedro A

    2016-04-01

    We systematically study the scaling properties of the magnitude and sign of the fluctuations in correlated time series, which is a simple and useful approach to distinguish between systems with different dynamical properties but the same linear correlations. First, we decompose artificial long-range power-law linearly correlated time series into magnitude and sign series derived from the consecutive increments in the original series, and we study their correlation properties. We find analytical expressions for the correlation exponent of the sign series as a function of the exponent of the original series. Such expressions are necessary for modeling surrogate time series with desired scaling properties. Next, we study linear and nonlinear correlation properties of series composed as products of independent magnitude and sign series. These surrogate series can be considered as a zero-order approximation to the analysis of the coupling of magnitude and sign in real data, a problem still open in many fields. We find analytical results for the scaling behavior of the composed series as a function of the correlation exponents of the magnitude and sign series used in the composition, and we determine the ranges of magnitude and sign correlation exponents leading to either single scaling or to crossover behaviors. Finally, we obtain how the linear and nonlinear properties of the composed series depend on the correlation exponents of their magnitude and sign series. Based on this information we propose a method to generate surrogate series with controlled correlation exponent and multifractal spectrum.
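
    The decomposition step, and the zero-order composition of a surrogate from independent magnitude and sign parts, can be sketched as below; using a random permutation of the original signs as the "independent" sign series is an illustrative shortcut, not the paper's generation scheme.

      import numpy as np

      def magnitude_sign_decomposition(u):
          # Split the consecutive increments of a series into magnitude and sign sub-series.
          du = np.diff(u)
          return np.abs(du), np.sign(du)

      def compose_surrogate(magnitude, sign, rng):
          # Zero-order surrogate: product of the magnitude series with an independent
          # sign series (here the original signs randomly permuted), re-integrated.
          independent_sign = rng.permutation(sign)
          return np.concatenate([[0.0], np.cumsum(magnitude * independent_sign)])

      rng = np.random.default_rng(6)
      u = np.cumsum(rng.standard_normal(4096))          # toy long-range series
      mag, sgn = magnitude_sign_decomposition(u)
      surrogate = compose_surrogate(mag, sgn, rng)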

  18. Modeling Time Series Data for Supervised Learning

    ERIC Educational Resources Information Center

    Baydogan, Mustafa Gokce

    2012-01-01

    Temporal data are increasingly prevalent and important in analytics. Time series (TS) data are chronological sequences of observations and an important class of temporal data. Fields such as medicine, finance, learning science and multimedia naturally generate TS data. Each series provides a high-dimensional data vector that challenges the learning…

  19. Use of a Weather Generator for analysis of projections of future daily temperature and its validation with climate change indices

    NASA Astrophysics Data System (ADS)

    Di Piazza, A.; Cordano, E.; Eccel, E.

    2012-04-01

    The issue of climate change detection is considered a major challenge. In particular, high temporal resolution climate change scenarios are required in the evaluation of the effects of climate change on agricultural management (crop suitability, yields, risk assessment, etc.), energy production and water management. In this work, a "Weather Generator" technique was used for downscaling climate change scenarios for temperature. An R package (RMAWGEN, Cordano and Eccel, 2011 - available at http://cran.r-project.org) was developed, aiming to generate synthetic daily weather conditions by using the theory of vectorial auto-regressive models (VAR). The VAR model was chosen for its ability in maintaining the temporal and spatial correlations among variables. In particular, observed time series of daily maximum and minimum temperature are transformed into "new" normally-distributed variable time series which are used to calibrate the parameters of a VAR model by using ordinary least square methods. Therefore the implemented algorithm, applied to monthly mean climatic values downscaled by Global Climate Model predictions, can generate several stochastic daily scenarios where the statistical consistency among series is preserved. Further details are presented in the RMAWGEN documentation. An application is presented here by using a dataset with daily temperature time series recorded in 41 different sites of the Trentino region for the period 1958-2010. Temperature time series were pre-processed to fill missing values (by a site-specific calibrated Inverse Distance Weighting algorithm, corrected for elevation) and to remove inhomogeneities. Several climatic indices were taken into account, useful for several impact assessment applications, and their time trends within the time series were analyzed. The indices range from the more classical ones, such as annual mean temperatures, seasonal mean temperatures and their anomalies (from the reference period 1961-1990), to the climate change indices selected from the list recommended by the World Meteorological Organization Commission for Climatology (WMO-CCL) and the Research Programme on Climate Variability and Predictability (CLIVAR) project's Expert Team on Climate Change Detection, Monitoring and Indices (ETCCDMI). Each index was applied to both observed (and processed) data and to synthetic time series produced by the Weather Generator, over the thirty-year reference period 1981-2010, in order to validate the procedure. Climate projections were statistically downscaled for a selection of sites for the two 30-year periods 2021-2050 and 2071-2099 of the European project "Ensembles" multi-model output (scenario A1B). The use of several climatic indices strengthens the trend analysis of both the generated synthetic series and future climate projections.

  20. Testing for nonlinearity in time series: The method of surrogate data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Theiler, J.; Galdrikian, B.; Longtin, A.

    1991-01-01

    We describe a statistical approach for identifying nonlinearity in time series; in particular, we want to avoid claims of chaos when simpler models (such as linearly correlated noise) can explain the data. The method requires a careful statement of the null hypothesis which characterizes a candidate linear process, the generation of an ensemble of "surrogate" data sets which are similar to the original time series but consistent with the null hypothesis, and the computation of a discriminating statistic for the original and for each of the surrogate data sets. The idea is to test the original time series against the null hypothesis by checking whether the discriminating statistic computed for the original time series differs significantly from the statistics computed for each of the surrogate sets. We present algorithms for generating surrogate data under various null hypotheses, and we show the results of numerical experiments on artificial data using correlation dimension, Lyapunov exponent, and forecasting error as discriminating statistics. Finally, we consider a number of experimental time series -- including sunspots, electroencephalogram (EEG) signals, and fluid convection -- and evaluate the statistical significance of the evidence for nonlinear structure in each case. 56 refs., 8 figs.
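
    One widely used surrogate algorithm for the linearly correlated Gaussian null hypothesis is Fourier phase randomization, sketched below together with a simple time-asymmetry statistic; the statistic is only an illustrative stand-in for the correlation dimension, Lyapunov exponent or forecasting error used in the paper, and the toy data are arbitrary.

      import numpy as np

      def phase_randomized_surrogate(x, rng):
          # Same power spectrum (hence same linear correlations) as x,
          # but with randomized Fourier phases.
          spec = np.fft.rfft(x)
          phases = rng.uniform(0, 2 * np.pi, len(spec))
          phases[0] = 0.0                      # keep the mean component untouched
          if len(x) % 2 == 0:
              phases[-1] = 0.0                 # keep the Nyquist component real
          return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), len(x))

      def time_asymmetry(x):
          # Illustrative discriminating statistic, sensitive to nonlinearity.
          d = np.diff(x)
          return np.mean(d ** 3) / np.mean(d ** 2) ** 1.5

      rng = np.random.default_rng(7)
      data = np.cumsum(np.random.default_rng(8).standard_normal(2048))   # toy series
      surrogate_stats = [time_asymmetry(phase_randomized_surrogate(data, rng))
                         for _ in range(99)]
      print(time_asymmetry(data), np.percentile(surrogate_stats, [2.5, 97.5]))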

  1. Modeling long correlation times using additive binary Markov chains: Applications to wind generation time series.

    PubMed

    Weber, Juliane; Zachow, Christopher; Witthaut, Dirk

    2018-03-01

    Wind power generation exhibits a strong temporal variability, which is crucial for system integration in highly renewable power systems. Different methods exist to simulate wind power generation but they often cannot represent the crucial temporal fluctuations properly. We apply the concept of additive binary Markov chains to model a wind generation time series consisting of two states: periods of high and low wind generation. The only input parameter for this model is the empirical autocorrelation function. The two-state model is readily extended to stochastically reproduce the actual generation per period. To evaluate the additive binary Markov chain method, we introduce a coarse model of the electric power system to derive backup and storage needs. We find that the temporal correlations of wind power generation, the backup need as a function of the storage capacity, and the resting time distribution of high and low wind events for different shares of wind generation can be reconstructed.

  2. Modeling long correlation times using additive binary Markov chains: Applications to wind generation time series

    NASA Astrophysics Data System (ADS)

    Weber, Juliane; Zachow, Christopher; Witthaut, Dirk

    2018-03-01

    Wind power generation exhibits a strong temporal variability, which is crucial for system integration in highly renewable power systems. Different methods exist to simulate wind power generation but they often cannot represent the crucial temporal fluctuations properly. We apply the concept of additive binary Markov chains to model a wind generation time series consisting of two states: periods of high and low wind generation. The only input parameter for this model is the empirical autocorrelation function. The two-state model is readily extended to stochastically reproduce the actual generation per period. To evaluate the additive binary Markov chain method, we introduce a coarse model of the electric power system to derive backup and storage needs. We find that the temporal correlations of wind power generation, the backup need as a function of the storage capacity, and the resting time distribution of high and low wind events for different shares of wind generation can be reconstructed.
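
    The two-state model can be sketched as an additive binary Markov chain in which the conditional probability of the high-wind state is a base rate plus a weighted sum of past states. The exponentially decaying memory weights and helper name below are illustrative assumptions; in the paper the memory is derived from the empirical autocorrelation function.

      import numpy as np

      def additive_binary_markov(n, weights, p_base=0.5, seed=0):
          # P(x_t = 1 | history) = p_base + sum_k weights[k] * (x_{t-1-k} - p_base):
          # the conditional probability is an additive function of the past states.
          rng = np.random.default_rng(seed)
          m = len(weights)
          x = np.empty(n, dtype=np.int8)
          x[:m] = rng.integers(0, 2, m)                  # arbitrary initial history
          for t in range(m, n):
              history = x[t - m:t][::-1]                 # x_{t-1}, x_{t-2}, ...
              p = p_base + np.dot(weights, history - p_base)
              x[t] = rng.random() < np.clip(p, 0.0, 1.0)
          return x

      weights = 0.25 * 0.7 ** np.arange(24)              # illustrative memory weights
      states = additive_binary_markov(10000, weights)
      print(states.mean(), np.mean(states[1:] == states[:-1]))   # high-wind share, persistence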

  3. Assessment of a stochastic downscaling methodology in generating an ensemble of hourly future climate time series

    NASA Astrophysics Data System (ADS)

    Fatichi, S.; Ivanov, V. Y.; Caporali, E.

    2013-04-01

    This study extends a stochastic downscaling methodology to generation of an ensemble of hourly time series of meteorological variables that express possible future climate conditions at a point-scale. The stochastic downscaling uses general circulation model (GCM) realizations and an hourly weather generator, the Advanced WEather GENerator (AWE-GEN). Marginal distributions of factors of change are computed for several climate statistics using a Bayesian methodology that can weight GCM realizations based on the model relative performance with respect to a historical climate and a degree of disagreement in projecting future conditions. A Monte Carlo technique is used to sample the factors of change from their respective marginal distributions. As a comparison with traditional approaches, factors of change are also estimated by averaging GCM realizations. With either approach, the derived factors of change are applied to the climate statistics inferred from historical observations to re-evaluate parameters of the weather generator. The re-parameterized generator yields hourly time series of meteorological variables that can be considered to be representative of future climate conditions. In this study, the time series are generated in an ensemble mode to fully reflect the uncertainty of GCM projections, climate stochasticity, as well as uncertainties of the downscaling procedure. Applications of the methodology in reproducing future climate conditions for the periods of 2000-2009, 2046-2065 and 2081-2100, using the period of 1962-1992 as the historical baseline are discussed for the location of Firenze (Italy). The inferences of the methodology for the period of 2000-2009 are tested against observations to assess reliability of the stochastic downscaling procedure in reproducing statistics of meteorological variables at different time scales.

  4. Testing the shape of distributions of weather data

    NASA Astrophysics Data System (ADS)

    Baccon, Ana L. P.; Lunardi, José T.

    2016-08-01

    The characterization of the statistical distributions of observed weather data is of crucial importance both for the construction and for the validation of weather models, such as weather generators (WG's). An important class of WG's (e.g., the Richardson-type generators) reduce the time series of each variable to a time series of its residual elements, and the residuals are often assumed to be normally distributed. In this work we propose an approach to investigate if the shape assumed for the distribution of residuals is consistent or not with the observed data of a given site. Specifically, this procedure tests if the same distribution shape for the residuals noise is maintained along the time. The proposed approach is an adaptation to climate time series of a procedure first introduced to test the shapes of distributions of growth rates of business firms aggregated in large panels of short time series. We illustrate the procedure by applying it to the residuals time series of maximum temperature in a given location, and investigate the empirical consistency of two assumptions, namely i) the most common assumption that the distribution of the residuals is Gaussian and ii) that the residuals noise has a time invariant shape which coincides with the empirical distribution of all the residuals noise of the whole time series pooled together.
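
    A minimal way to confront an assumed residual shape with data is a one-sample goodness-of-fit test on the standardized residuals. The sketch below is not the authors' aggregation procedure (which pools short series in the spirit of the firm-growth test); it simply checks the common Gaussian assumption with a Kolmogorov-Smirnov test on a synthetic, deliberately heavy-tailed stand-in.

    ```python
    import numpy as np
    from scipy import stats

    # Synthetic stand-in for residuals of a Richardson-type weather generator
    rng = np.random.default_rng(0)
    residuals = rng.standard_t(df=5, size=2000)      # heavier-tailed than Gaussian

    z = (residuals - residuals.mean()) / residuals.std(ddof=1)
    stat, p_value = stats.kstest(z, "norm")          # test the assumed Gaussian shape
    print(f"KS statistic = {stat:.3f}, p-value = {p_value:.3g}")
    ```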

  5. 40 CFR 141.602 - System specific studies.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... the storage facility with the highest residence time in each pressure zone, and a time series graph of... (a)(2)(ii) of this section, and a 24-hour time series graph of residence time for each subpart V...-compliance results generated during the time period beginning with the first reported result and ending with...

  6. 40 CFR 141.602 - System specific studies.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the storage facility with the highest residence time in each pressure zone, and a time series graph of... (a)(2)(ii) of this section, and a 24-hour time series graph of residence time for each subpart V... compliance and non-compliance results generated during the time period beginning with the first reported...

  7. 40 CFR 141.602 - System specific studies.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... the storage facility with the highest residence time in each pressure zone, and a time series graph of... (a)(2)(ii) of this section, and a 24-hour time series graph of residence time for each subpart V... compliance and non-compliance results generated during the time period beginning with the first reported...

  8. 40 CFR 141.602 - System specific studies.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... the storage facility with the highest residence time in each pressure zone, and a time series graph of... (a)(2)(ii) of this section, and a 24-hour time series graph of residence time for each subpart V... compliance and non-compliance results generated during the time period beginning with the first reported...

  9. 40 CFR 141.602 - System specific studies.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... the storage facility with the highest residence time in each pressure zone, and a time series graph of... (a)(2)(ii) of this section, and a 24-hour time series graph of residence time for each subpart V... compliance and non-compliance results generated during the time period beginning with the first reported...

  10. Trend time-series modeling and forecasting with neural networks.

    PubMed

    Qi, Min; Zhang, G Peter

    2008-05-01

    Despite its great importance, there has been no general consensus on how to model the trends in time-series data. Compared to traditional approaches, neural networks (NNs) have shown some promise in time-series forecasting. This paper investigates how to best model trend time series using NNs. Four different strategies (raw data, raw data with time index, detrending, and differencing) are used to model various trend patterns (linear, nonlinear, deterministic, stochastic, and breaking trend). We find that with NNs differencing often gives meritorious results regardless of the underlying data generating processes (DGPs). This finding is also confirmed by the real gross national product (GNP) series.
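
    The differencing strategy the paper favors is easy to mirror in a generic pipeline: difference the series before fitting any forecaster, then cumulatively sum the forecasts back onto the last observed level. The snippet below is a neutral sketch with a placeholder forecaster, not the paper's neural network setup.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    y = np.cumsum(rng.normal(0.5, 1.0, 300))   # series with a stochastic trend

    dy = np.diff(y)                            # stationary increments fed to the model
    dy_hat = np.full(10, dy.mean())            # placeholder for a 10-step NN forecast
    y_hat = y[-1] + np.cumsum(dy_hat)          # invert the differencing
    print(y_hat)
    ```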

  11. Visibility Graph Based Time Series Analysis.

    PubMed

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network-based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states, and successively occurring states are linked. This procedure converts a time series into a temporal network and, at the same time, a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.
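
    A natural visibility graph links two samples whenever the straight line between them stays above every sample in between. The reference implementation below (quadratic in the segment length, for illustration only) is one common way to build the per-segment graphs this kind of method relies on.

    ```python
    import numpy as np

    def visibility_graph(x):
        """Natural visibility graph: nodes i and j are linked if every sample
        between them lies below the straight line joining (i, x[i]) and (j, x[j]).
        O(n^2) reference implementation for short segments."""
        n = len(x)
        edges = set()
        for i in range(n):
            for j in range(i + 1, n):
                visible = all(
                    x[k] < x[j] + (x[i] - x[j]) * (j - k) / (j - i)
                    for k in range(i + 1, j)
                )
                if visible:
                    edges.add((i, j))
        return edges

    segment = np.random.default_rng(2).normal(size=50)
    print(len(visibility_graph(segment)), "edges")
    ```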

  12. Modeling climate change impacts on combined sewer overflow using synthetic precipitation time series.

    PubMed

    Bendel, David; Beck, Ferdinand; Dittmer, Ulrich

    2013-01-01

    In the presented study climate change impacts on combined sewer overflows (CSOs) in Baden-Wuerttemberg, Southern Germany, were assessed based on continuous long-term rainfall-runoff simulations. As input data, synthetic rainfall time series were used. The applied precipitation generator NiedSim-Klima accounts for climate change effects on precipitation patterns. Time series for the past (1961-1990) and future (2041-2050) were generated for various locations. Comparing the simulated CSO activity of both periods we observe significantly higher overflow frequencies for the future. Changes in overflow volume and overflow duration depend on the type of overflow structure. Both values will increase at simple CSO structures that merely divide the flow, whereas they will decrease when the CSO structure is combined with a storage tank. However, there is a wide variation between the results of different precipitation time series (representative for different locations).

  13. The Global Streamflow Indices and Metadata Archive (GSIM) - Part 2: Quality control, time-series indices and homogeneity assessment

    NASA Astrophysics Data System (ADS)

    Gudmundsson, Lukas; Do, Hong Xuan; Leonard, Michael; Westra, Seth

    2018-04-01

    This is Part 2 of a two-paper series presenting the Global Streamflow Indices and Metadata Archive (GSIM), which is a collection of daily streamflow observations at more than 30 000 stations around the world. While Part 1 (Do et al., 2018a) describes the data collection process as well as the generation of auxiliary catchment data (e.g. catchment boundary, land cover, mean climate), Part 2 introduces a set of quality controlled time-series indices representing (i) the water balance, (ii) the seasonal cycle, (iii) low flows and (iv) floods. To this end we first consider the quality of individual daily records using a combination of quality flags from data providers and automated screening methods. Subsequently, streamflow time-series indices are computed for yearly, seasonal and monthly resolution. The paper provides a generalized assessment of the homogeneity of all generated streamflow time-series indices, which can be used to select time series that are suitable for a specific task. The newly generated global set of streamflow time-series indices is made freely available with an digital object identifier at https://doi.pangaea.de/10.1594/PANGAEA.887470 and is expected to foster global freshwater research, by acting as a ground truth for model validation or as a basis for assessing the role of human impacts on the terrestrial water cycle. It is hoped that a renewed interest in streamflow data at the global scale will foster efforts in the systematic assessment of data quality and provide momentum to overcome administrative barriers that lead to inconsistencies in global collections of relevant hydrological observations.

  14. Alpine Grassland Phenology as Seen in AVHRR, VEGETATION, and MODIS NDVI Time Series - a Comparison with In Situ Measurements

    PubMed Central

    Fontana, Fabio; Rixen, Christian; Jonas, Tobias; Aberegg, Gabriel; Wunderle, Stefan

    2008-01-01

    This study evaluates the ability to track grassland growth phenology in the Swiss Alps with NOAA-16 Advanced Very High Resolution Radiometer (AVHRR) Normalized Difference Vegetation Index (NDVI) time series. Three growth parameters from 15 alpine and subalpine grassland sites were investigated between 2001 and 2005: Melt-Out (MO), Start Of Growth (SOG), and End Of Growth (EOG). We tried to estimate these phenological dates from yearly NDVI time series by identifying the dates where certain fractions (thresholds) of the maximum annual NDVI amplitude were crossed for the first time. For this purpose, the NDVI time series were smoothed using two commonly used approaches (Fourier adjustment or, alternatively, Savitzky-Golay filtering). Moreover, AVHRR NDVI time series were compared against data from the newer generation sensors SPOT VEGETATION and TERRA MODIS. All remote sensing NDVI time series were highly correlated with single-point ground measurements and therefore accurately represented growth dynamics of alpine grassland. The newer generation sensors VGT and MODIS performed better than AVHRR; however, differences were minor. Thresholds for the determination of MO, SOG, and EOG were similar across sensors and smoothing methods, which demonstrated the robustness of the results. For our purpose, the Fourier adjustment algorithm created better NDVI time series than the Savitzky-Golay filter, since the latter appeared to be more sensitive to noisy NDVI time series. Findings show that the application of various thresholds to NDVI time series allows the observation of the temporal progression of vegetation growth at the selected sites with high consistency. Hence, we believe that our study helps to better understand large-scale vegetation growth dynamics above the tree line in the European Alps. PMID:27879852

  15. Alpine Grassland Phenology as Seen in AVHRR, VEGETATION, and MODIS NDVI Time Series - a Comparison with In Situ Measurements.

    PubMed

    Fontana, Fabio; Rixen, Christian; Jonas, Tobias; Aberegg, Gabriel; Wunderle, Stefan

    2008-04-23

    This study evaluates the ability to track grassland growth phenology in the Swiss Alps with NOAA-16 Advanced Very High Resolution Radiometer (AVHRR) Normalized Difference Vegetation Index (NDVI) time series. Three growth parameters from 15 alpine and subalpine grassland sites were investigated between 2001 and 2005: Melt-Out (MO), Start Of Growth (SOG), and End Of Growth (EOG). We tried to estimate these phenological dates from yearly NDVI time series by identifying the dates where certain fractions (thresholds) of the maximum annual NDVI amplitude were crossed for the first time. For this purpose, the NDVI time series were smoothed using two commonly used approaches (Fourier adjustment or, alternatively, Savitzky-Golay filtering). Moreover, AVHRR NDVI time series were compared against data from the newer generation sensors SPOT VEGETATION and TERRA MODIS. All remote sensing NDVI time series were highly correlated with single-point ground measurements and therefore accurately represented growth dynamics of alpine grassland. The newer generation sensors VGT and MODIS performed better than AVHRR; however, differences were minor. Thresholds for the determination of MO, SOG, and EOG were similar across sensors and smoothing methods, which demonstrated the robustness of the results. For our purpose, the Fourier adjustment algorithm created better NDVI time series than the Savitzky-Golay filter, since the latter appeared to be more sensitive to noisy NDVI time series. Findings show that the application of various thresholds to NDVI time series allows the observation of the temporal progression of vegetation growth at the selected sites with high consistency. Hence, we believe that our study helps to better understand large-scale vegetation growth dynamics above the tree line in the European Alps.
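
    The threshold idea reduces to finding the first day on which the smoothed NDVI curve exceeds a chosen fraction of its annual amplitude. A compact sketch with a synthetic seasonal curve follows; the function name, curve, and parameter values are illustrative assumptions, not the authors' processing chain.

    ```python
    import numpy as np

    def threshold_crossing_day(ndvi, doy, frac=0.5):
        """First day of year on which the smoothed NDVI curve crosses a given
        fraction of its annual amplitude (sketch of the threshold idea)."""
        level = ndvi.min() + frac * (ndvi.max() - ndvi.min())
        above = np.where(ndvi >= level)[0]
        return doy[above[0]] if above.size else None

    doy = np.arange(1, 366)
    ndvi = 0.2 + 0.5 / (1 + np.exp(-(doy - 150) / 10))   # synthetic seasonal curve
    print(threshold_crossing_day(ndvi, doy, frac=0.25))  # e.g. a start-of-growth date
    ```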

  16. Duality between Time Series and Networks

    PubMed Central

    Campanharo, Andriana S. L. O.; Sirer, M. Irmak; Malmgren, R. Dean; Ramos, Fernando M.; Amaral, Luís A. Nunes.

    2011-01-01

    Studying the interaction between a system's components and the temporal evolution of the system are two common ways to uncover and characterize its internal workings. Recently, several maps from a time series to a network have been proposed with the intent of using network metrics to characterize time series. Although these maps demonstrate that different time series result in networks with distinct topological properties, it remains unclear how these topological properties relate to the original time series. Here, we propose a map from a time series to a network with an approximate inverse operation, making it possible to use network statistics to characterize time series and time series statistics to characterize networks. As a proof of concept, we generate an ensemble of time series ranging from periodic to random and confirm that application of the proposed map retains much of the information encoded in the original time series (or networks) after application of the map (or its inverse). Our results suggest that network analysis can be used to distinguish different dynamic regimes in time series and, perhaps more importantly, time series analysis can provide a powerful set of tools that augment the traditional network analysis toolkit to quantify networks in new and useful ways. PMID:21858093

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferraioli, Luigi; Hueller, Mauro; Vitale, Stefano

    The scientific objectives of the LISA Technology Package experiment on board the LISA Pathfinder mission demand accurate calibration and validation of the data analysis tools in advance of the mission launch. The level of confidence required in the mission outcomes can be reached only by intensively testing the tools on synthetically generated data. A flexible procedure allowing the generation of a cross-correlated stationary noise time series was set up. A multichannel time series with the desired cross-correlation behavior can be generated once a model for a multichannel cross-spectral matrix is provided. The core of the procedure comprises a noise-coloring multichannel filter designed via a frequency-by-frequency eigendecomposition of the model cross-spectral matrix and a subsequent fit in the Z domain. The common problem of initial transients in a filtered time series is solved with a proper initialization of the filter recursion equations. The noise generator performance was tested in a two-dimensional case study of the closed-loop LISA Technology Package dynamics along the two principal degrees of freedom.

  18. Construction of regulatory networks using expression time-series data of a genotyped population.

    PubMed

    Yeung, Ka Yee; Dombek, Kenneth M; Lo, Kenneth; Mittler, John E; Zhu, Jun; Schadt, Eric E; Bumgarner, Roger E; Raftery, Adrian E

    2011-11-29

    The inference of regulatory and biochemical networks from large-scale genomics data is a basic problem in molecular biology. The goal is to generate testable hypotheses of gene-to-gene influences and subsequently to design bench experiments to confirm these network predictions. Coexpression of genes in large-scale gene-expression data implies coregulation and potential gene-gene interactions, but provides little information about the direction of influences. Here, we use both time-series data and genetics data to infer directionality of edges in regulatory networks: time-series data contain information about the chronological order of regulatory events and genetics data allow us to map DNA variations to variations at the RNA level. We generate microarray data measuring time-dependent gene-expression levels in 95 genotyped yeast segregants subjected to a drug perturbation. We develop a Bayesian model averaging regression algorithm that incorporates external information from diverse data types to infer regulatory networks from the time-series and genetics data. Our algorithm is capable of generating feedback loops. We show that our inferred network recovers existing and novel regulatory relationships. Following network construction, we generate independent microarray data on selected deletion mutants to prospectively test network predictions. We demonstrate the potential of our network to discover de novo transcription-factor binding sites. Applying our construction method to previously published data demonstrates that our method is competitive with leading network construction algorithms in the literature.

  19. Estimation of Parameters from Discrete Random Nonstationary Time Series

    NASA Astrophysics Data System (ADS)

    Takayasu, H.; Nakamura, T.

    For the analysis of nonstationary stochastic time series we introduce a formulation to estimate the underlying time-dependent parameters. This method is designed for random events with small numbers that are out of the applicability range of the normal distribution. The method is demonstrated for numerical data generated by a known system, and applied to time series of traffic accidents, batting average of a baseball player and sales volume of home electronics.

  20. High voltage pulse generator

    DOEpatents

    Fasching, George E.

    1977-03-08

    An improved high-voltage pulse generator has been provided which is especially useful in ultrasonic testing of rock core samples. An N number of capacitors are charged in parallel to V volts and at the proper instance are coupled in series to produce a high-voltage pulse of N times V volts. Rapid switching of the capacitors from the paralleled charging configuration to the series discharging configuration is accomplished by using silicon-controlled rectifiers which are chain self-triggered following the initial triggering of a first one of the rectifiers connected between the first and second of the plurality of charging capacitors. A timing and triggering circuit is provided to properly synchronize triggering pulses to the first SCR at a time when the charging voltage is not being applied to the parallel-connected charging capacitors. Alternate circuits are provided for controlling the application of the charging voltage from a charging circuit to be applied to the parallel capacitors which provides a selection of at least two different intervals in which the charging voltage is turned "off" to allow the SCR's connecting the capacitors in series to turn "off" before recharging begins. The high-voltage pulse-generating circuit including the N capacitors and corresponding SCR's which connect the capacitors in series when triggered "on" further includes diodes and series-connected inductors between the parallel-connected charging capacitors which allow sufficiently fast charging of the capacitors for a high pulse repetition rate and yet allow considerable control of the decay time of the high-voltage pulses from the pulse-generating circuit.

  1. Comparing the performance of FA, DFA and DMA using different synthetic long-range correlated time series

    PubMed Central

    Shao, Ying-Hui; Gu, Gao-Feng; Jiang, Zhi-Qiang; Zhou, Wei-Xing; Sornette, Didier

    2012-01-01

    Notwithstanding the significant efforts to develop estimators of long-range correlations (LRC) and to compare their performance, no clear consensus exists on what is the best method and under which conditions. In addition, synthetic tests suggest that the performance of LRC estimators varies when using different generators of LRC time series. Here, we compare the performances of four estimators [Fluctuation Analysis (FA), Detrended Fluctuation Analysis (DFA), Backward Detrending Moving Average (BDMA), and Centred Detrending Moving Average (CDMA)]. We use three different generators [Fractional Gaussian Noises, and two ways of generating Fractional Brownian Motions]. We find that CDMA has the best performance and DFA is only slightly worse in some situations, while FA performs the worst. In addition, CDMA and DFA are less sensitive to the scaling range than FA. Hence, CDMA and DFA remain “The Methods of Choice” in determining the Hurst index of time series. PMID:23150785
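
    For readers who want to reproduce this kind of comparison, a plain first-order DFA estimator is short to write. The sketch below returns the slope of log fluctuation versus log scale, which estimates the Hurst index for fGn-like input; it is a compact reference version, not the optimized estimators benchmarked in the paper.

    ```python
    import numpy as np

    def dfa(x, scales):
        """DFA(1): integrate the series, detrend in non-overlapping windows of each
        scale, and regress log fluctuation on log scale; the slope estimates the
        Hurst index for fGn-like data."""
        y = np.cumsum(x - np.mean(x))
        flucts = []
        for s in scales:
            n_win = len(y) // s
            f2 = []
            for w in range(n_win):
                seg = y[w * s:(w + 1) * s]
                t = np.arange(s)
                trend = np.polyval(np.polyfit(t, seg, 1), t)
                f2.append(np.mean((seg - trend) ** 2))
            flucts.append(np.sqrt(np.mean(f2)))
        slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
        return slope

    x = np.random.default_rng(6).normal(size=10_000)   # white noise, H ~ 0.5
    print(dfa(x, scales=[16, 32, 64, 128, 256]))
    ```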

  2. Methods for developing time-series climate surfaces to drive topographically distributed energy- and water-balance models

    USGS Publications Warehouse

    Susong, D.; Marks, D.; Garen, D.

    1999-01-01

    Topographically distributed energy- and water-balance models can accurately simulate both the development and melting of a seasonal snowcover in the mountain basins. To do this they require time-series climate surfaces of air temperature, humidity, wind speed, precipitation, and solar and thermal radiation. If data are available, these parameters can be adequately estimated at time steps of one to three hours. Unfortunately, climate monitoring in mountain basins is very limited, and the full range of elevations and exposures that affect climate conditions, snow deposition, and melt is seldom sampled. Detailed time-series climate surfaces have been successfully developed using limited data and relatively simple methods. We present a synopsis of the tools and methods used to combine limited data with simple corrections for the topographic controls to generate high temporal resolution time-series images of these climate parameters. Methods used include simulations, elevational gradients, and detrended kriging. The generated climate surfaces are evaluated at points and spatially to determine if they are reasonable approximations of actual conditions. Recommendations are made for the addition of critical parameters and measurement sites into routine monitoring systems in mountain basins.

  3. Efficient Generation and Use of Power Series for Broad Application.

    NASA Astrophysics Data System (ADS)

    Rudmin, Joseph; Sochacki, James

    2017-01-01

    A brief history and overview of the Parker-Sochacki method of power series generation is presented. This method generates power series to order n in time n² for any system of differential equations that has a power series solution. The method is simple enough that novices to differential equations can easily learn it and immediately apply it. Maximal absolute error estimates allow one to determine the number of terms needed to reach a desired accuracy. Ratios of coefficients in a solution with global convergence differ significantly from those for a solution with only local convergence. Divergence of the series prevents one from overlooking poles. The method can always be cast in polynomial form, which allows separation of variables in almost all physical systems, facilitating exploration of hidden symmetries, and is implicitly symplectic.

  4. Time series with tailored nonlinearities

    NASA Astrophysics Data System (ADS)

    Räth, C.; Laut, I.

    2015-10-01

    It is demonstrated how to generate time series with tailored nonlinearities by inducing well-defined constraints on the Fourier phases. Correlations between the phase information of adjacent phases and (static and dynamic) measures of nonlinearities are established and their origin is explained. By applying a set of simple constraints on the phases of an originally linear and uncorrelated Gaussian time series, the observed scaling behavior of the intensity distribution of empirical time series can be reproduced. The power law character of the intensity distributions being typical for, e.g., turbulence and financial data can thus be explained in terms of phase correlations.
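
    The mechanics of "tailoring" nonlinearity can be sketched as follows: keep the Fourier amplitudes (hence the linear correlations) of a Gaussian series and impose a deterministic relation between adjacent phases. The particular coupling used below is an arbitrary illustration, not the constraint set studied in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    x = rng.normal(size=4096)                 # linear, uncorrelated Gaussian series

    X = np.fft.rfft(x)
    amp, phase = np.abs(X), np.angle(X)

    # Illustrative phase constraint: partially align adjacent phases, which
    # induces nonlinearity while the amplitude spectrum is left untouched.
    coupled = 0.7 * phase + 0.3 * np.roll(phase, 1)
    y = np.fft.irfft(amp * np.exp(1j * coupled), n=len(x))
    print(np.std(x), np.std(y))
    ```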

  5. Reconstruction of the modified discrete Langevin equation from persistent time series

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Czechowski, Zbigniew

    The discrete Langevin-type equation, which can describe persistent processes, was introduced. The procedure of reconstruction of the equation from time series was proposed and tested on synthetic data, with short and long-tail distributions, generated by different Langevin equations. Corrections due to the finite sampling rates were derived. For an exemplary meteorological time series, an appropriate Langevin equation, which constitutes a stochastic macroscopic model of the phenomenon, was reconstructed.

  6. Generation and Validation of Spatial Distribution of Hourly Wind Speed Time-Series using Machine Learning

    NASA Astrophysics Data System (ADS)

    Veronesi, F.; Grassi, S.

    2016-09-01

    Wind resource assessment is a key aspect of wind farm planning since it allows to estimate the long term electricity production. Moreover, wind speed time-series at high resolution are helpful to estimate the temporal changes of the electricity generation and indispensable to design stand-alone systems, which are affected by the mismatch of supply and demand. In this work, we present a new generalized statistical methodology to generate the spatial distribution of wind speed time-series, using Switzerland as a case study. This research is based upon a machine learning model and demonstrates that statistical wind resource assessment can successfully be used for estimating wind speed time-series. In fact, this method is able to obtain reliable wind speed estimates and propagate all the sources of uncertainty (from the measurements to the mapping process) in an efficient way, i.e. minimizing computational time and load. This allows not only an accurate estimation, but the creation of precise confidence intervals to map the stochasticity of the wind resource for a particular site. The validation shows that machine learning can minimize the bias of the wind speed hourly estimates. Moreover, for each mapped location this method delivers not only the mean wind speed, but also its confidence interval, which are crucial data for planners.

  7. Effect of noise and filtering on largest Lyapunov exponent of time series associated with human walking.

    PubMed

    Mehdizadeh, Sina; Sanjari, Mohammad Ali

    2017-11-07

    This study aimed to determine the effect of added noise, filtering and time series length on the largest Lyapunov exponent (LyE) value calculated for time series obtained from a passive dynamic walker. The simplest passive dynamic walker model comprising of two massless legs connected by a frictionless hinge joint at the hip was adopted to generate walking time series. The generated time series was used to construct a state space with the embedding dimension of 3 and time delay of 100 samples. The LyE was calculated as the exponential rate of divergence of neighboring trajectories of the state space using Rosenstein's algorithm. To determine the effect of noise on LyE values, seven levels of Gaussian white noise (SNR=55-25dB with 5dB steps) were added to the time series. In addition, the filtering was performed using a range of cutoff frequencies from 3Hz to 19Hz with 2Hz steps. The LyE was calculated for both noise-free and noisy time series with different lengths of 6, 50, 100 and 150 strides. Results demonstrated a high percent error in the presence of noise for LyE. Therefore, these observations suggest that Rosenstein's algorithm might not perform well in the presence of added experimental noise. Furthermore, findings indicated that at least 50 walking strides are required to calculate LyE to account for the effect of noise. Finally, observations support that a conservative filtering of the time series with a high cutoff frequency might be more appropriate prior to calculating LyE. Copyright © 2017 Elsevier Ltd. All rights reserved.
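
    Reproducing the noise conditions of such a study mainly requires adding white noise at a prescribed SNR; a small helper is sketched below (function name and test signal are illustrative). The LyE estimation itself, via Rosenstein's algorithm, is a separate and longer step not shown here.

    ```python
    import numpy as np

    def add_noise_snr(signal, snr_db, seed=None):
        """Add Gaussian white noise at a prescribed SNR (in dB), as used when
        probing the noise sensitivity of Lyapunov-exponent estimates."""
        rng = np.random.default_rng(seed)
        signal_power = np.mean(signal ** 2)
        noise_power = signal_power / (10 ** (snr_db / 10))
        return signal + rng.normal(scale=np.sqrt(noise_power), size=signal.shape)

    t = np.linspace(0, 10, 2000)
    clean = np.sin(2 * np.pi * t)             # stand-in for a walker time series
    noisy = add_noise_snr(clean, snr_db=30)   # one of the SNR levels in the study
    ```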

  8. Visibility Graph Based Time Series Analysis

    PubMed Central

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network-based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states, and successively occurring states are linked. This procedure converts a time series into a temporal network and, at the same time, a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks. PMID:26571115

  9. Statistical process control of mortality series in the Australian and New Zealand Intensive Care Society (ANZICS) adult patient database: implications of the data generating process.

    PubMed

    Moran, John L; Solomon, Patricia J

    2013-05-24

    Statistical process control (SPC), an industrial sphere initiative, has recently been applied in health care and public health surveillance. SPC methods assume independent observations and process autocorrelation has been associated with increase in false alarm frequency. Monthly mean raw mortality (at hospital discharge) time series, 1995-2009, at the individual Intensive Care unit (ICU) level, were generated from the Australia and New Zealand Intensive Care Society adult patient database. Evidence for series (i) autocorrelation and seasonality was demonstrated using (partial)-autocorrelation ((P)ACF) function displays and classical series decomposition and (ii) "in-control" status was sought using risk-adjusted (RA) exponentially weighted moving average (EWMA) control limits (3 sigma). Risk adjustment was achieved using a random coefficient (intercept as ICU site and slope as APACHE III score) logistic regression model, generating an expected mortality series. Application of time-series to an exemplar complete ICU series (1995-(end)2009) was via Box-Jenkins methodology: autoregressive moving average (ARMA) and (G)ARCH ((Generalised) Autoregressive Conditional Heteroscedasticity) models, the latter addressing volatility of the series variance. The overall data set, 1995-2009, consisted of 491324 records from 137 ICU sites; average raw mortality was 14.07%; average(SD) raw and expected mortalities ranged from 0.012(0.113) and 0.013(0.045) to 0.296(0.457) and 0.278(0.247) respectively. For the raw mortality series: 71 sites had continuous data for assessment up to or beyond lag40 and 35% had autocorrelation through to lag40; and of 36 sites with continuous data for ≥ 72 months, all demonstrated marked seasonality. Similar numbers and percentages were seen with the expected series. Out-of-control signalling was evident for the raw mortality series with respect to RA-EWMA control limits; a seasonal ARMA model, with GARCH effects, displayed white-noise residuals which were in-control with respect to EWMA control limits and one-step prediction error limits (3SE). The expected series was modelled with a multiplicative seasonal autoregressive model. The data generating process of monthly raw mortality series at the ICU level displayed autocorrelation, seasonality and volatility. False-positive signalling of the raw mortality series was evident with respect to RA-EWMA control limits. A time series approach using residual control charts resolved these issues.
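
    To make the control-charting step concrete, the sketch below computes a plain EWMA statistic with time-varying k-sigma limits. The paper's charts are risk-adjusted using expected mortalities from a random-coefficient logistic model, which this simplified version does not attempt; the data are synthetic.

    ```python
    import numpy as np

    def ewma_chart(x, lam=0.2, k=3.0):
        """EWMA control statistic with k-sigma limits that widen toward their
        asymptote (plain, non-risk-adjusted sketch)."""
        mu, sigma = np.mean(x), np.std(x, ddof=1)
        z = np.empty(len(x))
        z[0] = mu
        for t in range(1, len(x)):
            z[t] = lam * x[t] + (1 - lam) * z[t - 1]
        t_idx = np.arange(1, len(x) + 1)
        widths = k * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t_idx)))
        return z, mu - widths, mu + widths

    monthly_mortality = np.random.default_rng(8).normal(0.14, 0.02, 120)
    z, lcl, ucl = ewma_chart(monthly_mortality)
    print(np.any((z < lcl) | (z > ucl)))      # any out-of-control signal?
    ```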

  10. 75 FR 22779 - FIFRA Scientific Advisory Panel; Notice of Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-30

    ... exposure factors to generate time series of exposure for simulated individuals. One-stage or two-stage...., built-in simple source-to- concentration module, user-entered time series from other models or field... FOR FURTHER INFORMATION CONTACT at least 10 days prior to the meeting to give EPA as much time as...

  11. "Batch" kinetics in flow: online IR analysis and continuous control.

    PubMed

    Moore, Jason S; Jensen, Klavs F

    2014-01-07

    Currently, kinetic data is either collected under steady-state conditions in flow or by generating time-series data in batch. Batch experiments are generally considered to be more suitable for the generation of kinetic data because of the ability to collect data from many time points in a single experiment. Now, a method that rapidly generates time-series reaction data from flow reactors by continuously manipulating the flow rate and reaction temperature has been developed. This approach makes use of inline IR analysis and an automated microreactor system, which allowed for rapid and tight control of the operating conditions. The conversion/residence time profiles at several temperatures were used to fit parameters to a kinetic model. This method requires significantly less time and a smaller amount of starting material compared to one-at-a-time flow experiments, and thus allows for the rapid generation of kinetic data. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Using forbidden ordinal patterns to detect determinism in irregularly sampled time series.

    PubMed

    Kulp, C W; Chobot, J M; Niskala, B J; Needhammer, C J

    2016-02-01

    It is known that when symbolizing a time series into ordinal patterns using the Bandt-Pompe (BP) methodology, there will be ordinal patterns called forbidden patterns that do not occur in a deterministic series. The existence of forbidden patterns can be used to identify deterministic dynamics. In this paper, the ability to use forbidden patterns to detect determinism in irregularly sampled time series is tested on data generated from a continuous model system. The study is done in three parts. First, the effects of sampling time on the number of forbidden patterns are studied on regularly sampled time series. The next two parts focus on two types of irregular sampling: missing data and timing jitter. It is shown that forbidden patterns can be used to detect determinism in irregularly sampled time series for low degrees of sampling irregularity (as defined in the paper). In addition, comments are made about the appropriateness of using the BP methodology to symbolize irregularly sampled time series.
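
    Counting Bandt-Pompe ordinal patterns, and hence spotting forbidden ones, takes only a few lines. The sketch below uses order 3 on a noise-free deterministic signal, an illustrative stand-in for the model system used in the paper.

    ```python
    import numpy as np
    from itertools import permutations

    def ordinal_pattern_counts(x, order=3):
        """Bandt-Pompe symbolization: count each length-`order` ordinal pattern.
        Patterns with zero count are candidate 'forbidden' patterns."""
        counts = {p: 0 for p in permutations(range(order))}
        for i in range(len(x) - order + 1):
            pattern = tuple(np.argsort(x[i:i + order]))
            counts[pattern] += 1
        return counts

    x = np.sin(np.linspace(0, 20, 500))            # deterministic test series
    counts = ordinal_pattern_counts(x, order=3)
    forbidden = [p for p, c in counts.items() if c == 0]
    print("forbidden patterns:", forbidden)
    ```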

  13. Remote Sensing Time Series Product Tool

    NASA Technical Reports Server (NTRS)

    Predos, Don; Ryan, Robert E.; Ross, Kenton W.

    2006-01-01

    The TSPT (Time Series Product Tool) software was custom-designed for NASA to rapidly create and display single-band and band-combination time series, such as NDVI (Normalized Difference Vegetation Index) images, for wide-area crop surveillance and for other time-critical applications. The TSPT, developed in MATLAB, allows users to create and display various MODIS (Moderate Resolution Imaging Spectroradiometer) or simulated VIIRS (Visible/Infrared Imager Radiometer Suite) products as single images, as time series plots at a selected location, or as temporally processed image videos. Manually creating these types of products is extremely labor intensive; however, the TSPT development tool makes the process simplified and efficient. MODIS is ideal for monitoring large crop areas because of its wide swath (2330 km), its relatively small ground sample distance (250 m), and its high temporal revisit time (twice daily). Furthermore, because MODIS imagery is acquired daily, rapid changes in vegetative health can potentially be detected. The new TSPT technology provides users with the ability to temporally process high-revisit-rate satellite imagery, such as that acquired from MODIS and from its successor, the VIIRS. The TSPT features the important capability of fusing data from both MODIS instruments onboard the Terra and Aqua satellites, which drastically improves cloud statistics. With the TSPT, MODIS metadata is used to find and optionally remove bad and suspect data. Noise removal and temporal processing techniques allow users to create low-noise time series plots and image videos and to select settings and thresholds that tailor particular output products. The TSPT GUI (graphical user interface) provides an interactive environment for crafting what-if scenarios by enabling a user to repeat product generation using different settings and thresholds. The TSPT Application Programming Interface provides more fine-tuned control of product generation, allowing experienced programmers to bypass the GUI and to create more user-specific output products, such as comparison time plots or images. This type of time series analysis tool for remotely sensed imagery could be the basis of a large-area vegetation surveillance system. The TSPT has been used to generate NDVI time series over growing seasons in California and Argentina and for hurricane events, such as Hurricane Katrina.

  14. Scalable Prediction of Energy Consumption using Incremental Time Series Clustering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simmhan, Yogesh; Noor, Muhammad Usman

    2013-10-09

    Time series datasets are a canonical form of high velocity Big Data, and often generated by pervasive sensors, such as found in smart infrastructure. Performing predictive analytics on time series data can be computationally complex, and requires approximation techniques. In this paper, we motivate this problem using a real application from the smart grid domain. We propose an incremental clustering technique, along with a novel affinity score for determining cluster similarity, which help reduce the prediction error for cumulative time series within a cluster. We evaluate this technique, along with optimizations, using real datasets from smart meters, totaling ~700,000 data points, and show the efficacy of our techniques in improving the prediction error of time series data within polynomial time.

  15. Highly comparative time-series analysis: the empirical structure of time series and their methods.

    PubMed

    Fulcher, Ben D; Little, Max A; Jones, Nick S

    2013-06-06

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.

  16. Highly comparative time-series analysis: the empirical structure of time series and their methods

    PubMed Central

    Fulcher, Ben D.; Little, Max A.; Jones, Nick S.

    2013-01-01

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines. PMID:23554344

  17. Pan-European stochastic flood event set

    NASA Astrophysics Data System (ADS)

    Kadlec, Martin; Pinto, Joaquim G.; He, Yi; Punčochář, Petr; Kelemen, Fanni D.; Manful, Desmond; Palán, Ladislav

    2017-04-01

    Impact Forecasting (IF), the model development center of Aon Benfield, has been developing a large suite of catastrophe flood models on probabilistic bases for individual countries in Europe. Such natural catastrophes do not follow national boundaries: for example, the major flood in 2016 was responsible for Europe's largest insured loss of USD3.4bn and affected Germany, France, Belgium, Austria and parts of several other countries. Reflecting such needs, IF initiated a pan-European flood event set development which combines cross-country exposures with country based loss distributions to provide more insightful data to re/insurers. Because the observed discharge data are not available across the whole of Europe in sufficient quantity and quality to permit a detailed loss evaluation, a top-down approach was chosen. This approach is based on simulating precipitation from a GCM/RCM model chain followed by a calculation of discharges using rainfall-runoff modelling. IF set up this project in close collaboration with Karlsruhe Institute of Technology (KIT) regarding the precipitation estimates and with the University of East Anglia (UEA) in terms of the rainfall-runoff modelling. KIT's main objective is to provide high resolution daily historical and stochastic time series of key meteorological variables. A purely dynamical downscaling approach with the regional climate model COSMO-CLM (CCLM) is used to generate the historical time series, using re-analysis data as boundary conditions. The resulting time series are validated against the gridded observational dataset E-OBS, and different bias-correction methods are employed. The generation of the stochastic time series requires transfer functions between large-scale atmospheric variables and regional temperature and precipitation fields. These transfer functions are developed for the historical time series using reanalysis data as predictors and bias-corrected CCLM simulated precipitation and temperature as predictands. Finally, the transfer functions are applied to a large ensemble of GCM simulations with forcing corresponding to present day climate conditions to generate highly resolved stochastic time series of precipitation and temperature for several thousand years. These time series form the input for the rainfall-runoff model developed by the UEA team. It is a spatially distributed model adapted from the HBV model and will be calibrated for individual basins using historical discharge data. The calibrated model will be driven by the precipitation time series generated by the KIT team to simulate discharges at a daily time step. The uncertainties in the simulated discharges will be analysed using multiple model parameter sets. A number of statistical methods will be used to assess return periods, changes in the magnitudes, changes in the characteristics of floods such as time base and time to peak, and spatial correlations of large flood events. The pan-European stochastic flood event set will permit a better view of flood risk for market applications.

  18. True random bit generators based on current time series of contact glow discharge electrolysis

    NASA Astrophysics Data System (ADS)

    Rojas, Andrea Espinel; Allagui, Anis; Elwakil, Ahmed S.; Alawadhi, Hussain

    2018-05-01

    Random bit generators (RBGs) in today's digital information and communication systems employ high-rate physical entropy sources such as electronic, photonic, or thermal time series signals. However, the proper functioning of such physical systems is bound by specific constraints that make them in some cases weak and susceptible to external attacks. In this study, we show that the electrical current time series of contact glow discharge electrolysis, which is a dc voltage-powered micro-plasma in liquids, can be used for generating random bit sequences over a wide range of high dc voltages. The current signal is quantized into a binary stream by first applying a simple moving average function, which centers the distribution around zero, and then applying logical operations that enable the binarized data to pass all tests in the industry-standard randomness test suite of the National Institute of Standards and Technology. Furthermore, the robustness of this RBG against power supply attacks has been examined and verified.
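
    A minimal sketch of that binarization pipeline is shown below: centre the signal with a moving average, threshold at zero, and apply a simple logical whitening step. The XOR of consecutive bits is only one plausible choice of "logical operation"; the paper's exact post-processing may differ, and the input signal here is synthetic.

    ```python
    import numpy as np

    def binarize_current(i_t, window=32):
        """Centre the current signal with a moving average, threshold at zero,
        then XOR consecutive bits as an illustrative whitening step."""
        kernel = np.ones(window) / window
        centred = i_t - np.convolve(i_t, kernel, mode="same")
        bits = (centred > 0).astype(np.uint8)
        return bits[1:] ^ bits[:-1]

    current = np.random.default_rng(7).normal(1.0, 0.2, 10_000)  # stand-in signal
    bits = binarize_current(current)
    print(bits[:16], bits.mean())
    ```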

  19. Analysis and stochastic modelling of Intensity-Duration-Frequency relationship from 88 years of 10 min rainfall data in North Spain

    NASA Astrophysics Data System (ADS)

    Delgado, Oihane; Campo-Bescós, Miguel A.; López, J. Javier

    2017-04-01

    Frequently, when we are trying to solve certain hydrological engineering problems, it is often necessary to know rain intensity values related to a specific probability or return period, T. Based on analyses of extreme rainfall events at different time scale aggregation, we can deduce the relationships among Intensity-Duration-Frequency (IDF), that are widely used in hydraulic infrastructure design. However, the lack of long time series of rainfall intensities for smaller time periods, minutes or hours, leads to use mathematical expressions to characterize and extend these curves. One way to deduce them is through the development of synthetic rainfall time series generated from stochastic models, which is evaluated in this work. From recorded accumulated rainfall time series every 10 min in the pluviograph of Igueldo (San Sebastian, Spain) for the time period between 1927-2005, their homogeneity has been checked and possible statistically significant increasing or decreasing trends have also been shown. Subsequently, two models have been calibrated: Bartlett-Lewis and Markov chains models, which are based on the successions of storms, composed for a series of rainfall events, separated by a short interval of time each. Finally, synthetic ten-minute rainfall time series are generated, which allow to estimate detailed IDF curves and compare them with the estimated IDF based on the recorded data.

  20. Accuracy of time-domain and frequency-domain methods used to characterize catchment transit time distributions

    NASA Astrophysics Data System (ADS)

    Godsey, S. E.; Kirchner, J. W.

    2008-12-01

    The mean residence time - the average time that it takes rainfall to reach the stream - is a basic parameter used to characterize catchment processes. Heterogeneities in these processes lead to a distribution of travel times around the mean residence time. By examining this travel time distribution, we can better predict catchment response to contamination events. A catchment system with shorter residence times or narrower distributions will respond quickly to contamination events, whereas systems with longer residence times or longer-tailed distributions will respond more slowly to those same contamination events. The travel time distribution of a catchment is typically inferred from time series of passive tracers (e.g., water isotopes or chloride) in precipitation and streamflow. Variations in the tracer concentration in streamflow are usually damped compared to those in precipitation, because precipitation inputs from different storms (with different tracer signatures) are mixed within the catchment. Mathematically, this mixing process is represented by the convolution of the travel time distribution and the precipitation tracer inputs to generate the stream tracer outputs. Because convolution in the time domain is equivalent to multiplication in the frequency domain, it is relatively straightforward to estimate the parameters of the travel time distribution in either domain. In the time domain, the parameters describing the travel time distribution are typically estimated by maximizing the goodness of fit between the modeled and measured tracer outputs. In the frequency domain, the travel time distribution parameters can be estimated by fitting a power-law curve to the ratio of precipitation spectral power to stream spectral power. Differences between the methods of parameter estimation in the time and frequency domain mean that these two methods may respond differently to variations in data quality, record length and sampling frequency. Here we evaluate how well these two methods of travel time parameter estimation respond to different sources of uncertainty and compare the methods to one another. We do this by generating synthetic tracer input time series of different lengths, and convolve these with specified travel-time distributions to generate synthetic output time series. We then sample both the input and output time series at various sampling intervals and corrupt the time series with realistic error structures. Using these 'corrupted' time series, we infer the apparent travel time distribution, and compare it to the known distribution that was used to generate the synthetic data in the first place. This analysis allows us to quantify how different record lengths, sampling intervals, and error structures in the tracer measurements affect the apparent mean residence time and the apparent shape of the travel time distribution.
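
    The convolution step described here can be written directly: pick a travel-time distribution (exponential with a 60-day mean below, purely for illustration), convolve it with a synthetic tracer input, and observe the damping of the output. Parameter estimation then amounts to adjusting the distribution until modeled and measured stream tracer signals match.

    ```python
    import numpy as np

    dt = 1.0                                   # daily time step
    tau = np.arange(0, 1000, dt)
    mean_tt = 60.0
    h = np.exp(-tau / mean_tt) / mean_tt       # assumed travel-time distribution

    rng = np.random.default_rng(4)
    c_in = rng.normal(0, 1, 3000)              # synthetic precipitation tracer anomalies
    c_out = np.convolve(c_in, h * dt)[: len(c_in)]   # damped stream tracer signal
    print(c_in.std(), c_out.std())             # output variance is strongly damped
    ```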

  1. Boolean network inference from time series data incorporating prior biological knowledge.

    PubMed

    Haider, Saad; Pal, Ranadip

    2012-01-01

    Numerous approaches exist for modeling genetic regulatory networks (GRNs), but the low sampling rates often employed in biological studies prevent the inference of detailed models from experimental data. In this paper, we analyze the issues involved in estimating a model of a GRN from single cell line time series data with limited time points. We present an inference approach for a Boolean Network (BN) model of a GRN from limited transcriptomic or proteomic time series data based on prior biological knowledge of connectivity, constraints on attractor structure and robust design. We applied our inference approach to 6 time point transcriptomic data on Human Mammary Epithelial Cell line (HMEC) after application of Epidermal Growth Factor (EGF) and generated a BN with a plausible biological structure satisfying the data. We further defined and applied a similarity measure to compare synthetic BNs and BNs generated through the proposed approach constructed from transitions of various paths of the synthetic BNs. We have also compared the performance of our algorithm with two existing BN inference algorithms. Through theoretical analysis and simulations, we showed the rarity of arriving at a BN from limited time series data with plausible biological structure using random connectivity and absence of structure in data. The framework, when applied to experimental data and data generated from synthetic BNs, was able to estimate BNs with high similarity scores. Comparison with existing BN inference algorithms showed the better performance of our proposed algorithm for limited time series data. The proposed framework can also be applied to optimize the connectivity of a GRN from experimental data when the prior biological knowledge on regulators is limited or not unique.

  2. Variance fluctuations in nonstationary time series: a comparative study of music genres

    NASA Astrophysics Data System (ADS)

    Jennings, Heather D.; Ivanov, Plamen Ch.; De Martins, Allan M.; da Silva, P. C.; Viswanathan, G. M.

    2004-05-01

    An important problem in physics concerns the analysis of audio time series generated by transduced acoustic phenomena. Here, we develop a new method to quantify the scaling properties of the local variance of nonstationary time series. We apply this technique to analyze audio signals obtained from selected genres of music. We find quantitative differences in the correlation properties of high art music, popular music, and dance music. We discuss the relevance of these objective findings in relation to the subjective experience of music.

  3. A perturbative approach for enhancing the performance of time series forecasting.

    PubMed

    de Mattos Neto, Paulo S G; Ferreira, Tiago A E; Lima, Aranildo R; Vasconcelos, Germano C; Cavalcanti, George D C

    2017-04-01

    This paper proposes a method to perform time series prediction based on perturbation theory. The approach is based on continuously adjusting an initial forecasting model to asymptotically approximate a desired time series model. First, a predictive model generates an initial forecast for a time series. Second, a residual time series is calculated as the difference between the original time series and the initial forecast. If that residual series is not white noise, then it can be used to improve the accuracy of the initial model, and a new predictive model is adjusted using the residual series. The whole process is repeated until convergence or until the residual series becomes white noise. The output of the method is then given by summing up the outputs of all trained predictive models in a perturbative sense. To test the method, an experimental investigation was conducted on six real-world time series. A comparison was made with six other methods tested and ten other results found in the literature. Results show that not only is the performance of the initial model significantly improved, but also that the proposed method outperforms the other previously published results. Copyright © 2017 Elsevier Ltd. All rights reserved.
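
    The iteration can be captured in a few lines: fit a model, subtract its in-sample predictions, fit the residuals again, and sum all stage outputs. The sketch below uses a fixed number of stages and a toy persistence learner purely for illustration; the paper instead stops when the residual series is indistinguishable from white noise.

    ```python
    import numpy as np

    def perturbative_forecast(y, fit_predict, n_stages=3):
        """Fit successive models to the residuals of the previous stage and sum
        all stage outputs (generic sketch of the perturbative idea)."""
        total = np.zeros_like(y, dtype=float)
        residual = y.astype(float)
        for _ in range(n_stages):
            stage = fit_predict(residual)      # in-sample prediction of current residual
            total += stage
            residual = residual - stage
        return total

    # Toy base learner: lag-1 persistence stands in for the "predictive model"
    persistence = lambda r: np.concatenate(([r[0]], r[:-1]))
    y = np.sin(np.linspace(0, 30, 400)) + np.random.default_rng(5).normal(0, 0.1, 400)
    print(np.mean((y - perturbative_forecast(y, persistence)) ** 2))
    ```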

  4. Towards a New Generation of Time-Series Visualization Tools in the ESA Heliophysics Science Archives

    NASA Astrophysics Data System (ADS)

    Perez, H.; Martinez, B.; Cook, J. P.; Herment, D.; Fernandez, M.; De Teodoro, P.; Arnaud, M.; Middleton, H. R.; Osuna, P.; Arviset, C.

    2017-12-01

    During the last decades, a varied set of Heliophysics missions has allowed the scientific community to gain better knowledge of the solar atmosphere and activity. The remote sensing images of missions such as SOHO have paved the way for Helio-based spatial data visualization software such as JHelioViewer/Helioviewer. On the other hand, the huge amount of in-situ measurements provided by other missions such as Cluster provides a wide base for plot visualization software whose reach is still far from being fully exploited. The Heliophysics Science Archives within the ESAC Science Data Center (ESDC) already provide a first generation of tools for time-series visualization focusing on each mission's needs: visualization of quicklook plots, cross-calibration time series, pre-generated/on-demand multi-plot stacks (Cluster), basic plot zoom in/out options (Ulysses) and easy navigation through the plots in time (Ulysses, Cluster, ISS-Solaces). However, needs evolve, and scientists involved in new missions require, among other improvements, multi-variable plotting, heat-map stacks, interactive synchronization and axis variable selection. The new Heliophysics archives (such as Solar Orbiter) and the evolution of existing ones (Cluster) intend to address these new challenges. This paper provides an overview of the different approaches for visualizing time series followed within the ESA Heliophysics Archives and their foreseen evolution.

  5. Coupling Poisson rectangular pulse and multiplicative microcanonical random cascade models to generate sub-daily precipitation timeseries

    NASA Astrophysics Data System (ADS)

    Pohle, Ina; Niebisch, Michael; Müller, Hannes; Schümberg, Sabine; Zha, Tingting; Maurer, Thomas; Hinz, Christoph

    2018-07-01

    To simulate the impacts of within-storm rainfall variability on fast hydrological processes, long precipitation time series with high temporal resolution are required. Due to the limited availability of observed data, such time series are typically obtained from stochastic models. However, most existing rainfall models are limited in their ability to conserve the rainfall event statistics that are relevant for hydrological processes. Poisson rectangular pulse models are widely applied to generate long time series of alternating precipitation event durations and mean intensities as well as interstorm period durations. Multiplicative microcanonical random cascade (MRC) models are used to disaggregate precipitation time series from coarse to fine temporal resolution. To overcome the inconsistencies between the temporal structure of the Poisson rectangular pulse model and the MRC model, we developed a new coupling approach by introducing two modifications to the MRC model. These modifications comprise (a) a modified cascade model ("constrained cascade") which preserves the event durations generated by the Poisson rectangular pulse model by constraining the first and last interval of a precipitation event to contain precipitation and (b) continuous sigmoid functions of the multiplicative weights to consider the scale dependency in the disaggregation of precipitation events of different durations. The constrained cascade model was evaluated in its ability to disaggregate observed precipitation events in comparison to existing MRC models. For that, we used a 20-year record of hourly precipitation at six stations across Germany. The constrained cascade model showed markedly better agreement with the observed data in terms of both the temporal pattern of the precipitation time series (e.g. the dry and wet spell durations and autocorrelations) and event characteristics (e.g. intra-event intermittency and intensity fluctuation within events). The constrained cascade model also slightly outperformed the other MRC models with respect to the intensity-frequency relationship. To assess the performance of the coupled Poisson rectangular pulse and constrained cascade model, precipitation events were stochastically generated by the Poisson rectangular pulse model and then disaggregated by the constrained cascade model. We found that the coupled model performs satisfactorily in terms of the temporal pattern of the precipitation time series, event characteristics and the intensity-frequency relationship.
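
    For orientation, a sketch of the Poisson rectangular pulse step alone (the cascade disaggregation is not reproduced); the exponential choices for dry-period duration, event duration and intensity, and all parameter values, are hypothetical.

      import numpy as np

      rng = np.random.default_rng(42)

      def poisson_rectangular_pulses(n_events, mean_dry=30.0, mean_dur=6.0, mean_int=1.5):
          """Alternating dry periods and rectangular rain pulses (hours, mm/h).

          Returns a list of (start_time, duration, intensity) tuples.
          """
          events, t = [], 0.0
          for _ in range(n_events):
              t += rng.exponential(mean_dry)            # interstorm (dry) duration
              duration = rng.exponential(mean_dur)      # event duration
              intensity = rng.exponential(mean_int)     # constant intensity of the pulse
              events.append((t, duration, intensity))
              t += duration
          return events

      def to_hourly(events, n_hours):
          """Rasterize rectangular pulses onto an hourly grid (the coarse series to disaggregate)."""
          series = np.zeros(n_hours)
          for start, dur, inten in events:
              a, b = int(start), min(int(np.ceil(start + dur)), n_hours)
              series[a:b] += inten                      # ignores sub-hourly edge effects
          return series

      hourly = to_hourly(poisson_rectangular_pulses(200), n_hours=24 * 365)
      print(hourly.mean(), (hourly > 0).mean())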

  6. LAI, FAPAR and FCOVER products derived from AVHRR long time series: principles and evaluation

    NASA Astrophysics Data System (ADS)

    Verger, A.; Baret, F.; Weiss, M.; Lacaze, R.; Makhmara, H.; Pacholczyk, P.; Smets, B.; Kandasamy, S.; Vermote, E.

    2012-04-01

    Continuous and long-term global monitoring of the terrestrial biosphere has drawn intense interest in recent years in the context of climate and global change. Developing methodologies for generating historical data records from data collected with different satellite sensors over the past three decades, while taking advantage of the improvements identified in the processing of new-generation sensors, is a central issue in the remote sensing community. In this context, the Bio-geophysical Parameters (BioPar) service within the Geoland2 project (http://www.geoland2.eu) aims at developing pre-operational infrastructures for providing global land products both in near real time and off-line with long time series. In this contribution, we describe the principles of the GEOLAND algorithm for generating long-term datasets of three key biophysical variables, leaf area index (LAI), Fraction of Absorbed Photosynthetically Active Radiation (FAPAR) and cover fraction (FCOVER), which play a key role in several processes, including photosynthesis, respiration and transpiration. LAI, FAPAR and FCOVER are produced globally from the AVHRR Long Term Data Record (LTDR) for the 1981-2000 period at 0.05° spatial resolution and 10-day temporal sampling frequency. The proposed algorithm aims to ensure robustness of the derived long time series and consistency with those developed in recent years, particularly with GEOLAND products derived from the VEGETATION sensor. The approach is based on the capacity of neural networks to learn a particular biophysical product (GEOLAND) from reflectances from another sensor (AVHRR normalized reflectances in the red and near-infrared bands). Outliers due to possible cloud contamination or residual atmospheric correction are iteratively eliminated. Prior information based on the climatology is used to get more robust estimates. A specific gap-filling and smoothing procedure was applied to generate continuous and smooth time series of decadal products. Finally, quality assessment information as well as tentative quantitative uncertainties were proposed. The comparison of the resulting AVHRR LTDR products with actual GEOLAND series derived from VEGETATION demonstrates that they are very consistent, providing continuous time series of global observations of LAI, FAPAR and FCOVER for the last 30-year period, with continuation after 2011.

  7. Adaptive time-variant models for fuzzy-time-series forecasting.

    PubMed

    Wong, Wai-Keung; Bai, Enjian; Chu, Alice Wai-Ching

    2010-12-01

    Fuzzy time series have been applied to the prediction of enrollment, temperature, stock indices, and other domains. Related studies mainly focus on three factors, namely, the partitioning of the universe of discourse, the content of forecasting rules, and the methods of defuzzification, all of which greatly influence the prediction accuracy of forecasting models. These studies use fixed analysis window sizes for forecasting. In this paper, an adaptive time-variant fuzzy-time-series forecasting model (ATVF) is proposed to improve forecasting accuracy. The proposed model automatically adapts the analysis window size of the fuzzy time series based on the prediction accuracy in the training phase and uses heuristic rules to generate forecasting values in the testing phase. The performance of the ATVF model is tested using both simulated and actual time series, including the enrollments at the University of Alabama, Tuscaloosa, and the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX). The experimental results show that the proposed ATVF model achieves a significant improvement in forecasting accuracy as compared to other fuzzy-time-series forecasting models.

  8. Approximate Entropies for Stochastic Time Series and EKG Time Series of Patients with Epilepsy and Pseudoseizures

    NASA Astrophysics Data System (ADS)

    Vyhnalek, Brian; Zurcher, Ulrich; O'Dwyer, Rebecca; Kaufman, Miron

    2009-10-01

    A wide range of heart rate irregularities have been reported in small studies of patients with temporal lobe epilepsy [TLE]. We hypothesize that patients with TLE display cardiac dysautonomia in either a subclinical or clinical manner. In a small study, we retrospectively identified (2003-8) two groups of patients from the epilepsy monitoring unit [EMU] at the Cleveland Clinic. No patients were diagnosed with cardiovascular morbidities. The control group consisted of patients with confirmed pseudoseizures, and the experimental group had right temporal lobe epilepsy confirmed through a seizure-free outcome after temporal lobectomy. We quantified the heart rate variability using the approximate entropy [ApEn]. We found similar values of the ApEn in all three states of consciousness (awake, sleep, and preceding seizure onset). In the TLE group, there is some evidence for greater variability in the awake state than in either sleep or the period preceding seizure onset. Here we present results for mathematically generated time series: the heart rate fluctuations ξ follow gamma statistics, i.e., p(ξ) = Γ(k)^(-1) ξ^(k-1) exp(-ξ). This probability function has well-known properties and its Shannon entropy can be expressed in terms of the Γ-function. The parameter k allows us to generate a family of heart rate time series with different statistics. The ApEn calculated for the generated time series for different values of k mimics the properties found for the TLE and pseudoseizure groups. Our results suggest that the ApEn is an effective tool to probe differences in the statistics of heart rate fluctuations.
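
    Approximate entropy has a standard definition (window length m, tolerance r); the sketch below applies it to gamma-distributed surrogate series for a few values of the shape parameter k, with m and r chosen arbitrarily.

      import numpy as np

      def approx_entropy(x, m=2, r=None):
          """Approximate entropy ApEn(m, r) of a 1-D series (Pincus' definition)."""
          x = np.asarray(x, dtype=float)
          if r is None:
              r = 0.2 * x.std()

          def phi(m):
              n = len(x) - m + 1
              emb = np.array([x[i:i + m] for i in range(n)])          # embedded vectors
              # Chebyshev distance between all pairs of embedded vectors
              dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
              c = (dist <= r).sum(axis=1) / n                          # includes self-matches
              return np.log(c).mean()

          return phi(m) - phi(m + 1)

      rng = np.random.default_rng(0)
      for k in (1, 2, 5):                       # family of gamma-distributed surrogate series
          xi = rng.gamma(shape=k, size=1000)
          print(k, round(approx_entropy(xi), 3))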

  9. Multiscale multifractal time irreversibility analysis of stock markets

    NASA Astrophysics Data System (ADS)

    Jiang, Chenguang; Shang, Pengjian; Shi, Wenbin

    2016-11-01

    Time irreversibility is one of the most important properties of nonstationary time series. Complex time series often demonstrate multiscale time irreversibility, such that not only the original but also coarse-grained time series are asymmetric over a wide range of scales. We study the multiscale time irreversibility of time series. In this paper, we develop a method called multiscale multifractal time irreversibility analysis (MMRA), which allows us to extend the description of time irreversibility to include its dependence on the segment size and statistical moments. We test the effectiveness of MMRA in detecting multifractality and time irreversibility of time series generated from the delayed Henon map and the binomial multifractal model. We then apply our method to the time irreversibility analysis of stock markets in different regions. We find that the emerging market has a higher multifractality degree and time irreversibility compared with developed markets. In this sense, the MMRA method may provide new angles in assessing the evolution stage of stock markets.

  10. Statistical process control of mortality series in the Australian and New Zealand Intensive Care Society (ANZICS) adult patient database: implications of the data generating process

    PubMed Central

    2013-01-01

    Background Statistical process control (SPC), an initiative from the industrial sphere, has recently been applied in health care and public health surveillance. SPC methods assume independent observations, and process autocorrelation has been associated with an increase in false alarm frequency. Methods Monthly mean raw mortality (at hospital discharge) time series, 1995–2009, at the individual Intensive Care Unit (ICU) level, were generated from the Australian and New Zealand Intensive Care Society adult patient database. Evidence was sought for (i) series autocorrelation and seasonality, using (partial) autocorrelation ((P)ACF) function displays and classical series decomposition, and (ii) "in-control" status, using risk-adjusted (RA) exponentially weighted moving average (EWMA) control limits (3 sigma). Risk adjustment was achieved using a random coefficient (intercept as ICU site and slope as APACHE III score) logistic regression model, generating an expected mortality series. Time-series methods were applied to an exemplar complete ICU series (1995-(end)2009) via the Box-Jenkins methodology: autoregressive moving average (ARMA) and (G)ARCH ((Generalised) Autoregressive Conditional Heteroscedasticity) models, the latter addressing volatility of the series variance. Results The overall data set, 1995-2009, consisted of 491,324 records from 137 ICU sites; average raw mortality was 14.07%; average (SD) raw and expected mortalities ranged from 0.012 (0.113) and 0.013 (0.045) to 0.296 (0.457) and 0.278 (0.247) respectively. For the raw mortality series, 71 sites had continuous data for assessment up to or beyond lag 40 and 35% had autocorrelation through to lag 40; of 36 sites with continuous data for ≥ 72 months, all demonstrated marked seasonality. Similar numbers and percentages were seen with the expected series. Out-of-control signalling was evident for the raw mortality series with respect to the RA-EWMA control limits; a seasonal ARMA model, with GARCH effects, displayed white-noise residuals which were in-control with respect to EWMA control limits and one-step prediction error limits (3SE). The expected series was modelled with a multiplicative seasonal autoregressive model. Conclusions The data generating process of monthly raw mortality series at the ICU level displayed autocorrelation, seasonality and volatility. False-positive signalling of the raw mortality series was evident with respect to the RA-EWMA control limits. A time series approach using residual control charts resolved these issues. PMID:23705957
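
    As context for the control-charting step (a generic chart, not the paper's risk-adjusted model), an EWMA statistic with 3-sigma limits can be computed as in the sketch below; the smoothing constant and the in-control mean and standard deviation are assumptions.

      import numpy as np

      def ewma_chart(x, mu0, sigma0, lam=0.2, L=3.0):
          """EWMA statistic and time-varying 3-sigma control limits for a monitored series.

          Returns (z, lower, upper, out_of_control_flags).
          """
          x = np.asarray(x, dtype=float)
          z = np.empty_like(x)
          z_prev = mu0
          for t, xt in enumerate(x):
              z_prev = lam * xt + (1 - lam) * z_prev
              z[t] = z_prev
          t = np.arange(1, len(x) + 1)
          half_width = L * sigma0 * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
          lower, upper = mu0 - half_width, mu0 + half_width
          return z, lower, upper, (z < lower) | (z > upper)

      rng = np.random.default_rng(1)
      mortality = rng.normal(0.14, 0.02, 120)     # hypothetical monthly mortality proportions
      mortality[90:] += 0.05                      # simulated upward shift after month 90
      z, lo, hi, flags = ewma_chart(mortality, mu0=0.14, sigma0=0.02)
      print("first signal at month", int(np.argmax(flags)))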

  11. Separation of spatial-temporal patterns ('climatic modes') by combined analysis of really measured and generated numerically vector time series

    NASA Astrophysics Data System (ADS)

    Feigin, A. M.; Mukhin, D.; Volodin, E. M.; Gavrilov, A.; Loskutov, E. M.

    2013-12-01

    A new method of decomposing the Earth's climate system into well-separated spatial-temporal patterns ('climatic modes') is discussed. The method is based on: (i) a generalization of MSSA (Multichannel Singular Spectral Analysis) [1] for expanding vector (space-distributed) time series in a basis of spatial-temporal empirical orthogonal functions (STEOF), which makes allowance for delayed correlations of the processes recorded at spatially separated points; (ii) expanding both real SST data and numerically generated SST data several times longer than the real record in the STEOF basis; (iii) use of the numerically produced STEOF basis to exclude 'too slow' (and thus not correctly represented) processes from the real data. Using vector time series generated numerically by the INM RAS Coupled Climate Model [2], the method allows two climatic modes with noticeably different time scales (3-5 and 9-11 years) to be separated from real SST anomaly data [3]. Relations of the separated modes to ENSO and PDO are investigated. Possible applications of the spatial-temporal climatic pattern concept to forecasting the evolution of the climate system are discussed. 1. Ghil, M., R. M. Allen, M. D. Dettinger, K. Ide, D. Kondrashov, et al. (2002) "Advanced spectral methods for climatic time series", Rev. Geophys. 40(1), 3.1-3.41. 2. http://83.149.207.89/GCM_DATA_PLOTTING/GCM_INM_DATA_XY_en.htm 3. http://iridl.ldeo.columbia.edu/SOURCES/.KAPLAN/.EXTENDED/.v2/.ssta/

  12. Distributions-per-level: a means of testing level detectors and models of patch-clamp data.

    PubMed

    Schröder, I; Huth, T; Suitchmezian, V; Jarosik, J; Schnell, S; Hansen, U P

    2004-01-01

    Level or jump detectors generate the reconstructed time series from a noisy record of patch-clamp current. The reconstructed time series is used to create dwell-time histograms for the kinetic analysis of the Markov model of the investigated ion channel. It is shown here that some additional lines in the software of such a detector can provide a powerful new means of patch-clamp analysis. For each current level that can be recognized by the detector, an array is declared. The new software assigns every data point of the original time series to the array that belongs to the actual state of the detector. From the data sets in these arrays distributions-per-level are generated. Simulated and experimental time series analyzed by Hinkley detectors are used to demonstrate the benefits of these distributions-per-level. First, they can serve as a test of the reliability of jump and level detectors. Second, they can reveal beta distributions as resulting from fast gating that would usually be hidden in the overall amplitude histogram. Probably the most valuable feature is that the malfunctions of the Hinkley detectors turn out to depend on the Markov model of the ion channel. Thus, the errors revealed by the distributions-per-level can be used to distinguish between different putative Markov models of the measured time series.

  13. Filter-based multiscale entropy analysis of complex physiological time series.

    PubMed

    Xu, Yuesheng; Zhao, Liang

    2013-08-01

    Multiscale entropy (MSE) has been widely and successfully used in analyzing the complexity of physiological time series. We reinterpret the averaging process in MSE as filtering a time series by a filter of a piecewise constant type. From this viewpoint, we introduce filter-based multiscale entropy (FME), which filters a time series to generate multiple frequency components, and then we compute the blockwise entropy of the resulting components. By choosing filters adapted to the feature of a given time series, FME is able to better capture its multiscale information and to provide more flexibility for studying its complexity. Motivated by the heart rate turbulence theory, which suggests that the human heartbeat interval time series can be described in piecewise linear patterns, we propose piecewise linear filter multiscale entropy (PLFME) for the complexity analysis of the time series. Numerical results from PLFME are more robust to data of various lengths than those from MSE. The numerical performance of the adaptive piecewise constant filter multiscale entropy without prior information is comparable to that of PLFME, whose design takes prior information into account.
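
    The filtering reinterpretation can be made concrete by noting that classic MSE coarse-graining at scale τ is a boxcar (piecewise-constant) moving-average filter followed by downsampling; the short check below verifies this equivalence and illustrates only that one step, not the entropy computation itself.

      import numpy as np

      def coarse_grain(x, scale):
          """Classic MSE coarse-graining: non-overlapping block averages of length `scale`."""
          n = len(x) // scale
          return x[:n * scale].reshape(n, scale).mean(axis=1)

      def filter_then_downsample(x, scale):
          """Same operation viewed as filtering by a piecewise-constant (boxcar) filter."""
          kernel = np.ones(scale) / scale
          filtered = np.convolve(x, kernel, mode="valid")   # moving average
          return filtered[::scale]                          # keep one value per block

      rng = np.random.default_rng(0)
      x = rng.standard_normal(1000)
      print(np.allclose(coarse_grain(x, 5), filter_then_downsample(x, 5)))   # True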

  14. Multi-site Stochastic Simulation of Daily Streamflow with Markov Chain and KNN Algorithm

    NASA Astrophysics Data System (ADS)

    Mathai, J.; Mujumdar, P.

    2017-12-01

    A key focus of this study is to develop a method that is physically consistent with the hydrologic processes and can capture short-term characteristics of the daily hydrograph as well as the correlation of streamflow in the temporal and spatial domains. In complex water resource systems, flow fluctuations at small time intervals require that discretisation be done at small time scales such as daily scales. Also, simultaneous generation of synthetic flows at different sites in the same basin is required. We propose a method to equip water managers with a streamflow generator within a stochastic streamflow simulation framework. The motivation for the proposed method is to generate sequences that extend beyond the variability represented in the historical record of streamflow time series. The method has two steps: in step 1, daily flow is generated independently at each station by a two-state Markov chain, with rising-limb increments randomly sampled from a Gamma distribution and the falling limb modelled as an exponential recession; in step 2, the streamflow generated in step 1 is input to a nonparametric K-nearest neighbor (KNN) time series bootstrap resampler. The KNN model, being data driven, does not require assumptions on the dependence structure of the time series. A major limitation of KNN-based streamflow generators is that they do not produce new values, but merely reshuffle the historical data to generate realistic streamflow sequences. However, daily flow generated using the Markov chain approach is capable of producing a rich variety of streamflow sequences. Furthermore, the rising and falling limbs of the daily hydrograph represent different physical processes, and hence they need to be modelled individually. Thus, our method combines the strengths of the two approaches. We show the utility of the method and the improvement over the traditional KNN by simulating daily streamflow sequences at 7 locations in the Godavari River basin in India.
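
    A minimal sketch of step 1 only (the two-state Markov chain with Gamma-distributed rising-limb increments and exponential recession); the transition probabilities, Gamma parameters, recession constant and baseflow are hypothetical, and the KNN resampling of step 2 is omitted.

      import numpy as np

      rng = np.random.default_rng(7)

      def markov_chain_streamflow(n_days, p_rise_given_rise=0.6, p_rise_given_fall=0.3,
                                  gamma_shape=2.0, gamma_scale=5.0, recession_k=0.85,
                                  baseflow=1.0):
          """Daily flow: the 'rise' state adds a Gamma increment, the 'fall' state recedes exponentially."""
          q = np.empty(n_days)
          q[0] = baseflow
          rising = False
          for t in range(1, n_days):
              p_rise = p_rise_given_rise if rising else p_rise_given_fall
              rising = rng.random() < p_rise
              if rising:
                  q[t] = q[t - 1] + rng.gamma(gamma_shape, gamma_scale)   # rising limb
              else:
                  q[t] = baseflow + (q[t - 1] - baseflow) * recession_k   # exponential recession
          return q

      flow = markov_chain_streamflow(365 * 3)
      print(round(flow.mean(), 2), round(flow.max(), 2))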

  15. Evaluating the uncertainty of predicting future climate time series at the hourly time scale

    NASA Astrophysics Data System (ADS)

    Caporali, E.; Fatichi, S.; Ivanov, V. Y.

    2011-12-01

    A stochastic downscaling methodology is developed to generate hourly, point-scale time series for several meteorological variables, such as precipitation, cloud cover, shortwave radiation, air temperature, relative humidity, wind speed, and atmospheric pressure. The methodology uses multi-model General Circulation Model (GCM) realizations and an hourly weather generator, AWE-GEN. Probabilistic descriptions of factors of change (a measure of climate change with respect to historic conditions) are computed for several climate statistics and different aggregation times using a Bayesian approach that weights the individual GCM contributions. The Monte Carlo method is applied to sample the factors of change from their respective distributions, thereby permitting the generation of time series in an ensemble fashion that reflects the uncertainty of future climate projections as well as the uncertainty of the downscaling procedure. Applications of the methodology and probabilistic expressions of certainty in reproducing future climates are discussed for the location of Firenze (Italy) for the periods 2000-2009, 2046-2065 and 2081-2100, using the 1962-1992 period as the baseline. The climate predictions for the period 2000-2009 are tested against observations, permitting assessment of the reliability and uncertainties of the methodology in reproducing statistics of meteorological variables at different time scales.

  16. Phenological Parameters Estimation Tool

    NASA Technical Reports Server (NTRS)

    McKellip, Rodney D.; Ross, Kenton W.; Spruce, Joseph P.; Smoot, James C.; Ryan, Robert E.; Gasser, Gerald E.; Prados, Donald L.; Vaughan, Ronald D.

    2010-01-01

    The Phenological Parameters Estimation Tool (PPET) is a set of algorithms implemented in MATLAB that estimates key vegetative phenological parameters. For a given year, the PPET software package takes in temporally processed vegetation index data (3D spatio-temporal arrays) generated by the Time Series Product Tool (TSPT) and outputs spatial grids (2D arrays) of vegetation phenological parameters. As a precursor to PPET, the TSPT uses quality information for each pixel of each date to remove bad or suspect data, and then interpolates and digitally fills data voids in the time series to produce a continuous, smoothed vegetation index product. During processing, the TSPT displays NDVI (Normalized Difference Vegetation Index) time series plots and images from the temporally processed pixels. Both the TSPT and PPET currently use Moderate Resolution Imaging Spectroradiometer (MODIS) satellite multispectral data as a default, but each software package is modifiable and could be used with any high-temporal-rate remote sensing data collection system that is capable of producing vegetation indices. Raw MODIS data from the Aqua and Terra satellites are processed using the TSPT to generate a filtered time series data product. The PPET then uses the TSPT output to generate phenological parameters for desired locations. PPET output data tiles are mosaicked into a Conterminous United States (CONUS) data layer using ERDAS IMAGINE or an equivalent software package. Mosaics of the vegetation phenology data products are then reprojected to the desired map projection using ERDAS IMAGINE.

  17. Signatures of ecological processes in microbial community time series.

    PubMed

    Faust, Karoline; Bauchinger, Franziska; Laroche, Béatrice; de Buyl, Sophie; Lahti, Leo; Washburne, Alex D; Gonze, Didier; Widder, Stefanie

    2018-06-28

    Growth rates, interactions between community members, stochasticity, and immigration are important drivers of microbial community dynamics. In sequencing data analysis, such as network construction and community model parameterization, we make implicit assumptions about the nature of these drivers and thereby restrict model outcome. Despite apparent risk of methodological bias, the validity of the assumptions is rarely tested, as comprehensive procedures are lacking. Here, we propose a classification scheme to determine the processes that gave rise to the observed time series and to enable better model selection. We implemented a three-step classification scheme in R that first determines whether dependence between successive time steps (temporal structure) is present in the time series and then assesses with a recently developed neutrality test whether interactions between species are required for the dynamics. If the first and second tests confirm the presence of temporal structure and interactions, then parameters for interaction models are estimated. To quantify the importance of temporal structure, we compute the noise-type profile of the community, which ranges from black in case of strong dependency to white in the absence of any dependency. We applied this scheme to simulated time series generated with the Dirichlet-multinomial (DM) distribution, Hubbell's neutral model, the generalized Lotka-Volterra model and its discrete variant (the Ricker model), and a self-organized instability model, as well as to human stool microbiota time series. The noise-type profiles for all but DM data clearly indicated distinctive structures. The neutrality test correctly classified all but DM and neutral time series as non-neutral. The procedure reliably identified time series for which interaction inference was suitable. Both tests were required, as we demonstrated that all structured time series, including those generated with the neutral model, achieved a moderate to high goodness of fit to the Ricker model. We present a fast and robust scheme to classify community structure and to assess the prevalence of interactions directly from microbial time series data. The procedure not only serves to determine ecological drivers of microbial dynamics, but also to guide selection of appropriate community models for prediction and follow-up analysis.

  18. Estimation of different data compositions for early-season crop type classification.

    PubMed

    Hao, Pengyu; Wu, Mingquan; Niu, Zheng; Wang, Li; Zhan, Yulin

    2018-01-01

    Timely and accurate crop type distribution maps are important inputs for crop yield estimation and production forecasting, as multi-temporal images can observe phenological differences among crops. Therefore, time series remote sensing data are essential for crop type mapping, and image composition has commonly been used to improve the quality of the image time series. However, the optimal composition period is unclear, as long composition periods (such as compositions lasting half a year) are less informative and short composition periods lead to information redundancy and missing pixels. In this study, we initially acquired daily 30 m Normalized Difference Vegetation Index (NDVI) time series by fusing MODIS, Landsat, Gaofen and Huanjing (HJ) NDVI, and then composited the NDVI time series using four strategies (daily, 8-day, 16-day, and 32-day). We used Random Forest to identify crop types and evaluated the classification performance of the NDVI time series generated from the four composition strategies in two study regions in Xinjiang, China. Results indicated that crop classification performance improved as crop separabilities and classification accuracies increased, and classification uncertainties dropped in the green-up stage of the crops. When using the daily NDVI time series, overall accuracies saturated at 113-day and 116-day in Bole and Luntai, and the saturated overall accuracies (OAs) were 86.13% and 91.89%, respectively. Cotton could be identified 40∼60 days and 35∼45 days earlier than the harvest in Bole and Luntai when using the daily, 8-day and 16-day composition NDVI time series, since both producer's accuracies (PAs) and user's accuracies (UAs) were higher than 85%. Among the four compositions, the daily NDVI time series generated the highest classification accuracies. Although the 8-day, 16-day and 32-day compositions had similar saturated overall accuracies (around 85% in Bole and 83% in Luntai), the 8-day and 16-day compositions achieved these accuracies around 155-day in Bole and 133-day in Luntai, which was earlier than the 32-day composition (170-day in both Bole and Luntai). Therefore, when the daily NDVI time series cannot be acquired, the 16-day composition is recommended in this study.
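
    To make the composition strategies concrete, the sketch below turns a daily NDVI series into n-day composites using a maximum-value rule; the study does not state its exact compositing operator, so the max rule and the synthetic NDVI curve are assumptions.

      import warnings
      import numpy as np

      def composite(ndvi_daily, period):
          """n-day maximum-value composite of a daily NDVI series (NaN = missing pixel)."""
          n = int(np.ceil(len(ndvi_daily) / period))
          padded = np.full(n * period, np.nan)
          padded[:len(ndvi_daily)] = ndvi_daily
          with warnings.catch_warnings():
              warnings.simplefilter("ignore", RuntimeWarning)   # all-NaN blocks stay NaN
              return np.nanmax(padded.reshape(n, period), axis=1)

      rng = np.random.default_rng(3)
      doy = np.arange(365)
      ndvi = 0.3 + 0.4 * np.sin(np.pi * doy / 365) + 0.05 * rng.standard_normal(365)
      ndvi[rng.random(365) < 0.3] = np.nan          # simulate cloud-contaminated days
      for period in (8, 16, 32):
          comp = composite(ndvi, period)
          print(period, "composites:", len(comp), "missing:", int(np.isnan(comp).sum()))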

  19. Estimation of different data compositions for early-season crop type classification

    PubMed Central

    Wu, Mingquan; Wang, Li; Zhan, Yulin

    2018-01-01

    Timely and accurate crop type distribution maps are important inputs for crop yield estimation and production forecasting, as multi-temporal images can observe phenological differences among crops. Therefore, time series remote sensing data are essential for crop type mapping, and image composition has commonly been used to improve the quality of the image time series. However, the optimal composition period is unclear, as long composition periods (such as compositions lasting half a year) are less informative and short composition periods lead to information redundancy and missing pixels. In this study, we initially acquired daily 30 m Normalized Difference Vegetation Index (NDVI) time series by fusing MODIS, Landsat, Gaofen and Huanjing (HJ) NDVI, and then composited the NDVI time series using four strategies (daily, 8-day, 16-day, and 32-day). We used Random Forest to identify crop types and evaluated the classification performance of the NDVI time series generated from the four composition strategies in two study regions in Xinjiang, China. Results indicated that crop classification performance improved as crop separabilities and classification accuracies increased, and classification uncertainties dropped in the green-up stage of the crops. When using the daily NDVI time series, overall accuracies saturated at 113-day and 116-day in Bole and Luntai, and the saturated overall accuracies (OAs) were 86.13% and 91.89%, respectively. Cotton could be identified 40∼60 days and 35∼45 days earlier than the harvest in Bole and Luntai when using the daily, 8-day and 16-day composition NDVI time series, since both producer's accuracies (PAs) and user's accuracies (UAs) were higher than 85%. Among the four compositions, the daily NDVI time series generated the highest classification accuracies. Although the 8-day, 16-day and 32-day compositions had similar saturated overall accuracies (around 85% in Bole and 83% in Luntai), the 8-day and 16-day compositions achieved these accuracies around 155-day in Bole and 133-day in Luntai, which was earlier than the 32-day composition (170-day in both Bole and Luntai). Therefore, when the daily NDVI time series cannot be acquired, the 16-day composition is recommended in this study. PMID:29868265

  20. Synthetic Generation of Myocardial Blood-Oxygen-Level-Dependent MRI Time Series via Structural Sparse Decomposition Modeling

    PubMed Central

    Rusu, Cristian; Morisi, Rita; Boschetto, Davide; Dharmakumar, Rohan; Tsaftaris, Sotirios A.

    2014-01-01

    This paper aims to identify approaches that generate appropriate synthetic data (computer generated) for Cardiac Phase-resolved Blood-Oxygen-Level-Dependent (CP-BOLD) MRI. CP-BOLD MRI is a new contrast agent- and stress-free approach for examining changes in myocardial oxygenation in response to coronary artery disease. However, since signal intensity changes are subtle, rapid visualization is not possible with the naked eye. Quantifying and visualizing the extent of disease relies on myocardial segmentation and registration to isolate the myocardium and establish temporal correspondences, and on ischemia detection algorithms to identify temporal differences in BOLD signal intensity patterns. If transmurality of the defect is of interest, pixel-level analysis is necessary and thus a higher precision in registration is required. Such precision is currently not available, affecting the design and performance of the ischemia detection algorithms. In this work, to enable algorithmic developments of ischemia detection irrespective of registration accuracy, we propose an approach that generates synthetic pixel-level myocardial time series. We do this by (a) modeling the temporal changes in BOLD signal intensity based on sparse multi-component dictionary learning, whereby segmentally derived myocardial time series are extracted from canine experimental data to learn the model; and (b) demonstrating the resemblance between real and synthetic time series for validation purposes. We envision that the proposed approach has the capacity to accelerate development of tools for ischemia detection while markedly reducing experimental costs so that cardiac BOLD MRI can be rapidly translated into the clinical arena for the noninvasive assessment of ischemic heart disease. PMID:24691119

  1. Synthetic generation of myocardial blood-oxygen-level-dependent MRI time series via structural sparse decomposition modeling.

    PubMed

    Rusu, Cristian; Morisi, Rita; Boschetto, Davide; Dharmakumar, Rohan; Tsaftaris, Sotirios A

    2014-07-01

    This paper aims to identify approaches that generate appropriate synthetic data (computer generated) for cardiac phase-resolved blood-oxygen-level-dependent (CP-BOLD) MRI. CP-BOLD MRI is a new contrast agent- and stress-free approach for examining changes in myocardial oxygenation in response to coronary artery disease. However, since signal intensity changes are subtle, rapid visualization is not possible with the naked eye. Quantifying and visualizing the extent of disease relies on myocardial segmentation and registration to isolate the myocardium and establish temporal correspondences, and on ischemia detection algorithms to identify temporal differences in BOLD signal intensity patterns. If transmurality of the defect is of interest, pixel-level analysis is necessary and thus a higher precision in registration is required. Such precision is currently not available, affecting the design and performance of the ischemia detection algorithms. In this work, to enable algorithmic developments of ischemia detection irrespective of registration accuracy, we propose an approach that generates synthetic pixel-level myocardial time series. We do this by 1) modeling the temporal changes in BOLD signal intensity based on sparse multi-component dictionary learning, whereby segmentally derived myocardial time series are extracted from canine experimental data to learn the model; and 2) demonstrating the resemblance between real and synthetic time series for validation purposes. We envision that the proposed approach has the capacity to accelerate development of tools for ischemia detection while markedly reducing experimental costs so that cardiac BOLD MRI can be rapidly translated into the clinical arena for the noninvasive assessment of ischemic heart disease.

  2. The Effect of Non-Normal Distributions on the Integrated Moving Average Model of Time-Series Analysis.

    ERIC Educational Resources Information Center

    Doerann-George, Judith

    The Integrated Moving Average (IMA) model of time series, and the analysis of intervention effects based on it, assume random shocks which are normally distributed. To determine the robustness of the analysis to violations of this assumption, empirical sampling methods were employed. Samples were generated from three populations; normal,…

  3. A Comparison of Alternative Approaches to the Analysis of Interrupted Time-Series.

    ERIC Educational Resources Information Center

    Harrop, John W.; Velicer, Wayne F.

    1985-01-01

    Computer generated data representative of 16 Auto Regressive Integrated Moving Averages (ARIMA) models were used to compare the results of interrupted time-series analysis using: (1) the known model identification, (2) an assumed (l,0,0) model, and (3) an assumed (3,0,0) model as an approximation to the General Transformation approach. (Author/BW)

  4. An Investigation of Acoustic Interaction with the Ocean Bottom from Experimental Time Series Generated by Explosive Sources.

    DTIC Science & Technology

    1983-12-01

    near the turbidity channels. Furthermore, Hastrup concludes, after an analysis of time series data taken from the Tyrrhenian abyssal plain, that the top... Bottom-Interacting Ocean Acoustics, edited by W. A. Kuperman and F. B. Jensen (Plenum Press, New York, 1980). O. F. Hastrup, "Digital Analysis of...

  5. Multifractal surrogate-data generation algorithm that preserves pointwise Hölder regularity structure, with initial applications to turbulence

    NASA Astrophysics Data System (ADS)

    Keylock, C. J.

    2017-03-01

    An algorithm is described that can generate random variants of a time series while preserving the probability distribution of original values and the pointwise Hölder regularity. Thus, it preserves the multifractal properties of the data. Our algorithm is similar in principle to well-known algorithms based on the preservation of the Fourier amplitude spectrum and original values of a time series. However, it is underpinned by a dual-tree complex wavelet transform rather than a Fourier transform. Our method, which we term the iterated amplitude adjusted wavelet transform, can be used to generate bootstrapped versions of multifractal data, and because it preserves the pointwise Hölder regularity but not the local Hölder regularity, it can be used to test hypotheses concerning the presence of oscillating singularities in a time series, an important feature of turbulence and econophysics data. Because the locations of the data values are randomized with respect to the multifractal structure, hypotheses about their mutual coupling can be tested, which is important for the velocity-intermittency structure of turbulence and self-regulating processes.
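
    For comparison, the well-known Fourier-based surrogate method referred to above (the iterated amplitude adjusted Fourier transform, IAAFT) can be sketched as follows; it preserves the amplitude spectrum and the original value distribution but, unlike the wavelet-based algorithm of this paper, not the pointwise Hölder regularity.

      import numpy as np

      def iaaft(x, n_iter=100, seed=0):
          """Iterated amplitude adjusted Fourier transform surrogate of a 1-D series."""
          rng = np.random.default_rng(seed)
          x = np.asarray(x, dtype=float)
          sorted_vals = np.sort(x)                  # value distribution to preserve
          target_amp = np.abs(np.fft.rfft(x))       # amplitude spectrum to preserve
          s = rng.permutation(x)                    # start from a random shuffle
          for _ in range(n_iter):
              # step 1: impose the target amplitude spectrum, keep current phases
              phases = np.angle(np.fft.rfft(s))
              y = np.fft.irfft(target_amp * np.exp(1j * phases), n=len(x))
              # step 2: impose the original value distribution by rank ordering
              ranks = np.argsort(np.argsort(y))
              s = sorted_vals[ranks]
          return s

      rng = np.random.default_rng(1)
      x = np.cumsum(rng.standard_normal(1024))      # a correlated test series
      surr = iaaft(x)
      print(np.allclose(np.sort(surr), np.sort(x)))                         # same values
      print(np.corrcoef(np.abs(np.fft.rfft(surr)), np.abs(np.fft.rfft(x)))[0, 1])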

  6. Improvement of downscaled rainfall and temperature across generations over the Western Himalayan region of India

    NASA Astrophysics Data System (ADS)

    Das, L.; Dutta, M.; Akhter, J.; Meher, J. K.

    2016-12-01

    It is a challenging task to create station-level (local-scale) climate change information for the mountainous locations of the Western Himalayan Region (WHR) of India because of limited data availability and poor data quality. In the present study, missing values in the station data were handled through the Multiple Imputation by Chained Equations (MICE) technique. Finally, 22 rain gauge stations and 16 temperature stations with continuous records during 1901-2005 and 1969-2009, respectively, were considered as reference stations for developing downscaled rainfall and temperature time series from five GCMs commonly available in the IPCC's successive assessment reports, namely the 2nd, 3rd, 4th and 5th, hereafter known as SAR, TAR, AR4 and AR5 respectively. Downscaled models were developed using the combined data from the ERA-Interim reanalysis and the GCM historical runs (even though the forcings were not identical across generations) as predictors and station-level rainfall and temperature as predictands. Station-level downscaled rainfall and temperature time series were constructed for the five GCMs available in each generation. A regionally averaged downscaled time series comprising all stations was prepared for each model and generation, and the downscaled results were compared with the observed time series. Finally, an Overall Model Improvement Index (OMII) was developed from the downscaling results and used to investigate model improvement across generations as well as the improvement of the downscaling results obtained from Empirical Statistical Downscaling (ESD) methods. In the case of temperature, models have improved from SAR to AR5 over the study area. In almost all the GCMs, TAR shows the worst performance over the WHR according to the different statistical indices used in this study. In the case of precipitation, no model shows gradual improvement from SAR to AR5 for either interpolated or downscaled values.

  7. Multidimensional scaling analysis of financial time series based on modified cross-sample entropy methods

    NASA Astrophysics Data System (ADS)

    He, Jiayi; Shang, Pengjian; Xiong, Hui

    2018-06-01

    Stocks, as a concrete manifestation of financial time series with plenty of potential information, are often used in the study of financial time series. In this paper, we utilize stock data to recognize patterns through a dissimilarity matrix based on modified cross-sample entropy, and three-dimensional perceptual maps of the results are then provided through multidimensional scaling. Two modified multidimensional scaling methods are proposed in this paper, that is, multidimensional scaling based on Kronecker-delta cross-sample entropy (MDS-KCSE) and multidimensional scaling based on permutation cross-sample entropy (MDS-PCSE). These two methods use Kronecker-delta-based cross-sample entropy and permutation-based cross-sample entropy to replace the distance or dissimilarity measurement in classical multidimensional scaling (MDS). Multidimensional scaling based on Chebyshev distance (MDSC) is employed to provide a reference for comparisons. Our analysis reveals clear clustering both in synthetic data and in 18 indices from diverse stock markets. It implies that time series generated by the same model are more likely to have similar irregularity than others, and that differences among stock indices, caused by country or region and by different financial policies, are reflected in the irregularity of the data. In the synthetic data experiments, not only can the time series generated by different models be distinguished, but those generated under different parameters of the same model can also be detected. In the financial data experiment, the stock indices are clearly divided into five groups. Through analysis, we find that they correspond to five regions, respectively, that is, Europe, North America, South America, Asia-Pacific (with the exception of mainland China), and mainland China and Russia. The results also demonstrate that MDS-KCSE and MDS-PCSE provide more effective divisions in experiments than MDSC.

  8. Detecting chaos in irregularly sampled time series.

    PubMed

    Kulp, C W

    2013-09-01

    Recently, Wiebe and Virgin [Chaos 22, 013136 (2012)] developed an algorithm which detects chaos by analyzing a time series' power spectrum which is computed using the Discrete Fourier Transform (DFT). Their algorithm, like other time series characterization algorithms, requires that the time series be regularly sampled. Real-world data, however, are often irregularly sampled, thus, making the detection of chaotic behavior difficult or impossible with those methods. In this paper, a characterization algorithm is presented, which effectively detects chaos in irregularly sampled time series. The work presented here is a modification of Wiebe and Virgin's algorithm and uses the Lomb-Scargle Periodogram (LSP) to compute a series' power spectrum instead of the DFT. The DFT is not appropriate for irregularly sampled time series. However, the LSP is capable of computing the frequency content of irregularly sampled data. Furthermore, a new method of analyzing the power spectrum is developed, which can be useful for differentiating between chaotic and non-chaotic behavior. The new characterization algorithm is successfully applied to irregularly sampled data generated by a model as well as data consisting of observations of variable stars.
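
    A minimal sketch of the power-spectrum step for irregularly sampled data using SciPy's Lomb-Scargle periodogram; the paper's subsequent spectrum-analysis heuristics for deciding chaotic versus non-chaotic behavior are not reproduced.

      import numpy as np
      from scipy.signal import lombscargle

      rng = np.random.default_rng(0)

      # Irregularly sampled signal: a 0.5 Hz sinusoid observed at random times
      t = np.sort(rng.uniform(0, 100, 400))
      y = np.sin(2 * np.pi * 0.5 * t) + 0.3 * rng.standard_normal(t.size)

      freqs = np.linspace(0.01, 2.0, 2000)               # trial frequencies in Hz
      omega = 2 * np.pi * freqs                           # lombscargle expects angular frequency
      power = lombscargle(t, y - y.mean(), omega)         # precenter by subtracting the mean

      print("peak at ~%.2f Hz" % freqs[np.argmax(power)])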

  9. Status, trends, and changes in freshwater inflows to bay systems in the Corpus Christi Bay National Estuary Program study area

    USGS Publications Warehouse

    Asquith, W.H.; Mosier, J. G.; Bush, P.W.

    1997-01-01

    The watershed simulation model Hydrologic Simulation Program—Fortran (HSPF) was used to generate simulated flow (runoff) from the 13 watersheds to the six bay systems because adequate gaged streamflow data from which to estimate freshwater inflows are not available; only about 23 percent of the adjacent contributing watershed area is gaged. The model was calibrated for the gaged parts of three watersheds—that is, selected input parameters (meteorologic and hydrologic properties and conditions) that control runoff were adjusted in a series of simulations until an adequate match between model-generated flows and a set (time series) of gaged flows was achieved. The primary model input is rainfall and evaporation data and the model output is a time series of runoff volumes. After calibration, simulations driven by daily rainfall for a 26-year period (1968–93) were done for the 13 watersheds to obtain runoff under current (1983–93), predevelopment (pre-1940 streamflow and pre-urbanization), and future (2010) land-use conditions for estimating freshwater inflows and for comparing runoff under the three land-use conditions; and to obtain time series of runoff from which to estimate time series of freshwater inflows for trend analysis.

  10. Introduction and application of the multiscale coefficient of variation analysis.

    PubMed

    Abney, Drew H; Kello, Christopher T; Balasubramaniam, Ramesh

    2017-10-01

    Quantifying how patterns of behavior relate across multiple levels of measurement typically requires long time series for reliable parameter estimation. We describe a novel analysis that estimates patterns of variability across multiple scales of analysis suitable for time series of short duration. The multiscale coefficient of variation (MSCV) measures the distance between local coefficient of variation estimates within particular time windows and the overall coefficient of variation across all time samples. We first describe the MSCV analysis and provide an example analytical protocol with corresponding MATLAB implementation and code. Next, we present a simulation study testing the new analysis using time series generated by ARFIMA models that span white noise, short-term and long-term correlations. The MSCV analysis was observed to be sensitive to specific parameters of ARFIMA models varying in the type of temporal structure and time series length. We then apply the MSCV analysis to short time series of speech phrases and musical themes to show commonalities in multiscale structure. The simulation and application studies provide evidence that the MSCV analysis can discriminate between time series varying in multiscale structure and length.
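
    One literal reading of the description above (local coefficients of variation in non-overlapping windows compared with the overall coefficient of variation) is sketched below; the exact MSCV formula in the paper may differ, and the window sizes and test series are arbitrary.

      import numpy as np

      def mscv(x, window_sizes=(5, 10, 20, 40)):
          """Per window size: mean absolute distance between local CVs and the overall CV."""
          x = np.asarray(x, dtype=float)
          overall_cv = x.std() / x.mean()
          out = {}
          for w in window_sizes:
              n = len(x) // w
              blocks = x[:n * w].reshape(n, w)
              local_cv = blocks.std(axis=1) / blocks.mean(axis=1)
              out[w] = np.abs(local_cv - overall_cv).mean()
          return out

      rng = np.random.default_rng(2)
      iid = rng.gamma(2.0, 1.0, 400)                                         # uncorrelated control
      bursty = np.concatenate([rng.gamma(2.0, s, 40) for s in rng.uniform(0.5, 2.0, 10)])
      print("iid   :", mscv(iid))
      print("bursty:", mscv(bursty))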

  11. False-nearest-neighbors algorithm and noise-corrupted time series

    NASA Astrophysics Data System (ADS)

    Rhodes, Carl; Morari, Manfred

    1997-05-01

    The false-nearest-neighbors (FNN) algorithm was originally developed to determine the embedding dimension for autonomous time series. For noise-free computer-generated time series, the algorithm does a good job in predicting the embedding dimension. However, the problem of predicting the embedding dimension when the time-series data are corrupted by noise was not fully examined in the original studies of the FNN algorithm. Here it is shown that with large data sets, even small amounts of noise can lead to incorrect prediction of the embedding dimension. Surprisingly, as the length of the time series analyzed by FNN grows larger, the cause of incorrect prediction becomes more pronounced. An analysis of the effect of noise on the FNN algorithm and a solution for dealing with the effects of noise are given here. Some results on the theoretically correct choice of the FNN threshold are also presented.
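
    A compact sketch of the basic false-nearest-neighbors test (a single distance-ratio criterion in the spirit of Kennel et al.); the noise effects and threshold corrections analyzed in this paper are not included, and the threshold value is an assumption.

      import numpy as np

      def fnn_fraction(x, dim, tau=1, rtol=15.0):
          """Fraction of false nearest neighbors at embedding dimension `dim`."""
          x = np.asarray(x, dtype=float)
          n = len(x) - (dim + 1) * tau          # leave room for the (dim+1)-th coordinate
          emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
          extra = x[dim * tau:dim * tau + n]    # coordinate added when going to dimension dim+1
          false = 0
          for i in range(n):
              d = np.linalg.norm(emb - emb[i], axis=1)
              d[i] = np.inf                      # exclude the point itself
              j = np.argmin(d)
              if abs(extra[i] - extra[j]) / d[j] > rtol:
                  false += 1
          return false / n

      # Henon map x-coordinate: the attractor needs an embedding dimension of about 2
      n = 2000
      x = np.empty(n); y = np.empty(n)
      x[0], y[0] = 0.1, 0.1
      for t in range(n - 1):
          x[t + 1] = 1.0 - 1.4 * x[t] ** 2 + y[t]
          y[t + 1] = 0.3 * x[t]
      for dim in (1, 2, 3):
          print(dim, round(fnn_fraction(x, dim), 3))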

  12. A Computer Program for the Generation of ARIMA Data

    ERIC Educational Resources Information Center

    Green, Samuel B.; Noles, Keith O.

    1977-01-01

    The autoregressive integrated moving average (ARIMA) model has been applied to time series data in psychological and educational research. A program is described that generates ARIMA data of a known order. The program enables researchers to explore statistical properties of ARIMA data and simulate systems producing time-dependent observations.…
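
    A present-day equivalent of such a generator (not the original program) is easy to sketch in Python: simulate the ARMA part with known coefficients and integrate d times; the orders and coefficients below are arbitrary examples.

      import numpy as np
      from statsmodels.tsa.arima_process import ArmaProcess

      def generate_arima(n, ar=(0.6,), ma=(0.3,), d=1, seed=0):
          """Simulate an ARIMA(p, d, q) series with known coefficients."""
          np.random.seed(seed)
          # ArmaProcess expects lag-polynomial coefficients: [1, -phi1, ...] and [1, theta1, ...]
          process = ArmaProcess(ar=np.r_[1, -np.asarray(ar)], ma=np.r_[1, np.asarray(ma)])
          x = process.generate_sample(nsample=n)
          for _ in range(d):                      # integrate d times
              x = np.cumsum(x)
          return x

      y = generate_arima(500, ar=(0.6,), ma=(0.3,), d=1)
      print(y[:5])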

  13. Weighted combination of LOD values split into frequency windows

    NASA Astrophysics Data System (ADS)

    Fernandez, L. I.; Gambis, D.; Arias, E. F.

    In this analysis, a one-day combined time series of LOD (length-of-day) estimates is presented. We use individual data series derived by 7 GPS and 3 SLR analysis centers, which routinely contributed to the IERS database over a recent 27-month period (Jul 1996 - Oct 1998). The result is compared to the multi-technique combined series C04 produced by the Central Bureau of the IERS, which is commonly used as a reference for the study of Earth rotation variations. The Frequency Windows Combined Series procedure yields a time series which is close to C04 but shows an amplitude difference that might explain the evident periodic behavior present in the differences between these two combined series. This method could be useful for generating a new time series to be used as a reference in studies of high-frequency variations of the Earth's rotation.

  14. Monitoring land surface albedo and vegetation dynamics using high spatial and temporal resolution synthetic time series from Landsat and the MODIS BRDF/NBAR/albedo product

    NASA Astrophysics Data System (ADS)

    Wang, Zhuosen; Schaaf, Crystal B.; Sun, Qingsong; Kim, JiHyun; Erb, Angela M.; Gao, Feng; Román, Miguel O.; Yang, Yun; Petroy, Shelley; Taylor, Jeffrey R.; Masek, Jeffrey G.; Morisette, Jeffrey T.; Zhang, Xiaoyang; Papuga, Shirley A.

    2017-07-01

    Seasonal vegetation phenology can significantly alter surface albedo which in turn affects the global energy balance and the albedo warming/cooling feedbacks that impact climate change. To monitor and quantify the surface dynamics of heterogeneous landscapes, high temporal and spatial resolution synthetic time series of albedo and the enhanced vegetation index (EVI) were generated from the 500 m Moderate Resolution Imaging Spectroradiometer (MODIS) operational Collection V006 daily BRDF/NBAR/albedo products and 30 m Landsat 5 albedo and near-nadir reflectance data through the use of the Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM). The traditional Landsat Albedo (Shuai et al., 2011) makes use of the MODIS BRDF/Albedo products (MCD43) by assigning appropriate BRDFs from coincident MODIS products to each Landsat image to generate a 30 m Landsat albedo product for that acquisition date. The available cloud free Landsat 5 albedos (due to clouds, generated every 16 days at best) were used in conjunction with the daily MODIS albedos to determine the appropriate 30 m albedos for the intervening daily time steps in this study. These enhanced daily 30 m spatial resolution synthetic time series were then used to track albedo and vegetation phenology dynamics over three Ameriflux tower sites (Harvard Forest in 2007, Santa Rita in 2011 and Walker Branch in 2005). These Ameriflux sites were chosen as they are all quite near new towers coming on line for the National Ecological Observatory Network (NEON), and thus represent locations which will be served by spatially paired albedo measures in the near future. The availability of data from the NEON towers will greatly expand the sources of tower albedometer data available for evaluation of satellite products. At these three Ameriflux tower sites the synthetic time series of broadband shortwave albedos were evaluated using the tower albedo measurements with a Root Mean Square Error (RMSE) less than 0.013 and a bias within the range of ±0.006. These synthetic time series provide much greater spatial detail than the 500 m gridded MODIS data, especially over more heterogeneous surfaces, which improves the efforts to characterize and monitor the spatial variation across species and communities. The mean of the difference between maximum and minimum synthetic time series of albedo within the MODIS pixels over a subset of satellite data of Harvard Forest (16 km by 14 km) was as high as 0.2 during the snow-covered period and reduced to around 0.1 during the snow-free period. Similarly, we have used STARFM to also couple MODIS Nadir BRDF Adjusted Reflectance (NBAR) values with Landsat 5 reflectances to generate daily synthetic time series of NBAR and thus Enhanced Vegetation Index (NBAR-EVI) at a 30 m resolution. While normally STARFM is used with directional reflectances, the use of the view angle corrected daily MODIS NBAR values will provide more consistent time series. These synthetic time series of EVI are shown to capture seasonal vegetation dynamics with finer spatial and temporal details, especially over heterogeneous land surfaces.

  15. Monitoring land surface albedo and vegetation dynamics using high spatial and temporal resolution synthetic time series from Landsat and the MODIS BRDF/NBAR/albedo product

    USGS Publications Warehouse

    Wang, Zhuosen; Schaaf, Crystal B.; Sun, Qingson; Kim, JiHyun; Erb, Angela M.; Gao, Feng; Roman, Miguel O.; Yang, Yun; Petroy, Shelley; Taylor, Jeffrey; Masek, Jeffrey G.; Morisette, Jeffrey T.; Zhang, Xiaoyang; Papuga, Shirley A.

    2017-01-01

    Seasonal vegetation phenology can significantly alter surface albedo which in turn affects the global energy balance and the albedo warming/cooling feedbacks that impact climate change. To monitor and quantify the surface dynamics of heterogeneous landscapes, high temporal and spatial resolution synthetic time series of albedo and the enhanced vegetation index (EVI) were generated from the 500 m Moderate Resolution Imaging Spectroradiometer (MODIS) operational Collection V006 daily BRDF/NBAR/albedo products and 30 m Landsat 5 albedo and near-nadir reflectance data through the use of the Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM). The traditional Landsat Albedo (Shuai et al., 2011) makes use of the MODIS BRDF/Albedo products (MCD43) by assigning appropriate BRDFs from coincident MODIS products to each Landsat image to generate a 30 m Landsat albedo product for that acquisition date. The available cloud free Landsat 5 albedos (due to clouds, generated every 16 days at best) were used in conjunction with the daily MODIS albedos to determine the appropriate 30 m albedos for the intervening daily time steps in this study. These enhanced daily 30 m spatial resolution synthetic time series were then used to track albedo and vegetation phenology dynamics over three Ameriflux tower sites (Harvard Forest in 2007, Santa Rita in 2011 and Walker Branch in 2005). These Ameriflux sites were chosen as they are all quite near new towers coming on line for the National Ecological Observatory Network (NEON), and thus represent locations which will be served by spatially paired albedo measures in the near future. The availability of data from the NEON towers will greatly expand the sources of tower albedometer data available for evaluation of satellite products. At these three Ameriflux tower sites the synthetic time series of broadband shortwave albedos were evaluated using the tower albedo measurements with a Root Mean Square Error (RMSE) less than 0.013 and a bias within the range of ±0.006. These synthetic time series provide much greater spatial detail than the 500 m gridded MODIS data, especially over more heterogeneous surfaces, which improves the efforts to characterize and monitor the spatial variation across species and communities. The mean of the difference between maximum and minimum synthetic time series of albedo within the MODIS pixels over a subset of satellite data of Harvard Forest (16 km by 14 km) was as high as 0.2 during the snow-covered period and reduced to around 0.1 during the snow-free period. Similarly, we have used STARFM to also couple MODIS Nadir BRDF Adjusted Reflectance (NBAR) values with Landsat 5 reflectances to generate daily synthetic time series of NBAR and thus Enhanced Vegetation Index (NBAR-EVI) at a 30 m resolution. While normally STARFM is used with directional reflectances, the use of the view angle corrected daily MODIS NBAR values will provide more consistent time series. These synthetic time series of EVI are shown to capture seasonal vegetation dynamics with finer spatial and temporal details, especially over heterogeneous land surfaces.

  16. Monitoring Land Surface Albedo and Vegetation Dynamics Using High Spatial and Temporal Resolution Synthetic Time Series from Landsat and the MODIS BRDF/NBAR/Albedo Product

    NASA Technical Reports Server (NTRS)

    Wang, Zhuosen; Schaaf, Crystal B.; Sun, Qingsong; Kim, Jihyun; Erb, Angela M.; Gao, Feng; Roman, Miguel O.; Yang, Yun; Petroy, Shelley; Taylor, Jeffrey R.

    2017-01-01

    Seasonal vegetation phenology can significantly alter surface albedo which in turn affects the global energy balance and the albedo warming/cooling feedbacks that impact climate change. To monitor and quantify the surface dynamics of heterogeneous landscapes, high temporal and spatial resolution synthetic time series of albedo and the enhanced vegetation index (EVI) were generated from the 500-meter Moderate Resolution Imaging Spectroradiometer (MODIS) operational Collection V006 daily BRDF (Bidirectional Reflectance Distribution Function) / NBAR (Nadir BRDF-Adjusted Reflectance) / albedo products and 30-meter Landsat 5 albedo and near-nadir reflectance data through the use of the Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM). The traditional Landsat Albedo (Shuai et al., 2011) makes use of the MODIS BRDF/Albedo products (MCD43) by assigning appropriate BRDFs from coincident MODIS products to each Landsat image to generate a 30-meter Landsat albedo product for that acquisition date. The available cloud free Landsat 5 albedos (due to clouds, generated every 16 days at best) were used in conjunction with the daily MODIS albedos to determine the appropriate 30-meter albedos for the intervening daily time steps in this study. These enhanced daily 30-meter spatial resolution synthetic time series were then used to track albedo and vegetation phenology dynamics over three Ameriflux tower sites (Harvard Forest in 2007, Santa Rita in 2011 and Walker Branch in 2005). These Ameriflux sites were chosen as they are all located quite near new towers coming online for the National Ecological Observatory Network (NEON), and thus represent locations which will be served by spatially paired albedo measures in the near future. The availability of data from the NEON towers will greatly expand the sources of tower albedometer data available for evaluation of satellite products. At these three Ameriflux tower sites the synthetic time series of broadband shortwave albedos were evaluated using the tower albedo measurements with a Root Mean Square Error (RMSE) less than 0.013 and a bias within the range of ±0.006. These synthetic time series provide much greater spatial detail than the 500 meter gridded MODIS data, especially over more heterogeneous surfaces, which improves the efforts to characterize and monitor the spatial variation across species and communities. The mean of the difference between maximum and minimum synthetic time series of albedo within the MODIS pixels over a subset of satellite data of Harvard Forest (16 kilometers by 14 kilometers) was as high as 0.2 during the snow-covered period and reduced to around 0.1 during the snow-free period. Similarly, we have used STARFM to also couple MODIS Nadir BRDF-Adjusted Reflectance (NBAR) values with Landsat 5 reflectances to generate daily synthetic time series of NBAR and thus Enhanced Vegetation Index (NBAR-EVI) at a 30-meter resolution. While normally STARFM is used with directional reflectances, the use of the view angle corrected daily MODIS NBAR values will provide more consistent time series. These synthetic time series of EVI are shown to capture seasonal vegetation dynamics with finer spatial and temporal details, especially over heterogeneous land surfaces.

  17. Forecasting the Relative and Cumulative Effects of Multiple Stressors on At-risk Populations

    DTIC Science & Technology

    2011-08-01

    Vitals (observed vital rates), Movement, Ranges, Barriers (barrier interactions), Stochasticity (a time series of stochasticity indices...Simulation Viewer are themselves stochastic. They can change each time it is run. B. 196 Analysis If multiple Census events are present in the life...30-year period. A monthly time series was generated for the 20th-century using monthly anomalies for temperature, precipitation, and percent

  18. Wind, Wave, and Tidal Energy Without Power Conditioning

    NASA Technical Reports Server (NTRS)

    Jones, Jack A.

    2013-01-01

    Most present wind, wave, and tidal energy systems require expensive power conditioning systems that reduce overall efficiency. This new design eliminates power conditioning all, or nearly all, of the time. Wind, wave, and tidal energy systems can transmit their energy to pumps that send high-pressure fluid to a central power production area. The central power production area can consist of a series of hydraulic generators. The hydraulic generators can be variable displacement generators such that the RPM, and thus the voltage, remains constant, eliminating the need for further power conditioning. A series of wind blades is attached to a series of radial piston pumps, which pump fluid to a series of axial piston motors attached to generators. As the wind is reduced, the amount of energy is reduced, and the number of active hydraulic generators can be reduced to maintain a nearly constant RPM. If the axial piston motors have variable displacement, an exact RPM can be maintained for all, or nearly all, wind speeds. Analyses have been performed that show over 20% performance improvements with this technique over conventional wind turbines.

  19. Monitoring microbial responses to ocean deoxygenation in a model oxygen minimum zone.

    PubMed

    Hallam, Steven J; Torres-Beltrán, Mónica; Hawley, Alyse K

    2017-10-31

    Today in Scientific Data, two compendia of geochemical and multi-omic sequence information (DNA, RNA, protein) generated over almost a decade of time series monitoring in a seasonally anoxic coastal marine setting are presented to the scientific community. These data descriptors introduce a model ecosystem for the study of microbial responses to ocean deoxygenation, a phenotype that is currently expanding due to climate change. Public access to this time series information is intended to promote scientific collaborations and the generation of new hypotheses relevant to microbial ecology, biogeochemistry and global change issues.

  20. Application and evaluation of forecasting methods for municipal solid waste generation in an Eastern-European city.

    PubMed

    Rimaityte, Ingrida; Ruzgas, Tomas; Denafas, Gintaras; Racys, Viktoras; Martuzevicius, Dainius

    2012-01-01

    Forecasting of generation of municipal solid waste (MSW) in developing countries is often a challenging task due to the lack of data and the selection of a suitable forecasting method. This article aimed to select and evaluate several methods for MSW forecasting in a medium-scaled Eastern European city (Kaunas, Lithuania) with rapidly developing economics, with respect to affluence-related and seasonal impacts. The MSW generation was forecast with respect to the economic activity of the city (regression modelling) and using time series analysis. The modelling based on social-economic indicators (regression implemented in the LCA-IWM model) showed particular sensitivity (deviation from actual data in the range from 2.2 to 20.6%) to external factors, such as the synergetic effects of affluence parameters or changes in the MSW collection system. For the time series analysis, the combination of autoregressive integrated moving average (ARIMA) and seasonal exponential smoothing (SES) techniques was found to be the most accurate (mean absolute percentage error of 6.5%). The time series analysis method was very valuable for forecasting the weekly variation of waste generation data (r² > 0.87), but the forecast yearly increase should be verified against the data obtained by regression modelling. The methods and findings of this study may assist the experts, decision-makers and scientists performing forecasts of MSW generation, especially in developing countries.
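
    As a rough illustration of the forecast combination described above, the sketch below fits an ARIMA model and a seasonal exponential smoothing model to a synthetic weekly waste-generation series and averages their forecasts. The data, model orders and equal weights are illustrative assumptions, not the study's actual configuration.

```python
# Illustrative sketch: combine ARIMA and seasonal exponential smoothing forecasts
# for a weekly waste-generation series. All data and model orders are assumptions.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(0)
t = np.arange(52 * 6)                       # six years of weekly observations
y = (500 + 0.3 * t                          # slow affluence-related trend
     + 40 * np.sin(2 * np.pi * t / 52)      # yearly seasonality
     + rng.normal(0, 15, t.size))           # noise

train, test = y[:-26], y[-26:]              # hold out half a year

arima_fc = ARIMA(train, order=(1, 1, 1)).fit().forecast(steps=26)
ses_fc = ExponentialSmoothing(train, trend="add", seasonal="add",
                              seasonal_periods=52).fit().forecast(26)

combined = 0.5 * arima_fc + 0.5 * ses_fc    # simple equal-weight combination
mape = 100 * np.mean(np.abs((test - combined) / test))
print(f"MAPE of combined forecast: {mape:.1f}%")
```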

  1. Extracting Hydrologic Understanding from the Unique Space-time Sampling of the Surface Water and Ocean Topography (SWOT) Mission

    NASA Astrophysics Data System (ADS)

    Nickles, C.; Zhao, Y.; Beighley, E.; Durand, M. T.; David, C. H.; Lee, H.

    2017-12-01

    The Surface Water and Ocean Topography (SWOT) satellite mission is being jointly developed by NASA and the French space agency (CNES), with participation from the Canadian and UK space agencies, to serve both the hydrology and oceanography communities. The SWOT mission will sample global surface water extents and elevations (lakes/reservoirs, rivers, estuaries, oceans, sea and land ice) at a finer spatial resolution than is currently possible, enabling hydrologic discovery, model advancements and new applications that are not currently possible or perhaps even conceivable. Although the mission will provide global coverage, analysis and interpolation of the data generated from the irregular space/time sampling represents a significant challenge. In this study, we explore the applicability of the unique space/time sampling for understanding river discharge dynamics throughout the Ohio River Basin. River network topology, SWOT sampling (i.e., orbit and identified SWOT river reaches) and spatial interpolation concepts are used to quantify the fraction of effective sampling of river reaches each day of the three-year mission. Streamflow statistics for SWOT-generated river discharge time series are compared to continuous daily river discharge series. Relationships are presented to transform SWOT-generated streamflow statistics to equivalent continuous daily discharge time series statistics intended to support hydrologic applications using low-flow and annual flow duration statistics.

  2. Reference manual for generation and analysis of Habitat Time Series: version II

    USGS Publications Warehouse

    Milhous, Robert T.; Bartholow, John M.; Updike, Marlys A.; Moos, Alan R.

    1990-01-01

    The selection of an instream flow requirement for water resource management often requires the review of how the physical habitat changes through time. This review is referred to as "Time Series Analysis." The Time Series Library (TSLIB) is a group of programs to enter, transform, analyze, and display time series data for use in stream habitat assessment. A time series may be defined as a sequence of data recorded or calculated over time. Examples might be historical monthly flow, predicted monthly weighted usable area, daily electrical power generation, annual irrigation diversion, and so forth. The time series can be analyzed, both descriptively and analytically, to understand the importance of the variation in the events over time. This is especially useful in the development of instream flow needs based on habitat availability. The TSLIB group of programs assumes that you have an adequate study plan to guide you in your analysis. You need to already have knowledge about such things as time period and time step, species and life stages to consider, and appropriate comparisons or statistics to be produced and displayed or tabulated. Knowing your destination, you must first evaluate whether TSLIB can get you there. Remember, data are not answers. This publication is a reference manual to TSLIB and is intended to be a guide to the process of using the various programs in TSLIB. This manual is essentially limited to the hands-on use of the various programs. A TSLIB user interface program (called RTSM) has been developed to provide an integrated working environment in which the user has a brief on-line description of each TSLIB program, with the capability to run the TSLIB programs while in the user interface. For information on the RTSM program, refer to Appendix F. Before applying the computer models described herein, it is recommended that the user enroll in the short course "Problem Solving with the Instream Flow Incremental Methodology (IFIM)." This course is offered by the Aquatic Systems Branch of the National Ecology Research Center. For more information about the TSLIB software, refer to the Memorandum of Understanding. Chapter 1 provides a brief introduction to the Instream Flow Incremental Methodology and TSLIB. Other chapters in this manual provide information on the different aspects of using the models. The information contained in the other chapters includes (2) acquisition, entry, manipulation, and listing of streamflow data; (3) entry, manipulation, and listing of the habitat-versus-streamflow function; (4) transferring streamflow data; (5) water resources systems analysis; (6) generation and analysis of daily streamflow and habitat values; (7) generation of the time series of monthly habitats; (8) manipulation, analysis, and display of monthly time series data; and (9) generation, analysis, and display of annual time series data. Each section includes documentation for the programs therein with at least one page of information for each program, including a program description, instructions for running the program, and sample output. The Appendixes contain the following: (A) sample file formats; (B) descriptions of default filenames; (C) alphabetical summary of batch-procedure files; (D) installing and running TSLIB on a microcomputer; (E) running TSLIB on a CDC Cyber computer; (F) using the TSLIB user interface program (RTSM); and (G) running WATSTORE on the USGS Amdahl mainframe computer.
The number for this version of TSLIB--Version II-- is somewhat arbitrary, as the TSLIB programs were collected into a library some time ago; but operators tended to use and manage them as individual programs. Therefore, we will consider the group of programs from the past that were only on the CDC Cyber computer as Version 0; the programs from the past that were on both the Cyber and the IBM-compatible microcomputer as Version I; and the programs contained in this reference manual as Version II.

  3. Monitoring volcano activity through Hidden Markov Model

    NASA Astrophysics Data System (ADS)

    Cassisi, C.; Montalto, P.; Prestifilippo, M.; Aliotta, M.; Cannata, A.; Patanè, D.

    2013-12-01

    During 2011-2013, Mt. Etna was mainly characterized by cyclic occurrences of lava fountains, totaling 38 episodes. During this time interval Etna volcano's states (QUIET, PRE-FOUNTAIN, FOUNTAIN, POST-FOUNTAIN), whose automatic recognition is very useful for monitoring purposes, turned out to be strongly related to the trend of RMS (Root Mean Square) of the seismic signal recorded by stations close to the summit area. Since RMS time series behavior is considered to be stochastic, we can try to model the system generating its values, assuming it to be a Markov process, by using Hidden Markov models (HMMs). HMMs are a powerful tool in modeling any time-varying series. HMMs analysis seeks to recover the sequence of hidden states from the observed emissions. In our framework, observed emissions are characters generated by the SAX (Symbolic Aggregate approXimation) technique, which maps RMS time series values to discrete literal emissions. The experiments show how it is possible to guess volcano states by means of HMMs and SAX.
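
    The sketch below illustrates only the SAX step (z-normalisation, piecewise aggregation and Gaussian breakpoints) on a stand-in RMS series; the series, segment count and alphabet size are illustrative assumptions, and the subsequent hidden-state decoding would require a separate discrete-emission HMM implementation.

```python
# Minimal SAX sketch: map a (hypothetical) seismic-RMS series to literal symbols.
# Breakpoints are the standard Gaussian quantiles used by SAX; everything else
# (series, segment count, alphabet size) is an illustrative assumption.
import numpy as np
from scipy.stats import norm

def sax(series, n_segments, alphabet_size):
    x = (series - series.mean()) / series.std()           # z-normalize
    x = x[: n_segments * (len(x) // n_segments)]           # trim to a multiple
    paa = x.reshape(n_segments, -1).mean(axis=1)            # piecewise aggregate approximation
    breakpoints = norm.ppf(np.linspace(0, 1, alphabet_size + 1)[1:-1])
    symbols = np.searchsorted(breakpoints, paa)              # 0 .. alphabet_size-1
    return "".join(chr(ord("a") + s) for s in symbols)

rng = np.random.default_rng(1)
rms = np.abs(np.cumsum(rng.normal(size=2000)))               # stand-in RMS series
print(sax(rms, n_segments=40, alphabet_size=4))              # 40-character SAX word
```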

  4. Prediction of flow dynamics using point processes

    NASA Astrophysics Data System (ADS)

    Hirata, Yoshito; Stemler, Thomas; Eroglu, Deniz; Marwan, Norbert

    2018-01-01

    Describing a time series parsimoniously is the first step to study the underlying dynamics. For a time-discrete system, a generating partition provides a compact description such that a time series and a symbolic sequence are one-to-one. But, for a time-continuous system, such a compact description does not have a solid basis. Here, we propose to describe a time-continuous time series using a local cross section and the times when the orbit crosses the local cross section. We show that if such a series of crossing times and some past observations are given, we can predict the system's dynamics with fine accuracy. This reconstructability depends strongly on neither the size nor the placement of the local cross section if we have a sufficiently long database. We demonstrate the proposed method using the Lorenz model as well as the actual measurement of wind speed.

  5. An improvement of the measurement of time series irreversibility with visibility graph approach

    NASA Astrophysics Data System (ADS)

    Wu, Zhenyu; Shang, Pengjian; Xiong, Hui

    2018-07-01

    We propose a method to improve the measure of real-valued time series irreversibility which contains two tools: the directed horizontal visibility graph and the Kullback-Leibler divergence. The degree of time irreversibility is estimated by the Kullback-Leibler divergence between the in and out degree distributions presented in the associated visibility graph. In our work, we reframe the in and out degree distributions by encoding them with different embedded dimensions used in calculating permutation entropy (PE). With this improved method, we can not only estimate time series irreversibility efficiently, but also detect time series irreversibility from multiple dimensions. We verify the validity of our method and then estimate the amount of time irreversibility of series generated by chaotic maps as well as global stock markets over the period 2005-2015. The result shows that the amount of time irreversibility reaches its peak at embedded dimension d = 3 in both the experiments and the financial-market data.
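
    A minimal sketch of the underlying directed-horizontal-visibility / Kullback-Leibler measure is given below (without the embedded-dimension refinement proposed in the paper); the degree-distribution cut-off and the test series are assumptions.

```python
# Sketch of the basic directed-horizontal-visibility / KL-divergence irreversibility
# measure (the paper's embedded-dimension refinement is not reproduced here).
import numpy as np

def dhvg_degrees(x):
    """Out/in degrees of the directed horizontal visibility graph of series x."""
    n = len(x)
    k_out = np.zeros(n, dtype=int)
    k_in = np.zeros(n, dtype=int)
    for i in range(n - 1):
        top = -np.inf                       # running maximum of intermediate values
        for j in range(i + 1, n):
            if x[i] > top and x[j] > top:   # horizontal visibility criterion
                k_out[i] += 1
                k_in[j] += 1
            top = max(top, x[j])
            if top >= x[i]:                  # nothing beyond j can see node i
                break
    return k_out, k_in

def kld_irreversibility(x, kmax=20):
    k_out, k_in = dhvg_degrees(x)
    # Degree distributions truncated at kmax; small constant avoids log(0).
    p = np.bincount(k_out, minlength=kmax)[:kmax] + 1e-12
    q = np.bincount(k_in, minlength=kmax)[:kmax] + 1e-12
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(2)
print(kld_irreversibility(rng.normal(size=3000)))   # reversible noise: close to zero
```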

  6. Co-optimizing Generation and Transmission Expansion with Wind Power in Large-Scale Power Grids Implementation in the US Eastern Interconnection

    DOE PAGES

    You, Shutang; Hadley, Stanton W.; Shankar, Mallikarjun; ...

    2016-01-12

    This paper studies the generation and transmission expansion co-optimization problem with a high wind power penetration rate in the US Eastern Interconnection (EI) power grid. In this paper, the generation and transmission expansion problem for the EI system is modeled as a mixed-integer programming (MIP) problem. Our paper also analyzed a time series generation method to capture the variation and correlation of both load and wind power across regions. The obtained series can be easily introduced into the expansion planning problem and then solved through existing MIP solvers. Simulation results show that the proposed planning model and series generation method can improve the expansion result significantly through modeling more detailed information of wind and load variation among regions in the US EI system. Moreover, the improved expansion plan that combines generation and transmission will aid system planners and policy makers to maximize the social welfare in large-scale power grids.

  7. Controlled generation of a single Trichel pulse and a series of single Trichel pulses in air

    NASA Astrophysics Data System (ADS)

    Mizeraczyk, Jerzy; Berendt, Artur; Akishev, Yuri

    2018-04-01

    In this paper, a simple method for the controlled generation of a single Trichel pulse or a series of single Trichel pulses of a regulated repetition frequency in air is proposed. The concept of triggering a single Trichel pulse or a series of such pulses is based on precisely controlling the inception voltage of the negative corona, which can be accomplished through the use of a ramp voltage pulse or a series of such pulses with properly chosen ramp voltage pulse parameters (rise and fall times, and ramp voltage pulse repetition frequency). The proposal has been tested in experiments using a needle-to-plate electrode arrangement in air, and reproducible Trichel pulses (single or in a series) were obtained by triggering them with an appropriately designed voltage waveform. The proposed method and results obtained have been qualitatively analysed. The analysis provides guidance for designing the voltage ramp pulse with respect to the generation of a single Trichel pulse or a series of single Trichel pulses. The controlled generation of a single Trichel pulse or a series of such pulses would be a helpful research tool for refined studies of the fundamental processes in a negative corona discharge in single-phase (e.g. air) and multi-phase gaseous fluids. The controlled generation of a single Trichel pulse or a series of Trichel pulses can also be attractive for those corona treatments which need manipulation of the electric charge and heat portions delivered by the Trichel pulses to the object.

  8. Generating synthetic wave climates for coastal modelling: a linear mixed modelling approach

    NASA Astrophysics Data System (ADS)

    Thomas, C.; Lark, R. M.

    2013-12-01

    Numerical coastline morphological evolution models require wave climate properties to drive morphological change through time. Wave climate properties (typically wave height, period and direction) may be temporally fixed, culled from real wave buoy data, or allowed to vary in some way defined by a Gaussian or other pdf. However, to examine sensitivity of coastline morphologies to wave climate change, it seems desirable to be able to modify wave climate time series from a current to some new state along a trajectory, but in a way consistent with, or initially conditioned by, the properties of existing data, or to generate fully synthetic data sets with realistic time series properties. For example, mean or significant wave height time series may have underlying periodicities, as revealed in numerous analyses of wave data. Our motivation is to develop a simple methodology to generate synthetic wave climate time series that can change in some stochastic way through time. We wish to use such time series in a coastline evolution model to test sensitivities of coastal landforms to changes in wave climate over decadal and centennial scales. We have worked initially on time series of significant wave height, based on data from a Waverider III buoy located off the coast of Yorkshire, England. The statistical framework for the simulation is the linear mixed model. The target variable, perhaps after transformation (Box-Cox), is modelled as a multivariate Gaussian, the mean modelled as a function of a fixed effect, and two random components, one of which is independently and identically distributed (iid) and the second of which is temporally correlated. The model was fitted to the data by likelihood methods. We considered the option of a periodic mean, the period either fixed (e.g. at 12 months) or estimated from the data. We considered two possible correlation structures for the second random effect. In the first, the correlation decays exponentially with time. In the second (spherical) model, it cuts off at a temporal range. Having fitted the model, multiple realisations were generated; the random effects were simulated by specifying a covariance matrix for the simulated values, with the estimated parameters. The Cholesky factorisation of the covariance matrix was computed and realizations of the random component of the model were generated by pre-multiplying a vector of iid standard Gaussian variables by the lower triangular factor. The resulting random variate was added to the mean value computed from the fixed effects, and the result back-transformed to the original scale of the measurement. Realistic simulations result from the approach described above. Background exploratory data analysis was undertaken on 20-day sets of 30-minute buoy data, selected from days 5-24 of January, April, July and October 2011, to elucidate daily to weekly variations, and to keep numerical analysis tractable computationally. Work remains to be undertaken to develop suitable models for synthetic directional data. We suggest that the general principles of the method will have applications in other geomorphological modelling endeavours requiring time series of stochastically variable environmental parameters.
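
    A minimal sketch of the simulation step described above, assuming a periodic mean, an iid component and an exponentially correlated component with placeholder parameter values (not the fitted ones), is:

```python
# Sketch of the simulation step: periodic fixed effect plus an iid component plus a
# temporally correlated component, generated via Cholesky factorisation of the
# covariance matrix. All parameter values are placeholders, not fitted estimates.
import numpy as np

n = 1000                                    # number of 30-minute time steps
t = np.arange(n)
period = 48 * 365.25                         # ~annual period in 30-minute steps
mean = 1.2 + 0.5 * np.sin(2 * np.pi * t / period)       # fixed effect (transformed scale)

sigma2_iid, sigma2_corr, phi = 0.05, 0.15, 96.0          # variances and correlation scale
lags = np.abs(np.subtract.outer(t, t))
cov = sigma2_corr * np.exp(-lags / phi) + sigma2_iid * np.eye(n)

L = np.linalg.cholesky(cov)                  # lower-triangular factor
rng = np.random.default_rng(3)
sim = mean + L @ rng.standard_normal(n)      # one realisation on the transformed scale
hs_sim = sim ** 2                            # placeholder for the inverse Box-Cox back-transform
```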

  9. A time series modeling approach in risk appraisal of violent and sexual recidivism.

    PubMed

    Bani-Yaghoub, Majid; Fedoroff, J Paul; Curry, Susan; Amundsen, David E

    2010-10-01

    For over half a century, various clinical and actuarial methods have been employed to assess the likelihood of violent recidivism. Yet there is a need for new methods that can improve the accuracy of recidivism predictions. This study proposes a new time series modeling approach that generates high levels of predictive accuracy over short and long periods of time. The proposed approach outperformed two widely used actuarial instruments (i.e., the Violence Risk Appraisal Guide and the Sex Offender Risk Appraisal Guide). Furthermore, analysis of temporal risk variations based on specific time series models can add valuable information into risk assessment and management of violent offenders.

  10. Probabilistic reasoning over seismic RMS time series: volcano monitoring through HMMs and SAX technique

    NASA Astrophysics Data System (ADS)

    Aliotta, M. A.; Cassisi, C.; Prestifilippo, M.; Cannata, A.; Montalto, P.; Patanè, D.

    2014-12-01

    During the last years, volcanic activity at Mt. Etna was often characterized by cyclic occurrences of fountains. In the period between January 2011 and June 2013, 38 episodes of lava fountains have been observed. Automatic recognition of the volcano's states related to lava fountain episodes (Quiet, Pre-Fountaining, Fountaining, Post-Fountaining) is very useful for monitoring purposes. We discovered that such states are strongly related to the trend of RMS (Root Mean Square) of the seismic signal recorded in the summit area. In the framework of the PON SIGMA project (Integrated Cloud-Sensor System for Advanced Multirisk Management), we tried to model the system generating the sampled RMS values (assuming it to be a Markov process and the RMS time series to be a stochastic process) by using Hidden Markov models (HMMs), which are a powerful tool for modeling any time-varying series. HMMs analysis seeks to discover the sequence of hidden states from the observed emissions. In our framework, observed emissions are characters generated by the SAX (Symbolic Aggregate approXimation) technique. SAX is able to map RMS time series values to discrete literal emissions. Our experiments showed how to predict volcano states by means of SAX and HMMs.

  11. Emerging properties of financial time series in the ``Game of Life''

    NASA Astrophysics Data System (ADS)

    Hernández-Montoya, A. R.; Coronel-Brizio, H. F.; Stevens-Ramírez, G. A.; Rodríguez-Achach, M.; Politi, M.; Scalas, E.

    2011-12-01

    We explore the spatial complexity of Conway’s “Game of Life,” a prototypical cellular automaton, by means of a geometrical procedure generating a two-dimensional random walk from a bidimensional lattice with periodic boundaries. The one-dimensional projection of this process is analyzed and it turns out that some of its statistical properties resemble the so-called stylized facts observed in financial time series. The scope and meaning of this result are discussed from the viewpoint of complex systems. In particular, we stress how the supposed peculiarities of financial time series are often overrated in their importance.

  12. Emerging properties of financial time series in the "Game of Life".

    PubMed

    Hernández-Montoya, A R; Coronel-Brizio, H F; Stevens-Ramírez, G A; Rodríguez-Achach, M; Politi, M; Scalas, E

    2011-12-01

    We explore the spatial complexity of Conway's "Game of Life," a prototypical cellular automaton, by means of a geometrical procedure generating a two-dimensional random walk from a bidimensional lattice with periodic boundaries. The one-dimensional projection of this process is analyzed and it turns out that some of its statistical properties resemble the so-called stylized facts observed in financial time series. The scope and meaning of this result are discussed from the viewpoint of complex systems. In particular, we stress how the supposed peculiarities of financial time series are often overrated in their importance.

  13. An Extended IEEE 118-Bus Test System With High Renewable Penetration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pena, Ivonne; Martinez-Anido, Carlo Brancucci; Hodge, Bri-Mathias

    This article describes a new publicly available version of the IEEE 118-bus test system, named NREL-118. The database is based on the transmission representation (buses and lines) of the IEEE 118-bus test system, with a reconfigured generation representation using three regions of the US Western Interconnection from the latest Western Electricity Coordinating Council (WECC) 2024 Common Case [1]. Time-synchronous hourly load, wind, and solar time series are provided for over one year (8784 hours). The public database presented and described in this manuscript will allow researchers to model a test power system using detailed transmission, generation, load, wind, and solar data. This database includes key additional features that add to the current IEEE 118-bus test model, such as: the inclusion of 10 generation technologies with different heat rate functions, minimum stable levels and ramping rates, GHG emissions rates, regulation and contingency reserves, and hourly time series data for one full year for load, wind and solar generation.

  14. Real time wave forecasting using wind time history and numerical model

    NASA Astrophysics Data System (ADS)

    Jain, Pooja; Deo, M. C.; Latha, G.; Rajendran, V.

    Operational activities in the ocean, like planning for structural repairs or fishing expeditions, require real-time prediction of waves over typical time durations of, say, a few hours. Such predictions can be made by using a numerical model or a time series model employing continuously recorded waves. This paper presents another option, based on a different time series approach in which the input is in the form of preceding wind speed and wind direction observations. This would be useful for those stations where the costly wave buoys are not deployed and instead only meteorological buoys measuring wind are moored. The technique employs the alternative artificial intelligence approaches of an artificial neural network (ANN), genetic programming (GP) and model tree (MT) to carry out the time series modeling of wind to obtain waves. Wind observations at four offshore sites along the east coast of India were used. For calibration purposes, the wave data were generated using a numerical model. The predicted waves obtained using the proposed time series models, when compared with the numerically generated waves, showed good resemblance in terms of the selected error criteria. Large differences across the chosen techniques (ANN, GP, MT) were not observed. Wave hindcasting at the same time step and the predictions over shorter lead times were better than the predictions over longer lead times. The proposed method is a cost-effective and convenient option when site-specific information is desired.
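
    The sketch below illustrates the ANN option only, with synthetic stand-ins for the buoy winds and the numerically modelled waves; the lag length, lead time and network size are assumptions, not the study's settings.

```python
# Sketch of the ANN option only: lagged wind speed/direction as inputs, significant
# wave height a few hours ahead as output. Data are synthetic stand-ins.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n = 3000
speed = 5 + 3 * np.sin(np.arange(n) / 50) + rng.normal(0, 1, n)   # hourly wind speed
direction = np.deg2rad(180 + 60 * np.sin(np.arange(n) / 80))       # wind direction (rad)
hs = 0.02 * speed ** 2 + rng.normal(0, 0.1, n)                      # crude wave-height proxy

lags, lead = 6, 3                                                   # 6 h of inputs, 3 h ahead
X, y = [], []
for i in range(lags, n - lead):
    X.append(np.concatenate([speed[i - lags:i],
                             np.sin(direction[i - lags:i]),
                             np.cos(direction[i - lags:i])]))
    y.append(hs[i + lead])
X, y = np.asarray(X), np.asarray(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, shuffle=False)
model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000,
                     random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out data:", round(model.score(X_te, y_te), 3))
```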

  15. Cloud masking and removal in remote sensing image time series

    NASA Astrophysics Data System (ADS)

    Gómez-Chova, Luis; Amorós-López, Julia; Mateo-García, Gonzalo; Muñoz-Marí, Jordi; Camps-Valls, Gustau

    2017-01-01

    Automatic cloud masking of Earth observation images is one of the first required steps in optical remote sensing data processing since the operational use and product generation from satellite image time series might be hampered by undetected clouds. The high temporal revisit of current and forthcoming missions and the scarcity of labeled data force us to cast cloud screening as an unsupervised change detection problem in the temporal domain. We introduce a cloud screening method based on detecting abrupt changes along the time dimension. The main assumption is that image time series follow smooth variations over land (background) and abrupt changes will be mainly due to the presence of clouds. The method estimates the background surface changes using the information in the time series. In particular, we propose linear and nonlinear least squares regression algorithms that minimize both the prediction and the estimation error simultaneously. Then, significant differences in the image of interest with respect to the estimated background are identified as clouds. The use of kernel methods allows the generalization of the algorithm to account for higher-order (nonlinear) feature relations. After the proposed cloud masking and cloud removal, cloud-free time series at high spatial resolution can be used to obtain a better monitoring of land cover dynamics and to generate more elaborated products. The method is tested in a dataset with 5-day revisit time series from SPOT-4 at high resolution and with Landsat-8 time series. Experimental results show that the proposed method yields more accurate cloud masks when confronted with state-of-the-art approaches typically used in operational settings. In addition, the algorithm has been implemented in the Google Earth Engine platform, which allows us to access the full Landsat-8 catalog and work in a parallel distributed platform to extend its applicability to a global planetary scale.
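
    A greatly simplified, per-pixel version of this idea is sketched below: the background is estimated with a harmonic least-squares fit to the earlier images and large positive residuals in the newest image are flagged as cloud. The harmonic design and threshold are assumptions, not the authors' regression algorithms.

```python
# Greatly simplified per-pixel illustration: estimate the smooth background from the
# time series with a harmonic least-squares fit, then flag large positive residuals
# in the newest image as cloud. Thresholds and the design matrix are assumptions.
import numpy as np

def cloud_mask(stack, doy, k_sigma=3.0):
    """stack: (T, H, W) reflectance time series; doy: (T,) day of year. Returns (H, W) bool."""
    T, H, W = stack.shape
    # Design matrix: constant plus one annual harmonic (background varies smoothly).
    A = np.column_stack([np.ones(T - 1),
                         np.sin(2 * np.pi * doy[:-1] / 365.25),
                         np.cos(2 * np.pi * doy[:-1] / 365.25)])
    Y = stack[:-1].reshape(T - 1, -1)                      # fit on all but the newest image
    coef, *_ = np.linalg.lstsq(A, Y, rcond=None)
    resid_std = (Y - A @ coef).std(axis=0)
    a_new = np.array([1.0,
                      np.sin(2 * np.pi * doy[-1] / 365.25),
                      np.cos(2 * np.pi * doy[-1] / 365.25)])
    background = a_new @ coef                               # predicted cloud-free reflectance
    diff = stack[-1].reshape(-1) - background
    return (diff > k_sigma * resid_std).reshape(H, W)       # clouds: much brighter than background
```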

  16. Quantifying the consequences of changing hydroclimatic extremes on protection levels for the Rhine

    NASA Astrophysics Data System (ADS)

    Sperna Weiland, Frederiek; Hegnauer, Mark; Buiteveld, Hendrik; Lammersen, Rita; van den Boogaard, Henk; Beersma, Jules

    2017-04-01

    The Dutch method for quantifying the magnitude and frequency of occurrence of discharge extremes in the Rhine basin and the potential influence of climate change hereon are presented. In the Netherlands flood protection design requires estimates of discharge extremes for return periods of 1000 up to 100,000 years. Observed discharge records are too short to derive such extreme return discharges, therefore extreme value assessment is based on very long synthetic discharge time-series generated with the Generator of Rainfall And Discharge Extremes (GRADE). The GRADE instrument consists of (1) a stochastic weather generator based on time series resampling of historical rainfall and temperature, (2) a hydrological model optimized following the GLUE methodology, and (3) a hydrodynamic model to simulate the propagation of flood waves based on the generated hydrological time-series. To assess the potential influence of climate change, the four KNMI'14 climate scenarios are applied. These four scenarios represent a large part of the uncertainty provided by the GCMs used for the IPCC 5th assessment report (the CMIP5 GCM simulations under different climate forcings) and are for this purpose tailored to the Rhine and Meuse river basins. To derive the probability distributions of extreme discharges under climate change the historical synthetic rainfall and temperature series simulated with the weather generator are transformed to the future following the KNMI'14 scenarios. For this transformation the Advanced Delta Change method, which allows the changes in the extremes to differ from those in the means, is used. Subsequently the hydrological model is forced with the historical and future (i.e. transformed) synthetic time-series, after which the propagation of the flood waves is simulated with the hydrodynamic model to obtain the extreme discharge statistics both for current and future climate conditions. The study shows that both for 2050 and 2085 increases in discharge extremes for the river Rhine at Lobith are projected by all four KNMI'14 climate scenarios. This poses increased requirements for flood protection design in order to prepare for changing climate conditions.

  17. A Point Rainfall Generator With Internal Storm Structure

    NASA Astrophysics Data System (ADS)

    Marien, J. L.; Vandewiele, G. L.

    1986-04-01

    A point rainfall generator is a probabilistic model for the time series of rainfall as observed in one geographical point. The main purpose of such a model is to generate long synthetic sequences of rainfall for simulation studies. The present generator is a continuous time model based on 13.5 years of 10-min point rainfalls observed in Belgium and digitized with a resolution of 0.1 mm. The present generator attempts to model all features of the rainfall time series which are important for flood studies as accurately as possible. The original aspects of the model are, on the one hand, the way in which storms are defined and, on the other hand, the theoretical model for the internal storm characteristics. The storm definition has the advantage that the important characteristics of successive storms are fully independent and very precisely modelled, even on time bases as small as 10 min. The model of the internal storm characteristics has a strong theoretical structure. This better justifies the extrapolation of the model to severe storms, for which data are very sparse. This can be important when using the model to simulate severe flood events.

  18. Collaborative Research with Chinese, Indian, Filipino and North European Research Organizations on Infectious Disease Epidemics.

    PubMed

    Sumi, Ayako; Kobayashi, Nobumichi

    2017-01-01

    In this report, we present a short review of applications of time series analysis, which consists of spectral analysis based on the maximum entropy method in the frequency domain and the least squares method in the time domain, to the incidence data of infectious diseases. This report consists of three parts. First, we present our results obtained by collaborative research on infectious disease epidemics with Chinese, Indian, Filipino and North European research organizations. Second, we present the results obtained with the Japanese infectious disease surveillance data and the time series numerically generated from a mathematical model, called the susceptible/exposed/infectious/recovered (SEIR) model. Third, we present an application of the time series analysis to pathologic tissues to examine the usefulness of time series analysis for investigating the spatial pattern of pathologic tissue. It is anticipated that time series analysis will become a useful tool for investigating not only infectious disease surveillance data but also immunological and genetic tests.

  19. A window-based time series feature extraction method.

    PubMed

    Katircioglu-Öztürk, Deniz; Güvenir, H Altay; Ravens, Ursula; Baykal, Nazife

    2017-10-01

    This study proposes a robust similarity score-based time series feature extraction method that is termed as Window-based Time series Feature ExtraCtion (WTC). Specifically, WTC generates domain-interpretable results and involves significantly low computational complexity thereby rendering itself useful for densely sampled and populated time series datasets. In this study, WTC is applied to a proprietary action potential (AP) time series dataset on human cardiomyocytes and three precordial leads from a publicly available electrocardiogram (ECG) dataset. This is followed by comparing WTC in terms of predictive accuracy and computational complexity with shapelet transform and fast shapelet transform (which constitutes an accelerated variant of the shapelet transform). The results indicate that WTC achieves a slightly higher classification performance with significantly lower execution time when compared to its shapelet-based alternatives. With respect to its interpretable features, WTC has a potential to enable medical experts to explore definitive common trends in novel datasets. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Simulating extreme low-discharge events for the Rhine using a stochastic model

    NASA Astrophysics Data System (ADS)

    Macian-Sorribes, Hector; Mens, Marjolein; Schasfoort, Femke; Diermanse, Ferdinand; Pulido-Velazquez, Manuel

    2017-04-01

    The specific features of hydrological droughts make them more difficult to analyse than other water-related phenomena: longer time scales (months to several years) so fewer historical events are available, and the drought severity and associated damage depend on a combination of variables with no clearly dominant one (e.g., total water deficit, maximum deficit and duration). As part of drought risk analysis, which aims to provide insight into the variability of hydrological conditions and associated socio-economic impacts, long synthetic time series should therefore be developed. In this contribution, we increase the length of the available inflow time series using stochastic autoregressive modelling. This enhancement could improve the characterization of the extreme range and can define extreme droughts with similar return periods but different patterns that can lead to distinctly different damages. The methodology consists of: 1) fitting an autoregressive model (AR, ARMA…) to the available records; 2) generating extended time series (thousands of years); 3) performing a frequency analysis with different characteristic variables (total deficit, maximum deficit and so on); and 4) selecting extreme drought events associated with different characteristic variables and return periods. The methodology was applied to the Rhine river discharge at location Lobith, where the Rhine enters The Netherlands. A monthly ARMA(1,1) autoregressive model with seasonally varying parameters was fitted to the historical records available since 1901 and successfully validated. The maximum monthly deficit with respect to a threshold value of 1800 m3/s and the average discharge for a given time span in m3/s were chosen as indicators to identify drought periods. A synthetic series of 10,000 years of discharges was generated using the validated ARMA model. Two time spans were considered in the analysis: the whole calendar year and the half-year period between April and September (the summer half-year, when water demands are highest). Frequency analysis was performed for both indicators and time spans for the generated time series and the historical records. The comparison between observed and generated series showed that the ARMA model provides a good reproduction of the maximum deficits and total discharges, especially for the summer half-year period. The resulting synthetic series are therefore considered credible. These synthetic series, with their wealth of information, can then be used as inputs for damage assessment models, together with information on precipitation deficits, in order to estimate the risk that lower inflows pose to the urban, agricultural and shipping sectors, among others. This will help in associating economic losses with return periods, as well as in estimating how droughts with similar return periods but different patterns can lead to different damages. ACKNOWLEDGEMENT This study has been supported by the European Union's Horizon 2020 research and innovation programme under the IMPREX project (grant agreement no: 641.811), and by the Climate-KIC Pioneers into Practice Program supported by the European Union's EIT.
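
    A minimal sketch of the generation step, i.e. a monthly ARMA(1,1) with month-dependent parameters simulated for thousands of years and summarised by a deficit indicator, is shown below; all parameter values are placeholders rather than the values fitted to the Lobith record.

```python
# Sketch of the generation step: a monthly ARMA(1,1) with month-dependent parameters,
# simulated for thousands of years of synthetic discharge. All values are placeholders.
import numpy as np

def simulate_seasonal_arma(n_years, mu, phi, theta, sigma, seed=0):
    """mu, phi, theta, sigma: arrays of 12 monthly parameters (mean, AR, MA, innovation sd)."""
    rng = np.random.default_rng(seed)
    n = 12 * n_years
    q = np.zeros(n)                     # discharge anomalies
    e_prev = 0.0
    for t in range(1, n):
        m = t % 12
        e = rng.normal(0.0, sigma[m])
        q[t] = phi[m] * q[t - 1] + e + theta[m] * e_prev
        e_prev = e
    months = np.arange(n) % 12
    return q + mu[months]               # add the seasonal mean back

mu = 2200 + 500 * np.cos(2 * np.pi * np.arange(12) / 12)   # placeholder monthly means (m3/s)
phi = np.full(12, 0.6); theta = np.full(12, 0.2); sigma = np.full(12, 300.0)
flow = simulate_seasonal_arma(10_000, mu, phi, theta, sigma)

deficit = np.clip(1800.0 - flow, 0.0, None)                 # monthly deficit below 1800 m3/s
annual_max_deficit = deficit.reshape(-1, 12).max(axis=1)    # one value per synthetic year
print("99.9th percentile of annual max deficit:", np.percentile(annual_max_deficit, 99.9))
```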

  1. The incorrect usage of singular spectral analysis and discrete wavelet transform in hybrid models to predict hydrological time series

    NASA Astrophysics Data System (ADS)

    Du, Kongchang; Zhao, Ying; Lei, Jiaqiang

    2017-09-01

    In hydrological time series prediction, singular spectrum analysis (SSA) and discrete wavelet transform (DWT) are widely used as preprocessing techniques for artificial neural network (ANN) and support vector machine (SVM) predictors. These hybrid or ensemble models seem to largely reduce the prediction error. In the current literature, researchers apply these techniques to the whole observed time series and then obtain a set of reconstructed or decomposed time series as inputs to ANN or SVM. However, through two comparative experiments and mathematical deduction we found that this usage of SSA and DWT in building hybrid models is incorrect. Since SSA and DWT adopt 'future' values to perform the calculation, the series generated by SSA reconstruction or DWT decomposition contain information from 'future' values. These hybrid models therefore produce spuriously 'high' prediction performance and may cause large errors in practice.
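
    The leakage issue can be illustrated with a basic singular spectrum analysis reconstruction, as sketched below: reconstructing the whole series before splitting it injects 'future' values into the inputs, whereas the leakage-free variant reconstructs only from data available up to each forecast origin. The window length, rank and test series are assumptions for illustration.

```python
# Sketch illustrating the point above with a basic SSA reconstruction: applying SSA to
# the full series before the train/test split leaks 'future' values into the inputs;
# the leakage-free variant reconstructs only from data available up to each forecast origin.
import numpy as np

def ssa_reconstruct(x, window, rank):
    n = len(x)
    k = n - window + 1
    X = np.column_stack([x[i:i + window] for i in range(k)])     # trajectory matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :rank] * s[:rank]) @ Vt[:rank, :]                 # low-rank approximation
    rec, cnt = np.zeros(n), np.zeros(n)
    for j in range(k):                                            # diagonal averaging
        rec[j:j + window] += Xr[:, j]
        cnt[j:j + window] += 1
    return rec / cnt

rng = np.random.default_rng(5)
x = np.sin(np.arange(600) / 10) + 0.5 * rng.normal(size=600)

# Incorrect: reconstruct the whole series once, then use rec_full[t] as a predictor input.
rec_full = ssa_reconstruct(x, window=30, rank=2)

# Leakage-free: at each forecast origin t, reconstruct using x[:t] only.
rec_causal = np.full_like(x, np.nan)
for t in range(100, len(x)):
    rec_causal[t] = ssa_reconstruct(x[:t], window=30, rank=2)[-1]
```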

  2. Testing for nonlinearity in non-stationary physiological time series.

    PubMed

    Guarín, Diego; Delgado, Edilson; Orozco, Álvaro

    2011-01-01

    Testing for nonlinearity is one of the most important preprocessing steps in nonlinear time series analysis. Typically, this is done by means of the linear surrogate data methods. But it is a known fact that the validity of the results heavily depends on the stationarity of the time series. Since most physiological signals are non-stationary, it is easy to falsely detect nonlinearity using the linear surrogate data methods. In this document, we propose a methodology to extend the procedure for generating constrained surrogate time series in order to assess nonlinearity in non-stationary data. The method is based on the band-phase-randomized surrogates, which consist (contrary to the linear surrogate data methods) in randomizing only a portion of the Fourier phases in the high-frequency domain. Analysis of simulated time series showed that in comparison to the linear surrogate data method, our method is able to discriminate between linear stationary, linear non-stationary and nonlinear time series. Applying our methodology to heart rate variability (HRV) records of five healthy patients, we found that nonlinear correlations are present in these non-stationary physiological signals.
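
    A minimal sketch of one band-phase-randomized surrogate, assuming a simple cutoff that leaves the low-frequency phases (the slow, non-stationary part) untouched and randomizes the rest, is:

```python
# Sketch of one band-phase-randomized surrogate: keep the low-frequency Fourier phases
# (which carry the slow, non-stationary part) and randomize only the phases above a
# cutoff frequency. The cutoff fraction is an illustrative assumption.
import numpy as np

def band_phase_randomized_surrogate(x, cutoff_fraction=0.05, seed=None):
    rng = np.random.default_rng(seed)
    X = np.fft.rfft(x)
    n_keep = max(1, int(cutoff_fraction * len(X)))       # low-frequency bins left untouched
    phases = rng.uniform(0, 2 * np.pi, len(X) - n_keep)
    X[n_keep:] = np.abs(X[n_keep:]) * np.exp(1j * phases)  # randomize high-frequency phases
    return np.fft.irfft(X, n=len(x))

rng = np.random.default_rng(6)
hrv = np.cumsum(rng.normal(size=4096)) * 0.01 + rng.normal(size=4096)  # stand-in non-stationary signal
surrogate = band_phase_randomized_surrogate(hrv, cutoff_fraction=0.05, seed=1)
```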

  3. Detecting and modelling delayed density-dependence in abundance time series of a small mammal (Didelphis aurita)

    NASA Astrophysics Data System (ADS)

    Brigatti, E.; Vieira, M. V.; Kajin, M.; Almeida, P. J. A. L.; de Menezes, M. A.; Cerqueira, R.

    2016-02-01

    We study the population size time series of a Neotropical small mammal with the intent of detecting and modelling population regulation processes generated by density-dependent factors and their possible delayed effects. The application of analysis tools based on principles of statistical generality is nowadays common practice for describing these phenomena, but, in general, such tools are better at producing a clear diagnosis than at providing valuable models. For this reason, in our approach, we detect the principal temporal structures on the basis of different correlation measures, and from these results we build an ad-hoc minimalist autoregressive model that incorporates the main drivers of the dynamics. Surprisingly, our model reproduces the temporal patterns of the empirical series very well and, for the first time, clearly outlines the importance of the time of attaining sexual maturity as a central temporal scale for the dynamics of this species. In fact, an important advantage of this analysis scheme is that all the model parameters are directly biologically interpretable and potentially measurable, allowing a consistency check between model outputs and independent measurements.

  4. Analysis and prediction of aperiodic hydrodynamic oscillatory time series by feed-forward neural networks, fuzzy logic, and a local nonlinear predictor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gentili, Pier Luigi, E-mail: pierluigi.gentili@unipg.it; Gotoda, Hiroshi; Dolnik, Milos

    Forecasting of aperiodic time series is a compelling challenge for science. In this work, we analyze aperiodic spectrophotometric data, proportional to the concentrations of two forms of a thermoreversible photochromic spiro-oxazine, that are generated when a cuvette containing a solution of the spiro-oxazine undergoes photoreaction and convection due to localized ultraviolet illumination. We construct the phase space for the system using Takens' theorem and we calculate the Lyapunov exponents and the correlation dimensions to ascertain the chaotic character of the time series. Finally, we predict the time series using three distinct methods: a feed-forward neural network, fuzzy logic, and a local nonlinear predictor. We compare the performances of these three methods.

  5. Updating Landsat time series of surface-reflectance composites and forest change products with new observations

    NASA Astrophysics Data System (ADS)

    Hermosilla, Txomin; Wulder, Michael A.; White, Joanne C.; Coops, Nicholas C.; Hobart, Geordie W.

    2017-12-01

    The use of time series satellite data allows for the temporally dense, systematic, transparent, and synoptic capture of land dynamics over time. Subsequent to the opening of the Landsat archive, several time series approaches for characterizing landscape change have been developed, often representing a particular analytical time window. The information richness and widespread utility of these time series data have created a need to maintain the currency of time series information via the addition of new data, as it becomes available. When an existing time series is temporally extended, it is critical that previously generated change information remains consistent, thereby not altering reported change statistics or science outcomes based on that change information. In this research, we investigate the impacts and implications of adding additional years to an existing 29-year annual Landsat time series for forest change. To do so, we undertook a spatially explicit comparison of the 29 overlapping years of a time series representing 1984-2012, with a time series representing 1984-2016. Surface reflectance values, and presence, year, and type of change were compared. We found that the addition of years to extend the time series had minimal effect on the annual surface reflectance composites, with slight band-specific differences (r ≥ 0.1) in the final years of the original time series being updated. The area of stand replacing disturbances and determination of change year are virtually unchanged for the overlapping period between the two time-series products. Over the overlapping temporal period (1984-2012), the total area of change differs by 0.53%, equating to an annual difference in change area of 0.019%. Overall, the spatial and temporal agreement of the changes detected by both time series was 96%. Further, our findings suggest that the entire pre-existing historic time series does not need to be re-processed during the update process. Critically, given the time series change detection and update approach followed here, science outcomes or reports representing one temporal epoch can be considered stable and will not be altered when a time series is updated with newly available data.

  6. Detection of statistical asymmetries in non-stationary sign time series: Analysis of foreign exchange data

    PubMed Central

    Takayasu, Hideki; Takayasu, Misako

    2017-01-01

    We extend the concept of statistical symmetry as the invariance of a probability distribution under transformation to analyze binary sign time series data of price differences from the foreign exchange market. We model segments of the sign time series as Markov sequences and apply a local hypothesis test to evaluate the symmetries of independence and time reversion in different periods of the market. For the test, we derive the probability of a binary Markov process generating a given set of symbol-pair counts. Using this analysis, we could not only segment the time series according to the different behaviors but also characterize the segments in terms of statistical symmetries. As a particular result, we find that the foreign exchange market is essentially time reversible, but this symmetry is broken when there is a strong external influence. PMID:28542208
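
    A simplified empirical version of the two symmetry checks, based on raw pair and triplet frequencies rather than the paper's Markov likelihood derivation, is sketched below; the iid test series is a stand-in for the sign series of price differences.

```python
# Simplified empirical check of the two symmetries discussed above on a binary sign
# series: independence (pair frequencies vs. product of marginals) and time reversal
# (triplet frequencies vs. their reversed counterparts). This is a rough illustration,
# not the paper's exact Markov likelihood test.
import numpy as np
from itertools import product

def symmetry_checks(signs):
    s = np.asarray(signs)
    pairs = {p: np.mean((s[:-1] == p[0]) & (s[1:] == p[1]))
             for p in product([-1, 1], repeat=2)}
    marg = {v: np.mean(s == v) for v in (-1, 1)}
    indep_gap = max(abs(pairs[p] - marg[p[0]] * marg[p[1]]) for p in pairs)

    trip = {t: np.mean((s[:-2] == t[0]) & (s[1:-1] == t[1]) & (s[2:] == t[2]))
            for t in product([-1, 1], repeat=3)}
    reversal_gap = max(abs(trip[t] - trip[t[::-1]]) for t in trip)
    return indep_gap, reversal_gap

rng = np.random.default_rng(7)
signs = rng.choice([-1, 1], size=50_000)       # iid stand-in for price-difference signs
print(symmetry_checks(signs))                   # both gaps should be close to zero
```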

  7. Hybrid intelligent methodology to design translation invariant morphological operators for Brazilian stock market prediction.

    PubMed

    Araújo, Ricardo de A

    2010-12-01

    This paper presents a hybrid intelligent methodology to design increasing translation invariant morphological operators applied to Brazilian stock market prediction (overcoming the random walk dilemma). The proposed Translation Invariant Morphological Robust Automatic phase-Adjustment (TIMRAA) method consists of a hybrid intelligent model composed of a Modular Morphological Neural Network (MMNN) with a Quantum-Inspired Evolutionary Algorithm (QIEA), which searches for the best time lags to reconstruct the phase space of the time series generator phenomenon and determines the initial (sub-optimal) parameters of the MMNN. Each individual of the QIEA population is further trained by the Back Propagation (BP) algorithm to improve the MMNN parameters supplied by the QIEA. Also, for each prediction model generated, it uses a behavioral statistical test and a phase fix procedure to adjust time phase distortions observed in stock market time series. Furthermore, an experimental analysis is conducted with the proposed method through four Brazilian stock market time series, and the achieved results are discussed and compared to results found with random walk models and the previously introduced Time-delay Added Evolutionary Forecasting (TAEF) and Morphological-Rank-Linear Time-lag Added Evolutionary Forecasting (MRLTAEF) methods. Copyright © 2010 Elsevier Ltd. All rights reserved.

  8. Classification of time series patterns from complex dynamic systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schryver, J.C.; Rao, N.

    1998-07-01

    An increasing availability of high-performance computing and data storage media at decreasing cost is making possible the proliferation of large-scale numerical databases and data warehouses. Numeric warehousing enterprises on the order of hundreds of gigabytes to terabytes are a reality in many fields such as finance, retail sales, process systems monitoring, biomedical monitoring, surveillance and transportation. Large-scale databases are becoming more accessible to larger user communities through the internet, web-based applications and database connectivity. Consequently, most researchers now have access to a variety of massive datasets. This trend will probably only continue to grow over the next several years. Unfortunately, the availability of integrated tools to explore, analyze and understand the data warehoused in these archives is lagging far behind the ability to gain access to the same data. In particular, locating and identifying patterns of interest in numerical time series data is an increasingly important problem for which there are few available techniques. Temporal pattern recognition poses many interesting problems in classification, segmentation, prediction, diagnosis and anomaly detection. This research focuses on the problem of classification or characterization of numerical time series data. Highway vehicles and their drivers are examples of complex dynamic systems (CDS) which are being used by transportation agencies for field testing to generate large-scale time series datasets. Tools for effective analysis of numerical time series in databases generated by highway vehicle systems are not yet available, or have not been adapted to the target problem domain. However, analysis tools from similar domains may be adapted to the problem of classification of numerical time series data.

  9. Applying "domino" model to study dipolar geomagnetic field reversals and secular variation

    NASA Astrophysics Data System (ADS)

    Peqini, Klaudio; Duka, Bejo

    2014-05-01

    Aiming to understand the physical processes underlying geomagnetic field reversals, different numerical models have been conceived. We considered the so-called "domino" model, an Ising-Heisenberg model of interacting magnetic spins aligned along a ring [Mazaud and Laj, EPSL, 1989; Mori et al., arXiv:1110.5062v2, 2012]. We will present here some results which are slightly different from the already published results, and will give our interpretation of the differences. Following empirical studies of long series of the axial magnetic moment (dipolar moment or "magnetization") generated by the model while varying all model parameters, we defined the set of parameters that yields the longest mean time between reversals. Using this set of parameters, a short time series (about 10,000 years) of axial magnetic moment was generated. After de-noising the fluctuation of this time series, we compared it with the series of dipolar magnetic moment values supplied by the CALS10K.1b model for the last 10,000 years. We found similar behavior in both series, even though the "domino" model could not supply a full explanation of the geomagnetic field secular variation (SV). In a similar way, we will compare a 14,000-year-long series with the dipolar magnetic moment obtained from the SHA.DIF.14k model [Pavón-Carrasco et al., EPSL, 2014].

  10. Sensor sentinel computing device

    DOEpatents

    Damico, Joseph P.

    2016-08-02

    Technologies pertaining to authenticating data output by sensors in an industrial environment are described herein. A sensor sentinel computing device receives time-series data from a sensor by way of a wireline connection. The sensor sentinel computing device generates a validation signal that is a function of the time-series signal. The sensor sentinel computing device then transmits the validation signal to a programmable logic controller in the industrial environment.

  11. Advanced Space Shuttle simulation model

    NASA Technical Reports Server (NTRS)

    Tatom, F. B.; Smith, S. R.

    1982-01-01

A non-recursive model (based on von Karman spectra) for atmospheric turbulence along the flight path of the shuttle orbiter was developed. It provides for simulation of instantaneous vertical and horizontal gusts at the vehicle center-of-gravity, and also for simulation of instantaneous gust gradients. Based on this model, time series for both gusts and gust gradients were generated and stored on a series of magnetic tapes, entitled Shuttle Simulation Turbulence Tapes (SSTT). The time series are designed to represent atmospheric turbulence from ground level to an altitude of 120,000 meters. A description of the turbulence generation procedure is provided. The results of validating the simulated turbulence are described. Conclusions and recommendations are presented. One-dimensional von Karman spectra are tabulated, and the minimum frequency simulated is discussed. The results of spectral and statistical analyses of the SSTT are presented.
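
    A minimal sketch of the general idea behind such simulated turbulence: synthesize a gust time series whose power spectrum approximates a one-dimensional von Karman form by assigning random phases to spectral amplitudes and inverse-transforming. The spectrum constants, amplitude scaling, and parameter values below are illustrative assumptions, not those of the SSTT.

    ```python
    import numpy as np

    def von_karman_psd(f, sigma=1.0, L=200.0, V=50.0):
        # Illustrative 1-D longitudinal von Karman gust spectrum.
        # f: frequency [Hz], sigma: gust std dev [m/s], L: length scale [m], V: airspeed [m/s]
        Lf = L / V
        return 2.0 * sigma**2 * Lf / (1.0 + (1.339 * 2.0 * np.pi * Lf * f) ** 2) ** (5.0 / 6.0)

    def synthesize_gusts(n=8192, dt=0.05, seed=0):
        # Draw random phases, shape spectral amplitudes by the target PSD, inverse-FFT.
        # The amplitude scaling is approximate and for illustration only.
        rng = np.random.default_rng(seed)
        freqs = np.fft.rfftfreq(n, d=dt)
        psd = von_karman_psd(np.maximum(freqs, freqs[1]))   # avoid f = 0 in the PSD
        amp = np.sqrt(psd * n / (2.0 * dt))
        phases = rng.uniform(0.0, 2.0 * np.pi, size=freqs.size)
        spectrum = amp * np.exp(1j * phases)
        spectrum[0] = 0.0                                   # enforce zero-mean gusts
        return np.fft.irfft(spectrum, n=n)

    gusts = synthesize_gusts()
    print("simulated gust std dev:", gusts.std())
    ```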

  12. Linear time-varying models can reveal non-linear interactions of biomolecular regulatory networks using multiple time-series data.

    PubMed

    Kim, Jongrae; Bates, Declan G; Postlethwaite, Ian; Heslop-Harrison, Pat; Cho, Kwang-Hyun

    2008-05-15

    Inherent non-linearities in biomolecular interactions make the identification of network interactions difficult. One of the principal problems is that all methods based on the use of linear time-invariant models will have fundamental limitations in their capability to infer certain non-linear network interactions. Another difficulty is the multiplicity of possible solutions, since, for a given dataset, there may be many different possible networks which generate the same time-series expression profiles. A novel algorithm for the inference of biomolecular interaction networks from temporal expression data is presented. Linear time-varying models, which can represent a much wider class of time-series data than linear time-invariant models, are employed in the algorithm. From time-series expression profiles, the model parameters are identified by solving a non-linear optimization problem. In order to systematically reduce the set of possible solutions for the optimization problem, a filtering process is performed using a phase-portrait analysis with random numerical perturbations. The proposed approach has the advantages of not requiring the system to be in a stable steady state, of using time-series profiles which have been generated by a single experiment, and of allowing non-linear network interactions to be identified. The ability of the proposed algorithm to correctly infer network interactions is illustrated by its application to three examples: a non-linear model for cAMP oscillations in Dictyostelium discoideum, the cell-cycle data for Saccharomyces cerevisiae and a large-scale non-linear model of a group of synchronized Dictyostelium cells. The software used in this article is available from http://sbie.kaist.ac.kr/software
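
    The identification step can be illustrated with a much-simplified sketch: fitting time-varying interaction matrices to profiles by windowed least squares. This is only a stand-in for the paper's constrained non-linear optimization and phase-portrait filtering; the function and toy data below are hypothetical.

    ```python
    import numpy as np

    def fit_ltv_model(X, window=10):
        # Fit x(t+1) ~= A(t) x(t) over sliding windows by least squares.
        # X: (T, n) array of time-series profiles; returns one (n, n) matrix per window.
        T, n = X.shape
        mats = []
        for start in range(T - window):
            Xt = X[start:start + window - 1]              # states at time t
            Xt1 = X[start + 1:start + window]             # states at time t+1
            At, *_ = np.linalg.lstsq(Xt, Xt1, rcond=None)  # solve Xt1 ~= Xt @ At
            mats.append(At.T)
        return mats

    # Toy usage: two variables with a slowly drifting coupling plus small noise
    rng = np.random.default_rng(0)
    T = 200
    x = np.zeros((T, 2))
    x[0] = [1.0, 0.5]
    for t in range(T - 1):
        a = 0.9 + 0.05 * np.sin(t / 30.0)
        x[t + 1, 0] = a * x[t, 0] - 0.1 * x[t, 1] + 0.01 * rng.standard_normal()
        x[t + 1, 1] = 0.2 * x[t, 0] + 0.8 * x[t, 1] + 0.01 * rng.standard_normal()
    mats = fit_ltv_model(x)
    print(len(mats), mats[0].shape)
    ```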

  13. The detection of local irreversibility in time series based on segmentation

    NASA Astrophysics Data System (ADS)

    Teng, Yue; Shang, Pengjian

    2018-06-01

We propose a strategy for the detection of local irreversibility in stationary time series based on multiple scales. The detection is useful for evaluating the displacement of irreversibility toward local skewness. By means of this method, we can effectively examine the local irreversible fluctuations of a time series as the scale changes. The method was applied to simulated nonlinear signals generated by the ARFIMA process and the logistic map to show how the irreversibility functions react to increasing scale. The method was also applied to financial market series, i.e., American, Chinese and European markets. The local irreversibility for the different markets demonstrates distinct characteristics. Simulations and real data support the need to explore local irreversibility.

  14. Neural networks and traditional time series methods: a synergistic combination in state economic forecasts.

    PubMed

    Hansen, J V; Nelson, R D

    1997-01-01

Ever since the initial planning for the 1997 Utah legislative session, neural-network forecasting techniques have provided valuable insights for analysts forecasting tax revenues. These revenue estimates are critically important since agency budgets, support for education, and improvements to infrastructure all depend on their accuracy. Underforecasting generates windfalls that concern taxpayers, whereas overforecasting produces budget shortfalls that cause inadequately funded commitments. The pattern-finding ability of neural networks gives insightful and alternative views of the seasonal and cyclical components commonly found in economic time series data. Two applications of neural networks to revenue forecasting clearly demonstrate how these models complement traditional time series techniques. In the first, preoccupation with a potential downturn in the economy distracts analysis based on traditional time series methods so that it overlooks an emerging new phenomenon in the data. In this case, neural networks identify the new pattern that then allows modification of the time series models and finally gives more accurate forecasts. In the second application, data structure found by traditional statistical tools allows analysts to provide neural networks with important information that the networks then use to create more accurate models. In summary, for the Utah revenue outlook, the insights that result from a portfolio of forecasts that includes neural networks exceed the understanding generated from strictly statistical forecasting techniques. In this case, the synergy clearly results in the whole of the portfolio of forecasts being more accurate than the sum of the individual parts.

  15. Computing the Lyapunov spectrum of a dynamical system from an observed time series

    NASA Technical Reports Server (NTRS)

    Brown, Reggie; Bryant, Paul; Abarbanel, Henry D. I.

    1991-01-01

The paper examines the problem of accurately determining, from an observed time series, the Lyapunov exponents for the dynamical system generating the data. It is shown that, even with very large data sets, it is clearly advantageous to utilize local neighborhood-to-neighborhood mappings with higher-order Taylor series rather than just local linear maps. This procedure is demonstrated on the Henon and Ikeda maps of the plane, the Lorenz system of three ordinary differential equations, and the Mackey-Glass delay differential equation.

  16. Constructing networks from a dynamical system perspective for multivariate nonlinear time series.

    PubMed

    Nakamura, Tomomichi; Tanizawa, Toshihiro; Small, Michael

    2016-03-01

    We describe a method for constructing networks for multivariate nonlinear time series. We approach the interaction between the various scalar time series from a deterministic dynamical system perspective and provide a generic and algorithmic test for whether the interaction between two measured time series is statistically significant. The method can be applied even when the data exhibit no obvious qualitative similarity: a situation in which the naive method utilizing the cross correlation function directly cannot correctly identify connectivity. To establish the connectivity between nodes we apply the previously proposed small-shuffle surrogate (SSS) method, which can investigate whether there are correlation structures in short-term variabilities (irregular fluctuations) between two data sets from the viewpoint of deterministic dynamical systems. The procedure to construct networks based on this idea is composed of three steps: (i) each time series is considered as a basic node of a network, (ii) the SSS method is applied to verify the connectivity between each pair of time series taken from the whole multivariate time series, and (iii) the pair of nodes is connected with an undirected edge when the null hypothesis cannot be rejected. The network constructed by the proposed method indicates the intrinsic (essential) connectivity of the elements included in the system or the underlying (assumed) system. The method is demonstrated for numerical data sets generated by known systems and applied to several experimental time series.
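
    The three-step construction can be sketched as follows, with a plain shuffle-surrogate correlation test on first differences standing in for the small-shuffle surrogate (SSS) method; the decision rule, significance level, and toy data are illustrative assumptions, not the published procedure.

    ```python
    import numpy as np

    def connected(x, y, n_surr=200, alpha=0.05, rng=None):
        # Decide whether two series share short-term structure by comparing the
        # correlation of their first differences against shuffled surrogates.
        # (A simple stand-in for the SSS hypothesis test described in the abstract.)
        rng = rng or np.random.default_rng(0)
        dx, dy = np.diff(x), np.diff(y)
        stat = abs(np.corrcoef(dx, dy)[0, 1])
        surr = np.array([abs(np.corrcoef(rng.permutation(dx), dy)[0, 1])
                         for _ in range(n_surr)])
        # Link the pair when the observed statistic exceeds the surrogate quantile
        return stat > np.quantile(surr, 1.0 - alpha)

    def build_network(series):
        # Steps (i)-(iii): series are nodes; undirected edges where the test indicates connectivity
        n = len(series)
        return [(i, j) for i in range(n) for j in range(i + 1, n)
                if connected(series[i], series[j])]

    # Toy usage: three series, the first two coupled
    rng = np.random.default_rng(1)
    s0 = np.cumsum(rng.normal(size=500))
    s1 = s0 + rng.normal(scale=0.5, size=500)
    s2 = np.cumsum(rng.normal(size=500))
    print(build_network([s0, s1, s2]))
    ```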

  17. Characterization of chaotic attractors under noise: A recurrence network perspective

    NASA Astrophysics Data System (ADS)

    Jacob, Rinku; Harikrishnan, K. P.; Misra, R.; Ambika, G.

    2016-12-01

We undertake a detailed numerical investigation to understand how the addition of white and colored noise to a chaotic time series changes the topology and the structure of the underlying attractor reconstructed from the time series. We use the methods and measures of the recurrence plot and recurrence network generated from the time series for this analysis. We explicitly show that the addition of noise obscures the property of recurrence of trajectory points in the phase space, which is the hallmark of every dynamical system. However, the structure of the attractor is found to be robust even up to high noise levels of 50%. An advantage of recurrence network measures over the conventional nonlinear measures is that they can be applied to short and non-stationary time series data. By using the results obtained from the above analysis, we go on to analyse the light curves from a dominant black hole system and show that the recurrence network measures are capable of identifying the nature of noise contamination in a time series.
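
    A recurrence network of the kind analyzed here can be sketched directly from a scalar series: delay-embed, threshold the pairwise distances, and read off network measures such as the mean degree. The embedding parameters, threshold choice, and the noisy logistic-map example below are illustrative assumptions.

    ```python
    import numpy as np

    def recurrence_network(x, dim=3, tau=1, eps=None):
        # Build an unweighted, undirected recurrence network from a scalar series.
        # Nodes are delay-embedded state vectors; two nodes are linked when their
        # distance falls below the recurrence threshold eps.
        n = len(x) - (dim - 1) * tau
        emb = np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])
        d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
        if eps is None:
            eps = np.quantile(d[d > 0], 0.1)    # fix the recurrence rate near 10%
        A = (d < eps).astype(int)
        np.fill_diagonal(A, 0)
        return A

    # Toy usage: logistic-map series with additive noise, mimicking noise contamination
    rng = np.random.default_rng(0)
    x = np.empty(1200); x[0] = 0.4
    for t in range(1199):
        x[t + 1] = 4.0 * x[t] * (1.0 - x[t])
    x += 0.05 * rng.standard_normal(x.size)
    A = recurrence_network(x[200:])
    print("mean degree:", A.sum(axis=1).mean())
    ```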

  18. Optimal Reorganization of NASA Earth Science Data for Enhanced Accessibility and Usability for the Hydrology Community

    NASA Technical Reports Server (NTRS)

    Teng, William; Rui, Hualan; Strub, Richard; Vollmer, Bruce

    2016-01-01

    A long-standing "Digital Divide" in data representation exists between the preferred way of data access by the hydrology community and the common way of data archival by earth science data centers. Typically, in hydrology, earth surface features are expressed as discrete spatial objects (e.g., watersheds), and time-varying data are contained in associated time series. Data in earth science archives, although stored as discrete values (of satellite swath pixels or geographical grids), represent continuous spatial fields, one file per time step. This Divide has been an obstacle, specifically, between the Consortium of Universities for the Advancement of Hydrologic Science, Inc. and NASA earth science data systems. In essence, the way data are archived is conceptually orthogonal to the desired method of access. Our recent work has shown an optimal method of bridging the Divide, by enabling operational access to long-time series (e.g., 36 years of hourly data) of selected NASA datasets. These time series, which we have termed "data rods," are pre-generated or generated on-the-fly. This optimal solution was arrived at after extensive investigations of various approaches, including one based on "data curtains." The on-the-fly generation of data rods uses "data cubes," NASA Giovanni, and parallel processing. The optimal reorganization of NASA earth science data has significantly enhanced the access to and use of the data for the hydrology user community.

  19. Conventional and advanced time series estimation: application to the Australian and New Zealand Intensive Care Society (ANZICS) adult patient database, 1993-2006.

    PubMed

    Moran, John L; Solomon, Patricia J

    2011-02-01

Time series analysis has seen limited application in the biomedical literature. The utility of conventional and advanced time series estimators was explored for intensive care unit (ICU) outcome series. Monthly mean time series, 1993-2006, for hospital mortality, severity-of-illness score (APACHE III), ventilation fraction and patient type (medical and surgical), were generated from the Australia and New Zealand Intensive Care Society adult patient database. Analyses encompassed geographical seasonal mortality patterns, series structural time changes, mortality series volatility using autoregressive moving average and Generalized Autoregressive Conditional Heteroscedasticity models in which predicted variances are updated adaptively, and bivariate and multivariate (vector error correction models) cointegrating relationships between series. The mortality series exhibited marked seasonality, a declining mortality trend and substantial autocorrelation beyond 24 lags. Mortality increased in winter months (July-August); the medical series featured annual cycling, whereas the surgical demonstrated long and short (3-4 months) cycling. Series structural breaks were apparent in January 1995 and December 2002. The covariance-stationary first-differenced mortality series was consistent with a seasonal autoregressive moving average process; the observed conditional-variance volatility (1993-1995) and residual Autoregressive Conditional Heteroscedasticity effects entailed a Generalized Autoregressive Conditional Heteroscedasticity model, preferred by information criterion and mean model forecast performance. Bivariate cointegration, indicating long-term equilibrium relationships, was established between mortality and severity-of-illness scores at the database level and for categories of ICUs. Multivariate cointegration was demonstrated for {log APACHE III score, log ICU length of stay, ICU mortality and ventilation fraction}. A systems approach to understanding series time-dependence may be established using conventional and advanced econometric time series estimators. © 2010 Blackwell Publishing Ltd.

  20. Adventures in Modern Time Series Analysis: From the Sun to the Crab Nebula and Beyond

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey

    2014-01-01

With the generation of long, precise, and finely sampled time series, the Age of Digital Astronomy is uncovering and elucidating energetic dynamical processes throughout the Universe. Fulfilling these opportunities requires effective data analysis techniques that rapidly and automatically implement advanced concepts. The Time Series Explorer, under development in collaboration with Tom Loredo, provides tools ranging from simple but optimal histograms to time and frequency domain analysis for arbitrary data modes with any time sampling. Much of this development owes its existence to Joe Bredekamp and the encouragement he provided over several decades. Sample results for solar chromospheric activity, gamma-ray activity in the Crab Nebula, active galactic nuclei and gamma-ray bursts will be displayed.

  1. Performance of time-series methods in forecasting the demand for red blood cell transfusion.

    PubMed

    Pereira, Arturo

    2004-05-01

Planning future blood collection efforts must be based on adequate forecasts of transfusion demand. In this study, univariate time-series methods were investigated for their performance in forecasting the monthly demand for RBCs at one tertiary-care, university hospital. Three time-series methods were investigated: autoregressive integrated moving average (ARIMA), the Holt-Winters family of exponential smoothing models, and one neural-network-based method. The time series consisted of the monthly demand for RBCs from January 1988 to December 2002 and was divided into two segments: the older one was used to fit or train the models, and the more recent one to test the accuracy of predictions. Performance was compared across forecasting methods by calculating goodness-of-fit statistics, the percentage of months in which forecast-based supply would have met the RBC demand (coverage rate), and the outdate rate. The RBC transfusion series was best fitted by a seasonal ARIMA(0,1,1)(0,1,1)(12) model. Over 1-year time horizons, forecasts generated by ARIMA or exponential smoothing lay within the +/- 10 percent interval of the real RBC demand in 79 percent of months (62% in the case of neural networks). The coverage rate for the three methods was 89, 91, and 86 percent, respectively. Over 2-year time horizons, exponential smoothing largely outperformed the other methods. Predictions by exponential smoothing lay within the +/- 10 percent interval of real values in 75 percent of the 24 forecasted months, and the coverage rate was 87 percent. Over 1-year time horizons, predictions of RBC demand generated by ARIMA or exponential smoothing are accurate enough to be of help in the planning of blood collection efforts. For longer time horizons, exponential smoothing outperforms the other forecasting methods.
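
    The two main forecasting approaches compared in the study can be reproduced in outline with statsmodels (assuming that library is available); the monthly demand series below is synthetic and purely illustrative, as is the +/- 10 percent accuracy check.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX
    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    # Illustrative monthly RBC demand with trend and annual seasonality (not real data)
    rng = np.random.default_rng(0)
    idx = pd.date_range("1988-01", periods=180, freq="MS")
    y = pd.Series(1000 + 2 * np.arange(180) + 80 * np.sin(2 * np.pi * np.arange(180) / 12)
                  + rng.normal(scale=30, size=180), index=idx)
    train, test = y[:-12], y[-12:]

    # Seasonal ARIMA(0,1,1)(0,1,1)12, the structure reported in the abstract
    sarima = SARIMAX(train, order=(0, 1, 1), seasonal_order=(0, 1, 1, 12)).fit(disp=False)
    fc_arima = sarima.forecast(steps=12)

    # Additive Holt-Winters exponential smoothing as the comparator
    hw = ExponentialSmoothing(train, trend="add", seasonal="add", seasonal_periods=12).fit()
    fc_hw = hw.forecast(12)

    # Share of months in which the forecast lies within +/- 10% of actual demand
    within = lambda fc: float(np.mean(np.abs(fc - test) / test <= 0.10))
    print("ARIMA:", within(fc_arima), "Holt-Winters:", within(fc_hw))
    ```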

  2. Robust extrema features for time-series data analysis.

    PubMed

    Vemulapalli, Pramod K; Monga, Vishal; Brennan, Sean N

    2013-06-01

The extraction of robust features for comparing and analyzing time series is a fundamentally important problem. Research efforts in this area encompass dimensionality reduction using popular signal analysis tools such as the discrete Fourier and wavelet transforms, various distance metrics, and the extraction of interest points from time series. Recently, extrema features for analysis of time-series data have assumed increasing significance because of their natural robustness under a variety of practical distortions, their economy of representation, and their computational benefits. Invariably, the process of encoding extrema features is preceded by filtering of the time series with an intuitively motivated filter (e.g., for smoothing), and subsequent thresholding to identify robust extrema. We define the properties of robustness, uniqueness, and cardinality as a means to identify the design choices available in each step of the feature generation process. Unlike existing methods, which utilize filters "inspired" by either domain knowledge or intuition, we explicitly optimize the filter based on training time series to maximize the robustness of the extracted extrema features. We demonstrate further that the underlying filter optimization problem reduces to an eigenvalue problem and has a tractable solution. An encoding technique that enhances control over cardinality and uniqueness is also presented. Experimental results obtained for the problem of time series subsequence matching establish the merits of the proposed algorithm.
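
    The encode step (filter the series, then threshold its extrema) can be sketched with a fixed moving-average filter; the cited work instead learns the filter from training series via an eigenvalue problem, so the filter, threshold, and toy signal below are illustrative placeholders.

    ```python
    import numpy as np
    from scipy.signal import argrelextrema

    def extrema_features(x, filt=None, min_prominence=0.1):
        # Smooth the series with a (here: fixed) filter, then keep extrema whose
        # deviation from the local median exceeds a threshold.
        if filt is None:
            filt = np.ones(5) / 5.0                  # simple moving-average filter
        xs = np.convolve(x, filt, mode="same")
        maxima = argrelextrema(xs, np.greater)[0]
        minima = argrelextrema(xs, np.less)[0]
        idx = np.sort(np.concatenate([maxima, minima]))
        keep = [i for i in idx
                if abs(xs[i] - np.median(xs[max(0, i - 10): i + 10])) > min_prominence]
        # Encode each robust extremum as (position, filtered value)
        return [(int(i), float(xs[i])) for i in keep]

    rng = np.random.default_rng(0)
    t = np.linspace(0, 6 * np.pi, 600)
    x = np.sin(t) + 0.2 * rng.standard_normal(t.size)
    print(len(extrema_features(x)))
    ```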

  3. Time series analysis of InSAR data: Methods and trends

    NASA Astrophysics Data System (ADS)

    Osmanoğlu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cabral-Cano, Enrique

    2016-05-01

Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.

  4. Time Series Analysis of Insar Data: Methods and Trends

    NASA Technical Reports Server (NTRS)

    Osmanoglu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cano-Cabral, Enrique

    2015-01-01

Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.

  5. Time series evapotranspiration maps at a regional scale: A methodology, evaluation, and their use in water resources management

    NASA Astrophysics Data System (ADS)

    Gowda, P. H.

    2016-12-01

Evapotranspiration (ET) is an important process in ecosystems' water budget and is closely linked to their productivity. Therefore, regional-scale daily time series ET maps developed at high and medium resolutions have large utility in studying the carbon-energy-water nexus and managing water resources. There are efforts to develop such datasets on a regional to global scale, but they are often faced with the limitations of spatial-temporal resolution tradeoffs in satellite remote sensing technology. In this study, we developed frameworks for generating high and medium resolution daily ET maps from Landsat and MODIS (Moderate Resolution Imaging Spectroradiometer) data, respectively. For developing high resolution (30-m) daily time series ET maps with Landsat TM data, the series version of the Two Source Energy Balance (TSEB) model was used to compute sensible and latent heat fluxes of soil and canopy separately. Landsat 5 (2000-2011) and Landsat 8 (2013-2014) imagery for row 28/35 and 27/36 covering central Oklahoma was used. MODIS data (2001-2014) covering Oklahoma and the Texas Panhandle were used to develop medium resolution (250-m) time series daily ET maps with the SEBS (Surface Energy Balance System) model. An extensive network of weather stations managed by the Texas High Plains ET Network and Oklahoma Mesonet was used to generate spatially interpolated inputs of air temperature, relative humidity, wind speed, solar radiation, pressure, and reference ET. A linear interpolation sub-model was used to estimate the daily ET between the image acquisition days. Accuracy assessment of the daily ET maps was done against eddy covariance data from two grassland sites at El Reno, OK. Statistical results indicated good performance by the modeling frameworks developed for deriving time series ET maps. Results indicated that the proposed ET mapping framework is suitable for deriving daily time series ET maps at regional scale with Landsat and MODIS data.
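
    The linear interpolation sub-model between acquisition days can be sketched per pixel with np.interp; the function name, array shapes, and toy values below are hypothetical.

    ```python
    import numpy as np

    def interpolate_daily_et(acq_days, et_maps, n_days):
        # Linearly interpolate per-pixel ET between acquisition days.
        # acq_days: 1-D array of day-of-year for each ET map (ascending)
        # et_maps:  (k, rows, cols) stack of ET maps on those days
        # n_days:   total number of days in the output daily series
        k, rows, cols = et_maps.shape
        days = np.arange(1, n_days + 1)
        flat = et_maps.reshape(k, -1)
        daily = np.empty((n_days, rows * cols), dtype=float)
        for p in range(flat.shape[1]):
            daily[:, p] = np.interp(days, acq_days, flat[:, p])
        return daily.reshape(n_days, rows, cols)

    # Toy usage: three 4x4 ET maps acquired on days 10, 26 and 42
    acq = np.array([10, 26, 42])
    maps = np.stack([np.full((4, 4), v, dtype=float) for v in (2.0, 4.0, 3.0)])
    daily = interpolate_daily_et(acq, maps, 50)
    print(daily[18, 0, 0])   # ET on day 19, between the first two acquisitions
    ```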

  6. Quantifying Surface Water Dynamics at 30 Meter Spatial Resolution in the North American High Northern Latitudes 1991-2011

    NASA Technical Reports Server (NTRS)

    Carroll, Mark; Wooten, Margaret; DiMiceli, Charlene; Sohlberg, Robert; Kelly, Maureen

    2016-01-01

    The availability of a dense time series of satellite observations at moderate (30 m) spatial resolution is enabling unprecedented opportunities for understanding ecosystems around the world. A time series of data from Landsat was used to generate a series of three maps at decadal time step to show how surface water has changed from 1991 to 2011 in the high northern latitudes of North America. Previous attempts to characterize the change in surface water in this region have been limited in either spatial or temporal resolution, or both. This series of maps was generated for the NASA Arctic and Boreal Vulnerability Experiment (ABoVE), which began in fall 2015. These maps show a nominal extent of surface water by using multiple observations to make a single map for each time step. This increases the confidence that any detected changes are related to climate or ecosystem changes not simply caused by short duration weather events such as flood or drought. The methods and comparison to other contemporary maps of the region are presented here. Initial verification results indicate 96% producer accuracy and 54% user accuracy when compared to 2-m resolution World View-2 data. All water bodies that were omitted were one Landsat pixel or smaller, hence below detection limits of the instrument.

  7. Time-series animation techniques for visualizing urban growth

    USGS Publications Warehouse

    Acevedo, W.; Masuoka, P.

    1997-01-01

Time-series animation is a visually intuitive way to display urban growth. Animations of land-use change for the Baltimore-Washington region were generated by showing a series of images one after the other in sequential order. Before creating an animation, various issues which will affect the appearance of the animation should be considered, including the number of original data frames to use, the optimal animation display speed, the number of intermediate frames to create between the known frames, and the output media on which the animations will be displayed. To create new frames between the known years of data, the change in each theme (i.e. urban development, water bodies, transportation routes) must be characterized and an algorithm developed to create the in-between frames. Example time-series animations were created using a temporal GIS database of the Baltimore-Washington area. Creating the animations involved generating raster images of the urban development, water bodies, and principal transportation routes; overlaying the raster images on a background image; and importing the frames to a movie file. Three-dimensional perspective animations were created by draping each image over digital elevation data prior to importing the frames to a movie file. © 1997 Elsevier Science Ltd.

  8. The conditional resampling model STARS: weaknesses of the modeling concept and development

    NASA Astrophysics Data System (ADS)

    Menz, Christoph

    2016-04-01

The Statistical Analogue Resampling Scheme (STARS) is based on a modeling concept of Werner and Gerstengarbe (1997). The model uses a conditional resampling technique to create a simulation time series from daily observations. Unlike other time series generators (such as stochastic weather generators), STARS only needs a linear regression specification of a single variable as the target condition for the resampling. Since its first implementation, the algorithm was further extended in order to allow for a spatially distributed trend signal and to preserve the seasonal cycle and the autocorrelation of the observation time series (Orlovsky, 2007; Orlovsky et al., 2008). This evolved version was successfully used in several climate impact studies. However, a detailed evaluation of the simulations revealed two fundamental weaknesses of the utilized resampling technique. 1. The restriction of the resampling condition to a single individual variable can lead to a misinterpretation of the change signal of other variables when the model is applied to a multivariate time series (F. Wechsung and M. Wechsung, 2014). As one example, the short-term correlations between precipitation and temperature (cooling of the near-surface air layer after a rainfall event) can be misinterpreted as a climatic change signal in the simulation series. 2. The model restricts the linear regression specification to the annual mean time series, refusing the specification of seasonally varying trends. To overcome these fundamental weaknesses, a redevelopment of the whole algorithm was done. The poster discusses the main weaknesses of the earlier model implementation and the methods applied to overcome these in the new version. Based on the new model, idealized simulations were conducted to illustrate the enhancement.

  9. Using Evolved Fuzzy Neural Networks for Injury Detection from Isokinetic Curves

    NASA Astrophysics Data System (ADS)

    Couchet, Jorge; Font, José María; Manrique, Daniel

In this paper we propose an evolutionary fuzzy neural networks system for extracting knowledge from a set of time series containing medical information. The series represent isokinetic curves obtained from a group of patients exercising the knee joint on an isokinetic dynamometer. The system has two parts: i) it analyses the time series input in order to generate a simplified model of an isokinetic curve; ii) it applies a grammar-guided genetic program to obtain a knowledge base represented by a fuzzy neural network. Once the knowledge base has been generated, the system is able to perform knee injury detection. The results suggest that evolved fuzzy neural networks perform better than non-evolutionary approaches and have a high accuracy rate during both the training and testing phases. Additionally, they are robust, as the system is able to self-adapt to changes in the problem without human intervention.

  10. The ANTARES observation network

    NASA Astrophysics Data System (ADS)

    Dogliotti, Ana I.; Ulloa, Osvaldo; Muller-Karger, Frank; Hu, Chuanmin; Murch, Brock; Taylor, Charles; Yuras, Gabriel; Kampel, Milton; Lutz, Vivian; Gaeta, Salvador; Gagliardini, Domingo A.; Garcia, Carlos A. E.; Klein, Eduardo; Helbling, Walter; Varela, Ramon; Barbieri, Elena; Negri, Ruben; Frouin, Robert; Sathyendranath, Shubha; Platt, Trevor

    2005-08-01

The ANTARES network seeks to understand the variability of the coastal environment on a continental scale and the local, regional, and global factors and processes that effect this change. The focus is on the coastal zones of South America and the Caribbean Sea. The initial approach includes developing time series of in situ and satellite-based environmental observations in coastal and oceanic regions. The network is constituted by experts who seek to exchange ideas, develop an infrastructure for mutual logistical and knowledge support, and link in situ time series of observations located around the Americas with real-time and historical satellite-derived time series of relevant products. A major objective is to generate information that will be distributed publicly and openly in the service of coastal ocean research, resource management, science-based policy making and education in the Americas. As a first stage, the network has linked oceanographic time series located in Argentina, Brazil, Chile and Venezuela. The group has also developed an online tool to examine satellite data collected with sensors such as NASA's MODIS. Specifically, continental-scale high-resolution (1 km) maps of chlorophyll and of sea surface temperature are generated and served daily over the web according to specifications of users within the ANTARES network. Other satellite-derived variables will be added as support for the network is solidified. ANTARES serves data and offers simple analysis tools that anyone can use, with the ultimate goal of improving coastal assessments, management and policies.

  11. Joint Seasonal ARMA Approach for Modeling of Load Forecast Errors in Planning Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hafen, Ryan P.; Samaan, Nader A.; Makarov, Yuri V.

    2014-04-14

To make informed and robust decisions in the probabilistic power system operation and planning process, it is critical to conduct multiple simulations of the generated combinations of wind and load parameters and their forecast errors to handle the variability and uncertainty of these time series. In order for the simulation results to be trustworthy, the simulated series must preserve the salient statistical characteristics of the real series. In this paper, we analyze day-ahead load forecast error data from multiple balancing authority locations and characterize statistical properties such as mean, standard deviation, autocorrelation, correlation between series, time-of-day bias, and time-of-day autocorrelation. We then construct and validate a seasonal autoregressive moving average (ARMA) model to model these characteristics, and use the model to jointly simulate day-ahead load forecast error series for all BAs.
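
    A much-simplified, single-series version of this simulation idea (a deterministic time-of-day bias plus an ARMA(1,1) error component, assuming statsmodels) might look like the sketch below; the parameter values are illustrative, and the joint multi-BA seasonal structure of the paper is not reproduced.

    ```python
    import numpy as np
    from statsmodels.tsa.arima_process import arma_generate_sample

    def simulate_forecast_errors(n_days=365, phi=0.6, theta=0.3, sigma=0.02, seed=0):
        # Hourly day-ahead load forecast errors (as a fraction of load):
        # deterministic time-of-day bias plus an ARMA(1,1) stochastic component.
        np.random.seed(seed)
        n = n_days * 24
        hours = np.tile(np.arange(24), n_days)
        tod_bias = 0.01 * np.sin(2.0 * np.pi * hours / 24.0)   # illustrative time-of-day bias
        ar = np.array([1.0, -phi])    # AR lag polynomial (statsmodels sign convention)
        ma = np.array([1.0, theta])   # MA lag polynomial
        noise = arma_generate_sample(ar, ma, nsample=n, scale=sigma, burnin=100)
        return tod_bias + noise

    err = simulate_forecast_errors()
    print("mean:", err.mean(), "std:", err.std())
    ```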

  12. Design of a 9-loop quasi-exponential waveform generator

    NASA Astrophysics Data System (ADS)

    Banerjee, Partha; Shukla, Rohit; Shyam, Anurag

    2015-12-01

We know that in an under-damped L-C-R series circuit, the current follows a damped sinusoidal waveform. But if a number of sinusoidal waveforms of decreasing time period, generated in an L-C-R circuit, are combined within the first quarter cycle of the time period, then a quasi-exponential output current waveform can be achieved. In an L-C-R series circuit, a quasi-exponential current waveform shows a rising current derivative and thereby finds many applications in pulsed power. Here, we have described the design and experimental details of a 9-loop quasi-exponential waveform generator, including the design details of the magnetic switches. In the experiment, an output current of 26 kA has been achieved. It is shown how well the experimentally obtained output current profile matches the numerically computed output.

  13. Design of a 9-loop quasi-exponential waveform generator.

    PubMed

    Banerjee, Partha; Shukla, Rohit; Shyam, Anurag

    2015-12-01

We know that in an under-damped L-C-R series circuit, the current follows a damped sinusoidal waveform. But if a number of sinusoidal waveforms of decreasing time period, generated in an L-C-R circuit, are combined within the first quarter cycle of the time period, then a quasi-exponential output current waveform can be achieved. In an L-C-R series circuit, a quasi-exponential current waveform shows a rising current derivative and thereby finds many applications in pulsed power. Here, we have described the design and experimental details of a 9-loop quasi-exponential waveform generator, including the design details of the magnetic switches. In the experiment, an output current of 26 kA has been achieved. It is shown how well the experimentally obtained output current profile matches the numerically computed output.

  14. IDSP- INTERACTIVE DIGITAL SIGNAL PROCESSOR

    NASA Technical Reports Server (NTRS)

    Mish, W. H.

    1994-01-01

    The Interactive Digital Signal Processor, IDSP, consists of a set of time series analysis "operators" based on the various algorithms commonly used for digital signal analysis work. The processing of a digital time series to extract information is usually achieved by the application of a number of fairly standard operations. However, it is often desirable to "experiment" with various operations and combinations of operations to explore their effect on the results. IDSP is designed to provide an interactive and easy-to-use system for this type of digital time series analysis. The IDSP operators can be applied in any sensible order (even recursively), and can be applied to single time series or to simultaneous time series. IDSP is being used extensively to process data obtained from scientific instruments onboard spacecraft. It is also an excellent teaching tool for demonstrating the application of time series operators to artificially-generated signals. IDSP currently includes over 43 standard operators. Processing operators provide for Fourier transformation operations, design and application of digital filters, and Eigenvalue analysis. Additional support operators provide for data editing, display of information, graphical output, and batch operation. User-developed operators can be easily interfaced with the system to provide for expansion and experimentation. Each operator application generates one or more output files from an input file. The processing of a file can involve many operators in a complex application. IDSP maintains historical information as an integral part of each file so that the user can display the operator history of the file at any time during an interactive analysis. IDSP is written in VAX FORTRAN 77 for interactive or batch execution and has been implemented on a DEC VAX-11/780 operating under VMS. The IDSP system generates graphics output for a variety of graphics systems. The program requires the use of Versaplot and Template plotting routines and IMSL Math/Library routines. These software packages are not included in IDSP. The virtual memory requirement for the program is approximately 2.36 MB. The IDSP system was developed in 1982 and was last updated in 1986. Versaplot is a registered trademark of Versatec Inc. Template is a registered trademark of Template Graphics Software Inc. IMSL Math/Library is a registered trademark of IMSL Inc.

  15. Time-Elastic Generative Model for Acceleration Time Series in Human Activity Recognition

    PubMed Central

    Munoz-Organero, Mario; Ruiz-Blazquez, Ramona

    2017-01-01

    Body-worn sensors in general and accelerometers in particular have been widely used in order to detect human movements and activities. The execution of each type of movement by each particular individual generates sequences of time series of sensed data from which specific movement related patterns can be assessed. Several machine learning algorithms have been used over windowed segments of sensed data in order to detect such patterns in activity recognition based on intermediate features (either hand-crafted or automatically learned from data). The underlying assumption is that the computed features will capture statistical differences that can properly classify different movements and activities after a training phase based on sensed data. In order to achieve high accuracy and recall rates (and guarantee the generalization of the system to new users), the training data have to contain enough information to characterize all possible ways of executing the activity or movement to be detected. This could imply large amounts of data and a complex and time-consuming training phase, which has been shown to be even more relevant when automatically learning the optimal features to be used. In this paper, we present a novel generative model that is able to generate sequences of time series for characterizing a particular movement based on the time elasticity properties of the sensed data. The model is used to train a stack of auto-encoders in order to learn the particular features able to detect human movements. The results of movement detection using a newly generated database with information on five users performing six different movements are presented. The generalization of results using an existing database is also presented in the paper. The results show that the proposed mechanism is able to obtain acceptable recognition rates (F = 0.77) even in the case of using different people executing a different sequence of movements and using different hardware. PMID:28208736

  16. Time-Elastic Generative Model for Acceleration Time Series in Human Activity Recognition.

    PubMed

    Munoz-Organero, Mario; Ruiz-Blazquez, Ramona

    2017-02-08

    Body-worn sensors in general and accelerometers in particular have been widely used in order to detect human movements and activities. The execution of each type of movement by each particular individual generates sequences of time series of sensed data from which specific movement related patterns can be assessed. Several machine learning algorithms have been used over windowed segments of sensed data in order to detect such patterns in activity recognition based on intermediate features (either hand-crafted or automatically learned from data). The underlying assumption is that the computed features will capture statistical differences that can properly classify different movements and activities after a training phase based on sensed data. In order to achieve high accuracy and recall rates (and guarantee the generalization of the system to new users), the training data have to contain enough information to characterize all possible ways of executing the activity or movement to be detected. This could imply large amounts of data and a complex and time-consuming training phase, which has been shown to be even more relevant when automatically learning the optimal features to be used. In this paper, we present a novel generative model that is able to generate sequences of time series for characterizing a particular movement based on the time elasticity properties of the sensed data. The model is used to train a stack of auto-encoders in order to learn the particular features able to detect human movements. The results of movement detection using a newly generated database with information on five users performing six different movements are presented. The generalization of results using an existing database is also presented in the paper. The results show that the proposed mechanism is able to obtain acceptable recognition rates ( F = 0.77) even in the case of using different people executing a different sequence of movements and using different hardware.

  17. Algorithms exploiting ultrasonic sensors for subject classification

    NASA Astrophysics Data System (ADS)

    Desai, Sachi; Quoraishee, Shafik

    2009-09-01

Proposed here is a series of techniques exploiting micro-Doppler ultrasonic sensors capable of characterizing various detected mammalian targets based on their physiological movements, captured as a series of robust features. A combination of unique and conventional digital signal processing techniques is employed, arranged in such a manner that they become capable of classifying a series of walkers. These feature-extraction processes develop a robust feature space capable of providing discrimination of various movements generated from bipeds and quadrupeds, further subdivided into large or small. These movements can be exploited to provide specific information about a given signature, dividing it into a series of subset signatures and exploiting wavelets to generate start/stop times. After viewing a series of spectrograms of the signature, we are able to see distinct differences, and utilizing kurtosis, we generate an envelope detector capable of isolating each of the corresponding step cycles generated during a walk. The walk cycle is defined as one complete sequence of walking/running, from the foot pushing off the ground and concluding when it returns to the ground. This time information segments the events that are readily seen in the spectrogram, but obscured in the temporal domain, into individual walk sequences. Each walking sequence is then translated into a three-dimensional waterfall plot defining the expected energy value associated with the motion at a particular instance of time and frequency. This value is repeatable for each particular class and can be employed to discriminate the events. Highly reliable classification is realized by exploiting a classifier trained on a candidate sample space derived from the associated gyrations created by the motion of actors of interest. The classifier developed herein provides a capability to classify events as adult humans, child humans, horses, and dogs at potentially high rates based on the tested sample space. The algorithm developed and described will provide utility for an underused sensor modality for human intrusion detection because of the current high rate of generated false alarms. The active ultrasonic sensor, coupled in a multi-modal sensor suite with binary, less descriptive sensors such as seismic devices, can realize a greater accuracy rate for the detection of persons of interest for homeland security purposes.

  18. Power estimation using simulations for air pollution time-series studies

    PubMed Central

    2012-01-01

Background: Estimation of power to assess associations of interest can be challenging for time-series studies of the acute health effects of air pollution because there are two dimensions of sample size (time-series length and daily outcome counts), and because these studies often use generalized linear models to control for complex patterns of covariation between pollutants and time trends, meteorology and possibly other pollutants. In general, statistical software packages for power estimation rely on simplifying assumptions that may not adequately capture this complexity. Here we examine the impact of various factors affecting power using simulations, with comparison of power estimates obtained from simulations with those obtained using statistical software. Methods: Power was estimated for various analyses within a time-series study of air pollution and emergency department visits using simulations for specified scenarios. Mean daily emergency department visit counts, model parameter value estimates and daily values for air pollution and meteorological variables from actual data (8/1/98 to 7/31/99 in Atlanta) were used to generate simulated daily outcome counts with specified temporal associations with air pollutants and randomly generated error based on a Poisson distribution. Power was estimated by conducting analyses of the association between simulated daily outcome counts and air pollution in 2000 data sets for each scenario. Power estimates from simulations and statistical software (G*Power and PASS) were compared. Results: In the simulation results, increasing time-series length and average daily outcome counts both increased power to a similar extent. Our results also illustrate the low power that can result from using outcomes with low daily counts or short time series, and the reduction in power that can accompany use of multipollutant models. Power estimates obtained using standard statistical software were very similar to those from the simulations when properly implemented; implementation, however, was not straightforward. Conclusions: These analyses demonstrate the similar impact on power of increasing time-series length versus increasing daily outcome counts, which has not previously been reported. Implementation of power software for these studies is discussed and guidance is provided. PMID:22995599

  19. Power estimation using simulations for air pollution time-series studies.

    PubMed

    Winquist, Andrea; Klein, Mitchel; Tolbert, Paige; Sarnat, Stefanie Ebelt

    2012-09-20

    Estimation of power to assess associations of interest can be challenging for time-series studies of the acute health effects of air pollution because there are two dimensions of sample size (time-series length and daily outcome counts), and because these studies often use generalized linear models to control for complex patterns of covariation between pollutants and time trends, meteorology and possibly other pollutants. In general, statistical software packages for power estimation rely on simplifying assumptions that may not adequately capture this complexity. Here we examine the impact of various factors affecting power using simulations, with comparison of power estimates obtained from simulations with those obtained using statistical software. Power was estimated for various analyses within a time-series study of air pollution and emergency department visits using simulations for specified scenarios. Mean daily emergency department visit counts, model parameter value estimates and daily values for air pollution and meteorological variables from actual data (8/1/98 to 7/31/99 in Atlanta) were used to generate simulated daily outcome counts with specified temporal associations with air pollutants and randomly generated error based on a Poisson distribution. Power was estimated by conducting analyses of the association between simulated daily outcome counts and air pollution in 2000 data sets for each scenario. Power estimates from simulations and statistical software (G*Power and PASS) were compared. In the simulation results, increasing time-series length and average daily outcome counts both increased power to a similar extent. Our results also illustrate the low power that can result from using outcomes with low daily counts or short time series, and the reduction in power that can accompany use of multipollutant models. Power estimates obtained using standard statistical software were very similar to those from the simulations when properly implemented; implementation, however, was not straightforward. These analyses demonstrate the similar impact on power of increasing time-series length versus increasing daily outcome counts, which has not previously been reported. Implementation of power software for these studies is discussed and guidance is provided.
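
    The simulation approach described in both records can be sketched with a bare-bones Poisson regression power calculation (assuming statsmodels); the covariate, effect size, and scenario values below are illustrative, and the seasonal and meteorological terms of the study's full models are omitted.

    ```python
    import numpy as np
    import statsmodels.api as sm

    def estimate_power(n_days=365, mean_count=20, beta=0.002, n_sim=200, alpha=0.05, seed=0):
        # Simulation-based power for a time-series Poisson regression of daily counts
        # on a single pollutant (illustrative covariate and effect size).
        rng = np.random.default_rng(seed)
        pollutant = rng.gamma(shape=4.0, scale=10.0, size=n_days)   # e.g. ug/m3
        X = sm.add_constant(pollutant)
        mu = mean_count * np.exp(beta * (pollutant - pollutant.mean()))
        rejections = 0
        for _ in range(n_sim):
            y = rng.poisson(mu)                                     # simulated daily counts
            fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
            if fit.pvalues[1] < alpha:                              # test on the pollutant term
                rejections += 1
        return rejections / n_sim

    print("estimated power:", estimate_power())
    ```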

  20. Real-time liquid-crystal atmosphere turbulence simulator with graphic processing unit.

    PubMed

    Hu, Lifa; Xuan, Li; Li, Dayu; Cao, Zhaoliang; Mu, Quanquan; Liu, Yonggang; Peng, Zenghui; Lu, Xinghai

    2009-04-27

    To generate time-evolving atmosphere turbulence in real time, a phase-generating method for our liquid-crystal (LC) atmosphere turbulence simulator (ATS) is derived based on the Fourier series (FS) method. A real matrix expression for generating turbulence phases is given and calculated with a graphic processing unit (GPU), the GeForce 8800 Ultra. A liquid crystal on silicon (LCOS) with 256x256 pixels is used as the turbulence simulator. The total time to generate a turbulence phase is about 7.8 ms for calculation and readout with the GPU. A parallel processing method of calculating and sending a picture to the LCOS is used to improve the simulating speed of our LC ATS. Therefore, the real-time turbulence phase-generation frequency of our LC ATS is up to 128 Hz. To our knowledge, it is the highest speed used to generate a turbulence phase in real time.

  1. Present-day deformation in Europe, as seen by the EPOS-GNSS prototype solution in double difference, and first co- and post-seismic displacements associated with 2016 Amatrice and Norcia earthquakes (Italy)

    NASA Astrophysics Data System (ADS)

    Socquet, Anne; Déprez, Aline; Cotte, Nathalie; Maubant, Louise; Walpersdorf, Andrea; Bato, Mary Grace

    2017-04-01

We present here a new pan-European velocity field, obtained by processing 500+ cGPS stations in double difference, in the framework of the implementation phase of the European Plate Observing System (EPOS) project. This prototype solution spans the 2000-2016 period and includes data from the RING, NOA, RENAG and European Permanent Network (EPN) cGPS networks. The data set is first split into daily sub-networks (between 8 and 14 sub-networks). The sub-networks consist of about 40 stations, with 2 overlapping stations. For each day and for each sub-network, the GAMIT processing is conducted independently. Once each sub-network achieves satisfactory results, a daily combination is performed in order to produce SINEX files. The chi-square value associated with the combination allows us to evaluate its quality. Eventually, a multi-year combination generates position time series for each station. Each time series is visualized, and the jumps associated with equipment changes (antenna or receiver) are estimated and corrected. This procedure allows us to generate daily solutions and position time series for all stations. The associated "interseismic" velocity field has then been estimated using a time series analysis with the MIDAS software, and compared to another independent estimate obtained by Kalman filtering with the globk software. In addition to this velocity field, we focus specifically on Italy and present a strain rate map as well as time series showing co- and post-seismic movements associated with the 2016 Amatrice and Norcia earthquakes.

  2. PULSE SYNTHESIZING GENERATOR

    DOEpatents

    Kerns, Q.A.

    1963-08-01

An electronic circuit for synthesizing electrical current pulses having very fast rise times includes several sine-wave generators tuned to progressively higher harmonic frequencies, with signal amplitudes and phases selectable according to the Fourier series of the waveform that is to be synthesized. Phase control is provided by periodically triggering the generators at precisely controlled times. The outputs of the generators are combined in a coaxial transmission line. Any frequency-dependent delays that occur in the transmission line can be readily compensated for so that the desired signal wave shape is obtained at the output of the line. (AEC)
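
    The synthesis principle (summing harmonically related sine waves with Fourier-series amplitudes and aligned phases to build a fast-rise edge) can be sketched numerically; the frequency, harmonic count, and amplitudes below are illustrative, not the patent's circuit values.

    ```python
    import numpy as np

    def synthesize_pulse(t, f0=1e6, n_harmonics=9):
        # Approximate a fast-rise square pulse by summing odd harmonics of f0 with
        # Fourier-series amplitudes (4/(pi*k)) and aligned phases, mimicking the idea
        # of combining several triggered sine-wave generators on one line.
        out = np.zeros_like(t)
        for k in range(1, 2 * n_harmonics, 2):          # odd harmonics 1, 3, 5, ...
            out += (4.0 / (np.pi * k)) * np.sin(2.0 * np.pi * k * f0 * t)
        return out

    t = np.linspace(0.0, 2e-6, 4000)
    pulse = synthesize_pulse(t)
    # The 10-90% rise time of the synthesized edge sharpens as harmonics are added
    print("peak amplitude:", pulse.max())
    ```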

  3. Identifying Autocorrelation Generated by Various Error Processes in Interrupted Time-Series Regression Designs: A Comparison of AR1 and Portmanteau Tests

    ERIC Educational Resources Information Center

    Huitema, Bradley E.; McKean, Joseph W.

    2007-01-01

    Regression models used in the analysis of interrupted time-series designs assume statistically independent errors. Four methods of evaluating this assumption are the Durbin-Watson (D-W), Huitema-McKean (H-M), Box-Pierce (B-P), and Ljung-Box (L-B) tests. These tests were compared with respect to Type I error and power under a wide variety of error…

  4. Additional in-series compliance reduces muscle force summation and alters the time course of force relaxation during fixed-end contractions.

    PubMed

    Mayfield, Dean L; Launikonis, Bradley S; Cresswell, Andrew G; Lichtwark, Glen A

    2016-11-15

    There are high mechanical demands placed on skeletal muscles in movements requiring rapid acceleration of the body or its limbs. Tendons are responsible for transmitting muscle forces, but, because of their elasticity, can manipulate the mechanics of the internal contractile apparatus. Shortening of the contractile apparatus against the stretch of tendon affects force generation according to known mechanical properties; however, the extent to which differences in tendon compliance alter force development in response to a burst of electrical impulses is unclear. To establish the influence of series compliance on force summation, we studied electrically evoked doublet contractions in the cane toad peroneus muscle in the presence and absence of a compliant artificial tendon. Additional series compliance reduced tetanic force by two-thirds, a finding predicted based on the force-length property of skeletal muscle. Doublet force and force-time integral expressed relative to the twitch were also reduced by additional series compliance. Active shortening over a larger range of the ascending limb of the force-length curve and at a higher velocity, leading to a progressive reduction in force-generating potential, could be responsible. Muscle-tendon interaction may also explain the accelerated time course of force relaxation in the presence of additional compliance. Our findings suggest that a compliant tendon limits force summation under constant-length conditions. However, high series compliance can be mechanically advantageous when a muscle-tendon unit is actively stretched, permitting muscle fibres to generate force almost isometrically, as shown during stretch-shorten cycles in locomotor activities. Restricting active shortening would likely favour rapid force development. © 2016. Published by The Company of Biologists Ltd.

  5. Smoothing and gap-filling of high resolution multi-spectral time series: Example of Landsat data

    NASA Astrophysics Data System (ADS)

    Vuolo, Francesco; Ng, Wai-Tim; Atzberger, Clement

    2017-05-01

    This paper introduces a novel methodology for generating 15-day, smoothed and gap-filled time series of high spatial resolution data. The approach is based on templates from high quality observations to fill data gaps that are subsequently filtered. We tested our method for one large contiguous area (Bavaria, Germany) and for nine smaller test sites in different ecoregions of Europe using Landsat data. Overall, our results match the validation dataset to a high degree of accuracy with a mean absolute error (MAE) of 0.01 for visible bands, 0.03 for near-infrared and 0.02 for short-wave-infrared. Occasionally, the reconstructed time series are affected by artefacts due to undetected clouds. Less frequently, larger uncertainties occur as a result of extended periods of missing data. Reliable cloud masks are highly warranted for making full use of time series.

  6. Using ordinal partition transition networks to analyze ECG data

    NASA Astrophysics Data System (ADS)

    Kulp, Christopher W.; Chobot, Jeremy M.; Freitas, Helena R.; Sprechini, Gene D.

    2016-07-01

    Electrocardiogram (ECG) data from patients with a variety of heart conditions are studied using ordinal pattern partition networks. The ordinal pattern partition networks are formed from the ECG time series by symbolizing the data into ordinal patterns. The ordinal patterns form the nodes of the network and edges are defined through the time ordering of the ordinal patterns in the symbolized time series. A network measure, called the mean degree, is computed from each time series-generated network. In addition, the entropy and the number of non-occurring ordinal patterns (NFP) are computed for each series. The distributions of mean degrees, entropies, and NFPs for each heart condition studied are compared. A statistically significant difference between healthy patients and several groups of unhealthy patients with varying heart conditions is found for the distributions of the mean degrees, unlike for any of the distributions of the entropies or NFPs.
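
    A minimal sketch of the ordinal-pattern partition idea described above: the series is symbolized into ordinal patterns of a chosen embedding dimension, successive patterns define directed links, and a mean degree is computed. The function names and the definition of mean degree as the average number of distinct outgoing links are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np
from collections import defaultdict

def ordinal_patterns(x, d):
    """Map each window of length d to the permutation (ordinal pattern) of its values."""
    return [tuple(np.argsort(x[i:i + d])) for i in range(len(x) - d + 1)]

def mean_degree(x, d=4):
    """Build an ordinal partition transition network and return its mean (out-)degree."""
    patterns = ordinal_patterns(x, d)
    nodes = set(patterns)
    out = defaultdict(set)
    for a, b in zip(patterns[:-1], patterns[1:]):
        out[a].add(b)                     # directed edge for each observed transition
    return np.mean([len(out[n]) for n in nodes])

rng = np.random.default_rng(0)
print(mean_degree(rng.standard_normal(5000)))        # white-noise baseline
```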

  7. Analysing the Image Building Effects of TV Advertisements Using Internet Community Data

    NASA Astrophysics Data System (ADS)

    Uehara, Hiroshi; Sato, Tadahiko; Yoshida, Kenichi

    This paper proposes a method to measure the effects of TV advertisements using Internet bulletin boards. It aims to clarify how viewers' interest in TV advertisements is reflected in their images of the promoted products. Two kinds of time series data are generated based on the proposed method. The first represents the time series fluctuation of interest in the TV advertisements. The second represents the time series fluctuation of the images of the products. By analysing the correlations between these two time series, we try to clarify the implicit relationship between viewers' interest in the TV advertisements and their images of the promoted products. By applying the proposed method to an Internet bulletin board that deals with a certain cosmetic brand, we show that the images of the products vary depending on differences in the interest in each TV advertisement.

  8. Monitoring Farmland Loss Caused by Urbanization in Beijing from MODIS Time Series Using Hierarchical Hidden Markov Model

    NASA Astrophysics Data System (ADS)

    Yuan, Y.; Meng, Y.; Chen, Y. X.; Jiang, C.; Yue, A. Z.

    2018-04-01

    In this study, we proposed a method to map urban encroachment onto farmland using satellite image time series (SITS) based on the hierarchical hidden Markov model (HHMM). In this method, the farmland change process is decomposed into three hierarchical levels, i.e., the land cover level, the vegetation phenology level, and the SITS level. Then a three-level HHMM is constructed to model the multi-level semantic structure of farmland change process. Once the HHMM is established, a change from farmland to built-up could be detected by inferring the underlying state sequence that is most likely to generate the input time series. The performance of the method is evaluated on MODIS time series in Beijing. Results on both simulated and real datasets demonstrate that our method improves the change detection accuracy compared with the HMM-based method.

  9. Understanding the source of multifractality in financial markets

    NASA Astrophysics Data System (ADS)

    Barunik, Jozef; Aste, Tomaso; Di Matteo, T.; Liu, Ruipeng

    2012-09-01

    In this paper, we use the generalized Hurst exponent approach to study the multi-scaling behavior of different financial time series. We show that this approach is robust and powerful in detecting different types of multi-scaling. We observe a puzzling phenomenon where an apparent increase in multifractality is measured in time series generated from shuffled returns, where all time-correlations are destroyed, while the return distributions are conserved. This effect is robust and it is reproduced in several real financial data including stock market indices, exchange rates and interest rates. In order to understand the origin of this effect we investigate different simulated time series by means of the Markov switching multifractal model, autoregressive fractionally integrated moving average processes with stable innovations, fractional Brownian motion and Levy flights. Overall we conclude that the multifractality observed in financial time series is mainly a consequence of the characteristic fat-tailed distribution of the returns and time-correlations have the effect to decrease the measured multifractality.
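
    A hedged sketch of one common way to estimate the generalized Hurst exponent H(q), from the scaling of the q-th order moments of absolute increments, is shown below; the lag range and the log-log fit are illustrative assumptions rather than the exact procedure used in the paper.

```python
import numpy as np

def generalized_hurst(x, q=2, taus=range(1, 20)):
    """Estimate H(q) from the scaling E|x(t+tau) - x(t)|^q ~ tau^(q*H(q))."""
    x = np.asarray(x, dtype=float)
    moments = [np.mean(np.abs(x[tau:] - x[:-tau]) ** q) for tau in taus]
    slope, _ = np.polyfit(np.log(list(taus)), np.log(moments), 1)
    return slope / q

rng = np.random.default_rng(1)
prices = np.cumsum(rng.standard_normal(10000))       # Brownian-motion stand-in, H(q) ~ 0.5
print(generalized_hurst(prices, q=1), generalized_hurst(prices, q=2))
```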

  10. Ensemble Bayesian forecasting system Part I: Theory and algorithms

    NASA Astrophysics Data System (ADS)

    Herr, Henry D.; Krzysztofowicz, Roman

    2015-05-01

    The ensemble Bayesian forecasting system (EBFS), whose theory was published in 2001, is developed for the purpose of quantifying the total uncertainty about a discrete-time, continuous-state, non-stationary stochastic process such as a time series of stages, discharges, or volumes at a river gauge. The EBFS is built of three components: an input ensemble forecaster (IEF), which simulates the uncertainty associated with random inputs; a deterministic hydrologic model (of any complexity), which simulates physical processes within a river basin; and a hydrologic uncertainty processor (HUP), which simulates the hydrologic uncertainty (an aggregate of all uncertainties except input). It works as a Monte Carlo simulator: an ensemble of time series of inputs (e.g., precipitation amounts) generated by the IEF is transformed deterministically through a hydrologic model into an ensemble of time series of outputs, which is next transformed stochastically by the HUP into an ensemble of time series of predictands (e.g., river stages). Previous research indicated that in order to attain an acceptable sampling error, the ensemble size must be on the order of hundreds (for probabilistic river stage forecasts and probabilistic flood forecasts) or even thousands (for probabilistic stage transition forecasts). The computing time needed to run the hydrologic model this many times renders the straightforward simulations operationally infeasible. This motivates the development of the ensemble Bayesian forecasting system with randomization (EBFSR), which takes full advantage of the analytic meta-Gaussian HUP and generates multiple ensemble members after each run of the hydrologic model; this auxiliary randomization reduces the required size of the meteorological input ensemble and makes it operationally feasible to generate a Bayesian ensemble forecast of large size. Such a forecast quantifies the total uncertainty, is well calibrated against the prior (climatic) distribution of predictand, possesses a Bayesian coherence property, constitutes a random sample of the predictand, and has an acceptable sampling error, which makes it suitable for rational decision making under uncertainty.

  11. A method for generating high resolution satellite image time series

    NASA Astrophysics Data System (ADS)

    Guo, Tao

    2014-10-01

    There is an increasing demand for satellite remote sensing data with both high spatial and high temporal resolution in many applications, but it remains a challenge to improve spatial resolution and temporal frequency simultaneously owing to the technical limits of current satellite observation systems. R&D efforts over the years have led to successes in roughly two directions. On the one hand, super-resolution and pan-sharpening methods can effectively enhance spatial resolution and produce good visual results, but they hardly preserve spectral signatures and therefore have limited analytical value; on the other hand, temporal interpolation is a straightforward way to increase temporal frequency, but it adds little new information. In this paper we present a method to simulate high resolution time series data by combining low resolution time series data with only a very small number of high resolution images. Our method starts with a pair of high and low resolution data sets, and a spatial registration is performed by introducing an LDA model to map high and low resolution pixels to one another. Temporal change information is then captured by comparing the low resolution time series data, projected onto the high resolution data plane, and assigned to each high resolution pixel according to predefined temporal change patterns for each type of ground object. Finally, the simulated high resolution data are generated. A preliminary experiment shows that our method can simulate high resolution data with reasonable accuracy. The contribution of our method is to enable timely monitoring of temporal changes through analysis of a time sequence of low resolution images only, so that the use of costly high resolution data can be reduced as much as possible; this presents an effective way to build an economically operational monitoring solution for agriculture, forestry, land use investigation, environmental and other applications.

  12. A theoretically consistent stochastic cascade for temporal disaggregation of intermittent rainfall

    NASA Astrophysics Data System (ADS)

    Lombardo, F.; Volpi, E.; Koutsoyiannis, D.; Serinaldi, F.

    2017-06-01

    Generating fine-scale time series of intermittent rainfall that are fully consistent with any given coarse-scale totals is a key and open issue in many hydrological problems. We propose a stationary disaggregation method that simulates rainfall time series with given dependence structure, wet/dry probability, and marginal distribution at a target finer (lower-level) time scale, preserving full consistency with variables at a parent coarser (higher-level) time scale. We account for the intermittent character of rainfall at fine time scales by merging a discrete stochastic representation of intermittency and a continuous one of rainfall depths. This approach yields a unique and parsimonious mathematical framework providing general analytical formulations of mean, variance, and autocorrelation function (ACF) for a mixed-type stochastic process in terms of mean, variance, and ACFs of both continuous and discrete components, respectively. To achieve the full consistency between variables at finer and coarser time scales in terms of marginal distribution and coarse-scale totals, the generated lower-level series are adjusted according to a procedure that does not affect the stochastic structure implied by the original model. To assess model performance, we study the rainfall process as intermittent with both independent and dependent occurrences, where dependence is quantified by the probability that two consecutive time intervals are dry. In either case, we provide analytical formulations of the main statistics of our mixed-type disaggregation model and show their clear accordance with Monte Carlo simulations. An application to rainfall time series from the real world is shown as a proof of concept.

  13. Multivariate multiscale entropy of financial markets

    NASA Astrophysics Data System (ADS)

    Lu, Yunfan; Wang, Jun

    2017-11-01

    In the current effort to quantify the dynamical properties of complex phenomena in financial market systems, multivariate financial time series have attracted wide attention. In this work, considering the shortcomings and limitations of univariate multiscale entropy in analyzing multivariate time series, the multivariate multiscale sample entropy (MMSE), which can evaluate the complexity in multiple data channels over different timescales, is applied to quantify the complexity of financial markets. Its effectiveness and advantages are demonstrated with numerical simulations on two well-known synthetic noise signals. For the first time, the complexity of four generated trivariate return series for each stock trading hour in the China stock markets is quantified thanks to the interdisciplinary application of this method. We find that the complexity of the trivariate return series in each hour shows a significant decreasing trend as stock trading time progresses. Further, the shuffled multivariate return series and the absolute multivariate return series are also analyzed. As another new attempt, the complexity of global stock markets (Asia, Europe and America) is quantified by analyzing the multivariate returns from them. Finally, we utilize the multivariate multiscale entropy to assess the relative complexity of normalized multivariate return volatility series with different degrees.

  14. Defect-Repairable Latent Feature Extraction of Driving Behavior via a Deep Sparse Autoencoder

    PubMed Central

    Taniguchi, Tadahiro; Takenaka, Kazuhito; Bando, Takashi

    2018-01-01

    Data representing driving behavior, as measured by various sensors installed in a vehicle, are collected as multi-dimensional sensor time-series data. These data often include redundant information, e.g., both the speed of wheels and the engine speed represent the velocity of the vehicle. Redundant information can be expected to complicate the data analysis, e.g., more factors need to be analyzed; even varying the levels of redundancy can influence the results of the analysis. We assume that the measured multi-dimensional sensor time-series data of driving behavior are generated from low-dimensional data shared by the many types of one-dimensional data of which multi-dimensional time-series data are composed. Meanwhile, sensor time-series data may be defective because of sensor failure. Therefore, another important function is to reduce the negative effect of defective data when extracting low-dimensional time-series data. This study proposes a defect-repairable feature extraction method based on a deep sparse autoencoder (DSAE) to extract low-dimensional time-series data. In the experiments, we show that DSAE provides high-performance latent feature extraction for driving behavior, even for defective sensor time-series data. In addition, we show that the negative effect of defects on the driving behavior segmentation task could be reduced using the latent features extracted by DSAE. PMID:29462931

  16. The application of complex network time series analysis in turbulent heated jets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charakopoulos, A. K.; Karakasidis, T. E., E-mail: thkarak@uth.gr; Liakopoulos, A.

    2014-06-15

    In the present study, we applied the methodology of complex network-based time series analysis to experimental temperature time series from a vertical turbulent heated jet. More specifically, we approach the hydrodynamic problem of discriminating time series corresponding to various regions relative to the jet axis, i.e., time series corresponding to regions that are close to the jet axis from time series originating at regions with a different dynamical regime, based on the constructed network properties. Applying the phase space transformation method (k nearest neighbors) and also the visibility algorithm, we transformed the time series into networks and evaluated topological properties of the networks such as degree distribution, average path length, diameter, modularity, and clustering coefficient. The results show that the complex network approach allows distinguishing, identifying, and exploring in detail various dynamical regions of the jet flow, and associating them with the corresponding physical behavior. In addition, in order to reject the hypothesis that the studied networks originate from a stochastic process, we generated random networks and compared their statistical properties with those originating from the experimental data. As far as the efficiency of the two methods for network construction is concerned, we conclude that both methodologies lead to network properties that present almost the same qualitative behavior and allow us to reveal the underlying system dynamics.
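
    Of the two network constructions mentioned, the visibility algorithm is simple to sketch. The code below builds a natural visibility graph (nodes are samples; two samples are linked if no intermediate sample blocks the straight line between them) and reports the average degree; it is a brute-force illustration under common assumptions, not the authors' implementation.

```python
import numpy as np

def visibility_edges(x):
    """Natural visibility graph: nodes i < j are linked if every intermediate
    sample lies strictly below the straight line joining (i, x_i) and (j, x_j)."""
    n = len(x)
    edges = []
    for i in range(n - 1):
        for j in range(i + 1, n):
            visible = all(
                x[k] < x[i] + (x[j] - x[i]) * (k - i) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                edges.append((i, j))
    return edges

rng = np.random.default_rng(2)
series = rng.standard_normal(200)                     # e.g. a temperature time series
edges = visibility_edges(series)
degrees = np.bincount(np.ravel(edges), minlength=len(series))
print(degrees.mean())                                 # average degree of the network
```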

  17. miniSEED: The Backbone Data Format for Seismological Time Series

    NASA Astrophysics Data System (ADS)

    Ahern, T. K.; Benson, R. B.; Trabant, C. M.

    2017-12-01

    In 1987, the International Federation of Digital Seismograph Networks (FDSN) adopted the Standard for the Exchange of Earthquake Data (SEED) format to be used for data archiving and exchange of seismological time series data. Since that time, the format has evolved to accommodate new capabilities and features. For example, a notable change in 1992 allowed the format, which includes both the comprehensive metadata and the time series samples, to be used in two additional forms: 1) a container for metadata only, called "dataless SEED", and 2) a stand-alone structure for time series, called "miniSEED". While specifically designed for seismological data and related metadata, this format has proven to be a useful format for a wide variety of geophysical time series data. Many FDSN data centers now store temperature, pressure, infrasound, tilt and other time series measurements in this internationally used format. Since April 2016, members of the FDSN have been in discussions to design a next generation miniSEED format to accommodate current and future needs, to further generalize the format, and to address a number of historical problems or limitations. We believe the correct approach is to simplify the header, allow for arbitrary header additions, expand the current identifiers, and allow for anticipated future identifiers which are currently unknown. We also believe the primary goal of the format is efficient archiving, selection and exchange of time series data. By focusing on these goals we avoid trying to generalize the format too broadly into specialized areas such as efficient, low-latency delivery, or including unbounded non-time series data. Our presentation will provide an overview of this format and highlight its most valuable characteristics for time series data from any geophysical domain or beyond.

  18. Sample entropy applied to the analysis of synthetic time series and tachograms

    NASA Astrophysics Data System (ADS)

    Muñoz-Diosdado, A.; Gálvez-Coyt, G. G.; Solís-Montufar, E.

    2017-01-01

    Entropy is a method of non-linear analysis that allows an estimate of the irregularity of a system. However, there are different types of computational entropy, which were considered and tested in order to obtain one that gives an index of signal complexity while taking into account the number of data points in the analysed time series, the computational resources demanded by the method, and the accuracy of the calculation. An algorithm for the generation of fractal time series with a given value of the spectral exponent β was used for the characterization of the different entropy algorithms. We obtained a significant variation with series size for most of the algorithms, which could be counterproductive for the study of real signals of different lengths. The chosen method was sample entropy, which shows great independence of series size. With this method, time series of heart interbeat intervals, or tachograms, of healthy subjects and patients with congestive heart failure were analysed. Sample entropy was calculated for 24-hour tachograms and for 6-hour subseries corresponding to sleep and wakefulness. The comparison between the two populations shows a significant difference that is accentuated when the patient is sleeping.
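
    A compact sketch of sample entropy, the measure chosen in this record, is given below. SampEn(m, r) is the negative logarithm of the conditional probability that template vectors matching within tolerance r for m points also match for m + 1 points; the template-counting convention and the default r = 0.2·SD are common choices and are assumptions here, not the authors' exact code.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r) of a 1-D series."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)

    def count_matches(dim):
        # Count template pairs whose Chebyshev distance is within tolerance r.
        templates = np.array([x[i:i + dim] for i in range(len(x) - dim)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= r)
        return count

    b = count_matches(m)       # matches of length m
    a = count_matches(m + 1)   # matches of length m + 1
    return -np.log(a / b)

rng = np.random.default_rng(3)
rr_intervals = rng.normal(0.8, 0.05, 3000)            # stand-in for a tachogram (seconds)
print(sample_entropy(rr_intervals))
```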

  19. Characterizing system dynamics with a weighted and directed network constructed from time series data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Xiaoran, E-mail: sxr0806@gmail.com; School of Mathematics and Statistics, The University of Western Australia, Crawley WA 6009; Small, Michael, E-mail: michael.small@uwa.edu.au

    In this work, we propose a novel method to transform a time series into a weighted and directed network. For a given time series, we first generate a set of segments via a sliding window, and then use a doubly symbolic scheme to characterize every windowed segment by combining absolute amplitude information with an ordinal pattern characterization. Based on this construction, a network can be directly constructed from the given time series: segments corresponding to different symbol-pairs are mapped to network nodes and the temporal succession between nodes is represented by directed links. With this conversion, the dynamics underlying the time series has been encoded into the network structure. We illustrate the potential of our networks with a well-studied dynamical model as a benchmark example. Results show that network measures for characterizing global properties can detect the dynamical transitions in the underlying system. Moreover, we employ a random walk algorithm to sample loops in our networks, and find that time series with different dynamics exhibit distinct cycle structure. That is, the relative prevalence of loops with different lengths can be used to identify the underlying dynamics.

  20. Time series smoother for effect detection.

    PubMed

    You, Cheng; Lin, Dennis K J; Young, S Stanley

    2018-01-01

    In environmental epidemiology, it is often encountered that multiple time series data with a long-term trend, including seasonality, cannot be fully adjusted by the observed covariates. The long-term trend is difficult to separate from abnormal short-term signals of interest. This paper addresses how to estimate the long-term trend in order to recover short-term signals. Our case study demonstrates that the current spline smoothing methods can result in significant positive and negative cross-correlations from the same dataset, depending on how the smoothing parameters are chosen. To circumvent this dilemma, three classes of time series smoothers are proposed to detrend time series data. These smoothers do not require fine tuning of parameters and can be applied to recover short-term signals. The properties of these smoothers are shown with both a case study using a factorial design and a simulation study using datasets generated from the original dataset. General guidelines are provided on how to discover short-term signals from time series with a long-term trend. The benefit of this research is that a problem is identified and characteristics of possible solutions are determined.
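
    The paper's smoothers are not specified in the abstract; as a hedged stand-in, the sketch below detrends a series with a long rolling median (a smoother with essentially no tuning beyond the window length) so that short-term signals remain in the residual. The window length and the synthetic example are assumptions for illustration.

```python
import numpy as np
import pandas as pd

def detrend_with_rolling_median(y, window=365):
    """Estimate the long-term trend with a long, centred rolling median and
    subtract it, leaving short-term signals for cross-correlation analysis."""
    s = pd.Series(y)
    trend = s.rolling(window, center=True, min_periods=1).median()
    return (s - trend).to_numpy()

days = np.arange(3 * 365)
trend = 0.01 * days + 2 * np.sin(2 * np.pi * days / 365)   # long-term trend + seasonality
signal = np.where(days % 200 == 0, 3.0, 0.0)                # sparse short-term spikes
y = trend + signal + np.random.default_rng(4).normal(0, 0.3, days.size)
residual = detrend_with_rolling_median(y)                    # spikes survive detrending
```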

  1. Time series smoother for effect detection

    PubMed Central

    Lin, Dennis K. J.; Young, S. Stanley

    2018-01-01

    In environmental epidemiology, it is often encountered that multiple time series data with a long-term trend, including seasonality, cannot be fully adjusted by the observed covariates. The long-term trend is difficult to separate from abnormal short-term signals of interest. This paper addresses how to estimate the long-term trend in order to recover short-term signals. Our case study demonstrates that the current spline smoothing methods can result in significant positive and negative cross-correlations from the same dataset, depending on how the smoothing parameters are chosen. To circumvent this dilemma, three classes of time series smoothers are proposed to detrend time series data. These smoothers do not require fine tuning of parameters and can be applied to recover short-term signals. The properties of these smoothers are shown with both a case study using a factorial design and a simulation study using datasets generated from the original dataset. General guidelines are provided on how to discover short-term signals from time series with a long-term trend. The benefit of this research is that a problem is identified and characteristics of possible solutions are determined. PMID:29684033

  2. Analysis of continuous GPS measurements from southern Victoria Land, Antarctica

    USGS Publications Warehouse

    Willis, Michael J.

    2007-01-01

    Several years of continuous data have been collected at remote bedrock Global Positioning System (GPS) sites in southern Victoria Land, Antarctica. Annual to sub-annual variations are observed in the position time-series. An atmospheric pressure loading (APL) effect is calculated from pressure field anomalies supplied by the European Centre for Medium-Range Weather Forecasts (ECMWF) model loading an elastic Earth model. The predicted APL signal has a moderate correlation with the vertical position time-series at McMurdo, Ross Island (International Global Navigation Satellite System Service (IGS) station MCM4), produced using a global solution. In contrast, a local solution in which MCM4 is the fiducial site generates a vertical time series for a remote site in Victoria Land (Cape Roberts, ROB4) which exhibits a low, inverse correlation with the predicted atmospheric pressure loading signal. If, in the future, known and well modeled geophysical loads can be separated from the time-series, then local hydrological loading, of interest for glaciological and climate applications, can potentially be extracted from the GPS time-series.

  3. Characterization of Vertical Impact Device Acceleration Pulses Using Parametric Assessment: Phase IV Dual Impact Pulses

    DTIC Science & Technology

    2017-01-04

    ...configurations with a restrained manikin, was evaluated in four different test series. Test Series 1 was conducted to determine the materials and...5 ms TTP. Test Series 2 was conducted to determine the materials and drop heights required for energy attenuation of the seat pan to generate a 4 m

  4. Simulation of an ensemble of future climate time series with an hourly weather generator

    NASA Astrophysics Data System (ADS)

    Caporali, E.; Fatichi, S.; Ivanov, V. Y.; Kim, J.

    2010-12-01

    There is evidence that climate change is occurring in many regions of the world. The necessity of climate change predictions at the local scale and fine temporal resolution is thus warranted for hydrological, ecological, geomorphological, and agricultural applications that can provide thematic insights into the corresponding impacts. Numerous downscaling techniques have been proposed to bridge the gap between the spatial scales adopted in General Circulation Models (GCM) and regional analyses. Nevertheless, the time and spatial resolutions obtained as well as the type of meteorological variables may not be sufficient for detailed studies of climate change effects at the local scales. In this context, this study presents a stochastic downscaling technique that makes use of an hourly weather generator to simulate time series of predicted future climate. Using a Bayesian approach, the downscaling procedure derives distributions of factors of change for several climate statistics from a multi-model ensemble of GCMs. Factors of change are sampled from their distributions using a Monte Carlo technique to entirely account for the probabilistic information obtained with the Bayesian multi-model ensemble. Factors of change are subsequently applied to the statistics derived from observations to re-evaluate the parameters of the weather generator. The weather generator can reproduce a wide set of climate variables and statistics over a range of temporal scales, from extremes, to the low-frequency inter-annual variability. The final result of such a procedure is the generation of an ensemble of hourly time series of meteorological variables that can be considered as representative of future climate, as inferred from GCMs. The generated ensemble of scenarios also accounts for the uncertainty derived from multiple GCMs used in downscaling. Applications of the procedure in reproducing present and future climates are presented for different locations world-wide: Tucson (AZ), Detroit (MI), and Firenze (Italy). The stochastic downscaling is carried out with eight GCMs from the CMIP3 multi-model dataset (IPCC 4AR, A1B scenario).

  5. Digital gate pulse generator for cycloconverter control

    DOEpatents

    Klein, Frederick F.; Mutone, Gioacchino A.

    1989-01-01

    The present invention provides a digital gate pulse generator which controls the output of a cycloconverter used for electrical power conversion applications by determining the timing and delivery of the firing pulses to the switching devices in the cycloconverter. Previous gate pulse generators have been built with largely analog or discrete digital circuitry, which requires many precision components and periodic adjustment. The gate pulse generator of the present invention utilizes digital techniques and a predetermined series of values to develop the necessary timing signals for firing the switching devices. Each timing signal is compared with a reference signal to determine the exact firing time. The present invention is significantly more compact than previous gate pulse generators, responds quickly to changes in the output demand, and requires only one precision component and no adjustments.

  6. Forecasting Hourly Water Demands With Seasonal Autoregressive Models for Real-Time Application

    NASA Astrophysics Data System (ADS)

    Chen, Jinduan; Boccelli, Dominic L.

    2018-02-01

    Consumer water demands are not typically measured at temporal or spatial scales adequate to support real-time decision making, and recent approaches for estimating unobserved demands using observed hydraulic measurements are generally not capable of forecasting demands and uncertainty information. While time series modeling has shown promise for representing total system demands, these models have generally not been evaluated at spatial scales appropriate for representative real-time modeling. This study investigates the use of a double-seasonal time series model to capture daily and weekly autocorrelations to both total system demands and regional aggregated demands at a scale that would capture demand variability across a distribution system. Emphasis was placed on the ability to forecast demands and quantify uncertainties with results compared to traditional time series pattern-based demand models as well as nonseasonal and single-seasonal time series models. Additional research included the implementation of an adaptive-parameter estimation scheme to update the time series model when unobserved changes occurred in the system. For two case studies, results showed that (1) for the smaller-scale aggregated water demands, the log-transformed time series model resulted in improved forecasts, (2) the double-seasonal model outperformed other models in terms of forecasting errors, and (3) the adaptive adjustment of parameters during forecasting improved the accuracy of the generated prediction intervals. These results illustrate the capabilities of time series modeling to forecast both water demands and uncertainty estimates at spatial scales commensurate for real-time modeling applications and provide a foundation for developing a real-time integrated demand-hydraulic model.
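
    As a hedged stand-in for the double-seasonal model described above, the sketch below fits an autoregression with hourly, daily (24 h) and weekly (168 h) lags by least squares and issues a one-step forecast; the lag set and the synthetic demand series are illustrative assumptions, not the authors' model specification.

```python
import numpy as np

def fit_seasonal_ar(y, lags=(1, 24, 168)):
    """Least-squares autoregression with daily and weekly seasonal lags."""
    p = max(lags)
    X = np.column_stack([y[p - lag:len(y) - lag] for lag in lags])
    X = np.column_stack([np.ones(len(X)), X])          # intercept column
    beta, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    return beta

def one_step_forecast(y, beta, lags=(1, 24, 168)):
    """Forecast the next hour from the most recent observations."""
    x = np.concatenate(([1.0], [y[-lag] for lag in lags]))
    return x @ beta

rng = np.random.default_rng(5)
hours = np.arange(24 * 7 * 8)                            # eight weeks of hourly demand
demand = (10 + 3 * np.sin(2 * np.pi * hours / 24)
          + 1.5 * np.sin(2 * np.pi * hours / 168)
          + rng.normal(0, 0.5, hours.size))
beta = fit_seasonal_ar(demand)
print(one_step_forecast(demand, beta))
```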

  7. Object-Oriented Classification of Sugarcane Using Time-Series Middle-Resolution Remote Sensing Data Based on AdaBoost

    PubMed Central

    Zhou, Zhen; Huang, Jingfeng; Wang, Jing; Zhang, Kangyu; Kuang, Zhaomin; Zhong, Shiquan; Song, Xiaodong

    2015-01-01

    Most areas planted with sugarcane are located in southern China. However, remote sensing of sugarcane has been limited because useable remote sensing data are limited due to the cloudy climate of this region during the growing season and severe spectral mixing with other crops. In this study, we developed a methodology for automatically mapping sugarcane over large areas using time-series middle-resolution remote sensing data. For this purpose, two major techniques were used, the object-oriented method (OOM) and data mining (DM). In addition, time-series Chinese HJ-1 CCD images were obtained during the sugarcane growing period. Image objects were generated using a multi-resolution segmentation algorithm, and DM was implemented using the AdaBoost algorithm, which generated the prediction model. The prediction model was applied to the HJ-1 CCD time-series image objects, and then a map of the sugarcane planting area was produced. The classification accuracy was evaluated using independent field survey sampling points. The confusion matrix analysis showed that the overall classification accuracy reached 93.6% and that the Kappa coefficient was 0.85. Thus, the results showed that this method is feasible, efficient, and applicable for extrapolating the classification of other crops in large areas where the application of high-resolution remote sensing data is impractical due to financial considerations or because qualified images are limited. PMID:26528811

  8. Object-Oriented Classification of Sugarcane Using Time-Series Middle-Resolution Remote Sensing Data Based on AdaBoost.

    PubMed

    Zhou, Zhen; Huang, Jingfeng; Wang, Jing; Zhang, Kangyu; Kuang, Zhaomin; Zhong, Shiquan; Song, Xiaodong

    2015-01-01

    Most areas planted with sugarcane are located in southern China. However, remote sensing of sugarcane has been limited because useable remote sensing data are limited due to the cloudy climate of this region during the growing season and severe spectral mixing with other crops. In this study, we developed a methodology for automatically mapping sugarcane over large areas using time-series middle-resolution remote sensing data. For this purpose, two major techniques were used, the object-oriented method (OOM) and data mining (DM). In addition, time-series Chinese HJ-1 CCD images were obtained during the sugarcane growing period. Image objects were generated using a multi-resolution segmentation algorithm, and DM was implemented using the AdaBoost algorithm, which generated the prediction model. The prediction model was applied to the HJ-1 CCD time-series image objects, and then a map of the sugarcane planting area was produced. The classification accuracy was evaluated using independent field survey sampling points. The confusion matrix analysis showed that the overall classification accuracy reached 93.6% and that the Kappa coefficient was 0.85. Thus, the results showed that this method is feasible, efficient, and applicable for extrapolating the classification of other crops in large areas where the application of high-resolution remote sensing data is impractical due to financial considerations or because qualified images are limited.
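
    The classification step can be sketched with scikit-learn's AdaBoostClassifier, using hypothetical object-level NDVI time-series features in place of the HJ-1 CCD image objects; the feature construction and labels below are synthetic assumptions for illustration only.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical object-level features: each row is one image object, columns are
# NDVI values sampled at successive acquisition dates during the growing season.
rng = np.random.default_rng(6)
n_objects, n_dates = 500, 12
X = rng.normal(0.5, 0.15, size=(n_objects, n_dates))
y = (X[:, 4:8].mean(axis=1) > 0.55).astype(int)        # stand-in "sugarcane" label

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = AdaBoostClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(accuracy_score(y_test, clf.predict(X_test)))
```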

  9. Brain-Inspired Photonic Signal Processor for Generating Periodic Patterns and Emulating Chaotic Systems

    NASA Astrophysics Data System (ADS)

    Antonik, Piotr; Haelterman, Marc; Massar, Serge

    2017-05-01

    Reservoir computing is a bioinspired computing paradigm for processing time-dependent signals. Its hardware implementations have received much attention because of their simplicity and remarkable performance on a series of benchmark tasks. In previous experiments, the output was uncoupled from the system and, in most cases, simply computed off-line on a postprocessing computer. However, numerical investigations have shown that feeding the output back into the reservoir opens the possibility of long-horizon time-series forecasting. Here, we present a photonic reservoir computer with output feedback, and we demonstrate its capacity to generate periodic time series and to emulate chaotic systems. We study in detail the effect of experimental noise on system performance. In the case of chaotic systems, we introduce several metrics, based on standard signal-processing techniques, to evaluate the quality of the emulation. Our work significantly enlarges the range of tasks that can be solved by hardware reservoir computers and, therefore, the range of applications they could potentially tackle. It also raises interesting questions in nonlinear dynamics and chaos theory.
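
    A minimal software analogue of a reservoir computer with output feedback is sketched below: a random recurrent network is driven by the teacher signal, a linear readout is trained by ridge regression, and the trained system is then run autonomously by feeding its own output back in. All sizes and scalings are illustrative assumptions, and the photonic hardware is not modelled.

```python
import numpy as np

rng = np.random.default_rng(7)
n_res, washout = 200, 100

# Random reservoir and feedback weights (echo state network style).
W = rng.normal(0, 1, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))        # scale spectral radius below 1
W_fb = rng.uniform(-1, 1, n_res)

# Target: a periodic time series the reservoir should learn to generate.
T = 2000
target = np.sin(2 * np.pi * np.arange(T) / 40)

# Teacher forcing: drive the reservoir with the true output fed back.
states = np.zeros((T, n_res))
x = np.zeros(n_res)
for t in range(1, T):
    x = np.tanh(W @ x + W_fb * target[t - 1])
    states[t] = x

# Ridge-regression readout from reservoir state to output.
S, y = states[washout:], target[washout:]
W_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(n_res), S.T @ y)

# Autonomous generation: feed the reservoir its own output.
x, out = states[-1].copy(), target[-1]
generated = []
for _ in range(200):
    x = np.tanh(W @ x + W_fb * out)
    out = x @ W_out
    generated.append(out)
```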

  10. A stochastical event-based continuous time step rainfall generator based on Poisson rectangular pulse and microcanonical random cascade models

    NASA Astrophysics Data System (ADS)

    Pohle, Ina; Niebisch, Michael; Zha, Tingting; Schümberg, Sabine; Müller, Hannes; Maurer, Thomas; Hinz, Christoph

    2017-04-01

    Rainfall variability within a storm is of major importance for fast hydrological processes, e.g. surface runoff, erosion and solute dissipation from surface soils. To investigate and simulate the impacts of within-storm variabilities on these processes, long time series of rainfall with high resolution are required. Yet, observed precipitation records of hourly or higher resolution are in most cases available only for a small number of stations and only for a few years. To obtain long time series of alternating rainfall events and interstorm periods while conserving the statistics of observed rainfall events, the Poisson model can be used. Multiplicative microcanonical random cascades have been widely applied to disaggregate rainfall time series from coarse to fine temporal resolution. We present a new coupling approach of the Poisson rectangular pulse model and the multiplicative microcanonical random cascade model that preserves the characteristics of rainfall events as well as inter-storm periods. In the first step, a Poisson rectangular pulse model is applied to generate discrete rainfall events (duration and mean intensity) and inter-storm periods (duration). The rainfall events are subsequently disaggregated to high-resolution time series (user-specified, e.g. 10 min resolution) by a multiplicative microcanonical random cascade model. One of the challenges of coupling these models is to parameterize the cascade model for the event durations generated by the Poisson model. In fact, the cascade model is best suited to downscale rainfall data with a constant time step, such as daily precipitation data. Without starting from a fixed time step duration (e.g. daily), the disaggregation of events requires some modifications of the multiplicative microcanonical random cascade model proposed by Olsson (1998): Firstly, the parameterization of the cascade model for events of different durations requires continuous functions for the probabilities of the multiplicative weights, which we implemented through sigmoid functions. Secondly, the branching of the first and last box is constrained to preserve the rainfall event durations generated by the Poisson rectangular pulse model. The event-based continuous time step rainfall generator has been developed and tested using 10 min and hourly rainfall data of four stations in North-Eastern Germany. The model performs well in comparison to observed rainfall in terms of event durations and mean event intensities as well as wet spell and dry spell durations. It is currently being tested using data from other stations across Germany and in different climate zones. Furthermore, the rainfall event generator is being applied in modelling approaches aimed at understanding the impact of rainfall variability on hydrological processes. Reference: Olsson, J.: Evaluation of a scaling cascade model for temporal rainfall disaggregation, Hydrology and Earth System Sciences, 2, 19-30, 1998.
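
    The microcanonical branching idea can be sketched as follows: an event total is split recursively into two children whose weights sum to one, so mass is conserved exactly, with an intermittency probability allowing dry sub-intervals. The beta-distributed weights and the dry-child probability below are illustrative assumptions, not the sigmoid parameterization described above.

```python
import numpy as np

def microcanonical_cascade(total, levels, p_zero=0.3, rng=None):
    """Split an event rainfall total into 2**levels fine-scale boxes.

    At every branching, a box's depth is divided between its two children with
    weights (w, 1 - w); mass is conserved exactly (microcanonical). With
    probability p_zero one child receives everything, producing dry sub-intervals.
    """
    rng = rng or np.random.default_rng()
    boxes = np.array([total], dtype=float)
    for _ in range(levels):
        children = []
        for depth in boxes:
            if depth == 0 or rng.random() < p_zero:
                w = rng.choice([0.0, 1.0])             # one dry child
            else:
                w = rng.beta(2, 2)                     # smooth split of the depth
            children.extend([w * depth, (1 - w) * depth])
        boxes = np.array(children)
    return boxes

# Example: a 16 mm event of 160 min duration split into 16 ten-minute intervals.
print(microcanonical_cascade(16.0, levels=4, rng=np.random.default_rng(8)))
```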

  11. Two approaches to timescale modeling for proxy series with chronological errors.

    NASA Astrophysics Data System (ADS)

    Divine, Dmitry; Godtliebsen, Fred

    2010-05-01

    A substantial part of the proxy series used in paleoclimate research has chronological uncertainties. Any constructed timescale is therefore only an estimate of the true, but unknown, timescale. An accurate assessment of the timing of events in paleoproxy series and networks, as well as the use of proxy-based paleoclimate reconstructions in GCM model scoring experiments, requires the effect of these errors to be properly taken into account. We consider two types of timescale error models corresponding to the two basic approaches to construction of the (depth-) age scale in a proxy series. Typically, chronological control of a proxy series stemming from all types of marine and terrestrial sedimentary archives is based on the use of 14C dates, reference horizons or their combination. Depending on the prevalent origin of the available fix points (age markers), the following approaches to timescale modeling are proposed. 1) 14C dates. The algorithm uses a Markov chain Monte Carlo sampling technique to generate the ordered set of perturbed age markers. Proceeding sequentially from the youngest to the oldest fixpoint, the sampler draws random numbers from the age distribution of each individual 14C date. Every following perturbed age marker is generated such that the condition of no age reversal is fulfilled. The relevant regression model is then applied to construct a simulated timescale. 2) Reference horizons (e.g. volcanic or dust layers, the T bomb peak) generally provide absolutely dated fixpoints. Due to natural variability in sedimentation (accumulation) rate, however, the dating uncertainty in the interpolated timescale tends to grow with the span to the nearest fixpoint. The (accumulation, sedimentation) process associated with the formation of a proxy series is modelled using a stochastic Levy process. The respective increments for the process are drawn from the log-normal distribution with the mean/variance ratio prescribed as a site (proxy)-dependent external parameter. The number of generated annual increments corresponds to the time interval between the considered reference horizons. The simulated series is then rescaled to match the length of the actual core section being modelled. Within each method, a multitude of timescales is generated, creating a number of possible realisations of a proxy series or a proxy-based reconstruction in the time domain. This allows consideration of a proxy record in a probabilistic framework. The effect of accounting for uncertainties in chronology on a reconstructed environmental variable is illustrated with two case studies of marine sediment records.
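
    The first (14C-based) approach can be illustrated with a simplified sampler: dated horizons are perturbed within their age uncertainties, draws that would reverse the age order are rejected, and ages between markers are interpolated. Using a normal density in place of a calibrated 14C distribution and linear interpolation in place of a regression model are simplifying assumptions of this sketch.

```python
import numpy as np

def sample_timescale(depths, ages, age_sd, core_depths, rng=None, max_tries=1000):
    """Draw one plausible timescale: perturb each dated horizon within its age
    uncertainty, rejecting draws that would create an age reversal, then
    interpolate ages linearly between the perturbed markers."""
    rng = rng or np.random.default_rng()
    perturbed = []
    previous = -np.inf
    for age, sd in zip(ages, age_sd):
        for _ in range(max_tries):
            draw = rng.normal(age, sd)                  # stand-in for a calibrated 14C density
            if draw > previous:
                break
        perturbed.append(draw)
        previous = draw
    return np.interp(core_depths, depths, perturbed)

depths = np.array([10.0, 50.0, 120.0, 200.0])            # cm, dated horizons (hypothetical)
ages = np.array([500.0, 2100.0, 4800.0, 9200.0])         # cal yr BP
age_sd = np.array([40.0, 60.0, 80.0, 120.0])
core_depths = np.arange(0, 201, 5)
ensemble = np.array([sample_timescale(depths, ages, age_sd, core_depths,
                                      rng=np.random.default_rng(seed))
                     for seed in range(100)])             # 100 timescale realisations
```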

  12. Evaluation of COPD's diaphragm motion extracted from 4D-MRI

    NASA Astrophysics Data System (ADS)

    Swastika, Windra; Masuda, Yoshitada; Kawata, Naoko; Matsumoto, Koji; Suzuki, Toshio; Iesato, Ken; Tada, Yuji; Sugiura, Toshihiko; Tanabe, Nobuhiro; Tatsumi, Koichiro; Ohnishi, Takashi; Haneishi, Hideaki

    2015-03-01

    We have developed a method, called the intersection profile method, to construct 4D-MRI (3D+time) from time series of 2D-MRI. The basic idea is to find the best match between the intersection profiles of the time series of 2D-MRI in the sagittal plane (navigator slice) and the time series of 2D-MRI in the coronal plane (data slice). In this study, we use 4D-MRI to semiautomatically extract the right diaphragm motion of 16 subjects (8 healthy subjects and 8 COPD patients). The diaphragm motion is then evaluated quantitatively by calculating and normalizing the displacement of each subject. We also generate phase-length maps to view and locate paradoxical motion in the COPD patients. The quantitative results for the normalized displacement show that COPD patients tend to have smaller displacements than healthy subjects: the average normalized displacement of the 8 COPD patients is 9.4 mm, whereas that of the 8 healthy volunteers is 15.3 mm. The generated phase-length maps show that not all of the COPD patients exhibit paradoxical motion; however, when paradoxical motion is present, the phase-length map is able to locate where it occurs.

  13. Permutation entropy of finite-length white-noise time series.

    PubMed

    Little, Douglas J; Kane, Deb M

    2016-08-01

    Permutation entropy (PE) is commonly used to discriminate complex structure from white noise in a time series. While the PE of white noise is well understood in the long time-series limit, analysis in the general case is currently lacking. Here the expectation value and variance of white-noise PE are derived as functions of the number of ordinal pattern trials, N, and the embedding dimension, D. It is demonstrated that the probability distribution of the white-noise PE converges to a χ^{2} distribution with D!-1 degrees of freedom as N becomes large. It is further demonstrated that the PE variance for an arbitrary time series can be estimated as the variance of a related metric, the Kullback-Leibler entropy (KLE), allowing the qualitative N≫D! condition to be recast as a quantitative estimate of the N required to achieve a desired PE calculation precision. Application of this theory to statistical inference is demonstrated in the case of an experimentally obtained noise series, where the probability of obtaining the observed PE value was calculated assuming a white-noise time series. Standard statistical inference can be used to draw conclusions whether the white-noise null hypothesis can be accepted or rejected. This methodology can be applied to other null hypotheses, such as discriminating whether two time series are generated from different complex system states.
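
    A short sketch of the permutation entropy calculation for finite-length white noise is given below; it estimates the empirical distribution of PE over many white-noise realizations for a given number of ordinal pattern trials N and embedding dimension D, which can then be compared with the asymptotic results cited above. The parameter values are illustrative assumptions.

```python
import numpy as np
from math import factorial
from itertools import permutations

def permutation_entropy(x, d=3):
    """Permutation entropy (natural log) from the distribution of ordinal patterns
    of embedding dimension d."""
    patterns = [tuple(np.argsort(x[i:i + d])) for i in range(len(x) - d + 1)]
    labels = {p: k for k, p in enumerate(permutations(range(d)))}
    counts = np.bincount([labels[p] for p in patterns], minlength=factorial(d))
    probs = counts[counts > 0] / counts.sum()
    return -np.sum(probs * np.log(probs))

rng = np.random.default_rng(9)
d, n_trials = 3, 500                        # N = number of ordinal pattern trials
pe_values = [permutation_entropy(rng.standard_normal(n_trials + d - 1), d)
             for _ in range(2000)]
print(np.mean(pe_values), np.log(factorial(d)))   # sample mean vs. maximal entropy ln(d!)
```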

  14. Random generation of the turbulence slopes of a Shack-Hartmann wavefront sensor.

    PubMed

    Conan, Rodolphe

    2014-03-15

    A method to generate the turbulence measurements of a Shack-Hartmann wavefront sensor is presented. Numerical simulations demonstrate that the spatial and temporal statistic properties of the slopes are respected, allowing us to generate the turbulence wavefront gradient corresponding to both natural and laser guide stars, as well as time series in accordance with the frozen flow model.

  15. Heart rate time series characteristics for early detection of infections in critically ill patients.

    PubMed

    Tambuyzer, T; Guiza, F; Boonen, E; Meersseman, P; Vervenne, H; Hansen, T K; Bjerre, M; Van den Berghe, G; Berckmans, D; Aerts, J M; Meyfroidt, G

    2017-04-01

    It is difficult to make a distinction between inflammation and infection. Therefore, new strategies are required to allow accurate detection of infection. Here, we hypothesize that we can distinguish infected from non-infected ICU patients based on dynamic features of serum cytokine concentrations and heart rate time series. Serum cytokine profiles and heart rate time series of 39 patients were available for this study. The serum concentrations of ten cytokines were measured in blood sampled every 10 min between 2100 and 0600 hours. Heart rate was recorded every minute. Ten metrics were used to extract features from these time series to obtain an accurate classification of infected patients. The predictive power of the metrics derived from the heart rate time series was investigated using decision tree analysis. Finally, logistic regression methods were used to examine whether classification performance improved with inclusion of features derived from the cytokine time series. The AUC of a decision tree based on two heart rate features was 0.88. The model had good calibration, with a Hosmer-Lemeshow p value of 0.09. There was no significant added value in including static cytokine levels or cytokine time series information in the generated decision tree model. The results suggest that heart rate is a better marker for infection than information captured by cytokine time series when the exact stage of infection is not known. The predictive value of (expensive) biomarkers should always be weighed against the routinely monitored data, and such biomarkers have to demonstrate added value.

  16. Assimilating a synthetic Kalman filter leaf area index series into the WOFOST model to improve regional winter wheat yield estimation

    USDA-ARS?s Scientific Manuscript database

    The scale mismatch between remotely sensed observations and crop growth models simulated state variables decreases the reliability of crop yield estimates. To overcome this problem, we used a two-step data assimilation phases: first we generated a complete leaf area index (LAI) time series by combin...

  17. A coupled weather generator - rainfall-runoff approach on hourly time steps for flood risk analysis

    NASA Astrophysics Data System (ADS)

    Winter, Benjamin; Schneeberger, Klaus; Dung Nguyen, Viet; Vorogushyn, Sergiy; Huttenlau, Matthias; Merz, Bruno; Stötter, Johann

    2017-04-01

    The evaluation of potential monetary damage of flooding is an essential part of flood risk management. One possibility to estimate the monetary risk is to analyze long time series of observed flood events and their corresponding damages. In reality, however, only few flood events are documented. This limitation can be overcome by the generation of a set of synthetic, physically and spatially plausible flood events and subsequently the estimation of the resulting monetary damages. In the present work, a set of synthetic flood events is generated by a continuous rainfall-runoff simulation in combination with a coupled weather generator and temporal disaggregation procedure for the study area of Vorarlberg (Austria). Most flood risk studies focus on daily time steps; however, the mesoscale alpine study area is characterized by short concentration times, leading to large differences between daily mean and daily maximum discharge. Accordingly, an hourly time step is needed for the simulations. The hourly meteorological input for the rainfall-runoff model is generated in a two-step approach. A synthetic daily dataset is generated by a multivariate and multisite weather generator and subsequently disaggregated to hourly time steps with a k-Nearest-Neighbor model. Following the event generation procedure, the negative consequences of flooding are analyzed. The corresponding flood damage for each synthetic event is estimated by combining the synthetic discharge at representative points of the river network with a loss probability relation for each community in the study area. The loss probability relation is based on exposure and susceptibility analyses on a single object basis (residential buildings) for certain return periods. For these impact analyses official inundation maps of the study area are used. Finally, by analyzing the total event time series of damages, the expected annual damage or losses associated with a certain probability of occurrence can be estimated for the entire study area.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodríguez-González, R.; Martínez-Orozco, J. C.; Madrigal-Melchor, J.

    In this work we use the standard T-matrix method to study the tunneling of Dirac electrons through graphene multilayers. A graphene sheet is deposited on top of slabs of silicon oxide (SiO2) and silicon carbide (SiC) substrates, to which we apply the Cantor series. We calculate the transmittance as a function of energy for different incident angles and different generations of the Cantor series. Comparing the transmittance, we found three types of self-similarity: (a) local, within generations, (b) between incident angles, and (c) between generations. We also compute the angular distribution of the transmittance for fixed energies, finding a self-similar pattern between generations. To our knowledge, this is the first time that four different self-similar patterns are presented in Cantor-based multilayers.

  19. A Spiral-Based Downscaling Method for Generating 30 m Time Series Image Data

    NASA Astrophysics Data System (ADS)

    Liu, B.; Chen, J.; Xing, H.; Wu, H.; Zhang, J.

    2017-09-01

    The spatial detail and updating frequency of land cover data are important factors influencing land surface dynamic monitoring applications at high spatial resolution. However, the fragmented patches and seasonal variability of some land cover types (e.g. small crop fields, wetland) make the generation of land cover data labor-intensive and difficult. Utilizing high spatial resolution multi-temporal image data is a possible solution. Unfortunately, the spatial and temporal resolution of available remote sensing data such as the Landsat or MODIS datasets can hardly satisfy the minimum mapping unit and update frequency of current land cover mapping at the same time. The generation of high resolution time series may be a compromise to cover this shortage in the land cover updating process. One popular approach is to downscale multi-temporal MODIS data with other high spatial resolution auxiliary data such as Landsat. However, the usual manner of downscaling a pixel based on a window may lead to an underdetermined problem in heterogeneous areas, resulting in uncertainty for some high spatial resolution pixels; consequently, the downscaled multi-temporal data can hardly reach a spatial resolution as high as that of Landsat data. A spiral-based method is therefore introduced to downscale low spatial, high temporal resolution image data to high spatial, high temporal resolution image data. By searching for similar pixels in the adjacent region along a spiral, a pixel set is built up pixel by pixel. Adopting this pixel set largely prevents the underdetermined problem when solving the linear system. With the help of ordinary least squares, the method inverts the endmember values of the linear system. The high spatial resolution image is reconstructed band by band on the basis of the high spatial resolution class map and the endmember values, and the high spatial resolution time series is then formed from these images. A simulated experiment and a remote sensing image downscaling experiment were conducted. In the simulated experiment, the 30 m class map dataset GlobeLand30 was adopted to investigate the effect of avoiding the underdetermined problem in the downscaling procedure, and a comparison between the spiral and the window was conducted. Further, MODIS NDVI and Landsat image data were adopted to generate 30 m NDVI time series in the remote sensing image downscaling experiment. The simulated experiment results showed that the proposed method has a robust performance in downscaling pixels in heterogeneous regions and indicated that it is superior to traditional window-based methods. The high resolution time series generated may benefit the mapping and updating of land cover data.

  20. COBRE Research Workshop on Higher Education: Equity and Efficiency.

    ERIC Educational Resources Information Center

    Chicago Univ., IL.

    This document comprises 8 papers presented at the COBRE Research Workshop on Higher Education. The papers are: (1) "Schooling and Equality from Generation to Generation;" (2) "Time Series Changes in Personal Income Inequality: The United States Experience, 1939 to 1985;" (3) "Education, Income, and Ability;" (4) "Proposals for Financing Higher…

  1. Impact of number of realizations on the suitability of simulated weather data for hydrologic and environmental applications

    USDA-ARS?s Scientific Manuscript database

    Stochastic weather generators are widely used in hydrological, environmental, and agricultural applications to simulate and forecast weather time series. However, such stochastic processes usually produce random outputs hence the question on how representative the generated data are if obtained fro...

  2. Comparison of different synthetic 5-min rainfall time series on the results of rainfall runoff simulations in urban drainage modelling

    NASA Astrophysics Data System (ADS)

    Krämer, Stefan; Rohde, Sophia; Schröder, Kai; Belli, Aslan; Maßmann, Stefanie; Schönfeld, Martin; Henkel, Erik; Fuchs, Lothar

    2015-04-01

    The design of urban drainage systems with numerical simulation models requires long, continuous rainfall time series with high temporal resolution. However, suitable observed time series are rare. As a result, usual design concepts often use uncertain or unsuitable rainfall data, which renders them uneconomic or unsustainable. An expedient alternative to observed data is the use of long, synthetic rainfall time series as input for the simulation models. Within the project SYNOPSE, several different methods to generate synthetic rainfall data as input for urban drainage modelling are advanced, tested, and compared. Synthetic rainfall time series from three different precipitation model approaches (one parametric stochastic model based on an alternating renewal approach, one non-parametric stochastic model based on resampling, and one downscaling approach from a regional climate model) are provided for three catchments with different sewer system characteristics in different climate regions of Germany: Hamburg (northern Germany), maritime climate, mean annual rainfall 770 mm, combined sewer system length 1,729 km (city centre of Hamburg), storm water sewer system length (Hamburg Harburg) 168 km; Brunswick (Lower Saxony, northern Germany), transitional climate from maritime to continental, mean annual rainfall 618 mm, sewer system length 278 km, connected impervious area 379 ha, height difference 27 m; Freiburg im Breisgau (southern Germany), Central European transitional climate, mean annual rainfall 908 mm, sewer system length 794 km, connected impervious area 1,546 ha, height difference 284 m. Hydrodynamic models are set up for each catchment to simulate rainfall runoff processes in the sewer systems. Long term event time series are extracted both from the three different synthetic rainfall time series (comprising up to 600 years of continuous rainfall) provided for each catchment and from observed gauge rainfall (reference rainfall), according to national hydraulic design standards. The synthetic and reference long term event time series are used as rainfall input for the hydrodynamic sewer models. To compare the synthetic rainfall time series against the reference rainfall and against each other, the number of surcharged manholes, the number of surcharges per manhole, and the average surcharge volume per manhole are applied as hydraulic performance criteria. The results are discussed and assessed to answer the following questions: Are the synthetic rainfall approaches suitable for generating high resolution rainfall series, and do they produce, in combination with numerical rainfall runoff models, valid results for the design of urban drainage systems? What are the bounds of uncertainty in the runoff results depending on the synthetic rainfall model and on the climate region? The work is carried out within the SYNOPSE project, funded by the German Federal Ministry of Education and Research (BMBF).

  3. Generation of synthetic influent data to perform (micro)pollutant wastewater treatment modelling studies.

    PubMed

    Snip, L J P; Flores-Alsina, X; Aymerich, I; Rodríguez-Mozaz, S; Barceló, D; Plósz, B G; Corominas, Ll; Rodriguez-Roda, I; Jeppsson, U; Gernaey, K V

    2016-11-01

    The use of process models to simulate the fate of micropollutants in wastewater treatment plants is constantly growing. However, due to the high workload and cost of measuring campaigns, many simulation studies lack sufficiently long time series representing realistic wastewater influent dynamics. In this paper, the feasibility of the Benchmark Simulation Model No. 2 (BSM2) influent generator is tested to create realistic dynamic influent (micro)pollutant disturbance scenarios. The presented set of models is adjusted to describe the occurrence of three pharmaceutical compounds and one metabolite of each, with samples taken every 2-4 h: the anti-inflammatory drug ibuprofen (IBU), the antibiotic sulfamethoxazole (SMX) and the psychoactive drug carbamazepine (CMZ). Information about type of excretion and total consumption rates forms the basis for creating the data-defined profiles used to generate the dynamic time series. In addition, the traditional influent characteristics such as flow rate, ammonium, particulate chemical oxygen demand and temperature are also modelled using the same framework with high frequency data. The calibration is performed semi-automatically with two different methods depending on data availability. The 'traditional' variables are calibrated with the Bootstrap method while the pharmaceutical loads are estimated with a least squares approach. The simulation results demonstrate that the BSM2 influent generator can describe the dynamics of both traditional variables and pharmaceuticals. Lastly, the study is complemented with: 1) the generation of longer time series for IBU following the same catchment principles; 2) the study of the impact of in-sewer SMX biotransformation when estimating the average daily load; and 3) a critical discussion of the results and the future opportunities of the presented approach, balancing model structure/calibration procedure complexity versus predictive capabilities. Copyright © 2016. Published by Elsevier B.V.

  4. Space shuttle simulation model

    NASA Technical Reports Server (NTRS)

    Tatom, F. B.; Smith, S. R.

    1980-01-01

    The effects of atmospheric turbulence in both horizontal and near horizontal flight, during the return of the space shuttle, are important for determining design, control, and 'pilot-in-the-loop' effects. A nonrecursive model (based on von Karman spectra) for atmospheric turbulence along the flight path of the shuttle orbiter was developed which provides for simulation of instantaneous vertical and horizontal gusts at the vehicle center-of-gravity, and also for simulation of instantaneous gust gradients. Based on this model, the time series for both gusts and gust gradients were generated and stored on a series of magnetic tapes which are entitled shuttle simulation turbulence tapes (SSTT). The time series are designed to represent atmospheric turbulence from ground level to an altitude of 10,000 meters. The turbulence generation procedure is described as well as the results of validating the simulated turbulence. Conclusions and recommendations are presented and references cited. The tabulated one dimensional von Karman spectra and the results of spectral and statistical analyses of the SSTT are contained in the appendix.
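    A common way to realize a nonrecursive gust simulation of this kind is to shape white Gaussian noise in the frequency domain so that its power spectrum follows the target (here von Karman-type) form. The sketch below illustrates that generic procedure only; the spectral constants, scale length, airspeed and sample parameters are illustrative assumptions, not the values used for the SSTT.

```python
import numpy as np

def von_karman_like_psd(f, sigma2=1.0, scale_length=200.0, airspeed=100.0):
    """Illustrative von Karman-type longitudinal gust spectrum (constants assumed)."""
    x = 1.339 * 2.0 * np.pi * f * scale_length / airspeed
    return sigma2 * (2.0 * scale_length / airspeed) / (1.0 + x ** 2) ** (5.0 / 6.0)

def gaussian_series_from_psd(psd_func, n, dt, sigma, rng):
    """Nonrecursive synthesis: filter white Gaussian noise in the frequency
    domain so the output follows the prescribed spectral shape, then rescale
    to the desired gust standard deviation."""
    freqs = np.fft.rfftfreq(n, dt)
    shape = np.sqrt(psd_func(freqs))
    shape[0] = 0.0                                   # zero-mean output
    spectrum = shape * np.fft.rfft(rng.standard_normal(n))
    series = np.fft.irfft(spectrum, n=n)
    return sigma * series / series.std()

rng = np.random.default_rng(1)
gust = gaussian_series_from_psd(von_karman_like_psd, n=2 ** 14, dt=0.05,
                                sigma=1.5, rng=rng)  # 1.5 m/s gust intensity (assumed)
```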

  5. Quantifying and Reducing Uncertainty in Correlated Multi-Area Short-Term Load Forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Yannan; Hou, Zhangshuan; Meng, Da

    2016-07-17

    In this study, we represent and reduce the uncertainties in short-term electric load forecasting by integrating time series analysis tools including ARIMA modeling, sequential Gaussian simulation, and principal component analysis. The approaches focus mainly on maintaining the inter-dependency between multiple geographically related areas. They are applied to cross-correlated load time series as well as to their forecast errors. Multiple short-term prediction realizations are then generated from the reduced uncertainty ranges, which are useful for power system risk analyses.

  6. Probabilistic Reasoning Over Seismic Time Series: Volcano Monitoring by Hidden Markov Models at Mt. Etna

    NASA Astrophysics Data System (ADS)

    Cassisi, Carmelo; Prestifilippo, Michele; Cannata, Andrea; Montalto, Placido; Patanè, Domenico; Privitera, Eugenio

    2016-07-01

    From January 2011 to December 2015, Mt. Etna was mainly characterized by a cyclic eruptive behavior with more than 40 lava fountains from the New South-East Crater. Using the RMS (Root Mean Square) of the seismic signal recorded by stations close to the summit area, automatic recognition of the different states of volcanic activity (QUIET, PRE-FOUNTAIN, FOUNTAIN, POST-FOUNTAIN) has been applied for monitoring purposes. Since the values of the RMS time series calculated on the seismic signal are generated by a stochastic process, we can try to model the system generating the sampled values, assumed to be a Markov process, using Hidden Markov Models (HMMs). HMM analysis seeks to recover the sequence of hidden states from the observations. In our framework, the observations are characters generated by the Symbolic Aggregate approXimation (SAX) technique, which maps RMS time series values to symbols of a pre-defined alphabet. The main advantages of the proposed framework, based on HMMs and SAX, with respect to other automatic systems applied to seismic signals at Mt. Etna, are the use of multiple stations and static thresholds to characterize the volcano states well. Its application to a wide seismic dataset of Etna volcano shows that the volcano states can be inferred. The experimental results show that, in most cases, we detected lava fountains in advance.
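    The SAX step is straightforward to reproduce: z-normalise the RMS trace, reduce it by piecewise aggregate approximation, and map each segment mean to a letter using the equiprobable breakpoints of a standard normal distribution. The sketch below shows this generic SAX discretisation on a synthetic stand-in for an RMS series; the segment count and alphabet size are arbitrary choices, not those used by the authors.

```python
import numpy as np
from scipy.stats import norm

def sax(series, n_segments, alphabet_size):
    """Symbolic Aggregate approXimation: z-normalise, piecewise aggregate
    approximation, then map segment means to letters via normal breakpoints."""
    x = (series - series.mean()) / series.std()
    paa = np.array([seg.mean() for seg in np.array_split(x, n_segments)])
    breakpoints = norm.ppf(np.linspace(0.0, 1.0, alphabet_size + 1)[1:-1])
    symbols = np.digitize(paa, breakpoints)          # integers 0 .. alphabet_size-1
    return "".join(chr(ord("a") + s) for s in symbols)

# stand-in for an RMS trace of the seismic signal (purely synthetic)
rng = np.random.default_rng(2)
rms = np.abs(np.sin(np.linspace(0, 6 * np.pi, 600))) + 0.1 * rng.random(600)
word = sax(rms, n_segments=20, alphabet_size=4)      # symbol string fed to the HMM
```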

  7. Multiscale entropy-based methods for heart rate variability complexity analysis

    NASA Astrophysics Data System (ADS)

    Silva, Luiz Eduardo Virgilio; Cabella, Brenno Caetano Troca; Neves, Ubiraci Pereira da Costa; Murta Junior, Luiz Otavio

    2015-03-01

    Physiologic complexity is an important concept for characterizing time series from biological systems, and combined with multiscale analysis it can contribute to the comprehension of many complex phenomena. Although multiscale entropy has been applied to physiological time series, it measures irregularity as a function of scale. In this study we propose and evaluate a set of three complexity metrics as functions of time scale. The complexity metrics are derived from nonadditive entropy supported by the generation of surrogate data, i.e. SDiffqmax, qmax and qzero. In order to assess the accuracy of the proposed complexity metrics, receiver operating characteristic (ROC) curves were built and the area under the curves was computed for three physiological situations. Heart rate variability (HRV) time series in normal sinus rhythm, atrial fibrillation, and congestive heart failure data sets were analyzed. Results show that the proposed complexity metrics are accurate and robust when compared to classic entropic irregularity metrics. Furthermore, SDiffqmax is the most accurate for lower scales, whereas qmax and qzero are the most accurate when higher time scales are considered. The multiscale complexity analysis described here showed potential to assess complex physiological time series and deserves further investigation in a wider context.
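    For reference, the conventional multiscale procedure these metrics build on consists of coarse-graining the series at increasing scales and computing an entropy estimate at each scale. The sketch below implements that baseline with standard sample entropy, not the nonadditive SDiffqmax/qmax/qzero metrics proposed in the paper; parameter choices are common defaults, and the quadratic-memory template comparison is only suitable for short series.

```python
import numpy as np

def coarse_grain(x, scale):
    """Average non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return x[: n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m=2, r_factor=0.2):
    """SampEn(m, r) with tolerance r = r_factor * std(x); O(N^2) memory."""
    r = r_factor * np.std(x)

    def matches(length):
        t = np.array([x[i : i + length] for i in range(len(x) - length)])
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
        return np.sum(d <= r) - len(t)       # exclude self-matches

    b, a = matches(m), matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, max_scale=10):
    return [sample_entropy(coarse_grain(x, s)) for s in range(1, max_scale + 1)]

rng = np.random.default_rng(3)
rr_intervals = 0.8 + 0.05 * rng.standard_normal(1000)   # synthetic stand-in for HRV
curve = multiscale_entropy(rr_intervals, max_scale=5)
```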

  8. Data imputation analysis for Cosmic Rays time series

    NASA Astrophysics Data System (ADS)

    Fernandes, R. C.; Lucio, P. S.; Fernandez, J. H.

    2017-05-01

    The occurrence of missing data in Galactic Cosmic Ray (GCR) time series is inevitable, since data are lost to mechanical and human failure, technical problems, and the different periods of operation of GCR stations. The aim of this study was to perform multiple imputation in order to complete the observational dataset. The study used the monthly time series of GCR Climax (CLMX) and Roma (ROME) from 1960 to 2004 to simulate scenarios of 10%, 20%, 30%, 40%, 50%, 60%, 70%, 80% and 90% missing data relative to the observed ROME series, with 50 replicates. The CLMX station was then used as a proxy for allocating these scenarios. Three different methods for monthly dataset imputation were selected: Amelia II, which runs the bootstrap Expectation Maximization algorithm; MICE, which runs an algorithm via Multivariate Imputation by Chained Equations; and MTSDI, an Expectation Maximization algorithm-based method for imputation of missing values in multivariate normal time series. The synthetic time series were also evaluated against the observed ROME series using several skill measures, such as RMSE, NRMSE, Agreement Index, R, R2, F-test and t-test. The results showed that for CLMX and ROME, the R2 and R statistics were equal to 0.98 and 0.96, respectively. It was observed that increases in the number of gaps degrade the quality of the time series. Data imputation was most efficient with the MTSDI method, with negligible errors and the best skill coefficients. The results suggest a practical limit of about 60% missing data for imputation of monthly averages. It is noteworthy that the CLMX, ROME and KIEL stations present no missing data in the target period. This methodology allowed 43 time series to be reconstructed.
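    Several of the skill measures mentioned above are easy to compute directly; the sketch below gives one plausible set of formulas (RMSE, range-normalised RMSE, Pearson R, R2 and Willmott's index of agreement). The exact definitions used in the paper, in particular the normalisation of NRMSE and the agreement index variant, are assumptions here, and the example data are synthetic.

```python
import numpy as np

def skill_scores(obs, sim):
    """RMSE, range-normalised RMSE, Pearson R, R2 and Willmott's agreement index."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    err = sim - obs
    rmse = np.sqrt(np.mean(err ** 2))
    nrmse = rmse / (obs.max() - obs.min())
    r = np.corrcoef(obs, sim)[0, 1]
    agreement = 1.0 - np.sum(err ** 2) / np.sum(
        (np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return {"RMSE": rmse, "NRMSE": nrmse, "R": r, "R2": r ** 2, "d": agreement}

rng = np.random.default_rng(4)
observed = rng.gamma(2.0, 50.0, 240)              # synthetic monthly GCR counts
imputed = observed + rng.normal(0.0, 5.0, 240)    # stand-in for an imputed series
print(skill_scores(observed, imputed))
```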

  9. Refined composite multiscale weighted-permutation entropy of financial time series

    NASA Astrophysics Data System (ADS)

    Zhang, Yongping; Shang, Pengjian

    2018-04-01

    For quantifying the complexity of nonlinear systems, multiscale weighted-permutation entropy (MWPE) has recently been proposed. MWPE incorporates amplitude information and has been applied to account for the multiple inherent dynamics of time series. However, MWPE may be unreliable, because its estimates fluctuate strongly under slight variations of the data locations and differ significantly only for time series of different lengths. Therefore, we propose the refined composite multiscale weighted-permutation entropy (RCMWPE). Comparing the RCMWPE results with those of other methods on both synthetic data and financial time series, the RCMWPE method not only retains the advantages inherited from MWPE but is also less sensitive to the data locations, more stable, and much less dependent on the length of the time series. Moreover, we present and discuss the results of the RCMWPE method on daily price return series from Asian and European stock markets. There are significant differences between Asian and European markets, and the entropy values of the Hang Seng Index (HSI) are close to but higher than those of the European markets. The reliability of the proposed RCMWPE method has been supported by simulations on generated and real data. It could be applied in a variety of fields to quantify the complexity of systems over multiple scales more accurately.

  10. Internal combustion engine control for series hybrid electric vehicles by parallel and distributed genetic programming/multiobjective genetic algorithms

    NASA Astrophysics Data System (ADS)

    Gladwin, D.; Stewart, P.; Stewart, J.

    2011-02-01

    This article addresses the problem of maintaining a stable rectified DC output from the three-phase AC generator in a series-hybrid vehicle powertrain. The series-hybrid prime power source generally comprises an internal combustion (IC) engine driving a three-phase permanent magnet generator whose output is rectified to DC. A recent development has been to control the engine/generator combination by an electronically actuated throttle. This system can be represented as a nonlinear system with significant time delay. Previously, voltage control of the generator output has been achieved by model predictive methods such as the Smith Predictor. These methods rely on the incorporation of an accurate system model and time delay into the control algorithm, with a consequent increase in computational complexity in the real-time controller, and of necessity rely to some extent on the accuracy of the models. Two complementary performance objectives exist for the control system. Firstly, to maintain the IC engine at its optimal operating point, and secondly, to supply a stable DC supply to the traction drive inverters. Achievement of these goals minimises the transient energy storage requirements at the DC link, with a consequent reduction in both weight and cost. These objectives imply constant velocity operation of the IC engine under external load disturbances and changes in both operating conditions and vehicle speed set-points. In order to achieve these objectives, and reduce the complexity of implementation, in this article a controller is designed by the use of Genetic Programming methods in the Simulink modelling environment, with the aim of obtaining a relatively simple controller for the time-delay system which does not rely on the implementation of real time system models or time delay approximations in the controller. A methodology is presented to utilise the myriad of existing control blocks in the Simulink libraries to automatically evolve optimal control structures.

  11. Parametric vs. non-parametric daily weather generator: validation and comparison

    NASA Astrophysics Data System (ADS)

    Dubrovsky, Martin

    2016-04-01

    As the climate models (GCMs and RCMs) fail to satisfactorily reproduce the real-world surface weather regime, various statistical methods are applied to downscale GCM/RCM outputs into site-specific weather series. Stochastic weather generators are among the most popular downscaling methods, capable of producing realistic (observation-like) meteorological inputs for agricultural, hydrological and other impact models used in assessing the sensitivity of various ecosystems to climate change and variability. Among their advantages, the generators may (i) produce arbitrarily long multi-variate synthetic weather series representing both present and changed climates (in the latter case, the generators are commonly modified by GCM/RCM-based climate change scenarios), (ii) be run at various time steps and for multiple weather variables (the generators reproduce the correlations among variables), and (iii) be interpolated (and run also for sites where no weather data are available to calibrate the generator). This contribution will compare two stochastic daily weather generators in terms of their ability to reproduce various features of daily weather series. M&Rfi is a parametric generator: a Markov chain model is used to model precipitation occurrence, precipitation amount is modelled by the Gamma distribution, and a first-order autoregressive model is used to generate non-precipitation surface weather variables. The non-parametric GoMeZ generator is based on the nearest-neighbours resampling technique, making no assumption on the distribution of the variables being generated. Various settings of both weather generators will be assumed in the present validation tests. The generators will be validated in terms of (a) extreme temperature and precipitation characteristics (annual and 30-year extremes and maximum durations of hot/cold/dry/wet spells); and (b) selected validation statistics developed within the frame of the VALUE project. The tests will be based on observational weather series from several European stations available from the ECA&D database.
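    As an illustration of the parametric approach, the occurrence-plus-amount structure described for M&Rfi can be sketched in a few lines: a first-order two-state Markov chain decides wet/dry days and a Gamma distribution draws amounts on wet days. The parameter values below are purely illustrative, and the sketch omits seasonality, the autoregressive model for the non-precipitation variables, and multi-site correlation.

```python
import numpy as np

def simulate_precipitation(p_wet_after_dry, p_wet_after_wet,
                           gamma_shape, gamma_scale, n_days, rng):
    """First-order Markov chain for wet/dry occurrence, Gamma-distributed
    amounts on wet days (all parameter values are illustrative)."""
    precip = np.zeros(n_days)
    wet = False
    for t in range(n_days):
        p = p_wet_after_wet if wet else p_wet_after_dry
        wet = rng.random() < p
        if wet:
            precip[t] = rng.gamma(gamma_shape, gamma_scale)
    return precip

rng = np.random.default_rng(5)
daily_precip = simulate_precipitation(p_wet_after_dry=0.25, p_wet_after_wet=0.60,
                                      gamma_shape=0.8, gamma_scale=6.0,
                                      n_days=365 * 100, rng=rng)
```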

  12. Comparison of detrending methods for fluctuation analysis in hydrology

    NASA Astrophysics Data System (ADS)

    Zhang, Qiang; Zhou, Yu; Singh, Vijay P.; Chen, Yongqin David

    2011-03-01

    Summary: Trends within a hydrologic time series can significantly influence the scaling results of fluctuation analysis, such as rescaled range (RS) analysis and (multifractal) detrended fluctuation analysis (MF-DFA). Therefore, removal of trends is important in the study of scaling properties of the time series. In this study, three detrending methods, the adaptive detrending algorithm (ADA), a Fourier-based method, and the average removing technique, were evaluated by analyzing numerically generated series and observed streamflow series with obvious, relatively regular periodic trends. Results indicated that: (1) the Fourier-based detrending method and ADA are similar in detrending practice and, given proper parameters, these two methods can produce similarly satisfactory results; (2) series detrended by the Fourier-based method and ADA lose fluctuation information at larger time scales, and the location of crossover points is heavily affected by the chosen parameters of these two methods; and (3) the average removing method has an advantage over the other two methods in that the fluctuation information at larger time scales is kept well, an indication of relatively reliable detrending performance. In addition, the average removing method performed reasonably well in detrending a time series with regular periods or trends. In this sense, the average removing method should be preferred in the study of scaling properties of hydrometeorological series with relatively regular periodic trends using MF-DFA.
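    One common reading of the average removing technique is to subtract the long-term mean of each phase of the dominant period (for daily streamflow, the mean annual cycle) before running MF-DFA. The sketch below shows that generic operation under this interpretation; the period and the handling of incomplete final cycles are simplifying assumptions, not necessarily the authors' exact procedure.

```python
import numpy as np

def remove_periodic_mean(x, period):
    """'Average removing' detrending: subtract the long-term mean of each
    phase of the assumed period (e.g. period=365 for daily streamflow)."""
    x = np.asarray(x, dtype=float)
    phase = np.arange(len(x)) % period
    cycle = np.array([x[phase == p].mean() for p in range(period)])
    return x - cycle[phase]

# toy usage: a noisy series with an annual cycle
rng = np.random.default_rng(6)
t = np.arange(365 * 20)
flow = 100 + 40 * np.sin(2 * np.pi * t / 365) + rng.standard_normal(len(t))
residual = remove_periodic_mean(flow, period=365)     # detrended input for MF-DFA
```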

  13. PSO-MISMO modeling strategy for multistep-ahead time series prediction.

    PubMed

    Bao, Yukun; Xiong, Tao; Hu, Zhongyi

    2014-05-01

    Multistep-ahead time series prediction is one of the most challenging research topics in the field of time series modeling and prediction, and is continually under research. Recently, the multiple-input several multiple-outputs (MISMO) modeling strategy has been proposed as a promising alternative for multistep-ahead time series prediction, exhibiting advantages compared with the two currently dominating strategies, the iterated and the direct strategies. Built on the established MISMO strategy, this paper proposes a particle swarm optimization (PSO)-based MISMO modeling strategy, which is capable of determining the number of sub-models in a self-adaptive mode, with varying prediction horizons. Rather than deriving crisp divides with equal-sized prediction horizons from the established MISMO, the proposed PSO-MISMO strategy, implemented with neural networks, employs a heuristic to create flexible divides with varying sizes of prediction horizons and to generate corresponding sub-models, providing considerable flexibility in model construction, which has been validated with simulated and real datasets.

  14. Chatter detection in turning using persistent homology

    NASA Astrophysics Data System (ADS)

    Khasawneh, Firas A.; Munch, Elizabeth

    2016-03-01

    This paper describes a new approach for ascertaining the stability of stochastic dynamical systems in their parameter space by examining their time series using topological data analysis (TDA). We illustrate the approach using a nonlinear delayed model that describes the tool oscillations due to self-excited vibrations in turning. Each time series is generated using the Euler-Maruyama method and a corresponding point cloud is obtained using the Takens embedding. The point cloud can then be analyzed using a tool from TDA known as persistent homology. The results of this study show that the described approach can be used for analyzing datasets of delay dynamical systems generated both from numerical simulation and experimental data. The contributions of this paper include presenting for the first time a topological approach for investigating the stability of a class of nonlinear stochastic delay equations, and introducing a new application of TDA to machining processes.
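    The pipeline described (simulate a stochastic model, delay-embed the resulting series, then feed the point cloud to persistent homology) starts from a standard Takens embedding. The sketch below shows only the embedding step on an illustrative Euler-Maruyama path of a damped stochastic oscillator; the dynamical model, step size, embedding dimension and delay are placeholder choices, and the persistent homology computation itself (e.g. via a TDA library) is not shown.

```python
import numpy as np

def takens_embedding(x, dim, delay):
    """Delay-coordinate (Takens) embedding of a scalar series into R^dim."""
    n = len(x) - (dim - 1) * delay
    return np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])

# Euler-Maruyama path of a damped stochastic oscillator (placeholder dynamics)
rng = np.random.default_rng(7)
dt, n = 0.01, 5000
x, v = np.zeros(n), np.zeros(n)
for t in range(1, n):
    v[t] = v[t - 1] + (-0.1 * v[t - 1] - x[t - 1]) * dt \
           + 0.05 * np.sqrt(dt) * rng.standard_normal()
    x[t] = x[t - 1] + v[t - 1] * dt

point_cloud = takens_embedding(x, dim=3, delay=20)   # input for persistent homology
```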

  15. Second and third order nonlinear optical properties of conjugated molecules and polymers

    NASA Technical Reports Server (NTRS)

    Perry, Joseph W.; Stiegman, Albert E.; Marder, Seth R.; Coulter, Daniel R.; Beratan, David N.; Brinza, David E.

    1988-01-01

    Second- and third-order nonlinear optical properties of some newly synthesized organic molecules and polymers are reported. Powder second-harmonic-generation efficiencies of up to 200 times urea have been realized for asymmetric donor-acceptor acetylenes. Third harmonic generation chi(3)s have been determined for a series of small conjugated molecules in solution. THG chi(3)s have also been determined for a series of soluble conjugated copolymers prepared using ring-opening metathesis polymerization. The results are discussed in terms of relevant molecular and/or macroscopic structural features of these conjugated organic materials.

  16. Method of multiplexed analysis using ion mobility spectrometer

    DOEpatents

    Belov, Mikhail E [Richland, WA; Smith, Richard D [Richland, WA

    2009-06-02

    A method for analyzing analytes from a sample introduced into a Spectrometer by generating a pseudo random sequence of modulation bins, organizing each modulation bin as a series of submodulation bins, thereby forming an extended pseudo random sequence of submodulation bins, releasing the analytes in a series of analyte packets into a Spectrometer, thereby generating an unknown original ion signal vector, detecting the analytes at a detector, and characterizing the sample using the plurality of analyte signal subvectors. The method is advantageously applied to an Ion Mobility Spectrometer, and an Ion Mobility Spectrometer interfaced with a Time of Flight Mass Spectrometer.

  17. Alteration of Box-Jenkins methodology by implementing genetic algorithm method

    NASA Astrophysics Data System (ADS)

    Ismail, Zuhaimy; Maarof, Mohd Zulariffin Md; Fadzli, Mohammad

    2015-02-01

    A time series is a set of values sequentially observed through time. The Box-Jenkins methodology is a systematic method of identifying, fitting, checking and using integrated autoregressive moving average (ARIMA) time series models for forecasting. The Box-Jenkins method is appropriate for medium to long time series (at least 50 observations). When modeling such series, the difficulty lies in choosing the correct order at the model identification stage and in finding the right parameter estimates. This paper presents the development of a Genetic Algorithm heuristic for solving the identification and estimation problems in Box-Jenkins modeling. Data on international tourist arrivals to Malaysia were used to illustrate the effectiveness of the proposed method. The forecasts generated from the proposed model outperformed the single traditional Box-Jenkins model.

  18. Automatic Dance Lesson Generation

    ERIC Educational Resources Information Center

    Yang, Yang; Leung, H.; Yue, Lihua; Deng, LiQun

    2012-01-01

    In this paper, an automatic lesson generation system is presented which is suitable in a learning-by-mimicking scenario where the learning objects can be represented as multiattribute time series data. The dance is used as an example in this paper to illustrate the idea. Given a dance motion sequence as the input, the proposed lesson generation…

  19. Wavelet analysis in ecology and epidemiology: impact of statistical tests

    PubMed Central

    Cazelles, Bernard; Cazelles, Kévin; Chavez, Mario

    2014-01-01

    Wavelet analysis is now frequently used to extract information from ecological and epidemiological time series. Statistical hypothesis tests are conducted on associated wavelet quantities to assess the likelihood that they are due to a random process. Such random processes represent null models and are generally based on synthetic data that share some statistical characteristics with the original time series. This allows the comparison of null statistics with those obtained from original time series. When creating synthetic datasets, different techniques of resampling result in different characteristics shared by the synthetic time series. Therefore, it becomes crucial to consider the impact of the resampling method on the results. We have addressed this point by comparing seven different statistical testing methods applied with different real and simulated data. Our results show that statistical assessment of periodic patterns is strongly affected by the choice of the resampling method, so two different resampling techniques could lead to two different conclusions about the same time series. Moreover, our results clearly show the inadequacy of resampling series generated by white noise and red noise that are nevertheless the methods currently used in the wide majority of wavelets applications. Our results highlight that the characteristics of a time series, namely its Fourier spectrum and autocorrelation, are important to consider when choosing the resampling technique. Results suggest that data-driven resampling methods should be used such as the hidden Markov model algorithm and the ‘beta-surrogate’ method. PMID:24284892

  20. Wavelet analysis in ecology and epidemiology: impact of statistical tests.

    PubMed

    Cazelles, Bernard; Cazelles, Kévin; Chavez, Mario

    2014-02-06

    Wavelet analysis is now frequently used to extract information from ecological and epidemiological time series. Statistical hypothesis tests are conducted on associated wavelet quantities to assess the likelihood that they are due to a random process. Such random processes represent null models and are generally based on synthetic data that share some statistical characteristics with the original time series. This allows the comparison of null statistics with those obtained from original time series. When creating synthetic datasets, different techniques of resampling result in different characteristics shared by the synthetic time series. Therefore, it becomes crucial to consider the impact of the resampling method on the results. We have addressed this point by comparing seven different statistical testing methods applied with different real and simulated data. Our results show that statistical assessment of periodic patterns is strongly affected by the choice of the resampling method, so two different resampling techniques could lead to two different conclusions about the same time series. Moreover, our results clearly show the inadequacy of resampling series generated by white noise and red noise that are nevertheless the methods currently used in the wide majority of wavelets applications. Our results highlight that the characteristics of a time series, namely its Fourier spectrum and autocorrelation, are important to consider when choosing the resampling technique. Results suggest that data-driven resampling methods should be used such as the hidden Markov model algorithm and the 'beta-surrogate' method.

  1. A dataset of future daily weather data for crop modelling over Europe derived from climate change scenarios

    NASA Astrophysics Data System (ADS)

    Duveiller, G.; Donatelli, M.; Fumagalli, D.; Zucchini, A.; Nelson, R.; Baruth, B.

    2017-02-01

    Coupled atmosphere-ocean general circulation models (GCMs) simulate different realizations of possible future climates at global scale under contrasting scenarios of land-use and greenhouse gas emissions. Such data require several additional processing steps before they can be used to drive impact models. Spatial downscaling, typically by regional climate models (RCMs), and bias-correction are two such steps that have already been addressed for Europe. Yet, the errors in the resulting daily meteorological variables may be too large for specific model applications. Crop simulation models are particularly sensitive to these inconsistencies and thus require further processing of GCM-RCM outputs. Moreover, crop models are often run in a stochastic manner by using various plausible weather time series (often generated using stochastic weather generators) to represent the climate of a period of interest (e.g. 2000 ± 15 years), while GCM simulations typically provide a single time series for a given emission scenario. To inform agricultural policy-making, data on near- and medium-term decadal time scales are mostly requested, e.g. 2020 or 2030. Taking a sample of multiple years from these single time series to represent time horizons in the near future is particularly problematic because selecting overlapping years may lead to spurious trends, creating artefacts in the results of the impact model simulations. This paper presents a database of consolidated and coherent future daily weather data for Europe that addresses these problems. Input data consist of daily temperature and precipitation from three dynamically downscaled and bias-corrected regional climate simulations of the IPCC A1B emission scenario created within the ENSEMBLES project. Solar radiation is estimated from temperature based on an auto-calibration procedure. Wind speed and relative air humidity are collected from historical series. From these variables, reference evapotranspiration and vapour pressure deficit are estimated ensuring consistency within daily records. The weather generator ClimGen is then used to create 30 synthetic years of all variables to characterize the time horizons of 2000, 2020 and 2030, which can readily be used for crop modelling studies.

  2. Wavelet transform approach for fitting financial time series data

    NASA Astrophysics Data System (ADS)

    Ahmed, Amel Abdoullah; Ismail, Mohd Tahir

    2015-10-01

    This study investigates a newly developed technique, combining wavelet filtering and a VEC model, to study the dynamic relationships among financial time series. A wavelet filter is used to remove noise from the daily data of the US NASDAQ stock market and three stock markets of the Middle East and North Africa (MENA) region, namely Egypt, Jordan, and Istanbul. The data cover the period from 6/29/2001 to 5/5/2009. The returns of the series generated by the wavelet filter and of the original series are then analyzed with a cointegration test and a VEC model. The results show that the cointegration test affirms the existence of cointegration between the studied series, and that there is a long-term relationship between the US and MENA stock markets. A comparison between the proposed and traditional models demonstrates that the proposed model (DWT with VEC) outperforms the traditional VEC model in fitting the financial stock market series and reveals more information about the relationships among the stock markets.
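    A generic stand-in for the wavelet filtering step is soft thresholding of the detail coefficients of a discrete wavelet transform before reconstruction, for example with the PyWavelets package. The wavelet family, decomposition level and universal-threshold rule below are common defaults assumed for illustration; the paper's exact filtering choices may differ.

```python
import numpy as np
import pywt

def wavelet_denoise(series, wavelet="db4", level=4):
    """Soft-threshold the detail coefficients of a discrete wavelet transform
    using the universal threshold, then reconstruct the filtered series."""
    coeffs = pywt.wavedec(series, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745           # noise scale estimate
    threshold = sigma * np.sqrt(2.0 * np.log(len(series)))
    coeffs = [coeffs[0]] + [pywt.threshold(c, threshold, mode="soft")
                            for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(series)]

rng = np.random.default_rng(8)
returns = 0.01 * rng.standard_normal(2000)                   # stand-in for daily returns
filtered = wavelet_denoise(returns)                          # then cointegration / VEC
```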

  3. Electric Grid Expansion Planning with High Levels of Variable Generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hadley, Stanton W.; You, Shutang; Shankar, Mallikarjun

    2016-02-01

    Renewables are taking a large proportion of generation capacity in U.S. power grids. As their randomness has increasing influence on power system operation, it is necessary to consider their impact on system expansion planning. To this end, this project studies the generation and transmission expansion co-optimization problem of the US Eastern Interconnection (EI) power grid with a high wind power penetration rate. In this project, the generation and transmission expansion problem for the EI system is modeled as a mixed-integer programming (MIP) problem. This study analyzed a time series creation method to capture the diversity of load and wind power across balancing regions in the EI system. The obtained time series can be easily introduced into the MIP co-optimization problem and then solved robustly through available MIP solvers. Simulation results show that the proposed time series generation method and the expansion co-optimization model can improve the expansion result significantly after considering the diversity of wind and load across EI regions. The improved expansion plan that combines generation and transmission will aid system planners and policy makers in maximizing social welfare. This study shows that modelling load and wind variations and diversities across balancing regions produces significantly different expansion results compared with former studies. For example, if wind is modeled in more detail (by increasing the number of wind output levels) so that more wind blocks are considered in expansion planning, transmission expansion will be larger and the expansion timing will be earlier. Regarding generation expansion, more wind scenarios will slightly reduce wind generation expansion in the EI system and increase the expansion of other generation such as gas. Also, adopting detailed wind scenarios reveals that it may be uneconomic to expand transmission networks for transmitting large amounts of wind power over long distances in the EI system. Incorporating more details of renewables in expansion planning will inevitably increase the computational burden. Therefore, high performance computing (HPC) techniques are urgently needed for power system operation and planning optimization. As a scoping study task, this project tested some preliminary parallel computation techniques, such as breaking down the simulation task into several sub-tasks based on chronology splitting or sample splitting, and then assigning these sub-tasks to different cores. Testing results show significant time reduction when a simulation task is split into several sub-tasks for parallel execution.

  4. Towards pattern generation and chaotic series prediction with photonic reservoir computers

    NASA Astrophysics Data System (ADS)

    Antonik, Piotr; Hermans, Michiel; Duport, François; Haelterman, Marc; Massar, Serge

    2016-03-01

    Reservoir Computing is a bio-inspired computing paradigm for processing time dependent signals that is particularly well suited for analog implementations. Our team has demonstrated several photonic reservoir computers with performance comparable to digital algorithms on a series of benchmark tasks such as channel equalisation and speech recognition. Recently, we showed that our opto-electronic reservoir computer could be trained online with a simple gradient descent algorithm programmed on an FPGA chip. This setup makes it in principle possible to feed the output signal back into the reservoir and thus highly enrich the dynamics of the system. This will allow complex prediction tasks to be tackled in hardware, such as pattern generation and chaotic and financial series prediction, which have so far only been studied in digital implementations. Here we report simulation results of our opto-electronic setup with an FPGA chip and output feedback applied to pattern generation and Mackey-Glass chaotic series prediction. The simulations take into account the major aspects of our experimental setup. We find that pattern generation can be easily implemented on the current setup with very good results. The Mackey-Glass series prediction task is more complex and requires a large reservoir and a more elaborate training algorithm. With these adjustments promising results are obtained, and we now know what improvements are needed to match previously reported numerical results. These simulation results will serve as a basis of comparison for the experiments we will carry out in the coming months.
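    The digital analogue of the scheme above, an echo state network trained by teacher forcing and then run autonomously with its output fed back into the reservoir, can be sketched in a few dozen lines. The reservoir size, spectral radius, ridge regularisation and target pattern below are arbitrary illustrative choices, and the sketch uses offline ridge regression rather than the online FPGA training described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(9)
n_res, washout, train_len = 200, 200, 2000

# target pattern to be generated autonomously (illustrative): a slow sine
t = np.arange(washout + train_len)
target = np.sin(2 * np.pi * t / 40.0)

# random reservoir scaled to spectral radius 0.9, plus output-feedback weights
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_fb = rng.uniform(-0.5, 0.5, n_res)

# teacher forcing: drive the reservoir with the desired output
states = np.zeros((len(t), n_res))
for k in range(1, len(t)):
    states[k] = np.tanh(W @ states[k - 1] + W_fb * target[k - 1])

# ridge-regression readout trained on post-washout states
X, y = states[washout:], target[washout:]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)

# free running: feed the readout output back through the feedback weights
state, out, generated = states[-1], target[-1], []
for _ in range(400):
    state = np.tanh(W @ state + W_fb * out)
    out = state @ W_out
    generated.append(out)
```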

  5. [Approximation to the dynamics of meningococcal meningitis through dynamic systems and time series].

    PubMed

    Canals, M

    1996-02-01

    Meningococcal meningitis is subject to epidemiological surveillance due to its severity and the occasional presentation of epidemic outbreaks. This work analyses previous disease models, generates new ones, and analyses monthly case counts using ARIMA time series models. The results show that the disease dynamics for closed populations is epidemic, and that the epidemic size is related to the proportion of carriers and the transmissibility of the agent. In open populations, the disease dynamics depends on the admission rate of susceptibles and the relative admission of infected individuals. Our model considers logistic population growth and carrier admission proportional to population size, generating an endemic dynamics. Considering a non-instantaneous system response adds realism, establishing that the endemic situation may present dynamics highly sensitive to initial conditions, depending on the transmissibility and the proportion of susceptible individuals in the population. The time series models showed adequate predictive capacity for horizons no longer than 10 months. The lack of long-term predictability was attributed to local changes in the proportion of carriers or in transmissibility that lead to chaotic dynamics over a seasonal pattern. Predictions for 1995 and 1996 were obtained.

  6. TimesVector: a vectorized clustering approach to the analysis of time series transcriptome data from multiple phenotypes.

    PubMed

    Jung, Inuk; Jo, Kyuri; Kang, Hyejin; Ahn, Hongryul; Yu, Youngjae; Kim, Sun

    2017-12-01

    Identifying biologically meaningful gene expression patterns from time series gene expression data is important to understand the underlying biological mechanisms. To identify significantly perturbed gene sets between different phenotypes, analysis of time series transcriptome data requires consideration of time and sample dimensions. Thus, the analysis of such time series data seeks to search gene sets that exhibit similar or different expression patterns between two or more sample conditions, constituting the three-dimensional data, i.e. gene-time-condition. Computational complexity for analyzing such data is very high, compared to the already difficult NP-hard two dimensional biclustering algorithms. Because of this challenge, traditional time series clustering algorithms are designed to capture co-expressed genes with similar expression pattern in two sample conditions. We present a triclustering algorithm, TimesVector, specifically designed for clustering three-dimensional time series data to capture distinctively similar or different gene expression patterns between two or more sample conditions. TimesVector identifies clusters with distinctive expression patterns in three steps: (i) dimension reduction and clustering of time-condition concatenated vectors, (ii) post-processing clusters for detecting similar and distinct expression patterns and (iii) rescuing genes from unclassified clusters. Using four sets of time series gene expression data, generated by both microarray and high throughput sequencing platforms, we demonstrated that TimesVector successfully detected biologically meaningful clusters of high quality. TimesVector improved the clustering quality compared to existing triclustering tools and only TimesVector detected clusters with differential expression patterns across conditions successfully. The TimesVector software is available at http://biohealth.snu.ac.kr/software/TimesVector/. sunkim.bioinfo@snu.ac.kr. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  7. Comparison of different synthetic 5-min rainfall time series regarding their suitability for urban drainage modelling

    NASA Astrophysics Data System (ADS)

    van der Heijden, Sven; Callau Poduje, Ana; Müller, Hannes; Shehu, Bora; Haberlandt, Uwe; Lorenz, Manuel; Wagner, Sven; Kunstmann, Harald; Müller, Thomas; Mosthaf, Tobias; Bárdossy, András

    2015-04-01

    For the design and operation of urban drainage systems with numerical simulation models, long, continuous precipitation time series with high temporal resolution are necessary. Suitable observed time series are rare. As a result, design concepts often use uncertain or unsuitable precipitation data, which renders them uneconomic or unsustainable. An expedient alternative to observed data is the use of long, synthetic rainfall time series as input for the simulation models. Within the project SYNOPSE, several different methods to generate synthetic precipitation data for urban drainage modelling are advanced, tested, and compared. The present study compares four different precipitation modelling approaches regarding their ability to reproduce rainfall and runoff characteristics. These include one parametric stochastic model (alternating renewal approach), one non-parametric stochastic model (resampling approach), one downscaling approach from a regional climate model, and one disaggregation approach based on daily precipitation measurements. All four models produce long precipitation time series with a temporal resolution of five minutes. The synthetic time series are first compared to observed rainfall reference time series. Comparison criteria include event based statistics like mean dry spell and wet spell duration, wet spell amount and intensity, long term means of precipitation sum and number of events, and extreme value distributions for different durations. They are then compared regarding simulated discharge characteristics using an urban hydrological model on a fictitious sewage network. First results show that all rainfall models are suitable in principle, but with different strengths and weaknesses regarding the different rainfall and runoff characteristics considered.
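    Event-based comparison criteria such as mean wet/dry spell duration and wet spell amount can be computed directly from a thresholded 5-min series. The sketch below shows one plausible way to do this; the wet/dry threshold and the definition of a spell as a maximal run of wet steps are simplifying assumptions, not the exact event separation rules used in the study.

```python
import numpy as np

def spell_statistics(rain, wet_threshold=0.1):
    """Mean wet/dry spell duration (in time steps) and mean wet-spell amount
    from a 5-min rainfall series; threshold and event definition are assumed."""
    rain = np.asarray(rain, dtype=float)
    wet = rain > wet_threshold
    change = np.flatnonzero(np.diff(wet.astype(int))) + 1
    runs = np.split(np.arange(len(rain)), change)        # maximal wet or dry runs
    wet_runs = [r for r in runs if wet[r[0]]]
    dry_runs = [r for r in runs if not wet[r[0]]]
    return {
        "mean_wet_spell_len": float(np.mean([len(r) for r in wet_runs])),
        "mean_dry_spell_len": float(np.mean([len(r) for r in dry_runs])),
        "mean_wet_spell_amount": float(np.mean([rain[r].sum() for r in wet_runs])),
        "n_wet_spells": len(wet_runs),
    }

rng = np.random.default_rng(10)
synthetic_5min = rng.gamma(0.05, 2.0, 105_120)           # one year of 5-min values (toy)
stats = spell_statistics(synthetic_5min)
```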

  8. Identification and classification of transient pulses observed in magnetometer array data by time-domain principal component analysis filtering

    NASA Astrophysics Data System (ADS)

    Kappler, Karl N.; Schneider, Daniel D.; MacLean, Laura S.; Bleier, Thomas E.

    2017-08-01

    A method for identification of pulsations in time series of magnetic field data which are simultaneously present in multiple channels of data at one or more sensor locations is described. Candidate pulsations of interest are first identified in geomagnetic time series by inspection. Time series of these "training events" are represented in matrix form and transpose-multiplied to generate time-domain covariance matrices. The ranked eigenvectors of this matrix are stored as a feature of the pulsation. In the second stage of the algorithm, a sliding window (approximately the width of the training event) is moved across the vector-valued time series comprising the channels on which the training event was observed. At each window position, the data covariance matrix and associated eigenvectors are calculated. We compare the orientation of the dominant eigenvectors of the training data to those from the windowed data and flag windows where the dominant eigenvector directions are similar. This was successful in automatically identifying pulses which share polarization and appear to be from the same source process. We apply the method to a case study of continuously sampled (50 Hz) data from six observatories, each equipped with three-component induction coil magnetometers. We examine a 90-day interval of data associated with a cluster of four observatories located within 50 km of Napa, California, together with two remote reference stations, one 100 km to the north of the cluster and the other 350 km south. When the training data contain signals present in the remote reference observatories, we are reliably able to identify and extract global geomagnetic signals such as solar-generated noise. When the training data contain pulsations only observed in the cluster of local observatories, we identify several types of non-plane wave signals having similar polarization.
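    The core of the detector, comparing the dominant eigenvector of the training event's time-domain covariance with the dominant eigenvector of each sliding window, reduces to a few lines of linear algebra. The sketch below is a simplified single-threshold version; the window step, the cosine-similarity threshold and the use of only the first eigenvector are assumptions, and the multi-station bookkeeping of the paper is omitted.

```python
import numpy as np

def dominant_eigvec(window):
    """Dominant eigenvector of the time-domain covariance of a (time, channel) window."""
    cov = np.cov(window, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    return eigvecs[:, -1]

def scan_for_pulses(data, training_event, step=10, similarity=0.95):
    """Flag window start indices whose dominant covariance eigenvector is
    nearly parallel to that of the training event (sign-invariant cosine)."""
    reference = dominant_eigvec(training_event)
    width = len(training_event)
    hits = []
    for start in range(0, len(data) - width, step):
        v = dominant_eigvec(data[start : start + width])
        if abs(np.dot(reference, v)) > similarity:
            hits.append(start)
    return hits

rng = np.random.default_rng(11)
channels = rng.standard_normal((50_000, 3))               # stand-in multichannel record
template = rng.standard_normal((500, 3)) * np.array([1.0, 0.2, 0.1])  # toy training event
candidate_windows = scan_for_pulses(channels, template)
```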

  9. Homogenization of long instrumental temperature and precipitation series over the Spanish Northern Coast

    NASA Astrophysics Data System (ADS)

    Sigro, J.; Brunet, M.; Aguilar, E.; Stoll, H.; Jimenez, M.

    2009-04-01

    The Spanish-funded research project Rapid Climate Changes in the Iberian Peninsula (IP) Based on Proxy Calibration, Long Term Instrumental Series and High Resolution Analyses of Terrestrial and Marine Records (CALIBRE: ref. CGL2006-13327-C04/CLI) has as main objective to analyse climate dynamics during periods of rapid climate change by means of developing high-resolution paleoclimate proxy records from marine and terrestrial (lakes and caves) deposits over the IP and calibrating them with long-term and high-quality instrumental climate time series. Under CALIBRE, the coordinated project Developing and Enhancing a Climate Instrumental Dataset for Calibrating Climate Proxy Data and Analysing Low-Frequency Climate Variability over the Iberian Peninsula (CLICAL: CGL2006-13327-C04-03/CLI) is devoted to the development of homogenised climate records and sub-regional time series which can be confidently used in the calibration of the lacustrine, marine and speleothem time series generated under CALIBRE. Here we present the procedures followed in order to homogenise a dataset of maximum and minimum temperature and precipitation data on a monthly basis over the Spanish northern coast. The dataset is composed of thirty (twenty) precipitation (temperature) long monthly records. The data are quality controlled following the procedures recommended by Aguilar et al. (2003) and tested for homogeneity and adjusted by following the approach adopted by Brunet et al. (2008). Sub-regional time series of precipitation, maximum and minimum temperatures for the period 1853-2007 have been generated by averaging monthly anomalies and then adding back the base-period mean, according to the method of Jones and Hulme (1996). Also, a method to adjust the variance bias present in regional time series associated over time with varying sample size has been applied (Osborn et al., 1997). The results of this homogenisation exercise and the development of the associated sub-regional time series will be widely discussed. Initial comparisons with rapidly growing speleothems in two different caves indicate that speleothem trace element ratios like Ba/Ca are recording the decrease in littoral precipitation in the last several decades. References Aguilar, E., Auer, I., Brunet, M., Peterson, T. C. and Weringa, J. 2003. Guidelines on Climate Metadata and Homogenization, World Meteorological Organization (WMO)-TD no. 1186 / World Climate Data and Monitoring Program (WCDMP) no. 53, Geneva: 51 pp. Brunet M, Saladié O, Jones P, Sigró J, Aguilar E, Moberg A, Lister D, Walther A, Almarza C. 2008. A case-study/guidance on the development of long-term daily adjusted temperature datasets, WMO-TD-1425/WCDMP-66, Geneva: 43 pp. Jones, P D, and Hulme M, 1996, Calculating regional climatic time series for temperature and precipitation: Methods and illustrations, Int. J. Climatol., 16, 361- 377. Osborn, T. J., Briffa K. R., and Jones P. D., 1997, Adjusting variance for sample-size in tree-ring chronologies and other regional mean time series, Dendrochronologia, 15, 89- 99.

  10. CauseMap: fast inference of causality from complex time series.

    PubMed

    Maher, M Cyrus; Hernandez, Ryan D

    2015-01-01

    Background. Establishing health-related causal relationships is a central pursuit in biomedical research. Yet, the interdependent non-linearity of biological systems renders causal dynamics laborious and at times impractical to disentangle. This pursuit is further impeded by the dearth of time series that are sufficiently long to observe and understand recurrent patterns of flux. However, as data generation costs plummet and technologies like wearable devices democratize data collection, we anticipate a coming surge in the availability of biomedically-relevant time series data. Given the life-saving potential of these burgeoning resources, it is critical to invest in the development of open source software tools that are capable of drawing meaningful insight from vast amounts of time series data. Results. Here we present CauseMap, the first open source implementation of convergent cross mapping (CCM), a method for establishing causality from long time series data (≳25 observations). Compared to existing time series methods, CCM has the advantage of being model-free and robust to unmeasured confounding that could otherwise induce spurious associations. CCM builds on Takens' Theorem, a well-established result from dynamical systems theory that requires only mild assumptions. This theorem allows us to reconstruct high dimensional system dynamics using a time series of only a single variable. These reconstructions can be thought of as shadows of the true causal system. If reconstructed shadows can predict points from opposing time series, we can infer that the corresponding variables are providing views of the same causal system, and so are causally related. Unlike traditional metrics, this test can establish the directionality of causation, even in the presence of feedback loops. Furthermore, since CCM can extract causal relationships from time series of, e.g., a single individual, it may be a valuable tool for personalized medicine. We implement CCM in Julia, a high-performance programming language designed for facile technical computing. Our software package, CauseMap, is platform-independent and freely available as an official Julia package. Conclusions. CauseMap is an efficient implementation of a state-of-the-art algorithm for detecting causality from time series data. We believe this tool will be a valuable resource for biomedical research and personalized medicine.

  11. Whole-Volume Clustering of Time Series Data from Zebrafish Brain Calcium Images via Mixture Modeling.

    PubMed

    Nguyen, Hien D; Ullmann, Jeremy F P; McLachlan, Geoffrey J; Voleti, Venkatakaushik; Li, Wenze; Hillman, Elizabeth M C; Reutens, David C; Janke, Andrew L

    2018-02-01

    Calcium is a ubiquitous messenger in neural signaling events. An increasing number of techniques are enabling visualization of neurological activity in animal models via luminescent proteins that bind to calcium ions. These techniques generate large volumes of spatially correlated time series. A model-based functional data analysis methodology via Gaussian mixtures is proposed for the clustering of data from such visualizations. The methodology is theoretically justified and a computationally efficient approach to estimation is suggested. An example analysis of a zebrafish imaging experiment is presented.
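    A generic approximation of this workflow, reducing each voxel's calcium trace to a small feature vector and clustering the features with a Gaussian mixture model, is sketched below using scikit-learn. The PCA feature reduction, the number of mixture components and the diagonal covariance structure are illustrative assumptions and do not reproduce the paper's specific functional mixture model.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

# stand-in for a (voxels x time points) matrix of calcium traces
rng = np.random.default_rng(12)
traces = rng.standard_normal((5000, 400))

# reduce each trace to a handful of features, then fit a Gaussian mixture
features = PCA(n_components=10).fit_transform(traces)
gmm = GaussianMixture(n_components=8, covariance_type="diag", random_state=0)
labels = gmm.fit_predict(features)        # cluster label per voxel
```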

  12. STEM connections to the GOES-R Satellite Series

    NASA Astrophysics Data System (ADS)

    Mooney, M. E.; Schmit, T.

    2015-12-01

    GOES-R, a new Geostationary Operational Environmental Satellite (GOES) is scheduled to be launched in October of 2016. Its role is to continue western hemisphere satellite coverage while the existing GOES series winds down its 20-year operation. However, instruments on the next generation GOES-R satellite series will provide major improvements to the current GOES, both in the frequency of images acquired and the spectral and spatial resolution of the images, providing a perfect conduit for STEM education. Most of these improvements will be provided by the Advanced Baseline Imager (ABI). ABI will provide three times more spectral information, four times the spatial resolution, and more than five times faster temporal coverage than the current GOES. Another exciting addition to the GOES-R satellite series will be the Geostationary Lightning Mapper (GLM). The all new GLM on GOES-R will measure total lightning activity continuously over the Americas and adjacent ocean regions with near uniform spatial resolution of approximately 10 km! Due to ABI, GLM and improved spacecraft calibration and navigation, the next generation GOES-R satellite series will usher in an exciting era of satellite applications and opportunities for STEM education. This session will present and demonstrate exciting next-gen imagery advancements and new HTML5 WebApps that demonstrate STEM connections to these improvements. Participants will also be invited to join the GOES-R Education Proving Ground, a national network of educators who will receive stipends to attend 4 webinars during the spring of 2016, pilot a STEM lesson plan, and organize a school-wide launch awareness event.

  13. Triangular Arbitrage as an Interaction in Foreign Exchange Markets

    NASA Astrophysics Data System (ADS)

    Aiba, Yukihiro; Hatano, Naomichi

    Analyzing correlation in financial time series is a topic of considerable interest [1]-[17]. In the foreign exchange market, a correlation among the exchange rates can be generated by a triangular arbitrage transaction. The purpose of this article is to review our recent study [18]-[23] on modeling the interaction generated by the triangular arbitrage.

  14. Uncertainty estimation with bias-correction for flow series based on rating curve

    NASA Astrophysics Data System (ADS)

    Shao, Quanxi; Lerat, Julien; Podger, Geoff; Dutta, Dushmanta

    2014-03-01

    Streamflow discharge constitutes one of the fundamental data required to perform water balance studies and develop hydrological models. A rating curve, constructed from a series of concurrent stage and discharge measurements at a gauging location, provides a way to generate complete discharge time series of reasonable quality if sufficient measurement points are available. However, the associated uncertainty is frequently not available even though it has a significant impact on hydrological modelling. In this paper, we identify discrepancies in the hydrographers' rating curves used to derive the historical discharge series and propose a bias correction that, like the traditional rating curve, takes the form of a power function. To estimate the uncertainty, we further propose a Box-Cox transformation of both sides of the regression to bring the residuals as close to the normal distribution as possible, so that a proper uncertainty can be attached to the whole discharge series during ensemble generation. We demonstrate the proposed method by applying it to gauging stations on the Flinders and Gilbert rivers in north-west Queensland, Australia.
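
    The general recipe, a power-law rating curve fitted in log space with a Box-Cox transform applied to both observed and fitted discharge so that residuals are closer to normal, can be sketched as follows; the synthetic stage-discharge data and the single shared lambda are assumptions for illustration and omit the paper's specific bias-correction step.

        # Hedged sketch: power-law rating curve plus a two-sided Box-Cox residual transform.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        stage = rng.uniform(0.5, 4.0, 200)                  # gauge height (m)
        discharge = 12.0 * stage ** 1.8 * np.exp(0.1 * rng.standard_normal(200))

        # power function Q = a * h^b  ->  log Q = log a + b log h
        b, log_a = np.polyfit(np.log(stage), np.log(discharge), 1)
        fitted = np.exp(log_a) * stage ** b

        # one common Box-Cox lambda, estimated from the observations, applied to both sides
        _, lam = stats.boxcox(discharge)
        resid = stats.boxcox(discharge, lam) - stats.boxcox(fitted, lam)
        print("rating exponent b:", round(b, 2), "residual std:", round(resid.std(), 3))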

  15. Modified superposition: A simple time series approach to closed-loop manual controller identification

    NASA Technical Reports Server (NTRS)

    Biezad, D. J.; Schmidt, D. K.; Leban, F.; Mashiko, S.

    1986-01-01

    Single-channel pilot manual control output in closed-loop tracking tasks is modeled in terms of linear discrete transfer functions which are parsimonious and guaranteed stable. The transfer functions are found by applying a modified superposition time series generation technique. A Levinson-Durbin algorithm is used to determine the filter which prewhitens the input, and a projective (least squares) fit of pulse response estimates is used to guarantee identified model stability. Results from two case studies are compared to previous findings, where the sources of data are relatively short records, approximately 25 seconds long. Time delay effects and pilot seasonalities are discussed and analyzed. It is concluded that single-channel time series controller modeling is feasible on short records, and that it is important for the analyst to determine a criterion for best time domain fit which allows association of model parameter values, such as pure time delay, with actual physical and physiological constraints. The purpose of the modeling is thus paramount.

  16. Layered Ensemble Architecture for Time Series Forecasting.

    PubMed

    Rahman, Md Mustafizur; Islam, Md Monirul; Murase, Kazuyuki; Yao, Xin

    2016-01-01

    Time series forecasting (TSF) has been widely used in many application areas such as science, engineering, and finance. The phenomena generating time series are usually unknown, and the information available for forecasting is limited to the past values of the series. It is, therefore, necessary to use an appropriate number of past values, termed the lag, for forecasting. This paper proposes a layered ensemble architecture (LEA) for TSF problems. Our LEA consists of two layers, each of which uses an ensemble of multilayer perceptron (MLP) networks. While the first ensemble layer tries to find an appropriate lag, the second ensemble layer employs the obtained lag for forecasting. Unlike most previous work on TSF, the proposed architecture considers both the accuracy and the diversity of the individual networks in constructing an ensemble. LEA trains different networks in the ensemble by using different training sets, with the aim of maintaining diversity among the networks. However, it uses the appropriate lag and combines the best trained networks to construct the ensemble, which reflects LEA's emphasis on the accuracy of the networks. The proposed architecture has been tested extensively on time series data from the NN3 and NN5 neural network forecasting competitions. It has also been tested on several standard benchmark time series datasets. In terms of forecasting accuracy, our experimental results clearly show that LEA is better than other ensemble and non-ensemble methods.

  17. Indexed semi-Markov process for wind speed modeling.

    NASA Astrophysics Data System (ADS)

    Petroni, F.; D'Amico, G.; Prattico, F.

    2012-04-01

    The increasing interest in renewable energy leads scientific research to find better ways to recover most of the available energy. In particular, the maximum energy recoverable from wind is equal to 59.3% of the available energy (Betz's law), at a specific pitch angle and when the ratio between the output and input wind speed equals 1/3. The pitch angle is the angle between the airfoil of the turbine blade and the wind direction. Older turbines, and many of those currently marketed, have a fixed, invariant airfoil geometry, so they operate at an efficiency lower than 59.3%. New-generation wind turbines instead have a system that varies the pitch angle by rotating the blades, enabling the turbine to recover the maximum energy at different wind speeds by working at the Betz limit over a range of speed ratios. A powerful pitch-angle control system also allows the turbine to recover energy more effectively in transient regimes. A good stochastic model for wind speed is therefore needed, both to help optimize turbine design and to assist the control system in predicting the wind speed so that the blades can be positioned quickly and correctly. The possibility of generating synthetic wind speed data is a powerful instrument to help designers verify wind turbine structures or estimate the energy recoverable from a specific site. To generate synthetic data, Markov chains of first or higher order are often used [1,2,3]. In particular, [1] presents a comparison between a first-order and a second-order Markov chain. A similar study, limited to the first-order Markov chain, is conducted in [2], which presents the transition probability matrix and compares the energy spectral density and autocorrelation of real and synthetic wind speed data. An attempt to jointly model wind speed and direction is presented in [3], using a first-order Markov chain with different numbers of states together with a Weibull distribution. All of these models use Markov chains to generate synthetic wind speed time series, but the search for a better model is still open. Approaching this issue, we apply new models that generalize Markov models; more precisely, we apply semi-Markov models to generate synthetic wind speed time series. In previous work we proposed different semi-Markov models and showed their ability to reproduce the autocorrelation structure of wind speed data; the autocorrelation was higher than that of the Markov model, but still too small compared with the empirical one. To overcome this problem of low autocorrelation, in this paper we propose an indexed semi-Markov model. More precisely, we assume that wind speed is described by a discrete-time homogeneous semi-Markov process and introduce a memory index that takes into account periods of different wind activity. With this model the statistical characteristics of wind speed are faithfully reproduced. Wind is a very unstable phenomenon characterized by sequences of lulls and sustained speeds, and a good wind generator must be able to reproduce such sequences; to check the validity of the model, the persistence of the synthetic wind series was calculated and averaged. The model is used to generate synthetic time series for wind speed by means of Monte Carlo simulations, and the time-lagged autocorrelation is used to compare the statistical properties of the proposed model with those of real data and with a time series generated through a simple Markov chain. [1] A. Shamshad, M.A. Bawadi, W.M.W. Wan Hussin, T.A. Majid, S.A.M. Sanusi, First and second order Markov chain models for synthetic generation of wind speed time series, Energy 30 (2005) 693-708. [2] H. Nfaoui, H. Essiarab, A.A.M. Sayigh, A stochastic Markov chain model for simulating wind speed time series at Tangiers, Morocco, Renewable Energy 29 (2004) 1407-1418. [3] F. Youcef Ettoumi, H. Sauvageot, A.-E.-H. Adane, Statistical bivariate modeling of wind using first-order Markov chain and Weibull distribution, Renewable Energy 28 (2003) 1787-1802.
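
    The first-order Markov-chain baseline referenced in [1]-[3] (not the indexed semi-Markov model itself) is straightforward to sketch: discretize wind speed into states, estimate a transition matrix, and simulate. The toy record, the number of states and the smoothing below are illustrative assumptions.

        # Sketch of a first-order Markov chain generator of synthetic wind-speed states.
        import numpy as np

        def fit_transition_matrix(states, n_states):
            P = np.ones((n_states, n_states))           # Laplace-smoothed counts
            for a, b in zip(states[:-1], states[1:]):
                P[a, b] += 1
            return P / P.sum(axis=1, keepdims=True)

        def simulate(P, start, n, rng):
            out = [start]
            for _ in range(n - 1):
                out.append(rng.choice(len(P), p=P[out[-1]]))
            return np.array(out)

        rng = np.random.default_rng(3)
        speeds = np.abs(5 + 0.1 * np.cumsum(rng.standard_normal(2000)))  # toy record
        edges = np.quantile(speeds, np.linspace(0, 1, 9)[1:-1])          # 8 states
        states = np.digitize(speeds, edges)

        P = fit_transition_matrix(states, 8)
        synth = simulate(P, states[0], 2000, rng)
        print("lag-1 autocorrelation, real vs synthetic:",
              round(np.corrcoef(states[:-1], states[1:])[0, 1], 2),
              round(np.corrcoef(synth[:-1], synth[1:])[0, 1], 2))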

  18. Spatiotemporal dynamics of human settlement patterns in the Southeast U.S. from DMSP/OLS nightlight time series, 1992-2013

    NASA Astrophysics Data System (ADS)

    Wang, C.; Lu, L.

    2015-12-01

    The Southeast U.S. is listed by the Census Bureau as one of the fastest growing regions, covering two of the eleven megaregions of the United States (Florida and Piedmont Atlantic). The Defense Meteorological Satellite Program (DMSP)'s Operational Line-scan System (OLS) nighttime light (NTL) imagery offers a good opportunity for characterizing the extent and dynamics of urban development at global and regional scales. However, the commonly used thresholding technique for NTL-based urban land mapping often underestimates the suburban and rural areas and overestimates urban extents. In this study we developed a novel approach to estimating impervious surface area (ISA) by integrating the NTL and optical reflectance data. A geographically weighted regression model was built to extract ISA from the Vegetation-Adjusted NTL Urban Index (VANUI). The ISA was estimated for each year from 1992 to 2013 to generate the ISA time series for the U.S. Southeast region. Using the National Land Cover Database (NLCD) products of percent imperviousness (2001, 2006, and 2010) as our reference data, accuracy assessment indicated that our approach considerably improved the ISA estimation, especially in suburban areas. With the ISA time series, a nonparametric Mann-Kendall trend analysis was performed to detect hotspots of human settlement expansion, followed by the exploration of decennial U.S. census data to link these patterns to migration flows in these hotspots. Our results provide significant insights into human settlement in the U.S. Southeast over the past decades. The proposed approach has great potential for mapping ISA at broad scales with nightlight data such as DMSP/OLS and the new-generation VIIRS products. The ISA time series generated in this study can be used to assess the anthropogenic impacts on regional climate, environment and ecosystem services in the U.S. Southeast.

  19. Reduced-complexity multi-site rainfall generation: one million years over night using the model TripleM

    NASA Astrophysics Data System (ADS)

    Breinl, Korbinian; Di Baldassarre, Giuliano; Girons Lopez, Marc

    2017-04-01

    We assess uncertainties of multi-site rainfall generation across spatial scales and different climatic conditions. Many research subjects in the earth sciences, such as floods, droughts or water balance simulations, require the generation of long rainfall time series. In large study areas, simulation at multiple sites becomes indispensable to account for spatial rainfall variability, but it is more complex than single-site simulation due to the intermittent nature of rainfall. Weather generators can be used for extrapolating rainfall time series, and various models have been presented in the literature. Even though the large majority of multi-site rainfall generators are based on similar methods, such as resampling techniques or Markovian processes, they often become too complex, and we think that this complexity has limited the application of such tools. Furthermore, the majority of multi-site rainfall generators found in the literature are either not publicly available or intended for application at small geographical scales, often only in temperate climates. Here we present a revised, and now publicly available, version of a multi-site rainfall generation code first applied in 2014 in Austria and France, which we call TripleM (Multisite Markov Model). We test this fast and robust code with daily rainfall observations from the United States, in subtropical, tropical and temperate climates, using rain gauge networks with a maximum site distance above 1,000 km, thereby generating one million years of synthetic time series. The modelling of these one million years takes one night on a recent desktop computer. In this research, we start the simulations with a small station network of three sites and progressively increase the number of sites and the spatial extent, and analyze the changing uncertainties for multiple statistical metrics such as dry and wet spells, rainfall autocorrelation, lagged cross correlations and the inter-annual rainfall variability. Our study contributes to the earth science community and the ongoing debate on extreme precipitation in a changing climate by making a stable, and very easily applicable, multi-site rainfall generation code available to the research community and by providing a better understanding of the performance of multi-site rainfall generation depending on spatial scales and climatic conditions.

  20. Meteorological conditions during the summer 1986 CITE 2 flight series

    NASA Technical Reports Server (NTRS)

    Shipham, Mark C.; Cahoon, Donald R.; Bachmeier, A. Scott

    1990-01-01

    An overview of meteorological conditions during the NASA Global Tropospheric Experiment/Chemical Instrumentation Testing and Evaluation (GTE/CITE 2) summer 1986 flight series is presented. Computer-generated isentropic trajectories are used to trace the history of air masses encountered along each aircraft flight path. The synoptic-scale wind fields are depicted based on Montgomery stream function analyses. Time series of aircraft-measured temperature, dew point, ozone, and altitude are shown to depict air mass variability. Observed differences between maritime tropical and maritime polar air masses are discussed.

  1. The promise of the state space approach to time series analysis for nursing research.

    PubMed

    Levy, Janet A; Elser, Heather E; Knobel, Robin B

    2012-01-01

    Nursing research, particularly related to physiological development, often depends on the collection of time series data. The state space approach to time series analysis has great potential to answer exploratory questions relevant to physiological development but has not been used extensively in nursing. The aim of the study was to introduce the state space approach to time series analysis and demonstrate potential applicability to neonatal monitoring and physiology. We present a set of univariate state space models; each one describing a process that generates a variable of interest over time. Each model is presented algebraically and a realization of the process is presented graphically from simulated data. This is followed by a discussion of how the model has been or may be used in two nursing projects on neonatal physiological development. The defining feature of the state space approach is the decomposition of the series into components that are functions of time; specifically, slowly varying level, faster varying periodic, and irregular components. State space models potentially simulate developmental processes where a phenomenon emerges and disappears before stabilizing, where the periodic component may become more regular with time, or where the developmental trajectory of a phenomenon is irregular. The ultimate contribution of this approach to nursing science will require close collaboration and cross-disciplinary education between nurses and statisticians.

  2. Radioactive Decay: Audio Data Collection

    ERIC Educational Resources Information Center

    Struthers, Allan

    2009-01-01

    Many phenomena generate interesting audible time series. This data can be collected and processed using audio software. The free software package "Audacity" is used to demonstrate the process by recording, processing, and extracting click times from an inexpensive radiation detector. The high quality of the data is demonstrated with a simple…

  3. Usefulness of AIRS-Derived OLR, Temperature, Water Vapor and Cloudiness Anomaly Time-series for GCM Validation

    NASA Technical Reports Server (NTRS)

    Molnar, Gyula; Susskind, Joel; Iredell, Lena

    2010-01-01

    The ROBUST nature (biases are not as important as previous GCM evaluations suggest) of the AIRS-observations-generated ARC-maps and ATs, as well as their interrelations, suggests that they could be a useful tool for selecting CGCMs that may be considered reliable, i.e., to be trusted even for longer-term climate drift/change predictions (even on the regional scale). The next step is to obtain monthly gridded CGCM time series of atmospheric variables coinciding with the timeframe of the AIRS analyses for at least 5-6 years, and to carry out the evaluations of ARC-maps and ATs for the coinciding time periods.

  4. Forecasting hotspots using predictive visual analytics approach

    DOEpatents

    Maciejewski, Ross; Hafen, Ryan; Rudolph, Stephen; Cleveland, William; Ebert, David

    2014-12-30

    A method for forecasting hotspots is provided. The method may include the steps of receiving input data at an input of the computational device, generating a temporal prediction based on the input data, generating a geospatial prediction based on the input data, and generating output data based on the temporal and geospatial predictions. The output data may be configured to display at least one user interface at an output of the computational device.

  5. Validation of two (parametric vs non-parametric) daily weather generators

    NASA Astrophysics Data System (ADS)

    Dubrovsky, M.; Skalak, P.

    2015-12-01

    As the climate models (GCMs and RCMs) fail to satisfactorily reproduce the real-world surface weather regime, various statistical methods are applied to downscale GCM/RCM outputs into site-specific weather series. Stochastic weather generators are among the most popular downscaling methods, capable of producing realistic (observed-like) meteorological inputs for agrological, hydrological and other impact models used in assessing the sensitivity of various ecosystems to climate change/variability. Among their advantages, the generators may (i) produce arbitrarily long multi-variate synthetic weather series representing both present and changed climates (in the latter case, the generators are commonly modified by GCM/RCM-based climate change scenarios), (ii) be run at various time steps and for multiple weather variables (the generators reproduce the correlations among variables), and (iii) be interpolated (and run also for sites where no weather data are available to calibrate the generator). This contribution will compare two stochastic daily weather generators in terms of their ability to reproduce various features of daily weather series. M&Rfi is a parametric generator: a Markov chain model is used to model precipitation occurrence, precipitation amount is modelled by the Gamma distribution, and a first-order autoregressive model is used to generate non-precipitation surface weather variables. The non-parametric GoMeZ generator is based on the nearest-neighbours resampling technique, making no assumption on the distribution of the variables being generated. Various settings of both weather generators will be assumed in the present validation tests. The generators will be validated in terms of (a) extreme temperature and precipitation characteristics (annual and 30-year extremes and maximum durations of hot/cold/dry/wet spells); and (b) selected validation statistics developed within the frame of the VALUE project. The tests will be based on observational weather series from several European stations available from the ECA&D database. Acknowledgements: The weather generator is developed and validated within the frame of the projects WG4VALUE (sponsored by the Ministry of Education, Youth and Sports of CR) and VALUE (COST ES 1102 action).
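
    A single-site skeleton of the parametric (Richardson-type) recipe described above, a two-state Markov chain for precipitation occurrence and a Gamma distribution for wet-day amounts, is sketched below; the transition probabilities and Gamma parameters are illustrative placeholders, not M&Rfi values.

        # Minimal single-site precipitation generator: Markov occurrence + Gamma amounts.
        import numpy as np

        rng = np.random.default_rng(4)
        p_wd, p_ww = 0.25, 0.65     # P(wet | dry yesterday), P(wet | wet yesterday)
        shape, scale = 0.8, 6.0     # Gamma parameters for wet-day amounts (mm)

        def generate_precip(n_days):
            wet = False
            series = np.zeros(n_days)
            for d in range(n_days):
                wet = rng.random() < (p_ww if wet else p_wd)
                if wet:
                    series[d] = rng.gamma(shape, scale)
            return series

        synthetic = generate_precip(365 * 30)            # 30 synthetic years
        print("wet-day frequency:", round((synthetic > 0).mean(), 3),
              "mean wet-day amount (mm):", round(synthetic[synthetic > 0].mean(), 2))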

  6. Temporal downscaling of decadal sediment load estimates to a daily interval for use in hindcast simulations

    USGS Publications Warehouse

    Ganju, N.K.; Knowles, N.; Schoellhamer, D.H.

    2008-01-01

    In this study we used hydrologic proxies to develop a daily sediment load time-series, which agrees with decadal sediment load estimates, when integrated. Hindcast simulations of bathymetric change in estuaries require daily sediment loads from major tributary rivers, to capture the episodic delivery of sediment during multi-day freshwater flow pulses. Two independent decadal sediment load estimates are available for the Sacramento/San Joaquin River Delta, California prior to 1959, but they must be downscaled to a daily interval for use in hindcast models. Daily flow and sediment load data to the Delta are available after 1930 and 1959, respectively, but bathymetric change simulations for San Francisco Bay prior to this require a method to generate daily sediment load estimates into the Delta. We used two historical proxies, monthly rainfall and unimpaired flow magnitudes, to generate monthly unimpaired flows to the Sacramento/San Joaquin Delta for the 1851-1929 period. This step generated the shape of the monthly hydrograph. These historical monthly flows were compared to unimpaired monthly flows from the modern era (1967-1987), and a least-squares metric selected a modern water year analogue for each historical water year. The daily hydrograph for the modern analogue was then assigned to the historical year and scaled to match the flow volume estimated by dendrochronology methods, providing the correct total flow for the year. We applied a sediment rating curve to this time-series of daily flows, to generate daily sediment loads for 1851-1958. The rating curve was calibrated with the two independent decadal sediment load estimates, over two distinct periods. This novel technique retained the timing and magnitude of freshwater flows and sediment loads, without damping variability or net sediment loads to San Francisco Bay. The time-series represents the hydraulic mining period with sustained periods of increased sediment loads, and a dramatic decrease after 1910, corresponding to a reduction in available mining debris. The analogue selection procedure also permits exploration of the morphological hydrograph concept, where a limited set of hydrographs is used to simulate the same bathymetric change as the actual set of hydrographs. The final daily sediment load time-series and morphological hydrograph concept will be applied as landward boundary conditions for hindcasting simulations of bathymetric change in San Francisco Bay.

  7. Assessing backscatter change due to backscatter gradient over the Greenland ice sheet using Envisat and SARAL altimetry

    NASA Astrophysics Data System (ADS)

    Su, Xiaoli; Luo, Zhicai; Zhou, Zebing

    2018-06-01

    Knowledge of backscatter change is important for accurately retrieving elevation change time series from satellite radar altimetry over continental ice sheets. Previously, backscatter coefficients generated in two cases, namely with and without accounting for the backscatter gradient (BG), have been used. However, the difference between backscatter time series obtained separately in these two cases, and its impact on retrieving elevation change, are not well known. Here we first compare the mean profiles of the Ku- and Ka-band backscatter over the Greenland ice sheet (GrIS), with results illustrating that the Ku-band backscatter is 3-5 dB larger than that of the Ka band. We then conduct a statistical analysis of the backscatter time series formed separately in the above two cases, for both the Ku and Ka bands, over two regions of the GrIS. It is found that the standard deviation of the backscatter time series becomes slightly smaller after removing the BG effect, which suggests that the method for the BG correction is effective. Furthermore, the impact on elevation change from backscatter change due to the BG effect is separately assessed for both the Ku and Ka bands over the GrIS. We conclude that Ka-band altimetry would benefit from a BG-induced backscatter analysis (∼10% over region 2). This study may provide a reference for forming backscatter time series towards refining elevation change time series from satellite radar altimetry over ice sheets using repeat-track analysis.

  8. Comparative analysis of seismic persistence of Hindu Kush nests (Afghanistan) and Los Santos (Colombia) using fractal dimension

    NASA Astrophysics Data System (ADS)

    Prada, D. A.; Sanabria, M. P.; Torres, A. F.; Álvarez, M. A.; Gómez, J.

    2018-04-01

    The study of persistence in the time series of seismic events in two of the most important seismic nests, Hindu Kush in Afghanistan and Los Santos (Santander) in Colombia, generates great interest due to their high telluric activity. The data were taken from the global seismological network. The Jarque-Bera test was used to check for a Gaussian distribution; because the distributions of the series were asymmetric and not mesokurtic, the Hurst coefficient was calculated using the rescaled range method. From this coefficient the fractal dimension associated with these time series was obtained, from which it is possible to determine the persistence, antipersistence and volatility of these phenomena.
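
    The rescaled range calculation behind the Hurst coefficient is compact enough to sketch directly; the white-noise input (for which H should be near 0.5) and the window sizes are illustrative, not the catalogue data analysed above.

        # Illustrative rescaled-range (R/S) estimate of the Hurst exponent.
        import numpy as np

        def hurst_rs(x, window_sizes):
            log_n, log_rs = [], []
            for n in window_sizes:
                rs = []
                for start in range(0, len(x) - n + 1, n):
                    w = x[start:start + n]
                    z = np.cumsum(w - w.mean())
                    r, s = z.max() - z.min(), w.std()
                    if s > 0:
                        rs.append(r / s)
                log_n.append(np.log(n))
                log_rs.append(np.log(np.mean(rs)))
            return np.polyfit(log_n, log_rs, 1)[0]       # slope = Hurst exponent H

        rng = np.random.default_rng(5)
        x = rng.standard_normal(4096)                    # white noise -> H near 0.5
        H = hurst_rs(x, [16, 32, 64, 128, 256, 512])
        print("Hurst exponent:", round(H, 2), "fractal dimension:", round(2 - H, 2))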

  9. A time series model: First-order integer-valued autoregressive (INAR(1))

    NASA Astrophysics Data System (ADS)

    Simarmata, D. M.; Novkaniza, F.; Widyaningsih, Y.

    2017-07-01

    Nonnegative integer-valued time series arise in many applications. The first-order Integer-valued AutoRegressive model, INAR(1), is constructed using the binomial thinning operator to model nonnegative integer-valued time series. INAR(1) depends on the single preceding period of the process. The parameter of the model can be estimated by Conditional Least Squares (CLS). The specification of INAR(1) follows that of AR(1). Forecasting in INAR(1) uses a median or a Bayesian forecasting methodology. The median forecasting methodology finds the least integer s for which the cumulative distribution function (CDF) up to s is greater than or equal to 0.5. The Bayesian forecasting methodology produces h-step-ahead forecasts by generating the model parameter and the parameter of the innovation term using Adaptive Rejection Metropolis Sampling within Gibbs sampling (ARMS), and then finding the least integer s for which the CDF up to s is greater than or equal to u, where u is a value drawn from the Uniform(0,1) distribution. INAR(1) is applied to monthly pneumonia cases in Penjaringan, Jakarta Utara, from January 2008 to April 2016.
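
    The binomial thinning construction is easy to see in code: each count "survives" to the next period with probability alpha, and a Poisson innovation is added. The parameter values below are illustrative, and the conditional least squares step is reduced to a simple regression of X_t on X_{t-1}.

        # Sketch of an INAR(1) process: X_t = alpha o X_{t-1} + e_t (binomial thinning).
        import numpy as np

        rng = np.random.default_rng(6)
        alpha, lam, n = 0.6, 2.0, 1000

        x = np.empty(n, dtype=int)
        x[0] = rng.poisson(lam / (1 - alpha))          # start near the stationary mean
        for t in range(1, n):
            survivors = rng.binomial(x[t - 1], alpha)  # thinning: alpha o X_{t-1}
            x[t] = survivors + rng.poisson(lam)        # plus the innovation term

        # conditional least squares: regress X_t on X_{t-1}
        alpha_hat, lam_hat = np.polyfit(x[:-1], x[1:], 1)
        print("alpha estimate:", round(alpha_hat, 2), "innovation mean:", round(lam_hat, 2))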

  10. Reproducibility of geochemical and climatic signals in the Atlantic coral Montastraea faveolata

    USGS Publications Warehouse

    Smith, Joseph M.; Quinn, T.M.; Helmle, K.P.; Halley, R.B.

    2006-01-01

    Monthly resolved, 41-year-long stable isotopic and elemental ratio time series were generated from two separate heads of Montastraea faveolata from Looe Key, Florida, to assess the fidelity of using geochemical variations in Montastraea, the dominant reef-building coral of the Atlantic, to reconstruct sea surface environmental conditions at this site. The stable isotope time series of the two corals replicate well; mean values of δ18O and δ13C are indistinguishable between cores (compare 0.70‰ versus 0.68‰ for δ13C and -3.90‰ versus -3.94‰ for δ18O). Mean values from the Sr/Ca time series differ by 0.037 mmol/mol, which is outside of analytical error and indicates that nonenvironmental factors are influencing the coral Sr/Ca records at Looe Key. We have generated significant δ18O-sea surface temperature (SST) (R = -0.84) and Sr/Ca-SST (R = -0.86) calibration equations at Looe Key; however, these equations are different from previously published equations for Montastraea. Variations in growth parameters or kinetic effects are not sufficient to explain either the observed differences in the mean offset between Sr/Ca time series or the disagreement between previous calibrations and our calculated δ18O-SST and Sr/Ca-SST relationships. Calibration differences are most likely due to variations in seawater chemistry in the continentally influenced waters at Looe Key. Additional geochemical replication studies of Montastraea are needed and should include multiple coral heads from open ocean localities complemented whenever possible by seawater chemistry determinations. Copyright 2006 by the American Geophysical Union.

  11. Dynamical density delay maps: simple, new method for visualising the behaviour of complex systems

    PubMed Central

    2014-01-01

    Background Physiologic signals, such as cardiac interbeat intervals, exhibit complex fluctuations. However, capturing important dynamical properties, including nonstationarities may not be feasible from conventional time series graphical representations. Methods We introduce a simple-to-implement visualisation method, termed dynamical density delay mapping (“D3-Map” technique) that provides an animated representation of a system’s dynamics. The method is based on a generalization of conventional two-dimensional (2D) Poincaré plots, which are scatter plots where each data point, x(n), in a time series is plotted against the adjacent one, x(n + 1). First, we divide the original time series, x(n) (n = 1,…, N), into a sequence of segments (windows). Next, for each segment, a three-dimensional (3D) Poincaré surface plot of x(n), x(n + 1), h[x(n),x(n + 1)] is generated, in which the third dimension, h, represents the relative frequency of occurrence of each (x(n),x(n + 1)) point. This 3D Poincaré surface is then chromatised by mapping the relative frequency h values onto a colour scheme. We also generate a colourised 2D contour plot from each time series segment using the same colourmap scheme as for the 3D Poincaré surface. Finally, the original time series graph, the colourised 3D Poincaré surface plot, and its projection as a colourised 2D contour map for each segment, are animated to create the full “D3-Map.” Results We first exemplify the D3-Map method using the cardiac interbeat interval time series from a healthy subject during sleeping hours. The animations uncover complex dynamical changes, such as transitions between states, and the relative amount of time the system spends in each state. We also illustrate the utility of the method in detecting hidden temporal patterns in the heart rate dynamics of a patient with atrial fibrillation. The videos, as well as the source code, are made publicly available. Conclusions Animations based on density delay maps provide a new way of visualising dynamical properties of complex systems not apparent in time series graphs or standard Poincaré plot representations. Trainees in a variety of fields may find the animations useful as illustrations of fundamental but challenging concepts, such as nonstationarity and multistability. For investigators, the method may facilitate data exploration. PMID:24438439
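
    One frame of such a map reduces to a 2D histogram of (x(n), x(n+1)) pairs within a window, rendered as a colour map; the sketch below shows a single window on a toy series, whereas the full "D3-Map" animates this over successive segments. The data and bin count are illustrative assumptions.

        # Simplified sketch of one density delay map frame (one window of the series).
        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(7)
        # toy "interbeat interval" series with two regimes
        x = np.concatenate([0.8 + 0.05 * rng.standard_normal(2000),
                            1.0 + 0.08 * rng.standard_normal(2000)])

        window = x[:2000]                              # one segment of the series
        h, xe, ye = np.histogram2d(window[:-1], window[1:], bins=60)

        plt.imshow(h.T, origin="lower", cmap="viridis",
                   extent=[xe[0], xe[-1], ye[0], ye[-1]])
        plt.xlabel("x(n)"); plt.ylabel("x(n+1)")
        plt.title("density delay map (one window)")
        plt.colorbar(label="relative frequency")
        plt.show()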

  12. Time lag between immigration and tuberculosis rates in immigrants in the Netherlands: a time-series analysis.

    PubMed

    van Aart, C; Boshuizen, H; Dekkers, A; Korthals Altes, H

    2017-05-01

    In low-incidence countries, most tuberculosis (TB) cases are foreign-born. We explored the temporal relationship between immigration and TB in first-generation immigrants between 1995 and 2012 to assess whether immigration can be a predictor for TB in immigrants from high-incidence countries. We obtained monthly data on immigrant TB cases and immigration for the three countries of origin most frequently represented among TB cases in the Netherlands: Morocco, Somalia and Turkey. The best-fit seasonal autoregressive integrated moving average (SARIMA) model to the immigration time-series was used to prewhiten the TB time series. The cross-correlation function (CCF) was then computed on the residual time series to detect time lags between immigration and TB rates. We identified a 17-month lag between Somali immigration and Somali immigrant TB cases, but no time lag for immigrants from Morocco and Turkey. The absence of a lag in the Moroccan and Turkish population may be attributed to the relatively low TB prevalence in the countries of origin and an increased likelihood of reactivation TB in an ageing immigrant population. Understanding the time lag between Somali immigration and TB disease would benefit from a closer epidemiological analysis of cohorts of Somali cases diagnosed within the first years after entry.
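
    The prewhitening-plus-cross-correlation step can be sketched with statsmodels: fit a SARIMA model to the immigration series, filter both series with the same fitted parameters, and scan the residual cross-correlations for a lead-lag relationship. The (1,0,0)x(1,0,0,12) order, the synthetic data and the 17-month toy lag below are illustrative assumptions, not the study's best-fit model.

        # Hedged sketch of SARIMA prewhitening followed by a cross-correlation scan.
        import numpy as np
        from statsmodels.tsa.statespace.sarimax import SARIMAX
        from statsmodels.tsa.stattools import ccf

        rng = np.random.default_rng(8)
        n = 216                                        # 18 years of monthly data
        immigration = (50 + 10 * np.sin(2 * np.pi * np.arange(n) / 12)
                       + rng.normal(0, 3, n))
        tb_cases = 5 + 0.1 * np.roll(immigration, 17) + rng.normal(0, 1, n)

        order, sorder = (1, 0, 0), (1, 0, 0, 12)       # placeholder SARIMA order
        fit = SARIMAX(immigration, order=order, seasonal_order=sorder).fit(disp=False)

        immi_white = fit.resid                         # prewhitened immigration series
        tb_white = SARIMAX(tb_cases, order=order,
                           seasonal_order=sorder).filter(fit.params).resid

        lags = ccf(tb_white, immi_white)[:24]          # residual cross-correlations
        print("lag with largest |cross-correlation|:", int(np.argmax(np.abs(lags))))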

  13. Production and Uses of Multi-Decade Geodetic Earth Science Data Records

    NASA Astrophysics Data System (ADS)

    Bock, Y.; Kedar, S.; Moore, A. W.; Fang, P.; Liu, Z.; Sullivan, A.; Argus, D. F.; Jiang, S.; Marshall, S. T.

    2017-12-01

    The Solid Earth Science ESDR System (SESES) project funded under the NASA MEaSUREs program produces and disseminates mature, long-term, calibrated and validated, GNSS based Earth Science Data Records (ESDRs) that encompass multiple diverse areas of interest in Earth Science, such as tectonic motion, transient slip and earthquake dynamics, as well as meteorology, climate, and hydrology. The ESDRs now span twenty-five years for the earliest stations and today are available for thousands of global and regional stations. Using a unified metadata database and a combination of GNSS solutions generated by two independent analysis centers, the project currently produces four long-term ESDR's: Geodetic Displacement Time Series: Daily, combined, cleaned and filtered, GIPSY and GAMIT long-term time series of continuous GPS station positions (global and regional) in the latest version of ITRF, automatically updated weekly. Geodetic Velocities: Weekly updated velocity field + velocity field histories in various reference frames; compendium of all model parameters including earthquake catalog, coseismic offsets, and postseismic model parameters (exponential or logarithmic). Troposphere Delay Time Series: Long-term time series of troposphere delay (30-min resolution) at geodetic stations, necessarily estimated during position time series production and automatically updated weekly. Seismogeodetic records for historic earthquakes: High-rate broadband displacement and seismic velocity time series combining 1 Hz GPS displacements and 100 Hz accelerometer data for select large earthquakes and collocated cGPS and seismic instruments from regional networks. We present several recent notable examples of the ESDR's usage: A transient slip study that uses the combined position time series to unravel "tremor-less" slow tectonic transient events. Fault geometry determination from geodetic slip rates. Changes in water resources across California's physiographic provinces at a spatial resolution of 75 km. Retrospective study of a southern California summer monsoon event.

  14. DTWscore: differential expression and cell clustering analysis for time-series single-cell RNA-seq data.

    PubMed

    Wang, Zhuo; Jin, Shuilin; Liu, Guiyou; Zhang, Xiurui; Wang, Nan; Wu, Deliang; Hu, Yang; Zhang, Chiping; Jiang, Qinghua; Xu, Li; Wang, Yadong

    2017-05-23

    The development of single-cell RNA sequencing has enabled profound discoveries in biology, ranging from the dissection of the composition of complex tissues to the identification of novel cell types and dynamics in some specialized cellular environments. However, the large-scale generation of single-cell RNA-seq (scRNA-seq) data collected at multiple time points remains a challenge for effectively measuring gene expression patterns in transcriptome analysis. We present an algorithm based on the Dynamic Time Warping score (DTWscore) combined with time-series data that enables the detection of gene expression changes across scRNA-seq samples and the recovery of potential cell types from complex mixtures of multiple cell types. The DTWscore successfully classifies cells of different types using the most highly variable genes from time-series scRNA-seq data. The study was confined to methods that are implemented and available within the R framework. Sample datasets and R packages are available at https://github.com/xiaoxiaoxier/DTWscore .
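
    The distance underlying the DTWscore is the textbook dynamic time warping recursion, shown below for two short made-up expression profiles; the published package is in R, and this Python sketch is purely illustrative, without the score statistic the paper builds on top of the distance.

        # Textbook dynamic time warping distance between two expression profiles.
        import numpy as np

        def dtw_distance(a, b):
            n, m = len(a), len(b)
            D = np.full((n + 1, m + 1), np.inf)
            D[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = abs(a[i - 1] - b[j - 1])
                    D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
            return D[n, m]

        # one gene in two cells sampled at 8 time points; similar shape, shifted in time
        cell1 = np.array([0.1, 0.3, 1.2, 2.5, 2.4, 1.0, 0.4, 0.2])
        cell2 = np.array([0.1, 0.2, 0.4, 1.1, 2.6, 2.3, 1.1, 0.3])
        print("DTW distance:", round(dtw_distance(cell1, cell2), 2))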

  15. ENCAPSULATED AEROSOLS

    DTIC Science & Technology

    materials determine the range of applicability of each method. A useful microencapsulation method, based on coagulation by inertial force, was developed...The generation apparatus, consisting of two aerosol generators in series, was utilized to produce many kinds of microcapsules. A fluid energy mill...was found useful for the production of some microcapsules. The permeability of microcapsule films and the effect of exposure time and humidity were

  16. SPAGETTA, a Gridded Weather Generator: Calibration, Validation and its Use for Future Climate

    NASA Astrophysics Data System (ADS)

    Dubrovsky, Martin; Rotach, Mathias W.; Huth, Radan

    2017-04-01

    Spagetta is a new (started in 2016) stochastic multi-site multi-variate weather generator (WG). It can produce realistic synthetic daily (or monthly, or annual) weather series representing both present and future climate conditions at multiple sites (grids or stations irregularly distributed in space). The generator, whose model is based on Wilks' (1999) multi-site extension of the parametric (Richardson-type) single-site M&Rfi generator, may be run in two modes. In the first mode, it is run as a classical generator, which is first calibrated using weather data from multiple sites and only then produces arbitrarily long synthetic time series mimicking the spatial and temporal structure of the calibration weather data. To generate weather series representing the future climate, the WG parameters are modified according to the climate change scenario, typically derived from GCM or RCM simulations. In the second mode, the user provides only basic information (not necessarily realistic) on the temporal and spatial auto-correlation structure of the surface weather variables and their mean annual cycle; the generator itself derives the parameters of the underlying autoregressive model, which produces the multi-site weather series. In the latter mode of operation, the user is allowed to prescribe a spatially varying trend, which is superimposed on the values produced by the generator; this feature has been implemented for use in developing the methodology for assessing the significance of trends in multi-site weather series (for more details see another EGU-2017 contribution: Huth and Dubrovsky, 2017, Evaluating collective significance of climatic trends: A comparison of methods on synthetic data; EGU2017-4993). This contribution will focus on the first (classical) mode. The poster will present (a) the model of the generator, (b) results of the validation tests made in terms of spatial hot/cold/dry/wet spells, and (c) results of the pilot climate change impact experiment, in which (i) the WG parameters representing the spatial and temporal variability are modified using the climate change scenarios and then (ii) the effect on the above spatial validation indices derived from the synthetic series produced by the modified WG is analysed. In this experiment, the generator is calibrated using the E-OBS gridded daily weather data for several European regions, and the climate change scenarios are derived from a selected RCM simulation (taken from the CORDEX database).

  17. High voltage pulse generator. [Patent application

    DOEpatents

    Fasching, G.E.

    1975-06-12

    An improved high-voltage pulse generator is described which is especially useful in ultrasonic testing of rock core samples. A number N of capacitors are charged in parallel to V volts and at the proper instant are coupled in series to produce a high-voltage pulse of N times V volts. Rapid switching of the capacitors from the paralleled charging configuration to the series discharging configuration is accomplished by using silicon-controlled rectifiers which are chain self-triggered following the initial triggering of the first rectifier connected between the first and second capacitors. A timing and triggering circuit is provided to properly synchronize triggering pulses to the first SCR at a time when the charging voltage is not being applied to the parallel-connected charging capacitors. The output voltage can be readily increased by adding additional charging networks. The circuit allows the peak level of the output to be easily varied over a wide range by using a variable autotransformer in the charging circuit.

  18. Multiresolution forecasting for futures trading using wavelet decompositions.

    PubMed

    Zhang, B L; Coggins, R; Jabri, M A; Dersch, D; Flower, B

    2001-01-01

    We investigate the effectiveness of a financial time-series forecasting strategy which exploits the multiresolution property of the wavelet transform. A financial series is decomposed into an overcomplete, shift-invariant, scale-related representation. In transform space, each individual wavelet series is modeled by a separate multilayer perceptron (MLP). We apply the Bayesian method of automatic relevance determination to choose short past windows (short-term history) for the inputs to the MLPs at lower scales and long past windows (long-term history) at higher scales. To form the overall forecast, the individual forecasts are then recombined by the linear reconstruction property of the inverse transform with the chosen autocorrelation shell representation, or by another perceptron which learns the weight of each scale in the prediction of the original time series. The forecast results are then passed to a money management system to generate trades.

  19. POD Model Reconstruction for Gray-Box Fault Detection

    NASA Technical Reports Server (NTRS)

    Park, Han; Zak, Michail

    2007-01-01

    Proper orthogonal decomposition (POD) is the mathematical basis of a method of constructing low-order mathematical models for the "gray-box" fault-detection algorithm that is a component of a diagnostic system known as beacon-based exception analysis for multi-missions (BEAM). POD has been successfully applied in reducing computational complexity by generating simple models that can be used for control and simulation of complex systems such as fluid flows. In the present application to BEAM, POD brings the same benefits to automated diagnosis. BEAM is a method of real-time or offline, automated diagnosis of a complex dynamic system. The gray-box approach makes it possible to utilize incomplete or approximate knowledge of the dynamics of the system that one seeks to diagnose. In the gray-box approach, a deterministic model of the system is used to filter a time series of system sensor data to remove the deterministic components of the time series from further examination. What is left after the filtering operation is a time series of residual quantities that represent the unknown (or at least unmodeled) aspects of the behavior of the system. Stochastic modeling techniques are then applied to the residual time series. The procedure for detecting abnormal behavior of the system then becomes one of looking for statistical differences between the residual time series and the predictions of the stochastic model.
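
    The POD step itself amounts to a singular value decomposition of a snapshot matrix; the sketch below builds a two-mode basis from toy sensor data, reconstructs the deterministic part, and keeps the residual time series for downstream statistical monitoring. All names, sizes and the rank choice are illustrative, not the BEAM implementation.

        # Sketch of POD via SVD: low-order reconstruction and residual extraction.
        import numpy as np

        rng = np.random.default_rng(9)
        t = np.linspace(0, 20, 500)
        # snapshot matrix: 500 time samples x 8 sensors, driven by 2 coherent modes
        modes = np.vstack([np.sin(t), np.cos(0.5 * t)]).T
        data = modes @ rng.standard_normal((2, 8)) + 0.05 * rng.standard_normal((500, 8))

        mean = data.mean(axis=0)
        U, s, Vt = np.linalg.svd(data - mean, full_matrices=False)
        r = 2                                         # retained POD modes
        basis = Vt[:r]                                # r x n_sensors spatial modes

        reconstruction = (data - mean) @ basis.T @ basis + mean
        residual = data - reconstruction              # unmodeled part, per sensor
        print("residual std per sensor:", np.round(residual.std(axis=0), 3))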

  20. Describing temporal variability of the mean Estonian precipitation series in climate time scale

    NASA Astrophysics Data System (ADS)

    Post, P.; Kärner, O.

    2009-04-01

    Applicability of random walk type models to represent the temporal variability of various atmospheric temperature series has been successfully demonstrated recently (e.g. Kärner, 2002). The main problem in temperature modeling is connected to the scale break in the generally self-similar air temperature anomaly series (Kärner, 2005). The break separates short-range strong non-stationarity from a nearly stationary longer-range variability region. This is an indication of the fact that several geophysical time series show short-range non-stationary behaviour and stationary behaviour at longer range (Davis et al., 1996). In order to model such series, the choice of time step appears to be crucial. To characterize the long-range variability we can neglect the short-range non-stationary fluctuations, provided that we are able to model properly the long-range tendencies. The structure function (Monin and Yaglom, 1975) was used to determine an approximate segregation line between the short and the long scale in terms of modeling. The longer scale can be called the climate scale, because such models are applicable at scales over some decades. In order to get rid of the short-range fluctuations in daily series, the variability can be examined using a sufficiently long time step. In the present paper, we show that the same philosophy is useful for finding a model to represent the climate-scale temporal variability of the Estonian daily mean precipitation amount series over 45 years (1961-2005). Temporal variability of the obtained daily time series is examined by means of an autoregressive and integrated moving average (ARIMA) family model of the type (0,1,1). This model is applicable for simulating daily precipitation if an appropriate time step is selected that enables us to neglect the short-range non-stationary fluctuations. A considerably longer time step than one day (30 days) is used in the current paper to model the precipitation time series variability. Each ARIMA(0,1,1) model can be interpreted as consisting of a random walk in a noisy environment (Box and Jenkins, 1976). The fitted model appears to be weakly non-stationary, which gives us the possibility of using a stationary approximation if only the noise component of the sum of white noise and random walk is exploited. We get a convenient routine to generate a stationary precipitation climatology with reasonable accuracy, since the noise component variance is much larger than the dispersion of the random walk generator. This interpretation emphasizes the dominating role of a random component in the precipitation series. The result is understandable given the small territory of Estonia, which is situated in the mid-latitude cyclone track. References Box, G.E.P. and G.M. Jenkins 1976: Time Series Analysis, Forecasting and Control (revised edn.), Holden-Day, San Francisco, CA, 575 pp. Davis, A., Marshak, A., Wiscombe, W. and R. Cahalan 1996: Multifractal characterizations of intermittency in nonstationary geophysical signals and fields. In G. Trevino et al. (eds), Current Topics in Nonstationarity Analysis. World Scientific, Singapore, 97-158. Kärner, O. 2002: On nonstationarity and antipersistency in global temperature series. J. Geophys. Res. D107; doi:10.1029/2001JD002024. Kärner, O. 2005: Some examples on negative feedback in the Earth climate system. Centr. European J. Phys. 3; 190-208. Monin, A.S. and A.M. Yaglom 1975: Statistical Fluid Mechanics, Vol 2. Mechanics of Turbulence, MIT Press, Boston, MA, 886 pp.
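
    Fitting an ARIMA(0,1,1) model of the kind described above takes only a few lines with statsmodels; the synthetic random-walk-plus-noise series below stands in for the 30-day Estonian precipitation sums, so the fitted values are purely illustrative.

        # Sketch: fit ARIMA(0,1,1) to a synthetic random-walk-plus-noise series.
        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(10)
        n = 45 * 12                                       # 45 years of 30-day aggregates
        level = np.cumsum(0.2 * rng.standard_normal(n))   # slowly drifting "climate" level
        precip = 60 + level + 5 * rng.standard_normal(n)  # plus white observation noise

        fit = ARIMA(precip, order=(0, 1, 1)).fit()
        print("MA(1) coefficient:", round(fit.params[0], 2),
              "innovation variance:", round(fit.params[1], 2))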

  1. Forecasting Non-Stationary Diarrhea, Acute Respiratory Infection, and Malaria Time-Series in Niono, Mali

    PubMed Central

    Medina, Daniel C.; Findley, Sally E.; Guindo, Boubacar; Doumbia, Seydou

    2007-01-01

    Background Much of the developing world, particularly sub-Saharan Africa, exhibits high levels of morbidity and mortality associated with diarrhea, acute respiratory infection, and malaria. With the increasing awareness that the aforementioned infectious diseases impose an enormous burden on developing countries, public health programs therein could benefit from parsimonious general-purpose forecasting methods to enhance infectious disease intervention. Unfortunately, these disease time-series often i) suffer from non-stationarity; ii) exhibit large inter-annual plus seasonal fluctuations; and, iii) require disease-specific tailoring of forecasting methods. Methodology/Principal Findings In this longitudinal retrospective (01/1996–06/2004) investigation, diarrhea, acute respiratory infection of the lower tract, and malaria consultation time-series are fitted with a general-purpose econometric method, namely the multiplicative Holt-Winters, to produce contemporaneous on-line forecasts for the district of Niono, Mali. This method accommodates seasonal, as well as inter-annual, fluctuations and produces reasonably accurate median 2- and 3-month horizon forecasts for these non-stationary time-series, i.e., 92% of the 24 time-series forecasts generated (2 forecast horizons, 3 diseases, and 4 age categories = 24 time-series forecasts) have mean absolute percentage errors circa 25%. Conclusions/Significance The multiplicative Holt-Winters forecasting method: i) performs well across diseases with dramatically distinct transmission modes and hence it is a strong general-purpose forecasting method candidate for non-stationary epidemiological time-series; ii) obliquely captures prior non-linear interactions between climate and the aforementioned disease dynamics thus, obviating the need for more complex disease-specific climate-based parametric forecasting methods in the district of Niono; furthermore, iii) readily decomposes time-series into seasonal components thereby potentially assisting with programming of public health interventions, as well as monitoring of disease dynamics modification. Therefore, these forecasts could improve infectious diseases management in the district of Niono, Mali, and elsewhere in the Sahel. PMID:18030322

  2. Forecasting non-stationary diarrhea, acute respiratory infection, and malaria time-series in Niono, Mali.

    PubMed

    Medina, Daniel C; Findley, Sally E; Guindo, Boubacar; Doumbia, Seydou

    2007-11-21

    Much of the developing world, particularly sub-Saharan Africa, exhibits high levels of morbidity and mortality associated with diarrhea, acute respiratory infection, and malaria. With the increasing awareness that the aforementioned infectious diseases impose an enormous burden on developing countries, public health programs therein could benefit from parsimonious general-purpose forecasting methods to enhance infectious disease intervention. Unfortunately, these disease time-series often i) suffer from non-stationarity; ii) exhibit large inter-annual plus seasonal fluctuations; and, iii) require disease-specific tailoring of forecasting methods. In this longitudinal retrospective (01/1996-06/2004) investigation, diarrhea, acute respiratory infection of the lower tract, and malaria consultation time-series are fitted with a general-purpose econometric method, namely the multiplicative Holt-Winters, to produce contemporaneous on-line forecasts for the district of Niono, Mali. This method accommodates seasonal, as well as inter-annual, fluctuations and produces reasonably accurate median 2- and 3-month horizon forecasts for these non-stationary time-series, i.e., 92% of the 24 time-series forecasts generated (2 forecast horizons, 3 diseases, and 4 age categories = 24 time-series forecasts) have mean absolute percentage errors circa 25%. The multiplicative Holt-Winters forecasting method: i) performs well across diseases with dramatically distinct transmission modes and hence it is a strong general-purpose forecasting method candidate for non-stationary epidemiological time-series; ii) obliquely captures prior non-linear interactions between climate and the aforementioned disease dynamics thus, obviating the need for more complex disease-specific climate-based parametric forecasting methods in the district of Niono; furthermore, iii) readily decomposes time-series into seasonal components thereby potentially assisting with programming of public health interventions, as well as monitoring of disease dynamics modification. Therefore, these forecasts could improve infectious diseases management in the district of Niono, Mali, and elsewhere in the Sahel.
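
    The multiplicative Holt-Winters method used in the two records above is available off the shelf; a minimal sketch with statsmodels is shown below, with a synthetic monthly series standing in for the Niono consultation counts and a 2-month forecast horizon mirroring the text.

        # Sketch: multiplicative-seasonal Holt-Winters fit and 2-month-ahead forecast.
        import numpy as np
        from statsmodels.tsa.holtwinters import ExponentialSmoothing

        rng = np.random.default_rng(11)
        months = np.arange(102)                       # 01/1996 - 06/2004, monthly
        seasonal = 1.0 + 0.6 * np.sin(2 * np.pi * months / 12) ** 2
        cases = (20 + 0.1 * months) * seasonal * np.exp(0.1 * rng.standard_normal(102))

        fit = ExponentialSmoothing(cases, trend="add", seasonal="mul",
                                   seasonal_periods=12).fit()
        print("next two months:", np.round(fit.forecast(2), 1))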

  3. Shilling attack detection for recommender systems based on credibility of group users and rating time series.

    PubMed

    Zhou, Wei; Wen, Junhao; Qu, Qiang; Zeng, Jun; Cheng, Tian

    2018-01-01

    Recommender systems are vulnerable to shilling attacks. Forged user-generated content data, such as user ratings and reviews, are used by attackers to manipulate recommendation rankings. Shilling attack detection in recommender systems is of great significance to maintain the fairness and sustainability of recommender systems. The current studies have problems in terms of the poor universality of algorithms, difficulty in selection of user profile attributes, and lack of an optimization mechanism. In this paper, a shilling behaviour detection structure based on abnormal group user findings and rating time series analysis is proposed. This paper adds to the current understanding in the field by studying the credibility evaluation model in-depth based on the rating prediction model to derive proximity-based predictions. A method for detecting suspicious ratings based on suspicious time windows and target item analysis is proposed. Suspicious rating time segments are determined by constructing a time series, and data streams of the rating items are examined and suspicious rating segments are checked. To analyse features of shilling attacks by a group user's credibility, an abnormal group user discovery method based on time series and time window is proposed. Standard testing datasets are used to verify the effect of the proposed method.

  4. Shilling attack detection for recommender systems based on credibility of group users and rating time series

    PubMed Central

    Wen, Junhao; Qu, Qiang; Zeng, Jun; Cheng, Tian

    2018-01-01

    Recommender systems are vulnerable to shilling attacks. Forged user-generated content data, such as user ratings and reviews, are used by attackers to manipulate recommendation rankings. Shilling attack detection in recommender systems is of great significance to maintain the fairness and sustainability of recommender systems. The current studies have problems in terms of the poor universality of algorithms, difficulty in selection of user profile attributes, and lack of an optimization mechanism. In this paper, a shilling behaviour detection structure based on abnormal group user findings and rating time series analysis is proposed. This paper adds to the current understanding in the field by studying the credibility evaluation model in-depth based on the rating prediction model to derive proximity-based predictions. A method for detecting suspicious ratings based on suspicious time windows and target item analysis is proposed. Suspicious rating time segments are determined by constructing a time series, and data streams of the rating items are examined and suspicious rating segments are checked. To analyse features of shilling attacks by a group user’s credibility, an abnormal group user discovery method based on time series and time window is proposed. Standard testing datasets are used to verify the effect of the proposed method. PMID:29742134

  5. FROG: Time Series Analysis for the Web Service Era

    NASA Astrophysics Data System (ADS)

    Allan, A.

    2005-12-01

    The FROG application is part of the next generation Starlink (http://www.starlink.ac.uk) software work (Draper et al. 2005) and is released under the GNU Public License (GPL; http://www.gnu.org/copyleft/gpl.html). Written in Java, it has been designed for the Web and Grid Service era as an extensible, pluggable tool for time series analysis and display. With an integrated SOAP server, the package's functionality is exposed to users for use in their own code and for remote use over the Grid, as part of the Virtual Observatory (VO).

  6. Use of recurrence plot and recurrence quantification analysis in Taiwan unemployment rate time series

    NASA Astrophysics Data System (ADS)

    Chen, Wei-Shing

    2011-04-01

    The aim of this article is to answer the question of whether the Taiwan unemployment rate dynamics are generated by a non-linear deterministic dynamic process. The paper applies a recurrence plot and recurrence quantification approach based on the analysis of non-stationary hidden transition patterns of the unemployment rate of Taiwan. The case study uses the time series data of Taiwan's unemployment rate during the period from 1978/01 to 2010/06. The results show that recurrence techniques are able to identify various phases in the evolution of unemployment transition in Taiwan.
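
    A recurrence plot itself is simple to construct: delay-embed the series, compute pairwise distances between embedded states, and threshold them. The synthetic unemployment-like series, the embedding parameters and the 10% recurrence threshold below are illustrative assumptions only.

        # Minimal recurrence plot: thresholded distances between delay-embedded states.
        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(12)
        t = np.arange(390)                                 # monthly, 1978/01-2010/06
        rate = 3 + 1.5 * np.sin(2 * np.pi * t / 120) + 0.2 * rng.standard_normal(390)

        E, tau = 3, 2                                      # embedding dimension, delay
        n = len(rate) - (E - 1) * tau
        states = np.column_stack([rate[i * tau: i * tau + n] for i in range(E)])

        dist = np.linalg.norm(states[:, None, :] - states[None, :, :], axis=-1)
        eps = np.quantile(dist, 0.1)                       # 10% recurrence threshold
        R = (dist <= eps).astype(int)                      # recurrence matrix

        plt.imshow(R, origin="lower", cmap="binary")
        plt.xlabel("time index"); plt.ylabel("time index")
        plt.title("recurrence plot")
        plt.show()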

  7. The NCAA's New Hammer

    ERIC Educational Resources Information Center

    Wolverton, Brad

    2012-01-01

    A series of unprecedented scandals has eroded confidence in big-time sports, increasing the appetite for change. Some critics have a tough time seeing the NCAA as a savior; they say the real problem is the NCAA structure itself, which allows athletes to generate billions of dollars for colleges while earning no compensation themselves. Mark A.…

  8. 3D displacement time series in the Afar rift zone computed from SAR phase and amplitude information

    NASA Astrophysics Data System (ADS)

    Casu, Francesco; Manconi, Andrea

    2013-04-01

    Large and rapid deformations, such as those caused by earthquakes, eruptions, and landslides, cannot be fully measured by using standard DInSAR applications. Indeed, the phase information often degrades and some areas of the interferograms are affected by high fringe rates, leading to difficulties in the phase unwrapping, and/or to complete loss of coherence due to significant misregistration errors. This limitation can be overcome by exploiting the SAR image amplitude information instead of the phase, and by calculating the Pixel-Offset (PO) field from SAR image pairs, for both range and azimuth directions. Moreover, it is possible to combine the PO results by following the same rationale as the SBAS technique, to finally retrieve the offset-based deformation time series. This technique, named PO-SBAS, permits retrieval of the deformation field in areas affected by very large displacements at an accuracy that, for ENVISAT data, corresponds to 30 cm and 15 cm for the range and azimuth, respectively [1]. Moreover, the combination of SBAS and PO-SBAS time series can help to better study and model deformation phenomena characterized by spatial and temporal heterogeneities [2]. The Dabbahu rift segment of the Afar depression has been active since 2005, when a 2.5 km3 dyke intrusion and hundreds of earthquakes marked the onset of a rifting episode that continues to date. The ENVISAT satellite has repeatedly imaged the Afar depression since 2003, generating a large SAR archive. In this work, we study the Afar rift region deformations by using both the phase and amplitude information of several sets of SAR images acquired from ascending and descending ENVISAT tracks. We combined sets of small baseline interferograms through the SBAS algorithm, and we generated both ground deformation maps and time series along the satellite Line-Of-Sight (LOS). In areas where the deformation gradient causes loss of coherence, we retrieve the displacement field through the amplitude information. Furthermore, we could also retrieve the full 3D deformation field by considering the North-South displacement component obtained from the azimuth PO information. The combination of SBAS and PO-SBAS information permits better retrieval and constraint of the full deformation field due to repeated intrusions, fault movements, as well as magma movements from individual magma chambers. [1] Casu, F., A. Manconi, A. Pepe and R. Lanari, 2011. Deformation time-series generation in areas characterized by large displacement dynamics: the SAR amplitude Pixel-Offset SBAS technique, IEEE Transactions on Geoscience and Remote Sensing. [2] Manconi, A. and F. Casu, 2012. Joint analysis of displacement time series retrieved from SAR phase and amplitude: impact on the estimation of volcanic source parameters, Geophysical Research Letters, doi:10.1029/2012GL052202.

  9. Karst characterization in a semi-arid region using gravity, seismic, and resistivity geophysical techniques.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnhart, Kevin Scott

    2013-10-01

    We proposed to customize emerging in situ geophysical monitoring technology to generate time-series data during sporadic rain events in a semi-arid region. Electrodes were to be connected to wireless

  10. CROSS-DISCIPLINARY PHYSICS AND RELATED AREAS OF SCIENCE AND TECHNOLOGY: Note on Two-Phase Phenomena in Financial Markets

    NASA Astrophysics Data System (ADS)

    Jiang, Shi-Mei; Cai, Shi-Min; Zhou, Tao; Zhou, Pei-Ling

    2008-06-01

    The two-phase behaviour in financial markets actually refers to the bifurcation phenomenon, which represents the change of the conditional probability from an unimodal to a bimodal distribution. We investigate the bifurcation phenomenon in the Hang Seng index. It is observed that the bifurcation phenomenon in a financial index is not universal, but specific to certain conditions. For the Hang Seng index and randomly generated time series, the phenomenon only emerges when the power-law exponent of the absolute increment distribution is between 1 and 2 with an appropriate period. Simulations on a randomly generated time series suggest the bifurcation phenomenon itself is governed by the statistics of the absolute increments, and thus may not be able to reflect essential financial behaviours. However, even under the same distribution of absolute increments, the range where the bifurcation phenomenon occurs differs considerably between the real market and artificial data, which may reflect certain market information.

  11. Beyond annual streamflow reconstructions for the Upper Colorado River Basin: a paleo-water-balance approach

    USGS Publications Warehouse

    Gangopadhyay, Subhrendu; McCabe, Gregory J.; Woodhouse, Connie A.

    2015-01-01

    In this paper, we present a methodology to use annual tree-ring chronologies and a monthly water balance model to generate annual reconstructions of water balance variables (e.g., potential evapotranspiration (PET), actual evapotranspiration (AET), snow water equivalent (SWE), soil moisture storage (SMS), and runoff (R)). The method involves resampling monthly temperature and precipitation from the instrumental record directed by variability indicated by the paleoclimate record. The generated time series of monthly temperature and precipitation are subsequently used as inputs to a monthly water balance model. The methodology is applied to the Upper Colorado River Basin, and results indicate that the methodology reliably simulates water-year runoff, maximum snow water equivalent, and seasonal soil moisture storage for the instrumental period. As a final application, the methodology is used to produce time series of PET, AET, SWE, SMS, and R for the 1404–1905 period for the Upper Colorado River Basin.

  12. Scenario Generation and Assessment Framework Solution in Support of the Comprehensive Approach

    DTIC Science & Technology

    2010-04-01

    attention, stress, fatigue etc.) and neurofeedback tracking for evaluation in a qualitative manner the real involvement of the trained participants in CAX...Series, Softrade, 2006 (in Bulgarian). [11] Minchev Z., Dukov G., Georgiev S. EEG Spectral Analysis in Serious Gaming: An Ad Hoc Experimental...Nonlinear and linear forecasting of the EEG time series, Biological Cybernetics, 66, 221-259, 1991. [20] Schubert, J., Svenson, P., and Mårtenson, Ch

  13. Uncertainty estimation of long-range ensemble forecasts of snowmelt flood characteristics

    NASA Astrophysics Data System (ADS)

    Kuchment, L.

    2012-04-01

    Long-range forecasts of snowmelt flood characteristics with a lead time of 2-3 months are of great importance for the regulation of flood runoff and the mitigation of flood damages on almost all large Russian rivers. At the same time, the application of current forecasting techniques based on regression relationships between the runoff volume and indexes of river basin conditions can lead to serious forecasting errors, resulting in large economic losses caused by wrong flood regulation. The forecast errors can be caused by complicated processes of soil freezing and soil moisture redistribution, too high a rate of snow melt, large liquid precipitation before snow melt, or by large differences between the meteorological conditions during the lead-time periods and climatological ones. Analysis of economic losses has shown that the largest damages could, to a significant extent, be avoided if the decision makers had an opportunity to take predictive uncertainty into account and could use more cautious strategies in runoff regulation. The development of a methodology for long-range ensemble forecasting of spring/summer floods based on distributed, physically based runoff generation models has created, in principle, a new basis for improving hydrological predictions as well as for estimating their uncertainty. This approach is illustrated by forecasting of the spring-summer floods in the Vyatka River and Seim River basins. The application of the physically based models of snowmelt runoff generation gives an essential improvement in the statistical estimates of the deterministic forecasts of flood volume in comparison with the forecasts obtained from the regression relationships. These models were also used for probabilistic forecasts, assigning meteorological inputs during the lead-time periods from the available historical daily series and from series simulated using a weather generator and the Monte Carlo procedure. The weather generator consists of stochastic models of daily temperature and precipitation. The performance of the probabilistic forecasts was estimated by ranked probability skill scores. The application of Monte Carlo simulations using the weather generator gave better results than using the historical meteorological series.

  14. Synthetic wind speed scenarios generation for probabilistic analysis of hybrid energy systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Jun; Rabiti, Cristian

    Hybrid energy systems consisting of multiple energy inputs and multiple energy outputs have been proposed to be an effective element to enable ever increasing penetration of clean energy. In order to better understand the dynamic and probabilistic behavior of hybrid energy systems, this paper proposes a model combining Fourier series and autoregressive moving average (ARMA) to characterize historical weather measurements and to generate synthetic weather (e.g., wind speed) data. In particular, Fourier series is used to characterize the seasonal trend in historical data, while ARMA is applied to capture the autocorrelation in residue time series (e.g., measurements minus seasonal trends). The generated synthetic wind speed data is then utilized to perform probabilistic analysis of a particular hybrid energy system configuration, which consists of nuclear power plant, wind farm, battery storage, natural gas boiler, and chemical plant. As a result, requirements on component ramping rate, economic and environmental impacts of hybrid energy systems, and the effects of deploying different sizes of batteries in smoothing renewable variability, are all investigated.

  15. Synthetic wind speed scenarios generation for probabilistic analysis of hybrid energy systems

    DOE PAGES

    Chen, Jun; Rabiti, Cristian

    2016-11-25

    Hybrid energy systems consisting of multiple energy inputs and multiple energy outputs have been proposed to be an effective element to enable ever increasing penetration of clean energy. In order to better understand the dynamic and probabilistic behavior of hybrid energy systems, this paper proposes a model combining Fourier series and autoregressive moving average (ARMA) to characterize historical weather measurements and to generate synthetic weather (e.g., wind speed) data. In particular, Fourier series is used to characterize the seasonal trend in historical data, while ARMA is applied to capture the autocorrelation in residue time series (e.g., measurements minus seasonal trends). The generated synthetic wind speed data is then utilized to perform probabilistic analysis of a particular hybrid energy system configuration, which consists of nuclear power plant, wind farm, battery storage, natural gas boiler, and chemical plant. As a result, requirements on component ramping rate, economic and environmental impacts of hybrid energy systems, and the effects of deploying different sizes of batteries in smoothing renewable variability, are all investigated.
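
    A minimal sketch of the Fourier-plus-autoregressive idea in these two records, under simplifying assumptions: the seasonal trend is fitted by least squares on a small Fourier basis, and the residuals are modelled with an AR(1) process instead of the full ARMA model used in the paper. Function names, the period, and the number of harmonics are illustrative.

      import numpy as np

      def synthetic_wind(speed, period=24.0, n_harmonics=2, seed=0):
          rng = np.random.default_rng(seed)
          y = np.asarray(speed, dtype=float)
          t = np.arange(len(y))
          # Fourier design matrix for the seasonal (e.g., diurnal) trend
          cols = [np.ones_like(t, dtype=float)]
          for k in range(1, n_harmonics + 1):
              cols += [np.sin(2 * np.pi * k * t / period),
                       np.cos(2 * np.pi * k * t / period)]
          X = np.column_stack(cols)
          beta, *_ = np.linalg.lstsq(X, y, rcond=None)
          trend = X @ beta
          resid = y - trend
          # AR(1) fit to the residuals (a stand-in for the paper's ARMA model)
          phi = np.dot(resid[:-1], resid[1:]) / np.dot(resid[:-1], resid[:-1])
          sigma = np.std(resid[1:] - phi * resid[:-1])
          # resimulate residuals and add the seasonal trend back
          sim = np.zeros_like(resid)
          for i in range(1, len(sim)):
              sim[i] = phi * sim[i - 1] + rng.normal(0.0, sigma)
          return trend + sim

    Calling synthetic_wind(observed_hourly_speeds, period=24.0) would return one synthetic realization with the same length and seasonal structure as the input.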

  16. Applying data mining techniques to medical time series: an empirical case study in electroencephalography and stabilometry.

    PubMed

    Anguera, A; Barreiro, J M; Lara, J A; Lizcano, D

    2016-01-01

    One of the major challenges in the medical domain today is how to exploit the huge amount of data that this field generates. To do this, approaches are required that are capable of discovering knowledge that is useful for decision making in the medical field. Time series are data types that are common in the medical domain and require specialized analysis techniques and tools, especially if the information of interest to specialists is concentrated within particular time series regions, known as events. This research followed the steps specified by the so-called knowledge discovery in databases (KDD) process to discover knowledge from medical time series derived from stabilometric (396 series) and electroencephalographic (200) patient electronic health records (EHR). The view offered in the paper is based on the experience gathered as part of the VIIP project. Knowledge discovery in medical time series has a number of difficulties and implications that are highlighted by illustrating the application of several techniques that cover the entire KDD process through two case studies. This paper illustrates the application of different knowledge discovery techniques for the purposes of classification within the above domains. The accuracy of this application for the two classes considered in each case is 99.86% and 98.11% for epilepsy diagnosis in the electroencephalography (EEG) domain and 99.4% and 99.1% for early-age sports talent classification in the stabilometry domain. The KDD techniques achieve better results than other traditional neural network-based classification techniques.

  17. Wavelet-based surrogate time series for multiscale simulation of heterogeneous catalysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savara, Aditya Ashi; Daw, C. Stuart; Xiong, Qingang

    We propose a wavelet-based scheme that encodes the essential dynamics of discrete microscale surface reactions in a form that can be coupled with continuum macroscale flow simulations with high computational efficiency. This makes it possible to simulate the dynamic behavior of reactor-scale heterogeneous catalysis without requiring detailed concurrent simulations at both the surface and continuum scales using different models. Our scheme is based on the application of wavelet-based surrogate time series that encodes the essential temporal and/or spatial fine-scale dynamics at the catalyst surface. The encoded dynamics are then used to generate statistically equivalent, randomized surrogate time series, which can be linked to the continuum scale simulation. As a result, we illustrate an application of this approach using two different kinetic Monte Carlo simulations with different characteristic behaviors typical for heterogeneous chemical reactions.

  18. Dimensionless embedding for nonlinear time series analysis

    NASA Astrophysics Data System (ADS)

    Hirata, Yoshito; Aihara, Kazuyuki

    2017-09-01

    Recently, infinite-dimensional delay coordinates (InDDeCs) have been proposed for predicting high-dimensional dynamics instead of conventional delay coordinates. Although InDDeCs can realize faster computation and more accurate short-term prediction, it is still not well-known whether InDDeCs can be used in other applications of nonlinear time series analysis in which reconstruction is needed for the underlying dynamics from a scalar time series generated from a dynamical system. Here, we give theoretical support for justifying the use of InDDeCs and provide numerical examples to show that InDDeCs can be used for various applications for obtaining the recurrence plots, correlation dimensions, and maximal Lyapunov exponents, as well as testing directional couplings and extracting slow-driving forces. We demonstrate performance of the InDDeCs using the weather data. Thus, InDDeCs can eventually realize "dimensionless embedding" while we enjoy faster and more reliable computations.
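
    For context, the sketch below shows the conventional finite-dimensional delay embedding and the recurrence plot built from it, the baseline against which InDDeCs are positioned; it is not the infinite-dimensional construction of the paper. The embedding dimension, delay, and threshold are illustrative.

      import numpy as np

      def delay_embed(x, dim=3, tau=1):
          # Conventional delay-coordinate embedding of a scalar series.
          x = np.asarray(x, dtype=float)
          n = len(x) - (dim - 1) * tau
          return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

      def recurrence_plot(x, dim=3, tau=1, eps=0.1):
          # Binary recurrence matrix: R[i, j] = 1 if embedded states i and j
          # are closer than eps (O(n^2) memory, fine for short series).
          emb = delay_embed(x, dim, tau)
          d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
          return (d < eps).astype(int)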

  19. Deduction of initial strategy distributions of agents in mix-game models

    NASA Astrophysics Data System (ADS)

    Gou, Chengling

    2006-11-01

    This paper reports the effort of deducing the initial strategy distributions (ISDs) of agents in mix-game models that is used to predict a real financial time series generated from a target financial market. Using mix-games to predict Shanghai Index, we find that the time series of prediction accurate rates is sensitive to the ISDs of agents in group 2 who play a minority game, but less sensitive to the ISDs of agents in group 1 who play a majority game. And agents in group 2 tend to cluster in full strategy space (FSS) if the real financial time series has obvious tendency (upward or downward), otherwise they tend to scatter in FSS. We also find that the ISDs and the number of agents in group 1 influence the level of prediction accurate rates. Finally, this paper gives suggestion about further research.

  20. A low free-parameter stochastic model of daily Forbush decrease indices

    NASA Astrophysics Data System (ADS)

    Patra, Sankar Narayan; Bhattacharya, Gautam; Panja, Subhash Chandra; Ghosh, Koushik

    2014-01-01

    Forbush decrease is a rapid decrease in the observed galactic cosmic ray intensity pattern occurring after a coronal mass ejection. In the present paper we have analyzed the daily Forbush decrease indices from January, 1967 to December, 2003 generated in IZMIRAN, Russia. First the entire indices have been smoothened and next we have made an attempt to fit a suitable stochastic model for the present time series by means of a necessary number of process parameters. The study reveals that the present time series is governed by a stationary autoregressive process of order 2 with a trace of white noise. Under the consideration of the present model we have shown that chaos is not expected in the present time series which opens up the possibility of validation of its forecasting (both short-term and long-term) as well as its multi-periodic behavior.
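
    A small sketch of how a stationary AR(2) model, as identified in this record, can be fitted to a demeaned, smoothed index series via the Yule-Walker equations; this is a generic textbook estimator, not the authors' procedure.

      import numpy as np

      def fit_ar2_yule_walker(x):
          # Estimate AR(2) coefficients and white-noise variance from
          # sample autocovariances (Yule-Walker equations).
          x = np.asarray(x, dtype=float) - np.mean(x)
          n = len(x)
          def acov(k):
              return np.dot(x[:n - k], x[k:]) / n
          r0, r1, r2 = acov(0), acov(1), acov(2)
          # solve [[r0, r1], [r1, r0]] @ [phi1, phi2] = [r1, r2]
          phi = np.linalg.solve(np.array([[r0, r1], [r1, r0]]),
                                np.array([r1, r2]))
          noise_var = r0 - phi[0] * r1 - phi[1] * r2
          return phi, noise_var

    Here phi holds the two autoregressive coefficients and noise_var the variance of the white-noise term implied by the fit.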

  1. Using pad‐stripped acausally filtered strong‐motion data

    USGS Publications Warehouse

    Boore, David; Sisi, Aida Azari; Akkar, Sinan

    2012-01-01

    Most strong‐motion data processing involves acausal low‐cut filtering, which requires the addition of sometimes lengthy zero pads to the data. These padded sections are commonly removed by organizations supplying data, but this can lead to incompatibilities in measures of ground motion derived in the usual way from the padded and the pad‐stripped data. One way around this is to use the correct initial conditions in the pad‐stripped time series when computing displacements, velocities, and linear oscillator response. Another way of ensuring compatibility is to use postprocessing of the pad‐stripped acceleration time series. Using 4071 horizontal and vertical acceleration time series from the Turkish strong‐motion database, we show that the procedures used by two organizations—ITACA (ITalian ACcelerometric Archive) and PEER NGA (Pacific Earthquake Engineering Research Center–Next Generation Attenuation)—lead to little bias and distortion of derived seismic‐intensity measures.
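
    The sketch below illustrates acausal (zero-phase) low-cut filtering of a zero-padded acceleration trace followed by pad stripping, the operation whose compatibility this record examines. The pad-length rule, corner frequency, and filter order are illustrative assumptions, and this is not the ITACA or PEER NGA processing chain.

      import numpy as np
      from scipy.signal import butter, filtfilt

      def acausal_lowcut(acc, dt, fc=0.05, order=4):
          # Zero-phase Butterworth low-cut filter applied to a zero-padded
          # trace; the pads are stripped before returning.
          npad = int(1.5 * order / (fc * dt))          # simple pad-length rule
          padded = np.concatenate([np.zeros(npad), np.asarray(acc, dtype=float),
                                   np.zeros(npad)])
          b, a = butter(order, 2.0 * fc * dt, btype='highpass')
          filtered = filtfilt(b, a, padded)
          return filtered[npad:npad + len(acc)]        # pad-stripped series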

  2. Hybrid grammar-based approach to nonlinear dynamical system identification from biological time series

    NASA Astrophysics Data System (ADS)

    McKinney, B. A.; Crowe, J. E., Jr.; Voss, H. U.; Crooke, P. S.; Barney, N.; Moore, J. H.

    2006-02-01

    We introduce a grammar-based hybrid approach to reverse engineering nonlinear ordinary differential equation models from observed time series. This hybrid approach combines a genetic algorithm to search the space of model architectures with a Kalman filter to estimate the model parameters. Domain-specific knowledge is used in a context-free grammar to restrict the search space for the functional form of the target model. We find that the hybrid approach outperforms a pure evolutionary algorithm method, and we observe features in the evolution of the dynamical models that correspond with the emergence of favorable model components. We apply the hybrid method to both artificially generated time series and experimentally observed protein levels from subjects who received the smallpox vaccine. From the observed data, we infer a cytokine protein interaction network for an individual’s response to the smallpox vaccine.

  3. Wavelet-based surrogate time series for multiscale simulation of heterogeneous catalysis

    DOE PAGES

    Savara, Aditya Ashi; Daw, C. Stuart; Xiong, Qingang; ...

    2016-01-28

    We propose a wavelet-based scheme that encodes the essential dynamics of discrete microscale surface reactions in a form that can be coupled with continuum macroscale flow simulations with high computational efficiency. This makes it possible to simulate the dynamic behavior of reactor-scale heterogeneous catalysis without requiring detailed concurrent simulations at both the surface and continuum scales using different models. Our scheme is based on the application of wavelet-based surrogate time series that encodes the essential temporal and/or spatial fine-scale dynamics at the catalyst surface. The encoded dynamics are then used to generate statistically equivalent, randomized surrogate time series, which can be linked to the continuum scale simulation. As a result, we illustrate an application of this approach using two different kinetic Monte Carlo simulations with different characteristic behaviors typical for heterogeneous chemical reactions.

  4. Getting to the point: Rapid point selection and variable density InSAR time series for urban deformation monitoring

    NASA Astrophysics Data System (ADS)

    Spaans, K.; Hooper, A. J.

    2017-12-01

    The short revisit time and high data acquisition rates of current satellites have resulted in increased interest in the development of deformation monitoring and rapid disaster response capability, using InSAR. Fast, efficient data processing methodologies are required to deliver the timely results necessary for this, and also to limit computing resources required to process the large quantities of data being acquired. Contrary to volcano or earthquake applications, urban monitoring requires high resolution processing, in order to differentiate movements between buildings, or between buildings and the surrounding land. Here we present Rapid time series InSAR (RapidSAR), a method that can efficiently update high resolution time series of interferograms, and demonstrate its effectiveness over urban areas. The RapidSAR method estimates the coherence of pixels on an interferogram-by-interferogram basis. This allows for rapid ingestion of newly acquired images without the need to reprocess the earlier acquired part of the time series. The coherence estimate is based on ensembles of neighbouring pixels with similar amplitude behaviour through time, which are identified on an initial set of interferograms, and need be re-evaluated only occasionally. By taking into account scattering properties of points during coherence estimation, a high quality coherence estimate is achieved, allowing point selection at full resolution. The individual point selection maximizes the amount of information that can be extracted from each interferogram, as no selection compromise has to be reached between high and low coherence interferograms. In other words, points do not have to be coherent throughout the time series to contribute to the deformation time series. We demonstrate the effectiveness of our method over urban areas in the UK. We show how the algorithm successfully extracts high density time series from full resolution Sentinel-1 interferograms, and distinguish clearly between buildings and surrounding vegetation or streets. The fact that new interferograms can be processed separately from the remainder of the time series helps manage the high data volumes, both in space and time, generated by current missions.

  5. Enhancing Discovery, Search, and Access of NASA Hydrological Data by Leveraging GEOSS

    NASA Technical Reports Server (NTRS)

    Teng, William L.

    2015-01-01

    An ongoing NASA-funded project has removed a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series) for selected variables of the North American and Global Land Data Assimilation Systems (NLDAS and GLDAS, respectively) and other EOSDIS (Earth Observing System Data Information System) data sets (e.g., precipitation, soil moisture). These time series (data rods) are pre-generated. Data rods Web services are accessible through the CUAHSI Hydrologic Information System (HIS) and the Goddard Earth Sciences Data and Information Services Center (GES DISC) but are not easily discoverable by users of other non-NASA data systems. The Global Earth Observation System of Systems (GEOSS) is a logical mechanism for providing access to the data rods. An ongoing GEOSS Water Services project aims to develop a distributed, global registry of water data, map, and modeling services cataloged using the standards and procedures of the Open Geospatial Consortium and the World Meteorological Organization. The ongoing data rods project has demonstrated the feasibility of leveraging the GEOSS infrastructure to help provide access to time series of model grid information or grids of information over a geographical domain for a particular time interval. A recently-begun, related NASA-funded ACCESS-GEOSS project expands on these prior efforts. Current work is focused on both improving the performance of the generation of on-the-fly (OTF) data rods and the Web interfaces from which users can easily discover, search, and access NASA data.

  6. VIIRS On-Orbit Calibration for Ocean Color Data Processing

    NASA Technical Reports Server (NTRS)

    Eplee, Robert E., Jr.; Turpie, Kevin R.; Fireman, Gwyn F.; Meister, Gerhard; Stone, Thomas C.; Patt, Frederick S.; Franz, Bryan; Bailey, Sean W.; Robinson, Wayne D.; McClain, Charles R.

    2012-01-01

    The NASA VIIRS Ocean Science Team (VOST) has the task of evaluating Suomi NPP VIIRS ocean color data for the continuity of the NASA ocean color climate data records. The generation of science quality ocean color data products requires an instrument calibration that is stable over time. Since the VIIRS NIR Degradation Anomaly directly impacts the bands used for atmospheric correction of the ocean color data (Bands M6 and M7), the VOST has adapted the VIIRS on-orbit calibration approach to meet the ocean science requirements. The solar diffuser calibration time series and the solar diffuser stability monitor time series have been used to derive changes in the instrument response and diffuser reflectance over time for bands M1-M11.

  7. Large-deviation probabilities for correlated Gaussian processes and intermittent dynamical systems

    NASA Astrophysics Data System (ADS)

    Massah, Mozhdeh; Nicol, Matthew; Kantz, Holger

    2018-05-01

    In its classical version, the theory of large deviations makes quantitative statements about the probability of outliers when estimating time averages, if time series data are identically independently distributed. We study large-deviation probabilities (LDPs) for time averages in short- and long-range correlated Gaussian processes and show that long-range correlations lead to subexponential decay of LDPs. A particular deterministic intermittent map can, depending on a control parameter, also generate long-range correlated time series. We illustrate numerically, in agreement with the mathematical literature, that this type of intermittency leads to a power law decay of LDPs. The power law decay holds irrespective of whether the correlation time is finite or infinite, and hence irrespective of whether the central limit theorem applies or not.

  8. Improvements to surrogate data methods for nonstationary time series.

    PubMed

    Lucio, J H; Valdés, R; Rodríguez, L R

    2012-05-01

    The method of surrogate data has been extensively applied to hypothesis testing of system linearity, when only one realization of the system, a time series, is known. Normally, surrogate data should preserve the linear stochastic structure and the amplitude distribution of the original series. Classical surrogate data methods (such as random permutation, amplitude adjusted Fourier transform, or iterative amplitude adjusted Fourier transform) are successful at preserving one or both of these features in stationary cases. However, they always produce stationary surrogates, hence existing nonstationarity could be interpreted as dynamic nonlinearity. Certain modifications have been proposed that additionally preserve some nonstationarity, at the expense of reproducing a great deal of nonlinearity. However, even those methods generally fail to preserve the trend (i.e., global nonstationarity in the mean) of the original series. This is the case of time series with unit roots in their autoregressive structure. Additionally, those methods, based on Fourier transform, either need first and last values in the original series to match, or they need to select a piece of the original series with matching ends. These conditions are often inapplicable and the resulting surrogates are adversely affected by the well-known artefact problem. In this study, we propose a simple technique that, applied within existing Fourier-transform-based methods, generates surrogate data that jointly preserve the aforementioned characteristics of the original series, including (even strong) trends. Moreover, our technique avoids the negative effects of end mismatch. Several artificial and real, stationary and nonstationary, linear and nonlinear time series are examined, in order to demonstrate the advantages of the methods. Corresponding surrogate data are produced with the classical and with the proposed methods, and the results are compared.
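
    As background to the modifications discussed in this record, the sketch below generates a classical amplitude-adjusted Fourier transform (AAFT) surrogate, which preserves the amplitude distribution and approximately the power spectrum but, as the abstract points out, not trends; the trend-preserving technique proposed in the paper is not reproduced here.

      import numpy as np

      def aaft_surrogate(x, seed=0):
          rng = np.random.default_rng(seed)
          x = np.asarray(x, dtype=float)
          n = len(x)
          # 1. rank-remap a Gaussian series onto the ordering of x
          gauss = np.sort(rng.normal(size=n))[np.argsort(np.argsort(x))]
          # 2. randomise the Fourier phases of the Gaussianised series
          spec = np.fft.rfft(gauss)
          phases = rng.uniform(0, 2 * np.pi, len(spec))
          phases[0] = 0.0
          shuffled = np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=n)
          # 3. map the original amplitudes back onto the new ordering
          return np.sort(x)[np.argsort(np.argsort(shuffled))]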

  9. Mapping Wetlands of Dongting Lake in China Using Landsat and SENTINEL-1 Time Series at 30M

    NASA Astrophysics Data System (ADS)

    Xing, L.; Tang, X.; Wang, H.; Fan, W.; Gao, X.

    2018-04-01

    Mapping and monitoring wetlands of Dongting Lake using optical sensor data has been limited by cloud cover, while open-access Sentinel-1 C-band data can provide cloud-free SAR images with both high spatial and high temporal resolution, offering new opportunities for monitoring wetlands. In this study, we combined optical data and SAR data to map wetlands of the Dongting Lake reserves in 2016. Firstly, we generated two-monthly composited Landsat land surface reflectance, NDVI, NDWI, and TC-Wetness time series and Sentinel-1 (backscattering coefficient for VH and VV) time series. Secondly, we derived surface water bodies at a two-monthly frequency based on a threshold method using the Sentinel-1 time series. The permanent water and seasonal water were then separated by the submergence ratio. Other land cover types were identified with an SVM classifier using the Landsat time series. Results showed that (1) the overall accuracies and kappa coefficients were above 86.6 % and 0.8, and (2) natural wetlands including permanent water body (14.8 %), seasonal water body (34.6 %), and permanent marshes (10.9 %) were the main land cover types, accounting for 60.3 % of the three wetland reserves. Human-made wetlands, such as rice fields, accounted for 34.3 % of the total area. Overall, this study proposed a new workflow for wetlands mapping in Dongting Lake by combining multi-source remote sensing data, and the use of the two-monthly composited optical time series effectively made up for the data missing due to clouds and increased the possibility of precise wetlands classification.

  10. HydroClimATe: hydrologic and climatic analysis toolkit

    USGS Publications Warehouse

    Dickinson, Jesse; Hanson, Randall T.; Predmore, Steven K.

    2014-01-01

    The potential consequences of climate variability and climate change have been identified as major issues for the sustainability and availability of the worldwide water resources. Unlike global climate change, climate variability represents deviations from the long-term state of the climate over periods of a few years to several decades. Currently, rich hydrologic time-series data are available, but the combination of data preparation and statistical methods developed by the U.S. Geological Survey as part of the Groundwater Resources Program is relatively unavailable to hydrologists and engineers who could benefit from estimates of climate variability and its effects on periodic recharge and water-resource availability. This report documents HydroClimATe, a computer program for assessing the relations between variable climatic and hydrologic time-series data. HydroClimATe was developed for a Windows operating system. The software includes statistical tools for (1) time-series preprocessing, (2) spectral analysis, (3) spatial and temporal analysis, (4) correlation analysis, and (5) projections. The time-series preprocessing tools include spline fitting, standardization using a normal or gamma distribution, and transformation by a cumulative departure. The spectral analysis tools include discrete Fourier transform, maximum entropy method, and singular spectrum analysis. The spatial and temporal analysis tool is empirical orthogonal function analysis. The correlation analysis tools are linear regression and lag correlation. The projection tools include autoregressive time-series modeling and generation of many realizations. These tools are demonstrated in four examples that use stream-flow discharge data, groundwater-level records, gridded time series of precipitation data, and the Multivariate ENSO Index.

  11. Reliable Early Classification on Multivariate Time Series with Numerical and Categorical Attributes

    DTIC Science & Technology

    2015-05-22

    design a procedure of feature extraction in REACT named MEG (Mining Equivalence classes with shapelet Generators) based on the concept of...Equivalence Classes Mining [12, 15]. MEG can efficiently and effectively generate the discriminative features. In addition, several strategies are proposed...technique of parallel computing [4] to propose a process of pa- rallel MEG for substantially reducing the computational overhead of discovering shapelet

  12. Creating historical range of variation (HRV) time series using landscape modeling: Overview and issues [Chapter 8

    Treesearch

    Robert E. Keane

    2012-01-01

    Simulation modeling can be a powerful tool for generating information about historical range of variation (HRV) in landscape conditions. In this chapter, I will discuss several aspects of the use of simulation modeling to generate landscape HRV data, including (1) the advantages and disadvantages of using simulation, (2) a brief review of possible landscape models, and...

  13. A laboratory assessment of the measurement accuracy of weighing type rainfall intensity gauges

    NASA Astrophysics Data System (ADS)

    Colli, M.; Chan, P. W.; Lanza, L. G.; La Barbera, P.

    2012-04-01

    In recent years the WMO Commission for Instruments and Methods of Observation (CIMO) fostered noticeable advancements in the accuracy of precipitation measurement issue by providing recommendations on the standardization of equipment and exposure, instrument calibration and data correction as a consequence of various comparative campaigns involving manufacturers and national meteorological services from the participating countries (Lanza et al., 2005; Vuerich et al., 2009). Extreme events analysis is proven to be highly affected by the on-site RI measurement accuracy (see e.g. Molini et al., 2004) and the time resolution of the available RI series certainly constitutes another key-factor in constructing hyetographs that are representative of real rain events. The OTT Pluvio2 weighing gauge (WG) and the GEONOR T-200 vibrating-wire precipitation gauge demonstrated very good performance under previous constant flow rate calibration efforts (Lanza et al., 2005). Although WGs do provide better performance than more traditional Tipping Bucket Rain gauges (TBR) under continuous and constant reference intensity, dynamic effects seem to affect the accuracy of WG measurements under real world/time varying rainfall conditions (Vuerich et al., 2009). The most relevant is due to the response time of the acquisition system and the derived systematic delay of the instrument in assessing the exact weight of the bin containing cumulated precipitation. This delay assumes a relevant role in case high resolution rain intensity time series are sought from the instrument, as is the case of many hydrologic and meteo-climatic applications. This work reports the laboratory evaluation of Pluvio2 and T-200 rainfall intensity measurements accuracy. Tests are carried out by simulating different artificial precipitation events, namely non-stationary rainfall intensity, using a highly accurate dynamic rainfall generator. Time series measured by an Ogawa drop counter (DC) at a field test site located within the Hong Kong International Airport (HKIA) were aggregated at a 1-minute scale and used as reference for the artificial rain generation (Colli et al., 2012). The preliminary development and validation of the rainfall simulator for the generation of variable time steps reference intensities is also shown. The generator is characterized by a sufficiently short time response with respect to the expected weighing gauges behavior in order to ensure effective comparison of the measured/reference intensity at very high resolution in time.

  14. Visplause: Visual Data Quality Assessment of Many Time Series Using Plausibility Checks.

    PubMed

    Arbesser, Clemens; Spechtenhauser, Florian; Muhlbacher, Thomas; Piringer, Harald

    2017-01-01

    Trends like decentralized energy production lead to an exploding number of time series from sensors and other sources that need to be assessed regarding their data quality (DQ). While the identification of DQ problems for such routinely collected data is typically based on existing automated plausibility checks, an efficient inspection and validation of check results for hundreds or thousands of time series is challenging. The main contribution of this paper is the validated design of Visplause, a system to support an efficient inspection of DQ problems for many time series. The key idea of Visplause is to utilize meta-information concerning the semantics of both the time series and the plausibility checks for structuring and summarizing results of DQ checks in a flexible way. Linked views enable users to inspect anomalies in detail and to generate hypotheses about possible causes. The design of Visplause was guided by goals derived from a comprehensive task analysis with domain experts in the energy sector. We reflect on the design process by discussing design decisions at four stages and we identify lessons learned. We also report feedback from domain experts after using Visplause for a period of one month. This feedback suggests significant efficiency gains for DQ assessment, increased confidence in the DQ, and the applicability of Visplause to summarize indicators also outside the context of DQ.

  15. A Seasonal Time-Series Model Based on Gene Expression Programming for Predicting Financial Distress

    PubMed Central

    2018-01-01

    The issue of financial distress prediction plays an important and challenging research topic in the financial field. Currently, there have been many methods for predicting firm bankruptcy and financial crisis, including the artificial intelligence and the traditional statistical methods, and the past studies have shown that the prediction result of the artificial intelligence method is better than the traditional statistical method. Financial statements are quarterly reports; hence, the financial crisis of companies is seasonal time-series data, and the attribute data affecting the financial distress of companies is nonlinear and nonstationary time-series data with fluctuations. Therefore, this study employed the nonlinear attribute selection method to build a nonlinear financial distress prediction model: that is, this paper proposed a novel seasonal time-series gene expression programming model for predicting the financial distress of companies. The proposed model has several advantages including the following: (i) the proposed model is different from the previous models lacking the concept of time series; (ii) the proposed integrated attribute selection method can find the core attributes and reduce high dimensional data; and (iii) the proposed model can generate the rules and mathematical formulas of financial distress for providing references to the investors and decision makers. The result shows that the proposed method is better than the listing classifiers under three criteria; hence, the proposed model has competitive advantages in predicting the financial distress of companies. PMID:29765399

  16. A Seasonal Time-Series Model Based on Gene Expression Programming for Predicting Financial Distress.

    PubMed

    Cheng, Ching-Hsue; Chan, Chia-Pang; Yang, Jun-He

    2018-01-01

    The issue of financial distress prediction plays an important and challenging research topic in the financial field. Currently, there have been many methods for predicting firm bankruptcy and financial crisis, including the artificial intelligence and the traditional statistical methods, and the past studies have shown that the prediction result of the artificial intelligence method is better than the traditional statistical method. Financial statements are quarterly reports; hence, the financial crisis of companies is seasonal time-series data, and the attribute data affecting the financial distress of companies is nonlinear and nonstationary time-series data with fluctuations. Therefore, this study employed the nonlinear attribute selection method to build a nonlinear financial distress prediction model: that is, this paper proposed a novel seasonal time-series gene expression programming model for predicting the financial distress of companies. The proposed model has several advantages including the following: (i) the proposed model is different from the previous models lacking the concept of time series; (ii) the proposed integrated attribute selection method can find the core attributes and reduce high dimensional data; and (iii) the proposed model can generate the rules and mathematical formulas of financial distress for providing references to the investors and decision makers. The result shows that the proposed method is better than the listing classifiers under three criteria; hence, the proposed model has competitive advantages in predicting the financial distress of companies.

  17. Study of spectro-temporal variation in paleo-climatic marine proxy records using wavelet transformations

    NASA Astrophysics Data System (ADS)

    Pandey, Chhavi P.

    2017-10-01

    Wavelet analysis is a powerful mathematical and computational tool to study periodic phenomena in time series, particularly in the presence of potential frequency changes in time. Continuous wavelet transformation (CWT) provides localised spectral information of the analysed dataset and is particularly useful for studying multiscale, nonstationary processes occurring over finite spatial and temporal domains. In the present work, oxygen-isotope ratios from the planktonic foraminifera species (viz. Globigerina bulloides and Globigerinoides ruber), acquired from the broad central plateau of the Maldives ridge situated in the south-eastern Arabian Sea, have been used as climate proxies. CWT of the time series generated using both biofacies indicates spectro-temporal variation of the natural climatic cycles. The dominant period resembles the period of the Milankovitch glacial-interglacial cycle. Apart from that, various other cycles are present in the time series. The results are in good agreement with the astronomical theory of paleoclimates and can provide better visualisation of the Indian summer monsoon in the context of climate change.
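
    A simple sketch of a continuous wavelet transform with a Morlet mother wavelet, computed by direct convolution; it assumes the series is much longer than the wavelet support and uses a crude normalisation, so it illustrates the CWT idea rather than the analysis pipeline used in the study.

      import numpy as np

      def morlet_cwt_power(x, scales, dt=1.0, w0=6.0):
          # Wavelet power for each scale, by correlating the demeaned series
          # with a Morlet wavelet (assumes len(x) exceeds the wavelet support).
          x = np.asarray(x, dtype=float) - np.mean(x)
          power = np.empty((len(scales), len(x)))
          for i, s in enumerate(scales):
              t = np.arange(-4 * s, 4 * s + dt, dt)
              wavelet = np.exp(1j * w0 * t / s) * np.exp(-0.5 * (t / s) ** 2)
              wavelet /= np.sqrt(s)                     # crude normalisation
              coef = np.convolve(x, np.conj(wavelet[::-1]), mode='same')
              power[i] = np.abs(coef) ** 2
          return power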

  18. Ultra-short pulse generator

    DOEpatents

    McEwan, T.E.

    1993-12-28

    An inexpensive pulse generating circuit is disclosed that generates ultra-short, 200 picosecond, and high voltage 100 kW, pulses suitable for wideband radar and other wideband applications. The circuit implements a nonlinear transmission line with series inductors and variable capacitors coupled to ground made from reverse biased diodes to sharpen and increase the amplitude of a high-voltage power MOSFET driver input pulse until it causes non-destructive transit time breakdown in a final avalanche shock wave diode, which increases and sharpens the pulse even more. 5 figures.

  19. Ultra-short pulse generator

    DOEpatents

    McEwan, Thomas E.

    1993-01-01

    An inexpensive pulse generating circuit is disclosed that generates ultra-short, 200 picosecond, and high voltage 100 kW, pulses suitable for wideband radar and other wideband applications. The circuit implements a nonlinear transmission line with series inductors and variable capacitors coupled to ground made from reverse biased diodes to sharpen and increase the amplitude of a high-voltage power MOSFET driver input pulse until it causes non-destructive transit time breakdown in a final avalanche shockwave diode, which increases and sharpens the pulse even more.

  20. Quantifying South East Asia's forest degradation using latest generation optical and radar satellite remote sensing

    NASA Astrophysics Data System (ADS)

    Broich, M.; Tulbure, M. G.; Wijaya, A.; Weisse, M.; Stolle, F.

    2017-12-01

    Deforestation and forest degradation form the 2nd largest source of anthropogenic CO2 emissions. While deforestation is being globally mapped with satellite image time series, degradation remains insufficiently quantified. Previous studies quantified degradation for small scale, local sites. A method suitable for accurate mapping across large areas has not yet been developed due to the variability of the low magnitude and short-lived degradation signal and the absence of data with suitable resolution properties. Here we use a combination of newly available streams of free optical and radar image time series acquired by NASA and ESA, and HPC-based data science algorithms to innovatively quantify degradation consistently across Southeast Asia (SEA). We used Sentinel1 c-band radar data and NASA's new Harmonized Landsat8 (L8) Sentinel2 (S2) product (HLS) for cloud free optical images. Our results show that dense time series of cloud penetrating Sentinel 1 c-band radar can provide degradation alarm flags, while the HLS product of cloud-free optical images can unambiguously confirm degradation alarms. The detectability of degradation differed across SEA. In the seasonal forest of continental SEA the reliability of our radar-based alarm flags increased as the variability in landscape moisture decreases in the dry season. We reliably confirmed alarms with optical image time series during the late dry season, where degradation in open canopy forests becomes detectable once the undergrowth vegetation has died down. Conversely, in insular SEA, where landscape moisture is low, the radar time series generated degradation alarm flags with moderate to high reliability throughout the year, further confirmed with the HLS product. Based on the HLS product we can now confirm degradation within < 6 months on average as opposed to 1 year when using either L8 or S2 alone. In contrast to continental SEA, across insular SEA our degradation maps are not suitable to provide annual maps of total degradation area, but can pinpoint degradation areas on a rolling basis throughout the year. In both continental SEA and insular SEA, the combination of optical and radar time series provides better results than either one on its own. Our results provide significant information with application for carbon trading policy and land management.

  1. Radiance And Irradiance Of The Solar HeII 304 Emission Line

    NASA Astrophysics Data System (ADS)

    McMullin, D. R.; Floyd, L. E.; Auchère, F.

    2013-12-01

    For over 17 years, EIT and the later EUVI instruments aboard SoHO and STEREO, respectively, have provided a time series of radiant images in the HeII 30.4 nm transition region line and three coronal emission lines (FeIX/X, FeXII, and FeXV). While the EIT measurements were gathered from positions approximately on the Earth-Sun axis, EUVI images have been gathered at angles ranging to more than ±90 degrees in solar longitude relative to the Earth-Sun axis. Using a Differential Emission Measure (DEM) model, these measurements provide a basis for estimates of the spectral irradiance for the solar spectrum of wavelengths between 15 and 50 nm at any position in the heliosphere. In particular, we generate the HeII 30.4 nm spectral irradiance in all directions in the heliosphere and examine its time series in selected directions. Such spectra are utilized for two distinct purposes. First, the photoionization rate of neutral He at each position is calculated. Neutral He is of interest because it traverses the heliopause relatively undisturbed and therefore provides a measure of isotopic parameters beyond the heliosphere. Second, we use these to generate a time series of estimates of the solar spectral luminosity in the HeII 30.4 nm emission line extending from the recent past solar cycle 23 minimum into the current weak solar cycle 24, enabling an estimate of its variation over the solar cycle. Because this 30.4 nm spectral luminosity is the sum of such radiation in all directions, its time series is devoid of the 27-day solar rotation periodicity present in indices typically used to represent solar activity.

  2. Recursive Algorithms for Real-Time Digital CR-RCn Pulse Shaping

    NASA Astrophysics Data System (ADS)

    Nakhostin, M.

    2011-10-01

    This paper reports on recursive algorithms for real-time implementation of CR-(RC)n filters in digital nuclear spectroscopy systems. The algorithms are derived by calculating the Z-transfer function of the filters for filter orders up to n = 4. The performances of the filters are compared with the performance of the conventional digital trapezoidal filter using a noise generator which separately generates pure series, 1/f, and parallel noise. The results of our study enable one to select the optimum digital filter for different noise and rate conditions.
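
    The sketch below shows one common recursive (IIR) discretisation of a CR differentiator cascaded with n RC integrators, the shaping family this record studies; the pole mapping d = exp(-dt/tau) is a textbook simplification and not necessarily the exact Z-transfer functions derived in the paper.

      import numpy as np

      def cr_rc_n(x, tau, dt, n=4):
          # CR high-pass stage followed by n RC low-pass stages (CR-(RC)^n).
          d = np.exp(-dt / tau)
          x = np.asarray(x, dtype=float)
          # CR (differentiator): y[k] = d*y[k-1] + x[k] - x[k-1]
          y = np.zeros_like(x)
          for k in range(1, len(x)):
              y[k] = d * y[k - 1] + x[k] - x[k - 1]
          # RC (integrator), applied n times: y[k] = d*y[k-1] + (1-d)*x[k]
          for _ in range(n):
              z = np.zeros_like(y)
              for k in range(1, len(y)):
                  z[k] = d * z[k - 1] + (1 - d) * y[k]
              y = z
          return y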

  3. Optimal trajectory generation for mechanical arms. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Iemenschot, J. A.

    1972-01-01

    A general method of generating optimal trajectories between an initial and a final position of an n degree of freedom manipulator arm with nonlinear equations of motion is proposed. The method is based on the assumption that the time history of each of the coordinates can be expanded in a series of simple time functions. By searching over the coefficients of the terms in the expansion, trajectories which minimize the value of a given cost function can be obtained. The method has been applied to a planar three degree of freedom arm.

  4. Retrieving hydrological connectivity from empirical causality in karst systems

    NASA Astrophysics Data System (ADS)

    Delforge, Damien; Vanclooster, Marnik; Van Camp, Michel; Poulain, Amaël; Watlet, Arnaud; Hallet, Vincent; Kaufmann, Olivier; Francis, Olivier

    2017-04-01

    Because of their complexity, karst systems exhibit nonlinear dynamics. Moreover, if one attempts to model a karst, the hidden behavior complicates the choice of the most suitable model. Therefore, both intense investigation methods and nonlinear data analysis are needed to reveal the underlying hydrological connectivity as a prior for a consistent physically based modelling approach. Convergent Cross Mapping (CCM), a recent method, promises to identify causal relationships between time series belonging to the same dynamical systems. The method is based on phase space reconstruction and is suitable for nonlinear dynamics. As an empirical causation detection method, it could be used to highlight the hidden complexity of a karst system by revealing its inner hydrological and dynamical connectivity. Hence, if one can link causal relationships to physical processes, the method should show great potential to support physically based model structure selection. We present the results of numerical experiments using karst model blocks combined in different structures to generate time series from actual rainfall series. CCM is applied between the time series to investigate if the empirical causation detection is consistent with the hydrological connectivity suggested by the karst model.
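
    A minimal sketch of convergent cross mapping under simplifying assumptions (fixed embedding, no convergence test over library length): the delay embedding of x is used to cross-map y, and a high correlation between cross-mapped and observed y is taken as evidence that y influences x. Parameter names and defaults are illustrative.

      import numpy as np

      def ccm_skill(x, y, dim=3, tau=1):
          x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
          n = len(x) - (dim - 1) * tau
          emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
          target = y[(dim - 1) * tau:]                   # y aligned with the embedding
          preds = np.empty(n)
          for i in range(n):
              d = np.linalg.norm(emb - emb[i], axis=1)
              d[i] = np.inf                              # exclude the point itself
              nb = np.argsort(d)[:dim + 1]               # dim+1 nearest neighbours
              w = np.exp(-d[nb] / max(d[nb][0], 1e-12))  # Sugihara-style weights
              preds[i] = np.dot(w, target[nb]) / w.sum()
          return np.corrcoef(preds, target)[0, 1]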

  5. The application of neural networks to myoelectric signal analysis: a preliminary study.

    PubMed

    Kelly, M F; Parker, P A; Scott, R N

    1990-03-01

    Two neural network implementations are applied to myoelectric signal (MES) analysis tasks. The motivation behind this research is to explore more reliable methods of deriving control for multidegree of freedom arm prostheses. A discrete Hopfield network is used to calculate the time series parameters for a moving average MES model. It is demonstrated that the Hopfield network is capable of generating the same time series parameters as those produced by the conventional sequential least squares (SLS) algorithm. Furthermore, it can be extended to applications utilizing larger amounts of data, and possibly to higher order time series models, without significant degradation in computational efficiency. The second neural network implementation involves using a two-layer perceptron for classifying a single site MES based on two features, specifically the first time series parameter, and the signal power. Using these features, the perceptron is trained to distinguish between four separate arm functions. The two-dimensional decision boundaries used by the perceptron classifier are delineated. It is also demonstrated that the perceptron is able to rapidly compensate for variations when new data are incorporated into the training set. This adaptive quality suggests that perceptrons may provide a useful tool for future MES analysis.

  6. Spectral Unmixing Analysis of Time Series Landsat 8 Images

    NASA Astrophysics Data System (ADS)

    Zhuo, R.; Xu, L.; Peng, J.; Chen, Y.

    2018-05-01

    Temporal analysis of Landsat 8 images opens up new opportunities in the unmixing procedure. Although spectral analysis of time series Landsat imagery has its own advantage, it has rarely been studied. Nevertheless, using the temporal information can provide improved unmixing performance when compared to independent image analyses. Moreover, different land cover types may demonstrate different temporal patterns, which can aid the discrimination of different natures. Therefore, this letter presents time series K-P-Means, a new solution to the problem of unmixing time series Landsat imagery. The proposed approach is to obtain the "purified" pixels in order to achieve optimal unmixing performance. The vertex component analysis (VCA) is used to extract endmembers for endmember initialization. First, nonnegative least square (NNLS) is used to estimate abundance maps by using the endmember. Then, the estimated endmember is the mean value of "purified" pixels, which is the residual of the mixed pixel after excluding the contribution of all nondominant endmembers. Assembling two main steps (abundance estimation and endmember update) into the iterative optimization framework generates the complete algorithm. Experiments using both simulated and real Landsat 8 images show that the proposed "joint unmixing" approach provides more accurate endmember and abundance estimation results compared with "separate unmixing" approach.
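
    A rough sketch of the alternating scheme the abstract describes, assuming endmembers have already been initialised (e.g., by VCA): abundances are estimated with nonnegative least squares, and each endmember is updated as the mean of its "purified" pixels, i.e., the residual after removing the contributions of the other endmembers. This illustrates the idea only and is not the authors' time series K-P-Means implementation.

      import numpy as np
      from scipy.optimize import nnls

      def unmix(Y, E, n_iter=10):
          # Y: (n_pixels, n_bands) mixed spectra; E: (n_end, n_bands) initial endmembers.
          Y = np.asarray(Y, dtype=float)
          E = np.array(E, dtype=float)
          for _ in range(n_iter):
              # abundance estimation under a nonnegativity constraint
              A = np.array([nnls(E.T, y)[0] for y in Y])
              # endmember update from "purified" pixels
              for j in range(E.shape[0]):
                  dominant = A.argmax(axis=1) == j
                  if dominant.any():
                      others = A[dominant] @ E - np.outer(A[dominant, j], E[j])
                      E[j] = (Y[dominant] - others).mean(axis=0)
          return A, E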

  7. Reconstruction of network topology using status-time-series data

    NASA Astrophysics Data System (ADS)

    Pandey, Pradumn Kumar; Badarla, Venkataramana

    2018-01-01

    Uncovering the heterogeneous connection pattern of a networked system from the available status-time-series (STS) data of a dynamical process on the network is of great interest in network science and known as a reverse engineering problem. Dynamical processes on a network are affected by the structure of the network. The dependency between the diffusion dynamics and structure of the network can be utilized to retrieve the connection pattern from the diffusion data. Information of the network structure can help to devise the control of dynamics on the network. In this paper, we consider the problem of network reconstruction from the available status-time-series (STS) data using matrix analysis. The proposed method of network reconstruction from the STS data is tested successfully under susceptible-infected-susceptible (SIS) diffusion dynamics on real-world and computer-generated benchmark networks. High accuracy and efficiency of the proposed reconstruction procedure from the status-time-series data define the novelty of the method. Our proposed method outperforms compressed sensing theory (CST) based method of network reconstruction using STS data. Further, the same procedure of network reconstruction is applied to the weighted networks. The ordering of the edges in the weighted networks is identified with high accuracy.

  8. Investigation of prospects for forecasting non-linear time series by example of drilling oil and gas wells

    NASA Astrophysics Data System (ADS)

    Vlasenko, A. V.; Sizonenko, A. B.; Zhdanov, A. A.

    2018-05-01

    Discrete time series or mappings are proposed for describing the dynamics of a nonlinear system. The article considers the problem of forecasting the dynamics of a system from the time series generated by it. In particular, the commercial rate of drilling oil and gas wells can be considered as a series in which each next value depends on the previous one. The main parameter here is the technical drilling speed. To eliminate measurement error and represent the commercial speed with good accuracy at the current, future, or any past time point, the use of the Kalman filter is suggested. For the transition from a deterministic model to a probabilistic one, the use of ensemble modeling is suggested. Ensemble systems can provide a wide range of visual output, which helps the user evaluate the degree of confidence in the model. In particular, the availability of information on the estimated calendar duration of oil and gas well construction will allow drilling companies to optimize production planning by rationalizing the loading of drilling rigs, which ultimately maximizes profit and increases their competitiveness.
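
    A one-dimensional Kalman filter of the kind suggested above can be written in a few lines. The sketch below assumes a random-walk model for the commercial drilling speed and illustrative noise variances; it is a generic filter, not the calibrated model of the cited study.

        # Minimal 1D Kalman filter for smoothing noisy drilling-speed measurements.
        # The random-walk state model and the noise variances are assumptions.
        import numpy as np

        def kalman_smooth(z, q=0.01, r=1.0):
            """z: measured speeds; q: process noise variance; r: measurement noise variance."""
            x, p = z[0], 1.0          # initial state estimate and its variance
            estimates = []
            for zk in z:
                p = p + q             # predict: state assumed to persist, uncertainty grows
                k = p / (p + r)       # Kalman gain
                x = x + k * (zk - x)  # update with the new measurement
                p = (1 - k) * p
                estimates.append(x)
            return np.array(estimates)

        rng = np.random.default_rng(1)
        true_speed = 20 + np.cumsum(0.05 * rng.standard_normal(100))   # synthetic slow drift
        measured = true_speed + rng.standard_normal(100)                # measurement error
        print(kalman_smooth(measured)[-5:])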

  9. Spatial analysis of precipitation time series over the Upper Indus Basin

    NASA Astrophysics Data System (ADS)

    Latif, Yasir; Yaoming, Ma; Yaseen, Muhammad

    2018-01-01

    The upper Indus basin (UIB) holds one of the most substantial river systems in the world, contributing roughly half of the available surface water in Pakistan. This water provides necessary support for agriculture, domestic consumption, and hydropower generation, all critical for a stable economy in Pakistan. This study identified trends, analyzed variability, and assessed changes in both annual and seasonal precipitation during four time series, identified herein as: (first) 1961-2013, (second) 1971-2013, (third) 1981-2013, and (fourth) 1991-2013, over the UIB. This study investigated the spatial characteristics of the precipitation time series over 15 weather stations and provides strong evidence of changing annual precipitation by determining significant trends at 6 stations (Astore, Chilas, Dir, Drosh, Gupis, and Kakul) out of the 15 studied stations, revealing a significant negative trend during the fourth time series. Our study also showed significantly increased precipitation at Bunji, Chitral, and Skardu, whereas such trends at the rest of the stations appear insignificant. Moreover, our study found that seasonal precipitation decreased at some locations (at a high level of significance), as well as periods of scarce precipitation during all four seasons. The observed decreases in precipitation appear stronger and more significant in autumn, with 10 stations exhibiting decreasing precipitation during the fourth time series, with respect to time and space. Furthermore, the observed decreases in precipitation appear robust and more significant for regions at high elevation (>1300 m). This analysis concludes that decreasing precipitation dominated the UIB, both temporally and spatially, including in the higher areas.

  10. A Landsat Time-Series Stacks Model for Detection of Cropland Change

    NASA Astrophysics Data System (ADS)

    Chen, J.; Chen, J.; Zhang, J.

    2017-09-01

    Global, timely, accurate and cost-effective cropland monitoring with a fine spatial resolution will dramatically improve our understanding of the effects of agriculture on greenhouse gas emissions, food safety, and human health. Time-series remote sensing imagery has shown particular potential for describing land cover dynamics. Traditional change detection techniques are often not capable of detecting land cover changes within time series that are severely influenced by seasonal differences, and are therefore more likely to generate pseudo changes. Here, we introduce and test LTSM (Landsat time-series stacks model), an improvement of the previously proposed Continuous Change Detection and Classification (CCDC) approach, to extract spectral trajectories of land surface change using dense Landsat time-series stacks (LTS). The method is expected to eliminate pseudo changes caused by phenology driven by seasonal patterns. The main idea of the method is that, using all available Landsat 8 images within a year, an LTSM consisting of a two-term harmonic function is estimated iteratively for each pixel in each spectral band. The LTSM delineates change areas by differencing the predicted and observed Landsat images. The LTSM approach was compared with the change vector analysis (CVA) method. The results indicated that the LTSM method correctly detected the "true changes" without overestimating the "false" ones, while CVA identified "true change" pixels together with a large number of "false changes". The detection of change areas achieved an overall accuracy of 92.37%, with a kappa coefficient of 0.676.
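
    The per-pixel harmonic model at the core of LTSM can be illustrated with a short least-squares fit. The Python sketch below fits a two-term harmonic function to one pixel's single-band time series and flags change by differencing the predicted and observed values; the annual period and the 3-sigma threshold are assumptions for illustration, not parameters from the paper.

        # Fit a two-term harmonic model to one pixel's single-band time series and
        # flag acquisitions where the observation departs strongly from the prediction.
        import numpy as np

        def fit_harmonic(t, y, period=365.25):
            """t: acquisition day-of-year values; y: surface reflectance for one band."""
            w = 2 * np.pi / period
            X = np.column_stack([np.ones_like(t),
                                 np.cos(w * t), np.sin(w * t),          # first harmonic term
                                 np.cos(2 * w * t), np.sin(2 * w * t)])  # second harmonic term
            coef, *_ = np.linalg.lstsq(X, y, rcond=None)
            return coef, X @ coef

        rng = np.random.default_rng(2)
        t = np.sort(rng.uniform(0, 365, 40))                    # synthetic acquisition dates
        y = 0.3 + 0.1 * np.sin(2 * np.pi * t / 365.25) + 0.01 * rng.standard_normal(t.size)
        coef, pred = fit_harmonic(t, y)
        resid = y - pred
        change_flag = np.abs(resid) > 3 * resid.std()           # difference predicted vs observed
        print("flagged acquisitions:", np.flatnonzero(change_flag))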

  11. Topological data analysis of financial time series: Landscapes of crashes

    NASA Astrophysics Data System (ADS)

    Gidea, Marian; Katz, Yuri

    2018-02-01

    We explore the evolution of daily returns of four major US stock market indices during the technology crash of 2000 and the financial crisis of 2007-2009. Our methodology is based on topological data analysis (TDA). We use persistent homology to detect and quantify topological patterns that appear in multidimensional time series. Using a sliding window, we extract time-dependent point cloud data sets, to which we associate a topological space. We detect transient loops that appear in this space, and we measure their persistence. This is encoded in real-valued functions referred to as 'persistence landscapes'. We quantify the temporal changes in persistence landscapes via their Lp-norms. We test this procedure on multidimensional time series generated by various non-linear and non-equilibrium models. We find that, in the vicinity of financial meltdowns, the Lp-norms exhibit strong growth prior to the primary peak, which ascends during a crash. Remarkably, the average spectral density at low frequencies of the time series of Lp-norms of the persistence landscapes demonstrates a strong rising trend for 250 trading days prior to both the dotcom crash on 03/10/2000 and the Lehman bankruptcy on 09/15/2008. Our study suggests that TDA provides a new type of econometric analysis, which complements the standard statistical measures. The method can be used to detect early warning signals of imminent market crashes. We believe that this approach can be used beyond the analysis of financial time series presented here.
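
    The landscape and Lp-norm computation can be sketched directly from a persistence diagram. The fragment below assumes the diagram (a list of birth-death pairs) has already been computed from a sliding-window point cloud with a TDA package such as ripser, which is not shown; the grid resolution and the number of landscape levels are arbitrary illustrative choices.

        # Persistence landscape and its Lp-norm computed from a persistence diagram
        # given as a list of (birth, death) pairs.
        import numpy as np

        def landscape(diagram, k, grid):
            """k-th persistence landscape function evaluated on a grid of t values."""
            vals = []
            for t in grid:
                tents = [min(t - b, d - t) for b, d in diagram if b < t < d]
                tents = sorted((v for v in tents if v > 0), reverse=True)
                vals.append(tents[k - 1] if len(tents) >= k else 0.0)
            return np.array(vals)

        def lp_norm(diagram, p=1, k_max=3, n_grid=200):
            lo = min(b for b, _ in diagram)
            hi = max(d for _, d in diagram)
            grid = np.linspace(lo, hi, n_grid)
            step = grid[1] - grid[0]
            total = 0.0
            for k in range(1, k_max + 1):
                lam = landscape(diagram, k, grid)
                total += np.sum(np.abs(lam) ** p) * step   # simple Riemann-sum integration
            return total ** (1.0 / p)

        print(lp_norm([(0.0, 1.0), (0.2, 0.9), (0.5, 0.6)], p=1))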

  12. Performance Evaluation of Machine Learning Methods for Leaf Area Index Retrieval from Time-Series MODIS Reflectance Data

    PubMed Central

    Wang, Tongtong; Xiao, Zhiqiang; Liu, Zhigang

    2017-01-01

    Leaf area index (LAI) is an important biophysical parameter and the retrieval of LAI from remote sensing data is the only feasible method for generating LAI products at regional and global scales. However, most LAI retrieval methods use satellite observations at a specific time to retrieve LAI. Because of the impacts of clouds and aerosols, the LAI products generated by these methods are spatially incomplete and temporally discontinuous, and thus they cannot meet the needs of practical applications. To generate high-quality LAI products, four machine learning algorithms, including back-propagation neural network (BPNN), radial basis function networks (RBFNs), general regression neural networks (GRNNs), and multi-output support vector regression (MSVR), are proposed in this study to retrieve LAI from time-series Moderate Resolution Imaging Spectroradiometer (MODIS) reflectance data, and the performance of these machine learning algorithms is evaluated. The results demonstrated that GRNNs, RBFNs, and MSVR exhibited low sensitivity to training sample size, whereas BPNN had high sensitivity. The four algorithms performed slightly better with red, near infrared (NIR), and short wave infrared (SWIR) bands than with red and NIR bands, and the results were significantly better than those obtained using single band reflectance data (red or NIR). Regardless of band composition, GRNNs performed better than the other three methods. Among the four algorithms, BPNN required the least training time, whereas MSVR needed the most for any sample size. PMID:28045443

  13. Performance Evaluation of Machine Learning Methods for Leaf Area Index Retrieval from Time-Series MODIS Reflectance Data.

    PubMed

    Wang, Tongtong; Xiao, Zhiqiang; Liu, Zhigang

    2017-01-01

    Leaf area index (LAI) is an important biophysical parameter and the retrieval of LAI from remote sensing data is the only feasible method for generating LAI products at regional and global scales. However, most LAI retrieval methods use satellite observations at a specific time to retrieve LAI. Because of the impacts of clouds and aerosols, the LAI products generated by these methods are spatially incomplete and temporally discontinuous, and thus they cannot meet the needs of practical applications. To generate high-quality LAI products, four machine learning algorithms, including back-propagation neural network (BPNN), radial basis function networks (RBFNs), general regression neural networks (GRNNs), and multi-output support vector regression (MSVR), are proposed in this study to retrieve LAI from time-series Moderate Resolution Imaging Spectroradiometer (MODIS) reflectance data, and the performance of these machine learning algorithms is evaluated. The results demonstrated that GRNNs, RBFNs, and MSVR exhibited low sensitivity to training sample size, whereas BPNN had high sensitivity. The four algorithms performed slightly better with red, near infrared (NIR), and short wave infrared (SWIR) bands than with red and NIR bands, and the results were significantly better than those obtained using single band reflectance data (red or NIR). Regardless of band composition, GRNNs performed better than the other three methods. Among the four algorithms, BPNN required the least training time, whereas MSVR needed the most for any sample size.
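
    Of the four algorithms compared above, the GRNN is the simplest to sketch, since it can be written as Gaussian-kernel-weighted regression over the training samples. The Python fragment below is such a sketch; the smoothing parameter and the synthetic reflectance/target pairs are placeholders, not values or data from the study.

        # Minimal GRNN written as Gaussian-kernel-weighted regression over training samples.
        import numpy as np

        def grnn_predict(X_train, y_train, X_query, sigma=0.2):
            d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)  # squared distances
            w = np.exp(-d2 / (2 * sigma ** 2))                               # pattern-layer weights
            return (w @ y_train) / w.sum(axis=1)                             # weighted-average output

        rng = np.random.default_rng(3)
        X_train = rng.uniform(0, 1, (200, 3))            # e.g. red, NIR, SWIR reflectances
        y_train = 6 * X_train[:, 1] - 3 * X_train[:, 0] + 0.1 * rng.standard_normal(200)
        X_query = rng.uniform(0, 1, (5, 3))
        print(grnn_predict(X_train, y_train, X_query))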

  14. A Four-Stage Hybrid Model for Hydrological Time Series Forecasting

    PubMed Central

    Di, Chongli; Yang, Xiaohua; Wang, Xiaochao

    2014-01-01

    Hydrological time series forecasting remains a difficult task due to its complicated nonlinear, non-stationary and multi-scale characteristics. To solve this difficulty and improve the prediction accuracy, a novel four-stage hybrid model is proposed for hydrological time series forecasting based on the principle of ‘denoising, decomposition and ensemble’. The proposed model has four stages, i.e., denoising, decomposition, components prediction and ensemble. In the denoising stage, the empirical mode decomposition (EMD) method is utilized to reduce the noise in the hydrological time series. Then, an improved method of EMD, the ensemble empirical mode decomposition (EEMD), is applied to decompose the denoised series into a number of intrinsic mode function (IMF) components and one residual component. Next, the radial basis function neural network (RBFNN) is adopted to predict the trend of all of the components obtained in the decomposition stage. In the final ensemble prediction stage, the forecasting results of all of the IMF and residual components obtained in the third stage are combined to generate the final prediction results, using a linear neural network (LNN) model. For illustration and verification, six hydrological cases with different characteristics are used to test the effectiveness of the proposed model. The proposed hybrid model performs better than conventional single models, the hybrid models without denoising or decomposition and the hybrid models based on other methods, such as the wavelet analysis (WA)-based hybrid models. In addition, the denoising and decomposition strategies decrease the complexity of the series and reduce the difficulty of the forecasting. With its effective denoising and accurate decomposition ability, high prediction precision and wide applicability, the new model is very promising for complex time series forecasting. This new forecast model is an extension of nonlinear prediction models. PMID:25111782

  15. A four-stage hybrid model for hydrological time series forecasting.

    PubMed

    Di, Chongli; Yang, Xiaohua; Wang, Xiaochao

    2014-01-01

    Hydrological time series forecasting remains a difficult task due to its complicated nonlinear, non-stationary and multi-scale characteristics. To solve this difficulty and improve the prediction accuracy, a novel four-stage hybrid model is proposed for hydrological time series forecasting based on the principle of 'denoising, decomposition and ensemble'. The proposed model has four stages, i.e., denoising, decomposition, components prediction and ensemble. In the denoising stage, the empirical mode decomposition (EMD) method is utilized to reduce the noise in the hydrological time series. Then, an improved method of EMD, the ensemble empirical mode decomposition (EEMD), is applied to decompose the denoised series into a number of intrinsic mode function (IMF) components and one residual component. Next, the radial basis function neural network (RBFNN) is adopted to predict the trend of all of the components obtained in the decomposition stage. In the final ensemble prediction stage, the forecasting results of all of the IMF and residual components obtained in the third stage are combined to generate the final prediction results, using a linear neural network (LNN) model. For illustration and verification, six hydrological cases with different characteristics are used to test the effectiveness of the proposed model. The proposed hybrid model performs better than conventional single models, the hybrid models without denoising or decomposition and the hybrid models based on other methods, such as the wavelet analysis (WA)-based hybrid models. In addition, the denoising and decomposition strategies decrease the complexity of the series and reduce the difficulty of the forecasting. With its effective denoising and accurate decomposition ability, high prediction precision and wide applicability, the new model is very promising for complex time series forecasting. This new forecast model is an extension of nonlinear prediction models.
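
    The four-stage pipeline can be illustrated structurally even without a full EMD/EEMD implementation. In the runnable Python sketch below, moving-average components stand in for the EEMD decomposition, kernel ridge regression with an RBF kernel stands in for the RBFNN, and ordinary linear regression stands in for the LNN combiner; it shows only the 'denoise, decompose, predict, ensemble' flow, not the cited model.

        # Structural stand-in for the four-stage hybrid: denoise, decompose,
        # predict each component, then combine the component forecasts.
        import numpy as np
        from sklearn.kernel_ridge import KernelRidge
        from sklearn.linear_model import LinearRegression

        def lagged(x, n_lags=4):
            X = np.column_stack([x[i:len(x) - n_lags + i] for i in range(n_lags)])
            return X, x[n_lags:]

        rng = np.random.default_rng(4)
        t = np.arange(400)
        series = np.sin(2 * np.pi * t / 50) + 0.3 * np.sin(2 * np.pi * t / 11) + 0.2 * rng.standard_normal(400)

        smooth = np.convolve(series, np.ones(3) / 3, mode="same")     # stage 1: denoise (stand-in)
        slow = np.convolve(smooth, np.ones(25) / 25, mode="same")     # stage 2: decompose (stand-in)
        components = [slow, smooth - slow]

        preds = []
        for comp in components:                                       # stage 3: per-component prediction
            X, y = lagged(comp)
            model = KernelRidge(kernel="rbf", alpha=1e-2).fit(X[:-1], y[:-1])
            preds.append(model.predict(X))

        P = np.column_stack(preds)                                    # stage 4: ensemble of components
        _, target = lagged(smooth)
        combiner = LinearRegression().fit(P[:-1], target[:-1])
        print("one-step-ahead forecast:", combiner.predict(P[-1:]))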

  16. Modeling and Simulation of the Economics of Mining in the Bitcoin Market.

    PubMed

    Cocco, Luisanna; Marchesi, Michele

    2016-01-01

    On January 3, 2009, Satoshi Nakamoto gave rise to the "Bitcoin Blockchain", creating the first block of the chain by hashing on his computer's central processing unit (CPU). Since then, the hash calculations to mine Bitcoin have been getting more and more complex, and consequently the mining hardware has evolved to adapt to this increasing difficulty. Three generations of mining hardware have followed the CPU generation: the GPU, FPGA and ASIC generations. This work presents an agent-based artificial market model of the Bitcoin mining process and of Bitcoin transactions. The goal of this work is to model the economy of the mining process, starting from the GPU generation, the first with economic significance. The model reproduces some "stylized facts" found in real price series and some core aspects of the mining business. In particular, the computational experiments performed can reproduce the unit root property, the fat tail phenomenon and the volatility clustering of Bitcoin price series. In addition, under proper assumptions, they can reproduce the generation of Bitcoins, the hashing capability, the power consumption, and the mining hardware and electrical energy expenditures of the Bitcoin network.

  17. Generation of deformation time series from SAR data sequences in areas affected by large dynamics: insights from Sierra Negra caldera, Galápagos Islands

    NASA Astrophysics Data System (ADS)

    Casu, Francesco; Manconi, Andrea; Pepe, Antonio; Lanari, Riccardo

    2010-05-01

    Differential Synthetic Aperture Radar Interferometry (DInSAR) is a remote sensing technique that allows the production of spatially dense deformation maps of the Earth's surface with centimeter accuracy. To this end, the phase difference of SAR image pairs acquired before and after a deformation episode is properly exploited. This technique, originally applied to investigate single deformation events, has been further extended to analyze the temporal evolution of the deformation field through the generation of displacement time-series. A well-established approach is represented by the Small BAseline Subset (SBAS) technique (Berardino et al., 2002), whose capability to analyze deformation events at low and full spatial resolution has been largely demonstrated. However, in areas where large and/or rapid deformation phenomena occur, the exploitation of the differential interferograms, and thus also of the displacement time-series, can be strongly limited by the presence of significant misregistration errors and/or very high fringe rates, making the phase unwrapping step unfeasible. In this work, we propose advances in the generation of deformation time-series in areas affected by large deformation dynamics. We present an extension of amplitude-based Pixel-Offset analyses obtained by applying the SBAS strategy, in order to move from the investigation of single (large) deformation events to that of dynamic phenomena. The above-mentioned method has been tested on an ENVISAT SAR data archive (Track 61, Frames 7173-7191) over the Galapagos Islands, focusing on Sierra Negra caldera, an active volcanic area often characterized by large and rapid deformation events leading to severe image misregistration effects (Yun et al., 2007). Moreover, we present a cross-validation of the retrieved deformation estimates, comparing our results to continuous GPS measurements and to synthetic deformation obtained by independently modeling the interferometric phase information when available. References: P. Berardino et al., (2002), A new algorithm for Surface Deformation Monitoring based on Small Baseline Differential SAR Interferograms, IEEE Transactions on Geoscience and Remote Sensing, vol. 40, 11, pp. 2375-2383. S-H. Yun et al., (2007), Interferogram formation in the presence of complex and large deformation, Geophys. Res. Lett., vol. 34, L12305.

  18. GRRATS: A New Approach to Inland Altimetry Processing for Major World Rivers

    NASA Astrophysics Data System (ADS)

    Coss, S. P.

    2016-12-01

    Here we present work-in-progress results aimed at generating a new radar altimetry dataset, GRRATS (Global River Radar Altimetry Time Series), extracted over global ocean-draining rivers wider than 900 m. GRRATS was developed as a component of the NASA MEaSUREs project (PI: Dennis Lettenmaier, UCLA) to generate pre-SWOT data products for decadal or longer global river elevation changes from multi-mission satellite radar altimetry data. The dataset at present includes 909 time series from 39 rivers. A new method of filtering VS (virtual station) height time series is presented, in which DEM-based heights are used to establish limits for the ice1-retracked Jason-2 and Envisat heights. While GRRATS follows in the footsteps of several predecessors, it contributes to one of the critical climate data records by generating a validated and comprehensive hydrologic record of river height. The current data product includes VSs in North and South America, Africa and Eurasia, with the most comprehensive set of Jason-2 and Envisat RA time series available for North America and Eurasia. We present a semi-automated procedure to process returns from river locations, identified with Landsat images and an updated water mask extent. Consistent methodologies for flagging ice cover are presented. DEM heights used in height filtering were retained and can be used as river height profiles. All non-validated VSs have been assigned a letter grade A-D to aid end users in data selection. Validated VSs are accompanied by a suite of fit statistics. Due to the inclusiveness of the dataset, not all VSs could undergo validation (415 of 909), but those that did demonstrate that confidence in the data product is warranted. Validation was accomplished using records from 45 in situ gauges on 12 rivers. Meta-analysis was performed to compare each gauge with each VS by relative height. Preliminary validation results are as follows: 89.3% of the data have positive Nash-Sutcliffe Efficiency (NSE) values, and the median NSE value is 0.73. The median standard deviation of error (STDE) is 0.92 m. GRRATS will soon be publicly available in NetCDF format with CF-compliant metadata.

  19. On the design of Henon and logistic map-based random number generator

    NASA Astrophysics Data System (ADS)

    Magfirawaty; Suryadi, M. T.; Ramli, Kalamullah

    2017-10-01

    The key sequence is one of the main elements in a cryptosystem. The True Random Number Generator (TRNG) method is one approach to generating the key sequence. The randomness sources of TRNGs are divided into three main groups, i.e. electrical noise based, jitter based and chaos based. The chaos-based approach utilizes a non-linear dynamic system (continuous time or discrete time) as an entropy source. In this study, a new design of TRNG based on a discrete-time chaotic system is proposed, which is then simulated in LabVIEW. The principle of the design consists of combining 2D and 1D chaotic systems. A mathematical model is implemented for numerical simulations. We used a comparator process as a harvester method to obtain the series of random bits. Without any post-processing, the proposed design generated random bit sequences with high entropy values and passed all NIST 800.22 statistical tests.
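
    The combination of a 2D and a 1D chaotic map with a comparator harvester can be sketched in a few lines. The fragment below iterates a Henon map and a logistic map and emits one bit per step by comparing their (rescaled) states; the parameter values and the comparison rule are common textbook choices assumed for illustration, and no NIST post-testing is included.

        # Random-bit generator combining a 2D chaotic system (Henon map) with a 1D one
        # (logistic map), harvesting bits with a simple comparator.
        def chaotic_bits(n, x=0.1, y=0.3, z=0.7, a=1.4, b=0.3, r=3.99):
            bits = []
            for _ in range(n):
                x, y = 1.0 - a * x * x + y, b * x     # Henon map (2D)
                z = r * z * (1.0 - z)                 # logistic map (1D)
                # comparator: compare the rescaled Henon state with the logistic state
                bits.append(1 if (x + 1.5) / 3.0 > z else 0)
            return bits

        stream = chaotic_bits(1000)
        print(sum(stream) / len(stream))              # rough balance check only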

  20. MTpy: A Python toolbox for magnetotellurics

    USGS Publications Warehouse

    Krieger, Lars; Peacock, Jared R.

    2014-01-01

    In this paper, we introduce the structure and concept of MTpy. Additionally, we show some examples from an everyday work-flow of MT data processing: the generation of standard EDI data files from raw electric (E-) and magnetic flux density (B-) field time series as input, the conversion into MiniSEED data format, as well as the generation of a graphical data representation in the form of a Phase Tensor pseudosection.

  1. Harmonic oscillators and resonance series generated by a periodic unstable classical orbit

    NASA Technical Reports Server (NTRS)

    Kazansky, A. K.; Ostrovsky, Valentin N.

    1995-01-01

    The presence of an unstable periodic classical orbit allows one to introduce the decay time as a purely classical quantity: the inverse of the Lyapunov index which characterizes the orbit instability. The Uncertainty Relation gives the corresponding resonance width, which is proportional to the Planck constant. A more elaborate analysis is based on the parabolic equation method, where the problem is effectively reduced to a multidimensional harmonic oscillator with time-dependent frequency. The resonances form series in the complex energy plane which are equidistant in the direction perpendicular to the real axis. Applications of the general approach to various problems in atomic physics are briefly discussed.

  2. Characterization of autoregressive processes using entropic quantifiers

    NASA Astrophysics Data System (ADS)

    Traversaro, Francisco; Redelico, Francisco O.

    2018-01-01

    The aim of this contribution is to introduce a novel information plane, the causal-amplitude informational plane. As previous works seem to indicate, the Bandt and Pompe methodology for estimating entropy does not allow one to distinguish between probability distributions, which could be fundamental for simulation or for probability analysis purposes. Once a time series is identified as stochastic by the causal complexity-entropy informational plane, the novel causal-amplitude plane gives a deeper understanding of the time series, quantifying both the autocorrelation strength and the probability distribution of the data extracted from the generating processes. Two examples are presented, one from a climate change model and the other from financial markets.
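
    The Bandt and Pompe methodology referred to above reduces a series to the relative frequencies of its ordinal patterns. The sketch below computes the normalized permutation entropy for an embedding dimension of 4 and unit delay, both arbitrary illustrative choices.

        # Bandt-Pompe permutation entropy: count ordinal patterns of dimension D along
        # the series and return their normalized Shannon entropy.
        import numpy as np
        from collections import Counter
        from math import log, factorial

        def permutation_entropy(x, D=4, tau=1):
            patterns = Counter()
            for i in range(len(x) - (D - 1) * tau):
                window = x[i:i + D * tau:tau]
                patterns[tuple(np.argsort(window))] += 1
            total = sum(patterns.values())
            H = -sum((c / total) * log(c / total) for c in patterns.values())
            return H / log(factorial(D))               # normalized to [0, 1]

        rng = np.random.default_rng(5)
        print(permutation_entropy(rng.standard_normal(5000)))   # close to 1 for white noise
        print(permutation_entropy(np.sin(np.arange(5000) / 5))) # much lower for a regular signal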

  3. Environmental flow allocation and statistics calculator

    USGS Publications Warehouse

    Konrad, Christopher P.

    2011-01-01

    The Environmental Flow Allocation and Statistics Calculator (EFASC) is a computer program that calculates hydrologic statistics based on a time series of daily streamflow values. EFASC will calculate statistics for daily streamflow in an input file or will generate synthetic daily flow series from an input file based on rules for allocating and protecting streamflow and then calculate statistics for the synthetic time series. The program reads dates and daily streamflow values from input files. The program writes statistics out to a series of worksheets and text files. Multiple sites can be processed in series as one run. EFASC is written in Microsoft Visual Basic for Applications and implemented as a macro in Microsoft Office Excel 2007. EFASC is intended as a research tool for users familiar with computer programming. The code for EFASC is provided so that it can be modified for specific applications. All users should review how output statistics are calculated and recognize that the algorithms may not comply with conventions used to calculate streamflow statistics published by the U.S. Geological Survey.

  4. Exploratory Causal Analysis in Bivariate Time Series Data

    NASA Astrophysics Data System (ADS)

    McCracken, James M.

    Many scientific disciplines rely on observational data of systems for which it is difficult (or impossible) to implement controlled experiments, and data analysis techniques are required for identifying causal information and relationships directly from observational data. This need has led to the development of many different time series causality approaches and tools including transfer entropy, convergent cross-mapping (CCM), and Granger causality statistics. In this thesis, the existing time series causality method of CCM is extended by introducing a new method called pairwise asymmetric inference (PAI). It is found that CCM may provide counter-intuitive causal inferences for simple dynamics with strong intuitive notions of causality, and the CCM causal inference can be a function of physical parameters that are seemingly unrelated to the existence of a driving relationship in the system. For example, a CCM causal inference might alternate between ''voltage drives current'' and ''current drives voltage'' as the frequency of the voltage signal is changed in a series circuit with a single resistor and inductor. PAI is introduced to address both of these limitations. Many of the current approaches in the time series causality literature are not computationally straightforward to apply, do not follow directly from assumptions of probabilistic causality, depend on assumed models for the time series generating process, or rely on embedding procedures. A new approach, called causal leaning, is introduced in this work to avoid these issues. The leaning is found to provide causal inferences that agree with intuition for both simple systems and more complicated empirical examples, including space weather data sets. The leaning may provide a clearer interpretation of the results than those from existing time series causality tools. A practicing analyst can explore the literature to find many proposals for identifying drivers and causal connections in time series data sets, but little research exists on how these tools compare to each other in practice. This work introduces and defines exploratory causal analysis (ECA) to address this issue along with the concept of data causality in the taxonomy of causal studies introduced in this work. The motivation is to provide a framework for exploring potential causal structures in time series data sets. ECA is used on several synthetic and empirical data sets, and it is found that all of the tested time series causality tools agree with each other (and intuitive notions of causality) for many simple systems but can provide conflicting causal inferences for more complicated systems. It is proposed that such disagreements between different time series causality tools during ECA might provide deeper insight into the data than could be found otherwise.

  5. Design factors and considerations for a time-based flight management system

    NASA Technical Reports Server (NTRS)

    Vicroy, D. D.; Williams, D. H.; Sorensen, J. A.

    1986-01-01

    Recent NASA Langley Research Center research to develop a technology data base from which an advanced Flight Management System (FMS) design might evolve is reviewed. In particular, the generation of fixed range cruise/descent reference trajectories which meet predefined end conditions of altitude, speed, and time is addressed. Results on the design and theoretical basis of the trajectory generation algorithm are presented, followed by a brief discussion of a series of studies that are being conducted to determine the accuracy requirements of the aircraft and weather models resident in the trajectory generation algorithm. Finally, studies to investigate the interface requirements between the pilot and an advanced FMS are considered.

  6. [Design and implementation of online statistical analysis function in information system of air pollution and health impact monitoring].

    PubMed

    Lü, Yiran; Hao, Shuxin; Zhang, Guoqing; Liu, Jie; Liu, Yue; Xu, Dongqun

    2018-01-01

    To implement an online statistical analysis function in the information system of air pollution and health impact monitoring, and to obtain data analysis information in real time. Descriptive statistics, time-series analysis and multivariate regression analysis were implemented online on top of the database software using SQL and visualization tools. The system generates basic statistical tables and summary tables of air pollution exposure and health impact data online; generates trend charts for each data component online with interactive connection to the database; and generates export sheets that can be loaded directly into R, SAS and SPSS. The information system for air pollution and health impact monitoring implements the statistical analysis function online, which can provide real-time analysis results to its users.

  7. Application of data cubes for improving detection of water cycle extreme events

    NASA Astrophysics Data System (ADS)

    Teng, W. L.; Albayrak, A.

    2015-12-01

    As part of an ongoing NASA-funded project to remove a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series), for the hydrology and other point-time series-oriented communities, "data cubes" are created from which time series files (aka "data rods") are generated on-the-fly and made available as Web services from the Goddard Earth Sciences Data and Information Services Center (GES DISC). Data cubes are data as archived rearranged into spatio-temporal matrices, which allow for easy access to the data, both spatially and temporally. A data cube is a specific case of the general optimal strategy of reorganizing data to match the desired means of access. The gain from such reorganization is greater the larger the data set. As a use case for our project, we are leveraging existing software to explore the application of the data cubes concept to machine learning, for the purpose of detecting water cycle extreme (WCE) events, a specific case of anomaly detection, requiring time series data. We investigate the use of the sequential probability ratio test (SPRT) for anomaly detection and support vector machines (SVM) for anomaly classification. We show an example of detection of WCE events, using the Global Land Data Assimilation Systems (GLDAS) data set.
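
    The SPRT component mentioned above can be sketched for the simple case of testing a 'normal' Gaussian mean against a shifted 'extreme' mean on a streaming series. The means, variance and error rates in the fragment below are illustrative assumptions, not values from the project.

        # Sequential probability ratio test (SPRT) on a streaming series: accumulate the
        # Gaussian log-likelihood ratio until it crosses an acceptance threshold.
        import numpy as np

        def sprt(stream, mu0=0.0, mu1=2.0, sigma=1.0, alpha=0.05, beta=0.05):
            upper = np.log((1 - beta) / alpha)          # accept H1 (anomaly) above this
            lower = np.log(beta / (1 - alpha))          # accept H0 (normal) below this
            llr = 0.0
            for i, x in enumerate(stream):
                llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2.0) / sigma ** 2   # per-sample LLR
                if llr >= upper:
                    return "anomaly", i
                if llr <= lower:
                    return "normal", i
            return "undecided", len(stream)

        rng = np.random.default_rng(6)
        print(sprt(rng.normal(0.0, 1.0, 200)))          # typically decides 'normal'
        print(sprt(rng.normal(2.5, 1.0, 200)))          # typically decides 'anomaly'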

  8. The Fourier decomposition method for nonlinear and non-stationary time series analysis.

    PubMed

    Singh, Pushpendra; Joshi, Shiv Dutt; Patney, Rakesh Kumar; Saha, Kaushik

    2017-03-01

    For many decades, there has been a general perception in the literature that Fourier methods are not suitable for the analysis of nonlinear and non-stationary data. In this paper, we propose a novel and adaptive Fourier decomposition method (FDM), based on Fourier theory, and demonstrate its efficacy for the analysis of nonlinear and non-stationary time series. The proposed FDM decomposes any data into a small number of 'Fourier intrinsic band functions' (FIBFs). The FDM presents a generalized Fourier expansion with variable amplitudes and variable frequencies of a time series by the Fourier method itself. We propose a zero-phase filter bank-based multivariate FDM (MFDM) for the analysis of multivariate nonlinear and non-stationary time series, using the FDM. We also present an algorithm to obtain cut-off frequencies for the MFDM. The proposed MFDM generates a finite number of band-limited multivariate FIBFs (MFIBFs). The MFDM preserves some intrinsic physical properties of the multivariate data, such as scale alignment, trend and instantaneous frequency. The proposed methods provide a time-frequency-energy (TFE) distribution that reveals the intrinsic structure of the data. Numerical computations and simulations have been carried out and comparison is made with empirical mode decomposition algorithms.

  9. Application of Data Cubes for Improving Detection of Water Cycle Extreme Events

    NASA Technical Reports Server (NTRS)

    Albayrak, Arif; Teng, William

    2015-01-01

    As part of an ongoing NASA-funded project to remove a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series), for the hydrology and other point-time series-oriented communities, "data cubes" are created from which time series files (aka "data rods") are generated on-the-fly and made available as Web services from the Goddard Earth Sciences Data and Information Services Center (GES DISC). Data cubes are data as archived rearranged into spatio-temporal matrices, which allow for easy access to the data, both spatially and temporally. A data cube is a specific case of the general optimal strategy of reorganizing data to match the desired means of access. The gain from such reorganization is greater the larger the data set. As a use case of our project, we are leveraging existing software to explore the application of the data cubes concept to machine learning, for the purpose of detecting water cycle extreme events, a specific case of anomaly detection, requiring time series data. We investigate the use of support vector machines (SVM) for anomaly classification. We show an example of detection of water cycle extreme events, using data from the Tropical Rainfall Measuring Mission (TRMM).

  10. A hybrid wavelet de-noising and Rank-Set Pair Analysis approach for forecasting hydro-meteorological time series.

    PubMed

    Wang, Dong; Borthwick, Alistair G; He, Handan; Wang, Yuankun; Zhu, Jieyu; Lu, Yuan; Xu, Pengcheng; Zeng, Xiankui; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Liu, Jiufu; Zou, Ying; He, Ruimin

    2018-01-01

    Accurate, fast forecasting of hydro-meteorological time series is presently a major challenge in drought and flood mitigation. This paper proposes a hybrid approach, wavelet de-noising (WD) and Rank-Set Pair Analysis (RSPA), that takes full advantage of a combination of the two approaches to improve forecasts of hydro-meteorological time series. WD allows decomposition and reconstruction of a time series by the wavelet transform, and hence separation of the noise from the original series. RSPA, a more reliable and efficient version of Set Pair Analysis, is integrated with WD to form the hybrid WD-RSPA approach. Two types of hydro-meteorological data sets with different characteristics and different levels of human influence at some representative stations are used to illustrate the WD-RSPA approach. The approach is also compared to three other generic methods: the conventional Auto Regressive Integrated Moving Average (ARIMA) method, Artificial Neural Networks (ANNs) (BP-error Back Propagation, MLP-Multilayer Perceptron and RBF-Radial Basis Function), and RSPA alone. Nine error metrics are used to evaluate the model performance. Compared to the three other generic methods, the results generated by the WD-RSPA model invariably presented smaller error measures, which means the forecasting capability of the WD-RSPA model is better than that of the other models. The results show that WD-RSPA is accurate, feasible, and effective. In particular, WD-RSPA is found to be the best among the various generic methods compared in this paper, even when extreme events are included within a time series. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. Generation of an allelic series of knock-in mice using recombinase-mediated cassette exchange (RMCE).

    PubMed

    Roebroek, Anton J M; Van Gool, Bart

    2014-01-01

    Molecular genetic strategies applying embryonic stem cell (ES cell) technologies to study the function of a gene in mice or to generate a mouse model for a human disease are continuously under development. In addition to (conditional) inactivation of genes, the application and importance of approaches to generate knock-in mutations are increasing. In this chapter, the principle and application of recombinase-mediated cassette exchange (RMCE) are discussed as a new emerging knock-in strategy, which enables easy generation of a series of different knock-in mutations within one gene. An RMCE protocol, which was used to generate a series of different knock-in mutations in the Lrp1 gene of ES cells, is described in detail as an example of how RMCE can be used to generate, with high efficiency, an allelic series of differently modified ES cell clones from a parental modified ES cell clone. Subsequently, the differently modified ES cell clones can be used to generate an allelic series of mutant knock-in mice.

  12. Principal components and iterative regression analysis of geophysical series: Application to Sunspot number (1750-2004)

    NASA Astrophysics Data System (ADS)

    Nordemann, D. J. R.; Rigozo, N. R.; de Souza Echer, M. P.; Echer, E.

    2008-11-01

    We present here an implementation of a least squares iterative regression method applied to the sine functions embedded in the principal components extracted from geophysical time series. This method seems to represent a useful improvement for the quantitative analysis of periodicities in non-stationary time series. The principal components determination followed by the least squares iterative regression method was implemented in an algorithm written in the Scilab (2006) language. The main result of the method is to obtain the set of sine functions embedded in the series analyzed in decreasing order of significance, from the most important ones, likely to represent the physical processes involved in the generation of the series, to the less important ones that represent noise components. Taking into account the need for deeper knowledge of the Sun's past history and its implications for global climate change, the method was applied to the Sunspot Number series (1750-2004). With the threshold and parameter values used here, the application of the method leads to a total of 441 explicit sine functions, among which 65 were considered significant and were used for a reconstruction that gave a normalized mean squared error of 0.146.
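
    The iterative least squares idea described above can be sketched by repeatedly fitting and subtracting a single sine function. The fragment below does this in Python with SciPy's curve_fit over a few trial periods; the multi-start strategy, component count and synthetic 11-year test signal are illustrative simplifications, not the Scilab implementation described in the abstract.

        # Iteratively fit a single sine to the residual series by least squares and
        # subtract it, collecting components in order of decreasing importance.
        import numpy as np
        from scipy.optimize import curve_fit

        def sine(t, amp, period, phase):
            return amp * np.sin(2 * np.pi * t / period + phase)

        def iterative_sine_fit(t, y, n_components=3, trial_periods=(8, 11, 22, 50, 100)):
            residual = y - y.mean()
            components = []
            for _ in range(n_components):
                best = None
                for p0 in trial_periods:                      # crude multi-start over trial periods
                    try:
                        popt, _ = curve_fit(sine, t, residual, p0=[residual.std(), p0, 0.0], maxfev=5000)
                    except RuntimeError:
                        continue
                    err = np.mean((residual - sine(t, *popt)) ** 2)
                    if best is None or err < best[1]:
                        best = (popt, err)
                components.append(best[0])
                residual = residual - sine(t, *best[0])
            return components

        t = np.arange(255.0)                                  # e.g. yearly values, 1750-2004
        y = 50 + 40 * np.sin(2 * np.pi * t / 11.0) + 5 * np.random.default_rng(7).standard_normal(255)
        for amp, period, phase in iterative_sine_fit(t, y):
            print(f"amplitude {amp:6.1f}  period {period:6.1f}")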

  13. ControlShell - A real-time software framework

    NASA Technical Reports Server (NTRS)

    Schneider, Stanley A.; Ullman, Marc A.; Chen, Vincent W.

    1991-01-01

    ControlShell is designed to enable modular design and implementation of real-time software. It is an object-oriented tool-set for real-time software system programming. It provides a series of execution and data interchange mechanisms that form a framework for building real-time applications. These mechanisms allow a component-based approach to real-time software generation and management. By defining a set of interface specifications for intermodule interaction, ControlShell provides a common platform that is the basis for real-time code development and exchange.

  14. Multiscale analysis of the intensity fluctuation in a time series of dynamic speckle patterns.

    PubMed

    Federico, Alejandro; Kaufmann, Guillermo H

    2007-04-10

    We propose the application of a method based on the discrete wavelet transform to detect, identify, and measure scaling behavior in dynamic speckle. The multiscale phenomena presented by a sample and displayed by its speckle activity are analyzed by processing the time series of dynamic speckle patterns. The scaling analysis is applied to the temporal fluctuation of the speckle intensity and also to the two derived data sets generated by its magnitude and sign. The application of the method is illustrated by analyzing paint-drying processes and bruising in apples. The results are discussed taking into account the different time organizations obtained for the scaling behavior of the magnitude and the sign of the intensity fluctuation.

  15. A comparative analysis of spectral exponent estimation techniques for 1/fβ processes with applications to the analysis of stride interval time series

    PubMed Central

    Schaefer, Alexander; Brach, Jennifer S.; Perera, Subashan; Sejdić, Ervin

    2013-01-01

    Background: The time evolution and complex interactions of many nonlinear systems, such as in the human body, result in fractal types of parameter outcomes that exhibit self similarity over long time scales by a power law in the frequency spectrum S(f) = 1/fβ. The scaling exponent β is thus often interpreted as a “biomarker” of relative health and decline. New Method: This paper presents a thorough comparative numerical analysis of fractal characterization techniques with specific consideration given to experimentally measured gait stride interval time series. The ideal fractal signals generated in the numerical analysis are constrained under varying lengths and biases indicative of a range of physiologically conceivable fractal signals. This analysis is to complement previous investigations of fractal characteristics in healthy and pathological gait stride interval time series, with which this study is compared. Results: The results of our analysis showed that the averaged wavelet coefficient method consistently yielded the most accurate results. Comparison with Existing Methods: Class dependent methods proved to be unsuitable for physiological time series. Detrended fluctuation analysis as the most prevailing method in the literature exhibited large estimation variances. Conclusions: The comparative numerical analysis and experimental applications provide a thorough basis for determining an appropriate and robust method for measuring and comparing a physiologically meaningful biomarker, the spectral index β. In consideration of the constraints of application, we note the significant drawbacks of detrended fluctuation analysis and conclude that the averaged wavelet coefficient method can provide reasonable consistency and accuracy for characterizing these fractal time series. PMID:24200509

  16. A comparative analysis of spectral exponent estimation techniques for 1/f(β) processes with applications to the analysis of stride interval time series.

    PubMed

    Schaefer, Alexander; Brach, Jennifer S; Perera, Subashan; Sejdić, Ervin

    2014-01-30

    The time evolution and complex interactions of many nonlinear systems, such as in the human body, result in fractal types of parameter outcomes that exhibit self similarity over long time scales by a power law in the frequency spectrum S(f)=1/f(β). The scaling exponent β is thus often interpreted as a "biomarker" of relative health and decline. This paper presents a thorough comparative numerical analysis of fractal characterization techniques with specific consideration given to experimentally measured gait stride interval time series. The ideal fractal signals generated in the numerical analysis are constrained under varying lengths and biases indicative of a range of physiologically conceivable fractal signals. This analysis is to complement previous investigations of fractal characteristics in healthy and pathological gait stride interval time series, with which this study is compared. The results of our analysis showed that the averaged wavelet coefficient method consistently yielded the most accurate results. Class dependent methods proved to be unsuitable for physiological time series. Detrended fluctuation analysis as most prevailing method in the literature exhibited large estimation variances. The comparative numerical analysis and experimental applications provide a thorough basis for determining an appropriate and robust method for measuring and comparing a physiologically meaningful biomarker, the spectral index β. In consideration of the constraints of application, we note the significant drawbacks of detrended fluctuation analysis and conclude that the averaged wavelet coefficient method can provide reasonable consistency and accuracy for characterizing these fractal time series. Copyright © 2013 Elsevier B.V. All rights reserved.
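
    Detrended fluctuation analysis, one of the techniques compared above, is short enough to sketch. The fragment below estimates the DFA scaling exponent alpha with linear detrending over a fixed set of window sizes (an illustrative choice); for noise-like 1/f^beta series, beta can then be related to alpha via beta = 2*alpha - 1.

        # Detrended fluctuation analysis: integrate the series, detrend it in windows of
        # several sizes, and take the log-log slope of the fluctuation function.
        import numpy as np

        def dfa_alpha(x, scales=(8, 16, 32, 64, 128)):
            y = np.cumsum(x - np.mean(x))                          # integrated (profile) series
            F = []
            for n in scales:
                rms = []
                for w in range(len(y) // n):
                    seg = y[w * n:(w + 1) * n]
                    t = np.arange(n)
                    trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrending
                    rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
                F.append(np.mean(rms))
            return np.polyfit(np.log(scales), np.log(F), 1)[0]     # slope = alpha

        rng = np.random.default_rng(8)
        white = rng.standard_normal(4096)
        print("white noise alpha ~ 0.5:", round(dfa_alpha(white), 2))
        print("random walk alpha ~ 1.5:", round(dfa_alpha(np.cumsum(white)), 2))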

  17. Enhancement of Mutual Discovery, Search, and Access of Data for Users of NASA and GEOSS-Cataloged Data Systems

    NASA Technical Reports Server (NTRS)

    Teng, William; Maidment, David; Rodell, Matthew; Strub, Richard; Arctur, David; Ames, Daniel; Rui, Hualan; Vollmer, Bruce; Seiler, Edward

    2014-01-01

    An ongoing NASA-funded Data Rods (time series) project has demonstrated the removal of a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series) for selected variables of the North American and Global Land Data Assimilation Systems (NLDAS and GLDAS, respectively) and other NASA data sets. Data rods are pre-generated or generated on-the-fly (OTF), leveraging the NASA Simple Subset Wizard (SSW), a gateway to NASA data centers. Data rods Web services are accessible through the CUAHSI Hydrologic Information System (HIS) and the Goddard Earth Sciences Data and Information Services Center (GES DISC) but are not easily discoverable by users of other non-NASA data systems. An ongoing GEOSS Water Services project aims to develop a distributed, global registry of water data, map, and modeling services cataloged using the standards and procedures of the Open Geospatial Consortium and the World Meteorological Organization. Preliminary work has shown GEOSS can be leveraged to help provide access to data rods. A new NASA-funded project is extending this early work.

  18. HIMAWARI-8 Geostationary Satellite Observation of the Internal Solitary Waves in the South China Sea

    NASA Astrophysics Data System (ADS)

    Gao, Q.; Dong, D.; Yang, X.; Husi, L.; Shang, H.

    2018-04-01

    The new-generation geostationary meteorological satellite Himawari-8 (H-8) was launched in 2015. Its main payload, the Advanced Himawari Imager (AHI), can observe the earth at 10-minute intervals with a spatial resolution as high as 500 m. This makes the H-8 satellite an ideal data source for monitoring marine and atmospheric phenomena. In this study, the propagation of internal solitary waves (ISWs) in the South China Sea is investigated using AHI imagery time series for the first time. Three ISW cases were studied at 3:30-8:00 UTC on 30 May 2016. In all, 28 ISWs were detected and tracked between the time series image pairs. The propagation directions and phase speeds of these ISWs are calculated and analyzed. The observation results show that ISW propagation is not stable and remains nonlinear during the waves' lifetime. The resultant ISW speeds agree well with the theoretical values estimated from the Taylor-Goldstein equation using the Argo dataset. This study demonstrates that the new generation of geostationary satellites can be a useful tool to monitor and investigate oceanic internal waves.

  19. Data Quality Monitoring and Noise Analysis at the EUREF Permanent Network

    NASA Astrophysics Data System (ADS)

    Kenyeres, A.; Bruyninx, C.

    2004-12-01

    The EUREF Permanent Network (EPN) now includes more than 150 GNSS stations of different quality and different observation history. Most of the sites are located on the tectonically stable parts of Eurasia, where only mm-level yearly displacements are expected. In order to extract the relevant geophysical information, sophisticated analysis tools and stable, long-term observations are necessary. As the EPN has been operational since 1996, it offers the potential to estimate high-quality velocities associated with reliable uncertainties. In order to support this work, a set of efficient and demonstrative tools has been developed to monitor data and station quality. The periodically upgraded results are displayed on the website of the EPN Central Bureau (CB) (www.epncb.oma.be) in terms of sky plots, graphs of observation percentage, cycle slips and multipath. The different quality plots are indirectly used for the interpretation of the time series. Sudden changes or unusual variation in the time series (beyond the obvious equipment changes) often correlate with changes in the environment mirrored by the quality plots. These graphs are vital for the proper interpretation and understanding of the real processes. Knowing the nuisance factors, we can generate cleaner time series. We present relevant examples of this work. Two kinds of time series plots are displayed at the EPN CB website: raw and improved time series. They are cumulative solutions of the weekly EPN SINEX files using the minimum constraint approach. Within the improved time series, the outliers and offsets are already taken into account. We will also present preliminary results of a detailed noise analysis of the EPN time series. The target of this work is twofold: on one side we aim at computing more realistic velocity estimates of the EPN stations, and on the other side the information about the station noise characteristics will support the removal and proper interpretation of site-specific phenomena.

  20. Monitoring gradual ecosystem change using Landsat time series analyses: case studies in selected forest and rangeland ecosystems

    USGS Publications Warehouse

    Vogelmann, James E.; Xian, George; Homer, Collin G.; Tolk, Brian

    2012-01-01

    The focus of the study was to assess gradual changes occurring throughout a range of natural ecosystems using decadal Landsat Thematic Mapper (TM) and Enhanced Thematic Mapper Plus (ETM +) time series data. Time series data stacks were generated for four study areas: (1) a four scene area dominated by forest and rangeland ecosystems in the southwestern United States, (2) a sagebrush-dominated rangeland in Wyoming, (3) woodland adjacent to prairie in northwestern Nebraska, and (4) a forested area in the White Mountains of New Hampshire. Through analyses of time series data, we found evidence of gradual systematic change in many of the natural vegetation communities in all four areas. Many of the conifer forests in the southwestern US are showing declines related to insects and drought, but very few are showing evidence of improving conditions or increased greenness. Sagebrush communities are showing decreases in greenness related to fire, mining, and probably drought, but very few of these communities are showing evidence of increased greenness or improving conditions. In Nebraska, forest communities are showing local expansion and increased canopy densification in the prairie–woodland interface, and in the White Mountains high elevation understory conifers are showing range increases towards lower elevations. The trends detected are not obvious through casual inspection of the Landsat images. Analyses of time series data using many scenes and covering multiple years are required in order to develop better impressions and representations of the changing ecosystem patterns and trends that are occurring. The approach described in this paper demonstrates that Landsat time series data can be used operationally for assessing gradual ecosystem change across large areas. Local knowledge and available ancillary data are required in order to fully understand the nature of these trends.

  1. Occurrence of CPPopt Values in Uncorrelated ICP and ABP Time Series.

    PubMed

    Cabeleira, M; Czosnyka, M; Liu, X; Donnelly, J; Smielewski, P

    2018-01-01

    Optimal cerebral perfusion pressure (CPPopt) is a concept that uses the pressure reactivity (PRx)-CPP relationship over a given period to find a value of CPP at which PRx shows best autoregulation. It has been proposed that this relationship be modelled by a U-shaped curve, where the minimum is interpreted as being the CPP value that corresponds to the strongest autoregulation. Owing to the nature of the calculation and the signals involved in it, the occurrence of CPPopt curves generated by non-physiological variations of intracranial pressure (ICP) and arterial blood pressure (ABP), termed here "false positives", is possible. Such random occurrences would artificially increase the yield of CPPopt values and decrease the reliability of the methodology. In this work, we studied the probability of the random occurrence of false positives and we compared the effect of the parameters used for CPPopt calculation on this probability. To simulate the occurrence of false positives, uncorrelated ICP and ABP time series were generated by destroying the relationship between the waves in real recordings. The CPPopt algorithm was then applied to these new series and the number of false positives was counted for different values of the algorithm's parameters. The percentage of CPPopt curves generated from uncorrelated data was demonstrated to be 11.5%. This value can be minimised by tuning some of the calculation parameters, such as increasing the calculation window and increasing the minimum PRx span accepted on the curve.

  2. Stochastic modeling of hourly rainfall times series in Campania (Italy)

    NASA Astrophysics Data System (ADS)

    Giorgio, M.; Greco, R.

    2009-04-01

    Occurrence of flowslides and floods in small catchments is difficult to predict, since it is affected by a number of variables, such as mechanical and hydraulic soil properties, slope morphology, vegetation coverage, and rainfall spatial and temporal variability. Consequently, landslide risk assessment procedures and early warning systems still rely on simple empirical models based on correlation between recorded rainfall data and observed landslides and/or river discharges. The effectiveness of such systems could be improved by reliable quantitative rainfall prediction, which can allow gaining larger lead-times. Analysis of on-site recorded rainfall height time series represents the most effective approach for a reliable prediction of the local temporal evolution of rainfall. Hydrological time series analysis is a widely studied field in hydrology, often carried out by means of autoregressive models, such as AR, ARMA, ARX and ARMAX (e.g. Salas [1992]). Such models gave the best results when applied to the analysis of autocorrelated hydrological time series, like river flow or level time series. Conversely, they are not able to model the behaviour of intermittent time series, like point rainfall height series usually are, especially when recorded with short sampling time intervals. More useful for this issue are the so-called DRIP (Disaggregated Rectangular Intensity Pulse) and NSRP (Neyman-Scott Rectangular Pulse) models [Heneker et al., 2001; Cowpertwait et al., 2002], usually adopted to generate synthetic point rainfall series. In this paper, the DRIP model approach is adopted, in which the sequence of rain storms and dry intervals constituting the structure of the rainfall time series is modeled as an alternating renewal process. The final aim of the study is to provide a useful tool to implement an early warning system for hydrogeological risk management. Model calibration has been carried out with hourly rainfall height data provided by the rain gauges of the Campania Region civil protection agency meteorological warning network. ACKNOWLEDGEMENTS: The research was co-financed by the Italian Ministry of University, by means of the PRIN 2006 program, within the research project entitled 'Definition of critical rainfall thresholds for destructive landslides for civil protection purposes'. REFERENCES: Cowpertwait, P.S.P., Kilsby, C.G. and O'Connell, P.E., 2002. A space-time Neyman-Scott model of rainfall: Empirical analysis of extremes, Water Resources Research, 38(8):1-14. Salas, J.D., 1992. Analysis and modeling of hydrological time series, in D.R. Maidment, ed., Handbook of Hydrology, McGraw-Hill, New York. Heneker, T.M., Lambert, M.F. and Kuczera G., 2001. A point rainfall model for risk-based design, Journal of Hydrology, 247(1-2):54-71.
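
    The alternating renewal structure adopted above can be sketched as a simple rectangular-pulse generator. In the fragment below, dry-spell and storm durations are drawn from exponential distributions and each storm receives a constant intensity; the distributions and parameter values are placeholders, not the calibrated DRIP model for the Campania gauges.

        # Synthetic hourly rainfall from an alternating renewal process: dry spells and
        # storms alternate, each storm being a rectangular pulse of constant intensity.
        import numpy as np

        def synthetic_hourly_rain(n_hours, mean_dry=30.0, mean_wet=6.0, mean_intensity=2.0, seed=0):
            rng = np.random.default_rng(seed)
            rain = np.zeros(n_hours)
            t = 0
            while t < n_hours:
                t += int(np.ceil(rng.exponential(mean_dry)))            # dry interval (h)
                dur = max(1, int(np.ceil(rng.exponential(mean_wet))))   # storm duration (h)
                intensity = rng.exponential(mean_intensity)             # storm intensity (mm/h)
                rain[t:t + dur] = intensity                             # rectangular pulse
                t += dur
            return rain

        series = synthetic_hourly_rain(24 * 365)
        print("wet fraction:", (series > 0).mean(), " annual total (mm):", series.sum())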

  3. Calibration and validation of a phenomenological influent pollutant disturbance scenario generator using full-scale data.

    PubMed

    Flores-Alsina, Xavier; Saagi, Ramesh; Lindblom, Erik; Thirsing, Carsten; Thornberg, Dines; Gernaey, Krist V; Jeppsson, Ulf

    2014-03-15

    The objective of this paper is to demonstrate the full-scale feasibility of the phenomenological dynamic influent pollutant disturbance scenario generator (DIPDSG) that was originally used to create the influent data of the International Water Association (IWA) Benchmark Simulation Model No. 2 (BSM2). In this study, the influent characteristics of two large Scandinavian treatment facilities are studied for a period of two years. A step-wise procedure based on adjusting the most sensitive parameters at different time scales is followed to calibrate/validate the DIPDSG model blocks for: 1) flow rate; 2) pollutants (carbon, nitrogen); 3) temperature; and, 4) transport. Simulation results show that the model successfully describes daily/weekly and seasonal variations and the effect of rainfall and snow melting on the influent flow rate, pollutant concentrations and temperature profiles. Furthermore, additional phenomena such as size and accumulation/flush of particulates of/in the upstream catchment and sewer system are incorporated in the simulated time series. Finally, this study is complemented with: 1) the generation of additional future scenarios showing the effects of different rainfall patterns (climate change) or influent biodegradability (process uncertainty) on the generated time series; 2) a demonstration of how to reduce the cost/workload of measuring campaigns by filling the gaps due to missing data in the influent profiles; and, 3) a critical discussion of the presented results balancing model structure/calibration procedure complexity and prediction capabilities. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Dynamic modelling of microRNA regulation during mesenchymal stem cell differentiation.

    PubMed

    Weber, Michael; Sotoca, Ana M; Kupfer, Peter; Guthke, Reinhard; van Zoelen, Everardus J

    2013-11-12

    Network inference from gene expression data is a typical approach to reconstruct gene regulatory networks. During chondrogenic differentiation of human mesenchymal stem cells (hMSCs), a complex transcriptional network is active and regulates the temporal differentiation progress. As modulators of transcriptional regulation, microRNAs (miRNAs) play a critical role in stem cell differentiation. Integrated network inference aims at determining interrelations between miRNAs and mRNAs on the basis of expression data as well as miRNA target predictions. We applied the NetGenerator tool in order to infer an integrated gene regulatory network. Time series experiments were performed to measure mRNA and miRNA abundances of TGF-beta1+BMP2 stimulated hMSCs. Network nodes were identified by analysing temporal expression changes, miRNA target gene predictions, time series correlation and literature knowledge. Network inference was performed using NetGenerator to reconstruct a dynamical regulatory model based on the measured data and prior knowledge. The resulting model is robust against noise and shows an optimal trade-off between fitting precision and inclusion of prior knowledge. It predicts the influence of miRNAs on the expression of chondrogenic marker genes and therefore proposes novel regulatory relations in differentiation control. By analysing the inferred network, we identified a previously unknown regulatory effect of miR-524-5p on the expression of the transcription factor SOX9 and the chondrogenic marker genes COL2A1, ACAN and COL10A1. Genome-wide exploration of miRNA-mRNA regulatory relationships is a reasonable approach to identify miRNAs which have so far not been associated with the investigated differentiation process. The NetGenerator tool is able to identify valid gene regulatory networks on the basis of miRNA and mRNA time series data.

  5. Is walking a random walk? Evidence for long-range correlations in stride interval of human gait

    NASA Technical Reports Server (NTRS)

    Hausdorff, Jeffrey M.; Peng, C.-K.; Ladin, Zvi; Wei, Jeanne Y.; Goldberger, Ary L.

    1995-01-01

    Complex fluctuations of unknown origin appear in the normal gait pattern. These fluctuations might be described as being (1) uncorrelated white noise, (2) short-range correlations, or (3) long-range correlations with power-law scaling. To test these possibilities, the stride interval of 10 healthy young men was measured as they walked for 9 min at their usual rate. From these time series we calculated scaling indexes by using a modified random walk analysis and power spectral analysis. Both indexes indicated the presence of long-range self-similar correlations extending over hundreds of steps; the stride interval at any time depended on the stride interval at remote previous times, and this dependence decayed in a scale-free (fractal-like) power-law fashion. These scaling indexes were significantly different from those obtained after random shuffling of the original time series, indicating the importance of the sequential ordering of the stride interval. We demonstrate that conventional models of gait generation fail to reproduce the observed scaling behavior and introduce a new type of central pattern generator model that successfully accounts for the experimentally observed long-range correlations.
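
    The "modified random walk analysis" referred to here is closely related to detrended fluctuation analysis (DFA). A minimal DFA sketch on a synthetic stride-interval series is shown below; it is illustrative only and not the authors' code, and the scales and series length are arbitrary choices.

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis: returns the scaling exponent alpha.

    alpha ~ 0.5 for uncorrelated noise, alpha > 0.5 for long-range
    (persistent) correlations."""
    y = np.cumsum(x - np.mean(x))                 # integrate the series (random-walk profile)
    fluctuations = []
    for s in scales:
        n_seg = len(y) // s
        f2 = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)          # local linear trend in each window
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluctuations.append(np.sqrt(np.mean(f2)))
    # Slope of log(fluctuation) vs log(scale) is the scaling exponent
    return np.polyfit(np.log(scales), np.log(fluctuations), 1)[0]

rng = np.random.default_rng(1)
white = rng.standard_normal(4096)                 # uncorrelated "stride intervals"
scales = np.array([16, 32, 64, 128, 256])
print(f"alpha (white noise) = {dfa(white, scales):.2f}   # expect ~0.5")
```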

  6. Optimal Subset Selection of Time-Series MODIS Images and Sample Data Transfer with Random Forests for Supervised Classification Modelling

    PubMed Central

    Zhou, Fuqun; Zhang, Aining

    2016-01-01

    Nowadays, various time-series Earth Observation data with multiple bands are freely available, such as Moderate Resolution Imaging Spectroradiometer (MODIS) datasets including 8-day composites from NASA, and 10-day composites from the Canada Centre for Remote Sensing (CCRS). It is challenging to efficiently use these time-series MODIS datasets for long-term environmental monitoring due to their vast volume and information redundancy. This challenge will be greater when Sentinel 2–3 data become available. Another challenge that researchers face is the lack of in-situ data for supervised modelling, especially for time-series data analysis. In this study, we attempt to tackle the two important issues with a case study of land cover mapping using CCRS 10-day MODIS composites with the help of Random Forests’ features: variable importance, outlier identification. The variable importance feature is used to analyze and select optimal subsets of time-series MODIS imagery for efficient land cover mapping, and the outlier identification feature is utilized for transferring sample data available from one year to an adjacent year for supervised classification modelling. The results of the case study of agricultural land cover classification at a regional scale show that using only about a half of the variables we can achieve land cover classification accuracy close to that generated using the full dataset. The proposed simple but effective solution of sample transferring could make supervised modelling possible for applications lacking sample data. PMID:27792152

  7. Optimal Subset Selection of Time-Series MODIS Images and Sample Data Transfer with Random Forests for Supervised Classification Modelling.

    PubMed

    Zhou, Fuqun; Zhang, Aining

    2016-10-25

    Nowadays, various time-series Earth Observation data with multiple bands are freely available, such as Moderate Resolution Imaging Spectroradiometer (MODIS) datasets including 8-day composites from NASA, and 10-day composites from the Canada Centre for Remote Sensing (CCRS). It is challenging to efficiently use these time-series MODIS datasets for long-term environmental monitoring due to their vast volume and information redundancy. This challenge will be greater when Sentinel 2-3 data become available. Another challenge that researchers face is the lack of in-situ data for supervised modelling, especially for time-series data analysis. In this study, we attempt to tackle the two important issues with a case study of land cover mapping using CCRS 10-day MODIS composites with the help of Random Forests' features: variable importance, outlier identification. The variable importance feature is used to analyze and select optimal subsets of time-series MODIS imagery for efficient land cover mapping, and the outlier identification feature is utilized for transferring sample data available from one year to an adjacent year for supervised classification modelling. The results of the case study of agricultural land cover classification at a regional scale show that using only about a half of the variables we can achieve land cover classification accuracy close to that generated using the full dataset. The proposed simple but effective solution of sample transferring could make supervised modelling possible for applications lacking sample data.
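
    The variable-importance step can be sketched with scikit-learn as follows. The data here are synthetic stand-ins for the MODIS time-series variables and land-cover labels used in the study; only the mechanics of ranking variables and selecting a reduced subset are illustrated.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in: 500 pixels x 46 "time-series band" variables, 4 land-cover classes.
X = rng.standard_normal((500, 46))
y = rng.integers(0, 4, 500)
X[:, 5] += y            # make a few variables genuinely informative
X[:, 20] += 0.5 * y

rf = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=0)
rf.fit(X, y)

# Rank variables by importance and keep the smallest subset covering ~half the total importance.
order = np.argsort(rf.feature_importances_)[::-1]
cum = np.cumsum(rf.feature_importances_[order])
subset = order[:np.searchsorted(cum, 0.5) + 1]
print(f"OOB accuracy (all variables): {rf.oob_score_:.2f}")
print(f"variables covering 50% of total importance: {sorted(subset.tolist())}")
```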

  8. Data series embedding and scale invariant statistics.

    PubMed

    Michieli, I; Medved, B; Ristov, S

    2010-06-01

    Data sequences acquired from bio-systems such as human gait data, heart rate interbeat data, or DNA sequences exhibit complex dynamics that is frequently described by a long-memory or power-law decay of the autocorrelation function. One way of characterizing that dynamics is through scale invariant statistics or "fractal-like" behavior. For quantifying scale invariant parameters of physiological signals several methods have been proposed. Among them the most common are detrended fluctuation analysis, sample mean variance analyses, power spectral density analysis, R/S analysis, and recently in the realm of the multifractal approach, wavelet analysis. In this paper it is demonstrated that embedding the time series data in a high-dimensional pseudo-phase space reveals scale invariant statistics in a simple fashion. The procedure is applied to different stride-interval data sets from human gait measurement time series (PhysioBank data library). Results show that the introduced mapping adequately separates long-memory from random behavior. Smaller gait data sets were analyzed and scale-free trends for limited scale intervals were successfully detected. The method was verified on artificially produced time series with known scaling behavior and with varying content of noise. The possibility for the method to falsely detect long-range dependence in artificially generated short-range-dependence series was investigated. (c) 2009 Elsevier B.V. All rights reserved.
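
    A minimal sketch of the time-delay (pseudo-phase-space) embedding step is shown below, assuming a delay tau and embedding dimension m chosen by the analyst; the scale-invariance statistics computed on the embedded points in the paper are not reproduced here.

```python
import numpy as np

def delay_embed(x, m=5, tau=2):
    """Map a scalar series into an m-dimensional pseudo-phase space.

    Row i is the vector (x[i], x[i+tau], ..., x[i+(m-1)*tau])."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(m)])

rng = np.random.default_rng(0)
stride = 1.1 + 0.05 * rng.standard_normal(1000)   # toy stride-interval series [s]
vectors = delay_embed(stride, m=5, tau=2)
print(vectors.shape)                              # (992, 5) embedded points
```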

  9. Effective precipitation duration for runoff peaks based on catchment modelling

    NASA Astrophysics Data System (ADS)

    Sikorska, A. E.; Viviroli, D.; Seibert, J.

    2018-01-01

    Although precipitation intensities may vary greatly during a single flood event, detailed information about these intensities may not be required to accurately simulate floods with a hydrological model, which reacts rather to cumulative precipitation sums. This raises two questions: to what extent is it important to preserve sub-daily precipitation intensities, and how long does it effectively rain from the hydrological point of view? Both questions might seem straightforward to answer with a direct analysis of past precipitation events but require some arbitrary choices regarding the length of a precipitation event. To avoid these arbitrary decisions, here we present an alternative approach to characterize the effective length of a precipitation event which is based on runoff simulations with respect to large floods. More precisely, we quantify the fraction of a day over which the daily precipitation has to be distributed to faithfully reproduce the large annual and seasonal floods which were generated by the hourly precipitation rate time series. New precipitation time series were generated by first aggregating the hourly observed data into daily totals and then evenly distributing them over sub-daily periods (n hours). These simulated time series were used as input to a hydrological bucket-type model and the resulting runoff flood peaks were compared to those obtained when using the original precipitation time series. We then define the effective daily precipitation duration as the number of hours n for which the largest peaks are simulated best. For nine mesoscale Swiss catchments this effective daily precipitation duration was about half a day, which indicates that detailed information on precipitation intensities is not necessarily required to accurately estimate peaks of the largest annual and seasonal floods. These findings support the use of simple disaggregation approaches to make use of past daily precipitation observations or daily precipitation simulations (e.g. from climate models) for hydrological modeling at an hourly time step.
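
    The aggregation/redistribution step can be sketched as follows: hourly observations are summed to daily totals and each total is then spread evenly over the first n hours of each day. This is a simplified reading of the procedure, with n as a free parameter and synthetic hourly data standing in for the observations.

```python
import numpy as np

def redistribute_daily(hourly, n_hours):
    """Aggregate hourly precipitation to daily totals, then spread each total
    evenly over n_hours of the day (zero for the remaining hours)."""
    days = hourly.reshape(-1, 24)                 # assumes a whole number of days
    daily_totals = days.sum(axis=1)
    out = np.zeros_like(days, dtype=float)
    out[:, :n_hours] = daily_totals[:, None] / n_hours
    return out.ravel()

rng = np.random.default_rng(3)
observed = rng.gamma(0.3, 2.0, size=24 * 30)      # toy hourly precipitation [mm]
for n in (6, 12, 24):
    p = redistribute_daily(observed, n)
    ok = np.allclose(p.reshape(-1, 24).sum(1), observed.reshape(-1, 24).sum(1))
    print(f"n={n:2d} h  daily totals preserved: {ok}")
```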

  10. SPA- STATISTICAL PACKAGE FOR TIME AND FREQUENCY DOMAIN ANALYSIS

    NASA Technical Reports Server (NTRS)

    Brownlow, J. D.

    1994-01-01

    The need for statistical analysis often arises when data is in the form of a time series. This type of data is usually a collection of numerical observations made at specified time intervals. Two kinds of analysis may be performed on the data. First, the time series may be treated as a set of independent observations using a time domain analysis to derive the usual statistical properties including the mean, variance, and distribution form. Secondly, the order and time intervals of the observations may be used in a frequency domain analysis to examine the time series for periodicities. In almost all practical applications, the collected data is actually a mixture of the desired signal and a noise signal which is collected over a finite time period with a finite precision. Therefore, any statistical calculations and analyses are actually estimates. The Spectrum Analysis (SPA) program was developed to perform a wide range of statistical estimation functions. SPA can provide the data analyst with a rigorous tool for performing time and frequency domain studies. In a time domain statistical analysis the SPA program will compute the mean, variance, standard deviation, mean square, and root mean square. It also lists the data maximum, data minimum, and the number of observations included in the sample. In addition, a histogram of the time domain data is generated, a normal curve is fit to the histogram, and a goodness-of-fit test is performed. These time domain calculations may be performed on both raw and filtered data. For a frequency domain statistical analysis the SPA program computes the power spectrum, cross spectrum, coherence, phase angle, amplitude ratio, and transfer function. The estimates of the frequency domain parameters may be smoothed with the use of Hann-Tukey, Hamming, Bartlett, or moving average windows. Various digital filters are available to isolate data frequency components. Frequency components with periods longer than the data collection interval are removed by least-squares detrending. As many as ten channels of data may be analyzed at one time. Both tabular and plotted output may be generated by the SPA program. This program is written in FORTRAN IV and has been implemented on a CDC 6000 series computer with a central memory requirement of approximately 142K (octal) of 60 bit words. This core requirement can be reduced by segmentation of the program. The SPA program was developed in 1978.
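
    A present-day equivalent of SPA's basic time- and frequency-domain estimates can be sketched in a few lines with SciPy. This is illustrative only, not a port of the FORTRAN program; the signal, sampling rate, and window choice are arbitrary.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs = 10.0                                          # sampling frequency [Hz]
t = np.arange(0, 60, 1 / fs)
x = np.sin(2 * np.pi * 1.5 * t) + 0.5 * rng.standard_normal(t.size)  # signal + noise

# Time-domain statistics
print(f"mean={x.mean():.3f}  var={x.var():.3f}  rms={np.sqrt(np.mean(x**2)):.3f}")
print(f"min={x.min():.3f}  max={x.max():.3f}  n={x.size}")

# Frequency-domain analysis: Welch power spectral density with a Hamming window
f, pxx = signal.welch(x, fs=fs, window="hamming", nperseg=256)
print(f"dominant frequency ~ {f[np.argmax(pxx)]:.2f} Hz")
```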

  11. Long-Term Trends and Variability in Spring Development of Calanus finmarchicus in the Southeastern Norwegian Sea during 1996-2012

    NASA Astrophysics Data System (ADS)

    Dupont, N.; Bagøien, E.; Melle, W.

    2016-02-01

    Calanus finmarchicus is the dominant copepod species in the Norwegian Sea in terms of biomass, playing a key role in the ecosystem by transferring energy from primary producers to higher trophic levels. This study analyses the long-term trend of a 17-year time series (1996-2012) on abundance of adult Calanus finmarchicus in the Atlantic water-mass of the southern Norwegian Sea during spring. The long-term trend in spring abundance was assessed by using Generalised Additive Models, while simultaneously accounting for both general population development and inter-annual variation in population development throughout the study period. In one model, we focus on inter-annual changes in timing of the Calanus spring seasonal development by including Mean Stage Composition as a measure for state of population development. Following a short increase during the years 1996 to 2000, the abundance of Calanus finmarchicus decreased strongly until about the year 2010. For the last two years of the studied period, 2011-2012, increasing population abundances are suggested, but with less certainty. The model results suggest that the analysis is capturing the G0 generation, displaying a peak for the adults in about mid-April. Inter-annual differences in spring seasonal development are suggested, with the peak of adults shifting towards earlier in the season as well as a shorter generation time. Considering the importance of Calanus finmarchicus as food for planktivorous predators in the Norwegian Sea, our time series analysis suggests relevant changes both with respect to the spring abundance and timing of this food source. The next step is to relate variation in the Calanus time series to environmental factors with special emphasis on climatic drivers.

  12. Aeroelastic impact of above-rated wave-induced structural motions on the near-wake stability of a floating offshore wind turbine rotor

    NASA Astrophysics Data System (ADS)

    Rodriguez, Steven; Jaworski, Justin

    2017-11-01

    The impact of above-rated wave-induced motions on the stability of floating offshore wind turbine near-wakes is studied numerically. The rotor near-wake is generated using a lifting-line free vortex wake method, which is strongly coupled to a finite element solver for kinematically nonlinear blade deformations. A synthetic time series of relatively high-amplitude, high-frequency motions representative of above-rated conditions of the NREL 5 MW reference wind turbine is imposed on the rotor structure. To evaluate the impact of these above-rated conditions, a linear stability analysis is first performed on the near wake generated by a fixed-tower wind turbine configuration at above-rated inflow conditions. The platform motion is then introduced via the synthetic time series, and a stability analysis is performed on the wake generated by the floating offshore wind turbine at the same above-rated inflow conditions. The stability trends (disturbance modes versus the divergence rate of vortex structures) of the two analyses are compared to identify the impact that above-rated wave-induced structural motions have on the stability of the floating offshore wind turbine wake.

  13. First and second order semi-Markov chains for wind speed modeling

    NASA Astrophysics Data System (ADS)

    Prattico, F.; Petroni, F.; D'Amico, G.

    2012-04-01

    The increasing interest in renewable energy leads scientific research to find better ways to recover most of the available energy. In particular, the maximum energy recoverable from wind equals 59.3% of the available energy (Betz law), attained at a specific pitch angle and when the ratio between the output and input wind speeds equals 1/3. The pitch angle is the angle between the blade airfoil and the wind direction. Older turbines, and many of those currently on the market, have a fixed airfoil geometry, so they operate at an efficiency lower than 59.3%. New-generation wind turbines, by contrast, can vary the pitch angle by rotating the blades, which allows them to recover the maximum energy at different wind speeds, operating at the Betz limit over a range of speed ratios. An effective pitch-angle control system allows the turbine to recover energy more efficiently in the transient regime. A good stochastic model of wind speed is therefore needed, both to help optimize turbine design and to assist the control system in predicting the wind speed so that the blades can be positioned quickly and correctly. Synthetic wind speed data are also a powerful tool for verifying turbine structures and for estimating the energy recoverable from a specific site. To generate synthetic data, Markov chains of first or higher order are often used [1,2,3]. In particular, [3] presents a comparison between a first-order and a second-order Markov chain. A similar work, but only for the first-order Markov chain, is conducted in [2], presenting the probability transition matrix and comparing the energy spectral density and autocorrelation of real and synthetic wind speed data. An attempt to jointly model wind speed and direction is presented in [1], using two models: a first-order Markov chain with different numbers of states and a Weibull distribution. All these models use Markov chains to generate synthetic wind speed time series, but the search for a better model is still open. Approaching this issue, we applied new models which are generalizations of Markov models. More precisely, we applied semi-Markov models to generate synthetic wind speed time series. Semi-Markov processes (SMP) are a wide class of stochastic processes which generalize at the same time both Markov chains and renewal processes. Their main advantage is the ability to use any type of waiting-time distribution to model the time until a transition from one state to another. This greater flexibility comes at a price: more data are needed to estimate the model's more numerous parameters. Data availability is not an issue in wind speed studies; therefore, semi-Markov models can be used in a statistically efficient way.
In this work we present three different semi-Markov chain models: the first one is a first-order SMP where the transition probabilities between two speed states (at times Tn and Tn-1) depend on the initial state (the state at Tn-1), the final state (the state at Tn) and the waiting time (given by t=Tn-Tn-1); the second model is a second-order SMP where the transition probabilities are considered as depending also on the state the wind speed was in before the initial state (the state at Tn-2); and the last one is again a second-order SMP where the transition probabilities depend on the three states at Tn-2, Tn-1 and Tn and on the waiting times t_1=Tn-1-Tn-2 and t_2=Tn-Tn-1. The three models are used to generate synthetic time series for wind speed by means of Monte Carlo simulations, and the time-lagged autocorrelation is used to compare statistical properties of the proposed models with those of real data and also with a time series generated through a simple Markov chain. [1] F. Youcef Ettoumi, H. Sauvageot, A.-E.-H. Adane, Statistical bivariate modeling of wind using first-order Markov chain and Weibull distribution, Renewable Energy, 28/2003 1787-1802. [2] A. Shamshad, M.A. Bawadi, W.M.W. Wan Hussin, T.A. Majid, S.A.M. Sanusi, First and second order Markov chain models for synthetic generation of wind speed time series, Energy 30/2005 693-708. [3] H. Nfaoui, H. Essiarab, A.A.M. Sayigh, A stochastic Markov chain model for simulating wind speed time series at Tangiers, Morocco, Renewable Energy 29/2004, 1407-1418.
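
    A minimal first-order semi-Markov sketch of the kind described above is shown below: wind speed is discretized into states, the transition matrix and state-dependent waiting times are estimated empirically, and a synthetic series is generated by Monte Carlo. The toy data, 1 m/s binning, and empirical resampling of waiting times are placeholder choices, not the authors' calibrated models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "observed" hourly wind speed [m/s]
obs = np.clip(8 + 0.05 * np.cumsum(rng.normal(0, 0.5, 5000))
              + rng.normal(0, 1, 5000), 0, None)

# Discretize into 1 m/s states and extract sojourns (state, waiting time)
states = np.floor(obs).astype(int)
sojourns = []
s, d = states[0], 1
for v in states[1:]:
    if v == s:
        d += 1
    else:
        sojourns.append((s, d))
        s, d = v, 1
sojourns.append((s, d))

n_states = states.max() + 1
trans = np.ones((n_states, n_states))          # transition counts (Laplace smoothing)
waits = [[] for _ in range(n_states)]          # empirical waiting times per state
for (s0, d0), (s1, _) in zip(sojourns[:-1], sojourns[1:]):
    trans[s0, s1] += 1
    waits[s0].append(d0)
trans /= trans.sum(axis=1, keepdims=True)

def simulate(n_hours, start):
    """Monte Carlo generation of a synthetic hourly wind speed series."""
    out, s = [], start
    while len(out) < n_hours:
        d = rng.choice(waits[s]) if waits[s] else 1   # sampled waiting time in state s
        out.extend([s + 0.5] * int(d))                 # bin centre as the speed value
        s = rng.choice(n_states, p=trans[s])           # next state from transition row
    return np.array(out[:n_hours])

synth = simulate(5000, states[0])
print(f"obs mean {obs.mean():.2f}, synth mean {synth.mean():.2f}")
print(f"lag-1 autocorr obs {np.corrcoef(obs[:-1], obs[1:])[0, 1]:.2f}, "
      f"synth {np.corrcoef(synth[:-1], synth[1:])[0, 1]:.2f}")
```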

  14. MODIS Interactive Subsetting Tool (MIST)

    NASA Astrophysics Data System (ADS)

    McAllister, M.; Duerr, R.; Haran, T.; Khalsa, S. S.; Miller, D.

    2008-12-01

    In response to requests from the user community, NSIDC has teamed with the Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) and the Moderate Resolution Data Center (MrDC) to provide time series subsets of satellite data covering stations in the Greenland Climate Network (GC-Net) and the International Arctic Systems for Observing the Atmosphere (IASOA) network. To serve these data NSIDC created the MODIS Interactive Subsetting Tool (MIST). MIST works with 7 km by 7 km subset time series of certain Version 5 (V005) MODIS products over GC-Net and IASOA stations. User-selected data are delivered in a text Comma Separated Value (CSV) file format. MIST also provides online analysis capabilities that include generating time series and scatter plots. Currently, MIST is a Beta prototype and NSIDC intends that user requests will drive future development of the tool. The intent of this poster is to introduce MIST to the MODIS data user audience and illustrate some of the online analysis capabilities.

  15. Visibility graph analysis of heart rate time series and bio-marker of congestive heart failure

    NASA Astrophysics Data System (ADS)

    Bhaduri, Anirban; Bhaduri, Susmita; Ghosh, Dipak

    2017-09-01

    The study of RR-interval time series in Congestive Heart Failure has been approached with a variety of methods, including non-linear ones. In this article the cardiac dynamics of the heart beat are explored in the light of complex network analysis, viz. the visibility graph method. Heart beat (RR interval) time series data taken from the Physionet database [46, 47], belonging to two groups of subjects, diseased (congestive heart failure, 29 in number) and normal (54 in number), are analyzed with the technique. The overall results show that a quantitative parameter can significantly differentiate between the diseased subjects and the normal subjects as well as between different stages of the disease. Further, when the data are split into periods of around 1 hour each and analyzed separately, the same consistent differences are observed. This quantitative parameter obtained using the visibility graph analysis can thereby be used as a potential bio-marker as well as a subsequent alarm generation mechanism for predicting the onset of Congestive Heart Failure.
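
    A natural visibility graph can be built directly from its definition: samples i and j are connected if every intermediate sample lies below the straight line joining them. The sketch below constructs the graph for a toy RR series and reports the mean degree as an example summary statistic; the specific quantitative parameter used in the paper is not reproduced here.

```python
import numpy as np

def visibility_graph_degrees(x):
    """Natural visibility graph: node degrees of a time series.

    Nodes i < j are linked if x[k] < x[j] + (x[i] - x[j]) * (j - k) / (j - i)
    for every k strictly between i and j."""
    n = len(x)
    deg = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            k = np.arange(i + 1, j)
            line = x[j] + (x[i] - x[j]) * (j - k) / (j - i)
            if k.size == 0 or np.all(x[k] < line):
                deg[i] += 1
                deg[j] += 1
    return deg

rng = np.random.default_rng(0)
rr = 0.8 + 0.05 * rng.standard_normal(300)     # toy RR intervals [s]
deg = visibility_graph_degrees(rr)
print(f"mean degree: {deg.mean():.2f}, max degree: {deg.max()}")
```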

  16. State energy data report 1996: Consumption estimates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    The State Energy Data Report (SEDR) provides annual time series estimates of State-level energy consumption by major economic sectors. The estimates are developed in the Combined State Energy Data System (CSEDS), which is maintained and operated by the Energy Information Administration (EIA). The goal in maintaining CSEDS is to create historical time series of energy consumption by State that are defined as consistently as possible over time and across sectors. CSEDS exists for two principal reasons: (1) to provide State energy consumption estimates to Members of Congress, Federal and State agencies, and the general public and (2) to provide the historical series necessary for EIA's energy models. To the degree possible, energy consumption has been assigned to five sectors: residential, commercial, industrial, transportation, and electric utility. Fuels covered are coal, natural gas, petroleum, nuclear electric power, hydroelectric power, biomass, and other, defined as electric power generated from geothermal, wind, photovoltaic, and solar thermal energy. 322 tabs.

  17. Disorder generated by interacting neural networks: application to econophysics and cryptography

    NASA Astrophysics Data System (ADS)

    Kinzel, Wolfgang; Kanter, Ido

    2003-10-01

    When neural networks are trained on their own output signals they generate disordered time series. In particular, when two neural networks are trained on their mutual output they can synchronize; they relax to a time-dependent state with identical synaptic weights. Two applications of this phenomenon are discussed for (a) econophysics and (b) cryptography. (a) When agents competing in a closed market (minority game) are using neural networks to make their decisions, the total system relaxes to a state of good performance. (b) Two partners communicating over a public channel can find a common secret key.

  18. Cumulative and Synergistic Effects of Physical, Biological, and Acoustic Signals on Marine Mammal Habitat Use

    DTIC Science & Technology

    2009-09-30

    soundbite time series, 3) determination of daily species presence, 4) generation of seasonal soundscapes, and 5) generation of geophysical (wind...m and 55 m (Figure 2b). RESULTS Passive Acoustics: Spectral data from M5 highlighted the change in seasonal soundscapes related to bowhead...whale migration in January and the ice seal breeding season in March-May. The soundscapes of the summer and fall revealed an acoustic environment

  19. Enhancement of Mutual Discovery, Search, and Access of Data for Users of NASA and GEOSS-Cataloged Data Systems

    NASA Astrophysics Data System (ADS)

    Teng, W. L.; Maidment, D. R.; Rodell, M.; Strub, R. F.; Arctur, D. K.; Ames, D. P.; Vollmer, B.; Seiler, E.

    2014-12-01

    An ongoing NASA-funded project has removed a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series) for selected variables of the North American and Global Land Data Assimilation Systems (NLDAS and GLDAS, respectively) and other EOSDIS (Earth Observing System Data Information System) data sets. These time series ("data rods") are pre-generated or generated on-the-fly (OTF), leveraging the NASA Simple Subset Wizard (SSW), a gateway to NASA data centers. Data rods Web services are accessible through the CUAHSI Hydrologic Information System (HIS) and the Goddard Earth Sciences Data and Information Services Center (GES DISC) but are not easily discoverable by users of other non-NASA data systems. The Global Earth Observation System of Systems (GEOSS) is a logical mechanism for providing access to the data rods, both pre-generated and OTF. There is an ongoing series of multi-organizational GEOSS Architecture Implementation Pilots, now in Phase-7 (AIP-7) and with a strong water sub-theme, that is aimed at the GEOSS Water Strategic Target "to produce [by 2015] comprehensive sets of data and information products to support decision-making for efficient management of the world's water resources, based on coordinated, sustained observations of the water cycle on multiple scales." The aim of this "GEOSS Water Services" project is to develop a distributed, global registry of water data, map, and modeling services catalogued using the standards and procedures of the Open Geospatial Consortium and the World Meteorological Organization. This project has already demonstrated that the GEOSS infrastructure can be leveraged to help provide access to time series of model grid information (e.g., NLDAS, GLDAS) or grids of information over a geographical domain for a particular time interval. A new NASA-funded project was begun, to expand on these early efforts to enhance the discovery, search, and access of NASA data by non-NASA users, comprising the following key aspects: 1. Leverage SSW API and EOS Clearing House (ECHO) 2. Register data rods services in GEOSS 3. Develop Web Feature Services (WFS) for data rods 4. Enhance metadata in WFS 5. Make non-NASA data visible to NASA users by leveraging SSW 6. Develop hydrological use cases to guide project deployment and serve as metrics

  20. Quantification of shoreline change along Hatteras Island, North Carolina: Oregon Inlet to Cape Hatteras, 1978-2002, and associated vector shoreline data

    USGS Publications Warehouse

    Hapke, Cheryl J.; Henderson, Rachel E.

    2015-01-01

    Shoreline change spanning twenty-four years was assessed along the coastline of Cape Hatteras National Seashore, at Hatteras Island, North Carolina. The shorelines used in the analysis were generated from georeferenced historical aerial imagery and are used to develop shoreline change rates for Hatteras Island, from Oregon Inlet to Cape Hatteras. A total of 14 dates of aerial photographs ranging from 1978 through 2002 were obtained from the U.S. Army Corps of Engineers Field Research Facility in Duck, North Carolina, and scanned to generate digital imagery. The digital imagery was georeferenced and high water line shorelines (interpreted from the wet/dry line) were digitized from each date to produce a time series of shorelines for the study area. Rates of shoreline change were calculated for three periods: the full span of the time series, 1978 through 2002, and two approximately decadal subsets, 1978–89 and 1989–2002.

  1. Implemented Lomb-Scargle periodogram: a valuable tool for improving cyclostratigraphic research on unevenly sampled deep-sea stratigraphic sequences

    NASA Astrophysics Data System (ADS)

    Pardo-Iguzquiza, Eulogio; Rodríguez-Tovar, Francisco J.

    2011-12-01

    One important handicap when working with stratigraphic sequences is the discontinuous character of the sedimentary record, which is especially relevant in cyclostratigraphic analysis. Uneven palaeoclimatic/palaeoceanographic time series are common, and their cyclostratigraphic analysis is comparatively difficult because most spectral methodologies are appropriate only when working with even sampling. As a means to solve this problem, a program for calculating the smoothed Lomb-Scargle periodogram and cross-periodogram, which additionally evaluates the statistical confidence of the estimated power spectrum through a Monte Carlo procedure (the permutation test), has been developed. The spectral analysis of a short uneven time series calls for assessment of the statistical significance of the spectral peaks, since a periodogram can always be calculated but the main challenge resides in identifying true spectral features. To demonstrate the effectiveness of this program, two case studies are presented: one deals with synthetic data and the other with palaeoceanographic/palaeoclimatic proxies. From a simulated time series of 500 data points, two uneven time series (with 100 and 25 data points) were generated by selecting data at random. Comparative analysis between the power spectra from the simulated series and from the two uneven time series demonstrates the usefulness of the smoothed Lomb-Scargle periodogram for uneven sequences, making it possible to distinguish between statistically significant and spurious spectral peaks. Fragmentary time series of Cd/Ca ratios and δ18O from core AII107-131 of SPECMAP were analysed as a real case study. The efficiency of the direct and cross Lomb-Scargle periodogram in recognizing Milankovitch and sub-Milankovitch signals related to palaeoclimatic/palaeoceanographic changes is demonstrated. As implemented, the Lomb-Scargle periodogram may be applied to any palaeoclimatic/palaeoceanographic proxies, including those usually recovered from contourites, and it holds special interest in the context of centennial- to millennial-scale climatic changes affecting contouritic currents.
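
    For unevenly sampled series, SciPy provides a standard (unsmoothed) Lomb-Scargle periodogram that illustrates the basic step; the smoothing and Monte Carlo permutation test described in the paper would be layered on top of this. The sampling pattern and signal below are synthetic placeholders.

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)

# Uneven sampling: keep 100 of 500 points at random
t_full = np.linspace(0, 100, 500)
keep = np.sort(rng.choice(500, size=100, replace=False))
t = t_full[keep]
y = np.sin(2 * np.pi * t / 10.0) + 0.5 * rng.standard_normal(t.size)  # true period = 10

periods = np.linspace(2, 50, 1000)
ang_freqs = 2 * np.pi / periods                  # lombscargle expects angular frequencies
power = lombscargle(t, y - y.mean(), ang_freqs)

print(f"strongest period ~ {periods[np.argmax(power)]:.1f} (true period 10.0)")
```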

  2. Building Change Detection in Very High Resolution Satellite Stereo Image Time Series

    NASA Astrophysics Data System (ADS)

    Tian, J.; Qin, R.; Cerra, D.; Reinartz, P.

    2016-06-01

    There is an increasing demand for robust methods for urban sprawl monitoring. The steadily increasing number of high resolution and multi-view sensors allows producing datasets with high temporal and spatial resolution; however, less effort has been dedicated to employing very high resolution (VHR) satellite image time series (SITS) to monitor changes in buildings with higher accuracy. In addition, these VHR data are often acquired from different sensors. The objective of this research is to propose a robust time-series data analysis method for VHR stereo imagery. Firstly, the spatial-temporal information of the stereo imagery and the Digital Surface Models (DSMs) generated from them are combined, and building probability maps (BPM) are calculated for all acquisition dates. In the second step, an object-based change analysis is performed based on the derivative features of the BPM sets. The change consistency between the object level and the pixel level is checked to remove outlier pixels. Results are assessed on six pairs of VHR satellite images acquired within a time span of 7 years. The evaluation results demonstrate the efficiency of the proposed method.

  3. An M-estimator for reduced-rank system identification.

    PubMed

    Chen, Shaojie; Liu, Kai; Yang, Yuguang; Xu, Yuting; Lee, Seonjoo; Lindquist, Martin; Caffo, Brian S; Vogelstein, Joshua T

    2017-01-15

    High-dimensional time-series data from a wide variety of domains, such as neuroscience, are being generated every day. Fitting statistical models to such data, to enable parameter estimation and time-series prediction, is an important computational primitive. Existing methods, however, are unable to cope with the high-dimensional nature of these data, due to both computational and statistical reasons. We mitigate both kinds of issues by proposing an M-estimator for Reduced-rank System IDentification (MR. SID). A combination of low-rank approximations, ℓ1 and ℓ2 penalties, and some numerical linear algebra tricks, yields an estimator that is computationally efficient and numerically stable. Simulations and real data examples demonstrate the usefulness of this approach in a variety of problems. In particular, we demonstrate that MR. SID can accurately estimate spatial filters, connectivity graphs, and time-courses from native resolution functional magnetic resonance imaging data. MR. SID therefore enables big time-series data to be analyzed using standard methods, readying the field for further generalizations including non-linear and non-Gaussian state-space models.

  4. An M-estimator for reduced-rank system identification

    PubMed Central

    Chen, Shaojie; Liu, Kai; Yang, Yuguang; Xu, Yuting; Lee, Seonjoo; Lindquist, Martin; Caffo, Brian S.; Vogelstein, Joshua T.

    2018-01-01

    High-dimensional time-series data from a wide variety of domains, such as neuroscience, are being generated every day. Fitting statistical models to such data, to enable parameter estimation and time-series prediction, is an important computational primitive. Existing methods, however, are unable to cope with the high-dimensional nature of these data, due to both computational and statistical reasons. We mitigate both kinds of issues by proposing an M-estimator for Reduced-rank System IDentification ( MR. SID). A combination of low-rank approximations, ℓ1 and ℓ2 penalties, and some numerical linear algebra tricks, yields an estimator that is computationally efficient and numerically stable. Simulations and real data examples demonstrate the usefulness of this approach in a variety of problems. In particular, we demonstrate that MR. SID can accurately estimate spatial filters, connectivity graphs, and time-courses from native resolution functional magnetic resonance imaging data. MR. SID therefore enables big time-series data to be analyzed using standard methods, readying the field for further generalizations including non-linear and non-Gaussian state-space models. PMID:29391659

  5. An Interrupted Time-Series Analysis of Durkheim's Social Deregulation Thesis: The Case of the Russian Federation.

    PubMed

    Pridemore, William Alex; Chamlin, Mitchell B; Cochran, John K

    2007-06-01

    The dissolution of the Soviet Union resulted in sudden, widespread, and fundamental changes to Russian society. The former social welfare system-with its broad guarantees of employment, healthcare, education, and other forms of social support-was dismantled in the shift toward democracy, rule of law, and a free-market economy. This unique natural experiment provides a rare opportunity to examine the potentially disintegrative effects of rapid social change on deviance, and thus to evaluate one of Durkheim's core tenets. We took advantage of this opportunity by performing interrupted time-series analyses of annual age-adjusted homicide, suicide, and alcohol-related mortality rates for the Russian Federation using data from 1956 to 2002, with 1992-2002 as the postintervention time-frame. The ARIMA models indicate that, controlling for the long-term processes that generated these three time series, the breakup of the Soviet Union was associated with an appreciable increase in each of the cause-of-death rates. We interpret these findings as being consistent with the Durkheimian hypothesis that rapid social change disrupts social order, thereby increasing the level of crime and deviance.
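
    An interrupted time-series (intervention) analysis of this kind can be sketched with statsmodels: an ARIMA model is fitted to an annual series together with a step dummy marking the post-1991 period, and the dummy's coefficient estimates the level shift. The data below are synthetic, not the Russian mortality series, and the AR(1) order is a placeholder choice.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)

years = np.arange(1956, 2003)
step = (years >= 1992).astype(float)            # post-dissolution indicator
# Synthetic "rate" series: AR(1) noise plus a true level shift of +4 after 1991
y = 20 + 4 * step
e = 0.0
for i in range(len(years)):
    e = 0.6 * e + rng.normal(0, 1)
    y[i] += e

# ARIMA(1,0,0) with the intervention dummy as an exogenous regressor
res = ARIMA(y, exog=step, order=(1, 0, 0)).fit()
print(res.summary())   # the 'x1' coefficient is the estimated level shift (true value: 4.0)
```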

  6. Influence of weather on the synchrony of gypsy moth (Lepidoptera: Lymantriidae) outbreaks in New England

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, D.W.; Liebhold, A.M.

    1995-10-01

    Outbreaks of the gypsy moth, Lymantria dispar (L.), were partially synchronous across New England states (Massachusetts, Maine, New Hampshire, and Vermont) from 1938 to 1992. To explain this synchrony, we investigated the Moran effect, a hypothesis that local population oscillations, which result from similar density-dependent mechanisms operating at time lags, may be synchronized over wide areas by exposure to common weather patterns. We also investigated the theory of climatic release, which postulates that outbreaks are triggered by climatic factors favorable for population growth. Time series analysis revealed defoliation series in 2 states as 1st-order autoregressive processes and the other 2 as periodic 2nd-order autoregressive processes. Defoliation residual series computed using the autoregressive models for each state were cross correlated with series of weather variables recorded in the respective states. The weather variables significantly correlated with defoliation residuals in all 4 states were minimum temperature and precipitation in mid-December in the same gypsy moth generation and minimum temperature in mid- to late July of the previous generation. These weather variables also were correlated strongly among the 4 states. The analyses supported the predictions of the Moran effect and suggest that common weather may synchronize local populations so as to produce pest outbreaks over wide areas. We did not find convincing evidence to support the theory of climatic release. 41 refs., 7 figs., 4 tabs.

  7. The role of group index engineering in series-connected photonic crystal microcavities for high density sensor microarrays

    PubMed Central

    Zou, Yi; Chakravarty, Swapnajit; Zhu, Liang; Chen, Ray T.

    2014-01-01

    We experimentally demonstrate an efficient and robust method for series connection of photonic crystal microcavities that are coupled to photonic crystal waveguides in the slow light transmission regime. We demonstrate that group index taper engineering provides excellent optical impedance matching between the input and output strip waveguides and the photonic crystal waveguide, a nearly flat transmission over the entire guided mode spectrum and clear multi-resonance peaks corresponding to individual microcavities that are connected in series. Series connected photonic crystal microcavities are further multiplexed in parallel using cascaded multimode interference power splitters to generate a high density silicon nanophotonic microarray comprising 64 photonic crystal microcavity sensors, all of which are interrogated simultaneously at the same instant of time. PMID:25316921

  8. Modeling pollen time series using seasonal-trend decomposition procedure based on LOESS smoothing.

    PubMed

    Rojo, Jesús; Rivero, Rosario; Romero-Morte, Jorge; Fernández-González, Federico; Pérez-Badia, Rosa

    2017-02-01

    Analysis of airborne pollen concentrations provides valuable information on plant phenology and is thus a useful tool in agriculture (for predicting harvests in crops such as the olive and for deciding when to apply phytosanitary treatments) as well as in medicine and the environmental sciences. Variations in airborne pollen concentrations, moreover, are indicators of changing plant life cycles. By modeling pollen time series, we can not only identify the variables influencing pollen levels but also predict future pollen concentrations. In this study, airborne pollen time series were modeled using a seasonal-trend decomposition procedure based on LOcally wEighted Scatterplot Smoothing (LOESS), i.e. STL. The data series (daily Poaceae pollen concentrations over the period 2006-2014) was broken up into seasonal and residual (stochastic) components. The seasonal component was compared with data on Poaceae flowering phenology obtained by field sampling. Residuals were fitted to a model generated from daily temperature and rainfall values, and daily pollen concentrations, using partial least squares regression (PLSR). This method was then applied to predict daily pollen concentrations for 2014 (independent validation data) using results for the seasonal component of the time series and estimates of the residual component for the period 2006-2013. Correlation between predicted and observed values was r = 0.79 (correlation coefficient) for the pre-peak period (i.e., the period prior to the peak pollen concentration) and r = 0.63 for the post-peak period. Separate analysis of each of the components of the pollen data series enables the sources of variability to be identified more accurately than by analysis of the original non-decomposed data series, and for this reason, this procedure has proved to be a suitable technique for analyzing the main environmental factors influencing airborne pollen concentrations.
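
    The decomposition step can be sketched with statsmodels' STL implementation on a toy daily series with an annual period; the PLSR regression of the residual on temperature and rainfall described in the paper would follow as a separate step. The series below is synthetic, not the Poaceae data.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL

rng = np.random.default_rng(0)

# Toy daily "pollen concentration": annual seasonal cycle plus noise, 2006-2013
idx = pd.date_range("2006-01-01", "2013-12-31", freq="D")
doy = idx.dayofyear.to_numpy()
seasonal = 80 * np.exp(-0.5 * ((doy - 150) / 20.0) ** 2)   # pollen peak around day 150
series = pd.Series(seasonal + rng.gamma(1.0, 5.0, len(idx)), index=idx)

# Seasonal-trend decomposition based on LOESS, with a 365-day period
res = STL(series, period=365, robust=True).fit()
print(res.seasonal.head())
print(f"residual std: {res.resid.std():.2f}")
```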

  9. Population momentum across vertebrate life histories

    USGS Publications Warehouse

    Koons, D.N.; Grand, J.B.; Arnold, J.M.

    2006-01-01

    Population abundance is critically important in conservation, management, and demographic theory. Thus, to better understand how perturbations to the life history affect long-term population size, we examined population momentum for four vertebrate classes with different life history strategies. In a series of demographic experiments we show that population momentum generally has a larger effect on long-term population size for organisms with long generation times than for organisms with short generation times. However, patterns between population momentum and generation time varied across taxonomic groups and according to the life history parameter that was changed. Our findings indicate that momentum may be an especially important aspect of population dynamics for long-lived vertebrates, and deserves greater attention in life history studies. Further, we discuss the importance of population momentum in natural resource management, pest control, and conservation arenas. ?? 2006 Elsevier B.V. All rights reserved.

  10. A time-series method for automated measurement of changes in mitotic and interphase duration from time-lapse movies.

    PubMed

    Sigoillot, Frederic D; Huckins, Jeremy F; Li, Fuhai; Zhou, Xiaobo; Wong, Stephen T C; King, Randall W

    2011-01-01

    Automated time-lapse microscopy can visualize proliferation of large numbers of individual cells, enabling accurate measurement of the frequency of cell division and the duration of interphase and mitosis. However, extraction of quantitative information by manual inspection of time-lapse movies is too time-consuming to be useful for analysis of large experiments. Here we present an automated time-series approach that can measure changes in the duration of mitosis and interphase in individual cells expressing fluorescent histone 2B. The approach requires analysis of only 2 features, nuclear area and average intensity. Compared to supervised learning approaches, this method reduces processing time and does not require generation of training data sets. We demonstrate that this method is as sensitive as manual analysis in identifying small changes in interphase or mitotic duration induced by drug or siRNA treatment. This approach should facilitate automated analysis of high-throughput time-lapse data sets to identify small molecules or gene products that influence timing of cell division.

  11. Nonstationary frequency analysis for the trivariate flood series of the Weihe River

    NASA Astrophysics Data System (ADS)

    Jiang, Cong; Xiong, Lihua

    2016-04-01

    Some intensive human activities such as water-soil conservation can significantly alter the natural hydrological processes of rivers. In this study, the effect of water-soil conservation on the trivariate flood series from the Weihe River, located in northwest China, is investigated. The annual maximum daily discharge, annual maximum 3-day flood volume and annual maximum 5-day flood volume are chosen as the study data and used to compose the trivariate flood series. The nonstationarities in both the individual univariate flood series and the corresponding antecedent precipitation series generating the flood events are examined by the Mann-Kendall trend test. It is found that all individual univariate flood series present a significant decreasing trend, while the antecedent precipitation series can be treated as stationary. This indicates that the increase of the water-soil conservation land area has altered the rainfall-runoff relationship of the Weihe basin, and induced the nonstationarities in the three individual univariate flood series. The time-varying moments model based on the Pearson type III distribution is applied to capture the nonstationarities in the flood frequency distribution, with the water-soil conservation land area introduced as the explanatory variable of the flood distribution parameters. Based on the analysis for each individual univariate flood series, the dependence structure among the three univariate flood series is investigated by the time-varying copula model, also with the water-soil conservation land area as the explanatory variable of the copula parameters. The results indicate that the dependence among the trivariate flood series is enhanced by the increase of water-soil conservation land area.
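
    A plain implementation of the Mann-Kendall trend test used here (without tie or autocorrelation corrections) can be written in a few lines; it is shown on a synthetic declining flood series, not the Weihe River data.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall trend test (no tie correction): returns S, Z and two-sided p-value."""
    n = len(x)
    s = 0
    for i in range(n - 1):
        s += np.sum(np.sign(x[i + 1:] - x[i]))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - norm.cdf(abs(z)))
    return s, z, p

rng = np.random.default_rng(0)
years = np.arange(1960, 2010)
flood = 3000 - 20 * (years - 1960) + rng.normal(0, 300, len(years))  # declining peaks [m3/s]
s, z, p = mann_kendall(flood)
print(f"S={s:.0f}  Z={z:.2f}  two-sided p={p:.4f}")
print("significant decreasing trend" if p < 0.05 and z < 0 else "no significant trend")
```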

  12. Between a Map and a Data Rod

    NASA Technical Reports Server (NTRS)

    Teng, William; Rui, Hualan; Strub, Richard; Vollmer, Bruce

    2015-01-01

    A Digital Divide has long stood between how NASA and other satellite-derived data are typically archived (time-step arrays or maps) and how hydrology and other point-time series oriented communities prefer to access those data. In essence, the desired method of data access is orthogonal to the way the data are archived. Our approach to bridging the Divide is part of a larger NASA-supported data rods project to enhance access to and use of NASA and other data by the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Hydrologic Information System (HIS) and the larger hydrology community. Our main objective was to determine a way to reorganize data that is optimal for these communities. Two related objectives were to optimally reorganize data in a way that (1) is operational and fits in and leverages the existing Goddard Earth Sciences Data and Information Services Center (GES DISC) operational environment and (2) addresses the scaling up of data sets available as time series from those archived at the GES DISC to potentially include those from other Earth Observing System Data and Information System (EOSDIS) data archives. Through several prototype efforts and lessons learned, we arrived at a non-database solution that satisfied our objectives/constraints. We describe, in this presentation, how we implemented the operational production of pre-generated data rods and, considering the tradeoffs between length of time series (or number of time steps), resources needed, and performance, how we implemented the operational production of on-the-fly (virtual) data rods. For the virtual data rods, we leveraged a number of existing resources, including the NASA Giovanni Cache and NetCDF Operators (NCO) and used data cubes processed in parallel. Our current benchmark performance for virtual generation of data rods is about a year's worth of time series for hourly data (~9,000 time steps) in ~90 seconds. Our approach is a specific implementation of the general optimal strategy of reorganizing data to match the desired means of access. Results from our project have already significantly extended NASA data to the large and important hydrology user community that has been, heretofore, mostly unable to easily access and use NASA data.

  13. Between a Map and a Data Rod

    NASA Astrophysics Data System (ADS)

    Teng, W. L.; Rui, H.; Strub, R. F.; Vollmer, B.

    2015-12-01

    A "Digital Divide" has long stood between how NASA and other satellite-derived data are typically archived (time-step arrays or "maps") and how hydrology and other point-time series oriented communities prefer to access those data. In essence, the desired method of data access is orthogonal to the way the data are archived. Our approach to bridging the Divide is part of a larger NASA-supported "data rods" project to enhance access to and use of NASA and other data by the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Hydrologic Information System (HIS) and the larger hydrology community. Our main objective was to determine a way to reorganize data that is optimal for these communities. Two related objectives were to optimally reorganize data in a way that (1) is operational and fits in and leverages the existing Goddard Earth Sciences Data and Information Services Center (GES DISC) operational environment and (2) addresses the scaling up of data sets available as time series from those archived at the GES DISC to potentially include those from other Earth Observing System Data and Information System (EOSDIS) data archives. Through several prototype efforts and lessons learned, we arrived at a non-database solution that satisfied our objectives/constraints. We describe, in this presentation, how we implemented the operational production of pre-generated data rods and, considering the tradeoffs between length of time series (or number of time steps), resources needed, and performance, how we implemented the operational production of on-the-fly ("virtual") data rods. For the virtual data rods, we leveraged a number of existing resources, including the NASA Giovanni Cache and NetCDF Operators (NCO) and used data cubes processed in parallel. Our current benchmark performance for virtual generation of data rods is about a year's worth of time series for hourly data (~9,000 time steps) in ~90 seconds. Our approach is a specific implementation of the general optimal strategy of reorganizing data to match the desired means of access. Results from our project have already significantly extended NASA data to the large and important hydrology user community that has been, heretofore, mostly unable to easily access and use NASA data.

  14. How are you feeling?: A personalized methodology for predicting mental states from temporally observable physical and behavioral information.

    PubMed

    Tuarob, Suppawong; Tucker, Conrad S; Kumara, Soundar; Giles, C Lee; Pincus, Aaron L; Conroy, David E; Ram, Nilam

    2017-04-01

    It is believed that anomalous mental states such as stress and anxiety not only cause suffering for the individuals, but also lead to tragedies in some extreme cases. The ability to predict the mental state of an individual at both current and future time periods could prove critical to healthcare practitioners. Currently, the practical way to predict an individual's mental state is through mental examinations that involve psychological experts performing the evaluations. However, such methods can be time- and resource-consuming, limiting their broad applicability to a wide population. Furthermore, some individuals may also be unaware of their mental states or may feel uncomfortable expressing themselves during the evaluations. Hence, their anomalous mental states could remain undetected for a prolonged period of time. The objective of this work is to demonstrate the ability of using advanced machine learning based approaches to generate mathematical models that predict current and future mental states of an individual. The problem of mental state prediction is transformed into the time series forecasting problem, where an individual is represented as a multivariate time series stream of monitored physical and behavioral attributes. A personalized mathematical model is then automatically generated to capture the dependencies among these attributes, which is used for prediction of mental states for each individual. In particular, we first illustrate the drawbacks of traditional multivariate time series forecasting methodologies such as vector autoregression. Then, we show that such issues could be mitigated by using machine learning regression techniques which are modified for capturing temporal dependencies in time series data. A case study using the data from 150 human participants illustrates that the proposed machine learning based forecasting methods are more suitable for high-dimensional psychological data than the traditional vector autoregressive model in terms of both magnitude of error and directional accuracy. These results not only present a successful usage of machine learning techniques in psychological studies, but also serve as a building block for multiple medical applications that could rely on an automated system to gauge individuals' mental states. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Wavelet-based time series bootstrap model for multidecadal streamflow simulation using climate indicators

    NASA Astrophysics Data System (ADS)

    Erkyihun, Solomon Tassew; Rajagopalan, Balaji; Zagona, Edith; Lall, Upmanu; Nowak, Kenneth

    2016-05-01

    A model to generate stochastic streamflow projections conditioned on quasi-oscillatory climate indices such as the Pacific Decadal Oscillation (PDO) and the Atlantic Multi-decadal Oscillation (AMO) is presented. Recognizing that each climate index has underlying band-limited components that contribute most of the energy of the signals, we first pursue a wavelet decomposition of the signals to identify and reconstruct these features from annually resolved historical data and proxy-based paleoreconstructions of each climate index covering the period from 1650 to 2012. A K-Nearest Neighbor block bootstrap approach is then developed to simulate the total signal of each of these climate index series while preserving its time-frequency structure and marginal distributions. Finally, given the simulated climate signal time series, a K-Nearest Neighbor bootstrap is used to simulate annual streamflow series conditional on the joint state space defined by the simulated climate indices for each year. We demonstrate this method by applying it to the simulation of streamflow at the Lees Ferry gauge on the Colorado River using indices of two large-scale climate forcings, the PDO and the AMO, which are known to modulate Colorado River Basin (CRB) hydrology at multidecadal time scales. Skill in the stochastic simulation of multidecadal flow projections using this approach is demonstrated.
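
    A minimal sketch of the conditional K-Nearest Neighbor bootstrap step: resample historical annual flow conditioned on the state of a simulated climate index. The synthetic index and flow series stand in for the reconstructed PDO/AMO indices and Lees Ferry flows, and the 1/rank neighbour weighting is a common choice assumed here.

```python
# Conditional K-Nearest Neighbor bootstrap: resample historical annual flow
# given the state of a simulated climate index. Synthetic data and the 1/rank
# neighbour weighting are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
hist_index = rng.normal(size=300)                            # historical climate index
hist_flow = 15.0 + 2.0 * hist_index + rng.normal(size=300)   # paired annual flows

def knn_bootstrap_flow(sim_index, k=10):
    """For each simulated index value, draw a historical flow from its k nearest neighbours."""
    flows = []
    for value in sim_index:
        nn = np.argsort(np.abs(hist_index - value))[:k]      # neighbours in index space
        w = 1.0 / np.arange(1, k + 1)                        # weight neighbours by 1/rank
        flows.append(hist_flow[rng.choice(nn, p=w / w.sum())])
    return np.array(flows)

simulated_index = rng.normal(size=50)                        # e.g. one 50-year trace
print(knn_bootstrap_flow(simulated_index)[:5].round(2))
```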

  16. The Fourier decomposition method for nonlinear and non-stationary time series analysis

    PubMed Central

    Joshi, Shiv Dutt; Patney, Rakesh Kumar; Saha, Kaushik

    2017-01-01

    For many decades, there has been a general perception in the literature that Fourier methods are not suitable for the analysis of nonlinear and non-stationary data. In this paper, we propose a novel and adaptive Fourier decomposition method (FDM), based on the Fourier theory, and demonstrate its efficacy for the analysis of nonlinear and non-stationary time series. The proposed FDM decomposes any data into a small number of ‘Fourier intrinsic band functions’ (FIBFs). The FDM presents a generalized Fourier expansion with variable amplitudes and variable frequencies of a time series by the Fourier method itself. We also propose a zero-phase filter bank-based multivariate FDM (MFDM) for the analysis of multivariate nonlinear and non-stationary time series, using the FDM. We further present an algorithm to obtain cut-off frequencies for the MFDM. The proposed MFDM generates a finite number of band-limited multivariate FIBFs (MFIBFs). The MFDM preserves some intrinsic physical properties of the multivariate data, such as scale alignment, trend and instantaneous frequency. The proposed methods provide a time-frequency-energy (TFE) distribution that reveals the intrinsic structure of the data. Numerical computations and simulations have been carried out, and comparisons are made with the empirical mode decomposition algorithms. PMID:28413352
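
    A minimal sketch of a zero-phase Fourier filter bank of the kind the MFDM builds on: the signal is split into band-limited components that sum back to the original, with no phase delay introduced. The fixed cut-off frequencies are illustrative; the FDM/MFDM algorithms select the bands adaptively.

```python
# Zero-phase Fourier filter bank: split a real signal into band-limited
# components that sum back to the original; the cut-offs here are fixed and
# illustrative, whereas FDM/MFDM chooses bands adaptively.
import numpy as np

def fourier_band_decomposition(x, fs, edges):
    """Return one band-limited component per frequency band defined by `edges` (Hz)."""
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    bands = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (freqs >= lo) & (freqs < hi)
        bands.append(np.fft.irfft(X * mask, n=len(x)))  # zero phase: no delay introduced
    return bands

fs = 100.0
t = np.arange(0.0, 10.0, 1.0 / fs)
x = np.sin(2 * np.pi * 2 * t) + 0.5 * np.sin(2 * np.pi * 15 * t)
bands = fourier_band_decomposition(x, fs, edges=[0.0, 5.0, 51.0])
print("components sum back to the signal:", np.allclose(sum(bands), x))
```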

  17. Long-term variability in Northern Hemisphere snow cover and associations with warmer winters

    USGS Publications Warehouse

    McCabe, Gregory J.; Wolock, David M.

    2010-01-01

    A monthly snow accumulation and melt model is used with gridded monthly temperature and precipitation data for the Northern Hemisphere to generate time series of March snow-covered area (SCA) for the period 1905 through 2002. The time series of estimated SCA for March is verified by comparison with previously published time series of SCA for the Northern Hemisphere. The time series of estimated Northern Hemisphere March SCA shows a substantial decrease since about 1970, and this decrease corresponds to an increase in mean winter Northern Hemisphere temperature. The increase in winter temperature has caused a decrease in the fraction of precipitation that occurs as snow and an increase in snowmelt for some parts of the Northern Hemisphere, particularly the mid-latitudes, thus reducing snow packs and March SCA. In addition, the increase in winter temperature and the decreases in SCA appear to be associated with a contraction of the circumpolar vortex and a poleward movement of storm tracks, resulting in decreased precipitation (and snow) in the low- to mid-latitudes and an increase in precipitation (and snow) in high latitudes. If Northern Hemisphere winter temperatures continue to warm as they have since the 1970s, then March SCA will likely continue to decrease.

  18. Long-term variability in Northern Hemisphere snow cover and associations with warmer winters

    USGS Publications Warehouse

    McCabe, G.J.; Wolock, D.M.

    2010-01-01

    A monthly snow accumulation and melt model is used with gridded monthly temperature and precipitation data for the Northern Hemisphere to generate time series of March snow-covered area (SCA) for the period 1905 through 2002. The time series of estimated SCA for March is verified by comparison with previously published time series of SCA for the Northern Hemisphere. The time series of estimated Northern Hemisphere March SCA shows a substantial decrease since about 1970, and this decrease corresponds to an increase in mean winter Northern Hemisphere temperature. The increase in winter temperature has caused a decrease in the fraction of precipitation that occurs as snow and an increase in snowmelt for some parts of the Northern Hemisphere, particularly the mid-latitudes, thus reducing snow packs and March SCA. In addition, the increase in winter temperature and the decreases in SCA appear to be associated with a contraction of the circumpolar vortex and a poleward movement of storm tracks, resulting in decreased precipitation (and snow) in the low- to mid-latitudes and an increase in precipitation (and snow) in high latitudes. If Northern Hemisphere winter temperatures continue to warm as they have since the 1970s, then March SCA will likely continue to decrease. © 2009 Springer Science+Business Media B.V.

  19. Monitoring cotton root rot by synthetic Sentinel-2 NDVI time series using improved spatial and temporal data fusion.

    PubMed

    Wu, Mingquan; Yang, Chenghai; Song, Xiaoyu; Hoffmann, Wesley Clint; Huang, Wenjiang; Niu, Zheng; Wang, Changyao; Li, Wang; Yu, Bo

    2018-01-31

    To better understand the progression of cotton root rot within the season, time series monitoring is required. In this study, an improved spatial and temporal data fusion approach (ISTDFA) was employed to combine 250-m Moderate Resolution Imaging Spectroradiometer (MODIS) Normalized Difference Vegetation Index (NDVI) and 10-m Sentinel-2 NDVI data to generate a synthetic Sentinel-2 NDVI time series for monitoring this disease. Then, the phenology of healthy cotton and infected cotton was modeled using a logistic model. Finally, several phenology parameters, including the onset day of greenness minimum (OGM), growing season length (GSL), onset of greenness increase (OGI), maximum NDVI value, and integral area of the phenology curve, were calculated. The results showed that ISTDFA could be used to combine time series MODIS and Sentinel-2 NDVI data with a correlation coefficient of 0.893. The logistic model could describe the phenology curves with R-squared values from 0.791 to 0.969. Moreover, the phenology curve of infected cotton showed a significant difference from that of healthy cotton. The maximum NDVI value, OGM, GSL and the integral area of the phenology curve for infected cotton were reduced by 0.045, 30 days, 22 days, and 18.54%, respectively, compared with those for healthy cotton.
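
    A minimal sketch of fitting a logistic curve to an NDVI time series and reading off simple phenology metrics. The synthetic NDVI values, the single-rise logistic form, and the metric definitions are illustrative assumptions rather than the exact model of the paper.

```python
# Fit a single-rise logistic phenology curve to synthetic NDVI observations and
# derive simple metrics; the data and parameter values are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def logistic(doy, base, amp, k, t0):
    """NDVI rising logistically from `base` by `amp`, centered on day-of-year t0."""
    return base + amp / (1.0 + np.exp(-k * (doy - t0)))

doy = np.arange(90.0, 250.0, 10.0)                       # observation days of year
rng = np.random.default_rng(2)
ndvi = logistic(doy, 0.2, 0.55, 0.08, 150.0) + rng.normal(0.0, 0.02, doy.size)

params, _ = curve_fit(logistic, doy, ndvi, p0=[0.2, 0.5, 0.1, 150.0])
base, amp, k, t0 = params
print(f"onset of greenness increase (inflection) ~ day {t0:.0f}")
print(f"max NDVI ~ {base + amp:.2f}")
```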

  20. FELIX-1.0: A finite element solver for the time dependent generator coordinate method with the Gaussian overlap approximation

    NASA Astrophysics Data System (ADS)

    Regnier, D.; Verrière, M.; Dubray, N.; Schunck, N.

    2016-03-01

    We describe the software package FELIX that solves the equations of the time-dependent generator coordinate method (TDGCM) in N dimensions (N ≥ 1) under the Gaussian overlap approximation. The numerical resolution is based on the Galerkin finite element discretization of the collective space and the Crank-Nicolson scheme for time integration. The TDGCM solver is implemented entirely in C++. Several additional tools written in C++, Python or bash scripting language are also included for convenience. In this paper, the solver is tested with a series of benchmark calculations. We also demonstrate the ability of our code to handle a realistic calculation of fission dynamics.
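
    A minimal sketch of Crank-Nicolson time stepping on a toy 1-D diffusion problem, to illustrate the implicit scheme mentioned above; this stands in for, and is much simpler than, the finite element TDGCM equations solved by FELIX.

```python
# Toy 1-D heat equation u_t = u_xx advanced with the Crank-Nicolson scheme;
# this only illustrates the time integrator named above, not the TDGCM itself.
import numpy as np

def crank_nicolson(u0, r, steps):
    """Advance interior values with Dirichlet-zero boundaries; r = dt / dx**2."""
    n = len(u0)
    # Second-difference (Laplacian) stencil on the interior nodes.
    L = -2.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
    A = np.eye(n) - 0.5 * r * L        # implicit half of the update
    B = np.eye(n) + 0.5 * r * L        # explicit half of the update
    u = u0.copy()
    for _ in range(steps):
        u = np.linalg.solve(A, B @ u)  # unconditionally stable, 2nd order in time
    return u

x = np.linspace(0.0, 1.0, 51)
u0 = np.sin(np.pi * x)[1:-1]           # interior nodes; ends held at zero
u = crank_nicolson(u0, r=0.5, steps=200)
print("peak after 200 steps:", round(u.max(), 3))  # analytic decay is exp(-pi**2 * t)
```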

  1. Generating Dynamic Persistence in the Time Domain

    NASA Astrophysics Data System (ADS)

    Guerrero, A.; Smith, L. A.; Smith, L. A.; Kaplan, D. T.

    2001-12-01

    Many dynamical systems present long-range correlations. Physically, these systems range from biological to economic, and include geological and urban systems. Important geophysical candidates for this type of behaviour include weather (or climate) and earthquake sequences. Persistence is characterised by a slowly decaying correlation function that, in theory, never dies out. The persistence exponent reflects the degree of memory in the system, and much effort has been expended creating and analysing methods that successfully estimate this parameter and model data that exhibit persistence. The most widely used methods for generating long-correlated time series are not dynamical systems in the time domain, but instead are derived from a given spectral density. Little attention has been paid to modelling persistence in the time domain. The time domain approach has the advantage that an observation at a certain time can be calculated from previous observations, which is particularly suitable when investigating the predictability of a long memory process. We describe two of these methods in the time domain. One is a traditional approach using fractional ARIMA (autoregressive integrated moving average) models; the second uses a novel approach to extending a given series using random Fourier basis functions. The statistical quality of the two methods is compared, and they are contrasted with weather data that reportedly show persistence. The suitability of this approach both for estimating predictability and for making predictions is discussed.
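
    A minimal sketch of one time-domain route to persistence: generating a fractionally integrated ARFIMA(0, d, 0) series from its truncated moving-average expansion, so each value is an explicit weighted sum of past innovations. The truncation length and the value of d are illustrative choices.

```python
# Generate a long-memory ARFIMA(0, d, 0) series in the time domain via the
# truncated MA(infinity) expansion of (1 - B)**(-d); trunc and d are
# illustrative choices.
import numpy as np

def arfima_0d0(n, d, trunc=1000, seed=0):
    """Simulate x_t = (1 - B)^(-d) e_t using truncated moving-average weights."""
    rng = np.random.default_rng(seed)
    psi = np.empty(trunc)
    psi[0] = 1.0
    for k in range(1, trunc):
        psi[k] = psi[k - 1] * (k - 1 + d) / k   # psi_k = Gamma(k + d) / (Gamma(d) * k!)
    e = rng.normal(size=n + trunc)              # white-noise innovations
    # Each observation is a weighted sum of the current and past innovations.
    return np.convolve(e, psi)[trunc - 1:trunc - 1 + n]

x = arfima_0d0(n=5000, d=0.3, seed=42)          # 0 < d < 0.5 gives stationary long memory
print(x[:5].round(3))
```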

  2. Multi-scale clustering of functional data with application to hydraulic gradients in wetlands

    USGS Publications Warehouse

    Greenwood, Mark C.; Sojda, Richard S.; Sharp, Julia L.; Peck, Rory G.; Rosenberry, Donald O.

    2011-01-01

    A new set of methods is developed to perform cluster analysis of functions, motivated by a data set consisting of hydraulic gradients at several locations distributed across a wetland complex. The methods build on previous work on clustering of functions, such as Tarpey and Kinateder (2003) and Hitchcock et al. (2007), but explore functions generated from an additive model decomposition (Wood, 2006) of the original time series. Our decomposition targets two aspects of the series, using an adaptive smoother for the trend and a circular spline for the diurnal variation in the series. Different measures for comparing locations are discussed, including a method for efficiently clustering time series of different lengths using a functional data approach. The complicated nature of these wetlands is highlighted by the shifting group memberships depending on which scale of variation and year of the study are considered.

  3. Cross-bispectrum computation and variance estimation

    NASA Technical Reports Server (NTRS)

    Lii, K. S.; Helland, K. N.

    1981-01-01

    A method for the estimation of cross-bispectra of discrete real time series is developed. The asymptotic variance properties of the bispectrum are reviewed, and a method for the direct estimation of bispectral variance is given. The symmetry properties are described which minimize the computations necessary to obtain a complete estimate of the cross-bispectrum in the right-half-plane. A procedure is given for computing the cross-bispectrum by subdividing the domain into rectangular averaging regions which help reduce the variance of the estimates and allow easy application of the symmetry relationships to minimize the computational effort. As an example of the procedure, the cross-bispectrum of a numerically generated, exponentially distributed time series is computed and compared with theory.

  4. Geodesic regression for image time-series.

    PubMed

    Niethammer, Marc; Huang, Yang; Vialard, François-Xavier

    2011-01-01

    Registration of image-time series has so far been accomplished (i) by concatenating registrations between image pairs, (ii) by solving a joint estimation problem resulting in piecewise geodesic paths between image pairs, (iii) by kernel based local averaging or (iv) by augmenting the joint estimation with additional temporal irregularity penalties. Here, we propose a generative model extending least squares linear regression to the space of images by using a second-order dynamic formulation for image registration. Unlike previous approaches, the formulation allows for a compact representation of an approximation to the full spatio-temporal trajectory through its initial values. The method also opens up possibilities to design image-based approximation algorithms. The resulting optimization problem is solved using an adjoint method.

  5. Reconstruction method for data protection in telemedicine systems

    NASA Astrophysics Data System (ADS)

    Buldakova, T. I.; Suyatinov, S. I.

    2015-03-01

    This report proposes an approach to protecting transmitted data by creating paired symmetric keys for the sensor and the receiver. Since biosignals are unique to each person, appropriate processing of them yields the information needed to create cryptographic keys. The processing is based on reconstructing a mathematical model that generates time series diagnostically equivalent to the initial biosignals. Information about the model is transmitted to the receiver, where the physiological time series are restored using the reconstructed model. Thus, the information about the structure and parameters of the biosystem model obtained in the reconstruction process can be used not only for diagnostics, but also for protecting transmitted data in telemedicine complexes.

  6. Convergent Cross Mapping: Basic concept, influence of estimation parameters and practical application.

    PubMed

    Schiecke, Karin; Pester, Britta; Feucht, Martha; Leistritz, Lutz; Witte, Herbert

    2015-01-01

    In neuroscience, data are typically generated from neural network activity. Complex interactions between the measured time series are involved, and nothing or only little is known about the underlying dynamic system. Convergent Cross Mapping (CCM) provides the possibility to investigate nonlinear causal interactions between time series by using nonlinear state space reconstruction. The aim of this study is to investigate the general applicability of CCM and to show its potential and limitations. The influence of estimation parameters is demonstrated by means of simulated data, and an interval-based application of CCM to real data is adapted for the investigation of interactions between heart rate and specific EEG components of children with temporal lobe epilepsy.
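
    A minimal sketch of Convergent Cross Mapping under simple assumptions: delay-embed the putatively affected series and use its nearest neighbours to cross-map (estimate) the putative driver; a high correlation between the estimate and the true driver is taken as evidence of causal influence. Coupled logistic maps stand in for the heart rate and EEG series of the study, and the embedding parameters are arbitrary.

```python
# Toy Convergent Cross Mapping: estimate the driver x from the delay embedding
# of the response y. Coupled logistic maps replace the real physiological data;
# E, tau, and the coupling strength are arbitrary illustrative choices.
import numpy as np

def embed(x, E, tau):
    """Delay-embed x into E-dimensional state vectors with lag tau."""
    n = len(x) - (E - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(E)])

def cross_map_skill(source, target, E=3, tau=1):
    """Correlation between `source` and its estimate cross-mapped from `target`."""
    M = embed(target, E, tau)                   # shadow manifold of the affected series
    src = source[(E - 1) * tau:]                # source values aligned with embedded points
    est = np.empty(len(M))
    for i, v in enumerate(M):
        d = np.linalg.norm(M - v, axis=1)
        d[i] = np.inf                           # exclude the point itself
        nn = np.argsort(d)[:E + 1]              # E + 1 nearest neighbours (simplex)
        w = np.exp(-d[nn] / max(d[nn][0], 1e-12))
        est[i] = np.sum(w * src[nn]) / w.sum()  # weighted average of mapped source values
    return np.corrcoef(src, est)[0, 1]

# Two coupled logistic maps in which x drives y, so y's manifold should recover x.
n = 400
x = np.empty(n); y = np.empty(n); x[0], y[0] = 0.4, 0.2
for t in range(n - 1):
    x[t + 1] = x[t] * (3.8 - 3.8 * x[t])
    y[t + 1] = y[t] * (3.5 - 3.5 * y[t] - 0.1 * x[t])
print("skill of recovering x from y:", round(cross_map_skill(x, y), 2))
```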

  7. The SBAS Sentinel-1 Surveillance service for automatic and systematic generation of Earth surface displacement within the GEP platform.

    NASA Astrophysics Data System (ADS)

    Casu, Francesco; De Luca, Claudio; Lanari, Riccardo; Manunta, Michele; Zinno, Ivana

    2017-04-01

    The Geohazards Exploitation Platform (GEP) is an ESA activity of the Earth Observation (EO) ground segment to demonstrate the benefit of new technologies for large-scale processing of EO data. GEP aims at providing both on-demand processing services for scientific users of the geohazards community and an integration platform for new EO data analysis processors dedicated to scientists and other expert users. In the remote sensing scenario, a crucial role is played by the recently launched Sentinel-1 (S1) constellation that, with its global acquisition policy, has literally flooded the scientific community with a huge amount of data acquired over a large part of the Earth on a regular basis (down to 6 days with both Sentinel-1A and 1B passes). Moreover, the S1 data, as part of the European Copernicus program, are openly and freely accessible, thus fostering their use for the development of tools for Earth surface monitoring. In particular, due to their specific SAR Interferometry (InSAR) design, Sentinel-1 satellites can be exploited to build up operational services for the generation of advanced interferometric products that can be very useful within risk management and natural hazard monitoring scenarios. Accordingly, in this work we present the activities carried out for the development, integration, and deployment of the SBAS Sentinel-1 Surveillance service of CNR-IREA within the GEP platform. This service is based on a parallel implementation of the SBAS approach, referred to as P-SBAS, able to run effectively in large distributed computing infrastructures (grid and cloud) and to allow for efficient processing of large SAR data sequences with advanced DInSAR approaches. In particular, the Surveillance service developed on the GEP platform consists of the systematic and automatic processing of Sentinel-1 data over selected Areas of Interest (AoI) to generate updated surface displacement time series via the SBAS-InSAR algorithm. We built up a system that is automatically triggered by every new S1 acquisition over the AoI, once it is available in the S1 catalogue. Then, taking advantage of the SBAS results generated by previous runs of the service, the system processes only the new acquisitions, thus saving storage space and computing time, and finally generates an updated SBAS time series. The same P-SBAS processor underlying the Surveillance service is also available through the GEP as a standard on-demand DInSAR service, thus allowing the scientific community to generate S1 SBAS time series over areas not covered by the Surveillance service itself. It is worth noting that the SBAS Sentinel-1 Surveillance service on GEP represents the core of the EPOSAR service, which will deliver S1 displacement time series of the Earth's surface on a regular basis for the European Plate Observing System (EPOS) Research Infrastructure community. In particular, the main goal of EPOSAR is to contribute advanced techniques and methods, whose effectiveness and relevance have already been well demonstrated, to investigating the physical processes controlling earthquakes, volcanic eruptions and unrest episodes, as well as those driving tectonics and Earth surface dynamics.

  8. Two-pass imputation algorithm for missing value estimation in gene expression time series.

    PubMed

    Tsiporkova, Elena; Boeva, Veselka

    2007-10-01

    Gene expression microarray experiments frequently generate datasets with multiple values missing. However, most of the analysis, mining, and classification methods for gene expression data require a complete matrix of gene array values. Therefore, the accurate estimation of missing values in such datasets has been recognized as an important issue, and several imputation algorithms have already been proposed to the biological community. Most of these approaches, however, are not particularly suitable for time series expression profiles. In view of this, we propose a novel imputation algorithm, which is specially suited for the estimation of missing values in gene expression time series data. The algorithm utilizes Dynamic Time Warping (DTW) distance in order to measure the similarity between time expression profiles, and subsequently selects, for each gene expression profile with missing values, a dedicated set of candidate profiles for estimation. Three different DTW-based imputation (DTWimpute) algorithms have been considered: position-wise, neighborhood-wise, and two-pass imputation. These have initially been prototyped in Perl, and their accuracy has been evaluated on yeast expression time series data using several different parameter settings. The experiments have shown that the two-pass algorithm consistently outperforms the neighborhood-wise and position-wise algorithms, particularly for datasets with a higher level of missing entries. The performance of the two-pass DTWimpute algorithm has further been benchmarked against the weighted K-Nearest Neighbors algorithm, which is widely used in the biological community; the former proved superior to the latter. Motivated by these findings, which clearly indicate the added value of the DTW techniques for missing value estimation in time series data, we have built an optimized C++ implementation of the two-pass DTWimpute algorithm. The software also provides a choice between three different initial rough imputation methods.
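
    A minimal sketch of the Dynamic Time Warping distance used to rank candidate profiles, written as a plain O(nm) dynamic program; the optimized implementations described in the paper are of course more elaborate.

```python
# Plain O(n*m) Dynamic Time Warping distance between two 1-D expression
# profiles; smaller values mean more similar shapes.
import numpy as np

def dtw_distance(a, b):
    """DTW distance between two 1-D profiles."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three allowed warping moves.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

profile_a = np.array([0.1, 0.5, 1.2, 0.9, 0.3])
profile_b = np.array([0.1, 0.2, 0.6, 1.1, 1.0, 0.2])   # similar shape, shifted in time
print(round(dtw_distance(profile_a, profile_b), 3))
```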

  9. Statistical significance approximation in local trend analysis of high-throughput time-series data using the theory of Markov chains.

    PubMed

    Xia, Li C; Ai, Dongmei; Cram, Jacob A; Liang, Xiaoyi; Fuhrman, Jed A; Sun, Fengzhu

    2015-09-21

    Local trend (i.e. shape) analysis of time series data reveals co-changing patterns in the dynamics of biological systems. However, slow permutation procedures to evaluate the statistical significance of local trend scores have limited its application to high-throughput time series data analysis, e.g., data from next-generation sequencing based studies. By extending the theories for the tail probability of the range of sums of Markovian random variables, we propose formulae for approximating the statistical significance of local trend scores. Using simulations and real data, we show that the approximate p-value is close to that obtained using a large number of permutations (starting at time points >20 with no delay and >30 with delay of at most three time steps), in that the non-zero decimals of the p-values obtained by the approximation and the permutations are mostly the same when the approximate p-value is less than 0.05. In addition, the approximate p-value is slightly larger than that based on permutations, making hypothesis testing based on the approximate p-value conservative. The approximation enables efficient calculation of p-values for pairwise local trend analysis, making large-scale all-versus-all comparisons possible. We also propose a hybrid approach by integrating the approximation and permutations to obtain accurate p-values for significantly associated pairs. We further demonstrate its use with the analysis of the Plymouth Marine Laboratory (PML) microbial community time series from high-throughput sequencing data and found interesting organism co-occurrence dynamic patterns. The software tool is integrated into the eLSA software package that now provides accelerated local trend and similarity analysis pipelines for time series data. The package is freely available from the eLSA website: http://bitbucket.org/charade/elsa.

  10. Spatio-temporal prediction of daily temperatures using time-series of MODIS LST images

    NASA Astrophysics Data System (ADS)

    Hengl, Tomislav; Heuvelink, Gerard B. M.; Perčec Tadić, Melita; Pebesma, Edzer J.

    2012-01-01

    A computational framework to generate daily temperature maps using time series of publicly available MODIS MOD11A2 Land Surface Temperature (LST) images (1 km resolution; 8-day composites) is illustrated using temperature measurements from the national network of meteorological stations (159) in Croatia. The input data set contains 57,282 ground measurements of daily temperature for the year 2008. Temperature was modeled as a function of latitude, longitude, distance from the sea, elevation, time, insolation, and the MODIS LST images. The original rasters were first converted to principal components to reduce noise and filter missing pixels in the LST images. The residuals were next analyzed for spatio-temporal auto-correlation; sum-metric separable variograms were fitted to account for zonal and geometric space-time anisotropy. The final predictions were generated for time-slices of a 3D space-time cube, constructed in the R environment for statistical computing. The results show that the space-time regression model can explain a significant part of the variation in station data (84%). MODIS LST 8-day (cloud-free) images are an unbiased estimator of daily temperature, but with relatively low precision (±4.1°C); their added value, however, is that they systematically improve detection of local changes in land surface temperature due to local meteorological conditions and/or active heat sources (urban areas, land cover classes). The results of 10-fold cross-validation show that the use of spatio-temporal regression-kriging and the incorporation of time series of remote sensing images lead to significantly more accurate maps of temperature than if plain spatial techniques were used. The average (global) accuracy of mapping temperature was ±2.4°C. The regression-kriging explained 91% of the variability in daily temperatures, compared to 44% for ordinary kriging. Further software advancements, such as interactive space-time variogram exploration and automated retrieval, resampling and filtering of MODIS images, are anticipated.

  11. Synthesis, characterization, and relative stabilities of self-assembled monolayers on gold generated from bidentate n-alkyl xanthic acids.

    PubMed

    Moore, H Justin; Colorado, Ramon; Lee, Han Ju; Jamison, Andrew C; Lee, T Randall

    2013-08-27

    A series of self-assembled monolayers (SAMs) on gold were generated by the adsorption of n-alkyl xanthic acids (NAXAs) having the general formula CH3(CH2)nOCS2H (n = 12-15). The structural features of these SAMs were characterized by optical ellipsometry, contact angle goniometry, polarization modulation infrared reflection absorption spectroscopy (PM-IRRAS), and X-ray photoelectron spectroscopy (XPS). This series of xanthate SAMs were compared to SAMs generated from the corresponding n-alkanethiols and aliphatic dithiocarboxylic acids (ADTCAs). The collected data indicate that the NAXAs generate densely packed and well-ordered monolayers. The contact angles of hexadecane on the xanthate monolayers exhibited a large "odd-even" effect similar to that produced by the ADTCA SAMs. The relative stability of these bidentate xanthate SAMs was evaluated by monitoring the changes in ellipsometric thicknesses and wettability as a function of time under various conditions. The results demonstrate that SAMs formed from NAXAs are much less stable than analogous n-alkanethiolate and ADTCA SAMs.

  12. Combining synthetic controls and interrupted time series analysis to improve causal inference in program evaluation.

    PubMed

    Linden, Ariel

    2018-04-01

    Interrupted time series analysis (ITSA) is an evaluation methodology in which a single treatment unit's outcome is studied over time and the intervention is expected to "interrupt" the level and/or trend of the outcome. The internal validity is strengthened considerably when the treated unit is contrasted with a comparable control group. In this paper, we introduce a robust evaluation framework that combines the synthetic controls method (SYNTH) to generate a comparable control group and ITSA regression to assess covariate balance and estimate treatment effects. We evaluate the effect of California's Proposition 99 for reducing cigarette sales, by comparing California to other states not exposed to smoking reduction initiatives. SYNTH is used to reweight nontreated units to make them comparable to the treated unit. These weights are then used in ITSA regression models to assess covariate balance and estimate treatment effects. Covariate balance was achieved for all but one covariate. While California experienced a significant decrease in the annual trend of cigarette sales after Proposition 99, there was no statistically significant treatment effect when compared to synthetic controls. The advantage of using this framework over regression alone is that it ensures that a comparable control group is generated. Additionally, it offers a common set of statistical measures familiar to investigators, the capability for assessing covariate balance, and enhancement of the evaluation with a comprehensive set of postestimation measures. Therefore, this robust framework should be considered as a primary approach for evaluating treatment effects in multiple group time series analysis. © 2018 John Wiley & Sons, Ltd.

  13. Uncertain Classification of Variable Stars: Handling Observational GAPS and Noise

    NASA Astrophysics Data System (ADS)

    Castro, Nicolás; Protopapas, Pavlos; Pichara, Karim

    2018-01-01

    Automatic classification methods applied to sky surveys have revolutionized the astronomical target selection process. Most surveys generate a vast number of time series, or “lightcurves,” that represent the brightness variability of stellar objects over time. Unfortunately, lightcurve observations take several years to complete, producing truncated time series to which automatic classifiers are generally not applied until the observations are finished. This happens because state-of-the-art methods rely on a variety of statistical descriptors or features that present an increasing degree of dispersion as the number of observations decreases, which reduces their precision. In this paper, we propose a novel method that increases the performance of automatic classifiers of variable stars by incorporating the deviations that scarcity of observations produces. Our method uses Gaussian process regression to form a probabilistic model of each lightcurve’s observations. Then, based on this model, bootstrapped samples of the time series features are generated. Finally, a bagging approach is used to improve the overall performance of the classification. We perform tests on the MAssive Compact Halo Object (MACHO) and Optical Gravitational Lensing Experiment (OGLE) catalogs; the results show that our method effectively classifies some variability classes using a small fraction of the original observations. For example, we found that RR Lyrae stars can be classified with ~80% accuracy just by observing the first 5% of the whole lightcurves’ observations in the MACHO and OGLE catalogs. We believe these results prove that, when studying lightcurves, it is important to consider the features’ error and how the measurement process impacts it.

  14. A multi-temporal analysis approach for land cover mapping in support of nuclear incident response

    NASA Astrophysics Data System (ADS)

    Sah, Shagan; van Aardt, Jan A. N.; McKeown, Donald M.; Messinger, David W.

    2012-06-01

    Remote sensing can be used to rapidly generate land use maps for assisting emergency response personnel with resource deployment decisions and impact assessments. In this study we focus on constructing accurate land cover maps to map the impacted area in the case of a nuclear material release. The proposed methodology involves integration of results from two different approaches to increase classification accuracy. The data used included RapidEye scenes over the Nine Mile Point Nuclear Power Station (Oswego, NY). The first step was building a coarse-scale land cover map from freely available, high temporal resolution MODIS data using a time-series approach. In the case of a nuclear accident, high spatial resolution commercial satellites such as RapidEye or IKONOS can acquire images of the affected area. Land use maps from the two image sources were integrated using a probability-based approach. Classification results were obtained for four land classes - forest, urban, water and vegetation - using Euclidean and Mahalanobis distances as metrics. Despite the coarse resolution of MODIS pixels, acceptable accuracies were obtained using time series features. The overall accuracies using the fusion-based approach were in the neighborhood of 80% when compared with GIS data sets from New York State. The classifications were augmented using this fused approach, with a few supplementary advantages such as correction for cloud cover and independence from time of year. We concluded that this method would generate highly accurate land cover maps using coarse spatial resolution time series satellite imagery and a single-date, high spatial resolution, multi-spectral image.

  15. Univariate Time Series Prediction of Solar Power Using a Hybrid Wavelet-ARMA-NARX Prediction Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nazaripouya, Hamidreza; Wang, Yubo; Chu, Chi-Cheng

    This paper proposes a new hybrid method for super short-term solar power prediction. Solar output power usually has a complex, nonstationary, and nonlinear characteristic due to the intermittent and time-varying behavior of solar radiance. In addition, solar power dynamics are fast and essentially inertia-less. An accurate super short-term prediction is required to compensate for the fluctuations and reduce the impact of solar power penetration on the power system. The objective is to predict one-step-ahead solar power generation based only on historical solar power time series data. The proposed method incorporates discrete wavelet transform (DWT), Auto-Regressive Moving Average (ARMA) models, and Recurrent Neural Networks (RNN), where the RNN architecture is based on Nonlinear Auto-Regressive models with eXogenous inputs (NARX). The wavelet transform is utilized to decompose the solar power time series into a set of better-behaved constituent series for prediction. The ARMA model is employed as a linear predictor, while NARX is used as a nonlinear pattern recognition tool to estimate and compensate for the error of the wavelet-ARMA prediction. The proposed method is applied to data captured from UCLA solar PV panels, and the results are compared with some of the most common and most recent solar power prediction methods. The results validate the effectiveness of the proposed approach and show a considerable improvement in prediction precision.
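
    A minimal sketch of the wavelet-plus-linear part of such a hybrid: split the series into a smooth component and a residual with a discrete wavelet transform, and forecast the smooth part with an ARMA-type model, leaving the residual to a nonlinear model (NARX in the paper). It assumes PyWavelets and statsmodels are available, and the synthetic "solar power" series, wavelet, and model orders are placeholders.

```python
# DWT split plus a linear (ARMA-type) forecast of the smooth part; requires
# PyWavelets and statsmodels. The clipped-sine "solar power" series and the
# model orders are placeholders.
import numpy as np
import pywt
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(3)
t = np.arange(512)
power = np.clip(np.sin(np.pi * t / 256), 0, None) + 0.1 * rng.normal(size=t.size)

# Wavelet split: keep the approximation coefficients, zero out the details.
coeffs = pywt.wavedec(power, "db4", level=3)
smooth_coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
smooth = pywt.waverec(smooth_coeffs, "db4")[: len(power)]
residual = power - smooth                    # left for the nonlinear (NARX) stage

# Linear one-step-ahead forecast of the better-behaved smooth component.
arma = ARIMA(smooth, order=(2, 0, 1)).fit()
print("next-step smooth forecast:", round(float(arma.forecast(1)[0]), 3))
```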

  16. Model-Based Design of Long-Distance Tracer Transport Experiments in Plants.

    PubMed

    Bühler, Jonas; von Lieres, Eric; Huber, Gregor J

    2018-01-01

    Studies of long-distance transport of tracer isotopes in plants offer a high potential for functional phenotyping, but so far measurement time is a bottleneck because continuous time series of at least 1 h are required to obtain reliable estimates of transport properties. Hence, usual throughput values are between 0.5 and 1 samples h⁻¹. Here, we propose to increase sample throughput by introducing temporal gaps in the data acquisition of each plant sample and measuring multiple plants one after another in a rotating scheme. In contrast to common time series analysis methods, mechanistic tracer transport models allow the analysis of interrupted time series. The uncertainties of the model parameter estimates are used as a measure of how much information was lost compared to complete time series. A case study was set up to systematically investigate different experimental schedules for different throughput scenarios ranging from 1 to 12 samples h⁻¹. Selected designs with only a small number of data points were found to be sufficient for adequate parameter estimation, implying that the presented approach enables a substantial increase in sample throughput. The presented general framework for automated generation and evaluation of experimental schedules allows the determination of a maximal sample throughput and the respective optimal measurement schedule, depending on the required statistical reliability of data acquired by future experiments.

  17. 'TIME': A Web Application for Obtaining Insights into Microbial Ecology Using Longitudinal Microbiome Data.

    PubMed

    Baksi, Krishanu D; Kuntal, Bhusan K; Mande, Sharmila S

    2018-01-01

    Realization of the importance of microbiome studies, coupled with the decreasing cost of sequencing, has led to the exponential growth of microbiome data. A number of these microbiome studies have focused on understanding changes in the microbial community over time. Such longitudinal microbiome studies have the potential to offer unique insights pertaining to microbial social networks as well as their responses to perturbations. In this communication, we introduce a web based framework called 'TIME' ('Temporal Insights into Microbial Ecology'), developed specifically to obtain meaningful insights from microbiome time series data. The TIME web server is designed to accept a wide range of popular formats as input, with options to preprocess and filter the data. Multiple samples, defined by a series of longitudinal time points along with their metadata information, can be compared in order to interactively visualize the temporal variations. In addition to standard microbiome data analytics, the web server implements popular time series analysis methods such as Dynamic Time Warping, Granger causality and the Dickey-Fuller test to generate interactive layouts that facilitate easy biological inference. Apart from this, a new metric for comparing metagenomic time series data has been introduced to effectively visualize the similarities and differences in the trends of the resident microbial groups. Augmenting the visualizations with stationarity information pertaining to the microbial groups is used to predict microbial competition as well as community structure. Additionally, the 'causality graph analysis' module incorporated in TIME allows predicting taxa that might have a higher influence on community structure in different conditions. TIME also allows users to easily identify potential taxonomic markers from a longitudinal microbiome analysis. We illustrate the utility of the web server features on a few published time series microbiome data sets and demonstrate the ease with which it can be used to perform complex analyses.
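
    A minimal sketch of the two classical tests mentioned above as they might be applied to a pair of taxon abundance series, using statsmodels; the synthetic series and the lag choices are illustrative.

```python
# Sketch using statsmodels; the two synthetic "taxon" abundance series are
# placeholders for real longitudinal microbiome profiles.
import numpy as np
from statsmodels.tsa.stattools import adfuller, grangercausalitytests

rng = np.random.default_rng(4)
taxon_a = 10 + 0.1 * rng.normal(size=200).cumsum()                # slowly drifting abundance
taxon_b = 0.8 * np.roll(taxon_a, 2) + 0.2 * rng.normal(size=200)  # follows taxon_a with lag 2

# Augmented Dickey-Fuller: a small p-value rejects the unit root (stationary series).
print("ADF p-value for taxon_a:", round(adfuller(taxon_a)[1], 3))

# Granger causality: does taxon_a help predict taxon_b? Columns are [effect, cause];
# the call prints a short summary of the tests for each lag up to maxlag.
results = grangercausalitytests(np.column_stack([taxon_b, taxon_a]), maxlag=3)
print("p-value of the lag-2 F-test:", round(results[2][0]["ssr_ftest"][1], 4))
```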

  18. SUSTAINABILITY LOGISTICS BASING SCIENCE AND TECHNOLOGY OBJECTIVE DEMONSTRATION; SELECTED TECHNOLOGY ASSESSMENT

    DTIC Science & Technology

    2018-03-22

    generators by not running them as often and reducing wet-stacking. Force Projection: If the IPDs of the microgrid replace, but don’t add to, the number...decrease generator run time, reduce fuel consumption, enable silent operation, and provide power redundancy for military applications. Important...it requires some failsafe features – run out of water, drive out of the sun. o Integration was a challenge; series of valves to run this experiment

  19. A Multi-Scale Structural Health Monitoring Approach for Damage Detection, Diagnosis and Prognosis in Aerospace Structures

    DTIC Science & Technology

    2012-01-20

    ultrasonic Lamb waves to plastic strain and fatigue life. Theory was developed and validated to predict second harmonic generation for specific mode... Fatigue and damage generation and progression are processes consisting of a series of interrelated events that span large scales of space and time...strain and fatigue life A set of experiments were completed that worked to relate the acoustic nonlinearity measured with Lamb waves to both the

  20. Modeling and Simulation of the Economics of Mining in the Bitcoin Market

    PubMed Central

    Marchesi, Michele

    2016-01-01

    On January 3, 2009, Satoshi Nakamoto gave rise to the “Bitcoin Blockchain”, creating the first block of the chain by hashing on his computer’s central processing unit (CPU). Since then, the hash calculations to mine Bitcoin have been getting more and more complex, and consequently the mining hardware has evolved to adapt to this increasing difficulty. Three generations of mining hardware have followed the CPU generation: the GPU, FPGA and ASIC generations. This work presents an agent-based artificial market model of the Bitcoin mining process and of Bitcoin transactions. The goal of this work is to model the economy of the mining process, starting from the GPU generation, the first with economic significance. The model reproduces some “stylized facts” found in real-time price series and some core aspects of the mining business. In particular, the computational experiments performed can reproduce the unit root property, the fat tail phenomenon and the volatility clustering of Bitcoin price series. In addition, under proper assumptions, they can reproduce the generation of Bitcoins, the hashing capability, the power consumption, and the mining hardware and electrical energy expenditures of the Bitcoin network. PMID:27768691

  1. Influence of El Niño Southern Oscillation on global hydropower production

    NASA Astrophysics Data System (ADS)

    Ng, Jia Yi; Turner, Sean W. D.; Galelli, Stefano

    2017-03-01

    El Niño Southern Oscillation (ENSO) strongly influences the global climate system, affecting hydrology in many of the world’s river basins. This raises the prospect of ENSO-driven variability in global and regional hydroelectric power generation. Here we study these effects by generating time series of power production for 1593 hydropower dams, which collectively represent more than half of the world’s existing installed hydropower capacity. The time series are generated by forcing a detailed dam model with monthly-resolution, 20th century inflows—the model includes plant specifications, storage dynamics and realistic operating schemes, and runs irrespectively of the dam construction year. More than one third of simulated dams exhibit statistically significant annual energy production anomalies in at least one of the two ENSO phases of El Niño and La Niña. For most dams, the variability of relative anomalies in power production tends to be less than that of the forcing inflows—a consequence of dam design specifications, namely maximum turbine release rate and reservoir storage, which allows inflows to accumulate for power generation in subsequent dry years. Production is affected most prominently in Northwest United States, South America, Central America, the Iberian Peninsula, Southeast Asia and Southeast Australia. When aggregated globally, positive and negative energy production anomalies effectively cancel each other out, resulting in a weak and statistically insignificant net global anomaly for both ENSO phases.

  2. Quasi-static time-series simulation using OpenDSS in IEEE distribution feeder model with high PV penetration and its impact on solar forecasting

    NASA Astrophysics Data System (ADS)

    Mohammed, Touseef Ahmed Faisal

    Since 2000, renewable electricity installations in the United States (excluding hydropower) have more than tripled. Renewable electricity has grown at a compounded annual average of nearly 14% per year from 2000-2010. Wind, Concentrated Solar Power (CSP) and solar Photo Voltaic (PV) are the fastest growing renewable energy sectors. In 2010 in the U.S., solar PV grew over 71% and CSP grew by 18% from the previous year. Globally, renewable electricity installations have more than quadrupled from 2000-2010. Solar PV generation grew by a factor of more than 28 between 2000 and 2010. The amount of CSP and solar PV installed on the distribution grid is increasing. These PV installations can transmit electrical current from the load centers back to the generating stations, yet the transmission and distribution grid has been designed for uni-directional flow of electrical energy from generating stations to load centers. This causes voltage imbalances and stresses on the switchgear of the electrical circuitry. With the continuous rise in PV installations, analysis of voltage profiles and penetration levels remains an active area of research. Standard distributed photovoltaic (PV) generators represented in simulation studies do not reflect the exact location and variability properties, such as the distance between interconnection points and substations or voltage regulators, solar irradiance, and other environmental factors. Quasi-static simulations assist in hour- and day-ahead peak load planning, as they give a time-sequence analysis that helps in generation allocation. Simulation models can be daily, hourly or yearly depending on the duty cycle and dynamics of the system. High penetration of PV into the power grid changes the voltage profile and power flow dynamically in the distribution circuits due to the inherent variability of PV. There are a number of modeling and simulation tools available for the study of such high-penetration PV scenarios. This thesis specifically utilizes OpenDSS, an open-source Distribution System Simulator developed by the Electric Power Research Institute, to simulate the grid voltage profile with a large-scale PV system under quasi-static time series analysis, considering variations of PV output over seconds and minutes together with average daily load variations. A 13-bus IEEE distribution feeder model with distributed residential- and commercial-scale PV at different buses is used for the simulation studies. Time series simulations are discussed for various modes of operation, considering dynamic PV penetration at different time periods in a day. In addition, this thesis demonstrates simulations taking into account the presence of moving clouds for solar forecasting studies.

  3. The PEPR GeneChip data warehouse, and implementation of a dynamic time series query tool (SGQT) with graphical interface.

    PubMed

    Chen, Josephine; Zhao, Po; Massaro, Donald; Clerch, Linda B; Almon, Richard R; DuBois, Debra C; Jusko, William J; Hoffman, Eric P

    2004-01-01

    Publicly accessible DNA databases (genome browsers) are rapidly accelerating post-genomic research (see http://www.genome.ucsc.edu/), with integrated genomic DNA, gene structure, EST/ splicing and cross-species ortholog data. DNA databases have relatively low dimensionality; the genome is a linear code that anchors all associated data. In contrast, RNA expression and protein databases need to be able to handle very high dimensional data, with time, tissue, cell type and genes, as interrelated variables. The high dimensionality of microarray expression profile data, and the lack of a standard experimental platform have complicated the development of web-accessible databases and analytical tools. We have designed and implemented a public resource of expression profile data containing 1024 human, mouse and rat Affymetrix GeneChip expression profiles, generated in the same laboratory, and subject to the same quality and procedural controls (Public Expression Profiling Resource; PEPR). Our Oracle-based PEPR data warehouse includes a novel time series query analysis tool (SGQT), enabling dynamic generation of graphs and spreadsheets showing the action of any transcript of interest over time. In this report, we demonstrate the utility of this tool using a 27 time point, in vivo muscle regeneration series. This data warehouse and associated analysis tools provides access to multidimensional microarray data through web-based interfaces, both for download of all types of raw data for independent analysis, and also for straightforward gene-based queries. Planned implementations of PEPR will include web-based remote entry of projects adhering to quality control and standard operating procedure (QC/SOP) criteria, and automated output of alternative probe set algorithms for each project (see http://microarray.cnmcresearch.org/pgadatatable.asp).

  4. The PEPR GeneChip data warehouse, and implementation of a dynamic time series query tool (SGQT) with graphical interface

    PubMed Central

    Chen, Josephine; Zhao, Po; Massaro, Donald; Clerch, Linda B.; Almon, Richard R.; DuBois, Debra C.; Jusko, William J.; Hoffman, Eric P.

    2004-01-01

    Publicly accessible DNA databases (genome browsers) are rapidly accelerating post-genomic research (see http://www.genome.ucsc.edu/), with integrated genomic DNA, gene structure, EST/ splicing and cross-species ortholog data. DNA databases have relatively low dimensionality; the genome is a linear code that anchors all associated data. In contrast, RNA expression and protein databases need to be able to handle very high dimensional data, with time, tissue, cell type and genes, as interrelated variables. The high dimensionality of microarray expression profile data, and the lack of a standard experimental platform have complicated the development of web-accessible databases and analytical tools. We have designed and implemented a public resource of expression profile data containing 1024 human, mouse and rat Affymetrix GeneChip expression profiles, generated in the same laboratory, and subject to the same quality and procedural controls (Public Expression Profiling Resource; PEPR). Our Oracle-based PEPR data warehouse includes a novel time series query analysis tool (SGQT), enabling dynamic generation of graphs and spreadsheets showing the action of any transcript of interest over time. In this report, we demonstrate the utility of this tool using a 27 time point, in vivo muscle regeneration series. This data warehouse and associated analysis tools provides access to multidimensional microarray data through web-based interfaces, both for download of all types of raw data for independent analysis, and also for straightforward gene-based queries. Planned implementations of PEPR will include web-based remote entry of projects adhering to quality control and standard operating procedure (QC/SOP) criteria, and automated output of alternative probe set algorithms for each project (see http://microarray.cnmcresearch.org/pgadatatable.asp). PMID:14681485

  5. Generation and Evaluation of a Global Land Surface Phenology Product from Suomi-NPP VIIRS Observations

    NASA Astrophysics Data System (ADS)

    Zhang, X.; Liu, L.; Yan, D.; Moon, M.; Liu, Y.; Henebry, G. M.; Friedl, M. A.; Schaaf, C.

    2017-12-01

    Land surface phenology (LSP) datasets have been produced from a variety of coarse spatial resolution satellite observations at both regional and global scales, spanning different time periods since 1982. However, the LSP product generated from NASA's MODerate Resolution Imaging Spectroradiometer (MODIS) data at a spatial resolution of 500 m, termed Land Cover Dynamics (MCD12Q2), is the only global product operationally produced and freely accessible at annual time steps from 2001. Because the MODIS instrument is aging and will be replaced by the Visible Infrared Imaging Radiometer Suite (VIIRS), this research focuses on the generation and evaluation of a global LSP product from Suomi-NPP VIIRS time series observations that provides continuity with the MCD12Q2 product. Specifically, we generate 500 m VIIRS global LSP data using daily VIIRS Nadir BRDF (bidirectional reflectance distribution function)-Adjusted Reflectances (NBAR) in combination with land surface temperature, snow cover, and land cover type as inputs. The product provides twelve phenological metrics (seven phenological dates and five phenological greenness magnitudes), along with six quality metrics characterizing the confidence and quality associated with the phenology retrievals at each pixel. In this paper, we describe the input data and algorithms used to produce this new product, and investigate the impact of VIIRS data time series quality on phenology detection across various climate regimes and ecosystems. As part of our analysis, the VIIRS LSP is evaluated using PhenoCam imagery in North America and Asia, and using higher spatial resolution satellite observations from Landsat 8 over an agricultural area in the central USA. We also explore the impact of high-frequency cloud cover on the VIIRS LSP product by comparing it with phenology detected from the Advanced Himawari Imager (AHI) onboard Himawari-8. AHI is a new geostationary sensor that observes the land surface every 10 minutes, which increases the ability to capture cloud-free observations relative to data collected from polar-orbiting satellites such as Suomi-NPP, thereby improving the quality of daily time series data in regions with heavy cloud cover. Finally, the VIIRS LSP is compared with MCD12Q2 data to investigate the continuity of long-term global LSP data records.

  6. Development of the general interpolants method for the CYBER 200 series of supercomputers

    NASA Technical Reports Server (NTRS)

    Stalnaker, J. F.; Robinson, M. A.; Spradley, L. W.; Kurzius, S. C.; Thoenes, J.

    1988-01-01

    The General Interpolants Method (GIM) is a 3-D, time-dependent, hybrid procedure for generating numerical analogs of the conservation laws. This study is directed toward the development and application of the GIM computer code for fluid dynamic research applications as implemented for the CYBER 200 series of supercomputers. Elliptic and quasi-parabolic versions of the GIM code are discussed. Turbulence models, both algebraic and differential-equation based, were added to the basic viscous code. An equilibrium reacting chemistry model and an implicit finite difference scheme are also included.

  7. Operational data fusion framework for building frequent Landsat-like imagery in a cloudy region

    USDA-ARS?s Scientific Manuscript database

    An operational data fusion framework is built to generate dense time-series Landsat-like images for a cloudy region by fusing Moderate Resolution Imaging Spectroradiometer (MODIS) data products and Landsat imagery. The Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM) is integrated in ...

  8. Correlation between Identification Accuracy and Response Confidence for Common Environmental Sounds

    DTIC Science & Technology

    set of environmental sounds with stimulus control and precision. The present study is one in a series of efforts to provide a baseline evaluation of a...sounds from six broad categories: household items, alarms, animals, human generated, mechanical, and vehicle sounds. Each sound was presented five times

  9. An empirical method for approximating stream baseflow time series using groundwater table fluctuations

    NASA Astrophysics Data System (ADS)

    Meshgi, Ali; Schmitter, Petra; Babovic, Vladan; Chui, Ting Fong May

    2014-11-01

    Developing reliable methods to estimate stream baseflow has been a subject of interest due to its importance in catchment response and sustainable watershed management. However, to date, in the absence of complex numerical models, baseflow is most commonly estimated using statistically derived empirical approaches that do not directly incorporate physically-meaningful information. On the other hand, Artificial Intelligence (AI) tools such as Genetic Programming (GP) offer unique capabilities to reduce the complexities of hydrological systems without losing relevant physical information. This study presents a simple-to-use empirical equation to estimate baseflow time series using GP so that minimal data is required and physical information is preserved. A groundwater numerical model was first adopted to simulate baseflow for a small semi-urban catchment (0.043 km2) located in Singapore. GP was then used to derive an empirical equation relating baseflow time series to time series of groundwater table fluctuations, which are relatively easily measured and are physically related to baseflow generation. The equation was then generalized for approximating baseflow in other catchments and validated for a larger vegetation-dominated basin located in the US (24 km2). Overall, this study used GP to propose a simple-to-use equation to predict baseflow time series based on only three parameters: minimum daily baseflow of the entire period, area of the catchment and groundwater table fluctuations. It serves as an alternative approach for baseflow estimation in un-gauged systems when only groundwater table and soil information is available, and is thus complementary to other methods that require discharge measurements.

  10. A Fresh Look at Spatio-Temporal Remote Sensing Data: Data Formats, Processing Flow, and Visualization

    NASA Astrophysics Data System (ADS)

    Gens, R.

    2017-12-01

    With an increasing number of experimental and operational satellites in orbit, remote sensing based mapping and monitoring of the dynamic Earth has entered the realm of 'big data'. The Landsat series of satellites alone provides a near-continuous archive of 45 years of data. The availability of such spatio-temporal datasets has created opportunities for long-term monitoring of diverse features and processes operating on the Earth's terrestrial and aquatic systems. Processes such as erosion, deposition, subsidence, uplift, evapotranspiration, urbanization, and land-cover regime shifts can not only be monitored, but the associated change can also be quantified, using time-series data analysis. This unique opportunity comes with new challenges in the management, analysis, and visualization of spatio-temporal datasets. Data need to be stored in a user-friendly format, and relevant metadata need to be recorded, to allow maximum flexibility for data exchange and use. Specific data processing workflows need to be defined to support time-series analysis for specific applications. Value-added data products need to be generated keeping in mind the needs of the end-users and using best practices in complex data visualization. This presentation systematically highlights the various steps for preparing spatio-temporal remote sensing data for time series analysis. It showcases a prototype workflow for remote sensing based change detection that can be generically applied while preserving the application-specific fidelity of the datasets. The prototype includes strategies for visualizing change over time. This has been exemplified using a time series of optical and SAR images for visualizing the changing glacial, coastal, and wetland landscapes in parts of Alaska.

  11. Why the null matters: statistical tests, random walks and evolution.

    PubMed

    Sheets, H D; Mitchell, C E

    2001-01-01

    A number of statistical tests have been developed to determine what type of dynamics underlie observed changes in morphology in evolutionary time series, based on the pattern of change within the time series. The theory of the 'scaled maximum', the 'log-rate-interval' (LRI) method, and the Hurst exponent all operate on the same principle of comparing the maximum change, or rate of change, in the observed dataset to the maximum change expected of a random walk. Less change in a dataset than expected of a random walk has been interpreted as indicating stabilizing selection, while more change implies directional selection. The 'runs test', in contrast, operates on the sequencing of steps rather than on excursion. Applications of these tests to computer-generated, simulated time series of known dynamical form and various levels of additive noise indicate that there is a fundamental asymmetry in the rate of type II errors of the tests based on excursion: they are all highly sensitive to noise in models of directional selection that result in a linear trend within a time series, but are largely noise-immune in the case of a simple model of stabilizing selection. Additionally, the LRI method has a lower sensitivity than originally claimed, due to the large range of LRI rates produced by random walks. Examination of the published results of these tests shows that they have seldom produced a conclusion that an observed evolutionary time series was due to directional selection, a result which needs closer examination in light of the asymmetric response of these tests.
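
    The following minimal sketch (not the authors' code) illustrates the excursion-based principle: the maximum excursion of a noisy linear trend and of a mean-reverting series are compared against a null distribution of maximum excursions simulated from random walks. All series and parameter values are synthetic.

    ```python
    # Illustrative comparison of observed maximum excursion against a random-walk null.
    import numpy as np

    rng = np.random.default_rng(1)
    n_steps, n_null = 200, 5000

    # Null distribution: maximum absolute excursion of unbiased random walks.
    walks = rng.normal(0, 1, (n_null, n_steps)).cumsum(axis=1)
    null_max = np.abs(walks).max(axis=1)

    def max_excursion(series):
        return np.abs(series - series[0]).max()

    # "Directional selection": linear trend plus noise (excursion larger than the null).
    trend = 0.25 * np.arange(n_steps) + rng.normal(0, 1, n_steps)
    # "Stabilizing selection": mean-reverting series (excursion smaller than the null).
    stab = np.zeros(n_steps)
    for t in range(1, n_steps):
        stab[t] = 0.5 * stab[t - 1] + rng.normal(0, 1)

    for name, series in [("trend", trend), ("stabilizing", stab)]:
        p = (null_max >= max_excursion(series)).mean()   # one-sided exceedance probability
        print(f"{name}: max excursion = {max_excursion(series):.1f}, "
              f"P(random walk >= observed) = {p:.3f}")
    ```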

  12. Fast generation of computer-generated hologram by graphics processing unit

    NASA Astrophysics Data System (ADS)

    Matsuda, Sho; Fujii, Tomohiko; Yamaguchi, Takeshi; Yoshikawa, Hiroshi

    2009-02-01

    A cylindrical hologram is well known to be viewable over 360 degrees. This type of hologram requires very high pixel resolution, so a Computer-Generated Cylindrical Hologram (CGCH) requires a huge amount of calculation. In our previous research, we used a look-up table method for fast calculation on an Intel Pentium 4 at 2.8 GHz. It took 480 hours to calculate a high resolution CGCH (504,000 x 63,000 pixels, with an average of 27,000 object points). To improve the quality of the CGCH reconstructed image, the fringe pattern requires higher spatial frequency and resolution. Therefore, to increase the calculation speed, we have to change the calculation method. In this paper, to reduce the calculation time of a CGCH (912,000 x 108,000 pixels), we employ a Graphics Processing Unit (GPU); the same calculation took 4,406 hours on a Xeon at 3.4 GHz. Since a GPU has many streaming processors and a parallel processing structure, it works as a high performance parallel processor. In addition, a GPU gives maximum performance on two-dimensional data and streaming data. Recently, GPUs can also be utilized for general-purpose computation (GPGPU). For example, NVIDIA's GeForce 7 series became a programmable processor with the Cg programming language, and the GeForce 8 series is supported by CUDA, a software development kit made by NVIDIA. The theoretical calculation ability of the GPU is quoted as 500 GFLOPS. From the experimental results, we achieved a calculation 47 times faster than our previous CPU-based work. Therefore, the CGCH can be generated in 95 hours, and the total time to calculate and print the CGCH is 110 hours.

  13. Near-Surface Flow Fields Deduced Using Correlation Tracking and Time-Distance Analysis

    NASA Technical Reports Server (NTRS)

    DeRosa, Marc; Duvall, T. L., Jr.; Toomre, Juri

    1999-01-01

    Near-photospheric flow fields on the Sun are deduced using two independent methods applied to the same time series of velocity images observed by SOI-MDI on SOHO. Differences in travel times between f modes entering and leaving each pixel, measured using time-distance helioseismology, are used to determine sites of supergranular outflows. Alternatively, correlation tracking analysis of mesogranular scales of motion applied to the same time series is used to deduce the near-surface flow field. These two approaches provide the means to assess the patterns and evolution of horizontal flows on supergranular scales even near disk center, which is not feasible with direct line-of-sight Doppler measurements. We find that the locations of the supergranular outflows seen in flow fields generated from correlation tracking coincide well with the locations of the outflows determined from the time-distance analysis, with a mean correlation coefficient after smoothing of r̄_s = 0.840. Near-surface velocity field measurements can be used to study the evolution of the supergranular network, as merging and splitting events are observed to occur in these images. The data consist of one 2048-minute time series of high-resolution (0.6" pixels) line-of-sight velocity images taken by MDI on 1997 January 16-18 at a cadence of one minute.

  14. Generation of Long-time Complex Signals for Testing the Instruments for Detection of Voltage Quality Disturbances

    NASA Astrophysics Data System (ADS)

    Živanović, Dragan; Simić, Milan; Kokolanski, Zivko; Denić, Dragan; Dimcev, Vladimir

    2018-04-01

    A software supported procedure for the generation of long-time complex test sequences, suitable for testing instruments for the detection of standard voltage quality (VQ) disturbances, is presented in this paper. This solution for test signal generation includes significant improvements of the computer-based signal generator presented and described in a previously published paper [1]. The generator is based on virtual instrumentation software for defining the basic signal parameters, a data acquisition card NI 6343, and a power amplifier for amplification of the output voltage level to the nominal RMS voltage value of 230 V. Definition of the basic signal parameters in the LabVIEW application software is supported using Script files, which allows simple repetition of specific test signals and the combination of several different test sequences into a complex composite test waveform. The basic advantage of this generator compared to similar solutions for signal generation is the possibility of long-time test sequence generation according to predefined complex test scenarios, including various combinations of VQ disturbances defined in accordance with the European standard EN 50160. Experimental verification of the presented signal generator's capability is performed by testing the commercial power quality analyzer Fluke 435 Series II. Some characteristic complex test signals with various disturbances are shown, together with the logged data obtained from the tested power quality analyzer.
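
    A minimal numpy sketch of the idea, not the authors' LabVIEW/NI 6343 implementation: a nominal 230 V RMS, 50 Hz sequence is composed and a voltage sag segment is inserted. The sampling rate, timing and sag depth below are illustrative assumptions.

    ```python
    # Illustrative test-sequence composition with a single voltage sag disturbance.
    import numpy as np

    fs = 10_000            # sampling rate, Hz (assumed)
    f0 = 50.0              # nominal frequency, Hz
    u_rms = 230.0          # nominal RMS voltage, V
    t = np.arange(0, 10.0, 1.0 / fs)            # 10 s test segment

    voltage = np.sqrt(2) * u_rms * np.sin(2 * np.pi * f0 * t)

    # Insert a sag with 40% residual voltage between 4.0 s and 4.5 s, one of the
    # disturbance types covered by EN 50160-style testing.
    sag = (t >= 4.0) & (t < 4.5)
    voltage[sag] *= 0.4

    # Longer composite scenarios would be built by concatenating such segments
    # (sags, swells, interruptions, harmonics, flicker) according to a script file.
    print(voltage.shape, voltage[sag].std() / voltage[~sag].std())
    ```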

  15. Satellite image time series simulation for environmental monitoring

    NASA Astrophysics Data System (ADS)

    Guo, Tao

    2014-11-01

    The performance of environmental monitoring depends heavily on the availability of consecutive observation data, and there is an increasing demand in the remote sensing community for satellite image data of sufficient resolution in both space and time; these two requirements tend to conflict, making the trade-off hard to tune. Multiple constellations could be a solution if cost were not a concern, so it remains interesting but very challenging to develop a method that can simultaneously improve both spatial and temporal detail. Several research efforts have addressed the problem from various angles. One type of approach enhances the spatial resolution using techniques such as super resolution and pan-sharpening, which can produce good visual effects but mostly cannot preserve spectral signatures, so the analytical value of the data is lost. Another type fills temporal gaps by time interpolation, which does not actually add informative content. In this paper we present a novel method to generate satellite images with higher spatial and temporal detail, which in turn enables satellite image time series simulation. Our method starts with a pair of high- and low-resolution data sets, and a spatial registration is performed by introducing an LDA model to map high- and low-resolution pixels to each other. Temporal change information is then captured by comparing the low-resolution time series data; this change is projected onto the high-resolution data plane and assigned to each high-resolution pixel by referring to predefined temporal change patterns for each type of ground object, yielding a simulated high-resolution image. A preliminary experiment shows that our method can simulate high-resolution data with good accuracy. The contribution of our method is to enable timely monitoring of temporal changes through analysis of low-resolution image time series alone, so that the use of costly high-resolution data can be reduced as much as possible; it thus offers a cost-effective way to build an operational monitoring service for environment, agriculture, forest, land use investigation, and other applications.

  16. Characterizing the Responses of Land Surface Phenology to the Rainy Season in the Congo Basin

    NASA Astrophysics Data System (ADS)

    Yan, D.; Zhang, X.; Yu, Y.; Guo, W.

    2016-12-01

    The most pronounced climate changes across the Congo Basin are predicted to be changes in the timing and amount of rainfall in the coming decades. These changes are expected to cause a significant shift in land surface phenology (LSP), so an understanding of how LSP responds to the rainy season can benefit predictions of changes in the Congolese ecosystem under future climate change scenarios. However, quantitative analyses have not been performed to investigate the relationship between LSP and the rainy season in the Congo Basin. Based on 30-minute observations acquired by the Spinning Enhanced Visible and Infrared Imager (SEVIRI) onboard the METEOSAT Second Generation series of geostationary satellites, we generated a time series of three-day angularly corrected Two-band Enhanced Vegetation Index (EVI2) between 2006 and 2013. We then reconstructed EVI2 temporal trajectories and retrieved the timings and magnitudes of LSP using the hybrid piecewise logistic model. We further associated the phenological timings and magnitudes with those of the rainy seasons derived from the three-hourly rainfall rate measurements provided by the Tropical Rainfall Measurement Mission Product 3B42. Finally, we investigated the impacts of tree cover on the timing discrepancy between LSP and the rainy season. Results show that LSP was strongly associated with the rainy season. Specifically, the SEVIRI EVI2 time series reveals that two annual canopy greenness cycles (CGC) occur in the Congolese rainforests, whereas a single annual CGC with strong seasonal amplitude was identified for other land cover types. The spatial shifts in CGC timings closely follow those of the rainy season controlled by the seasonal migration of the Intertropical Convergence Zone. However, tree cover controls the timing discrepancy between LSP and the rainy season. The accumulated vegetation greenness during a CGC shows a strong dependence on the total rainfall received.
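
    The study retrieves phenological timing with a hybrid piecewise logistic model; the sketch below fits only a single logistic green-up segment to a synthetic EVI2-like series with scipy, as a schematic stand-in rather than the authors' algorithm.

    ```python
    # Simplified logistic green-up fit to a synthetic vegetation-index series.
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic_greenup(t, base, amp, rate, t_mid):
        """EVI2(t) = base + amp / (1 + exp(-rate * (t - t_mid)))"""
        return base + amp / (1.0 + np.exp(-rate * (t - t_mid)))

    rng = np.random.default_rng(2)
    doy = np.arange(0, 120, 3.0)                    # roughly three-day composites
    evi2 = logistic_greenup(doy, 0.15, 0.35, 0.12, 60) + rng.normal(0, 0.02, doy.size)

    popt, _ = curve_fit(logistic_greenup, doy, evi2, p0=[0.1, 0.3, 0.1, 50])
    base, amp, rate, t_mid = popt
    print(f"green-up inflection ~ day {t_mid:.1f}, seasonal amplitude ~ {amp:.2f}")
    ```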

  17. A morphological perceptron with gradient-based learning for Brazilian stock market forecasting.

    PubMed

    Araújo, Ricardo de A

    2012-04-01

    Several linear and non-linear techniques have been proposed to solve the stock market forecasting problem. However, a limitation arises from all these techniques and is known as the random walk dilemma (RWD). In this scenario, forecasts generated by arbitrary models have a characteristic one-step-ahead delay with respect to the time series values, so that there is a time phase distortion in the reconstruction of stock market phenomena. In this paper, we propose a suitable model inspired by concepts in mathematical morphology (MM) and lattice theory (LT). This model is generically called the increasing morphological perceptron (IMP). Also, we present a gradient steepest descent method to design the proposed IMP based on ideas from the back-propagation (BP) algorithm and using a systematic approach to overcome the problem of non-differentiability of morphological operations. We have included in the learning process a procedure to overcome the RWD: an automatic correction step geared toward eliminating the time phase distortions that occur in stock market phenomena. Furthermore, an experimental analysis is conducted with the IMP using four complex non-linear time series forecasting problems from the Brazilian stock market. Additionally, two natural phenomena time series are used to assess the forecasting performance of the proposed IMP on non-financial time series. Finally, the obtained results are discussed and compared to results found using models recently proposed in the literature. Copyright © 2011 Elsevier Ltd. All rights reserved.

  18. Reconstructions of parameters of radiophysical chaotic generator with delayed feedback from short time series

    NASA Astrophysics Data System (ADS)

    Ishbulatov, Yu. M.; Karavaev, A. S.; Kiselev, A. R.; Semyachkina-Glushkovskaya, O. V.; Postnov, D. E.; Bezruchko, B. P.

    2018-04-01

    A method for the reconstruction of a time-delayed feedback system is investigated, based on the detection of the synchronous response of a slave time-delay system to driving from the master system under study. The structure of the driven system is similar to the structure of the studied time-delay system, but the feedback circuit is broken in the driven system. The method's efficiency is tested using short and noisy data obtained from an electronic chaotic oscillator with time-delayed feedback.

  19. Program Helps Generate And Manage Graphics

    NASA Technical Reports Server (NTRS)

    Truong, L. V.

    1994-01-01

    Living Color Frame Maker (LCFM) computer program generates computer-graphics frames. Graphical frames saved as text files, in readable and disclosed format, easily retrieved and manipulated by user programs for wide range of real-time visual information applications. LCFM implemented in frame-based expert system for visual aids in management of systems monitoring, diagnosis, and/or control. Diagrams of circuits or systems brought to "life" by use of designated video colors and intensities to symbolize status of hardware components (via real-time feedback from sensors). Status of systems can be displayed. Written in C++ using Borland C++ 2.0 compiler for IBM PC-series computers and compatible computers running MS-DOS.

  20. ReTrOS: a MATLAB toolbox for reconstructing transcriptional activity from gene and protein expression data.

    PubMed

    Minas, Giorgos; Momiji, Hiroshi; Jenkins, Dafyd J; Costa, Maria J; Rand, David A; Finkenstädt, Bärbel

    2017-06-26

    Given the development of high-throughput experimental techniques, an increasing number of whole genome transcription profiling time series data sets, with good temporal resolution, are becoming available to researchers. The ReTrOS toolbox (Reconstructing Transcription Open Software) provides MATLAB-based implementations of two related methods, namely ReTrOS-Smooth and ReTrOS-Switch, for reconstructing the temporal transcriptional activity profile of a gene from given mRNA expression time series or protein reporter time series. The methods are based on fitting a differential equation model incorporating the processes of transcription, translation and degradation. The toolbox provides a framework for model fitting along with statistical analyses of the model with a graphical interface and model visualisation. We highlight several applications of the toolbox, including the reconstruction of the temporal cascade of transcriptional activity inferred from mRNA expression data and protein reporter data in the core circadian clock in Arabidopsis thaliana, and how such reconstructed transcription profiles can be used to study the effects of different cell lines and conditions. The ReTrOS toolbox allows users to analyse gene and/or protein expression time series where, with appropriate formulation of prior information about a minimum of kinetic parameters, in particular rates of degradation, users are able to infer timings of changes in transcriptional activity. Data from any organism and obtained from a range of technologies can be used as input due to the flexible and generic nature of the model and implementation. The output from this software provides a useful analysis of time series data and can be incorporated into further modelling approaches or in hypothesis generation.
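
    ReTrOS itself is a MATLAB toolbox; the following Python sketch only illustrates the class of model it fits, an ODE in which a transcriptional activity profile drives mRNA and reporter-protein levels subject to translation and degradation. The switch-like profile and rate constants are hypothetical.

    ```python
    # Schematic transcription-translation-degradation ODE (not the ReTrOS code itself).
    import numpy as np
    from scipy.integrate import solve_ivp

    def transcription(t):
        # Hypothetical switch-like transcriptional activity profile.
        return 1.0 if 5.0 <= t <= 15.0 else 0.1

    def rhs(t, y, beta, delta_m, delta_p):
        m, p = y
        dm = transcription(t) - delta_m * m     # transcription minus mRNA degradation
        dp = beta * m - delta_p * p             # translation minus protein degradation
        return [dm, dp]

    sol = solve_ivp(rhs, (0, 30), [0.0, 0.0], args=(2.0, 0.3, 0.1),
                    t_eval=np.linspace(0, 30, 301))
    m_series, p_series = sol.y
    print(p_series.max())   # the reporter peak lags the transcriptional switch
    ```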

  1. Endogenous time-varying risk aversion and asset returns.

    PubMed

    Berardi, Michele

    2016-01-01

    Stylized facts about the statistical properties of short-horizon returns in financial markets have been identified in the literature, but a satisfactory understanding of their manifestation is yet to be achieved. In this work, we show that a simple asset pricing model with a representative agent is able to generate time series of returns that replicate such stylized facts if the risk aversion coefficient is allowed to change endogenously over time in response to unexpected excess returns under evolutionary forces. The same model, under constant risk aversion, would instead generate returns that are essentially Gaussian. We conclude that an endogenous time-varying risk aversion represents a very parsimonious way to make the model match real data on key statistical properties, and therefore deserves careful consideration from economists and practitioners alike.

  2. Bessel function expansion to reduce the calculation time and memory usage for cylindrical computer-generated holograms.

    PubMed

    Sando, Yusuke; Barada, Daisuke; Jackin, Boaz Jessie; Yatagai, Toyohiko

    2017-07-10

    This study proposes a method to reduce the calculation time and memory usage required for calculating cylindrical computer-generated holograms. The wavefront on the cylindrical observation surface is represented as a convolution integral in the 3D Fourier domain. The Fourier transformation of the kernel function involved in this convolution integral is performed analytically using a Bessel function expansion. The analytical solution can drastically reduce the calculation time and the memory usage at no additional cost, compared with the numerical method that uses the fast Fourier transform to transform the kernel function. In this study, we present the analytical derivation, the efficient calculation of the Bessel function series, and a numerical simulation. Furthermore, we demonstrate the effectiveness of the analytical solution through comparisons of calculation time and memory usage.

  3. Unveiling signatures of interdecadal climate changes by Hilbert analysis

    NASA Astrophysics Data System (ADS)

    Zappalà, Dario; Barreiro, Marcelo; Masoller, Cristina

    2017-04-01

    A recent study demonstrated that, in a class of networks of oscillators, the optimal network reconstruction from dynamics is obtained when the similarity analysis is performed not on the original dynamical time series, but on transformed series obtained by the Hilbert transform [1]. That motivated us to use the Hilbert transform to study another kind of (in a broad sense) "oscillating" series, such as temperature series. Indeed, we found that Hilbert analysis of SAT (Surface Air Temperature) time series uncovers meaningful information about climate and is therefore a promising tool for the study of other climatological variables [2]. In this work we analysed a large dataset of SAT series, performing the Hilbert transform and further analysis with the goal of finding signs of climate change during the analysed period. We used the publicly available ERA-Interim dataset, containing reanalysis data [3]. In particular, we worked on daily SAT time series, from 1979 to 2015, at 16380 points arranged over a regular grid on the Earth's surface. From each SAT time series we calculate the anomaly series and also, by using the Hilbert transform, the instantaneous amplitude and instantaneous frequency series. Our first approach is to calculate the relative variation: the difference between the average value over the last 10 years and the average value over the first 10 years, divided by the average value over the whole analysed period. We did these calculations on our transformed series, frequency and amplitude, using both average values and standard deviation values. Furthermore, to compare with an already established analysis method, we did the same calculations on the anomaly series. We plotted these results as maps, where the colour of each site indicates the value of its relative variation. Finally, to gain insight into the interpretation of our results for real SAT data, we generated synthetic sinusoidal series with various levels of additive noise. By applying Hilbert analysis to the synthetic data, we uncovered a clear trend between mean amplitude and mean frequency: as the noise level grows, the amplitude increases while the frequency decreases. Research funded in part by AGAUR (Generalitat de Catalunya), EU LINC project (Grant No. 289447) and Spanish MINECO (FIS2015-66503-C3-2-P).
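
    A minimal sketch (not the authors' code) of the Hilbert-based quantities used here: the analytic signal of a synthetic daily SAT-like series yields instantaneous amplitude and frequency, from which the relative variation described above can be computed.

    ```python
    # Instantaneous amplitude and frequency of a synthetic daily temperature series.
    import numpy as np
    from scipy.signal import hilbert

    fs = 1.0                                   # one sample per day
    t = np.arange(0, 37 * 365)                 # ~37 years of daily data
    rng = np.random.default_rng(3)
    sat = 10 * np.sin(2 * np.pi * t / 365.25) + rng.normal(0, 2, t.size)

    analytic = hilbert(sat - sat.mean())
    amplitude = np.abs(analytic)                          # instantaneous amplitude
    phase = np.unwrap(np.angle(analytic))
    frequency = np.diff(phase) / (2 * np.pi) * fs         # cycles per day

    # Relative variation: (last-decade mean - first-decade mean) / whole-period mean.
    def relative_variation(x, window=10 * 365):
        return (x[-window:].mean() - x[:window].mean()) / x.mean()

    print(relative_variation(amplitude), relative_variation(frequency))
    ```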

  4. Application of SDSM and LARS-WG for simulating and downscaling of rainfall and temperature

    NASA Astrophysics Data System (ADS)

    Hassan, Zulkarnain; Shamsudin, Supiah; Harun, Sobri

    2014-04-01

    Climate change is believed to have significant impacts on water basins and regions, such as on runoff and the hydrological system. However, impact studies on a water basin or region are difficult, since general circulation models (GCMs), which are widely used to simulate future climate scenarios, do not provide reliable hourly or daily series of rainfall and temperature for hydrological modeling. Downscaling techniques can derive reliable hourly or daily series of rainfall and temperature under climate scenarios from the GCM output. In this study, statistical downscaling models are used to generate possible future values of local meteorological variables such as rainfall and temperature at selected stations in Peninsular Malaysia. The models are: (1) the statistical downscaling model (SDSM), which utilizes regression models and stochastic weather generators, and (2) the Long Ashton Research Station weather generator (LARS-WG), which utilizes only stochastic weather generators. The LARS-WG and SDSM models are clearly feasible tools for quantifying the effects of climate change at a local scale. SDSM yields a better performance than LARS-WG, although it slightly underestimates wet and dry spell lengths. Although the two models do not provide identical results, the time series generated by both methods indicate a general increasing trend in mean daily temperature values. Meanwhile, the trends in daily rainfall are not similar to each other, with SDSM giving a relatively higher change in annual rainfall compared to LARS-WG.

  5. Decadal GPS Time Series and Velocity Fields Spanning the North American Continent and Beyond: New Data Products, Cyberinfrastructure and Case Studies from the EarthScope Plate Boundary Observatory (PBO) and Other Regional Networks

    NASA Astrophysics Data System (ADS)

    Phillips, D. A.; Herring, T.; Melbourne, T. I.; Murray, M. H.; Szeliga, W. M.; Floyd, M.; Puskas, C. M.; King, R. W.; Boler, F. M.; Meertens, C. M.; Mattioli, G. S.

    2017-12-01

    The Geodesy Advancing Geosciences and EarthScope (GAGE) Facility, operated by UNAVCO, provides a diverse suite of geodetic data, derived products and cyberinfrastructure services to support community Earth science research and education. GPS data and products including decadal station position time series and velocities are provided for 2000+ continuous GPS stations from the Plate Boundary Observatory (PBO) and other networks distributed throughout the high Arctic, North America, and Caribbean regions. The position time series contain a multitude of signals in addition to the secular motions, including coseismic and postseismic displacements, interseismic strain accumulation, and transient signals associated with hydrologic and other processes. We present our latest velocity field solutions, new time series offset estimate products, and new time series examples associated with various phenomena. Position time series, and the signals they contain, are inherently dependent upon analysis parameters such as network scaling and reference frame realization. The estimation of scale changes for example, a common practice, has large impacts on vertical motion estimates. GAGE/PBO velocities and time series are currently provided in IGS (IGb08) and North America (NAM08, IGb08 rotated to a fixed North America Plate) reference frames. We are reprocessing all data (1996 to present) as part of the transition from IGb08 to IGS14 that began in 2017. New NAM14 and IGS14 data products are discussed. GAGE/PBO GPS data products are currently generated using onsite computing clusters. As part of an NSF funded EarthCube Building Blocks project called "Deploying MultiFacility Cyberinfrastructure in Commercial and Private Cloud-based Systems (GeoSciCloud)", we are investigating performance, cost, and efficiency differences between local computing resources and cloud based resources. Test environments include a commercial cloud provider (Amazon/AWS), NSF cloud-like infrastructures within XSEDE (TACC, the Texas Advanced Computing Center), and in-house cyberinfrastructures. Preliminary findings from this effort are presented. Web services developed by UNAVCO to facilitate the discovery, customization and dissemination of GPS data and products are also presented.

  6. Generational Influences in Academic Emergency Medicine: Teaching and Learning, Mentoring, and Technology (Part I)

    PubMed Central

    Mohr, Nicholas M.; Moreno-Walton, Lisa; Mills, Angela M.; Brunett, Patrick H.; Promes, Susan B.

    2010-01-01

    For the first time in history, four generations are working together – Traditionalists, Baby Boomers, Generation Xers, and Millennials. Members of each generation carry with them a unique perspective of the world and interact differently with those around them. Through a review of the literature and consensus by modified Delphi methodology of the Society for Academic Emergency Medicine (SAEM) Aging and Generational Issues Task Force, the authors have developed this two-part series to address generational issues present in academic emergency medicine (EM). Understanding generational characteristics and mitigating strategies can help address some common issues encountered in academic EM. Through recognition of the unique characteristics of each of the generations with respect to teaching and learning, mentoring, and technology, academicians have the opportunity to strategically optimize interactions with one another. PMID:21314779

  7. Rainfall Stochastic models

    NASA Astrophysics Data System (ADS)

    Campo, M. A.; Lopez, J. J.; Rebole, J. P.

    2012-04-01

    This work was carried out in San Sebastián, in the north of Spain. A meteorological station with precipitation records available every ten minutes was selected. The precipitation data cover October 1927 to September 1997. Pulse models describe the temporal process of rainfall as a succession of rainy cells: a main storm process, whose origins are distributed in time according to a Poisson process, and a secondary process that generates a random number of rain cells within each storm. Among the different pulse models, the Bartlett-Lewis model was used. On the other hand, alternating renewal processes and Markov chains describe the way in which the process will evolve in the future depending only on the current state, and are therefore not dependent on past events. Two basic processes are considered when describing the occurrence of rain: the alternation of wet and dry periods, and the temporal distribution of rainfall within each rain event, which determines the rainwater collected in each of the intervals that make up the rain. This allows the introduction of alternating renewal processes and three-state Markov chains, where the interstorm time is given by either of the two dry states, short or long. Thus, the stochastic Markov chain model tries to reproduce the basis of pulse models, a succession of storms, each one composed of a series of rain cells separated by short intervals of time, without their theoretical complexity. In a first step, we analysed all the variables involved in the sequential process of rain: rain event duration, duration of non-rain periods, average rainfall intensity in rain events and, finally, the temporal distribution of rainfall within the rain event. Additionally, for calibration of the Bartlett-Lewis pulse model, the main descriptive statistics were calculated for each month, to account for the seasonality of rainfall. In a second step, both models were calibrated. Finally, synthetic series were simulated with the calibrated parameters; the series were generated at ten-minute resolution and aggregated to hourly values. Preliminary results show adequate simulation of the main features of rain. The main variables are well simulated for the ten-minute time series and also for the hourly precipitation time series, which are those that generate the higher rainfall used in hydrologic design. For scales finer than one hour, the simulated rainfall durations are not reproduced appropriately. One hypothesis is an excessive number of simulated events, which causes further fragmentation of storms, resulting in an excess of short rain events (less than one hour), and therefore also of short inter-event periods, compared with those that occur in the observed series.
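
    The sketch below is a schematic Poisson-cluster rainfall simulator in the spirit of pulse models, not the calibrated Bartlett-Lewis model of this study; the storm arrival rate, cell statistics and aggregation choices are made-up illustration values.

    ```python
    # Schematic Poisson-cluster rainfall simulation at ten-minute resolution.
    import numpy as np

    rng = np.random.default_rng(4)
    hours = 24 * 30                      # one month
    dt = 1 / 6                           # ten minutes, in hours
    n_bins = int(hours / dt)
    rain = np.zeros(n_bins)              # rainfall depth per ten-minute bin (mm)

    lam = 1 / 40.0                       # storm arrival rate (per hour)
    mean_cells = 4                       # mean number of rain cells per storm
    cell_dur, cell_int = 1.0, 2.0        # mean cell duration (h) and intensity (mm/h)

    t = 0.0
    while t < hours:
        t += rng.exponential(1 / lam)                    # next storm origin (Poisson process)
        for _ in range(rng.poisson(mean_cells)):         # cells within the storm
            start = t + rng.exponential(2.0)             # cell displacement from the origin
            dur = rng.exponential(cell_dur)
            inten = rng.exponential(cell_int)
            i0, i1 = int(start / dt), int((start + dur) / dt)
            rain[i0:min(i1 + 1, n_bins)] += inten * dt   # accumulate depth per bin

    hourly = rain[: n_bins - n_bins % 6].reshape(-1, 6).sum(axis=1)   # aggregate to 1 h
    print(rain.sum(), (hourly > 0.1).mean())
    ```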

  8. Quality Assessment of Collection 6 MODIS Atmospheric Science Products

    NASA Astrophysics Data System (ADS)

    Manoharan, V. S.; Ridgway, B.; Platnick, S. E.; Devadiga, S.; Mauoka, E.

    2015-12-01

    Since the launch of the NASA Terra and Aqua satellites in December 1999 and May 2002, respectively, atmosphere and land data acquired by the MODIS (Moderate Resolution Imaging Spectroradiometer) sensor on-board these satellites have been reprocessed five times at the MODAPS (MODIS Adaptive Processing System) located at NASA GSFC. The global land and atmosphere products use science algorithms developed by the NASA MODIS science team investigators. MODAPS completed Collection 6 reprocessing of MODIS Atmosphere science data products in April 2015 and is currently generating the Collection 6 products using the latest version of the science algorithms. This reprocessing has generated one of the longest time series of consistent data records for understanding cloud, aerosol, and other constituents in the earth's atmosphere. It is important to carefully evaluate and assess the quality of this data and remove any artifacts to maintain a useful climate data record. Quality Assessment (QA) is an integral part of the processing chain at MODAPS. This presentation will describe the QA approaches and tools adopted by the MODIS Land/Atmosphere Operational Product Evaluation (LDOPE) team to assess the quality of MODIS operational Atmospheric products produced at MODAPS. Some of the tools include global high resolution images, time series analysis and statistical QA metrics. The new high resolution global browse images with pan and zoom have provided the ability to perform QA of products in real time through synoptic QA on the web. This global browse generation has been useful in identifying production error, data loss, and data quality issues from calibration error, geolocation error and algorithm performance. A time series analysis for various science datasets in the Level-3 monthly product was recently developed for assessing any long term drifts in the data arising from instrument errors or other artifacts. This presentation will describe and discuss some test cases from the recently processed C6 products. We will also describe the various tools and approaches developed to verify and assess the algorithm changes implemented by the science team to address known issues in the products and improve the quality of the products.

  9. Validation of a Monte Carlo Simulation of Binary Time Series.

    DTIC Science & Technology

    1981-09-18

    the probability distribution corresponding to the population from which the n sample vectors are generated. Simple unbiased estimators were chosen for ... is generated from the sample of such vectors produced by several independent replications of the Monte Carlo simulation. Then the validity of the

  10. FELIX-1.0: A finite element solver for the time dependent generator coordinate method with the Gaussian overlap approximation

    DOE PAGES

    Regnier, D.; Verriere, M.; Dubray, N.; ...

    2015-11-30

    In this study, we describe the software package FELIX that solves the equations of the time-dependent generator coordinate method (TDGCM) in N dimensions (N ≥ 1) under the Gaussian overlap approximation. The numerical resolution is based on the Galerkin finite element discretization of the collective space and the Crank–Nicolson scheme for time integration. The TDGCM solver is implemented entirely in C++. Several additional tools written in C++, Python or bash scripting language are also included for convenience. In this paper, the solver is tested with a series of benchmark calculations. We also demonstrate the ability of our code to handle a realistic calculation of fission dynamics.

  11. Modelling the initial phase of an epidemic using incidence and infection network data: 2009 H1N1 pandemic in Israel as a case study

    PubMed Central

    Katriel, G.; Yaari, R.; Huppert, A.; Roll, U.; Stone, L.

    2011-01-01

    This paper presents new computational and modelling tools for studying the dynamics of an epidemic in its initial stages that use both available incidence time series and data describing the population's infection network structure. The work is motivated by data collected at the beginning of the H1N1 pandemic outbreak in Israel in the summer of 2009. We formulated a new discrete-time stochastic epidemic SIR (susceptible-infected-recovered) model that explicitly takes into account the disease's specific generation-time distribution and the intrinsic demographic stochasticity inherent to the infection process. Moreover, in contrast with many other modelling approaches, the model allows direct analytical derivation of estimates for the effective reproductive number (Re) and of their credible intervals, by maximum likelihood and Bayesian methods. The basic model can be extended to include age–class structure, and a maximum likelihood methodology allows us to estimate the model's next-generation matrix by combining two types of data: (i) the incidence series of each age group, and (ii) infection network data that provide partial information of ‘who-infected-who’. Unlike other approaches for estimating the next-generation matrix, the method developed here does not require making a priori assumptions about the structure of the next-generation matrix. We show, using a simulation study, that even a relatively small amount of information about the infection network greatly improves the accuracy of estimation of the next-generation matrix. The method is applied in practice to estimate the next-generation matrix from the Israeli H1N1 pandemic data. The tools developed here should be of practical importance for future investigations of epidemics during their initial stages. However, they require the availability of data which represent a random sample of the real epidemic process. We discuss the conditions under which reporting rates may or may not influence our estimated quantities and the effects of bias. PMID:21247949
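
    A minimal sketch of the model class described above (not the authors' exact formulation): a discrete-time stochastic renewal process in which the expected number of new cases equals Re times past incidence weighted by an assumed generation-time distribution, followed by a naive maximum-likelihood estimate of Re from the simulated series.

    ```python
    # Discrete-time stochastic renewal process with an assumed generation-time pmf.
    import numpy as np

    rng = np.random.default_rng(5)
    gen_time = np.array([0.1, 0.35, 0.35, 0.15, 0.05])   # assumed generation-time pmf (days)
    Re = 1.4                                             # effective reproductive number
    T = 60

    incidence = np.zeros(T, dtype=int)
    incidence[0] = 5                                     # initial seeding
    lambdas = np.zeros(T)
    for t in range(1, T):
        past = incidence[max(0, t - len(gen_time)):t][::-1]   # most recent cases first
        lambdas[t] = np.dot(gen_time[:len(past)], past)       # infection pressure today
        incidence[t] = rng.poisson(Re * lambdas[t])           # stochastic new cases

    # Maximum-likelihood estimate of Re given the generation-time distribution.
    print(incidence[-5:], "Re_hat =", incidence[1:].sum() / lambdas[1:].sum())
    ```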

  12. Modeling pollen time series using seasonal-trend decomposition procedure based on LOESS smoothing

    NASA Astrophysics Data System (ADS)

    Rojo, Jesús; Rivero, Rosario; Romero-Morte, Jorge; Fernández-González, Federico; Pérez-Badia, Rosa

    2017-02-01

    Analysis of airborne pollen concentrations provides valuable information on plant phenology and is thus a useful tool in agriculture—for predicting harvests in crops such as the olive and for deciding when to apply phytosanitary treatments—as well as in medicine and the environmental sciences. Variations in airborne pollen concentrations, moreover, are indicators of changing plant life cycles. By modeling pollen time series, we can not only identify the variables influencing pollen levels but also predict future pollen concentrations. In this study, airborne pollen time series were modeled using a seasonal-trend decomposition procedure based on LOcally wEighted Scatterplot Smoothing (LOESS) smoothing (STL). The data series—daily Poaceae pollen concentrations over the period 2006-2014—was broken up into seasonal and residual (stochastic) components. The seasonal component was compared with data on Poaceae flowering phenology obtained by field sampling. Residuals were fitted to a model generated from daily temperature and rainfall values, and daily pollen concentrations, using partial least squares regression (PLSR). This method was then applied to predict daily pollen concentrations for 2014 (independent validation data) using results for the seasonal component of the time series and estimates of the residual component for the period 2006-2013. Correlation between predicted and observed values was r = 0.79 (correlation coefficient) for the pre-peak period (i.e., the period prior to the peak pollen concentration) and r = 0.63 for the post-peak period. Separate analysis of each of the components of the pollen data series enables the sources of variability to be identified more accurately than by analysis of the original non-decomposed data series, and for this reason, this procedure has proved to be a suitable technique for analyzing the main environmental factors influencing airborne pollen concentrations.
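
    The two-step idea can be sketched with standard Python tooling (statsmodels STL and scikit-learn PLS regression); the pollen and weather series below are synthetic, and the decomposition settings are illustrative rather than the authors' configuration.

    ```python
    # STL-style decomposition of a synthetic daily pollen series, then PLS regression
    # of the residual component on meteorological predictors.
    import numpy as np
    from statsmodels.tsa.seasonal import STL
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(6)
    n_years, period = 6, 365
    t = np.arange(n_years * period)
    temperature = 15 + 10 * np.sin(2 * np.pi * (t - 120) / period) + rng.normal(0, 2, t.size)
    rainfall = rng.gamma(0.3, 5.0, t.size)

    # Synthetic Poaceae-like series: a seasonal peak plus a weather-driven residual.
    seasonal_true = 80 * np.exp(-0.5 * (((t % period) - 150) / 20.0) ** 2)
    pollen = seasonal_true + 0.8 * temperature - 1.5 * rainfall + rng.normal(0, 5, t.size)

    decomp = STL(pollen, period=period, robust=True).fit()
    residual = decomp.resid

    X = np.column_stack([temperature, rainfall])
    pls = PLSRegression(n_components=2).fit(X[:-period], residual[:-period])  # earlier years
    pred_resid = pls.predict(X[-period:]).ravel()                             # final year
    pred = decomp.seasonal[-period:] + decomp.trend[-period:] + pred_resid
    print(np.corrcoef(pred, pollen[-period:])[0, 1])
    ```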

  13. Modeling of human operator dynamics in simple manual control utilizing time series analysis. [tracking (position)

    NASA Technical Reports Server (NTRS)

    Agarwal, G. C.; Osafo-Charles, F.; Oneill, W. D.; Gottlieb, G. L.

    1982-01-01

    Time series analysis is applied to model human operator dynamics in pursuit and compensatory tracking modes. The normalized residual criterion is used as a one-step analytical tool to encompass the processes of identification, estimation, and diagnostic checking. A parameter constraining technique is introduced to develop more reliable models of human operator dynamics. The human operator is adequately modeled by a second order dynamic system both in pursuit and compensatory tracking modes. In comparing the data sampling rates, 100 msec between samples is adequate and is shown to provide better results than 200 msec sampling. The residual power spectrum and eigenvalue analysis show that the human operator is not a generator of periodic characteristics.
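
    As a schematic illustration of the Box-Jenkins-style modelling described above (not the original analysis), the sketch below fits a second-order autoregressive model to a synthetic operator-output-like series and checks the residuals for whiteness; the sampling interval and coefficients are assumptions.

    ```python
    # Second-order AR fit and residual whiteness check for a synthetic output series.
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA
    from statsmodels.stats.diagnostic import acorr_ljungbox

    rng = np.random.default_rng(7)
    n = 500                                   # samples at an assumed 100 ms spacing

    # Synthetic stand-in for an operator output series with second-order dynamics.
    y = np.zeros(n)
    for k in range(2, n):
        y[k] = 1.2 * y[k - 1] - 0.5 * y[k - 2] + rng.normal(0, 0.1)

    model = ARIMA(y, order=(2, 0, 0)).fit()
    print("estimated AR coefficients:", model.arparams)   # roughly [1.2, -0.5]
    print(acorr_ljungbox(model.resid, lags=[10]))         # residual whiteness check
    ```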

  14. End-group characterisation of poly(propylene glycol)s by means of electrospray ionisation-tandem mass spectrometry (ESI-MS/MS).

    PubMed

    Jackson, Anthony T; Slade, Susan E; Thalassinos, Konstantinos; Scrivens, James H

    2008-10-01

    The end-group functionalisation of a series of poly(propylene glycol)s has been characterised by means of electrospray ionisation-tandem mass spectrometry (ESI-MS/MS). A series of peaks with mass-to-charge ratios that are close to that of the precursor ion were used to generate information on the end-group functionalities of the poly(propylene glycol)s. Fragment ions resulting from losses of both of the end groups were noted from some of the samples. An example is presented of how software can be used to significantly reduce the length of time involved in data interpretation (which is typically the most time-consuming part of the analysis).

  15. Enhancements to the Branched Lagrangian Transport Modeling System

    USGS Publications Warehouse

    Jobson, Harvey E.

    1997-01-01

    The Branched Lagrangian Transport Model (BLTM) has received wide use within the U.S. Geological Survey over the past 10 years. This report documents the enhancements and modifications that have been made to this modeling system since it was first introduced. The programs in the modeling system are arranged into five levels: programs to generate time-series of meteorological data (EQULTMP, SOLAR), programs to process time-series data (INTRP, MRG), programs to build input files for the transport model (BBLTM, BQUAL2E), the model with defined reaction kinetics (BLTM, QUAL2E), and post-processor plotting programs (CTPLT, CXPLT). An example application is presented to illustrate how the modeling system can be used to simulate 10 water-quality constituents in the Chattahoochee River below Atlanta, Georgia.

  16. Time Series Analysis of Technology Trends based on the Internet Resources

    NASA Astrophysics Data System (ADS)

    Kobayashi, Shin-Ichi; Shirai, Yasuyuki; Hiyane, Kazuo; Kumeno, Fumihiro; Inujima, Hiroshi; Yamauchi, Noriyoshi

    Information technology has become increasingly important in recent years for the development of our society, and it has brought many changes to everything in our society with incredible speed. Hence, when we investigate R & D themes or plan business strategies in IT, we must understand the overall situation around the target technology area, not just the technology itself. In particular, it is crucial to understand the overall situation as a time series in order to know what will happen in the near future in the target area. For this purpose, we developed a method to generate Multiple-phased trend maps automatically based on Internet content. Furthermore, we introduced quantitative indicators to analyze possible changes in the near future. Evaluation of this method produced successful and interesting results.

  17. Single center experience with third-generation cryosurgery for management of organ-confined prostate cancer: critical evaluation of short-term outcomes, complications, and patient quality of life.

    PubMed

    Hubosky, Scott G; Fabrizio, Michael D; Schellhammer, Paul F; Barone, Bethany B; Tepera, Christopher M; Given, Robert W

    2007-12-01

    Technical refinements such as improved ultrasonographic localization and the routine use of urethral warmers and small-gauge needle delivery systems have renewed interest in cryosurgical treatment as a minimally invasive option for selected patients with localized prostate cancer. Only three reports of quality of life (QoL) in prostate cryoablation exist, and none report on patients treated with third-generation cryoablative technology. We critically examine our initial series of consecutive patients at a single institution undergoing primary third-generation cryosurgical treatment of localized prostate cancer with respect to treatment outcome, morbidity profile, and QoL parameters. To our knowledge, this is the first QoL report on third-generation cryoablation of the prostate. We retrospectively review the records of 89 consecutive patients with median follow-up of 11 months (1-32) who have undergone third-generation cryosurgical ablation of the prostate as primary treatment for localized prostate cancer with intention to cure. Patients were risk stratified according to preprocedural parameters of prostate-specific antigen (PSA), clinical stage, and Gleason score. PSA trends were recorded and treatment effectiveness was observed using different definitions of biochemical failure. Charts were reviewed for postprocedure complications. Quality of life was measured prospectively using the University of California, Los Angeles, Prostate Cancer Index as well as American Urological Association symptom scores. We compare the percent of baseline score (%BS) for various domains between our series of patients treated with primary cryoablation and a series of patients undergoing brachytherapy for localized prostate cancer. Treatment success was defined by achievement of a PSA nadir of < or =0.1 ng/mL and by biochemical disease-free survival (BDFS) assessed with both a PSA threshold of < or =0.4 ng/mL over time and the American Society for Therapeutic Radiology and Oncology (ASTRO) definition of three consecutive rises in PSA. According to risk stratification, 86%, 81.5%, and 78% of low-, intermediate-, and high-risk patients, respectively, achieved a PSA nadir of < or =0.1 ng/mL. Overall, at 12 months follow-up, 94% of patients achieved BDFS using ASTRO criteria while 70% achieved BDFS using a PSA threshold of < or =0.4 ng/mL. With risk stratification, 74%, 70%, and 60% of low-, intermediate-, and high-risk patients, respectively, achieved BDFS defined by a PSA threshold of < or =0.4 ng/mL. Complications were rare. The response rate for Health Related Quality of Life (HRQoL) questionnaires was 71% for cryoablation patients and 51% for brachytherapy patients. At 12 months follow-up, patients undergoing cryoablation on average achieved urinary and bowel domain scores comparable to baseline, but sexual domains remained well below baseline. When compared with a brachytherapy series with better baseline sexual function (P = 0.04) and urinary function (P = 0.03), cryotherapy patients experienced a more negative impact on sexual function steadily for up to 12 months (P = 0.02). Urinary function was similar between the groups until 18 months, at which time cryoablation patients fared better (P = 0.01); this was sustained up to 24 months (P = 0.04). Treatment success with cryosurgery varies with definition; however, our results are comparable to other series with regard to short-term cancer control. Complication rates in this series of third-generation cryosurgical patients are low.
QoL characteristics of third-generation cryoablation are similar to those described in second-generation cryoablation series. Compared with brachytherapy, cryotherapy results in less irritative and obstructive voiding symptoms in the early post-treatment period and may improve urinary function up to 24 months after treatment. In a small group of older patients with baseline erectile dysfunction undergoing cryoablation, sexual function returns to 20% of its baseline value with up to 12 months follow-up.

  18. Local processes and regional patterns - Interpreting a multi-decadal altimetry record of Greenland Ice Sheet changes

    NASA Astrophysics Data System (ADS)

    Csatho, B. M.; Schenk, A. F.; Babonis, G. S.; van den Broeke, M. R.; Kuipers Munneke, P.; van der Veen, C. J.; Khan, S. A.; Porter, D. F.

    2016-12-01

    This study presents a new, comprehensive reconstruction of Greenland Ice Sheet elevation changes, generated using the Surface Elevation And Change detection (SERAC) approach. 35-year long elevation-change time series (1980-2015) were obtained at more than 150,000 locations from observations acquired by NASA's airborne and spaceborne laser altimeters (ATM, LVIS, ICESat), PROMICE laser altimetry data (2007-2011) and a DEM covering the ice sheet margin derived from stereo aerial photographs (1970s-80s). After removing the effect of Glacial Isostatic Adjustment (GIA) and the elastic crustal response to changes in ice loading, the time series were partitioned into changes due to surface processes and ice dynamics and then converted into mass change histories. Using gridded products, we examined ice sheet elevation, and mass change patterns, and compared them with other estimates at different scales from individual outlet glaciers through large drainage basins, on to the entire ice sheet. Both the SERAC time series and the grids derived from these time series revealed significant spatial and temporal variations of dynamic mass loss and widespread intermittent thinning, indicating the complexity of ice sheet response to climate forcing. To investigate the regional and local controls of ice dynamics, we examined thickness change time series near outlet glacier grounding lines. Changes on most outlet glaciers were consistent with one or more episodes of dynamic thinning that propagates upstream from the glacier terminus. The spatial pattern of the onset, duration, and termination of these dynamic thinning events suggest a regional control, such as warming ocean and air temperatures. However, the intricate spatiotemporal pattern of dynamic thickness change suggests that, regardless of the forcing responsible for initial glacier acceleration and thinning, the response of individual glaciers is modulated by local conditions. We use statistical methods, such as principal component analysis and multivariate regression to analyze the dynamic ice-thickness change time series derived by SERAC and to investigate the primary forcings and controls on outlet glacier changes.

  19. Long-term retrospective analysis of mackerel spawning in the North Sea: a new time series and modeling approach to CPR data.

    PubMed

    Jansen, Teunis; Kristensen, Kasper; Payne, Mark; Edwards, Martin; Schrum, Corinna; Pitois, Sophie

    2012-01-01

    We present a unique view of mackerel (Scomber scombrus) in the North Sea based on a new time series of larvae caught by the Continuous Plankton Recorder (CPR) survey from 1948-2005, covering the period both before and after the collapse of the North Sea stock. Hydrographic backtrack modelling suggested that the effect of advection is very limited between spawning and larvae capture in the CPR survey. Using a statistical technique not previously applied to CPR data, we then generated a larval index that accounts for both catchability as well as spatial and temporal autocorrelation. The resulting time series documents the significant decrease of spawning from before 1970 to recent depleted levels. Spatial distributions of the larvae, and thus the spawning area, showed a shift from early to recent decades, suggesting that the central North Sea is no longer as important as the areas further west and south. These results provide a consistent and unique perspective on the dynamics of mackerel in this region and can potentially resolve many of the unresolved questions about this stock.

  20. Time Series Analysis for Spatial Node Selection in Environment Monitoring Sensor Networks

    PubMed Central

    Bhandari, Siddhartha; Jurdak, Raja; Kusy, Branislav

    2017-01-01

    Wireless sensor networks are widely used in environmental monitoring. The number of sensor nodes to be deployed will vary depending on the desired spatio-temporal resolution. Selecting an optimal number, position and sampling rate for an array of sensor nodes in environmental monitoring is a challenging question. Most of the current solutions are either theoretical or simulation-based where the problems are tackled using random field theory, computational geometry or computer simulations, limiting their specificity to a given sensor deployment. Using an empirical dataset from a mine rehabilitation monitoring sensor network, this work proposes a data-driven approach where co-integrated time series analysis is used to select the number of sensors from a short-term deployment of a larger set of potential node positions. Analyses conducted on temperature time series show 75% of sensors are co-integrated. Using only 25% of the original nodes can generate a complete dataset within a 0.5 °C average error bound. Our data-driven approach to sensor position selection is applicable for spatiotemporal monitoring of spatially correlated environmental parameters to minimize deployment cost without compromising data resolution. PMID:29271880
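
    The pairwise co-integration idea can be sketched with statsmodels; the temperature series below are synthetic random walks, and the 0.05 decision threshold is an illustrative assumption rather than the authors' criterion.

    ```python
    # Flag sensor nodes whose series are co-integrated with a retained reference node.
    import numpy as np
    from statsmodels.tsa.stattools import coint

    rng = np.random.default_rng(8)
    n = 2000
    common = np.cumsum(rng.normal(0, 0.1, n))              # shared temperature driver

    reference = 20 + common + rng.normal(0, 0.2, n)        # node kept in the deployment
    redundant = 18 + 0.9 * common + rng.normal(0, 0.2, n)  # co-integrated with reference
    independent = 20 + np.cumsum(rng.normal(0, 0.1, n))    # follows its own random walk

    for name, series in [("redundant", redundant), ("independent", independent)]:
        t_stat, p_value, _ = coint(reference, series)      # Engle-Granger co-integration test
        keep = p_value > 0.05                              # not co-integrated -> keep the node
        print(f"{name}: p = {p_value:.3f}, deploy separately: {keep}")
    ```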

  2. A graph-based approach to detect spatiotemporal dynamics in satellite image time series

    NASA Astrophysics Data System (ADS)

    Guttler, Fabio; Ienco, Dino; Nin, Jordi; Teisseire, Maguelonne; Poncelet, Pascal

    2017-08-01

    Enhancing the frequency of satellite acquisitions represents a key issue for the Earth Observation community nowadays. Repeated observations are crucial for monitoring purposes, particularly when intra-annual processes should be taken into account. Time series of images constitute a valuable source of information in these cases. The goal of this paper is to propose a new methodological framework to automatically detect and extract spatiotemporal information from satellite image time series (SITS). Existing methods dealing with such data are usually classification-oriented and cannot provide information about evolutions and temporal behaviors. In this paper we propose a graph-based strategy that combines object-based image analysis (OBIA) with data mining techniques. Image objects computed at each individual timestamp are connected across the time series to generate a set of evolution graphs. Each evolution graph is associated with a particular area within the study site and stores information about its temporal evolution. Such information can be explored in depth at the evolution graph scale or used to compare the graphs and supply a general picture at the study site scale. We validated our framework on two study sites located in the south of France and involving different types of natural, semi-natural and agricultural areas. The results obtained from a Landsat SITS support the quality of the methodological approach and illustrate how the framework can be employed to extract and characterize spatiotemporal dynamics.
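
    A toy sketch of the evolution-graph construction (not the authors' implementation): segmented objects at consecutive timestamps are linked by spatial overlap using networkx, and each weakly connected component plays the role of one evolution graph. Object identifiers, pixel sets and the overlap threshold are hypothetical.

    ```python
    # Link image objects across timestamps by spatial overlap to form evolution graphs.
    import networkx as nx

    # Hypothetical segmented objects per timestamp: {object_id: set of pixel indices}.
    timestamps = [
        {"t0_a": {1, 2, 3, 4}, "t0_b": {10, 11}},
        {"t1_a": {2, 3, 4, 5}, "t1_b": {10, 11, 12}},
        {"t2_a": {3, 4, 5, 6, 7}},
    ]

    g = nx.DiGraph()
    for t in range(len(timestamps) - 1):
        for obj_id, pixels in timestamps[t].items():
            for next_id, next_pixels in timestamps[t + 1].items():
                overlap = len(pixels & next_pixels) / len(pixels | next_pixels)
                if overlap > 0.3:                       # arbitrary association threshold
                    g.add_edge(obj_id, next_id, weight=overlap)

    # Each weakly connected component is one evolution graph for an area of the site.
    for component in nx.weakly_connected_components(g):
        print(sorted(component))
    ```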

  3. Dynamic Cross-Entropy.

    PubMed

    Aur, Dorian; Vila-Rodriguez, Fidel

    2017-01-01

    Complexity measures for time series have been used in many applications to quantify the regularity of one-dimensional time series; however, many dynamical systems are spatially distributed multidimensional systems. We introduced Dynamic Cross-Entropy (DCE), a novel multidimensional complexity measure that quantifies the degree of regularity of EEG signals in selected frequency bands. Time series generated by discrete logistic equations with varying control parameter r are used to test the DCE measure. Sliding-window DCE analyses are able to reveal specific period-doubling bifurcations that lead to chaos. A similar behavior can be observed in seizures triggered by electroconvulsive therapy (ECT). Sample entropy data show the level of signal complexity in different phases of the ictal ECT. The transition to irregular activity is preceded by the occurrence of cyclic regular behavior. A significant increase of DCE values in successive order from high frequencies in the gamma band to low frequencies in the delta band reveals several phase transitions into less ordered states, possibly chaos, in the human brain. To our knowledge there are no reliable techniques able to reveal the transition to chaos in the case of multidimensional time series. In addition, DCE based on sample entropy appears to be robust to EEG artifacts compared to DCE based on Shannon entropy. The applied technique may offer new approaches to better understand nonlinear brain activity. Copyright © 2016 Elsevier B.V. All rights reserved.
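
    A schematic reproduction of the kind of test signal used above (not the authors' DCE implementation): logistic-map series across the period-doubling route to chaos, with a naive sample-entropy estimate showing how regularity changes with the control parameter r.

    ```python
    # Logistic-map test series and a naive sample-entropy estimate of their regularity.
    import numpy as np

    def logistic_series(r, n=400, x0=0.4, burn=500):
        x, out = x0, np.empty(n)
        for i in range(n + burn):
            x = r * x * (1.0 - x)
            if i >= burn:
                out[i - burn] = x
        return out

    def sample_entropy(x, m=2, r_tol=0.2):
        """Naive O(n^2) sample entropy of a 1-D series."""
        x = np.asarray(x)
        tol = r_tol * x.std()
        def count(mm):
            templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
            d = np.abs(templates[:, None, :] - templates[None, :, :]).max(axis=2)
            return (d <= tol).sum() - len(templates)    # exclude self-matches
        B, A = count(m), count(m + 1)
        return -np.log(A / B)

    for r in (3.2, 3.5, 3.9):          # period-2, period-4 and chaotic regimes
        print(f"r = {r}: SampEn ~ {sample_entropy(logistic_series(r)):.3f}")
    ```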

  4. Reconstruction of extended Petri nets from time series data and its application to signal transduction and to gene regulatory networks

    PubMed Central

    2011-01-01

    Background Network inference methods reconstruct mathematical models of molecular or genetic networks directly from experimental data sets. We have previously reported a mathematical method which is exclusively data-driven, does not involve any heuristic decisions within the reconstruction process, and delivers all possible alternative minimal networks in terms of simple place/transition Petri nets that are consistent with a given discrete time series data set. Results We fundamentally extended the previously published algorithm to consider catalysis and inhibition of the reactions that occur in the underlying network. The results of the reconstruction algorithm are encoded in the form of an extended Petri net involving control arcs. This allows the consideration of processes involving mass flow and/or regulatory interactions. As a non-trivial test case, the phosphate regulatory network of enterobacteria was reconstructed using in silico-generated time-series data sets on wild-type and in silico mutants. Conclusions The new exact algorithm reconstructs extended Petri nets from time series data sets by finding all alternative minimal networks that are consistent with the data. It suggested alternative molecular mechanisms for certain reactions in the network. The algorithm is useful to combine data from wild-type and mutant cells and may potentially integrate physiological, biochemical, pharmacological, and genetic data in the form of a single model. PMID:21762503
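    The reconstruction algorithm itself is involved, but its starting point can be sketched: the difference vectors between consecutive sampled states are what any reconstructed place/transition net must be able to realise through firings of its transitions. The toy state matrix below is illustrative; the decomposition of difference vectors into minimal reactions and the handling of control (catalysis/inhibition) arcs are not shown.

```python
import numpy as np

# Discrete time series of species/metabolite levels: rows = time points,
# columns = places of the Petri net (values here are illustrative only).
states = np.array([
    [2, 0, 1, 0],
    [1, 1, 1, 0],
    [0, 2, 0, 1],
    [0, 2, 0, 1],
])

# Difference vectors between consecutive states: every reconstructed net must
# realise each of these through a combination of its transition firing
# vectors; all-zero rows correspond to steady intervals.
diffs = np.diff(states, axis=0)
candidate_steps = [d for d in diffs if np.any(d)]
for d in candidate_steps:
    consumed = {i for i, v in enumerate(d) if v < 0}
    produced = {i for i, v in enumerate(d) if v > 0}
    print(f"change {d}: consumes places {consumed}, produces places {produced}")
```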

  5. Interpretable Categorization of Heterogeneous Time Series Data

    NASA Technical Reports Server (NTRS)

    Lee, Ritchie; Kochenderfer, Mykel J.; Mengshoel, Ole J.; Silbermann, Joshua

    2017-01-01

    We analyze data from simulated aircraft encounters to validate and inform the development of a prototype aircraft collision avoidance system. The high-dimensional and heterogeneous time series dataset is analyzed to discover properties of near mid-air collisions (NMACs) and categorize the NMAC encounters. Domain experts use these properties to better organize and understand NMAC occurrences. Existing solutions either are not capable of handling high-dimensional and heterogeneous time series datasets or do not provide explanations that are interpretable by a domain expert. The latter is critical to the acceptance and deployment of safety-critical systems. To address this gap, we propose grammar-based decision trees along with a learning algorithm. Our approach extends decision trees with a grammar framework for classifying heterogeneous time series data. A context-free grammar is used to derive decision expressions that are interpretable, application-specific, and support heterogeneous data types. In addition to classification, we show how grammar-based decision trees can also be used for categorization, which is a combination of clustering and generating interpretable explanations for each cluster. We apply grammar-based decision trees to a simulated aircraft encounter dataset and evaluate the performance of four variants of our learning algorithm. The best algorithm is used to analyze and categorize near mid-air collisions in the aircraft encounter dataset. We describe each discovered category in detail and discuss its relevance to aircraft collision avoidance.

  6. Confounding environmental colour and distribution shape leads to underestimation of population extinction risk.

    PubMed

    Fowler, Mike S; Ruokolainen, Lasse

    2013-01-01

    The colour of environmental variability influences the size of population fluctuations when filtered through density-dependent dynamics, driving extinction risk through dynamical resonance. Slow fluctuations (low frequencies) dominate in red environments, rapid fluctuations (high frequencies) in blue environments and white environments are purely random (no frequencies dominate). Two methods are commonly employed to generate the coloured spatial and/or temporal stochastic (environmental) series used in combination with population (dynamical feedback) models: autoregressive [AR(1)] and sinusoidal (1/f) models. We show that changing environmental colour from white to red with 1/f models, and from white to red or blue with AR(1) models, generates coloured environmental series that are not normally distributed at finite time-scales, potentially confounding comparison with normally distributed white noise models. Increasing variability of sample skewness and kurtosis and decreasing mean kurtosis of these series alter the frequency distribution shape of the realised values of the coloured stochastic processes. These changes in distribution shape alter patterns in the probability of single extreme conditions and of series of extreme conditions. We show that the reduced extinction risk for undercompensating (slow growing) populations in red environments previously predicted with traditional 1/f methods is an artefact of changes in the distribution shapes of the environmental series. This is demonstrated by comparison with coloured series controlled to be normally distributed using spectral mimicry. Changes in the distribution shape that arise using traditional methods lead to underestimation of extinction risk in normally distributed, red 1/f environments. AR(1) methods also underestimate extinction risks in traditionally generated red environments. This work synthesises previous results and provides further insight into the processes driving extinction risk in model populations. We must let the characteristics of known natural environmental covariates (e.g., colour and distribution shape) guide us in our choice of how best to model the impact of coloured environmental variation on population dynamics.
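    A hedged sketch of one of the two generation methods discussed above: an AR(1) process with autocorrelation parameter kappa (kappa > 0 red, kappa < 0 blue), variance-corrected so the long-run variance is constant, followed by a check of how the sample skewness and kurtosis of short realisations vary with colour. The series length of 50 and the number of replicates are illustrative choices, and this is not the authors' spectral-mimicry control.

```python
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(1)

def ar1_series(kappa, n, rng):
    """AR(1) environmental noise: x_t = kappa*x_{t-1} + sqrt(1 - kappa^2)*e_t.
    kappa > 0 gives red (positively autocorrelated) noise, kappa < 0 blue."""
    e = rng.normal(size=n)
    x = np.empty(n)
    x[0] = e[0]
    for t in range(1, n):
        x[t] = kappa * x[t - 1] + np.sqrt(1 - kappa**2) * e[t]
    return x

# the distribution shape of short (finite) series varies more as colour reddens
for kappa in (0.0, 0.5, 0.9):
    samples = [ar1_series(kappa, 50, rng) for _ in range(1000)]
    sk = [skew(s) for s in samples]
    ku = [kurtosis(s) for s in samples]
    print(f"kappa={kappa}: sd(skewness)={np.std(sk):.2f}, mean kurtosis={np.mean(ku):.2f}")
```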

  7. Design automation techniques for custom LSI arrays

    NASA Technical Reports Server (NTRS)

    Feller, A.

    1975-01-01

    The standard cell design automation technique is described as an approach for generating random logic PMOS, CMOS or CMOS/SOS custom large scale integration arrays with low initial nonrecurring costs and quick turnaround time or design cycle. The system is composed of predesigned circuit functions or cells and computer programs capable of automatic placement and interconnection of the cells in accordance with an input data net list. The program generates a set of instructions to drive an automatic precision artwork generator. A series of support design automation and simulation programs are described, including programs for verifying correctness of the logic on the arrays, performing dc and dynamic analysis of MOS devices, and generating test sequences.

  8. Impact of strong climate change on balancing and storage needs in a fully renewable energy system

    NASA Astrophysics Data System (ADS)

    Weber, Juliane; Wohland, Jan; Witthaut, Dirk

    2017-04-01

    We investigate the impact of strong climate change on a European energy system dominated by wind power. No robust trend can be observed regarding the change of the wind power yield for most countries in Europe. However, intra-annual variabilities in wind power generation robustly increase in most of Central and Western Europe and decrease in Spain, Portugal and Greece by the end of this century. Thus, the generation of wind power tends to increase (decrease) in the winter months compared to the summer months. Due to higher (lower) intra-annual variations, the probability for extreme events with long periods of low power production increases (decreases) in summer. This implies that more (less) energy has to be provided by backup power plants. Our simulations are based on the results of five different Global Climate Models (GCMs) using the Representative Concentration Pathway scenario 8.5 (RCP8.5). These results are dynamically downscaled with the regional atmospheric model RCA4 by the EURO-CORDEX initiative (Coordinated Downscaling Experiment - European Domain). Comparisons were made between historical data (1970-2000) and mid-century (2030-2060) and end-of-century (2070-2100) data. For all timeframes we assumed that a certain amount of energy is provided by wind power plants. This implies that changes in wind power potentials are neglected and only temporal effects are considered. Wind speed time series are converted to power generation time series using an extrapolation to hub height and a standardized power curve. Assuming a scenario for the future distribution of wind turbines, we obtain a wind power generation time series aggregated on a national level. The operation of backup power plants and storage facilities is simulated on coarse scales assuming an optimal storage strategy. Backup is required whenever the storage facilities are empty. The amount of change of the backup energy depends on the storage capacity - the higher the capacity, the higher the change as long as storage capacities do not allow for multi-year storage.
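    The wind-to-power conversion step can be sketched as follows: a power-law extrapolation from 10 m to hub height and a generic standardized power curve with cut-in, rated and cut-out speeds. The turbine parameters, the 1/7 shear exponent and the Weibull-distributed test winds are illustrative assumptions, not the values used in the study.

```python
import numpy as np

def wind_to_power(v10, hub_height=100.0, z_ref=10.0, alpha=1/7,
                  v_cut_in=3.0, v_rated=12.0, v_cut_out=25.0):
    """Convert a 10 m wind-speed time series to normalised power output using a
    power-law extrapolation to hub height and a generic standardized power
    curve; all turbine parameters here are illustrative."""
    v_hub = v10 * (hub_height / z_ref) ** alpha           # extrapolate to hub height
    p = np.zeros_like(v_hub)
    ramp = (v_hub >= v_cut_in) & (v_hub < v_rated)
    p[ramp] = (v_hub[ramp]**3 - v_cut_in**3) / (v_rated**3 - v_cut_in**3)
    p[(v_hub >= v_rated) & (v_hub < v_cut_out)] = 1.0     # rated output
    return p                                              # zero below cut-in / above cut-out

# hourly 10 m wind speeds (m/s) -> capacity-factor time series
v10 = np.random.default_rng(0).weibull(2.0, size=24 * 365) * 7.0
power = wind_to_power(v10)
print(power.mean())
```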

  9. Spectral analysis of a two-species competition model: Determining the effects of extreme conditions on the color of noise generated from simulated time series

    NASA Astrophysics Data System (ADS)

    Golinski, M. R.

    2006-07-01

    Ecologists have observed that environmental noise affects population variance in the logistic equation for one-species growth. Interactions between deterministic and stochastic dynamics in a one-dimensional system result in increased variance in species population density over time. Since natural populations do not live in isolation, the present paper simulates a discrete-time two-species competition model with environmental noise to determine the type of colored population noise generated by extreme conditions in the long-term population dynamics of competing populations. Discrete Fourier analysis is applied to the simulation results and the calculated Hurst exponent ( H) is used to determine how the color of population noise for the two species corresponds to extreme conditions in population dynamics. To interpret the biological meaning of the color of noise generated by the two-species model, the paper determines the color of noise generated by three reference models: (1) A two-dimensional discrete-time white noise model (0⩽ H<1/2); (2) A two-dimensional fractional Brownian motion model (H=1/2); and (3) A two-dimensional discrete-time model with noise for unbounded growth of two uncoupled species (1/2< H⩽1).
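    A hedged sketch of the simulation-and-analysis loop described above, using a Ricker-type two-species competition map with additive environmental noise and a Hurst exponent estimated from the low-frequency slope of the periodogram (assuming the fractional-Gaussian-noise convention S(f) ~ f^-(2H-1)). The model form, parameter values and the spectral estimator are assumptions and may differ from the paper's.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate(n=4096, r1=1.8, r2=1.7, a12=0.6, a21=0.7, sigma=0.1):
    """Ricker-type two-species competition with environmental noise (illustrative)."""
    x = np.empty((n, 2))
    x[0] = (0.5, 0.5)
    for t in range(1, n):
        e = rng.normal(scale=sigma, size=2)
        x[t, 0] = x[t-1, 0] * np.exp(r1 * (1 - x[t-1, 0] - a12 * x[t-1, 1]) + e[0])
        x[t, 1] = x[t-1, 1] * np.exp(r2 * (1 - x[t-1, 1] - a21 * x[t-1, 0]) + e[1])
    return x

def spectral_hurst(series):
    """Estimate H from the low-frequency periodogram slope, assuming S(f) ~ f^-(2H-1)."""
    f = np.fft.rfftfreq(len(series))[1:]
    p = np.abs(np.fft.rfft(series - series.mean()))[1:] ** 2
    low = f < 0.1
    beta = -np.polyfit(np.log(f[low]), np.log(p[low]), 1)[0]
    return (beta + 1) / 2

pops = simulate()
print([spectral_hurst(pops[:, i]) for i in range(2)])
```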

  10. New Times, New Fathers = A temps moderne, papas modernes.

    ERIC Educational Resources Information Center

    Theilheimer, Ish, Ed.

    1994-01-01

    This theme issue of "Transition" features a series of articles on fatherhood and the changing role of fathers in parenting. The articles include: (1) "From Cloth to Paper Diapers and Back: Reflections on Fatherhood during Two Generations" (Robert Couchman), which relates experiences of a new father 20 years ago and today; (2)…

  11. Terrain Measurement with SAR/InSAR

    NASA Astrophysics Data System (ADS)

    Li, Deren; Liao, Mingsheng; Balz, Timo; Zhang, Lu; Yang, Tianliang

    2016-08-01

    Terrain measurement and surface motion estimation are the most important applications for commercial and scientific SAR missions. In Dragon-3, we worked on these applications, especially regarding DEM generation, surface motion estimation with SAR time series for urban subsidence monitoring and landslide motion estimation, as well as developing tomographic SAR processing methods in urban areas.

  12. Using the Microcomputer to Generate Materials for Bibliographic Instruction.

    ERIC Educational Resources Information Center

    Hendley, Gaby G.

    Guide-worksheets were developed on a word processor in a high school library for bibliographic instruction of English and social studies students to cover the following reference sources: Facts on File; Social Issues Resource Series (S.I.R.S.); Editorial Research Reports; Great Contemporary Issues (New York Times), which also includes Facts on…

  13. Time series analysis of travel trends in Vermont

    Treesearch

    Varna M. Ramaswamy; Walter F. Kuentzel

    1995-01-01

    Vermont's travel and tourism industry is not keeping pace with the nation-wide growth in the travel industry. While travel indicators such as domestic travel expenditures, tourism generated employment, payroll and tax receipts have been steadily increasing across the United States, these indicators in Vermont peaked in 1978 and have declined ever since. The state...

  14. Further characterization of the time transfer capabilities of precise point positioning (PPP): the Sliding Batch Procedure.

    PubMed

    Guyennon, Nicolas; Cerretto, Giancarlo; Tavella, Patrizia; Lahaye, François

    2009-08-01

    In recent years, many national timing laboratories have installed geodetic Global Positioning System receivers together with their traditional GPS/GLONASS Common View receivers and Two Way Satellite Time and Frequency Transfer equipment. Many of these geodetic receivers operate continuously within the International GNSS Service (IGS), and their data are regularly processed by IGS Analysis Centers. From its global network of over 350 stations and its Analysis Centers, the IGS generates precise combined GPS ephemerides and station and satellite clock time series referred to the IGS Time Scale. A processing method called Precise Point Positioning (PPP) is in use in the geodetic community allowing precise recovery of GPS antenna position, clock phase, and atmospheric delays by taking advantage of these IGS precise products. Previous assessments, carried out at Istituto Nazionale di Ricerca Metrologica (INRiM; formerly IEN) with a PPP implementation developed at Natural Resources Canada (NRCan), showed that PPP clock solutions have better stability over the short/medium term than the GPS CV and GPS P3 methods and significantly reduce the day-boundary discontinuities when used in multi-day continuous processing, allowing time-limited, campaign-style time-transfer experiments. This paper reports on follow-on work performed at INRiM and NRCan to further characterize and develop the PPP method for time transfer applications, using data from some of the National Metrology Institutes. We develop a processing procedure that takes advantage of the improved stability of the phase-connected multi-day PPP solutions while allowing the generation of continuous clock time series, more applicable to continuous operation/monitoring of timing equipment.

  15. Statistical characteristics of surrogate data based on geophysical measurements

    NASA Astrophysics Data System (ADS)

    Venema, V.; Bachner, S.; Rust, H. W.; Simmer, C.

    2006-09-01

    In this study, the statistical properties of a range of measurements are compared with those of their surrogate time series. Seven different records are studied, amongst others, historical time series of mean daily temperature, daily rain sums and runoff from two rivers, and cloud measurements. Seven different algorithms are used to generate the surrogate time series. The best-known method is the iterative amplitude adjusted Fourier transform (IAAFT) algorithm, which is able to reproduce the measured distribution as well as the power spectrum. Using this setup, the measurements and their surrogates are compared with respect to their power spectrum, increment distribution, structure functions, annual percentiles and return values. It is found that the surrogates that reproduce the power spectrum and the distribution of the measurements are able to closely match the increment distributions and the structure functions of the measurements, but this often does not hold for surrogates that only mimic the power spectrum of the measurement. However, even the best performing surrogates do not have asymmetric increment distributions, i.e., they cannot reproduce nonlinear dynamical processes that are asymmetric in time. Furthermore, we have found deviations of the structure functions on small scales.
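    The IAAFT algorithm named above is standard enough to sketch: the surrogate alternately has the target power spectrum imposed in the Fourier domain and the target amplitude distribution imposed by rank ordering. The fixed iteration count is a simplification (implementations usually iterate to convergence), and this is not the exact code used in the study.

```python
import numpy as np

def iaaft(x, n_iter=100, rng=None):
    """Iterative amplitude adjusted Fourier transform surrogate: reproduces the
    amplitude distribution of x and (approximately) its power spectrum."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x, float)
    sorted_x = np.sort(x)                  # target amplitude distribution
    target_amp = np.abs(np.fft.rfft(x))    # target Fourier amplitudes
    s = rng.permutation(x)                 # random initial shuffle
    for _ in range(n_iter):
        # impose the target power spectrum while keeping the current phases
        phases = np.angle(np.fft.rfft(s))
        s = np.fft.irfft(target_amp * np.exp(1j * phases), n=len(x))
        # impose the target amplitude distribution by rank ordering
        s = sorted_x[np.argsort(np.argsort(s))]
    return s

# surrogate = iaaft(measured_series)  # e.g. a daily temperature or rain-sum record
```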

  16. An Interrupted Time-Series Analysis of Durkheim's Social Deregulation Thesis: The Case of the Russian Federation

    PubMed Central

    Pridemore, William Alex; Chamlin, Mitchell B.; Cochran, John K.

    2009-01-01

    The dissolution of the Soviet Union resulted in sudden, widespread, and fundamental changes to Russian society. The former social welfare system-with its broad guarantees of employment, healthcare, education, and other forms of social support-was dismantled in the shift toward democracy, rule of law, and a free-market economy. This unique natural experiment provides a rare opportunity to examine the potentially disintegrative effects of rapid social change on deviance, and thus to evaluate one of Durkheim's core tenets. We took advantage of this opportunity by performing interrupted time-series analyses of annual age-adjusted homicide, suicide, and alcohol-related mortality rates for the Russian Federation using data from 1956 to 2002, with 1992-2002 as the postintervention time-frame. The ARIMA models indicate that, controlling for the long-term processes that generated these three time series, the breakup of the Soviet Union was associated with an appreciable increase in each of the cause-of-death rates. We interpret these findings as being consistent with the Durkheimian hypothesis that rapid social change disrupts social order, thereby increasing the level of crime and deviance. PMID:20165565
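    As a hedged illustration of the interrupted time-series design (not the authors' ARIMA specification), the sketch below fits an ARIMA noise model with a post-1992 step regressor to a synthetic annual rate series; the coefficient on the step variable plays the role of the estimated post-dissolution shift. The (1, 0, 0) order and the synthetic data are assumptions.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Illustrative interrupted time-series setup (synthetic data, not the Russian
# mortality series): annual rates 1956-2002 with a step intervention in 1992.
years = np.arange(1956, 2003)
step = (years >= 1992).astype(float)            # post-dissolution indicator
rng = np.random.default_rng(0)
rate = 10 + 0.05 * (years - 1956) + 4.0 * step + rng.normal(0, 1, len(years))

# ARIMA noise model for the underlying process plus the intervention regressor;
# the (1, 0, 0) order is an assumption, not the published specification.
model = ARIMA(rate, exog=step, order=(1, 0, 0)).fit()
print(model.params)   # the exog coefficient estimates the post-1992 level shift
```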

  17. Frequency Analysis of Modis Ndvi Time Series for Determining Hotspot of Land Degradation in Mongolia

    NASA Astrophysics Data System (ADS)

    Nasanbat, E.; Sharav, S.; Sanjaa, T.; Lkhamjav, O.; Magsar, E.; Tuvdendorj, B.

    2018-04-01

    This study examines whether MODIS NDVI satellite imagery time series can be used to determine hotspots of land degradation across the whole of Mongolia. The Mann-Kendall trend statistical analysis was applied to a 16-year MODIS NDVI satellite imagery record, based on 16-day composited temporal data (from May to September) for the growing seasons from 2000 to 2016. We performed a frequency analysis in which the resulting NDVI residual trend pattern enabled negative and positive changes in photosynthetically healthy vegetation to be determined. Our results showed negative and positive trend values and generated a map of significant trends. We also examined long-term meteorological parameters for the same period. The results showed that positive and negative NDVI trends concurred with changes in land cover types, representing an improvement or a degradation in vegetation, respectively. Changes in the climate parameters precipitation and air temperature over the same period also appear to have affected large areas of the NDVI trends. The applied time series trend analysis successfully determined hotspots of improvement and degradation due to land degradation and desertification.
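    The per-pixel trend test can be illustrated with a plain Mann-Kendall implementation (no tie or autocorrelation correction, which the study may have applied); the NDVI values below are synthetic.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall trend test (no tie/autocorrelation correction): returns the
    S statistic, the standardised Z score and a two-sided p-value."""
    x = np.asarray(x, float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - norm.cdf(abs(z)))
    return s, z, p

# e.g. per-pixel growing-season NDVI means for 2000-2016 (synthetic example)
ndvi = np.array([0.42, 0.41, 0.40, 0.39, 0.40, 0.38, 0.37, 0.37,
                 0.36, 0.35, 0.36, 0.34, 0.33, 0.34, 0.32, 0.31, 0.30])
print(mann_kendall(ndvi))   # significant negative trend -> degradation hotspot
```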

  18. Ship Speed Retrieval From Single Channel TerraSAR-X Data

    NASA Astrophysics Data System (ADS)

    Soccorsi, Matteo; Lehner, Susanne

    2010-04-01

    A method to estimate the speed of a moving ship is presented. The technique, introduced in Kirscht (1998), is extended to marine applications and validated on TerraSAR-X High-Resolution (HR) data. The generation of a sequence of single-look SAR images from a single-channel image corresponds to an image time series with reduced resolution. This allows change detection techniques to be applied to the time series to evaluate the velocity components in range and azimuth of the ship. The evaluation of the displacement vector of a moving target in consecutive images of the sequence allows the estimation of the azimuth velocity component. The range velocity component is estimated by evaluating the variation of the signal amplitude during the sequence. In order to apply the technique to TerraSAR-X Spot Light (SL) data, a further processing step is needed: the phase has to be corrected as presented in Eineder et al. (2009) due to the SL acquisition mode; otherwise the image sequence cannot be generated. The analysis, validated where possible by the Automatic Identification System (AIS), was performed in the framework of the ESA project MARISS.
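    The displacement-between-sub-looks step can be sketched with a simple phase-correlation shift estimator; converting the pixel shift to an azimuth velocity requires the sub-look time separation and pixel spacing, which are sensor- and mode-specific and are not reproduced here. This is an illustrative stand-in, not the Kirscht (1998) procedure.

```python
import numpy as np

def phase_correlation_shift(img_a, img_b):
    """Integer-pixel shift between two image chips estimated via phase correlation."""
    f_a = np.fft.fft2(img_a)
    f_b = np.fft.fft2(img_b)
    cross_power = f_a * np.conj(f_b)
    cross_power /= np.abs(cross_power) + 1e-12
    corr = np.abs(np.fft.ifft2(cross_power))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap shifts larger than half the image size to negative values
    return [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]

# The displacement between consecutive single-look chips around the ship,
# divided by the sub-look time separation and scaled by the pixel spacing,
# gives the azimuth velocity component (time separation is mode-specific).
```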

  19. Virtual non-contrast dual-energy CT compared to single-energy CT of the urinary tract: a prospective study.

    PubMed

    Lundin, Margareta; Lidén, Mats; Magnuson, Anders; Mohammed, Ahmed Abdulilah; Geijer, Håkan; Andersson, Torbjörn; Persson, Anders

    2012-07-01

    Dual-energy computed tomography (DECT) has been shown to be useful for subtracting bone or calcium in CT angiography and gives an opportunity to produce a virtual non-contrast-enhanced (VNC) image from a series where contrast agents have been given intravenously. High noise levels and low resolution have previously limited the diagnostic value of the VNC images created with the first generation of DECT. With the recent introduction of a second generation of DECT, there is a possibility of obtaining VNC images with better image quality at hopefully lower radiation dose compared to the previous generation. The aim was to compare the image quality of the single-energy series to a VNC series obtained with two generations of DECT scanners. CT of the urinary tract was used as a model. Thirty patients referred for evaluation of hematuria were examined with an older system (Somatom Definition) and another 30 patients with a new generation (Somatom Definition Flash). One single-energy series was obtained before and one dual-energy series after administration of intravenous contrast media. We created a VNC series from the contrast-enhanced images. Images were assessed concerning image quality with a visual grading scale evaluation of the VNC series with the single-energy series as gold standard. The image quality of the VNC images was rated inferior to the single-energy variant for both scanners, OR 11.5-67.3 for the Definition and OR 2.1-2.8 for the Definition Flash. Visual noise and overall quality were regarded as better with the Flash than the Definition. Image quality of VNC images obtained with the new generation of DECT is still slightly inferior compared to native images. However, the difference is smaller with the new compared to the older system.

  20. Investigation of a catalytic gas generator for the Space Shuttle APU. [hydrazine Auxiliary Propulsion Unit

    NASA Technical Reports Server (NTRS)

    Emmons, D. L.; Huxtable, D. D.; Blevins, D. R.

    1974-01-01

    An investigation was conducted to establish the capability of a monopropellant hydrazine catalytic gas generator to meet the requirements specified for the Space Shuttle APU. Detailed analytical and experimental studies were conducted on potential problem areas including long-term nitriding effects on materials, design variables affecting catalyst life, vehicle vibration effects, and catalyst oxidation/contamination. A full-scale gas generator, designed to operate at a chamber pressure of 750 psia and a flow rate of 0.36 lbm/sec, was fabricated and subjected to three separate life test series. The objective of the first test series was to demonstrate the capability of the gas generator to successfully complete 20 simulated Space Shuttle missions in steady-state operation. The gas generator was then refurbished and subjected to a second series of tests to demonstrate the pulse-mode capability of the gas generator during 20 simulated missions. The third series of tests was conducted with a refurbished reactor to further demonstrate pulse-mode capability with a modified catalyst bed.

  1. Finite difference time domain grid generation from AMC helicopter models

    NASA Technical Reports Server (NTRS)

    Cravey, Robin L.

    1992-01-01

    A simple technique is presented which forms a cubic grid model of a helicopter from an Aircraft Modeling Code (AMC) input file. The AMC input file defines the helicopter fuselage as a series of polygonal cross sections. The cubic grid model is used as an input to a Finite Difference Time Domain (FDTD) code to obtain predictions of antenna performance on a generic helicopter model. The predictions compare reasonably well with measured data.
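    A much-simplified stand-in for the AMC-to-FDTD conversion: each polygonal cross section is rasterized onto a regular grid of cubic cells with a point-in-polygon test. Real fuselage sections, interpolation between stations and the actual AMC file format are not handled; coordinates and the cell size are illustrative.

```python
import numpy as np
from matplotlib.path import Path

def cross_sections_to_grid(sections, cell=0.25):
    """Rasterize a list of (station_x, polygon_vertices) cross sections into a
    3-D occupancy grid of cells (a simplified stand-in for the AMC-to-FDTD
    conversion; coordinates and cell size are illustrative)."""
    ys = np.concatenate([p[:, 0] for _, p in sections])
    zs = np.concatenate([p[:, 1] for _, p in sections])
    y = np.arange(ys.min(), ys.max(), cell)
    z = np.arange(zs.min(), zs.max(), cell)
    yy, zz = np.meshgrid(y, z, indexing="ij")
    grid = np.zeros((len(sections), len(y), len(z)), dtype=bool)
    for i, (_, poly) in enumerate(sections):
        inside = Path(poly).contains_points(np.column_stack([yy.ravel(), zz.ravel()]))
        grid[i] = inside.reshape(yy.shape)   # filled cells for this station
    return grid

# two illustrative cross sections (y, z vertices in metres)
sections = [
    (0.0, np.array([[-1, 0], [1, 0], [1, 2], [-1, 2]], float)),
    (0.5, np.array([[-0.8, 0.2], [0.8, 0.2], [0.8, 1.8], [-0.8, 1.8]], float)),
]
print(cross_sections_to_grid(sections).sum(), "filled cells")
```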

  2. Preventing Heat Injuries by Predicting Individualized Human Core Temperature

    DTIC Science & Technology

    2015-10-14

    ... hardware/software warning system of an impending rise in TC and generate alerts to potentially prevent heat injuries. ... TC estimates, provides ahead-of-time alerts about an impending rise in TC and 2) an individualized model that uses non-invasive measurements of AC ... Here, we detail the development of an algorithm that uses a time series of recent-past TC measurements to provide ...

  3. Activation of Central Pattern Generator for Respiration Following Complete High Cervical Spinal Cord Interruption

    DTIC Science & Technology

    2017-09-01

    PRINCIPAL INVESTIGATOR: Vitaliy Marchenko, MD, PhD. CONTRACTING ORGANIZATION: Drexel University, Philadelphia, PA 19104-2875.

  4. Investigation of a long time series of CO2 from a tall tower using WRF-SPA

    NASA Astrophysics Data System (ADS)

    Smallman, Luke; Williams, Mathew; Moncrieff, John B.

    2013-04-01

    Atmospheric observations from tall towers are an important source of information about CO2 exchange at the regional scale. Here, we have used a forward running model, WRF-SPA, to generate a time series of CO2 at a tall tower for comparison with observations from Scotland over multiple years (2006-2008). We use this comparison to infer strength and distribution of sources and sinks of carbon and ecosystem process information at the seasonal scale. The specific aim of this research is to combine a high resolution (6 km) forward running meteorological model (WRF) with a modified version of a mechanistic ecosystem model (SPA). SPA provides surface fluxes calculated from coupled energy, hydrological and carbon cycles. This closely coupled representation of the biosphere provides realistic surface exchanges to drive mixing within the planetary boundary layer. The combined model is used to investigate the sources and sinks of CO2 and to explore which land surfaces contribute to a time series of hourly observations of atmospheric CO2 at a tall tower, Angus, Scotland. In addition to comparing the modelled CO2 time series to observations, modelled ecosystem specific (i.e. forest, cropland, grassland) CO2 tracers (e.g., assimilation and respiration) have been compared to the modelled land surface assimilation to investigate how representative tall tower observations are of land surface processes. WRF-SPA modelled CO2 time series compares well to observations (R2 = 0.67, rmse = 3.4 ppm, bias = 0.58 ppm). Through comparison of model-observation residuals, we have found evidence that non-cropped components of agricultural land (e.g., hedgerows and forest patches) likely contribute a significant and observable impact on regional carbon balance.
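    For reference, the model-observation summary statistics quoted above (R2, rmse, bias) can be computed as in the sketch below; R2 is taken here as 1 - SSres/SStot, whereas the paper may instead report the squared correlation, so both are returned.

```python
import numpy as np

def comparison_stats(observed, modelled):
    """Summary statistics for a modelled vs observed CO2 time series."""
    observed, modelled = np.asarray(observed, float), np.asarray(modelled, float)
    resid = modelled - observed
    ss_res = np.sum(resid**2)
    ss_tot = np.sum((observed - observed.mean())**2)
    return {"R2": 1 - ss_res / ss_tot,                               # 1 - SSres/SStot
            "R2_corr": np.corrcoef(observed, modelled)[0, 1] ** 2,   # squared correlation
            "rmse": np.sqrt(np.mean(resid**2)),
            "bias": resid.mean()}

# tiny synthetic example (ppm); real use would pass hourly tall-tower series
obs = np.array([400.1, 402.3, 398.7, 401.0])
mod = np.array([400.6, 401.8, 399.5, 401.9])
print(comparison_stats(obs, mod))
```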

  5. Spectral entropy as a mean to quantify water stress history for natural vegetation and irrigated agriculture in a water-stressed tropical environment

    NASA Astrophysics Data System (ADS)

    Kim, Y.; Johnson, M. S.

    2017-12-01

    Spectral entropy (Hs) is an index which can be used to measure the structural complexity of time series data. When a time series is made up of one periodic function, the Hs value becomes smaller, while Hs becomes larger when a time series is composed of several periodic functions. We hypothesized that this characteristic of the Hs could be used to quantify the water stress history of vegetation. For the ideal condition for which sufficient water is supplied to an agricultural crop or natural vegetation, there should be a single distinct phenological cycle represented in a vegetation index time series (e.g., NDVI and EVI). However, time series data for a vegetation area that repeatedly experiences water stress may include several fluctuations that can be observed in addition to the predominant phenological cycle. This is because the process of experiencing water stress and recovering from it generates small fluctuations in phenological characteristics. Consequently, the value of Hs increases when vegetation experiences several water shortages. Therefore, the Hs could be used as an indicator for water stress history. To test this hypothesis, we analyzed Moderate Resolution Imaging Spectroradiometer (MODIS) Normalized Difference Vegetation Index (NDVI) data for a natural area in comparison to a nearby sugarcane area in seasonally-dry western Costa Rica. In this presentation we will illustrate the use of spectral entropy to evaluate the vegetative responses of natural vegetation (dry tropical forest) and sugarcane under three different irrigation techniques (center pivot irrigation, drip irrigation and flood irrigation). Through this comparative analysis, the utility of Hs as an indicator will be tested. Furthermore, crop response to the different irrigation methods will be discussed in terms of Hs, NDVI and yield.
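    A minimal sketch of the Hs calculation as described: the power spectrum of the (mean-removed) series is normalised to a probability distribution and its Shannon entropy is divided by log(N), so a single clean phenological cycle gives a low value and added stress-and-recovery fluctuations raise it. The MODIS compositing interval and the synthetic NDVI curves are illustrative.

```python
import numpy as np

def spectral_entropy(x):
    """Normalised spectral entropy Hs of a time series: Shannon entropy of the
    power spectrum treated as a probability distribution, divided by log(N),
    so that 0 ~ a single dominant cycle and 1 ~ white noise."""
    x = np.asarray(x, float)
    p = np.abs(np.fft.rfft(x - x.mean()))[1:] ** 2   # drop the mean/DC term
    p = p / p.sum()
    return -np.sum(p * np.log(p + 1e-12)) / np.log(len(p))

# a clean phenological cycle gives low Hs; stress-and-recovery fluctuations raise it
t = np.arange(23 * 5)                                # ~5 years of 16-day composites
clean = 0.4 + 0.2 * np.sin(2 * np.pi * t / 23)
stressed = clean + 0.05 * np.sin(2 * np.pi * t / 5) \
           + 0.02 * np.random.default_rng(0).normal(size=len(t))
print(spectral_entropy(clean), spectral_entropy(stressed))
```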

  6. Use of a scenario-neutral approach to identify the key hydro-meteorological attributes that impact runoff from a natural catchment

    NASA Astrophysics Data System (ADS)

    Guo, Danlu; Westra, Seth; Maier, Holger R.

    2017-11-01

    Scenario-neutral approaches are being used increasingly for assessing the potential impact of climate change on water resource systems, as these approaches allow the performance of these systems to be evaluated independently of climate change projections. However, practical implementations of these approaches are still scarce, with a key limitation being the difficulty of generating a range of plausible future time series of hydro-meteorological data. In this study we apply a recently developed inverse stochastic generation approach to support the scenario-neutral analysis, and thus identify the key hydro-meteorological variables to which the system is most sensitive. The stochastic generator simulates synthetic hydro-meteorological time series that represent plausible future changes in (1) the average, extremes and seasonal patterns of rainfall; and (2) the average values of temperature (Ta), relative humidity (RH) and wind speed (uz) as variables that drive PET. These hydro-meteorological time series are then fed through a conceptual rainfall-runoff model to simulate the potential changes in runoff as a function of changes in the hydro-meteorological variables, and runoff sensitivity is assessed with both correlation and Sobol' sensitivity analyses. The method was applied to a case study catchment in South Australia, and the results showed that the most important hydro-meteorological attributes for runoff were winter rainfall followed by the annual average rainfall, while the PET-related meteorological variables had comparatively little impact. The high importance of winter rainfall can be related to the winter-dominated nature of both the rainfall and runoff regimes in this catchment. The approach illustrated in this study can greatly enhance our understanding of the key hydro-meteorological attributes and processes that are likely to drive catchment runoff under a changing climate, thus enabling the design of tailored climate impact assessments to specific water resource systems.
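    The sensitivity-analysis step can be illustrated with the SALib package (one common implementation of Saltelli sampling and Sobol' indices; the authors' tooling is not stated). The perturbation-factor names, their ranges and the stand-in runoff response below are illustrative assumptions, not the study's stochastic generator or rainfall-runoff model.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# Hypothetical scenario-neutral setup: perturbation factors applied to the
# stochastically generated forcing (names and ranges are illustrative only).
problem = {
    "num_vars": 4,
    "names": ["winter_rain_factor", "annual_rain_factor", "temp_offset", "rh_factor"],
    "bounds": [[0.7, 1.3], [0.7, 1.3], [0.0, 4.0], [0.9, 1.1]],
}

def runoff_model(x):
    """Stand-in for the stochastic generator + rainfall-runoff chain:
    returns a scalar runoff response for one perturbation sample."""
    winter, annual, dtemp, rh = x
    return 100 * annual * (1 + 0.8 * (winter - 1)) - 3 * dtemp + 10 * (rh - 1)

samples = saltelli.sample(problem, 1024)           # Saltelli sampling scheme
y = np.array([runoff_model(x) for x in samples])
si = sobol.analyze(problem, y)
print(dict(zip(problem["names"], si["S1"])))       # first-order sensitivities
```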

  7. Two time-series analyses of the impact of antibiotic consumption and alcohol-based hand disinfection on the incidences of nosocomial methicillin-resistant Staphylococcus aureus infection and Clostridium difficile infection.

    PubMed

    Kaier, Klaus; Hagist, Christian; Frank, Uwe; Conrad, Andreas; Meyer, Elisabeth

    2009-04-01

    To determine the impact of antibiotic consumption and alcohol-based hand disinfection on the incidences of nosocomial methicillin-resistant Staphylococcus aureus (MRSA) infection and Clostridium difficile infection (CDI). Two multivariate time-series analyses were performed that used as dependent variables the monthly incidences of nosocomial MRSA infection and CDI at the Freiburg University Medical Center during the period January 2003 through October 2007. The volume of alcohol-based hand rub solution used per month was quantified in liters per 1,000 patient-days. Antibiotic consumption was calculated in terms of the number of defined daily doses per 1,000 patient-days per month. The use of alcohol-based hand rub was found to have a significant impact on the incidence of nosocomial MRSA infection (P< .001). The multivariate analysis (R2=0.66) showed that a higher volume of use of alcohol-based hand rub was associated with a lower incidence of nosocomial MRSA infection. Conversely, a higher level of consumption of selected antimicrobial agents was associated with a higher incidence of nosocomial MRSA infection. This analysis showed this relationship was the same for the use of second-generation cephalosporins (P= .023), third-generation cephalosporins (P= .05), fluoroquinolones (P= .01), and lincosamides (P= .05). The multivariate analysis (R2=0.55) showed that a higher level of consumption of third-generation cephalosporins (P= .008), fluoroquinolones (P= .084), and/or macrolides (P= .007) was associated with a higher incidence of CDI. A correlation with use of alcohol-based hand rub was not detected. In 2 multivariate time-series analyses, we were able to show the impact of hand hygiene and antibiotic use on the incidence of nosocomial MRSA infection, but we found no association between hand hygiene and incidence of CDI.

  8. Time trends in recurrence of juvenile nasopharyngeal angiofibroma: Experience of the past 4 decades.

    PubMed

    Mishra, Anupam; Mishra, Subhash Chandra

    2016-01-01

    An analysis of the time distribution of juvenile nasopharyngeal angiofibroma (JNA) recurrences from the last 4 decades is presented. Sixty recurrences were analyzed by actuarial survival methods. SPSS software was used to generate Kaplan-Meier (KM) curves, and time distributions were compared by the log-rank, Breslow and Tarone-Ware tests. The overall recurrence rate was 17.59%. The majority of patients underwent open transpalatal approach(es) without embolization. The probability of detecting a recurrence was 95% in the first 24 months, and the comparison of KM curves across 4 different time periods was not significant. This is the first and largest series to address the time distribution of recurrence. The required follow-up period is 2 years. Our recurrence rate is just half that of the largest series reported so far, suggesting the superiority of transpalatal techniques. The similarity of the curves suggests that recent technical advances are less likely to influence recurrence, which, per our hypothesis, more likely reflects tumor biology per se. Copyright © 2016 Elsevier Inc. All rights reserved.
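    The survival-analysis workflow described (Kaplan-Meier curves plus a log-rank comparison of time periods) can be sketched with the lifelines package rather than SPSS; the recurrence data below are synthetic and only two eras are compared.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Illustrative recurrence data: months to recurrence (or last follow-up), an
# event flag, and a treatment era; values are synthetic, not the series data.
df = pd.DataFrame({
    "months": [6, 12, 18, 24, 30, 36, 10, 14, 20, 26, 40, 48],
    "recurred": [1, 1, 1, 0, 0, 0, 1, 1, 0, 0, 0, 0],
    "era": ["1980s"] * 6 + ["2000s"] * 6,
})

kmf = KaplanMeierFitter()
for era, grp in df.groupby("era"):
    kmf.fit(grp["months"], grp["recurred"], label=era)   # KM curve per era
    print(era, kmf.median_survival_time_)

early, late = (df[df.era == e] for e in ("1980s", "2000s"))
res = logrank_test(early["months"], late["months"],
                   event_observed_A=early["recurred"], event_observed_B=late["recurred"])
print(res.p_value)   # analogous to the log-rank comparison of the KM curves
```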

  9. A convergent series expansion for hyperbolic systems of conservation laws

    NASA Technical Reports Server (NTRS)

    Harabetian, E.

    1985-01-01

    The discontinuous, piecewise analytic initial value problem is considered for a wide class of conservation laws that includes the full three-dimensional Euler equations. The initial interaction at an arbitrary curved surface is resolved in time by a convergent series. Among other features, the solution exhibits shock, contact, and expansion waves as well as sound waves propagating on characteristic surfaces. The expansion waves correspond to the one-dimensional rarefactions but have a more complicated structure. The sound waves are generated in place of zero-strength shocks, and they are caused by mismatches in derivatives.

  10. A 7.8 kV nanosecond pulse generator with a 500 Hz repetition rate

    NASA Astrophysics Data System (ADS)

    Lin, M.; Liao, H.; Liu, M.; Zhu, G.; Yang, Z.; Shi, P.; Lu, Q.; Sun, X.

    2018-04-01

    Pseudospark switches are widely used in pulsed power applications. In this paper, we present the design and performance of a 500 Hz repetition rate high-voltage pulse generator to drive TDI-series pseudospark switches. A high-voltage pulse is produced by discharging an 8 μF capacitor through the primary winding of a step-up isolation transformer using a single metal-oxide-semiconductor field-effect transistor (MOSFET) as a control switch. In addition, a self-break spark gap is used to steepen the pulse front. The pulse generator can deliver a high-voltage pulse with a peak trigger voltage of 7.8 kV, a peak trigger current of 63 A, a full width at half maximum (FWHM) of ~30 ns, and a rise time of 5 ns to the trigger pin of the pseudospark switch. During burst mode operation, the generator achieved up to a 500 Hz repetition rate. Meanwhile, we also provide an AC heater power circuit for heating an H2 reservoir. This pulse generator can be used in circuits with TDI-series pseudospark switches with either a grounded cathode or an electrically floating cathode. The details of the circuits and their implementation are described in the paper.

  11. Sampled-time control of a microbial fuel cell stack

    NASA Astrophysics Data System (ADS)

    Boghani, Hitesh C.; Dinsdale, Richard M.; Guwy, Alan J.; Premier, Giuliano C.

    2017-07-01

    Research into microbial fuel cells (MFCs) has reached the point where cubic metre-scale systems and stacks are being built and tested. Apart from performance enhancement through catalysis, materials and design, an important research area for industrial applicability is stack control, which can enhance MFC stack power output. An MFC stack is controlled using a sampled-time digital control strategy, which has the advantage of intermittent operation with consequent power saving, and when used in a hybrid series stack connectivity, can avoid voltage reversals. An MFC stack comprising four tubular MFCs was operated hydraulically in series. Each MFC was connected to an independent controller and the stack was connected electrically in series, creating a hybrid-series connectivity. The voltage of each MFC in the stack was controlled such that the overall series stack voltage generated was the algebraic sum (1.26 V) of the individual MFC voltages (0.32, 0.32, 0.32 and 0.30 V). The controllers were able to control the individual voltages to the point where 2.52 mA was drawn from the stack at a load of 499.9 Ω (delivering 3.18 mW). The controllers were able to reject the disturbances and perturbations caused by electrical loading, temperature and substrate concentration.

  12. NEW SUNS IN THE COSMOS. III. MULTIFRACTAL SIGNATURE ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freitas, D. B. de; Nepomuceno, M. M. F.; Junior, P. R. V. de Moraes

    2016-11-01

    In the present paper, we investigate the multifractality signatures in hourly time series extracted from the CoRoT spacecraft database. Our analysis is intended to highlight the possibility that astrophysical time series can be members of a particular class of complex and dynamic processes, which require several photometric variability diagnostics to characterize their structural and topological properties. To achieve this goal, we search for contributions due to a nonlinear temporal correlation and effects caused by heavier tails than the Gaussian distribution, using a detrending moving average algorithm for one-dimensional multifractal signals (MFDMA). We observe that the correlation structure is the main source of multifractality, while the heavy-tailed distribution plays a minor role in generating the multifractal effects. Our work also reveals that the rotation period of stars is inherently scaled by the degree of multifractality. As a result, analyzing the multifractal degree of the referred series, we uncover an evolution of multifractality from shorter to longer periods.

  13. Fading channel simulator

    DOEpatents

    Argo, Paul E.; Fitzgerald, T. Joseph

    1993-01-01

    Fading channel effects on a transmitted communication signal are simulated with both frequency and time variations using a channel scattering function to affect the transmitted signal. A conventional channel scattering function is converted to a series of channel realizations by multiplying the square root of the channel scattering function by a complex number of which the real and imaginary parts are each independent variables. The two-dimensional inverse-FFT of this complex-valued channel realization yields a matrix of channel coefficients that provide a complete frequency-time description of the channel. The transmitted radio signal is segmented to provide a series of transmitted signal segments, and each segment is subjected to an FFT to generate a series of signal coefficient matrices. The channel coefficient matrices and signal coefficient matrices are then multiplied and subjected to inverse-FFT to output a signal representing the received affected radio signal. A variety of channel scattering functions can be used to characterize the response of a transmitter-receiver system to such atmospheric effects.
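    A numpy sketch following the patent's outline: the square root of a (delay x Doppler) scattering function is weighted by complex Gaussian numbers, a two-dimensional inverse FFT yields a frequency-time matrix of channel coefficients, and successive signal segments are filtered through those coefficients in the frequency domain. The dimensions, the illustrative scattering function and the segment-to-column mapping are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def channel_realization(scattering, rng):
    """One channel realization: sqrt(S) weighted by complex Gaussian noise and
    inverse-FFT'd to a frequency-time matrix of channel coefficients."""
    noise = rng.normal(size=scattering.shape) + 1j * rng.normal(size=scattering.shape)
    return np.fft.ifft2(np.sqrt(scattering) * noise)

def apply_channel(signal, coeffs, seg_len):
    """Filter successive signal segments through the channel in the frequency
    domain, using one column of channel coefficients per segment."""
    out = []
    for k in range(len(signal) // seg_len):
        seg = signal[k * seg_len:(k + 1) * seg_len]
        h = coeffs[:seg_len, k % coeffs.shape[1]]
        out.append(np.fft.ifft(np.fft.fft(seg) * h))
    return np.concatenate(out)

# flat-ish illustrative scattering function and a QPSK-like test signal
scattering = np.outer(np.exp(-np.arange(64) / 8.0), np.exp(-np.arange(32) / 4.0))
coeffs = channel_realization(scattering, rng)
signal = np.exp(1j * np.pi / 2 * rng.integers(0, 4, size=2048))
received = apply_channel(signal, coeffs, seg_len=64)
```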

  14. Method and Excel VBA Algorithm for Modeling Master Recession Curve Using Trigonometry Approach.

    PubMed

    Posavec, Kristijan; Giacopetti, Marco; Materazzi, Marco; Birk, Steffen

    2017-11-01

    A new method was developed and implemented into an Excel Visual Basic for Applications (VBA) algorithm utilizing trigonometry laws in an innovative way to overlap recession segments of time series and create master recession curves (MRCs). Based on a trigonometry approach, the algorithm horizontally translates succeeding recession segments of time series, placing their vertex, that is, the highest recorded value of each recession segment, directly onto the appropriate connection line defined by measurement points of a preceding recession segment. The new method and algorithm continue the development of methods and algorithms for the generation of MRCs, where the first published method was based on a multiple linear/nonlinear regression model approach (Posavec et al. 2006). The newly developed trigonometry-based method was tested on real case study examples and compared with the previously published multiple linear/nonlinear regression model-based method. The results show that in some cases, that is, for some time series, the trigonometry-based method creates narrower overlaps of the recession segments, resulting in higher coefficients of determination R2, while in other cases the multiple linear/nonlinear regression model-based method remains superior. The Excel VBA algorithm for modeling MRC using the trigonometry approach is implemented into a spreadsheet tool (MRCTools v3.0 written by and available from Kristijan Posavec, Zagreb, Croatia) containing the previously published VBA algorithms for MRC generation and separation. All algorithms within MRCTools v3.0 are open access and available free of charge, supporting the idea of running science on available, open, and free of charge software. © 2017, National Ground Water Association.
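    A simplified Python reading of the translation idea (the published tool is an Excel VBA implementation): each succeeding recession segment is shifted horizontally so that its vertex lands on the curve assembled so far, using linear interpolation in place of the trigonometric construction. Segment ordering, the interpolation scheme and the toy data are assumptions.

```python
import numpy as np

def master_recession_curve(segments):
    """Overlap recession segments into a master recession curve by translating
    each succeeding segment horizontally so its vertex (first, highest value)
    lands on the curve assembled so far (a simplified reading of the method)."""
    t_mrc, y_mrc = list(segments[0][0]), list(segments[0][1])
    for t, y in segments[1:]:
        vertex = y[0]
        y_arr, t_arr = np.array(y_mrc), np.array(t_mrc)
        # recession values decrease with time, so reverse them for interpolation
        t_at_vertex = np.interp(vertex, y_arr[::-1], t_arr[::-1])
        shift = t_at_vertex - t[0]          # horizontal translation of the segment
        t_mrc.extend(np.asarray(t) + shift)
        y_mrc.extend(y)
    order = np.argsort(t_mrc)
    return np.array(t_mrc)[order], np.array(y_mrc)[order]

# two illustrative recession segments (time in days, groundwater level)
seg1 = (np.array([0, 1, 2, 3, 4.0]), np.array([10.0, 8.0, 6.5, 5.5, 4.8]))
seg2 = (np.array([0, 1, 2, 3.0]), np.array([7.0, 5.9, 5.1, 4.5]))
print(master_recession_curve([seg1, seg2]))
```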

  15. Application of process monitoring to anomaly detection in nuclear material processing systems via system-centric event interpretation of data from multiple sensors of varying reliability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia, Humberto E.; Simpson, Michael F.; Lin, Wen-Chiao

    In this paper, we apply an advanced safeguards approach and associated methods for process monitoring to a hypothetical nuclear material processing system. The assessment regarding the state of the processing facility is conducted at a system-centric level formulated in a hybrid framework. This utilizes an architecture for integrating both time- and event-driven data and analysis for decision making. While the time-driven layers of the proposed architecture encompass more traditional process monitoring methods based on time series data and analysis, the event-driven layers encompass operation monitoring methods based on discrete event data and analysis. By integrating process- and operation-related information and methodologies within a unified framework, the task of anomaly detection is greatly improved. This is because decision-making can benefit not only from known time-series relationships among measured signals but also from known event sequence relationships among generated events. This available knowledge at both the time series and discrete event layers can then be effectively used to synthesize observation solutions that optimally balance sensor and data processing requirements. The application of the proposed approach is then implemented on an illustrative monitored system based on pyroprocessing and the results are discussed.

  16. Development of a takeoff performance monitoring system. Ph.D. Thesis. Contractor Report, Jan. 1984 - Jun. 1985

    NASA Technical Reports Server (NTRS)

    Srivatsan, Raghavachari; Downing, David R.

    1987-01-01

    Discussed are the development and testing of a real-time takeoff performance monitoring algorithm. The algorithm is made up of two segments: a pretakeoff segment and a real-time segment. One-time inputs of ambient conditions and airplane configuration information are used in the pretakeoff segment to generate scheduled performance data for that takeoff. The real-time segment uses the scheduled performance data generated in the pretakeoff segment, runway length data, and measured parameters to monitor the performance of the airplane throughout the takeoff roll. Airplane and engine performance deficiencies are detected and annunciated. An important feature of this algorithm is the one-time estimation of the runway rolling friction coefficient. The algorithm was tested using a six-degree-of-freedom airplane model in a computer simulation. Results from a series of sensitivity analyses are also included.

  17. On the multifractal effects generated by monofractal signals

    NASA Astrophysics Data System (ADS)

    Grech, Dariusz; Pamuła, Grzegorz

    2013-12-01

    We study quantitatively the level of false multifractal signal one may encounter while analyzing multifractal phenomena in time series within multifractal detrended fluctuation analysis (MF-DFA). The investigated effect appears as a result of the finite length of the used data series and is additionally amplified by the long-term memory the data eventually may contain. We provide a detailed quantitative description of such apparent multifractal background signal as a threshold in the spread of generalized Hurst exponent values Δh or a threshold in the width of the multifractal spectrum Δα below which multifractal properties of the system are only apparent, i.e. do not exist, despite Δα≠0 or Δh≠0. We find this effect quite important for shorter or persistent series and we argue it is linear with respect to the autocorrelation exponent γ. Its strength decays according to a power law with respect to the length of the time series. The influence of basic linear and nonlinear transformations applied to initial data in finite time series with various levels of long memory is also investigated. This provides an additional set of semi-analytical results. The obtained formulas are significant in any interdisciplinary application of multifractality, including physics, financial data analysis or physiology, because they allow the 'true' multifractal phenomena to be separated from the apparent (artificial) multifractal effects. They should be a helpful first-choice tool for deciding whether, in a particular case, we are dealing with a signal with real multiscaling properties or not.
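    A plain MF-DFA sketch (single non-overlapping pass, first-order detrending) showing the finite-length effect discussed above: even an uncorrelated Gaussian series returns a non-zero spread in the generalized Hurst exponent h(q), which is the kind of apparent-multifractality threshold the paper quantifies. The scale range and q values are illustrative; the paper's estimator and corrections may differ.

```python
import numpy as np

def mfdfa_hurst(x, qs=(-4, -2, 2, 4), scales=None, order=1):
    """Simplified MF-DFA estimate of the generalized Hurst exponent h(q);
    max(h) - min(h) approximates the Delta_h spread discussed in the paper."""
    x = np.asarray(x, float)
    profile = np.cumsum(x - x.mean())
    n = len(profile)
    if scales is None:
        scales = np.unique(np.logspace(np.log10(16), np.log10(n // 4), 12).astype(int))
    fq = np.zeros((len(qs), len(scales)))
    for j, s in enumerate(scales):
        rms = []
        for v in range(n // s):
            seg = profile[v * s:(v + 1) * s]
            t = np.arange(s)
            fit = np.polyval(np.polyfit(t, seg, order), t)   # local detrending
            rms.append(np.mean((seg - fit) ** 2))
        rms = np.asarray(rms)
        for i, q in enumerate(qs):
            fq[i, j] = np.mean(rms ** (q / 2)) ** (1 / q)    # q-th order fluctuation
    return {q: np.polyfit(np.log(scales), np.log(fq[i]), 1)[0] for i, q in enumerate(qs)}

# a monofractal (uncorrelated Gaussian) series still shows a non-zero h(q) spread
# at finite length -- the apparent multifractality the paper quantifies
h = mfdfa_hurst(np.random.default_rng(0).normal(size=4096))
print(h, max(h.values()) - min(h.values()))
```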

  18. Nonlinear pattern analysis of ventricular premature beats by mutual information

    NASA Technical Reports Server (NTRS)

    Osaka, M.; Saitoh, H.; Yokoshima, T.; Kishida, H.; Hayakawa, H.; Cohen, R. J.

    1997-01-01

    The frequency of ventricular premature beats (VPBs) has been related to the risk of mortality. However, little is known about the temporal pattern of occurrence of VPBs and its relationship to autonomic activity. Hence, we applied a general correlation measure, mutual information, to quantify how VPBs are generated over time. We also used mutual information to determine the correlation between VPB production and heart rate in order to evaluate effects of autonomic activity on VPB production. We examined twenty subjects with more than 3000 VPBs/day and simulated random time series of VPB occurrence. We found that mutual information values could be used to characterize quantitatively the temporal patterns of VPB generation. Our data suggest that VPB production is not random and VPBs generated with a higher value of mutual information may be more greatly affected by autonomic activity.
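    One way to compute the mutual information between VPB production and heart rate is sketched below using scikit-learn's mutual_info_score on quantile-binned series; the binning scheme and the synthetic data are assumptions, and the original analysis may have used a different estimator.

```python
import numpy as np
from sklearn.metrics import mutual_info_score

# Mutual information between hourly VPB counts and mean heart rate, estimated
# from quantile-binned series (mutual_info_score works on discrete labels,
# so both series are quantized first; the data below are synthetic).
rng = np.random.default_rng(0)
heart_rate = 70 + 10 * np.sin(np.linspace(0, 4 * np.pi, 24 * 7)) + rng.normal(0, 2, 24 * 7)
vpb_counts = rng.poisson(np.clip((heart_rate - 65) / 3, 0.1, None))

hr_edges = np.unique(np.quantile(heart_rate, np.linspace(0, 1, 9)[1:-1]))
vpb_edges = np.unique(np.quantile(vpb_counts, np.linspace(0, 1, 9)[1:-1]))
mi = mutual_info_score(np.digitize(heart_rate, hr_edges), np.digitize(vpb_counts, vpb_edges))
print(mi)   # in nats; values near 0 indicate no detectable correlation
```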

  19. Kato perturbative expansion in classical mechanics and an explicit expression for the Deprit generator

    NASA Astrophysics Data System (ADS)

    Nikolaev, A. S.

    2015-03-01

    We study the structure of the canonical Poincaré-Lindstedt perturbation series in the Deprit operator formalism and establish its connection to the Kato resolvent expansion. A discussion of invariant definitions for averaging and integrating perturbation operators and their canonical identities reveals a regular pattern in the series for the Deprit generator. This regularity is explained using Kato series and the relation of the perturbation operators to the Laurent coefficients for the resolvent of the Liouville operator. This purely canonical approach systematizes the series and leads to an explicit expression for the Deprit generator in any order of the perturbation theory, expressed in terms of the partial pseudoinverse of the perturbed Liouville operator. The corresponding Kato series provides a reasonably effective computational algorithm. The canonical connection of the perturbed and unperturbed averaging operators makes it possible to describe ambiguities in the generator and transformed Hamiltonian, while Gustavson integrals turn out to be insensitive to the normalization style. We use nonperturbative examples for illustration.

  20. Method for leveling the power output of an electromechanical battery as a function of speed

    DOEpatents

    Post, R.F.

    1999-03-16

    The invention is a method of leveling the power output of an electromechanical battery during its discharge, while at the same time maximizing its power output into a given load. The method employs the concept of series resonance, using capacitors whose parameters are chosen optimally to achieve the desired near-flatness of power output over any chosen charged-discharged speed ratio. Capacitors are inserted in series with each phase of the windings to introduce capacitive reactances that act to compensate the inductive reactance of these windings. This compensating effect both increases the power that can be drawn from the generator before inductive voltage drops in the windings become dominant and acts to flatten the power output over a chosen speed range. The values of the capacitors are chosen so as to optimally flatten the output of the generator over the chosen speed range. 3 figs.
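    The sizing logic can be shown with a one-line calculation: the series capacitance that cancels the winding's inductive reactance at an electrical frequency f satisfies 1/(2*pi*f*C) = 2*pi*f*L, i.e. C = 1/((2*pi*f)^2 * L). The inductance and frequency values below are illustrative, not taken from the patent.

```python
import math

# Series-compensation sizing sketch: pick C so that the capacitive reactance
# cancels the winding inductance at (or near) the chosen operating frequency.
# Values below are illustrative, not taken from the patent.
L_winding = 2e-3               # per-phase winding inductance, henries
f_low, f_high = 400.0, 800.0   # electrical frequency over the charged/discharged speed range

def series_capacitor(f_compensate, L):
    omega = 2 * math.pi * f_compensate
    return 1.0 / (omega**2 * L)

# compensating near the low end of the speed range helps flatten output as the
# rotor speed (and hence frequency and back-EMF) falls during discharge
print(f"C = {series_capacitor(f_low, L_winding) * 1e6:.1f} uF")
```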
