Sample records for structural time series

  1. Multiscale structure of time series revealed by the monotony spectrum.

    PubMed

    Vamoş, Călin

    2017-03-01

    Observation of complex systems produces time series with specific dynamics at different time scales. The majority of the existing numerical methods for multiscale analysis first decompose the time series into several simpler components, and the multiscale structure is then given by the properties of these components. We present a numerical method which describes the multiscale structure of arbitrary time series without decomposing them. It is based on the monotony spectrum, defined as the variation of the mean amplitude of the monotonic segments with respect to the mean local time scale during successive averagings of the time series, the local time scales being the durations of the monotonic segments. The maxima of the monotony spectrum indicate the time scales which dominate the variations of the time series. We show that the monotony spectrum can correctly analyze a diversity of artificial time series and can discriminate deterministic variations at large time scales from random fluctuations. As an application we analyze the multifractal structure of some hydrological time series.
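
    A minimal numpy sketch of the core idea, for illustration only (segmentation into monotonic runs plus repeated moving-average smoothing; the function names and the smoothing window are assumptions, not the authors' code):

      import numpy as np

      def monotonic_segments(x):
          """Mean amplitude and mean duration of the monotonic runs of x."""
          d = np.sign(np.diff(x))
          turns = np.where(d[1:] * d[:-1] < 0)[0] + 1      # indices of local extrema
          bounds = np.concatenate(([0], turns, [len(x) - 1]))
          amps = [abs(x[b] - x[a]) for a, b in zip(bounds[:-1], bounds[1:])]
          durs = [b - a for a, b in zip(bounds[:-1], bounds[1:])]
          return np.mean(amps), np.mean(durs)

      def monotony_spectrum(x, n_passes=50, window=3):
          """Mean segment amplitude vs. mean local time scale under successive averagings."""
          y = np.asarray(x, dtype=float)
          kernel = np.ones(window) / window
          scales, amplitudes = [], []
          for _ in range(n_passes):
              amp, dur = monotonic_segments(y)
              amplitudes.append(amp)
              scales.append(dur)
              y = np.convolve(y, kernel, mode="valid")      # one more averaging pass
              if y.size < 3:
                  break
          return np.array(scales), np.array(amplitudes)

      # toy series: a slow oscillation dominating over fast noise
      t = np.arange(2000)
      x = np.sin(2 * np.pi * t / 200) + 0.3 * np.random.randn(t.size)
      s, a = monotony_spectrum(x)
      print(s[:5], a[:5])   # maxima of a(s) indicate the dominant time scales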

  2. Why didn't Box-Jenkins win (again)?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pack, D.J.; Downing, D.J.

    This paper focuses on the forecasting performance of the Box-Jenkins methodology applied to the 111 time series of the Makridakis competition. It considers the influence of the following factors: (1) time series length, (2) time-series information (autocorrelation) content, (3) time-series outliers or structural changes, (4) averaging results over time series, and (5) forecast time origin choice. It is found that the 111 time series contain substantial numbers of very short series, series with obvious structural change, and series whose histories are relatively uninformative. If these series are typical of those that one must face in practice, the real message of the competition is that univariate time series extrapolations will frequently fail regardless of the methodology employed to produce them.

  3. Relating the large-scale structure of time series and visibility networks.

    PubMed

    Rodríguez, Miguel A

    2017-06-01

    The structure of time series is usually characterized by means of correlations. A new proposal based on visibility networks has been considered recently. Visibility networks are complex networks mapped from surfaces or time series using visibility properties. The structures of time series and visibility networks are closely related, as shown by means of fractional time series in recent works. In these works, a simple relationship between the Hurst exponent H of fractional time series and the exponent of the distribution of edges γ of the corresponding visibility network, which exhibits a power law, is shown. To check and generalize these results, in this paper we delve into this idea of connected structures by defining both structures more properly. In addition to the exponents used before, H and γ, which take into account local properties, we consider two more exponents that, as we will show, characterize global properties. These are the exponent α for time series, which gives the scaling of the variance with the size as var∼T^{2α}, and the exponent κ of their corresponding network, which gives the scaling of the averaged maximum of the number of edges, 〈k_{M}〉∼N^{κ}. With this representation, a more precise connection between the structures of general time series and their associated visibility network is achieved. Similarities and differences are more clearly established, and new scaling forms of complex networks appear in agreement with their respective classes of time series.
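
    A brute-force sketch of the natural visibility mapping this line of work relies on (O(N^2) pairwise checks, illustration only); the exponents H, γ, α and κ discussed above would be estimated from outputs such as the degree sequence below:

      import numpy as np

      def visibility_degrees(x):
          """Natural visibility graph: nodes i and j are linked if the straight line
          between (i, x[i]) and (j, x[j]) passes above every intermediate sample."""
          n = len(x)
          degree = np.zeros(n, dtype=int)
          for i in range(n - 1):
              for j in range(i + 1, n):
                  k = np.arange(i + 1, j)
                  line = x[j] + (x[i] - x[j]) * (j - k) / (j - i)
                  if np.all(x[k] < line):
                      degree[i] += 1
                      degree[j] += 1
          return degree

      x = np.cumsum(np.random.randn(300))   # random-walk toy series (H ~ 0.5)
      deg = visibility_degrees(x)
      print("max degree k_M =", deg.max())  # scaling of <k_M> with series length gives kappa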

  4. Network structure of multivariate time series.

    PubMed

    Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito

    2015-10-21

    Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range of tools and techniques for time series analysis already exists, the increasing availability of massive data structures calls for new approaches for multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on the mapping of a multidimensional time series into a multilayer network, which allows information on a high-dimensional dynamical system to be extracted through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow nontrivial properties of coupled chaotic maps to be extracted and quantified, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail.

  5. Multiple Indicator Stationary Time Series Models.

    ERIC Educational Resources Information Center

    Sivo, Stephen A.

    2001-01-01

    Discusses the propriety and practical advantages of specifying multivariate time series models in the context of structural equation modeling for time series and longitudinal panel data. For time series data, the multiple indicator model specification improves on classical time series analysis. For panel data, the multiple indicator model…

  6. Multilevel Dynamic Generalized Structured Component Analysis for Brain Connectivity Analysis in Functional Neuroimaging Data.

    PubMed

    Jung, Kwanghee; Takane, Yoshio; Hwang, Heungsun; Woodward, Todd S

    2016-06-01

    We extend dynamic generalized structured component analysis (GSCA) to enhance its data-analytic capability in structural equation modeling of multi-subject time series data. Time series data of multiple subjects are typically hierarchically structured, where time points are nested within subjects who are in turn nested within a group. The proposed approach, named multilevel dynamic GSCA, accommodates the nested structure in time series data. Explicitly taking the nested structure into account, the proposed method allows investigating subject-wise variability of the loadings and path coefficients by looking at the variance estimates of the corresponding random effects, as well as fixed loadings between observed and latent variables and fixed path coefficients between latent variables. We demonstrate the effectiveness of the proposed approach by applying the method to the multi-subject functional neuroimaging data for brain connectivity analysis, where time series data-level measurements are nested within subjects.

  7. Modelling road accidents: An approach using structural time series

    NASA Astrophysics Data System (ADS)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-09-01

    In this paper, the trend of road accidents in Malaysia for the years 2001 until 2012 was modelled using a structural time series approach. The structural time series model was identified using a stepwise method, and the residuals for each model were tested. The best-fitted model was chosen based on the smallest Akaike Information Criterion (AIC) and prediction error variance. In order to check the quality of the model, a data validation procedure was performed by predicting the monthly number of road accidents for the year 2012. Results indicate that the best specification of the structural time series model to represent road accidents is the local level with a seasonal model.
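
    A hedged sketch of fitting the 'local level with seasonal' specification in statsmodels; the accident counts below are simulated for illustration, whereas the study compares several specifications by AIC and prediction error variance and validates against the observed 2012 monthly counts:

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      # simulated monthly accident counts for 2001-2011 (illustration only)
      rng = np.random.default_rng(0)
      idx = pd.date_range("2001-01", periods=132, freq="MS")
      y = pd.Series(3000 + 10 * np.arange(132)
                    + 200 * np.sin(2 * np.pi * np.arange(132) / 12)
                    + rng.normal(0, 80, 132), index=idx)

      # structural model: local level plus a stochastic monthly seasonal component
      model = sm.tsa.UnobservedComponents(y, level="local level", seasonal=12)
      res = model.fit(disp=False)
      print("AIC:", res.aic)

      # forecast the next 12 months (the '2012' validation step in the paper)
      print(res.get_forecast(steps=12).predicted_mean.head())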

  8. Empirical method to measure stochasticity and multifractality in nonlinear time series

    NASA Astrophysics Data System (ADS)

    Lin, Chih-Hao; Chang, Chia-Seng; Li, Sai-Ping

    2013-12-01

    An empirical algorithm is used here to study the stochastic and multifractal nature of nonlinear time series. A parameter can be defined to quantitatively measure the deviation of the time series from a Wiener process so that the stochasticity of different time series can be compared. The local volatility of the time series under study can be constructed using this algorithm, and the multifractal structure of the time series can be analyzed by using this local volatility. As an example, we employ this method to analyze financial time series from different stock markets. The result shows that while developed markets evolve very much like an Ito process, emerging markets are far from efficient. Differences in the multifractal structures and leverage effects between developed and emerging markets are discussed. The algorithm used here can be applied in a similar fashion to study time series of other complex systems.

  9. Using machine learning to identify structural breaks in single-group interrupted time series designs.

    PubMed

    Linden, Ariel; Yarnold, Paul R

    2016-12-01

    Single-group interrupted time series analysis (ITSA) is a popular evaluation methodology in which a single unit of observation is being studied, the outcome variable is serially ordered as a time series and the intervention is expected to 'interrupt' the level and/or trend of the time series, subsequent to its introduction. Given that the internal validity of the design rests on the premise that the interruption in the time series is associated with the introduction of the treatment, treatment effects may seem less plausible if a parallel trend already exists in the time series prior to the actual intervention. Thus, sensitivity analyses should focus on detecting structural breaks in the time series before the intervention. In this paper, we introduce a machine-learning algorithm called optimal discriminant analysis (ODA) as an approach to determine if structural breaks can be identified in years prior to the initiation of the intervention, using data from California's 1988 voter-initiated Proposition 99 to reduce smoking rates. The ODA analysis indicates that numerous structural breaks occurred prior to the actual initiation of Proposition 99 in 1989, including perfect structural breaks in 1983 and 1985, thereby casting doubt on the validity of treatment effects estimated for the actual intervention when using a single-group ITSA design. Given the widespread use of ITSA for evaluating observational data and the increasing use of machine-learning techniques in traditional research, we recommend that structural break sensitivity analysis be routinely incorporated in all research using the single-group ITSA design. © 2016 John Wiley & Sons, Ltd.
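
    ODA is a specialized classification method; a simplified stand-in for the same idea, scanning each pre-intervention year as a candidate break and asking how well a single cutpoint on the outcome separates 'before' from 'after', might look like the following (illustrative only, neither the authors' implementation nor the Proposition 99 data):

      import numpy as np

      def break_accuracy(y, split):
          """Best accuracy of a single threshold separating y[:split] from y[split:]
          (a crude stand-in for an ODA cutpoint search)."""
          labels = np.r_[np.zeros(split), np.ones(len(y) - split)]
          best = 0.0
          for c in np.unique(y):
              pred = (y >= c).astype(float)
              best = max(best, np.mean(pred == labels), np.mean(pred != labels))
          return best

      # toy annual outcome series for the pre-intervention period (illustration only)
      years = np.arange(1970, 1989)
      y = 120 - 1.5 * (years - 1970) + np.random.randn(years.size) * 2.0

      for split in range(3, len(y) - 3):
          acc = break_accuracy(y, split)
          print(years[split], round(acc, 2), "<- perfect break" if acc == 1.0 else "")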

  10. Introduction and application of the multiscale coefficient of variation analysis.

    PubMed

    Abney, Drew H; Kello, Christopher T; Balasubramaniam, Ramesh

    2017-10-01

    Quantifying how patterns of behavior relate across multiple levels of measurement typically requires long time series for reliable parameter estimation. We describe a novel analysis that estimates patterns of variability across multiple scales of analysis suitable for time series of short duration. The multiscale coefficient of variation (MSCV) measures the distance between local coefficient of variation estimates within particular time windows and the overall coefficient of variation across all time samples. We first describe the MSCV analysis and provide an example analytical protocol with corresponding MATLAB implementation and code. Next, we present a simulation study testing the new analysis using time series generated by ARFIMA models that span white noise, short-term and long-term correlations. The MSCV analysis was observed to be sensitive to specific parameters of ARFIMA models varying in the type of temporal structure and time series length. We then apply the MSCV analysis to short time series of speech phrases and musical themes to show commonalities in multiscale structure. The simulation and application studies provide evidence that the MSCV analysis can discriminate between time series varying in multiscale structure and length.
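
    The published protocol ships a MATLAB implementation; a rough Python approximation of the statistic as described above (the distance between window-level coefficients of variation and the global CV, per window size) is sketched here:

      import numpy as np

      def mscv(x, scales=(4, 8, 16, 32)):
          """Mean distance between local CVs (non-overlapping windows) and the overall CV."""
          x = np.asarray(x, dtype=float)
          cv_global = x.std() / x.mean()
          out = {}
          for w in scales:
              n_win = len(x) // w
              windows = x[: n_win * w].reshape(n_win, w)
              cv_local = windows.std(axis=1) / windows.mean(axis=1)
              out[w] = np.mean(np.abs(cv_local - cv_global))
          return out

      # e.g. inter-onset intervals of a short phrase (simulated, strictly positive)
      x = 0.3 + np.abs(np.random.randn(200)) * 0.05
      print(mscv(x))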

  11. Signatures of ecological processes in microbial community time series.

    PubMed

    Faust, Karoline; Bauchinger, Franziska; Laroche, Béatrice; de Buyl, Sophie; Lahti, Leo; Washburne, Alex D; Gonze, Didier; Widder, Stefanie

    2018-06-28

    Growth rates, interactions between community members, stochasticity, and immigration are important drivers of microbial community dynamics. In sequencing data analysis, such as network construction and community model parameterization, we make implicit assumptions about the nature of these drivers and thereby restrict model outcome. Despite apparent risk of methodological bias, the validity of the assumptions is rarely tested, as comprehensive procedures are lacking. Here, we propose a classification scheme to determine the processes that gave rise to the observed time series and to enable better model selection. We implemented a three-step classification scheme in R that first determines whether dependence between successive time steps (temporal structure) is present in the time series and then assesses with a recently developed neutrality test whether interactions between species are required for the dynamics. If the first and second tests confirm the presence of temporal structure and interactions, then parameters for interaction models are estimated. To quantify the importance of temporal structure, we compute the noise-type profile of the community, which ranges from black in case of strong dependency to white in the absence of any dependency. We applied this scheme to simulated time series generated with the Dirichlet-multinomial (DM) distribution, Hubbell's neutral model, the generalized Lotka-Volterra model and its discrete variant (the Ricker model), and a self-organized instability model, as well as to human stool microbiota time series. The noise-type profiles for all but DM data clearly indicated distinctive structures. The neutrality test correctly classified all but DM and neutral time series as non-neutral. The procedure reliably identified time series for which interaction inference was suitable. Both tests were required, as we demonstrated that all structured time series, including those generated with the neutral model, achieved a moderate to high goodness of fit to the Ricker model. We present a fast and robust scheme to classify community structure and to assess the prevalence of interactions directly from microbial time series data. The procedure not only serves to determine ecological drivers of microbial dynamics, but also to guide selection of appropriate community models for prediction and follow-up analysis.
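
    The authors implement the full three-step scheme in R; as a rough proxy for the first step (the noise-type profile), one can estimate the colour of each taxon's time series from the slope of its log-log power spectrum, roughly 0 for white noise and increasingly negative toward brown/'black' noise. This is a simplification of the published procedure, shown for illustration:

      import numpy as np

      def spectral_slope(x):
          """Slope of the periodogram in log-log space (a noise-colour proxy)."""
          x = np.asarray(x, dtype=float) - np.mean(x)
          power = np.abs(np.fft.rfft(x)) ** 2
          freq = np.fft.rfftfreq(len(x))
          keep = freq > 0
          slope, _ = np.polyfit(np.log(freq[keep]), np.log(power[keep]), 1)
          return slope

      white = np.random.randn(4096)               # expect a slope near 0
      brown = np.cumsum(np.random.randn(4096))    # expect a slope near -2
      print(round(spectral_slope(white), 2), round(spectral_slope(brown), 2))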

  12. Multivariate stochastic analysis for monthly hydrological time series at Cuyahoga River Basin

    NASA Astrophysics Data System (ADS)

    Zhang, L.

    2011-12-01

    Copulas have become a very powerful statistical and stochastic methodology for multivariate analysis in environmental and water resources engineering. In recent years, the popular one-parameter Archimedean copulas, e.g. the Gumbel-Hougaard, Cook-Johnson and Frank copulas, and the meta-elliptical copulas, e.g. the Gaussian and Student-t copulas, have been applied in multivariate hydrological analyses, e.g. multivariate rainfall (rainfall intensity, duration and depth), flood (peak discharge, duration and volume), and drought analyses (drought length, mean and minimum SPI values, and drought mean areal extent). Copulas have also been applied in flood frequency analysis at the confluences of river systems by taking into account the dependence among upstream gauge stations rather than by using the hydrological routing technique. In most of the studies above, the annual time series have been treated as stationary signals whose values are assumed to be independent identically distributed (i.i.d.) random variables. In reality, however, hydrological time series, especially daily and monthly series, cannot be considered i.i.d. random variables because of the periodicity present in the data structure. The stationarity assumption is also in question because of climate change and land use and land cover (LULC) change in the past years. To this end, it is necessary to re-evaluate the classic approach to the study of hydrological time series by relaxing the stationarity assumption through a nonstationary approach. Likewise, for the study of the dependence structure of hydrological time series, the assumption that all margins follow the same type of univariate distribution needs to be relaxed by adopting copula theory. In this paper, the univariate monthly hydrological time series are studied through a nonstationary time series analysis approach, and the dependence structure of the multivariate monthly hydrological time series is studied through copula theory. For parameter estimation, maximum likelihood estimation (MLE) is applied. To illustrate the method, the univariate time series model and the dependence structure are determined and tested using the monthly discharge time series of the Cuyahoga River Basin.
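
    The study estimates copula parameters by MLE; a lighter-weight sketch of the dependence-structure step is shown below, using pseudo-observations and the Kendall's-tau inversion theta = 1/(1 - tau) for a Gumbel-Hougaard copula. The discharge series and variable names are illustrative assumptions:

      import numpy as np
      from scipy.stats import kendalltau, rankdata

      def gumbel_theta(u, v):
          """Gumbel-Hougaard copula parameter via Kendall's tau: theta = 1/(1 - tau)."""
          tau, _ = kendalltau(u, v)
          return 1.0 / (1.0 - tau)

      # toy monthly discharge at two gauges with positive dependence (illustrative)
      rng = np.random.default_rng(1)
      common = rng.gamma(2.0, 50.0, 240)
      q1 = common + rng.gamma(2.0, 20.0, 240)
      q2 = common + rng.gamma(2.0, 20.0, 240)

      # pseudo-observations: empirical ranks rescaled to (0, 1)
      u = rankdata(q1) / (len(q1) + 1)
      v = rankdata(q2) / (len(q2) + 1)
      print("theta_hat =", round(gumbel_theta(u, v), 2))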

  13. Multiscale Poincaré plots for visualizing the structure of heartbeat time series.

    PubMed

    Henriques, Teresa S; Mariani, Sara; Burykin, Anton; Rodrigues, Filipa; Silva, Tiago F; Goldberger, Ary L

    2016-02-09

    Poincaré delay maps are widely used in the analysis of cardiac interbeat interval (RR) dynamics. To facilitate visualization of the structure of these time series, we introduce multiscale Poincaré (MSP) plots. Starting with the original RR time series, the method employs a coarse-graining procedure to create a family of time series, each of which represents the system's dynamics in a different time scale. Next, the Poincaré plots are constructed for the original and the coarse-grained time series. Finally, as an optional adjunct, color can be added to each point to represent its normalized frequency. We illustrate the MSP method on simulated Gaussian white and 1/f noise time series. The MSP plots of 1/f noise time series reveal relative conservation of the phase space area over multiple time scales, while those of white noise show a marked reduction in area. We also show how MSP plots can be used to illustrate the loss of complexity when heartbeat time series from healthy subjects are compared with those from patients with chronic (congestive) heart failure syndrome or with atrial fibrillation. This generalized multiscale approach to Poincaré plots may be useful in visualizing other types of time series.
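
    A sketch of the coarse-graining step and the (RR_n, RR_{n+1}) pairs that populate each Poincaré panel; the plotting and the optional frequency-based colouring described above are omitted, and the RR series is simulated:

      import numpy as np

      def coarse_grain(rr, scale):
          """Non-overlapping averages of `scale` consecutive beats."""
          n = len(rr) // scale
          return rr[: n * scale].reshape(n, scale).mean(axis=1)

      def poincare_pairs(rr, scales=(1, 2, 4, 8)):
          """For each scale, the (RR_n, RR_{n+1}) pairs of the coarse-grained series."""
          pairs = {}
          for s in scales:
              cg = coarse_grain(rr, s)
              pairs[s] = np.column_stack((cg[:-1], cg[1:]))
          return pairs

      rr = 0.8 + 0.05 * np.random.randn(5000)      # simulated RR intervals in seconds
      print({s: p.shape for s, p in poincare_pairs(rr).items()})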

  14. Two different flavours of complexity in financial data

    NASA Astrophysics Data System (ADS)

    Buonocore, R. J.; Musmeci, N.; Aste, T.; Matteo, T. Di

    2016-12-01

    We discuss two elements that define the complexity of financial time series: one is the multiscaling property, which is linked to how the statistics of a single time-series change with the time horizon; the second is the structure of dependency between time-series, which accounts for the collective behaviour, i.e. the market structure. Financial time-series have statistical properties which change with the time horizon, and the quantification of this multiscaling property has been successful in distinguishing among different degrees of market development, monitoring the stability of firms and estimating risk. The study of the structure of dependency between time-series with the use of information filtering graphs can reveal important insights into the market structure, highlighting risks, stress and portfolio management strategies. In this contribution we highlight achievements and major successes and discuss major challenges and open problems in the study of these two elements of complexity, hoping to attract the interest of more researchers in this research area. We indeed believe that with the advent of the Big Data era, the need for and further development of such approaches, designed to deal with systems with many degrees of freedom, have become more urgent.

  15. Testing the structure of earthquake networks from multivariate time series of successive main shocks in Greece

    NASA Astrophysics Data System (ADS)

    Chorozoglou, D.; Kugiumtzis, D.; Papadimitriou, E.

    2018-06-01

    The seismic hazard assessment in the area of Greece is attempted by studying the earthquake network structure, such as small-world or random structure. In this network, a node represents a seismic zone in the study area and a connection between two nodes is given by the correlation of the seismic activity of the two zones. To investigate the network structure, and particularly the small-world property, the earthquake correlation network is compared with randomized ones. Simulations on multivariate time series of different lengths and numbers of variables show that, for the construction of randomized networks, the method that randomizes the time series performs better than methods that directly randomize the original network connections. Based on the appropriate randomization method, the network approach is applied to time series of earthquakes that occurred between main shocks in the territory of Greece spanning the period 1999-2015. The characterization of networks on sliding time windows revealed that a small-world structure emerges in the last time interval, shortly before the main shock.
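
    A hedged sketch of the comparison described above: build a correlation network over zone-activity series by thresholding |r|, then rebuild it from surrogate series whose temporal order has been shuffled, and compare clustering and path length as a crude small-world check. The paper's zonation and randomization procedures are more involved; all names and thresholds below are assumptions:

      import numpy as np
      import networkx as nx

      def correlation_network(series, threshold=0.4):
          """Nodes = series (e.g. seismic zones); edge if |Pearson r| exceeds the threshold."""
          corr = np.corrcoef(series)
          adj = (np.abs(corr) > threshold) & ~np.eye(len(corr), dtype=bool)
          return nx.from_numpy_array(adj.astype(int))

      def small_world_stats(g):
          giant = g.subgraph(max(nx.connected_components(g), key=len))
          return nx.average_clustering(giant), nx.average_shortest_path_length(giant)

      rng = np.random.default_rng(2)
      common = rng.normal(size=500)                 # shared driver of zone activity
      zones = np.array([common + 0.7 * rng.normal(size=500) for _ in range(20)])

      g_obs = correlation_network(zones)
      g_rnd = correlation_network(rng.permuted(zones, axis=1))   # time-shuffled surrogates
      print("observed :", small_world_stats(g_obs))
      print("surrogate:", small_world_stats(g_rnd))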

  16. AIR POLLUTION EPIDEMIOLOGY: CAN INFORMATION BE OBTAINED FROM THE VARIATIONS IN SIGNIFICANCE AND RISK AS A FUNCTION OF DAYS AFTER EXPOSURE (LAG STRUCTURE)?

    EPA Science Inventory

    Determine if analysis of lag structure from time series epidemiology, using gases, particles, and source factor time series, can contribute to understanding the relationships among various air pollution indicators. Methods: Analyze lag structure from an epidemiologic study of ca...

  17. Mesoscopic Community Structure of Financial Markets Revealed by Price and Sign Fluctuations.

    PubMed

    Almog, Assaf; Besamusca, Ferry; MacMahon, Mel; Garlaschelli, Diego

    2015-01-01

    The mesoscopic organization of complex systems, from financial markets to the brain, is an intermediate between the microscopic dynamics of individual units (stocks or neurons, in the mentioned cases), and the macroscopic dynamics of the system as a whole. The organization is determined by "communities" of units whose dynamics, represented by time series of activity, is more strongly correlated internally than with the rest of the system. Recent studies have shown that the binary projections of various financial and neural time series exhibit nontrivial dynamical features that resemble those of the original data. This implies that a significant piece of information is encoded into the binary projection (i.e. the sign) of such increments. Here, we explore whether the binary signatures of multiple time series can replicate the same complex community organization of the financial market, as the original weighted time series. We adopt a method that has been specifically designed to detect communities from cross-correlation matrices of time series data. Our analysis shows that the simpler binary representation leads to a community structure that is almost identical with that obtained using the full weighted representation. These results confirm that binary projections of financial time series contain significant structural information.
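
    The paper uses a community-detection method designed for cross-correlation matrices; a minimal sketch of the underlying comparison, namely how closely the correlation structure of the sign (binary) series tracks that of the full returns, could look like this (the two-sector toy market is an assumption for illustration):

      import numpy as np

      rng = np.random.default_rng(3)
      sector_a, sector_b = rng.normal(size=(2, 1000))
      returns = np.vstack(
          [0.8 * sector_a + 0.6 * rng.normal(size=1000) for _ in range(10)]
          + [0.8 * sector_b + 0.6 * rng.normal(size=1000) for _ in range(10)])

      c_full = np.corrcoef(returns)              # weighted (full) cross-correlations
      c_sign = np.corrcoef(np.sign(returns))     # cross-correlations of the binary projection

      off = ~np.eye(returns.shape[0], dtype=bool)
      similarity = np.corrcoef(c_full[off], c_sign[off])[0, 1]
      print("agreement of off-diagonal correlation structure:", round(similarity, 3))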

  18. Multichannel biomedical time series clustering via hierarchical probabilistic latent semantic analysis.

    PubMed

    Wang, Jin; Sun, Xiangping; Nahavandi, Saeid; Kouzani, Abbas; Wu, Yuchuan; She, Mary

    2014-11-01

    Biomedical time series clustering that automatically groups a collection of time series according to their internal similarity is of importance for medical record management and inspection, such as bio-signal archiving and retrieval. In this paper, a novel framework that automatically groups a set of unlabelled multichannel biomedical time series according to their internal structural similarity is proposed. Specifically, we treat a multichannel biomedical time series as a document and extract local segments from the time series as words. We extend a topic model, i.e., the Hierarchical probabilistic Latent Semantic Analysis (H-pLSA), which was originally developed for visual motion analysis, to cluster a set of unlabelled multichannel time series. The H-pLSA models each channel of the multichannel time series using a local pLSA in the first layer. The topics learned in the local pLSA are then fed to a global pLSA in the second layer to discover the categories of multichannel time series. Experiments on a dataset extracted from multichannel Electrocardiography (ECG) signals demonstrate that the proposed method performs better than previous state-of-the-art approaches and is relatively robust to variations of parameters, including the length of local segments and the dictionary size. Although the experimental evaluation used the multichannel ECG signals in a biometric scenario, the proposed algorithm is a universal framework for clustering multichannel biomedical time series according to their structural similarity, which has many applications in biomedical time series management. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  19. Design and fabrication of two kinds of SOI-based EA-type VOAs

    NASA Astrophysics Data System (ADS)

    Yuan, Pei; Wang, Yue; Wu, Yuanda; An, Junming; Hu, Xiongwei

    2018-06-01

    SOI-based variable optical attenuators (VOAs) based on the electro-absorption mechanism are demonstrated in this paper. Two different doping structures are adopted to realize the attenuation: a structure with a single lateral p-i-n diode and a structure with several lateral p-i-n diodes connected in series. The VOAs with lateral p-i-n diodes connected in series (series VOA) can greatly improve the device attenuation efficiency compared to VOAs with a single lateral p-i-n diode structure (single VOA), which is verified by the experimental results: the attenuation efficiencies of the series VOA and the single VOA are 3.76 dB/mA and 0.189 dB/mA, respectively. The corresponding power consumption at 20 dB attenuation is 202 mW (series VOA) and 424 mW (single VOA), respectively. The rise time is 34.5 ns (single VOA) and 45.5 ns (series VOA), and the fall time is 37 ns (single VOA) and 48.5 ns (series VOA).

  20. Have the temperature time series a structural change after 1998?

    NASA Astrophysics Data System (ADS)

    Werner, Rolf; Valev, Dimitare; Danov, Dimitar

    2012-07-01

    The global and hemispheric temperature GISS and HadCRUT3 time series were analysed for structural changes. We postulate continuity of the fitted temperature function over time, so that each segment joins the preceding one. The slopes are calculated for a sequence of segments delimited by time thresholds. We used a standard method, restricted linear regression with dummy variables. We performed the calculations and tests for different numbers of thresholds, with the thresholds searched continuously within specified time intervals. The F-statistic is used to obtain the time points of the structural changes.
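
    A sketch of the restricted (continuity-preserving) dummy-variable regression for a single candidate threshold, with the F-statistic against the one-slope model; scanning the threshold over a time interval, as described above, recovers the change point. The anomaly series below is simulated for illustration:

      import numpy as np
      from scipy import stats

      def broken_line_rss(t, y, t0):
          """Continuous two-segment fit y = a + b*t + c*max(t - t0, 0); returns the RSS."""
          X = np.column_stack([np.ones_like(t), t, np.maximum(t - t0, 0.0)])
          _, rss, *_ = np.linalg.lstsq(X, y, rcond=None)
          return float(rss[0])

      def f_test(t, y, t0):
          """F-statistic and p-value of the broken line against a single straight line."""
          X0 = np.column_stack([np.ones_like(t), t])
          _, rss0, *_ = np.linalg.lstsq(X0, y, rcond=None)
          rss1 = broken_line_rss(t, y, t0)
          f = (float(rss0[0]) - rss1) / (rss1 / (len(t) - 3))
          return f, 1.0 - stats.f.cdf(f, 1, len(t) - 3)

      # toy annual anomalies with a slope change in 1998 (illustration only)
      t = np.arange(1960, 2012, dtype=float)
      y = 0.01 * (t - 1960) + 0.02 * np.maximum(t - 1998, 0) + 0.05 * np.random.randn(t.size)
      for t0 in (1980.0, 1990.0, 1998.0):
          print(int(t0), [round(v, 3) for v in f_test(t, y, t0)])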

  1. Estimating survival rates with time series of standing age‐structure data

    USGS Publications Warehouse

    Udevitz, Mark S.; Gogan, Peter J.

    2012-01-01

    It has long been recognized that age‐structure data contain useful information for assessing the status and dynamics of wildlife populations. For example, age‐specific survival rates can be estimated with just a single sample from the age distribution of a stable, stationary population. For a population that is not stable, age‐specific survival rates can be estimated using techniques such as inverse methods that combine time series of age‐structure data with other demographic data. However, estimation of survival rates using these methods typically requires numerical optimization, a relatively long time series of data, and smoothing or other constraints to provide useful estimates. We developed general models for possibly unstable populations that combine time series of age‐structure data with other demographic data to provide explicit maximum likelihood estimators of age‐specific survival rates with as few as two years of data. As an example, we applied these methods to estimate survival rates for female bison (Bison bison) in Yellowstone National Park, USA. This approach provides a simple tool for monitoring survival rates based on age‐structure data.

  3. Modeling climate change impacts on combined sewer overflow using synthetic precipitation time series.

    PubMed

    Bendel, David; Beck, Ferdinand; Dittmer, Ulrich

    2013-01-01

    In the presented study climate change impacts on combined sewer overflows (CSOs) in Baden-Wuerttemberg, Southern Germany, were assessed based on continuous long-term rainfall-runoff simulations. As input data, synthetic rainfall time series were used. The applied precipitation generator NiedSim-Klima accounts for climate change effects on precipitation patterns. Time series for the past (1961-1990) and future (2041-2050) were generated for various locations. Comparing the simulated CSO activity of both periods we observe significantly higher overflow frequencies for the future. Changes in overflow volume and overflow duration depend on the type of overflow structure. Both values will increase at simple CSO structures that merely divide the flow, whereas they will decrease when the CSO structure is combined with a storage tank. However, there is a wide variation between the results of different precipitation time series (representative for different locations).

  4. Road safety forecasts in five European countries using structural time series models.

    PubMed

    Antoniou, Constantinos; Papadimitriou, Eleonora; Yannis, George

    2014-01-01

    Modeling road safety development is a complex task that needs to consider both the quantifiable impact of specific parameters and the underlying trends that cannot always be measured or observed. The objective of this research is to apply structural time series models for obtaining reliable medium- to long-term forecasts of road traffic fatality risk using data from 5 countries with different characteristics from all over Europe (Cyprus, Greece, Hungary, Norway, and Switzerland). Two structural time series models are considered: (1) the local linear trend model and (2) the latent risk time series model. Furthermore, a structured decision tree for the selection of the applicable model for each situation (developed within the Road Safety Data, Collection, Transfer and Analysis [DaCoTA] research project, cofunded by the European Commission) is outlined. First, the fatality and exposure data that are used for the development of the models are presented and explored. Then, the modeling process is presented, including the model selection process, introduction of intervention variables, and development of mobility scenarios. The forecasts using the developed models appear to be realistic and within acceptable confidence intervals. The proposed methodology is proved to be very efficient for handling different cases of data availability and quality, providing an appropriate alternative from the family of structural time series models in each country. A concluding section providing perspectives and directions for future research is presented.
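
    The fatality series themselves are not reproduced here; a sketch of the first specification (the local linear trend model) on simulated log-fatalities, with a medium-term forecast and confidence interval, is shown below. The latent risk model additionally couples an exposure series and needs a bespoke state-space setup, which is omitted:

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(4)
      years = pd.date_range("1980", periods=32, freq="YS")
      fatalities = pd.Series(np.exp(7.5 - 0.02 * np.arange(32)
                                    + rng.normal(0, 0.03, 32)), index=years)

      # local linear trend model on log-fatalities
      llt = sm.tsa.UnobservedComponents(np.log(fatalities), level="local linear trend")
      res = llt.fit(disp=False)

      fc = res.get_forecast(steps=9)              # medium-term forecast horizon
      print(np.exp(fc.predicted_mean).round(0))
      print(np.exp(fc.conf_int()).round(0))       # back-transformed confidence interval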

  5. Multifractal analysis of the Korean agricultural market

    NASA Astrophysics Data System (ADS)

    Kim, Hongseok; Oh, Gabjin; Kim, Seunghwan

    2011-11-01

    We have studied the long-term memory effects of the Korean agricultural market using the detrended fluctuation analysis (DFA) method. In general, the return time series of various financial data, including stock indices, foreign exchange rates, and commodity prices, are uncorrelated in time, while the volatility time series are strongly correlated. However, we found that the return time series of Korean agricultural commodity prices are anti-correlated in time, while the volatility time series are correlated. The n-point correlations of time series were also examined, and it was found that a multifractal structure exists in Korean agricultural market prices.
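
    A compact DFA sketch (first-order detrending), illustrative code rather than the authors'; a scaling exponent below about 0.5 is consistent with the anti-correlated returns reported above, while an exponent above 0.5 indicates persistent, correlated behaviour:

      import numpy as np

      def dfa_exponent(x, scales=(8, 16, 32, 64, 128)):
          """Detrended fluctuation analysis with linear detrending in each box."""
          y = np.cumsum(x - np.mean(x))                    # integrated profile
          flucts = []
          for n in scales:
              n_box = len(y) // n
              boxes = y[: n_box * n].reshape(n_box, n)
              t = np.arange(n)
              resid = [box - np.polyval(np.polyfit(t, box, 1), t) for box in boxes]
              flucts.append(np.sqrt(np.mean(np.square(resid))))
          alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
          return alpha

      returns = np.random.randn(10000)                     # white noise: alpha ~ 0.5
      print(round(dfa_exponent(returns), 2))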

  6. A Multitaper, Causal Decomposition for Stochastic, Multivariate Time Series: Application to High-Frequency Calcium Imaging Data.

    PubMed

    Sornborger, Andrew T; Lauderdale, James D

    2016-11-01

    Neural data analysis has increasingly incorporated causal information to study circuit connectivity. Dimensional reduction forms the basis of most analyses of large multivariate time series. Here, we present a new, multitaper-based decomposition for stochastic, multivariate time series that acts on the covariance of the time series at all lags, C(τ), as opposed to standard methods that decompose the time series, X(t), using only information at zero lag. In both simulated and neural imaging examples, we demonstrate that methods that neglect the full causal structure may be discarding important dynamical information in a time series.

  7. Modeling global vector fields of chaotic systems from noisy time series with the aid of structure-selection techniques.

    PubMed

    Xu, Daolin; Lu, Fangfang

    2006-12-01

    We address the problem of reconstructing a set of nonlinear differential equations from chaotic time series. A method that combines implicit Adams integration with a structure-selection technique based on the error reduction ratio is proposed for system identification and the corresponding parameter estimation of the model. The structure-selection technique identifies the significant terms from a pool of candidate basis functions and determines the optimal model through orthogonal characteristics of the data. Combined with the Adams integration algorithm, the technique makes the reconstruction applicable to data sampled at large time intervals. Numerical experiments on the Lorenz and Rossler systems show that the proposed strategy is effective in global vector field reconstruction from noisy time series.

  8. Minimum entropy density method for the time series analysis

    NASA Astrophysics Data System (ADS)

    Lee, Jeong Won; Park, Joongwoo Brian; Jo, Hang-Hyun; Yang, Jae-Suk; Moon, Hie-Tae

    2009-01-01

    The entropy density is an intuitive and powerful concept for studying the complicated nonlinear processes derived from physical systems. We develop the minimum entropy density method (MEDM) to detect the structure scale of a given time series, which is defined as the scale at which the uncertainty is minimized and hence the pattern is most clearly revealed. The MEDM is applied to the financial time series of the Standard and Poor's 500 index from February 1983 to April 2006. The temporal behavior of the structure scale is then obtained and analyzed in relation to the information delivery time and the efficient market hypothesis.

  9. A new methodological approach for worldwide beryllium-7 time series analysis

    NASA Astrophysics Data System (ADS)

    Bianchi, Stefano; Longo, Alessandro; Plastino, Wolfango

    2018-07-01

    Time series analyses of cosmogenic radionuclide 7Be and 22Na atmospheric activity concentrations and meteorological data observed at twenty-five International Monitoring System (IMS) stations of the Comprehensive Nuclear-Test-Ban Treaty Organisation (CTBTO) have shown great variability in terms of noise structures, harmonic content, cross-correlation patterns and local Hurst exponent behaviour. The noise content and its structure have been extracted and characterised for the two radionuclide time series. It has been found that the yearly component, which is present in most of the time series, is not stationary, but has a percentage weight that varies with time. Analysis of atmospheric activity concentrations of 7Be, measured at IMS stations, has shown them to be influenced by distinct meteorological patterns, mainly by atmospheric pressure and temperature.

  10. Dual Fractal Dimension and Long-Range Correlation of Chinese Stock Prices

    NASA Astrophysics Data System (ADS)

    Chen, Chaoshi; Wang, Lei

    2012-03-01

    The recently developed modified inverse random midpoint displacement (mIRMD) and conventional detrended fluctuation analysis (DFA) algorithms are used to analyze the tick-by-tick high-frequency time series of Chinese A-share stock prices and indexes. A dual-fractal structure with a crossover at about 10 min is observed. The majority of the selected time series show visible persistence within this time threshold, but approach a random walk on longer time scales. The phenomenon is found to be industry-dependent, i.e., the crossover is much more prominent for stocks belonging to cyclical industries than for those belonging to noncyclical (defensive) industries. We have also shown that the sign series exhibit a similar dual-fractal structure, while, as is generally found, the magnitude series show much longer-lasting persistence.

  11. Segmenting the Stream of Consciousness: The Psychological Correlates of Temporal Structures in the Time Series Data of a Continuous Performance Task

    ERIC Educational Resources Information Center

    Smallwood, Jonathan; McSpadden, Merrill; Luus, Bryan; Schooler, Jonathan

    2008-01-01

    Using principal component analysis, we examined whether structural properties in the time series of response time would identify different mental states during a continuous performance task. We examined whether it was possible to identify regular patterns which were present in blocks classified as lacking controlled processing, either…

  12. A likelihood-based time series modeling approach for application in dendrochronology to examine the growth-climate relations and forest disturbance history

    EPA Science Inventory

    A time series intervention analysis (TSIA) of dendrochronological data to infer the tree growth-climate-disturbance relations and forest disturbance history is described. Maximum likelihood is used to estimate the parameters of a structural time series model with components for ...

  13. Using time series structural characteristics to analyze grain prices in food insecure countries

    USGS Publications Warehouse

    Davenport, Frank; Funk, Chris

    2015-01-01

    Two components of food security monitoring are accurate forecasts of local grain prices and the ability to identify unusual price behavior. We evaluated a method that can both facilitate forecasts of cross-country grain price data and identify dissimilarities in price behavior across multiple markets. This method, characteristic based clustering (CBC), identifies similarities in multiple time series based on structural characteristics in the data. Here, we conducted a simulation experiment to determine if CBC can be used to improve the accuracy of maize price forecasts. We then compared forecast accuracies among clustered and non-clustered price series over a rolling time horizon. We found that the accuracy of forecasts on clusters of time series was equal to or worse than that of forecasts based on individual time series. However, in the following experiment we found that CBC was still useful for price analysis. We used the clusters to explore the similarity of price behavior among Kenyan maize markets. We found that price behavior in the isolated markets of Mandera and Marsabit has become increasingly dissimilar from that in other Kenyan cities, and that these dissimilarities could not be explained solely by geographic distance. The structural isolation of Mandera and Marsabit that we find in this paper is supported by field studies on food security and market integration in Kenya. Our results suggest that a market with a unique price series (as measured by structural characteristics that differ from neighboring markets) may lack market integration and food security.
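
    A minimal sketch of characteristic-based clustering: summarize each price series by a few structural features, standardize, and cluster hierarchically. The feature set, distance measure and clustering used in the paper are richer, and the market series below are simulated:

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.stats import zscore

      def features(series):
          """Tiny structural-feature vector: trend slope, volatility, lag-1 autocorrelation."""
          t = np.arange(len(series))
          slope = np.polyfit(t, series, 1)[0]
          diffs = np.diff(series)
          acf1 = np.corrcoef(diffs[:-1], diffs[1:])[0, 1]
          return [slope, diffs.std(), acf1]

      rng = np.random.default_rng(5)
      markets = [np.cumsum(rng.normal(0.5, 1.0, 120)) for _ in range(6)]   # integrated markets
      markets += [np.cumsum(rng.normal(2.5, 4.0, 120)) for _ in range(2)]  # dissimilar markets

      f = zscore(np.array([features(m) for m in markets]), axis=0)
      labels = fcluster(linkage(f, method="ward"), t=2, criterion="maxclust")
      print(labels)   # the two dissimilar markets should fall into their own cluster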

  14. GrammarViz 3.0: Interactive Discovery of Variable-Length Time Series Patterns

    DOE PAGES

    Senin, Pavel; Lin, Jessica; Wang, Xing; ...

    2018-02-23

    The problems of recurrent and anomalous pattern discovery in time series, e.g., motifs and discords, respectively, have received a lot of attention from researchers in the past decade. However, since the pattern search space is usually intractable, most existing detection algorithms require that the patterns have discriminative characteristics and that their length be known in advance and provided as input, which is an unreasonable requirement for many real-world problems. In addition, patterns of similar structure but of different lengths may co-exist in a time series. In order to address these issues, we have developed algorithms for variable-length time series pattern discovery that are based on symbolic discretization and grammar inference, two techniques whose combination enables the structured reduction of the search space and discovery of the candidate patterns in linear time. In this work, we present GrammarViz 3.0, a software package that provides implementations of the proposed algorithms and a graphical user interface for interactive variable-length time series pattern discovery. The current version of the software provides an alternative grammar inference algorithm that improves the time series motif discovery workflow, and introduces an experimental procedure for automated discretization parameter selection that builds upon the minimum cardinality maximum cover principle and aids the discovery of recurrent and anomalous time series patterns.
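
    GrammarViz couples SAX symbolic discretization with grammar inference; only the first step (z-normalization, piecewise aggregate approximation, and mapping to symbols via Gaussian breakpoints) is sketched here, with parameters chosen arbitrarily for illustration:

      import numpy as np
      from scipy.stats import norm

      def sax_word(window, n_segments=4, alphabet="abcd"):
          """SAX: z-normalize, reduce to PAA segments, map to symbols via Gaussian breakpoints."""
          z = (window - window.mean()) / (window.std() + 1e-12)
          paa = z[: len(z) // n_segments * n_segments].reshape(n_segments, -1).mean(axis=1)
          breakpoints = norm.ppf(np.linspace(0, 1, len(alphabet) + 1)[1:-1])
          return "".join(alphabet[np.searchsorted(breakpoints, v)] for v in paa)

      x = np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.1 * np.random.randn(2000)
      words = [sax_word(x[i:i + 100]) for i in range(0, len(x) - 100, 50)]
      print(words[:8])   # the symbol stream that grammar induction would operate on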

  16. Conventional and advanced time series estimation: application to the Australian and New Zealand Intensive Care Society (ANZICS) adult patient database, 1993-2006.

    PubMed

    Moran, John L; Solomon, Patricia J

    2011-02-01

    Time series analysis has seen limited application in the biomedical literature. The utility of conventional and advanced time series estimators was explored for intensive care unit (ICU) outcome series. Monthly mean time series, 1993-2006, for hospital mortality, severity-of-illness score (APACHE III), ventilation fraction and patient type (medical and surgical), were generated from the Australia and New Zealand Intensive Care Society adult patient database. Analyses encompassed geographical seasonal mortality patterns, series structural time changes, mortality series volatility using autoregressive moving average and Generalized Autoregressive Conditional Heteroscedasticity models in which predicted variances are updated adaptively, and bivariate and multivariate (vector error correction models) cointegrating relationships between series. The mortality series exhibited marked seasonality, declining mortality trend and substantial autocorrelation beyond 24 lags. Mortality increased in winter months (July-August); the medical series featured annual cycling, whereas the surgical demonstrated long and short (3-4 months) cycling. Series structural breaks were apparent in January 1995 and December 2002. The covariance stationary first-differenced mortality series was consistent with a seasonal autoregressive moving average process; the observed conditional-variance volatility (1993-1995) and residual Autoregressive Conditional Heteroscedasticity effects entailed a Generalized Autoregressive Conditional Heteroscedasticity model, preferred by information criterion and mean model forecast performance. Bivariate cointegration, indicating long-term equilibrium relationships, was established between mortality and severity-of-illness scores at the database level and for categories of ICUs. Multivariate cointegration was demonstrated for {log APACHE III score, log ICU length of stay, ICU mortality and ventilation fraction}. A systems approach to understanding series time-dependence may be established using conventional and advanced econometric time series estimators. © 2010 Blackwell Publishing Ltd.
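
    The ANZICS series are not public; using simulated monthly series, two of the estimation steps described above, a GARCH(1,1) fit to the differenced mortality series (with the third-party arch package) and an Engle-Granger cointegration test between mortality and severity score (with statsmodels), could be sketched as follows:

      import numpy as np
      import pandas as pd
      from arch import arch_model
      from statsmodels.tsa.stattools import coint

      rng = np.random.default_rng(6)
      n = 168                                                    # 14 years of monthly data
      severity = pd.Series(55 + np.cumsum(rng.normal(0, 0.3, n)))        # APACHE III-like score
      mortality = 0.15 + 0.002 * (severity - 55) + rng.normal(0, 0.01, n)

      # GARCH(1,1) on the (scaled) first-differenced mortality series
      garch = arch_model(100 * np.diff(mortality), vol="GARCH", p=1, q=1, mean="Constant")
      print(garch.fit(disp="off").params)

      # Engle-Granger test for a long-run equilibrium between mortality and severity
      t_stat, p_value, _ = coint(mortality, severity)
      print("cointegration p-value:", round(p_value, 3))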

  17. Regenerating time series from ordinal networks.

    PubMed

    McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael

    2017-03-01

    Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discrete sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series by using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.
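
    A hedged sketch of the core loop: map the series to ordinal (permutation) patterns, count transitions between consecutive patterns, then take a random walk on the resulting transition graph to regenerate a surrogate symbol sequence. Mapping symbols back to amplitudes and the recurrence, Lyapunov and correlation-dimension comparisons are omitted, and a logistic map stands in for the continuous chaotic flows used in the paper:

      import numpy as np
      from itertools import permutations

      def ordinal_symbols(x, m=3):
          """Permutation (ordinal) pattern index of each length-m window."""
          patterns = {p: i for i, p in enumerate(permutations(range(m)))}
          windows = np.lib.stride_tricks.sliding_window_view(x, m)
          return np.array([patterns[tuple(np.argsort(w))] for w in windows])

      def regenerate(symbols, length, rng):
          """Random walk on the empirical transition matrix of the ordinal symbols."""
          n_sym = symbols.max() + 1
          counts = np.zeros((n_sym, n_sym))
          for a, b in zip(symbols[:-1], symbols[1:]):
              counts[a, b] += 1
          probs = counts / np.maximum(counts.sum(axis=1, keepdims=True), 1)
          walk = [rng.integers(n_sym)]
          for _ in range(length - 1):
              row = probs[walk[-1]]
              walk.append(rng.choice(n_sym, p=row) if row.sum() > 0 else rng.integers(n_sym))
          return np.array(walk)

      x = np.empty(5000); x[0] = 0.4
      for i in range(1, 5000):                       # chaotic logistic-map source series
          x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

      rng = np.random.default_rng(7)
      sym = ordinal_symbols(x)
      surrogate = regenerate(sym, 1000, rng)
      print(np.bincount(sym, minlength=6) / sym.size)              # ordinal distribution, original
      print(np.bincount(surrogate, minlength=6) / surrogate.size)  # vs. the regenerated walk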

  19. A Sandwich-Type Standard Error Estimator of SEM Models with Multivariate Time Series

    ERIC Educational Resources Information Center

    Zhang, Guangjian; Chow, Sy-Miin; Ong, Anthony D.

    2011-01-01

    Structural equation models are increasingly used as a modeling tool for multivariate time series data in the social and behavioral sciences. Standard error estimators of SEM models, originally developed for independent data, require modifications to accommodate the fact that time series data are inherently dependent. In this article, we extend a…

  20. Inference of scale-free networks from gene expression time series.

    PubMed

    Tominaga, Daisuke; Horton, Paul

    2006-04-01

    Quantitative time-series observation of gene expression is becoming possible, for example by cell array technology. However, there are no practical methods with which to infer network structures using only observed time-series data. As most computational models of biological networks for continuous time-series data have many degrees of freedom, it is almost impossible to infer the correct structures. On the other hand, it has been reported that some kinds of biological networks, such as gene networks and metabolic pathways, may have scale-free properties. We hypothesize that the architecture of inferred biological network models can be restricted to scale-free networks. We developed an inference algorithm for biological networks using only time-series data by introducing such a restriction. We adopt the S-system as the network model, and a distributed genetic algorithm to optimize models so that their simulated results fit the observed time-series data. We have tested our algorithm on a case study (simulated data). We compared optimization under no restriction, which allows for a fully connected network, and under the restriction that the total number of links must equal that expected from a scale-free network. The restriction reduced both false-positive and false-negative estimation of the links and also the differences between the model simulation and the given time-series data.

  1. Characterization of chaotic attractors under noise: A recurrence network perspective

    NASA Astrophysics Data System (ADS)

    Jacob, Rinku; Harikrishnan, K. P.; Misra, R.; Ambika, G.

    2016-12-01

    We undertake a detailed numerical investigation to understand how the addition of white and colored noise to a chaotic time series changes the topology and the structure of the underlying attractor reconstructed from the time series. We use the methods and measures of recurrence plot and recurrence network generated from the time series for this analysis. We explicitly show that the addition of noise obscures the property of recurrence of trajectory points in the phase space which is the hallmark of every dynamical system. However, the structure of the attractor is found to be robust even up to high noise levels of 50%. An advantage of recurrence network measures over the conventional nonlinear measures is that they can be applied on short and non-stationary time series data. By using the results obtained from the above analysis, we go on to analyse the light curves from a dominant black hole system and show that the recurrence network measures are capable of identifying the nature of noise contamination in a time series.
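
    A hedged sketch of the recurrence-network construction: time-delay embedding, an epsilon threshold on pairwise distances set by a fixed recurrence rate, and a simple network measure read off as noise is added. The embedding parameters and the logistic-map source series are assumptions for illustration; the paper's analysis uses richer measures and real light curves:

      import numpy as np
      import networkx as nx
      from scipy.spatial.distance import pdist, squareform

      def recurrence_network(x, dim=3, tau=2, rate=0.1):
          """Embed, threshold pairwise distances at a fixed recurrence rate, build a graph."""
          n = len(x) - (dim - 1) * tau
          emb = np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])
          dist = squareform(pdist(emb))
          eps = np.quantile(dist[np.triu_indices(n, k=1)], rate)
          adj = (dist <= eps) & ~np.eye(n, dtype=bool)
          return nx.from_numpy_array(adj.astype(int))

      x = np.empty(1000); x[0] = 0.3
      for i in range(1, 1000):                      # chaotic logistic-map toy series
          x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

      for noise in (0.0, 0.1, 0.5):
          xn = x + noise * x.std() * np.random.randn(x.size)
          g = recurrence_network(xn)
          print(f"noise level {noise:.1f}: transitivity = {nx.transitivity(g):.3f}")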

  2. Visibility graph analysis on quarterly macroeconomic series of China based on complex network theory

    NASA Astrophysics Data System (ADS)

    Wang, Na; Li, Dong; Wang, Qiwen

    2012-12-01

    The visibility graph approach and complex network theory provide a new insight into time series analysis. The inheritance of the visibility graph from the original time series was further explored in this paper. We found that degree distributions of visibility graphs extracted from Pseudo Brownian Motion series obtained by the Frequency Domain algorithm exhibit exponential behaviors, in which the exponential exponent is a binomial function of the Hurst index inherited in the time series. Our simulations showed that the quantitative relations between the Hurst indexes and the exponents of the degree distribution function are different for different series and that the visibility graph inherits some important features of the original time series. Further, we convert some quarterly macroeconomic series, including the growth rates of value-added of three industry series and the growth rates of Gross Domestic Product series of China, to graphs by the visibility algorithm and explore the topological properties of the graphs associated with the four macroeconomic series, namely, the degree distribution and correlations, the clustering coefficient, the average path length, and community structure. Based on complex network analysis we find that the degree distributions of the networks associated with the growth rates of value-added of the three industry series are almost exponential and the degree distributions of the networks associated with the growth rates of GDP series are scale free. We also discussed the assortativity and disassortativity of the four associated networks as they are related to the evolutionary process of the original macroeconomic series. All the constructed networks have “small-world” features. The community structures of the associated networks suggest dynamic changes in the original macroeconomic series. We also detected the relationship among government policy changes, community structures of the associated networks and macroeconomic dynamics. We find that government policies in China strongly influence the dynamics of GDP and the adjustment of the three industries. The work in our paper provides a new way to understand the dynamics of economic development.

  3. 24 CFR 3282.13 - Voluntary certification.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... certifies that if, at any time it manufactures structures which are not manufactured homes, it will identify... production and that the series of serial numbers for such structures shall be distinguishable on the structures and in its records from the series of serial numbers used for manufactured homes. (c) Whenever a...

  4. 24 CFR 3282.13 - Voluntary certification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... certifies that if, at any time it manufactures structures which are not manufactured homes, it will identify... production and that the series of serial numbers for such structures shall be distinguishable on the structures and in its records from the series of serial numbers used for manufactured homes. (c) Whenever a...

  5. 24 CFR 3282.13 - Voluntary certification.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... certifies that if, at any time it manufactures structures which are not manufactured homes, it will identify... production and that the series of serial numbers for such structures shall be distinguishable on the structures and in its records from the series of serial numbers used for manufactured homes. (c) Whenever a...

  6. 24 CFR 3282.13 - Voluntary certification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... certifies that if, at any time it manufactures structures which are not manufactured homes, it will identify... production and that the series of serial numbers for such structures shall be distinguishable on the structures and in its records from the series of serial numbers used for manufactured homes. (c) Whenever a...

  7. 24 CFR 3282.13 - Voluntary certification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... certifies that if, at any time it manufactures structures which are not manufactured homes, it will identify... production and that the series of serial numbers for such structures shall be distinguishable on the structures and in its records from the series of serial numbers used for manufactured homes. (c) Whenever a...

  8. A Deep Machine Learning Method for Classifying Cyclic Time Series of Biological Signals Using Time-Growing Neural Network.

    PubMed

    Gharehbaghi, Arash; Linden, Maria

    2017-10-12

    This paper presents a novel method for learning the cyclic contents of stochastic time series: the deep time-growing neural network (DTGNN). The DTGNN combines supervised and unsupervised methods at different levels of learning for enhanced performance. It is employed within a multiscale learning structure to classify cyclic time series (CTS), in which the dynamic contents of the time series are preserved in an efficient manner. The paper suggests a systematic procedure for finding the design parameters of the classification method for a one-versus-multiple class application. A novel validation method is also suggested for evaluating the structural risk, both quantitatively and qualitatively. The effect of the DTGNN on the performance of the classifier is statistically validated through repeated random subsampling using different sets of CTS from different medical applications. The validation involves four medical databases, comprising 108 recordings of the electroencephalogram signal, 90 recordings of the electromyogram signal, 130 recordings of the heart sound signal, and 50 recordings of the respiratory sound signal. Results of the statistical validations show that the DTGNN significantly improves the performance of the classification and also exhibits an optimal structural risk.

  9. hctsa: A Computational Framework for Automated Time-Series Phenotyping Using Massive Feature Extraction.

    PubMed

    Fulcher, Ben D; Jones, Nick S

    2017-11-22

    Phenotype measurements frequently take the form of time series, but we currently lack a systematic method for relating these complex data streams to scientifically meaningful outcomes, such as relating the movement dynamics of organisms to their genotype or measurements of brain dynamics of a patient to their disease diagnosis. Previous work addressed this problem by comparing implementations of thousands of diverse scientific time-series analysis methods in an approach termed highly comparative time-series analysis. Here, we introduce hctsa, a software tool for applying this methodological approach to data. hctsa includes an architecture for computing over 7,700 time-series features and a suite of analysis and visualization algorithms to automatically select useful and interpretable time-series features for a given application. Using exemplar applications to high-throughput phenotyping experiments, we show how hctsa allows researchers to leverage decades of time-series research to quantify and understand informative structure in time-series data. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
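    hctsa itself is a MATLAB framework computing thousands of features; the short Python sketch below merely illustrates the idea of summarising a time series by a vector of interpretable features. The handful of ad hoc features chosen here are assumptions for the example and do not reproduce the hctsa feature set.

```python
import itertools
import numpy as np

def basic_features(x):
    """A handful of simple, interpretable time-series features.

    Purely illustrative: hctsa computes over 7,700 features in MATLAB;
    none of its feature set is reproduced here.
    """
    x = np.asarray(x, dtype=float)
    diffs = np.diff(x)
    above = x > x.mean()
    return {
        "mean": x.mean(),
        "std": x.std(),
        "lag1_autocorr": np.corrcoef(x[:-1], x[1:])[0, 1],
        "mean_abs_change": np.abs(diffs).mean(),
        "fraction_above_mean": above.mean(),
        "longest_run_above_mean": max(
            (sum(1 for _ in g) for key, g in itertools.groupby(above) if key),
            default=0),
    }

rng = np.random.default_rng(2)
signal = np.sin(np.linspace(0, 20, 500)) + 0.1 * rng.standard_normal(500)
for name, value in basic_features(signal).items():
    print(f"{name:>22s}: {value:.3f}")
```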

  10. Statistical Inference on Memory Structure of Processes and Its Applications to Information Theory

    DTIC Science & Technology

    2016-05-12

    Only fragments of this record are recoverable: a statistical method is developed to estimate the memory depth of discrete-time and continuously-valued time series from a sample (a practical algorithm to compute the estimator is a work in progress), with further work on finitely-valued spatial processes. Keywords: mathematical statistics; time series; Markov chains. Sponsor: U.S. Army Research Office, P.O. Box 12211, Research Triangle Park, NC 27709-2211.

  11. Nonlinear analysis and dynamic structure in the energy market

    NASA Astrophysics Data System (ADS)

    Aghababa, Hajar

    This research assesses the dynamic structure of the energy sector of the aggregate economy in the context of nonlinear mechanisms. Earlier studies have focused mainly on the prices of energy products when detecting nonlinearities in time series data of the energy market, and there is little mention of the production side of the market. Moreover, there is little exploration of the implications of high dimensionality and time aggregation when analyzing the market's fundamentals. This research addresses these gaps by including the quantity side of the market in addition to price and by systematically incorporating various sampling frequencies in three essays. The goal is to provide an inclusive and exhaustive examination of the dynamics in the energy markets. The first essay begins with the application of statistical techniques; it incorporates the most well-known univariate tests for nonlinearity, with distinct power functions over alternatives, and tests different null hypotheses. It utilizes daily spot price observations on five major products in the energy market. The results suggest that the daily spot prices of the energy products are highly nonlinear in nature. They demonstrate apparent evidence of general nonlinear serial dependence in each individual series, as well as nonlinearity in the first, second, and third moments of the series. The second essay examines the underlying mechanism of crude oil production and identifies the nonlinear structure of the production market by utilizing various monthly time series of crude oil production: the U.S. field, the Organization of the Petroleum Exporting Countries (OPEC), non-OPEC, and world production of crude oil. The findings imply that the time series of the U.S. field, OPEC, and world production of crude oil exhibit deep nonlinearity in their structure and are generated by nonlinear mechanisms, whereas the non-OPEC production time series does not reveal signs of nonlinearity. The third essay explores nonlinear structure in the case of high dimensionality of the observations, different sampling frequencies, and division of the samples into sub-samples. It systematically examines the robustness of the inference methods at various levels of time aggregation by employing daily spot prices on crude oil for 26 years as well as a monthly spot price index on crude oil for 41 years; the daily and monthly samples are divided into sub-samples as well. All the tests detect strong evidence of nonlinear structure in the daily spot price of crude oil, whereas in the monthly observations the evidence of nonlinear dependence is less dramatic, indicating that nonlinear serial dependence becomes less intense as the time aggregation of the observations increases.

  12. Characterizing system dynamics with a weighted and directed network constructed from time series data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Xiaoran, E-mail: sxr0806@gmail.com; School of Mathematics and Statistics, The University of Western Australia, Crawley WA 6009; Small, Michael, E-mail: michael.small@uwa.edu.au

    In this work, we propose a novel method to transform a time series into a weighted and directed network. For a given time series, we first generate a set of segments via a sliding window, and then use a doubly symbolic scheme to characterize every windowed segment by combining absolute amplitude information with an ordinal pattern characterization. Based on this construction, a network can be directly constructed from the given time series: segments corresponding to different symbol pairs are mapped to network nodes and the temporal succession between nodes is represented by directed links. With this conversion, the dynamics underlying the time series are encoded into the network structure. We illustrate the potential of our networks with a well-studied dynamical model as a benchmark example. Results show that network measures for characterizing global properties can detect the dynamical transitions in the underlying system. Moreover, we employ a random walk algorithm to sample loops in our networks, and find that time series with different dynamics exhibit distinct cycle structures. That is, the relative prevalence of loops with different lengths can be used to identify the underlying dynamics.
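    The following sketch illustrates the general construction under simple assumptions (quantile bins for the amplitude symbol, an argsort-based ordinal pattern, a fixed window length); it follows the spirit of the method described above rather than the authors' exact scheme.

```python
import numpy as np
from collections import Counter

def symbolic_transition_network(x, window=4, n_amp_bins=3):
    """Map a time series to a weighted, directed transition network.

    Each sliding window is labelled by a symbol pair: an amplitude bin of
    its mean value and the ordinal pattern (rank order) of its samples.
    Consecutive windows define directed, weighted edges between symbols.
    """
    x = np.asarray(x, dtype=float)
    # Amplitude symbol: which quantile bin the window mean falls into.
    amp_edges = np.quantile(x, np.linspace(0, 1, n_amp_bins + 1)[1:-1])
    symbols = []
    for start in range(len(x) - window + 1):
        seg = x[start:start + window]
        amp = int(np.searchsorted(amp_edges, seg.mean()))
        pattern = tuple(np.argsort(seg))          # ordinal pattern of the window
        symbols.append((amp, pattern))
    # Directed weighted edges: temporal succession between symbol pairs.
    return Counter(zip(symbols[:-1], symbols[1:]))

rng = np.random.default_rng(3)
x = np.sin(np.linspace(0, 40, 1000)) + 0.2 * rng.standard_normal(1000)
net = symbolic_transition_network(x)
print(f"{len(net)} distinct directed edges; 5 heaviest:")
for (u, v), w in net.most_common(5):
    print(u, "->", v, "weight", w)
```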

  13. Investigation of Time Series Representations and Similarity Measures for Structural Damage Pattern Recognition

    PubMed Central

    Swartz, R. Andrew

    2013-01-01

    This paper investigates the time series representation methods and similarity measures for sensor data feature extraction and structural damage pattern recognition. Both model-based time series representation and dimensionality reduction methods are studied to compare the effectiveness of feature extraction for damage pattern recognition. The evaluation of feature extraction methods is performed by examining the separation of feature vectors among different damage patterns and the pattern recognition success rate. In addition, the impact of similarity measures on the pattern recognition success rate and the metrics for damage localization are also investigated. The test data used in this study are from the System Identification to Monitor Civil Engineering Structures (SIMCES) Z24 Bridge damage detection tests, a rigorous instrumentation campaign that recorded the dynamic performance of a concrete box-girder bridge under progressively increasing damage scenarios. A number of progressive damage test case datasets and damage test data with different damage modalities are used. The simulation results show that both time series representation methods and similarity measures have significant impact on the pattern recognition success rate. PMID:24191136

  14. Crossing trend analysis methodology and application for Turkish rainfall records

    NASA Astrophysics Data System (ADS)

    Şen, Zekâi

    2018-01-01

    Trend analyses are necessary tools for depicting a possible general increase or decrease in a given time series. There are many versions of trend identification methodology, such as the Mann-Kendall trend test, Spearman's tau, Sen's slope, the regression line, and Şen's innovative trend analysis. The literature contains many papers about the use, pros and cons, and comparisons of these methodologies. In this paper, a completely new approach is proposed based on the crossing properties of a time series. It is suggested that the suitable trend through the centroid of the given time series should have the maximum number of crossings (total number of up-crossings or down-crossings). This approach is applicable whether the time series has a dependent or independent structure and does not depend on the type of the probability distribution function. The validity of the method is demonstrated through extensive Monte Carlo simulations and comparison with existing trend identification methodologies. The methodology is applied to a set of annual daily extreme rainfall time series from different parts of Turkey, which have a physically independent structure.
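    A minimal sketch of the crossing idea follows: candidate trend lines through the series centroid are scored by the number of up-crossings left in the residuals, and the slope with the most crossings is kept. The candidate slope grid and synthetic test series are illustrative assumptions, not the paper's procedure.

```python
import numpy as np

def up_crossings(residuals):
    """Number of up-crossings of the zero level by a residual series."""
    sign = residuals > 0
    return int(np.sum(~sign[:-1] & sign[1:]))

def crossing_trend_slope(x, slopes):
    """Pick the trend line through the series centroid whose removal leaves
    the maximum number of up-crossings (sketch of the crossing-based idea)."""
    t = np.arange(len(x), dtype=float)
    t0, x0 = t.mean(), np.mean(x)                 # centroid of the series
    return max(slopes, key=lambda s: up_crossings(x - (x0 + s * (t - t0))))

rng = np.random.default_rng(4)
series = 0.05 * np.arange(300) + rng.standard_normal(300)   # trend + noise
candidates = np.linspace(-0.2, 0.2, 401)
print("estimated slope:", crossing_trend_slope(series, candidates))
```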

  15. Riemannian multi-manifold modeling and clustering in brain networks

    NASA Astrophysics Data System (ADS)

    Slavakis, Konstantinos; Salsabilian, Shiva; Wack, David S.; Muldoon, Sarah F.; Baidoo-Williams, Henry E.; Vettel, Jean M.; Cieslak, Matthew; Grafton, Scott T.

    2017-08-01

    This paper introduces Riemannian multi-manifold modeling in the context of brain-network analytics: brain-network time series yield features which are modeled as points lying in or close to a union of a finite number of submanifolds within a known Riemannian manifold. Distinguishing disparate time series thus amounts to clustering multiple Riemannian submanifolds. To this end, two feature-generation schemes for brain-network time series are put forth. The first is motivated by Granger-causality arguments and uses an auto-regressive moving average model to map low-rank linear vector subspaces, spanned by column vectors of appropriately defined observability matrices, to points on the Grassmann manifold. The second utilizes (non-linear) dependencies among network nodes by introducing kernel-based partial correlations to generate points in the manifold of positive-definite matrices. Based on recently developed research on clustering Riemannian submanifolds, an algorithm is provided for distinguishing time series based on their Riemannian-geometry properties. Numerical tests on time series, synthetically generated from real brain-network structural connectivity matrices, reveal that the proposed scheme outperforms classical and state-of-the-art techniques in clustering brain-network states/structures.

  16. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan F.; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik V.; Marwan, Norbert; Dijkstra, Henk A.; Kurths, Jürgen

    2015-11-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology.

  17. Diagnosing the Dynamics of Observed and Simulated Ecosystem Gross Primary Productivity with Time Causal Information Theory Quantifiers

    DOE PAGES

    Sippel, Sebastian; Lange, Holger; Mahecha, Miguel D.; ...

    2016-10-20

    Data analysis and model-data comparisons in the environmental sciences require diagnostic measures that quantify time series dynamics and structure, and are robust to noise in observational data. This paper investigates the temporal dynamics of environmental time series using measures quantifying their information content and complexity. The measures are used to classify natural processes on one hand, and to compare models with observations on the other. The present analysis focuses on the global carbon cycle as an area of research in which model-data integration and comparisons are key to improving our understanding of natural phenomena. We investigate the dynamics of observed and simulated time series of Gross Primary Productivity (GPP), a key variable in terrestrial ecosystems that quantifies ecosystem carbon uptake. However, the dynamics, patterns and magnitudes of GPP time series, both observed and simulated, vary substantially on different temporal and spatial scales. Here we demonstrate that information content and complexity, or Information Theory Quantifiers (ITQ) for short, serve as robust and efficient data-analytical and model benchmarking tools for evaluating the temporal structure and dynamical properties of simulated or observed time series at various spatial scales. At continental scale, we compare GPP time series simulated with two models and an observations-based product. This analysis reveals qualitative differences between model evaluation based on ITQ compared to traditional model performance metrics, indicating that good model performance in terms of absolute or relative error does not imply that the dynamics of the observations is captured well. Furthermore, we show, using an ensemble of site-scale measurements obtained from the FLUXNET archive in the Mediterranean, that model-data or model-model mismatches as indicated by ITQ can be attributed to and interpreted as differences in the temporal structure of the respective ecological time series. At global scale, our understanding of C fluxes relies on the use of consistently applied land models. Here, we use ITQ to evaluate model structure: The measures are largely insensitive to climatic scenarios, land use and atmospheric gas concentrations used to drive them, but clearly separate the structure of 13 different land models taken from the CMIP5 archive and an observations-based product. In conclusion, diagnostic measures of this kind provide data-analytical tools that distinguish different types of natural processes based solely on their dynamics, and are thus highly suitable for environmental science applications such as model structural diagnostics.
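    One widely used information theory quantifier is the normalised permutation entropy; the sketch below computes it for a regular and a random signal to show the contrast exploited above. It is a generic implementation, not the code used in the study, and the embedding order and delay are arbitrary choices.

```python
import numpy as np
from collections import Counter
from math import factorial, log

def permutation_entropy(x, order=4, delay=1):
    """Normalised permutation entropy of a time series.

    Returns a value in [0, 1]: lower for regular dynamics, close to 1
    for white noise.
    """
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    patterns = Counter(
        tuple(np.argsort(x[i:i + (order - 1) * delay + 1:delay])) for i in range(n)
    )
    p = np.array(list(patterns.values()), dtype=float) / n
    return float(-np.sum(p * np.log(p)) / log(factorial(order)))

rng = np.random.default_rng(5)
t = np.linspace(0, 100, 5000)
print("sine :", round(permutation_entropy(np.sin(t)), 3))                   # low
print("noise:", round(permutation_entropy(rng.standard_normal(5000)), 3))   # near 1
```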

  18. Diagnosing the Dynamics of Observed and Simulated Ecosystem Gross Primary Productivity with Time Causal Information Theory Quantifiers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sippel, Sebastian; Lange, Holger; Mahecha, Miguel D.

    Data analysis and model-data comparisons in the environmental sciences require diagnostic measures that quantify time series dynamics and structure, and are robust to noise in observational data. This paper investigates the temporal dynamics of environmental time series using measures quantifying their information content and complexity. The measures are used to classify natural processes on one hand, and to compare models with observations on the other. The present analysis focuses on the global carbon cycle as an area of research in which model-data integration and comparisons are key to improving our understanding of natural phenomena. We investigate the dynamics of observed and simulated time series of Gross Primary Productivity (GPP), a key variable in terrestrial ecosystems that quantifies ecosystem carbon uptake. However, the dynamics, patterns and magnitudes of GPP time series, both observed and simulated, vary substantially on different temporal and spatial scales. Here we demonstrate that information content and complexity, or Information Theory Quantifiers (ITQ) for short, serve as robust and efficient data-analytical and model benchmarking tools for evaluating the temporal structure and dynamical properties of simulated or observed time series at various spatial scales. At continental scale, we compare GPP time series simulated with two models and an observations-based product. This analysis reveals qualitative differences between model evaluation based on ITQ compared to traditional model performance metrics, indicating that good model performance in terms of absolute or relative error does not imply that the dynamics of the observations is captured well. Furthermore, we show, using an ensemble of site-scale measurements obtained from the FLUXNET archive in the Mediterranean, that model-data or model-model mismatches as indicated by ITQ can be attributed to and interpreted as differences in the temporal structure of the respective ecological time series. At global scale, our understanding of C fluxes relies on the use of consistently applied land models. Here, we use ITQ to evaluate model structure: The measures are largely insensitive to climatic scenarios, land use and atmospheric gas concentrations used to drive them, but clearly separate the structure of 13 different land models taken from the CMIP5 archive and an observations-based product. In conclusion, diagnostic measures of this kind provide data-analytical tools that distinguish different types of natural processes based solely on their dynamics, and are thus highly suitable for environmental science applications such as model structural diagnostics.

  19. Diagnosing the Dynamics of Observed and Simulated Ecosystem Gross Primary Productivity with Time Causal Information Theory Quantifiers

    PubMed Central

    Sippel, Sebastian; Mahecha, Miguel D.; Hauhs, Michael; Bodesheim, Paul; Kaminski, Thomas; Gans, Fabian; Rosso, Osvaldo A.

    2016-01-01

    Data analysis and model-data comparisons in the environmental sciences require diagnostic measures that quantify time series dynamics and structure, and are robust to noise in observational data. This paper investigates the temporal dynamics of environmental time series using measures quantifying their information content and complexity. The measures are used to classify natural processes on one hand, and to compare models with observations on the other. The present analysis focuses on the global carbon cycle as an area of research in which model-data integration and comparisons are key to improving our understanding of natural phenomena. We investigate the dynamics of observed and simulated time series of Gross Primary Productivity (GPP), a key variable in terrestrial ecosystems that quantifies ecosystem carbon uptake. However, the dynamics, patterns and magnitudes of GPP time series, both observed and simulated, vary substantially on different temporal and spatial scales. We demonstrate here that information content and complexity, or Information Theory Quantifiers (ITQ) for short, serve as robust and efficient data-analytical and model benchmarking tools for evaluating the temporal structure and dynamical properties of simulated or observed time series at various spatial scales. At continental scale, we compare GPP time series simulated with two models and an observations-based product. This analysis reveals qualitative differences between model evaluation based on ITQ compared to traditional model performance metrics, indicating that good model performance in terms of absolute or relative error does not imply that the dynamics of the observations is captured well. Furthermore, we show, using an ensemble of site-scale measurements obtained from the FLUXNET archive in the Mediterranean, that model-data or model-model mismatches as indicated by ITQ can be attributed to and interpreted as differences in the temporal structure of the respective ecological time series. At global scale, our understanding of C fluxes relies on the use of consistently applied land models. Here, we use ITQ to evaluate model structure: The measures are largely insensitive to climatic scenarios, land use and atmospheric gas concentrations used to drive them, but clearly separate the structure of 13 different land models taken from the CMIP5 archive and an observations-based product. In conclusion, diagnostic measures of this kind provide data-analytical tools that distinguish different types of natural processes based solely on their dynamics, and are thus highly suitable for environmental science applications such as model structural diagnostics. PMID:27764187

  20. Boolean network inference from time series data incorporating prior biological knowledge.

    PubMed

    Haider, Saad; Pal, Ranadip

    2012-01-01

    Numerous approaches exist for modeling genetic regulatory networks (GRNs), but the low sampling rates often employed in biological studies prevent the inference of detailed models from experimental data. In this paper, we analyze the issues involved in estimating a model of a GRN from single cell line time series data with limited time points. We present an inference approach for a Boolean Network (BN) model of a GRN from limited transcriptomic or proteomic time series data based on prior biological knowledge of connectivity, constraints on attractor structure, and robust design. We applied our inference approach to 6-time-point transcriptomic data on the Human Mammary Epithelial Cell line (HMEC) after application of Epidermal Growth Factor (EGF) and generated a BN with a plausible biological structure satisfying the data. We further defined and applied a similarity measure to compare synthetic BNs and BNs generated through the proposed approach constructed from transitions of various paths of the synthetic BNs. We also compared the performance of our algorithm with two existing BN inference algorithms. Through theoretical analysis and simulations, we showed how rarely a BN with plausible biological structure is arrived at from limited time series data when connectivity is random or structure is absent in the data. When applied to experimental data and to data generated from synthetic BNs, the framework was able to estimate BNs with high similarity scores. Comparison with existing BN inference algorithms showed the better performance of the proposed algorithm for limited time series data. The proposed framework can also be applied to optimize the connectivity of a GRN from experimental data when the prior biological knowledge on regulators is limited or not unique.
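    To make the attractor-structure constraint concrete, the sketch below enumerates the attractors of a small hypothetical three-gene Boolean network by exhaustive simulation. The update rules are invented for illustration and are unrelated to the HMEC network inferred in the paper.

```python
from itertools import product

# Hypothetical 3-gene Boolean network: each gene's next state is a Boolean
# function of the current states (illustrative rules only).
def step(state):
    a, b, c = state
    return (b and not c,      # A is activated by B and repressed by C
            a or c,           # B is activated by A or C
            not a)            # C is repressed by A

def attractors(update):
    """Enumerate attractors (fixed points and cycles) of this 3-node network
    by following the trajectory from every possible initial state."""
    found = set()
    for init in product([False, True], repeat=3):
        seen, state = [], init
        while state not in seen:
            seen.append(state)
            state = update(state)
        cycle = seen[seen.index(state):]          # the recurring part
        k = cycle.index(min(cycle))               # canonical rotation
        found.add(tuple(cycle[k:] + cycle[:k]))
    return found

for att in attractors(step):
    print("attractor:", [tuple(int(v) for v in s) for s in att])
```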

  1. Highly comparative time-series analysis: the empirical structure of time series and their methods.

    PubMed

    Fulcher, Ben D; Little, Max A; Jones, Nick S

    2013-06-06

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.

  2. Highly comparative time-series analysis: the empirical structure of time series and their methods

    PubMed Central

    Fulcher, Ben D.; Little, Max A.; Jones, Nick S.

    2013-01-01

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines. PMID:23554344

  3. Time lagged ordinal partition networks for capturing dynamics of continuous dynamical systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCullough, Michael; Iu, Herbert Ho-Ching; Small, Michael

    2015-05-15

    We investigate a generalised version of the recently proposed ordinal partition time series to network transformation algorithm. First, we introduce a fixed time lag for the elements of each partition that is selected using techniques from traditional time delay embedding. The resulting partitions define regions in the embedding phase space that are mapped to nodes in the network space. Edges are allocated between nodes based on temporal succession thus creating a Markov chain representation of the time series. We then apply this new transformation algorithm to time series generated by the Rössler system and find that periodic dynamics translate to ring structures whereas chaotic time series translate to band or tube-like structures—thereby indicating that our algorithm generates networks whose structure is sensitive to system dynamics. Furthermore, we demonstrate that simple network measures including the mean out degree and variance of out degrees can track changes in the dynamical behaviour in a manner comparable to the largest Lyapunov exponent. We also apply the same analysis to experimental time series generated by a diode resonator circuit and show that the network size, mean shortest path length, and network diameter are highly sensitive to the interior crisis captured in this particular data set.

  4. Application of computational mechanics to the analysis of natural data: an example in geomagnetism.

    PubMed

    Clarke, Richard W; Freeman, Mervyn P; Watkins, Nicholas W

    2003-01-01

    We discuss how the ideal formalism of computational mechanics can be adapted to apply to a noninfinite series of corrupted and correlated data, as is typical of most observed natural time series. Specifically, a simple filter that removes the corruption that creates rare unphysical causal states is demonstrated, and the concept of effective soficity is introduced. We believe that computational mechanics cannot be applied to a noisy and finite data series without invoking an argument based upon effective soficity. A related distinction between noise and unresolved structure is also defined: noise can only be eliminated by increasing the length of the time series, whereas the resolution of previously unresolved structure only requires the finite memory of the analysis to be increased. The benefits of these concepts are demonstrated in a simulated time series by (a) the effective elimination of white noise corruption from a periodic signal using the expletive filter and (b) the appearance of an effectively sofic region in the statistical complexity of a biased Poisson switch time series that is insensitive to changes in the word length (memory) used in the analysis. The new algorithm is then applied to an analysis of a real geomagnetic time series measured at Halley, Antarctica. Two principal components in the structure are detected, interpreted as the diurnal variation due to the rotation of the Earth-based station under an electrical current pattern that is fixed with respect to the Sun-Earth axis, and the random occurrence of a signature likely to be that of the magnetic substorm. In conclusion, some useful terminology for the discussion of model construction in general is introduced.

  5. Graphical Data Analysis on the Circle: Wrap-Around Time Series Plots for (Interrupted) Time Series Designs.

    PubMed

    Rodgers, Joseph Lee; Beasley, William Howard; Schuelke, Matthew

    2014-01-01

    Many data structures, particularly time series data, are naturally seasonal, cyclical, or otherwise circular. Past graphical methods for time series have focused on linear plots. In this article, we move graphical analysis onto the circle. We focus on two particular methods, one old and one new. Rose diagrams are circular histograms and can be produced in several different forms using the RRose software system. In addition, we propose, develop, illustrate, and provide software support for a new circular graphical method, called Wrap-Around Time Series Plots (WATS Plots), which is useful for supporting time series analyses in general and interrupted time series designs in particular. We illustrate the use of WATS Plots with an interrupted time series design evaluating the effect of the Oklahoma City bombing on birthrates in Oklahoma County during the 10 years surrounding the bombing of the Murrah Building in Oklahoma City. We compare WATS Plots with linear time series representations and overlay them with smoothing and error bands. Each method is shown to have advantages in relation to the other; in our example, the WATS Plots more clearly show the existence and effect size of the fertility differential.
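    A minimal sketch of the wrap-around idea, assuming a synthetic monthly series: the month of the year is mapped to an angle and the measured value to the radius, so seasonal structure aligns across years. This uses plain matplotlib and is not the RRose or WATS software described above.

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic 10-year monthly series with a seasonal cycle plus noise.
rng = np.random.default_rng(6)
months = np.arange(120)
values = 100 + 10 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 3, 120)

theta = 2 * np.pi * (months % 12) / 12       # month of year -> angle on the circle
fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(theta, values, alpha=0.7)            # successive years wrap around
ax.set_xticks(2 * np.pi * np.arange(12) / 12)
ax.set_xticklabels(["J", "F", "M", "A", "M", "J", "J", "A", "S", "O", "N", "D"])
ax.set_title("Wrap-around (circular) time series plot")
plt.show()
```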

  6. Spatio-temporal Event Classification using Time-series Kernel based Structured Sparsity

    PubMed Central

    Jeni, László A.; Lőrincz, András; Szabó, Zoltán; Cohn, Jeffrey F.; Kanade, Takeo

    2016-01-01

    In many behavioral domains, such as facial expression and gesture, sparse structure is prevalent. This sparsity would be well suited for event detection but for one problem. Features typically are confounded by alignment error in space and time. As a consequence, high-dimensional representations such as SIFT and Gabor features have been favored despite their much greater computational cost and potential loss of information. We propose a Kernel Structured Sparsity (KSS) method that can handle both the temporal alignment problem and the structured sparse reconstruction within a common framework, and it can rely on simple features. We characterize spatio-temporal events as time-series of motion patterns and by utilizing time-series kernels we apply standard structured-sparse coding techniques to tackle this important problem. We evaluated the KSS method using both gesture and facial expression datasets that include spontaneous behavior and differ in degree of difficulty and type of ground truth coding. KSS outperformed both sparse and non-sparse methods that utilize complex image features and their temporal extensions. In the case of early facial event classification KSS had 10% higher accuracy as measured by F1 score over kernel SVM methods. PMID:27830214

  7. Analysis of HD 73045 light curve data

    NASA Astrophysics Data System (ADS)

    Das, Mrinal Kanti; Bhatraju, Naveen Kumar; Joshi, Santosh

    2018-04-01

    In this work we analyzed the Kepler light curve data of HD 73045. The raw data were smoothed using standard filters. The power spectrum was obtained using a fast Fourier transform routine and shows the presence of more than one period. To take account of any non-stationary behavior, we carried out a wavelet analysis to obtain the wavelet power spectrum. In addition, to identify scale-invariant structure, the data were analyzed using multifractal detrended fluctuation analysis. Further, to characterize the diversity of embedded patterns in the HD 73045 flux time series, we computed various entropy-based complexity measures, e.g. sample entropy, spectral entropy and permutation entropy. The presence of periodic structure in the time series was further analyzed using the visibility network and horizontal visibility network models of the time series. The degree distributions of the two network models confirm such structures.
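    A generic sketch of the first step, assuming an evenly sampled light curve: an FFT-based power spectrum whose strongest peaks indicate the dominant periods. The synthetic two-frequency signal stands in for the Kepler data, which are not reproduced here.

```python
import numpy as np

# Minimal power-spectrum estimate for an evenly sampled light curve.
rng = np.random.default_rng(7)
dt = 0.02                                    # sampling interval (arbitrary units)
t = np.arange(0, 200, dt)
flux = (np.sin(2 * np.pi * 0.5 * t)          # two hypothetical periodicities
        + 0.5 * np.sin(2 * np.pi * 1.3 * t)
        + 0.3 * rng.standard_normal(t.size))

flux = flux - flux.mean()                    # remove the DC component
freqs = np.fft.rfftfreq(flux.size, d=dt)
power = np.abs(np.fft.rfft(flux)) ** 2

# The strongest peaks indicate the dominant frequencies in the series.
top = np.argsort(power)[-2:]
print("dominant frequencies:", np.round(freqs[top], 3))
```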

  8. 76 FR 67149 - Atlantic Highly Migratory Species; Atlantic Shark Management Measures; 2012 Research Fishery

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-31

    ... shark research fishery to maintain time series data for stock assessments and to meet NMFS' [[Page 67150... tagging programs for identification of migration corridors and stock structure; Maintain time-series of.... DATES: Shark Research Fishery Applications must be received no later than 5 p.m., local time, on...

  9. 78 FR 70018 - Atlantic Highly Migratory Species; Atlantic Shark Management Measures; 2014 Research Fishery

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-22

    ...) established, among other things, a shark research fishery to maintain time series data for stock assessments... stock structure using dart and/or spaghetti tags; Maintain time-series of abundance from previously...: Shark Research Fishery Applications must be received no later than 5 p.m., local time, on December 23...

  10. 75 FR 57259 - Atlantic Highly Migratory Species; Atlantic Shark Management Measures; 2011 Research Fishery

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-20

    ... fishery to maintain time series data for stock assessments and to meet NMFS' research objectives. The... for identification of migration corridors and stock structure; Maintain time-series of abundance from... considered. DATES: Shark Research Fishery Applications must be received no later than 5 p.m., local time, on...

  11. Three-dimensional liver motion tracking using real-time two-dimensional MRI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brix, Lau, E-mail: lau.brix@stab.rm.dk; Ringgaard, Steffen; Sørensen, Thomas Sangild

    2014-04-15

    Purpose: Combined magnetic resonance imaging (MRI) systems and linear accelerators for radiotherapy (MR-Linacs) are currently under development. MRI is noninvasive and nonionizing and can produce images with high soft tissue contrast. However, new tracking methods are required to obtain fast real-time spatial target localization. This study develops and evaluates a method for tracking three-dimensional (3D) respiratory liver motion in two-dimensional (2D) real-time MRI image series with high temporal and spatial resolution. Methods: The proposed method for 3D tracking in 2D real-time MRI series has three steps: (1) Recording of a 3D MRI scan and selection of a blood vessel (or tumor) structure to be tracked in subsequent 2D MRI series. (2) Generation of a library of 2D image templates oriented parallel to the 2D MRI image series by reslicing and resampling the 3D MRI scan. (3) 3D tracking of the selected structure in each real-time 2D image by finding the template and template position that yield the highest normalized cross correlation coefficient with the image. Since the tracked structure has a known 3D position relative to each template, the selection and 2D localization of a specific template translates into quantification of both the through-plane and in-plane position of the structure. As a proof of principle, 3D tracking of liver blood vessel structures was performed in five healthy volunteers in two 5.4 Hz axial, sagittal, and coronal real-time 2D MRI series of 30 s duration. In each 2D MRI series, the 3D localization was carried out twice, using nonoverlapping template libraries, which resulted in a total of 12 estimated 3D trajectories per volunteer. Validation tests carried out to support the tracking algorithm included quantification of the breathing induced 3D liver motion and liver motion directionality for the volunteers, and comparison of 2D MRI estimated positions of a structure in a watermelon with the actual positions. Results: Axial, sagittal, and coronal 2D MRI series yielded 3D respiratory motion curves for all volunteers. The motion directionality and amplitude were very similar when measured directly as in-plane motion or estimated indirectly as through-plane motion. The mean peak-to-peak breathing amplitude was 1.6 mm (left-right), 11.0 mm (craniocaudal), and 2.5 mm (anterior-posterior). The position of the watermelon structure was estimated in 2D MRI images with a root-mean-square error of 0.52 mm (in-plane) and 0.87 mm (through-plane). Conclusions: A method for 3D tracking in 2D MRI series was developed and demonstrated for liver tracking in volunteers. The method would allow real-time 3D localization with integrated MR-Linac systems.
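    Step (3) relies on normalized cross-correlation between each template and the incoming 2D image. The sketch below shows a direct (unoptimised) implementation on a synthetic frame with a bright blob standing in for the tracked vessel structure; it illustrates the matching criterion only, not the authors' real-time implementation.

```python
import numpy as np

def normalized_cross_correlation(image, template):
    """Normalized cross-correlation map of a template over a 2D image
    (direct sliding-window implementation for illustration)."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    out = np.full((image.shape[0] - th + 1, image.shape[1] - tw + 1), -1.0)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = image[i:i + th, j:j + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            if denom > 0:
                out[i, j] = (p * t).sum() / denom
    return out

# Hypothetical example: locate a bright blob ("vessel structure") in a frame.
rng = np.random.default_rng(8)
frame = rng.normal(0, 1, (64, 64))
frame[30:38, 20:28] += 5.0                   # the structure to be tracked
template = frame[30:38, 20:28].copy()

ncc = normalized_cross_correlation(frame, template)
print("best match at", np.unravel_index(ncc.argmax(), ncc.shape))   # ~(30, 20)
```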

  12. Damage classification and estimation in experimental structures using time series analysis and pattern recognition

    NASA Astrophysics Data System (ADS)

    de Lautour, Oliver R.; Omenzetter, Piotr

    2010-07-01

    Developed for studying long sequences of regularly sampled data, time series analysis methods are being increasingly investigated for use in Structural Health Monitoring (SHM). In this research, Autoregressive (AR) models were used to fit the acceleration time histories obtained from two experimental structures, a 3-storey bookshelf structure and the ASCE Phase II Experimental SHM Benchmark Structure, in the undamaged state and in a limited number of damaged states. The coefficients of the AR models were considered to be damage-sensitive features and used as input to an Artificial Neural Network (ANN). The ANN was trained to classify damage cases or estimate the remaining structural stiffness. The results showed that the combination of AR models and ANNs is an efficient tool for damage classification and estimation, and performs well using a small number of damage-sensitive features and a limited number of sensors.
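    A minimal sketch of the feature-extraction step, assuming least-squares estimation of the AR coefficients: two synthetic responses with slightly different dominant frequencies yield visibly different coefficient vectors, which is what a downstream classifier (an ANN in the paper) can exploit. The signals and model order are illustrative choices, not the experimental data.

```python
import numpy as np

def ar_coefficients(x, order=4):
    """Least-squares AR(p) coefficients of a time series; the coefficient
    vector serves as a damage-sensitive feature for a classifier."""
    x = np.asarray(x, dtype=float)
    x = (x - x.mean()) / x.std()
    # Design matrix of lagged values: x[t] ~ sum_k a_k * x[t-k]
    X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

# Hypothetical "undamaged" vs "damaged" responses: a shift in dominant
# frequency changes the AR coefficients.
rng = np.random.default_rng(9)
t = np.arange(2000) * 0.01
undamaged = np.sin(2 * np.pi * 3.0 * t) + 0.3 * rng.standard_normal(t.size)
damaged = np.sin(2 * np.pi * 2.6 * t) + 0.3 * rng.standard_normal(t.size)
print("undamaged AR:", np.round(ar_coefficients(undamaged), 3))
print("damaged   AR:", np.round(ar_coefficients(damaged), 3))
```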

  13. The Gaussian Graphical Model in Cross-Sectional and Time-Series Data.

    PubMed

    Epskamp, Sacha; Waldorp, Lourens J; Mõttus, René; Borsboom, Denny

    2018-04-16

    We discuss the Gaussian graphical model (GGM; an undirected network of partial correlation coefficients) and detail its utility as an exploratory data analysis tool. The GGM shows which variables predict one another, allows for sparse modeling of covariance structures, and may highlight potential causal relationships between observed variables. We describe its utility in three kinds of psychological data sets: data sets in which consecutive cases are assumed independent (e.g., cross-sectional data), temporally ordered data sets (e.g., n = 1 time series), and a mixture of the two (e.g., n > 1 time series). In time-series analysis, the GGM can be used to model the residual structure of a vector-autoregression analysis (VAR), also termed graphical VAR. Two network models can then be obtained: a temporal network and a contemporaneous network. When analyzing data from multiple subjects, a GGM can also be formed on the covariance structure of stationary means: the between-subjects network. We discuss the interpretation of these models and propose estimation methods to obtain these networks, which we implement in the R packages graphicalVAR and mlVAR. The methods are showcased in two empirical examples, and simulation studies on these methods are included in the supplementary materials.
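    The undirected edge weights of a GGM are partial correlations, which in the unregularised case can be read off the standardised inverse covariance matrix; the sketch below shows this on a hypothetical three-variable chain. The R packages named above add regularisation and the temporal (VAR) part, which are not reproduced here.

```python
import numpy as np

def partial_correlations(data):
    """Gaussian graphical model edge weights: partial correlations obtained
    by standardising the inverse covariance (precision) matrix."""
    cov = np.cov(data, rowvar=False)
    prec = np.linalg.inv(cov)
    d = np.sqrt(np.diag(prec))
    pcor = -prec / np.outer(d, d)
    np.fill_diagonal(pcor, 0.0)              # no self-edges in the network
    return pcor

# Hypothetical x1 -> x2 -> x3 chain: x1 and x3 are marginally correlated, but
# their partial correlation given x2 is near zero, so no direct edge appears.
rng = np.random.default_rng(10)
n = 5000
x1 = rng.standard_normal(n)
x2 = 0.8 * x1 + rng.standard_normal(n)
x3 = 0.8 * x2 + rng.standard_normal(n)
print(np.round(partial_correlations(np.column_stack([x1, x2, x3])), 2))
```

A sparse GGM as used in practice would additionally shrink small entries to exactly zero (e.g. via a graphical lasso), which is the regularisation step omitted in this sketch.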

  14. Reconstruction of network topology using status-time-series data

    NASA Astrophysics Data System (ADS)

    Pandey, Pradumn Kumar; Badarla, Venkataramana

    2018-01-01

    Uncovering the heterogeneous connection pattern of a networked system from the available status-time-series (STS) data of a dynamical process on the network is of great interest in network science and is known as a reverse engineering problem. Dynamical processes on a network are affected by the structure of the network, and this dependency between the diffusion dynamics and the network structure can be utilized to retrieve the connection pattern from the diffusion data. Information about the network structure can in turn help to devise control of the dynamics on the network. In this paper, we consider the problem of network reconstruction from the available status-time-series (STS) data using matrix analysis. The proposed method of network reconstruction from STS data is tested successfully under susceptible-infected-susceptible (SIS) diffusion dynamics on real-world and computer-generated benchmark networks. The high accuracy and efficiency of the proposed reconstruction procedure define the novelty of the method, and it outperforms a compressed sensing theory (CST) based method of network reconstruction using STS data. Further, the same reconstruction procedure is applied to weighted networks, where the ordering of the edges is identified with high accuracy.

  15. Lead-lag cross-sectional structure and detection of correlated anticorrelated regime shifts: Application to the volatilities of inflation and economic growth rates

    NASA Astrophysics Data System (ADS)

    Zhou, Wei-Xing; Sornette, Didier

    2007-07-01

    We have recently introduced the “thermal optimal path” (TOP) method to investigate the real-time lead-lag structure between two time series. The TOP method consists in searching for a robust noise-averaged optimal path of the distance matrix along which the two time series have the greatest similarity. Here, we generalize the TOP method by introducing a more general definition of distance which takes into account possible regime shifts between positive and negative correlations. This generalization to track possible changes of correlation signs is able to identify possible transitions from one convention (or consensus) to another. Numerical simulations on synthetic time series verify that the new TOP method performs as expected even in the presence of substantial noise. We then apply it to investigate changes of convention in the dependence structure between the historical volatilities of the USA inflation rate and economic growth rate. Several measures show that the new TOP method significantly outperforms standard cross-correlation methods.

  16. Acoustic emission linear pulse holography

    DOEpatents

    Collins, H. Dale; Busse, Lawrence J.; Lemon, Douglas K.

    1985-01-01

    Defects in a structure are imaged as they propagate, using their emitted acoustic energy as a monitored source. Short bursts of acoustic energy propagate through the structure to a discrete element receiver array. A reference timing transducer located between the array and the inspection zone initiates a series of time-of-flight measurements. A resulting series of time-of-flight measurements are then treated as aperture data and are transferred to a computer for reconstruction of a synthetic linear holographic image. The images can be displayed and stored as a record of defect growth.

  17. Self-organising mixture autoregressive model for non-stationary time series modelling.

    PubMed

    Ni, He; Yin, Hujun

    2008-12-01

    Modelling non-stationary time series has been a difficult task for both parametric and nonparametric methods. One promising solution is to combine the flexibility of nonparametric models with the simplicity of parametric models. In this paper, the self-organising mixture autoregressive (SOMAR) network is adopted as such a mixture model. It breaks time series into underlying segments and at the same time fits local linear regressive models to the clusters of segments. In such a way, a global non-stationary time series is represented by a dynamic set of local linear regressive models. Neural gas is used for a more flexible structure of the mixture model. Furthermore, a new similarity measure has been introduced in the self-organising network to better quantify the similarity of time series segments. The network can be used naturally in modelling and forecasting non-stationary time series. Experiments on artificial, benchmark time series (e.g. Mackey-Glass) and real-world data (e.g. numbers of sunspots and Forex rates) are presented and the results show that the proposed SOMAR network is effective and superior to other similar approaches.

  18. Homogenising time series: beliefs, dogmas and facts

    NASA Astrophysics Data System (ADS)

    Domonkos, P.

    2011-06-01

    In recent decades various homogenisation methods have been developed, but the real effects of their application on time series are still not sufficiently known. The ongoing COST action HOME (COST ES0601) is devoted to revealing the real impacts of homogenisation methods in more detail and with higher confidence than before. As a part of the COST activity, a benchmark dataset was built whose characteristics closely approach those of real networks of observed time series. This dataset offers a much better opportunity than ever before to test the wide variety of homogenisation methods and to analyse the real effects of selected theoretical recommendations. Empirical results show that real observed time series usually include several inhomogeneities of different sizes. Small inhomogeneities often have statistical characteristics similar to natural changes caused by climatic variability, so the pure application of the classic theory that change-points of observed time series can be found and corrected one by one is impossible. However, after homogenisation the linear trends, seasonal changes and long-term fluctuations of time series are usually much closer to reality than in the raw time series. Some problems around detecting multiple structures of inhomogeneities, as well as that of time series comparisons within homogenisation procedures, are discussed briefly in the study.

  19. Patterns of time series of numbers of emergency hospitalizations in mental hospitals in Moscow and Kazan (common features and differences)

    NASA Astrophysics Data System (ADS)

    Aptikaeva, O. I.; Gamburtsev, A. G.; Martyushov, A. N.

    2012-12-01

    We have investigated the numbers of emergency hospitalizations in mental and drug-treatment hospitals in Kazan in 1996-2006 and in Moscow in 1984-1996. Samples have been analyzed by disease type, sex, age, and place of residence (city or village). This study aims to discover differences and common traits in various structures of series of hospitalizations in these samples and their possible relationships with the changing parameters of the environment. We have found similar structures of series of samples of the same type both in Moscow and in Kazan. In some cases, cyclic structures of series of numbers of hospitalizations and series of changes in solar activity and the rate of rotation of the earth change simultaneously.

  20. A comparison between MS-VECM and MS-VECMX on economic time series data

    NASA Astrophysics Data System (ADS)

    Phoong, Seuk-Wai; Ismail, Mohd Tahir; Sek, Siok-Kun

    2014-07-01

    Multivariate Markov switching models are able to provide useful information for the study of structural change in data, since a regime switching model can analyse time-varying data and capture the mean and variance of the dependence structure in the series. This paper investigates the effects of the oil price and the gold price on the stock market returns of Malaysia, Singapore, Thailand and Indonesia. Two forms of multivariate Markov switching models are used, namely the mean adjusted heteroskedasticity Markov Switching Vector Error Correction Model (MSMH-VECM) and the mean adjusted heteroskedasticity Markov Switching Vector Error Correction Model with an exogenous variable (MSMH-VECMX). These two models are used to capture the transition probabilities of the data, since real financial time series data often exhibit nonlinear properties such as regime switching, cointegrating relations, and jumps or breaks over time. A comparison between the two models indicates that the MSMH-VECM model is able to fit the time series data better than the MSMH-VECMX model. In addition, it was found that the oil price and the gold price affected stock market changes in the four selected countries.

  1. Modified DTW for a quantitative estimation of the similarity between rainfall time series

    NASA Astrophysics Data System (ADS)

    Djallel Dilmi, Mohamed; Barthès, Laurent; Mallet, Cécile; Chazottes, Aymeric

    2017-04-01

    Precipitation is due to complex meteorological phenomena and can be described as an intermittent process. The spatial and temporal variability of this phenomenon is significant and covers large scales. To analyse and model this variability and/or structure, several studies use a network of rain gauges providing several time series of precipitation measurements. To compare these different time series, the authors compute for each time series a set of parameters (PDF, rain peak intensity, occurrence, amount, duration, intensity, ...). However, despite the calculation of these parameters, the comparison between two series of measurements remains qualitative. Due to advection processes, when different sensors of an observation network measure precipitation time series that are identical in terms of intermittency or intensity, there is a time lag between the measured series. Analysing and extracting relevant information on physical phenomena from these precipitation time series implies the development of automatic analytical methods capable of comparing two time series of precipitation measured by different sensors or at two different locations, and thus of quantifying their difference or similarity. The limits of the Euclidean distance for measuring the similarity between precipitation time series have been well demonstrated and explained: the Euclidean distance is very sensitive to phase shifts, so between two identical but slightly shifted time series this distance is not negligible. To quantify and analyse these time lags, correlation functions are well established, normalised and commonly used to measure the spatial dependences required by many applications. However, one generally observes a considerable scatter of the inter-rain-gauge correlation coefficients obtained from individual pairs of rain gauges, and because of this substantial dispersion of estimated time lags the interpretation of the inter-correlation is not straightforward. We propose here to use an improvement of the Euclidean distance which integrates the global complexity of the rainfall series. Dynamic Time Warping (DTW), used in speech recognition, allows matching of two time series that are locally shifted in time and provides the most probable time lag. However, the original formulation of DTW suffers from some limitations; in particular, it is not well suited to rain intermittency. In this study we present an adaptation of DTW for the analysis of rainfall time series: we used time series from the "Météo France" rain gauge network observed between January 1st, 2007 and December 31st, 2015 at 25 stations located in the Île de France area. We then analyse the results (e.g. the distance, and the relationship between the time lag detected by our method and other measured parameters such as wind speed and direction) to show the ability of the proposed similarity measure to provide useful information on the rain structure. The possibility of using this measure of similarity to define a quality indicator for a sensor integrated into an observation network is also envisaged.
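    For reference, a plain (unmodified) DTW distance can be written in a few lines; the sketch below compares it with the Euclidean distance on two identical but time-shifted synthetic rain pulses, illustrating the phase-shift sensitivity discussed above. The paper's modification for rain intermittency is not included.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D series
    (generic sketch; the paper modifies DTW for rain intermittency)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

# Two identical rain-like pulses, one shifted in time: the Euclidean distance
# is large, but DTW aligns the shifted peaks and stays small.
t = np.arange(200)
r1 = np.exp(-0.5 * ((t - 80) / 10.0) ** 2)
r2 = np.exp(-0.5 * ((t - 100) / 10.0) ** 2)      # same event, 20 steps later
print("euclidean:", round(float(np.linalg.norm(r1 - r2)), 3))
print("dtw      :", round(float(dtw_distance(r1, r2)), 3))
```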

  2. Moving Average Models with Bivariate Exponential and Geometric Distributions.

    DTIC Science & Technology

    1985-03-01

    ordinary time series and of point processes. Developments in Statistics, Vol. 1, P.R. Krishnaiah , ed. Academic Press, New York. [9] Esary, J.D. and...valued and discrete - valued time series with ARMA correlation structure. Multivariate Analysis V, P.R. Krishnaiah , ed. North-Holland. 151-166. [28

  3. 77 FR 69508 - Inservice Inspection of Prestressed Concrete Containment Structures With Grouted Tendons

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-19

    ... real-time, multiple-strategy approach (i.e., appropriate grout design and installation, installed... is available in ADAMS) is provided the first time that a document is referenced. Revision 2 of... ``Regulatory Guide'' series. This series was developed to describe and make available to the public information...

  4. MULTIVARIATE STATISTICAL MODELS FOR EFFECTS OF PM AND COPOLLUTANTS IN A DAILY TIME SERIES EPIDEMIOLOGY STUDY

    EPA Science Inventory

    Most analyses of daily time series epidemiology data relate mortality or morbidity counts to PM and other air pollutants by means of single-outcome regression models using multiple predictors, without taking into account the complex statistical structure of the predictor variable...

  5. Application of cross-sectional time series modeling for the prediction of energy expenditure from heart rate and accelerometry

    USDA-ARS?s Scientific Manuscript database

    Accurate estimation of energy expenditure (EE) in children and adolescents is required for a better understanding of physiological, behavioral, and environmental factors affecting energy balance. Cross-sectional time series (CSTS) models, which account for correlation structure of repeated observati...

  6. The parametric modified limited penetrable visibility graph for constructing complex networks from time series

    NASA Astrophysics Data System (ADS)

    Li, Xiuming; Sun, Mei; Gao, Cuixia; Han, Dun; Wang, Minggang

    2018-02-01

    This paper presents the parametric modified limited penetrable visibility graph (PMLPVG) algorithm for constructing complex networks from time series. We modify the penetrable visibility criterion of the limited penetrable visibility graph (LPVG) in order to improve the rationality of the original penetrable visibility and preserve the dynamic characteristics of the time series. The addition of a view angle provides a new approach to characterize the dynamic structure of the time series that is invisible to the previous algorithm. The reliability of the PMLPVG algorithm is verified by applying it to three types of artificial data as well as actual natural gas price data from different regions. The empirical results indicate that the PMLPVG algorithm can distinguish the different time series from each other. Meanwhile, the analysis results of natural gas price data using PMLPVG are consistent with the detrended fluctuation analysis (DFA). The results imply that the PMLPVG algorithm may be a reasonable and significant tool for identifying various time series in different fields.
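
    The PMLPVG criterion itself is specific to the paper, but the underlying visibility mapping is easy to sketch. The code below builds the basic natural visibility graph of a series with NetworkX (the Lacasa et al. criterion); the limited-penetrable and view-angle modifications described above would relax this visibility test. The input series is hypothetical.

      import numpy as np
      import networkx as nx

      def natural_visibility_graph(y):
          # Map a 1-D series to its natural visibility graph: points a and b are linked
          # if every intermediate point lies strictly below the straight line joining them.
          n = len(y)
          g = nx.Graph()
          g.add_nodes_from(range(n))
          for a in range(n):
              for b in range(a + 1, n):
                  visible = all(
                      y[c] < y[a] + (y[b] - y[a]) * (c - a) / (b - a)
                      for c in range(a + 1, b)
                  )
                  if visible:
                      g.add_edge(a, b)
          return g

      series = np.random.default_rng(1).random(200)   # hypothetical price-like series
      g = natural_visibility_graph(series)
      degrees = np.array([d for _, d in g.degree()])
      print("mean degree:", degrees.mean(), "max degree:", degrees.max())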

  7. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik; Marwan, Norbert; Dijkstra, Henk; Kurths, Jürgen

    2016-04-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology. pyunicorn is available online at https://github.com/pik-copan/pyunicorn. Reference: J.F. Donges, J. Heitzig, B. Beronov, M. Wiedermann, J. Runge, Q.-Y. Feng, L. Tupikina, V. Stolbova, R.V. Donner, N. Marwan, H.A. Dijkstra, and J. Kurths, Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package, Chaos 25, 113101 (2015), DOI: 10.1063/1.4934554, Preprint: arxiv.org:1507.01571 [physics.data-an].

  8. Patterns of variations in large pelagic fish: A comparative approach between the Indian and the Atlantic Oceans

    NASA Astrophysics Data System (ADS)

    Corbineau, A.; Rouyer, T.; Fromentin, J.-M.; Cazelles, B.; Fonteneau, A.; Ménard, F.

    2010-07-01

    Catch data of large pelagic fish such as tuna, swordfish and billfish are highly variable ranging from short to long term. Based on fisheries data, these time series are noisy and reflect mixed information on exploitation (targeting, strategy, fishing power), population dynamics (recruitment, growth, mortality, migration, etc.), and environmental forcing (local conditions or dominant climate patterns). In this work, we investigated patterns of variation of large pelagic fish (i.e. yellowfin tuna, bigeye tuna, swordfish and blue marlin) in Japanese longliners catch data from 1960 to 2004. We performed wavelet analyses on the yearly time series of each fish species in each biogeographic province of the tropical Indian and Atlantic Oceans. In addition, we carried out cross-wavelet analyses between these biological time series and a large-scale climatic index, i.e. the Southern Oscillation Index (SOI). Results showed that the biogeographic province was the most important factor structuring the patterns of variability of Japanese catch time series. Relationships between the SOI and the fish catches in the Indian and Atlantic Oceans also pointed out the role of climatic variability for structuring patterns of variation of catch time series. This work finally confirmed that Japanese longline CPUE data poorly reflect the underlying population dynamics of tunas.

  9. Novel Covariance-Based Neutrality Test of Time-Series Data Reveals Asymmetries in Ecological and Economic Systems

    PubMed Central

    Burby, Joshua W.; Lacker, Daniel

    2016-01-01

    Systems as diverse as the interacting species in a community, alleles at a genetic locus, and companies in a market are characterized by competition (over resources, space, capital, etc.) and adaptation. Neutral theory, built around the hypothesis that individual performance is independent of group membership, has found utility across the disciplines of ecology, population genetics, and economics, both because of the success of the neutral hypothesis in predicting system properties and because deviations from these predictions provide information about the underlying dynamics. However, most tests of neutrality are weak, based on static system properties such as species-abundance distributions or the number of singletons in a sample. Time-series data provide a window onto a system’s dynamics, and should furnish tests of the neutral hypothesis that are more powerful for detecting deviations from neutrality and more informative about the type of competitive asymmetry that drives the deviation. Here, we present a neutrality test for time-series data. We apply this test to several microbial time series and financial time series and find that most of these systems are not neutral. Our test isolates the covariance structure of neutral competition, thus facilitating further exploration of the nature of asymmetry in the covariance structure of competitive systems. Much as neutrality tests from population genetics based on relative abundance distributions have enabled researchers to scan entire genomes for genes under selection, we anticipate our time-series test will be useful for quick significance tests of neutrality across a range of ecological, economic, and sociological systems for which time-series data are available. Future work can use our test to categorize and compare the dynamic fingerprints of particular competitive asymmetries (frequency dependence, volatility smiles, etc.) to improve forecasting and management of complex adaptive systems. PMID:27689714

  10. Hierarchical organization of brain functional networks during visual tasks.

    PubMed

    Zhuo, Zhao; Cai, Shi-Min; Fu, Zhong-Qian; Zhang, Jie

    2011-09-01

    The functional network of the brain is known to demonstrate modular structure over different hierarchical scales. In this paper, we systematically investigated the hierarchical modular organizations of the brain functional networks that are derived from the extent of phase synchronization among high-resolution EEG time series during a visual task. In particular, we compare the modular structure of the functional network from EEG channels with that of the anatomical parcellation of the brain cortex. Our results show that the modular architectures of brain functional networks correspond well to those from the anatomical structures over different levels of hierarchy. Most importantly, we find that the consistency between the modular structures of the functional network and the anatomical network becomes more pronounced in terms of vision, sensory, vision-temporal, motor cortices during the visual task, which implies that the strong modularity in these areas forms the functional basis for the visual task. The structure-function relationship further reveals that the phase synchronization of EEG time series in the same anatomical group is much stronger than that of EEG time series from different anatomical groups during the task and that the hierarchical organization of functional brain network may be a consequence of functional segmentation of the brain cortex.
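
    A common way to quantify the phase synchronization from which such functional networks are built is the phase-locking value (PLV) between two channels, computed from the instantaneous phase of the analytic signal. The sketch below, on two hypothetical EEG-like channels, uses SciPy's Hilbert transform; the paper's exact synchronization estimator may differ, so this is only an illustrative baseline.

      import numpy as np
      from scipy.signal import hilbert

      def phase_locking_value(x, y):
          # PLV between two equally sampled signals: |<exp(i*(phi_x - phi_y))>|.
          phi_x = np.angle(hilbert(x))
          phi_y = np.angle(hilbert(y))
          return np.abs(np.mean(np.exp(1j * (phi_x - phi_y))))

      # Two hypothetical "channels": a common 10 Hz rhythm plus independent noise.
      t = np.arange(0, 4, 1 / 250.0)        # 4 s sampled at 250 Hz
      rng = np.random.default_rng(2)
      ch1 = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
      ch2 = np.sin(2 * np.pi * 10 * t + 0.3) + 0.5 * rng.standard_normal(t.size)
      print("PLV:", phase_locking_value(ch1, ch2))   # close to 1 for strong phase locking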

  11. How does the host population's network structure affect the estimation accuracy of epidemic parameters?

    NASA Astrophysics Data System (ADS)

    Yashima, Kenta; Ito, Kana; Nakamura, Kazuyuki

    2013-03-01

    When an infectious disease prevails throughout a population, epidemic parameters such as the basic reproduction ratio and the initial point of infection are estimated from the time series data of the infected population. However, it is unclear how the structure of the host population affects this estimation accuracy. In other words, for what kind of city is it difficult to estimate the epidemic parameters? To answer this question, epidemic data are simulated by constructing commuting networks with different network structures and running the infection process over each network. From the time series data generated for each network structure, we analyze the estimation accuracy of the epidemic parameters.

  12. Effects of ageing on the electrical characteristics of Zn/ZnS/n-GaAs/In structure

    NASA Astrophysics Data System (ADS)

    Güzeldir, B.; Sağlam, M.

    2016-04-01

    A Zn/ZnS/n-GaAs/In structure has been fabricated by the Successive Ionic Layer Adsorption and Reaction (SILAR) method and the influence of ageing on its characteristic parameters is examined. The current-voltage (I-V) characteristics of the structure have been measured immediately, and 1, 3, 5, 15, 30, 45, 60, 75, 90, 105, 120, 135, 150 and 165 days after fabrication. The characteristic parameters of the structure, such as barrier height, ideality factor and series resistance, are calculated from the I-V measurements. It is observed that the barrier height, ideality factor and series resistance of the Zn/ZnS/n-GaAs/In structure change only slightly with increasing ageing time.

  13. Introduction to multifractal detrended fluctuation analysis in matlab.

    PubMed

    Ihlen, Espen A F

    2012-01-01

    Fractal structures are found in biomedical time series from a wide range of physiological phenomena. The multifractal spectrum identifies the deviations in fractal structure within time periods with large and small fluctuations. The present tutorial is an introduction to multifractal detrended fluctuation analysis (MFDFA), which estimates the multifractal spectrum of biomedical time series. The tutorial presents MFDFA step-by-step in an interactive Matlab session. All Matlab tools needed are available in the Introduction to MFDFA folder at the website www.ntnu.edu/inm/geri/software. MFDFA is introduced in Matlab code boxes where the reader can apply parts of, or the entire, MFDFA to example time series. After introducing MFDFA, the tutorial discusses best practice for MFDFA in biomedical signal processing. The main aim of the tutorial is to give the reader a simple self-contained guide to the implementation of MFDFA and the interpretation of the resulting multifractal spectra.

  14. Introduction to Multifractal Detrended Fluctuation Analysis in Matlab

    PubMed Central

    Ihlen, Espen A. F.

    2012-01-01

    Fractal structures are found in biomedical time series from a wide range of physiological phenomena. The multifractal spectrum identifies the deviations in fractal structure within time periods with large and small fluctuations. The present tutorial is an introduction to multifractal detrended fluctuation analysis (MFDFA), which estimates the multifractal spectrum of biomedical time series. The tutorial presents MFDFA step-by-step in an interactive Matlab session. All Matlab tools needed are available in the Introduction to MFDFA folder at the website www.ntnu.edu/inm/geri/software. MFDFA is introduced in Matlab code boxes where the reader can apply parts of, or the entire, MFDFA to example time series. After introducing MFDFA, the tutorial discusses best practice for MFDFA in biomedical signal processing. The main aim of the tutorial is to give the reader a simple self-contained guide to the implementation of MFDFA and the interpretation of the resulting multifractal spectra. PMID:22675302
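
    A compact Python analogue of the Matlab procedure described in the tutorial: detrend each window of the integrated series, compute the q-th order fluctuation function F_q(s) over a range of scales, and read the generalized Hurst exponent h(q) from the log-log slope. This is a bare-bones sketch on hypothetical white noise, not the tutorial's code.

      import numpy as np

      def mfdfa(x, scales, q_values, order=1):
          # Minimal MFDFA: returns h(q) estimated from the slopes of log F_q(s) vs log s.
          profile = np.cumsum(x - np.mean(x))          # integrated (profile) series
          h, log_s = {}, np.log(scales)
          for q in q_values:
              log_F = []
              for s in scales:
                  n_seg = len(profile) // s
                  rms = []
                  for v in range(n_seg):
                      seg = profile[v * s:(v + 1) * s]
                      t = np.arange(s)
                      fit = np.polyval(np.polyfit(t, seg, order), t)   # local detrending
                      rms.append(np.sqrt(np.mean((seg - fit) ** 2)))
                  rms = np.asarray(rms)
                  if q == 0:
                      F = np.exp(0.5 * np.mean(np.log(rms ** 2)))      # limit q -> 0
                  else:
                      F = np.mean(rms ** q) ** (1.0 / q)
                  log_F.append(np.log(F))
              h[q] = np.polyfit(log_s, np.asarray(log_F), 1)[0]        # slope = h(q)
          return h

      x = np.random.default_rng(3).standard_normal(4096)               # white noise, h(q) ~ 0.5
      print(mfdfa(x, scales=[16, 32, 64, 128, 256], q_values=[-3, 0, 3]))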

  15. A wavelet based approach to measure and manage contagion at different time scales

    NASA Astrophysics Data System (ADS)

    Berger, Theo

    2015-10-01

    We decompose financial return series of US stocks into different time scales with respect to different market regimes. First, we examine dependence structure of decomposed financial return series and analyze the impact of the current financial crisis on contagion and changing interdependencies as well as upper and lower tail dependence for different time scales. Second, we demonstrate to which extent the information of different time scales can be used in the context of portfolio management. As a result, minimizing the variance of short-run noise outperforms a portfolio that minimizes the variance of the return series.

  16. Option pricing from wavelet-filtered financial series

    NASA Astrophysics Data System (ADS)

    de Almeida, V. T. X.; Moriconi, L.

    2012-10-01

    We perform wavelet decomposition of high frequency financial time series into large and small time scale components. Taking the FTSE100 index as a case study, and working with the Haar basis, it turns out that the small scale component defined by most (≃99.6%) of the wavelet coefficients can be neglected for the purpose of option premium evaluation. The relevance of the hugely compressed information provided by low-pass wavelet-filtering is related to the fact that the non-gaussian statistical structure of the original financial time series is essentially preserved for expiration times which are larger than just one trading day.
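
    The low-pass filtering step described above can be reproduced with PyWavelets: decompose the return series on the Haar basis, zero the small-scale (detail) coefficients, and reconstruct. The sketch assumes a hypothetical heavy-tailed return series rather than the FTSE100 data, and the number of retained coefficients depends on the chosen decomposition level.

      import numpy as np
      import pywt

      rng = np.random.default_rng(4)
      returns = rng.standard_t(df=4, size=4096) * 1e-3   # heavy-tailed stand-in for index returns

      # Multilevel Haar decomposition: coeffs = [approximation, detail_L, ..., detail_1]
      level = 6
      coeffs = pywt.wavedec(returns, "haar", level=level)

      # Keep only the large-scale (approximation) part; discard the small-scale details.
      filtered = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
      smooth = pywt.waverec(filtered, "haar")

      kept = coeffs[0].size
      total = sum(c.size for c in coeffs)
      print(f"kept {kept}/{total} coefficients ({100 * kept / total:.1f}%)")
      print("std original:", returns.std(), "std filtered:", smooth.std())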

  17. Permutation entropy with vector embedding delays

    NASA Astrophysics Data System (ADS)

    Little, Douglas J.; Kane, Deb M.

    2017-12-01

    Permutation entropy (PE) is a statistic used widely for the detection of structure within a time series. Embedding delay times at which the PE is reduced are characteristic timescales for which such structure exists. Here, a generalized scheme is investigated where embedding delays are represented by vectors rather than scalars, permitting PE to be calculated over a (D-1)-dimensional space, where D is the embedding dimension. This scheme is applied to numerically generated noise, sine wave and logistic map series, and experimental data sets taken from a vertical-cavity surface emitting laser exhibiting temporally localized pulse structures within the round-trip time of the laser cavity. Results are visualized as PE maps as a function of embedding delay, with low PE values indicating combinations of embedding delays where correlation structure is present. It is demonstrated that vector embedding delays enable identification of structure that is ambiguous or masked, when the embedding delay is constrained to scalar form.
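
    For reference, the standard scalar-delay permutation entropy can be computed in a few lines: form ordinal patterns from delay-embedded vectors and take the Shannon entropy of their relative frequencies. The vector-delay generalization of the paper replaces the single delay tau by one delay per embedding coordinate; the sketch below is only the scalar baseline, on hypothetical input.

      import math
      from collections import Counter
      import numpy as np

      def permutation_entropy(x, dim=3, tau=1, normalize=True):
          # Permutation entropy of a 1-D series for embedding dimension `dim` and delay `tau`.
          n = len(x) - (dim - 1) * tau
          patterns = Counter()
          for i in range(n):
              window = x[i:i + dim * tau:tau]            # delay-embedded vector
              patterns[tuple(np.argsort(window))] += 1   # its ordinal pattern
          p = np.array(list(patterns.values()), dtype=float) / n
          h = -np.sum(p * np.log2(p))
          return h / math.log2(math.factorial(dim)) if normalize else h

      rng = np.random.default_rng(5)
      noise = rng.standard_normal(5000)
      sine = np.sin(2 * np.pi * np.arange(5000) / 40)
      print("noise:", permutation_entropy(noise, dim=4))   # near 1: no ordinal structure
      print("sine :", permutation_entropy(sine, dim=4))    # well below 1: strong structure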

  18. United States forest disturbance trends observed with landsat time series

    Treesearch

    Jeffrey G. Masek; Samuel N. Goward; Robert E. Kennedy; Warren B. Cohen; Gretchen G. Moisen; Karen Schleweiss; Chengquan Huang

    2013-01-01

    Disturbance events strongly affect the composition, structure, and function of forest ecosystems; however, existing US land management inventories were not designed to monitor disturbance. To begin addressing this gap, the North American Forest Dynamics (NAFD) project has examined a geographic sample of 50 Landsat satellite image time series to assess trends in forest...

  19. Causal discovery and inference: concepts and recent methodological advances.

    PubMed

    Spirtes, Peter; Zhang, Kun

    This paper aims to give a broad coverage of central concepts and principles involved in automated causal inference and emerging approaches to causal discovery from i.i.d. data and from time series. After reviewing concepts including manipulations, causal models, sample predictive modeling, causal predictive modeling, and structural equation models, we present the constraint-based approach to causal discovery, which relies on the conditional independence relationships in the data, and discuss the assumptions underlying its validity. We then focus on causal discovery based on structural equation models, in which a key issue is the identifiability of the causal structure implied by appropriately defined structural equation models: in the two-variable case, under what conditions (and why) is the causal direction between the two variables identifiable? We show that the independence between the error term and causes, together with appropriate structural constraints on the structural equation, makes it possible. Next, we report some recent advances in causal discovery from time series. Assuming that the causal relations are linear with non-Gaussian noise, we mention two problems which are traditionally difficult to solve, namely causal discovery from subsampled data and causal discovery in the presence of confounding time series. Finally, we list a number of open questions in the field of causal discovery and inference.

  20. Clustering of financial time series

    NASA Astrophysics Data System (ADS)

    D'Urso, Pierpaolo; Cappelli, Carmela; Di Lallo, Dario; Massari, Riccardo

    2013-05-01

    This paper addresses the topic of classifying financial time series in a fuzzy framework, proposing two fuzzy clustering models, both based on GARCH models. In general, clustering of financial time series, due to their peculiar features, needs the definition of suitable distance measures. To this aim, the first fuzzy clustering model exploits the autoregressive representation of GARCH models and employs, in the framework of a partitioning around medoids algorithm, the classical autoregressive metric. The second fuzzy clustering model, also based on the partitioning around medoids algorithm, uses the Caiado distance, a Mahalanobis-like distance based on estimated GARCH parameters and covariances that takes into account the information about the volatility structure of the time series. In order to illustrate the merits of the proposed fuzzy approaches, an application to the problem of classifying 29 time series of Euro exchange rates against international currencies is presented and discussed, also comparing the fuzzy models with their crisp versions.

  1. An evaluation of dynamic mutuality measurements and methods in cyclic time series

    NASA Astrophysics Data System (ADS)

    Xia, Xiaohua; Huang, Guitian; Duan, Na

    2010-12-01

    Several measurements and techniques have been developed to detect dynamic mutuality and synchronicity of time series in econometrics. This study aims to compare the performances of five methods, i.e., linear regression, dynamic correlation, Markov switching models, concordance index and recurrence quantification analysis, through numerical simulations. We evaluate the abilities of these methods to capture structure changing and cyclicity in time series and the findings of this paper would offer guidance to both academic and empirical researchers. Illustration examples are also provided to demonstrate the subtle differences of these techniques.

  2. A robust damage-detection technique with environmental variability combining time-series models with principal components

    NASA Astrophysics Data System (ADS)

    Lakshmi, K.; Rama Mohan Rao, A.

    2014-10-01

    In this paper, a novel output-only damage-detection technique based on time-series models for structural health monitoring in the presence of environmental variability and measurement noise is presented. The large amount of data obtained in the form of time-history responses is transformed using principal component analysis, in order to reduce the data size and thereby improve the computational efficiency of the proposed algorithm. The time instant of damage is obtained by fitting the acceleration time-history data from the structure using autoregressive (AR) and AR with exogenous inputs (ARX) time-series prediction models. The probability density functions (PDFs) of damage features obtained from the variances of prediction errors corresponding to the reference data and the current (healthy) data are found to shift relative to each other due to the presence of various uncertainties such as environmental variability and measurement noise. Control limits based on a novelty index are obtained from the distances between the peaks of the PDF curves in the healthy condition and are later used for determining the current condition of the structure. Numerical simulation studies have been carried out using a simply supported beam and also validated using experimental benchmark data corresponding to a three-storey framed bookshelf structure proposed by Los Alamos National Laboratory. Studies carried out in this paper clearly indicate the efficiency of the proposed algorithm for damage detection in the presence of measurement noise and environmental variability.
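
    The core damage feature described above, the variance of AR prediction errors, can be sketched with statsmodels. The example below uses a hypothetical two-pole oscillator response as reference and "damaged" records, fits an AR model on the reference data, and compares residual variances; the full method additionally uses PCA compression, ARX models, PDF-based novelty indices and control limits, which are not reproduced here.

      import numpy as np
      from statsmodels.tsa.ar_model import AutoReg

      rng = np.random.default_rng(6)

      def response(n, damage=0.0):
          # Hypothetical single-channel acceleration record of a lightly damped oscillator;
          # "damage" slightly detunes the dynamics.
          x = np.zeros(n)
          a1, a2 = 1.6 * (1 - damage), -0.9
          for k in range(2, n):
              x[k] = a1 * x[k - 1] + a2 * x[k - 2] + rng.standard_normal()
          return x

      reference = response(4000)
      current = response(4000, damage=0.05)

      order = 10
      ar_ref = AutoReg(reference, lags=order).fit()   # model of the healthy (reference) state

      def prediction_error_variance(model, data, order):
          # One-step prediction errors of the fitted AR model applied to new data.
          params = model.params                       # [const, phi_1, ..., phi_order]
          preds = params[0] + sum(params[i] * data[order - i:-i] for i in range(1, order + 1))
          return np.var(data[order:] - preds)

      print("reference error variance:", prediction_error_variance(ar_ref, reference, order))
      print("current   error variance:", prediction_error_variance(ar_ref, current, order))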

  3. Behavior of road accidents: Structural time series approach

    NASA Astrophysics Data System (ADS)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir; Arsad, Zainudin

    2014-12-01

    Road accidents have become a major issue contributing to the increasing number of deaths. A few researchers suggest that road accidents occur due to road structure and road condition. The road structure and condition may differ according to the area and the volume of traffic at the location. Therefore, this paper examines the behavior of road accidents in four main regions of Peninsular Malaysia by employing a structural time series (STS) approach. STS offers the possibility of modelling unobserved components, such as the trend and seasonal components, which are allowed to vary over time. The results show that the number of road accidents in each region is described by a different model. The results imply that the government, and policy makers in particular, should consider implementing different approaches to overcome the increasing number of road accidents.
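
    Structural time series models of this kind are available in statsmodels as unobserved components models. The sketch below fits a local linear trend plus a monthly seasonal to a hypothetical accident-count series; the specification actually chosen for each region in the paper may differ.

      import numpy as np
      from statsmodels.tsa.statespace.structural import UnobservedComponents

      # Hypothetical monthly accident counts: trend + seasonality + noise.
      rng = np.random.default_rng(7)
      n = 120
      y = (200 + 0.8 * np.arange(n)
           + 20 * np.sin(2 * np.pi * np.arange(n) / 12)
           + rng.normal(0, 10, n))

      # Local linear trend (time-varying level and slope) with a period-12 seasonal component.
      model = UnobservedComponents(y, level="local linear trend", seasonal=12)
      res = model.fit(disp=False)
      print(res.summary())
      # res.plot_components() draws the estimated level, trend and seasonal components.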

  4. Structural models used in real-time biosurveillance outbreak detection and outbreak curve isolation from noisy background morbidity levels

    PubMed Central

    Cheng, Karen Elizabeth; Crary, David J; Ray, Jaideep; Safta, Cosmin

    2013-01-01

    Objective We discuss the use of structural models for the analysis of biosurveillance related data. Methods and results Using a combination of real and simulated data, we have constructed a data set that represents a plausible time series resulting from surveillance of a large scale bioterrorist anthrax attack in Miami. We discuss the performance of anomaly detection with structural models for these data using receiver operating characteristic (ROC) and activity monitoring operating characteristic (AMOC) analysis. In addition, we show that these techniques provide a method for predicting the level of the outbreak valid for approximately 2 weeks, post-alarm. Conclusions Structural models provide an effective tool for the analysis of biosurveillance data, in particular for time series with noisy, non-stationary background and missing data. PMID:23037798

  5. Metric projection for dynamic multiplex networks.

    PubMed

    Jurman, Giuseppe

    2016-08-01

    Evolving multiplex networks are a powerful model for representing the dynamics over time of different phenomena, such as social networks, power grids and biological pathways. However, exploring the structure of multiplex network time series is still an open problem. Here we propose a two-step strategy to tackle this problem based on the concept of distance (metric) between networks. Given a multiplex graph, first a network of networks is built for each time step, and then a real-valued time series is obtained from the sequence of (simple) networks by evaluating the distance of each element from the first element of the series. The effectiveness of this approach in detecting the changes occurring along the original time series is shown on a synthetic example first, and then on the Gulf dataset of political events.

  6. Ordinary kriging as a tool to estimate historical daily streamflow records

    USGS Publications Warehouse

    Farmer, William H.

    2016-01-01

    Efficient and responsible management of water resources relies on accurate streamflow records. However, many watersheds are ungaged, limiting the ability to assess and understand local hydrology. Several tools have been developed to alleviate this data scarcity, but few provide continuous daily streamflow records at individual streamgages within an entire region. Building on the history of hydrologic mapping, ordinary kriging was extended to predict daily streamflow time series on a regional basis. Pooling parameters to estimate a single, time-invariant characterization of spatial semivariance structure is shown to produce accurate reproduction of streamflow. This approach is contrasted with a time-varying series of variograms, representing the temporal evolution and behavior of the spatial semivariance structure. Furthermore, the ordinary kriging approach is shown to produce more accurate time series than more common, single-index hydrologic transfers. A comparison between topological kriging and ordinary kriging is less definitive, showing the ordinary kriging approach to be significantly inferior in terms of Nash–Sutcliffe model efficiencies while maintaining significantly superior performance measured by root mean squared errors. Given the similarity of performance and the computational efficiency of ordinary kriging, it is concluded that ordinary kriging is useful for first-order approximation of daily streamflow time series in ungaged watersheds.

  7. Improvements to surrogate data methods for nonstationary time series.

    PubMed

    Lucio, J H; Valdés, R; Rodríguez, L R

    2012-05-01

    The method of surrogate data has been extensively applied to hypothesis testing of system linearity, when only one realization of the system, a time series, is known. Normally, surrogate data should preserve the linear stochastic structure and the amplitude distribution of the original series. Classical surrogate data methods (such as random permutation, amplitude adjusted Fourier transform, or iterative amplitude adjusted Fourier transform) are successful at preserving one or both of these features in stationary cases. However, they always produce stationary surrogates, hence existing nonstationarity could be interpreted as dynamic nonlinearity. Certain modifications have been proposed that additionally preserve some nonstationarity, at the expense of reproducing a great deal of nonlinearity. However, even those methods generally fail to preserve the trend (i.e., global nonstationarity in the mean) of the original series. This is the case of time series with unit roots in their autoregressive structure. Additionally, those methods, based on Fourier transform, either need first and last values in the original series to match, or they need to select a piece of the original series with matching ends. These conditions are often inapplicable and the resulting surrogates are adversely affected by the well-known artefact problem. In this study, we propose a simple technique that, applied within existing Fourier-transform-based methods, generates surrogate data that jointly preserve the aforementioned characteristics of the original series, including (even strong) trends. Moreover, our technique avoids the negative effects of end mismatch. Several artificial and real, stationary and nonstationary, linear and nonlinear time series are examined, in order to demonstrate the advantages of the methods. Corresponding surrogate data are produced with the classical and with the proposed methods, and the results are compared.
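
    The classical Fourier-transform surrogate that the improved method builds on can be sketched directly: randomize the phases of the FFT while keeping the amplitude spectrum, which preserves the linear (autocorrelation) structure but produces a stationary series. The end-matching and trend-preserving refinements proposed in the paper are not included in this minimal sketch; the input is a hypothetical random walk with a unit root.

      import numpy as np

      def phase_randomized_surrogate(x, rng=None):
          # FT surrogate: same amplitude spectrum as x, uniformly random Fourier phases.
          rng = np.random.default_rng() if rng is None else rng
          n = len(x)
          spectrum = np.fft.rfft(x)
          phases = rng.uniform(0, 2 * np.pi, spectrum.size)
          phases[0] = 0.0                  # keep the mean (zero-frequency term) real
          if n % 2 == 0:
              phases[-1] = 0.0             # keep the Nyquist term real for even n
          return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n=n)

      x = np.cumsum(np.random.default_rng(8).standard_normal(1024))   # nonstationary random walk
      s = phase_randomized_surrogate(x)
      print("original  std:", x.std())
      print("surrogate std:", s.std())   # matched second-order structure, but stationary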

  8. Imaging Molecular Motion: Femtosecond X-Ray Scattering of an Electrocyclic Chemical Reaction

    NASA Astrophysics Data System (ADS)

    Minitti, M. P.; Budarz, J. M.; Kirrander, A.; Robinson, J. S.; Ratner, D.; Lane, T. J.; Zhu, D.; Glownia, J. M.; Kozina, M.; Lemke, H. T.; Sikorski, M.; Feng, Y.; Nelson, S.; Saita, K.; Stankus, B.; Northey, T.; Hastings, J. B.; Weber, P. M.

    2015-06-01

    Structural rearrangements within single molecules occur on ultrafast time scales. Many aspects of molecular dynamics, such as the energy flow through excited states, have been studied using spectroscopic techniques, yet the goal to watch molecules evolve their geometrical structure in real time remains challenging. By mapping nuclear motions using femtosecond x-ray pulses, we have created real-space representations of the evolving dynamics during a well-known chemical reaction and show a series of time-sorted structural snapshots produced by ultrafast time-resolved hard x-ray scattering. A computational analysis optimally matches the series of scattering patterns produced by the x rays to a multitude of potential reaction paths. In so doing, we have made a critical step toward the goal of viewing chemical reactions on femtosecond time scales, opening a new direction in studies of ultrafast chemical reactions in the gas phase.

  9. Imaging Molecular Motion: Femtosecond X-Ray Scattering of an Electrocyclic Chemical Reaction.

    PubMed

    Minitti, M P; Budarz, J M; Kirrander, A; Robinson, J S; Ratner, D; Lane, T J; Zhu, D; Glownia, J M; Kozina, M; Lemke, H T; Sikorski, M; Feng, Y; Nelson, S; Saita, K; Stankus, B; Northey, T; Hastings, J B; Weber, P M

    2015-06-26

    Structural rearrangements within single molecules occur on ultrafast time scales. Many aspects of molecular dynamics, such as the energy flow through excited states, have been studied using spectroscopic techniques, yet the goal to watch molecules evolve their geometrical structure in real time remains challenging. By mapping nuclear motions using femtosecond x-ray pulses, we have created real-space representations of the evolving dynamics during a well-known chemical reaction and show a series of time-sorted structural snapshots produced by ultrafast time-resolved hard x-ray scattering. A computational analysis optimally matches the series of scattering patterns produced by the x rays to a multitude of potential reaction paths. In so doing, we have made a critical step toward the goal of viewing chemical reactions on femtosecond time scales, opening a new direction in studies of ultrafast chemical reactions in the gas phase.

  10. Time-series analysis of delta13C from tree rings. I. Time trends and autocorrelation.

    PubMed

    Monserud, R A; Marshall, J D

    2001-09-01

    Univariate time-series analyses were conducted on stable carbon isotope ratios obtained from tree-ring cellulose. We looked for the presence and structure of autocorrelation. Significant autocorrelation violates the statistical independence assumption and biases hypothesis tests. Its presence would indicate the existence of lagged physiological effects that persist for longer than the current year. We analyzed data from 28 trees (60-85 years old; mean = 73 years) of western white pine (Pinus monticola Dougl.), ponderosa pine (Pinus ponderosa Laws.), and Douglas-fir (Pseudotsuga menziesii (Mirb.) Franco var. glauca) growing in northern Idaho. Material was obtained by the stem analysis method from rings laid down in the upper portion of the crown throughout each tree's life. The sampling protocol minimized variation caused by changing light regimes within each tree. Autoregressive moving average (ARMA) models were used to describe the autocorrelation structure over time. Three time series were analyzed for each tree: the stable carbon isotope ratio (delta(13)C); discrimination (delta); and the difference between ambient and internal CO(2) concentrations (c(a) - c(i)). The effect of converting from ring cellulose to whole-leaf tissue did not affect the analysis because it was almost completely removed by the detrending that precedes time-series analysis. A simple linear or quadratic model adequately described the time trend. The residuals from the trend had a constant mean and variance, thus ensuring stationarity, a requirement for autocorrelation analysis. The trend over time for c(a) - c(i) was particularly strong (R(2) = 0.29-0.84). Autoregressive moving average analyses of the residuals from these trends indicated that two-thirds of the individual tree series contained significant autocorrelation, whereas the remaining third were random (white noise) over time. We were unable to distinguish between individuals with and without significant autocorrelation beforehand. Significant ARMA models were all of low order, with either first- or second-order (i.e., lagged 1 or 2 years, respectively) models performing well. A simple autoregressive (AR(1)), model was the most common. The most useful generalization was that the same ARMA model holds for each of the three series (delta(13)C, delta, c(a) - c(i)) for an individual tree, if the time trend has been properly removed for each series. The mean series for the two pine species were described by first-order ARMA models (1-year lags), whereas the Douglas-fir mean series were described by second-order models (2-year lags) with negligible first-order effects. Apparently, the process of constructing a mean time series for a species preserves an underlying signal related to delta(13)C while canceling some of the random individual tree variation. Furthermore, the best model for the overall mean series (e.g., for a species) cannot be inferred from a consensus of the individual tree model forms, nor can its parameters be estimated reliably from the mean of the individual tree parameters. Because two-thirds of the individual tree time series contained significant autocorrelation, the normal assumption of a random structure over time is unwarranted, even after accounting for the time trend. The residuals of an appropriate ARMA model satisfy the independence assumption, and can be used to make hypothesis tests.

  11. A novel time series link prediction method: Learning automata approach

    NASA Astrophysics Data System (ADS)

    Moradabadi, Behnaz; Meybodi, Mohammad Reza

    2017-09-01

    Link prediction is a main social network challenge that uses the network structure to predict future links. Common link prediction approaches for predicting hidden links use a static graph representation in which a snapshot of the network is analyzed to find hidden or future links. For example, similarity-metric-based link prediction is a common traditional approach that calculates a similarity metric for each non-connected pair of nodes, sorts the candidate links by their similarity metrics, and labels the links with higher similarity scores as future links. Because people's activities in social networks are dynamic and uncertain, and the structure of the networks changes over time, using deterministic graphs for modeling and analysis of a social network may not be appropriate. In the time-series link prediction problem, the time series of link occurrences is used to predict future links. In this paper, we propose a new time series link prediction method based on learning automata. In the proposed algorithm, there is one learning automaton for each link that must be predicted, and each learning automaton tries to predict the existence or non-existence of the corresponding link. To predict the link occurrence at time T, there is a chain consisting of stages 1 through T - 1, and the learning automaton passes through these stages to learn the existence or non-existence of the corresponding link. Our preliminary link prediction experiments with co-authorship and email networks have provided satisfactory results when time series of link occurrences are considered.

  12. Visibility graph network analysis of natural gas price: The case of North American market

    NASA Astrophysics Data System (ADS)

    Sun, Mei; Wang, Yaqi; Gao, Cuixia

    2016-11-01

    Fluctuations in natural gas prices significantly affect the global economy. Therefore, research on the characteristics of natural gas price fluctuations, their turning points and the cycles through which they influence the subsequent price series is of great significance. Global natural gas trade concentrates on three regional markets: the North American market, the European market and the Asia-Pacific market, with North America having the most developed natural gas financial market. In addition, well-developed legal supervision and coordinated regulations make the North American market more open and more competitive. This paper focuses on the North American natural gas market specifically. The Henry Hub natural gas spot price time series is converted to a visibility graph network, which provides a new direction for macro analysis of time series, and several indicators are investigated: degree and degree distribution, the average shortest path length and community structure. The internal mechanisms underlying price fluctuations are explored through these indicators. The results show that the natural gas price visibility graph network (NGP-VGN) has small-world and scale-free properties simultaneously. After random rearrangement of the original price time series, the degree distribution of the network becomes exponential, different from the original one. This means that the original price time series has a long-range negatively correlated fractal character. In addition, nodes with large degree correspond to significant geopolitical or economic events. Communities correspond to time cycles in the visibility graph network. The cycles of the time series and the impact scope of hubs can be found by community structure partitioning.

  13. Economic Conditions and the Divorce Rate: A Time-Series Analysis of the Postwar United States.

    ERIC Educational Resources Information Center

    South, Scott J.

    1985-01-01

    Challenges the belief that the divorce rate rises during prosperity and falls during economic recessions. Time-series regression analysis of postwar United States reveals small but positive effects of unemployment on divorce rate. Stronger influences on divorce rates are changes in age structure and labor-force participation rate of women.…

  14. Behaviour of a series of reservoirs separated by drowned gates

    NASA Astrophysics Data System (ADS)

    Kolechkina, Alla; van Nooijen, Ronald

    2017-04-01

    Modern control systems tend to be based on computers and therefore to operate by sending commands to structures at given intervals (discrete time control system). Moreover, for almost all water management control systems there are practical lower limits on the time interval between structure adjustments and even between measurements. The water resource systems that are being controlled are physical systems whose state changes continuously. If we combine a continuously changing system and a discrete time controller we get a hybrid system. We use material from recent control theory literature to examine the behaviour of a series of reservoirs separated by drowned gates where the gates are under computer control.

  15. Inferring the relative resilience of alternative states

    USGS Publications Warehouse

    Angeler, David G.; Allen, Craig R.; Rojo, Carmen; Alvarez-Cobelas, Miguel; Rodrigo, Maria A.; Sanchez-Carrillo, Salvador

    2013-01-01

    Ecological systems may occur in alternative states that differ in ecological structures, functions and processes. Resilience is the measure of disturbance an ecological system can absorb before changing states. However, how the intrinsic structures and processes of systems that characterize their states affects their resilience remains unclear. We analyzed time series of phytoplankton communities at three sites in a floodplain in central Spain to assess the dominant frequencies or “temporal scales” in community dynamics and compared the patterns between a wet and a dry alternative state. The identified frequencies and cross-scale structures are expected to arise from positive feedbacks that are thought to reinforce processes in alternative states of ecological systems and regulate emergent phenomena such as resilience. Our analyses show a higher species richness and diversity but lower evenness in the dry state. Time series modeling revealed a decrease in the importance of short-term variability in the communities, suggesting that community dynamics slowed down in the dry relative to the wet state. The number of temporal scales at which community dynamics manifested, and the explanatory power of time series models, was lower in the dry state. The higher diversity, reduced number of temporal scales and the lower explanatory power of time series models suggest that species dynamics tended to be more stochastic in the dry state. From a resilience perspective our results highlight a paradox: increasing species richness may not necessarily enhance resilience. The loss of cross-scale structure (i.e. the lower number of temporal scales) in community dynamics across sites suggests that resilience erodes during drought. Phytoplankton communities in the dry state are therefore likely less resilient than in the wet state. Our case study demonstrates the potential of time series modeling to assess attributes that mediate resilience. The approach is useful for assessing resilience of alternative states across ecological and other complex systems.

  16. The local properties of ocean surface waves by the phase-time method

    NASA Technical Reports Server (NTRS)

    Huang, Norden E.; Long, Steven R.; Tung, Chi-Chao; Donelan, Mark A.; Yuan, Yeli; Lai, Ronald J.

    1992-01-01

    A new approach using phase information to view and study the properties of frequency modulation, wave group structures, and wave breaking is presented. The method is applied to ocean wave time series data and a new type of wave group (containing the large 'rogue' waves) is identified. The method also has the capability of broad applications in the analysis of time series data in general.

  17. Diffusive and subdiffusive dynamics of indoor microclimate: a time series modeling.

    PubMed

    Maciejewska, Monika; Szczurek, Andrzej; Sikora, Grzegorz; Wyłomańska, Agnieszka

    2012-09-01

    The indoor microclimate is an issue in modern society, where people spend about 90% of their time indoors. Temperature and relative humidity are commonly used for its evaluation. In this context, the two parameters are usually considered as behaving in the same manner, just inversely correlated. This opinion comes from observation of the deterministic components of temperature and humidity time series. We focus on the dynamics and the dependency structure of the time series of these parameters, without deterministic components. Here we apply the mean square displacement, the autoregressive integrated moving average (ARIMA), and the methodology for studying anomalous diffusion. The analyzed data originated from five monitoring locations inside a modern office building, covering a period of nearly one week. It was found that the temperature data exhibited a transition between diffusive and subdiffusive behavior, when the building occupancy pattern changed from the weekday to the weekend pattern. At the same time the relative humidity consistently showed diffusive character. Also the structures of the dependencies of the temperature and humidity data sets were different, as shown by the different structures of the ARIMA models which were found appropriate. In the space domain, the dynamics and dependency structure of the particular parameter were preserved. This work proposes an approach to describe the very complex conditions of indoor air and it contributes to the improvement of the representative character of microclimate monitoring.

  18. Long-range memory and multifractality in gold markets

    NASA Astrophysics Data System (ADS)

    Mali, Provash; Mukhopadhyay, Amitabha

    2015-03-01

    Long-range correlation and fluctuation in the gold market time series of the world's two leading gold consuming countries, namely China and India, are studied. For both the market series during the period 1985-2013 we observe a long-range persistence of memory in the sequences of maxima (minima) of returns in successive time windows of fixed length, but the series, as a whole, are found to be uncorrelated. Multifractal analysis for these series as well as for the sequences of maxima (minima) is carried out in terms of the multifractal detrended fluctuation analysis (MF-DFA) method. We observe a weak multifractal structure for the original series that mainly originates from the fat-tailed probability distribution function of the values, and the multifractal nature of the original time series is enriched into their sequences of maximal (minimal) returns. A quantitative measure of multifractality is provided by using a set of ‘complexity parameters’.

  19. Coil-to-coil physiological noise correlations and their impact on fMRI time-series SNR

    PubMed Central

    Triantafyllou, C.; Polimeni, J. R.; Keil, B.; Wald, L. L.

    2017-01-01

    Purpose Physiological nuisance fluctuations (“physiological noise”) are a major contribution to the time-series Signal to Noise Ratio (tSNR) of functional imaging. While thermal noise correlations between array coil elements have a well-characterized effect on the image Signal to Noise Ratio (SNR0), the element-to-element covariance matrix of the time-series fluctuations has not yet been analyzed. We examine this effect with a goal of ultimately improving the combination of multichannel array data. Theory and Methods We extend the theoretical relationship between tSNR and SNR0 to include a time-series noise covariance matrix Ψt, distinct from the thermal noise covariance matrix Ψ0, and compare its structure to Ψ0 and the signal coupling matrix SSH formed from the signal intensity vectors S. Results Inclusion of the measured time-series noise covariance matrix into the model relating tSNR and SNR0 improves the fit of experimental multichannel data and is shown to be distinct from Ψ0 or SSH. Conclusion Time-series noise covariances in array coils are found to differ from Ψ0 and more surprisingly, from the signal coupling matrix SSH. Correct characterization of the time-series noise has implications for the analysis of time-series data and for improving the coil element combination process. PMID:26756964

  20. Constructing networks from a dynamical system perspective for multivariate nonlinear time series.

    PubMed

    Nakamura, Tomomichi; Tanizawa, Toshihiro; Small, Michael

    2016-03-01

    We describe a method for constructing networks for multivariate nonlinear time series. We approach the interaction between the various scalar time series from a deterministic dynamical system perspective and provide a generic and algorithmic test for whether the interaction between two measured time series is statistically significant. The method can be applied even when the data exhibit no obvious qualitative similarity: a situation in which the naive method utilizing the cross correlation function directly cannot correctly identify connectivity. To establish the connectivity between nodes we apply the previously proposed small-shuffle surrogate (SSS) method, which can investigate whether there are correlation structures in short-term variabilities (irregular fluctuations) between two data sets from the viewpoint of deterministic dynamical systems. The procedure to construct networks based on this idea is composed of three steps: (i) each time series is considered as a basic node of a network, (ii) the SSS method is applied to verify the connectivity between each pair of time series taken from the whole multivariate time series, and (iii) the pair of nodes is connected with an undirected edge when the null hypothesis cannot be rejected. The network constructed by the proposed method indicates the intrinsic (essential) connectivity of the elements included in the system or the underlying (assumed) system. The method is demonstrated for numerical data sets generated by known systems and applied to several experimental time series.

  1. Dependency structure and scaling properties of financial time series are related

    PubMed Central

    Morales, Raffaello; Di Matteo, T.; Aste, Tomaso

    2014-01-01

    We report evidence of a deep interplay between cross-correlations hierarchical properties and multifractality of New York Stock Exchange daily stock returns. The degree of multifractality displayed by different stocks is found to be positively correlated to their depth in the hierarchy of cross-correlations. We propose a dynamical model that reproduces this observation along with an array of other empirical properties. The structure of this model is such that the hierarchical structure of heterogeneous risks plays a crucial role in the time evolution of the correlation matrix, providing an interpretation to the mechanism behind the interplay between cross-correlation and multifractality in financial markets, where the degree of multifractality of stocks is associated to their hierarchical positioning in the cross-correlation structure. Empirical observations reported in this paper present a new perspective towards the merging of univariate multi scaling and multivariate cross-correlation properties of financial time series. PMID:24699417

  2. Dependency structure and scaling properties of financial time series are related

    NASA Astrophysics Data System (ADS)

    Morales, Raffaello; Di Matteo, T.; Aste, Tomaso

    2014-04-01

    We report evidence of a deep interplay between cross-correlations hierarchical properties and multifractality of New York Stock Exchange daily stock returns. The degree of multifractality displayed by different stocks is found to be positively correlated to their depth in the hierarchy of cross-correlations. We propose a dynamical model that reproduces this observation along with an array of other empirical properties. The structure of this model is such that the hierarchical structure of heterogeneous risks plays a crucial role in the time evolution of the correlation matrix, providing an interpretation to the mechanism behind the interplay between cross-correlation and multifractality in financial markets, where the degree of multifractality of stocks is associated to their hierarchical positioning in the cross-correlation structure. Empirical observations reported in this paper present a new perspective towards the merging of univariate multi scaling and multivariate cross-correlation properties of financial time series.

  3. Time scale defined by the fractal structure of the price fluctuations in foreign exchange markets

    NASA Astrophysics Data System (ADS)

    Kumagai, Yoshiaki

    2010-04-01

    In this contribution, a new time scale named C-fluctuation time is defined by price fluctuations observed at a given resolution. The intraday fractal structures and the relations of the three time scales: real time (physical time), tick time and C-fluctuation time, in foreign exchange markets are analyzed. The data set used is trading prices of foreign exchange rates; US dollar (USD)/Japanese yen (JPY), USD/Euro (EUR), and EUR/JPY. The accuracy of the data is one minute and data within a minute are recorded in order of transaction. The series of instantaneous velocity of C-fluctuation time flowing are exponentially distributed for small C when they are measured by real time and for tiny C when they are measured by tick time. When the market is volatile, for larger C, the series of instantaneous velocity are exponentially distributed.

  4. Automated smoother for the numerical decoupling of dynamics models.

    PubMed

    Vilela, Marco; Borges, Carlos C H; Vinga, Susana; Vasconcelos, Ana Tereza R; Santos, Helena; Voit, Eberhard O; Almeida, Jonas S

    2007-08-21

    Structure identification of dynamic models for complex biological systems is the cornerstone of their reverse engineering. Biochemical Systems Theory (BST) offers a particularly convenient solution because its parameters are kinetic-order coefficients which directly identify the topology of the underlying network of processes. We have previously proposed a numerical decoupling procedure that allows the identification of multivariate dynamic models of complex biological processes. While described here within the context of BST, this procedure has a general applicability to signal extraction. Our original implementation relied on artificial neural networks (ANN), which caused slight, undesirable bias during the smoothing of the time courses. As an alternative, we propose here an adaptation of the Whittaker's smoother and demonstrate its role within a robust, fully automated structure identification procedure. In this report we propose a robust, fully automated solution for signal extraction from time series, which is the prerequisite for the efficient reverse engineering of biological systems models. The Whittaker's smoother is reformulated within the context of information theory and extended by the development of adaptive signal segmentation to account for heterogeneous noise structures. The resulting procedure can be used on arbitrary time series with a nonstationary noise process; it is illustrated here with metabolic profiles obtained from in-vivo NMR experiments. The smoothed solution that is free of parametric bias permits differentiation, which is crucial for the numerical decoupling of systems of differential equations. The method is applicable in signal extraction from time series with nonstationary noise structure and can be applied in the numerical decoupling of system of differential equations into algebraic equations, and thus constitutes a rather general tool for the reverse engineering of mechanistic model descriptions from multivariate experimental time series.

  5. The short time Fourier transform and local signals

    NASA Astrophysics Data System (ADS)

    Okumura, Shuhei

    In this thesis, I examine the theoretical properties of the short time discrete Fourier transform (STFT). The STFT is obtained by applying the Fourier transform by a fixed-sized, moving window to input series. We move the window by one time point at a time, so we have overlapping windows. I present several theoretical properties of the STFT, applied to various types of complex-valued, univariate time series inputs, and their outputs in closed forms. In particular, just like the discrete Fourier transform, the STFT's modulus time series takes large positive values when the input is a periodic signal. One main point is that a white noise time series input results in the STFT output being a complex-valued stationary time series and we can derive the time and time-frequency dependency structure such as the cross-covariance functions. Our primary focus is the detection of local periodic signals. I present a method to detect local signals by computing the probability that the squared modulus STFT time series has consecutive large values exceeding some threshold after one exceeding observation following one observation less than the threshold. We discuss a method to reduce the computation of such probabilities by the Box-Cox transformation and the delta method, and show that it works well in comparison to the Monte Carlo simulation method.
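
    A moving-window Fourier transform of the kind analyzed in the thesis is available in SciPy. The sketch below computes the squared-modulus STFT of a hypothetical series containing a local periodic burst, using a hop of one sample so that windows overlap as described above; the threshold-exceedance probability calculations of the detection method are not reproduced here.

      import numpy as np
      from scipy.signal import stft

      fs = 100.0
      t = np.arange(0, 60, 1 / fs)
      rng = np.random.default_rng(9)
      x = rng.standard_normal(t.size)
      burst = (t > 20) & (t < 30)
      x[burst] += 2.0 * np.sin(2 * np.pi * 5.0 * t[burst])     # local 5 Hz signal

      # Fixed-size window moved one sample at a time (noverlap = nperseg - 1).
      f, times, Z = stft(x, fs=fs, nperseg=256, noverlap=255)
      power = np.abs(Z) ** 2                                   # squared-modulus STFT time series
      row = np.argmin(np.abs(f - 5.0))
      print("mean 5 Hz power inside burst :", power[row, (times > 20) & (times < 30)].mean())
      print("mean 5 Hz power outside burst:", power[row, times < 15].mean())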

  6. New Results in Magnitude and Sign Correlations in Heartbeat Fluctuations for Healthy Persons and Congestive Heart Failure (CHF) Patients

    NASA Astrophysics Data System (ADS)

    Diosdado, A. Muñoz; Cruz, H. Reyes; Hernández, D. Bueno; Coyt, G. Gálvez; González, J. Arellanes

    2008-08-01

    Heartbeat fluctuations exhibit temporal structure with fractal and nonlinear features that reflect changes in the neuroautonomic control. In this work we have used detrended fluctuation analysis (DFA) to analyze heartbeat (RR) intervals of 54 healthy subjects and 40 patients with congestive heart failure over 24 hours; we separate the time series into sleep and wake phases. We observe long-range correlations in the time series of healthy persons and CHF patients. However, the correlations for CHF patients are weaker than those for healthy persons; this fact was reported by Ashkenazy et al. [1] but with a smaller group of subjects. In time series of CHF patients there is a crossover, meaning that the correlations at high and low frequencies differ, whereas in time series of healthy persons there are no crossovers, even during sleep. These crossovers are more pronounced for CHF patients in the sleep phase. We decompose the heartbeat interval time series into magnitude and sign series; such signals can exhibit different time organization for the magnitude and the sign, with the magnitude series relating to the nonlinear properties of the original time series and the sign series relating to the linear properties. Magnitude series are long-range correlated, while sign series are anticorrelated. Again, the correlations for healthy persons differ from those for CHF patients, both for the magnitude and the sign time series. Ashkenazy et al. proposed the empirical relation αsign ≈ 1/2(αoriginal + αmagnitude) for the short-range regime (high frequencies); however, we have found a different relation that, in our calculations, is valid for both the short- and long-range regimes: αsign ≈ 1/4(αoriginal + αmagnitude).
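
    As a hedged illustration of the general procedure described above (not the authors' exact pipeline), the sketch below decomposes increments into magnitude and sign series and estimates DFA-1 scaling exponents for each; the synthetic RR series and the scale range are assumptions.

      import numpy as np

      def dfa_alpha(x, scales):
          """DFA-1 scaling exponent: RMS fluctuation of the linearly detrended
          integrated profile, fitted as a power law of the box size."""
          y = np.cumsum(x - np.mean(x))
          F = []
          for s in scales:
              nseg = len(y) // s
              t = np.arange(s)
              segs = y[:nseg * s].reshape(nseg, s)
              res = [seg - np.polyval(np.polyfit(t, seg, 1), t) for seg in segs]
              F.append(np.sqrt(np.mean(np.concatenate(res) ** 2)))
          return np.polyfit(np.log(scales), np.log(F), 1)[0]

      rr = np.random.default_rng(1).standard_normal(5000)    # stand-in for an RR interval series
      inc = np.diff(rr)
      magnitude, sign = np.abs(inc), np.sign(inc)            # magnitude/sign decomposition

      scales = np.unique(np.logspace(1, 2.5, 12).astype(int))
      a_orig, a_mag, a_sign = (dfa_alpha(v, scales) for v in (rr, magnitude, sign))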

  7. Time Series Discord Detection in Medical Data using a Parallel Relational Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woodbridge, Diane; Rintoul, Mark Daniel; Wilson, Andrew T.

    Recent advances in sensor technology have made continuous real-time health monitoring available in both hospital and non-hospital settings. Because high-frequency medical sensors produce very large volumes of data, storing and processing continuous medical data is an emerging big-data problem. Detecting anomalies in real time is especially important for detecting and preventing patient emergencies. A time series discord is a subsequence that has the maximum difference to the rest of the time series subsequences, meaning that it exhibits abnormal or unusual data trends. In this study, we implemented two versions of time series discord detection algorithms on a high performance parallel database management system (DBMS) and applied them to 240 Hz waveform data collected from 9,723 patients. The initial brute-force version of the discord detection algorithm takes each possible subsequence and calculates a distance to the nearest non-self match to find the biggest discords in a time series. For the heuristic version of the algorithm, a combination of an array and a trie structure was applied to order the time series data to improve time efficiency. The study results showed efficient data loading, decoding and discord searches in a large amount of data, benefiting from the time series discord detection algorithm and the architectural characteristics of the parallel DBMS including data compression, data pipelining, and task scheduling.
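
    A minimal sketch of the brute-force discord search described above: every subsequence is compared with its nearest non-self match, and the subsequence with the largest such distance is the top discord. The window length and the toy series are assumptions; the parallel-DBMS and trie-based heuristics of the study are not reproduced here.

      import numpy as np

      def top_discord(x, w):
          """Return the index and distance of the subsequence whose nearest
          non-self match is farthest away (the top discord)."""
          subs = np.lib.stride_tricks.sliding_window_view(x, w)
          n = len(subs)
          best_dist, best_idx = -np.inf, -1
          for i in range(n):
              nearest = np.inf
              for j in range(n):
                  if abs(i - j) < w:                   # skip overlapping (self) matches
                      continue
                  nearest = min(nearest, np.linalg.norm(subs[i] - subs[j]))
              if nearest > best_dist:
                  best_dist, best_idx = nearest, i
          return best_idx, best_dist

      x = np.sin(np.linspace(0.0, 30.0, 600))
      x[300:330] += 1.5                                 # injected anomalous bump
      idx, dist = top_discord(x, w=40)                  # idx falls inside the injected region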

  8. Time Series Discord Detection in Medical Data using a Parallel Relational Database [PowerPoint]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woodbridge, Diane; Wilson, Andrew T.; Rintoul, Mark Daniel

    Recent advances in sensor technology have made continuous real-time health monitoring available in both hospital and non-hospital settings. Because high-frequency medical sensors produce very large volumes of data, storing and processing continuous medical data is an emerging big-data problem. Detecting anomalies in real time is especially important for detecting and preventing patient emergencies. A time series discord is a subsequence that has the maximum difference to the rest of the time series subsequences, meaning that it exhibits abnormal or unusual data trends. In this study, we implemented two versions of time series discord detection algorithms on a high performance parallel database management system (DBMS) and applied them to 240 Hz waveform data collected from 9,723 patients. The initial brute-force version of the discord detection algorithm takes each possible subsequence and calculates a distance to the nearest non-self match to find the biggest discords in a time series. For the heuristic version of the algorithm, a combination of an array and a trie structure was applied to order the time series data to improve time efficiency. The study results showed efficient data loading, decoding and discord searches in a large amount of data, benefiting from the time series discord detection algorithm and the architectural characteristics of the parallel DBMS including data compression, data pipelining, and task scheduling.

  9. Statistical characteristics of surrogate data based on geophysical measurements

    NASA Astrophysics Data System (ADS)

    Venema, V.; Bachner, S.; Rust, H. W.; Simmer, C.

    2006-09-01

    In this study, the statistical properties of a range of measurements are compared with those of their surrogate time series. Seven different records are studied, amongst others, historical time series of mean daily temperature, daily rain sums and runoff from two rivers, and cloud measurements. Seven different algorithms are used to generate the surrogate time series. The best-known method is the iterative amplitude adjusted Fourier transform (IAAFT) algorithm, which is able to reproduce the measured distribution as well as the power spectrum. Using this setup, the measurements and their surrogates are compared with respect to their power spectrum, increment distribution, structure functions, annual percentiles and return values. It is found that the surrogates that reproduce the power spectrum and the distribution of the measurements are able to closely match the increment distributions and the structure functions of the measurements, but this often does not hold for surrogates that only mimic the power spectrum of the measurement. However, even the best performing surrogates do not have asymmetric increment distributions, i.e., they cannot reproduce nonlinear dynamical processes that are asymmetric in time. Furthermore, we have found deviations of the structure functions on small scales.
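
    A hedged sketch of the IAAFT algorithm referenced above: the surrogate iterates between imposing the measured Fourier amplitude spectrum and restoring the measured value distribution by rank ordering. The iteration count, random seed and the toy seasonal series are assumptions for illustration.

      import numpy as np

      def iaaft(x, n_iter=100, seed=0):
          rng = np.random.default_rng(seed)
          amp = np.abs(np.fft.rfft(x))                 # target amplitude spectrum
          sorted_x = np.sort(x)                        # target value distribution
          s = rng.permutation(x)                       # random initial shuffle
          for _ in range(n_iter):
              # step 1: impose the target Fourier amplitudes, keep the current phases
              phases = np.angle(np.fft.rfft(s))
              s = np.fft.irfft(amp * np.exp(1j * phases), n=len(x))
              # step 2: impose the target value distribution by rank ordering
              s = sorted_x[np.argsort(np.argsort(s))]
          return s

      rng = np.random.default_rng(11)
      x = np.sin(2 * np.pi * np.arange(1024) / 365.0) + 0.5 * rng.standard_normal(1024)
      surrogate = iaaft(x)          # same distribution and (approximately) same power spectrum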

  10. A systematic review on the use of time series data in the study of antimicrobial consumption and Pseudomonas aeruginosa resistance.

    PubMed

    Athanasiou, Christos I; Kopsini, Angeliki

    2018-06-12

    In the field of antimicrobial resistance, the number of studies that use time series data has increased recently. The purpose of this study is to systematically review all studies of antibacterial consumption and of Pseudomonas aeruginosa resistance in healthcare settings that have used time series data. A systematic review of the literature up to June 2017 was conducted. All studies that used time series data and examined in-hospital antibiotic consumption and Ps. aeruginosa resistance rates or incidence were eligible. No other exclusion criteria were applied. Data on the structure, terminology, methods and results of each article were recorded and analyzed where possible. A total of thirty-six studies were retrieved, twenty-three of which met our criteria. Thirteen of them were quasi-experimental studies and ten were ecological observational studies. Eighteen studies collected time series data for both parameters, and the statistical methodology of "time series analysis" was applied in nine studies. Most of the studies were published in the last eight years. The interrupted time series design was the most widespread. As expected, there was high heterogeneity with regard to study design, terminology and statistical methods. Copyright © 2018. Published by Elsevier Ltd.

  11. Multifractal surrogate-data generation algorithm that preserves pointwise Hölder regularity structure, with initial applications to turbulence

    NASA Astrophysics Data System (ADS)

    Keylock, C. J.

    2017-03-01

    An algorithm is described that can generate random variants of a time series while preserving the probability distribution of original values and the pointwise Hölder regularity. Thus, it preserves the multifractal properties of the data. Our algorithm is similar in principle to well-known algorithms based on the preservation of the Fourier amplitude spectrum and original values of a time series. However, it is underpinned by a dual-tree complex wavelet transform rather than a Fourier transform. Our method, which we term the iterated amplitude adjusted wavelet transform, can be used to generate bootstrapped versions of multifractal data, and because it preserves the pointwise Hölder regularity but not the local Hölder regularity, it can be used to test hypotheses concerning the presence of oscillating singularities in a time series, an important feature of turbulence and econophysics data. Because the locations of the data values are randomized with respect to the multifractal structure, hypotheses about their mutual coupling can be tested, which is important for the velocity-intermittency structure of turbulence and self-regulating processes.

  12. Multiscale multifractal detrended cross-correlation analysis of financial time series

    NASA Astrophysics Data System (ADS)

    Shi, Wenbin; Shang, Pengjian; Wang, Jing; Lin, Aijing

    2014-06-01

    In this paper, we introduce a method called multiscale multifractal detrended cross-correlation analysis (MM-DCCA). The method allows us to extend the description of the cross-correlation properties between two time series. MM-DCCA may provide new ways of measuring the nonlinearity of two signals, and it presents much richer information than multifractal detrended cross-correlation analysis (MF-DCCA) by sweeping the whole range of scales at which the multifractal structure of a complex system is discussed. Moreover, to illustrate the advantages of this approach we use MM-DCCA to analyze the cross-correlation properties between financial time series. We show that this new method can be adapted to the stock markets under study and can provide a more faithful and more interpretable description of the dynamic mechanism between financial time series than traditional MF-DCCA. We also propose reducing the scale range when analyzing short time series, so that inherent properties which remain hidden when a wide range is used can be revealed.
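
    As a simplified illustration of the DCCA family underlying the method above, the sketch below computes a plain detrended cross-correlation fluctuation function (not the full multiscale multifractal MM-DCCA procedure); the toy series and scale range are assumptions.

      import numpy as np

      def dcca_fluctuation(x, y, scales):
          """Detrended cross-correlation fluctuation function F(s) for two series."""
          X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())   # integrated profiles
          F = []
          for s in scales:
              nseg = len(X) // s
              t = np.arange(s)
              cov = []
              for k in range(nseg):
                  xs, ys = X[k * s:(k + 1) * s], Y[k * s:(k + 1) * s]
                  rx = xs - np.polyval(np.polyfit(t, xs, 1), t)     # linear detrending per box
                  ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
                  cov.append(np.mean(rx * ry))
              F.append(np.sqrt(np.abs(np.mean(cov))))
          return np.array(F)

      rng = np.random.default_rng(2)
      x = rng.standard_normal(4000)
      y = 0.6 * x + 0.8 * rng.standard_normal(4000)                 # correlated toy "returns"
      scales = np.unique(np.logspace(1, 3, 15).astype(int))
      F = dcca_fluctuation(x, y, scales)      # slope of log F vs log s gives the DCCA exponent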

  13. Retrieving hydrological connectivity from empirical causality in karst systems

    NASA Astrophysics Data System (ADS)

    Delforge, Damien; Vanclooster, Marnik; Van Camp, Michel; Poulain, Amaël; Watlet, Arnaud; Hallet, Vincent; Kaufmann, Olivier; Francis, Olivier

    2017-04-01

    Because of their complexity, karst systems exhibit nonlinear dynamics. Moreover, if one attempts to model a karst, this hidden behavior complicates the choice of the most suitable model. Therefore, both intense investigation methods and nonlinear data analysis are needed to reveal the underlying hydrological connectivity as a prior for a consistent physically based modelling approach. Convergent Cross Mapping (CCM), a recent method, promises to identify causal relationships between time series belonging to the same dynamical system. The method is based on phase space reconstruction and is suitable for nonlinear dynamics. As an empirical causation detection method, it could be used to highlight the hidden complexity of a karst system by revealing its inner hydrological and dynamical connectivity. Hence, if one can link causal relationships to physical processes, the method should show great potential to support physically based model structure selection. We present the results of numerical experiments using karst model blocks combined in different structures to generate time series from actual rainfall series. CCM is applied between the time series to investigate whether the empirical causation detection is consistent with the hydrological connectivity suggested by the karst model.
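
    The sketch below is a hedged, simplified Convergent Cross Mapping skill estimate: the target variable is cross-mapped from the delay embedding of the source, and the skill is the correlation between observed and estimated values. The embedding dimension, delay, library length and the coupled logistic maps are all assumptions for illustration.

      import numpy as np

      def embed(x, E, tau):
          """Time-delay embedding: rows are (x[t], x[t+tau], ..., x[t+(E-1)tau])."""
          n = len(x) - (E - 1) * tau
          return np.column_stack([x[i * tau: i * tau + n] for i in range(E)])

      def ccm_skill(source, target, E=3, tau=1, lib=400):
          """Cross-map `target` from the delay embedding of `source`; return the
          correlation between observed and estimated target values."""
          M = embed(source, E, tau)[:lib]
          tgt = target[(E - 1) * tau:][:lib]
          est = np.empty(lib)
          for i in range(lib):
              d = np.linalg.norm(M - M[i], axis=1)
              d[i] = np.inf                              # exclude the point itself
              nn = np.argsort(d)[:E + 1]                 # E+1 nearest neighbours (simplex)
              w = np.exp(-d[nn] / max(d[nn][0], 1e-12))
              est[i] = np.sum(w * tgt[nn]) / np.sum(w)
          return np.corrcoef(est, tgt)[0, 1]

      # toy unidirectionally coupled logistic maps: x drives y
      n = 1000
      x, y = np.empty(n), np.empty(n)
      x[0], y[0] = 0.4, 0.2
      for t in range(n - 1):
          x[t + 1] = x[t] * (3.8 - 3.8 * x[t])
          y[t + 1] = y[t] * (3.5 - 3.5 * y[t] - 0.1 * x[t])
      skill = ccm_skill(y, x)    # cross-mapping x from y's embedding succeeds when x drives y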

  14. Novel covariance-based neutrality test of time-series data reveals asymmetries in ecological and economic systems

    DOE PAGES

    Washburne, Alex D.; Burby, Joshua W.; Lacker, Daniel; ...

    2016-09-30

    Systems as diverse as the interacting species in a community, alleles at a genetic locus, and companies in a market are characterized by competition (over resources, space, capital, etc) and adaptation. Neutral theory, built around the hypothesis that individual performance is independent of group membership, has found utility across the disciplines of ecology, population genetics, and economics, both because of the success of the neutral hypothesis in predicting system properties and because deviations from these predictions provide information about the underlying dynamics. However, most tests of neutrality are weak, based on static system properties such as species-abundance distributions or the number of singletons in a sample. Time-series data provide a window onto a system's dynamics, and should furnish tests of the neutral hypothesis that are more powerful for detecting deviations from neutrality and more informative about the type of competitive asymmetry that drives the deviation. Here, we present a neutrality test for time-series data. We apply this test to several microbial time-series and financial time-series and find that most of these systems are not neutral. Our test isolates the covariance structure of neutral competition, thus facilitating further exploration of the nature of asymmetry in the covariance structure of competitive systems. Much like neutrality tests from population genetics that use relative abundance distributions have enabled researchers to scan entire genomes for genes under selection, we anticipate our time-series test will be useful for quick significance tests of neutrality across a range of ecological, economic, and sociological systems for which time-series data are available. Future work can use our test to categorize and compare the dynamic fingerprints of particular competitive asymmetries (frequency dependence, volatility smiles, etc) to improve forecasting and management of complex adaptive systems.

  15. Novel covariance-based neutrality test of time-series data reveals asymmetries in ecological and economic systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washburne, Alex D.; Burby, Joshua W.; Lacker, Daniel

    Systems as diverse as the interacting species in a community, alleles at a genetic locus, and companies in a market are characterized by competition (over resources, space, capital, etc) and adaptation. Neutral theory, built around the hypothesis that individual performance is independent of group membership, has found utility across the disciplines of ecology, population genetics, and economics, both because of the success of the neutral hypothesis in predicting system properties and because deviations from these predictions provide information about the underlying dynamics. However, most tests of neutrality are weak, based on static system properties such as species-abundance distributions or the number of singletons in a sample. Time-series data provide a window onto a system's dynamics, and should furnish tests of the neutral hypothesis that are more powerful for detecting deviations from neutrality and more informative about the type of competitive asymmetry that drives the deviation. Here, we present a neutrality test for time-series data. We apply this test to several microbial time-series and financial time-series and find that most of these systems are not neutral. Our test isolates the covariance structure of neutral competition, thus facilitating further exploration of the nature of asymmetry in the covariance structure of competitive systems. Much like neutrality tests from population genetics that use relative abundance distributions have enabled researchers to scan entire genomes for genes under selection, we anticipate our time-series test will be useful for quick significance tests of neutrality across a range of ecological, economic, and sociological systems for which time-series data are available. Future work can use our test to categorize and compare the dynamic fingerprints of particular competitive asymmetries (frequency dependence, volatility smiles, etc) to improve forecasting and management of complex adaptive systems.

  16. Statistical Analysis of Time-Series from Monitoring of Active Volcanic Vents

    NASA Astrophysics Data System (ADS)

    Lachowycz, S.; Cosma, I.; Pyle, D. M.; Mather, T. A.; Rodgers, M.; Varley, N. R.

    2016-12-01

    Despite recent advances in the collection and analysis of time-series from volcano monitoring, and the resulting insights into volcanic processes, challenges remain in forecasting and interpreting activity from near real-time analysis of monitoring data. Statistical methods have potential to characterise the underlying structure and facilitate intercomparison of these time-series, and so inform interpretation of volcanic activity. We explore the utility of multiple statistical techniques that could be widely applicable to monitoring data, including Shannon entropy and detrended fluctuation analysis, by their application to various data streams from volcanic vents during periods of temporally variable activity. Each technique reveals changes through time in the structure of some of the data that were not apparent from conventional analysis. For example, we calculate the Shannon entropy (a measure of the randomness of a signal) of time-series from the recent dome-forming eruptions of Volcán de Colima (Mexico) and Soufrière Hills (Montserrat). The entropy of real-time seismic measurements and the count rate of certain volcano-seismic event types from both volcanoes are found to be temporally variable, with these data generally having higher entropy during periods of lava effusion and/or larger explosions. In some instances, the entropy shifts prior to or coincident with changes in seismic or eruptive activity, some of which were not clearly recognised by real-time monitoring. Comparison with other statistics demonstrates the sensitivity of the entropy to the data distribution, but also that it is distinct from conventional statistical measures such as the coefficient of variation. We conclude that each analysis technique examined could provide valuable insights for interpretation of diverse monitoring time-series.
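
    A minimal sketch of a sliding-window Shannon entropy of the kind mentioned above, using fixed histogram bins over the whole record; the window length, step, bin count and the synthetic "quiet/active" series are assumptions for illustration.

      import numpy as np

      def windowed_entropy(x, w=256, step=64, bins=16):
          """Shannon entropy of the value histogram in each sliding window,
          using fixed bin edges computed over the whole record."""
          edges = np.linspace(x.min(), x.max(), bins + 1)
          H = []
          for i in range(0, len(x) - w + 1, step):
              counts, _ = np.histogram(x[i:i + w], bins=edges)
              p = counts[counts > 0] / counts.sum()
              H.append(-np.sum(p * np.log2(p)))
          return np.array(H)

      rng = np.random.default_rng(3)
      quiet = 0.2 * rng.standard_normal(3000)                              # low-amplitude background
      active = rng.standard_normal(3000) + np.sin(np.arange(3000) / 20.0)  # more energetic phase
      H = windowed_entropy(np.concatenate([quiet, active]))                # entropy rises in the active phase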

  17. Multi-frequency complex network from time series for uncovering oil-water flow structure.

    PubMed

    Gao, Zhong-Ke; Yang, Yu-Xuan; Fang, Peng-Cheng; Jin, Ning-De; Xia, Cheng-Yi; Hu, Li-Dan

    2015-02-04

    Uncovering complex oil-water flow structure represents a challenge in diverse scientific disciplines. This challenge stimulated us to develop a new distributed conductance sensor for measuring local flow signals at different positions and then to propose a novel approach based on multi-frequency complex networks to uncover the flow structures from experimental multivariate measurements. In particular, based on the fast Fourier transform, we demonstrate how to derive a multi-frequency complex network from multivariate time series. We construct complex networks at different frequencies and then detect community structures. Our results indicate that the community structures faithfully represent the structural features of oil-water flow patterns. Furthermore, we investigate the network statistics at different frequencies for each derived network and find that the frequency clustering coefficient enables us to uncover the evolution of flow patterns and yields deep insights into the formation of flow structures. These results present a first step towards a network visualization of complex flow patterns from a community structure perspective.

  18. Nonlinear Dynamics, Poor Data, and What to Make of Them?

    NASA Astrophysics Data System (ADS)

    Ghil, M.; Zaliapin, I. V.

    2005-12-01

    The analysis of univariate or multivariate time series provides crucial information to describe, understand, and predict variability in the geosciences. The discovery and implementation of a number of novel methods for extracting useful information from time series has recently revitalized this classical field of study. Considerable progress has also been made in interpreting the information so obtained in terms of dynamical systems theory. In this talk we will describe the connections between time series analysis and nonlinear dynamics, discuss signal-to-noise enhancement, and present some of the novel methods for spectral analysis. These fall into two broad categories: (i) methods that try to ferret out regularities of the time series; and (ii) methods aimed at describing the characteristics of irregular processes. The former include singular-spectrum analysis (SSA), the multi-taper method (MTM), and the maximum-entropy method (MEM). The various steps, as well as the advantages and disadvantages of these methods, will be illustrated by their application to several important climatic time series, such as the Southern Oscillation Index (SOI), paleoclimatic time series, and instrumental temperature time series. The SOI index captures major features of interannual climate variability and is used extensively in its prediction. The other time series cover interdecadal and millennial time scales. The second category includes the calculation of fractional dimension, leading Lyapunov exponents, and Hurst exponents. More recently, multi-trend analysis (MTA), binary-decomposition analysis (BDA), and related methods have attempted to describe the structure of time series that include both regular and irregular components. Within the time available, I will try to give a feeling for how these methods work, and how well.

  19. A likelihood-based time series modeling approach for application in dendrochronology to examine the growth-climate relations and forest disturbance history.

    PubMed

    Lee, E Henry; Wickham, Charlotte; Beedlow, Peter A; Waschmann, Ronald S; Tingey, David T

    2017-10-01

    A time series intervention analysis (TSIA) of dendrochronological data to infer the tree growth-climate-disturbance relations and forest disturbance history is described. Maximum likelihood is used to estimate the parameters of a structural time series model with components for climate and forest disturbances (i.e., pests, diseases, fire). The statistical method is illustrated with a tree-ring width time series for a mature closed-canopy Douglas-fir stand on the west slopes of the Cascade Mountains of Oregon, USA that is impacted by Swiss needle cast disease caused by the foliar fungus Phaeocryptopus gaeumannii (Rohde) Petrak. The likelihood-based TSIA method is proposed for the field of dendrochronology to understand the interaction of temperature, water, and forest disturbances that are important in forest ecology and climate change studies.
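
    As a hedged sketch of a likelihood-based structural time series with an intervention regressor, in the spirit of the TSIA described above, the example below uses statsmodels' UnobservedComponents; the model components, the synthetic ring-width series, the disturbance onset and the climate covariate are all assumptions and will differ from the authors' model.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(4)
      n = 100
      climate = rng.standard_normal(n)                      # stand-in climate index
      step = (np.arange(n) >= 70).astype(float)             # hypothetical disturbance from year 70 onward
      width = 1.0 + 0.3 * climate - 0.5 * step + 0.1 * rng.standard_normal(n)

      exog = np.column_stack([climate, step])               # climate covariate + intervention regressor
      model = sm.tsa.UnobservedComponents(width, level="local level", exog=exog)
      res = model.fit(disp=False)
      print(res.params)                                     # includes estimated climate and disturbance effects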

  20. A New Hybrid-Multiscale SSA Prediction of Non-Stationary Time Series

    NASA Astrophysics Data System (ADS)

    Ghanbarzadeh, Mitra; Aminghafari, Mina

    2016-02-01

    Singular spectrum analysis (SSA) is a non-parametric method used in the prediction of non-stationary time series. It has two parameters, which are difficult to determine, and the results are very sensitive to their values. Since SSA is a deterministic-based method, it does not give good results when the time series is contaminated with a high noise level and correlated noise. Therefore, we introduce a novel method to handle these problems. It is based on the prediction of non-decimated wavelet (NDW) signals by SSA and then the prediction of residuals by wavelet regression. The advantages of our method are the automatic determination of parameters and the fact that it takes into account the stochastic structure of the time series. As shown on simulated and real data, we obtain better results than SSA, a non-parametric wavelet regression method, and the Holt-Winters method.

  1. Testing for nonlinearity in time series: The method of surrogate data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Theiler, J.; Galdrikian, B.; Longtin, A.

    1991-01-01

    We describe a statistical approach for identifying nonlinearity in time series; in particular, we want to avoid claims of chaos when simpler models (such as linearly correlated noise) can explain the data. The method requires a careful statement of the null hypothesis which characterizes a candidate linear process, the generation of an ensemble of "surrogate" data sets which are similar to the original time series but consistent with the null hypothesis, and the computation of a discriminating statistic for the original and for each of the surrogate data sets. The idea is to test the original time series against the null hypothesis by checking whether the discriminating statistic computed for the original time series differs significantly from the statistics computed for each of the surrogate sets. We present algorithms for generating surrogate data under various null hypotheses, and we show the results of numerical experiments on artificial data using correlation dimension, Lyapunov exponent, and forecasting error as discriminating statistics. Finally, we consider a number of experimental time series -- including sunspots, electroencephalogram (EEG) signals, and fluid convection -- and evaluate the statistical significance of the evidence for nonlinear structure in each case. 56 refs., 8 figs.

  2. Coil-to-coil physiological noise correlations and their impact on functional MRI time-series signal-to-noise ratio.

    PubMed

    Triantafyllou, Christina; Polimeni, Jonathan R; Keil, Boris; Wald, Lawrence L

    2016-12-01

    Physiological nuisance fluctuations ("physiological noise") are a major contribution to the time-series signal-to-noise ratio (tSNR) of functional imaging. While thermal noise correlations between array coil elements have a well-characterized effect on the image signal-to-noise ratio (SNR0), the element-to-element covariance matrix of the time-series fluctuations has not yet been analyzed. We examine this effect with a goal of ultimately improving the combination of multichannel array data. We extend the theoretical relationship between tSNR and SNR0 to include a time-series noise covariance matrix Ψt, distinct from the thermal noise covariance matrix Ψ0, and compare its structure to Ψ0 and the signal coupling matrix SS^H formed from the signal intensity vectors S. Inclusion of the measured time-series noise covariance matrix into the model relating tSNR and SNR0 improves the fit of experimental multichannel data and is shown to be distinct from Ψ0 or SS^H. Time-series noise covariances in array coils are found to differ from Ψ0 and, more surprisingly, from the signal coupling matrix SS^H. Correct characterization of the time-series noise has implications for the analysis of time-series data and for improving the coil element combination process. Magn Reson Med 76:1708-1719, 2016. © 2016 International Society for Magnetic Resonance in Medicine.
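
    The sketch below is an illustrative covariance-weighted channel combination (a standard matched-filter-style combination with weights proportional to Ψt^{-1} S), offered only to make the covariance idea concrete; it is not the authors' procedure, and the channel count, signal vector and both covariance constructions are assumptions.

      import numpy as np

      rng = np.random.default_rng(5)
      n_ch, n_t = 8, 500
      S = rng.standard_normal(n_ch)                     # hypothetical channel signal intensities
      A = rng.standard_normal((n_ch, n_ch))
      Psi_t = A @ A.T + n_ch * np.eye(n_ch)             # hypothetical time-series noise covariance (SPD)

      # correlated channel fluctuations drawn with the assumed covariance
      noise = np.linalg.cholesky(Psi_t) @ rng.standard_normal((n_ch, n_t))
      data = S[:, None] + 0.05 * noise                  # constant "signal" plus correlated fluctuations

      w = np.linalg.solve(Psi_t, S)                     # combination weights proportional to Psi_t^{-1} S
      combined = w @ data
      tsnr = combined.mean() / combined.std()           # time-series SNR of the combined channel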

  3. XVII International Conference on Hadron Spectroscopy and Structure

    NASA Astrophysics Data System (ADS)

    2017-09-01

    The Hadron 2017 Conference is the seventeenth of a series of biennial conferences started in 1985 at Maryland, USA. Its official name, XVII International Conference on Hadron Spectroscopy and Structure, includes for the first time the term structure to emphasize the importance that this issue has acquired in recent editions of the series. The aim of the conference is to provide an overview of the present status and progress in hadron structure and dynamics, as well as a preview of the forthcoming investigations. It will cover lectures on both experimental and theoretical aspects, including in particular the presentation of new results.

  4. Dynamic GSCA (Generalized Structured Component Analysis) with Applications to the Analysis of Effective Connectivity in Functional Neuroimaging Data

    ERIC Educational Resources Information Center

    Jung, Kwanghee; Takane, Yoshio; Hwang, Heungsun; Woodward, Todd S.

    2012-01-01

    We propose a new method of structural equation modeling (SEM) for longitudinal and time series data, named Dynamic GSCA (Generalized Structured Component Analysis). The proposed method extends the original GSCA by incorporating a multivariate autoregressive model to account for the dynamic nature of data taken over time. Dynamic GSCA also…

  5. 17 CFR 229.1113 - (Item 1113) Structure of the transaction.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... other structural features designed to enhance credit, facilitate the timely payment of monies due on the... and invested. Also describe the length of time cash will be held pending distributions to security... connection with these securitizations. (e) Master trusts. If one or more additional series or classes have...

  6. Extracting Leading Nonlinear Modes of Changing Climate From Global SST Time Series

    NASA Astrophysics Data System (ADS)

    Mukhin, D.; Gavrilov, A.; Loskutov, E. M.; Feigin, A. M.; Kurths, J.

    2017-12-01

    Data-driven modeling of climate requires adequate principal variables extracted from observed high-dimensional data. Constructing such variables requires finding spatial-temporal patterns that explain a substantial part of the variability and comprise all dynamically related time series from the data. The difficulties of this task arise from the nonlinearity and non-stationarity of the climate dynamical system. The nonlinearity makes linear methods of data decomposition insufficient for separating the different processes entangled in the observed time series. On the other hand, various forcings, both anthropogenic and natural, make the dynamics non-stationary, and we should be able to describe the response of the system to such forcings in order to separate the modes explaining the internal variability. The method we present is aimed at overcoming both of these problems. The method is based on the Nonlinear Dynamical Mode (NDM) decomposition [1,2], but takes into account external forcing signals. Each mode depends on hidden time series, unknown a priori, which, together with the external forcing time series, are mapped onto the data space. Finding both the hidden signals and the mapping allows us to study the evolution of the modes' structure under changing external conditions and to compare the roles of internal variability and forcing in the observed behavior. The method is used to extract the principal modes of SST variability on interannual and multidecadal time scales, accounting for external forcings such as CO2, variations in solar activity, and volcanic activity. The structure of the revealed teleconnection patterns as well as their forecast under different CO2 emission scenarios are discussed. [1] Mukhin, D., Gavrilov, A., Feigin, A., Loskutov, E., & Kurths, J. (2015). Principal nonlinear dynamical modes of climate variability. Scientific Reports, 5, 15510. [2] Gavrilov, A., Mukhin, D., Loskutov, E., Volodin, E., Feigin, A., & Kurths, J. (2016). Method for reconstructing nonlinear modes with adaptive structure from multidimensional data. Chaos: An Interdisciplinary Journal of Nonlinear Science, 26(12), 123101.

  7. Trends and structural shifts in health tourism: evidence from seasonal time-series data on health-related travel spending by Canada during 1970-2010.

    PubMed

    Loh, Chung-Ping A

    2015-05-01

    There has been a growing interest in better understanding the trends and determinants of health tourism activities. While much of the expanding literature on health tourism offers theoretical or qualitative discussion, empirical evidence has been lacking. This study employs Canada's outbound health tourism activities as an example to examine the trends in health tourism and their association with changing domestic health care market characteristics. A time-series model that accounts for potential structural changes in the trend is employed to analyze the quarterly health-related travel spending series reported in the Balance of Payments Statistics (BOPS) during 1970-2010 (n = 156). We identified a structural shift point which marks the start of an accelerated growth of health tourism and a flattened seasonality in such activities. We found that the health tourism activities of Canadian consumers increase when private investment in medical facilities declines or when the private MPI increases during the years following the structural change. We discussed the possible linkage of the structural shift to the General Agreement on Trade in Services (GATS), which went into effect in January 1995. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Numerical analysis of transient fields near thin-wire antennas and scatterers

    NASA Astrophysics Data System (ADS)

    Landt, J. A.

    1981-11-01

    Under the premise that 'accelerated charge radiates,' one would expect radiation on wire structures to occur from driving points, ends of wires, bends in wires, or locations of lumped loading. Here, this premise is investigated in a series of numerical experiments. The numerical procedure is based on a moment-method solution of a thin-wire time-domain electric-field integral equation. The fields in the vicinity of wire structures are calculated for short impulsive-type excitations, and are viewed in a series of time sequences or snapshots. For these excitations, the fields are spatially limited in the radial dimension, and expand in spheres centered about points of radiation. These centers of radiation coincide with the above list of possible source regions. Time retardation permits these observations to be made clearly in the time domain, similar to time-range gating. In addition to providing insight into transient radiation processes, these studies show that the direction of energy flow is not always defined by Poynting's vector near wire structures.

  9. Structural Equation Modeling of Multivariate Time Series

    ERIC Educational Resources Information Center

    du Toit, Stephen H. C.; Browne, Michael W.

    2007-01-01

    The covariance structure of a vector autoregressive process with moving average residuals (VARMA) is derived. It differs from other available expressions for the covariance function of a stationary VARMA process and is compatible with current structural equation methodology. Structural equation modeling programs, such as LISREL, may therefore be…

  10. miniSEED: The Backbone Data Format for Seismological Time Series

    NASA Astrophysics Data System (ADS)

    Ahern, T. K.; Benson, R. B.; Trabant, C. M.

    2017-12-01

    In 1987, the International Federation of Digital Seismograph Networks (FDSN) adopted the Standard for the Exchange of Earthquake Data (SEED) format to be used for data archiving and exchange of seismological time series data. Since that time, the format has evolved to accommodate new capabilities and features. For example, a notable change in 1992 allowed the format, which includes both the comprehensive metadata and the time series samples, to be used in two additional forms: 1) a container for metadata only, called "dataless SEED", and 2) a stand-alone structure for time series, called "miniSEED". While specifically designed for seismological data and related metadata, this format has proven to be a useful format for a wide variety of geophysical time series data. Many FDSN data centers now store temperature, pressure, infrasound, tilt and other time series measurements in this internationally used format. Since April 2016, members of the FDSN have been in discussions to design a next generation miniSEED format to accommodate current and future needs, to further generalize the format, and to address a number of historical problems or limitations. We believe the correct approach is to simplify the header, allow for arbitrary header additions, expand the current identifiers, and allow for anticipated future identifiers which are currently unknown. We also believe the primary goal of the format is the efficient archiving, selection and exchange of time series data. By focusing on these goals we avoid trying to generalize the format too broadly into specialized areas such as efficient, low-latency delivery, or including unbounded non-time series data. Our presentation will provide an overview of this format and highlight its most valuable characteristics for time series data from any geophysical domain or beyond.

  11. Identifying the scale-dependent motifs in atmospheric surface layer by ordinal pattern analysis

    NASA Astrophysics Data System (ADS)

    Li, Qinglei; Fu, Zuntao

    2018-07-01

    Ramp-like structures in various atmospheric surface layer time series have long been studied, but the presence of motifs with a finer scale embedded within larger-scale ramp-like structures has largely been overlooked in the reported literature. Here a novel, objective and well-adapted methodology, ordinal pattern analysis, is adopted to study the finer-scaled motifs in atmospheric boundary-layer (ABL) time series. The studies show that the motifs represented by different ordinal patterns exhibit clustering properties, and 6 dominant motifs out of the 24 possible motifs account for about 45% of the time series at particular scales, which indicates the higher contribution of finer-scale motifs to the series. Further studies indicate that motif statistics are similar for stable and unstable conditions at larger scales, but large discrepancies are found at smaller scales, and the frequencies of motifs "1234" and/or "4321" are somewhat higher under stable conditions than under unstable conditions. Under stable conditions, the occurrence frequencies of motifs "1234" and "4321" change greatly: the frequency of motif "1234" decreases from nearly 24% to 4.5% as the scale factor increases, and the frequency of motif "4321" changes nonlinearly with increasing scale. These large differences in how the dominant motifs change with scale can be taken as an indicator to quantify flow structure changes under different stability conditions, and a motif entropy can be defined from only the 6 dominant motifs to quantify this time-scale-independent property of the motifs. All these results suggest that the scale of the finer-scale motifs should be carefully taken into consideration in the interpretation of turbulent coherent structures.
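
    A minimal sketch of order-4 ordinal-pattern (motif) counting at a given coarse-graining scale, in the spirit of the analysis above; the synthetic series, the subsampling-based coarse-graining and the scale value are assumptions. Motif "1234" corresponds to a monotonically increasing window and "4321" to a monotonically decreasing one.

      import numpy as np
      from collections import Counter
      from math import log

      def motif_frequencies(x, order=4, scale=1):
          """Relative frequencies of order-4 ordinal motifs at a given scale
          (coarse-graining here is plain subsampling)."""
          xs = x[::scale]
          labels = []
          for i in range(len(xs) - order + 1):
              ranks = np.argsort(np.argsort(xs[i:i + order])) + 1   # increasing window -> "1234"
              labels.append("".join(map(str, ranks)))
          counts = Counter(labels)
          total = sum(counts.values())
          return {k: v / total for k, v in counts.items()}

      x = np.random.default_rng(6).standard_normal(20000)          # stand-in for a velocity series
      freqs = motif_frequencies(x, order=4, scale=2)
      top6 = sorted(freqs.items(), key=lambda kv: -kv[1])[:6]      # the 6 dominant motifs
      motif_entropy = -sum(p * log(p) for _, p in top6)            # entropy over the dominant motifs only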

  12. The matrix exponential in transient structural analysis

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon

    1987-01-01

    The primary usefulness of the presented theory lies in the ability to represent the effects of high-frequency linear response accurately, without requiring very small time steps in the analysis of dynamic response. The matrix exponential contains a series approximation to the dynamic model. However, unlike the usual analysis procedure, which truncates the high-frequency response, the approximation in the exponential matrix solution is in the time domain. Truncating the series solution for the matrix exponential makes the solution inaccurate after a certain time. Yet, up to that time the solution is extremely accurate, including all high-frequency effects. By taking finite time increments, the exponential matrix solution can compute the response very accurately. Use of the exponential matrix in structural dynamics is demonstrated by simulating the free-vibration response of multi-degree-of-freedom models of cantilever beams.
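
    The sketch below is a hedged illustration of matrix-exponential time stepping for a small multi-degree-of-freedom model: the second-order equations of motion are written in first-order state-space form and advanced with expm(A*dt) over finite time increments. The 3-DOF spring-mass chain, the stiffness values and the time step are assumptions, not the report's model.

      import numpy as np
      from scipy.linalg import expm

      # a simple 3-DOF spring-mass chain in free vibration (illustrative values)
      M = np.eye(3)
      K = 1e3 * np.array([[ 2.0, -1.0,  0.0],
                          [-1.0,  2.0, -1.0],
                          [ 0.0, -1.0,  1.0]])
      n = M.shape[0]
      A = np.block([[np.zeros((n, n)), np.eye(n)],
                    [-np.linalg.solve(M, K), np.zeros((n, n))]])   # first-order state matrix

      dt = 1e-3
      Phi = expm(A * dt)                     # transition matrix over one finite time increment

      state = np.zeros(2 * n)
      state[0] = 0.01                        # initial displacement of the first mass
      history = [state.copy()]
      for _ in range(2000):
          state = Phi @ state                # exact propagation of the linear model per step
          history.append(state.copy())
      history = np.array(history)            # columns: displacements then velocities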

  13. Behavioral pattern identification for structural health monitoring in complex systems

    NASA Astrophysics Data System (ADS)

    Gupta, Shalabh

    Estimation of structural damage and quantification of structural integrity are critical for safe and reliable operation of human-engineered complex systems, such as electromechanical, thermofluid, and petrochemical systems. Damage due to fatigue crack is one of the most commonly encountered sources of structural degradation in mechanical systems. Early detection of fatigue damage is essential because the resulting structural degradation could potentially cause catastrophic failures, leading to loss of expensive equipment and human life. Therefore, for reliable operation and enhanced availability, it is necessary to develop capabilities for prognosis and estimation of impending failures, such as the onset of widespread fatigue crack damage in mechanical structures. This dissertation presents information-based online sensing of fatigue damage using the analytical tools of symbolic time series analysis (STSA). Anomaly detection using STSA is a pattern recognition method that has been recently developed based upon a fixed-structure, fixed-order Markov chain. The analysis procedure is built upon the principles of Symbolic Dynamics, Information Theory and Statistical Pattern Recognition. The dissertation demonstrates real-time fatigue damage monitoring based on time series data of ultrasonic signals. Statistical pattern changes are measured using STSA to monitor the evolution of fatigue damage. Real-time anomaly detection is presented as a solution to the forward (analysis) problem and the inverse (synthesis) problem. (1) The forward problem - the primary objective of the forward problem is identification of the statistical changes in the time series data of ultrasonic signals due to gradual evolution of fatigue damage. (2) The inverse problem - the objective of the inverse problem is to infer the anomalies from the observed time series data in real time based on the statistical information generated during the forward problem. A computer-controlled special-purpose fatigue test apparatus, equipped with multiple sensing devices (e.g., ultrasonics and optical microscope) for damage analysis, has been used to experimentally validate the STSA method for early detection of anomalous behavior. The sensor information is integrated with a software module consisting of the STSA algorithm for real-time monitoring of fatigue damage. Experiments have been conducted under different loading conditions on specimens constructed from the ductile aluminium alloy 7075-T6. The dissertation has also investigated the application of the STSA method for early detection of anomalies in other engineering disciplines. Two primary applications include combustion instability in a generic thermal pulse combustor model and the whirling phenomenon in a typical misaligned shaft.

  14. Exploring total cardiac variability in healthy and pathophysiological subjects using improved refined multiscale entropy.

    PubMed

    Marwaha, Puneeta; Sunkaria, Ramesh Kumar

    2017-02-01

    Multiscale entropy (MSE) and refined multiscale entropy (RMSE) techniques are widely used to evaluate the complexity of a time series across multiple time scales 't'. Both techniques, at certain time scales (sometimes at all time scales, in the case of RMSE), assign higher entropy to the HRV time series of certain pathologies than to those of healthy subjects, and to their corresponding randomized surrogate time series. This incorrect assessment of signal complexity may be due to the fact that these techniques suffer from the following limitations: (1) the threshold value 'r' is updated as a function of the long-term standard deviation and hence is unable to explore the short-term variability as well as the substantial variability inherent in beat-to-beat fluctuations of long-term HRV time series; (2) in RMSE, the entropy values assigned to different filtered scaled time series are the result of changes in variance, but do not completely reflect the real structural organization inherent in the original time series. In the present work, we propose an improved RMSE (I-RMSE) technique by introducing a new procedure to set the threshold value that takes into account the period-to-period variability inherent in a signal, and we evaluate it on simulated and real HRV databases. The proposed I-RMSE assigns higher entropy to age-matched healthy subjects than to patients suffering from atrial fibrillation, congestive heart failure, sudden cardiac death and diabetes mellitus, at all time scales. The results strongly support the reduction in complexity of HRV time series in the female group, in the elderly, in patients suffering from severe cardiovascular and non-cardiovascular diseases, and in their corresponding surrogate time series.
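
    As a hedged sketch of the coarse-graining plus sample-entropy building blocks of the MSE/RMSE family discussed above, the example below recomputes the tolerance as r = 0.15 times the standard deviation of each coarse-grained series (an RMSE-style convention), not the authors' improved period-to-period threshold; the toy RR series and scale range are assumptions.

      import numpy as np

      def coarse_grain(x, scale):
          n = len(x) // scale
          return x[:n * scale].reshape(n, scale).mean(axis=1)

      def sample_entropy(x, m=2, r=None):
          """SampEn with Chebyshev distance; r defaults to 0.15 * SD of the input."""
          r = 0.15 * np.std(x) if r is None else r
          N = len(x)

          def count(mm):
              # compare the first N-m templates of length mm
              T = np.lib.stride_tricks.sliding_window_view(x, mm)[: N - m]
              c = 0
              for i in range(len(T) - 1):
                  d = np.max(np.abs(T[i + 1:] - T[i]), axis=1)
                  c += np.sum(d <= r)
              return c

          B, A = count(m), count(m + 1)
          return -np.log(A / B) if A > 0 and B > 0 else np.inf

      rr = np.random.default_rng(7).standard_normal(3000)   # stand-in for an RR interval series
      mse_curve = [sample_entropy(coarse_grain(rr, s)) for s in range(1, 11)]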

  15. Time reversibility from visibility graphs of nonstationary processes

    NASA Astrophysics Data System (ADS)

    Lacasa, Lucas; Flanagan, Ryan

    2015-08-01

    Visibility algorithms are a family of methods to map time series into networks, with the aim of describing the structure of time series and their underlying dynamical properties in graph-theoretical terms. Here we explore some properties of both natural and horizontal visibility graphs associated to several nonstationary processes, and we pay particular attention to their capacity to assess time irreversibility. Nonstationary signals are (infinitely) irreversible by definition (independently of whether the process is Markovian or producing entropy at a positive rate), and thus the link between entropy production and time series irreversibility has only been explored in nonequilibrium stationary states. Here we show that the visibility formalism naturally induces a new working definition of time irreversibility, which allows us to quantify several degrees of irreversibility for stationary and nonstationary series, yielding finite values that can be used to efficiently assess the presence of memory and off-equilibrium dynamics in nonstationary processes without the need to differentiate or detrend them. We provide rigorous results complemented by extensive numerical simulations on several classes of stochastic processes.
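
    A minimal sketch of the horizontal visibility approach to time irreversibility described above: build the HVG, split each node's degree into in- (past) and out- (future) links, and compare the two degree distributions with a Kullback-Leibler divergence. The toy nonstationary random walk and the divergence smoothing constant are assumptions.

      import numpy as np
      from collections import Counter

      def hvg_in_out_degrees(x):
          """Horizontal visibility graph: i and j are linked if every value in
          between lies below both x[i] and x[j]; links are directed forward in time."""
          n = len(x)
          kin, kout = np.zeros(n, dtype=int), np.zeros(n, dtype=int)
          for i in range(n - 1):
              max_between = -np.inf
              for j in range(i + 1, n):
                  if x[i] > max_between and x[j] > max_between:
                      kout[i] += 1
                      kin[j] += 1
                  max_between = max(max_between, x[j])
                  if x[j] >= x[i]:        # nothing beyond j can see i horizontally
                      break
          return kin, kout

      def degree_dist(k):
          c = Counter(k.tolist())
          total = sum(c.values())
          return {deg: cnt / total for deg, cnt in c.items()}

      def kld(p, q, eps=1e-12):
          return sum(pv * np.log(pv / q.get(deg, eps)) for deg, pv in p.items())

      x = np.cumsum(np.random.default_rng(8).standard_normal(2000))   # nonstationary random walk
      kin, kout = hvg_in_out_degrees(x)
      irreversibility = kld(degree_dist(kout), degree_dist(kin))      # larger => stronger time asymmetry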

  16. Inflow forecasting model construction with stochastic time series for coordinated dam operation

    NASA Astrophysics Data System (ADS)

    Kim, T.; Jung, Y.; Kim, H.; Heo, J. H.

    2014-12-01

    Dam inflow forecasting is one of the most important tasks in dam operation for effective water resources management and control. In general, dam inflow forecasting with a stochastic time series model is applicable when the data are stationary, because most stochastic processes are based on stationarity. However, recent hydrological data no longer satisfy stationarity because of climate change. Therefore, a stochastic time series model that can account for seasonality and trend in the data, the SARIMAX (Seasonal AutoRegressive Integrated Moving Average with eXogenous variables) model, was constructed in this study. The SARIMAX model can improve on the performance of a stochastic time series model by considering nonstationary components and an external variable such as precipitation. For application, models were constructed for four coordinated dams on the Han River in South Korea using monthly time series data. As a result, the models for each dam have similar performance, and it would be possible to use the model for coordinated dam operation. Acknowledgement: This research was supported by a grant, 'Establishing Active Disaster Management System of Flood Control Structures by using 3D BIM Technique' [NEMA-NH-12-57], from the Natural Hazard Mitigation Research Group, National Emergency Management Agency of Korea.
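
    A hedged sketch of fitting a seasonal ARIMA model with an exogenous precipitation regressor, in the spirit of the SARIMAX model described above, using statsmodels; the (p,d,q)(P,D,Q,s) orders and the synthetic monthly inflow and precipitation series are assumptions for illustration only.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(9)
      n = 240                                               # 20 years of monthly data
      month = np.arange(n)
      precip = 50 + 30 * np.sin(2 * np.pi * month / 12) + 5 * rng.standard_normal(n)
      inflow = 0.8 * precip + 10 * np.sin(2 * np.pi * month / 12) + 8 * rng.standard_normal(n)

      model = sm.tsa.SARIMAX(inflow, exog=precip.reshape(-1, 1),
                             order=(1, 0, 1), seasonal_order=(1, 1, 1, 12))
      res = model.fit(disp=False)
      # one-year-ahead forecast; last year's precipitation reused as a stand-in future regressor
      forecast = res.get_forecast(steps=12, exog=precip[-12:].reshape(-1, 1))
      print(forecast.predicted_mean[:3])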

  17. 'TIME': A Web Application for Obtaining Insights into Microbial Ecology Using Longitudinal Microbiome Data.

    PubMed

    Baksi, Krishanu D; Kuntal, Bhusan K; Mande, Sharmila S

    2018-01-01

    Realization of the importance of microbiome studies, coupled with the decreasing sequencing cost, has led to the exponential growth of microbiome data. A number of these microbiome studies have focused on understanding changes in the microbial community over time. Such longitudinal microbiome studies have the potential to offer unique insights pertaining to microbial social networks as well as their responses to perturbations. In this communication, we introduce a web based framework called 'TIME' ('Temporal Insights into Microbial Ecology'), developed specifically to obtain meaningful insights from microbiome time series data. The TIME web-server is designed to accept a wide range of popular formats as input, with options to preprocess and filter the data. Multiple samples, defined by a series of longitudinal time points along with their metadata information, can be compared in order to interactively visualize the temporal variations. In addition to standard microbiome data analytics, the web server implements popular time series analysis methods like dynamic time warping, Granger causality and the Dickey-Fuller test to generate interactive layouts for facilitating easy biological inferences. Apart from this, a new metric for comparing metagenomic time series data has been introduced to effectively visualize the similarities and differences in the trends of the resident microbial groups. Augmenting the visualizations with stationarity information pertaining to the microbial groups is used to predict microbial competition as well as community structure. Additionally, the 'causality graph analysis' module incorporated in TIME allows prediction of taxa that might have a higher influence on community structure in different conditions. TIME also allows users to easily identify potential taxonomic markers from a longitudinal microbiome analysis. We illustrate the utility of the web-server features on a few published time series microbiome datasets and demonstrate the ease with which they can be used to perform complex analyses.

  18. Parameter motivated mutual correlation analysis: Application to the study of currency exchange rates based on intermittency parameter and Hurst exponent

    NASA Astrophysics Data System (ADS)

    Cristescu, Constantin P.; Stan, Cristina; Scarlat, Eugen I.; Minea, Teofil; Cristescu, Cristina M.

    2012-04-01

    We present a novel method for the parameter-oriented analysis of mutual correlation between independent time series or between equivalent structures such as ordered data sets. The proposed method is based on the sliding window technique, defines a new type of correlation measure, and can be applied to time series from all domains of science and technology, whether experimental or simulated. A specific parameter that characterizes the time series is computed for each window, and a cross-correlation analysis is carried out on the set of values obtained for the time series under investigation. We apply this method to the study of some daily currency exchange rates from the point of view of the Hurst exponent and the intermittency parameter. Interesting correlation relationships are revealed and a tentative crisis prediction is presented.

  19. Monitoring Farmland Loss Caused by Urbanization in Beijing from Modis Time Series Using Hierarchical Hidden Markov Model

    NASA Astrophysics Data System (ADS)

    Yuan, Y.; Meng, Y.; Chen, Y. X.; Jiang, C.; Yue, A. Z.

    2018-04-01

    In this study, we propose a method to map urban encroachment onto farmland using satellite image time series (SITS) based on the hierarchical hidden Markov model (HHMM). In this method, the farmland change process is decomposed into three hierarchical levels, i.e., the land cover level, the vegetation phenology level, and the SITS level. A three-level HHMM is then constructed to model the multi-level semantic structure of the farmland change process. Once the HHMM is established, a change from farmland to built-up land can be detected by inferring the underlying state sequence that is most likely to generate the input time series. The performance of the method is evaluated on MODIS time series in Beijing. Results on both simulated and real datasets demonstrate that our method improves the change detection accuracy compared with the HMM-based method.

  20. Credit Default Swaps networks and systemic risk

    PubMed Central

    Puliga, Michelangelo; Caldarelli, Guido; Battiston, Stefano

    2014-01-01

    Credit Default Swaps (CDS) spreads should reflect default risk of the underlying corporate debt. Actually, it has been recognized that CDS spread time series did not anticipate but only followed the increasing risk of default before the financial crisis. In principle, the network of correlations among CDS spread time series could at least display some form of structural change to be used as an early warning of systemic risk. Here we study a set of 176 CDS time series of financial institutions from 2002 to 2011. Networks are constructed in various ways, some of which display structural change at the onset of the credit crisis of 2008, but never before. By taking these networks as a proxy of interdependencies among financial institutions, we run stress-test based on Group DebtRank. Systemic risk before 2008 increases only when incorporating a macroeconomic indicator reflecting the potential losses of financial assets associated with house prices in the US. This approach indicates a promising way to detect systemic instabilities. PMID:25366654

  1. Credit Default Swaps networks and systemic risk.

    PubMed

    Puliga, Michelangelo; Caldarelli, Guido; Battiston, Stefano

    2014-11-04

    Credit Default Swaps (CDS) spreads should reflect default risk of the underlying corporate debt. Actually, it has been recognized that CDS spread time series did not anticipate but only followed the increasing risk of default before the financial crisis. In principle, the network of correlations among CDS spread time series could at least display some form of structural change to be used as an early warning of systemic risk. Here we study a set of 176 CDS time series of financial institutions from 2002 to 2011. Networks are constructed in various ways, some of which display structural change at the onset of the credit crisis of 2008, but never before. By taking these networks as a proxy of interdependencies among financial institutions, we run stress-test based on Group DebtRank. Systemic risk before 2008 increases only when incorporating a macroeconomic indicator reflecting the potential losses of financial assets associated with house prices in the US. This approach indicates a promising way to detect systemic instabilities.

  2. Credit Default Swaps networks and systemic risk

    NASA Astrophysics Data System (ADS)

    Puliga, Michelangelo; Caldarelli, Guido; Battiston, Stefano

    2014-11-01

    Credit Default Swaps (CDS) spreads should reflect default risk of the underlying corporate debt. Actually, it has been recognized that CDS spread time series did not anticipate but only followed the increasing risk of default before the financial crisis. In principle, the network of correlations among CDS spread time series could at least display some form of structural change to be used as an early warning of systemic risk. Here we study a set of 176 CDS time series of financial institutions from 2002 to 2011. Networks are constructed in various ways, some of which display structural change at the onset of the credit crisis of 2008, but never before. By taking these networks as a proxy of interdependencies among financial institutions, we run stress-test based on Group DebtRank. Systemic risk before 2008 increases only when incorporating a macroeconomic indicator reflecting the potential losses of financial assets associated with house prices in the US. This approach indicates a promising way to detect systemic instabilities.

  3. Chaotic examination

    NASA Astrophysics Data System (ADS)

    Bildirici, Melike; Sonustun, Fulya Ozaksoy; Sonustun, Bahri

    2018-01-01

    In the context of chaos theory, concepts such as complexity, determinism, quantum mechanics, relativity, multiple equilibria, persistent instability, nonlinearity, heterogeneous agents and irregularity have been widely examined in economics. Linear models are insufficient for analyzing the unpredictable, irregular and noncyclical oscillations of economies, and for predicting bubbles, financial crises and business cycles in financial markets. Economists have therefore attached great importance to appropriate tools for modelling the non-linear dynamical structures and chaotic behaviors of economies, especially in macroeconomics and financial economics. In this paper, we aim to model the chaotic structure of exchange rates (USD-TL and EUR-TL). To determine non-linear patterns in the selected time series, daily returns of the exchange rates were tested with the BDS test over the period from January 01, 2002 to May 11, 2017, which covers the era after the 2001 financial crisis. After establishing the non-linear structure of the selected time series, their chaotic character over the selected period was examined by means of Lyapunov exponents. The findings verify the existence of chaotic structure in the exchange rate returns over the analyzed period.

  4. Two cloud-based cues for estimating scene structure and camera calibration.

    PubMed

    Jacobs, Nathan; Abrams, Austin; Pless, Robert

    2013-10-01

    We describe algorithms that use cloud shadows as a form of stochastically structured light to support 3D scene geometry estimation. Taking video captured from a static outdoor camera as input, we use the relationship of the time series of intensity values between pairs of pixels as the primary input to our algorithms. We describe two cues that relate the 3D distance between a pair of points to the pair of intensity time series. The first cue results from the fact that two pixels that are nearby in the world are more likely to be under a cloud at the same time than two distant points. We describe methods for using this cue to estimate focal length and scene structure. The second cue is based on the motion of cloud shadows across the scene; this cue results in a set of linear constraints on scene structure. These constraints have an inherent ambiguity, which we show how to overcome by combining the cloud motion cue with the spatial cue. We evaluate our method on several time lapses of real outdoor scenes.

  5. Revisiting the European sovereign bonds with a permutation-information-theory approach

    NASA Astrophysics Data System (ADS)

    Fernández Bariviera, Aurelio; Zunino, Luciano; Guercio, María Belén; Martinez, Lisana B.; Rosso, Osvaldo A.

    2013-12-01

    In this paper we study the evolution of the informational efficiency in its weak form for seventeen European sovereign bond time series. We aim to assess the impact of two specific economic situations on the hypothetical random behavior of these time series: the establishment of a common currency and a wide and deep financial crisis. In order to evaluate the informational efficiency we use permutation quantifiers derived from information theory. Specifically, time series are ranked according to two metrics that measure the intrinsic structure of their correlations: permutation entropy and permutation statistical complexity. These measures provide the rectangular coordinates of the complexity-entropy causality plane; the planar location of the time series in this representation space reveals the degree of informational efficiency. According to our results, the currency union contributed to homogenizing the stochastic characteristics of the time series and synchronized their random behavior. Additionally, the 2008 financial crisis uncovered differences within the apparently homogeneous European sovereign markets and revealed country-specific characteristics that were partially hidden during the monetary union heyday.
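
    One of the two permutation quantifiers mentioned above can be sketched as follows: ordinal-pattern probabilities yield a normalized permutation entropy, one coordinate of the complexity-entropy causality plane (the statistical complexity coordinate is omitted here). The embedding dimension and delay below are illustrative assumptions, not the values used in the paper.

```python
import numpy as np
from itertools import permutations
from math import factorial

def ordinal_probabilities(x, d=4, tau=1):
    """Empirical probabilities of ordinal patterns of length d with delay tau."""
    patterns = {p: 0 for p in permutations(range(d))}
    n = len(x) - (d - 1) * tau
    for i in range(n):
        window = x[i:i + (d - 1) * tau + 1:tau]
        patterns[tuple(np.argsort(window))] += 1
    p = np.array(list(patterns.values()), dtype=float)
    return p / p.sum()

def permutation_entropy(x, d=4, tau=1):
    """Normalized permutation entropy in [0, 1]; values near 1 indicate noise-like series."""
    p = ordinal_probabilities(x, d, tau)
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(factorial(d))

rng = np.random.default_rng(1)
print("white noise :", round(permutation_entropy(rng.normal(size=5000)), 3))
print("sine wave   :", round(permutation_entropy(np.sin(np.linspace(0, 60, 5000))), 3))
```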

  6. Prony series spectra of structural relaxation in N-BK7 for finite element modeling.

    PubMed

    Koontz, Erick; Blouin, Vincent; Wachtel, Peter; Musgraves, J David; Richardson, Kathleen

    2012-12-20

    Structural relaxation behavior of N-BK7 glass was characterized at temperatures 20 °C above and below T12 for this glass, using a thermomechanical analyzer (TMA). T12 is a characteristic temperature corresponding to a viscosity of 10¹² Pa·s. The glass was subjected to quick temperature down-jumps preceded and followed by long isothermal holds. The exponential-like decay of the sample height was recorded and fitted using a unique Prony series method. The result of this method was a plot of the fit parameters revealing the presence of four distinct peaks or distributions of relaxation times. The number of relaxation times decreased as final test temperature was increased. The relaxation times did not shift significantly with changing temperature; however, the Prony weight terms varied essentially linearly with temperature. It was also found that the structural relaxation behavior of the glass trended toward single exponential behavior at temperatures above the testing range. The result of the analysis was a temperature-dependent Prony series model that can be used in finite element modeling of glass behavior in processes such as precision glass molding (PGM).
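
    A hedged sketch of the general idea, fitting an exponential-like relaxation curve with a Prony series over a fixed grid of candidate relaxation times and solving for the weights with non-negative least squares, is shown below. The relaxation-time grid and the synthetic curve are assumptions; this is not the paper's calibrated N-BK7 model.

```python
import numpy as np
from scipy.optimize import nnls

def fit_prony(t, h, taus):
    """Least-squares fit of h(t) ~ a0 + sum_k w_k * exp(-t / tau_k), with w_k >= 0."""
    # Design matrix: a constant column plus one exponential per candidate relaxation time.
    A = np.column_stack([np.ones_like(t)] + [np.exp(-t / tau) for tau in taus])
    coeffs, _ = nnls(A, h)
    return coeffs[0], coeffs[1:]              # equilibrium term and Prony weights

# Synthetic isothermal relaxation curve with two "true" relaxation times.
t = np.linspace(0, 2000, 400)
h = 1.0 + 0.6 * np.exp(-t / 50) + 0.3 * np.exp(-t / 600)
taus = np.logspace(0, 4, 30)                  # candidate relaxation-time grid (log-spaced)
a0, w = fit_prony(t, h, taus)
print("equilibrium term:", round(a0, 3))
print("dominant relaxation times:", taus[w > 0.05 * w.max()])
```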

  7. The combination of circle topology and leaky integrator neurons remarkably improves the performance of echo state network on time series prediction.

    PubMed

    Xue, Fangzheng; Li, Qian; Li, Xiumin

    2017-01-01

    Recently, the echo state network (ESN) has attracted a great deal of attention due to its high accuracy and efficient learning performance. Compared with the traditional random structure and classical sigmoid units, a simple circle topology and leaky integrator neurons offer advantages for the reservoir computing of an ESN. In this paper, we propose a new model of ESN with both circle reservoir structure and leaky integrator units. By comparing the prediction capability on Mackey-Glass chaotic time series of four ESN models: classical ESN, circle ESN, traditional leaky integrator ESN, circle leaky integrator ESN, we find that our circle leaky integrator ESN shows significantly better performance than other ESNs with roughly 2 orders of magnitude reduction of the predictive error. Moreover, this model has stronger ability to approximate nonlinear dynamics and resist noise than conventional ESN and ESN with only simple circle structure or leaky integrator neurons. Our results show that the combination of circle topology and leaky integrator neurons can remarkably increase dynamical diversity and meanwhile decrease the correlation of reservoir states, which contributes to the significant improvement in the computational performance of the echo state network on time series prediction.
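
    A minimal sketch of an ESN with a circle (ring) reservoir topology and leaky-integrator units, trained with a ridge-regression readout on a toy one-step-ahead task, is given below. The reservoir size, leak rate, spectral radius, ridge penalty, and the toy signal are assumptions; the Mackey-Glass benchmark and the comparison across the four ESN variants are not reproduced.

```python
import numpy as np

def circle_leaky_esn(u, y, n_res=200, leak=0.3, rho=0.9, ridge=1e-6, seed=0):
    """One-step-ahead prediction with a circle-topology, leaky-integrator ESN."""
    rng = np.random.default_rng(seed)
    # Circle (ring) reservoir: each unit feeds only its neighbour, with uniform weight rho.
    W = np.zeros((n_res, n_res))
    W[np.arange(1, n_res), np.arange(n_res - 1)] = rho
    W[0, n_res - 1] = rho
    W_in = rng.uniform(-0.5, 0.5, size=n_res)

    X, x = np.zeros((len(u), n_res)), np.zeros(n_res)
    for t in range(len(u)):
        # Leaky-integrator update: blend the previous state with the nonlinear drive.
        x = (1 - leak) * x + leak * np.tanh(W @ x + W_in * u[t])
        X[t] = x

    warmup = 100                                   # discard transient reservoir states
    A, b = X[warmup:], y[warmup:]
    w_out = np.linalg.solve(A.T @ A + ridge * np.eye(n_res), A.T @ b)  # ridge readout
    return X @ w_out

# Toy task: predict the next value of a noisy sine wave.
t = np.arange(3000)
s = np.sin(0.05 * t) + 0.05 * np.random.default_rng(1).normal(size=t.size)
pred = circle_leaky_esn(s[:-1], s[1:])
rmse = np.sqrt(np.mean((pred[2000:] - s[1:][2000:]) ** 2))
print("test RMSE:", round(float(rmse), 4))
```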

  8. Statistical Analysis of Categorical Time Series of Atmospheric Elementary Circulation Mechanisms - Dzerdzeevski Classification for the Northern Hemisphere

    PubMed Central

    Brenčič, Mihael

    2016-01-01

    Northern hemisphere elementary circulation mechanisms, defined with the Dzerdzeevski classification and published on a daily basis from 1899–2012, are analysed with statistical methods as continuous categorical time series. The classification consists of 41 elementary circulation mechanisms (ECM), which are assigned to calendar days. Empirical marginal probabilities of each ECM were determined. Seasonality and the periodicity effect were investigated with moving dispersion filters and a randomisation procedure on the ECM categories as well as with the time analyses of the ECM mode. The time series were determined as being non-stationary with strong time-dependent trends. During the investigated period, periodicity interchanges with periods when no seasonality is present. In the time series structure, the strongest division is visible at the milestone of 1986, showing that the atmospheric circulation pattern reflected in the ECM has significantly changed. This change is a result of the change in the frequency of ECM categories; before 1986, the appearance of ECM was more diverse, and afterwards fewer ECMs appear. The statistical approach applied to the categorical climatic time series opens up new potential insight into climate variability and change studies that have to be performed in the future. PMID:27116375

  9. Statistical Analysis of Categorical Time Series of Atmospheric Elementary Circulation Mechanisms - Dzerdzeevski Classification for the Northern Hemisphere.

    PubMed

    Brenčič, Mihael

    2016-01-01

    Northern hemisphere elementary circulation mechanisms, defined with the Dzerdzeevski classification and published on a daily basis from 1899-2012, are analysed with statistical methods as continuous categorical time series. The classification consists of 41 elementary circulation mechanisms (ECM), which are assigned to calendar days. Empirical marginal probabilities of each ECM were determined. Seasonality and the periodicity effect were investigated with moving dispersion filters and a randomisation procedure on the ECM categories as well as with the time analyses of the ECM mode. The time series were determined as being non-stationary with strong time-dependent trends. During the investigated period, periodicity interchanges with periods when no seasonality is present. In the time series structure, the strongest division is visible at the milestone of 1986, showing that the atmospheric circulation pattern reflected in the ECM has significantly changed. This change is a result of the change in the frequency of ECM categories; before 1986, the appearance of ECM was more diverse, and afterwards fewer ECMs appear. The statistical approach applied to the categorical climatic time series opens up new potential insight into climate variability and change studies that have to be performed in the future.

  10. Modelling spatiotemporal change using multidimensional arrays

    NASA Astrophysics Data System (ADS)

    Lu, Meng; Appel, Marius; Pebesma, Edzer

    2017-04-01

    The large variety of remote sensors, model simulations, and in-situ records provides great opportunities to model environmental change. The massive amount of high-dimensional data calls for methods to integrate data from various sources and to analyse spatiotemporal and thematic information jointly. An array is a collection of elements ordered and indexed in arbitrary dimensions, which naturally represents spatiotemporal phenomena that are identified by their geographic locations and recording time. In addition, array regridding (e.g., resampling, down-/up-scaling), dimension reduction, and spatiotemporal statistical algorithms are readily applicable to arrays. However, the role of arrays in big geoscientific data analysis has not been systematically studied: How can arrays discretise continuous spatiotemporal phenomena? How can arrays facilitate the extraction of multidimensional information? How can arrays provide a clean, scalable and reproducible change modelling process that is communicable between mathematicians, computer scientists, Earth system scientists and stakeholders? This study emphasises detecting spatiotemporal change using satellite image time series. Current change detection methods using satellite image time series commonly analyse data in separate steps: 1) forming a vegetation index, 2) conducting time series analysis on each pixel, and 3) post-processing and mapping time series analysis results, which does not consider spatiotemporal correlations and ignores much of the spectral information. Multidimensional information can be better extracted by jointly considering spatial, spectral, and temporal information. To approach this goal, we use principal component analysis to extract multispectral information and spatial autoregressive models to account for spatial correlation in residual-based time series structural change modelling. We also discuss the potential of multivariate non-parametric time series structural change methods, hierarchical modelling, and extreme event detection methods to model spatiotemporal change. We show how array operations can facilitate expressing these methods, and how the open-source array data management and analytics software SciDB and R can be used to scale the process and make it easily reproducible.

  11. Dynamical glucometry: Use of multiscale entropy analysis in diabetes

    NASA Astrophysics Data System (ADS)

    Costa, Madalena D.; Henriques, Teresa; Munshi, Medha N.; Segal, Alissa R.; Goldberger, Ary L.

    2014-09-01

    Diabetes mellitus (DM) is one of the world's most prevalent medical conditions. Contemporary management focuses on lowering mean blood glucose values toward a normal range, but largely ignores the dynamics of glucose fluctuations. We probed analyte time series obtained from continuous glucose monitor (CGM) sensors. We show that the fluctuations in CGM values sampled every 5 min are not uncorrelated noise. Next, using multiscale entropy analysis, we quantified the complexity of the temporal structure of the CGM time series from a group of elderly subjects with type 2 DM and age-matched controls. We further probed the structure of these CGM time series using detrended fluctuation analysis. Our findings indicate that the dynamics of glucose fluctuations from control subjects are more complex than those of subjects with type 2 DM over time scales ranging from about 5 min to 5 h. These findings support consideration of a new framework, dynamical glucometry, to guide mechanistic research and to help assess and compare therapeutic interventions, which should enhance complexity of glucose fluctuations and not just lower mean and variance of blood glucose levels.
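
    A compact sketch of multiscale entropy, coarse-graining the series at each scale and then computing sample entropy, is shown below. The parameters (m = 2, r = 0.15·SD, scales 1-5) are common defaults rather than necessarily those used in the study, and the toy input is white noise rather than CGM data.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.15):
    """Sample entropy with tolerance r expressed as a fraction of the series SD."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        count = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d <= tol)
        return count
    b, a = count_matches(m), count_matches(m + 1)
    return np.inf if a == 0 else -np.log(a / b)

def multiscale_entropy(x, scales=(1, 2, 3, 4, 5), m=2, r=0.15):
    """Coarse-grain the series at each scale, then compute sample entropy."""
    out = []
    for s in scales:
        n = len(x) // s
        coarse = np.asarray(x[:n * s]).reshape(n, s).mean(axis=1)
        out.append(sample_entropy(coarse, m, r))
    return np.array(out)

rng = np.random.default_rng(2)
print("white-noise MSE curve:", np.round(multiscale_entropy(rng.normal(size=2000)), 2))
```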

  12. Permutation entropy of finite-length white-noise time series.

    PubMed

    Little, Douglas J; Kane, Deb M

    2016-08-01

    Permutation entropy (PE) is commonly used to discriminate complex structure from white noise in a time series. While the PE of white noise is well understood in the long time-series limit, analysis in the general case is currently lacking. Here the expectation value and variance of white-noise PE are derived as functions of the number of ordinal pattern trials, N, and the embedding dimension, D. It is demonstrated that the probability distribution of the white-noise PE converges to a χ² distribution with D!-1 degrees of freedom as N becomes large. It is further demonstrated that the PE variance for an arbitrary time series can be estimated as the variance of a related metric, the Kullback-Leibler entropy (KLE), allowing the qualitative N≫D! condition to be recast as a quantitative estimate of the N required to achieve a desired PE calculation precision. Application of this theory to statistical inference is demonstrated in the case of an experimentally obtained noise series, where the probability of obtaining the observed PE value was calculated assuming a white-noise time series. Standard statistical inference can be used to draw conclusions whether the white-noise null hypothesis can be accepted or rejected. This methodology can be applied to other null hypotheses, such as discriminating whether two time series are generated from different complex system states.
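
    The finite-N behaviour described above can be explored empirically with the short Monte Carlo sketch below, which estimates the mean and variance of white-noise PE for a given number of ordinal trials N and embedding dimension D; the analytic χ² result itself is not re-derived, and the choices of D and N are illustrative.

```python
import numpy as np
from math import factorial

def perm_entropy(x, d=3):
    """Non-normalized permutation entropy (natural log) with embedding dimension d."""
    n = len(x) - d + 1
    # Encode each ordinal pattern as the tuple of ranks of the embedded vector.
    codes = np.array([np.argsort(x[i:i + d]) for i in range(n)])
    _, counts = np.unique(codes, axis=0, return_counts=True)
    p = counts / n
    return -np.sum(p * np.log(p))

d, n_trials, n_runs = 3, 200, 2000
rng = np.random.default_rng(3)
pe = np.array([perm_entropy(rng.normal(size=n_trials + d - 1), d) for _ in range(n_runs)])
print("maximum PE, ln(d!) :", round(np.log(factorial(d)), 4))
print("mean white-noise PE:", round(pe.mean(), 4))   # biased below ln(d!) at finite N
print("variance of PE     :", round(pe.var(), 6))
```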

  13. Inferring the interplay between network structure and market effects in Bitcoin

    NASA Astrophysics Data System (ADS)

    Kondor, Dániel; Csabai, István; Szüle, János; Pósfai, Márton; Vattay, Gábor

    2014-12-01

    A main focus in economics research is understanding the time series of prices of goods and assets. While statistical models using only the properties of the time series itself have been successful in many aspects, we expect to gain a better understanding of the phenomena involved if we can model the underlying system of interacting agents. In this article, we consider the history of Bitcoin, a novel digital currency system, for which the complete list of transactions is available for analysis. Using this dataset, we reconstruct the transaction network between users and analyze changes in the structure of the subgraph induced by the most active users. Our approach is based on the unsupervised identification of important features of the time variation of the network. Applying the widely used method of Principal Component Analysis to the matrix constructed from snapshots of the network at different times, we are able to show how structural changes in the network accompany significant changes in the exchange price of bitcoins.
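
    The unsupervised step described, applying PCA to a matrix built from network snapshots taken at different times, can be sketched as below. The toy snapshots, the flattening of adjacency matrices into rows, and the number of components are assumptions; the Bitcoin transaction data themselves are not used.

```python
import numpy as np

def snapshot_pca(adjacency_snapshots, n_components=3):
    """PCA over flattened network snapshots: rows = time points, columns = edge weights."""
    X = np.array([a.ravel() for a in adjacency_snapshots], dtype=float)
    X -= X.mean(axis=0)                          # centre each edge weight over time
    u, s, vt = np.linalg.svd(X, full_matrices=False)
    scores = u[:, :n_components] * s[:n_components]
    explained = (s ** 2) / np.sum(s ** 2)
    return scores, explained[:n_components]

# Toy snapshots: a 30-node network whose overall density drifts upward over time.
rng = np.random.default_rng(10)
snaps = [(rng.random((30, 30)) < (0.05 + 0.002 * t)).astype(float) for t in range(100)]
scores, expl = snapshot_pca(snaps)
print("variance explained by first components:", np.round(expl, 3))
```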

  14. On statistical inference in time series analysis of the evolution of road safety.

    PubMed

    Commandeur, Jacques J F; Bijleveld, Frits D; Bergel-Hayat, Ruth; Antoniou, Constantinos; Yannis, George; Papadimitriou, Eleonora

    2013-11-01

    Data collected for building a road safety observatory usually include observations made sequentially through time. Examples of such data, called time series data, include annual (or monthly) number of road traffic accidents, traffic fatalities or vehicle kilometers driven in a country, as well as the corresponding values of safety performance indicators (e.g., data on speeding, seat belt use, alcohol use, etc.). Some commonly used statistical techniques imply assumptions that are often violated by the special properties of time series data, namely serial dependency among disturbances associated with the observations. The first objective of this paper is to demonstrate the impact of such violations on the applicability of standard methods of statistical inference, which leads to under- or overestimation of the standard error and consequently may produce erroneous inferences. Moreover, having established the adverse consequences of ignoring serial dependency issues, the paper aims to describe rigorous statistical techniques used to overcome them. In particular, appropriate time series analysis techniques of varying complexity are employed to describe the development over time, relating the accident-occurrences to explanatory factors such as exposure measures or safety performance indicators, and forecasting the development into the near future. Traditional regression models (whether they are linear, generalized linear or nonlinear) are shown not to naturally capture the inherent dependencies in time series data. Dedicated time series analysis techniques, such as the ARMA-type and DRAG approaches are discussed next, followed by structural time series models, which are a subclass of state space methods. The paper concludes with general recommendations and practice guidelines for the use of time series models in road safety research. Copyright © 2012 Elsevier Ltd. All rights reserved.

  15. Compressive Sensing of Foot Gait Signals and Its Application for the Estimation of Clinically Relevant Time Series.

    PubMed

    Pant, Jeevan K; Krishnan, Sridhar

    2016-07-01

    A new signal reconstruction algorithm for compressive sensing based on the minimization of a pseudonorm which promotes block-sparse structure on the first-order difference of the signal is proposed. Involved optimization is carried out by using a sequential version of Fletcher-Reeves' conjugate-gradient algorithm, and the line search is based on Banach's fixed-point theorem. The algorithm is suitable for the reconstruction of foot gait signals which admit block-sparse structure on the first-order difference. An additional algorithm for the estimation of stride-interval, swing-interval, and stance-interval time series from the reconstructed foot gait signals is also proposed. This algorithm is based on finding zero crossing indices of the foot gait signal and using the resulting indices for the computation of time series. Extensive simulation results demonstrate that the proposed signal reconstruction algorithm yields improved signal-to-noise ratio and requires significantly reduced computational effort relative to several competing algorithms over a wide range of compression ratio. For a compression ratio in the range from 88% to 94%, the proposed algorithm is found to offer improved accuracy for the estimation of clinically relevant time-series parameters, namely, the mean value, variance, and spectral index of stride-interval, stance-interval, and swing-interval time series, relative to its nearest competitor algorithm. The improvement in performance for compression ratio as high as 94% indicates that the proposed algorithms would be useful for designing compressive sensing-based systems for long-term telemonitoring of human gait signals.
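
    The compressive-sensing reconstruction is not reproduced here, but the second algorithm's idea, locating zero-crossing indices of a foot gait signal and converting them into an interval time series, can be sketched as below. The synthetic signal and the restriction to positive-going crossings are assumptions.

```python
import numpy as np

def zero_crossing_intervals(signal, fs):
    """Intervals (in seconds) between successive positive-going zero crossings."""
    s = np.sign(signal)
    # Indices where the sign flips from non-positive to positive.
    idx = np.where((s[:-1] <= 0) & (s[1:] > 0))[0] + 1
    return np.diff(idx) / fs

# Synthetic "foot gait" signal: ~1 Hz gait cycle with phase jitter and noise.
fs, seconds = 100, 60
t = np.arange(fs * seconds) / fs
rng = np.random.default_rng(9)
jitter = 0.1 * np.cumsum(rng.normal(0, 0.05, t.size))
gait = np.sin(2 * np.pi * 1.0 * t + jitter) + 0.05 * rng.normal(size=t.size)
intervals = zero_crossing_intervals(gait, fs)
print("mean interval (s):", round(float(intervals.mean()), 3),
      " variance:", round(float(intervals.var()), 5))
```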

  16. Time series regression model for infectious disease and weather.

    PubMed

    Imai, Chisato; Armstrong, Ben; Chalabi, Zaid; Mangtani, Punam; Hashizume, Masahiro

    2015-10-01

    Time series regression has been developed and long used to evaluate the short-term associations of air pollution and weather with mortality or morbidity of non-infectious diseases. The application of the regression approaches from this tradition to infectious diseases, however, is less well explored and raises some new issues. We discuss and present potential solutions for five issues often arising in such analyses: changes in immune population, strong autocorrelations, a wide range of plausible lag structures and association patterns, seasonality adjustments, and large overdispersion. The potential approaches are illustrated with datasets of cholera cases and rainfall from Bangladesh and influenza and temperature in Tokyo. Though this article focuses on the application of the traditional time series regression to infectious diseases and weather factors, we also briefly introduce alternative approaches, including mathematical modeling, wavelet analysis, and autoregressive integrated moving average (ARIMA) models. Modifications proposed to standard time series regression practice include using sums of past cases as proxies for the immune population, and using the logarithm of lagged disease counts to control autocorrelation due to true contagion, both of which are motivated from "susceptible-infectious-recovered" (SIR) models. The complexity of lag structures and association patterns can often be informed by biological mechanisms and explored by using distributed lag non-linear models. For overdispersed models, alternative distribution models such as quasi-Poisson and negative binomial should be considered. Time series regression can be used to investigate dependence of infectious diseases on weather, but may need modifying to allow for features specific to this context. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
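
    A hedged sketch of the kind of model discussed, a Poisson regression of case counts on lagged weather terms, a seasonal harmonic, and the logarithm of lagged counts as a contagion proxy, is given below using statsmodels. The column names, lag choices, and synthetic data are assumptions; distributed lag non-linear models and the quasi-Poisson/negative-binomial variants are not shown.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def fit_count_regression(df):
    """Poisson regression of weekly case counts on lagged temperature and seasonality.

    df must contain columns 'cases' and 'temp' indexed by consecutive weeks
    (hypothetical column names, used only for illustration).
    """
    d = df.copy()
    d["temp_lag1"] = d["temp"].shift(1)
    d["temp_lag2"] = d["temp"].shift(2)
    d["log_cases_lag1"] = np.log(d["cases"].shift(1) + 1)   # crude contagion proxy
    week = np.arange(len(d))
    d["sin_season"] = np.sin(2 * np.pi * week / 52.0)
    d["cos_season"] = np.cos(2 * np.pi * week / 52.0)
    d = d.dropna()
    X = sm.add_constant(d[["temp_lag1", "temp_lag2", "log_cases_lag1",
                           "sin_season", "cos_season"]])
    return sm.GLM(d["cases"], X, family=sm.families.Poisson()).fit()

# Synthetic weekly data standing in for real surveillance and weather records.
rng = np.random.default_rng(4)
weeks = 300
temp = 20 + 8 * np.sin(2 * np.pi * np.arange(weeks) / 52.0) + rng.normal(0, 1, weeks)
cases = rng.poisson(np.exp(1.5 + 0.05 * temp))
model = fit_count_regression(pd.DataFrame({"cases": cases, "temp": temp}))
print(model.summary())
```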

  17. Phase synchronization based minimum spanning trees for analysis of financial time series with nonlinear correlations

    NASA Astrophysics Data System (ADS)

    Radhakrishnan, Srinivasan; Duvvuru, Arjun; Sultornsanee, Sivarit; Kamarthi, Sagar

    2016-02-01

    The cross correlation coefficient has been widely applied in financial time series analysis, specifically for understanding chaotic behaviour in terms of stock price and index movements during crisis periods. To better understand time series correlation dynamics, the cross correlation matrices are represented as networks, in which a node stands for an individual time series and a link indicates cross correlation between a pair of nodes. These networks are converted into simpler trees using different schemes. In this context, Minimum Spanning Trees (MST) are the most favoured tree structures because of their ability to preserve all the nodes and thereby retain essential information imbued in the network. Although cross correlations underlying MSTs capture essential information, they do not faithfully capture dynamic behaviour embedded in the time series data of financial systems because cross correlation is a reliable measure only if the relationship between the time series is linear. To address the issue, this work investigates a new measure called phase synchronization (PS) for establishing correlations among different time series which relate to one another, linearly or nonlinearly. In this approach the strength of a link between a pair of time series (nodes) is determined by the level of phase synchronization between them. We compare the performance of phase synchronization based MST with cross correlation based MST along selected network measures across a temporal frame that includes economically good and crisis periods. We observe agreement in the directionality of the results across these two methods. They show similar trends, upward or downward, when comparing selected network measures. Though both the methods give similar trends, the phase synchronization based MST is a more reliable representation of the dynamic behaviour of financial systems than the cross correlation based MST because of the former's ability to quantify nonlinear relationships among time series or relations among phase shifted time series.
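
    A minimal sketch of a phase-synchronization-based MST is given below: instantaneous phases from the Hilbert transform, a pairwise phase-locking value as the synchronization strength, a 1 − PLV distance, and the MST from scipy. The particular synchronization index, the distance transform, and the synthetic data are assumptions rather than the authors' exact construction.

```python
import numpy as np
from scipy.signal import hilbert
from scipy.sparse.csgraph import minimum_spanning_tree

def phase_sync_mst(returns):
    """Minimum spanning tree of a phase-synchronization network.

    returns : (T, N) array of return series, one column per asset.
    """
    phases = np.angle(hilbert(returns, axis=0))            # instantaneous phases
    n = returns.shape[1]
    plv = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            # Phase-locking value: magnitude of the mean phase-difference vector.
            sync = np.abs(np.mean(np.exp(1j * (phases[:, i] - phases[:, j]))))
            plv[i, j] = plv[j, i] = sync
    dist = 1.0 - plv                                        # strong sync -> short distance
    np.fill_diagonal(dist, 0.0)
    return plv, minimum_spanning_tree(dist).toarray()

rng = np.random.default_rng(5)
common = rng.normal(size=500)
data = np.column_stack([common + 0.5 * rng.normal(size=500) for _ in range(8)])
plv, mst = phase_sync_mst(data)
print("edges kept in the MST:", int((mst > 0).sum()))
```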

  18. A hybrid-domain approach for modeling climate data time series

    NASA Astrophysics Data System (ADS)

    Wen, Qiuzi H.; Wang, Xiaolan L.; Wong, Augustine

    2011-09-01

    In order to model climate data time series that often contain periodic variations, trends, and sudden changes in mean (mean shifts, mostly artificial), this study proposes a hybrid-domain (HD) algorithm, which incorporates a time domain test and a newly developed frequency domain test through an iterative procedure that is analogue to the well known backfitting algorithm. A two-phase competition procedure is developed to address the confounding issue between modeling periodic variations and mean shifts. A variety of distinctive features of climate data time series, including trends, periodic variations, mean shifts, and a dependent noise structure, can be modeled in tandem using the HD algorithm. This is particularly important for homogenization of climate data from a low density observing network in which reference series are not available to help preserve climatic trends and long-term periodic variations, preventing them from being mistaken as artificial shifts. The HD algorithm is also powerful in estimating trend and periodicity in a homogeneous data time series (i.e., in the absence of any mean shift). The performance of the HD algorithm (in terms of false alarm rate and hit rate in detecting shifts/cycles, and estimation accuracy) is assessed via a simulation study. Its power is further illustrated through its application to a few climate data time series.

  19. The application of a shift theorem analysis technique to multipoint measurements

    NASA Astrophysics Data System (ADS)

    Dieckmann, M. E.; Chapman, S. C.

    1999-03-01

    A Fourier domain technique has been proposed previously which, in principle, quantifies the extent to which multipoint in-situ measurements can identify whether or not an observed structure is time stationary in its rest frame. Once a structure, sampled for example by four spacecraft, is shown to be quasi-stationary in its rest frame, the structure's velocity vector can be determined with respect to the sampling spacecraft. We investigate the properties of this technique, which we will refer to as a stationarity test, by applying it to two point measurements of a simulated boundary layer. The boundary layer was evolved using a PIC (particle in cell) electromagnetic code. Initial and boundary conditions were chosen such, that two cases could be considered, i.e. a spacecraft pair moving through (1) a time stationary boundary structure and (2) a boundary structure which is evolving (expanding) in time. The code also introduces noise in the simulated data time series which is uncorrelated between the two spacecraft. We demonstrate that, provided that the time series is Hanning windowed, the test is effective in determining the relative velocity between the boundary layer and spacecraft and in determining the range of frequencies over which the data can be treated as time stationary or time evolving. This work presents a first step towards understanding the effectiveness of this technique, as required in order for it to be applied to multispacecraft data.

  20. Structural connectome topology relates to regional BOLD signal dynamics in the mouse brain

    NASA Astrophysics Data System (ADS)

    Sethi, Sarab S.; Zerbi, Valerio; Wenderoth, Nicole; Fornito, Alex; Fulcher, Ben D.

    2017-04-01

    Brain dynamics are thought to unfold on a network determined by the pattern of axonal connections linking pairs of neuronal elements; the so-called connectome. Prior work has indicated that structural brain connectivity constrains pairwise correlations of brain dynamics ("functional connectivity"), but it is not known whether inter-regional axonal connectivity is related to the intrinsic dynamics of individual brain areas. Here we investigate this relationship using a weighted, directed mesoscale mouse connectome from the Allen Mouse Brain Connectivity Atlas and resting state functional MRI (rs-fMRI) time-series data measured in 184 brain regions in eighteen anesthetized mice. For each brain region, we measured degree, betweenness, and clustering coefficient from weighted and unweighted, and directed and undirected versions of the connectome. We then characterized the univariate rs-fMRI dynamics in each brain region by computing 6930 time-series properties using the time-series analysis toolbox, hctsa. After correcting for regional volume variations, strong and robust correlations between structural connectivity properties and rs-fMRI dynamics were found only when edge weights were accounted for, and were associated with variations in the autocorrelation properties of the rs-fMRI signal. The strongest relationships were found for weighted in-degree, which was positively correlated to the autocorrelation of fMRI time series at time lag τ = 34 s (partial Spearman correlation ρ = 0.58), as well as a range of related measures such as relative high frequency power (f > 0.4 Hz: ρ = -0.43). Our results indicate that the topology of inter-regional axonal connections of the mouse brain is closely related to intrinsic, spontaneous dynamics such that regions with a greater aggregate strength of incoming projections display longer timescales of activity fluctuations.

  1. Small-world bias of correlation networks: From brain to climate

    NASA Astrophysics Data System (ADS)

    Hlinka, Jaroslav; Hartman, David; Jajcay, Nikola; Tomeček, David; Tintěra, Jaroslav; Paluš, Milan

    2017-03-01

    Complex systems are commonly characterized by the properties of their graph representation. Dynamical complex systems are then typically represented by a graph of temporal dependencies between time series of state variables of their subunits. It has been shown recently that graphs constructed in this way tend to have relatively clustered structure, potentially leading to spurious detection of small-world properties even in the case of systems with no or randomly distributed true interactions. However, the strength of this bias depends heavily on a range of parameters and its relevance for real-world data has not yet been established. In this work, we assess the relevance of the bias using two examples of multivariate time series recorded in natural complex systems. The first is the time series of local brain activity as measured by functional magnetic resonance imaging in resting healthy human subjects, and the second is the time series of average monthly surface air temperature coming from a large reanalysis of climatological data over the period 1948-2012. In both cases, the clustering in the thresholded correlation graph is substantially higher compared with a realization of a density-matched random graph, while the shortest paths are relatively short, thus showing distinguishing features of small-world structure. However, comparable or even stronger small-world properties were reproduced in correlation graphs of model processes with randomly scrambled interconnections. This suggests that the small-world properties of the correlation matrices of these real-world systems indeed do not reflect genuinely the properties of the underlying interaction structure, but rather result from the inherent properties of the correlation matrix.
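
    The bias can be illustrated with the sketch below: independent AR(1) processes with no true interactions still yield a thresholded correlation graph whose clustering exceeds that of a density-matched random graph. The AR(1) coefficient, the 10% edge density, and the use of the clustering coefficient alone are illustrative assumptions.

```python
import numpy as np
import networkx as nx

def thresholded_correlation_graph(x, density=0.1):
    """Keep the strongest |correlations| so the graph has the requested edge density."""
    c = np.abs(np.corrcoef(x.T))
    np.fill_diagonal(c, 0.0)
    n = c.shape[0]
    n_edges = int(density * n * (n - 1) / 2)
    iu = np.triu_indices(n, k=1)
    cutoff = np.sort(c[iu])[-n_edges]
    return nx.from_numpy_array((c >= cutoff).astype(int)), n_edges

# Independent AR(1) series: no true interactions at all.
rng = np.random.default_rng(6)
n_nodes, t_len = 60, 500
x = np.zeros((t_len, n_nodes))
for t in range(1, t_len):
    x[t] = 0.8 * x[t - 1] + rng.normal(size=n_nodes)

g, n_edges = thresholded_correlation_graph(x)
rand = nx.gnm_random_graph(n_nodes, n_edges, seed=0)
print("correlation graph clustering:", round(nx.average_clustering(g), 3))
print("random graph clustering     :", round(nx.average_clustering(rand), 3))
```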

  2. A shift to randomness of brain oscillations in people with autism.

    PubMed

    Lai, Meng-Chuan; Lombardo, Michael V; Chakrabarti, Bhismadev; Sadek, Susan A; Pasco, Greg; Wheelwright, Sally J; Bullmore, Edward T; Baron-Cohen, Simon; Suckling, John

    2010-12-15

    Resting-state functional magnetic resonance imaging (fMRI) enables investigation of the intrinsic functional organization of the brain. Fractal parameters such as the Hurst exponent, H, describe the complexity of endogenous low-frequency fMRI time series on a continuum from random (H = .5) to ordered (H = 1). Shifts in fractal scaling of physiological time series have been associated with neurological and cardiac conditions. Resting-state fMRI time series were recorded in 30 male adults with an autism spectrum condition (ASC) and 33 age- and IQ-matched male volunteers. The Hurst exponent was estimated in the wavelet domain and between-group differences were investigated at global and voxel level and in regions known to be involved in autism. Complex fractal scaling of fMRI time series was found in both groups but globally there was a significant shift to randomness in the ASC (mean H = .758, SD = .045) compared with neurotypical volunteers (mean H = .788, SD = .047). Between-group differences in H, which was always reduced in the ASC group, were seen in most regions previously reported to be involved in autism, including cortical midline structures, medial temporal structures, lateral temporal and parietal structures, insula, amygdala, basal ganglia, thalamus, and inferior frontal gyrus. Severity of autistic symptoms was negatively correlated with H in retrosplenial and right anterior insular cortex. Autism is associated with a small but significant shift to randomness of endogenous brain oscillations. Complexity measures may provide physiological indicators for autism as they have done for other medical conditions. Copyright © 2010 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
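
    The study estimates H in the wavelet domain; as a simpler, hedged stand-in, the sketch below uses detrended fluctuation analysis, whose scaling exponent approximates the Hurst exponent for fractional-Gaussian-noise-like signals. The window sizes and the linear detrending order are assumptions.

```python
import numpy as np

def dfa_exponent(x, scales=None):
    """Detrended fluctuation analysis exponent (~ Hurst exponent for fGn-like series)."""
    x = np.asarray(x, dtype=float)
    if scales is None:
        scales = np.unique(np.logspace(np.log10(8), np.log10(len(x) // 4), 12).astype(int))
    profile = np.cumsum(x - x.mean())           # integrate the mean-removed series
    flucts = []
    for s in scales:
        n_seg = len(profile) // s
        segs = profile[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        rms = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)         # remove a linear trend in each window
            rms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(rms)))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

rng = np.random.default_rng(7)
print("white-noise DFA exponent (~0.5):", round(dfa_exponent(rng.normal(size=4000)), 2))
```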

  3. Small-world bias of correlation networks: From brain to climate.

    PubMed

    Hlinka, Jaroslav; Hartman, David; Jajcay, Nikola; Tomeček, David; Tintěra, Jaroslav; Paluš, Milan

    2017-03-01

    Complex systems are commonly characterized by the properties of their graph representation. Dynamical complex systems are then typically represented by a graph of temporal dependencies between time series of state variables of their subunits. It has been shown recently that graphs constructed in this way tend to have relatively clustered structure, potentially leading to spurious detection of small-world properties even in the case of systems with no or randomly distributed true interactions. However, the strength of this bias depends heavily on a range of parameters and its relevance for real-world data has not yet been established. In this work, we assess the relevance of the bias using two examples of multivariate time series recorded in natural complex systems. The first is the time series of local brain activity as measured by functional magnetic resonance imaging in resting healthy human subjects, and the second is the time series of average monthly surface air temperature coming from a large reanalysis of climatological data over the period 1948-2012. In both cases, the clustering in the thresholded correlation graph is substantially higher compared with a realization of a density-matched random graph, while the shortest paths are relatively short, thus showing distinguishing features of small-world structure. However, comparable or even stronger small-world properties were reproduced in correlation graphs of model processes with randomly scrambled interconnections. This suggests that the small-world properties of the correlation matrices of these real-world systems indeed do not reflect genuinely the properties of the underlying interaction structure, but rather result from the inherent properties of the correlation matrix.

  4. A theoretically consistent stochastic cascade for temporal disaggregation of intermittent rainfall

    NASA Astrophysics Data System (ADS)

    Lombardo, F.; Volpi, E.; Koutsoyiannis, D.; Serinaldi, F.

    2017-06-01

    Generating fine-scale time series of intermittent rainfall that are fully consistent with any given coarse-scale totals is a key and open issue in many hydrological problems. We propose a stationary disaggregation method that simulates rainfall time series with given dependence structure, wet/dry probability, and marginal distribution at a target finer (lower-level) time scale, preserving full consistency with variables at a parent coarser (higher-level) time scale. We account for the intermittent character of rainfall at fine time scales by merging a discrete stochastic representation of intermittency and a continuous one of rainfall depths. This approach yields a unique and parsimonious mathematical framework providing general analytical formulations of mean, variance, and autocorrelation function (ACF) for a mixed-type stochastic process in terms of mean, variance, and ACFs of both continuous and discrete components, respectively. To achieve the full consistency between variables at finer and coarser time scales in terms of marginal distribution and coarse-scale totals, the generated lower-level series are adjusted according to a procedure that does not affect the stochastic structure implied by the original model. To assess model performance, we study rainfall process as intermittent with both independent and dependent occurrences, where dependence is quantified by the probability that two consecutive time intervals are dry. In either case, we provide analytical formulations of main statistics of our mixed-type disaggregation model and show their clear accordance with Monte Carlo simulations. An application to rainfall time series from real world is shown as a proof of concept.

  5. Time domain nonlinear SMA damper force identification approach and its numerical validation

    NASA Astrophysics Data System (ADS)

    Xin, Lulu; Xu, Bin; He, Jia

    2012-04-01

    Most of the currently available vibration-based identification approaches for structural damage detection are based on eigenvalues and/or eigenvectors extracted from vibration measurements and, strictly speaking, are only suitable for linear systems. However, the initiation and development of damage in engineering structures under severe dynamic loadings are typically nonlinear processes. Studies on the identification of the restoring force, which is a direct indicator of the extent of the nonlinearity, have received increasing attention in recent years. In this study, a data-based time domain identification approach for general nonlinear systems was developed. The applied excitation and the corresponding response time series of the structure were used for identification by means of standard least-squares techniques and a power series polynomial model (PSPM) which was utilized to model the nonlinear restoring force (NRF). The feasibility and robustness of the proposed approach was verified with a 2-degree-of-freedom (DOF) lumped-mass numerical model equipped with a shape memory alloy (SMA) damper mimicking nonlinear behavior. The results show that the proposed data-based time domain method is capable of identifying the NRF in engineering structures without any assumptions on the mass distribution and the topology of the structure, and provides a promising way for damage detection in the presence of structural nonlinearities.
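
    A hedged sketch of the data-based idea, representing the nonlinear restoring force as a power-series polynomial in displacement and velocity and identifying its coefficients by standard least squares from measured excitation and response, is shown below on a single-DOF Duffing-type toy system; the paper's 2-DOF SMA model is not reproduced.

```python
import numpy as np

def identify_restoring_force(disp, vel, acc, force, mass, order=3):
    """Least-squares fit of the restoring force r = f - m*a as a polynomial in (x, v)."""
    r = force - mass * acc
    # Power-series basis: x, x^2, ..., x^order, then v, v^2, ..., v^order.
    basis = [disp ** k for k in range(1, order + 1)] + [vel ** k for k in range(1, order + 1)]
    coeffs, *_ = np.linalg.lstsq(np.column_stack(basis), r, rcond=None)
    return coeffs

# Simulate a Duffing-type SDOF oscillator (semi-implicit Euler) as toy measured data.
m, c, k, k3, dt, n = 1.0, 0.2, 4.0, 1.5, 0.005, 20000
rng = np.random.default_rng(8)
f = rng.normal(0, 1.0, n)
x, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
for t in range(n - 1):
    a[t] = (f[t] - c * v[t] - k * x[t] - k3 * x[t] ** 3) / m
    v[t + 1] = v[t] + dt * a[t]
    x[t + 1] = x[t] + dt * v[t + 1]
a[-1] = (f[-1] - c * v[-1] - k * x[-1] - k3 * x[-1] ** 3) / m

coeffs = identify_restoring_force(x, v, a, f, m)
print("identified [k, k2, k3, c, c2, c3] ~", np.round(coeffs, 2))   # expect ~[4, 0, 1.5, 0.2, 0, 0]
```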

  6. Properties of Asymmetric Detrended Fluctuation Analysis in the time series of RR intervals

    NASA Astrophysics Data System (ADS)

    Piskorski, J.; Kosmider, M.; Mieszkowski, D.; Krauze, T.; Wykretowicz, A.; Guzik, P.

    2018-02-01

    Heart rate asymmetry is a phenomenon by which the accelerations and decelerations of heart rate behave differently, and this difference is consistent and unidirectional, i.e. in most of the analyzed recordings the inequalities have the same directions. So far, it has been established for variance and runs based types of descriptors of RR intervals time series. In this paper we apply the newly developed method of Asymmetric Detrended Fluctuation Analysis, which so far has mainly been used with economic time series, to the set of 420 stationary 30 min time series of RR intervals from young, healthy individuals aged between 20 and 40. This asymmetric approach introduces separate scaling exponents for rising and falling trends. We systematically study the presence of asymmetry in both global and local versions of this method. In this study global means "applying to the whole time series" and local means "applying to windows jumping along the recording". It is found that the correlation structure of the fluctuations left over after detrending in physiological time series shows strong asymmetric features in both magnitude, with α+ < α-, where α+ is related to heart rate decelerations and α- to heart rate accelerations, and the proportion of the signal in which the above inequality holds. A very similar effect is observed if asymmetric noise is added to a symmetric self-affine function. No such phenomena are observed in the same physiological data after shuffling or with a group of symmetric synthetic time series.

  7. Mutual connectivity analysis (MCA) using generalized radial basis function neural networks for nonlinear functional connectivity network recovery in resting-state functional MRI

    NASA Astrophysics Data System (ADS)

    D'Souza, Adora M.; Abidin, Anas Zainul; Nagarajan, Mahesh B.; Wismüller, Axel

    2016-03-01

    We investigate the applicability of a computational framework, called mutual connectivity analysis (MCA), for directed functional connectivity analysis in both synthetic and resting-state functional MRI data. This framework consists of first evaluating non-linear cross-predictability between every pair of time series prior to recovering the underlying network structure using community detection algorithms. We obtain the non-linear cross-prediction score between time series using Generalized Radial Basis Functions (GRBF) neural networks. These cross-prediction scores characterize the underlying functionally connected networks within the resting brain, which can be extracted using non-metric clustering approaches, such as the Louvain method. We first test our approach on synthetic models with known directional influence and network structure. Our method is able to capture the directional relationships between time series (with an area under the ROC curve = 0.92 ± 0.037) as well as the underlying network structure (Rand index = 0.87 ± 0.063) with high accuracy. Furthermore, we test this method for network recovery on resting-state fMRI data, where results are compared to the motor cortex network recovered from a motor stimulation sequence, resulting in a strong agreement between the two (Dice coefficient = 0.45). We conclude that our MCA approach is effective in analyzing non-linear directed functional connectivity and in revealing underlying functional network structure in complex systems.

  8. Mutual Connectivity Analysis (MCA) Using Generalized Radial Basis Function Neural Networks for Nonlinear Functional Connectivity Network Recovery in Resting-State Functional MRI.

    PubMed

    DSouza, Adora M; Abidin, Anas Zainul; Nagarajan, Mahesh B; Wismüller, Axel

    2016-03-29

    We investigate the applicability of a computational framework, called mutual connectivity analysis (MCA), for directed functional connectivity analysis in both synthetic and resting-state functional MRI data. This framework consists of first evaluating non-linear cross-predictability between every pair of time series prior to recovering the underlying network structure using community detection algorithms. We obtain the non-linear cross-prediction score between time series using Generalized Radial Basis Functions (GRBF) neural networks. These cross-prediction scores characterize the underlying functionally connected networks within the resting brain, which can be extracted using non-metric clustering approaches, such as the Louvain method. We first test our approach on synthetic models with known directional influence and network structure. Our method is able to capture the directional relationships between time series (with an area under the ROC curve = 0.92 ± 0.037) as well as the underlying network structure (Rand index = 0.87 ± 0.063) with high accuracy. Furthermore, we test this method for network recovery on resting-state fMRI data, where results are compared to the motor cortex network recovered from a motor stimulation sequence, resulting in a strong agreement between the two (Dice coefficient = 0.45). We conclude that our MCA approach is effective in analyzing non-linear directed functional connectivity and in revealing underlying functional network structure in complex systems.

  9. Multifractality Signatures in Quasars Time Series. I. 3C 273

    NASA Astrophysics Data System (ADS)

    Belete, A. Bewketu; Bravo, J. P.; Canto Martins, B. L.; Leão, I. C.; De Araujo, J. M.; De Medeiros, J. R.

    2018-05-01

    The presence of multifractality in a time series shows different correlations for different time scales as well as intermittent behaviour that cannot be captured by a single scaling exponent. The identification of a multifractal nature allows for a characterization of the dynamics and of the intermittency of the fluctuations in non-linear and complex systems. In this study, we search for a possible multifractal structure (multifractality signature) of the flux variability in the quasar 3C 273 time series for all electromagnetic wavebands at different observation points, and the origins for the observed multifractality. This study is intended to highlight how the scaling behaves across the different bands of the selected candidate which can be used as an additional new technique to group quasars based on the fractal signature observed in their time series and determine whether quasars are non-linear physical systems or not. The Multifractal Detrended Moving Average algorithm (MFDMA) has been used to study the scaling in non-linear, complex and dynamic systems. To achieve this goal, we applied the backward (θ = 0) MFDMA method for one-dimensional signals. We observe weak multifractal (close to monofractal) behaviour in some of the time series of our candidate except in the mm, UV and X-ray bands. The non-linear temporal correlation is the main source of the observed multifractality in the time series whereas the heaviness of the distribution contributes less.

  10. Base connections for signal/sign structures.

    DOT National Transportation Integrated Search

    2012-02-01

    The Atlantic hurricane season of 2004 brought with it a series of four major hurricanes that made landfall across Florida within a six-week period. During this time, a number of cantilever sign structures along the state interstate system failed....

  11. Using chaotic forcing to detect damage in a structure

    USGS Publications Warehouse

    Moniz, L.; Nichols, J.; Trickey, S.; Seaver, M.; Pecora, D.; Pecora, L.

    2005-01-01

    In this work we develop a numerical test for Hölder continuity and apply it and another test for continuity to the difficult problem of detecting damage in structures. We subject a thin metal plate with incremental damage to chaotic excitation of various bandwidths. Damage to the plate changes its filtering properties and therefore the phase space trajectories of the response. Because the data are multivariate (the plate is instrumented with multiple sensors) we use a singular value decomposition of the set of the output time series to reduce the embedding dimension of the response time series. We use two geometric tests to compare an attractor reconstructed from data from an undamaged structure to that reconstructed from data from a damaged structure. These two tests translate to testing for both generalized and differentiable synchronization between responses. We show loss of synchronization of responses with damage to the structure. © 2005 American Institute of Physics.

  12. Using chaotic forcing to detect damage in a structure.

    USGS Publications Warehouse

    Moniz, L.; Nichols, J.; Trickey, S.; Seaver, M.; Pecora, D.; Pecora, L.

    2005-01-01

    In this work we develop a numerical test for Hölder continuity and apply it and another test for continuity to the difficult problem of detecting damage in structures. We subject a thin metal plate with incremental damage to chaotic excitation of various bandwidths. Damage to the plate changes its filtering properties and therefore the phase space trajectories of the response. Because the data are multivariate (the plate is instrumented with multiple sensors) we use a singular value decomposition of the set of the output time series to reduce the embedding dimension of the response time series. We use two geometric tests to compare an attractor reconstructed from data from an undamaged structure to that reconstructed from data from a damaged structure. These two tests translate to testing for both generalized and differentiable synchronization between responses. We show loss of synchronization of responses with damage to the structure.

  13. Mapping tropical dry forest height, foliage height profiles and disturbance type and age with a time series of cloud-cleared Landsat and ALI image mosaics to characterize avian habitat

    Treesearch

    E.H. Helmer; Thomas S. Ruzycki; Jr. Joseph M. Wunderle; Shannon Vogesser; Bonnie Ruefenacht; Charles Kwit; Thomas J. Brandeis; David N. Ewert

    2010-01-01

    Remote sensing of forest vertical structure is possible with lidar data, but lidar is not widely available. Here we map tropical dry forest height (RMSE=0.9 m, R2=0.84, range 0.6–7 m), and we map foliage height profiles, with a time series of Landsat and Advanced Land Imager (ALI) imagery on the island of Eleuthera, The Bahamas, substituting time for vertical canopy...

  14. Corroded Anchor Structure Stability/Reliability (CAS_Stab-R) Software for Hydraulic Structures

    DTIC Science & Technology

    2017-12-01

    This report describes software that provides a probabilistic estimate of time-to-failure for a corroding anchor strand system. These anchor...stability to the structure. A series of unique pull-test experiments conducted by Ebeling et al. (2016) at the U.S. Army Engineer Research and...Reliability (CAS_Stab-R) produces probabilistic Remaining Anchor Life time estimates for anchor cables based upon the direct corrosion rate for the

  15. Robust evaluation of time series classification algorithms for structural health monitoring

    NASA Astrophysics Data System (ADS)

    Harvey, Dustin Y.; Worden, Keith; Todd, Michael D.

    2014-03-01

    Structural health monitoring (SHM) systems provide real-time damage and performance information for civil, aerospace, and mechanical infrastructure through analysis of structural response measurements. The supervised learning methodology for data-driven SHM involves computation of low-dimensional, damage-sensitive features from raw measurement data that are then used in conjunction with machine learning algorithms to detect, classify, and quantify damage states. However, these systems often suffer from performance degradation in real-world applications due to varying operational and environmental conditions. Probabilistic approaches to robust SHM system design suffer from incomplete knowledge of all conditions a system will experience over its lifetime. Info-gap decision theory enables nonprobabilistic evaluation of the robustness of competing models and systems in a variety of decision making applications. Previous work employed info-gap models to handle feature uncertainty when selecting various components of a supervised learning system, namely features from a pre-selected family and classifiers. In this work, the info-gap framework is extended to robust feature design and classifier selection for general time series classification through an efficient, interval arithmetic implementation of an info-gap data model. Experimental results are presented for a damage type classification problem on a ball bearing in a rotating machine. The info-gap framework in conjunction with an evolutionary feature design system allows for fully automated design of a time series classifier to meet performance requirements under maximum allowable uncertainty.

  16. Future mission studies: Forecasting solar flux directly from its chaotic time series

    NASA Technical Reports Server (NTRS)

    Ashrafi, S.

    1991-01-01

    The mathematical structure of the programs written to construct a nonlinear predictive model to forecast solar flux directly from its time series without reference to any underlying solar physics is presented. This method and the programs are written so that one could apply the same technique to forecast other chaotic time series, such as geomagnetic data, attitude and orbit data, and even financial indexes and stock market data. Perhaps the most important application of this technique to flight dynamics is to model Goddard Trajectory Determination System (GTDS) output of residues between observed position of spacecraft and calculated position with no drag (drag flag = off). This would result in a new model of drag working directly from observed data.
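
    A minimal sketch of nonlinear forecasting directly from a scalar chaotic time series, delay-coordinate embedding followed by nearest-neighbour (method-of-analogues) prediction, is given below; the embedding dimension, delay, number of neighbours, and the logistic-map stand-in for a solar-flux record are assumptions, not the GTDS application described above.

```python
import numpy as np

def delay_embed(x, dim=3, tau=1):
    """Delay-coordinate embedding of a scalar series into dim-dimensional vectors."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def analogue_forecast(x, steps=5, dim=3, tau=1, k=5):
    """Iterated one-step forecasts using the k nearest delay vectors as analogues."""
    x = list(x)
    for _ in range(steps):
        emb = delay_embed(np.array(x), dim, tau)
        query, library = emb[-1], emb[:-1]
        # Value that followed each library vector in the observed record.
        targets = np.array(x)[(dim - 1) * tau + 1:]
        d = np.linalg.norm(library - query, axis=1)
        idx = np.argsort(d)[:k]
        x.append(float(targets[idx].mean()))     # average of the neighbours' successors
    return np.array(x[-steps:])

# Toy chaotic series (logistic map) standing in for a solar-flux time series.
z = np.empty(2000)
z[0] = 0.31
for i in range(1999):
    z[i + 1] = 3.9 * z[i] * (1 - z[i])
print("next 5 forecast values:", np.round(analogue_forecast(z, steps=5), 3))
```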

  17. The physiology analysis system: an integrated approach for warehousing, management and analysis of time-series physiology data.

    PubMed

    McKenna, Thomas M; Bawa, Gagandeep; Kumar, Kamal; Reifman, Jaques

    2007-04-01

    The physiology analysis system (PAS) was developed as a resource to support the efficient warehousing, management, and analysis of physiology data, particularly, continuous time-series data that may be extensive, of variable quality, and distributed across many files. The PAS incorporates time-series data collected by many types of data-acquisition devices, and it is designed to free users from data management burdens. This Web-based system allows both discrete (attribute) and time-series (ordered) data to be manipulated, visualized, and analyzed via a client's Web browser. All processes occur on a server, so that the client does not have to download data or any application programs, and the PAS is independent of the client's computer operating system. The PAS contains a library of functions, written in different computer languages that the client can add to and use to perform specific data operations. Functions from the library are sequentially inserted into a function chain-based logical structure to construct sophisticated data operators from simple function building blocks, affording ad hoc query and analysis of time-series data. These features support advanced mining of physiology data.

  18. Centrality measures in temporal networks with time series analysis

    NASA Astrophysics Data System (ADS)

    Huang, Qiangjuan; Zhao, Chengli; Zhang, Xue; Wang, Xiaojie; Yi, Dongyun

    2017-05-01

    The study of identifying important nodes in networks has wide applications in different fields. However, current research is mostly based on static or aggregated networks. Recently, the increasing attention to networks with time-varying structure has promoted the study of node centrality in temporal networks. In this paper, we define a supra-evolution matrix to depict the temporal network structure. Using time series analysis, the relationships between different time layers can be learned automatically. Based on the special form of the supra-evolution matrix, the eigenvector centrality problem is turned into the calculation of eigenvectors of several low-dimensional matrices through iteration, which effectively reduces the computational complexity. Experiments are carried out on two real-world temporal networks, the Enron email communication network and the DBLP co-authorship network, the results of which show that our method is more efficient at discovering the important nodes than the common aggregating method.
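
    The supra-evolution matrix of the paper is not reconstructed here; the sketch below only illustrates the underlying computation, eigenvector centrality of a block-structured supra-adjacency matrix obtained by power iteration, with two toy time layers and a uniform inter-layer coupling chosen purely for illustration.

```python
import numpy as np

def eigenvector_centrality(a, n_iter=200, tol=1e-10):
    """Leading-eigenvector centrality of a non-negative adjacency matrix via power iteration."""
    v = np.ones(a.shape[0]) / a.shape[0]
    for _ in range(n_iter):
        v_new = a @ v
        v_new /= np.linalg.norm(v_new)
        if np.linalg.norm(v_new - v) < tol:
            break
        v = v_new
    return v_new

# Two time layers of a 3-node network, coupled into one supra-adjacency matrix.
layer1 = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
layer2 = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
omega = 0.5                                     # assumed uniform inter-layer coupling
supra = np.block([[layer1, omega * np.eye(3)],
                  [omega * np.eye(3), layer2]])
c = eigenvector_centrality(supra)
print("node centralities (layer 1, layer 2):", np.round(c, 3))
```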

  19. Hindfoot arthroscopic surgery for posterior ankle impingement: a systematic surgical approach and case series.

    PubMed

    Smyth, Niall A; Murawski, Christopher D; Levine, David S; Kennedy, John G

    2013-08-01

    Hindfoot arthroscopic surgery has been described as a minimally invasive surgical treatment for posterior ankle impingement syndrome. The current article describes a systematic approach for identifying relevant hindfoot structures as well as the clinical results of a case series. To present a structured systematic surgical approach for identifying relevant anatomic structures and abnormalities during hindfoot arthroscopic surgery. In addition, we report the clinical results of a case series. Case series; Level of evidence, 4. The systematic surgical approach divides the extra-articular structures of the hindfoot into quadrants as defined by the intermalleolar ligament. Twenty-two patients underwent hindfoot arthroscopic surgery for the treatment of posterior ankle impingement syndrome. The mean follow-up time was 25 months (range, 14-35 months). Standard patient-reported outcome questionnaires of the foot and ankle outcome score (FAOS) and Short Form-12 (SF-12) general health survey were administered at standard time points after surgery. Return to sporting activities was also calculated as the time period from the date of surgery until the patient was able to participate at their previous level of activity. The mean FAOS score improved from 59 (range, 22-94) preoperatively to 86 (range, 47-100) postoperatively (P < .01). The mean SF-12 score showed similar improvement with a mean of 66 (range, 42-96) preoperatively to 86 (range, 56-98) postoperatively (P < .01). Nineteen patients reported competing at some level of athletic sport before surgery. All patients returned to their previous level of competition after surgery. The mean time to return to sporting activities was 12 weeks (range, 6-16 weeks). Two complications were reported postoperatively: 1 wound infection and 1 case of dysesthesia of the deep peroneal nerve. Hindfoot arthroscopic surgery is a safe and effective treatment strategy for posterior ankle impingement syndrome. In addition, it allows the patients a rapid return to sporting activities.

  20. Classification of damage in structural systems using time series analysis and supervised and unsupervised pattern recognition techniques

    NASA Astrophysics Data System (ADS)

    Omenzetter, Piotr; de Lautour, Oliver R.

    2010-04-01

    Developed for studying long, periodic records of various measured quantities, time series analysis methods are inherently suited and offer interesting possibilities for Structural Health Monitoring (SHM) applications. However, their use in SHM can still be regarded as an emerging application and deserves more studies. In this research, Autoregressive (AR) models were used to fit experimental acceleration time histories from two experimental structural systems, a 3-storey bookshelf-type laboratory structure and the ASCE Phase II SHM Benchmark Structure, in healthy and several damaged states. The coefficients of the AR models were chosen as damage sensitive features. Preliminary visual inspection of the large, multidimensional sets of AR coefficients to check the presence of clusters corresponding to different damage severities was achieved using Sammon mapping - an efficient nonlinear data compression technique. Systematic classification of damage into states based on the analysis of the AR coefficients was achieved using two supervised classification techniques: Nearest Neighbor Classification (NNC) and Learning Vector Quantization (LVQ), and one unsupervised technique: Self-organizing Maps (SOM). This paper discusses the performance of AR coefficients as damage sensitive features and compares the efficiency of the three classification techniques using experimental data.
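
    A minimal sketch of this kind of pipeline, assuming synthetic acceleration records: AR coefficients estimated by least squares serve as the damage-sensitive features, and a plain nearest-neighbor rule stands in for the NNC/LVQ/SOM classifiers used in the study.

        import numpy as np

        def ar_coefficients(x, order=6):
            # Least-squares fit of x[t] = a1*x[t-1] + ... + ap*x[t-p];
            # the coefficient vector is the feature.
            rows = np.column_stack([x[order - k - 1 : len(x) - k - 1] for k in range(order)])
            target = x[order:]
            coeffs, *_ = np.linalg.lstsq(rows, target, rcond=None)
            return coeffs

        def nearest_neighbor_label(feature, train_features, train_labels):
            dists = [np.linalg.norm(feature - f) for f in train_features]
            return train_labels[int(np.argmin(dists))]

        rng = np.random.default_rng(0)
        healthy = [rng.standard_normal(500) for _ in range(5)]                  # placeholder records
        damaged = [np.cumsum(rng.standard_normal(500)) * 0.05 for _ in range(5)]
        feats = [ar_coefficients(x) for x in healthy + damaged]
        labels = ["healthy"] * 5 + ["damaged"] * 5
        test = ar_coefficients(np.cumsum(rng.standard_normal(500)) * 0.05)
        print(nearest_neighbor_label(test, feats, labels))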

  1. Fitting ARMA Time Series by Structural Equation Models.

    ERIC Educational Resources Information Center

    van Buuren, Stef

    1997-01-01

    This paper outlines how the stationary ARMA (p,q) model (G. Box and G. Jenkins, 1976) can be specified as a structural equation model. Maximum likelihood estimates for the parameters in the ARMA model can be obtained by software for fitting structural equation models. The method is applied to three problem types. (SLD)

  2. Temporal Structure of Support Surface Translations Drive the Temporal Structure of Postural Control During Standing

    PubMed Central

    Rand, Troy J.; Myers, Sara A.; Kyvelidou, Anastasia; Mukherjee, Mukul

    2015-01-01

    A healthy biological system is characterized by a temporal structure that exhibits fractal properties and is highly complex. Unhealthy systems demonstrate lowered complexity and either greater or less predictability in the temporal structure of a time series. The purpose of this research was to determine if support surface translations with different temporal structures would affect the temporal structure of the center of pressure (COP) signal. Eight healthy young participants stood on a force platform that was translated in the anteroposterior direction for input conditions of varying complexity: white noise, pink noise, brown noise, and sine wave. Detrended fluctuation analysis was used to characterize the long-range correlations of the COP time series in the AP direction. Repeated measures ANOVA revealed differences among conditions (P < .001). The less complex support surface translations resulted in a less complex COP compared to normal standing. A quadratic trend analysis demonstrated an inverted-u shape across an increasing order of predictability of the conditions (P < .001). The ability to influence the complexity of postural control through support surface translations can have important implications for rehabilitation. PMID:25994281
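
    A compact sketch of detrended fluctuation analysis as referenced above; the window sizes and the white-noise test signal are illustrative choices, not the study's COP data.

        import numpy as np

        def dfa_alpha(x, scales=(8, 16, 32, 64, 128)):
            # Integrate the mean-removed series, detrend it linearly in windows of
            # each scale, and fit log(F) vs log(scale); the slope is the DFA exponent.
            y = np.cumsum(x - np.mean(x))
            fluct = []
            for s in scales:
                rms = []
                for w in range(len(y) // s):
                    seg = y[w * s:(w + 1) * s]
                    t = np.arange(s)
                    trend = np.polyval(np.polyfit(t, seg, 1), t)
                    rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
                fluct.append(np.mean(rms))
            slope, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
            return slope

        rng = np.random.default_rng(1)
        print(dfa_alpha(rng.standard_normal(4096)))   # white noise should give alpha near 0.5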

  3. Post-Flight Estimation of Motion of Space Structures: Part 2

    NASA Technical Reports Server (NTRS)

    Brugarolas, Paul; Breckenridge, William

    2008-01-01

    A computer program related to the one described in the immediately preceding article estimates the relative position of two space structures that are hinged to each other. The input to the program consists of time-series data on distances, measured by two range finders at different positions on one structure, to a corner-cube retroreflector on the other structure. Given a Cartesian (x,y,z) coordinate system and the known x coordinate of the retroreflector relative to the y,z plane that contains the range finders, the program estimates the y and z coordinates of the retroreflector. The estimation process involves solving for the y,z coordinates of the intersection between (1) the y,z plane that contains the retroreflector and (2) spheres, centered on the range finders, having radii equal to the measured distances. In general, there are two such solutions and the program chooses the one consistent with the design of the structures. The program implements a Kalman filter. The output of the program is a time series of estimates of the relative position of the structures.
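
    A minimal geometric sketch of the estimation step described above, leaving out the Kalman filtering: each measured range defines a circle in the retroreflector's y,z plane, and the two circles are intersected. The positions and ranges are made-up numbers.

        import numpy as np

        def retroreflector_yz(finder1, finder2, range1, range2, x_plane):
            # Each range finder (x, y, z) and its measured distance define a sphere;
            # slicing by the plane x = x_plane leaves a circle in (y, z).
            def circle(finder, rng):
                dx = x_plane - finder[0]
                return np.array(finder[1:]), np.sqrt(rng ** 2 - dx ** 2)

            c1, r1 = circle(finder1, range1)
            c2, r2 = circle(finder2, range2)
            d = np.linalg.norm(c2 - c1)
            a = (r1 ** 2 - r2 ** 2 + d ** 2) / (2 * d)
            h = np.sqrt(r1 ** 2 - a ** 2)
            base = c1 + a * (c2 - c1) / d
            perp = np.array([-(c2 - c1)[1], (c2 - c1)[0]]) / d
            return base + h * perp, base - h * perp     # two candidates; pick by design

        sol_a, sol_b = retroreflector_yz((0.0, -0.5, 0.0), (0.0, 0.5, 0.0), 2.3, 2.1, 1.0)
        print(sol_a, sol_b)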

  4. Structured Sensory Trauma Interventions

    ERIC Educational Resources Information Center

    Steele, William; Kuban, Caelan

    2010-01-01

    This article features the National Institute of Trauma and Loss in Children (TLC), a program that has demonstrated via field testing, exploratory research, time series studies, and evidence-based research studies that its Structured Sensory Intervention for Traumatized Children, Adolescents, and Parents (SITCAP[R]) produces statistically…

  5. Low Streamflow Forecasting using Minimum Relative Entropy

    NASA Astrophysics Data System (ADS)

    Cui, H.; Singh, V. P.

    2013-12-01

    Minimum relative entropy spectral analysis is derived in this study and applied to forecast streamflow time series. The proposed method extends the autocorrelation in such a way that the relative entropy of the underlying process is minimized, so that the time series can be forecasted. Different prior estimates, such as uniform, exponential, and Gaussian assumptions, are used to estimate the spectral density depending on the autocorrelation structure. Seasonal and nonseasonal low streamflow series obtained from the Colorado River (Texas) under drought conditions are successfully forecasted using the proposed method. Minimum relative entropy determines the spectrum of low streamflow series with higher resolution than the conventional method. The forecasted streamflow is compared to predictions using Burg's maximum entropy spectral analysis (MESA) and configurational entropy. The advantages and disadvantages of each method in forecasting low streamflow are discussed.

  6. Empirical analysis of the effects of cyber security incidents.

    PubMed

    Davis, Ginger; Garcia, Alfredo; Zhang, Weide

    2009-09-01

    We analyze the time series associated with web traffic for a representative set of online businesses that have suffered widely reported cyber security incidents. Our working hypothesis is that cyber security incidents may prompt (security conscious) online customers to opt out and conduct their business elsewhere or, at the very least, to refrain from accessing online services. For companies relying almost exclusively on online channels, this presents an important business risk. We test for structural changes in these time series that may have been caused by these cyber security incidents. Our results consistently indicate that cyber security incidents do not affect the structure of web traffic for the set of online businesses studied. We discuss various public policy considerations stemming from our analysis.

  7. Scaling laws from geomagnetic time series

    USGS Publications Warehouse

    Voros, Z.; Kovacs, P.; Juhasz, A.; Kormendi, A.; Green, A.W.

    1998-01-01

    The notion of extended self-similarity (ESS) is applied here to the X-component time series of geomagnetic field fluctuations. Plotting nth-order structure functions against the fourth-order structure function, we show that low-frequency geomagnetic fluctuations up to order n = 10 follow the same scaling laws as MHD fluctuations in the solar wind; however, for higher frequencies (f > 1/5[h]) a clear departure from the expected universality is observed for n > 6. ESS does not allow us to make an unambiguous statement about the nontriviality of scaling laws in "geomagnetic" turbulence. However, we suggest using higher-order moments as promising diagnostic tools for mapping the contributions of various remote magnetospheric sources to local observatory data. Copyright 1998 by the American Geophysical Union.
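
    A minimal sketch of how nth-order structure functions can be computed and regressed against the fourth-order one in the ESS manner; the Brownian test signal and the lag range are illustrative assumptions.

        import numpy as np

        def structure_function(x, order, lags):
            # S_n(tau) = <|x(t + tau) - x(t)|^n> for each lag tau.
            return np.array([np.mean(np.abs(x[lag:] - x[:-lag]) ** order) for lag in lags])

        rng = np.random.default_rng(2)
        x = np.cumsum(rng.standard_normal(20000))      # Brownian stand-in for the X component
        lags = np.arange(1, 200)
        s4 = structure_function(x, 4, lags)
        for n in (2, 6, 8):
            sn = structure_function(x, n, lags)
            slope, _ = np.polyfit(np.log(s4), np.log(sn), 1)
            print(f"ESS slope of S_{n} against S_4: {slope:.3f}")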

  8. Hangar Fire Suppression Utilizing Novec 1230

    DTIC Science & Technology

    2018-01-01

    ... fuel fires in aircraft hangars. A 30×30×8-ft concrete-and-steel test structure was constructed for this test series. Four discharge assemblies ... structure. System discharge parameters (discharge time, discharge rate, and quantity of agent discharged) were adjusted to produce the desired Novec 1230 ...

  9. Studies in astronomical time series analysis. I - Modeling random processes in the time domain

    NASA Technical Reports Server (NTRS)

    Scargle, J. D.

    1981-01-01

    Several random process models in the time domain are defined and discussed. Attention is given to the moving average model, the autoregressive model, and relationships between and combinations of these models. Consideration is then given to methods for investigating pulse structure, procedures of model construction, computational methods, and numerical experiments. A FORTRAN algorithm of time series analysis has been developed which is relatively stable numerically. Results of test cases are given to study the effect of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the light curve of the quasar 3C 272 is considered as an example.

  10. Accuracy of time-domain and frequency-domain methods used to characterize catchment transit time distributions

    NASA Astrophysics Data System (ADS)

    Godsey, S. E.; Kirchner, J. W.

    2008-12-01

    The mean residence time - the average time that it takes rainfall to reach the stream - is a basic parameter used to characterize catchment processes. Heterogeneities in these processes lead to a distribution of travel times around the mean residence time. By examining this travel time distribution, we can better predict catchment response to contamination events. A catchment system with shorter residence times or narrower distributions will respond quickly to contamination events, whereas systems with longer residence times or longer-tailed distributions will respond more slowly to those same contamination events. The travel time distribution of a catchment is typically inferred from time series of passive tracers (e.g., water isotopes or chloride) in precipitation and streamflow. Variations in the tracer concentration in streamflow are usually damped compared to those in precipitation, because precipitation inputs from different storms (with different tracer signatures) are mixed within the catchment. Mathematically, this mixing process is represented by the convolution of the travel time distribution and the precipitation tracer inputs to generate the stream tracer outputs. Because convolution in the time domain is equivalent to multiplication in the frequency domain, it is relatively straightforward to estimate the parameters of the travel time distribution in either domain. In the time domain, the parameters describing the travel time distribution are typically estimated by maximizing the goodness of fit between the modeled and measured tracer outputs. In the frequency domain, the travel time distribution parameters can be estimated by fitting a power-law curve to the ratio of precipitation spectral power to stream spectral power. Differences between the methods of parameter estimation in the time and frequency domain mean that these two methods may respond differently to variations in data quality, record length and sampling frequency. Here we evaluate how well these two methods of travel time parameter estimation respond to different sources of uncertainty and compare the methods to one another. We do this by generating synthetic tracer input time series of different lengths, and convolve these with specified travel-time distributions to generate synthetic output time series. We then sample both the input and output time series at various sampling intervals and corrupt the time series with realistic error structures. Using these 'corrupted' time series, we infer the apparent travel time distribution, and compare it to the known distribution that was used to generate the synthetic data in the first place. This analysis allows us to quantify how different record lengths, sampling intervals, and error structures in the tracer measurements affect the apparent mean residence time and the apparent shape of the travel time distribution.
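
    A minimal time-domain sketch of the convolution relationship described above, with an assumed exponential travel time distribution and a synthetic tracer input; the real analysis would also fit the distribution parameters to data.

        import numpy as np

        def exponential_ttd(mean_tt, n_steps, dt=1.0):
            # Discrete exponential travel time distribution, normalized to sum to 1.
            t = np.arange(n_steps) * dt
            h = np.exp(-t / mean_tt)
            return h / h.sum()

        rng = np.random.default_rng(3)
        n = 1000
        precip_tracer = rng.standard_normal(n)               # synthetic precipitation tracer input
        ttd = exponential_ttd(mean_tt=60.0, n_steps=400)
        stream_tracer = np.convolve(precip_tracer, ttd)[:n]  # damped stream output
        print(precip_tracer.std(), stream_tracer.std())      # output variability is much smaller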

  11. A prospective interrupted time series study of interventions to improve the quality, rating, framing and structure of goal-setting in community-based brain injury rehabilitation.

    PubMed

    Hassett, Leanne; Simpson, Grahame; Cotter, Rachel; Whiting, Diane; Hodgkinson, Adeline; Martin, Diane

    2015-04-01

    To investigate whether the introduction of an electronic goals system followed by staff training improved the quality, rating, framing and structure of goals written by a community-based brain injury rehabilitation team. Interrupted time series design. Two interventions were introduced six months apart. The first intervention comprised the introduction of an electronic goals system. The second intervention comprised a staff goal training workshop. An audit protocol was devised to evaluate the goals. A random selection of goal statements from the 12 months prior to the interventions (Time 1 baseline) was compared with all goal statements written after the introduction of the electronic goals system (Time 2) and staff training (Time 3). All goals were de-identified for client and time-period, and randomly ordered. A total of 745 goals (Time 1 n = 242; Time 2 n = 283; Time 3 n = 220) were evaluated. Compared with baseline, the introduction of the electronic goals system alone significantly increased goal rating, framing and structure (χ² tests 144.7, 18.9, 48.1, respectively, p < 0.001). The addition of staff training meant that the improvement in goal quality, which was only a trend at Time 2, was statistically significant at Time 3 (χ² 15.0, p ≤ .001). The training also led to a further significant increase in the framing and structuring of goals over the electronic goals system (χ² 11.5, 12.5, respectively, p ≤ 0.001). An electronic goals system combined with staff training improved the quality, rating, framing and structure of goal statements. © The Author(s) 2014.

  12. The effect of transverse wave vector and magnetic fields on resonant tunneling times in double-barrier structures

    NASA Astrophysics Data System (ADS)

    Wang, Hongmei; Zhang, Yafei; Xu, Huaizhe

    2007-01-01

    The effect of transverse wave vector and magnetic fields on resonant tunneling times in double-barrier structures, which is significant but has been frequently omitted in previous theoretical methods, has been reported in this paper. The analytical expressions of the longitudinal energies of quasibound levels (LEQBL) and the lifetimes of quasibound levels (LQBL) in symmetrical double-barrier (SDB) structures have been derived as a function of transverse wave vector and longitudinal magnetic fields perpendicular to interfaces. Based on our derived analytical expressions, the LEQBL and LQBL dependence upon transverse wave vector and longitudinal magnetic fields has been explored numerically for a SDB structure. Model calculations show that the LEQBL decrease monotonically and the LQBL shorten with increasing transverse wave vector, and each original LEQBL splits to a series of sub-LEQBL which shift nearly linearly toward the well bottom and the lifetimes of quasibound level series (LQBLS) shorten with increasing Landau-level indices and magnetic fields.

  13. Correlation filtering in financial time series (Invited Paper)

    NASA Astrophysics Data System (ADS)

    Aste, T.; Di Matteo, Tiziana; Tumminello, M.; Mantegna, R. N.

    2005-05-01

    We apply a method to filter relevant information from the correlation coefficient matrix by extracting a network of relevant interactions. This method succeeds in generating networks with the same hierarchical structure as the Minimum Spanning Tree but containing a larger number of links, resulting in a richer network topology allowing loops and cliques. In Tumminello et al. [1], we have shown that this method, applied to a financial portfolio of 100 stocks in the USA equity markets, is quite efficient in filtering relevant information about the clustering of the system and its hierarchical structure both on the whole system and within each cluster. In particular, we have found that triangular loops and 4-element cliques have important and significant relations with the market structure and properties. Here we apply this filtering procedure to the analysis of correlation in two different kinds of interest rate time series (16 Eurodollars and 34 US interest rates).
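
    As a baseline illustration only, the sketch below converts a correlation matrix into distances and extracts the Minimum Spanning Tree; the richer loop-and-clique network of the paper is not reproduced, and the random return series are invented.

        import numpy as np
        from scipy.sparse.csgraph import minimum_spanning_tree

        rng = np.random.default_rng(4)
        returns = rng.standard_normal((500, 8))                 # 500 days, 8 synthetic assets
        corr = np.corrcoef(returns, rowvar=False)
        dist = np.sqrt(2.0 * (1.0 - corr))                      # common correlation-to-distance map
        mst = minimum_spanning_tree(dist).toarray()
        edges = [(i, j) for i in range(8) for j in range(8) if mst[i, j] > 0]
        print(edges)                                            # 7 links spanning the 8 assets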

  14. Tuning the Voices of a Choir: Detecting Ecological Gradients in Time-Series Populations.

    PubMed

    Buras, Allan; van der Maaten-Theunissen, Marieke; van der Maaten, Ernst; Ahlgrimm, Svenja; Hermann, Philipp; Simard, Sonia; Heinrich, Ingo; Helle, Gerd; Unterseher, Martin; Schnittler, Martin; Eusemann, Pascal; Wilmking, Martin

    2016-01-01

    This paper introduces a new approach, the Principal Component Gradient Analysis (PCGA), to detect ecological gradients in time-series populations, i.e. several time-series originating from different individuals of a population. Detection of ecological gradients is of particular importance when dealing with time-series from heterogeneous populations which express differing trends. PCGA makes use of polar coordinates of loadings from the first two axes obtained by principal component analysis (PCA) to define groups of similar trends. Based on the mean inter-series correlation (rbar), the gain of increasing a common underlying signal by PCGA groups is quantified using Monte Carlo simulations. In terms of validation, PCGA is compared to three other existing approaches. Focusing on dendrochronological examples, PCGA is shown to correctly determine population gradients and in particular cases to be advantageous over the other considered methods. Furthermore, in each example the PCGA groups allowed for enhancing the strength of a common underlying signal, comparably well to hierarchical cluster analysis. Our results indicate that PCGA potentially allows for a better understanding of mechanisms causing time-series population gradients as well as objectively enhancing the performance of climate transfer functions in dendroclimatology. While our examples highlight the relevance of PCGA to the field of dendrochronology, we believe that other disciplines working with data of comparable structure may also benefit from PCGA.
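
    A minimal sketch of the core PCGA step as described: PCA over a population of series and the polar angle of each series' loadings on the first two components; the synthetic two-group population and the angle-based grouping are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(5)
        n_years, n_trees = 200, 12
        signal_a, signal_b = rng.standard_normal((2, n_years))
        # Half of the population follows signal A, half follows signal B.
        series = np.column_stack(
            [signal_a + 0.5 * rng.standard_normal(n_years) for _ in range(6)] +
            [signal_b + 0.5 * rng.standard_normal(n_years) for _ in range(6)])

        z = (series - series.mean(axis=0)) / series.std(axis=0)
        _, _, vt = np.linalg.svd(z, full_matrices=False)
        loadings = vt[:2].T                               # loadings of each series on PC1, PC2
        angles = np.degrees(np.arctan2(loadings[:, 1], loadings[:, 0]))
        print(np.round(angles, 1))                        # similar trends share similar angles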

  16. Single crystal synthesis and magnetism of the BaLn2O4 family (Ln = lanthanide)

    DOE PAGES

    Besara, Tiglet; Lundberg, Matthew S.; Sun, Jifeng; ...

    2014-05-27

    The series of compounds in the BaLn2O4 family (Ln = La–Lu, Y) has been synthesized for the first time in single crystalline form, using a molten metal flux. The series crystallizes in the CaV2O4 structure type with primitive orthorhombic symmetry (space group Pnma, #62), and a complete structural study of atomic positions, bonds, angles, and distortions across the lanthanide series is presented. With the exception of the Y, La, Eu, and Lu members, magnetic susceptibility measurements were performed between 2 K and 300 K. BaCe2O4 and BaYb2O4 display large crystal field effects and suppression of magnetic ordering. All compounds show signs of magnetic frustration due to the trigonal arrangements of the trivalent lanthanide cations in the structure.

  17. Reconstructions of parameters of radiophysical chaotic generator with delayed feedback from short time series

    NASA Astrophysics Data System (ADS)

    Ishbulatov, Yu. M.; Karavaev, A. S.; Kiselev, A. R.; Semyachkina-Glushkovskaya, O. V.; Postnov, D. E.; Bezruchko, B. P.

    2018-04-01

    A method for the reconstruction of time-delayed feedback systems is investigated, which is based on the detection of the synchronous response of a slave time-delay system to driving from the master system under study. The structure of the driven system is similar to the structure of the studied time-delay system, but the feedback circuit is broken in the driven system. The efficiency of the method is tested using short and noisy data obtained from an electronic chaotic oscillator with time-delayed feedback.

  18. NEW SUNS IN THE COSMOS. III. MULTIFRACTAL SIGNATURE ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freitas, D. B. de; Nepomuceno, M. M. F.; Junior, P. R. V. de Moraes

    2016-11-01

    In the present paper, we investigate the multifractality signatures in hourly time series extracted from the CoRoT spacecraft database. Our analysis is intended to highlight the possibility that astrophysical time series can be members of a particular class of complex and dynamic processes, which require several photometric variability diagnostics to characterize their structural and topological properties. To achieve this goal, we search for contributions due to a nonlinear temporal correlation and effects caused by heavier tails than the Gaussian distribution, using a detrending moving average algorithm for one-dimensional multifractal signals (MFDMA). We observe that the correlation structure is the main source of multifractality, while heavy-tailed distribution plays a minor role in generating the multifractal effects. Our work also reveals that the rotation period of stars is inherently scaled by the degree of multifractality. As a result, analyzing the multifractal degree of the referred series, we uncover an evolution of multifractality from shorter to larger periods.

  19. Governance structure reform and antibiotics prescription in community health centres in Shenzhen, China.

    PubMed

    Liang, Xiaoyun; Xia, Tingsong; Zhang, Xiulan; Jin, Chenggang

    2014-06-01

    It is unclear whether changing the governance structure of community health centres (CHCs) could affect antibiotic prescribing behaviour. To explore how changes in governance structure affect antibiotic prescription for children younger than 5 years of age with acute upper respiratory tract infections (AURI) in CHCs in Shenzhen, China. This study used an interrupted time series design with a comparison series. On 1 June 2009, the Health Bureau of Shenzhen's Baoan District transferred CHCs from a hospital-affiliated model to a self-managed independent model regarding finance, personnel and employee compensation. We collected 23481 electronic medical records of children younger than 5 years of age who were treated for AURI on an outpatient basis 1 year before and 1 year after governance structure reform. We used segmented regression analysis to evaluate the effect of reform on antibiotic prescription. After the reform, the proportion of patients receiving an antibiotic injection per month and the proportion of patients receiving two or more antibiotics conditional on receiving an antibiotic per month decreased 9.17% and 7.34%, respectively (P < 0.01 or P < 0.05). In the intervention series, the proportion of patients receiving an antibiotic injection per month and the monthly average cost of the antibiotics prescribed per patient continued to decrease over time compared with the control series (P < 0.001 or P < 0.05). This study suggests that governance structure reform can have positive effects on antibiotic prescribing behaviour. Moreover, this short-term effect might have important implications for further community health reforms in China. © The Author 2014. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
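
    A minimal sketch of segmented (interrupted time series) regression of the kind used here, fit by ordinary least squares on an invented monthly proportion series; the level-change and slope-change coding and all numbers are assumptions for illustration.

        import numpy as np

        rng = np.random.default_rng(6)
        months = np.arange(24)
        reform = (months >= 12).astype(float)          # 0 before the reform, 1 after
        months_after = np.where(months >= 12, months - 12, 0)

        # Synthetic monthly proportion of patients receiving an antibiotic injection.
        y = 0.40 - 0.002 * months - 0.09 * reform - 0.004 * months_after \
            + 0.01 * rng.standard_normal(24)

        X = np.column_stack([np.ones_like(months, dtype=float), months, reform, months_after])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        print("baseline level %.3f, baseline trend %.4f, level change %.3f, trend change %.4f"
              % tuple(beta))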

  20. 40 CFR 796.3100 - Aerobic aquatic biodegradation.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Aerobic aquatic biodegradation. (a) Introduction—(1) Purpose. (i) This Guideline is designed to develop... biodegradability of a series of functionally or structurally related chemicals, media from all inoculum flasks may..., and control system should be analyzed at time zero and at a minimum of four other times from time zero...

  1. 40 CFR 796.3100 - Aerobic aquatic biodegradation.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Aerobic aquatic biodegradation. (a) Introduction—(1) Purpose. (i) This Guideline is designed to develop... biodegradability of a series of functionally or structurally related chemicals, media from all inoculum flasks may..., and control system should be analyzed at time zero and at a minimum of four other times from time zero...

  2. 40 CFR 796.3100 - Aerobic aquatic biodegradation.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Aerobic aquatic biodegradation. (a) Introduction—(1) Purpose. (i) This Guideline is designed to develop... biodegradability of a series of functionally or structurally related chemicals, media from all inoculum flasks may..., and control system should be analyzed at time zero and at a minimum of four other times from time zero...

  3. First Indications of Intraplate Deformations in Central Germany from Reprocessed GNSS Time Series and Geological Data

    NASA Astrophysics Data System (ADS)

    Becker, Matthias; Leinen, Stefan; Läufer, Gwendolyn; Lehné, Rouwen

    2013-04-01

    Six years of GPS data have been reprocessed in ITRF2008 for a regional SAPOS CORS network in the federal state of Hesse with 25 stations and some anchor sites of IGS and EPN to derive accurate and consistent coordinate time series. Based on daily network solutions, coordinate time series parameters like velocities, offsets in case of antenna changes, and annual periodic variations have been estimated. The estimation process includes the fitting of a sophisticated stochastic model for the time series which accounts for inherent time correlation. The results are blended with geological data to verify information from geology on potential recent deformations by the geodetic analyses. Besides some information on the reprocessing of the GNSS data, the stochastics of the derived velocity field will be discussed in detail. Special emphasis will be on the intra-plate deformation: for the horizontal component the residual velocity field after removal of a plate rotation model is presented, while for the vertical velocities the datum-induced systematic effect is removed in order to analyze the remaining vertical motion. The residual velocity field is then matched with the geology for Hesse. Correlation of both vertical and horizontal movements with major geological structures reveals good accordance. SAPOS stations with documented significant subsidence are mainly located in tertiary Graben structures such as the Lower Hessian Basin (station Kassel), the Wetterau (station Kloppenheim) or the Upper Rhine Graben (Station Darmstadt). From the geological point of view these structures are supposed to be subsiding ones. Other major geological features, i.e. the Rhenish Shield as well as the East Hessian Bunter massif, are supposed to be affected by recent uplift. SAPOS stations located in these regions match the assumed movement (e.g. Weilburg, Wiesbaden, Bingen, Fulda). Furthermore, SAPOS-derived horizontal movements seem to trace tectonic movements in the region, i.e. extension along the tertiary Graben structures, including a sinistral strike slip component. However, a more detailed analysis is needed to confirm the link between detected movement and geodynamic processes.

  4. An Evaluation of Subsurface Plumbing of a Hydrothermal Seep Field and Possible Influence from Local Seismicity from New Time-Series Data Collected at the Davis-Schrimpf Seep Field, Salton Trough, California

    NASA Astrophysics Data System (ADS)

    Rao, A.; Onderdonk, N.

    2016-12-01

    The Davis-Schrimpf Seep Field (DSSF) is a group of approximately 50 geothermal mud seeps (gryphons) in the Salton Trough of southeastern California. Its location puts it in line with the mapped San Andreas Fault, if extended further south, as well as within the poorly-understood Brawley Seismic Zone. Much of the geomorphology, geochemistry, and other characteristics of the DSSF have been analyzed, but its subsurface structure remains unknown. Here we present data and interpretations from five new temperature time-series from four separate gryphons at the DSSF, and compare them both amongst themselves, and within the context of all previously collected data to identify possible patterns constraining the subsurface dynamics. Simultaneously collected time-series from different seeps were cross-correlated to quantify similarity. All years' time-series were checked against the record of local seismicity to identify any seismic influence on temperature excursions. Time-series captured from the same feature in different years were statistically summarized and the results plotted to examine their evolution over time. We found that adjacent vents often alternate in temperature, suggesting a switching of flow path of the erupted mud at the scale of a few meters or less. Noticeable warming over time was observed in most of the features with time-series covering multiple years. No synchronicity was observed between DSSF features' temperature excursions and seismic events within a 24 kilometer radius covering most of the width of the surrounding Salton Trough.

  5. Theoretical research and experimental validation of quasi-static load spectra on bogie frame structures of high-speed trains

    NASA Astrophysics Data System (ADS)

    Zhu, Ning; Sun, Shou-Guang; Li, Qiang; Zou, Hua

    2014-12-01

    One of the major problems in structural fatigue life analysis is establishing structural load spectra under actual operating conditions. This study conducts theoretical research and experimental validation of quasi-static load spectra on bogie frame structures of high-speed trains. The quasi-static load series that correspond to quasi-static deformation modes are identified according to the structural form and bearing conditions of high-speed train bogie frames. Moreover, a force-measuring frame is designed and manufactured based on the quasi-static load series. The load decoupling model of the quasi-static load series is then established via calibration tests. Quasi-static load-time histories, together with online tests and decoupling analysis, are obtained for the intermediate range of the Beijing-Shanghai dedicated passenger line. The damage consistency calibration of the quasi-static discrete load spectra is performed according to a damage consistency criterion and a genetic algorithm. The calibrated damage that corresponds with the quasi-static discrete load spectra satisfies the safety requirements of bogie frames.

  6. Microsecond resolved single-molecule FRET time series measurements based on the line confocal optical system combined with hybrid photodetectors.

    PubMed

    Oikawa, Hiroyuki; Takahashi, Takumi; Kamonprasertsuk, Supawich; Takahashi, Satoshi

    2018-01-31

    Single-molecule (sm) fluorescence time series measurements based on the line confocal optical system are a powerful strategy for the investigation of the structure, dynamics, and heterogeneity of biological macromolecules. This method enables the detection of more than several thousands of fluorescence photons per millisecond from single fluorophores, implying that the potential time resolution for measurements of the fluorescence resonance energy transfer (FRET) efficiency is 10 μs. However, the necessity of using imaging photodetectors in the method limits the time resolution in the FRET efficiency measurements to approximately 100 μs. In this investigation, a new photodetector called a hybrid photodetector (HPD) was incorporated into the line confocal system to improve the time resolution without sacrificing the length of the time series detection. Among several settings examined, the system based on a slit width of 10 μm and a high-speed counting device made the best of the features of the line confocal optical system and the HPD. This method achieved a time resolution of 10 μs and an observation time of approximately 5 ms in the sm-FRET time series measurements. The developed device was used for the native state of the B domain of protein A.

  7. New Features for Neuron Classification.

    PubMed

    Hernández-Pérez, Leonardo A; Delgado-Castillo, Duniel; Martín-Pérez, Rainer; Orozco-Morales, Rubén; Lorenzo-Ginori, Juan V

    2018-04-28

    This paper addresses the problem of obtaining new neuron features capable of improving the results of neuron classification. Most studies on neuron classification using morphological features have been based on Euclidean geometry. Here, three one-dimensional (1D) time series are instead derived from the three-dimensional (3D) structure of the neuron, and a spatial time series is then constructed from which the features are calculated. Digitally reconstructed neurons were separated into control and pathological sets, which are related to three categories of alterations caused by epilepsy, Alzheimer's disease (long and local projections), and ischemia. These neuron sets were then subjected to supervised classification and the results were compared considering three sets of features: morphological, features obtained from the time series, and a combination of both. The best results were obtained using features from the time series, which outperformed the classification using only morphological features, showing higher correct classification rates with differences of 5.15, 3.75, and 5.33% for epilepsy and Alzheimer's disease (long and local projections), respectively. The morphological features were better for the ischemia set with a difference of 3.05%. Features like variance, Spearman auto-correlation, partial auto-correlation, mutual information, local minima and maxima, all related to the time series, exhibited the best performance. Also, we compared different evaluators, among which ReliefF was the best ranked.
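
    A minimal sketch of the kind of time-series features named above (variance, lag-1 autocorrelation, counts of local extrema), computed on a generic 1D series; the mapping from a 3D reconstruction to the series is not reproduced here.

        import numpy as np

        def series_features(x, lag=1):
            x = np.asarray(x, dtype=float)
            centered = x - x.mean()
            autocorr = np.dot(centered[:-lag], centered[lag:]) / np.dot(centered, centered)
            # Interior points larger (smaller) than both neighbours are local maxima (minima).
            maxima = np.sum((x[1:-1] > x[:-2]) & (x[1:-1] > x[2:]))
            minima = np.sum((x[1:-1] < x[:-2]) & (x[1:-1] < x[2:]))
            return {"variance": x.var(), "lag1_autocorr": autocorr,
                    "local_maxima": int(maxima), "local_minima": int(minima)}

        rng = np.random.default_rng(7)
        print(series_features(np.cumsum(rng.standard_normal(300))))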

  8. A scalable database model for multiparametric time series: a volcano observatory case study

    NASA Astrophysics Data System (ADS)

    Montalto, Placido; Aliotta, Marco; Cassisi, Carmelo; Prestifilippo, Michele; Cannata, Andrea

    2014-05-01

    The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. By the term time series we refer to a set of observations of a given phenomenon acquired sequentially in time. When the time intervals are equally spaced, one speaks of the period or sampling frequency. Our work describes in detail a possible methodology for storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), in order to acquire time series from different data sources and standardize them within a relational database. The operation of standardization provides the ability to perform operations, such as query and visualization, of many measures, synchronizing them using a common time scale. The proposed architecture follows a multiple layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in performing particular operations for the reorganization and archiving of data from different sources such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the loader layer performs a security check of the working status of each running software component through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although our system has to manage huge amounts of data, performance is guaranteed by using a smart table partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, which make it possible to query different time series over a specified time range, or to follow the real-time signal acquisition, according to a data access policy for the users.

  9. A multidisciplinary database for geophysical time series management

    NASA Astrophysics Data System (ADS)

    Montalto, P.; Aliotta, M.; Cassisi, C.; Prestifilippo, M.; Cannata, A.

    2013-12-01

    The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. By the term time series we refer to a set of observations of a given phenomenon acquired sequentially in time. When the time intervals are equally spaced, one speaks of the period or sampling frequency. Our work describes in detail a possible methodology for storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), in order to acquire time series from different data sources and standardize them within a relational database. The operation of standardization provides the ability to perform operations, such as query and visualization, of many measures, synchronizing them using a common time scale. The proposed architecture follows a multiple layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in performing particular operations for the reorganization and archiving of data from different sources such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the loader layer performs a security check of the working status of each running software component through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although our system has to manage huge amounts of data, performance is guaranteed by using a smart table partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, which make it possible to query different time series over a specified time range, or to follow the real-time signal acquisition, according to a data access policy for the users.

  10. Stochastic model stationarization by eliminating the periodic term and its effect on time series prediction

    NASA Astrophysics Data System (ADS)

    Moeeni, Hamid; Bonakdari, Hossein; Fatemi, Seyed Ehsan

    2017-04-01

    Because time series stationarization has a key role in stochastic modeling results, three methods are analyzed in this study. The methods are seasonal differencing, seasonal standardization and spectral analysis to eliminate the periodic effect on time series stationarity. First, six time series including 4 streamflow series and 2 water temperature series are stationarized. The stochastic term for these series obtained with ARIMA is subsequently modeled. For the analysis, 9228 models are introduced. It is observed that seasonal standardization and spectral analysis eliminate the periodic term completely, while seasonal differencing maintains seasonal correlation structures. The obtained results indicate that all three methods present acceptable performance overall. However, model accuracy in monthly streamflow prediction is higher with seasonal differencing than with the other two methods. Another advantage of seasonal differencing over the other methods is that the monthly streamflow is never estimated as negative. Standardization is the best method for predicting monthly water temperature although it is quite similar to seasonal differencing, while spectral analysis performed the weakest in all cases. It is concluded that for each monthly seasonal series, seasonal differencing is the best stationarization method in terms of periodic effect elimination. Moreover, the monthly water temperature is predicted with more accuracy than monthly streamflow. The criteria of the average stochastic term divided by the amplitude of the periodic term obtained for monthly streamflow and monthly water temperature were 0.19 and 0.30, 0.21 and 0.13, and 0.07 and 0.04 respectively. As a result, the periodic term is more dominant than the stochastic term for water temperature in the monthly water temperature series compared to streamflow series.
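
    A minimal sketch of two of the stationarization options compared above, applied to an invented monthly series with a seasonal period of 12; the synthetic flow values are assumptions for illustration.

        import numpy as np

        def seasonal_difference(x, period=12):
            # Each value minus the value one season earlier.
            return x[period:] - x[:-period]

        def seasonal_standardize(x, period=12):
            # Remove each calendar month's own mean and divide by its own std.
            x = np.asarray(x, dtype=float)
            out = np.empty_like(x)
            for m in range(period):
                vals = x[m::period]
                out[m::period] = (vals - vals.mean()) / vals.std()
            return out

        months = np.arange(240)
        flow = 50 + 20 * np.sin(2 * np.pi * months / 12) + np.random.default_rng(8).normal(0, 3, 240)
        print(seasonal_difference(flow)[:5])
        print(seasonal_standardize(flow)[:5])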

  11. Characterizing the temporal variability of L-band backscatter using dense UAVSAR time-series in preparation for the NISAR mission

    NASA Astrophysics Data System (ADS)

    Lavalle, M.; Lee, A.; Shiroma, G. X. H.; Rosen, P. A.

    2017-12-01

    The NASA-ISRO SAR (NISAR) mission will deliver unprecedented global maps of L-band HH/HV backscatter every 12 days with resolution ranging from a few to tens of meters in support of ecosystem, solid Earth and cryosphere science and applications. Understanding and modeling the temporal variability of L-band backscatter over temporal scales of years, months and days is critical for developing retrieval algorithms that can robustly extract the biophysical variables of interest (e.g., forest biomass, soil moisture, etc.) from NISAR time series. In this talk, we will focus on the 5-year time series of 60 JPL/UAVSAR polarimetric images collected near the Sacramento Delta to characterize the inter-annual, seasonal and short-scale variability of the L-band polarimetric backscatter for a broad range of land cover types. Our preliminary analysis reveals that backscatter from man-made structures is very stable over time, whereas backscatter from bare soil and herbaceous vegetation fluctuates over time with a standard deviation of 2.3 dB. Land-cover classes with larger biomass such as trees and tall vegetation show about 1.5 dB standard deviation in temporal backscatter variability. Closer examination of high-spatial-resolution UAVSAR imagery also reveals that vegetation structure, speckle noise and horizontal forest heterogeneity in the Sacramento Delta area can significantly affect the point-wise backscatter value. In our talk, we will illustrate the long UAVSAR time series, describe our data analysis strategy, show the results of polarimetric variability for different land cover classes and number of looks, and discuss the implications for the development of NISAR L2/L3 retrieval algorithms for ecosystem science.

  12. Time-frequency analysis of functional optical mammographic images

    NASA Astrophysics Data System (ADS)

    Barbour, Randall L.; Graber, Harry L.; Schmitz, Christoph H.; Tarantini, Frank; Khoury, Georges; Naar, David J.; Panetta, Thomas F.; Lewis, Theophilus; Pei, Yaling

    2003-07-01

    We have introduced working technology that provides for time-series imaging of the hemoglobin signal in large tissue structures. In this study we have explored our ability to detect aberrant time-frequency responses of breast vasculature for subjects with Stage II breast cancer at rest and in response to simple provocations. The hypothesis being explored is that time-series imaging will be sensitive to the known structural and functional malformations of the tumor vasculature. Mammographic studies were conducted using an adjustable hemispheric measuring head containing 21 source and 21 detector locations (441 source-detector pairs). Simultaneous dual-wavelength studies were performed at 760 and 830 nm at a framing rate of ~2.7 Hz. Optical measures were performed on women lying prone with the breast hanging in a pendant position. Two classes of measures were performed: (1) a 20-minute baseline measure wherein the subject was at rest; (2) provocation studies wherein the subject was asked to perform some simple breathing maneuvers. Collected data were analyzed to identify the time-frequency structure and central tendencies of the detector responses and those of the image time series. Imaging data were generated using the Normalized Difference Method (Pei et al., Appl. Opt. 40, 5755-5769, 2001). Results obtained clearly document three classes of anomalies when compared to the normal contralateral breast. 1) Breast tumors exhibit altered oxygen supply/demand imbalance in response to an oxidative challenge (breath hold). 2) The vasomotor response of the tumor vasculature is mainly depressed and exhibits an altered modulation. 3) The affected area of the breast wherein the altered vasomotor signature is seen extends well beyond the limits of the tumor itself.

  13. Data Rods: High Speed, Time-Series Analysis of Massive Cryospheric Data Sets Using Object-Oriented Database Methods

    NASA Astrophysics Data System (ADS)

    Liang, Y.; Gallaher, D. W.; Grant, G.; Lv, Q.

    2011-12-01

    Change over time is the central driver of climate change detection. The goal is to diagnose the underlying causes, and make projections into the future. In an effort to optimize this process we have developed the Data Rod model, an object-oriented approach that provides the ability to query grid cell changes and their relationships to neighboring grid cells through time. The time series data is organized in time-centric structures called "data rods." A single data rod can be pictured as the multi-spectral data history at one grid cell: a vertical column of data through time. This resolves the long-standing problem of managing time-series data and opens new possibilities for temporal data analysis. This structure enables rapid time-centric analysis at any grid cell across multiple sensors and satellite platforms. Collections of data rods can be spatially and temporally filtered, statistically analyzed, and aggregated for use with pattern matching algorithms. Likewise, individual image pixels can be extracted to generate multi-spectral imagery at any spatial and temporal location. The Data Rods project has created a series of prototype databases to store and analyze massive datasets containing multi-modality remote sensing data. Using object-oriented technology, this method overcomes the operational limitations of traditional relational databases. To demonstrate the speed and efficiency of time-centric analysis using the Data Rods model, we have developed a sea ice detection algorithm. This application determines the concentration of sea ice in a small spatial region across a long temporal window. If performed using traditional analytical techniques, this task would typically require extensive data downloads and spatial filtering. Using Data Rods databases, the exact spatio-temporal data set is immediately available. No extraneous data are downloaded, and all selected data querying occurs transparently on the server side. Moreover, fundamental statistical calculations such as running averages are easily implemented against the time-centric columns of data.

  14. Time series analysis of temporal networks

    NASA Astrophysics Data System (ADS)

    Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh

    2016-01-01

    A common but important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application-oriented studies only knowing these properties is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without the knowledge of the full network structure; rather, an estimate of some of the properties is sufficient to launch the attack. In this paper we show that even if the network structure at a future time point is not available, one can still manage to estimate its properties. We propose a novel method to map a temporal network to a set of time series instances, analyze them and, using a standard time series forecast model, try to predict the properties of a temporal network at a later time instance. To this aim, we consider eight properties such as the number of active nodes, average degree, clustering coefficient, etc. and apply our prediction framework to them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as Auto-Regressive-Integrated-Moving-Average (ARIMA). We use cross validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency domain properties of the time series obtained from spectrogram analysis could be used to refine the prediction framework by identifying beforehand the cases where the error in prediction is likely to be high. This leads to an improvement of 7.96% (for error level ≤20%) in prediction accuracy on average across all datasets. As an application we show how such a prediction scheme can be used to launch targeted attacks on temporal networks. Contribution to the Topical Issue "Temporal Network Theory and Applications", edited by Petter Holme.
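
    A minimal sketch of the forecasting step, assuming the statsmodels library is available: a per-snapshot property series (an invented "number of active nodes" count) is fit with an ARIMA model and extrapolated a few snapshots ahead.

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(9)
        # Synthetic per-snapshot property, e.g. number of active nodes with drift and noise.
        active_nodes = 100 + np.cumsum(rng.normal(0.5, 2.0, 150))

        model = ARIMA(active_nodes, order=(2, 1, 1)).fit()
        forecast = model.forecast(steps=5)               # properties of the next 5 snapshots
        print(forecast)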

  15. Development and Testing of Data Mining Algorithms for Earth Observation

    NASA Technical Reports Server (NTRS)

    Glymour, Clark

    2005-01-01

    The new algorithms developed under this project included a principled procedure for classification of objects, events or circumstances according to a target variable when a very large number of potential predictor variables is available but the number of cases that can be used for training a classifier is relatively small. These "high dimensional" problems require finding a minimal set of variables, called the Markov Blanket, sufficient for predicting the value of the target variable. An algorithm, the Markov Blanket Fan Search, was developed, implemented and tested on both simulated and real data in conjunction with a graphical model classifier, which was also implemented. Another algorithm developed and implemented in TETRAD IV for time series elaborated on work by C. Granger and N. Swanson, which in turn exploited some of our earlier work. The algorithms in question learn a linear time series model from data. Given such a time series, the simultaneous residual covariances, after factoring out time dependencies, may provide information about causal processes that occur more rapidly than the time series representation allows, so-called simultaneous or contemporaneous causal processes. Working with A. Monetta, a graduate student from Italy, we produced the correct statistics for estimating the contemporaneous causal structure from time series data using the TETRAD IV suite of algorithms. Two economists, David Bessler and Kevin Hoover, have independently published applications using TETRAD-style algorithms to the same purpose. These implementations and algorithmic developments were separately used in two kinds of studies of climate data: short time series of geographically proximate climate variables predicting agricultural effects in California, and longer-duration climate measurements of temperature teleconnections.

  16. Nonlinear dynamic analysis of D α signals for type I edge localized modes characterization on JET with a carbon wall

    NASA Astrophysics Data System (ADS)

    Cannas, Barbara; Fanni, Alessandra; Murari, Andrea; Pisano, Fabio; Contributors, JET

    2018-02-01

    In this paper, the dynamic characteristics of type-I ELM time-series from the JET tokamak, the world's largest magnetic confinement plasma physics experiment, have been investigated. The dynamic analysis has been focused on the detection of nonlinear structure in D α radiation time series. Firstly, the method of surrogate data has been applied to evaluate the statistical significance of the null hypothesis of static nonlinear distortion of an underlying Gaussian linear process. Several nonlinear statistics have been evaluated, such as the time-delayed mutual information, the correlation dimension and the maximal Lyapunov exponent. The obtained results allow us to reject the null hypothesis, giving evidence of underlying nonlinear dynamics. Moreover, no evidence of low-dimensional chaos has been found; indeed, the analysed time series are better characterized by the power law sensitivity to initial conditions, which can suggest a motion at the 'edge of chaos', at the border between chaotic and regular non-chaotic dynamics. This uncertainty makes it necessary to investigate the nature of the nonlinear dynamics further. For this purpose, a second surrogate test to distinguish chaotic orbits from pseudo-periodic orbits has been applied. In this case, we cannot reject the null hypothesis, which means that the ELM time series is possibly pseudo-periodic. In order to reproduce pseudo-periodic dynamical properties, a periodic state-of-the-art model, proposed to reproduce the ELM cycle, has been corrupted by a dynamical noise, obtaining time series qualitatively in agreement with experimental time series.
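
    A minimal sketch of the surrogate-data idea used above: phase-randomized surrogates preserve the linear spectrum, and a simple nonlinear statistic (time-reversal asymmetry, chosen here for brevity) is compared between the original series and its surrogates; the logistic-map signal is only a stand-in for the D α data.

        import numpy as np

        def phase_randomized_surrogate(x, rng):
            # Keep the amplitude spectrum, randomize the phases: linear
            # properties survive, nonlinear structure is destroyed.
            spectrum = np.fft.rfft(x)
            phases = rng.uniform(0, 2 * np.pi, len(spectrum))
            phases[0] = 0.0
            return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n=len(x))

        def reversal_asymmetry(x, lag=1):
            # Zero in expectation for time-reversible (e.g. Gaussian linear) processes.
            return np.mean((x[lag:] - x[:-lag]) ** 3)

        rng = np.random.default_rng(10)
        x = np.empty(2048); x[0] = 0.1
        for t in range(2047):                     # simple nonlinear (logistic-map) series
            x[t + 1] = 3.8 * x[t] * (1.0 - x[t])
        stats = [reversal_asymmetry(phase_randomized_surrogate(x, rng)) for _ in range(99)]
        print(reversal_asymmetry(x), np.percentile(stats, [2.5, 97.5]))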

  17. Long-Term Stability of Radio Sources in VLBI Analysis

    NASA Technical Reports Server (NTRS)

    Engelhardt, Gerald; Thorandt, Volkmar

    2010-01-01

    Positional stability of radio sources is an important requirement for modeling a single source position over the complete span of VLBI data, presently more than 20 years. The stability of radio sources can be verified by analyzing time series of radio source coordinates. One approach is a statistical test for the normal distribution of the residuals about the weighted mean for each coordinate component of a radio source's time series. Systematic phenomena in the time series can thus be detected. Nevertheless, an inspection of the rate estimates and of the weighted root-mean-square (WRMS) variations about the mean is also necessary. On the basis of the time series computed by the BKG group in the frame of the ICRF2 working group, 226 stable radio sources with an axis stability of 10 as could be identified. They include 100 ICRF2 axis-defining sources, which were determined independently of the method applied in the ICRF2 working group. A further 29 stable radio sources with a source structure index of less than 3.0 can also be used to increase the number of the 295 ICRF2 defining sources.
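
    A minimal sketch of the per-source stability checks described above: residuals about the weighted mean, their WRMS, a weighted linear rate estimate, and a normality test. The function name and the use of scipy's omnibus normality test are assumptions; the original analysis may use a different test.

        # Sketch of stability metrics for one coordinate component of a source.
        import numpy as np
        from scipy import stats

        def stability_metrics(t, pos, sigma):
            """t: epochs; pos: one coordinate component; sigma: formal errors."""
            w = 1.0 / sigma**2
            wmean = np.sum(w * pos) / np.sum(w)
            resid = pos - wmean
            wrms = np.sqrt(np.sum(w * resid**2) / np.sum(w))        # WRMS about the mean
            rate = np.polyfit(t, pos, 1, w=1.0 / sigma)[0]          # weighted linear rate
            pvalue = stats.normaltest(resid).pvalue                 # H0: residuals are normal
            return wrms, rate, pvalue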

  18. Effect of real-time boundary wind conditions on the air flow and pollutant dispersion in an urban street canyon—Large eddy simulations

    NASA Astrophysics Data System (ADS)

    Zhang, Yun-Wei; Gu, Zhao-Lin; Cheng, Yan; Lee, Shun-Cheng

    2011-07-01

    Air flow and pollutant dispersion characteristics in an urban street canyon are studied under real-time boundary conditions. A new scheme for realizing real-time boundary conditions in simulations is proposed, to keep the upper boundary wind conditions consistent with the measured time series of wind data. The air flow structure and its evolution under real-time boundary wind conditions are simulated using this new scheme, and the effect of the time series of ambient wind conditions on the flow structures inside and above the street canyon is investigated. The flow shows obvious intermittency in the street canyon, and a flapping shear layer forms near the roof level under real-time wind conditions, resulting in the expansion or compression of the air mass in the canyon. The simulations of pollutant dispersion show that the pollutants inside and above the street canyon are transported by different dispersion mechanisms, depending on the time series of air flow structures. Large-scale air movements during the expansion or compression of the air mass in the canyon have an obvious effect on pollutant dispersion. The simulations also show that the transport of pollutants from the canyon to the upper air flow is dominated by the shear-layer turbulence near the roof level and by the expansion or compression of the air mass in the street canyon under real-time boundary wind conditions. In particular, the expansion of the air mass, which features large-scale air movement, contributes more to pollutant dispersion in this study. Comparisons of simulated results under different boundary wind conditions indicate that real-time boundary wind conditions produce more favorable conditions for pollutant dispersion than artificially designed steady boundary wind conditions.

  19. A New Strategy for Analyzing Time-Series Data Using Dynamic Networks: Identifying Prospective Biomarkers of Hepatocellular Carcinoma.

    PubMed

    Huang, Xin; Zeng, Jun; Zhou, Lina; Hu, Chunxiu; Yin, Peiyuan; Lin, Xiaohui

    2016-08-31

    Time-series metabolomics studies can provide insight into the dynamics of disease development and facilitate the discovery of prospective biomarkers. To improve the performance of early risk identification, a new strategy for analyzing time-series data based on dynamic networks (ATSD-DN) in a systematic time dimension is proposed. In ATSD-DN, the non-overlapping ratio was applied to measure the changes in feature ratios during the process of disease development and to construct dynamic networks. Dynamic concentration analysis and network topological structure analysis were performed to extract early warning information. This strategy was applied to the study of time-series lipidomics data from a stepwise hepatocarcinogenesis rat model. A ratio of lyso-phosphatidylcholine (LPC) 18:1/free fatty acid (FFA) 20:5 was identified as the potential biomarker for hepatocellular carcinoma (HCC). It can be used to classify HCC and non-HCC rats, and the area under the curve values in the discovery and external validation sets were 0.980 and 0.972, respectively. This strategy was also compared with a weighted relative difference accumulation algorithm (wRDA), multivariate empirical Bayes statistics (MEBA) and support vector machine-recursive feature elimination (SVM-RFE). The better performance of ATSD-DN suggests its potential for a more complete presentation of time-series changes and effective extraction of early warning information.

  20. A New Strategy for Analyzing Time-Series Data Using Dynamic Networks: Identifying Prospective Biomarkers of Hepatocellular Carcinoma

    NASA Astrophysics Data System (ADS)

    Huang, Xin; Zeng, Jun; Zhou, Lina; Hu, Chunxiu; Yin, Peiyuan; Lin, Xiaohui

    2016-08-01

    Time-series metabolomics studies can provide insight into the dynamics of disease development and facilitate the discovery of prospective biomarkers. To improve the performance of early risk identification, a new strategy for analyzing time-series data based on dynamic networks (ATSD-DN) in a systematic time dimension is proposed. In ATSD-DN, the non-overlapping ratio was applied to measure the changes in feature ratios during the process of disease development and to construct dynamic networks. Dynamic concentration analysis and network topological structure analysis were performed to extract early warning information. This strategy was applied to the study of time-series lipidomics data from a stepwise hepatocarcinogenesis rat model. A ratio of lyso-phosphatidylcholine (LPC) 18:1/free fatty acid (FFA) 20:5 was identified as the potential biomarker for hepatocellular carcinoma (HCC). It can be used to classify HCC and non-HCC rats, and the area under the curve values in the discovery and external validation sets were 0.980 and 0.972, respectively. This strategy was also compared with a weighted relative difference accumulation algorithm (wRDA), multivariate empirical Bayes statistics (MEBA) and support vector machine-recursive feature elimination (SVM-RFE). The better performance of ATSD-DN suggests its potential for a more complete presentation of time-series changes and effective extraction of early warning information.

  1. Combined risk assessment of nonstationary monthly water quality based on Markov chain and time-varying copula.

    PubMed

    Shi, Wei; Xia, Jun

    2017-02-01

    Water quality risk management is a topical research issue closely linked to sustainable water resource development. Ammonium nitrogen (NH3-N) and the permanganate index (CODMn), the focus indicators in the Huai River Basin, are selected to reveal their joint transition laws based on Markov theory. The time-varying moments model, with either time or a land cover index as explanatory variable, is applied to build the time-varying marginal distributions of the water quality time series. A time-varying copula model, which takes into account the non-stationarity in the marginal distributions and/or the time variation in the dependence structure between the water quality series, is constructed to perform a bivariate frequency analysis for the NH3-N and CODMn series at the same monitoring gauge. The large first-order Markov joint transition probability indicates that the water quality states Class Vw, Class IV and Class III occur easily in the water body at Bengbu Sluice. Both the marginal distribution and copula models are nonstationary, and the explanatory variable time yields better performance than the land cover index in describing the non-stationarities in the marginal distributions. In modelling the changes in the dependence structure, the time-varying copula has a better fitting performance than copulas with a constant or time-trend dependence parameter. The largest synchronous encounter risk probability of NH3-N and CODMn simultaneously reaching Class V is 50.61%, while the asynchronous encounter risk probability is largest when NH3-N and CODMn are inferior to the Class V and Class IV water quality standards, respectively.
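
    A minimal sketch of estimating a first-order Markov transition probability matrix from a monthly sequence of water-quality classes, as used above for the joint transition laws; the class labels and example sequence are hypothetical.

        # Sketch: estimate a first-order Markov transition matrix from a monthly
        # sequence of water-quality classes (labels here are hypothetical).
        import numpy as np

        def transition_matrix(states, labels):
            idx = {s: i for i, s in enumerate(labels)}
            counts = np.zeros((len(labels), len(labels)))
            for a, b in zip(states[:-1], states[1:]):
                counts[idx[a], idx[b]] += 1
            row_sums = counts.sum(axis=1, keepdims=True)
            return np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

        labels = ["III", "IV", "V", "Vw"]                 # assumed class ordering
        series = ["IV", "V", "Vw", "Vw", "V", "IV", "III", "IV", "V"]
        print(transition_matrix(series, labels))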

  2. Estimation of stochastic volatility with long memory for index prices of FTSE Bursa Malaysia KLCI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Kho Chia; Kane, Ibrahim Lawal; Rahman, Haliza Abd

    In recent years, the modeling of long memory properties, or fractionally integrated processes, in stochastic volatility has been applied to financial time series. A time series with structural breaks can generate strong persistence in the autocorrelation function, which is an observed behaviour of a long memory process. This paper considers the structural breaks in the data in order to determine true long memory time series data. Unlike usual short memory models for log volatility, the fractional Ornstein-Uhlenbeck process is neither a Markovian process nor can it be easily transformed into one. This makes likelihood evaluation and parameter estimation for the long memory stochastic volatility (LMSV) model challenging tasks. The drift and volatility parameters of the fractional Ornstein-Uhlenbeck model are estimated separately using the least squares estimator (lse) and the quadratic generalized variations (qgv) method, respectively. Finally, the empirical distribution of the unobserved volatility is estimated using particle filtering with the sequential importance sampling-resampling (SIR) method. The mean square error (MSE) between the estimated and empirical volatility indicates that the performance of the model for the index prices of FTSE Bursa Malaysia KLCI is fairly good.

  3. Understanding characteristics in multivariate traffic flow time series from complex network structure

    NASA Astrophysics Data System (ADS)

    Yan, Ying; Zhang, Shen; Tang, Jinjun; Wang, Xiaofei

    2017-07-01

    Discovering dynamic characteristics in traffic flow is a significant step in designing effective traffic management and control strategies for relieving congestion in urban cities. A new method based on complex network theory is proposed to study multivariate traffic flow time series. The data were collected from loop detectors on a freeway over one year. In order to construct a complex network from the original traffic flow, a weighted Frobenius norm is adopted to estimate the similarity between multivariate time series, and Principal Component Analysis is implemented to determine the weights. We discuss how to select the optimal critical threshold for networks at different hours in terms of the cumulative probability distribution of degree. Furthermore, two statistical properties of the networks, the normalized network structure entropy and the cumulative probability of degree, are utilized to explore hourly variation in traffic flow. The results demonstrate that these two statistical quantities exhibit patterns similar to the traffic flow parameters, with morning and evening peak hours. Accordingly, we detect three traffic states, trough, peak and transitional hours, based on the correlation between the two aforementioned properties. The resulting state classification captures the hourly fluctuation in traffic flow, as shown by analyzing the annual average hourly values of traffic volume, occupancy and speed in the corresponding hours.
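
    A minimal sketch of the network construction step described above: nodes are observation periods of multivariate traffic flow, similarity is based on a weighted Frobenius norm of the difference between their (time x variable) matrices, and edges are created by thresholding. In the paper the weights come from PCA and the threshold from the cumulative degree distribution; here both are simply assumed.

        # Sketch: build a network whose nodes are days of multivariate traffic flow.
        import numpy as np

        def weighted_frobenius(A, B, w):
            # w: one weight per variable (column)
            return np.sqrt(np.sum(w * (A - B) ** 2))

        def adjacency(days, w, threshold):
            n = len(days)
            D = np.zeros((n, n))
            for i in range(n):
                for j in range(i + 1, n):
                    D[i, j] = D[j, i] = weighted_frobenius(days[i], days[j], w)
            S = 1.0 / (1.0 + D)                     # turn distance into similarity
            return (S >= threshold) & ~np.eye(n, dtype=bool)

        rng = np.random.default_rng(2)
        days = [rng.normal(size=(24, 3)) for _ in range(30)]   # 24 h x (volume, occupancy, speed)
        A = adjacency(days, w=np.array([0.5, 0.3, 0.2]), threshold=0.35)
        print(A.sum(axis=1))                         # degree sequence feeds the threshold choice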

  4. Estimation of stochastic volatility with long memory for index prices of FTSE Bursa Malaysia KLCI

    NASA Astrophysics Data System (ADS)

    Chen, Kho Chia; Bahar, Arifah; Kane, Ibrahim Lawal; Ting, Chee-Ming; Rahman, Haliza Abd

    2015-02-01

    In recent years, the modeling of long memory properties, or fractionally integrated processes, in stochastic volatility has been applied to financial time series. A time series with structural breaks can generate strong persistence in the autocorrelation function, which is an observed behaviour of a long memory process. This paper considers the structural breaks in the data in order to determine true long memory time series data. Unlike usual short memory models for log volatility, the fractional Ornstein-Uhlenbeck process is neither a Markovian process nor can it be easily transformed into one. This makes likelihood evaluation and parameter estimation for the long memory stochastic volatility (LMSV) model challenging tasks. The drift and volatility parameters of the fractional Ornstein-Uhlenbeck model are estimated separately using the least squares estimator (lse) and the quadratic generalized variations (qgv) method, respectively. Finally, the empirical distribution of the unobserved volatility is estimated using particle filtering with the sequential importance sampling-resampling (SIR) method. The mean square error (MSE) between the estimated and empirical volatility indicates that the performance of the model for the index prices of FTSE Bursa Malaysia KLCI is fairly good.

  5. About the Modeling of Radio Source Time Series as Linear Splines

    NASA Astrophysics Data System (ADS)

    Karbon, Maria; Heinkelmann, Robert; Mora-Diaz, Julian; Xu, Minghui; Nilsson, Tobias; Schuh, Harald

    2016-12-01

    Many of the time series of radio sources observed in geodetic VLBI show variations, caused mainly by changes in source structure. However, until now it has been common practice to consider source positions as invariant, or to exclude known misbehaving sources from the datum conditions. This may lead to a degradation of the estimated parameters, as unmodeled apparent source position variations can propagate to the other parameters through the least squares adjustment. In this paper we will introduce an automated algorithm capable of parameterizing the radio source coordinates as linear splines.

  6. Characterization of complexities in combustion instability in a lean premixed gas-turbine model combustor.

    PubMed

    Gotoda, Hiroshi; Amano, Masahito; Miyano, Takaya; Ikawa, Takuya; Maki, Koshiro; Tachibana, Shigeru

    2012-12-01

    We characterize complexities in combustion instability in a lean premixed gas-turbine model combustor by nonlinear time series analysis, evaluating permutation entropy, fractal dimensions, and short-term predictability. The dynamic behavior in combustion instability near lean blowout exhibits a self-affine structure and is ascribed to fractional Brownian motion. It undergoes a transition to chaos with the onset of combustion oscillations with slow amplitude modulation. Our results indicate that nonlinear time series analysis is capable of characterizing complexities in combustion instability close to lean blowout.
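
    A minimal sketch of one of the measures named above, normalized permutation entropy in the Bandt-Pompe sense; the embedding order and delay are assumptions and the test signal is synthetic.

        # Sketch of (normalized) permutation entropy for a scalar time series.
        import numpy as np
        from math import factorial

        def permutation_entropy(x, order=3, delay=1):
            n = len(x) - (order - 1) * delay
            patterns = {}
            for i in range(n):
                window = x[i:i + order * delay:delay]
                key = tuple(np.argsort(window))          # ordinal pattern
                patterns[key] = patterns.get(key, 0) + 1
            p = np.array(list(patterns.values()), dtype=float) / n
            H = -np.sum(p * np.log(p))
            return H / np.log(factorial(order))          # normalized to [0, 1]

        x = np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.1 * np.random.default_rng(3).normal(size=2000)
        print(permutation_entropy(x, order=4, delay=2))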

  7. Analyzing Single-Molecule Time Series via Nonparametric Bayesian Inference

    PubMed Central

    Hines, Keegan E.; Bankston, John R.; Aldrich, Richard W.

    2015-01-01

    The ability to measure the properties of proteins at the single-molecule level offers an unparalleled glimpse into biological systems at the molecular scale. The interpretation of single-molecule time series has often been rooted in statistical mechanics and the theory of Markov processes. While existing analysis methods have been useful, they are not without significant limitations including problems of model selection and parameter nonidentifiability. To address these challenges, we introduce the use of nonparametric Bayesian inference for the analysis of single-molecule time series. These methods provide a flexible way to extract structure from data instead of assuming models beforehand. We demonstrate these methods with applications to several diverse settings in single-molecule biophysics. This approach provides a well-constrained and rigorously grounded method for determining the number of biophysical states underlying single-molecule data. PMID:25650922

  8. Correlates of depression in bipolar disorder

    PubMed Central

    Moore, Paul J.; Little, Max A.; McSharry, Patrick E.; Goodwin, Guy M.; Geddes, John R.

    2014-01-01

    We analyse time series from 100 patients with bipolar disorder for correlates of depression symptoms. As the sampling interval is non-uniform, we quantify the extent of missing and irregular data using new measures of compliance and continuity. We find that uniformity of response is negatively correlated with the standard deviation of sleep ratings (ρ = –0.26, p = 0.01). To investigate the correlation structure of the time series themselves, we apply the Edelson–Krolik method for correlation estimation. We examine the correlation between depression symptoms for a subset of patients and find that self-reported measures of sleep and appetite/weight show a lower average correlation than other symptoms. Using surrogate time series as a reference dataset, we find no evidence that depression is correlated between patients, though we note a possible loss of information from sparse sampling. PMID:24352942
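
    A minimal sketch of the Edelson-Krolik discrete correlation function used above for irregularly sampled series; the measurement-error correction of the original method is omitted, and the lag binning and test data are assumptions.

        # Sketch of the Edelson-Krolik discrete correlation function (DCF) for two
        # unevenly sampled series; the measurement-error correction is omitted here.
        import numpy as np

        def dcf(ta, a, tb, b, lag_bins):
            a = (a - a.mean()) / a.std()
            b = (b - b.mean()) / b.std()
            dt = tb[None, :] - ta[:, None]               # all pairwise lags
            udcf = a[:, None] * b[None, :]               # unbinned correlations
            centers = 0.5 * (lag_bins[:-1] + lag_bins[1:])
            out = np.full(centers.size, np.nan)
            for k in range(centers.size):
                m = (dt >= lag_bins[k]) & (dt < lag_bins[k + 1])
                if m.any():
                    out[k] = udcf[m].mean()
            return centers, out

        rng = np.random.default_rng(4)
        ta = np.sort(rng.uniform(0, 100, 80)); a = np.sin(ta / 5) + 0.3 * rng.normal(size=80)
        tb = np.sort(rng.uniform(0, 100, 90)); b = np.sin((tb - 3) / 5) + 0.3 * rng.normal(size=90)
        print(dcf(ta, a, tb, b, np.arange(-20, 21, 2.0)))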

  9. On system behaviour using complex networks of a compression algorithm

    NASA Astrophysics Data System (ADS)

    Walker, David M.; Correa, Debora C.; Small, Michael

    2018-01-01

    We construct complex networks of scalar time series using a data compression algorithm. The structure and statistics of the resulting networks can be used to help characterize complex systems, and one property, in particular, appears to be a useful discriminating statistic in surrogate data hypothesis tests. We demonstrate these ideas on systems with known dynamical behaviour and also show that our approach is capable of identifying behavioural transitions within electroencephalogram recordings as well as changes due to a bifurcation parameter of a chaotic system. The technique we propose is dependent on a coarse grained quantization of the original time series and therefore provides potential for a spatial scale-dependent characterization of the data. Finally the method is as computationally efficient as the underlying compression algorithm and provides a compression of the salient features of long time series.

  10. Tropical-Forest Structure and Biomass Dynamics from TanDEM-X Radar Interferometry

    Treesearch

    Robert Treuhaft; Yang Lei; Fabio Gonçalves; Michael Keller; João Santos; Maxim Neumann; André Almeida

    2017-01-01

    Changes in tropical-forest structure and aboveground biomass (AGB) contribute directly to atmospheric changes in CO2, which, in turn, bear on global climate. This paper demonstrates the capability of radar-interferometric phase-height time series at X-band (wavelength = 3 cm) to monitor changes in vertical structure and AGB, with sub-hectare and monthly spatial and...

  11. Structure of a financial cross-correlation matrix under attack

    NASA Astrophysics Data System (ADS)

    Lim, Gyuchang; Kim, SooYong; Kim, Junghwan; Kim, Pyungsoo; Kang, Yoonjong; Park, Sanghoon; Park, Inho; Park, Sang-Bum; Kim, Kyungsik

    2009-09-01

    We investigate the structure of a perturbed stock market in terms of correlation matrices. For the purpose of perturbing a stock market, two distinct methods are used, namely local and global perturbation. The former involves replacing a correlation coefficient of the cross-correlation matrix with one calculated from two Gaussian-distributed time series, while the latter reconstructs the cross-correlation matrix after replacing the original return series with Gaussian-distributed time series. The local case is a technical study only, with no attempt to model reality; the term 'global' refers to the overall effect of the replacement on the other, untouched returns. Through statistical analyses such as random matrix theory (RMT), network theory, and the correlation coefficient distributions, we show that the global structure of a stock market is vulnerable to perturbation. However, except in the analysis of inverse participation ratios (IPRs), the vulnerability becomes much less pronounced under a small-scale perturbation. This means that these analysis tools are inappropriate for monitoring the whole stock market, owing to the low sensitivity of a stock market to a small-scale perturbation. In contrast, when going down to the structure of business sectors, we confirm that correlation-based business sectors are regrouped in terms of IPRs. This result gives a clue about monitoring the effect of hidden intentions, which are revealed via portfolios taken mostly by large investors.
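
    A minimal sketch of the inverse participation ratio (IPR) computation for the eigenvectors of a return cross-correlation matrix, the quantity highlighted above; the perturbation schemes themselves are not reproduced, and the data are synthetic.

        # Sketch: IPR of each eigenvector of a return cross-correlation matrix;
        # localized eigenvectors (few contributing stocks) have large IPR.
        import numpy as np

        def ipr_spectrum(returns):
            C = np.corrcoef(returns, rowvar=False)       # N x N cross-correlation matrix
            eigvals, eigvecs = np.linalg.eigh(C)
            ipr = np.sum(eigvecs**4, axis=0)             # one IPR per eigenvector
            return eigvals, ipr

        rng = np.random.default_rng(5)
        returns = rng.normal(size=(1000, 50))            # 1000 days x 50 hypothetical stocks
        eigvals, ipr = ipr_spectrum(returns)
        print(eigvals[-3:], ipr[-3:])                    # largest modes and their IPRs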

  12. Shilling attack detection for recommender systems based on credibility of group users and rating time series.

    PubMed

    Zhou, Wei; Wen, Junhao; Qu, Qiang; Zeng, Jun; Cheng, Tian

    2018-01-01

    Recommender systems are vulnerable to shilling attacks. Forged user-generated content data, such as user ratings and reviews, are used by attackers to manipulate recommendation rankings. Shilling attack detection in recommender systems is therefore of great significance for maintaining their fairness and sustainability. Current studies suffer from poor universality of algorithms, difficulty in selecting user profile attributes, and the lack of an optimization mechanism. In this paper, a shilling behaviour detection structure based on abnormal group user discovery and rating time series analysis is proposed. This paper adds to the current understanding in the field by studying in depth a credibility evaluation model based on the rating prediction model to derive proximity-based predictions. A method for detecting suspicious ratings based on suspicious time windows and target item analysis is proposed: suspicious rating time segments are determined by constructing a time series, and the data streams of the rated items are examined and suspicious rating segments are checked. To analyse the features of shilling attacks through a group user's credibility, an abnormal group user discovery method based on time series and time windows is proposed. Standard testing datasets are used to verify the effectiveness of the proposed method.

  13. Shilling attack detection for recommender systems based on credibility of group users and rating time series

    PubMed Central

    Wen, Junhao; Qu, Qiang; Zeng, Jun; Cheng, Tian

    2018-01-01

    Recommender systems are vulnerable to shilling attacks. Forged user-generated content data, such as user ratings and reviews, are used by attackers to manipulate recommendation rankings. Shilling attack detection in recommender systems is therefore of great significance for maintaining their fairness and sustainability. Current studies suffer from poor universality of algorithms, difficulty in selecting user profile attributes, and the lack of an optimization mechanism. In this paper, a shilling behaviour detection structure based on abnormal group user discovery and rating time series analysis is proposed. This paper adds to the current understanding in the field by studying in depth a credibility evaluation model based on the rating prediction model to derive proximity-based predictions. A method for detecting suspicious ratings based on suspicious time windows and target item analysis is proposed: suspicious rating time segments are determined by constructing a time series, and the data streams of the rated items are examined and suspicious rating segments are checked. To analyse the features of shilling attacks through a group user's credibility, an abnormal group user discovery method based on time series and time windows is proposed. Standard testing datasets are used to verify the effectiveness of the proposed method. PMID:29742134

  14. High Resolution Time Series of Plankton Communities: From Early Warning of Harmful Blooms to Sentinels of Climate Change

    NASA Astrophysics Data System (ADS)

    Sosik, H. M.; Campbell, L.; Olson, R. J.

    2016-02-01

    The combination of ocean observatory infrastructure and automated submersible flow cytometry provides an unprecedented capability for sustained, high-resolution time series of plankton, including taxa that are harmful or are early indicators of ecosystem response to environmental change. Ongoing time series produced with the FlowCytobot series of instruments document important ways in which this challenge is already being met for phytoplankton and microzooplankton. FlowCytobot and Imaging FlowCytobot use a combination of laser-based scattering and fluorescence measurements and video imaging of individual particles to enumerate and characterize cells ranging from picocyanobacteria to large chain-forming diatoms. Over a decade of observations at the Martha's Vineyard Coastal Observatory (MVCO), a cabled facility on the New England Shelf, have been compiled from repeated instrument deployments, typically 6 months or longer in duration. These multi-year, high-resolution (hourly to daily) time series are providing new insights into the dynamics of community structure, such as blooms, seasonality, and multi-year trends linked to regional climate-related variables. Similar observations in Texas coastal waters at the Texas Observatory for Algal Succession Time series (TOAST) have repeatedly provided early warning of harmful algal bloom events that threaten human and ecosystem health. As coastal ocean observing systems mature and expand, the continued integration of these types of detailed plankton observations will provide unparalleled information about variability and patterns of change at the base of marine food webs, with direct implications for informed ecosystem-based management.

  15. 3-D reconstruction of neurons from multichannel confocal laser scanning image series.

    PubMed

    Wouterlood, Floris G

    2014-04-10

    A confocal laser scanning microscope (CLSM) collects information from a thin, focal plane and ignores out-of-focus information. Scanning of a specimen, with stepwise axial (Z-) movement of the stage in between each scan, produces Z-series of confocal images of a tissue volume, which then can be used to 3-D reconstruct structures of interest. The operator first configures separate channels (e.g., laser, filters, and detector settings) for each applied fluorochrome and then acquires Z-series of confocal images: one series per channel. Channel signal separation is extremely important. Measures to avoid bleaching are vital. Post-acquisition deconvolution of the image series is often performed to increase resolution before 3-D reconstruction takes place. In the 3-D reconstruction programs described in this unit, reconstructions can be inspected in real time from any viewing angle. By altering viewing angles and by switching channels off and on, the spatial relationships of 3-D-reconstructed structures with respect to structures visualized in other channels can be studied. Since each brand of CLSM, computer program, and 3-D reconstruction package has its own proprietary set of procedures, a general approach is provided in this protocol wherever possible. Copyright © 2014 John Wiley & Sons, Inc.

  16. Evenly spaced Detrended Fluctuation Analysis: Selecting the number of points for the diffusion plot

    NASA Astrophysics Data System (ADS)

    Liddy, Joshua J.; Haddad, Jeffrey M.

    2018-02-01

    Detrended Fluctuation Analysis (DFA) has become a widely used tool for examining the correlation structure of a time series and has provided insights into neuromuscular health and disease states. As the popularity of DFA in the human behavioral sciences has grown, understanding its limitations and how to properly determine its parameters is becoming increasingly important. DFA examines the correlation structure of variability in a time series by computing α, the slope of the log SD versus log n diffusion plot. When using the traditional DFA algorithm, the timescales, n, are often selected as the set of integers between a minimum and maximum length based on the number of data points in the time series. This produces values of n that are non-uniformly distributed on a logarithmic scale, which influences the estimation of α through a disproportionate weighting of the long-timescale regions of the diffusion plot. Recently, the evenly spaced DFA and evenly spaced average DFA algorithms were introduced. Both algorithms compute α by selecting k points for the diffusion plot based on the minimum and maximum timescales of interest, and they improve the consistency of α estimates for simulated fractional Gaussian noise and fractional Brownian motion time series. Two issues remain unaddressed: (1) how to select k and (2) whether the evenly spaced DFA algorithms show similar benefits when assessing human behavioral data. We manipulated k and examined its effects on the accuracy, consistency, and confidence limits of α in simulated and experimental time series. We demonstrate that the accuracy and consistency of α are relatively unaffected by the selection of k. However, the confidence limits of α narrow as k increases, dramatically reducing measurement uncertainty for single trials. We provide guidelines for selecting k and discuss potential uses of the evenly spaced DFA algorithms when assessing human behavioral data.
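
    A minimal sketch of DFA using k (approximately) evenly log-spaced timescales, in the spirit of the evenly spaced algorithms discussed above; the choices of minimum scale, maximum scale, k and detrending order are assumptions, not the authors' settings.

        # Sketch of DFA with k log-evenly-spaced window sizes.
        import numpy as np

        def dfa_alpha(x, n_min=4, n_max=None, k=18, order=1):
            x = np.asarray(x, dtype=float)
            if n_max is None:
                n_max = len(x) // 4
            y = np.cumsum(x - x.mean())                   # integrated profile
            scales = np.unique(np.geomspace(n_min, n_max, k).astype(int))
            F = []
            for n in scales:
                n_seg = len(y) // n
                segs = y[:n_seg * n].reshape(n_seg, n)
                t = np.arange(n)
                ms = []
                for seg in segs:
                    coef = np.polyfit(t, seg, order)      # local detrending
                    ms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
                F.append(np.sqrt(np.mean(ms)))
            alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]   # slope of diffusion plot
            return alpha, scales, np.array(F)

        x = np.random.default_rng(6).normal(size=4000)    # white noise: alpha near 0.5
        print(dfa_alpha(x)[0])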

  17. Identification of Boolean Network Models From Time Series Data Incorporating Prior Knowledge.

    PubMed

    Leifeld, Thomas; Zhang, Zhihua; Zhang, Ping

    2018-01-01

    Motivation: Mathematical models take an important place in science and engineering. A model can help scientists to explain the dynamic behavior of a system and to understand the functionality of its components. Since the length of a time series and the number of replicates are limited by the cost of experiments, Boolean networks, as a structurally simple and parameter-free logical model for gene regulatory networks, have attracted the interest of many scientists. In order to fit the biological context and to lower the data requirements, biological prior knowledge is taken into consideration during the inference procedure. In the literature, existing identification approaches can only deal with a subset of the possible types of prior knowledge. Results: We propose a new approach to identify Boolean networks from time series data incorporating prior knowledge, such as partial network structure, canalizing properties, and positive and negative unateness. Using the vector form of Boolean variables and applying a generalized matrix multiplication called the semi-tensor product (STP), each Boolean function can be equivalently converted into a matrix expression. Based on this, the identification problem is reformulated as an integer linear programming problem that reveals, in a computationally efficient way, the system matrix of a Boolean model whose dynamics are consistent with the important dynamics captured in the data. By using prior knowledge, the number of candidate functions can be reduced during the inference. Hence, identification incorporating prior knowledge is especially suitable for small time series datasets and data without sufficient stimuli. The proposed approach is illustrated with the help of a biological model of the oxidative stress response network. Conclusions: The combination of an efficient reformulation of the identification problem with the possibility of incorporating various types of prior knowledge enables the application of computational model inference to systems with a limited amount of time series data. The general applicability of this methodological approach makes it suitable for a variety of biological systems and of general interest for biological and medical research.
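
    The sketch below is not the semi-tensor-product/ILP formulation of the paper; it is a brute-force illustration of the underlying idea that prior knowledge (here, an assumed set of candidate regulators for a node) shrinks the space of Boolean functions that must be searched for consistency with the observed state transitions.

        # Brute-force sketch (not the STP/ILP method): for one node, enumerate Boolean
        # functions over an assumed set of candidate regulators (prior knowledge) and
        # keep those consistent with all observed state transitions.
        from itertools import product

        def consistent_functions(transitions, regulators, node):
            """transitions: list of (state_t, state_t1) tuples of 0/1 tuples."""
            k = len(regulators)
            funcs = []
            for table in product([0, 1], repeat=2 ** k):          # all truth tables on k inputs
                ok = True
                for s, s_next in transitions:
                    inp = tuple(s[r] for r in regulators)
                    row = int("".join(map(str, inp)), 2) if k else 0
                    if table[row] != s_next[node]:
                        ok = False
                        break
                if ok:
                    funcs.append(table)
            return funcs

        # Tiny hypothetical example: node 2 is suspected to be regulated by nodes 0 and 1.
        data = [((0, 1, 0), (1, 1, 1)), ((1, 1, 1), (1, 0, 1)), ((1, 0, 1), (0, 0, 0))]
        print(len(consistent_functions(data, regulators=(0, 1), node=2)))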

  18. Mapping forest height, foliage height profiles and disturbance characteristics with time series of gap-filled Landsat and ALI imagery

    NASA Astrophysics Data System (ADS)

    Helmer, E.; Ruzycki, T. S.; Wunderle, J. M.; Kwit, C.; Ewert, D. N.; Voggesser, S. M.; Brandeis, T. J.

    2011-12-01

    We mapped tropical dry forest height (RMSE = 0.9 m, R2 = 0.84, range 0.6-7 m) and foliage height profiles with a time series of gap-filled Landsat and Advanced Land Imager (ALI) imagery for the island of Eleuthera, The Bahamas. We also mapped disturbance type and age with decision tree classification of the image time series. Having mapped these variables in the context of studies of wintering habitat of an endangered Nearctic-Neotropical migrant bird, the Kirtland's Warbler (Dendroica kirtlandii), we then illustrated relationships between forest vertical structure, disturbance type and counts of forage species important to the Kirtland's Warbler. The ALI imagery and the Landsat time series were both critical to the result for forest height, which the strong relationship of forest height with disturbance type and age facilitated. Also unique to this study was that seven of the eight image time steps were cloud-gap-filled images: mosaics of the clear parts of several cloudy scenes, in which cloud gaps in a reference scene for each time step are filled with image data from alternate scenes. We created each cloud-cleared image, including a virtually seamless ALI image mosaic, with regression tree normalization of the image data that filled cloud gaps. We also illustrated how viewing time series imagery as red-green-blue composites of tasseled cap wetness (RGB wetness composites) aids reference data collection for classifying tropical forest disturbance type and age.

  19. Simultaneous identification of transfer functions and combustion noise of a turbulent flame

    NASA Astrophysics Data System (ADS)

    Merk, M.; Jaensch, S.; Silva, C.; Polifke, W.

    2018-05-01

    The Large Eddy Simulation/System Identification (LES/SI) approach makes it possible to deduce a flame transfer function (FTF) from LES of turbulent reacting flow: time series of fluctuations of reference velocity and global heat release rate, resulting from broad-band excitation of a simulated turbulent flame, are post-processed via SI techniques to derive a low-order model of the flame dynamics, from which the FTF is readily deduced. The current work investigates an extension of the established LES/SI approach: in addition to estimation of the FTF, a low-order model for the combustion noise source is deduced from the same time series data. By incorporating such a noise model into a linear thermoacoustic model, it is possible to predict the overall level as well as the spectral distribution of sound pressure in confined combustion systems that do not exhibit self-excited thermoacoustic instability. A variety of model structures for estimating the noise model are tested in the present study. The suitability and quality of these model structures are compared against each other, and their sensitivity to certain time series properties is studied. The influence of time series length, signal-to-noise ratio, and the acoustic reflection coefficient of the boundary conditions on the identification is examined. It is shown that the Box-Jenkins model structure is superior to simpler approaches for the simultaneous identification of models that describe the FTF as well as the combustion noise source. Following the question of the most adequate model structure, the choice of optimal model order is addressed, as in particular the optimal parametrization of the noise model is not obvious. Akaike's Information Criterion and a model residual analysis are applied to draw qualitative and quantitative conclusions on the most suitable model order. All investigations are based on a surrogate data model, which allows a Monte Carlo study across a large parameter space with modest computational effort. The conducted study constitutes a solid basis for the application of advanced SI techniques to actual LES data.
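
    As a simplified stand-in for the model-structure and model-order question discussed above, the sketch below fits ARX models (output regressed on lagged output and lagged input) of varying order by least squares and picks the order by Akaike's Information Criterion. The paper uses the richer Box-Jenkins structure; the signals and orders here are hypothetical, and only the selection logic is illustrated.

        # Sketch: ARX fits y(t) = sum_i a_i y(t-i) + sum_j b_j u(t-j) + e(t) for several
        # orders, with the order chosen by AIC.
        import numpy as np

        def fit_arx(u, y, na, nb):
            n0 = max(na, nb)
            rows = []
            for t in range(n0, len(y)):
                rows.append(np.r_[y[t - na:t][::-1], u[t - nb:t][::-1]])
            Phi = np.array(rows)
            Y = y[n0:]
            theta, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
            rss = np.sum((Y - Phi @ theta) ** 2)
            N, k = len(Y), na + nb
            aic = N * np.log(rss / N) + 2 * k
            return theta, aic

        rng = np.random.default_rng(7)
        u = rng.normal(size=2000)                         # broad-band excitation signal
        y = np.zeros_like(u)
        for t in range(2, len(u)):                        # hypothetical 2nd-order response
            y[t] = 1.2 * y[t - 1] - 0.5 * y[t - 2] + 0.3 * u[t - 1] + 0.05 * rng.normal()
        orders = [(na, nb) for na in range(1, 5) for nb in range(1, 5)]
        best = min(orders, key=lambda o: fit_arx(u, y, *o)[1])
        print(best)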

  20. A new method to detect transitory signatures and local time/space variability structures in the climate system: the scale-dependent correlation analysis

    NASA Astrophysics Data System (ADS)

    Rodó, Xavier; Rodríguez-Arias, Miquel-Àngel

    2006-10-01

    The study of transitory signals and local variability structures in time and/or space, and of their role as sources of climatic memory, is an often neglected topic in climate research despite its obvious importance and extensive coverage in the literature. Transitory signals arise from non-linearities in the climate system, transitory atmosphere-ocean couplings, and other processes that evolve after a critical threshold is crossed. These temporary interactions, though intense, may not last long; they can be responsible for a large amount of unexplained variability but are normally considered of limited relevance and are often discarded. With most current techniques, this typology of signatures is difficult to isolate because of the low signal-to-noise ratio in midlatitudes and the limited recurrence of the transitory signals within the customary interval of data considered. Also, a serious problem often arises from the smoothing of local or transitory processes when statistical techniques that consider the full length of the available data are applied, rather than taking into account the size of the specific variability structure under investigation. Scale-dependent correlation (SDC) analysis is a new statistical method capable of highlighting the presence of transitory processes, understood here as temporarily significant lag-dependent autocovariance in a single series, or covariance structures between two series. This approach therefore complements other approaches such as those resulting from the families of wavelet analysis, singular-spectrum analysis and recurrence plots. A main feature of SDC is its high performance for short time series and its ability to characterize phase relationships and thresholds in the bivariate domain. Ultimately, SDC helps track short-lagged relationships among processes that locally or temporarily couple and uncouple. The use of SDC is illustrated in the present paper by means of some synthetic time-series examples of increasing complexity, and it is compared with wavelet analysis in order to provide a well-known reference for its capabilities. A comparison between SDC and companion techniques is also addressed, and results are exemplified for the specific case of some relevant El Niño-Southern Oscillation teleconnections.
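
    A minimal interpretation of the scale-dependent correlation idea is sketched below: correlations computed in short moving windows of a fixed size, for a range of lags, so that transitory lagged couplings show up as localized patches of high correlation. The window size and lag range are assumptions, and the significance (randomization) step of the actual SDC method is omitted.

        # Sketch: windowed, lagged Pearson correlations between two series.
        import numpy as np

        def sdc_map(x, y, s=21, max_lag=12):
            n = len(x)
            lags = list(range(-max_lag, max_lag + 1))
            out = np.full((len(lags), n - s + 1), np.nan)
            for li, lag in enumerate(lags):
                for i in range(n - s + 1):
                    j = i + lag
                    if j >= 0 and j + s <= n:
                        out[li, i] = np.corrcoef(x[i:i + s], y[j:j + s])[0, 1]
            return out                                    # rows: lags, columns: window start

        rng = np.random.default_rng(8)
        x = rng.normal(size=600); y = rng.normal(size=600)
        y[300:360] += x[295:355]                          # a transitory, lagged coupling
        print(np.nanmax(sdc_map(x, y)))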

  1. Correlations and clustering in wholesale electricity markets

    DOE PAGES

    Cui, Tianyu; Caravelli, Francesco; Ududec, Cozmin

    2017-11-24

    We study the structure of locational marginal prices in day-ahead and real-time wholesale electricity markets. In particular, we consider the case of two North American markets and show that the price correlations contain information on the locational structure of the grid. We study various clustering methods and introduce a type of correlation function based on event synchronization for spiky time series, and another based on string correlations of location names provided by the markets. As a result, this allows us to reconstruct aspects of the locational structure of the grid.
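
    A minimal sketch of an event-synchronization measure for spiky series, in the spirit of the event-based correlation function mentioned above (following the general idea of Quian Quiroga-type event synchronization): price spikes are extracted by thresholding and near-coincidences within a window are counted. The spike threshold, coincidence window and normalization are assumptions and may differ from the function actually used in the paper.

        # Sketch: event synchronization between two spiky price series.
        import numpy as np

        def event_times(x, z=3.0):
            """Indices where the series spikes above z standard deviations."""
            return np.flatnonzero(x > x.mean() + z * x.std())

        def event_sync(ex, ey, tau=2):
            if len(ex) == 0 or len(ey) == 0:
                return 0.0
            c_xy = sum(np.any(np.abs(ey - t) <= tau) for t in ex)
            c_yx = sum(np.any(np.abs(ex - t) <= tau) for t in ey)
            return (c_xy + c_yx) / (2.0 * np.sqrt(len(ex) * len(ey)))

        rng = np.random.default_rng(9)
        p1 = rng.normal(size=1000); p2 = rng.normal(size=1000)
        spikes = rng.choice(1000, size=20, replace=False)
        p1[spikes] += 8; p2[np.clip(spikes + 1, 0, 999)] += 8   # nearly simultaneous spikes
        print(event_sync(event_times(p1), event_times(p2)))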

  2. Correlations and clustering in wholesale electricity markets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cui, Tianyu; Caravelli, Francesco; Ududec, Cozmin

    We study the structure of locational marginal prices in day-ahead and real-time wholesale electricity markets. In particular, we consider the case of two North American markets and show that the price correlations contain information on the locational structure of the grid. We study various clustering methods and introduce a type of correlation function based on event synchronization for spiky time series, and another based on string correlations of location names provided by the markets. As a result, this allows us to reconstruct aspects of the locational structure of the grid.

  3. Correlations and clustering in wholesale electricity markets

    NASA Astrophysics Data System (ADS)

    Cui, Tianyu; Caravelli, Francesco; Ududec, Cozmin

    2018-02-01

    We study the structure of locational marginal prices in day-ahead and real-time wholesale electricity markets. In particular, we consider the case of two North American markets and show that the price correlations contain information on the locational structure of the grid. We study various clustering methods and introduce a type of correlation function based on event synchronization for spiky time series, and another based on string correlations of location names provided by the markets. This allows us to reconstruct aspects of the locational structure of the grid.

  4. Modeling multivariate time series on manifolds with skew radial basis functions.

    PubMed

    Jamshidi, Arta A; Kirby, Michael J

    2011-01-01

    We present an approach for constructing nonlinear empirical mappings from high-dimensional domains to multivariate ranges. We employ radial basis functions and skew radial basis functions for constructing a model using data that are potentially scattered or sparse. The algorithm progresses iteratively, adding a new function at each step to refine the model. The placement of the functions is driven by a statistical hypothesis test that accounts for correlation in the multivariate range variables. The test is applied on training and validation data and reveals nonstatistical or geometric structure when it fails. At each step, the added function is fit to data contained in a spatiotemporally defined local region to determine the parameters, in particular the scale of the local model. The scale of the function is determined by the zero crossings of the autocorrelation function of the residuals. The model parameters and the number of basis functions are determined automatically from the given data, and there is no need to initialize any ad hoc parameters save for the selection of the skew radial basis functions. Compactly supported skew radial basis functions are employed to improve model accuracy, order, and convergence properties. The extension of the algorithm to higher-dimensional ranges produces reduced-order models by exploiting the existence of correlation in the range variable data. Structure is tested not just in a single time series but between all pairs of time series. We illustrate the new methodologies using several illustrative problems, including modeling data on manifolds and the prediction of chaotic time series.

  5. A framework for relating the structures and recovery statistics in pressure time-series surveys for dust devils

    NASA Astrophysics Data System (ADS)

    Jackson, Brian; Lorenz, Ralph; Davis, Karan

    2018-01-01

    Dust devils are likely the dominant source of dust for the martian atmosphere, but the amount and frequency of dust-lifting depend on the statistical distribution of dust devil parameters. Dust devils exhibit pressure perturbations and, if they pass near a barometric sensor, they may register as a discernible dip in a pressure time series. Leveraging this fact, several surveys using barometric sensors on landed spacecraft have revealed dust devil structures and occurrence rates. However powerful they are, such surveys suffer from non-trivial biases that skew the inferred dust devil properties. For example, such surveys are most sensitive to dust devils with the widest and deepest pressure profiles, but the recovered profiles will be distorted, broader and shallower than the actual profiles. In addition, such surveys often do not provide wind speed measurements alongside the pressure time series, so the durations of the dust devil signals in the time series cannot be directly converted to profile widths. Fortunately, simple statistical and geometric considerations can de-bias these surveys, allowing conversion of the duration of dust devil signals into physical widths, given only a distribution of likely translation velocities, and recovery of the underlying distributions of physical parameters. In this study, we develop a scheme for de-biasing such surveys. Applying our model to an in-situ survey using data from the Phoenix lander suggests a larger dust flux and a dust devil occurrence rate about ten times larger than previously inferred. Comparing our results to dust devil track surveys suggests that only about one in five low-pressure cells lifts sufficient dust to leave a visible track.

  6. Structural Time Series Model for El Niño Prediction

    NASA Astrophysics Data System (ADS)

    Petrova, Desislava; Koopman, Siem Jan; Ballester, Joan; Rodo, Xavier

    2015-04-01

    ENSO is a dominant feature of climate variability on inter-annual time scales, destabilizing weather patterns throughout the globe and having far-reaching socio-economic consequences. It not only leads to extensive rainfall and flooding in some regions of the world and anomalous droughts in others, ruining local agriculture, but also substantially affects marine ecosystems and the sustained exploitation of marine resources, particularly in coastal zones such as the Pacific South American coast. As a result, forecasting of ENSO, and especially of the warm phase of the oscillation (El Niño/EN), has long been a subject of intense research and improvement. The present study explores a novel method for the prediction of the Niño 3.4 index: the advantageous statistical modeling approach of Structural Time Series Analysis, which has not previously been applied to this problem. We have developed such a model using a State Space approach for the unobserved components of the time series. Its distinguishing feature is that the observations consist of various components: level, seasonality, cycle, disturbance, and regression variables incorporated as explanatory covariates. These components are aimed at capturing the various modes of variability of the N3.4 time series. They are modeled separately and then combined in a single model for analysis and forecasting. Customary statistical ENSO prediction models essentially use SST, SLP and wind stress in the equatorial Pacific. We introduce new regression variables: subsurface ocean temperature in the western equatorial Pacific, motivated by recent (Ramesh and Murtugudde, 2012) and classical research (Jin, 1997; Wyrtki, 1985) showing that subsurface processes and heat accumulation there are fundamental to the initiation of an El Niño event; and a southern Pacific temperature-difference tracer, the Rossbell dipole, which leads EN by about nine months (Ballester, 2011).
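
    A minimal sketch of a structural (unobserved components) time series model with level, seasonal and cycle components plus regression covariates, using statsmodels. The component specification, the synthetic index and the two exogenous regressors below are assumptions for illustration, not the authors' exact configuration.

        # Sketch: unobserved-components (structural) model of a Nino-3.4-like index.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(10)
        n = 360                                          # 30 years of monthly data
        t = np.arange(n)
        exog = rng.normal(size=(n, 2))                   # e.g. subsurface temperature, dipole tracer
        nino34 = (0.3 * np.sin(2 * np.pi * t / 12)       # seasonality
                  + 0.8 * np.sin(2 * np.pi * t / 48)     # ENSO-like interannual cycle
                  + exog @ np.array([0.4, 0.2])
                  + 0.2 * rng.normal(size=n))

        mod = sm.tsa.UnobservedComponents(nino34, level='local level',
                                          seasonal=12, cycle=True,
                                          stochastic_cycle=True, exog=exog)
        res = mod.fit(disp=False)
        print(res.summary().tables[1])
        fcast = res.forecast(steps=6, exog=rng.normal(size=(6, 2)))   # 6-month-ahead forecast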

  7. Structural and optical properties of SiC-SiO2 nanocomposite thin films

    NASA Astrophysics Data System (ADS)

    Bozetine, I.; Keffous, A.; Kaci, S.; Menari, H.; Manseri, A.

    2018-03-01

    This study deals with the deposition of SiC-SiO2 nanocomposite thin films on silicon substrates. The deposition is carried out by 13.56 MHz RF magnetron co-sputtering, using two targets: polycrystalline 6H-SiC and sprigs of SiO2. In order to study the influence of the deposition time on the morphology and the structural and optical properties of the produced films, two series of samples were prepared: series A with a 30 min deposition time and series B with a one hour deposition time. The samples were investigated using different characterization techniques such as Scanning Electron Microscopy (SEM), X-ray Diffraction (XRD), Fourier Transform Infrared Spectroscopy (FTIR), Secondary Ion Mass Spectrometry (SIMS) and photoluminescence. The results obtained reveal an optical gap varying between 1.4 and 2.4 eV depending on the film thickness, and thus on the deposition time. The SIMS profile recorded the presence of oxygen (16O) at the surface, with its signal beneath the silicon (28Si) and carbon (12C) signals, confirming that the oxide (SiO2) is the first material deposited at the film-substrate interface, with an a-OSiC structure. The photoluminescence (PL) measurement exhibits two peaks, centred at 390 nm, due to the oxide, and at 416 nm, probably due to SiC nanocrystals. When the deposition time increases, the PL intensity drops drastically, a result in agreement with a dense and smooth film.

  8. Reconstructing shifts in vital rates driven by long-term environmental change: a new demographic method based on readily available data.

    PubMed

    González, Edgar J; Martorell, Carlos

    2013-07-01

    Frequently, vital rates are driven by directional, long-term environmental changes. Many of these are of great importance, such as land degradation, climate change, and succession. Traditional demographic methods assume a constant or stationary environment, and thus are inappropriate to analyze populations subject to these changes. They also require repeat surveys of the individuals as change unfolds. Methods for reconstructing such lengthy processes are needed. We present a model that, based on a time series of population size structures and densities, reconstructs the impact of directional environmental changes on vital rates. The model uses integral projection models and maximum likelihood to identify the rates that best reconstruct the time series. The procedure was validated with artificial and real data. The former involved simulated species with widely different demographic behaviors. The latter used a chronosequence of populations of an endangered cactus subject to increasing anthropogenic disturbance. In our simulations, the vital rates and their change were always reconstructed accurately. Nevertheless, the model frequently produced alternative results. The use of coarse knowledge of the species' biology (whether vital rates increase or decrease with size or their plausible values) allowed the correct rates to be identified with a 90% success rate. With real data, the model correctly reconstructed the effects of disturbance on vital rates. These effects were previously known from two populations for which demographic data were available. Our procedure seems robust, as the data violated several of the model's assumptions. Thus, time series of size structures and densities contain the necessary information to reconstruct changing vital rates. However, additional biological knowledge may be required to provide reliable results. Because time series of size structures and densities are available for many species or can be rapidly generated, our model can contribute to understanding populations that face highly pressing environmental problems.

  9. Predation and fragmentation portrayed in the statistical structure of prey time series

    PubMed Central

    Hendrichsen, Ditte K; Topping, Chris J; Forchhammer, Mads C

    2009-01-01

    Background: Statistical autoregressive analyses of direct and delayed density dependence are widespread in ecological research. The models suggest that changes in ecological factors affecting density dependence, like predation and landscape heterogeneity, are directly portrayed in the first- and second-order autoregressive parameters, and the models are therefore used to decipher complex biological patterns. However, independent tests of model predictions are complicated by the inherent variability of natural populations, where differences in landscape structure, climate or species composition prevent controlled repeated analyses. To circumvent this problem, we applied second-order autoregressive time series analyses to data generated by a realistic agent-based computer model. The model simulated life history decisions of individual field voles under controlled variations in predator pressure and landscape fragmentation. Analyses were made on three levels: comparisons between predated and non-predated populations, between populations exposed to different types of predators, and between populations experiencing different degrees of habitat fragmentation. Results: The results are unambiguous: changes in landscape fragmentation and the numerical response of predators are clearly portrayed in the statistical time series structure as predicted by the autoregressive model. Populations without predators displayed significantly stronger negative direct density dependence than did those exposed to predators, where direct density dependence was only moderately negative. The effects of predation versus no predation had an even stronger effect on the delayed density dependence of the simulated prey populations. In non-predated prey populations, the coefficients of delayed density dependence were distinctly positive, whereas they were negative in predated populations. Similarly, increasing the degree of fragmentation of optimal habitat available to the prey was accompanied by a shift in the delayed density dependence, from strongly negative to gradually becoming less negative. Conclusion: We conclude that statistical second-order autoregressive time series analyses are capable of deciphering interactions within and across trophic levels and their effect on direct and delayed density dependence. PMID:19419539
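
    A minimal sketch of the second-order autoregressive analysis referred to above, applied to a simulated log-abundance series. It assumes the common convention x_t = a0 + (1 + beta1) x_{t-1} + beta2 x_{t-2} + e_t, with beta1 and beta2 read as direct and delayed density dependence; the simulated coefficients are arbitrary.

        # Sketch: fit an AR(2) to log abundance and read off direct/delayed density dependence.
        import numpy as np

        def ar2_density_dependence(abundance):
            x = np.log(abundance)
            X = np.column_stack([np.ones(len(x) - 2), x[1:-1], x[:-2]])
            a0, a1, a2 = np.linalg.lstsq(X, x[2:], rcond=None)[0]
            return a1 - 1.0, a2                 # direct, delayed density dependence

        rng = np.random.default_rng(11)
        x = np.zeros(200); x[:2] = 2.0
        for t in range(2, 200):                 # simulated log-density with negative density dependence
            x[t] = 0.8 + 0.6 * x[t - 1] - 0.3 * x[t - 2] + 0.1 * rng.normal()
        print(ar2_density_dependence(np.exp(x)))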

  10. Reconstructing shifts in vital rates driven by long-term environmental change: a new demographic method based on readily available data

    PubMed Central

    González, Edgar J; Martorell, Carlos

    2013-01-01

    Frequently, vital rates are driven by directional, long-term environmental changes. Many of these are of great importance, such as land degradation, climate change, and succession. Traditional demographic methods assume a constant or stationary environment, and thus are inappropriate to analyze populations subject to these changes. They also require repeat surveys of the individuals as change unfolds. Methods for reconstructing such lengthy processes are needed. We present a model that, based on a time series of population size structures and densities, reconstructs the impact of directional environmental changes on vital rates. The model uses integral projection models and maximum likelihood to identify the rates that best reconstructs the time series. The procedure was validated with artificial and real data. The former involved simulated species with widely different demographic behaviors. The latter used a chronosequence of populations of an endangered cactus subject to increasing anthropogenic disturbance. In our simulations, the vital rates and their change were always reconstructed accurately. Nevertheless, the model frequently produced alternative results. The use of coarse knowledge of the species' biology (whether vital rates increase or decrease with size or their plausible values) allowed the correct rates to be identified with a 90% success rate. With real data, the model correctly reconstructed the effects of disturbance on vital rates. These effects were previously known from two populations for which demographic data were available. Our procedure seems robust, as the data violated several of the model's assumptions. Thus, time series of size structures and densities contain the necessary information to reconstruct changing vital rates. However, additional biological knowledge may be required to provide reliable results. Because time series of size structures and densities are available for many species or can be rapidly generated, our model can contribute to understand populations that face highly pressing environmental problems. PMID:23919169

  11. Visplause: Visual Data Quality Assessment of Many Time Series Using Plausibility Checks.

    PubMed

    Arbesser, Clemens; Spechtenhauser, Florian; Muhlbacher, Thomas; Piringer, Harald

    2017-01-01

    Trends like decentralized energy production lead to an exploding number of time series from sensors and other sources that need to be assessed regarding their data quality (DQ). While the identification of DQ problems for such routinely collected data is typically based on existing automated plausibility checks, an efficient inspection and validation of check results for hundreds or thousands of time series is challenging. The main contribution of this paper is the validated design of Visplause, a system to support an efficient inspection of DQ problems for many time series. The key idea of Visplause is to utilize meta-information concerning the semantics of both the time series and the plausibility checks for structuring and summarizing results of DQ checks in a flexible way. Linked views enable users to inspect anomalies in detail and to generate hypotheses about possible causes. The design of Visplause was guided by goals derived from a comprehensive task analysis with domain experts in the energy sector. We reflect on the design process by discussing design decisions at four stages and we identify lessons learned. We also report feedback from domain experts after using Visplause for a period of one month. This feedback suggests significant efficiency gains for DQ assessment, increased confidence in the DQ, and the applicability of Visplause to summarize indicators also outside the context of DQ.

  12. Exploratory Causal Analysis in Bivariate Time Series Data

    NASA Astrophysics Data System (ADS)

    McCracken, James M.

    Many scientific disciplines rely on observational data of systems for which it is difficult (or impossible) to implement controlled experiments, and data analysis techniques are required for identifying causal information and relationships directly from observational data. This need has led to the development of many different time series causality approaches and tools, including transfer entropy, convergent cross-mapping (CCM), and Granger causality statistics. In this thesis, the existing time series causality method of CCM is extended by introducing a new method called pairwise asymmetric inference (PAI). It is found that CCM may provide counter-intuitive causal inferences for simple dynamics with strong intuitive notions of causality, and the CCM causal inference can be a function of physical parameters that are seemingly unrelated to the existence of a driving relationship in the system. For example, a CCM causal inference might alternate between ''voltage drives current'' and ''current drives voltage'' as the frequency of the voltage signal is changed in a series circuit with a single resistor and inductor. PAI is introduced to address both of these limitations. Many of the current approaches in the time series causality literature are not computationally straightforward to apply, do not follow directly from assumptions of probabilistic causality, depend on assumed models for the time series generating process, or rely on embedding procedures. A new approach, called causal leaning, is introduced in this work to avoid these issues. The leaning is found to provide causal inferences that agree with intuition for both simple systems and more complicated empirical examples, including space weather data sets. The leaning may provide a clearer interpretation of the results than those from existing time series causality tools. A practicing analyst can explore the literature to find many proposals for identifying drivers and causal connections in time series data sets, but little research exists on how these tools compare to each other in practice. This work introduces and defines exploratory causal analysis (ECA) to address this issue, along with the concept of data causality in the taxonomy of causal studies introduced in this work. The motivation is to provide a framework for exploring potential causal structures in time series data sets. ECA is used on several synthetic and empirical data sets, and it is found that all of the tested time series causality tools agree with each other (and intuitive notions of causality) for many simple systems but can provide conflicting causal inferences for more complicated systems. It is proposed that such disagreements between different time series causality tools during ECA might provide deeper insight into the data than could be found otherwise.

  13. Temporal Fine Structure and Applications to Cochlear Implants

    ERIC Educational Resources Information Center

    Li, Xing

    2013-01-01

    Complex broadband sounds are decomposed by the auditory filters into a series of relatively narrowband signals, each of which conveys information about the sound by time-varying features. The slow changes in the overall amplitude constitute envelope, while the more rapid events, such as zero crossings, constitute temporal fine structure (TFS).…

  14. Porous silicon formation and etching process for use in silicon micromachining

    DOEpatents

    Guilinger, Terry R.; Kelly, Michael J.; Martin, Jr., Samuel B.; Stevenson, Joel O.; Tsao, Sylvia S.

    1991-01-01

    A reproducible process for uniformly etching silicon from a series of micromechanical structures used in electrical devices and the like includes providing a micromechanical structure having a silicon layer with defined areas for removal thereon and an electrochemical cell containing an aqueous hydrofluoric acid electrolyte. The micromechanical structure is submerged in the electrochemical cell and the defined areas of the silicon layer thereon are anodically biased by passing a current through the electrochemical cell for a time period sufficient to cause the defined areas of the silicon layer to become porous. The depth of the porous silicon formation is regulated by controlling the amount of current passing through the electrochemical cell. The micromechanical structure is then removed from the electrochemical cell and submerged in a hydroxide solution to remove the porous silicon. The process is subsequently repeated for each of the series of micromechanical structures to achieve a reproducibility better than 0.3%.

  15. Molecular surface representation using 3D Zernike descriptors for protein shape comparison and docking.

    PubMed

    Kihara, Daisuke; Sael, Lee; Chikhi, Rayan; Esquivel-Rodriguez, Juan

    2011-09-01

    The tertiary structures of proteins have been solved at an increasing pace in recent years. To capitalize on the enormous effort devoted to accumulating the structure data, efficient and effective computational methods need to be developed for comparing, searching, and investigating interactions of protein structures. We introduce the 3D Zernike descriptor (3DZD), an emerging technique to describe molecular surfaces. The 3DZD is a series expansion of a mathematical three-dimensional function, and thus a tertiary structure is represented compactly by a vector of coefficients of terms in the series. A strong advantage of the 3DZD is that it is invariant to rotation of the target object to be represented. These two characteristics of the 3DZD allow rapid comparison of surface shapes, which is sufficient for real-time structure database screening. In this article, we review various applications of the 3DZD, which have been recently proposed.

  16. Molecular Modeling and Experimental Investigations of Nonlinear Optical Compounds Monosubstituted Derivatives of Dicyanovinylbenzene

    NASA Technical Reports Server (NTRS)

    Timofeeva, Tatiana V.; Nesterov, Vladimir N.; Antipin, Mikhail Yu.; Clark, Ronald D.; Sanghadasa, Mohan; Cardelino, Beatriz H.; Moore, Craig E.; Frazier, Donald O.

    1999-01-01

    A search for potential nonlinear optical compounds was performed using the Cambridge Structure Database and molecular modeling. We investigated a series of monosubstituted derivatives of dicyanovinylbenzene, since the nonlinear optical (NLO) properties of such derivatives (o-methoxy-dicyanovinylbenzene, DIVA) were studied earlier. The molecular geometry of these compounds was investigated with x-ray analysis and discussed along with the results of molecular mechanics and ab initio quantum chemical calculations. The influence of crystal packing on the planarity of the molecules of this series has been revealed. Two new compounds from the series studied, ortho-F- and para-Cl-dicyanovinylbenzene, were found, according to powder measurements, to be NLO compounds in the crystal state about 10 times more active than urea. The peculiarities of crystal structure formation in the framework of the balance between van der Waals and electrostatic interactions have been discussed. The crystal shapes of DIVA and the two new NLO compounds have been calculated on the basis of the known crystal structures.

  17. On the modular structure of the genus-one Type II superstring low energy expansion

    NASA Astrophysics Data System (ADS)

    D'Hoker, Eric; Green, Michael B.; Vanhove, Pierre

    2015-08-01

    The analytic contribution to the low energy expansion of Type II string amplitudes at genus-one is a power series in space-time derivatives with coefficients that are determined by integrals of modular functions over the complex structure modulus of the world-sheet torus. These modular functions are associated with world-sheet vacuum Feynman diagrams and given by multiple sums over the discrete momenta on the torus. In this paper we exhibit exact differential and algebraic relations for a certain infinite class of such modular functions by showing that they satisfy Laplace eigenvalue equations with inhomogeneous terms that are polynomial in non-holomorphic Eisenstein series. Furthermore, we argue that the set of modular functions that contribute to the coefficients of interactions up to order are linear sums of functions in this class and quadratic polynomials in Eisenstein series and odd Riemann zeta values. Integration over the complex structure results in coefficients of the low energy expansion that are rational numbers multiplying monomials in odd Riemann zeta values.

  18. Hierarchical structure of the energy landscape of proteins revisited by time series analysis. I. Mimicking protein dynamics in different time scales

    NASA Astrophysics Data System (ADS)

    Alakent, Burak; Camurdan, Mehmet C.; Doruker, Pemra

    2005-10-01

    Time series models, which are constructed from the projections of the molecular-dynamics (MD) runs on principal components (modes), are used to mimic the dynamics of two proteins: tendamistat and the immunity protein of colicin E7 (ImmE7). Four independent MD runs of tendamistat and three independent runs of the ImmE7 protein in vacuum are used to investigate the energy landscapes of these proteins. It is found that mean-square displacements of residues along the modes in different time scales can be mimicked by time series models, which are utilized in dividing protein dynamics into different regimes with respect to the dominating motion type. The first two regimes constitute the dominance of intraminimum motions during the first 5 ps and the random walk motion in a hierarchically higher-level energy minimum, which comprise the initial time period of the trajectories up to 20-40 ps for tendamistat and 80-120 ps for ImmE7. These are also the time ranges within which the linear nonstationary time series are completely satisfactory in explaining protein dynamics. Encountering energy barriers enclosing higher-level energy minima constrains the random walk motion of the proteins, and pseudorelaxation processes at different levels of minima are detected in tendamistat, depending on the sampling window size. Correlation (relaxation) times of 30-40 ps and 150-200 ps are detected for two energy envelopes of successive levels for tendamistat, which gives an overall idea about the hierarchical structure of the energy landscape. However, it should be stressed that correlation times of the modes are highly variable with respect to conformational subspaces and sampling window sizes, indicating the absence of an actual relaxation. The random-walk step sizes and the time length of the second regime are used to illuminate an important difference between the dynamics of the two proteins, which cannot be clarified by the investigation of relaxation times alone: ImmE7 has lower energy barriers enclosing the higher-level energy minimum, preventing the protein from relaxing and letting it move in a random-walk fashion for a longer period of time.

  19. Engineering Probiotics that Improve Warfighter Performance by Maintaining Lean Body Mass and Inhibiting Anxiety

    DTIC Science & Technology

    2017-10-03


  20. Approximate scaling properties of RNA free energy landscapes

    NASA Technical Reports Server (NTRS)

    Baskaran, S.; Stadler, P. F.; Schuster, P.

    1996-01-01

    RNA free energy landscapes are analysed by means of "time-series" that are obtained from random walks restricted to excursion sets. The power spectra, the scaling of the jump size distribution, and the scaling of the curve length measured with different yard stick lengths are used to describe the structure of these "time series". Although they are stationary by construction, we find that their local behavior is consistent with both AR(1) and self-affine processes. Random walks confined to excursion sets (i.e., with the restriction that the fitness value exceeds a certain threshold at each step) exhibit essentially the same statistics as free random walks. We find that an AR(1) time series is in general approximately self-affine on timescales up to approximately the correlation length. We present an empirical relation between the correlation parameter rho of the AR(1) model and the exponents characterizing self-affinity.

  1. Nonparametric autocovariance estimation from censored time series by Gaussian imputation.

    PubMed

    Park, Jung Wook; Genton, Marc G; Ghosh, Sujit K

    2009-02-01

    One of the most frequently used methods to model the autocovariance function of a second-order stationary time series is to use the parametric framework of autoregressive and moving average models developed by Box and Jenkins. However, such parametric models, though very flexible, may not always be adequate to model autocovariance functions with sharp changes. Furthermore, if the data do not follow the parametric model and are censored at a certain value, the estimation results may not be reliable. We develop a Gaussian imputation method to estimate an autocovariance structure via nonparametric estimation of the autocovariance function in order to address both censoring and incorrect model specification. We demonstrate the effectiveness of the technique in terms of bias and efficiency with simulations under various rates of censoring and underlying models. We describe its application to a time series of silicon concentrations in the Arctic.
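
    As a rough illustration of the ideas above, the sketch below computes an empirical autocovariance function and applies a crude single-pass Gaussian imputation for left-censored values; it is a hedged stand-in for, not a reproduction of, the authors' estimator, and the AR(1) data and detection limit are purely illustrative.

```python
import numpy as np
from scipy import stats

def empirical_autocovariance(x, max_lag):
    """Biased (1/n) sample autocovariance estimates for lags 0..max_lag."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    return np.array([np.sum(xc[:n - h] * xc[h:]) / n for h in range(max_lag + 1)])

def impute_censored(x, detection_limit):
    """Replace left-censored values (at or below the detection limit) with draws
    from a normal distribution truncated above at the limit. This is a crude
    stand-in for the Gaussian imputation scheme described above."""
    x = np.asarray(x, dtype=float)
    censored = x <= detection_limit
    mu, sigma = x[~censored].mean(), x[~censored].std(ddof=1)
    b = (detection_limit - mu) / sigma
    draws = stats.truncnorm.rvs(-np.inf, b, loc=mu, scale=sigma, size=censored.sum())
    out = x.copy()
    out[censored] = draws
    return out

# Example: AR(1) series censored at its 20th percentile (illustrative values).
rng = np.random.default_rng(0)
z = np.zeros(500)
for t in range(1, 500):
    z[t] = 0.6 * z[t - 1] + rng.normal()
limit = np.quantile(z, 0.2)
censored_series = np.maximum(z, limit)   # values below the limit are reported at the limit
acov = empirical_autocovariance(impute_censored(censored_series, limit), max_lag=10)
print(acov)
```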

  2. Exploring fractal behaviour of blood oxygen saturation in preterm babies

    NASA Astrophysics Data System (ADS)

    Zahari, Marina; Hui, Tan Xin; Zainuri, Nuryazmin Ahmat; Darlow, Brian A.

    2017-04-01

    Recent evidence has been emerging that oxygenation instability in preterm babies could lead to an increased risk of retinal injury such as retinopathy of prematurity. There is a potential that disease severity could be better understood using nonlinear methods for time series data such as fractal theories [1]. Theories on fractal behaviours have been employed by researchers in various disciplines who were motivated to look into the behaviour or structure of irregular fluctuations in temporal data. In this study, an investigation was carried out to examine whether fractal behaviour could be detected in blood oxygen time series. Detection for the presence of fractals in oxygen data of preterm infants was performed using the methods of power spectrum, empirical probability distribution function and autocorrelation function. The results from these fractal identification methods indicate the possibility that these data exhibit fractal nature. Subsequently, a fractal framework for future research was suggested for oxygen time series.

  3. Cosinor-based rhythmometry

    PubMed Central

    2014-01-01

    A brief overview is provided of cosinor-based techniques for the analysis of time series in chronobiology. Conceived as a regression problem, the method is applicable to non-equidistant data, a major advantage. Another dividend is the feasibility of deriving confidence intervals for parameters of rhythmic components of known periods, readily drawn from the least squares procedure, stressing the importance of prior (external) information. Originally developed for the analysis of short and sparse data series, the extended cosinor has been further developed for the analysis of long time series, focusing both on rhythm detection and parameter estimation. Attention is given to the assumptions underlying the use of the cosinor and ways to determine whether they are satisfied. In particular, ways of dealing with non-stationary data are presented. Examples illustrate the use of the different cosinor-based methods, extending their application from the study of circadian rhythms to the mapping of broad time structures (chronomes). PMID:24725531
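
    The single-component cosinor reduces to ordinary least squares on cosine and sine regressors, which is why it handles non-equidistant sampling naturally. The following minimal sketch (period assumed known, synthetic data) illustrates that formulation; it is not the extended cosinor discussed above.

```python
import numpy as np

def cosinor_fit(t, y, period=24.0):
    """Single-component cosinor: y ~ M + A*cos(2*pi*t/period + phi).
    Fit as a linear regression on cos and sin terms, which works for
    non-equidistant sampling times t."""
    omega = 2 * np.pi / period
    X = np.column_stack([np.ones_like(t), np.cos(omega * t), np.sin(omega * t)])
    (M, beta, gamma), *_ = np.linalg.lstsq(X, y, rcond=None)
    amplitude = np.hypot(beta, gamma)
    acrophase = np.arctan2(-gamma, beta)   # phase of the cosine term
    return M, amplitude, acrophase

# Illustrative, irregularly sampled circadian-like record.
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 72, 60))        # hours, non-equidistant
y = 10 + 3 * np.cos(2 * np.pi * t / 24 - 1.0) + rng.normal(0, 0.5, t.size)
print(cosinor_fit(t, y))                   # MESOR, amplitude, acrophase (radians)
```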

  4. Optimal estimation of recurrence structures from time series

    NASA Astrophysics Data System (ADS)

    beim Graben, Peter; Sellers, Kristin K.; Fröhlich, Flavio; Hutt, Axel

    2016-05-01

    Recurrent temporal dynamics is a phenomenon observed frequently in high-dimensional complex systems and its detection is a challenging task. Recurrence quantification analysis utilizing recurrence plots may extract such dynamics, however it still encounters an unsolved, pertinent problem: the optimal selection of distance thresholds for estimating the recurrence structure of dynamical systems. The present work proposes a stochastic Markov model for the recurrent dynamics that allows for the analytical derivation of a criterion for the optimal distance threshold. The goodness of fit is assessed by a utility function which assumes a local maximum for that threshold reflecting the optimal estimate of the system's recurrence structure. We validate our approach by means of the nonlinear Lorenz system and its linearized stochastic surrogates. The final application to neurophysiological time series obtained from anesthetized animals illustrates the method and reveals novel dynamic features of the underlying system. We propose the number of optimal recurrence domains as a statistic for classifying an animal's state of consciousness.
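
    A minimal sketch of the underlying objects follows: a binary recurrence matrix and a threshold chosen to hit a target recurrence rate. The fixed-rate heuristic is only a hypothetical stand-in for the Markov-model utility criterion proposed in the paper.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

def recurrence_matrix(x, eps):
    """Binary recurrence matrix R[i, j] = 1 if |x_i - x_j| <= eps (scalar series)."""
    d = squareform(pdist(x.reshape(-1, 1)))
    return (d <= eps).astype(int)

def threshold_for_recurrence_rate(x, target_rate=0.05):
    """Pick the distance threshold that yields a given global recurrence rate.
    This is a common heuristic, not the Markov-model criterion proposed above."""
    d = pdist(x.reshape(-1, 1))
    return np.quantile(d, target_rate)

# Illustrative use on a noisy oscillation.
t = np.linspace(0, 20 * np.pi, 800)
x = np.sin(t) + 0.1 * np.random.default_rng(2).normal(size=t.size)
eps = threshold_for_recurrence_rate(x, 0.05)
R = recurrence_matrix(x, eps)
print(eps, R.mean())   # chosen threshold and achieved recurrence rate
```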

  5. Advanced functional network analysis in the geosciences: The pyunicorn package

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan F.; Heitzig, Jobst; Runge, Jakob; Schultz, Hanna C. H.; Wiedermann, Marc; Zech, Alraune; Feldhoff, Jan; Rheinwalt, Aljoscha; Kutza, Hannes; Radebach, Alexander; Marwan, Norbert; Kurths, Jürgen

    2013-04-01

    Functional networks are a powerful tool for analyzing large geoscientific datasets such as global fields of climate time series originating from observations or model simulations. pyunicorn (pythonic unified complex network and recurrence analysis toolbox) is an open-source, fully object-oriented and easily parallelizable package written in the language Python. It allows for constructing functional networks (aka climate networks) representing the structure of statistical interrelationships in large datasets and, subsequently, investigating this structure using advanced methods of complex network theory such as measures for networks of interacting networks, node-weighted statistics or network surrogates. Additionally, pyunicorn makes it possible to study the complex dynamics of geoscientific systems as recorded by time series by means of recurrence networks and visibility graphs. The range of possible applications of the package is outlined drawing on several examples from climatology.
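
    The basic construction behind such functional (climate) networks can be sketched with numpy and networkx as below: nodes are grid points and edges link strongly correlated time series. This mirrors the idea only; it does not use pyunicorn's actual API, and the data are synthetic.

```python
import numpy as np
import networkx as nx

def functional_network(data, threshold=0.5):
    """Build an undirected network from an (n_nodes, n_time) array of time series:
    nodes are grid points, edges link pairs whose absolute Pearson correlation
    exceeds the threshold."""
    corr = np.corrcoef(data)
    n = corr.shape[0]
    G = nx.Graph()
    G.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            if abs(corr[i, j]) >= threshold:
                G.add_edge(i, j, weight=corr[i, j])
    return G

# Illustrative example: 50 synthetic "grid point" series, 200 time steps.
rng = np.random.default_rng(3)
common = rng.normal(size=200)                      # shared signal
data = 0.7 * common + rng.normal(size=(50, 200))   # node-specific noise added on top
G = functional_network(data, threshold=0.6)
print(G.number_of_nodes(), G.number_of_edges(), nx.density(G))
```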

  6. Fine Structure of Anomalously Intense Pulses of PSR J0814+7429 Radio Emission in the Decameter Range

    NASA Astrophysics Data System (ADS)

    Skoryk, A. O.; Ulyanov, O. M.; Zakharenko, V. V.; Shevtsova, A. I.; Vasylieva, I. Y.; Plakhov, M. S.; Kravtsov, I. M.

    2017-06-01

    Purpose: The fine structure of the anomalously intense pulses of PSR J0814+7429 (B0809+74) has been studied. The pulsar radio emission fine structure is investigated to determine its parameters in the lowest part of the spectrum available for ground-based observations. Design/methodology/approach: The scattering measure in the interstellar plasma has been estimated using the spectral and correlation analyses of pulsar data recorded by the UTR-2 radio telescope. Results: Two characteristic time scales of the fine structure of the anomalously intense pulses of the PSR J0814+7429 radio emission have been found. The strongest pulses of this pulsar in the decameter range can have a duration of about 2-3 ms. These pulses are emitted in short series. In some cases, they are emitted over a low-intensity plateau consisting of the “long” subpulse component. Conclusions: The narrowest correlation scale of the pulsar J0814+7429 radio emission corresponds to the doubled scattering time constant of the interstellar medium impulse response. The broader scale of the fine structure of its radio emission can be explained by the radiation of a short series of narrow pulses or relatively broad pulses inside this pulsar's magnetosphere.

  7. Post-Flight Estimation of Motion of Space Structures: Part 1

    NASA Technical Reports Server (NTRS)

    Brugarolas, Paul; Breckenridge, William

    2008-01-01

    A computer program estimates the relative positions and orientations of two space structures from data on the angular positions and distances of fiducial objects on one structure as measured by a target tracking electronic camera and laser range finders on another structure. The program is written specifically for determining the relative alignments of two antennas, connected by a long truss, deployed in outer space from a space shuttle. The program is based partly on transformations among the various coordinate systems involved in the measurements and on a nonlinear mathematical model of vibrations of the truss. The program implements a Kalman filter that blends the measurement data with data from the model. Using time series of measurement data from the tracking camera and range finders, the program generates time series of data on the relative position and orientation of the antennas. A similar program described in a prior NASA Tech Briefs article was used onboard for monitoring the structures during flight. The present program is more precise and designed for use on Earth in post-flight processing of the measurement data to enable correction, for antenna motions, of scientific data acquired by use of the antennas.
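
    The blending step can be illustrated with a minimal scalar Kalman filter, shown below under a simple random-walk process model with assumed noise variances; the actual program uses a nonlinear truss-vibration model and multi-sensor geometry that are not reproduced here.

```python
import numpy as np

def kalman_1d(measurements, x0=0.0, p0=1.0, q=1e-3, r=0.05):
    """Scalar Kalman filter with a random-walk process model.
    q: process noise variance, r: measurement noise variance (assumed values)."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # Predict step (random-walk model: state stays the same, uncertainty grows).
        p = p + q
        # Update step: blend the prediction with the new measurement.
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)
        p = (1 - k) * p
        estimates.append(x)
    return np.array(estimates)

# Illustrative noisy displacement record around a slowly drifting true value.
rng = np.random.default_rng(4)
truth = np.cumsum(rng.normal(0, 0.01, 300))
z = truth + rng.normal(0, 0.2, 300)
print(np.abs(kalman_1d(z) - truth).mean(), np.abs(z - truth).mean())
```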

  8. Rational optimization of drug-target residence time: Insights from inhibitor binding to the S. aureus FabI enzyme-product complex

    PubMed Central

    Chang, Andrew; Schiebel, Johannes; Yu, Weixuan; Bommineni, Gopal R.; Pan, Pan; Baxter, Michael V.; Khanna, Avinash; Sotriffer, Christoph A.; Kisker, Caroline; Tonge, Peter J.

    2013-01-01

    Drug-target kinetics has recently emerged as an especially important facet of the drug discovery process. In particular, prolonged drug-target residence times may confer enhanced efficacy and selectivity in the open in vivo system. However, the lack of accurate kinetic and structural data for series of congeneric compounds hinders the rational design of inhibitors with decreased off-rates. Therefore, we chose the Staphylococcus aureus enoyl-ACP reductase (saFabI) - an important target for the development of new anti-staphylococcal drugs - as a model system to rationalize and optimize the drug-target residence time on a structural basis. Using our new, efficient and widely applicable mechanistically informed kinetic approach, we obtained a full characterization of saFabI inhibition by a series of 20 diphenyl ethers complemented by a collection of 9 saFabI-inhibitor crystal structures. We identified a strong correlation between the affinities of the investigated saFabI diphenyl ether inhibitors and their corresponding residence times, which can be rationalized on a structural basis. Due to its favorable interactions with the enzyme, the residence time of our most potent compound exceeds 10 hours. In addition, we found that affinity and residence time in this system can be significantly enhanced by modifications predictable by a careful consideration of catalysis. Our study provides a blueprint for investigating and prolonging drug-target kinetics and may aid in the rational design of long-residence-time inhibitors targeting the essential saFabI enzyme. PMID:23697754

  9. Conditional Spectral Analysis of Replicated Multiple Time Series with Application to Nocturnal Physiology.

    PubMed

    Krafty, Robert T; Rosen, Ori; Stoffer, David S; Buysse, Daniel J; Hall, Martica H

    2017-01-01

    This article considers the problem of analyzing associations between power spectra of multiple time series and cross-sectional outcomes when data are observed from multiple subjects. The motivating application comes from sleep medicine, where researchers are able to non-invasively record physiological time series signals during sleep. The frequency patterns of these signals, which can be quantified through the power spectrum, contain interpretable information about biological processes. An important problem in sleep research is drawing connections between power spectra of time series signals and clinical characteristics; these connections are key to understanding biological pathways through which sleep affects, and can be treated to improve, health. Such analyses are challenging as they must overcome the complicated structure of a power spectrum from multiple time series as a complex positive-definite matrix-valued function. This article proposes a new approach to such analyses based on a tensor-product spline model of Cholesky components of outcome-dependent power spectra. The approach flexibly models power spectra as nonparametric functions of frequency and outcome while preserving geometric constraints. Formulated in a fully Bayesian framework, a Whittle-likelihood-based Markov chain Monte Carlo (MCMC) algorithm is developed for automated model fitting and for conducting inference on associations between outcomes and spectral measures. The method is used to analyze data from a study of sleep in older adults and uncovers new insights into how stress and arousal are connected to the amount of time one spends in bed.

  10. Stochastic modeling of hourly rainfall times series in Campania (Italy)

    NASA Astrophysics Data System (ADS)

    Giorgio, M.; Greco, R.

    2009-04-01

    Occurrence of flowslides and floods in small catchments is difficult to predict, since it is affected by a number of variables, such as mechanical and hydraulic soil properties, slope morphology, vegetation coverage, and rainfall spatial and temporal variability. Consequently, landslide risk assessment procedures and early warning systems still rely on simple empirical models based on correlation between recorded rainfall data and observed landslides and/or river discharges. Effectiveness of such systems could be improved by reliable quantitative rainfall prediction, which can allow gaining larger lead times. Analysis of on-site recorded rainfall height time series represents the most effective approach for a reliable prediction of the local temporal evolution of rainfall. Hydrological time series analysis is a widely studied field in hydrology, often carried out by means of autoregressive models, such as AR, ARMA, ARX, and ARMAX (e.g. Salas [1992]). Such models gave the best results when applied to the analysis of autocorrelated hydrological time series, like river flow or level time series. Conversely, they are not able to model the behaviour of intermittent time series, like point rainfall height series usually are, especially when recorded with short sampling time intervals. More useful for this issue are the so-called DRIP (Disaggregated Rectangular Intensity Pulse) and NSRP (Neyman-Scott Rectangular Pulse) models [Heneker et al., 2001; Cowpertwait et al., 2002], usually adopted to generate synthetic point rainfall series. In this paper, the DRIP model approach is adopted, in which the sequence of rain storms and dry intervals constituting the structure of the rainfall time series is modeled as an alternating renewal process. The final aim of the study is to provide a useful tool to implement an early warning system for hydrogeological risk management. Model calibration has been carried out with hourly rainfall height data provided by the rain gauges of the Campania Region civil protection agency meteorological warning network. ACKNOWLEDGEMENTS The research was co-financed by the Italian Ministry of University, by means of the PRIN 2006 program, within the research project entitled ‘Definition of critical rainfall thresholds for destructive landslides for civil protection purposes'. REFERENCES Cowpertwait, P.S.P., Kilsby, C.G. and O'Connell, P.E., 2002. A space-time Neyman-Scott model of rainfall: Empirical analysis of extremes, Water Resources Research, 38(8):1-14. Salas, J.D., 1992. Analysis and modeling of hydrological time series, in D.R. Maidment, ed., Handbook of Hydrology, McGraw-Hill, New York. Heneker, T.M., Lambert, M.F. and Kuczera, G., 2001. A point rainfall model for risk-based design, Journal of Hydrology, 247(1-2):54-71.
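
    A toy version of the alternating renewal idea is sketched below: exponential dry spells alternating with rectangular rainfall pulses. The distributions and parameter values are illustrative placeholders, not the calibrated DRIP model.

```python
import numpy as np

def simulate_rainfall(n_hours, rng=None):
    """Alternating renewal process: exponential dry spells and storm durations,
    each storm modelled as a rectangular pulse of constant intensity.
    All distributions and parameters below are illustrative placeholders."""
    rng = rng or np.random.default_rng()
    rain = np.zeros(n_hours)
    t = 0.0
    while t < n_hours:
        dry = rng.exponential(30.0)            # mean dry spell: 30 h
        wet = max(1.0, rng.exponential(6.0))   # mean storm duration: 6 h
        intensity = rng.gamma(2.0, 1.5)        # mm/h during the storm
        start = int(t + dry)
        end = min(n_hours, int(t + dry + wet))
        rain[start:end] = intensity
        t += dry + wet
    return rain

series = simulate_rainfall(24 * 365, np.random.default_rng(5))
print(series.sum(), (series > 0).mean())   # annual total (mm) and wet-hour fraction
```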

  11. Spectral analysis of hydrological time series of a river basin in southern Spain

    NASA Astrophysics Data System (ADS)

    Luque-Espinar, Juan Antonio; Pulido-Velazquez, David; Pardo-Igúzquiza, Eulogio; Fernández-Chacón, Francisca; Jiménez-Sánchez, Jorge; Chica-Olmo, Mario

    2016-04-01

    Spectral analysis has been applied with the aim of determining the presence and statistical significance of climate cycles in data series from different rainfall, piezometric and gauging stations located in the upper Genil River Basin. This river starts in the Sierra Nevada Range at 3,480 m a.s.l. and is one of the most important rivers of this region. The study area covers more than 2,500 km2, with large topographic differences. For this study, we have used more than 30 rain data series, 4 piezometric data series and 3 data series from gauging stations. Considering a monthly temporal unit, the studied period ranges from 1951 to 2015, but most of the data series have some gaps. Spectral analysis is a methodology widely used to discover cyclic components in time series. The time series is assumed to be a linear combination of sinusoidal functions of known periods but of unknown amplitude and phase. The amplitude is related to the variance of the time series explained by the oscillation at each frequency (Blackman and Tukey, 1958, Bras and Rodríguez-Iturbe, 1985, Chatfield, 1991, Jenkins and Watts, 1968, among others). The signal component represents the structured part of the time series, made up of a small number of embedded periodicities. We then take into account the known result for the one-sided confidence band of the power spectrum estimator. For this study, we established confidence levels of <90%, 90%, 95%, and 99%. Different climate signals have been identified: ENSO, QBO, NAO, and sunspot cycles, as well as others related to solar activity, but the most powerful signals correspond to the annual cycle, followed by the 6-month and NAO cycles. Nevertheless, significant differences between rain data series and piezometric/flow data series have been pointed out. In piezometric and flow data series, ENSO and NAO signals could be stronger than others at high frequencies. The climatic peaks at lower frequencies in rain data are smaller, as is their confidence level. On the other hand, the most important influences on groundwater resources and river flows are the NAO, sunspot, ENSO and annual cycles. Acknowledgments: This research has been partially supported by the IMPADAPT project (CGL2013-48424-C2-1-R) with Spanish MINECO funds and Junta de Andalucía (Group RNM122).
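
    The core computation can be sketched as below: a periodogram of a monthly series with a white-noise chi-square threshold standing in for the one-sided confidence band used in the study; data and parameters are synthetic.

```python
import numpy as np
from scipy import signal, stats

def significant_peaks(x, fs=12.0, confidence=0.95):
    """Periodogram of a monthly series (fs in samples per year) and a white-noise
    chi-square threshold as a rough stand-in for the one-sided confidence band."""
    f, pxx = signal.periodogram(x - np.mean(x), fs=fs)
    # Under white noise each periodogram ordinate ~ (mean level) * chi2_2 / 2.
    threshold = pxx.mean() * stats.chi2.ppf(confidence, df=2) / 2.0
    return f, pxx, f[pxx > threshold]

# Illustrative monthly rainfall-like series with an annual cycle plus noise.
rng = np.random.default_rng(6)
months = np.arange(480)                    # 40 years of monthly values
x = 50 + 20 * np.cos(2 * np.pi * months / 12) + rng.normal(0, 10, months.size)
f, pxx, peaks = significant_peaks(x)
print(peaks)                               # should include ~1 cycle/year
```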

  12. Anomalous Anticipatory Responses in Networked Random Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, Roger D.; Bancel, Peter A.

    2006-10-16

    We examine an 8-year archive of synchronized, parallel time series of random data from a world spanning network of physical random event generators (REGs). The archive is a publicly accessible matrix of normally distributed 200-bit sums recorded at 1 Hz which extends from August 1998 to the present. The primary question is whether these data show non-random structure associated with major events such as natural or man-made disasters, terrible accidents, or grand celebrations. Secondarily, we examine the time course of apparently correlated responses. Statistical analyses of the data reveal consistent evidence that events which strongly affect people engender small but significant effects. These include suggestions of anticipatory responses in some cases, leading to a series of specialized analyses to assess possible non-random structure preceding precisely timed events. A focused examination of data collected around the time of earthquakes with Richter magnitude 6 and greater reveals non-random structure with a number of intriguing, potentially important features. Anomalous effects in the REG data are seen only when the corresponding earthquakes occur in populated areas. No structure is found if they occur in the oceans. We infer that an important contributor to the effect is the relevance of the earthquake to humans. Epoch averaging reveals evidence for changes in the data some hours prior to the main temblor, suggestive of reverse causation.

  13. Real time wave forecasting using wind time history and numerical model

    NASA Astrophysics Data System (ADS)

    Jain, Pooja; Deo, M. C.; Latha, G.; Rajendran, V.

    Operational activities in the ocean, like planning for structural repairs or fishing expeditions, require real-time prediction of waves over a typical time duration of, say, a few hours. Such predictions can be made by using a numerical model or a time series model employing continuously recorded waves. This paper presents another option to do so, based on a different time series approach in which the input is in the form of preceding wind speed and wind direction observations. This would be useful for those stations where costly wave buoys are not deployed and instead only meteorological buoys measuring wind are moored. The technique employs the alternative artificial intelligence approaches of an artificial neural network (ANN), genetic programming (GP) and model tree (MT) to carry out the time series modeling of wind to obtain waves. Wind observations at four offshore sites along the east coast of India were used. For calibration purposes the wave data were generated using a numerical model. The predicted waves obtained using the proposed time series models, when compared with the numerically generated waves, showed good resemblance in terms of the selected error criteria. Large differences across the chosen techniques of ANN, GP, and MT were not noticed. Wave hindcasting at the same time step and the predictions over shorter lead times were better than the predictions over longer lead times. The proposed method is a cost-effective and convenient option when site-specific information is desired.
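
    The general idea of predicting waves from preceding wind observations can be sketched with a lagged-feature regression, as below using scikit-learn's MLPRegressor on synthetic data; this is a hedged illustration, not the paper's ANN/GP/MT implementations.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

def lagged_features(wind, n_lags=6):
    """Build a design matrix of the preceding n_lags wind speeds for each time step."""
    return np.column_stack([wind[i:len(wind) - n_lags + i] for i in range(n_lags)])

# Synthetic stand-in: significant wave height loosely driven by recent wind.
rng = np.random.default_rng(7)
wind = 5 + 3 * np.sin(np.arange(2000) / 50) + rng.normal(0, 1, 2000)
hs = 0.02 * np.convolve(wind**1.5, np.ones(6) / 6, mode="full")[:2000] + rng.normal(0, 0.05, 2000)

X = lagged_features(wind, n_lags=6)   # preceding 6 wind values
y = hs[6:]                            # wave height at the next step
X_train, X_test, y_train, y_test = train_test_split(X, y, shuffle=False, test_size=0.2)
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))
```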

  14. Studying Climate Response to Forcing by the Nonlinear Dynamical Mode Decomposition

    NASA Astrophysics Data System (ADS)

    Mukhin, Dmitry; Gavrilov, Andrey; Loskutov, Evgeny; Feigin, Alexander

    2017-04-01

    An analysis of global climate response to external forcing, both anthropogenic (mainly CO2 and aerosol) and natural (solar and volcanic), is needed for adequate predictions of global climate change. Being a complex dynamical system, the climate reacts to external perturbations by exciting feedbacks (both positive and negative), making the response non-trivial and poorly predictable. Thus the extraction of the internal modes of the climate system, the investigation of their interaction with external forcings, and the modeling and forecasting of their dynamics are all problems on which the success of climate modeling depends. In this report a new method for principal mode extraction from climate data is presented. The method is based on the Nonlinear Dynamical Mode (NDM) expansion [1,2], but takes into account a number of external forcings applied to the system. Each NDM is represented by a hidden time series governing the observed variability, which, together with the external forcing time series, is mapped onto data space. While the forcing time series are considered to be known, the hidden unknown signals underlying the internal climate dynamics are extracted from observed data by the suggested method. In particular, it gives us an opportunity to study the evolution of the principal mode structure of the system under changing external conditions and to separate the internal climate variability from trends forced by external perturbations. Furthermore, the modes so obtained can be extrapolated beyond the observational time series, and long-term prognosis of the modes' structure, including characteristics of interconnections and responses to external perturbations, can be carried out. In this work the method is used for reconstructing and studying the principal modes of climate variability on inter-annual and decadal time scales, accounting for external forcings such as anthropogenic emissions, variations of solar activity and volcanic activity. The structure of the obtained modes as well as their response to external factors, e.g. forecasts of their change in the 21st century under different CO2 emission scenarios, are discussed. [1] Mukhin, D., Gavrilov, A., Feigin, A., Loskutov, E., & Kurths, J. (2015). Principal nonlinear dynamical modes of climate variability. Scientific Reports, 5, 15510. http://doi.org/10.1038/srep15510 [2] Gavrilov, A., Mukhin, D., Loskutov, E., Volodin, E., Feigin, A., & Kurths, J. (2016). Method for reconstructing nonlinear modes with adaptive structure from multidimensional data. Chaos: An Interdisciplinary Journal of Nonlinear Science, 26(12), 123101. http://doi.org/10.1063/1.4968852

  15. Piecewise multivariate modelling of sequential metabolic profiling data.

    PubMed

    Rantalainen, Mattias; Cloarec, Olivier; Ebbels, Timothy M D; Lundstedt, Torbjörn; Nicholson, Jeremy K; Holmes, Elaine; Trygg, Johan

    2008-02-19

    Modelling the time-related behaviour of biological systems is essential for understanding their dynamic responses to perturbations. In metabolic profiling studies, the sampling rate and number of sampling points are often restricted due to experimental and biological constraints. A supervised multivariate modelling approach with the objective to model the time-related variation in the data for short and sparsely sampled time-series is described. A set of piecewise Orthogonal Projections to Latent Structures (OPLS) models are estimated, describing changes between successive time points. The individual OPLS models are linear, but the piecewise combination of several models accommodates modelling and prediction of changes which are non-linear with respect to the time course. We demonstrate the method on both simulated and metabolic profiling data, illustrating how time related changes are successfully modelled and predicted. The proposed method is effective for modelling and prediction of short and multivariate time series data. A key advantage of the method is model transparency, allowing easy interpretation of time-related variation in the data. The method provides a competitive complement to commonly applied multivariate methods such as OPLS and Principal Component Analysis (PCA) for modelling and analysis of short time-series data.
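
    A rough sketch of the piecewise idea follows, using ordinary PLS regression (scikit-learn has no OPLS) between successive time points on synthetic metabolite profiles; it illustrates the structure of the approach rather than the authors' exact models.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def piecewise_pls(profiles_by_time, n_components=2):
    """Fit one PLS model per pair of successive time points.
    profiles_by_time: list of (n_samples, n_metabolites) arrays, one per time point.
    Each model predicts a 0/1 'time' response from the profiles, a stand-in
    for the OPLS models described above."""
    models = []
    for a, b in zip(profiles_by_time[:-1], profiles_by_time[1:]):
        X = np.vstack([a, b])
        y = np.concatenate([np.zeros(len(a)), np.ones(len(b))])
        models.append(PLSRegression(n_components=n_components).fit(X, y))
    return models

# Synthetic example: 4 time points, 10 samples each, 50 metabolites,
# with a handful of metabolites drifting over time.
rng = np.random.default_rng(8)
profiles = []
for t in range(4):
    base = rng.normal(size=(10, 50))
    base[:, :5] += 0.8 * t      # time-related shift in the first 5 metabolites
    profiles.append(base)

models = piecewise_pls(profiles)
print(len(models), models[0].x_weights_.shape)
```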

  16. Optical signal processing using photonic reservoir computing

    NASA Astrophysics Data System (ADS)

    Salehi, Mohammad Reza; Dehyadegari, Louiza

    2014-10-01

    As a new approach to recognition and classification problems, photonic reservoir computing has such advantages as parallel information processing, power efficiency and high speed. In this paper, a photonic structure is proposed for reservoir computing, which is investigated using a simple, yet non-partial, noisy time series prediction task. This study includes the application of a suitable topology with self-feedbacks in a network of SOAs (semiconductor optical amplifiers) - which lends the system a strong memory - and leads to the adjustment of adequate parameters, resulting in perfect recognition accuracy (100%) for noise-free time series, which shows a 3% improvement over previous results. For the classification of noisy time series, the rate of accuracy showed a 4% increase and amounted to 96%. Furthermore, an analytical approach is suggested to solve the rate equations, which leads to a substantial decrease in the simulation time - an important parameter in the classification of large signals such as those in speech recognition - and better results were obtained compared with previous works.
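
    For readers unfamiliar with reservoir computing, the sketch below implements a minimal software echo state network (random fixed reservoir with feedback, trained linear readout) on a toy prediction task; it mimics the concept in software rather than the SOA-based photonic system, and all parameters are illustrative.

```python
import numpy as np

class EchoStateNetwork:
    """Minimal leaky echo state network: random fixed reservoir, trained readout."""
    def __init__(self, n_inputs=1, n_reservoir=100, spectral_radius=0.9, leak=0.3, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs))
        W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
        self.W = W * (spectral_radius / max(abs(np.linalg.eigvals(W))))
        self.leak = leak

    def _states(self, u):
        # Drive the reservoir with the input sequence and collect its states.
        x = np.zeros(self.W.shape[0])
        states = []
        for ut in u.reshape(len(u), -1):
            pre = np.tanh(self.W_in @ ut + self.W @ x)
            x = (1 - self.leak) * x + self.leak * pre
            states.append(x.copy())
        return np.array(states)

    def fit(self, u, y, ridge=1e-6):
        # Ridge-regression readout from reservoir states to targets.
        S = self._states(u)
        self.W_out = np.linalg.solve(S.T @ S + ridge * np.eye(S.shape[1]), S.T @ y)
        return self

    def predict(self, u):
        return self._states(u) @ self.W_out

# One-step-ahead prediction of a noisy sine (illustrative task).
rng = np.random.default_rng(9)
s = np.sin(np.arange(1500) / 10) + 0.05 * rng.normal(size=1500)
esn = EchoStateNetwork().fit(s[:1000], s[1:1001])
pred = esn.predict(s[1000:-1])
print(np.mean((pred - s[1001:]) ** 2))   # mean squared prediction error
```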

  17. AQUAdexIM: highly efficient in-memory indexing and querying of astronomy time series images

    NASA Astrophysics Data System (ADS)

    Hong, Zhi; Yu, Ce; Wang, Jie; Xiao, Jian; Cui, Chenzhou; Sun, Jizhou

    2016-12-01

    Astronomy has always been, and will continue to be, a data-based science, and astronomers nowadays are faced with increasingly massive datasets, one key problem of which is to efficiently retrieve the desired cup of data from the ocean. AQUAdexIM, an innovative spatial indexing and querying method, performs highly efficient on-the-fly queries on users' request to search for Time Series Images from existing observation data on the server side and only returns the desired FITS images to users, so users no longer need to download entire datasets to their local machines, which will only become more and more impractical as the data size keeps increasing. Moreover, AQUAdexIM manages to keep a very low storage space overhead, and its specially designed in-memory index structure enables it to search for Time Series Images of a given area of the sky 10 times faster than using Redis, a state-of-the-art in-memory database.

  18. Spectral signatures of jumps and turbulence in interplanetary speed and magnetic field data

    NASA Technical Reports Server (NTRS)

    Roberts, D. A.; Goldstein, M. L.

    1987-01-01

    It is shown here that, consistent with a suggestion of Burlaga and Mish (1987), the f exp -2 spectra in the magnitudes of the magnetic and velocity fields in the solar wind result from jumps due to various rapid changes in the time series for these quantities. If these jumps are removed from the data, the spectra of the resulting 'difference' time series have the f exp -5/3 form. It is concluded that f exp -2 spectra in these magnitudes arise from phase coherent structures that can be distinguished clearly from incoherent turbulent fluctuations.
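
    The spectral distinction can be checked numerically, as in the sketch below: a piecewise-constant series with sparse jumps shows a spectral slope near -2, while synthetic fluctuations with a prescribed Kolmogorov spectrum show -5/3. The construction is entirely synthetic and is not the authors' solar wind analysis.

```python
import numpy as np
from scipy import signal

def spectral_slope(x, fmax=0.1):
    """Least-squares slope of log power vs. log frequency over 0 < f < fmax
    (Welch estimate; the low-frequency band avoids discretization effects)."""
    f, pxx = signal.welch(x, nperseg=1024)
    band = (f > 0) & (f < fmax)
    return np.polyfit(np.log(f[band]), np.log(pxx[band]), 1)[0]

rng = np.random.default_rng(10)
n = 2**16

# Piecewise-constant "jump" series: sparse random steps, constant in between.
steps = rng.normal(size=n) * (rng.random(n) < 0.001)
jump_series = np.cumsum(steps)

# Synthetic "turbulent" fluctuations with a prescribed f^(-5/3) power spectrum.
freqs = np.fft.rfftfreq(n)
amplitude = np.zeros_like(freqs)
amplitude[1:] = freqs[1:] ** (-5.0 / 6.0)       # power ~ amplitude^2 ~ f^(-5/3)
phases = np.exp(2j * np.pi * rng.random(freqs.size))
turbulence = np.fft.irfft(amplitude * phases, n)

print("jump-dominated slope:", spectral_slope(jump_series))   # close to -2
print("turbulent slope:", spectral_slope(turbulence))         # close to -5/3
```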

  19. Improvements of the two-dimensional FDTD method for the simulation of normal- and superconducting planar waveguides using time series analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hofschen, S.; Wolff, I.

    1996-08-01

    Time-domain simulation results of two-dimensional (2-D) planar waveguide finite-difference time-domain (FDTD) analysis are normally analyzed using the Fourier transform. The introduced method of time series analysis to extract propagation and attenuation constants reduces the required computation time drastically. Additionally, a nonequidistant discretization together with an adequate excitation technique is used to reduce the number of spatial grid points. Therefore, it is possible to simulate normal- and superconducting planar waveguide structures with very thin conductors and small dimensions, as they are used in MMIC technology. The simulation results are compared with measurements and show good agreement.

  20. The coupling analysis between stock market indices based on permutation measures

    NASA Astrophysics Data System (ADS)

    Shi, Wenbin; Shang, Pengjian; Xia, Jianan; Yeh, Chien-Hung

    2016-04-01

    Many information-theoretic methods have been proposed for analyzing the coupling dependence between time series, and it is important to quantify the correlation relationship between financial sequences, since the financial market is a complex, evolving dynamic system. Recently, we developed a new permutation-based entropy, called cross-permutation entropy (CPE), to detect the coupling structures between two synchronous time series. In this paper, we extend the CPE method to weighted cross-permutation entropy (WCPE), to address some of CPE's limitations, mainly its inability to differentiate between distinct patterns of a certain motif and its sensitivity to patterns close to the noise floor. It shows more stable and reliable results than CPE does when applied to spiky data and AR(1) processes. Besides, we adapt the CPE method to infer the complexity of short-length time series by freely changing the time delay, and test it with Gaussian random series and random walks. The modified method shows advantages in reducing deviations of entropy estimation compared with the conventional one. Finally, the weighted cross-permutation entropy of eight important stock indices from the world financial markets is investigated, and some useful and interesting empirical results are obtained.
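
    The ordinal-pattern machinery underlying CPE/WCPE can be sketched with a plain permutation entropy, as below; the cross- and weighted variants build on this pattern distribution but are not reproduced here, and the signals are illustrative.

```python
import numpy as np
from itertools import permutations
from math import log, factorial

def ordinal_patterns(x, order=3, delay=1):
    """Map each embedded vector of the series to the index of its ordinal pattern."""
    patterns = {p: i for i, p in enumerate(permutations(range(order)))}
    n = len(x) - (order - 1) * delay
    idx = np.empty(n, dtype=int)
    for t in range(n):
        window = x[t:t + order * delay:delay]
        idx[t] = patterns[tuple(np.argsort(window))]
    return idx

def permutation_entropy(x, order=3, delay=1):
    """Normalized Shannon entropy of the ordinal-pattern distribution."""
    idx = ordinal_patterns(x, order, delay)
    counts = np.bincount(idx, minlength=factorial(order))
    p = counts[counts > 0] / counts.sum()
    return -(p * np.log(p)).sum() / log(factorial(order))

rng = np.random.default_rng(11)
print(permutation_entropy(rng.normal(size=5000)))           # close to 1 (white noise)
print(permutation_entropy(np.sin(np.arange(5000) / 5.0)))   # much lower (regular signal)
```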

  1. Clustering Multivariate Time Series Using Hidden Markov Models

    PubMed Central

    Ghassempour, Shima; Girosi, Federico; Maeder, Anthony

    2014-01-01

    In this paper we describe an algorithm for clustering multivariate time series with variables taking both categorical and continuous values. Time series of this type are frequent in health care, where they represent the health trajectories of individuals. The problem is challenging because categorical variables make it difficult to define a meaningful distance between trajectories. We propose an approach based on Hidden Markov Models (HMMs), where we first map each trajectory into an HMM, then define a suitable distance between HMMs and finally proceed to cluster the HMMs with a method based on a distance matrix. We test our approach on a simulated, but realistic, data set of 1,255 trajectories of individuals of age 45 and over, on a synthetic validation set with known clustering structure, and on a smaller set of 268 trajectories extracted from the longitudinal Health and Retirement Survey. The proposed method can be implemented quite simply using standard packages in R and Matlab and may be a good candidate for solving the difficult problem of clustering multivariate time series with categorical variables using tools that do not require advanced statistical knowledge, and therefore are accessible to a wide range of researchers. PMID:24662996
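
    A hedged sketch of the pipeline is given below using hmmlearn: fit one Gaussian HMM per trajectory, build a symmetrized log-likelihood distance matrix, and cluster it hierarchically. It handles continuous variables only, whereas the paper's method also covers categorical ones, and the trajectories are synthetic.

```python
import numpy as np
from hmmlearn import hmm
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def fit_hmm(series, n_states=2, seed=0):
    model = hmm.GaussianHMM(n_components=n_states, random_state=seed, n_iter=50)
    model.fit(series)
    return model

def hmm_distance(m_i, m_j, s_i, s_j):
    """Symmetrized per-sample log-likelihood distance between two fitted HMMs."""
    d_ij = (m_i.score(s_i) - m_j.score(s_i)) / len(s_i)
    d_ji = (m_j.score(s_j) - m_i.score(s_j)) / len(s_j)
    return 0.5 * (d_ij + d_ji)

# Synthetic trajectories: two groups with different state means.
rng = np.random.default_rng(12)
series = [rng.normal(loc=0.0, size=(200, 1)) for _ in range(5)] + \
         [rng.normal(loc=3.0, size=(200, 1)) for _ in range(5)]
models = [fit_hmm(s) for s in series]

n = len(series)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        D[i, j] = D[j, i] = max(hmm_distance(models[i], models[j], series[i], series[j]), 0.0)

labels = fcluster(linkage(squareform(D), method="average"), t=2, criterion="maxclust")
print(labels)   # trajectories from the two groups should fall into separate clusters
```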

  2. Untenable nonstationarity: An assessment of the fitness for purpose of trend tests in hydrology

    NASA Astrophysics Data System (ADS)

    Serinaldi, Francesco; Kilsby, Chris G.; Lombardo, Federico

    2018-01-01

    The detection and attribution of long-term patterns in hydrological time series have been important research topics for decades. A significant portion of the literature regards such patterns as 'deterministic components' or 'trends' even though the complexity of hydrological systems does not allow easy deterministic explanations and attributions. Consequently, trend estimation techniques have been developed to make and justify statements about tendencies in the historical data, which are often used to predict future events. Testing trend hypothesis on observed time series is widespread in the hydro-meteorological literature mainly due to the interest in detecting consequences of human activities on the hydrological cycle. This analysis usually relies on the application of some null hypothesis significance tests (NHSTs) for slowly-varying and/or abrupt changes, such as Mann-Kendall, Pettitt, or similar, to summary statistics of hydrological time series (e.g., annual averages, maxima, minima, etc.). However, the reliability of this application has seldom been explored in detail. This paper discusses misuse, misinterpretation, and logical flaws of NHST for trends in the analysis of hydrological data from three different points of view: historic-logical, semantic-epistemological, and practical. Based on a review of NHST rationale, and basic statistical definitions of stationarity, nonstationarity, and ergodicity, we show that even if the empirical estimation of trends in hydrological time series is always feasible from a numerical point of view, it is uninformative and does not allow the inference of nonstationarity without assuming a priori additional information on the underlying stochastic process, according to deductive reasoning. This prevents the use of trend NHST outcomes to support nonstationary frequency analysis and modeling. We also show that the correlation structures characterizing hydrological time series might easily be underestimated, further compromising the attempt to draw conclusions about trends spanning the period of records. Moreover, even though adjusting procedures accounting for correlation have been developed, some of them are insufficient or are applied only to some tests, while some others are theoretically flawed but still widely applied. In particular, using 250 unimpacted stream flow time series across the conterminous United States (CONUS), we show that the test results can dramatically change if the sequences of annual values are reproduced starting from daily stream flow records, whose larger sizes enable a more reliable assessment of the correlation structures.
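
    The point about serial correlation can be checked numerically: the sketch below applies a plain Mann-Kendall test to trend-free AR(1) series and shows the rejection rate rising well above the nominal level; sample sizes and coefficients are illustrative, and no correction for autocorrelation is applied.

```python
import numpy as np
from scipy import stats

def mann_kendall_z(x):
    """Standard Mann-Kendall test statistic (no tie or autocorrelation correction)."""
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        return (s - 1) / np.sqrt(var_s)
    if s < 0:
        return (s + 1) / np.sqrt(var_s)
    return 0.0

def rejection_rate(phi, n=50, n_sim=2000, alpha=0.05, seed=0):
    """Fraction of trend-free AR(1) series (coefficient phi) flagged as 'trending'."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_sim):
        x = np.zeros(n)
        for t in range(1, n):
            x[t] = phi * x[t - 1] + rng.normal()
        p = 2 * (1 - stats.norm.cdf(abs(mann_kendall_z(x))))
        rejections += p < alpha
    return rejections / n_sim

print(rejection_rate(0.0))   # close to the nominal 5%
print(rejection_rate(0.5))   # substantially above 5% despite no trend
```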

  3. Conceptual recurrence plots: revealing patterns in human discourse.

    PubMed

    Angus, Daniel; Smith, Andrew; Wiles, Janet

    2012-06-01

    Human discourse contains a rich mixture of conceptual information. Visualization of the global and local patterns within this data stream is a complex and challenging problem. Recurrence plots are an information visualization technique that can reveal trends and features in complex time series data. The recurrence plot technique works by measuring the similarity of points in a time series to all other points in the same time series and plotting the results in two dimensions. Previous studies have applied recurrence plotting techniques to textual data; however, these approaches plot recurrence using term-based similarity rather than conceptual similarity of the text. We introduce conceptual recurrence plots, which use a model of language to measure similarity between pairs of text utterances, and the similarity of all utterances is measured and displayed. In this paper, we explore how the descriptive power of the recurrence plotting technique can be used to discover patterns of interaction across a series of conversation transcripts. The results suggest that the conceptual recurrence plotting technique is a useful tool for exploring the structure of human discourse.

  4. Updating stand-level forest inventories using airborne laser scanning and Landsat time series data

    NASA Astrophysics Data System (ADS)

    Bolton, Douglas K.; White, Joanne C.; Wulder, Michael A.; Coops, Nicholas C.; Hermosilla, Txomin; Yuan, Xiaoping

    2018-04-01

    Vertical forest structure can be mapped over large areas by combining samples of airborne laser scanning (ALS) data with wall-to-wall spatial data, such as Landsat imagery. Here, we use samples of ALS data and Landsat time-series metrics to produce estimates of top height, basal area, and net stem volume for two timber supply areas near Kamloops, British Columbia, Canada, using an imputation approach. Both single-year and time series metrics were calculated from annual, gap-free Landsat reflectance composites representing 1984-2014. Metrics included long-term means of vegetation indices, as well as measures of the variance and slope of the indices through time. Terrain metrics, generated from a 30 m digital elevation model, were also included as predictors. We found that imputation models improved with the inclusion of Landsat time series metrics when compared to single-year Landsat metrics (relative RMSE decreased from 22.8% to 16.5% for top height, from 32.1% to 23.3% for basal area, and from 45.6% to 34.1% for net stem volume). Landsat metrics that characterized 30-years of stand history resulted in more accurate models (for all three structural attributes) than Landsat metrics that characterized only the most recent 10 or 20 years of stand history. To test model transferability, we compared imputed attributes against ALS-based estimates in nearby forest blocks (>150,000 ha) that were not included in model training or testing. Landsat-imputed attributes correlated strongly to ALS-based estimates in these blocks (R2 = 0.62 and relative RMSE = 13.1% for top height, R2 = 0.75 and relative RMSE = 17.8% for basal area, and R2 = 0.67 and relative RMSE = 26.5% for net stem volume), indicating model transferability. These findings suggest that in areas containing spatially-limited ALS data acquisitions, imputation models, and Landsat time series and terrain metrics can be effectively used to produce wall-to-wall estimates of key inventory attributes, providing an opportunity to update estimates of forest attributes in areas where inventory information is either out of date or non-existent.
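
    The imputation step can be sketched with a simple nearest-neighbour regressor mapping Landsat-style metrics to an ALS-derived attribute, as below with synthetic placeholders; the study's actual predictor set and imputation variant are not reproduced.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-ins: rows are plots, columns are Landsat time-series/terrain metrics.
rng = np.random.default_rng(13)
n_plots = 1000
X = rng.normal(size=(n_plots, 8))   # e.g. mean NDVI, NDVI slope, elevation, ... (placeholders)
top_height = 20 + 4 * X[:, 0] - 2 * X[:, 1] + rng.normal(0, 2, n_plots)   # ALS-derived response

X_train, X_test, y_train, y_test = train_test_split(X, top_height, test_size=0.3, random_state=0)
knn = KNeighborsRegressor(n_neighbors=5, weights="distance")
knn.fit(X_train, y_train)

pred = knn.predict(X_test)
rel_rmse = 100 * np.sqrt(np.mean((pred - y_test) ** 2)) / y_test.mean()
print(f"relative RMSE: {rel_rmse:.1f}%")
```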

  5. Systematic identification of an integrative network module during senescence from time-series gene expression.

    PubMed

    Park, Chihyun; Yun, So Jeong; Ryu, Sung Jin; Lee, Soyoung; Lee, Young-Sam; Yoon, Youngmi; Park, Sang Chul

    2017-03-15

    Cellular senescence irreversibly arrests growth of human diploid cells. In addition, recent studies have indicated that senescence is a multi-step, evolving process related to important complex biological processes. Most studies analyzed only the genes and their functions representing each senescence phase, without considering gene-level interactions and continuously perturbed genes. It is necessary to reveal the genotypic mechanism inferred by the affected genes and their interactions underlying the senescence process. We suggest a novel computational approach to identify an integrative network which profiles an underlying genotypic signature from time-series gene expression data. The most strongly perturbed genes were selected for each time point based on the proposed scoring measure, termed the perturbation score. Then, the selected genes were integrated with protein-protein interactions to construct time-point-specific networks. From these constructed networks, the edges conserved across time points were extracted to form a common network, and a statistical test was performed to demonstrate that the network could explain the phenotypic alteration. As a result, it was confirmed that the difference in average perturbation scores of the common network between the two time points could explain the phenotypic alteration. We also performed functional enrichment on the common network and identified a high association with the phenotypic alteration. Remarkably, we observed that the identified cell-cycle-specific common network played an important role in replicative senescence as a key regulator. Heretofore, network analysis of time-series gene expression data has focused on what topological structure changes over time. Conversely, we focused on structure that is conserved while its context changes over the course of time, and showed that it can explain the phenotypic changes. We expect that the proposed method will help to elucidate the biological mechanisms left unrevealed by existing approaches.

  6. Efficient Maize and Sunflower Multi-year Mapping with NDVI Time Series of HJ-1A/1B in Hetao Irrigation District of Inner Mongolia, China

    NASA Astrophysics Data System (ADS)

    Yu, B.; Shang, S.

    2016-12-01

    Food shortage is one of the major challenges that human beings are facing. It is urgent to improve the monitoring of the planting and distribution of the main crops to address the associated economic and social issues. Recently, the extensive use of remote sensing satellite data has provided favorable conditions for crop identification in large irrigation districts with complex planting structures. Differences in crop phenology are the main basis for crop identification, and normalized difference vegetation index (NDVI) time series can delineate the crop phenology cycle well. Therefore, the key to crop identification is to obtain high-quality NDVI time series. MODIS and Landsat TM satellite images are the most frequently used, however, neither of them can guarantee both high temporal and high spatial resolution at once. Accordingly, this paper makes use of NDVI time series extracted from China Environment Satellite (HJ-1A/1B) data, which have two-day repeat temporal and 30 m spatial resolutions. The NDVI time series are fitted with an asymmetric logistic curve; the fit is good, with correlation coefficients greater than 0.9. The phenological parameters are derived from the fitted NDVI curves, and crop identification is carried out using the different relation ellipses between NDVI and its phenological parameters for different crops. This paper takes the Hetao Irrigation District of Inner Mongolia as an example, identifying multi-year maize and sunflower in the district, and the identification results are good. Compared with the official statistics, the relative errors are both lower than 5%. The results show that the NDVI time-series dataset derived from HJ-1A/1B CCD can delineate the crop phenology cycle accurately and demonstrate its application to crop identification in an irrigated district.
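
    The curve-fitting step can be sketched with scipy by fitting a double-logistic NDVI model and reading off simple phenological markers, as below; the functional form and all values are illustrative, not the exact asymmetric logistic used in the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def double_logistic(t, base, amp, sos, rate_up, eos, rate_down):
    """Commonly used double-logistic NDVI curve: green-up and senescence branches."""
    return base + amp * (1.0 / (1.0 + np.exp(-rate_up * (t - sos)))
                         - 1.0 / (1.0 + np.exp(-rate_down * (t - eos))))

# Illustrative NDVI observations over one growing season (day of year).
rng = np.random.default_rng(14)
doy = np.arange(60, 330, 2)                  # ~2-day revisit
truth = double_logistic(doy, 0.15, 0.6, 140, 0.08, 270, 0.07)
ndvi = truth + rng.normal(0, 0.03, doy.size)

p0 = [0.1, 0.5, 130, 0.1, 260, 0.1]          # rough initial guesses
params, _ = curve_fit(double_logistic, doy, ndvi, p0=p0, maxfev=10000)
base, amp, sos, rate_up, eos, rate_down = params
print(f"start of season ~ day {sos:.0f}, end of season ~ day {eos:.0f}, amplitude {amp:.2f}")
```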

  7. Cameras on the NEPTUNE Canada seafloor observatory: Towards monitoring hydrothermal vent ecosystem dynamics

    NASA Astrophysics Data System (ADS)

    Robert, K.; Matabos, M.; Sarrazin, J.; Sarradin, P.; Lee, R. W.; Juniper, K.

    2010-12-01

    Hydrothermal vent environments are among the most dynamic benthic habitats in the ocean. The relative roles of physical and biological factors in shaping vent community structure remain unclear. Undersea cabled observatories offer the power and bandwidth required for high-resolution, time-series study of the dynamics of vent communities and the physico-chemical forces that influence them. The NEPTUNE Canada cabled instrument array at the Endeavour hydrothermal vents provides a unique laboratory for researchers to conduct long-term, integrated studies of hydrothermal vent ecosystem dynamics in relation to environmental variability. Beginning in September-October 2010, NEPTUNE Canada (NC) will be deploying a multi-disciplinary suite of instruments on the Endeavour Segment of the Juan de Fuca Ridge. Two camera and sensor systems will be used to study ecosystem dynamics in relation to hydrothermal discharge. These studies will make use of new experimental protocols for time-series observations that we have been developing since 2008 at other observatory sites connected to the VENUS and NC networks. These protocols include sampling design, camera calibration (i.e. structure, position, light, settings) and image analysis methodologies (see communication by Aron et al.). The camera systems to be deployed in the Main Endeavour vent field include a Sidus high definition video camera (2010) and the TEMPO-mini system (2011), designed by IFREMER (France). Real-time data from three sensors (O2, dissolved Fe, temperature) integrated with the TEMPO-mini system will enhance interpretation of imagery. For the first year of observations, a suite of internally recording temperature probes will be strategically placed in the field of view of the Sidus camera. These installations aim at monitoring variations in vent community structure and dynamics (species composition and abundances, interactions within and among species) in response to changes in environmental conditions at different temporal scales. High-resolution time-series studies also provide a means of studying population dynamics, biological rhythms, organism growth and faunal succession. In addition to programmed time-series monitoring, the NC infrastructure will also permit manual and automated modification of observational protocols in response to natural events. This will enhance our ability to document potentially critical but short-lived environmental forces affecting vent communities.

  8. 29 CFR 1926.1000 - Rollover protective structures (ROPS) for material handling equipment.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... two times the weight of the prime mover applied at the point of impact. (i) The design objective shall..., if any; (3) Machine make, model, or series number that the structure is designed to fit. (f) Machines... performance criteria detailed in §§ 1926.1001 and 1926.1002, as applicable or shall be designed, fabricated...

  9. 29 CFR 1926.1000 - Rollover protective structures (ROPS) for material handling equipment.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... two times the weight of the prime mover applied at the point of impact. (i) The design objective shall..., if any; (3) Machine make, model, or series number that the structure is designed to fit. (f) Machines... performance criteria detailed in §§ 1926.1001 and 1926.1002, as applicable or shall be designed, fabricated...

  10. 29 CFR 1926.1000 - Rollover protective structures (ROPS) for material handling equipment.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... two times the weight of the prime mover applied at the point of impact. (i) The design objective shall..., if any; (3) Machine make, model, or series number that the structure is designed to fit. (f) Machines... performance criteria detailed in §§ 1926.1001 and 1926.1002, as applicable or shall be designed, fabricated...

  11. 29 CFR 1926.1000 - Rollover protective structures (ROPS) for material handling equipment.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... two times the weight of the prime mover applied at the point of impact. (i) The design objective shall..., if any; (3) Machine make, model, or series number that the structure is designed to fit. (f) Machines... performance criteria detailed in §§ 1926.1001 and 1926.1002, as applicable or shall be designed, fabricated...

  12. 29 CFR 1926.1000 - Rollover protective structures (ROPS) for material handling equipment.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... two times the weight of the prime mover applied at the point of impact. (i) The design objective shall..., if any; (3) Machine make, model, or series number that the structure is designed to fit. (f) Machines... performance criteria detailed in §§ 1926.1001 and 1926.1002, as applicable or shall be designed, fabricated...

  13. Visualization of synchronization of the uterine contraction signals: running cross-correlation and wavelet running cross-correlation methods.

    PubMed

    Oczeretko, Edward; Swiatecka, Jolanta; Kitlas, Agnieszka; Laudanski, Tadeusz; Pierzynski, Piotr

    2006-01-01

    In physiological research, we often study multivariate data sets containing two or more simultaneously recorded time series. The aim of this paper is to present the cross-correlation and the wavelet cross-correlation methods to assess synchronization between contractions in different topographic regions of the uterus. From a medical point of view, it is important to identify time delays between contractions, which may be of potential diagnostic significance in various pathologies. The cross-correlation was computed in a moving window with a width corresponding to approximately two or three contractions. As a result, the running cross-correlation function was obtained. The propagation% parameter assessed from this function allows quantitative description of synchronization in bivariate time series. In general, the uterine contraction signals are very complicated. Wavelet transforms provide insight into the structure of the time series at various frequencies (scales). To show the changes of the propagation% parameter along scales, a wavelet running cross-correlation was used. First, the continuous wavelet transforms of the uterine contraction signals were computed, and afterwards a running cross-correlation analysis was conducted for each pair of transformed time series. The findings show that running functions are very useful in the analysis of uterine contractions.
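
    A minimal Python sketch of a running (windowed) cross-correlation for estimating time delays between two channels, in the spirit of this record (window length, lag range, and the synthetic "contraction" signals are illustrative assumptions, not the authors' settings):

        import numpy as np

        def running_delay(x, y, win=600, max_lag=40, step=100):
            """Lag of maximum cross-correlation in each sliding window (hypothetical helper)."""
            delays = []
            for start in range(0, len(x) - win, step):
                xs = x[start:start + win] - np.mean(x[start:start + win])
                ys = y[start:start + win] - np.mean(y[start:start + win])
                best_lag, best_r = 0, -np.inf
                for lag in range(-max_lag, max_lag + 1):
                    a = xs[lag:] if lag >= 0 else xs[:win + lag]
                    b = ys[:win - lag] if lag >= 0 else ys[-lag:]
                    r = np.corrcoef(a, b)[0, 1]
                    if r > best_r:
                        best_r, best_lag = r, lag
                delays.append(best_lag)
            return np.array(delays)

        # Two synthetic channels; the second is a noisy copy delayed by 15 samples
        rng = np.random.default_rng(1)
        t = np.arange(3000)
        x = np.sin(2 * np.pi * t / 300) * (np.sin(2 * np.pi * t / 1000) > 0)
        y = np.roll(x, 15) + 0.05 * rng.standard_normal(t.size)
        print("median lag:", np.median(running_delay(x, y)))  # ~ -15 (y lags x by 15 samples)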

  14. A method for analyzing temporal patterns of variability of a time series from Poincare plots.

    PubMed

    Fishman, Mikkel; Jacono, Frank J; Park, Soojin; Jamasebi, Reza; Thungtong, Anurak; Loparo, Kenneth A; Dick, Thomas E

    2012-07-01

    The Poincaré plot is a popular two-dimensional, time series analysis tool because of its intuitive display of dynamic system behavior. Poincaré plots have been used to visualize heart rate and respiratory pattern variabilities. However, conventional quantitative analysis relies primarily on statistical measurements of the cumulative distribution of points, making it difficult to interpret irregular or complex plots. Moreover, the plots are constructed to reflect highly correlated regions of the time series, reducing the amount of nonlinear information that is presented and thereby hiding potentially relevant features. We propose temporal Poincaré variability (TPV), a novel analysis methodology that uses standard techniques to quantify the temporal distribution of points and to detect nonlinear sources responsible for physiological variability. In addition, the analysis is applied across multiple time delays, yielding a richer insight into system dynamics than the traditional circle return plot. The method is applied to data sets of R-R intervals and to synthetic point process data extracted from the Lorenz time series. The results demonstrate that TPV complements the traditional analysis and can be applied more generally, including Poincaré plots with multiple clusters, and more consistently than the conventional measures and can address questions regarding potential structure underlying the variability of a data set.
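
    The conventional lag-1 Poincare descriptors mentioned in this record can be computed in a few lines; the Python sketch below (hypothetical R-R series, illustrative delays) also scans several return-map delays, which is the direction TPV generalizes, although it does not reproduce the TPV statistics themselves:

        import numpy as np

        def poincare_sd(x, delay=1):
            """Conventional Poincare descriptors SD1/SD2 for a given return-map delay."""
            a, b = x[:-delay], x[delay:]
            sd1 = np.std((b - a) / np.sqrt(2))   # spread perpendicular to the identity line
            sd2 = np.std((b + a) / np.sqrt(2))   # spread along the identity line
            return sd1, sd2

        # Hypothetical R-R interval series (seconds) with respiratory-like modulation
        rng = np.random.default_rng(2)
        n = 1000
        rr = 0.8 + 0.05 * np.sin(2 * np.pi * np.arange(n) / 4.2) + 0.02 * rng.standard_normal(n)

        for d in (1, 2, 5, 10):                  # multiple delays, as TPV advocates
            sd1, sd2 = poincare_sd(rr, d)
            print(f"delay {d:2d}: SD1={sd1:.4f}  SD2={sd2:.4f}")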

  15. The Fourier decomposition method for nonlinear and non-stationary time series analysis.

    PubMed

    Singh, Pushpendra; Joshi, Shiv Dutt; Patney, Rakesh Kumar; Saha, Kaushik

    2017-03-01

    For many decades, there has been a general perception in the literature that Fourier methods are not suitable for the analysis of nonlinear and non-stationary data. In this paper, we propose a novel and adaptive Fourier decomposition method (FDM), based on the Fourier theory, and demonstrate its efficacy for the analysis of nonlinear and non-stationary time series. The proposed FDM decomposes any data into a small number of 'Fourier intrinsic band functions' (FIBFs). The FDM presents a generalized Fourier expansion with variable amplitudes and variable frequencies of a time series by the Fourier method itself. We propose an idea of zero-phase filter bank-based multivariate FDM (MFDM), for the analysis of multivariate nonlinear and non-stationary time series, using the FDM. We also present an algorithm to obtain cut-off frequencies for MFDM. The proposed MFDM generates a finite number of band-limited multivariate FIBFs (MFIBFs). The MFDM preserves some intrinsic physical properties of the multivariate data, such as scale alignment, trend and instantaneous frequency. The proposed methods provide a time-frequency-energy (TFE) distribution that reveals the intrinsic structure of the data. Numerical computations and simulations have been carried out, and comparisons are made with empirical mode decomposition algorithms.

  16. Trend Estimation and Regression Analysis in Climatological Time Series: An Application of Structural Time Series Models and the Kalman Filter.

    NASA Astrophysics Data System (ADS)

    Visser, H.; Molenaar, J.

    1995-05-01

    The detection of trends in climatological data has become central to the discussion on climate change due to the enhanced greenhouse effect. To prove detection, a method is needed (i) to make inferences on significant rises or declines in trends, (ii) to take into account natural variability in climate series, and (iii) to compare output from GCMs with the trends in observed climate data. To meet these requirements, flexible mathematical tools are needed. A structural time series model is proposed with which a stochastic trend, a deterministic trend, and regression coefficients can be estimated simultaneously. The stochastic trend component is described using the class of ARIMA models. The regression component is assumed to be linear. However, the regression coefficients corresponding to the explanatory variables may be allowed to be time dependent in order to check this assumption. The mathematical technique used to estimate this trend-regression model is the Kalman filter. The main features of the filter are discussed. Examples of trend estimation are given using annual mean temperatures at a single station in the Netherlands (1706-1990) and annual mean temperatures at Northern Hemisphere land stations (1851-1990). The inclusion of explanatory variables is shown by regressing the latter temperature series on four variables: Southern Oscillation index (SOI), volcanic dust index (VDI), sunspot numbers (SSN), and a simulated temperature signal, induced by increasing greenhouse gases (GHG). In all analyses, the influence of SSN on global temperatures is found to be negligible. The correlations between temperatures and SOI and VDI appear to be negative. For SOI, this correlation is significant, but for VDI it is not, probably because of a lack of volcanic eruptions during the sample period. The relation between temperatures and GHG is positive, which is in agreement with the hypothesis of a warming climate because of increasing levels of greenhouse gases. The prediction performance of the model is rather poor, and possible explanations are discussed.
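
    A minimal Python sketch, using statsmodels, of a structural time series model with a stochastic trend and a linear regression component estimated jointly by the Kalman filter; the synthetic temperature series and the two stand-in regressors (labelled soi and ghg) are hypothetical, not the paper's data:

        import numpy as np
        import statsmodels.api as sm

        # Hypothetical annual-mean temperature anomalies plus two explanatory series
        rng = np.random.default_rng(3)
        n = 140
        soi = rng.standard_normal(n)                 # stand-in for an SOI-like index
        ghg = np.linspace(0.0, 1.0, n)               # stand-in for a GHG-induced signal
        temp = 0.4 * ghg - 0.1 * soi + np.cumsum(0.02 * rng.standard_normal(n)) \
               + 0.1 * rng.standard_normal(n)

        # Structural model: stochastic (local linear) trend + regression on covariates,
        # with all components estimated simultaneously via the Kalman filter
        exog = np.column_stack([soi, ghg])
        model = sm.tsa.UnobservedComponents(temp, level='local linear trend', exog=exog)
        res = model.fit(disp=False)
        print(res.summary())                         # regression coefficients, variances
        trend = res.level.smoothed                   # smoothed stochastic trend estimate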

  17. Phenomenological analysis of medical time series with regular and stochastic components

    NASA Astrophysics Data System (ADS)

    Timashev, Serge F.; Polyakov, Yuriy S.

    2007-06-01

    Flicker-Noise Spectroscopy (FNS), a general approach to the extraction and parameterization of resonant and stochastic components contained in medical time series, is presented. The basic idea of FNS is to treat the correlation links present in sequences of different irregularities, such as spikes, "jumps", and discontinuities in derivatives of different orders, on all levels of the spatiotemporal hierarchy of the system under study as the main information carriers. The tools used to extract and analyze this information are power spectra and difference moments (structural functions), which complement each other. The structural function stochastic component is formed exclusively by "jumps" of the dynamic variable, while the power spectrum stochastic component is formed by both spikes and "jumps" on every level of the hierarchy. The information "passport" characteristics, determined by fitting the derived expressions to the experimental variations of the stochastic components of power spectra and structural functions, are interpreted as the correlation times and the parameters that describe the rate of "memory loss" on these correlation time intervals for different irregularities. The number of extracted parameters is determined by the requirements of the problem under study. Application of this approach to the analysis of tremor velocity signals for a Parkinsonian patient is discussed.
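
    A minimal Python sketch of the second-order difference moment (structure function) and the power spectrum that FNS uses as complementary tools; the "tremor-like" test signal and the lag range are hypothetical:

        import numpy as np

        def structure_function(x, max_lag=200):
            """Second-order difference moment Phi(tau) = <[x(t + tau) - x(t)]^2>."""
            lags = np.arange(1, max_lag)
            phi = np.array([np.mean((x[k:] - x[:-k]) ** 2) for k in lags])
            return lags, phi

        # Hypothetical signal: a resonant component plus a random-walk ("jumps") component
        rng = np.random.default_rng(4)
        t = np.arange(5000)
        signal = np.sin(2 * np.pi * t / 50) + np.cumsum(0.05 * rng.standard_normal(t.size))

        lags, phi = structure_function(signal)
        power = np.abs(np.fft.rfft(signal - signal.mean())) ** 2  # complementary spectrum
        print("Phi(10) =", round(float(phi[9]), 3), " Phi(100) =", round(float(phi[99]), 3))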

  18. Mediating Relations: Therapeutic Discourse in American Prime Time Series.

    ERIC Educational Resources Information Center

    White, Mimi

    Although "The Equalizer" and "Finder of Lost Loves" are different kinds of prime time fiction--urban thriller on the one hand and fantasy melodrama on the other--they share an underlying dramatic structure and symbolic problematic in their repeated enactments of a therapeutic cure overseen by a mediating, authority figure. The…

  19. The NCAA's New Hammer

    ERIC Educational Resources Information Center

    Wolverton, Brad

    2012-01-01

    A series of unprecedented scandals has eroded confidence in big-time sports, increasing the appetite for change. Some critics have a tough time seeing the NCAA as a savior; they say the real problem is the NCAA structure itself, which allows athletes to generate billions of dollars for colleges while earning no compensation themselves. Mark A.…

  20. Finite-element time-domain modeling of electromagnetic data in general dispersive medium using adaptive Padé series

    NASA Astrophysics Data System (ADS)

    Cai, Hongzhu; Hu, Xiangyun; Xiong, Bin; Zhdanov, Michael S.

    2017-12-01

    The induced polarization (IP) method has been widely used in geophysical exploration to identify the chargeable targets such as mineral deposits. The inversion of the IP data requires modeling the IP response of 3D dispersive conductive structures. We have developed an edge-based finite-element time-domain (FETD) modeling method to simulate the electromagnetic (EM) fields in 3D dispersive medium. We solve the vector Helmholtz equation for total electric field using the edge-based finite-element method with an unstructured tetrahedral mesh. We adopt the backward propagation Euler method, which is unconditionally stable, with semi-adaptive time stepping for the time domain discretization. We use the direct solver based on a sparse LU decomposition to solve the system of equations. We consider the Cole-Cole model in order to take into account the frequency-dependent conductivity dispersion. The Cole-Cole conductivity model in frequency domain is expanded using a truncated Padé series with adaptive selection of the center frequency of the series for early and late time. This approach can significantly increase the accuracy of FETD modeling.

  1. Detecting and modelling delayed density-dependence in abundance time series of a small mammal (Didelphis aurita)

    NASA Astrophysics Data System (ADS)

    Brigatti, E.; Vieira, M. V.; Kajin, M.; Almeida, P. J. A. L.; de Menezes, M. A.; Cerqueira, R.

    2016-02-01

    We study the population size time series of a Neotropical small mammal with the aim of detecting and modelling population regulation processes generated by density-dependent factors and their possible delayed effects. Applying analysis tools based on principles of statistical generality is nowadays common practice for describing these phenomena but, in general, such tools are better at producing a clear diagnosis than at providing valuable models. For this reason, in our approach we detect the principal temporal structures on the basis of different correlation measures, and from these results we build an ad hoc, minimalist autoregressive model that incorporates the main drivers of the dynamics. Surprisingly, our model reproduces the temporal patterns of the empirical series very well and, for the first time, clearly outlines the importance of the time to sexual maturity as a central temporal scale for the dynamics of this species. An important advantage of this analysis scheme is that all the model parameters are directly biologically interpretable and potentially measurable, allowing a consistency check between model outputs and independent measurements.

  2. Inferring the 1985-2014 impact of mobile phone use on selected brain cancer subtypes using Bayesian structural time series and synthetic controls.

    PubMed

    de Vocht, Frank

    2016-12-01

    Mobile phone use has been increasing rapidly in the past decades and, in parallel, so has the annual incidence of certain types of brain cancers. However, it remains unclear whether this correlation is coincidental or whether use of mobile phones may cause the development, promotion or progression of specific cancers. The 1985-2014 incidence of selected brain cancer subtypes in England was analyzed and compared to counterfactual 'synthetic control' time series. Annual 1985-2014 incidence of malignant glioma, glioblastoma multiforme, and malignant neoplasms of the temporal and parietal lobes in England was modelled based on population-level covariates using Bayesian structural time series models assuming 5-, 10- and 15-year minimal latency periods. Post-latency counterfactual 'synthetic England' time series were nowcast based on covariate trends. The impact of mobile phone use was inferred from differences between measured and modelled time series. There is no evidence of an increase in malignant glioma, glioblastoma multiforme, or malignant neoplasms of the parietal lobe not predicted in the 'synthetic England' time series. Malignant neoplasms of the temporal lobe, however, have increased faster than expected. A latency period of 10 years reflected the earliest latency period at which this was measurable and related to mobile phone penetration rates, and indicated an additional increase of 35% (95% Credible Interval 9%:59%) during 2005-2014, corresponding to an additional 188 (95%CI 48-324) cases annually. A causal factor with which mobile phone use (and possibly other wireless equipment) is in agreement, in terms of the hypothesized temporal association, is related to an increased risk of developing malignant neoplasms in the temporal lobe. Copyright © 2016 The Author. Published by Elsevier Ltd. All rights reserved.
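
    A minimal Python sketch of the counterfactual ('synthetic control') idea: fit a model of incidence on covariates over the pre-latency period, project it forward, and read the excess as the inferred impact. It uses a statsmodels structural model rather than the Bayesian structural time series software the study used, and all numbers below are synthetic illustrations, not the study's data:

        import numpy as np
        import statsmodels.api as sm

        # Hypothetical annual incidence series and one population-level covariate; the
        # split year (2005) and the injected post-latency increase are illustrative only
        rng = np.random.default_rng(5)
        years = np.arange(1985, 2015)
        covariate = np.linspace(10, 20, years.size) + rng.normal(0, 0.3, years.size)
        incidence = 2.0 + 0.15 * covariate + rng.normal(0, 0.2, years.size)
        incidence[years >= 2005] += 0.5              # "unexplained" post-2005 increase

        pre = years < 2005
        model = sm.tsa.UnobservedComponents(incidence[pre], level='local level',
                                            exog=covariate[pre, None])
        res = model.fit(disp=False)

        # Counterfactual path: forecast the post period from the covariate trend alone
        fc = res.get_forecast(steps=int((~pre).sum()), exog=covariate[~pre, None])
        excess = incidence[~pre] - fc.predicted_mean
        print("mean excess incidence 2005-2014:", round(float(np.mean(excess)), 2))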

  3. Comparing the structure of an emerging market with a mature one under global perturbation

    NASA Astrophysics Data System (ADS)

    Namaki, A.; Jafari, G. R.; Raei, R.

    2011-09-01

    In this paper we investigate the Tehran stock exchange (TSE) and Dow Jones Industrial Average (DJIA) in terms of perturbed correlation matrices. To perturb a stock market, there are two methods, namely local and global perturbation. In the local method, we replace a correlation coefficient of the cross-correlation matrix with one calculated from two Gaussian-distributed time series, whereas in the global method, we reconstruct the correlation matrix after replacing the original return series with Gaussian-distributed time series. The local perturbation is just a technical study. We analyze these markets through two statistical approaches, random matrix theory (RMT) and the correlation coefficient distribution. By using RMT, we find that the largest eigenvalue is an influence that is common to all stocks and this eigenvalue has a peak during financial shocks. We find there are a few correlated stocks that make the essential robustness of the stock market but we see that by replacing these return time series with Gaussian-distributed time series, the mean values of correlation coefficients, the largest eigenvalues of the stock markets and the fraction of eigenvalues that deviate from the RMT prediction fall sharply in both markets. By comparing these two markets, we can see that the DJIA is more sensitive to global perturbations. These findings are crucial for risk management and portfolio selection.
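
    A minimal Python sketch of the global-perturbation comparison described above: build the cross-correlation matrix of (synthetic) return series, then compare its largest eigenvalue and the number of eigenvalues above the random-matrix (Marchenko-Pastur) bound before and after replacing all series with uncorrelated Gaussian ones. The one-factor return structure and the N, T values are hypothetical:

        import numpy as np

        rng = np.random.default_rng(6)
        N, T = 50, 1000                              # hypothetical: 50 stocks, 1000 returns

        # Correlated "market" returns: one common factor plus idiosyncratic noise
        market = rng.standard_normal(T)
        returns = 0.4 * market + rng.standard_normal((N, T))

        def eigen_summary(series):
            c = np.corrcoef(series)                  # N x N cross-correlation matrix
            eig = np.linalg.eigvalsh(c)
            q = series.shape[1] / series.shape[0]    # T/N ratio entering the RMT bound
            lam_plus = (1 + 1 / np.sqrt(q)) ** 2     # Marchenko-Pastur upper edge
            return round(float(eig.max()), 2), int((eig > lam_plus).sum())

        print("original   (max eig, #deviating):", eigen_summary(returns))
        # Global perturbation: every return series replaced by an uncorrelated Gaussian one
        print("perturbed  (max eig, #deviating):", eigen_summary(rng.standard_normal((N, T))))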

  4. The application of computational mechanics to the analysis of natural data: An example in geomagnetism.

    NASA Astrophysics Data System (ADS)

    Watkins, Nicholas; Clarke, Richard; Freeman, Mervyn

    2002-11-01

    We discuss how the ideal formalism of Computational Mechanics can be adapted to apply to a non-infinite series of corrupted and correlated data, that is typical of most observed natural time series. Specifically, a simple filter that removes the corruption that creates rare unphysical causal states is demonstrated, and the new concept of effective soficity is introduced. The benefits of these new concepts are demonstrated on simulated time series by (a) the effective elimination of white noise corruption from a periodic signal using the expletive filter and (b) the appearance of an effectively sofic region in the statistical complexity of a biased Poisson switch time series that is insensitive to changes in the word length (memory) used in the analysis. The new algorithm is then applied to analysis of a real geomagnetic time series measured at Halley, Antarctica. Two principal components in the structure are detected that are interpreted as the diurnal variation due to the rotation of the earth-based station under an electrical current pattern that is fixed with respect to the sun-earth axis and the random occurrence of a signature likely to be that of the magnetic substorm. In conclusion, a hypothesis is advanced about model construction in general (see also Clarke et al; arXiv::cond-mat/0110228).

  5. Nonlinear Time Series Analysis of Nodulation Factor Induced Calcium Oscillations: Evidence for Deterministic Chaos?

    PubMed Central

    Hazledine, Saul; Sun, Jongho; Wysham, Derin; Downie, J. Allan; Oldroyd, Giles E. D.; Morris, Richard J.

    2009-01-01

    Legume plants form beneficial symbiotic interactions with nitrogen fixing bacteria (called rhizobia), with the rhizobia being accommodated in unique structures on the roots of the host plant. The legume/rhizobial symbiosis is responsible for a significant proportion of the global biologically available nitrogen. The initiation of this symbiosis is governed by a characteristic calcium oscillation within the plant root hair cells and this signal is activated by the rhizobia. Recent analyses on calcium time series data have suggested that stochastic effects have a large role to play in defining the nature of the oscillations. The use of multiple nonlinear time series techniques, however, suggests an alternative interpretation, namely deterministic chaos. We provide an extensive, nonlinear time series analysis on the nature of this calcium oscillation response. We build up evidence through a series of techniques that test for determinism, quantify linear and nonlinear components, and measure the local divergence of the system. Chaos is common in nature and it seems plausible that properties of chaotic dynamics might be exploited by biological systems to control processes within the cell. Systems possessing chaotic control mechanisms are more robust in the sense that the enhanced flexibility allows more rapid response to environmental changes with less energetic costs. The desired behaviour could be most efficiently targeted in this manner, supporting some intriguing speculations about nonlinear mechanisms in biological signaling. PMID:19675679

  6. Neural networks and traditional time series methods: a synergistic combination in state economic forecasts.

    PubMed

    Hansen, J V; Nelson, R D

    1997-01-01

    Ever since the initial planning for the 1997 Utah legislative session, neural-network forecasting techniques have provided valuable insights for analysts forecasting tax revenues. These revenue estimates are critically important since agency budgets, support for education, and improvements to infrastructure all depend on their accuracy. Underforecasting generates windfalls that concern taxpayers, whereas overforecasting produces budget shortfalls that cause inadequately funded commitments. The pattern-finding ability of neural networks gives insightful and alternative views of the seasonal and cyclical components commonly found in economic time series data. Two applications of neural networks to revenue forecasting clearly demonstrate how these models complement traditional time series techniques. In the first, preoccupation with a potential downturn in the economy distracts analysis based on traditional time series methods so that it overlooks an emerging new phenomenon in the data. In this case, neural networks identify the new pattern that then allows modification of the time series models and finally gives more accurate forecasts. In the second application, data structure found by traditional statistical tools allows analysts to provide neural networks with important information that the networks then use to create more accurate models. In summary, for the Utah revenue outlook, the insights that result from a portfolio of forecasts that includes neural networks exceed the understanding generated from strictly statistical forecasting techniques. In this case, the synergy clearly results in the whole of the portfolio of forecasts being more accurate than the sum of the individual parts.
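
    A minimal Python sketch of using a small neural network on lagged values of a synthetic revenue-like series, as a complement to classical time series models; the network size, lag count, and the series itself are illustrative assumptions:

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.neural_network import MLPRegressor

        # Hypothetical monthly revenue series with trend and seasonality
        rng = np.random.default_rng(7)
        t = np.arange(240)
        y = 100 + 0.5 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 2, t.size)

        def lagged_matrix(series, n_lags=12):
            # Supervised reframing: predict y[t] from the previous n_lags observations
            X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
            return X, series[n_lags:]

        X, target = lagged_matrix(y)
        split = 200                                  # simple hold-out split
        model = make_pipeline(StandardScaler(),
                              MLPRegressor(hidden_layer_sizes=(32,), max_iter=5000,
                                           random_state=0))
        model.fit(X[:split], target[:split])
        mae = np.abs(model.predict(X[split:]) - target[split:]).mean()
        print("hold-out MAE:", round(float(mae), 2))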

  7. The geometry of chaotic dynamics — a complex network perspective

    NASA Astrophysics Data System (ADS)

    Donner, R. V.; Heitzig, J.; Donges, J. F.; Zou, Y.; Marwan, N.; Kurths, J.

    2011-12-01

    Recently, several complex network approaches to time series analysis have been developed and applied to study a wide range of model systems as well as real-world data, e.g., geophysical or financial time series. Among these techniques, recurrence-based concepts and prominently ɛ-recurrence networks, most faithfully represent the geometrical fine structure of the attractors underlying chaotic (and less interestingly non-chaotic) time series. In this paper we demonstrate that the well known graph theoretical properties local clustering coefficient and global (network) transitivity can meaningfully be exploited to define two new local and two new global measures of dimension in phase space: local upper and lower clustering dimension as well as global upper and lower transitivity dimension. Rigorous analytical as well as numerical results for self-similar sets and simple chaotic model systems suggest that these measures are well-behaved in most non-pathological situations and that they can be estimated reasonably well using ɛ-recurrence networks constructed from relatively short time series. Moreover, we study the relationship between clustering and transitivity dimensions on the one hand, and traditional measures like pointwise dimension or local Lyapunov dimension on the other hand. We also provide further evidence that the local clustering coefficients, or equivalently the local clustering dimensions, are useful for identifying unstable periodic orbits and other dynamically invariant objects from time series. Our results demonstrate that ɛ-recurrence networks exhibit an important link between dynamical systems and graph theory.
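
    A minimal Python sketch of building an epsilon-recurrence network from a delay embedding and reading off its transitivity, one of the quantities this record relates to attractor dimension; the embedding dimension, delay, epsilon, and the two test series are hypothetical choices:

        import numpy as np
        import networkx as nx

        def recurrence_network(x, dim=3, tau=5, eps=0.2):
            """Epsilon-recurrence network from a delay embedding of a scalar series."""
            n = len(x) - (dim - 1) * tau
            emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
            d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
            adj = (d < eps) & ~np.eye(n, dtype=bool)   # recurrences, no self-loops
            return nx.from_numpy_array(adj.astype(int))

        # Chaotic logistic-map series versus uniform noise
        x_chaos = np.empty(800)
        x_chaos[0] = 0.4
        for i in range(799):
            x_chaos[i + 1] = 4.0 * x_chaos[i] * (1.0 - x_chaos[i])
        x_noise = np.random.default_rng(8).uniform(size=800)

        for name, series in [("logistic map", x_chaos), ("uniform noise", x_noise)]:
            g = recurrence_network(series)
            print(name, "network transitivity:", round(nx.transitivity(g), 3))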

  8. Large-scale Granger causality analysis on resting-state functional MRI

    NASA Astrophysics Data System (ADS)

    D'Souza, Adora M.; Abidin, Anas Zainul; Leistritz, Lutz; Wismüller, Axel

    2016-03-01

    We demonstrate an approach to measure the information flow between each pair of time series in resting-state functional MRI (fMRI) data of the human brain and subsequently recover its underlying network structure. By integrating dimensionality reduction into predictive time series modeling, large-scale Granger Causality (lsGC) analysis method can reveal directed information flow suggestive of causal influence at an individual voxel level, unlike other multivariate approaches. This method quantifies the influence each voxel time series has on every other voxel time series in a multivariate sense and hence contains information about the underlying dynamics of the whole system, which can be used to reveal functionally connected networks within the brain. To identify such networks, we perform non-metric network clustering, such as accomplished by the Louvain method. We demonstrate the effectiveness of our approach to recover the motor and visual cortex from resting state human brain fMRI data and compare it with the network recovered from a visuomotor stimulation experiment, where the similarity is measured by the Dice Coefficient (DC). The best DC obtained was 0.59 implying a strong agreement between the two networks. In addition, we thoroughly study the effect of dimensionality reduction in lsGC analysis on network recovery. We conclude that our approach is capable of detecting causal influence between time series in a multivariate sense, which can be used to segment functionally connected networks in the resting-state fMRI.

  9. 77 FR 65506 - Airworthiness Directives; The Boeing Company Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-29

    ...We propose to supersede an existing airworthiness directive (AD) that applies to certain The Boeing Company Model 757-200 and - 200PF series airplanes. The existing AD currently requires modification of the nacelle strut and wing structure, and repair of any damage found during the modification. Since we issued that AD, a compliance time error involving the optional threshold formula was discovered, which could allow an airplane to exceed the acceptable compliance time for addressing the unsafe condition. This proposed AD would specify a maximum compliance time limit that overrides the optional threshold formula results. We are proposing this AD to prevent fatigue cracking in primary strut structure and consequent reduced structural integrity of the strut.

  10. Hidden discriminative features extraction for supervised high-order time series modeling.

    PubMed

    Nguyen, Ngoc Anh Thi; Yang, Hyung-Jeong; Kim, Sunhee

    2016-11-01

    In this paper, an orthogonal Tucker-decomposition-based extraction of high-order discriminative subspaces from a tensor-based time series data structure is presented, named as Tensor Discriminative Feature Extraction (TDFE). TDFE relies on the employment of category information for the maximization of the between-class scatter and the minimization of the within-class scatter to extract optimal hidden discriminative feature subspaces that are simultaneously spanned by every modality for supervised tensor modeling. In this context, the proposed tensor-decomposition method provides the following benefits: i) reduces dimensionality while robustly mining the underlying discriminative features, ii) results in effective interpretable features that lead to an improved classification and visualization, and iii) reduces the processing time during the training stage and the filtering of the projection by solving the generalized eigenvalue issue at each alternation step. Two real third-order tensor-structures of time series datasets (an epilepsy electroencephalogram (EEG) that is modeled as channel×frequency bin×time frame and a microarray data that is modeled as gene×sample×time) were used for the evaluation of the TDFE. The experiment results corroborate the advantages of the proposed method with averages of 98.26% and 89.63% for the classification accuracies of the epilepsy dataset and the microarray dataset, respectively. These performance averages represent an improvement on those of the matrix-based algorithms and recent tensor-based, discriminant-decomposition approaches; this is especially the case considering the small number of samples that are used in practice. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Stock market context of the Lévy walks with varying velocity

    NASA Astrophysics Data System (ADS)

    Kutner, Ryszard

    2002-11-01

    We developed the most general Lévy walks with varying velocity, called the Weierstrass walks (WW) model for short, with which one can describe both stationary and non-stationary stochastic time series. We consider a non-Brownian random walk in which the walker moves, in general, with a velocity that assumes a different constant value between successive turning points, i.e., the velocity is a piecewise constant function. This model is a kind of Lévy walk in which we assume a hierarchical, self-similar (in a stochastic sense) spatio-temporal representation of the main quantities, such as the waiting-time distribution and the sojourn probability density (the principal quantities in the continuous-time random walk formalism). The WW model makes it possible to analyze both the structure of the Hurst exponent and the power-law behavior of the kurtosis. This structure results from the hierarchical, spatio-temporal coupling between the walker displacement and the corresponding duration of the walks. The analysis uses both the fractional diffusion and the super-Burnett coefficients. We constructed the diffusion phase diagram, which distinguishes regions occupied by classes of different universality. We study only those classes that are characteristic of stationary situations. We thus have a model ready for describing data presented, e.g., in the form of moving averages, an operation often applied to stochastic time series, especially financial ones. The model was inspired by properties of financial time series and was tested on empirical data extracted from the Warsaw stock exchange, since this market offers an opportunity to study in an unbiased way several features of a stock exchange in its early stage.

  12. Characterizing stand-level forest canopy cover and height using Landsat time series, samples of airborne LiDAR, and the Random Forest algorithm

    NASA Astrophysics Data System (ADS)

    Ahmed, Oumer S.; Franklin, Steven E.; Wulder, Michael A.; White, Joanne C.

    2015-03-01

    Many forest management activities, including the development of forest inventories, require spatially detailed forest canopy cover and height data. Among the various remote sensing technologies, LiDAR (Light Detection and Ranging) offers the most accurate and consistent means for obtaining reliable canopy structure measurements. A potential solution to reduce the cost of LiDAR data, is to integrate transects (samples) of LiDAR data with frequently acquired and spatially comprehensive optical remotely sensed data. Although multiple regression is commonly used for such modeling, often it does not fully capture the complex relationships between forest structure variables. This study investigates the potential of Random Forest (RF), a machine learning technique, to estimate LiDAR measured canopy structure using a time series of Landsat imagery. The study is implemented over a 2600 ha area of industrially managed coastal temperate forests on Vancouver Island, British Columbia, Canada. We implemented a trajectory-based approach to time series analysis that generates time since disturbance (TSD) and disturbance intensity information for each pixel and we used this information to stratify the forest land base into two strata: mature forests and young forests. Canopy cover and height for three forest classes (i.e. mature, young and mature and young (combined)) were modeled separately using multiple regression and Random Forest (RF) techniques. For all forest classes, the RF models provided improved estimates relative to the multiple regression models. The lowest validation error was obtained for the mature forest strata in a RF model (R2 = 0.88, RMSE = 2.39 m and bias = -0.16 for canopy height; R2 = 0.72, RMSE = 0.068% and bias = -0.0049 for canopy cover). This study demonstrates the value of using disturbance and successional history to inform estimates of canopy structure and obtain improved estimates of forest canopy cover and height using the RF algorithm.
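
    A minimal Python sketch of the regression step described in this record: predicting LiDAR-derived canopy height from Landsat-time-series predictors with a Random Forest. The predictor names (a spectral band, time since disturbance, disturbance intensity), the simulated relationships, and the sample size are illustrative assumptions:

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.metrics import mean_squared_error, r2_score
        from sklearn.model_selection import train_test_split

        # Hypothetical per-pixel predictors from a Landsat time series and LiDAR
        # canopy-height samples along simulated survey transects
        rng = np.random.default_rng(9)
        n = 5000
        band5 = rng.uniform(0.05, 0.4, n)            # a shortwave-infrared reflectance
        tsd = rng.uniform(0, 80, n)                  # years since disturbance
        intensity = rng.uniform(0, 1, n)             # disturbance intensity
        height = 0.4 * tsd - 20 * band5 + 5 * (1 - intensity) + rng.normal(0, 2, n)

        X = np.column_stack([band5, tsd, intensity])
        X_tr, X_te, y_tr, y_te = train_test_split(X, height, test_size=0.3, random_state=0)

        rf = RandomForestRegressor(n_estimators=300, random_state=0)
        rf.fit(X_tr, y_tr)
        pred = rf.predict(X_te)
        print("R2  :", round(float(r2_score(y_te, pred)), 2))
        print("RMSE:", round(float(mean_squared_error(y_te, pred)) ** 0.5, 2), "m")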

  13. Global patterns of phytoplankton dynamics in coastal ecosystems

    USGS Publications Warehouse

    Paerl, H.; Yin, Kedong; Cloern, J.

    2011-01-01

    Scientific Committee on Ocean Research Working Group 137 Meeting; Hangzhou, China, 17-21 October 2010; Phytoplankton biomass and community structure have undergone dramatic changes in coastal ecosystems over the past several decades in response to climate variability and human disturbance. These changes have short- and long-term impacts on global carbon and nutrient cycling, food web structure and productivity, and coastal ecosystem services. There is a need to identify the underlying processes and measure the rates at which they alter coastal ecosystems on a global scale. Hence, the Scientific Committee on Ocean Research (SCOR) formed Working Group 137 (WG 137), "Global Patterns of Phytoplankton Dynamics in Coastal Ecosystems: A Comparative Analysis of Time Series Observations" (http://wg137.net/). This group evolved from a 2007 AGU-sponsored Chapman Conference entitled "Long Time-Series Observations in Coastal Ecosystems: Comparative Analyses of Phytoplankton Dynamics on Regional to Global Scales.".

  14. Information-theoretical noninvasive damage detection in bridge structures

    NASA Astrophysics Data System (ADS)

    Sudu Ambegedara, Amila; Sun, Jie; Janoyan, Kerop; Bollt, Erik

    2016-11-01

    Damage detection of mechanical structures such as bridges is an important research problem in civil engineering. Using spatially distributed sensor time series data collected from a recent experiment on a local bridge in Upper State New York, we study noninvasive damage detection using information-theoretical methods. Several findings are in order. First, the time series data, which represent accelerations measured at the sensors, more closely follow Laplace distribution than normal distribution, allowing us to develop parameter estimators for various information-theoretic measures such as entropy and mutual information. Second, as damage is introduced by the removal of bolts of the first diaphragm connection, the interaction between spatially nearby sensors as measured by mutual information becomes weaker, suggesting that the bridge is "loosened." Finally, using a proposed optimal mutual information interaction procedure to prune away indirect interactions, we found that the primary direction of interaction or influence aligns with the traffic direction on the bridge even after damaging the bridge.
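
    A minimal Python sketch of the mutual-information comparison described above, using a simple histogram (plug-in) estimator rather than the Laplace-parametric estimators the authors derived; the two sensor channels and the "healthy vs. damaged" coupling strengths are synthetic stand-ins:

        import numpy as np

        def mutual_information(x, y, bins=32):
            """Histogram (plug-in) estimate of mutual information in nats."""
            pxy, _, _ = np.histogram2d(x, y, bins=bins)
            pxy /= pxy.sum()
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            nz = pxy > 0
            return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

        # Two hypothetical accelerometer channels sharing a Laplace-distributed common
        # component; "damage" is modelled as a weakening of that shared component.
        rng = np.random.default_rng(10)
        common = rng.laplace(size=20000)
        a_healthy = common + 0.3 * rng.laplace(size=20000)
        b_healthy = common + 0.3 * rng.laplace(size=20000)
        a_damaged = 0.3 * common + rng.laplace(size=20000)
        b_damaged = 0.3 * common + rng.laplace(size=20000)

        print("MI healthy:", round(mutual_information(a_healthy, b_healthy), 2))
        print("MI damaged:", round(mutual_information(a_damaged, b_damaged), 2))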

  15. Zernike phase-contrast electron cryotomography applied to marine cyanobacteria infected with cyanophages.

    PubMed

    Dai, Wei; Fu, Caroline; Khant, Htet A; Ludtke, Steven J; Schmid, Michael F; Chiu, Wah

    2014-11-01

    Advances in electron cryotomography have provided new opportunities to visualize the internal 3D structures of a bacterium. An electron microscope equipped with Zernike phase-contrast optics produces images with markedly increased contrast compared with images obtained by conventional electron microscopy. Here we describe a protocol to apply Zernike phase plate technology for acquiring electron tomographic tilt series of cyanophage-infected cyanobacterial cells embedded in ice, without staining or chemical fixation. We detail the procedures for aligning and assessing phase plates for data collection, and methods for obtaining 3D structures of cyanophage assembly intermediates in the host by subtomogram alignment, classification and averaging. Acquiring three or four tomographic tilt series takes ∼12 h on a JEM2200FS electron microscope. We expect this time requirement to decrease substantially as the technique matures. The time required for annotation and subtomogram averaging varies widely depending on the project goals and data volume.

  16. Guaranteeing robustness of structural condition monitoring to environmental variability

    NASA Astrophysics Data System (ADS)

    Van Buren, Kendra; Reilly, Jack; Neal, Kyle; Edwards, Harry; Hemez, François

    2017-01-01

    Advances in sensor deployment and computational modeling have allowed significant strides to be recently made in the field of Structural Health Monitoring (SHM). One widely used SHM strategy is to perform a vibration analysis where a model of the structure's pristine (undamaged) condition is compared with vibration response data collected from the physical structure. Discrepancies between model predictions and monitoring data can be interpreted as structural damage. Unfortunately, multiple sources of uncertainty must also be considered in the analysis, including environmental variability, unknown model functional forms, and unknown values of model parameters. Not accounting for these sources of uncertainty can lead to false-positives or false-negatives in the structural condition assessment. To manage the uncertainty, we propose a robust SHM methodology that combines three technologies. A time series algorithm is trained using "baseline" data to predict the vibration response, compare predictions to actual measurements collected on a potentially damaged structure, and calculate a user-defined damage indicator. The second technology handles the uncertainty present in the problem. An analysis of robustness is performed to propagate this uncertainty through the time series algorithm and obtain the corresponding bounds of variation of the damage indicator. The uncertainty description and robustness analysis are both inspired by the theory of info-gap decision-making. Lastly, an appropriate "size" of the uncertainty space is determined through physical experiments performed in laboratory conditions. Our hypothesis is that examining how the uncertainty space changes throughout time might lead to superior diagnostics of structural damage as compared to only monitoring the damage indicator. This methodology is applied to a portal frame structure to assess if the strategy holds promise for robust SHM. (Publication approved for unlimited, public release on October-28-2015, LA-UR-15-28442, unclassified.)

  17. Detrended Fluctuation Analysis and Adaptive Fractal Analysis of Stride Time Data in Parkinson's Disease: Stitching Together Short Gait Trials

    PubMed Central

    Liebherr, Magnus; Haas, Christian T.

    2014-01-01

    Variability indicates motor control disturbances and is suitable for identifying gait pathologies. It can be quantified by linear parameters (amplitude estimators) and more sophisticated nonlinear methods (structural information). Detrended Fluctuation Analysis (DFA) is one method to measure structural information, e.g., from stride time series. Recently, an improved method, Adaptive Fractal Analysis (AFA), has been proposed. This method has not been applied to gait data before. Fractal scaling methods (FS) require long stride-to-stride data to obtain valid results. However, in clinical studies, it is not usual to measure a large number of strides (e.g., strides). Among other constraints, clinical gait analysis is limited by short walkways; thus, FS seem inapplicable. The purpose of the present study was to evaluate FS under clinical conditions. Stride time data of five self-paced walking trials ( strides each) of subjects with PD and a healthy control group (CG) were measured. To generate longer time series, stride time sequences were stitched together. The coefficient of variation (CV), fractal scaling exponents (DFA) and (AFA) were calculated. Two surrogate tests were performed: A) the whole time series was randomly shuffled; B) the single trials were randomly shuffled separately and afterwards stitched together. CV did not discriminate between PD and CG. However, significant differences between PD and CG were found concerning and . Surrogate version B yielded a higher mean squared error and empirical quantiles than version A. Hence, we conclude that the stitching procedure creates an artificial structure resulting in an overestimation of true . The method of stitching together sections of gait seems to be appropriate in order to distinguish between PD and CG with FS. It provides an approach to integrate FS as standard in clinical gait analysis and to overcome limitations such as short walkways.
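
    A minimal Python sketch of order-1 DFA applied to a "stitched" series of short trials, to illustrate the scaling-exponent computation discussed in this record (trial length, number of trials, and the noise model are hypothetical; AFA and the surrogate tests are not reproduced here):

        import numpy as np

        def dfa_alpha(x, n_scales=15):
            """Order-1 detrended fluctuation analysis scaling exponent."""
            x = np.asarray(x, float)
            profile = np.cumsum(x - x.mean())
            scales = np.unique(np.logspace(np.log10(8), np.log10(len(x) // 4),
                                           n_scales).astype(int))
            flucts = []
            for s in scales:
                n_seg = len(profile) // s
                segs = profile[:n_seg * s].reshape(n_seg, s)
                t = np.arange(s)
                resid = [seg - np.polyval(np.polyfit(t, seg, 1), t) for seg in segs]
                flucts.append(np.sqrt(np.mean(np.square(resid))))
            return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

        # "Stitching": concatenate five short stride-time trials into one longer series
        rng = np.random.default_rng(11)
        trials = [1.1 + 0.02 * rng.standard_normal(60) for _ in range(5)]
        stitched = np.concatenate(trials)
        print("DFA alpha (stitched):", round(float(dfa_alpha(stitched)), 2))  # ~0.5 for white noise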

  18. Dollar$ & $en$e. Part IV: Measuring the value of people, structural, and customer capital.

    PubMed

    Wilkinson, I

    2001-01-01

    In Part I of this series, I introduced the concept of memes (1). Memes are ideas or concepts, the information world equivalent of genes. The goal of this series of articles is to infect you with my memes, so that you will assimilate, translate, and express them. We discovered that no matter what our area of expertise or "-ology," we all are in the information business. Our goal is to be in the wisdom business. We saw that when we convert raw data into wisdom we are moving along a value chain. Each step in the chain adds a different amount of value to the final product: timely, relevant, accurate, and precise knowledge which can then be applied to create the ultimate product in the value chain: wisdom. In Part II of this series, I infected you with a set of memes for measuring the cost of adding value (2). In Part III of this series, I infected you with a new set of memes for measuring the added value of knowledge, i.e., intellectual capital (3). In Part IV of this series, I will infect you with memes for measuring the value of people, structural, and customer capital.

  19. A generalized conditional heteroscedastic model for temperature downscaling

    NASA Astrophysics Data System (ADS)

    Modarres, R.; Ouarda, T. B. M. J.

    2014-11-01

    This study describes a method for deriving the time-varying second-order moment, or heteroscedasticity, of local daily temperature and its association with large-scale Coupled Canadian General Circulation Model predictors. This is carried out by applying a multivariate generalized autoregressive conditional heteroscedasticity (MGARCH) approach to construct the conditional variance-covariance structure between General Circulation Model (GCM) predictors and maximum and minimum temperature time series during 1980-2000. Two MGARCH specifications, namely diagonal VECH and dynamic conditional correlation (DCC), are applied, and 25 GCM predictors were selected for bivariate temperature heteroscedastic modeling. It is observed that the conditional covariance between predictors and temperature is not very strong and mostly depends on the interaction between the random processes governing the temporal variation of predictors and predictands. The DCC model reveals a time-varying conditional correlation between GCM predictors and temperature time series. No remarkable increasing or decreasing change is observed in the correlation coefficients between GCM predictors and observed temperature during 1980-2000, while a weak winter-summer seasonality is clear for both the conditional covariance and the correlation. Furthermore, the Kwiatkowski-Phillips-Schmidt-Shin (KPSS) stationarity test and the Brock-Dechert-Scheinkman (BDS) nonlinearity test showed that the GCM predictors, the temperature series and their conditional correlation time series are nonlinear but stationary during 1980-2000. However, the degree of nonlinearity of the temperature time series is higher than that of most of the GCM predictors.
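
    A full diagonal-VECH or DCC fit requires a dedicated MGARCH implementation, so the Python sketch below is only a "DCC-lite" illustration under stated assumptions: fit a univariate GARCH(1,1) to each (synthetic) temperature series with the arch package, then track a rolling correlation of the standardized residuals as a crude time-varying conditional correlation:

        import numpy as np
        import pandas as pd
        from arch import arch_model

        # Hypothetical daily Tmax/Tmin anomaly series with time-varying co-movement
        rng = np.random.default_rng(12)
        n = 2000
        common = rng.standard_normal(n)
        tmax = 0.7 * common + rng.standard_normal(n)
        tmin = 0.7 * common * np.sin(np.linspace(0, 6, n)) + rng.standard_normal(n)

        def std_resid(series):
            # Univariate GARCH(1,1) conditional volatility, then standardize residuals
            res = arch_model(series, mean='Constant', vol='GARCH', p=1, q=1).fit(disp='off')
            return res.resid / res.conditional_volatility

        z = pd.DataFrame({'tmax': std_resid(tmax), 'tmin': std_resid(tmin)})
        rolling_corr = z['tmax'].rolling(120).corr(z['tmin'])  # time-varying correlation
        print(rolling_corr.describe())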

  20. Three-dimensional reconstruction of single-cell chromosome structure using recurrence plots.

    PubMed

    Hirata, Yoshito; Oda, Arisa; Ohta, Kunihiro; Aihara, Kazuyuki

    2016-10-11

    Single-cell analysis of the three-dimensional (3D) chromosome structure can reveal cell-to-cell variability in genome activities. Here, we propose to apply recurrence plots, a mathematical method of nonlinear time series analysis, to reconstruct the 3D chromosome structure of a single cell based on information of chromosomal contacts from genome-wide chromosome conformation capture (Hi-C) data. This recurrence plot-based reconstruction (RPR) method enables rapid reconstruction of a unique structure in single cells, even from incomplete Hi-C information.

  1. Three-dimensional reconstruction of single-cell chromosome structure using recurrence plots

    NASA Astrophysics Data System (ADS)

    Hirata, Yoshito; Oda, Arisa; Ohta, Kunihiro; Aihara, Kazuyuki

    2016-10-01

    Single-cell analysis of the three-dimensional (3D) chromosome structure can reveal cell-to-cell variability in genome activities. Here, we propose to apply recurrence plots, a mathematical method of nonlinear time series analysis, to reconstruct the 3D chromosome structure of a single cell based on information of chromosomal contacts from genome-wide chromosome conformation capture (Hi-C) data. This recurrence plot-based reconstruction (RPR) method enables rapid reconstruction of a unique structure in single cells, even from incomplete Hi-C information.
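
    A minimal Python sketch of a recurrence plot for a scalar series, the basic object on which the RPR method builds; in the actual method the recurrence information comes from Hi-C contact maps rather than from a 1D signal, and the threshold and test signal here are hypothetical:

        import numpy as np

        def recurrence_plot(x, eps):
            """Binary recurrence matrix R[i, j] = 1 when |x_i - x_j| < eps (scalar series)."""
            d = np.abs(x[:, None] - x[None, :])
            return (d < eps).astype(int)

        # Hypothetical 1D signal standing in for pairwise "contact" information
        rng = np.random.default_rng(13)
        t = np.linspace(0, 8 * np.pi, 400)
        signal = np.sin(t) + 0.1 * rng.standard_normal(t.size)
        R = recurrence_plot(signal, eps=0.3)
        print("recurrence rate:", round(float(R.mean()), 3))  # fraction of recurrent pairs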

  2. Using GNSS for Assessment Recent Sea Level Rise in the Northwestern Part of the Arabian Gulf

    NASA Astrophysics Data System (ADS)

    Alothman, A. O.; Bos, M. S.; Fernandes, R.

    2017-12-01

    Due to global warming, sea level rise is predicted to reach 30 cm to 60 cm in some regions during the 21st century. Sea level monitoring is important for the Kingdom of Saudi Arabia, since it is bordered by a very long coast of about 3400 km and hundreds of isolated islands. The eastern coastline of KSA, on the Arabian Gulf, needs long-term monitoring because of the low-lying nature of the region. In addition, the ongoing oil withdrawal activities in the area may affect the regional sea level rise, and the tectonic structure of the Arabian Peninsula is a further factor. The regional relative sea level on the eastern coast of Saudi Arabia has previously been estimated from more than 28 years of tide gauge data together with the vertical displacement of permanent Global Navigation Satellite System (GNSS) stations spanning only about 3 years. In this paper, we discuss and update the methodology and results from Alothman et al. (2014), particularly by checking and extending the GNSS solutions. Since 3 of the 6 GPS stations used only started observing at the end of 2011, the longer time series now have significantly lower uncertainties in the estimated vertical rate. A longer time span of GNSS observations was included, 500 synthetic time series were generated, and seasonal signals were analysed. It is concluded that the varying seasonal signal present in the GNSS time series causes an underestimation of 0.1 mm/yr for short time series of 3 years. In addition to the implications of using short time series to estimate the vertical land motion, we found that the problem is aggravated if varying seasonal signals are present in the data. This finding can be useful for other studies analyzing short GNSS time series.
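
    A minimal Python sketch of estimating a vertical rate from a short GNSS series with and without annual/semi-annual terms, illustrating the seasonal-signal bias discussed above; the short (~2.5-year) series, noise level, and rate are synthetic, and real analyses also require colored-noise models rather than the white noise assumed here:

        import numpy as np

        # Hypothetical daily GNSS vertical displacements (mm): linear uplift plus an
        # annual signal and white noise
        rng = np.random.default_rng(14)
        t_years = np.arange(0, 2.5, 1 / 365.25)
        true_rate = 1.0                              # mm/yr
        up = true_rate * t_years + 3.0 * np.sin(2 * np.pi * t_years + 0.5) \
             + 1.5 * rng.standard_normal(t_years.size)

        def vertical_rate(t, y, seasonal=True):
            cols = [np.ones_like(t), t]
            if seasonal:
                cols += [np.sin(2 * np.pi * t), np.cos(2 * np.pi * t),
                         np.sin(4 * np.pi * t), np.cos(4 * np.pi * t)]
            A = np.column_stack(cols)
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            return coef[1]                           # slope = vertical rate (mm/yr)

        print("rate w/o seasonal terms :", round(float(vertical_rate(t_years, up, False)), 2), "mm/yr")
        print("rate with seasonal terms:", round(float(vertical_rate(t_years, up, True)), 2), "mm/yr")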

  3. Autoregressive-model-based missing value estimation for DNA microarray time series data.

    PubMed

    Choong, Miew Keen; Charbit, Maurice; Yan, Hong

    2009-01-01

    Missing value estimation is important in DNA microarray data analysis. A number of algorithms have been developed to solve this problem, but they have several limitations. Most existing algorithms are not able to deal with the situation where a particular time point (column) of the data is missing entirely. In this paper, we present an autoregressive-model-based missing value estimation method (ARLSimpute) that takes into account the dynamic property of microarray temporal data and the local similarity structures in the data. ARLSimpute is especially effective for the situation where a particular time point contains many missing values or where the entire time point is missing. Experiment results suggest that our proposed algorithm is an accurate missing value estimator in comparison with other imputation methods on simulated as well as real microarray time series datasets.
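
    A much-simplified Python sketch of autoregressive-model-based imputation in the spirit of ARLSimpute; this is not the authors' algorithm: it handles a single series, fits the AR coefficients by least squares, and iteratively refines the imputed points, with the series, order, and missing positions all hypothetical:

        import numpy as np

        def ar_impute(y, missing_idx, order=2, n_iter=20):
            """Fill missing entries of a 1D series with an AR(order) model, refitting iteratively."""
            y = y.copy()
            obs_idx = [i for i in range(len(y)) if i not in missing_idx]
            y[missing_idx] = np.interp(missing_idx, obs_idx, y[obs_idx])  # initial guess
            for _ in range(n_iter):
                # Least-squares AR fit on the currently imputed series
                X = np.column_stack([y[order - k - 1:len(y) - k - 1] for k in range(order)])
                coef, *_ = np.linalg.lstsq(X, y[order:], rcond=None)
                for i in missing_idx:                # refine the imputed values
                    if i >= order:
                        y[i] = np.dot(coef, y[i - order:i][::-1])
            return y

        rng = np.random.default_rng(15)
        true = np.sin(np.linspace(0, 6 * np.pi, 60)) + 0.05 * rng.standard_normal(60)
        missing = [20, 21, 40]                       # e.g. an entirely missing time point
        observed = true.copy()
        observed[missing] = np.nan
        filled = ar_impute(observed, missing)
        print("absolute imputation errors:", np.abs(filled[missing] - true[missing]).round(3))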

  4. Correlation of physical properties with molecular structure for some dicyclic hydrocarbons having high thermal-energy release per unit volume -- 2-alkylbiphenyl and the two isomeric 2-alkylbicyclohexyl series

    NASA Technical Reports Server (NTRS)

    Goodman, Irving A; Wise, Paul H

    1952-01-01

    Three homologous series of related dicyclic hydrocarbons are presented for comparison on the basis of their physical properties, which include net heat of combustion, density, melting point, boiling point, and kinematic viscosity. The three series investigated include the 2-n-alkylbiphenyl, 2-n-alkylbicyclohexyl (high-boiling), and 2-n-alkylbicyclohexyl (low-boiling) series through C16, in addition to three branched-chain (isopropyl, sec-butyl, and isobutyl) 2-alkylbiphenyls and their corresponding 2-alkylbicyclohexyls. The physical properties of the low-boiling and high-boiling isomers of 2-sec-butylbicyclohexyl and 2-isobutylbicyclohexyl are reported herein for the first time.

  5. Predicting the process of extinction in experimental microcosms and accounting for interspecific interactions in single-species time series

    PubMed Central

    Ferguson, Jake M; Ponciano, José M

    2014-01-01

    Predicting population extinction risk is a fundamental application of ecological theory to the practice of conservation biology. Here, we compared the prediction performance of a wide array of stochastic, population dynamics models against direct observations of the extinction process from an extensive experimental data set. By varying a series of biological and statistical assumptions in the proposed models, we were able to identify the assumptions that affected predictions about population extinction. We also show how certain autocorrelation structures can emerge due to interspecific interactions, and that accounting for the stochastic effect of these interactions can improve predictions of the extinction process. We conclude that it is possible to account for the stochastic effects of community interactions on extinction when using single-species time series. PMID:24304946

  6. Understanding multi-scale structural evolution in granular systems through gMEMS

    NASA Astrophysics Data System (ADS)

    Walker, David M.; Tordesillas, Antoinette

    2013-06-01

    We show how the rheological response of a material to applied loads can be systematically coded, analyzed and succinctly summarized, according to an individual grain's property (e.g. kinematics). Individual grains are considered as their own smart sensor akin to microelectromechanical systems (e.g. gyroscopes, accelerometers), each capable of recognizing their evolving role within self-organizing building block structures (e.g. contact cycles and force chains). A symbolic time series is used to represent their participation in such self-assembled building blocks and a complex network summarizing their interrelationship with other grains is constructed. In particular, relationships between grain time series are determined according to the information theory Hamming distance or the metric Euclidean distance. We then use topological distance to find network communities enabling groups of grains at remote physical metric distances in the material to share a classification. In essence grains with similar structural and functional roles at different scales are identified together. This taxonomy distills the dissipative structural rearrangements of grains down to its essential features and thus provides pointers for objective physics-based internal variable formalisms used in the construction of robust predictive continuum models.

  7. Aeroelastic impact of above-rated wave-induced structural motions on the near-wake stability of a floating offshore wind turbine rotor

    NASA Astrophysics Data System (ADS)

    Rodriguez, Steven; Jaworski, Justin

    2017-11-01

    The impact of above-rated wave-induced motions on the stability of floating offshore wind turbine near-wakes is studied numerically. The rotor near-wake is generated using a lifting-line free vortex wake method, which is strongly coupled to a finite element solver for kinematically nonlinear blade deformations. A synthetic time series of relatively high-amplitude/high-frequency platform motion, representative of above-rated conditions of the NREL 5 MW reference wind turbine, is imposed on the rotor structure. To evaluate the impact of these above-rated conditions, a linear stability analysis is first performed on the near wake generated by a fixed-tower wind turbine configuration at above-rated inflow conditions. The platform motion is then introduced via the synthetic time series, and a stability analysis is performed on the wake generated by the floating offshore wind turbine at the same above-rated inflow conditions. The stability trends (disturbance modes versus the divergence rate of vortex structures) of the two analyses are compared to identify the impact that above-rated wave-induced structural motions have on the stability of the floating offshore wind turbine wake.

  8. Interpretation of Time Series from Nonlinear Systems. Volume 58. Proceedings of the IUTAM Symposium and NATO Advanced Research Workshop on the Interpretation of Time Series from Nonlinear Mechanical Systems Held in England on 26 - 30 August 1991,

    DTIC Science & Technology

    1992-01-01

    Fragmentary excerpt (figure captions and running text): the correlation entropy K2(M) is plotted against the embedding dimension M for both the linear and non-linear signals; a correlation dimension of v = 2.7 is obtained, similar to the structure observed by Voges et al. [46] in an analysis of X-ray variability; recurrence plots are used to indicate whether a meaningful correlation integral analysis is warranted.

  9. Volterra Series Approach for Nonlinear Aeroelastic Response of 2-D Lifting Surfaces

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Marzocca, Piergiovanni; Librescu, Liviu

    2001-01-01

    The problem of the determination of the subcritical aeroelastic response and flutter instability of nonlinear two-dimensional lifting surfaces in an incompressible flow-field via Volterra series approach is addressed. The related aeroelastic governing equations are based upon the inclusion of structural nonlinearities, of the linear unsteady aerodynamics and consideration of an arbitrary time-dependent external pressure pulse. Unsteady aeroelastic nonlinear kernels are determined, and based on these, frequency and time histories of the subcritical aeroelastic response are obtained, and in this context the influence of geometric nonlinearities is emphasized. Conclusions and results displaying the implications of the considered effects are supplied.
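
    As a toy illustration of how a truncated Volterra series maps an input pulse to a response, the sketch below convolves an arbitrary pressure-like pulse with assumed first- and second-order kernels. The kernel shapes and all parameters are invented for illustration; they are not the aeroelastic kernels identified in the paper.

    ```python
    import numpy as np

    dt = 0.01
    t = np.arange(0, 5, dt)

    # Illustrative kernels (assumed forms, not the identified aeroelastic kernels):
    # first-order kernel = damped oscillation, second-order kernel = separable product.
    h1 = np.exp(-1.5 * t) * np.sin(8.0 * t)
    h2 = np.outer(h1, h1) * 0.2

    # Arbitrary time-dependent external pressure pulse.
    u = np.exp(-((t - 1.0) / 0.1) ** 2)

    # First-order (linear) term: y1(t) = integral of h1(tau) u(t - tau) dtau
    y1 = np.convolve(u, h1)[: len(t)] * dt

    # Second-order term: y2(t) = double integral of h2(tau1, tau2) u(t - tau1) u(t - tau2)
    y2 = np.zeros_like(t)
    for k in range(len(t)):
        past = u[k::-1]                      # u(t - tau) for tau = 0..t
        m = len(past)
        y2[k] = past @ h2[:m, :m] @ past * dt * dt

    y = y1 + y2   # truncated second-order Volterra series response
    print(y.max())
    ```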

  10. Decahydrobenzoquinolin-5-one sigma receptor ligands: Divergent development of both sigma 1 and sigma 2 receptor selective examples.

    PubMed

    McLeod, Michael C; Aubé, Jeffrey; Frankowski, Kevin J

    2016-12-01

    Analogues of the decahydrobenzoquinolin-5-one class of sigma (σ) receptor ligands were used to probe the structure-activity relationship trends for this recently discovered series of σ ligands. In all, 29 representatives were tested for σ and opioid receptor affinity, leading to the identification of compounds possessing improved σ 1 selectivity and, for the first time in this series, examples possessing preferential σ 2 affinity. Several structural features associated with these selectivity trends have been identified. Two analogues of improved selectivity were evaluated in a binding panel of 43 CNS-relevant targets to confirm their sigma receptor preference. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Nonstationary frequency analysis for the trivariate flood series of the Weihe River

    NASA Astrophysics Data System (ADS)

    Jiang, Cong; Xiong, Lihua

    2016-04-01

    Some intensive human activities such as water-soil conservation can significantly alter the natural hydrological processes of rivers. In this study, the effect of the water-soil conservation on the trivariate flood series from the Weihe River located in the Northwest China is investigated. The annual maxima daily discharge, annual maxima 3-day flood volume and annual maxima 5-day flood volume are chosen as the study data and used to compose the trivariate flood series. The nonstationarities in both the individual univariate flood series and the corresponding antecedent precipitation series generating the flood events are examined by the Mann-Kendall trend test. It is found that all individual univariate flood series present significant decreasing trend, while the antecedent precipitation series can be treated as stationary. It indicates that the increase of the water-soil conservation land area has altered the rainfall-runoff relationship of the Weihe basin, and induced the nonstationarities in the three individual univariate flood series. The time-varying moments model based on the Pearson type III distribution is applied to capture the nonstationarities in the flood frequency distribution with the water-soil conservation land area introduced as the explanatory variable of the flood distribution parameters. Based on the analysis for each individual univariate flood series, the dependence structure among the three univariate flood series are investigated by the time-varying copula model also with the water-soil conservation land area as the explanatory variable of copula parameters. The results indicate that the dependence among the trivariate flood series is enhanced by the increase of water-soil conservation land area.
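
    A minimal sketch of the Mann-Kendall trend test used above to screen the flood and antecedent precipitation series for nonstationarity; it uses the normal approximation, omits the tie correction, and is applied here to a synthetic annual-maximum series rather than the Weihe River data.

    ```python
    import numpy as np
    from scipy.stats import norm

    def mann_kendall(x):
        """Mann-Kendall trend test (normal approximation, no tie correction).
        Returns the S statistic, the standardized Z score and a two-sided p-value."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        s = 0.0
        for i in range(n - 1):
            s += np.sign(x[i + 1:] - x[i]).sum()
        var_s = n * (n - 1) * (2 * n + 5) / 18.0
        if s > 0:
            z = (s - 1) / np.sqrt(var_s)
        elif s < 0:
            z = (s + 1) / np.sqrt(var_s)
        else:
            z = 0.0
        p = 2 * (1 - norm.cdf(abs(z)))
        return s, z, p

    # Illustrative annual-maximum series with a decreasing trend plus noise.
    rng = np.random.default_rng(2)
    flow = 1000 - 8 * np.arange(50) + rng.normal(0, 80, 50)
    print(mann_kendall(flow))
    ```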

  12. Analysis of brain patterns using temporal measures

    DOEpatents

    Georgopoulos, Apostolos

    2015-08-11

    A set of brain data representing a time series of neurophysiologic activity acquired by spatially distributed sensors arranged to detect neural signaling of a brain (such as by the use of magnetoencephalography) is obtained. The set of brain data is processed to obtain a dynamic brain model based on a set of statistically-independent temporal measures, such as partial cross correlations, among groupings of different time series within the set of brain data. The dynamic brain model represents interactions between neural populations of the brain occurring close in time, such as with zero lag, for example. The dynamic brain model can be analyzed to obtain the neurophysiologic assessment of the brain. Data processing techniques may be used to assess structural or neurochemical brain pathologies.

  13. Structure of public transit costs in the presence of multiple serial correlation

    DOT National Transportation Integrated Search

    1999-12-01

    Most studies indicate that public transit systems operate under increasing returns to capital stock utilization and are significantly overcapitalized. Existing flexible form time series analyses, however, fail to correct for serial correlation. In th...

  14. Parabolic quantitative structure-activity relationships and photodynamic therapy: application of a three-compartment model with clearance to the in vivo quantitative structure-activity relationships of a congeneric series of pyropheophorbide derivatives used as photosensitizers for photodynamic therapy.

    PubMed

    Potter, W R; Henderson, B W; Bellnier, D A; Pandey, R K; Vaughan, L A; Weishaupt, K R; Dougherty, T J

    1999-11-01

    An open three-compartment pharmacokinetic model was applied to the in vivo quantitative structure-activity relationship (QSAR) data of a homologous series of pyropheophorbide photosensitizers for photodynamic therapy (PDT). The physical model was a lipid compartment sandwiched between two identical aqueous compartments. The first compartment was assumed to clear irreversibly at a rate K0. The measured octanol-water partition coefficients, P(i) (where i is the number of carbons in the alkyl chain) and the clearance rate K0 determined the clearance kinetics of the drugs. Solving the coupled differential equations of the three-compartment model produced clearance kinetics for each of the sensitizers in each of the compartments. The third compartment was found to contain the target of PDT. This series of compounds is quite lipophilic. Therefore these drugs are found mainly in the second compartment. The drug level in the third compartment represents a small fraction of the tissue level and is thus not accessible to direct measurement by extraction. The second compartment of the model accurately predicted the clearance from the serum of mice of the hexyl ether of pyropheophorbide a, one member of this series of compounds. The diffusion and clearance rate constants were those found by fitting the pharmacokinetics of the third compartment to the QSAR data. This result validated the magnitude and mechanistic significance of the rate constants used to model the QSAR data. The PDT response to dose theory was applied to the kinetic behavior of the target compartment drug concentration. This produced a pharmacokinetic-based function connecting PDT response to dose as a function of time postinjection. This mechanistic dose-response function was fitted to published, single time point QSAR data for the pheophorbides. As a result, the PDT target threshold dose together with the predicted QSAR as a function of time postinjection was found.
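
    A rough sketch of integrating an open three-compartment model (aqueous-lipid-aqueous, with irreversible clearance K0 from the first compartment) with SciPy. The way the partition coefficient P scales the lipid-to-water transfer rates, and all rate constants and units, are assumptions made for illustration rather than the fitted values from the study.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Illustrative rate constants (1/h); P is the octanol-water partition coefficient.
    K0, k12, k21, k23, k32, P = 0.3, 1.0, 0.5, 0.8, 0.4, 50.0

    def three_compartment(t, y):
        c1, c2, c3 = y
        # Compartment 1 (aqueous): irreversible clearance K0 plus exchange with lipid.
        dc1 = -K0 * c1 - k12 * c1 + (k21 / P) * c2
        # Compartment 2 (lipid): exchange with both aqueous compartments.
        dc2 = k12 * c1 - (k21 / P) * c2 - (k23 / P) * c2 + k32 * c3
        # Compartment 3 (aqueous, assumed to contain the PDT target).
        dc3 = (k23 / P) * c2 - k32 * c3
        return [dc1, dc2, dc3]

    sol = solve_ivp(three_compartment, (0, 48), [1.0, 0.0, 0.0], dense_output=True)
    t = np.linspace(0, 48, 200)
    c1, c2, c3 = sol.sol(t)
    print("target-compartment peak at t =", t[np.argmax(c3)], "h")
    ```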

  15. Applications of Fault Detection in Vibrating Structures

    NASA Technical Reports Server (NTRS)

    Eure, Kenneth W.; Hogge, Edward; Quach, Cuong C.; Vazquez, Sixto L.; Russell, Andrew; Hill, Boyd L.

    2012-01-01

    Structural fault detection and identification remains an area of active research. Solutions to fault detection and identification may be based on subtle changes in the time series history of vibration signals originating from various sensor locations throughout the structure. The purpose of this paper is to document the application of vibration based fault detection methods applied to several structures. Overall, this paper demonstrates the utility of vibration based methods for fault detection in a controlled laboratory setting and limitations of applying the same methods to a similar structure during flight on an experimental subscale aircraft.

  16. Molecular Modeling and Experimental Study of Nonlinear Optical Compounds: Mono-Substituted Derivatives of Dicyanovinylbenzene

    NASA Technical Reports Server (NTRS)

    Timofeeva, Tatyana V.; Nesterov, Vladimir N.; Antipin, Mikhael Y.; Clark, R. D.; Sanghadasa, M.; Cardelino, B. H.; Moore, C. E.; Frazier, Donald O.

    2000-01-01

    A search for potential nonlinear optical (NLO) compounds has been performed using the Cambridge Structural Database and molecular modeling. We have studied a series of mono-substituted derivatives of dicyanovinylbenzene as the NLO properties of one of its derivatives (o-methoxy-dicyanovinylbenzene, DIVA) were described earlier. The molecular geometry in the series of the compounds studied was investigated with an X- ray analysis and discussed along with results of molecular mechanics and ab initio quantum chemical calculations. The influence of crystal packing on the molecular planarity has been revealed. Two new compounds from the series studied were found to be active for second harmonic generation (SHG) in the powder. The measurements of SHG efficiency have shown that the o-F- and p-Cl-derivatives of dicyanovinylbenzene are about 10 and 20- times more active than urea, respectively. The peculiarities of crystal structure formation in the framework of balance between the van der Waals and electrostatic interactions have been discussed. The crystal morphology of DIVA and two new SHG-active compounds have been calculated on the basis of their known crystal structures.

  17. Rheological and micro-Raman time-series characterization of enzyme sol–gel solution toward morphological control of electrospun fibers

    PubMed Central

    Oriero, Dennis A; Weakley, Andrew T; Aston, D Eric

    2012-01-01

    Rheological and micro-Raman time-series characterizations were used to investigate the chemical evolutionary changes of silica sol–gel mixtures for electrospinning fibers to immobilize an enzyme (tyrosinase). Results of dynamic rheological measurements agreed with the expected structural transitions associated with reacting sol–gel systems. The electrospinning sols exhibited shear-thinning behavior typical of a power law model. Ultrafine (200–300 nm diameter) fibers were produced at early and late times within the reaction window of approximately one hour from initial mixing of sol solutions with and without enzyme; diameter distributions of these fibers showed much smaller deviations than expected. The enzyme markedly increased magnitudes of both elastic and viscous moduli but had no significant impact on final fiber diameters, suggesting that the shear-thinning behavior of both sol–gel mixtures is dominant in the fiber elongation process. The time course and scale for the electrospinning batch fabrication show strong correlations between the magnitudes in rheological property changes over time and the chemical functional group evolution obtained from micro-Raman time-series analysis of the reacting sol–gel systems. PMID:27877486
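
    The abstract notes that the electrospinning sols show shear-thinning behavior typical of a power-law model; the sketch below fits the Ostwald-de Waele power law (viscosity = K * shear_rate**(n-1)) to synthetic viscosity data by linear regression in log-log space. The data and parameter values are illustrative, not the measured rheology of the sol-gel mixtures.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic shear-rate / viscosity data mimicking a shear-thinning sol (n < 1).
    shear_rate = np.logspace(-1, 3, 30)                              # 1/s
    K_true, n_true = 2.5, 0.6
    viscosity = K_true * shear_rate ** (n_true - 1)
    viscosity *= np.exp(rng.normal(0, 0.05, shear_rate.size))        # multiplicative noise

    # Power-law (Ostwald-de Waele) model: eta = K * gamma_dot**(n-1)
    # -> log eta = log K + (n-1) * log gamma_dot, a straight line in log-log space.
    slope, intercept = np.polyfit(np.log(shear_rate), np.log(viscosity), 1)
    n_fit, K_fit = slope + 1, np.exp(intercept)
    print(f"n = {n_fit:.2f}, K = {K_fit:.2f} (n < 1 indicates shear thinning)")
    ```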

  18. The genomic response of skeletal muscle to methylprednisolone using microarrays: tailoring data mining to the structure of the pharmacogenomic time series

    PubMed Central

    DuBois, Debra C; Piel, William H; Jusko, William J

    2008-01-01

    High-throughput data collection using gene microarrays has great potential as a method for addressing the pharmacogenomics of complex biological systems. Similarly, mechanism-based pharmacokinetic/pharmacodynamic modeling provides a tool for formulating quantitative testable hypotheses concerning the responses of complex biological systems. As the response of such systems to drugs generally entails cascades of molecular events in time, a time series design provides the best approach to capturing the full scope of drug effects. A major problem in using microarrays for high-throughput data collection is sorting through the massive amount of data in order to identify probe sets and genes of interest. Due to its inherent redundancy, a rich time series containing many time points and multiple samples per time point allows for the use of less stringent criteria of expression, expression change and data quality for initial filtering of unwanted probe sets. The remaining probe sets can then become the focus of more intense scrutiny by other methods, including temporal clustering, functional clustering and pharmacokinetic/pharmacodynamic modeling, which provide additional ways of identifying the probes and genes of pharmacological interest. PMID:15212590

  19. Latent Variable Regression 4-Level Hierarchical Model Using Multisite Multiple-Cohorts Longitudinal Data. CRESST Report 801

    ERIC Educational Resources Information Center

    Choi, Kilchan

    2011-01-01

    This report explores a new latent variable regression 4-level hierarchical model for monitoring school performance over time using multisite multiple-cohorts longitudinal data. This kind of data set has a 4-level hierarchical structure: time-series observation nested within students who are nested within different cohorts of students. These…

  20. Temporal turnover and the maintenance of diversity in ecological assemblages

    PubMed Central

    Magurran, Anne E.; Henderson, Peter A.

    2010-01-01

    Temporal variation in species abundances occurs in all ecological communities. Here, we explore the role that this temporal turnover plays in maintaining assemblage diversity. We investigate a three-decade time series of estuarine fishes and show that the abundances of the individual species fluctuate asynchronously around their mean levels. We then use a time-series modelling approach to examine the consequences of different patterns of turnover, by asking how the correlation between the abundance of a species in a given year and its abundance in the previous year influences the structure of the overall assemblage. Classical diversity measures that ignore species identities reveal that the observed assemblage structure will persist under all but the most extreme conditions. However, metrics that track species identities indicate a narrower set of turnover scenarios under which the predicted assemblage resembles the natural one. Our study suggests that species diversity metrics are insensitive to change and that measures that track species ranks may provide better early warning that an assemblage is being perturbed. It also highlights the need to incorporate temporal turnover in investigations of assemblage structure and function. PMID:20980310
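
    A minimal sketch of the kind of turnover experiment described above, under simplifying assumptions: each species' log abundance fluctuates asynchronously as an AR(1) process around its own mean, and a classical diversity index (Shannon) is contrasted with a rank-based persistence measure for different lag-1 correlations. The assemblage is synthetic, not the estuarine fish data.

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(4)

    def simulate_assemblage(n_species=30, n_years=30, phi=0.5, sigma=0.5):
        """Log abundances fluctuate asynchronously (AR(1)) around species-specific means."""
        means = rng.normal(3, 1.5, n_species)        # species differ in mean abundance
        x = np.zeros((n_years, n_species))
        x[0] = means
        for t in range(1, n_years):
            x[t] = means + phi * (x[t - 1] - means) + sigma * rng.standard_normal(n_species)
        return np.exp(x)                              # abundances

    def shannon(abund):
        p = abund / abund.sum()
        return -(p * np.log(p)).sum()

    for phi in (0.0, 0.5, 0.9):
        a = simulate_assemblage(phi=phi)
        h_first, h_last = shannon(a[0]), shannon(a[-1])
        rho, _ = spearmanr(a[0], a[-1])               # rank persistence of species identities
        print(f"phi={phi}: Shannon {h_first:.2f} -> {h_last:.2f}, rank correlation {rho:.2f}")
    ```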

  1. Visibility graphlet approach to chaotic time series

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mutua, Stephen; Computer Science Department, Masinde Muliro University of Science and Technology, P.O. Box 190-50100, Kakamega; Gu, Changgui, E-mail: gu-changgui@163.com, E-mail: hjyang@ustc.edu.cn

    Many novel methods have been proposed for mapping time series into complex networks. Although some dynamical behaviors can be effectively captured by existing approaches, the preservation and tracking of the temporal behaviors of a chaotic system remains an open problem. In this work, we extended the visibility graphlet approach to investigate both discrete and continuous chaotic time series. We applied visibility graphlets to capture the reconstructed local states, so that each is treated as a node and tracked downstream to create a temporal chain link. Our empirical findings show that the approach accurately captures the dynamical properties of chaotic systems. Networks constructed from periodic dynamic phases all converge to regular networks and to unique network structures for each model in the chaotic zones. Furthermore, our results show that the characterization of chaotic and non-chaotic zones in the Lorenz system corresponds to the maximal Lyapunov exponent, thus providing a simple and straightforward way to analyze chaotic systems.
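
    As background to the graphlet construction, the sketch below implements the basic natural visibility mapping from a time series to a network (brute-force O(n^2)), applied to a chaotic logistic-map series. The graphlet extension and local-state tracking used in the paper are not reproduced here.

    ```python
    import numpy as np
    import networkx as nx

    def natural_visibility_graph(x):
        """Map a time series to a graph: nodes are time points, and two points
        (i, x_i), (j, x_j) are linked if every intermediate point lies strictly
        below the straight line joining them (natural visibility criterion)."""
        n = len(x)
        G = nx.Graph()
        G.add_nodes_from(range(n))
        for i in range(n):
            for j in range(i + 1, n):
                visible = all(
                    x[k] < x[i] + (x[j] - x[i]) * (k - i) / (j - i)
                    for k in range(i + 1, j)
                )
                if visible:
                    G.add_edge(i, j)
        return G

    # Example series: logistic map in the chaotic regime (r = 4).
    x = np.empty(300)
    x[0] = 0.4
    for t in range(299):
        x[t + 1] = 4.0 * x[t] * (1.0 - x[t])

    G = natural_visibility_graph(x)
    degrees = np.array([d for _, d in G.degree()])
    print("mean degree:", degrees.mean())
    ```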

  2. Wavelet-based time series bootstrap model for multidecadal streamflow simulation using climate indicators

    NASA Astrophysics Data System (ADS)

    Erkyihun, Solomon Tassew; Rajagopalan, Balaji; Zagona, Edith; Lall, Upmanu; Nowak, Kenneth

    2016-05-01

    A model to generate stochastic streamflow projections conditioned on quasi-oscillatory climate indices such as Pacific Decadal Oscillation (PDO) and Atlantic Multi-decadal Oscillation (AMO) is presented. Recognizing that each climate index has underlying band-limited components that contribute most of the energy of the signals, we first pursue a wavelet decomposition of the signals to identify and reconstruct these features from annually resolved historical data and proxy based paleoreconstructions of each climate index covering the period from 1650 to 2012. A K-Nearest Neighbor block bootstrap approach is then developed to simulate the total signal of each of these climate index series while preserving its time-frequency structure and marginal distributions. Finally, given the simulated climate signal time series, a K-Nearest Neighbor bootstrap is used to simulate annual streamflow series conditional on the joint state space defined by the simulated climate index for each year. We demonstrate this method by applying it to simulation of streamflow at Lees Ferry gauge on the Colorado River using indices of two large scale climate forcings: Pacific Decadal Oscillation (PDO) and Atlantic Multi-decadal Oscillation (AMO), which are known to modulate the Colorado River Basin (CRB) hydrology at multidecadal time scales. Skill in stochastic simulation of multidecadal projections of flow using this approach is demonstrated.
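
    A minimal sketch of the conditional K-nearest-neighbor resampling step: given a simulated climate-index state for a year, a streamflow value is drawn from the historical years with the closest index values, using the common 1/rank weighting. The data, neighborhood-size heuristic and single-index conditioning are illustrative simplifications of the full wavelet-based scheme described above.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def knn_conditional_sample(sim_index, hist_index, hist_flow, K=None):
        """Sample one streamflow value conditional on a simulated climate-index state
        by resampling among the K historical nearest neighbours (weights ~ 1/rank)."""
        hist_index = np.asarray(hist_index, float)
        if K is None:
            K = int(np.sqrt(len(hist_index)))          # common heuristic choice
        order = np.argsort(np.abs(hist_index - sim_index))[:K]
        weights = 1.0 / np.arange(1, K + 1)
        weights /= weights.sum()
        return hist_flow[rng.choice(order, p=weights)]

    # Illustrative "historical" record: flow loosely tied to a climate index.
    hist_index = rng.normal(0, 1, 100)
    hist_flow = 15.0 + 3.0 * hist_index + rng.normal(0, 1, 100)

    # Simulate 5 years of flow conditioned on simulated index states.
    sim_states = rng.normal(0, 1, 5)
    flows = [knn_conditional_sample(s, hist_index, hist_flow) for s in sim_states]
    print(np.round(flows, 1))
    ```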

  3. An Efficient Interval Type-2 Fuzzy CMAC for Chaos Time-Series Prediction and Synchronization.

    PubMed

    Lee, Ching-Hung; Chang, Feng-Yu; Lin, Chih-Min

    2014-03-01

    This paper aims to propose a more efficient control algorithm for chaos time-series prediction and synchronization. A novel type-2 fuzzy cerebellar model articulation controller (T2FCMAC) is proposed. In some special cases, this T2FCMAC can be reduced to an interval type-2 fuzzy neural network, a fuzzy neural network, and a fuzzy cerebellar model articulation controller (CMAC). The T2FCMAC is thus a more generalized network with better learning ability, and it is therefore used for chaos time-series prediction and synchronization. Moreover, this T2FCMAC realizes the un-normalized interval type-2 fuzzy logic system based on the structure of the CMAC. It can provide better capabilities for handling uncertainty and more design degrees of freedom than the traditional type-1 fuzzy CMAC. Unlike in most interval type-2 fuzzy systems, the type-reduction of the T2FCMAC is bypassed owing to the property of the un-normalized interval type-2 fuzzy logic system. This gives the T2FCMAC lower computational complexity and makes it more practical. For chaos time-series prediction and synchronization applications, the training architectures, with corresponding convergence analyses and optimal learning rates based on a Lyapunov stability approach, are introduced. Finally, two illustrative examples are presented to demonstrate the performance of the proposed T2FCMAC.

  4. The Fourier decomposition method for nonlinear and non-stationary time series analysis

    PubMed Central

    Joshi, Shiv Dutt; Patney, Rakesh Kumar; Saha, Kaushik

    2017-01-01

    For many decades, there has been a general perception in the literature that Fourier methods are not suitable for the analysis of nonlinear and non-stationary data. In this paper, we propose a novel and adaptive Fourier decomposition method (FDM), based on the Fourier theory, and demonstrate its efficacy for the analysis of nonlinear and non-stationary time series. The proposed FDM decomposes any data into a small number of ‘Fourier intrinsic band functions’ (FIBFs). The FDM presents a generalized Fourier expansion with variable amplitudes and variable frequencies of a time series by the Fourier method itself. We propose an idea of zero-phase filter bank-based multivariate FDM (MFDM), for the analysis of multivariate nonlinear and non-stationary time series, using the FDM. We also present an algorithm to obtain cut-off frequencies for MFDM. The proposed MFDM generates a finite number of band-limited multivariate FIBFs (MFIBFs). The MFDM preserves some intrinsic physical properties of the multivariate data, such as scale alignment, trend and instantaneous frequency. The proposed methods provide a time–frequency–energy (TFE) distribution that reveals the intrinsic structure of the data. Numerical computations and simulations have been carried out and comparison is made with the empirical mode decomposition algorithms. PMID:28413352

  5. A statistical approach for segregating cognitive task stages from multivariate fMRI BOLD time series.

    PubMed

    Demanuele, Charmaine; Bähner, Florian; Plichta, Michael M; Kirsch, Peter; Tost, Heike; Meyer-Lindenberg, Andreas; Durstewitz, Daniel

    2015-01-01

    Multivariate pattern analysis can reveal new information from neuroimaging data to illuminate human cognition and its disturbances. Here, we develop a methodological approach, based on multivariate statistical/machine learning and time series analysis, to discern cognitive processing stages from functional magnetic resonance imaging (fMRI) blood oxygenation level dependent (BOLD) time series. We apply this method to data recorded from a group of healthy adults whilst performing a virtual reality version of the delayed win-shift radial arm maze (RAM) task. This task has been frequently used to study working memory and decision making in rodents. Using linear classifiers and multivariate test statistics in conjunction with time series bootstraps, we show that different cognitive stages of the task, as defined by the experimenter, namely, the encoding/retrieval, choice, reward and delay stages, can be statistically discriminated from the BOLD time series in brain areas relevant for decision making and working memory. Discrimination of these task stages was significantly reduced during poor behavioral performance in dorsolateral prefrontal cortex (DLPFC), but not in the primary visual cortex (V1). Experimenter-defined dissection of time series into class labels based on task structure was confirmed by an unsupervised, bottom-up approach based on Hidden Markov Models. Furthermore, we show that different groupings of recorded time points into cognitive event classes can be used to test hypotheses about the specific cognitive role of a given brain region during task execution. We found that whilst the DLPFC strongly differentiated between task stages associated with different memory loads, but not between different visual-spatial aspects, the reverse was true for V1. Our methodology illustrates how different aspects of cognitive information processing during one and the same task can be separated and attributed to specific brain regions based on information contained in multivariate patterns of voxel activity.

  6. Tidal and residual currents measured by an acoustic doppler current profiler at the west end of Carquinez Strait, San Francisco Bay, California, March to November 1988

    USGS Publications Warehouse

    Burau, J.R.; Simpson, M.R.; Cheng, R.T.

    1993-01-01

    Water-velocity profiles were collected at the west end of Carquinez Strait, San Francisco Bay, California, from March to November 1988, using an acoustic Doppler current profiler (ADCP). These data are a series of 10-minute-averaged water velocities collected at 1-meter vertical intervals (bins) in the 16.8-meter water column, beginning 2.1 meters above the estuary bed. To examine the vertical structure of the horizontal water velocities, the data are separated into individual time-series by bin and then used for time-series plots, harmonic analysis, and for input to digital filters. Three-dimensional graphic renditions of the filtered data are also used in the analysis. Harmonic analysis of the time-series data from each bin indicates that the dominant (12.42 hour or M2) partial tidal currents reverse direction near the bottom, on average, 20 minutes sooner than M2 partial tidal currents near the surface. Residual (nontidal) currents derived from the filtered data indicate that currents near the bottom are predominantly up-estuary during the neap tides and down-estuary during the more energetic spring tides.
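
    A minimal sketch of harmonic analysis for a single constituent: the amplitude and phase of the M2 (12.42 h) tide are fitted by least squares to a velocity series from one bin, and the phase difference between a near-bottom and a near-surface bin gives the lead in minutes. The velocities are synthetic and the fit ignores all other constituents.

    ```python
    import numpy as np

    M2_PERIOD_H = 12.42
    omega = 2 * np.pi / M2_PERIOD_H            # rad per hour

    def m2_fit(t_hours, u):
        """Least-squares amplitude and phase of the M2 constituent:
        u(t) ~ mean + A*cos(omega t) + B*sin(omega t)."""
        X = np.column_stack([np.ones_like(t_hours),
                             np.cos(omega * t_hours),
                             np.sin(omega * t_hours)])
        coef, *_ = np.linalg.lstsq(X, u, rcond=None)
        A, B = coef[1], coef[2]
        amp = np.hypot(A, B)
        phase = np.arctan2(-B, A)               # u ~ amp * cos(omega t + phase)
        return amp, phase

    # Synthetic 10-minute velocities for two bins; the bottom bin leads by 20 minutes.
    t = np.arange(0, 30 * 24, 1 / 6.0)          # 30 days, in hours
    surface = 0.8 * np.cos(omega * t)
    bottom = 0.5 * np.cos(omega * (t + 20 / 60.0))
    lag_h = (m2_fit(t, bottom)[1] - m2_fit(t, surface)[1]) / omega
    print(f"bottom leads surface by {lag_h * 60:.0f} minutes")
    ```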

  7. Spatial independent component analysis of functional MRI time-series: to what extent do results depend on the algorithm used?

    PubMed

    Esposito, Fabrizio; Formisano, Elia; Seifritz, Erich; Goebel, Rainer; Morrone, Renato; Tedeschi, Gioacchino; Di Salle, Francesco

    2002-07-01

    Independent component analysis (ICA) has been successfully employed to decompose functional MRI (fMRI) time-series into sets of activation maps and associated time-courses. Several ICA algorithms have been proposed in the neural network literature. Applied to fMRI, these algorithms might lead to different spatial or temporal readouts of brain activation. We compared the two ICA algorithms that have been used so far for spatial ICA (sICA) of fMRI time-series: the Infomax (Bell and Sejnowski [1995]: Neural Comput 7:1004-1034) and the Fixed-Point (Hyvärinen [1999]: Adv Neural Inf Proc Syst 10:273-279) algorithms. We evaluated the Infomax- and Fixed Point-based sICA decompositions of simulated motor, and real motor and visual activation fMRI time-series using an ensemble of measures. Log-likelihood (McKeown et al. [1998]: Hum Brain Mapp 6:160-188) was used as a measure of how significantly the estimated independent sources fit the statistical structure of the data; receiver operating characteristics (ROC) and linear correlation analyses were used to evaluate the algorithms' accuracy of estimating the spatial layout and the temporal dynamics of simulated and real activations; cluster sizing calculations and an estimation of a residual gaussian noise term within the components were used to examine the anatomic structure of ICA components and for the assessment of noise reduction capabilities. Whereas both algorithms produced highly accurate results, the Fixed-Point outperformed the Infomax in terms of spatial and temporal accuracy as long as inferential statistics were employed as benchmarks. Conversely, the Infomax sICA was superior in terms of global estimation of the ICA model and noise reduction capabilities. Because of its adaptive nature, the Infomax approach appears to be better suited to investigate activation phenomena that are not predictable or adequately modelled by inferential techniques. Copyright 2002 Wiley-Liss, Inc.
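
    As a minimal illustration of spatial ICA of an fMRI-like time-series matrix, the sketch below applies scikit-learn's FastICA (a fixed-point algorithm) with voxels as samples and time points as mixtures, recovering spatial maps and their associated time-courses. The data are synthetic, and the Infomax comparison performed in the paper is not reproduced here.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(6)

    # Synthetic "fMRI" data: 120 time points x 2000 voxels, built from two spatial
    # sources (activation maps) mixed by two time-courses, plus noise.
    n_t, n_vox = 120, 2000
    maps = np.zeros((2, n_vox))
    maps[0, 100:200] = 1.0                       # "motor" blob
    maps[1, 1500:1650] = 1.0                     # "visual" blob
    timecourses = np.column_stack([np.sin(np.linspace(0, 12, n_t)),
                                   (np.arange(n_t) % 20 < 10).astype(float)])
    data = timecourses @ maps + 0.3 * rng.standard_normal((n_t, n_vox))

    # Spatial ICA: independent components are spatial maps; the mixing matrix
    # columns are their associated time-courses (fixed-point algorithm).
    ica = FastICA(n_components=2, random_state=0)
    est_maps = ica.fit_transform(data.T).T        # shape (2, n_vox)
    est_timecourses = ica.mixing_                 # shape (n_t, 2)
    print(est_maps.shape, est_timecourses.shape)
    ```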

  8. Assessment of the Seattle Smart Traveler evaluation

    DOT National Transportation Integrated Search

    1999-09-01

    The system was designed using a World Wide Web or Internet interface. Two of the unique features of the design were accommodating the desired travel times and identifying origins and destinations. A search structure was developed using a series of pu...

  9. Laboratory and Modeling Studies of Insect Swarms

    DTIC Science & Technology

    2016-03-10

    Fragmentary excerpt (figure captions and running text): novel laboratory methods allowed, for the first time, precise characterization of swarm properties at the group level; time series for a randomly chosen pair of insects and its continuous wavelet transform (CWT) are shown; wavelet-based time-frequency analysis is used to identify transient interactions, provided they modify the frequency structure of the insect flight.

  10. Non-Markovian properties and multiscale hidden Markovian network buried in single molecule time series

    NASA Astrophysics Data System (ADS)

    Sultana, Tahmina; Takagi, Hiroaki; Morimatsu, Miki; Teramoto, Hiroshi; Li, Chun-Biu; Sako, Yasushi; Komatsuzaki, Tamiki

    2013-12-01

    We present a novel scheme to extract a multiscale state space network (SSN) from single-molecule time series. The multiscale SSN is a type of hidden Markov model that takes into account both multiple states buried in the measurement and memory effects in the process of the observable whenever they exist. Most biological systems function in a nonstationary manner across multiple timescales. Combined with a recently established nonlinear time series analysis based on information theory, a simple scheme is proposed to deal with the properties of multiscale and nonstationarity for a discrete time series. We derived an explicit analytical expression of the autocorrelation function in terms of the SSN. To demonstrate the potential of our scheme, we investigated single-molecule time series of dissociation and association kinetics between epidermal growth factor receptor (EGFR) on the plasma membrane and its adaptor protein Ash/Grb2 (Grb2) in an in vitro reconstituted system. We found that our formula successfully reproduces their autocorrelation function for a wide range of timescales (up to 3 s), and the underlying SSNs change their topographical structure as a function of the timescale; while the corresponding SSN is simple at the short timescale (0.033-0.1 s), the SSN at the longer timescales (0.1 s to ˜3 s) becomes rather complex in order to capture multiscale nonstationary kinetics emerging at longer timescales. It is also found that visiting the unbound form of the EGFR-Grb2 system approximately resets all information of history or memory of the process.

  11. Real Time Search Algorithm for Observation Outliers During Monitoring Engineering Constructions

    NASA Astrophysics Data System (ADS)

    Latos, Dorota; Kolanowski, Bogdan; Pachelski, Wojciech; Sołoducha, Ryszard

    2017-12-01

    Real-time monitoring of engineering structures in case of an emergency or disaster requires the collection of a large amount of data to be processed by specific analytical techniques. A quick and accurate assessment of the state of the object is crucial for a possible rescue action. One of the more significant methods for evaluating large data sets, whether collected during a specified time interval or continuously, is time series analysis. This paper presents a search algorithm for those time series elements that deviate from the values expected during monitoring. Quick and proper detection of observations indicating anomalous behavior of the structure allows a variety of preventive actions to be taken. The mathematical formulae used in the algorithm provide maximal sensitivity for detecting even minimal changes in the object's behavior. Sensitivity analyses were conducted for the moving-average algorithm as well as for the Douglas-Peucker algorithm used in the generalization of linear objects in GIS. In addition to determining the size of deviations from the average, the so-called Hausdorff distance was used. The simulations carried out and the verification against laboratory survey data showed that the approach provides sufficient sensitivity for automatic real-time analysis of large amounts of data obtained from various sensors (total stations, leveling, cameras, radar).
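
    A rough sketch of the two ingredients mentioned above, under assumed parameters: observations are flagged when they deviate from a trailing moving average by more than a robust tolerance, and two monitored displacement traces are compared with SciPy's directed Hausdorff distance. Thresholds and data are illustrative, not the formulae of the paper.

    ```python
    import numpy as np
    from scipy.spatial.distance import directed_hausdorff

    def moving_average_outliers(x, window=10, tol=3.0):
        """Flag points deviating from the trailing moving average by more than
        `tol` robust standard deviations (illustrative thresholding)."""
        x = np.asarray(x, float)
        flags = np.zeros(len(x), dtype=bool)
        for i in range(window, len(x)):
            ref = x[i - window:i]
            scale = 1.4826 * np.median(np.abs(ref - np.median(ref))) + 1e-12
            flags[i] = abs(x[i] - ref.mean()) > tol * scale
        return flags

    rng = np.random.default_rng(7)
    displacement = rng.normal(0, 0.2, 500)        # mm, stable structure
    displacement[350:] += 2.0                      # sudden anomalous shift
    print("first flagged epoch:", np.argmax(moving_average_outliers(displacement)))

    # Hausdorff distance between two monitored 2-D displacement traces.
    trace_a = np.column_stack([np.arange(500), displacement])
    trace_b = np.column_stack([np.arange(500), rng.normal(0, 0.2, 500)])
    print("directed Hausdorff:", directed_hausdorff(trace_a, trace_b)[0])
    ```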

  12. The Effects of Computer-Assisted Instruction Based on Top-Level Structure Method in English Reading and Writing Abilities of Thai EFL Students

    ERIC Educational Resources Information Center

    Jinajai, Nattapong; Rattanavich, Saowalak

    2015-01-01

    This research aims to study the development of ninth grade students' reading and writing abilities and interests in learning English taught through computer-assisted instruction (CAI) based on the top-level structure (TLS) method. An experimental group time series design was used, and the data was analyzed by multivariate analysis of variance…

  13. Influence of lidar, Landsat imagery, disturbance history, plot location accuracy, and plot size on accuracy of imputation maps of forest composition and structure

    Treesearch

    Harold S.J. Zald; Janet L. Ohmann; Heather M. Roberts; Matthew J. Gregory; Emilie B. Henderson; Robert J. McGaughey; Justin Braaten

    2014-01-01

    This study investigated how lidar-derived vegetation indices, disturbance history from Landsat time series (LTS) imagery, plot location accuracy, and plot size influenced accuracy of statistical spatial models (nearest-neighbor imputation maps) of forest vegetation composition and structure. Nearest-neighbor (NN) imputation maps were developed for 539,000 ha in the...

  14. Detecting mortality induced structural and functional changes in a pinon-juniper woodland using Landsat and RapidEye time series

    Treesearch

    Dan J. Krofcheck; Jan U. H. Eitel; Lee A. Vierling; Urs Schulthess; Timothy M. Hilton; Eva Dettweiler-Robinson; Rosemary Pendleton; Marcy E. Litvak

    2014-01-01

    Pinon-juniper (PJ) woodlands have recently undergone dramatic drought-induced mortality, triggering broad scale structural changes in this extensive Southwestern US biome. Given that climate projections for the region suggest widespread conifer mortality is likely to continue into the next century, it is critical to better understand how this climate-induced change in...

  15. Reconstructing Genetic Regulatory Networks Using Two-Step Algorithms with the Differential Equation Models of Neural Networks.

    PubMed

    Chen, Chi-Kan

    2017-07-26

    The identification of genetic regulatory networks (GRNs) provides insights into complex cellular processes. A class of recurrent neural networks (RNNs) captures the dynamics of GRNs. Algorithms combining the RNN and machine learning schemes were proposed to reconstruct small-scale GRNs using gene expression time series. We present new GRN reconstruction methods with neural networks. The RNN is extended to a class of recurrent multilayer perceptrons (RMLPs) with latent nodes. Our methods contain two steps: the edge rank assignment step and the network construction step. The former assigns ranks to all possible edges by a recursive procedure based on the estimated weights of wires of the RNN/RMLP (RE_RNN/RE_RMLP), and the latter constructs a network consisting of top-ranked edges under which the optimized RNN simulates the gene expression time series. Particle swarm optimization (PSO) is applied to optimize the parameters of the RNNs and RMLPs in a two-step algorithm. The proposed RE_RNN-RNN and RE_RMLP-RNN algorithms are tested on synthetic and experimental gene expression time series of small GRNs of about 10 genes. The experimental time series are from studies of yeast cell cycle regulated genes and E. coli DNA repair genes. The unstable estimation of the RNN using experimental time series having limited data points can lead to fairly arbitrary predicted GRNs. Our methods incorporate the RNN and RMLP into a two-step structure learning procedure. Results show that the RE_RMLP, using the RMLP with a suitable number of latent nodes to reduce the parameter dimension, often results in more accurate edge ranks than the RE_RNN using the regularized RNN on short simulated time series. By combining, through a weighted majority voting rule, the networks derived by the RE_RMLP-RNN using different numbers of latent nodes in step one, the method infers the GRN consistently and outperforms published algorithms for GRN reconstruction on most benchmark time series. The framework of two-step algorithms can potentially incorporate different nonlinear differential equation models to reconstruct the GRN.

  16. Kumaraswamy autoregressive moving average models for double bounded environmental data

    NASA Astrophysics Data System (ADS)

    Bayer, Fábio Mariano; Bayer, Débora Missio; Pumi, Guilherme

    2017-12-01

    In this paper we introduce the Kumaraswamy autoregressive moving average models (KARMA), which is a dynamic class of models for time series taking values in the double bounded interval (a,b) following the Kumaraswamy distribution. The Kumaraswamy family of distributions is widely applied in many areas, especially hydrology and related fields. Classical examples are time series representing rates and proportions observed over time. In the proposed KARMA model, the median is modeled by a dynamic structure containing autoregressive and moving average terms, time-varying regressors, unknown parameters and a link function. We introduce the new class of models and discuss conditional maximum likelihood estimation, hypothesis testing inference, diagnostic analysis and forecasting. In particular, we provide closed-form expressions for the conditional score vector and conditional Fisher information matrix. An application to environmental real data is presented and discussed.

  17. Inference for local autocorrelations in locally stationary models.

    PubMed

    Zhao, Zhibiao

    2015-04-01

    For non-stationary processes, the time-varying correlation structure provides useful insights into the underlying model dynamics. We study estimation and inferences for local autocorrelation process in locally stationary time series. Our constructed simultaneous confidence band can be used to address important hypothesis testing problems, such as whether the local autocorrelation process is indeed time-varying and whether the local autocorrelation is zero. In particular, our result provides an important generalization of the R function acf() to locally stationary Gaussian processes. Simulation studies and two empirical applications are developed. For the global temperature series, we find that the local autocorrelations are time-varying and have a "V" shape during 1910-1960. For the S&P 500 index, we conclude that the returns satisfy the efficient-market hypothesis whereas the magnitudes of returns show significant local autocorrelations.
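
    A minimal sketch of a rolling-window estimate of the local lag-1 autocorrelation for a locally stationary series, with a crude pointwise band; the simultaneous confidence band constructed in the paper requires more careful theory and is not reproduced here. The drifting-AR(1) series is synthetic.

    ```python
    import numpy as np

    def local_acf1(x, window=31):
        """Rolling-window lag-1 autocorrelation estimates and a rough pointwise
        95% band (NOT the simultaneous band constructed in the paper)."""
        x = np.asarray(x, float)
        half = window // 2
        est = np.full(len(x), np.nan)
        for t in range(half, len(x) - half):
            seg = x[t - half:t + half + 1]
            seg = seg - seg.mean()
            est[t] = (seg[:-1] @ seg[1:]) / (seg @ seg)
        band = 1.96 / np.sqrt(window)
        return est, band

    # Synthetic locally stationary series: the AR(1) coefficient drifts over time.
    rng = np.random.default_rng(8)
    n = 600
    phi = np.concatenate([np.linspace(0.1, 0.7, n // 2), np.linspace(0.7, 0.1, n - n // 2)])
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi[t] * x[t - 1] + rng.standard_normal()

    est, band = local_acf1(x)
    print("max local ACF(1):", np.nanmax(est), "+/-", band)
    ```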

  18. Predicting the process of extinction in experimental microcosms and accounting for interspecific interactions in single-species time series.

    PubMed

    Ferguson, Jake M; Ponciano, José M

    2014-02-01

    Predicting population extinction risk is a fundamental application of ecological theory to the practice of conservation biology. Here, we compared the prediction performance of a wide array of stochastic, population dynamics models against direct observations of the extinction process from an extensive experimental data set. By varying a series of biological and statistical assumptions in the proposed models, we were able to identify the assumptions that affected predictions about population extinction. We also show how certain autocorrelation structures can emerge due to interspecific interactions, and that accounting for the stochastic effect of these interactions can improve predictions of the extinction process. We conclude that it is possible to account for the stochastic effects of community interactions on extinction when using single-species time series. © 2013 The Authors. Ecology Letters published by John Wiley & Sons Ltd and CNRS.

  19. CU Prime Diversity Workshops: Creating Spaces for Growth Amongst Organizers

    NASA Astrophysics Data System (ADS)

    Hyater-Adams, Simone

    2016-03-01

    CU Prime is a graduate-student-run organization that was created to promote community and inclusion amongst students in the CU Physics Department. With a mission to improve the experiences of students, especially those underrepresented in the department and field, the core organizers developed three programs: a seminar series, a class, and a mentorship program. However, because this is strictly volunteer time for most organizers, there is little time for development and growth as a group. In response, we developed a series of diversity workshops for the group, in order to provide space and time for organizers to reflect on and grapple with difficult issues around diversity and inclusion that are important to think about when running these programs. With a structure based on readings, informal videos, and reflection, there have been five workshops on topics ranging from gender in physics to how to be an ally. We overview the structure and framing of these workshops, the challenges and successes encountered in developing them, and plans for future development.

  20. Approximation methods for combined thermal/structural design

    NASA Technical Reports Server (NTRS)

    Haftka, R. T.; Shore, C. P.

    1979-01-01

    Two approximation concepts for combined thermal/structural design are evaluated. The first concept is an approximate thermal analysis based on the first derivatives of structural temperatures with respect to design variables. Two commonly used first-order Taylor series expansions are examined. The direct and reciprocal expansions are special members of a general family of approximations, and for some conditions other members of that family of approximations are more accurate. Several examples are used to compare the accuracy of the different expansions. The second approximation concept is the use of critical time points for combined thermal and stress analyses of structures with transient loading conditions. Significant time savings are realized by identifying critical time points and performing the stress analysis for those points only. The design of an insulated panel which is exposed to transient heating conditions is discussed.
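
    A worked toy example of the two first-order expansions discussed above: for a response that varies like 1/x (stress-like behavior with respect to a sizing variable), the reciprocal expansion (linear in 1/x) reproduces the function exactly, while the direct expansion (linear in x) degrades away from the expansion point. The function and numbers are illustrative, not taken from the report.

    ```python
    import numpy as np

    # Toy "structural response": stress-like quantity inversely proportional to a
    # sizing variable x (e.g. a thickness); expansion point x0.
    f = lambda x: 100.0 / x
    x0 = 2.0
    f0, df0 = f(x0), -100.0 / x0**2           # value and derivative at x0

    # Direct first-order Taylor expansion:   f ~ f0 + f'(x0) * (x - x0)
    direct = lambda x: f0 + df0 * (x - x0)
    # Reciprocal expansion (linear in 1/x):  f ~ f0 + f'(x0) * (-x0**2) * (1/x - 1/x0)
    reciprocal = lambda x: f0 + df0 * (-x0**2) * (1.0 / x - 1.0 / x0)

    for x in (2.2, 3.0, 4.0):
        print(f"x={x}: exact={f(x):7.2f}  direct={direct(x):7.2f}  reciprocal={reciprocal(x):7.2f}")
    ```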

  1. Comparison of seasonal variability of Aquarius sea surface salinity time series with in situ observations in the Karimata Strait, Indonesia

    NASA Astrophysics Data System (ADS)

    Susanto, R. D.; Setiawan, A.; Zheng, Q.; Sulistyo, B.; Adi, T. R.; Agustiadi, T.; Trenggono, M.; Triyono, T.; Kuswardani, A.

    2016-12-01

    The seasonal variability of the full-lifetime Aquarius sea surface salinity time series, from August 25, 2011 to June 7, 2015, is compared to salinity time series obtained from in situ observations in the Karimata Strait. The Karimata Strait plays dual roles in water exchange between the Pacific and the Indian Ocean. The salinity in the Karimata Strait is strongly affected by seasonal monsoon winds. During the boreal winter monsoon, northwesterly winds draw low salinity water from the South China Sea into the Java Sea and, at the same time, the Java Sea receives an influx of Indian Ocean water via the Sunda Strait. The Java Sea water reduces the main Indonesian throughflow in the Makassar Strait. Conditions are reversed during the summer monsoon. Low salinity water from the South China Sea also controls the vertical structure of water properties in the upper layer of the Makassar Strait and the Lombok Strait. As a part of the South China Sea and Indonesian Seas Transport/Exchange (SITE) program, a trawl-resistant bottom-mounted CTD was deployed in the Karimata Strait from mid-2010 to mid-2016 at a water depth of 40 m. CTD casts during the mooring recoveries and deployments are used to cross-check the bottom salinity data. This in situ salinity time series is compared with various NASA Aquarius salinity products (the level 2 product, the level 3 ascending and descending tracks, and the seven-day rolling average) to check their consistency and correlation and to perform statistical analysis. The preliminary results show that the seasonal cycle of the Aquarius salinity time series has a larger amplitude than that of the in situ data.

  2. Statistical Frequency-Dependent Analysis of Trial-to-Trial Variability in Single Time Series by Recurrence Plots.

    PubMed

    Tošić, Tamara; Sellers, Kristin K; Fröhlich, Flavio; Fedotenkova, Mariia; Beim Graben, Peter; Hutt, Axel

    2015-01-01

    For decades, research in neuroscience has supported the hypothesis that brain dynamics exhibits recurrent metastable states connected by transients, which together encode fundamental neural information processing. To understand the system's dynamics it is important to detect such recurrence domains, but it is challenging to extract them from experimental neuroscience datasets due to the large trial-to-trial variability. The proposed methodology extracts recurrent metastable states in univariate time series by transforming datasets into their time-frequency representations and computing recurrence plots based on instantaneous spectral power values in various frequency bands. Additionally, a new statistical inference analysis compares different trial recurrence plots with corresponding surrogates to obtain statistically significant recurrent structures. This combination of methods is validated by applying it to two artificial datasets. In a final study of visually-evoked Local Field Potentials in partially anesthetized ferrets, the methodology is able to reveal recurrence structures of neural responses with trial-to-trial variability. Focusing on different frequency bands, the δ-band activity is much less recurrent than α-band activity. Moreover, α-activity is susceptible to pre-stimuli, while δ-activity is much less sensitive to pre-stimuli. This difference in recurrence structures in different frequency bands indicates diverse underlying information processing steps in the brain.
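
    A rough sketch of the core computation described above: band-pass filtering, instantaneous spectral power via the Hilbert envelope, and a recurrence matrix obtained by thresholding pairwise distances between power values. The signal, frequency band and threshold are illustrative assumptions, and the surrogate-based statistical inference of the paper is not included.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    rng = np.random.default_rng(9)
    fs = 250.0                                    # sampling rate, Hz
    t = np.arange(0, 4, 1 / fs)
    # Synthetic "LFP": alpha burst riding on noise.
    lfp = np.sin(2 * np.pi * 10 * t) * (t > 1.5) * (t < 2.5) + 0.5 * rng.standard_normal(t.size)

    def band_power(x, lo, hi, fs):
        """Instantaneous spectral power in [lo, hi] Hz via band-pass + Hilbert envelope."""
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        return np.abs(hilbert(filtfilt(b, a, x))) ** 2

    p_alpha = band_power(lfp, 8, 12, fs)

    # Recurrence plot: R[i, j] = 1 if the alpha-band power at times i and j is
    # closer than a threshold (here the 10th percentile of all pairwise distances).
    d = np.abs(p_alpha[:, None] - p_alpha[None, :])
    R = d < np.percentile(d, 10)
    print("recurrence rate:", R.mean())
    ```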

  3. Statistical Frequency-Dependent Analysis of Trial-to-Trial Variability in Single Time Series by Recurrence Plots

    PubMed Central

    Tošić, Tamara; Sellers, Kristin K.; Fröhlich, Flavio; Fedotenkova, Mariia; beim Graben, Peter; Hutt, Axel

    2016-01-01

    For decades, research in neuroscience has supported the hypothesis that brain dynamics exhibits recurrent metastable states connected by transients, which together encode fundamental neural information processing. To understand the system's dynamics it is important to detect such recurrence domains, but it is challenging to extract them from experimental neuroscience datasets due to the large trial-to-trial variability. The proposed methodology extracts recurrent metastable states in univariate time series by transforming datasets into their time-frequency representations and computing recurrence plots based on instantaneous spectral power values in various frequency bands. Additionally, a new statistical inference analysis compares different trial recurrence plots with corresponding surrogates to obtain statistically significant recurrent structures. This combination of methods is validated by applying it to two artificial datasets. In a final study of visually-evoked Local Field Potentials in partially anesthetized ferrets, the methodology is able to reveal recurrence structures of neural responses with trial-to-trial variability. Focusing on different frequency bands, the δ-band activity is much less recurrent than α-band activity. Moreover, α-activity is susceptible to pre-stimuli, while δ-activity is much less sensitive to pre-stimuli. This difference in recurrence structures in different frequency bands indicates diverse underlying information processing steps in the brain. PMID:26834580

  4. Dimension reduction of frequency-based direct Granger causality measures on short time series.

    PubMed

    Siggiridou, Elsa; Kimiskidis, Vasilios K; Kugiumtzis, Dimitris

    2017-09-01

    The mainstream in the estimation of effective brain connectivity relies on Granger causality measures in the frequency domain. If the measure is meant to capture direct causal effects accounting for the presence of other observed variables, as in multi-channel electroencephalograms (EEG), typically the fit of a vector autoregressive (VAR) model on the multivariate time series is required. For short time series of many variables, the estimation of the VAR may not be stable, requiring dimension reduction that results in restricted or sparse VAR models. The restricted VAR obtained by the modified backward-in-time selection method (mBTS) is adapted to the generalized partial directed coherence (GPDC), termed restricted GPDC (RGPDC). Dimension reduction on other frequency-based measures, such as the direct directed transfer function (dDTF), is straightforward. First, a simulation study using linear stochastic multivariate systems is conducted and RGPDC is favorably compared to GPDC on short time series in terms of sensitivity and specificity. Then the two measures are tested for their ability to detect changes in brain connectivity during an epileptiform discharge (ED) from multi-channel scalp EEG. It is shown that RGPDC identifies the connectivity structure of the simulated systems, as well as changes in brain connectivity, better than GPDC, and is less dependent on the free parameter of VAR order. The proposed dimension reduction in frequency measures based on VAR constitutes an appropriate strategy to estimate brain networks reliably within short time windows. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Boolean network identification from perturbation time series data combining dynamics abstraction and logic programming.

    PubMed

    Ostrowski, M; Paulevé, L; Schaub, T; Siegel, A; Guziolowski, C

    2016-11-01

    Boolean networks (and more general logic models) are useful frameworks to study signal transduction across multiple pathways. Logic models can be learned from a prior knowledge network structure and multiplex phosphoproteomics data. However, most efficient and scalable training methods focus on the comparison of two time-points and assume that the system has reached an early steady state. In this paper, we generalize such a learning procedure to take into account the time series traces of phosphoproteomics data in order to discriminate Boolean networks according to their transient dynamics. To that end, we identify a necessary condition that must be satisfied by the dynamics of a Boolean network to be consistent with a discretized time series trace. Based on this condition, we use Answer Set Programming to compute an over-approximation of the set of Boolean networks which fit best with experimental data, and we provide the corresponding encodings. Combined with model-checking approaches, we end up with a global learning algorithm. Our approach is able to learn logic models with a true positive rate higher than 78% in two case studies of mammalian signaling networks; for a larger case study, our method provides optimal answers after 7 min of computation. We quantified the gain in prediction precision of our method compared to learning approaches based on static data. Finally, as an application, our method proposes erroneous time-points in the time series data with respect to the optimal learned logic models. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  6. Non-parametric directionality analysis - Extension for removal of a single common predictor and application to time series.

    PubMed

    Halliday, David M; Senik, Mohd Harizal; Stevenson, Carl W; Mason, Rob

    2016-08-01

    The ability to infer network structure from multivariate neuronal signals is central to computational neuroscience. Directed network analyses typically use parametric approaches based on auto-regressive (AR) models, where networks are constructed from estimates of AR model parameters. However, the validity of using low order AR models for neurophysiological signals has been questioned. A recent article introduced a non-parametric approach to estimate directionality in bivariate data; non-parametric approaches are free from concerns over model validity. We extend the non-parametric framework to include measures of directed conditional independence, using scalar measures that decompose the overall partial correlation coefficient summatively by direction, and a set of functions that decompose the partial coherence summatively by direction. A time domain partial correlation function allows both time and frequency views of the data to be constructed. The conditional independence estimates are conditioned on a single predictor. The framework is applied to simulated cortical neuron networks and mixtures of Gaussian time series data with known interactions. It is applied to experimental data consisting of local field potential recordings from bilateral hippocampus in anaesthetised rats. The framework offers a novel, non-parametric alternative for estimating directed interactions in multivariate neuronal recordings, with increased flexibility in dealing with both spike train and time series data. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. Carbon-dioxide emissions trading and hierarchical structure in worldwide finance and commodities markets.

    PubMed

    Zheng, Zeyu; Yamasaki, Kazuko; Tenenbaum, Joel N; Stanley, H Eugene

    2013-01-01

    In a highly interdependent economic world, the nature of relationships between financial entities is becoming an increasingly important area of study. Recently, many studies have shown the usefulness of minimal spanning trees (MST) in extracting interactions between financial entities. Here, we propose a modified MST network whose metric distance is defined in terms of cross-correlation coefficient absolute values, enabling the connections between anticorrelated entities to manifest properly. We investigate 69 daily time series, comprising three types of financial assets: 28 stock market indicators, 21 currency futures, and 20 commodity futures. We show that though the resulting MST network evolves over time, the financial assets of similar type tend to have connections which are stable over time. In addition, we find a characteristic time lag between the volatility time series of the stock market indicators and those of the EU CO(2) emission allowance (EUA) and crude oil futures (WTI). This time lag is given by the peak of the cross-correlation function of the volatility time series EUA (or WTI) with that of the stock market indicators, and is markedly different (>20 days) from 0, showing that the volatility of stock market indicators today can predict the volatility of EU emissions allowances and of crude oil in the near future.
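
    A minimal sketch of the modified MST construction: a distance d_ij = sqrt(2 * (1 - |rho_ij|)) is built from absolute cross-correlations so that strongly anticorrelated assets are also placed close together, and the minimum spanning tree is extracted with networkx. The returns are synthetic, not the 69 assets analyzed in the paper.

    ```python
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(10)

    # Synthetic daily returns for a handful of assets; assets 0-2 share a factor,
    # asset 3 is anticorrelated with it, asset 4 is independent.
    n_days = 500
    factor = rng.standard_normal(n_days)
    returns = np.column_stack([
        factor + 0.5 * rng.standard_normal(n_days),
        factor + 0.5 * rng.standard_normal(n_days),
        factor + 0.5 * rng.standard_normal(n_days),
        -factor + 0.5 * rng.standard_normal(n_days),
        rng.standard_normal(n_days),
    ])

    corr = np.corrcoef(returns.T)
    # Distance based on the ABSOLUTE correlation, so anticorrelated assets
    # can still end up adjacent in the tree.
    dist = np.sqrt(2.0 * (1.0 - np.abs(corr)))

    G = nx.Graph()
    n_assets = corr.shape[0]
    for i in range(n_assets):
        for j in range(i + 1, n_assets):
            G.add_edge(i, j, weight=dist[i, j])
    mst = nx.minimum_spanning_tree(G)
    print(sorted(mst.edges()))
    ```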

  8. Carbon-dioxide emissions trading and hierarchical structure in worldwide finance and commodities markets

    NASA Astrophysics Data System (ADS)

    Zheng, Zeyu; Yamasaki, Kazuko; Tenenbaum, Joel N.; Stanley, H. Eugene

    2013-01-01

    In a highly interdependent economic world, the nature of relationships between financial entities is becoming an increasingly important area of study. Recently, many studies have shown the usefulness of minimal spanning trees (MST) in extracting interactions between financial entities. Here, we propose a modified MST network whose metric distance is defined in terms of cross-correlation coefficient absolute values, enabling the connections between anticorrelated entities to manifest properly. We investigate 69 daily time series, comprising three types of financial assets: 28 stock market indicators, 21 currency futures, and 20 commodity futures. We show that though the resulting MST network evolves over time, the financial assets of similar type tend to have connections which are stable over time. In addition, we find a characteristic time lag between the volatility time series of the stock market indicators and those of the EU CO2 emission allowance (EUA) and crude oil futures (WTI). This time lag is given by the peak of the cross-correlation function of the volatility time series EUA (or WTI) with that of the stock market indicators, and is markedly different (>20 days) from 0, showing that the volatility of stock market indicators today can predict the volatility of EU emissions allowances and of crude oil in the near future.

  9. A Point Rainfall Generator With Internal Storm Structure

    NASA Astrophysics Data System (ADS)

    Marien, J. L.; Vandewiele, G. L.

    1986-04-01

    A point rainfall generator is a probabilistic model for the time series of rainfall as observed at one geographical point. The main purpose of such a model is to generate long synthetic sequences of rainfall for simulation studies. The present generator is a continuous-time model based on 13.5 years of 10-min point rainfalls observed in Belgium and digitized with a resolution of 0.1 mm. It attempts to model, as accurately as possible, all features of the rainfall time series that are important for flood studies. The original aspects of the model are, on the one hand, the way in which storms are defined and, on the other hand, the theoretical model for the internal storm characteristics. The storm definition has the advantage that the important characteristics of successive storms are fully independent and very precisely modelled, even on time bases as small as 10 min. The model of the internal storm characteristics has a strong theoretical structure, which better justifies extrapolating the model to severe storms, for which data are very sparse. This can be important when using the model to simulate severe flood events.

  10. Field evidences for a Mesozoic palaeo-relief through the northern Tianshan

    NASA Astrophysics Data System (ADS)

    Gumiaux, Charles; Chen, Ke; Augier, Romain; Chen, Yan; Wang, Qingchen

    2010-05-01

    The modern Tianshan mountain belt, located in Central Asia, offers a natural laboratory to study orogenic processes linked with convergent geodynamical settings. Most of the previous studies either focused on the Paleozoic evolution of the range - subductions, arc accretions and continental collision - or on its Cenozoic intra-continental evolution linked with the India-Asia collision. At first order, the finite structure of this range displays a remarkable uprising of Paleozoic "basement" rocks - as a crustal-scale ‘pop-up' - surrounded by two Cenozoic foreland basins. The present-day topography of the Tianshan is traditionally related to the latest intra-continental reactivation of the range. In contrast, the present field study of the northern Tianshan brings new and clear evidence for the existence of a significant relief in this area during Mesozoic times. The investigation zone is about 250 km long, from Wusu to Urumqi, along the northern flank of the Tianshan where the rivers deeply incised the topography. In such valleys, lithologies and structural relationships between Paleozoic basement rocks and Mesozoic and Cenozoic sedimentary series are particularly well exposed along several sections. Jurassic series are mostly characterized by coal-bearing, coarse-grained continental deposits. Within intra-mountain basins, sedimentary breccias with clasts of Carboniferous basement rocks have been locally found at the base of the series. This argues for the presence of a rather proximal palaeo-relief of basement rocks along the range front and the occurrence of proximal intra-mountain basins during the Jurassic. Moreover, while a major thrust is generally invoked between the Jurassic deposits and the Paleozoic units, some of the studied sections show that the Triassic to Jurassic sedimentary series can be followed from the basin to the range. In these cases, the unconformity of the Mesozoic series on top of the Carboniferous basement has been clearly identified locally quite high in the mountain range or even, surprisingly, directly along the northern Tianshan "front" itself. Combining available information from geological maps, field investigations and numerous drilling wells, regional-scale cross-sections have been built. Some of them show "onlap"-type deposition of the Triassic to Jurassic clastic sediments on top of the Paleozoic basement, which was thus significantly sloping down to the north at that time. Our study clearly demonstrates, at different scales, the existence of a major palaeo-relief along the northern Tianshan range during the Mesozoic, and particularly during Jurassic times. Such results are compatible with previous fission-track and sedimentological studies. From this, the Tianshan's uplift and the movements along its northern front structures, which are traditionally assigned to its Cenozoic reactivation, must be reduced. These new results raise questions about the mode and timing of reactivation of structures and about the link between topography and intra-continental collisional settings.

  11. Biological Indicators in Studies of Earthquake Precursors

    NASA Astrophysics Data System (ADS)

    Sidorin, A. Ya.; Deshcherevskii, A. V.

    2012-04-01

    Time series of variations in the electric activity (EA) of four weakly electric fish (Gnathonemus leopoldianus) and in the moving activity (MA) of two catfish (Hoplosternum thoracatum) and two groups of Columbian cockroaches (Blaberus craniifer) were analyzed. The observations were carried out in the Garm region of Tajikistan within the framework of experiments aimed at searching for earthquake precursors. An automatic recording system continuously recorded EA and MA over a period of several years, and hourly mean EA and MA values were processed. Approximately 100 different parameters characterizing different variations in the EA and MA structure were calculated on the basis of the six initial EA and MA time series: the amplitude of the signal and of the fluctuations of activity, parameters of diurnal rhythms, correlated changes in the activity of various biological indicators, and others. A detailed analysis of the statistical structure of the total array of parametric time series obtained in the experiment showed that the behavior of all animals exhibits strong temporal variability: all calculated parameters are unstable and subject to frequent changes. A comparison of the data with seismicity allows us to draw the following conclusions. (1) The structure of variations in the studied parameters is represented by flicker noise or by an even more complex process with permanently changing characteristics; significant statistics are required to prove a cause-and-effect relationship between specific features of such time series and seismicity. (2) Calculation of the statistics of rearrangements in the EA and MA series structure demonstrated an increase in their frequency in the last hours to a few days before an earthquake when the hypocentral distance is comparable to the source size. Sufficiently dramatic anomalies in the behavior of the catfish and cockroaches (changes in the amplitude of activity variation, distortions of diurnal rhythms, increase in the mismatch of coordination between the activity dynamics of one type of biological indicators) were observed in one case, before the November 12, 1987, event at a hypocentral distance of 8 km from the observation point (i.e., the animals were located within the source zone). (3) Changes observed before the earthquakes do not have any specific features and correspond quite well to the variations permanently observed without any relation to the earthquakes. (4) The activity of individual specimens has its own specific features, which hampers the implementation of biological monitoring. (5) The conclusions made here should not be considered absolute or extrapolated to all cases of observation of animal behavior, because the animals were kept under experimental (laboratory) conditions and could have been screened from stimuli of some modalities.

  12. Output-only modal parameter estimator of linear time-varying structural systems based on vector TAR model and least squares support vector machine

    NASA Astrophysics Data System (ADS)

    Zhou, Si-Da; Ma, Yuan-Chen; Liu, Li; Kang, Jie; Ma, Zhi-Sai; Yu, Lei

    2018-01-01

    Identification of time-varying modal parameters contributes to structural health monitoring, fault detection, and vibration control of operational time-varying structural systems. It is, however, a challenging task, because no more information is available for identifying time-varying systems than for time-invariant ones. This paper presents a modal parameter estimator for linear time-varying structural systems, based on a vector time-dependent autoregressive model and a least squares support vector machine, for the case of output-only measurements. To reduce the computational cost, a Wendland's compactly supported radial basis function is used to achieve sparsity of the Gram matrix. A Gamma-test-based non-parametric approach to selecting the regularization factor is adopted for the proposed estimator to replace the time-consuming n-fold cross-validation. A series of numerical examples illustrates the advantages of the proposed modal parameter estimator in suppressing overestimation and in handling short data records. A laboratory experiment further validates the proposed estimator.
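
    A minimal sketch of the core idea of a time-dependent AR model: the time-varying coefficients are expanded in a basis and estimated by ordinary least squares. The Gaussian basis and plain least squares here are simplifying assumptions standing in for the paper's Wendland kernels and LS-SVM machinery.

    ```python
    import numpy as np

    def fit_tvar(y, order=2, n_basis=8):
        """Fit y[t] = sum_i a_i(t) * y[t-i] + e[t], with a_i(t) in a Gaussian basis."""
        n = len(y)
        centers = np.linspace(0, n - 1, n_basis)
        width = (n - 1) / (n_basis - 1)
        t = np.arange(order, n)
        basis = np.exp(-0.5 * ((t[:, None] - centers[None, :]) / width) ** 2)  # (n-order, n_basis)
        # Regressors: lagged samples multiplied by basis functions -> (n-order, order*n_basis).
        lags = np.column_stack([y[order - i: n - i] for i in range(1, order + 1)])
        X = (lags[:, :, None] * basis[:, None, :]).reshape(len(t), -1)
        coef, *_ = np.linalg.lstsq(X, y[order:], rcond=None)
        coef = coef.reshape(order, n_basis)
        a_t = basis @ coef.T        # time-varying AR coefficients, shape (n-order, order)
        return a_t
    ```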

  13. β Lup, δ Lup, and τ^1 Lup observed by BRITE-Constellation

    NASA Astrophysics Data System (ADS)

    Cugier, H.; Pigulski, A.

    2017-09-01

    Time-series analysis of BRITE-Constellation photometry of β Lup, δ Lup and τ^1 Lup revealed 16, 22 and four pulsation modes, respectively. An attempt to constrain the internal structure of these stars via seismic modelling was also made.

  14. System Complexity Reduction via Feature Selection

    ERIC Educational Resources Information Center

    Deng, Houtao

    2011-01-01

    This dissertation transforms a set of system complexity reduction problems to feature selection problems. Three systems are considered: classification based on association rules, network structure learning, and time series classification. Furthermore, two variable importance measures are proposed to reduce the feature selection bias in tree…

  15. Extension of classical hydrological risk analysis to non-stationary conditions due to climate change - application to the Fulda catchment, Germany

    NASA Astrophysics Data System (ADS)

    Fink, G.; Koch, M.

    2010-12-01

    An important aspect of water resources and hydrological engineering is the assessment of hydrological risk due to the occurrence of extreme events, e.g. droughts or floods. When dealing with the latter - as is the focus here - the classical methods of flood frequency analysis (FFA) are usually used for the proper dimensioning of a hydraulic structure, with the purpose of bringing the flood risk down to an acceptable level. FFA is based on extreme value statistics. Despite the progress of methods in this scientific branch, the development, selection, and fitting of an appropriate distribution function still remains a challenge, particularly when certain underlying assumptions of the theory are not met in real applications. This is, for example, the case when the stationarity condition for a random flood time series is no longer satisfied, as may be the situation when long-term hydrological impacts of future climate change are to be considered. The objective here is to verify the applicability of classical (stationary) FFA to predicted flood time series in the Fulda catchment in central Germany, as they may occur in the wake of climate change during the 21st century. These discharge time series at the outlet of the Fulda basin have been simulated with a distributed hydrological model (SWAT) forced by predicted climate variables of a regional climate model for Germany (REMO). From the simulated future daily time series, annual maximum (extreme) values are computed and analyzed for the purpose of risk evaluation. Although the estimated 21st-century extreme flood series of the Fulda river turn out to be only mildly non-stationary, seemingly alleviating the need for further action and concern, a more detailed analysis of the risk, as quantified, for example, by the return period, shows non-negligible differences in the calculated risk levels. This could be verified by employing a new method, the so-called flood series maximum analysis (FSMA) method, which consists in the stochastic simulation of numerous trajectories of a stochastic process with a given GEV distribution over a certain length of time (larger than the desired return period). The maximum value of each trajectory is then computed, and all maxima are used to determine the empirical distribution of this maximum series. Through graphical inversion of this distribution function, the size of the design flood for a given risk (quantile) and a given life duration can be inferred. The results of numerous simulations show that for stationary flood series the new FSMA method yields, as expected, nearly identical risk values to the classical FFA approach. However, once the flood time series becomes slightly non-stationary - for the reasons discussed - and regardless of whether the trend is increasing or decreasing, large differences in the computed risk values for a given design flood occur. In other words, for the same risk, the new FSMA method would lead to different values of the design flood for a hydraulic structure than the classical FFA method. This, in turn, could lead to cost savings in the realization of a hydraulic project.
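
    A minimal sketch of the FSMA idea under stated assumptions: simulate many trajectories of annual maxima from a fitted GEV distribution, take the maximum of each trajectory, and read a design value off the empirical distribution of those maxima. The GEV parameter values below are placeholders, not values from the Fulda study.

    ```python
    import numpy as np
    from scipy.stats import genextreme

    def fsma_design_flood(shape, loc, scale, life_years=100, risk=0.1, n_traj=100_000, rng=None):
        """Design flood not exceeded with probability (1 - risk) over the project life."""
        rng = np.random.default_rng(rng)
        # Each trajectory is life_years of annual maxima drawn from the fitted GEV.
        traj = genextreme.rvs(shape, loc=loc, scale=scale,
                              size=(n_traj, life_years), random_state=rng)
        traj_max = traj.max(axis=1)               # maximum flood of each trajectory
        return np.quantile(traj_max, 1.0 - risk)  # invert the empirical distribution

    # Example with placeholder GEV parameters (scipy's shape convention):
    q_design = fsma_design_flood(shape=-0.1, loc=300.0, scale=80.0, life_years=50, risk=0.05)
    ```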

  16. Change Point Detection in Correlation Networks

    NASA Astrophysics Data System (ADS)

    Barnett, Ian; Onnela, Jukka-Pekka

    2016-01-01

    Many systems of interacting elements can be conceptualized as networks, where network nodes represent the elements and network ties represent interactions between the elements. In systems where the underlying network evolves, it is useful to determine the points in time where the network structure changes significantly as these may correspond to functional change points. We propose a method for detecting change points in correlation networks that, unlike previous change point detection methods designed for time series data, requires minimal distributional assumptions. We investigate the difficulty of change point detection near the boundaries of the time series in correlation networks and study the power of our method and competing methods through simulation. We also show the generalizable nature of the method by applying it to stock price data as well as fMRI data.
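
    A minimal generic sketch of the underlying idea - compare correlation matrices estimated before and after a candidate change point and assess the difference by permutation - which stands in for, but is not, the authors' specific test statistic.

    ```python
    import numpy as np

    def corr_change_stat(X, tau):
        """Frobenius-norm difference between correlation matrices before/after index tau."""
        c1 = np.corrcoef(X[:tau], rowvar=False)
        c2 = np.corrcoef(X[tau:], rowvar=False)
        return np.linalg.norm(c1 - c2, ord="fro")

    def permutation_pvalue(X, tau, n_perm=500, rng=None):
        """P-value for a change at tau, permuting time points to break any change."""
        rng = np.random.default_rng(rng)
        observed = corr_change_stat(X, tau)
        null = [corr_change_stat(rng.permutation(X), tau) for _ in range(n_perm)]
        return (1 + sum(s >= observed for s in null)) / (1 + n_perm)
    ```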

  17. Review of current GPS methodologies for producing accurate time series and their error sources

    NASA Astrophysics Data System (ADS)

    He, Xiaoxing; Montillet, Jean-Philippe; Fernandes, Rui; Bos, Machiel; Yu, Kegen; Hua, Xianghong; Jiang, Weiping

    2017-05-01

    The Global Positioning System (GPS) is an important tool to observe and model geodynamic processes such as plate tectonics and post-glacial rebound. In the last three decades, GPS has seen tremendous advances in the precision of the measurements, which allow researchers to study geophysical signals through a careful analysis of daily time series of GPS receiver coordinates. However, the GPS observations contain errors, and the time series can be described as the sum of a real signal and noise. The signal itself can again be divided into station displacements due to geophysical causes and to disturbing factors. Examples of the latter are errors in the realization and stability of the reference frame and corrections due to ionospheric and tropospheric delays and GPS satellite orbit errors. There is an increasing demand for detecting millimeter to sub-millimeter level ground displacement signals in order to further understand regional-scale geodetic phenomena, which requires further improvements in the sensitivity of the GPS solutions. This paper provides a review spanning over 25 years of advances in processing strategies, error mitigation methods and noise modeling for the processing and analysis of GPS daily position time series. The processing of the observations is described step by step, mainly for three different strategies, in order to explain the weaknesses and strengths of the existing methodologies. In particular, we focus on the choice of the stochastic model of the GPS time series, which directly affects the estimation of the functional model including, for example, tectonic rates, seasonal signals and co-seismic offsets. Moreover, the geodetic community continues to develop computational methods to fully automate all phases of GPS time series analysis. This idea is greatly motivated by the large number of GPS receivers installed around the world for diverse applications, ranging from surveying small deformations of civil engineering structures (e.g., subsidence of a highway bridge) to the detection of particular geophysical signals.
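
    A minimal sketch of fitting the functional model mentioned above (linear rate plus annual and semi-annual seasonal terms) to a daily position time series by least squares; co-seismic offsets and the coloured-noise stochastic model discussed in the review are deliberately omitted.

    ```python
    import numpy as np

    def fit_gps_functional_model(t_years, pos_mm):
        """Estimate intercept, rate, and annual/semi-annual seasonal terms."""
        w = 2.0 * np.pi  # one cycle per year
        A = np.column_stack([
            np.ones_like(t_years, dtype=float), t_years,
            np.cos(w * t_years), np.sin(w * t_years),          # annual
            np.cos(2 * w * t_years), np.sin(2 * w * t_years),  # semi-annual
        ])
        params, *_ = np.linalg.lstsq(A, pos_mm, rcond=None)
        residuals = pos_mm - A @ params
        return params, residuals  # params[1] is the secular rate in mm/yr
    ```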

  18. On pads and filters: Processing strong-motion data

    USGS Publications Warehouse

    Boore, D.M.

    2005-01-01

    Processing of strong-motion data in many cases can be as straightforward as filtering the acceleration time series and integrating to obtain velocity and displacement. To avoid the introduction of spurious low-frequency noise in quantities derived from the filtered accelerations, however, care must be taken to append zero pads of adequate length to the beginning and end of the segment of recorded data. These padded sections of the filtered acceleration need to be retained when deriving velocities, displacements, Fourier spectra, and response spectra. In addition, these padded and filtered sections should also be included in the time series used in the dynamic analysis of structures and soils to ensure compatibility with the filtered accelerations.
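
    A minimal sketch, under stated assumptions, of the pad-filter-integrate sequence described here: a fourth-order acausal Butterworth high-pass is chosen as the filter, and the pad length follows the commonly quoted rule of thumb rather than a value taken from this report.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt
    from scipy.integrate import cumulative_trapezoid

    def process_strong_motion(acc, dt, fc=0.05, order=4):
        """Zero-pad, high-pass filter, and integrate an acceleration record."""
        n_pad = int(np.ceil(1.5 * order / (fc * dt)))   # common rule of thumb for acausal filters
        padded = np.concatenate([np.zeros(n_pad), acc, np.zeros(n_pad)])
        b, a = butter(order, fc, btype="highpass", fs=1.0 / dt)
        acc_f = filtfilt(b, a, padded)                  # keep the padded sections
        vel = cumulative_trapezoid(acc_f, dx=dt, initial=0.0)
        disp = cumulative_trapezoid(vel, dx=dt, initial=0.0)
        return acc_f, vel, disp
    ```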

  19. Reconstruction method for data protection in telemedicine systems

    NASA Astrophysics Data System (ADS)

    Buldakova, T. I.; Suyatinov, S. I.

    2015-03-01

    This report proposes an approach to protecting transmitted data by creating paired symmetric keys for the sensor and the receiver. Since biosignals are unique to each person, appropriate processing of them yields the information needed to create cryptographic keys. The processing is based on reconstruction of a mathematical model that generates time series diagnostically equivalent to the initial biosignals. Information about the model is transmitted to the receiver, where the physiological time series are restored using the reconstructed model. Thus, the information about the structure and parameters of the biosystem model obtained in the reconstruction process can be used not only for diagnostics but also for protecting transmitted data in telemedicine systems.

  20. Aeroelastic Response of Nonlinear Wing Section By Functional Series Technique

    NASA Technical Reports Server (NTRS)

    Marzocca, Piergiovanni; Librescu, Liviu; Silva, Walter A.

    2000-01-01

    This paper addresses the problem of the determination of the subcritical aeroelastic response and flutter instability of nonlinear two-dimensional lifting surfaces in an incompressible flow-field via indicial functions and Volterra series approach. The related aeroelastic governing equations are based upon the inclusion of structural and damping nonlinearities in plunging and pitching, of the linear unsteady aerodynamics and consideration of an arbitrary time-dependent external pressure pulse. Unsteady aeroelastic nonlinear kernels are determined, and based on these, frequency and time histories of the subcritical aeroelastic response are obtained, and in this context the influence of the considered nonlinearities is emphasized. Conclusions and results displaying the implications of the considered effects are supplied.

  1. Aeroelastic Response of Nonlinear Wing Section by Functional Series Technique

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Marzocca, Piergiovanni

    2001-01-01

    This paper addresses the problem of the determination of the subcritical aeroelastic response and flutter instability of nonlinear two-dimensional lifting surfaces in an incompressible flow-field via indicial functions and Volterra series approach. The related aeroelastic governing equations are based upon the inclusion of structural and damping nonlinearities in plunging and pitching, of the linear unsteady aerodynamics and consideration of an arbitrary time-dependent external pressure pulse. Unsteady aeroelastic nonlinear kernels are determined, and based on these, frequency and time histories of the subcritical aeroelastic response are obtained, and in this context the influence of the considered nonlinearities is emphasized. Conclusions and results displaying the implications of the considered effects are supplied.

  2. Modified DFA and DCCA approach for quantifying the multiscale correlation structure of financial markets

    NASA Astrophysics Data System (ADS)

    Yin, Yi; Shang, Pengjian

    2013-12-01

    We use multiscale detrended fluctuation analysis (MSDFA) and multiscale detrended cross-correlation analysis (MSDCCA) to investigate auto-correlation (AC) and cross-correlation (CC) in the US and Chinese stock markets during 1997-2012. The results show that US and Chinese stock indices differ in terms of their multiscale AC structures. Stock indices in the same region also differ with regard to their multiscale AC structures. We analyze AC and CC behaviors among indices for the same region to determine the similarity among six stock indices and divide them into four groups accordingly. We choose the S&P500, NQCI, HSI, and the Shanghai Composite Index as representative samples for simplicity. MSDFA and MSDCCA results and average MSDFA spectra of local scaling exponents (LSEs) for the individual series are presented. We find that the MSDCCA LSE spectrum for the CC between two time series generally tends to be greater than the average of the MSDFA LSE spectra of the individual series. We obtain detailed multiscale structures and relations for the CC between the four representatives. MSDFA and MSDCCA with secant rolling windows of different sizes are then applied to reanalyze the AC and CC, and vertical and horizontal comparisons across different window sizes are made. The MSDFA and MSDCCA results for the original window size are confirmed, and some new interesting characteristics and conclusions regarding multiscale correlation structures are obtained.
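
    A minimal sketch of standard DFA, the building block of the multiscale (windowed) variants used here; first-order polynomial detrending and logarithmically spaced scales are common but not paper-specific choices.

    ```python
    import numpy as np

    def dfa(x, scales=None):
        """Detrended fluctuation analysis: returns scales, F(s), and the scaling exponent."""
        x = np.asarray(x, dtype=float)
        profile = np.cumsum(x - x.mean())              # integrated (profile) series
        if scales is None:
            scales = np.unique(np.logspace(np.log10(8), np.log10(len(x) // 4), 20).astype(int))
        fluct = []
        for s in scales:
            n_seg = len(profile) // s
            segs = profile[: n_seg * s].reshape(n_seg, s)
            t = np.arange(s)
            f2 = []
            for seg in segs:
                trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrending
                f2.append(np.mean((seg - trend) ** 2))
            fluct.append(np.sqrt(np.mean(f2)))
        alpha = np.polyfit(np.log(scales), np.log(fluct), 1)[0]
        return scales, np.array(fluct), alpha
    ```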

  3. Nonlinear Behavior of the Geomagnetic Fluctuations Recorded in Different Geomagnetic Latitudes

    NASA Astrophysics Data System (ADS)

    Kovacs, P.; Heilig, B.; Koppan, A.; Vadasz, G.; Echim, M.

    2014-12-01

    The paper concerns the nonlinear properties of geomagnetic variations recorded at different geomagnetic latitudes, in years of solar maximum and minimum. For the study, we use the geomagnetic time series recorded by some of the stations of the EMMA quasi-meridional magnetometer network, established for pulsation studies in September 2001. The stations are located approximately along the 100-degree magnetic meridian, and the sampling frequency of the series is 1 Hz. It is argued that the geomagnetic field exhibits nonlinear intermittent fluctuations within a certain range of temporal scales. To quantitatively investigate the scaling ranges and the variation of the intermittent properties with latitude and time, we analyse the higher-order moments of the time records (probability density function or structure function analyses). The multifractal or self-similar scaling of the fluctuations is investigated by fitting the P-model to the structure function scaling exponents. We also study the power-law behaviour of the power spectral density functions of the series in order to evaluate the possible inertial frequency (and temporal) range of the geomagnetic field and compare it with the scaling ranges of the structure functions. The range where intermittent geomagnetic variation is found typically falls between 100 and 20,000 s, i.e. it covers the temporal range of the main phases of geomagnetic storms. It is shown that the intensity of intermittent fluctuations increases from solar minimum to solar maximum. The expected increase in the level of intermittency with geomagnetic latitude can be evidenced only in the years of solar minimum. The research leading to these results has received funding from the European Community's Seventh Framework Programme ([FP7/2007-2013]) under grant agreement n° 313038/STORM.
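
    A minimal sketch of a structure function analysis as referred to above: q-th order moments of increments over a range of lags, with scaling exponents obtained from log-log fits. The choice of lags and orders is illustrative.

    ```python
    import numpy as np

    def structure_functions(x, lags=None, orders=(1, 2, 3, 4)):
        """Return S_q(tau) = <|x(t+tau) - x(t)|^q> and the fitted scaling exponents zeta(q)."""
        x = np.asarray(x, dtype=float)
        if lags is None:
            lags = np.unique(np.logspace(0, np.log10(len(x) // 10), 20).astype(int))
        S = np.array([[np.mean(np.abs(x[tau:] - x[:-tau]) ** q) for tau in lags] for q in orders])
        zeta = [np.polyfit(np.log(lags), np.log(S[i]), 1)[0] for i, _ in enumerate(orders)]
        return lags, S, np.array(zeta)
    ```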

  4. Taurus II Stage Test Simulations: Using Large-Scale CFD Simulations to Provide Critical Insight into Plume Induced Environments During Design

    NASA Technical Reports Server (NTRS)

    Struzenberg, L. L.; West, J. S.

    2011-01-01

    This paper describes the use of targeted Loci/CHEM CFD simulations to evaluate the effects of a dual-engine first-stage hot-fire test on an evolving integrated launch pad/test article design. This effort was undertaken as part of the NESC Independent Assessment of the Taurus II Stage Test Series. The underlying conceptual model included the development of a series of computational models and simulations to analyze the plume-induced environments on the pad, facility structures, and test article. A pathfinder simulation was first developed, capable of providing a quick-turnaround evaluation of plume impingement pressures on the flame deflector. Results from this simulation were available in time to provide data for an ongoing structural assessment of the deflector, and the resulting recommendation was incorporated in a timely manner into the construction schedule for the new launch stand under construction at Wallops Flight Facility. A series of Reynolds-Averaged Navier-Stokes (RANS) quasi-steady simulations, representative of various key elements of the test profile, was performed to identify potential concerns with the test configuration and test profile. As required, unsteady hybrid RANS/LES simulations were performed to provide additional insight into critical aspects of the test sequence. Modifications to the test-specific hardware, to the thermal protection of facility structures, and to the planned hot-fire test profile were implemented based on these simulation results.

  5. Studies on the wintertime current structure and T-S fine-structure in the Taiwan Strait

    NASA Astrophysics Data System (ADS)

    Hu, Jianyu; Fu, Zilang; Wu, Lianxing

    1990-12-01

    A cruise through the western sea area of the Taiwan Strait was carried out by the R/V Dong Fang Hong in December 1987. Eight anchored and 10 unanchored stations were set up. Over 25 time-series current observations were made at each station, and CTD (conductivity-temperature-depth) measurements were made at 5 anchored and 10 unanchored stations. Based on the measured data, fine structures and step-like vertical structures of temperature and salinity were analysed, and a tentative wintertime current structure in the Taiwan Strait was described.

  6. On the characterization of vegetation recovery after fire disturbance using Fisher-Shannon analysis and SPOT/VEGETATION Normalized Difference Vegetation Index (NDVI) time series

    NASA Astrophysics Data System (ADS)

    Lasaponara, Rosa; Lanorte, Antonio; Lovallo, Michele; Telesca, Luciano

    2015-04-01

    Time series can fruitfully support fire monitoring and management, from statistical analysis of fire occurrence (Tuia et al. 2008) to danger estimation (Lasaponara 2005), damage evaluation (Lanorte et al. 2014) and post-fire recovery (Lanorte et al. 2014). In this paper, the time dynamics of SPOT-VEGETATION Normalized Difference Vegetation Index (NDVI) time series are analyzed using the statistical approach of the Fisher-Shannon (FS) information plane to assess and monitor vegetation recovery after fire disturbance. Fisher-Shannon information plane analysis allows us to gain insight into the complex structure of a time series and to quantify its degree of organization and order. The analysis was carried out using 10-day Maximum Value Composites of NDVI (MVC-NDVI) with a 1 km × 1 km spatial resolution. The investigation was performed on two test sites located in Galicia (northern Spain) and the Peloponnese (southern Greece), selected for the vast fires which occurred there during the summers of 2006 and 2007 and for their different vegetation covers, made up mainly of low shrubland at the Galicia test site and evergreen forest in the Peloponnese. Time series of MVC-NDVI were analyzed before and after the occurrence of the fire events. Results obtained for both investigated areas clearly point out that the dynamics of the pixel time series before the fire are characterized by a larger degree of disorder and uncertainty, while the pixel time series after the fire show a higher degree of organization and order. This discrimination is more evident for the Peloponnese fire than for the Galicia fire, which suggests a clear possibility of discriminating the different post-fire behaviors and dynamics exhibited by the different vegetation covers. References: Lanorte A, Lasaponara R, Lovallo M, Telesca L (2014) Fisher-Shannon information plane analysis of SPOT/VEGETATION Normalized Difference Vegetation Index (NDVI) time series to characterize vegetation recovery after fire disturbance. International Journal of Applied Earth Observation and Geoinformation 26, 441-446. Lanorte A, Danese M, Lasaponara R, Murgante B (2014) Multiscale mapping of burn area and severity using multisensor satellite data and spatial autocorrelation analysis. International Journal of Applied Earth Observation and Geoinformation 20, 42-51. Tuia D, Ratle F, Lasaponara R, Telesca L, Kanevski M (2008) Scan statistics analysis of forest fire clusters. Communications in Nonlinear Science and Numerical Simulation 13(8), 1689-1694. Telesca L, Lasaponara R (2006) Pre and post fire behavioral trends revealed in satellite NDVI time series. Geophysical Research Letters 33(14). Lasaponara R (2005) Intercomparison of AVHRR based fire susceptibility indicators for the Mediterranean ecosystems of southern Italy. International Journal of Remote Sensing 26(5), 853-870.
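
    A minimal sketch of the Fisher-Shannon information plane quantities (Fisher information measure and Shannon entropy power) estimated from a kernel density of the time-series values; the Gaussian KDE and grid integration are illustrative assumptions rather than the authors' exact estimator.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    def fisher_shannon(x, n_grid=1024):
        """Return (Fisher information measure, Shannon entropy power) for a 1-D series."""
        x = np.asarray(x, dtype=float)
        grid = np.linspace(x.min() - 3 * x.std(), x.max() + 3 * x.std(), n_grid)
        f = np.clip(gaussian_kde(x)(grid), 1e-300, None)   # density estimate on a grid
        df = np.gradient(f, grid)
        fim = np.trapz(df ** 2 / f, grid)                  # Fisher information measure
        H = -np.trapz(f * np.log(f), grid)                 # differential Shannon entropy
        N = np.exp(2.0 * H) / (2.0 * np.pi * np.e)         # Shannon entropy power
        return fim, N
    ```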

  7. Simple Deterministically Constructed Recurrent Neural Networks

    NASA Astrophysics Data System (ADS)

    Rodan, Ali; Tiňo, Peter

    A large number of models for time series processing, forecasting or modeling follows a state-space formulation. Models in the specific class of state-space approaches referred to as Reservoir Computing fix their state-transition function. The state space with the associated state-transition structure forms a reservoir, which is supposed to be sufficiently complex so as to capture a large number of features of the input stream that can be potentially exploited by the reservoir-to-output readout mapping. The largely "black box" character of reservoirs prevents us from performing a deeper theoretical investigation of the dynamical properties of successful reservoirs. Reservoir construction is largely driven by a series of (more-or-less) ad-hoc randomized model building stages, with both researchers and practitioners having to rely on trial and error. We show that a very simple deterministically constructed reservoir with simple cycle topology gives performance comparable to that of the Echo State Network (ESN) on a number of time series benchmarks. Moreover, we argue that the memory capacity of such a model can be made arbitrarily close to the proven theoretical limit.
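
    A minimal sketch of a deterministically constructed reservoir with simple cycle topology and a ridge-regression readout, in the spirit of the model described here; the specific weight values, the alternating input signs, and the ridge penalty are illustrative assumptions.

    ```python
    import numpy as np

    def simple_cycle_reservoir(u, y, n_res=100, r=0.9, v=0.5, ridge=1e-6):
        """Train a cycle-topology reservoir: x[t+1] = tanh(W x[t] + w_in u[t])."""
        W = np.zeros((n_res, n_res))
        W[np.arange(1, n_res), np.arange(n_res - 1)] = r   # single cycle with weight r
        W[0, n_res - 1] = r                                 # close the cycle
        w_in = v * np.where(np.arange(n_res) % 2 == 0, 1.0, -1.0)  # deterministic +/- input signs
        x = np.zeros(n_res)
        states = []
        for ut in u:
            x = np.tanh(W @ x + w_in * ut)
            states.append(x.copy())
        X = np.array(states)
        # Ridge-regression readout mapping reservoir states to targets y.
        w_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
        return W, w_in, w_out
    ```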

  8. A Comparison of Pseudo-Maximum Likelihood and Asymptotically Distribution-Free Dynamic Factor Analysis Parameter Estimation in Fitting Covariance Structure Models to Block-Toeplitz Matrices Representing Single-Subject Multivariate Time-Series.

    ERIC Educational Resources Information Center

    Molenaar, Peter C. M.; Nesselroade, John R.

    1998-01-01

    Pseudo-Maximum Likelihood (p-ML) and Asymptotically Distribution Free (ADF) estimation methods for estimating dynamic factor model parameters within a covariance structure framework were compared through a Monte Carlo simulation. Both methods appear to give consistent model parameter estimates, but only ADF gives standard errors and chi-square…

  9. Dealing with Multiple Solutions in Structural Vector Autoregressive Models.

    PubMed

    Beltz, Adriene M; Molenaar, Peter C M

    2016-01-01

    Structural vector autoregressive models (VARs) hold great potential for psychological science, particularly for time series data analysis. They capture the magnitude, direction of influence, and temporal (lagged and contemporaneous) nature of relations among variables. Unified structural equation modeling (uSEM) is an optimal structural VAR instantiation, according to large-scale simulation studies, and it is implemented within an SEM framework. However, little is known about the uniqueness of uSEM results. Thus, the goal of this study was to investigate whether multiple solutions result from uSEM analysis and, if so, to demonstrate ways to select an optimal solution. This was accomplished with two simulated data sets, an empirical data set concerning children's dyadic play, and modifications to the group iterative multiple model estimation (GIMME) program, which implements uSEMs with group- and individual-level relations in a data-driven manner. Results revealed multiple solutions when there were large contemporaneous relations among variables. Results also verified several ways to select the correct solution when the complete solution set was generated, such as the use of cross-validation, maximum standardized residuals, and information criteria. This work has immediate and direct implications for the analysis of time series data and for the inferences drawn from those data concerning human behavior.

  10. The crystal structure of lueshite at 298 K resolved by high-resolution time-of-flight neutron powder diffraction

    NASA Astrophysics Data System (ADS)

    Mitchell, Roger H.; Kennedy, Brendan J.; Knight, Kevin S.

    2018-01-01

    Refinement of time-of-flight high-resolution neutron powder diffraction data for lueshite (Na,Ca)(Nb,Ta,Ti)O3, the natural analogue of synthetic NaNbO3, demonstrates that lueshite at room temperature (298 K) adopts an orthorhombic structure with a 2ap × 2ap × 4ap superlattice described by space group Pmmn [#59: a = 7.8032(4) Å; b = 7.8193(4) Å; c = 15.6156(9) Å]. This structure is analogous to that of phase S of synthetic NaNbO3 observed at 753-783 K (480-510 °C). In common with synthetic NaNbO3, lueshite exhibits a series of phase transitions with decreasing temperature from a cubic (Pm-3m) aristotype through tetragonal (P4/mbm) and orthorhombic (Cmcm) structures. However, the further sequence of phase transitions differs in that for lueshite the series terminates with the room-temperature S (Pmmn) phase, and the R (Pmmn or Pnma) and P (Pbcm) phases of NaNbO3 are not observed. The appearance of the S phase in lueshite at a lower temperature, relative to that of NaNbO3, is attributable to the effects of solid solution of Ti, Ta and Ca in lueshite.

  11. Zernike Phase Contrast Electron Cryo-Tomography Applied to Marine Cyanobacteria Infected with Cyanophages

    PubMed Central

    Dai, Wei; Fu, Caroline; Khant, Htet A.; Ludtke, Steven J.; Schmid, Michael F.; Chiu, Wah

    2015-01-01

    Advances in electron cryo-tomography have provided a new opportunity to visualize the internal 3D structures of a bacterium. An electron microscope equipped with Zernike phase contrast optics produces images with dramatically increased contrast compared to images obtained by conventional electron microscopy. Here we describe a protocol to apply Zernike phase plate technology for acquiring electron tomographic tilt series of cyanophage-infected cyanobacterial cells embedded in ice, without staining or chemical fixation. We detail the procedures for aligning and assessing phase plates for data collection, and methods to obtain 3D structures of cyanophage assembly intermediates in the host, by subtomogram alignment, classification and averaging. Acquiring three to four tomographic tilt series takes approximately 12 h on a JEM2200FS electron microscope. We expect this time requirement to decrease substantially as the technique matures. Time required for annotation and subtomogram averaging varies widely depending on the project goals and data volume. PMID:25321408

  12. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it, an improved and generalized version of Bayesian Blocks [Scargle 1998], that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by [Arias-Castro, Donoho and Huo 2003]. In the spirit of Reproducible Research [Donoho et al. (2008)] all of the code and data necessary to reproduce all of the figures in this paper are included as auxiliary material.
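
    A minimal sketch of the Bayesian Blocks dynamic programming for event (point) data, following the structure of the published algorithm. The constant prior on the number of change points (ncp_prior) is a placeholder that would normally be set from a false-alarm probability, and distinct event times are assumed.

    ```python
    import numpy as np

    def bayesian_blocks_events(t, ncp_prior=4.0):
        """Optimal piecewise-constant segmentation of event times t (dynamic programming)."""
        t = np.sort(np.asarray(t, dtype=float))
        n = len(t)
        # Candidate block edges: midpoints between events, plus the data span ends.
        edges = np.concatenate([[t[0]], 0.5 * (t[1:] + t[:-1]), [t[-1]]])
        best = np.zeros(n)
        last = np.zeros(n, dtype=int)
        for k in range(n):
            widths = edges[k + 1] - edges[: k + 1]     # width of each candidate last block
            counts = np.arange(k + 1, 0, -1)           # events in each candidate last block
            fitness = counts * (np.log(counts) - np.log(widths)) - ncp_prior
            fitness[1:] += best[:k]
            last[k] = np.argmax(fitness)
            best[k] = fitness[last[k]]
        # Backtrack to recover the optimal change points.
        change_points = []
        idx = n
        while idx > 0:
            change_points.append(idx)
            idx = last[idx - 1]
        change_points.append(0)
        return edges[np.array(change_points[::-1])]
    ```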

  13. Diurnal Transcriptome and Gene Network Represented through Sparse Modeling in Brachypodium distachyon.

    PubMed

    Koda, Satoru; Onda, Yoshihiko; Matsui, Hidetoshi; Takahagi, Kotaro; Yamaguchi-Uehara, Yukiko; Shimizu, Minami; Inoue, Komaki; Yoshida, Takuhiro; Sakurai, Tetsuya; Honda, Hiroshi; Eguchi, Shinto; Nishii, Ryuei; Mochida, Keiichi

    2017-01-01

    We report the comprehensive identification of periodic genes and their network inference, based on a gene co-expression analysis and an Auto-Regressive eXogenous (ARX) model with a group smoothly clipped absolute deviation (SCAD) method, using a time-series transcriptome dataset in a model grass, Brachypodium distachyon. To reveal the diurnal changes in the transcriptome of B. distachyon, we performed RNA-seq analysis of its leaves sampled through a diurnal cycle of over 48 h at 4 h intervals with three biological replicates, and identified 3,621 periodic genes through our wavelet analysis. These expression data are suitable for inferring network sparsity based on ARX models. We found that genes involved in biological processes such as transcriptional regulation, protein degradation, post-transcriptional modification and photosynthesis are significantly enriched among the periodic genes, suggesting that these processes might be regulated by the circadian rhythm in B. distachyon. On the basis of the time-series expression patterns of the periodic genes, we constructed a chronological gene co-expression network and identified putative transcription-factor-encoding genes that might be involved in the time-specific regulatory transcriptional network. Moreover, we inferred a transcriptional network composed of the periodic genes in B. distachyon, aiming to identify associations among genes through variable selection, grouping time points for each gene. Based on the ARX model with group SCAD regularization applied to our time-series expression datasets of the periodic genes, we constructed gene networks and found that they exhibit a typical scale-free structure. Our findings demonstrate that the diurnal changes in the B. distachyon leaf transcriptome have a sparse network structure, revealing the spatiotemporal gene regulatory network underlying the cyclic phase transitions of B. distachyon diurnal growth.
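
    A minimal sketch of sparse ARX-style network inference: each gene's expression is regressed on the lagged expression of all genes with an L1 penalty, and nonzero coefficients define directed edges. Plain Lasso is used here as a stand-in for the group SCAD penalty described in the paper.

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso

    def sparse_arx_network(expr, alpha=0.05):
        """expr: array (n_timepoints, n_genes); returns a directed adjacency matrix."""
        X, Y = expr[:-1], expr[1:]                 # lag-1 predictors and targets
        n_genes = expr.shape[1]
        adjacency = np.zeros((n_genes, n_genes))
        for j in range(n_genes):
            model = Lasso(alpha=alpha, max_iter=10_000).fit(X, Y[:, j])
            adjacency[:, j] = model.coef_          # edge i -> j where the coefficient is nonzero
        return adjacency
    ```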

  14. Alcohol and liver cirrhosis mortality in the United States: comparison of methods for the analyses of time-series panel data models.

    PubMed

    Ye, Yu; Kerr, William C

    2011-01-01

    The aim of this study was to explore various model specifications for estimating relationships between liver cirrhosis mortality rates and per capita alcohol consumption in aggregate-level cross-sectional time-series data. Using a series of liver cirrhosis mortality rates from 1950 to 2002 for 47 U.S. states, the effects of alcohol consumption were estimated from pooled autoregressive integrated moving average (ARIMA) models and four types of panel data models: generalized estimating equation, generalized least squares, fixed effect, and multilevel models. Various specifications of the error term structure under each type of model were also examined, as were different approaches to controlling for time trends and to using concurrent or accumulated consumption as predictors. When cirrhosis mortality was predicted by total alcohol, highly consistent estimates were found between ARIMA and panel data analyses, with an average overall effect of 0.07 to 0.09. Less consistent estimates were derived using spirits, beer, and wine consumption as predictors. When multiple geographic time series are combined as panel data, no existing model can accommodate all sources of heterogeneity, so any type of panel model must employ some form of generalization. Different types of panel data models should thus be estimated to examine the robustness of findings. We also suggest cautious interpretation when beverage-specific volumes are used as predictors. Copyright © 2010 by the Research Society on Alcoholism.

  15. Shapes of Magnetically Controlled Electron Density Structures in the Dayside Martian Ionosphere

    NASA Astrophysics Data System (ADS)

    Diéval, C.; Kopf, A. J.; Wild, J. A.

    2018-05-01

    Nonhorizontal localized electron density structures associated with regions of near-radial crustal magnetic fields are routinely detected via radar oblique echoes on the dayside of Mars with the ionospheric sounding mode of the Mars Advanced Radar for Subsurface and Ionospheric Sounding (MARSIS) radar onboard Mars Express. Previous studies mostly investigated these structures at a fixed plasma frequency and assumed that the larger apparent altitude of the structures compared to the normal surrounding ionosphere implied that they are bulges. However, the signal is subjected to dispersion when it propagates through the plasma, so interpretations based on the apparent altitude should be treated with caution. We go further by investigating the frequency dependence (i.e., the altitude dependence) of the shape of 48 density structure events, using time series of MARSIS electron density profiles corrected for signal dispersion. Four possible simplest shapes are detected in these time series, which can give oblique echoes: bulges, dips, downhill slopes, and uphill slopes. The altitude differences between the density structures and their edges are, in absolute value, larger at low frequency (high altitude) than at high frequency (low altitude), going from a few tens of kilometers to a few kilometers as frequency increases. Bulges dominate in numbers in most of the frequency range. Finally, the geographical extension of the density structures covers a wide range of crustal magnetic fields orientations, with near-vertical fields toward their center and near-horizontal fields toward their edges, as expected. Transport processes are suggested to be a key driver for these density structures.

  16. Beyond multi-fractals: surrogate time series and fields

    NASA Astrophysics Data System (ADS)

    Venema, V.; Simmer, C.

    2007-12-01

    Most natural complex systems are characterised by variability on a large range of temporal and spatial scales. The two main methodologies to generate such structures are Fourier/FARIMA-based algorithms and multifractal methods. The former is restricted to Gaussian data, whereas the latter requires the structure to be self-similar. This work presents so-called surrogate data as an alternative that works with any (empirical) distribution and power spectrum. The best-known surrogate algorithm is the iterative amplitude adjusted Fourier transform (IAAFT) algorithm. We have studied six different geophysical time series (two clouds, runoff of a small and a large river, temperature and rain) and their surrogates. The power spectra and consequently the second-order structure functions were replicated accurately. Even the fourth-order structure function was reproduced more accurately by the surrogates than would be possible with a fractal method, because the measured structure deviated too strongly from fractal scaling. Only in the case of the daily rain sums could a fractal method have been more accurate. Just like Fourier and multifractal methods, the current surrogates are not able to model the asymmetric increment distributions observed for runoff, i.e., they cannot reproduce nonlinear dynamical processes that are asymmetric in time. Furthermore, we have found differences in the structure functions on small scales. Surrogate methods are especially valuable for empirical studies, because the time series and fields that are generated are able to mimic measured variables accurately. Our main application is radiative transfer through structured clouds. Like many geophysical fields, clouds can only be sampled sparsely, e.g. with in-situ airborne instruments, yet for radiative transfer calculations we need full three-dimensional cloud fields. A first study, relating the measured properties of the cloud droplets to the radiative properties of the cloud field by generating surrogate cloud fields, yielded good results within the measurement error. A further test of the suitability of the surrogate clouds for radiative transfer compares the radiative properties of model cloud fields of sparse cumulus and stratocumulus with those of their surrogate fields. The bias and root-mean-square error in various radiative properties are small, and the deviations in the radiances and irradiances are not statistically significant, i.e. these deviations can be attributed to the Monte Carlo noise of the radiative transfer calculations. We compared these results with the optical properties of synthetic clouds that have either the correct distribution (but no spatial correlations) or the correct power spectrum (but a Gaussian distribution). These clouds did show statistically significant deviations. For more information see: http://www.meteo.uni-bonn.de/venema/themes/surrogates/
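
    A minimal sketch of the IAAFT algorithm mentioned above: iterate between imposing the original Fourier amplitudes and restoring the original value distribution by rank ordering. The fixed iteration count is an illustrative stopping rule.

    ```python
    import numpy as np

    def iaaft_surrogate(x, n_iter=100, rng=None):
        """Surrogate with (approximately) the power spectrum and exactly the distribution of x."""
        rng = np.random.default_rng(rng)
        x = np.asarray(x, dtype=float)
        target_amp = np.abs(np.fft.rfft(x))    # Fourier amplitudes to impose
        sorted_x = np.sort(x)                  # value distribution to impose
        s = rng.permutation(x)                 # start from a random shuffle
        for _ in range(n_iter):
            # Step 1: impose the target Fourier amplitudes, keep the current phases.
            spec = np.fft.rfft(s)
            spec = target_amp * np.exp(1j * np.angle(spec))
            s = np.fft.irfft(spec, n=len(x))
            # Step 2: restore the exact original distribution by rank ordering.
            ranks = np.argsort(np.argsort(s))
            s = sorted_x[ranks]
        return s
    ```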

  17. JTSA: an open source framework for time series abstractions.

    PubMed

    Sacchi, Lucia; Capozzi, Davide; Bellazzi, Riccardo; Larizza, Cristiana

    2015-10-01

    The evaluation of the clinical status of a patient is frequently based on the temporal evolution of some parameters, making the detection of temporal patterns a priority in data analysis. Temporal abstraction (TA) is a methodology widely used in medical reasoning for summarizing and abstracting longitudinal data. This paper describes JTSA (Java Time Series Abstractor), a framework including a library of algorithms for time series preprocessing and abstraction and an engine to execute a workflow for temporal data processing. The JTSA framework is grounded on a comprehensive ontology that models temporal data processing both from the data storage and the abstraction computation perspectives. The JTSA framework is designed to allow users to build their own analysis workflows by combining different algorithms. Thanks to the modular structure of a workflow, simple to highly complex patterns can be detected. The JTSA framework has been developed in Java 1.7 and is distributed under GPL as a jar file. JTSA provides: a collection of algorithms to perform temporal abstraction and preprocessing of time series, a framework for defining and executing data analysis workflows based on these algorithms, and a GUI for workflow prototyping and testing. The whole JTSA project relies on a formal model of the data types and of the algorithms included in the library. This model is the basis for the design and implementation of the software application. Taking into account this formalized structure, the user can easily extend the JTSA framework by adding new algorithms. Results are shown in the context of the EU project MOSAIC, where the framework is used to extract relevant patterns from data related to the long-term monitoring of diabetic patients. The proof that JTSA is a versatile tool to be adapted to different needs is given by its possible uses, both as a standalone tool for data summarization and as a module to be embedded into other architectures to select specific phenotypes based on TAs in a large dataset. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  18. Spectral entropy as a mean to quantify water stress history for natural vegetation and irrigated agriculture in a water-stressed tropical environment

    NASA Astrophysics Data System (ADS)

    Kim, Y.; Johnson, M. S.

    2017-12-01

    Spectral entropy (Hs) is an index which can be used to measure the structural complexity of time series data. When a time series is made up of a single periodic component, Hs is small, while Hs becomes larger when the series is composed of several periodic components. We hypothesized that this characteristic of Hs could be used to quantify the water stress history of vegetation. Under the ideal condition in which sufficient water is supplied to an agricultural crop or natural vegetation, a single distinct phenological cycle should be represented in a vegetation index time series (e.g., NDVI or EVI). However, time series data for a vegetation area that repeatedly experiences water stress may include several fluctuations observable in addition to the predominant phenological cycle, because the process of experiencing water stress and recovering from it generates small fluctuations in phenological characteristics. Consequently, the value of Hs increases when vegetation experiences several water shortages, and Hs could therefore be used as an indicator of water stress history. To test this hypothesis, we analyzed Moderate Resolution Imaging Spectroradiometer (MODIS) Normalized Difference Vegetation Index (NDVI) data for a natural area in comparison to a nearby sugarcane area in seasonally dry western Costa Rica. In this presentation we will illustrate the use of spectral entropy to evaluate the vegetative responses of natural vegetation (dry tropical forest) and sugarcane under three different irrigation techniques (center pivot irrigation, drip irrigation and flood irrigation). Through this comparative analysis, the utility of Hs as an indicator will be tested. Furthermore, crop response to the different irrigation methods will be discussed in terms of Hs, NDVI and yield.
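
    A minimal sketch of a spectral entropy computation: normalize the power spectrum to a probability distribution and take its normalized Shannon entropy. The Welch periodogram and base-2 normalization are illustrative choices, not necessarily those used in this study.

    ```python
    import numpy as np
    from scipy.signal import welch

    def spectral_entropy(x, fs=1.0, nperseg=256):
        """Normalized spectral entropy of a 1-D series (near 0 = single tone, near 1 = white noise)."""
        _, psd = welch(x, fs=fs, nperseg=nperseg)
        p = psd / psd.sum()                  # spectrum as a probability distribution
        p = p[p > 0]
        H = -np.sum(p * np.log2(p))
        return H / np.log2(len(p))           # normalize by the maximum possible entropy
    ```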

  19. Monthly water quality forecasting and uncertainty assessment via bootstrapped wavelet neural networks under missing data for Harbin, China.

    PubMed

    Wang, Yi; Zheng, Tong; Zhao, Ying; Jiang, Jiping; Wang, Yuanyuan; Guo, Liang; Wang, Peng

    2013-12-01

    In this paper, a bootstrapped wavelet neural network (BWNN) was developed for predicting monthly ammonia nitrogen (NH4+-N) and dissolved oxygen (DO) in the Harbin region, northeast China. The Morlet wavelet basis function (WBF) was employed as the nonlinear activation function of a traditional three-layer artificial neural network (ANN) structure. Prediction intervals (PI) were constructed according to the calculated uncertainties from the model structure and the data noise. The performance of the BWNN model was also compared with that of four different models: a traditional ANN, a WNN, a bootstrapped ANN, and an autoregressive integrated moving average model. The results showed that BWNN could handle the severely fluctuating and non-seasonal time series of water quality, and it produced better performance than the other four models. The uncertainty from data noise was smaller than that from the model structure for NH4+-N; conversely, the uncertainty from data noise was larger for the DO series. In addition, total uncertainties in the low-flow period were the largest, due to complicated processes during the freeze-up period of the Songhua River. Further, a data missing-refilling scheme was designed, and better performance of BWNNs was observed for structural data missing (SD) than for incidental data missing (ID). For both ID and SD, the temporal method was satisfactory for filling the NH4+-N series, whereas spatial imputation was suitable for the DO series. This gap-filling BWNN forecasting method was applied to other areas suffering from real missing data, and the results demonstrated its efficiency. Thus, the methods introduced here will help managers to make informed decisions.

  20. Multifractal analysis of visibility graph-based Ito-related connectivity time series.

    PubMed

    Czechowski, Zbigniew; Lovallo, Michele; Telesca, Luciano

    2016-02-01

    In this study, we investigate multifractal properties of connectivity time series resulting from the visibility graph applied to normally distributed time series generated by the Ito equations with multiplicative power-law noise. We show that the multifractality of the connectivity time series (i.e., the series of the numbers of links outgoing from each node) increases with the exponent of the power-law noise. The multifractality of the connectivity time series could be due to the width of the connectivity degree distribution, which can be related to the exit time of the associated Ito time series. Furthermore, the connectivity time series are characterized by persistence, although the original Ito time series are random; this is due to the visibility graph procedure, which, by connecting the values of the time series, generates persistence but destroys most of the nonlinear correlations. Moreover, the visibility graph is sensitive to wide "depressions" in the input time series.
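
    A minimal sketch of the natural visibility graph and the derived connectivity (degree) time series used above; the quadratic-time visibility check below is the straightforward definition rather than an optimized algorithm.

    ```python
    import numpy as np

    def visibility_degree_series(x):
        """Degree (number of visible links) of each point of a 1-D series under natural visibility."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        degree = np.zeros(n, dtype=int)
        for i in range(n):
            for j in range(i + 1, n):
                k = np.arange(i + 1, j)
                # (i, j) are mutually visible if every intermediate point lies below the line joining them.
                visible = np.all(x[k] < x[j] + (x[i] - x[j]) * (j - k) / (j - i)) if len(k) else True
                if visible:
                    degree[i] += 1
                    degree[j] += 1
        return degree
    ```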

  1. Deformation and Quaternary Faulting in Southeast Missouri across the Commerce Geophysical Lineament

    USGS Publications Warehouse

    Stephenson, W.J.; Odum, J.K.; Williams, R.A.; Pratt, T.L.; Harrison, R.W.; Hoffman, D.

    1999-01-01

    High-resolution seismic-reflection data acquired at three sites along the surface projection of the Commerce geophysical lineament in southeast Missouri reveal a complex history of post-Cretaceous faulting that has continued into the Quaternary. Near Qulin, Missouri, approximately 20 m of apparent vertical fault displacement has occurred in the Quaternary. Reflection data collected at Idalia Hill, about 45 km to the northeast, reveal a series of reverse and possibly right-lateral strike-slip faults with Quaternary displacement. In the Benton Hills, 45 km northeast of Idalia Hill, seismic data image a complicated series of anticlinal and synclinal fault-bounded blocks immediately north of the Commerce fault. We infer that most of the deformation imaged in the upper 400 m of these three data sets occurred in post-Cretaceous time, and that a significant portion of it occurred during Quaternary time. Collectively, these seismic data, along with geomorphic and surface-geologic evidence, suggest (1) the existence of at least one potential seismogenic structure in southeastern Missouri outside the main zones of New Madrid seismicity, and (2) that these structures have been active during the Quaternary. The geographic location of the imaged deformation suggests it is related to structures along the Commerce geophysical lineament.

  2. Statistical inference of seabed sound-speed structure in the Gulf of Oman Basin.

    PubMed

    Sagers, Jason D; Knobles, David P

    2014-06-01

    Addressed is the statistical inference of the sound-speed depth profile of a thick soft seabed from broadband sound propagation data recorded in the Gulf of Oman Basin in 1977. The acoustic data are in the form of time series signals recorded on a sparse vertical line array and generated by explosive sources deployed along a 280 km track. The acoustic data offer a unique opportunity to study a deep-water bottom-limited thickly sedimented environment because of the large number of time series measurements, very low seabed attenuation, and auxiliary measurements. A maximum entropy method is employed to obtain a conditional posterior probability distribution (PPD) for the sound-speed ratio and the near-surface sound-speed gradient. The multiple data samples allow for a determination of the average error constraint value required to uniquely specify the PPD for each data sample. Two complicating features of the statistical inference study are addressed: (1) the need to develop an error function that can both utilize the measured multipath arrival structure and mitigate the effects of data errors and (2) the effect of small bathymetric slopes on the structure of the bottom interacting arrivals.

  3. Machine Learning and Network Analysis of Molecular Dynamics Trajectories Reveal Two Chains of Red/Ox-specific Residue Interactions in Human Protein Disulfide Isomerase.

    PubMed

    Karamzadeh, Razieh; Karimi-Jafari, Mohammad Hossein; Sharifi-Zarchi, Ali; Chitsaz, Hamidreza; Salekdeh, Ghasem Hosseini; Moosavi-Movahedi, Ali Akbar

    2017-06-16

    The human protein disulfide isomerase (hPDI) is an essential four-domain multifunctional enzyme. As a result of disulfide shuffling in its terminal domains, hPDI exists in two oxidation states with different conformational preferences, which are important for substrate binding and functional activities. Here, we address the redox-dependent conformational dynamics of hPDI through molecular dynamics (MD) simulations. Collective domain motions are identified by principal component analysis of the MD trajectories, and redox-dependent opening-closing structural variations are highlighted on projected free energy landscapes. Important structural features that exhibit considerable differences between the dynamics of the two redox states are then extracted by statistical machine learning methods. Mapping the structural variations to time series of residue interaction networks also provides a holistic representation of the dynamical redox differences. Emphasizing persistent, long-lasting interactions, an approach is proposed that compiles these time-series networks into a single dynamic residue interaction network (DRIN). Differential comparison of the DRIN in the oxidized and reduced states reveals chains of residue interactions that represent potential allosteric paths between the catalytic and ligand binding sites of hPDI.

  4. Towards a novel look on low-frequency climate reconstructions

    NASA Astrophysics Data System (ADS)

    Kamenik, Christian; Goslar, Tomasz; Hicks, Sheila; Barnekow, Lena; Huusko, Antti

    2010-05-01

    Information on low-frequency (millennial to sub-centennial) climate change is often derived from sedimentary archives, such as peat profiles or lake sediments. Usually, these archives have non-annual and varying time resolution. Their dating is mainly based on radionuclides, which provide probabilistic age-depth relationships with complex error structures. Dating uncertainties impede the interpretation of sediment-based climate reconstructions. They complicate the calculation of time-dependent rates. In most cases, they make any calibration in time impossible. Sediment-based climate proxies are therefore often presented as a single, best-guess time series without proper calibration and error estimation. Errors along time and dating errors that propagate into the calculation of time-dependent rates are neglected. Our objective is to overcome the aforementioned limitations by using a 'swarm' or 'ensemble' of reconstructions instead of a single best-guess. The novelty of our approach is to take into account age-depth uncertainties by permuting through a large number of potential age-depth relationships of the archive of interest. For each individual permutation we can then calculate rates, calibrate proxies in time, and reconstruct the climate-state variable of interest. From the resulting swarm of reconstructions, we can derive realistic estimates of even complex error structures. The likelihood of reconstructions is visualized by a grid of two-dimensional kernels that take into account probabilities along time and the climate-state variable of interest simultaneously. For comparison and regional synthesis, likelihoods can be scored against other independent climate time series.
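
    A minimal sketch of the 'swarm' idea under simplifying assumptions (normally distributed, independent age errors; linear interpolation between dating points; all numbers illustrative): perturb the age controls, enforce stratigraphic order, and propagate each resulting age-depth relation into a reconstruction so that dating uncertainty shows up as an ensemble spread.

      import numpy as np

      rng = np.random.default_rng(2)

      depth_ctrl = np.array([0.0, 50.0, 120.0, 200.0])       # cm, illustrative dating depths
      age_ctrl = np.array([0.0, 900.0, 2400.0, 4300.0])      # cal yr BP, illustrative mean ages
      age_err = np.array([10.0, 60.0, 90.0, 120.0])          # 1-sigma dating errors

      depths = np.linspace(0.0, 200.0, 101)
      proxy = np.sin(depths / 25.0)                           # stand-in proxy measured along depth
      time_axis = np.arange(0.0, 4300.0, 50.0)

      ensemble = []
      for _ in range(1000):
          ages = rng.normal(age_ctrl, age_err)
          ages = np.maximum.accumulate(ages)                  # keep the age-depth relation monotonic
          member_age = np.interp(depths, depth_ctrl, ages)    # one plausible age-depth relationship
          ensemble.append(np.interp(time_axis, member_age, proxy))
      ensemble = np.array(ensemble)

      best = np.median(ensemble, axis=0)                      # best estimate on a common time axis
      lo, hi = np.percentile(ensemble, [5, 95], axis=0)       # dating-induced uncertainty band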

  5. Forecasting Electricity Prices in an Optimization Hydrothermal Problem

    NASA Astrophysics Data System (ADS)

    Matías, J. M.; Bayón, L.; Suárez, P.; Argüelles, A.; Taboada, J.

    2007-12-01

    This paper presents an economic dispatch algorithm for a hydrothermal system within the framework of a competitive and deregulated electricity market. The optimization problem of one firm is described, whose objective function can be defined as its profit maximization. Since next-day price forecasting is a crucial aspect, this paper proposes a new, efficient, and highly accurate next-day price forecasting method using a functional time series approach that exploits the daily seasonal structure of the price series. For the optimization problem, an optimal control technique is applied and Pontryagin's theorem is employed.
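
    One way to picture a functional treatment of the daily seasonal structure (a hedged illustration only, not the estimator used in the paper): treat each day's 24 hourly prices as a single curve and forecast tomorrow's curve as a similarity-weighted average of the curves that followed similar days in the past.

      import numpy as np

      rng = np.random.default_rng(3)
      hours = np.arange(24)
      prices = (40 + 10 * np.sin(2 * np.pi * hours / 24)       # common daily shape
                + 2 * rng.standard_normal((200, 24)))          # 200 days of noisy hourly prices

      past, nxt = prices[:-1], prices[1:]                      # pairs (day t, day t+1)
      today = prices[-1]

      d = np.linalg.norm(past - today, axis=1)                 # how similar each past day is to today
      w = np.exp(-0.5 * (d / d.std()) ** 2)
      w /= w.sum()
      forecast = w @ nxt                                       # tomorrow's expected 24-hour price curve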

  6. Data and methodological problems in establishing state gasoline-conservation targets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greene, D.L.; Walton, G.H.

    The Emergency Energy Conservation Act of 1979 gives the President the authority to set gasoline-conservation targets for states in the event of a supply shortage. This paper examines data and methodological problems associated with setting state gasoline-conservation targets. The target-setting method currently used is examined and found to have some flaws. Ways of correcting these deficiencies through the use of Box-Jenkins time-series analysis are investigated. A successful estimation of Box-Jenkins models for all states included the estimation of the magnitude of the supply shortages of 1979 in each state and a preliminary estimation of state short-run price elasticities, which were found to vary about a median value of -0.16. The time-series models identified were very simple in structure and lent support to the simple consumption growth model assumed by the current target method. The authors conclude that the flaws in the current method can be remedied either by replacing the current procedures with time-series models or by using the models in conjunction with minor modifications of the current method.

  7. Fitting Flux Ropes to a Global MHD Solution: A Comparison of Techniques. Appendix 1

    NASA Technical Reports Server (NTRS)

    Riley, Pete; Linker, J. A.; Lionello, R.; Mikic, Z.; Odstrcil, D.; Hidalgo, M. A.; Cid, C.; Hu, Q.; Lepping, R. P.; Lynch, B. J.

    2004-01-01

    Flux rope fitting (FRF) techniques are an invaluable tool for extracting information about the properties of a subclass of CMEs in the solar wind. However, it has proven difficult to assess their accuracy since the underlying global structure of the CME cannot be independently determined from the data. In contrast, large-scale MHD simulations of CME evolution can provide both a global view as well as localized time series at specific points in space. In this study we apply 5 different fitting techniques to 2 hypothetical time series derived from MHD simulation results. Independent teams performed the analysis of the events in "blind tests", for which no information, other than the time series, was provided. From the results, we infer the following: (1) Accuracy decreases markedly with increasingly glancing encounters; (2) Correct identification of the boundaries of the flux rope can be a significant limiter; and (3) Results from techniques that infer global morphology must be viewed with caution. In spite of these limitations, FRF techniques remain a useful tool for describing in situ observations of flux rope CMEs.

  8. Blind tests of methods for InSight Mars mission: Open scientific challenge

    NASA Astrophysics Data System (ADS)

    Clinton, John; Ceylan, Savas; Giardini, Domenico; Khan, Amir; van Driel, Martin; Böse, Maren; Euchner, Fabian; Garcia, Raphael F.; Drilleau, Mélanie; Lognonné, Philippe; Panning, Mark; Banerdt, Bruce

    2017-04-01

    The Marsquake Service (MQS) will be the ground segment service within the InSight mission to Mars, which will deploy a single seismic station on Elysium Planitia in November 2018. The main tasks of the MQS are the identification and characterisation of seismicity and the management of the Martian seismic event catalogue. In advance of the mission, we have developed a series of single-station event location methods that rely on a priori 1D and 3D structural models. In coordination with the Mars Structural Service, we expect to use iterative inversion techniques to revise these structural models and event locations. In order to seek methodological advancements and test our current approaches, we have designed a blind test case using Martian synthetics combined with realistic noise models for the Martian surface. We invite all scientific parties that are interested in single-station approaches and in exploring the Martian time series to participate and contribute to our blind test. We anticipate that the test will improve currently developed location and structural inversion techniques, and also allow us to explore new single-station techniques for moment tensor and magnitude determination. The waveforms for our test case are computed employing AxiSEM and Instaseis for a randomly selected 1D background model and an event catalogue that is statistically consistent with our current expectation of Martian seismicity. Realistic seismic surface noise is superimposed to generate a continuous time series spanning 6 months. The event catalogue includes impacts as well as Martian quakes. The temporal distribution of seismicity in the time series, as well as the true structural model, will not be known to any participating party, including the MQS, until the end of the competition. We provide our internal tools, such as event location codes, a suite of background models, and seismic phase travel times, to support researchers who wish to use or improve our current methods. Following the deadline of our blind test in late 2017, we plan to combine all outcomes in an article with all participants as co-authors.

  9. Detecting dryland degradation through the use of Time Series Segmentation and Residual Trend analysis (TSS-RESTREND)

    NASA Astrophysics Data System (ADS)

    Burrell, A. L.; Evans, J. P.; Liu, Y.

    2017-12-01

    Dryland degradation is an issue of international significance as dryland regions play a substantial role in global food production. Remotely sensed data provide the only long-term, large-scale record of changes within dryland ecosystems. The Residual Trend, or RESTREND, method is applied to satellite observations to detect dryland degradation. Whilst effective in most cases, it has been shown that the RESTREND method can fail to identify degraded pixels if the relationship between vegetation and precipitation has broken down as a result of severe or rapid degradation. This study presents an extended version of the RESTREND methodology that incorporates the Breaks For Additive Seasonal and Trend (BFAST) method to identify step changes in the time series that are related to significant structural changes in the ecosystem, e.g. land use changes. When applied to Australia, this new methodology, termed Time Series Segmentation and Residual Trend analysis (TSS-RESTREND), was able to detect degradation in 5.25% of pixels, compared to only 2.0% for RESTREND alone. The modified methodology was then assessed in two regions with known histories of degradation, where it was found to accurately capture both the timing and directionality of ecosystem change.
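
    The RESTREND core on which TSS-RESTREND builds can be sketched in a few lines (synthetic numbers, annual rather than per-pixel data, and no breakpoint step, so this is only an assumed illustration): regress the vegetation index on precipitation, then test the residual series for a trend; TSS-RESTREND additionally segments the residual series at detected step changes before interpreting that trend.

      import numpy as np

      rng = np.random.default_rng(4)
      years = np.arange(1982, 2016)
      precip = 300 + 80 * rng.standard_normal(len(years))                     # mm, illustrative
      ndvi = 0.001 * precip - 0.002 * (years - 1982) + 0.05 * rng.standard_normal(len(years))

      # vegetation-precipitation relationship
      slope, intercept = np.polyfit(precip, ndvi, 1)
      residuals = ndvi - (slope * precip + intercept)

      # residual trend; a significant negative slope is read as degradation
      trend = np.polyfit(years, residuals, 1)[0]
      print(f"residual NDVI trend: {trend:.5f} per year")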

  10. The relevance of time series in molecular ecology and conservation biology.

    PubMed

    Habel, Jan C; Husemann, Martin; Finger, Aline; Danley, Patrick D; Zachos, Frank E

    2014-05-01

    The genetic structure of a species is shaped by the interaction of contemporary and historical factors. Analyses of individuals from the same population sampled at different points in time can help to disentangle the effects of current and historical forces and facilitate the understanding of the forces driving the differentiation of populations. The use of such time series allows for the exploration of changes at the population and intraspecific levels over time. Material from museum collections plays a key role in understanding and evaluating observed population structures, especially if large numbers of individuals have been sampled from the same locations at multiple time points. In these cases, changes in population structure can be assessed empirically. The development of new molecular markers relying on short DNA fragments (such as microsatellites or single nucleotide polymorphisms) allows for the analysis of long-preserved and partially degraded samples. Recently developed techniques to construct genome libraries with a reduced complexity and next generation sequencing and their associated analysis pipelines have the potential to facilitate marker development and genotyping in non-model species. In this review, we discuss the problems with sampling and available marker systems for historical specimens and demonstrate that temporal comparative studies are crucial for the estimation of important population genetic parameters and to measure empirically the effects of recent habitat alteration. While many of these analyses can be performed with samples taken at a single point in time, the measurements are more robust if multiple points in time are studied. Furthermore, examining the effects of habitat alteration, population declines, and population bottlenecks is only possible if samples before and after the respective events are included. © 2013 The Authors. Biological Reviews © 2013 Cambridge Philosophical Society.

  11. Molecular Simulations in Astrobiology

    NASA Technical Reports Server (NTRS)

    Pohorille, Andrew; Wilson, Michael A.; Schweighofer, Karl; Chipot, Christophe; New, Michael H.

    2000-01-01

    One of the main goals of astrobiology is to understand the origin of cellular life. The most direct approach to this problem is to construct laboratory models of protocells. Such efforts, currently underway in the NASA Astrobiology Program, are accompanied by computational studies aimed at explaining self-organization of simple molecules into ordered structures that are capable of performing protocellular functions. Many of these functions, such as importing nutrients, capturing energy and responding to changes in the environment, are carried out by proteins bound to membranes. We use computer simulations to address the following questions about these proteins: (1) How do small proteins self-organize into ordered structures at water-membrane interfaces and insert into membranes? (2) How do peptides form membrane-spanning structures (e.g. channels)? (3) By what mechanisms do such structures perform their functions? The simulations are performed using the molecular dynamics method. In this method, Newton's equations of motion for each atom in the system are solved iteratively. At each time step, the forces exerted on each atom by the remaining atoms are evaluated by dividing them into two parts. Short-range forces are calculated in real space while long-range forces are evaluated in reciprocal space, using a particle-mesh algorithm which is of order O(N ln N). With a time step of 2 femtoseconds, problems occurring on multi-nanosecond time scales (10^6-10^8 time steps) are accessible. To address a broader range of problems, simulations need to be extended by three orders of magnitude, which requires algorithmic improvements and codes scalable to a large number of processors. Work in this direction is in progress. Two series of simulations are discussed. In one series, it is shown that nonpolar peptides, disordered in water, translocate to the nonpolar interior of the membrane and fold into helical structures. Once in the membrane, the peptides exhibit orientational flexibility with changing conditions, which may have provided a mechanism of transmitting signals between the protocell and its environment. In another series of simulations, the mechanism by which a simple protein channel efficiently mediates proton transport across membranes was investigated. This process is a key step in cellular bioenergetics. In the channel under study, proton transport is gated by four histidines that occlude the channel pore. The simulations identify the mechanisms by which protons move through the gate.
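
    The iterative solution of Newton's equations described above reduces, for a single degree of freedom, to a few lines of velocity Verlet integration; the harmonic force below is only a placeholder for the short- plus long-range force evaluation of a real many-atom system, and the units are arbitrary.

      import numpy as np

      def force(x, k=1.0):
          return -k * x                      # placeholder for the full force field

      dt, steps = 2e-3, 10000                # time step and number of iterations (arbitrary units)
      x, v, m = 1.0, 0.0, 1.0
      f = force(x)
      trajectory = np.empty(steps)
      for s in range(steps):                 # Newton's equations advanced step by step
          v += 0.5 * dt * f / m
          x += dt * v
          f = force(x)
          v += 0.5 * dt * f / m
          trajectory[s] = x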

  12. Development of indicators of vegetation recovery based on time series analysis of SPOT Vegetation data

    NASA Astrophysics Data System (ADS)

    Lhermitte, S.; Tips, M.; Verbesselt, J.; Jonckheere, I.; Van Aardt, J.; Coppin, Pol

    2005-10-01

    Large-scale wild fires have direct impacts on natural ecosystems and play a major role in vegetation ecology and the carbon budget. Accurate methods for describing post-fire development of vegetation are therefore essential for the understanding and monitoring of terrestrial ecosystems. Time series analysis of satellite imagery offers the potential to quantify these parameters with spatial and temporal accuracy. Current research focuses on the potential of time series analysis of SPOT Vegetation S10 data (1999-2001) to quantify the vegetation recovery of large-scale burns detected in the framework of GBA2000. The objective of this study was to provide quantitative estimates of the spatio-temporal variation of vegetation recovery based on remote sensing indicators. Southern Africa was used as a pilot study area, given the availability of ground and satellite data. An automated technique was developed to extract consistent indicators of vegetation recovery from the SPOT-VGT time series. Reference areas were used to quantify the vegetation regrowth by means of Regeneration Indices (RI). Two kinds of recovery indicators (time- and value-based) were tested for RIs of NDVI, SR, SAVI, NDWI, and pure band information. The effects of vegetation structure and temporal fire regime features on the recovery indicators were subsequently analyzed. Statistical analyses were conducted to assess whether the recovery indicators differed between vegetation types and depended on the timing of the burning season. Results highlighted the importance of appropriate reference areas and of correct normalization of the SPOT-VGT data.

  13. GPS Position Time Series @ JPL

    NASA Technical Reports Server (NTRS)

    Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen

    2013-01-01

    Different flavors of GPS time series analysis at JPL use the same GPS Precise Point Positioning Analysis raw time series; the variations in time series analysis and post-processing are driven by different users. JPL Global Time Series/Velocities serve researchers studying the reference frame and combining GPS with VLBI/SLR/DORIS. JPL/SOPAC Combined Time Series/Velocities support crustal deformation studies of tectonic, volcanic, and ground water processes. ARIA Time Series/Coseismic Data Products are focused on hazard monitoring and response. The ARIA data system is designed to integrate GPS and InSAR: GPS tropospheric delay is used for correcting InSAR, and Caltech's GIANT time series analysis uses GPS to correct orbital errors in InSAR. (InSAR time series analysis is addressed in a companion presentation by Zhen Liu.)

  14. Dihydropyrimidine based hydrazine dihydrochloride derivatives as potent urease inhibitors.

    PubMed

    Khan, Ajmal; Hashim, Jamshed; Arshad, Nuzhat; Khan, Ijaz; Siddiqui, Naureen; Wadood, Abdul; Ali, Muzaffar; Arshad, Fiza; Khan, Khalid Mohammed; Choudhary, M Iqbal

    2016-02-01

    Four series of heterocyclic compounds, 4-dihydropyrimidine-2-thiones 7-12 (series A), N,S-dimethyl-dihydropyrimidines 13-18 (series B), hydrazine derivatives of dihydropyrimidine 19-24 (series C), and tetrazolo dihydropyrimidine derivatives 25-30 (series D), were synthesized and evaluated for in vitro urease inhibitory activity. Series B-D were examined for urease inhibition for the first time. Series A and C were found to be significantly active, with IC50 values between 34.7-42.9 and 15.0-26.0 μM, respectively. The structure-activity relationship showed that the free S atom and the hydrazine moiety are the key pharmacophores against the urease enzyme. Kinetic studies of the active series A (7-12) and C (19-24) were carried out to determine their modes of inhibition and dissociation constants Ki. Compounds of series A (7-12) and series C (19-24) showed a mixed type of inhibition, with Ki values ranging between 15.76-25.66 and 14.63-29.42 μM, respectively. The molecular docking results showed that all the active compounds of both series have significant binding interactions with the active site, especially the Ni ion, of the urease enzyme. Cytotoxicity of all series A-D was also evaluated against the mammalian mouse fibroblast 3T3 cell line, and no toxicity was observed in this cellular model. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Staining paraffin embedded sections of scald of barley before paraffin removal.

    PubMed

    Xi, K; Burnett, P A

    1997-07-01

    Staining of paraffin embedded sections with periodic acid-Schiff reagent and fast green before paraffin removal resulted in differentiation of barley seed and leaf tissue from fungal structures of Rhynchosporium secalis. Crystal violet, toluidine blue O and aniline blue also successfully stained fungal structures of R. secalis in barley leaf tissues. Staining of embedded sections before paraffin removal allows simple processing of a series of sections, saves time and reduces solvent consumption.

  16. Temporal structure and gain-loss asymmetry for real and artificial stock indices

    NASA Astrophysics Data System (ADS)

    Siven, Johannes Vitalis; Lins, Jeffrey Todd

    2009-11-01

    Previous research has shown that for stock indices, the most likely time until a return of a particular size has been observed is longer for gains than for losses. We demonstrate that this so-called gain-loss asymmetry vanishes if the temporal dependence structure is destroyed by scrambling the time series. We also show that an artificial index constructed as a simple average of a number of individual stocks displays gain-loss asymmetry; this allows us to explicitly analyze the dependence between the index constituents. We consider mutual information and correlation-based measures and show that the stock returns indeed have a higher degree of dependence in times of market downturns than upturns.
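
    A hedged sketch of the scrambling comparison (thresholds are illustrative, and iid Gaussian returns stand in for index returns, so no asymmetry is expected here, whereas a real index would show it only before shuffling): compute the waiting time until a cumulative return of +rho or -rho is first reached, for the original and for a permuted series in which the temporal dependence structure is destroyed.

      import numpy as np

      def waiting_times(returns, rho, horizon=500):
          gains, losses = [], []
          for t in range(len(returns) - horizon):
              c = np.cumsum(returns[t:t + horizon])
              hit_gain = np.nonzero(c >= rho)[0]
              hit_loss = np.nonzero(c <= -rho)[0]
              if hit_gain.size:
                  gains.append(hit_gain[0] + 1)
              if hit_loss.size:
                  losses.append(hit_loss[0] + 1)
          return np.array(gains), np.array(losses)

      rng = np.random.default_rng(5)
      r = 0.01 * rng.standard_normal(20000)                     # stand-in for daily log-returns
      g, l = waiting_times(r, rho=0.05)
      gs, ls = waiting_times(rng.permutation(r), rho=0.05)      # scrambled series
      print("original  median wait (gain, loss):", np.median(g), np.median(l))
      print("scrambled median wait (gain, loss):", np.median(gs), np.median(ls))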

  17. Future projects in asteroseismology: the unique role of Antarctica

    NASA Astrophysics Data System (ADS)

    Mosser, B.; Siamois Team

    Asteroseismology requires observables recorded under stringent conditions: very high sensitivity, uninterrupted time series, and long duration. These specifications make it possible to study the details of stellar interior structure. Space-borne and ground-based asteroseismic projects are presented and compared. With CoRoT as a precursor, then Kepler and possibly Plato, the roadmap in space appears to be precisely designed. In parallel, ground-based projects are necessary to provide different and unique information on bright stars with Doppler measurements. Dome C appears to be the ideal place for ground-based asteroseismic observations. The unequalled weather conditions yield a duty cycle comparable to space. Long time series (up to 3 months) will be possible, thanks to the long duration of the polar night.

  18. A general framework for time series data mining based on event analysis: application to the medical domains of electroencephalography and stabilometry.

    PubMed

    Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P

    2014-10-01

    There are now domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, like medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events and compares two time series by analyzing the events they have in common. We have applied our framework to time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed schema has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains generating iconographic time series, such as, for example, electrocardiography (ECG). Copyright © 2014 Elsevier Inc. All rights reserved.
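
    The event-centred idea can be illustrated with a deliberately simple sketch (the threshold criterion and start-time tolerance are assumptions for illustration, not the similarity model used in the paper): isolate regions of interest by thresholding, then compare two series through the events they have in common.

      import numpy as np

      def extract_events(x, thresh=2.0, min_len=5):
          above = np.abs(x) > thresh
          events, start = [], None
          for i, flag in enumerate(above):
              if flag and start is None:
                  start = i
              elif not flag and start is not None:
                  if i - start >= min_len:
                      events.append((start, i))
                  start = None
          return events

      def common_events(ev_a, ev_b, tol=10):
          # two events are 'in common' if their start times differ by at most tol samples
          return sum(any(abs(a[0] - b[0]) <= tol for b in ev_b) for a in ev_a)

      rng = np.random.default_rng(6)
      sig_a = rng.standard_normal(1000); sig_a[200:220] += 4; sig_a[600:630] += 4
      sig_b = rng.standard_normal(1000); sig_b[205:225] += 4
      print(common_events(extract_events(sig_a), extract_events(sig_b)))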

  19. What does the structure of its visibility graph tell us about the nature of the time series?

    NASA Astrophysics Data System (ADS)

    Franke, Jasper G.; Donner, Reik V.

    2017-04-01

    Visibility graphs are a recently introduced method to construct complex network representations from univariate time series in order to study their dynamical characteristics [1]. In recent years, this approach has been successfully applied to a considerable variety of geoscientific research questions and data sets, including non-trivial temporal patterns in complex earthquake catalogs [2] and time-reversibility in climate time series [3]. It has been shown that several characteristic features of the resulting networks differ between stochastic and deterministic (possibly chaotic) processes, which is, however, relatively hard to exploit in real-world applications. In this study, we propose studying two new measures related to the network complexity of visibility graphs constructed from time series, one being a special type of network entropy [4] and the other a recently introduced measure of the heterogeneity of the network's degree distribution [5]. For paradigmatic model systems exhibiting bifurcation sequences between regular and chaotic dynamics, both properties clearly trace the transitions between the two types of regimes and exhibit marked quantitative differences for regular and chaotic dynamics. Moreover, for dynamical systems with a small amount of additive noise, the considered properties demonstrate gradual changes prior to the bifurcation point. This finding appears closely related to the loss of stability of the current state, which is known to lead to a critical slowing down as the transition point is approached. In this spirit, both considered visibility graph characteristics provide alternative tracers of dynamical early warning signals consistent with classical indicators. Our results demonstrate that measures of visibility graph complexity (i) provide a potentially useful means of tracing changes in the dynamical patterns encoded in a univariate time series that originate from increasing autocorrelation and (ii) allow regular dynamics to be systematically distinguished from deterministic-chaotic dynamics. We demonstrate the application of our method for different model systems as well as selected paleoclimate time series from the North Atlantic region. Notably, visibility graph based methods are particularly suited for studying the latter type of geoscientific data, since they do not impose intrinsic restrictions or assumptions on the nature of the time series under investigation in terms of noise process, linearity and sampling homogeneity. [1] Lacasa, Lucas, et al. "From time series to complex networks: The visibility graph." Proceedings of the National Academy of Sciences 105.13 (2008): 4972-4975. [2] Telesca, Luciano, and Michele Lovallo. "Analysis of seismic sequences by using the method of visibility graph." EPL (Europhysics Letters) 97.5 (2012): 50002. [3] Donges, Jonathan F., Reik V. Donner, and Jürgen Kurths. "Testing time series irreversibility using complex network methods." EPL (Europhysics Letters) 102.1 (2013): 10004. [4] Small, Michael. "Complex networks from time series: capturing dynamics." 2013 IEEE International Symposium on Circuits and Systems (ISCAS2013), Beijing (2013): 2509-2512. [5] Jacob, Rinku, K.P. Harikrishnan, Ranjeev Misra, and G. Ambika. "Measure for degree heterogeneity in complex networks and its application to recurrence network analysis." arXiv preprint 1605.06607 (2016).
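
    One measure of this kind, an entropy of the degree distribution, is easy to compute once a visibility graph has been built; the sketch below uses the (simpler) horizontal visibility criterion and compares white noise with the chaotic logistic map, purely as an assumed illustration of the kind of contrast such complexity measures exploit, not as the paper's exact definitions.

      import numpy as np

      def hvg_degrees(x):
          # horizontal visibility: i and j see each other if every sample in between is
          # lower than both x_i and x_j
          n, deg = len(x), np.zeros(len(x), dtype=int)
          for i in range(n):
              for j in range(i + 1, n):
                  if np.all(x[i + 1:j] < min(x[i], x[j])):
                      deg[i] += 1
                      deg[j] += 1
                  elif x[j] >= x[i]:
                      break                  # a higher sample blocks all later nodes
          return deg

      def degree_entropy(deg):
          _, counts = np.unique(deg, return_counts=True)
          p = counts / counts.sum()
          return -np.sum(p * np.log(p))      # Shannon entropy of the degree distribution

      rng = np.random.default_rng(7)
      noise = rng.standard_normal(2000)
      logistic = np.empty(2000)
      logistic[0] = 0.4
      for k in range(1999):
          logistic[k + 1] = 4.0 * logistic[k] * (1.0 - logistic[k])
      print(degree_entropy(hvg_degrees(noise)), degree_entropy(hvg_degrees(logistic)))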

  20. Boosting Maintenance in Working Memory with Temporal Regularities

    ERIC Educational Resources Information Center

    Plancher, Gaën; Lévêque, Yohana; Fanuel, Lison; Piquandet, Gaëlle; Tillmann, Barbara

    2018-01-01

    Music cognition research has provided evidence for the benefit of temporally regular structures guiding attention over time. The present study investigated whether maintenance in working memory can benefit from an isochronous rhythm. Participants were asked to remember series of 6 letters for serial recall. In the rhythm condition of Experiment…

  1. Technique and interpretation in tree seed radiography

    Treesearch

    Howard B. Kriebel

    1966-01-01

    The study of internal seed structure by radiography requires techniques which will give good definition. To establish the best procedures, we conducted a series of experiments in which we manipulated the principal controllable variables affecting the quality of X-radiographs: namely, focus-to-film distance, film speed (grain), exposure time, kilovoltage, and...

  2. Some Notes on "Hunting for Mushrooms in the Slough of Despond."

    ERIC Educational Resources Information Center

    Hogan, Robert F.

    In this series of aphoristic remarks about the situation of English teachers in the junior college, the following topics are discussed: administrative structure and accountability; teacher load; part time instructors and professionalism; Proposition 13; literacy and junior college students; evaluation of student papers; careerism in teachers and…

  3. Effects of Noun Phrase Type on Sentence Complexity

    ERIC Educational Resources Information Center

    Gordon, Peter C.; Hendrick, Randall; Johnson, Marcus

    2004-01-01

    A series of self-paced reading time experiments was performed to assess how characteristics of noun phrases (NPs) contribute to the difference in processing difficulty between object- and subject-extracted relative clauses. Structural semantic characteristics of the NP in the embedded clause (definite vs. indefinite and definite vs. generic) did…

  4. Interpretation of Late Cretaceous Volcanic Mounds and Surrounding Gulfian Series Formations Using 3D Seismic Data in Zavala County, Texas

    NASA Astrophysics Data System (ADS)

    Bennett, Laura Claire

    The Late Cretaceous Gulfian series is a prominent and important series across the State of Texas that has been extensively studied since the nineteenth century. It is composed of a series of southeast-dipping shelf carbonates and clastics deposited on the northwest margin of the Gulf of Mexico Basin. In south Texas, the Gulfian series was deposited in the Rio Grande Embayment and Maverick Basin and comprises the Eagle Ford Group, Austin Group, Anacacho Limestone, San Miguel Formation, Olmos Formation, and Escondido Formation, which crop out and continue basinward in the subsurface. Late Cretaceous volcanism formed volcanic mounds composed of altered palagonite tuff that are clustered into two fields, including the Uvalde Field centered in Zavala County. Using the Pedernales 3D seismic survey, located in east-central Zavala County, several volcanic mounds were identified and mapped without the use of well log data by identifying structures and characteristics associated with the volcanic mounds. Isolating these mounds through mapping enabled the mapping of the tops of the surrounding Gulfian formations, Lower Eagle Ford, Upper Eagle Ford, Austin, Anacacho, and San Miguel, for which time-structure, amplitude, similarity/coherency attribute, and isochron maps were generated. By using 3D seismic data, the volcanic mounds and their relation to the surrounding rocks can be better interpreted.

  5. Dielectric and structural characterisation of chalcogenide glasses via terahertz time-domain spectroscopy

    NASA Astrophysics Data System (ADS)

    Ravagli, A.; Naftaly, M.; Craig, C.; Weatherby, E.; Hewak, D. W.

    2017-07-01

    Terahertz time-domain spectroscopy (THz TDS) was used to investigate a series of chalcogenide glasses. In particular, the dielectric properties at terahertz frequencies were determined and correlated with the glass composition. The experimental results showed a strong relationship between the dielectric properties and the polarizability of the glasses studied. A new explanation based on the coordination number of the metallic cations was proposed to understand these observations.

  6. Detection of a sudden change of the field time series based on the Lorenz system.

    PubMed

    Da, ChaoJiu; Li, Fang; Shen, BingLu; Yan, PengCheng; Song, Jian; Ma, DeShan

    2017-01-01

    We conducted an exploratory study of the detection of a sudden change in a field time series based on the numerical solution of the Lorenz system. First, the times at which the Lorenz path jumped between the regions on the left and right of the equilibrium point of the Lorenz system were quantitatively marked, giving the sudden change times of the Lorenz system. Second, the numerical solution of the Lorenz system was regarded as a vector; thus, this solution could be considered a vector time series. We transformed the vector time series into a scalar time series using the vector inner product, taking into account the geometric and topological features of the Lorenz path. Third, sudden changes in the resulting time series were detected using the sliding t-test method. Comparing the test results with the quantitatively marked times indicated that the method could detect every sudden change of the Lorenz path; thus the method is effective. Finally, we used the method to detect sudden changes in a pressure field time series and a temperature field time series, and obtained good results for both series, which indicates that the method can be applied to high-dimensional vector time series. Mathematically, there is no essential difference between field time series and vector time series; thus, we provide a new method for the detection of sudden changes in field time series.
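
    The sliding t-test at the heart of the procedure compares the means of two adjacent windows at every position; the window length, significance threshold, and synthetic step series below are illustrative choices, not those of the paper.

      import numpy as np

      def sliding_t(x, w=30):
          t_stat = np.full(len(x), np.nan)
          for i in range(w, len(x) - w):
              a, b = x[i - w:i], x[i:i + w]
              s = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / w)     # pooled standard error
              t_stat[i] = (b.mean() - a.mean()) / s
          return t_stat

      rng = np.random.default_rng(8)
      x = np.concatenate([rng.standard_normal(300), 2.0 + rng.standard_normal(300)])
      t = sliding_t(x)
      # |t| above roughly 2.0 (the 5% critical value for 58 degrees of freedom) flags a sudden change
      change_points = np.where(np.abs(t) > 2.0)[0]
      print(change_points.min(), change_points.max())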

  7. Network Analyses for Space-Time High Frequency Wind Data

    NASA Astrophysics Data System (ADS)

    Laib, Mohamed; Kanevski, Mikhail

    2017-04-01

    Recently, network science has made an important contribution to the analysis, modelling and visualization of complex time series. Numerous methods have been proposed for constructing such networks. This work studies spatio-temporal wind data by using networks based on the Granger causality test. Furthermore, a visual comparison is carried out for several data frequencies and different sizes of moving window. The main attention is paid to the temporal evolution of connectivity intensity. The Hurst exponent is computed for the resulting time series in order to explore whether there is long-term memory in the connectivity. The results explore the space-time structure of wind data and can be applied to other environmental data. The dataset used presents a challenging case study. It consists of high-frequency (10 minute) wind data from 120 measuring stations in Switzerland, for the period 2012-2013. The distribution of stations covers different geomorphological zones and elevation levels. The results are also compared with the Pearson correlation network.
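
    The network-construction step reduces to a pairwise question: does station x help predict station y beyond y's own past? A minimal bivariate Granger causality F-test is sketched below (lag order, synthetic series and significance level are illustrative assumptions); applied to all station pairs within a moving window, such tests define the directed edges of the wind network.

      import numpy as np
      from scipy import stats

      def granger_fstat(x, y, p=3):
          """F statistic and p-value for 'x Granger-causes y' at lag order p."""
          n = len(y)
          Y = y[p:]
          lags_y = np.column_stack([y[p - k:n - k] for k in range(1, p + 1)])
          lags_x = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])
          ones = np.ones((n - p, 1))
          restricted = np.column_stack([ones, lags_y])            # y explained by its own past
          full = np.column_stack([ones, lags_y, lags_x])          # ... plus the past of x
          rss_r = np.sum((Y - restricted @ np.linalg.lstsq(restricted, Y, rcond=None)[0]) ** 2)
          rss_f = np.sum((Y - full @ np.linalg.lstsq(full, Y, rcond=None)[0]) ** 2)
          dfn, dfd = p, (n - p) - full.shape[1]
          F = ((rss_r - rss_f) / dfn) / (rss_f / dfd)
          return F, stats.f.sf(F, dfn, dfd)

      rng = np.random.default_rng(9)
      x = rng.standard_normal(2000)                               # wind speed at station x (synthetic)
      y = 0.5 * np.roll(x, 1) + 0.5 * rng.standard_normal(2000)   # station y lags x by one step
      F, pval = granger_fstat(x, y)
      edge_x_to_y = pval < 0.05                                   # draw the directed edge x -> y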

  8. 4D electron tomography.

    PubMed

    Kwon, Oh-Hoon; Zewail, Ahmed H

    2010-06-25

    Electron tomography provides three-dimensional (3D) imaging of noncrystalline and crystalline equilibrium structures, as well as elemental volume composition, of materials and biological specimens, including those of viruses and cells. We report the development of 4D electron tomography by integrating the fourth dimension (time resolution) with the 3D spatial resolution obtained from a complete tilt series of 2D projections of an object. The different time frames of tomograms constitute a movie of the object in motion, thus enabling studies of nonequilibrium structures and transient processes. The method was demonstrated using carbon nanotubes of a bracelet-like ring structure for which 4D tomograms display different modes of motion, such as breathing and wiggling, with resonance frequencies up to 30 megahertz. Applications can now make use of the full space-time range with the nanometer-femtosecond resolution of ultrafast electron tomography.

  9. 4D Electron Tomography

    NASA Astrophysics Data System (ADS)

    Kwon, Oh-Hoon; Zewail, Ahmed H.

    2010-06-01

    Electron tomography provides three-dimensional (3D) imaging of noncrystalline and crystalline equilibrium structures, as well as elemental volume composition, of materials and biological specimens, including those of viruses and cells. We report the development of 4D electron tomography by integrating the fourth dimension (time resolution) with the 3D spatial resolution obtained from a complete tilt series of 2D projections of an object. The different time frames of tomograms constitute a movie of the object in motion, thus enabling studies of nonequilibrium structures and transient processes. The method was demonstrated using carbon nanotubes of a bracelet-like ring structure for which 4D tomograms display different modes of motion, such as breathing and wiggling, with resonance frequencies up to 30 megahertz. Applications can now make use of the full space-time range with the nanometer-femtosecond resolution of ultrafast electron tomography.

  10. Online Time Series Analysis of Land Products over Asia Monsoon Region via Giovanni

    NASA Technical Reports Server (NTRS)

    Shen, Suhung; Leptoukh, Gregory G.; Gerasimov, Irina

    2011-01-01

    Time series analysis is critical to the study of land cover/land use changes and climate. Time series studies at local-to-regional scales require data of higher spatial resolution, such as 1 km or less. MODIS land products of 250 m to 1 km resolution enable such studies. However, such MODIS land data files are distributed in 10°x10° tiles, due to large data volumes. Conducting a time series study requires downloading all tiles that include the study area for the time period of interest, and mosaicking the tiles spatially. This can be an extremely time-consuming process. In support of the Monsoon Asia Integrated Regional Study (MAIRS) program, NASA GES DISC (Goddard Earth Sciences Data and Information Services Center) has processed MODIS land products at 1 km resolution over the Asia monsoon region (0°-60°N, 60°-150°E) with a common data structure and format. The processed data have been integrated into the Giovanni system (Goddard Interactive Online Visualization ANd aNalysis Infrastructure), which enables users to explore, analyze, and download data over an area and time period of interest easily. Currently, the following regional MODIS land products are available in Giovanni: 8-day 1 km land surface temperature and active fire, monthly 1 km vegetation index, and yearly 0.05°, 500 m land cover types. More data will be added in the near future. By combining atmospheric and oceanic data products in the Giovanni system, it is possible to do further analyses of environmental and climate changes associated with the land, ocean, and atmosphere. This presentation demonstrates exploring land products in the Giovanni system with sample case scenarios.

  11. Volatility of linear and nonlinear time series

    NASA Astrophysics Data System (ADS)

    Kalisky, Tomer; Ashkenazy, Yosef; Havlin, Shlomo

    2005-07-01

    Previous studies indicated that nonlinear properties of Gaussian distributed time series with long-range correlations, u_i, can be detected and quantified by studying the correlations in the magnitude series |u_i|, the "volatility." However, the origin of this empirical observation still remains unclear and the exact relation between the correlations in u_i and the correlations in |u_i| is still unknown. Here we develop analytical relations between the scaling exponent of a linear series u_i and that of its magnitude series |u_i|. Moreover, we find that nonlinear time series exhibit stronger (or the same) correlations in the magnitude time series compared with linear time series with the same two-point correlations. Based on these results we propose a simple model that generates multifractal time series by explicitly inserting long range correlations in the magnitude series; the nonlinear multifractal time series is generated by multiplying a long-range correlated time series (that represents the magnitude series) with an uncorrelated time series [that represents the sign series sgn(u_i)]. We apply our techniques to daily deep ocean temperature records from the equatorial Pacific, the region of the El Niño phenomenon, and find: (i) long-range correlations from several days to several years with 1/f power spectrum, (ii) significant nonlinear behavior as expressed by long-range correlations of the volatility series, and (iii) a broad multifractal spectrum.
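
    The proposed model is easy to emulate under stated assumptions (Fourier-filtering synthesis of the correlated component with an illustrative exponent): build a long-range correlated magnitude series, multiply it by an uncorrelated random sign series, and the correlations survive in |u_i| while u_i itself looks linear.

      import numpy as np

      rng = np.random.default_rng(10)
      n, beta = 2 ** 14, 0.8

      # long-range correlated series via spectral (1/f^beta) Fourier filtering
      freqs = np.fft.rfftfreq(n)
      amplitude = np.where(freqs > 0, freqs ** (-beta / 2), 0.0)
      phases = np.exp(2j * np.pi * rng.random(len(freqs)))
      correlated = np.fft.irfft(amplitude * phases, n)

      magnitude = np.abs(correlated)                 # stands in for the long-range correlated |u_i|
      signs = rng.choice([-1.0, 1.0], size=n)        # uncorrelated sign series sgn(u_i)
      u = magnitude * signs                          # multifractal-like series: weak linear two-point
                                                     # correlations, strong volatility correlations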

  12. Duality between Time Series and Networks

    PubMed Central

    Campanharo, Andriana S. L. O.; Sirer, M. Irmak; Malmgren, R. Dean; Ramos, Fernando M.; Amaral, Luís A. Nunes.

    2011-01-01

    Studying the interaction between a system's components and the temporal evolution of the system are two common ways to uncover and characterize its internal workings. Recently, several maps from a time series to a network have been proposed with the intent of using network metrics to characterize time series. Although these maps demonstrate that different time series result in networks with distinct topological properties, it remains unclear how these topological properties relate to the original time series. Here, we propose a map from a time series to a network with an approximate inverse operation, making it possible to use network statistics to characterize time series and time series statistics to characterize networks. As a proof of concept, we generate an ensemble of time series ranging from periodic to random and confirm that application of the proposed map retains much of the information encoded in the original time series (or networks) after application of the map (or its inverse). Our results suggest that network analysis can be used to distinguish different dynamic regimes in time series and, perhaps more importantly, time series analysis can provide a powerful set of tools that augment the traditional network analysis toolkit to quantify networks in new and useful ways. PMID:21858093
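
    A sketch in the spirit of such a map (a quantile-transition construction chosen for illustration; it is not claimed to be the authors' exact map): bin the series values into quantiles, let each quantile be a node, weight edges by transition counts, and invert approximately by running a random walk on the transition matrix and reading back quantile mid-values.

      import numpy as np

      rng = np.random.default_rng(11)
      x = np.cumsum(rng.standard_normal(5000))                    # example time series
      Q = 10
      edges = np.quantile(x, np.linspace(0, 1, Q + 1))
      states = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, Q - 1)

      W = np.zeros((Q, Q))
      for a, b in zip(states[:-1], states[1:]):                   # network: weighted transition counts
          W[a, b] += 1
      P = W / W.sum(axis=1, keepdims=True)                        # assumes every quantile is left at least once

      mids = 0.5 * (edges[:-1] + edges[1:])
      s, walk = states[0], []
      for _ in range(5000):                                       # approximate inverse: random walk on the network
          s = rng.choice(Q, p=P[s])
          walk.append(mids[s])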

  13. On-Line Monitoring and Diagnostics of the Integrity of Nuclear Plant Steam Generators and Heat Exchangers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belle R. Upadhyaya; J. Wesley Hines

    2004-09-27

    Integrity monitoring and flaw diagnostics of flat beams and tubular structures were investigated in this research task using guided acoustic signals. A piezo-sensor suite was deployed to activate and collect Lamb wave signals that propagate along metallic specimens. The dispersion curves of Lamb waves along plate and tubular structures are generated through numerical analysis. Several advanced techniques were explored to extract representative features from acoustic time series. Among them, the Hilbert-Huang transform (HHT) is a recently developed technique for the analysis of non-linear and transient signals. A moving window method was introduced to generate local peak characteristics from acoustic time series, and a zooming window technique was developed to localize structural flaws. Time-frequency analysis and pattern recognition techniques were combined to classify structural defects in brass tubes. Several types of flaws in brass tubes were tested, both in air and in water. The techniques also proved to be effective under background/process noise. A detailed theoretical analysis of Lamb wave propagation was performed and simulations were carried out using the finite element software system ABAQUS. This analytical study confirmed the behavior of the acoustic signals acquired from the experimental studies. The report presents the background to the analysis of acoustic signals acquired from piezo-electric transducers for structural defect monitoring. A comparison of the use of time-frequency techniques, including the Hilbert-Huang transform, is presented. The report presents the theoretical study of Lamb wave propagation in flat beams and tubular structures, and the need for mode separation in order to effectively perform defect diagnosis. The results of an extensive experimental study of detection, location, and isolation of structural defects in flat aluminum beams and brass tubes are presented. The results of this research show the feasibility of on-line monitoring of small structural flaws by the use of transient and nonlinear acoustic signal analysis, and its implementation by the proper design of a piezo-electric transducer suite.
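
    A hedged sketch of the moving-window feature extraction (synthetic burst-plus-echo signal, illustrative window length; a Hilbert envelope is used in place of the full HHT): the local peak of the envelope in each window forms a short feature series in which a flaw echo shows up as a secondary maximum whose position gives the arrival time.

      import numpy as np
      from scipy.signal import hilbert

      fs = 1_000_000                                           # 1 MHz sampling, illustrative
      t = np.arange(0, 2e-3, 1 / fs)
      excitation = np.sin(2 * np.pi * 1e5 * t) * np.exp(-((t - 3e-4) / 5e-5) ** 2)
      flaw_echo = 0.3 * np.sin(2 * np.pi * 1e5 * t) * np.exp(-((t - 1.3e-3) / 5e-5) ** 2)
      signal = excitation + flaw_echo + 0.02 * np.random.default_rng(12).standard_normal(len(t))

      envelope = np.abs(hilbert(signal))                       # analytic-signal amplitude
      win = 200
      starts = np.arange(0, len(t) - win, win)
      local_peaks = np.array([envelope[s:s + win].max() for s in starts])
      echo_window = starts[np.argsort(local_peaks)[-2]]        # second-largest peak marks the echo
      print("echo arrives near t =", t[echo_window], "s")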

  14. Recent Structural Change in Remote Sensing Data Time Series Linked to Farm Management in Horn of Africa (1999-2009)

    NASA Astrophysics Data System (ADS)

    Crisci, A.; Vignaroli, P.; Genesio, L.; Grasso, V.; Bacci, M.; Tarchiani, V.; Capecchi, V.

    2011-01-01

    Food security in the East Africa region depends essentially on the stability of rain-fed crop farming, which renders its society vulnerable to climatic fluctuations, which in Africa are most widely and directly related to rainfall. In this study, the relation between recent spatial rainfall variability and vegetation dynamics has been investigated for East African territories. The satellite raster products SPOT-4 Vegetation at 1 km resolution (Saint, 1995) and RFE (rainfall estimates) from the Famine Early Warning Systems Network (FEWS NET) are used. The survey is carried out at the administrative level using 10-day summaries extracted from the raster data for each spatial unit by means of specific polygonal layers. The time series cover two different periods: 1996-2009 for the rainfall estimates and 1999-2009 for NDVI. The first step of the analysis was to build, for each administrative unit, a coherent set of data along the time series, suitable for processing with state-of-the-art statistical tools. The analysis is based on the assumption that every structural break in vegetation dynamics could be caused by two alternative/complementary causes, namely: (i) modifications in crop farming systems (adaptation strategy) related to an eventual break-shift in the rainfall regime and/or (ii) other socio-economic factors. The BFAST R package (Verbesselt et al., 2010) is employed to conduct a comprehensive breakpoint analysis of the 10-day RFE series (spatial mean and standard deviation) and the 10-day NDVI series (spatial mean, mode and standard deviation). The cross-viewing of the years in which significant breaks have occurred, through appropriate GIS layering, provides an explorative interpretation of spatial climate/vegetation dynamics in the whole area. Moreover, the spatial and temporal pattern of ecosystem dynamics in response to climatic variability has been investigated using wavelet coherency with the SOWAS R package (Maraun, 2007). The wavelet coherency (WCOH) is a normalized time- and scale-resolved measure of the relationship between two time series (Maraun and Kurths, 2004). This kind of multi-scale temporal investigation provides an explanation of the breaks detected in the time series, confirming or rejecting their climatic linkage; results of the analysis are shown. Finally, in order to support the dissemination and sharing of information, interactive vegetation maps have been implemented as Google Earth mash-ups. The maturity of Web-based GIS enables the generation of thematic maps dynamically and efficiently, with thin/thick client or hybrid architectures. This could be a great support for the understanding of environmental phenomena.

  15. Investigating Access Performance of Long Time Series with Restructured Big Model Data

    NASA Astrophysics Data System (ADS)

    Shen, S.; Ostrenga, D.; Vollmer, B.; Meyer, D. J.

    2017-12-01

    Data sets generated by models are substantially increasing in volume, due to increases in spatial and temporal resolution and in the number of output variables. Many users wish to download subsetted data in preferred data formats and structures, as it is getting increasingly difficult to handle the original full-size data files. For example, application research users, such as those involved with wind or solar energy or extreme weather events, are likely only interested in daily or hourly model data at a single point or for a small area over a long time period, and prefer to have the data downloaded in a single file. With native model file structures, such as hourly data from the NASA Modern-Era Retrospective analysis for Research and Applications Version 2 (MERRA-2), it may take over 10 hours to extract the parameters of interest at a single point for 30 years. The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is exploring methods to address this particular user need. One approach is to create value-added data by restructuring the data files. Taking MERRA-2 data as an example, we have tested converting hourly data from one-day-per-file into different data cubes, such as one-month, one-year, or whole-mission cubes. Performance is compared for reading local data files and accessing data through interoperable services, such as OPeNDAP. Results show that, compared to the original file structure, the new data cubes offer much better performance for accessing long time series. We have noticed that performance is associated with the cube size and structure, the compression method, and how the data are accessed. An optimized data cube structure will not only improve data access, but may also enable better online analytic services.

  16. Investigating Access Performance of Long Time Series with Restructured Big Model Data

    NASA Technical Reports Server (NTRS)

    Shen, Suhung; Ostrenga, Dana M.; Vollmer, Bruce E.; Meyer, Dave

    2017-01-01

    Data sets generated by models are substantially increasing in volume, due to increases in spatial and temporal resolution, and the number of output variables. Many users wish to download subsetted data in preferred data formats and structures, as it is getting increasingly difficult to handle the original full-size data files. For example, application research users, such as those involved with wind or solar energy or extreme weather events, are likely only interested in daily or hourly model data at a single point (or for a small area) for a long time period, and prefer to have the data downloaded in a single file. With native model file structures, such as hourly data from NASA Modern-Era Retrospective analysis for Research and Applications Version-2 (MERRA-2), it may take over 10 hours for the extraction of parameters-of-interest at a single point for 30 years. The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is exploring methods to address this particular user need. One approach is to create value-added data by reconstructing the data files. Taking MERRA-2 data as an example, we have tested converting hourly data from one-day-per-file into different data cubes, such as one-month, or one-year. Performance is compared for reading local data files and accessing data through interoperable services, such as OPeNDAP. Results show that, compared to the original file structure, the new data cubes offer much better performance for accessing long time series. We have noticed that performance is associated with the cube size and structure, the compression method, and how the data are accessed. An optimized data cube structure will not only improve data access, but may also enable better online analysis services.
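
    The performance point can be illustrated with a toy layout experiment under obvious assumptions (a tiny synthetic grid and local memory-mapped files rather than archived granules): extracting a long single-point series from a time-major layout touches a thin strided slice of every time step, whereas a point-major cube serves the same series as one contiguous read.

      import numpy as np

      nt, ny, nx = 24 * 365, 18, 36                      # one year of hourly data on a coarse grid
      time_major = np.memmap("cube_time_major.dat", dtype="f4", mode="w+", shape=(nt, ny, nx))
      time_major[:] = 0.0                                # granule-like layout: one 2-D field per time step
      point_major = np.memmap("cube_point_major.dat", dtype="f4", mode="w+", shape=(ny, nx, nt))
      point_major[:] = 0.0                               # restructured cube: each point's series is contiguous

      j, i = 9, 18                                       # a single grid point of interest
      series_a = np.array(time_major[:, j, i])           # strided reads scattered across the whole file
      series_b = np.array(point_major[j, i, :])          # one contiguous read of the same series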

  17. Describing temporal variability of the mean Estonian precipitation series in climate time scale

    NASA Astrophysics Data System (ADS)

    Post, P.; Kärner, O.

    2009-04-01

    The applicability of random walk type models to represent the temporal variability of various atmospheric temperature series has been successfully demonstrated recently (e.g. Kärner, 2002). The main problem in temperature modeling is connected to the scale break in the generally self-similar air temperature anomaly series (Kärner, 2005). The break separates short-range strong non-stationarity from nearly stationary longer-range variability. This is an indication of the fact that several geophysical time series show short-range non-stationary behaviour and nearly stationary behaviour at longer ranges (Davis et al., 1996). In order to model such series, the choice of time step appears to be crucial. To characterize the long-range variability we can neglect the short-range non-stationary fluctuations, provided that we are able to model the long-range tendencies properly. The structure function (Monin and Yaglom, 1975) was used to determine an approximate segregation line between the short and the long scale in terms of modeling. The longer scale can be called the climate scale, because such models are applicable on scales over some decades. In order to get rid of the short-range fluctuations in daily series, the variability can be examined using a sufficiently long time step. In the present paper, we show that the same philosophy is useful for finding a model to represent the climate-scale temporal variability of the Estonian daily mean precipitation amount series over 45 years (1961-2005). The temporal variability of the obtained series is examined by means of an autoregressive integrated moving average (ARIMA) family model of type (0,1,1). This model is applicable for simulating daily precipitation if an appropriate time step is selected that allows the short-range non-stationary fluctuations to be neglected. A considerably longer time step than one day (30 days) is used in the current paper to model the precipitation time series variability. Each ARIMA (0,1,1) model can be interpreted as consisting of a random walk in a noisy environment (Box and Jenkins, 1976). The fitted model appears to be weakly non-stationary, which gives us the possibility of using a stationary approximation if only the noise component of the sum of white noise and random walk is exploited. We obtain a convenient routine to generate a stationary precipitation climatology with reasonable accuracy, since the noise component variance is much larger than the dispersion of the random walk generator. This interpretation emphasizes the dominating role of the random component in the precipitation series. The result is understandable given the small territory of Estonia, which is situated in the mid-latitude cyclone track. References: Box, G.E.P. and G. Jenkins 1976: Time Series Analysis, Forecasting and Control (revised edn.), Holden Day, San Francisco, CA, 575 pp. Davis, A., Marshak, A., Wiscombe, W. and R. Cahalan 1996: Multifractal characterizations of intermittency in nonstationary geophysical signals and fields. In G. Trevino et al. (eds) Current Topics in Nonstationarity Analysis. World Scientific, Singapore, 97-158. Kärner, O. 2002: On nonstationarity and antipersistency in global temperature series. J. Geophys. Res. D107; doi:10.1029/2001JD002024. Kärner, O. 2005: Some examples on negative feedback in the Earth climate system. Centr. European J. Phys. 3; 190-208. Monin, A.S. and A.M. Yaglom 1975: Statistical Fluid Mechanics, Vol 2. Mechanics of Turbulence, MIT Press, Boston, Mass, 886 pp.
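
    The quoted interpretation, a random walk observed in white noise being equivalent after first differencing to an MA(1), i.e. an ARIMA(0,1,1) model, can be checked numerically; the variance ratio and synthetic 30-day totals below are illustrative assumptions, not the Estonian data.

      import numpy as np

      rng = np.random.default_rng(13)
      n = 540                                            # roughly 45 years of 30-day sums
      walk = np.cumsum(0.2 * rng.standard_normal(n))     # slowly drifting (random walk) component
      noise = rng.standard_normal(n)                     # dominant white-noise component
      x = 50.0 + walk + noise                            # synthetic 30-day precipitation series

      d = np.diff(x)                                     # first difference of an ARIMA(0,1,1) process

      def acf(z, k):
          z = z - z.mean()
          return np.dot(z[:-k], z[k:]) / np.dot(z, z)

      # signature of MA(1): a negative lag-1 autocorrelation, higher lags near zero
      print([round(acf(d, k), 3) for k in range(1, 4)])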

  18. Precursory signatures of protein folding/unfolding: From time series correlation analysis to atomistic mechanisms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hsu, P. J.; Lai, S. K., E-mail: sklai@coll.phy.ncu.edu.tw; Molecular Science and Technology Program, Taiwan International Graduate Program, Academia Sinica, Taipei 115, Taiwan

    Folded conformations of proteins in thermodynamically stable states have long lifetimes. Before it folds into a stable conformation, or after unfolding from a stable conformation, the protein will generally stray from one random conformation to another, thus leading to rapid fluctuations. Brief structural changes therefore occur before folding and unfolding events. These short-lived movements are easily overlooked in studies of folding/unfolding for they represent momentary excursions of the protein to explore conformations in the neighborhood of the stable conformation. The present study looks for precursory signatures of protein folding/unfolding within these rapid fluctuations through a combination of three techniques: (1) ultrafast shape recognition, (2) time series segmentation, and (3) time series correlation analysis. The first procedure measures the differences between statistical distance distributions of atoms in different conformations by calculating shape similarity indices from molecular dynamics simulation trajectories. The second procedure is used to discover the times at which the protein makes transitions from one conformation to another. Finally, we employ the third technique to exploit spatial fingerprints of the stable conformations; this procedure is to map out the sequences of changes preceding the actual folding and unfolding events, since strongly correlated atoms in different conformations are different due to bond and steric constraints. The aforementioned high-frequency fluctuations are therefore characterized by distinct correlational and structural changes that are associated with rate-limiting precursors that translate into brief segments. Guided by these technical procedures, we choose a model system, a fragment of the protein transthyretin, for identifying in this system not only the precursory signatures of transitions associated with α helix and β hairpin, but also the important role played by weaker correlations in such protein folding dynamics.

  19. The GOLM-database standard- a framework for time-series data management based on free software

    NASA Astrophysics Data System (ADS)

    Eichler, M.; Francke, T.; Kneis, D.; Reusser, D.

    2009-04-01

    Monitoring and modelling projects usually involve time series data originating from different sources. File formats, temporal resolution and meta-data documentation rarely adhere to a common standard. As a result, much effort is spent on converting, harmonizing, merging, checking, resampling and reformatting these data. Moreover, in work groups or over the course of time, these tasks tend to be carried out redundantly and repeatedly, especially when new data become available. The resulting duplication of data in various formats ties up additional resources. We propose a database structure and complementary scripts for facilitating these tasks. The GOLM (General Observation and Location Management) framework allows for import and storage of time series data of different types while assisting in meta-data documentation, plausibility checking and harmonization. The imported data can be visually inspected and their coverage among locations and variables may be visualized. Supplementing scripts provide options for exporting data for selected stations and variables and for resampling the data to the desired temporal resolution. These tools can, for example, be used for generating model input files or reports. Since GOLM fully supports network access, the system can be used efficiently by distributed working groups accessing the same data over the internet. GOLM's database structure and the complementary scripts can easily be customized to specific needs. All involved software, such as MySQL, R, PHP and OpenOffice, as well as the scripts for building and using the database, including documentation, are free for download. GOLM was developed out of the practical requirements of the OPAQUE project. It has been tested and further refined in the ERANET-CRUE and SESAM projects, all of which used GOLM to manage meteorological, hydrological and/or water quality data.
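
    As a rough illustration of the kind of location/variable/observation schema such a framework implies, here is a minimal sketch using Python's built-in sqlite3; the real GOLM schema is MySQL-based and certainly differs in detail, and all table and column names below are invented.

      # Illustrative sketch only: a generic location/variable/observation layout for
      # time series plus meta-data, in the spirit of (but not identical to) GOLM.
      import sqlite3

      con = sqlite3.connect(":memory:")
      con.executescript("""
      CREATE TABLE location (
          loc_id    INTEGER PRIMARY KEY,
          name      TEXT NOT NULL,
          lat       REAL, lon REAL, elevation REAL
      );
      CREATE TABLE variable (
          var_id    INTEGER PRIMARY KEY,
          name      TEXT NOT NULL,       -- e.g. 'precipitation'
          unit      TEXT NOT NULL        -- e.g. 'mm'
      );
      CREATE TABLE observation (
          loc_id    INTEGER REFERENCES location(loc_id),
          var_id    INTEGER REFERENCES variable(var_id),
          ts        TEXT NOT NULL,       -- ISO 8601 timestamp
          value     REAL,
          flag      TEXT,                -- plausibility-check flag
          PRIMARY KEY (loc_id, var_id, ts)
      );
      """)
      con.execute("INSERT INTO location VALUES (1, 'gauge_A', 52.4, 13.1, 35.0)")
      con.execute("INSERT INTO variable VALUES (1, 'precipitation', 'mm')")
      con.execute("INSERT INTO observation VALUES (1, 1, '2009-04-01T00:00:00', 2.3, 'ok')")
      print(con.execute("SELECT COUNT(*) FROM observation").fetchone())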

  20. Precursory signatures of protein folding/unfolding: From time series correlation analysis to atomistic mechanisms

    NASA Astrophysics Data System (ADS)

    Hsu, P. J.; Cheong, S. A.; Lai, S. K.

    2014-05-01

    Folded conformations of proteins in thermodynamically stable states have long lifetimes. Before it folds into a stable conformation, or after unfolding from a stable conformation, the protein will generally stray from one random conformation to another, thus leading to rapid fluctuations. Brief structural changes therefore occur before folding and unfolding events. These short-lived movements are easily overlooked in studies of folding/unfolding for they represent momentary excursions of the protein to explore conformations in the neighborhood of the stable conformation. The present study looks for precursory signatures of protein folding/unfolding within these rapid fluctuations through a combination of three techniques: (1) ultrafast shape recognition, (2) time series segmentation, and (3) time series correlation analysis. The first procedure measures the differences between statistical distance distributions of atoms in different conformations by calculating shape similarity indices from molecular dynamics simulation trajectories. The second procedure is used to discover the times at which the protein makes transitions from one conformation to another. Finally, we employ the third technique to exploit spatial fingerprints of the stable conformations; this procedure is to map out the sequences of changes preceding the actual folding and unfolding events, since strongly correlated atoms in different conformations are different due to bond and steric constraints. The aforementioned high-frequency fluctuations are therefore characterized by distinct correlational and structural changes that are associated with rate-limiting precursors that translate into brief segments. Guided by these technical procedures, we choose a model system, a fragment of the protein transthyretin, for identifying in this system not only the precursory signatures of transitions associated with α helix and β hairpin, but also the important role played by weaker correlations in such protein folding dynamics.
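
    A minimal sketch of the first ingredient, an ultrafast-shape-recognition-style descriptor and a frame-to-frame similarity index, is given below; the coordinates are synthetic stand-ins for an MD trajectory, and the moment definitions follow the generic USR recipe rather than the authors' exact implementation.

      # Hedged sketch: USR-style shape descriptors per frame and a similarity index
      # between consecutive frames, computed on synthetic coordinates.
      import numpy as np

      def usr_descriptor(xyz):
          """12-element shape descriptor for one frame (N x 3 array of atom positions)."""
          ctd = xyz.mean(axis=0)                                   # centroid
          d_ctd = np.linalg.norm(xyz - ctd, axis=1)
          cst = xyz[d_ctd.argmin()]                                # atom closest to centroid
          fct = xyz[d_ctd.argmax()]                                # atom farthest from centroid
          ftf = xyz[np.linalg.norm(xyz - fct, axis=1).argmax()]    # atom farthest from fct
          feats = []
          for ref in (ctd, cst, fct, ftf):
              d = np.linalg.norm(xyz - ref, axis=1)
              feats += [d.mean(), d.std(), ((d - d.mean()) ** 3).mean()]  # first three moments
          return np.array(feats)

      def similarity(a, b):
          return 1.0 / (1.0 + np.abs(a - b).mean())                # 1 means identical shapes

      rng = np.random.default_rng(1)
      traj = rng.normal(size=(500, 60, 3)).cumsum(axis=0) * 0.01 + rng.normal(size=(60, 3))
      desc = np.array([usr_descriptor(frame) for frame in traj])
      sim = np.array([similarity(desc[i], desc[i + 1]) for i in range(len(desc) - 1)])
      print("mean frame-to-frame similarity:", sim.mean())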

  1. Spatial and Temporal Variation in DeSoto Canyon Macrofaunal Community Structure

    NASA Astrophysics Data System (ADS)

    Baco-Taylor, A.; Shantharam, A. K.

    2016-02-01

    Sediment-dwelling macrofauna (polychaetes, bivalves, and assorted crustaceans ≥ 300 µm) have long served as biological indicators of ecosystem stress. As part of evaluating the 2010 impact from the Deepwater Horizon blowout, we sampled 12 sites along and transverse to the DeSoto Canyon axis, Gulf of Mexico, as well as 2 control sites outside the canyon. Sites ranged in depth from 479 to 2310 m. Three of the sites (PCB06, S36, and XC4) were sampled annually from 2012 to 2014. We provide an overview of the macrofauna community structure of canyon and non-canyon sites, as well as trends in community structure and diversity at the time-series sites. Compositionally, polychaetes dominated the communities, followed by tanaid crustaceans and bivalves. The total number of individuals was not significantly correlated with depth, while the total number of taxa and species richness were. Rarefaction shows that the deepest station, XC4 (2310 m), had the lowest diversity, while NT800 (a non-canyon control at 800 m) had the highest. Multivariate analysis shows that the canyon assemblages fall into eight clusters, with the non-canyon stations forming a separate ninth cluster, indicating a detectable difference between canyon and non-canyon communities. Time-series stations show an increase in diversity from 2012 to 2014, with strong overlap in community structure in the 2013 and 2014 samples. Environmental analysis, via BEST, using data from 10 canyon sites and the controls, indicated that depth in combination with latitude explains the most variation in macrofaunal community structure.

  2. Detection of a sudden change of the field time series based on the Lorenz system

    PubMed Central

    Li, Fang; Shen, BingLu; Yan, PengCheng; Song, Jian; Ma, DeShan

    2017-01-01

    We conducted an exploratory study of the detection of a sudden change of the field time series based on the numerical solution of the Lorenz system. First, the time when the Lorenz path jumped between the regions on the left and right of the equilibrium point of the Lorenz system was quantitatively marked and the sudden change time of the Lorenz system was obtained. Second, the numerical solution of the Lorenz system was regarded as a vector; thus, this solution could be considered as a vector time series. We transformed the vector time series into a time series using the vector inner product, considering the geometric and topological features of the Lorenz system path. Third, the sudden change of the resulting time series was detected using the sliding t-test method. Comparing the test results with the quantitatively marked time indicated that the method could detect every sudden change of the Lorenz path; thus, the method is effective. Finally, we used the method to detect the sudden change of the pressure field time series and temperature field time series, and obtained good results for both series, which indicates that the method can be applied to high-dimensional vector time series. Mathematically, there is no essential difference between the field time series and vector time series; thus, we provide a new method for the detection of the sudden change of the field time series. PMID:28141832
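
    The following sketch illustrates the pipeline on the Lorenz system itself: integrate the equations, reduce the three-component solution to a scalar series with an inner product (here, of consecutive state vectors, which is one plausible reading of the abstract), and apply a sliding t-test; the window length and threshold are arbitrary choices, not the authors' values.

      # Hedged sketch: Lorenz trajectory -> scalar series via inner product -> sliding t-test.
      import numpy as np
      from scipy.integrate import solve_ivp
      from scipy.stats import ttest_ind

      def lorenz(t, v, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
          x, y, z = v
          return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

      t_eval = np.linspace(0, 40, 4000)
      sol = solve_ivp(lorenz, (0, 40), [1.0, 1.0, 1.0], t_eval=t_eval, rtol=1e-8)
      states = sol.y.T                                             # shape (4000, 3)
      scalar = np.einsum("ij,ij->i", states[:-1], states[1:])      # inner product of consecutive states

      win = 50
      t_stat = np.full(len(scalar), np.nan)
      for i in range(win, len(scalar) - win):
          t_stat[i] = ttest_ind(scalar[i - win:i], scalar[i:i + win], equal_var=False).statistic
      changes = np.where(np.abs(t_stat) > 6.0)[0]                  # crude threshold; tune per application
      print("candidate change points:", changes[:10])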

  3. Application of time series analysis on molecular dynamics simulations of proteins: a study of different conformational spaces by principal component analysis.

    PubMed

    Alakent, Burak; Doruker, Pemra; Camurdan, Mehmet C

    2004-09-08

    Time series analysis is applied to the collective coordinates obtained from principal component analysis of independent molecular dynamics simulations of alpha-amylase inhibitor tendamistat and the immunity protein of colicin E7, based on the Calpha coordinate history. Even though the principal component directions obtained for each run are considerably different, the dynamics information obtained from these runs is surprisingly similar in terms of time series models and parameters. There are two main differences in the dynamics of the two proteins: the higher density of low frequencies and the larger step sizes for the interminima motions of colicin E7 compared with those of alpha-amylase inhibitor, which may be attributed to the higher number of residues of colicin E7 and/or the structural differences of the two proteins. The cumulative density function of the low frequencies in each run conforms to the expectations from normal mode analysis. When different runs of alpha-amylase inhibitor are projected on the same set of eigenvectors, it is found that principal components obtained from a certain conformational region of a protein have moderate explanatory power in other conformational regions and the local minima are similar to a certain extent, while the heights of the energy barriers between the minima change significantly. As a final remark, time series analysis tools are further exploited in this study with the motive of explaining the equilibrium fluctuations of proteins. Copyright 2004 American Institute of Physics.

  4. Application of time series analysis on molecular dynamics simulations of proteins: A study of different conformational spaces by principal component analysis

    NASA Astrophysics Data System (ADS)

    Alakent, Burak; Doruker, Pemra; Camurdan, Mehmet C.

    2004-09-01

    Time series analysis is applied to the collective coordinates obtained from principal component analysis of independent molecular dynamics simulations of α-amylase inhibitor tendamistat and the immunity protein of colicin E7, based on the Cα coordinate history. Even though the principal component directions obtained for each run are considerably different, the dynamics information obtained from these runs is surprisingly similar in terms of time series models and parameters. There are two main differences in the dynamics of the two proteins: the higher density of low frequencies and the larger step sizes for the interminima motions of colicin E7 compared with those of α-amylase inhibitor, which may be attributed to the higher number of residues of colicin E7 and/or the structural differences of the two proteins. The cumulative density function of the low frequencies in each run conforms to the expectations from normal mode analysis. When different runs of α-amylase inhibitor are projected on the same set of eigenvectors, it is found that principal components obtained from a certain conformational region of a protein have moderate explanatory power in other conformational regions and the local minima are similar to a certain extent, while the heights of the energy barriers between the minima change significantly. As a final remark, time series analysis tools are further exploited in this study with the motive of explaining the equilibrium fluctuations of proteins.
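
    A minimal sketch of the first two steps, PCA of a coordinate history followed by an autoregressive fit to the leading principal component, is shown below using synthetic data in place of a real Cα trajectory; the AR order is arbitrary.

      # Hedged sketch: PCA of a (frames x 3N) coordinate history, then an AR fit to PC1.
      import numpy as np
      from statsmodels.tsa.ar_model import AutoReg

      rng = np.random.default_rng(2)
      n_frames, n_atoms = 2000, 40
      coords = rng.normal(size=(n_frames, n_atoms * 3)).cumsum(axis=0) * 0.02   # stand-in trajectory

      X = coords - coords.mean(axis=0)                  # remove the mean structure
      cov = X.T @ X / (n_frames - 1)
      eigval, eigvec = np.linalg.eigh(cov)              # eigenvalues in ascending order
      pc1 = X @ eigvec[:, -1]                           # projection on the top principal component

      ar_fit = AutoReg(pc1, lags=5).fit()               # simple time series model of the collective coordinate
      print("AR(5) coefficients:", ar_fit.params)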

  5. The Application of Finite Element Solution Techniques in Structural Analysis on a Microcomputer.

    DTIC Science & Technology

    1981-12-01

    my wife for her support of this research project and the amount of time she spent helping me in preparation. Thanks go to the personnel at Computer...questions which had to be answered concerning the microcomputer in relation to a sequentially programmed finite element program. The first was how big...central site, then usefulness of the microcomputer is limited. The first series of problems consisted of a simple truss structure, which was expanded

  6. Spatio-Temporal Story Mapping Animation Based On Structured Causal Relationships Of Historical Events

    NASA Astrophysics Data System (ADS)

    Inoue, Y.; Tsuruoka, K.; Arikawa, M.

    2014-04-01

    In this paper, we propose a user interface that displays visual animations on geographic maps and timelines to depict historical stories by representing causal relationships among events as time series. We have been developing an experimental software system for the spatio-temporal visualization of historical stories on tablet computers. Our proposed system helps people learn historical stories effectively using visual animations based on hierarchical structures of timelines and maps at different scales.

  7. A Multi-Scale Structural Health Monitoring Approach for Damage Detection, Diagnosis and Prognosis in Aerospace Structures

    DTIC Science & Technology

    2012-01-20

    ultrasonic Lamb waves to plastic strain and fatigue life. Theory was developed and validated to predict second harmonic generation for specific mode... Fatigue and damage generation and progression are processes consisting of a series of interrelated events that span large scales of space and time...strain and fatigue life A set of experiments were completed that worked to relate the acoustic nonlinearity measured with Lamb waves to both the

  8. Decadal variability of the Tropical Atlantic Ocean Surface Temperature in shipboard measurements and in a Global Ocean-Atmosphere model

    NASA Technical Reports Server (NTRS)

    Mehta, Vikram M.; Delworth, Thomas

    1995-01-01

    Sea surface temperature (SST) variability was investigated in a 200-yr integration of a global model of the coupled oceanic and atmospheric general circulations developed at the Geophysical Fluid Dynamics Laboratory (GFDL). The second 100 yr of SST in the coupled model's tropical Atlantic region were analyzed with a variety of techniques. Analyses of SST time series, averaged over approximately the same subregions as the Global Ocean Surface Temperature Atlas (GOSTA) time series, showed that the GFDL SST anomalies also undergo pronounced quasi-oscillatory decadal and multidecadal variability but at somewhat shorter timescales than the GOSTA SST anomalies. Further analyses of the horizontal structures of the decadal timescale variability in the GFDL coupled model showed the existence of two types of variability in general agreement with results of the GOSTA SST time series analyses. One type, characterized by timescales between 8 and 11 yr, has high spatial coherence within each hemisphere but not between the two hemispheres of the tropical Atlantic. A second type, characterized by timescales between 12 and 20 yr, has high spatial coherence between the two hemispheres. The second type of variability is considerably weaker than the first. As in the GOSTA time series, the multidecadal variability in the GFDL SST time series has approximately opposite phases between the tropical North and South Atlantic Oceans. Empirical orthogonal function analyses of the tropical Atlantic SST anomalies revealed a north-south bipolar pattern as the dominant pattern of decadal variability. It is suggested that the bipolar pattern can be interpreted as decadal variability of the interhemispheric gradient of SST anomalies. The decadal and multidecadal timescale variability of the tropical Atlantic SST, both in the actual and in the GFDL model, stands out significantly above the background 'red noise' and is coherent within each of the time series, suggesting that specific sets of processes may be responsible for the choice of the decadal and multidecadal timescales. Finally, it must be emphasized that the GFDL coupled ocean-atmosphere model generates the decadal and multidecadal timescale variability without any externally applied force, solar or lunar, at those timescales.

  9. Coupling Poisson rectangular pulse and multiplicative microcanonical random cascade models to generate sub-daily precipitation timeseries

    NASA Astrophysics Data System (ADS)

    Pohle, Ina; Niebisch, Michael; Müller, Hannes; Schümberg, Sabine; Zha, Tingting; Maurer, Thomas; Hinz, Christoph

    2018-07-01

    To simulate the impacts of within-storm rainfall variabilities on fast hydrological processes, long precipitation time series with high temporal resolution are required. Due to the limited availability of observed data, such time series are typically obtained from stochastic models. However, most existing rainfall models are limited in their ability to conserve rainfall event statistics which are relevant for hydrological processes. Poisson rectangular pulse models are widely applied to generate long time series of alternating precipitation event durations and mean intensities as well as interstorm period durations. Multiplicative microcanonical random cascade (MRC) models are used to disaggregate precipitation time series from coarse to fine temporal resolution. To overcome the inconsistencies between the temporal structure of the Poisson rectangular pulse model and the MRC model, we developed a new coupling approach by introducing two modifications to the MRC model. These modifications comprise (a) a modified cascade model ("constrained cascade") which preserves the event durations generated by the Poisson rectangular pulse model by constraining the first and last interval of a precipitation event to contain precipitation and (b) continuous sigmoid functions of the multiplicative weights to consider the scale-dependency in the disaggregation of precipitation events of different durations. The constrained cascade model was evaluated in its ability to disaggregate observed precipitation events in comparison to existing MRC models. For that, we used a 20-year record of hourly precipitation at six stations across Germany. The constrained cascade model showed markedly better agreement with the observed data in terms of both the temporal pattern of the precipitation time series (e.g. the dry and wet spell durations and autocorrelations) and event characteristics (e.g. intra-event intermittency and intensity fluctuation within events). The constrained cascade model also slightly outperformed the other MRC models with respect to the intensity-frequency relationship. To assess the performance of the coupled Poisson rectangular pulse and constrained cascade model, precipitation events were stochastically generated by the Poisson rectangular pulse model and then disaggregated by the constrained cascade model. We found that the coupled model performs satisfactorily in terms of the temporal pattern of the precipitation time series, event characteristics and the intensity-frequency relationship.
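
    A bare-bones microcanonical random cascade, without the paper's constrained-cascade and sigmoid-weight modifications, can be sketched as follows; the split-weight distribution and the dry-split probability are illustrative guesses, and mass is conserved exactly at every level.

      # Hedged sketch of a microcanonical random cascade: each coarse interval's total is
      # split into two halves with weights (w, 1-w), so the coarse totals are conserved.
      # The "constrained" variant would additionally force the first and last sub-intervals
      # of an event to remain wet; that refinement is not implemented here.
      import numpy as np

      def mrc_disaggregate(coarse, n_levels, p_dry=0.3, rng=None):
          """Disaggregate each value into 2**n_levels sub-intervals, conserving totals."""
          rng = rng or np.random.default_rng()
          series = np.asarray(coarse, dtype=float)
          for _ in range(n_levels):
              out = np.empty(series.size * 2)
              for i, v in enumerate(series):
                  if v == 0.0:
                      w = 0.5                        # nothing to distribute
                  elif rng.random() < p_dry:
                      w = rng.choice([0.0, 1.0])     # all mass to one side (intermittency)
                  else:
                      w = rng.beta(2.0, 2.0)         # interior split; could be made scale-dependent
                  out[2 * i], out[2 * i + 1] = v * w, v * (1.0 - w)
              series = out
          return series

      daily = np.array([12.0, 0.0, 3.5, 20.1])            # example daily totals (mm)
      fine = mrc_disaggregate(daily, n_levels=3)          # 8 sub-intervals per day
      print(fine.reshape(4, 8).sum(axis=1))               # totals are preserved exactly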

  10. Using time-delayed mutual information to discover and interpret temporal correlation structure in complex populations

    NASA Astrophysics Data System (ADS)

    Albers, D. J.; Hripcsak, George

    2012-03-01

    This paper addresses how to calculate and interpret the time-delayed mutual information (TDMI) for a complex, diversely and sparsely measured, possibly non-stationary population of time-series of unknown composition and origin. The primary vehicle used for this analysis is a comparison between the time-delayed mutual information averaged over the population and the time-delayed mutual information of an aggregated population (here, aggregation implies the population is conjoined before any statistical estimates are implemented). Through the use of information theoretic tools, a sequence of practically implementable calculations are detailed that allow for the average and aggregate time-delayed mutual information to be interpreted. Moreover, these calculations can also be used to understand the degree of homo or heterogeneity present in the population. To demonstrate that the proposed methods can be used in nearly any situation, the methods are applied and demonstrated on the time series of glucose measurements from two different subpopulations of individuals from the Columbia University Medical Center electronic health record repository, revealing a picture of the composition of the population as well as physiological features.
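
    A minimal sketch of the central comparison, the population-averaged TDMI versus the TDMI of the aggregated (concatenated) population, is given below with a histogram-based mutual information estimator and synthetic series; the bin count and lag are arbitrary.

      # Hedged sketch: average-of-TDMI over a population vs. TDMI of the aggregated population.
      import numpy as np

      def tdmi(x, lag, bins=16):
          """Mutual information (nats) between x(t) and x(t+lag), from a 2-D histogram."""
          a, b = x[:-lag], x[lag:]
          joint, _, _ = np.histogram2d(a, b, bins=bins)
          p = joint / joint.sum()
          px, py = p.sum(axis=1, keepdims=True), p.sum(axis=0, keepdims=True)
          nz = p > 0
          return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

      rng = np.random.default_rng(3)
      population = [np.sin(np.linspace(0, 20, 600) + rng.uniform(0, 6.28))
                    + 0.3 * rng.normal(size=600) for _ in range(30)]

      lag = 10
      averaged = np.mean([tdmi(s, lag) for s in population])    # average of per-series TDMI
      aggregate = tdmi(np.concatenate(population), lag)          # TDMI of the conjoined population
      print(f"averaged TDMI: {averaged:.3f}  aggregate TDMI: {aggregate:.3f}")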

  11. Complementary effects of surface water and groundwater on soil moisture dynamics in a degraded coastal floodplain forest

    NASA Astrophysics Data System (ADS)

    Kaplan, D.; Muñoz-Carpena, R.

    2011-02-01

    SummaryRestoration of degraded floodplain forests requires a robust understanding of surface water, groundwater, and vadose zone hydrology. Soil moisture is of particular importance for seed germination and seedling survival, but is difficult to monitor and often overlooked in wetland restoration studies. This research hypothesizes that the complex effects of surface water and shallow groundwater on the soil moisture dynamics of floodplain wetlands are spatially complementary. To test this hypothesis, 31 long-term (4-year) hydrological time series were collected in the floodplain of the Loxahatchee River (Florida, USA), where watershed modifications have led to reduced freshwater flow, altered hydroperiod and salinity, and a degraded ecosystem. Dynamic factor analysis (DFA), a time series dimension reduction technique, was applied to model temporal and spatial variation in 12 soil moisture time series as linear combinations of common trends (representing shared, but unexplained, variability) and explanatory variables (selected from 19 additional candidate hydrological time series). The resulting dynamic factor models yielded good predictions of observed soil moisture series (overall coefficient of efficiency = 0.90) by identifying surface water elevation, groundwater elevation, and net recharge (cumulative rainfall-cumulative evapotranspiration) as important explanatory variables. Strong and complementary linear relationships were found between floodplain elevation and surface water effects (slope = 0.72, R2 = 0.86, p < 0.001), and between elevation and groundwater effects (slope = -0.71, R2 = 0.71, p = 0.001), while the effect of net recharge was homogenous across the experimental transect (slope = 0.03, R2 = 0.05, p = 0.242). This study provides a quantitative insight into the spatial structure of groundwater and surface water effects on soil moisture that will be useful for refining monitoring plans and developing ecosystem restoration and management scenarios in degraded coastal floodplains.
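
    As a rough stand-in for the DFA described above, the following sketch fits a one-factor dynamic factor model with an explanatory regressor to a few synthetic soil-moisture-like series using statsmodels' DynamicFactor; it is not the authors' model or data, and the factor/lag choices are arbitrary.

      # Hedged sketch: common trend + explanatory variable for several related series.
      import numpy as np
      import pandas as pd
      from statsmodels.tsa.statespace.dynamic_factor import DynamicFactor

      rng = np.random.default_rng(4)
      n = 400
      trend = np.cumsum(rng.normal(scale=0.1, size=n))             # shared, unexplained variability
      surface_water = np.sin(np.linspace(0, 12, n)) + 0.1 * rng.normal(size=n)
      soil = pd.DataFrame({f"probe_{k}": 0.8 * trend + (0.9 - 0.2 * k) * surface_water
                           + 0.2 * rng.normal(size=n) for k in range(4)})

      model = DynamicFactor(soil, k_factors=1, factor_order=1,
                            exog=pd.DataFrame({"surface_water": surface_water}))
      res = model.fit(disp=False, maxiter=200)
      print(res.summary())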

  12. Inter-annual cascade effect on marine food web: A benthic pathway lagging nutrient supply to pelagic fish stock

    PubMed Central

    Fernandes, Lohengrin Dias de Almeida; Fagundes Netto, Eduardo Barros; Coutinho, Ricardo

    2017-01-01

    Currently, spatial and temporal changes in nutrient availability, marine plankton, and fish communities are best described on a shorter than inter-annual (seasonal) scale, primarily because the simultaneous year-to-year variations in physical, chemical, and biological parameters are very complex. The limited availability of time series datasets furnishing simultaneous evaluations of temperature, nutrients, plankton, and fish has limited our ability to describe and to predict variability related to short-term processes, such as species-specific phenology and environmental seasonality. In the present study, we combine a computational time series analysis on a 15-year (1995–2009) weekly-sampled time series (high-resolution long-term time series, 780 weeks) with an Autoregressive Distributed Lag Model to track non-seasonal changes in 10 potentially related parameters: sea surface temperature, nutrient concentrations (NO2, NO3, NH4 and PO4), phytoplankton biomass (as in situ chlorophyll a biomass), meroplankton (barnacle and mussel larvae), and fish abundance (Mugil liza and Caranx latus). Our data demonstrate for the first time that highly intense and frequent upwelling years initiate a huge energy flux that is not fully transmitted through the classical size-structured food web by bottom-up stimulus but through additional ontogenetic steps. A delayed inter-annual sequential effect from phytoplankton up to top predators such as carnivorous fishes is expected if most of the energy is trapped in benthic filter-feeding organisms and their larval forms. These sequential events can explain major changes in the ecosystem food web that were not predicted by previous short-term models. PMID:28886162

  13. Rodent CNS neuron development: Timing of cell birth and death

    NASA Technical Reports Server (NTRS)

    Keefe, J. R.

    1984-01-01

    Data obtained from a staged series of single paired injections of tritiated thymidine to pregnant Wistar rats or C57BL/6J mice on selected embryonic days and several postnatal times are reported. All injected specimens were allowed to come to term, each litter was culled to six pups, and specimens were sacrificed on PN28, with fixation and embedding in paraffin and plastic. The results are derived from serial paraffin sections of PN28 animals exposed to autoradiographic processing and plotted with respect to heavily labelled cell nuclei present in the selected brain stem nuclei and sensory ganglia. Counts from each time sample/structure are totalled, and the percentage of cells in the total labelled population/structure represented by each injection time interval is plotted.

  14. Discovery of Jurassic ammonite-bearing series in Jebel Bou Hedma (South-Central Tunisian Atlas): Implications for stratigraphic correlations and paleogeographic reconstruction

    NASA Astrophysics Data System (ADS)

    Bahrouni, Néjib; Houla, Yassine; Soussi, Mohamed; Boughdiri, Mabrouk; Ali, Walid Ben; Nasri, Ahmed; Bouaziz, Samir

    2016-01-01

    Recent geological mapping undertaken in the Southern-Central Atlas of Tunisia led to the discovery of Jurassic ammonite-bearing series in the Jebel Bou Hedma E-W anticline structure. These series represent the southernmost Jurassic rocks ever documented in the outcrops of the Tunisian Atlas. These series, which outcrop in a transitional zone between the Southern Tunisian Atlas and the Chott basin, offer a valuable benchmark for new stratigraphic correlation with the well-known Jurassic series of the North-South Axis of Central Tunisia and also with the Jurassic subsurface successions transected by petroleum wells in the study area. The preliminary investigations allowed the identification, within the most complete section outcropping in the center of the structure, of numerous useful biochronological and sedimentological markers helping in the establishment of an updated Jurassic stratigraphic framework chart for South-Western Tunisia. Additionally, the Late Jurassic succession documents syn-sedimentary features such as slumping, erosion and reworking of sediments and ammonite faunas that can be considered strong evidence of an important geodynamic event around the Jurassic-Cretaceous boundary. These new stratigraphic and geodynamic data make the Jurassic of Jebel Bou Hedma a key succession for stratigraphic correlation between Tunisian Atlas series and those currently buried in the Chott basin or outcropping on the Saharan platform. Furthermore, the several ammonite-rich horizons identified within the Middle and Upper Jurassic series constitute reliable time lines that can be useful both for paleogeographic and geodynamic reconstructions of this part of the North African Tethyan margin and for the refinement of the potential migration routes for ammonite populations from the Maghrebian Southern Tethys to Arabia.

  15. Microscopic Spin Model for the STOCK Market with Attractor Bubbling on Regular and Small-World Lattices

    NASA Astrophysics Data System (ADS)

    Krawiecki, A.

    A multi-agent spin model for changes of prices in the stock market based on the Ising-like cellular automaton with interactions between traders randomly varying in time is investigated by means of Monte Carlo simulations. The structure of interactions has topology of a small-world network obtained from regular two-dimensional square lattices with various coordination numbers by randomly cutting and rewiring edges. Simulations of the model on regular lattices do not yield time series of logarithmic price returns with statistical properties comparable with the empirical ones. In contrast, in the case of networks with a certain degree of randomness for a wide range of parameters the time series of the logarithmic price returns exhibit intermittent bursting typical of volatility clustering. Also the tails of distributions of returns obey a power scaling law with exponents comparable to those obtained from the empirical data.

  16. Fractal structure of the interplanetary magnetic field

    NASA Technical Reports Server (NTRS)

    Burlaga, L. F.; Klein, L. W.

    1985-01-01

    Under some conditions, time series of the interplanetary magnetic field strength and components have the properties of fractal curves. Magnetic field measurements made near 8.5 AU by Voyager 2 from June 5 to August 24, 1981 were self-similar over time scales from approximately 20 sec to approximately 3 x 10^5 sec, and the fractal dimension of the time series of the strength and components of the magnetic field was D = 5/3, corresponding to a power spectrum P(f) ~ f^(-5/3). Since the Kolmogorov spectrum for homogeneous, isotropic, stationary turbulence is also f^(-5/3), the Voyager 2 measurements are consistent with the observation of an inertial range of turbulence extending over approximately four decades in frequency. Interaction regions probably contributed most of the power in this interval. As an example, one interaction region is discussed in which the magnetic field had a fractal dimension D = 5/3.
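
    The spectral side of this analysis can be sketched as follows: estimate the log-log slope of the power spectrum of a series (here synthetic Brownian-like noise, not Voyager 2 data); a slope near -5/3 would correspond to the Kolmogorov inertial range discussed above.

      # Hedged sketch: log-log power-spectrum slope of a synthetic series.
      import numpy as np

      rng = np.random.default_rng(6)
      n = 2 ** 14
      signal = np.cumsum(rng.normal(size=n))          # Brownian-like series; expected slope near -2

      freqs = np.fft.rfftfreq(n, d=1.0)[1:]           # drop the zero frequency
      power = np.abs(np.fft.rfft(signal))[1:] ** 2
      slope, _ = np.polyfit(np.log(freqs), np.log(power), 1)
      print("spectral slope:", slope)                 # a slope near -5/3 would indicate Kolmogorov scaling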

  17. [The calming process of anger experience: time series changes of affects, cognitions, and behaviors].

    PubMed

    Hibino, Kei; Yukawa, Shintaro

    2004-02-01

    This study investigated time series changes and relationships of affects, cognitions, and behaviors immediately, a few days, and a week after anger episodes. Two hundred undergraduates (96 men and 104 women) completed a questionnaire. The results were as follows. Anger was intensely aroused immediately after anger episodes and was rapidly calmed as time passed. Anger and depression were correlated in each period, so depression accompanied anger experiences. The results of covariance structure analysis showed that aggressive behavior was evoked only by affects (especially anger) immediately after the episode, and only by cognitions (especially inflating) a few days after the episode. One week after the episode, aggressive behavior decreased and was not influenced by affects or cognitions. Anger elicited all anger-expressive behaviors, such as aggressive behavior, social sharing, and object-displacement, while depression accompanying anger episodes elicited only object-displacement.

  18. Spatio-Temporal Video Segmentation with Shape Growth or Shrinkage Constraint

    NASA Technical Reports Server (NTRS)

    Tarabalka, Yuliya; Charpiat, Guillaume; Brucker, Ludovic; Menze, Bjoern H.

    2014-01-01

    We propose a new method for joint segmentation of monotonically growing or shrinking shapes in a time sequence of noisy images. The task of segmenting the image time series is expressed as an optimization problem using the spatio-temporal graph of pixels, in which we are able to impose the constraint of shape growth or shrinkage by introducing monodirectional infinite links connecting pixels at the same spatial locations in successive image frames. The globally optimal solution is computed with a graph cut. The performance of the proposed method is validated on three applications: segmentation of melting sea ice floes and of growing burned areas from time series of 2D satellite images, and segmentation of a growing brain tumor from sequences of 3D medical scans. In the latter application, we impose an additional inter-sequence inclusion constraint by adding directed infinite links between pixels of dependent image structures.

  19. Macrozooplankton biomass in a warm-core Gulf Stream ring: Time series changes in size structure, taxonomic composition, and vertical distribution

    NASA Astrophysics Data System (ADS)

    Davis, Cabell S.; Wiebe, Peter H.

    1985-01-01

    Macrozooplankton size structure and taxonomic composition in warm-core ring 82B was examined from a time series (March, April, June) of ring center MOCNESS (1 m) samples. Size distributions of 15 major taxonomic groups were determined from length measurements digitized from silhouette photographs of the samples. Silhouette digitization allows rapid quantification of Zooplankton size structure and taxonomic composition. Length/weight regressions, determined for each taxon, were used to partition the biomass (displacement volumes) of each sample among the major taxonomic groups. Zooplankton taxonomic composition and size structure varied with depth and appeared to coincide with the hydrographic structure of the ring. In March and April, within the thermostad region of the ring, smaller herbivorous/omnivorous Zooplankton, including copepods, crustacean larvae, and euphausiids, were dominant, whereas below this region, larger carnivores, such as medusae, ctenophores, fish, and decapods, dominated. Copepods were generally dominant in most samples above 500 m. Total macrozooplankton abundance and biomass increased between March and April, primarily because of increases in herbivorous taxa, including copepods, crustacean larvae, and larvaceans. A marked increase in total macrozooplankton abundance and biomass between April and June was characterized by an equally dramatic shift from smaller herbivores (1.0-3.0 mm) in April to large herbivores (5.0-6.0 mm) and carnivores (>15 mm) in June. Species identifications made directly from the samples suggest that changes in trophic structure resulted from seeding type immigration and subsequent in situ population growth of Slope Water zooplankton species.

  20. A discrete scattering series representation for lattice embedded models of chain cyclization

    NASA Astrophysics Data System (ADS)

    Fraser, Simon J.; Winnik, Mitchell A.

    1980-01-01

    In this paper we develop a lattice-based model of chain cyclization in the presence of a set of occupied sites V in the lattice. We show that within the approximation of a Markovian chain propagator the effect of V on the partition function for the system can be written as a time-ordered exponential series in which V behaves like a scattering potential and chain length is the timelike parameter. The discrete and finite nature of this model allows us to obtain rigorous upper and lower bounds to the series limit. We adapt these formulas to the calculation of the partition functions and cyclization probabilities of terminally and globally cyclizing chains. Two classes of cyclization are considered: in the first model the target set H may be visited repeatedly (the Markovian model); in the second case vertices in H may be visited at most once (the non-Markovian or taboo model). This formulation depends on two fundamental combinatorial structures, namely the inclusion-exclusion principle and the set of subsets of a set. We have tried to interpret these abstract structures with physical analogies throughout the paper.

  1. Electrochemical route to the synthesis of ZnO microstructures: its nestlike structure and holding of Ag particles

    NASA Astrophysics Data System (ADS)

    Ding, Ling; Zhang, Ruixue; Fan, Louzhen

    2013-02-01

    A simple and facile electrochemical route was developed for the shape-selective synthesis of a large-scale series of ZnO microstructures, including petal, flower, sphere, nest and clew aggregates of ZnO laminas, at room temperature. This route is based on sodium citrate-directed crystallization. In the system, sodium citrate greatly promotes ZnO nucleation and directed growth by selectively capping specific ZnO facets because of its excellent adsorption ability. The morphology of ZnO is tuned by readily adjusting the concentration of sodium citrate and the electrodeposition time. Among the series of structures, the remarkable ZnO nestlike structure can be used as a container to hold not only the interlaced ZnO laminas but also Ag nanoparticles in the center. The special heterostructures of nestlike ZnO holding Ag nanoparticles were found to display superior surface-enhanced Raman scattering properties. This work demonstrates an important methodology for producing a wide assortment of desired ZnO microstructures.

  2. A convergent series expansion for hyperbolic systems of conservation laws

    NASA Technical Reports Server (NTRS)

    Harabetian, E.

    1985-01-01

    The discontinuous, piecewise-analytic initial value problem for a wide class of conservation laws, which includes the full three-dimensional Euler equations, is considered. The initial interaction at an arbitrary curved surface is resolved in time by a convergent series. Among other features, the solution exhibits shock, contact, and expansion waves as well as sound waves propagating on characteristic surfaces. The expansion waves correspond to the one-dimensional rarefactions but have a more complicated structure. The sound waves are generated in place of zero-strength shocks, and they are caused by mismatches in derivatives.

  3. Second and third order nonlinear optical properties of conjugated molecules and polymers

    NASA Technical Reports Server (NTRS)

    Perry, Joseph W.; Stiegman, Albert E.; Marder, Seth R.; Coulter, Daniel R.; Beratan, David N.; Brinza, David E.

    1988-01-01

    Second- and third-order nonlinear optical properties of some newly synthesized organic molecules and polymers are reported. Powder second-harmonic-generation efficiencies of up to 200 times urea have been realized for asymmetric donor-acceptor acetylenes. Third harmonic generation chi(3)s have been determined for a series of small conjugated molecules in solution. THG chi(3)s have also been determined for a series of soluble conjugated copolymers prepared using ring-opening metathesis polymerization. The results are discussed in terms of relevant molecular and/or macroscopic structural features of these conjugated organic materials.

  4. Efficient Algorithms for Segmentation of Item-Set Time Series

    NASA Astrophysics Data System (ADS)

    Chundi, Parvathi; Rosenkrantz, Daniel J.

    We propose a special type of time series, which we call an item-set time series, to facilitate the temporal analysis of software version histories, email logs, stock market data, etc. In an item-set time series, each observed data value is a set of discrete items. We formalize the concept of an item-set time series and present efficient algorithms for segmenting a given item-set time series. Segmentation of a time series partitions the time series into a sequence of segments where each segment is constructed by combining consecutive time points of the time series. Each segment is associated with an item set that is computed from the item sets of the time points in that segment, using a function which we call a measure function. We then define a concept called the segment difference, which measures the difference between the item set of a segment and the item sets of the time points in that segment. The segment difference values are required to construct an optimal segmentation of the time series. We describe novel and efficient algorithms to compute segment difference values for each of the measure functions described in the paper. We outline a dynamic programming based scheme to construct an optimal segmentation of the given item-set time series. We use the item-set time series segmentation techniques to analyze the temporal content of three different data sets—Enron email, stock market data, and a synthetic data set. The experimental results show that an optimal segmentation of item-set time series data captures much more temporal content than a segmentation constructed based on the number of time points in each segment, without examining the item set data at the time points, and can be used to analyze different types of temporal data.
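
    A toy version of the dynamic-programming segmentation can be sketched as follows; the measure function here is the union of the item sets in a segment, and the segment difference is the total symmetric-difference size, which is only one of several possibilities and not necessarily the paper's choice.

      # Hedged sketch: optimal segmentation of an item-set time series by dynamic programming.
      from functools import lru_cache

      series = [{"a", "b"}, {"a"}, {"a", "c"}, {"x", "y"}, {"x"}, {"x", "y", "z"}]
      K = 2                                           # number of segments to produce

      def seg_cost(i, j):
          """Segment difference for one segment covering time points i..j-1."""
          union = set().union(*series[i:j])           # measure function: union of item sets
          return sum(len(union ^ s) for s in series[i:j])

      @lru_cache(maxsize=None)
      def best(j, k):
          """Minimum cost (and segment start indices) for the first j points in k segments."""
          if k == 1:
              return seg_cost(0, j), (0,)
          return min((best(i, k - 1)[0] + seg_cost(i, j), best(i, k - 1)[1] + (i,))
                     for i in range(k - 1, j))

      cost, cuts = best(len(series), K)
      print("optimal cost:", cost, "segment start indices:", cuts)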

  5. Mapping the structure of the world economy.

    PubMed

    Lenzen, Manfred; Kanemoto, Keiichiro; Moran, Daniel; Geschke, Arne

    2012-08-07

    We have developed a new series of environmentally extended multi-region input-output (MRIO) tables with applications in carbon, water, and ecological footprinting, and Life-Cycle Assessment, as well as trend and key driver analyses. Such applications have recently been at the forefront of global policy debates, such as about assigning responsibility for emissions embodied in internationally traded products. The new time series was constructed using advanced parallelized supercomputing resources, and significantly advances the previous state of the art because of four innovations. First, it is available as a continuous 20-year time series of MRIO tables. Second, it distinguishes 187 individual countries comprising more than 15,000 industry sectors, and hence offers unsurpassed detail. Third, it provides information delayed by just 1-3 years, therefore significantly improving timeliness. Fourth, it presents MRIO elements with accompanying standard deviations in order to allow users to understand the reliability of the data. These advances will lead to material improvements in the capability of applications that rely on input-output tables. The timeliness of information means that analyses are more relevant to current policy questions. The continuity of the time series enables the robust identification of key trends and drivers of global environmental change. The high country and sector detail drastically improves the resolution of Life-Cycle Assessments. Finally, the availability of information on uncertainty allows policy-makers to quantitatively judge the level of confidence that can be placed in the results of analyses.

  6. A novel weight determination method for time series data aggregation

    NASA Astrophysics Data System (ADS)

    Xu, Paiheng; Zhang, Rong; Deng, Yong

    2017-09-01

    Aggregation in time series is of great importance in time series smoothing, prediction and other time series analysis processes, which makes it crucial to address the weights in time series correctly and reasonably. In this paper, a novel method to obtain the weights in time series is proposed, in which we adopt the induced ordered weighted aggregation (IOWA) operator and the visibility graph averaging (VGA) operator and linearly combine the weights separately generated by the two operators. The IOWA operator is introduced to the weight determination of time series, through which the time decay factor is taken into consideration. The VGA operator is able to generate weights with respect to the degree distribution in the visibility graph constructed from the corresponding time series, which reflects the relative importance of vertices in the time series. The proposed method is applied to two practical datasets to illustrate its merits. The aggregation of the Construction Cost Index (CCI) demonstrates the ability of the proposed method to smooth time series, while the aggregation of the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) illustrates how the proposed method maintains the variation tendency of the original data.
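
    A minimal sketch of the idea is shown below: weights derived from natural-visibility-graph degrees are linearly combined with simple exponential, order-based weights standing in for the IOWA part; the decay factor and combination coefficient are arbitrary, and the exact operators used in the paper may differ.

      # Hedged sketch: visibility-graph-degree weights combined with order-based decay weights.
      import numpy as np

      def visibility_degrees(x):
          """Degree of each point in the natural visibility graph of the series x."""
          n = len(x)
          deg = np.zeros(n, dtype=int)
          for i in range(n):
              for j in range(i + 1, n):
                  # points i and j are visible if every intermediate point lies below the line i-j
                  if all(x[k] < x[j] + (x[i] - x[j]) * (j - k) / (j - i) for k in range(i + 1, j)):
                      deg[i] += 1
                      deg[j] += 1
          return deg

      x = np.array([3.1, 2.4, 4.8, 1.7, 2.9, 5.2, 3.3, 4.1])
      deg = visibility_degrees(x)
      vg_w = deg / deg.sum()                               # visibility-graph weights
      decay = 0.9 ** np.arange(len(x))[::-1]               # newer points weigh more (IOWA-like stand-in)
      ord_w = decay / decay.sum()
      alpha = 0.5                                          # linear combination coefficient
      weights = alpha * vg_w + (1 - alpha) * ord_w
      print("aggregated value:", float(weights @ x))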

  7. Physical habitat simulation system reference manual: version II

    USGS Publications Warehouse

    Milhous, Robert T.; Updike, Marlys A.; Schneider, Diane M.

    1989-01-01

    There are four major components of a stream system that determine the productivity of the fishery (Karr and Dudley 1978). These are: (1) flow regime, (2) physical habitat structure (channel form, substrate distribution, and riparian vegetation), (3) water quality (including temperature), and (4) energy inputs from the watershed (sediments, nutrients, and organic matter). The complex interaction of these components determines the primary production, secondary production, and fish population of the stream reach. The basic components and interactions needed to simulate fish populations as a function of management alternatives are illustrated in Figure I.1. The assessment process utilizes a hierarchical and modular approach combined with computer simulation techniques. The modular components represent the "building blocks" for the simulation. The quality of the physical habitat is a function of flow and, therefore, varies in quality and quantity over the range of the flow regime. The conceptual framework of the Incremental Methodology and guidelines for its application are described in "A Guide to Stream Habitat Analysis Using the Instream Flow Incremental Methodology" (Bovee 1982). Simulation of physical habitat is accomplished using the physical structure of the stream and streamflow. The modification of physical habitat by temperature and water quality is analyzed separately from physical habitat simulation. Temperature in a stream varies with the seasons, local meteorological conditions, stream network configuration, and the flow regime; thus, the temperature influences on habitat must be analyzed on a stream system basis. Water quality under natural conditions is strongly influenced by climate and the geological materials, with the result that there is considerable natural variation in water quality. When we add the activities of man, the possible range of water quality becomes rather large. Consequently, water quality must also be analyzed on a stream system basis. Such analysis is outside the scope of this manual, which concentrates on simulation of physical habitat based on depth, velocity, and a channel index. The results from PHABSIM can be used alone or with a series of habitat time series programs that have been developed to generate monthly or daily habitat time series from the Weighted Usable Area versus streamflow table resulting from the habitat simulation programs and streamflow time series data. Monthly and daily streamflow time series may be obtained from USGS gages near the study site or as the output of river system management models.
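
    The post-processing step described at the end, turning a Weighted Usable Area versus streamflow table into a habitat time series, reduces to a table lookup with interpolation; the following sketch uses invented numbers purely for illustration and is not part of the PHABSIM software.

      # Hedged sketch: interpolate a WUA-vs-flow table over a monthly streamflow series.
      import numpy as np

      flow_table = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0])     # streamflow (m3/s), increasing
      wua_table  = np.array([150., 420., 700., 620., 380., 150.])   # WUA (m2 per 1000 m of stream)

      monthly_flow = np.array([6.3, 9.8, 22.5, 35.0, 18.2, 7.1,
                               4.0, 3.2, 5.5, 12.0, 16.4, 28.9])    # one year of monthly flows
      habitat_series = np.interp(monthly_flow, flow_table, wua_table)  # monthly habitat time series
      for month, wua in zip(range(1, 13), habitat_series):
          print(f"month {month:2d}: WUA = {wua:6.1f}")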

  8. Common data buffer

    NASA Technical Reports Server (NTRS)

    Byrne, F.

    1981-01-01

    Time-shared interface speeds data processing in distributed computer network. Two-level high-speed scanning approach routes information to buffer, portion of which is reserved for series of "first-in, first-out" memory stacks. Buffer address structure and memory are protected from noise or failed components by error correcting code. System is applicable to any computer or processing language.

  9. PHYTOPLANKTON DYNAMICS IN A GULF OF MEXICO ESTUARY: TIME SERIES OF SIZE STRUCTURE, NUTRIENTS, VARIABLE FLUORESCENCE AND ALGAL PHOSPHATASE ACTIVITY

    EPA Science Inventory

    Relationships between phytoplankton dynamics and physiology, and environmental conditions were studied in Santa Rosa Sound, Florida, USA, at near-weekly intervals during 2001. Santa Rosa Sound is a component of the Pensacola Bay estuary in the northern Gulf of Mexico. Parameters ...

  10. Reflections of Single Turkish International Graduate Students: Studies on Life at a Midwestern University

    ERIC Educational Resources Information Center

    Burkholder, Jessica Reno

    2010-01-01

    The research was guided by the research question: How do full-time single Turkish international graduate students conceptualize their experiences as international students? Participants in the study included three doctoral students and three master's students who participated in a series of semi-structured interviews. The data was transcribed and…

  11. Toward A Theory of HRD Learning Participation

    ERIC Educational Resources Information Center

    Wang, Greg G.; Wang, Jia

    2005-01-01

    This article fills a gap by identifying an under-studied area for learning participation (LP) in HRD theory building. A literature review is presented to identify gaps in adult education and HRD literature. An HRD LP framework is then proposed, from cross-sectional/time-series perspectives, to describe the pattern, factors, structure, and the…

  12. Models and Forecasts of Federal Spending for Elementary and Secondary Education.

    ERIC Educational Resources Information Center

    Rossi, Robert J.; Gilmartin, Kevin J.

    Structural equation models of annual federal expenditures for elementary and secondary education and for higher education were estimated using time-series data extending from 1947 to the later 1970s. The pattern of expenditures for elementary and secondary education proved to follow closely that for higher education. Factors affecting federal…

  13. Schools and Work: Developments in Vocational Education. Cassell Education Series.

    ERIC Educational Resources Information Center

    Coffey, David

    This book assesses the developing vocational functions of schools in Britain, identifies vocational values and policies, and discovers gaps in provision. Chapter 1 gives a summary analysis of school structural and curricular developments between medieval times and the reign of Victoria that were inspired by vocational or economic influences or had…

  14. 14 CFR Appendix A to Part 150 - Noise Exposure Maps

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... series of n events in time period T, in seconds. Note: When T is one hour, LT is referred to as one-hour... sound attenuation into the design and construction of a structure may be necessary to achieve..., noise exposure maps prepared in connection with studies which were either Federally funded or Federally...

  15. 14 CFR Appendix A to Part 150 - Noise Exposure Maps

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... series of n events in time period T, in seconds. Note: When T is one hour, LT is referred to as one-hour... sound attenuation into the design and construction of a structure may be necessary to achieve..., noise exposure maps prepared in connection with studies which were either Federally funded or Federally...

  16. 14 CFR Appendix A to Part 150 - Noise Exposure Maps

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... series of n events in time period T, in seconds. Note: When T is one hour, LT is referred to as one-hour... sound attenuation into the design and construction of a structure may be necessary to achieve..., noise exposure maps prepared in connection with studies which were either Federally funded or Federally...

  17. 14 CFR Appendix A to Part 150 - Noise Exposure Maps

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... series of n events in time period T, in seconds. Note: When T is one hour, LT is referred to as one-hour... sound attenuation into the design and construction of a structure may be necessary to achieve..., noise exposure maps prepared in connection with studies which were either Federally funded or Federally...

  18. 14 CFR Appendix A to Part 150 - Noise Exposure Maps

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... series of n events in time period T, in seconds. Note: When T is one hour, LT is referred to as one-hour... sound attenuation into the design and construction of a structure may be necessary to achieve..., noise exposure maps prepared in connection with studies which were either Federally funded or Federally...

  19. The Political Economy of Schooling. ESA845, The Economy of Schooling.

    ERIC Educational Resources Information Center

    Freeland, John

    This volume, part of a series of monographs that explore the relationship between the economy and schooling, analyzes the economic influences contributing to current pressures for changes in secondary schooling in Australian society with particular attention to the long-term structural collapse of the full-time teenage labor market. After a brief…

  20. Short-term prediction of chaotic time series by using RBF network with regression weights.

    PubMed

    Rojas, I; Gonzalez, J; Cañas, A; Diaz, A F; Rojas, F J; Rodriguez, M

    2000-10-01

    We propose a framework for constructing and training a radial basis function (RBF) neural network. The structure of the gaussian functions is modified using a pseudo-gaussian function (PG) in which two scaling parameters sigma are introduced, which eliminates the symmetry restriction and provides the neurons in the hidden layer with greater flexibility with respect to function approximation. We propose a modified PG-BF (pseudo-gaussian basis function) network in which the regression weights are used to replace the constant weights in the output layer. For this purpose, a sequential learning algorithm is presented to adapt the structure of the network, in which it is possible to create a new hidden unit and also to detect and remove inactive units. A salient feature of the network systems is that the method used for calculating the overall output is the weighted average of the output associated with each receptive field. The superior performance of the proposed PG-BF system over the standard RBF is illustrated using the problem of short-term prediction of chaotic time series.
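
    For orientation, a plain Gaussian RBF predictor for a chaotic series (fixed centres, least-squares output weights) can be sketched as below; the paper's pseudo-gaussian units, regression weights and sequential structure-adaptation algorithm are not reproduced here.

      # Hedged sketch: standard RBF one-step-ahead prediction of a chaotic logistic-map series.
      import numpy as np

      x = np.empty(600); x[0] = 0.3
      for t in range(599):
          x[t + 1] = 4.0 * x[t] * (1.0 - x[t])           # chaotic logistic map as the test series

      emb = np.column_stack([x[:-3], x[1:-2], x[2:-1]])  # 3-step delay embedding
      target = x[3:]

      rng = np.random.default_rng(5)
      centres = emb[rng.choice(len(emb), size=20, replace=False)]
      sigma = 0.2

      def design(E):
          """Gaussian activations of every sample in E for every centre."""
          d2 = ((E[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
          return np.exp(-d2 / (2 * sigma ** 2))

      train = slice(0, 400)
      w, *_ = np.linalg.lstsq(design(emb[train]), target[train], rcond=None)
      pred = design(emb[400:]) @ w
      print("test RMSE:", np.sqrt(np.mean((pred - target[400:]) ** 2)))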
