Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-07
... Series, adjusted option series and any options series until the time to expiration for such series is... time to expiration for such series is less than nine months be treated differently. Specifically, under... series until the time to expiration for such series is less than nine months. Accordingly, the...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-15
... Options Series, adjusted option series and any options series until the time to expiration for such series... time to expiration for such series is less than nine months be treated differently. Specifically, under... until the time to expiration for such series is less than nine months. Accordingly, the requirement to...
NASA Astrophysics Data System (ADS)
Diao, Chunyuan
In today's big data era, the increasing availability of satellite and airborne platforms at various spatial and temporal scales creates unprecedented opportunities to understand complex and dynamic systems (e.g., plant invasion). Time series remote sensing is becoming increasingly important for monitoring Earth system dynamics and interactions. To date, most time series remote sensing studies have been conducted with images acquired at coarse spatial scales, owing to their relatively high temporal resolution. The construction of time series at fine spatial scales, however, has been limited to a few discrete images acquired within or across years. The objective of this research is to advance time series remote sensing at fine spatial scales, particularly to shift from discrete time series remote sensing to continuous time series remote sensing. The objective will be achieved through the following aims: 1) advance intra-annual time series remote sensing under the pure-pixel assumption; 2) advance intra-annual time series remote sensing under the mixed-pixel assumption; 3) advance inter-annual time series remote sensing in monitoring land surface dynamics; and 4) advance the species distribution model with time series remote sensing. Taking invasive saltcedar as an example, four methods (i.e., a phenological time series remote sensing model, a temporal partial unmixing method, a multiyear spectral angle clustering model, and a time series remote sensing-based spatially explicit species distribution model) were developed to achieve these objectives. Results indicated that the phenological time series remote sensing model could effectively map saltcedar distributions by characterizing the seasonal phenological dynamics of plant species throughout the year. 
The proposed temporal partial unmixing method, compared to conventional unmixing methods, could more accurately estimate saltcedar abundance within a pixel by exploiting the temporal signatures of saltcedar. The multiyear spectral angle clustering model could guide the selection of the most representative remotely sensed image for repetitive saltcedar mapping over space and time. By incorporating spatial autocorrelation, the species distribution model developed in the study could identify suitable saltcedar habitats at a fine spatial scale and locate areas at high risk of saltcedar infestation. Among the 10 environmental variables considered, distance to the river and the phenological attributes summarized by the time series remote sensing were the most important. The methods developed in this study provide new perspectives on how continuous time series can be leveraged under various conditions to investigate plant invasion dynamics.
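The abundance-estimation step can be illustrated with a generic linear-unmixing sketch: model a pixel's temporal profile as a weighted sum of endmember phenologies and solve for the weights by least squares. The synthetic phenology curves and the sum-to-one normalization below are illustrative assumptions, not the paper's temporal partial unmixing method, which targets a single species without requiring all endmembers.

```python
import numpy as np

def temporal_unmixing(pixel_ts, endmember_ts):
    """Least-squares abundance estimate from temporal signatures:
    solve pixel(t) ~ sum_k a_k * endmember_k(t), then clip to
    non-negative and normalize to sum to one. A generic sketch,
    not the paper's partial-unmixing algorithm."""
    E = np.asarray(endmember_ts, float).T          # time x endmembers
    a, *_ = np.linalg.lstsq(E, np.asarray(pixel_ts, float), rcond=None)
    a = np.clip(a, 0, None)
    return a / a.sum() if a.sum() > 0 else a

t = np.linspace(0, 1, 24)                          # 24 dates in a season
em = np.vstack([np.sin(np.pi * t), t])             # two synthetic phenologies
mix = 0.7 * em[0] + 0.3 * em[1]                    # pixel = 70/30 mixture
ab = temporal_unmixing(mix, em)
```

Because the synthetic pixel is an exact linear mixture, the recovered abundances match the mixing fractions.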
Faithfulness of Recurrence Plots: A Mathematical Proof
NASA Astrophysics Data System (ADS)
Hirata, Yoshito; Komuro, Motomasa; Horai, Shunsuke; Aihara, Kazuyuki
It is known from practice that a recurrence plot, a two-dimensional visualization of time series data, contains almost all information about the underlying dynamics except for the spatial scale, because a rough shape of the original time series can be recovered from the recurrence plot even when the original time series is multivariate. We here provide a mathematical proof that the metric defined by a recurrence plot [Hirata et al., 2008] is equivalent to the Euclidean metric under mild conditions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Xiaoran, E-mail: sxr0806@gmail.com; School of Mathematics and Statistics, The University of Western Australia, Crawley WA 6009; Small, Michael, E-mail: michael.small@uwa.edu.au
In this work, we propose a novel method to transform a time series into a weighted and directed network. For a given time series, we first generate a set of segments via a sliding window, and then use a doubly symbolic scheme to characterize every windowed segment by combining absolute amplitude information with an ordinal pattern characterization. Based on this construction, a network can be directly constructed from the given time series: segments corresponding to different symbol-pairs are mapped to network nodes, and the temporal succession between nodes is represented by directed links. With this conversion, the dynamics underlying the time series are encoded into the network structure. We illustrate the potential of our networks with a well-studied dynamical model as a benchmark example. Results show that network measures characterizing global properties can detect the dynamical transitions in the underlying system. Moreover, we employ a random walk algorithm to sample loops in our networks, and find that time series with different dynamics exhibit distinct cycle structure. That is, the relative prevalence of loops with different lengths can be used to identify the underlying dynamics.
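The construction described above can be sketched compactly: symbolize each window by an amplitude bin plus an ordinal pattern, then count directed transitions between consecutive symbol-pairs. Window length, bin count, and the use of the window mean for the amplitude symbol are illustrative choices, not the authors' exact scheme.

```python
import numpy as np
from collections import Counter

def series_to_network(x, w=3, n_bins=3):
    """Map a time series to a directed, weighted transition network.
    Node = (amplitude bin of window mean, ordinal pattern of window);
    edge weight = number of transitions between consecutive windows."""
    x = np.asarray(x, float)
    windows = np.lib.stride_tricks.sliding_window_view(x, w)
    # amplitude symbol: which global bin the window mean falls into
    bins = np.linspace(x.min(), x.max(), n_bins + 1)
    amp = np.clip(np.digitize(windows.mean(axis=1), bins) - 1, 0, n_bins - 1)
    # ordinal symbol: permutation that sorts the window's values
    ordinal = [tuple(np.argsort(wnd)) for wnd in windows]
    nodes = list(zip(amp.tolist(), ordinal))
    # weighted directed edges: counts of temporal succession
    return Counter(zip(nodes[:-1], nodes[1:]))

net = series_to_network(np.sin(np.linspace(0, 8 * np.pi, 200)))
```

On 200 samples with window length 3 there are 198 windows and hence 197 directed transitions distributed over the edges.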
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-09
... behavior is included in the econometric models underlying STANS, time series of proportional changes in... included in the econometric models underlying STANS, time series of proportional changes in implied... calculate daily margin requirements. OCC has proposed at this time to clear only OTC Options on the S&P 500...
A Review of Subsequence Time Series Clustering
Zolhavarieh, Seyedjamal; Aghabozorgi, Saeed; Teh, Ying Wah
2014-01-01
Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One useful application domain for subsequence time series clustering is pattern recognition, which operates on sequences of time series data. This paper reviews definitions and background related to subsequence time series clustering. The reviewed literature is categorized into three periods: preproof, interproof, and postproof. Various state-of-the-art approaches to subsequence time series clustering are then discussed under each category. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies. PMID:25140332
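A common baseline in this literature clusters all sliding-window subsequences directly. The sketch below z-normalizes the windows and groups them with a tiny k-means; the window length, cluster count, and plain Euclidean distance are illustrative assumptions, and real studies must also guard against the trivial-match artifacts this review discusses.

```python
import numpy as np

def subsequence_clusters(x, w=10, k=3, n_iter=20, seed=0):
    """Cluster all length-w sliding-window subsequences of x with a
    minimal k-means (a sketch of subsequence time-series clustering,
    not any specific reviewed method)."""
    rng = np.random.default_rng(seed)
    S = np.lib.stride_tricks.sliding_window_view(np.asarray(x, float), w)
    # z-normalize each subsequence so clusters reflect shape, not level
    S = (S - S.mean(axis=1, keepdims=True)) / (S.std(axis=1, keepdims=True) + 1e-12)
    centers = S[rng.choice(len(S), k, replace=False)]
    for _ in range(n_iter):
        d = ((S[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)                  # nearest center
        for j in range(k):
            if np.any(labels == j):
                centers[j] = S[labels == j].mean(axis=0)
    return labels

labels = subsequence_clusters(np.sin(np.linspace(0, 6 * np.pi, 120)))
```

With 120 samples and window length 10 there are 111 subsequences, each assigned to one of the three clusters.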
NASA Astrophysics Data System (ADS)
Zhou, Ya-Tong; Fan, Yu; Chen, Zi-Yi; Sun, Jian-Cheng
2017-05-01
The contribution of this work is twofold: (1) a multimodality prediction method of chaotic time series with the Gaussian process mixture (GPM) model is proposed, which employs a divide and conquer strategy. It automatically divides the chaotic time series into multiple modalities with different extrinsic patterns and intrinsic characteristics, and thus can more precisely fit the chaotic time series. (2) An effective sparse hard-cut expectation maximization (SHC-EM) learning algorithm for the GPM model is proposed to improve the prediction performance. SHC-EM replaces a large learning sample set with fewer pseudo inputs, accelerating model learning based on these pseudo inputs. Experiments on Lorenz and Chua time series demonstrate that the proposed method yields not only accurate multimodality prediction, but also the prediction confidence interval. SHC-EM outperforms the traditional variational learning in terms of both prediction accuracy and speed. In addition, SHC-EM is more robust and insusceptible to noise than variational learning. Supported by the National Natural Science Foundation of China under Grant No 60972106, the China Postdoctoral Science Foundation under Grant No 2014M561053, the Humanity and Social Science Foundation of Ministry of Education of China under Grant No 15YJA630108, and the Hebei Province Natural Science Foundation under Grant No E2016202341.
The Consequences of Model Misidentification in the Interrupted Time-Series Experiment.
ERIC Educational Resources Information Center
Padia, William L.
Campbell (1969) argued for the interrupted time-series experiment as a useful methodology for testing intervention effects in the social sciences. The validity of statistical hypothesis testing of time-series is, however, dependent upon proper identification of the underlying stochastic nature of the data. Several types of model…
Empirical method to measure stochasticity and multifractality in nonlinear time series
NASA Astrophysics Data System (ADS)
Lin, Chih-Hao; Chang, Chia-Seng; Li, Sai-Ping
2013-12-01
An empirical algorithm is used here to study the stochastic and multifractal nature of nonlinear time series. A parameter can be defined to quantitatively measure the deviation of a time series from a Wiener process, so that the stochasticity of different time series can be compared. The local volatility of the time series under study can be constructed using this algorithm, and the multifractal structure of the time series can be analyzed by using this local volatility. As an example, we employ this method to analyze financial time series from different stock markets. The result shows that while developed markets evolve very much like an Ito process, the emergent markets are far from efficient. Differences in the multifractal structures and leverage effects between developed and emergent markets are discussed. The algorithm used here can be applied in a similar fashion to study time series of other complex systems.
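One crude proxy for such a deviation parameter is the scaling exponent of the increments: for an ideal Wiener process, Var[x(t+lag) − x(t)] grows linearly in the lag (exponent 2H with H = 1/2), so |H − 1/2| measures departure from Brownian behavior. This is an illustrative stand-in, not the paper's exact definition.

```python
import numpy as np

def wiener_deviation(x, lags=(1, 2, 4, 8, 16)):
    """Fit Var[x(t+lag) - x(t)] ~ lag^(2H) on a log-log scale and
    return |H - 1/2|, which vanishes for a Wiener process.
    A simplified proxy for the stochasticity parameter above."""
    x = np.asarray(x, float)
    v = [np.var(x[lag:] - x[:-lag]) for lag in lags]
    H = np.polyfit(np.log(lags), np.log(v), 1)[0] / 2.0
    return abs(H - 0.5)

rng = np.random.default_rng(1)
walk = np.cumsum(rng.standard_normal(20000))   # discrete Wiener path
dev = wiener_deviation(walk)                   # near zero for Brownian motion
```

White noise, by contrast, has lag-independent increment variance (H near 0), giving a deviation near 1/2.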
Robust extrema features for time-series data analysis.
Vemulapalli, Pramod K; Monga, Vishal; Brennan, Sean N
2013-06-01
The extraction of robust features for comparing and analyzing time series is a fundamentally important problem. Research efforts in this area encompass dimensionality reduction using popular signal analysis tools such as the discrete Fourier and wavelet transforms, various distance metrics, and the extraction of interest points from time series. Recently, extrema features for analysis of time-series data have assumed increasing significance because of their natural robustness under a variety of practical distortions, their economy of representation, and their computational benefits. Invariably, the process of encoding extrema features is preceded by filtering of the time series with an intuitively motivated filter (e.g., for smoothing), and subsequent thresholding to identify robust extrema. We define the properties of robustness, uniqueness, and cardinality as a means to identify the design choices available in each step of the feature generation process. Unlike existing methods, which utilize filters "inspired" from either domain knowledge or intuition, we explicitly optimize the filter based on training time series to optimize robustness of the extracted extrema features. We demonstrate further that the underlying filter optimization problem reduces to an eigenvalue problem and has a tractable solution. An encoding technique that enhances control over cardinality and uniqueness is also presented. Experimental results obtained for the problem of time series subsequence matching establish the merits of the proposed algorithm.
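The filter-then-threshold pipeline described above can be sketched as follows. The paper learns the filter by solving an eigenvalue problem; here a fixed moving-average kernel stands in for that optimized filter, and the amplitude threshold is an illustrative robustness criterion.

```python
import numpy as np

def robust_extrema(x, kernel=None, thresh=0.5):
    """Filter the series, then keep local extrema whose filtered
    amplitude deviates from the mean by more than thresh.
    The kernel is a placeholder for the paper's learned filter."""
    x = np.asarray(x, float)
    if kernel is None:
        kernel = np.ones(5) / 5.0              # simple smoothing filter
    y = np.convolve(x, kernel, mode="same")
    is_max = (y[1:-1] > y[:-2]) & (y[1:-1] > y[2:])
    is_min = (y[1:-1] < y[:-2]) & (y[1:-1] < y[2:])
    keep = np.abs(y[1:-1] - y.mean()) > thresh  # threshold for robustness
    return np.flatnonzero((is_max | is_min) & keep) + 1

x = np.sin(np.linspace(0, 4 * np.pi, 400))
idx = robust_extrema(x)                        # peak/trough locations
```

On two full sine periods this keeps only the large-amplitude peaks and troughs, discarding boundary artifacts of the smoothing.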
Inference of scale-free networks from gene expression time series.
Tominaga, Daisuke; Horton, Paul
2006-04-01
Quantitative time-series observation of gene expression is becoming possible, for example by cell array technology. However, there are no practical methods with which to infer network structures using only observed time-series data. As most computational models of biological networks for continuous time-series data have a high degree of freedom, it is almost impossible to infer the correct structures. On the other hand, it has been reported that some kinds of biological networks, such as gene networks and metabolic pathways, may have scale-free properties. We hypothesize that the architecture of inferred biological network models can be restricted to scale-free networks. We developed an inference algorithm for biological networks using only time-series data by introducing such a restriction. We adopt the S-system as the network model, and use a distributed genetic algorithm to optimize models so that their simulated results fit the observed time-series data. We have tested our algorithm on a case study (simulated data). We compared optimization under no restriction, which allows a fully connected network, and under the restriction that the total number of links must equal that expected of a scale-free network. The restriction reduced both false-positive and false-negative estimation of links, and also the differences between model simulation and the given time-series data.
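The restriction can be folded into the genetic algorithm's fitness function: data misfit plus a penalty whenever a candidate network's link count deviates from the number of links expected of a scale-free network of that size. This is a sketch of the constraint idea only, not the authors' S-system objective.

```python
import numpy as np

def restricted_fitness(pred, obs, adjacency, target_links):
    """Fitness steering a genetic search toward scale-free-sized
    networks: mean squared misfit to the observed time series plus
    a penalty on deviation from the expected link count."""
    misfit = np.mean((np.asarray(pred) - np.asarray(obs)) ** 2)
    n_links = int(np.count_nonzero(adjacency))
    return misfit + abs(n_links - target_links)

# A 3-node candidate network with exactly the target number of links
A = np.array([[0, 1, 0], [0, 0, 1], [1, 0, 0]])
f = restricted_fitness([1.0, 2.0], [1.0, 2.0], A, target_links=3)
```

A perfectly fitting model with the expected link count scores 0; a fully connected 3-node model with the same fit is penalized by its 6 excess links.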
75 FR 47320 - Millington Securities, Inc., et al.; Notice of Application
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-05
... investment. At such time, the Series also will transmit to the Unaffiliated Underlying Fund a list of the... Unit Investment Trusts (the ``Trust''), on behalf of itself and any future series, and any future... by or under common control with the Depositor) and their respective series (the future UITs, together...
Characterization of chaotic attractors under noise: A recurrence network perspective
NASA Astrophysics Data System (ADS)
Jacob, Rinku; Harikrishnan, K. P.; Misra, R.; Ambika, G.
2016-12-01
We undertake a detailed numerical investigation to understand how the addition of white and colored noise to a chaotic time series changes the topology and the structure of the underlying attractor reconstructed from the time series. We use the methods and measures of recurrence plots and recurrence networks generated from the time series for this analysis. We explicitly show that the addition of noise obscures the property of recurrence of trajectory points in the phase space, which is the hallmark of every dynamical system. However, the structure of the attractor is found to be robust even up to high noise levels of 50%. An advantage of recurrence network measures over conventional nonlinear measures is that they can be applied to short and non-stationary time series data. Using the results obtained from the above analysis, we go on to analyse the light curves from a dominant black hole system and show that the recurrence network measures are capable of identifying the nature of noise contamination in a time series.
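The object underlying both the recurrence plot and the recurrence network is the recurrence matrix: delay-embed the series, then mark pairs of state vectors closer than a recurrence radius. With the diagonal removed, this binary matrix serves as the adjacency matrix of the recurrence network. The embedding parameters and the radius rule of thumb below are illustrative.

```python
import numpy as np

def recurrence_matrix(x, dim=3, tau=1, eps=None):
    """Delay-embed x and return the thresholded recurrence matrix,
    i.e. the adjacency matrix of the recurrence network."""
    x = np.asarray(x, float)
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
    if eps is None:
        eps = 0.2 * d.max()            # rule-of-thumb recurrence radius
    R = (d <= eps).astype(int)
    np.fill_diagonal(R, 0)             # no self-loops in the network
    return R

R = recurrence_matrix(np.sin(np.linspace(0, 6 * np.pi, 150)))
```

The matrix is symmetric with a zero diagonal, as required of an undirected network without self-loops.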
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-25
... minimum CMM quoting requirements based on a percentage of series or as a percentage of time achieves the... shares of underlying stock or exchange-traded fund shares. Long- term options are series with a time to... 60% of the non-adjusted options series that have a time to expiration of less than nine months); NYSE...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-08
.... (``Bloomberg''), FactSet Research Systems, Inc. (``FactSet'') and Thomson Reuters (``Reuters''). Real time data... reasonably related to the current value of the underlying index at the time such series are first opened for... to which such series relates at or about the time such series of options is first opened for trading...
Trend time-series modeling and forecasting with neural networks.
Qi, Min; Zhang, G Peter
2008-05-01
Despite its great importance, there has been no general consensus on how to model the trends in time-series data. Compared to traditional approaches, neural networks (NNs) have shown some promise in time-series forecasting. This paper investigates how to best model trend time series using NNs. Four different strategies (raw data, raw data with time index, detrending, and differencing) are used to model various trend patterns (linear, nonlinear, deterministic, stochastic, and breaking trend). We find that with NNs differencing often gives meritorious results regardless of the underlying data generating processes (DGPs). This finding is also confirmed by the real gross national product (GNP) series.
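The differencing strategy the paper finds most robust is simple to state: train the network on first differences and integrate its forecasts back to levels. A minimal sketch of the two transforms:

```python
import numpy as np

def difference(x):
    """First-difference a trend series; a NN would be trained on
    these values rather than the raw levels."""
    x = np.asarray(x, float)
    return x[1:] - x[:-1]

def undifference(last_level, diffs):
    """Invert differencing: cumulative sum anchored at a known level."""
    return last_level + np.cumsum(diffs)

x = np.array([10.0, 12.0, 15.0, 19.0])
d = difference(x)                  # [2., 3., 4.]
restored = undifference(x[0], d)   # [12., 15., 19.]
```

In a forecasting setting, `last_level` would be the final observed value, and `diffs` the model's predicted changes.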
21 CFR 177.1330 - Ionomeric resins.
Code of Federal Regulations, 2012 CFR
2012-04-01
...% ethanol 72, 96, 120 The results from a series of extraction times demonstrate equilibrium when the net... the above time series, extraction times must be extended until three successive unchanging values for... characterizing the type of food and under the conditions of time and temperature characterizing the conditions of...
21 CFR 177.1330 - Ionomeric resins.
Code of Federal Regulations, 2011 CFR
2011-04-01
...% ethanol 72, 96, 120 The results from a series of extraction times demonstrate equilibrium when the net... the above time series, extraction times must be extended until three successive unchanging values for... characterizing the type of food and under the conditions of time and temperature characterizing the conditions of...
21 CFR 177.1330 - Ionomeric resins.
Code of Federal Regulations, 2013 CFR
2013-04-01
...% ethanol 72, 96, 120 The results from a series of extraction times demonstrate equilibrium when the net... the above time series, extraction times must be extended until three successive unchanging values for... characterizing the type of food and under the conditions of time and temperature characterizing the conditions of...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-02
... Order Feed, which provides real-time updates every time a new limit order that is not immediately... each instrument series, including the symbols (series and underlying security), put or call indicator, the expiration and the strike price of the series. \\4\\ The ISE Order Feed does not include market...
The examination of headache activity using time-series research designs.
Houle, Timothy T; Remble, Thomas A; Houle, Thomas A
2005-05-01
The majority of research conducted on headache has utilized cross-sectional designs, which preclude the examination of dynamic factors and principally rely on group-level effects. The present article describes the application of an individual-oriented process model using time-series analytical techniques. Blending a time-series approach with an interactive process model allows consideration of intra-individual dynamic processes, while not precluding the researcher from examining inter-individual differences. The authors explore the nature of time-series data and present two necessary assumptions underlying the time-series approach. The concept of shock and its contribution to headache activity is also presented. The time-series approach is not without problems, and two such problems are specifically reported: autocorrelation and the distribution of daily observations. The article concludes with the presentation of several analytical techniques suited to examine the time-series interactive process model.
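The autocorrelation problem raised above is easy to quantify: the lag-1 autocorrelation of a daily symptom series measures how strongly today's value depends on yesterday's, which is exactly what invalidates ordinary independent-observations analyses. A hedged sketch using synthetic diary-like data:

```python
import numpy as np

def lag1_autocorrelation(x):
    """Sample lag-1 autocorrelation of a daily series."""
    x = np.asarray(x, float)
    x = x - x.mean()
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

rng = np.random.default_rng(0)
white = rng.standard_normal(5000)      # independent daily observations
ar = np.empty(5000)                    # persistent series, AR(1) phi = 0.8
ar[0] = 0.0
for t in range(1, 5000):
    ar[t] = 0.8 * ar[t - 1] + rng.standard_normal()
```

Independent observations give an autocorrelation near zero; a persistent process like the AR(1) above gives a value near its coefficient, signalling that time-series methods are needed.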
Tewatia, D K; Tolakanahalli, R P; Paliwal, B R; Tomé, W A
2011-04-07
The underlying requirements for successful implementation of any efficient tumour motion management strategy are regularity and reproducibility of a patient's breathing pattern. The physiological act of breathing is controlled by multiple nonlinear feedback and feed-forward couplings. It would therefore be appropriate to analyse the breathing pattern of lung cancer patients in the light of nonlinear dynamical system theory. The purpose of this paper is to analyse the one-dimensional respiratory time series of lung cancer patients based on nonlinear dynamics and delay coordinate state space embedding. It is very important to select a suitable pair of embedding dimension 'm' and time delay 'τ' when performing a state space reconstruction. Appropriate time delay and embedding dimension were obtained using well-established methods, namely mutual information and the false nearest neighbour method, respectively. Establishing stationarity and determinism in a given scalar time series is a prerequisite to demonstrating that the nonlinear dynamical system that gave rise to the scalar time series exhibits a sensitive dependence on initial conditions, i.e. is chaotic. Hence, once an appropriate state space embedding of the dynamical system has been reconstructed, we show that the time series of the nonlinear dynamical systems under study are both stationary and deterministic in nature. Once both criteria are established, we proceed to calculate the largest Lyapunov exponent (LLE), which is an invariant quantity under time delay embedding. The LLE for all 16 patients is positive, which along with stationarity and determinism establishes the fact that the time series of a lung cancer patient's breathing pattern is not random or irregular, but rather it is deterministic in nature albeit chaotic. These results indicate that chaotic characteristics exist in the respiratory waveform and techniques based on state space dynamics should be employed for tumour motion management.
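The delay-coordinate reconstruction central to this analysis maps the scalar series into state vectors (x[t], x[t+τ], …, x[t+(m−1)τ]). In the study, m and τ are chosen by the false-nearest-neighbour and mutual-information criteria; the sketch below simply takes them as given.

```python
import numpy as np

def delay_embed(x, m, tau):
    """Delay-coordinate state-space reconstruction: row t is the
    vector (x[t], x[t+tau], ..., x[t+(m-1)*tau])."""
    x = np.asarray(x, float)
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(m)])

states = delay_embed(np.sin(np.linspace(0, 10 * np.pi, 500)), m=3, tau=8)
```

Quantities such as the largest Lyapunov exponent are then estimated from how nearby rows of `states` diverge over time.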
Estimation of Parameters from Discrete Random Nonstationary Time Series
NASA Astrophysics Data System (ADS)
Takayasu, H.; Nakamura, T.
For the analysis of nonstationary stochastic time series we introduce a formulation to estimate the underlying time-dependent parameters. This method is designed for random events with small numbers that are out of the applicability range of the normal distribution. The method is demonstrated for numerical data generated by a known system, and applied to time series of traffic accidents, batting average of a baseball player and sales volume of home electronics.
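For count data of this kind (accidents, hits, sales), a natural Poisson-based baseline is a sliding-window maximum-likelihood estimate of a slowly varying rate. The window length below is an assumption, and this is a simplified stand-in for the estimation scheme described above.

```python
import numpy as np

def rolling_poisson_rate(counts, window=30):
    """Windowed mean of event counts = windowed Poisson MLE of a
    slowly varying rate parameter."""
    c = np.asarray(counts, float)
    kernel = np.ones(window) / window
    return np.convolve(c, kernel, mode="valid")

rng = np.random.default_rng(2)
true_rate = np.linspace(1.0, 5.0, 400)   # drifting underlying parameter
counts = rng.poisson(true_rate)          # small-number random events
est = rolling_poisson_rate(counts, window=50)
```

The estimate tracks the drift of the underlying parameter while averaging out the Poisson fluctuations of the small counts.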
Time averaging, ageing and delay analysis of financial time series
NASA Astrophysics Data System (ADS)
Cherstvy, Andrey G.; Vinod, Deepak; Aghion, Erez; Chechkin, Aleksei V.; Metzler, Ralf
2017-06-01
We introduce three strategies for the analysis of financial time series based on time averaged observables. These comprise the time averaged mean squared displacement (MSD) as well as the ageing and delay time methods for varying fractions of the financial time series. We explore these concepts via statistical analysis of historic time series for several Dow Jones Industrial indices for the period from the 1960s to 2015. Remarkably, we discover a simple universal law for the delay time averaged MSD. The observed features of the financial time series dynamics agree well with our analytical results for the time averaged measurables for geometric Brownian motion, underlying the famed Black-Scholes-Merton model. The concepts we promote here are shown to be useful for financial data analysis and enable one to unveil new universal features of stock market dynamics.
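The first of the three observables, the time averaged MSD, is defined for a single trajectory as TAMSD(Δ) = ⟨(x(t+Δ) − x(t))²⟩ averaged over all start times t. A minimal sketch, tested on a Brownian trajectory rather than financial data:

```python
import numpy as np

def time_averaged_msd(x, lags):
    """Time averaged mean squared displacement of one trajectory,
    averaged over all admissible start times for each lag."""
    x = np.asarray(x, float)
    return np.array([np.mean((x[lag:] - x[:-lag]) ** 2) for lag in lags])

rng = np.random.default_rng(3)
bm = np.cumsum(rng.standard_normal(50000))   # Brownian test trajectory
msd = time_averaged_msd(bm, lags=[1, 10, 100])
```

For Brownian motion with unit-variance increments, the TAMSD grows linearly with the lag, which the test below checks at three lags.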
Emerging spectra of singular correlation matrices under small power-map deformations
NASA Astrophysics Data System (ADS)
Vinayak; Schäfer, Rudi; Seligman, Thomas H.
2013-09-01
Correlation matrices are a standard tool in the analysis of the time evolution of complex systems in general and financial markets in particular. Yet most analyses assume stationarity of the underlying time series. This tends to be an assumption of varying and often dubious validity. The validity of the assumption improves as shorter time series are used. If many time series are used, this implies an analysis of highly singular correlation matrices. We attack this problem by using the so-called power map, which was introduced to reduce noise. Its nonlinearity breaks the degeneracy of the zero eigenvalues, and we analyze the sensitivity of the emerging spectra to correlations. This sensitivity is demonstrated for uncorrelated and correlated Wishart ensembles.
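The power map itself is elementwise: c → sign(c)·|c|^q, with q slightly above 1 for a small deformation. Applied to a singular correlation matrix (more variables than observations), it lifts the degenerate zero eigenvalues while leaving the unit diagonal, and hence the trace, unchanged. The ensemble below is a toy illustration.

```python
import numpy as np

def power_map(C, q):
    """Elementwise power map sign(c) * |c|**q on a correlation matrix."""
    C = np.asarray(C, float)
    return np.sign(C) * np.abs(C) ** q

# Singular correlation matrix: 6 variables from only 3 observations
rng = np.random.default_rng(4)
data = rng.standard_normal((3, 6))
C = np.corrcoef(data, rowvar=False)              # rank <= 2, so 4 zero eigenvalues
eig_before = np.sort(np.linalg.eigvalsh(C))
eig_after = np.sort(np.linalg.eigvalsh(power_map(C, 1.05)))
```

Before the deformation the spectrum has a four-fold degenerate zero; after it, the degeneracy is broken while the trace stays equal to the number of variables.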
Wu, Zi Yi; Xie, Ping; Sang, Yan Fang; Gu, Hai Ting
2018-04-01
The phenomenon of jump is one of the importantly external forms of hydrological variabi-lity under environmental changes, representing the adaption of hydrological nonlinear systems to the influence of external disturbances. Presently, the related studies mainly focus on the methods for identifying the jump positions and jump times in hydrological time series. In contrast, few studies have focused on the quantitative description and classification of jump degree in hydrological time series, which make it difficult to understand the environmental changes and evaluate its potential impacts. Here, we proposed a theatrically reliable and easy-to-apply method for the classification of jump degree in hydrological time series, using the correlation coefficient as a basic index. The statistical tests verified the accuracy, reasonability, and applicability of this method. The relationship between the correlation coefficient and the jump degree of series were described using mathematical equation by derivation. After that, several thresholds of correlation coefficients under different statistical significance levels were chosen, based on which the jump degree could be classified into five levels: no, weak, moderate, strong and very strong. Finally, our method was applied to five diffe-rent observed hydrological time series, with diverse geographic and hydrological conditions in China. The results of the classification of jump degrees in those series were closely accorded with their physically hydrological mechanisms, indicating the practicability of our method.
Sea change: Charting the course for biogeochemical ocean time-series research in a new millennium
NASA Astrophysics Data System (ADS)
Church, Matthew J.; Lomas, Michael W.; Muller-Karger, Frank
2013-09-01
Ocean time-series provide vital information needed for assessing ecosystem change. This paper summarizes the historical context, major program objectives, and future research priorities for three contemporary ocean time-series programs: the Hawaii Ocean Time-series (HOT), the Bermuda Atlantic Time-series Study (BATS), and the CARIACO Ocean Time-Series. These three programs operate in physically and biogeochemically distinct regions of the world's oceans, with HOT and BATS located in the open-ocean waters of the subtropical North Pacific and North Atlantic, respectively, and CARIACO situated in the anoxic Cariaco Basin of the tropical Atlantic. All three programs sustain near-monthly shipboard occupations of their field sampling sites, with HOT and BATS beginning in 1988, and CARIACO initiated in 1996. The resulting data provide some of the only multi-disciplinary, decadal-scale determinations of time-varying ecosystem change in the global ocean. Facilitated by a scoping workshop (September 2010) sponsored by the Ocean Carbon Biogeochemistry (OCB) program, leaders of these time-series programs sought community input on existing program strengths and future research directions. Themes that emerged from these discussions included:
1. Shipboard time-series programs are key to informing our understanding of the connectivity between changes in ocean-climate and biogeochemistry.
2. The scientific and logistical support provided by shipboard time-series programs forms the backbone for numerous research and education programs. Future studies should be encouraged that seek mechanistic understanding of the ecological interactions underlying the biogeochemical dynamics at these sites.
3. Detecting time-varying trends in ocean properties and processes requires consistent, high-quality measurements. Time-series must carefully document analytical procedures and, where possible, trace the accuracy of analyses to certified standards and internal reference materials.
4. Leveraged implementation, testing, and validation of autonomous and remote observing technologies at time-series sites provide new insights into the spatiotemporal variability underlying ecosystem changes.
5. The value of existing time-series data for formulating and validating ecosystem models should be promoted.
In summary, the scientific underpinnings of ocean time-series programs remain as strong and important today as when these programs were initiated. The emerging data inform our knowledge of the ocean's biogeochemistry and ecology, and improve our predictive capacity about planetary change.
NASA Astrophysics Data System (ADS)
Bock, Y.; Fang, P.; Moore, A. W.; Kedar, S.; Liu, Z.; Owen, S. E.; Glasscoe, M. T.
2016-12-01
Detection of time-dependent crustal deformation relies on the availability of accurate surface displacements, proper time series analysis to correct for secular motion, coseismic and non-tectonic instrument offsets, periodic signatures at different frequencies, and a realistic estimate of uncertainties for the parameters of interest. As part of the NASA Solid Earth Science ESDR System (SESES) project, daily displacement time series are estimated for about 2500 stations, focused on tectonic plate boundaries and having a global distribution for accessing the terrestrial reference frame. The "combined" time series are optimally estimated from independent JPL GIPSY and SIO GAMIT solutions, using a consistent set of input epoch-date coordinates and metadata. The longest time series began in 1992; more than 30% of the stations have experienced one or more of 35 major earthquakes with significant postseismic deformation. Here we present three examples of time-dependent deformation that have been detected in the SESES displacement time series. (1) Postseismic deformation is a fundamental time-dependent signal that indicates a viscoelastic response of the crust/mantle lithosphere, afterslip, or poroelastic effects at different spatial and temporal scales. It is critical to identify and estimate the extent of postseismic deformation in both space and time not only for insight into the crustal deformation and earthquake cycles and their underlying physical processes, but also to reveal other time-dependent signals. We report on our database of characterized postseismic motions using a principal component analysis to isolate different postseismic processes. (2) Starting with the SESES combined time series and applying a time-dependent Kalman filter, we examine episodic tremor and slow slip (ETS) in the Cascadia subduction zone. 
We report on subtle slip details, allowing investigation of the spatiotemporal relationship between slow slip transients and tremor and their underlying physical mechanisms. (3) We present evolving strain dilatation and shear rates based on the SESES velocities for regional subnetworks as a metric for assigning earthquake probabilities and detection of possible time-dependent deformation related to underlying physical processes.
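The trajectory model described above (secular rate, periodic signatures, and coseismic offsets) can be sketched as an ordinary least-squares fit. This is a minimal illustration on synthetic data, not the SESES processing chain; the design-matrix columns (intercept, trend, annual and semiannual terms, one step function) and the single offset epoch are assumptions for the example.

```python
import numpy as np

def fit_displacement_model(t, y, t_offset):
    """Least-squares fit of secular rate, annual/semiannual terms,
    and a single coseismic step at t_offset (times in decimal years)."""
    step = (t >= t_offset).astype(float)
    # Design matrix: intercept, trend, annual + semiannual sin/cos, step
    A = np.column_stack([
        np.ones_like(t), t,
        np.sin(2 * np.pi * t), np.cos(2 * np.pi * t),
        np.sin(4 * np.pi * t), np.cos(4 * np.pi * t),
        step,
    ])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef, A @ coef

# Synthetic daily series: 5 mm/yr trend, 3 mm annual cycle, 10 mm coseismic jump
t = np.arange(0, 6, 1 / 365.25)
rng = np.random.default_rng(0)
y = 5.0 * t + 3.0 * np.sin(2 * np.pi * t) + 10.0 * (t >= 3.0) \
    + rng.normal(0, 0.5, t.size)
coef, model = fit_displacement_model(t, y, 3.0)
```

Here `coef[1]` recovers the secular rate and `coef[6]` the step amplitude; a realistic pipeline would add more offset epochs and a noise model for the uncertainties.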
NASA Astrophysics Data System (ADS)
Gao, Xiangyun; An, Haizhong; Fang, Wei; Huang, Xuan; Li, Huajiao; Zhong, Weiqiong; Ding, Yinghui
2014-07-01
The linear regression parameters between two time series can differ under different lengths of the observation period. If we study the whole period through a sliding window of a shorter period, the change of the linear regression parameters becomes a process of dynamic transmission over time. We present a simple and efficient computational scheme: a linear regression patterns transmission algorithm, which transforms linear regression patterns into directed and weighted networks. The linear regression patterns (nodes) are defined by the combination of intervals of the linear regression parameters and the results of the significance testing under different sizes of the sliding window. The transmissions between adjacent patterns are defined as edges, and the weights of the edges are the frequencies of the transmissions. The major patterns, the distance, and the medium in the process of the transmission can be captured. The statistical results of weighted out-degree and betweenness centrality are mapped on timelines, revealing the features of their distributions. Many measurements in different areas that involve two related time series variables could take advantage of this algorithm to characterize the dynamic relationships between the time series from a new perspective.
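A minimal sketch of the idea: windowed regression slopes are binned into pattern labels, and transitions between consecutive patterns become weighted directed edges. Using only the slope (ignoring the intercept and the significance tests the abstract also mentions), and the particular window size and bin edges, are simplifying assumptions.

```python
import numpy as np
from collections import Counter

def pattern_network(x, y, window, slope_bins):
    """Slide a window along (x, y), fit a linear regression in each window,
    bin the slope into a pattern label, and count transitions between
    consecutive patterns as weighted directed edges."""
    labels = []
    for i in range(len(x) - window + 1):
        slope = np.polyfit(x[i:i + window], y[i:i + window], 1)[0]
        labels.append(int(np.digitize(slope, slope_bins)))
    edges = Counter(zip(labels, labels[1:]))  # (from, to) -> transition weight
    return labels, edges

rng = np.random.default_rng(1)
x = np.arange(200.0)
y = 10 * np.sin(x / 20) + rng.normal(0, 0.5, 200)
labels, edges = pattern_network(x, y, window=20, slope_bins=[-0.2, 0.2])
```

The `edges` counter is exactly the weighted adjacency structure of the directed network; out-degree and betweenness would then be computed on it with any graph library.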
Multivariate Time Series Decomposition into Oscillation Components.
Matsuda, Takeru; Komaki, Fumiyasu
2017-08-01
Many time series are considered to be a superposition of several oscillation components. We have proposed a method for decomposing univariate time series into oscillation components and estimating their phases (Matsuda & Komaki, 2017). In this study, we extend that method to multivariate time series. We assume that several oscillators underlie the given multivariate time series and that each variable corresponds to a superposition of the projections of the oscillators. Thus, the oscillators superpose on each variable with amplitude and phase modulation. Based on this idea, we develop Gaussian linear state-space models and use them to decompose the given multivariate time series. The model parameters are estimated from data using the empirical Bayes method, and the number of oscillators is determined using the Akaike information criterion. Therefore, the proposed method extracts underlying oscillators in a data-driven manner and enables investigation of phase dynamics in a given multivariate time series. Numerical results show the effectiveness of the proposed method. From monthly mean north-south sunspot number data, the proposed method reveals an interesting phase relationship.
76 FR 81473 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-28
... records. Core questions will be used to explore relationships among the concepts, develop a time series... of events occur or look for underlying causes when we see a change in the time series. Up to 20 times... fielded for a pre-specified amount of time. These experimental questions will be submitted to OMB at a...
Using Time Series Analysis to Predict Cardiac Arrest in a PICU.
Kennedy, Curtis E; Aoki, Noriaki; Mariscalco, Michele; Turley, James P
2015-11-01
To build and test cardiac arrest prediction models in a PICU, using time series analysis as input, and to measure changes in prediction accuracy attributable to different classes of time series data. Retrospective cohort study. Thirty-one bed academic PICU that provides care for medical and general surgical (not congenital heart surgery) patients. Patients experiencing a cardiac arrest in the PICU and requiring external cardiac massage for at least 2 minutes. None. One hundred three cases of cardiac arrest and 109 control cases were used to prepare a baseline dataset that consisted of 1,025 variables in four data classes: multivariate, raw time series, clinical calculations, and time series trend analysis. We trained 20 arrest prediction models using a matrix of five feature sets (combinations of data classes) with four modeling algorithms: linear regression, decision tree, neural network, and support vector machine. The reference model (multivariate data with regression algorithm) had an accuracy of 78% and 87% area under the receiver operating characteristic curve. The best model (multivariate + trend analysis data with support vector machine algorithm) had an accuracy of 94% and 98% area under the receiver operating characteristic curve. Cardiac arrest predictions based on a traditional model built with multivariate data and a regression algorithm misclassified cases 3.7 times more frequently than predictions that included time series trend analysis and built with a support vector machine algorithm. Although the final model lacks the specificity necessary for clinical application, we have demonstrated how information from time series data can be used to increase the accuracy of clinical prediction models.
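The value added by the "time series trend analysis" data class can be illustrated with a toy feature extractor. The study does not specify its trend features here, so least-squares slope, spread, and net change over a hypothetical heart-rate window stand in as examples.

```python
import numpy as np

def trend_features(window):
    """Simple trend features for one vital-sign window: least-squares slope,
    mean, standard deviation, and last-minus-first net change."""
    t = np.arange(len(window), dtype=float)
    slope = np.polyfit(t, window, 1)[0]
    return {
        "slope": float(slope),
        "mean": float(np.mean(window)),
        "std": float(np.std(window)),
        "delta": float(window[-1] - window[0]),
    }

hr = np.array([110, 114, 119, 125, 133, 142.0])  # hypothetical heart-rate samples
f = trend_features(hr)
```

Features like these, concatenated with the raw multivariate snapshot, are the kind of input that let the support vector machine in the study outperform the regression baseline.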
Error-based Extraction of States and Energy Landscapes from Experimental Single-Molecule Time-Series
NASA Astrophysics Data System (ADS)
Taylor, J. Nicholas; Li, Chun-Biu; Cooper, David R.; Landes, Christy F.; Komatsuzaki, Tamiki
2015-03-01
Characterization of states, the essential components of the underlying energy landscapes, is one of the most intriguing subjects in single-molecule (SM) experiments due to the existence of noise inherent to the measurements. Here we present a method to extract the underlying state sequences from experimental SM time-series. Taking into account empirical error and the finite sampling of the time-series, the method extracts a steady-state network which provides an approximation of the underlying effective free energy landscape. The core of the method is the application of rate-distortion theory from information theory, allowing the individual data points to be assigned to multiple states simultaneously. We demonstrate the method's proficiency in its application to simulated trajectories as well as to experimental SM fluorescence resonance energy transfer (FRET) trajectories obtained from isolated agonist binding domains of the AMPA receptor, an ionotropic glutamate receptor that is prevalent in the central nervous system.
Parametric, nonparametric and parametric modelling of a chaotic circuit time series
NASA Astrophysics Data System (ADS)
Timmer, J.; Rust, H.; Horbelt, W.; Voss, H. U.
2000-09-01
The determination of a differential equation underlying a measured time series is a frequently arising task in nonlinear time series analysis. In the validation of a proposed model, one often faces the dilemma that it is hard to decide whether possible discrepancies between the time series and the model output are caused by an inappropriate model, by bad estimates of the parameters in a correct type of model, or by both. We propose a combination of parametric modelling based on Bock's multiple shooting algorithm and nonparametric modelling based on optimal transformations as a strategy to test proposed models and, if they are rejected, to suggest and test new ones. We exemplify this strategy on an experimental time series from a chaotic circuit, for which we obtain an extremely accurate reconstruction of the observed attractor.
Code of Federal Regulations, 2010 CFR
2010-10-01
... meeting or the portion or portions of a series of meetings setting forth the time and place of the... any series of meetings under the provisions of §§ 503.74 and 503.75, the General Counsel of the agency... series of meetings is proper under the provisions of this subpart and the terms of the Government in the...
Code of Federal Regulations, 2014 CFR
2014-10-01
... meeting or the portion or portions of a series of meetings setting forth the time and place of the... any series of meetings under the provisions of §§ 503.74 and 503.75, the General Counsel of the agency... series of meetings is proper under the provisions of this subpart and the terms of the Government in the...
Code of Federal Regulations, 2011 CFR
2011-10-01
... meeting or the portion or portions of a series of meetings setting forth the time and place of the... any series of meetings under the provisions of §§ 503.74 and 503.75, the General Counsel of the agency... series of meetings is proper under the provisions of this subpart and the terms of the Government in the...
Code of Federal Regulations, 2012 CFR
2012-10-01
... meeting or the portion or portions of a series of meetings setting forth the time and place of the... any series of meetings under the provisions of §§ 503.74 and 503.75, the General Counsel of the agency... series of meetings is proper under the provisions of this subpart and the terms of the Government in the...
Self-organising mixture autoregressive model for non-stationary time series modelling.
Ni, He; Yin, Hujun
2008-12-01
Modelling non-stationary time series has been a difficult task for both parametric and nonparametric methods. One promising solution is to combine the flexibility of nonparametric models with the simplicity of parametric models. In this paper, the self-organising mixture autoregressive (SOMAR) network is adopted as such a mixture model. It breaks time series into underlying segments and at the same time fits local linear regressive models to the clusters of segments. In such a way, a global non-stationary time series is represented by a dynamic set of local linear regressive models. Neural gas is used for a more flexible structure of the mixture model. Furthermore, a new similarity measure has been introduced in the self-organising network to better quantify the similarity of time series segments. The network can be used naturally in modelling and forecasting non-stationary time series. Experiments on artificial, benchmark time series (e.g. Mackey-Glass) and real-world data (e.g. numbers of sunspots and Forex rates) are presented, and the results show that the proposed SOMAR network is effective and superior to other similar approaches.
Superstatistical fluctuations in time series: Applications to share-price dynamics and turbulence
NASA Astrophysics Data System (ADS)
van der Straeten, Erik; Beck, Christian
2009-09-01
We report a general technique to study a given experimental time series with superstatistics. Crucial for the applicability of the superstatistics concept is the existence of a parameter β that fluctuates on a large time scale as compared to the other time scales of the complex system under consideration. The proposed method extracts the main superstatistical parameters out of a given data set and examines the validity of the superstatistical model assumptions. We test the method thoroughly with surrogate data sets. Then the applicability of the superstatistical approach is illustrated using real experimental data. We study two examples, velocity time series measured in turbulent Taylor-Couette flows and time series of log returns of the closing prices of some stock market indices.
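The core of the superstatistical analysis — a parameter β fluctuating on a time scale much longer than the local dynamics — can be sketched by estimating β as the inverse variance over long windows. The synthetic signal below, with a slowly varying local standard deviation, is an assumption of the example, not data from the paper.

```python
import numpy as np

def extract_beta(series, long_window):
    """Estimate the slowly fluctuating superstatistical parameter beta
    as the inverse variance of the signal in successive long windows."""
    n = len(series) // long_window
    chunks = series[:n * long_window].reshape(n, long_window)
    return 1.0 / chunks.var(axis=1)

rng = np.random.default_rng(2)
# Synthetic superstatistical signal: the local variance itself varies slowly
sigmas = rng.uniform(0.5, 2.0, size=50)
series = np.concatenate([rng.normal(0, s, 500) for s in sigmas])
beta = extract_beta(series, 500)
```

A full analysis would additionally test the distribution of the extracted β values (e.g. chi-square versus lognormal) to validate the superstatistical model assumptions.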
Coarse-graining time series data: Recurrence plot of recurrence plots and its application for music
NASA Astrophysics Data System (ADS)
Fukino, Miwa; Hirata, Yoshito; Aihara, Kazuyuki
2016-02-01
We propose a nonlinear time series method for characterizing two layers of regularity simultaneously. The key of the method is using the recurrence plots hierarchically, which allows us to preserve the underlying regularities behind the original time series. We demonstrate the proposed method with musical data. The proposed method enables us to visualize both the local and the global musical regularities or two different features at the same time. Furthermore, the determinism scores imply that the proposed method may be useful for analyzing emotional response to the music.
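The first layer of the hierarchy is an ordinary recurrence plot, which for a scalar series is a thresholded distance matrix. A minimal sketch:

```python
import numpy as np

def recurrence_plot(series, eps):
    """Boolean recurrence matrix: R[i, j] is True when states i and j
    are closer than the threshold eps."""
    series = np.asarray(series, dtype=float)
    d = np.abs(series[:, None] - series[None, :])
    return d < eps

x = np.sin(np.linspace(0, 8 * np.pi, 200))
R = recurrence_plot(x, eps=0.1)
```

The "recurrence plot of recurrence plots" then treats rows (or blocks) of `R` as the new states and applies the same construction at a coarser level, which is how the method captures two layers of regularity at once.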
46 CFR 108.550 - Survival craft launching and recovery arrangements: General.
Code of Federal Regulations, 2011 CFR
2011-10-01
... must be designed, based on the ultimate strength of the construction material, to be at least 4.5 times...-MOBILE OFFSHORE DRILLING UNITS DESIGN AND EQUIPMENT Lifesaving Equipment § 108.550 Survival craft... approved under approval series 160.132, with a winch approved under approval series 160.115. Each launching...
46 CFR 108.550 - Survival craft launching and recovery arrangements: General.
Code of Federal Regulations, 2010 CFR
2010-10-01
... must be designed, based on the ultimate strength of the construction material, to be at least 4.5 times...-MOBILE OFFSHORE DRILLING UNITS DESIGN AND EQUIPMENT Lifesaving Equipment § 108.550 Survival craft... approved under approval series 160.132, with a winch approved under approval series 160.115. Each launching...
Low Streamflow Forecasting using Minimum Relative Entropy
NASA Astrophysics Data System (ADS)
Cui, H.; Singh, V. P.
2013-12-01
Minimum relative entropy spectral analysis is derived in this study and applied to forecast streamflow time series. The proposed method extends the autocorrelation in such a way that the relative entropy of the underlying process is minimized, so that the time series can be forecasted. Different prior estimates, such as uniform, exponential, and Gaussian assumptions, are used to estimate the spectral density depending on the autocorrelation structure. Seasonal and nonseasonal low streamflow series obtained from the Colorado River (Texas) under drought conditions are successfully forecasted using the proposed method. Minimum relative entropy determines the spectrum of the low streamflow series with higher resolution than conventional methods. The forecasted streamflow is compared to predictions using Burg's maximum entropy spectral analysis (MESA) and configurational entropy. The advantages and disadvantages of each method in forecasting low streamflow are discussed.
Tuning the Voices of a Choir: Detecting Ecological Gradients in Time-Series Populations.
Buras, Allan; van der Maaten-Theunissen, Marieke; van der Maaten, Ernst; Ahlgrimm, Svenja; Hermann, Philipp; Simard, Sonia; Heinrich, Ingo; Helle, Gerd; Unterseher, Martin; Schnittler, Martin; Eusemann, Pascal; Wilmking, Martin
2016-01-01
This paper introduces a new approach, the Principal Component Gradient Analysis (PCGA), to detect ecological gradients in time-series populations, i.e. several time-series originating from different individuals of a population. Detection of ecological gradients is of particular importance when dealing with time-series from heterogeneous populations which express differing trends. PCGA makes use of polar coordinates of loadings from the first two axes obtained by principal component analysis (PCA) to define groups of similar trends. Based on the mean inter-series correlation (rbar), the gain in strength of a common underlying signal obtained by PCGA groups is quantified using Monte Carlo simulations. For validation, PCGA is compared to three other existing approaches. Focusing on dendrochronological examples, PCGA is shown to correctly determine population gradients and, in particular cases, to be advantageous over the other considered methods. Furthermore, PCGA groups in each example allowed for enhancing the strength of a common underlying signal, performing comparably to hierarchical cluster analysis. Our results indicate that PCGA potentially allows for a better understanding of mechanisms causing time-series population gradients as well as objectively enhancing the performance of climate transfer functions in dendroclimatology. While our examples highlight the relevance of PCGA to the field of dendrochronology, we believe that other disciplines working with data of comparable structure may also benefit from PCGA.
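The geometric core of PCGA — polar angles of the loadings on the first two principal components — can be sketched as follows. Grouping by angular proximity to the first series is a simplification of PCGA's actual grouping rule, and the two synthetic trend groups are assumptions of the example.

```python
import numpy as np

def pcga_groups(series_matrix):
    """PCGA-style sketch: PCA loadings of each individual series on the first
    two components, converted to polar angles; series are grouped by angular
    proximity (here: within 90 degrees of series 0)."""
    X = series_matrix - series_matrix.mean(axis=0)  # center each series
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    loadings = Vt[:2].T * s[:2]                     # one (PC1, PC2) pair per series
    angles = np.arctan2(loadings[:, 1], loadings[:, 0])
    labels = (np.cos(angles - angles[0]) > 0).astype(int)
    return labels, angles

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 120)
up = np.column_stack([2 * t + rng.normal(0, 0.1, 120) for _ in range(10)])
down = np.column_stack([-2 * t + rng.normal(0, 0.1, 120) for _ in range(10)])
labels, angles = pcga_groups(np.hstack([up, down]))
```

Series with opposing trends land at roughly opposite polar angles, which is what allows the method to separate gradient ends within a heterogeneous population.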
Time-dependent limited penetrable visibility graph analysis of nonstationary time series
NASA Astrophysics Data System (ADS)
Gao, Zhong-Ke; Cai, Qing; Yang, Yu-Xuan; Dang, Wei-Dong
2017-06-01
Recent years have witnessed the development of visibility graph theory, which allows us to analyze a time series from the perspective of complex networks. In this paper, we develop a novel time-dependent limited penetrable visibility graph (TDLPVG). Two examples using nonstationary time series from RR intervals and gas-liquid flows are provided to demonstrate the effectiveness of our approach. The results of the first example suggest that our TDLPVG method allows characterizing the time-varying behaviors and classifying the heart states of healthy subjects, congestive heart failure and atrial fibrillation from RR interval time series. For the second example, we infer TDLPVGs from gas-liquid flow signals and, interestingly, find that the deviation of the node degree of TDLPVGs enables us to effectively uncover the time-varying dynamical flow behaviors of gas-liquid slug and bubble flow patterns. All these results render our TDLPVG method particularly powerful for characterizing the time-varying features underlying realistic complex systems from time series.
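The underlying construction is the natural visibility graph: each sample is a node, and two samples are linked when no intermediate sample blocks the straight line between them. A minimal O(n²) sketch:

```python
import numpy as np

def visibility_graph(y):
    """Natural visibility graph: nodes are samples; i and j are linked when
    every intermediate sample lies strictly below the line joining them."""
    n = len(y)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            visible = all(
                y[k] < y[i] + (y[j] - y[i]) * (k - i) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                edges.add((i, j))
    return edges

edges = visibility_graph([3.0, 1.0, 2.0, 0.5, 4.0])
```

The limited penetrable variant relaxes the criterion to tolerate a fixed number of blocking points, and the time-dependent version of the paper builds such graphs on successive windows of the series.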
Complexity multiscale asynchrony measure and behavior for interacting financial dynamics
NASA Astrophysics Data System (ADS)
Yang, Ge; Wang, Jun; Niu, Hongli
2016-08-01
A stochastic financial price process is proposed and investigated by the finite-range multitype contact dynamical system, in an attempt to study the nonlinear behaviors of real asset markets. The viruses spreading process in a finite-range multitype system is used to imitate the interacting behaviors of diverse investment attitudes in a financial market, and the empirical research on descriptive statistics and autocorrelation behaviors of return time series is performed for different values of propagation rates. Then the multiscale entropy analysis is adopted to study several different shuffled return series, including the original return series, the corresponding reversal series, the random shuffled series, the volatility shuffled series and the Zipf-type shuffled series. Furthermore, we propose and compare the multiscale cross-sample entropy and its modification algorithm called composite multiscale cross-sample entropy. We apply them to study the asynchrony of pairs of time series under different time scales.
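The two building blocks of multiscale entropy — coarse-graining a series and computing sample entropy on each scale — can be sketched as below. The tolerance `r = 0.2` times the standard deviation and the template length `m = 2` are conventional choices, not values taken from the paper.

```python
import numpy as np

def coarse_grain(x, scale):
    """Multiscale step: non-overlapping averages of length `scale`."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy: negative log of the ratio of (m+1)- to m-length
    template matches within tolerance r * std (Chebyshev distance)."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    def matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        d = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
        return (d <= tol).sum() - len(templates)  # drop self-matches
    return -np.log(matches(m + 1) / matches(m))

rng = np.random.default_rng(4)
noise = rng.normal(size=500)
se_scale1 = sample_entropy(coarse_grain(noise, 1))
se_scale5 = sample_entropy(coarse_grain(noise, 5))
```

Cross-sample entropy, used in the paper to measure asynchrony between two series, follows the same template-matching idea but draws the two templates from different series.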
NASA Astrophysics Data System (ADS)
Ramli, Nazirah; Mutalib, Siti Musleha Ab; Mohamad, Daud
2017-08-01
Fuzzy time series forecasting models have been proposed since 1993 to cater for data in linguistic values. Many improvements and modifications have been made to the model, such as enhancements to the length of intervals and the types of fuzzy logical relations. However, most of the improved models represent the linguistic terms in the form of discrete fuzzy sets. In this paper, a fuzzy time series model with data in the form of trapezoidal fuzzy numbers and a natural partitioning length approach is introduced for predicting the unemployment rate. Two types of fuzzy relations are used in this study: first order and second order fuzzy relations. The proposed model can produce forecasted values under different degrees of confidence.
Zhao, Yan-Hong; Zhang, Xue-Fang; Zhao, Yan-Qiu; Bai, Fan; Qin, Fan; Sun, Jing; Dong, Ying
2017-08-01
Chronic myeloid leukemia (CML) is characterized by the accumulation of active BCR-ABL protein. Imatinib is the first-line treatment of CML; however, many patients are resistant to this drug. In this study, we aimed to compare the differences in expression patterns and functions of time-series genes in imatinib-resistant CML cells under different drug treatments. GSE24946 was downloaded from the GEO database, which included 17 samples of K562-r cells with (n=12) or without drug administration (n=5). Three drug treatment groups were considered for this study: arsenic trioxide (ATO), AMN107, and ATO+AMN107. Each group had one sample at each time point (3, 12, 24, and 48 h). Time-series genes with a ratio of standard deviation/average (coefficient of variation) >0.15 were screened, and their expression patterns were revealed based on Short Time-series Expression Miner (STEM). Then, the functional enrichment analysis of time-series genes in each group was performed using DAVID, and the genes enriched in the top ten functional categories were extracted to detect their expression patterns. Different time-series genes were identified in the three groups, and most of them were enriched in the ribosome and oxidative phosphorylation pathways. Time-series genes in the three treatment groups had different expression patterns and functions. Time-series genes in the ATO group (e.g. CCNA2 and DAB2) were significantly associated with cell adhesion, those in the AMN107 group were related to cellular carbohydrate metabolic process, while those in the ATO+AMN107 group (e.g. AP2M1) were significantly related to cell proliferation and antigen processing. In imatinib-resistant CML cells, ATO could influence genes related to cell adhesion, AMN107 might affect genes involved in cellular carbohydrate metabolism, and the combination therapy might regulate genes involved in cell proliferation.
76 FR 62481 - Incapital LLC, et al.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-07
... investment. At such time, the Series also will transmit to the Unaffiliated Underlying Fund a list of the... Application: Applicants request an order that would permit certain series of a registered unit investment... series thereof (the ``Funds'') both within and outside the same group of investment companies. Applicants...
A Nonlinear Dynamical Systems based Model for Stochastic Simulation of Streamflow
NASA Astrophysics Data System (ADS)
Erkyihun, S. T.; Rajagopalan, B.; Zagona, E. A.
2014-12-01
Traditional time series methods model the evolution of the underlying process as a linear or nonlinear function of the autocorrelation. These methods capture the distributional statistics but are incapable of providing insights into the dynamics of the process, the potential regimes, and predictability. This work develops a nonlinear dynamical model for stochastic simulation of streamflows. First, a wavelet spectral analysis is employed on the flow series to isolate dominant orthogonal quasi-periodic time series components. The periodic bands are added to form the 'signal' component of the time series, with the residual being the 'noise' component. Next, the underlying nonlinear dynamics of this combined band time series is recovered. For this, the univariate time series is embedded in a d-dimensional space with an appropriate lag T to recover the state space in which the dynamics unfolds. Predictability is assessed by quantifying the divergence of trajectories in the state space with time, as Lyapunov exponents. The nonlinear dynamics, in conjunction with a K-nearest neighbor time resampling, is used to simulate the combined band, to which the noise component is added to simulate the time series. We demonstrate this method by applying it to the data at Lees Ferry that comprise both the paleo-reconstructed and naturalized historic annual flows spanning 1490-2010. We identify interesting dynamics of the signal in the flow series and epochal behavior of predictability. These will be of immense use for water resources planning and management.
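The embedding and K-nearest-neighbor resampling steps can be sketched as follows, with unit lag for simplicity and a synthetic quasi-periodic series standing in for the flow data; the dimension, neighborhood size, and test signal are assumptions of the example.

```python
import numpy as np

def embed(x, dim):
    """Time-delay embedding with unit lag: row t is (x[t], ..., x[t+dim-1])."""
    n = len(x) - dim + 1
    return np.column_stack([x[i:i + n] for i in range(dim)])

def knn_simulate(x, steps, dim=3, k=5, seed=0):
    """K-nearest-neighbour resampling in the reconstructed state space: sample
    the successor of one of the k historical states nearest the current state."""
    rng = np.random.default_rng(seed)
    states = embed(x, dim)[:-1]      # each of these states has a known successor
    successors = x[dim:]
    sim = list(x[-dim:])             # start from the last observed state
    for _ in range(steps):
        current = np.array(sim[-dim:])
        dist = np.linalg.norm(states - current, axis=1)
        neighbour = rng.choice(np.argsort(dist)[:k])
        sim.append(successors[neighbour])
    return np.array(sim[dim:])

t = np.linspace(0, 40 * np.pi, 2000)
flow = np.sin(t) + 0.1 * np.sin(3.3 * t)  # stand-in for an annual flow series
sim = knn_simulate(flow, steps=300)
```

Because the simulator only resamples observed successors, the simulated trajectory stays on (an approximation of) the reconstructed attractor; the paper's 'noise' component would be added back on top of this signal simulation.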
Burby, Joshua W.; Lacker, Daniel
2016-01-01
Systems as diverse as the interacting species in a community, alleles at a genetic locus, and companies in a market are characterized by competition (over resources, space, capital, etc) and adaptation. Neutral theory, built around the hypothesis that individual performance is independent of group membership, has found utility across the disciplines of ecology, population genetics, and economics, both because of the success of the neutral hypothesis in predicting system properties and because deviations from these predictions provide information about the underlying dynamics. However, most tests of neutrality are weak, based on static system properties such as species-abundance distributions or the number of singletons in a sample. Time-series data provide a window onto a system's dynamics, and should furnish tests of the neutral hypothesis that are more powerful in detecting deviations from neutrality and more informative about the type of competitive asymmetry that drives the deviation. Here, we present a neutrality test for time-series data. We apply this test to several microbial time-series and financial time-series and find that most of these systems are not neutral. Our test isolates the covariance structure of neutral competition, thus facilitating further exploration of the nature of asymmetry in the covariance structure of competitive systems. Much like neutrality tests from population genetics that use relative abundance distributions have enabled researchers to scan entire genomes for genes under selection, we anticipate our time-series test will be useful for quick significance tests of neutrality across a range of ecological, economic, and sociological systems for which time-series data are available. Future work can use our test to categorize and compare the dynamic fingerprints of particular competitive asymmetries (frequency dependence, volatility smiles, etc) to improve forecasting and management of complex adaptive systems. PMID:27689714
2017-01-01
are the shear relaxation moduli and relaxation times, which make up the classical Prony series. A Prony-series expansion is a relaxation function... approximation for modeling time-dependent damping. The scalar parameters 1 and 2 control the nonlinearity of the Prony series. Under the... Velodyne that best fit the experimental stress-strain data. To do so, the Design Analysis Kit for Optimization and Terascale Applications (DAKOTA
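The Prony series mentioned in this fragment is the standard viscoelastic relaxation expansion G(t) = G_inf + sum_i G_i exp(-t / tau_i). A minimal evaluation sketch, with hypothetical two-term coefficients (the fragment's actual fitted values are not given):

```python
import numpy as np

def prony_relaxation(t, g_inf, g_i, tau_i):
    """Prony-series shear relaxation modulus:
    G(t) = G_inf + sum_i G_i * exp(-t / tau_i)."""
    t = np.asarray(t, dtype=float)[:, None]
    return g_inf + (np.asarray(g_i) * np.exp(-t / np.asarray(tau_i))).sum(axis=1)

# Hypothetical terms: long-time modulus 1.0, pairs (0.5, 0.1 s) and (0.3, 1.0 s)
t = np.array([0.0, 0.1, 1.0, 10.0])
G = prony_relaxation(t, 1.0, [0.5, 0.3], [0.1, 1.0])
```

Fitting the `g_i` and `tau_i` to stress-strain data is the optimization task the fragment assigns to DAKOTA.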
22 CFR 1203.735-401 - Employees required to submit statements.
Code of Federal Regulations, 2011 CFR
2011-04-01
... consultants serving on a full-time or intermittent basis, except when waived under § 1203.735-402(c). (b... overseas whose positions fall within the following series or position titles (occupational code given in parenthesis): Economist Series (0110); International Cooperation Series (0136); Auditor General (0301.21...
22 CFR 1203.735-401 - Employees required to submit statements.
Code of Federal Regulations, 2013 CFR
2013-04-01
... consultants serving on a full-time or intermittent basis, except when waived under § 1203.735-402(c). (b... overseas whose positions fall within the following series or position titles (occupational code given in parenthesis): Economist Series (0110); International Cooperation Series (0136); Auditor General (0301.21...
22 CFR 1203.735-401 - Employees required to submit statements.
Code of Federal Regulations, 2012 CFR
2012-04-01
... consultants serving on a full-time or intermittent basis, except when waived under § 1203.735-402(c). (b... overseas whose positions fall within the following series or position titles (occupational code given in parenthesis): Economist Series (0110); International Cooperation Series (0136); Auditor General (0301.21...
22 CFR 1203.735-401 - Employees required to submit statements.
Code of Federal Regulations, 2014 CFR
2014-04-01
... consultants serving on a full-time or intermittent basis, except when waived under § 1203.735-402(c). (b... overseas whose positions fall within the following series or position titles (occupational code given in parenthesis): Economist Series (0110); International Cooperation Series (0136); Auditor General (0301.21...
22 CFR 1203.735-401 - Employees required to submit statements.
Code of Federal Regulations, 2010 CFR
2010-04-01
... consultants serving on a full-time or intermittent basis, except when waived under § 1203.735-402(c). (b... overseas whose positions fall within the following series or position titles (occupational code given in parenthesis): Economist Series (0110); International Cooperation Series (0136); Auditor General (0301.21...
NASA Astrophysics Data System (ADS)
Cannas, Barbara; Fanni, Alessandra; Murari, Andrea; Pisano, Fabio; Contributors, JET
2018-02-01
In this paper, the dynamic characteristics of type-I ELM time-series from the JET tokamak, the world’s largest magnetic confinement plasma physics experiment, have been investigated. The dynamic analysis has been focused on the detection of nonlinear structure in Dα radiation time series. Firstly, the method of surrogate data has been applied to evaluate the statistical significance of the null hypothesis of a static nonlinear distortion of an underlying Gaussian linear process. Several nonlinear statistics have been evaluated, such as the time-delayed mutual information, the correlation dimension and the maximal Lyapunov exponent. The obtained results allow us to reject the null hypothesis, giving evidence of underlying nonlinear dynamics. Moreover, no evidence of low-dimensional chaos has been found; indeed, the analysed time series are better characterized by a power-law sensitivity to initial conditions, which suggests a motion at the ‘edge of chaos’, at the border between chaotic and regular non-chaotic dynamics. This uncertainty makes it necessary to investigate the nature of the nonlinear dynamics further. For this purpose, a second surrogate test, designed to distinguish chaotic orbits from pseudo-periodic orbits, has been applied. In this case, we cannot reject the null hypothesis, which means that the ELM time series is possibly pseudo-periodic. In order to reproduce the pseudo-periodic dynamical properties, a state-of-the-art periodic model proposed to reproduce the ELM cycle has been corrupted by dynamical noise, yielding time series qualitatively in agreement with the experimental ones.
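The surrogate-data procedure described above can be sketched in a few lines. The sketch below is a generic illustration, not the authors' exact pipeline: it builds phase-randomized surrogates (the null of a Gaussian linear process) and compares a simple third-order time-reversal-asymmetry statistic against them, with a chaotic logistic map standing in for the Dα signal.

```python
import numpy as np

def ft_surrogate(x, rng):
    """Phase-randomized surrogate: keeps the power spectrum (and hence the
    linear correlations), destroys any nonlinear structure."""
    n = len(x)
    fx = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(fx))
    phases[0] = 0.0              # keep the mean
    if n % 2 == 0:
        phases[-1] = 0.0         # keep the Nyquist component real
    return np.fft.irfft(np.abs(fx) * np.exp(1j * phases), n)

def rev_asymmetry(x, lag=1):
    """A simple nonlinear statistic: third-order time-reversal asymmetry."""
    d = x[lag:] - x[:-lag]
    return np.mean(d ** 3) / np.mean(d ** 2) ** 1.5

rng = np.random.default_rng(0)
x = np.empty(2000); x[0] = 0.3
for i in range(1, len(x)):
    x[i] = 3.9 * x[i - 1] * (1 - x[i - 1])   # chaotic logistic map as test data
stat = rev_asymmetry(x)
surr = [rev_asymmetry(ft_surrogate(x, rng)) for _ in range(99)]
print(stat < min(surr) or stat > max(surr))  # True -> reject the linear null
```

With 99 surrogates, a data statistic outside the surrogate range corresponds to a two-sided rank test at the 2% level; the paper's statistics (mutual information, correlation dimension, Lyapunov exponent) would simply replace `rev_asymmetry` here.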
Yuan, Yuan; Chen, Yi-Ping Phoebe; Ni, Shengyu; Xu, Augix Guohua; Tang, Lin; Vingron, Martin; Somel, Mehmet; Khaitovich, Philipp
2011-08-18
Comparing biological time series data across different conditions, or different specimens, is a common but still challenging task. Algorithms aligning two time series represent a valuable tool for such comparisons. While many powerful computation tools for time series alignment have been developed, they do not provide significance estimates for time shift measurements. Here, we present an extended version of the original DTW algorithm that allows us to determine the significance of time shift estimates in time series alignments, the DTW-Significance (DTW-S) algorithm. The DTW-S combines important properties of the original algorithm and other published time series alignment tools: DTW-S calculates the optimal alignment for each time point of each gene, it uses interpolated time points for time shift estimation, and it does not require alignment of the time-series end points. As a new feature, we implement a simulation procedure based on parameters estimated from real time series data, on a series-by-series basis, allowing us to determine the false positive rate (FPR) and the significance of the estimated time shift values. We assess the performance of our method using simulation data and real expression time series from two published primate brain expression datasets. Our results show that this method can provide accurate and robust time shift estimates for each time point on a gene-by-gene basis. Using these estimates, we are able to uncover novel features of the biological processes underlying human brain development and maturation. The DTW-S provides a convenient tool for calculating accurate and robust time shift estimates at each time point for each gene, based on time series data. The estimates can be used to uncover novel biological features of the system being studied. The DTW-S is freely available as an R package TimeShift at http://www.picb.ac.cn/Comparative/data.html.
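The dynamic-programming core that DTW-S extends is compact enough to sketch. The following is plain DTW only — none of the paper's significance estimation, interpolation, or open-ended alignment — with names chosen for illustration.

```python
import numpy as np

def dtw(a, b):
    """Classic dynamic time warping: minimal cumulative |a_i - b_j| cost
    over all monotone alignments of the two series."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# A time-shifted copy aligns far more cheaply under warping than under
# a rigid point-by-point comparison.
t = np.linspace(0, 2 * np.pi, 50)
x, y = np.sin(t), np.sin(t - 0.5)
print(dtw(x, y), np.abs(x - y).sum())
```

The warping path recovered from `D` is what carries the time-shift information that DTW-S then assesses for significance against simulated series.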
The Design of Time-Series Comparisons under Resource Constraints.
ERIC Educational Resources Information Center
Willemain, Thomas R.; Hartunian, Nelson S.
1982-01-01
Two methods for dividing an interrupted time-series study between baseline and experimental phases when study resources are limited are compared. In fixed designs, the baseline duration is predetermined. In flexible designs the baseline duration is contingent on remaining resources and the match of results to prior expectations of the evaluator.…
Mathematical Sciences Division 1992 Programs
1992-10-01
statistical theory that underlies modern signal analysis. There is a strong emphasis on stochastic processes and time series, particularly those which... include optimal resource planning and real-time scheduling of stochastic shop-floor processes. Scheduling systems will be developed that can adapt to... make forecasts for the length-of-service time series. Protocol analysis of these sessions will be used to identify relevant contextual features and to
An improvement of the measurement of time series irreversibility with visibility graph approach
NASA Astrophysics Data System (ADS)
Wu, Zhenyu; Shang, Pengjian; Xiong, Hui
2018-07-01
We propose a method to improve the measurement of real-valued time series irreversibility which combines two tools: the directed horizontal visibility graph and the Kullback-Leibler divergence. The degree of time irreversibility is estimated by the Kullback-Leibler divergence between the in- and out-degree distributions of the associated visibility graph. In our work, we reframe the in- and out-degree distributions by encoding them with the different embedded dimensions used in calculating permutation entropy (PE). With this improved method, we can not only estimate time series irreversibility efficiently, but also detect time series irreversibility from multiple dimensions. We verify the validity of our method and then estimate the amount of time irreversibility of series generated by chaotic maps as well as of global stock markets over the period 2005-2015. The results show that the amount of time irreversibility reaches its peak at embedded dimension d = 3, both in the simulated experiments and in the financial markets.
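A minimal version of the degree-distribution comparison is easy to write down. The sketch below is illustrative only: it builds the directed horizontal visibility graph by a running-maximum scan and takes the plain KL divergence on the common support, without the embedding-dimension refinement that is the paper's contribution.

```python
import numpy as np

def dhvg_degrees(x):
    """Directed horizontal visibility graph: i -> j (i < j) whenever every
    value strictly between them lies below min(x[i], x[j])."""
    n = len(x)
    k_in, k_out = np.zeros(n, int), np.zeros(n, int)
    for i in range(n):
        top = -np.inf                       # running max between i and j
        for j in range(i + 1, n):
            if top < min(x[i], x[j]):
                k_out[i] += 1
                k_in[j] += 1
            top = max(top, x[j])
            if top >= x[i]:                 # i's horizontal view is blocked
                break
    return k_in, k_out

def kl_irreversibility(x):
    """Kullback-Leibler divergence between out- and in-degree distributions,
    restricted to their common support for finite samples."""
    k_in, k_out = dhvg_degrees(x)
    kmax = int(max(k_in.max(), k_out.max()))
    p = np.array([np.mean(k_out == k) for k in range(kmax + 1)])
    q = np.array([np.mean(k_in == k) for k in range(kmax + 1)])
    m = (p > 0) & (q > 0)
    return float(np.sum(p[m] * np.log(p[m] / q[m])))

rng = np.random.default_rng(0)
x = np.empty(1000); x[0] = 0.4
for i in range(1, len(x)):
    x[i] = 3.9 * x[i - 1] * (1 - x[i - 1])       # irreversible chaotic map
print(kl_irreversibility(x), kl_irreversibility(rng.standard_normal(1000)))
```

For a statistically time-reversible series the two degree distributions coincide in the limit and the divergence tends to zero; an irreversible chaotic map should yield a visibly larger value.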
Prediction of flow dynamics using point processes
NASA Astrophysics Data System (ADS)
Hirata, Yoshito; Stemler, Thomas; Eroglu, Deniz; Marwan, Norbert
2018-01-01
Describing a time series parsimoniously is the first step to studying the underlying dynamics. For a time-discrete system, a generating partition provides a compact description such that a time series and a symbolic sequence are one-to-one. But for a time-continuous system, such a compact description does not have a solid basis. Here, we propose to describe a time-continuous time series using a local cross section and the times when the orbit crosses it. We show that if such a series of crossing times and some past observations are given, we can predict the system’s dynamics with high accuracy. This reconstructability depends strongly on neither the size nor the placement of the local cross section, provided we have a sufficiently long database. We demonstrate the proposed method using the Lorenz model as well as actual measurements of wind speed.
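For a one-dimensional signal, the idea of recording the times at which the orbit crosses a local section reduces to level-crossing detection. The sketch below is a stand-in (illustrative names, a sine wave instead of the Lorenz system): it finds upward crossings of a threshold and interpolates linearly between samples to estimate the crossing times.

```python
import numpy as np

def crossing_times(t, y, level=0.0):
    """Times of upward crossings of `level`, with linear interpolation
    between samples; a 1-D stand-in for a local cross section."""
    idx = np.where((y[:-1] < level) & (y[1:] >= level))[0]
    frac = (level - y[idx]) / (y[idx + 1] - y[idx])
    return t[idx] + frac * (t[idx + 1] - t[idx])

t = np.linspace(0, 20, 4000)
ct = crossing_times(t, np.sin(2 * np.pi * t))   # upward zero crossings
print(np.diff(ct)[:3])                          # inter-crossing times ≈ 1.0
```

The sequence of crossing times is exactly the kind of point-process description the paper proposes; for a periodic signal the inter-crossing intervals recover the period.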
77 FR 33790 - Hennion & Walsh, Inc. and Smart Trust; Notice of Application
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-07
... investment. At such time, the Series also will transmit to the Unaffiliated Underlying Fund a list of the... of the Application: Applicants request an order that would permit certain series of a unit investment... and unit investment trusts or series thereof (the ``Funds'') both within and outside the same group of...
77 FR 4588 - Incapital LLC and Incapital Unit Trust; Notice of Application
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-30
... sales charge. If such a market is not maintained at any time for any Series, holders of the Units... future unit investment trusts (collectively, with the Incapital Trust, the ``Trusts'') and series of the Trusts (``Series'') that are sponsored by Incapital or any entity controlling, controlled by or under...
31 CFR 360.35 - Payment (redemption).
Code of Federal Regulations, 2013 CFR
2013-07-01
... earlier, will be paid at any time after six months from issue date. A Series I bond issued on February 1... STATES SAVINGS BONDS, SERIES I General Provisions for Payment § 360.35 Payment (redemption). (a) General. Payment of a Series I savings bond will be made to the person or persons entitled under the provisions of...
31 CFR 360.35 - Payment (redemption).
Code of Federal Regulations, 2010 CFR
2010-07-01
... earlier, will be paid at any time after six months from issue date. A Series I bond issued on February 1... STATES SAVINGS BONDS, SERIES I General Provisions for Payment § 360.35 Payment (redemption). (a) General. Payment of a Series I savings bond will be made to the person or persons entitled under the provisions of...
31 CFR 360.35 - Payment (redemption).
Code of Federal Regulations, 2011 CFR
2011-07-01
... earlier, will be paid at any time after six months from issue date. A Series I bond issued on February 1... STATES SAVINGS BONDS, SERIES I General Provisions for Payment § 360.35 Payment (redemption). (a) General. Payment of a Series I savings bond will be made to the person or persons entitled under the provisions of...
31 CFR 360.35 - Payment (redemption).
Code of Federal Regulations, 2014 CFR
2014-07-01
... earlier, will be paid at any time after six months from issue date. A Series I bond issued on February 1... STATES SAVINGS BONDS, SERIES I General Provisions for Payment § 360.35 Payment (redemption). (a) General. Payment of a Series I savings bond will be made to the person or persons entitled under the provisions of...
31 CFR 360.35 - Payment (redemption).
Code of Federal Regulations, 2012 CFR
2012-07-01
... earlier, will be paid at any time after six months from issue date. A Series I bond issued on February 1... STATES SAVINGS BONDS, SERIES I General Provisions for Payment § 360.35 Payment (redemption). (a) General. Payment of a Series I savings bond will be made to the person or persons entitled under the provisions of...
A Bayesian nonparametric approach to dynamical noise reduction
NASA Astrophysics Data System (ADS)
Kaloudis, Konstantinos; Hatjispyros, Spyridon J.
2018-06-01
We propose a Bayesian nonparametric approach for the noise reduction of a given chaotic time series contaminated by dynamical noise, based on Markov Chain Monte Carlo methods. The underlying unknown noise process possibly exhibits heavy-tailed behavior. We introduce the Dynamic Noise Reduction Replicator model, with which we reconstruct the unknown dynamic equations and, in parallel, replicate the dynamics under dynamical perturbations with a reduced noise level. The dynamic noise reduction procedure is demonstrated specifically in the case of polynomial maps. Simulations based on synthetic time series are presented.
Asquith, W.H.; Mosier, J. G.; Bush, P.W.
1997-01-01
The watershed simulation model Hydrologic Simulation Program—Fortran (HSPF) was used to generate simulated flow (runoff) from the 13 watersheds to the six bay systems because adequate gaged streamflow data from which to estimate freshwater inflows are not available; only about 23 percent of the adjacent contributing watershed area is gaged. The model was calibrated for the gaged parts of three watersheds—that is, selected input parameters (meteorologic and hydrologic properties and conditions) that control runoff were adjusted in a series of simulations until an adequate match between model-generated flows and a set (time series) of gaged flows was achieved. The primary model input is rainfall and evaporation data and the model output is a time series of runoff volumes. After calibration, simulations driven by daily rainfall for a 26-year period (1968–93) were done for the 13 watersheds to obtain runoff under current (1983–93), predevelopment (pre-1940 streamflow and pre-urbanization), and future (2010) land-use conditions for estimating freshwater inflows and for comparing runoff under the three land-use conditions; and to obtain time series of runoff from which to estimate time series of freshwater inflows for trend analysis.
Transition Icons for Time-Series Visualization and Exploratory Analysis.
Nickerson, Paul V; Baharloo, Raheleh; Wanigatunga, Amal A; Manini, Todd M; Tighe, Patrick J; Rashidi, Parisa
2018-03-01
The modern healthcare landscape has seen the rapid emergence of techniques and devices that temporally monitor and record physiological signals. The prevalence of time-series data within the healthcare field necessitates the development of methods that can analyze the data in order to draw meaningful conclusions. Time-series behavior is notoriously difficult to intuitively understand due to its intrinsic high-dimensionality, which is compounded in the case of analyzing groups of time series collected from different patients. Our framework, which we call transition icons, renders common patterns in a visual format useful for understanding the shared behavior within groups of time series. Transition icons are adept at detecting and displaying subtle differences and similarities, e.g., between measurements taken from patients receiving different treatment strategies or stratified by demographics. We introduce various methods that collectively allow for exploratory analysis of groups of time series, while being free of distribution assumptions and including simple heuristics for parameter determination. Our technique extracts discrete transition patterns from symbolic aggregate approXimation representations, and compiles transition frequencies into a bag of patterns constructed for each group. These transition frequencies are normalized and aligned in icon form to intuitively display the underlying patterns. We demonstrate the transition icon technique for two time-series datasets: postoperative pain scores, and hip-worn accelerometer activity counts. We believe transition icons can be an important tool for researchers approaching time-series data, as they give rich and intuitive information about collective time-series behaviors.
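The symbolic aggregate approXimation step that the transition icons build on can be sketched as follows. This is a minimal illustration with an assumed 4-letter alphabet and raw adjacent-pair counts; the paper's normalization and icon rendering are not reproduced.

```python
import numpy as np
from collections import Counter

# Breakpoints splitting the standard normal into 4 equiprobable regions,
# one per symbol of the alphabet 'abcd'.
BREAKPOINTS = [-0.6745, 0.0, 0.6745]

def sax(x, word_len):
    """SAX word: z-normalize, piecewise-aggregate into word_len segments,
    discretize each segment mean against the Gaussian breakpoints."""
    x = (x - x.mean()) / x.std()
    means = [s.mean() for s in np.array_split(x, word_len)]
    return ''.join('abcd'[np.searchsorted(BREAKPOINTS, m)] for m in means)

def transition_counts(word):
    """Bag of adjacent symbol transitions, the raw material of an icon."""
    return Counter(word[i:i + 2] for i in range(len(word) - 1))

t = np.linspace(0, 4 * np.pi, 400)
w = sax(np.sin(t), 16)
print(w, transition_counts(w).most_common(3))
```

Aggregating such transition counts per patient group and normalizing them is what turns the bags of patterns into comparable icons.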
Application of dynamic topic models to toxicogenomics data.
Lee, Mikyung; Liu, Zhichao; Huang, Ruili; Tong, Weida
2016-10-06
All biological processes are inherently dynamic. Biological systems evolve transiently or sustainably according to sequential time points after perturbation by environment insults, drugs and chemicals. Investigating the temporal behavior of molecular events has been an important subject to understand the underlying mechanisms governing the biological system in response to, such as, drug treatment. The intrinsic complexity of time series data requires appropriate computational algorithms for data interpretation. In this study, we propose, for the first time, the application of dynamic topic models (DTM) for analyzing time-series gene expression data. A large time-series toxicogenomics dataset was studied. It contains over 3144 microarrays of gene expression data corresponding to rat livers treated with 131 compounds (most are drugs) at two doses (control and high dose) in a repeated schedule containing four separate time points (4-, 8-, 15- and 29-day). We analyzed, with DTM, the topics (consisting of a set of genes) and their biological interpretations over these four time points. We identified hidden patterns embedded in this time-series gene expression profiles. From the topic distribution for compound-time condition, a number of drugs were successfully clustered by their shared mode-of-action such as PPARɑ agonists and COX inhibitors. The biological meaning underlying each topic was interpreted using diverse sources of information such as functional analysis of the pathways and therapeutic uses of the drugs. Additionally, we found that sample clusters produced by DTM are much more coherent in terms of functional categories when compared to traditional clustering algorithms. We demonstrated that DTM, a text mining technique, can be a powerful computational approach for clustering time-series gene expression profiles with the probabilistic representation of their dynamic features along sequential time frames. 
The method offers an alternative way for uncovering hidden patterns embedded in time series gene expression profiles to gain enhanced understanding of dynamic behavior of gene regulation in the biological system.
Monitoring of seismic time-series with advanced parallel computational tools and complex networks
NASA Astrophysics Data System (ADS)
Kechaidou, M.; Sirakoulis, G. Ch.; Scordilis, E. M.
2012-04-01
Earthquakes have been a focus of human and research interest for centuries because of their catastrophic effects on everyday life; they occur almost all over the world and exhibit an unpredictable behaviour that is hard to model. On the other hand, their monitoring with increasingly updated instruments has been almost continuous, and thanks to this several mathematical models have been proposed to describe possible connections and patterns found in the resulting seismological time-series. In Greece especially, one of the most seismically active territories on earth, detailed instrumental seismological data are available from the beginning of the past century, providing researchers with valuable knowledge about seismicity levels across the country. With powerful parallel computational tools such as Cellular Automata, these data can be further analysed and, most importantly, modelled to reveal possible connections between different parameters of the seismic time-series under study. More specifically, Cellular Automata have proven very effective for composing and modelling nonlinear complex systems, leading to several corresponding models proposed as analogues of earthquake fault dynamics. In this work, preliminary results of modelling the seismic time-series with Cellular Automata, so as to compose and develop the corresponding complex networks, are presented. The proposed methodology will be able to reveal otherwise hidden relations in the examined time-series and to distinguish their intrinsic characteristics, in an effort to transform the examined time-series into complex networks and graphically represent their evolution in time and space.
Consequently, based on the presented results, the proposed model will eventually serve as an efficient and flexible computational tool providing a generic understanding of possible triggering mechanisms, as derived from adequate monitoring and modelling of the regional earthquake phenomena.
Time irreversibility and intrinsics revealing of series with complex network approach
NASA Astrophysics Data System (ADS)
Xiong, Hui; Shang, Pengjian; Xia, Jianan; Wang, Jing
2018-06-01
In this work, we analyze time series on the basis of the visibility graph algorithm that maps the original series into a graph. By taking into account the all-round information carried by the signals, the time irreversibility and fractal behavior of series are evaluated from a complex network perspective, and the considered signals are further classified from different aspects. The reliability of the proposed analysis is supported by numerical simulations on synthesized uncorrelated random noise, short-term correlated chaotic systems and long-term correlated fractal processes, and by the empirical analysis on daily closing prices of eleven worldwide stock indices. The obtained results suggest that finite size has a significant effect on the evaluation, and that there might be no direct relation between the time irreversibility and long-range correlation of series. Similarity and dissimilarity between stock indices are also indicated from regional and global perspectives, respectively, showing the existence of multiple features of the underlying systems.
Constructing networks from a dynamical system perspective for multivariate nonlinear time series.
Nakamura, Tomomichi; Tanizawa, Toshihiro; Small, Michael
2016-03-01
We describe a method for constructing networks for multivariate nonlinear time series. We approach the interaction between the various scalar time series from a deterministic dynamical system perspective and provide a generic and algorithmic test for whether the interaction between two measured time series is statistically significant. The method can be applied even when the data exhibit no obvious qualitative similarity: a situation in which the naive method utilizing the cross correlation function directly cannot correctly identify connectivity. To establish the connectivity between nodes we apply the previously proposed small-shuffle surrogate (SSS) method, which can investigate whether there are correlation structures in short-term variabilities (irregular fluctuations) between two data sets from the viewpoint of deterministic dynamical systems. The procedure to construct networks based on this idea is composed of three steps: (i) each time series is considered as a basic node of a network, (ii) the SSS method is applied to verify the connectivity between each pair of time series taken from the whole multivariate time series, and (iii) the pair of nodes is connected with an undirected edge when the null hypothesis cannot be rejected. The network constructed by the proposed method indicates the intrinsic (essential) connectivity of the elements included in the system or the underlying (assumed) system. The method is demonstrated for numerical data sets generated by known systems and applied to several experimental time series.
Time-Series Analysis of Intermittent Velocity Fluctuations in Turbulent Boundary Layers
NASA Astrophysics Data System (ADS)
Zayernouri, Mohsen; Samiee, Mehdi; Meerschaert, Mark M.; Klewicki, Joseph
2017-11-01
Classical turbulence theory is modified under the inhomogeneities produced by the presence of a wall. In this regard, we propose a new time series model for the streamwise velocity fluctuations in the inertial sub-layer of turbulent boundary layers. The new model employs tempered fractional calculus and seamlessly extends the classical 5/3 spectral model of Kolmogorov in the inertial subrange to the whole spectrum from large to small scales. Moreover, the proposed time-series model allows the quantification of data uncertainties in the underlying stochastic cascade of turbulent kinetic energy. The model is tested using well-resolved streamwise velocity measurements up to friction Reynolds numbers of about 20,000. The physics of the energy cascade is briefly described within the context of the determined model parameters. This work was supported by the AFOSR Young Investigator Program (YIP) award (FA9550-17-1-0150) and partially by MURI/ARO (W911NF-15-1-0562).
Nonparametric autocovariance estimation from censored time series by Gaussian imputation.
Park, Jung Wook; Genton, Marc G; Ghosh, Sujit K
2009-02-01
One of the most frequently used methods to model the autocovariance function of a second-order stationary time series is to use the parametric framework of autoregressive and moving average models developed by Box and Jenkins. However, such parametric models, though very flexible, may not always be adequate to model autocovariance functions with sharp changes. Furthermore, if the data do not follow the parametric model and are censored at a certain value, the estimation results may not be reliable. We develop a Gaussian imputation method to estimate an autocovariance structure via nonparametric estimation of the autocovariance function in order to address both censoring and incorrect model specification. We demonstrate the effectiveness of the technique in terms of bias and efficiency with simulations under various rates of censoring and underlying models. We describe its application to a time series of silicon concentrations in the Arctic.
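For reference, the uncensored building block of such an approach — the plain nonparametric sample autocovariance, with the 1/n normalization that keeps the estimate positive semidefinite — looks like the sketch below. The Gaussian-imputation layer for censored observations is the paper's contribution and is not reproduced; the MA(1) example data are an assumption for illustration.

```python
import numpy as np

def sample_acov(x, max_lag):
    """Plain nonparametric sample autocovariance up to max_lag, using the
    1/n normalization (positive semidefinite, slightly biased)."""
    x = np.asarray(x, float)
    n = len(x)
    xc = x - x.mean()
    return np.array([np.dot(xc[:n - h], xc[h:]) / n for h in range(max_lag + 1)])

rng = np.random.default_rng(0)
e = rng.standard_normal(5001)
x = e[1:] + 0.8 * e[:-1]     # MA(1): true autocorrelation 0.8/1.64 at lag 1, 0 beyond
g = sample_acov(x, 3)
print(g / g[0])              # lag-1 entry ≈ 0.49, later lags near 0
```

An MA(1) process has an autocovariance that cuts off sharply after lag 1 — exactly the kind of sharp change the paper argues parametric Box-Jenkins fits can miss when the model family is wrong.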
Statistical inference methods for sparse biological time series data.
Ndukum, Juliet; Fonseca, Luís L; Santos, Helena; Voit, Eberhard O; Datta, Susmita
2011-04-25
Comparing metabolic profiles under different biological perturbations has become a powerful approach to investigating the functioning of cells. The profiles can be taken as single snapshots of a system, but more information is gained if they are measured longitudinally over time. The results are short time series consisting of relatively sparse data that cannot be analyzed effectively with standard time series techniques, such as autocorrelation and frequency domain methods. In this work, we study longitudinal time series profiles of glucose consumption in the yeast Saccharomyces cerevisiae under different temperatures and preconditioning regimens, which we obtained with methods of in vivo nuclear magnetic resonance (NMR) spectroscopy. For the statistical analysis we first fit several nonlinear mixed effect regression models to the longitudinal profiles and then used an ANOVA likelihood ratio method in order to test for significant differences between the profiles. The proposed methods are capable of distinguishing metabolic time trends resulting from different treatments and associate significance levels to these differences. Among several nonlinear mixed-effects regression models tested, a three-parameter logistic function represents the data with highest accuracy. ANOVA and likelihood ratio tests suggest that there are significant differences between the glucose consumption rate profiles for cells that had been--or had not been--preconditioned by heat during growth. Furthermore, pair-wise t-tests reveal significant differences in the longitudinal profiles for glucose consumption rates between optimal conditions and heat stress, optimal and recovery conditions, and heat stress and recovery conditions (p-values <0.0001). We have developed a nonlinear mixed effects model that is appropriate for the analysis of sparse metabolic and physiological time profiles. 
The model permits sound statistical inference procedures, based on ANOVA likelihood ratio tests, for testing the significance of differences between short time course data under different biological perturbations.
26 CFR 1.454-1 - Obligations issued at discount.
Code of Federal Regulations, 2010 CFR
2010-04-01
... series E bond, at which time the stated redemption value was $674.60. A never elected under section 454(a... obligation, in which he retains his investment in a matured series E U.S. savings bond, or (iii) A nontransferable obligation (whether or not a current income obligation) of the United States for which a series E...
Takayasu, Hideki; Takayasu, Misako
2017-01-01
We extend the concept of statistical symmetry as the invariance of a probability distribution under transformation to analyze binary sign time series data of price differences from the foreign exchange market. We model segments of the sign time series as Markov sequences and apply a local hypothesis test to evaluate the symmetries of independence and time reversion in different periods of the market. For the test, we derive the probability that a binary Markov process generates a given set of symbol-pair counts. Using such analysis, we could not only segment the time series according to the different behaviors but also characterize the segments in terms of statistical symmetries. As a particular result, we find that the foreign exchange market is essentially time reversible but that this symmetry is broken when there is a strong external influence. PMID:28542208
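The symbol-pair counting that underlies such a test is straightforward to sketch. The code below is not the authors' local hypothesis test; it only shows the maximum-likelihood estimation of a binary Markov transition matrix from pair counts, with independent coin flips standing in for the price-sign series.

```python
import numpy as np
from collections import Counter

def pair_counts(signs):
    """Counts of the four ordered symbol pairs (++, +-, -+, --)."""
    return Counter(zip(signs[:-1], signs[1:]))

def transition_matrix(signs):
    """Maximum-likelihood binary Markov transition probabilities,
    estimated from the symbol-pair counts (rows: from -1, from +1)."""
    P = np.zeros((2, 2))
    for (a, b), n in pair_counts(signs).items():
        P[(a + 1) // 2, (b + 1) // 2] = n
    return P / P.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
signs = rng.choice([-1, 1], size=20000)   # independent "price-sign" flips
print(transition_matrix(signs).round(2))  # both rows ≈ (0.5, 0.5)
```

Under the independence symmetry both rows of the transition matrix coincide with the marginal sign frequencies; deviations between the rows are what a hypothesis test on the pair counts would flag.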
Detection of crossover time scales in multifractal detrended fluctuation analysis
NASA Astrophysics Data System (ADS)
Ge, Erjia; Leung, Yee
2013-04-01
Fractal analysis is employed in this paper as a scale-based method for the identification of the scaling behavior of time series. Many spatial and temporal processes exhibiting complex multi(mono)-scaling behaviors are fractals. One of the important concepts in fractals is the crossover time scale(s) that separates distinct regimes having different fractal scaling behaviors. A common method is multifractal detrended fluctuation analysis (MF-DFA). The detection of crossover time scale(s) is, however, relatively subjective, since it has been made without rigorous statistical procedures and has generally been determined by eye-balling or subjective observation. Crossover time scales so determined may be spurious and problematic and may not reflect the genuine underlying scaling behavior of a time series. The purpose of this paper is to propose a statistical procedure to model complex fractal scaling behaviors and reliably identify the crossover time scales under MF-DFA. The scaling-identification regression model, grounded on a solid statistical foundation, is first proposed to describe the multi-scaling behaviors of fractals. Through regression analysis and statistical inference, we can (1) identify the crossover time scales that cannot be detected by eye-balling observation, (2) determine the number and locations of the genuine crossover time scales, (3) give confidence intervals for the crossover time scales, and (4) establish the statistically significant regression model depicting the underlying scaling behavior of a time series. To substantiate our argument, the regression model is applied to analyze the multi-scaling behaviors of avian-influenza outbreaks, water consumption, daily mean temperature, and rainfall of Hong Kong. Through the proposed model, we can have a deeper understanding of fractals in general and a statistical approach to identify multi-scaling behavior under MF-DFA in particular.
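The fluctuation function at the heart of MF-DFA can be sketched for the monofractal q = 2 case with first-order detrending; the paper's contribution — the segmented-regression identification of statistically significant crossovers in log F(s) versus log s — sits on top of a computation like this and is not reproduced here.

```python
import numpy as np

def dfa(x, scales):
    """DFA fluctuation function F(s) (q = 2, linear detrending):
    integrate the series, detrend in non-overlapping windows of size s,
    and return the RMS residual at each scale."""
    y = np.cumsum(x - np.mean(x))
    F = []
    for s in scales:
        n = len(y) // s
        segs = y[:n * s].reshape(n, s)
        t = np.arange(s)
        msq = [np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2)
               for seg in segs]
        F.append(np.sqrt(np.mean(msq)))
    return np.array(F)

rng = np.random.default_rng(0)
scales = np.array([16, 32, 64, 128, 256])
F = dfa(rng.standard_normal(10000), scales)
# A single global log-log slope; a crossover would show up as a change
# of slope, which the paper locates by piecewise regression instead.
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(round(alpha, 2))       # ≈ 0.5 for white noise
```

A single fitted slope is exactly the eye-balling shortcut the paper criticizes: when the true F(s) bends, only a segmented fit with inference on the breakpoints identifies genuine crossover scales.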
Quantifying evolutionary dynamics from variant-frequency time series
NASA Astrophysics Data System (ADS)
Khatri, Bhavin S.
2016-09-01
From Kimura’s neutral theory of protein evolution to Hubbell’s neutral theory of biodiversity, quantifying the relative importance of neutrality versus selection has long been a basic question in evolutionary biology and ecology. With deep sequencing technologies, this question is taking on a new form: given a time-series of the frequency of different variants in a population, what is the likelihood that the observation has arisen due to selection or neutrality? To tackle the 2-variant case, we exploit Fisher’s angular transformation, which, despite being discovered by Ronald Fisher a century ago, has remained an intellectual curiosity. We show that, together with a heuristic approach, it provides a simple solution for the transition probability density at short times, including drift, selection and mutation. Our results show that under strong selection and sufficiently frequent sampling these evolutionary parameters can be accurately determined from simulation data, and so they provide a theoretical basis for techniques to detect selection from variant or polymorphism frequency time-series.
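The variance-stabilizing property that makes Fisher's angular transformation useful here can be checked numerically, independent of the paper's derivation. The sketch below simulates one neutral Wright-Fisher generation (the population size N is an arbitrary choice for the illustration) and verifies that the variance of the transformed frequency change is roughly 1/(8N) at every starting frequency.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000                     # population size, an arbitrary choice here

def wf_step(p):
    """One neutral Wright-Fisher generation: binomial resampling of
    2N gene copies at frequency p."""
    return rng.binomial(2 * N, p, size=p.shape) / (2 * N)

# Fisher's angular transformation phi = arcsin(sqrt(p)) stabilizes the
# variance of drift: Var(delta phi) ~ 1/(8N), independent of p.
for p0 in (0.1, 0.5, 0.9):
    p = np.full(200000, p0)
    dphi = np.arcsin(np.sqrt(wf_step(p))) - np.arcsin(np.sqrt(p))
    print(p0, round(dphi.var() * 8 * N, 2))   # ≈ 1 at every frequency
```

In the untransformed coordinate the drift variance p(1-p)/(2N) depends on p; after the transformation it is frequency-independent, which is what yields the simple short-time transition density the abstract refers to.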
ERIC Educational Resources Information Center
St. Clair, Travis; Hallberg, Kelly; Cook, Thomas D.
2016-01-01
We explore the conditions under which short, comparative interrupted time-series (CITS) designs represent valid alternatives to randomized experiments in educational evaluations. To do so, we conduct three within-study comparisons, each of which uses a unique data set to test the validity of the CITS design by comparing its causal estimates to…
46 CFR 503.83 - Public announcement of changes in meeting.
Code of Federal Regulations, 2011 CFR
2011-10-01
... a portion or portions of a series of meetings, the time and place of such meeting, and the... in §§ 503.80 and 503.81, under the provisions of paragraphs (b) and (c) of this section, the time or place of a meeting or series of meetings may be changed by the agency following accomplishment of the...
46 CFR 503.83 - Public announcement of changes in meeting.
Code of Federal Regulations, 2014 CFR
2014-10-01
... a portion or portions of a series of meetings, the time and place of such meeting, and the... in §§ 503.80 and 503.81, under the provisions of paragraphs (b) and (c) of this section, the time or place of a meeting or series of meetings may be changed by the agency following accomplishment of the...
46 CFR 503.83 - Public announcement of changes in meeting.
Code of Federal Regulations, 2013 CFR
2013-10-01
... a portion or portions of a series of meetings, the time and place of such meeting, and the... in §§ 503.80 and 503.81, under the provisions of paragraphs (b) and (c) of this section, the time or place of a meeting or series of meetings may be changed by the agency following accomplishment of the...
46 CFR 503.83 - Public announcement of changes in meeting.
Code of Federal Regulations, 2012 CFR
2012-10-01
... a portion or portions of a series of meetings, the time and place of such meeting, and the... in §§ 503.80 and 503.81, under the provisions of paragraphs (b) and (c) of this section, the time or place of a meeting or series of meetings may be changed by the agency following accomplishment of the...
46 CFR 503.83 - Public announcement of changes in meeting.
Code of Federal Regulations, 2010 CFR
2010-10-01
... a portion or portions of a series of meetings, the time and place of such meeting, and the... in §§ 503.80 and 503.81, under the provisions of paragraphs (b) and (c) of this section, the time or place of a meeting or series of meetings may be changed by the agency following accomplishment of the...
Memory and betweenness preference in temporal networks induced from time series
NASA Astrophysics Data System (ADS)
Weng, Tongfeng; Zhang, Jie; Small, Michael; Zheng, Rui; Hui, Pan
2017-02-01
We construct temporal networks from time series by unfolding the temporal information into an additional topological dimension of the networks. Thus, we are able to introduce memory entropy analysis to unravel the memory effect within the considered signal. We find distinct patterns in the entropy growth rate of the aggregate network at different memory scales for time series with different dynamics, ranging from white noise, 1/f noise, autoregressive processes and periodic dynamics to chaotic dynamics. Interestingly, for a chaotic time series, an exponential scaling emerges in the memory entropy analysis. We demonstrate that the memory exponent can successfully characterize bifurcation phenomena and differentiate the human cardiac system in healthy and pathological states. Moreover, we show that the betweenness preference analysis of these temporal networks can further characterize dynamical systems and separate distinct electrocardiogram recordings. Our work explores the memory effect and betweenness preference in temporal networks constructed from time series data, providing a new perspective for understanding the underlying dynamical systems.
Multiscale multifractal detrended cross-correlation analysis of financial time series
NASA Astrophysics Data System (ADS)
Shi, Wenbin; Shang, Pengjian; Wang, Jing; Lin, Aijing
2014-06-01
In this paper, we introduce a method called multiscale multifractal detrended cross-correlation analysis (MM-DCCA). The method allows us to extend the description of the cross-correlation properties between two time series. MM-DCCA may provide new ways of measuring the nonlinearity of two signals, and it presents much richer information than multifractal detrended cross-correlation analysis (MF-DCCA) by sweeping the full range of scales at which the multifractal structure of a complex system is examined. To illustrate the advantages of this approach, we use MM-DCCA to analyze the cross-correlation properties between financial time series. We show that this new method can be adapted to the stock markets under investigation, and that it provides a more faithful and more interpretable description of the dynamic mechanism between financial time series than traditional MF-DCCA. We also propose reducing the scale ranges when analyzing short time series, so that inherent properties which remain hidden when a wide range is used may emerge clearly.
Washburne, Alex D.; Burby, Joshua W.; Lacker, Daniel; ...
2016-09-30
Systems as diverse as the interacting species in a community, alleles at a genetic locus, and companies in a market are characterized by competition (over resources, space, capital, etc.) and adaptation. Neutral theory, built around the hypothesis that individual performance is independent of group membership, has found utility across the disciplines of ecology, population genetics, and economics, both because of the success of the neutral hypothesis in predicting system properties and because deviations from these predictions provide information about the underlying dynamics. However, most tests of neutrality are weak, based on static system properties such as species-abundance distributions or the number of singletons in a sample. Time-series data provide a window onto a system's dynamics, and should furnish tests of the neutral hypothesis that are more powerful to detect deviations from neutrality and more informative about the type of competitive asymmetry that drives the deviation. Here, we present a neutrality test for time-series data. We apply this test to several microbial time series and financial time series and find that most of these systems are not neutral. Our test isolates the covariance structure of neutral competition, thus facilitating further exploration of the nature of asymmetry in the covariance structure of competitive systems. Much as neutrality tests from population genetics that use relative abundance distributions have enabled researchers to scan entire genomes for genes under selection, we anticipate our time-series test will be useful for quick significance tests of neutrality across a range of ecological, economic, and sociological systems for which time-series data are available. Future work can use our test to categorize and compare the dynamic fingerprints of particular competitive asymmetries (frequency dependence, volatility smiles, etc.) to improve forecasting and management of complex adaptive systems.
Unraveling multiple changes in complex climate time series using Bayesian inference
NASA Astrophysics Data System (ADS)
Berner, Nadine; Trauth, Martin H.; Holschneider, Matthias
2016-04-01
Change points in time series are perceived as heterogeneities in the statistical or dynamical characteristics of observations. Unraveling such transitions yields essential information for the understanding of the observed system. The precise detection and basic characterization of underlying changes is therefore of particular importance in environmental sciences. We present a kernel-based Bayesian inference approach to investigate direct as well as indirect climate observations for multiple generic transition events. In order to develop a diagnostic approach designed to capture a variety of natural processes, the basic statistical features of central tendency and dispersion are used to locally approximate a complex time series by a generic transition model. A Bayesian inversion approach is developed to robustly infer the location and the generic pattern of such a transition. To systematically investigate time series for multiple changes occurring at different temporal scales, the Bayesian inversion is extended to a kernel-based inference approach. By introducing basic kernel measures, the kernel inference results are composed into a proxy of the posterior distribution of multiple transitions. Thus, based on a generic transition model, a probability expression is derived that can indicate multiple changes within a complex time series. We discuss the method's performance by investigating direct and indirect climate observations. The approach is applied to environmental time series (about 100 years) from the weather station in Tuscaloosa, Alabama, and confirms documented instrumentation changes. Moreover, the approach is used to investigate a set of complex terrigenous dust records from ODP sites 659, 721/722 and 967, interpreted as climate indicators for the African region during the Plio-Pleistocene (about 5 Ma). The detailed inference unravels multiple transitions underlying the indirect climate observations, coinciding with established global climate events.
MEM spectral analysis for predicting influenza epidemics in Japan.
Sumi, Ayako; Kamo, Ken-ichi
2012-03-01
The prediction of influenza epidemics has long been the focus of attention in epidemiology and mathematical biology. In this study, we tested whether time series analysis was useful for predicting the incidence of influenza in Japan. The method of time series analysis we used consists of spectral analysis based on the maximum entropy method (MEM) in the frequency domain and the nonlinear least squares method in the time domain. Using this time series analysis, we analyzed the incidence data of influenza in Japan from January 1948 to December 1998; these data are unique in that they covered the periods of pandemics in Japan in 1957, 1968, and 1977. On the basis of the MEM spectral analysis, we identified the periodic modes explaining the underlying variations of the incidence data. The optimum least squares fitting (LSF) curve calculated with the periodic modes reproduced the underlying variation of the incidence data. An extension of the LSF curve could be used to predict the incidence of influenza quantitatively. Our study suggested that MEM spectral analysis would allow us to model temporal variations of influenza epidemics with multiple periodic modes much more effectively than by using the method of conventional time series analysis, which has been used previously to investigate the behavior of temporal variations in influenza data.
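The least squares fitting (LSF) step described above reduces to ordinary linear regression once a periodic mode has been identified: fitting sine and cosine terms at the known period recovers the mode's amplitude and phase. The sketch below assumes a single 12-month mode has already been found (the MEM spectral-analysis step itself is omitted); all numbers are illustrative, not the paper's influenza data.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(240)                      # e.g. 20 years of monthly incidence
period = 12.0                           # mode assumed identified by MEM
x = 5.0 + 2.0 * np.sin(2 * np.pi * t / period) + 0.3 * rng.standard_normal(t.size)

# LSF of the identified mode: x(t) ~ c + a*sin(2*pi*t/T) + b*cos(2*pi*t/T)
A = np.column_stack([np.ones(t.size),
                     np.sin(2 * np.pi * t / period),
                     np.cos(2 * np.pi * t / period)])
c, a, b = np.linalg.lstsq(A, x, rcond=None)[0]
amplitude = np.hypot(a, b)              # recovered amplitude of the mode
```

Extending the fitted curve beyond the observation window is what yields the quantitative prediction described in the abstract; with several identified periods, the design matrix simply gains one sine/cosine pair per mode.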
26 CFR 1.454-1 - Obligations issued at discount.
Code of Federal Regulations, 2013 CFR
2013-04-01
... series E bond, at which time the stated redemption value was $674.60. A never elected under section 454(a... current income obligation, in which he retains his investment in a matured series E U.S. savings bond, or... for which a series E U.S. savings bond was exchanged (whether or not at final maturity) in an exchange...
26 CFR 1.454-1 - Obligations issued at discount.
Code of Federal Regulations, 2011 CFR
2011-04-01
... series E bond, at which time the stated redemption value was $674.60. A never elected under section 454(a... current income obligation, in which he retains his investment in a matured series E U.S. savings bond, or... for which a series E U.S. savings bond was exchanged (whether or not at final maturity) in an exchange...
26 CFR 1.454-1 - Obligations issued at discount.
Code of Federal Regulations, 2012 CFR
2012-04-01
... series E bond, at which time the stated redemption value was $674.60. A never elected under section 454(a... current income obligation, in which he retains his investment in a matured series E U.S. savings bond, or... for which a series E U.S. savings bond was exchanged (whether or not at final maturity) in an exchange...
NASA Astrophysics Data System (ADS)
Feng, Lian-Li; Tian, Shou-Fu; Wang, Xiu-Bin; Zhang, Tian-Tian
2016-09-01
In this paper, the time fractional Fordy-Gibbons equation is investigated with the Riemann-Liouville derivative. The equation can be reduced to the Caudrey-Dodd-Gibbon equation, the Sawada-Kotera equation, the Kaup-Kupershmidt equation, etc. By means of the Lie group analysis method, the invariance properties and symmetry reductions of the equation are derived. Furthermore, by means of power series theory, exact power series solutions of the equation are constructed. Finally, two kinds of conservation laws of the equation are obtained with the aid of the self-adjoint method. Supported by the Fundamental Research Funds for Key Discipline Construction under Grant No. XZD201602, the Fundamental Research Funds for the Central Universities under Grant Nos. 2015QNA53 and 2015XKQY14, the Fundamental Research Funds for Postdoctoral at the Key Laboratory of Gas and Fire Control for Coal Mines, the General Financial Grant from the China Postdoctoral Science Foundation under Grant No. 2015M570498, and the Natural Sciences Foundation of China under Grant No. 11301527
Functional linear models to test for differences in prairie wetland hydraulic gradients
Greenwood, Mark C.; Sojda, Richard S.; Preston, Todd M.; Swayne, David A.; Yang, Wanhong; Voinov, A.A.; Rizzoli, A.; Filatova, T.
2010-01-01
Functional data analysis provides a framework for analyzing multiple time series measured frequently in time, treating each series as a continuous function of time. Functional linear models are used to test for effects on hydraulic gradient functional responses collected from three types of land use in Northeastern Montana at fourteen locations. Penalized regression-splines are used to estimate the underlying continuous functions based on the discretely recorded (over time) gradient measurements. Permutation methods are used to assess the statistical significance of effects. A method for accommodating missing observations in each time series is described. Hydraulic gradients may be an initial and fundamental ecosystem process that responds to climate change. We suggest other potential uses of these methods for detecting evidence of climate change.
Adventures in Modern Time Series Analysis: From the Sun to the Crab Nebula and Beyond
NASA Technical Reports Server (NTRS)
Scargle, Jeffrey
2014-01-01
With the generation of long, precise, and finely sampled time series, the Age of Digital Astronomy is uncovering and elucidating energetic dynamical processes throughout the Universe. Fulfilling these opportunities requires effective data analysis techniques that rapidly and automatically implement advanced concepts. The Time Series Explorer, under development in collaboration with Tom Loredo, provides tools ranging from simple but optimal histograms to time- and frequency-domain analysis for arbitrary data modes with any time sampling. Much of this development owes its existence to Joe Bredekamp and the encouragement he provided over several decades. Sample results for solar chromospheric activity, gamma-ray activity in the Crab Nebula, active galactic nuclei and gamma-ray bursts will be displayed.
Testing for nonlinearity in time series: The method of surrogate data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Theiler, J.; Galdrikian, B.; Longtin, A.
1991-01-01
We describe a statistical approach for identifying nonlinearity in time series; in particular, we want to avoid claims of chaos when simpler models (such as linearly correlated noise) can explain the data. The method requires a careful statement of the null hypothesis which characterizes a candidate linear process, the generation of an ensemble of "surrogate" data sets which are similar to the original time series but consistent with the null hypothesis, and the computation of a discriminating statistic for the original and for each of the surrogate data sets. The idea is to test the original time series against the null hypothesis by checking whether the discriminating statistic computed for the original time series differs significantly from the statistics computed for each of the surrogate sets. We present algorithms for generating surrogate data under various null hypotheses, and we show the results of numerical experiments on artificial data using correlation dimension, Lyapunov exponent, and forecasting error as discriminating statistics. Finally, we consider a number of experimental time series -- including sunspots, electroencephalogram (EEG) signals, and fluid convection -- and evaluate the statistical significance of the evidence for nonlinear structure in each case.
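The surrogate-generation step for the simplest null hypothesis (linearly correlated Gaussian noise) can be sketched as Fourier phase randomization: the surrogate keeps the original amplitude spectrum, and hence all linear correlations, while destroying any nonlinear structure. A minimal sketch of that one step, not the authors' full testing procedure; the smoothing filter and seed are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.standard_normal(1024)
x = np.convolve(x, np.ones(5) / 5, mode="same")   # linearly correlated noise

def ft_surrogate(x, rng):
    """Phase-randomized surrogate: same amplitude spectrum, random phases."""
    n = len(x)
    X = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, len(X))
    phases[0] = np.angle(X[0])           # keep DC as-is (preserves the mean)
    if n % 2 == 0:
        phases[-1] = np.angle(X[-1])     # keep the Nyquist bin real as well
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n)

s = ft_surrogate(x, rng)
# The surrogate shares the linear properties (power spectrum, autocorrelation)
# of x, so a discriminating statistic that exceeds the surrogate ensemble
# indicates structure beyond the linear null hypothesis.
same_spectrum = np.allclose(np.abs(np.fft.rfft(s)), np.abs(np.fft.rfft(x)))
```

In the full procedure one generates an ensemble of such surrogates and compares the statistic of the original series against their distribution.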
Large-scale Granger causality analysis on resting-state functional MRI
NASA Astrophysics Data System (ADS)
D'Souza, Adora M.; Abidin, Anas Zainul; Leistritz, Lutz; Wismüller, Axel
2016-03-01
We demonstrate an approach to measure the information flow between each pair of time series in resting-state functional MRI (fMRI) data of the human brain and subsequently recover its underlying network structure. By integrating dimensionality reduction into predictive time series modeling, the large-scale Granger causality (lsGC) analysis method can reveal directed information flow suggestive of causal influence at the individual-voxel level, unlike other multivariate approaches. This method quantifies the influence each voxel time series has on every other voxel time series in a multivariate sense and hence contains information about the underlying dynamics of the whole system, which can be used to reveal functionally connected networks within the brain. To identify such networks, we perform non-metric network clustering, such as accomplished by the Louvain method. We demonstrate the effectiveness of our approach in recovering the motor and visual cortex from resting-state human brain fMRI data and compare it with the network recovered from a visuomotor stimulation experiment, where the similarity is measured by the Dice coefficient (DC). The best DC obtained was 0.59, implying strong agreement between the two networks. In addition, we thoroughly study the effect of dimensionality reduction in lsGC analysis on network recovery. We conclude that our approach is capable of detecting causal influence between time series in a multivariate sense, which can be used to segment functionally connected networks in resting-state fMRI.
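The core of any Granger-causality index, before dimensionality reduction enters, is a comparison of prediction-error variances: the source series Granger-causes the target if adding the source's past to an autoregressive model of the target reduces the residual variance. A minimal bivariate, lag-1 sketch on synthetic data (the lsGC method itself operates on many voxel time series and is not reproduced here); the 0.8 coupling and all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 5000
x, y = np.zeros(n), np.zeros(n)
for t in range(1, n):                     # y drives x, but not vice versa
    y[t] = 0.5 * y[t - 1] + rng.standard_normal()
    x[t] = 0.5 * x[t - 1] + 0.8 * y[t - 1] + rng.standard_normal()

def granger_index(target, source):
    """Lag-1 Granger index: log ratio of restricted/full residual variances."""
    T, tp, sp = target[1:], target[:-1], source[:-1]
    A_r = np.column_stack([np.ones_like(tp), tp])      # own past only
    A_f = np.column_stack([np.ones_like(tp), tp, sp])  # own past + source past
    resid = lambda A: T - A @ np.linalg.lstsq(A, T, rcond=None)[0]
    return float(np.log(resid(A_r).var() / resid(A_f).var()))

gc_y_to_x = granger_index(x, y)   # clearly positive: y's past helps predict x
gc_x_to_y = granger_index(y, x)   # near zero: x's past adds nothing for y
```

The asymmetry between the two indices is what encodes the direction of information flow.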
Identification of human operator performance models utilizing time series analysis
NASA Technical Reports Server (NTRS)
Holden, F. M.; Shinners, S. M.
1973-01-01
The results of an effort performed by Sperry Systems Management Division for AMRL in applying time series analysis as a tool for modeling the human operator are presented. This technique is utilized for determining the variation of the human transfer function under various levels of stress. The human operator's model is determined based on actual input and output data from a tracking experiment.
NASA Astrophysics Data System (ADS)
Zhang, Yun-Wei; Gu, Zhao-Lin; Cheng, Yan; Lee, Shun-Cheng
2011-07-01
Air flow and pollutant dispersion characteristics in an urban street canyon are studied under real-time boundary conditions. A new scheme for realizing real-time boundary conditions in simulations is proposed, to keep the upper boundary wind conditions consistent with the measured time series of wind data. The air flow structure and its evolution under real-time boundary wind conditions are simulated by using this new scheme. The induced effect of the time series of ambient wind conditions on the flow structures inside and above the street canyon is investigated. The flow shows an obvious intermittent feature in the street canyon, and the flapping of the shear layer that forms near the roof layer under real-time wind conditions results in the expansion or compression of the air mass in the canyon. The simulations of pollutant dispersion show that the pollutants inside and above the street canyon are transported by different dispersion mechanisms, depending on the time series of air flow structures. Large-scale air movements during the expansion or compression of the air mass in the canyon exhibit obvious effects on pollutant dispersion. The simulations also show that the transport of pollutants from the canyon to the upper air flow is dominated by the shear-layer turbulence near the roof level and by the expansion or compression of the air mass in the street canyon under real-time boundary wind conditions. In particular, the expansion of the air mass, which features large-scale movement of the air mass, contributes more to pollutant dispersion in this study. Comparisons of simulated results under different boundary wind conditions indicate that real-time boundary wind conditions produce better conditions for pollutant dispersion than artificially designed steady boundary wind conditions.
Reconstructing multi-mode networks from multivariate time series
NASA Astrophysics Data System (ADS)
Gao, Zhong-Ke; Yang, Yu-Xuan; Dang, Wei-Dong; Cai, Qing; Wang, Zhen; Marwan, Norbert; Boccaletti, Stefano; Kurths, Jürgen
2017-09-01
Unveiling the dynamics hidden in multivariate time series is a task of the utmost importance in a broad variety of areas in physics. We here propose a method that leads to the construction of a novel functional network, a multi-mode weighted graph combined with an empirical mode decomposition, and to the realization of multi-information fusion of multivariate time series. The method is illustrated in a couple of successful applications (a multi-phase flow and an epileptic electro-encephalogram), which demonstrate its power in revealing the dynamical behaviors underlying the transitions of different flow patterns, and in enabling the differentiation of brain states of seizure and non-seizure.
Modelling short time series in metabolomics: a functional data analysis approach.
Montana, Giovanni; Berk, Maurice; Ebbels, Tim
2011-01-01
Metabolomics is the study of the complement of small molecule metabolites in cells, biofluids and tissues. Many metabolomic experiments are designed to compare changes observed over time under two or more experimental conditions (e.g. a control and drug-treated group), thus producing time course data. Models from traditional time series analysis are often unsuitable because, by design, only very few time points are available and there are a high number of missing values. We propose a functional data analysis approach for modelling short time series arising in metabolomic studies which overcomes these obstacles. Our model assumes that each observed time series is a smooth random curve, and we propose a statistical approach for inferring this curve from repeated measurements taken on the experimental units. A test statistic for detecting differences between temporal profiles associated with two experimental conditions is then presented. The methodology has been applied to NMR spectroscopy data collected in a pre-clinical toxicology study.
Generalized Riemann hypothesis and stochastic time series
NASA Astrophysics Data System (ADS)
Mussardo, Giuseppe; LeClair, André
2018-06-01
Using the Dirichlet theorem on the equidistribution of residue classes modulo q and the Lemke Oliver–Soundararajan conjecture on the distribution of pairs of residues on consecutive primes, we show that the domain of convergence of the infinite product of Dirichlet L-functions of non-principal characters can be extended from Re(s) > 1 down to Re(s) > 1/2, without encountering any zeros before reaching this critical line. The possibility of doing so can be traced back to a universal diffusive random walk behavior of a series C_N over the primes which underlies the convergence of the infinite product of the Dirichlet functions. The series C_N presents several aspects in common with stochastic time series, and its control requires addressing a problem similar to the single Brownian trajectory problem in statistical mechanics. In the case of the Dirichlet functions of non-principal characters, we show that this problem can be solved in terms of a self-averaging procedure based on an ensemble of block variables computed on extended intervals of primes. Those intervals, called inertial intervals, ensure the ergodicity and stationarity of the time series underlying the quantity C_N. The infinity of primes also ensures the absence of rare events which would have been responsible for a scaling behavior different from the universal law of random walks.
NASA Astrophysics Data System (ADS)
D'Souza, Adora M.; Abidin, Anas Zainul; Nagarajan, Mahesh B.; Wismüller, Axel
2016-03-01
We investigate the applicability of a computational framework, called mutual connectivity analysis (MCA), for directed functional connectivity analysis in both synthetic and resting-state functional MRI data. This framework comprises first evaluating non-linear cross-predictability between every pair of time series and then recovering the underlying network structure using community detection algorithms. We obtain the non-linear cross-prediction score between time series using Generalized Radial Basis Functions (GRBF) neural networks. These cross-prediction scores characterize the underlying functionally connected networks within the resting brain, which can be extracted using non-metric clustering approaches, such as the Louvain method. We first test our approach on synthetic models with known directional influence and network structure. Our method is able to capture the directional relationships between time series (with an area under the ROC curve = 0.92 ± 0.037) as well as the underlying network structure (Rand index = 0.87 ± 0.063) with high accuracy. Furthermore, we test this method for network recovery on resting-state fMRI data, where results are compared to the motor cortex network recovered from a motor stimulation sequence, resulting in strong agreement between the two (Dice coefficient = 0.45). We conclude that our MCA approach is effective in analyzing non-linear directed functional connectivity and in revealing the underlying functional network structure in complex systems.
Rényi’s information transfer between financial time series
NASA Astrophysics Data System (ADS)
Jizba, Petr; Kleinert, Hagen; Shefaat, Mohammad
2012-05-01
In this paper, we quantify the statistical coherence between financial time series by means of the Rényi entropy. With the help of Campbell’s coding theorem, we show that the Rényi entropy selectively emphasizes only certain sectors of the underlying empirical distribution while strongly suppressing others. This accentuation is controlled with Rényi’s parameter q. To tackle the issue of the information flow between time series, we formulate the concept of Rényi’s transfer entropy as a measure of information that is transferred only between certain parts of underlying distributions. This is particularly pertinent in financial time series, where the knowledge of marginal events such as spikes or sudden jumps is of a crucial importance. We apply the Rényian information flow to stock market time series from 11 world stock indices as sampled at a daily rate in the time period 02.01.1990-31.12.2009. Corresponding heat maps and net information flows are represented graphically. A detailed discussion of the transfer entropy between the DAX and S&P500 indices based on minute tick data gathered in the period 02.04.2008-11.09.2009 is also provided. Our analysis shows that the bivariate information flow between world markets is strongly asymmetric with a distinct information surplus flowing from the Asia-Pacific region to both European and US markets. An important yet less dramatic excess of information also flows from Europe to the US. This is particularly clearly seen from a careful analysis of Rényi information flow between the DAX and S&P500 indices.
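Transfer entropy in the Shannon case (the q → 1 limit of the Rényi version used above) measures the information the source's past adds about the target's next value beyond the target's own past: TE = Σ p(x_{t+1}, x_t, y_t) log[p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t)]. Below is a minimal plug-in estimate on synthetic binary symbols, not the paper's Rényi estimator on market data; the 0.3 flip probability is an illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 20000
y = rng.integers(0, 2, n)                 # an i.i.d. binary "source" series
x = np.zeros(n, dtype=int)
flip = rng.random(n) < 0.3
x[1:] = np.where(flip[1:], 1 - y[:-1], y[:-1])   # x copies y's past, 30% noise

def transfer_entropy(target, source):
    """Plug-in Shannon TE (q -> 1 limit of Renyi TE), lag 1, binary symbols."""
    counts = np.zeros((2, 2, 2))
    for a, b, c in zip(target[1:], target[:-1], source[:-1]):
        counts[a, b, c] += 1                 # joint counts (x_{t+1}, x_t, y_t)
    p = counts / counts.sum()
    p_ab = p.sum(axis=2, keepdims=True)      # p(x_{t+1}, x_t)
    p_bc = p.sum(axis=0, keepdims=True)      # p(x_t, y_t)
    p_b = p.sum(axis=(0, 2), keepdims=True)  # p(x_t)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2((p * p_b)[nz] / (p_ab * p_bc)[nz])))

te_y_to_x = transfer_entropy(x, y)    # information flows y -> x
te_x_to_y = transfer_entropy(y, x)    # essentially zero in this direction
```

The asymmetry between the two directions is exactly what the heat maps and net flows in the abstract summarize across market pairs.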
Identifying the scale-dependent motifs in atmospheric surface layer by ordinal pattern analysis
NASA Astrophysics Data System (ADS)
Li, Qinglei; Fu, Zuntao
2018-07-01
Ramp-like structures in various atmospheric surface layer time series have long been studied, but the presence of finer-scale motifs embedded within larger-scale ramp-like structures has largely been overlooked in the literature. Here a novel, objective and well-adapted methodology, ordinal pattern analysis, is adopted to study the finer-scale motifs in atmospheric boundary-layer (ABL) time series. The analysis shows that the motifs represented by different ordinal patterns exhibit clustering properties, and that 6 dominant motifs out of the 24 possible motifs account for about 45% of the time series at particular scales, which indicates the high contribution of finer-scale motifs to the series. Further analysis indicates that motif statistics are similar for stable and unstable conditions at larger scales, but large discrepancies are found at smaller scales, where the frequencies of motifs "1234" and/or "4321" are somewhat higher under stable conditions than under unstable conditions. Under stable conditions, the occurrence frequencies of motifs "1234" and "4321" change markedly: the frequency of motif "1234" decreases from nearly 24% to 4.5% as the scale factor increases, and the frequency of motif "4321" changes nonlinearly with increasing scale. These pronounced scale-dependent changes of the dominant motifs can be taken as an indicator to quantify changes in flow structure under different stability conditions, and a motif entropy defined from only the 6 dominant motifs can quantify this time-scale-independent property of the motifs. All these results suggest that the scale at which the finer-scale motifs are defined should be carefully taken into consideration when interpreting turbulent coherent structures.
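The scale-dependent ordinal pattern counting described above can be sketched minimally as follows; `ordinal_pattern_counts` and its parameter names are illustrative, with the scale factor implemented as a simple sub-sampling stride (one common convention, not necessarily the authors' exact one).

```python
import numpy as np
from collections import Counter

def ordinal_pattern_counts(x, order=4, scale=1):
    """Count ordinal patterns of a given order at a coarse-graining scale.

    A pattern is the tuple of ranks within a (sub-sampled) window, so the
    motif written "1234" in the abstract corresponds here to (0, 1, 2, 3).
    """
    x = np.asarray(x, float)
    span = (order - 1) * scale
    counts = Counter()
    for i in range(len(x) - span):
        window = x[i : i + span + 1 : scale]
        ranks = tuple(int(r) for r in np.argsort(np.argsort(window)))
        counts[ranks] += 1
    return counts

# A strictly increasing series contains only the "1234" motif at any scale.
c1 = ordinal_pattern_counts(np.arange(50), order=4, scale=1)
c2 = ordinal_pattern_counts(np.arange(50), order=4, scale=2)
```

On turbulence data, the relative frequencies of the 24 order-4 patterns from such counts would then be compared across scales and stability conditions.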
Forbidden patterns in financial time series
NASA Astrophysics Data System (ADS)
Zanin, Massimiliano
2008-03-01
The existence of forbidden patterns, i.e., certain ordinal sequences missing from a given time series, is a recently proposed instrument of potential application in the study of time series. Forbidden patterns are related to the permutation entropy, which shares the basic properties of classic chaos indicators, such as the Lyapunov exponent or the Kolmogorov entropy, thus allowing one to separate deterministic (usually chaotic) from random series; moreover, it requires fewer values of the series to be calculated, and it is suitable for use with small datasets. In this paper, the appearance of forbidden patterns is studied in different economic indicators such as stock indices (Dow Jones Industrial Average and Nasdaq Composite), NYSE stocks (IBM and Boeing), and others (ten-year bond interest rate), to find evidence of deterministic behavior in their evolution. Moreover, the rate of appearance of the forbidden patterns is calculated, and some considerations about the underlying dynamics are suggested.
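A minimal illustration of forbidden patterns, assuming the standard ordinal-pattern definition (this is not the paper's code): for the fully chaotic logistic map x_{n+1} = 4x_n(1 - x_n), f(x) > x whenever x < 3/4, so three strictly decreasing values in a row are impossible and the descending order-3 pattern never appears, whereas a random series of the same length would eventually exhibit all six patterns.

```python
import itertools
import numpy as np

def observed_patterns(x, order=3):
    """Set of ordinal patterns (argsort tuples) of a given order present in x."""
    seen = set()
    for i in range(len(x) - order + 1):
        seen.add(tuple(int(k) for k in np.argsort(x[i : i + order])))
    return seen

# Iterate the fully chaotic logistic map; its deterministic structure
# forbids the strictly decreasing length-3 pattern (2, 1, 0).
x = np.empty(3000)
x[0] = 0.3
for i in range(1, 3000):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

seen = observed_patterns(x, order=3)
missing = set(itertools.permutations(range(3))) - seen
```

Counting how slowly `missing` shrinks with series length is, in spirit, the "rate of appearance of forbidden patterns" the abstract applies to financial data.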
Non-parametric characterization of long-term rainfall time series
NASA Astrophysics Data System (ADS)
Tiwari, Harinarayan; Pandey, Brij Kishor
2018-03-01
The statistical study of rainfall time series is one of the approaches to efficient hydrological system design. Identifying and characterizing long-term rainfall time series could aid in improving hydrological systems forecasting. In the present study, eventual statistics were applied to the long-term (1851-2006) rainfall time series of seven meteorological regions of India. Linear trend analysis was carried out using the Mann-Kendall test on the observed rainfall series. The trend observed using this approach was ascertained using the innovative trend analysis method. Innovative trend analysis has been found to be a strong tool to detect the general trend of rainfall time series. The sequential Mann-Kendall test has also been carried out to examine nonlinear trends in the series. The partial sum of cumulative deviation test is also found to be suitable to detect nonlinear trends. Innovative trend analysis, the sequential Mann-Kendall test and the partial cumulative deviation test have the potential to detect general as well as nonlinear trends in rainfall time series. Annual rainfall analysis suggests that the maximum rise in mean rainfall is 11.53% for West Peninsular India, whereas the maximum fall in mean rainfall is 7.8% for the North Mountainous Indian region. The innovative trend analysis method is also capable of finding the number of change points present in the time series. Additionally, we performed the von Neumann ratio test and the cumulative deviation test to estimate the departure from homogeneity. Singular spectrum analysis has been applied in this study to evaluate the order of departure from homogeneity in the rainfall time series. The monsoon season (JS) of the North Mountainous India and West Peninsular India zones has a higher departure from homogeneity, and singular spectrum analysis shows results coherent with this finding.
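The Mann-Kendall test used above can be sketched in a few lines. This is the classical version without tie correction (a simplifying assumption; operational implementations add a tie term to the variance): the statistic S counts concordant minus discordant pairs, and Z is its continuity-corrected normal deviate.

```python
import math

def mann_kendall(x):
    """Classical Mann-Kendall trend test (no tie correction): returns (S, Z)."""
    n = len(x)
    # S = number of increasing pairs minus number of decreasing pairs.
    s = sum(
        (x[j] > x[i]) - (x[j] < x[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# A strictly increasing series of length 10 gives the maximal S = 45.
s_up, z_up = mann_kendall(list(range(10)))
s_flat, z_flat = mann_kendall([5, 5, 5, 5, 5])
```

|Z| > 1.96 would indicate a significant monotone trend at the 5% level; the sequential variant in the abstract applies this statistic progressively along the series.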
The application of complex network time series analysis in turbulent heated jets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Charakopoulos, A. K.; Karakasidis, T. E., E-mail: thkarak@uth.gr; Liakopoulos, A.
2014-06-15
In the present study, we applied the methodology of complex network-based time series analysis to experimental temperature time series from a vertical turbulent heated jet. More specifically, we approach the hydrodynamic problem of discriminating time series corresponding to various regions relative to the jet axis, i.e., distinguishing time series corresponding to regions close to the jet axis from time series originating in regions with a different dynamical regime, based on the constructed network properties. Applying the phase space transformation method (k nearest neighbors) and also the visibility algorithm, we transformed time series into networks and evaluated topological properties of the networks such as degree distribution, average path length, diameter, modularity, and clustering coefficient. The results show that the complex network approach allows distinguishing, identifying, and exploring in detail various dynamical regions of the jet flow, and associating them with the corresponding physical behavior. In addition, in order to reject the hypothesis that the studied networks originate from a stochastic process, we generated random networks and compared their statistical properties with those originating from the experimental data. As far as the efficiency of the two methods for network construction is concerned, we conclude that both methodologies lead to network properties that present almost the same qualitative behavior and allow us to reveal the underlying system dynamics.
Faes, Luca; Nollo, Giandomenico; Porta, Alberto
2012-03-01
The complexity of short-term cardiovascular control calls for the introduction of multivariate (MV) nonlinear time series analysis methods to assess directional interactions reflecting the underlying regulatory mechanisms. This study introduces a new approach for the detection of nonlinear Granger causality in MV time series, based on embedding the series by a sequential, non-uniform procedure, and on estimating the information flow from one series to another by means of the corrected conditional entropy. The approach is validated on short realizations of linear stochastic and nonlinear deterministic processes, and then evaluated on heart period, systolic arterial pressure and respiration variability series measured from healthy humans in the resting supine position and in the upright position after head-up tilt. Copyright © 2011 Elsevier Ltd. All rights reserved.
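The information-flow idea above can be illustrated with a much simpler plug-in estimate on symbolic series (not the paper's corrected conditional entropy with non-uniform embedding): transfer entropy T(src → dst) = H(d_t | d_{t-1}) − H(d_t | d_{t-1}, s_{t-1}), here with single-lag conditioning and binary symbols for clarity.

```python
import numpy as np
from collections import Counter

def entropy(symbols):
    """Plug-in Shannon entropy (bits) of a list of hashable symbols."""
    c = Counter(symbols)
    n = sum(c.values())
    p = np.array([v / n for v in c.values()])
    return float(-np.sum(p * np.log2(p)))

def transfer_entropy(src, dst):
    """Plug-in estimate of T(src -> dst) with one lag of conditioning."""
    d_t, d_p, s_p = dst[1:], dst[:-1], src[:-1]
    h_d_given_dp = entropy(list(zip(d_t, d_p))) - entropy(list(d_p))
    h_d_given_dp_sp = entropy(list(zip(d_t, d_p, s_p))) - entropy(list(zip(d_p, s_p)))
    return h_d_given_dp - h_d_given_dp_sp

# Demo: y copies x with one step of delay, so information flows x -> y only.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=4000)
y = np.roll(x, 1)  # y_t = x_{t-1}
te_xy = transfer_entropy(x, y)
te_yx = transfer_entropy(y, x)
```

Since y is fully determined by the previous x, T(x → y) is close to 1 bit while T(y → x) stays near zero, reproducing the directional asymmetry such methods aim to detect.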
Time reversibility from visibility graphs of nonstationary processes
NASA Astrophysics Data System (ADS)
Lacasa, Lucas; Flanagan, Ryan
2015-08-01
Visibility algorithms are a family of methods to map time series into networks, with the aim of describing the structure of time series and their underlying dynamical properties in graph-theoretical terms. Here we explore some properties of both natural and horizontal visibility graphs associated with several nonstationary processes, and we pay particular attention to their capacity to assess time irreversibility. Nonstationary signals are (infinitely) irreversible by definition (independently of whether the process is Markovian or producing entropy at a positive rate), and thus the link between entropy production and time series irreversibility has only been explored in nonequilibrium stationary states. Here we show that the visibility formalism naturally induces a new working definition of time irreversibility, which allows us to quantify several degrees of irreversibility for stationary and nonstationary series, yielding finite values that can be used to efficiently assess the presence of memory and off-equilibrium dynamics in nonstationary processes without the need to differentiate or detrend them. We provide rigorous results complemented by extensive numerical simulations on several classes of stochastic processes.
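The natural visibility graph at the heart of this approach is easy to sketch; `natural_visibility_edges` is an illustrative O(n²) implementation of the standard convexity criterion, not the authors' code.

```python
def natural_visibility_edges(y):
    """Edges (a, b) of the natural visibility graph of series y.

    Points a < b see each other iff every intermediate point lies strictly
    below the straight line joining (a, y[a]) and (b, y[b]).
    """
    n = len(y)
    edges = []
    for a in range(n - 1):
        for b in range(a + 1, n):
            if all(
                y[c] < y[a] + (y[b] - y[a]) * (c - a) / (b - a)
                for c in range(a + 1, b)
            ):
                edges.append((a, b))
    return edges

# In [1, 2, 3] the middle point blocks the long link; in [3, 1, 2] it does not.
e_mono = natural_visibility_edges([1.0, 2.0, 3.0])
e_valley = natural_visibility_edges([3.0, 1.0, 2.0])
```

Degree distributions of such graphs, computed separately for forward and time-reversed series, then yield the irreversibility measures discussed in the abstract.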
NASA Astrophysics Data System (ADS)
Cristescu, Constantin P.; Stan, Cristina; Scarlat, Eugen I.; Minea, Teofil; Cristescu, Cristina M.
2012-04-01
We present a novel method for the parameter-oriented analysis of mutual correlation between independent time series or between equivalent structures such as ordered data sets. The proposed method is based on the sliding window technique, defines a new type of correlation measure and can be applied to time series from all domains of science and technology, experimental or simulated. A specific parameter that can characterize the time series is computed for each window and a cross-correlation analysis is carried out on the set of values obtained for the time series under investigation. We apply this method to the study of some currency daily exchange rates from the point of view of the Hurst exponent and the intermittency parameter. Interesting correlation relationships are revealed and a tentative crisis prediction is presented.
An Illustration of Generalised Arma (garma) Time Series Modeling of Forest Area in Malaysia
NASA Astrophysics Data System (ADS)
Pillai, Thulasyammal Ramiah; Shitan, Mahendran
Forestry is the art and science of managing forests, tree plantations, and related natural resources. The main goal of forestry is to create and implement systems that allow forests to continue a sustainable provision of environmental supplies and services. Forest area is land under natural or planted stands of trees, whether productive or not. The forest area of Malaysia has been observed over the years, and it can be modeled using time series models. A new class of GARMA models has been introduced in the time series literature to reveal some hidden features in time series data. For these models to be used widely in practice, we illustrate the fitting of the GARMA (1, 1; 1, δ) model to the annual forest area data of Malaysia observed from 1987 to 2008. The estimation of the model was done using the Hannan-Rissanen algorithm, Whittle's estimation and maximum likelihood estimation.
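The Hannan-Rissanen algorithm mentioned above can be sketched for the plain ARMA(1,1) special case (the GARMA model itself adds a long-memory term not reproduced here; function name, lag length and seeds are illustrative assumptions).

```python
import numpy as np

def hannan_rissanen_arma11(y, p_long=20):
    """Two-stage Hannan-Rissanen estimates (phi, theta) for zero-mean ARMA(1,1).

    Stage 1: fit a long AR(p_long) by least squares; its residuals serve as
    proxies for the unobserved innovations.
    Stage 2: regress y_t on y_{t-1} and the lagged residual e_{t-1}.
    """
    y = np.asarray(y, float)
    n = len(y)
    X1 = np.column_stack([y[p_long - k : n - k] for k in range(1, p_long + 1)])
    a, *_ = np.linalg.lstsq(X1, y[p_long:], rcond=None)
    e = np.zeros(n)
    e[p_long:] = y[p_long:] - X1 @ a
    X2 = np.column_stack([y[p_long : n - 1], e[p_long : n - 1]])
    b, *_ = np.linalg.lstsq(X2, y[p_long + 1 :], rcond=None)
    return float(b[0]), float(b[1])

# Demo on a simulated ARMA(1,1) with phi = 0.6, theta = 0.3.
rng = np.random.default_rng(1)
eps = rng.standard_normal(5200)
y = np.zeros(5200)
for t in range(1, 5200):
    y[t] = 0.6 * y[t - 1] + eps[t] + 0.3 * eps[t - 1]
phi_hat, theta_hat = hannan_rissanen_arma11(y[200:])
```

In practice these two-stage estimates are often used as starting values for the maximum likelihood estimation the abstract also mentions.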
Approximating high-dimensional dynamics by barycentric coordinates with linear programming.
Hirata, Yoshito; Shiro, Masanori; Takahashi, Nozomu; Aihara, Kazuyuki; Suzuki, Hideyuki; Mas, Paloma
2015-01-01
The increasing development of novel methods and techniques facilitates the measurement of high-dimensional time series but challenges our ability for accurate modeling and prediction. The use of a general mathematical model requires the inclusion of many parameters, which are difficult to fit given the relatively short high-dimensional time series observed. Here, we propose a novel method to accurately model a high-dimensional time series. Our method extends the barycentric coordinates to high-dimensional phase space by employing linear programming and allowing explicitly for approximation errors. The extension helps to produce free-running time-series predictions that preserve typical topological, dynamical, and/or geometric characteristics of the underlying attractors more accurately than the widely used radial basis function model. The method can be broadly applied, from helping to improve weather forecasting, to creating electronic instruments that sound more natural, and to comprehensively understanding complex biological data.
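The core step, barycentric coordinates via linear programming, can be sketched as the following hypothetical LP (the authors' exact formulation may differ): find non-negative weights summing to one over neighboring phase-space points, minimising the L1 approximation error, with the error split into positive parts e⁺ and e⁻.

```python
import numpy as np
from scipy.optimize import linprog

def barycentric_weights(points, target):
    """Non-negative weights summing to one that reproduce `target` from the
    rows of `points`, minimising the L1 approximation error via an LP.

    Variables: [w_1..w_m, e+_1..e+_d, e-_1..e-_d], subject to
    points.T @ w + e+ - e- = target, sum(w) = 1, all variables >= 0.
    """
    points = np.asarray(points, float)
    m, d = points.shape
    c = np.concatenate([np.zeros(m), np.ones(2 * d)])  # minimise total error
    A_eq = np.zeros((d + 1, m + 2 * d))
    A_eq[:d, :m] = points.T
    A_eq[:d, m : m + d] = np.eye(d)
    A_eq[:d, m + d :] = -np.eye(d)
    A_eq[d, :m] = 1.0  # convexity: weights sum to one
    b_eq = np.concatenate([np.asarray(target, float), [1.0]])
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * (m + 2 * d))
    return res.x[:m], res.fun  # weights and total L1 error

# A point inside a triangle is reproduced exactly (zero error).
tri = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
w, err = barycentric_weights(tri, [0.25, 0.25])
```

For prediction, the weights fitted on current neighbors would be applied to the neighbors' successors, giving the free-running forecasts described in the abstract.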
NASA Astrophysics Data System (ADS)
Yuan, Y.; Meng, Y.; Chen, Y. X.; Jiang, C.; Yue, A. Z.
2018-04-01
In this study, we proposed a method to map urban encroachment onto farmland using satellite image time series (SITS) based on the hierarchical hidden Markov model (HHMM). In this method, the farmland change process is decomposed into three hierarchical levels, i.e., the land cover level, the vegetation phenology level, and the SITS level. A three-level HHMM is then constructed to model the multi-level semantic structure of the farmland change process. Once the HHMM is established, a change from farmland to built-up land can be detected by inferring the underlying state sequence that is most likely to have generated the input time series. The performance of the method is evaluated on MODIS time series in Beijing. Results on both simulated and real datasets demonstrate that our method improves change detection accuracy compared with the HMM-based method.
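The inference step, recovering the most likely hidden state sequence, is classically done with the Viterbi algorithm; the flat two-state toy model below (states and probabilities invented for illustration, far simpler than the paper's three-level HHMM) shows the idea of decoding a farmland-to-built-up transition from noisy observations.

```python
import numpy as np

def viterbi(log_pi, log_A, log_B, obs):
    """Most likely hidden-state path for a discrete HMM (log-space Viterbi)."""
    n_states = len(log_pi)
    T = len(obs)
    delta = np.zeros((T, n_states))
    back = np.zeros((T, n_states), dtype=int)
    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A  # scores[i, j]: i -> j
        back[t] = np.argmax(scores, axis=0)
        delta[t] = scores[back[t], np.arange(n_states)] + log_B[:, obs[t]]
    path = np.zeros(T, dtype=int)
    path[-1] = int(np.argmax(delta[-1]))
    for t in range(T - 2, -1, -1):
        path[t] = back[t + 1, path[t + 1]]
    return path

# Toy model: state 0 = farmland, 1 = built-up (nearly absorbing);
# observation 0 = vegetated signal, 1 = impervious signal.
log_pi = np.log([0.9, 0.1])
log_A = np.log([[0.95, 0.05], [1e-12, 1.0 - 1e-12]])
log_B = np.log([[0.9, 0.1], [0.1, 0.9]])
obs = [0, 0, 0, 1, 1, 1]
path = viterbi(log_pi, log_A, log_B, obs)
```

Here the decoded path places the farmland-to-built-up switch exactly where the observations change character.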
Empirical intrinsic geometry for nonlinear modeling and time series filtering.
Talmon, Ronen; Coifman, Ronald R
2013-07-30
In this paper, we present a method for time series analysis based on empirical intrinsic geometry (EIG). EIG enables one to reveal the low-dimensional parametric manifold as well as to infer the underlying dynamics of high-dimensional time series. By incorporating concepts of information geometry, this method extends existing geometric analysis tools to support stochastic settings and parametrizes the geometry of empirical distributions. However, the statistical models are not required as priors; hence, EIG may be applied to a wide range of real signals without existing definitive models. We show that the inferred model is noise-resilient and invariant under different observation and instrumental modalities. In addition, we show that it can be extended efficiently to newly acquired measurements in a sequential manner. These two advantages enable us to revisit the Bayesian approach and incorporate empirical dynamics and intrinsic geometry into a nonlinear filtering framework. We show applications to nonlinear and non-Gaussian tracking problems as well as to acoustic signal localization.
Multivariate stochastic analysis for Monthly hydrological time series at Cuyahoga River Basin
NASA Astrophysics Data System (ADS)
zhang, L.
2011-12-01
Copulas have become a powerful statistical and stochastic methodology for multivariate analysis in environmental and water resources engineering. In recent years, the popular one-parameter Archimedean copulas, e.g. the Gumbel-Hougaard copula, Cook-Johnson copula and Frank copula, and the meta-elliptical copulas, e.g. the Gaussian copula and Student-t copula, have been applied in multivariate hydrological analyses, e.g. multivariate rainfall (rainfall intensity, duration and depth), flood (peak discharge, duration and volume), and drought analyses (drought length, mean and minimum SPI values, and drought mean areal extent). Copulas have also been applied in flood frequency analysis at the confluences of river systems by taking into account the dependence among upstream gauge stations rather than by using the hydrological routing technique. In most of the studies above, the annual time series have been treated as stationary signals whose values are assumed to be independent identically distributed (i.i.d.) random variables. In reality, however, hydrological time series, especially daily and monthly hydrological time series, cannot be considered i.i.d. random variables due to the periodicity present in the data structure. The stationarity assumption is also in question due to climate change and land use and land cover (LULC) change in the past years. To this end, it is necessary to re-evaluate the classic approach to the study of hydrological time series by relaxing the stationarity assumption through a nonstationary approach. Likewise, in the study of the dependence structure of hydrological time series, the assumption of the same type of univariate distribution needs to be relaxed by adopting copula theory. In this paper, the univariate monthly hydrological time series will be studied through a nonstationary time series analysis approach.
The dependence structure of the multivariate monthly hydrological time series will be studied through copula theory. For parameter estimation, maximum likelihood estimation (MLE) will be applied. To illustrate the method, the univariate time series model and the dependence structure will be determined and tested using the monthly discharge time series of the Cuyahoga River Basin.
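For the one-parameter Archimedean copulas named above, a simpler moment-style alternative to the MLE in the abstract is inversion of Kendall's tau; for the Gumbel-Hougaard copula the relation is tau = 1 − 1/θ. The sketch below (illustrative names, no tie handling) estimates θ from that relation.

```python
def kendall_tau(u, v):
    """Sample Kendall's tau by direct pair counting (no tie handling)."""
    n = len(u)
    s = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            s += 1 if (u[i] - u[j]) * (v[i] - v[j]) > 0 else -1
    return 2.0 * s / (n * (n - 1))

def gumbel_theta_from_tau(tau):
    """Invert the Gumbel-Hougaard relation tau = 1 - 1/theta."""
    return 1.0 / (1.0 - tau)

# Toy sample with 5 concordant and 1 discordant pair: tau = 2/3, theta = 3.
u = [1, 2, 3, 4]
v = [1, 3, 2, 4]
tau = kendall_tau(u, v)
theta = gumbel_theta_from_tau(tau)
```

Such a moment estimate is also a common starting value for the copula MLE.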
Yu, Hwa-Lung; Lin, Yuan-Chien; Kuo, Yi-Ming
2015-09-01
Understanding the temporal dynamics and interactions of particulate matter (PM) concentration and composition is important for air quality control. This paper applied a dynamic factor analysis (DFA) method to reveal the underlying mechanisms of nonstationary variations in twelve ambient concentrations of aerosols and gaseous pollutants, and their associations with meteorological factors. This approach can account for the uncertainties and temporal dependences of time series data. The common trends of the yearlong and three selected diurnal variations were obtained to characterize the dominant processes occurring in general and in specific scenarios in Taipei during 2009 (i.e., during Asian dust storm (ADS) events, rainfall, and normal conditions). The results revealed two distinct yearlong NOx transformation processes, and demonstrated that traffic emissions and photochemical reactions both critically influence diurnal variation, depending upon meteorological conditions. During an ADS event, transboundary transport and distinct weather conditions both influenced the temporal pattern of identified common trends. This study shows the DFA method can effectively extract meaningful latent processes from time series data and provide insights into the dominant associations and interactions in complex air pollution processes. Copyright © 2014 Elsevier Ltd. All rights reserved.
Hatch, Christine E; Fisher, Andrew T.; Revenaugh, Justin S.; Constantz, Jim; Ruehl, Chris
2006-01-01
We present a method for determining streambed seepage rates using time series thermal data. The new method is based on quantifying changes in phase and amplitude of temperature variations between pairs of subsurface sensors. For a reasonable range of streambed thermal properties and sensor spacings, the time series method should allow reliable estimation of seepage rates over a range of at least ±10 m d⁻¹ (±1.2 × 10⁻⁴ m s⁻¹), with amplitude variations being most sensitive at low flow rates and phase variations retaining sensitivity out to much higher rates. Compared to forward modeling, the new method requires less observational data and less setup and data handling, and is faster, particularly when interpreting many long data sets. The time series method is insensitive to streambed scour and sedimentation, which allows for application under a wide range of flow conditions and allows time series estimation of variable streambed hydraulic conductivity. This new approach should facilitate wider use of thermal methods and improve understanding of the complex spatial and temporal dynamics of surface water–groundwater interactions.
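The amplitude-ratio and phase-lag extraction between sensor pairs can be sketched by reading off the DFT bin at the known diurnal forcing period (illustrative names; the published method then converts these quantities to seepage rates via the heat transport equations, which are not reproduced here).

```python
import numpy as np

def amplitude_ratio_phase_lag(shallow, deep, period_samples):
    """Amplitude ratio and phase lag of the periodic component between two
    sensor records, read off the DFT bin at the known forcing period.

    Assumes len(shallow) is an integer multiple of period_samples.
    """
    n = len(shallow)
    k = n // period_samples  # index of the forcing-frequency bin
    Xs = np.fft.rfft(shallow)[k]
    Xd = np.fft.rfft(deep)[k]
    ratio = np.abs(Xd) / np.abs(Xs)
    lag = -np.angle(Xd * np.conj(Xs))  # positive lag: deep lags shallow
    return ratio, lag

# Demo: the deep record is damped to half amplitude and delayed by 0.7 rad.
P = 24
t = np.arange(240)
shallow = np.cos(2 * np.pi * t / P)
deep = 0.5 * np.cos(2 * np.pi * t / P - 0.7)
ratio, lag = amplitude_ratio_phase_lag(shallow, deep, P)
```

As the abstract notes, the amplitude ratio carries most information at low seepage rates while the phase lag stays sensitive at higher rates.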
Does preprocessing change nonlinear measures of heart rate variability?
Gomes, Murilo E D; Guimarães, Homero N; Ribeiro, Antônio L P; Aguirre, Luis A
2002-11-01
This work investigated whether methods used to produce a uniformly sampled heart rate variability (HRV) time series significantly change the deterministic signature underlying the dynamics of such signals and some nonlinear measures of HRV. Two methods of preprocessing were used: the convolution of inverse interval function values with a rectangular window, and cubic polynomial interpolation. The HRV time series were obtained from 33 Wistar rats submitted to autonomic blockade protocols and from 17 healthy adults. The analysis of determinism was carried out by the method of surrogate data sets and nonlinear autoregressive moving average modelling and prediction. The scaling exponents alpha, alpha(1) and alpha(2) derived from detrended fluctuation analysis were calculated from the raw HRV time series and the respective preprocessed signals. It was shown that the technique of cubic interpolation of HRV time series did not significantly change any nonlinear characteristic studied in this work, while the method of convolution only affected the alpha(1) index. The results suggest that preprocessed time series may be used to study HRV in the field of nonlinear dynamics.
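The cubic-interpolation preprocessing examined above can be sketched as follows; `resample_rr` and the 4 Hz default are illustrative choices (uniform resampling rates and spline variants differ across HRV studies), using SciPy's cubic spline rather than whatever fitting routine the authors used.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def resample_rr(beat_times, rr_intervals, fs=4.0):
    """Resample an irregularly sampled RR-interval series onto a uniform
    grid by cubic-spline interpolation (one common preprocessing choice)."""
    beat_times = np.asarray(beat_times, float)
    grid = np.arange(beat_times[0], beat_times[-1], 1.0 / fs)
    return grid, CubicSpline(beat_times, rr_intervals)(grid)

# Demo: a smooth signal sampled at irregular times is recovered accurately
# on the uniform grid.
base = np.linspace(0.0, 10.0, 80)
beat_times = base + 0.03 * np.sin(7.0 * base)  # irregular sampling times
grid, resampled = resample_rr(beat_times, np.sin(beat_times), fs=10.0)
err = float(np.max(np.abs(resampled - np.sin(grid))))
```

The study's question is then whether nonlinear indices (e.g. the DFA exponents) computed from `resampled` differ from those of the raw, unevenly sampled series.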
An approach to checking case-crossover analyses based on equivalence with time-series methods.
Lu, Yun; Symons, James Morel; Geyh, Alison S; Zeger, Scott L
2008-03-01
The case-crossover design has been increasingly applied to epidemiologic investigations of acute adverse health effects associated with ambient air pollution. The correspondence of the design to that of matched case-control studies makes it inferentially appealing for epidemiologic studies. Case-crossover analyses generally use conditional logistic regression modeling. This technique is equivalent to time-series log-linear regression models when there is a common exposure across individuals, as in air pollution studies. Previous methods for obtaining unbiased estimates for case-crossover analyses have assumed that time-varying risk factors are constant within reference windows. In this paper, we rely on the connection between case-crossover and time-series methods to illustrate model-checking procedures from log-linear model diagnostics for time-stratified case-crossover analyses. Additionally, we compare the relative performance of the time-stratified case-crossover approach to time-series methods under 3 simulated scenarios representing different temporal patterns of daily mortality associated with air pollution in Chicago, Illinois, during 1995 and 1996. Whenever a model, be it time-series or case-crossover, fails to account appropriately for fluctuations in time that confound the exposure, the effect estimate will be biased. It is therefore important to perform model-checking in time-stratified case-crossover analyses rather than assume the estimator is unbiased.
Flood return level analysis of Peaks over Threshold series under changing climate
NASA Astrophysics Data System (ADS)
Li, L.; Xiong, L.; Hu, T.; Xu, C. Y.; Guo, S.
2016-12-01
Obtaining insights into future flood estimation is of great significance for water planning and management. Traditional flood return level analysis with the stationarity assumption has been challenged by changing environments. A method that takes the nonstationarity context into consideration has been extended to derive flood return levels for Peaks over Threshold (POT) series. With application to POT series, a Poisson distribution is normally assumed to describe the arrival rate of exceedance events, but this distribution assumption has at times been reported as invalid. The Negative Binomial (NB) distribution is therefore proposed as an alternative to the Poisson assumption. Flood return levels were extrapolated in a nonstationarity context for the POT series of the Weihe basin, China under future climate scenarios. The results show that the flood return levels estimated under nonstationarity can differ depending on whether a Poisson or an NB distribution is assumed. The difference is found to be related to the threshold value of the POT series. The study indicates the importance of distribution selection in flood return level analysis under nonstationarity and provides a reference on the impact of climate change on flood estimation in the Weihe basin for the future.
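A quick diagnostic for the Poisson-versus-NB choice discussed above is the dispersion index of the exceedance counts: a Poisson arrival rate implies variance ≈ mean, while variance well above the mean favors the NB, whose extra parameter can be set by moments (var = μ + μ²/r). The sketch below, with invented example counts, is illustrative rather than the paper's procedure.

```python
import numpy as np

def dispersion_index(counts):
    """Variance-to-mean ratio of exceedance counts; a value well above 1
    argues against the Poisson arrival-rate assumption."""
    counts = np.asarray(counts, float)
    return counts.var(ddof=1) / counts.mean()

def nb_moment_fit(counts):
    """Method-of-moments Negative Binomial fit via var = mu + mu^2 / r.

    Requires overdispersion (var > mean); returns (r, mu).
    """
    counts = np.asarray(counts, float)
    mu, var = counts.mean(), counts.var(ddof=1)
    r = mu ** 2 / (var - mu)
    return r, mu

# Hypothetical annual exceedance counts, strongly overdispersed.
counts = [0, 0, 1, 9, 2, 0, 8, 1, 0, 7]
disp = dispersion_index(counts)
r_hat, mu_hat = nb_moment_fit(counts)
```

In a POT analysis, this check would be repeated for each candidate threshold, since the abstract notes the Poisson/NB difference depends on the threshold value.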
Liu, Shuyuan; Liu, Xiangnan; Liu, Meiling; Wu, Ling; Ding, Chao; Huang, Zhi
2017-05-30
An effective method to monitor heavy metal stress in crops is of critical importance to assure agricultural production and food security. Phenology, as a sensitive indicator of environmental change, can respond to heavy metal stress in crops and remote sensing is an effective method to detect plant phenological changes. This study focused on identifying the rice phenological differences under varied heavy metal stress using EVI (enhanced vegetation index) time-series, which was obtained from HJ-1A/B CCD images and fitted with asymmetric Gaussian model functions. We extracted three phenological periods using first derivative analysis: the tillering period, heading period, and maturation period; and constructed two kinds of metrics with phenological characteristics: date-intervals and time-integrated EVI, to explore the rice phenological differences under mild and severe stress levels. Results indicated that under severe stress the values of the metrics for presenting rice phenological differences in the experimental areas of heavy metal stress were smaller than the ones under mild stress. This finding represents a new method for monitoring heavy metal contamination through rice phenology.
NASA Astrophysics Data System (ADS)
Harte, Philip T.; Smith, Thor E.; Williams, John H.; Degnan, James R.
2012-05-01
In situ chemical oxidation (ISCO) treatment with sodium permanganate, an electrically conductive oxidant, provides a strong electrical signal for tracking of injectate transport using time series geophysical surveys including direct current (DC) resistivity and electromagnetic (EM) methods. Effective remediation is dependent upon placing the oxidant in close contact with the contaminated aquifer. Therefore, monitoring tools that provide enhanced tracking capability of the injectate offer considerable benefit to guide subsequent ISCO injections. Time-series geophysical surveys were performed at a superfund site in New Hampshire, USA over a one-year period to identify temporal changes in the bulk electrical conductivity of a tetrachloroethylene (PCE; also called tetrachloroethene) contaminated, glacially deposited aquifer due to the injection of sodium permanganate. The ISCO treatment involved a series of pulse injections of sodium permanganate from multiple injection wells within a contained area of the aquifer. After the initial injection, the permanganate was allowed to disperse under ambient groundwater velocities. Time series geophysical surveys identified the downward sinking and pooling of the sodium permanganate atop of the underlying till or bedrock surface caused by density-driven flow, and the limited horizontal spread of the sodium permanganate in the shallow parts of the aquifer during this injection period. When coupled with conventional monitoring, the surveys allowed for an assessment of ISCO treatment effectiveness in targeting the PCE plume and helped target areas for subsequent treatment.
On statistical inference in time series analysis of the evolution of road safety.
Commandeur, Jacques J F; Bijleveld, Frits D; Bergel-Hayat, Ruth; Antoniou, Constantinos; Yannis, George; Papadimitriou, Eleonora
2013-11-01
Data collected for building a road safety observatory usually include observations made sequentially through time. Examples of such data, called time series data, include the annual (or monthly) number of road traffic accidents, traffic fatalities or vehicle kilometers driven in a country, as well as the corresponding values of safety performance indicators (e.g., data on speeding, seat belt use, alcohol use, etc.). Some commonly used statistical techniques imply assumptions that are often violated by the special properties of time series data, namely serial dependency among disturbances associated with the observations. The first objective of this paper is to demonstrate the impact of such violations on the applicability of standard methods of statistical inference, which leads to under- or overestimation of standard errors and consequently may produce erroneous inferences. Moreover, having established the adverse consequences of ignoring serial dependency issues, the paper aims to describe rigorous statistical techniques used to overcome them. In particular, appropriate time series analysis techniques of varying complexity are employed to describe the development over time, relating the accident occurrences to explanatory factors such as exposure measures or safety performance indicators, and forecasting the development into the near future. Traditional regression models (whether linear, generalized linear or nonlinear) are shown not to naturally capture the inherent dependencies in time series data. Dedicated time series analysis techniques, such as the ARMA-type and DRAG approaches, are discussed next, followed by structural time series models, which are a subclass of state space methods. The paper concludes with general recommendations and practice guidelines for the use of time series models in road safety research. Copyright © 2012 Elsevier Ltd. All rights reserved.
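The serial-dependency problem described above can be illustrated with a small numpy sketch (not from the paper): simulate a linear trend with AR(1) disturbances, fit ordinary least squares, and compute the Durbin-Watson statistic on the residuals. Values well below 2 flag the positive autocorrelation that invalidates naive standard errors; the AR coefficient and seed are assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
t = np.arange(n, dtype=float)

# Simulate a trend plus AR(1) disturbances (rho = 0.8): the serial
# dependency that violates the independence assumption of OLS.
rho = 0.8
eps = np.zeros(n)
shocks = rng.normal(size=n)
for i in range(1, n):
    eps[i] = rho * eps[i - 1] + shocks[i]
y = 2.0 + 0.05 * t + eps

# OLS fit, then the Durbin-Watson statistic on the residuals:
# d = sum((e_t - e_{t-1})^2) / sum(e_t^2).  Values near 2 indicate no
# first-order autocorrelation; values well below 2 signal positive
# autocorrelation.
X = np.column_stack([np.ones(n), t])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
dw = np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)
print(f"Durbin-Watson d = {dw:.2f}")  # well below 2 for rho = 0.8
```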
NASA Astrophysics Data System (ADS)
Radhakrishnan, Srinivasan; Duvvuru, Arjun; Sultornsanee, Sivarit; Kamarthi, Sagar
2016-02-01
The cross correlation coefficient has been widely applied in financial time series analysis, specifically for understanding chaotic behaviour in terms of stock price and index movements during crisis periods. To better understand time series correlation dynamics, the cross correlation matrices are represented as networks, in which a node stands for an individual time series and a link indicates cross correlation between a pair of nodes. These networks are converted into simpler trees using different schemes. In this context, Minimum Spanning Trees (MST) are the most favoured tree structures because of their ability to preserve all the nodes and thereby retain essential information imbued in the network. Although cross correlations underlying MSTs capture essential information, they do not faithfully capture dynamic behaviour embedded in the time series data of financial systems, because cross correlation is a reliable measure only if the relationship between the time series is linear. To address this issue, this work investigates a new measure called phase synchronization (PS) for establishing correlations among time series that relate to one another linearly or nonlinearly. In this approach the strength of a link between a pair of time series (nodes) is determined by the level of phase synchronization between them. We compare the performance of phase synchronization based MST with cross correlation based MST along selected network measures across a temporal frame that includes economically good and crisis periods. We observe agreement in the directionality of the results across these two methods. They show similar trends, upward or downward, when comparing selected network measures.
Though both methods give similar trends, the phase synchronization based MST is a more reliable representation of the dynamic behaviour of financial systems than the cross correlation based MST because of the former's ability to quantify nonlinear relationships among time series, or relations among phase-shifted time series.
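The correlation-based MST construction referenced above can be sketched generically: convert a correlation matrix into Mantegna's distance d = sqrt(2(1 − ρ)) and build the tree with Prim's algorithm. The synthetic data and seed below are assumptions for the demo, not the authors' code.

```python
import numpy as np

def correlation_mst(series):
    """Build a minimum spanning tree over time series using the
    distance d_ij = sqrt(2 * (1 - rho_ij)), via Prim's algorithm.
    Returns a list of (i, j) edges."""
    corr = np.corrcoef(series)
    dist = np.sqrt(np.clip(2.0 * (1.0 - corr), 0.0, None))
    n = dist.shape[0]
    in_tree = [0]
    edges = []
    while len(in_tree) < n:
        best = None
        for i in in_tree:
            for j in range(n):
                if j in in_tree:
                    continue
                if best is None or dist[i, j] < best[0]:
                    best = (dist[i, j], i, j)
        _, i, j = best
        edges.append((i, j))
        in_tree.append(j)
    return edges

rng = np.random.default_rng(1)
data = rng.normal(size=(5, 300))   # 5 synthetic "assets"
data[1] += 0.9 * data[0]           # make series 0 and 1 correlated
edges = correlation_mst(data)
print(edges)                       # 4 edges spanning all 5 nodes
```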
Dynamical Analysis and Visualization of Tornadoes Time Series
Lopes, António M.; Tenreiro Machado, J. A.
2015-01-01
In this paper we analyze the behavior of tornado time series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long-range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series spanning 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns. PMID:25790281
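The power-law fit to Fourier amplitude spectra described in the abstract can be illustrated on synthetic data: shape white-noise phases with a known 1/f amplitude spectrum, then recover the exponent by a least-squares line fit in log-log coordinates. The signal construction and seed are assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4096
# Synthesize a signal whose amplitude spectrum follows f^(-alpha)
# (alpha = 1) by shaping random phases in the frequency domain.
alpha_true = 1.0
freqs = np.fft.rfftfreq(n, d=1.0)
shape = np.zeros_like(freqs)
shape[1:] = freqs[1:] ** (-alpha_true)
phases = np.exp(1j * rng.uniform(0, 2 * np.pi, size=freqs.size))
x = np.fft.irfft(shape * phases, n)

# Estimate the power-law exponent of the amplitude spectrum with a
# least-squares line in log-log coordinates (DC and Nyquist excluded).
amp = np.abs(np.fft.rfft(x))[1:-1]
f = freqs[1:-1]
slope, _ = np.polyfit(np.log(f), np.log(amp), 1)
alpha_hat = -slope
print(f"alpha ~ {alpha_hat:.2f}")  # ~ 1.0
```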
ERIC Educational Resources Information Center
Huitema, Bradley E.; McKean, Joseph W.
2007-01-01
Regression models used in the analysis of interrupted time-series designs assume statistically independent errors. Four methods of evaluating this assumption are the Durbin-Watson (D-W), Huitema-McKean (H-M), Box-Pierce (B-P), and Ljung-Box (L-B) tests. These tests were compared with respect to Type I error and power under a wide variety of error…
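Of the four tests named above, the Ljung-Box statistic is straightforward to sketch from its definition; the helper below is a generic numpy implementation, not tied to the study's code.

```python
import numpy as np

def ljung_box(x, h):
    """Ljung-Box portmanteau statistic
    Q = n (n + 2) * sum_{k=1..h} acf_k^2 / (n - k).
    Under the null of independent errors, Q is approximately
    chi-squared distributed with h degrees of freedom."""
    x = np.asarray(x, dtype=float)
    n = x.size
    xc = x - x.mean()
    denom = np.sum(xc ** 2)
    q = 0.0
    for k in range(1, h + 1):
        acf_k = np.sum(xc[k:] * xc[:-k]) / denom
        q += acf_k ** 2 / (n - k)
    return n * (n + 2) * q

rng = np.random.default_rng(3)
white = rng.normal(size=500)
q_white = ljung_box(white, h=10)
print(f"Q(white noise) = {q_white:.1f}")  # typically near the chi2_10 mean
```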
Symbolic Time-Series Analysis for Anomaly Detection in Mechanical Systems
2006-08-01
Amol Khatkhate, Asok Ray, Eric Keller, Shalabh Gupta, and Shin C. Chin. This paper examines the efficacy of a novel method for anomaly detection proposed by Ray [6], in which the underlying information on the dynamical behavior of complex systems is derived from symbolic time-series analysis.
Quantifying Selection with Pool-Seq Time Series Data.
Taus, Thomas; Futschik, Andreas; Schlötterer, Christian
2017-11-01
Allele frequency time series data constitute a powerful resource for unraveling mechanisms of adaptation, because the temporal dimension captures important information about evolutionary forces. In particular, Evolve and Resequence (E&R), the whole-genome sequencing of replicated experimentally evolving populations, is becoming increasingly popular. Based on computer simulations, several studies have proposed experimental parameters to optimize the identification of selection targets. No such recommendations are available for the underlying parameters, selection strength and dominance. Here, we introduce a highly accurate method to estimate selection parameters from replicated time series data, which is fast enough to be applied on a genome scale. Using this new method, we evaluate how experimental parameters can be optimized to obtain the most reliable estimates for selection parameters. We show that the effective population size (Ne) and the number of replicates have the largest impact. Because the number of time points and sequencing coverage had only a minor effect, we suggest that time series analysis is feasible without a major increase in sequencing costs. We anticipate that time series analysis will become routine in E&R studies. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
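The paper's likelihood machinery is more involved, but the core idea, reading selection strength off an allele-frequency time series, can be shown with a deterministic haploid toy model (an assumption for illustration, not the authors' estimator): under this model the logit of the frequency grows linearly with slope log(1 + s).

```python
import numpy as np

def simulate_freq(p0, s, gens):
    """Deterministic haploid selection: p' = p (1 + s) / (1 + s p)."""
    p = [p0]
    for _ in range(gens):
        pt = p[-1]
        p.append(pt * (1 + s) / (1 + s * pt))
    return np.array(p)

def estimate_s(freqs):
    """Under the haploid model, logit(p_t) increases linearly with
    slope log(1 + s), so a line fit on the logit scale recovers s."""
    t = np.arange(freqs.size)
    logit = np.log(freqs / (1 - freqs))
    slope, _ = np.polyfit(t, logit, 1)
    return np.exp(slope) - 1.0

traj = simulate_freq(p0=0.1, s=0.05, gens=40)
s_hat = estimate_s(traj)
print(f"estimated s = {s_hat:.3f}")  # ~0.050
```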
Empirical forecast of quiet time ionospheric Total Electron Content maps over Europe
NASA Astrophysics Data System (ADS)
Badeke, Ronny; Borries, Claudia; Hoque, Mainul M.; Minkwitz, David
2018-06-01
An accurate forecast of the atmospheric Total Electron Content (TEC) is helpful to investigate space weather influences on the ionosphere and technical applications like satellite-receiver radio links. The purpose of this work is to compare four empirical methods for a 24-h forecast of vertical TEC maps over Europe under geomagnetically quiet conditions. TEC map data are obtained from the Space Weather Application Center Ionosphere (SWACI) and the Universitat Politècnica de Catalunya (UPC). The time-series methods Standard Persistence Model (SPM), a 27-day median model (MediMod) and a Fourier Series Expansion are compared to maps for the entire year of 2015. As a representative of the climatological coefficient models, the forecast performance of the Global Neustrelitz TEC model (NTCM-GL) is also investigated. Time periods of magnetic storms, which are identified with the Dst index, are excluded from the validation. By calculating the TEC values with the most recent maps, the time-series methods perform slightly better than the coefficient model NTCM-GL. The benefit of NTCM-GL is its independence of observational TEC data. Amongst the time-series methods mentioned, MediMod delivers the best overall performance regarding accuracy and data gap handling. Quiet-time SWACI maps can be forecast accurately and in real time by the MediMod time-series approach.
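A MediMod-style median forecast can be sketched as below: forecast each time-of-day bin as the median of that bin over the previous 27 days. The 24-samples-per-day layout and the synthetic diurnal series are assumptions for the demo, not SWACI specifics.

```python
import numpy as np

def median_model_forecast(tec, days=27, samples_per_day=24):
    """Forecast the next day as the pointwise median over the previous
    `days` days, one value per time-of-day bin."""
    history = np.asarray(tec)[-days * samples_per_day:]
    history = history.reshape(days, samples_per_day)
    return np.median(history, axis=0)

rng = np.random.default_rng(4)
hours = np.arange(40 * 24)
# Synthetic TEC-like series: diurnal cycle plus noise (in TECU).
tec = 10 + 5 * np.sin(2 * np.pi * (hours % 24) / 24) \
      + rng.normal(0, 0.5, hours.size)
forecast = median_model_forecast(tec)
print(forecast.shape)  # one forecast value per hour of the day
```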
Xiong, Lihua; Jiang, Cong; Du, Tao
2014-01-01
Time-varying moments models based on Pearson Type III and normal distributions respectively are built under the generalized additive model in location, scale and shape (GAMLSS) framework to analyze the nonstationarity of the annual runoff series of the Weihe River, the largest tributary of the Yellow River. The detection of nonstationarities in hydrological time series (annual runoff, precipitation and temperature) from 1960 to 2009 is carried out using a GAMLSS model, and then the covariate analysis for the annual runoff series is implemented with GAMLSS. Finally, the attribution of each covariate to the nonstationarity of annual runoff is analyzed quantitatively. The results demonstrate that (1) obvious change-points exist in all three hydrological series, (2) precipitation, temperature and irrigated area are all significant covariates of the annual runoff series, and (3) temperature increase plays the main role in leading to the reduction of the annual runoff series in the study basin, followed by the decrease of precipitation and the increase of irrigated area.
Early-Time Solution of the Horizontal Unconfined Aquifer in the Buildup Phase
NASA Astrophysics Data System (ADS)
Gravanis, Elias; Akylas, Evangelos
2017-10-01
We derive the early-time solution of the Boussinesq equation for the horizontal unconfined aquifer in the buildup phase under constant recharge and zero inflow. The solution is expressed as a power series of a suitable similarity variable, which is constructed so as to satisfy the boundary conditions at both ends of the aquifer; that is, it is a polynomial approximation of the exact solution. The series turns out to be asymptotic, and it is regularized by resummation techniques that are used to define divergent series. The outflow rate in this regime is linear in time, and the (dimensionless) coefficient is calculated to eight significant figures. The local error of the series is quantified by its deviation from satisfying the self-similar Boussinesq equation at every point. The local error turns out to be everywhere positive; hence, so is the integrated error, which in turn quantifies the degree of convergence of the series to the exact solution.
Characterizing and estimating noise in InSAR and InSAR time series with MODIS
Barnhart, William D.; Lohman, Rowena B.
2013-01-01
InSAR time series analysis is increasingly used to image subcentimeter displacement rates of the ground surface. The precision of InSAR observations is often affected by several noise sources, including spatially correlated noise from the turbulent atmosphere. Under ideal scenarios, InSAR time series techniques can substantially mitigate these effects; however, in practice the temporal distribution of InSAR acquisitions over much of the world exhibits seasonal biases, long temporal gaps, and insufficient acquisitions to confidently obtain the precision desired for tectonic research. Here, we introduce a technique for constraining the magnitude of errors expected from atmospheric phase delays on the ground displacement rates inferred from an InSAR time series, using independent observations of precipitable water vapor from MODIS. We implement a Monte Carlo error estimation technique based on multiple (100+) MODIS-based time series that sample date ranges close to the acquisition times of the available SAR imagery. This stochastic approach allows evaluation of the significance of signals present in the final time series product, in particular their correlation with topography and seasonality. We find that topographically correlated noise in individual interferograms is not spatially stationary, even over short spatial scales (<10 km). Overall, MODIS-inferred displacements and velocities exhibit errors of similar magnitude to the variability within an InSAR time series. We examine the MODIS-based confidence bounds in regions with a range of inferred displacement rates, and find we are capable of resolving velocities as low as 1.5 mm/yr, with uncertainties increasing to ∼6 mm/yr in regions with higher topographic relief.
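The Monte Carlo flavor of this error analysis can be sketched generically: fit a displacement rate to an irregularly sampled series, then refit under many synthetic noise realizations (standing in for the MODIS-derived delay time series) and use the spread of recovered rates as an uncertainty bound. All numbers below are assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(5)
t = np.sort(rng.uniform(0, 5, size=25))            # acquisition times (years)
true_rate = 3.0                                    # mm/yr
disp = true_rate * t + rng.normal(0, 2.0, t.size)  # noisy displacements (mm)

def fit_rate(times, values):
    """Displacement rate from a straight-line least-squares fit."""
    slope, _ = np.polyfit(times, values, 1)
    return slope

rate = fit_rate(t, disp)

# Monte Carlo: refit after adding independent atmospheric-like noise
# realizations; the spread of recovered rates bounds the uncertainty.
n_trials = 200
rates = np.array([
    fit_rate(t, disp + rng.normal(0, 2.0, t.size)) for _ in range(n_trials)
])
print(f"rate = {rate:.1f} mm/yr, MC spread = {rates.std():.2f} mm/yr")
```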
Singh, Amritpal; Saini, Barjinder Singh; Singh, Dilbag
2016-06-01
Multiscale approximate entropy (MAE) is used to quantify the complexity of a time series as a function of time scale τ. The approximate entropy (ApEn) tolerance threshold 'r' is selected by one of three approaches: (1) arbitrary selection in the recommended range (0.1-0.25 times the standard deviation of the time series); (2) finding the maximum ApEn (ApEnmax), i.e., the point where self-matches start to prevail over other matches, and choosing the corresponding 'r' (rmax) as the threshold; or (3) computing rchon by empirically finding the relation between rmax, the SD1/SD2 ratio and N using curve fitting, where SD1 and SD2 are the short-term and long-term variability of a time series, respectively. None of these methods is a gold standard for the selection of 'r'. In our previous study [1], an adaptive procedure for the selection of 'r' was proposed for approximate entropy (ApEn). In this paper, this is extended to multiple time scales using MAEbin and multiscale cross-MAEbin (XMAEbin). We applied this to simulations, i.e., 50 realizations (n = 50) of random number series, fractional Brownian motion (fBm) and MIX(P) [1] series of data length N = 300, and to short-term recordings of HRV and SBPV performed under postural stress from supine to standing. MAEbin and XMAEbin analysis was performed on laboratory-recorded data of 50 healthy young subjects experiencing postural stress from supine to upright. The study showed that (i) ApEnbin of HRV is higher than that of SBPV in the supine position but lower in the upright position; (ii) ApEnbin of HRV decreases from supine (1.7324 ± 0.112, mean ± SD) to upright (1.4916 ± 0.108) due to vagal inhibition; (iii) ApEnbin of SBPV increases from supine (1.5535 ± 0.098) to upright (1.6241 ± 0.101) due to sympathetic activation; (iv) individual and cross complexities of RRi and systolic blood pressure (SBP) series depend on the time scale under consideration; (v) XMAEbin calculated using ApEnmax is correlated with cross-MAE calculated using ApEn (0.1-0.26) in steps of 0.02 at each time scale in the supine and upright positions, with ApEn0.26 showing the highest correlation at most scales; and (vi) the choice of 'r' is critical in interpreting interactions between RRi and SBP and in ascertaining the true complexity of the individual RRi and SBP series.
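A minimal ApEn implementation following Pincus's definition (self-matches included) shows how the tolerance 'r' enters; the 0.2 × SD default below is the conventional mid-range choice from the range quoted above, and the test signals are assumptions for the demo.

```python
import numpy as np

def approx_entropy(x, m=2, r=None):
    """Approximate entropy ApEn(m, r) = phi(m) - phi(m+1), where
    phi(m) is the mean log fraction of length-m template matches
    within Chebyshev tolerance r (default 0.2 * SD)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    n = x.size

    def phi(m):
        templ = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between all template pairs; self-matches
        # are included, as in Pincus's original definition.
        d = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=2)
        c = np.mean(d <= r, axis=1)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(6)
print(approx_entropy(np.ones(100)))          # 0.0 for a constant series
print(approx_entropy(rng.normal(size=100)))  # larger for white noise
```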
Future mission studies: Forecasting solar flux directly from its chaotic time series
NASA Technical Reports Server (NTRS)
Ashrafi, S.
1991-01-01
The mathematical structure of the programs written to construct a nonlinear predictive model that forecasts solar flux directly from its time series, without reference to any underlying solar physics, is presented. The method and programs are written so that the same technique can be applied to forecast other chaotic time series, such as geomagnetic data, attitude and orbit data, and even financial indexes and stock market data. Perhaps the most important application of this technique to flight dynamics is to model Goddard Trajectory Determination System (GTDS) output of residuals between the observed position of a spacecraft and its calculated position with no drag (drag flag = off). This would result in a new model of drag working directly from observed data.
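Forecasting a chaotic series "directly from its time series" is commonly done with delay embedding and nearest-neighbor analogs; the sketch below is a generic version of that idea (embedding dimension, lag, and the sine test signal are assumptions for the demo), not the GTDS code described.

```python
import numpy as np

def embed_forecast(x, dim=3, lag=1):
    """One-step forecast by analogy: embed the series in a delay space,
    find the historical state nearest to the current one, and predict
    that state's successor (a minimal model-free nonlinear predictor)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    span = (dim - 1) * lag
    # Historical delay vectors whose successors are known.
    states = np.array([x[i:i + span + 1:lag] for i in range(n - span - 1)])
    current = x[n - 1 - span::lag]
    dists = np.linalg.norm(states - current, axis=1)
    nearest = np.argmin(dists)
    return x[nearest + span + 1]

t = np.linspace(0, 20 * np.pi, 2000)
series = np.sin(t)
pred = embed_forecast(series[:-1])  # forecast the held-out last point
print(f"pred = {pred:.3f}, actual = {series[-1]:.3f}")
```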
Multiscale entropy-based methods for heart rate variability complexity analysis
NASA Astrophysics Data System (ADS)
Silva, Luiz Eduardo Virgilio; Cabella, Brenno Caetano Troca; Neves, Ubiraci Pereira da Costa; Murta Junior, Luiz Otavio
2015-03-01
Physiologic complexity is an important concept for characterizing time series from biological systems, which, associated with multiscale analysis, can contribute to the comprehension of many complex phenomena. Although multiscale entropy has been applied to physiological time series, it measures irregularity as a function of scale. In this study we propose and evaluate a set of three complexity metrics as functions of time scale. The complexity metrics are derived from nonadditive entropy supported by generation of surrogate data, i.e., SDiffqmax, qmax and qzero. In order to assess the accuracy of the proposed complexity metrics, receiver operating characteristic (ROC) curves were built and areas under the curves were computed for three physiological situations. Heart rate variability (HRV) time series from normal sinus rhythm, atrial fibrillation, and congestive heart failure data sets were analyzed. Results show that the proposed complexity metrics are accurate and robust when compared to classic entropic irregularity metrics. Furthermore, SDiffqmax is the most accurate for lower scales, whereas qmax and qzero are the most accurate when higher time scales are considered. The multiscale complexity analysis described here showed potential to assess complex physiological time series and deserves further investigation in a wide context.
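Multiscale methods like these start by coarse-graining the series at scale τ (averaging non-overlapping windows); a minimal helper, generic rather than taken from the paper:

```python
import numpy as np

def coarse_grain(x, tau):
    """Multiscale coarse-graining: average consecutive, non-overlapping
    windows of length tau, giving the scale-tau series on which an
    entropy measure is then computed."""
    x = np.asarray(x, dtype=float)
    n = (x.size // tau) * tau          # drop the incomplete tail window
    return x[:n].reshape(-1, tau).mean(axis=1)

x = np.arange(10, dtype=float)
print(coarse_grain(x, 2))  # [0.5 2.5 4.5 6.5 8.5]
print(coarse_grain(x, 3))  # [1. 4. 7.]
```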
78 FR 66267 - Safety Zone; HITS Triathlon Series; Colorado River; Lake Havasu, AZ
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-05
... an NPRM would be impracticable. Logistical details did not present the Coast Guard enough time to... potential costs and benefits under section 6(a)(3) of Executive Order 12866 or under section 1 of Executive...
Discriminant Analysis of Time Series in the Presence of Within-Group Spectral Variability.
Krafty, Robert T
2016-07-01
Many studies record replicated time series epochs from different groups with the goal of using frequency domain properties to discriminate between the groups. In many applications, there exists variation in cyclical patterns from time series in the same group. Although a number of frequency domain methods for the discriminant analysis of time series have been explored, there is a dearth of models and methods that account for within-group spectral variability. This article proposes a model for groups of time series in which transfer functions are modeled as stochastic variables that can account for both between-group and within-group differences in spectra that are identified from individual replicates. An ensuing discriminant analysis of stochastic cepstra under this model is developed to obtain parsimonious measures of relative power that optimally separate groups in the presence of within-group spectral variability. The approach possesses favorable properties in classifying new observations and can be consistently estimated through a simple discriminant analysis of a finite number of estimated cepstral coefficients. Benefits of accounting for within-group spectral variability are empirically illustrated in a simulation study and through an analysis of gait variability.
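Cepstral coefficients of a replicate can be estimated as the inverse Fourier transform of the log periodogram; a minimal sketch, in which the 1e-12 floor and the coefficient count are arbitrary choices for the demo rather than the article's settings:

```python
import numpy as np

def cepstral_coefficients(x, n_coef=10):
    """Estimate cepstral coefficients as the inverse Fourier transform
    of the log periodogram; the low-order coefficients give the kind of
    parsimonious spectral summary used for discrimination."""
    x = np.asarray(x, dtype=float)
    spec = np.abs(np.fft.rfft(x - x.mean())) ** 2 / x.size
    spec = np.maximum(spec, 1e-12)     # guard the log at null bins
    cep = np.fft.irfft(np.log(spec))
    return cep[:n_coef]

rng = np.random.default_rng(7)
cep = cepstral_coefficients(rng.normal(size=512))
print(cep.shape)  # (10,)
```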
NASA Astrophysics Data System (ADS)
Zhao, Yongguang; Li, Chuanrong; Ma, Lingling; Tang, Lingli; Wang, Ning; Zhou, Chuncheng; Qian, Yonggang
2017-10-01
Time series of satellite reflectance data have been widely used to characterize environmental phenomena, describe trends in vegetation dynamics and study climate change. However, several sensors with wide spatial coverage and high observation frequency are designed to have a large field of view (FOV), which causes variations in the sun-target-sensor geometry in time-series reflectance data. In this study, on the basis of the semi-empirical kernel-driven BRDF model, a new semi-empirical model was proposed to normalize the sun-target-sensor geometry of remote sensing images. To evaluate the proposed model, bidirectional reflectances under different canopy growth conditions simulated by the Discrete Anisotropic Radiative Transfer (DART) model were used. The semi-empirical model was first fitted using all simulated bidirectional reflectances. Experimental results showed a good fit between the bidirectional reflectance estimated by the proposed model and the simulated values. Then, MODIS time-series reflectance data were normalized to a common sun-target-sensor geometry by the proposed model. The experimental results showed that the proposed model yielded good fits between the observed and estimated values. The noise-like fluctuations in the time-series reflectance data were also reduced after the sun-target-sensor normalization process.
Koopman Operator Framework for Time Series Modeling and Analysis
NASA Astrophysics Data System (ADS)
Surana, Amit
2018-01-01
We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator, which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations, or model forms, based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be identified directly from data using techniques for computing Koopman spectral properties, without requiring explicit knowledge of the generative model. We also introduce different notions of distance on the space of such model forms, which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with a distance, in conjunction with classical machine learning techniques, to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework on human activity classification, and on time series forecasting/anomaly detection in a power grid application.
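The simplest finite-dimensional instance of the Koopman viewpoint is dynamic mode decomposition: fit a linear operator to snapshot pairs by least squares and read off its spectrum. A toy sketch on a known linear system (the matrix and trajectory below are assumptions for illustration, not the paper's method):

```python
import numpy as np

# Known linear dynamics x_{k+1} = A x_k, used to generate snapshot data.
A_true = np.array([[0.9, 0.2],
                   [0.0, 0.8]])
x = np.zeros((2, 50))
x[:, 0] = [1.0, 1.0]
for k in range(49):
    x[:, k + 1] = A_true @ x[:, k]

# DMD-style identification: A_hat = Y X^+ from snapshot pairs (X, Y),
# then the eigenvalues of A_hat approximate the Koopman spectrum
# restricted to these observables.
X, Y = x[:, :-1], x[:, 1:]
A_hat = Y @ np.linalg.pinv(X)
eigvals = np.sort(np.linalg.eigvals(A_hat).real)
print(eigvals)  # ~ [0.8, 0.9]
```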
Time-Varying Transition Probability Matrix Estimation and Its Application to Brand Share Analysis.
Chiba, Tomoaki; Hino, Hideitsu; Akaho, Shotaro; Murata, Noboru
2017-01-01
In a product market or stock market, different products or stocks compete for the same consumers or purchasers. We propose a method to estimate the time-varying transition matrix of the product share using a multivariate time series of the product share. The method is based on the assumption that each of the observed time series of shares is a stationary distribution of the underlying Markov processes characterized by transition probability matrices. We estimate transition probability matrices for every observation under natural assumptions. We demonstrate, on a real-world dataset of the share of automobiles, that the proposed method can find intrinsic transition of shares. The resulting transition matrices reveal interesting phenomena, for example, the change in flows between TOYOTA group and GM group for the fiscal year where TOYOTA group's sales beat GM's sales, which is a reasonable scenario.
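The stationarity assumption in the abstract, that an observed share vector is a stationary distribution of a Markov transition matrix, can be checked numerically with a short sketch; the 3-brand switching matrix below is hypothetical.

```python
import numpy as np

def stationary_distribution(P, iters=1000):
    """Stationary distribution of a row-stochastic transition matrix P,
    found by power iteration on the share vector (pi = pi P)."""
    pi = np.full(P.shape[0], 1.0 / P.shape[0])
    for _ in range(iters):
        pi = pi @ P
    return pi

# Hypothetical 3-brand switching matrix (each row sums to 1).
P = np.array([[0.90, 0.07, 0.03],
              [0.05, 0.90, 0.05],
              [0.02, 0.08, 0.90]])
pi = stationary_distribution(P)
print(pi.round(3))  # long-run shares implied by the switching behaviour
```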
PMID:28076383
The study of Thai stock market across the 2008 financial crisis
NASA Astrophysics Data System (ADS)
Kanjamapornkul, K.; Pinčák, Richard; Bartoš, Erik
2016-11-01
The cohomology theory for financial markets allows us to deform the Kolmogorov space of time series data over a time period, with an explicit definition of eight market states in a grand unified theory. The anti-de Sitter space induced from a coupling behavior field among traders in the case of a financial market crash acts like a gravitational field in financial market spacetime. Under this hybrid mathematical superstructure, we redefine a behavior matrix by using Pauli matrices and a modified Wilson loop for time series data. We use it to detect the 2008 financial market crash via the degree of a cohomology group of the sphere over a tensor field in the correlation matrix over all possible dominated stocks underlying Thai SET50 Index Futures. The empirical analysis of the financial tensor network was performed with the help of empirical mode decomposition and intrinsic time scale decomposition of the correlation matrix, and the calculation of the closeness centrality of the planar graph.
Jung, Inuk; Jo, Kyuri; Kang, Hyejin; Ahn, Hongryul; Yu, Youngjae; Kim, Sun
2017-12-01
Identifying biologically meaningful gene expression patterns from time series gene expression data is important to understand the underlying biological mechanisms. To identify significantly perturbed gene sets between different phenotypes, analysis of time series transcriptome data requires consideration of time and sample dimensions. Thus, the analysis of such time series data seeks gene sets that exhibit similar or different expression patterns between two or more sample conditions, constituting three-dimensional data, i.e. gene-time-condition. Computational complexity for analyzing such data is very high compared to the already difficult NP-hard two-dimensional biclustering algorithms. Because of this challenge, traditional time series clustering algorithms are designed to capture co-expressed genes with similar expression patterns in two sample conditions. We present a triclustering algorithm, TimesVector, specifically designed for clustering three-dimensional time series data to capture distinctively similar or different gene expression patterns between two or more sample conditions. TimesVector identifies clusters with distinctive expression patterns in three steps: (i) dimension reduction and clustering of time-condition concatenated vectors, (ii) post-processing clusters for detecting similar and distinct expression patterns and (iii) rescuing genes from unclassified clusters. Using four sets of time series gene expression data, generated by both microarray and high throughput sequencing platforms, we demonstrated that TimesVector successfully detected biologically meaningful clusters of high quality. TimesVector improved the clustering quality compared to existing triclustering tools, and only TimesVector successfully detected clusters with differential expression patterns across conditions. The TimesVector software is available at http://biohealth.snu.ac.kr/software/TimesVector/. sunkim.bioinfo@snu.ac.kr.
Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
PSO-MISMO modeling strategy for multistep-ahead time series prediction.
Bao, Yukun; Xiong, Tao; Hu, Zhongyi
2014-05-01
Multistep-ahead time series prediction is one of the most challenging research topics in the field of time series modeling and prediction, and remains under continual research. Recently, the multiple-input several multiple-outputs (MISMO) modeling strategy has been proposed as a promising alternative for multistep-ahead time series prediction, exhibiting advantages over the two currently dominant strategies, the iterated and the direct strategies. Built on the established MISMO strategy, this paper proposes a particle swarm optimization (PSO)-based MISMO modeling strategy, which is capable of determining the number of sub-models in a self-adaptive mode with varying prediction horizons. Rather than deriving crisp divides with equal-sized prediction horizons as in the established MISMO, the proposed PSO-MISMO strategy, implemented with neural networks, employs a heuristic to create flexible divides with varying sizes of prediction horizons and to generate corresponding sub-models, providing considerable flexibility in model construction, which has been validated with simulated and real datasets.
RankExplorer: Visualization of Ranking Changes in Large Time Series Data.
Shi, Conglei; Cui, Weiwei; Liu, Shixia; Xu, Panpan; Chen, Wei; Qu, Huamin
2012-12-01
For many applications involving time series data, people are often interested in the changes of item values over time as well as their ranking changes. For example, people search many words via search engines like Google and Bing every day. Analysts are interested in both the absolute searching number for each word as well as their relative rankings. Both sets of statistics may change over time. For very large time series data with thousands of items, how to visually present ranking changes is an interesting challenge. In this paper, we propose RankExplorer, a novel visualization method based on ThemeRiver to reveal the ranking changes. Our method consists of four major components: 1) a segmentation method which partitions a large set of time series curves into a manageable number of ranking categories; 2) an extended ThemeRiver view with embedded color bars and changing glyphs to show the evolution of aggregation values related to each ranking category over time as well as the content changes in each ranking category; 3) a trend curve to show the degree of ranking changes over time; 4) rich user interactions to support interactive exploration of ranking changes. We have applied our method to some real time series data and the case studies demonstrate that our method can reveal the underlying patterns related to ranking changes which might otherwise be obscured in traditional visualizations.
Brüniche-Olsen, Anna; Austin, Jeremy J.; Jones, Menna E.; Holland, Barbara R.; Burridge, Christopher P.
2016-01-01
Detecting loci under selection is an important task in evolutionary biology. In conservation genetics detecting selection is key to investigating adaptation to the spread of infectious disease. Loci under selection can be detected on a spatial scale, accounting for differences in demographic history among populations, or on a temporal scale, tracing changes in allele frequencies over time. Here we use these two approaches to investigate selective responses to the spread of an infectious cancer—devil facial tumor disease (DFTD)—that since 1996 has ravaged the Tasmanian devil (Sarcophilus harrisii). Using time-series ‘restriction site associated DNA’ (RAD) markers from populations pre- and post-DFTD arrival, and DFTD-free populations, we infer loci under selection due to DFTD and investigate signatures of selection that are incongruent among methods, populations, and times. The lack of congruence among populations influenced by DFTD with respect to inferred loci under selection, and the direction of that selection, fails to implicate a consistent selective role for DFTD. Instead genetic drift is more likely driving the observed allele frequency changes over time. Our study illustrates the importance of applying methods with different performance optima (e.g. accounting for population structure and background selection) and assessing congruence of the results. PMID:26930198
Code of Federal Regulations, 2014 CFR
2014-10-01
...; public announcement by the agency under this subpart of the time, place, and subject matter of any... close a portion or portions of a meeting or series of meetings as provided in §§ 503.74 and 503.75; (3... a portion or portions of a meeting or series of meetings as provided in § 503.80; or (4...
Code of Federal Regulations, 2011 CFR
2011-10-01
...; public announcement by the agency under this subpart of the time, place, and subject matter of any... close a portion or portions of a meeting or series of meetings as provided in §§ 503.74 and 503.75; (3... a portion or portions of a meeting or series of meetings as provided in § 503.80; or (4...
29 CFR 102.75 - Suspension of proceedings on the charge where timely petition is filed.
Code of Federal Regulations, 2010 CFR
2010-07-01
... RULES AND REGULATIONS, SERIES 8 Procedure for Unfair Labor Practice and Representation Cases Under... within a reasonable time not to exceed 30 days from the commencement of picketing, the regional director...
NASA Technical Reports Server (NTRS)
1981-01-01
The application of statistical methods to recorded ozone measurements is described. A long-term depletion of ozone at the magnitudes predicted by the NAS would be harmful to most forms of life. Empirical prewhitening filters, whose derivation is independent of the underlying physical mechanisms, were analyzed. Statistical analysis serves as a check on the physical models. Time series filtering separates variations into systematic and random parts so that errors are uncorrelated, and significant phase-lag dependencies are identified. The use of time series modeling to enhance the capability of detecting trends is discussed.
Nonlinear techniques for forecasting solar activity directly from its time series
NASA Technical Reports Server (NTRS)
Ashrafi, S.; Roszman, L.; Cooley, J.
1992-01-01
Numerical techniques for constructing nonlinear predictive models to forecast solar flux directly from its time series are presented. This approach makes it possible to extract dynamical invariants of our system without reference to any underlying solar physics. We consider the dynamical evolution of solar activity in a reconstructed phase space that captures the (strange) attractor, give a procedure for constructing a predictor of future solar activity, and discuss extraction of dynamical invariants such as Lyapunov exponents and attractor dimension.
Nonlinear techniques for forecasting solar activity directly from its time series
NASA Technical Reports Server (NTRS)
Ashrafi, S.; Roszman, L.; Cooley, J.
1993-01-01
This paper presents numerical techniques for constructing nonlinear predictive models to forecast solar flux directly from its time series. This approach makes it possible to extract dynamical invariants of our system without reference to any underlying solar physics. We consider the dynamical evolution of solar activity in a reconstructed phase space that captures the (strange) attractor, give a procedure for constructing a predictor of future solar activity, and discuss extraction of dynamical invariants such as Lyapunov exponents and attractor dimension.
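The phase-space reconstruction and predictor construction referred to here follow the standard delay-embedding recipe, which can be sketched on a toy chaotic series. The logistic map, the embedding parameters, and the nearest-neighbour look-up below are illustrative assumptions standing in for whatever predictor the authors build:

```python
import numpy as np

# Logistic-map series as a stand-in for a chaotic activity record
x = np.empty(1200)
x[0] = 0.3
for n in range(1199):
    x[n + 1] = 3.9 * x[n] * (1 - x[n])

m, tau = 3, 1                        # embedding dimension and delay (illustrative)
train = x[:1000]

# Delay-coordinate reconstruction of the phase space
def embed(series, m, tau):
    n_vec = len(series) - (m - 1) * tau
    return np.column_stack([series[i * tau : i * tau + n_vec] for i in range(m)])

V = embed(train, m, tau)[:-1]        # drop the last vector: it has no "next" value
targets = train[(m - 1) * tau + 1:]  # the value that follows each delay vector

# Predictor: find the nearest neighbour in the reconstructed space and
# forecast its successor -- no reference to the underlying physics needed
def predict_next(history):
    q = history[-((m - 1) * tau + 1) :: tau]   # the current delay vector
    j = np.argmin(((V - q) ** 2).sum(axis=1))
    return targets[j]

pred = np.array([predict_next(x[:t]) for t in range(1000, 1050)])
err = np.abs(pred - x[1000:1050]).mean()
```

Because the dynamics are deterministic and the reconstructed attractor is well sampled, nearest-neighbour forecasts track the true trajectory closely one step ahead.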
Automated Bayesian model development for frequency detection in biological time series.
Granqvist, Emma; Oldroyd, Giles E D; Morris, Richard J
2011-06-24
A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlight the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and the requirement for uniformly sampled data.
Biological time series often deviate significantly from the requirements of optimality for Fourier transformation. In this paper we present an alternative approach based on Bayesian inference. We show the value of placing spectral analysis in the framework of Bayesian inference and demonstrate how model comparison can automate this procedure.
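The core of Bayesian Spectrum Analysis for a single stationary sinusoid can be sketched from Bretthorst's classic result: with unknown amplitude and noise level marginalized out, the posterior over frequency takes a Student-t form driven by the Schuster periodogram, P(f|D) ∝ [1 − 2C(f)/(N d̄²)]^((2−N)/2). The snippet below is a minimal sketch of that single-frequency case, not the paper's full model-comparison framework; the test signal and frequency grid are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 120
t = np.linspace(0, 30, N)                       # sampling need not be uniform
y = np.cos(2 * np.pi * 0.8 * t + 0.4) + 0.3 * rng.normal(size=N)
y = y - y.mean()

freqs = np.linspace(0.1, 2.0, 400)              # candidate frequencies (arbitrary grid)
d2 = np.mean(y ** 2)                            # mean-square of the data

# Schuster periodogram C(f) and the Student-t posterior over frequency
logpost = np.empty_like(freqs)
for i, f in enumerate(freqs):
    w = 2 * np.pi * f
    C = (np.sum(y * np.cos(w * t)) ** 2 + np.sum(y * np.sin(w * t)) ** 2) / N
    logpost[i] = ((2 - N) / 2) * np.log(1 - 2 * C / (N * d2))

post = np.exp(logpost - logpost.max())          # normalized so the peak is 1
f_hat = freqs[np.argmax(post)]                  # most probable frequency
```

Unlike a raw periodogram, the posterior sharpens dramatically with the number of data points, which is what makes the approach informative for short, noisy records.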
Automated Bayesian model development for frequency detection in biological time series
2011-01-01
Background A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. Results In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlight the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Conclusions Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series.
However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and the requirement for uniformly sampled data. Biological time series often deviate significantly from the requirements of optimality for Fourier transformation. In this paper we present an alternative approach based on Bayesian inference. We show the value of placing spectral analysis in the framework of Bayesian inference and demonstrate how model comparison can automate this procedure. PMID:21702910
Ghalyan, Najah F; Miller, David J; Ray, Asok
2018-06-12
Estimation of a generating partition is critical for symbolization of measurements from discrete-time dynamical systems, where a sequence of symbols from a (finite-cardinality) alphabet may uniquely specify the underlying time series. Such symbolization is useful for computing measures (e.g., Kolmogorov-Sinai entropy) to identify or characterize the (possibly unknown) dynamical system. It is also useful for time series classification and anomaly detection. The seminal work of Hirata, Judd, and Kilminster (2004) derives a novel objective function, akin to a clustering objective, that measures the discrepancy between a set of reconstruction values and the points from the time series. They cast estimation of a generating partition as minimization of their objective function. Unfortunately, their proposed algorithm is nonconvergent, with no guarantee of finding even locally optimal solutions with respect to their objective. The difficulty is a heuristic nearest-neighbor symbol-assignment step. Alternatively, we develop a novel, locally optimal algorithm for their objective. We apply iterative nearest-neighbor symbol assignments with guaranteed discrepancy descent, by which joint, locally optimal symbolization of the entire time series is achieved. While most previous approaches frame generating partition estimation as a state-space partitioning problem, we recognize that minimizing the Hirata et al. (2004) objective function does not induce an explicit partitioning of the state space, but rather of the space consisting of the entire time series (effectively, clustering in a (countably) infinite-dimensional space). Our approach also amounts to a novel type of sliding block lossy source coding. Improvement, with respect to several measures, is demonstrated over popular methods for symbolizing chaotic maps. We also apply our approach to time-series anomaly detection, considering both chaotic maps and a failure application in a polycrystalline alloy material.
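The flavour of the proposed fix, iterative nearest-neighbour symbol assignments with guaranteed descent of a clustering-style discrepancy, can be illustrated on a drastically simplified problem: scalar observations and Lloyd-style alternating updates. This is an assumption-laden sketch, not the authors' algorithm (which symbolizes the entire time series jointly), but it shows the monotone-descent property their method guarantees:

```python
import numpy as np

# Logistic-map time series, a common test case for symbolization
x = np.empty(500)
x[0] = 0.4
for n in range(499):
    x[n + 1] = 4 * x[n] * (1 - x[n])

k = 2                                  # alphabet size (illustrative)
recon = np.array([0.25, 0.75])         # initial reconstruction values (arbitrary)

def discrepancy(x, recon, sym):
    return np.sum((x - recon[sym]) ** 2)

prev = np.inf
for _ in range(50):
    # (a) nearest-neighbour symbol assignment for every observation
    sym = np.argmin(np.abs(x[:, None] - recon[None, :]), axis=1)
    # (b) centroid update of the reconstruction values
    recon = np.array([x[sym == j].mean() for j in range(k)])
    d = discrepancy(x, recon, sym)
    assert d <= prev + 1e-12           # monotone descent, as in Lloyd's algorithm
    prev = d
```

Each of the two alternating steps can only lower the discrepancy, so the iteration converges to a locally optimal symbolization; the paper's contribution is achieving this guarantee for the joint, whole-series objective.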
NASA Astrophysics Data System (ADS)
Sigro, J.; Brunet, M.; Aguilar, E.; Stoll, H.; Jimenez, M.
2009-04-01
The Spanish-funded research project Rapid Climate Changes in the Iberian Peninsula (IP) Based on Proxy Calibration, Long Term Instrumental Series and High Resolution Analyses of Terrestrial and Marine Records (CALIBRE: ref. CGL2006-13327-C04/CLI) has as its main objective to analyse climate dynamics during periods of rapid climate change by means of developing high-resolution paleoclimate proxy records from marine and terrestrial (lakes and caves) deposits over the IP and calibrating them with long-term and high-quality instrumental climate time series. Under CALIBRE, the coordinated project Developing and Enhancing a Climate Instrumental Dataset for Calibrating Climate Proxy Data and Analysing Low-Frequency Climate Variability over the Iberian Peninsula (CLICAL: CGL2006-13327-C04-03/CLI) is devoted to the development of homogenised climate records and sub-regional time series which can be confidently used in the calibration of the lacustrine, marine and speleothem time series generated under CALIBRE. Here we present the procedures followed in order to homogenise a dataset of maximum and minimum temperature and precipitation data on a monthly basis over the Spanish northern coast. The dataset is composed of thirty long monthly precipitation records and twenty long monthly temperature records. The data are quality controlled following the procedures recommended by Aguilar et al. (2003) and tested for homogeneity and adjusted by following the approach adopted by Brunet et al. (2008). Sub-regional time series of precipitation, maximum and minimum temperatures for the period 1853-2007 have been generated by averaging monthly anomalies and then adding back the base-period mean, according to the method of Jones and Hulme (1996). Also, a method has been applied to adjust the variance bias present in regional time series associated with sample sizes that vary over time (Osborn et al., 1997).
The results of this homogenisation exercise and the development of the associated sub-regional time series will be widely discussed. Initial comparisons with rapidly growing speleothems in two different caves indicate that speleothem trace element ratios like Ba/Ca are recording the decrease in littoral precipitation in the last several decades. References Aguilar, E., Auer, I., Brunet, M., Peterson, T. C. and Weringa, J. 2003. Guidelines on Climate Metadata and Homogenization, World Meteorological Organization (WMO)-TD no. 1186 / World Climate Data and Monitoring Program (WCDMP) no. 53, Geneva: 51 pp. Brunet M, Saladié O, Jones P, Sigró J, Aguilar E, Moberg A, Lister D, Walther A, Almarza C. 2008. A case-study/guidance on the development of long-term daily adjusted temperature datasets, WMO-TD-1425/WCDMP-66, Geneva: 43 pp. Jones, P D, and Hulme M, 1996, Calculating regional climatic time series for temperature and precipitation: Methods and illustrations, Int. J. Climatol., 16, 361- 377. Osborn, T. J., Briffa K. R., and Jones P. D., 1997, Adjusting variance for sample-size in tree-ring chronologies and other regional mean time series, Dendrochronologia, 15, 89- 99.
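The Jones and Hulme (1996) step, averaging station anomalies and adding back the base-period mean, is simple enough to sketch. The station counts, base period, and synthetic temperatures below are illustrative, not the project's data:

```python
import numpy as np

rng = np.random.default_rng(4)
# Toy monthly temperatures: 20 stations x 40 years x 12 months (illustrative sizes)
n_sta, n_yr = 20, 40
base = np.arange(10, 20)             # base-period years (an analogue of e.g. 1961-90)
temp = (15 + 8 * np.cos(2 * np.pi * np.arange(12) / 12))[None, None, :] \
       + rng.normal(0, 1, (n_sta, n_yr, 12))

# Station anomalies relative to each station's base-period monthly means
clim = temp[:, base, :].mean(axis=1, keepdims=True)   # station x 1 x month
anom = temp - clim

# Sub-regional series: average anomalies across stations, then add back the
# regional base-period mean (the Jones and Hulme method)
regional_anom = anom.mean(axis=0)                     # year x month
regional = regional_anom + clim.mean(axis=0)          # add back base-period mean
```

Working in anomaly space first avoids biases from stations with different absolute climatologies entering or leaving the network; the variance adjustment of Osborn et al. (1997) would be applied on top of `regional_anom`.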
NASA Astrophysics Data System (ADS)
Serov, Vladislav V.; Kheifets, A. S.
2014-12-01
We analyze a transfer ionization (TI) reaction in the fast proton-helium collision H++He →H0+He2 ++ e- by solving a time-dependent Schrödinger equation (TDSE) under the classical projectile motion approximation in one-dimensional kinematics. In addition, we construct various time-independent analogs of our model using lowest-order perturbation theory in the form of the Born series. By comparing various aspects of the TDSE and the Born series calculations, we conclude that the recent discrepancies of experimental and theoretical data may be attributed to deficiency of the Born models used by other authors. We demonstrate that the correct Born series for TI should include the momentum-space overlap between the double-ionization amplitude and the wave function of the transferred electron.
tsiR: An R package for time-series Susceptible-Infected-Recovered models of epidemics.
Becker, Alexander D; Grenfell, Bryan T
2017-01-01
tsiR is an open source software package implemented in the R programming language designed to analyze infectious disease time-series data. The software extends a well-studied and widely-applied algorithm, the time-series Susceptible-Infected-Recovered (TSIR) model, to infer parameters from incidence data, such as contact seasonality, and to forward simulate the underlying mechanistic model. The tsiR package aggregates a number of different fitting features previously described in the literature in a user-friendly way, providing support for their broader adoption in infectious disease research. Also included in tsiR are a number of diagnostic tools to assess the fit of the TSIR model. This package should be useful for researchers analyzing incidence data for fully-immunizing infectious diseases.
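The mechanistic model that tsiR fits and forward-simulates is the TSIR recursion, roughly I[t+1] = β_t·S_t·I_t^α / N with susceptible book-keeping S[t+1] = S_t + B_t − I[t+1]. A deterministic forward simulation of that skeleton might look like the following; the parameter values are illustrative assumptions, not tsiR defaults:

```python
import numpy as np

# Deterministic skeleton of the TSIR model (parameters are illustrative,
# not tsiR defaults): I[t+1] = beta[t] * S[t] * I[t]**alpha / N
N = 1_000_000                          # population size
alpha = 0.97                           # mixing/homogeneity exponent
beta = 25 * (1 + 0.3 * np.cos(2 * np.pi * np.arange(26) / 26))  # seasonal contact rate
births = 300                           # births per biweek

T = 26 * 20                            # twenty years of biweekly steps
S = np.empty(T)
I = np.empty(T)
S[0], I[0] = 0.06 * N, 100.0
for t in range(T - 1):
    I[t + 1] = beta[t % 26] * S[t] * I[t] ** alpha / N
    S[t + 1] = S[t] + births - I[t + 1]   # susceptibles recharge through births
```

Fitting runs this logic in reverse: tsiR reconstructs S_t from cumulative births and cases, then regresses log incidence to recover the seasonal β_t and α before simulating forward.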
Pandit, Jaideep J; Tavare, Aniket
2011-07-01
It is important that a surgical list is planned to utilise as much of the scheduled time as possible while not over-running, because over-runs can lead to cancellation of operations. We wished to assess whether, theoretically, the known duration of individual operations could be used quantitatively to predict the likely duration of the operating list. In a university hospital setting, we first assessed the extent to which the current ad-hoc method of operating list planning was able to match the scheduled operating list times for 153 consecutive historical lists. Using receiver operating characteristic curve analysis, we assessed the ability of an alternative method to predict operating list duration for the same operating lists. This method uses a simple formula: the sum of individual operation times and a pooled standard deviation of these times. We used the operating list duration estimated from this formula to generate a probability that the operating list would finish within its scheduled time. Finally, we applied the simple formula prospectively to 150 operating lists, 'shadowing' the current ad-hoc method, to confirm the predictive ability of the formula. The ad-hoc method was very poor at planning: 50% of historical operating lists were under-booked and 37% over-booked. In contrast, the simple formula predicted the correct outcome (under-run or over-run) for 76% of these operating lists. The calculated probability that a planned series of operations will over-run or under-run was found useful in developing an algorithm to adjust the planned cases optimally. In the prospective series, 65% of operating lists were over-booked and 10% were under-booked. The formula predicted the correct outcome for 84% of operating lists. A simple quantitative method of estimating operating list duration for a series of operations leads to an algorithm (readily created on an Excel spreadsheet, http://links.lww.com/EJA/A19) that can potentially improve operating list planning.
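The 'simple formula' lends itself to a few lines of arithmetic: sum the mean operation durations, pool the spread (here the individual SDs are combined in quadrature, one reasonable reading of a pooled SD), and convert the scheduled time into a normal-CDF probability of finishing on time. All durations below are made up:

```python
import math

# Probability that a planned series of operations finishes within the scheduled
# time, in the spirit of the paper's formula (all numbers illustrative)
ops = [(55, 15), (80, 25), (60, 20), (45, 10)]      # (mean, SD) minutes per operation

total_mean = sum(m for m, _ in ops)                 # expected list duration
pooled_sd = math.sqrt(sum(s ** 2 for _, s in ops))  # SDs combine in quadrature

scheduled = 270.0                                   # scheduled list time in minutes
z = (scheduled - total_mean) / pooled_sd
p_finish = 0.5 * (1 + math.erf(z / math.sqrt(2)))   # normal CDF at the deadline
```

A planner could then add or remove cases until `p_finish` crosses a chosen threshold, which is essentially the adjustment algorithm the paper builds into its spreadsheet.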
The Effect of the Underlying Distribution in Hurst Exponent Estimation
Sánchez, Miguel Ángel; Trinidad, Juan E.; García, José; Fernández, Manuel
2015-01-01
In this paper, a heavy-tailed distribution approach is considered in order to explore the behavior of actual financial time series. We show that this kind of distribution allows us to properly fit the empirical distribution of the stocks from the S&P500 index. In addition, we explain in detail why the underlying distribution of the random process under study should be taken into account before using its self-similarity exponent as a reliable tool to state whether that financial series displays long-range dependence or not. Finally, we show that, under this model, no stocks from the S&P500 index show persistent memory, whereas some of them do present anti-persistent memory and most of them present no memory at all. PMID:26020942
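A common way to estimate the self-similarity (Hurst) exponent discussed here is rescaled-range analysis: compute R/S over windows of increasing size and fit the log-log slope. The sketch below uses arbitrary window sizes and a synthetic uncorrelated series, and the paper's own estimator may well differ; for memory-free returns the slope should come out near 0.5:

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(size=4096)            # uncorrelated "returns": expect H near 0.5

def hurst_rs(x, sizes=(16, 32, 64, 128, 256, 512)):
    """Rescaled-range estimate of the self-similarity (Hurst) exponent."""
    rs = []
    for n in sizes:
        vals = []
        for start in range(0, len(x) - n + 1, n):    # non-overlapping windows
            w = x[start:start + n]
            z = np.cumsum(w - w.mean())              # demeaned cumulative sum
            r = z.max() - z.min()                    # range of the partial sums
            s = w.std()
            if s > 0:
                vals.append(r / s)
        rs.append(np.mean(vals))
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope

H = hurst_rs(x)
```

The paper's warning applies directly here: for heavy-tailed data, an H away from 0.5 need not indicate long-range dependence, since the estimator's behaviour depends on the underlying distribution.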
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-24
... models underlying STANS, time series of proportional changes in implied volatilities for a range of... computer systems used by OCC to calculate daily margin requirements. OCC has proposed at this time to clear...
Multifractal detrended fluctuation analysis of sheep livestock prices in origin
NASA Astrophysics Data System (ADS)
Pavón-Domínguez, P.; Serrano, S.; Jiménez-Hornero, F. J.; Jiménez-Hornero, J. E.; Gutiérrez de Ravé, E.; Ariza-Villaverde, A. B.
2013-10-01
The multifractal detrended fluctuation analysis (MF-DFA) is used to verify whether or not the returns of time series of prices paid to farmers in original markets can be described by the multifractal approach. By way of example, 5 weekly time series of prices of different breeds, slaughter weight and market differentiation from 2000 to 2012 are analyzed. Results obtained from the multifractal parameters and multifractal spectra show that the price series of livestock products are of a multifractal nature. The Hurst exponent shows that these time series are stationary signals, some of which exhibit long memory (Merino milk-fed in Seville and Segureña paschal in Jaen), short memory (Merino paschal in Cordoba and Segureña milk-fed in Jaen) or are even close to uncorrelated signals (Merino paschal in Seville). MF-DFA is able to discern the different underlying dynamics that play an important role in different types of sheep livestock markets, such as the degree and source of multifractality. In addition, the main source of multifractality of these time series is the broadness of the probability function, instead of the long-range correlation properties between small and large fluctuations, which play a clearly secondary role.
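MF-DFA itself reduces to a few steps: integrate the series into a profile, detrend it in windows of scale s, form the q-th order fluctuation function F_q(s), and read off the generalized Hurst exponents h(q) from log-log slopes. A minimal sketch follows, with first-order detrending, arbitrary scales and q values, and white noise standing in for price returns:

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.normal(size=4000)                 # surrogate "returns" (white noise)

def mfdfa(x, scales, q_list, order=1):
    """Minimal MF-DFA: generalized Hurst exponents h(q)."""
    y = np.cumsum(x - x.mean())           # step 1: the profile
    logF = {q: [] for q in q_list}
    for s in scales:
        t = np.arange(s)
        f2 = []
        for v in range(len(y) // s):      # step 2: non-overlapping segments
            seg = y[v * s:(v + 1) * s]
            coef = np.polyfit(t, seg, order)           # step 3: local trend fit
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        f2 = np.array(f2)
        for q in q_list:                  # step 4: q-th order fluctuation function
            if q == 0:
                logF[q].append(0.5 * np.mean(np.log(f2)))
            else:
                logF[q].append(np.log(np.mean(f2 ** (q / 2))) / q)
    # step 5: h(q) is the log-log slope of F_q(s) against s
    return {q: np.polyfit(np.log(scales), logF[q], 1)[0] for q in q_list}

h = mfdfa(x, scales=[16, 32, 64, 128, 256], q_list=[-2, 0, 2])
```

For a monofractal series h(q) is flat; a spread between h(−2) and h(2) signals multifractality, whose source (distribution broadness vs. correlations) is what the paper diagnoses by comparing with shuffled surrogates.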
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-17
....00 (for options whose underlying stock's previous trading day's last sale price was less than or equal to $100) and between $0.10 and $5.00 (for options whose underlying stock's previous trading day's... the time for series trading between $0.03 and $5.00 (for options whose underlying stock's previous...
Estimating Perturbation and Meta-Stability in the Daily Attendance Rates of Six Small High Schools
NASA Astrophysics Data System (ADS)
Koopmans, Matthijs
This paper discusses the daily attendance rates in six small high schools over a ten-year period and evaluates how stable those rates are. “Stability” is approached from two vantage points: pulse models are fitted to estimate the impact of sudden perturbations and their reverberation through the series, and Autoregressive Fractionally Integrated Moving Average (ARFIMA) techniques are used to detect dependencies over the long range of the series. The analyses are meant to (1) exemplify the utility of time series approaches in educational research, which lacks a time series tradition, (2) discuss some time series features that seem to be particular to daily attendance rate trajectories such as the distinct downward pull coming from extreme observations, and (3) present an analytical approach to handle the important yet distinct patterns of variability that can be found in these data. The analysis also illustrates why the assumption of stability that underlies the habitual reporting of weekly, monthly and yearly averages in the educational literature is questionable, as it reveals dynamical processes (perturbation, meta-stability) that remain hidden in such summaries.
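The ARFIMA machinery used to detect long-range dependence rests on the fractional-differencing operator (1 − B)^d, whose binomial weights obey w_0 = 1, w_k = w_{k−1}(k − 1 − d)/k. A sketch that builds those weights and fractionally integrates white noise into a long-memory series follows; the value of d and the series length are arbitrary:

```python
import numpy as np

def frac_diff_weights(d, n):
    """First n weights of the fractional-differencing operator (1 - B)^d."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

# Simulate ARFIMA(0, d, 0) by fractionally *integrating* white noise,
# i.e. applying the expansion of (1 - B)^(-d) to the innovations
rng = np.random.default_rng(7)
d, n = 0.3, 2000                       # 0 < d < 0.5 gives stationary long memory
psi = frac_diff_weights(-d, n)         # weights of (1 - B)^(-d)
eps = rng.normal(size=n)
x = np.array([psi[: t + 1][::-1] @ eps[: t + 1] for t in range(n)])

def acf(x, lag):
    xc = x - x.mean()
    return (xc[:-lag] @ xc[lag:]) / (xc @ xc)

rho1 = acf(x, 1)                       # theory: d / (1 - d) for ARFIMA(0, d, 0)
```

The slowly decaying autocorrelation of such a series is exactly the long-range dependence that distinguishes meta-stable attendance trajectories from short-memory noise.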
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-02
... retirement life of 3,600 hours time-in-service (TIS) for certain part-numbered main rotor yokes installed on the Bell Model 204, 205 series, and 212 series helicopters. Those ADs were prompted by reports of... the applicability to include yokes produced under a Parts Manufacturing Approval (PMA) whose design...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-23
... Series (Order Audit Trail System); NASD Rule 2320 (Best Execution and Interpositioning); NASD Rule 2400 Series (Commissions, Mark-Ups and Charges); NASD IM-2110-2 (Trading Ahead of Customer Limit Order); and... prepared to purchase or sell at that price and under the conditions stated at the time of the offer to buy...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-24
... time that a document is referenced. Revision 3 of Regulatory Guide 1.129 is available in ADAMS under...-251-7455; email: [email protected] . Both of the Office of Nuclear Regulatory Research, U.S... NRC is issuing a revision to an existing guide in the NRC's ``Regulatory Guide'' series. This series...
46 CFR 503.80 - Procedures for withholding information pertaining to meeting.
Code of Federal Regulations, 2011 CFR
2011-10-01
... pertaining to a portion or portions of a meeting or to a portion or portions of a series of meetings be... series of meetings. (b) Upon receipt of any request made under paragraph (a) of this section, the Secretary shall schedule a time at which the members of the agency shall vote upon the request, which vote...
46 CFR 503.80 - Procedures for withholding information pertaining to meeting.
Code of Federal Regulations, 2014 CFR
2014-10-01
... pertaining to a portion or portions of a meeting or to a portion or portions of a series of meetings be... series of meetings. (b) Upon receipt of any request made under paragraph (a) of this section, the Secretary shall schedule a time at which the members of the agency shall vote upon the request, which vote...
A Markovian Entropy Measure for the Analysis of Calcium Activity Time Series.
Marken, John P; Halleran, Andrew D; Rahman, Atiqur; Odorizzi, Laura; LeFew, Michael C; Golino, Caroline A; Kemper, Peter; Saha, Margaret S
2016-01-01
Methods to analyze the dynamics of calcium activity often rely on visually distinguishable features in time series data such as spikes, waves, or oscillations. However, systems such as the developing nervous system display a complex, irregular type of calcium activity which makes the use of such methods less appropriate. Instead, for such systems there exists a class of methods (including information theoretic, power spectral, and fractal analysis approaches) which use more fundamental properties of the time series to analyze the observed calcium dynamics. We present a new analysis method in this class, the Markovian Entropy measure, which is an easily implementable calcium time series analysis method which represents the observed calcium activity as a realization of a Markov Process and describes its dynamics in terms of the level of predictability underlying the transitions between the states of the process. We applied our method and other commonly used calcium analysis methods to a dataset from Xenopus laevis neural progenitors which displays irregular calcium activity and a dataset from murine synaptic neurons which displays activity time series that are well-described by visually-distinguishable features. We find that the Markovian Entropy measure is able to distinguish between biologically distinct populations in both datasets, and that it can separate biologically distinct populations to a greater extent than other methods in the dataset exhibiting irregular calcium activity. These results support the benefit of using the Markovian Entropy measure to analyze calcium dynamics, particularly for studies using time series data which do not exhibit easily distinguishable features.
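The measure described, treating the series as a realization of a Markov process and scoring the predictability of its state transitions, can be approximated by the entropy rate of an empirical transition matrix. The discretization scheme below (four equiprobable quantile bins) is an assumption for the demo, not necessarily the authors' choice:

```python
import numpy as np

rng = np.random.default_rng(8)

def markov_entropy(x, n_states=4):
    """Entropy rate (bits/transition) of a first-order Markov chain fitted
    to a discretized time series."""
    # Discretize into (roughly) equiprobable states via quantile bin edges
    edges = np.quantile(x, np.linspace(0, 1, n_states + 1)[1:-1])
    s = np.digitize(x, edges)
    # Transition counts, state occupancy, and row-normalized transition matrix
    P = np.zeros((n_states, n_states))
    for a, b in zip(s[:-1], s[1:]):
        P[a, b] += 1
    pi = P.sum(axis=1)
    pi = pi / pi.sum()
    P = P / np.maximum(P.sum(axis=1, keepdims=True), 1)
    with np.errstate(divide="ignore"):
        logP = np.where(P > 0, np.log2(P), 0.0)
    return -np.sum(pi[:, None] * P * logP)

# A predictable (slow sine) trace vs an irregular (white noise) trace
t = np.arange(2000)
h_regular = markov_entropy(np.sin(2 * np.pi * t / 200))
h_irregular = markov_entropy(rng.normal(size=2000))
```

A smooth oscillation mostly self-transitions, giving a low entropy rate, while an irregular trace approaches the log2(4) = 2-bit maximum; that gap is what lets the measure separate populations without relying on visually distinguishable features.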
... under the skin) Infection (a slight risk any time the skin is broken) Alternative Names GH test Images Growth hormone stimulation test - series References Ali O. Hyperpituitarism, tall stature, and overgrowth ...
NASA Astrophysics Data System (ADS)
Corbineau, A.; Rouyer, T.; Fromentin, J.-M.; Cazelles, B.; Fonteneau, A.; Ménard, F.
2010-07-01
Catch data of large pelagic fish such as tuna, swordfish and billfish are highly variable, from short to long time scales. Because they are derived from fisheries data, these time series are noisy and reflect mixed information on exploitation (targeting, strategy, fishing power), population dynamics (recruitment, growth, mortality, migration, etc.), and environmental forcing (local conditions or dominant climate patterns). In this work, we investigated patterns of variation of large pelagic fish (i.e. yellowfin tuna, bigeye tuna, swordfish and blue marlin) in Japanese longliner catch data from 1960 to 2004. We performed wavelet analyses on the yearly time series of each fish species in each biogeographic province of the tropical Indian and Atlantic Oceans. In addition, we carried out cross-wavelet analyses between these biological time series and a large-scale climatic index, i.e. the Southern Oscillation Index (SOI). Results showed that the biogeographic province was the most important factor structuring the patterns of variability of Japanese catch time series. Relationships between the SOI and the fish catches in the Indian and Atlantic Oceans also pointed out the role of climatic variability in structuring patterns of variation of catch time series. This work finally confirmed that Japanese longline CPUE data poorly reflect the underlying population dynamics of tunas.
Correlations of stock price fluctuations under multi-scale and multi-threshold scenarios
NASA Astrophysics Data System (ADS)
Sui, Guo; Li, Huajiao; Feng, Sida; Liu, Xueyong; Jiang, Meihui
2018-01-01
The multi-scale method is widely used in analyzing time series of financial markets and it can provide market information for different economic entities who focus on different periods. Through constructing multi-scale networks of price fluctuation correlation in the stock market, we can detect the topological relationship between each time series. Previous research has not addressed the problem that the original fluctuation correlation networks are fully connected, so more information exists within these networks than is currently being utilized. Here we use listed coal companies as a case study. First, we decompose the original stock price fluctuation series into different time scales. Second, we construct the stock price fluctuation correlation networks at different time scales. Third, we delete the edges of the network based on thresholds and analyze the network indicators. Through combining the multi-scale method with the multi-threshold method, we bring to light the implicit information of fully connected networks.
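The construction the abstract describes, a fully connected correlation network pruned at different thresholds, is compact to sketch. The factor-model returns and threshold values below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(9)
# Toy "stock return" series for 10 companies; the first 5 load on a common factor
n_assets, T = 10, 500
factor = rng.normal(size=T)
R = rng.normal(size=(n_assets, T))
R[:5] += 2.0 * factor

# The original network is fully connected: every pair gets a correlation edge
C = np.corrcoef(R)

# Deleting edges below a threshold reveals the strongly coupled core
def adjacency(C, tau):
    A = (np.abs(C) >= tau).astype(int)
    np.fill_diagonal(A, 0)               # no self-loops
    return A

edge_count = {tau: adjacency(C, tau).sum() // 2 for tau in (0.2, 0.5, 0.8)}
```

Sweeping the threshold, as the paper does per time scale, traces how the dense network collapses onto its strongly correlated cluster, here the five factor-driven assets.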
Time series models of environmental exposures: Good predictions or good understanding.
Barnett, Adrian G; Stephen, Dimity; Huang, Cunrui; Wolkewitz, Martin
2017-04-01
Time series data are popular in environmental epidemiology as they make use of the natural experiment of how changes in exposure over time might impact on disease. Many published time series papers have used parameter-heavy models that fully explained the second order patterns in disease to give residuals that have no short-term autocorrelation or seasonality. This is often achieved by including predictors of past disease counts (autoregression) or seasonal splines with many degrees of freedom. These approaches give great residuals, but add little to our understanding of cause and effect. We argue that modelling approaches should rely more on good epidemiology and less on statistical tests. This includes thinking about causal pathways, making potential confounders explicit, fitting a limited number of models, and not over-fitting at the cost of under-estimating the true association between exposure and disease. Copyright © 2017 Elsevier Inc. All rights reserved.
Production and Uses of Multi-Decade Geodetic Earth Science Data Records
NASA Astrophysics Data System (ADS)
Bock, Y.; Kedar, S.; Moore, A. W.; Fang, P.; Liu, Z.; Sullivan, A.; Argus, D. F.; Jiang, S.; Marshall, S. T.
2017-12-01
The Solid Earth Science ESDR System (SESES) project funded under the NASA MEaSUREs program produces and disseminates mature, long-term, calibrated and validated, GNSS-based Earth Science Data Records (ESDRs) that encompass multiple diverse areas of interest in Earth Science, such as tectonic motion, transient slip and earthquake dynamics, as well as meteorology, climate, and hydrology. The ESDRs now span twenty-five years for the earliest stations and today are available for thousands of global and regional stations. Using a unified metadata database and a combination of GNSS solutions generated by two independent analysis centers, the project currently produces four long-term ESDRs: Geodetic Displacement Time Series: Daily, combined, cleaned and filtered, GIPSY and GAMIT long-term time series of continuous GPS station positions (global and regional) in the latest version of ITRF, automatically updated weekly. Geodetic Velocities: Weekly updated velocity field + velocity field histories in various reference frames; compendium of all model parameters including earthquake catalog, coseismic offsets, and postseismic model parameters (exponential or logarithmic). Troposphere Delay Time Series: Long-term time series of troposphere delay (30-min resolution) at geodetic stations, necessarily estimated during position time series production and automatically updated weekly. Seismogeodetic records for historic earthquakes: High-rate broadband displacement and seismic velocity time series combining 1 Hz GPS displacements and 100 Hz accelerometer data for select large earthquakes and collocated cGPS and seismic instruments from regional networks. We present several recent notable examples of the ESDRs' usage: A transient slip study that uses the combined position time series to unravel "tremor-less" slow tectonic transient events. Fault geometry determination from geodetic slip rates.
Changes in water resources across California's physiographic provinces at a spatial resolution of 75 km. Retrospective study of a southern California summer monsoon event.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-14
... models underlying STANS, time series of proportional changes in implied volatilities for a range of.... OCC has proposed at this time to clear only OTC Options on the S&P 500 index and only such options...
Development and Testing of Data Mining Algorithms for Earth Observation
NASA Technical Reports Server (NTRS)
Glymour, Clark
2005-01-01
The new algorithms developed under this project included a principled procedure for classification of objects, events or circumstances according to a target variable when a very large number of potential predictor variables is available but the number of cases that can be used for training a classifier is relatively small. These "high dimensional" problems require finding a minimal set of variables (called the Markov Blanket) sufficient for predicting the value of the target variable. An algorithm, the Markov Blanket Fan Search, was developed, implemented and tested on both simulated and real data in conjunction with a graphical model classifier, which was also implemented. Another algorithm developed and implemented in TETRAD IV for time series elaborated on work by C. Granger and N. Swanson, which in turn exploited some of our earlier work. The algorithms in question learn a linear time series model from data. Given such a time series, the simultaneous residual covariances, after factoring out time dependencies, may provide information about causal processes that occur more rapidly than the time series representation allows, so-called simultaneous or contemporaneous causal processes. Working with A. Monetta, a graduate student from Italy, we produced the correct statistics for estimating the contemporaneous causal structure from time series data using the TETRAD IV suite of algorithms. Two economists, David Bessler and Kevin Hoover, have independently published applications using TETRAD style algorithms to the same purpose. These implementations and algorithmic developments were separately used in two kinds of studies of climate data: short time series of geographically proximate climate variables predicting agricultural effects in California, and longer duration climate measurements of temperature teleconnections.
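A minimal sketch of the contemporaneous-causation idea (not the TETRAD IV implementation): fit a VAR(1) by least squares to factor out the time dependencies, then inspect the residual correlation, where a simulated instantaneous effect of x on y reappears:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    ex = rng.normal()
    ey = 0.9 * ex + 0.4 * rng.normal()   # contemporaneous effect of x on y
    x[t] = 0.5 * x[t - 1] + ex
    y[t] = 0.3 * y[t - 1] + ey

def var1_residuals(series):
    """OLS fit of each series on one lag of all series (a VAR(1))."""
    Y = np.column_stack(series)[1:]
    X = np.column_stack([np.ones(len(series[0]) - 1)]
                        + [s[:-1] for s in series])
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return Y - X @ B

resid = var1_residuals([x, y])
# Lagged dependencies are factored out; the instantaneous link remains.
print(round(np.corrcoef(resid.T)[0, 1], 2))
```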
Swartz, R. Andrew
2013-01-01
This paper investigates the time series representation methods and similarity measures for sensor data feature extraction and structural damage pattern recognition. Both model-based time series representation and dimensionality reduction methods are studied to compare the effectiveness of feature extraction for damage pattern recognition. The evaluation of feature extraction methods is performed by examining the separation of feature vectors among different damage patterns and the pattern recognition success rate. In addition, the impact of similarity measures on the pattern recognition success rate and the metrics for damage localization are also investigated. The test data used in this study are from the System Identification to Monitor Civil Engineering Structures (SIMCES) Z24 Bridge damage detection tests, a rigorous instrumentation campaign that recorded the dynamic performance of a concrete box-girder bridge under progressively increasing damage scenarios. A number of progressive damage test case datasets and damage test data with different damage modalities are used. The simulation results show that both time series representation methods and similarity measures have significant impact on the pattern recognition success rate. PMID:24191136
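A sketch of the model-based feature-extraction idea, with assumptions throughout: AR-coefficient feature vectors, cosine similarity as the similarity measure, and AR(1) surrogates standing in for the Z24 sensor data:

```python
import numpy as np

def ar_features(x, p=4):
    """Least-squares AR(p) coefficients as a model-based feature vector."""
    X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def cosine_similarity(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

rng = np.random.default_rng(3)

def ar1(phi, n=1000):
    """Surrogate sensor record: an AR(1) process with coefficient phi."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()
    return x

healthy_a, healthy_b, damaged = ar1(0.8), ar1(0.8), ar1(-0.5)
fa, fb, fd = (ar_features(s) for s in (healthy_a, healthy_b, damaged))
print(round(cosine_similarity(fa, fb), 2), round(cosine_similarity(fa, fd), 2))
```

Feature vectors from the same "damage state" cluster together, while a changed dynamic pulls the vector away, which is the separation the paper evaluates.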
van Bömmel, Alena; Song, Song; Majer, Piotr; Mohr, Peter N C; Heekeren, Hauke R; Härdle, Wolfgang K
2014-07-01
Decision making usually involves uncertainty and risk. Understanding which parts of the human brain are activated during decisions under risk and which neural processes underlie (risky) investment decisions are important goals in neuroeconomics. Here, we analyze functional magnetic resonance imaging (fMRI) data on 17 subjects who were exposed to an investment decision task from Mohr, Biele, Krugel, Li, and Heekeren (in NeuroImage 49, 2556-2563, 2010b). We obtain a time series of three-dimensional images of the blood-oxygen-level dependent (BOLD) fMRI signals. We apply a panel version of the dynamic semiparametric factor model (DSFM) presented in Park, Mammen, Wolfgang, and Borak (in Journal of the American Statistical Association 104(485), 284-298, 2009) and identify task-related activations in space and dynamics in time. With the panel DSFM (PDSFM) we can capture the dynamic behavior of the specific brain regions common for all subjects and represent the high-dimensional time-series data in easily interpretable low-dimensional dynamic factors without large loss of variability. Further, we classify the risk attitudes of all subjects based on the estimated low-dimensional time series. Our classification analysis successfully confirms the estimated risk attitudes derived directly from subjects' decision behavior.
Beyond Fractals and 1/f Noise: Multifractal Analysis of Complex Physiological Time Series
NASA Astrophysics Data System (ADS)
Ivanov, Plamen Ch.; Amaral, Luis A. N.; Ashkenazy, Yosef; Stanley, H. Eugene; Goldberger, Ary L.; Hausdorff, Jeffrey M.; Yoneyama, Mitsuru; Arai, Kuniharu
2001-03-01
We investigate time series with 1/f-like spectra generated by two physiologic control systems: the human heartbeat and human gait. We show that physiological fluctuations exhibit unexpected "hidden" structures often described by scaling laws. In particular, our studies indicate that when analyzed on different time scales the heartbeat fluctuations exhibit cascades of branching patterns with self-similar (fractal) properties, characterized by long-range power-law anticorrelations. We find that these scaling features change during sleep and wake phases, and with pathological perturbations. Further, by means of a new wavelet-based technique, we find evidence of multifractality in the healthy human heartbeat even under resting conditions, and show that the multifractal character and nonlinear properties of the healthy heart are encoded in the Fourier phases. We uncover a loss of multifractality for a life-threatening condition, congestive heart failure. In contrast to the heartbeat, we find that the interstride interval time series of healthy human gait, a voluntary process under neural regulation, is described by a single fractal dimension (such as classical 1/f noise) indicating monofractal behavior. Thus our approach can help distinguish physiological and physical signals with comparable frequency spectra and two-point correlations, and guide modeling of their control mechanisms.
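The scaling analyses mentioned here belong to the detrended-fluctuation family. A minimal (monofractal) DFA sketch, applied to synthetic surrogates rather than heartbeat data, recovers the textbook exponents of white noise (about 0.5) and Brownian motion (about 1.5):

```python
import numpy as np

def dfa_exponent(x, scales):
    """Detrended fluctuation analysis: slope of log F(s) versus log s."""
    profile = np.cumsum(x - np.mean(x))
    fluctuations = []
    for s in scales:
        n_seg = len(profile) // s
        f2 = []
        for i in range(n_seg):
            seg = profile[i * s:(i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)           # local linear detrend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluctuations.append(np.sqrt(np.mean(f2)))
    slope, _ = np.polyfit(np.log(scales), np.log(fluctuations), 1)
    return slope

rng = np.random.default_rng(4)
white = rng.normal(size=4096)    # uncorrelated fluctuations
brown = np.cumsum(white)         # strongly correlated (integrated) signal
scales = [16, 32, 64, 128, 256]
print(round(dfa_exponent(white, scales), 2), round(dfa_exponent(brown, scales), 2))
```

The multifractal analysis in the paper generalizes this single exponent to a spectrum of exponents; this sketch shows only the monofractal case.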
Rostami, Mehran; Jalilian, Abdollah; Hamzeh, Behrooz; Laghaei, Zahra
2015-01-01
The target of the Fourth Millennium Development Goal (MDG-4) is to reduce the rate of under-five mortality by two-thirds between 1990 and 2015. Despite substantial progress towards achieving the target of the MDG-4 in Iran at the national level, differences at the sub-national levels should be taken into consideration. The under-five mortality data available from the Deputy of Public Health, Kermanshah University of Medical Sciences, was used in order to perform a time series analysis of the monthly under-five mortality rate (U5MR) from 2005 to 2012 in Kermanshah province in the west of Iran. After primary analysis, a seasonal auto-regressive integrated moving average model was chosen as the best fitting model based on model selection criteria. The model was assessed and proved to be adequate in describing variations in the data. However, the unexpected presence of a stochastic increasing trend and a seasonal component with a periodicity of six months in the fitted model are very likely to be consequences of poor quality of data collection and reporting systems. The present work is the first attempt at time series modeling of the U5MR in Iran, and reveals that improvement of under-five mortality data collection in health facilities and their corresponding systems is a major challenge to fully achieving the MDG-4 in Iran. Studies similar to the present work can enhance the understanding of the invisible patterns in U5MR, monitor progress towards the MDG-4, and predict the impact of future variations on the U5MR.
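The six-month seasonal component such a model detects can also be seen directly in a periodogram. The following sketch uses fabricated monthly counts, not the Kermanshah data:

```python
import numpy as np

rng = np.random.default_rng(5)
months = np.arange(96)                               # 8 years of monthly data
trend = 0.05 * months                                # stand-in for the trend
seasonal = 2.0 * np.sin(2 * np.pi * months / 6)      # 6-month periodicity
u5mr = 20 + trend + seasonal + rng.normal(size=96)   # fabricated monthly U5MR

# Remove the linear trend, then locate the dominant seasonal frequency.
detrended = u5mr - np.polyval(np.polyfit(months, u5mr, 1), months)
spectrum = np.abs(np.fft.rfft(detrended)) ** 2
freqs = np.fft.rfftfreq(len(months), d=1.0)          # cycles per month
peak_freq = freqs[1:][np.argmax(spectrum[1:])]       # skip the zero frequency
print(round(1 / peak_freq, 1))                       # dominant period, months
```

A seasonal ARIMA fit would then encode this six-month cycle in its seasonal terms; the periodogram is just a quick diagnostic.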
Huang, Xin; Zeng, Jun; Zhou, Lina; Hu, Chunxiu; Yin, Peiyuan; Lin, Xiaohui
2016-08-31
Time-series metabolomics studies can provide insight into the dynamics of disease development and facilitate the discovery of prospective biomarkers. To improve the performance of early risk identification, a new strategy for analyzing time-series data based on dynamic networks (ATSD-DN) in a systematic time dimension is proposed. In ATSD-DN, the non-overlapping ratio was applied to measure the changes in feature ratios during the process of disease development and to construct dynamic networks. Dynamic concentration analysis and network topological structure analysis were performed to extract early warning information. This strategy was applied to the study of time-series lipidomics data from a stepwise hepatocarcinogenesis rat model. A ratio of lyso-phosphatidylcholine (LPC) 18:1/free fatty acid (FFA) 20:5 was identified as the potential biomarker for hepatocellular carcinoma (HCC). It can be used to classify HCC and non-HCC rats, and the area under the curve values in the discovery and external validation sets were 0.980 and 0.972, respectively. This strategy was also compared with a weighted relative difference accumulation algorithm (wRDA), multivariate empirical Bayes statistics (MEBA) and support vector machine-recursive feature elimination (SVM-RFE). The better performance of ATSD-DN suggests its potential for a more complete presentation of time-series changes and effective extraction of early warning information.
Analyzing Single-Molecule Time Series via Nonparametric Bayesian Inference
Hines, Keegan E.; Bankston, John R.; Aldrich, Richard W.
2015-01-01
The ability to measure the properties of proteins at the single-molecule level offers an unparalleled glimpse into biological systems at the molecular scale. The interpretation of single-molecule time series has often been rooted in statistical mechanics and the theory of Markov processes. While existing analysis methods have been useful, they are not without significant limitations including problems of model selection and parameter nonidentifiability. To address these challenges, we introduce the use of nonparametric Bayesian inference for the analysis of single-molecule time series. These methods provide a flexible way to extract structure from data instead of assuming models beforehand. We demonstrate these methods with applications to several diverse settings in single-molecule biophysics. This approach provides a well-constrained and rigorously grounded method for determining the number of biophysical states underlying single-molecule data. PMID:25650922
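One concrete instance of such a nonparametric Bayesian approach is a Dirichlet process mixture, in which the number of occupied states is inferred rather than fixed in advance. The collapsed Gibbs sampler below is a toy sketch with known component noise and synthetic two-state data, not the authors' method:

```python
import numpy as np

def dp_mixture_states(x, alpha=1.0, sigma=0.5, tau=10.0, iters=100, seed=0):
    """Toy collapsed Gibbs sampler for a Dirichlet process mixture of
    Gaussians with known component sigma; returns the number of occupied
    states (clusters) in the final sweep."""
    rng = np.random.default_rng(seed)
    z = np.zeros(len(x), dtype=int)              # start with a single state
    for _ in range(iters):
        for i in range(len(x)):
            z[i] = -1                            # remove point i
            labels, counts = np.unique(z[z >= 0], return_counts=True)
            logp = []
            for lab, cnt in zip(labels, counts):
                members = x[z == lab]
                # posterior predictive N(m, v) given the cluster's members
                prec = 1 / tau**2 + len(members) / sigma**2
                m = (members.sum() / sigma**2) / prec
                v = 1 / prec + sigma**2
                logp.append(np.log(cnt) - 0.5 * (x[i] - m) ** 2 / v
                            - 0.5 * np.log(v))
            v0 = tau**2 + sigma**2               # predictive for a new state
            logp.append(np.log(alpha) - 0.5 * x[i] ** 2 / v0 - 0.5 * np.log(v0))
            logp = np.array(logp)
            p = np.exp(logp - logp.max())
            p /= p.sum()
            k = rng.choice(len(p), p=p)
            z[i] = labels[k] if k < len(labels) else z.max() + 1
        _, z = np.unique(z, return_inverse=True)  # compact the labels
    return len(np.unique(z))

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 0.5, 60), rng.normal(5, 0.5, 60)])
print(dp_mixture_states(x))
```

Starting from a single state, the sampler discovers the two well-separated biophysical states on its own, which is the model-selection behavior the abstract highlights.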
Estimating monotonic rates from biological data using local linear regression.
Olito, Colin; White, Craig R; Marshall, Dustin J; Barneche, Diego R
2017-03-01
Accessing many fundamental questions in biology begins with empirical estimation of simple monotonic rates of underlying biological processes. Across a variety of disciplines, ranging from physiology to biogeochemistry, these rates are routinely estimated from non-linear and noisy time series data using linear regression and ad hoc manual truncation of non-linearities. Here, we introduce the R package LoLinR, a flexible toolkit to implement local linear regression techniques to objectively and reproducibly estimate monotonic biological rates from non-linear time series data, and demonstrate possible applications using metabolic rate data. LoLinR provides methods to easily and reliably estimate monotonic rates from time series data in a way that is statistically robust, facilitates reproducible research and is applicable to a wide variety of research disciplines in the biological sciences. © 2017. Published by The Company of Biologists Ltd.
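The LoLinR idea of replacing ad hoc manual truncation with an objective local-regression search can be caricatured as follows. This brute-force R²-based window scan is an assumption-laden stand-in, not the package's actual algorithm or API:

```python
import numpy as np

def best_local_slope(t, y, min_width=10):
    """Scan all contiguous windows of at least min_width points and
    return the slope of the window with the highest R^2: a crude
    stand-in for an objective local linear regression search."""
    best_r2, best_slope = -np.inf, None
    n = len(t)
    for i in range(n - min_width + 1):
        for j in range(i + min_width, n + 1):
            tt, yy = t[i:j], y[i:j]
            slope, intercept = np.polyfit(tt, yy, 1)
            fit = slope * tt + intercept
            r2 = 1 - np.sum((yy - fit) ** 2) / np.sum((yy - yy.mean()) ** 2)
            if r2 > best_r2:
                best_r2, best_slope = r2, slope
    return best_slope

# Oxygen-trace-like series: a linear decline that flattens out (non-linear).
rng = np.random.default_rng(6)
t = np.linspace(0, 30, 61)
y = np.where(t < 15, 100 - 2 * t, 70) + rng.normal(0, 0.3, 61)
print(round(best_local_slope(t, y), 2))
```

The search settles on the linear portion of the trace and recovers its rate, without anyone deciding by eye where the non-linearity begins.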
On system behaviour using complex networks of a compression algorithm
NASA Astrophysics Data System (ADS)
Walker, David M.; Correa, Debora C.; Small, Michael
2018-01-01
We construct complex networks of scalar time series using a data compression algorithm. The structure and statistics of the resulting networks can be used to help characterize complex systems, and one property, in particular, appears to be a useful discriminating statistic in surrogate data hypothesis tests. We demonstrate these ideas on systems with known dynamical behaviour and also show that our approach is capable of identifying behavioural transitions within electroencephalogram recordings as well as changes due to a bifurcation parameter of a chaotic system. The technique we propose is dependent on a coarse grained quantization of the original time series and therefore provides potential for a spatial scale-dependent characterization of the data. Finally the method is as computationally efficient as the underlying compression algorithm and provides a compression of the salient features of long time series.
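The dependence on coarse-grained quantization plus a compression algorithm can be illustrated without the network construction itself: after symbolizing a series, the compressed length already discriminates structured from unstructured dynamics. A sketch using zlib, an assumed stand-in for the paper's compressor:

```python
import zlib

import numpy as np

def compressed_size(x, n_bins=8):
    """Coarse-grain a series into symbols and return the zlib-compressed
    length; more temporal structure compresses to fewer bytes."""
    bins = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    symbols = np.digitize(x, bins).astype(np.uint8).tobytes()
    return len(zlib.compress(symbols, 9))

rng = np.random.default_rng(7)
noise = rng.normal(size=2000)
periodic = (np.sin(2 * np.pi * np.arange(2000) / 50)
            + 0.05 * rng.normal(size=2000))
print(compressed_size(periodic), compressed_size(noise))
```

The bin count controls the coarse-graining scale, which is what gives the method its spatial scale-dependent character.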
Phase locking route behind complex periodic windows in a forced oscillator
NASA Astrophysics Data System (ADS)
Jan, Hengtai; Tsai, Kuo-Ting; Kuo, Li-wei
2013-09-01
Chaotic systems have complex reactions against an external driving force; even in cases with low-dimension oscillators, the routes to synchronization are diverse. We proposed a stroboscope-based method for analyzing driven chaotic systems in their phase space. According to two statistic quantities generated from time series, we could realize the system state and the driving behavior simultaneously. We demonstrated our method in a driven bi-stable system, which showed complex periodic windows under a proper driving force. With increasing periodic driving force, a route from interior periodic oscillation to phase synchronization through the chaos state could be found. Periodic windows could also be identified and the circumstances under which they occurred distinguished. Statistical results were supported by conditional Lyapunov exponent analysis to show the power in analyzing the unknown time series.
Quantification of Toxic Effects for Water Concentration-based Aquatic Life Criteria -Part B
Erickson et al. (1991) conducted a series of experiments on the toxicity of pentachloroethane (PCE) to juvenile fathead minnows. These experiments included evaluations of bioaccumulation kinetics, the time-course of mortality under both constant and time-variable exposures, the r...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-24
... Amendment No. 1 Thereto, To List and Trade Fourteen Series of the iShares Trust Under NYSE Arca Equities... rule change to list and trade shares (``Shares'') of fourteen series of the iShares Trust (``Trust... roll-over in New Zealand, which would ordinarily be 7:00 a.m., New Zealand time (which would be 1:00 p...
Behaviour of a series of reservoirs separated by drowned gates
NASA Astrophysics Data System (ADS)
Kolechkina, Alla; van Nooijen, Ronald
2017-04-01
Modern control systems tend to be based on computers and therefore to operate by sending commands to structures at given intervals (discrete time control system). Moreover, for almost all water management control systems there are practical lower limits on the time interval between structure adjustments and even between measurements. The water resource systems that are being controlled are physical systems whose state changes continuously. If we combine a continuously changing system and a discrete time controller we get a hybrid system. We use material from recent control theory literature to examine the behaviour of a series of reservoirs separated by drowned gates where the gates are under computer control.
NASA Astrophysics Data System (ADS)
Prada, D. A.; Sanabria, M. P.; Torres, A. F.; Álvarez, M. A.; Gómez, J.
2018-04-01
The study of persistence in time series of seismic events in two of the most important networks, Hindu Kush in Afghanistan and Los Santos, Santander in Colombia, generates great interest due to their high telluric activity. The data were taken from the global seismological network. The presence of a Gaussian distribution was analyzed using the Jarque-Bera test; because the distributions of the series were asymmetric and not mesokurtic, the Hurst coefficient was calculated using the rescaled range method. From this coefficient the fractal dimension associated with these time series was obtained, which makes it possible to determine the persistence, antipersistence and volatility of these phenomena.
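The rescaled range calculation referred to above can be sketched as follows, on synthetic series instead of the seismological catalogues (H near 0.5 indicates no persistence; H well above 0.5, persistence):

```python
import numpy as np

def hurst_rs(x, window_sizes):
    """Hurst exponent: slope of log(R/S) against log(window size)."""
    rs_means = []
    for w in window_sizes:
        rs = []
        for start in range(0, len(x) - w + 1, w):
            seg = x[start:start + w]
            dev = np.cumsum(seg - seg.mean())
            r = dev.max() - dev.min()      # range of cumulative deviations
            s = seg.std()                  # rescaling factor
            if s > 0:
                rs.append(r / s)
        rs_means.append(np.mean(rs))
    slope, _ = np.polyfit(np.log(window_sizes), np.log(rs_means), 1)
    return slope

rng = np.random.default_rng(8)
white = rng.normal(size=8192)   # uncorrelated series: H near 0.5
walk = np.cumsum(white)         # strongly persistent series: H near 1
windows = [16, 32, 64, 128, 256, 512]
print(round(hurst_rs(white, windows), 2), round(hurst_rs(walk, windows), 2))
```

The fractal dimension then follows from the usual relation D = 2 - H for self-affine records.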
Analysis and Forecasting of Shoreline Position
NASA Astrophysics Data System (ADS)
Barton, C. C.; Tebbens, S. F.
2007-12-01
Analysis of historical shoreline positions on sandy coasts, in the geologic record, and study of sea-level rise curves reveals that the dynamics of the underlying processes produce temporal/spatial signals that exhibit power scaling and are therefore self-affine fractals. Self-affine time series signals can be quantified over many orders of magnitude in time and space in terms of persistence, a measure of the degree of correlation between adjacent values in the stochastic portion of a time series. Fractal statistics developed for self-affine time series are used to forecast a probability envelope bounding future shoreline positions. The envelope provides the standard deviation as a function of three variables: persistence, a constant equal to the value of the power spectral density when 1/period equals 1, and the number of time increments. The persistence of a twenty-year time series of the mean-high-water (MHW) shoreline positions was measured for four profiles surveyed at Duck, NC at the Field Research Facility (FRF) by the U.S. Army Corps of Engineers. The four MHW shoreline time series signals are self-affine with persistence ranging between 0.8 and 0.9, which indicates that the shoreline position time series is weakly persistent (where zero is uncorrelated), and has highly varying trends for all time intervals sampled. Forecasts of a probability envelope for future MHW positions are made for the 20 years of record and beyond to 50 years from the start of the data records. The forecasts describe the twenty-year data sets well and indicate that within a 96% confidence envelope, future decadal MHW shoreline excursions should be within 14.6 m of the position at the start of data collection. This is a stable-oscillatory shoreline. 
The forecasting method introduced here includes the stochastic portion of the time series. The traditional method of predicting shoreline change, by contrast, reduces the time series to a linear trend line fit to historic shoreline positions and extrapolates it to forecast future positions; its linearly increasing mean breaks the confidence envelope eight years into the future and continues to increase. The traditional method is thus a poor representation of the observed shoreline position time series and a poor basis for extrapolating future shoreline positions.
NASA Astrophysics Data System (ADS)
Genty, Dominique; Massault, Marc
1999-05-01
Twenty-two AMS 14C measurements have been made on a modern stalagmite from SW France in order to reconstruct the 14C activity history of the calcite deposit. Annual growth laminae provide a chronology up to 1919 A.D. Results show that the stalagmite 14C activity time series is sensitive to modern atmosphere 14C activity changes such as those produced by the nuclear weapon tests. The comparison between the two 14C time series shows that the stalagmite time series is damped: its amplitude variation between pre-bomb and post-bomb values is 75% less, and the time delay between the two time series peaks is 16 ± 3 years. A model is developed using atmosphere 14C and 13C data, fractionation processes and three soil organic matter components whose mean turnover rates are different. The linear correlation coefficient between modeled and measured activities is 0.99. These results, combined with two other stalagmite 14C time series already published and compared with local vegetation and climate, demonstrate that most of the carbon transfer dynamics are controlled in the soil by soil organic matter degradation rates. Where vegetation produces debris whose degradation is slow, the fraction of old carbon injected into the system increases, the observed 14C time series is much more damped, and the lag time is longer than that observed under grassland sites. The same mixing model applied to the 13C shows a good agreement ( R2 = 0.78) between modeled and measured stalagmite δ 13C and demonstrates that the Suess Effect due to fossil fuel combustion in the atmosphere is recorded in the stalagmite but with a damped effect due to SOM degradation rate. The different sources of dead carbon in the seepage water are calculated and discussed.
TTSD has completed a series of technology transfer and risk communication handbooks, case studies, and summary reports for community-based environmental monitoring projects under EPA's Real-Time Environmental Monitoring, Data Delivery, and Public Outreach Program. The Program tak...
A Markovian Entropy Measure for the Analysis of Calcium Activity Time Series
Rahman, Atiqur; Odorizzi, Laura; LeFew, Michael C.; Golino, Caroline A.; Kemper, Peter; Saha, Margaret S.
2016-01-01
Methods to analyze the dynamics of calcium activity often rely on visually distinguishable features in time series data such as spikes, waves, or oscillations. However, systems such as the developing nervous system display a complex, irregular type of calcium activity which makes the use of such methods less appropriate. Instead, for such systems there exists a class of methods (including information theoretic, power spectral, and fractal analysis approaches) which use more fundamental properties of the time series to analyze the observed calcium dynamics. We present a new analysis method in this class, the Markovian Entropy measure, an easily implemented calcium time series analysis method that represents the observed calcium activity as a realization of a Markov Process and describes its dynamics in terms of the level of predictability underlying the transitions between the states of the process. We applied our method and other commonly used calcium analysis methods to a dataset from Xenopus laevis neural progenitors, which displays irregular calcium activity, and a dataset from murine synaptic neurons, which displays activity time series that are well-described by visually-distinguishable features. We find that the Markovian Entropy measure is able to distinguish between biologically distinct populations in both datasets, and that it can separate biologically distinct populations to a greater extent than other methods in the dataset exhibiting irregular calcium activity. These results support the benefit of using the Markovian Entropy measure to analyze calcium dynamics, particularly for studies using time series data which do not exhibit easily distinguishable features. PMID:27977764
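A minimal sketch of a Markovian entropy-style measure: quantize the series into states, estimate the empirical transition matrix, and compute an entropy rate. The discretization scheme and state count here are assumptions, not the authors' exact formulation:

```python
import numpy as np

def markov_entropy(x, n_states=4):
    """Quantize a series into states, estimate the first-order transition
    matrix, and return the entropy rate (average unpredictability of
    state-to-state transitions), in bits."""
    bins = np.quantile(x, np.linspace(0, 1, n_states + 1)[1:-1])
    states = np.digitize(x, bins)
    counts = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    P = counts / counts.sum(axis=1, keepdims=True)
    stationary = counts.sum(axis=1) / counts.sum()
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log2(np.where(P > 0, P, 1.0)), 0.0)
    return -np.sum(stationary[:, None] * P * logs)

rng = np.random.default_rng(9)
irregular = rng.normal(size=5000)                    # unpredictable transitions
regular = np.sin(2 * np.pi * np.arange(5000) / 100)  # predictable transitions
print(round(markov_entropy(irregular), 2), round(markov_entropy(regular), 2))
```

Irregular activity gives an entropy rate near the 2-bit maximum for four states, while an oscillatory signal scores far lower, which is the kind of contrast the measure exploits.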
Visibility graph network analysis of natural gas price: The case of North American market
NASA Astrophysics Data System (ADS)
Sun, Mei; Wang, Yaqi; Gao, Cuixia
2016-11-01
Fluctuations in prices of natural gas significantly affect the global economy. Therefore, the research on the characteristics of natural gas price fluctuations, turning points and its influencing cycle on the subsequent price series is of great significance. Global natural gas trade concentrates on three regional markets: the North American market, the European market and the Asia-Pacific market, with North America having the most developed natural gas financial market. In addition, perfect legal supervision and coordinated regulations make the North American market more open and more competitive. This paper focuses on the North American natural gas market specifically. The Henry Hub natural gas spot price time series is converted to a visibility graph network which provides a new direction for macro analysis of time series, and several indicators are investigated: degree and degree distribution, the average shortest path length and community structure. The internal mechanisms underlying price fluctuations are explored through the indicators. The results show that the natural gas prices visibility graph network (NGP-VGN) is of small-world and scale-free properties simultaneously. After random rearrangement of the original price time series, the degree distribution of the network becomes an exponential distribution, different from the original one. This means that the original price time series has a long-range negatively correlated fractal characteristic. In addition, nodes with large degree correspond to significant geopolitical or economic events. Communities correspond to time cycles in the visibility graph network. The cycles of time series and the impact scope of hubs can be found by community structure partition.
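The natural visibility graph construction underlying this analysis links two time points whenever the straight line between them clears every intermediate observation. A direct O(n²) sketch on a toy series shows how a price spike becomes a high-degree hub:

```python
def visibility_graph(y):
    """Natural visibility graph: nodes are time points; (a, b) are linked
    if every intermediate point lies strictly below the line joining them."""
    n = len(y)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.add((a, b))
    return edges

# Toy series with one spike: the spike sees everything and becomes a hub.
y = [1.0, 2.0, 1.5, 9.0, 1.2, 2.2, 1.1]
edges = visibility_graph(y)
degree = [sum(1 for e in edges if i in e) for i in range(len(y))]
print(degree)
```

Node 3, the spike, connects to every other node, mirroring the paper's observation that large-degree nodes correspond to significant events.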
Bayesian inference of selection in a heterogeneous environment from genetic time-series data.
Gompert, Zachariah
2016-01-01
Evolutionary geneticists have sought to characterize the causes and molecular targets of selection in natural populations for many years. Although this research programme has been somewhat successful, most statistical methods employed were designed to detect consistent, weak to moderate selection. In contrast, phenotypic studies in nature show that selection varies in time and that individual bouts of selection can be strong. Measurements of the genomic consequences of such fluctuating selection could help test and refine hypotheses concerning the causes of ecological specialization and the maintenance of genetic variation in populations. Herein, I proposed a Bayesian nonhomogeneous hidden Markov model to estimate effective population sizes and quantify variable selection in heterogeneous environments from genetic time-series data. The model is described and then evaluated using a series of simulated data, including cases where selection occurs on a trait with a simple or polygenic molecular basis. The proposed method accurately distinguished neutral loci from non-neutral loci under strong selection, but not from those under weak selection. Selection coefficients were accurately estimated when selection was constant or when the fitness values of genotypes varied linearly with the environment, but these estimates were less accurate when fitness was polygenic or the relationship between the environment and the fitness of genotypes was nonlinear. Past studies of temporal evolutionary dynamics in laboratory populations have been remarkably successful. The proposed method makes similar analyses of genetic time-series data from natural populations more feasible and thereby could help answer fundamental questions about the causes and consequences of evolution in the wild. © 2015 John Wiley & Sons Ltd.
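The kind of genetic time series the model consumes can be generated with a standard Wright-Fisher simulation with selection. This sketch is illustrative only and is not the proposed hidden Markov method; the parameter values are invented:

```python
import numpy as np

def wright_fisher(n_gen, pop_size, s, p0=0.5, seed=0):
    """Allele-frequency time series under Wright-Fisher drift with
    selection coefficient s (s = 0 gives a neutral locus)."""
    rng = np.random.default_rng(seed)
    p = p0
    traj = [p]
    for _ in range(n_gen):
        # Deterministic selection step, then binomial sampling (drift).
        p_sel = p * (1 + s) / (p * (1 + s) + (1 - p))
        p = rng.binomial(2 * pop_size, p_sel) / (2 * pop_size)
        traj.append(p)
    return np.array(traj)

neutral = wright_fisher(50, 500, 0.0)
selected = wright_fisher(50, 500, 0.1)
print(round(neutral[-1], 2), round(selected[-1], 2))
```

A selected locus sweeps towards fixation while the neutral one wanders by drift; distinguishing the two patterns, and estimating s, is the inference task the abstract describes.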
NASA Astrophysics Data System (ADS)
Mehrvand, Masoud; Baghanam, Aida Hosseini; Razzaghzadeh, Zahra; Nourani, Vahid
2017-04-01
Since statistical downscaling methods are the most widely used models in hydrologic impact studies under climate change scenarios, nonlinear regression models known as Artificial Intelligence (AI)-based models, such as the Artificial Neural Network (ANN) and the Support Vector Machine (SVM), have been used to spatially downscale the precipitation outputs of Global Climate Models (GCMs). The study has been carried out using GCM and station data over GCM grid points located around the Peace-Tampa Bay watershed weather stations. Before downscaling with the AI-based model, correlation coefficients were computed between a few selected large-scale predictor variables and the local-scale predictands in order to select the most effective predictors. The selected predictors were then assessed considering the grid location of the site in question. To increase the accuracy of the AI-based downscaling model, pre-processing of the precipitation time series was carried out: the precipitation data derived from the various GCMs were analyzed thoroughly to find the highest correlation coefficient between GCM-based historical data and station precipitation data. Both GCM and station precipitation time series were assessed by comparing means and variances over specific intervals. Results indicated that there is a similar trend between GCM and station precipitation data; however, the station data form a non-stationary time series while the GCM data do not. Finally, the AI-based downscaling model was applied to several GCMs with the selected predictors, targeting the local precipitation time series as predictand. The results of this step were used to produce multiple ensembles of downscaled AI-based models.
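The predictor-screening step described above reduces to ranking candidate large-scale variables by their correlation with the local predictand. A toy sketch with invented predictor names and an assumed cutoff:

```python
import numpy as np

rng = np.random.default_rng(11)
n = 500
# Hypothetical large-scale predictors: two informative, two irrelevant.
humidity = rng.normal(size=n)
pressure = rng.normal(size=n)
noise1, noise2 = rng.normal(size=n), rng.normal(size=n)
precip = 0.7 * humidity - 0.5 * pressure + 0.3 * rng.normal(size=n)

predictors = {"humidity": humidity, "pressure": pressure,
              "noise1": noise1, "noise2": noise2}
scores = {name: abs(np.corrcoef(v, precip)[0, 1])
          for name, v in predictors.items()}
selected = [name for name, r in scores.items() if r > 0.3]  # assumed cutoff
print(sorted(selected))
```

Only the informative predictors survive the screen; these would then be fed to the ANN or SVM downscaling model.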
A Methodological Framework for Model Selection in Interrupted Time Series Studies.
Lopez Bernal, J; Soumerai, S; Gasparrini, A
2018-06-06
Interrupted time series is a powerful and increasingly popular design for evaluating public health and health service interventions. The design involves analysing trends in the outcome of interest and estimating the change in trend following an intervention relative to the counterfactual (the expected ongoing trend if the intervention had not occurred). There are two key components to modelling this effect: first, defining the counterfactual; second, defining the type of effect that the intervention is expected to have on the outcome, known as the impact model. The counterfactual is defined by extrapolating the underlying trends observed before the intervention to the post-intervention period. In doing this, authors must consider the pre-intervention period that will be included, any time-varying confounders, whether trends may vary within different subgroups of the population and whether trends are linear or non-linear. Defining the impact model involves specifying the parameters that model the intervention, including for instance whether to allow for an abrupt level change or a gradual slope change, whether to allow for a lag before any effect on the outcome, whether to allow a transition period during which the intervention is being implemented and whether a ceiling or floor effect might be expected. Inappropriate model specification can bias the results of an interrupted time series analysis, and using a model that is not closely tailored to the intervention or testing multiple models increases the risk of false positives being detected. It is important that authors use substantive knowledge to customise their interrupted time series model a priori to the intervention and outcome under study. Where there is uncertainty in model specification, authors should consider using separate data sources to define the intervention, running limited sensitivity analyses or undertaking initial exploratory studies. Copyright © 2018. Published by Elsevier Inc.
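A common way to implement the counterfactual-plus-impact-model idea is segmented regression with a level-change and a slope-change term. The following is a generic sketch on synthetic data, not the authors' specification; the intervention month and effect sizes are invented.

```python
import numpy as np

# Synthetic monthly outcome: baseline trend, then an abrupt level drop of 10
# units at the intervention (month 60) plus a slope change of -0.2 per month.
t = np.arange(120)
intervention = (t >= 60).astype(float)          # level-change indicator
time_since = np.where(t >= 60, t - 60, 0.0)     # slope-change term
rng = np.random.default_rng(1)
y = 50 + 0.3 * t - 10 * intervention - 0.2 * time_since + rng.normal(0, 1, t.size)

# Segmented-regression design: [intercept, baseline trend, level change, slope change]
X = np.column_stack([np.ones_like(t, dtype=float), t, intervention, time_since])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
baseline_slope, level_change, slope_change = beta[1], beta[2], beta[3]
```

The counterfactual is the extrapolation of `intercept + baseline_slope * t` past month 60; the last two coefficients estimate the intervention's abrupt and gradual effects.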
Towards seasonal forecasting of malaria in India.
Lauderdale, Jonathan M; Caminade, Cyril; Heath, Andrew E; Jones, Anne E; MacLeod, David A; Gouda, Krushna C; Murty, Upadhyayula Suryanarayana; Goswami, Prashant; Mutheneni, Srinivasa R; Morse, Andrew P
2014-08-10
Malaria presents a public health challenge despite extensive intervention campaigns. A 30-year hindcast of the climatic suitability for malaria transmission in India is presented, using meteorological variables from a state-of-the-art seasonal forecast model to drive a process-based, dynamic disease model. The spatial distribution and seasonal cycles of temperature and precipitation from the forecast model are compared to three observationally based meteorological datasets. These time series are then used to drive the disease model, producing a simulated forecast of malaria and three synthetic malaria time series that are qualitatively compared to contemporary and pre-intervention malaria estimates. The area under the Relative Operating Characteristic (ROC) curve is calculated as a quantitative metric of forecast skill, comparing the forecast to the meteorologically driven synthetic malaria time series. The forecast shows probabilistic skill in predicting the spatial distribution of Plasmodium falciparum incidence when compared to the simulated meteorologically driven malaria time series, particularly where modelled incidence shows high seasonal and interannual variability, such as in Orissa, West Bengal, and Jharkhand (north-east India), and Gujarat, Rajasthan, Madhya Pradesh and Maharashtra (north-west India). Focusing on these two regions, the malaria forecast is able to distinguish between years of "high", "above average" and "low" malaria incidence in the peak malaria transmission seasons, with more than 70% sensitivity and a statistically significant area under the ROC curve. These results are encouraging given that the three-month forecast lead time used is well in excess of the target for early warning systems adopted by the World Health Organization. This approach could form the basis of an operational system to identify the probability of regional malaria epidemics, allowing advanced and targeted allocation of resources for combating malaria in India.
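The ROC-based skill metric can be computed from forecast probabilities and observed outcomes via the rank-sum (Mann-Whitney) identity. This toy example is illustrative only and is unrelated to the actual malaria data.

```python
import numpy as np

def roc_auc(scores, labels):
    """Area under the ROC curve via the rank-sum identity: the fraction of
    (positive, negative) pairs ranked correctly, with ties counting 1/2."""
    scores = np.asarray(scores, float)
    labels = np.asarray(labels, bool)
    pos, neg = scores[labels], scores[~labels]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (pos.size * neg.size)

# Toy forecast: probabilities of a "high malaria" season vs. observed outcome.
auc = roc_auc([0.9, 0.8, 0.7, 0.4, 0.3, 0.2], [1, 1, 0, 1, 0, 0])
```

Here 8 of the 9 positive/negative pairs are ordered correctly, so the AUC is 8/9; an AUC of 0.5 would indicate no skill.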
Code of Federal Regulations, 2014 CFR
2014-04-01
... fireplace stoves certified under the HUD Building Products Certification Program shall be designed... and manufacturer series or model number; and (iv) The type of fuel to be used. (2) The certification... Accreditation Program. (2) The administrator shall visit the manufacturer's facility two times a year to assure...
A Seasonal Time-Series Model Based on Gene Expression Programming for Predicting Financial Distress.
Cheng, Ching-Hsue; Chan, Chia-Pang; Yang, Jun-He
2018-01-01
Financial distress prediction is an important and challenging research topic in finance. Many methods now exist for predicting firm bankruptcy and financial crises, including artificial intelligence and traditional statistical methods, and past studies have shown that artificial intelligence methods outperform traditional statistical methods. Financial statements are issued quarterly; hence, corporate financial distress is seasonal time-series data, and the attributes affecting the financial distress of companies are nonlinear and nonstationary time series with fluctuations. Therefore, this study employed a nonlinear attribute selection method to build a nonlinear financial distress prediction model: that is, this paper proposed a novel seasonal time-series gene expression programming model for predicting the financial distress of companies. The proposed model has several advantages: (i) unlike previous models, it incorporates the concept of time series; (ii) the integrated attribute selection method can find the core attributes and reduce high-dimensional data; and (iii) the model can generate rules and mathematical formulas of financial distress as references for investors and decision makers. The results show that the proposed method outperforms the listed classifiers under three criteria; hence, the proposed model has competitive advantages in predicting the financial distress of companies.
NASA Technical Reports Server (NTRS)
Herman, J.
2004-01-01
The amount of UV irradiance reaching the Earth's surface is estimated from the measured cloud reflectivity, ozone, aerosol amounts, and surface reflectivity time series from 1980 to 1992 and 1997 to 2000 to estimate changes that have occurred over a 21-year period. Recent analysis of the TOMS data shows that there has been an apparent increase in reflectivity (decrease in UV) in the Southern Hemisphere that is related to a calibration error in EP-TOMS. Data from the well-calibrated SeaWiFS satellite instrument have been used to correct the EP-TOMS reflectivity and UV time series. After correction, some of the local trend features seen in the N7 time series (1980 to 1992) have continued in the combined time series, but the overall zonal average and global trends have changed. In addition to correcting the EP-TOMS radiance calibration, the use of SeaWiFS cloud data permits estimation of UV irradiance at higher spatial resolution (1 to 4 km) than is available from TOMS (100 km), under the assumption that ozone is slowly varying over a scale of 100 km. The key results include a continuing decrease in cloud cover over Europe and North America with a corresponding increase in UV, and a decrease in UV irradiance near Antarctica.
Road safety forecasts in five European countries using structural time series models.
Antoniou, Constantinos; Papadimitriou, Eleonora; Yannis, George
2014-01-01
Modeling road safety development is a complex task that needs to consider both the quantifiable impact of specific parameters and the underlying trends that cannot always be measured or observed. The objective of this research is to apply structural time series models for obtaining reliable medium- to long-term forecasts of road traffic fatality risk using data from five countries with different characteristics from all over Europe (Cyprus, Greece, Hungary, Norway, and Switzerland). Two structural time series models are considered: (1) the local linear trend model and (2) the latent risk time series model. Furthermore, a structured decision tree for the selection of the applicable model for each situation (developed within the Road Safety Data, Collection, Transfer and Analysis [DaCoTA] research project, cofunded by the European Commission) is outlined. First, the fatality and exposure data that are used for the development of the models are presented and explored. Then, the modeling process is presented, including the model selection process, introduction of intervention variables, and development of mobility scenarios. The forecasts using the developed models appear to be realistic and within acceptable confidence intervals. The proposed methodology proves to be very efficient for handling different cases of data availability and quality, providing an appropriate alternative from the family of structural time series models in each country. A concluding section providing perspectives and directions for future research is presented.
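The local linear trend model mentioned above is a two-state (level, slope) state-space model. A bare-bones Kalman filter sketch on synthetic data follows; the initialization and disturbance variances are illustrative choices, not the paper's calibrated values.

```python
import numpy as np

def local_linear_trend_filter(y, sigma_eps, sigma_level, sigma_slope):
    """Kalman filter for the local linear trend model:
       y_t = level_t + eps_t,  level_{t+1} = level_t + slope_t + xi_t,
       slope_{t+1} = slope_t + zeta_t."""
    T = np.array([[1.0, 1.0], [0.0, 1.0]])    # state transition
    Z = np.array([1.0, 0.0])                  # observation vector
    Q = np.diag([sigma_level**2, sigma_slope**2])
    a = np.array([y[0], 0.0])                 # rough diffuse start at first obs
    P = np.eye(2) * 1e4
    for obs in y:
        v = obs - Z @ a                       # one-step prediction error
        F = Z @ P @ Z + sigma_eps**2
        K = P @ Z / F                         # Kalman gain
        a = a + K * v                         # measurement update
        P = P - np.outer(K, Z @ P)
        a = T @ a                             # time update (predict next state)
        P = T @ P @ T.T + Q
    return a  # one-step-ahead forecast of [level, slope]

rng = np.random.default_rng(2)
t = np.arange(200)
y = 100 + 0.5 * t + rng.normal(0, 1, t.size)  # synthetic risk series, linear trend
level, slope = local_linear_trend_filter(y, sigma_eps=1.0,
                                         sigma_level=0.1, sigma_slope=0.001)
```

With small slope-disturbance variance the filter behaves like a global linear fit; larger variances let the trend adapt locally, which is the point of the structural approach.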
29 CFR 102.22 - Extension of time for filing.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 29 Labor 2 2010-07-01 2010-07-01 false Extension of time for filing. 102.22 Section 102.22 Labor Regulations Relating to Labor NATIONAL LABOR RELATIONS BOARD RULES AND REGULATIONS, SERIES 8 Procedure Under... of time for filing. Upon his own motion or upon proper cause shown by any other party the regional...
Finding hidden periodic signals in time series - an application to stock prices
NASA Astrophysics Data System (ADS)
O'Shea, Michael
2014-03-01
Data in the form of time series appear in many areas of science. In cases where the periodicity is apparent and the only other contribution to the time series is stochastic in origin, the data can be `folded' to improve the signal-to-noise ratio, and this has been done for light curves of variable stars, with the folding resulting in a cleaner light curve signal. Stock index prices versus time are classic examples of time series. Repeating patterns have been claimed by many workers and include unusually large returns on small-cap stocks during the month of January, and small returns on the Dow Jones Industrial Average (DJIA) in the months June through September compared to the rest of the year. Such observations imply that these prices have a periodic component. We investigate this for the DJIA. If such a component exists it is hidden in a large non-periodic variation and a large stochastic variation. We show how to extract this periodic component and, for the first time, reveal its yearly (averaged) shape. This periodic component leads directly to the `Sell in May and buy at Halloween' adage. We also drill down and show that this yearly variation emerges from approximately half of the underlying stocks making up the DJIA index.
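Phase-folding on a known period, as used for variable-star light curves and proposed here for index returns, can be sketched as follows. The synthetic signal and the 12-sample period are arbitrary illustrations.

```python
import numpy as np

def fold(values, period, n_bins):
    """Phase-fold a series sampled at integer times and average within phase bins."""
    phase = (np.arange(len(values)) % period) / period
    bins = (phase * n_bins).astype(int)
    return np.array([values[bins == b].mean() for b in range(n_bins)])

# A periodic signal (period 12) buried in noise of equal amplitude:
# folding 100 cycles averages the noise down and recovers the shape.
rng = np.random.default_rng(3)
t = np.arange(1200)
series = np.sin(2 * np.pi * t / 12) + rng.normal(0, 1.0, t.size)
profile = fold(series, period=12, n_bins=12)
```

Each phase bin averages 100 samples, so the noise contribution shrinks by a factor of 10 while the periodic shape survives.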
Shao, Ying-Hui; Gu, Gao-Feng; Jiang, Zhi-Qiang; Zhou, Wei-Xing; Sornette, Didier
2012-01-01
Notwithstanding the significant efforts to develop estimators of long-range correlations (LRC) and to compare their performance, no clear consensus exists on what is the best method and under which conditions. In addition, synthetic tests suggest that the performance of LRC estimators varies when using different generators of LRC time series. Here, we compare the performances of four estimators [Fluctuation Analysis (FA), Detrended Fluctuation Analysis (DFA), Backward Detrending Moving Average (BDMA), and Centred Detrending Moving Average (CDMA)]. We use three different generators [Fractional Gaussian Noises, and two ways of generating Fractional Brownian Motions]. We find that CDMA has the best performance and DFA is only slightly worse in some situations, while FA performs the worst. In addition, CDMA and DFA are less sensitive to the scaling range than FA. Hence, CDMA and DFA remain “The Methods of Choice” in determining the Hurst index of time series. PMID:23150785
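A compact sketch of DFA, one of the four estimators compared above; for white noise the fitted scaling exponent should be close to 0.5. The scales and series length are illustrative choices.

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis (order 1): returns the scaling exponent
    from a log-log fit of fluctuation size versus window scale."""
    profile = np.cumsum(x - np.mean(x))
    flucts = []
    for s in scales:
        n_seg = len(profile) // s
        segs = profile[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        f2 = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)       # linear detrend per window
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

rng = np.random.default_rng(4)
white = rng.normal(size=4096)
h = dfa(white, scales=[16, 32, 64, 128, 256])
```

For a stationary fractional Gaussian noise the DFA exponent estimates the Hurst index; as the abstract notes, sensitivity to the chosen scaling range is part of what distinguishes the estimators.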
Dimensionless embedding for nonlinear time series analysis
NASA Astrophysics Data System (ADS)
Hirata, Yoshito; Aihara, Kazuyuki
2017-09-01
Recently, infinite-dimensional delay coordinates (InDDeCs) have been proposed for predicting high-dimensional dynamics instead of conventional delay coordinates. Although InDDeCs can realize faster computation and more accurate short-term prediction, it is still not well known whether InDDeCs can be used in other applications of nonlinear time series analysis in which the underlying dynamics must be reconstructed from a scalar time series generated by a dynamical system. Here, we give theoretical support for justifying the use of InDDeCs and provide numerical examples to show that InDDeCs can be used for various applications: obtaining recurrence plots, correlation dimensions, and maximal Lyapunov exponents, as well as testing directional couplings and extracting slow-driving forces. We demonstrate the performance of InDDeCs using weather data. Thus, InDDeCs can eventually realize "dimensionless embedding" while we enjoy faster and more reliable computations.
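Conventional delay-coordinate reconstruction, the baseline against which InDDeCs are compared, can be sketched in a few lines. The logistic-map series is used purely as an example of a scalar observable from a dynamical system.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Conventional delay-coordinate reconstruction of a scalar series:
    row t is (x_t, x_{t+tau}, ..., x_{t+(dim-1)*tau})."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

# Scalar series from the chaotic logistic map, embedded in 2-D with lag 1.
x = np.empty(500)
x[0] = 0.4
for i in range(499):
    x[i + 1] = 3.9 * x[i] * (1 - x[i])
emb = delay_embed(x, dim=2, tau=1)
```

The embedded points trace out the map's attractor (here the parabola x_{t+1} = 3.9 x_t (1 - x_t)), which is the reconstruction property that recurrence plots and Lyapunov estimates rely on.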
A low free-parameter stochastic model of daily Forbush decrease indices
NASA Astrophysics Data System (ADS)
Patra, Sankar Narayan; Bhattacharya, Gautam; Panja, Subhash Chandra; Ghosh, Koushik
2014-01-01
Forbush decrease is a rapid decrease in the observed galactic cosmic ray intensity occurring after a coronal mass ejection. In the present paper we analyze the daily Forbush decrease indices from January 1967 to December 2003 generated at IZMIRAN, Russia. First, the entire series of indices was smoothed; next, we attempted to fit a suitable stochastic model to the time series using the necessary number of process parameters. The study reveals that the present time series is governed by a stationary autoregressive process of order 2 with a trace of white noise. Under the present model we show that chaos is not expected in this time series, which opens up the possibility of validating its forecasting (both short-term and long-term) as well as its multi-periodic behavior.
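Fitting a stationary AR(2) process, as identified above for the smoothed indices, can be sketched via the Yule-Walker equations on simulated data. The coefficients here are illustrative, not the paper's estimates.

```python
import numpy as np

def yule_walker_ar2(x):
    """Estimate AR(2) coefficients from the sample autocorrelations r1, r2
    by solving the 2x2 Yule-Walker system."""
    x = x - x.mean()
    c0 = np.dot(x, x) / len(x)
    r1 = np.dot(x[:-1], x[1:]) / len(x) / c0
    r2 = np.dot(x[:-2], x[2:]) / len(x) / c0
    # solve [[1, r1], [r1, 1]] @ [phi1, phi2] = [r1, r2]
    return np.linalg.solve(np.array([[1.0, r1], [r1, 1.0]]),
                           np.array([r1, r2]))

rng = np.random.default_rng(5)
n = 20000
x = np.zeros(n)
for i in range(2, n):   # simulate a stationary AR(2) driven by white noise
    x[i] = 0.6 * x[i - 1] - 0.3 * x[i - 2] + rng.normal()
phi1, phi2 = yule_walker_ar2(x)
```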
2014-01-01
A brief overview is provided of cosinor-based techniques for the analysis of time series in chronobiology. Conceived as a regression problem, the method is applicable to non-equidistant data, a major advantage. Another dividend is the feasibility of deriving confidence intervals for parameters of rhythmic components of known periods, readily drawn from the least squares procedure, stressing the importance of prior (external) information. Originally developed for the analysis of short and sparse data series, the extended cosinor has been further developed for the analysis of long time series, focusing both on rhythm detection and parameter estimation. Attention is given to the assumptions underlying the use of the cosinor and ways to determine whether they are satisfied. In particular, ways of dealing with non-stationary data are presented. Examples illustrate the use of the different cosinor-based methods, extending their application from the study of circadian rhythms to the mapping of broad time structures (chronomes). PMID:24725531
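The single-component cosinor reduces to ordinary least squares in the parameters (mesor, beta, gamma), which is what makes non-equidistant sampling unproblematic, as the overview emphasizes. A sketch with simulated irregular sampling (all values illustrative):

```python
import numpy as np

def cosinor_fit(t, y, period):
    """Single-component cosinor: y ≈ M + A*cos(2*pi*t/period + phi).
    Linear in (M, beta, gamma) with beta = A*cos(phi), gamma = -A*sin(phi),
    so it works for non-equidistant sampling times t."""
    w = 2 * np.pi * t / period
    X = np.column_stack([np.ones_like(t), np.cos(w), np.sin(w)])
    m, beta, gamma = np.linalg.lstsq(X, y, rcond=None)[0]
    amplitude = np.hypot(beta, gamma)
    acrophase = np.arctan2(-gamma, beta)
    return m, amplitude, acrophase

# Irregularly sampled circadian rhythm: mesor 10, amplitude 3, period 24 h.
rng = np.random.default_rng(6)
t = np.sort(rng.uniform(0, 96, 200))
y = 10 + 3 * np.cos(2 * np.pi * t / 24 - 1.0) + rng.normal(0, 0.5, t.size)
mesor, amp, phase = cosinor_fit(t, y, period=24)
```

Confidence intervals for the rhythm parameters follow from the usual least-squares machinery, as the overview notes.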
Variable diffusion in stock market fluctuations
NASA Astrophysics Data System (ADS)
Hua, Jia-Chen; Chen, Lijian; Falcon, Liberty; McCauley, Joseph L.; Gunaratne, Gemunu H.
2015-02-01
We analyze intraday fluctuations in several stock indices to investigate the underlying stochastic processes using techniques appropriate for processes with nonstationary increments. The five most actively traded stocks each contain two time intervals during the day where the variance of increments can be fit by power-law scaling in time. The fluctuations in return within these intervals follow asymptotic bi-exponential distributions. The autocorrelation function for increments vanishes rapidly but decays slowly for absolute and squared increments. Based on these results, we propose an intraday stochastic model with a linear variable diffusion coefficient as a lowest-order approximation to the real dynamics of financial markets, and use it to test the effects of time-averaging techniques typically used for financial time series analysis. We find that our model replicates major stylized facts associated with empirical financial time series. We also find that ensemble averaging techniques can be used to identify the underlying dynamics correctly, whereas time averages fail in this task. Our work indicates that ensemble average approaches will yield new insight into the study of financial market dynamics, and our proposed model provides new insight into the modeling of financial market dynamics on microscopic time scales.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Grading Reviews and Appeals § 532.701 General. A prevailing rate employee may at any time appeal the occupational series, grade, or title to which the employee's job is assigned, but may not appeal under this...
Li, Shi; Mukherjee, Bhramar; Batterman, Stuart; Ghosh, Malay
2013-12-01
Case-crossover designs are widely used to study short-term exposure effects on the risk of acute adverse health events. While the frequentist literature on this topic is vast, there is no Bayesian work in this general area. The contribution of this paper is twofold. First, the paper establishes Bayesian equivalence results that require characterization of the set of priors under which the posterior distributions of the risk ratio parameters based on a case-crossover and time-series analysis are identical. Second, the paper studies inferential issues under case-crossover designs in a Bayesian framework. Traditionally, a conditional logistic regression is used for inference on risk-ratio parameters in case-crossover studies. We consider instead a more general full likelihood-based approach which makes less restrictive assumptions on the risk functions. Formulation of a full likelihood leads to growth in the number of parameters proportional to the sample size. We propose a semi-parametric Bayesian approach using a Dirichlet process prior to handle the random nuisance parameters that appear in a full likelihood formulation. We carry out a simulation study to compare the Bayesian methods based on full and conditional likelihood with the standard frequentist approaches for case-crossover and time-series analysis. The proposed methods are illustrated through the Detroit Asthma Morbidity, Air Quality and Traffic study, which examines the association between acute asthma risk and ambient air pollutant concentrations. © 2013, The International Biometric Society.
Application of computational mechanics to the analysis of natural data: an example in geomagnetism.
Clarke, Richard W; Freeman, Mervyn P; Watkins, Nicholas W
2003-01-01
We discuss how the ideal formalism of computational mechanics can be adapted to apply to a non-infinite series of corrupted and correlated data, as is typical of most observed natural time series. Specifically, a simple filter that removes the corruption that creates rare unphysical causal states is demonstrated, and the concept of effective soficity is introduced. We believe that computational mechanics cannot be applied to a noisy and finite data series without invoking an argument based upon effective soficity. A related distinction between noise and unresolved structure is also defined: noise can only be eliminated by increasing the length of the time series, whereas the resolution of previously unresolved structure only requires the finite memory of the analysis to be increased. The benefits of these concepts are demonstrated in a simulated time series by (a) the effective elimination of white noise corruption from a periodic signal using the expletive filter and (b) the appearance of an effectively sofic region in the statistical complexity of a biased Poisson switch time series that is insensitive to changes in the word length (memory) used in the analysis. The new algorithm is then applied to an analysis of a real geomagnetic time series measured at Halley, Antarctica. Two principal components in the structure are detected, interpreted as the diurnal variation due to the rotation of the Earth-based station under an electrical current pattern that is fixed with respect to the Sun-Earth axis, and the random occurrence of a signature likely to be that of the magnetic substorm. In conclusion, some useful terminology for the discussion of model construction in general is introduced.
Temporal dynamics of catchment transit times from stable isotope data
NASA Astrophysics Data System (ADS)
Klaus, Julian; Chun, Kwok P.; McGuire, Kevin J.; McDonnell, Jeffrey J.
2015-06-01
Time-variant catchment transit time distributions are fundamental descriptors of catchment function but are not yet fully understood, characterized, and modeled. Here we present a new approach for use with standard runoff and tracer data sets that is based on tracking of tracer and age information and time-variant catchment mixing. Our new approach is able to deal with nonstationarity of flow paths and catchment mixing and with an irregular shape of the transit time distribution. The approach extracts information on catchment mixing from the stable isotope time series instead of relying on prior assumptions about mixing or the shape of the transit time distribution. We first demonstrate proof of concept of the approach with artificial data; the Nash-Sutcliffe efficiencies in tracer and instantaneous transit times were >0.9. The model provides very accurate estimates of time-variant transit times when the boundary conditions and fluxes are fully known. We then tested the model with real rainfall-runoff flow and isotope tracer time series from the H.J. Andrews Watershed 10 (WS10) in Oregon. Model efficiencies were 0.37 for the 18O modeling for a 2-year time series; the efficiencies increased to 0.86 for the second year, underscoring the need for long tracer time series with a long overlap of tracer input and output. The approach was able to determine the time-variant transit time of WS10 with field data and showed how it follows the storage dynamics and related changes in flow paths, where wet periods with high flows resulted in clearly shorter transit times compared to dry low-flow periods.
Kwasniok, Frank
2013-11-01
A time series analysis method for predicting the probability density of a dynamical system is proposed. A nonstationary parametric model of the probability density is estimated from data within a maximum likelihood framework and then extrapolated to forecast the future probability density and explore the system for critical transitions or tipping points. A full systematic account of parameter uncertainty is given. The technique is generic, independent of the underlying dynamics of the system. The method is verified on simulated data and then applied to the prediction of Arctic sea-ice extent.
A novel Bayesian approach to acoustic emission data analysis.
Agletdinov, E; Pomponi, E; Merson, D; Vinogradov, A
2016-12-01
Acoustic emission (AE) is a popular technique for materials characterization and non-destructive testing. Originating from the stochastic motion of defects in solids, AE is a random process by nature. A challenging problem arises whenever an attempt is made to identify specific points corresponding to changes in the trends of the fluctuating AE time series. A general Bayesian framework is proposed for the analysis of AE time series, aimed at automatically finding the breakpoints signaling a crossover in the dynamics of the underlying AE sources. Copyright © 2016 Elsevier B.V. All rights reserved.
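A maximum-likelihood change-in-mean search illustrates the breakpoint idea in its simplest form; the abstract's method is a fuller Bayesian treatment, so this is only a conceptual sketch on invented data.

```python
import numpy as np

def find_breakpoint(x):
    """Single change-in-mean detector: the split minimising the pooled
    residual sum of squares, i.e. the maximum-likelihood breakpoint under
    Gaussian noise (a point estimate, not a Bayesian posterior)."""
    n = len(x)
    best_k, best_cost = None, np.inf
    for k in range(2, n - 2):
        left, right = x[:k], x[k:]
        cost = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k

rng = np.random.default_rng(7)
# Synthetic AE-like rate: the mean jumps from 1.0 to 3.0 at sample 300.
x = np.concatenate([rng.normal(1.0, 0.5, 300), rng.normal(3.0, 0.5, 200)])
k_hat = find_breakpoint(x)
```

A Bayesian version would place a prior over the breakpoint location and report its full posterior rather than a single argmin.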
FROG: Time Series Analysis for the Web Service Era
NASA Astrophysics Data System (ADS)
Allan, A.
2005-12-01
The FROG application is part of the next-generation Starlink{http://www.starlink.ac.uk} software work (Draper et al. 2005) and is released under the GNU Public License{http://www.gnu.org/copyleft/gpl.html} (GPL). Written in Java, it has been designed for the Web and Grid Service era as an extensible, pluggable tool for time series analysis and display. With an integrated SOAP server, the package's functionality is exposed for use in the user's own code and can be used remotely over the Grid as part of the Virtual Observatory (VO).
AQUAdexIM: highly efficient in-memory indexing and querying of astronomy time series images
NASA Astrophysics Data System (ADS)
Hong, Zhi; Yu, Ce; Wang, Jie; Xiao, Jian; Cui, Chenzhou; Sun, Jizhou
2016-12-01
Astronomy has always been, and will continue to be, a data-based science, and astronomers nowadays are faced with increasingly massive datasets, one key problem of which is to efficiently retrieve the desired cup of data from the ocean. AQUAdexIM, an innovative spatial indexing and querying method, performs highly efficient on-the-fly queries at the user's request to search for time series images from existing observation data on the server side and returns only the desired FITS images, so users no longer need to download entire datasets to their local machines, a practice that will only become more impractical as data sizes keep increasing. Moreover, AQUAdexIM maintains a very low storage space overhead, and its specially designed in-memory index structure enables it to search for time series images of a given area of the sky 10 times faster than using Redis, a state-of-the-art in-memory database.
Wan, Huafang; Cui, Yixin; Ding, Yijuan; Mei, Jiaqin; Dong, Hongli; Zhang, Wenxin; Wu, Shiqi; Liang, Ying; Zhang, Chunyu; Li, Jiana; Xiong, Qing; Qian, Wei
2016-01-01
Understanding the regulation of lipid metabolism is vital for genetic engineering of canola (Brassica napus L.) to increase oil yield or modify oil composition. We conducted time-series analyses of transcriptomes and proteomes to uncover the molecular networks associated with oil accumulation and the dynamic changes in these networks in canola. The expression levels of genes and proteins were measured at 2, 4, 6, and 8 weeks after pollination (WAP). Our results show that the biosynthesis of fatty acids is a dominant cellular process from 2 to 6 WAP, while degradation mainly happens after 6 WAP. We found that genes in almost every node of the fatty acid synthesis pathway were significantly up-regulated during oil accumulation. Moreover, significant expression changes of two genes, acetyl-CoA carboxylase and acyl-ACP desaturase, were detected at both the transcriptomic and proteomic levels. We confirmed the temporal expression patterns revealed by the transcriptomic analyses using quantitative real-time PCR experiments. The gene set association analyses show that the biosynthesis of fatty acids and of unsaturated fatty acids are the most significant biological processes from 2-4 WAP and 4-6 WAP, respectively, which is consistent with the results of the time-series analyses. These results not only provide insight into the mechanisms underlying lipid metabolism, but also reveal novel candidate genes that are worth further investigation for their value in the genetic engineering of canola.
NASA Astrophysics Data System (ADS)
Srivastav, R. K.; Chinnapa Reddy, A. R.
2015-12-01
Recent trends in climate, land-use patterns and population have affected almost every potable water resource in the world. Owing to depleting surface water and the untimely distribution of precipitation, the demand for groundwater has increased considerably. Furthermore, recent studies have shown that groundwater stress is greater in developing countries such as India. This study focuses on understanding the impacts of three major factors (i.e., rainfall, land cover and population growth) affecting groundwater levels. For this purpose, the trends in groundwater time series are correlated with trends in rainfall, land cover and population growth. To detect trends in the time series, two statistical methods, the least squares method and the Mann-Kendall test, are adopted. The results were analyzed based on measurements from 1800 observation wells in the state of Karnataka, India, over a nine-year period (2005-2013). Gridded precipitation data at 0.5° × 0.5° resolution over the entire region were used. Land-cover change and population data were obtained, approximately, from the local governing bodies. Early results show significant correlation between trends in the rainfall and groundwater time series. The outcomes will help assess the vulnerability of groundwater levels under changing physical and hydroclimatic conditions, especially under climate change.
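The Mann-Kendall test mentioned above is a standard nonparametric trend detector. As an illustrative sketch only (not the study's implementation), a minimal pure-Python version using the no-ties variance formula:

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test: S statistic and normal-approximation Z.

    S counts concordant minus discordant pairs over all i < j; Z uses the
    no-ties variance formula var(S) = n(n-1)(2n+5)/18 with the usual
    continuity correction.
    """
    n = len(x)
    s = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            d = x[j] - x[i]
            s += (d > 0) - (d < 0)
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# A strictly increasing series attains the maximum S = n(n-1)/2.
s, z = mann_kendall([1, 2, 3, 4, 5, 6, 7, 8, 9, 10])
```

A |Z| above roughly 1.96 would indicate a significant monotonic trend at the 5% level; applying this per-well to groundwater levels and to the co-located rainfall series is one plausible way to build the trend comparison the abstract describes.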
Reconstruction of network topology using status-time-series data
NASA Astrophysics Data System (ADS)
Pandey, Pradumn Kumar; Badarla, Venkataramana
2018-01-01
Uncovering the heterogeneous connection pattern of a networked system from the available status-time-series (STS) data of a dynamical process on the network is of great interest in network science and known as a reverse engineering problem. Dynamical processes on a network are affected by the structure of the network. The dependency between the diffusion dynamics and structure of the network can be utilized to retrieve the connection pattern from the diffusion data. Information of the network structure can help to devise the control of dynamics on the network. In this paper, we consider the problem of network reconstruction from the available status-time-series (STS) data using matrix analysis. The proposed method of network reconstruction from the STS data is tested successfully under susceptible-infected-susceptible (SIS) diffusion dynamics on real-world and computer-generated benchmark networks. High accuracy and efficiency of the proposed reconstruction procedure from the status-time-series data define the novelty of the method. Our proposed method outperforms compressed sensing theory (CST) based method of network reconstruction using STS data. Further, the same procedure of network reconstruction is applied to the weighted networks. The ordering of the edges in the weighted networks is identified with high accuracy.
Retrieving hydrological connectivity from empirical causality in karst systems
NASA Astrophysics Data System (ADS)
Delforge, Damien; Vanclooster, Marnik; Van Camp, Michel; Poulain, Amaël; Watlet, Arnaud; Hallet, Vincent; Kaufmann, Olivier; Francis, Olivier
2017-04-01
Because of their complexity, karst systems exhibit nonlinear dynamics. Moreover, if one attempts to model a karst, the hidden behavior complicates the choice of the most suitable model. Therefore, both intense investigation methods and nonlinear data analysis are needed to reveal the underlying hydrological connectivity as a prior for a consistent physically based modelling approach. Convergent Cross Mapping (CCM), a recent method, promises to identify causal relationships between time series belonging to the same dynamical systems. The method is based on phase space reconstruction and is suitable for nonlinear dynamics. As an empirical causation detection method, it could be used to highlight the hidden complexity of a karst system by revealing its inner hydrological and dynamical connectivity. Hence, if one can link causal relationships to physical processes, the method should show great potential to support physically based model structure selection. We present the results of numerical experiments using karst model blocks combined in different structures to generate time series from actual rainfall series. CCM is applied between the time series to investigate if the empirical causation detection is consistent with the hydrological connectivity suggested by the karst model.
A Physiological Time Series Dynamics-Based Approach to Patient Monitoring and Outcome Prediction
Lehman, Li-Wei H.; Adams, Ryan P.; Mayaud, Louis; Moody, George B.; Malhotra, Atul; Mark, Roger G.; Nemati, Shamim
2015-01-01
Cardiovascular variables such as heart rate (HR) and blood pressure (BP) are regulated by an underlying control system, and therefore, the time series of these vital signs exhibit rich dynamical patterns of interaction in response to external perturbations (e.g., drug administration), as well as pathological states (e.g., onset of sepsis and hypotension). A question of interest is whether “similar” dynamical patterns can be identified across a heterogeneous patient cohort, and be used for prognosis of patients’ health and progress. In this paper, we used a switching vector autoregressive framework to systematically learn and identify a collection of vital sign time series dynamics, which are possibly recurrent within the same patient and may be shared across the entire cohort. We show that these dynamical behaviors can be used to characterize the physiological “state” of a patient. We validate our technique using simulated time series of the cardiovascular system, and human recordings of HR and BP time series from an orthostatic stress study with known postural states. Using the HR and BP dynamics of an intensive care unit (ICU) cohort of over 450 patients from the MIMIC II database, we demonstrate that the discovered cardiovascular dynamics are significantly associated with hospital mortality (dynamic modes 3 and 9, p = 0.001, p = 0.006 from logistic regression after adjusting for the APACHE scores). Combining the dynamics of BP time series and SAPS-I or APACHE-III provided a more accurate assessment of patient survival/mortality in the hospital than using SAPS-I and APACHE-III alone (p = 0.005 and p = 0.045). Our results suggest that the discovered dynamics of vital sign time series may contain additional prognostic value beyond that of the baseline acuity measures, and can potentially be used as an independent predictor of outcomes in the ICU. PMID:25014976
Kojima, Motonaga; Obuchi, Shuichi; Mizuno, Kousuke; Henmi, Osamu; Ikeda, Noriaki
2008-06-01
We propose a novel indicator for smoothness of movement, i.e., the power spectrum entropy of the acceleration time-series, and compare it with conventional indices of smoothness. For this purpose, nineteen healthy adults (21.3 ± 2.5 years old) performed the task of raising and lowering a beaker between the level of the umbilicus and eye level under the following two conditions: one with the beaker containing water and the other with the beaker containing a weight of the same mass as the water. Moving the beaker up and down when it contained water required extra control to prevent the water from being spilled. This means that movement was not as smooth as when the beaker contained a weight. Under these two conditions, entropy was measured along with a traditional indicator of smoothness of movement, the jerk index. The entropy could distinguish just as well as the jerk index (p<0.01) between when water was used and when the weight was used. The entropy correlated highly with the jerk index, with Spearman's rho at 0.88 (p<0.01). These results showed that the entropy derived from the spectrum of the acceleration time-series during movement is useful as an indicator of the smoothness of that movement.
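The proposed indicator, the entropy of the acceleration power spectrum, can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the plain DFT and the absence of windowing are simplifying assumptions:

```python
import cmath
import math
import random

def spectral_entropy(x):
    """Shannon entropy of the normalized one-sided power spectrum.

    A smooth, nearly sinusoidal movement concentrates power in few
    frequency bins and yields low entropy; irregular movement spreads
    power across bins and raises it.
    """
    n = len(x)
    mean = sum(x) / n
    x = [v - mean for v in x]
    # One-sided power spectrum from a plain DFT (fine for short signals).
    power = []
    for k in range(1, n // 2 + 1):
        coef = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        power.append(abs(coef) ** 2)
    total = sum(power)
    probs = [p / total for p in power]
    return -sum(p * math.log(p) for p in probs if p > 0)

# A pure sine (a proxy for smooth motion) vs. the same sine with added
# irregularity (a proxy for less smooth motion).
random.seed(0)
smooth = [math.sin(2 * math.pi * 3 * t / 64) for t in range(64)]
rough = [v + random.gauss(0, 0.5) for v in smooth]
```

Higher entropy for the irregular signal mirrors the paper's finding that entropy tracks the jerk index as a smoothness measure.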
Causal discovery and inference: concepts and recent methodological advances.
Spirtes, Peter; Zhang, Kun
This paper aims to give a broad coverage of central concepts and principles involved in automated causal inference and emerging approaches to causal discovery from i.i.d. data and from time series. After reviewing concepts including manipulations, causal models, sample predictive modeling, causal predictive modeling, and structural equation models, we present the constraint-based approach to causal discovery, which relies on the conditional independence relationships in the data, and discuss the assumptions underlying its validity. We then focus on causal discovery based on structural equation models, in which a key issue is the identifiability of the causal structure implied by appropriately defined structural equation models: in the two-variable case, under what conditions (and why) is the causal direction between the two variables identifiable? We show that the independence between the error term and causes, together with appropriate structural constraints on the structural equation, makes it possible. Next, we report some recent advances in causal discovery from time series. Assuming that the causal relations are linear with non-Gaussian noise, we mention two problems that are traditionally difficult to solve, namely causal discovery from subsampled data and that in the presence of confounding time series. Finally, we list a number of open questions in the field of causal discovery and inference.
Park, Chihyun; Yun, So Jeong; Ryu, Sung Jin; Lee, Soyoung; Lee, Young-Sam; Yoon, Youngmi; Park, Sang Chul
2017-03-15
Cellular senescence irreversibly arrests the growth of human diploid cells. In addition, recent studies have indicated that senescence is a multi-step, evolving process related to important complex biological processes. Most studies have analyzed only the genes and their functions representing each senescence phase, without considering gene-level interactions and continuously perturbed genes. It is necessary to reveal the genotypic mechanism inferred from the affected genes and their interactions underlying the senescence process. We suggest a novel computational approach to identify an integrative network which profiles an underlying genotypic signature from time-series gene expression data. The relatively perturbed genes were selected for each time point based on a proposed scoring measure termed the perturbation score. The selected genes were then integrated with protein-protein interactions to construct time-point-specific networks. From these networks, the edges conserved across time points were extracted to form the common network, and a statistical test was performed to demonstrate that this network could explain the phenotypic alteration. As a result, we confirmed that the difference in average perturbation scores of the common network between the two time points could explain the phenotypic alteration. We also performed functional enrichment on the common network and identified a strong association with the phenotypic alteration. Remarkably, we observed that the identified cell-cycle-specific common network played an important role in replicative senescence as a key regulator. Heretofore, network analysis of time-series gene expression data has focused on how topological structure changes over time. Conversely, we focused on the conserved structure whose context changes over time, and showed that it can explain the phenotypic changes.
We expect that the proposed method will help to elucidate the biological mechanisms left unrevealed by existing approaches.
Code of Federal Regulations, 2010 CFR
2010-04-01
... to section 404 of the act for such temporary period of time as may be necessary to protect the public... single plant or establishment, a series of plants under a single management, or all plants in an industry...
Code of Federal Regulations, 2013 CFR
2013-04-01
... to section 404 of the act for such temporary period of time as may be necessary to protect the public... single plant or establishment, a series of plants under a single management, or all plants in an industry...
Hypoxia and performance decrement.
DOT National Transportation Integrated Search
1966-05-01
The concept of time of useful consciousness fails to take into account the progressive decay that occurs in performance under hypoxic conditions. This study, using a means of quantitatively assessing such a decrement, presents data obtained in a seri...
Analysis of PV Advanced Inverter Functions and Setpoints under Time Series Simulation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seuss, John; Reno, Matthew J.; Broderick, Robert Joseph
Utilities are increasingly concerned about the potential negative impacts distributed PV may have on the operational integrity of their distribution feeders. Some have proposed novel methods for controlling a PV system's grid-tie inverter to mitigate potential PV-induced problems. This report investigates the effectiveness of several of these PV advanced inverter controls on improving distribution feeder operational metrics. The controls are simulated on a large PV system interconnected at several locations within two realistic distribution feeder models. Due to the time-domain nature of the advanced inverter controls, quasi-static time series simulations are performed under one week of representative variable irradiance and load data for each feeder. A parametric study is performed on each control type to determine how well certain measurable network metrics improve as a function of the control parameters. This methodology is used to determine appropriate advanced inverter settings for each location on the feeder and overall for any interconnection location on the feeder.
Multifractality of cerebral blood flow
NASA Astrophysics Data System (ADS)
West, Bruce J.; Latka, Miroslaw; Glaubic-Latka, Marta; Latka, Dariusz
2003-02-01
Scale invariance, the property relating time series across multiple scales, has provided a new perspective of physiological phenomena and their underlying control systems. The traditional “signal plus noise” paradigm of the engineer was first replaced with a model in which biological time series have a fractal structure in time (Fractal Physiology, Oxford University Press, Oxford, 1994). This new paradigm was subsequently shown to be overly restrictive when certain physiological signals were found to be characterized by more than one scaling parameter and therefore to belong to a class of more complex processes known as multifractals (Fractals, Plenum Press, New York, 1988). Here we demonstrate that in addition to heart rate (Nature 399 (1999) 461) and human gait (Phys. Rev. E, submitted for publication), the nonlinear control system for cerebral blood flow (CBF) (Phys. Rev. Lett., submitted for publication; Phys. Rev. E 59 (1999) 3492) is multifractal. We also find that this multifractality is greatly reduced for subjects with “serious” migraine and we present a simple model for the underlying control process to describe this effect.
Research on PM2.5 time series characteristics based on data mining technology
NASA Astrophysics Data System (ADS)
Zhao, Lifang; Jia, Jin
2018-02-01
With the development of data mining technology and the establishment of environmental air quality databases, it is necessary to discover potential correlations and rules by mining the massive environmental air quality information and analyzing air pollution processes. In this paper, we present a sequential pattern mining method based on air quality data and pattern association technology to analyze the characteristics of PM2.5 time series. Utilizing real-time monitoring data of urban air quality in China, the time series rules and variation properties of PM2.5 under different pollution levels are extracted and analyzed. The results show that the time sequence features of the PM2.5 concentration are directly affected by changes in the pollution degree. The longest time that PM2.5 remained stable is about 24 hours. As pollution becomes more severe, the instability time and step-ascending time gradually decrease from 12-24 hours to 3 hours. The presented method is helpful for controlling and forecasting air quality while saving measurement costs, which is of great significance for government regulation and public prevention of air pollution.
On Clear-Cut Mapping with Time-Series of Sentinel-1 Data in Boreal Forest
NASA Astrophysics Data System (ADS)
Rauste, Yrjo; Antropov, Oleg; Mutanen, Teemu; Hame, Tuomas
2016-08-01
Clear-cutting is the most drastic and widespread change affecting the hydrological and carbon-balance properties of forested land in the Boreal forest zone [1]. A time-series of 36 Sentinel-1 images was used to study the potential for mapping clear-cut areas. The time series covered one and a half years (2014-10-09 ... 2016-03-20) in a 200-km-by-200-km study site in Finland. The Sentinel-1 images were acquired in Interferometric Wide-swath (IW), dual-polarized mode (VV+VH). All scenes were acquired in the same orbit configuration. Amplitude images (GRDH product) were used. The Sentinel-1 scenes were ortho-rectified with in-house software using a digital elevation model (DEM) produced by the Land Survey of Finland. The Sentinel-1 amplitude data were radiometrically corrected for topographic effects. The temporal behaviour of C-band backscatter was studied for 1) areas clear-cut during the acquisition of the Sentinel-1 time-series, 2) areas remaining forest during the acquisition of the time-series, and 3) areas that had been clear-cut before the acquisition of the time-series. The following observations were made: 1) the separation between clear-cut areas and forest was generally low; 2) under certain acquisition conditions, clear-cut areas were well separable from forest; 3) the good scenes were acquired a) in winter during thick snow cover, and b) in late summer towards the end of a warm and dry period; 4) the separation between clear-cut and forest was higher in VH-polarized data than in VV-polarized data; 5) the separation between clear-cut and forest was higher in the winter/snow scenes than in the dry summer scenes.
Methods of Reverberation Mapping. I. Time-lag Determination by Measures of Randomness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chelouche, Doron; Pozo-Nuñez, Francisco; Zucker, Shay, E-mail: doron@sci.haifa.ac.il, E-mail: francisco.pozon@gmail.com, E-mail: shayz@post.tau.ac.il
A class of methods for measuring time delays between astronomical time series is introduced in the context of quasar reverberation mapping, which is based on measures of randomness or complexity of the data. Several distinct statistical estimators are considered that do not rely on polynomial interpolations of the light curves nor on their stochastic modeling, and do not require binning in correlation space. Methods based on von Neumann's mean-square successive-difference estimator are found to be superior to those using other estimators. An optimized von Neumann scheme is formulated, which better handles sparsely sampled data and outperforms current implementations of discrete correlation function methods. This scheme is applied to existing reverberation data of varying quality, and consistency with previously reported time delays is found. In particular, the size-luminosity relation of the broad-line region in quasars is recovered with a scatter comparable to that obtained by other works, yet with fewer assumptions made concerning the process underlying the variability. The proposed method for time-lag determination is particularly relevant for irregularly sampled time series, and in cases where the process underlying the variability cannot be adequately modeled.
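A minimal sketch of the von Neumann idea in its simplest form: shift one light curve by a trial lag, merge the two series in time order, and pick the lag that minimizes the mean-square successive-difference ratio. The optimized scheme of the paper is more elaborate; the synthetic light curves below are purely illustrative:

```python
import math

def von_neumann(values):
    """Von Neumann mean-square successive-difference ratio.

    Small values mean adjacent points are similar, i.e. the merged
    series is 'smooth'; a correct lag alignment minimizes it.
    """
    n = len(values)
    mean = sum(values) / n
    num = sum((values[i + 1] - values[i]) ** 2 for i in range(n - 1))
    den = sum((v - mean) ** 2 for v in values)
    return num / den

def estimate_lag(t1, f1, t2, f2, trial_lags):
    """Return the trial lag that minimizes the von Neumann ratio of the
    time-sorted merger of the driving and (de-shifted) echo series."""
    best_lag, best_eta = None, float("inf")
    for lag in trial_lags:
        merged = sorted(zip(list(t1) + [t - lag for t in t2],
                            list(f1) + list(f2)))
        eta = von_neumann([f for _, f in merged])
        if eta < best_eta:
            best_lag, best_eta = lag, eta
    return best_lag

# Synthetic example: the echo series is the driver delayed by 5 time units.
t1 = [0.7 * i for i in range(60)]
f1 = [math.sin(0.3 * t) for t in t1]
t2 = [0.7 * i + 5.0 for i in range(60)]
f2 = [math.sin(0.3 * (t - 5.0)) for t in t2]
lag = estimate_lag(t1, f1, t2, f2, [0.0, 2.5, 5.0, 7.5, 10.0])
```

Note that no interpolation, stochastic model, or correlation-space binning is needed, which is precisely the selling point claimed in the abstract.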
Schaefer, Alexander; Brach, Jennifer S.; Perera, Subashan; Sejdić, Ervin
2013-01-01
Background: The time evolution and complex interactions of many nonlinear systems, such as in the human body, result in fractal types of parameter outcomes that exhibit self similarity over long time scales by a power law in the frequency spectrum S(f) = 1/f^β. The scaling exponent β is thus often interpreted as a "biomarker" of relative health and decline. New Method: This paper presents a thorough comparative numerical analysis of fractal characterization techniques with specific consideration given to experimentally measured gait stride interval time series. The ideal fractal signals generated in the numerical analysis are constrained under varying lengths and biases indicative of a range of physiologically conceivable fractal signals. This analysis is to complement previous investigations of fractal characteristics in healthy and pathological gait stride interval time series, with which this study is compared. Results: The results of our analysis showed that the averaged wavelet coefficient method consistently yielded the most accurate results. Comparison with Existing Methods: Class dependent methods proved to be unsuitable for physiological time series. Detrended fluctuation analysis, as the most prevailing method in the literature, exhibited large estimation variances. Conclusions: The comparative numerical analysis and experimental applications provide a thorough basis for determining an appropriate and robust method for measuring and comparing a physiologically meaningful biomarker, the spectral index β. In consideration of the constraints of application, we note the significant drawbacks of detrended fluctuation analysis and conclude that the averaged wavelet coefficient method can provide reasonable consistency and accuracy for characterizing these fractal time series. PMID:24200509
Schaefer, Alexander; Brach, Jennifer S; Perera, Subashan; Sejdić, Ervin
2014-01-30
The time evolution and complex interactions of many nonlinear systems, such as in the human body, result in fractal types of parameter outcomes that exhibit self similarity over long time scales by a power law in the frequency spectrum S(f) = 1/f^β. The scaling exponent β is thus often interpreted as a "biomarker" of relative health and decline. This paper presents a thorough comparative numerical analysis of fractal characterization techniques with specific consideration given to experimentally measured gait stride interval time series. The ideal fractal signals generated in the numerical analysis are constrained under varying lengths and biases indicative of a range of physiologically conceivable fractal signals. This analysis is to complement previous investigations of fractal characteristics in healthy and pathological gait stride interval time series, with which this study is compared. The results of our analysis showed that the averaged wavelet coefficient method consistently yielded the most accurate results. Class dependent methods proved to be unsuitable for physiological time series. Detrended fluctuation analysis, as the most prevailing method in the literature, exhibited large estimation variances. The comparative numerical analysis and experimental applications provide a thorough basis for determining an appropriate and robust method for measuring and comparing a physiologically meaningful biomarker, the spectral index β. In consideration of the constraints of application, we note the significant drawbacks of detrended fluctuation analysis and conclude that the averaged wavelet coefficient method can provide reasonable consistency and accuracy for characterizing these fractal time series. Copyright © 2013 Elsevier B.V. All rights reserved.
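Detrended fluctuation analysis, the prevailing method the study criticizes for its large estimation variance, can be sketched as follows. This is a minimal first-order DFA for illustration, not the benchmarked implementation; white noise should yield a scaling exponent near 0.5:

```python
import math
import random

def dfa(x, scales):
    """First-order detrended fluctuation analysis.

    Integrates the mean-removed series into a profile, linearly detrends
    it in non-overlapping windows of each scale, and returns the slope
    alpha of log F(s) versus log s.
    """
    n = len(x)
    mean = sum(x) / n
    profile, s = [], 0.0
    for v in x:
        s += v - mean
        profile.append(s)
    log_s, log_f = [], []
    for scale in scales:
        fsum, count = 0.0, 0
        for start in range(0, n - scale + 1, scale):
            seg = profile[start:start + scale]
            m = len(seg)
            tx = list(range(m))
            mx, my = (m - 1) / 2.0, sum(seg) / m
            sxy = sum((tx[i] - mx) * (seg[i] - my) for i in range(m))
            sxx = sum((tx[i] - mx) ** 2 for i in range(m))
            b = sxy / sxx                 # window trend slope
            a = my - b * mx               # window trend intercept
            fsum += sum((seg[i] - (a + b * tx[i])) ** 2 for i in range(m))
            count += m
        log_s.append(math.log(scale))
        log_f.append(0.5 * math.log(fsum / count))
    k = len(scales)
    mx, my = sum(log_s) / k, sum(log_f) / k
    return sum((log_s[i] - mx) * (log_f[i] - my) for i in range(k)) / \
           sum((log_s[i] - mx) ** 2 for i in range(k))

random.seed(1)
white = [random.gauss(0, 1) for _ in range(4096)]
alpha = dfa(white, [8, 16, 32, 64, 128])
```

The exponent relates to the spectral index via β = 2α - 1 for noise-like signals, which is how DFA estimates feed into the biomarker comparison above.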
46 CFR 199.150 - Survival craft launching and recovery arrangements; general.
Code of Federal Regulations, 2011 CFR
2011-10-01
... appliance for a lifeboat must be approved under approval series 160.132 with a winch approved under approval series 160.115. (2) Each launching appliance for a davit-launched liferaft must be approved under approval series 160.163 with an automatic disengaging apparatus approved under approval series 160.170. (b...
46 CFR 199.150 - Survival craft launching and recovery arrangements; general.
Code of Federal Regulations, 2010 CFR
2010-10-01
... appliance for a lifeboat must be approved under approval series 160.132 with a winch approved under approval series 160.115. (2) Each launching appliance for a davit-launched liferaft must be approved under approval series 160.163 with an automatic disengaging apparatus approved under approval series 160.170. (b...
A method for analyzing temporal patterns of variability of a time series from Poincare plots.
Fishman, Mikkel; Jacono, Frank J; Park, Soojin; Jamasebi, Reza; Thungtong, Anurak; Loparo, Kenneth A; Dick, Thomas E
2012-07-01
The Poincaré plot is a popular two-dimensional time-series analysis tool because of its intuitive display of dynamic system behavior. Poincaré plots have been used to visualize heart rate and respiratory pattern variabilities. However, conventional quantitative analysis relies primarily on statistical measurements of the cumulative distribution of points, making it difficult to interpret irregular or complex plots. Moreover, the plots are constructed to reflect highly correlated regions of the time series, reducing the amount of nonlinear information that is presented and thereby hiding potentially relevant features. We propose temporal Poincaré variability (TPV), a novel analysis methodology that uses standard techniques to quantify the temporal distribution of points and to detect nonlinear sources responsible for physiological variability. In addition, the analysis is applied across multiple time delays, yielding a richer insight into system dynamics than the traditional circle return plot. The method is applied to data sets of R-R intervals and to synthetic point process data extracted from the Lorenz time series. The results demonstrate that TPV complements the traditional analysis, can be applied more generally (including to Poincaré plots with multiple clusters) and more consistently than the conventional measures, and can address questions regarding potential structure underlying the variability of a data set.
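The conventional descriptors that TPV is designed to complement are the SD1/SD2 ellipse statistics of the lag-1 return map. A minimal sketch (illustrative only, not the authors' TPV code):

```python
import math

def poincare_sd(rr):
    """SD1/SD2 descriptors of the lag-1 Poincare plot (x_n, x_{n+1}).

    SD1 is the dispersion perpendicular to the identity line (short-term,
    beat-to-beat variability); SD2 is the dispersion along it (long-term
    variability).
    """
    x, y = rr[:-1], rr[1:]
    n = len(x)
    d1 = [(y[i] - x[i]) / math.sqrt(2) for i in range(n)]  # minor axis
    d2 = [(y[i] + x[i]) / math.sqrt(2) for i in range(n)]  # major axis

    def sd(v):
        m = sum(v) / len(v)
        return math.sqrt(sum((a - m) ** 2 for a in v) / (len(v) - 1))

    return sd(d1), sd(d2)

# Alternating short/long intervals: all variability is beat-to-beat,
# so SD1 dominates SD2.
rr = [0.8 if i % 2 == 0 else 1.0 for i in range(200)]
sd1, sd2 = poincare_sd(rr)
```

Because SD1/SD2 summarize only the cumulative point cloud, they are blind to when points were plotted, which is exactly the temporal information TPV adds.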
Delay differential analysis of time series.
Lainscsek, Claudia; Sejnowski, Terrence J
2015-03-01
Nonlinear dynamical system analysis based on embedding theory has been used for modeling and prediction, but it also has applications to signal detection and classification of time series. An embedding creates a multidimensional geometrical object from a single time series. Traditionally either delay or derivative embeddings have been used. The delay embedding is composed of delayed versions of the signal, and the derivative embedding is composed of successive derivatives of the signal. The delay embedding has been extended to nonuniform embeddings to take multiple timescales into account. Both embeddings provide information on the underlying dynamical system without having direct access to all the system variables. Delay differential analysis is based on functional embeddings, a combination of the derivative embedding with nonuniform delay embeddings. Small delay differential equation (DDE) models that best represent relevant dynamic features of time series data are selected from a pool of candidate models for detection or classification. We show that the properties of DDEs support spectral analysis in the time domain where nonlinear correlation functions are used to detect frequencies, frequency and phase couplings, and bispectra. These can be efficiently computed with short time windows and are robust to noise. For frequency analysis, this framework is a multivariate extension of discrete Fourier transform (DFT), and for higher-order spectra, it is a linear and multivariate alternative to multidimensional fast Fourier transform of multidimensional correlations. This method can be applied to short or sparse time series and can be extended to cross-trial and cross-channel spectra if multiple short data segments of the same experiment are available. 
Together, this time-domain toolbox provides higher temporal resolution, increased frequency and phase coupling information, and it allows an easy and straightforward implementation of higher-order spectra across time compared with frequency-based methods such as the DFT and cross-spectral analysis.
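The delay embedding at the core of this framework can be sketched in a few lines. Only the uniform case is shown; the nonuniform and functional embeddings discussed above generalize it by allowing distinct delays:

```python
def delay_embed(x, dim, tau):
    """Uniform delay embedding: for each admissible t, build the vector
    (x_t, x_{t-tau}, ..., x_{t-(dim-1)*tau})."""
    start = (dim - 1) * tau
    return [[x[t - k * tau] for k in range(dim)] for t in range(start, len(x))]

# Toy series 0..9 embedded with dimension 3 and delay 2.
x = list(range(10))
emb = delay_embed(x, dim=3, tau=2)
```

Each embedded vector is one point of the reconstructed geometrical object; DDE models are then fit on such coordinates (and their derivatives) rather than on the raw scalar series.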
Statistical Analysis of Time-Series from Monitoring of Active Volcanic Vents
NASA Astrophysics Data System (ADS)
Lachowycz, S.; Cosma, I.; Pyle, D. M.; Mather, T. A.; Rodgers, M.; Varley, N. R.
2016-12-01
Despite recent advances in the collection and analysis of time-series from volcano monitoring, and the resulting insights into volcanic processes, challenges remain in forecasting and interpreting activity from near real-time analysis of monitoring data. Statistical methods have potential to characterise the underlying structure and facilitate intercomparison of these time-series, and so inform interpretation of volcanic activity. We explore the utility of multiple statistical techniques that could be widely applicable to monitoring data, including Shannon entropy and detrended fluctuation analysis, by their application to various data streams from volcanic vents during periods of temporally variable activity. Each technique reveals changes through time in the structure of some of the data that were not apparent from conventional analysis. For example, we calculate the Shannon entropy (a measure of the randomness of a signal) of time-series from the recent dome-forming eruptions of Volcán de Colima (Mexico) and Soufrière Hills (Montserrat). The entropy of real-time seismic measurements and the count rate of certain volcano-seismic event types from both volcanoes is found to be temporally variable, with these data generally having higher entropy during periods of lava effusion and/or larger explosions. In some instances, the entropy shifts prior to or coincident with changes in seismic or eruptive activity, some of which were not clearly recognised by real-time monitoring. Comparison with other statistics demonstrates the sensitivity of the entropy to the data distribution, but that it is distinct from conventional statistical measures such as coefficient of variation. We conclude that each analysis technique examined could provide valuable insights for interpretation of diverse monitoring time-series.
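As a hedged illustration of the Shannon entropy statistic applied to monitoring streams, the histogram-based estimate below separates a quiescent count-rate-like series from a more random one. The binning scheme is an assumption for the sketch, not taken from the study:

```python
import math
import random

def shannon_entropy(x, bins=8):
    """Shannon entropy (bits) of a signal's value histogram: a simple
    proxy for the randomness measure discussed for monitoring data."""
    lo, hi = min(x), max(x)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for v in x:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    n = len(x)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

# Quiescence: mostly zero counts with rare events -> low entropy.
quiet = [0.0] * 950 + [1.0] * 50
# Heightened, erratic activity: values spread over the range -> high entropy.
random.seed(2)
active = [random.uniform(-1, 1) for _ in range(1000)]
```

Computed in sliding windows over a seismic or count-rate stream, shifts in this statistic are the kind of structural change the abstract reports around episodes of effusion or larger explosions.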
A DDC Bibliography on Computers in Information Sciences. Volume I. Information Sciences Series.
ERIC Educational Resources Information Center
Defense Documentation Center, Alexandria, VA.
The unclassified and unlimited bibliography compiles references dealing specifically with the role of computers in information sciences. The volume contains 249 annotated references grouped under two major headings: Time Shared, On-Line, and Real Time Systems, and Computer Components. The references are arranged in accession number (AD-number)…
Mediating Relations: Therapeutic Discourse in American Prime Time Series.
ERIC Educational Resources Information Center
White, Mimi
Although "The Equalizer" and "Finder of Lost Loves" are different kinds of prime time fiction--urban thriller on the one hand and fantasy melodrama on the other--they share an underlying dramatic structure and symbolic problematic in their repeated enactments of a therapeutic cure overseen by a mediating authority figure. The…
Comparing the structure of an emerging market with a mature one under global perturbation
NASA Astrophysics Data System (ADS)
Namaki, A.; Jafari, G. R.; Raei, R.
2011-09-01
In this paper we investigate the Tehran Stock Exchange (TSE) and the Dow Jones Industrial Average (DJIA) in terms of perturbed correlation matrices. To perturb a stock market, there are two methods, namely local and global perturbation. In the local method, we replace a correlation coefficient of the cross-correlation matrix with one calculated from two Gaussian-distributed time series, whereas in the global method, we reconstruct the correlation matrix after replacing the original return series with Gaussian-distributed time series. The local perturbation serves only as a technical check. We analyze these markets through two statistical approaches, random matrix theory (RMT) and the correlation coefficient distribution. Using RMT, we find that the largest eigenvalue represents an influence common to all stocks, and that this eigenvalue peaks during financial shocks. We find that a few correlated stocks provide the essential robustness of the stock market, but by replacing their return time series with Gaussian-distributed time series, the mean correlation coefficient, the largest eigenvalue and the fraction of eigenvalues deviating from the RMT prediction all fall sharply in both markets. Comparing the two markets, we see that the DJIA is more sensitive to global perturbations. These findings are crucial for risk management and portfolio selection.
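The RMT diagnostic described above compares the largest eigenvalue of the cross-correlation matrix with the Marchenko-Pastur upper bound (1 + sqrt(N/T))^2 expected for uncorrelated series. A self-contained sketch with a synthetic one-factor "market" (illustrative data, not the paper's; the factor loading 0.5 is an assumption):

```python
import math
import random

def corr_matrix(series):
    """Pearson cross-correlation matrix of equal-length return series."""
    def standardize(x):
        n = len(x)
        m = sum(x) / n
        s = math.sqrt(sum((v - m) ** 2 for v in x) / n)
        return [(v - m) / s for v in x]
    z = [standardize(x) for x in series]
    t, n = len(series[0]), len(series)
    return [[sum(z[i][k] * z[j][k] for k in range(t)) / t for j in range(n)]
            for i in range(n)]

def largest_eigenvalue(c, iters=200):
    """Power iteration for the dominant eigenvalue of a symmetric
    positive semi-definite matrix (the correlation matrix is one)."""
    n = len(c)
    v, lam = [1.0] * n, 0.0
    for _ in range(iters):
        w = [sum(c[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = math.sqrt(sum(x * x for x in w))
        v = [x / lam for x in w]
    return lam

# N stocks over T days, all loading on one common 'market' factor.
random.seed(3)
T, N = 500, 10
market = [random.gauss(0, 1) for _ in range(T)]
series = [[0.5 * market[t] + random.gauss(0, 1) for t in range(T)]
          for _ in range(N)]
lam = largest_eigenvalue(corr_matrix(series))
mp_upper = (1 + math.sqrt(N / T)) ** 2
```

The dominant eigenvalue sits well above the random-matrix bound, which is the signature of a market-wide mode; globally replacing the returns with independent Gaussian series would pull it back inside the bound, as the abstract reports.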
NASA Astrophysics Data System (ADS)
Watkins, Nicholas; Clarke, Richard; Freeman, Mervyn
2002-11-01
We discuss how the ideal formalism of Computational Mechanics can be adapted to apply to a finite series of corrupted and correlated data, as is typical of most observed natural time series. Specifically, a simple filter that removes the corruption that creates rare unphysical causal states is demonstrated, and the new concept of effective soficity is introduced. The benefits of these new concepts are demonstrated on simulated time series by (a) the effective elimination of white noise corruption from a periodic signal using the expletive filter and (b) the appearance of an effectively sofic region in the statistical complexity of a biased Poisson switch time series that is insensitive to changes in the word length (memory) used in the analysis. The new algorithm is then applied to the analysis of a real geomagnetic time series measured at Halley, Antarctica. Two principal components of the structure are detected, which are interpreted as the diurnal variation due to the rotation of the earth-based station under an electrical current pattern that is fixed with respect to the sun-earth axis, and the random occurrence of a signature likely to be that of the magnetic substorm. In conclusion, a hypothesis is advanced about model construction in general (see also Clarke et al., arXiv:cond-mat/0110228).
Benchmarking the Algorithms to Detect Seasonal Signals Under Different Noise Conditions
NASA Astrophysics Data System (ADS)
Klos, A.; Bogusz, J.; Bos, M. S.
2017-12-01
Global Positioning System (GPS) position time series contain seasonal signals, among which the annual and semi-annual are the most powerful. These oscillations are widely modelled as curves with constant amplitudes, using the Weighted Least-Squares (WLS) algorithm. In reality, however, the seasonal signatures vary over time, as their geophysical causes are not constant. Different algorithms have already been used to capture this time-variability, such as Wavelet Decomposition (WD), Singular Spectrum Analysis (SSA), Chebyshev Polynomials (CP) and the Kalman Filter (KF). In this research, we employed 376 globally distributed GPS stations whose time series contributed to the newest International Terrestrial Reference Frame (ITRF2014). We show that for ca. 20% of the stations the amplitudes of the seasonal signal vary over time by more than 1.0 mm. We then compare the WD, SSA, CP and KF algorithms on a set of synthetic time series to quantify them under different noise conditions. We show that when variations of seasonal signals are ignored, the power-law character of the noise is biased towards flicker noise. The most reliable estimates of the variations were found to be given by SSA and KF. These methods also perform best at other noise levels, while WD, and to a lesser extent also CP, have trouble separating the seasonal signal from the noise, which leads to an underestimation in the spectral index of the power-law noise of around 0.1. For real ITRF2014 GPS data we found that SSA and KF are capable of modelling 49-84% and 77-90%, respectively, of the variance of the true varying seasonal signals.
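The constant-amplitude WLS model the abstract refers to can be sketched as a harmonic regression on trend plus annual and semi-annual terms. The synthetic series, sampling rate and amplitudes below are illustrative assumptions, not ITRF2014 data:

```python
import numpy as np

def fit_seasonal_wls(t, y, weights=None):
    """Weighted least-squares fit of offset + trend + annual + semi-annual
    harmonics, i.e. the constant-amplitude seasonal model.
    t : time in years; y : position (e.g. mm)."""
    if weights is None:
        weights = np.ones_like(y)
    w = 2 * np.pi                               # one cycle per year
    A = np.column_stack([
        np.ones_like(t), t,                     # offset + linear trend
        np.cos(w * t), np.sin(w * t),           # annual term
        np.cos(2 * w * t), np.sin(2 * w * t),   # semi-annual term
    ])
    sw = np.sqrt(weights)
    coef, *_ = np.linalg.lstsq(sw[:, None] * A, sw * y, rcond=None)
    annual_amp = np.hypot(coef[2], coef[3])
    semiannual_amp = np.hypot(coef[4], coef[5])
    return coef, annual_amp, semiannual_amp

# Synthetic daily series over 5 years with a 3 mm annual signal.
t = np.arange(0, 5, 1 / 365.25)
rng = np.random.default_rng(1)
y = 3.0 * np.cos(2 * np.pi * t) + 0.5 * rng.standard_normal(t.size)
coef, a1, a2 = fit_seasonal_wls(t, y)
```

A time-varying amplitude, as the abstract shows for ~20% of stations, is exactly what this model cannot represent, which is why SSA or a Kalman filter is needed there.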
The geometry of chaotic dynamics — a complex network perspective
NASA Astrophysics Data System (ADS)
Donner, R. V.; Heitzig, J.; Donges, J. F.; Zou, Y.; Marwan, N.; Kurths, J.
2011-12-01
Recently, several complex network approaches to time series analysis have been developed and applied to study a wide range of model systems as well as real-world data, e.g., geophysical or financial time series. Among these techniques, recurrence-based concepts, most prominently ɛ-recurrence networks, represent most faithfully the geometrical fine structure of the attractors underlying chaotic (and, less interestingly, non-chaotic) time series. In this paper we demonstrate that the well-known graph-theoretical properties of local clustering coefficient and global (network) transitivity can meaningfully be exploited to define two new local and two new global measures of dimension in phase space: local upper and lower clustering dimension as well as global upper and lower transitivity dimension. Rigorous analytical as well as numerical results for self-similar sets and simple chaotic model systems suggest that these measures are well-behaved in most non-pathological situations and that they can be estimated reasonably well using ɛ-recurrence networks constructed from relatively short time series. Moreover, we study the relationship between clustering and transitivity dimensions on the one hand, and traditional measures like pointwise dimension or local Lyapunov dimension on the other. We also provide further evidence that the local clustering coefficients, or equivalently the local clustering dimensions, are useful for identifying unstable periodic orbits and other dynamically invariant objects from time series. Our results demonstrate that ɛ-recurrence networks provide an important link between dynamical systems and graph theory.
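An ɛ-recurrence network and the global transitivity statistic discussed above can be sketched as follows. The embedding parameters, threshold and noisy sine series are illustrative choices, not the paper's systems:

```python
import numpy as np

def recurrence_network(x, eps, dim=2, tau=1):
    """Adjacency matrix of an eps-recurrence network: nodes are
    time-delay-embedded state vectors, edges link states whose mutual
    distance is below eps (self-loops removed)."""
    n = len(x) - (dim - 1) * tau
    states = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
    d = np.linalg.norm(states[:, None, :] - states[None, :, :], axis=-1)
    A = (d < eps).astype(int)
    np.fill_diagonal(A, 0)
    return A

def transitivity(A):
    """Global (network) transitivity: 3 * triangles / connected triples."""
    triangles = np.trace(A @ A @ A)      # counts each triangle 6 times
    deg = A.sum(axis=1)
    triples = np.sum(deg * (deg - 1))    # counts each connected triple twice
    return triangles / triples if triples else 0.0   # (6T/6)/(2P/2) * ... = 3T/P

rng = np.random.default_rng(2)
x = np.sin(np.linspace(0, 20 * np.pi, 500)) + 0.01 * rng.standard_normal(500)
A = recurrence_network(x, eps=0.2)
T_net = transitivity(A)
```

For an effectively one-dimensional attractor like this noisy limit cycle, the transitivity is high; the transitivity dimensions in the paper are built from how such quantities scale.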
NASA Astrophysics Data System (ADS)
Xie, Hong-Bo; Dokos, Socrates
2013-06-01
We present a hybrid symplectic geometry and central tendency measure (CTM) method for detection of determinism in noisy time series. CTM is effective for detecting determinism in short time series and has been applied in many areas of nonlinear analysis. However, its performance significantly degrades in the presence of strong noise. In order to circumvent this difficulty, we propose to use symplectic principal component analysis (SPCA), a new chaotic signal de-noising method, as the first step to recover the system dynamics. CTM is then applied to determine whether the time series arises from a stochastic process or has a deterministic component. Results from numerical experiments, ranging from six benchmark deterministic models to 1/f noise, suggest that the hybrid method can significantly improve detection of determinism in noisy time series by about 20 dB when the data are contaminated by Gaussian noise. Furthermore, we apply our algorithm to study the mechanomyographic (MMG) signals arising from contraction of human skeletal muscle. Results obtained from the hybrid symplectic principal component analysis and central tendency measure demonstrate that the skeletal muscle motor unit dynamics can indeed be deterministic, in agreement with previous studies. However, the conventional CTM method was not able to definitely detect the underlying deterministic dynamics. This result on MMG signal analysis is helpful in understanding neuromuscular control mechanisms and developing MMG-based engineering control applications.
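The central tendency measure itself is simple to sketch from its definition. The radius choice below is a common convention rather than necessarily the paper's, and no SPCA denoising step is included:

```python
import numpy as np

def central_tendency_measure(x, r=None):
    """Central tendency measure (CTM): fraction of points in the
    second-difference scatter plot (d[n+1] vs d[n]) that fall within
    radius r of the origin. Smooth deterministic series concentrate near
    the origin (CTM -> 1); random series scatter widely (CTM -> 0)."""
    d = np.diff(x)
    dist = np.hypot(d[:-1], d[1:])
    if r is None:
        r = 0.25 * np.std(x)          # a common, scale-dependent choice
    return np.mean(dist < r)

rng = np.random.default_rng(3)
t = np.linspace(0, 8 * np.pi, 2000)
deterministic = np.sin(t)                 # smooth, deterministic
stochastic = rng.standard_normal(2000)    # white noise
ctm_det = central_tendency_measure(deterministic)
ctm_sto = central_tendency_measure(stochastic)
```

Heavy noise drags the deterministic case toward the stochastic one, which is the degradation the SPCA preprocessing in the paper is meant to reverse.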
40 CFR 205.55-2 - Compliance with standards.
Code of Federal Regulations, 2013 CFR
2013-07-01
... configuration (e.g., L-6, V-8, etc.). (e) Series (i.e., cab design) including but not limited to conventional... of this subpart. (2) [Reserved] (3) At any time following receipt of notice under this section with...
NASA Astrophysics Data System (ADS)
Jiang, Shi-Mei; Cai, Shi-Min; Zhou, Tao; Zhou, Pei-Ling
2008-06-01
The two-phase behaviour in financial markets corresponds to a bifurcation phenomenon, i.e. the change of a conditional probability distribution from unimodal to bimodal. We investigate the bifurcation phenomenon in the Hang Seng index. We observe that the bifurcation phenomenon in a financial index is not universal, but specific to certain conditions. For the Hang Seng index and randomly generated time series, the phenomenon emerges only when the power-law exponent of the absolute increment distribution is between 1 and 2, with an appropriate period. Simulations on a randomly generated time series suggest that the bifurcation phenomenon itself is governed by the statistics of the absolute increments, and thus may not reflect essential financial behaviours. However, even under the same distribution of absolute increments, the range where the bifurcation phenomenon occurs differs considerably between the real market and the artificial data, which may reflect certain market information.
Kerlin, Aaron M; Lindsley, Tara A
2008-08-15
Time-lapse imaging of living neurons both in vivo and in vitro has revealed that the growth of axons and dendrites is highly dynamic and characterized by alternating periods of extension and retraction. These growth dynamics are associated with important features of neuronal development and are differentially affected by experimental treatments, but the underlying cellular mechanisms are poorly understood. NeuroRhythmics was developed to semi-automate specific quantitative tasks involved in analysis of two-dimensional time-series images of processes that exhibit saltatory elongation. This software provides detailed information on periods of growth and nongrowth that it identifies by transitions in elongation (i.e. initiation time, average rate, duration) and information regarding the overall pattern of saltatory growth (i.e. time of pattern onset, frequency of transitions, relative time spent in a state of growth vs. nongrowth). Plots and numeric output are readily imported into other applications. The user has the option to specify criteria for identifying transitions in growth behavior, which extends the potential application of the software to neurons of different types or developmental stage and to other time-series phenomena that exhibit saltatory dynamics. NeuroRhythmics will facilitate mechanistic studies of periodic axonal and dendritic growth in neurons.
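The transition detection NeuroRhythmics semi-automates can be caricatured as thresholding the elongation rate and collecting run-length segments. The threshold and toy trace below are hypothetical, not the software's actual criteria:

```python
import numpy as np

def growth_phases(lengths, dt=1.0, rate_thresh=0.1):
    """Hypothetical sketch of saltatory-growth segmentation: classify each
    interval as growth (+1, rate > threshold), retraction (-1) or
    pause (0), and return run-length segments (state, start, duration)."""
    rates = np.diff(lengths) / dt
    state = np.where(rates > rate_thresh, 1,
                     np.where(rates < -rate_thresh, -1, 0))
    segments, start = [], 0
    for i in range(1, len(state) + 1):
        if i == len(state) or state[i] != state[start]:
            segments.append((int(state[start]), start, i - start))
            start = i
    return segments

# Toy neurite-length trace: extend, pause, retract.
trace = np.concatenate([np.linspace(0, 5, 11),
                        np.full(5, 5.0),
                        np.linspace(5, 3, 6)])
segs = growth_phases(trace, dt=1.0, rate_thresh=0.1)
```

From such segments, the frequency of transitions and the fraction of time spent growing versus not growing follow directly.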
Indicator saturation: a novel approach to detect multiple breaks in geodetic time series.
NASA Astrophysics Data System (ADS)
Jackson, L. P.; Pretis, F.; Williams, S. D. P.
2016-12-01
Geodetic time series can record long-term trends, quasi-periodic signals at a variety of time scales from days to decades, and sudden breaks due to natural or anthropogenic causes. The causes of breaks range from instrument replacement to earthquakes to unknown (i.e. no attributable cause). Furthermore, breaks can be permanent or short-lived and span at least two orders of magnitude in size (mm to 100s of mm). Accounting for this range of possible signal characteristics requires a flexible time series method that can distinguish between true and false breaks, outliers and time-varying trends. One such method, Indicator Saturation (IS), comes from the field of econometrics, where analysing stochastic signals in these terms is a common problem. The IS approach differs from alternative break detection methods by considering every point in the time series as a break until it is demonstrated statistically that it is not. A linear model is constructed with a break function at every point in time, and all but the statistically significant breaks are removed through a general-to-specific model selection algorithm for more variables than observations. The IS method is flexible because it allows multiple breaks of different forms (e.g. impulses, shifts in the mean, and changing trends) to be detected, while simultaneously modelling any underlying variation driven by additional covariates. We apply the IS method to identify breaks in a suite of synthetic GPS time series used for the Detection of Offsets in GPS Experiments (DOGEX). We optimise the method to maximise the ratio of true-positive to false-positive detections, which improves estimates of errors in the long-term rates of land motion currently required by the GPS community.
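A drastically reduced sketch of the step-indicator idea: the full IS method saturates the regression with more indicators than observations and prunes them general-to-specific, whereas this illustration only scans for the single strongest mean shift, and the critical value is an arbitrary placeholder:

```python
import numpy as np

def step_indicator_scan(y, crit=3.5):
    """Reduced sketch of indicator saturation: test a step indicator at
    every interior time point and keep the one with the largest
    t-statistic, if it exceeds the critical value `crit` (placeholder).
    The real algorithm handles many simultaneous breaks, outliers and
    covariates; this finds at most one mean shift."""
    n = len(y)
    best_t, best_k = 0.0, None
    for k in range(2, n - 2):
        m1, m2 = y[:k].mean(), y[k:].mean()
        s = np.sqrt(y[:k].var(ddof=1) / k + y[k:].var(ddof=1) / (n - k))
        t = abs(m2 - m1) / s
        if t > best_t:
            best_t, best_k = t, k
    return (best_k, best_t) if best_t > crit else (None, best_t)

# Synthetic series with a 5-sigma offset at index 200.
rng = np.random.default_rng(4)
y = np.concatenate([rng.normal(0, 1, 200), rng.normal(5, 1, 200)])
k, tstat = step_indicator_scan(y)
```

The general-to-specific search replaces this greedy scan with block-wise saturation, which is what lets IS separate multiple breaks from trends and outliers.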
Low gravity investigations in suborbital rockets
NASA Technical Reports Server (NTRS)
Wessling, Francis C.; Lundquist, Charles A.
1990-01-01
Two series of suborbital rocket missions are outlined which are intended to support materials and biotechnology investigations under microgravity conditions and enhance commercial rocket activity. The Consort series of missions employs the two-stage Starfire I rocket and recovery systems as well as a payload of three sealed or vented cylindrical sections. The Consort 1 and 2 missions are described, which successfully supported six classes of experiments each. The Joust program is the second series of rocket missions; the Prospector rocket is employed to provide comparable payload masses with twice as much microgravity time as the Consort series. The Consort and Joust missions provide 6-8 and 13-15 min of microgravity flight, respectively, to support such experiments as polymer processing, scientific apparatus testing, and electrodeposition.
Reduction of coherence of the human brain electric potentials
NASA Astrophysics Data System (ADS)
Novik, Oleg; Smirnov, Fedor
Many technological processes are known to be disrupted by magnetic storms. But technology is controlled by people, whose functional systems may be affected as well. We consider the electro-neurophysiological aspect of the general problem: humans surrounded by physical fields, including those of cosmic origin. The influence of magnetic storms was observed in a group of 13 students (practically healthy young women and men aged 18 to 23, Moscow). To monitor the main functional systems of the participants, their electroencephalograms (EEG) were recorded over the course of a year, along with electrocardiograms, respiratory rhythms, arterial blood pressure and other characteristics. All of these characteristics, apart from the EEG, remained within the normal range for all participants during the measurements. According to the EEG investigations using a computer proof-reading test in the absence of magnetic storms, the values of the coherence function of the time series of theta-rhythm oscillations (f = 4-7.9 Hz, A = 20 μV) of the electric potentials of the frontal-polar and occipital areas of the head lay in the interval [0.3, 0.8] for all of the students under investigation. (In the proof-reading test, the participant had to pick out given symbols from a random sequence displayed on a monitor and enter the number of symbols found into a computer. Everyone knew that the time allowed for finding the symbols was unlimited. On the other hand, nobody knew that the EEG and the other recordings mentioned were connected with electromagnetic geophysical research and geomagnetic storms.)
The main result can be stated as follows: during a magnetic storm with 5 ≤ K ≤ 6, or no later than 24 hours after its beginning (different types of moderate magnetic storms occurred; IZMIRAN data were used), the values of the theta-rhythm frontal-occipital coherence function measured with the same test decreased by a factor of two or more for all students in the group, down to and including zero coherence. A similar result was obtained for the other basic low-frequency electro-neurophysiological rhythm, delta (f = 0.5-3.9 Hz, A = 20 μV). The usual coherence function values in the interval [0.3, 0.8] were typically registered again about 48 hours after the end of the magnetic storm. The reduction in the coherence of the brain's low-frequency bioelectric oscillations under the influence of a magnetic storm was established by two methods: 1) comparison of the time series of bioelectric oscillations of a given person without a magnetic storm and under its influence; and 2) comparison of two sets of time series, (a) set A, measured without a magnetic storm, and (b) set B, measured under its influence, regardless of the individual. Naturally, the total number of EEGs available for the set-based analysis, i.e. without personification, exceeds the number available for the individual approach, because some participants were investigated only without a magnetic storm and others only under its influence. In EEG measurements with closed or open eyes, but without a functional load on the brain in the form of the proof-reading test, no distinctive decrease of the coherence function was observed during a magnetic storm, nor for pairs of points from other parts of the head (see above) or for other rhythms.
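The frontal-occipital coherence function used in the study can be illustrated with a standard magnitude-squared coherence estimate. The channel contents, sampling rate and theta amplitude below are synthetic stand-ins, not the recorded EEG:

```python
import numpy as np
from scipy.signal import coherence

# Two synthetic "EEG channels" sharing a 6 Hz theta component, loosely
# analogous to frontal-polar vs occipital leads; all values illustrative.
fs = 250.0                        # assumed sampling rate, Hz
t = np.arange(0, 60, 1 / fs)      # 60 s record
rng = np.random.default_rng(5)
theta = np.sin(2 * np.pi * 6.0 * t)             # shared theta rhythm
ch_fp = theta + 0.8 * rng.standard_normal(t.size)
ch_oc = 0.7 * theta + 0.8 * rng.standard_normal(t.size)

# Welch-averaged magnitude-squared coherence, then the theta-band peak.
f, Cxy = coherence(ch_fp, ch_oc, fs=fs, nperseg=1024)
theta_band = (f >= 4) & (f < 8)
theta_coh = Cxy[theta_band].max()
```

Reducing the shared theta component (as reported during a magnetic storm) would drive this band coherence toward zero.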
NASA Astrophysics Data System (ADS)
Jayawardena, Adikaramge Asiri
The goal of this dissertation is to identify electrical and thermal parameters of an LED package that can be used to predict catastrophic failure in real time in an application. Through an experimental study, the series electrical resistance and thermal resistance were identified as good indicators of contact failure of LED packages. This study investigated the long-term changes in series electrical resistance and thermal resistance of LED packages at three different current and junction temperature stress conditions. Experimental results showed that the series electrical resistance went through four phases of change, including periods of latency, rapid increase, saturation, and finally a sharp decline just before failure. Formation of voids in the contact metallization was identified as the underlying mechanism for the series resistance increase. The rate of series resistance change was linked to void growth using the theory of electromigration; the rate of increase of series resistance depends on temperature and current density. The results indicate that void growth occurred in the cap (Au) layer and was constrained by the contact metal (Ni) layer, preventing open-circuit failure of the contact metal layer. Short-circuit failure occurred due to electromigration-induced metal diffusion along dislocations in GaN. The increase in ideality factor and reverse leakage current with time provided further evidence of the presence of metal in the semiconductor. An empirical model was derived for estimating the LED package failure time due to metal diffusion. The model is based on the experimental results and the theories of electromigration and diffusion. Furthermore, the experimental results showed that the thermal resistance of LED packages increased with aging time. A relationship between the thermal resistance change rate and the case temperature and temperature gradient within the LED package was developed.
The results showed that dislocation creep is responsible for creep-induced plastic deformation in the die-attach solder. The temperature inside the LED package reached the melting point of the die-attach solder due to delamination just before catastrophic open-circuit failure. A combined model that can estimate the life of LED packages based on catastrophic failure of thermal and electrical contacts is presented for the first time. This model can be used to make a priori or real-time estimates of LED package life based on catastrophic failure. Finally, to illustrate the usefulness of the findings of this thesis, two different implementations of real-time life prediction using prognostics and health monitoring techniques are discussed.
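The electromigration link between current density, temperature and failure time is conventionally summarized by Black's equation. The prefactor, current exponent and activation energy below are illustrative placeholders, not values fitted in the dissertation:

```python
import numpy as np

def black_mtf(j, T, A=1.0, n=2.0, Ea=0.7):
    """Black's equation for electromigration median time to failure:
    MTF = A * j**(-n) * exp(Ea / (k*T)).
    j : current density; T : absolute temperature (K).
    A, n and Ea are illustrative placeholders, not fitted values."""
    k = 8.617e-5                    # Boltzmann constant, eV/K
    return A * j ** (-n) * np.exp(Ea / (k * T))

# Higher current density and higher temperature both shorten the life,
# matching the stress dependence described in the abstract.
mtf_low = black_mtf(j=1e3, T=350.0)
mtf_high_j = black_mtf(j=2e3, T=350.0)
mtf_high_T = black_mtf(j=1e3, T=400.0)
```

With n = 2, doubling the current density cuts the predicted median life by a factor of four, which is why accelerated stress testing at elevated current is practical.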
Enhancing sedimentation by improving flow conditions using parallel retrofit baffles.
He, Cheng; Scott, Eric; Rochfort, Quintin
2015-09-01
In this study, placing parallel-connected baffles in the vicinity of the inlet was proposed to improve hydraulic conditions for enhancing TSS (total suspended solids) removal. The purpose of the retrofit baffle design is to divide the large and fast inflow into smaller and slower flows to increase flow uniformity. This avoids short-circuiting and increases residence time in the sedimentation basin. The newly proposed parallel-connected baffle configuration was assessed in the laboratory by comparing its TSS removal performance and optimal flow residence time with those of the widely used series-connected baffles. The experimental results showed that the parallel-connected baffles outperformed the series-connected baffles because they could disperse flow faster and in less space by splitting the large inflow into many small branches, instead of depending solely on flow internal friction over a longer flow path, as was the case with the series-connected baffles. Being able to dampen faster flow before it enters the sedimentation basin is critical to reducing the possibility of disturbing settled particles, especially under high inflow conditions. Also, for a large sedimentation basin, it may be more economically feasible to deploy the proposed parallel retrofit baffles in the vicinity of the inlet than series-connected baffles throughout the entire settling basin. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
Multiple-time scales analysis of physiological time series under neural control
NASA Technical Reports Server (NTRS)
Peng, C. K.; Hausdorff, J. M.; Havlin, S.; Mietus, J. E.; Stanley, H. E.; Goldberger, A. L.
1998-01-01
We discuss multiple-time scale properties of neurophysiological control mechanisms, using heart rate and gait regulation as model systems. We find that scaling exponents can be used as prognostic indicators. Furthermore, detection of more subtle degradation of scaling properties may provide a novel early warning system in subjects with a variety of pathologies including those at high risk of sudden death.
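The scaling exponents mentioned above are commonly estimated with detrended fluctuation analysis (DFA). This is a generic sketch on synthetic noise, not the authors' implementation; the scale range is an illustrative choice:

```python
import numpy as np

def dfa_exponent(x, scales=None):
    """Detrended fluctuation analysis: slope of log F(s) vs log s, where
    F(s) is the RMS of linearly detrended windows of the integrated
    series. alpha ~ 0.5 for white noise, ~ 1.0 for 1/f-like signals."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                    # integrated profile
    if scales is None:
        scales = np.unique(
            np.logspace(2, np.log10(len(x) // 4), 12).astype(int))
    flucts = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[: n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        ms = []
        for seg in segs:
            c = np.polyfit(t, seg, 1)              # linear detrend per window
            ms.append(np.mean((seg - np.polyval(c, t)) ** 2))
        flucts.append(np.sqrt(np.mean(ms)))
    alpha = np.polyfit(np.log(scales), np.log(flucts), 1)[0]
    return alpha

rng = np.random.default_rng(6)
alpha_white = dfa_exponent(rng.standard_normal(10000))
```

The prognostic use described in the abstract rests on healthy interbeat or stride series showing alpha near 1 while pathology pushes the exponent toward 0.5 or 1.5.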
ERIC Educational Resources Information Center
Meaders, O. Donald, Ed.; Ekpo-ufot, Abel, Ed.
The Shared-Time Concept project was one of several projects conducted under a grant for a developmental vocational education research and teacher education program based on a clinical school concept. The objectives were (1) to determine the extent and nature of use of the shared-time concept for conducting vocational education programs, and (2) to…
NASA Astrophysics Data System (ADS)
Mercier, R.; Murdin, P.
2002-01-01
From the time of Āryabhaṭa (ca AD 500) there appeared in India a series of Sanskrit treatises on astronomy. Written always in verse, and normally accompanied by prose commentaries, these served to create an Indian tradition of mathematical astronomy which continued into the 18th century. There are as well texts from earlier centuries, grouped under the name Jyotiṣavedāṅga…
Sohrabpour, Abbas; Ye, Shuai; Worrell, Gregory A.; Zhang, Wenbo
2016-01-01
Objective: Combined source imaging techniques and directional connectivity analysis can provide useful information about the underlying brain networks in a non-invasive fashion. Source imaging techniques have previously been used successfully either to determine the source of activity or to extract source time-courses for Granger causality analysis. In this work, we utilize source imaging algorithms both to find the network nodes (regions of interest) and to extract the activation time series for further Granger causality analysis. The aim of this work is to find network nodes objectively from noninvasive electromagnetic signals, extract activation time-courses and apply Granger analysis on the extracted series to study brain networks under realistic conditions. Methods: Source imaging methods are used to identify network nodes and extract time-courses, and then Granger causality analysis is applied to delineate the directional functional connectivity of the underlying brain networks. Computer simulation studies where the underlying network (nodes and connectivity pattern) is known were performed; additionally, this approach has been evaluated in partial epilepsy patients to study epilepsy networks from inter-ictal and ictal signals recorded by EEG and/or MEG. Results: Localization errors of network nodes were less than 5 mm, and normalized connectivity errors were ~20%, in estimating the underlying brain networks in the simulation studies. Additionally, two focal epilepsy patients were studied, and the identified nodes driving the epileptic network were concordant with clinical findings from intracranial recordings or surgical resection. Conclusion: Our study indicates that combining source imaging algorithms with Granger causality analysis can identify underlying networks precisely (both in terms of network node location and internodal connectivity).
Significance: The combined source imaging and Granger analysis technique is an effective tool for studying normal or pathological brain conditions. PMID:27740473
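Once activation time-courses are extracted, pairwise Granger causality reduces to comparing nested autoregressions. This is a minimal generic sketch, not the paper's pipeline: the model order and coupling are illustrative, and the real analysis operates on source-imaged ROI series:

```python
import numpy as np

def granger_f(x, y, p=2):
    """Pairwise Granger causality (does x help predict y?) via two nested
    AR(p) models: restricted (y's own past) vs full (y's past + x's past).
    Returns the F statistic; large values mean x Granger-causes y."""
    n = len(y)
    Y = y[p:]
    past_y = np.column_stack([y[p - k : n - k] for k in range(1, p + 1)])
    past_x = np.column_stack([x[p - k : n - k] for k in range(1, p + 1)])
    ones = np.ones((n - p, 1))
    Xr = np.hstack([ones, past_y])             # restricted model
    Xf = np.hstack([ones, past_y, past_x])     # full model
    rss_r = np.sum((Y - Xr @ np.linalg.lstsq(Xr, Y, rcond=None)[0]) ** 2)
    rss_f = np.sum((Y - Xf @ np.linalg.lstsq(Xf, Y, rcond=None)[0]) ** 2)
    dfn, dfd = p, (n - p) - Xf.shape[1]
    return ((rss_r - rss_f) / dfn) / (rss_f / dfd)

# Synthetic pair where x drives y with a one-sample lag.
rng = np.random.default_rng(7)
n = 2000
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.3 * rng.standard_normal()
f_xy = granger_f(x, y)    # driving direction: large F
f_yx = granger_f(y, x)    # reverse direction: near chance level
```

The asymmetry between the two F statistics is what delineates directional connectivity between network nodes.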
Dual-Pol X-Band Pol-InSAR Time Series of a Greenland Outlet Glacier
NASA Astrophysics Data System (ADS)
Fischer, Georg; Hajnsek, Irena
2015-04-01
This study investigates X-band (TanDEM-X) polarimetric and interferometric SAR (Pol-InSAR) data in order to retrieve information about the temporal and spatial variations of surface and subsurface parameters of the Helheim Glacier in south east Greenland. In particular, it will be indicated that the copolar phase difference between HH and VV could be a suitable proxy for snow accumulation, when Pol-InSAR techniques are used to assess the underlying scattering mechanism. By applying a two-phase mixing formula, this approach shows potential to reveal the temporal and spatial snow accumulation patterns in time series of TanDEM-X data.
Structure of public transit costs in the presence of multiple serial correlation
DOT National Transportation Integrated Search
1999-12-01
Most studies indicate that public transit systems operate under increasing returns to capital stock utilization and are significantly overcapitalized. Existing flexible form time series analyses, however, fail to correct for serial correlation. In th...
Online Detection of Driver Fatigue Using Steering Wheel Angles for Real Driving Conditions
Li, Zuojin; Li, Shengbo Eben; Li, Renjie; Cheng, Bo; Shi, Jinliang
2017-01-01
This paper presents an online drowsiness detection system for monitoring driver fatigue level under real driving conditions, based on data of steering wheel angles (SWA) collected from sensors mounted on the steering lever. The proposed system first extracts approximate entropy (ApEn) features from fixed sliding windows on the real-time steering wheel angle time series. After that, the system linearizes the ApEn feature series through adaptive piecewise linear fitting with a given deviation. Then, the detection system calculates the warping distance between the linear feature series of the sample data. Finally, the system uses the warping distance to determine the drowsiness state of the driver according to a designed binary decision classifier. The experimental data were collected from 14.68 h of driving under real road conditions, including two fatigue levels: "awake" and "drowsy". The results show that the proposed system is capable of working online with an average 78.01% accuracy, 29.35% false detections of the "awake" state, and 15.15% false detections of the "drowsy" state. The results also confirm that the proposed method based on the SWA signal is valuable for applications in preventing traffic accidents caused by driver fatigue. PMID:28257094
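The ApEn feature extraction step can be sketched directly from its definition. The tolerance r = 0.2·SD and the synthetic window contents below are generic conventions, not the paper's tuned settings:

```python
import numpy as np

def approximate_entropy(x, m=2, r=None):
    """Approximate entropy (ApEn): a regularity statistic. Lower values
    indicate a more regular, predictable signal (as expected for
    low-activity steering during drowsiness)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)            # conventional tolerance

    def phi(m):
        n = len(x) - m + 1
        emb = np.column_stack([x[i : i + n] for i in range(m)])
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=-1)
        c = np.mean(d <= r, axis=1)    # self-matches included, per ApEn
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(8)
regular = np.sin(np.linspace(0, 20 * np.pi, 1000))   # regular "signal"
irregular = rng.standard_normal(1000)                # erratic "signal"
apen_reg = approximate_entropy(regular)
apen_irr = approximate_entropy(irregular)
```

Sliding this statistic along the SWA series yields the feature sequence that the later piecewise-linear fitting and warping-distance classifier consume.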
Lecca, Paola; Mura, Ivan; Re, Angela; Barker, Gary C; Ihekwaba, Adaoha E C
2016-01-01
Chaotic behavior refers to behavior which, albeit irregular, is generated by an underlying deterministic process and is therefore potentially controllable. This possibility becomes practically amenable especially when chaos is shown to be low-dimensional, i.e., attributable to a small fraction of the total system's components. In this case, indeed, including the major drivers of chaos into the modeling approach allows us to improve the predictability of the system's dynamics. Here, we analyzed numerical simulations of an accurate ordinary differential equation model of the gene network regulating sporulation initiation in Bacillus subtilis to explore whether the non-linearity underlying the time series data is due to low-dimensional chaos. Low-dimensional chaos is expectedly common in systems with few degrees of freedom, but rare in systems with many degrees of freedom such as the B. subtilis sporulation network. The estimation of a number of indices that reflect the chaotic nature of a system indicates that the dynamics of this network is affected by deterministic chaos. The neat separation between the indices obtained from the time series simulated from the model and those obtained from time series generated by Gaussian white and colored noise confirmed that the B. subtilis sporulation network dynamics is affected by low-dimensional chaos rather than by noise. Furthermore, our analysis identifies the principal driver of the network's chaotic dynamics to be the sporulation initiation phosphotransferase B (Spo0B). We then analyzed the parameters and the phase space of the system to characterize the instability points of the network dynamics and, in turn, to identify the ranges of values of Spo0B and of the other drivers of the chaotic dynamics for which the whole system is highly sensitive to minimal perturbations. In summary, we described an unappreciated source of complexity in the B.
subtilis sporulation network by gathering evidence for the chaotic behavior of the system, and by suggesting candidate molecules driving chaos in the system. The results of our chaos analysis can increase our understanding of the intricacies of the regulatory network under analysis, and suggest experimental work to refine our understanding of the mechanisms underlying B. subtilis sporulation initiation control.
ImpulseDE: detection of differentially expressed genes in time series data using impulse models.
Sander, Jil; Schultze, Joachim L; Yosef, Nir
2017-03-01
Perturbations in the environment lead to distinctive gene expression changes within a cell. Observed over time, those variations can be characterized by single impulse-like progression patterns. ImpulseDE is an R package suited to capture these patterns in high-throughput time series datasets. By fitting a representative impulse model to each gene, it reports differentially expressed genes across time points within a single time course or between two time courses from two experiments. To optimize running time, the code uses clustering and multi-threading. By applying ImpulseDE, we demonstrate its power to represent the underlying biology of gene expression in microarray and RNA-Seq data. ImpulseDE is available on Bioconductor (https://bioconductor.org/packages/ImpulseDE/). niryosef@berkeley.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
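The impulse model fitted per gene is commonly parameterized as a product of two sigmoids: an initial level h0, a peak (or trough) level h1, a steady-state level h2, onset and offset times t1 and t2, and a slope beta. A minimal sketch of such an impulse function, assuming this standard double-sigmoid form rather than ImpulseDE's exact internal parameterization:

```python
import math

def impulse(t, h0, h1, h2, t1, t2, beta):
    """Double-sigmoid impulse model: rise from the initial level h0 toward the
    peak h1 after time t1, then settle toward the steady state h2 after t2."""
    s1 = h0 + (h1 - h0) / (1.0 + math.exp(-beta * (t - t1)))  # onset sigmoid
    s2 = h2 + (h1 - h2) / (1.0 + math.exp(beta * (t - t2)))   # offset sigmoid
    return s1 * s2 / h1
```

For steep slopes the curve is approximately h0 before t1, h1 between t1 and t2, and h2 afterwards, which is what lets a single fitted shape describe transient up- or down-regulation.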
33 CFR 74.20-1 - Buoy and vessel use costs.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 7310 (series) which is available from the District Budget Office of the appropriate Coast Guard District Commander. (b) Buoy and vessel use charges under this part are made for the cost or value of time... obstruction. No charge for time and expense of Coast Guard vessels is made when the marking of the obstruction...
33 CFR 74.20-1 - Buoy and vessel use costs.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 7310 (series) which is available from the District Budget Office of the appropriate Coast Guard District Commander. (b) Buoy and vessel use charges under this part are made for the cost or value of time... obstruction. No charge for time and expense of Coast Guard vessels is made when the marking of the obstruction...
Wu, Mike; Ghassemi, Marzyeh; Feng, Mengling; Celi, Leo A; Szolovits, Peter; Doshi-Velez, Finale
2017-05-01
The widespread adoption of electronic health records allows us to ask evidence-based questions about the need for and benefits of specific clinical interventions in critical-care settings across large populations. We investigated the prediction of vasopressor administration and weaning in the intensive care unit. Vasopressors are commonly used to control hypotension, and changes in timing and dosage can have a large impact on patient outcomes. We considered a cohort of 15 695 intensive care unit patients without orders for reduced care who were alive 30 days post-discharge. A switching-state autoregressive model (SSAM) was trained to predict the multidimensional physiological time series of patients before, during, and after vasopressor administration. The latent states from the SSAM were used as predictors of vasopressor administration and weaning. The unsupervised SSAM features were able to predict patient vasopressor administration and successful patient weaning. Features derived from the SSAM achieved areas under the receiver operating curve of 0.92, 0.88, and 0.71 for predicting ungapped vasopressor administration, gapped vasopressor administration, and vasopressor weaning, respectively. We also demonstrated many cases where our model predicted weaning well in advance of a successful wean. Models that used SSAM features increased performance on both predictive tasks. These improvements may reflect an underlying, and ultimately predictive, latent state detectable from the physiological time series. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
Ocean rogue waves and their phase space dynamics in the limit of a linear interference model.
Birkholz, Simon; Brée, Carsten; Veselić, Ivan; Demircan, Ayhan; Steinmeyer, Günter
2016-10-12
We reanalyse the probability for formation of extreme waves using the simple model of linear interference of a finite number of elementary waves with fixed amplitude and random phase fluctuations. Under these model assumptions no rogue waves appear when less than 10 elementary waves interfere with each other. Above this threshold rogue wave formation becomes increasingly likely, with appearance frequencies that may even exceed long-term observations by an order of magnitude. For estimation of the effective number of interfering waves, we suggest the Grassberger-Procaccia dimensional analysis of individual time series. For the ocean system, it is further shown that the resulting phase space dimension may vary, such that the threshold for rogue wave formation is not always reached. Time series analysis as well as the appearance of particular focusing wind conditions may enable an effective forecast of such rogue-wave prone situations. In particular, extracting the dimension from ocean time series allows much more specific estimation of the rogue wave probability.
Credit Default Swaps networks and systemic risk
Puliga, Michelangelo; Caldarelli, Guido; Battiston, Stefano
2014-01-01
Credit Default Swaps (CDS) spreads should reflect default risk of the underlying corporate debt. Actually, it has been recognized that CDS spread time series did not anticipate but only followed the increasing risk of default before the financial crisis. In principle, the network of correlations among CDS spread time series could at least display some form of structural change to be used as an early warning of systemic risk. Here we study a set of 176 CDS time series of financial institutions from 2002 to 2011. Networks are constructed in various ways, some of which display structural change at the onset of the credit crisis of 2008, but never before. By taking these networks as a proxy of interdependencies among financial institutions, we run stress-test based on Group DebtRank. Systemic risk before 2008 increases only when incorporating a macroeconomic indicator reflecting the potential losses of financial assets associated with house prices in the US. This approach indicates a promising way to detect systemic instabilities. PMID:25366654
Credit Default Swaps networks and systemic risk.
Puliga, Michelangelo; Caldarelli, Guido; Battiston, Stefano
2014-11-04
Credit Default Swaps (CDS) spreads should reflect default risk of the underlying corporate debt. Actually, it has been recognized that CDS spread time series did not anticipate but only followed the increasing risk of default before the financial crisis. In principle, the network of correlations among CDS spread time series could at least display some form of structural change to be used as an early warning of systemic risk. Here we study a set of 176 CDS time series of financial institutions from 2002 to 2011. Networks are constructed in various ways, some of which display structural change at the onset of the credit crisis of 2008, but never before. By taking these networks as a proxy of interdependencies among financial institutions, we run stress-test based on Group DebtRank. Systemic risk before 2008 increases only when incorporating a macroeconomic indicator reflecting the potential losses of financial assets associated with house prices in the US. This approach indicates a promising way to detect systemic instabilities.
Credit Default Swaps networks and systemic risk
NASA Astrophysics Data System (ADS)
Puliga, Michelangelo; Caldarelli, Guido; Battiston, Stefano
2014-11-01
Credit Default Swaps (CDS) spreads should reflect default risk of the underlying corporate debt. Actually, it has been recognized that CDS spread time series did not anticipate but only followed the increasing risk of default before the financial crisis. In principle, the network of correlations among CDS spread time series could at least display some form of structural change to be used as an early warning of systemic risk. Here we study a set of 176 CDS time series of financial institutions from 2002 to 2011. Networks are constructed in various ways, some of which display structural change at the onset of the credit crisis of 2008, but never before. By taking these networks as a proxy of interdependencies among financial institutions, we run stress-test based on Group DebtRank. Systemic risk before 2008 increases only when incorporating a macroeconomic indicator reflecting the potential losses of financial assets associated with house prices in the US. This approach indicates a promising way to detect systemic instabilities.
Automatising the analysis of stochastic biochemical time-series
2015-01-01
Background Mathematical and computational modelling of biochemical systems has seen a lot of effort devoted to the definition and implementation of high-performance mechanistic simulation frameworks. Within these frameworks it is possible to analyse complex models under a variety of configurations, eventually selecting the best setting of, e.g., parameters for a target system. Motivation This operational pipeline relies on the ability to interpret the predictions of a model, often represented as simulation time-series. Thus, an efficient data analysis pipeline is crucial to automatise time-series analyses, bearing in mind that errors in this phase might mislead the modeller's conclusions. Results For this reason we have developed an intuitive framework-independent Python tool to automate analyses common to a variety of modelling approaches. These include assessment of useful non-trivial statistics for simulation ensembles, e.g., estimation of master equations. Intuitive and domain-independent batch scripts will allow the researcher to automatically prepare reports, thus speeding up the usual model-definition, testing and refinement pipeline. PMID:26051821
Ocean rogue waves and their phase space dynamics in the limit of a linear interference model
Birkholz, Simon; Brée, Carsten; Veselić, Ivan; Demircan, Ayhan; Steinmeyer, Günter
2016-01-01
We reanalyse the probability for formation of extreme waves using the simple model of linear interference of a finite number of elementary waves with fixed amplitude and random phase fluctuations. Under these model assumptions no rogue waves appear when less than 10 elementary waves interfere with each other. Above this threshold rogue wave formation becomes increasingly likely, with appearance frequencies that may even exceed long-term observations by an order of magnitude. For estimation of the effective number of interfering waves, we suggest the Grassberger-Procaccia dimensional analysis of individual time series. For the ocean system, it is further shown that the resulting phase space dimension may vary, such that the threshold for rogue wave formation is not always reached. Time series analysis as well as the appearance of particular focusing wind conditions may enable an effective forecast of such rogue-wave prone situations. In particular, extracting the dimension from ocean time series allows much more specific estimation of the rogue wave probability. PMID:27731411
NASA Astrophysics Data System (ADS)
Erkyihun, Solomon Tassew; Rajagopalan, Balaji; Zagona, Edith; Lall, Upmanu; Nowak, Kenneth
2016-05-01
A model to generate stochastic streamflow projections conditioned on quasi-oscillatory climate indices such as Pacific Decadal Oscillation (PDO) and Atlantic Multi-decadal Oscillation (AMO) is presented. Recognizing that each climate index has underlying band-limited components that contribute most of the energy of the signals, we first pursue a wavelet decomposition of the signals to identify and reconstruct these features from annually resolved historical data and proxy based paleoreconstructions of each climate index covering the period from 1650 to 2012. A K-Nearest Neighbor block bootstrap approach is then developed to simulate the total signal of each of these climate index series while preserving its time-frequency structure and marginal distributions. Finally, given the simulated climate signal time series, a K-Nearest Neighbor bootstrap is used to simulate annual streamflow series conditional on the joint state space defined by the simulated climate index for each year. We demonstrate this method by applying it to simulation of streamflow at Lees Ferry gauge on the Colorado River using indices of two large scale climate forcings: Pacific Decadal Oscillation (PDO) and Atlantic Multi-decadal Oscillation (AMO), which are known to modulate the Colorado River Basin (CRB) hydrology at multidecadal time scales. Skill in stochastic simulation of multidecadal projections of flow using this approach is demonstrated.
The Excel file contains time series data of flow rates and concentrations of alachlor, atrazine, ammonia, total phosphorus, and total suspended solids observed in two watersheds in Indiana from 2002 to 2007. The aggregate time series data representative of all these parameters was obtained using a specialized, data-driven technique. The aggregate data is hypothesized in the published paper to represent the overall health of both watersheds with respect to various potential water quality impairments. The time series data for each of the individual water quality parameters were used to compute corresponding risk measures (reliability, resilience, and vulnerability: Rel, Res, and Vul) reported in Tables 4 and 5. The aggregation of the risk measures, computed from the aggregate time series and the water quality standards in Table 1, is also reported in Tables 4 and 5 of the published paper. Values under the column heading 'uncertainty' report uncertainties associated with reconstruction of missing records of the water quality parameters. Long-term records of the water quality parameters were reconstructed in order to estimate the R-R-V and corresponding aggregate risk measures. This dataset is associated with the following publication: Hoque, Y., S. Tripathi, M. Hantush, and R. Govindaraju. Aggregate Measures of Watershed Health from Reconstructed Water Quality Data with Uncertainty. Journal of Environmental Quality, American Society of Agronomy, Madison, WI.
Implementation of NASTRAN on the IBM/370 CMS operating system
NASA Technical Reports Server (NTRS)
Britten, S. S.; Schumacker, B.
1980-01-01
The NASA Structural Analysis (NASTRAN) computer program is operational on the IBM 360/370 series computers. While execution of NASTRAN has been described and implemented under the virtual storage operating systems of the IBM 370 models, the IBM 370/168 computer can also operate in a time-sharing mode under the virtual machine operating system using the Conversational Monitor System (CMS) subset. The changes required to make NASTRAN operational under the CMS operating system are described.
Revising time series of the Elbe river discharge for flood frequency determination at gauge Dresden
NASA Astrophysics Data System (ADS)
Bartl, S.; Schümberg, S.; Deutsch, M.
2009-11-01
The German research programme RIsk MAnagement of eXtreme flood events has worked to improve regional hazard assessment for the large rivers in Germany. Here we focused on the Elbe river at its gauge Dresden, which belongs to the oldest gauges in Europe, with officially available daily discharge time series beginning on 1 January 1890. The project aimed, on the one hand, to extend and revise the existing time series, and on the other, to examine the variability of the Elbe river discharge conditions on a greater time scale. Therefore one major task was the historical search and the examination of the retrieved documents and the information they contained. After analysing this information, the development of the river course and the discharge conditions were discussed. Using the knowledge thus provided, a historical hydraulic model was established in another subproject; its results were then used here. A further purpose was the determination of flood frequency based on all pre-processed data. The knowledge obtained about historical changes was also used to get an idea about possible future variations under climate change conditions. In particular, variations in the runoff characteristic of the Elbe river over the course of the year were analysed. We obtained a much longer discharge time series containing fewer errors and uncertainties. Hence an optimized regional hazard assessment was realised.
Faes, Luca; Nollo, Giandomenico
2010-11-01
The Partial Directed Coherence (PDC) and its generalized formulation (gPDC) are popular tools for investigating, in the frequency domain, the concept of Granger causality among multivariate (MV) time series. PDC and gPDC are formalized in terms of the coefficients of an MV autoregressive (MVAR) model which describes only the lagged effects among the time series and forsakes instantaneous effects. However, instantaneous effects are known to affect linear parametric modeling, and are likely to occur in experimental time series. In this study, we investigate the impact on the assessment of frequency domain causality of excluding instantaneous effects from the model underlying PDC evaluation. Moreover, we propose the utilization of an extended MVAR model including both instantaneous and lagged effects. This model is used to assess PDC either in accordance with the definition of Granger causality when considering only lagged effects (iPDC), or with an extended form of causality, when we consider both instantaneous and lagged effects (ePDC). The approach is first evaluated on three theoretical examples of MVAR processes, which show that the presence of instantaneous correlations may produce misleading profiles of PDC and gPDC, while ePDC and iPDC derived from the extended model provide here a correct interpretation of extended and lagged causality. It is then applied to representative examples of cardiorespiratory and EEG MV time series. They suggest that ePDC and iPDC are better interpretable than PDC and gPDC in terms of the known cardiovascular and neural physiologies.
NASA Astrophysics Data System (ADS)
Sultana, Tahmina; Takagi, Hiroaki; Morimatsu, Miki; Teramoto, Hiroshi; Li, Chun-Biu; Sako, Yasushi; Komatsuzaki, Tamiki
2013-12-01
We present a novel scheme to extract a multiscale state space network (SSN) from single-molecule time series. The multiscale SSN is a type of hidden Markov model that takes into account both multiple states buried in the measurement and memory effects in the process of the observable whenever they exist. Most biological systems function in a nonstationary manner across multiple timescales. Combined with a recently established nonlinear time series analysis based on information theory, a simple scheme is proposed to deal with the properties of multiscale and nonstationarity for a discrete time series. We derived an explicit analytical expression of the autocorrelation function in terms of the SSN. To demonstrate the potential of our scheme, we investigated single-molecule time series of dissociation and association kinetics between epidermal growth factor receptor (EGFR) on the plasma membrane and its adaptor protein Ash/Grb2 (Grb2) in an in vitro reconstituted system. We found that our formula successfully reproduces their autocorrelation function for a wide range of timescales (up to 3 s), and the underlying SSNs change their topographical structure as a function of the timescale; while the corresponding SSN is simple at the short timescale (0.033-0.1 s), the SSN at the longer timescales (0.1 s to ~3 s) becomes rather complex in order to capture multiscale nonstationary kinetics emerging at longer timescales. It is also found that visiting the unbound form of the EGFR-Grb2 system approximately resets all information of history or memory of the process.
NASA Astrophysics Data System (ADS)
Ozawa, Taku; Ueda, Hideki
2011-12-01
InSAR time series analysis is an effective tool for detecting spatially and temporally complicated volcanic deformation. To obtain details of such deformation, we developed an advanced InSAR time series analysis using interferograms of multiple-orbit tracks. Considering only right- (or only left-) looking SAR observations, incidence directions for different orbit tracks are mostly included in a common plane. Therefore, slant-range changes in their interferograms can be expressed by two components in the plane. This approach estimates the time series of their components from interferograms of multiple-orbit tracks by the least squares analysis, and higher accuracy is obtained if many interferograms of different orbit tracks are available. Additionally, this analysis can combine interferograms for different incidence angles. In a case study on Miyake-jima, we obtained a deformation time series corresponding to GPS observations from PALSAR interferograms of six orbit tracks. The obtained accuracy was better than that with the SBAS approach, demonstrating its effectiveness. Furthermore, it is expected that higher accuracy would be obtained if SAR observations were carried out more frequently in all orbit tracks. The deformation obtained in the case study indicates uplift along the west coast and subsidence with contraction around the caldera. The speed of the uplift was almost constant, but the subsidence around the caldera decelerated from 2009. A flat deformation source was estimated near sea level under the caldera, implying that deceleration of subsidence was related to interaction between volcanic thermal activity and the aquifer.
"Time-dependent flow-networks"
NASA Astrophysics Data System (ADS)
Tupikina, Liubov; Molkentin, Nora; Lopez, Cristobal; Hernandez-Garcia, Emilio; Marwan, Norbert; Kurths, Jürgen
2015-04-01
Complex networks have been successfully applied to various systems such as society, technology, and recently climate. Links in a climate network are defined between two geographical locations if the correlation between the time series of some climate variable is higher than a threshold. Network links are therefore considered to imply information or heat exchange. However, the relationship between the oceanic and atmospheric flows and the climate network's structure is still unclear. Recently, a theoretical approach verifying the correlation between ocean currents and surface air temperature networks has been introduced, where Pearson correlation networks were constructed from advection-diffusion dynamics on an underlying flow. Since the continuous approach has its limitations, i.e., high computational complexity and a fixed variety of flows in the underlying system, we introduce a new flow-network method for velocity fields that change in time, including external forcing, noise, and temperature decay. The flow-network construction can be divided into several steps: first we obtain a linear recursive equation for the temperature time series; then we compute the correlation matrix of the time series, averaging the tensor product over all realizations of the noise, which we interpret as a weighted adjacency matrix of the flow-network and analyze using network measures. We apply the method to different types of moving flows with geographical relevance, such as a meandering flow. Analysing the flow-networks with network measures, we find that our approach can highlight zones of high velocity by degree and transition zones by betweenness, while the combination of these network measures can uncover how the flow propagates in time. Flow-networks can be a powerful tool for understanding the connection between a system's dynamics and a network's topology, analyzed using network measures, in order to shed light on different climatic phenomena.
Inference for local autocorrelations in locally stationary models.
Zhao, Zhibiao
2015-04-01
For non-stationary processes, the time-varying correlation structure provides useful insights into the underlying model dynamics. We study estimation and inferences for local autocorrelation process in locally stationary time series. Our constructed simultaneous confidence band can be used to address important hypothesis testing problems, such as whether the local autocorrelation process is indeed time-varying and whether the local autocorrelation is zero. In particular, our result provides an important generalization of the R function acf() to locally stationary Gaussian processes. Simulation studies and two empirical applications are developed. For the global temperature series, we find that the local autocorrelations are time-varying and have a "V" shape during 1910-1960. For the S&P 500 index, we conclude that the returns satisfy the efficient-market hypothesis whereas the magnitudes of returns show significant local autocorrelations.
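The local autocorrelation process studied above generalizes the global acf(); a crude moving-window estimator conveys the idea. The window length and lag here are arbitrary illustrative choices, and the paper's estimator is kernel-based with simultaneous confidence bands, which this sketch omits:

```python
def local_autocorr(x, window, lag=1):
    """Lag-`lag` autocorrelation computed in a sliding window: a crude
    estimate of the local autocorrelation process of a locally stationary
    series. Returns one value per window start position."""
    out = []
    for start in range(len(x) - window + 1):
        w = x[start:start + window]
        m = sum(w) / window
        var = sum((v - m) ** 2 for v in w)
        cov = sum((w[i] - m) * (w[i + lag] - m) for i in range(window - lag))
        out.append(cov / var if var > 0 else 0.0)
    return out
```

Plotting this sequence against time is the simplest way to see whether the autocorrelation is itself time-varying, which is exactly the hypothesis the confidence band formalizes.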
Chen, Chi-Kan
2017-07-26
The identification of genetic regulatory networks (GRNs) provides insights into complex cellular processes. A class of recurrent neural networks (RNNs) captures the dynamics of a GRN. Algorithms combining the RNN and machine learning schemes were proposed to reconstruct small-scale GRNs using gene expression time series. We present new GRN reconstruction methods with neural networks. The RNN is extended to a class of recurrent multilayer perceptrons (RMLPs) with latent nodes. Our methods contain two steps: the edge rank assignment step and the network construction step. The former assigns ranks to all possible edges by a recursive procedure based on the estimated weights of wires of the RNN/RMLP (RE_RNN/RE_RMLP), and the latter constructs a network consisting of top-ranked edges under which the optimized RNN simulates the gene expression time series. Particle swarm optimization (PSO) is applied to optimize the parameters of the RNNs and RMLPs in a two-step algorithm. The proposed RE_RNN-RNN and RE_RMLP-RNN algorithms are tested on synthetic and experimental gene expression time series of small GRNs of about 10 genes. The experimental time series are from studies of yeast cell cycle regulated genes and E. coli DNA repair genes. The unstable estimation of an RNN using experimental time series with limited data points can lead to fairly arbitrary predicted GRNs. Our methods incorporate the RNN and RMLP into a two-step structure learning procedure. Results show that RE_RMLP, using the RMLP with a suitable number of latent nodes to reduce the parameter dimension, often results in more accurate edge ranks than RE_RNN using the regularized RNN on short simulated time series. Combining, by a weighted majority voting rule, the networks derived by RE_RMLP-RNN using different numbers of latent nodes in step one to infer the GRN, the method performs consistently and outperforms published algorithms for GRN reconstruction on most benchmark time series. The framework of two-step algorithms can potentially incorporate different nonlinear differential equation models to reconstruct the GRN.
Multifractal diffusion entropy analysis: Optimal bin width of probability histograms
NASA Astrophysics Data System (ADS)
Jizba, Petr; Korbel, Jan
2014-11-01
In the framework of Multifractal Diffusion Entropy Analysis we propose a method for choosing an optimal bin-width in histograms generated from underlying probability distributions of interest. The method presented uses techniques of Rényi’s entropy and the mean squared error analysis to discuss the conditions under which the error in the multifractal spectrum estimation is minimal. We illustrate the utility of our approach by focusing on a scaling behavior of financial time series. In particular, we analyze the S&P500 stock index as sampled at a daily rate in the time period 1950-2013. In order to demonstrate a strength of the method proposed we compare the multifractal δ-spectrum for various bin-widths and show the robustness of the method, especially for large values of q. For such values, other methods in use, e.g., those based on moment estimation, tend to fail for heavy-tailed data or data with long correlations. Connection between the δ-spectrum and Rényi’s q parameter is also discussed and elucidated on a simple example of multiscale time series.
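The bin-width dependence enters through the histogram-based estimate of Rényi's entropy, which underlies the multifractal spectrum. A minimal estimator is sketched below; the bin origin and the handling of the q = 1 Shannon limit are implementation choices of this sketch, not prescriptions from the paper:

```python
import math

def renyi_entropy(data, bin_width, q):
    """Rényi entropy of order q estimated from a histogram of `data`
    built with the given bin width (bins anchored at the sample minimum)."""
    lo = min(data)
    counts = {}
    for v in data:
        b = int((v - lo) / bin_width)
        counts[b] = counts.get(b, 0) + 1
    n = len(data)
    probs = [c / n for c in counts.values()]
    if q == 1:
        return -sum(p * math.log(p) for p in probs)  # Shannon limit of Rényi entropy
    return math.log(sum(p ** q for p in probs)) / (1 - q)
```

Sweeping `bin_width` and watching how the estimate (and hence the fitted scaling exponents) moves is the basic experiment behind choosing an optimal width via mean squared error.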
Time-series analysis of delta13C from tree rings. I. Time trends and autocorrelation.
Monserud, R A; Marshall, J D
2001-09-01
Univariate time-series analyses were conducted on stable carbon isotope ratios obtained from tree-ring cellulose. We looked for the presence and structure of autocorrelation. Significant autocorrelation violates the statistical independence assumption and biases hypothesis tests. Its presence would indicate the existence of lagged physiological effects that persist for longer than the current year. We analyzed data from 28 trees (60-85 years old; mean = 73 years) of western white pine (Pinus monticola Dougl.), ponderosa pine (Pinus ponderosa Laws.), and Douglas-fir (Pseudotsuga menziesii (Mirb.) Franco var. glauca) growing in northern Idaho. Material was obtained by the stem analysis method from rings laid down in the upper portion of the crown throughout each tree's life. The sampling protocol minimized variation caused by changing light regimes within each tree. Autoregressive moving average (ARMA) models were used to describe the autocorrelation structure over time. Three time series were analyzed for each tree: the stable carbon isotope ratio (delta(13)C); discrimination (delta); and the difference between ambient and internal CO(2) concentrations (c(a) - c(i)). The effect of converting from ring cellulose to whole-leaf tissue did not affect the analysis because it was almost completely removed by the detrending that precedes time-series analysis. A simple linear or quadratic model adequately described the time trend. The residuals from the trend had a constant mean and variance, thus ensuring stationarity, a requirement for autocorrelation analysis. The trend over time for c(a) - c(i) was particularly strong (R(2) = 0.29-0.84). Autoregressive moving average analyses of the residuals from these trends indicated that two-thirds of the individual tree series contained significant autocorrelation, whereas the remaining third were random (white noise) over time. 
We were unable to distinguish between individuals with and without significant autocorrelation beforehand. Significant ARMA models were all of low order, with either first- or second-order (i.e., lagged 1 or 2 years, respectively) models performing well. A simple autoregressive (AR(1)) model was the most common. The most useful generalization was that the same ARMA model holds for each of the three series (delta(13)C, delta, c(a) - c(i)) for an individual tree, if the time trend has been properly removed for each series. The mean series for the two pine species were described by first-order ARMA models (1-year lags), whereas the Douglas-fir mean series were described by second-order models (2-year lags) with negligible first-order effects. Apparently, the process of constructing a mean time series for a species preserves an underlying signal related to delta(13)C while canceling some of the random individual tree variation. Furthermore, the best model for the overall mean series (e.g., for a species) cannot be inferred from a consensus of the individual tree model forms, nor can its parameters be estimated reliably from the mean of the individual tree parameters. Because two-thirds of the individual tree time series contained significant autocorrelation, the normal assumption of a random structure over time is unwarranted, even after accounting for the time trend. The residuals of an appropriate ARMA model satisfy the independence assumption, and can be used to make hypothesis tests.
Detecting switching and intermittent causalities in time series
NASA Astrophysics Data System (ADS)
Zanin, Massimiliano; Papo, David
2017-04-01
During the last decade, complex network representations have emerged as a powerful instrument for describing the cross-talk between different brain regions, both at rest and as subjects carry out cognitive tasks, in healthy brains and in neurological pathologies. The transient nature of such cross-talk has nevertheless by and large been neglected, mainly due to the inherent limitations of some metrics (e.g., causality measures), which require long time series in order to yield statistically significant results. Here, we present a methodology to account for intermittent causal coupling in neural activity, based on the identification of non-overlapping windows within the original time series in which the causality is strongest. The result is a less coarse-grained assessment of the time-varying properties of brain interactions, which can be used to create a high temporal resolution time-varying network. We apply the proposed methodology to the analysis of the brain activity of control subjects and alcoholic patients performing an image recognition task. Our results show that short-lived, intermittent, local-scale causality is better at discriminating between the two groups than global network metrics. These results highlight the importance of the transient nature of brain activity, at least under some pathological conditions.
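A minimal sketch of the windowed-causality idea: scan non-overlapping windows and keep the one with the strongest coupling. The lag-1 Granger-style variance-reduction score and the window length below are assumptions of this sketch, not the metric used by the authors:

```python
import numpy as np

def granger_stat(x, y):
    """Lag-1 Granger-style statistic: relative variance reduction in
    predicting y from (y_{t-1}, x_{t-1}) versus y_{t-1} alone."""
    Y = y[1:]
    A1 = np.column_stack([np.ones(len(Y)), y[:-1]])
    A2 = np.column_stack([A1, x[:-1]])
    r1 = Y - A1 @ np.linalg.lstsq(A1, Y, rcond=None)[0]
    r2 = Y - A2 @ np.linalg.lstsq(A2, Y, rcond=None)[0]
    return (r1 @ r1 - r2 @ r2) / (r2 @ r2 + 1e-12)

def strongest_window(x, y, win=50):
    """Scan non-overlapping windows; return the start index of the window
    with the strongest x -> y coupling."""
    scores = {s: granger_stat(x[s:s + win], y[s:s + win])
              for s in range(0, len(x) - win + 1, win)}
    return max(scores, key=scores.get)
```

In the paper's setting, the per-window scores themselves (not just the argmax) would feed a high-temporal-resolution time-varying network.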
Spectral analysis of finite-time correlation matrices near equilibrium phase transitions
NASA Astrophysics Data System (ADS)
Vinayak; Prosen, T.; Buča, B.; Seligman, T. H.
2014-10-01
We study spectral densities for systems on lattices which, at a phase transition, display power-law spatial correlations. Constructing the spatial correlation matrix, we prove that its eigenvalue density shows a power law that can be derived from the spatial correlations. In practice, time series are short in the sense that they are either not stationary or not available over long time intervals; moreover, time series are usually not available for all variables. We make numerical simulations on a two-dimensional Ising model with the usual Metropolis algorithm as time evolution. Using all spins on a grid with periodic boundary conditions, we find a power law that is, for large grids, compatible with the analytic result. We still find a power law even if we choose a fairly small subset of grid points at random, although the exponents of the power laws are smaller under such circumstances. For very short time series leading to singular correlation matrices, we use a recently developed technique to lift the degeneracy at zero in the spectrum, and we find a significant signature of critical behavior even in this case, as compared to high-temperature results, which tend to those of random matrix models.
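Computing the eigenvalue spectrum of an equal-time correlation matrix from (T x N) time-series data is straightforward; the sketch below contrasts uncorrelated data with data carrying a common mode. The Ising/lattice specifics of the paper are not reproduced; this only illustrates the correlation-matrix construction:

```python
import numpy as np

def corr_eigs(X):
    """Eigenvalues of the equal-time correlation matrix of time series.

    X has shape (T, N): T time samples of N variables (e.g. lattice sites).
    Each column is standardized before forming C = X'X / T."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    C = Xs.T @ Xs / len(Xs)
    return np.linalg.eigvalsh(C)
```

For independent variables the spectrum stays within the Marchenko-Pastur bulk familiar from random matrix theory, whereas strong spatial correlations push eigenvalue weight far outside it, which is the kind of signature the paper analyzes near criticality.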
LONG-TERM PROJECTIONS OF EASTERN OYSTER POPULATIONS UNDER VARIOUS MANAGEMENT SCENARIOS
Time series of fishery-dependent and fishery-independent data were used to parameterize a model of oyster population dynamics for Maryland's Chesapeake Bay. Model parameters are (1) fishing mortality, estimated from differences between predicted and reported landings scaled to a ...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-23
... the set of risk factors whose behavior is included in the econometric models underlying STANS, time series of proportional changes in implied volatilities for a range of tenors and in-the-money and out-of...
Assessment of time-series MODIS data for cropland mapping in the U.S. central Great Plains
NASA Astrophysics Data System (ADS)
Masialeti, Iwake
This study had three general objectives. First, to explore ways of creating and refining a reference data set when one is unobtainable. Second, to extend work previously done in Kansas by Wardlow et al. (2007) to Nebraska; several exploratory approaches were used to further investigate the potential of MODIS NDVI 250-m data for agriculture-related land cover research in other parts of the Great Plains. The objective of this part of the research was to evaluate the applicability of time-series MODIS 250-m NDVI data for crop-type discrimination by spectrally characterizing and discriminating major crop types in Nebraska using the reference data set collected and refined under research performed for the first objective. Third, to conduct an initial investigation into whether time-series NDVI response curves for crops over a growing season in one year could be used to classify crops for a different year. In this case, time-series NDVI response curves for 2001 and 2005 were investigated to ascertain whether or not the 2001 data set could be used to classify crops for 2005. GIS operations and reference data refinement using clustering and visual assessment of each crop's NDVI cluster profiles in Nebraska demonstrated that it is possible to devise an alternative reference data set and refinement plan that redresses the unexpected loss of training and validation data. The analysis enabled the identification and removal of crop pattern outliers and sites atypical of the crop phenology under consideration; after editing, a total of 1,288 field sites remained, which were used as a reference data set for classification of Nebraska crop types. A pixel-level analysis of the time-series MODIS 250-m NDVI for the 1,288 field sites representing each of the eight cover types under investigation across Nebraska found that each crop type had a distinctive MODIS 250-m NDVI profile corresponding to its crop calendar.
A visual and statistical comparison of the average NDVI profiles showed that the crop types were separable at different times of the growing season based on their phenology-driven spectral-temporal differences. Winter wheat and alfalfa, winter wheat and summer crops, and alfalfa and summer crops were clearly separable. Specific summer crop types were not easily distinguishable from each other due to their similar crop calendars; their greatest separability occurred during the initial spring green-up and/or senescence plant growth phases. In Kansas, an initial investigation revealed near-complete agreement between the winter wheat crop profiles, but some minor differences in the crop profiles for alfalfa and summer crops between 2001 and 2005. However, the profiles of summer crops (corn, grain sorghum, and soybeans) displayed a shift to the right by at least one composite date, indicative of possible late crop planting and emergence. The results for alfalfa and summer crops suggest that time-series NDVI response curves for crops over a growing period in one year of valid ground reference data may not be usable to map crops for a different year without taking into account the climatic and/or environmental conditions of each year.
A new correlation coefficient for bivariate time-series data
NASA Astrophysics Data System (ADS)
Erdem, Orhan; Ceyhan, Elvan; Varli, Yusuf
2014-11-01
The correlation in time series has received considerable attention in the literature. Its use has attained an important role in the social sciences and finance. For example, pair trading in finance is concerned with the correlation between stock prices, returns, etc. In general, Pearson’s correlation coefficient is employed in these areas although it has many underlying assumptions which restrict its use. Here, we introduce a new correlation coefficient which takes into account the lag difference of data points. We investigate the properties of this new correlation coefficient. We demonstrate that it is more appropriate for showing the direction of the covariation of the two variables over time. We also compare the performance of the new correlation coefficient with Pearson’s correlation coefficient and Detrended Cross-Correlation Analysis (DCCA) via simulated examples.
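The coefficient proposed in the paper is defined in the reference itself; as a generic illustration of lag-aware association, the sketch below scans Pearson correlation over integer shifts and reports the best lag. This is a stand-in for the idea of accounting for lag differences, not the authors' coefficient:

```python
import numpy as np

def lagged_corr(x, y, max_lag=10):
    """Pearson correlation of pairs (x[t], y[t+k]) for each lag k in
    [-max_lag, max_lag]; returns (best_lag, best_corr)."""
    best = (0, -np.inf)
    for k in range(-max_lag, max_lag + 1):
        if k > 0:
            a, b = x[:-k], y[k:]
        elif k < 0:
            a, b = x[-k:], y[:k]
        else:
            a, b = x, y
        r = np.corrcoef(a, b)[0, 1]
        if r > best[1]:
            best = (k, r)
    return best
```

Plain (lag-0) Pearson correlation can be near zero for series that co-move with a delay; a lag scan like this recovers the association, which is the motivation the abstract gives for going beyond Pearson's coefficient.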
Change Point Detection in Correlation Networks
NASA Astrophysics Data System (ADS)
Barnett, Ian; Onnela, Jukka-Pekka
2016-01-01
Many systems of interacting elements can be conceptualized as networks, where network nodes represent the elements and network ties represent interactions between the elements. In systems where the underlying network evolves, it is useful to determine the points in time where the network structure changes significantly as these may correspond to functional change points. We propose a method for detecting change points in correlation networks that, unlike previous change point detection methods designed for time series data, requires minimal distributional assumptions. We investigate the difficulty of change point detection near the boundaries of the time series in correlation networks and study the power of our method and competing methods through simulation. We also show the generalizable nature of the method by applying it to stock price data as well as fMRI data.
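The notion of a change point in correlation structure can be conveyed with a crude distance heuristic: compare correlation matrices of adjacent windows and look for a peak. The authors' method is a formal test with minimal distributional assumptions; this Frobenius-distance scan, with its window length, is only a sketch:

```python
import numpy as np

def corr_change_scores(X, win=50):
    """For each candidate split s, Frobenius distance between the
    correlation matrices of the windows just before and just after s.
    X has shape (T, N); a peak in the score suggests a change point."""
    T = len(X)
    starts = np.arange(win, T - win + 1)
    scores = []
    for s in starts:
        C1 = np.corrcoef(X[s - win:s], rowvar=False)
        C2 = np.corrcoef(X[s:s + win], rowvar=False)
        scores.append(np.linalg.norm(C1 - C2))
    return starts, scores
```

As the abstract notes, detection is hardest near the boundaries of the series, where one of the two windows no longer fits; the heuristic above simply cannot score those points.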
Prediction Analysis for Measles Epidemics
NASA Astrophysics Data System (ADS)
Sumi, Ayako; Ohtomo, Norio; Tanaka, Yukio; Sawamura, Sadashi; Olsen, Lars Folke; Kobayashi, Nobumichi
2003-12-01
A newly devised procedure of prediction analysis, which is a linearized version of the nonlinear least squares method combined with the maximum entropy spectral analysis method, was proposed. This method was applied to time series data of measles case notification in several communities in the UK, USA and Denmark. The dominant spectral lines observed in each power spectral density (PSD) can be safely assigned as fundamental periods. The optimum least squares fitting (LSF) curve calculated using these fundamental periods can essentially reproduce the underlying variation of the measles data. An extension of the LSF curve can be used to predict measles case notification quantitatively. Some discussions including a predictability of chaotic time series are presented.
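The fit-dominant-periods-then-extrapolate procedure can be sketched with an FFT periodogram standing in for maximum entropy spectral analysis: pick the strongest spectral lines, least-squares fit a sum of sinusoids at those frequencies, and extend the fitted curve forward. The FFT-grid frequencies and mode count are simplifying assumptions of this toy version:

```python
import numpy as np

def fit_periodic(t, y, n_modes=2):
    """Least-squares fit of a mean plus sinusoids at the n_modes strongest
    periodogram frequencies; returns a function that extrapolates the fit."""
    t = np.asarray(t, float)
    y = np.asarray(y, float)
    f = np.fft.rfftfreq(len(y), d=t[1] - t[0])
    p = np.abs(np.fft.rfft(y - y.mean()))
    top = f[np.argsort(p)[::-1][:n_modes]]  # dominant spectral lines

    def design(tt):
        cols = [np.ones_like(tt)]
        for fk in top:
            cols += [np.cos(2 * np.pi * fk * tt), np.sin(2 * np.pi * fk * tt)]
        return np.column_stack(cols)

    beta = np.linalg.lstsq(design(t), y, rcond=None)[0]
    return lambda tt: design(np.asarray(tt, float)) @ beta
```

Extending the returned function beyond the fitted interval mirrors the abstract's use of the least-squares-fitting curve to predict future case notifications.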
Schiecke, Karin; Pester, Britta; Feucht, Martha; Leistritz, Lutz; Witte, Herbert
2015-01-01
In neuroscience, data are typically generated from neural network activity. Complex interactions between the measured time series are involved, and nothing or only little is known about the underlying dynamic system. Convergent Cross Mapping (CCM) provides the possibility to investigate nonlinear causal interactions between time series by using nonlinear state-space reconstruction. The aim of this study is to investigate the general applicability of CCM and to show its potential and limitations. The influence of estimation parameters is demonstrated by means of simulated data, whereas an interval-based application of CCM to real data is adapted for the investigation of interactions between heart rate and specific EEG components of children with temporal lobe epilepsy.
Empirical mode decomposition and long-range correlation analysis of sunspot time series
NASA Astrophysics Data System (ADS)
Zhou, Yu; Leung, Yee
2010-12-01
Sunspots, which are the best known and most variable features of the solar surface, affect our planet in many ways. The number of sunspots during a period of time is highly variable and arouses strong research interest. When multifractal detrended fluctuation analysis (MF-DFA) is employed to study the fractal properties and long-range correlation of the sunspot series, some spurious crossover points might appear because of the periodic and quasi-periodic trends in the series. However, many cycles of solar activity are reflected in the sunspot time series; the 11-year cycle is perhaps the most famous. These cycles pose problems for the investigation of the scaling behavior of sunspot time series: different methods of handling the 11-year cycle generally produce totally different results. Using MF-DFA, Movahed and co-workers employed Fourier truncation to deal with the 11-year cycle and found that the series is long-range anti-correlated with a Hurst exponent, H, of about 0.12. However, Hu and co-workers proposed an adaptive detrending method for the MF-DFA and discovered long-range correlation characterized by H≈0.74. In an attempt to get to the bottom of the problem, in the present paper empirical mode decomposition (EMD), a data-driven adaptive method, is applied to first extract the components with different dominant frequencies. MF-DFA is then employed to study the long-range correlation of the sunspot time series under the influence of these components. On removing the effects of these periods, the natural long-range correlation of the sunspot time series can be revealed. With the removal of the 11-year cycle, a crossover point located at around 60 months is discovered to be a reasonable point separating two different time scale ranges, with H≈0.72 and H≈1.49. On removing all cycles longer than 11 years, we have H≈0.69 and H≈0.28.
The three cycle-removing methods—Fourier truncation, adaptive detrending and the proposed EMD-based method—are further compared, and possible reasons for the different results are given. Two numerical experiments are designed for quantitatively evaluating the performances of these three methods in removing periodic trends with inexact/exact cycles and in detecting the possible crossover points.
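Hurst-exponent estimation of the kind discussed above can be illustrated with plain (monofractal, first-order) detrended fluctuation analysis; the multifractal generalization and the EMD preprocessing of the paper are not reproduced, and the scale grid is an assumption:

```python
import numpy as np

def dfa(x, scales=(8, 16, 32, 64, 128)):
    """First-order DFA: returns the scaling exponent alpha of the
    fluctuation function F(s) ~ s^alpha (alpha ≈ H for stationary signals)."""
    y = np.cumsum(x - np.mean(x))  # integrated profile
    F = []
    for s in scales:
        n = len(y) // s
        msq = []
        for i in range(n):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # local linear detrend
            msq.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(msq)))
    return np.polyfit(np.log(scales), np.log(F), 1)[0]
```

A crossover of the kind reported in the abstract would show up as a kink in log F(s) versus log s, so in practice one fits alpha separately on each side of the candidate crossover scale rather than over all scales at once.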
Mbida, André Dieudonné; Sosso, Samuel; Flori, Pierre; Saoudin, Henia; Lawrence, Philip; Monny-Lobé, Marcel; Oyono, Yves; Ndzi, Edward; Cappelli, Giulia; Lucht, Frédéric; Pozzetto, Bruno; Oukem-Boyer, Odile Ouwe Missi; Bourlet, Thomas
2009-09-01
This study aimed to evaluate the use of dried blood spots (DBSs) and dried plasma spots (DPSs) locally collected in 2 rural dispensaries in Cameroon for the quantification of HIV-1 RNA. Forty-one subjects were sampled and spots of whole blood and plasma were deposited onto Whatman 903 cards and dried at ambient temperature under local conditions. Two sets of DBS and DPS cards were prepared per patient. The rest of the liquid plasma (LP) was frozen until use. LPs were tested at the "Chantal Biya" International Reference Centre (Yaoundé, Cameroon) by the Abbott Real-Time HIV-1 assay (Abbott Molecular Diagnostics, Wiesbaden, Germany). One series of DBS and DPS was transported and tested between 2 and 6 weeks later at the Virology Laboratory of Saint-Etienne (France). The second series was routed by mail and tested after up to 3 months of storage at ambient temperature. From the first series, the correlation rate between viral loads obtained from LP and DBS, and from LP and DPS, was 0.98 and 0.99, respectively; specificity of DBS and DPS results was 100%. The results obtained from the second series indicate a great stability of DBS after long-term storage. This study demonstrates that DBSs collected under local conditions in resource-limited settings are suitable for the deferred quantification of HIV-1 RNA.
Quantifying the consequences of changing hydroclimatic extremes on protection levels for the Rhine
NASA Astrophysics Data System (ADS)
Sperna Weiland, Frederiek; Hegnauer, Mark; Buiteveld, Hendrik; Lammersen, Rita; van den Boogaard, Henk; Beersma, Jules
2017-04-01
The Dutch method for quantifying the magnitude and frequency of occurrence of discharge extremes in the Rhine basin, and the potential influence of climate change hereon, are presented. In the Netherlands, flood protection design requires estimates of discharge extremes for return periods of 1,000 up to 100,000 years. Observed discharge records are too short to derive such extreme return discharges; therefore, extreme value assessment is based on very long synthetic discharge time-series generated with the Generator of Rainfall And Discharge Extremes (GRADE). The GRADE instrument consists of (1) a stochastic weather generator based on time-series resampling of historical rainfall and temperature, (2) a hydrological model optimized following the GLUE methodology, and (3) a hydrodynamic model to simulate the propagation of flood waves based on the generated hydrological time-series. To assess the potential influence of climate change, the four KNMI'14 climate scenarios are applied. These four scenarios represent a large part of the uncertainty provided by the GCMs used for the IPCC 5th assessment report (the CMIP5 GCM simulations under different climate forcings) and are for this purpose tailored to the Rhine and Meuse river basins. To derive the probability distributions of extreme discharges under climate change, the historical synthetic rainfall and temperature series simulated with the weather generator are transformed to the future following the KNMI'14 scenarios. For this transformation the Advanced Delta Change method, which allows changes in the extremes to differ from those in the means, is used. Subsequently, the hydrological model is forced with the historical and future (i.e., transformed) synthetic time-series, after which the propagation of the flood waves is simulated with the hydrodynamic model to obtain the extreme discharge statistics for both current and future climate conditions.
The study shows that both for 2050 and 2085 increases in discharge extremes for the river Rhine at Lobith are projected by all four KNMI'14 climate scenarios. This poses increased requirements for flood protection design in order to prepare for changing climate conditions.
Characteristics of Hydrogen Sensors Based on Thin Tin Dioxide Films Modified with Gold
NASA Astrophysics Data System (ADS)
Almaev, A. V.; Gaman, V. I.
2017-11-01
Effect of hydrogen in the concentration range from 10 to 2000 ppm on the characteristics of sensors based on thin films of tin dioxide modified with gold (Au/SnO2:Sb, Au) is studied in the thermo-cyclic mode at temperatures from 623 to 773 K and absolute humidity from 2.5 to 20 g/m3. Experimental data are discussed using expressions obtained within the framework of a model that takes into account the presence of three types of adsorbed particles (O⁻, OH, and OH⁻) on the surface of SnO2 nanocrystals. The characteristics of the sensors based on thin Pt/Pd/SnO2:Sb films (the first series) are compared with those of Au/SnO2:Sb, Au films (the second series). It is found that the degree of dissociation of molecular hydrogen into atoms during adsorption on the sensor under interaction with Au particles on the SnO2 surface is 4 times greater than that under interaction with Pt/Pd particles. The degree of dissociation of H2O molecules into hydrogen atoms and hydroxyl groups in pure moist air on the surface of the sensors of the second series is 1.6 times greater than that for the sensors of the first series. Thus, gold is a more effective stimulator of the dissociation of H2 and H2O molecules than platinum and palladium. A formula is obtained that describes more accurately the dependence of the response of the sensors of both series to the effect of hydrogen on the concentration of this gas and on the temperature of the measuring devices.
Code of Federal Regulations, 2013 CFR
2013-04-01
... is used for the first time for transport under Customs seal. The TIR Convention, 1975, generally... that country is a contracting party to the Convention, may approve a series of road vehicles or containers presented for design type approval. The procedures for applying for certification are contained in...
Code of Federal Regulations, 2014 CFR
2014-04-01
... is used for the first time for transport under Customs seal. The TIR Convention, 1975, generally... that country is a contracting party to the Convention, may approve a series of road vehicles or containers presented for design type approval. The procedures for applying for certification are contained in...
Code of Federal Regulations, 2010 CFR
2010-04-01
... is used for the first time for transport under Customs seal. The TIR Convention, 1975, generally... that country is a contracting party to the Convention, may approve a series of road vehicles or containers presented for design type approval. The procedures for applying for certification are contained in...
Code of Federal Regulations, 2011 CFR
2011-04-01
... is used for the first time for transport under Customs seal. The TIR Convention, 1975, generally... that country is a contracting party to the Convention, may approve a series of road vehicles or containers presented for design type approval. The procedures for applying for certification are contained in...
Passenger train emergency systems : single-level commuter rail car egress experiments
DOT National Transportation Integrated Search
2015-04-01
Under FRA sponsorship, a series of three experimental egress trials was conducted in 2005 and 2006 to obtain human factors data relating to the amount of time necessary for individuals to exit from a passenger rail car. This final report describes th...
Behavior Analysis: Methodological Foundations.
ERIC Educational Resources Information Center
Owen, James L.
Behavior analysis provides a unique way of coming to understand intrapersonal and interpersonal communication behaviors, and focuses on control techniques available to a speaker and counter-control techniques available to a listener. "Time-series methodology" is a convenient term because it subsumes under one label a variety of baseline…
Sumi, A; Luo, T; Zhou, D; Yu, B; Kong, D; Kobayashi, N
2013-05-01
Viral hepatitis is recognized as one of the most frequently reported diseases, and in China especially, acute and chronic liver disease due to viral hepatitis has been a major public health problem. The present study aimed to analyse and predict surveillance data of infections of hepatitis A, B, C and E in Wuhan, China, by the method of time-series analysis (MemCalc, Suwa-Trast, Japan). On the basis of spectral analysis, fundamental modes explaining the underlying variation of the data for the years 2004-2008 were assigned. A model was then calculated from the fundamental modes, and it reproduced the underlying variation of the data well. An extension of the model to the year 2009 could predict the data quantitatively. Our study suggests that the present method allows us to model the temporal pattern of epidemics of viral hepatitis much more effectively than the artificial neural network, which has been used previously.
Imputation of missing data in time series for air pollutants
NASA Astrophysics Data System (ADS)
Junger, W. L.; Ponce de Leon, A.
2015-02-01
Missing data are a major concern in epidemiological studies of the health effects of environmental air pollutants. This article presents an imputation-based method that is suitable for multivariate time series data, which uses the EM algorithm under the assumption of a normal distribution. Different approaches are considered for filtering the temporal component. A simulation study was performed to assess the validity and performance of the proposed method in comparison with some frequently used methods. Simulations showed that when the amount of missing data was as low as 5%, the complete data analysis yielded satisfactory results regardless of the generating mechanism of the missing data, whereas the validity began to degenerate when the proportion of missing values exceeded 10%. The proposed imputation method exhibited good accuracy and precision in different settings with respect to the patterns of missing observations. Most of the imputations yielded valid results, even under a missing-not-at-random mechanism. The methods proposed in this study are implemented as a package called mtsdi for the statistical software system R.
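An EM-style imputation for multivariate normal data can be sketched as below. This omits the covariance correction term of full EM and the temporal filtering the article describes, so it is a simplified illustration of the idea, not the mtsdi implementation:

```python
import numpy as np

def em_impute(X, n_iter=50):
    """EM-style imputation under a multivariate normal assumption.

    Missing entries (NaN) are replaced by their conditional expectation
    given the observed entries of the same row; the mean and covariance
    are re-estimated from the completed data on each pass. (Full EM also
    adds a conditional-covariance term; it is omitted here for brevity.)"""
    X = np.asarray(X, float).copy()
    miss = np.isnan(X)
    col_mean = np.nanmean(X, axis=0)
    X[miss] = np.take(col_mean, np.where(miss)[1])  # initialize with column means
    for _ in range(n_iter):
        mu = X.mean(axis=0)
        cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])
        for i in range(X.shape[0]):
            m = miss[i]
            if not m.any() or m.all():
                continue
            o = ~m
            coef = cov[np.ix_(m, o)] @ np.linalg.inv(cov[np.ix_(o, o)])
            X[i, m] = mu[m] + coef @ (X[i, o] - mu[o])  # conditional expectation
    return X
```

With strongly correlated pollutant series, the conditional expectation borrows information from the observed co-pollutants, which is why this style of imputation outperforms simple column-mean filling.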
Automatic Detection of Driver Fatigue Using Driving Operation Information for Transportation Safety
Li, Zuojin; Chen, Liukui; Peng, Jun; Wu, Ying
2017-01-01
Fatigued driving is a major cause of road accidents. For this reason, the method in this paper uses steering wheel angle (SWA) and yaw angle (YA) information collected under real driving conditions to detect drivers’ fatigue levels. It analyzes the operation features of SWA and YA under different fatigue statuses, then calculates approximate entropy (ApEn) features over a short sliding window on the time series. Using the nonlinear feature construction theory of dynamic time series, with the fatigue features as input, it designs a “2-6-6-3” multi-level back-propagation (BP) neural network classifier to realize fatigue detection. An approximately 15-h experiment was carried out on a real road, and the data retrieved were segmented and labeled with three fatigue levels after expert evaluation, namely “awake”, “drowsy” and “very drowsy”. An average accuracy of 88.02% in fatigue identification was achieved in the experiment, endorsing the value of the proposed method for engineering applications. PMID:28587072
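Approximate entropy, the feature used on the SWA and YA series, can be computed with the standard O(n^2) formula. The parameter choices below (m=2, r=0.2*SD) are common defaults, not necessarily the paper's:

```python
import numpy as np

def approx_entropy(x, m=2, r=None):
    """Approximate entropy ApEn(m, r) of a 1-D series.

    Lower values indicate more regular (predictable) dynamics; higher
    values indicate more irregular dynamics."""
    x = np.asarray(x, float)
    if r is None:
        r = 0.2 * x.std()  # common tolerance choice

    def phi(m):
        emb = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
        # Chebyshev distance between all pairs of length-m templates
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        c = (d <= r).mean(axis=1)  # fraction of templates within tolerance
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)
```

For the sliding-window use described in the abstract, `approx_entropy` would be applied to each short window of the SWA or YA series in turn, yielding one fatigue feature per window.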
NASA Astrophysics Data System (ADS)
Dalezios, Nicolas; Spyropoulos, Nicos V.; Tarquis, Ana M.
2015-04-01
The research work stems from the hypothesis that it is possible to estimate the seasonal water needs of olive tree farms under drought periods by cross-correlating high spatial, spectral and temporal resolution (~monthly) satellite data, acquired at well-defined time intervals of the phenological cycle of the crops, with ground-truth information collected simultaneously with the image acquisitions. The present research demonstrates, for the first time, the coordinated efforts of space engineers, satellite mission control planners, remote sensing scientists and ground teams to record, at specific time intervals of the phenological cycle of the trees, from ground "zero" and from 770 km above the Earth's surface, the status of the plants for subsequent cross-correlation and analysis regarding the estimation of seasonal evapotranspiration in a vulnerable agricultural environment. The ETo and ETc derived by the Penman-Monteith equation and reference Kc tables were compared with a new ETd using the Kc extracted from the time-series satellite data. Several vegetation indices were also used, especially the RedEdge and chlorophyll indices based on the WorldView-2 RedEdge and second NIR bands, to relate tree status to water and nutrition needs. Keywords: Evapotranspiration, Very High Spatial Resolution - VHSR, time series, remote sensing, vulnerability, agriculture, vegetation indices.
Zhang, Fang; Wagner, Anita K; Ross-Degnan, Dennis
2011-11-01
Interrupted time series is a strong quasi-experimental research design to evaluate the impacts of health policy interventions. Using simulation methods, we estimated the power requirements for interrupted time series studies under various scenarios. Simulations were conducted to estimate the power of segmented autoregressive (AR) error models when autocorrelation ranged from -0.9 to 0.9 and effect size was 0.5, 1.0, and 2.0, investigating balanced and unbalanced numbers of time periods before and after an intervention. Simple scenarios of autoregressive conditional heteroskedasticity (ARCH) models were also explored. For AR models, power increased when sample size or effect size increased, and tended to decrease when autocorrelation increased. Compared with a balanced number of study periods before and after an intervention, designs with unbalanced numbers of periods had less power, although that was not the case for ARCH models. The power to detect effect size 1.0 appeared to be reasonable for many practical applications with a moderate or large number of time points in the study equally divided around the intervention. Investigators should be cautious when the expected effect size is small or the number of time points is small. We recommend conducting various simulations before investigation. Copyright © 2011 Elsevier Inc. All rights reserved.
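A Monte-Carlo power calculation of this kind can be sketched as follows. The naive OLS t-test on the level-change term (rather than the authors' segmented AR error models), and all default sizes, are assumptions of this sketch:

```python
import numpy as np

def its_power(n_pre=24, n_post=24, effect=1.0, rho=0.3, sigma=1.0,
              n_sim=500, seed=0):
    """Monte-Carlo power of detecting a level change in a segmented
    regression (intercept, trend, step) when errors are AR(1)."""
    rng = np.random.default_rng(seed)
    n = n_pre + n_post
    t = np.arange(n)
    step = (t >= n_pre).astype(float)
    X = np.column_stack([np.ones(n), t, step])
    hits = 0
    for _ in range(n_sim):
        e = np.empty(n)
        e[0] = rng.normal(scale=sigma / np.sqrt(1 - rho ** 2))
        for i in range(1, n):
            e[i] = rho * e[i - 1] + rng.normal(scale=sigma)  # AR(1) errors
        y = effect * sigma * step + e
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        resid = y - X @ beta
        s2 = resid @ resid / (n - X.shape[1])
        se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[2, 2])
        if abs(beta[2] / se) > 1.96:  # two-sided z approximation
            hits += 1
    return hits / n_sim
```

Varying `effect`, `rho`, and the pre/post balance in such a simulation reproduces the qualitative findings of the abstract: power rises with effect size and sample size, and unbalanced designs tend to lose power.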
Ye, Yu; Kerr, William C
2011-01-01
To explore various model specifications in estimating relationships between liver cirrhosis mortality rates and per capita alcohol consumption in aggregate-level cross-section time-series data. Using a series of liver cirrhosis mortality rates from 1950 to 2002 for 47 U.S. states, the effects of alcohol consumption were estimated from pooled autoregressive integrated moving average (ARIMA) models and 4 types of panel data models: generalized estimating equation, generalized least square, fixed effect, and multilevel models. Various specifications of error term structure under each type of model were also examined. Different approaches controlling for time trends and for using concurrent or accumulated consumption as predictors were also evaluated. When cirrhosis mortality was predicted by total alcohol, highly consistent estimates were found between ARIMA and panel data analyses, with an average overall effect of 0.07 to 0.09. Less consistent estimates were derived using spirits, beer, and wine consumption as predictors. When multiple geographic time series are combined as panel data, none of the existing models could accommodate all sources of heterogeneity, such that any type of panel model must employ some form of generalization. Different types of panel data models should thus be estimated to examine the robustness of findings. We also suggest cautious interpretation when beverage-specific volumes are used as predictors. Copyright © 2010 by the Research Society on Alcoholism.
2011-01-01
Background Network inference methods reconstruct mathematical models of molecular or genetic networks directly from experimental data sets. We have previously reported a mathematical method which is exclusively data-driven, does not involve any heuristic decisions within the reconstruction process, and delivers all possible alternative minimal networks in terms of simple place/transition Petri nets that are consistent with a given discrete time series data set. Results We fundamentally extended the previously published algorithm to consider catalysis and inhibition of the reactions that occur in the underlying network. The results of the reconstruction algorithm are encoded in the form of an extended Petri net involving control arcs. This allows the consideration of processes involving mass flow and/or regulatory interactions. As a non-trivial test case, the phosphate regulatory network of enterobacteria was reconstructed using in silico-generated time-series data sets on wild-type and in silico mutants. Conclusions The new exact algorithm reconstructs extended Petri nets from time series data sets by finding all alternative minimal networks that are consistent with the data. It suggested alternative molecular mechanisms for certain reactions in the network. The algorithm is useful to combine data from wild-type and mutant cells and may potentially integrate physiological, biochemical, pharmacological, and genetic data in the form of a single model. PMID:21762503
Short- and long-term results following standing fracture repair in 34 horses.
Payne, R J; Compston, P C
2012-11-01
Standing fracture repair in the horse is a recently described surgical procedure and currently there are few follow-up data. This case series contains 2 novel aspects in the standing horse: repair of incomplete sagittal fractures of the proximal phalanx and medial condylar repair from a lateral aspect. To describe outcome in a case series of horses that had lower limb fractures repaired under standing sedation at Rossdales Equine Hospital. Case records for all horses that had a fracture surgically repaired, by one surgeon at Rossdales Equine Hospital, under standing sedation and local anaesthesia up until June 2011, were retrieved. Hospital records, owner/trainer telephone questionnaire and the Racing Post website were used to evaluate follow-up. Thirty-four horses satisfied the inclusion criteria. Fracture sites included the proximal phalanx (incomplete sagittal fracture, n = 14); the third metacarpal bone (lateral condyle, n = 12, and medial condyle, n = 7); and the third metatarsal bone (lateral condyle, n = 1). One horse required euthanasia due to caecal rupture 10 days post operatively. Twenty horses (66.7% of those with available follow-up) have returned to racing. Where available, mean time from operation to return to racing was 226 days (range 143-433 days). Standing fracture repair produced similar results to fracture repair under general anaesthesia in terms of both the number of horses that returned to racing and the time between surgery and race. Repair of lower limb fracture in the horse under standing sedation is a procedure that has the potential for tangible benefits, including avoidance of the inherent risks of general anaesthesia. The preliminary findings in this series of horses are encouraging and informative when discussing options available prior to fracture repair. © 2012 EVJ Ltd.
A Langevin equation for the rates of currency exchange based on the Markov analysis
NASA Astrophysics Data System (ADS)
Farahpour, F.; Eskandari, Z.; Bahraminasab, A.; Jafari, G. R.; Ghasemi, F.; Sahimi, Muhammad; Reza Rahimi Tabar, M.
2007-11-01
We propose a method for analyzing the data for the rates of exchange of various currencies versus the U.S. dollar. The method analyzes the return time series of the data as a Markov process, and develops an effective equation which reconstructs it. We find that the Markov time scale, i.e., the time scale over which the data are Markov-correlated, is one day for the majority of the daily exchange rates that we analyze. We derive an effective Langevin equation to describe the fluctuations in the rates. The equation contains two quantities, D^(1) and D^(2), representing the drift and diffusion coefficients, respectively. We demonstrate how the two coefficients are estimated directly from the data, without using any assumptions or models for the underlying stochastic time series that represent the daily rates of exchange of various currencies versus the U.S. dollar.
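In the Kramers-Moyal picture, drift and diffusion coefficients of this kind are conditional moments of the increments and can be estimated directly from the series by binning the state variable. The following is a minimal sketch of such a bin-wise estimator; the function name, bin count, and minimum-count threshold are illustrative choices, not taken from the paper:

```python
import numpy as np

def kramers_moyal(x, n_bins=20, tau=1.0, min_count=50):
    """Bin-wise estimates of drift D1 and diffusion D2 from increments.

    D1(x) ~ <x(t+tau) - x(t) | x(t) in bin> / tau
    D2(x) ~ <(x(t+tau) - x(t))**2 | x(t) in bin> / (2 * tau)
    """
    x = np.asarray(x, dtype=float)
    dx = np.diff(x)
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    idx = np.clip(np.digitize(x[:-1], edges) - 1, 0, n_bins - 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    d1 = np.full(n_bins, np.nan)
    d2 = np.full(n_bins, np.nan)
    for b in range(n_bins):
        inc = dx[idx == b]
        if inc.size >= min_count:      # skip sparsely populated bins
            d1[b] = inc.mean() / tau
            d2[b] = (inc ** 2).mean() / (2.0 * tau)
    return centers, d1, d2
```

One way to sanity-check an estimator like this is on a simulated Ornstein-Uhlenbeck process dx = -θx dt + σ dW, for which the recovered D^(1) should be close to linear with slope -θ and D^(2) close to σ²/2.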
Finite element techniques in computational time series analysis of turbulent flows
NASA Astrophysics Data System (ADS)
Horenko, I.
2009-04-01
In recent years there has been a considerable increase of interest in the mathematical modeling and analysis of complex systems that undergo transitions between several phases or regimes. Such systems can be found, e.g., in weather forecasting (transitions between weather conditions), climate research (ice ages and warm ages), computational drug design (conformational transitions) and in econometrics (e.g., transitions between different phases of the market). In all cases, the accumulation of sufficiently detailed time series has led to the formation of huge databases, containing enormous but still undiscovered treasures of information. However, the extraction of essential dynamics and identification of the phases is usually hindered by the multidimensional nature of the signal, i.e., the information is "hidden" in the time series. Standard filtering approaches (e.g., wavelet-based spectral methods) have in general unfeasible numerical complexity in high dimensions, and other standard methods (e.g., Kalman filter, MVAR, ARCH/GARCH) impose strong assumptions about the type of the underlying dynamics. An approach based on optimization of a specially constructed regularized functional (describing the quality of the data description in terms of a certain number of specified models) will be introduced. Based on this approach, several new adaptive mathematical methods for simultaneous EOF/SSA-like data-based dimension reduction and identification of hidden phases in high-dimensional time series will be presented. The methods exploit the topological structure of the analysed data and do not impose severe assumptions on the underlying dynamics. Special emphasis will be placed on the mathematical assumptions and numerical cost of the constructed methods. The application of the presented methods will first be demonstrated on a toy example and the results will be compared with those obtained by standard approaches.
The importance of accounting for the mathematical assumptions used in the analysis will be highlighted in this example. Finally, applications to the analysis of meteorological and climate data will be presented.
2017-10-01
networks of the brain responsible for visual processing, mood regulation, motor coordination, sensory processing, and language command, but increased...4 For each subject, the rsFMRI voxel time-series were temporally shifted to account for differences in slice acquisition times...responsible for visual processing, mood regulation, motor coordination, sensory processing, and language command, but increased connectivity in
ERIC Educational Resources Information Center
Swygert, Kimberly A.
In this study, data from an operational computerized adaptive test (CAT) were examined in order to gather information concerning item response times in a CAT environment. The CAT under study included multiple-choice items measuring verbal, quantitative, and analytical reasoning. The analyses included the fitting of regression models describing the…
A Report on Army Science Planning and Strategy 2016
2017-06-01
Army Research Laboratory (ARL) hosted a series of meetings in fall 2016 to develop a strategic vision for Army Science. Meeting topics were vetted...reduce maturation time . • Support internal Army research efforts to enhance Army investments in multiscale modeling to accelerate the rate of...requirement are research needs including cross-modal approaches to enabling real- time human comprehension under constraints of bandwidth, information
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-28
... every trading day since that time.\\5\\ \\5\\ CBOE maintains a micro-site for VXSLV: http://www.cboe.com/micro/VIXETF/VXSLV/ . VXSLV is an up-to-the-minute market estimate of the expected volatility of SLV... underlying option series on a real-time basis throughout each trading day, from 8:30 a.m. until 3 p.m. (CT...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-14
... time.\\4\\ \\4\\ CBOE maintains a micro-site for GVZ options at: http://www.cboe.com/gvz . See proposed... underlying option series on a real-time basis throughout each trading day, from 8:30 a.m. until 3 p.m. (CT... cease at 3 p.m. (CT) on the business day immediately preceding the expiration date.\\8\\ Exercise will...
NASA Astrophysics Data System (ADS)
Kim, Y.; Johnson, M. S.
2017-12-01
Spectral entropy (Hs) is an index which can be used to measure the structural complexity of time series data. When a time series is made up of one periodic function, the Hs value becomes smaller, while Hs becomes larger when a time series is composed of several periodic functions. We hypothesized that this characteristic of the Hs could be used to quantify the water stress history of vegetation. For the ideal condition for which sufficient water is supplied to an agricultural crop or natural vegetation, there should be a single distinct phenological cycle represented in a vegetation index time series (e.g., NDVI and EVI). However, time series data for a vegetation area that repeatedly experiences water stress may include several fluctuations that can be observed in addition to the predominant phenological cycle. This is because the process of experiencing water stress and recovering from it generates small fluctuations in phenological characteristics. Consequently, the value of Hs increases when vegetation experiences several water shortages. Therefore, the Hs could be used as an indicator for water stress history. To test this hypothesis, we analyzed Moderate Resolution Imaging Spectroradiometer (MODIS) Normalized Difference Vegetation Index (NDVI) data for a natural area in comparison to a nearby sugarcane area in seasonally-dry western Costa Rica. In this presentation we will illustrate the use of spectral entropy to evaluate the vegetative responses of natural vegetation (dry tropical forest) and sugarcane under three different irrigation techniques (center pivot irrigation, drip irrigation and flood irrigation). Through this comparative analysis, the utility of Hs as an indicator will be tested. Furthermore, crop response to the different irrigation methods will be discussed in terms of Hs, NDVI and yield.
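The index Hs described here can be read as the Shannon entropy of the normalized power spectrum: a single periodicity concentrates power in one frequency bin (low Hs), while several superposed fluctuations spread it out (high Hs). A minimal sketch, assuming an evenly sampled series and a simple FFT periodogram (not the authors' exact implementation):

```python
import numpy as np

def spectral_entropy(x):
    """Normalized Shannon entropy of the power spectrum of a time series.

    Values near 0 indicate one dominant periodicity; values near 1
    indicate power spread over many frequencies.
    """
    x = np.asarray(x, dtype=float)
    psd = np.abs(np.fft.rfft(x - x.mean()))[1:] ** 2   # drop the DC term
    p = psd / psd.sum()            # treat the spectrum as a distribution
    p = p[p > 0]                   # avoid log(0)
    return -np.sum(p * np.log(p)) / np.log(len(psd))   # scale to [0, 1]
```

On synthetic data, a pure sinusoid yields a lower value than a sum of several sinusoids, mirroring the single-cycle versus stress-perturbed phenology contrast drawn in the abstract.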
NASA Astrophysics Data System (ADS)
He, Jiayi; Shang, Pengjian; Xiong, Hui
2018-06-01
Stocks, as the concrete manifestation of financial time series with plenty of potential information, are often used in the study of financial time series. In this paper, we utilize stock data to recognize patterns through the dissimilarity matrix based on modified cross-sample entropy, and then provide three-dimensional perceptual maps of the results through the multidimensional scaling method. Two modified multidimensional scaling methods are proposed in this paper: multidimensional scaling based on Kronecker-delta cross-sample entropy (MDS-KCSE) and multidimensional scaling based on permutation cross-sample entropy (MDS-PCSE). These two methods use Kronecker-delta-based cross-sample entropy and permutation-based cross-sample entropy to replace the distance or dissimilarity measurement in classical multidimensional scaling (MDS). Multidimensional scaling based on Chebyshev distance (MDSC) is employed to provide a reference for comparison. Our analysis reveals clear clustering both in synthetic data and in 18 indices from diverse stock markets. It implies that time series generated by the same model are more likely to share similar irregularity than others, and that differences between stock indices, caused by country or region and by different financial policies, are reflected in the irregularity of the data. In the synthetic data experiments, not only can the time series generated by different models be distinguished, but those generated under different parameters of the same model can also be detected. In the financial data experiment, the stock indices are clearly divided into five groups. Through analysis, we find that they correspond to five regions, respectively: Europe, North America, South America, Asia-Pacific (with the exception of mainland China), and mainland China and Russia. The results also demonstrate that MDS-KCSE and MDS-PCSE provide more effective divisions in experiments than MDSC.
NASA Astrophysics Data System (ADS)
Schultz, Michael; Verbesselt, Jan; Herold, Martin; Avitabile, Valerio
2013-10-01
Researchers who use remotely sensed data can spend half of their total effort on pre-processing prior to data analysis. If this pre-processing does not match the application, the time spent on data analysis can increase considerably and inaccuracies can result. Despite the existence of a number of methods for pre-processing Landsat time series, each method has shortcomings, particularly for mapping forest changes under varying illumination, data availability and atmospheric conditions. The requirements of mapping forest changes as defined by the United Nations (UN) Reducing Emissions from Deforestation and Forest Degradation (REDD) program call for accurate reporting of the spatio-temporal properties of these changes. We compared the impact of three fundamentally different radiometric pre-processing techniques, Moderate Resolution Atmospheric TRANsmission (MODTRAN), Second Simulation of a Satellite Signal in the Solar Spectrum (6S) and simple Dark Object Subtraction (DOS), on mapping forest changes using Landsat time series data. A modification of Breaks For Additive Season and Trend (BFAST) Monitor was used to jointly map the spatial and temporal agreement of forest changes at test sites in Ethiopia and Viet Nam. The suitability of the pre-processing methods for the occurring forest change drivers was assessed using recently captured ground truth and high-resolution data (1000 points). A method for creating robust generic forest maps used for the sampling design is presented. An assessment of error sources identified haze as a major source of commission error in the time series analysis.
Development of web tools to disseminate space geodesy data-related products
NASA Astrophysics Data System (ADS)
Soudarin, Laurent; Ferrage, Pascale; Mezerette, Adrien
2015-04-01
In order to promote the products of the DORIS system, the French Space Agency CNES has developed and implemented on the web site of the International DORIS Service (IDS) a set of plot tools to interactively build and display time series of site positions, orbit residuals and terrestrial parameters (scale, geocenter). An interactive global map is also available to select sites, and to get access to their information. Besides the products provided by the CNES Orbitography Team and the IDS components, these tools allow comparing time evolutions of coordinates for collocated DORIS and GNSS stations, thanks to the collaboration with the Terrestrial Frame Combination Center of the International GNSS Service (IGS). A database was created to improve robustness and efficiency of the tools, with the objective to propose a complete web service to foster data exchange with the other geodetic services of the International Association of Geodesy (IAG). The possibility to visualize and compare position time series of the four main space geodetic techniques DORIS, GNSS, SLR and VLBI is already under way at the French level. A dedicated version of these web tools has been developed for the French Space Geodesy Research Group (GRGS). It will give access to position time series provided by the GRGS Analysis Centers involved in DORIS, GNSS, SLR and VLBI data processing for the realization of the International Terrestrial Reference Frame. In this presentation, we will describe the functionalities of these tools, and we will address some aspects of the time series (content, format).
"Batch" kinetics in flow: online IR analysis and continuous control.
Moore, Jason S; Jensen, Klavs F
2014-01-07
Currently, kinetic data is either collected under steady-state conditions in flow or by generating time-series data in batch. Batch experiments are generally considered to be more suitable for the generation of kinetic data because of the ability to collect data from many time points in a single experiment. Now, a method that rapidly generates time-series reaction data from flow reactors by continuously manipulating the flow rate and reaction temperature has been developed. This approach makes use of inline IR analysis and an automated microreactor system, which allowed for rapid and tight control of the operating conditions. The conversion/residence time profiles at several temperatures were used to fit parameters to a kinetic model. This method requires significantly less time and a smaller amount of starting material compared to one-at-a-time flow experiments, and thus allows for the rapid generation of kinetic data. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Guastello, Stephen J; Reiter, Katherine; Shircel, Anton; Timm, Paul; Malon, Matthew; Fabisch, Megan
2014-07-01
This study examined the relationship between performance variability and actual performance of financial decision makers who were working under experimental conditions of increasing workload and fatigue. The rescaled range statistic, also known as the Hurst exponent (H), was used as an index of variability. Although H is defined as having a range between 0 and 1, 45% of the 172 time series generated by undergraduates were negative. Participants in the study chose the optimum investment out of sets of 3 to 5 options that were presented in a series of 350 displays. The sets of options varied in both the complexity of the options and the number of options under simultaneous consideration. One experimental condition required participants to make their choices within 15 sec, and the other condition required them to choose within 7.5 sec. Results showed that (a) negative H was possible and not a result of psychometric error; (b) negative H was associated with negative autocorrelations in a time series; (c) H was the best predictor of performance of the variables studied; (d) three other significant predictors were scores on an anagrams test and ratings of physical demands and performance demands; and (e) persistence as evidenced by the autocorrelations was associated with ratings of greater time pressure. It was concluded, furthermore, that persistence and overall performance were correlated, that 'healthy' variability only exists within a limited range, and that other individual differences related to ability and resistance to stress or fatigue are also involved in the prediction of performance.
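The rescaled range statistic mentioned above estimates H as the slope of log(R/S) against log(window size), where R is the range of cumulative deviations within a window and S its standard deviation. A minimal sketch of that classical estimator follows; the window schedule and segment handling are illustrative, and this is not the authors' exact procedure (which, as the abstract notes, could yield negative slope estimates):

```python
import numpy as np

def hurst_rs(x, min_window=8, n_scales=12):
    """Estimate the Hurst exponent H by rescaled-range (R/S) analysis.

    H is the slope of log(R/S) against log(window size); roughly 0.5
    for an uncorrelated series, larger for persistent series.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    windows = np.unique(np.logspace(np.log10(min_window),
                                    np.log10(n // 2), n_scales).astype(int))
    log_w, log_rs = [], []
    for w in windows:
        rs = []
        for start in range(0, n - w + 1, w):
            seg = x[start:start + w]
            dev = np.cumsum(seg - seg.mean())   # cumulative deviations
            r, s = dev.max() - dev.min(), seg.std()
            if s > 0:
                rs.append(r / s)
        if rs:
            log_w.append(np.log(w))
            log_rs.append(np.log(np.mean(rs)))
    return np.polyfit(log_w, log_rs, 1)[0]      # slope = H estimate
```

White noise gives an estimate near 0.5, while a strongly persistent series such as a random walk gives a markedly higher value.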
Ethical Decisions in Turbulent Times
ERIC Educational Resources Information Center
Shapiro, Joan Poliner; Gross, Steven Jay; Shapiro, Susan H.
2008-01-01
Education leaders make difficult ethical decisions each day. Using the story of a preschool director in lower Manhattan on September 11, 2001, the authors detail a series of paradigms to help educational leaders navigate rationally through challenging and complex circumstances when they are under considerable emotional stress. One approach which…
EVOLUTION OF THE NOCTURNAL INVERSION LAYER AT AN URBAN AND NONURBAN LOCATION
The evolutionary cycle of the nocturnal radiation inversion layer from formation until dissipation under fair weather conditions was investigated by time-series analyses of observations of inversion base and top heights, and inversion strength at an urban and a nonurban site in S...
Predictability of rogue events.
Birkholz, Simon; Brée, Carsten; Demircan, Ayhan; Steinmeyer, Günter
2015-05-29
Using experimental data from three different rogue wave supporting systems, determinism and predictability of the underlying dynamics are evaluated with methods of nonlinear time series analysis. We included original records from the Draupner platform in the North Sea as well as time series from two optical systems in our analysis. One of the latter was measured in the infrared tail of optical fiber supercontinua, the other in the fluence profiles of multifilaments. All three data sets exhibit extreme-value statistics and exceed the significant wave height in the respective system by a factor larger than 2. Nonlinear time series analysis indicates a different degree of determinism in the systems. The optical fiber scenario is found to be driven by quantum noise whereas rogue waves emerge as a consequence of turbulence in the others. With the large number of rogue events observed in the multifilament system, we can systematically explore the predictability of such events in a turbulent system. We observe that rogue events do not necessarily appear without a warning, but are often preceded by a short phase of relative order. This surprising finding sheds some new light on the fascinating phenomenon of rogue waves.
Cabrieto, Jedelyn; Tuerlinckx, Francis; Kuppens, Peter; Hunyadi, Borbála; Ceulemans, Eva
2018-01-15
Detecting abrupt correlation changes in multivariate time series is crucial in many application fields such as signal processing, functional neuroimaging, climate studies, and financial analysis. To detect such changes, several promising correlation change tests exist, but they may suffer from severe loss of power when there is actually more than one change point underlying the data. To deal with this drawback, we propose a permutation based significance test for Kernel Change Point (KCP) detection on the running correlations. Given a requested number of change points K, KCP divides the time series into K + 1 phases by minimizing the within-phase variance. The new permutation test looks at how the average within-phase variance decreases when K increases and compares this to the results for permuted data. The results of an extensive simulation study and applications to several real data sets show that, depending on the setting, the new test performs either at par or better than the state-of-the art significance tests for detecting the presence of correlation changes, implying that its use can be generally recommended.
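The core idea, splitting the running-correlation series into phases of minimal within-phase variance, can be illustrated for the simplest case of a single change point in a bivariate series. The window length, the brute-force scan, and the function names below are simplifying assumptions; KCP proper uses kernels, handles K change points, and adds the permutation significance test:

```python
import numpy as np

def running_correlation(x, y, w):
    """Pearson correlation of x and y over sliding windows of length w."""
    return np.array([np.corrcoef(x[i:i + w], y[i:i + w])[0, 1]
                     for i in range(len(x) - w + 1)])

def one_change_point(r, margin=5):
    """Split point of r minimizing the summed within-phase variance."""
    n = len(r)
    cost = [k * r[:k].var() + (n - k) * r[k:].var()
            for k in range(margin, n - margin)]
    return margin + int(np.argmin(cost))
```

On a synthetic series whose correlation drops halfway through, the minimum-variance split lands near the true transition.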
Fitting Flux Ropes to a Global MHD Solution: A Comparison of Techniques. Appendix 1
NASA Technical Reports Server (NTRS)
Riley, Pete; Linker, J. A.; Lionello, R.; Mikic, Z.; Odstrcil, D.; Hidalgo, M. A.; Cid, C.; Hu, Q.; Lepping, R. P.; Lynch, B. J.
2004-01-01
Flux rope fitting (FRF) techniques are an invaluable tool for extracting information about the properties of a subclass of CMEs in the solar wind. However, it has proven difficult to assess their accuracy since the underlying global structure of the CME cannot be independently determined from the data. In contrast, large-scale MHD simulations of CME evolution can provide both a global view as well as localized time series at specific points in space. In this study we apply 5 different fitting techniques to 2 hypothetical time series derived from MHD simulation results. Independent teams performed the analysis of the events in "blind tests", for which no information, other than the time series, was provided. From the results, we infer the following: (1) Accuracy decreases markedly with increasingly glancing encounters; (2) Correct identification of the boundaries of the flux rope can be a significant limiter; and (3) Results from techniques that infer global morphology must be viewed with caution. In spite of these limitations, FRF techniques remain a useful tool for describing in situ observations of flux rope CMEs.
Why didn't Box-Jenkins win (again)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pack, D.J.; Downing, D.J.
This paper focuses on the forecasting performance of the Box-Jenkins methodology applied to the 111 time series of the Makridakis competition. It considers the influence of the following factors: (1) time series length, (2) time-series information (autocorrelation) content, (3) time-series outliers or structural changes, (4) averaging results over time series, and (5) forecast time origin choice. It is found that the 111 time series contain substantial numbers of very short series, series with obvious structural change, and series whose histories are relatively uninformative. If these series are typical of those that one must face in practice, the real message of the competition is that univariate time series extrapolations will frequently fail regardless of the methodology employed to produce them.
46 CFR 133.160 - Rescue boat embarkation, launching and recovery arrangements.
Code of Federal Regulations, 2012 CFR
2012-10-01
..., launching and recovery arrangements. (a) Each davit for a rescue boat must be approved under approval series 160.132 with a winch approved under approval series 160.115. If the launching arrangement uses a... automatic disengaging apparatus approved under approval series 160.170 instead of a lifeboat release...
46 CFR 133.160 - Rescue boat embarkation, launching and recovery arrangements.
Code of Federal Regulations, 2013 CFR
2013-10-01
..., launching and recovery arrangements. (a) Each davit for a rescue boat must be approved under approval series 160.132 with a winch approved under approval series 160.115. If the launching arrangement uses a... automatic disengaging apparatus approved under approval series 160.170 instead of a lifeboat release...
Multifractal analysis of visibility graph-based Ito-related connectivity time series.
Czechowski, Zbigniew; Lovallo, Michele; Telesca, Luciano
2016-02-01
In this study, we investigate multifractal properties of connectivity time series resulting from the visibility graph applied to normally distributed time series generated by the Ito equations with multiplicative power-law noise. We show that multifractality of the connectivity time series (i.e., the series of the number of links outgoing from each node) increases with the exponent of the power-law noise. The multifractality of the connectivity time series could be due to the width of the connectivity degree distribution, which can be related to the exit time of the associated Ito time series. Furthermore, the connectivity time series are characterized by persistence, although the original Ito time series are random; this is due to the visibility graph procedure which, in connecting the values of the time series, generates persistence but destroys most of the nonlinear correlations. Moreover, the visibility graph is sensitive for detecting wide "depressions" in input time series.
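The connectivity series analysed here is the degree sequence of the natural visibility graph: samples i and j are linked when every intermediate sample lies strictly below the straight line joining (i, x[i]) and (j, x[j]). A minimal O(n²) sketch of that construction (the study's Ito-equation data generation and multifractal analysis are not reproduced):

```python
import numpy as np

def visibility_degree_series(x):
    """Connectivity (degree) series of the natural visibility graph.

    Nodes i and j are linked if every intermediate sample lies strictly
    below the line connecting (i, x[i]) and (j, x[j]).
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    degree = np.zeros(n, dtype=int)
    for i in range(n - 1):
        for j in range(i + 1, n):
            ks = np.arange(i + 1, j)
            # linear interpolation between the endpoints, evaluated at ks
            line = x[j] + (x[i] - x[j]) * (j - ks) / (j - i)
            if ks.size == 0 or np.all(x[ks] < line):
                degree[i] += 1
                degree[j] += 1
    return degree
```

For a strictly linear ramp only adjacent samples see each other, while equal-height peaks see each other across an intervening valley.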
Models and signal processing for an implanted ethanol bio-sensor.
Han, Jae-Joon; Doerschuk, Peter C; Gelfand, Saul B; O'Connor, Sean J
2008-02-01
The understanding of drinking patterns leading to alcoholism has been hindered by an inability to unobtrusively measure ethanol consumption over periods of weeks to months in the community environment. An implantable ethanol sensor is under development using microelectromechanical systems technology. For safety and user acceptability issues, the sensor will be implanted subcutaneously and, therefore, measure peripheral-tissue ethanol concentration. Determining ethanol consumption and kinetics in other compartments from the time course of peripheral-tissue ethanol concentration requires sophisticated signal processing based on detailed descriptions of the relevant physiology. A statistical signal processing system based on detailed models of the physiology and using extended Kalman filtering and dynamic programming tools is described which can estimate the time series of ethanol concentration in blood, liver, and peripheral tissue and the time series of ethanol consumption based on peripheral-tissue ethanol concentration measurements.
Data Rescue for precipitation station network in Slovak Republic
NASA Astrophysics Data System (ADS)
Fasko, Pavel; Bochníček, Oliver; Švec, Marek; Paľušová, Zuzana; Markovič, Ladislav
2016-04-01
Transparent archive catalogues are essential for data rescue, supporting subsequent activities such as digitization and homogenization. Visualization of the continuity of time series from the precipitation station network of the Slovak Republic (approximately 1250 stations) is currently under way, covering the period since the beginning of observations (meteorological stations gradually began to operate in Slovakia during the second half of the 19th century). Visualization is combined with related activities: verification and retrieval of the data listed in the archive catalogue, station localization according to historical yearbooks, conversion of coordinates into x-JTSK and y-JTSK, and assignment of stations to hydrological catchments. Clustering of precipitation stations within a given hydrological catchment on a map, together with visualization of record duration (line graphs), will allow efficient identification of suitable precipitation stations for the prolongation of time series. This step should be followed by break and trend detection and by homogenization. The poster presents the risks and problems encountered in verifying records from archive catalogues, their digitization and repair, and the chosen means of visualization. While searching for historical and often short time series, we recognized the particular importance of stations located at middle and higher altitudes: they could replace the fictive points used up to now in the construction of precipitation maps. Supplementing and extending the time series of individual stations will make it possible to follow changes in precipitation totals over a given period, as well as areal totals for individual catchments in various time periods, which will be appreciated mainly by hydrologists and agro-climatologists.
Neutron star dynamics under time dependent external torques
NASA Astrophysics Data System (ADS)
Alpar, M. A.; Gügercinoğlu, E.
2017-12-01
The two component model of neutron star dynamics describing the behaviour of the observed crust coupled to the superfluid interior has so far been applied to radio pulsars, for which the external torques are constant on dynamical timescales. We recently solved this problem under arbitrary time dependent external torques. Our solutions pertain to internal torques that are linear in the rotation rates, as well as to the extremely non-linear internal torques of the vortex creep model. Two-component models with linear or nonlinear internal torques can now be applied to magnetars and to neutron stars in binary systems, with strong variability and timing noise. Time dependent external torques can be obtained from the observed spin-down (or spin-up) time series, Ω̇(t).
NASA Astrophysics Data System (ADS)
Ishbulatov, Yu. M.; Karavaev, A. S.; Kiselev, A. R.; Semyachkina-Glushkovskaya, O. V.; Postnov, D. E.; Bezruchko, B. P.
2018-04-01
A method for the reconstruction of a time-delayed feedback system is investigated, which is based on the detection of the synchronous response of a slave time-delay system to driving from the master system under study. The structure of the driven system is similar to the structure of the studied time-delay system, but the feedback circuit is broken in the driven system. The method's efficiency is tested using short and noisy data obtained from an electronic chaotic oscillator with time-delayed feedback.
Regenerating time series from ordinal networks.
McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael
2017-03-01
Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discrete sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series by using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.
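The construction described here can be sketched in a few steps: embed the series as ordinal patterns, count transitions between successive patterns, and regenerate a new symbol sequence by a weighted random walk on those transitions. The function names and the dead-end restart rule below are illustrative assumptions, not the authors' exact implementation:

```python
import numpy as np
from collections import defaultdict

def ordinal_patterns(x, m=3):
    """Map a time series to its sequence of order-m ordinal patterns."""
    x = np.asarray(x, dtype=float)
    return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

def build_ordinal_network(patterns):
    """Transition counts between successive ordinal patterns."""
    edges = defaultdict(lambda: defaultdict(int))
    for a, b in zip(patterns, patterns[1:]):
        edges[a][b] += 1
    return edges

def random_walk(edges, start, steps, rng):
    """Regenerate a symbol sequence by a weighted random walk."""
    node, walk = start, [start]
    for _ in range(steps):
        nbrs = list(edges[node])
        if not nbrs:                 # dead end: restart at the initial node
            node = start
        else:
            w = np.array([edges[node][b] for b in nbrs], dtype=float)
            node = nbrs[rng.choice(len(nbrs), p=w / w.sum())]
        walk.append(node)
    return walk
```

By construction, the regenerated sequence only traverses transitions observed in the original series, which is the sense in which the walk is a stochastic approximation of the source dynamics.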
GPS Position Time Series @ JPL
NASA Technical Reports Server (NTRS)
Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen
2013-01-01
Different flavors of GPS time series analysis at JPL all use the same GPS Precise Point Positioning raw time series; variations in time series analysis/post-processing are driven by different users.
• JPL Global Time Series/Velocities - researchers studying the reference frame, combining with VLBI/SLR/DORIS
• JPL/SOPAC Combined Time Series/Velocities - crustal deformation for tectonic, volcanic, and ground water studies
• ARIA Time Series/Coseismic Data Products - hazard monitoring and response focused
• ARIA data system designed to integrate GPS and InSAR - GPS tropospheric delay used for correcting InSAR; Caltech's GIANT time series analysis uses GPS to correct orbital errors in InSAR
• Zhen Liu is talking tomorrow on InSAR time series analysis
NASA Astrophysics Data System (ADS)
Scarth, P.; Trevithick, B.; Beutel, T.
2016-12-01
VegMachine Online is a freely available browser application that allows ranchers across Australia to view and interact with satellite-derived ground cover state and change maps of their property and extract this information in graphical form using interactive tools. It supports the delivery and communication of a massive earth observation data set in an accessible, producer-friendly way. Around 250,000 Landsat TM, ETM and OLI images were acquired across Australia, converted to terrain-corrected surface reflectance and masked for cloud, cloud shadow, terrain shadow and water. More than 2500 field sites across the Australian rangelands were used to derive endmembers for a constrained unmixing approach that estimates the per-pixel proportion of bare, green and non-green vegetation for all images. A seasonal medoid compositing method was used to produce national fractional cover virtual mosaics for each three-month period since 1988. The time series of green fraction is used to estimate the persistent green due to tree and shrub canopies, and this estimate is used to correct the fractional cover to ground cover for our mixed tree-grass rangeland systems. Finally, deciles are produced for key metrics every season to track each pixel's standing relative to the entire time series. These data are delivered through time series enabled web mapping services and customised web processing services that enable the full time series over any spatial extent to be interrogated in seconds via a RESTful interface. These services interface with a front-end browser application that provides product visualization for any date in the time series, tools to draw or import polygon boundaries, plot time series ground cover comparisons, examine the effect of historical rainfall, and run the revised universal soil loss equation in web time to assess the effect of proposed changes in cover retention.
VegMachine Online is already being used by ranchers monitoring paddock condition, organisations supporting land management initiatives in Great Barrier Reef catchments, by students developing tools to understand land condition and degradation and the underlying data and APIs are supporting several other land condition mapping tools.
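The constrained unmixing step above can be illustrated with a toy sketch: a brute-force search over the simplex of fractions (non-negative, summing to one) for three endmembers. The real pipeline presumably uses a proper constrained least-squares solver over many spectral bands; the endmember reflectances below are invented for illustration.

```python
def unmix(pixel, endmembers, step=0.01):
    """Fully constrained unmixing by brute-force search on the simplex:
    three fractions, each non-negative, summing to one. Minimizes the squared
    difference between the modelled mixture and the observed pixel."""
    best, best_err = None, float("inf")
    n = int(round(1 / step))
    for i in range(n + 1):
        for j in range(n + 1 - i):
            f = (i * step, j * step, 1 - (i + j) * step)
            err = sum(
                (sum(fk * e[b] for fk, e in zip(f, endmembers)) - pixel[b]) ** 2
                for b in range(len(pixel)))
            if err < best_err:
                best, best_err = f, err
    return best

# invented two-band endmember reflectances for bare, green, non-green
ems = [(0.4, 0.5), (0.1, 0.6), (0.2, 0.3)]
pixel = (0.27, 0.49)   # a mixture of these endmembers
f = unmix(pixel, ems)
```

For this synthetic pixel the search should recover fractions close to (0.5, 0.3, 0.2); in practice a solver such as non-negative least squares with a sum-to-one constraint replaces the grid search.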
Parameterizing time in electronic health record studies.
Hripcsak, George; Albers, David J; Perotte, Adler
2015-07-01
Fields like nonlinear physics offer methods for analyzing time series, but many methods require that the time series be stationary, that is, show no change in properties over time. Medicine is far from stationary, but the challenge may be ameliorated by reparameterizing time, because clinicians tend to measure patients more frequently when they are ill and their values are more likely to vary. We compared time parameterizations, measuring variability of rate of change and magnitude of change, and looking for homogeneity of bins of temporal separation between pairs of time points. We studied four common laboratory tests drawn from 25 years of electronic health records on 4 million patients. We found that sequence time, that is, simply counting the number of measurements from some start, produced more stationary time series, better explained the variation in values, and had more homogeneous bins than either traditional clock time or a recently proposed intermediate parameterization. Sequence time also produced more accurate predictions in a single Gaussian process model experiment. Of the three parameterizations, sequence time appeared to produce the most stationary series, possibly because clinicians adjust their sampling to the acuity of the patient. Parameterizing by sequence time may be applicable to association and clustering experiments on electronic health record data. A limitation of this study is that laboratory data were derived from only one institution. Sequence time appears to be an important potential parameterization. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
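The sequence-time idea is simple to state in code: replace each measurement's clock time with its index in the chronologically sorted sequence. A minimal sketch (the record layout here is ours, not the paper's):

```python
def to_sequence_time(records):
    """Reparameterize a patient's lab series from clock time to sequence time:
    the i-th measurement (in chronological order) is assigned time i,
    discarding the irregular gaps between draws."""
    ordered = sorted(records, key=lambda r: r[0])   # sort by clock time
    return [(i, value) for i, (_, value) in enumerate(ordered)]

# draws cluster when the patient is ill; sequence time evens them out
labs = [(0.0, 140), (0.2, 152), (0.4, 149), (30.0, 141)]  # (days, value)
seq = to_sequence_time(labs)  # [(0, 140), (1, 152), (2, 149), (3, 141)]
```

The three closely spaced draws and the one a month later become equally spaced, which is why the resulting series tends to look more stationary.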
Mining Gene Regulatory Networks by Neural Modeling of Expression Time-Series.
Rubiolo, Mariano; Milone, Diego H; Stegmayer, Georgina
2015-01-01
Discovering gene regulatory networks from data is one of the most studied topics in recent years. Neural networks can be successfully used to infer an underlying gene network by modeling expression profiles as time series. This work proposes a novel method based on a pool of neural networks for obtaining a gene regulatory network from a gene expression dataset. The networks are used to model each possible interaction between pairs of genes in the dataset, and a set of mining rules is applied to accurately detect the underlying relations among genes. The results obtained on artificial and real datasets confirm the method's effectiveness for discovering regulatory networks from a proper modeling of the temporal dynamics of gene expression profiles.
Historical instrumental climate data for Australia - quality and utility for palaeoclimatic studies
NASA Astrophysics Data System (ADS)
Nicholls, Neville; Collins, Dean; Trewin, Blair; Hope, Pandora
2006-10-01
The quality and availability of climate data suitable for palaeoclimatic calibration and verification for the Australian region are discussed and documented. Details of the various datasets, including problems with the data, are presented. High-quality datasets, where such problems are reduced or even eliminated, are discussed. Many climate datasets are now analysed onto grids, facilitating the preparation of regional-average time series. Work is under way to produce such high-quality, gridded datasets for a variety of hitherto unavailable climate data, including surface humidity, pan evaporation, wind, and cloud. An experiment suggests that only a relatively small number of palaeoclimatic time series could provide a useful estimate of long-term changes in Australian annual average temperature.
NASA Astrophysics Data System (ADS)
Panteleev, Ivan; Bayandin, Yuriy; Naimark, Oleg
2017-12-01
This work performs a correlation analysis of the statistical properties of continuous acoustic emission recorded in different parts of marble and fiberglass laminate samples under quasi-static deformation. A spectral coherence measure of the time series, a generalization of the squared coherence spectrum to multidimensional series, was chosen. The spectral coherence measure was estimated in a sliding time window for two parameters of the acoustic emission multifractal singularity spectrum: the spectrum width and the generalized Hurst exponent realizing the maximum of the singularity spectrum. It is shown that the preparation of the macrofracture focus is accompanied by the synchronization (coherent behavior) of the statistical properties of acoustic emission in selected frequency intervals.
Ahmed, Ashik; Al-Amin, Rasheduzzaman; Amin, Ruhul
2014-01-01
This paper proposes the design of a Static Synchronous Series Compensator (SSSC) based damping controller to enhance the stability of a Single Machine Infinite Bus (SMIB) system by means of the Invasive Weed Optimization (IWO) technique. A conventional PI controller, taking rotor speed deviation as its input, is used as the SSSC damping controller. The damping controller parameters are tuned with IWO using a cost function based on the time integral of absolute error (ITAE). The performance of the IWO-based controller is compared with that of a Particle Swarm Optimization (PSO) based controller. Time domain simulation results are presented, and the performance of the controllers under different loading conditions and fault scenarios is studied to illustrate the effectiveness of the IWO-based design approach.
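The ITAE cost that the optimizer minimizes can be sketched on a toy plant. The actual SMIB model is not reproduced in the abstract, so the first-order plant, gains and step sizes below are purely illustrative stand-ins.

```python
def itae_cost(kp, ki, t_end=10.0, dt=0.01):
    """Time-integral-of-absolute-error (ITAE) cost for a candidate PI
    controller on a toy first-order plant dy/dt = -y + u. The plant is an
    illustrative stand-in, not the SMIB model from the paper."""
    y, integ, cost, t = 0.0, 0.0, 0.0, 0.0
    setpoint = 1.0
    while t < t_end:
        e = setpoint - y
        integ += e * dt
        u = kp * e + ki * integ      # PI control law
        y += (-y + u) * dt           # Euler step of the plant
        cost += t * abs(e) * dt      # ITAE weights late errors more heavily
        t += dt
    return cost
```

An optimizer such as IWO or PSO would evaluate this cost for each candidate (kp, ki) pair and keep the gains with the lowest value; ITAE penalizes errors that persist late in the response, favoring fast, well-damped settling.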
DNAism: exploring genomic datasets on the web with Horizon Charts.
Rio Deiros, David; Gibbs, Richard A; Rogers, Jeffrey
2016-01-27
Computational biologists daily face the need to explore massive amounts of genomic data. New visualization techniques can help researchers navigate and understand these large datasets. Horizon Charts are a relatively new visualization method that, under the right circumstances, maximizes data density without sacrificing graphical perception. Horizon Charts have been successfully applied to understanding multi-metric time series data. We have adapted an existing JavaScript library (Cubism) that implements Horizon Charts for the time series domain so that it works effectively with genomic datasets. We call this new library DNAism. Horizon Charts can be an effective visual tool for exploring complex and large genomic datasets. Researchers can use our library to leverage these techniques to extract additional insights from their own datasets.
From blackbirds to black holes: Investigating capture-recapture methods for time domain astronomy
NASA Astrophysics Data System (ADS)
Laycock, Silas G. T.
2017-07-01
In time domain astronomy, recurrent transients present a special problem: how to infer total populations from limited observations. Monitoring observations may give a biased view of the underlying population due to limitations on observing time, visibility and instrumental sensitivity. A similar problem exists in the life sciences, where animal populations (such as migratory birds) or disease prevalence must be estimated from sparse and incomplete data. The class of methods termed Capture-Recapture is used to reconstruct population estimates from time-series records of encounters with the study population. This paper investigates the performance of Capture-Recapture methods in astronomy via a series of numerical simulations. The Blackbirds code simulates monitoring of populations of transients, in this case accreting binary stars (neutron star or black hole accreting from a stellar companion), under a range of observing strategies. We first generate realistic light-curves for populations of binaries with contrasting orbital period distributions. These models are then randomly sampled at observing cadences typical of existing and planned monitoring surveys. The classical capture-recapture methods (the Lincoln-Petersen and Schnabel estimators and related techniques) and newer methods implemented in the Rcapture package are compared. A general exponential model based on the radioactive decay law is introduced and demonstrated to recover (at 95% confidence) the underlying population abundance and duty cycle in a fraction (10-50%) of the observing visits required to discover all the sources in the simulation. Capture-Recapture is a promising addition to the toolbox of time domain astronomy, and methods implemented in R by the biostats community can be readily called from within python.
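The simplest of the compared estimators, the Lincoln-Petersen estimator (here in Chapman's bias-corrected form), is a one-liner; the survey counts below are invented for illustration.

```python
def lincoln_petersen(n1, n2, m2):
    """Chapman's bias-corrected Lincoln-Petersen estimate of population size.

    n1: sources detected ("marked") in the first survey epoch
    n2: sources detected in the second epoch
    m2: sources seen in both epochs ("recaptures")
    """
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

# e.g. 40 transients seen in epoch 1, 35 in epoch 2, 20 seen in both
estimate = lincoln_petersen(40, 35, 20)   # ≈ 69.3 sources in total
```

The intuition: the fraction of epoch-2 detections that are recaptures estimates the fraction of the whole population detected in epoch 1, so the total scales as n1*n2/m2; Chapman's +1 terms reduce small-sample bias.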
A laboratory assessment of the measurement accuracy of weighing type rainfall intensity gauges
NASA Astrophysics Data System (ADS)
Colli, M.; Chan, P. W.; Lanza, L. G.; La Barbera, P.
2012-04-01
In recent years the WMO Commission for Instruments and Methods of Observation (CIMO) fostered noticeable advances in precipitation measurement accuracy by providing recommendations on the standardization of equipment and exposure, instrument calibration and data correction, following various comparative campaigns involving manufacturers and national meteorological services from the participating countries (Lanza et al., 2005; Vuerich et al., 2009). Extreme event analysis has been shown to be highly affected by on-site rainfall intensity (RI) measurement accuracy (see e.g. Molini et al., 2004), and the time resolution of the available RI series certainly constitutes another key factor in constructing hyetographs that are representative of real rain events. The OTT Pluvio2 weighing gauge (WG) and the GEONOR T-200 vibrating-wire precipitation gauge demonstrated very good performance in previous constant flow rate calibration efforts (Lanza et al., 2005). Although WGs provide better performance than more traditional tipping-bucket rain gauges (TBR) under continuous and constant reference intensity, dynamic effects seem to affect the accuracy of WG measurements under real-world, time-varying rainfall conditions (Vuerich et al., 2009). The most relevant is due to the response time of the acquisition system and the resulting systematic delay of the instrument in assessing the exact weight of the bin containing cumulated precipitation. This delay assumes a relevant role when high resolution rain intensity time series are sought from the instrument, as is the case in many hydrologic and meteo-climatic applications. This work reports the laboratory evaluation of Pluvio2 and T-200 rainfall intensity measurement accuracy. Tests are carried out by simulating different artificial precipitation events, namely non-stationary rainfall intensities, using a highly accurate dynamic rainfall generator.
Time series measured by an Ogawa drop counter (DC) at a field test site located within the Hong Kong International Airport (HKIA) were aggregated at a 1-minute scale and used as reference for the artificial rain generation (Colli et al., 2012). The preliminary development and validation of the rainfall simulator for the generation of variable time step reference intensities is also shown. The generator is characterized by a sufficiently short time response with respect to the expected weighing gauge behavior to ensure effective comparison of the measured and reference intensities at very high resolution in time.
46 CFR 133.70 - Personal lifesaving appliances.
Code of Federal Regulations, 2010 CFR
2010-10-01
... carry lifebuoys approved under approval series 160.150 or 160.050 as follows: (1) Number. The number of... light approved under approval series 161.010. The self-igniting light must not be attached to the...) Lifejackets. Each OSV must carry lifejackets approved under approval series 160.002, 160.005, 160.055, 160.077...
46 CFR 133.70 - Personal lifesaving appliances.
Code of Federal Regulations, 2012 CFR
2012-10-01
... carry lifebuoys approved under approval series 160.150 or 160.050 as follows: (1) Number. The number of... light approved under approval series 161.010. The self-igniting light must not be attached to the...) Lifejackets. Each OSV must carry lifejackets approved under approval series 160.002, 160.005, 160.055, 160.077...
46 CFR 133.70 - Personal lifesaving appliances.
Code of Federal Regulations, 2014 CFR
2014-10-01
... carry lifebuoys approved under approval series 160.150 or 160.050 as follows: (1) Number. The number of... light approved under approval series 161.010. The self-igniting light must not be attached to the...) Lifejackets. Each OSV must carry lifejackets approved under approval series 160.002, 160.005, 160.055, 160.077...
46 CFR 133.70 - Personal lifesaving appliances.
Code of Federal Regulations, 2013 CFR
2013-10-01
... carry lifebuoys approved under approval series 160.150 or 160.050 as follows: (1) Number. The number of... light approved under approval series 161.010. The self-igniting light must not be attached to the...) Lifejackets. Each OSV must carry lifejackets approved under approval series 160.002, 160.005, 160.055, 160.077...
46 CFR 133.70 - Personal lifesaving appliances.
Code of Federal Regulations, 2011 CFR
2011-10-01
... carry lifebuoys approved under approval series 160.150 or 160.050 as follows: (1) Number. The number of... light approved under approval series 161.010. The self-igniting light must not be attached to the...) Lifejackets. Each OSV must carry lifejackets approved under approval series 160.002, 160.005, 160.055, 160.077...
Monitoring Items in Real Time to Enhance CAT Security
ERIC Educational Resources Information Center
Zhang, Jinming; Li, Jie
2016-01-01
An IRT-based sequential procedure is developed to monitor items for enhancing test security. The procedure uses a series of statistical hypothesis tests to examine whether the statistical characteristics of each item under inspection have changed significantly during CAT administration. This procedure is compared with a previously developed…
33 CFR 149.315 - What embarkation, launching, and recovery arrangements must rescue boats meet?
Code of Federal Regulations, 2014 CFR
2014-07-01
... GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) DEEPWATER PORTS DEEPWATER PORTS: DESIGN, CONSTRUCTION... the shortest possible time. (c) If the rescue boat is one of the deepwater port's survival craft, then... disengaging apparatus, approved under approval series 160.170, instead of a lifeboat release mechanism. (f...
29 CFR 1607.4 - Information on impact.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Report EEO-1 series of reports. The user should adopt safeguards to insure that the records required by... reliable, evidence concerning the impact of the procedure over a longer period of time and/or evidence... may in design and execution be race, color, sex, or ethnic conscious, selection procedures under such...
46 CFR 108.550 - Survival craft launching and recovery arrangements: General.
Code of Federal Regulations, 2013 CFR
2013-10-01
...-MOBILE OFFSHORE DRILLING UNITS DESIGN AND EQUIPMENT Lifesaving Equipment § 108.550 Survival craft...) A launching appliance approved on or before November 10, 2011 under approval series 160.163. (b) All... being launched with their full complement of persons and equipment within 10 minutes from the time the...
Demand for Light Duty Trucks : The Wharton EFA Motor Vehicle Demand Model (Mark II).
DOT National Transportation Integrated Search
1981-01-01
A preliminary model of U.S. light-duty vehicle demand is presented which contains an integrated analysis of automobiles and light trucks (under 10,000 lbs. GVW). The model has been estimated using both cross-section and time-series data, and is a dev...
A Comparison of Numerical Problem Solving under Three Types of Calculation Conditions.
ERIC Educational Resources Information Center
Roberts, Dennis M.; Glynn, Shawn M.
1978-01-01
The study reported is the first in a series of investigations designed to empirically test the hypothesis that calculators reduce quantitative working time and increase computational accuracy, and to examine the relative magnitude of benefit that accompanies utilizing calculators compared to manual work. (MN)
A method for assessment of watershed health is developed by employing measures of reliability, resilience and vulnerability (R-R-V) using stream water quality data. Observed water quality data are usually sparse, so that a water quality time series is often reconstructed using s...
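The R-R-V measures named above follow the standard Hashimoto-style definitions; a minimal sketch for a series in which exceeding a water quality threshold counts as failure. Definitions of resilience and vulnerability vary across studies, so this is one common choice, not necessarily the exact formulation of the paper.

```python
def rrv(series, threshold):
    """Reliability-resilience-vulnerability for a water quality series where
    values above `threshold` count as failure (Hashimoto-style definitions).

    reliability:   fraction of time steps in compliance
    resilience:    probability of recovering in the next step, given failure
    vulnerability: worst exceedance magnitude during failure
    """
    fail = [v > threshold for v in series]
    reliability = 1 - sum(fail) / len(fail)
    recoveries = sum(1 for a, b in zip(fail, fail[1:]) if a and not b)
    fail_steps = sum(fail)
    resilience = recoveries / fail_steps if fail_steps else 1.0
    excursions = [v - threshold for v, f in zip(series, fail) if f]
    vulnerability = max(excursions) if excursions else 0.0
    return reliability, resilience, vulnerability

# toy concentration series against a threshold of 2.5
r, s, v = rrv([1.0, 3.0, 2.0, 4.0, 1.0], 2.5)
```

Because observed water quality data are sparse, these measures would in practice be computed on the reconstructed (gap-filled) series rather than the raw observations.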
ERIC Educational Resources Information Center
Muller, Eugene W.
1985-01-01
Develops generalizations for empirical evaluation of software based upon suitability of several research designs--pretest posttest control group, single-group pretest posttest, nonequivalent control group, time series, and regression discontinuity--to type of software being evaluated, and on circumstances under which evaluation is conducted. (MBR)
A 40 Year Time Series of SBUV Observations: the Version 8.6 Processing
NASA Technical Reports Server (NTRS)
McPeters, Richard; Bhartia, P. K.; Flynn, L.
2012-01-01
Under a NASA program to produce long term data records from instruments on multiple satellites (MEaSUREs), data from a series of eight SBUV and SBUV/2 instruments have been reprocessed to create a 40 year long ozone time series. Data from the Nimbus 4 BUV, Nimbus 7 SBUV, and SBUV/2 instruments on NOAA 9, 11, 14, 16, 17, and 18 were used, covering the periods 1970 to 1972 and 1979 to the present. In past analyses, an ozone time series was created from these instruments by adjusting ozone itself, instrument by instrument, for consistency during overlap periods. In the version 8.6 processing, adjustments were made to the radiance calibration of each instrument to maintain a consistent calibration over the entire time series. Data for all eight instruments were then reprocessed using the adjusted radiances. Reprocessing is necessary to produce an accurate latitude dependence. Other improvements incorporated in version 8.6 included the use of the ozone cross sections of Brion, Daumont, and Malicet, and the use of a cloud height climatology derived from Aura OMI measurements. The new cross sections have a more accurate temperature dependence than the cross sections previously used. The OMI-based cloud heights account for the penetration of UV into the upper layers of clouds. The consistency of the version 8.6 time series was evaluated by intra-instrument comparisons during overlap periods, comparisons with ground-based instruments, and comparisons with measurements made by instruments on other satellites such as SAGE II and UARS MLS. These comparisons show that for the instruments on NOAA 16, 17 and 18, the instrument calibrations were remarkably stable and consistent from instrument to instrument. The data record from the Nimbus 7 SBUV was also very stable, and SAGE and ground-based comparisons show that the calibration was consistent with measurements made years later by the NOAA 16 instrument.
The calibrations of the SBUV/2 instruments on NOAA 9, 11, and 14 were more of a problem. The rapidly drifting orbits of these satellites resulted in relative time and altitude dependent differences that are significant. Despite these problems, total column ozone appears to be consistent to better than 1% over the entire time series, while the ozone vertical distribution is consistent to approximately 5%.
Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P
2014-10-01
There are now domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, like medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events, and compares two time series by analyzing the events they have in common. We have applied our framework to time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed schema has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains generating iconographic time series, such as electrocardiography (ECG).
Organics removal from landfill leachate and activated sludge production in SBR reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klimiuk, Ewa; Kulikowska, Dorota
2006-07-01
This study is aimed at estimating organic compounds removal and sludge production in SBRs during treatment of landfill leachate. Four series were performed. In each series, experiments were carried out at hydraulic retention times (HRT) of 12, 6, 3 and 2 d. The series varied in SBR filling strategies, duration of the mixing and aeration phases, and the sludge age. In series 1 and 2 (a short filling period, mixing and aeration phases in the operating cycle), the relationship between organics concentration (COD) in the treated leachate and HRT followed pseudo-first-order kinetics. In series 3 (with mixing and aeration phases) and series 4 (aeration phase only), with leachate supplied by a peristaltic pump for 4 h of the cycle (filling during the reaction period), this relationship followed zero-order kinetics. Activated sludge production, expressed as the observed coefficient of biomass production (Y_obs), decreased with increasing HRT. The smallest differences between reactors were observed in series 3, in which Y_obs was almost stable (0.55-0.6 mg VSS/mg COD). The elimination of the mixing phase in the cycle (series 4) caused Y_obs to decrease significantly, from 0.32 mg VSS/mg COD at HRT 2 d to 0.04 mg VSS/mg COD at HRT 12 d. The theoretical yield coefficient Y was 0.534 mg VSS/mg COD (series 1) and 0.583 mg VSS/mg COD (series 2); in series 3 and 4 it was almost stable (0.628 and 0.616 mg VSS/mg COD, respectively). After the elimination of the mixing phase in the operating cycle, the specific biomass decay rate increased from 0.006 d^-1 (series 3) to 0.032 d^-1 (series 4). Operating conditions employing mixing/aeration or aeration-only phases enable regulation of the sludge production. SBRs operated under aerobic conditions are more favourable at short hydraulic retention times.
At long hydraulic retention times, cell decay can decrease the biomass concentration in the SBR. By contrast, for the activated sludge at long HRT, a short filling period and an operating cycle with mixing and aeration phases seem the most favourable.
What does the structure of its visibility graph tell us about the nature of the time series?
NASA Astrophysics Data System (ADS)
Franke, Jasper G.; Donner, Reik V.
2017-04-01
Visibility graphs are a recently introduced method to construct complex network representations of univariate time series in order to study their dynamical characteristics [1]. In recent years, this approach has been successfully applied to a considerable variety of geoscientific research questions and data sets, including non-trivial temporal patterns in complex earthquake catalogs [2] and time-reversibility in climate time series [3]. It has been shown that several characteristic features of the networks thus constructed differ between stochastic and deterministic (possibly chaotic) processes, which is, however, relatively hard to exploit in real-world applications. In this study, we propose studying two new measures related to the network complexity of visibility graphs constructed from time series: a special type of network entropy [4] and a recently introduced measure of the heterogeneity of the network's degree distribution [5]. For paradigmatic model systems exhibiting bifurcation sequences between regular and chaotic dynamics, both properties clearly trace the transitions between the two types of regimes and exhibit marked quantitative differences for regular and chaotic dynamics. Moreover, for dynamical systems with a small amount of additive noise, the considered properties show gradual changes prior to the bifurcation point. This finding appears closely related to the loss of stability of the current state, which is known to lead to critical slowing down as the transition point is approached. In this spirit, both considered visibility graph characteristics provide alternative tracers of dynamical early warning signals consistent with classical indicators.
Our results demonstrate that measures of visibility graph complexity (i) provide a potentially useful means of tracing changes in the dynamical patterns encoded in a univariate time series that originate from increasing autocorrelation and (ii) allow one to systematically distinguish regular from deterministic-chaotic dynamics. We demonstrate the application of our method for different model systems as well as selected paleoclimate time series from the North Atlantic region. Notably, visibility graph based methods are particularly suited to studying the latter type of geoscientific data, since they do not impose intrinsic restrictions or assumptions on the nature of the time series under investigation in terms of noise process, linearity and sampling homogeneity. [1] Lacasa, Lucas, et al. "From time series to complex networks: The visibility graph." Proceedings of the National Academy of Sciences 105.13 (2008): 4972-4975. [2] Telesca, Luciano, and Michele Lovallo. "Analysis of seismic sequences by using the method of visibility graph." EPL (Europhysics Letters) 97.5 (2012): 50002. [3] Donges, Jonathan F., Reik V. Donner, and Jürgen Kurths. "Testing time series irreversibility using complex network methods." EPL (Europhysics Letters) 102.1 (2013): 10004. [4] Small, Michael. "Complex networks from time series: capturing dynamics." 2013 IEEE International Symposium on Circuits and Systems (ISCAS2013), Beijing (2013): 2509-2512. [5] Jacob, Rinku, K.P. Harikrishnan, Ranjeev Misra, and G. Ambika. "Measure for degree heterogeneity in complex networks and its application to recurrence network analysis." arXiv preprint 1605.06607 (2016).
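The natural visibility graph of ref. [1] is straightforward to construct for short series: two samples are connected when the straight line between them clears every intermediate sample. A minimal (cubic-time) sketch:

```python
def visibility_graph(x):
    """Natural visibility graph (Lacasa et al.): nodes i, j are linked when
    every sample between them lies strictly below the straight line joining
    (i, x[i]) and (j, x[j])."""
    n = len(x)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if all(x[k] < x[j] + (x[i] - x[j]) * (j - k) / (j - i)
                   for k in range(i + 1, j)):
                edges.add((i, j))
    return edges
```

Network measures such as the degree-distribution heterogeneity or network entropy discussed above are then computed on the resulting edge set; faster divide-and-conquer constructions exist for long series.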
Proceedings of the 6th annual Speakeasy conference. [Chicago, August 17-18, 1978
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1978-01-01
This meeting on the Speakeasy programming language and its applications included papers on the following subjects: graphics (graphics under Speakeasy, Speakeasy on a mini, color graphics), time series (OASIS - a user-oriented system at USDA, writing input-burdened linkules), applications (weather and crop yield analysis system, property investment analysis system), data bases under Speakeasy (relational data base, applications of relational data bases), survey analysis (survey analysis package from Liege, sic and its future under Speakeasy), and new features in Speakeasy (partial differential equations, the Speakeasy compiler and optimization). (RWR)
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-04
..., as well as 500 grams of Artemia salina cysts as food for krill. They plan to measure how fast DNA is... krill at a series of later time points. By measuring how much of the prey DNA is left in the krill guts after various amounts of time since feeding, they can calculate how quickly the DNA was digested...
Understanding stellar activity and flares to search for Earth-like exoplanets
NASA Astrophysics Data System (ADS)
Del Sordo, Fabio
2015-08-01
The radial velocity method is a powerful way to search for exoplanetary systems and has led to many discoveries of exoplanets in the last 20 years. Nowadays, understanding stellar activity, flares and noise is a key factor for achieving a substantial improvement in this technique. Radial-velocity data are time series containing the effects of both planets and stellar disturbances: the detection of Earth-like planets requires improving the signal-to-noise ratio, so understanding the noise present in the data is central. Noise is caused by physical processes which operate on different time-scales, oftentimes acting in a non-periodic fashion. We present here an approach to this problem: to look for multifractal structures in the time series coming from radial velocity measurements, identifying the underlying long-range correlations and fractal scaling properties and connecting them to the underlying physical processes (stellar oscillations, stellar wind, granulation, rotation, magnetic activity). This method has previously been applied to satellite data related to Arctic sea albedo, where it is relevant for identifying trends and noise in the Arctic sea ice (Agarwal, Moon, Wettlaufer, 2012). Here we suggest using this analysis for exoplanetary data related to possible Earth-like planets.
Phenomenological analysis of medical time series with regular and stochastic components
NASA Astrophysics Data System (ADS)
Timashev, Serge F.; Polyakov, Yuriy S.
2007-06-01
Flicker-Noise Spectroscopy (FNS), a general approach to the extraction and parameterization of resonant and stochastic components contained in medical time series, is presented. The basic idea of FNS is to treat the correlation links present in sequences of different irregularities, such as spikes, "jumps", and discontinuities in derivatives of different orders, on all levels of the spatiotemporal hierarchy of the system under study as the main information carriers. The tools used to extract and analyze the information are power spectra and difference moments (structure functions), which complement each other. The stochastic component of the structure functions is formed exclusively by "jumps" of the dynamic variable, while the stochastic component of the power spectrum is formed by both spikes and "jumps" on every level of the hierarchy. The information "passport" characteristics, determined by fitting the derived expressions to the experimental variations in the stochastic components of the power spectra and structure functions, are interpreted as the correlation times and as parameters describing the rate of "memory loss" on these correlation time intervals for the different irregularities. The number of extracted parameters is determined by the requirements of the problem under study. Application of this approach to the analysis of tremor velocity signals for a Parkinsonian patient is discussed.
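The two FNS tools named above can be sketched numerically. The following is a minimal illustration, not the authors' full FNS parameterization: it computes the second-order difference moments (structure functions) and a periodogram power spectrum for a Brownian-like test series; the function names and the test signal are my own choices.

```python
import numpy as np

def structure_function(x, p=2, max_lag=100):
    """Difference moments Phi_p(tau) = <|x(t + tau) - x(t)|^p>."""
    return np.array([np.mean(np.abs(x[lag:] - x[:-lag]) ** p)
                     for lag in range(1, max_lag + 1)])

def power_spectrum(x):
    """One-sided periodogram estimate of the power spectrum."""
    x = x - np.mean(x)
    spec = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    return np.fft.rfftfreq(len(x)), spec

rng = np.random.default_rng(0)
signal = np.cumsum(rng.standard_normal(4096))   # Brownian-like test series
phi2 = structure_function(signal)               # grows roughly like tau for a random walk
freqs, spec = power_spectrum(signal)
```

Fitting parameterized expressions to `phi2` and `spec`, as FNS prescribes, would then yield the correlation times and "memory loss" parameters described in the abstract.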
Numerical analysis of transient fields near thin-wire antennas and scatterers
NASA Astrophysics Data System (ADS)
Landt, J. A.
1981-11-01
Under the premise that `accelerated charge radiates,' one would expect radiation on wire structures to occur from driving points, ends of wires, bends in wires, or locations of lumped loading. Here, this premise is investigated in a series of numerical experiments. The numerical procedure is based on a moment-method solution of a thin-wire time-domain electric-field integral equation. The fields in the vicinity of wire structures are calculated for short impulsive-type excitations, and are viewed in a series of time sequences or snapshots. For these excitations, the fields are spatially limited in the radial dimension, and expand in spheres centered about points of radiation. These centers of radiation coincide with the above list of possible source regions. Time retardation permits these observations to be made clearly in the time domain, similar to time-range gating. In addition to providing insight into transient radiation processes, these studies show that the direction of energy flow is not always defined by Poynting's vector near wire structures.
Detection of a sudden change of the field time series based on the Lorenz system.
Da, ChaoJiu; Li, Fang; Shen, BingLu; Yan, PengCheng; Song, Jian; Ma, DeShan
2017-01-01
We conducted an exploratory study of the detection of a sudden change of the field time series based on the numerical solution of the Lorenz system. First, the times when the Lorenz path jumped between the regions on the left and right of the equilibrium point of the Lorenz system were quantitatively marked, giving the sudden change times of the Lorenz system. Second, the numerical solution of the Lorenz system was regarded as a vector; thus, this solution could be considered a vector time series. We transformed the vector time series into a scalar time series using the vector inner product, taking into account the geometric and topological features of the Lorenz system path. Third, sudden changes in the resulting time series were detected using the sliding t-test method. Comparing the test results with the quantitatively marked times showed that the method could detect every sudden change of the Lorenz path; thus, the method is effective. Finally, we used the method to detect sudden changes in pressure field and temperature field time series, and obtained good results for both series, which indicates that the method can be applied to high-dimensional vector time series. Mathematically, there is no essential difference between field time series and vector time series; thus, we provide a new method for detecting sudden changes in field time series.
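A minimal sketch of the procedure described above. The Euler integration step, window length, and detection threshold are illustrative assumptions, not values from the paper: integrate the Lorenz system, collapse the vector path to a scalar series via an inner product with a fixed reference vector, and apply a sliding t-test.

```python
import numpy as np

def lorenz_path(n=8000, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate the Lorenz system with forward Euler; return the vector path."""
    xyz = np.empty((n, 3))
    xyz[0] = (1.0, 1.0, 1.0)
    for i in range(n - 1):
        x, y, z = xyz[i]
        xyz[i + 1] = xyz[i] + dt * np.array([sigma * (y - x),
                                             x * (rho - z) - y,
                                             x * y - beta * z])
    return xyz

def sliding_t(series, win=50):
    """Two-sample t statistic between adjacent windows at every position."""
    t = np.zeros(len(series))
    for i in range(win, len(series) - win):
        a, b = series[i - win:i], series[i:i + win]
        s = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / win)
        t[i] = (a.mean() - b.mean()) / s
    return t

path = lorenz_path()
scalar = path @ np.array([1.0, 1.0, 1.0])   # vector inner product -> scalar series
tstat = sliding_t(scalar)
changes = np.where(np.abs(tstat) > 3.0)[0]  # candidate sudden-change times
```

Peaks in `|tstat|` cluster where the path jumps between the two lobes of the attractor, which is the kind of sudden change the marked times in the study capture.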
Scaling analysis and model estimation of solar corona index
NASA Astrophysics Data System (ADS)
Ray, Samujjwal; Ray, Rajdeep; Khondekar, Mofazzal Hossain; Ghosh, Koushik
2018-04-01
A monthly average solar green coronal index time series for the period from January 1939 to December 2008, collected from NOAA (the National Oceanic and Atmospheric Administration), is analysed in this paper from the perspective of scaling analysis and modelling. Smoothing and de-noising were performed using a suitable mother wavelet as a prerequisite. The Finite Variance Scaling Method (FVSM), the Higuchi method, rescaled range (R/S) analysis and a generalized method were applied to calculate the scaling exponents and fractal dimensions of the time series. The autocorrelation function (ACF) was used to identify an autoregressive (AR) process, and the partial autocorrelation function (PACF) was used to determine the order of the AR model. Finally, a best-fit model is proposed using the Yule-Walker method, supported by goodness-of-fit results and the wavelet spectrum. The results reveal anti-persistent, Short Range Dependent (SRD), self-similar properties with signatures of non-causality, non-stationarity and nonlinearity in the data series. The model shows the best fit to the data under observation.
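Of the scaling estimators listed, the rescaled range (R/S) method is the simplest to sketch. The following is an illustrative implementation; the window sizes and the regression-based Hurst estimate are my own choices, not the paper's exact procedure.

```python
import numpy as np

def rescaled_range(x, window):
    """Mean R/S statistic over non-overlapping windows of a given size."""
    rs = []
    for start in range(0, len(x) - window + 1, window):
        seg = x[start:start + window]
        dev = np.cumsum(seg - seg.mean())        # cumulative deviation from the mean
        rs.append((dev.max() - dev.min()) / seg.std(ddof=1))
    return np.mean(rs)

def hurst_rs(x, windows=(8, 16, 32, 64, 128)):
    """Hurst exponent from the slope of log(R/S) against log(window size)."""
    log_rs = np.log([rescaled_range(x, w) for w in windows])
    slope, _ = np.polyfit(np.log(windows), log_rs, 1)
    return slope

rng = np.random.default_rng(1)
white = rng.standard_normal(4096)   # uncorrelated noise: H near 0.5
h = hurst_rs(white)
```

An estimate below 0.5 would suggest the anti-persistent behaviour the paper reports for the coronal index; note that small-window R/S estimates are known to be biased upward.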
Characterization of time dynamical evolution of electroencephalographic epileptic records
NASA Astrophysics Data System (ADS)
Rosso, Osvaldo A.; Mairal, María Liliana
2002-09-01
Since traditional electrical brain signal analysis is mostly qualitative, the development of new quantitative methods is crucial for restricting the subjectivity in the study of brain signals. These methods are particularly fruitful when they are strongly correlated with intuitive physical concepts that allow a better understanding of the brain dynamics. The processing of information by the brain is reflected in dynamical changes of the electrical activity in time, frequency, and space. Therefore, the concomitant studies require methods capable of describing the variation of the signal in both time and frequency. The entropy defined from the wavelet functions is a measure of the degree of order/disorder present in a time series. Consequently, this entropy evaluated over EEG time series gives information about the underlying dynamical process in the brain, more specifically about the synchrony of the groups of cells involved in the different neural responses. The total wavelet entropy is independent of the signal energy and is therefore a good tool for detecting dynamical changes in the system behavior. In addition, the total wavelet entropy has advantages over the Lyapunov exponents because it is parameter-free and independent of the stationarity of the time series. In this work we compared the time evolution of the chaoticity (the Lyapunov exponent as a function of time) with the corresponding time evolution of the total wavelet entropy in two different EEG records, one provided by depth electrodes and the other by scalp electrodes.
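The total wavelet entropy described above can be sketched with a plain Haar decomposition. The paper does not specify the wavelet basis; the Haar choice and the number of levels here are assumptions for illustration.

```python
import numpy as np

def haar_level_energies(x, levels=6):
    """Energy of the Haar detail coefficients at each decomposition level."""
    a = np.asarray(x, dtype=float)
    energies = []
    for _ in range(levels):
        pairs = a[: len(a) // 2 * 2].reshape(-1, 2)
        detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2.0)   # detail coefficients
        a = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2.0)        # next approximation
        energies.append(np.sum(detail ** 2))
    return np.array(energies)

def total_wavelet_entropy(x, levels=6):
    """Shannon entropy of the relative wavelet energies across levels."""
    p = haar_level_energies(x, levels)
    p = p / p.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(2)
noise = rng.standard_normal(1024)   # broadband: energy spread across levels
tone = np.sin(2.0 * np.pi * 100.0 * np.arange(1024) / 1024.0)  # narrowband
en, et = total_wavelet_entropy(noise), total_wavelet_entropy(tone)
```

Broadband activity tends to spread energy across levels (higher entropy) while narrowband signals concentrate it in fewer levels, which is the ordered/disordered contrast the abstract exploits for EEG.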
Volatility of linear and nonlinear time series
NASA Astrophysics Data System (ADS)
Kalisky, Tomer; Ashkenazy, Yosef; Havlin, Shlomo
2005-07-01
Previous studies indicated that nonlinear properties of Gaussian-distributed time series with long-range correlations, ui, can be detected and quantified by studying the correlations in the magnitude series |ui|, the “volatility.” However, the origin of this empirical observation remains unclear, and the exact relation between the correlations in ui and the correlations in |ui| is still unknown. Here we develop analytical relations between the scaling exponent of a linear series ui and that of its magnitude series |ui|. Moreover, we find that nonlinear time series exhibit stronger (or equal) correlations in the magnitude time series compared with linear time series having the same two-point correlations. Based on these results we propose a simple model that generates multifractal time series by explicitly inserting long-range correlations in the magnitude series; the nonlinear multifractal time series is generated by multiplying a long-range correlated time series (representing the magnitude series) with an uncorrelated time series [representing the sign series sgn(ui)]. We apply our techniques to daily deep-ocean temperature records from the equatorial Pacific, the region of the El-Niño phenomenon, and find: (i) long-range correlations from several days to several years with a 1/f power spectrum, (ii) significant nonlinear behavior as expressed by long-range correlations of the volatility series, and (iii) a broad multifractal spectrum.
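The proposed construction can be sketched as follows. The Fourier-filtering generator and the exponent value are illustrative assumptions; the construction itself, long-range correlated magnitudes multiplied by uncorrelated signs, follows the description above.

```python
import numpy as np

def long_range_correlated(n, beta, rng):
    """Fourier filtering: shape white noise to a 1/f^beta power spectrum."""
    freqs = np.fft.rfftfreq(n)
    amp = np.zeros_like(freqs)
    amp[1:] = freqs[1:] ** (-beta / 2.0)          # amplitude ~ f^(-beta/2)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(freqs))
    x = np.fft.irfft(amp * np.exp(1j * phases), n)
    return x / x.std()

rng = np.random.default_rng(3)
n = 8192
magnitude = np.abs(long_range_correlated(n, beta=0.8, rng=rng))  # correlated magnitudes
signs = rng.choice([-1.0, 1.0], size=n)                          # uncorrelated signs
series = magnitude * signs   # nonlinear surrogate with multifractal-type volatility
```

The resulting series has (approximately) uncorrelated signs but long-range correlated volatility, the signature of nonlinearity that the paper measures in the magnitude series.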
Duality between Time Series and Networks
Campanharo, Andriana S. L. O.; Sirer, M. Irmak; Malmgren, R. Dean; Ramos, Fernando M.; Amaral, Luís A. Nunes.
2011-01-01
Studying the interaction between a system's components and the temporal evolution of the system are two common ways to uncover and characterize its internal workings. Recently, several maps from a time series to a network have been proposed with the intent of using network metrics to characterize time series. Although these maps demonstrate that different time series result in networks with distinct topological properties, it remains unclear how these topological properties relate to the original time series. Here, we propose a map from a time series to a network with an approximate inverse operation, making it possible to use network statistics to characterize time series and time series statistics to characterize networks. As a proof of concept, we generate an ensemble of time series ranging from periodic to random and confirm that application of the proposed map retains much of the information encoded in the original time series (or networks) after application of the map (or its inverse). Our results suggest that network analysis can be used to distinguish different dynamic regimes in time series and, perhaps more importantly, time series analysis can provide a powerful set of tools that augment the traditional network analysis toolkit to quantify networks in new and useful ways. PMID:21858093
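As an illustration of such a map, the following quantile-based sketch (an assumption in the spirit of the abstract; the paper's exact construction and its inverse operation are not reproduced here) assigns each value to a quantile bin and counts transitions between consecutive time points, producing a weighted network whose adjacency matrix is a transition matrix.

```python
import numpy as np

def series_to_network(x, q=4):
    """Quantile map: values fall into q quantile bins (the network nodes);
    edge weights count transitions between consecutive time points."""
    edges = np.quantile(x, np.linspace(0.0, 1.0, q + 1))
    labels = np.clip(np.searchsorted(edges, x, side='right') - 1, 0, q - 1)
    w = np.zeros((q, q))
    for a, b in zip(labels[:-1], labels[1:]):
        w[a, b] += 1.0
    return w / w.sum()    # normalized weighted adjacency (transition) matrix

rng = np.random.default_rng(4)
periodic = np.sin(np.linspace(0.0, 40.0 * np.pi, 2000))
noise = rng.standard_normal(2000)
w_per = series_to_network(periodic)
w_rnd = series_to_network(noise)
```

A periodic series concentrates its transitions in a few bin pairs, while uncorrelated noise spreads them across the whole matrix, illustrating how network statistics can distinguish dynamic regimes of the original time series.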
RECENT DEVELOPMENTS IN HYDROWEB DATABASE Water level time series on lakes and reservoirs (Invited)
NASA Astrophysics Data System (ADS)
Cretaux, J.; Arsen, A.; Calmant, S.
2013-12-01
We present the current state of the Hydroweb database as well as developments in progress. It provides offline water level time series on rivers, reservoirs and lakes based on altimetry data from several satellites (Topex/Poseidon, ERS, Jason-1&2, GFO and ENVISAT). The major developments in Hydroweb concern an operational data centre with automatic acquisition and processing of IGDR data to update the time series in near real time (for both lakes and rivers), and the use of additional remote sensing data, such as satellite imagery allowing the calculation of lake surfaces. A lake data centre is under development at Legos in coordination with the Hydrolare Project led by SHI (the State Hydrological Institute of the Russian Academy of Sciences). It will provide the level-surface-volume variations of about 230 lakes and reservoirs, calculated through a combination of various satellite images (Modis, Asar, Landsat, Cbers) and radar altimetry (Topex/Poseidon, Jason-1&2, GFO, Envisat, ERS2, AltiKa). The final objective is to propose a data centre fully based on remote sensing techniques and controlled by in situ infrastructure for the Global Terrestrial Network for Lakes (GTN-L) under the supervision of WMO and GCOS. In a longer perspective, the Hydroweb database will integrate data from future missions (Jason-3, Jason-CS, Sentinel-3A/B) and will ultimately serve the design of the SWOT mission. The products of Hydroweb will be used as input data for simulation of the SWOT products (water height and surface variations of lakes and rivers). In the future, the SWOT mission will allow monitoring, on a sub-monthly basis, of worldwide lakes and reservoirs bigger than 250 * 250 m, and Hydroweb will host water level and extent products from this mission.
Eagle, Sarah D.; Orndorff, William; Schwartz, Benjamin F.; Doctor, Daniel H.; Gerst, Jonathan D.; Schreiber, Madeline E.
2016-01-01
The epikarst, which consists of highly weathered rock in the upper vadose zone of exposed karst systems, plays a critical role in determining the hydrologic and geochemical characteristics of recharge to an underlying karst aquifer. This study utilized time series (2007–2014) of hydrologic and geochemical data of drip water collected within James Cave, Virginia, to examine the influence of epikarst on the quantity and quality of recharge in a mature, doline-dominated karst terrain. Results show a strong seasonality of both hydrology and geochemistry of recharge, which has implications for management of karst aquifers in temperate climatic zones. First, recharge (discharge from the epikarst to the underlying aquifer) reaches a maximum between late winter and early spring, with the onset of the recharge season ranging from as early as December to as late as March during the study period. The timing and duration of the recharge season were found to be a function of precipitation in excess of evapotranspiration on a seasonal time scale. Secondly, seasonally variable residence times for water in the epikarst influence rock-water interaction and, hence, the geochemical characteristics of recharge. Overall, results highlight the strong and complex influence that the epikarst has on karst recharge, which requires long-term and high-resolution data sets to accurately understand and quantify.
Nonlinear analysis and dynamic structure in the energy market
NASA Astrophysics Data System (ADS)
Aghababa, Hajar
This research assesses the dynamic structure of the energy sector of the aggregate economy in the context of nonlinear mechanisms. Earlier studies have focused mainly on the prices of energy products when detecting nonlinearities in time series data of the energy market, with little mention of the production side of the market. Moreover, the implications of high dimensionality and time aggregation for analyzing the market's fundamentals have been little explored. This research addresses these gaps by including the quantity side of the market in addition to prices and by systematically incorporating various sampling frequencies, in three essays. The goal of this research is to provide an inclusive and exhaustive examination of the dynamics of the energy markets. The first essay begins with the application of statistical techniques, incorporating the best-known univariate tests for nonlinearity, which have distinct power against alternatives and test different null hypotheses. It utilizes daily spot price observations on five major products in the energy market. The results suggest that the daily spot prices of the energy products are highly nonlinear in nature. They demonstrate apparent evidence of general nonlinear serial dependence in each individual series, as well as nonlinearity in the first, second, and third moments of the series. The second essay examines the underlying mechanism of crude oil production and identifies the nonlinear structure of the production market by utilizing various monthly time series of crude oil production: the U.S. field, the Organization of the Petroleum Exporting Countries (OPEC), non-OPEC, and world production of crude oil. The findings imply that the time series of U.S. field, OPEC, and world production of crude oil exhibit deep nonlinearity in their structure and are generated by nonlinear mechanisms.
However, the dynamics of the non-OPEC production time series do not reveal signs of nonlinearity. The third essay explores nonlinear structure in the case of high dimensionality of the observations, different sampling frequencies, and division of the samples into sub-samples. It systematically examines the robustness of the inference methods at various levels of time aggregation by employing daily spot prices on crude oil for 26 years as well as a monthly spot price index on crude oil for 41 years. The daily and monthly samples are divided into sub-samples as well. All the tests detect strong evidence of nonlinear structure in the daily spot price of crude oil, whereas in the monthly observations the evidence of nonlinear dependence is less dramatic, indicating that nonlinear serial dependence weakens as the level of time aggregation of the time series observations increases.
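The kind of univariate nonlinearity testing these essays describe can be sketched with a surrogate-data test. The statistic (time-reversal asymmetry) and the test series (a logistic map) are my own illustrative choices, not the essays' actual tests or data.

```python
import numpy as np

def trev(x, lag=1):
    """Time-reversal asymmetry statistic; near zero for linear Gaussian series."""
    d = x[lag:] - x[:-lag]
    return np.mean(d ** 3) / np.mean(d ** 2) ** 1.5

def phase_surrogate(x, rng):
    """Phase-randomized surrogate: same power spectrum, linearized dynamics."""
    spec = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(spec))
    phases[0] = phases[-1] = 0.0          # keep DC and Nyquist terms real
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), len(x))

# A nonlinear, time-irreversible test series: the fully chaotic logistic map.
x = np.empty(4000)
x[0] = 0.3
for t in range(len(x) - 1):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])

rng = np.random.default_rng(5)
stat = abs(trev(x))
surr_stats = [abs(trev(phase_surrogate(x, rng))) for _ in range(99)]
# One-sided surrogate p-value: fraction of surrogates at least as asymmetric.
p_value = (1 + sum(s >= stat for s in surr_stats)) / 100.0
```

A small surrogate p-value indicates that no linear Gaussian process with the same power spectrum reproduces the observed asymmetry, which is the sense in which such tests "detect nonlinear structure" in a price or production series.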
Dionne-Odom, Jodie; Westfall, Andrew O; Nzuobontane, Divine; Vinikoor, Michael J; Halle-Ekane, Gregory; Welty, Thomas; Tita, Alan T N
2018-01-01
Although most African countries offer hepatitis B immunization through a 3-dose vaccine series recommended at 6, 10 and 14 weeks of age, very few provide birth-dose vaccination. In support of Cameroon's national plan to implement the birth-dose vaccine in 2017, we investigated predictors of infant hepatitis B virus (HBV) vaccination under the current program. Using the 2011 Demographic Health Survey in Cameroon, we identified women with at least one living child (age 12-60 months) and information about the hepatitis B vaccine series. Vaccination rates were calculated, and logistic regression modeling was used to identify factors associated with 3-dose series completion. Changes over time were assessed with a linear logistic model. Among 4594 mothers analyzed, 66.7% (95% confidence interval [CI]: 64.1-69.3) of infants completed the hepatitis B vaccine series; however, an average 4-week delay in series initiation was noted, with median dose timing at 10, 14 and 19 weeks of age. Predictors of series completion included facility delivery (adjusted odds ratio [aOR]: 2.1; 95% CI: 1.7-2.6), household wealth (aOR: 1.9; 95% CI: 1.2-3.1 comparing the highest and lowest quintiles), Christian religion (aOR: 1.8; 95% CI: 1.3-2.5 compared with Muslim religion) and older maternal age (aOR: 1.4; 95% CI: 1.2-1.7 per 10-year unit). Birth-dose vaccination to reduce vertical and early childhood transmission of hepatitis B may overcome some of the obstacles to timely and complete HBV immunization in Cameroon. Pregnant women and high-risk groups need increased awareness of vertical HBV transmission, the importance of facility delivery and the effectiveness of prevention beginning with monovalent HBV vaccination at birth.
Extracting Leading Nonlinear Modes of Changing Climate From Global SST Time Series
NASA Astrophysics Data System (ADS)
Mukhin, D.; Gavrilov, A.; Loskutov, E. M.; Feigin, A. M.; Kurths, J.
2017-12-01
Data-driven modeling of climate requires adequate principal variables extracted from observed high-dimensional data. Constructing such variables requires finding spatial-temporal patterns that explain a substantial part of the variability and comprise all dynamically related time series in the data. The difficulties of this task arise from the nonlinearity and non-stationarity of the climate dynamical system. The nonlinearity makes linear methods of data decomposition insufficient for separating the different processes entangled in the observed time series. On the other hand, various forcings, both anthropogenic and natural, make the dynamics non-stationary, and we should be able to describe the response of the system to such forcings in order to separate out the modes explaining the internal variability. The method we present is aimed at overcoming both these problems. It is based on the Nonlinear Dynamical Mode (NDM) decomposition [1,2], but takes external forcing signals into account. Each mode depends on hidden time series, unknown a priori, which, together with the external forcing time series, are mapped onto data space. Finding both the hidden signals and the mapping allows us to study the evolution of the modes' structure under changing external conditions and to compare the roles of internal variability and forcing in the observed behavior. The method is used for extracting the principal modes of SST variability on inter-annual and multidecadal time scales, accounting for external forcings such as CO2, variations of solar activity and volcanic activity. The structure of the revealed teleconnection patterns as well as their forecast under different CO2 emission scenarios are discussed. [1] Mukhin, D., Gavrilov, A., Feigin, A., Loskutov, E., & Kurths, J. (2015). Principal nonlinear dynamical modes of climate variability. Scientific Reports, 5, 15510. [2] Gavrilov, A., Mukhin, D., Loskutov, E., Volodin, E., Feigin, A., & Kurths, J. (2016).
Method for reconstructing nonlinear modes with adaptive structure from multidimensional data. Chaos: An Interdisciplinary Journal of Nonlinear Science, 26(12), 123101.
NASA Astrophysics Data System (ADS)
Fujiki, Shogoro; Okada, Kei-ichi; Nishio, Shogo; Kitayama, Kanehiro
2016-09-01
We developed a new method to estimate stand ages of secondary vegetation in the Bornean montane zone, where local people conduct traditional shifting cultivation and protected areas are surrounded by patches of recovering secondary vegetation of various ages. Identifying stand ages at the landscape level is critical to improve conservation policies. We combined a high-resolution satellite image (WorldView-2) with time-series Landsat images. We extracted stand ages (the time elapsed since the most recent slash and burn) from a change-detection analysis with Landsat time-series images and superimposed the derived stand ages on the segments classified by object-based image analysis using WorldView-2. We regarded stand ages as a response variable, and object-based metrics as independent variables, to develop regression models that explain stand ages. Subsequently, we classified the vegetation of the target area into six age units and one rubber plantation unit (1-3 yr, 3-5 yr, 5-7 yr, 7-30 yr, 30-50 yr, >50 yr and 'rubber plantation') using regression models and linear discriminant analyses. Validation demonstrated an accuracy of 84.3%. Our approach is particularly effective in classifying highly dynamic pioneer vegetation younger than 7 years into 2-yr intervals, suggesting that rapid changes in vegetation canopies can be detected with high accuracy. The combination of a spectral time-series analysis and object-based metrics based on high-resolution imagery enabled the classification of dynamic vegetation under intensive shifting cultivation and yielded an informative land cover map based on stand ages.
Development of web tools to disseminate space geodesy data-related products
NASA Astrophysics Data System (ADS)
Soudarin, L.; Ferrage, P.; Mezerette, A.
2014-12-01
In order to promote the products of the DORIS system, the French Space Agency CNES has developed and implemented, on the web site of the International DORIS Service (IDS), a set of plot tools to interactively build and display time series of site positions, orbit residuals and terrestrial parameters (scale, geocenter). An interactive global map is also available to select sites and access their information. Besides the products provided by the CNES Orbitography Team and the IDS components, these tools allow comparing the time evolution of coordinates for collocated DORIS and GNSS stations, thanks to a collaboration with the Terrestrial Frame Combination Center of the International GNSS Service (IGS). The next step, currently in progress, is the creation of a database to improve the robustness and efficiency of the tools, with the objective of offering a complete web service to foster data exchange with the other geodetic services of the International Association of Geodesy (IAG). The possibility of visualizing and comparing position time series of the four main space geodetic techniques, DORIS, GNSS, SLR and VLBI, is already under way at the French level. A dedicated version of these web tools has been developed for the French Space Geodesy Research Group (GRGS). It will give access to position time series provided by the GRGS Analysis Centers involved in DORIS, GNSS, SLR and VLBI data processing for the realization of the International Terrestrial Reference Frame. In this presentation, we will describe the functionalities of these tools and address some aspects of the time series (content, format).
Endogenous time-varying risk aversion and asset returns.
Berardi, Michele
2016-01-01
Stylized facts about the statistical properties of short-horizon returns in financial markets have been identified in the literature, but a satisfactory understanding of how they arise is yet to be achieved. In this work, we show that a simple asset pricing model with a representative agent is able to generate time series of returns that replicate such stylized facts if the risk aversion coefficient is allowed to change endogenously over time in response to unexpected excess returns under evolutionary forces. The same model, under constant risk aversion, would instead generate returns that are essentially Gaussian. We conclude that an endogenous time-varying risk aversion represents a very parsimonious way to make the model match real data on key statistical properties, and it therefore deserves careful consideration from economists and practitioners alike.
Cockfield, Jeremy; Su, Kyungmin; Robbins, Kay A.
2013-01-01
Experiments to monitor human brain activity during active behavior record a variety of modalities (e.g., EEG, eye tracking, motion capture, respiration monitoring) and capture a complex environmental context leading to large, event-rich time series datasets. The considerable variability of responses within and among subjects in more realistic behavioral scenarios requires experiments to assess many more subjects over longer periods of time. This explosion of data requires better computational infrastructure to more systematically explore and process these collections. MOBBED is a lightweight, easy-to-use, extensible toolkit that allows users to incorporate a computational database into their normal MATLAB workflow. Although capable of storing quite general types of annotated data, MOBBED is particularly oriented to multichannel time series such as EEG that have event streams overlaid with sensor data. MOBBED directly supports access to individual events, data frames, and time-stamped feature vectors, allowing users to ask questions such as what types of events or features co-occur under various experimental conditions. A database provides several advantages not available to users who process one dataset at a time from the local file system. In addition to archiving primary data in a central place to save space and avoid inconsistencies, such a database allows users to manage, search, and retrieve events across multiple datasets without reading the entire dataset. The database also provides infrastructure for handling more complex event patterns that include environmental and contextual conditions. The database can also be used as a cache for expensive intermediate results that are reused in such activities as cross-validation of machine learning algorithms. MOBBED is implemented over PostgreSQL, a widely used open source database, and is freely available under the GNU general public license at http://visual.cs.utsa.edu/mobbed. 
Source and issue reports for MOBBED are maintained at http://vislab.github.com/MobbedMatlab/ PMID:24124417
NASA Astrophysics Data System (ADS)
Rahim, K. J.; Cumming, B. F.; Hallett, D. J.; Thomson, D. J.
2007-12-01
An accurate assessment of historical local Holocene data is important in making future climate predictions. Holocene climate is often obtained through proxy measures such as diatoms or pollen using radiocarbon dating. Wiggle Match Dating (WMD) uses an iterative least squares approach to tune a core with a large amount of 14C dates to the 14C calibration curve. This poster will present a new method of tuning a time series with when only a modest number of 14C dates are available. The method presented uses the multitaper spectral estimation, and it specifically makes use of a multitaper spectral coherence tuning technique. Holocene climate reconstructions are often based on a simple depth-time fit such as a linear interpolation, splines, or low order polynomials. Many of these models make use of only a small number of 14C dates, each of which is a point estimate with a significant variance. This technique attempts to tune the 14C dates to a reference series, such as tree rings, varves, or the radiocarbon calibration curve. The amount of 14C in the atmosphere is not constant, and a significant source of variance is solar activity. A decrease in solar activity coincides with an increase in cosmogenic isotope production, and an increase in cosmogenic isotope production coincides with a decrease in temperature. The method presented uses multitaper coherence estimates and adjusts the phase of the time series to line up significant line components with that of the reference series in attempt to obtain a better depth-time fit then the original model. Given recent concerns and demonstrations of the variation in estimated dates from radiocarbon labs, methods to confirm and tune the depth-time fit can aid climate reconstructions by improving and serving to confirm the accuracy of the underlying depth-time fit. Climate reconstructions can then be made on the improved depth-time fit. 
This poster presents a run through of this process using Chauvin Lake in the Canadian prairies and Mt. Barr Cirque Lake in British Columbia as examples.
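A crude numerical stand-in for the tuning idea above is to estimate the lag that best lines a proxy series up with its reference series. The sketch below uses plain cross-correlation instead of multitaper coherence-phase adjustment, so it is an illustrative simplification of the method, not the authors' procedure; all data are synthetic.

```python
import numpy as np

def align_by_cross_correlation(series, reference):
    """Estimate the integer lag that best aligns `series` with `reference`
    (a crude stand-in for multitaper coherence-phase tuning)."""
    s = series - series.mean()
    r = reference - reference.mean()
    xc = np.correlate(s, r, mode="full")
    return np.argmax(xc) - (len(r) - 1)   # positive => series lags reference

# synthetic example: a delayed, noisy copy of a broadband reference series
rng = np.random.default_rng(2)
base = rng.normal(size=600)
reference = base[100:600]
series = base[93:593] + 0.1 * rng.normal(size=500)   # lags reference by 7 samples
lag = align_by_cross_correlation(series, reference)
```

A real implementation would adjust phase per frequency band where coherence is significant, rather than applying one global shift.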
Phytoplankton pigment patterns and wind forcing off central California
NASA Technical Reports Server (NTRS)
Abbott, Mark R.; Barksdale, Brett
1991-01-01
Mesoscale variability in phytoplankton pigment distributions of central California during the spring-summer upwelling season is studied via a 4-yr time series of high-resolution coastal zone color scanner imagery. Empirical orthogonal functions are used to decompose the time series of spatial images into its dominant modes of variability. The coupling between wind forcing of the upper ocean and phytoplankton distribution on mesoscales is investigated. Wind forcing, in particular the curl of the wind stress, was found to play an important role in the distribution of phytoplankton pigment in the California Current. The spring transition varies in timing and intensity from year to year but appears to be a recurrent feature associated with the rapid onset of the upwelling-favorable winds. Although the underlying dynamics may be dominated by processes other than forcing by wind stress curl, it appears that curl may force the variability of the filaments and hence the pigment patterns.
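The empirical orthogonal function (EOF) decomposition used in this study is commonly implemented as a singular value decomposition of the anomaly field. A minimal numpy sketch on a made-up toy field (names and data are illustrative, not from the study):

```python
import numpy as np

def eof_decompose(field):
    """EOF analysis of a (time, space) field via SVD of its anomalies.
    Returns spatial modes (rows of Vt), the principal-component time
    series, and the fraction of variance explained by each mode."""
    anom = field - field.mean(axis=0)                # remove the time mean
    U, s, Vt = np.linalg.svd(anom, full_matrices=False)
    var_frac = s**2 / (s**2).sum()
    return Vt, U * s, var_frac

# toy field: a single spatial pattern modulated in time
t_series = np.sin(np.linspace(0, 6, 50))
pattern = np.cos(np.linspace(0, 3, 40))
field = np.outer(t_series, pattern)
modes, pcs, var_frac = eof_decompose(field)
```

For a rank-one field like this, the first mode captures essentially all the variance; real imagery spreads variance across several modes.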
Using exogenous variables in testing for monotonic trends in hydrologic time series
Alley, William M.
1988-01-01
One approach that has been used in performing a nonparametric test for monotonic trend in a hydrologic time series consists of a two-stage analysis. First, a regression equation is estimated for the variable being tested as a function of an exogenous variable. A nonparametric trend test such as the Kendall test is then performed on the residuals from the equation. By analogy to stagewise regression and through Monte Carlo experiments, it is demonstrated that this approach will tend to underestimate the magnitude of the trend and to result in some loss in power as a result of ignoring the interaction between the exogenous variable and time. An alternative approach, referred to as the adjusted variable Kendall test, is demonstrated to generally have increased statistical power and to provide more reliable estimates of the trend slope. In addition, the utility of including an exogenous variable in a trend test is examined under selected conditions.
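The two-stage procedure the abstract analyzes (regress on the exogenous variable, then run a Kendall-type test on the residuals) can be sketched in a few lines of numpy. The Mann-Kendall S statistic here is a bare-bones stand-in for a full Kendall test with significance levels; data and names are hypothetical:

```python
import numpy as np

def mann_kendall_s(series):
    """Mann-Kendall S statistic: sum of signs of all pairwise differences
    taken forward in time (positive S suggests an upward trend)."""
    s = 0
    for i in range(len(series) - 1):
        s += np.sign(series[i + 1:] - series[i]).sum()
    return s

def two_stage_trend_test(y, x):
    """Stage 1: regress y on the exogenous variable x (with intercept);
    stage 2: Mann-Kendall statistic of the time-ordered residuals."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return mann_kendall_s(resid)

# synthetic series with a trend plus an exogenous effect
rng = np.random.default_rng(0)
t = np.arange(80)
x_exog = rng.normal(size=80)
y = 0.3 * t + 2.0 * x_exog + rng.normal(scale=0.5, size=80)
s_stat = two_stage_trend_test(y, x_exog)
```

The paper's point is that this two-stage form ignores the interaction between the exogenous variable and time, which the adjusted variable Kendall test addresses.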
Tool Wear Monitoring Using Time Series Analysis
NASA Astrophysics Data System (ADS)
Song, Dong Yeul; Ohara, Yasuhiro; Tamaki, Haruo; Suga, Masanobu
A tool wear monitoring approach considering the nonlinear behavior of the cutting mechanism caused by tool wear and/or localized chipping is proposed, and its effectiveness is verified through cutting experiments and actual turning machining. Moreover, the variation in the surface roughness of the machined workpiece is also discussed using this approach. In this approach, the residual error between the actually measured vibration signal and the estimated signal obtained from a time series model corresponding to the dynamic model of cutting is introduced as the feature for diagnosis. Consequently, it is found that the early tool wear state (i.e. flank wear under 40 µm) can be monitored, and the optimal tool exchange time and the tool wear state in actual turning machining can be judged from the change in the residual error. Moreover, the variation of surface roughness Pz in the range of 3 to 8 µm can be estimated by monitoring the residual error.
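The residual-error feature can be illustrated with a simple autoregressive model fitted to a "healthy" signal: a larger one-step prediction residual on new data then flags a change in the underlying dynamics. This is a sketch under that assumption, not the authors' model; the AR(2) "vibration signal" below is synthetic.

```python
import numpy as np

def fit_ar(x, order):
    """Least-squares fit of an AR(order) model x[t] = sum_k a_k x[t-k]."""
    rows = np.column_stack([x[order - k - 1 : len(x) - k - 1] for k in range(order)])
    coef, *_ = np.linalg.lstsq(rows, x[order:], rcond=None)
    return coef

def residual_error(x, coef):
    """RMS one-step prediction residual of the AR model on a signal."""
    order = len(coef)
    rows = np.column_stack([x[order - k - 1 : len(x) - k - 1] for k in range(order)])
    return np.sqrt(np.mean((x[order:] - rows @ coef) ** 2))

def simulate(n, noise_scale, seed):
    """AR(2) process standing in for a measured vibration signal."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    for t in range(2, n):
        x[t] = 0.6 * x[t - 1] - 0.2 * x[t - 2] + rng.normal(scale=noise_scale)
    return x

coef = fit_ar(simulate(2000, 0.1, 0), order=2)            # model of the healthy state
res_healthy = residual_error(simulate(2000, 0.1, 1), coef)
res_worn = residual_error(simulate(2000, 0.3, 2), coef)   # noisier "worn" signal
```

Monitoring then reduces to thresholding the residual error as cutting proceeds.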
Fractal structure of the interplanetary magnetic field
NASA Technical Reports Server (NTRS)
Burlaga, L. F.; Klein, L. W.
1985-01-01
Under some conditions, time series of the interplanetary magnetic field strength and components have the properties of fractal curves. Magnetic field measurements made near 8.5 AU by Voyager 2 from June 5 to August 24, 1981 were self-similar over time scales from approximately 20 sec to approximately 3 × 10^5 sec, and the fractal dimension of the time series of the strength and components of the magnetic field was D = 5/3, corresponding to a power spectrum P(f) ∝ f^(-5/3). Since the Kolmogorov spectrum for homogeneous, isotropic, stationary turbulence is also f^(-5/3), the Voyager 2 measurements are consistent with the observation of an inertial range of turbulence extending over approximately four decades in frequency. Interaction regions probably contributed most of the power in this interval. As an example, one interaction region is discussed in which the magnetic field had a fractal dimension D = 5/3.
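The spectral exponent can be estimated from a time series by a log-log least-squares fit to the periodogram. The sketch below checks the estimator on synthetic data shaped to have an exact f^(-5/3) spectrum (not the Voyager measurements):

```python
import numpy as np

def spectral_slope(x):
    """Log-log least-squares slope of the periodogram, i.e. the exponent
    in P(f) ~ f^slope (zero frequency excluded)."""
    f = np.fft.rfftfreq(len(x))[1:]
    p = np.abs(np.fft.rfft(x))[1:] ** 2
    return np.polyfit(np.log(f), np.log(p), 1)[0]

# synthesize a series whose power spectrum follows f^(-5/3)
rng = np.random.default_rng(1)
n = 4096
f = np.fft.rfftfreq(n)
amp = np.zeros_like(f)
amp[1:] = f[1:] ** (-5 / 6)        # power = amplitude^2 ~ f^(-5/3)
phase = rng.uniform(0, 2 * np.pi, len(f))
phase[-1] = 0.0                    # the Nyquist bin must be real
x = np.fft.irfft(amp * np.exp(1j * phase), n)
```

The fractal dimension of the record then follows from D = (5 - |slope|)/2 for this class of self-affine curves; slope = -5/3 gives D = 5/3.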
An Analytical Time–Domain Expression for the Net Ripple Produced by Parallel Interleaved Converters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Brian B.; Krein, Philip T.
We apply modular arithmetic and Fourier series to analyze the superposition of N interleaved triangular waveforms with identical amplitudes and duty-ratios. Here, interleaving refers to the condition when a collection of periodic waveforms with identical periods are each uniformly phase-shifted across one period. The main result is a time-domain expression which provides an exact representation of the summed and interleaved triangular waveforms, where the peak amplitude and parameters of the time-periodic component are all specified in closed-form. Analysis is general and can be used to study various applications in multi-converter systems. This model is unique not only in that it reveals a simple and intuitive expression for the net ripple, but its derivation via modular arithmetic and Fourier series is distinct from prior approaches. The analytical framework is experimentally validated with a system of three parallel converters under time-varying operating conditions.
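The ripple-cancellation effect of interleaving is easy to verify numerically: summing N identical triangle waves, each phase-shifted by 1/N of a period, yields a ripple at N times the frequency with a much smaller peak-to-peak amplitude. The sketch below is a numerical check for the symmetric 50% duty-ratio case, not the paper's closed-form expression:

```python
import numpy as np

def triangle(t):
    """Unit-amplitude symmetric triangle wave with period 1."""
    return 2 * np.abs(2 * (t % 1.0) - 1) - 1

N = 3                                   # number of interleaved converters
t = np.linspace(0, 1, 1200, endpoint=False)
single = triangle(t)
interleaved = sum(triangle(t + k / N) for k in range(N))   # uniform phase shifts
```

For the symmetric triangle, the summed ripple's peak-to-peak amplitude drops by exactly a factor of N and its period shrinks to 1/N; the duty-ratio-dependent general case is what the paper derives in closed form.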
Structure of a financial cross-correlation matrix under attack
NASA Astrophysics Data System (ADS)
Lim, Gyuchang; Kim, SooYong; Kim, Junghwan; Kim, Pyungsoo; Kang, Yoonjong; Park, Sanghoon; Park, Inho; Park, Sang-Bum; Kim, Kyungsik
2009-09-01
We investigate the structure of a perturbed stock market in terms of correlation matrices. For the purpose of perturbing a stock market, two distinct methods are used, namely local and global perturbation. The former involves replacing a correlation coefficient of the cross-correlation matrix with one calculated from two Gaussian-distributed time series while the latter reconstructs the cross-correlation matrix just after replacing the original return series with Gaussian-distributed time series. Concerning the local case, it is a technical study only and there is no attempt to model reality. The term ‘global’ means the overall effect of the replacement on other untouched returns. Through statistical analyses such as random matrix theory (RMT), network theory, and the correlation coefficient distributions, we show that the global structure of a stock market is vulnerable to perturbation. However, apart from in the analysis of inverse participation ratios (IPRs), the vulnerability becomes much less pronounced under a small-scale perturbation. This means that these analysis tools are inappropriate for monitoring the whole stock market due to the low sensitivity of a stock market to a small-scale perturbation. In contrast, when going down to the structure of business sectors, we confirm that correlation-based business sectors are regrouped in terms of IPRs. This result gives a clue about monitoring the effect of hidden intentions, which are revealed via portfolios taken mostly by large investors.
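The inverse participation ratio (IPR) used in the analysis measures how localized an eigenvector is: IPR = Σᵢ vᵢ⁴ equals 1/N for a fully delocalized (uniform) eigenvector and 1 for a fully localized one. A minimal numpy sketch with a one-factor toy market (illustrative, not the paper's data or code):

```python
import numpy as np

def iprs(corr):
    """Inverse participation ratio of each eigenvector of a correlation
    matrix (eigenvectors are the columns returned by eigh)."""
    _, vecs = np.linalg.eigh(corr)
    return (vecs ** 4).sum(axis=0)

# toy market: N stocks driven by one common factor plus idiosyncratic noise
rng = np.random.default_rng(7)
N, T = 20, 2000
market = rng.normal(size=T)
returns = market + 0.8 * rng.normal(size=(N, T))
corr = np.corrcoef(returns)
values = iprs(corr)                 # ascending eigenvalue order
```

The "market mode" (largest eigenvalue) is nearly uniform across stocks, so its IPR sits close to the delocalized limit 1/N; sector modes are more localized.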
Reid, Brian J; Papanikolaou, Niki D; Wilcox, Ronah K
2005-02-01
The catabolic activity with respect to the systemic herbicide isoproturon was determined in soil samples by 14C-radiorespirometry. The first experiment assessed levels of intrinsic catabolic activity in soil samples that represented three dissimilar soil series under arable cultivation. Results showed average extents of isoproturon mineralisation (after 240 h assay time) in the three soil series to be low. A second experiment assessed the impact of addition of isoproturon (0.05 µg kg^-1) into these soils on the levels of catabolic activity following 28 days of incubation. Increased catabolic activity was observed in all three soils. A third experiment assessed levels of intrinsic catabolic activity in soil samples representing a single soil series managed under either conventional agricultural practice (including the use of isoproturon) or organic farming practice (with no use of isoproturon). Results showed higher (and more consistent) levels of isoproturon mineralisation in the soil samples collected from conventional land use. The final experiment assessed the impact of isoproturon addition on the levels of inducible catabolic activity in these soils. The results showed no significant difference in the case of the conventional farm soil samples while the induction of catabolic activity in the organic farm soil samples was significant.
Single-qubit decoherence under a separable coupling to a random matrix environment
NASA Astrophysics Data System (ADS)
Carrera, M.; Gorin, T.; Seligman, T. H.
2014-08-01
This paper describes the dynamics of a quantum two-level system (qubit) under the influence of an environment modeled by an ensemble of random matrices. In distinction to earlier work, we consider here separable couplings and focus on a regime where the decoherence time is of the same order of magnitude as the environmental Heisenberg time. We derive an analytical expression in the linear response approximation, and study its accuracy by comparison with numerical simulations. We discuss a series of unusual properties, such as purity oscillations, strong signatures of spectral correlations (in the environment Hamiltonian), memory effects, and symmetry-breaking equilibrium states.
Lippmann, M.
1964-04-01
A cascade particle impactor capable of collecting particles and distributing them according to size is described. In addition the device is capable of collecting on a pair of slides a series of different samples so that less time is required for the changing of slides. Other features of the device are its compactness and its ruggedness, making it useful under field conditions. Essentially the unit consists of a main body with a series of transverse jets discharging on a pair of parallel, spaced glass plates. The plates are capable of being moved incrementally in steps to obtain the multiple samples. (AEC)
Multiple Indicator Stationary Time Series Models.
ERIC Educational Resources Information Center
Sivo, Stephen A.
2001-01-01
Discusses the propriety and practical advantages of specifying multivariate time series models in the context of structural equation modeling for time series and longitudinal panel data. For time series data, the multiple indicator model specification improves on classical time series analysis. For panel data, the multiple indicator model…
Wohlin, Åsa
2015-03-21
The distribution of codons in the nearly universal genetic code is a long discussed issue. At the atomic level, the numeral series 2x^2 (x = 5-0) lies behind electron shells and orbitals. Numeral series appear in formulas for spectral lines of hydrogen. The question here was if some similar scheme could be found in the genetic code. A table of 24 codons was constructed (synonyms counted as one) for 20 amino acids, four of which have two different codons. An atomic mass analysis was performed, built on common isotopes. It was found that a numeral series 5 to 0 with exponent 2/3 times 10^2 revealed detailed congruency with codon-grouped amino acid side-chains, simultaneously with the division on atom kinds, further with main 3rd base groups, backbone chains and with codon-grouped amino acids in relation to their origin from glycolysis or the citrate cycle. Hence, it is proposed that this series in a dynamic way may have guided the selection of amino acids into codon domains. Series with simpler exponents also showed noteworthy correlations with the atomic mass distribution on main codon domains; especially the 2x^2-series times a factor 16 appeared as a conceivable underlying level, both for the atomic mass and charge distribution. Furthermore, it was found that atomic mass transformations between numeral systems, possibly interpretable as dimension degree steps, connected the atomic mass of codon bases with codon-grouped amino acids and with the exponent 2/3-series in several astonishing ways. Thus, it is suggested that they may be part of a deeper reference system. Copyright © 2015 The Author. Published by Elsevier Ltd. All rights reserved.
33 CFR 149.315 - What embarkation, launching, and recovery arrangements must rescue boats meet?
Code of Federal Regulations, 2012 CFR
2012-07-01
... GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) DEEPWATER PORTS DEEPWATER PORTS: DESIGN, CONSTRUCTION... the shortest possible time. (c) If the rescue boat is one of the deepwater port's survival craft, then... apparatus, approved under approval series 160.170, instead of a lifeboat release mechanism. (f) The rescue...
33 CFR 149.315 - What embarkation, launching, and recovery arrangements must rescue boats meet?
Code of Federal Regulations, 2010 CFR
2010-07-01
... GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) DEEPWATER PORTS DEEPWATER PORTS: DESIGN, CONSTRUCTION... the shortest possible time. (c) If the rescue boat is one of the deepwater port's survival craft, then... apparatus, approved under approval series 160.170, instead of a lifeboat release mechanism. (f) The rescue...
46 CFR 108.540 - Survival craft muster and embarkation arrangements.
Code of Federal Regulations, 2013 CFR
2013-10-01
... OFFSHORE DRILLING UNITS DESIGN AND EQUIPMENT Lifesaving Equipment § 108.540 Survival craft muster and... minutes from the time the instruction to board is given. (e) Each davit-launched and free-fall survival... ladder as follows: (1) Each embarkation ladder must be approved under approval series 160.117 or be a...
26 CFR 1.402(f)-1 - Required explanation of eligible rollover distributions; questions and answers.
Code of Federal Regulations, 2010 CFR
2010-04-01
... in a series of periodic payments that are eligible rollover distributions? Q-4: May a plan... a reasonable period of time before making an eligible rollover distribution, to provide the...(f) notice must be designed to be easily understood and must explain the following: the rules under...
33 CFR 149.315 - What embarkation, launching, and recovery arrangements must rescue boats meet?
Code of Federal Regulations, 2011 CFR
2011-07-01
... GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) DEEPWATER PORTS DEEPWATER PORTS: DESIGN, CONSTRUCTION... the shortest possible time. (c) If the rescue boat is one of the deepwater port's survival craft, then... apparatus, approved under approval series 160.170, instead of a lifeboat release mechanism. (f) The rescue...
Penetration with Long Rods: A Theoretical Framework and Comparison with Instrumented Impacts
1981-05-01
program to begin probing the details of the interaction process. The theoretical framework underlying such a program is explained in detail. The theory of...of the time sequence of events during penetration. Data from one series of experiments, reported in detail elsewhere, are presented and discussed within the theoretical framework.
Processing Conversational Implicatures: Alternatives and Counterfactual Reasoning
ERIC Educational Resources Information Center
Tiel, Bob; Schaeken, Walter
2017-01-01
In a series of experiments, Bott and Noveck (2004) found that the computation of scalar inferences, a variety of conversational implicature, caused a delay in response times. In order to determine which aspect of the inferential process underlying scalar inferences caused this delay, we extended their paradigm to three other kinds of…
Toward A Theory of HRD Learning Participation
ERIC Educational Resources Information Center
Wang, Greg G.; Wang, Jia
2005-01-01
This article fills a gap by identifying an under-studied area for learning participation (LP) in HRD theory building. A literature review is presented to identify gaps in adult education and HRD literature. An HRD LP framework is then proposed, from cross-sectional/time-series perspectives, to describe the pattern, factors, structure, and the…
A Critical Review of Line Graphs in Behavior Analytic Journals
ERIC Educational Resources Information Center
Kubina, Richard M., Jr.; Kostewicz, Douglas E.; Brennan, Kaitlyn M.; King, Seth A.
2017-01-01
Visual displays such as graphs have played an instrumental role in psychology. One discipline relies almost exclusively on graphs in both applied and basic settings, behavior analysis. The most common graphic used in behavior analysis falls under the category of time series. The line graph represents the most frequently used display for visual…
Code of Federal Regulations, 2010 CFR
2010-07-01
... 29 Labor 2 2010-07-01 2010-07-01 false Amendment. 102.17 Section 102.17 Labor Regulations Relating to Labor NATIONAL LABOR RELATIONS BOARD RULES AND REGULATIONS, SERIES 8 Procedure Under Section 10 (a... hearing; and after the case has been transferred to the Board pursuant to § 102.45, at any time prior to...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahle, J.J.; Buettner, L.C.; Mauer, S.
A series of experimental results are reported for breakthrough of the agent simulants DMMP and DIMP on coconut carbon. This adsorbent is used in filters for the Chemical Demilitarization program. The conditions were appropriate for a post treatment stack gas filter. Results indicate that high capacity and long filtration times are achievable under moderate humidity conditions up to 180 degrees F.
Genetic Networks and Anticipation of Gene Expression Patterns
NASA Astrophysics Data System (ADS)
Gebert, J.; Lätsch, M.; Pickl, S. W.; Radde, N.; Weber, G.-W.; Wünschiers, R.
2004-08-01
An interesting problem for computational biology is the analysis of time-series expression data. Here, the application of modern methods from dynamical systems, optimization theory, numerical algorithms and the utilization of implicit discrete information lead to a deeper understanding. In [1], we suggested representing the behavior of time-series gene expression patterns by a system of ordinary differential equations, which we investigated analytically and algorithmically with respect to parametric stability or instability. Our algorithm strongly exploited combinatorial information. In this paper, we deepen, extend and exemplify this study from the viewpoint of the underlying mathematical modelling. This modelling consists in evaluating DNA-microarray measurements as the basis of anticipatory prediction, in the choice of a smooth model given by differential equations, in approximating the right-hand side with parametric matrices, and in a discrete approximation which is a least squares optimization problem. We give a mathematical and biological discussion, and pay attention to the special case of a linear system, where the matrices do not depend on the state of expressions. Here, we present first numerical examples.
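For the linear special case mentioned at the end, the matrix A in dx/dt = A x can be estimated directly by least squares from finite-difference derivative estimates. The self-contained sketch below runs on simulated data; the two-gene system and all numbers are made up for illustration and are not the paper's model:

```python
import numpy as np

def fit_linear_system(trajectories, dt):
    """Least-squares estimate of A in dx/dt = A x from sampled trajectories,
    using central differences as derivative estimates."""
    states, derivs = [], []
    for X in trajectories:                     # X has shape (timepoints, genes)
        states.append(X[1:-1])
        derivs.append((X[2:] - X[:-2]) / (2 * dt))
    A_T, *_ = np.linalg.lstsq(np.vstack(states), np.vstack(derivs), rcond=None)
    return A_T.T

def euler(A, x0, dt, n):
    """Forward-Euler simulation of dx/dt = A x."""
    X = np.empty((n, len(x0)))
    X[0] = x0
    for t in range(n - 1):
        X[t + 1] = X[t] + dt * (A @ X[t])
    return X

A_true = np.array([[-0.5, 0.3], [0.2, -0.4]])
dt = 0.01
trajs = [euler(A_true, [1.0, 0.0], dt, 500), euler(A_true, [0.0, 1.0], dt, 500)]
A_est = fit_linear_system(trajs, dt)
```

Two trajectories with different initial conditions are used so the data span the full state space; a single decayed trajectory would leave A poorly determined.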
Temporal turnover and the maintenance of diversity in ecological assemblages
Magurran, Anne E.; Henderson, Peter A.
2010-01-01
Temporal variation in species abundances occurs in all ecological communities. Here, we explore the role that this temporal turnover plays in maintaining assemblage diversity. We investigate a three-decade time series of estuarine fishes and show that the abundances of the individual species fluctuate asynchronously around their mean levels. We then use a time-series modelling approach to examine the consequences of different patterns of turnover, by asking how the correlation between the abundance of a species in a given year and its abundance in the previous year influences the structure of the overall assemblage. Classical diversity measures that ignore species identities reveal that the observed assemblage structure will persist under all but the most extreme conditions. However, metrics that track species identities indicate a narrower set of turnover scenarios under which the predicted assemblage resembles the natural one. Our study suggests that species diversity metrics are insensitive to change and that measures that track species ranks may provide better early warning that an assemblage is being perturbed. It also highlights the need to incorporate temporal turnover in investigations of assemblage structure and function. PMID:20980310
Assessing the catchment's filtering effect on the propagation of meteorological anomalies
NASA Astrophysics Data System (ADS)
di Domenico, Antonella; Laguardia, Giovanni; Margiotta, Maria Rosaria
2010-05-01
The characteristics of drought propagation within a catchment are evaluated by means of the analysis of time series of water fluxes and storage states. The study area is the Agri basin, Southern Italy, closed at the Tarangelo gauging station (507 km2). After calibrating the IRP weather generator (Veneziano and Iacobellis, 2002) on observed data, a 100-year time series of precipitation was produced. The drought statistics obtained from the synthetic data have been compared to the ones obtained from the limited observations available. The DREAM hydrological model has been calibrated based on observed precipitation and discharge. From the model run on the synthetic precipitation we have obtained the time series of variables relevant for assessing the status of the catchment, namely total runoff and its components, actual evapotranspiration, and soil moisture. The Standardized Precipitation Index (SPI; McKee et al., 1993) has been calculated for different averaging periods. The modelled data have been processed for the calculation of drought indices. In particular, we have chosen to use their transformation into standardized variables. We have performed autocorrelation analysis for assessing the characteristic time scales of the variables. Moreover, we have investigated through cross correlation their relationships, assessing also the SPI averaging period for which the maximum correlation is reached. The variables' drought statistics, namely number of events, duration, and deficit volumes, have been assessed. As a result of the filtering effect exerted by the different catchment storages, the characteristic time scale and the maximum correlation SPI averaging periods for the different time series tend to increase. Thus, the number of drought events tends to decrease and their duration to increase under increasing storage.
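The SPI referenced above (McKee et al., 1993) is usually computed by fitting a gamma distribution to precipitation totals aggregated over the chosen averaging period and mapping the resulting quantiles to a standard normal. The sketch below substitutes an empirical CDF for the gamma fit so it stays within numpy and the standard library; it is a simplification, not the operational definition:

```python
import numpy as np
from statistics import NormalDist

def spi(precip, window):
    """Standardized index of moving-window precipitation totals.
    Empirical plotting positions stand in for the usual gamma fit."""
    totals = np.convolve(precip, np.ones(window), mode="valid")
    ranks = totals.argsort().argsort() + 1          # ranks 1..n
    quantiles = (ranks - 0.5) / len(totals)         # strictly inside (0, 1)
    inv = NormalDist().inv_cdf
    return np.array([inv(q) for q in quantiles])

rng = np.random.default_rng(3)
precip = rng.gamma(shape=2.0, scale=10.0, size=600)  # synthetic monthly totals
index = spi(precip, window=3)                        # 3-month SPI analogue
```

Changing `window` reproduces the different averaging periods whose cross-correlation with runoff and soil moisture the study examines.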
Trends and Patterns in a New Time Series of Natural and Anthropogenic Methane Emissions, 1980-2000
NASA Astrophysics Data System (ADS)
Matthews, E.; Bruhwiler, L.; Themelis, N. J.
2007-12-01
We report on a new time series of methane (CH4) emissions from anthropogenic and natural sources developed for a multi-decadal methane modeling study (see following presentation by Bruhwiler et al.). The emission series extends from 1980 through the early 2000s with annual emissions for all countries, and has several features distinct from the source histories based on IPCC methods typically employed in modeling the global methane cycle. Fossil fuel emissions rely on 7 fuel-process emission combinations and minimize reliance on highly-uncertain emission factors. Emissions from ruminant animals employ regional profiles of bovine populations that account for the influence of variable age- and size-demographics on emissions and are ~15% lower than other estimates. Waste-related emissions are developed using an approach that avoids the use of data-poor emission factors and accounts for the impacts of recycling and thermal treatment of waste in diverting material from landfills, and for CH4 capture at landfill facilities. Emissions from irrigated rice use rice-harvest areas under 3 water-management systems and a new historical data set that analyzes multiple sources for trends in water management since 1980. A time series of emissions from natural wetlands was developed by applying a multiple-regression model derived from the full process-based model of Walter with analyzed meteorology from the ERA-40 reanalysis.
NASA Astrophysics Data System (ADS)
Liang, Y.; Gallaher, D. W.; Grant, G.; Lv, Q.
2011-12-01
Change over time is the central driver of climate change detection. The goal is to diagnose the underlying causes, and make projections into the future. In an effort to optimize this process we have developed the Data Rod model, an object-oriented approach that provides the ability to query grid cell changes and their relationships to neighboring grid cells through time. The time series data is organized in time-centric structures called "data rods." A single data rod can be pictured as the multi-spectral data history at one grid cell: a vertical column of data through time. This resolves the long-standing problem of managing time-series data and opens new possibilities for temporal data analysis. This structure enables rapid time-centric analysis at any grid cell across multiple sensors and satellite platforms. Collections of data rods can be spatially and temporally filtered, statistically analyzed, and aggregated for use with pattern matching algorithms. Likewise, individual image pixels can be extracted to generate multi-spectral imagery at any spatial and temporal location. The Data Rods project has created a series of prototype databases to store and analyze massive datasets containing multi-modality remote sensing data. Using object-oriented technology, this method overcomes the operational limitations of traditional relational databases. To demonstrate the speed and efficiency of time-centric analysis using the Data Rods model, we have developed a sea ice detection algorithm. This application determines the concentration of sea ice in a small spatial region across a long temporal window. If performed using traditional analytical techniques, this task would typically require extensive data downloads and spatial filtering. Using Data Rods databases, the exact spatio-temporal data set is immediately available. No extraneous data is downloaded, and all selected data querying occurs transparently on the server side.
Moreover, fundamental statistical calculations such as running averages are easily implemented against the time-centric columns of data.
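The "data rod" idea, one time-ordered column of values per grid cell, can be mocked up with an ordinary dictionary, and the running-average example then becomes a one-line convolution. Names, keys, and shapes here are illustrative stand-ins, not the project's actual schema or API:

```python
import numpy as np

# toy "data rods": one time-ordered column of values per grid cell,
# keyed by (row, col); real rods would hold multi-spectral histories
rods = {(r, c): np.sin(np.arange(100) / 10 + r + c)
        for r in range(3) for c in range(3)}

def running_mean(rod, window):
    """Running average along a single rod (a time-centric column)."""
    return np.convolve(rod, np.ones(window) / window, mode="valid")

smoothed = running_mean(rods[(1, 2)], 5)
```

Because each rod is a contiguous column through time, statistics like this touch only the requested cell's history rather than whole images.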
Radon anomalies: When are they possible to be detected?
NASA Astrophysics Data System (ADS)
Passarelli, Luigi; Woith, Heiko; Seyis, Cemil; Nikkhoo, Mehdi; Donner, Reik
2017-04-01
Records of the radon noble gas in different environments like soil, air, groundwater, rock, caves, and tunnels typically display cyclic variations including diurnal (S1), semidiurnal (S2) and seasonal components. But there are also cases where these cycles are absent. Interestingly, radon emission can also be affected by transient processes, which inhibit or enhance the radon-carrying process at the surface. This results in transient changes in the radon emission rate, which are superimposed on the low and high frequency cycles. The complexity in the spectral contents of the radon time-series makes any statistical analysis aiming at understanding the physical driving processes a challenging task. In the past decades there have been several attempts to relate changes in radon emission rate to physical triggering processes such as earthquake occurrence. One of the problems in this type of investigation is to objectively detect anomalies in the radon time-series. In the present work, we propose a simple and objective statistical method for detecting changes in the radon emission rate time-series. The method uses non-parametric statistical tests (e.g., Kolmogorov-Smirnov) to compare empirical distributions of radon emission rate by sequentially applying various time windows to the time-series. The statistical test indicates whether two empirical distributions of data originate from the same distribution at a desired significance level. We test the algorithm on synthetic data in order to explore the sensitivity of the statistical test to the sample size. We successively apply the test to six radon emission rate recordings from stations located around the Marmara Sea obtained within the MARsite project (MARsite has received funding from the European Union's Seventh Programme for research, technological development and demonstration under grant agreement No 308417).
We conclude that the test performs relatively well at identifying transient changes in the radon emission rate, but the results are strongly dependent on the length of the time window and/or the type of frequency filtering. More importantly, when raw time-series contain cyclic components (e.g. seasonal or diurnal variation), the search for anomalies related to transients becomes meaningless. We conclude that an objective identification of transient changes can be performed only after filtering the raw time-series for the physically meaningful frequency content.
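The window-based Kolmogorov-Smirnov scan described above can be sketched with plain numpy: slide two adjacent windows along the series and flag positions where the two empirical distributions differ strongly. The window length is an arbitrary choice here, mirroring the paper's caveat that results depend on it; the data are synthetic, not radon records.

```python
import numpy as np

def ks_stat(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: max distance between ECDFs."""
    grid = np.sort(np.concatenate([a, b]))
    cdf_a = np.searchsorted(np.sort(a), grid, side="right") / len(a)
    cdf_b = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return np.abs(cdf_a - cdf_b).max()

def scan_for_change(series, window):
    """Compare each window with the one immediately after it; large KS
    values flag candidate transient changes in the distribution."""
    return np.array([ks_stat(series[i:i + window], series[i + window:i + 2 * window])
                     for i in range(len(series) - 2 * window)])

rng = np.random.default_rng(5)
series = np.concatenate([rng.normal(0, 1, 200), rng.normal(3, 1, 200)])  # step at t = 200
stats = scan_for_change(series, window=50)
```

A significance threshold for the statistic (at the desired level and sample size) would turn the scan into the objective detector the abstract describes.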
Efficient Algorithms for Segmentation of Item-Set Time Series
NASA Astrophysics Data System (ADS)
Chundi, Parvathi; Rosenkrantz, Daniel J.
We propose a special type of time series, which we call an item-set time series, to facilitate the temporal analysis of software version histories, email logs, stock market data, etc. In an item-set time series, each observed data value is a set of discrete items. We formalize the concept of an item-set time series and present efficient algorithms for segmenting a given item-set time series. Segmentation of a time series partitions the time series into a sequence of segments where each segment is constructed by combining consecutive time points of the time series. Each segment is associated with an item set that is computed from the item sets of the time points in that segment, using a function which we call a measure function. We then define a concept called the segment difference, which measures the difference between the item set of a segment and the item sets of the time points in that segment. The segment difference values are required to construct an optimal segmentation of the time series. We describe novel and efficient algorithms to compute segment difference values for each of the measure functions described in the paper. We outline a dynamic programming based scheme to construct an optimal segmentation of the given item-set time series. We use the item-set time series segmentation techniques to analyze the temporal content of three different data sets—Enron email, stock market data, and a synthetic data set. The experimental results show that an optimal segmentation of item-set time series data captures much more temporal content than a segmentation constructed based on the number of time points in each segment, without examining the item set data at the time points, and can be used to analyze different types of temporal data.
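The segmentation scheme can be illustrated end to end with a small dynamic program. The measure function below takes the intersection of the time-point item sets as the segment's set (one possible choice, not necessarily the paper's), and the segment difference sums symmetric-difference sizes against each time point; the O(n²k) DP is the generic optimal-segmentation recurrence rather than the paper's more efficient algorithms.

```python
def seg_diff(sets, i, j):
    """Difference of segment [i, j): the segment's item set is the
    intersection of its time points' sets (the measure function here),
    and the difference sums symmetric-difference sizes."""
    seg = set.intersection(*sets[i:j])
    return sum(len(seg ^ s) for s in sets[i:j])

def optimal_segmentation(sets, k):
    """Dynamic program: minimum total segment difference over all ways of
    splitting the series into k segments. Returns (cost, boundary list)."""
    n = len(sets)
    INF = float("inf")
    cost = [[INF] * (k + 1) for _ in range(n + 1)]
    back = [[0] * (k + 1) for _ in range(n + 1)]
    cost[0][0] = 0
    for j in range(1, n + 1):
        for m in range(1, min(j, k) + 1):
            for i in range(m - 1, j):
                c = cost[i][m - 1] + seg_diff(sets, i, j)
                if c < cost[j][m]:
                    cost[j][m], back[j][m] = c, i
    segments, j, m = [], n, k
    while m > 0:                      # recover the segment boundaries
        i = back[j][m]
        segments.append((i, j))
        j, m = i, m - 1
    return cost[n][k], segments[::-1]

sets = [{1, 2}] * 4 + [{3}] * 4       # toy item-set time series
best_cost, segments = optimal_segmentation(sets, k=2)
```

On this toy series the optimum splits exactly at the change point with zero total difference, which is what an optimal segmentation should capture.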
Subsidence Evaluation of High-Speed Railway in Shenyang Based on Time-Series Insar
NASA Astrophysics Data System (ADS)
Zhang, Yun; Wei, Lianhuan; Li, Jiayu; Liu, Shanjun; Mao, Yachun; Wu, Lixin
2018-04-01
More and more high-speed railways are under construction in China. The slow settlement along high-speed railway tracks and newly-built stations can lead to inhomogeneous deformation of the local area, and its accumulation may threaten the safe operation of the high-speed rail system. In this paper, surface deformation of the newly-built high-speed railway station as well as the railway lines in the Shenyang region will be retrieved by time series InSAR analysis using multi-orbit COSMO-SkyMed images. This paper focuses on the non-uniform subsidence caused by the changing of the local environment along the railway. The accuracy of the settlement results can be verified by cross validation of the results obtained from two different orbits during the same period.
Directed dynamical influence is more detectable with noise
Jiang, Jun-Jie; Huang, Zi-Gang; Huang, Liang; Liu, Huan; Lai, Ying-Cheng
2016-01-01
Successful identification of directed dynamical influence in complex systems is relevant to significant problems of current interest. Traditional methods based on Granger causality and transfer entropy have issues such as difficulty with nonlinearity and large data requirements. Recently a framework based on nonlinear dynamical analysis was proposed to overcome these difficulties. We find, counterintuitively, that noise can enhance the detectability of directed dynamical influence. In fact, intentionally injecting a proper amount of asymmetric noise into the available time series has the unexpected benefit of dramatically increasing confidence in ascertaining the directed dynamical influence in the underlying system. This result is established based on both real data and model time series from nonlinear ecosystems. We develop a physical understanding of the beneficial role of noise in enhancing detection of directed dynamical influence. PMID:27066763
Testing the mean for dependent business data.
Liang, Jiajuan; Martin, Linda
2008-01-01
In business data analysis, it is well known that the comparison of several means is usually carried out by the F-test in analysis of variance under the assumption that the data from all populations are collected independently. This assumption, however, is likely to be violated in survey data collected from various questionnaires or in time-series data. As a result, applying the traditional F-test directly to the comparison of dependent means is not justifiable and may be problematic. In this article, we develop a generalized F-test for comparing population means with dependent data. Simulation studies show that the proposed test has a simple approximate null distribution and feasible finite-sample properties. Applications of the proposed test in the analysis of survey data and time-series data are illustrated by two real datasets.
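For context, the classical one-way ANOVA F statistic that the article generalizes can be sketched in a few lines of Python; the generalized test for dependent data is not specified in the abstract, so only the independent-data baseline is shown.

```python
# Classical one-way ANOVA F statistic: the independent-data baseline
# that the article's generalized F-test extends to dependent data.

def anova_f(groups):
    """F = (between-group mean square) / (within-group mean square)."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Well-separated groups give a large F; identical groups give F near 0.
f_sep = anova_f([[1.0, 1.1, 0.9], [5.0, 5.1, 4.9]])
f_same = anova_f([[1.0, 1.1, 0.9], [1.0, 1.1, 0.9]])
```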
A novel weight determination method for time series data aggregation
NASA Astrophysics Data System (ADS)
Xu, Paiheng; Zhang, Rong; Deng, Yong
2017-09-01
Aggregation in time series is of great importance in time series smoothing, prediction and other time series analysis processes, which makes it crucial to determine the weights in time series correctly and reasonably. In this paper, a novel method to obtain the weights in time series is proposed, in which we adopt the induced ordered weighted aggregation (IOWA) operator and the visibility graph averaging (VGA) operator and linearly combine the weights separately generated by the two operators. The IOWA operator is introduced to the weight determination of time series, through which the time decay factor is taken into consideration. The VGA operator generates weights with respect to the degree distribution in the visibility graph constructed from the corresponding time series, which reflects the relative importance of vertices in the time series. The proposed method is applied to two practical datasets to illustrate its merits. The aggregation of the Construction Cost Index (CCI) demonstrates the ability of the proposed method to smooth time series, while the aggregation of the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) illustrates how the proposed method maintains the variation tendency of the original data.
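A rough sketch of the VGA side of this scheme follows, with a simple exponential time-decay weight standing in for the IOWA operator (whose exact weighting is not given in the abstract); the `alpha` and `decay` parameters are hypothetical.

```python
# Sketch of degree-based (VGA-style) weights from a natural visibility graph,
# linearly combined with an exponential time-decay stand-in for IOWA weights.

def visibility_degrees(y):
    """Degree of each node in the natural visibility graph of series y."""
    n = len(y)
    deg = [0] * n
    for a in range(n):
        for c in range(a + 1, n):
            # a and c are linked if no intermediate point blocks the line of sight
            if all(y[b] < y[c] + (y[a] - y[c]) * (c - b) / (c - a)
                   for b in range(a + 1, c)):
                deg[a] += 1
                deg[c] += 1
    return deg

def combined_weights(y, alpha=0.5, decay=0.9):
    """Linear mix of normalized degree weights and time-decay weights."""
    deg = visibility_degrees(y)
    total = sum(deg)
    w_deg = [d / total for d in deg]
    raw = [decay ** (len(y) - 1 - t) for t in range(len(y))]  # recent points count more
    s = sum(raw)
    w_dec = [r / s for r in raw]
    return [alpha * a + (1 - alpha) * b for a, b in zip(w_deg, w_dec)]

w = combined_weights([2.0, 1.0, 3.0, 1.5, 2.5])
```

Both weight vectors are normalized before mixing, so the combined weights also sum to one.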
Liang, Peipeng; Jia, Xiuqin; Taatgen, Niels A; Zhong, Ning; Li, Kuncheng
2014-08-01
The neural correlates of the human inductive reasoning process are still unclear. Number series and letter series completion are two typical inductive reasoning tasks that share a common core component of rule induction. Previous studies have demonstrated that different strategies are adopted in number series and letter series completion tasks, even when the underlying rules are identical. In the present study, we examined cortical activation as a function of two different reasoning strategies for solving series completion tasks. The retrieval strategy, used in number series completion tasks, involves directly retrieving arithmetic knowledge to obtain the relations between items. The procedural strategy, used in letter series completion tasks, requires counting a certain number of times to detect the relations linking two items. The two strategies require essentially equivalent cognitive processes, but have different working memory demands (the procedural strategy incurs greater demands). The procedural strategy produced significantly greater activity in areas involved in memory retrieval (dorsolateral prefrontal cortex, DLPFC) and mental representation/maintenance (posterior parietal cortex, PPC). An ACT-R model of the tasks successfully predicted behavioral performance and BOLD responses. The present findings support a general-purpose dual-process theory of inductive reasoning regarding the cognitive architecture. Copyright © 2014 Elsevier B.V. All rights reserved.
Detectability of Granger causality for subsampled continuous-time neurophysiological processes.
Barnett, Lionel; Seth, Anil K
2017-01-01
Granger causality is well established within the neurosciences for inference of directed functional connectivity from neurophysiological data. These data usually consist of time series which subsample a continuous-time biophysiological process. While it is well known that subsampling can lead to imputation of spurious causal connections where none exist, less is known about the effects of subsampling on the ability to reliably detect causal connections which do exist. We present a theoretical analysis of the effects of subsampling on Granger-causal inference. Neurophysiological processes typically feature signal propagation delays on multiple time scales; accordingly, we base our analysis on a distributed-lag, continuous-time stochastic model, and consider Granger causality in continuous time at finite prediction horizons. Via exact analytical solutions, we identify relationships among sampling frequency, underlying causal time scales and detectability of causalities. We reveal complex interactions between the time scale(s) of neural signal propagation and sampling frequency. We demonstrate that detectability decays exponentially as the sample time interval increases beyond causal delay times, identify detectability "black spots" and "sweet spots", and show that downsampling may potentially improve detectability. We also demonstrate that the invariance of Granger causality under causal, invertible filtering fails at finite prediction horizons, with particular implications for inference of Granger causality from fMRI data. Our analysis emphasises that sampling rates for causal analysis of neurophysiological time series should be informed by domain-specific time scales, and that state-space modelling should be preferred to purely autoregressive modelling. 
On the basis of a very general model that captures the structure of neurophysiological processes, we are able to help identify confounds, and offer practical insights, for successful detection of causal connectivity from neurophysiological recordings. Copyright © 2016 Elsevier B.V. All rights reserved.
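The core Granger-causal comparison, and the detectability loss under subsampling, can be illustrated with a minimal pure-Python vector-autoregression sketch (ordinary least squares via normal equations); this is a simplified stand-in for the paper's continuous-time, distributed-lag analysis.

```python
import random

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small linear system."""
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def rss(y, X):
    """Residual sum of squares of an OLS fit of y on the rows of X."""
    p = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(p)] for i in range(p)]
    Xty = [sum(r[i] * v for r, v in zip(X, y)) for i in range(p)]
    beta = solve(XtX, Xty)
    return sum((v - sum(b * xv for b, xv in zip(beta, r))) ** 2
               for v, r in zip(y, X))

def granger_stat(target, driver):
    """F-style statistic: RSS drop when the driver's lag joins the target's AR(1) model."""
    y = target[1:]
    Xr = [[1.0, target[t]] for t in range(len(target) - 1)]
    Xf = [[1.0, target[t], driver[t]] for t in range(len(target) - 1)]
    rss_r, rss_f = rss(y, Xr), rss(y, Xf)
    return (rss_r - rss_f) / (rss_f / (len(y) - 3))

random.seed(1)
x, y = [0.0], [0.0]
for _ in range(500):
    nx = 0.5 * x[-1] + random.gauss(0, 1)                 # x is autonomous
    ny = 0.5 * y[-1] + 0.8 * x[-1] + random.gauss(0, 1)   # x drives y
    x.append(nx)
    y.append(ny)

f_xy = granger_stat(y, x)             # x -> y: clearly detectable
f_yx = granger_stat(x, y)             # y -> x: no true influence
f_sub = granger_stat(y[::4], x[::4])  # subsampling past the causal delay weakens detection
```

Taking every 4th sample pushes the sample interval beyond the one-step causal delay, so the statistic for the true direction drops, in line with the exponential decay of detectability described above.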
Association mining of dependency between time series
NASA Astrophysics Data System (ADS)
Hafez, Alaaeldin
2001-03-01
Time series analysis is considered a crucial component of strategic control over a broad variety of disciplines in business, science and engineering. Time series data is a sequence of observations collected over intervals of time. Each time series describes a phenomenon as a function of time. Analysis of time series data includes discovering trends (or patterns) in a time series sequence. In the last few years, data mining has emerged and been recognized as a new technology for data analysis. Data mining is the process of discovering potentially valuable patterns, associations, trends, sequences and dependencies in data. Data mining techniques can discover information that many traditional business analysis and statistical techniques fail to deliver. In this paper, we adapt data mining techniques to analyze time series data. Using data mining techniques, maximal frequent patterns are discovered and used in predicting future sequences or trends, where trends describe the behavior of a sequence. In order to include different types of time series (e.g., irregular and non-systematic), we consider past frequent patterns of the same time sequences (local patterns) and of other dependent time sequences (global patterns). We use the word 'dependent' instead of the word 'similar' to emphasize real-life time series where two time series sequences could be completely different (in values, shapes, etc.) but still react to the same conditions in a dependent way. In this paper, we propose the Dependence Mining Technique, which could be used in predicting time series sequences. The proposed technique consists of three phases: (a) for all time series sequences, generate their trend sequences; (b) discover maximal frequent trend patterns and generate pattern vectors (to keep information of frequent trend patterns); and (c) use trend pattern vectors to predict future time series sequences.
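Phase (a) and the pattern-counting core of phase (b) can be sketched as follows; the Up/Down/Steady alphabet and the substring-counting approach are illustrative simplifications of maximal frequent pattern mining.

```python
from collections import Counter

def trend_sequence(series, eps=1e-9):
    """Symbolize a numeric series as Up/Down/Steady moves between time points."""
    out = []
    for a, b in zip(series, series[1:]):
        out.append("U" if b > a + eps else "D" if b < a - eps else "S")
    return "".join(out)

def frequent_patterns(trends, length, min_count):
    """Count every trend substring of the given length; keep the frequent ones."""
    counts = Counter(trends[i:i + length] for i in range(len(trends) - length + 1))
    return {p: c for p, c in counts.items() if c >= min_count}

series = [1, 2, 3, 2, 1, 2, 3, 2, 1, 2]
t = trend_sequence(series)
pats = frequent_patterns(t, 2, 2)
```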
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-21
... Evacuation Systems Approved Under Technical Standard Order (TSO) TSO-C69b and Installed on Airbus Model A330-200 and -300 Series Airplanes, Model A340-200 and -300 Series Airplanes, and Model A340-541 and -642... evacuation systems approved under TSO- C69b and installed on certain Model A330-200 and -300 series airplanes...
Wang, Li; Wang, Xiaoyi; Jin, Xuebo; Xu, Jiping; Zhang, Huiyan; Yu, Jiabin; Sun, Qian; Gao, Chong; Wang, Lingbin
2017-03-01
Current methods describe the formation process of algae inaccurately and predict water blooms with low precision. In this paper, the chemical mechanism of algae growth is analyzed, and a correlation analysis of chlorophyll-a and algal density is conducted by chemical measurement. Taking into account the influence of multiple factors on algae growth and water blooms, a comprehensive prediction method combining multivariate time series analysis and intelligent models is put forward. Firstly, through the process of photosynthesis, the main factors that affect the reproduction of algae are analyzed. A compensation prediction method for multivariate time series analysis based on a neural network and a Support Vector Machine is put forward, combined with Kernel Principal Component Analysis for dimension reduction of the influence factors of blooms. Then, a Genetic Algorithm is applied to improve the generalization ability of the BP network and the Least Squares Support Vector Machine. Experimental results show that this method can better compensate the prediction model of multivariate time series analysis and is an effective way to improve the description accuracy of algae growth and the prediction precision of water blooms.
A Time-Series Water Level Forecasting Model Based on Imputation and Variable Selection Method.
Yang, Jun-He; Cheng, Ching-Hsue; Chan, Chia-Pan
2017-01-01
Reservoirs are important for households and impact the national economy. This paper proposes a time-series forecasting model that estimates missing values and applies variable selection to forecast a reservoir's water level. This study collected data from the Taiwan Shimen Reservoir as well as daily atmospheric data from 2008 to 2015. The two datasets are concatenated into an integrated dataset based on the ordering of the data, which serves as the research dataset. The proposed time-series forecasting model has three main foci. First, this study uses five imputation methods to handle missing values rather than deleting the affected records directly. Second, we identify the key variables via factor analysis and then delete the unimportant variables sequentially via the variable selection method. Finally, the proposed model uses a Random Forest to build the forecasting model of the reservoir's water level, which is compared with the listing methods in terms of forecasting error. The experimental results indicate that the Random Forest forecasting model, when applied to variable selection with full variables, has better forecasting performance than the listing models. In addition, the experiments show that the proposed variable selection can help the forecasting methods used here improve their forecasting capability.
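A minimal sketch of the first two stages of such a pipeline (mean imputation followed by correlation-based variable ranking), assuming hypothetical column names; the paper's factor analysis and Random Forest stages are omitted.

```python
def mean_impute(column):
    """Replace missing entries (None) with the mean of the observed values."""
    obs = [v for v in column if v is not None]
    m = sum(obs) / len(obs)
    return [m if v is None else v for v in column]

def pearson(a, b):
    """Pearson correlation between two equal-length numeric lists."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb)

def rank_variables(columns, target):
    """Order candidate predictor columns by |correlation| with the target."""
    scores = {name: abs(pearson(mean_impute(col), target))
              for name, col in columns.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical columns: "rain" tracks the water level, "noise" does not.
target = [1.0, 2.0, 3.0, 4.0, 5.0]
cols = {"rain": [1.1, None, 3.0, 3.9, 5.2],
        "noise": [2.0, 2.0, 1.0, 2.0, 2.0]}
order = rank_variables(cols, target)
```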
NASA Astrophysics Data System (ADS)
Jacob, Rinku; Harikrishnan, K. P.; Misra, R.; Ambika, G.
2018-01-01
Recurrence networks and the associated statistical measures have become important tools in the analysis of time series data. In this work, we test how effective the recurrence network measures are in analyzing real-world data involving the two main types of noise, white noise and colored noise. We use two prominent network measures as discriminating statistics for hypothesis testing using surrogate data, for the specific null hypothesis that the data are derived from a linear stochastic process. We show that the characteristic path length is an especially efficient discriminating measure, with the conclusions reasonably accurate even with a limited number of data points in the time series. We also highlight an additional advantage of the network approach in identifying the dimensionality of the system underlying the time series through a convergence measure derived from the probability distribution of the local clustering coefficients. As examples of real-world data, we use the light curves from a prominent black hole system and show that a combined analysis using three primary network measures can provide vital information regarding the nature of temporal variability of light curves from different spectroscopic classes.
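A toy version of the recurrence-network construction and the characteristic path length measure, in pure Python; a real analysis would build the network from delay-embedded state vectors rather than raw one-dimensional values.

```python
from collections import deque

def recurrence_network(series, eps):
    """Adjacency list: nodes i and j are linked when |x_i - x_j| < eps (i != j)."""
    n = len(series)
    return [[j for j in range(n) if j != i and abs(series[i] - series[j]) < eps]
            for i in range(n)]

def characteristic_path_length(adj):
    """Mean shortest-path length over all connected ordered node pairs (BFS)."""
    total, pairs = 0, 0
    for s in range(len(adj)):
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        for v, d in dist.items():
            if v != s:
                total += d
                pairs += 1
    return total / pairs

adj = recurrence_network([0.0, 0.1, 0.2, 1.0, 1.1], eps=0.15)
L = characteristic_path_length(adj)
```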
NASA Astrophysics Data System (ADS)
An, Yang; Sun, Mei; Gao, Cuixia; Han, Dun; Li, Xiuming
2018-02-01
This paper studies the influence of Brent oil price fluctuations on the stock prices of China's two distinct blocks, namely, the petrochemical block and the electric equipment and new energy block, applying the Shannon entropy of information theory. The co-movement trend of crude oil price and stock prices is divided into different fluctuation patterns with the coarse-graining method. Then, the bivariate time series network model is established for the two blocks' stocks in five different periods. By joint analysis of the network-oriented metrics, the key modes and underlying evolutionary mechanisms are identified. The results show that both networks have different fluctuation characteristics in different periods. Their co-movement patterns are clustered in some key modes and conversion intermediaries. The study not only reveals the lag effect of crude oil price fluctuations on the stock of Chinese industry blocks but also verifies the necessity of research on special periods, and suggests that the government should use different energy policies to stabilize market volatility in different periods. A new way is provided to study the unidirectional influence between multiple variables or complex time series.
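The coarse-graining step can be sketched as follows: each day's joint (oil, stock) move is mapped to a two-symbol pattern, and consecutive patterns define the edges of the fluctuation-pattern network. The symbols and toy prices are illustrative.

```python
from collections import Counter

def comovement_patterns(oil, stock):
    """Map each day's (oil move, stock move) to a 2-symbol pattern, e.g. 'ud'."""
    sym = lambda a, b: "u" if b > a else "d"
    return [sym(o0, o1) + sym(s0, s1)
            for (o0, o1), (s0, s1) in zip(zip(oil, oil[1:]), zip(stock, stock[1:]))]

def transition_counts(patterns):
    """Edges of the pattern network: counts of consecutive pattern pairs."""
    return Counter(zip(patterns, patterns[1:]))

oil = [50, 52, 51, 53, 52, 54]
stock = [10, 11, 10, 12, 11, 13]
pats = comovement_patterns(oil, stock)
edges = transition_counts(pats)
```

The most heavily weighted edges correspond to the "key modes and conversion intermediaries" discussed above.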
NASA Astrophysics Data System (ADS)
Dergachev, V. A.; Dmitriev, P. B.
2017-12-01
An inhomogeneous time series of measurements of the percentage content of biogenic silica in the samples of joint cores BDP-96-1 and BDP-96-2 from the bottom of Lake Baikal, drilled at a depth of 321 m under water, has been analyzed. The composite depth of the cores is 77 m, which covers the Pleistocene Epoch to 1.8 Ma. The time series was reduced to a regular form with a time step of 1 kyr, which allowed 16 distinct quasi-periodic components with periods from 19 to 251 kyr to be revealed in this series at a significance level of their amplitudes exceeding 4σ. For this, the combined spectral periodogram (a modification of the spectral analysis method) was used. Some of the revealed quasi-harmonics are related to the characteristic cyclical oscillations of the Earth's orbital parameters. Special focus was paid to the temporal change in the parameters of the revealed quasi-harmonic components over the Pleistocene Epoch, which was studied by constructing the spectral density of the analyzed data in a running window of 201 and 701 kyr.
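A plain discrete-Fourier periodogram, sketched below on a synthetic record with a known 25-step cycle, illustrates the kind of quasi-periodic component detection described above; the paper's combined spectral periodogram and its significance testing are more elaborate.

```python
import math

def periodogram(x):
    """Power at DFT frequencies k/N for k = 1..N//2 (pure-Python DFT)."""
    n = len(x)
    mean = sum(x) / n
    x = [v - mean for v in x]
    powers = {}
    for k in range(1, n // 2 + 1):
        re = sum(v * math.cos(2 * math.pi * k * t / n) for t, v in enumerate(x))
        im = sum(v * math.sin(2 * math.pi * k * t / n) for t, v in enumerate(x))
        powers[k] = (re * re + im * im) / n
    return powers

# Synthetic record sampled at a regular 1-unit step (e.g. 1 kyr), with one
# cycle every 25 steps; 200 samples in total.
n = 200
series = [math.sin(2 * math.pi * t / 25) for t in range(n)]
p = periodogram(series)
k_star = max(p, key=p.get)   # dominant frequency index
period = n / k_star          # recovered period in sampling steps
```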
Anomaly on Superspace of Time Series Data
NASA Astrophysics Data System (ADS)
Capozziello, Salvatore; Pincak, Richard; Kanjamapornkul, Kabin
2017-11-01
We apply the G-theory and the anomaly of ghost and antighost fields in the theory of supersymmetry to study a superspace over time series data for the detection of hidden general supply and demand equilibrium in the financial market. We provide proof of the existence of a general equilibrium point over the 14 extra dimensions of the new G-theory, compared with the 11-dimensional M-theory model of Edward Witten. We found that the process of coupling between nonequilibrium and equilibrium spinor fields of expectation ghost fields in the superspace of time series data induces an infinitely long exact sequence of cohomology from a short exact sequence of the moduli state space model. If we assume that the financial market is separated into two topological spaces of supply and demand, as in the D-brane and anti-D-brane model, then we can use a cohomology group to compute the stability of the market as a stable point of the general equilibrium of the interaction between D-branes of the market. We obtain the result that the general equilibrium will exist if and only if the 14th Batalin-Vilkovisky cohomology group with the negative dimensions underlying 14 major hidden factors influencing the market is zero.
NASA Astrophysics Data System (ADS)
Hermosilla, Txomin; Wulder, Michael A.; White, Joanne C.; Coops, Nicholas C.; Hobart, Geordie W.
2017-12-01
The use of time series satellite data allows for the temporally dense, systematic, transparent, and synoptic capture of land dynamics over time. Subsequent to the opening of the Landsat archive, several time series approaches for characterizing landscape change have been developed, often representing a particular analytical time window. The information richness and widespread utility of these time series data have created a need to maintain the currency of time series information via the addition of new data, as it becomes available. When an existing time series is temporally extended, it is critical that previously generated change information remains consistent, thereby not altering reported change statistics or science outcomes based on that change information. In this research, we investigate the impacts and implications of adding additional years to an existing 29-year annual Landsat time series for forest change. To do so, we undertook a spatially explicit comparison of the 29 overlapping years of a time series representing 1984-2012, with a time series representing 1984-2016. Surface reflectance values, and presence, year, and type of change were compared. We found that the addition of years to extend the time series had minimal effect on the annual surface reflectance composites, with slight band-specific differences (r ≥ 0.1) in the final years of the original time series being updated. The area of stand replacing disturbances and determination of change year are virtually unchanged for the overlapping period between the two time-series products. Over the overlapping temporal period (1984-2012), the total area of change differs by 0.53%, equating to an annual difference in change area of 0.019%. Overall, the spatial and temporal agreement of the changes detected by both time series was 96%. Further, our findings suggest that the entire pre-existing historic time series does not need to be re-processed during the update process. 
Critically, given the time series change detection and update approach followed here, science outcomes or reports representing one temporal epoch can be considered stable and will not be altered when a time series is updated with newly available data.
NASA Technical Reports Server (NTRS)
Chelton, Dudley B.; Schlax, Michael G.
1991-01-01
The sampling error of an arbitrary linear estimate of a time-averaged quantity constructed from a time series of irregularly spaced observations at a fixed location is quantified through a formalism. The method is applied to satellite observations of chlorophyll from the coastal zone color scanner. The two specific linear estimates under consideration are the composite average, formed from the simple average of all observations within the averaging period, and the optimal estimate, formed by minimizing the mean squared error of the temporal average based on all the observations in the time series. The resulting suboptimal estimates are shown to be more accurate than composite averages. Suboptimal estimates are also found to be nearly as accurate as optimal estimates using the correct signal and measurement error variances and correlation functions for realistic ranges of these parameters, which makes them a viable practical alternative to the composite average method generally employed at present.
Chaotic behaviour of the short-term variations in ozone column observed in Arctic
NASA Astrophysics Data System (ADS)
Petkov, Boyan H.; Vitale, Vito; Mazzola, Mauro; Lanconelli, Christian; Lupi, Angelo
2015-09-01
The diurnal variations observed in the ozone column at Ny-Ålesund, Svalbard during different periods of 2009, 2010 and 2011 have been examined to test the hypothesis that they could be the result of a chaotic process. It was found that each of the attractors, reconstructed by applying the time delay technique to any of the three time series, can be embedded in a 6-dimensional space. Recurrence plots, depicted to characterise the attractor features, revealed structures typical of a chaotic system. In addition, the two positive Lyapunov exponents found for the three attractors, the fractal Hausdorff dimension given by the Kaplan-Yorke estimator, and the feasibility of predicting the short-term ozone column variations within 10-20 h from past behaviour make the assumption of their chaotic character more realistic. The similarities of the estimated parameters in all three cases allow us to hypothesise that the three time series under study likely present one-dimensional projections of the same chaotic system taken at different time intervals.
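The time delay technique mentioned above reconstructs state vectors from a scalar series; a minimal sketch follows, with an arbitrary embedding dimension and delay chosen purely for illustration.

```python
import math

def delay_embed(series, dim, tau):
    """Reconstruct state vectors (x_t, x_{t+tau}, ..., x_{t+(dim-1)*tau})."""
    last = len(series) - (dim - 1) * tau
    return [tuple(series[t + k * tau] for k in range(dim)) for t in range(last)]

x = [math.sin(0.3 * t) for t in range(100)]
vectors = delay_embed(x, dim=3, tau=5)
```

In practice the delay is often chosen from the first minimum of mutual information and the dimension by false-nearest-neighbour criteria; both are fixed by hand here.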
Inferring the interplay between network structure and market effects in Bitcoin
NASA Astrophysics Data System (ADS)
Kondor, Dániel; Csabai, István; Szüle, János; Pósfai, Márton; Vattay, Gábor
2014-12-01
A main focus in economics research is understanding the time series of prices of goods and assets. While statistical models using only the properties of the time series itself have been successful in many aspects, we expect to gain a better understanding of the phenomena involved if we can model the underlying system of interacting agents. In this article, we consider the history of Bitcoin, a novel digital currency system, for which the complete list of transactions is available for analysis. Using this dataset, we reconstruct the transaction network between users and analyze changes in the structure of the subgraph induced by the most active users. Our approach is based on the unsupervised identification of important features of the time variation of the network. Applying the widely used method of Principal Component Analysis to the matrix constructed from snapshots of the network at different times, we are able to show how structural changes in the network accompany significant changes in the exchange price of bitcoins.
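The PCA step can be sketched with a power iteration on the covariance of a small snapshot matrix; the toy snapshots below are hypothetical feature vectors, not Bitcoin network data.

```python
def leading_component(rows, iters=200):
    """First principal axis of a small data matrix via power iteration."""
    d = len(rows[0])
    means = [sum(r[i] for r in rows) / len(rows) for i in range(d)]
    X = [[r[i] - means[i] for i in range(d)] for r in rows]
    # covariance matrix (an unnormalized scatter matrix gives the same direction)
    C = [[sum(r[i] * r[j] for r in X) for j in range(d)] for i in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(C[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Toy "network snapshots": the first two features move together and
# dominate the variance; the third barely varies.
snapshots = [[1.0, 1.1, 0.0], [2.0, 2.1, 0.1], [3.0, 2.9, 0.0], [4.0, 4.2, 0.1]]
pc1 = leading_component(snapshots)
```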
Leal, Aura Lucia; Montañez, Anita Maria; Buitrago, Giancarlo; Patiño, Jaime; Camacho, German; Moreno, Vivian Marcela; Colombia, Red Neumo
2017-01-01
Abstract Background Trends in the distribution of S. pneumoniae capsular serotypes are associated with the introduction of pneumococcal conjugate vaccines (PCV) in a population. In Colombia, the 10-valent PCV (PCV10) has been included in the national vaccination program since 2011. As part of the pneumococcal surveillance network (SIREVA), Colombia has gathered data on serotype distribution since 1993. The aim of this work is to determine the effect of PCV10 introduction on serotypes not covered by PCV10 in Colombia, specifically serotypes 6A, 19A and 3. Methods Information was obtained from the national surveillance program from 1993 to 2016 in children under 5 years. The isolates came from sterile sites (blood, cerebrospinal fluid, pleural fluid, and articular and peritoneal fluids). All isolates were serotyped by the National Institute of Health. An interrupted time series analysis was performed to determine the effect of the PCV10 introduction on the 6A, 19A and 3 serotypes (ARIMA model). Results Serotyping was performed on 4683 isolates. The annual proportion trend of the 6A, 19A and 3 serotypes remained constant until 2012. An approximately twofold increase in the serotype proportion trends was observed after 2012 (Figure). The interrupted time-series analysis showed a positive effect of the PCV10 introduction on the trends of the 19A and 3 serotypes, with coefficients 20.92 (P = 0.00, ARIMA(2,0,1)) and 6.32 (P = 0.00, ARIMA(2,1,1)), respectively. There was no significant effect on the 6A serotype trend. Conclusion The introduction of PCV10 in the national vaccination program in Colombia affected the distribution of PCV13 capsular types not included in PCV7 and PCV10 in children under 5 years. This information emphasizes the importance of monitoring changes in serotype distributions to guide prevention strategies for children under 5 years in Colombia. Figure. 1 Trends in distribution of serotypes 19A, 3 and 6A in children under 5 years. Colombia.
Disclosures All authors: No reported disclosures.
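An interrupted time series design can be illustrated with segmented regression on synthetic data; the abstract's ARIMA models are more elaborate, so the design matrix, change point and coefficients below are purely illustrative.

```python
def ols(X, y):
    """Least squares via normal equations and Gaussian elimination (small p)."""
    p = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(p)]
         + [sum(r[i] * v for r, v in zip(X, y))] for i in range(p)]
    for c in range(p):
        piv = max(range(c, p), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        for r in range(c + 1, p):
            f = A[r][c] / A[c][c]
            for k in range(c, p + 1):
                A[r][k] -= f * A[c][k]
    beta = [0.0] * p
    for r in range(p - 1, -1, -1):
        beta[r] = (A[r][p] - sum(A[r][k] * beta[k] for k in range(r + 1, p))) / A[r][r]
    return beta

def its_design(n, t0):
    """Segmented-regression design: intercept, time, post-intervention step, post slope."""
    return [[1.0, t, float(t >= t0), max(0, t - t0)] for t in range(n)]

# Synthetic serotype-proportion trend: flat at 5 before the change point,
# then a level jump of 4 and an extra slope of 2 per period.
t0, n = 10, 20
y = [5.0 + (4.0 + 2.0 * (t - t0) if t >= t0 else 0.0) for t in range(n)]
beta = ols(its_design(n, t0), y)
```

With noise-free data the fitted coefficients recover the baseline level, baseline slope, level change and slope change exactly.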
New Insights into the Explosion Source from SPE
NASA Astrophysics Data System (ADS)
Patton, H. J.
2015-12-01
Phase I of the Source Physics Experiments (SPE) is a series of chemical explosions at varying depths and yields detonated in the same emplacement hole on Climax stock, a granitic pluton located on the Nevada National Security Site. To date, four of the seven planned tests have been conducted, the last in May 2015, called SPE-4P, with a scaled depth of burial of 1549 m/kt^(1/3) in order to localize the source in time and space. Surface ground motions validated that the source medium did not undergo spallation, and a key experimental objective was achieved where SPE-4P is the closest of all tests in the series to a pure monopole source and will serve as an empirical Green's function for analysis against other SPE tests. A scientific objective of SPE is to understand mechanisms of rock damage for generating seismic waves, particularly surface and S waves, including prompt damage under compressive stresses and "late-time" damage under tensile stresses. Studies have shown that prompt damage can explain ~75% of the seismic moment for some SPE tests. Spallation is a form of late-time damage and a facilitator of damage mechanisms under tensile stresses including inelastic brittle deformation and shear dilatancy on pre-existing faults or joints. As an empirical Green's function, SPE-4P allows the study of late-time damage mechanisms on other SPE tests that induce spallation and late-time damage, and I'll discuss these studies. The importance for nuclear monitoring cannot be overstated because new research shows that damage mechanisms can affect surface wave magnitude Ms more than tectonic release, and are a likely factor related to anomalous mb-Ms behavior for North Korean tests.
Jo, Kyuri; Kwon, Hawk-Bin; Kim, Sun
2014-06-01
Measuring expression levels of genes at the whole-genome level can be useful for many purposes, especially for revealing biological pathways underlying specific phenotype conditions. When gene expression is measured over a time period, we have opportunities to understand how organisms react to stress conditions over time. Thus many biologists routinely measure whole-genome-level gene expression at multiple time points. However, there are several technical difficulties in analyzing such whole-genome expression data. In addition, gene expression data is nowadays often measured by RNA-sequencing rather than microarray technologies, which makes the analysis more complicated: the process should start with mapping short reads and produce differentially activated pathways and, possibly, interactions among pathways. Moreover, many useful tools for analyzing microarray gene expression data are not applicable to RNA-seq data. Thus a comprehensive package for analyzing time series transcriptome data is much needed. In this article, we present such a comprehensive package, the Time-series RNA-seq Analysis Package (TRAP), integrating all necessary tasks such as mapping short reads, measuring gene expression levels, finding differentially expressed genes (DEGs), clustering and pathway analysis for time-series data in a single environment. In addition to implementing useful algorithms that are not available for RNA-seq data, we extended the existing pathway analysis methods ORA and SPIA for time series analysis and estimate statistical values for the combined dataset by an advanced metric. TRAP also produces a visual summary of pathway interactions. Gene expression change labeling, a practical clustering method used in TRAP, enables more accurate interpretation of the data when combined with pathway analysis. We applied our methods to a real dataset for the analysis of rice (Oryza sativa L. Japonica nipponbare) under drought stress.
The result showed that TRAP was able to detect pathways more accurately than several existing methods. TRAP is available at http://biohealth.snu.ac.kr/software/TRAP/. Copyright © 2014 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Zingone, Adriana; Harrison, Paul J.; Kraberg, Alexandra; Lehtinen, Sirpa; McQuatters-Gollop, Abigail; O'Brien, Todd; Sun, Jun; Jakobsen, Hans H.
2015-09-01
Phytoplankton diversity and its variation over an extended time scale can provide answers to a wide range of questions relevant to societal needs. These include human health, the safe and sustained use of marine resources and the ecological status of the marine environment, including long-term changes under the impact of multiple stressors. The analysis of phytoplankton data collected at the same place over time, as well as the comparison among different sampling sites, provide key information for assessing environmental change, and evaluating new actions that must be made to reduce human induced pressures on the environment. To achieve these aims, phytoplankton data may be used several decades later by users that have not participated in their production, including automatic data retrieval and analysis. The methods used in phytoplankton species analysis vary widely among research and monitoring groups, while quality control procedures have not been implemented in most cases. Here we highlight some of the main differences in the sampling and analytical procedures applied to phytoplankton analysis and identify critical steps that are required to improve the quality and inter-comparability of data obtained at different sites and/or times. Harmonization of methods may not be a realistic goal, considering the wide range of purposes of phytoplankton time-series data collection. However, we propose that more consistent and detailed metadata and complementary information be recorded and made available along with phytoplankton time-series datasets, including description of the procedures and elements allowing for a quality control of the data. To keep up with the progress in taxonomic research, there is a need for continued training of taxonomists, and for supporting and complementing existing web resources, in order to allow a constant upgrade of knowledge in phytoplankton classification and identification. 
Efforts towards the improvement of metadata recording, data annotation and quality control procedures will ensure the internal consistency of phytoplankton time series and facilitate their comparability and accessibility, thus strongly increasing the value of the precious information they provide. Ultimately, the sharing of quality controlled data will allow one to recoup the high cost of obtaining the data through the multiple use of the time-series data in various projects over many decades.
Highly comparative time-series analysis: the empirical structure of time series and their methods.
Fulcher, Ben D; Little, Max A; Jones, Nick S
2013-06-06
The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
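As a toy illustration of such reduced representations, one might summarize every series by a small feature vector and then retrieve neighbours in feature space. The three features below (mean, standard deviation, lag-1 autocorrelation) and the helper names are placeholders chosen for this sketch, not the paper's actual library of thousands of analysis methods:

```python
import math

def features(ts):
    """Reduced representation: a few simple summary statistics (illustrative only)."""
    n = len(ts)
    mean = sum(ts) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in ts) / n)
    # lag-1 autocorrelation as a crude dynamical feature
    num = sum((ts[i] - mean) * (ts[i + 1] - mean) for i in range(n - 1))
    den = sum((x - mean) ** 2 for x in ts) or 1.0
    return (mean, sd, num / den)

def nearest(query, library):
    """Retrieve the library series whose feature vector is closest to the query's."""
    fq = features(query)
    return min(library, key=lambda pair: math.dist(fq, features(pair[1])))[0]

noise = [((i * 37) % 11) - 5 for i in range(50)]   # erratic, oscillating series
trend = [0.1 * i for i in range(50)]               # smooth, trending series
print(nearest([0.1 * i + 0.01 for i in range(50)], [("noise", noise), ("trend", trend)]))  # prints: trend
```

Organizing datasets by such feature vectors (and, dually, organizing methods by their behaviour across datasets) is the core idea the paper scales up.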
Relating traffic fatalities to GDP in Europe on the long term.
Antoniou, Constantinos; Yannis, George; Papadimitriou, Eleonora; Lassarre, Sylvain
2016-07-01
Modeling road safety development can provide important insight into policies for the reduction of traffic fatalities. In order to achieve this goal, both the quantifiable impact of specific parameters and the underlying trends that cannot always be measured or observed need to be considered. One of the key relationships in road safety links fatalities with risk and exposure, where exposure reflects the amount of travel, which in turn translates to how much travelers are exposed to risk. Two economic variables, GDP and the unemployment rate, are typically selected to analyse their statistical relationships with indicators of road-accident fatality risk. The objective of this research is to provide an overview of the relevant literature and to outline recent developments in macro-panel data analysis that have the potential to improve our ability to forecast traffic fatality trends, especially under turbulent financial conditions. For this analysis, time series of the number of fatalities and GDP in 30 European countries over a period of 38 years (1975-2012) are used. This process relies on estimating long-term time-series models, fitted to each country separately. Based on these developments, and using state-of-the-art modelling and analysis techniques such as the Common Correlated Effects Mean Group estimator (Pesaran), the long-term elasticity mean value equals 0.63 and is significantly different from zero for only 10 countries. When the countries where the number of fatalities is stationary are excluded, the average elasticity takes a higher value of nearly 1. This shows the strong sensitivity of the estimate of the average elasticity over a panel of European countries and underlines the necessity of understanding the underlying nature of the time series in order to obtain a suitable regression model. Copyright © 2016 Elsevier Ltd. All rights reserved.
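The long-term elasticity discussed above can be sketched, for a single country, as the slope of a log-log regression of fatalities on GDP. The panel estimator the paper uses (CCEMG) is far richer; this is only a one-series illustration on synthetic data where the true elasticity is 0.63 by construction:

```python
import math

def elasticity(fatalities, gdp):
    """Long-term elasticity as the OLS slope of log(fatalities) on log(GDP)."""
    x = [math.log(g) for g in gdp]
    y = [math.log(f) for f in fatalities]
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

# Synthetic 38-year series: fatalities ~ GDP^0.63 recovers a slope of 0.63.
gdp = [100 * 1.03 ** t for t in range(38)]
fat = [500 * (g / 100) ** 0.63 for g in gdp]
print(round(elasticity(fat, gdp), 2))  # prints: 0.63
```

An elasticity near 1 would mean fatalities grow roughly proportionally with GDP, which is why excluding stationary-fatality countries raises the panel average.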
ERIC Educational Resources Information Center
Ngan, Chun-Kit
2013-01-01
Making decisions over multivariate time series is an important topic which has gained significant interest in the past decade. A time series is a sequence of data points which are measured and ordered over uniform time intervals. A multivariate time series is a set of multiple, related time series in a particular domain in which domain experts…
Multiscale structure of time series revealed by the monotony spectrum.
Vamoş, Călin
2017-03-01
Observation of complex systems produces time series with specific dynamics at different time scales. The majority of the existing numerical methods for multiscale analysis first decompose the time series into several simpler components and the multiscale structure is given by the properties of their components. We present a numerical method which describes the multiscale structure of arbitrary time series without decomposing them. It is based on the monotony spectrum defined as the variation of the mean amplitude of the monotonic segments with respect to the mean local time scale during successive averagings of the time series, the local time scales being the durations of the monotonic segments. The maxima of the monotony spectrum indicate the time scales which dominate the variations of the time series. We show that the monotony spectrum can correctly analyze a diversity of artificial time series and can discriminate the existence of deterministic variations at large time scales from the random fluctuations. As an application we analyze the multifractal structure of some hydrological time series.
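A minimal sketch of the monotony-spectrum idea: split the series into maximal monotonic segments, take the mean amplitude and mean duration as one point of the spectrum, and repeat after successive smoothings. Simple moving-average smoothing and treating ties as rising are assumptions of this sketch, not necessarily the paper's exact scheme:

```python
def monotonic_segments(ts):
    """Split a series into maximal monotonic runs; return (amplitude, duration) pairs."""
    segs, start = [], 0
    for i in range(1, len(ts) - 1):
        rising = ts[i] >= ts[i - 1]
        if (ts[i + 1] >= ts[i]) != rising:          # local extremum: segment boundary
            segs.append((abs(ts[i] - ts[start]), i - start))
            start = i
    segs.append((abs(ts[-1] - ts[start]), len(ts) - 1 - start))
    return segs

def monotony_point(ts):
    """One point of the spectrum: mean local time scale vs. mean amplitude."""
    segs = monotonic_segments(ts)
    dur = sum(d for _, d in segs) / len(segs)
    amp = sum(a for a, _ in segs) / len(segs)
    return dur, amp

def smooth(ts, w):
    """Moving average; successive smoothings trace out the spectrum."""
    return [sum(ts[i:i + w]) / w for i in range(len(ts) - w + 1)]

zig = [0, 1, 0, 1, 0, 1, 0, 1, 0]
print(monotony_point(zig), monotony_point(smooth(zig, 3)))
```

Maxima of amplitude as a function of the mean local time scale then indicate the scales that dominate the variations, without ever decomposing the series into components.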
Zhang, Zhenwei; VanSwearingen, Jessie; Brach, Jennifer S.; Perera, Subashan
2016-01-01
Human gait is a complex interaction of many nonlinear systems, and stride intervals exhibit self-similarity over long time scales that can be modeled as a fractal process. The scaling exponent represents the fractal degree and can be interpreted as a biomarker of certain diseases. A previous study showed that the average wavelet method provides the most accurate estimates of this scaling exponent when applied to stride-interval time series. The purpose of this paper is to determine the most suitable mother wavelet for the average wavelet method. This paper presents a comparative numerical analysis of sixteen mother wavelets using simulated and real fractal signals. Simulated fractal signals were generated under varying signal lengths and scaling exponents that indicate a range of physiologically conceivable fractal signals. Five candidate wavelets were then chosen due to their good performance on the mean square error test for both short and long signals. Next, we comparatively analyzed these five mother wavelets for physiologically relevant stride-time-series lengths. Our analysis showed that the symlet 2 mother wavelet provides a low mean square error and low variance for long time intervals and relatively low errors for short signal lengths. It can be considered the most suitable mother wavelet without the burden of considering the signal length. PMID:27960102
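The average wavelet method can be sketched with the Haar wavelet (chosen here purely because it needs no wavelet library; the paper's conclusion is that symlet 2 is preferable): the scaling exponent is estimated from the slope of the log mean absolute detail coefficient across dyadic levels:

```python
import math
import random

def haar_levels(ts, levels):
    """Mean absolute Haar detail coefficient at each dyadic level 1..levels."""
    out, a = [], list(ts)
    for _ in range(levels):
        d = [(a[2 * i] - a[2 * i + 1]) / math.sqrt(2) for i in range(len(a) // 2)]
        a = [(a[2 * i] + a[2 * i + 1]) / math.sqrt(2) for i in range(len(a) // 2)]
        out.append(sum(abs(x) for x in d) / len(d))
    return out

def scaling_exponent(ts, levels=6):
    """Slope of log2(mean |detail|) over level, minus 1/2; for a self-similar
    signal this approximates the scaling exponent (Haar used for simplicity)."""
    ys = [math.log2(c) for c in haar_levels(ts, levels)]
    xs = list(range(1, levels + 1))
    mx, my = sum(xs) / levels, sum(ys) / levels
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope - 0.5

random.seed(0)
steps = [random.gauss(0, 1) for _ in range(4096)]   # white noise: exponent near -0.5
walk, s = [], 0.0
for x in steps:                                     # random walk: exponent near +0.5
    s += x
    walk.append(s)
print(round(scaling_exponent(steps), 2), round(scaling_exponent(walk), 2))
```

Different mother wavelets give different bias and variance for this slope estimate, which is exactly the trade-off the paper quantifies for stride-interval lengths.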
Java 3D Interactive Visualization for Astrophysics
NASA Astrophysics Data System (ADS)
Chae, K.; Edirisinghe, D.; Lingerfelt, E. J.; Guidry, M. W.
2003-05-01
We are developing a series of interactive 3D visualization tools that employ the Java 3D API. We have applied this approach initially to a simple 3-dimensional galaxy collision model (restricted 3-body approximation), with quite satisfactory results. Running either as an applet under Web browser control, or as a Java standalone application, this program permits real-time zooming, panning, and 3-dimensional rotation of the galaxy collision simulation under user mouse and keyboard control. We shall also discuss applications of this technology to 3-dimensional visualization for other problems of astrophysical interest such as neutron star mergers and the time evolution of element/energy production networks in X-ray bursts. *Managed by UT-Battelle, LLC, for the U.S. Department of Energy under contract DE-AC05-00OR22725.
Ethical and methodological standards for laboratory and medical biological rhythm research.
Portaluppi, Francesco; Touitou, Yvan; Smolensky, Michael H
2008-11-01
The main objectives of this article are to update the ethical standards for the conduct of human and animal biological rhythm research and recommend essential elements for quality chronobiological research information, which should be especially useful for new investigators of the rhythms of life. A secondary objective is to provide for those with an interest in the results of chronobiology investigations, but who might be unfamiliar with the field, an introduction to the basic methods and standards of biological rhythm research and time series data analysis. The journal and its editors endorse compliance of all investigators to the principles of the Declaration of Helsinki of the World Medical Association, which relate to the conduct of ethical research on human beings, and the Guide for the Care and Use of Laboratory Animals of the Institute for Laboratory Animal Research of the National Research Council, which relate to the conduct of ethical research on laboratory and other animals. The editors and the readers of the journal expect the authors of submitted manuscripts to have adhered to the ethical standards dictated by local, national, and international laws and regulations in the conduct of investigations and to be unbiased and accurate in reporting never-before-published research findings. Authors of scientific papers are required to disclose all potential conflicts of interest, particularly when the research is funded in part or in full by the medical and pharmaceutical industry, when the authors are stock-holders of the company that manufactures or markets the products under study, or when the authors are a recent or current paid consultant to the involved company. 
It is the responsibility of the authors of submitted manuscripts to clearly present sufficient detail about the synchronizer schedule of the studied subjects (i.e., the sleep-wake schedule, ambient light-dark cycle, intensity and spectrum of ambient light exposure, seasons when the research was conducted, shift schedule in studies involving shift work, and menstrual cycle stage in studies involving young women). Rhythm analysis of time series data should be performed with the perspective that rhythms of different periods might be superimposed upon the observed temporal pattern of interest. A variety of different and complementary statistical procedures can be used for rhythm detection. Fitting a mathematical model to the time series data provides a better and more objective analysis than simple data inspection and narrative description, and if rhythmicity is documented by objective methods, it should be characterized by relevant parameters such as the rhythm's period (tau), MESOR (time series average), amplitude (range of temporal variation), acrophase (time of peak value), and bathyphase (time of trough value). However, the assumptions underlying the time series modeling must be satisfied and applicable in each case, especially the assumption of sinusoidality in the case of cosinor analysis, before it can be accepted as appropriate. An important aspect of the peer review of manuscripts submitted to Chronobiology International entails judgment of the conformity of research protocols and methods to the standards described in this article.
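The cosinor characterization mentioned above (MESOR, amplitude, acrophase) can be sketched as a least-squares fit of a single cosine. This toy version assumes the samples uniformly cover a whole number of periods, so the normal equations decouple; a general cosinor solver would solve the full 3x3 linear system:

```python
import math

def cosinor(times, values, period=24.0):
    """Single-component cosinor: y(t) ~ MESOR + A*cos(2*pi*t/period + phi).
    Sketch only: assumes uniform sampling over whole periods."""
    w = 2 * math.pi / period
    n = len(values)
    mesor = sum(values) / n                                  # time-series average
    a = 2 * sum(y * math.cos(w * t) for t, y in zip(times, values)) / n
    b = 2 * sum(y * math.sin(w * t) for t, y in zip(times, values)) / n
    amplitude = math.hypot(a, b)                             # extent of variation
    acrophase = math.atan2(-b, a)                            # phase of peak value
    return mesor, amplitude, acrophase

t = [i * 0.5 for i in range(48)]                             # 24 h at 30-min steps
y = [10 + 3 * math.cos(2 * math.pi * ti / 24 + 1.0) for ti in t]
print(cosinor(t, y))                                         # recovers 10, 3, 1.0
```

Checking the sinusoidality assumption before accepting such a fit is precisely the caveat the article raises.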
NASA Astrophysics Data System (ADS)
Klos, Anna; Olivares, German; Teferle, Felix Norman; Bogusz, Janusz
2016-04-01
Station velocity uncertainties determined from a series of Global Navigation Satellite System (GNSS) position estimates depend on both the deterministic and stochastic models applied to the time series. While the deterministic model generally includes parameters for a linear and several periodic terms the stochastic model is a representation of the noise character of the time series in form of a power-law process. For both of these models the optimal model may vary from one time series to another while the models also depend, to some degree, on each other. In the past various power-law processes have been shown to fit the time series and the sources for the apparent temporally-correlated noise were attributed to, for example, mismodelling of satellites orbits, antenna phase centre variations, troposphere, Earth Orientation Parameters, mass loading effects and monument instabilities. Blewitt and Lavallée (2002) demonstrated how improperly modelled seasonal signals affected the estimates of station velocity uncertainties. However, in their study they assumed that the time series followed a white noise process with no consideration of additional temporally-correlated noise. Bos et al. (2010) empirically showed for a small number of stations that the noise character was much more important for the reliable estimation of station velocity uncertainties than the seasonal signals. In this presentation we pick up from Blewitt and Lavallée (2002) and Bos et al. (2010), and have derived formulas for the computation of the General Dilution of Precision (GDP) under presence of periodic signals and temporally-correlated noise in the time series. 
We show, based on simulated and real time series from globally distributed IGS (International GNSS Service) stations processed by the Jet Propulsion Laboratory (JPL), that periodic signals dominate the effect on the velocity uncertainties at short time scales, while beyond four years the type of noise becomes much more important. In other words, for time series long enough, the assumed periodic signals do not affect the velocity uncertainties as much as the assumed noise model. We calculated the GDP as the ratio between two velocity errors: without and with inclusion of seasonal terms with periods of one year and its overtones up to the third. To all these cases, power-law processes of white, flicker and random-walk noise were added separately. A few oscillations in the GDP can be noticed at integer numbers of years, which arise from the added periodic terms. Their amplitudes in the GDP increase along with the increasing spectral index. Strong oscillation peaks in the GDP appear at short time scales, especially for random-walk processes, meaning that badly monumented stations are affected the most. Local minima and maxima in the GDP are also enlarged as the noise approaches random walk. We noticed that the semi-annual signal increased the local GDP minimum for white noise. This suggests that adding power-law noise to a deterministic model with an annual term, or adding a semi-annual term to white noise, increases the velocity uncertainty even at points where the determined velocity is not biased.
Detection of deformation time-series in Miyake-jima using PALSAR/InSAR
NASA Astrophysics Data System (ADS)
Ozawa, T.; Ueda, H.
2010-12-01
Volcano deformation is often complicated temporally and spatially, so deformation mapping by InSAR is useful for understanding it in detail. However, InSAR is affected by atmospheric, ionospheric and other noise, so we sometimes miss an important temporal change of deformation of a few centimeters. We therefore want to develop an InSAR time-series analysis that detects volcano deformation precisely. Generally, an area of 10×10 km, which covers a typical volcano, is included in several SAR scenes obtained from different orbits or observation modes. First, interferograms are generated for each orbit path. In the InSAR processing, atmospheric noise is reduced using a simulation from a numerical weather model. Long-wavelength noise due to orbit error and ionospheric disturbance is corrected by adjusting to the GPS deformation time-series, assuming it to be a plane. Next, we estimate the deformation time-series from the obtained interferograms. Radar incidence directions differ between orbit paths, but those for observation modes with 34.3° and 41.5° off-nadir angles are almost included in one plane, so the slant-range change for all orbit paths can be described by the horizontal and vertical components within that plane. We invert for these components for all epochs under the constraint that the temporal change of deformation is smooth, and simultaneously estimate the DEM error. As a case study, we present an application to Miyake-jima, a volcanic island located 200 km south of Tokyo, where a large amount of volcanic gas has been ejected since the 2000 eruption. Crustal deformation associated with this volcanic activity has been observed by continuous GPS, but its distribution is complicated, and therefore we applied this method to detect a precise deformation time-series. At most of the GPS sites, the obtained time-series were in good agreement with the GPS time-series, and the root-mean-square of the residuals was less than 1 cm.
However, a temporal step in deformation was estimated in 2008 that is not consistent with the GPS time-series; we think this reflects the effect of an orbit maneuver in 2008, and improving the treatment of such noise is one of our next subjects. In the obtained deformation map, contraction around the caldera and uplift along the north-west-south coast were found. This deformation pattern obviously cannot be explained by a single inflation or deflation source, and its interpretation is also a subject for future work. In the caldera bottom, subsidence of 14 cm/yr was found. Though the subsidence speed was constant until 2008, it changed to 20 cm/yr from 2009, and the subsidence speed in 2010 was 3 cm/yr. Around the same time, low-frequency earthquakes increased just under the caldera, so we speculate that the deceleration of subsidence may be directly related to the volcanic activity. Although the result shows the volcano deformation in detail, some mis-estimations were obtained. We believe that this InSAR time-series analysis is useful, but further improvements are necessary.
To What Extent Can Vegetation Mitigate Greenhouse Warming? A Modeling Approach
NASA Technical Reports Server (NTRS)
Bounoua, L.; Hall, F.G.; Collatz, G.J.; Tucker, C.J.; Sellers, P.J.; Kumar, A.
2008-01-01
Climate models participating in the IPCC Fourth Assessment Report indicate that under a 2xCO2 environment, runoff would increase faster than precipitation over land. However, observations over large U.S. watersheds indicate otherwise. This inconsistency suggests that there may be important feedbacks between climate and the land surface unaccounted for in the present generation of models. We postulate that the increase in precipitation associated with the increase in CO2 is also increasing vegetation density, which may already be feeding back onto climate. Including this feedback in a climate model simulation resulted in precipitation and runoff trends consistent with observations and reduced the warming by 0.6°C over land. This unaccounted-for water may be linked to about 10% of the missing land carbon sink. A recent compilation of outputs from 19 coupled atmosphere-ocean general circulation models used in the IPCC Fourth Assessment Report (AR4) shows projected increases in air temperature, precipitation and river discharge for 24 major rivers in the world in response to doubling CO2 by the end of the century (1). The ensemble mean from these models also indicates that, compared to their respective baselines over land, the global mean runoff would increase faster (8.9% per year) than precipitation (5% per year). We analyze century-scale observed annual runoff time-series (1901-2002) over 9 hydrological units covering large regions of the Eastern United States (Fig. 1), compiled by the United States Geological Survey (USGS) (2). These regions were selected because they are the most forested, the least water-limited, and not under extensive irrigation. We compare these time-series to similar time-series of observed annual precipitation anomalies spanning the period 1900-1995 (3). Both time-series exhibit a positive long-term trend (Fig. 2); however, in contrast to the analysis of (1), these historic data records show that the rate of precipitation increase is 5.5% per year, roughly double the rate of runoff increase of 3.1% per year.
Consistent Long-Time Series of GPS Satellite Antenna Phase Center Corrections
NASA Astrophysics Data System (ADS)
Steigenberger, P.; Schmid, R.; Rothacher, M.
2004-12-01
The current IGS processing strategy disregards satellite antenna phase center variations (pcvs) depending on the nadir angle and applies block-specific phase center offsets only. However, the transition from relative to absolute receiver antenna corrections presently under discussion necessitates the consideration of satellite antenna pcvs. Moreover, studies of several groups have shown that the offsets are not homogeneous within a satellite block. Manufacturer specifications seem to confirm this assumption. In order to obtain the best possible antenna corrections, consistent ten-year time series (1994-2004) of satellite-specific pcvs and offsets were generated. This challenging effort became possible as part of the reprocessing of a global GPS network currently performed by the Technical Universities of Munich and Dresden. The data of about 160 stations since the official start of the IGS in 1994 have been reprocessed, as today's GPS time series are mostly inhomogeneous and inconsistent due to continuous improvements in the processing strategies and modeling of global GPS solutions. An analysis of the signals contained in the time series of the phase center offsets reveals amplitudes at the decimeter level, at least one order of magnitude worse than the desired accuracy. The periods partly arise from the GPS orbit configuration, as the orientation of the orbit planes with regard to the inertial system repeats after about 350 days due to the rotation of the ascending nodes. In addition, the rms values of the X- and Y-offsets show a high correlation with the angle between the orbit plane and the direction to the sun. The time series of the pcvs mainly point at the correlation with the global terrestrial scale.
Solutions with relative and absolute phase center corrections, with block- and satellite-specific satellite antenna corrections demonstrate the effect of this parameter group on other global GPS parameters such as the terrestrial scale, station velocities, the geocenter position or the tropospheric delays. Thus, deeper insight into the so-called `Bermuda triangle' of several highly correlated parameters is given.
Measuring efficiency of international crude oil markets: A multifractality approach
NASA Astrophysics Data System (ADS)
Niere, H. M.
2015-01-01
The three major international crude oil markets are treated as complex systems and their multifractal properties are explored. The study covers daily prices of Brent crude, the OPEC reference basket and West Texas Intermediate (WTI) crude from January 2, 2003 to January 2, 2014. A multifractal detrended fluctuation analysis (MFDFA) is employed to extract the generalized Hurst exponents of each time series. The generalized Hurst exponent is used to measure the degree of multifractality, which in turn is used to quantify the efficiency of the three international crude oil markets. To identify whether the source of multifractality is long-range correlations or broad fat-tail distributions, shuffled data and surrogate data corresponding to each time series are generated. Shuffled data are obtained by randomizing the order of the price-returns data, which destroys any long-range correlation of the time series. Surrogate data are produced using the Fourier-Detrended Fluctuation Analysis (F-DFA), by randomizing the phases of the price-returns data in Fourier space, which normalizes the distribution of the time series. The study found that for the three crude oil markets, there is a strong dependence of the generalized Hurst exponents on the order of fluctuations, a sign of multifractality in the daily price time series under study. Using the degree of multifractality as a measure of efficiency, the results show that WTI is the most efficient market while OPEC is the least efficient, implying that OPEC has the highest likelihood of being manipulated among the three. This reflects the fact that Brent and WTI are very competitive markets and hence have a higher level of complexity than OPEC, which holds large monopoly power.
Comparison with the shuffled and surrogate data suggests that, for all three crude oil markets, the multifractality is mainly due to long-range correlations.
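The shuffling step described above can be illustrated with a toy statistic: a lag-1 autocorrelation (standing in for the full MFDFA machinery) computed before and after randomizing the order of a strongly correlated AR(1) series. The series parameters and seed are arbitrary choices for this sketch:

```python
import random

def lag1_autocorr(ts):
    """Lag-1 autocorrelation; shuffling the order should drive it toward zero."""
    n = len(ts)
    m = sum(ts) / n
    num = sum((ts[i] - m) * (ts[i + 1] - m) for i in range(n - 1))
    den = sum((x - m) ** 2 for x in ts)
    return num / den

random.seed(7)
y = [0.0]
for _ in range(5000):
    y.append(0.9 * y[-1] + random.gauss(0, 1))   # strongly correlated series

shuffled = y[:]
random.shuffle(shuffled)                          # destroys temporal ordering only

print(round(lag1_autocorr(y), 2), round(lag1_autocorr(shuffled), 2))
```

Because shuffling preserves the value distribution but destroys ordering, any multifractality that survives shuffling must come from fat tails rather than long-range correlations, which is the logic of the test in the paper.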
Microforms in gravel bed rivers: Formation, disintegration, and effects on bedload transport
Strom, K.; Papanicolaou, A.N.; Evangelopoulos, N.; Odeh, M.
2004-01-01
This research aims to advance current knowledge on cluster formation and evolution by tackling some of the aspects associated with cluster microtopography and the effects of clusters on bedload transport. The specific objectives of the study are (1) to identify the bed shear stress range in which clusters form and disintegrate, (2) to quantitatively describe the spacing characteristics and orientation of clusters with respect to flow characteristics, (3) to quantify the effects clusters have on the mean bedload rate, and (4) to assess the effects of clusters on the pulsating nature of bedload. In order to meet the objectives of this study, two main experimental scenarios, namely, Test Series A and B (20 experiments overall), are considered in a laboratory flume under well-controlled conditions. Series A tests are performed to address objectives (1) and (2), while Series B is designed to meet objectives (3) and (4). Results show that cluster microforms develop in uniform sediment at 1.25 to 2 times the Shields parameter of an individual particle and start disintegrating at about 2.25 times the Shields parameter. It is found that during an unsteady flow event, the effects of clusters on bedload transport rate can be classified in three different phases: a sink phase where clusters absorb incoming sediment, a neutral phase where clusters do not affect bedload, and a source phase where clusters release particles. Clusters also increase the magnitude of the fluctuations in bedload transport rate, showing that clusters amplify the unsteady nature of bedload transport. A fourth-order autoregressive integrated moving average model is employed to describe the time series of bedload and provide a formula for predicting bedload at different periods.
Finally, a change-point analysis enhanced with a binary segmentation procedure is performed to identify the abrupt changes in the bedload statistic characteristics due to the effects of clusters and detect the different phases in bedload time series using probability theory. The analysis verifies the experimental findings that three phases are detected in the bedload rate time series structure, namely, sink, neutral, and source. © ASCE / JUNE 2004.
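The change-point analysis with binary segmentation can be sketched as a recursive search for mean shifts. The squared-error cost function and the `min_gain` stopping threshold below are illustrative choices for this sketch, not the study's exact statistics:

```python
def best_split(ts):
    """Index minimizing summed within-segment squared error for one mean shift."""
    best_i, best_cost = None, float("inf")
    for i in range(2, len(ts) - 1):
        left, right = ts[:i], ts[i:]
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        cost = (sum((x - ml) ** 2 for x in left)
                + sum((x - mr) ** 2 for x in right))
        if cost < best_cost:
            best_i, best_cost = i, cost
    return best_i, best_cost

def binary_segmentation(ts, min_gain=1.0, offset=0):
    """Recursively split while one mean-shift split reduces squared error enough."""
    if len(ts) < 4:
        return []
    m = sum(ts) / len(ts)
    total = sum((x - m) ** 2 for x in ts)
    i, cost = best_split(ts)
    if total - cost < min_gain:
        return []
    return (binary_segmentation(ts[:i], min_gain, offset)
            + [offset + i]
            + binary_segmentation(ts[i:], min_gain, offset + i))

# Three constant regimes standing in for sink / source / neutral bedload phases.
series = [0.0] * 20 + [5.0] * 20 + [1.0] * 20
print(binary_segmentation(series))  # prints: [20, 40]
```

Each detected change point marks a candidate transition between phases of the bedload rate series.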
IDSP- INTERACTIVE DIGITAL SIGNAL PROCESSOR
NASA Technical Reports Server (NTRS)
Mish, W. H.
1994-01-01
The Interactive Digital Signal Processor, IDSP, consists of a set of time series analysis "operators" based on the various algorithms commonly used for digital signal analysis work. The processing of a digital time series to extract information is usually achieved by the application of a number of fairly standard operations. However, it is often desirable to "experiment" with various operations and combinations of operations to explore their effect on the results. IDSP is designed to provide an interactive and easy-to-use system for this type of digital time series analysis. The IDSP operators can be applied in any sensible order (even recursively), and can be applied to single time series or to simultaneous time series. IDSP is being used extensively to process data obtained from scientific instruments onboard spacecraft. It is also an excellent teaching tool for demonstrating the application of time series operators to artificially-generated signals. IDSP currently includes over 43 standard operators. Processing operators provide for Fourier transformation operations, design and application of digital filters, and Eigenvalue analysis. Additional support operators provide for data editing, display of information, graphical output, and batch operation. User-developed operators can be easily interfaced with the system to provide for expansion and experimentation. Each operator application generates one or more output files from an input file. The processing of a file can involve many operators in a complex application. IDSP maintains historical information as an integral part of each file so that the user can display the operator history of the file at any time during an interactive analysis. IDSP is written in VAX FORTRAN 77 for interactive or batch execution and has been implemented on a DEC VAX-11/780 operating under VMS. The IDSP system generates graphics output for a variety of graphics systems. 
The program requires the use of Versaplot and Template plotting routines and IMSL Math/Library routines. These software packages are not included in IDSP. The virtual memory requirement for the program is approximately 2.36 MB. The IDSP system was developed in 1982 and was last updated in 1986. Versaplot is a registered trademark of Versatec Inc. Template is a registered trademark of Template Graphics Software Inc. IMSL Math/Library is a registered trademark of IMSL Inc.
Bernard, Helen; Werber, Dirk; Höhle, Michael
2014-03-01
Laboratory-confirmed norovirus illness has been reportable in Germany since 2001. Reported case numbers are known to be undercounts, and a valid estimate of the actual incidence in Germany does not exist. An increase of reported norovirus illness was observed simultaneously with a large outbreak of Shiga toxin-producing E. coli O104:H4 in Germany in 2011, likely due to enhanced (but not complete) awareness of diarrhoea at that time. We aimed at estimating age- and sex-specific factors of that excess, which should be interpretable as (minimal) under-reporting factors of norovirus illness in Germany. We used national reporting data on laboratory-confirmed norovirus illness in Germany from calendar week 31 in 2003 through calendar week 30 in 2012. A negative binomial time series regression model was used to describe the weekly counts in 8×2 age-sex strata while adjusting for secular trend and seasonality. Overall as well as age- and sex-specific factors for the excess were estimated by including additional terms (either an O104:H4 outbreak period indicator or a triple interaction term between outbreak period, age and sex) in the model. We estimated the overall under-reporting factor to be 1.76 (95% CI 1.28-2.41) for the first three weeks of the outbreak, before the outbreak vehicle was publicly communicated. The highest under-reporting factors were estimated for 20-29 year-old males (2.88, 95% CI 2.01-4.11) and females (2.67, 95% CI 1.87-3.79). Under-reporting was substantially lower in persons aged <10 years and 70 years or older. These are the first estimates of (minimal) under-reporting factors for norovirus illness in Germany. They provide a starting point for a more detailed investigation of the relationship between actual incidence and reported incidence of norovirus illness in Germany.
Marwaha, Puneeta; Sunkaria, Ramesh Kumar
2016-09-01
The sample entropy (SampEn) has been widely used to quantify the complexity of RR-interval time series. It is a fact that higher complexity, and hence higher entropy, is associated with the RR-interval time series of healthy subjects. However, SampEn suffers from the disadvantage that it assigns higher entropy to randomized surrogate time series as well as to certain pathological time series, which is a misleading observation. This wrong estimation of the complexity of a time series may be due to the fact that the existing SampEn technique updates the threshold value as a function of the long-term standard deviation (SD) of a time series. However, time series of certain pathologies exhibit substantial variability in beat-to-beat fluctuations. So the SD of the first-order difference (short-term SD) of the time series should be considered when updating the threshold value, to account for period-to-period variations inherent in a time series. In the present work, improved sample entropy (I-SampEn), a new methodology, has been proposed in which the threshold value is updated by considering the period-to-period variations of a time series. The I-SampEn technique assigns higher entropy values to age-matched healthy subjects than to patients suffering from atrial fibrillation (AF) and diabetes mellitus (DM). Our results are in agreement with the theory of reduced complexity of the RR-interval time series in patients suffering from chronic cardiovascular and non-cardiovascular diseases.
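The threshold modification can be sketched directly. The following is our reading of the idea, not the authors' code: the tolerance is r times either the long-term SD (classical SampEn) or the SD of the first differences (the I-SampEn variant).

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2, short_term=False):
    """SampEn with tolerance r*SD. If short_term, the SD of the first
    differences is used instead of the long-term SD (the I-SampEn idea)."""
    x = np.asarray(x, dtype=float)
    sd = np.std(np.diff(x)) if short_term else np.std(x)
    tol = r * sd
    n = len(x)

    def match_count(mm):
        # count template pairs of length mm within tolerance (Chebyshev distance)
        templ = np.array([x[i:i + mm] for i in range(n - mm)])
        total = 0
        for i in range(len(templ) - 1):
            dist = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
            total += int(np.sum(dist <= tol))
        return total

    b, a = match_count(m), match_count(m + 1)
    return -np.log(a / b) if a and b else float("inf")

# illustrative signals: a regular (low-entropy) and an irregular (high-entropy) one
rng = np.random.default_rng(0)
sine = np.sin(np.linspace(0, 20 * np.pi, 500))
noise = rng.standard_normal(500)
se_sine, se_noise = sample_entropy(sine), sample_entropy(noise)
```

As expected, the regular signal receives a much lower entropy than the random one; the `short_term` flag only changes how the tolerance is set, which is the core of the proposed modification.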
Inductive Approaches to Improving Diagnosis and Design for Diagnosability
NASA Technical Reports Server (NTRS)
Fisher, Douglas H. (Principal Investigator)
1995-01-01
The first research area under this grant addresses the problem of classifying time series according to their morphological features in the time domain. A supervised learning system called CALCHAS, which induces a classification procedure for signatures from preclassified examples, was developed. For each of several signature classes, the system infers a model that captures the class's morphological features using Bayesian model induction and the minimum message length approach to assign priors. After induction, a time series (signature) is classified in one of the classes when there is enough evidence to support that decision. Time series with sufficiently novel features, belonging to classes not present in the training set, are recognized as such. A second area of research assumes two sources of information about a system: a model or domain theory that encodes aspects of the system under study and data from actual system operations over time. A model, when it exists, represents strong prior expectations about how a system will perform. Our work with a diagnostic model of the RCS (Reaction Control System) of the Space Shuttle motivated the development of SIG, a system which combines information from a model (or domain theory) and data. As it tracks RCS behavior, the model computes quantitative and qualitative values. Induction is then performed over the data represented by both the 'raw' features and the model-computed high-level features. Finally, work on clustering for operating mode discovery motivated some important extensions to the clustering strategy we had used. One modification appends an iterative optimization technique onto the clustering system; this optimization strategy appears to be novel in the clustering literature. A second modification improves the noise tolerance of the clustering system. 
In particular, we adapt resampling-based pruning strategies used by supervised learning systems to the task of simplifying hierarchical clusterings, thus making post-clustering analysis easier.
NASA Astrophysics Data System (ADS)
Adegoke, Oluwashina; Dhang, Prasun; Mukhopadhyay, Banibrata; Ramadevi, M. C.; Bhattacharya, Debbijoy
2018-05-01
By analysing the time series of RXTE/PCA data, the non-linear variabilities of compact sources have been repeatedly established. Depending on the variation in temporal classes, compact sources exhibit different non-linear features. Sometimes they show low correlation/fractal dimension, but in other classes or intervals of time they exhibit stochastic nature. This could be because the accretion flow around a compact object is a non-linear general relativistic system involving magnetohydrodynamics. However, the more conventional way of addressing a compact source is the analysis of its spectral state. Therefore, the question arises: What is the connection of non-linearity to the underlying spectral properties of the flow when the non-linear properties are related to the associated transport mechanisms describing the geometry of the flow? This work is aimed at addressing this question. Based on the connection between observed spectral and non-linear (time series) properties of two X-ray binaries: GRS 1915+105 and Sco X-1, we attempt to diagnose the underlying accretion modes of the sources in terms of known accretion classes, namely, Keplerian disc, slim disc, advection dominated accretion flow and general advective accretion flow. We explore the possible transition of the sources from one accretion mode to others with time. We further argue that the accretion rate must play an important role in transition between these modes.
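The correlation (fractal) dimension mentioned above is conventionally estimated with a Grassberger-Procaccia-style correlation integral over a delay embedding. A generic sketch, not the authors' pipeline; embedding dimension, delay, and scale range are illustrative choices:

```python
import numpy as np

def correlation_dimension(x, m=3, tau=1):
    """Estimate the correlation dimension of series x via the slope of
    log C(r) vs log r for an m-dimensional delay embedding."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (m - 1) * tau
    emb = np.array([x[i:i + n] for i in range(0, m * tau, tau)]).T  # (n, m)

    # pairwise Euclidean distances between embedded points
    d = np.sqrt(((emb[:, None, :] - emb[None, :, :]) ** 2).sum(-1))
    dists = d[np.triu_indices(n, k=1)]

    # scaling range between the 5th and 50th percentile of distances
    rvals = np.logspace(np.log10(np.percentile(dists, 5)),
                        np.log10(np.percentile(dists, 50)), 10)
    C = np.array([np.mean(dists < r) for r in rvals])  # correlation integral
    return np.polyfit(np.log(rvals), np.log(C), 1)[0]

rng = np.random.default_rng(1)
dim_sine = correlation_dimension(np.sin(np.linspace(0, 20 * np.pi, 500)))
dim_noise = correlation_dimension(rng.standard_normal(500))
```

A low-dimensional deterministic signal (the sine, a closed curve in embedding space) yields a much smaller estimate than stochastic noise, which fills the embedding space; this contrast is the kind of diagnostic used to separate low-dimensional from stochastic temporal classes.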
Eliciting interval beliefs: An experimental study
Peeters, Ronald; Wolk, Leonard
2017-01-01
In this paper we study the interval scoring rule as a mechanism to elicit subjective beliefs under varying degrees of uncertainty. In our experiment, subjects forecast the termination time of a time series to be generated from a given but unknown stochastic process. Subjects gradually learn more about the underlying process over time and hence the true distribution over termination times. We conduct two treatments, one with a high and one with a low volatility process. We find that elicited intervals are better when subjects are facing a low volatility process. In this treatment, participants learn to position their intervals almost optimally over the course of the experiment. This is in contrast with the high volatility treatment, where subjects, over the course of the experiment, learn to optimize the location of their intervals but fail to provide the optimal length. PMID:28380020
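The abstract does not reproduce the interval scoring rule itself; one standard choice for scoring a reported interval [l, u] against a realized outcome y is the Winkler interval score, sketched here as an assumption about the general mechanism, not as the paper's exact payoff function:

```python
def interval_score(lower, upper, y, alpha=0.1):
    """Winkler interval score for a central (1 - alpha) interval:
    interval width, plus a penalty proportional to 2/alpha when the
    realized value y falls outside the interval. Lower is better."""
    score = upper - lower
    if y < lower:
        score += (2.0 / alpha) * (lower - y)
    elif y > upper:
        score += (2.0 / alpha) * (y - upper)
    return score
```

The rule rewards exactly the two margins the experiment examines: location (avoiding the out-of-interval penalty) and length (keeping the width term small), so a subject who learns location but not length, as in the high-volatility treatment, still scores poorly.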
Neutron star dynamics under time-dependent external torques
NASA Astrophysics Data System (ADS)
Gügercinoǧlu, Erbil; Alpar, M. Ali
2017-11-01
The two-component model describes neutron star dynamics incorporating the response of the superfluid interior. Conventional solutions and applications involve constant external torques, as appropriate for radio pulsars on dynamical time-scales. We present the general solution of two-component dynamics under arbitrary time-dependent external torques, with internal torques that are linear in the rotation rates, or with the extremely non-linear internal torques due to vortex creep. The two-component model incorporating the response of linear or non-linear internal torques can now be applied not only to radio pulsars but also to magnetars and to neutron stars in binary systems, with strong observed variability and noise in the spin-down or spin-up rates. Our results allow the extraction of the time-dependent external torques from the observed spin-down (or spin-up) time series, Ω̇(t). Applications are discussed.
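For the linear-coupling case, the two-component equations can be integrated numerically in a few lines. This is a generic sketch of the standard model (crust plus superfluid coupled on a relaxation time τ), with illustrative parameter values, not the paper's calculation:

```python
import numpy as np
from scipy.integrate import solve_ivp

I_c, I_s, tau = 1.0, 1.0, 1.0    # crust/superfluid moments of inertia, coupling time
N_ext = -1.0e-3                   # constant external spin-down torque (illustrative)

def rhs(t, w):
    omega_c, omega_s = w
    lag = omega_s - omega_c
    # crust: external torque plus linear internal torque from the superfluid
    domega_c = N_ext / I_c + (I_s / (I_c * tau)) * lag
    # superfluid: relaxes toward the crust rotation rate on time-scale tau
    domega_s = -lag / tau
    return [domega_c, domega_s]

sol = solve_ivp(rhs, (0.0, 50.0), [10.0, 10.0], rtol=1e-10, atol=1e-12)
omega_c_end, omega_s_end = sol.y[:, -1]

# in steady state both components spin down at N_ext / (I_c + I_s)
spin_down_end = N_ext / I_c + (I_s / (I_c * tau)) * (omega_s_end - omega_c_end)
```

Note that total angular momentum obeys I_c Ω̇_c + I_s Ω̇_s = N_ext exactly, so after the transient the observed crust derivative equals N_ext/(I_c + I_s); inverting this relation for an arbitrary N(t) is what extracting the external torque from an observed Ω̇(t) series amounts to.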
The changing profile of disability in the U.S. Army: 1981-2005.
Bell, Nicole S; Schwartz, Carolyn E; Harford, Thomas; Hollander, Ilyssa E; Amoroso, Paul J
2008-01-01
We sought to provide a profile of U.S. Army soldiers discharged with a permanent disability and to clarify whether underlying demographic changes explain increasing risks. Frequency distributions and logistic regression analyses describe active-duty Army soldiers discharged with a disability (January 1981 through December 2005; N = 108,119). Time-series analysis describes temporal changes in demographic factors associated with disability. Disability risk has increased 7-fold over the past 25 years. In 2005, there were 1,262 disability discharges per 100,000 active-duty soldiers. Risk factors include female gender, lower rank, married or formerly married, high school education or less, and age 40 or younger. Army population demographics changed during this time; the average age and tenure of soldiers increased, and the proportion of soldiers who were officers, women, and college educated grew. Adjusting for these demographic changes did not explain the rapidly increasing risk of disability. Time-series models revealed that disability among women is increasing independently of the increasing number of women in the Army; disability is also increasing at a faster pace for younger, lower-ranked, enlisted, and shorter-tenured soldiers. Disability is costly and growing in the Army. Temporal changes in underlying Army population demographics do not explain overall disability increases. Disability is increasing most rapidly among female, junior enlisted, and younger soldiers.
NASA Astrophysics Data System (ADS)
Charakopoulos, A. K.; Katsouli, G. A.; Karakasidis, T. E.
2018-04-01
Understanding the underlying processes and extracting detailed characteristics of the spatiotemporal dynamics of the ocean and atmosphere, as well as their interaction, is of significant interest and has not been thoroughly established. The purpose of this study was to examine the performance of two complementary methodologies for identifying underlying spatiotemporal dynamic characteristics and patterns among atmospheric and oceanic variables from Seawatch buoys in the Aegean and Ionian Seas, provided by the Hellenic Centre for Marine Research (HCMR). The first approach involves cross-correlation analysis to investigate time-lagged relationships; to identify the direction of interactions between the variables, we then applied the Granger causality method. In the second approach, the time series are converted into complex networks, and the main topological network properties, such as degree distribution, average path length, diameter, modularity and clustering coefficient, are evaluated. Our results show that the proposed complex network analysis of time series can extract hidden spatiotemporal characteristics. Our findings also indicate a high level of positive and negative correlations and causalities among variables, both from the same buoy and between buoys from different stations, which cannot be determined from simple statistical measures.
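The abstract does not state which time-series-to-network mapping is used; one widely used choice, the natural visibility graph, can be sketched in a few lines as a generic illustration (nodes are time points, and two points are linked if the straight line between them passes above every intermediate point):

```python
from collections import Counter

def visibility_edges(y):
    """Edge list of the natural visibility graph of series y:
    (i, j) is an edge iff every intermediate point lies strictly
    below the line segment joining (i, y[i]) and (j, y[j])."""
    n = len(y)
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            if all(y[k] < y[i] + (y[j] - y[i]) * (k - i) / (j - i)
                   for k in range(i + 1, j)):
                edges.append((i, j))
    return edges

def degree_distribution(edges, n):
    """Node degrees, the first topological property evaluated."""
    deg = Counter()
    for i, j in edges:
        deg[i] += 1
        deg[j] += 1
    return [deg.get(k, 0) for k in range(n)]
```

From the resulting edge list, the remaining topological properties (average path length, diameter, modularity, clustering coefficient) can be computed with any graph library; consecutive points are always mutually visible, so the graph is connected by construction.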