A Radial Basis Function Approach to Financial Time Series Analysis
1993-12-01
Massachusetts Institute of Technology Artificial Intelligence Laboratory, Technical Report AI-TR 1457, 545 Technology Square, Cambridge, Massachusetts 02139. By James M. Hutchinson, Master of Science in EECS, Massachusetts Institute of Technology (1986). Submitted for the degree of Doctor of Philosophy at the Massachusetts Institute of Technology, February 1994.
A Multiscale Approach to InSAR Time Series Analysis
NASA Astrophysics Data System (ADS)
Hetland, E. A.; Muse, P.; Simons, M.; Lin, N.; Dicaprio, C. J.
2010-12-01
We present a technique to constrain time-dependent deformation from repeated satellite-based InSAR observations of a given region. This approach, which we call MInTS (Multiscale InSAR Time Series analysis), relies on a spatial wavelet decomposition to permit the inclusion of distance-based spatial correlations in the observations while maintaining computational tractability. As opposed to single-pixel InSAR time series techniques, MInTS takes advantage of both spatial and temporal characteristics of the deformation field. We use a weighting scheme which accounts for the presence of localized holes due to decorrelation or unwrapping errors in any given interferogram. We represent time-dependent deformation using a dictionary of general basis functions, capable of detecting both steady and transient processes. The estimation is regularized using a model resolution based smoothing so as to be able to capture rapid deformation where there are temporally dense radar acquisitions and to avoid oscillations during time periods devoid of acquisitions. MInTS also has the flexibility to explicitly parametrize known time-dependent processes that are expected to contribute to a given set of observations (e.g., co-seismic steps and post-seismic transients, secular variations, seasonal oscillations, etc.). We use cross validation to choose the regularization penalty parameter in the inversion for the time-dependent deformation field. We demonstrate MInTS using a set of 63 ERS-1/2 and 29 Envisat interferograms for Long Valley Caldera.
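The core estimation step described above (fitting a dictionary of temporal basis functions to a displacement time series under a smoothing penalty, with the penalty weight chosen by cross validation) can be sketched in miniature. The basis choice, knot spacing, and simple Tikhonov damping below are illustrative stand-ins for the model-resolution-based regularization, and the data are synthetic.

```python
import numpy as np

# Hypothetical sketch: fit a dictionary of temporal basis functions (a secular
# rate plus Gaussian "bumps") to one noisy displacement time series, with a
# Tikhonov damping term standing in for MInTS's model-resolution-based
# regularization. All names and parameter choices here are illustrative.

def design_matrix(t, knots, width):
    """Columns: constant, secular rate, and Gaussian bump functions."""
    cols = [np.ones_like(t), t]
    for k in knots:
        cols.append(np.exp(-0.5 * ((t - k) / width) ** 2))
    return np.column_stack(cols)

def fit_regularized(t, d, knots, width, lam):
    """Damped least squares: minimize ||G m - d||^2 + lam ||m||^2."""
    G = design_matrix(t, knots, width)
    m = np.linalg.solve(G.T @ G + lam * np.eye(G.shape[1]), G.T @ d)
    return m, G

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 60)             # acquisition times (years)
truth = 2.0 * t + np.sin(2 * np.pi * t / 5.0)
d = truth + rng.normal(0.0, 0.3, t.size)   # noisy "displacements"

m, G = fit_regularized(t, d, knots=np.arange(1.0, 10.0), width=1.0, lam=1e-2)
rms = np.sqrt(np.mean((G @ m - d) ** 2))
```

In a full MInTS-style workflow this fit would be repeated per wavelet coefficient rather than per pixel, and `lam` would be selected by cross validation instead of fixed.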
A multiscale approach to InSAR time series analysis
NASA Astrophysics Data System (ADS)
Simons, M.; Hetland, E. A.; Muse, P.; Lin, Y. N.; Dicaprio, C.; Rickerby, A.
2008-12-01
We describe a new technique to constrain time-dependent deformation from repeated satellite-based InSAR observations of a given region. This approach, which we call MInTS (Multiscale analysis of InSAR Time Series), relies on a spatial wavelet decomposition to permit the inclusion of distance-based spatial correlations in the observations while maintaining computational tractability. This approach also permits a consistent treatment of all data independent of the presence of localized holes in any given interferogram. In essence, MInTS allows one to consider all data at the same time (as opposed to one pixel at a time), thereby taking advantage of both spatial and temporal characteristics of the deformation field. In terms of the temporal representation, we have the flexibility to explicitly parametrize known processes that are expected to contribute to a given set of observations (e.g., co-seismic steps and post-seismic transients, secular variations, seasonal oscillations, etc.). Our approach also allows for the temporal parametrization to include a set of general functions (e.g., splines) in order to account for unexpected processes. We allow for various forms of model regularization using a cross-validation approach to select penalty parameters. The multiscale analysis allows us to consider various contributions (e.g., orbit errors) that may affect specific scales but not others. The methods described here are all embarrassingly parallel and suitable for implementation on a cluster computer. We demonstrate the use of MInTS using a large suite of ERS-1/2 and Envisat interferograms for Long Valley Caldera, and validate our results by comparing with ground-based observations.
A Multiscale Approach to InSAR Time Series Analysis
NASA Astrophysics Data System (ADS)
Simons, M.; Hetland, E. A.; Muse, P.; Lin, Y.; Dicaprio, C. J.
2009-12-01
We describe progress in the development of MInTS (Multiscale analysis of InSAR Time Series), an approach to constructing self-consistent time-dependent deformation estimates from repeated satellite-based InSAR images of a given region. MInTS relies on a spatial wavelet decomposition to permit the inclusion of distance-based spatial correlations in the observations while maintaining computational tractability. In essence, MInTS allows one to consider all data at the same time as opposed to one pixel at a time, thereby taking advantage of both spatial and temporal characteristics of the deformation field. This approach also permits a consistent treatment of all data independent of the presence of localized holes due to unwrapping issues in any given interferogram. Specifically, the presence of holes is accounted for through a weighting scheme that accounts for the extent of actual data versus the area of holes associated with any given wavelet. In terms of the temporal representation, we have the flexibility to explicitly parametrize known processes that are expected to contribute to a given set of observations (e.g., co-seismic steps and post-seismic transients, secular variations, seasonal oscillations, etc.). Our approach also allows for the temporal parametrization to include a set of general functions in order to account for unexpected processes. We allow for various forms of model regularization using a cross-validation approach to select penalty parameters. We also experiment with the use of sparsity inducing regularization as a way to select from a large dictionary of time functions. The multiscale analysis allows us to consider various contributions (e.g., orbit errors) that may affect specific scales but not others. The methods described here are all embarrassingly parallel and suitable for implementation on a cluster computer. We demonstrate the use of MInTS using a large suite of ERS-1/2 and Envisat interferograms for Long Valley Caldera, and validate our results by comparing with ground-based observations.
An entropic approach to the analysis of time series
NASA Astrophysics Data System (ADS)
Scafetta, Nicola
This work concerns the statistical analysis of time series. With compelling arguments we show that the Diffusion Entropy Analysis (DEA) is the only method in the literature of the Science of Complexity that correctly determines the scaling hidden within a time series reflecting a Complex Process. The time series is thought of as a source of fluctuations, and the DEA is based on the Shannon entropy of the diffusion process generated by these fluctuations. All traditional methods of scaling analysis, instead, are based on the variance of this diffusion process. The variance methods detect the real scaling only if the Gaussian assumption holds true. We call H the scaling exponent detected by the variance methods and delta the real scaling exponent. If the time series is characterized by Fractional Brownian Motion, we have H = delta and the scaling can be safely determined, in this case, by using the variance methods. If, on the contrary, the time series is characterized, for example, by Levy statistics, H ≠ delta and the variance methods cannot be used to detect the true scaling. The Levy walk yields the relation delta = 1/(3 - 2H). In the case of Levy flights, the variance diverges and the exponent H cannot be determined, whereas the scaling delta exists and can be established by using the DEA. Therefore, only the joint use of two different scaling analysis methods, the variance scaling analysis and the DEA, can assess the real nature, Gauss or Levy or something else, of a time series. Moreover, the DEA determines the information content, under the form of Shannon entropy, or of any other convenient entropic indicator, at each time step of the process that, given a sufficiently large number of data, is expected to become diffusion with scaling. This makes it possible to study the regime of transition from dynamics to thermodynamics, non-stationary regimes, and the saturation regime as well. First of all, the efficiency of the DEA is proved with theoretical arguments and with numerical work.
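The DEA procedure summarized above can be sketched directly: build diffusion trajectories from the fluctuations, estimate the Shannon entropy of the displacement distribution at each time scale, and read the scaling exponent delta off the slope of S(t) versus ln t. For uncorrelated Gaussian fluctuations the slope should recover delta = 0.5 (the case where H = delta); the window sizes and bin count below are arbitrary choices.

```python
import numpy as np

def diffusion_entropy(xi, window_sizes, bins=50):
    """Shannon entropy S(t) of the diffusion process built from fluctuations xi."""
    x = np.cumsum(xi)                  # diffusion trajectory from the fluctuations
    S = []
    for t in window_sizes:
        disp = x[t:] - x[:-t]          # displacement after t steps, all start points
        p, edges = np.histogram(disp, bins=bins, density=True)
        dx = edges[1] - edges[0]
        p = p[p > 0]
        S.append(-np.sum(p * np.log(p)) * dx)   # histogram estimate of entropy
    return np.array(S)

rng = np.random.default_rng(1)
xi = rng.normal(size=200_000)              # uncorrelated Gaussian fluctuations
ts = np.array([10, 20, 40, 80, 160, 320])
S = diffusion_entropy(xi, ts)
delta = np.polyfit(np.log(ts), S, 1)[0]    # slope of S vs ln t: scaling exponent
```

For Levy-type fluctuations the same slope would depart from the variance-based exponent H, which is the diagnostic contrast the abstract emphasizes.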
The promise of the state space approach to time series analysis for nursing research.
Levy, Janet A; Elser, Heather E; Knobel, Robin B
2012-01-01
Nursing research, particularly related to physiological development, often depends on the collection of time series data. The state space approach to time series analysis has great potential to answer exploratory questions relevant to physiological development but has not been used extensively in nursing. The aim of the study was to introduce the state space approach to time series analysis and demonstrate potential applicability to neonatal monitoring and physiology. We present a set of univariate state space models; each one describing a process that generates a variable of interest over time. Each model is presented algebraically and a realization of the process is presented graphically from simulated data. This is followed by a discussion of how the model has been or may be used in two nursing projects on neonatal physiological development. The defining feature of the state space approach is the decomposition of the series into components that are functions of time; specifically, slowly varying level, faster varying periodic, and irregular components. State space models potentially simulate developmental processes where a phenomenon emerges and disappears before stabilizing, where the periodic component may become more regular with time, or where the developmental trajectory of a phenomenon is irregular. The ultimate contribution of this approach to nursing science will require close collaboration and cross-disciplinary education between nurses and statisticians.
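A minimal simulation of the decomposition described above (a slowly varying level, a faster periodic component, and an irregular component) might look like the following. All parameters are invented and merely mimic a physiological-looking series such as a neonatal vital sign.

```python
import numpy as np

# Illustrative realization of a univariate structural state space model:
# y_t = level_t + seasonal_t + irregular_t, where the level follows a random
# walk and the periodic component is a fixed-frequency sinusoid. The baseline,
# noise scales, and period are hypothetical.

rng = np.random.default_rng(2)
n = 500
level = 120.0 + np.cumsum(rng.normal(0.0, 0.05, n))    # slowly varying level
seasonal = 5.0 * np.sin(2 * np.pi * np.arange(n) / 60.0)  # periodic component
irregular = rng.normal(0.0, 1.0, n)                    # irregular component
y = level + seasonal + irregular
```

State space software would run this construction in reverse, filtering the observed `y` to recover the unobserved components.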
NASA Astrophysics Data System (ADS)
Donner, R. V.; Zou, Y.; Donges, J. F.; Marwan, N.; Kurths, J.
2009-12-01
We present a new approach for analysing structural properties of time series from complex systems. Starting from the concept of recurrences in phase space, the recurrence matrix of a time series is interpreted as the adjacency matrix of an associated complex network which links different points in time if the evolution of the considered states is very similar. A critical comparison of these recurrence networks with similar existing techniques is presented, revealing strong conceptual benefits of the new approach, which can be considered as a unifying framework for transforming time series into complex networks that also includes other methods as special cases. Based on different model systems, we demonstrate that there are fundamental interrelationships between the topological properties of recurrence networks and the statistical properties of the phase space density of the underlying dynamical system. Hence, the network description yields new quantitative characteristics of the dynamical complexity of a time series, which substantially complement existing measures of recurrence quantification analysis. Finally, we illustrate the potential of our approach for detecting hidden dynamical transitions from geoscientific time series by applying it to different paleoclimate records. In particular, we are able to resolve previously unknown climatic regime shifts in East Africa during approximately the last 4 million years, which might have had a considerable influence on the evolution of hominids in the area.
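The construction at the heart of this approach, thresholding pairwise distances between delay-embedded states to obtain a recurrence matrix and then reading it as the adjacency matrix of an undirected network, can be sketched as below. The embedding dimension, delay, and threshold `eps` are illustrative choices.

```python
import numpy as np

def recurrence_network(x, eps, dim=2, tau=1):
    """Adjacency matrix of the eps-recurrence network of a delay embedding."""
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    A = (dists < eps).astype(int)
    np.fill_diagonal(A, 0)       # exclude self-recurrences (no self-loops)
    return A

rng = np.random.default_rng(3)
x = np.sin(np.linspace(0, 20 * np.pi, 400)) + 0.05 * rng.normal(size=400)
A = recurrence_network(x, eps=0.2)
degree = A.sum(axis=1)                                  # local phase-space density
density = A.sum() / (A.shape[0] * (A.shape[0] - 1))     # global edge density
```

Network measures such as degree, clustering, or betweenness computed on `A` then play the role of the dynamical-complexity characteristics the abstract describes.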
Li, Cheng; Ding, Guang-Hong; Wu, Guo-Qiang; Poon, Chi-Sang
2009-01-01
A wide variety of methods based on fractal, entropic or chaotic approaches have been applied to the analysis of complex physiological time series. In this paper, we show that fractal and entropy measures are poor indicators of nonlinearity for gait data and heart rate variability data. In contrast, the noise titration method based on Volterra autoregressive modeling represents the most reliable currently available method for testing nonlinear determinism and chaotic dynamics in the presence of measurement noise and dynamic noise.
A Space Affine Matching Approach to fMRI Time Series Analysis.
Chen, Liang; Zhang, Weishi; Liu, Hongbo; Feng, Shigang; Chen, C L Philip; Wang, Huili
2016-07-01
For fMRI time series analysis, an important challenge is to overcome the potential delay between the hemodynamic response signal and the cognitive stimuli signal, namely the same frequency but different phase (SFDP) problem. In this paper, a novel space affine matching feature is presented by introducing time domain and frequency domain features. The time domain feature is used to discern different stimuli, while the frequency domain feature eliminates the delay. We then propose a space affine matching (SAM) algorithm to match fMRI time series by our affine feature, in which a normal vector is estimated using gradient descent to match the time series optimally. The experimental results illustrate that the SAM algorithm is insensitive to the delay between the hemodynamic response signal and the cognitive stimuli signal. Our approach significantly outperforms the GLM method when such a delay exists. The approach can help us solve the SFDP problem in fMRI time series matching and is thus of great promise for revealing brain dynamics.
Time series analysis of infrared satellite data for detecting thermal anomalies: a hybrid approach
NASA Astrophysics Data System (ADS)
Koeppen, W. C.; Pilger, E.; Wright, R.
2011-07-01
We developed and tested an automated algorithm that analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes. Our algorithm enhances the previously developed MODVOLC approach, a simple point operation, by adding a more complex time series component based on the methods of the Robust Satellite Techniques (RST) algorithm. Using test sites at Anatahan and Kīlauea volcanoes, the hybrid time series approach detected ~15% more thermal anomalies than MODVOLC with very few, if any, known false detections. We also tested gas flares in the Cantarell oil field in the Gulf of Mexico as an end-member scenario representing very persistent thermal anomalies. At Cantarell, the hybrid algorithm showed only a slight improvement, but it did identify flares that were undetected by MODVOLC. We estimate that at least 80 MODIS images for each calendar month are required to create good reference images necessary for the time series analysis of the hybrid algorithm. The improved performance of the new algorithm over MODVOLC will result in the detection of low temperature thermal anomalies that will be useful in improving our ability to document Earth's volcanic eruptions, as well as detecting low temperature thermal precursors to larger eruptions.
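A toy contrast between the two detection styles discussed above, a fixed point-operation threshold versus a per-pixel reference-image statistic in the spirit of RST, might look like this. The temperatures, thresholds, and array sizes are invented and do not reproduce the published MODVOLC or RST algorithms.

```python
import numpy as np

# Hedged sketch: build per-pixel reference statistics from an archive of
# images (the "reference images" the abstract says require ~80 scenes per
# month), then flag pixels whose new value is anomalous relative to that
# history. Values are synthetic brightness temperatures in kelvin.

rng = np.random.default_rng(4)
archive = rng.normal(290.0, 2.0, size=(80, 16, 16))   # 80 reference images
scene = rng.normal(290.0, 2.0, size=(16, 16))         # new acquisition
scene[8, 8] = 305.0                                   # injected hot anomaly

mu = archive.mean(axis=0)
sigma = archive.std(axis=0)
z = (scene - mu) / sigma              # per-pixel anomaly index (RST-like)
ts_detect = z > 4.0                   # time-series detection
point_detect = scene > 300.0          # fixed-threshold detection (MODVOLC-like)
```

The time-series test adapts to each pixel's own history, which is why it can flag low-temperature anomalies a single global threshold would miss.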
Statistical Analysis of fMRI Time-Series: A Critical Review of the GLM Approach.
Monti, Martin M
2011-01-01
Functional magnetic resonance imaging (fMRI) is one of the most widely used tools to study the neural underpinnings of human cognition. Standard analysis of fMRI data relies on a general linear model (GLM) approach to separate stimulus induced signals from noise. Crucially, this approach relies on a number of assumptions about the data which, for inferences to be valid, must be met. The current paper reviews the GLM approach to analysis of fMRI time-series, focusing in particular on the degree to which such data abides by the assumptions of the GLM framework, and on the methods that have been developed to correct for any violation of those assumptions. Rather than biasing estimates of effect size, the major consequence of non-conformity to the assumptions is to introduce bias into estimates of the variance, thus affecting test statistics, power, and false positive rates. Furthermore, this bias can have pervasive effects on both individual subject and group-level statistics, potentially yielding qualitatively different results across replications, especially after the thresholding procedures commonly used for inference-making.
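For a single voxel, the GLM fit under discussion reduces to ordinary least squares on a design matrix, together with the residual variance estimate whose validity the review scrutinizes. A minimal synthetic sketch follows (no HRF convolution or prewhitening, both of which real fMRI packages apply; all values are invented).

```python
import numpy as np

# Single-voxel GLM: y = X beta + noise, with a boxcar task regressor.
# The variance estimate at the end is unbiased only under i.i.d. errors,
# the assumption whose violation (autocorrelation) the review discusses.

rng = np.random.default_rng(5)
n = 200
boxcar = (np.arange(n) // 20) % 2             # alternating 20-scan on/off blocks
X = np.column_stack([np.ones(n), boxcar])     # design: intercept + task
beta_true = np.array([100.0, 2.0])
y = X @ beta_true + rng.normal(0.0, 1.0, n)   # white noise for this sketch

beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])     # residual variance estimate
```

With autocorrelated noise in place of the white noise above, `beta` would stay unbiased but `sigma2`, and hence any t-statistic built from it, would be biased, which is the review's central point.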
Onisko, Agnieszka; Druzdzel, Marek J; Austin, R Marshall
2016-01-01
Classical statistics is a well-established approach in the analysis of medical data. While the medical community seems to be familiar with the concept of a statistical analysis and its interpretation, the Bayesian approach, argued by many of its proponents to be superior to the classical frequentist approach, is still not well-recognized in the analysis of medical data. The goal of this study is to encourage data analysts to use the Bayesian approach, such as modeling with graphical probabilistic networks, as an insightful alternative to classical statistical analysis of medical data. This paper offers a comparison of two approaches to analysis of medical time series data: (1) the classical statistical approach, such as the Kaplan-Meier estimator and the Cox proportional hazards regression model, and (2) dynamic Bayesian network modeling. Our comparison is based on time series cervical cancer screening data collected at Magee-Womens Hospital, University of Pittsburgh Medical Center over 10 years. The main outcomes of our comparison are cervical cancer risk assessments produced by the three approaches. However, our analysis also discusses several aspects of the comparison, such as modeling assumptions, model building, dealing with incomplete data, individualized risk assessment, results interpretation, and model validation. Our study shows that the Bayesian approach (1) is much more flexible in terms of modeling effort, and (2) offers an individualized risk assessment, which is more cumbersome for classical statistical approaches.
Onisko, Agnieszka; Druzdzel, Marek J.; Austin, R. Marshall
2016-01-01
Background: Classical statistics is a well-established approach in the analysis of medical data. While the medical community seems to be familiar with the concept of a statistical analysis and its interpretation, the Bayesian approach, argued by many of its proponents to be superior to the classical frequentist approach, is still not well-recognized in the analysis of medical data. Aim: The goal of this study is to encourage data analysts to use the Bayesian approach, such as modeling with graphical probabilistic networks, as an insightful alternative to classical statistical analysis of medical data. Materials and Methods: This paper offers a comparison of two approaches to analysis of medical time series data: (1) the classical statistical approach, such as the Kaplan–Meier estimator and the Cox proportional hazards regression model, and (2) dynamic Bayesian network modeling. Our comparison is based on time series cervical cancer screening data collected at Magee-Womens Hospital, University of Pittsburgh Medical Center over 10 years. Results: The main outcomes of our comparison are cervical cancer risk assessments produced by the three approaches. However, our analysis also discusses several aspects of the comparison, such as modeling assumptions, model building, dealing with incomplete data, individualized risk assessment, results interpretation, and model validation. Conclusion: Our study shows that the Bayesian approach (1) is much more flexible in terms of modeling effort, and (2) offers an individualized risk assessment, which is more cumbersome for classical statistical approaches. PMID:28163973
Detection of chaos: New approach to atmospheric pollen time-series analysis
NASA Astrophysics Data System (ADS)
Bianchi, M. M.; Arizmendi, C. M.; Sanchez, J. R.
1992-09-01
Pollen and spores are biological particles that are ubiquitous in the atmosphere and are pathologically significant, causing plant diseases and inhalant allergies. One of the main objectives of aerobiological surveys is forecasting. Prediction models are required in order to apply aerobiological knowledge to medical or agricultural practice; a necessary condition for these models is that the dynamics not be chaotic. The existence of chaos is detected through the analysis of a time series. The time series comprises hourly counts of atmospheric pollen grains obtained using a Burkard spore trap from 1987 to 1989 at Mar del Plata. Abraham's method to obtain the correlation dimension was applied. A low, fractal dimension indicates chaotic dynamics. The predictability of models for atmospheric pollen forecasting is discussed.
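Abraham's method is one route to the correlation dimension; the standard Grassberger-Procaccia correlation integral conveys the same idea and is sketched here as a stand-in. Uncorrelated noise fills the embedding space, so the log-log slope approaches the embedding dimension, whereas a low, non-integer value would be the chaos signature the survey looks for. Series length, radii, and embedding parameters are arbitrary.

```python
import numpy as np

def correlation_integral(x, r, dim=2, tau=1):
    """Grassberger-Procaccia C(r): fraction of embedded point pairs closer than r."""
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    iu = np.triu_indices(n, k=1)
    return np.mean(d[iu] < r)

rng = np.random.default_rng(12)
x = rng.uniform(0.0, 1.0, 1000)            # uncorrelated noise fills the plane
radii = np.array([0.02, 0.04, 0.08, 0.16])
C = np.array([correlation_integral(x, r) for r in radii])
slope = np.polyfit(np.log(radii), np.log(C), 1)[0]   # ~2 for noise in 2-D
```

Applied to an hourly pollen series, one would repeat this for increasing embedding dimensions and look for the slope to saturate at a low fractal value.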
Dynamic analysis of traffic time series at different temporal scales: A complex networks approach
NASA Astrophysics Data System (ADS)
Tang, Jinjun; Wang, Yinhai; Wang, Hua; Zhang, Shen; Liu, Fang
2014-07-01
The analysis of dynamics in traffic flow is an important step toward advanced traffic management and control in Intelligent Transportation Systems (ITS). Complexity and periodicity are two fundamental properties of traffic dynamics. In this study, we first measure the complexity of traffic flow data by the Lempel-Ziv algorithm at different temporal scales, using data collected from loop detectors on a freeway. Second, to obtain more insight into the complexity and periodicity in traffic time series, we construct complex networks from traffic time series by considering each day as a cycle and each cycle as a single node. The optimal threshold value of the complex networks is estimated by the distribution of density and its derivative. In addition, the complex networks are subsequently analyzed in terms of some statistical properties, such as average path length, clustering coefficient, density, average degree and betweenness. Finally, taking 2-min aggregation data as an example, we use the correlation coefficient matrix, adjacency matrix and closeness to exploit the periodicity of weekdays and weekends in traffic flow data. The findings in this paper indicate that the complex network is a practical tool for exploring dynamics in traffic time series.
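The Lempel-Ziv step above can be illustrated with a minimal LZ76-style phrase-counting implementation on binarized series: a periodic sequence parses into a handful of phrases, while a random one keeps producing new phrases. The binarization rule and series lengths are arbitrary choices for the sketch.

```python
import numpy as np

def lempel_ziv_complexity(s):
    """Number of phrases in an LZ76-style parsing of a binary string."""
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # extend the phrase while s[i:i+l] already occurs in the prior text
        while i + l <= n and s[i : i + l] in s[: i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

rng = np.random.default_rng(6)
noise = "".join(str(b) for b in (rng.random(2000) > 0.5).astype(int))
periodic = "01" * 1000                      # perfectly periodic series

c_periodic = lempel_ziv_complexity(periodic)
c_noise = lempel_ziv_complexity(noise)
```

For traffic data one would binarize each day's flow series (e.g., above/below its median) before counting phrases, and compare complexities across aggregation scales.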
Analysis of MALDI FT-ICR Mass Spectrometry Data: a Time Series Approach
Kronewitter, Scott R.; Lebrilla, Carlito B.; Rocke, David M.
2009-01-01
Matrix-assisted laser desorption/ionization Fourier transform ion cyclotron resonance mass spectrometry is a technique for high mass-resolution analysis of substances that is rapidly gaining popularity as an analytic tool. Extracting signal from the background noise, however, poses significant challenges. In this article, we model the noise part of a spectrum as an autoregressive, moving average (ARMA) time series with innovations given by a generalized gamma distribution with varying scale parameter but constant shape parameter and exponent. This enables us to classify peaks found in actual spectra as either noise or signal using a reasonable criterion that outperforms a standard threshold criterion. PMID:19646586
Analysis of MALDI FT-ICR mass spectrometry data: a time series approach.
Barkauskas, Donald A; Kronewitter, Scott R; Lebrilla, Carlito B; Rocke, David M
2009-08-26
Matrix-assisted laser desorption/ionization Fourier transform ion cyclotron resonance mass spectrometry is a technique for high mass-resolution analysis of substances that is rapidly gaining popularity as an analytic tool. Extracting signal from the background noise, however, poses significant challenges. In this article, we model the noise part of a spectrum as an autoregressive, moving average (ARMA) time series with innovations given by a generalized gamma distribution with varying scale parameter but constant shape parameter and exponent. This enables us to classify peaks found in actual spectra as either noise or signal using a reasonable criterion that outperforms a standard threshold criterion.
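The noise model both entries describe, an ARMA time series with gamma-family innovations, is easy to simulate and check against ARMA theory. The coefficients and shape parameter below are invented, and numpy's `standard_gamma` stands in for the full three-parameter generalized gamma distribution.

```python
import numpy as np

# Simulate ARMA(1,1) noise with centred gamma innovations and verify the
# lag-1 autocorrelation against the closed-form ARMA(1,1) expression.

rng = np.random.default_rng(7)
n = 50_000
phi, theta = 0.6, 0.3                    # AR and MA coefficients (invented)
eps = rng.standard_gamma(2.0, n) - 2.0   # centred gamma innovations
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t] + theta * eps[t - 1]

# theoretical lag-1 autocorrelation of an ARMA(1,1):
rho1_theory = (1 + phi * theta) * (phi + theta) / (1 + 2 * phi * theta + theta**2)
rho1 = np.corrcoef(x[:-1], x[1:])[0, 1]
```

In the paper's setting, peaks are classified as signal when they exceed what such a fitted noise process would plausibly produce, which is the step this sketch does not attempt.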
Oomens, Wouter; Maes, Joseph H. R.; Hasselman, Fred; Egger, Jos I. M.
2015-01-01
The concept of executive functions plays a prominent role in contemporary experimental and clinical studies on cognition. One paradigm used in this framework is the random number generation (RNG) task, the execution of which demands aspects of executive functioning, specifically inhibition and working memory. Data from the RNG task are best seen as a series of successive events. However, traditional RNG measures that are used to quantify executive functioning are mostly summary statistics referring to deviations from mathematical randomness. In the current study, we explore the utility of recurrence quantification analysis (RQA), a non-linear method that keeps the entire sequence intact, as a better way to describe executive functioning compared to traditional measures. To this aim, 242 first- and second-year students completed a non-paced RNG task. Principal component analysis of their data showed that traditional and RQA measures convey more or less the same information. However, RQA measures do so more parsimoniously and have a better interpretation. PMID:26097449
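The RQA idea referenced above can be sketched with two of its standard measures, recurrence rate and determinism, computed directly from a thresholded recurrence matrix; a periodic series should score much higher determinism than noise. The threshold and minimum diagonal-line length are arbitrary choices.

```python
import numpy as np

def rqa_measures(x, eps, lmin=2):
    """Recurrence rate and determinism from the recurrence matrix of series x."""
    R = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)
    np.fill_diagonal(R, 0)
    rr = R.sum() / (len(x) ** 2 - len(x))
    # determinism: fraction of recurrent points lying on diagonal lines >= lmin
    diag_points = 0
    for k in range(1, len(x)):
        d = np.diagonal(R, k)
        idx = np.flatnonzero(d)
        runs = np.split(idx, np.where(np.diff(idx) > 1)[0] + 1)
        diag_points += sum(len(r) for r in runs if len(r) >= lmin)
    det = 2 * diag_points / R.sum() if R.sum() else 0.0
    return rr, det

rng = np.random.default_rng(8)
periodic = np.sin(np.linspace(0, 20 * np.pi, 300))
noise = rng.uniform(-1, 1, 300)
rr_p, det_p = rqa_measures(periodic, eps=0.1)
rr_n, det_n = rqa_measures(noise, eps=0.1)
```

Applied to RNG-task output, lower determinism would indicate a more random (less structured) generated sequence, keeping the entire sequence intact as the study advocates.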
McKenna, Thomas M; Bawa, Gagandeep; Kumar, Kamal; Reifman, Jaques
2007-04-01
The physiology analysis system (PAS) was developed as a resource to support the efficient warehousing, management, and analysis of physiology data, particularly, continuous time-series data that may be extensive, of variable quality, and distributed across many files. The PAS incorporates time-series data collected by many types of data-acquisition devices, and it is designed to free users from data management burdens. This Web-based system allows both discrete (attribute) and time-series (ordered) data to be manipulated, visualized, and analyzed via a client's Web browser. All processes occur on a server, so that the client does not have to download data or any application programs, and the PAS is independent of the client's computer operating system. The PAS contains a library of functions, written in different computer languages that the client can add to and use to perform specific data operations. Functions from the library are sequentially inserted into a function chain-based logical structure to construct sophisticated data operators from simple function building blocks, affording ad hoc query and analysis of time-series data. These features support advanced mining of physiology data.
Advanced approach to the analysis of a series of in-situ nuclear forward scattering experiments
NASA Astrophysics Data System (ADS)
Vrba, Vlastimil; Procházka, Vít; Smrčka, David; Miglierini, Marcel
2017-03-01
This study introduces a sequential fitting procedure as a specific approach to nuclear forward scattering (NFS) data evaluation. The principles and usage of this advanced evaluation method are described in detail and its utilization is demonstrated on NFS in-situ investigations of fast processes. Such experiments frequently consist of hundreds of time spectra which need to be evaluated. The introduced procedure allows the analysis of these experiments and significantly decreases the time needed for the data evaluation. The key contributions of the study are the sequential use of the output fitting parameters of a previous data set as the input parameters for the next data set, and the model suitability crosscheck option of applying the procedure in ascending and descending directions of the data sets. The described fitting methodology is beneficial for checking model validity and the reliability of the obtained results.
Hsu, Han-Hsiu; Araki, Michihiro; Mochizuki, Masao; Hori, Yoshimi; Murata, Masahiro; Kahar, Prihardi; Yoshida, Takanobu; Hasunuma, Tomohisa; Kondo, Akihiko
2017-03-02
Chinese hamster ovary (CHO) cells are the primary host used for biopharmaceutical protein production. The engineering of CHO cells to produce higher amounts of biopharmaceuticals has been highly dependent on empirical approaches, but recent high-throughput "omics" methods are changing the situation in a rational manner. Omics data analyses using gene expression or metabolite profiling make it possible to identify key genes and metabolites in antibody production. Systematic omics approaches using different types of time-series data are expected to further enhance understanding of cellular behaviours and molecular networks for rational design of CHO cells. This study developed a systematic method for obtaining and analysing time-dependent intracellular and extracellular metabolite profiles, RNA-seq data (enzymatic mRNA levels) and cell counts from CHO cell cultures to capture an overall view of the CHO central metabolic pathway (CMP). We then calculated correlation coefficients among all the profiles and visualised the whole CMP by heatmap analysis and metabolic pathway mapping, to classify genes and metabolites together. This approach provides an efficient platform to identify key genes and metabolites in CHO cell culture.
Hsu, Han-Hsiu; Araki, Michihiro; Mochizuki, Masao; Hori, Yoshimi; Murata, Masahiro; Kahar, Prihardi; Yoshida, Takanobu; Hasunuma, Tomohisa; Kondo, Akihiko
2017-01-01
Chinese hamster ovary (CHO) cells are the primary host used for biopharmaceutical protein production. The engineering of CHO cells to produce higher amounts of biopharmaceuticals has been highly dependent on empirical approaches, but recent high-throughput “omics” methods are changing the situation in a rational manner. Omics data analyses using gene expression or metabolite profiling make it possible to identify key genes and metabolites in antibody production. Systematic omics approaches using different types of time-series data are expected to further enhance understanding of cellular behaviours and molecular networks for rational design of CHO cells. This study developed a systematic method for obtaining and analysing time-dependent intracellular and extracellular metabolite profiles, RNA-seq data (enzymatic mRNA levels) and cell counts from CHO cell cultures to capture an overall view of the CHO central metabolic pathway (CMP). We then calculated correlation coefficients among all the profiles and visualised the whole CMP by heatmap analysis and metabolic pathway mapping, to classify genes and metabolites together. This approach provides an efficient platform to identify key genes and metabolites in CHO cell culture. PMID:28252038
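The correlation step this method revolves around, pairwise Pearson correlation of time-course profiles followed by grouping of strongly correlated genes and metabolites, can be sketched with invented profiles; the names, trends, sample times, and the 0.9 cutoff are all hypothetical.

```python
import numpy as np

# Hypothetical time-course profiles sampled at the same culture time points:
# a consumed metabolite, a produced metabolite, and an enzyme mRNA level.

rng = np.random.default_rng(9)
t = np.linspace(0, 120, 8)                         # culture time (h), 8 samples
profiles = {
    "glucose":  100 - 0.7 * t + rng.normal(0, 2, 8),   # consumed
    "lactate":  10 + 0.6 * t + rng.normal(0, 2, 8),    # produced
    "ldh_mrna": 5 + 0.5 * t + rng.normal(0, 2, 8),     # co-varying transcript
}
names = list(profiles)
C = np.corrcoef(np.array([profiles[k] for k in names]))   # all pairwise r
coupled = [(names[i], names[j]) for i in range(len(names))
           for j in range(i + 1, len(names)) if C[i, j] > 0.9]
```

Visualized as a heatmap over the full central metabolic pathway, this matrix is what lets the authors classify genes and metabolites together.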
An Analysis on the Unemployment Rate in the Philippines: A Time Series Data Approach
NASA Astrophysics Data System (ADS)
Urrutia, J. D.; Tampis, R. L.; E Atienza, JB
2017-03-01
This study aims to formulate a mathematical model for forecasting and estimating the unemployment rate in the Philippines. Also, the factors which can predict unemployment are determined among the considered variables, namely Labor Force Rate, Population, Inflation Rate, Gross Domestic Product, and Gross National Income. Granger-causal relationships and integration among the dependent and independent variables are also examined using the pairwise Granger-causality test and the Johansen cointegration test. The data used were acquired from the Philippine Statistics Authority, National Statistics Office, and Bangko Sentral ng Pilipinas. Following the Box-Jenkins method, the formulated model for forecasting the unemployment rate is SARIMA (6, 1, 5) × (0, 1, 1)4 with a coefficient of determination of 0.79. The actual values are 99 percent identical to the values predicted by the model, and 72 percent close to the forecasted ones. According to the results of the regression analysis, Labor Force Rate and Population are the significant factors of the unemployment rate. Among the independent variables, Population, GDP, and GNI showed a Granger-causal relationship with unemployment. It is also found that there are at least four cointegrating relations between the dependent and independent variables.
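The differencing implied by the fitted SARIMA (6, 1, 5) × (0, 1, 1)4 model, one regular difference (d = 1) and one seasonal difference at lag 4 (D = 1, s = 4, quarterly data), can be demonstrated on a synthetic series; the trend, seasonal pattern, and noise level below are invented and do not represent the Philippine data.

```python
import numpy as np

# Box-Jenkins-style preprocessing sketch for a quarterly rate series:
# regular differencing removes the trend, seasonal differencing at lag 4
# removes the quarterly pattern, leaving a roughly stationary residual.

rng = np.random.default_rng(10)
n = 120                                        # 30 years of quarterly data
trend = 0.02 * np.arange(n)
season = np.tile([1.0, -0.5, 0.3, -0.8], n // 4)
y = 7.0 + trend + season + rng.normal(0, 0.1, n)

dy = np.diff(y)                                # d = 1 removes the trend
sdy = dy[4:] - dy[:-4]                         # D = 1, s = 4 removes seasonality
```

The ARMA orders (p = 6, q = 5, Q = 1) would then be identified from the autocorrelation structure of `sdy`, the step this sketch stops short of.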
Simplification of multiple Fourier series - An example of algorithmic approach
NASA Technical Reports Server (NTRS)
Ng, E. W.
1981-01-01
This paper describes one example of multiple Fourier series which originate from a problem of spectral analysis of time series data. The example is exercised here with an algorithmic approach which can be generalized for other series manipulation on a computer. The generalized approach is presently pursued towards applications to a variety of multiple series and towards a general purpose algorithm for computer algebra implementation.
Verplancke, T; Van Looy, S; Steurbaut, K; Benoit, D; De Turck, F; De Moor, G; Decruyenaere, J
2010-01-21
Echo-state networks (ESNs) belong to the family of reservoir computing methods and are essentially a form of recurrent artificial neural network (ANN). These methods can perform classification tasks on time series data. The recurrent ANN of an echo-state network has an 'echo-state' characteristic, which functions as a fading memory: samples introduced into the network further in the past gradually fade away. The echo-state approach to training recurrent neural networks was first described by Jaeger et al. In clinical medicine, no original research articles have yet been published examining the use of echo-state networks. This study examines the possibility of using an echo-state network for prediction of dialysis in the ICU. Diuresis values and creatinine levels from the first three days after ICU admission were collected from 830 patients admitted to the intensive care unit (ICU) between May 31, 2003 and November 17, 2007. The outcome parameter was the performance of the echo-state network in predicting the need for dialysis between day 5 and day 10 of ICU admission. Patients with an ICU length of stay <10 days or patients who received dialysis in the first five days of ICU admission were excluded. Performance of the echo-state network was then compared, by means of the area under the receiver operating characteristic curve (AUC), with results obtained by two other time series analysis methods: a support vector machine (SVM) and a naive Bayes (NB) algorithm. The AUCs of the three developed echo-state networks were 0.822, 0.818, and 0.817. These results were comparable to those obtained by the SVM and the NB algorithm. This proof-of-concept study is the first to evaluate the performance of echo-state networks in an ICU environment. The echo-state network predicted the need for dialysis in ICU patients, with AUCs that were good and comparable to the performance of the other methods.
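The core mechanics of an echo-state network (fixed random reservoir with spectral radius below 1, trained linear readout) can be sketched compactly. This is a minimal generic ESN for one-step-ahead prediction of a toy signal, not the authors' clinical model; all sizes and scalings are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Reservoir: a fixed random recurrent network with a fading "echo state".
n_in, n_res = 1, 50
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # spectral radius < 1 => fading memory

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Only the linear readout is trained (ridge regression): predict u[t+1]
# from the reservoir state at time t.
u = np.sin(0.3 * np.arange(300))
X, y = run_reservoir(u[:-1]), u[1:]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)

pred = X @ W_out
print(float(np.mean((pred[100:] - y[100:]) ** 2)) < 1e-2)  # → True
```

The recurrent weights are never trained; that is the design choice that makes reservoir computing cheap compared with full recurrent-network training.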
Bruno, Paula Marta; Pereira, Fernando Duarte; Fernandes, Renato; de Mendonça, Goncalo Vilhena
2011-02-01
The responses to supramaximal exercise testing have traditionally been analyzed by means of standard parametric and nonparametric statistics. Unfortunately, these statistical approaches do not allow insight into the pattern of variation of a given parameter over time. The purpose of this study was to determine whether the application of dynamic factor analysis (DFA) allowed discriminating different patterns of power output (PO), during supramaximal exercise, in two groups of children engaged in competitive sports: swimmers and soccer players. Data derived from Wingate testing were used in this study. Analyses were performed on epochs (30 s) of upper- and lower-body PO obtained from twenty-two healthy boys (11 swimmers and 11 soccer players) aged 11-12 years. DFA revealed two distinct patterns of PO during the Wingate test. Swimmers tended to attain their peak PO (upper and lower body) earlier than soccer players. As importantly, DFA showed that children with a given pattern of upper-body PO tend to perform similarly during lower-body exercise.
Fontes, Cristiano Hora; Budman, Hector
2017-09-16
A clustering problem involving multivariate time series (MTS) requires the selection of similarity metrics. This paper shows the limitations of the PCA similarity factor (SPCA) as a single metric in nonlinear problems where there are differences in magnitude of the same process variables due to expected changes in operation conditions. A novel method for clustering MTS based on a combination between SPCA and the average-based Euclidean distance (AED) within a fuzzy clustering approach is proposed. Case studies involving either simulated or real industrial data collected from a large scale gas turbine are used to illustrate that the hybrid approach enhances the ability to recognize normal and fault operating patterns. This paper also proposes an oversampling procedure to create synthetic multivariate time series that can be useful in commonly occurring situations involving unbalanced data sets. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
Jung, Inuk; Jo, Kyuri; Kang, Hyejin; Ahn, Hongryul; Yu, Youngjae; Kim, Sun
2017-01-17
Identifying biologically meaningful gene expression patterns from time series gene expression data is important to understand the underlying biological mechanisms. To identify significantly perturbed gene sets between different phenotypes, analysis of time series transcriptome data requires consideration of time and sample dimensions. Thus, the analysis of such time series data seeks to search gene sets that exhibit similar or different expression patterns between two or more sample conditions, constituting three-dimensional data, i.e. gene-time-condition. Computational complexity for analyzing such data is very high, compared to the already difficult NP-hard two-dimensional biclustering algorithms. Because of this challenge, traditional time series clustering algorithms are designed to capture co-expressed genes with similar expression patterns in two sample conditions. We present a triclustering algorithm, TimesVector, specifically designed for clustering three-dimensional time series data to capture distinctively similar or different gene expression patterns between two or more sample conditions. TimesVector identifies clusters with distinctive expression patterns in three steps: (i) dimension reduction and clustering of time-condition concatenated vectors, (ii) post-processing clusters for detecting similar and distinct expression patterns and (iii) rescuing genes from unclassified clusters. Using four sets of time series gene expression data, generated by both microarray and high throughput sequencing platforms, we demonstrated that TimesVector successfully detected biologically meaningful clusters of high quality. TimesVector improved the clustering quality compared to existing triclustering tools, and only TimesVector successfully detected clusters with differential expression patterns across conditions. The TimesVector software is available at http://biohealth.snu.ac.kr/software/TimesVector/ CONTACT: sunkim.bioinfo@snu.ac.kr. Supplementary information is available.
Cabinetmaker. Occupational Analysis Series.
ERIC Educational Resources Information Center
Chinien, Chris; Boutin, France
This document contains the analysis of the occupation of cabinetmaker, or joiner, that is accepted by the Canadian Council of Directors as the national standard for the occupation. The front matter preceding the analysis includes exploration of the development of the analysis, structure of the analysis, validation method, scope of the cabinetmaker…
Permutations and time series analysis.
Cánovas, Jose S; Guillamón, Antonio
2009-12-01
The main aim of this paper is to show how permutations can be useful in the study of time series analysis. In particular, we introduce a test for checking the independence of a time series that is based on the number of admissible permutations in it. The main improvement in our test is that we are able to give a theoretical distribution for independent time series.
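The counting step behind such a permutation-based test can be sketched directly: slide a window over the series and record which ordinal pattern (the permutation that sorts the window) occurs. This is an illustrative reconstruction of the counting idea only, not the authors' test statistic or its distribution:

```python
from itertools import permutations

def ordinal_patterns(x, order=3):
    """Count occurrences of each order-`order` permutation pattern in x."""
    counts = {p: 0 for p in permutations(range(order))}
    for t in range(len(x) - order + 1):
        window = x[t:t + order]
        pattern = tuple(sorted(range(order), key=lambda i: window[i]))
        counts[pattern] += 1
    return counts

# A monotone series admits only the identity pattern, while an independent
# series would spread mass over all order! patterns; a test can exploit
# how many patterns actually appear ("admissible" patterns).
counts = ordinal_patterns(list(range(10)), order=3)
print(sum(1 for c in counts.values() if c > 0))  # → 1
```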
Time Series Analysis Without Model Identification.
ERIC Educational Resources Information Center
Velicer, Wayne F.; McDonald, Roderick P.
1984-01-01
A new approach to time series analysis was developed. It employs a generalized transformation of the observed data to meet the assumptions of the general linear model, thus eliminating the need to identify a specific model. This approach permits alternative computational procedures, based on a generalized least squares algorithm. (Author/BW)
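One concrete instance of transforming observed data to meet general-linear-model assumptions is generalized least squares under an assumed AR(1) error structure, where quasi-differencing whitens the errors. This is a generic GLS sketch under that assumption, not the specific transformation of Velicer and McDonald:

```python
import numpy as np

def gls_ar1(X, y, rho):
    """GLS for y = X b + e with AR(1) errors: quasi-difference both sides
    (z_t - rho * z_{t-1}), then apply ordinary least squares."""
    Xs = X[1:] - rho * X[:-1]
    ys = y[1:] - rho * y[:-1]
    b, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    return b

t = np.arange(20, dtype=float)
X = np.column_stack([np.ones_like(t), t])
y = 3.0 + 0.5 * t          # noiseless line: the estimator must recover (3, 0.5)
b = gls_ar1(X, y, rho=0.7)
print(b)                   # recovers intercept 3.0 and slope 0.5
```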
NASA Astrophysics Data System (ADS)
Allan, Alasdair
2014-06-01
FROG performs time series analysis and display. It provides a simple user interface for astronomers wanting to do time-domain astrophysics but still offers the powerful features found in packages such as PERIOD (ascl:1406.005). FROG includes a number of tools for manipulation of time series. Among other things, the user can combine individual time series, detrend series (multiple methods) and perform basic arithmetic functions. The data can also be exported directly into the TOPCAT (ascl:1101.010) application for further manipulation if needed.
Predicting road accidents: Structural time series approach
NASA Astrophysics Data System (ADS)
Junus, Noor Wahida Md; Ismail, Mohd Tahir
2014-07-01
In this paper, a model for the occurrence of road accidents in Malaysia between 1970 and 2010 was developed, and with this model the number of road accidents was predicted using the structural time series approach. The models were developed using a stepwise method, and the residual of each step was analyzed. The accuracy of the model was assessed using the mean absolute percentage error (MAPE), and the best model was chosen based on the smallest Akaike information criterion (AIC) value. The structural time series approach found that the local linear trend model is the best model to represent road accidents. This model allows the level and slope components to vary over time. In addition, this approach also provides useful information for improving on the conventional time series method.
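The local linear trend model named above is a state-space model in which both the level and the slope follow random walks. A minimal simulation makes the structure concrete; this is a generic sketch with made-up parameters, not the authors' fitted accident model:

```python
import random

def local_linear_trend(n, level_sd, slope_sd, seed=1):
    """Simulate a local linear trend: level_t and slope_t are random walks.
    y_t = level_t;  level_{t+1} = level_t + slope_t + eta;  slope_{t+1} = slope_t + zeta.
    """
    rng = random.Random(seed)
    level, slope, ys = 10.0, 0.5, []
    for _ in range(n):
        ys.append(level)
        level += slope + rng.gauss(0.0, level_sd)
        slope += rng.gauss(0.0, slope_sd)
    return ys

# With zero disturbance variances the model collapses to a deterministic
# straight line, which is the sense in which it generalizes a fixed trend.
y = local_linear_trend(5, level_sd=0.0, slope_sd=0.0)
print(y)  # → [10.0, 10.5, 11.0, 11.5, 12.0]
```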
ERIC Educational Resources Information Center
Locock, Katherine; Tran, Hue; Codd, Rachel; Allan, Robin
2015-01-01
This series of three practical sessions centers on drugs that inhibit the enzyme acetylcholinesterase. This enzyme is responsible for the inactivation of acetylcholine and has been the target of drugs to treat glaucoma and Alzheimer's disease, and of a number of insecticides and warfare agents. These sessions relate to a series of carbamate…
Time series analysis of injuries.
Martinez-Schnell, B; Zaidi, A
1989-12-01
We used time series models in the exploratory and confirmatory analysis of selected fatal injuries in the United States from 1972 to 1983. We built autoregressive integrated moving average (ARIMA) models for monthly, weekly, and daily series of deaths and used these models to generate hypotheses. These deaths resulted from six causes of injuries: motor vehicles, suicides, homicides, falls, drownings, and residential fires. For each cause of injury, we estimated calendar effects on the monthly death counts. We confirmed the significant effect of vehicle miles travelled on motor vehicle fatalities with a transfer function model. Finally, we applied intervention analysis to deaths due to motor vehicles.
Visibility graphlet approach to chaotic time series
Mutua, Stephen; Gu, Changgui; Yang, Huijie
2016-05-15
Many novel methods have been proposed for mapping time series into complex networks. Although some dynamical behaviors can be effectively captured by existing approaches, the preservation and tracking of the temporal behaviors of a chaotic system remains an open problem. In this work, we extended the visibility graphlet approach to investigate both discrete and continuous chaotic time series. We applied visibility graphlets to capture the reconstructed local states, so that each is treated as a node and tracked downstream to create a temporal chain link. Our empirical findings show that the approach accurately captures the dynamical properties of chaotic systems. Networks constructed from periodic dynamic phases all converge to regular networks and to unique network structures for each model in the chaotic zones. Furthermore, our results show that the characterization of chaotic and non-chaotic zones in the Lorenz system corresponds to the maximal Lyapunov exponent, thus providing a simple and straightforward way to analyze chaotic systems.
Task Analysis Inventories. Series II.
ERIC Educational Resources Information Center
Wesson, Carl E.
This second in a series of task analysis inventories contains checklists of work performed in twenty-two occupations. Each inventory is a comprehensive list of work activities, responsibilities, educational courses, machines, tools, equipment, and work aids used and the products produced or services rendered in a designated occupational area. The…
Visibility Graph Based Time Series Analysis
Stephen, Mutua; Gu, Changgui; Yang, Huijie
2015-01-01
Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states, and the successively occurring states are linked. This procedure converts a time series into a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks. PMID:26571115
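The natural visibility criterion used by this family of methods is simple to state: two samples are linked if the straight line between them passes above every sample in between. A brute-force sketch (quadratic in the window length, for illustration only, on a hypothetical toy series):

```python
def visibility_edges(y):
    """Natural visibility graph: nodes a < b are linked iff every sample
    between them lies strictly below the line joining (a, y[a]) and (b, y[b])."""
    n = len(y)
    edges = set()
    for a in range(n - 1):
        for b in range(a + 1, n):
            if all(y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                   for c in range(a + 1, b)):
                edges.add((a, b))
    return edges

# The peak at index 2 blocks lines of sight across it, so e.g. nodes 1 and 3
# are not linked even though both see node 2.
edges = visibility_edges([2.0, 1.0, 3.0, 1.0, 2.0])
print(sorted(edges))  # → [(0, 1), (0, 2), (1, 2), (2, 3), (2, 4), (3, 4)]
```

Applying this to successive segments, and linking the resulting graphs in time order, gives the "network of networks" view described in the abstract.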
NASA Astrophysics Data System (ADS)
Feigin, Alexander; Mukhin, Dmitry; Gavrilov, Andrey; Volodin, Evgeny; Loskutov, Evgeny
2014-05-01
Natural systems are in general spatially distributed, and their evolution spans a broad spectrum of temporal scales. This multiscale nature may result from the multiplicity of mechanisms governing the system's behaviour and from a large number of feedbacks and nonlinearities. One way to reveal and understand the underlying mechanisms, as well as to model the corresponding sub-systems, is to decompose the full (complex) system into well-separated spatio-temporal patterns ("modes") that evolve on essentially different time scales. In this report a new method for such a decomposition is discussed. The method is based on a generalization of MSSA (Multichannel Singular Spectral Analysis) [1] for expanding space-distributed time series in a basis of spatio-temporal empirical orthogonal functions (STEOF), which makes allowance for delayed correlations of the processes recorded at spatially separated points. The method is applied to a decomposition of the Earth's climate system: on the basis of a 156-year time series of SST anomalies distributed over the globe [2], two climatic modes possessing noticeably different time scales (3-5 and 9-11 years) are separated. For more accurate exclusion of "too slow" (and thus not correctly represented) processes from the real data, a numerically produced STEOF basis is used, constructed from time series generated by the INM RAS Coupled Climate Model [3]. Relations of the separated modes to ENSO and PDO are investigated. Possible development of the suggested approach toward the separation of modes that are nonlinearly uncorrelated is discussed. 1. Ghil, M., R. M. Allen, M. D. Dettinger, K. Ide, D. Kondrashov, et al. (2002) "Advanced spectral methods for climatic time series", Rev. Geophys. 40(1), 3.1-3.41. 2. http://iridl.ldeo.columbia.edu/SOURCES/.KAPLAN/.EXTENDED/.v2/.ssta/ 3. http://83.149.207.89/GCM_DATA_PLOTTING/GCM_INM_DATA_XY_en.htm
Introduction to Time Series Analysis
NASA Technical Reports Server (NTRS)
Hardin, J. C.
1986-01-01
The field of time series analysis is explored from its logical foundations to the most modern data analysis techniques. The presentation is developed, as far as possible, for continuous data, so that the inevitable use of discrete mathematics is postponed until the reader has gained some familiarity with the concepts. The monograph seeks to provide the reader with both the theoretical overview and the practical details necessary to correctly apply the full range of these powerful techniques. In addition, the last chapter introduces many specialized areas where research is currently in progress.
Evan Brooks; Valerie Thomas; Wynne Randolph; John Coulston
2012-01-01
With the advent of free Landsat data stretching back decades, there has been a surge of interest in utilizing remotely sensed data in multitemporal analysis for estimation of biophysical parameters. Such analysis is confounded by cloud cover and other image-specific problems, which result in missing data at various aperiodic times of the year. While there is a wealth...
Nonlinear time-series analysis revisited.
Bradley, Elizabeth; Kantz, Holger
2015-09-01
In 1980 and 1981, two pioneering papers laid the foundation for what became known as nonlinear time-series analysis: the analysis of observed data-typically univariate-via dynamical systems theory. Based on the concept of state-space reconstruction, this set of methods allows us to compute characteristic quantities such as Lyapunov exponents and fractal dimensions, to predict the future course of the time series, and even to reconstruct the equations of motion in some cases. In practice, however, there are a number of issues that restrict the power of this approach: whether the signal accurately and thoroughly samples the dynamics, for instance, and whether it contains noise. Moreover, the numerical algorithms that we use to instantiate these ideas are not perfect; they involve approximations, scale parameters, and finite-precision arithmetic, among other things. Even so, nonlinear time-series analysis has been used to great advantage on thousands of real and synthetic data sets from a wide variety of systems ranging from roulette wheels to lasers to the human heart. Even in cases where the data do not meet the mathematical or algorithmic requirements to assure full topological conjugacy, the results of nonlinear time-series analysis can be helpful in understanding, characterizing, and predicting dynamical systems.
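The state-space reconstruction these methods rest on is delay embedding: a scalar series is lifted to vectors of lagged samples. A minimal sketch of that single step (embedding dimension and delay are illustrative choices; real applications must estimate both from the data, as the abstract's caveats suggest):

```python
def delay_embed(x, dim, tau):
    """Reconstruct state vectors (x[t], x[t+tau], ..., x[t+(dim-1)*tau])."""
    span = (dim - 1) * tau
    return [tuple(x[t + k * tau] for k in range(dim))
            for t in range(len(x) - span)]

# A series of length 10 embedded with dim=3, tau=2 yields 10 - 4 = 6 vectors.
vectors = delay_embed(list(range(10)), dim=3, tau=2)
print(len(vectors), vectors[0])  # → 6 (0, 2, 4)
```

Quantities such as Lyapunov exponents and fractal dimensions are then estimated from the geometry of these reconstructed vectors.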
Nonlinear time-series analysis revisited
NASA Astrophysics Data System (ADS)
Bradley, Elizabeth; Kantz, Holger
2015-09-01
In 1980 and 1981, two pioneering papers laid the foundation for what became known as nonlinear time-series analysis: the analysis of observed data—typically univariate—via dynamical systems theory. Based on the concept of state-space reconstruction, this set of methods allows us to compute characteristic quantities such as Lyapunov exponents and fractal dimensions, to predict the future course of the time series, and even to reconstruct the equations of motion in some cases. In practice, however, there are a number of issues that restrict the power of this approach: whether the signal accurately and thoroughly samples the dynamics, for instance, and whether it contains noise. Moreover, the numerical algorithms that we use to instantiate these ideas are not perfect; they involve approximations, scale parameters, and finite-precision arithmetic, among other things. Even so, nonlinear time-series analysis has been used to great advantage on thousands of real and synthetic data sets from a wide variety of systems ranging from roulette wheels to lasers to the human heart. Even in cases where the data do not meet the mathematical or algorithmic requirements to assure full topological conjugacy, the results of nonlinear time-series analysis can be helpful in understanding, characterizing, and predicting dynamical systems.
Comparative Analysis on Time Series with Included Structural Break
NASA Astrophysics Data System (ADS)
Andreeski, Cvetko J.; Vasant, Pandian
2009-08-01
Time series analysis (ARIMA modelling) is a good approach for the identification of time series. But if there is a structural break in the time series, a single model cannot describe the whole series. Furthermore, if there are not enough data between two structural breaks, it is impossible to create valid models for identification of the time series. This paper explores the possibility of identifying the dynamics of the inflation process by system-theoretic means, using both Box-Jenkins ARIMA methodology and artificial neural networks.
Highly comparative time-series analysis: the empirical structure of time series and their methods
Fulcher, Ben D.; Little, Max A.; Jones, Nick S.
2013-01-01
The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines. PMID:23554344
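The "reduced representations" described above amount to summarizing each time series by a vector of interpretable features. A deliberately tiny sketch of the idea (three hand-picked features, not the thousands of operations in the authors' library):

```python
import math

def features(x):
    """A tiny feature vector: mean, standard deviation, lag-1 autocorrelation."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    ac1 = (sum((x[t] - mean) * (x[t + 1] - mean) for t in range(n - 1))
           / (n * var)) if var else 0.0
    return (mean, math.sqrt(var), ac1)

# An alternating series has strongly negative lag-1 autocorrelation, while a
# slowly varying series scores near +1; even three numbers can separate
# qualitatively different dynamics, which is the premise scaled up massively
# in highly comparative analysis.
m, s, ac = features([1.0, -1.0] * 4)
print(m, s)    # → 0.0 1.0
print(ac < 0)  # → True
```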
Radar Interferometry Time Series Analysis and Tools
NASA Astrophysics Data System (ADS)
Buckley, S. M.
2006-12-01
We consider the use of several multi-interferogram analysis techniques for identifying transient ground motions. Our approaches range from specialized InSAR processing for persistent scatterer and small baseline subset methods to the post-processing of geocoded displacement maps using a linear inversion-singular value decomposition solution procedure. To better understand these approaches, we have simulated sets of interferograms spanning several deformation phenomena, including localized subsidence bowls with constant velocity and seasonal deformation fluctuations. We will present results and insights from the application of these time series analysis techniques to several land subsidence study sites with varying deformation and environmental conditions, e.g., arid Phoenix and coastal Houston-Galveston metropolitan areas and rural Texas sink holes. We consistently find that the time invested in implementing, applying and comparing multiple InSAR time series approaches for a given study site is rewarded with a deeper understanding of the techniques and deformation phenomena. To this end, and with support from NSF, we are preparing a first-version of an InSAR post-processing toolkit to be released to the InSAR science community. These studies form a baseline of results to compare against the higher spatial and temporal sampling anticipated from TerraSAR-X as well as the trade-off between spatial coverage and resolution when relying on ScanSAR interferometry.
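The linear inversion step mentioned above can be sketched on synthetic data: each interferogram observes the sum of the incremental displacements between its two acquisition dates, giving a linear system solved by least squares (SVD/pseudoinverse). The date pairs and displacements below are hypothetical, and real processing must also handle noise, holes, and rank deficiency:

```python
import numpy as np

# Hypothetical acquisition dates (indices 0..3) and interferogram pairs.
pairs = [(0, 1), (1, 2), (0, 2), (2, 3), (1, 3)]
true_incr = np.array([2.0, -1.0, 3.0])   # displacement between consecutive dates

# Design matrix: each interferogram sums the increments it spans.
G = np.zeros((len(pairs), len(true_incr)))
for row, (i, j) in enumerate(pairs):
    G[row, i:j] = 1.0
d = G @ true_incr                        # synthetic, noise-free observations

# Least squares (computed via SVD internally) recovers the increments,
# and their cumulative sum is the displacement history.
incr, *_ = np.linalg.lstsq(G, d, rcond=None)
displacement = np.concatenate([[0.0], np.cumsum(incr)])
print(np.round(displacement, 6))  # displacement at the four dates: 0, 2, 1, 4
```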
Analysis of Polyphonic Musical Time Series
NASA Astrophysics Data System (ADS)
Sommer, Katrin; Weihs, Claus
A general model for pitch tracking of polyphonic musical time series is introduced. Based on a model of Davy and Godsill (Bayesian harmonic models for musical pitch estimation and analysis, Technical Report 431, Cambridge University Engineering Department, 2002), the different pitches of the musical sound are estimated simultaneously with MCMC methods. Additionally, a preprocessing step is designed to improve the estimation of the fundamental frequencies (A comparative study on polyphonic musical time series using MCMC methods. In C. Preisach et al., editors, Data Analysis, Machine Learning, and Applications, Springer, Berlin, 2008). The preprocessing step compares real audio data with an alphabet constructed from the McGill Master Samples (Opolko and Wapnick, McGill University Master Samples [Compact disc], McGill University, Montreal, 1987), which consists of tones of different instruments. The tones with minimal Itakura-Saito distortion (Gray et al., Transactions on Acoustics, Speech, and Signal Processing ASSP-28(4):367-376, 1980) are chosen as first estimates and as starting points for the MCMC algorithms. Furthermore, the implementation of the alphabet is an approach to recognizing the instruments generating the musical time series. Results are presented for mixed monophonic data from McGill and for self-recorded polyphonic audio data.
A novel time series link prediction method: Learning automata approach
NASA Astrophysics Data System (ADS)
Moradabadi, Behnaz; Meybodi, Mohammad Reza
2017-09-01
Link prediction is a main social network challenge that uses the network structure to predict future links. Common link prediction approaches use a static graph representation, in which a snapshot of the network is analyzed to find hidden or future links. For example, similarity-metric-based link prediction is a common traditional approach that calculates a similarity metric for each non-connected pair, sorts the pairs by their similarity scores, and labels the pairs with the highest scores as future links. Because people's activities in social networks are dynamic and uncertain, and the structure of the networks changes over time, deterministic graphs may not be appropriate for modeling and analyzing social networks. In the time-series link prediction problem, the time series of link occurrences is used to predict future links. In this paper, we propose a new time series link prediction method based on learning automata. In the proposed algorithm there is one learning automaton for each link that must be predicted, and each learning automaton tries to predict the existence or non-existence of the corresponding link. To predict the link occurrence at time T, there is a chain consisting of stages 1 through T - 1, and the learning automaton passes through these stages to learn the existence or non-existence of the corresponding link. Our preliminary link prediction experiments with co-authorship and email networks have provided satisfactory results when time series of link occurrences are considered.
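The learning-automaton idea can be sketched with the classic two-action linear reward-inaction (L_R-I) scheme: the automaton keeps a probability of predicting "link exists" and nudges it toward actions that turn out correct. This is a generic L_R-I sketch, not the authors' chained algorithm; the learning rate and reward rule are illustrative assumptions:

```python
import random

class TwoActionLRI:
    """Linear reward-inaction automaton: move p toward an action only when
    that action is rewarded (correct prediction); ignore penalties."""
    def __init__(self, a=0.1, seed=7):
        self.p = 0.5               # probability of predicting "link exists"
        self.a = a                 # learning rate
        self.rng = random.Random(seed)

    def step(self, observed):
        predict = self.rng.random() < self.p
        if predict == observed:    # reward: prediction matched the observation
            if predict:
                self.p += self.a * (1.0 - self.p)
            else:
                self.p -= self.a * self.p
        return predict

# Feed a link that occurs at every time step; the automaton's belief in
# "link exists" can only grow under reward-inaction updates.
la = TwoActionLRI()
p0 = la.p
for _ in range(200):
    la.step(True)
print(la.p > p0)  # → True
```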
Complex network approach to fractional time series
Manshour, Pouya
2015-10-15
In order to extract correlation information inherited in stochastic time series, the visibility graph algorithm has been recently proposed, by which a time series can be mapped onto a complex network. We demonstrate that the visibility algorithm is not an appropriate one to study the correlation aspects of a time series. We then employ the horizontal visibility algorithm, as a much simpler one, to map fractional processes onto complex networks. The degree distributions are shown to have parabolic exponential forms with Hurst dependent fitting parameter. Further, we take into account other topological properties such as maximum eigenvalue of the adjacency matrix and the degree assortativity, and show that such topological quantities can also be used to predict the Hurst exponent, with an exception for anti-persistent fractional Gaussian noises. To solve this problem, we take into account the Spearman correlation coefficient between nodes' degrees and their corresponding data values in the original time series.
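The horizontal visibility rule favored in this abstract is even simpler than natural visibility: two samples are linked if and only if every sample between them is strictly lower than both. A brute-force sketch returning node degrees (the quantity whose distribution carries the Hurst information), on a hypothetical toy series:

```python
def horizontal_visibility_degrees(y):
    """Horizontal visibility graph: a and b are linked iff every sample
    strictly between them is lower than both y[a] and y[b]."""
    n = len(y)
    degree = [0] * n
    for a in range(n - 1):
        for b in range(a + 1, n):
            if all(y[c] < min(y[a], y[b]) for c in range(a + 1, b)):
                degree[a] += 1
                degree[b] += 1
    return degree

# Node 1 (value 3) additionally sees node 3 (value 4) over the dip at node 2,
# while nodes 0 and 3 are blocked by node 1.
print(horizontal_visibility_degrees([1.0, 3.0, 2.0, 4.0]))  # → [1, 3, 2, 2]
```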
Spectral analysis of multiple time series
NASA Technical Reports Server (NTRS)
Dubman, M. R.
1972-01-01
Application of spectral analysis for mathematically determining relationship of random vibrations in structures and concurrent events in electric circuits, physiology, economics, and seismograms is discussed. Computer program for performing spectral analysis of multiple time series is described.
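Cross-spectral analysis of multiple series of the kind described is routine today. A small sketch using SciPy's magnitude-squared coherence (the 5 Hz shared component and noise levels are our own toy choices, not from the report):

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(0)
fs = 100.0                                 # sampling rate, Hz
t = np.arange(0, 20, 1 / fs)
shared = np.sin(2 * np.pi * 5.0 * t)       # 5 Hz component common to both series
x = shared + 0.5 * rng.standard_normal(t.size)
y = shared + 0.5 * rng.standard_normal(t.size)

# coherence is near 1 at frequencies where the two series co-vary, near 0 elsewhere
f, Cxy = coherence(x, y, fs=fs, nperseg=256)
peak_freq = f[np.argmax(Cxy)]
```

The peak of `Cxy` recovers the frequency at which the two series are related, which is exactly the kind of relationship (vibrations vs. concurrent circuit events) the abstract describes.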
Analysis of series resonant converter with series-parallel connection
NASA Astrophysics Data System (ADS)
Lin, Bor-Ren; Huang, Chien-Lan
2011-02-01
In this study, a parallel inductor-inductor-capacitor (LLC) resonant converter, series-connected on the primary side and parallel-connected on the secondary side, is presented for server power supply systems. Based on series resonant behaviour, the power metal-oxide-semiconductor field-effect transistors are turned on at zero-voltage switching and the rectifier diodes are turned off at zero-current switching. Thus, the switching losses in the power semiconductors are reduced. In the proposed converter, the primary windings of the two LLC converters are connected in series, so the two converters carry the same primary current, ensuring that they supply balanced load currents. On the output side, the two LLC converters are connected in parallel to share the load current and to reduce the current stress on the secondary windings and the rectifier diodes. In this article, the principle of operation, steady-state analysis and design considerations of the proposed converter are provided and discussed. Experiments with a laboratory prototype with a 24 V/21 A output for a server power supply were performed to verify the effectiveness of the proposed converter.
Short time-series microarray analysis: Methods and challenges
Wang, Xuewei; Wu, Ming; Li, Zheng; Chan, Christina
2008-01-01
The detection and analysis of steady-state gene expression has become routine. Time-series microarrays are of growing interest to systems biologists for deciphering the dynamic nature and complex regulation of biosystems. Most temporal microarray data only contain a limited number of time points, giving rise to short-time-series data, which imposes challenges for traditional methods of extracting meaningful information. To obtain useful information from the wealth of short-time series data requires addressing the problems that arise due to limited sampling. Current efforts have shown promise in improving the analysis of short time-series microarray data, although challenges remain. This commentary addresses recent advances in methods for short-time series analysis including simplification-based approaches and the integration of multi-source information. Nevertheless, further studies and development of computational methods are needed to provide practical solutions to fully exploit the potential of this data. PMID:18605994
A multivariate time-series approach to marital interaction
Kupfer, Jörg; Brosig, Burkhard; Brähler, Elmar
2005-01-01
Time-series analysis (TSA) is frequently used in order to clarify complex structures of mutually interacting panel data. The method helps in understanding how the course of a dependent variable is predicted by independent time-series with no time lag, as well as by previous observations of that dependent variable (autocorrelation) and of independent variables (cross-correlation). The study analyzes the marital interaction of a married couple under clinical conditions over a period of 144 days by means of TSA. The data were collected within a course of couple therapy. The male partner was affected by a severe condition of atopic dermatitis and the woman suffered from bulimia nervosa. Each of the partners completed a mood questionnaire and a body symptom checklist. After the determination of auto- and cross-correlations between and within the parallel data sets, multivariate time-series models were specified. Mutual and individual patterns of emotional reactions explained 14% (skin) and 33% (bulimia) of the total variance in both dependent variables (adj. R², p<0.0001 for the multivariate models). The question was discussed whether multivariate TSA-models represent a suitable approach to the empirical exploration of clinical marital interaction. PMID:19742066
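The auto- and cross-correlations at the heart of this kind of TSA can be sketched with a simple lagged-product estimator (variable names and the synthetic two-day lag are ours, not the study's):

```python
import numpy as np

def cross_correlation(x, y, max_lag):
    """Normalized cross-correlation at lags -max_lag..max_lag.
    A peak at positive lag k means x at time t resembles y at time t + k."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    n = len(x)
    out = {}
    for k in range(-max_lag, max_lag + 1):
        xs = x[max(0, -k): n - max(0, k)]
        ys = y[max(0, k): n - max(0, -k)]
        out[k] = float(np.mean(xs * ys))
    return out

rng = np.random.default_rng(3)
base = rng.standard_normal(502)
x, y = base[2:], base[:500]      # construct y to lag x by exactly 2 steps
cc = cross_correlation(x, y, max_lag=5)
best_lag = max(cc, key=cc.get)
```

Setting `y = x` reduces the same function to the autocorrelation used to detect each partner's own mood persistence.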
Nonlinear Analysis of Surface EMG Time Series
NASA Astrophysics Data System (ADS)
Zurcher, Ulrich; Kaufman, Miron; Sung, Paul
2004-04-01
Applications of nonlinear analysis of surface electromyography time series of patients with and without low back pain are presented. Limitations of the standard methods based on the power spectrum are discussed.
ERIC Educational Resources Information Center
Office of Education (DHEW), Washington, DC.
Comparing or estimating the costs of educational projects by merely using cost-per-student figures is imprecise and ignores area differences in prices. The resource approach to cost analysis begins by determining specific physical resources (such as facilities, staff, equipment, materials, and services) needed for a project. Then the cost of these…
Making a difference: the osteopathic approach lecture series.
Danto, J B; Kavieff, T R
1999-03-01
The Osteopathic Approach Lecture Series (Osteopathic AppLeS) was created in response to both the current need in the osteopathic medical profession for a distinctive osteopathic identity and lack of readily available information regarding an osteopathic approach. The series consisted of 16 lectures given to interns, externs, residents, and attendings at a community-based osteopathic hospital during a 10-month time span which emphasized and inculcated an osteopathic approach to patients with a variety of illnesses. Emphasis was placed on osteopathic manipulative treatment training throughout the series. Data were collected at individual presentations using a survey of participants and studied retrospectively. The results of the surveys indicated that using this type of presentation series may substantially increase confidence and knowledge in an osteopathic approach and osteopathic manipulative treatment skills.
Distinguishing chaotic time series from noise: A random matrix approach
NASA Astrophysics Data System (ADS)
Ye, Bin; Chen, Jianxing; Ju, Chen; Li, Huijun; Wang, Xuesong
2017-03-01
Deterministically chaotic systems can often give rise to random and unpredictable behaviors which make the time series obtained from them to be almost indistinguishable from noise. Motivated by the fact that data points in a chaotic time series will have intrinsic correlations between them, we propose a random matrix theory (RMT) approach to identify the deterministic or stochastic dynamics of the system. We show that the spectral distributions of the correlation matrices, constructed from the chaotic time series, deviate significantly from the predictions of random matrix ensembles. On the contrary, the eigenvalue statistics for a noisy signal follow closely those of random matrix ensembles. Numerical results also indicate that the approach is to some extent robust to additive observational noise which pollutes the data in many practical situations. Our approach is efficient in recognizing the continuous chaotic dynamics underlying the evolution of the time series.
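The comparison the authors draw can be illustrated cheaply: fold a series into segments, form the segment correlation matrix, and compare its largest eigenvalue with the Marchenko-Pastur upper edge expected for pure noise. A sketch (we use a periodic deterministic signal as a correlated stand-in, not the authors' chaotic systems or their exact matrix construction):

```python
import numpy as np

def correlation_eigenvalues(x, m):
    """Split x into m equal consecutive segments and return the
    eigenvalues of the m x m correlation matrix of those segments."""
    n = (len(x) // m) * m
    segments = x[:n].reshape(m, -1)
    return np.linalg.eigvalsh(np.corrcoef(segments))

rng = np.random.default_rng(1)
m, seg_len = 20, 200
noise = rng.standard_normal(m * seg_len)
periodic = np.sin(2 * np.pi * np.arange(m * seg_len) / 50.0)  # period divides seg_len

q = m / seg_len                       # dimension-to-sample-size ratio
mp_edge = (1 + np.sqrt(q)) ** 2       # Marchenko-Pastur upper eigenvalue edge

ev_noise = correlation_eigenvalues(noise, m)
ev_periodic = correlation_eigenvalues(periodic, m)
```

For the noise series the spectrum stays inside the random-matrix prediction; for the deterministic series a large eigenvalue escapes the bulk, which is the signature the paper exploits.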
Circulant Matrices and Time-Series Analysis
ERIC Educational Resources Information Center
Pollock, D. S. G.
2002-01-01
This paper sets forth some salient results in the algebra of circulant matrices which can be used in time-series analysis. It provides easy derivations of some results that are central to the analysis of statistical periodograms and empirical spectral density functions. A statistical test for the stationarity or homogeneity of empirical processes…
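The central algebraic fact such derivations rest on, that every circulant matrix is diagonalized by the discrete Fourier transform, so its eigenvalues are the DFT of its first column, is easy to check numerically:

```python
import numpy as np
from scipy.linalg import circulant

c = np.array([4.0, 1.0, 0.0, 1.0])       # symmetric first column -> real eigenvalues
C = circulant(c)

eigs = np.sort(np.linalg.eigvals(C).real)
dft = np.sort(np.fft.fft(c).real)        # DFT of the first column
```

This is why circulant algebra gives "easy derivations" for periodograms: multiplying by a circulant matrix is just a filter in the Fourier domain.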
Circulant Matrices and Time-Series Analysis
ERIC Educational Resources Information Center
Pollock, D. S. G.
2002-01-01
This paper sets forth some salient results in the algebra of circulant matrices which can be used in time-series analysis. It provides easy derivations of some results that are central to the analysis of statistical periodograms and empirical spectral density functions. A statistical test for the stationarity or homogeneity of empirical processes…
Allan deviation analysis of financial return series
NASA Astrophysics Data System (ADS)
Hernández-Pérez, R.
2012-05-01
We perform a scaling analysis of the return series of different financial assets applying the Allan deviation (ADEV), which is used in time and frequency metrology to characterize quantitatively the stability of frequency standards, since it has been shown to be a robust quantity for analyzing fluctuations of non-stationary time series over different observation intervals. The data used are daily opening-price series for assets from different markets over a time span of around ten years. We found that the ADEV results for the return series at short scales resemble those expected for an uncorrelated series, consistent with the efficient market hypothesis. On the other hand, the ADEV results for the absolute return series at short scales (the first one or two decades) decrease approximately following a scaling relation up to a point that differs for almost every asset, after which the ADEV deviates from scaling, which suggests that the presence of clustering, long-range dependence and non-stationarity signatures in the series drives the results for large observation intervals.
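The Allan deviation itself is straightforward to compute. A minimal non-overlapped estimator (the white-noise check below is our own: for uncorrelated data the ADEV falls off as 1/sqrt(window), which is the short-scale behaviour the abstract reports for returns):

```python
import numpy as np

def allan_deviation(x, m):
    """Non-overlapped Allan deviation at an averaging window of m samples:
    sqrt(0.5 * mean of squared differences of consecutive window means)."""
    n = (len(x) // m) * m
    window_means = x[:n].reshape(-1, m).mean(axis=1)
    return float(np.sqrt(0.5 * np.mean(np.diff(window_means) ** 2)))

rng = np.random.default_rng(4)
returns = rng.standard_normal(100_000)   # stand-in for an uncorrelated return series

adev_1 = allan_deviation(returns, 1)     # ~1 for unit-variance white noise
adev_100 = allan_deviation(returns, 100) # ~1/sqrt(100) = 0.1
```

Deviations of real absolute-return series from this 1/sqrt(m) law are what signal clustering and long-range dependence.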
Entropic Analysis of Electromyography Time Series
NASA Astrophysics Data System (ADS)
Kaufman, Miron; Sung, Paul
2005-03-01
We are in the process of assessing the effectiveness of fractal and entropic measures for the diagnosis of low back pain from surface electromyography (EMG) time series. Surface EMG is used to assess patients with low back pain. In a typical EMG measurement, the voltage is measured every millisecond. We observed back muscle fatiguing during one minute, which results in a time series with 60,000 entries. We characterize the complexity of the time series by computing the time dependence of the Shannon entropy. The analysis of time series from different relevant muscles of healthy and low back pain (LBP) individuals provides evidence that the level of variability of back muscle activity is much larger for healthy individuals than for individuals with LBP. In general, the time dependence of the entropy shows a crossover from a diffusive regime to a regime characterized by long-time correlations (self-organization) at about 0.01 s.
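A windowed Shannon entropy of the kind used to characterize such series can be sketched as follows (window length and bin count are illustrative, not those of the study):

```python
import numpy as np

def windowed_entropy(x, win, bins=16):
    """Shannon entropy (bits) of the amplitude histogram in consecutive
    non-overlapping windows of length win."""
    ents = []
    for start in range(0, len(x) - win + 1, win):
        counts, _ = np.histogram(x[start:start + win], bins=bins)
        p = counts[counts > 0] / counts.sum()
        ents.append(float(-(p * np.log2(p)).sum()))
    return np.array(ents)

rng = np.random.default_rng(5)
flat = np.ones(10_000)                       # no variability -> zero entropy
varied = rng.uniform(size=10_000)            # high variability -> near log2(bins)

e_flat = windowed_entropy(flat, win=1000)
e_varied = windowed_entropy(varied, win=1000)
```

Higher entropy corresponds to the greater muscle-activity variability the abstract reports for healthy individuals.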
Improved singular spectrum analysis for time series with missing data
NASA Astrophysics Data System (ADS)
Shen, Y.; Peng, F.; Li, B.
2015-07-01
Singular spectrum analysis (SSA) is a powerful technique for time series analysis. Based on the property that the original time series can be reproduced from its principal components, this contribution develops an improved SSA (ISSA) for processing incomplete time series, of which the modified SSA (SSAM) of Schoellhamer (2001) is a special case. The approach is evaluated with synthetic and real incomplete time series data of suspended-sediment concentration from San Francisco Bay. The result from the synthetic time series with missing data shows that the relative errors of the principal components reconstructed by ISSA are much smaller than those reconstructed by SSAM. Moreover, when the percentage of missing data over the whole time series reaches 60 %, the improvements in relative errors are up to 19.64, 41.34, 23.27 and 50.30 % for the first four principal components, respectively. Both the mean absolute error and the mean root mean squared error of the time series reconstructed by ISSA are also smaller than those by SSAM; the respective improvements are 34.45 and 33.91 % when the missing data account for 60 %. The results from the real incomplete time series also show that the standard deviation (SD) derived by ISSA is 12.27 mg L-1, smaller than the 13.48 mg L-1 derived by SSAM.
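For readers new to SSA, the complete-data version that ISSA extends fits in a few lines: embed the series into a trajectory (Hankel) matrix, truncate its SVD, and diagonal-average back to a series. A sketch (window length and component count are illustrative):

```python
import numpy as np

def ssa_reconstruct(x, window, n_components):
    """Basic SSA: build the trajectory matrix, keep the leading SVD
    components, and average anti-diagonals to recover a smoothed series."""
    n = len(x)
    k = n - window + 1
    X = np.column_stack([x[i:i + window] for i in range(k)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
    rec = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):                 # diagonal averaging (Hankelization)
        rec[j:j + window] += Xr[:, j]
        counts[j:j + window] += 1
    return rec / counts

rng = np.random.default_rng(6)
t = np.arange(300)
clean = np.sin(0.2 * t)                          # a rank-2 signal in SSA terms
noisy = clean + 0.3 * rng.standard_normal(300)
smooth = ssa_reconstruct(noisy, window=40, n_components=2)
```

ISSA's contribution is to carry out this reconstruction when entries of the trajectory matrix are missing.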
Modelling road accidents: An approach using structural time series
NASA Astrophysics Data System (ADS)
Junus, Noor Wahida Md; Ismail, Mohd Tahir
2014-09-01
In this paper, the trend of road accidents in Malaysia for the years 2001 until 2012 was modelled using a structural time series approach. The structural time series model was identified using a stepwise method, and the residuals for each model were tested. The best-fitted model was chosen based on the smallest Akaike Information Criterion (AIC) and prediction error variance. In order to check the quality of the model, a data validation procedure was performed by predicting the monthly number of road accidents for the year 2012. Results indicate that the best specification of the structural time series model to represent road accidents is the local level with a seasonal model.
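A structural time series model is estimated state-by-state with the Kalman filter. As a minimal sketch, here is the local level component that the paper's best model is built around (without the seasonal part, and with the noise variances fixed rather than estimated; the simulated flat series is ours):

```python
import numpy as np

def local_level_filter(y, var_eps, var_eta):
    """Kalman filter for the local level model
         y_t = mu_t + eps_t,   mu_{t+1} = mu_t + eta_t,
    returning the filtered level and the Gaussian log-likelihood."""
    a, p = 0.0, 1e7          # diffuse initial level
    level, loglik = [], 0.0
    for yt in y:
        f = p + var_eps                        # one-step prediction variance
        v = yt - a                             # prediction error
        k = p / f                              # Kalman gain
        a += k * v                             # filtered level
        p = p * (1.0 - k) + var_eta            # variance for the next step
        level.append(a)
        loglik += -0.5 * (np.log(2 * np.pi * f) + v * v / f)
    return np.array(level), loglik

rng = np.random.default_rng(7)
y = 5.0 + 0.5 * rng.standard_normal(200)       # flat series observed with noise
level, loglik = local_level_filter(y, var_eps=0.25, var_eta=0.01)
```

The log-likelihood returned here is what an AIC-based model comparison, like the paper's, would be computed from.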
Noolvi, Malleshappa N.; Patel, Harun M.
2010-01-01
Epidermal growth factor receptor (EGFR) protein tyrosine kinases (PTKs) are known for their role in cancer. Quinazolines have been reported to be molecules of interest, with potent anticancer activity; they act by binding to the ATP site of protein kinases. The ATP binding site of protein kinases provides an extensive opportunity to design newer analogs. With this background, we report an attempt to discern the structural and physicochemical requirements for inhibition of EGFR tyrosine kinase. The k-Nearest Neighbor Molecular Field Analysis (kNN-MFA), a three-dimensional quantitative structure-activity relationship (3D-QSAR) method, has been used in the present case to study the correlation between molecular properties and tyrosine kinase (EGFR) inhibitory activities in a series of quinazoline derivatives. kNN-MFA calculations for both electrostatic and steric fields were carried out. The master grid maps derived from the best model have been used to display the contributions of the electrostatic potential and steric field. The statistical results showed a significant correlation coefficient r2 (q2) of 0.846, r2 for the external test set (pred_r2) of 0.8029, a coefficient of correlation of the predicted data set (pred_r2se) of 0.6658, 89 degrees of freedom and a k nearest neighbor of 2. Therefore, this study not only casts light on the binding mechanism between EGFR and its inhibitors, but also provides hints for the design of new EGFR inhibitors with observable structural diversity. PMID:24825983
Complex network analysis of time series
NASA Astrophysics Data System (ADS)
Gao, Zhong-Ke; Small, Michael; Kurths, Jürgen
2016-12-01
Revealing complicated behaviors from time series constitutes a fundamental problem of continuing interest, and it has attracted a great deal of attention from a wide variety of fields on account of its significant importance. The past decade has witnessed a rapid development of complex network studies, which make it possible to characterize many types of systems in nature and technology that contain a large number of components interacting with each other in a complicated manner. Recently, complex network theory has been incorporated into the analysis of time series, and fruitful achievements have been obtained. Complex network analysis of time series opens up new avenues to address interdisciplinary challenges in climate dynamics, multiphase flow, brain function, ECG dynamics, economics and traffic systems.
NASA Astrophysics Data System (ADS)
Hamid, Nor Zila Abd; Adenan, Nur Hamiza; Noorani, Mohd Salmi Md
2017-08-01
Forecasting and analyzing the ozone (O3) concentration time series is important because the pollutant is harmful to health. This study is a pilot study for forecasting and analyzing the O3 time series in a Malaysian educational area, namely Shah Alam, using a chaotic approach. Through this approach, the observed hourly scalar time series is reconstructed into a multi-dimensional phase space, which is then used to forecast the future time series through the local linear approximation method. The main purpose is to forecast high O3 concentrations. The original method performed poorly, but the improved method addressed this weakness, enabling the high concentrations to be successfully forecast. The correlation coefficient between the observed and forecasted time series through the improved method is 0.9159, and both the mean absolute error and root mean squared error are low. Thus, the improved method is advantageous. The time series analysis by means of the phase space plot and the Cao method identified the presence of low-dimensional chaotic dynamics in the observed O3 time series. Results showed that at least seven factors affect the studied O3 time series, which is consistent with the factors listed in the diurnal variations investigation and the sensitivity analysis from past studies. In conclusion, the chaotic approach has successfully forecast and analyzed the O3 time series in the educational area of Shah Alam. These findings are expected to help stakeholders such as the Ministry of Education and the Department of Environment achieve better air pollution management.
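The chaotic forecasting pipeline described, delay-embed the scalar series into a phase space, then predict from nearby states, can be sketched with a zeroth-order (nearest-neighbour-average) variant of the local approximation idea (embedding parameters and the clean test signal are illustrative, not those of the study):

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Delay-coordinate reconstruction: row t is (x[t], x[t+tau], ...)."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

def predict_next(x, dim=3, tau=1, k=4):
    """Predict the next value by averaging the successors of the k states
    nearest to the current state in the reconstructed phase space."""
    X = delay_embed(x, dim, tau)
    current, history = X[-1], X[:-1]
    dist = np.linalg.norm(history - current, axis=1)
    nearest = np.argsort(dist)[:k]
    successors = x[nearest + (dim - 1) * tau + 1]
    return float(successors.mean())

t = np.arange(500)
series = np.sin(0.2 * t)                  # a predictable stand-in signal
pred = predict_next(series[:-1])
true = float(series[-1])
```

The study's local linear approximation fits a linear map on the neighbours instead of a plain average, and the Cao method is what fixes `dim`.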
Integrated method for chaotic time series analysis
Hively, L.M.; Ng, E.G.
1998-09-29
Methods and apparatus are disclosed for automatically detecting differences between similar but different states of a nonlinear process by monitoring nonlinear data. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time-serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated. 8 figs.
Nonlinear Time Series Analysis via Neural Networks
NASA Astrophysics Data System (ADS)
Volná, Eva; Janošek, Michal; Kocian, Václav; Kotyrba, Martin
This article deals with time series analysis based on neural networks, with the aim of effective pattern recognition in the forex market [Moore and Roche, J. Int. Econ. 58, 387-411 (2002)]. Our goal is to find and recognize important patterns that repeatedly appear in the market history and to adapt our trading system's behaviour based on them.
Time-frequency analysis of electroencephalogram series
NASA Astrophysics Data System (ADS)
Blanco, S.; Quiroga, R. Quian; Rosso, O. A.; Kochen, S.
1995-03-01
In this paper we propose a method, based on the Gabor transform, to quantify and visualize the time evolution of the traditional frequency bands defined in the analysis of electroencephalogram (EEG) series. The information obtained in this way can be used for information transfer analyses of epileptic seizures as well as for their characterization. We found an optimal correlation between EEG visual inspection and the proposed method in the characterization of paroxysms, spikes, and other transient alterations of background activity. The dynamical changes during an epileptic seizure are shown through the phase portrait. The proposed method is exemplified with EEG series obtained with depth electrodes in refractory epileptic patients.
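The Gabor transform is a short-time Fourier transform with a Gaussian window, so band-power time courses of the kind described can be sketched with SciPy (the band edges and the synthetic two-regime signal are our own illustration, not EEG data):

```python
import numpy as np
from scipy.signal import stft

fs = 256.0
t = np.arange(0, 8, 1 / fs)
# synthetic "EEG": 10 Hz (alpha band) for 4 s, then 4 Hz (theta band)
x = np.where(t < 4, np.sin(2 * np.pi * 10 * t), np.sin(2 * np.pi * 4 * t))

# STFT with a Gaussian window plays the role of the Gabor transform
f, tt, Z = stft(x, fs=fs, window=("gaussian", 32), nperseg=256)
power = np.abs(Z) ** 2

alpha = power[(f >= 8) & (f <= 13)].sum(axis=0)   # alpha-band power vs time
theta = power[(f >= 3) & (f <= 7)].sum(axis=0)    # theta-band power vs time
```

Plotting `alpha` and `theta` against `tt` gives exactly the kind of band-evolution picture the paper uses to track seizures.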
Transition Icons for Time Series Visualization and Exploratory Analysis.
Nickerson, Paul; Baharloo, Raheleh; Wanigatunga, Amal A; Manini, Todd D; Tighe, Patrick J; Rashidi, Parisa
2017-05-16
The modern healthcare landscape has seen the rapid emergence of techniques and devices which temporally monitor and record physiological signals. The prevalence of time series data within the healthcare field necessitates the development of methods which can analyze the data in order to draw meaningful conclusions. Time series behavior is notoriously difficult to intuitively understand due to its intrinsic high-dimensionality, which is compounded in the case of analyzing groups of time series collected from different patients. Our framework, which we call Transition Icons, renders common patterns in a visual format useful for understanding the shared behavior within groups of time series. Transition Icons are adept at detecting and displaying subtle differences and similarities e.g. between measurements taken from patients receiving different treatment strategies or stratified by demographics. We introduce various methods which collectively allow for exploratory analysis of groups of time series, while being free of distribution assumptions and including simple heuristics for parameter determination. Our technique extracts discrete transition patterns from Symbolic Aggregate approXimation (SAX) representations, and compiles transition frequencies into a Bag of Patterns (BoP) constructed for each group. These transition frequencies are normalized and aligned in icon form to intuitively display the underlying patterns. We demonstrate the Transition Icon technique for two time series data sets - postoperative pain scores, and hip-worn accelerometer activity counts. We believe Transition Icons can be an important tool for researchers approaching time series data, as they give rich and intuitive information about collective time series behaviors.
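The SAX-then-transition-counting pipeline underlying Transition Icons can be sketched compactly (a 4-letter alphabet with standard-normal breakpoints; the parameters are illustrative, not the paper's):

```python
import numpy as np

def sax_word(x, word_len, alphabet="abcd"):
    """Symbolic Aggregate approXimation: z-normalize, average within
    word_len equal segments (PAA), then quantize by Gaussian breakpoints."""
    z = (x - x.mean()) / x.std()
    n = len(z) // word_len * word_len
    paa = z[:n].reshape(word_len, -1).mean(axis=1)
    breakpoints = np.array([-0.674, 0.0, 0.674])  # equiprobable under N(0,1)
    return "".join(alphabet[i] for i in np.searchsorted(breakpoints, paa))

def transition_bag(word):
    """Counts of symbol-to-symbol transitions: a tiny Bag of Patterns."""
    bag = {}
    for a, b in zip(word, word[1:]):
        bag[a + b] = bag.get(a + b, 0) + 1
    return bag

word = sax_word(np.arange(100.0), word_len=4)   # a steadily rising series
bag = transition_bag(word)
```

Normalizing such bags within each patient group and laying the frequencies out on a grid is what turns them into the icon visualization.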
A probability distribution approach to synthetic turbulence time series
NASA Astrophysics Data System (ADS)
Sinhuber, Michael; Bodenschatz, Eberhard; Wilczek, Michael
2016-11-01
The statistical features of turbulence can be described in terms of multi-point probability density functions (PDFs). The complexity of these statistical objects increases rapidly with the number of points. This raises the question of how much information has to be incorporated into statistical models of turbulence to capture essential features such as inertial-range scaling and intermittency. Using high Reynolds number hot-wire data obtained at the Variable Density Turbulence Tunnel at the Max Planck Institute for Dynamics and Self-Organization, we establish a PDF-based approach to generating synthetic time series that reproduce those features. To do this, we measure three-point conditional PDFs from the experimental data and use an adaption-rejection method to draw random velocities from this distribution to produce synthetic time series. Analyzing these synthetic time series, we find that time series based on even low-dimensional conditional PDFs already capture some essential features of real turbulent flows.
Interrupted time series analysis in clinical research.
Matowe, Lloyd K; Leister, Cathie A; Crivera, Concetta; Korth-Bradley, Joan M
2003-01-01
To demonstrate the usefulness of interrupted time series analysis in clinical trial design. A safety data set of electrocardiographic (ECG) information was simulated from actual data that had been collected in a Phase I study. Simulated data on 18 healthy volunteers based on a study performed in a contract research facility were collected based on single doses of an experimental medication that may affect ECG parameters. Serial ECGs were collected before and during treatment with the experimental medication. Data from 7 real subjects receiving placebo were used to simulate the pretreatment phase of time series; data from 18 real subjects receiving active treatment were used to simulate the treatment phase of the time series. Visual inspection of data was performed, followed by tests for trend, seasonality, and autocorrelation by use of SAS. There was no evidence of trend, seasonality, or autocorrelation. In 11 of 18 simulated individuals, statistically significant changes in QTc intervals were observed following treatment with the experimental medication. A significant time of day and treatment interaction was observed in 4 simulated patients. Interrupted time series analysis techniques offer an additional tool for the study of clinical situations in which patients must act as their own controls and where serial data can be collected at evenly distributed intervals.
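The core of an interrupted time series analysis is a segmented regression with level-change and slope-change terms at the interruption. A minimal sketch on simulated data (the step size, noise level, and segment lengths are our own, not the simulated ECG data of the study):

```python
import numpy as np

rng = np.random.default_rng(2)
n_pre, n_post = 20, 20
t = np.arange(n_pre + n_post, dtype=float)
step = (t >= n_pre).astype(float)            # 0 before, 1 after the intervention

# simulate a flat pre-period and a +3 level change at the interruption
y = 10.0 + 3.0 * step + 0.5 * rng.standard_normal(t.size)

# design: intercept, baseline trend, level change, post-intervention slope change
X = np.column_stack([np.ones_like(t), t, step, (t - n_pre) * step])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
level_change = beta[2]
```

Testing `level_change` against its standard error is the ITS analogue of the pre/post QTc comparison described, with each subject acting as their own control.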
Delay differential analysis of time series.
Lainscsek, Claudia; Sejnowski, Terrence J
2015-03-01
Nonlinear dynamical system analysis based on embedding theory has been used for modeling and prediction, but it also has applications to signal detection and classification of time series. An embedding creates a multidimensional geometrical object from a single time series. Traditionally either delay or derivative embeddings have been used. The delay embedding is composed of delayed versions of the signal, and the derivative embedding is composed of successive derivatives of the signal. The delay embedding has been extended to nonuniform embeddings to take multiple timescales into account. Both embeddings provide information on the underlying dynamical system without having direct access to all the system variables. Delay differential analysis is based on functional embeddings, a combination of the derivative embedding with nonuniform delay embeddings. Small delay differential equation (DDE) models that best represent relevant dynamic features of time series data are selected from a pool of candidate models for detection or classification. We show that the properties of DDEs support spectral analysis in the time domain where nonlinear correlation functions are used to detect frequencies, frequency and phase couplings, and bispectra. These can be efficiently computed with short time windows and are robust to noise. For frequency analysis, this framework is a multivariate extension of discrete Fourier transform (DFT), and for higher-order spectra, it is a linear and multivariate alternative to multidimensional fast Fourier transform of multidimensional correlations. This method can be applied to short or sparse time series and can be extended to cross-trial and cross-channel spectra if multiple short data segments of the same experiment are available. Together, this time-domain toolbox provides higher temporal resolution, increased frequency and phase coupling information, and it allows an easy and straightforward implementation of higher-order spectra across time
Time-Series Analysis: A Cautionary Tale
NASA Technical Reports Server (NTRS)
Damadeo, Robert
2015-01-01
Time-series analysis has often been a useful tool in atmospheric science for deriving long-term trends in various atmospherically important parameters (e.g., temperature or the concentration of trace gas species). In particular, time-series analysis has been repeatedly applied to satellite datasets in order to derive the long-term trends in stratospheric ozone, which is a critical atmospheric constituent. However, many of the potential pitfalls relating to the non-uniform sampling of the datasets were often ignored and the results presented by the scientific community have been unknowingly biased. A newly developed and more robust application of this technique is applied to the Stratospheric Aerosol and Gas Experiment (SAGE) II version 7.0 ozone dataset and the previous biases and newly derived trends are presented.
Multiresolution analysis of Bursa Malaysia KLCI time series
NASA Astrophysics Data System (ADS)
Ismail, Mohd Tahir; Dghais, Amel Abdoullah Ahmed
2017-05-01
In general, a time series is simply a sequence of numbers collected at regular intervals over a period of time. Financial time series processing is concerned with the theory and practice of processing asset prices over time, such as currency, commodity, and stock market data. The primary aim of this study is to understand the fundamental characteristics of selected financial time series by using both time and frequency domain analysis. Prediction can then be executed for the desired system for in-sample forecasting. In this study, multiresolution analysis, with the aid of the discrete wavelet transform (DWT) and the maximal overlap discrete wavelet transform (MODWT), is used to pinpoint special characteristics of the Bursa Malaysia KLCI (Kuala Lumpur Composite Index) daily closing prices and return values. In addition, further case study discussions include the modeling of the Bursa Malaysia KLCI using linear ARIMA with wavelets to address how the multiresolution approach improves fitting and forecasting results.
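To make the multiresolution idea concrete, here is a minimal one-level Haar DWT written from scratch (the paper uses DWT/MODWT filters from the wavelet literature; the Haar filter and the toy price numbers here are our own simplification):

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform.

    Returns (approximation, detail) coefficients; applying it again to
    the approximation gives the next, coarser resolution level.
    """
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return approx, detail

# Toy "closing price" series (hypothetical numbers, power-of-two length).
prices = np.array([10.0, 10.5, 10.2, 10.8, 11.0, 10.9, 11.3, 11.1])
a1, d1 = haar_dwt(prices)  # level 1: smooth trend + local fluctuations
a2, d2 = haar_dwt(a1)      # level 2: coarser trend
print(a1.shape, a2.shape)  # → (4,) (2,)
```

Because the Haar transform is orthonormal, the energy of the series is preserved across the (approximation, detail) split, which is what lets each resolution level be interpreted separately.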
Exploratory Causal Analysis in Bivariate Time Series Data
NASA Astrophysics Data System (ADS)
McCracken, James M.
Many scientific disciplines rely on observational data of systems for which it is difficult (or impossible) to implement controlled experiments, and data analysis techniques are required for identifying causal information and relationships directly from observational data. This need has led to the development of many different time series causality approaches and tools including transfer entropy, convergent cross-mapping (CCM), and Granger causality statistics. In this thesis, the existing time series causality method of CCM is extended by introducing a new method called pairwise asymmetric inference (PAI). It is found that CCM may provide counter-intuitive causal inferences for simple dynamics with strong intuitive notions of causality, and the CCM causal inference can be a function of physical parameters that are seemingly unrelated to the existence of a driving relationship in the system. For example, a CCM causal inference might alternate between "voltage drives current" and "current drives voltage" as the frequency of the voltage signal is changed in a series circuit with a single resistor and inductor. PAI is introduced to address both of these limitations. Many of the current approaches in the time series causality literature are not computationally straightforward to apply, do not follow directly from assumptions of probabilistic causality, depend on assumed models for the time series generating process, or rely on embedding procedures. A new approach, called causal leaning, is introduced in this work to avoid these issues. The leaning is found to provide causal inferences that agree with intuition for both simple systems and more complicated empirical examples, including space weather data sets. The leaning may provide a clearer interpretation of the results than those from existing time series causality tools. A practicing analyst can explore the literature to find many proposals for identifying drivers and causal connections in time series data.
Learning dynamics from nonstationary time series: Analysis of electroencephalograms
NASA Astrophysics Data System (ADS)
Gribkov, Dmitrii; Gribkova, Valentina
2000-06-01
We propose an empirical modeling technique for nonstationary time series analysis. Proposed methods include a high-dimensional (N>3) dynamical model construction in the form of delay differential equations, a nonparametric method of respective time delay calculation, the detection of quasistationary regions of the process by recurrence analysis in the space of model coefficients, and final fitting of the model to quasistationary segments of the observed time series. We also demonstrate the effectiveness of our approach for nonstationary signal classification in the space of model coefficients. Applying the empirical modeling technique to electroencephalogram (EEG) records analysis, we find evidence of high-dimensional nonlinear dynamics in quasistationary EEG segments. Recurrence analysis of model parameters reveals long-term correlations in nonstationary EEG records. Using the dynamical model as a nonlinear filter, we find that different emotional states of subjects can be clearly distinguished in the space of model coefficients.
Richmond, Amy; Sanchez, Belinda; Stevenson, Valerie; Baker, Russell T.; May, James; Nasypany, Alan; Reordan, Don
2016-01-01
ABSTRACT Background Partial meniscectomy does not consistently produce the desired positive outcomes intended for meniscal tear lesions; therefore, a need exists for research into alternatives for treating symptoms of meniscal tears. The purpose of this case series was to examine the effect of the Mulligan Concept (MC) “Squeeze” technique in physically active participants who presented with clinical symptoms of meniscal tears. Description of Cases The MC “Squeeze” technique was applied in five cases of clinically diagnosed meniscal tears in a physically active population. The Numeric Pain Rating Scale (NRS), the Patient Specific Functional Scale (PSFS), the Disability in the Physically Active (DPA) Scale, and the Knee injury and Osteoarthritis Outcome Score (KOOS) were administered to assess participant pain level and function. Outcomes Statistically significant improvements were found on cumulative NRS (p ≤ 0.001), current NRS (p ≤ 0.002), PSFS (p ≤ 0.003), DPA (p ≤ 0.019), and KOOS (p ≤ 0.002) scores across all five participants. All participants exceeded the minimal clinically important difference (MCID) on the first treatment and reported an NRS score and current pain score of one point or less at discharge. The MC “Squeeze” technique produced statistically and clinically significant changes across all outcome measures in all five participants. Discussion The use of the MC “Squeeze” technique in this case series indicated positive outcomes in five participants who presented with meniscal tear symptoms. Of importance to the athletic population, each of the participants continued to engage in sport activity as tolerated unless otherwise required during the treatment period. The outcomes reported in this case series exceed those reported when using traditional conservative therapy and the return-to-play timelines for meniscal tears treated with partial meniscectomies. Levels of Evidence Level 4 PMID:27525181
Jansen, Teunis; Kristensen, Kasper; Payne, Mark; Edwards, Martin; Schrum, Corinna; Pitois, Sophie
2012-01-01
We present a unique view of mackerel (Scomber scombrus) in the North Sea based on a new time series of larvae caught by the Continuous Plankton Recorder (CPR) survey from 1948-2005, covering the period both before and after the collapse of the North Sea stock. Hydrographic backtrack modelling suggested that the effect of advection is very limited between spawning and larvae capture in the CPR survey. Using a statistical technique not previously applied to CPR data, we then generated a larval index that accounts for both catchability as well as spatial and temporal autocorrelation. The resulting time series documents the significant decrease of spawning from before 1970 to recent depleted levels. Spatial distributions of the larvae, and thus the spawning area, showed a shift from early to recent decades, suggesting that the central North Sea is no longer as important as the areas further west and south. These results provide a consistent and unique perspective on the dynamics of mackerel in this region and can potentially resolve many of the unresolved questions about this stock. PMID:22737221
Singular spectrum analysis for time series with missing data
Schoellhamer, D.H.
2001-01-01
Geophysical time series often contain missing data, which prevents analysis with many signal processing and multivariate tools. A modification of singular spectrum analysis for time series with missing data is developed and successfully tested with synthetic and actual incomplete time series of suspended-sediment concentration from San Francisco Bay. This method can also be used to low-pass filter incomplete time series.
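One place missing data enters SSA is the lag-covariance matrix whose eigendecomposition drives the method. The sketch below estimates that matrix while skipping index pairs where either value is missing (a simplified stand-in for the paper's modification; the function name and window choice are ours):

```python
import numpy as np

def ssa_lagcov(x, window):
    """Lag-covariance matrix for SSA, ignoring missing (NaN) pairs.

    Entry (i, j) averages x[t+i] * x[t+j] only over shifts t where
    both values are present, so gaps do not poison the estimate.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    c = np.empty((window, window))
    for i in range(window):
        for j in range(window):
            prod = x[i : n - window + 1 + i] * x[j : n - window + 1 + j]
            c[i, j] = np.nanmean(prod)  # NaN products are skipped
    return c

# Sine series with a block of missing observations.
t = np.arange(200, dtype=float)
x = np.sin(2.0 * np.pi * t / 20.0)
x[50:60] = np.nan
cov = ssa_lagcov(x, window=30)
eigvals = np.linalg.eigvalsh(cov)
print(eigvals[-2:])  # the leading eigenvalue pair captures the oscillation
```

Despite the gap, every matrix entry is finite, so the usual SSA eigendecomposition can proceed.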
Tremor classification and tremor time series analysis
NASA Astrophysics Data System (ADS)
Deuschl, Günther; Lauk, Michael; Timmer, Jens
1995-03-01
The separation between physiologic tremor (PT) in normal subjects and the pathological tremors of essential tremor (ET) or Parkinson's disease (PD) was investigated on the basis of monoaxial accelerometric recordings of 35 s hand tremor epochs. Frequency and amplitude were insufficient to separate these conditions, except for the trivial distinction between normal and pathologic tremors that is already defined on the basis of amplitude. We found that waveform analysis revealed highly significant differences between normal and pathologic tremors and, more importantly, among different forms of pathologic tremors. In our group of 25 subjects with PT and 15 with ET, we found a reasonable distinction using the third moment and time reversal invariance, and a nearly complete distinction between these two conditions on the basis of the asymmetric decay of the autocorrelation function. We conclude that time series analysis can probably be developed into a powerful tool for the objective analysis of tremors.
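The two waveform statistics named here are simple to compute. Below is one common way to define them (the exact estimators and normalizations in the paper may differ; the test signals are our own):

```python
import numpy as np

def third_moment(x):
    """Normalized third central moment of the waveform (its skewness)."""
    x = np.asarray(x, dtype=float)
    z = x - x.mean()
    return np.mean(z ** 3) / np.mean(z ** 2) ** 1.5

def time_reversal_asymmetry(x, lag=1):
    """Statistic that is near zero for time-reversible signals.

    Built from the skewness of lagged increments: reversing a
    time-reversible signal leaves the increment distribution symmetric.
    """
    x = np.asarray(x, dtype=float)
    d = x[lag:] - x[:-lag]
    return np.mean(d ** 3) / np.mean(d ** 2) ** 1.5

# A sine wave is time reversible; an exponential sample is heavily skewed.
t = np.linspace(0.0, 20.0 * np.pi, 2000)
rng = np.random.default_rng(3)
skewed = rng.exponential(size=5000)
print(time_reversal_asymmetry(np.sin(t)))  # near zero
print(third_moment(skewed))                # clearly positive
```

Applied to accelerometer epochs, statistics like these capture waveform shape rather than just frequency or amplitude, which is the point the abstract makes.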
NASA Astrophysics Data System (ADS)
Wang, Dong; Singh, Vijay P.; Shang, Xiaosan; Ding, Hao; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Chen, Yuanfang; Chen, Xi; Wang, Shicheng; Wang, Zhenlong
2014-07-01
De-noising meteorologic and hydrologic time series is important to improve the accuracy and reliability of extraction, analysis, simulation, and forecasting. A hybrid approach, combining sample entropy and wavelet de-noising, is developed to separate noise from the original series and is named AWDA-SE (adaptive wavelet de-noising approach using sample entropy). The AWDA-SE approach adaptively determines the threshold for wavelet analysis. Two kinds of meteorologic and hydrologic data sets, a synthetic data set and 3 representative field-measured data sets (one is the annual rainfall data of Jinan station and the other two are annual streamflow series from two typical stations in China, Yingluoxia station on the Heihe River, which is little affected by human activities, and Lijin station on the Yellow River, which is greatly affected by human activities), are used to illustrate the approach. The AWDA-SE approach is compared with three conventional de-noising methods, including the fixed-form threshold algorithm, the Stein unbiased risk estimation algorithm, and the minimax algorithm. Results show that the AWDA-SE approach effectively separates the signal and noise of the data sets and is found to be better than the conventional methods. Measures of assessment standards show that the developed approach can be employed to investigate noisy and short time series and can also be applied to other areas.
Choosing chemicals for precautionary regulation: a filter series approach.
Müller-Herold, Ulrich; Morosini, Marco; Schucht, Olivier
2005-02-01
The present case study develops and applies a systematic approach to the precautionary pre-screening of xenobiotic organic chemicals with respect to large-scale environmental threats. It starts from scenarios for uncontrollable harm and identifies conditions for their occurrence, which are then related to a set of amplifying factors, such as the characteristic isotropic spatial range p. The amplifying factors related to a particular scenario are combined in a pre-screening filter. It is the amplifying factors that can transform a potential local damage into a large-scale threat. Controlling the amplifying factors means controlling the scope and range of the potential for damage. The threshold levels for the amplifying factors of each filter are fixed through recourse to historical and present-day reference chemicals, so as to filter out as many as possible of the currently regulated environmental chemicals and to pass the economically important compounds that pose no large-scale environmental concern. The totality of filters, with each filter corresponding to a particular threat scenario, provides the filter series to be used in precautionary regulation. As a demonstration, the filter series is then applied to a group of nonreferential chemicals. The case study suggests that the filter series approach may serve as a starting point for precautionary assessment as a scientific method of its own.
Compounding approach for univariate time series with nonstationary variances
NASA Astrophysics Data System (ADS)
Schäfer, Rudi; Barkhofen, Sonja; Guhr, Thomas; Stöckmann, Hans-Jürgen; Kuhl, Ulrich
2015-12-01
A defining feature of nonstationary systems is the time dependence of their statistical parameters. Measured time series may exhibit Gaussian statistics on short time horizons, due to the central limit theorem. The sample statistics for long time horizons, however, averages over the time-dependent variances. To model the long-term statistical behavior, we compound the local distribution with the distribution of its parameters. Here, we consider two concrete, but diverse, examples of such nonstationary systems: the turbulent air flow of a fan and a time series of foreign exchange rates. Our main focus is to empirically determine the appropriate parameter distribution for the compounding approach. To this end, we extract the relevant time scales by decomposing the time signals into windows and determine the distribution function of the thus obtained local variances.
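The central empirical step described here, extracting local variances from time windows so their distribution can be used for compounding, can be sketched in a few lines (window length and the synthetic drifting-scale series are our own illustrative choices):

```python
import numpy as np

def local_variances(x, window):
    """Variance estimated in non-overlapping windows of the series.

    The empirical distribution of these local variances is what the
    compounding approach uses as its parameter distribution.
    """
    x = np.asarray(x, dtype=float)
    n = (len(x) // window) * window      # drop an incomplete last window
    segments = x[:n].reshape(-1, window)
    return segments.var(axis=1, ddof=1)

# Synthetic nonstationary series: Gaussian noise whose scale drifts,
# so it is locally Gaussian but globally heavy-tailed.
rng = np.random.default_rng(0)
scales = 1.0 + 0.5 * np.sin(np.linspace(0.0, 4.0 * np.pi, 10000))
x = rng.normal(0.0, scales)
v = local_variances(x, window=100)
print(v.mean(), v.std())  # the spread in v reflects the nonstationarity
```

Fitting a distribution to `v` and mixing the local Gaussian over it would then give the compounded long-horizon distribution the abstract refers to.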
Dimensionless embedding for nonlinear time series analysis
NASA Astrophysics Data System (ADS)
Hirata, Yoshito; Aihara, Kazuyuki
2017-09-01
Recently, infinite-dimensional delay coordinates (InDDeCs) have been proposed for predicting high-dimensional dynamics instead of conventional delay coordinates. Although InDDeCs can realize faster computation and more accurate short-term prediction, it is still not well known whether InDDeCs can be used in other applications of nonlinear time series analysis in which reconstruction of the underlying dynamics is needed from a scalar time series generated by a dynamical system. Here, we give theoretical support for justifying the use of InDDeCs and provide numerical examples to show that InDDeCs can be used for various applications: obtaining recurrence plots, correlation dimensions, and maximal Lyapunov exponents, as well as testing directional couplings and extracting slow driving forces. We demonstrate the performance of InDDeCs using weather data. Thus, InDDeCs can eventually realize "dimensionless embedding" while we enjoy faster and more reliable computations.
Nonlinear independent component analysis and multivariate time series analysis
NASA Astrophysics Data System (ADS)
Storck, Jan; Deco, Gustavo
1997-02-01
We derive an information-theory-based unsupervised learning paradigm for nonlinear independent component analysis (NICA) with neural networks. We demonstrate that under the constraint of bounded and invertible output transfer functions the two main goals of unsupervised learning, redundancy reduction and maximization of the transmitted information between input and output (Infomax-principle), are equivalent. No assumptions are made concerning the kind of input and output distributions, i.e. the kind of nonlinearity of correlations. An adapted version of the general NICA network is used for the modeling of multivariate time series by unsupervised learning. Given time series of various observables of a dynamical system, our net learns their evolution in time by extracting statistical dependencies between past and present elements of the time series. Multivariate modeling is obtained by making present value of each time series statistically independent not only from their own past but also from the past of the other series. Therefore, in contrast to univariate methods, the information lying in the couplings between the observables is also used and a detection of higher-order cross correlations is possible. We apply our method to time series of the two-dimensional Hénon map and to experimental time series obtained from the measurements of axial velocities in different locations in weakly turbulent Taylor-Couette flow.
Time-dependent spectral analysis of epidemiological time-series with wavelets.
Cazelles, Bernard; Chavez, Mario; Magny, Guillaume Constantin de; Guégan, Jean-Francois; Hales, Simon
2007-08-22
In the current context of global infectious disease risks, a better understanding of the dynamics of major epidemics is urgently needed. Time-series analysis has appeared as an interesting approach to explore the dynamics of numerous diseases. Classical time-series methods can only be used for stationary time-series (in which the statistical properties do not vary with time). However, epidemiological time-series are typically noisy, complex and strongly non-stationary. Given this specific nature, wavelet analysis appears particularly attractive because it is well suited to the analysis of non-stationary signals. Here, we review the basic properties of the wavelet approach as an appropriate and elegant method for time-series analysis in epidemiological studies. The wavelet decomposition offers several advantages that are discussed in this paper based on epidemiological examples. In particular, the wavelet approach permits analysis of transient relationships between two signals and is especially suitable for gradual change in force by exogenous variables.
Mixed Spectrum Analysis on fMRI Time-Series.
Kumar, Arun; Lin, Feng; Rajapakse, Jagath C
2016-06-01
Temporal autocorrelation present in functional magnetic resonance image (fMRI) data poses challenges to its analysis. The existing approaches handling autocorrelation in fMRI time-series often presume a specific model of autocorrelation such as an auto-regressive model. The main limitation here is that the correlation structure of voxels is generally unknown and varies in different brain regions because of different levels of neurogenic noises and pulsatile effects. Enforcing a universal model on all brain regions leads to bias and loss of efficiency in the analysis. In this paper, we propose the mixed spectrum analysis of the voxel time-series to separate the discrete component corresponding to input stimuli and the continuous component carrying temporal autocorrelation. A mixed spectral analysis technique based on the M-spectral estimator is proposed, which effectively removes autocorrelation effects from voxel time-series and identifies significant peaks of the spectrum. As the proposed method does not assume any prior model for the autocorrelation effect in voxel time-series, varying correlation structure among the brain regions does not affect its performance. We have modified the standard M-spectral method for application to a spatial set of time-series by incorporating the contextual information related to the continuous spectrum of neighborhood voxels, thus reducing the computation cost considerably. The likelihood of activation is predicted by comparing the amplitude of the discrete component at stimulus frequency across voxels of the brain using a normal distribution and modeling spatial correlations among the likelihoods with a conditional random field. We also demonstrate the application of the proposed method in detecting other desired frequencies.
Behavior of road accidents: Structural time series approach
NASA Astrophysics Data System (ADS)
Junus, Noor Wahida Md; Ismail, Mohd Tahir; Arsad, Zainudin
2014-12-01
Road accidents have become a major issue contributing to the increasing number of deaths. Some researchers suggest that road accidents occur due to road structure and road condition. The road structure and condition may differ according to the area and the volume of traffic at the location. Therefore, this paper examines the behavior of road accidents in four main regions in Peninsular Malaysia by employing a structural time series (STS) approach. STS offers the possibility of modelling unobserved components such as trend and seasonal components, which are allowed to vary over time. The results found that the number of road accidents in each region is described by a different model. The results imply that the government, and especially policy makers, should consider implementing different approaches to overcome the increasing number of road accidents.
Time series analysis of temporal networks
NASA Astrophysics Data System (ADS)
Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh
2016-01-01
A common but important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application-oriented studies only knowing these properties is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without knowledge of the full network structure; rather, an estimate of some of the properties is sufficient to launch the attack. In this paper, we show that even if the network structure at a future time point is not available one can still manage to estimate its properties. We propose a novel method to map a temporal network to a set of time series instances, analyze them, and, using a standard forecast model of time series, try to predict the properties of a temporal network at a later time instance. To this aim, we consider eight properties such as the number of active nodes, average degree, and clustering coefficient, and apply our prediction framework to them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as Auto-Regressive-Integrated-Moving-Average (ARIMA). We use cross validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency domain properties of the time series obtained from spectrogram analysis could be used to refine the prediction framework by identifying beforehand the cases where the error in prediction is likely to be high. This leads to an improvement of 7.96% (for error level ≤20%) in prediction accuracy on average across all datasets. As an application we show how such a prediction scheme can be used to launch targeted attacks on temporal networks.
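To show the flavor of forecasting a per-property series, here is a deliberately minimal AR(1) stand-in for the ARIMA models the paper fits (the least-squares AR(1) fit, function name, and the toy "average degree" numbers are our own, not the paper's):

```python
import numpy as np

def ar1_forecast(series, steps=1):
    """Multi-step forecast with an AR(1) model fitted by least squares.

    y[t] - mu = phi * (y[t-1] - mu) + noise; phi is estimated from
    the lag-1 regression of the demeaned series.
    """
    y = np.asarray(series, dtype=float)
    mu = y.mean()
    z = y - mu
    phi = np.dot(z[1:], z[:-1]) / np.dot(z[:-1], z[:-1])
    preds, last = [], z[-1]
    for _ in range(steps):
        last = phi * last
        preds.append(mu + last)
    return np.array(preds)

# Hypothetical per-day "average degree" series of a contact network.
degree_series = [4.1, 4.3, 4.0, 4.4, 4.2, 4.5, 4.3, 4.6]
forecast = ar1_forecast(degree_series, steps=3)
print(forecast)
```

A full ARIMA(p, d, q) fit adds differencing and moving-average terms, but the pipeline, map the network to property series, fit, forecast, is the same shape as above.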
Flutter Analysis for Turbomachinery Using Volterra Series
NASA Technical Reports Server (NTRS)
Liou, Meng-Sing; Yao, Weigang
2014-01-01
The objective of this paper is to describe an accurate and efficient reduced order modeling method for aeroelastic (AE) analysis and for determining the flutter boundary. Without losing accuracy, we develop a reduced order model based on the Volterra series to achieve significant savings in computational cost. The aerodynamic force is provided by a high-fidelity solution from the Reynolds-averaged Navier-Stokes (RANS) equations; the structural mode shapes are determined from the finite element analysis. The fluid-structure coupling is then modeled by the state-space formulation with the structural displacement as input and the aerodynamic force as output, which in turn acts as an external force to the aeroelastic displacement equation for providing the structural deformation. NASA's rotor 67 blade is used to study its aeroelastic characteristics under the designated operating condition. First, the CFD results are validated against measured data available for the steady state condition. Then, the accuracy of the developed reduced order model is compared with the full-order solutions. Finally the aeroelastic solutions of the blade are computed and a flutter boundary is identified, suggesting that the rotor, with the material property chosen for the study, is structurally stable at the operating condition, free of encountering flutter.
TSAN: a package for time series analysis.
Wang, D C; Vagnucci, A H
1980-04-01
Many biomedical data are in the form of time series. Analyses of these data include: (1) search for any biorhythm; (2) test of homogeneity of several time series; (3) assessment of stationarity; (4) test of normality of the time series histogram; (5) evaluation of dependence between data points. In this paper we present a subroutine package called TSAN. It is developed to accomplish these tasks. Computational methods, as well as flowcharts, for these subroutines are described. Two sample runs are demonstrated.
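Task (5), evaluating dependence between data points, is typically done with the sample autocorrelation function. The sketch below is a generic version of such a check, not TSAN's actual FORTRAN-era subroutine:

```python
import numpy as np

def autocorrelation(x, max_lag):
    """Sample autocorrelation at lags 0..max_lag.

    Values near zero at all positive lags indicate independent points;
    slowly decaying values indicate dependence (or a biorhythm).
    """
    x = np.asarray(x, dtype=float)
    z = x - x.mean()
    denom = np.dot(z, z)
    return np.array([np.dot(z[k:], z[: len(z) - k]) / denom
                     for k in range(max_lag + 1)])

# White noise: lag-0 correlation is 1, all other lags are near zero.
rng = np.random.default_rng(1)
white = rng.normal(size=5000)
ac = autocorrelation(white, max_lag=5)
print(ac[0])  # → 1.0 by construction
```

The same function applied to, say, a daily hormone series would reveal a 24 h biorhythm as a peak at the corresponding lag.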
A novel similarity comparison approach for dynamic ECG series.
Yin, Hong; Zhu, Xiaoqian; Ma, Shaodong; Yang, Shuqiang; Chen, Liqian
2015-01-01
The heart sound signal is a reflection of heart and vascular system motion. Long-term continuous electrocardiogram (ECG) contains important information which can be helpful to prevent heart failure. A single piece of a long-term ECG recording usually consists of more than one hundred thousand data points in length, making it difficult to derive hidden features that may be reflected through dynamic ECG monitoring, which is also very time-consuming to analyze. In this paper, a Dynamic Time Warping based on MapReduce (MRDTW) is proposed to make prognoses of possible lesions in patients. Through comparison of a real-time ECG of a patient with the reference sets of normal and problematic cardiac waveforms, the experimental results reveal that our approach not only retains high accuracy, but also greatly improves the efficiency of the similarity measure in dynamic ECG series.
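The core of the MRDTW approach is the classic dynamic-programming DTW distance; the MapReduce part only parallelizes it over reference sets. A minimal single-pair version (the test waveforms are our own, not ECG data):

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences.

    Fills the cumulative cost table cost[i][j] = local distance plus
    the cheapest of the three predecessor cells (insert/delete/match).
    """
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],
                                 cost[i, j - 1],
                                 cost[i - 1, j - 1])
    return cost[n, m]

# A waveform and a time-warped (resampled) copy align almost perfectly,
# while an unrelated flat signal does not.
x = np.sin(np.linspace(0.0, 2.0 * np.pi, 50))
y = np.sin(np.linspace(0.0, 2.0 * np.pi, 70))  # same shape, different speed
print(dtw_distance(x, y) < dtw_distance(x, np.zeros(70)))  # → True
```

This tolerance to local time shifts is why DTW, rather than a pointwise distance, is used to compare heartbeats whose rate varies.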
Rotavirus and adenovirus gastroenteritis: time series analysis.
Celik, Cem; Gozel, Mustafa Gokhan; Turkay, Hakan; Bakici, Mustafa Zahir; Güven, Ahmet Sami; Elaldi, Nazif
2015-08-01
This study investigated the effects of changes in weather conditions (monthly average temperature, monthly minimum temperature, monthly average humidity) on rotavirus and adenovirus gastroenteritis frequency and whether there was a seasonal correlation. Between 2006 and 2012, 4702 fecal samples were taken from patients ≤ 5 years of age with acute gastroenteritis; these samples were analyzed in terms of rotavirus group A and adenovirus serotype 40-41 antigens using time-series and negative binomial regression analysis. Rotavirus antigens were found in 797 samples (17.0%), adenovirus antigens in 113 samples (2.4%), and rotavirus and adenovirus antigens together in 16 samples (0.3%). There was a seasonal change in rotavirus gastroenteritis (P < 0.001), and a 1°C decrease in average temperature increased the ratio of rotavirus cases in those with diarrhea by 0.523%. In addition, compared with data from other years, the number of patients was lower in the first month of 2008 and in the second month of 2012, when the temperature was below -20°C (monthly minimum temperature). There was no statistically significant relationship between adenovirus infection and change in weather conditions. Various factors such as change in weather conditions, as well as the population's sensitivity and associated changes in activity, play a role in the spread of rotavirus infection. © 2015 Japan Pediatric Society.
Deciding on the best (in this case) approach to time-series forecasting
Pack, D. J.
1980-01-01
This paper was motivated by a Decision Sciences article (v. 10, no. 2, 232-244(April 1979)) that presented comparisons of the adaptive estimation procedure (AEP), adaptive filtering, the Box-Jenkins (BJ) methodology, and multiple regression analysis as they apply to time-series forecasting with single-series models. While such comparisons are to be applauded in general, it is demonstrated that the empirical comparisons of the above paper are quite misleading with respect to choosing between the AEP and BJ approaches. This demonstration is followed by a somewhat philosophical discussion on comparison-of-methods techniques.
The scaling of time series size towards detrended fluctuation analysis
NASA Astrophysics Data System (ADS)
Gao, Xiaolei; Ren, Liwei; Shang, Pengjian; Feng, Guochen
2016-06-01
In this paper, we introduce a modification of detrended fluctuation analysis (DFA), called the multivariate DFA (MNDFA) method, based on the scaling of the time series size N. In the traditional DFA method, results are influenced by the sequence segmentation interval s, which inspired us to propose the new MNDFA model to discuss the scaling of time series size in DFA. The effectiveness of the procedure is verified by numerical experiments with both artificial and stock-return series. Results show that the proposed MNDFA method contains more significant information about the series than the traditional DFA method. The scaling of time series size has an influence on the auto-correlation (AC) in time series. For certain series, we obtain an exponential relationship and calculate the slope through the fitting function. Our analysis and finite-size effect test demonstrate that an appropriate choice of the time series size can avoid unnecessary influences and also make the testing results more accurate.
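For reference, the baseline method being modified, standard DFA-1, looks like this (this is the textbook algorithm, not the paper's MNDFA variant; scale choices and the white-noise test are ours):

```python
import numpy as np

def dfa(x, scales):
    """DFA-1: fluctuation function F(s) for each segmentation interval s.

    Integrate the demeaned series, split the profile into segments of
    length s, detrend each linearly, and average the residual variance.
    """
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())
    fluct = []
    for s in scales:
        n_seg = len(profile) // s
        t = np.arange(s)
        f2 = []
        for k in range(n_seg):
            seg = profile[k * s : (k + 1) * s]
            coef = np.polyfit(t, seg, 1)  # linear detrending per segment
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))
    return np.array(fluct)

# White noise should give a scaling exponent alpha near 0.5.
rng = np.random.default_rng(2)
scales = np.array([16, 32, 64, 128])
f = dfa(rng.normal(size=20000), scales)
alpha = np.polyfit(np.log(scales), np.log(f), 1)[0]
print(alpha)  # roughly 0.5 for uncorrelated noise
```

The sensitivity of `F(s)` to the choice of `s` visible in this loop is exactly the dependence the MNDFA modification sets out to control by also varying the series size N.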
Apparatus for statistical time-series analysis of electrical signals
NASA Technical Reports Server (NTRS)
Stewart, C. H. (Inventor)
1973-01-01
An apparatus for performing statistical time-series analysis of complex electrical signal waveforms, permitting prompt and accurate determination of statistical characteristics of the signal is presented.
NASA Astrophysics Data System (ADS)
Hu, Y. M.; Liang, Z. M.; Jiang, X. L.; Bu, H.
2015-06-01
In this paper, a novel approach for non-stationary hydrological frequency analysis is proposed. The approach is motivated by the following consideration: at present, the data series used to detect mutation characteristics are very short and may reflect only part of the characteristics of the population. That is to say, the mutation characteristics of a short series may not fully represent those of the population, such as a difference in mutation degree between the short sample and the population. In the proposed method, an assumption is made that a varying hydrological series in a large time window has an expected vibration center (EVC), which is a linear combination of the two mean values of the two subsample series obtained by separating the original hydrological series with a novel optimal segmentation technique (the change-rate-of-slope method). The EVC is then used to reconstruct the non-stationary series to meet the requirement of stationarity, which in turn ensures that conventional frequency analysis methods remain valid.
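The EVC itself is a simple quantity once the split point is known. The sketch below takes the split and the combination weight as given inputs; the paper derives the split with its change-rate-of-slope technique and its own weighting, both of which are replaced by placeholders here:

```python
import numpy as np

def expected_vibration_center(series, split, w=0.5):
    """EVC as a weighted combination of the two subsample means.

    `split` and the weight `w` are placeholders: the paper obtains the
    split from an optimal segmentation of the series, not a fixed index.
    """
    x = np.asarray(series, dtype=float)
    m1, m2 = x[:split].mean(), x[split:].mean()
    return w * m1 + (1.0 - w) * m2

# Synthetic series with an abrupt mean shift at index 50.
x = np.concatenate([np.full(50, 100.0), np.full(50, 140.0)])
evc = expected_vibration_center(x, split=50)
print(evc)  # → 120.0
```

Reconstruction would then shift each subsample toward the EVC so the combined series has a stable center, satisfying the stationarity assumption of conventional frequency analysis.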
Time Series in Education: The Analysis of Daily Attendance in Two High Schools
ERIC Educational Resources Information Center
Koopmans, Matthijs
2011-01-01
This presentation discusses the use of a time series approach to the analysis of daily attendance in two urban high schools over the course of one school year (2009-10). After establishing that the series for both schools were stationary, they were examined for moving average processes, autoregression, seasonal dependencies (weekly cycles),…
A Taylor series approach for coupled queueing systems with intermediate load
NASA Astrophysics Data System (ADS)
Evdokimova, Ekaterina; Wittevrongel, Sabine; Fiems, Dieter
2017-07-01
We focus on the numerical analysis of a coupled queueing system with Poisson arrivals and exponentially distributed service times. Such a system consists of multiple queues served by a single server. Service is synchronised, meaning that there is a departure from every queue upon service completion and there is no service whenever one of the queues is empty. It was shown before that the terms in the Maclaurin series expansion of the steady-state distribution of this queueing system when the service rate is sent to 0 (overload) can be calculated efficiently. In the present paper we extend this approach to lower loads. We focus on a sequence of Taylor series expansions of the stationary distribution around increasing service rates. For each series expansion, we use Jacobi iteration to calculate the terms in the series expansion, where the initial solution is the approximation found by the preceding series expansion. As the generator matrix of the queueing system at hand is sparse, the numerical complexity of a single Jacobi iteration is O(NMK), where N is the order of the series expansion, K is the number of queues and M is the size of the state space. Having a good initial solution reduces the number of Jacobi iterations considerably, so that a sequence of good approximations of the steady-state probabilities can be found quickly.
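The warm-start idea behind the sequence of expansions can be illustrated with a toy Jacobi solver: an initial guess close to the solution needs far fewer iterations than a cold start. The 3x3 diagonally dominant system below is a stand-in for illustration, not the queueing system's generator matrix.

```python
def jacobi(A, b, x0, tol=1e-10, max_iter=10000):
    """Jacobi iteration for A x = b; returns (solution, iterations used)."""
    n = len(b)
    x = list(x0)
    for it in range(1, max_iter + 1):
        x_new = []
        for i in range(n):
            # Update each component from the previous iterate.
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x_new.append((b[i] - s) / A[i][i])
        if max(abs(xn - xo) for xn, xo in zip(x_new, x)) < tol:
            return x_new, it
        x = x_new
    return x, max_iter

# Toy diagonally dominant system (guarantees Jacobi convergence).
A = [[4.0, 1.0, 0.0],
     [1.0, 5.0, 1.0],
     [0.0, 1.0, 4.0]]
b = [1.0, 2.0, 3.0]

# Cold start from zero versus a warm start near the solution,
# mimicking the approximation carried over from the preceding expansion.
cold, it_cold = jacobi(A, b, [0.0, 0.0, 0.0])
warm, it_warm = jacobi(A, b, [c + 1e-4 for c in cold])
```

The warm start converges in noticeably fewer iterations, which is the effect the paper exploits when chaining Taylor expansions across service rates.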
Time-series analysis of offshore-wind-wave groupiness
Liang, H.B.
1988-01-01
This research applies basic time-series-analysis techniques to the complex envelope function, for which the study of offshore wind-wave groupiness is of direct interest. In constructing the complex envelope function, a phase-unwrapping technique is integrated into the algorithm to estimate the carrier frequency and preserve the phase information for further studies. The Gaussian random wave model forms the basis of the wave-group statistics derived from envelope-amplitude crossings. Good agreement between the theory and the analysis of field records is found. Other linear models, such as the individual-waves approach and the energy approach, are compared to the envelope approach by analyzing the same set of records. It is found that the character of the filter used in each approach dominates the wave-group statistics. Analyses indicate that deep offshore wind waves are weakly nonlinear and that the Gaussian random assumption remains appropriate for describing the sea state. Wave-group statistics derived from the Gaussian random wave model thus remain applicable.
Analysis of Nonstationary Time Series for Biological Rhythms Research.
Leise, Tanya L
2017-06-01
This article is part of a Journal of Biological Rhythms series exploring analysis and statistics topics relevant to researchers in biological rhythms and sleep research. The goal is to provide an overview of the most common issues that arise in the analysis and interpretation of data in these fields. In this article on time series analysis for biological rhythms, we describe some methods for assessing the rhythmic properties of time series, including tests of whether a time series is indeed rhythmic. Because biological rhythms can exhibit significant fluctuations in their period, phase, and amplitude, their analysis may require methods appropriate for nonstationary time series, such as wavelet transforms, which can measure how these rhythmic parameters change over time. We illustrate these methods using simulated and real time series.
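A minimal rhythmicity check of the kind discussed can be sketched with a plain periodogram: project the series onto sine/cosine pairs at candidate periods and pick the strongest. This is a generic stationary illustration on synthetic data, not the wavelet-based nonstationary analysis the article describes.

```python
import math
import random

def periodogram_power(x, period):
    """Power of series x at a given period via sine/cosine projection."""
    n = len(x)
    w = 2 * math.pi / period
    c = sum(x[t] * math.cos(w * t) for t in range(n))
    s = sum(x[t] * math.sin(w * t) for t in range(n))
    return (c * c + s * s) / n

random.seed(3)
# Hourly samples over 10 days of a noisy circadian (24 h) rhythm.
x = [math.sin(2 * math.pi * t / 24) + random.gauss(0, 0.5) for t in range(240)]

# Scan a few candidate periods; the rhythmic one should dominate.
powers = {p: periodogram_power(x, p) for p in (12, 18, 24, 30, 36)}
best = max(powers, key=powers.get)
```

For rhythms whose period or amplitude drifts over time, a single global projection like this blurs the changes, which is why the article recommends wavelet transforms for nonstationary series.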
Incontinence--an aggressive approach to treatment: a case series.
Dornan, P R
2005-12-01
Recent evidence suggests that, for some, leaking urine may be a barrier to physical activity. Although important from a lifestyle point of view, bladder problems and incontinence also affect both men and women socially, psychologically and economically. For example, it can be particularly distressing when incontinence occurs post-prostate surgery, especially if these patients were continent before surgery. This case series outlines an aggressive, innovative, exercise-based approach to the management of stress incontinence post-prostatectomy. The program attempts to enhance the neuromuscular and vascular systems associated with continence, with emphasis placed on the abdominal and pelvic floor muscles. The program was undertaken by 14 incontinent post-prostatectomy patients (mean age 63.5 y, using a mean of 3.5 sanitary pads per day). The program was initiated a mean of two months post-op and had a mean duration of six months. Upon completion of the program, 10 patients were found to be completely dry, with three retaining a small leakage (a few drops). The 14th could not comply with the program because of illness. The results of this study appear promising in this patient population. There are indications for further research.
A Time Series Approach for Soil Moisture Estimation
NASA Technical Reports Server (NTRS)
Kim, Yunjin; vanZyl, Jakob
2006-01-01
Soil moisture is a key parameter in understanding the global water cycle and in predicting natural hazards. Polarimetric radar measurements have been used for estimating soil moisture of bare surfaces. In order to estimate soil moisture accurately, the surface roughness effect must be compensated properly. In addition, these algorithms will not produce accurate results for vegetated surfaces. It is difficult to retrieve soil moisture of a vegetated surface since the radar backscattering cross section is sensitive to the vegetation structure and environmental conditions such as the ground slope. Therefore, it is necessary to develop a method to estimate the effect of the surface roughness and vegetation reliably. One way to remove the roughness effect and the vegetation contamination is to take advantage of the temporal variation of soil moisture. In order to understand the global hydrologic cycle, it is desirable to measure soil moisture with a one- to two-day revisit time. Using these frequent measurements, a time series approach can be implemented to improve the soil moisture retrieval accuracy.
Time-series analysis of Campylobacter incidence in Switzerland.
Wei, W; Schüpbach, G; Held, L
2015-07-01
Campylobacteriosis has been the most common food-associated notifiable infectious disease in Switzerland since 1995. Contact with and ingestion of raw or undercooked broilers are considered the dominant risk factors for infection. In this study, we investigated the temporal relationship between the disease incidence in humans and the prevalence of Campylobacter in broilers in Switzerland from 2008 to 2012. We use a time-series approach to describe the pattern of the disease by incorporating seasonal effects and autocorrelation. The analysis shows that prevalence of Campylobacter in broilers, with a 2-week lag, has a significant impact on disease incidence in humans. Therefore Campylobacter cases in humans can be partly explained by contagion through broiler meat. We also found a strong autoregressive effect in human illness, and a significant increase of illness during Christmas and New Year's holidays. In a final analysis, we corrected for the sampling error of prevalence in broilers and the results gave similar conclusions.
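The lagged relationship described (broiler prevalence leading human incidence by two weeks) can be illustrated with a simple lagged-correlation scan. The weekly series below are simulated with a built-in 2-step lag; they are illustrative, not the Swiss surveillance data.

```python
import random

def lagged_corr(x, y, lag):
    """Pearson correlation between x[t - lag] and y[t]."""
    xs = x[:len(x) - lag]
    ys = y[lag:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    dx = sum((a - mx) ** 2 for a in xs) ** 0.5
    dy = sum((b - my) ** 2 for b in ys) ** 0.5
    return num / (dx * dy)

random.seed(5)
# Synthetic weekly broiler prevalence, and human incidence driven by it
# two weeks later plus noise.
prev = [random.random() for _ in range(200)]
cases = [0.0, 0.0] + [0.8 * p + random.gauss(0, 0.1) for p in prev[:198]]

# Scan lags 0..4 weeks; the strongest correlation identifies the delay.
best_lag = max(range(5), key=lambda L: lagged_corr(prev, cases, L))
```

A full analysis like the one in the paper would additionally model seasonality and autoregression rather than rely on raw correlations, but the lag-scan conveys the core temporal idea.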
Critical Thinking Skills. Analysis and Action Series.
ERIC Educational Resources Information Center
Heiman, Marcia; Slomianko, Joshua
Intended for teachers across grade levels and disciplines, this monograph reviews research on the development of critical thinking skills and introduces a series of these skills that can be incorporated into classroom teaching. Beginning with a definition of critical thinking, the monograph contains two main sections. The first section reviews…
Three Analysis Examples for Time Series Data
USDA-ARS?s Scientific Manuscript database
With improvements in instrumentation and the automation of data collection, plot level repeated measures and time series data are increasingly available to monitor and assess selected variables throughout the duration of an experiment or project. Records and metadata on variables of interest alone o...
Time Series Analysis of Mother-Infant Interaction.
ERIC Educational Resources Information Center
Rosenfeld, Howard M.
A method of studying attachment behavior in infants was devised using time series and time sequence analyses. Time series analysis refers to relationships between events coded over adjacent fixed-time units. Time sequence analysis refers to the distribution of exact times at which particular events happen. Using these techniques, multivariate…
Approaches for mechanical joining of 7xxx series aluminum alloys
NASA Astrophysics Data System (ADS)
Jäckel, M.; Grimm, T.; Landgrebe, D.
2016-10-01
This paper shows a numerical and experimental analysis of the different problems occurring during or after the conventional self-pierce riveting with semi-tubular and solid rivets of the high strength aluminum alloy EN AW-7021 T4. Furthermore this paper describes different pre-process methods by which the fracture in the high strength aluminum, caused by the self-pierce riveting processes, can be prevented and proper joining results are achieved. On this basis, the different approaches are compared regarding joint strength.
A time-series approach to dynamical systems from classical and quantum worlds
Fossion, Ruben
2014-01-08
This contribution discusses some recent applications of time-series analysis in Random Matrix Theory (RMT), and applications of RMT in the statistical analysis of eigenspectra of correlation matrices of multivariate time series.
Fractal approach towards power-law coherency to measure cross-correlations between time series
NASA Astrophysics Data System (ADS)
Kristoufek, Ladislav
2017-09-01
We focus on power-law coherency as an alternative approach towards studying power-law cross-correlations between simultaneously recorded time series. To be able to study empirical data, we introduce three estimators of the power-law coherency parameter Hρ based on popular techniques usually utilized for studying power-law cross-correlations: detrended cross-correlation analysis (DCCA), detrending moving-average cross-correlation analysis (DMCA) and height cross-correlation analysis (HXA). In the finite sample properties study, we focus on the bias, variance and mean squared error of the estimators. We find that the DMCA-based method is the safest choice among the three. The HXA method is reasonable for long time series with at least 10^4 observations, which can be easily attainable in some disciplines but problematic in others. The DCCA-based method does not provide favorable properties, which even deteriorate with an increasing time series length. The paper opens a new avenue towards studying cross-correlations between time series.
Interstage Flammability Analysis Approach
NASA Technical Reports Server (NTRS)
Little, Jeffrey K.; Eppard, William M.
2011-01-01
The Interstage of the Ares I launch platform houses several key components which are on standby during First Stage operation: the Reaction Control System (ReCS), the Upper Stage (US) Thrust Vector Control (TVC) and the J-2X with the Main Propulsion System (MPS) propellant feed system. Therefore potentially dangerous leaks of propellants could develop. The Interstage leaks analysis addresses the concerns of localized mixing of hydrogen and oxygen gases to produce deflagration zones in the Interstage of the Ares I launch vehicle during First Stage operation. This report details the approach taken to accomplish the analysis. Specified leakage profiles and actual flammability results are not presented due to proprietary and security restrictions. The interior volume formed by the Interstage walls, bounding interfaces with the Upper and First Stages, and surrounding the J2-X engine was modeled using Loci-CHEM to assess the potential for flammable gas mixtures to develop during First Stage operations. The transient analysis included a derived flammability indicator based on mixture ratios to maintain achievable simulation times. Validation of results was based on a comparison to Interstage pressure profiles outlined in prior NASA studies. The approach proved useful in the bounding of flammability risk in supporting program hazard reviews.
Time series analysis of Monte Carlo neutron transport calculations
NASA Astrophysics Data System (ADS)
Nease, Brian Robert
A time series based approach is applied to the Monte Carlo (MC) fission source distribution to calculate the non-fundamental mode eigenvalues of the system. The approach applies Principal Oscillation Patterns (POPs) to the fission source distribution, transforming the problem into a simple autoregressive order one (AR(1)) process. Proof is provided that the stationary MC process is linear to first order approximation, which is a requirement for the application of POPs. The autocorrelation coefficient of the resulting AR(1) process corresponds to the ratio of the desired mode eigenvalue to the fundamental mode eigenvalue. All modern k-eigenvalue MC codes calculate the fundamental mode eigenvalue, so the desired mode eigenvalue can be easily determined. The strength of this approach is contrasted against the Fission Matrix method (FMM) in terms of accuracy versus computer memory constraints. Multi-dimensional problems are considered since the approach has strong potential for use in reactor analysis, and the implementation of the method into production codes is discussed. Lastly, the appearance of complex eigenvalues is investigated and solutions are provided.
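The central estimate (the lag-1 autocorrelation of an AR(1) process recovering the ratio of a higher-mode eigenvalue to the fundamental one) can be sketched on simulated data. The values of phi and k0 below are illustrative assumptions, not outputs of any Monte Carlo transport code.

```python
import random

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation of series x."""
    n = len(x)
    m = sum(x) / n
    num = sum((x[t] - m) * (x[t - 1] - m) for t in range(1, n))
    den = sum((v - m) ** 2 for v in x)
    return num / den

random.seed(7)
phi = 0.8   # stand-in for the eigenvalue ratio k1 / k0
x, prev = [], 0.0
# Simulate the AR(1) process x_t = phi * x_{t-1} + noise,
# playing the role of the POP-transformed fission source series.
for _ in range(20000):
    prev = phi * prev + random.gauss(0, 1)
    x.append(prev)

k0 = 1.02   # hypothetical fundamental-mode eigenvalue from the MC code
# The lag-1 autocorrelation estimates phi, so k1 = k0 * phi.
k1_est = k0 * lag1_autocorr(x)
```

In the actual method the AR(1) series comes from projecting the generation-by-generation fission source onto Principal Oscillation Patterns; the recovery of the eigenvalue ratio from the autocorrelation is the same.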
Multifractal Time Series Analysis Based on Detrended Fluctuation Analysis
NASA Astrophysics Data System (ADS)
Kantelhardt, Jan; Stanley, H. Eugene; Zschiegner, Stephan; Bunde, Armin; Koscielny-Bunde, Eva; Havlin, Shlomo
2002-03-01
In order to develop an easily applicable method for the multifractal characterization of non-stationary time series, we generalize the detrended fluctuation analysis (DFA), which is a well-established method for the determination of monofractal scaling properties and the detection of long-range correlations. We relate the new multifractal DFA method to the standard partition function-based multifractal formalism, and compare it to the wavelet transform modulus maxima (WTMM) method, which is a well-established but more difficult procedure for this purpose. We employ the multifractal DFA method to determine whether the heart rhythm during different sleep stages is characterized by different multifractal properties.
The U-series comminution approach: where to from here
NASA Astrophysics Data System (ADS)
Handley, Heather; Turner, Simon; Afonso, Juan; Turner, Michael; Hesse, Paul
2015-04-01
Quantifying the rates of landscape evolution in response to climate change is inhibited by the difficulty of dating the formation of continental detrital sediments. The 'comminution age' dating model of DePaolo et al. (2006) hypothesises that the measured disequilibria between U-series nuclides (234U and 238U) in fine-grained continental (detrital) sediments can be used to calculate the time elapsed since mechanical weathering of a grain to the threshold size (~50 µm). The comminution age includes the time that a particle has been mobilised in transport, held in temporary storage (e.g., soils and floodplains) and the time elapsed since final deposition to present day. Therefore, if the deposition age of sediment can be constrained independently, for example via optically stimulated luminescence (OSL) dating, the residence time of sediment (e.g., a palaeochannel deposit) can be determined. Despite the significant potential of this approach, there is still much work to be done before meaningful absolute comminution ages can be obtained. The calculated recoil loss factor and comminution age are highly dependent on the method of recoil loss factor determination used and the inherent assumptions. We present new and recently published uranium isotope data for aeolian sediment deposits, leached and unleached palaeochannel sediments and bedrock samples from Australia to exemplify areas of current uncertainty in the comminution age approach. In addition to the information gained from natural samples, Monte Carlo simulations have been conducted for a synthetic sediment sample to determine the individual and combined comminution age uncertainties associated with each input variable. Using a reasonable associated uncertainty for each input factor and including variations in the source rock and measured (234U/238U) ratios, the total combined uncertainty on comminution age in our simulation (for two methods of recoil loss factor estimation: weighted geometric and surface area
Analysis of Multispectral Time Series for supporting Forest Management Plans
NASA Astrophysics Data System (ADS)
Simoniello, T.; Carone, M. T.; Costantini, G.; Frattegiani, M.; Lanfredi, M.; Macchiato, M.
2010-05-01
Adequate forest management requires specific plans based on updated and detailed mapping. Multispectral satellite time series have been largely applied to forest monitoring and studies at different scales thanks to their capability of providing synoptic information on some basic parameters descriptive of vegetation distribution and status. As a low-cost tool for supporting forest management plans in an operational context, we tested the use of Landsat-TM/ETM time series (1987-2006) in the high Agri Valley (Southern Italy) for planning field surveys as well as for the integration of existing cartography. As a preliminary activity, to make all scenes radiometrically consistent, the no-change regression normalization was applied to the time series; then all the data concerning available forest maps, municipal boundaries, water basins, rivers, and roads were overlapped in a GIS environment. From the 2006 image we elaborated the NDVI map and analyzed the distribution for each land cover class. To separate the physiological variability and identify the anomalous areas, a threshold on the distributions was applied. To label the non-homogeneous areas, a multitemporal analysis was performed by separating heterogeneity due to cover changes from that linked to basilar unit mapping and classification labelling aggregations. Then a map of priority areas was produced to support the field survey plan. To analyze the territorial evolution, the historical land cover maps were elaborated by adopting a hybrid classification approach based on a preliminary segmentation, the identification of training areas, and a subsequent maximum likelihood categorization. Such an analysis was fundamental for the general assessment of the territorial dynamics and in particular for the evaluation of the efficacy of past intervention activities.
Taylor series expansion and modified extended Prony analysis for localization
Mosher, J.C.; Lewis, P.S.
1994-12-01
In the multiple source localization problem, many inverse routines use a rooting of a polynomial to determine the source locations. The authors present a rooting algorithm for locating an unknown number of three-dimensional, near-field, static sources from measurements at an arbitrarily spaced three-dimensional array. Since the sources are near-field and static, the spatial covariance matrix is always rank one, and spatial smoothing approaches are inappropriate due to the spatial diversity. The authors approach the solution through spherical harmonics, essentially replacing the point source function with its Taylor series expansion. They then perform a modified extended Prony analysis of the expansion coefficients to determine the number and location of the sources. The full inverse method is typically ill-conditioned, but a portion of the algorithm is suitable for synthesis analysis. They present a simulation for simplifying point charges limited to a spherical region, using an array of voltage potential measurements made outside the region. Future efforts of this work will focus on adapting the analysis to the electroencephalography and magnetoencephalography.
Assessing Spontaneous Combustion Instability with Nonlinear Time Series Analysis
NASA Technical Reports Server (NTRS)
Eberhart, C. J.; Casiano, M. J.
2015-01-01
Considerable interest lies in the ability to characterize the onset of spontaneous instabilities within liquid propellant rocket engine (LPRE) combustion devices. Linear techniques, such as fast Fourier transforms, various correlation parameters, and critical damping parameters, have been used at great length for over fifty years. Recently, nonlinear time series methods have been applied to deduce information pertaining to instability incipiency hidden in seemingly stochastic combustion noise. A technique commonly used in the biological sciences, Multifractal Detrended Fluctuation Analysis, has been extended to the combustion dynamics field and is introduced here as a data analysis approach complementary to linear ones. Building on this, a modified technique is leveraged to extract artifacts of impending combustion instability that present themselves prior to growth to limit-cycle amplitudes. The analysis is demonstrated on data from J-2X gas generator testing during which a distinct spontaneous instability was observed. Comparisons are made to previous work in which the data were characterized using linear approaches. Verification of the technique is performed by examining idealized signals and comparing two separate, independently developed tools.
Performance of multifractal detrended fluctuation analysis on short time series
NASA Astrophysics Data System (ADS)
López, Juan Luis; Contreras, Jesús Guillermo
2013-02-01
The performance of multifractal detrended fluctuation analysis on short time series is evaluated for synthetic samples of several mono- and multifractal models. The reconstruction of the generalized Hurst exponents is used to determine the range of applicability of the method and the precision of its results as a function of the decreasing length of the series. As an application, the series of the daily exchange rate between the U.S. dollar and the euro is studied.
A statistical approach for segregating cognitive task stages from multivariate fMRI BOLD time series
Demanuele, Charmaine; Bähner, Florian; Plichta, Michael M.; Kirsch, Peter; Tost, Heike; Meyer-Lindenberg, Andreas; Durstewitz, Daniel
2015-01-01
Multivariate pattern analysis can reveal new information from neuroimaging data to illuminate human cognition and its disturbances. Here, we develop a methodological approach, based on multivariate statistical/machine learning and time series analysis, to discern cognitive processing stages from functional magnetic resonance imaging (fMRI) blood oxygenation level dependent (BOLD) time series. We apply this method to data recorded from a group of healthy adults whilst performing a virtual reality version of the delayed win-shift radial arm maze (RAM) task. This task has been frequently used to study working memory and decision making in rodents. Using linear classifiers and multivariate test statistics in conjunction with time series bootstraps, we show that different cognitive stages of the task, as defined by the experimenter, namely, the encoding/retrieval, choice, reward and delay stages, can be statistically discriminated from the BOLD time series in brain areas relevant for decision making and working memory. Discrimination of these task stages was significantly reduced during poor behavioral performance in dorsolateral prefrontal cortex (DLPFC), but not in the primary visual cortex (V1). Experimenter-defined dissection of time series into class labels based on task structure was confirmed by an unsupervised, bottom-up approach based on Hidden Markov Models. Furthermore, we show that different groupings of recorded time points into cognitive event classes can be used to test hypotheses about the specific cognitive role of a given brain region during task execution. We found that whilst the DLPFC strongly differentiated between task stages associated with different memory loads, but not between different visual-spatial aspects, the reverse was true for V1. Our methodology illustrates how different aspects of cognitive information processing during one and the same task can be separated and attributed to specific brain regions based on information contained in
Dean, Dennis A.; Adler, Gail K.; Nguyen, David P.; Klerman, Elizabeth B.
2014-01-01
We present a novel approach for analyzing biological time-series data using a context-free language (CFL) representation that allows the extraction and quantification of important features from the time-series. This representation results in Hierarchically AdaPtive (HAP) analysis, a suite of multiple complementary techniques that enable rapid analysis of data and does not require the user to set parameters. HAP analysis generates hierarchically organized parameter distributions that allow multi-scale components of the time-series to be quantified and includes a data analysis pipeline that applies recursive analyses to generate hierarchically organized results that extend traditional outcome measures such as pharmacokinetics and inter-pulse interval. Pulsicons, a novel text-based time-series representation also derived from the CFL approach, are introduced as an objective qualitative comparison nomenclature. We apply HAP to the analysis of 24 hours of frequently sampled pulsatile cortisol hormone data, which has known analysis challenges, from 14 healthy women. HAP analysis generated results in seconds and produced dozens of figures for each participant. The results quantify the observed qualitative features of cortisol data as a series of pulse clusters, each consisting of one or more embedded pulses, and identify two ultradian phenotypes in this dataset. HAP analysis is designed to be robust to individual differences and to missing data and may be applied to other pulsatile hormones. Future work can extend HAP analysis to other time-series data types, including oscillatory and other periodic physiological signals. PMID:25184442
Nonlinear times series analysis of epileptic human electroencephalogram (EEG)
NASA Astrophysics Data System (ADS)
Li, Dingzhou
The problem of seizure anticipation in patients with epilepsy has attracted significant attention in the past few years. In this paper we discuss two approaches, using methods of nonlinear time series analysis applied to scalp electrode recordings, which are able to distinguish between epochs temporally distant from, and just prior to, the onset of a seizure in patients with temporal lobe epilepsy. First we describe a method involving a comparison of recordings taken from electrodes adjacent to and remote from the site of the seizure focus. In particular, we define a nonlinear quantity which we call marginal predictability. This quantity is computed using data from remote and from adjacent electrodes. We find that the difference between the marginal predictabilities computed for the remote and adjacent electrodes decreases several tens of minutes prior to seizure onset, compared to its value interictally. We also show that these differences in marginal predictability are independent of the behavioral state of the patient. Next we examine the phase coherence between different electrodes at both long range and short range. Temporally distant from seizure onset ("interictally"), epileptic patients have lower long-range phase coherence in the delta (1-4 Hz) and beta (18-30 Hz) frequency bands compared to non-epileptic subjects. When seizures approach ("preictally"), we observe an increase in phase coherence in the beta band. However, interictally there is no difference in short-range phase coherence between this cohort of patients and non-epileptic subjects. Preictally, short-range phase coherence also increases in the alpha (10-13 Hz) and beta bands. Finally we apply the marginal predictability to the phase-difference time series. These marginal predictabilities are lower in the patients than in the non-epileptic subjects; however, as a seizure approaches, the former moves asymptotically towards the latter.
SAGE: A tool for time-series analysis of Greenland
NASA Astrophysics Data System (ADS)
Duerr, R. E.; Gallaher, D. W.; Khalsa, S. S.; Lewis, S.
2011-12-01
The National Snow and Ice Data Center (NSIDC) has developed an operational analysis tool known as "Services for the Analysis of the Greenland Environment" (SAGE). Using an integrated workspace approach, a researcher can find relevant data, perform various analysis functions on the data, and retrieve the data and analysis results. While there continues to be compelling observational evidence for increased surface melting and rapid thinning along the margins of the Greenland ice sheet, there are still uncertainties in estimates of the mass balance of Greenland's ice sheet as a whole. To better understand the dynamics of these issues, it is important for scientists to have access to a variety of datasets from multiple sources, and to be able to integrate and analyze the data. SAGE provides data from various sources, such as AMSR-E and AVHRR datasets, which can be analyzed individually through various time-series plots and aggregation functions, or together with scatterplots or overlaid time-series plots, providing quick and useful results to support various research products. The application is available at http://nsidc.org/data/sage/. SAGE was built on top of NSIDC's existing Searchlight engine. The SAGE interface gives users access to much of NSIDC's relevant Greenland raster data holdings, as well as data from outside sources. Additionally, various web services allow other clients to utilize the functionality that the SAGE interface provides. Combined, these methods of accessing the tool allow scientists to devote more of their time to their research, and less to finding and retrieving the data they need.
Automatising the analysis of stochastic biochemical time-series
2015-01-01
Background Mathematical and computational modelling of biochemical systems has seen a lot of effort devoted to the definition and implementation of high-performance mechanistic simulation frameworks. Within these frameworks it is possible to analyse complex models under a variety of configurations, eventually selecting the best setting of, e.g., parameters for a target system. Motivation This operational pipeline relies on the ability to interpret the predictions of a model, often represented as simulation time-series. Thus, an efficient data analysis pipeline is crucial to automatise time-series analyses, bearing in mind that errors in this phase might mislead the modeller's conclusions. Results For this reason we have developed an intuitive framework-independent Python tool to automate analyses common to a variety of modelling approaches. These include assessment of useful non-trivial statistics for simulation ensembles, e.g., estimation of master equations. Intuitive and domain-independent batch scripts will allow the researcher to automatically prepare reports, thus speeding up the usual model-definition, testing and refinement pipeline. PMID:26051821
Wavelet transform approach for fitting financial time series data
NASA Astrophysics Data System (ADS)
Ahmed, Amel Abdoullah; Ismail, Mohd Tahir
2015-10-01
This study investigates a newly developed technique, combining wavelet filtering with a vector error correction (VEC) model, to study the dynamic relationship among financial time series. A wavelet filter is used to remove noise from daily data of the US NASDAQ stock market and of three stock markets of the Middle East and North Africa (MENA) region, namely Egypt, Jordan, and Istanbul. The data cover the period from 6/29/2001 to 5/5/2009. The returns of the series generated by the wavelet filter and of the original series are then analyzed by a cointegration test and the VEC model. The results of the cointegration test affirm the existence of cointegration between the studied series: there is a long-term relationship between the US stock market and the MENA stock markets. A comparison between the proposed model and the traditional model demonstrates that the proposed model (DWT with VEC) outperforms the traditional VEC model in fitting the financial stock market series, and reveals real information about the relationships among the stock markets.
Statistical Evaluation of Time Series Analysis Techniques
NASA Technical Reports Server (NTRS)
Benignus, V. A.
1973-01-01
The performance of a modified version of NASA's multivariate spectrum analysis program is discussed. A multiple regression model was used to make the revisions. Performance improvements were documented and compared to the standard fast Fourier transform by Monte Carlo techniques.
Time series analysis of air pollutants in Beirut, Lebanon.
Farah, Wehbeh; Nakhlé, Myriam Mrad; Abboud, Maher; Annesi-Maesano, Isabella; Zaarour, Rita; Saliba, Nada; Germanos, Georges; Gerard, Jocelyne
2014-12-01
This study reports for the first time a time series analysis of daily urban air pollutant levels (CO, NO, NO2, O3, PM10, and SO2) in Beirut, Lebanon. The study examines data obtained between September 2005 and July 2006, and their descriptive analysis shows long-term variations of daily levels of air pollution concentrations. Strong persistence of these daily levels is identified in the time series using an autocorrelation function, except for SO2. Time series of standardized residual values (SRVs) are also calculated to compare fluctuations of the time series with different levels. Time series plots of the SRVs indicate that NO and NO2 had similar temporal fluctuations. However, NO2 and O3 had opposite temporal fluctuations, attributable to weather conditions and the accumulation of vehicular emissions. The effects of both desert dust storms and airborne particulate matter resulting from the Lebanon War in July 2006 are also discernible in the SRV plots.
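The two diagnostics described above, standardized residual values and the autocorrelation-based persistence check, can be sketched as follows. The AR(1) surrogate series here is hypothetical, standing in for a persistent pollutant record; no real Beirut data are used.

```python
import numpy as np

def srv(x):
    """Standardized residual values: zero-mean, unit-variance version of the series."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / x.std()

def acf(x, max_lag):
    """Sample autocorrelation function up to max_lag."""
    z = srv(x)
    n = len(z)
    return np.array([np.dot(z[:n - k], z[k:]) / n for k in range(max_lag + 1)])

rng = np.random.default_rng(1)
# A persistent AR(1)-like "pollutant" series vs. uncorrelated noise
persistent = np.zeros(500)
for i in range(1, 500):
    persistent[i] = 0.9 * persistent[i - 1] + rng.normal()
noise = rng.normal(size=500)

print(acf(persistent, 5))  # decays slowly -> strong persistence
print(acf(noise, 5))       # drops to ~0 after lag 0
```

Comparing SRVs of two pollutants on the same plot is what makes fluctuations of series with different units and levels directly comparable.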
Sun, Tao; Liu, Hongbo; Yu, Hong; Chen, C L Philip
2016-06-28
The central time series crystallizes the common patterns of the set it represents. In this paper, we propose a global constrained degree-pruning dynamic programming (g(dp)²) approach to obtain the central time series by minimizing the dynamic time warping (DTW) distance between two time series. The DTW matching path theory with global constraints is proved theoretically for our degree-pruning strategy, which helps reduce the time complexity and computational cost. Our approach achieves the optimal solution between two time series. An approximate method for the central time series of multiple time series [denoted m_g(dp)²] is presented based on DTW barycenter averaging and our g(dp)² approach, using a hierarchical merging strategy. As illustrated by the experimental results, our approaches provide better within-group sums of squares and robustness than other relevant algorithms.
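The g(dp)² algorithm itself is not reproduced here, but the DTW distance it builds on, computed by dynamic programming under a global (Sakoe-Chiba band) constraint, can be sketched as:

```python
import numpy as np

def dtw_distance(a, b, window=None):
    """Dynamic time warping distance via dynamic programming.
    `window` is an optional Sakoe-Chiba band width (a global path constraint)."""
    n, m = len(a), len(b)
    if window is None:
        window = max(n, m)  # no constraint
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(max(1, i - window), min(m, i + window) + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # best of insertion, deletion, match
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

a = [0.0, 1.0, 2.0, 1.0, 0.0]
b = [0.0, 0.0, 1.0, 2.0, 1.0, 0.0]  # same shape, shifted in time
print(dtw_distance(a, b))           # 0.0: warping absorbs the shift
print(dtw_distance(a, [2.0, 2.0, 2.0, 2.0, 2.0]))
```

The band constraint is what keeps the matching path near the diagonal, which is the property the paper's degree-pruning strategy exploits to cut cost.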
[The ethical approach applied to the TV series ER].
Svandra, Philippe
2013-05-01
The television series ER presents an opportunity to reflect on ethical dilemmas. This article discusses the example of an episode in which a patient suffering from an incurable disease, unable to express his views clearly, has a tracheotomy performed on him without the consent of the team or his health care proxy.
Wavelet analysis and scaling properties of time series
NASA Astrophysics Data System (ADS)
Manimaran, P.; Panigrahi, Prasanta K.; Parikh, Jitendra C.
2005-10-01
We propose a wavelet based method for the characterization of the scaling behavior of nonstationary time series. It makes use of the built-in ability of the wavelets for capturing the trends in a data set, in variable window sizes. Discrete wavelets from the Daubechies family are used to illustrate the efficacy of this procedure. After studying binomial multifractal time series with the present and earlier approaches of detrending for comparison, we analyze the time series of averaged spin density in the 2D Ising model at the critical temperature, along with several experimental data sets possessing multifractal behavior.
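A minimal illustration of the idea using the simplest member of the Daubechies family (Haar, i.e. Daubechies-1): one transform level splits a series into scaling coefficients, which capture the local trend, and wavelet coefficients, which carry the fluctuations. The noisy linear trend below is synthetic.

```python
import numpy as np

def haar_step(x):
    """One level of the Haar DWT: returns (approximation, detail) coefficients."""
    x = np.asarray(x, dtype=float)
    s = (x[0::2] + x[1::2]) / np.sqrt(2.0)  # local trend (scaling coefficients)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)  # local fluctuation (wavelet coefficients)
    return s, d

# A linear trend plus small noise: the trend lands almost entirely in the
# approximation, leaving the detail coefficients to carry the fluctuations.
rng = np.random.default_rng(2)
x = np.arange(64, dtype=float) + 0.1 * rng.normal(size=64)
s, d = haar_step(x)
print(np.mean(np.abs(d)))  # small: trend removed locally
print(np.mean(np.abs(s)))  # large: carries the trend
```

Scaling analyses like the one in the paper then study how the magnitude of the detail coefficients grows as the transform is iterated over larger windows.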
Time series expression analyses using RNA-seq: a statistical approach.
Oh, Sunghee; Song, Seongho; Grabowski, Gregory; Zhao, Hongyu; Noonan, James P
2013-01-01
RNA-seq is becoming the de facto standard approach for transcriptome analysis with ever-reducing cost. It has considerable advantages over conventional technologies (microarrays) because it allows for direct identification and quantification of transcripts. Many time series RNA-seq datasets have been collected to study the dynamic regulations of transcripts. However, statistically rigorous and computationally efficient methods are needed to explore the time-dependent changes of gene expression in biological systems. These methods should explicitly account for the dependencies of expression patterns across time points. Here, we discuss several methods that can be applied to model timecourse RNA-seq data, including statistical evolutionary trajectory index (SETI), autoregressive time-lagged regression (AR(1)), and hidden Markov model (HMM) approaches. We use three real datasets and simulation studies to demonstrate the utility of these dynamic methods in temporal analysis.
A Dimensionality Reduction Technique for Efficient Time Series Similarity Analysis
Wang, Qiang; Megalooikonomou, Vasileios
2008-01-01
We propose a dimensionality reduction technique for time series analysis that significantly improves the efficiency and accuracy of similarity searches. In contrast to piecewise constant approximation (PCA) techniques that approximate each time series with constant-value segments, the proposed method, Piecewise Vector Quantized Approximation, uses the closest (based on a distance measure) codeword from a codebook of key-sequences to represent each segment. The new representation is symbolic, which allows the application of text-based retrieval techniques to time series similarity analysis. Experiments on real and simulated datasets show that the proposed technique generally outperforms PCA techniques in clustering and similarity searches. PMID:18496587
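The encoding step can be sketched as follows. The codebook here is a hypothetical hand-made one with three shape prototypes; in the paper the codebook of key-sequences is learned from the data (e.g. by vector quantization).

```python
import numpy as np

def pvqa_encode(series, codebook, seg_len):
    """Piecewise Vector Quantized Approximation (sketch): split the series into
    fixed-length segments and replace each with the index of the closest codeword."""
    n_seg = len(series) // seg_len
    segments = np.asarray(series[:n_seg * seg_len], dtype=float).reshape(n_seg, seg_len)
    # Euclidean distance from every segment to every codeword
    dists = np.linalg.norm(segments[:, None, :] - codebook[None, :, :], axis=2)
    return dists.argmin(axis=1)  # symbolic representation

codebook = np.array([
    [0.0, 0.0, 0.0, 0.0],  # flat
    [0.0, 1.0, 2.0, 3.0],  # rising
    [3.0, 2.0, 1.0, 0.0],  # falling
])
series = [0.1, -0.1, 0.0, 0.1,  0.2, 1.1, 1.9, 3.1,  2.9, 2.0, 1.2, -0.1]
print(pvqa_encode(series, codebook, 4))
```

The resulting index sequence is a short symbolic string, which is what lets text-retrieval machinery (inverted indexes, string similarity) be reused for time series search.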
NASA Technical Reports Server (NTRS)
Gao, X. H.; Stanford, J. L.
1988-01-01
The formulas for performing several statistical calculations based on Fourier coefficients are presented for use in atmospheric observational studies. The calculations discussed include a method for estimating the degrees of temporal freedom of two correlated time series and a method for performing seasonal analyses using a half-year summer/winter projection operator in the frequency domain. A modified lag-correlation calculation is proposed for obtaining lag correlations in the frequency domain. Also, a spectral approach for Empirical Orthogonal Function (EOF) and Extended EOF analysis is given which reduces the size of the matrix to be solved in the eigenproblem.
Detrended fluctuation analysis of multivariate time series
NASA Astrophysics Data System (ADS)
Xiong, Hui; Shang, P.
2017-01-01
In this work, we generalize detrended fluctuation analysis (DFA) to the multivariate case, named multivariate DFA (MVDFA). The validity of the proposed MVDFA is illustrated by numerical simulations on synthetic multivariate processes, considering cases in which the initial data are generated independently from the same system, generated from different systems, and correlated variates from one system. Moreover, the proposed MVDFA works well when applied to the multi-scale analysis of the returns of stock indices in the Chinese and US stock markets. Generally, connections between the multivariate system and the individual variates are uncovered, showing the solid performance of MVDFA and the multi-scale MVDFA.
Nonlinear Time Series Analysis in Earth Sciences - Potentials and Pitfalls
NASA Astrophysics Data System (ADS)
Kurths, Jürgen; Donges, Jonathan F.; Donner, Reik V.; Marwan, Norbert; Zou, Yong
2010-05-01
The application of methods of nonlinear time series analysis has a rich tradition in Earth sciences and has enabled substantially new insights into various complex processes there. However, some approaches and findings have been controversially discussed over the last decades. One reason is that they are often based on strong restrictions, whose violation may lead to pitfalls and misinterpretations. Here, we discuss three general concepts of nonlinear dynamics and statistical physics (synchronization, recurrence and complex networks) and explain how to use them for data analysis. We show that the corresponding methods can be applied even to rather short and non-stationary data, which are typical in Earth sciences. References: Marwan, N., Romano, M., Thiel, M., Kurths, J.: Recurrence plots for the analysis of complex systems, Physics Reports 438, 237-329 (2007); Arenas, A., Diaz-Guilera, A., Kurths, J., Moreno, Y., Zhou, C.: Synchronization in complex networks, Physics Reports 469, 93-153 (2008); Marwan, N., Donges, J.F., Zou, Y., Donner, R. and Kurths, J., Phys. Lett. A 373, 4246 (2009); Donges, J.F., Zou, Y., Marwan, N. and Kurths, J., Europhys. Lett. 87, 48007 (2009); Donner, R., Zou, Y., Donges, J.F., Marwan, N. and Kurths, J., Phys. Rev. E 81, 015101(R) (2010).
Visibility graph analysis on quarterly macroeconomic series of China based on complex network theory
NASA Astrophysics Data System (ADS)
Wang, Na; Li, Dong; Wang, Qiwen
2012-12-01
The visibility graph approach and complex network theory provide a new insight into time series analysis. The inheritance of properties of the visibility graph from the original time series is further explored in this paper. We found that degree distributions of visibility graphs extracted from pseudo Brownian motion series obtained by the frequency domain algorithm exhibit exponential behaviors, in which the exponential exponent is a binomial function of the Hurst index inherited in the time series. Our simulations show that the quantitative relations between the Hurst indexes and the exponents of the degree distribution function are different for different series, and that the visibility graph inherits some important features of the original time series. Further, we convert several quarterly macroeconomic series of China, including the growth rates of value-added of three industry series and the growth rates of Gross Domestic Product (GDP), to graphs by the visibility algorithm and explore the topological properties of the graphs associated with the four macroeconomic series, namely the degree distribution and correlations, the clustering coefficient, the average path length, and community structure. Based on complex network analysis, we find that the degree distributions of the networks associated with the growth rates of value-added of the three industry series are almost exponential, while the degree distributions of the networks associated with the growth rates of GDP are scale free. We also discuss the assortativity and disassortativity of the four associated networks, as they are related to the evolutionary process of the original macroeconomic series. All the constructed networks have “small-world” features. The community structures of the associated networks suggest dynamic changes of the original macroeconomic series. We also detected the relationship among government policy changes, community structures of associated networks and macroeconomic dynamics. We find great influences of government
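The natural visibility criterion that maps a time series to a graph can be sketched directly: two samples are linked when every sample between them lies strictly below the straight line joining them. The toy series below is illustrative, not drawn from the paper's macroeconomic data.

```python
def visibility_edges(y):
    """Natural visibility graph of a time series (Lacasa et al. criterion):
    points (i, y_i) and (j, y_j) are linked if every intermediate sample
    lies strictly below the line connecting them."""
    n = len(y)
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            if all(y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)
                   for k in range(i + 1, j)):
                edges.append((i, j))
    return edges

y = [3.0, 1.0, 2.0, 1.5, 4.0]
print(visibility_edges(y))  # adjacent points are always linked; peaks see far
```

Graph measures (degree distribution, clustering, path length, communities) are then computed on the resulting edge list, which is how properties like the Hurst index leave a fingerprint on the graph.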
Time series analysis of compressor noise.
NASA Technical Reports Server (NTRS)
Hardin, J. C.; Brown, T. J.
1972-01-01
Description of a method to determine compressor directivity at discrete frequencies by separation of the tone power from the noise background. The technique developed effected the separation by Fourier analysis of the sound pressure autocorrelation at large lags. This produced experimental directivity results which compared reasonably well with the theoretical patterns except near the compressor axis.
Time averaging, ageing and delay analysis of financial time series
NASA Astrophysics Data System (ADS)
Cherstvy, Andrey G.; Vinod, Deepak; Aghion, Erez; Chechkin, Aleksei V.; Metzler, Ralf
2017-06-01
We introduce three strategies for the analysis of financial time series based on time averaged observables. These comprise the time averaged mean squared displacement (MSD) as well as the ageing and delay time methods for varying fractions of the financial time series. We explore these concepts via statistical analysis of historic time series for several Dow Jones Industrial indices for the period from the 1960s to 2015. Remarkably, we discover a simple universal law for the delay time averaged MSD. The observed features of the financial time series dynamics agree well with our analytical results for the time averaged measurables for geometric Brownian motion, underlying the famed Black-Scholes-Merton model. The concepts we promote here are shown to be useful for financial data analysis and enable one to unveil new universal features of stock market dynamics.
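The time averaged MSD of a single trajectory can be sketched as follows. Ordinary Brownian motion is used here as a stand-in; the paper's analytical benchmark is geometric Brownian motion applied to price series, for which the time average is taken over log-prices.

```python
import numpy as np

def ta_msd(x, lags):
    """Time averaged mean squared displacement:
    delta^2(lag) = <(x(t+lag) - x(t))^2> averaged over t along one trajectory."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    return np.array([np.mean((x[lag:] - x[:n - lag]) ** 2) for lag in lags])

rng = np.random.default_rng(3)
bm = np.cumsum(rng.normal(size=10000))  # ordinary Brownian motion, unit steps
lags = [1, 2, 4, 8]
msd = ta_msd(bm, lags)
print(msd)  # grows roughly linearly with the lag, as expected for BM
```

The ageing and delay variants in the paper restrict the averaging window to different portions of the record; the core moving-average structure is the same.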
Wang, Dan; Xue, Luan; Yang, Yue; Hu, Jiandong; Li, Guoling; Piao, Xuemei
2014-09-01
The purpose of this study was to analyze the temporal gene expression in salivary and lacrimal glands of Sjögren's syndrome (SS) based on time-series microarray data. We downloaded gene expression data GSE15640 and GSE48139 from gene expression omnibus and identified differentially expressed genes (DEGs) at varying time points using a modified Bayes analysis. Gene clustering was applied to analyze the expression differences in time series of the DEGs. Protein-protein interaction networks were used for searching the hub genes, and gene ontology (GO) and KEGG pathways were applied to analyze the DEGs at a functional level. A total of 744 and 1,490 DEGs were screened out from the salivary glands and lacrimal glands, respectively. Among these genes, 194 were overlapped between salivary glands and lacrimal glands, and these genes were compartmentalized into six clusters with different expression profiles. The GO terms of intracellular transport, protein transport and protein localization were significantly enriched by DEGs in salivary glands; while in the lacrimal glands, DEGs were significantly enriched in protein localization, establishment of protein localization and protein transport. Our results suggest that the SS pathogenesis was significantly different in time series in the salivary and lacrimal glands. The DEGs whose expressions may correlate with molecular mechanisms of SS in our study might provide new insight into the underlying cause or regulation of this disease.
NASA Astrophysics Data System (ADS)
Yan, Jun; Dong, Danan; Chen, Wen
2016-04-01
Due to the development of GNSS technology and the improvement of its positioning accuracy, observational data obtained by GNSS are widely used in space geodesy and geodynamics research. The GNSS time series of observation stations contain a wealth of information, including geographical space changes, deformation of the Earth, migration of subsurface material, instantaneous deformation of the Earth, weak deformation and other blind signals. In order to extract instantaneous subsurface deformation, weak deformation and other blind signals hidden in GNSS time series, we apply independent component analysis (ICA) to daily station coordinate time series of the Southern California Integrated GPS Network. ICA is based on the statistical characteristics of the observed signal: it uses non-Gaussianity and independence to process the time series and recover the source signals of the underlying geophysical events. As part of the post-processing of precise GNSS time series, this paper examines the series using the principal component analysis (PCA) module of QOCA and an ICA algorithm to separate the source signals. We then compare these two signal separation techniques, PCA and ICA, for separating original signals related to geophysical disturbances from the observed signals. After analyzing these separation approaches, we demonstrate that in the case of multiple factors, PCA suffers from ambiguity in the separation of source signals, i.e., the attribution of the results is not clear, whereas ICA performs better; ICA is therefore more suitable for GNSS time series in which the combination of source signals is unknown.
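The PCA step of such a pipeline can be sketched on a synthetic station network. The QOCA module and the ICA algorithm themselves are not reproduced; the station weights, common-mode signal, and noise level below are assumed for illustration.

```python
import numpy as np

def pca(X, n_comp):
    """PCA of an (epochs x stations) matrix via SVD: returns the temporal
    principal components, the spatial eigenvectors, and variance fractions."""
    Xc = X - X.mean(axis=0)                  # remove per-station mean
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    pcs = U[:, :n_comp] * s[:n_comp]         # temporal principal components
    eofs = Vt[:n_comp]                       # spatial responses
    var_frac = s ** 2 / np.sum(s ** 2)
    return pcs, eofs, var_frac[:n_comp]

# Synthetic network: one common-mode signal seen by all "stations" plus noise
rng = np.random.default_rng(4)
t = np.linspace(0, 4 * np.pi, 300)
common = np.sin(t)
X = np.outer(common, [1.0, 0.8, 1.2, 0.9]) + 0.05 * rng.normal(size=(300, 4))
pcs, eofs, var_frac = pca(X, 1)
print(var_frac[0])  # first PC captures nearly all the variance
```

PCA's components are only decorrelated, not independent, which is the root of the ambiguity noted in the abstract; ICA adds a non-Gaussianity criterion to resolve it.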
Topic Time Series Analysis of Microblogs
2014-10-01
[Abstract not available; the extracted fragments are table residue from the report, listing example topics with top words and per-topic distance scores, e.g. a topic with top words "rawr", "kill", "jurassic", "dinosaur" attributed to Instagram-generated posts near a center in Commerce, CA.]
Time Series Analysis of Insar Data: Methods and Trends
NASA Technical Reports Server (NTRS)
Osmanoglu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cano-Cabral, Enrique
2015-01-01
Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.
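Why unwrapping matters can be shown with a one-dimensional toy example: simulated line-of-sight motion exceeding half the radar wavelength wraps the measured phase onto (-π, π], and unwrapping restores the physically meaningful phase history. The 56 mm C-band wavelength is an assumed value, and real 2-D unwrapping is far harder than this 1-D case.

```python
import numpy as np

wavelength = 56.0                              # mm (assumed C-band)
deformation = np.linspace(0.0, 60.0, 100)      # mm, true line-of-sight motion
phase = 4 * np.pi * deformation / wavelength   # true (unwrapped) phase
wrapped = np.angle(np.exp(1j * phase))         # what the interferogram measures

# Unwrapping restores the continuous phase, hence the motion history;
# this succeeds because successive phase steps here are below pi.
unwrapped = np.unwrap(wrapped)
recovered = unwrapped * wavelength / (4 * np.pi)
print(np.max(np.abs(recovered - deformation)))  # ~0: motion recovered
```

The 4π factor reflects the two-way travel of the radar signal: one full phase cycle corresponds to half a wavelength of motion, which is exactly the ambiguity the abstract describes.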
Qu, Cheng; Wang, Lin-Yan; Jin, Wen-Tao; Tang, Yu-Ping; Jin, Yi; Shi, Xu-Qin; Shang, Li-Li; Shang, Er-Xin; Duan, Jin-Ao
2016-11-06
The flower of Carthamus tinctorius L. (Carthami Flos, safflower), important in traditional Chinese medicine (TCM), is known for treating blood stasis, coronary heart disease, hypertension, and cerebrovascular disease in clinical and experimental studies. It is widely accepted that hydroxysafflor yellow A (HSYA) and anhydrosafflor yellow B (ASYB) are the major bioactive components of many formulae comprising safflower. In this study, selective knock-out of target components such as HSYA and ASYB using preparative high performance liquid chromatography (prep-HPLC), followed by evaluation of antiplatelet and anticoagulation activities, was used to investigate the roles of bioactive ingredients in the safflower series of herb pairs. The results showed that both HSYA and ASYB not only play a direct role in activating blood circulation, but also contribute indirectly to the total bioactivity of the safflower series of herb pairs. The degree of contribution of HSYA in safflower and its series of herb pairs was as follows: Carthami Flos-Ginseng Radix et Rhizoma Rubra (CF-GR) > Carthami Flos-Sappan Lignum (CF-SL) > Carthami Flos-Angelicae Sinensis Radix (CF-AS) > Carthami Flos-Astragali Radix (CF-AR) > Carthami Flos-Angelicae Sinensis Radix (CF-AS) > Carthami Flos-Glycyrrhizae Radix et Rhizoma (CF-GL) > Carthami Flos-Salviae Miltiorrhizae Radix et Rhizoma (CF-SM) > Carthami Flos (CF), and the degree of contribution of ASYB in safflower and its series of herb pairs: CF-GL > CF-PS > CF-AS > CF-SL > CF-SM > CF-AR > CF-GR > CF. Thus, this study provides a significant and effective approach for elucidating the contribution of different herbal components to the bioactivity of a herb pair, and for clarifying the variation of herb-pair compatibilities. In addition, it provides guidance for investigating the relationship between herbal compounds and the bioactivities of herb pairs, as well as a scientific basis for reasonable clinical applications and new drug
Spectral analysis of the Elatina varve series
NASA Technical Reports Server (NTRS)
Bracewell, R. N.
1988-01-01
The Elatina formation in South Australia, which provides a rich fossil record of presumptive solar activity in the late Precambrian, is of great potential significance for the physics of the sun because it contains laminae grouped in cycles of about 12, an appearance suggestive of the solar cycle. Here, the laminae are treated as varves laid down yearly and modulated in thickness in accordance with the late Precambrian sunspot activity for the year of deposition. The purpose is to present a simple structure, or intrinsic spectrum, that can be uncovered by appropriate data analysis.
Automated analysis of brachial ultrasound time series
NASA Astrophysics Data System (ADS)
Liang, Weidong; Browning, Roger L.; Lauer, Ronald M.; Sonka, Milan
1998-07-01
Atherosclerosis begins in childhood with the accumulation of lipid in the intima of arteries to form fatty streaks, and advances through adult life, when occlusive vascular disease may result in coronary heart disease, stroke and peripheral vascular disease. Non-invasive B-mode ultrasound has been found useful in studying risk factors in the symptom-free population. A large amount of data is acquired from continuous imaging of the vessels in a large study population. A high quality brachial vessel diameter measurement method is necessary so that accurate diameters can be measured consistently in all frames in a sequence and across different observers. Though a human expert has the advantage over automated computer methods in recognizing noise during diameter measurement, manual measurement suffers from inter- and intra-observer variability and is also time-consuming. An automated measurement method is presented in this paper which utilizes quality assurance approaches to adapt to specific image features and to recognize and minimize the effect of noise. Experimental results showed the method's potential for clinical use in epidemiological studies.
Schoolwide Approaches to Discipline. The Informed Educator Series.
ERIC Educational Resources Information Center
Porch, Stephanie
Although there are no simple solutions for how to turn around a school with serious discipline problems, schoolwide approaches have been effective, according to this report. The report examines research on schoolwide approaches to discipline and discusses the characteristics of programs that promote a culture of safety and support, improved…
Time series data analysis using DFA
NASA Astrophysics Data System (ADS)
Okumoto, A.; Akiyama, T.; Sekino, H.; Sumi, T.
2014-02-01
Detrended fluctuation analysis (DFA) was originally developed for the evaluation of DNA sequences and of interbeat intervals in heart rate variability (HRV), but it is now used to obtain various kinds of biological information. In this study we perform DFA on artificially generated data for which we already know the relationship between the signal and the physical event causing it. We generate the artificial data using molecular dynamics, investigating the Brownian motion of a polymer under an external force. In order to generate artificial fluctuations in the physical properties, we introduce obstacle pillars fixed to nanostructures. Using different conditions, such as the presence or absence of obstacles, the external field, and the polymer length, we perform DFA on the energies and positions of the polymer.
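A minimal sketch of the DFA procedure itself: integrate the series, fit and subtract a local polynomial trend in windows of size s, and examine how the residual fluctuation F(s) scales with s. It is checked here on white noise, where the expected scaling exponent is near 0.5; the molecular-dynamics data of the study are not reproduced.

```python
import numpy as np

def dfa(x, scales, order=1):
    """Detrended fluctuation analysis: returns F(s) for each window size s."""
    y = np.cumsum(x - np.mean(x))  # integrated profile
    F = []
    for s in scales:
        n_win = len(y) // s
        sq = []
        for w in range(n_win):
            seg = y[w * s:(w + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, order)          # local polynomial trend
            sq.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(sq)))
    return np.array(F)

rng = np.random.default_rng(5)
white = rng.normal(size=4096)
scales = [16, 32, 64, 128]
F = dfa(white, scales)
# For uncorrelated noise, F(s) ~ s^alpha with alpha near 0.5
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(alpha)
```

Persistent (correlated) signals give alpha above 0.5 and anti-persistent ones below, which is how DFA distinguishes the obstacle and field conditions the abstract describes.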
Advanced tools for astronomical time series and image analysis
NASA Astrophysics Data System (ADS)
Scargle, Jeffrey D.
The algorithms described here, which I have developed for applications in X-ray and γ-ray astronomy, will hopefully be of use in other ways, perhaps aiding in the exploration of modern astronomy's data cornucopia. The goal is to describe principled approaches to some ubiquitous problems, such as detection and characterization of periodic and aperiodic signals, estimation of time delays between multiple time series, and source detection in noisy images with noisy backgrounds. The latter problem is related to detection of clusters in data spaces of various dimensions. A goal of this work is to achieve a unifying view of several related topics: signal detection and characterization, cluster identification, classification, density estimation, and multivariate regression. In addition to being useful for analysis of data from space-based and ground-based missions, these algorithms may be a basis for a future automatic science discovery facility, and in turn provide analysis tools for the Virtual Observatory. This chapter has ties to those by Larry Bretthorst, Tom Loredo, Alanna Connors, Fionn Murtagh, Jim Berger, David van Dyk, Vicent Martinez & Enn Saar.
Aroma characterization based on aromatic series analysis in table grapes.
Wu, Yusen; Duan, Shuyan; Zhao, Liping; Gao, Zhen; Luo, Meng; Song, Shiren; Xu, Wenping; Zhang, Caixi; Ma, Chao; Wang, Shiping
2016-08-04
Aroma is an important part of quality in table grape, but the key aroma compounds and the aroma series of table grapes remain unknown. In this paper, we identified 67 aroma compounds in 20 table grape cultivars; 20 in pulp and 23 in skin were active compounds. C6 compounds were the basic background volatiles, but the aroma contents of pulp juice and skin depended mainly on the levels of esters and terpenes, respectively. Most notably, 'Kyoho' grapevine series showed high contents of esters in pulp, while Muscat/floral cultivars showed abundant monoterpenes in skin. For the aroma series, table grapes were characterized mainly by herbaceous, floral, balsamic, sweet and fruity series. Simple, visualizable aroma profiles were established using aroma fingerprints based on the aromatic series. Hierarchical cluster analysis (HCA) and principal component analysis (PCA) showed that the aroma profiles of pulp juice, skin and whole berries could be classified into 5, 3, and 5 groups, respectively. Combined with sensory evaluation, we could conclude that fatty and balsamic series were the preferred aromatic series, and the contents of their contributors (β-ionone and octanal) may be useful as indicators for the improvement of breeding and cultivation measures for table grapes.
Causal analysis of time series from hydrological systems
NASA Astrophysics Data System (ADS)
Selle, Benny; Aufgebauer, Britta; Knorr, Klaus-Holger
2017-04-01
It is often difficult to infer cause and effect in hydrological systems for which time series of system inputs, outputs and state variables are observed. A recently published technique called Convergent Cross Mapping could be a promising tool to detect causality between time series. A response variable Y may be causally related to a forcing variable X if the so-called cross mapping of X using Y improves with the amount of data included. The idea is that a response variable contains information on the history of its driving variable, whereas the reverse may not be true. We propose an alternative approach based on similar ideas using neural networks. Our approach is firstly compared to Convergent Cross Mapping using a synthetic time series of precipitation and streamflow generated by a rainfall runoff model. Secondly, measured concentrations of dissolved organic carbon and dissolved iron from a mountainous stream in Germany, that were previously hypothesised to be causally linked, are tested.
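The core idea of Convergent Cross Mapping, that the cross-map estimate of the driver improves as more of the response's shadow manifold is included, can be sketched roughly as follows (a simplified illustration using an assumed coupled logistic system, not the published method or the hydrological data):

```python
import numpy as np

def embed(x, E, tau):
    """Time-delay embedding of a 1-D series into E-dimensional state vectors."""
    n = len(x) - (E - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(E)])

def cross_map_skill(X, Y, E=3, tau=1, lib_len=None):
    """Correlation between X and its estimate cross-mapped from Y's shadow manifold."""
    My = embed(Y, E, tau)[:lib_len]
    Xt = X[(E - 1) * tau:][:lib_len]           # contemporaneous X values
    preds = []
    for i in range(len(My)):
        d = np.linalg.norm(My - My[i], axis=1)
        d[i] = np.inf                          # exclude the point itself
        nn = np.argsort(d)[:E + 1]             # E+1 nearest neighbors
        w = np.exp(-d[nn] / max(d[nn][0], 1e-12))
        preds.append(np.sum(w * Xt[nn]) / np.sum(w))
    return np.corrcoef(Xt, preds)[0, 1]

# Unidirectionally coupled logistic maps: x drives y, so mapping x from y works
N = 1000
x, y = np.empty(N), np.empty(N)
x[0], y[0] = 0.4, 0.2
for t in range(N - 1):
    x[t + 1] = x[t] * (3.8 - 3.8 * x[t])
    y[t + 1] = y[t] * (3.5 - 3.5 * y[t] - 0.1 * x[t])
xs, ys = x[100:], y[100:]                      # discard transient
skill_short = cross_map_skill(xs, ys, lib_len=100)
skill_long = cross_map_skill(xs, ys, lib_len=800)
```

Convergence of the skill with library length is the diagnostic: if `skill_long` exceeds `skill_short`, the response plausibly encodes the driver's history.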
Digital time series analysis for flutter test data
NASA Technical Reports Server (NTRS)
Batill, S. M.; Carey, D. M.; Kehoe, M. W.
1992-01-01
An application of digital time series analysis to flutter test data processing was conducted. A numerical investigation was used to evaluate the method, as well as its sensitivity to noise and parameter variations. These parameters included those involved with data acquisition, as well as system response characteristics. This digital time series method was then used to predict flutter speed from subcritical response wind tunnel tests. Flutter speeds predicted from forced response, subcritical wind tunnel tests were compared to the experimental flutter speeds.
Time series power flow analysis for distribution connected PV generation.
Broderick, Robert Joseph; Quiroz, Jimmy Edward; Ellis, Abraham; Reno, Matthew J.; Smith, Jeff; Dugan, Roger
2013-01-01
Distributed photovoltaic (PV) projects must go through an interconnection study process before connecting to the distribution grid. These studies are intended to identify the likely impacts and mitigation alternatives. In the majority of the cases, system impacts can be ruled out or mitigation can be identified without an involved study, through a screening process or a simple supplemental review study. For some proposed projects, expensive and time-consuming interconnection studies are required. The challenges to performing the studies are twofold. First, every study scenario is potentially unique, as the studies are often highly specific to the amount of PV generation capacity that varies greatly from feeder to feeder and is often unevenly distributed along the same feeder. This can cause location-specific impacts and mitigations. The second challenge is the inherent variability in PV power output which can interact with feeder operation in complex ways, by affecting the operation of voltage regulation and protection devices. The typical simulation tools and methods in use today for distribution system planning are often not adequate to accurately assess these potential impacts. This report demonstrates how quasi-static time series (QSTS) simulation and high time-resolution data can be used to assess the potential impacts in a more comprehensive manner. The QSTS simulations are applied to a set of sample feeders with high PV deployment to illustrate the usefulness of the approach. The report describes methods that can help determine how PV affects distribution system operations. The simulation results are focused on enhancing the understanding of the underlying technical issues. The examples also highlight the steps needed to perform QSTS simulation and describe the data needed to drive the simulations. The goal of this report is to make the methodology of time series power flow analysis readily accessible to utilities and others responsible for evaluating the impacts of distributed PV generation.
Inverse approach to chronotaxic systems for single-variable time series
NASA Astrophysics Data System (ADS)
Clemson, Philip T.; Suprunenko, Yevhen F.; Stankovski, Tomislav; Stefanovska, Aneta
2014-03-01
Following the development of a new class of self-sustained oscillators with a time-varying but stable frequency, the inverse approach to these systems is now formulated. We show how observed data arranged in a single-variable time series can be used to recognize such systems. This approach makes use of time-frequency domain information using the wavelet transform as well as the recently developed method of Bayesian-based inference. In addition, a set of methods, named phase fluctuation analysis, is introduced to detect the defining properties of the new class of systems by directly analyzing the statistics of the observed perturbations. We apply these methods to numerical examples but also elaborate further on the cardiac system.
Fundamental approach to dipmeter analysis
Enderlin, M.B.; Hansen, D.K.T.
1988-01-01
Historically, in dipmeter analysis, depositional patterns are delineated for environmental, structural, and stratigraphic interpretations. The proposed method is a fundamental approach using raw data measurements from the dipmeter sonde to help the geologist describe subsurface structures on a stratigraphic scale. Raw data are available at the well site, require no post-processing, are cost effective, easy to use, require only a basic understanding of sedimentary features and can be combined with computed results. A case study illustrates the reconstruction of sedimentary features from a raw data log recorded by a six-arm dipmeter. The dipmeter is a wireline tool with a series of evenly spaced, focused electrodes applied to the circumference of the borehole wall. The raw data are presented as curves representing the electrode response and tool orientation. In outcrop, the geologist usually can see an entire sedimentary feature in a large perspective, that is, with the surrounding landscape. Therefore, a large range of features can be resolved. However, in the borehole environment the perspective is reduced to the borehole diameter, thus reducing the range of recognizable features. In this study, a table was assembled that identifies the features distinguished by the proposed method as a function of borehole diameter.
NASA Astrophysics Data System (ADS)
Šilhán, Karel; Stoffel, Markus
2015-05-01
Different approaches and thresholds have been utilized in the past to date landslides with growth ring series of disturbed trees. Past work was mostly based on conifer species because of their well-defined ring boundaries and the easy identification of compression wood after stem tilting. More recently, work has been expanded to include broad-leaved trees, which are thought to produce fewer and less evident reactions after landsliding. This contribution reviews recent progress made in dendrogeomorphic landslide analysis and introduces a new approach in which landslides are dated via ring eccentricity formed after tilting. We compare results of this new and the more conventional approaches. In addition, the paper also addresses tree sensitivity to landslide disturbance as a function of tree age and trunk diameter using 119 common beech (Fagus sylvatica L.) and 39 Crimean pine (Pinus nigra ssp. pallasiana) trees growing on two landslide bodies. The landslide events reconstructed with the classical approach (reaction wood) also appear as events in the eccentricity analysis, but the inclusion of eccentricity clearly allowed more landslides (162% more) to be detected in the tree-ring series. With respect to tree sensitivity, conifers and broad-leaved trees show the strongest reactions to landslides at ages between 40 and 60 years, with a second phase of increased sensitivity in P. nigra at ages of ca. 120-130 years. These phases of highest sensitivity correspond with trunk diameters at breast height of 6-8 and 18-22 cm, respectively (P. nigra). This study thus calls for the inclusion of eccentricity analyses in future landslide reconstructions as well as for the selection of trees belonging to different age and diameter classes to allow for a well-balanced and more complete reconstruction of past events.
Activity Approach to Just Beyond the Classroom. Environmental Education Series.
ERIC Educational Resources Information Center
Skliar, Norman; La Mantia, Laura
To provide teachers with some of the many activities that can be carried on "just beyond the classroom," the booklet presents plans for more than 40 outdoor education activities, all emphasizing a multidisciplinary, inquiry-based approach to learning. The school grounds offer optimum conditions for initiating studies in the out-of-doors. While every…
Emergent Approaches to Mental Health Problems. The Century Psychology Series.
ERIC Educational Resources Information Center
Cowen, Emory L., Ed.; And Others
Innovative approaches to mental health problems are described. Conceptualizations about the following areas are outlined: psychiatry, the universe, and the community; theoretical malaise and community mental health; the relation of conceptual models to manpower needs; and mental health manpower and institutional change. Community programs and new…
A graph-based approach to detect spatiotemporal dynamics in satellite image time series
NASA Astrophysics Data System (ADS)
Guttler, Fabio; Ienco, Dino; Nin, Jordi; Teisseire, Maguelonne; Poncelet, Pascal
2017-08-01
Enhancing the frequency of satellite acquisitions represents a key issue for the Earth Observation community nowadays. Repeated observations are crucial for monitoring purposes, particularly when intra-annual processes should be taken into account. Time series of images constitute a valuable source of information in these cases. The goal of this paper is to propose a new methodological framework to automatically detect and extract spatiotemporal information from satellite image time series (SITS). Existing methods dealing with such kind of data are usually classification-oriented and cannot provide information about evolutions and temporal behaviors. In this paper we propose a graph-based strategy that combines object-based image analysis (OBIA) with data mining techniques. Image objects computed at each individual timestamp are connected across the time series and generate a set of evolution graphs. Each evolution graph is associated with a particular area within the study site and stores information about its temporal evolution. Such information can be deeply explored at the evolution graph scale or used to compare the graphs and supply a general picture at the study site scale. We validated our framework on two study sites located in the South of France and involving different types of natural, semi-natural and agricultural areas. The results obtained from a Landsat SITS support the quality of the methodological approach and illustrate how the framework can be employed to extract and characterize spatiotemporal dynamics.
Multiscale multifractal diffusion entropy analysis of financial time series
NASA Astrophysics Data System (ADS)
Huang, Jingjing; Shang, Pengjian
2015-02-01
This paper introduces a multiscale multifractal diffusion entropy analysis (MMDEA) method to analyze long-range correlations and then applies it to stock index series. The method combines the techniques of diffusion processes and Rényi entropy to focus on the scaling behaviors of stock index series across multiple scales, which allows us to extend the description of stock index variability to include its dependence on the magnitude of the variability and on the time scale. Compared to multifractal diffusion entropy analysis, the MMDEA can show more details of scale properties and provide a reliable analysis. In this paper, we concentrate not only on the fact that the stock index series has multifractal properties but also that these properties depend on the time scale in which the multifractality is measured. This time scale is related to the frequency band of the signal. We find that stock index variability appears to be far more complex than reported in the studies using a fixed time scale.
Iranian rainfall series analysis by means of nonparametric tests
NASA Astrophysics Data System (ADS)
Talaee, P. Hosseinzadeh
2014-05-01
The study of the trends and fluctuations in rainfall has received a great deal of attention, since changes in rainfall patterns may lead to floods or droughts. The objective of this study was to analyze the annual, seasonal, and monthly rainfall time series at seven rain gauge stations in the west of Iran for a 40-year period (from October 1969 to September 2009). The homogeneity of the rainfall data sets at the rain gauge stations was checked by using the cumulative deviations test. Three nonparametric tests, namely Kendall, Spearman, and Mann-Kendall, at the 95 % confidence level were used for the trend analysis and the Theil-Sen estimator was applied for determining the magnitudes of the trends. According to the homogeneity analysis, all of the rainfall series except the September series at Vasaj station were found to be homogeneous. The obtained results showed an insignificant trend in the annual and seasonal rainfall series at the majority of the considered stations. Moreover, only three significant trends were observed at the February rainfall of Aghajanbolaghi station, the November series of Vasaj station, and the March rainfall series of Khomigan station. The findings of this study on the temporal trends of rainfall can be implemented to improve the water resources strategies in the study region.
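The nonparametric trend tools named here, the Mann-Kendall test and the Theil-Sen slope estimator, are standard and easy to sketch. The version below omits the tie correction and runs on assumed synthetic data rather than the rainfall series:

```python
import numpy as np

def mann_kendall(x):
    """Mann-Kendall S statistic and its normal-approximation Z (no tie correction)."""
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var = n * (n - 1) * (2 * n + 5) / 18
    z = (s - np.sign(s)) / np.sqrt(var) if s != 0 else 0.0
    return s, z

def theil_sen(x):
    """Median of all pairwise slopes: a robust estimate of trend magnitude."""
    n = len(x)
    slopes = [(x[j] - x[i]) / (j - i) for i in range(n - 1) for j in range(i + 1, n)]
    return np.median(slopes)

t = np.arange(40)
series = 2.0 * t + np.random.default_rng(1).normal(0, 5, 40)  # trend + noise
s, z = mann_kendall(series)
slope = theil_sen(series)
# |z| > 1.96 indicates a significant trend at the 95% confidence level
```

With an underlying slope of 2 and moderate noise, the test rejects the no-trend hypothesis and the Theil-Sen estimate recovers a slope near 2.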
Clinical immunology review series: an approach to desensitization
Krishna, M T; Huissoon, A P
2011-01-01
Allergen immunotherapy describes the treatment of allergic disease through administration of gradually increasing doses of allergen. This form of immune tolerance induction is now safer, more reliably efficacious and better understood than when it was first formally described in 1911. In this paper the authors aim to summarize the current state of the art in immunotherapy in the treatment of inhalant, venom and drug allergies, with specific reference to its practice in the United Kingdom. A practical approach has been taken, with reference to current evidence and guidelines, including illustrative protocols and vaccine schedules. A number of novel approaches and techniques are likely to change considerably the way in which we select and treat allergy patients in the coming decade, and these advances are previewed. PMID:21175592
Introduction to the Special Series on Research Synthesis: A Cross-Disciplinary Approach.
Robinson, Lisa A; Hammitt, James K
2015-06-01
To estimate the effects of a policy change, analysts must often rely on available data as time and resource constraints limit their ability to commission new primary research. Research synthesis methods, including systematic review, meta-analysis, and expert elicitation, play an important role in ensuring that this evidence is appropriately weighed and considered. We present the conclusions of a multidisciplinary Harvard Center for Risk Analysis project that evaluated and applied these methods, and introduce the resulting series of articles. The first step in any analysis is to clearly define the problem to be addressed; the second is a systematic review of the literature. Whether additional analysis is needed depends on the quality and relevance of the available data to the policy question, and the likely effect of uncertainty on the policy decision. Meta-analysis promotes understanding the variation between studies and may be used to combine the estimates to develop values for application in policy analysis. Formal, structured expert elicitation promotes careful consideration of the evidence when data are limited or inconsistent, and aids in extrapolating to the policy context. Regardless of the methods used, clear communication of the approach, assumptions, and uncertainty is essential. © 2015 Society for Risk Analysis.
ERIC Educational Resources Information Center
Sun, Yu-Chih
2016-01-01
Extensive reading for second language learners has been widely documented over the past few decades. However, few studies, if any, have used a corpus analysis approach to analyze the vocabulary coverage within a single-author story series, its repetition of vocabulary, and the incidental and intentional vocabulary learning opportunities therein.…
Spectral decompositions of multiple time series: a Bayesian non-parametric approach.
Macaro, Christian; Prado, Raquel
2014-01-01
We consider spectral decompositions of multiple time series that arise in studies where the interest lies in assessing the influence of two or more factors. We write the spectral density of each time series as a sum of the spectral densities associated to the different levels of the factors. We then use Whittle's approximation to the likelihood function and follow a Bayesian non-parametric approach to obtain posterior inference on the spectral densities based on Bernstein-Dirichlet prior distributions. The prior is strategically important as it carries identifiability conditions for the models and allows us to quantify our degree of confidence in such conditions. A Markov chain Monte Carlo (MCMC) algorithm for posterior inference within this class of frequency-domain models is presented. We illustrate the approach by analyzing simulated and real data via spectral one-way and two-way models. In particular, we present an analysis of functional magnetic resonance imaging (fMRI) brain responses measured in individuals who participated in a designed experiment to study pain perception in humans.
Appropriate Algorithms for Nonlinear Time Series Analysis in Psychology
NASA Astrophysics Data System (ADS)
Scheier, Christian; Tschacher, Wolfgang
Chaos theory has a strong appeal for psychology because it allows for the investigation of the dynamics and nonlinearity of psychological systems. Consequently, chaos-theoretic concepts and methods have recently gained increasing attention among psychologists, and positive claims for chaos have been published in nearly every field of psychology. Less attention, however, has been paid to the appropriateness of chaos-theoretic algorithms for psychological time series. An appropriate algorithm can deal with short, noisy data sets and yields `objective' results. In the present paper it is argued that most of the classical nonlinear techniques do not satisfy these constraints and thus are not appropriate for psychological data. A methodological approach is introduced that is based on nonlinear forecasting and the method of surrogate data. Using artificial data sets and empirical time series, we show that this methodology reliably assesses nonlinearity and chaos in time series even if they are short and contaminated by noise.
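The surrogate-data method mentioned above is commonly implemented with phase-randomized Fourier surrogates, which preserve a series' power spectrum while destroying any nonlinear structure; a statistic computed on the original series is then compared against its distribution over an ensemble of surrogates. A minimal sketch (the test signal is an assumption):

```python
import numpy as np

def phase_surrogate(x, rng):
    """Fourier-transform surrogate: same power spectrum, randomized phases.
    Only interior bins are randomized so the spectrum is preserved exactly."""
    X = np.fft.rfft(x)
    Xs = X.copy()
    k = np.arange(1, len(X) - 1)           # leave DC and Nyquist bins untouched
    Xs[k] = np.abs(X[k]) * np.exp(1j * rng.uniform(0, 2 * np.pi, len(k)))
    return np.fft.irfft(Xs, n=len(x))

rng = np.random.default_rng(2)
x = np.sin(0.3 * np.arange(512)) + 0.1 * rng.standard_normal(512)
s = phase_surrogate(x, rng)
```

If a nonlinearity statistic (e.g., nonlinear forecasting skill) on the original falls outside the surrogate distribution, the null hypothesis of a linear stochastic process is rejected.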
Scale-space analysis of time series in circulatory research.
Mortensen, Kim Erlend; Godtliebsen, Fred; Revhaug, Arthur
2006-12-01
Statistical analysis of time series is still inadequate within circulation research. With the advent of increasing computational power and real-time recordings from hemodynamic studies, one is increasingly dealing with vast amounts of data in time series. This paper aims to illustrate how statistical analysis using the significant nonstationarities (SiNoS) method may complement traditional repeated-measures ANOVA and linear mixed models. We applied these methods on a dataset of local hepatic and systemic circulatory changes induced by aortoportal shunting and graded liver resection. We found SiNoS analysis more comprehensive when compared with traditional statistical analysis in the following four ways: 1) the method allows better signal-to-noise detection; 2) including all data points from real time recordings in a statistical analysis permits better detection of significant features in the data; 3) analysis with multiple scales of resolution facilitates a more differentiated observation of the material; and 4) the method affords excellent visual presentation by combining group differences, time trends, and multiscale statistical analysis allowing the observer to quickly view and evaluate the material. It is our opinion that SiNoS analysis of time series is a very powerful statistical tool that may be used to complement conventional statistical methods.
Simulation of active and passive millimeter-wave (35 GHz) sensors by time series analysis
NASA Astrophysics Data System (ADS)
Strenzwilk, D. F.; Maruyama, R. T.
1982-11-01
Analog voltage signals from a millimeter-wave (MMW) radiometer (passive sensor) and radar (active sensor) were collected over varying grassy terrains at Aberdeen Proving Ground (APG), Maryland in July 1980. These measurements were made as part of continuing studies of MMW sensors for smart munitions. The signals were digitized at a rate of 2,000 observations per second and then analyzed by the Box and Jenkins time series approach. This analysis reports on the characterization of these data sets. The passive time series signals resulted in a simple autoregressive-moving average process, similar to a previous set of data taken at Rome Air Development Center in Rome, N.Y. by Ballistic Research Laboratory. On the other hand, the radar data (active sensor) required a data transformation to enhance the analysis. In both cases the signals were well characterized using the Box-Jenkins time series approach.
A Non-parametric Approach to the Overall Estimate of Cognitive Load Using NIRS Time Series
Keshmiri, Soheil; Sumioka, Hidenobu; Yamazaki, Ryuji; Ishiguro, Hiroshi
2017-01-01
We present a non-parametric approach to prediction of the n-back n ∈ {1, 2} task as a proxy measure of mental workload using Near Infrared Spectroscopy (NIRS) data. In particular, we focus on measuring the mental workload through hemodynamic responses in the brain induced by these tasks, thereby realizing the potential that they can offer for their detection in real world scenarios (e.g., difficulty of a conversation). Our approach takes advantage of intrinsic linearity that is inherent in the components of the NIRS time series to adopt a one-step regression strategy. We demonstrate the correctness of our approach through its mathematical analysis. Furthermore, we study the performance of our model in an inter-subject setting in contrast with state-of-the-art techniques in the literature to show a significant improvement on prediction of these tasks (82.50 and 86.40% for female and male participants, respectively). Moreover, our empirical analysis suggests a gender difference effect on the performance of the classifiers (with male data exhibiting a higher non-linearity) along with the left-lateralized activation in both genders with higher specificity in females. PMID:28217088
Optimal trading strategies—a time series approach
NASA Astrophysics Data System (ADS)
Bebbington, Peter A.; Kühn, Reimer
2016-05-01
Motivated by recent advances in the spectral theory of auto-covariance matrices, we are led to revisit a reformulation of Markowitz’ mean-variance portfolio optimization approach in the time domain. In its simplest incarnation it applies to a single traded asset and allows an optimal trading strategy to be found which—for a given return—is minimally exposed to market price fluctuations. The model is initially investigated for a range of synthetic price processes, taken to be either second order stationary, or to exhibit second order stationary increments. Attention is paid to consequences of estimating auto-covariance matrices from small finite samples, and auto-covariance matrix cleaning strategies to mitigate against these are investigated. Finally we apply our framework to real world data.
Automatic differentiation for Fourier series and the radii polynomial approach
NASA Astrophysics Data System (ADS)
Lessard, Jean-Philippe; Mireles James, J. D.; Ransford, Julian
2016-11-01
In this work we develop a computer-assisted technique for proving existence of periodic solutions of nonlinear differential equations with non-polynomial nonlinearities. We exploit ideas from the theory of automatic differentiation in order to formulate an augmented polynomial system. We compute a numerical Fourier expansion of the periodic orbit for the augmented system, and prove the existence of a true solution nearby using an a-posteriori validation scheme (the radii polynomial approach). The problems considered here are given in terms of locally analytic vector fields (i.e. the field is analytic in a neighborhood of the periodic orbit) hence the computer-assisted proofs are formulated in a Banach space of sequences satisfying a geometric decay condition. In order to illustrate the use and utility of these ideas we implement a number of computer-assisted existence proofs for periodic orbits of the Planar Circular Restricted Three-Body Problem (PCRTBP).
Nonlinear Analysis of Surface EMG Time Series of Back Muscles
NASA Astrophysics Data System (ADS)
Dolton, Donald C.; Zurcher, Ulrich; Kaufman, Miron; Sung, Paul
2004-10-01
A nonlinear analysis of surface electromyography time series of subjects with and without low back pain is presented. The mean-square displacement and entropy show anomalous diffusive behavior in the intermediate time range 10 ms < t < 1 s. This behavior implies the presence of correlations in the signal. We discuss the shape of the power spectrum of the signal.
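Mean-square displacement scaling of the kind described can be estimated directly from a series. The sketch below recovers the normal-diffusion exponent of a plain random walk (an assumed test case, not the EMG data); anomalous diffusion shows up as an exponent away from 1:

```python
import numpy as np

def msd(x, lags):
    """Mean-square displacement <(x(t+lag) - x(t))^2> for each lag."""
    return np.array([np.mean((x[l:] - x[:-l]) ** 2) for l in lags])

# For Brownian motion (cumulative sum of white noise), MSD grows ~ t^1
rng = np.random.default_rng(3)
walk = np.cumsum(rng.standard_normal(20000))
lags = np.array([1, 2, 4, 8, 16, 32])
m = msd(walk, lags)
exponent = np.polyfit(np.log(lags), np.log(m), 1)[0]  # ~1 for normal diffusion
```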
Time series analysis of monthly pulpwood use in the Northeast
James T. Bones
1980-01-01
Time series analysis was used to develop a model that depicts pulpwood use in the Northeast. The model is useful in forecasting future pulpwood requirements (short term) or monitoring pulpwood-use activity in relation to past use patterns. The model predicted a downturn in use during 1980.
A Time-Series Analysis of Student and Teacher Interaction.
ERIC Educational Resources Information Center
Schempp, Paul G.
The stability of teaching behavior was examined by observing student/teacher interaction over one academic year. One teacher was studied using a time-series analysis. He had 14 years experience and taught physical education in grades K-6 in a single school. Data were collected over one academic year using the Cheffers Adaptation of Flanders…
Application of time series analysis for assessing reservoir trophic status
Paris Honglay Chen; Ka-Chu Leung
2000-01-01
This study develops and applies a practical procedure for the time series analysis of reservoir eutrophication conditions. A multiplicative decomposition method is used to determine the trophic variations, including seasonal, cyclical, long-term, and irregular changes. The results indicate that (1) there is a long high peak for seven months from April to October...
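A classical multiplicative decomposition of the kind referred to here splits a series into trend, seasonal, and irregular components via a centered moving average and normalized seasonal indices. A minimal sketch for an even (monthly) period, on assumed synthetic data rather than the reservoir record:

```python
import numpy as np

def multiplicative_decompose(x, period=12):
    """Classical multiplicative decomposition (even period): x = T * S * I."""
    n, k = len(x), period // 2
    w = np.r_[0.5, np.ones(period - 1), 0.5] / period   # centered moving average
    trend = np.full(n, np.nan)
    for i in range(k, n - k):
        trend[i] = np.sum(w * x[i - k:i + k + 1])
    ratio = x / trend                                   # detrended series: S * I
    seasonal = np.array([np.nanmean(ratio[m::period]) for m in range(period)])
    seasonal /= seasonal.mean()                         # indices average to 1
    irregular = ratio / np.tile(seasonal, n // period + 1)[:n]
    return trend, seasonal, irregular

# Synthetic monthly series: rising trend with a summer peak
months = np.arange(120)
x = (100 + months) * (1 + 0.2 * np.sin(2 * np.pi * months / 12))
trend, seasonal, irregular = multiplicative_decompose(x, 12)
```

The seasonal indices directly expose the "long high peak" pattern described in the abstract: months with indices above 1 are the seasonally elevated ones.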
Time Series Analysis Based on Running Mann Whitney Z Statistics
USDA-ARS?s Scientific Manuscript database
A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
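The moving-window scheme described can be sketched as follows. This is an illustrative reimplementation, not the USDA-ARS code: the window size, the first-half-versus-second-half comparison, and the normal approximation for U (no tie correction) are all assumptions.

```python
import numpy as np

def mann_whitney_z(a, b):
    """Mann-Whitney U for samples a vs b, normalized to a Z statistic
    using the large-sample normal approximation (no tie correction)."""
    n1, n2 = len(a), len(b)
    # U = number of (a_i, b_j) pairs with a_i > b_j, counting ties as 0.5
    u = sum((ai > b).sum() + 0.5 * (ai == b).sum() for ai in a)
    mu = n1 * n2 / 2.0
    sigma = np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    return (u - mu) / sigma

def running_mw_z(x, half=15):
    """Slide a window of 2*half points; test its first half vs second half."""
    return np.array([mann_whitney_z(x[i:i + half], x[i + half:i + 2 * half])
                     for i in range(len(x) - 2 * half + 1)])

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 1, 100), rng.normal(2, 1, 100)])  # step change
z = running_mw_z(x)   # |z| spikes where the window straddles the step
```

A large negative excursion of `z` flags windows whose first half sits systematically below the second half, i.e. the upward step in the synthetic series.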
Application of modern time series analysis to high stability oscillators
NASA Technical Reports Server (NTRS)
Farrell, B. F.; Mattison, W. M.; Vessot, R. F. C.
1980-01-01
Techniques of modern time series analysis useful for investigating the characteristics of high-stability oscillators and identifying systematic perturbations are discussed with reference to an experiment in which the frequencies of superconducting cavity-stabilized oscillators and hydrogen masers were compared. The techniques examined include transformation to stationarity, autocorrelation and cross-correlation, superresolution, and transfer function determination.
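Two of the techniques named, transformation to stationarity and autocorrelation, can be sketched together. The drifting-oscillator series below is synthetic, and the 50-sample periodic perturbation is an assumed stand-in for a systematic effect; this is not the original analysis code.

```python
import numpy as np

def autocorrelation(x, max_lag):
    """Biased sample autocorrelation, normalized so acf[0] == 1."""
    x = x - x.mean()
    full = np.correlate(x, x, mode="full")[len(x) - 1:]
    return full[:max_lag + 1] / full[0]

rng = np.random.default_rng(2)
t = np.arange(2000)
# synthetic oscillator frequency record: linear drift + periodic systematic + noise
freq = 1e-3 * t + np.sin(2 * np.pi * t / 50) + rng.normal(0, 0.1, t.size)
stationary = np.diff(freq)                 # first differencing removes the drift
acf = autocorrelation(stationary, 100)
period = int(np.argmax(acf[10:90])) + 10   # recovers the ~50-sample perturbation
```

The autocorrelation peak of the differenced (stationary) series exposes the periodic systematic that the raw drifting record hides.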
Identification of human operator performance models utilizing time series analysis
NASA Technical Reports Server (NTRS)
Holden, F. M.; Shinners, S. M.
1973-01-01
The results of an effort performed by Sperry Systems Management Division for AMRL in applying time series analysis as a tool for modeling the human operator are presented. This technique is utilized for determining the variation of the human transfer function under various levels of stress. The human operator's model is determined based on actual input and output data from a tracking experiment.
ADAPTIVE DATA ANALYSIS OF COMPLEX FLUCTUATIONS IN PHYSIOLOGIC TIME SERIES
Peng, C.-K.; Costa, Madalena; Goldberger, Ary L.
2009-01-01
We introduce a generic framework of dynamical complexity to understand and quantify fluctuations of physiologic time series. In particular, we discuss the importance of applying adaptive data analysis techniques, such as the empirical mode decomposition algorithm, to address the challenges of nonlinearity and nonstationarity that are typically exhibited in biological fluctuations. PMID:20041035
Wavelet analysis for non-stationary, nonlinear time series
NASA Astrophysics Data System (ADS)
Schulte, Justin A.
2016-08-01
Methods for detecting and quantifying nonlinearities in nonstationary time series are introduced and developed. In particular, higher-order wavelet analysis was applied to an ideal time series and the quasi-biennial oscillation (QBO) time series. Multiple-testing problems inherent in wavelet analysis were addressed by controlling the false discovery rate. A new local autobicoherence spectrum facilitated the detection of local nonlinearities and the quantification of cycle geometry. The local autobicoherence spectrum of the QBO time series showed that the QBO time series contained a mode with a period of 28 months that was phase coupled to a harmonic with a period of 14 months. An additional nonlinearly interacting triad was found among modes with periods of 10, 16 and 26 months. Local biphase spectra determined that the nonlinear interactions were not quadratic and that the effect of the nonlinearities was to produce non-smoothly varying oscillations. The oscillations were found to be skewed so that negative QBO regimes were preferred, and also asymmetric in the sense that phase transitions between the easterly and westerly phases occurred more rapidly than those from westerly to easterly regimes.
The QuakeSim System for GPS Time Series Analysis
NASA Astrophysics Data System (ADS)
Granat, R. A.; Gao, X.; Pierce, M.; Wang, J.
2010-12-01
We present a system for analysis of GPS time series data available to geosciences users through a web services / web portal interface. The system provides two time series analysis methods, one based on hidden Markov model (HMM) segmentation, the other based on covariance descriptor analysis (CDA). In addition, it provides data pre-processing routines that perform spike noise removal, linear de-trending, sum-of-sines removal, and common mode removal using probabilistic principal components analysis (PPCA). These components can be composed by the user into the desired series of processing steps through an intuitive graphical interface. The system is accessed through a web portal that allows both micro-scale (individual station) and macro-scale (whole network) exploration of data sets and analysis results via Google Maps. Users can focus in on or scroll through particular spatial regions or time windows, or observe dynamic behavior by creating movies that display the system state. Analysis results can be exported to KML format for easy combination with other sources of data, such as fault databases and InSAR interferograms. GPS solutions for California member stations of the Plate Boundary Observatory from both the SOPAC and JPL GIPSY processing groups are automatically imported into the system as the data become available. We show the results of the methods as applied to these data sets for an assortment of case studies, and show how the system can be used to analyze both seismic and aseismic signals.
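Of the pre-processing steps listed, linear de-trending and sum-of-sines removal amount to one least-squares fit of a trend plus seasonal sinusoids. A hedged sketch on synthetic daily positions (the annual/semi-annual periods and the noise level are assumptions; this is not the QuakeSim code):

```python
import numpy as np

def detrend_and_deseason(t, y, periods=(365.25, 182.625)):
    """Least-squares removal of offset, linear trend, and annual/semi-annual
    sinusoids from a GPS-like position series (common pre-processing)."""
    cols = [np.ones_like(t), t]
    for p in periods:
        cols += [np.sin(2 * np.pi * t / p), np.cos(2 * np.pi * t / p)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return y - A @ coef, coef

rng = np.random.default_rng(3)
t = np.arange(1000.0)   # days
y = 2.0 + 0.01 * t + 3.0 * np.sin(2 * np.pi * t / 365.25) + rng.normal(0, 0.3, t.size)
resid, coef = detrend_and_deseason(t, y)   # coef[1] estimates the secular rate
```

The residual is what the HMM segmentation or CDA step would then operate on.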
Interglacial climate dynamics and advanced time series analysis
NASA Astrophysics Data System (ADS)
Mudelsee, Manfred; Bermejo, Miguel; Köhler, Peter; Lohmann, Gerrit
2013-04-01
Studying the climate dynamics of past interglacials (IGs) helps to better assess the anthropogenically influenced dynamics of the current IG, the Holocene. We select the IG portions from the EPICA Dome C ice core archive, which covers the past 800 ka, to apply methods of statistical time series analysis (Mudelsee 2010). The analysed variables are deuterium (indicating temperature) (Jouzel et al. 2007), greenhouse gases (Siegenthaler et al. 2005, Loulergue et al. 2008, Lüthi et al. 2008) and a model-co-derived climate radiative forcing (Köhler et al. 2010). We additionally select high-resolution sea-surface-temperature records from the marine sedimentary archive. The first statistical method, persistence time estimation (Mudelsee 2002), lets us infer the 'climate memory' property of IGs. Second, linear regression informs about long-term climate trends during IGs. Third, ramp function regression (Mudelsee 2000) is adapted to look at abrupt climate changes during IGs. We compare the Holocene with previous IGs in terms of these mathematical approaches, interpret the results in a climate context, assess uncertainties, and determine what data from old IGs must provide to yield results of 'acceptable' accuracy. This work receives financial support from the Deutsche Forschungsgemeinschaft (Project ClimSens within the DFG Research Priority Program INTERDYNAMIK) and the European Commission (Marie Curie Initial Training Network LINC, No. 289447, within the 7th Framework Programme). References: Jouzel J, Masson-Delmotte V, Cattani O, Dreyfus G, Falourd S, Hoffmann G, Minster B, Nouet J, Barnola JM, Chappellaz J, Fischer H, Gallet JC, Johnsen S, Leuenberger M, Loulergue L, Luethi D, Oerter H, Parrenin F, Raisbeck G, Raynaud D, Schilt A, Schwander J, Selmo E, Souchez R, Spahni R, Stauffer B, Steffensen JP, Stenni B, Stocker TF, Tison JL, Werner M, Wolff EW (2007) Orbital and millennial Antarctic climate variability over the past 800,000 years. Science 317:793. Köhler P, Bintanja R
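The first method mentioned, persistence time estimation, is commonly grounded in an AR(1) 'climate memory' model with tau = -dt / ln(a), where a is the lag-1 autocorrelation. A minimal sketch on synthetic AR(1) data (the simple lag-1 regression below is an illustration, not Mudelsee's full estimator, and the series is not ice-core data):

```python
import numpy as np

def persistence_time(x, dt=1.0):
    """AR(1) persistence time tau = -dt / ln(a), with a the lag-1
    autocorrelation coefficient of the (mean-removed) series."""
    x = x - x.mean()
    a = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
    return -dt / np.log(a)

rng = np.random.default_rng(4)
a_true = np.exp(-1.0 / 20.0)          # true persistence time: 20 time steps
x = np.zeros(50000)
for i in range(1, x.size):
    x[i] = a_true * x[i - 1] + rng.standard_normal()
tau = persistence_time(x)             # recovers a value near 20
```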
NASA Astrophysics Data System (ADS)
Assireu, A. T.; Rosa, R. R.; Vijaykumar, N. L.; Lorenzzetti, J. A.; Rempel, E. L.; Ramos, F. M.; Abreu Sá, L. D.; Bolzan, M. J. A.; Zanandrea, A.
2002-08-01
Based on the gradient pattern analysis (GPA) technique, we introduce a new methodology for analyzing short nonstationary time series. Using the asymmetric amplitude fragmentation (AAF) operator from GPA, we analyze Lagrangian data observed as velocity time series for ocean flow. The results show that quasi-periodic, chaotic and turbulent regimes can be well characterized by means of this new geometrical approach.
Wang, Jin; Sun, Xiangping; Nahavandi, Saeid; Kouzani, Abbas; Wu, Yuchuan; She, Mary
2014-11-01
Biomedical time series clustering that automatically groups a collection of time series according to their internal similarity is of importance for medical record management and inspection, such as bio-signal archiving and retrieval. In this paper, a novel framework that automatically groups a set of unlabelled multichannel biomedical time series according to their internal structural similarity is proposed. Specifically, we treat a multichannel biomedical time series as a document and extract local segments from the time series as words. We extend a topic model, the Hierarchical probabilistic Latent Semantic Analysis (H-pLSA), originally developed for visual motion analysis, to cluster a set of unlabelled multichannel time series. The H-pLSA models each channel of the multichannel time series using a local pLSA in the first layer. The topics learned in the local pLSA are then fed to a global pLSA in the second layer to discover the categories of multichannel time series. Experiments on a dataset extracted from multichannel electrocardiography (ECG) signals demonstrate that the proposed method performs better than previous state-of-the-art approaches and is relatively robust to variations of parameters, including the length of local segments and the dictionary size. Although the experimental evaluation used multichannel ECG signals in a biometric scenario, the proposed algorithm is a universal framework for clustering multichannel biomedical time series according to their structural similarity, with many applications in biomedical time series management.
Multivariate stochastic analysis for Monthly hydrological time series at Cuyahoga River Basin
NASA Astrophysics Data System (ADS)
zhang, L.
2011-12-01
Copulas have become a powerful statistical and stochastic methodology for multivariate analysis in environmental and water resources engineering. In recent years, the popular one-parameter Archimedean copulas (e.g., the Gumbel-Hougaard, Cook-Johnson and Frank copulas) and meta-elliptical copulas (e.g., the Gaussian and Student-t copulas) have been applied in multivariate hydrological analyses, e.g., multivariate rainfall (rainfall intensity, duration and depth), flood (peak discharge, duration and volume), and drought analyses (drought length, mean and minimum SPI values, and drought mean areal extent). Copulas have also been applied to flood frequency analysis at the confluences of river systems by taking into account the dependence among upstream gauge stations rather than by using the hydrological routing technique. In most of the studies above, the annual time series have been treated as stationary signals whose values are independent identically distributed (i.i.d.) random variables. In reality, however, hydrological time series, especially daily and monthly series, cannot be considered i.i.d. random variables because of the periodicity in the data structure. The stationarity assumption is also questionable given climate change and land use and land cover (LULC) change in past years. It is therefore necessary to re-evaluate the classic approach to hydrological time series by relaxing the stationarity assumption through a nonstationary approach. Likewise, for the dependence structure of hydrological time series, the assumption of a common univariate distribution type needs to be relaxed by adopting copula theory. In this paper, univariate monthly hydrological time series are studied through a nonstationary time series analysis approach. The dependence structure of the multivariate monthly hydrological time series will be
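Before any copula-based dependence modelling, the monthly periodicity just described has to be removed; a common first step is month-by-month standardization. A sketch on synthetic monthly discharge (the seasonal shape and the standardization-by-month choice are illustrative assumptions, not the paper's method):

```python
import numpy as np

def deseasonalize_monthly(x):
    """Standardize a monthly series by its month-of-year mean and std,
    so the values are closer to identically distributed before fitting
    a copula to the dependence structure."""
    x = np.asarray(x, float)
    months = np.arange(x.size) % 12
    z = np.empty_like(x)
    for m in range(12):
        sel = months == m
        z[sel] = (x[sel] - x[sel].mean()) / x[sel].std()
    return z

rng = np.random.default_rng(5)
n_years = 200
seasonal = np.tile(10 + 5 * np.sin(2 * np.pi * np.arange(12) / 12), n_years)
flow = seasonal + rng.normal(0, 1, 12 * n_years)   # synthetic monthly discharge
z = deseasonalize_monthly(flow)                    # periodicity removed
```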
A perturbative approach for enhancing the performance of time series forecasting.
de Mattos Neto, Paulo S G; Ferreira, Tiago A E; Lima, Aranildo R; Vasconcelos, Germano C; Cavalcanti, George D C
2017-04-01
This paper proposes a method to perform time series prediction based on perturbation theory. The approach is based on continuously adjusting an initial forecasting model to asymptotically approximate a desired time series model. First, a predictive model generates an initial forecast for a time series. Second, a residual time series is calculated as the difference between the original time series and the initial forecast. If that residual series is not white noise, it can be used to improve the accuracy of the initial model, and a new predictive model is adjusted using the residual series. The whole process is repeated until convergence or until the residual series becomes white noise. The output of the method is then given by summing the outputs of all trained predictive models in a perturbative sense. To test the method, an experimental investigation was conducted on six real-world time series. A comparison was made with six other methods and with ten results reported in the literature. Results show that not only is the performance of the initial model significantly improved, but also that the proposed method outperforms the previously published results.
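The iterate-on-the-residual loop described above can be sketched with a simple least-squares AR model standing in for the paper's predictive models; the AR order, the fixed number of stages, and the synthetic series are all assumptions made for illustration.

```python
import numpy as np

def fit_ar(x, order=3):
    """Least-squares AR(order) coefficients for one-step prediction."""
    A = np.column_stack([x[i:len(x) - order + i] for i in range(order)])
    coef, *_ = np.linalg.lstsq(A, x[order:], rcond=None)
    return coef

def predict_ar(x, coef):
    order = len(coef)
    A = np.column_stack([x[i:len(x) - order + i] for i in range(order)])
    return A @ coef

def perturbative_forecast(x, order=3, n_stages=3):
    """Stage 0 fits the series; each later stage fits the residual of the
    stages so far; the final prediction is the sum of all stage outputs."""
    target = x.copy()
    total = np.zeros(len(x) - order)
    for _ in range(n_stages):
        coef = fit_ar(target, order)
        stage_pred = predict_ar(target, coef)
        total += stage_pred
        # next stage models what the stages so far failed to explain
        target = np.concatenate([target[:order], target[order:] - stage_pred])
    return total

rng = np.random.default_rng(6)
t = np.arange(600)
x = np.sin(2 * np.pi * t / 40) + 0.1 * rng.standard_normal(t.size)
pred = perturbative_forecast(x)
rmse = np.sqrt(np.mean((pred - x[3:]) ** 2))
```

Once the residual is effectively white noise, the later stages contribute almost nothing, which is the stopping condition the paper formalizes.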
Time series segmentation: a new approach based on Genetic Algorithm and Hidden Markov Model
NASA Astrophysics Data System (ADS)
Toreti, A.; Kuglitsch, F. G.; Xoplaki, E.; Luterbacher, J.
2009-04-01
The subdivision of a time series into homogeneous segments has been performed using various methods applied to different disciplines. In climatology, for example, it is accompanied by the well-known homogenization problem and the detection of artificial change points. In this context, we present a new method (GAMM) based on a Hidden Markov Model (HMM) and a Genetic Algorithm (GA), applicable to series of independent observations (and easily adaptable to autoregressive processes). A left-to-right hidden Markov model is applied, estimating the parameters and the best-state sequence with the Baum-Welch and Viterbi algorithms, respectively. In order to avoid the well-known dependence of the Baum-Welch algorithm on the initial condition, a Genetic Algorithm was developed, characterized by mutation, elitism and a crossover procedure implemented with some restrictive rules. Moreover, the function to be minimized was derived following the approach of Kehagias (2004), i.e. the so-called complete log-likelihood. The number of states was determined by applying a two-fold cross-validation procedure (Celeux and Durand, 2008). Since this last issue is complex and influences the whole analysis, a Multi Response Permutation Procedure (MRPP; Mielke et al., 1981) was added: it tests the model with K+1 states (where K is the number of states of the best model) whenever its likelihood is close to that of the K-state model. Finally, an evaluation of the GAMM performance, applied as a break detection method in the field of climate time series homogenization, is shown. 1. G. Celeux and J.B. Durand, Comput Stat 2008. 2. A. Kehagias, Stoch Envir Res 2004. 3. P.W. Mielke, K.J. Berry, G.W. Brier, Monthly Wea Rev 1981.
The application of complex network time series analysis in turbulent heated jets
Charakopoulos, A. K.; Karakasidis, T. E.; Liakopoulos, A.; Papanicolaou, P. N.
2014-06-15
In the present study, we applied the methodology of complex network-based time series analysis to experimental temperature time series from a vertical turbulent heated jet. More specifically, we approach the hydrodynamic problem of discriminating time series corresponding to various regions relative to the jet axis, i.e., distinguishing time series from regions close to the jet axis from time series originating in regions with a different dynamical regime, based on the constructed network properties. Applying the phase space transformation method (k nearest neighbors) and also the visibility algorithm, we transformed the time series into networks and evaluated topological properties of the networks such as degree distribution, average path length, diameter, modularity, and clustering coefficient. The results show that the complex network approach allows distinguishing, identifying, and exploring in detail various dynamical regions of the jet flow, and associating them with the corresponding physical behavior. In addition, in order to reject the hypothesis that the studied networks originate from a stochastic process, we generated random networks and compared their statistical properties with those of the networks obtained from the experimental data. As far as the efficiency of the two network-construction methods is concerned, we conclude that both methodologies lead to network properties with almost the same qualitative behavior and allow us to reveal the underlying system dynamics.
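Of the two network constructions mentioned, the visibility algorithm is the more compact to illustrate. The O(n^2) sketch below follows the natural-visibility criterion (an edge joins two samples when the straight line between them clears every intermediate sample); it is a toy illustration, not the study's implementation.

```python
import numpy as np

def visibility_edges(x):
    """Natural visibility graph: nodes are samples; (i, j) is an edge if the
    straight line between (i, x[i]) and (j, x[j]) passes strictly above
    every intermediate sample."""
    n = len(x)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if all(x[k] < x[i] + (x[j] - x[i]) * (k - i) / (j - i)
                   for k in range(i + 1, j)):
                edges.add((i, j))
    return edges

x = np.array([1.0, 3.0, 2.0, 4.0, 1.5])
edges = visibility_edges(x)
degree = np.zeros(len(x), int)         # degree sequence feeds the
for i, j in edges:                     # degree-distribution analysis
    degree[i] += 1
    degree[j] += 1
```

Network statistics such as the degree distribution or clustering coefficient are then computed on `edges`, exactly as the abstract describes for the jet temperature series.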
Time series analysis of nuclear instrumentation in EBR-II
Imel, G.R.
1996-05-01
Results of a time series analysis of the scaler count data from the 3 wide-range nuclear detectors in the Experimental Breeder Reactor-II are presented. One of the channels was replaced, and it was desired to determine if there was any statistically significant change (i.e., improvement) in the channel's response after the replacement. Data were collected from all 3 channels for 16-day periods before and after detector replacement. Time series analysis and statistical tests showed that there was no significant change after the detector replacement. Also, there were no statistically significant differences among the 3 channels, either before or after the replacement. Finally, it was determined that errors in the reactivity change inferred from subcritical count monitoring during fuel handling would be on the order of 20-30 cents for single count intervals.
Profile Analysis: Multidimensional Scaling Approach.
ERIC Educational Resources Information Center
Ding, Cody S.
2001-01-01
Outlines an exploratory multidimensional scaling-based approach to profile analysis called Profile Analysis via Multidimensional Scaling (PAMS) (M. Davison, 1994). The PAMS model has the advantages of being applied to samples of any size easily, classifying persons on a continuum, and using person profile index for further hypothesis studies, but…
Mode Analysis with Autocorrelation Method (Single Time Series) in Tokamak
NASA Astrophysics Data System (ADS)
Saadat, Shervin; Salem, Mohammad K.; Goranneviss, Mahmoud; Khorshid, Pejman
2010-08-01
In this paper, plasma modes are analyzed with a statistical method based on the autocorrelation function. The autocorrelation function requires only one time series, so a single Mirnov coil suffices. After autocorrelation analysis of the Mirnov coil data, the spectral density diagram is plotted; the symmetries and trends of the spectral density diagram allow the plasma modes to be analyzed. The effects of RHF fields are investigated with this method in the IR-T1 tokamak, and the results correspond with those of multichannel methods such as SVD and FFT.
A Markovian Entropy Measure for the Analysis of Calcium Activity Time Series.
Marken, John P; Halleran, Andrew D; Rahman, Atiqur; Odorizzi, Laura; LeFew, Michael C; Golino, Caroline A; Kemper, Peter; Saha, Margaret S
2016-01-01
Methods to analyze the dynamics of calcium activity often rely on visually distinguishable features in time series data such as spikes, waves, or oscillations. However, systems such as the developing nervous system display a complex, irregular type of calcium activity which makes the use of such methods less appropriate. Instead, for such systems there exists a class of methods (including information theoretic, power spectral, and fractal analysis approaches) which use more fundamental properties of the time series to analyze the observed calcium dynamics. We present a new analysis method in this class, the Markovian Entropy measure: an easily implementable method that represents the observed calcium activity as a realization of a Markov process and describes its dynamics in terms of the level of predictability underlying the transitions between the states of the process. We applied our method and other commonly used calcium analysis methods to a dataset from Xenopus laevis neural progenitors, which displays irregular calcium activity, and a dataset from murine synaptic neurons, which displays activity time series that are well described by visually distinguishable features. We find that the Markovian Entropy measure is able to distinguish between biologically distinct populations in both datasets, and that it can separate biologically distinct populations to a greater extent than other methods in the dataset exhibiting irregular calcium activity. These results support the benefit of using the Markovian Entropy measure to analyze calcium dynamics, particularly for studies using time series data which do not exhibit easily distinguishable features.
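The core idea, discretize the series, fit a first-order Markov chain, and score predictability by the entropy rate, can be sketched as follows. The quantile binning and the four-state choice are assumptions for illustration; this is not the authors' implementation.

```python
import numpy as np

def markovian_entropy(x, n_states=4):
    """Discretize a series into equiprobable (quantile) states, fit a
    first-order Markov transition matrix, and return the entropy rate
    H = -sum_i pi_i sum_j P_ij log2 P_ij (bits per transition)."""
    bins = np.quantile(x, np.linspace(0, 1, n_states + 1)[1:-1])
    s = np.digitize(x, bins)
    counts = np.zeros((n_states, n_states))
    for a, b in zip(s[:-1], s[1:]):
        counts[a, b] += 1
    P = counts / counts.sum(axis=1, keepdims=True)
    pi = counts.sum(axis=1) / counts.sum()
    logP = np.where(P > 0, np.log2(np.where(P > 0, P, 1.0)), 0.0)
    return float(-np.sum(pi[:, None] * P * logP))

rng = np.random.default_rng(7)
noise = rng.standard_normal(5000)                  # unpredictable transitions
wave = np.sin(2 * np.pi * np.arange(5000) / 100)   # highly predictable
h_noise = markovian_entropy(noise)                 # near log2(4) = 2 bits
h_wave = markovian_entropy(wave)                   # much lower
```

Irregular activity yields an entropy rate near the maximum, while regular oscillations yield a low rate, which is the separation the measure exploits.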
Satellite time series analysis using Empirical Mode Decomposition
NASA Astrophysics Data System (ADS)
Pannimpullath, R. Renosh; Doolaeghe, Diane; Loisel, Hubert; Vantrepotte, Vincent; Schmitt, Francois G.
2016-04-01
Geophysical fields exhibit large fluctuations over many spatial and temporal scales. Successive satellite images provide an interesting sampling of this spatio-temporal multiscale variability. Here we propose to study this variability by performing satellite time series analysis, pixel by pixel, using Empirical Mode Decomposition (EMD). EMD is a time series analysis technique that decomposes an original time series into a sum of modes, each with a different mean frequency. It can be used to smooth signals and to extract trends; it is built in a data-adaptive way and can extract information from nonlinear signals. Here we use MERIS Suspended Particulate Matter (SPM) data, on a weekly basis, over 10 years, giving 458 successive time steps. We selected five regions of coastal waters for the present study: Vietnam coastal waters, the Brahmaputra region, the St. Lawrence, the English Channel and the McKenzie. These regions have high SPM concentrations due to large-scale river run-off. Trend and Hurst exponents are derived for each pixel in each region. The energy is also extracted using Hilbert Spectral Analysis (HSA) together with the EMD method, and the energy of each mode is normalised by the total energy over all modes for each region.
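The abstract derives a Hurst exponent per pixel. As a hedged illustration, here is a standard detrended fluctuation analysis (DFA) estimator, a simpler route than the EMD/HSA one used in the study, applied to synthetic signals whose exponents are known (about 0.5 for white noise, about 1.5 for a random walk):

```python
import numpy as np

def dfa_hurst(x, scales=(8, 16, 32, 64, 128)):
    """Hurst-type scaling exponent via detrended fluctuation analysis:
    integrate the series, detrend it in windows of each scale, and fit
    the log-log slope of fluctuation versus scale."""
    y = np.cumsum(x - np.mean(x))          # integrated profile
    flucts = []
    for s in scales:
        f2 = []
        t = np.arange(s)
        for k in range(len(y) // s):
            seg = y[k * s:(k + 1) * s]
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            f2.append(np.mean((seg - trend) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    h, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return h

rng = np.random.default_rng(8)
h_white = dfa_hurst(rng.standard_normal(20000))            # expect ~0.5
h_brown = dfa_hurst(np.cumsum(rng.standard_normal(20000))) # expect ~1.5
```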
Gavrishchaka, Valeriy; Senyukova, Olga; Davis, Kristina
2015-01-01
Previously, we have proposed using complementary complexity measures discovered by boosting-like ensemble learning to enhance quantitative indicators dealing with necessarily short physiological time series. We have confirmed the robustness of such multi-complexity measures for heart rate variability analysis, with an emphasis on the detection of emerging and intermittent cardiac abnormalities. Recently, we presented preliminary results suggesting that such an ensemble-based approach could also be effective in discovering universal meta-indicators for early detection and convenient monitoring of neurological abnormalities using gait time series. Here, we argue and demonstrate that these multi-complexity ensemble measures for gait time series analysis could have a significantly wider application scope, ranging from diagnostics and early detection of physiological regime change to gait-based biometrics applications.
Time series analysis using semiparametric regression on oil palm production
NASA Astrophysics Data System (ADS)
Yundari; Pasaribu, U. S.; Mukhaiyar, U.
2016-04-01
This paper presents a semiparametric kernel regression method, which has shown its flexibility and ease of mathematical calculation, especially in estimating density and regression functions. The kernel function is continuous and produces a smooth estimate. The classical kernel density estimator is constructed by completely nonparametric analysis and works reasonably well for all forms of function. Here, we discuss parameter estimation in time series analysis: we first assume that parameters exist, and then apply nonparametric estimation, a combination called semiparametric. The optimum bandwidth is selected by approximately minimising the Mean Integrated Squared Error (MISE).
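The nonparametric half of such a semiparametric fit is typically a kernel smoother. A minimal Nadaraya-Watson sketch with a Gaussian kernel (the data are synthetic, and the fixed bandwidth stands in for the MISE-based selection discussed in the abstract):

```python
import numpy as np

def nw_kernel_regression(x_train, y_train, x_eval, bandwidth):
    """Nadaraya-Watson estimator with a Gaussian kernel: a locally
    weighted mean of y_train around each evaluation point."""
    d = (x_eval[:, None] - x_train[None, :]) / bandwidth
    w = np.exp(-0.5 * d ** 2)
    return (w @ y_train) / w.sum(axis=1)

rng = np.random.default_rng(9)
x = np.sort(rng.uniform(0, 10, 400))
y = np.sin(x) + rng.normal(0, 0.2, x.size)       # noisy regression function
y_hat = nw_kernel_regression(x, y, x, bandwidth=0.4)
rmse = np.sqrt(np.mean((y_hat - np.sin(x)) ** 2))
```

A larger bandwidth oversmooths (bias), a smaller one undersmooths (variance); MISE-style bandwidth selection trades the two off.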
Transoral robotic approach to parapharyngeal space tumors: Case series and technical limitations.
Boyce, Brian J; Curry, Joseph M; Luginbuhl, Adam; Cognetti, David M
2016-08-01
The transoral robotic approach to parapharyngeal space (PPS) tumors is a new technique with limited data available on its feasibility, safety, and efficacy. We analyzed our experience with transoral robotic excisions of PPS tumors to evaluate the safety and efficacy of this technique. Retrospective chart analysis at a tertiary academic medical center. From July 2010 to June 2014, 17 patients who had transoral robotic excision of PPS tumors were included in the study. Our cohort had an average age of 61.6 years and was 52.9% male. All patients had successful removal of their PPS tumors, and the average tumor volume was 27.3 cm³ (range 2-80 cm³). Two cases (11.7%) required a cervical incision to assist with tumor removal. The average total operative time was 140.5 minutes. Two PPS PAs had focal areas of capsule rupture and one was fragmented. The average length of stay was 1.8 days (range 1-7 days), and all patients were discharged on an oral diet. Three patients experienced complications. There was no clinical or radiographic evidence of recurrence. This is the largest single-institution case series of transoral robotic approaches to PPS tumors. We demonstrate that this approach is feasible and safe, but also note limitations of the robotic approach for tumors in the far lateral and superior areas of the PPS, which required transcervical assistance. There were no patients who demonstrated recurrent tumor either radiographically or clinically. Level of evidence: 4. Laryngoscope, 126:1776-1782, 2016. © 2016 The American Laryngological, Rhinological and Otological Society, Inc.
Observability of nonlinear dynamics: Normalized results and a time-series approach
NASA Astrophysics Data System (ADS)
Aguirre, Luis A.; Bastos, Saulo B.; Alves, Marcela A.; Letellier, Christophe
2008-03-01
This paper investigates the observability of nonlinear dynamical systems. Two difficulties associated with previous studies are dealt with. First, a normalized degree of observability is defined. This permits the comparison of different systems, which was not generally possible before. Second, a time-series approach is proposed, based on omnidirectional nonlinear correlation functions, to rank a set of time series of a system in terms of their potential use for reconstructing the original dynamics without requiring knowledge of the system equations. The two approaches proposed in this paper and a former method were applied to five benchmark systems, and an overall agreement of over 92% was found.
Vertex epidural hematoma: An analysis of a large series
Ramesh, Vengalathur Ganesan; Kodeeswaran, Marappan; Deiveegan, Kunjithapatham; Sundar, Venkataraman; Sriram, Kuchalambal
2017-01-01
Context: Vertex epidural hematoma (VEDH) is uncommon. A high index of suspicion is required to suspect and diagnose this condition, and the surgical management is a challenge to neurosurgeons. There are only isolated case reports or small series of VEDH in the literature. Aims: We have tried to analyze a large series of VEDH seen in our institute. Settings and Design: Retrospective observational study. Subjects and Methods: This is an analysis of case records of patients with VEDH during 17 years period from 1995 to 2012. Statistical Analysis Used: Nil. Results: Twenty nine cases of VEDH encountered over a period of 17 years have been analyzed, including 26 males and 3 females. Majority were due to road accidents. Headache, papilledema and lower limb weakness have been the major presenting features in these cases. The diagnosis was by direct coronal computerized tomography (CT) scan in most of them. Majority were managed conservatively with observation and serial imaging. Four patients who had large VEDH with altered sensorium were managed surgically. The source of bleeding was mainly from superior sagittal sinus. Conclusions: VEDH has to be suspected when a patient presents with impact over the vertex and features of raised intracranial pressure. Direct coronal CT or magnetic resonance imaging is useful in the diagnosis. Surgery is required when the patient develops progressive deterioration in sensorium and/or with the hematoma volume more than 30 ml. The present series of 29 cases is the largest reported so far. PMID:28484524
A Bayesian Approach for Summarizing and Modeling Time-Series Exposure Data with Left Censoring.
Houseman, E Andres; Virji, M Abbas
2017-08-01
Direct reading instruments are valuable tools for measuring exposure as they provide real-time measurements for rapid decision making. However, their use is limited to general survey applications in part due to issues related to their performance. Moreover, statistical analysis of real-time data is complicated by autocorrelation among successive measurements, non-stationary time series, and the presence of left-censoring due to limit-of-detection (LOD). A Bayesian framework is proposed that accounts for non-stationary autocorrelation and LOD issues in exposure time-series data in order to model workplace factors that affect exposure and estimate summary statistics for tasks or other covariates of interest. A spline-based approach is used to model non-stationary autocorrelation with relatively few assumptions about autocorrelation structure. Left-censoring is addressed by integrating over the left tail of the distribution. The model is fit using Markov chain Monte Carlo within a Bayesian paradigm. The method can flexibly account for hierarchical relationships, random effects and fixed effects of covariates. The method is implemented using the rjags package in R, and is illustrated by applying it to real-time exposure data. Estimates for task means and covariates from the Bayesian model are compared to those from conventional frequentist models including linear regression, mixed-effects, and time-series models with different autocorrelation structures. Simulation studies are also conducted to evaluate method performance. Simulation studies with the percentage of measurements below the LOD ranging from 0 to 50% showed the lowest root mean squared errors for task means and the least biased standard deviations from the Bayesian model compared to the frequentist models across all levels of LOD. In the application, task means from the Bayesian model were similar to means from the frequentist models, while the standard deviations were different. Parameter estimates for covariates
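The left-censoring treatment described above, where a reading below the LOD contributes the probability mass of the left tail rather than a density value, can be sketched in a few lines. This is a simplified maximum-likelihood illustration with a fixed standard deviation and a grid search, not the paper's Bayesian spline model; the data, LOD, and function names are invented for the example.

```python
import math

def normal_cdf(x, mu, sigma):
    """Normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def censored_loglik(values, lod, mu, sigma):
    """Log-likelihood of normal data with left-censoring at the LOD.

    Observed values contribute the normal log-density; censored readings
    (recorded as None) contribute the log of the probability mass below
    the LOD, i.e. the integral over the left tail.
    """
    ll = 0.0
    for v in values:
        if v is None:  # reading below the limit of detection
            ll += math.log(normal_cdf(lod, mu, sigma))
        else:
            ll += (-0.5 * math.log(2.0 * math.pi * sigma ** 2)
                   - (v - mu) ** 2 / (2.0 * sigma ** 2))
    return ll

# Grid search for the censored MLE of the mean (sigma fixed for brevity).
data = [0.8, 1.2, None, 1.5, None, 0.9]   # None = below LOD (hypothetical)
lod = 0.5
best_mu = max((m / 100.0 for m in range(-100, 300)),
              key=lambda m: censored_loglik(data, lod, m, 0.5))
```

The censored terms pull the estimated mean below the naive average of the detected values (1.1 here), which is the bias the Bayesian model corrects for in a principled way.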
The multiscale analysis between stock market time series
NASA Astrophysics Data System (ADS)
Shi, Wenbin; Shang, Pengjian
2015-11-01
This paper is devoted to multiscale cross-correlation analysis of stock market time series, applying both the multiscale DCCA cross-correlation coefficient and multiscale cross-sample entropy (MSCE). The multiscale DCCA cross-correlation coefficient is a realization of the DCCA cross-correlation coefficient on multiple scales. The results of this method present a good scaling characterization; more significantly, the method is able to group stock markets by area. Compared to the multiscale DCCA cross-correlation coefficient, MSCE presents a more remarkable scaling characterization, and its value for each log-return series decreases with increasing scale factor. However, its grouping results are not as good as those of the multiscale DCCA cross-correlation coefficient.
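The DCCA cross-correlation coefficient at a single scale can be sketched as follows: both series are integrated into profiles, split into non-overlapping boxes, linearly detrended box by box, and the detrended covariance is normalized by the two detrended variances. This is a minimal single-scale illustration (the multiscale version sweeps the box size); the function names are ours.

```python
def _detrended_residuals(profile):
    """Residuals of an ordinary least-squares line fitted to one box."""
    n = len(profile)
    xs = list(range(n))
    mx = sum(xs) / n
    my = sum(profile) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, profile)) / sxx
    intercept = my - slope * mx
    return [y - (intercept + slope * x) for x, y in zip(xs, profile)]

def dcca_coefficient(x, y, scale):
    """DCCA cross-correlation coefficient rho_DCCA at one box size.

    Profiles (cumulative sums) are split into non-overlapping boxes of
    length `scale`, linearly detrended, and the detrended covariance is
    normalized by the detrended variances, giving a value in [-1, 1].
    """
    px, py = [0.0], [0.0]
    for a, b in zip(x, y):
        px.append(px[-1] + a)
        py.append(py[-1] + b)
    fxy = fxx = fyy = 0.0
    for start in range(0, len(px) - scale + 1, scale):
        rx = _detrended_residuals(px[start:start + scale])
        ry = _detrended_residuals(py[start:start + scale])
        fxy += sum(u * v for u, v in zip(rx, ry))
        fxx += sum(u * u for u in rx)
        fyy += sum(v * v for v in ry)
    return fxy / (fxx * fyy) ** 0.5
```

A series compared with itself yields 1, and with its negation yields -1, which is a quick sanity check on the normalization.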
Enveloping Spectral Surfaces: Covariate Dependent Spectral Analysis of Categorical Time Series
Krafty, Robert T.; Xiong, Shuangyan; Stoffer, David S.; Buysse, Daniel J.; Hall, Martica
2014-01-01
Motivated by problems in Sleep Medicine and Circadian Biology, we present a method for the analysis of cross-sectional categorical time series collected from multiple subjects where the effect of static continuous-valued covariates is of interest. Toward this goal, we extend the spectral envelope methodology for the frequency domain analysis of a single categorical process to cross-sectional categorical processes that are possibly covariate dependent. The analysis introduces an enveloping spectral surface for describing the association between the frequency domain properties of qualitative time series and covariates. The resulting surface offers an intuitively interpretable measure of association between covariates and a qualitative time series by finding the maximum possible conditional power at a given frequency from scalings of the qualitative time series conditional on the covariates. The optimal scalings that maximize the power provide scientific insight by identifying the aspects of the qualitative series which have the most pronounced periodic features at a given frequency conditional on the value of the covariates. To facilitate the assessment of the dependence of the enveloping spectral surface on the covariates, we include a theory for analyzing the partial derivatives of the surface. Our approach is entirely nonparametric, and we present estimation and asymptotics in the setting of local polynomial smoothing. PMID:24790257
PyRQA - Conducting recurrence quantification analysis on very long time series efficiently
NASA Astrophysics Data System (ADS)
Rawald, Tobias; Sips, Mike; Marwan, Norbert
2017-07-01
PyRQA is a software package that efficiently conducts recurrence quantification analysis (RQA) on time series consisting of more than one million data points. RQA is a method from nonlinear time series analysis that quantifies the recurrent behaviour of systems. Existing RQA implementations either cannot analyse such very long time series at all or require large amounts of time to calculate the quantitative measures. PyRQA overcomes their limitations by conducting the RQA computations in a highly parallel manner. Building on the OpenCL framework, PyRQA leverages the computing capabilities of a variety of parallel hardware architectures, such as GPUs. The underlying computing approach partitions the RQA computations and makes it possible to employ multiple compute devices at the same time. The goal of this publication is to demonstrate the features and the runtime efficiency of PyRQA. For this purpose we employ a real-world example, comparing the dynamics of two climatological time series, and a synthetic example, reducing the runtime of analysing a series of over one million data points from almost eight hours using state-of-the-art RQA software to roughly 69 s using PyRQA.
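The quantities PyRQA computes can be illustrated without the package: a recurrence matrix from a distance threshold, the recurrence rate, and determinism from diagonal-line structure. This is a naive O(N²) pure-Python sketch for intuition only; PyRQA's contribution is precisely avoiding this brute-force approach at scale.

```python
def recurrence_matrix(series, eps):
    """R[i][j] = 1 when points i and j are within eps of each other."""
    n = len(series)
    return [[1 if abs(series[i] - series[j]) < eps else 0
             for j in range(n)] for i in range(n)]

def recurrence_rate(R):
    """Fraction of matrix entries that are recurrence points."""
    n = len(R)
    return sum(map(sum, R)) / float(n * n)

def determinism(R, lmin=2):
    """Fraction of recurrence points lying on diagonal lines of length
    >= lmin, excluding the main diagonal (the trivial line of identity).
    High determinism indicates predictable, rule-driven dynamics."""
    n = len(R)
    total = diag = 0
    for k in range(-(n - 1), n):
        if k == 0:
            continue
        # walk diagonal k, collecting run lengths of consecutive ones
        line = [R[i][i + k] for i in range(max(0, -k), min(n, n - k))]
        total += sum(line)
        run = 0
        for v in line + [0]:
            if v:
                run += 1
            else:
                if run >= lmin:
                    diag += run
                run = 0
    return diag / total if total else 0.0
```

A strictly periodic signal produces unbroken diagonal lines and hence determinism 1.0, which is the standard sanity check for an RQA implementation.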
A comparison of the stochastic and machine learning approaches in hydrologic time series forecasting
NASA Astrophysics Data System (ADS)
Kim, T.; Joo, K.; Seo, J.; Heo, J. H.
2016-12-01
Hydrologic time series forecasting is an essential task in water resources management, and it becomes more difficult due to the complexity of the runoff process. Traditional stochastic models such as the ARIMA family have been used as a standard approach to time series modeling and forecasting of hydrological variables. Due to the nonlinearity in hydrologic time series data, machine learning approaches have been studied for their advantage of discovering relevant features in a nonlinear relation among variables. This study aims to compare the predictability of the traditional stochastic model and the machine learning approach. A seasonal ARIMA model was used as the traditional time series model, and a Random Forest model, an ensemble of decision trees using a multiple-predictor approach, was applied as the machine learning approach. In the application, monthly inflow data from 1986 to 2015 for Chungju dam in South Korea were used for modeling and forecasting. In order to evaluate the performance of the models, one-step-ahead and multi-step-ahead forecasting were applied, and the root mean squared error and mean absolute error of the two models were compared.
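The evaluation step, scoring one-step-ahead forecasts by root mean squared error and mean absolute error, can be sketched directly. The seasonal-naive baseline here is our own illustrative stand-in, not one of the models compared in the study.

```python
import math

def rmse(obs, pred):
    """Root mean squared error over paired observations and forecasts."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def mae(obs, pred):
    """Mean absolute error over paired observations and forecasts."""
    return sum(abs(o - p) for o, p in zip(obs, pred)) / len(obs)

def seasonal_naive(series, period=12):
    """Hypothetical baseline (not from the paper): forecast each month
    with the value observed one seasonal period earlier.
    Returns (predictions, matching observations)."""
    return series[:-period], series[period:]
```

RMSE penalizes large misses more heavily than MAE (RMSE >= MAE always holds), so reporting both, as the study does, separates typical error from sensitivity to occasional large flood-season misses.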
Nonlinear time series analysis of solar and stellar data
NASA Astrophysics Data System (ADS)
Jevtic, Nada
2003-06-01
Nonlinear time series analysis was developed to study chaotic systems. Its utility is investigated here for solar and stellar time series. Sunspot records are the longest astronomical time series, and they reflect the long-term variation of the solar magnetic field. Owing to periods of low solar activity, such as the Maunder minimum, and the quasiperiodicity of the solar cycle, it has been postulated that the solar dynamo is a chaotic system. We show that, because of how the sunspot number is defined, this postulate cannot be tested with nonlinear time series methods. To complement the sunspot data analysis, theoretically generated data for the α-Ω solar dynamo with meridional circulation were analyzed, and the effects of stochastic fluctuations on the energy of the dynamo were investigated. This gave a clearer understanding of the effect of dynamical noise on the unperturbed system, which in turn proved useful in the study of the light-intensity curve of the white dwarf PG 1351+489. Dynamical resetting was identified for PG 1351+489 using phase-space methods, and then, using nonlinear noise-reduction methods, the white-noise tail of the power spectrum was lowered by a factor of 40. This allowed the identification of 10 new lines in the power spectrum. Finally, using Poincaré-section return times, a periodicity in the light curve of the cataclysmic variable SS Cygni was identified. We initially expected that time-delay methods would be useful only as a qualitative comparison tool. However, under the proper set of constraints on the data sets, they were capable of providing quantitative information about the signal source.
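The time-delay methods referred to above start from delay-coordinate embedding: a scalar series is unfolded into vectors of lagged values so that phase-space structure can be examined. A minimal sketch (the embedding dimension and delay are analysis choices, not given by the data):

```python
def delay_embed(series, dim, tau):
    """Reconstruct a phase-space trajectory from a scalar series using
    time-delay coordinates: v_t = (x_t, x_{t+tau}, ..., x_{t+(dim-1)tau}).
    Returns a list of dim-dimensional points."""
    n = len(series) - (dim - 1) * tau
    return [tuple(series[t + k * tau] for k in range(dim)) for t in range(n)]
```

Each reconstructed point is one state of the system; recurrence analysis, noise reduction, and Poincaré sections all operate on this embedded trajectory rather than on the raw scalar series.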
NASA Astrophysics Data System (ADS)
Radhakrishnan, Srinivasan; Duvvuru, Arjun; Sultornsanee, Sivarit; Kamarthi, Sagar
2016-02-01
The cross correlation coefficient has been widely applied in financial time series analysis, specifically for understanding chaotic behaviour in stock price and index movements during crisis periods. To better understand time series correlation dynamics, the cross correlation matrices are represented as networks, in which a node stands for an individual time series and a link indicates cross correlation between a pair of nodes. These networks are converted into simpler trees using different schemes. In this context, Minimum Spanning Trees (MST) are the most favoured tree structures because of their ability to preserve all the nodes and thereby retain the essential information embedded in the network. Although the cross correlations underlying MSTs capture essential information, they do not faithfully capture the dynamic behaviour embedded in the time series data of financial systems, because cross correlation is a reliable measure only if the relationship between the time series is linear. To address this issue, this work investigates a new measure called phase synchronization (PS) for establishing correlations among time series that relate to one another linearly or nonlinearly. In this approach the strength of a link between a pair of time series (nodes) is determined by the level of phase synchronization between them. We compare the performance of the phase synchronization based MST with the cross correlation based MST along selected network measures across a temporal frame that includes economically good and crisis periods. We observe agreement in the directionality of the results across the two methods: they show similar trends, upward or downward, when comparing selected network measures. Although both methods give similar trends, the phase synchronization based MST is a more reliable representation of the dynamic behaviour of financial systems than the cross correlation based MST because of the former's ability to quantify nonlinear relationships among time
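The MST construction described above can be sketched with Prim's algorithm over a distance matrix. The correlation-to-distance map shown is the standard Mantegna metric used for correlation-based MSTs; a PS-based tree would simply substitute a distance derived from phase synchronization.

```python
import math

def correlation_to_distance(rho):
    """Mantegna's metric: maps a correlation in [-1, 1] to a distance,
    with perfectly correlated series at distance 0."""
    return math.sqrt(2.0 * (1.0 - rho))

def minimum_spanning_tree(dist):
    """Prim's algorithm over a dense symmetric distance matrix.
    Returns the MST as a list of (i, j) edges; all n nodes are retained
    with exactly n - 1 links."""
    n = len(dist)
    in_tree = {0}
    edges = []
    while len(in_tree) < n:
        i, j = min(((a, b) for a in in_tree for b in range(n)
                    if b not in in_tree),
                   key=lambda e: dist[e[0]][e[1]])
        edges.append((i, j))
        in_tree.add(j)
    return edges
```

Because the MST keeps every node but only the n - 1 strongest links, it filters the dense correlation network down to its backbone, which is why network measures computed on it are comparable across time windows.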
ERIC Educational Resources Information Center
Nguyen, Dang Liem
The contrastive analysis presented here is Volume Three in the author's series "A Contrastive Analysis of English and Vietnamese." Other volumes already published are "English Grammar, A Combined Tagmemic and Transformational Approach" [AL 002 422] and "Vietnamese Grammar, A Combined Tagmemic and Transformational Approach." The volume covering…
Three-dimensional Neumann-series approach to model light transport in nonuniform media
Jha, Abhinav K.; Kupinski, Matthew A.; Barrett, Harrison H.; Clarkson, Eric; Hartman, John H.
2014-01-01
We present the implementation, validation, and performance of a three-dimensional (3D) Neumann-series approach to model photon propagation in nonuniform media using the radiative transport equation (RTE). The RTE is implemented for nonuniform scattering media in a spherical harmonic basis for a diffuse-optical-imaging setup. The method is parallelizable and implemented on a computing system consisting of NVIDIA Tesla C2050 graphics processing units (GPUs). The GPU implementation provides a speedup of up to two orders of magnitude over the non-GPU implementation, which leads to good computational efficiency for the Neumann-series method. The results of the method are compared with results obtained using Monte Carlo simulations for various small-geometry phantoms, and good agreement is observed. We observe that the Neumann-series approach gives accurate results in many cases where the diffusion approximation is not accurate. PMID:23201945
Capturing Context-Related Change in Emotional Dynamics via Fixed Moderated Time Series Analysis.
Adolf, Janne K; Voelkle, Manuel C; Brose, Annette; Schmiedek, Florian
2017-01-01
Much of recent affect research relies on intensive longitudinal studies to assess daily emotional experiences. The resulting data are analyzed with dynamic models to capture regulatory processes involved in emotional functioning. Daily contexts, however, are commonly ignored. This may not only result in biased parameter estimates and wrong conclusions, but also ignores the opportunity to investigate contextual effects on emotional dynamics. With fixed moderated time series analysis, we present an approach that resolves this problem by estimating context-dependent change in dynamic parameters in single-subject time series models. The approach examines parameter changes of known shape and thus addresses the problem of observed intra-individual heterogeneity (e.g., changes in emotional dynamics due to observed changes in daily stress). In comparison to existing approaches to unobserved heterogeneity, model estimation is facilitated and different forms of change can readily be accommodated. We demonstrate the approach's viability given relatively short time series by means of a simulation study. In addition, we present an empirical application, targeting the joint dynamics of affect and stress and how these co-vary with daily events. We discuss potentials and limitations of the approach and close with an outlook on the broader implications for understanding emotional adaption and development.
Time series analysis for psychological research: examining and forecasting change
Jebb, Andrew T.; Tay, Louis; Wang, Wei; Huang, Qiming
2015-01-01
Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials. PMID:26106341
On-line analysis of reactor noise using time-series analysis
McGevna, V.G.
1981-10-01
A method has been developed to allow the use of time series analysis for on-line noise analysis. On-line analysis of noise in nuclear power reactors has been limited primarily to spectral analysis and related frequency-domain techniques. Time series analysis has many distinct advantages over spectral analysis in the automated processing of reactor noise. However, fitting an autoregressive-moving average (ARMA) model to time series data involves nonlinear least squares estimation. Unless a high-speed, general-purpose computer is available, the calculations become too time-consuming for on-line applications. To eliminate this problem, a special-purpose algorithm was developed for fitting ARMA models. While it is based on a combination of steepest descent and Taylor series linearization, properties of the ARMA model are exploited so that the auto- and cross-correlation functions can be used to eliminate the need for estimating derivatives.
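The flavour of the fitting problem can be seen in the pure AR case, which, unlike the full ARMA model discussed above, has a closed-form least-squares solution; the sample autocorrelation function shown is the kind of quantity the special-purpose algorithm exploits in place of numerical derivatives. This is an illustrative simplification, not the report's algorithm.

```python
def fit_ar1(series):
    """Least-squares fit of the AR(1) model x_t = a * x_{t-1} + e_t.
    The pure AR case has a closed-form solution; the ARMA case treated
    in the report requires an iterative scheme."""
    num = sum(series[t] * series[t - 1] for t in range(1, len(series)))
    den = sum(x * x for x in series[:-1])
    return num / den

def autocorrelation(series, lag):
    """Sample autocorrelation at a given lag."""
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series)
    cov = sum((series[t] - mean) * (series[t + lag] - mean)
              for t in range(n - lag))
    return cov / var
```

On a noiseless geometric decay the estimator recovers the coefficient exactly, which makes it easy to verify before adding noise.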
Conditional adaptive Bayesian spectral analysis of nonstationary biomedical time series.
Bruce, Scott A; Hall, Martica H; Buysse, Daniel J; Krafty, Robert T
2017-05-08
Many studies of biomedical time series signals aim to measure the association between frequency-domain properties of time series and clinical and behavioral covariates. However, the time-varying dynamics of these associations are largely ignored due to a lack of methods that can assess the changing nature of the relationship through time. This article introduces a method for the simultaneous and automatic analysis of the association between the time-varying power spectrum and covariates, which we refer to as conditional adaptive Bayesian spectrum analysis (CABS). The procedure adaptively partitions the grid of time and covariate values into an unknown number of approximately stationary blocks and nonparametrically estimates local spectra within blocks through penalized splines. CABS is formulated in a fully Bayesian framework, in which the number and locations of partition points are random, and fit using reversible jump Markov chain Monte Carlo techniques. Estimation and inference averaged over the distribution of partitions allows for the accurate analysis of spectra with both smooth and abrupt changes. The proposed methodology is used to analyze the association between the time-varying spectrum of heart rate variability and self-reported sleep quality in a study of older adults serving as the primary caregiver for their ill spouse. © 2017, The International Biometric Society.
FROG: Time Series Analysis for the Web Service Era
NASA Astrophysics Data System (ADS)
Allan, A.
2005-12-01
The FROG application is part of the next-generation Starlink{http://www.starlink.ac.uk} software work (Draper et al. 2005) and is released under the GNU Public License{http://www.gnu.org/copyleft/gpl.html} (GPL). Written in Java, it has been designed for the Web and Grid Service era as an extensible, pluggable tool for time series analysis and display. With an integrated SOAP server, the package's functionality is exposed for use in the user's own code and can be used remotely over the Grid as part of the Virtual Observatory (VO).
Chaotic time series analysis in economics: Balance and perspectives
Faggini, Marisa
2014-12-15
The aim of the paper is not to review the large body of work concerning nonlinear time series analysis in economics, about which much has been written, but rather to focus on the new techniques developed to detect chaotic behaviours in economic data. More specifically, our attention will be devoted to reviewing some of these techniques and their application to economic and financial data in order to understand why chaos theory, after a period of growing interest, appears now not to be such an interesting and promising research area.
Discriminant Analysis of Time Series in the Presence of Within-Group Spectral Variability.
Krafty, Robert T
2016-07-01
Many studies record replicated time series epochs from different groups with the goal of using frequency domain properties to discriminate between the groups. In many applications, there exists variation in cyclical patterns from time series in the same group. Although a number of frequency domain methods for the discriminant analysis of time series have been explored, there is a dearth of models and methods that account for within-group spectral variability. This article proposes a model for groups of time series in which transfer functions are modeled as stochastic variables that can account for both between-group and within-group differences in spectra that are identified from individual replicates. An ensuing discriminant analysis of stochastic cepstra under this model is developed to obtain parsimonious measures of relative power that optimally separate groups in the presence of within-group spectral variability. The approach possesses favorable properties in classifying new observations and can be consistently estimated through a simple discriminant analysis of a finite number of estimated cepstral coefficients. Benefits in accounting for within-group spectral variability are empirically illustrated in a simulation study and through an analysis of gait variability.
Lehman, Li-wei; Ghassemi, Mohammad; Snoek, Jasper; Nemati, Shamim
2016-01-01
In this work, we propose a stacked switching vector-autoregressive (SVAR)-CNN architecture to model the changing dynamics in physiological time series for patient prognosis. The SVAR layer extracts dynamical features (or modes) from the time series, which are then fed into the CNN layer to extract higher-level features representative of transition patterns among the dynamical modes. We evaluate our approach using 8 hours of minute-by-minute mean arterial blood pressure (BP) measurements from over 450 patients in the MIMIC-II database. We modeled the time series using a third-order SVAR process with 20 modes, resulting in first-level dynamical features of size 20×480 per patient. A fully connected CNN is then used to learn hierarchical features from these inputs, and to predict hospital mortality. The combined CNN/SVAR approach using BP time series achieved a median and interquartile-range AUC of 0.74 [0.69, 0.75], significantly outperforming the CNN alone (0.54 [0.46, 0.59]) and the SVAR alone with logistic regression (0.69 [0.65, 0.72]). Our results indicate that including an SVAR layer improves the ability of CNNs to classify nonlinear and nonstationary time series. PMID:27790623
Gompel, Jamie J. Van; Alikhani, Puya; Youssef, A. Samy; Loveren, Harry R. van; Boyev, K. Paul; Agazzi, Sivero
2015-01-01
Objective Anterior petrosectomy (AP) was popularized in the 1980s and 1990s as micro-neurosurgery proliferated. Original reports concentrated on the anatomy of the approach and small case series. Recently, with the advent of additional endonasal approaches to the petrous apex, the morbidity of AP remains unclear. This report details approach-related morbidity around and under the temporal lobe. Methods A total of 46 consecutive patients identified from our surgical database were reviewed retrospectively. Results Of the 46 patients, 61% were women. Median age of the patients was 50 years (mean: 48 ± 2 years). Median follow-up of this cohort was 66 months. Most procedures dealt with intradural pathology (n = 40 [87%]). Approach-related morbidity consisted of only two patients (4%) with new postoperative seizures. There were only two significant postoperative hemorrhages (4%). Cerebrospinal fluid leakage occurred in two patients (4%), requiring reoperation. Conclusion Approach-related complications such as seizures and hematoma were infrequent in this series, < 4%. This report describes a contemporary group of patients treated with open AP and should serve as a comparison for approach-related morbidity of endoscopic approaches. Given the pathologies treated with this approach, the morbidity appears acceptable. PMID:26401480
Weighted permutation entropy based on different symbolic approaches for financial time series
NASA Astrophysics Data System (ADS)
Yin, Yi; Shang, Pengjian
2016-02-01
In this paper, we introduce weighted permutation entropy (WPE) and three different symbolic approaches to investigate the complexity of stock time series containing amplitude-coded information, and we explore the influence of the choice of symbolic approach on the resulting WPE. We apply WPE based on these symbolic approaches to the US and Chinese stock markets and compare the results for the two. All three symbolic approaches reduce the complexity of the stock time series as measured by WPE, regardless of the embedding dimension. The similarity between stock markets can be detected by WPE based on the Binary Δ-coding method, while the differences between them are revealed by WPE based on the σ-method and the Max-min method. Combining the σ-method or Max-min symbolic approach with WPE makes it possible to reflect the multiscale structure of complexity through different time delays and to analyze the differences between the complexities of stock time series in more detail and more accurately. Furthermore, the correlations between stock markets in the same region, and the similarities hidden in the S&P500 and DJI and in ShangZheng and ShenCheng, are uncovered by comparing the Binary Δ-coding-based WPE of the six stock markets.
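A weighted permutation entropy in the spirit described above can be sketched as follows: ordinal patterns of embedded windows, each weighted by the window's variance so that large-amplitude structure counts more than flat noise. This follows the common Fadlallah-style definition and may differ in detail from the paper's variant; the normalization by log(m!) is our choice so the result lies in [0, 1].

```python
import math
from collections import defaultdict

def weighted_permutation_entropy(series, m=3, delay=1):
    """Weighted permutation entropy: the Shannon entropy of ordinal
    (rank-order) patterns of embedded windows, with each window
    weighted by its variance so amplitude information is retained.
    Normalized by log(m!) to lie in [0, 1]."""
    weights = defaultdict(float)
    total = 0.0
    for t in range(len(series) - (m - 1) * delay):
        window = [series[t + k * delay] for k in range(m)]
        # ordinal pattern: the permutation that sorts the window
        pattern = tuple(sorted(range(m), key=window.__getitem__))
        mean = sum(window) / m
        w = sum((x - mean) ** 2 for x in window) / m  # window variance
        weights[pattern] += w
        total += w
    if total == 0.0:          # constant series: no amplitude information
        return 0.0
    probs = [w / total for w in weights.values()]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(math.factorial(m))
```

A monotone series has a single ordinal pattern and hence entropy 0, while an irregular series approaches 1; this is the complexity scale on which the stock markets above are compared.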
Transradial approach to treating endovascular cerebral aneurysms: Case series and technical note
Goland, Javier; Doroszuk, Gustavo Fabián; Garbugino, Silvia Lina; Ypa, María Paula
2017-01-01
Background: Several benefits of the transradial over the femoral endovascular approach to cardiac interventions have been described over the years. Consequently, its use has become habitual at most centers that perform cardiac catheterizations. This paper details a right transradial approach, incorporating a variety of coils or flow diverters, which can be utilized for the endovascular treatment of different cerebral aneurysms. Methods: From 2014 to 2016, we performed 40 endovascular procedures to treat cerebral aneurysms adopting the same right transradial approach. Five aneurysms were treated with flow diverters and 35 were treated with coils. Seven of these aneurysms were asymptomatic, whereas 33 had already ruptured. Results: Satisfactory treatment was achieved in all cases through the same approach in the absence of any complications. Conclusions: A right transradial approach may be satisfactory for the endovascular treatment of different cerebral aneurysms, including aneurysms in either hemisphere. This is the largest series of cerebral aneurysms treated through a transradial approach. PMID:28584676
McKinney, B A; Crowe, J E; Voss, H U; Crooke, P S; Barney, N; Moore, J H
2006-02-01
We introduce a grammar-based hybrid approach to reverse engineering nonlinear ordinary differential equation models from observed time series. This hybrid approach combines a genetic algorithm to search the space of model architectures with a Kalman filter to estimate the model parameters. Domain-specific knowledge is used in a context-free grammar to restrict the search space for the functional form of the target model. We find that the hybrid approach outperforms a pure evolutionary algorithm method, and we observe features in the evolution of the dynamical models that correspond with the emergence of favorable model components. We apply the hybrid method to both artificially generated time series and experimentally observed protein levels from subjects who received the smallpox vaccine. From the observed data, we infer a cytokine protein interaction network for an individual's response to the smallpox vaccine.
Time series clustering analysis of health-promoting behavior
NASA Astrophysics Data System (ADS)
Yang, Chi-Ta; Hung, Yu-Shiang; Deng, Guang-Feng
2013-10-01
Health promotion must be emphasized to achieve the World Health Organization goal of health for all. Since the global population is aging rapidly, the ComCare elder health-promoting service was developed by the Taiwan Institute for Information Industry in 2011. Based on the Pender health promotion model, the ComCare service offers five categories of health-promoting functions to address the everyday needs of seniors: nutrition management, social support, exercise management, health responsibility, and stress management. To assess the overall ComCare service and to improve understanding of the health-promoting behavior of elders, this study analyzed health-promoting behavioral data automatically collected by the ComCare monitoring system. In the 30,638 session records collected for 249 elders from January 2012 to March 2013, behavior patterns were identified by a fuzzy c-means time series clustering algorithm combined with an autocorrelation-based representation scheme. The analysis showed that the time series data for elder health-promoting behavior can be classified into four different clusters. Each cluster reveals different health-promoting needs, frequencies, numbers of functions used, and behaviors. The results can assist policymakers, health-care providers, and experts in medicine, public health, nursing, and psychology, and have been provided to the Taiwan National Health Insurance Administration to assess elder health-promoting behavior.
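The pipeline described above, an autocorrelation-based representation fed to fuzzy c-means, can be sketched in plain Python. The feature choice (first three autocorrelation lags), the toy series, and all parameters below are assumptions for illustration; the study's actual representation and settings may differ.

```python
import math
import random

def autocorr_features(series, max_lag=3):
    """Represent a series by its first few autocorrelation coefficients."""
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series) / n
    feats = []
    for lag in range(1, max_lag + 1):
        cov = sum((series[t] - mean) * (series[t - lag] - mean)
                  for t in range(lag, n)) / n
        feats.append(cov / var if var > 0 else 0.0)
    return feats

def fuzzy_c_means(points, c=2, m=2.0, iters=50, seed=0):
    """Textbook fuzzy c-means: alternate center and membership updates."""
    rng = random.Random(seed)
    n, dim = len(points), len(points[0])
    u = []
    for _ in range(n):  # random initial memberships, each row summing to 1
        row = [rng.random() + 1e-9 for _ in range(c)]
        s = sum(row)
        u.append([v / s for v in row])
    for _ in range(iters):
        centers = []
        for j in range(c):  # centers are u^m-weighted means of the points
            w = [u[i][j] ** m for i in range(n)]
            sw = sum(w)
            centers.append([sum(w[i] * points[i][k] for i in range(n)) / sw
                            for k in range(dim)])
        for i in range(n):  # memberships from inverse relative distances
            d = [math.dist(points[i], centers[j]) + 1e-12 for j in range(c)]
            for j in range(c):
                u[i][j] = 1.0 / sum((d[j] / d[l]) ** (2.0 / (m - 1.0))
                                    for l in range(c))
    return centers, u

# toy data: two smooth (positively autocorrelated) and two alternating series
series = [
    [1, 2, 3, 4, 5, 6, 5, 4, 3, 2],
    [2, 3, 4, 5, 6, 7, 6, 5, 4, 3],
    [1, -1, 1, -1, 1, -1, 1, -1, 1, -1],
    [2, -2, 2, -2, 2, -2, 2, -2, 2, -2],
]
feats = [autocorr_features(s) for s in series]
centers, u = fuzzy_c_means(feats, c=2)
labels = [max(range(2), key=lambda j: row[j]) for row in u]
print(labels)  # smooth series share one cluster, alternating series the other
```

Unlike hard k-means, each series keeps a membership degree in every cluster, which suits behavior data where an elder's pattern may sit between two profiles.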
Monthly hail time series analysis related to agricultural insurance
NASA Astrophysics Data System (ADS)
Tarquis, Ana M.; Saa, Antonio; Gascó, Gabriel; Díaz, M. C.; Garcia Moreno, M. R.; Burgaz, F.
2010-05-01
Hail is one of the most important risks covered by crop insurance in Spain, accounting for more than 50% of total insurance in cereal crops. The purpose of the present study is to analyze hail damage in cereals. Four provinces were chosen, those with the highest production values: Burgos and Zaragoza for wheat, and Cuenca and Valladolid for barley. The data available for studying the evolution and intensity of hail damage include the ratios of agricultural insurance provided by ENESA and the annual number of hail days (from 1981 to 2007), whose correlation was analyzed. At the same time, several weather stations per province were selected for having the longest and most complete records (from 1963 to 2007) in order to analyze monthly time series of the number of hail days (HD). The results show that the relation between the agricultural insurance ratio and the number of hail days is not clear. Several observations are discussed to explain these results, as well as whether a change in trend can be determined in the HD time series.
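The core computation behind the correlation analysis is a Pearson coefficient between annual hail days and the insurance ratio. A minimal sketch with hypothetical figures (the real ENESA data are not reproduced here):

```python
def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# hypothetical series: hail days per year vs. insurance loss ratio
hail_days = [4, 7, 2, 9, 5, 6]
loss_ratio = [0.31, 0.52, 0.18, 0.61, 0.40, 0.45]
print(round(pearson(hail_days, loss_ratio), 3))
```

The study's point is that on the real data this correlation turns out weaker than one might expect, unlike in this deliberately well-aligned toy example.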
Time series analysis of gold production in Malaysia
NASA Astrophysics Data System (ADS)
Muda, Nora; Hoon, Lee Yuen
2012-05-01
Gold is a soft, malleable, bright yellow metallic element that is unaffected by air and most reagents. It is highly valued as an asset or investment commodity and is extensively used in jewellery, industrial applications, dentistry, and medicine. In Malaysia, gold mining is limited to several areas, such as Pahang, Kelantan, Terengganu, Johor, and Sarawak. The main purpose of this case study is to obtain a suitable model for the production of gold in Malaysia. The model can also be used to predict Malaysia's future gold production. The Box-Jenkins time series method was used to perform the analysis, with the following steps: identification, estimation, diagnostic checking, and forecasting. In addition, the accuracy of prediction was tested using the mean absolute percentage error (MAPE). From the analysis, the ARIMA(3,1,1) model was found to be the best fitted model, with a MAPE of 3.704%, indicating that the predictions are very accurate. Hence, this model can be used for forecasting. This study is expected to help the private and public sectors understand the gold production scenario and plan gold mining activities in Malaysia.
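The accuracy measure used above is simple to state: MAPE is the mean of absolute percentage errors between actual and forecast values. A small sketch with hypothetical production figures (the reported 3.704% comes from the authors' ARIMA(3,1,1) fit, not from this toy data):

```python
def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs((a - f) / a)
                       for a, f in zip(actual, forecast)) / len(actual)

# hypothetical production figures (tonnes) vs. model forecasts
actual = [4.2, 4.6, 4.4, 4.9]
forecast = [4.0, 4.8, 4.3, 5.1]
print(round(mape(actual, forecast), 3))  # → 3.866
```

Note that MAPE is undefined when an actual value is zero, which is why it suits strictly positive quantities such as production volumes.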
On statistical inference in time series analysis of the evolution of road safety.
Commandeur, Jacques J F; Bijleveld, Frits D; Bergel-Hayat, Ruth; Antoniou, Constantinos; Yannis, George; Papadimitriou, Eleonora
2013-11-01
Data collected for building a road safety observatory usually include observations made sequentially through time. Examples of such data, called time series data, include the annual (or monthly) number of road traffic accidents, traffic fatalities, or vehicle kilometers driven in a country, as well as the corresponding values of safety performance indicators (e.g., data on speeding, seat belt use, alcohol use, etc.). Some commonly used statistical techniques imply assumptions that are often violated by the special properties of time series data, namely serial dependency among the disturbances associated with the observations. The first objective of this paper is to demonstrate the impact of such violations on the applicability of standard methods of statistical inference, which leads to an under- or overestimation of the standard error and consequently may produce erroneous inferences. Moreover, having established the adverse consequences of ignoring serial dependency issues, the paper aims to describe rigorous statistical techniques used to overcome them. In particular, appropriate time series analysis techniques of varying complexity are employed to describe the development over time, relating the accident occurrences to explanatory factors such as exposure measures or safety performance indicators, and forecasting the development into the near future. Traditional regression models (whether linear, generalized linear, or nonlinear) are shown not to naturally capture the inherent dependencies in time series data. Dedicated time series analysis techniques, such as the ARMA-type and DRAG approaches, are discussed next, followed by structural time series models, which are a subclass of state space methods. The paper concludes with general recommendations and practice guidelines for the use of time series models in road safety research.
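A standard diagnostic for the serial dependency discussed above is the Durbin-Watson statistic on regression residuals: values near 2 suggest no first-order autocorrelation, values well below 2 suggest positive autocorrelation. A sketch with a hypothetical residual series:

```python
def durbin_watson(residuals):
    """Durbin-Watson statistic of a residual series (range roughly 0 to 4)."""
    num = sum((residuals[t] - residuals[t - 1]) ** 2
              for t in range(1, len(residuals)))
    den = sum(e ** 2 for e in residuals)
    return num / den

# a positively autocorrelated residual series (hypothetical)
res = [1.0, 0.8, 0.9, 0.7, 0.6, 0.5, 0.4, 0.5]
print(round(durbin_watson(res), 3))  # well below 2: positive serial dependency
```

Residuals like these are exactly the case where ordinary-regression standard errors become untrustworthy and the dedicated time series models described in the paper are needed.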
Amirjamshidi, Abbas; Abbasioun, Kazem; Amiri, Rouzbeh Shams; Ardalan, Ali; Hashemi, Seyyed Mahmood Ramak
2015-01-01
Background: Sphenoid wing meningiomas extending to the orbit (ePMSW) are currently removed through several transcranial approaches. Presenting the largest surgical cohort of hyperostosing ePMSW with the longest follow-up period, we provide data supporting a mini lateral orbitotomy with excellent exposure for wide resection of all compartments of the tumor. Methods: A retrospective survival analysis was made of data accumulated prospectively over a period of 34 years, including 88 cases of ePMSW with a mean follow-up period of 136.4 months. The impact of preoperative variables upon different outcome measures is evaluated. Standard pterional craniotomy was performed in 12 patients (group C), while the other 76 cases underwent the proposed modified lateral miniorbitotomy (group LO). Results: There were 31 men and 57 women. The age range was 12 to 70 years. Patients presented with unilateral exophthalmos (Uex) ranging between 3 and 16 mm. The duration of proptosis before operation varied between 6 months and 16 years. The status of visual acuity (VA) prior to operation was: no light perception (NLP) in 16, light perception (LP) up to 0.2 in 3, 0.3–0.5 in 22, 0.6–0.9 in 24, and full vision in 23 patients. Postoperatively, an acceptable cosmetic appearance of the eyes was seen in 38 cases, and in 46 a mild inequality of < 2 mm was detected. Four cases had mild enophthalmos (En). Among those who had the worst VA, two improved and one became almost blind after operation. The cases with VA in the range of 0.3–0.5 improved. Among those with good VA (0.5 to full vision), 2 became blind, vision diminished in 10, and improved or remained full in the other 35 cases. Tumor recurrence occurred in 33.3% of group C and 10.5% of group LO (P = 0.05). The major determinant of tumor regrowth was the technique of LO (P = 0.008). Conclusion: Using the LO technique, the risky corners involved by the tumor are visualized from the latero-inferior side rather than from the latero-superior avenue.
[Causal analysis approaches in epidemiology].
Dumas, O; Siroux, V; Le Moual, N; Varraso, R
2014-02-01
Epidemiological research is mostly based on observational studies. Whether such studies can provide evidence of causation remains debated. Several causal analysis methods have been developed in epidemiology. This paper aims to present an overview of these methods: graphical models, path analysis and its extensions, and models based on the counterfactual approach, with a special emphasis on marginal structural models. Graphical approaches have been developed to allow synthetic representations of the supposed causal relationships in a given problem. They serve as qualitative support in the study of causal relationships. The sufficient-component cause model was developed to deal with the issue of multicausality raised by the emergence of chronic multifactorial diseases. Directed acyclic graphs are mostly used as a visual tool to identify possible sources of confounding in a study. Structural equation models, the main extension of path analysis, combine a system of equations and a path diagram representing a set of possible causal relationships. They allow quantifying direct and indirect effects in a general model in which several relationships can be tested simultaneously. Dynamic path analysis further takes into account the role of time. The counterfactual approach defines causality by comparing the observed event and the counterfactual event (the event that would have been observed if, contrary to the fact, the subject had received a different exposure than the one he actually received). This theoretical approach has revealed the limits of traditional methods in addressing some causality questions. In particular, in longitudinal studies with time-varying confounding, classical methods (regressions) may be biased. Marginal structural models have been developed to address this issue. In conclusion, "causal models", though developed partly independently, are based on equivalent logical foundations. A crucial step in the application of these models is the
A Simple Pile-up Model for Time Series Analysis
NASA Astrophysics Data System (ADS)
Sevilla, Diego J. R.
2017-07-01
In this paper, a simple pile-up model is presented. This model calculates the probability P(n|N) of having n counts if N particles collide with a sensor during an exposure time. Through some approximations, an analytic expression depending on only one parameter is obtained. This parameter characterizes the pile-up magnitude and depends on features of the instrument and the source. The statistical model obtained permits the determination of the probability distributions of measured counts from the probability distributions of incoming particles, which is valuable for time series analysis. Applicability limits are discussed, and the improvement that the proposed pile-up model can bring to statistical analysis is demonstrated on real data.
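The paper derives an analytic expression for P(n|N); as an illustration of the same idea, a Monte Carlo toy model can estimate the distribution by letting particles that land in the same readout frame pile up into a single count. The frame-sharing mechanism and the parameters below are assumptions for illustration, not the paper's exact model.

```python
import random

def simulate_counts(n_particles, n_frames, trials=20000, seed=1):
    """Monte Carlo estimate of P(n | N): N particles land uniformly in
    n_frames readout frames; particles sharing a frame register as one count."""
    rng = random.Random(seed)
    dist = {}
    for _ in range(trials):
        # number of distinct occupied frames = number of measured counts
        occupied = len({rng.randrange(n_frames) for _ in range(n_particles)})
        dist[occupied] = dist.get(occupied, 0) + 1
    return {n: c / trials for n, c in sorted(dist.items())}

pmf = simulate_counts(n_particles=5, n_frames=10)
print(pmf)  # pile-up: measured counts n never exceed N, and their mean sits below N
```

The systematic deficit of the mean count below N is exactly the bias a pile-up model must correct for before probability distributions of counts can be used in time series analysis.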
Permutation approach, high frequency trading and variety of micro patterns in financial time series
NASA Astrophysics Data System (ADS)
Aghamohammadi, Cina; Ebrahimian, Mehran; Tahmooresi, Hamed
2014-11-01
The permutation approach is suggested as a method to investigate financial time series at micro scales. The method is used to examine how high frequency trading in recent years has affected the micro patterns that may be seen in financial time series. Tick-by-tick exchange rates are considered as examples. A variety of patterns is seen to evolve through time, and the scale over which the target markets have no dominant patterns has decreased steadily with the emergence of higher-frequency trading.
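The micro patterns in question are ordinal (permutation) patterns: each window of a few consecutive ticks is reduced to the permutation that sorts it, and the pattern frequencies are compared. A minimal counter, with a made-up tick series:

```python
from collections import Counter

def ordinal_patterns(series, order=3):
    """Count the ordinal (permutation) pattern of every consecutive window."""
    counts = Counter()
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        # the permutation of indices that sorts the window, e.g. rising = (0, 1, 2)
        counts[tuple(sorted(range(order), key=lambda k: window[k]))] += 1
    return counts

series = [4, 7, 9, 10, 6, 11, 3, 5, 8]
print(ordinal_patterns(series))  # the rising pattern (0, 1, 2) occurs 3 times
```

A market with no dominant micro pattern would show these counts approaching uniformity over the 3! = 6 possible patterns; deviations from uniformity are what the paper tracks across scales and years.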
An iterative approach to optimize change classification in SAR time series data
NASA Astrophysics Data System (ADS)
Boldt, Markus; Thiele, Antje; Schulz, Karsten; Hinz, Stefan
2016-10-01
The detection of changes using remote sensing imagery has become a broad field of research with many approaches for many different applications. Besides the simple detection of changes between at least two images acquired at different times, analyses which aim at the change type or category are at least equally important. In this study, an approach for a semi-automatic classification of change segments is presented. A sparse dataset is considered to ensure fast and simple applicability for practical issues. The dataset consists of 15 high resolution (HR) TerraSAR-X (TSX) amplitude images acquired over a time period of one year (11/2013 to 11/2014). The scenery contains the airport of Stuttgart (Germany) and its surroundings, including urban, rural, and suburban areas. Time series imagery offers the advantage of analyzing the change frequency of selected areas. In this study, the focus is set on the analysis of small, frequently changing regions like parking areas, construction sites, and collecting points, which consist of high activity (HA) change objects. For each HA change object, suitable features are extracted and k-means clustering is applied as the categorization step. The resulting clusters are finally compared to a previously introduced knowledge-based class catalogue, which is modified until an optimal class description results. In other words, the subjective understanding of the scene semantics is optimized against the reality given by the data. In this way, even a sparse dataset containing only amplitude imagery can be evaluated without requiring comprehensive training datasets. Falsely defined classes might be rejected, classes which were defined too coarsely might be divided into sub-classes, and conversely, classes which were initially defined too narrowly might be merged. An optimal classification results when the combination of previously defined key indicators (e.g., the number of clusters per class) reaches an optimum.
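The categorization step, k-means over features of high-activity change objects, can be sketched generically. The two-dimensional features and the deterministic initialization below are assumptions for illustration, not the authors' setup.

```python
import math

def kmeans(points, k, iters=30):
    """Basic k-means with a deterministic spread-out initialization."""
    centers = [list(points[(i * len(points)) // k]) for i in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:  # assignment step: each point to its nearest center
            groups[min(range(k), key=lambda c: math.dist(p, centers[c]))].append(p)
        for j, g in enumerate(groups):  # update step: centers become cluster means
            if g:
                centers[j] = [sum(x[d] for x in g) / len(g)
                              for d in range(len(g[0]))]
    labels = [min(range(k), key=lambda c: math.dist(p, centers[c]))
              for p in points]
    return centers, labels

# hypothetical per-object features, e.g. (change frequency, mean amplitude)
feats = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.15), (0.9, 0.8), (0.8, 0.9), (0.85, 0.85)]
centers, labels = kmeans(feats, k=2)
print(labels)  # → [0, 0, 0, 1, 1, 1]
```

In the paper's workflow the resulting clusters are not the end product: they are compared against the knowledge-based class catalogue, which is then refined.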
Revuelta-Gutiérrez, Rogelio; Morales-Martínez, Andres Humberto; Mejías-Soto, Carolina; Martínez-Anda, Jaime Jesús; Ortega-Porcayo, Luis Alberto
2016-01-01
Background: Glossopharyngeal neuralgia (GPN) is an uncommon craniofacial pain syndrome. It is characterized by a sudden-onset lancinating pain, usually localized in the sensory distribution of the IX cranial nerve, associated with excessive vagal outflow, which leads to bradycardia, hypotension, syncope, or cardiac arrest. This study aims to review our surgical experience performing microvascular decompression (MVD) in patients with GPN. Methods: Over the last 20 years, 14 consecutive cases were diagnosed with GPN. MVD using a microasterional approach was performed in all patients. Demographic data, clinical presentation, surgical findings, clinical outcome, complications, and long-term follow-up were reviewed. Results: The median age of onset was 58.7 years. The mean time from onset of symptoms to treatment was 8.8 years. Glossopharyngeal and vagus nerve compression was caused by the posterior inferior cerebellar artery in eleven cases (78.5%), the vertebral artery in two cases (14.2%), and the choroid plexus in one case (7.1%). The postoperative mean follow-up was 26 months (3–180 months). Pain analysis demonstrated long-term pain improvement of 114 ± 27.1 months and pain remission in 13 patients (92.9%) (P = 0.0001). Two complications were documented: one patient had a cerebrospinal fluid leak, and another had bacterial meningitis. There was no surgical mortality. Conclusions: GPN is a rare entity, and secondary causes should be discarded. MVD through a retractorless microasterional approach is a safe and effective technique. Our series demonstrated an excellent clinical outcome, with pain remission in 92.9%. PMID:27213105
Practical approaches in accident analysis
NASA Astrophysics Data System (ADS)
Stock, M.
An accident analysis technique based on the successive application of structural response, explosion dynamics, gas cloud formation, and plant operation failure mode models is proposed. The method takes into account the nonideal explosion characteristics of a deflagration in an unconfined cloud. The resulting pressure wave differs significantly from a shock wave, and the response of structures like lamp posts and walls can differ correspondingly. This gives a more realistic insight into the course of explosions than a simple TNT-equivalent approach.
Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool
NASA Technical Reports Server (NTRS)
McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall
2008-01-01
The Time Series Product Tool (TSPT) is software, developed in MATLAB, that creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. The TSPT uses MODIS metadata to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System could provide the foundation for a large-area crop-surveillance system that could identify
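One of the basic temporal-processing tricks for suppressing cloud noise in vegetation-index series is maximum-value compositing: clouds depress NDVI, so taking the per-window maximum discards contaminated samples. This is a generic sketch; TSPT's actual chain, with MODIS metadata screening and multi-satellite fusion, is richer.

```python
def max_value_composite(series, window):
    """Maximum-value compositing over fixed windows; None marks flagged pixels."""
    out = []
    for i in range(0, len(series), window):
        chunk = [v for v in series[i:i + window] if v is not None]
        out.append(max(chunk) if chunk else None)
    return out

# daily NDVI with cloud-contaminated (low) or missing samples (hypothetical)
ndvi = [0.61, 0.12, None, 0.65, 0.60, 0.18, 0.63, None]
print(max_value_composite(ndvi, window=4))  # → [0.65, 0.63]
```

The cost of the approach is temporal resolution: the composited series has one value per window, which is the trade-off motivating the more elaborate multi-satellite fusion mentioned in the abstract.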
NASA Astrophysics Data System (ADS)
Balidakis, Kyriakos; Heinkelmann, Robert; Lu, Cuixian; Soja, Benedikt; Karbon, Maria; Nilsson, Tobias; Glaser, Susanne; Andres Mora-Diaz, Julian; Anderson, James; Liu, Li; Raposo-Pulido, Virginia; Xu, Minghui; Schuh, Harald
2015-04-01
Time series of meteorological parameters recorded at VLBI (Very Long Baseline Interferometry) observatories allow us to realistically model, and consequently to eliminate to a large extent, the atmosphere-induced effects in the VLBI products. Nevertheless, this advantage of VLBI is not fully exploited, since such information is contaminated by inconsistencies, such as uncertainties regarding the calibration and location of the meteorological sensors, outliers, missing data points, and breaks. It has been shown that such inconsistencies in meteorological data used for VLBI data analysis impose problems on the geodetic products (e.g., vertical site position) and result in mistakes in geophysical interpretation. The aim of the procedure followed here is to optimally model the tropospheric delay and bending effects that are still the main sources of error in VLBI data analysis. In this study, the meteorological data recorded with sensors mounted in the vicinity of VLBI stations have been homogenized for the period from 1979 until today. To meet this objective, inhomogeneities were detected and adjusted using test results and metadata. The approaches employed include Alexandersson's Standard Normal Homogeneity Test and an iterative procedure whose segmentation part is based on a dynamic programming algorithm and whose functional part is based on a LASSO (Least Absolute Shrinkage and Selection Operator) estimator. For the provision of the reference time series necessary to apply the aforementioned methods, ECMWF's (European Centre for Medium-Range Weather Forecasts) ERA-Interim reanalysis surface data were employed. Special care was taken regarding the datum definition of this model. Due to the significant height difference between the VLBI antenna's reference point and the elevation included in the geopotential fields of the specific numerical weather models, a hypsometric adjustment is applied using the absolute pressure level from the WMO
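The break-detection step can be illustrated with the Standard Normal Homogeneity Test mentioned above. The sketch below is the simplified textbook single-break statistic on hypothetical pressure readings, not the project's full iterative segmentation procedure:

```python
def snht(series):
    """SNHT statistic T(k) for each split point k; the argmax of T flags the
    most likely break location (simplified single-break textbook form)."""
    n = len(series)
    mean = sum(series) / n
    std = (sum((x - mean) ** 2 for x in series) / n) ** 0.5
    z = [(x - mean) / std for x in series]
    T = []
    for k in range(1, n):
        z1 = sum(z[:k]) / k           # mean of standardized values before the split
        z2 = sum(z[k:]) / (n - k)     # ... and after the split
        T.append(k * z1 ** 2 + (n - k) * z2 ** 2)
    return T

# pressure-like series with an artificial jump after index 5 (hypothetical)
obs = [1010.2, 1009.8, 1010.1, 1010.0, 1009.9, 1010.1, 1013.0, 1012.8, 1013.1, 1012.9]
T = snht(obs)
k_break = T.index(max(T)) + 1
print(k_break)  # → 6
```

In practice T(k) is compared against a critical value that depends on the series length, and detected breaks are then adjusted against an independent reference series, here the ERA-Interim surface data.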
Time series modeling by a regression approach based on a latent process.
Chamroukhi, Faicel; Samé, Allou; Govaert, Gérard; Aknin, Patrice
2009-01-01
Time series are used in many domains, including finance, engineering, economics, and bioinformatics, generally to represent the change of a measurement over time. Modeling techniques may then be used to give a synthetic representation of such data. A new approach for time series modeling is proposed in this paper. It consists of a regression model incorporating a discrete hidden logistic process allowing smooth or abrupt switching between different polynomial regression models. The model parameters are estimated by the maximum likelihood method via a dedicated Expectation-Maximization (EM) algorithm. The M step of the EM algorithm uses a multi-class Iterative Reweighted Least-Squares (IRLS) algorithm to estimate the hidden process parameters. To evaluate the proposed approach, an experimental study on simulated data and real-world data was performed using two alternative approaches: a heteroskedastic piecewise regression model using a global optimization algorithm based on dynamic programming, and a Hidden Markov Regression Model whose parameters are estimated by the Baum-Welch algorithm. Finally, in the context of the remote monitoring of components of the French railway infrastructure, and more particularly the switch mechanism, the proposed approach has been applied to modeling and classifying time series representing the condition measurements acquired during switch operations.
Multidimensional stock network analysis: An Escoufier's RV coefficient approach
NASA Astrophysics Data System (ADS)
Lee, Gan Siew; Djauhari, Maman A.
2013-09-01
The current practice of stock network analysis is based on the assumption that the time series of closing prices can represent the behaviour of each stock. This assumption leads to considering the minimal spanning tree (MST) and sub-dominant ultrametric (SDU) as indispensable tools to filter the economic information contained in the network. Recently, researchers have attempted to represent a stock not only as a univariate time series of closing prices but as a bivariate time series of closing price and volume. In this case, they developed the so-called multidimensional MST to filter the important economic information. However, in this paper, we show that their approach is applicable to that bivariate case only. This leads us to introduce a new methodology to construct the MST where each stock is represented by a multivariate time series. An example from the Malaysian stock exchange is presented and discussed to illustrate the advantages of the method.
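The filtering step common to this literature converts correlations to the distance d = sqrt(2(1 − ρ)) and builds a minimal spanning tree over it. The univariate sketch below (Prim's algorithm on hypothetical closing prices) shows the baseline the paper generalizes; the Escoufier RV-coefficient extension replaces ρ with a similarity between multivariate series.

```python
import math

def corr(x, y):
    """Pearson correlation of two price series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def mst_edges(series_by_stock):
    """MST over stocks via d = sqrt(2(1 - rho)) and Prim's algorithm."""
    names = list(series_by_stock)
    dist = {(a, b): math.sqrt(max(0.0, 2 * (1 - corr(series_by_stock[a],
                                                     series_by_stock[b]))))
            for i, a in enumerate(names) for b in names[i + 1:]}
    in_tree, edges = {names[0]}, []
    while len(in_tree) < len(names):
        # cheapest edge crossing the cut between tree and non-tree vertices
        cand = [(d, a, b) for (a, b), d in dist.items()
                if (a in in_tree) != (b in in_tree)]
        d, a, b = min(cand)
        edges.append((a, b, round(d, 3)))
        in_tree |= {a, b}
    return edges

prices = {  # hypothetical closing-price series
    "A": [10, 11, 12, 13, 14],
    "B": [20, 22, 24, 26, 28],   # moves exactly with A
    "C": [5, 4, 5, 3, 2],        # moves against both
}
print(mst_edges(prices))
```

Perfectly co-moving stocks get distance 0 and are joined first; the MST's n − 1 edges retain the strongest links of the full correlation network, which is the economic filtering these papers rely on.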
Inorganic chemical analysis of environmental materials—A lecture series
Crock, J.G.; Lamothe, P.J.
2011-01-01
At the request of the faculty of the Colorado School of Mines, Golden, Colorado, the authors prepared and presented a lecture series to the students of a graduate-level advanced instrumental analysis class. The slides and text presented in this report are a compilation and condensation of this series of lectures. The purpose of this report is to present the slides and notes and to emphasize the thought processes that should be used by a scientist submitting samples for analysis in order to procure analytical data that answer a research question. First and foremost, the analytical data generated can be no better than the samples submitted. The questions to be answered must first be well defined and the appropriate samples collected from the population that will answer the question. The proper methods of analysis, including proper sample preparation and digestion techniques, must then be applied. Care must be taken to achieve the required limits of detection of the critical analytes to yield detectable analyte concentrations (above "action" levels) for the majority of the study's samples, and to address what portion of those analytes answers the research question: total or partial concentrations. To guarantee a robust analytical result that answers the research question(s), a well-defined quality assurance and quality control (QA/QC) plan must be employed. This QA/QC plan must include the collection and analysis of field and laboratory blanks, sample duplicates, and matrix-matched standard reference materials (SRMs). The proper SRMs may include in-house materials and/or a selection of widely available commercial materials. A discussion of the preparation and applicability of in-house reference materials is also presented. Only when all these analytical issues are sufficiently addressed can the research questions be answered with known certainty.
Analysis of temperature time-series: Embedding dynamics into the MDS method
NASA Astrophysics Data System (ADS)
Lopes, António M.; Tenreiro Machado, J. A.
2014-04-01
Global warming and the associated climate changes are the subject of intensive research due to their major impact on social, economic, and health aspects of human life. Surface temperature time-series characterise Earth as a slow-dynamics spatiotemporal system, evidencing long-memory behaviour typical of fractional order systems. Such phenomena are difficult to model and analyse, demanding alternative approaches. This paper studies the complex correlations between global temperature time-series using the multidimensional scaling (MDS) approach. MDS provides a graphical representation of the pattern of climatic similarities between regions around the globe. The similarities are quantified through two mathematical indices that correlate the monthly average temperatures observed at meteorological stations over a given period of time. Furthermore, time dynamics is analysed by performing the MDS analysis over slices sampling the time series. MDS generates maps describing the stations' loci such that stations perceived to be similar to each other are placed on the map forming clusters. We show that MDS provides an intuitive and useful visual representation of the complex relationships present among temperature time-series, which are not perceived on traditional geographic maps. Moreover, MDS avoids sensitivity to the irregular distribution density of the meteorological stations.
Feature extraction for change analysis in SAR time series
NASA Astrophysics Data System (ADS)
Boldt, Markus; Thiele, Antje; Schulz, Karsten; Hinz, Stefan
2015-10-01
In remote sensing, change detection represents a broad field of research. If time series data are available, change detection can be used for monitoring applications. These applications require regular image acquisitions at an identical time of day over a defined period. Among remote sensing sensors, radar is especially well suited to applications requiring regularity, since it is independent of most weather and atmospheric influences. Furthermore, the time of day of the acquisitions plays no role due to independence from daylight. Since 2007, the German SAR (Synthetic Aperture Radar) satellite TerraSAR-X (TSX) has permitted the acquisition of high resolution radar images suitable for the analysis of dense built-up areas. In a former study, we presented a change analysis of the Stuttgart (Germany) airport. The aim of this study is the categorization of the changes detected in the time series. This categorization is motivated by the fact that it is of limited value to state only where and when a specific area has changed; at least as important is what caused the change. The focus is set on the analysis of so-called high activity areas (HAA), representing areas that changed at least four times over the investigated period. As a first step in categorizing these HAAs, the matching HAA changes (blobs) have to be identified. Afterwards, operating at this object-based blob level, several features are extracted, comprising shape-based, radiometric, statistical, and morphological values and one context feature based on a segmentation of the HAAs. This segmentation builds on morphological differential attribute profiles (DAPs). Seven context classes are established: urban, infrastructure, rural stable, rural unstable, natural, water, and unclassified. A specific HA blob is assigned to one of these classes by analyzing the CovAmCoh time series signature of the surrounding segments. In combination, also surrounding GIS information
Charakopoulos, A K; Karakasidis, T E; Papanicolaou, P N; Liakopoulos, A
2014-03-01
In the present work we approach the hydrodynamic problem of discriminating the state of the turbulent fluid region as a function of the distance from the axis of a turbulent jet. More specifically, we analyzed temperature fluctuations in vertical turbulent heated jets, where temperature time series were recorded along a horizontal line through the jet axis. We employed data from different sets of experiments, with various initial conditions, out of circular and elliptical shaped nozzles, in order to identify time series taken at the jet axis and discriminate them from those taken near the boundary with the ambient fluid using nonconventional hydrodynamic methods. For each temperature time series measured at a different distance from the jet axis, we estimated mainly nonlinear measures, such as mutual information combined with descriptive statistics, as well as linear and nonlinear dynamic descriptors such as the Hurst exponent, detrended fluctuation analysis, and the Hjorth parameters. The results obtained in all cases show that the proposed methodology allows us to distinguish the flow regime around the jet axis and identify the time series corresponding to the jet axis, in agreement with the conventional statistical hydrodynamic method. Furthermore, in order to reject the null hypothesis that the time series originate from a stochastic process, we applied the surrogate data method.
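Of the descriptors listed, the Hjorth parameters are the quickest to state: activity is the signal variance, mobility is the square root of the variance ratio of the first difference to the signal, and complexity is the ratio of the difference's mobility to the signal's. A plain-Python sketch on a synthetic sinusoid (not the experimental jet data):

```python
import math

def hjorth(series):
    """Hjorth activity, mobility, and complexity of a discrete signal."""
    def var(x):
        mean = sum(x) / len(x)
        return sum((v - mean) ** 2 for v in x) / len(x)
    d1 = [b - a for a, b in zip(series, series[1:])]   # first difference
    d2 = [b - a for a, b in zip(d1, d1[1:])]           # second difference
    activity = var(series)
    mobility = (var(d1) / activity) ** 0.5
    complexity = ((var(d2) / var(d1)) ** 0.5) / mobility
    return activity, mobility, complexity

# synthetic sinusoid standing in for a temperature record at the jet axis
signal = [math.sin(0.1 * i) for i in range(200)]
activity, mobility, complexity = hjorth(signal)
print(mobility, complexity)  # mobility near the 0.1 phase step; complexity near 1
```

A narrowband signal like a sinusoid gives complexity close to 1, while broadband turbulent fluctuations push it higher, which is what makes the parameter useful for discriminating flow regimes.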
Spectral analysis of GPS precise point positioning time series
NASA Astrophysics Data System (ADS)
Selle, C.; Desai, S.; Garcia Fernandez, M.; Sibois, A.
2014-12-01
This paper presents the results from performing spectral analysis on GPS positioning time series obtained from precise point positioning (PPP). The goal of this work was to evaluate the impact of different choices of processing strategies and models on GPS-based PPP. We studied the spectra of station positions, examined overall noise levels and identified the presence of spurious periodic signals. Testing various processing options allowed us to assess their effect on station position estimates. With the Jet Propulsion Laboratory's contribution to the second reprocessing campaign of the International GNSS Service (IGS) as our reference source for input orbits and clocks, we also considered the effects of using different orbit and clock products. This included products from the previous reprocessing campaign, which were fixed in the IGS05 reference frame, while recent products use the IGS08 frame. Of particular importance are our results from assessing the impact on the station position time series from the single-receiver ambiguity resolution capability offered by JPL's reprocessing campaigns. Furthermore, our tests raise the possibility of distinguishing between PPP processing settings, input orbits and clocks, and station data and location-dependent effects as causes of these features.
Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations
NASA Technical Reports Server (NTRS)
Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James
2013-01-01
This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it, an improved and generalized version of Bayesian Blocks [Scargle 1998], that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by [Arias-Castro, Donoho and Huo 2003]. In the spirit of Reproducible Research [Donoho et al. (2008)] all of the code and data necessary to reproduce all of the figures in this paper are included as auxiliary material.
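The dynamic-programming segmentation underlying Bayesian Blocks can be sketched compactly for event data, using the "events" fitness and change-point prior of the 2013 paper. This is a simplified illustration, not the authors' released code, and the test data below are invented:

```python
import numpy as np

def bayesian_blocks(t, p0=0.05):
    """Optimal segmentation of event times (Scargle et al. 2013 'events' fitness)."""
    t = np.sort(np.asarray(t, float))
    n = len(t)
    # candidate block edges: midpoints between events, plus the data range
    edges = np.concatenate([[t[0]], 0.5 * (t[1:] + t[:-1]), [t[-1]]])
    # prior penalty on the number of change points (eq. 21 of the paper)
    ncp_prior = 4.0 - np.log(73.53 * p0 * n ** -0.478)
    best = np.zeros(n)
    last = np.zeros(n, int)
    for k in range(n):
        width = edges[k + 1] - edges[:k + 1]       # widths of candidate final blocks
        count = np.arange(k + 1, 0, -1)            # event counts in those blocks
        fit = count * (np.log(count) - np.log(width)) - ncp_prior
        fit[1:] += best[:k]                        # add best fitness of earlier blocks
        last[k] = np.argmax(fit)
        best[k] = fit[last[k]]
    # backtrack the optimal change points
    cps, i = [], n
    while i > 0:
        cps.append(int(last[i - 1]))
        i = last[i - 1]
    return edges[np.array(cps[::-1] + [n])]

# events with an obvious rate change at t = 1
t = np.concatenate([np.linspace(0.0, 1.0, 100), np.linspace(1.01, 10.0, 20)])
block_edges = bayesian_blocks(t)
```

For this input the segmentation spans the full data range and places at least one change point near the rate break.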
[Analysis of a series of pheochromocytoma cases over 15 years].
Rojo Alvaro, J; Toni, M; Ollero, Md; Pineda, Jj; Munárriz, P; Anda, E
2012-01-01
Pheochromocytoma is a catecholamine-secreting tumour derived from chromaffin cells of the sympathetic nervous system. Eighty to eighty-five percent of these tumours are localized in the adrenal medulla. When pheochromocytomas are found outside the adrenal gland they are referred to as extra-adrenal pheochromocytomas or paragangliomas. The diagnosis is confirmed by elevation of catecholamines and metanephrines in blood plasma and urine. Localization of the tumour should follow the biochemical diagnosis, by means of CT scan and/or MRI. The treatment of choice is tumour resection by laparoscopic surgery. A review was made of the medical histories of all patients with a diagnosis of pheochromocytoma confirmed by the pathology reports of the Navarre Hospital Complex (Anatomía Patológica del Complejo Hospitalario de Navarra A y B) between 1996 and 2010. A descriptive analysis was made using the IBM SPSS Statistics program. Our series consists of 43 patients diagnosed with pheochromocytoma over a span of 15 years. The average age at presentation was 47 years. Among the younger patients, specific genetic syndromes were found. Computerized tomography was the most widely used method of localization. Contradictory results were found regarding perioperative medical management protocols. All pheochromocytoma tumours in this series were benign. It is advisable to carry out a genetic study on patients under twenty. The biochemical indicators with the greatest diagnostic sensitivity were the levels of normetanephrine and metanephrine in urine. Surgery was the only treatment option.
Statistical analysis of CSP plants by simulating extensive meteorological series
NASA Astrophysics Data System (ADS)
Pavón, Manuel; Fernández, Carlos M.; Silva, Manuel; Moreno, Sara; Guisado, María V.; Bernardos, Ana
2017-06-01
The feasibility analysis of any power plant project needs an estimate of the amount of energy it will be able to deliver to the grid during its lifetime. To achieve this, the feasibility study requires precise knowledge of the solar resource over a long-term period. In Concentrating Solar Power (CSP) projects, financing institutions typically require several statistical probability-of-exceedance scenarios for the expected electric energy output. Currently, the industry assumes a correlation between probabilities of exceedance of annual Direct Normal Irradiance (DNI) and energy yield. In this work, this assumption is tested by simulating the energy yield of CSP plants using as input a 34-year series of measured meteorological parameters and solar irradiance. The results show that, even if some correspondence between the probabilities of exceedance of annual DNI values and energy yields is found, the intra-annual distribution of DNI may significantly affect this correlation. This result highlights the need for standardized procedures for the elaboration of DNI time series representative of a given probability of exceedance of annual DNI.
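The probability-of-exceedance scenarios mentioned above (commonly called P50 and P90) can be read directly off an annual DNI series as percentiles. A minimal sketch with invented illustrative values; a real study would use the 30+ measured years described in the abstract:

```python
import numpy as np

# hypothetical annual DNI sums (kWh/m^2/year), for illustration only
annual_dni = np.array([2050.0, 2120.0, 1980.0, 2210.0, 2090.0, 1940.0,
                       2160.0, 2010.0, 2140.0, 1890.0, 2075.0, 2185.0])

p50 = np.percentile(annual_dni, 50)   # exceeded in ~50% of years (median year)
p90 = np.percentile(annual_dni, 10)   # exceeded in ~90% of years (conservative year)
```

The industry assumption tested in the paper is that the P90 annual-DNI year also produces the P90 energy yield; the authors show that the intra-annual DNI distribution can break this correspondence.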
Identification of statistical patterns in complex systems via symbolic time series analysis.
Gupta, Shalabh; Khatkhate, Amol; Ray, Asok; Keller, Eric
2006-10-01
Identification of statistical patterns from observed time series of spatially distributed sensor data is critical for performance monitoring and decision making in human-engineered complex systems, such as electric power generation, petrochemical, and networked transportation. This paper presents an information-theoretic approach to identification of statistical patterns in such systems, where the main objective is to enhance structural integrity and operation reliability. The core concept of pattern identification is built upon the principles of Symbolic Dynamics, Automata Theory, and Information Theory. To this end, a symbolic time series analysis method has been formulated and experimentally validated on a special-purpose test apparatus that is designed for data acquisition and real-time analysis of fatigue damage in polycrystalline alloys.
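The core steps of symbolic time series analysis, partitioning a real-valued signal into a finite symbol alphabet and estimating probabilities over the resulting finite-state machine, can be sketched as follows. The quantile partitioning and the entropy-rate measure here are generic illustrations, not the authors' exact formulation:

```python
import numpy as np

def symbolize(x, n_symbols=4):
    """Map a real-valued series to symbols via quantile partitioning."""
    edges = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.searchsorted(edges, x)

def entropy_rate(symbols, n_symbols=4):
    """Normalized entropy rate of the one-step symbol transition machine."""
    T = np.zeros((n_symbols, n_symbols))
    for a, b in zip(symbols[:-1], symbols[1:]):
        T[a, b] += 1.0
    occ = T.sum(axis=1)
    pi = occ / occ.sum()                 # state visit probabilities
    h = 0.0
    for i in range(n_symbols):
        if occ[i] == 0:
            continue
        row = T[i] / occ[i]              # transition probabilities from state i
        nz = row[row > 0]
        h -= pi[i] * np.sum(nz * np.log(nz))
    return h / np.log(n_symbols)         # 0 = deterministic, 1 = fully random

rng = np.random.default_rng(0)
h_noise = entropy_rate(symbolize(rng.normal(size=10000)), 4)   # near 1
h_period = entropy_rate(np.array([0, 1] * 500), 2)             # deterministic: 0
```

A drift in such a statistic over consecutive data windows is the kind of anomaly measure used for monitoring fatigue damage.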
Dutta, Debaditya; Mahmoud, Ahmed M.; Leers, Steven A.; Kim, Kang
2013-01-01
Large lipid pools in vulnerable plaques can, in principle, be detected using ultrasound-based thermal strain imaging (US-TSI). One practical challenge for in vivo cardiovascular application of US-TSI is that the thermal strain is masked by the mechanical strain caused by cardiac pulsation. ECG gating is a widely adopted method for cardiac motion compensation, but it is often susceptible to electrical and physiological noise. In this paper, we present an alternative time series analysis approach to separate thermal strain from mechanical strain without using ECG. The performance and feasibility of the time-series analysis technique were tested via numerical simulation as well as in vitro water tank experiments using a vessel-mimicking phantom and an excised human atherosclerotic artery, where the cardiac pulsation is simulated by a pulsatile pump. PMID:24808628
NASA Astrophysics Data System (ADS)
Muñoz-Diosdado, A.
2005-01-01
We analyzed databases with gait time series of healthy adults and persons with Parkinson's disease, Huntington's disease and amyotrophic lateral sclerosis (ALS). We obtained staircase graphs of accumulated events that can be bounded by a straight line, whose slope can be used to distinguish gait time series of healthy persons from those of ill persons. The global Hurst exponents of these series do not show clear tendencies; we suggest that this is because some gait time series have monofractal behavior and others have multifractal behavior, so they cannot be characterized by a single Hurst exponent. We calculated the multifractal spectra, obtained the spectral widths, and found that the spectra of healthy young persons are almost monofractal. The spectra of ill persons are wider than those of healthy persons. In contrast to interbeat time series, where pathology implies loss of multifractality, in gait time series multifractal behavior emerges with the pathology. Data were collected from healthy and ill subjects as they walked in a roughly circular path, with sensors on both feet, giving one time series for the left foot and another for the right. First, we analyzed these time series separately, and then we compared the results, both directly and with a cross-correlation analysis. We looked for differences between the two series that could be used as indicators of equilibrium problems.
A new approach to calibrate steady groundwater flow models with time series of head observations
NASA Astrophysics Data System (ADS)
Obergfell, C.; Bakker, M.; Maas, C.
2012-04-01
We developed a new method to calibrate aquifer parameters of steady-state well field models using measured time series of head fluctuations. Our method is an alternative to standard pumping tests and is based on time series analysis using parametric impulse response functions. First, the pumping influence is isolated from the overall groundwater fluctuation observed at monitoring wells around the well field, and response functions are determined for each individual well. Time series parameters are optimized using a quasi-Newton algorithm. For one monitoring well, the time series model parameters are also optimized by means of SCEM-UA, a Markov chain Monte Carlo algorithm, as a check on the validity of the parameters obtained by the faster quasi-Newton method. Subsequently, the drawdown corresponding to an average yearly pumping rate is calculated from the response functions determined by time series analysis. The drawdown values estimated with acceptable confidence intervals are used as calibration targets of a steady groundwater flow model. A case study is presented for the drinking water supply well field of Waalwijk (Netherlands). In this case study, a uniform aquifer transmissivity is optimized together with the conductance of ditches in the vicinity of the well field. Groundwater recharge and boundary heads do not have to be entered, which eliminates two important sources of uncertainty. The method constitutes a cost-efficient alternative to pumping tests and allows the determination of pumping influences without changes in well field operation.
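The first step of the method, expressing the pumping influence through a parametric impulse response and fitting its parameters to observed heads, can be sketched as below. The exponential response shape, the grid search, and all numbers are illustrative assumptions; the paper itself fits parametric impulse response functions with a quasi-Newton optimizer and checks one well with SCEM-UA:

```python
import numpy as np

def simulate_drawdown(pumping, amp, tau):
    """Drawdown as convolution of pumping with an exponential impulse response."""
    t = np.arange(len(pumping))
    ir = amp * np.exp(-t / tau)
    return np.convolve(pumping, ir)[:len(pumping)]

rng = np.random.default_rng(0)
pumping = rng.uniform(0.5, 1.5, 300)   # synthetic pumping-rate series
observed = simulate_drawdown(pumping, 0.8, 12.0) + rng.normal(0, 0.05, 300)

# crude grid search standing in for the quasi-Newton step
grid = [(a, tau) for a in np.linspace(0.2, 1.6, 29)
                 for tau in np.linspace(4.0, 24.0, 41)]
amp_hat, tau_hat = min(
    grid, key=lambda p: np.sum((observed - simulate_drawdown(pumping, *p)) ** 2))
```

The fitted response function, not the raw heads, then supplies the steady drawdown targets for the groundwater flow model.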
Zunino, L; Soriano, M C; Rosso, O A
2012-10-01
In this paper we introduce a multiscale symbolic information-theory approach for discriminating nonlinear deterministic and stochastic dynamics from time series associated with complex systems. More precisely, we show that the multiscale complexity-entropy causality plane is a useful representation space to identify the range of scales at which deterministic or noisy behaviors dominate the system's dynamics. Numerical simulations obtained from the well-known and widely used Mackey-Glass oscillator operating in a high-dimensional chaotic regime were used as test beds. The effect of an increased amount of observational white noise was carefully examined. The results obtained were contrasted with those derived from correlated stochastic processes and continuous stochastic limit cycles. Finally, several experimental and natural time series were analyzed in order to show the applicability of this scale-dependent symbolic approach in practical situations.
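The entropy coordinate of the complexity-entropy plane is typically the normalized Bandt-Pompe permutation entropy, evaluated on coarse-grained copies of the series. A minimal sketch (the statistical complexity coordinate and the Mackey-Glass test beds are omitted; parameters are illustrative):

```python
import numpy as np
from math import factorial

def permutation_entropy(x, m=4, delay=1):
    """Normalized Bandt-Pompe permutation entropy of order m."""
    n = len(x) - (m - 1) * delay
    counts = {}
    for i in range(n):
        pattern = tuple(np.argsort(x[i:i + m * delay:delay]))
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), float) / n
    return float(-np.sum(p * np.log(p)) / np.log(factorial(m)))

def coarse_grain(x, scale):
    """Non-overlapping averages, as in multiscale entropy analysis."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

rng = np.random.default_rng(7)
noise = rng.normal(size=20000)
pe_by_scale = [permutation_entropy(coarse_grain(noise, s)) for s in (1, 2, 4, 8)]
pe_trend = permutation_entropy(np.arange(100.0))   # fully ordered series: 0
```

White noise stays near the maximal entropy at every scale, whereas deterministic dynamics show a scale-dependent drop; that dependence is what the causality plane exploits.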
Series-expansion thermal tensor network approach for quantum lattice models
NASA Astrophysics Data System (ADS)
Chen, Bin-Bin; Liu, Yun-Jing; Chen, Ziyu; Li, Wei
2017-04-01
We propose a series-expansion thermal tensor network (SETTN) approach for efficient simulations of quantum lattice models. This continuous-time SETTN method is based on the numerically exact Taylor series expansion of the equilibrium density operator e^(-βH) (with H the total Hamiltonian and β the imaginary time), and is thus free of Trotter error. We discover, through simulating the XXZ spin chain and square-lattice quantum Ising models, that not only the Hamiltonian H, but also its powers H^n, can be efficiently expressed as matrix product operators, which enables us to calculate with high precision the equilibrium and dynamical properties of quantum lattice models at finite temperatures. Our SETTN method provides an alternative to conventional Trotter-Suzuki renormalization-group (RG) approaches, and achieves a very high standard of thermal RG simulation in terms of accuracy and flexibility.
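The idea of Taylor-expanding e^(-βH) can be illustrated on a dense toy Hamiltonian (two spins, chosen arbitrarily here), where the truncated series reproduces the exact partition function; the actual SETTN method performs this expansion with matrix product operators rather than dense matrices:

```python
import numpy as np

# toy two-site Hamiltonian: Ising coupling plus a transverse field (illustrative only)
sz = np.diag([1.0, -1.0])
sx = np.array([[0.0, 1.0], [1.0, 0.0]])
I2 = np.eye(2)
H = -np.kron(sz, sz) - 0.5 * (np.kron(sx, I2) + np.kron(I2, sx))

beta = 1.0
# exact partition function: Z = sum_i exp(-beta * E_i)
Z_exact = np.sum(np.exp(-beta * np.linalg.eigvalsh(H)))

# truncated Taylor series: e^{-beta H} = sum_n (-beta H)^n / n!
rho = np.zeros_like(H)
term = np.eye(4)
for n in range(1, 30):
    rho += term                      # add the (n-1)-th order term
    term = term @ (-beta * H) / n    # build the next order
Z_series = np.trace(rho)
```

For a bounded spectrum and moderate β the series converges rapidly, which is why the expansion order needed in practice stays manageable.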
Centrality measures in temporal networks with time series analysis
NASA Astrophysics Data System (ADS)
Huang, Qiangjuan; Zhao, Chengli; Zhang, Xue; Wang, Xiaojie; Yi, Dongyun
2017-05-01
The identification of important nodes in networks has wide application in different fields. However, current research is mostly based on static or aggregated networks. Recently, increasing attention to networks with time-varying structure has promoted the study of node centrality in temporal networks. In this paper, we define a supra-evolution matrix to describe the temporal network structure. Using time series analysis, the relationships between different time layers can be learned automatically. Based on the special form of the supra-evolution matrix, the eigenvector centrality calculation is turned into the calculation of eigenvectors of several low-dimensional matrices through iteration, which effectively reduces the computational complexity. Experiments are carried out on two real-world temporal networks, the Enron email communication network and the DBLP co-authorship network, and the results show that our method is more efficient at discovering the important nodes than the common aggregation method.
Visual analytics for model selection in time series analysis.
Bögl, Markus; Aigner, Wolfgang; Filzmoser, Peter; Lammarsch, Tim; Miksch, Silvia; Rind, Alexander
2013-12-01
Model selection in time series analysis is a challenging task for domain experts in many application areas such as epidemiology, economy, or environmental sciences. The methodology used for this task demands a close combination of human judgement and automated computation. However, statistical software tools do not adequately support this combination through interactive visual interfaces. We propose a Visual Analytics process to guide domain experts in this task. For this purpose, we developed the TiMoVA prototype that implements this process based on user stories and iterative expert feedback on user experience. The prototype was evaluated by usage scenarios with an example dataset from epidemiology and interviews with two external domain experts in statistics. The insights from the experts' feedback and the usage scenarios show that TiMoVA is able to support domain experts in model selection tasks through interactive visual interfaces with short feedback cycles.
Time Series Analysis of Integrated Building System Variables
NASA Astrophysics Data System (ADS)
Georgiev, Tz.; Jonkov, T.; Yonchev, E.
2010-10-01
This article deals with time series analysis of indoor and outdoor variables of an integrated building system. The kernel of such systems is the heating, ventilation and air conditioning (HVAC) problem. Important outdoor and indoor variables are: air temperature, global and diffuse radiation, wind speed and direction, relative humidity, mean radiant temperature, and so on. The aim of this article is to select the structure of, and investigate, linear auto-regressive (AR) and auto-regressive with external inputs (ARX) models. The investigation of the obtained models is based on real-life data. All computations are carried out in the MATLAB environment. Further research will focus on the synthesis of robust energy-saving control algorithms.
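For a fixed model order, fitting an AR structure of the kind investigated here reduces to an ordinary least-squares problem on lagged values. A minimal Python sketch on synthetic data (the article itself works in MATLAB; the coefficients below are invented):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
a_true = np.array([0.6, -0.3])   # AR(2) coefficients of the synthetic signal

x = np.zeros(n)
for t in range(2, n):
    x[t] = a_true[0] * x[t - 1] + a_true[1] * x[t - 2] + rng.normal()

# regression matrix of lagged values; least squares recovers the AR coefficients
X = np.column_stack([x[1:-1], x[:-2]])
a_hat, *_ = np.linalg.lstsq(X, x[2:], rcond=None)
```

An ARX model adds columns of lagged external inputs (e.g., outdoor temperature or solar radiation) to the same regression matrix.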
Time series analysis for minority game simulations of financial markets
NASA Astrophysics Data System (ADS)
Ferreira, Fernando F.; Francisco, Gerson; Machado, Birajara S.; Muruganandam, Paulsamy
2003-04-01
The minority game (MG) model introduced recently provides promising insights into the evolution of prices, indices and rates in financial markets. In this paper we perform a time series analysis of the model, employing tools from statistics, dynamical systems theory and stochastic processes. Using benchmark systems and a financial index for comparison, several conclusions are obtained about the generating mechanism for this kind of evolution. The motion is deterministic, driven by occasional random external perturbations. When the interval between two successive perturbations is sufficiently large, one can find low-dimensional chaos in this regime. However, the full motion of the MG model is found to be similar to that of the first differences of the S&P 500 index: stochastic, nonlinear and (unit root) stationary.
Time series analysis of diverse extreme phenomena: universal features
NASA Astrophysics Data System (ADS)
Eftaxias, K.; Balasis, G.
2012-04-01
The field of complex systems holds that the dynamics of complex systems are founded on universal principles that may be used to describe a great variety of scientific and technological approaches to different types of natural, artificial, and social systems. We suggest that earthquake, epileptic seizure, solar flare, and magnetic storm dynamics can be analyzed within similar mathematical frameworks. A central property of the generation of these extreme events is the occurrence of coherent large-scale collective behavior with very rich structure, resulting from repeated nonlinear interactions among the corresponding constituents. Consequently, we apply Tsallis nonextensive statistical mechanics, as it provides an appropriate framework for investigating universal principles of their generation. First, we examine the data in terms of Tsallis entropy, aiming to discover common "pathological" symptoms of transition to a significant shock. By monitoring the temporal evolution of the degree of organization in the time series, we observe similar distinctive features revealing a significant reduction of complexity during their emergence. Second, a model for earthquake dynamics derived from first principles within the nonextensive Tsallis formalism has recently been introduced. This approach leads to an energy distribution function (of Gutenberg-Richter type) for the magnitude distribution of earthquakes, providing an excellent fit to seismicities generated in various large geographic areas usually identified as seismic regions. We show that this function is able to describe the energy distribution (with a similar non-extensive q-parameter) of solar flares, magnetic storms, and epileptic and earthquake shocks. This evidence of universal statistical behavior suggests the possibility of a common approach for studying space weather, earthquakes and epileptic seizures.
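The Tsallis entropy used above as an organization measure has a one-line definition. A minimal sketch, also showing that it recovers the Shannon entropy in the q → 1 limit:

```python
import numpy as np

def tsallis_entropy(p, q):
    """S_q = (1 - sum_i p_i^q) / (q - 1), with the Shannon limit at q = 1."""
    p = np.asarray(p, float)
    p = p[p > 0]
    if abs(q - 1.0) < 1e-12:
        return float(-np.sum(p * np.log(p)))   # Shannon entropy
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

uniform = np.full(4, 0.25)
s2 = tsallis_entropy(uniform, 2.0)   # (1 - 4/16) / 1 = 0.75
s1 = tsallis_entropy(uniform, 1.0)   # Shannon: ln 4
```

In the monitoring scheme described above, S_q is tracked over sliding windows of the signal, and a sustained drop signals the loss of complexity preceding the shock.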
Geostatistical analysis as applied to two environmental radiometric time series.
Dowdall, Mark; Lind, Bjørn; Gerland, Sebastian; Rudjord, Anne Liv
2003-03-01
This article details the results of an investigation into the application of geostatistical data analysis to two environmental radiometric time series. The data series employed consist of 99Tc values for seaweed (Fucus vesiculosus) and seawater samples taken as part of a marine monitoring program conducted on the coast of northern Norway by the Norwegian Radiation Protection Authority. Geostatistical methods were selected in order to provide information on values of the variables at unsampled times and to investigate the temporal correlation exhibited by the data sets. This information is of use in the optimisation of future sampling schemes and for providing information on the temporal behaviour of the variables in question that may not be obtained during a cursory analysis. The results indicate a high degree of temporal correlation within the data sets, the correlation for the seawater and seaweed data being modelled with an exponential and linear function, respectively. The semi-variogram for the seawater data indicates a temporal range of correlation of approximately 395 days with no apparent random component to the overall variance structure and was described best by an exponential function. The temporal structure of the seaweed data was best modelled by a linear function with a small nugget component. Evidence of drift was present in both semi-variograms. Interpolation of the data sets using the fitted models and a simple kriging procedure were compared, using a cross-validation procedure, with simple linear interpolation. Results of this exercise indicate that, for the seawater data, the kriging procedure outperformed the simple interpolation with respect to error distribution and correlation of estimates with actual values. Using the unbounded linear model with the seaweed data produced estimates that were only marginally better than those produced by the simple interpolation.
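The empirical temporal semivariogram at the heart of this analysis is straightforward to compute. A minimal sketch for a regularly sampled series; the study's samples are irregularly spaced in time, which in practice requires binning the lags instead:

```python
import numpy as np

def semivariogram(z, max_lag):
    """Empirical gamma(h): half the mean squared increment at each lag h."""
    return np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2)
                     for h in range(1, max_lag + 1)])

t = np.arange(200.0)
gamma = semivariogram(2.0 * t, 10)   # pure linear drift: gamma(h) = 2 h^2
```

A fitted exponential or linear model (as used for the seawater and seaweed series, respectively) would then be read off this curve, with the nugget given by its value as h → 0.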
Wild, Beate; Stadnitski, Tatjana; Wesche, Daniela; Stroe-Kunold, Esther; Schultz, Jobst-Hendrik; Rudofsky, Gottfried; Maser-Gluth, Christiane; Herzog, Wolfgang; Friederich, Hans-Christoph
2016-04-01
The aim of the study was to investigate the characteristics of awakening salivary cortisol in patients with anorexia nervosa (AN) using a time series design. We included ten AN inpatients, six with a very low BMI (high symptom severity, HSS group) and four with less severe symptoms (low symptom severity, LSS group). Patients collected salivary cortisol daily upon awakening. The number of collected saliva samples varied across patients between n=65 and n=229 (due to the different lengths of their inpatient stay). In addition, before retiring, the patients answered daily questions on a handheld device regarding disorder-related psychosocial variables. The cortisol and diary data were analyzed using a time series approach. The time series showed that the awakening cortisol of the AN patients was elevated compared to a control group. Cortisol measurements of patients with LSS essentially fluctuated in a stationary manner around a constant mean. The series of patients with HSS were generally less stable; four HSS patients showed a non-stationary cortisol awakening series. Antipsychotic medication did not change awakening cortisol in a specific way. The lagged dependencies between cortisol and depressive feelings became significant for four patients; here, higher cortisol values were temporally associated with higher values of depressive feelings. Upon awakening, the cortisol of all AN patients was in the standard range but elevated compared to healthy controls. Patients with HSS appeared to show less stable awakening cortisol time series than patients with LSS. Copyright © 2016 Elsevier B.V. All rights reserved.
Visibility graph analysis for re-sampled time series from auto-regressive stochastic processes
NASA Astrophysics Data System (ADS)
Zhang, Rong; Zou, Yong; Zhou, Jie; Gao, Zhong-Ke; Guan, Shuguang
2017-01-01
Visibility graph (VG) and horizontal visibility graph (HVG) play a crucial role in modern complex network approaches to nonlinear time series analysis. However, depending on the underlying dynamic processes, it remains to characterize the exponents of the presumably exponential degree distributions. It has been recently conjectured that there is a critical value of the exponent, λ_c = ln(3/2), which separates chaotic from correlated stochastic processes. Here, we systematically apply (H)VG analysis to time series from autoregressive (AR) models, which confirms the hypothesis that an increased correlation length results in larger values of λ > λ_c. On the other hand, we numerically find a regime of negatively correlated process increments where λ < λ_c, which is in contrast to this hypothesis. Furthermore, by constructing graphs based on re-sampled time series, we find that network measures show non-trivial dependencies on the autocorrelation functions of the processes. We propose to choose the decorrelation time as the maximal re-sampling delay for the algorithm. Our results are detailed for time series from AR(1) and AR(2) processes.
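A horizontal visibility graph can be built with a short scan: two points are linked when every point between them lies below both. This sketch (an illustration, not the paper's code) also checks the known mean degree of 4 for long uncorrelated series:

```python
import numpy as np

def hvg_degrees(x):
    """Node degrees of the horizontal visibility graph of series x."""
    n = len(x)
    deg = np.zeros(n, int)
    for i in range(n - 1):
        highest = -np.inf                      # running max between i and j
        for j in range(i + 1, n):
            if highest < min(x[i], x[j]):      # nothing in between blocks the view
                deg[i] += 1
                deg[j] += 1
            highest = max(highest, x[j])
            if x[j] >= x[i]:                   # everything beyond j is blocked
                break
    return deg

small = hvg_degrees(np.array([1.0, 2.0, 3.0]))        # chain: degrees [1, 2, 1]
rng = np.random.default_rng(3)
mean_degree = hvg_degrees(rng.random(5000)).mean()    # close to 4 for iid data
```

Fitting an exponential tail to the degree distribution of such a graph yields the exponent λ whose position relative to ln(3/2) is at issue in the paper.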
Multiscale multifractal detrended cross-correlation analysis of financial time series
NASA Astrophysics Data System (ADS)
Shi, Wenbin; Shang, Pengjian; Wang, Jing; Lin, Aijing
2014-06-01
In this paper, we introduce a method called multiscale multifractal detrended cross-correlation analysis (MM-DCCA). The method allows us to extend the description of the cross-correlation properties between two time series. MM-DCCA may provide new ways of measuring the nonlinearity of two signals, and it presents much richer information than multifractal detrended cross-correlation analysis (MF-DCCA) by sweeping the whole range of scales at which the multifractal structure of a complex system is discussed. To illustrate the advantages of this approach, we use MM-DCCA to analyze the cross-correlation properties between financial time series. We show that the new method can be adapted to the stock markets under investigation and can provide a more faithful and more interpretable description of the dynamic mechanism between financial time series than traditional MF-DCCA. We also propose to reduce the scale range when analyzing short time series; some inherent properties that remain hidden when a wide range is used may be exhibited clearly in this way.
Finite element techniques in computational time series analysis of turbulent flows
NASA Astrophysics Data System (ADS)
Horenko, I.
2009-04-01
In recent years there has been a considerable increase of interest in the mathematical modeling and analysis of complex systems that undergo transitions between several phases or regimes. Such systems can be found, e.g., in weather forecasting (transitions between weather conditions), climate research (ice ages and warm ages), computational drug design (conformational transitions) and econometrics (e.g., transitions between different phases of the market). In all cases, the accumulation of sufficiently detailed time series has led to the formation of huge databases, containing enormous but still undiscovered treasures of information. However, the extraction of essential dynamics and the identification of phases is usually hindered by the multidimensional nature of the signal, i.e., the information is "hidden" in the time series. Standard filtering approaches (e.g., wavelet-based spectral methods) have in general unfeasible numerical complexity in high dimensions; other standard methods (e.g., Kalman filter, MVAR, ARCH/GARCH) impose strong assumptions about the type of the underlying dynamics. An approach based on the optimization of a specially constructed regularized functional (describing the quality of data description in terms of a certain number of specified models) will be introduced. Based on this approach, several new adaptive mathematical methods for simultaneous EOF/SSA-like data-based dimension reduction and identification of hidden phases in high-dimensional time series will be presented. The methods exploit the topological structure of the analysed data and do not impose severe assumptions on the underlying dynamics. Special emphasis will be placed on the mathematical assumptions and the numerical cost of the constructed methods. The application of the presented methods will first be demonstrated on a toy example, and the results will be compared with those obtained by standard approaches. The importance of accounting for the mathematical
Bronson, Jonathan E; Fei, Jingyi; Hofman, Jake M; Gonzalez, Ruben L; Wiggins, Chris H
2009-12-16
Time series data provided by single-molecule Förster resonance energy transfer (smFRET) experiments offer the opportunity to infer not only model parameters describing molecular complexes, e.g., rate constants, but also information about the model itself, e.g., the number of conformational states. Resolving whether such states exist, or how many of them exist, requires a careful approach to the problem of model selection, here meaning discrimination among models with differing numbers of states. The most straightforward approach to model selection generalizes the common idea of maximum likelihood (selecting the most likely parameter values) to maximum evidence: selecting the most likely model. In either case, such an inference presents a tremendous computational challenge, which we here address by exploiting an approximation technique termed variational Bayesian expectation maximization. We demonstrate how this technique can be applied to temporal data such as smFRET time series; show superior statistical consistency relative to the maximum likelihood approach; compare its performance on smFRET data generated from experiments on the ribosome; and illustrate how model selection in such probabilistic or generative modeling can facilitate analysis of closely related temporal data currently prevalent in biophysics. Source code used in this analysis, including a graphical user interface, is available open source via http://vbFRET.sourceforge.net.
Interrupted time-series analysis: studying trends in neurosurgery.
Wong, Ricky H; Smieliauskas, Fabrice; Pan, I-Wen; Lam, Sandi K
2015-12-01
OBJECT Neurosurgery studies traditionally have evaluated the effects of interventions on health care outcomes by studying overall changes in measured outcomes over time. Yet, this type of linear analysis is limited due to lack of consideration of the trend's effects both pre- and postintervention and the potential for confounding influences. The aim of this study was to illustrate interrupted time-series analysis (ITSA) as applied to an example in the neurosurgical literature and highlight ITSA's potential for future applications. METHODS The methods used in previous neurosurgical studies were analyzed and then compared with the methodology of ITSA. RESULTS The ITSA method was identified in the neurosurgical literature as an important technique for isolating the effect of an intervention (such as a policy change or a quality and safety initiative) on a health outcome independent of other factors driving trends in the outcome. The authors determined that ITSA allows for analysis of the intervention's immediate impact on outcome level and on subsequent trends and enables a more careful measure of the causal effects of interventions on health care outcomes. CONCLUSIONS ITSA represents a significant improvement over traditional observational study designs in quantifying the impact of an intervention. ITSA is a useful statistical procedure to understand, consider, and implement as the field of neurosurgery evolves in sophistication in big-data analytics, economics, and health services research.
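The core ITSA model is a segmented regression with a level-change and a slope-change term at the intervention time. A minimal sketch on synthetic data (the intervention date, coefficients and noise level are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(100.0)
t0 = 50.0
D = (t >= t0).astype(float)   # indicator: 1 after the intervention, 0 before

# y = b0 + b1*t + b2*D + b3*(t - t0)*D + noise
y = 10.0 + 0.2 * t + 3.0 * D + 0.15 * (t - t0) * D + rng.normal(0, 0.5, t.size)

# design matrix: intercept, pre-existing trend, level change, slope change
X = np.column_stack([np.ones_like(t), t, D, (t - t0) * D])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
level_change, slope_change = beta[2], beta[3]
```

The level-change and slope-change coefficients are the quantities of interest, isolating the intervention's immediate and ongoing effects from the pre-existing trend. In practice, ITSA must also address autocorrelated errors (e.g., with Newey-West standard errors or ARIMA error models), which ordinary least squares ignores.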
Forced response approach of a parametric vibration with a trigonometric series
NASA Astrophysics Data System (ADS)
Huang, Dishan
2015-02-01
A forced vibration problem with parametric stiffness is modeled by a feedback structure in this manuscript, and the forced response is expressed as a special trigonometric series. The forced response is determined by an algebraic equation. By applying harmonic balance and a limiting operation, all coefficients of the harmonic components in the forced response solution are obtained. The results show that the new approach offers advantages in computational time and accuracy, and that it is significant for theoretical research and engineering applications dealing with forced parametric vibration.
Swetapadma, Aleena; Yadav, Anamika
2015-01-01
Many schemes have been reported for shunt fault location estimation, but fault location estimation for series (open conductor) faults has not been addressed so far. Existing numerical relays only detect an open conductor (series) fault and indicate the faulty phase(s); they are unable to locate the series fault, so the repair crew must patrol the complete line to find it. In this paper, fuzzy-based fault detection/classification and location schemes in the time domain are proposed for series faults, shunt faults, and simultaneous series and shunt faults. The fault simulation studies and fault location algorithm have been developed using Matlab/Simulink. Synchronized voltage and current phasors from both ends of the line are used as input to the proposed fuzzy-based fault location scheme. The percentage error in location is within 1% for series faults and within 5% for shunt faults for all tested fault cases. The location error estimates are validated using a chi-square test at both the 1% and 5% levels of significance.
Rodgers, Joseph Lee; Beasley, William Howard; Schuelke, Matthew
2014-01-01
Many data structures, particularly time series data, are naturally seasonal, cyclical, or otherwise circular. Past graphical methods for time series have focused on linear plots. In this article, we move graphical analysis onto the circle. We focus on 2 particular methods, one old and one new. Rose diagrams are circular histograms and can be produced in several different forms using the RRose software system. In addition, we propose, develop, illustrate, and provide software support for a new circular graphical method, called Wrap-Around Time Series Plots (WATS Plots), which is a graphical method useful to support time series analyses in general but in particular in relation to interrupted time series designs. We illustrate the use of WATS Plots with an interrupted time series design evaluating the effect of the Oklahoma City bombing on birthrates in Oklahoma County during the 10 years surrounding the bombing of the Murrah Building in Oklahoma City. We compare WATS Plots with linear time series representations and overlay them with smoothing and error bands. Each method is shown to have advantages in relation to the other; in our example, the WATS Plots more clearly show the existence and effect size of the fertility differential.
Inverting geodetic time series with a principal component analysis-based inversion method
NASA Astrophysics Data System (ADS)
Kositsky, A. P.; Avouac, J.-P.
2010-03-01
The Global Positioning System (GPS) now makes it possible to monitor deformation of the Earth's surface along plate boundaries with unprecedented accuracy. In theory, the spatiotemporal evolution of slip on the plate boundary at depth, associated with either seismic or aseismic slip, can be inferred from these measurements through some inversion procedure based on the theory of dislocations in an elastic half-space. We describe and test a principal component analysis-based inversion method (PCAIM), an inversion strategy that relies on principal component analysis of the surface displacement time series. We prove that the fault slip history can be recovered from the inversion of each principal component. Because PCAIM does not require externally imposed temporal filtering, it can deal with any kind of time variation of fault slip. We test the approach by applying the technique to synthetic geodetic time series to show that a complicated slip history combining coseismic, postseismic, and nonstationary interseismic slip can be retrieved from this approach. PCAIM produces slip models comparable to those obtained from standard inversion techniques with less computational complexity. We also compare an afterslip model derived from the PCAIM inversion of postseismic displacements following the 2005 Mw 8.6 Nias earthquake with another solution obtained from the extended network inversion filter (ENIF). We introduce several extensions of the algorithm to allow statistically rigorous integration of multiple data sources (e.g., both GPS and interferometric synthetic aperture radar time series) over multiple timescales. PCAIM can be generalized to any linear inversion algorithm.
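The core idea of the abstract above — decompose the space-time data matrix into principal components, invert only the spatial part of each component through the Green's function, and carry the temporal function along — can be sketched on a toy rank-1 problem. Everything here is illustrative: the one-patch "fault", the Green's function values, and the function names are our assumptions, and real PCAIM uses full elastic half-space Green's functions, multiple components, and data covariances.

```python
import random

def matvec(A, x):
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def dominant_component(D, iters=100):
    """Power iteration on D D^T: leading spatial pattern u, temporal function v = D^T u."""
    random.seed(0)
    u = [random.random() for _ in D]
    for _ in range(iters):
        w = matvec(D, matvec(transpose(D), u))
        norm = sum(c * c for c in w) ** 0.5
        u = [c / norm for c in w]
    return u, matvec(transpose(D), u)

# Synthetic rank-1 deformation field: one slip patch observed at 3 stations over
# 5 epochs. G maps unit slip to surface displacement at each station.
G = [0.8, 0.5, 0.2]
slip_history = [0.0, 1.0, 2.0, 2.5, 3.0]
D = [[g * s for s in slip_history] for g in G]   # stations x epochs

u, v = dominant_component(D)
# Invert only the spatial pattern u for slip (scalar least squares for one patch),
# then scale the temporal function: slip(t) = (G^+ u) * v(t).
gu = sum(g * ui for g, ui in zip(G, u)) / sum(g * g for g in G)
recovered = [gu * vt for vt in v]                # matches slip_history
```

The computational advantage claimed in the abstract comes from this factorization: each component requires one static inversion rather than one inversion per epoch.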
Imai, Chisato; Hashizume, Masahiro
2015-03-01
Time series analysis is suitable for investigations of relatively direct and short-term effects of exposures on outcomes. In environmental epidemiology studies, this method has been one of the standard approaches to assess impacts of environmental factors on acute non-infectious diseases (e.g. cardiovascular deaths), conventionally with generalized linear or additive models (GLM and GAM). However, the same analysis practices are often observed with infectious diseases despite the substantial differences from non-infectious diseases that may result in analytical challenges. Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, a systematic review was conducted to elucidate important issues in assessing the associations between environmental factors and infectious diseases using time series analysis with GLM and GAM. Published studies on the associations between weather factors and malaria, cholera, dengue, and influenza were targeted. Our review raised issues regarding the estimation of susceptible population and exposure lag times, the adequacy of seasonal adjustments, the presence of strong autocorrelations, and the lack of a smaller observation time unit of outcomes (i.e. daily data). These concerns may be attributable to features specific to infectious diseases, such as transmission among individuals and complicated causal mechanisms. The consequence of not taking adequate measures to address these issues is distortion of the risk quantification for exposure factors. Future studies should pay careful attention to these details and examine alternative models or methods that improve studies using time series regression analysis for environmental determinants of infectious diseases.
An Alternative Approach to Atopic Dermatitis: Part I—Case-Series Presentation
2004-01-01
Atopic dermatitis (AD) is a complex disease of obscure pathogenesis. A substantial portion of AD patients treated with conventional therapy become intractable after several cycles of recurrence. Over the last 20 years we have developed an alternative approach to treat many of these patients by diet and Kampo herbal medicine. However, as our approach is highly individualized and the Kampo formulae sometimes complicated, it is not easy to provide evidence to establish the usefulness of this approach. In this Review, to demonstrate the effectiveness of the method of individualized Kampo therapy, results are presented for a series of patients who had failed with conventional therapy but were treated afterwards in our institution. Based on these data, we contend that there exists a definite subgroup of AD patients in whom conventional therapy fails but the ‘Diet and Kampo’ approach succeeds in healing. Therefore, this approach should be considered seriously as a second-line treatment for AD patients. In the Discussion, we review the evidential status of the current conventional strategies for AD treatment in general, and then specifically discuss the possibility of integrating Kampo regimens into it, taking the case series presented here as its evidential basis. We emphasize that Kampo therapy for AD is more ‘art’ than technology, for which expertise is an essential pre-requisite. PMID:15257326
Three Approaches to Environmental Resources Analysis.
ERIC Educational Resources Information Center
Harvard Univ., Cambridge, MA. Graduate School of Design.
This booklet, the first of a projected series related to the development of methodologies and techniques for environmental planning and design, examines three approaches that are currently being used to identify, analyze, and evaluate the natural and man-made resources that comprise the physical environment. One approach by G. Angus Hills uses a…
Three approaches to reliability analysis
NASA Technical Reports Server (NTRS)
Palumbo, Daniel L.
1989-01-01
It is noted that current reliability analysis tools differ not only in their solution techniques, but also in their approach to model abstraction. The analyst must be satisfied with the constraints that are intrinsic to any combination of solution technique and model abstraction. To get a better idea of the nature of these constraints, three reliability analysis tools (HARP, ASSIST/SURE, and CAME) were used to model portions of the Integrated Airframe/Propulsion Control System architecture. When presented with the example problem, all three tools failed to produce correct results. In all cases, either the tool or the model had to be modified. It is suggested that most of the difficulty is rooted in the large model size and long computational times which are characteristic of Markov model solutions.
An approach for estimating time-variable rates from geodetic time series
NASA Astrophysics Data System (ADS)
Didova, Olga; Gunter, Brian; Riva, Riccardo; Klees, Roland; Roese-Koerner, Lutz
2016-11-01
There has been considerable research in the literature focused on computing and forecasting sea-level changes in terms of constant trends or rates. The Antarctic ice sheet is one of the main contributors to sea-level change with highly uncertain rates of glacial thinning and accumulation. Geodetic observing systems such as the Gravity Recovery and Climate Experiment (GRACE) and the Global Positioning System (GPS) are routinely used to estimate these trends. In an effort to improve the accuracy and reliability of these trends, this study investigates a technique that allows the estimated rates, along with co-estimated seasonal components, to vary in time. For this, state space models are defined and then solved by a Kalman filter (KF). The reliable estimation of noise parameters is one of the main problems encountered when using a KF approach, which is solved by numerically optimizing the likelihood. Since the optimization problem is non-convex, it is challenging to find an optimal solution. To address this issue, we limited the parameter search space using classical least-squares adjustment (LSA). In this context, we also tested the usage of inequality constraints by directly verifying whether they are supported by the data. The suggested technique for time-series analysis is expanded to classify and handle time-correlated observational noise within the state space framework. The performance of the method is demonstrated using GRACE and GPS data at the CAS1 station located in East Antarctica and compared to commonly used LSA. The results suggest that the outlined technique allows for more reliable trend estimates, as well as for more physically meaningful interpretations, while validating independent observing systems.
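The state-space idea behind time-variable rate estimation can be sketched with a local linear trend model, in which the rate itself follows a random walk and a Kalman filter tracks it. This is a minimal sketch on synthetic data, not the paper's model: the noise variances are illustrative placeholders, and the co-estimated seasonal components and time-correlated noise handling are omitted.

```python
def kalman_local_linear_trend(y, q_level=1e-4, q_rate=0.05, r=0.25):
    """Filtered rate estimates under a local linear trend model with a random-walk rate."""
    # state x = [level, rate]; transition x_t = [[1,1],[0,1]] x_{t-1} + w; obs y_t = level + e
    x = [y[0], 0.0]
    P = [[1.0, 0.0], [0.0, 1.0]]
    rates = []
    for obs in y:
        # predict: x <- F x, P <- F P F' + Q
        x = [x[0] + x[1], x[1]]
        P = [[P[0][0] + P[0][1] + P[1][0] + P[1][1] + q_level, P[0][1] + P[1][1]],
             [P[1][0] + P[1][1], P[1][1] + q_rate]]
        # update with the scalar observation y_t = level + e
        s = P[0][0] + r
        k = [P[0][0] / s, P[1][0] / s]
        innov = obs - x[0]
        x = [x[0] + k[0] * innov, x[1] + k[1] * innov]
        P = [[(1 - k[0]) * P[0][0], (1 - k[0]) * P[0][1]],
             [P[1][0] - k[1] * P[0][0], P[1][1] - k[1] * P[0][1]]]
        rates.append(x[1])
    return rates

# A series whose trend switches from +1.0 to -0.5 per epoch halfway through:
y = [float(t) for t in range(30)] + [30.0 - 0.5 * t for t in range(1, 31)]
rates = kalman_local_linear_trend(y)
# the filtered rate tracks +1.0 before the change and approaches -0.5 after it
```

A constant-rate least-squares fit to the same series would report a single averaged trend; the filtered rate is what lets the estimate vary in time.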
Financial time series analysis based on effective phase transfer entropy
NASA Astrophysics Data System (ADS)
Yang, Pengbo; Shang, Pengjian; Lin, Aijing
2017-02-01
Transfer entropy is a powerful technique which is able to quantify the impact of one dynamic system on another system. In this paper, we propose the effective phase transfer entropy method based on the transfer entropy method. We use simulated data to test the performance of this method, and the experimental results confirm that the proposed approach is capable of detecting the information transfer between the systems. We also explore the relationship between effective phase transfer entropy and some variables, such as data size, coupling strength and noise. The effective phase transfer entropy is positively correlated with the data size and the coupling strength. Even in the presence of a large amount of noise, it can detect the information transfer between systems, and it is very robust to noise. Moreover, this measure is indeed able to accurately estimate the information flow between systems compared with phase transfer entropy. In order to reflect the application of this method in practice, we apply this method to financial time series and gain new insight into the interactions between systems. It is demonstrated that the effective phase transfer entropy can be used to detect some economic fluctuations in the financial market. To summarize, the effective phase transfer entropy method is a very efficient tool to estimate the information flow between systems.
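The quantity that effective phase transfer entropy builds on can be sketched with a plug-in transfer entropy estimate for coarse-grained (symbolized) series with history length 1. This is our own minimal illustration: the paper's method additionally works on instantaneous phases and subtracts a surrogate baseline, both omitted here.

```python
from collections import Counter
from math import log2
import random

def transfer_entropy(x, y):
    """Plug-in estimate of T(X->Y) with history length 1, in bits."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))     # (y_future, y_past, x_past)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    singles = Counter(y[:-1])
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_full = c / pairs_yx[(y0, x0)]               # p(y1 | y0, x0)
        p_hist = pairs_yy[(y1, y0)] / singles[y0]     # p(y1 | y0)
        te += (c / n) * log2(p_full / p_hist)
    return te

# X drives Y with a one-step lag: y[t] copies x[t-1], while x is i.i.d. binary.
random.seed(7)
x = [random.randint(0, 1) for _ in range(2000)]
y = [0] + x[:-1]
te_xy = transfer_entropy(x, y)   # near 1 bit: x's past fully determines y's next value
te_yx = transfer_entropy(y, x)   # near 0: y carries no extra information about x
```

The asymmetry between the two directions is the point: unlike correlation, transfer entropy distinguishes which system drives which.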
Time Series Analysis of the Quasar PKS 1749+096
NASA Astrophysics Data System (ADS)
Lam, Michael T.; Balonek, T. J.
2011-01-01
Multiple timescales of variability are observed in quasars at a variety of wavelengths, the nature of which is not fully understood. In 2007 and 2008, the quasar 1749+096 underwent two unprecedented optical outbursts, reaching a brightness never before seen in our twenty years of monitoring. Much lower level activity had been seen prior to these two outbursts. We present an analysis of the timescales of variability over the two regimes using a variety of statistical techniques. An IDL software package developed at Colgate University over the summer of 2010, the Quasar User Interface (QUI), provides effective computation of four time series functions for analyzing underlying trends present in generic, discretely sampled data sets. Using the Autocorrelation Function, Structure Function, and Power Spectrum, we are able to quickly identify possible variability timescales. QUI is also capable of computing the Cross-Correlation Function for comparing variability at different wavelengths. We apply these algorithms to 1749+096 and present our analysis of the timescales for this object. Funding for this project was received from Colgate University, the Justus and Jayne Schlichting Student Research Fund, and the NASA / New York Space Grant.
Time Series Analysis of Symbiotic Stars and Cataclysmic Variables
NASA Astrophysics Data System (ADS)
Ren, Jiaying; MacLachlan, G.; Panchmal, A.; Dhuga, K.; Morris, D.
2010-01-01
Symbiotic stars (SSs) and Cataclysmic Variables (CVs) are two families of binary systems which occasionally vary in brightness because of accretion from the secondary star. High frequency oscillations, also known as flickering, are thought to occur because of turbulence in the accretion disk especially in and near the vicinity of the boundary layer between the surface of the compact object and the inner edge of the disk. Lower frequency oscillations are also observed but these are typically associated with the orbital and spin motions of the binary system and may be modulated by the presence of a magnetic field. By studying these variations, we probe the emission regions in these compact systems and gain a better understanding of the accretion process. Time-ordered series of apparent magnitudes for several SSs and CVs, obtained from the American Association of Variable Star Observers (AAVSO), have been analyzed. The analysis techniques include Power Spectral Densities, Rescaled R/S Analysis, and Discrete Wavelet Transforms. The results are used to estimate a Hurst exponent which is a measure of long-range memory dependence and self-similarity.
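The rescaled R/S analysis mentioned above estimates the Hurst exponent as the slope of log(R/S) against log(window size), where R is the range of the mean-adjusted cumulative sum within a window and S its standard deviation. The sketch below is our own illustration on synthetic white noise; the window sizes are arbitrary choices, not the authors' settings.

```python
from math import log
import random

def rescaled_range(series):
    """R/S statistic: range of the mean-adjusted cumulative sum over the std dev."""
    m = sum(series) / len(series)
    cum = lo = hi = dev = 0.0
    for v in series:
        cum += v - m
        lo, hi = min(lo, cum), max(hi, cum)
        dev += (v - m) ** 2
    sd = (dev / len(series)) ** 0.5
    return (hi - lo) / sd

def hurst(series, sizes=(16, 32, 64, 128, 256)):
    """Hurst exponent: least-squares slope of log(mean R/S) versus log(window size)."""
    xs, ys = [], []
    for n in sizes:
        chunks = [series[i:i + n] for i in range(0, len(series) - n + 1, n)]
        xs.append(log(n))
        ys.append(log(sum(rescaled_range(c) for c in chunks) / len(chunks)))
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / sum((a - mx) ** 2 for a in xs)

random.seed(1)
white = [random.gauss(0.0, 1.0) for _ in range(4096)]
h = hurst(white)   # uncorrelated noise gives H near 0.5; persistent series give H > 0.5
```

An H well above 0.5 in a light curve indicates long-range memory in the accretion-driven variability rather than uncorrelated flickering.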
Multi-Granular Trend Detection for Time-Series Analysis.
van Goethem, Arthur; Staals, Frank; Löffler, Maarten; Dykes, Jason; Speckmann, Bettina
2017-01-01
Time series (such as stock prices) and ensembles (such as model runs for weather forecasts) are two important types of one-dimensional time-varying data. Such data is readily available in large quantities but visual analysis of the raw data quickly becomes infeasible, even for moderately sized data sets. Trend detection is an effective way to simplify time-varying data and to summarize salient information for visual display and interactive analysis. We propose a geometric model for trend-detection in one-dimensional time-varying data, inspired by topological grouping structures for moving objects in two- or higher-dimensional space. Our model gives provable guarantees on the trends detected and uses three natural parameters: granularity, support-size, and duration. These parameters can be changed on-demand. Our system also supports a variety of selection brushes and a time-sweep to facilitate refined searches and interactive visualization of (sub-)trends. We explore different visual styles and interactions through which trends, their persistence, and evolution can be explored.
Two Machine Learning Approaches for Short-Term Wind Speed Time-Series Prediction.
Ak, Ronay; Fink, Olga; Zio, Enrico
2016-08-01
The increasing liberalization of European electricity markets, the growing proportion of intermittent renewable energy being fed into the energy grids, and also new challenges in the patterns of energy consumption (such as electric mobility) require flexible and intelligent power grids capable of providing efficient, reliable, economical, and sustainable energy production and distribution. From the supplier side, particularly, the integration of renewable energy sources (e.g., wind and solar) into the grid imposes an engineering and economic challenge because of the limited ability to control and dispatch these energy sources due to their intermittent characteristics. Time-series prediction of wind speed for wind power production is a particularly important and challenging task, wherein prediction intervals (PIs) are preferable results of the prediction, rather than point estimates, because they provide information on the confidence in the prediction. In this paper, two different machine learning approaches to assess PIs of time-series predictions are considered and compared: 1) multilayer perceptron neural networks trained with a multiobjective genetic algorithm and 2) extreme learning machines combined with the nearest neighbors approach. The proposed approaches are applied for short-term wind speed prediction from a real data set of hourly wind speed measurements for the region of Regina in Saskatchewan, Canada. Both approaches demonstrate good prediction precision and provide complementary advantages with respect to different evaluation criteria.
Kennedy, Curtis E; Turley, James P
2011-10-24
Thousands of children experience cardiac arrest events every year in pediatric intensive care units. Most of these children die. Cardiac arrest prediction tools are used as part of medical emergency team evaluations to identify patients in standard hospital beds that are at high risk for cardiac arrest. There are no models to predict cardiac arrest in pediatric intensive care units though, where the risk of an arrest is 10 times higher than for standard hospital beds. Current tools are based on a multivariable approach that does not characterize deterioration, which often precedes cardiac arrests. Characterizing deterioration requires a time series approach. The purpose of this study is to propose a method that will allow for time series data to be used in clinical prediction models. Successful implementation of these methods has the potential to bring arrest prediction to the pediatric intensive care environment, possibly allowing for interventions that can save lives and prevent disabilities. We reviewed prediction models from nonclinical domains that employ time series data, and identified the steps that are necessary for building predictive models using time series clinical data. We illustrate the method by applying it to the specific case of building a predictive model for cardiac arrest in a pediatric intensive care unit. Time course analysis studies from genomic analysis provided a modeling template that was compatible with the steps required to develop a model from clinical time series data. The steps include: 1) selecting candidate variables; 2) specifying measurement parameters; 3) defining data format; 4) defining time window duration and resolution; 5) calculating latent variables for candidate variables not directly measured; 6) calculating time series features as latent variables; 7) creating data subsets to measure model performance effects attributable to various classes of candidate variables; 8) reducing the number of candidate features; 9
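Step 6 in the list above (calculating time series features as latent variables) can be sketched as sliding-window summaries of a monitored vital sign. The window length and the specific features (mean, trend slope, range) are illustrative choices of ours, not the authors' protocol.

```python
def window_features(values, width):
    """Per-window summaries: (mean, least-squares slope, range)."""
    feats = []
    mx = (width - 1) / 2.0
    denom = sum((i - mx) ** 2 for i in range(width))
    for start in range(len(values) - width + 1):
        w = values[start:start + width]
        mean = sum(w) / width
        # least-squares slope of value against in-window index
        slope = sum((i - mx) * (v - mean) for i, v in enumerate(w)) / denom
        feats.append((mean, slope, max(w) - min(w)))
    return feats

# A deteriorating heart-rate trace: stable at 80, then trending upward 3 bpm/sample.
hr = [80.0] * 10 + [80.0 + 3.0 * t for t in range(1, 11)]
feats = window_features(hr, 5)
# early windows: slope ~0 and zero range; late windows: slope 3.0, range 12.0
```

Features like the in-window slope are what allow a model to represent deterioration, which a snapshot multivariable score cannot.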
Beyond linear methods of data analysis: time series analysis and its applications in renal research.
Gupta, Ashwani K; Udrea, Andreea
2013-01-01
Analysis of temporal trends in medicine is needed to understand normal physiology and to study the evolution of disease processes. It is also useful for monitoring response to drugs and interventions, and for accountability and tracking of health care resources. In this review, we discuss what makes time series analysis unique for the purposes of renal research and its limitations. We also introduce nonlinear time series analysis methods and provide examples where these have advantages over linear methods. We review areas where these computational methods have found applications in nephrology ranging from basic physiology to health services research. Some examples include noninvasive assessment of autonomic function in patients with chronic kidney disease, dialysis-dependent renal failure and renal transplantation. Time series models and analysis methods have been utilized in the characterization of mechanisms of renal autoregulation and to identify the interaction between different rhythms of nephron pressure flow regulation. They have also been used in the study of trends in health care delivery. Time series are everywhere in nephrology and analyzing them can lead to valuable knowledge discovery. The study of time trends of vital signs, laboratory parameters and the health status of patients is inherent to our everyday clinical practice, yet formal models and methods for time series analysis are not fully utilized. With this review, we hope to familiarize the reader with these techniques in order to assist in their proper use where appropriate.
Single event time-series analysis in a karst catchment evaluated using a groundwater model
NASA Astrophysics Data System (ADS)
Mayaud, Cyril; Wagner, Thomas; Birk, Steffen
2013-04-01
The Lurbach-Tanneben karst system (Styria, Austria) is drained by two major springs and replenished by both autogenic recharge from the karst massif itself and a sinking stream that originates in low permeable schists (allogenic recharge). Detailed data from two events recorded during a tracer experiment in 2008 (Oswald et al., EGU2009-9255) demonstrate that an overflow from one of the sub-catchments to the other is activated if the spring discharge exceeds a threshold. Time-series analysis (e.g., auto-correlation, cross-correlation) was applied to examine how far the various available methods support the identification of the transient inter-catchment flow observed in this karst system. As inter-catchment flow is intermittent, the evaluation was focused on single events. In order to support the interpretation of the results from the time-series analysis a simplified groundwater flow model was built using MODFLOW based on the current conceptual understanding of the karst system. The groundwater model represents a synthetic karst aquifer for which the same methods were applied. Using the wetting capability package of MODFLOW, the model simulated an overflow similar to what has been observed during the tracer experiment. Various options of recharge (e.g., allogenic versus autogenic) were used to generate synthetic discharge data for the time-series analysis. In addition, geometric and hydraulic properties of the karst system were varied in several model scenarios. This approach helps to identify effects of recharge and aquifer properties in the results from the time-series analysis. Comparing the results from the time-series analysis of the observed data with those of the synthetic data a good agreement was found. For instance, the cross-correlograms show similar patterns with respect to time lags and maximum cross-correlation coefficients if appropriate hydraulic parameters are assigned to the groundwater model. Thus, the heterogeneity of hydraulic aquifer
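The event-scale cross-correlation used above can be sketched as follows: the lag at which the cross-correlogram peaks between an input series (e.g. sinking-stream recharge) and an output series (spring discharge) indicates the system's response time. The data, delay, and dispersion weights below are synthetic illustrations of ours, not the study's measurements.

```python
def cross_correlation(x, y, max_lag):
    """Normalized cross-correlation r_xy(lag) with y delayed by lag = 0..max_lag."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sx = sum((v - mx) ** 2 for v in x) ** 0.5
    sy = sum((v - my) ** 2 for v in y) ** 0.5
    return [sum((x[i] - mx) * (y[i + lag] - my) for i in range(len(x) - lag)) / (sx * sy)
            for lag in range(max_lag + 1)]

# Synthetic event: a recharge pulse and a spring discharge that responds with a
# three-step delay and some dispersion.
recharge = [0.0] * 10 + [5.0, 8.0, 4.0, 1.0] + [0.0] * 26
discharge = [0.0] * len(recharge)
for i, r in enumerate(recharge):
    for delay, weight in ((3, 0.7), (4, 0.2), (5, 0.1)):
        if i + delay < len(discharge):
            discharge[i + delay] += weight * r

ccf = cross_correlation(recharge, discharge, 10)
best_lag = max(range(len(ccf)), key=lambda lag: ccf[lag])   # peaks at the 3-step delay
```

Running the same statistic on synthetic discharge from a groundwater model, as the study does, lets the observed lag structure be tied back to specific recharge and aquifer properties.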
Dequéant, Mary-Lee; Fagegaltier, Delphine; Hu, Yanhui; Spirohn, Kerstin; Simcox, Amanda; Hannon, Gregory J.; Perrimon, Norbert
2015-01-01
The use of time series profiling to identify groups of functionally related genes (synexpression groups) is a powerful approach for the discovery of gene function. Here we apply this strategy during RasV12 immortalization of Drosophila embryonic cells, a phenomenon not well characterized. Using high-resolution transcriptional time-series datasets, we generated a gene network based on temporal expression profile similarities. This analysis revealed that common immortalized cells are related to adult muscle precursors (AMPs), a stem cell-like population contributing to adult muscles and sharing properties with vertebrate satellite cells. Remarkably, the immortalized cells retained the capacity for myogenic differentiation when treated with the steroid hormone ecdysone. Further, we validated in vivo the transcription factor CG9650, the ortholog of mammalian Bcl11a/b, as a regulator of AMP proliferation predicted by our analysis. Our study demonstrates the power of time series synexpression analysis to characterize Drosophila embryonic progenitor lines and identify stem/progenitor cell regulators. PMID:26438832
Evaluating disease management program effectiveness: an introduction to time-series analysis.
Linden, Ariel; Adams, John L; Roberts, Nancy
2003-01-01
Currently, the most widely used method in the disease management (DM) industry for evaluating program effectiveness is referred to as the "total population approach." This model is a pretest-posttest design, with the most basic limitation being that without a control group, there may be sources of bias and/or competing extraneous confounding factors that offer a plausible rationale explaining the change from baseline. Furthermore, with the current inclination of DM programs to use financial indicators rather than program-specific utilization indicators as the principal measure of program success, additional biases are introduced that may cloud evaluation results. This paper presents a non-technical introduction to time-series analysis (using disease-specific utilization measures) as an alternative, and more appropriate, approach to evaluating DM program effectiveness than the current total population approach.
Boschi, Vladimir; Pogorelic, Zenon; Gulan, Gordan; Vilovic, Katarina; Stalekar, Hrvoje; Bilan, Kanito; Grandic, Leo
2013-01-01
Background There are few surgical approaches for treating humeral shaft fractures. Here we present our results using a subbrachial approach. Methods We conducted a retrospective case series involving patients who had surgery for a humeral shaft fracture between January 1994 and January 2008. We divided patients into 4 groups based on the surgical approach (anterior, anterolateral, posterior, subbrachial). In all patients, an AO 4.5 mm dynamic compression plate was used. Results During our study period, 280 patients aged 30–36 years underwent surgery for a humeral shaft fracture. The average duration of surgery was shortest using the subbrachial approach (40 min). The average loss of muscle strength was 40% for the anterolateral, 48% for the posterior, 42% for the anterior and 20% for the subbrachial approaches. The average loss of tension in the brachialis muscle after 4 months was 61% for the anterolateral, 48% for the anterior and 11% for the subbrachial approaches. Sixteen patients in the anterolateral and anterior groups and 6 patients in the posterior group experienced intraoperative lesions of the radial nerve. No postoperative complications were observed in the subbrachial group. Conclusion The subbrachial approach is practical and effective. The average duration of the surgery is shortened by half, loss of the muscle strength is minimal, and patients can resume everyday activities within 4 months. No patients in the subbrachial group experienced injuries to the radial or musculocutaneous nerves. PMID:23187037
Li, Shi; Mukherjee, Bhramar; Batterman, Stuart; Ghosh, Malay
2013-12-01
Case-crossover designs are widely used to study short-term exposure effects on the risk of acute adverse health events. While the frequentist literature on this topic is vast, there is no Bayesian work in this general area. The contribution of this paper is twofold. First, the paper establishes Bayesian equivalence results that require characterization of the set of priors under which the posterior distributions of the risk ratio parameters based on a case-crossover and time-series analysis are identical. Second, the paper studies inferential issues under case-crossover designs in a Bayesian framework. Traditionally, a conditional logistic regression is used for inference on risk-ratio parameters in case-crossover studies. We consider instead a more general full likelihood-based approach which makes less restrictive assumptions on the risk functions. Formulation of a full likelihood leads to growth in the number of parameters proportional to the sample size. We propose a semi-parametric Bayesian approach using a Dirichlet process prior to handle the random nuisance parameters that appear in a full likelihood formulation. We carry out a simulation study to compare the Bayesian methods based on full and conditional likelihood with the standard frequentist approaches for case-crossover and time-series analysis. The proposed methods are illustrated through the Detroit Asthma Morbidity, Air Quality and Traffic study, which examines the association between acute asthma risk and ambient air pollutant concentrations.
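A minimal sketch of the conditional-likelihood machinery the paper starts from: in a case-crossover stratum, the subject's baseline risk cancels out of the conditional probability, leaving a likelihood in the log risk ratio alone. The exposure values below are hypothetical, and the optimizer is a simple golden-section search, not the paper's Bayesian formulation.

```python
import math

def cond_log_lik(beta, strata):
    """Conditional logistic log-likelihood for case-crossover data.
    Each stratum is (case_exposure, [referent_exposures]); the subject's
    nuisance baseline risk cancels out of the conditional probability."""
    ll = 0.0
    for case_x, ref_xs in strata:
        denom = sum(math.exp(beta * x) for x in [case_x] + ref_xs)
        ll += beta * case_x - math.log(denom)
    return ll

def fit_beta(strata, lo=-5.0, hi=5.0, tol=1e-6):
    """Maximize the concave log-likelihood by golden-section search."""
    g = (math.sqrt(5) - 1) / 2
    a, b = lo, hi
    while b - a > tol:
        c, d = b - g * (b - a), a + g * (b - a)
        if cond_log_lik(c, strata) < cond_log_lik(d, strata):
            a = c
        else:
            b = d
    return (a + b) / 2

# Hypothetical pollutant exposures: cases tend to have higher exposure than
# their own referent days, so the fitted log risk ratio should be positive.
strata = [(3.0, [1.0, 1.5]), (2.5, [2.0, 1.0]), (4.0, [2.0, 3.0]),
          (1.5, [1.0, 2.0]), (3.5, [1.5, 2.5])]
beta_hat = fit_beta(strata)
```

The full-likelihood approach the paper advocates keeps the per-stratum baseline terms instead of conditioning them away, which is what creates the growing nuisance-parameter set handled by the Dirichlet process prior.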
A New Approach To Teaching Dimensional Analysis.
ERIC Educational Resources Information Center
Churchill, Stuart W.
1997-01-01
Explains an approach to teaching dimensional analysis that differs slightly from the traditional approach. The difference lies in the novelty of exposition in the presentation and interpretation of dimensional analysis as a speculative process. (DDR)
Traffic time series analysis by using multiscale time irreversibility and entropy.
Wang, Xuejiao; Shang, Pengjian; Fang, Jintang
2014-09-01
Traffic systems, especially urban traffic systems, are regulated by different kinds of interacting mechanisms which operate across multiple spatial and temporal scales. Traditional approaches, such as the empirical probability distribution function and detrended fluctuation analysis, fail to account for the multiple time scales inherent in time series and have led to conflicting results. The role of multiscale analytical methods in traffic time series is a frontier area of investigation. In this paper, our main purpose is to introduce a new method, multiscale time irreversibility, which helps to extract information from the traffic time series we studied. In addition, to analyse the complexity of traffic volume time series of the Beijing Ring 2, 3, and 4 roads on workdays and weekends, covering the period from August 18, 2012 to October 26, 2012, we compare the results of this new method with those of the well-known multiscale entropy method. The results show that a higher asymmetry index corresponds to a higher traffic congestion level, in accord with the results obtained by multiscale entropy.
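One simple way to operationalize a multiscale irreversibility index is sketched below: coarse-grain the series at each scale, then measure the imbalance between rises and falls in the coarse-grained increments. This is an illustrative definition under our own assumptions, not the exact index used in the paper.

```python
def coarse_grain(x, s):
    """Non-overlapping window averages at scale s (standard multiscale coarse-graining)."""
    return [sum(x[i:i + s]) / s for i in range(0, len(x) - s + 1, s)]

def asymmetry_index(x, s):
    """A simple time-irreversibility index at scale s: normalized excess of
    rises over falls among the nonzero coarse-grained increments.
    0 for a time-symmetric series; larger magnitude = more irreversible."""
    y = coarse_grain(x, s)
    d = [b - a for a, b in zip(y, y[1:]) if b != a]
    if not d:
        return 0.0
    up = sum(1 for v in d if v > 0)
    return (up - (len(d) - up)) / len(d)

ramp = list(range(100))                # strictly increasing: maximally irreversible
zigzag = [i % 2 for i in range(100)]   # alternating: rises and falls balance out
```

A monotone trend scores 1 at every scale, while the alternating signal scores near 0 at scale 1 and exactly 0 once coarse-graining averages out the oscillation.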
Predicting microRNA targets in time-series microarray experiments via functional data analysis.
Parker, Brian J; Wen, Jiayu
2009-01-30
MicroRNA (miRNA) target prediction is an important component in understanding gene regulation. One approach is computational: searching nucleotide sequences for miRNA complementary base pairing. An alternative approach explored in this paper is the use of gene expression profiles from time-series microarray experiments to aid in miRNA target prediction. This requires distinguishing genuine targets from genes that are secondarily down-regulated as part of the same regulatory module. We use a functional data analytic (FDA) approach, FDA being a subfield of statistics that extends standard multivariate techniques to datasets with predictor and/or response variables that are functional. In a miR-124 transfection experiment spanning 120 hours, for genes with measurably down-regulated mRNA, exploratory functional data analysis showed differences in expression profiles over time between directly and indirectly down-regulated genes, such as response latency and biphasic response for direct miRNA targets. For prediction, an FDA approach was shown to effectively classify direct miR-124 targets from time-series microarray data (accuracy 88%; AUC 0.96), providing better performance than multivariate approaches. Exploratory FDA analysis can reveal interesting aspects of dynamic microarray miRNA studies. Predictive FDA models can be applied where computational miRNA target predictors fail or are unreliable, e.g. when there is a lack of evolutionary conservation, and can provide posterior probabilities to provide additional confirmatory evidence to validate candidate miRNA targets computationally predicted using sequence information. This approach would be applicable to the investigation of other miRNAs and suggests that dynamic microarray studies at a higher time resolution could reveal further details on miRNA regulation.
Analysis of epileptic seizure count time series by ensemble state space modelling.
Galka, Andreas; Boor, Rainer; Doege, Corinna; von Spiczak, Sarah; Stephani, Ulrich; Siniatchkin, Michael
2015-08-01
We propose an approach for the analysis of epileptic seizure count time series within a state space framework. Time-dependent dosages of several simultaneously administered anticonvulsants are included as external inputs. The method aims at distinguishing which temporal correlations in the data are due to the medications, and which correspond to an unrelated background signal. Through this method it becomes possible to disentangle the effects of the individual anticonvulsants, i.e., to decide which anticonvulsant in a particular patient decreases, or rather increases, the number of seizures.
NASA Astrophysics Data System (ADS)
Chen, Wei-Shing
2011-04-01
The aim of this article is to answer the question of whether Taiwan's unemployment rate dynamics are generated by a non-linear deterministic process. This paper applies a recurrence plot and recurrence quantification approach based on the analysis of non-stationary hidden transition patterns in Taiwan's unemployment rate. The case study uses time series data on Taiwan's unemployment rate for the period from 1978/01 to 2010/06. The results show that recurrence techniques are able to identify various phases in the evolution of unemployment transitions in Taiwan.
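The recurrence-plot machinery behind this kind of study can be sketched as follows: embed the scalar series with a time delay, mark every pair of embedded states closer than a tolerance ε, and summarize the resulting binary matrix (here with the recurrence rate, the simplest recurrence quantification measure). The signal and parameter values below are illustrative, not the unemployment data.

```python
import math

def embed(x, dim, tau):
    """Time-delay embedding of a scalar series into dim-dimensional state vectors."""
    n = len(x) - (dim - 1) * tau
    return [[x[i + j * tau] for j in range(dim)] for i in range(n)]

def recurrence_matrix(x, dim=2, tau=1, eps=0.1):
    """Binary recurrence matrix: R[i][j] = 1 if embedded states i and j are
    within Euclidean distance eps of each other."""
    pts = embed(x, dim, tau)
    n = len(pts)
    return [[1 if math.dist(pts[i], pts[j]) <= eps else 0 for j in range(n)]
            for i in range(n)]

def recurrence_rate(R):
    """Fraction of recurrent (i, j) pairs: the simplest RQA measure."""
    n = len(R)
    return sum(map(sum, R)) / (n * n)

# A periodic signal revisits its states exactly once per cycle, so
# off-diagonal recurrence lines appear at multiples of the period.
x = [math.sin(2 * math.pi * i / 20) for i in range(100)]
R = recurrence_matrix(x, dim=2, tau=5, eps=0.2)
```

For a period-20 signal, state 0 and state 20 coincide, so `R[0][20]` is 1; richer RQA measures (determinism, laminarity) are built from the diagonal and vertical line structures of this same matrix.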
Water Resources Management Plan for Ganga River using SWAT Modelling and Time series Analysis
NASA Astrophysics Data System (ADS)
Satish, L. N. V.
2015-12-01
Water resources management of the Ganga River is one of the primary objectives of the National Ganga River Basin Environmental Management Plan. The present study aims to carry out a water balance study and to develop appropriate methodologies to compute environmental flow in the middle Ganga river basin between Patna and Farakka, India. The methodology adopted here is to set up a hydrological model to estimate monthly discharge of the tributaries under natural conditions, to perform hydrological alteration analysis of both observed and simulated discharge series, to assess the status of stream health over the last 4 decades with a flow health analysis, and to estimate the e-flow using flow health indicators. ArcSWAT was used to simulate 8 tributaries, namely the Kosi, the Gandak, and others. The modelling is quite encouraging and provides a monthly water balance analysis for all tributaries in this study. The water balance analysis indicates a significant change in the surface water and groundwater interaction pattern within the study period. Indicators of hydrological alteration have been used for both observed and simulated data series to quantify the hydrological alteration that occurred in the tributaries and the main river over the last 4 decades. For the temporal variation of stream health, the flow health tool has been used on observed and simulated discharge data. A detailed stream health analysis has been performed using 3 approaches based on (i) observed flow time series, (ii) observed and simulated flow time series, and (iii) simulated flow time series, at the small upland basin, major tributary, and main Ganga river basin levels. At the upland basin level, these approaches show that stream health is good, with non-significant temporal variation. At the major tributary level, stream health is found to have been deteriorating since the 1970s. At the main Ganga reach level, river health and its temporal variations do not show any declining trend. Finally, E-flows
The Prediction of Teacher Turnover Employing Time Series Analysis.
ERIC Educational Resources Information Center
Costa, Crist H.
The purpose of this study was to combine knowledge of teacher demographic data with time-series forecasting methods to predict teacher turnover. Moving averages and exponential smoothing were used to forecast discrete time series. The study used data collected from the 22 largest school districts in Iowa, designated as FACT schools. Predictions…
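The two forecasting methods named in this study are simple enough to state directly. The turnover figures below are hypothetical, not the Iowa data.

```python
def moving_average_forecast(x, window):
    """One-step-ahead forecast: mean of the last `window` observations."""
    return sum(x[-window:]) / window

def exponential_smoothing(x, alpha):
    """Simple exponential smoothing; the final level is also the
    one-step-ahead forecast. alpha in (0, 1] weights recent data."""
    level = x[0]
    for obs in x[1:]:
        level = alpha * obs + (1 - alpha) * level
    return level

# Hypothetical annual teacher turnover counts for one district.
turnover = [120, 130, 125, 140, 135, 150]
```

With `alpha = 1.0` exponential smoothing degenerates to the naive "last value" forecast, and small `alpha` approaches a long-run mean, which is the trade-off a forecaster tunes against historical error.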
On the Fourier and Wavelet Analysis of Coronal Time Series
NASA Astrophysics Data System (ADS)
Auchère, F.; Froment, C.; Bocchialini, K.; Buchlin, E.; Solomon, J.
2016-07-01
Using Fourier and wavelet analysis, we critically re-assess the significance of our detection of periodic pulsations in coronal loops. We show that the proper identification of the frequency dependence and statistical properties of the different components of the power spectra provides a strong argument against the common practice of data detrending, which tends to produce spurious detections around the cut-off frequency of the filter. In addition, the white and red noise models built into the widely used wavelet code of Torrence & Compo cannot, in most cases, adequately represent the power spectra of coronal time series, thus also possibly causing false positives. Both effects suggest that several reports of periodic phenomena should be re-examined. The Torrence & Compo code nonetheless effectively computes rigorous confidence levels if provided with pertinent models of mean power spectra, and we describe the appropriate manner in which to call its core routines. We recall the meaning of the default confidence levels output from the code, and we propose new Monte-Carlo-derived levels that take into account the total number of degrees of freedom in the wavelet spectra. These improvements allow us to confirm that the power peaks that we detected have a very low probability of being caused by noise.
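The red-noise background test discussed above can be sketched as follows: compare each periodogram ordinate against the theoretical AR(1) spectrum scaled by a chi-square factor, since each ordinate is approximately the mean spectrum times χ²₂/2, and the 95th percentile of χ²₂ is about 5.99. Normalization conventions vary between implementations; this is an illustrative sketch, not the authors' code.

```python
import math

def ar1_spectrum(a, nfreq):
    """Theoretical power spectrum of an AR(1) ('red noise') process with
    lag-1 autocorrelation a, at nfreq frequencies in (0, 0.5]."""
    return [(1 - a * a) / (1 - 2 * a * math.cos(2 * math.pi * f) + a * a)
            for f in (0.5 * (k + 1) / nfreq for k in range(nfreq))]

def significance_threshold(mean_spectrum, chi2_95=5.99):
    """95% false-alarm level: each periodogram ordinate is distributed as
    (mean/2) * chi-square with 2 dof, so the threshold at every frequency
    is the mean spectrum times chi2_95 / 2."""
    return [s * chi2_95 / 2.0 for s in mean_spectrum]

background = ar1_spectrum(a=0.7, nfreq=64)
threshold = significance_threshold(background)
```

A peak is only a candidate detection if it clears `threshold` at its own frequency; testing against a flat (white) background instead would flag ordinary low-frequency red-noise power as signal, which is the false-positive mechanism the paper warns about.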
Stochastic time series analysis of fetal heart-rate variability
NASA Astrophysics Data System (ADS)
Shariati, M. A.; Dripps, J. H.
1990-06-01
Fetal heart rate (FHR) is one of the important features of fetal biophysical activity, and its long-term monitoring is used for the antepartum (the period of pregnancy before labour) assessment of fetal well-being. But as yet no successful method has been proposed to quantitatively represent the variety of random non-white patterns seen in FHR. The objective of this paper is to address this issue. In this study, the Box-Jenkins method of model identification and diagnostic checking was used on phonocardiographically derived (averaged) FHR time series. Models remained exclusively autoregressive (AR). Kalman filtering in conjunction with a maximum likelihood estimation technique forms the parametric estimator. Diagnostics performed on the residuals indicated that a second-order model may be adequate in capturing the type of variability observed in 1 to 2 min data windows of FHR. The scheme may be viewed as a means of data reduction of a highly redundant information source. This allows a much more efficient transmission of FHR information from remote locations to places with the facilities and expertise for closer analysis. The extracted parameters are intended to reflect numerically the important FHR features that are normally picked up visually by experts for their assessments. As a result, long-term FHR recorded during the antepartum period could then be screened quantitatively for the detection of patterns considered normal or abnormal.
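The Box-Jenkins AR fitting described here can be sketched with Yule-Walker estimation via the Levinson-Durbin recursion, with AIC used to compare candidate orders. This replaces the paper's Kalman-filter maximum-likelihood estimator with a simpler moment-based fit, and the simulated data are illustrative rather than FHR recordings.

```python
import math, random

def autocorr(x, maxlag):
    """Sample autocorrelations r[0..maxlag] (biased estimator)."""
    n = len(x)
    m = sum(x) / n
    c0 = sum((v - m) ** 2 for v in x) / n
    return [sum((x[i] - m) * (x[i + k] - m) for i in range(n - k)) / (n * c0)
            for k in range(maxlag + 1)]

def levinson_durbin(r, order):
    """Yule-Walker AR(order) fit from autocorrelations r[0..order].
    Returns (phi coefficients, innovation variance as a fraction of c0)."""
    phi, sigma2 = [], 1.0
    for k in range(1, order + 1):
        ref = (r[k] - sum(phi[j] * r[k - 1 - j] for j in range(k - 1))) / sigma2
        phi = [phi[j] - ref * phi[k - 2 - j] for j in range(k - 1)] + [ref]
        sigma2 *= (1 - ref * ref)
    return phi, sigma2

def select_ar_order(x, max_order):
    """Choose the AR order minimizing AIC = n*log(sigma2) + 2*order."""
    r = autocorr(x, max_order)
    best = None
    for p in range(1, max_order + 1):
        _, s2 = levinson_durbin(r, p)
        aic = len(x) * math.log(s2) + 2 * p
        if best is None or aic < best[1]:
            best = (p, aic)
    return best[0]

# Simulate a stationary AR(2) process and recover its structure (seeded).
random.seed(1)
x = [0.0, 0.0]
for _ in range(3000):
    x.append(0.75 * x[-1] - 0.5 * x[-2] + random.gauss(0, 1))
phi_hat, _ = levinson_durbin(autocorr(x, 2), 2)
```

The diagnostic-checking step of Box-Jenkins then examines the residuals of the selected model for remaining autocorrelation, mirroring the residual checks the paper performs.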
Learning Bayesian networks for clinical time series analysis.
van der Heijden, Maarten; Velikova, Marina; Lucas, Peter J F
2014-04-01
Autonomous chronic disease management requires models that are able to interpret time series data from patients. However, construction of such models by means of machine learning requires the availability of costly health-care data, often resulting in small samples. We analysed data from chronic obstructive pulmonary disease (COPD) patients with the goal of constructing a model to predict the occurrence of exacerbation events, i.e., episodes of decreased pulmonary health status. Data from 10 COPD patients, gathered with our home monitoring system, were used for temporal Bayesian network learning, combined with bootstrapping methods for data analysis of small data samples. For comparison a temporal variant of augmented naive Bayes models and a temporal nodes Bayesian network (TNBN) were constructed. The performances of the methods were first tested with synthetic data. Subsequently, different COPD models were compared to each other using an external validation data set. The model learning methods are capable of finding good predictive models for our COPD data. Model averaging over models based on bootstrap replications is able to find a good balance between true and false positive rates on predicting COPD exacerbation events. Temporal naive Bayes offers an alternative that trades some performance for a reduction in computation time and easier interpretation. Copyright © 2013 Elsevier Inc. All rights reserved.
Chaotic time series analysis of vision evoked EEG
NASA Astrophysics Data System (ADS)
Zhang, Ningning; Wang, Hong
2009-12-01
To investigate human brain activities during aesthetic processing, a beautiful woman face picture and an ugly buffoon face picture were used. Twelve subjects were assigned the aesthetic processing task while the electroencephalogram (EEG) was recorded. Event-related brain potentials (ERPs) were acquired from the 32 scalp electrodes, and the ugly buffoon picture produced larger amplitudes for the N1, P2, N2, and late slow wave components. Average ERPs from the ugly buffoon picture were larger than those from the beautiful woman picture. The ERP signals show that the ugly buffoon elicits higher emotion waves than the beautiful woman face, because of the expression on the buffoon's face. Chaotic time series analysis was then carried out to calculate the largest Lyapunov exponent, using the small data set method, and the correlation dimension, using the G-P algorithm. The results show that the largest Lyapunov exponents of the ERP signals are greater than zero, which indicates that the ERP signals may be chaotic. The correlation dimensions from the beautiful woman picture are larger than those from the ugly buffoon picture, suggesting that the beautiful face can excite the brain nerve cells. The research in this paper is persuasive support for the view that the cerebrum's activity is chaotic under some picture stimuli.
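The Grassberger-Procaccia (G-P) correlation-dimension estimate used in this study reduces to computing the correlation sum C(r), the fraction of point pairs closer than r, and its log-log slope. The sketch below applies it to a delay-embedded sinusoid, whose attractor is a closed curve of dimension 1; the radii and embedding parameters are illustrative choices, not the paper's.

```python
import math

def correlation_sum(pts, r):
    """Grassberger-Procaccia correlation sum C(r): fraction of point pairs
    closer than r (Euclidean distance)."""
    n = len(pts)
    count = sum(1 for i in range(n) for j in range(i + 1, n)
                if math.dist(pts[i], pts[j]) < r)
    return 2.0 * count / (n * (n - 1))

def correlation_dimension(pts, r1, r2):
    """Slope of log C(r) between two radii: a crude two-point estimate of
    the correlation dimension D2."""
    return (math.log(correlation_sum(pts, r2)) - math.log(correlation_sum(pts, r1))) \
           / (math.log(r2) - math.log(r1))

# A sinusoid sampled at an incommensurate rate fills a closed curve (an
# ellipse) when delay-embedded in 2-D, so D2 should come out near 1.
x = [math.sin(i * 1.0) for i in range(400)]
pts = [(x[i], x[i + 2]) for i in range(398)]
d2 = correlation_dimension(pts, 0.1, 0.4)
```

In practice D2 is read off from a fitted scaling region over many radii rather than two, and a positive largest Lyapunov exponent is checked separately, as the paper does with the small-data-set method.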
Nonlinear time series analysis of normal and pathological human walking
NASA Astrophysics Data System (ADS)
Dingwell, Jonathan B.; Cusumano, Joseph P.
2000-12-01
Characterizing locomotor dynamics is essential for understanding the neuromuscular control of locomotion. In particular, quantifying dynamic stability during walking is important for assessing people who have a greater risk of falling. However, traditional biomechanical methods of defining stability have not quantified the resistance of the neuromuscular system to perturbations, suggesting that more precise definitions are required. For the present study, average maximum finite-time Lyapunov exponents were estimated to quantify the local dynamic stability of human walking kinematics. Local scaling exponents, defined as the local slopes of the correlation sum curves, were also calculated to quantify the local scaling structure of each embedded time series. Comparisons were made between overground and motorized treadmill walking in young healthy subjects and between diabetic neuropathic (NP) patients and healthy controls (CO) during overground walking. A modification of the method of surrogate data was developed to examine the stochastic nature of the fluctuations overlying the nominally periodic patterns in these data sets. Results demonstrated that having subjects walk on a motorized treadmill artificially stabilized their natural locomotor kinematics by small but statistically significant amounts. Furthermore, a paradox previously present in the biomechanical literature that resulted from mistakenly equating variability with dynamic stability was resolved. By slowing their self-selected walking speeds, NP patients adopted more locally stable gait patterns, even though they simultaneously exhibited greater kinematic variability than CO subjects. Additionally, the loss of peripheral sensation in NP patients was associated with statistically significant differences in the local scaling structure of their walking kinematics at those length scales where it was anticipated that sensory feedback would play the greatest role. Lastly, stride-to-stride fluctuations in the
Analytical framework for recurrence network analysis of time series
NASA Astrophysics Data System (ADS)
Donges, Jonathan F.; Heitzig, Jobst; Donner, Reik V.; Kurths, Jürgen
2012-04-01
Recurrence networks are a powerful nonlinear tool for time series analysis of complex dynamical systems. While there are already many successful applications ranging from medicine to paleoclimatology, a solid theoretical foundation of the method has still been missing so far. Here, we interpret an ɛ-recurrence network as a discrete subnetwork of a “continuous” graph with uncountably many vertices and edges corresponding to the system's attractor. This step allows us to show that various statistical measures commonly used in complex network analysis can be seen as discrete estimators of newly defined continuous measures of certain complex geometric properties of the attractor on the scale given by ɛ. In particular, we introduce local measures such as the ɛ-clustering coefficient, mesoscopic measures such as ɛ-motif density, path-based measures such as ɛ-betweennesses, and global measures such as ɛ-efficiency. This new analytical basis for the so far heuristically motivated network measures also provides an objective criterion for the choice of ɛ via a percolation threshold, and it shows that estimation can be improved by so-called node splitting invariant versions of the measures. We finally illustrate the framework for a number of archetypical chaotic attractors such as those of the Bernoulli and logistic maps, periodic and two-dimensional quasiperiodic motions, and for hyperballs and hypercubes by deriving analytical expressions for the novel measures and comparing them with data from numerical experiments. More generally, the theoretical framework put forward in this work describes random geometric graphs and other networks with spatial constraints, which appear frequently in disciplines ranging from biology to climate science.
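The discrete estimators described above are straightforward to compute from data: build the ε-recurrence network by linking state vectors closer than ε, then evaluate standard network measures on it. The sketch below computes the local clustering coefficient on a one-dimensional point cloud, where the continuous ε-clustering coefficient is known to approach 3/4; the point set and ε are illustrative choices.

```python
import math

def recurrence_network(pts, eps):
    """Adjacency sets of the eps-recurrence network: vertices are state
    vectors, edges join pairs closer than eps (no self-loops)."""
    n = len(pts)
    adj = [set() for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(pts[i], pts[j]) < eps:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def clustering_coefficient(adj, v):
    """Local clustering of vertex v: fraction of its neighbour pairs that
    are themselves linked (the discrete estimator of the eps-clustering
    coefficient)."""
    nb = list(adj[v])
    k = len(nb)
    if k < 2:
        return 0.0
    links = sum(1 for a in range(k) for b in range(a + 1, k) if nb[b] in adj[nb[a]])
    return 2.0 * links / (k * (k - 1))

# Points filling a line segment: a one-dimensional "attractor", for which
# the continuous eps-clustering coefficient tends to 3/4 for interior points.
pts_line = [(i / 1000.0, 0.0) for i in range(1000)]
adj = recurrence_network(pts_line, eps=0.05)
cc = clustering_coefficient(adj, 500)
```

Analytical values like this 3/4 limit are exactly the kind of benchmark the framework provides for validating the heuristic network measures against known geometries.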
Time series analysis of collective motions in proteins
NASA Astrophysics Data System (ADS)
Alakent, Burak; Doruker, Pemra; Çamurdan, Mehmet C.
2004-01-01
The dynamics of α-amylase inhibitor tendamistat around its native state is investigated using time series analysis of the principal components of the Cα atomic displacements obtained from molecular dynamics trajectories. Collective motion along a principal component is modeled as a homogeneous nonstationary process, which is the result of damped oscillations in local minima superimposed on a random walk. The motion in local minima is described by a stationary autoregressive moving average model, consisting of the frequency, damping factor, moving average parameters and random shock terms. Frequencies for the first 50 principal components are found to be in the 3-25 cm-1 range, which are well correlated with the principal component indices and also with atomistic normal mode analysis results. Damping factors, though their correlation is less pronounced, decrease as principal component indices increase, indicating that low frequency motions are less affected by friction. The existence of a positive moving average parameter indicates that the stochastic force term is likely to disturb the mode in opposite directions for two successive sampling times, showing the mode's tendency to stay close to a minimum. All four of these parameters affect the mean square fluctuations of a principal mode within a single minimum. The inter-minima transitions are described by a random walk model, which is driven by a random shock term considerably smaller than that for the intra-minimum motion. The principal modes are classified into three subspaces based on their dynamics: essential, semiconstrained, and constrained, at least in partial consistency with previous studies. The Gaussian-type distributions of the intermediate modes, called "semiconstrained" modes, are explained by asserting that this random walk behavior is not completely free but confined between energy barriers.
Seasonal dynamics of bacterial meningitis: a time-series analysis.
Paireau, Juliette; Chen, Angelica; Broutin, Helene; Grenfell, Bryan; Basta, Nicole E
2016-06-01
Bacterial meningitis, which is caused mainly by Neisseria meningitidis, Haemophilus influenzae, and Streptococcus pneumoniae, inflicts a substantial burden of disease worldwide. Yet, the temporal dynamics of this disease are poorly characterised and many questions remain about the ecology of the disease. We aimed to comprehensively assess seasonal trends in bacterial meningitis on a global scale. We developed the first bacterial meningitis global database by compiling monthly incidence data as reported by country-level surveillance systems. Using country-level wavelet analysis, we identified whether a 12 month periodic component (annual seasonality) was detected in time-series that had at least 5 years of data with at least 40 cases reported per year. We estimated the mean timing of disease activity by computing the centre of gravity of the distribution of cases and investigated whether synchrony exists between the three pathogens responsible for most cases of bacterial meningitis. We used country-level data from 66 countries, including from 47 countries outside the meningitis belt in sub-Saharan Africa. A persistent seasonality was detected in 49 (96%) of the 51 time-series from 38 countries eligible for inclusion in the wavelet analyses. The mean timing of disease activity had a latitudinal trend, with bacterial meningitis seasons peaking during the winter months in countries in both the northern and southern hemispheres. The three pathogens shared similar seasonality, but time-shifts differed slightly by country. Our findings provide key insight into the seasonal dynamics of bacterial meningitis and add to knowledge about the global epidemiology of meningitis and the host, environment, and pathogen characteristics driving these patterns. Comprehensive understanding of global seasonal trends in meningitis could be used to design more effective prevention and control strategies. Princeton University Health Grand Challenge, US National Institutes of Health (NIH
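The "centre of gravity" of a case distribution used above is a circular mean over the annual cycle, which handles seasons that straddle the December-January boundary. A minimal sketch, with hypothetical monthly counts:

```python
import math

def center_of_gravity(monthly_cases):
    """Mean timing of disease activity: the circular mean of the monthly
    case distribution, returned as a month position in [0, 12).
    Month m is placed at its midpoint angle 2*pi*(m + 0.5)/12."""
    total = sum(monthly_cases)
    ang = [2 * math.pi * (m + 0.5) / 12 for m in range(12)]
    s = sum(c * math.sin(a) for c, a in zip(monthly_cases, ang)) / total
    c = sum(c_ * math.cos(a) for c_, a in zip(monthly_cases, ang)) / total
    return (math.atan2(s, c) % (2 * math.pi)) * 12 / (2 * math.pi)

# Hypothetical northern-hemisphere series peaking around January (month 0).
cases = [120, 90, 60, 30, 15, 10, 10, 15, 30, 60, 90, 110]
```

An ordinary (linear) mean of the month indices would misplace this winter peak near mid-year; the circular mean correctly lands near the December-January boundary.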
Stochastic approaches for time series forecasting of boron: a case study of Western Turkey.
Durdu, Omer Faruk
2010-10-01
In the present study, seasonal and non-seasonal predictions of boron concentration time series data for the period 1996-2004 from the Büyük Menderes river in western Turkey are addressed by means of linear stochastic models. The methodology presented here is to develop adequate linear stochastic models, known as autoregressive integrated moving average (ARIMA) and multiplicative seasonal autoregressive integrated moving average (SARIMA) models, to predict boron content in the Büyük Menderes catchment. Initially, Box-Whisker plots and Kendall's tau test are used to identify trends during the study period. The measurement locations do not show a significant overall trend in boron concentrations, though marginal increasing and decreasing trends are observed for certain periods at some locations. The ARIMA modeling approach involves the following three steps: model identification, parameter estimation, and diagnostic checking. In the model identification step, considering the autocorrelation function (ACF) and partial autocorrelation function (PACF) of the boron data series, different ARIMA models are identified. The model giving the minimum Akaike information criterion (AIC) is selected as the best-fit model. The parameter estimation step indicates that the estimated model parameters are significantly different from zero. The diagnostic check step is applied to the residuals of the selected ARIMA models, and the results indicate that the residuals are independent, normally distributed, and homoscedastic. For model validation purposes, the predicted results from the best ARIMA models are compared to the observed data. The predicted data show reasonably good agreement with the actual data. The comparison of the mean and variance of the 3-year (2002-2004) observed vs predicted data from the selected best models shows that the boron models from the ARIMA modeling approach could be used in a safe manner, since the predicted values from these models preserve the basic
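Kendall's tau, used above for trend identification, can be computed directly against time by counting concordant and discordant pairs. The boron values below are hypothetical, not the Büyük Menderes data.

```python
def kendall_tau(x):
    """Kendall's tau of a series against time (tau-a: tied pairs are not
    counted as concordant or discordant but remain in the denominator).
    +1 for strictly increasing, -1 for strictly decreasing, near 0 for
    trendless data."""
    n = len(x)
    conc = disc = 0
    for i in range(n):
        for j in range(i + 1, n):
            if x[j] > x[i]:
                conc += 1
            elif x[j] < x[i]:
                disc += 1
    return (conc - disc) / (n * (n - 1) / 2)

# Hypothetical boron concentrations (mg/L) with no clear overall trend.
boron = [0.31, 0.28, 0.35, 0.30, 0.33, 0.29, 0.34, 0.30]
```

In the Mann-Kendall trend test, this statistic is then compared against its null distribution to decide whether the monotone trend is significant, which is the screening step that precedes the ARIMA fitting above.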
Multivariate change point analysis in time series for volcano unrest detection
NASA Astrophysics Data System (ADS)
Aliotta, M. A.; Cassisi, C.; Fiumara, S.; Montalto, P.
2016-12-01
The detection of unrest in volcanic areas represents a key task for civil protection purposes. Nowadays, large networks for different kinds of measurements deployed on most active volcanoes supply huge amounts of data, mainly in the form of time series. Automatic techniques are needed to perform the analysis of such amounts of data. In this sense, time series analysis techniques can help exploit the information coming from the measurements to identify possible changes in volcanic behaviour. In particular, change point analysis can be used to this aim. Change point analysis is the process of detecting distributional changes within time-ordered observations. Among the different techniques proposed for this kind of analysis, we chose the SeqDrift (Sakthithasan et al., 2013) technique for its ability to deal with real-time data. The algorithm iteratively compares two consecutive sliding windows coming from the data stream to decide whether the boundary point (between the two windows) is a change point. The check is carried out by a non-parametric statistical test. We applied the proposed approach to a test case on Mt. Etna using a large multivariate dataset from 2011-2015. The results indicate that the technique is effective in detecting volcanic state changes. Sakthithasan, S., Pears, R., Koh, Y. S. (2013). One Pass Concept Change Detection for Data Streams. PAKDD (2): 461-472.
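The two-window scheme described above can be sketched with a generic non-parametric test. Here a two-sample Kolmogorov-Smirnov statistic stands in for SeqDrift's test, and the data stream and threshold are hypothetical.

```python
import bisect

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the largest gap between the
    empirical CDFs of the two samples."""
    sa, sb = sorted(a), sorted(b)
    return max(abs(bisect.bisect_right(sa, v) / len(sa)
                   - bisect.bisect_right(sb, v) / len(sb))
               for v in sa + sb)

def detect_change(x, window, threshold=0.6):
    """Slide two adjacent windows over the series and flag the first boundary
    where their distributions differ by more than `threshold` (a simple
    stand-in for the sequential test used by SeqDrift)."""
    for t in range(window, len(x) - window + 1):
        if ks_statistic(x[t - window:t], x[t:t + window]) > threshold:
            return t
    return None

# Hypothetical tremor-amplitude stream with a regime shift at t = 50.
stream = [0.1 + 0.001 * (i % 7) for i in range(50)] + \
         [0.5 + 0.001 * (i % 7) for i in range(50)]
```

Because the test is distributional rather than mean-based, it also catches variance or shape changes, at the cost of a short detection delay while the shifted samples fill the trailing window.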
Volterra Series Approach for Nonlinear Aeroelastic Response of 2-D Lifting Surfaces
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Marzocca, Piergiovanni; Librescu, Liviu
2001-01-01
The problem of the determination of the subcritical aeroelastic response and flutter instability of nonlinear two-dimensional lifting surfaces in an incompressible flow-field via Volterra series approach is addressed. The related aeroelastic governing equations are based upon the inclusion of structural nonlinearities, of the linear unsteady aerodynamics and consideration of an arbitrary time-dependent external pressure pulse. Unsteady aeroelastic nonlinear kernels are determined, and based on these, frequency and time histories of the subcritical aeroelastic response are obtained, and in this context the influence of geometric nonlinearities is emphasized. Conclusions and results displaying the implications of the considered effects are supplied.
Approximate Symmetry Reduction Approach: Infinite Series Reductions to the KdV-Burgers Equation
NASA Astrophysics Data System (ADS)
Jiao, Xiaoyu; Yao, Ruoxia; Zhang, Shunli; Lou, Sen Y.
2009-11-01
For the weak dispersion and weak dissipation cases, the (1+1)-dimensional KdV-Burgers equation is investigated in terms of the approximate symmetry reduction approach. The formal coherence of similarity reduction solutions and similarity reduction equations of different orders enables series reduction solutions. For the weak dissipation case, zero-order similarity solutions satisfy the Painlevé II, Painlevé I, and Jacobi elliptic function equations. For the weak dispersion case, zero-order similarity solutions are in the form of Kummer, Airy, and hyperbolic tangent functions. Higher-order similarity solutions can be obtained by solving linear variable-coefficient ordinary differential equations.
Dowling, Thomas E; Turner, Thomas F; Carson, Evan W; Saltzgiver, Melody J; Adams, Deborah; Kesner, Brian; Marsh, Paul C
2014-01-01
Time-series analysis is used widely in ecology to study complex phenomena and may have considerable potential to clarify relationships of genetic and demographic processes in natural and exploited populations. We explored the utility of this approach to evaluate population responses to management in razorback sucker, a long-lived and fecund, but declining freshwater fish species. A core population in Lake Mohave (Arizona-Nevada, USA) has experienced no natural recruitment for decades and is maintained by harvesting naturally produced larvae from the lake, rearing them in protective custody, and repatriating them at sizes less vulnerable to predation. Analyses of mtDNA and 15 microsatellites characterized for sequential larval cohorts collected over a 15-year time series revealed no changes in geographic structuring but indicated significant increase in mtDNA diversity for the entire population over time. Likewise, ratios of annual effective breeders to annual census size (Nb/Na) increased significantly despite sevenfold reduction of Na. These results indicated that conservation actions diminished near-term extinction risk due to genetic factors and should now focus on increasing numbers of fish in Lake Mohave to ameliorate longer-term risks. More generally, time-series analysis permitted robust testing of trends in genetic diversity, despite low precision of some metrics. PMID:24665337
Steed, Chad A.; Halsey, William; Dehoff, Ryan; ...
2017-02-16
Flexible visual analysis of long, high-resolution, and irregularly sampled time series data from multiple sensor streams is a challenge in several domains. In the field of additive manufacturing, this capability is critical for realizing the full potential of large-scale 3D printers. Here, we propose a visual analytics approach that helps additive manufacturing researchers acquire a deep understanding of patterns in log and imagery data collected by 3D printers. Our specific goals include discovering patterns related to defects and system performance issues, optimizing build configurations to avoid defects, and increasing production efficiency. We introduce Falcon, a new visual analytics system that allows users to interactively explore large, time-oriented data sets from multiple linked perspectives. Falcon provides overviews, detailed views, and unique segmented time series visualizations, all with adjustable scale options. To illustrate the effectiveness of Falcon at providing thorough and efficient knowledge discovery, we present a practical case study involving experts in additive manufacturing and data from a large-scale 3D printer. The techniques described are applicable to the analysis of any quantitative time series, though the focus of this paper is on additive manufacturing.
A multivariate time series approach to modeling and forecasting demand in the emergency department.
Jones, Spencer S; Evans, R Scott; Allen, Todd L; Thomas, Alun; Haug, Peter J; Welch, Shari J; Snow, Gregory L
2009-02-01
The goals of this investigation were to study the temporal relationships between the demands for key resources in the emergency department (ED) and the inpatient hospital, and to develop multivariate forecasting models. Hourly data were collected from three diverse hospitals for the year 2006. Descriptive analysis and model fitting were carried out using graphical and multivariate time series methods. Multivariate models were compared to a univariate benchmark model in terms of their ability to provide out-of-sample forecasts of ED census and the demands for diagnostic resources. Descriptive analyses revealed little temporal interaction between the demand for inpatient resources and the demand for ED resources at the facilities considered. Multivariate models provided more accurate forecasts of ED census and of the demands for diagnostic resources. Our results suggest that multivariate time series models can be used to reliably forecast ED patient census; however, forecasts of the demands for diagnostic resources were not sufficiently reliable to be useful in the clinical setting.
A Kalman-Filter-Based Approach to Combining Independent Earth-Orientation Series
NASA Technical Reports Server (NTRS)
Gross, Richard S.; Eubanks, T. M.; Steppe, J. A.; Freedman, A. P.; Dickey, J. O.; Runge, T. F.
1998-01-01
An approach, based upon the use of a Kalman filter, that is currently employed at the Jet Propulsion Laboratory (JPL) for combining independent measurements of the Earth's orientation is presented. Since changes in the Earth's orientation can be described as a randomly excited stochastic process, the uncertainty in our knowledge of the Earth's orientation grows rapidly in the absence of measurements. The Kalman-filter methodology allows for an objective accounting of this uncertainty growth, thereby facilitating the intercomparison of measurements taken at different epochs (not necessarily uniformly spaced in time) and with different precisions. As an example of this approach to combining Earth-orientation series, a description is given of a combination, SPACE95, that has been generated recently at JPL.
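The key mechanism in the abstract, uncertainty growing between measurement epochs of a randomly excited process, can be sketched with a scalar Kalman filter. The random-walk model, excitation rate `q`, and the measurement tuples below are illustrative stand-ins, not JPL's actual series or noise model.

```python
def combine_series(measurements, q=0.01):
    """Combine irregularly spaced measurements of a random-walk process.

    `measurements` is a list of (epoch, value, sigma) tuples sorted by
    epoch.  Between epochs the state variance grows linearly with the
    time gap (random-walk excitation q), so stale data are automatically
    down-weighted.  Returns the state estimate and its variance at the
    last epoch.
    """
    t, x, p = measurements[0][0], measurements[0][1], measurements[0][2] ** 2
    for epoch, z, sigma in measurements[1:]:
        p += q * (epoch - t)          # prediction: uncertainty grows with the gap
        k = p / (p + sigma ** 2)      # Kalman gain
        x += k * (z - x)              # measurement update
        p *= (1 - k)
        t = epoch
    return x, p

# Three measurements at unevenly spaced epochs, with differing precision.
series = [(0.0, 1.00, 0.1), (1.0, 1.02, 0.2), (5.0, 1.10, 0.1)]
est, var = combine_series(series)
print(round(est, 3))  # → 1.085
```

Note how the long 4-unit gap before the last epoch inflates the predicted variance, so the final (precise) measurement receives a large gain.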
Gan, M; Zuo, H; Yang, Z; Jiang, Y
2000-02-01
In this paper, we discuss the application of time series modeling to the analysis of lubricating oil from mechanical equipment. We obtained satisfactory results by applying an AR model to perform time series modeling and forecasting on spectral analysis data collected from an aircraft engine. This provides a practical method for condition monitoring and fault forecasting of mechanical equipment.
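An AR-based trend forecast of the kind the abstract describes can be sketched in a few lines. The AR(1)-on-differences formulation and the synthetic wear-metal readings below are illustrative assumptions; the paper's model order and data are not reproduced here.

```python
from statistics import mean

def fit_ar1(series):
    """Least-squares AR(1) fit on a mean-centred series: d_t = phi * d_{t-1} + e_t."""
    m = mean(series)
    d = [v - m for v in series]
    num = sum(d[t - 1] * d[t] for t in range(1, len(d)))
    den = sum(d[t - 1] ** 2 for t in range(1, len(d)))
    return m, num / den

def forecast_levels(series, steps=3):
    """Forecast a trending series by fitting AR(1) to its first differences."""
    diffs = [b - a for a, b in zip(series, series[1:])]
    m, phi = fit_ar1(diffs)
    level, dev = series[-1], diffs[-1] - m
    preds = []
    for _ in range(steps):
        dev *= phi                    # AR(1) propagation of the difference deviation
        level += m + dev              # cumulate forecast differences into levels
        preds.append(round(level, 3))
    return preds

# Synthetic wear-metal concentrations (ppm) from successive oil samples.
obs = [10.0, 10.4, 10.9, 11.2, 11.8, 12.1, 12.6, 13.0]
print(forecast_levels(obs))
```

Differencing first keeps the upward wear trend in the forecast; fitting AR(1) to the raw levels would instead produce mean-reverting predictions.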
Imatinib in advanced chordoma: A retrospective case series analysis.
Hindi, Nadia; Casali, Paolo G; Morosi, Carlo; Messina, Antonella; Palassini, Elena; Pilotti, Silvana; Tamborini, Elena; Radaelli, Stefano; Gronchi, Alessandro; Stacchiotti, Silvia
2015-11-01
Imatinib showed activity in 50 chordoma patients treated within a Phase II study. In that study, 70% of patients remained with stable disease (SD), median progression free survival (PFS) was 9 months and median overall survival (OS) was 34 months. We now report on a retrospective series of PDGFB/PDGFRB positive advanced chordoma patients treated with imatinib as a single agent within a compassionate-use programme at Istituto Nazionale Tumori, Milan, Italy (INT) between August 2002 and November 2010, when the programme was closed. 48 patients were consecutively treated with imatinib 800 mg/d. All patients had inoperable and progressive disease before starting imatinib. Demographics, treatment duration, toxicity and response rate by Response Evaluation Criteria in Solid Tumors (RECIST) were retrospectively recorded. The median duration of therapy was 7 months (1-46.5). No patient is on therapy at present. 46 patients were evaluable for response. No partial responses were detected. Best response was: stable disease 34 (74%), progressive disease 12 (26%). At a median follow-up of 24.5 months (0.5-117), median PFS was 9.9 months (95% confidence interval (CI) 6.7-13). Eight patients (16.5%) remained on therapy >18 months and 10 patients (21%) remained progression-free >18 months. Median OS was 30 months (95% CI 20-40), with 24 (50%) patients dead at the time of the present analysis. We confirm the activity of imatinib in locally advanced and metastatic chordoma, in terms of >70% tumour growth arrest in previously progressive patients. Median duration of response lasted almost 10 months, with >20% of patients progression-free at 18+ months. Copyright © 2015 Elsevier Ltd. All rights reserved.
Time-series intervention analysis of pedestrian countdown timer effects.
Huitema, Bradley E; Van Houten, Ron; Manal, Hana
2014-11-01
Pedestrians account for 40-50% of traffic fatalities in large cities. Several previous studies based on relatively small samples have concluded that Pedestrian Countdown Timers (PCT) may reduce pedestrian crashes at signalized intersections, but other studies report no reduction. The purposes of the present article are (1) to describe a new methodology to evaluate the effectiveness of introducing PCT signals and (2) to present results of applying this methodology to pedestrian crash data collected in a large study carried out in Detroit, Michigan. The study design incorporated within-unit as well as between-unit components. The main focus was on dynamic effects that occurred within the PCT unit of 362 treated sites during the 120 months of the study. An interrupted time-series analysis was developed to evaluate whether change in crash frequency depended upon the degree to which the countdown timers penetrated the treatment unit. The between-unit component involved comparisons between the treatment unit and a control unit. The overall conclusion is that the introduction of PCT signals in Detroit reduced pedestrian crashes to approximately one-third of the preintervention level. The evidence for this reduction is strong, and the change over time was shown to be a function of the extent to which the timers were introduced during the intervention period. There was no general drop-off in crash frequency throughout the baseline interval of over five years; only when the PCT signals were introduced in large numbers was consistent and convincing crash reduction observed. Correspondingly, there was little evidence of change in the control unit.
Del Sorbo, Maria Rosaria; Balzano, Walter; Donato, Michele; Draghici, Sorin
2013-11-01
Differential expression of genes detected with the analysis of high throughput genomic experiments is a commonly used intermediate step for the identification of signaling pathways involved in the response to different biological conditions. The impact analysis was the first approach to the analysis of signaling pathways involved in a given biological process that was able to take into account not only the magnitude of the expression change of the genes but also the topology of signaling pathways, including the type of each interaction between genes. In the impact analysis, signaling pathways are represented as weighted directed graphs with genes as nodes and the interactions between genes as edges. Edge weights are represented by a β factor, the regulatory efficiency, which is assumed to be equal to 1 for inductive interactions between genes and equal to -1 for repressive interactions. This study presents a similarity analysis between gene expression time series aimed at finding correspondences with the regulatory efficiency, i.e. the β factor as found in a widely used pathway database. Here, we focused on correlations among genes directly connected in signaling pathways, assuming that the expression variations of upstream genes impact immediately downstream genes in a short time interval and without significant influence from the interactions with other genes. Time series were processed using three different similarity metrics. The first metric is based on bit string matching; the second is a specific application of Dynamic Time Warping to detect similarities even in the presence of stretching and delays; the third is a quantitative comparative analysis obtained from a frequency domain representation of the time series, in which the similarity metric is the correlation between dominant spectral components. These three approaches are tested on real data and pathways, and a comparison is performed using Information Retrieval benchmark tools.
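The second similarity metric in the abstract, Dynamic Time Warping, has a compact textbook formulation that can be sketched directly; the toy sequences below are invented and merely mimic one profile lagging the other.

```python
def dtw_distance(a, b):
    """Dynamic Time Warping distance between two numeric sequences.

    The optimal alignment cost is found by dynamic programming, which
    tolerates local stretching and delays between the two profiles.
    """
    INF = float("inf")
    n, m = len(a), len(b)
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # stretch a
                                 cost[i][j - 1],      # stretch b
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

x = [0, 1, 2, 3, 2, 1, 0]
y = [0, 0, 1, 2, 3, 2, 1, 0]   # same shape, delayed by one step
print(dtw_distance(x, y))       # → 0.0
```

The zero distance shows why DTW suits the upstream/downstream setting: a pure delay between two otherwise identical expression profiles incurs no alignment cost, whereas a Euclidean point-by-point comparison would.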
NASA Astrophysics Data System (ADS)
Peng, Wei; Dai, Wujiao; Santerre, Rock; Cai, Changsheng; Kuang, Cuilin
2017-02-01
Daily vertical coordinate time series of Global Navigation Satellite System (GNSS) stations usually contain tectonic and non-tectonic deformation signals, residual atmospheric delay signals, measurement noise, etc. In geophysical studies, it is very important to separate the various geophysical signals in the GNSS time series so as to truthfully reflect the effect of mass loadings on crustal deformation. Based on the independence of mass loadings, we combine Ensemble Empirical Mode Decomposition (EEMD) with the Phase Space Reconstruction-based Independent Component Analysis (PSR-ICA) method to analyze the vertical time series of GNSS reference stations. In the simulation experiment, the seasonal non-tectonic signal is simulated by the sum of the corrections for atmospheric mass loading and soil moisture mass loading. The simulated seasonal non-tectonic signal can be separated into two independent signals using the PSR-ICA method, which strongly correlate with atmospheric mass loading and soil moisture mass loading, respectively. Likewise, in the analysis of the vertical time series of GNSS reference stations of the Crustal Movement Observation Network of China (CMONOC), similar results have been obtained using the combined EEMD and PSR-ICA method. All these results indicate that the EEMD and PSR-ICA method can effectively separate the independent atmospheric and soil moisture mass loading signals and illustrate the significant cause of the seasonal variation of GNSS vertical time series in mainland China.
Dai, Hongliang; Lu, Xiwu; Peng, Yonghong; Zou, Haiming; Shi, Jing
2016-12-01
Homogeneous nucleation of hydroxyapatite (HAP) crystallization in high levels of supersaturation solution has a negative effect on phosphorus recovery efficiency because of the poor settleability of the generated HAP microcrystalline. In this study, a new high-performance approach for phosphorus recovery from anaerobic supernatant using three series-coupled air-agitated crystallization reactors was developed and characterized. During 30-day operation, the proposed process showed a high recovery efficiency (∼95.82%) and low microcrystalline ratio (∼3.11%). Particle size analysis showed that the microcrystalline size was successively increased (from 5.81 to 26.32 μm) with the sequence of series-coupled reactors, confirming the conjectural mechanism that a multistage-induced crystallization system provided an appropriate condition for the growth, aggregation, and precipitation of crystallized products. Furthermore, the new process showed a broad spectrum of handling ability for different concentrations of phosphorus-containing solution in the range of 5-350 mg L(-1), and the obtained results of phosphorus conversion ratio and recovery efficiency were more than 92% and 80%, respectively. Overall, these results showed that the new process exhibited an excellent ability of efficient phosphorus recovery as well as wide application scope, and might be used as an effective approach for phosphorus removal and recovery from wastewater.
Pose Estimation from Line Correspondences: A Complete Analysis and A Series of Solutions.
Xu, Chi; Zhang, Lilian; Cheng, Li; Koch, Reinhard
2016-06-20
In this paper we deal with the camera pose estimation problem from a set of 2D/3D line correspondences, which is also known as PnL (Perspective-n-Line) problem. We carry out our study by comparing PnL with the well-studied PnP (Perspective-n-Point) problem, and our contributions are threefold: (1) We provide a complete 3D configuration analysis for P3L, which includes the well-known P3P problem as well as several existing analyses as special cases. (2) By exploring the similarity between PnL and PnP, we propose a new subset-based PnL approach as well as a series of linear-formulation-based PnL approaches inspired by their PnP counterparts. (3) The proposed linear-formulation-based methods can be easily extended to deal with the line and point features simultaneously.
Holmes, E A; Bonsall, M B; Hales, S A; Mitchell, H; Renner, F; Blackwell, S E; Watson, P; Goodwin, G M; Di Simplicio, M
2016-01-26
Treatment innovation for bipolar disorder has been hampered by a lack of techniques to capture a hallmark symptom: ongoing mood instability. Mood swings persist during remission from acute mood episodes and impair daily functioning. The last significant treatment advance remains Lithium (in the 1970s), which aids only the minority of patients. There is no accepted way to establish proof of concept for a new mood-stabilizing treatment. We suggest that combining insights from mood measurement with applied mathematics may provide a step change: repeated daily mood measurement (depression) over a short time frame (1 month) can create individual bipolar mood instability profiles. A time-series approach allows comparison of mood instability pre- and post-treatment. We test a new imagery-focused cognitive therapy treatment approach (MAPP; Mood Action Psychology Programme) targeting a driver of mood instability, and apply these measurement methods in a non-concurrent multiple baseline design case series of 14 patients with bipolar disorder. Weekly mood monitoring and treatment target data improved for the whole sample combined. Time-series analyses of daily mood data, sampled remotely (mobile phone/Internet) for 28 days pre- and post-treatment, demonstrated improvements in individuals' mood stability for 11 of 14 patients. Thus the findings offer preliminary support for a new imagery-focused treatment approach. They also indicate a step in treatment innovation without the requirement for trials in illness episodes or relapse prevention. Importantly, daily measurement offers a description of mood instability at the individual patient level in a clinically meaningful time frame. This costly, chronic and disabling mental illness demands innovation in both treatment approaches (whether pharmacological or psychological) and measurement tool: this work indicates that daily measurements can be used to detect improvement in individual mood stability for treatment innovation (MAPP).
NASA Astrophysics Data System (ADS)
Phillips, D. A.; Meertens, C. M.; Hodgkinson, K. M.; Puskas, C. M.; Boler, F. M.; Snett, L.; Mattioli, G. S.
2013-12-01
We present an overview of time series data, tools and services available from UNAVCO along with two specific and compelling examples of geodetic time series analysis. UNAVCO provides a diverse suite of geodetic data products and cyberinfrastructure services to support community research and education. The UNAVCO archive includes data from 2500+ continuous GPS stations, borehole geophysics instruments (strainmeters, seismometers, tiltmeters, pore pressure sensors), and long baseline laser strainmeters. These data span temporal scales from seconds to decades and provide global spatial coverage with regionally focused networks including the EarthScope Plate Boundary Observatory (PBO) and COCONet. This rich, open access dataset is a tremendous resource that enables the exploration, identification and analysis of time varying signals associated with crustal deformation, reference frame determinations, isostatic adjustments, atmospheric phenomena, hydrologic processes and more. UNAVCO provides a suite of time series exploration and analysis resources including static plots, dynamic plotting tools, and data products and services designed to enhance time series analysis. The PBO GPS network allows for identification of ~1 mm level deformation signals. At some GPS stations seasonal signals and longer-term trends in both the vertical and horizontal components can be dominated by effects of hydrological loading from natural and anthropogenic sources. Modeling of hydrologic deformation using GLDAS and a variety of land surface models (NOAH, MOSAIC, VIC and CLM) shows promise for independently modeling hydrologic effects and separating them from tectonic deformation as well as anthropogenic loading sources. A major challenge is to identify where loading is dominant and corrections from GLDAS can be applied, and where pumping is the dominant signal and corrections are not possible without some other data. In another arena, the PBO strainmeter network was designed to capture small short
NASA Astrophysics Data System (ADS)
Bakr, Mahmoud I.; Butler, Adrian P.
2005-01-01
Nonstationarity of flow fields due to pumping wells and its impact on advective transport is of particular interest in well capture zone design and wellhead protection. However, techniques based on Monte Carlo methods to characterize the associated capture zone uncertainty are time consuming and cumbersome. This paper introduces an alternative approach. The mean and covariance of system state variables (i.e., head, pore water velocity, and particle trajectory) are approximated using a first-order Taylor's series with sensitivity coefficients estimated from the adjoint operator for a system of discrete equations. The approach allows nonstationarity due to several sources (e.g., transmissivity, pumping, boundary conditions) to be treated. By employing numerical solution methods, it is able to handle irregular geometry, varying boundary conditions, complicated sink/source terms, and different covariance functions, all of which are important factors for real-world applications. A comparison of results for the Taylor's series approximation with those from Monte Carlo analysis showed, in general, good agreement for most of the tested particles. Particle trajectory variance calculated using Taylor's series approximation is then used to predict well capture zone probabilities under the assumption of normality of the mass transport's state variables. Verification of this assumption showed that not all particle trajectories (depending on their starting location) are normally or log-normally distributed. However, the risk of using the first-order method to delineate the confidence interval of a well capture zone is minimal since it marginally overestimates the 2.5% probability contour. Furthermore, this should be balanced against its greater computation efficiency over the Monte Carlo approach.
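The core comparison in the abstract, a first-order Taylor approximation of output uncertainty against Monte Carlo sampling, can be illustrated in one dimension. The response function, mean, standard deviation, and sample size below are invented for illustration; the paper's adjoint-based sensitivities for the discretized flow equations are not reproduced.

```python
import random

def taylor_variance(f, mu, sigma, h=1e-5):
    """First-order (delta-method) variance of f(X) for X with mean mu, sd sigma.

    The input uncertainty is propagated through the model via the
    sensitivity (derivative) at the mean, instead of by sampling.
    """
    dfdx = (f(mu + h) - f(mu - h)) / (2 * h)   # numerical sensitivity at the mean
    return (dfdx * sigma) ** 2

def monte_carlo_variance(f, mu, sigma, n=200_000, seed=1):
    """Brute-force reference: sample X, push through f, take the variance."""
    rng = random.Random(seed)
    ys = [f(rng.gauss(mu, sigma)) for _ in range(n)]
    m = sum(ys) / n
    return sum((y - m) ** 2 for y in ys) / n

f = lambda x: 1.0 / x          # a mildly nonlinear model response
mu, sigma = 2.0, 0.05
print(round(taylor_variance(f, mu, sigma), 6),
      round(monte_carlo_variance(f, mu, sigma), 6))
```

For small input uncertainty the two estimates agree closely at a fraction of the cost, which is the trade-off the paper exploits; the first-order estimate degrades as the nonlinearity or the input spread grows.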
VAET: A Visual Analytics Approach for E-Transactions Time-Series.
Xie, Cong; Chen, Wei; Huang, Xinxin; Hu, Yueqi; Barlowe, Scott; Yang, Jing
2014-12-01
Previous studies on E-transaction time-series have mainly focused on finding temporal trends of transaction behavior. Interesting transactions that are time-stamped and situation-relevant may easily be obscured in a large amount of information. This paper proposes a visual analytics system, Visual Analysis of E-transaction Time-Series (VAET), that allows the analysts to interactively explore large transaction datasets for insights about time-varying transactions. With a set of analyst-determined training samples, VAET automatically estimates the saliency of each transaction in a large time-series using a probabilistic decision tree learner. It provides an effective time-of-saliency (TOS) map where the analysts can explore a large number of transactions at different time granularities. Interesting transactions are further encoded with KnotLines, a compact visual representation that captures both the temporal variations and the contextual connection of transactions. The analysts can thus explore, select, and investigate knotlines of interest. A case study and user study with a real E-transactions dataset (26 million records) demonstrate the effectiveness of VAET.
On fractal analysis of cardiac interbeat time series
NASA Astrophysics Data System (ADS)
Guzmán-Vargas, L.; Calleja-Quevedo, E.; Angulo-Brown, F.
2003-09-01
In recent years the complexity of a cardiac beat-to-beat time series has been taken as an auxiliary tool to identify the health status of human hearts. Several methods have been employed to characterize time series complexity. In this work we calculate the fractal dimension of interbeat time series arising from three groups: 10 young healthy persons, 8 elderly healthy persons and 10 patients with congestive heart failure. Our numerical results reflect evident differences in the dynamic behavior corresponding to each group. We discuss these results within the context of the neuroautonomic control of heart rate dynamics. We also propose a numerical simulation that reproduces aging effects of heart rate behavior.
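The abstract does not name its fractal-dimension estimator; as one common choice for 1-D time series, Higuchi's method can be sketched as follows. The test signals (white noise and a random walk) are synthetic stand-ins for interbeat series.

```python
import math, random

def higuchi_fd(x, kmax=8):
    """Higuchi estimate of the fractal dimension of a 1-D time series.

    Curve lengths L(k) are computed at coarser and coarser lags k; the
    fractal dimension is the slope of log L(k) against log(1/k).
    """
    n = len(x)
    logs = []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):                 # k decimated sub-series, offset m
            pts = x[m::k]
            if len(pts) < 2:
                continue
            lm = sum(abs(b - a) for a, b in zip(pts, pts[1:]))
            norm = (n - 1) / (k * (len(pts) - 1))
            lengths.append(lm * norm / k)
        logs.append((math.log(1.0 / k), math.log(sum(lengths) / len(lengths))))
    # least-squares slope of log L(k) against log(1/k)
    mx = sum(a for a, _ in logs) / len(logs)
    my = sum(b for _, b in logs) / len(logs)
    num = sum((a - mx) * (b - my) for a, b in logs)
    den = sum((a - mx) ** 2 for a, _ in logs)
    return num / den

random.seed(2)
white = [random.gauss(0, 1) for _ in range(2000)]   # expected FD near 2
walk = [0.0]
for _ in range(1999):
    walk.append(walk[-1] + random.gauss(0, 1))      # expected FD near 1.5
print(round(higuchi_fd(white), 2), round(higuchi_fd(walk), 2))
```

The separation between the two estimates mirrors how the paper distinguishes groups: rougher, more complex series yield dimensions closer to 2, smoother correlated series closer to 1.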
Seismic analysis of series isolation system based on geometry nonlinearity
NASA Astrophysics Data System (ADS)
Lin, Z. D.; Shi, H.; Xue, L.
2017-08-01
For the system of a rubber bearing connected in series with a column, a mathematical model of the series isolation system based on geometric nonlinearity is derived using Hamilton's principle. The effects of axial pressure and of different column sizes on the seismic response of the series isolation system are discussed. The dynamic model of the series isolation system accounts for cross-section rotation and the influence of shear deformation and axial pressure. The differential quadrature element method is employed to discretize the governing equations and boundary conditions. The seismic response of the series isolation system subjected to far-field ground motions is solved numerically. Results show that the slenderness ratio of the cantilever column significantly affects the seismic response of the isolation system under far-field ground motions, particularly the response of the cantilever column itself.
Potential and limitations of wavelet analysis to unravel complexity in CH4 and CO2 flux time series
NASA Astrophysics Data System (ADS)
Koebsch, Franziska; Lehr, Christian; Hoffmann, Mathias; Augustin, Jürgen; van Huissteden, Ko; Franz, Daniela; Jocher, Georg; Järveoja, Järvi; Peichl, Matthias; Sachs, Torsten
2017-04-01
Greenhouse gas fluxes measured continuously across the land-atmosphere interface are highly autocorrelated and characterized by complex temporal patterns. Wavelet analysis is a time series analysis tool that decomposes a signal in both the frequency and time domains, which allows accounting for non-stationarity - a feature that is inherent to most natural processes. Using time series of CH4 and CO2 fluxes derived from both automated chamber and eddy covariance measurements in different Fluxnet peatland types, we demonstrate the potential and limitations of wavelet analysis in the field of greenhouse gas exchange. More explicitly, we show how gas-specific time series characteristics express themselves in the wavelet spectrum and draw conclusions for the formulation of null hypotheses for wavelet significance testing. We further demonstrate how inevitable technical constraints of greenhouse gas in situ measurements (e.g. data gaps and varying instrumental performance between maintenance intervals) manifest in the flux time series and discuss their implications for the interpretation of wavelet results. Moreover, our multi-method approach allows us to address the method-inherent capabilities of the automated chamber and eddy covariance techniques to resolve CO2 and CH4 release processes on different time scales. Despite some challenges, we consider the wider deployment of wavelet analysis and related time series analysis tools promising for advancing our mechanistic understanding of greenhouse gas exchange across the land-atmosphere interface.
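The frequency-and-time decomposition the abstract relies on can be illustrated with the simplest discrete wavelet. Flux studies typically use a continuous transform with a Morlet mother wavelet; the Haar transform below is only a minimal sketch of how a series is split into per-scale detail coefficients plus a smooth remainder, the basis for scale-resolved variance.

```python
def haar_decompose(signal, levels=3):
    """Multi-level Haar wavelet decomposition (minimal sketch).

    Each level halves the series into pairwise averages (the smoother
    approximation) and pairwise half-differences (the detail at that
    time scale).
    """
    approx, details = list(signal), []
    for _ in range(levels):
        if len(approx) < 2:
            break
        a = [(approx[i] + approx[i + 1]) / 2 for i in range(0, len(approx) - 1, 2)]
        d = [(approx[i] - approx[i + 1]) / 2 for i in range(0, len(approx) - 1, 2)]
        details.append(d)
        approx = a
    return approx, details

sig = [1.0, 1.0, 1.0, 1.0, 5.0, 5.0, 5.0, 5.0]  # one step change mid-series
approx, details = haar_decompose(sig)
print(approx, [sum(abs(v) for v in d) for d in details])  # → [3.0] [0.0, 0.0, 2.0]
```

The step change registers only in the coarsest detail level, localized in time, which is exactly the behavior that lets wavelet spectra separate slow regime shifts from fast fluctuations in flux series.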
An introduction to chaotic and random time series analysis
NASA Technical Reports Server (NTRS)
Scargle, Jeffrey D.
1989-01-01
The origin of chaotic behavior and the relation of chaos to randomness are explained. Two mathematical results are described: (1) a representation theorem guarantees the existence of a specific time-domain model for chaos and addresses the relation between chaotic, random, and strictly deterministic processes; (2) a theorem assures that information on the behavior of a physical system in its complete state space can be extracted from time-series data on a single observable. Focus is placed on an important connection between the dynamical state space and an observable time series. These two results lead to a practical deconvolution technique combining standard random process modeling methods with new embedded techniques.
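The second result in the abstract, reconstructing state-space behavior from a single observable, is usually realized as time-delay embedding. A minimal sketch, with an arbitrary toy sequence and illustrative choices of dimension and lag:

```python
def delay_embed(series, dim=3, tau=2):
    """Time-delay embedding of a scalar series (Takens-style reconstruction).

    State vectors are rebuilt from one observable by stacking lagged
    copies: v_t = (x_t, x_{t-tau}, ..., x_{t-(dim-1)*tau}).
    """
    start = (dim - 1) * tau
    return [tuple(series[t - k * tau] for k in range(dim))
            for t in range(start, len(series))]

x = [0, 1, 2, 3, 4, 5, 6, 7]
print(delay_embed(x, dim=3, tau=2))
# → [(4, 2, 0), (5, 3, 1), (6, 4, 2), (7, 5, 3)]
```

In practice the embedding dimension and lag are chosen from the data (e.g. via false nearest neighbors and the first minimum of mutual information); the resulting point cloud is then analyzed for deterministic structure.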
Minimum entropy density method for the time series analysis
NASA Astrophysics Data System (ADS)
Lee, Jeong Won; Park, Joongwoo Brian; Jo, Hang-Hyun; Yang, Jae-Suk; Moon, Hie-Tae
2009-01-01
The entropy density is an intuitive and powerful concept for studying the complicated nonlinear processes of physical systems. We develop the minimum entropy density method (MEDM) to detect the structure scale of a given time series, defined as the scale at which the uncertainty is minimized and hence the pattern is revealed most clearly. The MEDM is applied to the financial time series of the Standard and Poor's 500 index from February 1983 to April 2006. The temporal behavior of the structure scale is then obtained and analyzed in relation to the information delivery time and the efficient market hypothesis.
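The "scale of minimum uncertainty" idea can be sketched with a crude surrogate: bin the increments of the series at each candidate scale, compute their Shannon entropy, and pick the scale where it is smallest. This is a loose illustration only; the paper's exact definition of entropy density is not reproduced, and the periodic toy series stands in for market data.

```python
import math

def entropy_density(series, scale, bins=4):
    """Shannon entropy of the binned increments at a given scale."""
    inc = [series[i + scale] - series[i] for i in range(len(series) - scale)]
    lo, hi = min(inc), max(inc)
    width = (hi - lo) / bins or 1.0   # degenerate case: all increments equal
    counts = [0] * bins
    for v in inc:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    n = len(inc)
    return -sum(c / n * math.log(c / n) for c in counts if c)

def structure_scale(series, scales=range(1, 20)):
    """Scale at which the increment entropy is minimal, i.e. the pattern clearest."""
    return min(scales, key=lambda s: entropy_density(series, s))

# A synthetic series with an exact period-8 structure: at scale 8 every
# increment is zero, so the entropy collapses and the scale is detected.
x = [0, 1, 2, 1, 0, -1, -2, -1] * 16
print(structure_scale(x))  # → 8
```

On noisy financial data the minimum is far less sharp; the paper tracks how this minimizing scale drifts over time rather than treating it as fixed.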
Scaling behaviour of heartbeat intervals obtained by wavelet-based time-series analysis
NASA Astrophysics Data System (ADS)
Ivanov, Plamen Ch.; Rosenblum, Michael G.; Peng, C.-K.; Mietus, Joseph; Havlin, Shlomo; Stanley, H. Eugene; Goldberger, Ary L.
1996-09-01
BIOLOGICAL time-series analysis is used to identify hidden dynamical patterns which could yield important insights into underlying physiological mechanisms. Such analysis is complicated by the fact that biological signals are typically both highly irregular and non-stationary, that is, their statistical character changes slowly or intermittently as a result of variations in background influences1-3. Previous statistical analyses of heartbeat dynamics4-6 have identified long-range correlations and power-law scaling in the normal heartbeat, but not the phase interactions between the different frequency components of the signal. Here we introduce a new approach, based on the wavelet transform and an analytic signal approach, which can characterize non-stationary behaviour and elucidate such phase interactions. We find that, when suitably rescaled, the distributions of the variations in the beat-to-beat intervals for all healthy subjects are described by a single function stable over a wide range of timescales. However, a similar scaling function does not exist for a group with cardiopulmonary instability caused by sleep apnoea. We attribute the functional form of the scaling observed in the healthy subjects to underlying nonlinear dynamics, which seem to be essential to normal heart function. The approach introduced here should be useful in the analysis of other nonstationary biological signals.
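The authors' exact wavelet pipeline is not given in the abstract; the following is a minimal sketch of the general idea, assuming a derivative-of-Gaussian wavelet to suppress slow nonstationary trends and SciPy's Hilbert transform for the analytic-signal amplitude:

```python
import numpy as np
from scipy.signal import hilbert

def wavelet_variations(x, scale):
    """Convolve with a derivative-of-Gaussian wavelet to suppress slow
    nonstationary trends at the given scale (in samples)."""
    t = np.arange(-4 * scale, 4 * scale + 1)
    g = np.exp(-t**2 / (2.0 * scale**2))
    psi = -t * g / scale**2          # first derivative of a Gaussian
    return np.convolve(x, psi, mode="same")

def rescaled_amplitudes(x, scale):
    """Instantaneous amplitudes of the wavelet-filtered signal,
    rescaled to unit standard deviation so distributions from
    different subjects/timescales can be compared on one axis."""
    w = wavelet_variations(x, scale)
    amp = np.abs(hilbert(w))         # analytic-signal amplitude
    return amp / amp.std()
```

Comparing histograms of `rescaled_amplitudes` across scales is one way to look for the kind of scale-stable distribution the paper reports for healthy subjects.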
Jha, Abhinav K.; Kupinski, Matthew A.; Masumura, Takahiro; Clarkson, Eric; Maslov, Alexey V.; Barrett, Harrison H.
2014-01-01
We present the implementation, validation, and performance of a Neumann-series approach for simulating light propagation at optical wavelengths in uniform media using the radiative transport equation (RTE). The RTE is solved for an anisotropic-scattering medium in a spherical harmonic basis for a diffuse-optical-imaging setup. The main objectives of this paper are threefold: to present the theory behind the Neumann-series form for the RTE, to design and develop the mathematical methods and the software to implement the Neumann series for a diffuse-optical-imaging setup, and, finally, to perform an exhaustive study of the accuracy, practical limitations, and computational efficiency of the Neumann-series method. Through our results, we demonstrate that the Neumann-series approach can be used to model light propagation in uniform media with small geometries at optical wavelengths. PMID:23201893
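The RTE machinery itself is beyond a short example, but the underlying Neumann-series idea, summing successive applications of an operator, can be illustrated on a generic linear system; this is a sketch of the series form, not the paper's solver:

```python
import numpy as np

def neumann_solve(K, b, n_terms=50):
    """Approximate x = (I - K)^{-1} b by the truncated Neumann series
    x ~ sum_{n=0}^{N} K^n b, valid when the spectral radius of K is < 1.
    In the RTE analogy, each term adds one more order of scattering."""
    x = np.zeros_like(b, dtype=float)
    term = b.astype(float).copy()
    for _ in range(n_terms):
        x += term
        term = K @ term          # next order: apply the operator once more
    return x
```

Truncation error decays geometrically with the spectral radius, which is why the approach suits weakly multiple-scattering regimes.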
Time series analysis of sferics rate data associated with severe weather patterns
NASA Technical Reports Server (NTRS)
Wang, P. P.; Burns, R. C.
1976-01-01
Data obtained by an electronic transducer measuring the rate of occurrence of electrical disturbances in the atmosphere (the sferics rate, in the form of a time series) over the life of electrical storms are analyzed. It is found that the sferics rate time series are not stationary; each series has a complete life cycle associated with a particular storm. The approach to recognition of a spectral pattern is somewhat similar to real-time recognition of the spoken word.
Investigation on Law and Economics Based on Complex Network and Time Series Analysis
Yang, Jian; Qu, Zhao; Chang, Hui
2015-01-01
The research focuses on the cooperative relationships and strategic tendencies among three mutually interacting parties in financing: small enterprises, commercial banks, and micro-credit companies. Complex network theory and time series analysis were applied to provide quantitative evidence. Moreover, this paper builds a fundamental model describing the interaction among them through evolutionary game theory. Combining the results of the data analysis with the current situation, it is justifiable to put forward reasonable legislative recommendations for regulating lending activities among small enterprises, commercial banks, and micro-credit companies. The approach in this research provides a framework for constructing mathematical models and applying econometrics and evolutionary game theory to the issue of corporate financing. PMID:26076460
On the use of bipolar montages for time-series analysis of intracranial electroencephalograms.
Zaveri, Hitten P; Duckrow, Robert B; Spencer, Susan S
2006-09-01
Bipolar montages are routinely employed for the interpretation of scalp and intracranial EEGs (icEEGs). In this manuscript we consider the assumptions that support the use of a bipolar montage and question whether a bipolar representation of icEEGs is universally appropriate for the time-series analysis of these signals. Bipolar montages introduce an element of spatial processing into the observed time-series. In the case of icEEGs, we argue that this operation may introduce ambiguity in some settings, because local differentiability and continuity of the spatial structure of icEEGs cannot be certified and their spatial sampling is suboptimal. Example icEEGs were collected from three patients being studied for possible resective epilepsy surgery. Referential and bipolar representations of these signals were subjected to visual and time-series analyses. The time-series measures calculated were the power spectral density and the magnitude squared coherence. Visual analysis and time-series measures revealed that the icEEG time-series was altered by the use of a bipolar montage. The changes resulted either from the introduction of unrelated information from the two referential time-series into the bipolar time-series, or from the removal or alteration of information common to the two referential time-series. The changes could not be predicted without prior knowledge of the relationship between the measurement sites that form the bipolar montage. In certain settings, bipolar montages alter icEEGs and can confound the time-series analysis of these signals; in such settings, bipolar montages should be used with caution. This manuscript addresses the representation of the intracranial EEG for time-series analysis. There may be contexts where the assumptions underpinning correct application of the bipolar montage to the intracranial EEG are not satisfied.
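A toy illustration of the montage effect described above (synthetic signals, not the authors' data or code): when two referential channels share a common rhythm, the bipolar difference removes it, visibly changing the power spectral density.

```python
import numpy as np
from scipy.signal import welch

fs = 256.0
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(0)
common = np.sin(2 * np.pi * 10 * t)             # activity shared by both sites
ref_a = common + 0.3 * rng.standard_normal(t.size)
ref_b = common + 0.3 * rng.standard_normal(t.size)
bipolar = ref_a - ref_b                          # shared 10 Hz rhythm cancels

f, p_ref = welch(ref_a, fs=fs, nperseg=1024)
_, p_bip = welch(bipolar, fs=fs, nperseg=1024)
i10 = np.argmin(np.abs(f - 10.0))
# power at 10 Hz is strongly attenuated in the bipolar derivation
```

The converse case, where the two sites carry unrelated activity that the difference mixes together, is equally easy to construct, which is the ambiguity the paper warns about.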
Multiscale multifractal multiproperty analysis of financial time series based on Rényi entropy
NASA Astrophysics Data System (ADS)
Yujun, Yang; Jianping, Li; Yimei, Yang
This paper introduces a multiscale multifractal multiproperty analysis based on Rényi entropy (3MPAR) method to analyze the short-range and long-range characteristics of financial time series, and then applies this method to five time series of five properties in four stock indices. Combining the two analysis techniques of Rényi entropy and multifractal detrended fluctuation analysis (MFDFA), the 3MPAR method focuses on the curves of the Rényi entropy and the generalized Hurst exponent of five properties of four stock time series, which allows us to study more universal and subtle fluctuation characteristics of financial time series. By analyzing the curves of the Rényi entropy and the profiles of the logarithmic distribution of the MFDFA of five properties of four stock indices, the 3MPAR method reveals fluctuation characteristics of the financial time series and the stock markets. It also reveals richer information in the financial time series by comparing the profiles of the five properties of the four stock indices. In this paper, we focus not only on the multifractality of the time series but also on the fluctuation characteristics of the financial time series and the subtle differences among time series of different properties. We find that financial time series are far more complex than reported in some research works that use only one property of a time series.
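The 3MPAR pipeline combines several tools; as one small, well-defined piece, the Rényi entropy on which the method's entropy curves are built can be computed for a discrete distribution as follows (a generic implementation, not the paper's code):

```python
import numpy as np

def renyi_entropy(p, q):
    """Rényi entropy H_q = log(sum_i p_i^q) / (1 - q) in nats.
    The q -> 1 limit recovers the Shannon entropy."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                           # drop empty bins
    if np.isclose(q, 1.0):
        return -(p * np.log(p)).sum()      # Shannon limit
    return np.log((p ** q).sum()) / (1.0 - q)
```

Sweeping `q` over a range and plotting `renyi_entropy(hist, q)` for a histogram of returns yields the kind of entropy curve the abstract describes; a flat curve indicates monofractal-like behavior, a strongly varying one multifractality.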
Array magnetics modal analysis for the DIII-D tokamak based on localized time-series modelling
Olofsson, K. Erik J.; Hanson, Jeremy M.; Shiraki, Daisuke; Volpe, Francesco A.; Humphreys, David A.; La Haye, Robert J.; Lanctot, Matthew J.; Strait, Edward J.; Welander, Anders S.; Kolemen, Egemen; Okabayashi, Michio
2014-07-14
Here, time-series analysis of magnetics data in tokamaks is typically done using block-based fast Fourier transform methods. This work presents the development and deployment of a new set of algorithms for magnetic probe array analysis. The method is based on an estimation technique known as stochastic subspace identification (SSI). Compared with the standard coherence approach or the direct singular value decomposition approach, the new technique exhibits several beneficial properties. For example, the SSI method does not require that frequencies are orthogonal with respect to the timeframe used in the analysis. Frequencies are obtained directly as parameters of localized time-series models. The parameters are extracted by solving small-scale eigenvalue problems. Applications include maximum-likelihood regularized eigenmode pattern estimation, detection of neoclassical tearing modes, including locked mode precursors, and automatic clustering of modes, and magnetics-pattern characterization of sawtooth pre- and postcursors, edge harmonic oscillations and fishbones.
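Full stochastic subspace identification is more involved, but the core idea above, reading oscillation frequencies directly from the parameters (poles) of a localized time-series model rather than from Fourier bins, can be sketched with a least-squares AR(2) fit; the test signal and sampling rate below are invented for illustration:

```python
import numpy as np

def ar_frequency(y, dt):
    """Fit y[t] = a1*y[t-1] + a2*y[t-2] by least squares and read the
    oscillation frequency from the angle of the complex model poles.
    No orthogonality of the frequency to the analysis window is needed."""
    A = np.column_stack([y[1:-1], y[:-2]])
    a1, a2 = np.linalg.lstsq(A, y[2:], rcond=None)[0]
    poles = np.roots([1.0, -a1, -a2])       # small-scale eigenvalue problem
    return np.abs(np.angle(poles[0])) / (2 * np.pi * dt)

dt = 1e-4                                   # 10 kHz sampling, illustrative
t = np.arange(0, 0.05, dt)
y = np.sin(2 * np.pi * 3000 * t)            # a 3 kHz test "mode"
```

For a noise-free sinusoid the poles sit exactly on the unit circle at angle ±2π f dt, so `ar_frequency(y, dt)` recovers 3000 Hz.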
A Time-Series Analysis of Hispanic Unemployment.
ERIC Educational Resources Information Center
Defreitas, Gregory
1986-01-01
This study undertakes the first systematic time-series research on the cyclical patterns and principal determinants of Hispanic joblessness in the United States. The principal findings indicate that Hispanics tend to bear a disproportionate share of increases in unemployment during recessions. (Author/CT)
TSNet--a distributed architecture for time series analysis.
Hunter, Jim
2008-01-01
This paper describes an infrastructure (TSNet) which can be used by geographically separated research groups to develop algorithms for the abstraction of complex time series data. The framework was specifically designed for the kinds of abstractions required for the application of clinical guidelines within intensive care.
Model Identification in Time-Series Analysis: Some Empirical Results.
ERIC Educational Resources Information Center
Padia, William L.
Model identification of time-series data is essential to valid statistical tests of intervention effects. Model identification is, at best, inexact in the social and behavioral sciences where one is often confronted with small numbers of observations. These problems are discussed, and the results of independent identifications of 130 social and…
Dynamic Factor Analysis of Nonstationary Multivariate Time Series.
ERIC Educational Resources Information Center
Molenaar, Peter C. M.; And Others
1992-01-01
The dynamic factor model proposed by P. C. Molenaar (1985) is exhibited, and a dynamic nonstationary factor model (DNFM) is constructed with latent factor series that have time-varying mean functions. The use of a DNFM is illustrated using data from a television viewing habits study. (SLD)
NASA Astrophysics Data System (ADS)
Curceac, S.; Ternynck, C.; Ouarda, T.
2015-12-01
Over the past decades, a substantial amount of research has been conducted to model and forecast climatic variables. In this study, Nonparametric Functional Data Analysis (NPFDA) methods are applied to forecast air temperature and wind speed time series in Abu Dhabi, UAE. The dataset consists of hourly measurements recorded for a period of 29 years, 1982-2010. The novelty of the Functional Data Analysis approach lies in expressing the data as curves. In the present work, the focus is on daily forecasting, and the functional observations (curves) express the daily measurements of the above-mentioned variables. We apply a non-linear regression model with a functional non-parametric kernel estimator. The computation of the estimator is performed using an asymmetrical quadratic kernel function for local weighting, with the bandwidth obtained by a cross-validation procedure. The proximities between functional objects are calculated by families of semi-metrics based on derivatives and Functional Principal Component Analysis (FPCA). Additionally, functional conditional mode and functional conditional median estimators are applied and the advantages of combining their results are analysed. A different approach employs a SARIMA model selected according to the minimum Akaike (AIC) and Bayesian (BIC) information criteria and diagnostics of the model residuals. The performance of the models is assessed by calculating error indices such as the root mean square error (RMSE), relative RMSE, BIAS and relative BIAS. The results indicate that the NPFDA models provide more accurate forecasts than the SARIMA models. Key words: Nonparametric functional data analysis, SARIMA, time series forecast, air temperature, wind speed
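The functional kernel regression described above can be sketched as a Nadaraya-Watson forecast of tomorrow's daily curve from historical curve pairs, assuming an L2 semi-metric and an asymmetrical quadratic kernel supported on [0, 1]; the function names and array shapes are illustrative, not from the paper:

```python
import numpy as np

def quad_kernel(u):
    """Asymmetrical quadratic kernel: support on [0, 1] only."""
    return np.where((u >= 0) & (u <= 1), 1.0 - u ** 2, 0.0)

def functional_kernel_forecast(curves, query, h):
    """Nadaraya-Watson forecast of the next daily curve.
    curves: (n_days, 24) array of past daily profiles (e.g. hourly temps).
    query:  today's (24,) profile; h: bandwidth (from cross-validation)."""
    past, nxt = curves[:-1], curves[1:]          # pairs (day i, day i+1)
    d = np.linalg.norm(past - query, axis=1)     # L2 semi-metric
    w = quad_kernel(d / h)
    if w.sum() == 0:
        return nxt[np.argmin(d)]                 # fall back to nearest curve
    return (w[:, None] * nxt).sum(axis=0) / w.sum()
```

Derivative- or FPCA-based semi-metrics, as used in the paper, would simply replace the plain L2 distance in `d`.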
NASA Astrophysics Data System (ADS)
Mitchell, R.; Hilton, E.; Rosenfield, P.
2011-12-01
Communicating the results and significance of basic research to the general public is of critical importance. Federal funding and university budgets are under substantial pressure, and taxpayer support of basic research is critical. Public outreach by ecologists is an important vehicle for increasing support and understanding of science in an era of anthropogenic global change. At present, very few programs or courses exist to allow young scientists the opportunity to hone and practice their public outreach skills. Although the need for science outreach and communication is recognized, graduate programs often fail to provide any training in making science accessible. Engage: The Science Speaker Series represents a unique, graduate student-led effort to improve public outreach skills. Founded in 2009, Engage was created by three science graduate students at the University of Washington. The students developed a novel, interdisciplinary curriculum to investigate why science outreach often fails, to improve graduate student communication skills, and to help students create a dynamic, public-friendly talk. The course incorporates elements of story-telling, improvisational arts, and development of analogy, all with a focus on clarity, brevity and accessibility. This course was offered to graduate students and post-doctoral researchers from a wide variety of sciences in the autumn of 2010. Students who participated in the Engage course were then given the opportunity to participate in Engage: The Science Speaker Series. This free, public-friendly speaker series is hosted on the University of Washington campus and has had substantial public attendance and participation. The growing success of Engage illustrates the need for such programs throughout graduate level science curricula. We present the impetus for the development of the program, elements of the curriculum covered in the Engage course, the importance of an interdisciplinary approach, and discuss strategies for
NASA Astrophysics Data System (ADS)
Mitchell, R.; Hilton, E.; Rosenfield, P.
2012-12-01
Communicating the results and significance of basic research to the general public is of critical importance. Federal funding and university budgets are under substantial pressure, and taxpayer support of basic research is critical. Public outreach by ecologists is an important vehicle for increasing support and understanding of science in an era of anthropogenic global change. At present, very few programs or courses exist to allow young scientists the opportunity to hone and practice their public outreach skills. Although the need for science outreach and communication is recognized, graduate programs often fail to provide any training in making science accessible. Engage: The Science Speaker Series represents a unique, graduate student-led effort to improve public outreach skills. Founded in 2009, Engage was created by three science graduate students at the University of Washington. The students developed a novel, interdisciplinary curriculum to investigate why science outreach often fails, to improve graduate student communication skills, and to help students create a dynamic, public-friendly talk. The course incorporates elements of story-telling, improvisational arts, and development of analogy, all with a focus on clarity, brevity and accessibility. This course was offered to graduate students and post-doctoral researchers from a wide variety of sciences in the autumn of 2010 and 2011, and will be retaught in 2012. Students who participated in the Engage course were then given the opportunity to participate in Engage: The Science Speaker Series. This free, public-friendly speaker series has been hosted at the University of Washington campus and Seattle Town Hall, and has had substantial public attendance and participation. The growing success of Engage illustrates the need for such programs throughout graduate level science curricula. We present the impetus for the development of the program, elements of the curriculum covered in the Engage course, the
NASA Astrophysics Data System (ADS)
Porta, Alberto; Bari, Vlasta; Ranuzzi, Giovanni; De Maria, Beatrice; Baselli, Giuseppe
2017-09-01
We propose a multiscale complexity (MSC) method that assesses irregularity in assigned frequency bands and is appropriate for analyzing short time series. It is grounded on the identification of the coefficients of an autoregressive model, the computation of the mean position of the poles generating the components of the power spectral density in an assigned frequency band, and the assessment of its distance from the unit circle in the complex plane. The MSC method was tested on simulations and applied to the short heart period (HP) variability series recorded during graded head-up tilt in 17 subjects (age from 21 to 54 years, median = 28 years, 7 females) and during paced breathing protocols in 19 subjects (age from 27 to 35 years, median = 31 years, 11 females) to assess the contribution of time scales typical of the cardiac autonomic control, namely in low frequency (LF, from 0.04 to 0.15 Hz) and high frequency (HF, from 0.15 to 0.5 Hz) bands, to the complexity of the cardiac regulation. The proposed MSC technique was compared to a traditional model-free multiscale method grounded on information theory, i.e., multiscale entropy (MSE). The approach suggests that the reduction of HP variability complexity observed during graded head-up tilt is due to a regularization of the HP fluctuations in the LF band via a possible intervention of sympathetic control, and that the decrement of HP variability complexity observed during slow breathing is the result of the regularization of the HP variations in both LF and HF bands, thus implying the action of physiological mechanisms working at time scales even different from that of respiration. MSE did not distinguish experimental conditions at time scales larger than 1. Over short time series, MSC allows a more insightful association between cardiac control complexity and the physiological mechanisms modulating cardiac rhythm compared to a more traditional tool such as MSE.
Complexity analysis of the turbulent environmental fluid flow time series
NASA Astrophysics Data System (ADS)
Mihailović, D. T.; Nikolić-Đorić, E.; Drešković, N.; Mimić, G.
2014-02-01
We have used the Kolmogorov complexities, sample and permutation entropies to quantify the randomness degree in river flow time series of two mountain rivers in Bosnia and Herzegovina, representing the turbulent environmental fluid, for the period 1926-1990. In particular, we have examined the monthly river flow time series from two rivers (the Miljacka and the Bosnia) in the mountain part of their flow and then calculated the Kolmogorov complexity (KL) based on the Lempel-Ziv Algorithm (LZA) (lower-KLL and upper-KLU), sample entropy (SE) and permutation entropy (PE) values for each time series. The results indicate that the KLL, KLU, SE and PE values in two rivers are close to each other regardless of the amplitude differences in their monthly flow rates. We have illustrated the changes in mountain river flow complexity by experiments using (i) the data set for the Bosnia River and (ii) anticipated human activities and projected climate changes. We have explored the sensitivity of considered measures in dependence on the length of time series. In addition, we have divided the period 1926-1990 into three subintervals: (a) 1926-1945, (b) 1946-1965, (c) 1966-1990, and calculated the KLL, KLU, SE, PE values for the various time series in these subintervals. It is found that during the period 1946-1965, there is a decrease in their complexities, and corresponding changes in the SE and PE, in comparison to the period 1926-1990. This complexity loss may be primarily attributed to (i) human interventions, after the Second World War, on these two rivers because of their use for water consumption and (ii) climate change in recent times.
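The Lempel-Ziv-based Kolmogorov complexity used above can be illustrated with a compact LZ76 phrase-counting sketch on a binarized series (the paper's exact normalization and lower/upper variants may differ):

```python
def lempel_ziv_complexity(bits):
    """Number of distinct phrases in the LZ76 exhaustive parsing of a
    binary sequence: each new phrase is the shortest extension not seen
    in the preceding text. Higher counts mean more random series."""
    s = "".join(map(str, bits))
    i, c, n = 0, 0, len(s)
    while i < n:
        L = 1
        # extend the phrase while it already occurs in the text before it
        while i + L <= n and s[i:i + L] in s[:i + L - 1]:
            L += 1
        c += 1
        i += L
    return c
```

A monthly flow series would first be binarized, e.g. by thresholding at its median, before applying the function; normalizing `c` by `n / log2(n)` gives the usual complexity measure.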
Noise analysis of GPS time series in Taiwan
NASA Astrophysics Data System (ADS)
Lee, You-Chia; Chang, Wu-Lung
2017-04-01
The Global Positioning System (GPS) is widely used in research on plate tectonics and crustal deformation. Most studies have considered only time-independent noise (white noise) in GPS time series, but time-dependent noises (flicker noise, random-walk noise), identified over nearly twenty years of observations, are also important to the precision of the data. The rate uncertainties of stations will be underestimated if the GPS time series are assumed to contain only time-independent noise. Therefore, studying the noise properties of GPS time series is necessary in order to assess the precision and reliability of velocity estimates. Our GPS time series come from over 500 stations around Taiwan, with time spans from 2.5 years up to 20 years. The GPS stations include different monument types such as deep drilled braced, roof, metal tripod, and concrete pier; the most common type in Taiwan is the metal tripod. We investigated the noise properties of continuous GPS time series by using the spectral index and amplitude of the power-law noise. In the process we first remove data outliers, then estimate the linear trend, sizes of offsets, and seasonal signals, and finally estimate the amplitudes of the power-law and white noise simultaneously. Our preliminary results show that the noise amplitudes of the north component are smaller than those of the other two components, and the largest amplitudes are in the vertical. We also find that the amplitudes of white noise and power-law noise are positively correlated in all three components. Comparisons of noise amplitudes of different monument types in Taiwan reveal that the deep drilled braced monuments have smaller data uncertainties and are therefore more stable than other monuments.
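A quick way to see the spectral-index idea (a rough periodogram-slope sketch, not the maximum-likelihood estimator typically used in GPS noise analysis, which fits amplitudes and index jointly in the time domain): fit a line to the log-log power spectrum. White noise has index near 0, flicker noise near -1, and a random walk near -2.

```python
import numpy as np
from scipy.signal import welch

def spectral_index(x, fs=1.0):
    """Least-squares slope of log10(PSD) vs log10(f): ~0 for white noise,
    ~-1 for flicker noise, ~-2 for random-walk noise."""
    f, p = welch(x, fs=fs, nperseg=256)
    mask = f > 0                      # drop the DC bin
    return np.polyfit(np.log10(f[mask]), np.log10(p[mask]), 1)[0]

rng = np.random.default_rng(42)
white = rng.standard_normal(8192)    # synthetic daily position noise
walk = np.cumsum(white)              # synthetic random-walk monument noise
```

Applying `spectral_index` to detrended position residuals gives a first-look classification of the dominant noise type before a full time-domain fit.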
Quality analysis and classification of Swiss phenological series (PHENOCLASS)
NASA Astrophysics Data System (ADS)
Auchmann, Renate; Rutishauser, This; Brugnara, Yuri; Brönnimann, Stefan; Pietragalla, Barbara; Gehrig, Regula; Sigg, Christian; Knechtl, Valentin; Konzelmann, Thomas; Calpini, Bertrand
2017-04-01
Data quality control of series and quality assessment of existing stations are crucial steps prior to a reliable application of phenological data and potential network adaptations in climate research. The Swiss Phenology Network started in 1951 and comprises around 160 stations with a maximum of 69 different phenological parameters. The quality and homogeneity of the entire dataset has never been assessed comprehensively. The goal of the recently initiated project PHENOCLASS (supported by MeteoSwiss in the framework of GCOS Switzerland) is the systematic assessment and subsequent classification of all Swiss phenological series and stations according to, among others, their data quality, length, completeness, homogeneity, and availability of metadata. An aim is to provide a list of the top most valuable, high quality Swiss series. Here we present the core part of the assessment: The data quality control (QC) procedure is tailored to the Swiss Phenology Network and provides the basis for the subsequent assessments (e.g., break detection and homogeneity assessment). The QC consists of several levels, comprising mostly automatic but also manual procedures. The automatic part uses absolute and relative comparisons of observations to thresholds to identify unreliable observations from various potential sources of error (e.g., transcription errors, typing errors, unreliable observations due to various reasons). Relative procedures comprise, e.g., comparisons among stations as well as comparisons within stations utilizing linear models. Less than 5 % of all observations contain an automatic flag. The largest number of flags is generated through biological inconsistency testing, as well as using three-monthly-temperature sums, highly correlated within-station, or cross-station phenological series as predictors. All automatic flags are reviewed in a final manual quality control step, where two experts inspect all series of the network independently. Also, all available meta
A fundamental approach to dipmeter analysis
Enderlin, M.B.; Hansen, D.K.T. )
1987-02-01
Historically, in dipmeter analysis, depositional patterns are delineated for environmental, structural, and stratigraphic interpretations. The proposed method is a fundamental approach using raw data measurements from the dipmeter sonde to help the geologist describe subsurface structures on a stratigraphic scale. Raw data are available at the well site, require no post-processing, and can be combined with computed results, if available. They also are cost effective and easy to use, and they require only a basic knowledge of sedimentary features and facies. Case studies illustrate the successful reconstruction of sedimentary features from raw data logs recorded by four- and six-arm dipmeters. The dipmeter is a wireline tool with a series of evenly spaced, focused electrodes applied to the circumference of the borehole wall. The raw data are presented as curves representing the electrode response and tool orientation. Using these data on an expanded scale, the geologist can reconstruct a plausible three-dimensional picture of individual sedimentary features. In outcrop, the geologist usually can see an entire sedimentary feature in a large frame of reference, that is, with the surrounding landscape. Thus, a large range of features can be recognized. However, in the borehole environment, the frame of reference is reduced to the borehole diameter, which, because of its size, reduces the range of recognizable features. In this study, a table was developed that identifies the features distinguished by the proposed method as a function of borehole diameter.
3-D, bluff body drag estimation using a Green's function/Gram-Charlier series approach.
Barone, Matthew Franklin; De Chant, Lawrence Justin
2004-05-01
In this study, we describe the extension of the 2-d preliminary design bluff body drag estimation tool developed by De Chant to apply to 3-d flows. As with the 2-d method, the 3-d extension uses a combined approximate Green's function/Gram-Charlier series approach to retain the body geometry information. Whereas the 2-d methodology relied solely upon small disturbance theory for the inviscid flow field associated with the body of interest to estimate the near-field initial conditions, e.g. the velocity defect, the 3-d methodology uses both analytical (where available) and numerical inviscid solutions. The defect solution is then used as an initial condition in an approximate 3-d Green's function solution. Finally, the Green's function solution is matched to the 3-d analog of the classical 2-d Gram-Charlier series and then integrated to yield the net form drag on the bluff body. Preliminary results indicate that the drag estimates computed are of accuracy equivalent to the 2-d method for flows with large separation, i.e. less than 20% relative error. As with the lower-dimensional method, the 3-d concept is intended to supplement turbulent Navier-Stokes and experimental solutions for estimating drag coefficients over blunt bodies.
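The Gram-Charlier series invoked above is a classical expansion; a minimal sketch of the A-series form around a standard normal, with skewness and excess-kurtosis corrections via probabilists' Hermite polynomials, is shown below (illustrative only, not the paper's 3-d analog):

```python
import numpy as np
from numpy.polynomial.hermite_e import hermeval

def gram_charlier_pdf(x, skew=0.0, ex_kurt=0.0):
    """Gram-Charlier A series around a standard normal:
    p(x) = phi(x) * [1 + skew/6 * He3(x) + ex_kurt/24 * He4(x)],
    where He_n are probabilists' Hermite polynomials."""
    phi = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
    he3 = hermeval(x, [0, 0, 0, 1])      # He3(x) = x^3 - 3x
    he4 = hermeval(x, [0, 0, 0, 0, 1])   # He4(x) = x^4 - 6x^2 + 3
    return phi * (1 + skew / 6 * he3 + ex_kurt / 24 * he4)
```

With zero skewness and excess kurtosis the series collapses to the Gaussian, which is why it is a natural shape family for matching a near-Gaussian wake defect profile.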
Extensive mapping of coastal change in Alaska by Landsat time-series analysis, 1972-2013
NASA Astrophysics Data System (ADS)
Reynolds, J.; Macander, M. J.; Swingley, C. S.; Spencer, S. R.
2014-12-01
The landscape-scale effects of coastal storms on Alaska's Bering Sea and Gulf of Alaska coasts include coastal erosion, migration of spits and barrier islands, breaching of coastal lakes and lagoons, and inundation and salt-kill of vegetation. Large changes in coastal storm frequency and intensity are expected due to climate change and reduced sea-ice extent. Storms have a wide range of impacts on carbon fluxes and on fish and wildlife resources, infrastructure siting and operation, and emergency response planning. In areas experiencing moderate to large effects, changes can be mapped by analyzing trends in time series of Landsat imagery from Landsat 1 through Landsat 8. The authors are performing a time-series trend analysis for over 22,000 kilometers of coastline along the Bering Sea and Gulf of Alaska. Ice- and cloud-free Landsat imagery from Landsat 1-8, covering 1972-2013, was analyzed using a combination of regression, changepoint detection, and classification tree approaches to detect, classify, and map changes in near-infrared reflectance. The analysis identified areas with significant changes in coastal features, the timing of the dominant changes, and, in some cases, rates of change. The approach captured many coastal changes over the 42-year study period, including coastal erosion exceeding the 60-m pixel resolution of the Multispectral Scanner (MSS) data and migrations of coastal spits and estuarine channels.
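A minimal sketch of per-pixel trend and changepoint analysis of the kind described above, on a synthetic near-infrared reflectance series; the step year, magnitudes, and the brute-force changepoint rule are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np

def linear_trend(t, y):
    # ordinary least-squares slope and intercept
    slope, intercept = np.polyfit(t, y, 1)
    return slope, intercept

def single_changepoint(y):
    # index minimising the total squared deviation of a
    # two-segment constant-mean model (brute force)
    n = len(y)
    best_k, best_cost = None, np.inf
    for k in range(2, n - 2):
        cost = k * np.var(y[:k]) + (n - k) * np.var(y[k:])
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k

years = np.arange(1972, 2014)
# synthetic NIR reflectance: stable shoreline, then an abrupt
# erosion-like drop in 1990, plus a small periodic ripple
nir = np.where(years < 1990, 0.35, 0.18) + 0.01 * np.sin(years)
slope, intercept = linear_trend(years, nir)
k = single_changepoint(nir)   # index of the detected break
```

In a real pipeline the per-pixel trend, break timing, and segment means would then feed a classification step to label the type of coastal change.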
Hirano, Shoji; Tsumoto, Shusaku
2002-01-01
In this paper, we present an analysis method for time-series laboratory examination data based on multiscale matching and rough clustering. We obtain the similarity of sequences by multiscale matching, which compares two sequences across various scales of view. Its advantage is that the connectivity of segments is preserved in the matching results even when the partial segments are obtained from different scales. Given the relative similarity of the sequences, we cluster them by a rough-set-based clustering technique, which groups the sequences based on their indiscernibility and can produce interpretable clusters without calculating the centroid or variance of a cluster. In the experiments we demonstrate that the features of patterns were successfully captured by this hybrid approach.
Nader, T; Rothenberg, S; Averbach, R; Charles, B; Fields, J Z; Schneider, R H
2000-01-01
Approximately 40% of the US population report using complementary and alternative medicine, including Maharishi Vedic Medicine (MVM), a traditional, comprehensive system of natural medicine, for relief from chronic and other disorders. Although many reports suggest health benefits from individual MVM techniques, reports on integrated holistic approaches are rare. This case series, designed to investigate the effectiveness of an integrated, multimodality MVM program in an ideal clinical setting, describes the outcomes in four patients: one with sarcoidosis; one with Parkinson's disease; a third with renal hypertension; and a fourth with diabetes/essential hypertension/anxiety disorder. Standard symptom reports and objective markers of disease were evaluated before, during, and after the treatment period. Results suggested substantial improvements as indicated by reductions in major signs, symptoms, and use of conventional medications in the four patients during the 3-week in-residence treatment phase and continuing through the home follow-up program.
On statistical approaches to climate change analysis
NASA Astrophysics Data System (ADS)
Lee, Terry Chun Kit
Evidence for a human contribution to climatic changes during the past century is accumulating rapidly. Given the strength of the evidence, it seems natural to ask whether forcing projections can be used to forecast climate change. A Bayesian method for post-processing forced climate model simulations that produces probabilistic hindcasts of inter-decadal temperature changes on large spatial scales is proposed. Hindcasts produced for the last two decades of the 20th century are shown to be skillful. The suggestion that skillful decadal forecasts can be produced on large regional scales by exploiting the response to anthropogenic forcing provides additional evidence that anthropogenic change in the composition of the atmosphere has influenced our climate. In the absence of large negative volcanic forcing on the climate system (which cannot presently be forecast), the global mean temperature for the decade 2000-2009 is predicted to lie above the 1970-1999 normal with probability 0.94. The global mean temperature anomaly for this decade relative to 1970-1999 is predicted to be 0.35°C (5-95% confidence range: 0.21°C--0.48°C). Reconstruction of temperature variability of the past centuries using climate proxy data can also provide important information on the role of anthropogenic forcing in the observed 20th century warming. A state-space model approach that allows incorporation of additional non-temperature information, such as the estimated response to external forcing, to reconstruct historical temperature is proposed. An advantage of this approach is that it permits simultaneous reconstruction and detection analysis as well as future projection. A difficulty in using this approach is that estimation of several unknown state-space model parameters is required. To take advantage of the data structure in the reconstruction problem, the existing parameter estimation approach is modified, resulting in two new estimation approaches. The competing estimation approaches
Time-Series Analyses of Air Pollution and Mortality in the United States: A Subsampling Approach
McClellan, Roger O.; Dewanji, Anup; Turim, Jay; Luebeck, E. Georg; Edwards, Melanie
2012-01-01
Background: Hierarchical Bayesian methods have been used in previous papers to estimate national mean effects of air pollutants on daily deaths in time-series analyses. Objectives: We obtained maximum likelihood estimates of the common national effects of the criteria pollutants on mortality based on time-series data from ≤ 108 metropolitan areas in the United States. Methods: We used a subsampling bootstrap procedure to obtain the maximum likelihood estimates and confidence bounds for common national effects of the criteria pollutants, as measured by the percentage increase in daily mortality associated with a unit increase in daily 24-hr mean pollutant concentration on the previous day, while controlling for weather and temporal trends. We considered five pollutants [PM10, ozone (O3), carbon monoxide (CO), nitrogen dioxide (NO2), and sulfur dioxide (SO2)] in single- and multipollutant analyses. Flexible ambient concentration–response models for the pollutant effects were considered as well. We performed limited sensitivity analyses with different degrees of freedom for time trends. Results: In single-pollutant models, we observed significant associations of daily deaths with all pollutants. The O3 coefficient was highly sensitive to the degree of smoothing of time trends. Among the gases, SO2 and NO2 were most strongly associated with mortality. The flexible ambient concentration–response curve for O3 showed evidence of nonlinearity and a threshold at about 30 ppb. Conclusions: Differences between the results of our analyses and those reported from using the Bayesian approach suggest that estimates of the quantitative impact of pollutants depend on the choice of statistical approach, although results are not directly comparable because they are based on different data. In addition, the estimate of the O3-mortality coefficient depends on the amount of smoothing of time trends. PMID:23108284
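The subsampling idea can be sketched as follows. This is an illustrative stand-in (percentile bounds over subsample means of synthetic city-level effect estimates), not the paper's likelihood-based procedure; the effect sizes and subsample size `b` are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def subsample_ci(estimates_by_city, b, n_rep=2000, alpha=0.05):
    """Subsampling bootstrap: repeatedly draw b of the city-level
    estimates without replacement, pool each draw into a mean, and
    take percentile bounds over the pooled means."""
    est = np.asarray(estimates_by_city)
    means = np.array([rng.choice(est, size=b, replace=False).mean()
                      for _ in range(n_rep)])
    lo, hi = np.quantile(means, [alpha / 2, 1 - alpha / 2])
    return est.mean(), lo, hi

# synthetic city-level % increases in daily mortality per unit pollutant
city_effects = rng.normal(0.4, 0.2, size=108)
point, lo, hi = subsample_ci(city_effects, b=40)
```

The width of the percentile interval shrinks with the subsample size `b`, which in formal subsampling must grow more slowly than the number of cities.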
Modeling Philippine Stock Exchange Composite Index Using Time Series Analysis
NASA Astrophysics Data System (ADS)
Gayo, W. S.; Urrutia, J. D.; Temple, J. M. F.; Sandoval, J. R. D.; Sanglay, J. E. A.
2015-06-01
This study was conducted to develop a time series model of the Philippine Stock Exchange Composite Index (PSEi) and its volatility using a finite mixture of ARIMA models with conditional variance equations such as the ARCH, GARCH, EGARCH, TARCH and PARCH models. The study also aimed to find the reason behind the behavior of the PSEi, that is, which of the economic variables - Consumer Price Index, crude oil price, foreign exchange rate, gold price, interest rate, money supply, price-earnings ratio, Producers' Price Index and terms of trade - can be used in projecting future values of the PSEi; this was examined using the Granger causality test. The findings showed that the best time series model for the PSEi is ARIMA(1,1,5)-ARCH(1). Consumer Price Index, crude oil price and foreign exchange rate were concluded to Granger-cause the Philippine Stock Exchange Composite Index.
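A bare-bones Granger-causality check of the kind used above can be written directly from its OLS F-test definition; the synthetic driver series and lag order below are assumptions for illustration, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)

def granger_f(y, x, lags=2):
    """F-statistic for H0: lagged values of x add no predictive power
    for y beyond y's own lags (a minimal Granger-causality check)."""
    n = len(y)
    T = n - lags
    Y = y[lags:]
    Ly = np.column_stack([y[lags - j:n - j] for j in range(1, lags + 1)])
    Lx = np.column_stack([x[lags - j:n - j] for j in range(1, lags + 1)])
    def rss(X):
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        r = Y - X @ beta
        return r @ r
    ones = np.ones((T, 1))
    rss_r = rss(np.hstack([ones, Ly]))          # restricted: y lags only
    rss_u = rss(np.hstack([ones, Ly, Lx]))      # unrestricted: + x lags
    k_u = 1 + 2 * lags
    return ((rss_r - rss_u) / lags) / (rss_u / (T - k_u))

# synthetic example: x drives y with a one-step lag
x = rng.normal(size=500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.normal()

f_xy = granger_f(y, x)   # large: x Granger-causes y
f_yx = granger_f(x, y)   # small: y does not cause x
```

Comparing the statistic against an F(lags, T - k_u) critical value gives the usual accept/reject decision.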
Catchment classification based on a comparative analysis of time series of natural tracers
NASA Astrophysics Data System (ADS)
Lehr, Christian; Lischeid, Gunnar; Tetzlaff, Doerthe
2014-05-01
Catchments not only smooth the precipitation signal into the discharge hydrograph, but also transform chemical signals (e.g. contaminations or nutrients) in a characteristic way. Under the assumption of an approximately homogeneous input signal of a conservative tracer in the catchment, the transformation of the signal at different locations can be used to infer hydrological properties of the catchment. For this study, comprehensive data on geology, soils, topography, land use, etc. as well as hydrological knowledge about transit times, mixing ratio of base flow, etc. is available for the catchment of the river Dee (1849 km2) in Scotland, UK. The Dee has its origin in the Cairngorm Mountains in central Scotland and flows towards the eastern coast of Scotland, where it enters the North Sea at Aberdeen. From the source in the west to the coast in the east there is a distinct decrease in precipitation and altitude. For one year, water quality in the Dee was sampled biweekly at 59 sites along the main stem of the river and the outflows of a number of tributaries. A nonlinear variant of Principal Component Analysis (Isometric Feature Mapping) was applied to time series of different chemical parameters that were assumed to be relatively conservative and applicable as natural tracers. Here, the information in the time series was not used to analyse the temporal development at the different sites; instead, in a snapshot-like approach, the spatial expression of the different solutes at the 26 sampling dates was analysed. For all natural tracers the first component captured > 89% of the variance in the series. Subsequently, the spatial expression of the first component was related to the spatial patterns of the catchment characteristics. The presented approach allows a spatially discrete characterisation of the hydrologically active properties of a catchment at the landscape scale, which is often the scale of interest for water management purposes.
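The snapshot-style use of Isometric Feature Mapping can be sketched with scikit-learn's Isomap on a site-by-date tracer matrix; the smooth downstream gradient below is an assumed toy structure, not the Dee data.

```python
import numpy as np
from sklearn.manifold import Isomap

rng = np.random.default_rng(0)

# rows: 59 sampling sites ordered downstream; columns: 26 sampling
# dates of one tracer; a smooth downstream gradient plus noise
position = np.linspace(0.0, 1.0, 59)[:, None]
tracer = 10.0 * position + rng.normal(0.0, 0.1, size=(59, 26))

# one nonlinear component summarising the spatial expression
embedding = Isomap(n_neighbors=6, n_components=1).fit_transform(tracer)
corr = np.corrcoef(embedding[:, 0], position[:, 0])[0, 1]
```

If the tracer field is dominated by one hydrological gradient, the first Isomap component should order the sites along it, which is what the correlation with the downstream position checks.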
Multifractal analysis of time series generated by discrete Ito equations
Telesca, Luciano; Czechowski, Zbigniew; Lovallo, Michele
2015-06-15
In this study, we show that discrete Ito equations with short-tail Gaussian marginal distribution function generate multifractal time series. The multifractality is due to the nonlinear correlations, which are hidden in Markov processes and are generated by the interrelation between the drift and the multiplicative stochastic forces in the Ito equation. A link between the range of the generalized Hurst exponents and the mean of the squares of all averaged net forces is suggested.
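A hedged sketch of the setup described above: a discrete Ito equation with linear restoring drift and multiplicative noise, and a structure-function estimate of the generalized Hurst exponent. The parameter values and lag range are illustrative assumptions, not those of the study.

```python
import numpy as np

rng = np.random.default_rng(2)

def discrete_ito(n, dt=0.01, a=1.0, b0=1.0, b1=0.5):
    """x_{i+1} = x_i - a x_i dt + (b0 + b1 |x_i|) sqrt(dt) xi_i:
    linear drift plus state-dependent (multiplicative) Gaussian noise."""
    x = np.zeros(n)
    for i in range(n - 1):
        drift = -a * x[i]
        diffusion = b0 + b1 * abs(x[i])
        x[i + 1] = x[i] + drift * dt + diffusion * np.sqrt(dt) * rng.normal()
    return x

def generalized_hurst(x, q, taus=range(1, 20)):
    # structure functions: S_q(tau) = <|x(t+tau)-x(t)|^q> ~ tau^(q H(q))
    taus = list(taus)
    logs = [np.log(np.mean(np.abs(x[tau:] - x[:-tau]) ** q)) for tau in taus]
    slope = np.polyfit(np.log(taus), logs, 1)[0]
    return slope / q

x = discrete_ito(20000)
h1 = generalized_hurst(x, 1)
h2 = generalized_hurst(x, 2)
```

A spread of H(q) across moments q is the multifractal signature; at short lags this mean-reverting example stays close to diffusive scaling.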
Dynamical Analysis and Visualization of Tornadoes Time Series
2015-01-01
In this paper we analyze the behavior of tornado time series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series spanning 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns. PMID:25790281
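The impulse-train model and spectral fit can be sketched as below. For a memoryless (uncorrelated) impulse train the fitted exponent should be near zero; deviations from that would signal long-range memory. The event count and the Pareto size law are assumptions, not the tornado record.

```python
import numpy as np

rng = np.random.default_rng(3)

# 64 years of daily data; events as impulses with size-proportional amplitude
n_days = 64 * 365
series = np.zeros(n_days)
days = rng.choice(n_days, size=3000, replace=False)
series[days] = 1.0 + rng.pareto(2.0, size=3000)   # heavy-tailed sizes

amp = np.abs(np.fft.rfft(series))[1:]             # drop the DC bin
freq = np.fft.rfftfreq(n_days)[1:]

# fit |S(f)| ~ f^beta in log-log coordinates
beta = np.polyfit(np.log(freq), np.log(amp), 1)[0]
```

Here the exponent beta is the parameter read off as the "signature" of the dynamics; clustered or correlated event times would tilt the spectrum away from flat.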
Financial time series analysis based on information categorization method
NASA Astrophysics Data System (ADS)
Tian, Qiang; Shang, Pengjian; Feng, Guochen
2014-12-01
The paper applies the information categorization method to the analysis of financial time series. The method examines the similarity of different sequences by calculating the distances between them. We apply it to quantify the similarity of different stock markets, and we report results for the US and Chinese stock markets in the periods 1991-1998 (before the Asian currency crisis), 1999-2006 (after the Asian currency crisis and before the global financial crisis), and 2007-2013 (during and after the global financial crisis). The results show how the similarity between stock markets differs across time periods, and that the similarity of the two stock markets became larger after these two crises. We also obtain similarity results for 10 stock indices in three areas, showing that the method can distinguish the markets of different areas in the resulting phylogenetic trees. The results show that satisfactory information can be obtained from financial markets with this method. The information categorization method can be applied not only to physiologic time series but also to financial time series.
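A toy version of the similarity-to-tree pipeline, using a correlation distance as a stand-in for the paper's information-categorization distance; the synthetic "markets" and the two-region structure are assumptions for illustration.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(4)

# toy daily-return series: three independent markets and three that
# share a common regional driver
independent = [rng.normal(size=500) for _ in range(3)]
driver = rng.normal(size=500)
regional = [driver + 0.3 * rng.normal(size=500) for _ in range(3)]
series = independent + regional

# correlation distance as a simple sequence-similarity proxy
m = len(series)
D = np.zeros((m, m))
for i in range(m):
    for j in range(i + 1, m):
        d = 1.0 - np.corrcoef(series[i], series[j])[0, 1]
        D[i, j] = D[j, i] = d

# hierarchical tree (the "phylogenetic tree" analogue) and two clusters
tree = linkage(squareform(D, checks=False), method="average")
labels = fcluster(tree, t=2, criterion="maxclust")
```

The three driver-sharing series merge early in the tree and stay in one cluster, which is the behaviour the abstract describes for markets of the same area.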
ERIC Educational Resources Information Center
Towgood, Karren J.; Meuwese, Julia D. I.; Gilbert, Sam J.; Turner, Martha S.; Burgess, Paul W.
2009-01-01
In the neuropsychological case series approach, tasks are administered that tap different cognitive domains, and differences within rather than across individuals are the basis for theorising; each individual is effectively their own control. This approach is a mainstay of cognitive neuropsychology, and is particularly suited to the study of…
The R-package eseis - A toolbox to weld geomorphic, seismologic, spatial, and time series analysis
NASA Astrophysics Data System (ADS)
Dietze, Michael
2017-04-01
Environmental seismology is the science of investigating the seismic signals that are emitted by Earth surface processes. This emerging field provides unique opportunities to identify, locate, track and inspect a wide range of the processes that shape our planet. Modern broadband seismometers are sensitive enough to detect signals from sources as weak as wind interacting with the ground and as powerful as collapsing mountains. This places the field of environmental seismology at the seams of many geoscientific disciplines and requires integration of a series of specialised analysis techniques. R provides the perfect environment for this challenge. The package eseis uses the foundations laid by a series of existing packages and data types tailored to solve specialised problems (e.g., signal, sp, rgdal, Rcpp, matrixStats) and thus provides access to efficiently handling large streams of seismic data (> 300 million samples per station and day). It supports standard data formats (mseed, sac), preparation techniques (deconvolution, filtering, rotation), processing methods (spectra, spectrograms, event picking, migration for localisation) and data visualisation. Thus, eseis provides a seamless approach to the entire workflow of environmental seismology and passes the output to related analysis fields with temporal, spatial and modelling focus in R.
Cross-recurrence quantification analysis of categorical and continuous time series: an R package
Coco, Moreno I.; Dale, Rick
2014-01-01
This paper describes the R package crqa to perform cross-recurrence quantification analysis of two time series of either a categorical or continuous nature. Streams of behavioral information, from eye movements to linguistic elements, unfold over time. When two people interact, such as in conversation, they often adapt to each other, leading these behavioral levels to exhibit recurrent states. In dialog, for example, interlocutors adapt to each other by exchanging interactive cues: smiles, nods, gestures, choice of words, and so on. In order for us to capture closely the goings-on of dynamic interaction, and uncover the extent of coupling between two individuals, we need to quantify how much recurrence is taking place at these levels. Methods available in crqa would allow researchers in cognitive science to pose such questions as how much are two people recurrent at some level of analysis, what is the characteristic lag time for one person to maximally match another, or whether one person is leading another. First, we set the theoretical ground to understand the difference between “correlation” and “co-visitation” when comparing two time series, using an aggregative or cross-recurrence approach. Then, we describe more formally the principles of cross-recurrence, and show with the current package how to carry out analyses applying them. We end the paper by comparing computational efficiency, and results’ consistency, of crqa R package, with the benchmark MATLAB toolbox crptoolbox (Marwan, 2013). We show perfect comparability between the two libraries on both levels. PMID:25018736
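The core cross-recurrence computation (without embedding, and not the crqa API itself) can be sketched for two scalar series; the sinusoidal toy signals and radius are assumptions.

```python
import numpy as np

def cross_recurrence(x, y, radius):
    """Binary cross-recurrence matrix: 1 where x_i is within
    `radius` of y_j (scalar series, no delay embedding)."""
    return (np.abs(x[:, None] - y[None, :]) < radius).astype(int)

t = np.linspace(0, 8 * np.pi, 400)
x = np.sin(t)
y = np.sin(t + 0.5)                      # same signal, shifted in time
cr = cross_recurrence(x, y, radius=0.1)
rr = cr.mean()                           # recurrence rate

# the diagonal with maximal recurrence reveals the characteristic lag
offsets = list(range(-30, 31))
profile = [np.diagonal(cr, offset=k).mean() for k in offsets]
best_offset = offsets[int(np.argmax(profile))]
```

This diagonal profile is the quantity behind questions like "what is the characteristic lag time for one person to maximally match another".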
Class D Management Implementation Approach of the First Orbital Mission of the Earth Venture Series
NASA Technical Reports Server (NTRS)
Wells, James E.; Scherrer, John; Law, Richard; Bonniksen, Chris
2013-01-01
A key element of the National Research Council's Earth Science and Applications Decadal Survey called for the creation of the Venture Class line of low-cost research and application missions within NASA (National Aeronautics and Space Administration). One key component of the architecture chosen by NASA within the Earth Venture line is a series of self-contained stand-alone spaceflight science missions called "EV-Mission". The first mission chosen for this competitively selected, cost and schedule capped, Principal Investigator-led opportunity is the CYclone Global Navigation Satellite System (CYGNSS). As specified in the defining Announcement of Opportunity, the Principal Investigator is held responsible for successfully achieving the science objectives of the selected mission and has a significant amount of freedom in the management approach chosen to obtain those results, as long as it meets the intent of key NASA guidance like NPR 7120.5 and 7123. CYGNSS is classified under NPR 7120.5E guidance as a Category 3 (low priority, low cost) mission and carries a Class D risk classification (low priority, high risk) per NPR 8705.4. As defined in the NPR guidance, Class D risk classification allows for a relatively broad range of implementation strategies. The management approach that will be utilized on CYGNSS is a streamlined implementation that starts with a higher risk tolerance posture at NASA, a philosophy that flows all the way down to the individual part level.
Acoustic neuroma surgery as an interdisciplinary approach: a neurosurgical series of 508 patients
Tonn, J.; Schlake, H.; Goldbrunner, R.; Milewski, C.; Helms, J.; Roosen, K.
2000-01-01
OBJECTIVES: To evaluate an interdisciplinary concept (neurosurgery/ear, nose, and throat (ENT)) of treating acoustic neuromas with extrameatal extension via the retromastoidal approach. To analyse whether monitoring both facial nerve EMG and BAEP improved the functional outcome in acoustic neuroma surgery. METHODS: In a series of 508 patients consecutively operated on over a period of 7 years, functional outcome of the facial nerve was evaluated according to the House/Brackmann scale and hearing preservation was classified using the Gardner/Robertson system. RESULTS: Facial monitoring (396 of 508 operations) and continuous BAEP recording (229 of 399 cases with preserved hearing preoperatively) were performed routinely. With intraoperative monitoring, the rate of excellent/good facial nerve function (House/Brackmann I-II) was 88.7%. Good functional hearing (Gardner/Robertson 1-3) was preserved in 39.8%. CONCLUSION: Acoustic neuroma surgery via a retrosigmoidal approach is a safe and effective treatment for tumours with extrameatal extension. Functional results can be substantially improved by intraoperative monitoring. The interdisciplinary concept of surgery performed by ENT and neurosurgeons was particularly convincing as each pathoanatomical phase of the operation is performed by a surgeon best acquainted with the regional specialties. PMID:10896686
Approaches to Enhance Sensemaking for Intelligence Analysis
2002-05-17
This essay describes four approaches to enhance sensemaking for intelligence analysis. Sensemaking refers to how individuals, groups, and...productively with others. Each approach is explained from a sensemaking perspective and linked to Richard Heuer's Psychology of Intelligence Analysis. Examples
NASA Astrophysics Data System (ADS)
Gul, Mustafa; Catbas, F. Necati
2011-03-01
This study presents a novel time series analysis methodology to detect, locate, and estimate the extent of structural changes (e.g. damage). In this methodology, ARX models (Auto-Regressive models with eXogenous input) are created for different sensor clusters by using the free response of the structure. The output of each sensor in a cluster is used as an input to the ARX model to predict the output of the reference channel of that sensor cluster. Two different approaches are used for extracting Damage Features (DFs) from these ARX models. In the first approach, the coefficients of the ARX models are directly used as the DFs. It is shown with a 4-DOF numerical model that damage can be identified, located and quantified for simple models and noise-free data. To consider the effects of noise and model complexity, a second approach is presented based on using the ARX model fit ratios as the DFs. The second approach is first applied to the same 4-DOF numerical model and to the numerical data coming from an international benchmark study for noisy conditions. Then, the methodology is applied to experimental data from a large scale laboratory model. It is shown that the second approach performs successfully for different damage cases to identify and locate the damage using numerical and experimental data. Furthermore, it is observed that the DF level is a good indicator for estimating the extent of the damage for these cases. The potential and advantages of the methodology are discussed along with the analysis results. The limitations of the methodology, recommendations, and future work are also addressed.
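A hedged sketch of the second approach: an ARX model fit by least squares, with the fit ratio used as the damage feature. The filter coefficients and noise level defining the "healthy" and "damaged" channels are illustrative assumptions, not the benchmark data.

```python
import numpy as np

rng = np.random.default_rng(5)

def arx_fit_ratio(u, y, na=4, nb=4):
    """Least-squares ARX fit of y from its own past (na lags) and the
    input u's past (nb lags); returns the fit ratio used as a DF."""
    k = max(na, nb)
    rows = [np.concatenate((y[t - na:t][::-1], u[t - nb:t][::-1]))
            for t in range(k, len(y))]
    X, Y = np.array(rows), y[k:]
    theta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ theta
    return 1.0 - np.linalg.norm(resid) / np.linalg.norm(Y - Y.mean())

u = rng.normal(size=2000)
# "healthy" reference channel: a fixed causal filter of the input
y_healthy = np.convolve(u, [0.0, 0.5, 0.3, 0.2])[:2000]
# "damaged": changed dynamics plus extra unmodelled noise
y_damaged = (np.convolve(u, [0.0, 0.2, 0.6, 0.1])[:2000]
             + 0.5 * rng.normal(size=2000))

ratio_h = arx_fit_ratio(u, y_healthy)
ratio_d = arx_fit_ratio(u, y_damaged)
```

A drop in the fit ratio of a cluster's reference channel flags a change in the input-output relationship, i.e. possible damage near that cluster.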
Time-Series Analysis of Supergranule Characteristics at Solar Minimum
NASA Technical Reports Server (NTRS)
Williams, Peter E.; Pesnell, W. Dean
2013-01-01
Sixty days of Doppler images from the Solar and Heliospheric Observatory (SOHO) / Michelson Doppler Imager (MDI) investigation during the 1996 and 2008 solar minima have been analyzed to show that certain supergranule characteristics (size, size range, and horizontal velocity) exhibit fluctuations of three to five days. Cross-correlating parameters showed a good, positive correlation between supergranulation size and size range, and a moderate, negative correlation between size range and velocity. The size and velocity do exhibit a moderate, negative correlation, but with a small time lag (less than 12 hours). Supergranule sizes during five days of co-temporal data from MDI and the Solar Dynamics Observatory (SDO) / Helioseismic Magnetic Imager (HMI) exhibit similar fluctuations with a high level of correlation between them. This verifies the solar origin of the fluctuations, which cannot be caused by instrumental artifacts according to these observations. Similar fluctuations are also observed in data simulations that model the evolution of the MDI Doppler pattern over a 60-day period. Correlations between the supergranule size and size range time-series derived from the simulated data are similar to those seen in MDI data. A simple toy-model using cumulative, uncorrelated exponential growth and decay patterns at random emergence times produces a time-series similar to the data simulations. The qualitative similarities between the simulated and the observed time-series suggest that the fluctuations arise from stochastic processes occurring within the solar convection zone. This behavior, propagating to surface manifestations of supergranulation, may assist our understanding of magnetic-field-line advection, evolution, and interaction.
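Lagged cross-correlation of the kind used above can be sketched directly; the 4-day sinusoidal toy series and 6-hour cadence are assumptions standing in for the MDI parameter time series.

```python
import numpy as np

def lagged_correlation(a, b, max_lag):
    """Pearson correlation of a[t] against b[t + lag] for each lag."""
    lags = list(range(-max_lag, max_lag + 1))
    cc = []
    for lag in lags:
        if lag >= 0:
            x1, x2 = a[:len(a) - lag], b[lag:]
        else:
            x1, x2 = a[-lag:], b[:lag]
        cc.append(np.corrcoef(x1, x2)[0, 1])
    return lags, cc

t = np.arange(0.0, 60.0, 0.25)                   # 60 days, 6-hour cadence
size = np.sin(2 * np.pi * t / 4.0)               # ~4-day fluctuation
velocity = -np.sin(2 * np.pi * (t - 0.5) / 4.0)  # anti-correlated, lagged

lags, cc = lagged_correlation(size, velocity, max_lag=8)
best_lag = lags[int(np.argmin(cc))]              # in 6-hour samples
```

The location of the strongest (most negative) correlation gives the small time lag between size and velocity reported in the abstract.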
Analysis of a Series of Electromagnetic Launcher Firings,
1987-06-01
The time-of-arrival record for the second firing in the RPIP series (RPIP02) is presented in Figure 1. After the projectile leaves the gun-barrel, the...arc, it is subject to a Lorentz force. It should be noted that these effects were first reported by Stainsby and Bedford (8). A comparison of the..."explosive" propulsion force (2) is initially greater than the Lorentz (I x l) force. When the Lorentz force begins to dominate in a firing, the code
Surrogate-assisted network analysis of nonlinear time series
NASA Astrophysics Data System (ADS)
Laut, Ingo; Räth, Christoph
2016-10-01
The performance of recurrence networks and symbolic networks to detect weak nonlinearities in time series is compared to the nonlinear prediction error. For the synthetic data of the Lorenz system, the network measures show a comparable performance. In the case of relatively short and noisy real-world data from active galactic nuclei, the nonlinear prediction error yields more robust results than the network measures. The tests are based on surrogate data sets. The correlations in the Fourier phases of data sets from some surrogate generating algorithms are also examined. The phase correlations are shown to have an impact on the performance of the tests for nonlinearity.
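Surrogate tests of this kind rest on generating data with the same linear (second-order) properties as the original series but randomized Fourier phases. A minimal sketch of such a phase-randomized surrogate generator, in NumPy (the function name and interface are illustrative, not from the paper):

```python
import numpy as np

def fourier_surrogate(x, rng):
    """Surrogate with the same amplitude spectrum as x but random phases."""
    n = len(x)
    amplitudes = np.abs(np.fft.rfft(x))
    phases = rng.uniform(0.0, 2.0 * np.pi, amplitudes.size)
    phases[0] = 0.0              # keep the DC (mean) component real
    if n % 2 == 0:
        phases[-1] = 0.0         # the Nyquist bin must also stay real
    return np.fft.irfft(amplitudes * np.exp(1j * phases), n)
```

Nonlinearity is then judged by whether a discriminating statistic (e.g., a nonlinear prediction error or a network measure) computed on the data falls outside the distribution of that statistic over many surrogates.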
Divergence analysis of atomic ionization processes and isoelectronic series
Lopez-Rosa, S.; Angulo, J. C.; Antolin, J.; Esquivel, R. O.
2009-07-15
Fisher divergences (FDs) and Jensen-Shannon divergences (JSDs) are used in this work to quantify the informational discrepancies between the one-particle electron densities of neutral atoms, singly charged ions, and isoelectronic series. These dissimilarity magnitudes, computed for a set of 319 atomic systems in both position and momentum spaces, provide relevant information concerning pattern, structure, and periodicity properties of the ionization processes. In particular, an apparent correlation between extremal values of the atomic ionization potential and the divergences is found. Results are compared with those obtained by quantum similarity techniques.
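For discretized densities, the Jensen-Shannon divergence used in comparisons like these is straightforward to compute. A small illustrative sketch (the function name and the binning of the densities are mine, not the authors'):

```python
import numpy as np

def jensen_shannon(p, q):
    """JSD between two discrete probability distributions (natural log)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p /= p.sum()
    q /= q.sum()
    m = 0.5 * (p + q)            # the mixture distribution

    def kl(a, b):
        mask = a > 0             # 0 * log(0) is taken as 0
        return np.sum(a[mask] * np.log(a[mask] / b[mask]))

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)
```

By construction, jensen_shannon(p, p) is 0 and the divergence is bounded above by ln 2, attained for distributions with disjoint support.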
Analysis of the in-orbit behavior of GSS of two series of the US early-warning system
NASA Astrophysics Data System (ADS)
Sukhov, P. P.; Epishev, V. P.; Sukhov, K. P.; Motrunych, I. I.
2016-09-01
Early-warning satellites of the US Air Force SBIRS series are to replace the earlier DSP series in geostationary orbit. During 2014-2016 the authors obtained more than 30 light curves of "DSP-18" and "SBIRS-GEO 2". The in-orbit behavior of these satellites was analyzed using coordinate and photometric data. It is shown that four SBIRS units placed on GEO at 90-degree intervals are sufficient for monitoring the Earth's surface.
Spectral analysis of time series of categorical variables in earth sciences
NASA Astrophysics Data System (ADS)
Pardo-Igúzquiza, Eulogio; Rodríguez-Tovar, Francisco J.; Dorador, Javier
2016-10-01
Time series of categorical variables often appear in Earth Science disciplines and there is considerable interest in studying their cyclic behavior. This is true, for example, when the type of facies, petrofabric features, ichnofabrics, fossil assemblages or mineral compositions are measured continuously over a core or throughout a stratigraphic succession. Here we deal with the problem of applying spectral analysis to such sequences. A full indicator approach is proposed to complement the spectral envelope often used in other disciplines. Additionally, a stand-alone computer program is provided for calculating the spectral envelope, in this case implementing the permutation test to assess the statistical significance of the spectral peaks. We studied simulated sequences as well as real data in order to illustrate the methodology.
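The full indicator approach can be sketched simply: each category is mapped to a 0/1 indicator series and an ordinary periodogram is computed per category. A minimal NumPy illustration (the interface is hypothetical and is not the authors' stand-alone program):

```python
import numpy as np

def indicator_periodograms(sequence):
    """Periodogram of the 0/1 indicator series of each category."""
    seq = np.asarray(sequence)
    freqs = np.fft.rfftfreq(len(seq))        # cycles per sample
    spectra = {}
    for c in np.unique(seq):
        ind = (seq == c).astype(float)
        ind -= ind.mean()                    # remove the DC component
        spectra[c] = np.abs(np.fft.rfft(ind)) ** 2 / len(seq)
    return freqs, spectra
```

A strictly alternating two-facies sequence, for example, concentrates all its power at frequency 0.5 (a period of two samples), which is the kind of cyclic signal the spectral envelope is designed to detect.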
Clarifying life lost due to cold and heat: a new approach using annual time series
Rehill, Nirandeep; Armstrong, Ben; Wilkinson, Paul
2015-01-01
Objective To clarify whether deaths associated with hot and cold days are among the frail who would have died anyway in the next few weeks or months. Design Time series regression analysis of annual deaths in relation to annual summaries of cold and heat. Setting London, UK. Participants 3 530 280 deaths from all natural causes among London residents between October 1949 and September 2006. Main outcome measures Change in annual risk of death (all natural cause, cardiovascular and respiratory) associated with each additional 1°C of average cold (or heat) below (above) the threshold (18°C) across each year. Results Cold years were associated with increased deaths from all causes. For each additional 1° of cold across the year, all-cause mortality increased by 2.3% (95% CI 0.7% to 3.8%), after adjustment for influenza and secular trends. The estimated association between hot years and all-cause mortality was very imprecise and thus inconclusive (effect estimate 1.7%, −2.9% to 6.5%). These estimates were broadly robust to changes in the way temperature and trend were modelled. Estimated risk increments using weekly data but otherwise comparable were cold: 2.0% (2.0% to 2.1%) and heat: 3.9% (3.4% to 3.8%). Conclusions In this London annual series, we saw an association of cold with mortality which was broadly similar in magnitude to that found in published daily studies and our own weekly analysis, suggesting that most deaths due to cold were among individuals who would not have died in the next 6 months. The estimated association with heat was imprecise, with the CI including magnitudes found in daily studies but also including zero. PMID:25877269
A new complexity measure for time series analysis and classification
NASA Astrophysics Data System (ADS)
Nagaraj, Nithin; Balasubramanian, Karthi; Dey, Sutirth
2013-07-01
Complexity measures are used in a number of applications, including extraction of information from data such as ecological time series, detection of non-random structure in biomedical signals, testing of random number generators, language recognition, and authorship attribution. Different complexity measures proposed in the literature, like Shannon entropy, relative entropy, Lempel-Ziv, Kolmogorov, and algorithmic complexity, are mostly ineffective in analyzing short sequences that are further corrupted with noise. To address this problem, we propose a new complexity measure, ETC, defined as the "Effort To Compress" the input sequence by a lossless compression algorithm. Here, we employ the lossless compression algorithm known as Non-Sequential Recursive Pair Substitution (NSRPS) and define ETC as the number of iterations needed for NSRPS to transform the input sequence into a constant sequence. We demonstrate the utility of ETC in two applications. ETC is shown to have a better correlation with the Lyapunov exponent than Shannon entropy, even for relatively short and noisy time series. The measure also has a greater rate of success in automatic identification and classification of short noisy sequences, compared to entropy and a popular measure based on Lempel-Ziv compression (as implemented by Gzip).
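The ETC idea can be illustrated with a compact NSRPS sketch: repeatedly replace the most frequent adjacent pair with a fresh symbol, and count the passes until the sequence is constant. This is an illustrative reading of the measure, not the authors' code; in particular, the pair counting below is simplified and does not treat overlapping pairs exactly as NSRPS does.

```python
def effort_to_compress(seq):
    """ETC: passes of pair substitution needed to reach a constant sequence."""
    seq = list(seq)
    steps = 0
    while len(seq) > 1 and len(set(seq)) > 1:
        # Tally adjacent pairs (simplified: overlaps are counted too)
        counts = {}
        for i in range(len(seq) - 1):
            pair = (seq[i], seq[i + 1])
            counts[pair] = counts.get(pair, 0) + 1
        best = max(counts, key=counts.get)   # most frequent adjacent pair
        new_symbol = object()                # fresh symbol outside the alphabet
        out, i = [], 0
        while i < len(seq):                  # left-to-right, non-overlapping
            if i < len(seq) - 1 and (seq[i], seq[i + 1]) == best:
                out.append(new_symbol)
                i += 2
            else:
                out.append(seq[i])
                i += 1
        seq = out
        steps += 1
    return steps
```

A constant sequence needs zero passes, while "0101" compresses in a single pass (01 -> X gives XX).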
Anatomy of the ICDS series: A bibliometric analysis
NASA Astrophysics Data System (ADS)
Cardona, Manuel; Marx, Werner
2007-12-01
In this article, the proceedings of the International Conferences on Defects in Semiconductors (ICDS) have been analyzed by bibliometric methods. The papers of these conferences have been published as articles in regular journals or special proceedings journals and in books with diverse publishers. The conference name/title changed several times. Many of the proceedings did not appear in the so-called “source journals” covered by the Thomson/ISI citation databases, in particular by the Science Citation Index (SCI). But the number of citations within these source journals can be determined using the Cited Reference Search mode under the Web of Science (WoS) and the SCI offered by the host STN International. The search functions of both systems were needed to select the papers published as different document types and to cover the full time span of the series. The most cited ICDS papers were identified, and the overall numbers of citations, as well as the time-dependent impact of these papers, of single conferences, and of the complete series, were established. The complete set of citing papers was analyzed with respect to the countries of the citing authors, the citing journals, and the ISI subject categories.
NASA Astrophysics Data System (ADS)
Baldysz, Zofia; Nykiel, Grzegorz; Figurski, Mariusz; Szafranek, Karolina; Kroszczynski, Krzysztof; Araszkiewicz, Andrzej
2015-04-01
In recent years, GNSS has begun to play an increasingly important role in research related to climate monitoring. Based on the GPS system, which has the longest operational history among the GNSS constellations, and on a common computational strategy applied to all observations, long and homogeneous ZTD (Zenith Tropospheric Delay) time series were derived. This paper presents results of an analysis of 16-year ZTD time series obtained from the EPN (EUREF Permanent Network) reprocessing performed by the Military University of Technology. To maintain uniformity of the data, the analyzed period (1998-2013) is exactly the same for all stations: observations carried out before 1998 were removed from the time series, and observations processed using a different strategy were recalculated according to the MUT LAC approach. For all 16-year time series (59 stations), Lomb-Scargle periodograms were created to obtain information about the oscillations in the ZTD time series. Because strong annual oscillations disturb the character of oscillations with smaller amplitudes and thus hinder their investigation, Lomb-Scargle periodograms were also created for time series with the annual oscillations removed, in order to verify the presence of semi-annual, ter-annual, and quarto-annual oscillations. The linear trend and seasonal components were estimated using LSE (Least Squares Estimation), and the Mann-Kendall trend test was used to confirm the presence of the linear trend designated by the LSE method. In order to verify the effect of the length of the time series on the estimated size of the linear trend, a comparison between two different lengths of ZTD time series was performed. To carry out this comparative analysis, 30 stations which have been operating since 1996 were selected. For these stations two periods were analyzed: a shortened 16-year period (1998-2013) and the full 18-year period (1996-2013). For some stations an additional two years of observations have significant impact on changing the size of linear
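Periodograms of long geodetic series like these are commonly computed with the Lomb-Scargle method, which handles gaps and uneven sampling. A sketch on synthetic data standing in for a ZTD series (the sampling, amplitude, and noise level are invented for illustration):

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 16 * 365.25, 2000))   # irregular 16-year sampling (days)
period = 365.25                                  # a hypothetical annual ZTD cycle
y = 5.0 * np.sin(2 * np.pi * t / period) + rng.standard_normal(t.size)

# Angular test frequencies for periods between 30 days and 8 years
periods = np.linspace(30, 8 * 365.25, 4000)
omega = 2 * np.pi / periods
power = lombscargle(t, y - y.mean(), omega)
best = periods[np.argmax(power)]                 # dominant period, in days
```

With the annual peak located this way, it can be modeled and subtracted before searching for the weaker semi-annual and shorter-period terms, mirroring the two-pass procedure described above.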
Bicomponent Trend Maps: A Multivariate Approach to Visualizing Geographic Time Series
Schroeder, Jonathan P.
2012-01-01
The most straightforward approaches to temporal mapping cannot effectively illustrate all potentially significant aspects of spatio-temporal patterns across many regions and times. This paper introduces an alternative approach, bicomponent trend mapping, which employs a combination of principal component analysis and bivariate choropleth mapping to illustrate two distinct dimensions of long-term trend variations. The approach also employs a bicomponent trend matrix, a graphic that illustrates an array of typical trend types corresponding to different combinations of scores on two principal components. This matrix is useful not only as a legend for bicomponent trend maps but also as a general means of visualizing principal components. To demonstrate and assess the new approach, the paper focuses on the task of illustrating population trends from 1950 to 2000 in census tracts throughout major U.S. urban cores. In a single static display, bicomponent trend mapping is not able to depict as wide a variety of trend properties as some other multivariate mapping approaches, but it can make relationships among trend classes easier to interpret, and it offers some unique flexibility in classification that could be particularly useful in an interactive data exploration environment. PMID:23504193
Modular Approach to Instrumental Analysis.
ERIC Educational Resources Information Center
Deming, Richard L.; And Others
1982-01-01
To remedy certain deficiencies, an instrument analysis course was reorganized into six one-unit modules: optical spectroscopy, magnetic resonance, separations, electrochemistry, radiochemistry, and computers and interfacing. Selected aspects of the course are discussed. (SK)
ERIC Educational Resources Information Center
Bloom, Howard S.
2002-01-01
Introduces a new approach for measuring the impact of whole-school reforms. The approach, based on "short" interrupted time-series analysis, is explained, its statistical procedures are outlined, and its use in the evaluation of a major whole-school reform, Accelerated Schools, is described (H. Bloom and others, 2001). (SLD)
Different approaches of spectral analysis
NASA Technical Reports Server (NTRS)
Lacoume, J. L.
1977-01-01
Several approaches to the problem of calculating the spectral power density of a random function from an estimate of the autocorrelation function were studied. A comparative study of these different methods was presented. The principles on which they are based and the hypotheses they imply were pointed out. Some indications on optimizing the length of the estimated correlation function were given. An example of the application of the different methods discussed in this paper was included.
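The autocorrelation route to the spectrum can be sketched as a Blackman-Tukey estimator: truncate and window the sample autocorrelation, then Fourier-transform it. An illustrative NumPy version (the Hann lag window and the function name are my assumptions, not choices from the report):

```python
import numpy as np

def blackman_tukey_psd(x, max_lag):
    """PSD estimate from the windowed sample autocorrelation."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    # Biased sample autocorrelation for lags 0..max_lag
    acf = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(max_lag + 1)])
    acf *= np.hanning(2 * max_lag + 1)[max_lag:]   # taper the long lags
    # Circular symmetric extension: lags 0..M followed by lags -M..-1
    r = np.concatenate([acf, acf[-1:0:-1]])
    psd = np.real(np.fft.rfft(r))                  # real for an even sequence
    freqs = np.fft.rfftfreq(len(r))                # cycles per sample
    return freqs, psd
```

The choice of max_lag is exactly the "length of the estimated correlation function" the abstract refers to: longer lags sharpen frequency resolution but raise the variance of the estimate.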
Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P
2014-10-01
There are now domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, like medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events, and compares two time series by analyzing the events they have in common. We have applied our framework to time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed schema has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains generating iconographic time series, such as electrocardiography (ECG).
Hilbert-Huang Transform Analysis of Hydrological and Environmental Time Series
NASA Astrophysics Data System (ADS)
Burr, Tom
2009-06-01
Climatic and hydrologic time series often display periodicities, and thus Fourier spectral analysis sometimes is appropriate. However, time series that are nonstationary, and also perhaps nonlinear, are not well handled by standard Fourier spectral analysis. Methods to handle nonstationarity, such as moving-time-window Fourier spectral analysis, assume linearity and have known limitations regarding the combined frequency and time resolution. For example, if the time series is stationary, then it is well known that better frequency resolution can be achieved by observing a longer time series (more time points). However, if the time series is nonstationary, then shorter time windows are required to estimate the “local in time” spectrum, analogous to using short-memory moving averages that use only the recent past few values to forecast the next value (P. Bloomfield, Fourier Analysis of Time Series: An Introduction, 2nd ed., John Wiley, 2000) because the mean value is changing over time. Therefore, in nonstationary time series analysis, there is a tension between the competing goals of time and frequency resolution. This tension is the reason that N. Huang et al. (Proc. R. Soc. A, 454, 903-995, 1998) introduced the Hilbert-Huang Transform (HHT) as an alternative to moving-time-window Fourier spectral analysis (Bloomfield, 2000).
Interrupted Time-Series Analysis with Brief Single-Subject Data.
ERIC Educational Resources Information Center
Crosbie, John
1993-01-01
Describes problems of assessing change with short time-series data: the unreliability of visual inference and the fact that current statistical procedures cannot control Type I error because they underestimate positive autocorrelation. Shows how these problems can be solved with a new interrupted time-series analysis procedure (ITSACORR) that uses more accurate…
Fourier series analysis of fractal lenses: theory and experiments with a liquid-crystal display.
Davis, Jeffrey A; Sigarlaki, Sean P; Craven, Julia M; Calvo, María Luisa
2006-02-20
We report on a Fourier series approach that predicts the focal points and intensities produced by fractal zone plate lenses. This approach allows us to separate the effects of the fractal order from those of the lens aperture. We implement these fractal lenses onto a liquid-crystal display and show experimental verification of our theory.
Fisher-Shannon analysis of ionization processes and isoelectronic series
Sen, K. D.; Antolin, J.; Angulo, J. C.
2007-09-15
The Fisher-Shannon plane which embodies the Fisher information measure in conjunction with the Shannon entropy is tested in its ability to quantify and compare the informational behavior of the process of atomic ionization. We report the variation of such an information measure and its constituents for a comprehensive set of neutral atoms, and their isoelectronic series including the mononegative ions, using the numerical data generated on 320 atomic systems in position, momentum, and product spaces at the Hartree-Fock level. It is found that the Fisher-Shannon plane clearly reveals shell-filling patterns across the periodic table. Compared to position space, a significantly higher resolution is exhibited in momentum space. Characteristic features in the Fisher-Shannon plane accompanying the ionization process are identified, and the physical reasons for the observed patterns are described.
Cluster analysis of long time-series medical datasets
NASA Astrophysics Data System (ADS)
Hirano, Shoji; Tsumoto, Shusaku
2004-04-01
This paper presents a comparative study of the characteristics of clustering methods for inhomogeneous time-series medical datasets. Using various combinations of comparison methods and grouping methods, we performed clustering experiments on the hepatitis data set and evaluated the validity of the results. The results suggested that (1) the complete-linkage (CL) criterion in agglomerative hierarchical clustering (AHC) outperformed the average-linkage (AL) criterion in terms of the interpretability of the dendrogram and clustering results, (2) the combination of dynamic time warping (DTW) and CL-AHC consistently produced interpretable results, (3) the combination of DTW and rough clustering (RC) can be used to find the core sequences of the clusters, and (4) multiscale matching may suffer from the treatment of 'no-match' pairs; however, the problem may be avoided by using RC as a subsequent grouping method.
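Dynamic time warping, one of the comparison methods above, can be sketched with the textbook dynamic program (O(nm), no windowing constraint; a minimal illustration, not the paper's implementation):

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Best of: insertion, deletion, or match along the warping path
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

Unlike Euclidean distance, DTW aligns sequences that evolve at different rates: a slow step [0, 0, 1, 1] and a fast step [0, 1] are at distance zero.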
CADDIS Volume 4. Data Analysis: Selecting an Analysis Approach
An approach for selecting statistical analyses to inform causal analysis. Describes methods for determining whether test site conditions differ from reference expectations. Describes an approach for estimating stressor-response relationships.
Joint analysis of celestial pole offset and free core nutation series
NASA Astrophysics Data System (ADS)
Malkin, Zinovy
2016-10-01
Three combined celestial pole offset (CPO) series computed at the Paris Observatory (C04), the United States Naval Observatory (USNO), and the International VLBI Service for Geodesy and Astrometry (IVS), as well as six free core nutation (FCN) models, were compared from different perspectives, such as stochastic and systematic differences, and FCN amplitude and phase variations. The differences between the C04 and IVS CPO series were mostly stochastic, whereas a low-frequency bias at the level of several tens of μas was found between the C04 and USNO CPO series. The stochastic differences between the C04 and USNO series became considerably smaller when computed at the IVS epochs, which can indicate possible problems with the interpolation of the IVS data at the midnight epochs during the computation of the C04 and USNO series. The comparison of the FCN series showed that the series computed with similar window widths of 1.1-1.2 years were close to one another at a level of 10-20 μas, whereas the differences between these series and the series computed with a larger window width of 4 and 7 years reached 100 μas. The dependence of the FCN model on the underlying CPO series was investigated. The RMS differences between the FCN models derived from the C04, USNO, and IVS CPO series were at a level of approximately 15 μas, which was considerably smaller than the differences among the CPO series. The analysis of the differences between the IVS, C04, and USNO CPO series suggested that the IVS series would be preferable for both precession-nutation and FCN-related studies.
Multiscale InSAR Time Series (MInTS) analysis of surface deformation
NASA Astrophysics Data System (ADS)
Hetland, E. A.; Muse, P.; Simons, M.; Lin, Y. N.; Agram, P. S.; DiCaprio, C. J.
2011-12-01
We present a new approach to extracting spatially and temporally continuous ground deformation fields from interferometric synthetic aperture radar (InSAR) data. We focus on unwrapped interferograms from a single viewing geometry, estimating ground deformation along the line-of-sight. Our approach is based on a wavelet decomposition in space and a general parametrization in time. We refer to this approach as MInTS (Multiscale InSAR Time Series). The wavelet decomposition efficiently deals with commonly seen spatial covariances in repeat-pass InSAR measurements, such that coefficients of the wavelets are essentially spatially uncorrelated. Our time-dependent parametrization is capable of capturing both recognized and unrecognized processes, and is not arbitrarily tied to the times of the SAR acquisitions. We estimate deformation in the wavelet-domain, using a cross-validated, regularized least-squares inversion. We include a model-resolution-based regularization, in order to more heavily damp the model during periods of sparse SAR acquisitions, compared to during times of dense acquisitions. To illustrate the application of MInTS, we consider a catalog of 92 ERS and Envisat interferograms, spanning 16 years, in the Long Valley caldera, CA, region. MInTS analysis captures the ground deformation with high spatial density over the Long Valley region.
Higa, Carlos Ha; Louzada, Vitor Hp; Andrade, Tales P; Hashimoto, Ronaldo F
2011-05-28
A popular model for gene regulatory networks is the Boolean network model. In this paper, we propose an algorithm to analyze gene regulatory interactions using the Boolean network model and time-series data. The Boolean networks we consider are restricted in the sense that only a subset of all possible Boolean functions is allowed. We explore some mathematical properties of these restricted Boolean networks in order to avoid a full search. The problem is modeled as a Constraint Satisfaction Problem (CSP), and CSP techniques are used to solve it. We applied the proposed algorithm to two data sets. First, we used an artificial dataset obtained from a model of the budding yeast cell cycle. The second data set is derived from experiments performed on HeLa cells. The results show that some interactions can be fully or at least partially determined under the Boolean model considered. The proposed algorithm can be used as a first step in the detection of gene/protein interactions. It is able to infer gene relationships from time-series data of gene expression, and this inference process can be aided by a priori knowledge.
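The flavor of the constraint-based search can be illustrated by brute-force enumeration for a single gene: keep only the truth tables over a candidate regulator set that reproduce every observed transition. This is a toy sketch; the paper's CSP formulation and restricted function class prune this search rather than enumerating all 2^(2^k) tables.

```python
from itertools import product

def consistent_functions(states, target, regulators):
    """Boolean functions of `regulators` that reproduce every observed
    transition for gene `target`. states[t+1] is the successor of states[t];
    each state is a tuple of 0/1 gene values."""
    k = len(regulators)
    transitions = list(zip(states[:-1], states[1:]))
    found = []
    for table in product([0, 1], repeat=2 ** k):   # one output bit per input combo
        ok = True
        for prev, nxt in transitions:
            idx = 0
            for r in regulators:                   # encode regulator values as index
                idx = (idx << 1) | prev[r]
            if table[idx] != nxt[target]:
                ok = False
                break
        if ok:
            found.append(table)
    return found
```

When exactly one table survives, the regulatory function is fully determined by the data; several survivors correspond to the "partially determined" interactions mentioned in the abstract.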
Supported Employment: Review of Literature. Policy Analysis Series, No. 26.
ERIC Educational Resources Information Center
Minnesota Governor's Planning Council on Developmental Disabilities, St. Paul.
This review of the literature on supported employment for individuals with severe disabilities begins by outlining two Federal definitions of supported employment and noting their similarities. Literature on approaches to supported employment is examined, focusing on individual jobs at distributed or scattered sites, enclaves, mobile crews, and…
Student Stress: A Classroom Management System. Analysis and Action Series.
ERIC Educational Resources Information Center
Swick, Kevin J.
This book is concerned with the problem of student stress and the possibility that children and adolescents will internalize ineffective coping strategies used by adult models available to them. The introductory chapter explains a need for an educational plan to promote ways of controlling stress; recommends a systematic approach to managing…
NASA Astrophysics Data System (ADS)
Hinnov, L. A.; Yao, X.; Zhou, Y.
2014-12-01
We describe a Middle Permian radiolarian chert sequence in South China (Chaohu area), with the sequence of chert and mudstone layers encoded as a binary series. Two interpolation approaches were tested: linear interpolation, resulting in a "triangle" series, and staircase interpolation, resulting in a "boxcar" series. Spectral analysis of the triangle series reveals decimeter chert-mudstone cycles which represent theoretical Middle Permian 32 kyr obliquity cycling. Tuning these cycles to a 32-kyr periodicity reveals that other cm-scale cycles lie in the precession index band and have a strong ~400 kyr amplitude modulation. Additional tuning tests further support a hypothesis of astronomical forcing of the chert sequence. Analysis of the boxcar series reveals additional "eccentricity" terms transmitted by the boxcar representation of the modulating precession-scale cycles. An astronomical time scale reconstructed from these results assumes a Roadian/Wordian boundary age of 268.8 Ma for the onset of the first chert layer at the base of the sequence and ends at 264.1 Ma, for a total duration of 4.7 Myr. We propose that monsoon-controlled upwelling contributed to the development of the chert-mudstone cycles. A seasonal monsoon controlled by astronomical forcing influenced the intensity of upwelling, modulating radiolarian productivity and silica deposition.
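The two interpolation schemes can be sketched directly: linear interpolation of the 0/1 lithology codes gives the "triangle" series, while previous-value (staircase) interpolation gives the "boxcar" series. A minimal NumPy illustration (the depth grid and lithology codes are invented for the example):

```python
import numpy as np

def resample_binary_series(depths, values, grid):
    """Resample an irregular 0/1 lithology series onto an even grid.
    Returns (triangle, boxcar): linear vs. previous-value interpolation."""
    depths = np.asarray(depths, dtype=float)
    values = np.asarray(values, dtype=float)
    triangle = np.interp(grid, depths, values)
    idx = np.searchsorted(depths, grid, side="right") - 1
    boxcar = values[np.clip(idx, 0, len(values) - 1)]
    return triangle, boxcar
```

Because the boxcar series keeps sharp edges, its spectrum carries harmonics of the modulating cycles, which is consistent with the extra "eccentricity" terms the abstract reports for the boxcar representation.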
Taxation in Public Education. Analysis and Bibliography Series, No. 12.
ERIC Educational Resources Information Center
Ross, Larry L.
Intended for both researchers and practitioners, this analysis and bibliography cites approximately 100 publications on educational taxation, including general texts and reports, statistical reports, taxation guidelines, and alternative proposals for taxation. Topics covered in the analysis section include State and Federal aid, urban and suburban…
Data Reorganization for Optimal Time Series Data Access, Analysis, and Visualization
NASA Astrophysics Data System (ADS)
Rui, H.; Teng, W. L.; Strub, R.; Vollmer, B.
2012-12-01
The way data are archived is often not optimal for their access by many user communities (e.g., hydrological), particularly if the data volumes and/or number of data files are large. The number of data records of a non-static data set generally increases with time. Therefore, most data sets are commonly archived by time steps, one step per file, often containing multiple variables. However, many research and application efforts need time series data for a given geographical location or area, i.e., a data organization that is orthogonal to the way the data are archived. The retrieval of a time series of the entire temporal coverage of a data set for a single variable at a single data point, in an optimal way, is an important and longstanding challenge, especially for large science data sets (i.e., with volumes greater than 100 GB). Two examples of such large data sets are the North American Land Data Assimilation System (NLDAS) and Global Land Data Assimilation System (GLDAS), archived at the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC; Hydrology Data Holdings Portal, http://disc.sci.gsfc.nasa.gov/hydrology/data-holdings). To date, the NLDAS data set, hourly 0.125x0.125° from Jan. 1, 1979 to present, has a total volume greater than 3 TB (compressed). The GLDAS data set, 3-hourly and monthly 0.25x0.25° and 1.0x1.0° Jan. 1948 to present, has a total volume greater than 1 TB (compressed). Both data sets are accessible, in the archived time step format, via several convenient methods, including Mirador search and download (http://mirador.gsfc.nasa.gov/), GrADS Data Server (GDS; http://hydro1.sci.gsfc.nasa.gov/dods/), direct FTP (ftp://hydro1.sci.gsfc.nasa.gov/data/s4pa/), and Giovanni Online Visualization and Analysis (http://disc.sci.gsfc.nasa.gov/giovanni). However, users who need long time series currently have no efficient way to retrieve them. Continuing a longstanding tradition of facilitating data access, analysis, and
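The archival-order problem described above amounts to a transpose: data stored one grid per time step must be reorganized so that each pixel's full time series is contiguous. A minimal numpy sketch, with a small synthetic stand-in for the archive (real NLDAS/GLDAS granules would be read from files):

```python
import numpy as np

# Illustrative stand-in for an archive organized by time step: one 2D grid
# (lat x lon) per time, here 4 times on a 3x3 grid. Values are synthetic.
times, nlat, nlon = 4, 3, 3
archive = [np.full((nlat, nlon), t, dtype=float) for t in range(times)]

# Stack into a (time, lat, lon) cube, then transpose so a single pixel's
# whole time series is contiguous in memory: shape (lat, lon, time).
cube = np.stack(archive)                     # (time, lat, lon)
by_pixel = np.ascontiguousarray(cube.transpose(1, 2, 0))

# Retrieving the full time series at one point is now a cheap slice:
series = by_pixel[1, 2]                      # all times at lat index 1, lon index 2
```

For terabyte-scale archives the same reorganization is done chunk-wise on disk rather than in memory, but the access pattern it enables is the same.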
A multi-modal treatment approach for the shoulder: A 4 patient case series
Pribicevic, Mario; Pollard, Henry
2005-01-01
Background This paper describes the clinical management of four cases of shoulder impingement syndrome using a conservative multimodal treatment approach. Clinical Features Four patients presented to a chiropractic clinic with chronic shoulder pain, tenderness in the shoulder region and a limited range of motion with pain and catching. After physical and orthopaedic examination a clinical diagnosis of shoulder impingement syndrome was reached. The four patients were admitted to a multi-modal treatment protocol including soft tissue therapy (ischaemic pressure and cross-friction massage), 7 minutes of phonophoresis (driving of medication into tissue with ultrasound) with 1% cortisone cream, diversified spinal and peripheral joint manipulation and rotator cuff and shoulder girdle muscle exercises. The outcome measures for the study were subjective/objective visual analogue pain scales (VAS), range of motion (goniometer) and return to normal daily, work and sporting activities. All four subjects at the end of the treatment protocol were symptom free with all outcome measures being normal. At 1 month follow up all patients continued to be symptom free with full range of motion and complete return to normal daily activities. Conclusion This case series demonstrates the potential benefit of a multimodal chiropractic protocol in resolving symptoms associated with a suspected clinical diagnosis of shoulder impingement syndrome. PMID:16168053
A sequential approach to calibrate ecosystem models with multiple time series data
NASA Astrophysics Data System (ADS)
Oliveros-Ramos, Ricardo; Verley, Philippe; Echevin, Vincent; Shin, Yunne-Jai
2017-02-01
When models are intended to support decision-making, their credibility is essential to consider. Model fit to observed data is one major criterion for assessing such credibility. However, because the complexity of ecosystem models makes their calibration challenging, the scientific community has given more attention to the exploration of model behavior than to rigorous comparison with observations. This work highlights some issues related to the comparison of complex ecosystem models to data and proposes a methodology for a sequential multi-phase calibration (or parameter estimation) of ecosystem models. We first propose two criteria to classify the parameters of a model: the model dependency and the time variability of the parameters. Then, these criteria and the availability of approximate initial estimates are used as decision rules to determine which parameters need to be estimated, and their precedence order in the sequential calibration process. The end-to-end (E2E) ecosystem model ROMS-PISCES-OSMOSE applied to the Northern Humboldt Current Ecosystem is used as an illustrative case study. The model is calibrated using an evolutionary algorithm and a likelihood approach to fit time series data of landings, abundance indices and catch-at-length distributions from 1992 to 2008. Testing different calibration schemes regarding the number of phases, the precedence of the parameters' estimation, and the consideration of time-varying parameters, the results show that the multiple-phase calibration conducted under our criteria improved the model fit.
InSAR and GPS time series analysis: Crustal deformation in the Yucca Mountain, Nevada region
NASA Astrophysics Data System (ADS)
Li, Z.; Hammond, W. C.; Blewitt, G.; Kreemer, C. W.; Plag, H.
2010-12-01
Several previous studies have successfully demonstrated that long time series (e.g. >5 years) of GPS measurements can be employed to detect tectonic signals with a vertical rate greater than 0.3 mm/yr (e.g. Hill and Blewitt, 2006; Bennett et al. 2009). However, GPS stations are often sparse, with spacing from a few kilometres to a few hundred kilometres. Interferometric SAR (InSAR) can complement GPS by providing high horizontal spatial resolution (e.g. meters to tens of metres) over large regions (e.g. 100 km × 100 km). A major source of error for repeat-pass InSAR is the phase delay in radio signal propagation through the atmosphere. The portion of this attributable to tropospheric water vapour causes errors as large as 10-20 cm in deformation retrievals. InSAR Time Series analysis with Atmospheric Estimation Models (InSAR TS + AEM), developed at the University of Glasgow, is a robust time series analysis approach, which mainly uses interferograms with small geometric baselines to minimise the effects of decorrelation and inaccuracies in topographic data. In addition, InSAR TS + AEM can be used to separate deformation signals from atmospheric water vapour effects in order to map surface deformation as it evolves in time. The principal purposes of this study are to assess: (1) how consistent InSAR-derived deformation time series are with GPS; and (2) how precise InSAR-derived atmospheric path delays can be. The Yucca Mountain, Nevada region is chosen as the study site because of its excellent GPS network and extensive radar archives (>10 years of dense and high-quality GPS stations, and >17 years of ERS and ENVISAT radar acquisitions), and because of its arid environment. The latter results in coherence that is generally high, even for long periods that span the existing C-band radar archives of ERS and ENVISAT. Preliminary results show that our InSAR LOS deformation map agrees with GPS measurements to within 0.35 mm/yr RMS misfit at the stations which is the
Pitfalls in Fractal Time Series Analysis: fMRI BOLD as an Exemplary Case
Eke, Andras; Herman, Peter; Sanganahalli, Basavaraju G.; Hyder, Fahmeed; Mukli, Peter; Nagy, Zoltan
2012-01-01
This article builds on our previous work demonstrating the importance of adhering to a carefully selected set of criteria when choosing, from the methods available, one that performs adequately when applied to real temporal signals, such as fMRI BOLD, to evaluate one important facet of their behavior: fractality. Earlier, we reviewed a range of monofractal tools and evaluated their performance. Given advances in the fractal field, in this article we also discuss the most widely used implementations of multifractal analyses. Our recommended flowchart for the fractal characterization of spontaneous, low-frequency fluctuations in fMRI BOLD is used as the framework for this article, to ensure that it provides hands-on experience for the reader in handling the perplexing issues of fractal analysis. This particular signal modality and its fractal analysis were chosen because of their high impact on today's neuroscience, as fMRI BOLD has powerfully emerged as a new way of interpreting the complex functioning of the brain (see "intrinsic activity"). The reader is first presented with the basic concepts of mono- and multifractal time series analyses, followed by some of the most relevant implementations and their characterization by numerical approaches. The dichotomy of the fractional Gaussian noise and fractional Brownian motion signal classes, and its impact on fractal time series analyses, is thoroughly discussed as the central theme of our application strategy. Sources of pitfalls and ways to avoid them are identified, followed by a demonstration on fractal studies of fMRI BOLD taken from the literature and from our own work, in an attempt to consolidate best practice in fractal analysis of empirical fMRI BOLD signals mapped throughout the brain as an exemplary case of potentially wide interest. PMID:23227008
A novel water quality data analysis framework based on time-series data mining.
Deng, Weihui; Wang, Guoyin
2017-07-01
The rapid development of time-series data mining provides an emerging method for water resource management research. In this paper, based on the time-series data mining methodology, we propose a novel and general analysis framework for water quality time-series data. It consists of two parts: implementation components and common tasks of time-series data mining in water quality data. In the first part, we propose to granulate the time series into several two-dimensional normal clouds and calculate the similarities at the granulated level. On the basis of the similarity matrix, the similarity search, anomaly detection, and pattern discovery tasks on the water quality time-series instance dataset can be easily implemented in the second part. We present a case study of this analysis framework on weekly Dissolved Oxygen (DO) time-series data collected from five monitoring stations on the upper reaches of the Yangtze River, China. It discovered the relationship between water quality in the mainstream and tributary as well as the main changing patterns of DO. The experimental results show that the proposed analysis framework is a feasible and efficient method to mine the hidden and valuable knowledge from historical water quality time-series data. Copyright © 2017 Elsevier Ltd. All rights reserved.
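A simplified sketch of the granulate-then-compare idea, using plain (mean, std) window summaries in place of the paper's two-dimensional normal-cloud model; the data and window width are invented for illustration:

```python
import numpy as np

def granulate(series, width):
    """Summarize each non-overlapping window by (mean, std) -- a simplified
    stand-in for the paper's two-dimensional normal-cloud granules."""
    n = len(series) // width
    wins = series[:n * width].reshape(n, width)
    return np.column_stack([wins.mean(axis=1), wins.std(axis=1)])

def similarity_matrix(granules):
    """Similarity = 1 / (1 + Euclidean distance) between granule summaries."""
    d = np.linalg.norm(granules[:, None, :] - granules[None, :, :], axis=-1)
    return 1.0 / (1.0 + d)

# Synthetic weekly DO-like data: steady values with one anomalous window
x = np.array([8.0, 8.1, 7.9, 8.0, 8.1, 8.0, 7.9, 8.1,
              2.0, 2.1, 1.9, 2.0, 8.0, 7.9, 8.1, 8.0])
sim = similarity_matrix(granulate(x, 4))

# Anomaly detection: the window least similar to all others is flagged
anomaly = int(np.argmin(sim.mean(axis=1)))
```

Similarity search and pattern discovery follow the same pattern: once the similarity matrix over granules exists, both reduce to row lookups and clustering on it.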
Classifying Enterprise Architecture Analysis Approaches
NASA Astrophysics Data System (ADS)
Buckl, Sabine; Matthes, Florian; Schweda, Christian M.
Enterprise architecture (EA) management forms a commonly accepted means to enhance the alignment of business and IT, and to support the managed evolution of the enterprise. One major challenge of EA management is to provide decision support by analyzing as-is states of the architecture as well as assessing planned future states. Thus, different kinds of analysis regarding the EA exist, each relying on certain conditions and demands for models, methods, and techniques.
Spectral Analysis of the Daily Maximum Precipitation Time Series over Iran
NASA Astrophysics Data System (ADS)
Taghavi, F.
2009-04-01
A method of estimating the spectral density of the random components of precipitation time series is presented. Precipitation time series consist of periodic components (such as the normal annual variation) and random or nondeterministic components, which are the deviations from the normal. In this paper, a Fourier analysis technique is used to study daily maximum precipitation data to identify the 24-hour extreme precipitation frequency and to evaluate the risk of flooding over Iran. Time series data from 40 stations are used for the period 1960-2007. As in most time series analyses, it is assumed that the data consist of a systematic pattern and random noise, which usually makes the pattern difficult to identify. This paper discusses the problem of estimating the spectral density of the random components in order to obtain an estimate for predicting future values of the precipitation time series.
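The separation of periodic and random components described above can be sketched with a simple periodogram; the series, amplitudes, and noise level below are synthetic, not Iranian station data:

```python
import numpy as np

# Synthetic daily series: an annual (periodic) component plus random noise,
# standing in for precipitation deviations. Parameters are illustrative.
rng = np.random.default_rng(42)
n = 1460                                   # four years of daily data
t = np.arange(n)
annual = 10 * np.cos(2 * np.pi * t / 365)  # known periodic component
series = annual + rng.normal(0, 2, n)

# Remove the deterministic periodic part, then estimate the spectral
# density of the random component with a raw periodogram.
residual = series - annual
spec = np.abs(np.fft.rfft(residual)) ** 2 / n
freqs = np.fft.rfftfreq(n, d=1.0)          # cycles per day

# For the undetrended series, the annual line dominates the spectrum:
full_spec = np.abs(np.fft.rfft(series)) ** 2 / n
peak_freq = freqs[np.argmax(full_spec[1:]) + 1]   # skip the DC bin
```

In practice the raw periodogram is smoothed or averaged (e.g. Welch's method) to reduce its variance before it is used for prediction.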
Statistical assessment of a unique time series analysis technique
NASA Technical Reports Server (NTRS)
Benignus, V. A.
1972-01-01
Empirical tests of a particular least squares multiple prediction program were studied, as well as the relationship of this particular program to general least squares multiple prediction theory. Empirical evaluation and test of the program involved: (1) conversion of the program to run on an IBM 360/44 with replication of test results obtained on a Univac 1108; and (2) generation and analysis of Monte Carlo simulated data with the objective of comparison against results theoretically obtainable from spectrum analysis routines.
Variance Analysis of Unevenly Spaced Time Series Data
NASA Technical Reports Server (NTRS)
Hackman, Christine; Parker, Thomas E.
1996-01-01
We have investigated the effect of uneven data spacing on the computation of delta(sub chi)(gamma). Evenly spaced simulated data sets were generated for noise processes ranging from white phase modulation (PM) to random walk frequency modulation (FM). delta(sub chi)(gamma) was then calculated for each noise type. Data were subsequently removed from each simulated data set using typical two-way satellite time and frequency transfer (TWSTFT) data patterns to create two unevenly spaced sets with average intervals of 2.8 and 3.6 days. delta(sub chi)(gamma) was then calculated for each sparse data set using two different approaches. First the missing data points were replaced by linear interpolation and delta(sub chi)(gamma) calculated from this now full data set. The second approach ignored the fact that the data were unevenly spaced and calculated delta(sub chi)(gamma) as if the data were equally spaced with average spacing of 2.8 or 3.6 days. Both approaches have advantages and disadvantages, and techniques are presented for correcting errors caused by uneven data spacing in typical TWSTFT data sets.
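The two approaches to handling the gaps can be sketched as follows; the observation schedule and noise model are illustrative, not actual TWSTFT data:

```python
import numpy as np

# Evenly spaced "truth": daily phase-difference measurements (synthetic,
# random-walk-like to mimic clock data).
rng = np.random.default_rng(1)
days = np.arange(0.0, 60.0)
phase = np.cumsum(rng.normal(0, 1.0, days.size))

# Simulate a TWSTFT-style schedule by dropping measurements, leaving an
# uneven record with an average spacing of a few days.
keep = np.sort(rng.choice(days.size, size=20, replace=False))
t_sparse, x_sparse = days[keep], phase[keep]

# Approach 1: rebuild an evenly spaced series by linear interpolation,
# then analyze it as if no data were missing.
t_even = np.arange(t_sparse[0], t_sparse[-1] + 1.0)
x_filled = np.interp(t_even, t_sparse, x_sparse)

# Approach 2: ignore the unevenness and treat consecutive sparse points
# as if separated by the average interval.
avg_interval = (t_sparse[-1] - t_sparse[0]) / (t_sparse.size - 1)
```

Approach 1 invents data inside the gaps (biasing the apparent noise low at short averaging times), while approach 2 mislabels the averaging time; the abstract's point is that each error is systematic and therefore correctable.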
Methodology Series Module 6: Systematic Reviews and Meta-analysis
Setia, Maninder Singh
2016-01-01
Systematic reviews and meta-analysis have become an important part of biomedical literature, and they provide the "highest level of evidence" for various clinical questions. There are a lot of studies (sometimes with contradictory conclusions) on a particular topic in the literature. Hence, as a clinician, which results will you believe? What will you tell your patient? Which drug is better? A systematic review or a meta-analysis may help us answer these questions. In addition, it may also help us understand the quality of the articles in the literature or the type of studies that have been conducted and published (for example, randomized trials or observational studies). The first step is to identify a research question for the systematic review or meta-analysis. The next step is to identify the articles that will be included in the study. This will be done by searching various databases; it is important that the researcher search for articles in more than one database. It will also be useful to form a group of researchers and statisticians with expertise in conducting systematic reviews and meta-analyses before initiating them. We strongly encourage readers to register their proposed review/meta-analysis with PROSPERO. Finally, these studies should be reported according to the Preferred Reporting Items for Systematic Reviews and Meta-analysis checklist. PMID:27904176
A unified nonlinear stochastic time series analysis for climate science.
Moon, Woosok; Wettlaufer, John S
2017-03-13
Earth's orbit and axial tilt imprint a strong seasonal cycle on climatological data. Climate variability is typically viewed in terms of fluctuations in the seasonal cycle induced by higher frequency processes. We can interpret this as a competition between the orbitally enforced monthly stability and the fluctuations/noise induced by weather. Here we introduce a new time-series method that determines these contributions from monthly-averaged data. We find that the spatio-temporal distribution of the monthly stability and the magnitude of the noise reveal key fingerprints of several important climate phenomena, including the evolution of the Arctic sea ice cover, the El Niño Southern Oscillation (ENSO), the Atlantic Niño and the Indian Dipole Mode. In analogy with the classical destabilising influence of the ice-albedo feedback on summertime sea ice, we find that during some time interval of the season a destabilising process operates in all of these climate phenomena. The interaction between the destabilisation and the accumulation of noise, which we term the memory effect, underlies phase locking to the seasonal cycle and the statistical nature of seasonal predictability.
Vapor burn analysis for the Coyote series LNG spill experiments
Rodean, H.C.; Hogan, W.J.; Urtiew, P.A.; Goldwire, H.C. Jr.; McRae, T.G.; Morgan, D.L. Jr.
1984-04-01
A major purpose of the Coyote series of field experiments at China Lake, California, in 1981 was to study the burning of vapor clouds from spills of liquefied natural gas (LNG) on water. Extensive arrays of instrumentation were deployed to obtain micrometeorological, gas concentration, and fire-related data. The instrumentation included in situ sensors of various types, high-speed motion picture cameras, and infrared (IR) imagers. Five of the total of ten Coyote spill experiments investigated vapor burns. The first vapor-burn experiment, Coyote 2, was done with a small spill of LNG to assess instrument capability and survivability in vapor cloud fires. The emphasis in this report is on the other four vapor-burn experiments: Coyotes 3, 5, 6, and 7. The data are analyzed to determine fire spread, flame propagation, and heat flux - quantities that are related to the determination of the damage zone for vapor burns. The results of the analyses are given here. 20 references, 57 figures, 7 tables.
Structural analysis of a series of strontium-substituted apatites.
O'Donnell, M D; Fredholm, Y; de Rouffignac, A; Hill, R G
2008-09-01
A series of Sr-substituted hydroxyapatites, (Sr(x)Ca(1-x))5(PO4)3OH, where x = 0.00, 0.25, 0.50, 0.75 and 1.00, were made by a standard wet chemical route and investigated using X-ray diffraction (XRD), Rietveld refinement and Raman spectroscopy. We report apatites manufactured by two synthesis routes under 90 °C; only the fully Sr-substituted sample had a small amount of an impurity phase, believed to be strontium pyrophosphate. Lattice parameters (a and c), unit cell volume and density were shown to increase linearly with strontium addition, consistent with the substitution of a slightly larger and heavier ion (Sr) in place of Ca. XRD Lorentzian peak widths increased to a maximum at x = 0.50, then decreased with increasing Sr content. This indicated an increase in crystallite size when moving away from the x = 0.50 composition (d ≈ 9.4 nm). There was a slight preference for strontium to enter the Ca(II) site in the mixed apatites (6 to 12%, depending on composition). The position of the Raman band attributed to v1(PO4)(3-) at around 963 cm(-1) in hydroxyapatite decreased linearly to 949 cm(-1) at full Sr-substitution. The full width at half maximum of this peak also correlated well, increasing linearly with increasing crystallite size calculated from XRD.
Physiological time-series analysis: what does regularity quantify?
NASA Technical Reports Server (NTRS)
Pincus, S. M.; Goldberger, A. L.
1994-01-01
Approximate entropy (ApEn) is a recently developed statistic quantifying regularity and complexity that appears to have potential application to a wide variety of physiological and clinical time-series data. The focus here is to provide a better understanding of ApEn to facilitate its proper utilization, application, and interpretation. After giving the formal mathematical description of ApEn, we provide a multistep description of the algorithm as applied to two contrasting clinical heart rate data sets. We discuss algorithm implementation and interpretation and introduce a general mathematical hypothesis of the dynamics of a wide class of diseases, indicating the utility of ApEn to test this hypothesis. We indicate the relationship of ApEn to variability measures, the Fourier spectrum, and algorithms motivated by study of chaotic dynamics. We discuss further mathematical properties of ApEn, including the choice of input parameters, statistical issues, and modeling considerations, and we conclude with a section on caveats to ensure correct ApEn utilization.
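A minimal numpy implementation of ApEn along the lines of the multistep description referenced above; the parameter defaults (m = 2, r = 0.2 × std) are common conventions in the literature, not necessarily the paper's exact choices:

```python
import numpy as np

def approximate_entropy(x, m=2, r=None):
    """Approximate entropy ApEn(m, r) of a 1-D series.
    r defaults to 0.2 * std(x), a common choice for physiological data."""
    x = np.asarray(x, dtype=float)
    n = x.size
    if r is None:
        r = 0.2 * x.std()

    def phi(m):
        # All length-m template vectors
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev (max-coordinate) distance between every pair of templates
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]),
                      axis=-1)
        # C_i: fraction of templates within r of template i (self-match included)
        c = (dist <= r).mean(axis=1)
        return np.log(c).mean()

    return phi(m) - phi(m + 1)

# A constant signal is perfectly regular: ApEn is (numerically) zero.
flat = approximate_entropy(np.ones(50), r=0.1)
# An irregular signal yields a larger ApEn.
rng = np.random.default_rng(0)
noisy = approximate_entropy(rng.normal(size=200))
```

The pairwise distance matrix makes this O(n²) in memory, fine for the short clinical records ApEn is typically applied to; longer series call for a streaming count instead.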
Detrended fluctuation analysis of time series of a firing fusimotor neuron
NASA Astrophysics Data System (ADS)
Blesić, S.; Milošević, S.; Stratimirović, Dj.; Ljubisavljević, M.
We study the interspike intervals (ISI) time series of the spontaneous fusimotor neuron activity by applying the detrended fluctuation analysis that is a modification of the random walk model analysis. Thus, we have found evidence for the white noise characteristics of the ISI time series, which means that the fusimotor activity does not possess temporal correlations. We conclude that such an activity represents the requisite noisy component for occurrence of the stochastic resonance mechanism in the neural coordination of muscle spindles.
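A compact sketch of first-order DFA as described above; for uncorrelated white noise the scaling exponent comes out near 0.5, which is the signature the study reports for the ISI series (the input here is synthetic noise, not fusimotor data):

```python
import numpy as np

def dfa_alpha(x, scales):
    """Detrended fluctuation analysis: returns the scaling exponent alpha,
    the slope of log F(s) vs log s. alpha ~ 0.5 for white noise."""
    y = np.cumsum(x - np.mean(x))            # integrated "random walk" profile
    fluct = []
    for s in scales:
        n_seg = y.size // s
        segs = y[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        f2 = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)     # local linear trend per segment
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))   # RMS fluctuation at this scale
    slope, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
    return slope

rng = np.random.default_rng(3)
alpha = dfa_alpha(rng.normal(size=4096), scales=[16, 32, 64, 128, 256])
```

alpha near 0.5 indicates no temporal correlations; persistent (long-range correlated) series give alpha > 0.5 and anti-persistent ones alpha < 0.5.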
Shallice, Tim; Buiatti, Tania
2011-10-01
The paper addresses a weakness in the Schwartz and Dell paper (2010)-namely, its discussion of the inclusion criteria for case series. The paper distinguishes the different types that exist and how they constrain the theoretical conclusions that one can draw about the organization of the normal cognitive system. Four different types of inclusion criteria are considered. Two are those treated by Schwartz and Dell-namely, theoretically derived clinical criteria, such as the example of semantic dementia, and broad clinical criteria such as the presence of aphasia. In addition, in the present paper two different types of anatomically based criteria are assessed-those using anatomical regions selected a priori and also regions selected as a result of an anatomical group study analysis. Putative functional syndromes are argued to be the empirical building blocks for cognitive neuropsychology. Anatomically based case series can aid in their construction or in their fractionation.
Assessing air quality in Aksaray with time series analysis
NASA Astrophysics Data System (ADS)
Kadilar, Gamze Özel; Kadilar, Cem
2017-04-01
Sulphur dioxide (SO2) is a major air pollutant caused by the dominant usage of diesel, petrol and fuels by vehicles and industries. One of the most air-polluted cities in Turkey is Aksaray. Hence, in this study, the level of SO2 in Aksaray is analyzed based on the database monitored at an air quality monitoring station of Turkey. A Seasonal Autoregressive Integrated Moving Average (SARIMA) approach is used to forecast the level of the SO2 air quality parameter. The results indicate that the seasonal ARIMA model provides reliable and satisfactory predictions for the air quality parameters and is expected to be an alternative tool for practical assessment and justification.
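SARIMA models are normally fit with a statistics library (e.g. statsmodels' SARIMAX); the sketch below isolates only the seasonal-differencing step that gives the model its seasonal "I" component, on a synthetic monthly series rather than the Aksaray data:

```python
import numpy as np

# Synthetic monthly SO2-like series: a repeating 12-month seasonal pattern
# plus a mild linear trend (values are illustrative, not Aksaray data).
months = np.arange(60)
seasonal = 10 + 5 * np.sin(2 * np.pi * months / 12)
series = seasonal + 0.05 * months

# Seasonal differencing at lag s = 12: x'[t] = x[t] - x[t - s].
# It removes the repeating annual pattern exactly, leaving only the
# constant 12-month increment of the trend.
s = 12
d_seasonal = series[s:] - series[:-s]
```

On the differenced (now stationary) series, the remaining SARIMA machinery is ordinary ARMA fitting of the seasonal and non-seasonal AR/MA terms; forecasts are then obtained by inverting the differencing.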
NASA Astrophysics Data System (ADS)
Machiwal, Deepesh; Kumar, Sanjay; Dayal, Devi
2016-05-01
This study aimed at characterizing rainfall dynamics in a hot arid region of Gujarat, India by employing time-series modeling techniques and a sustainability approach. Five characteristics, i.e., normality, stationarity, homogeneity, presence/absence of trend, and persistence of the 34-year (1980-2013) annual rainfall time series of ten stations were identified/detected by applying multiple parametric and non-parametric statistical tests. Furthermore, the study proposes, for the first time, a sustainability concept for evaluating rainfall time series, demonstrated by identifying the most sustainable rainfall series following the reliability (Ry), resilience (Re), and vulnerability (Vy) approach. Box-whisker plots, normal probability plots, and histograms indicated that the annual rainfall of Mandvi and Dayapar stations is relatively more positively skewed and non-normal compared with that of other stations, due to the presence of a severe outlier and extreme. Results of the Shapiro-Wilk test and Lilliefors test revealed that the annual rainfall series of all stations deviated significantly from a normal distribution. Two parametric t tests and the non-parametric Mann-Whitney test indicated significant non-stationarity in the annual rainfall of Rapar station, where the rainfall was also found to be non-homogeneous based on the results of four parametric homogeneity tests. Four trend tests indicated significantly increasing rainfall trends at Rapar and Gandhidham stations. The autocorrelation analysis suggested the presence of statistically significant persistence in the rainfall series of Bhachau (3-year time lag), Mundra (1- and 9-year time lags), Nakhatrana (9-year time lag), and Rapar (3- and 4-year time lags). Results of the sustainability approach indicated that the annual rainfall of Mundra and Naliya stations (Ry = 0.50 and 0.44; Re = 0.47 and 0.47; Vy = 0.49 and 0.46, respectively) is the most sustainable and dependable
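The reliability-resilience-vulnerability computation can be sketched as below, using one common formulation of the three metrics (after Hashimoto et al.); the threshold and rainfall values are invented, and the paper's exact definitions may differ:

```python
import numpy as np

def rrv(series, threshold):
    """Reliability, resilience, vulnerability of an annual series against a
    threshold. "Satisfactory" = value >= threshold. One common formulation;
    the original study's definitions may differ in detail."""
    ok = series >= threshold
    reliability = ok.mean()                  # fraction of satisfactory years
    fails = ~ok
    if fails.sum() == 0:
        return reliability, 1.0, 0.0
    # Resilience: chance a failure year is immediately followed by recovery
    recoveries = np.sum(fails[:-1] & ok[1:])
    resilience = recoveries / fails.sum()
    # Vulnerability: mean shortfall below the threshold in failure years
    vulnerability = np.mean(threshold - series[fails])
    return reliability, resilience, vulnerability

# Hypothetical annual rainfall (mm) and drought threshold, for illustration
rain = np.array([400.0, 150.0, 380.0, 120.0, 90.0, 410.0, 300.0, 140.0])
r_y, r_e, v_y = rrv(rain, threshold=250.0)
```

Ranking stations by (Ry, Re, Vy) then identifies the most dependable rainfall series: high reliability and resilience with low vulnerability.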
Philosophy: Discipline Analysis. Women in the Curriculum Series.
ERIC Educational Resources Information Center
Nye, Andrea
This essay examines the ways in which philosophy, as a discipline, has been influenced by feminist scholarship in the field. It explains that in the 1970s feminist philosophers introduced questions regarding personal life and sexuality as matters for philosophical analysis, and that scholars began to challenge the notions of the Western canon.…
Sociology: Discipline Analysis. Women in the Curriculum Series.
ERIC Educational Resources Information Center
Johnson, Jacqueline; Risman, Barbara J.
This essay examines the ways in which sociology, as a discipline, has been influenced by feminist scholarship in the field, and three major contributions of feminist scholarship are presented: the introduction of women into sociological theory and research during the era of "sex role" analysis; the shift to analyzing gender as a basic axis of…
Determinants of Male School Enrollments: A Time-Series Analysis.
ERIC Educational Resources Information Center
Mattila, J. Peter
1982-01-01
Regression analysis of 1956-79 data on high school and college enrollment among males aged 16-21, minimum wage levels, military draft requirements, unemployment rates, family income, high school graduation, and job opportunities indicates that male enrollment rates respond strongly to changes in the expected rate of financial return to schooling.…
Educational Attainment: Analysis by Immigrant Generation. IZA Discussion Paper Series.
ERIC Educational Resources Information Center
Chiswick, Barry R.; DebBurman, Noyna
This paper presents a theoretical and empirical analysis of the largely ignored issue of the determinants of the educational attainment of adults by immigrant generation. Using Current Population Survey (CPS) data, differences in educational attainment are analyzed by immigrant generation (first, second, and higher order generations), and among…
Single-Case Time Series with Bayesian Analysis: A Practitioner's Guide.
ERIC Educational Resources Information Center
Jones, W. Paul
2003-01-01
This article illustrates a simplified time series analysis for use by the counseling researcher practitioner in single-case baseline plus intervention studies with a Bayesian probability analysis to integrate findings from replications. The C statistic is recommended as a primary analysis tool with particular relevance in the context of actual…
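The C statistic recommended above has a simple closed form. The sketch below assumes Young's definition, C = 1 − Σ(xᵢ₊₁ − xᵢ)² / (2Σ(xᵢ − x̄)²), with the commonly quoted large-sample standard error; the baseline-plus-intervention data are invented:

```python
import numpy as np

def c_statistic(series):
    """Young's C statistic for serial dependency in a short time series.

    C near 0 suggests a random series; large positive C suggests trend or
    dependency. The z approximation below uses the commonly quoted
    standard error SE = sqrt((n - 2) / ((n - 1) * (n + 1))).
    """
    x = np.asarray(series, dtype=float)
    n = len(x)
    num = np.sum(np.diff(x) ** 2)
    den = 2.0 * np.sum((x - x.mean()) ** 2)
    c = 1.0 - num / den
    se = np.sqrt((n - 2) / ((n - 1) * (n + 1)))
    return c, c / se

# Hypothetical single-case data: flat baseline, then a rising intervention phase.
baseline = [3, 4, 3, 4, 3, 4, 3, 4]
intervention = [5, 6, 7, 8, 9, 10, 11, 12]
c, z = c_statistic(baseline + intervention)
print(f"C = {c:.3f}, z = {z:.2f}")
```

A significant z on the combined phases, when the baseline alone shows none, is the usual indication of an intervention effect in this framework.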
Arboix, A; Martí-Vilalta, J L
1998-03-01
There are few clinico-anatomopathological studies of lacunar infarcts (LI), because of the excellent functional prognosis and the unlikelihood of death occurring whilst in hospital. We reviewed the 10 main anatomopathological series of LI in the literature. A personal contribution was made based on analysis of the LI found in 50 consecutive autopsies of patients with cerebrovascular disease. A descriptive clinico-anatomopathological assessment was done. Cerebrovascular risk factors, associated neurological syndromes and causes of death were analyzed. A total of 1,200 cases were analyzed in the 11 anatomopathological series. The most usual number of LI was between 2 and 5 per brain (6 series). The commonest topographical lesions found, in order of frequency, were in the lenticular nucleus (9 series), thalamus (4 series) and frontal white matter (4 series). The main risk factor was arterial hypertension (AHT), which occurred in between 58% and 90% of cases. The main clinical findings were pseudobulbar syndrome (6 series), pure motor hemiparesis (3 series) and clinically silent ischemia (2 series). The causes of death were mainly non-neurological: ischemic cardiopathy, sepsis and pulmonary embolism. LI are usually multiple, and topographically they are found at the level of the basal ganglia. AHT is the main cerebrovascular risk factor. The causes of death are usually non-neurological.
Lecca, Paola; Mura, Ivan; Re, Angela; Barker, Gary C.; Ihekwaba, Adaoha E. C.
2016-01-01
Chaotic behavior refers to behavior which, albeit irregular, is generated by an underlying deterministic process, and is therefore potentially controllable. This possibility becomes practically amenable especially when chaos is shown to be low-dimensional, i.e., attributable to a small fraction of the total system's components. In this case, indeed, including the major drivers of chaos in a system into the modeling approach allows us to improve the predictability of the system's dynamics. Here, we analyzed numerical simulations of an accurate ordinary differential equation model of the gene network regulating sporulation initiation in Bacillus subtilis to explore whether the non-linearity underlying the time series data is due to low-dimensional chaos. Low-dimensional chaos is expectedly common in systems with few degrees of freedom, but rare in systems with many degrees of freedom such as the B. subtilis sporulation network. The estimation of a number of indices which reflect the chaotic nature of a system indicates that the dynamics of this network is affected by deterministic chaos. The neat separation between the indices obtained from the time series simulated from the model and those obtained from time series generated by Gaussian white and colored noise confirmed that the B. subtilis sporulation network dynamics is affected by low-dimensional chaos rather than by noise. Furthermore, our analysis identifies the principal driver of the network's chaotic dynamics to be the sporulation initiation phosphotransferase B (Spo0B). We then analyzed the parameters and the phase space of the system to characterize the instability points of the network dynamics and, in turn, to identify the ranges of values of Spo0B and of the other drivers of the chaotic dynamics for which the whole system is highly sensitive to minimal perturbation. In summary, we described an unappreciated source of complexity in the B. subtilis sporulation network by gathering
An approach to constructing a homogeneous time series of soil moisture using SMOS
USDA-ARS?s Scientific Manuscript database
Overlapping soil moisture time series derived from two satellite microwave radiometers (SMOS, Soil Moisture and Ocean Salinity; AMSR-E, Advanced Microwave Scanning Radiometer - Earth Observing System) are used to generate a soil moisture time series from 2003 to 2010. Two statistical methodologies f...
New Models for Forecasting Enrollments: Fuzzy Time Series and Neural Network Approaches.
ERIC Educational Resources Information Center
Song, Qiang; Chissom, Brad S.
Since university enrollment forecasting is very important, many different methods and models have been proposed by researchers. Two new methods for enrollment forecasting are introduced: (1) the fuzzy time series model; and (2) the artificial neural networks model. Fuzzy time series has been proposed to deal with forecasting problems within a…
Intra-cholecystic approach for laparoscopic management of Mirizzi's syndrome: A case series
Nag, Hirdaya H.; Gangadhara, Vageesh Bettageri; Dangi, Amit
2016-01-01
INTRODUCTION: Laparoscopic management of patients with Mirizzi's syndrome (MS) is not routinely recommended due to the high risk of iatrogenic complications. PATIENTS AND METHODS: Intra-cholecystic (IC) or inside-gall bladder (GB) approach was used for laparoscopic management of 16 patients with MS at a tertiary care referral centre in North India from May 2010 to August 2014; a retrospective analysis of prospectively collected data was performed. RESULTS: Mean age was 40.1 ± 14.7 years, the male-to-female ratio was 1:3, and 9 (56.25%) patients had type 1 MS (MS1) and 7 (43.75%) had type 2 MS (MS2) (McSherry's classification). The laparoscopic intra-cholecystic approach (LICA) was successful in 11 (68.75%) patients, whereas 5 patients (31.25%) required conversion to open method. Median blood loss was 100 mL (range: 50-400 mL), and median duration of surgery was 3.25 h (range: 2-7.5 h). No major complications were encountered except 1 patient (6.5%) who required re-operation for retained bile duct stones. The final histopathology report was benign in all the patients. No remote complications were noted during a mean follow-up of 20.18 months. CONCLUSION: LICA is a feasible and safe approach for selected patients with Mirizzi's syndrome; however, a low threshold for conversion is necessary to avoid iatrogenic complications. PMID:27251843
Welkowitz, J; Bond, R N; Feldman, L; Tota, M E
1990-07-01
Mutual influencing processes are assumed to be the basic building blocks in establishing parent-child bonding and in influencing cognitive and language behavior. A study by Jasnow and Feldstein (1986) revealed that, within the temporal domain of speech, preverbal (9-month-old) infants and their mothers exhibit a pattern of mutual influence (attunement) in their average durations of switching pauses. The general purpose of this research was to extend those findings to children with higher verbal functioning. In addition, parent and child genders, the nature of the interaction, and specific aspects of parents' personalities, expressiveness, and instrumentality were considered. Each parent interacted with their 4- or 5-year-old son or daughter in each of two conversations--unstructured (social conversation) and structured (task activity). Conversations were processed by an automated computer system yielding objective measures of turn, vocalization, pause, and switching-pause durations. To examine interspeaker influence or attunement of temporal speech patterns, "influence coefficients" were computed for each speaker on a "turn-by-turn" basis using time series regression. Analysis of these coefficients revealed that: (1) Mutual influence is most evident with switching-pause duration. (2) Structure in the conversation (as defined by the task or parental instrumentality) seems to facilitate attunement for vocalization and switching-pause duration. (3) Attunement with girls seems to occur equally well with both parents, while boys exhibit a style of temporal patterning influence which suggests greater identification with the father. (4) Expressiveness seems to facilitate attunement to the child's switching-pause duration.
NASA Astrophysics Data System (ADS)
Schwatke, Christian; Dettmering, Denise
2016-04-01
Satellite altimetry was designed for sea level monitoring over open ocean areas. However, for some years this technology has also been used to retrieve water levels from lakes, reservoirs, rivers, wetlands and, in general, any inland water body. In this contribution, a new approach for the estimation of inland water level time series is presented. The method is the basis for the computation of the time series of rivers and lakes available through the web service 'Database for Hydrological Time Series over Inland Water' (DAHITI). It is based on an extended outlier rejection and a Kalman filter approach incorporating cross-calibrated multi-mission altimeter data from Envisat, ERS-2, Jason-1, Jason-2, Topex/Poseidon, and SARAL/AltiKa, including their uncertainties. The new approach yields RMS differences with respect to in situ data of between 4 cm and 36 cm for lakes and between 8 cm and 114 cm for rivers. Within this presentation, the new approach is introduced and examples of water level time series are shown for a variety of lakes and rivers featuring different characteristics such as shape, lake extent, river width, and data coverage. A comprehensive validation is performed by comparison with in situ gauge data and results from external inland altimeter databases.
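The combination of outlier rejection and Kalman filtering described here can be illustrated with a toy one-dimensional version: a random-walk Kalman filter whose update step is gated on the innovation. This is a simplified stand-in for the DAHITI scheme, whose actual details are not given in the abstract; all values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic "multi-mission" water-level record: a seasonal signal sampled
# irregularly in time, with per-pass noise and a few gross blunders.
def truth(t):
    return 0.5 * np.sin(2 * np.pi * t)  # metres, period one year

times = np.sort(rng.uniform(0.0, 3.0, 120))
obs = truth(times) + rng.normal(0.0, 0.05, times.size)
obs[10::25] += 1.0                       # inject 1 m outliers

def kalman_level(times, obs, q=0.5, r=0.05 ** 2, gate=3.0):
    """Random-walk Kalman filter with an innovation gate: observations whose
    innovation exceeds `gate` predicted standard deviations are rejected."""
    x, p = obs[0], 1.0
    est = [x]
    for i in range(1, len(times)):
        p += q * (times[i] - times[i - 1])  # predict (random-walk model)
        s = p + r                           # innovation variance
        innov = obs[i] - x
        if innov ** 2 <= gate ** 2 * s:     # gate test rejects blunders
            k = p / s                       # Kalman gain
            x += k * innov
            p *= 1.0 - k
        est.append(x)
    return np.array(est)

level = kalman_level(times, obs)
rms = np.sqrt(np.mean((level - truth(times)) ** 2))
print(f"RMS difference vs truth: {100 * rms:.1f} cm")
```

The irregular sampling is handled by scaling the process noise with the time gap, which is what makes a Kalman approach attractive for merging altimeter missions with different repeat cycles.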
ERIC Educational Resources Information Center
Stifter, Cynthia A.; Rovine, Michael
2015-01-01
The focus of the present longitudinal study, to examine mother-infant interaction during the administration of immunizations at 2 and 6 months of age, used hidden Markov modelling, a time series approach that produces latent states to describe how mothers and infants work together to bring the infant to a soothed state. Results revealed a…
Mousavi, S M; Reihani, S N Seyed; Anvari, G; Anvari, M; Alinezhad, H G; Tabar, M Reza Rahimi
2017-07-06
We propose a nonlinear method for the analysis of time series of the spatial position of a bead trapped in optical tweezers, which enables us to reconstruct its dynamical equation of motion. The main advantage of the method is that all the functions and parameters of the dynamics are determined directly (non-parametrically) from the measured series. It also allows us to determine, for the first time to our knowledge, the spatial dependence of the diffusion coefficient of a bead in an optical trap, and to demonstrate that it is not in general constant. This is in contrast with the main assumption of the popularly used power spectrum calibration method. The proposed method is validated via synthetic time series for the bead position with spatially varying diffusion coefficients. Our detailed analysis of the measured time series reveals that the power spectrum analysis considerably overestimates the force constant.
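One standard way to recover drift and diffusion non-parametrically from a measured series is a Kramers-Moyal-style conditional-moment estimate; the sketch below applies it to a simulated Ornstein-Uhlenbeck "bead" with constant diffusion. This illustrates the general idea, not the authors' specific method, and all parameter values are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a trapped bead as an Ornstein-Uhlenbeck process:
# dx = -k*x*dt + sqrt(2*D)*dW, with constant diffusion D.
k, D, dt, n = 1.0, 0.5, 1e-3, 200_000
x = np.empty(n)
x[0] = 0.0
kicks = rng.normal(0.0, np.sqrt(2.0 * D * dt), n - 1)
for i in range(n - 1):
    x[i + 1] = x[i] - k * x[i] * dt + kicks[i]

# Non-parametric reconstruction: bin the positions, then estimate drift
# and diffusion from conditional moments of the increments in each bin.
edges = np.linspace(-1.5, 1.5, 31)
centers = 0.5 * (edges[:-1] + edges[1:])
idx = np.digitize(x[:-1], edges) - 1
dx = np.diff(x)
drift = np.full(centers.size, np.nan)      # D1(x); should recover -k*x
diffusion = np.full(centers.size, np.nan)  # D2(x); should recover D
for b in range(centers.size):
    sel = idx == b
    if sel.sum() > 100:
        drift[b] = dx[sel].mean() / dt
        diffusion[b] = (dx[sel] ** 2).mean() / (2.0 * dt)

mid = centers.size // 2
print(f"estimated D near x=0: {diffusion[mid]:.2f} (true value {D})")
```

A spatially varying diffusion coefficient, the case the paper addresses, would show up here as a non-flat diffusion[] profile across the bins.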
Analysis of cervical ribs in a series of human fetuses
Bots, Jessica; Wijnaendts, Liliane C D; Delen, Sofie; Van Dongen, Stefan; Heikinheimo, Kristiina; Galis, Frietson
2011-01-01
In humans, an increasing body of evidence has linked the frequency of cervical ribs to stillbirths, other malformations and early childhood cancers. However, the frequency of cervical ribs in a putatively healthy fetal population is not sufficiently known to assess the actual medical risks of these prenatal findings. We therefore analyzed the presence of skeletal anomalies in a series of 199 electively aborted fetuses, which were whole-mount stained with alizarin red specific for skeletal tissues. Results show that approximately 40% of the fetuses had cervical ribs, even though external congenital abnormalities such as craniofacial and limb defects were absent. A literature overview indicates that the observed frequency of cervical ribs is comparable to results previously obtained for deceased fetuses with no or minor congenital anomalies, and higher than expected for healthy fetuses. This unexpected result can probably in part be explained by a higher detection rate of small cervical ribs when using alizarin red staining instead of radiographs. Additionally, studies in the literature suggest that the size of a cervical rib may indicate the severity of abnormalities, but this possibility requires further research. Anomalies of the axial skeleton are known to be caused by a disturbance of early development, which alters Hox gene expression, but in this study the origin of the stress could not be verified as maternal medical data were not available. The co-occurrence of rudimentary or absent 12th ribs in 23.6% of the cases with cervical ribs indicates that in approximately 8% of the fetuses a homeotic shift occurred over a larger part of the vertebral column. This suggests that the expression of multiple Hox genes may have been affected in these fetuses. Together, the high incidence of cervical ribs and also their co-occurrence with rudimentary or absent 12th ribs suggests that there may have been a disturbance of early development such that the studied fetuses are
NASA Astrophysics Data System (ADS)
Meroni, M.; Fasbender, D.; Kayitakire, F.; Pini, G.; Rembold, F.; Urbano, F.; Verstraete, M. M.
2013-12-01
Timely information on vegetation development at regional scale is needed in arid and semiarid African regions, where rainfall variability leads to high inter-annual fluctuations in crop and pasture productivity, as well as to a high risk of food crises in the presence of severe drought events. The present study aims at developing and testing an automatic procedure to estimate the probability of experiencing a seasonal biomass production deficit solely on the basis of historical and near real-time remote sensing observations. The method is based on the extraction of vegetation phenology from SPOT-VEGETATION time series of the Fraction of Absorbed Photosynthetically Active Radiation (FAPAR) and the subsequent computation of seasonally cumulated FAPAR as a proxy for vegetation gross primary production. Within-season forecasts of the overall seasonal performance, expressed in terms of the probability of experiencing a critical deficit, are based on a statistical approach taking into account two factors: i) the similarity between the current FAPAR profile and past profiles observable in the 15-year FAPAR time series; ii) the uncertainty of past predictions of season outcome, as derived using a jack-knifing technique. The method is applicable at the regional to continental scale and can be updated regularly during the season (whenever a new satellite observation is made available) to provide a synoptic view of the hot spots of likely production deficit. The specific objective of the procedure described here is to deliver to the food security analyst, as early as possible within the season, only the relevant information (e.g., masking out areas without active vegetation at the time of analysis), expressed through a reliable and easily interpretable measure of impending risk. Evaluation of method performance and examples of application in the Sahel region are discussed.
Time series analysis of InSAR data: Methods and trends
NASA Astrophysics Data System (ADS)
Osmanoğlu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cabral-Cano, Enrique
2016-05-01
Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to resolve this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.
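The phase-wrapping issue is easy to demonstrate in one dimension. Real InSAR unwrapping is a much harder spatio-temporal problem, but NumPy's `unwrap` shows the principle on a synthetic subsidence signal; the C-band wavelength and deformation values below are assumptions for illustration only:

```python
import numpy as np

# Synthetic deformation whose two-way phase exceeds many radar cycles:
# interferometric phase is only observed modulo 2*pi ("wrapped").
wavelength_cm = 5.6                 # e.g. a C-band radar (assumed)
t = np.linspace(0.0, 1.0, 200)
deformation_cm = 8.0 * t            # 8 cm of subsidence over the period
true_phase = 4 * np.pi * deformation_cm / wavelength_cm
wrapped = np.angle(np.exp(1j * true_phase))   # values fall in (-pi, pi]

# 1-D temporal unwrapping: np.unwrap restores the missing 2*pi cycles,
# valid as long as successive phase steps stay below pi.
unwrapped = np.unwrap(wrapped)
recovered_cm = unwrapped * wavelength_cm / (4 * np.pi)
print(f"max reconstruction error: {np.max(np.abs(recovered_cm - deformation_cm)):.2e} cm")
```

The same ambiguity is why the review stresses that sampling must be dense enough, in time and space, for the phase difference between neighbors to stay under half a cycle.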
Structure in Photon Maps and Time Series: A New Approach to Bayesian Blocks
NASA Astrophysics Data System (ADS)
Scargle, J. D.; Norris, J. P.
2000-10-01
The Bayesian Blocks algorithm finds the most probable piecewise constant ("blocky") representation for time series in the form of binned, time-tagged, or time-to-spill photon counting data. In (Scargle, 1998, ApJ 504, 405) the number of blocks was determined in an ad hoc iterative procedure. Another approach maximizes the posterior -- after marginalizing all parameters except the number of blocks -- computed with Markov Chain Monte Carlo methods. A new, better algorithm starts with the Voronoi tessellation of the individual events in a data space of arbitrary dimension. (This generalization allows the solution of problems such as the detection of clusters in high-dimensional parameter spaces and the identification of structures in images.) In successive steps, these many cells are merged to form fewer, larger ones. The decision to merge two cells or keep them apart is based on comparison of the corresponding posterior probabilities. Let P(N, V) be the posterior for a Poisson model of a volume of size V containing N events, a function easily calculated explicitly. Then cells j and k are merged if $P(N_j + N_k, V_j + V_k) > P(N_j, V_j)\,P(N_k, V_k)$ and kept separate otherwise. When this criterion favors the merging of no further cells, computation halts. Local structures ("shots") in the variability of Cygnus X-1 and RXTE 1118+480 were detected in this way, using time-tagged photon data from the USA X-ray Telescope. Since no time bins are invoked, the full sub-millisecond time resolution of the USA instrument is maintained. The method contains no parameters other than those defining prior probability distributions, and therefore yields objective structure estimates. For image data, the cells need not be restricted to be simply connected, e.g. in order to treat background regions surrounding sources. Partly funded by the NASA Applied Information Systems Research Program.
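The cell-merging loop can be sketched in one dimension. Since the abstract does not spell out P(N, V), the sketch below substitutes the widely used maximum-likelihood block fitness N·ln(N/V) plus a constant per-block prior penalty, which plays the same role as comparing marginalized posteriors; it is an illustration of the merging idea, not the paper's exact algorithm:

```python
import math

def block_fitness(n, v):
    # Log-likelihood of a constant-rate Poisson block with n events in
    # length v (a common stand-in for the abstract's posterior P(N, V)).
    return n * (math.log(n) - math.log(v))

NCP_PRIOR = 4.0  # per-block prior penalty; controls how eagerly cells merge

def merge_pass(cells):
    """One left-to-right pass merging adjacent (n, v) cells whenever the
    merged block scores better than keeping the pair separate."""
    out, i = [], 0
    while i < len(cells):
        if i + 1 < len(cells):
            (nj, vj), (nk, vk) = cells[i], cells[i + 1]
            if (block_fitness(nj + nk, vj + vk) + NCP_PRIOR
                    > block_fitness(nj, vj) + block_fitness(nk, vk)):
                out.append((nj + nk, vj + vk))
                i += 2
                continue
        out.append(cells[i])
        i += 1
    return out

# Toy 1-D "tessellation": equal event counts per cell, cell length encodes
# local density -- a quiet stretch followed by a burst ("shot").
cells = [(10, 1.0)] * 4 + [(10, 0.05)] * 4
while True:
    merged = merge_pass(cells)
    if merged == cells:
        break
    cells = merged
print(cells)
```

On this input the loop collapses the quiet cells into one block and the burst cells into another, and refuses to merge across the rate change, which is exactly the behavior the merge criterion is designed to produce.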
Contribution of alcohol in accident related mortality in Belarus: a time series approach
Razvodovsky, Yury Evgeny
2012-01-01
Abstract: Background: High accidental death rates in the former Soviet republics (FSR) and their profound fluctuation over the past decades have attracted considerable interest. The research evidence emphasizes a binge drinking pattern as a potentially important contributor to the accident mortality crisis in the FSR. In line with this evidence, we assume that a higher level of alcohol consumption in conjunction with a binge drinking pattern results in a close aggregate-level association between alcohol psychoses and accidental death rates in the former Soviet Slavic republic of Belarus. Methods: Trends in the alcohol psychoses rate (as a proxy for alcohol consumption) from 1979 to 2007 were analyzed employing a distributed lag analysis in order to assess the bivariate relationship between the two time series. Results: According to the Bureau of Forensic Medicine autopsy reports, the number of deaths due to accidents and injuries increased by 52.5% (from 62.3 to 95.0 per 100,000 residents), and the fatal alcohol poisoning rate increased by 108.6% (from 12.8 to 26.7 per 100,000 residents) in Belarus between 1979 and 2007. Alcohol in the blood was found in 50.1% of the victims of deaths from accidents and injuries over the whole period, with a minimum of 40% in 1986 and a maximum of 58.2% in 2005. The outcome of the distributed lag analysis indicated a statistically significant association between the number of alcohol psychoses cases and the number of BAC-positive deaths from accidents at zero lag. Conclusion: The outcome of this study supports previous findings suggesting that alcohol and deaths from accidents are closely connected in a culture with a prevailing intoxication-oriented drinking pattern, and adds to the growing body of evidence that a substantial proportion of accidental deaths in Belarus is due to the effects of binge drinking. PMID:21502784
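In its simplest form, a distributed-lag screen of two annual series reduces to correlating their differenced values at a range of lags. The sketch below uses invented series that share a common driver at lag zero, mimicking the zero-lag association reported above; it is not the study's data or its full regression model:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical 29-year annual series (1979-2007): a shared "drinking level"
# signal drives both psychoses and accident-mortality rates at lag zero.
years = 29
drinking = np.cumsum(rng.normal(0.0, 1.0, years))
psychoses = drinking + rng.normal(0.0, 0.3, years)
accidents = drinking + rng.normal(0.0, 0.3, years)

def lagged_corr(x, y, lag):
    """Correlation of x(t) with y(t + lag), after first-differencing,
    which is the usual pre-whitening step for trending annual series."""
    dx, dy = np.diff(x), np.diff(y)
    if lag > 0:
        dx, dy = dx[:-lag], dy[lag:]
    elif lag < 0:
        dx, dy = dx[-lag:], dy[:lag]
    return np.corrcoef(dx, dy)[0, 1]

for lag in (-2, -1, 0, 1, 2):
    print(f"lag {lag:+d}: r = {lagged_corr(psychoses, accidents, lag):+.2f}")
```

A sharp peak at lag zero, as this toy example produces, is the signature the study interprets as both series responding to the same underlying drinking pattern.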
A Mixed Approach Of Automated ECG Analysis
NASA Astrophysics Data System (ADS)
De, A. K.; Das, J.; Majumder, D. Dutta
1982-11-01
ECG is one of the non-invasive and risk-free techniques for collecting data about the functional state of the heart. All such data-processing techniques can be classified into two basically different approaches -- the first- and second-generation ECG computer programs. Not the opposition, but the symbiosis of these two approaches will lead to systems with the highest accuracy. In this paper we describe a mixed approach which shows higher accuracy with less computational work. Key Words: Primary features, Patients' parameter matrix, Screening, Logical comparison technique, Multivariate statistical analysis, Mixed approach.
Mapping mountain pine beetle mortality through growth trend analysis of time-series landsat data
Liang, Lu; Chen, Yanlei; Hawbaker, Todd J.; Zhu, Zhi-Liang; Gong, Peng
2014-01-01
Disturbances are key processes in the carbon cycle of forests and other ecosystems. In recent decades, mountain pine beetle (MPB; Dendroctonus ponderosae) outbreaks have become more frequent and extensive in western North America. Remote sensing has the ability to fill the data gaps of long-term infestation monitoring, but the elimination of observational noise and the quantitative attribution of changes are two main challenges to its effective application. Here, we present a forest growth trend analysis method that integrates Landsat temporal trajectories and decision tree techniques to derive annual forest disturbance maps over an 11-year period. The temporal trajectory component successfully captures disturbance events as represented by spectral segments, whereas decision tree modeling efficiently recognizes and attributes events based upon the characteristics of the segments. Validated against a point set sampled across a gradient of MPB mortality, the method achieved 86.74% to 94.00% overall accuracy, with small variability in accuracy among years. In contrast, the overall accuracies of single-date classifications ranged from 37.20% to 75.20% and only became comparable with our approach when the training sample size was increased at least four-fold. This demonstrates that the advantage of this time series workflow lies in its small training sample size requirement. The easily understandable, interpretable and modifiable characteristics of our approach suggest that it could be applicable to other ecoregions.
Phenomenological analysis of medical time series with regular and stochastic components
NASA Astrophysics Data System (ADS)
Timashev, Serge F.; Polyakov, Yuriy S.
2007-06-01
Flicker-Noise Spectroscopy (FNS), a general approach to the extraction and parameterization of resonant and stochastic components contained in medical time series, is presented. The basic idea of FNS is to treat the correlation links present in sequences of different irregularities, such as spikes, "jumps", and discontinuities in derivatives of different orders, on all levels of the spatiotemporal hierarchy of the system under study as main information carriers. The tools to extract and analyze the information are power spectra and difference moments (structural functions), which complement the information of each other. The structural function stochastic component is formed exclusively by "jumps" of the dynamic variable while the power spectrum stochastic component is formed by both spikes and "jumps" on every level of the hierarchy. The information "passport" characteristics that are determined by fitting the derived expressions to the experimental variations for the stochastic components of power spectra and structural functions are interpreted as the correlation times and parameters that describe the rate of "memory loss" on these correlation time intervals for different irregularities. The number of the extracted parameters is determined by the requirements of the problem under study. Application of this approach to the analysis of tremor velocity signals for a Parkinsonian patient is discussed.
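The two complementary FNS tools named above, difference moments (structure functions) and power spectra, are straightforward to compute. The sketch below applies them to a surrogate signal with a regular (sinusoidal) plus stochastic (random-walk) component; it illustrates the quantities themselves, not the FNS parameterization or any patient data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Surrogate record: a resonant component plus a random-walk component,
# standing in for a signal such as tremor velocity.
n = 4096
t = np.arange(n)
signal = np.sin(2 * np.pi * t / 64) + 0.1 * np.cumsum(rng.normal(size=n))

def difference_moment(x, taus, order=2):
    """Structure function Phi(tau) = < |x(t + tau) - x(t)|^order >."""
    return np.array([np.mean(np.abs(x[tau:] - x[:-tau]) ** order)
                     for tau in taus])

taus = np.array([1, 2, 4, 8, 16, 32])
phi2 = difference_moment(signal, taus)

# Power spectrum of the same record; in FNS terms the two views complement
# each other: "jumps" shape the structure function, while both spikes and
# "jumps" shape the spectrum.
freqs = np.fft.rfftfreq(n)
power = np.abs(np.fft.rfft(signal - signal.mean())) ** 2 / n

print("Phi^(2)(tau):", np.round(phi2, 3))
```

Fitting analytic FNS expressions to phi2 and power is what yields the "passport" parameters (correlation times and memory-loss rates) described in the abstract.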
The Use of Scaffolding Approach to Enhance Students' Engagement in Learning Structural Analysis
ERIC Educational Resources Information Center
Hardjito, Djwantoro
2010-01-01
This paper presents a reflection on the use of Scaffolding Approach to engage Civil Engineering students in learning Structural Analysis subjects. In this approach, after listening to the lecture on background theory, students are provided with a series of practice problems, each one comes with the steps, formulas, hints, and tables needed to…
André, Claire; Guyon, Catherine; Thomassin, Mireille; Barbier, Alexandre; Richert, Lysiane; Guillaume, Yves-Claude
2005-06-05
The binding constants (K) of a series of anticoagulant rodenticides with the main soil organic component, humic acid (HA), were determined using a frontal analysis approach. The order of the binding constants was identical to the one obtained in a previous paper [J. Chromatogr. B 813 (2004) 295], i.e. bromadiolone>brodifacoum>difenacoum>chlorophacinone>diphacinone, confirming the power of this frontal analysis approach for the determination of binding constants. Moreover, and for the first time, the concentration of rodenticide unbound to HAs could be determined. Thanks to this approach, we could clearly demonstrate that HA protected the human hepatoma cell line HepG2 against the cytotoxicity of all the rodenticides tested and that the toxicity of the rodenticides was directly linked to the free rodenticide fraction in the medium (i.e. the rodenticide not bound to HA).
Information Retrieval and Graph Analysis Approaches for Book Recommendation.
Benkoussas, Chahinez; Bellot, Patrice
2015-01-01
A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex user queries. We used different theoretical retrieval models: probabilistic, such as InL2 (a Divergence from Randomness model), and language models, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval approach to a related-document network comprising social links. We call Directed Graph of Documents (DGD) a network constructed from documents and the social information provided by each of them. Specifically, this work tackles the problem of book recommendation in the context of the INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrates that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments.
Analysis of the temporal properties in car accident time series
NASA Astrophysics Data System (ADS)
Telesca, Luciano; Lovallo, Michele
2008-05-01
In this paper we study the time-clustering behavior of sequences of car accidents, using data from a freely available database in the internet. The Allan Factor analysis, which is a well-suited method to investigate time-dynamical behaviors in point processes, has revealed that the car accident sequences are characterized by a general time-scaling behavior, with the presence of cyclic components. These results indicate that the time dynamics of the events are not Poissonian but long range correlated with periodicities ranging from 12 h to 1 year.
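The Allan Factor used above has a compact definition: AF(T) = ⟨(N_{k+1} − N_k)²⟩ / (2⟨N_k⟩), where N_k counts events in the k-th window of length T. The sketch below checks the Poisson baseline AF ≈ 1, against which the time-scaling behavior reported for accident sequences would stand out; the event data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(3)

def allan_factor(event_times, window):
    """Allan Factor AF(T) for a point process. AF stays near 1 for a
    homogeneous Poisson process; AF growing with T signals clustering
    and long-range correlation."""
    edges = np.arange(0.0, event_times.max(), window)
    counts, _ = np.histogram(event_times, bins=edges)
    dn = np.diff(counts)
    return np.mean(dn ** 2) / (2.0 * np.mean(counts))

# A homogeneous Poisson "accident" sequence: AF should hover around 1
# at every counting window T.
poisson_events = np.cumsum(rng.exponential(1.0, 20_000))
for T in (1.0, 10.0, 100.0):
    print(f"T={T:6.1f}  AF={allan_factor(poisson_events, T):.2f}")
```

For real accident data the paper finds AF rising with T (a power-law-like signature) plus periodic modulation from the 12-hour to 1-year cycles, which is precisely a departure from this flat Poisson baseline.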
NASA Astrophysics Data System (ADS)
Gualandi, Adriano; Serpelloni, Enrico; Elina Belardinelli, Maria; Bonafede, Maurizio; Pezzo, Giuseppe; Tolomei, Cristiano
2015-04-01
A critical point in the analysis of ground displacement time series, such as those measured by modern space geodetic techniques (primarily continuous GPS/GNSS and InSAR), is the development of data-driven methods that allow one to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies: PCA does not perform well in solving the so-called Blind Source Separation (BSS) problem. Recovering and separating the different sources that generate the observed ground deformation is a fundamental task in order to assign a physical meaning to each source. PCA fails in the BSS problem because it looks for a new Euclidean space where the projected data are uncorrelated. The uncorrelatedness condition is usually not strong enough, and it has been shown that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all fields where PCA is also applied. An ICA approach enables us to explain the displacement time series while imposing fewer constraints on the model, and to reveal anomalies in the data such as transient deformation signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mixture of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources.
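As a point of reference for the PCA half of the discussion above, SVD-based PCA of a small station-by-time displacement matrix can be sketched in a few lines. This is plain PCA, not the vbICA method the abstract describes, and the synthetic station data are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(365)

# Synthetic daily displacements for 5 stations: a shared seasonal source
# plus a station-specific linear trend and measurement noise.
seasonal = np.sin(2 * np.pi * t / 365.0)
X = np.stack([a * seasonal + b * t / 365.0 + 0.05 * rng.standard_normal(t.size)
              for a, b in [(1.0, 0.2), (0.8, -0.1), (1.2, 0.3), (0.9, 0.0), (1.1, 0.15)]])

# PCA via SVD of the mean-centered data matrix (stations x time):
# rows of Vt are the principal component time series.
Xc = X - X.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)
print("variance explained by first two PCs:", explained[:2])

# PC1 tracks the dominant shared (seasonal) source.
corr = abs(np.corrcoef(Vt[0], seasonal)[0, 1])
print("corr(PC1, seasonal source):", round(corr, 3))
```

With mixed sources of comparable variance, the uncorrelated PCA components would blend them; that mixing is exactly the BSS failure that motivates the ICA/vbICA approach.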
NASA Astrophysics Data System (ADS)
Pal, Mayukha; Madhusudana Rao, P.; Manimaran, P.
2014-12-01
We apply the recently developed multifractal detrended cross-correlation analysis method to investigate the cross-correlation behavior and fractal nature of pairs of non-stationary time series. We analyze the daily return prices of gold, West Texas Intermediate and Brent crude oil, and foreign exchange rate data over a period of 18 years. The cross-correlation is quantified through the Hurst scaling exponents and the singularity spectrum. The results establish the existence of multifractal cross-correlations between all of these time series. We also found that the cross-correlation between gold and oil prices shows uncorrelated behavior, while the remaining bivariate time series show persistent behavior. For five bivariate series the cross-correlation exponents are less than the average generalized Hurst exponents (GHE) for q<0 and greater than the GHE for q>0, while for one bivariate series the cross-correlation exponent is greater than the GHE for all q values.
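The q = 2 core of detrended cross-correlation analysis (the multifractal version generalizes the box fluctuations to arbitrary moments q) can be sketched as follows. This is an illustrative reimplementation, not the authors' code, and the synthetic series stand in for the price data:

```python
import numpy as np

def dcca_fluctuation(x, y, s):
    """Detrended cross-correlation fluctuation F(s) at box size s:
    integrate both series, linearly detrend each box, and average the
    cross-covariance of the residuals over boxes."""
    X = np.cumsum(x - np.mean(x))   # integrated profiles
    Y = np.cumsum(y - np.mean(y))
    n_boxes = len(X) // s
    t = np.arange(s)
    f2 = []
    for i in range(n_boxes):
        xs, ys = X[i * s:(i + 1) * s], Y[i * s:(i + 1) * s]
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)  # linear detrend
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        f2.append(np.mean(rx * ry))
    return np.sqrt(np.mean(np.abs(f2)))

# Two synthetic "return" series sharing a common component: the slope of
# log F(s) vs log s estimates the cross-correlation scaling exponent,
# which is near 0.5 for uncorrelated (white-noise-like) increments.
rng = np.random.default_rng(1)
common = rng.standard_normal(4096)
x = common
y = 0.6 * common + 0.8 * rng.standard_normal(4096)
scales = np.array([16, 32, 64, 128, 256])
F = np.array([dcca_fluctuation(x, y, s) for s in scales])
slope = np.polyfit(np.log(scales), np.log(F), 1)[0]
print("cross-correlation exponent:", round(slope, 2))
```

An exponent above 0.5 would indicate persistent cross-correlated behavior, as the abstract reports for most of the bivariate series.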
Mixed Multifractal Analysis of Crude Oil, Gold and Exchange Rate Series
NASA Astrophysics Data System (ADS)
Dai, Meifeng; Shao, Shuxiang; Gao, Jianyu; Sun, Yu; Su, Weiyi
2016-11-01
The multifractal analysis of individual time series, e.g. crude oil, gold and exchange rate series, has often been studied. In this paper, we apply the classical multifractal and mixed multifractal spectra to study the multifractal properties of crude oil, gold and exchange rate series and their inner relationships. The results show that, in general, the fractal dimension of gold and crude oil is larger than that of the exchange rate (RMB against the US dollar), reflecting the fact that the gold and crude oil price series are more heterogeneous. Their mixed multifractal spectra exhibit a drift and are not symmetric, indicating a low level of mixed multifractality between each pair of the crude oil, gold and exchange rate series.
[Analysis of a case series of workers with mobbing syndrome].
Marinoni, B; Minelli, C M; Franzina, B; Martellosio, V; Scafa, F; Giorgi, I; Mazzacane, F; Stancanelli, M; Mennoia, N V; Candura, S M
2007-01-01
Mobbing represents a major challenge for occupational medicine today. Over the last seven years, we examined 253 patients who sought medical assistance for psychopathological problems that they ascribed to mobbing in the working environment. All patients underwent an occupational health visit, psychological counselling (including the administration of personality tests), and psychiatric evaluation. A clinical picture probably due to mobbing was diagnosed in 37 workers: 2 cases of Post-Traumatic Stress Disorder (PTSD), 33 of Adjustment Disorder (AD), and 2 of anxiety disorder. Regarding mobbing typology, we found 19 cases of vertical mobbing (by an employer/manager against employees), 14 cases of strategic mobbing, 3 cases of horizontal mobbing (among colleagues), and one case of non-intentional mobbing. In conclusion, a pure mobbing syndrome was diagnosed in a lower proportion of cases than reported by other investigators. The described interdisciplinary approach appears useful for the diagnostic assessment of suspected mobbing cases, which in turn is crucial for prognosis and treatment, as well as for medico-legal issues and work-related compensation claims.
NASA Astrophysics Data System (ADS)
Telesca, Luciano; Lovallo, Michele; Shaban, Amin; Darwich, Talal; Amacha, Nabil
2013-09-01
In this study, the time dynamics of the water flow from Anjar Spring, one of the major springs in the central part of Lebanon, was investigated. Like many water sources in Lebanon, this spring has no continuous discharge records, which prevents the application of standard time series analysis tools. Furthermore, the highly nonstationary character of the series means that suitable methodologies must be employed to gain insight into its dynamical features. Therefore, Singular Spectrum Analysis (SSA) and the Fisher-Shannon (FS) method, which are useful for disclosing dynamical features in noisy, nonstationary time series with gaps, are jointly applied to analyze the Anjar Spring water flow series. The SSA revealed that the series can be considered the superposition of meteo-climatic periodic components, a low-frequency trend, and noise-like high-frequency fluctuations. The FS method allowed the long-term trend of the series to be extracted and identified among all the SSA-reconstructed components. The long-term trend is characterized by a higher Fisher Information Measure (FIM) and lower Shannon entropy, and thus represents the main informative component of the whole series. Since water discharge time series generally present a very complex time structure, the joint application of SSA and the FS method can be very useful in disclosing the main informative part of such data series in view of existing climatic variability and/or anthropogenic pressures.
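The SSA decomposition described above proceeds in three steps: embed the series in a trajectory (Hankel) matrix, take its SVD, and reconstruct components by anti-diagonal averaging. A minimal sketch on synthetic data; the window length, component count, and function name are assumptions, not the authors' implementation:

```python
import numpy as np

def ssa_reconstruct(series, window, n_components):
    """Basic SSA: embed, SVD, reconstruct from the leading components
    by anti-diagonal (Hankel) averaging."""
    N = len(series)
    K = N - window + 1
    # Trajectory matrix: columns are lagged windows of the series.
    traj = np.column_stack([series[i:i + window] for i in range(K)])
    U, s, Vt = np.linalg.svd(traj, full_matrices=False)
    approx = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
    # Hankelisation: average each anti-diagonal to recover a series.
    recon = np.zeros(N)
    counts = np.zeros(N)
    for j in range(K):
        recon[j:j + window] += approx[:, j]
        counts[j:j + window] += 1
    return recon / counts

# Noisy annual cycle: the two leading SSA components (a sine occupies a
# rank-2 subspace) recover the oscillation from the noise.
rng = np.random.default_rng(2)
t = np.arange(730)
signal = np.sin(2 * np.pi * t / 365.0)
noisy = signal + 0.5 * rng.standard_normal(t.size)
recovered = ssa_reconstruct(noisy, window=120, n_components=2)
rmse = np.sqrt(np.mean((recovered - signal) ** 2))
print("reconstruction RMSE:", round(rmse, 3))
```

In practice the components are grouped by inspecting singular values and pairing; here the two leading components suffice because the signal is a single sinusoid.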
Genetic Programming Based Approach for Modeling Time Series Data of Real Systems
NASA Astrophysics Data System (ADS)
Ahalpara, Dilip P.; Parikh, Jitendra C.
Analytic models of a computer generated time series (logistic map) and three real time series (ion saturation current in Aditya Tokamak plasma, NASDAQ composite index and Nifty index) are constructed using Genetic Programming (GP) framework. In each case, the optimal map that results from fitting part of the data set also provides a very good description of the rest of the data. Predictions made using the map iteratively are very good for computer generated time series but not for the data of real systems. For such cases, an extended GP model is proposed and illustrated. A comparison of these results with those obtained using Artificial Neural Network (ANN) is also carried out.
Crystallographic analysis of a series of inorganic compounds
NASA Astrophysics Data System (ADS)
Borisov, S. V.; Magarill, S. A.; Pervukhina, N. V.
2015-04-01
The method of crystallographic analysis relies on the mechanical-wave concept that treats the crystalline state as the result of ordering of atomic positions by families of parallel equidistant planes. Using this method, a large set of fluoride, oxide and sulfide structures was analyzed. The pseudo-translational ordering of various atomic groups (including the presence of cation and anion sublattices) in the structures of various classes of inorganic compounds was established. The crucial role of local ordering of heavy cations (coherent assembly) in the structures comprising large cluster fragments (Keggin polyanions, polyoxoniobates, etc.) is discussed. The role of symmetry and the regular distribution of heavy atoms in the formation of stable crystal structures, which is to be taken into account in the targeted design, is considered. The universality of configurations of atomic positions in the structures of various classes of inorganic compounds resulting from the ordering mechanism organized by mechanical (elastic) forces is demonstrated. The bibliography includes 158 references.
Approaches to remote sensing data analysis
Pettinger, Lawrence R.
1978-01-01
Objectives: To present an overview of the essential steps in the remote sensing data analysis process, and to compare and contrast manual (visual) and automated analysis methods. Rationale: This overview is intended to provide a framework for choosing a manual or digital analysis approach to collecting resource information. It can also be used as a basis for understanding/evaluating invited papers and poster sessions during the Symposium.
Gherardi, Bianca; Tommaso, Giulia; Ranzi, Andrea; Zauli Sajani, Stefano; De Togni, Aldo; Pizzi, Lorenzo; Lauriola, Paolo
2015-01-01
Objectives: to compare the meta-analysis and pooled analysis approaches for studying the short-term effects of air pollution on human health in the cities of the Emilia-Romagna Region (Northern Italy), which are characterised by strong homogeneity of environmental and sociodemographic features. Methods: application of fixed-effects meta-analysis and fixed-effects pooled analysis to time-series data of seven cities in Emilia-Romagna over the period 2006-2010. The relationship between adverse health events (deaths due to natural causes, cardiovascular disease, cerebrovascular disease and respiratory disease) and concentrations of PM10, PM2.5 and NO2 was investigated by means of GAM models, using the EpiAir protocol. Results: the pooled analysis yielded a gain in the precision of effect estimates with respect to the meta-analysis approach. The interval widths of the pooled analysis estimates are lower than those of the meta-analytic estimates, with percentage reductions between 7% and 43%. This increase in power led to a greater number of statistically significant pooled analysis estimates. There was generally good correspondence between the two methods in terms of the direction and strength of the association between health outcomes and the various pollutants. An exception is the PM10 effect estimate on respiratory mortality, where the meta-analytic estimate was significantly higher and not in line with literature data. Conclusions: the study highlighted the increase in accuracy and stability of effect estimates obtained from a pooled analysis compared to a meta-analysis in a regional context such as Emilia-Romagna, characterised by the absence of heterogeneity in exposure to pollutants and other confounders. In this context, the pooled approach is to be considered preferable to meta-analysis.
NASA Astrophysics Data System (ADS)
Schwatke, C.; Dettmering, D.; Bosch, W.; Seitz, F.
2015-05-01
Satellite altimetry has been designed for sea level monitoring over open ocean areas. However, for some years this technology has also been used for observing the inland water levels of lakes and rivers. In this paper, a new approach for the estimation of inland water level time series is described. It is used for the computation of the time series available through the web service "Database for Hydrological Time Series over Inland Water" (DAHITI). The method is based on a Kalman filter approach incorporating multi-mission altimeter observations and their uncertainties. As input data, cross-calibrated altimeter data from Envisat, ERS-2, Jason-1, Jason-2, TOPEX/Poseidon, and SARAL/AltiKa are used. The paper presents water level time series for a variety of lakes and rivers in North and South America featuring different characteristics such as shape, lake extent, river width, and data coverage. A comprehensive validation is performed by comparison with in-situ gauge data and results from external inland altimeter databases. The new approach yields RMS differences with respect to in-situ data between 4 and 38 cm for lakes and between 12 and 139 cm for rivers. For most study cases, more accurate height information than from other available altimeter databases can be achieved.
NASA Astrophysics Data System (ADS)
Schwatke, C.; Dettmering, D.; Bosch, W.; Seitz, F.
2015-10-01
Satellite altimetry has been designed for sea level monitoring over open ocean areas. However, for some years, this technology has also been used to retrieve water levels from reservoirs, wetlands and in general any inland water body, although the radar altimetry technique has been especially applied to rivers and lakes. In this paper, a new approach for the estimation of inland water level time series is described. It is used for the computation of time series of rivers and lakes available through the web service "Database for Hydrological Time Series over Inland Waters" (DAHITI). The new method is based on an extended outlier rejection and a Kalman filter approach incorporating cross-calibrated multi-mission altimeter data from Envisat, ERS-2, Jason-1, Jason-2, TOPEX/Poseidon, and SARAL/AltiKa, including their uncertainties. The paper presents water level time series for a variety of lakes and rivers in North and South America featuring different characteristics such as shape, lake extent, river width, and data coverage. A comprehensive validation is performed by comparisons with in situ gauge data and results from external inland altimeter databases. The new approach yields rms differences with respect to in situ data between 4 and 36 cm for lakes and 8 and 114 cm for rivers. For most study cases, more accurate height information than from other available altimeter databases can be achieved.
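The multi-mission fusion at the heart of the Kalman filter approach can be reduced to a one-dimensional random-walk sketch: each mission's measurement is weighted by its uncertainty, and the state simply propagates across epochs with no acquisition. The numbers, variances, and function name below are illustrative assumptions, not DAHITI's actual processing:

```python
def kalman_1d(observations, process_var, first_guess_var=1e6):
    """Random-walk Kalman filter: the state is the water level;
    observations are (value, variance) pairs from different altimeter
    missions, with None marking epochs without an acquisition."""
    level, var = 0.0, first_guess_var
    estimates = []
    for obs in observations:
        var += process_var                   # predict: variance grows over a gap
        if obs is not None:
            value, obs_var = obs
            gain = var / (var + obs_var)     # weight by relative uncertainty
            level += gain * (value - level)  # update with the innovation
            var *= (1.0 - gain)
        estimates.append(level)
    return estimates

# Mixed-mission lake levels (metres): precise and noisy measurements,
# with data gaps, fused into one consistent time series.
obs = [(530.2, 0.04), None, (530.5, 0.25), (530.4, 0.04), None, (530.6, 0.04)]
levels = kalman_1d(obs, process_var=0.01)
print([round(v, 2) for v in levels])
```

Note how the noisy measurement (variance 0.25) moves the estimate far less than the precise ones, which is exactly why incorporating per-mission uncertainties matters.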
A quantitative approach to scar analysis.
Khorasani, Hooman; Zheng, Zhong; Nguyen, Calvin; Zara, Janette; Zhang, Xinli; Wang, Joyce; Ting, Kang; Soo, Chia
2011-02-01
Analysis of collagen architecture is essential to wound healing research. However, to date no consistent methodologies exist for quantitatively assessing dermal collagen architecture in scars. In this study, we developed a standardized approach for quantitative analysis of scar collagen morphology by confocal microscopy using fractal dimension and lacunarity analysis. Full-thickness wounds were created on adult mice, closed by primary intention, and harvested at 14 days after wounding for morphometrics and standard Fourier transform-based scar analysis as well as fractal dimension and lacunarity analysis. In addition, transmission electron microscopy was used to evaluate collagen ultrastructure. We demonstrated that fractal dimension and lacunarity analysis were superior to Fourier transform analysis in discriminating scar versus unwounded tissue in a wild-type mouse model. To fully test the robustness of this scar analysis approach, a fibromodulin-null mouse model that heals with increased scar was also used. Fractal dimension and lacunarity analysis effectively discriminated unwounded fibromodulin-null versus wild-type skin as well as healing fibromodulin-null versus wild-type wounds, whereas Fourier transform analysis failed to do so. Furthermore, fractal dimension and lacunarity data also correlated well with transmission electron microscopy collagen ultrastructure analysis, adding to their validity. These results demonstrate that fractal dimension and lacunarity are more sensitive than Fourier transform analysis for quantification of scar morphology.
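Fractal dimension of a binarized image is typically estimated by box counting: cover the image with boxes of several sizes s, count the occupied boxes N(s), and fit the slope of log N(s) against log(1/s). A minimal sketch, illustrative rather than the study's actual confocal pipeline:

```python
import numpy as np

def box_counting_dimension(image, sizes=(2, 4, 8, 16, 32)):
    """Box-counting dimension of a binary image: slope of
    log N(s) versus log(1/s) over the given box sizes."""
    counts = []
    for s in sizes:
        h, w = image.shape[0] // s, image.shape[1] // s
        # Group pixels into s x s boxes and mark each box occupied
        # if any pixel inside it is set.
        boxed = image[:h * s, :w * s].reshape(h, s, w, s)
        counts.append(boxed.any(axis=(1, 3)).sum())
    return np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)[0]

# Sanity checks: a filled plane has dimension 2, a straight line 1.
filled = np.ones((256, 256), dtype=bool)
line = np.zeros((256, 256), dtype=bool)
line[128, :] = True
print(round(box_counting_dimension(filled), 2))  # ~2.0
print(round(box_counting_dimension(line), 2))    # ~1.0
```

Collagen networks in scars fall between these extremes; the study's point is that this dimension (together with lacunarity, which measures the gappiness of the pattern) separates scar from unwounded tissue better than Fourier-based measures.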
A Quantitative Approach to Scar Analysis
Khorasani, Hooman; Zheng, Zhong; Nguyen, Calvin; Zara, Janette; Zhang, Xinli; Wang, Joyce; Ting, Kang; Soo, Chia
2011-01-01
Analysis of collagen architecture is essential to wound healing research. However, to date no consistent methodologies exist for quantitatively assessing dermal collagen architecture in scars. In this study, we developed a standardized approach for quantitative analysis of scar collagen morphology by confocal microscopy using fractal dimension and lacunarity analysis. Full-thickness wounds were created on adult mice, closed by primary intention, and harvested at 14 days after wounding for morphometrics and standard Fourier transform-based scar analysis as well as fractal dimension and lacunarity analysis. In addition, transmission electron microscopy was used to evaluate collagen ultrastructure. We demonstrated that fractal dimension and lacunarity analysis were superior to Fourier transform analysis in discriminating scar versus unwounded tissue in a wild-type mouse model. To fully test the robustness of this scar analysis approach, a fibromodulin-null mouse model that heals with increased scar was also used. Fractal dimension and lacunarity analysis effectively discriminated unwounded fibromodulin-null versus wild-type skin as well as healing fibromodulin-null versus wild-type wounds, whereas Fourier transform analysis failed to do so. Furthermore, fractal dimension and lacunarity data also correlated well with transmission electron microscopy collagen ultrastructure analysis, adding to their validity. These results demonstrate that fractal dimension and lacunarity are more sensitive than Fourier transform analysis for quantification of scar morphology. PMID:21281794
Discrete Fourier analysis of ultrasound RF time series for detection of prostate cancer.
Moradi, M; Mousavi, P; Siemens, D R; Sauerbrei, E E; Isotalo, P; Boag, A; Abolmaesumi, P
2007-01-01
In this paper, we demonstrate that a set of six features extracted from the discrete Fourier transform of ultrasound Radio-Frequency (RF) time series can be used to detect prostate cancer with high sensitivity and specificity. Ultrasound RF time series refer to a series of echoes received from one spatial location of tissue while the imaging probe and the tissue are fixed in position. Our previous investigations have shown that at least one feature of these signals, the fractal dimension, demonstrates strong correlation with the tissue microstructure. In the current paper, six new features that represent the frequency spectrum of the RF time series have been used, in conjunction with a neural network classification approach, to detect prostate cancer in regions of tissue as small as 0.03 cm2. Based on pathology results used as the gold standard, we achieved a mean accuracy of 91%, a mean sensitivity of 92% and a mean specificity of 90% on seven human prostates.
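Band-wise summaries of the discrete Fourier spectrum of an RF time series, as described above, can be sketched as follows. The band count and names are illustrative, and this is not the authors' exact feature set:

```python
import numpy as np

def rf_spectral_features(rf_series, n_bands=6):
    """Summarise an RF time series (echoes from one fixed tissue location)
    by the mean magnitude of its discrete Fourier spectrum in n_bands
    equal-width frequency bands -- a small spectral feature vector that
    could feed a classifier."""
    spectrum = np.abs(np.fft.rfft(rf_series - np.mean(rf_series)))
    bands = np.array_split(spectrum, n_bands)
    return np.array([band.mean() for band in bands])

# A slow oscillation concentrates energy in the lowest band, whereas
# broadband noise spreads it across all six.
rng = np.random.default_rng(3)
t = np.arange(256)
slow = np.sin(2 * np.pi * t / 64.0)
features = rf_spectral_features(slow + 0.1 * rng.standard_normal(t.size))
print(features.argmax())  # the lowest-frequency band dominates
```

In the paper, one such feature vector per small tissue region is passed to a neural network trained against pathology labels.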
Engine Control Improvement through Application of Chaotic Time Series Analysis
Green, J.B., Jr.; Daw, C.S.
2003-07-15
The objective of this program was to investigate cyclic variations in spark-ignition (SI) engines under lean fueling conditions and to develop options to reduce emissions of nitrogen oxides (NOx) and particulate matter (PM) in compression-ignition direct-injection (CIDI) engines at high exhaust gas recirculation (EGR) rates. The CIDI activity builds upon an earlier collaboration between ORNL and Ford examining combustion instabilities in SI engines. Under the original CRADA, the principal objective was to understand the fundamental causes of combustion instability in spark-ignition engines operating with lean fueling. The results of this earlier activity demonstrated that such combustion instabilities are dominated by the effects of residual gas remaining in each cylinder from one cycle to the next. A very simple, low-order model was developed that explained the observed combustion instability as a noisy nonlinear dynamical process. The model concept led to the development of a real-time control strategy that could be employed to significantly reduce cyclic variations in real engines using existing sensors and engine control systems. This collaboration led to the issuance of a joint patent for spark-ignition engine control. After a few years, the CRADA was modified to focus more on EGR and CIDI engines. The modified CRADA examined relationships between EGR, combustion, and emissions in CIDI engines. Information from CIDI engine experiments, data analysis, and modeling was employed to identify and characterize new combustion regimes where it is possible to simultaneously achieve significant reductions in NOx and PM emissions. These results were also used to develop an on-line combustion diagnostic (virtual sensor) to make cycle-resolved combustion quality assessments for active feedback control. Extensive experiments on engines at Ford and ORNL led to the development of the virtual sensor concept that may be able to detect simultaneous reductions in NOx and PM
Chaos in Electronic Circuits: Nonlinear Time Series Analysis
Wheat, Jr., Robert M.
2003-07-01
Chaos in electronic circuits is a phenomenon that was largely ignored by engineers, manufacturers, and researchers until the early 1990s and the work of Chua, Matsumoto, and others. As the world becomes more dependent on electronic devices, the detrimental effects of non-normal operation of these devices become more significant. Developing a better understanding of the mechanisms involved in the chaotic behavior of electronic circuits is a logical step toward the prediction and prevention of any potentially catastrophic occurrence of this phenomenon. A better understanding of chaotic behavior in a general sense could also lead to better accuracy in the prediction of natural events such as weather, volcanic activity, and earthquakes. As a first step in this improvement of understanding, and as part of the research being reported here, methods of computer modeling, identifying and analyzing, and producing chaotic behavior in simple electronic circuits have been developed. The computer models were developed using both the Alternative Transient Program (ATP) and Spice, the analysis techniques have been implemented using the C and C++ programming languages, and the chaotically behaving circuits were built using "off the shelf" electronic components.
CCD Observing and Dynamical Time Series Analysis of Active Galactic Nuclei.
NASA Astrophysics Data System (ADS)
Nair, Achotham Damodaran
1995-01-01
The properties, working and operations procedure of the Charge Coupled Device (CCD) at the 30" telescope at Rosemary Hill Observatory (RHO) are discussed together with the details of data reduction. Several nonlinear techniques of time series analysis, based on the behavior of the nearest neighbors, have been used to analyze the time series of the quasar 3C 345. A technique using Artificial Neural Networks based on prediction of the time series is used to study the dynamical properties of 3C 345. Finally, a heuristic model for variability of Active Galactic Nuclei is discussed.
Highway Subsidence Analysis Based on the Advanced InSAR Time Series Analysis Method
NASA Astrophysics Data System (ADS)
Zhang, Qingyun; Zhang, Jingfa; Liu, Guolin; Li, Yongsheng
2016-08-01
Interferometric synthetic aperture radar (InSAR) measurements offer all-weather operation, wide coverage, and high precision for surface deformation monitoring. Highways are an important index of modern social and economic development, and their quality and deformation during use have a significant impact on social development and on the safety of people's lives and property. In practical applications, InSAR requires a variety of error-correction analyses. Using a new analysis method, the FRAM-SBAS time series analysis method, we analyzed highway settlement in the Yanzhou area with ALOS PALSAR data. The FRAM-SBAS method was applied to obtain the surface deformation time series of the Jining area from 2008-09-21 to 2010-07-18, with good results: the maximum cumulative settlement in the Jining area is 60 mm, with a maximum settlement rate of 30 mm/yr, while the maximum settlement along the highway section is 53 mm, with a maximum settlement rate of 32 mm/yr. The worst-settling highway sections lie within areas of severe ground subsidence, demonstrating the effect of mining and vehicle loads on highway settlement. These results show that the time series method is feasible for monitoring ground and highway subsidence.
Low, Diana H P; Motakis, Efthymios
2013-10-01
Binding free energy calculations obtained through molecular dynamics simulations reflect intermolecular interaction states through a series of independent snapshots. Typically, the free energies of multiple simulated series (each with slightly different starting conditions) need to be estimated. Previous approaches carry out this task using moving averages at certain decorrelation times, assuming that the system comes from a single-conformation description of binding events. Here, we discuss a more general approach that uses statistical modeling, wavelet denoising and hierarchical clustering to estimate the significance of multiple statistically distinct subpopulations, reflecting potential macrostates of the system. We present the deltaGseg R package, which performs macrostate estimation from multiple replicated series and allows molecular biologists/chemists to gain physical insight into molecular details that are not easily accessible by experimental techniques. deltaGseg is a Bioconductor R package available at http://bioconductor.org/packages/release/bioc/html/deltaGseg.html.