Science.gov

Sample records for series analysis approach

  1. A multiscale approach to InSAR time series analysis

    NASA Astrophysics Data System (ADS)

    Simons, M.; Hetland, E. A.; Muse, P.; Lin, Y. N.; Dicaprio, C.; Rickerby, A.

    2008-12-01

    We describe a new technique to constrain time-dependent deformation from repeated satellite-based InSAR observations of a given region. This approach, which we call MInTS (Multiscale analysis of InSAR Time Series), relies on a spatial wavelet decomposition to permit the inclusion of distance-based spatial correlations in the observations while maintaining computational tractability. This approach also permits a consistent treatment of all data independent of the presence of localized holes in any given interferogram. In essence, MInTS allows one to consider all data at the same time (as opposed to one pixel at a time), thereby taking advantage of both spatial and temporal characteristics of the deformation field. In terms of the temporal representation, we have the flexibility to explicitly parametrize known processes that are expected to contribute to a given set of observations (e.g., co-seismic steps and post-seismic transients, secular variations, seasonal oscillations, etc.). Our approach also allows the temporal parametrization to include a set of general functions (e.g., splines) in order to account for unexpected processes. We allow for various forms of model regularization, using a cross-validation approach to select penalty parameters. The multiscale analysis allows us to consider various contributions (e.g., orbit errors) that may affect specific scales but not others. The methods described here are all embarrassingly parallel and suitable for implementation on a cluster computer. We demonstrate the use of MInTS using a large suite of ERS-1/2 and Envisat interferograms for Long Valley Caldera, and validate our results by comparing with ground-based observations.
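
    The two ingredients the abstract describes, a spatial wavelet transform of each interferogram followed by a temporal model fit to every wavelet coefficient, can be illustrated with a toy sketch. This is not the MInTS code: the one-level Haar transform, the synthetic deformation stack, and the small temporal dictionary (offset, secular rate, annual sine/cosine) are all simplifying assumptions made here for illustration.

```python
import numpy as np

def haar2d(img):
    """One-level 2D Haar transform: coarse average plus three detail bands."""
    a = (img[0::2, 0::2] + img[0::2, 1::2] + img[1::2, 0::2] + img[1::2, 1::2]) / 4.0
    h = (img[0::2, 0::2] - img[0::2, 1::2] + img[1::2, 0::2] - img[1::2, 1::2]) / 4.0
    v = (img[0::2, 0::2] + img[0::2, 1::2] - img[1::2, 0::2] - img[1::2, 1::2]) / 4.0
    d = (img[0::2, 0::2] - img[0::2, 1::2] - img[1::2, 0::2] + img[1::2, 1::2]) / 4.0
    return a, h, v, d

def fit_temporal(coeffs, t):
    """Least-squares fit of a small temporal dictionary (offset, secular
    rate, annual sine/cosine) to each wavelet coefficient's time series."""
    G = np.column_stack([np.ones_like(t), t,
                         np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
    n, ny, nx = coeffs.shape
    m, *_ = np.linalg.lstsq(G, coeffs.reshape(n, -1), rcond=None)
    return m.reshape(G.shape[1], ny, nx)

# synthetic "interferogram stack": a smooth spatial pattern deforming at a
# constant secular rate
t = np.linspace(0.0, 5.0, 20)                       # acquisition times (years)
y, x = np.mgrid[0:16, 0:16]
pattern = np.exp(-((x - 8) ** 2 + (y - 8) ** 2) / 20.0)
stack = t[:, None, None] * pattern[None, :, :]

coarse = np.stack([haar2d(img)[0] for img in stack])  # coarse band per epoch
params = fit_temporal(coarse, t)                      # params[1] = secular rate
```

    Because the transform is linear, fitting in the wavelet domain recovers, coefficient by coefficient, the same secular rate that a pixel-wise fit would.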

  2. A Multiscale Approach to InSAR Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Hetland, E. A.; Muse, P.; Simons, M.; Lin, N.; Dicaprio, C. J.

    2010-12-01

    We present a technique to constrain time-dependent deformation from repeated satellite-based InSAR observations of a given region. This approach, which we call MInTS (Multiscale InSAR Time Series analysis), relies on a spatial wavelet decomposition to permit the inclusion of distance-based spatial correlations in the observations while maintaining computational tractability. As opposed to single pixel InSAR time series techniques, MInTS takes advantage of both spatial and temporal characteristics of the deformation field. We use a weighting scheme which accounts for the presence of localized holes due to decorrelation or unwrapping errors in any given interferogram. We represent time-dependent deformation using a dictionary of general basis functions, capable of detecting both steady and transient processes. The estimation is regularized using a model resolution based smoothing so as to be able to capture rapid deformation where there are temporally dense radar acquisitions and to avoid oscillations during time periods devoid of acquisitions. MInTS also has the flexibility to explicitly parametrize known time-dependent processes that are expected to contribute to a given set of observations (e.g., co-seismic steps and post-seismic transients, secular variations, seasonal oscillations, etc.). We use cross validation to choose the regularization penalty parameter in the inversion for the time-dependent deformation field. We demonstrate MInTS using a set of 63 ERS-1/2 and 29 Envisat interferograms for Long Valley Caldera.

  3. A Multiscale Approach to InSAR Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Simons, M.; Hetland, E. A.; Muse, P.; Lin, Y.; Dicaprio, C. J.

    2009-12-01

    We describe progress in the development of MInTS (Multiscale analysis of InSAR Time Series), an approach to constructing self-consistent time-dependent deformation fields from repeated satellite-based InSAR images of a given region. MInTS relies on a spatial wavelet decomposition to permit the inclusion of distance-based spatial correlations in the observations while maintaining computational tractability. In essence, MInTS allows one to consider all data at the same time, as opposed to one pixel at a time, thereby taking advantage of both spatial and temporal characteristics of the deformation field. This approach also permits a consistent treatment of all data independent of the presence of localized holes due to unwrapping issues in any given interferogram. Specifically, the presence of holes is accounted for through a weighting scheme that accounts for the extent of actual data versus the area of holes associated with any given wavelet. In terms of the temporal representation, we have the flexibility to explicitly parametrize known processes that are expected to contribute to a given set of observations (e.g., co-seismic steps and post-seismic transients, secular variations, seasonal oscillations, etc.). Our approach also allows the temporal parametrization to include a set of general functions in order to account for unexpected processes. We allow for various forms of model regularization, using a cross-validation approach to select penalty parameters. We also experiment with the use of sparsity-inducing regularization as a way to select from a large dictionary of time functions. The multiscale analysis allows us to consider various contributions (e.g., orbit errors) that may affect specific scales but not others. The methods described here are all embarrassingly parallel and suitable for implementation on a cluster computer. We demonstrate the use of MInTS using a large suite of ERS-1/2 and Envisat interferograms for Long Valley Caldera, and validate our results by comparing with ground-based observations.

  4. A Comparison of Alternative Approaches to the Analysis of Interrupted Time-Series.

    ERIC Educational Resources Information Center

    Harrop, John W.; Velicer, Wayne F.

    1985-01-01

    Computer generated data representative of 16 autoregressive integrated moving average (ARIMA) models were used to compare the results of interrupted time-series analysis using: (1) the known model identification, (2) an assumed (1,0,0) model, and (3) an assumed (3,0,0) model as an approximation to the General Transformation approach. (Author/BW)

  5. Complex networks approach to geophysical time series analysis: Detecting paleoclimate transitions via recurrence networks

    NASA Astrophysics Data System (ADS)

    Donner, R. V.; Zou, Y.; Donges, J. F.; Marwan, N.; Kurths, J.

    2009-12-01

    We present a new approach for analysing structural properties of time series from complex systems. Starting from the concept of recurrences in phase space, the recurrence matrix of a time series is interpreted as the adjacency matrix of an associated complex network which links different points in time if the evolution of the considered states is very similar. A critical comparison of these recurrence networks with similar existing techniques is presented, revealing strong conceptual benefits of the new approach, which can be considered as a unifying framework for transforming time series into complex networks that also includes other methods as special cases. Based on different model systems, we demonstrate that there are fundamental interrelationships between the topological properties of recurrence networks and the statistical properties of the phase space density of the underlying dynamical system. Hence, the network description yields new quantitative characteristics of the dynamical complexity of a time series, which substantially complement existing measures of recurrence quantification analysis. Finally, we illustrate the potential of our approach for detecting hidden dynamical transitions from geoscientific time series by applying it to different paleoclimate records. In particular, we are able to resolve previously unknown climatic regime shifts in East Africa during roughly the last 4 million years, which might have had a considerable influence on the evolution of hominids in the area.
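
    The core construction the abstract describes, thresholding a phase-space distance matrix and reusing it as a network adjacency matrix, can be sketched in a few lines. This is only an illustration, not the authors' implementation; the embedding parameters and the threshold `eps` below are arbitrary choices.

```python
import numpy as np

def recurrence_network(x, dim=3, tau=1, eps=0.5):
    """Embed a scalar series, threshold pairwise state distances, and
    reuse the recurrence matrix as the adjacency matrix of a network."""
    n = len(x) - (dim - 1) * tau
    states = np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])
    dists = np.linalg.norm(states[:, None, :] - states[None, :, :], axis=-1)
    A = (dists < eps).astype(int)
    np.fill_diagonal(A, 0)          # no self-loops
    return A

rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 20 * np.pi, 500)) + 0.05 * rng.standard_normal(500)
A = recurrence_network(x, dim=3, tau=5, eps=0.3)
degree = A.sum(axis=1)              # proxy for local phase-space density
density = A.sum() / (A.shape[0] * (A.shape[0] - 1))
```

    Topological measures (degree, clustering, path lengths) computed on `A` then characterize the phase-space geometry, which is the link to dynamics the abstract emphasizes.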

  6. Statistical Analysis of fMRI Time-Series: A Critical Review of the GLM Approach.

    PubMed

    Monti, Martin M

    2011-01-01

    Functional magnetic resonance imaging (fMRI) is one of the most widely used tools to study the neural underpinnings of human cognition. Standard analysis of fMRI data relies on a general linear model (GLM) approach to separate stimulus induced signals from noise. Crucially, this approach relies on a number of assumptions about the data which, for inferences to be valid, must be met. The current paper reviews the GLM approach to analysis of fMRI time-series, focusing in particular on the degree to which such data abides by the assumptions of the GLM framework, and on the methods that have been developed to correct for any violation of those assumptions. Rather than biasing estimates of effect size, the major consequence of non-conformity to the assumptions is to introduce bias into estimates of the variance, thus affecting test statistics, power, and false positive rates. Furthermore, this bias can have pervasive effects on both individual subject and group-level statistics, potentially yielding qualitatively different results across replications, especially after the thresholding procedures commonly used for inference-making.

  7. Statistical Analysis of fMRI Time-Series: A Critical Review of the GLM Approach

    PubMed Central

    Monti, Martin M.

    2011-01-01

    Functional magnetic resonance imaging (fMRI) is one of the most widely used tools to study the neural underpinnings of human cognition. Standard analysis of fMRI data relies on a general linear model (GLM) approach to separate stimulus induced signals from noise. Crucially, this approach relies on a number of assumptions about the data which, for inferences to be valid, must be met. The current paper reviews the GLM approach to analysis of fMRI time-series, focusing in particular on the degree to which such data abides by the assumptions of the GLM framework, and on the methods that have been developed to correct for any violation of those assumptions. Rather than biasing estimates of effect size, the major consequence of non-conformity to the assumptions is to introduce bias into estimates of the variance, thus affecting test statistics, power, and false positive rates. Furthermore, this bias can have pervasive effects on both individual subject and group-level statistics, potentially yielding qualitatively different results across replications, especially after the thresholding procedures commonly used for inference-making. PMID:21442013

  8. Detection of chaos: New approach to atmospheric pollen time-series analysis

    NASA Astrophysics Data System (ADS)

    Bianchi, M. M.; Arizmendi, C. M.; Sanchez, J. R.

    1992-09-01

    Pollen and spores are biological particles that are ubiquitous in the atmosphere and are pathologically significant, causing plant diseases and inhalant allergies. One of the main objectives of aerobiological surveys is forecasting. Prediction models are required in order to apply aerobiological knowledge to medical or agricultural practice; a necessary condition for these models is that the dynamics not be chaotic. The existence of chaos is detected through the analysis of a time series. The time series comprises hourly counts of atmospheric pollen grains obtained using a Burkard spore trap from 1987 to 1989 at Mar del Plata. Abraham's method to obtain the correlation dimension was applied. A low and fractal dimension indicates chaotic dynamics. The predictability of models for atmospheric pollen forecasting is discussed.
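
    The correlation-dimension estimate the abstract relies on can be illustrated with a Grassberger-Procaccia-style correlation sum (a standard estimator in the same family as the method the abstract cites; the embedding parameters and radii below are assumptions, not the paper's settings).

```python
import numpy as np

def embed(x, dim, tau):
    """Time-delay embedding of a scalar series into dim-dimensional states."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

def correlation_dimension(x, dim=2, tau=1):
    """Estimate D2 as the log-log slope of the correlation sum C(r),
    the fraction of distinct state pairs closer than r."""
    states = embed(x, dim, tau)
    d = np.linalg.norm(states[:, None, :] - states[None, :, :], axis=-1)
    pair = d[np.triu_indices(len(states), k=1)]
    radii = np.std(x) * np.logspace(-1, 0, 10)
    C = np.array([np.mean(pair < r) for r in radii])
    slope, _ = np.polyfit(np.log(radii), np.log(C), 1)
    return slope

# sanity check: states sampled along a line should have dimension near 1
x = np.linspace(0.0, 1.0, 400)
d2 = correlation_dimension(x, dim=2, tau=1)
```

    A low, non-integer slope on real pollen counts would be the signature of chaos the abstract describes; edge effects make the toy estimate here slightly below 1.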

  9. A hybrid wavelet analysis-cloud model data-extending approach for meteorologic and hydrologic time series

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Ding, Hao; Singh, Vijay P.; Shang, Xiaosan; Liu, Dengfeng; Wang, Yuankun; Zeng, Xiankui; Wu, Jichun; Wang, Lachun; Zou, Xinqing

    2015-05-01

    For scientific and sustainable management of water resources, hydrologic and meteorologic data series often need to be extended. This paper proposes a hybrid approach, named WA-CM (wavelet analysis-cloud model), for data series extension. Wavelet analysis has time-frequency localization features, acting as a "mathematical microscope" that can decompose and reconstruct hydrologic and meteorologic series by wavelet transform. The cloud model is a mathematical representation of fuzziness and randomness and has strong robustness for uncertain data. The WA-CM approach first employs the wavelet transform to decompose the measured non-stationary series and then uses the cloud model to develop an extension model for each decomposition-layer series. The final extension is obtained by summing the extensions of all layers. Two kinds of meteorologic and hydrologic data sets with different characteristics and different influence of human activity from six (three pairs of) representative stations are used to illustrate the WA-CM approach. The approach is also compared with four other methods: the conventional correlation extension method, the Kendall-Theil robust line method, artificial neural network methods (back propagation, multilayer perceptron, and radial basis function), and the single cloud model method. To evaluate the model performance thoroughly, five measures are used: relative error, mean relative error, standard deviation of relative error, root mean square error, and the Theil inequality coefficient. Results show that the WA-CM approach is effective, feasible, and accurate, and that it outperforms the other four methods compared. The theory employed and the approach developed here can be applied to extension of data in other areas as well.
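
    The decompose-extend-recombine pipeline can be sketched in heavily simplified form. This is not WA-CM itself: a one-level Haar split stands in for the paper's wavelet transform, and a linear-trend extrapolation stands in for the cloud-model extension of each layer.

```python
import numpy as np

def haar_decompose(x):
    """One-level Haar split of an even-length series into an
    approximation layer and a detail layer."""
    x = np.asarray(x, dtype=float)
    return (x[0::2] + x[1::2]) / 2.0, (x[0::2] - x[1::2]) / 2.0

def haar_reconstruct(approx, detail):
    """Invert the one-level Haar split by interleaving sums/differences."""
    out = np.empty(2 * len(approx))
    out[0::2] = approx + detail
    out[1::2] = approx - detail
    return out

def extend_layer(layer, steps):
    """Stand-in for the cloud-model extension: extrapolate each layer
    from its least-squares linear trend."""
    t = np.arange(len(layer))
    slope, intercept = np.polyfit(t, layer, 1)
    t_new = np.arange(len(layer), len(layer) + steps)
    return slope * t_new + intercept

x = np.arange(40, dtype=float)          # a simple trending "record"
a, d = haar_decompose(x)
ext = haar_reconstruct(extend_layer(a, 4), extend_layer(d, 4))
```

    For this purely trending series the recombined layer extensions continue the record exactly; the point of the hybrid design is that each layer can be extended by a model suited to its own scale.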

  10. Dynamic analysis of traffic time series at different temporal scales: A complex networks approach

    NASA Astrophysics Data System (ADS)

    Tang, Jinjun; Wang, Yinhai; Wang, Hua; Zhang, Shen; Liu, Fang

    2014-07-01

    The analysis of dynamics in traffic flow is an important step toward advanced traffic management and control in Intelligent Transportation Systems (ITS). Complexity and periodicity are two fundamental properties of traffic dynamics. In this study, we first measure the complexity of traffic flow data by the Lempel-Ziv algorithm at different temporal scales, using data collected from loop detectors on freeways. Second, to obtain more insight into the complexity and periodicity of traffic time series, we construct complex networks from the series by considering each day as a cycle and each cycle as a single node. The optimal threshold value for the complex networks is estimated from the distribution of density and its derivative. The complex networks are then analyzed in terms of statistical properties such as average path length, clustering coefficient, density, average degree, and betweenness. Finally, taking 2 min aggregation data as an example, we use the correlation coefficient matrix, adjacency matrix, and closeness to explore the periodicity of weekdays and weekends in traffic flow data. The findings in this paper indicate that complex networks are a practical tool for exploring dynamics in traffic time series.
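
    The complexity measure named in the abstract is easy to sketch. The version below is an LZ78-style phrase count, one common variant of Lempel-Ziv complexity (the paper's exact variant may differ), applied after the usual median binarization; the synthetic series stand in for real loop-detector data.

```python
import random
from statistics import median

def lz_complexity(s):
    """LZ78-style phrase count: scan the string and count each time the
    growing phrase becomes one not seen before."""
    seen, phrase, c = set(), "", 0
    for ch in s:
        phrase += ch
        if phrase not in seen:
            seen.add(phrase)
            c += 1
            phrase = ""
    return c + (1 if phrase else 0)

def binarize(values):
    """Binarize a series about its median, a usual preprocessing step."""
    m = median(values)
    return "".join("1" if v > m else "0" for v in values)

random.seed(0)
regular = binarize([v % 24 for v in range(240)])          # strict daily cycle
noisy = binarize([random.random() for _ in range(240)])   # structureless flow
```

    Periodic traffic patterns yield few phrases while irregular flow yields many, which is what makes the count a usable complexity index across aggregation scales.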

  11. A Time Series Approach to Random Number Generation: Using Recurrence Quantification Analysis to Capture Executive Behavior

    PubMed Central

    Oomens, Wouter; Maes, Joseph H. R.; Hasselman, Fred; Egger, Jos I. M.

    2015-01-01

    The concept of executive functions plays a prominent role in contemporary experimental and clinical studies on cognition. One paradigm used in this framework is the random number generation (RNG) task, the execution of which demands aspects of executive functioning, specifically inhibition and working memory. Data from the RNG task are best seen as a series of successive events. However, traditional RNG measures that are used to quantify executive functioning are mostly summary statistics referring to deviations from mathematical randomness. In the current study, we explore the utility of recurrence quantification analysis (RQA), a non-linear method that keeps the entire sequence intact, as a better way to describe executive functioning compared to traditional measures. To this aim, 242 first- and second-year students completed a non-paced RNG task. Principal component analysis of their data showed that traditional and RQA measures convey more or less the same information. However, RQA measures do so more parsimoniously and have a better interpretation. PMID:26097449
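
    Two of the standard RQA measures behind the study can be sketched on a toy signal: the recurrence rate (density of recurrent points) and determinism (the fraction of recurrent points falling on diagonal lines). This is a minimal illustration, not the study's pipeline for RNG responses.

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Thresholded pairwise-distance (recurrence) matrix of a scalar series."""
    d = np.abs(np.asarray(x)[:, None] - np.asarray(x)[None, :])
    return (d < eps).astype(int)

def rqa_measures(R, lmin=2):
    """Recurrence rate (RR) and determinism (DET): the fraction of
    recurrent points lying on diagonal lines of length >= lmin."""
    n = R.shape[0]
    rr = R[~np.eye(n, dtype=bool)].mean()      # exclude the main diagonal
    det_points = total = 0
    for k in range(1, n):                      # each upper off-diagonal
        diag = np.diagonal(R, offset=k)
        total += diag.sum()
        run = 0
        for v in list(diag) + [0]:             # sentinel closes the last run
            run = run + 1 if v else (det_points := det_points + run if run >= lmin else det_points, 0)[1]
    det = det_points / total if total else 0.0
    return rr, det

x = np.sin(np.linspace(0, 8 * np.pi, 200))     # a deterministic sequence
rr, det = rqa_measures(recurrence_matrix(x, eps=0.3))
```

    A deterministic sequence produces long diagonal lines (DET near 1); a well-randomized response sequence produces mostly isolated recurrences (DET low), which is the intuition behind using RQA on RNG data.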

  12. The physiology analysis system: an integrated approach for warehousing, management and analysis of time-series physiology data.

    PubMed

    McKenna, Thomas M; Bawa, Gagandeep; Kumar, Kamal; Reifman, Jaques

    2007-04-01

    The physiology analysis system (PAS) was developed as a resource to support the efficient warehousing, management, and analysis of physiology data, particularly, continuous time-series data that may be extensive, of variable quality, and distributed across many files. The PAS incorporates time-series data collected by many types of data-acquisition devices, and it is designed to free users from data management burdens. This Web-based system allows both discrete (attribute) and time-series (ordered) data to be manipulated, visualized, and analyzed via a client's Web browser. All processes occur on a server, so that the client does not have to download data or any application programs, and the PAS is independent of the client's computer operating system. The PAS contains a library of functions, written in different computer languages that the client can add to and use to perform specific data operations. Functions from the library are sequentially inserted into a function chain-based logical structure to construct sophisticated data operators from simple function building blocks, affording ad hoc query and analysis of time-series data. These features support advanced mining of physiology data.
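
    The function-chain idea, simple building blocks composed sequentially into one data operator, can be sketched generically. The blocks below are hypothetical examples written for illustration, not functions from the PAS library.

```python
from functools import reduce

def chain(*functions):
    """Compose building-block operators into one data operator,
    applied left to right, mirroring the PAS function-chain structure."""
    return lambda data: reduce(lambda d, f: f(d), functions, data)

# hypothetical blocks: each takes and returns a list of (time, value) samples
def drop_missing(s):
    return [(t, v) for t, v in s if v is not None]

def detrend(s):
    mean = sum(v for _, v in s) / len(s)
    return [(t, v - mean) for t, v in s]

def clip(s):
    return [(t, max(-1.0, min(1.0, v))) for t, v in s]

pipeline = chain(drop_missing, detrend, clip)
series = [(0, 2.0), (1, None), (2, 4.0), (3, 0.0)]
result = pipeline(series)
```

    Because each block shares one interface, ad hoc queries reduce to assembling a new chain, which is the design choice the abstract highlights.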

  13. Novel approaches in Extended Principal Components Analysis to compare spatio-temporal patterns among multiple image time series

    NASA Astrophysics Data System (ADS)

    Neeti, N.; Eastman, R.

    2012-12-01

    Extended Principal Components Analysis (EPCA) aims to examine the patterns of variability shared among multiple image time series. Conventionally, this is done by virtually extending the spatial dimension of the time series by spatially concatenating the different time series and then performing S-mode PCA. In S-mode analysis, samples in space are the statistical variables and samples in time are the statistical observations. This paper introduces the concept of temporal concatenation of multiple image time series to perform EPCA. EPCA can also be done with T-mode orientation, in which samples in time are the statistical variables and samples in space are the statistical observations. This leads to a total of four orientations in which EPCA can be carried out. This research explores these four orientations and their implications in investigating spatio-temporal relationships among multiple time series. This research demonstrates that EPCA carried out with temporal concatenation of the multiple time series with T-mode (tT) is able to identify similar spatial patterns among multiple time series. The conventional S-mode EPCA with spatial concatenation (sS) identifies similar temporal patterns among multiple time series. The other two modes, namely T-mode with spatial concatenation (sT) and S-mode with temporal concatenation (tS), are able to identify patterns which share consistent temporal phase relationships and consistent spatial phase relationships with each other, respectively. In a case study using three sets of precipitation time series data from GPCP, CMAP and NCEP-DOE, the results show that examination of all four modes provides an effective basis for comparison of the series.
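
    The orientations reduce to two choices: which axis the series are concatenated along, and which axis is treated as the statistical variables. A toy sketch of the conventional sS orientation (synthetic data, not the GPCP/CMAP/NCEP-DOE series; the PCA-via-SVD helper is an assumption of this sketch):

```python
import numpy as np

def pca_modes(X, n_modes=2):
    """PCA via SVD of the centered data matrix (rows = observations,
    columns = variables); returns loadings and score time series."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:n_modes], (U * s)[:, :n_modes]

rng = np.random.default_rng(2)
t = np.arange(100)
common = np.sin(2 * np.pi * t / 12.0)   # temporal pattern shared by both series
series_a = np.outer(common, rng.random(30)) + 0.1 * rng.standard_normal((100, 30))
series_b = np.outer(common, rng.random(30)) + 0.1 * rng.standard_normal((100, 30))

# sS: spatial concatenation, S-mode (space = variables, time = observations)
sS = np.hstack([series_a, series_b])            # shape (100, 60)
loadings, scores = pca_modes(sS, n_modes=1)
r = np.corrcoef(scores[:, 0], common)[0, 1]     # leading score tracks 'common'

# tT: temporal concatenation, T-mode (time = variables, space = observations)
tT = np.vstack([series_a, series_b]).T          # shape (30, 200)
```

    Running `pca_modes` on `tT` instead would extract shared spatial patterns, which is the tT behavior the abstract reports.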

  14. Cabinetmaker. Occupational Analysis Series.

    ERIC Educational Resources Information Center

    Chinien, Chris; Boutin, France

    This document contains the analysis of the occupation of cabinetmaker, or joiner, that is accepted by the Canadian Council of Directors as the national standard for the occupation. The front matter preceding the analysis includes exploration of the development of the analysis, structure of the analysis, validation method, scope of the cabinetmaker…

  15. Permutations and time series analysis.

    PubMed

    Cánovas, Jose S; Guillamón, Antonio

    2009-12-01

    The main aim of this paper is to show how permutations can be useful in time series analysis. In particular, we introduce a test for checking the independence of a time series which is based on the number of admissible permutations on it. The main advantage of our test is that we are able to give a theoretical distribution for independent time series.
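
    Counting admissible permutations (ordinal patterns) is straightforward to sketch. This is only an illustration of the counting step, not the paper's test statistic or its theoretical distribution; the logistic-map example is a standard way to show that deterministic series forbid some patterns.

```python
from itertools import permutations
import random

def ordinal_patterns(x, m=3):
    """Set of admissible order-m permutation patterns: the rank ordering
    of each window of m consecutive values."""
    pats = set()
    for i in range(len(x) - m + 1):
        window = x[i:i + m]
        pats.add(tuple(sorted(range(m), key=lambda k: window[k])))
    return pats

random.seed(3)
iid = [random.random() for _ in range(500)]     # independent series
logistic = [0.4]                                # chaotic but deterministic
for _ in range(499):
    logistic.append(4.0 * logistic[-1] * (1.0 - logistic[-1]))

n_iid = len(ordinal_patterns(iid, m=3))         # all 3! = 6 patterns appear
n_det = len(ordinal_patterns(logistic, m=3))    # chaos forbids some patterns
```

    An independent series of this length realizes all six order-3 patterns, while the fully chaotic logistic map can never decrease twice in a row, so at least one pattern is forbidden; a deficit of admissible patterns is evidence against independence.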

  16. FROG: Time-series analysis

    NASA Astrophysics Data System (ADS)

    Allan, Alasdair

    2014-06-01

    FROG performs time series analysis and display. It provides a simple user interface for astronomers wanting to do time-domain astrophysics but still offers the powerful features found in packages such as PERIOD (ascl:1406.005). FROG includes a number of tools for manipulation of time series. Among other things, the user can combine individual time series, detrend series (multiple methods) and perform basic arithmetic functions. The data can also be exported directly into the TOPCAT (ascl:1101.010) application for further manipulation if needed.

  17. Hands-On Approach to Structure Activity Relationships: The Synthesis, Testing, and Hansch Analysis of a Series of Acetylcholineesterase Inhibitors

    ERIC Educational Resources Information Center

    Locock, Katherine; Tran, Hue; Codd, Rachel; Allan, Robin

    2015-01-01

    This series of three practical sessions centers on drugs that inhibit the enzyme acetylcholineesterase. This enzyme is responsible for the inactivation of acetylcholine and has been the target of drugs to treat glaucoma and Alzheimer's disease and for a number of insecticides and warfare agents. These sessions relate to a series of carbamate…

  18. Visibility graphlet approach to chaotic time series.

    PubMed

    Mutua, Stephen; Gu, Changgui; Yang, Huijie

    2016-05-01

    Many novel methods have been proposed for mapping time series into complex networks. Although some dynamical behaviors can be effectively captured by existing approaches, the preservation and tracking of the temporal behaviors of a chaotic system remains an open problem. In this work, we extended the visibility graphlet approach to investigate both discrete and continuous chaotic time series. We applied visibility graphlets to capture the reconstructed local states, so that each is treated as a node and tracked downstream to create a temporal chain link. Our empirical findings show that the approach accurately captures the dynamical properties of chaotic systems. Networks constructed from periodic dynamic phases all converge to regular networks and to unique network structures for each model in the chaotic zones. Furthermore, our results show that the characterization of chaotic and non-chaotic zones in the Lorenz system corresponds to the maximal Lyapunov exponent, thus providing a simple and straightforward way to analyze chaotic systems.
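
    The basic natural-visibility construction that graphlet approaches build on can be sketched in a few lines. This is a toy O(n^2) version of the visibility criterion only, not the authors' graphlet chaining of reconstructed local states.

```python
def visibility_graph(y):
    """Natural visibility graph: one node per sample; two samples are
    linked if the straight line between them clears every sample between."""
    n = len(y)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            if all(y[c] < y[a] + (y[b] - y[a]) * (c - a) / (b - a)
                   for c in range(a + 1, b)):
                edges.add((a, b))
    return edges

# consecutive samples always see each other; peaks see far, valleys do not
series = [3.0, 1.0, 2.0, 0.5, 4.0]
E = visibility_graph(series)
```

    Here the two end peaks see each other over the whole series, while the valley at index 3 blocks the line from index 0, so (0, 3) is absent; degree distributions of such graphs are what distinguish periodic from chaotic dynamics.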

  19. Task Analysis Inventories. Series II.

    ERIC Educational Resources Information Center

    Wesson, Carl E.

    This second in a series of task analysis inventories contains checklists of work performed in twenty-two occupations. Each inventory is a comprehensive list of work activities, responsibilities, educational courses, machines, tools, equipment, and work aids used and the products produced or services rendered in a designated occupational area. The…

  20. Visibility Graph Based Time Series Analysis.

    PubMed

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.

  1. Visibility Graph Based Time Series Analysis

    PubMed Central

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks. PMID:26571115

  2. Multifractal Analysis of Sunspot Number Time Series

    NASA Astrophysics Data System (ADS)

    Kasde, Satish Kumar; Gwal, Ashok Kumar; Sondhiya, Deepak Kumar

    2016-07-01

    Multifractal analysis based approaches have recently been developed as an alternative framework to study the complex dynamical fluctuations in sunspot number data, including solar cycles 20 to 23 and the ascending phase of the current solar cycle 24. To reveal the multifractal nature, the time series of monthly sunspot numbers are analyzed by singularity spectrum and multiresolution wavelet analysis. Generally, the multifractality in sunspot numbers generates turbulence with the typical characteristics of the anomalous processes governing the magnetosphere and the interior of the Sun. Our analysis shows that the singularity spectrum of the sunspot data has a well-defined Gaussian shape, which establishes that the monthly sunspot number has a multifractal character. The multifractal analysis is able to provide a local and adaptive description of the cyclic components of the sunspot number time series, which are non-stationary and the result of nonlinear processes. Keywords: sunspot numbers, magnetic field, multifractal analysis, wavelet transform techniques.

  3. Introduction to Time Series Analysis

    NASA Technical Reports Server (NTRS)

    Hardin, J. C.

    1986-01-01

    The field of time series analysis is explored from its logical foundations to the most modern data analysis techniques. The presentation is developed, as far as possible, for continuous data, so that the inevitable use of discrete mathematics is postponed until the reader has gained some familiarity with the concepts. The monograph seeks to provide the reader with both the theoretical overview and the practical details necessary to correctly apply the full range of these powerful techniques. In addition, the last chapter introduces many specialized areas where research is currently in progress.

  4. Approach to analysis of multiscale space-distributed time series: separation of spatio-temporal modes with essentially different time scales

    NASA Astrophysics Data System (ADS)

    Feigin, Alexander; Mukhin, Dmitry; Gavrilov, Andrey; Volodin, Evgeny; Loskutov, Evgeny

    2014-05-01

    Natural systems are in general space-distributed, and their evolution spans a broad spectrum of temporal scales. This multiscale nature may result from a multiplicity of mechanisms governing the system behaviour and from a large number of feedbacks and nonlinearities. A way to reveal and understand the underlying mechanisms, as well as to model the corresponding sub-systems, is decomposition of the full (complex) system into well separated spatio-temporal patterns ("modes") that evolve on essentially different time scales. In this report a new method for such decomposition is discussed. The method is based on a generalization of MSSA (Multichannel Singular Spectrum Analysis) [1] for expanding space-distributed time series in a basis of spatio-temporal empirical orthogonal functions (STEOFs), which makes allowance for delayed correlations of the processes recorded at spatially separated points. The method is applied to decomposition of the Earth's climate system: on the basis of a 156-year time series of SST anomalies distributed over the globe [2], two climatic modes possessing noticeably different time scales (3-5 and 9-11 years) are separated. For more accurate exclusion of "too slow" (and thus not correctly represented) processes from the real data, a numerically produced STEOF basis is used, utilizing the time series generated by the INM RAS Coupled Climate Model [3]. Relations of the separated modes to ENSO and PDO are investigated. Possible development of the suggested approach toward separation of modes that are nonlinearly uncorrelated is discussed. 1. Ghil, M., R. M. Allen, M. D. Dettinger, K. Ide, D. Kondrashov, et al. (2002) "Advanced spectral methods for climatic time series", Rev. Geophys. 40(1), 3.1-3.41. 2. http://iridl.ldeo.columbia.edu/SOURCES/.KAPLAN/.EXTENDED/.v2/.ssta/ 3. http://83.149.207.89/GCM_DATA_PLOTTING/GCM_INM_DATA_XY_en.htm
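
    A minimal MSSA-style sketch of the core computation (synthetic two-channel data, not the SST record): build the multichannel delay-embedded (trajectory) matrix, take its SVD, and look for pairs of nearly equal singular values signalling oscillatory modes. The window length and test signals are assumptions of this sketch.

```python
import numpy as np

def mssa_modes(X, window):
    """Multichannel SSA sketch: delay-embed every channel, stack the
    embeddings into one trajectory matrix, and SVD it to obtain the
    singular spectrum and spatio-temporal EOFs."""
    n_t, n_ch = X.shape
    rows = n_t - window + 1
    traj = np.hstack([np.column_stack([X[i:i + rows, ch]
                                       for i in range(window)])
                      for ch in range(n_ch)])
    traj = traj - traj.mean(axis=0)
    U, s, Vt = np.linalg.svd(traj, full_matrices=False)
    return s, Vt

rng = np.random.default_rng(4)
t = np.arange(300)
slow = np.sin(2 * np.pi * t / 100.0)    # slow mode (9-11 yr analogue)
fast = np.sin(2 * np.pi * t / 30.0)     # fast mode (3-5 yr analogue)
X = np.column_stack([slow + fast, slow - fast]) \
    + 0.05 * rng.standard_normal((300, 2))
s, Vt = mssa_modes(X, window=60)        # four leading values: two sine pairs
```

    The two oscillations occupy the four leading singular values, well separated from the noise floor, which is how well separated modes with different time scales show up in this framework.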

  5. Nonlinear time-series analysis revisited.

    PubMed

    Bradley, Elizabeth; Kantz, Holger

    2015-09-01

    In 1980 and 1981, two pioneering papers laid the foundation for what became known as nonlinear time-series analysis: the analysis of observed data, typically univariate, via dynamical systems theory. Based on the concept of state-space reconstruction, this set of methods allows us to compute characteristic quantities such as Lyapunov exponents and fractal dimensions, to predict the future course of the time series, and even to reconstruct the equations of motion in some cases. In practice, however, there are a number of issues that restrict the power of this approach: whether the signal accurately and thoroughly samples the dynamics, for instance, and whether it contains noise. Moreover, the numerical algorithms that we use to instantiate these ideas are not perfect; they involve approximations, scale parameters, and finite-precision arithmetic, among other things. Even so, nonlinear time-series analysis has been used to great advantage on thousands of real and synthetic data sets from a wide variety of systems ranging from roulette wheels to lasers to the human heart. Even in cases where the data do not meet the mathematical or algorithmic requirements to assure full topological conjugacy, the results of nonlinear time-series analysis can be helpful in understanding, characterizing, and predicting dynamical systems. PMID:26428563
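
    The starting point the review names, state-space reconstruction, is a one-function idea: delay embedding. The sketch below uses an assumed dimension and delay; choosing them well is exactly the practical difficulty the review discusses.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Takens-style state-space reconstruction: each row is a state
    (x_t, x_{t+tau}, ..., x_{t+(dim-1)*tau})."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

# a sine observed through a single coordinate unfolds into a closed loop
# when the delay is close to a quarter period
t = np.linspace(0, 4 * np.pi, 400)
states = delay_embed(np.sin(t), dim=2, tau=50)
radii = np.linalg.norm(states, axis=1)   # near-constant: a circle in 2D
```

    Lyapunov exponents, fractal dimensions, and prediction all operate on the reconstructed `states`, which is why embedding quality limits everything downstream.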

  6. Nonlinear time-series analysis revisited

    NASA Astrophysics Data System (ADS)

    Bradley, Elizabeth; Kantz, Holger

    2015-09-01

    In 1980 and 1981, two pioneering papers laid the foundation for what became known as nonlinear time-series analysis: the analysis of observed data—typically univariate—via dynamical systems theory. Based on the concept of state-space reconstruction, this set of methods allows us to compute characteristic quantities such as Lyapunov exponents and fractal dimensions, to predict the future course of the time series, and even to reconstruct the equations of motion in some cases. In practice, however, there are a number of issues that restrict the power of this approach: whether the signal accurately and thoroughly samples the dynamics, for instance, and whether it contains noise. Moreover, the numerical algorithms that we use to instantiate these ideas are not perfect; they involve approximations, scale parameters, and finite-precision arithmetic, among other things. Even so, nonlinear time-series analysis has been used to great advantage on thousands of real and synthetic data sets from a wide variety of systems ranging from roulette wheels to lasers to the human heart. Even in cases where the data do not meet the mathematical or algorithmic requirements to assure full topological conjugacy, the results of nonlinear time-series analysis can be helpful in understanding, characterizing, and predicting dynamical systems.
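
    The state-space reconstruction underlying these methods can be illustrated with a minimal delay-coordinate embedding sketch. The embedding dimension m and delay tau below are hypothetical illustrative choices, not values from the paper:

```python
def delay_embed(x, m, tau):
    """Delay-coordinate embedding: map a scalar series to m-dimensional
    state vectors (x[i], x[i+tau], ..., x[i+(m-1)*tau])."""
    n = len(x) - (m - 1) * tau
    return [tuple(x[i + j * tau] for j in range(m)) for i in range(n)]

signal = [0, 1, 2, 3, 4, 5, 6, 7]
vectors = delay_embed(signal, m=3, tau=2)
print(vectors[0])  # (0, 2, 4)
```

    In practice m and tau are chosen from the data (e.g. via false nearest neighbours and mutual information); the embedded vectors are then the inputs to Lyapunov-exponent and dimension estimators.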

  7. Relation between temperature and suicide mortality in Japan in the presence of other confounding factors using time-series analysis with a semiparametric approach

    PubMed Central

    Honda, Yasushi; Ono, Masaji

    2010-01-01

    Objectives The objective of this study was to assess the relation between temperature and suicide mortality in Japan using time series analysis with a semiparametric approach. Methods We analyzed the relation between daily fluctuations in suicide mortality and maximum temperatures for all regions in Japan over the period from 1972 to 1995 using a generalized additive model. The model controls for the time trend, season, selected meteorological parameters, day of the week, and holidays. Adjustment was performed using penalized splines, and the amount of smoothness was chosen by minimizing the unbiased risk estimation criterion. Results The results show that suicide mortality in Japan has a seasonal character that varies from year to year, with the highest occurrence in April, as well as in the first part of the week, especially on Mondays and Tuesdays. There were only a few suicide cases on Saturdays and holidays. We found that, for all regions in Japan, suicide mortality increased on the same day that temperature increased (lag = 0). Analysis by method of suicide showed that, when temperature increased, mortality significantly increased only for suicide by a violent method; the pattern of the relation for other methods remained unclear. Conclusions This study suggests that an increase in temperature has a short-term effect on suicide mortality in Japan. PMID:21432215

  8. Ordinal analysis of time series

    NASA Astrophysics Data System (ADS)

    Keller, K.; Sinn, M.

    2005-10-01

    In order to develop fast and robust methods for extracting qualitative information from nonlinear time series, Bandt and Pompe have proposed to consider time series from a purely ordinal viewpoint. On the basis of counting ordinal patterns, which describe the up-and-down structure in a time series, they introduced the concept of permutation entropy for quantifying the complexity of the system behind a time series. The permutation entropy, however, captures only one detail of the ordinal structure of a time series. Here we present a method for extracting the whole ordinal information.
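
    The ordinal-pattern counting step can be sketched as follows; this is an illustrative implementation of Bandt-Pompe permutation entropy, not the authors' code:

```python
import math
from collections import Counter

def permutation_entropy(x, order):
    """Shannon entropy (bits) of the distribution of ordinal patterns of
    length `order`: each window is replaced by the permutation that sorts it."""
    patterns = Counter()
    for i in range(len(x) - order + 1):
        window = x[i:i + order]
        patterns[tuple(sorted(range(order), key=lambda k: window[k]))] += 1
    total = sum(patterns.values())
    return -sum((c / total) * math.log2(c / total) for c in patterns.values())

# A monotone series realises a single ordinal pattern, so its entropy is zero;
# richer up-and-down structure raises the entropy towards log2(order!).
print(permutation_entropy([1, 2, 3, 4, 5], order=3))
```

    Keller and Sinn's point is that this single number discards most of the pattern distribution; their method retains the full ordinal information rather than only its entropy.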

  9. Highly comparative time-series analysis: the empirical structure of time series and their methods.

    PubMed

    Fulcher, Ben D; Little, Max A; Jones, Nick S

    2013-06-01

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
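
    The idea of a reduced representation can be sketched by mapping each series to a small feature vector; the three features below (mean, standard deviation, lag-1 autocorrelation) are simple illustrative stand-ins for the thousands of operations analysed in the paper:

```python
import math

def features(x):
    """Reduce a time series to a small illustrative feature vector
    (mean, standard deviation, lag-1 autocorrelation)."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    ac1 = (sum((x[i] - mean) * (x[i + 1] - mean) for i in range(n - 1)) / (n * var)
           if var else 0.0)
    return (mean, math.sqrt(var), ac1)

print(features([1, 2, 3, 4, 5]))
```

    With every series reduced to the same fixed-length vector, standard clustering or classification can then organize heterogeneous datasets, which is the core of the highly comparative approach.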

  10. Radar Interferometry Time Series Analysis and Tools

    NASA Astrophysics Data System (ADS)

    Buckley, S. M.

    2006-12-01

    We consider the use of several multi-interferogram analysis techniques for identifying transient ground motions. Our approaches range from specialized InSAR processing for persistent scatterer and small baseline subset methods to the post-processing of geocoded displacement maps using a linear inversion-singular value decomposition solution procedure. To better understand these approaches, we have simulated sets of interferograms spanning several deformation phenomena, including localized subsidence bowls with constant velocity and seasonal deformation fluctuations. We will present results and insights from the application of these time series analysis techniques to several land subsidence study sites with varying deformation and environmental conditions, e.g., arid Phoenix and coastal Houston-Galveston metropolitan areas and rural Texas sink holes. We consistently find that the time invested in implementing, applying and comparing multiple InSAR time series approaches for a given study site is rewarded with a deeper understanding of the techniques and deformation phenomena. To this end, and with support from NSF, we are preparing a first-version of an InSAR post-processing toolkit to be released to the InSAR science community. These studies form a baseline of results to compare against the higher spatial and temporal sampling anticipated from TerraSAR-X as well as the trade-off between spatial coverage and resolution when relying on ScanSAR interferometry.

  11. Analysis of Polyphonic Musical Time Series

    NASA Astrophysics Data System (ADS)

    Sommer, Katrin; Weihs, Claus

    A general model for pitch tracking of polyphonic musical time series will be introduced. Based on the model of Davy and Godsill (Bayesian harmonic models for musical pitch estimation and analysis, Technical Report 431, Cambridge University Engineering Department, 2002), the different pitches of the musical sound are estimated simultaneously with MCMC methods. Additionally, a preprocessing step is designed to improve the estimation of the fundamental frequencies (A comparative study on polyphonic musical time series using MCMC methods. In C. Preisach et al., editors, Data Analysis, Machine Learning, and Applications, Springer, Berlin, 2008). The preprocessing step compares real audio data with an alphabet constructed from the McGill Master Samples (Opolko and Wapnick, McGill University Master Samples [Compact disc], McGill University, Montreal, 1987), which consists of tones of different instruments. The tones with minimal Itakura-Saito distortion (Gray et al., Transactions on Acoustics, Speech, and Signal Processing ASSP-28(4):367-376, 1980) are chosen as first estimates and as starting points for the MCMC algorithms. Furthermore, the implementation of the alphabet is an approach to recognizing the instruments generating the musical time series. Results are presented for mixed monophonic data from McGill and for self-recorded polyphonic audio data.

  12. Complex network approach to fractional time series

    SciTech Connect

    Manshour, Pouya

    2015-10-15

    In order to extract correlation information inherited in stochastic time series, the visibility graph algorithm has recently been proposed, by which a time series can be mapped onto a complex network. We demonstrate that the visibility algorithm is not appropriate for studying the correlation aspects of a time series. We then employ the horizontal visibility algorithm, a much simpler one, to map fractional processes onto complex networks. The degree distributions are shown to have parabolic exponential forms with a Hurst-dependent fitting parameter. Further, we take into account other topological properties such as the maximum eigenvalue of the adjacency matrix and the degree assortativity, and show that such topological quantities can also be used to predict the Hurst exponent, with an exception for anti-persistent fractional Gaussian noises. To solve this problem, we take into account the Spearman correlation coefficient between nodes' degrees and their corresponding data values in the original time series.
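
    The horizontal visibility mapping itself is simple to state: two points are linked if every value between them lies strictly below both. A minimal illustrative sketch (real analyses of fractional processes use far longer series):

```python
def horizontal_visibility_degrees(x):
    """Degree sequence of the horizontal visibility graph of series x:
    nodes i < j are linked iff every intermediate value lies strictly
    below both x[i] and x[j]."""
    n = len(x)
    degree = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            if all(x[k] < min(x[i], x[j]) for k in range(i + 1, j)):
                degree[i] += 1
                degree[j] += 1
    return degree

# In a monotone series each point only "sees" its immediate neighbours.
print(horizontal_visibility_degrees([1, 2, 3, 4]))  # [1, 2, 2, 1]
```

    The degree distribution of this graph is the quantity whose exponential form the paper relates to the Hurst exponent.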

  13. Forecasting spore concentrations: A time series approach

    NASA Astrophysics Data System (ADS)

    Stephen, Elaine; Raftery, Adrian E.; Dowding, Paul

    1990-06-01

    Fungal basidiospores and Cladosporium spores are the two most numerous spore types in the air of Dublin and its surroundings. They are known to have allergenic components, and the aim of the study described here is to develop a predictive model for these spores. A very simple model, which combines an estimated diurnal rhythm with a simple, one-parameter time series model, provided good short-term forecasts. The one-step prediction error variance was reduced by 88% for Cladosporium spores and by 98% for basidiospores.
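
    The "rhythm plus one-parameter model" structure can be sketched with an AR(1) forecaster applied to rhythm-removed residuals. The residual values below and the AR(1) form are assumptions for illustration, not the authors' fitted model:

```python
def ar1_forecast(x):
    """One-step forecast from a one-parameter AR(1) model fitted by least
    squares. x is assumed to be the series after removing the estimated
    diurnal rhythm; the rhythm would be added back for the final forecast."""
    mean = sum(x) / len(x)
    d = [v - mean for v in x]
    num = sum(d[i] * d[i + 1] for i in range(len(d) - 1))
    den = sum(v * v for v in d[:-1])
    phi = num / den  # the single AR(1) parameter
    return mean + phi * d[-1]

residuals = [0.4, -0.1, 0.3, -0.2, 0.25, -0.15]  # hypothetical rhythm-removed counts
print(ar1_forecast(residuals))
```

    The prediction error variance reductions quoted in the abstract would be measured by comparing such one-step forecasts against the held-out observations.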

  14. Image distortion analysis using polynomial series expansion.

    PubMed

    Baggenstoss, Paul M

    2004-11-01

    In this paper, we derive a technique for analysis of local distortions which affect data in real-world applications. In the paper, we focus on image data, specifically handwritten characters. Given a reference image and a distorted copy of it, the method is able to efficiently determine the rotations, translations, scaling, and any other distortions that have been applied. Because the method is robust, it is also able to estimate distortions for two unrelated images, thus determining the distortions that would be required to cause the two images to resemble each other. The approach is based on a polynomial series expansion using matrix powers of linear transformation matrices. The technique has applications in pattern recognition in the presence of distortions. PMID:15521492

  15. Analysis of series resonant converter with series-parallel connection

    NASA Astrophysics Data System (ADS)

    Lin, Bor-Ren; Huang, Chien-Lan

    2011-02-01

    In this study, a parallel inductor-inductor-capacitor (LLC) resonant converter, series-connected on the primary side and parallel-connected on the secondary side, is presented for server power supply systems. Based on series resonant behaviour, the power metal-oxide-semiconductor field-effect transistors are turned on at zero-voltage switching and the rectifier diodes are turned off at zero-current switching. Thus, the switching losses on the power semiconductors are reduced. In the proposed converter, the primary windings of the two LLC converters are connected in series, so the two converters carry the same primary current and supply balanced load currents. On the output side, the two LLC converters are connected in parallel to share the load current and to reduce the current stress on the secondary windings and the rectifier diodes. In this article, the principle of operation, steady-state analysis and design considerations of the proposed converter are provided and discussed. Experiments with a laboratory prototype with a 24 V/21 A output for server power supply were performed to verify the effectiveness of the proposed converter.
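
    The design arithmetic behind such converters starts from the series resonant frequency of the tank. The component values below are hypothetical, not those of the 24 V/21 A prototype:

```python
import math

def series_resonant_frequency(L, C):
    """Resonant frequency f_r = 1 / (2*pi*sqrt(L*C)) of a series L-C tank."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

# Hypothetical tank values for illustration:
f_r = series_resonant_frequency(L=60e-6, C=47e-9)
print(round(f_r))  # ~94.8 kHz
```

    Operating the switching frequency near f_r is what makes the zero-voltage and zero-current switching described in the abstract achievable.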

  16. Nonlinear Analysis of Surface EMG Time Series

    NASA Astrophysics Data System (ADS)

    Zurcher, Ulrich; Kaufman, Miron; Sung, Paul

    2004-04-01

    Applications of nonlinear analysis of surface electromyography time series of patients with and without low back pain are presented. Limitations of the standard methods based on the power spectrum are discussed.

  17. The Resource Approach to the Analysis of Educational Project Cost. Number 3 in a Series of Monographs on Evaluation in Education.

    ERIC Educational Resources Information Center

    Office of Education (DHEW), Washington, DC.

    Comparing or estimating the costs of educational projects by merely using cost-per-student figures is imprecise and ignores area differences in prices. The resource approach to cost analysis begins by determining specific physical resources (such as facilities, staff, equipment, materials, and services) needed for a project. Then the cost of these…

  18. Entropic Analysis of Electromyography Time Series

    NASA Astrophysics Data System (ADS)

    Kaufman, Miron; Sung, Paul

    2005-03-01

    We are in the process of assessing the effectiveness of fractal and entropic measures for the diagnosis of low back pain from surface electromyography (EMG) time series. In a typical EMG measurement, the voltage is measured every millisecond. We observed back-muscle fatiguing during one minute, which results in a time series with 60,000 entries. We characterize the complexity of the time series by computing the time dependence of the Shannon entropy. The analysis of time series from relevant muscles of healthy and low back pain (LBP) individuals provides evidence that the level of variability of back muscle activity is much larger for healthy individuals than for individuals with LBP. In general the time dependence of the entropy shows a crossover from a diffusive regime to a regime characterized by long-time correlations (self-organization) at about 0.01 s.
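
    The entropy-versus-time computation can be sketched as a sliding-window histogram entropy; the window length and bin count below are illustrative assumptions, not the study's settings:

```python
import math
from collections import Counter

def window_entropy(x, bins=8):
    """Shannon entropy (bits) of the value histogram within one window."""
    lo, hi = min(x), max(x)
    width = (hi - lo) / bins or 1.0  # avoid zero width for constant windows
    counts = Counter(min(int((v - lo) / width), bins - 1) for v in x)
    n = len(x)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def entropy_time_course(signal, window=1000, step=1000):
    """Entropy of successive windows, tracking how variability evolves in time."""
    return [window_entropy(signal[i:i + window])
            for i in range(0, len(signal) - window + 1, step)]
```

    Applied to a 60,000-sample EMG record, the resulting curve is the kind of entropy time dependence whose crossover behaviour the abstract describes.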

  19. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, L.M.; Ng, E.G.

    1998-09-29

    Methods and apparatus that monitor nonlinear data to automatically detect differences between similar but different states in a nonlinear process are disclosed. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time-serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated. 8 figs.

  20. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, Lee M.; Ng, Esmond G.

    1998-01-01

    Methods and apparatus monitor nonlinear data to automatically detect differences between similar but different states in a nonlinear process. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time-serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated.

  1. Nonlinear Time Series Analysis via Neural Networks

    NASA Astrophysics Data System (ADS)

    Volná, Eva; Janošek, Michal; Kocian, Václav; Kotyrba, Martin

    This article deals with a time series analysis based on neural networks in order to make an effective forex market [Moore and Roche, J. Int. Econ. 58, 387-411 (2002)] pattern recognition. Our goal is to find and recognize important patterns which repeatedly appear in the market history to adapt our trading system behaviour based on them.

  2. Wavelet analysis of radon time series

    NASA Astrophysics Data System (ADS)

    Barbosa, Susana; Pereira, Alcides; Neves, Luis

    2013-04-01

    Radon is a radioactive noble gas with a half-life of 3.8 days ubiquitous in both natural and indoor environments. Being produced in uranium-bearing materials by decay from radium, radon can be easily and accurately measured by nuclear methods, making it an ideal proxy for time-varying geophysical processes. Radon time series exhibit a complex temporal structure and large variability on multiple scales. Wavelets are therefore particularly suitable for the analysis on a scale-by-scale basis of time series of radon concentrations. In this study continuous and discrete wavelet analysis is applied to describe the variability structure of hourly radon time series acquired both indoors and on a granite site in central Portugal. A multi-resolution decomposition is performed for extraction of sub-series associated to specific scales. The high-frequency components are modeled in terms of stationary autoregressive / moving average (ARMA) processes. The amplitude and phase of the periodic components are estimated and tidal features of the signals are assessed. Residual radon concentrations (after removal of periodic components) are further examined and the wavelet spectrum is used for estimation of the corresponding Hurst exponent. The results for the several radon time series considered in the present study are very heterogeneous in terms of both high-frequency and long-term temporal structure indicating that radon concentrations are very site-specific and heavily influenced by local factors.

  3. Nonlinear Time Series Analysis of Sunspot Data

    NASA Astrophysics Data System (ADS)

    Suyal, Vinita; Prasad, Awadhesh; Singh, Harinder P.

    2009-12-01

    This article deals with the analysis of sunspot number time series using the Hurst exponent. We use the rescaled range ( R/ S) analysis to estimate the Hurst exponent for 259-year and 11 360-year sunspot data. The results show a varying degree of persistence over shorter and longer time scales corresponding to distinct values of the Hurst exponent. We explain the presence of these multiple Hurst exponents by their resemblance to the deterministic chaotic attractors having multiple centers of rotation.
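
    A minimal rescaled-range (R/S) estimator in the spirit of this analysis can be sketched as below. The window sizes and the synthetic test data are illustrative; real sunspot analyses use much longer records and finite-sample bias corrections:

```python
import math
import random

def rescaled_range(x):
    """R/S statistic of one window: range of the cumulative mean-adjusted
    sum divided by the standard deviation."""
    n = len(x)
    mean = sum(x) / n
    cum, cums, dev = 0.0, [], 0.0
    for v in x:
        cum += v - mean
        cums.append(cum)
        dev += (v - mean) ** 2
    return (max(cums) - min(cums)) / math.sqrt(dev / n)

def hurst_exponent(x, sizes=(8, 16, 32, 64)):
    """Hurst exponent as the least-squares slope of log(R/S) vs log(n),
    averaging R/S over non-overlapping windows of each size."""
    pts = []
    for n in sizes:
        rs = [rescaled_range(x[i:i + n]) for i in range(0, len(x) - n + 1, n)]
        pts.append((math.log(n), math.log(sum(rs) / len(rs))))
    mx = sum(p for p, _ in pts) / len(pts)
    my = sum(q for _, q in pts) / len(pts)
    return (sum((p - mx) * (q - my) for p, q in pts)
            / sum((p - mx) ** 2 for p, _ in pts))

random.seed(0)
noise = [random.random() - 0.5 for _ in range(512)]  # uncorrelated noise
walk = []                                            # its running sum: persistent
s = 0.0
for v in noise:
    s += v
    walk.append(s)
# Uncorrelated noise gives a moderate H (near 0.5, biased upward at these
# short window sizes); the persistent running sum gives a larger H.
```

    Varying degrees of persistence over different scales, as reported for the sunspot data, show up as different slopes when the log-log fit is restricted to different ranges of window sizes.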

  4. Applying time series analysis to performance logs

    NASA Astrophysics Data System (ADS)

    Kubacki, Marcin; Sosnowski, Janusz

    2015-09-01

    Contemporary computer systems provide mechanisms for monitoring various performance parameters (e.g. processor or memory usage, disc or network transfers), which are collected and stored in performance logs. An important issue is to derive characteristic features describing normal and abnormal behavior of the systems. For this purpose we use various schemes of analyzing time series. They have been adapted to the specificity of performance logs and verified using data collected from real systems. The presented approach is useful in evaluating system dependability.
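
    One simple scheme of this kind is a rolling z-score over recent log samples; the window length, threshold, and CPU-usage values below are illustrative assumptions, not the paper's method:

```python
import math

def rolling_zscores(x, window=5):
    """Score each log sample against the mean and standard deviation of the
    preceding `window` samples: a simple baseline for abnormal behaviour."""
    scores = []
    for i in range(window, len(x)):
        hist = x[i - window:i]
        mean = sum(hist) / window
        std = math.sqrt(sum((v - mean) ** 2 for v in hist) / window)
        scores.append((i, (x[i] - mean) / std if std else 0.0))
    return scores

cpu = [21, 20, 22, 21, 20, 21, 70, 22]  # hypothetical CPU-usage samples (%)
anomalies = [i for i, z in rolling_zscores(cpu) if abs(z) > 3]
print(anomalies)  # [6] -- the spike
```

    More elaborate schemes (seasonal models, multivariate correlation across parameters) refine the same basic idea of comparing each sample with a characteristic baseline.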

  5. Haar Wavelet Analysis of Climatic Time Series

    NASA Astrophysics Data System (ADS)

    Zhang, Zhihua; Moore, John; Grinsted, Aslak

    2014-05-01

    In order to extract the intrinsic information of climatic time series from background red noise, we first give an analytic formula for the distribution of Haar wavelet power spectra of red noise in a rigorous statistical framework. The relation between scale a and Fourier period T for the Morlet wavelet is a = 0.97T; for the Haar wavelet, the corresponding formula is a = 0.37T. Since for any time series of time step δt and total length Nδt the range of scales in wavelet-based time series analysis runs from the smallest resolvable scale 2δt to the largest scale Nδt, Haar wavelet analysis can extract more low-frequency intrinsic information. Finally, we use our method to analyze the Arctic Oscillation (AO), a key aspect of climate variability in the Northern Hemisphere, and discover a great change in fundamental properties of the AO, commonly called a regime shift or tipping point. Our partial results have been published as follows: [1] Z. Zhang, J.C. Moore and A. Grinsted, Haar wavelet analysis of climatic time series, Int. J. Wavelets, Multiresol. & Inf. Process., in press, 2013. [2] Z. Zhang, J.C. Moore, Comment on "Significance tests for the wavelet power and the wavelet power spectrum", Ann. Geophys., 30:12, 2012.
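
    The scale-by-scale Haar power computation can be sketched with a plain Haar pyramid on a toy series (the a ≈ 0.37T relation from the abstract then links each level's scale to a Fourier period):

```python
def haar_power_spectrum(x):
    """Per-scale wavelet power from a Haar pyramid decomposition.
    len(x) must be a power of two; returns the mean squared detail
    coefficient per level, from the finest scale (2*dt) to the coarsest."""
    r = 2 ** 0.5
    powers = []
    approx = list(x)
    while len(approx) > 1:
        details = [(approx[2 * i] - approx[2 * i + 1]) / r
                   for i in range(len(approx) // 2)]
        approx = [(approx[2 * i] + approx[2 * i + 1]) / r
                  for i in range(len(approx) // 2)]
        powers.append(sum(d * d for d in details) / len(details))
    return powers

# A slow square oscillation concentrates power at the coarsest level.
print(haar_power_spectrum([1, 1, 1, 1, -1, -1, -1, -1]))
```

    The red-noise significance test of the paper compares such per-scale powers against the analytic distribution derived for an AR(1) background.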

  6. Climate Time Series Analysis and Forecasting

    NASA Astrophysics Data System (ADS)

    Young, P. C.; Fildes, R.

    2009-04-01

    This paper will discuss various aspects of climate time series data analysis, modelling and forecasting being carried out at Lancaster. This will include state-dependent parameter, nonlinear, stochastic modelling of globally averaged atmospheric carbon dioxide; the computation of emission strategies based on modern control theory; and extrapolative time series benchmark forecasts of annual average temperature, both global and local. The key to the forecasting evaluation will be the iterative estimation of forecast error based on rolling origin comparisons, as recommended in the forecasting research literature. The presentation will conclude with a comparison of the time series forecasts with forecasts produced from global circulation models and a discussion of the implications for climate modelling research.
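
    Rolling-origin forecast evaluation can be sketched as follows; the mean benchmark forecaster and the temperature values are hypothetical stand-ins for the paper's models and data:

```python
def rolling_origin_errors(series, fit_forecast, min_train=3):
    """Rolling-origin evaluation: for each origin t, fit on series[:t] and
    compare the one-step forecast with the held-out observation series[t]."""
    errors = []
    for t in range(min_train, len(series)):
        errors.append(series[t] - fit_forecast(series[:t]))
    return errors

# Benchmark forecaster: the sample mean of the training window (hypothetical).
mean_forecast = lambda history: sum(history) / len(history)
temps = [14.1, 14.3, 14.2, 14.5, 14.4, 14.6]  # hypothetical annual means
errs = rolling_origin_errors(temps, mean_forecast)
mse = sum(e * e for e in errs) / len(errs)
```

    Because every forecast is made only from data available at its origin, this design avoids the look-ahead bias of a single in-sample fit, which is why the forecasting literature recommends it.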

  7. Clustering Financial Time Series by Network Community Analysis

    NASA Astrophysics Data System (ADS)

    Piccardi, Carlo; Calatroni, Lisa; Bertoni, Fabio

    In this paper, we describe a method for clustering financial time series which is based on community analysis, a recently developed approach for partitioning the nodes of a network (graph). A network with N nodes is associated to the set of N time series. The weight of the link (i, j), which quantifies the similarity between the two corresponding time series, is defined according to a metric based on symbolic time series analysis, which has recently proved effective in the context of financial time series. Then, searching for network communities allows one to identify groups of nodes (and then time series) with strong similarity. A quantitative assessment of the significance of the obtained partition is also provided. The method is applied to two distinct case-studies concerning the US and Italy Stock Exchange, respectively. In the US case, the stability of the partitions over time is also thoroughly investigated. The results favorably compare with those obtained with the standard tools typically used for clustering financial time series, such as the minimal spanning tree and the hierarchical tree.

  8. Time-Series Analysis: A Cautionary Tale

    NASA Technical Reports Server (NTRS)

    Damadeo, Robert

    2015-01-01

    Time-series analysis has often been a useful tool in atmospheric science for deriving long-term trends in various atmospherically important parameters (e.g., temperature or the concentration of trace gas species). In particular, time-series analysis has been repeatedly applied to satellite datasets in order to derive the long-term trends in stratospheric ozone, which is a critical atmospheric constituent. However, many of the potential pitfalls relating to the non-uniform sampling of the datasets were often ignored and the results presented by the scientific community have been unknowingly biased. A newly developed and more robust application of this technique is applied to the Stratospheric Aerosol and Gas Experiment (SAGE) II version 7.0 ozone dataset and the previous biases and newly derived trends are presented.

  9. Synchronization Analysis of Nonstationary Bivariate Time Series

    NASA Astrophysics Data System (ADS)

    Kurths, J.

    First, the concept of synchronization in coupled complex systems is presented, and it is shown that synchronization phenomena are abundant in science, nature and engineering. We use this concept to treat the inverse problem and to reveal interactions between oscillating systems from observational data. We discuss how time-varying phases and frequencies can be estimated from time series, and then present techniques for the detection and quantification of hidden synchronization. We demonstrate that this technique is effective for analysing systems' interrelations from noisy nonstationary bivariate data and provides insights beyond traditional cross-correlation and spectral analysis. For this, model examples and geophysical data are discussed.

  10. Cluster analysis of respiratory time series.

    PubMed

    Adams, J M; Attinger, E O; Attinger, F M

    1978-03-01

    We have investigated the respiratory control system with the hypothesis that, although many variables such as minute ventilation (VI), tidal volume (VT), breathing period (TT), inspiratory duration (TI), and expiratory duration (TE) may be observed, the controller functions more simply by manipulating only 2 or 3 of these. Thus, if tidal volume is the only independent variable, TI being determined by the "off-switch" threshold, these variables should have very similar time courses. Anesthetized dogs were subjected to CO2 breathing and carotid sinus perfusion to stimulate both chemoreceptors. The time series of the variables VI, VT, TT, TE, and TI as well as PACO2 were determined on a breath-by-breath basis. Derived characteristics of these time series were compared using cluster analysis, and the latent dimensionality of respiratory control was determined by factor analysis. The characteristics of the time series clustered into 4 groups: magnitude (of the response), speed, variability and relative change. One respiratory factor accounted for 86% of the variance for the variability characteristics, 2 factors for magnitude (91%) and relative change (85%), and 3 factors for speed (89%). The respiratory variables were analysed for each of the 4 groups of characteristics with the following results: VT and TI clustered together only for the magnitude and relative-change characteristics, whereas TT and TE clustered closely for all four characteristics. One latent factor was associated with the [TT-TE] group and the other usually with PACO2.

  11. Exploratory Causal Analysis in Bivariate Time Series Data

    NASA Astrophysics Data System (ADS)

    McCracken, James M.

    Many scientific disciplines rely on observational data of systems for which it is difficult (or impossible) to implement controlled experiments, and data analysis techniques are required for identifying causal information and relationships directly from observational data. This need has led to the development of many different time series causality approaches and tools, including transfer entropy, convergent cross-mapping (CCM), and Granger causality statistics. In this thesis, the existing time series causality method of CCM is extended by introducing a new method called pairwise asymmetric inference (PAI). It is found that CCM may provide counter-intuitive causal inferences for simple dynamics with strong intuitive notions of causality, and the CCM causal inference can be a function of physical parameters that are seemingly unrelated to the existence of a driving relationship in the system. For example, a CCM causal inference might alternate between ''voltage drives current'' and ''current drives voltage'' as the frequency of the voltage signal is changed in a series circuit with a single resistor and inductor. PAI is introduced to address both of these limitations. Many of the current approaches in the time series causality literature are not computationally straightforward to apply, do not follow directly from assumptions of probabilistic causality, depend on assumed models for the time series generating process, or rely on embedding procedures. A new approach, called causal leaning, is introduced in this work to avoid these issues. The leaning is found to provide causal inferences that agree with intuition for both simple systems and more complicated empirical examples, including space weather data sets. The leaning may provide a clearer interpretation of the results than those from existing time series causality tools, and a practicing analyst can explore the literature to find many proposals for identifying drivers and causal connections in time series data.

  12. AN ALTERNATIVE APPROACH TO THE TREATMENT OF MENISCAL PATHOLOGIES: A CASE SERIES ANALYSIS OF THE MULLIGAN CONCEPT “SQUEEZE” TECHNIQUE

    PubMed Central

    Richmond, Amy; Sanchez, Belinda; Stevenson, Valerie; Baker, Russell T.; May, James; Nasypany, Alan; Reordan, Don

    2016-01-01

    ABSTRACT Background Partial meniscectomy does not consistently produce the desired positive outcomes for meniscal tear lesions; therefore, a need exists for research into alternatives for treating the symptoms of meniscal tears. The purpose of this case series was to examine the effect of the Mulligan Concept (MC) “Squeeze” technique in physically active participants who presented with clinical symptoms of meniscal tears. Description of Cases The MC “Squeeze” technique was applied in five cases of clinically diagnosed meniscal tears in a physically active population. The Numeric Pain Rating Scale (NRS), the Patient Specific Functional Scale (PSFS), the Disability in the Physically Active (DPA) Scale, and the Knee injury and Osteoarthritis Outcome Score (KOOS) were administered to assess participant pain level and function. Outcomes Statistically significant improvements were found on cumulative NRS (p ≤ 0.001), current NRS (p ≤ 0.002), PSFS (p ≤ 0.003), DPA (p ≤ 0.019), and KOOS (p ≤ 0.002) scores across all five participants. All participants exceeded the minimal clinically important difference (MCID) on the first treatment and reported an NRS score and current pain score of one point or less at discharge. The MC “Squeeze” technique produced statistically and clinically significant changes across all outcome measures in all five participants. Discussion The use of the MC “Squeeze” technique in this case series indicated positive outcomes in five participants who presented with meniscal tear symptoms. Of importance to the athletic population, each of the participants continued to engage in sport activity as tolerated unless otherwise required during the treatment period. The outcomes reported in this case series exceed those reported when using traditional conservative therapy and the return-to-play timelines for meniscal tears treated with partial meniscectomies. Levels of Evidence Level 4 PMID:27525181

  13. Long-Term Retrospective Analysis of Mackerel Spawning in the North Sea: A New Time Series and Modeling Approach to CPR Data

    PubMed Central

    Jansen, Teunis; Kristensen, Kasper; Payne, Mark; Edwards, Martin; Schrum, Corinna; Pitois, Sophie

    2012-01-01

    We present a unique view of mackerel (Scomber scombrus) in the North Sea based on a new time series of larvae caught by the Continuous Plankton Recorder (CPR) survey from 1948-2005, covering the period both before and after the collapse of the North Sea stock. Hydrographic backtrack modelling suggested that the effect of advection is very limited between spawning and larvae capture in the CPR survey. Using a statistical technique not previously applied to CPR data, we then generated a larval index that accounts for both catchability as well as spatial and temporal autocorrelation. The resulting time series documents the significant decrease of spawning from before 1970 to recent depleted levels. Spatial distributions of the larvae, and thus the spawning area, showed a shift from early to recent decades, suggesting that the central North Sea is no longer as important as the areas further west and south. These results provide a consistent and unique perspective on the dynamics of mackerel in this region and can potentially resolve many of the unresolved questions about this stock. PMID:22737221

  14. Sliced Inverse Regression for Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Chen, Li-Sue

    1995-11-01

    In this thesis, general nonlinear models for time series data are considered. A basic form is x_t = f(β_1^T X_{t-1}, β_2^T X_{t-1}, ..., β_k^T X_{t-1}, ε_t), where x_t is the observed time series, X_{t-1} is the vector of the first d time lags, (x_{t-1}, x_{t-2}, ..., x_{t-d}), f is an unknown function, the β_i are unknown vectors, and the ε_t are independently distributed. Special cases include AR and TAR models. We investigate the feasibility of applying SIR/PHD (Li 1990, 1991) (the sliced inverse regression and principal Hessian directions methods) to estimate the β_i. PCA (principal component analysis) is brought in to check one critical condition for SIR/PHD. Through simulation and a study of three well-known data sets (the Canadian lynx series, the U.S. unemployment rate, and the sunspot numbers), we demonstrate how SIR/PHD can effectively retrieve the interesting low-dimensional structures in time series data.
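
    The sliced inverse regression step can be sketched numerically. The code below is an illustrative reconstruction, not the thesis implementation: the cubic link function, slice count, and sample size are all assumptions chosen so that a single direction β drives the response.

```python
import numpy as np

def sir_directions(X, y, n_slices=10):
    """Minimal sliced inverse regression: eigen-directions of the
    between-slice covariance of the standardized predictors."""
    n, p = X.shape
    cov = np.cov(X, rowvar=False)
    L = np.linalg.cholesky(np.linalg.inv(cov))
    Z = (X - X.mean(axis=0)) @ L            # standardized: cov(Z) = I
    order = np.argsort(y)                   # slice the data by the response
    V = np.zeros((p, p))
    for sl in np.array_split(order, n_slices):
        m = Z[sl].mean(axis=0)
        V += (len(sl) / n) * np.outer(m, m)
    _, vecs = np.linalg.eigh(V)
    B = L @ vecs[:, ::-1]                   # back to original coordinates
    return B / np.linalg.norm(B, axis=0)

rng = np.random.default_rng(9)
X = rng.normal(size=(5000, 4))
beta = np.array([1.0, 0.5, 0.0, 0.0])
beta /= np.linalg.norm(beta)
y = (X @ beta) ** 3 + 0.1 * rng.normal(size=5000)  # single-index cubic link

b1 = sir_directions(X, y)[:, 0]
print(round(abs(b1 @ beta), 2))             # near 1: direction recovered
```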

  15. Partial spectral analysis of hydrological time series

    NASA Astrophysics Data System (ADS)

    Jukić, D.; Denić-Jukić, V.

    2011-03-01

    Hydrological time series comprise the influences of numerous processes involved in the transfer of water in the hydrological cycle. This implies an ambiguity with respect to the processes encoded in spectral and cross-spectral density functions. Previous studies have not paid adequate attention to this issue. Spectral and cross-spectral density functions represent the Fourier transforms of auto-covariance and cross-covariance functions. Using this basic property, the ambiguity is resolved by applying a novel approach based on the spectral representation of partial correlation. The mathematical background for partial spectral density, partial amplitude and partial phase functions is presented. The proposed functions yield estimates of spectral density, amplitude and phase that are not affected by a controlling process. If an input-output relation is the subject of interest, antecedent and subsequent influences of the controlling process can be distinguished by considering the input event as a reference point. The method is used to analyse the relations between rainfall, air temperature and relative humidity, as well as the influences of air temperature and relative humidity on the discharge of a karst spring. The time series were collected in the catchment of the Jadro Spring, located in the Dinaric karst area of Croatia.

  16. Irreversibility of financial time series: A graph-theoretical approach

    NASA Astrophysics Data System (ADS)

    Flanagan, Ryan; Lacasa, Lucas

    2016-04-01

    The relation between time series irreversibility and entropy production has been recently investigated in thermodynamic systems operating away from equilibrium. In this work we explore this concept in the context of financial time series. We make use of visibility algorithms to quantify, in graph-theoretical terms, time irreversibility of 35 financial indices evolving over the period 1998-2012. We show that this metric is complementary to standard measures based on volatility and exploit it to both classify periods of financial stress and to rank companies accordingly. We then validate this approach by finding that a projection in principal components space of financial years, based on time irreversibility features, clusters together periods of financial stress from stable periods. Relations between irreversibility, efficiency and predictability are briefly discussed.

  17. Compounding approach for univariate time series with nonstationary variances

    NASA Astrophysics Data System (ADS)

    Schäfer, Rudi; Barkhofen, Sonja; Guhr, Thomas; Stöckmann, Hans-Jürgen; Kuhl, Ulrich

    2015-12-01

    A defining feature of nonstationary systems is the time dependence of their statistical parameters. Measured time series may exhibit Gaussian statistics on short time horizons, due to the central limit theorem. The sample statistics for long time horizons, however, averages over the time-dependent variances. To model the long-term statistical behavior, we compound the local distribution with the distribution of its parameters. Here, we consider two concrete, but diverse, examples of such nonstationary systems: the turbulent air flow of a fan and a time series of foreign exchange rates. Our main focus is to empirically determine the appropriate parameter distribution for the compounding approach. To this end, we extract the relevant time scales by decomposing the time signals into windows and determine the distribution function of the thus obtained local variances.
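
    As a minimal numerical illustration of the compounding idea (assumed toy data, not the paper's fan or exchange-rate measurements), one can decompose a signal into windows and estimate the distribution of the local variances:

```python
import numpy as np

def local_variances(x, window):
    """Sample variance within non-overlapping windows of a 1-D signal."""
    n_win = len(x) // window
    segments = x[: n_win * window].reshape(n_win, window)
    return segments.var(axis=1, ddof=1)

rng = np.random.default_rng(0)
# Toy nonstationary series: Gaussian noise whose local standard
# deviation drifts slowly over time.
sigma = 1.0 + 0.5 * np.sin(np.linspace(0.0, 4.0 * np.pi, 10_000))
x = rng.normal(0.0, sigma)

v = local_variances(x, window=200)
print(len(v))              # one variance estimate per window
print(round(v.mean(), 2))  # averages over the time-dependent variances
```

    The empirical distribution of `v` is what would be fed into the compounding integral; the window length is an assumption that trades bias against variance of the local estimates.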

  18. Sutureless clear corneal DSAEK with a modified approach for preventing pupillary block and graft dislocation: case series with retrospective comparative analysis.

    PubMed

    Titiyal, Jeewan S; Tinwala, Sana I; Shekhar, Himanshu; Sinha, Rajesh

    2015-04-01

    The purpose of this study was to describe a modified technique of sutureless DSAEK with continuous pressurized internal air tamponade. This was a prospective interventional case series, single-center, institutional study. Twenty-seven patients with corneal decompensation without scarring were included. Aphakic patients and patients with cataractous lens requiring IOL implantation surgery were excluded. Following preparation of the donor tissue, a corneal tunnel was made nasally with two side ports. All incisions were kept long enough to be overlapped by the peripheral part of the donor tissue. Descemet membrane scoring was done using a reverse Sinskey hook, following which it was removed with the same instrument or by forceps. The donor lenticule was then inserted using Busin's glide. Continuous pressurized internal air tamponade was achieved by means of a 30-gauge needle, inserted through the posterior limbus, for 12-14 min. At the end of the surgery, air was partially replaced with BSS, leaving a moderate-sized mobile air bubble in the anterior chamber. At the 6 month's follow-up, CDVA improved from counting fingers at half meter-6/24 preoperatively to 6/9-6/18 postoperatively, and the mean endothelial cell count decreased: to 1,800 from 2,200 cell/mm(2) preoperatively (18.19 % endothelial cell loss). Donor lenticule thickness as documented on AS-OCT was 70-110 µ on Day 1 and 50-80 µ at 6 months postoperative. None of the cases had flat AC or peripheral anterior synechiae formation. None of the patients required a second intervention. There were no cases of primary graft failure, pupillary block glaucomax or donor lenticule dislocation postoperatively. Our modified technique is simple and effective with reduction in postoperative complications associated with DSAEK, thereby maximizing anatomic and functional outcomes associated. PMID:24728534

  19. Time Series Analysis of SOLSTICE Measurements

    NASA Astrophysics Data System (ADS)

    Wen, G.; Cahalan, R. F.

    2003-12-01

    Solar radiation is the major energy source for the Earth's biosphere and atmospheric and ocean circulations. Variations of solar irradiance have been a major concern of scientists both in solar physics and atmospheric sciences. A number of missions have been carried out to monitor changes in total solar irradiance (TSI) [see Fröhlich and Lean, 1998 for review] and spectral solar irradiance (SSI) [e.g., SOLSTICE on UARS and VIRGO on SOHO]. Observations over a long time period reveal the connection between variations in solar irradiance and surface magnetic fields of the Sun [Lean1997]. This connection provides a guide to scientists in modeling solar irradiances [e.g., Fontenla et al., 1999; Krivova et al., 2003]. Solar spectral observations have now been made over a relatively long time period, allowing statistical analysis. This paper focuses on predictability of solar spectral irradiance using observed SSI from SOLSTICE . Analysis of predictability is based on nonlinear dynamics using an artificial neural network in a reconstructed phase space [Abarbanel et al., 1993]. In the analysis, we first examine the average mutual information of the observed time series and a delayed time series. The time delay that gives local minimum of mutual information is chosen as the time-delay for phase space reconstruction [Fraser and Swinney, 1986]. The embedding dimension of the reconstructed phase space is determined using the false neighbors and false strands method [Kennel and Abarbanel, 2002]. Subsequently, we use a multi-layer feed-forward network with back propagation scheme [e.g., Haykin, 1994] to model the time series. The predictability of solar irradiance as a function of wavelength is considered. References Abarbanel, H. D. I., R. Brown, J. J. Sidorowich, and L. Sh. Tsimring, Rev. Mod. Phys. 65, 1331, 1993. Fraser, A. M. and H. L. Swinney, Phys. Rev. 33A, 1134, 1986. Fontenla, J., O. R. White, P. Fox, E. H. Avrett and R. L. 
Kurucz, The Astrophysical Journal, 518, 480

  20. Behavior of road accidents: Structural time series approach

    NASA Astrophysics Data System (ADS)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir; Arsad, Zainudin

    2014-12-01

    Road accidents are a major contributor to the increasing number of deaths. Some researchers suggest that road accidents occur due to road structure and road condition, and road structure and condition may differ according to the area and the volume of traffic at the location. Therefore, this paper examines the behavior of road accidents in four main regions of Peninsular Malaysia by employing a structural time series (STS) approach. STS offers the possibility of modelling unobserved components such as the trend and the seasonal component, and allows them to vary over time. The results show that the number of road accidents in each region is described by a different model. The results imply that the government, and policy makers in particular, should consider implementing different approaches in different regions to overcome the increasing number of road accidents.

  1. Time Series Analysis and Prediction of AE and Dst Data

    NASA Astrophysics Data System (ADS)

    Takalo, J.; Lohikiski, R.; Timonen, J.; Lehtokangas, M.; Kaski, K.

    1996-12-01

    A new method to analyse the structure function has been constructed and used in the analysis of the AE time series for the years 1978-85 and the Dst time series for 1957-84. The structure function (SF) was defined by S(l) = <|x(t_i + lΔt) - x(t_i)|>, where Δt is the sampling time, l is an integer, and <|.|> denotes the average of absolute values. If a time series is self-affine, its SF should scale for small values of l as S(l) ∝ l^H, where 0 < H < 1 is called the scaling exponent. It is known that for power-law (coloured) noise, which has P ~ f^(-α), α ~ 2H + 1 for 1 < α < 3. In this work the scaling exponent H was analysed by considering the local slopes d log(S(l))/d log(l) between two adjacent points as a function of l. For a self-affine time series the local slopes should stay constant, at least for small values of l. The AE time series was found to be self-affine, with the scaling exponent changing at a time scale of 113 (±9) minutes. On the other hand, in the SF analysis the Dst data were dominated by the 24-hour and 27-day periods, the 27-day period being further modulated by the annual variation. These differences between the two time series arise from the difference in their periodicities in relation to their respective characteristic time scales. In the AE data the dominating periods are longer than that related to the characteristic time scale, i.e. they appear in the flatter part of the power spectrum. This is why affinity is the dominating feature of the AE time series. In contrast, the dominating periods of the Dst data are shorter than the characteristic time scale and appear in the steeper part of the spectrum. Consequently, periodicity is the dominating feature of the Dst data. Because of their different dynamic characteristics, prediction of the Dst and AE time series appears to require rather different approaches. In principle it is easier to produce the gross features of the Dst time series correctly as it is periodicity
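
    The SF and its local slopes are straightforward to compute. The sketch below checks the definitions on ordinary Brownian motion, for which H = 0.5; the test series and lag range are illustrative assumptions, not the AE/Dst data.

```python
import numpy as np

def structure_function(x, lags):
    """S(l) = <|x(t_i + l*dt) - x(t_i)|> for each integer lag l."""
    return np.array([np.abs(x[l:] - x[:-l]).mean() for l in lags])

# Ordinary Brownian motion (cumulative sum of white noise) is
# self-affine with H = 0.5, so S(l) should scale as l**0.5.
rng = np.random.default_rng(1)
x = np.cumsum(rng.normal(size=100_000))

lags = np.arange(1, 65)
S = structure_function(x, lags)
# Local slopes d log S / d log l between adjacent lags; for a
# self-affine series they should hover around the exponent H.
slopes = np.diff(np.log(S)) / np.diff(np.log(lags))
print(round(slopes.mean(), 2))
```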

  2. Mixed Spectrum Analysis on fMRI Time-Series.

    PubMed

    Kumar, Arun; Lin, Feng; Rajapakse, Jagath C

    2016-06-01

    Temporal autocorrelation present in functional magnetic resonance imaging (fMRI) data poses challenges to its analysis. Existing approaches to handling autocorrelation in fMRI time-series often presume a specific model of autocorrelation, such as an auto-regressive model. The main limitation here is that the correlation structure of voxels is generally unknown and varies across brain regions because of different levels of neurogenic noise and pulsatile effects. Enforcing a universal model on all brain regions leads to bias and loss of efficiency in the analysis. In this paper, we propose mixed spectrum analysis of the voxel time-series to separate the discrete component, corresponding to the input stimuli, from the continuous component carrying the temporal autocorrelation. A mixed spectral analysis technique based on an M-spectral estimator is proposed, which effectively removes autocorrelation effects from voxel time-series and identifies significant peaks of the spectrum. Because the proposed method does not assume any prior model for the autocorrelation effect in voxel time-series, varying correlation structure among brain regions does not affect its performance. We have modified the standard M-spectral method for application to a spatial set of time-series by incorporating contextual information related to the continuous spectrum of neighborhood voxels, thus reducing the computation cost considerably. The likelihood of activation is predicted by comparing the amplitude of the discrete component at the stimulus frequency across voxels, using a normal distribution and modeling the spatial correlations among the likelihoods with a conditional random field. We also demonstrate the application of the proposed method in detecting other desired frequencies.
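
    A plain FFT periodogram (a stand-in for the paper's M-spectral estimator, which is not reproduced here) already illustrates the mixed-spectrum idea of a discrete stimulus component sitting on a continuous autocorrelated background; the stimulus frequency and AR(1) noise below are toy assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
n, f_stim = 1024, 0.1        # sampling interval assumed to be one TR
t = np.arange(n)

noise = np.zeros(n)          # AR(1) "continuous" background, phi = 0.6
for i in range(1, n):
    noise[i] = 0.6 * noise[i - 1] + rng.normal()
x = 2.0 * np.sin(2.0 * np.pi * f_stim * t) + noise

# Raw periodogram of the mean-removed series: the discrete stimulus
# component stands out as a sharp peak above the smooth AR(1) spectrum.
freqs = np.fft.rfftfreq(n, d=1.0)
power = np.abs(np.fft.rfft(x - x.mean())) ** 2 / n
peak = freqs[np.argmax(power)]
print(round(peak, 3))        # close to the stimulus frequency 0.1
```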

  3. Time-dependent spectral analysis of epidemiological time-series with wavelets.

    PubMed

    Cazelles, Bernard; Chavez, Mario; Magny, Guillaume Constantin de; Guégan, Jean-Francois; Hales, Simon

    2007-08-22

    In the current context of global infectious disease risks, a better understanding of the dynamics of major epidemics is urgently needed. Time-series analysis has appeared as an interesting approach to explore the dynamics of numerous diseases. Classical time-series methods can only be used for stationary time-series (in which the statistical properties do not vary with time). However, epidemiological time-series are typically noisy, complex and strongly non-stationary. Given this specific nature, wavelet analysis appears particularly attractive because it is well suited to the analysis of non-stationary signals. Here, we review the basic properties of the wavelet approach as an appropriate and elegant method for time-series analysis in epidemiological studies. The wavelet decomposition offers several advantages that are discussed in this paper based on epidemiological examples. In particular, the wavelet approach permits analysis of transient relationships between two signals and is especially suitable for gradual change in force by exogenous variables.
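
    A minimal continuous wavelet transform with a Morlet mother wavelet (ω0 = 6) can be written directly as a convolution. The two-period toy signal below stands in for an epidemiological series and is purely an assumption for illustration; a real analysis would use a tested wavelet library and significance testing.

```python
import numpy as np

def morlet_cwt(x, scales, dt=1.0, omega0=6.0):
    """Time-domain CWT with a (non-admissibility-corrected) Morlet wavelet."""
    out = np.empty((len(scales), len(x)), dtype=complex)
    for i, s in enumerate(scales):
        t = np.arange(-4.0 * s, 4.0 * s + dt, dt)
        psi = np.pi ** -0.25 * np.exp(1j * omega0 * t / s - (t / s) ** 2 / 2.0)
        psi *= np.sqrt(dt / s)          # keeps power comparable across scales
        # The Morlet wavelet is Hermitian, so convolution equals correlation.
        out[i] = np.convolve(x, psi, mode="same")
    return out

# Toy signal: an "annual" cycle (period 52) plus a weaker period-26 cycle.
t = np.arange(520)
x = np.sin(2 * np.pi * t / 52) + 0.5 * np.sin(2 * np.pi * t / 26)
scales = np.array([26.0, 52.0]) / 1.033     # Morlet: scale ~ period / 1.033
power = np.abs(morlet_cwt(x, scales)) ** 2
p26, p52 = power[:, 100:-100].mean(axis=1)  # trim edges (cone of influence)
print(p52 > p26)                            # the stronger annual cycle wins
```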

  4. Time series analysis of temporal networks

    NASA Astrophysics Data System (ADS)

    Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh

    2016-01-01

    A common but important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application-oriented studies only knowing these properties is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without knowledge of the full network structure; an estimate of some of the properties is sufficient to launch the attack. In this paper, we show that even if the network structure at a future time point is not available, one can still estimate its properties. We propose a novel method that maps a temporal network to a set of time series instances, analyzes them, and, using a standard time series forecast model, predicts the properties of the temporal network at a later time instance. To this end, we consider eight properties, such as the number of active nodes, the average degree, and the clustering coefficient, and apply our prediction framework to them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as an Auto-Regressive Integrated Moving Average (ARIMA) process. We use cross-validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency-domain properties of the time series obtained from spectrogram analysis can be used to refine the prediction framework by identifying beforehand the cases where the error in prediction is likely to be high. This leads to an improvement of 7.96% (for error level ≤20%) in prediction accuracy on average across all datasets. As an application, we show how such a prediction scheme can be used to launch targeted attacks on temporal networks.
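
    The forecasting step alone can be sketched with a numpy AR(1) fit standing in for a full ARIMA model; the synthetic "average degree per snapshot" series below is an assumption, not the face-to-face contact data:

```python
import numpy as np

def ar1_forecast(series):
    """One-step-ahead forecast from a least-squares AR(1) fit."""
    x = np.asarray(series, dtype=float)
    d = x - x.mean()
    phi = np.dot(d[1:], d[:-1]) / np.dot(d[:-1], d[:-1])  # lag-1 coefficient
    return x.mean() + phi * (x[-1] - x.mean())

# Synthetic network-property series with memory (true phi = 0.8).
rng = np.random.default_rng(3)
x = np.empty(200)
x[0] = 10.0
for i in range(1, len(x)):
    x[i] = 10.0 + 0.8 * (x[i - 1] - 10.0) + rng.normal(scale=0.5)

pred = ar1_forecast(x[:-1])                # fit on all but the last snapshot
err = abs(pred - x[-1]) / x[-1] * 100.0    # percentage error, as in the paper
print(round(pred, 1), round(err, 1))
```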

  5. Flutter Analysis for Turbomachinery Using Volterra Series

    NASA Technical Reports Server (NTRS)

    Liou, Meng-Sing; Yao, Weigang

    2014-01-01

    The objective of this paper is to describe an accurate and efficient reduced order modeling method for aeroelastic (AE) analysis and for determining the flutter boundary. Without losing accuracy, we develop a reduced order model based on the Volterra series to achieve significant savings in computational cost. The aerodynamic force is provided by a high-fidelity solution from the Reynolds-averaged Navier-Stokes (RANS) equations; the structural mode shapes are determined from the finite element analysis. The fluid-structure coupling is then modeled by the state-space formulation with the structural displacement as input and the aerodynamic force as output, which in turn acts as an external force to the aeroelastic displacement equation for providing the structural deformation. NASA's rotor 67 blade is used to study its aeroelastic characteristics under the designated operating condition. First, the CFD results are validated against measured data available for the steady state condition. Then, the accuracy of the developed reduced order model is compared with the full-order solutions. Finally the aeroelastic solutions of the blade are computed and a flutter boundary is identified, suggesting that the rotor, with the material property chosen for the study, is structurally stable at the operating condition, free of encountering flutter.

  6. A novel similarity comparison approach for dynamic ECG series.

    PubMed

    Yin, Hong; Zhu, Xiaoqian; Ma, Shaodong; Yang, Shuqiang; Chen, Liqian

    2015-01-01

    The heart sound signal is a reflection of heart and vascular system motion. A long-term continuous electrocardiogram (ECG) contains important information which can be helpful in preventing heart failure. A single piece of a long-term ECG recording usually consists of more than one hundred thousand data points, making it difficult to derive hidden features that may be reflected through dynamic ECG monitoring, and very time-consuming to analyze. In this paper, a Dynamic Time Warping method based on MapReduce (MRDTW) is proposed to make prognoses of possible lesions in patients. Through comparison of a real-time ECG of a patient with reference sets of normal and problematic cardiac waveforms, the experimental results reveal that our approach not only retains high accuracy, but also greatly improves the efficiency of the similarity measure for dynamic ECG series.
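
    The sequential dynamic time warping distance at the core of MRDTW can be written as a short dynamic program (the MapReduce distribution itself is not reproduced here, and the toy "beat" shapes are illustrative assumptions):

```python
import numpy as np

def dtw_distance(a, b):
    """Plain O(n*m) dynamic time warping with absolute-difference cost."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

beat = [0.0, 1.0, 2.0, 1.0, 0.0]
warped = [0.0, 0.0, 1.0, 2.0, 1.0, 0.0]   # same shape, stretched in time
other = [0.0, 2.0, 0.0, 2.0, 0.0]
print(dtw_distance(beat, warped))   # 0.0: warping absorbs the stretch
print(dtw_distance(beat, other) > dtw_distance(beat, warped))
```

    Because DTW aligns samples nonlinearly in time, the warped copy of the beat scores a distance of zero while a genuinely different waveform does not, which is the property the similarity comparison relies on.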

  7. Deciding on the best (in this case) approach to time-series forecasting

    SciTech Connect

    Pack, D. J.

    1980-01-01

    This paper was motivated by a Decision Sciences article (v. 10, no. 2, 232-244(April 1979)) that presented comparisons of the adaptive estimation procedure (AEP), adaptive filtering, the Box-Jenkins (BJ) methodology, and multiple regression analysis as they apply to time-series forecasting with single-series models. While such comparisons are to be applauded in general, it is demonstrated that the empirical comparisons of the above paper are quite misleading with respect to choosing between the AEP and BJ approaches. This demonstration is followed by a somewhat philosophical discussion on comparison-of-methods techniques.

  8. The scaling of time series size towards detrended fluctuation analysis

    NASA Astrophysics Data System (ADS)

    Gao, Xiaolei; Ren, Liwei; Shang, Pengjian; Feng, Guochen

    2016-06-01

    In this paper, we introduce a modification of detrended fluctuation analysis (DFA), called the multivariate DFA (MNDFA) method, based on the scaling of the time series size N. In the traditional DFA method we observed the influence of the sequence segmentation interval s, which inspired us to propose the new MNDFA model to discuss the scaling of time series size in DFA. The effectiveness of the procedure is verified by numerical experiments with both artificial and stock-returns series. Results show that the proposed MNDFA method extracts more significant information from a series than the traditional DFA method. The scaling of the time series size has an influence on the auto-correlation (AC) in time series. For certain series we obtain an exponential relationship and calculate the slope through the fitting function. Our analysis and a finite-size-effect test demonstrate that an appropriate choice of the time series size can avoid unnecessary influences and make the testing results more accurate.
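
    For reference, textbook order-1 DFA, the baseline that the MNDFA modification builds on, looks as follows; the white-noise test series (for which the scaling exponent should be near 0.5) and the scale list are assumptions for illustration:

```python
import numpy as np

def dfa(x, scales):
    """Order-1 DFA: RMS fluctuation of the detrended profile per scale."""
    y = np.cumsum(x - x.mean())               # integrated profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[: n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        resid = [seg - np.polyval(np.polyfit(t, seg, 1), t) for seg in segs]
        F.append(np.sqrt(np.mean(np.square(resid))))
    return np.array(F)

rng = np.random.default_rng(4)
x = rng.normal(size=50_000)                   # white noise: expect alpha ~ 0.5
scales = np.array([16, 32, 64, 128, 256])
F = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(round(alpha, 2))
```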

  9. Time Series in Education: The Analysis of Daily Attendance in Two High Schools

    ERIC Educational Resources Information Center

    Koopmans, Matthijs

    2011-01-01

    This presentation discusses the use of a time series approach to the analysis of daily attendance in two urban high schools over the course of one school year (2009-10). After establishing that the series for both schools were stationary, they were examined for moving average processes, autoregression, seasonal dependencies (weekly cycles),…

  10. Non-stationary hydrological frequency analysis based on the reconstruction of extreme hydrological series

    NASA Astrophysics Data System (ADS)

    Hu, Y. M.; Liang, Z. M.; Jiang, X. L.; Bu, H.

    2015-06-01

    In this paper, a novel approach to non-stationary hydrological frequency analysis is proposed. The approach is motivated by the following consideration: at present, the data series used to detect mutation characteristics are very short and may reflect only part of the characteristics of the population. That is to say, the mutation characteristics of a short series may not fully represent those of the population; for example, the degree of mutation may differ between the short sample and the population. In the proposed method, it is assumed that a varying hydrological series within a large time window has an expected vibration center (EVC), which is a linear combination of the two mean values of the two subsample series obtained by separating the original hydrological series with a novel optimal segmentation technique (the change rate of slope method). The EVC is then used to reconstruct the non-stationary series so that it meets the requirement of stationarity, which in turn ensures that conventional frequency analysis methods remain valid.

  11. Time-series analysis of Campylobacter incidence in Switzerland.

    PubMed

    Wei, W; Schüpbach, G; Held, L

    2015-07-01

    Campylobacteriosis has been the most common food-associated notifiable infectious disease in Switzerland since 1995. Contact with and ingestion of raw or undercooked broilers are considered the dominant risk factors for infection. In this study, we investigated the temporal relationship between the disease incidence in humans and the prevalence of Campylobacter in broilers in Switzerland from 2008 to 2012. We use a time-series approach to describe the pattern of the disease by incorporating seasonal effects and autocorrelation. The analysis shows that prevalence of Campylobacter in broilers, with a 2-week lag, has a significant impact on disease incidence in humans. Therefore Campylobacter cases in humans can be partly explained by contagion through broiler meat. We also found a strong autoregressive effect in human illness, and a significant increase of illness during Christmas and New Year's holidays. In a final analysis, we corrected for the sampling error of prevalence in broilers and the results gave similar conclusions.
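
    The lag structure can be illustrated with a hypothetical example: an ordinary least-squares regression of weekly case counts on prevalence shifted by two weeks (a numpy stand-in for the paper's seasonal autoregressive model; all series and coefficients below are simulated assumptions):

```python
import numpy as np

rng = np.random.default_rng(5)
weeks = 260                                    # five years of weekly data
t = np.arange(weeks)
# Simulated broiler prevalence with a yearly cycle plus noise.
prevalence = (20.0 + 10.0 * np.sin(2 * np.pi * t / 52)
              + rng.normal(scale=2.0, size=weeks))
lag = 2                                        # two-week lag, as in the study
# Simulated human cases driven by prevalence two weeks earlier.
cases = (5.0 + 3.0 * prevalence[:-lag]
         + rng.normal(scale=5.0, size=weeks - lag))

# OLS of cases on the lagged prevalence (intercept + slope).
X = np.column_stack([np.ones(weeks - lag), prevalence[:-lag]])
beta, *_ = np.linalg.lstsq(X, cases, rcond=None)
print(round(beta[1], 1))                       # recovers the true effect of 3
```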

  12. Critical Thinking Skills. Analysis and Action Series.

    ERIC Educational Resources Information Center

    Heiman, Marcia; Slomianko, Joshua

    Intended for teachers across grade levels and disciplines, this monograph reviews research on the development of critical thinking skills and introduces a series of these skills that can be incorporated into classroom teaching. Beginning with a definition of critical thinking, the monograph contains two main sections. The first section reviews…

  13. A time-series approach to dynamical systems from classical and quantum worlds

    SciTech Connect

    Fossion, Ruben

    2014-01-08

    This contribution discusses some recent applications of time-series analysis in Random Matrix Theory (RMT), and applications of RMT in the statistical analysis of eigenspectra of correlation matrices of multivariate time series.

  14. The U-series comminution approach: where to from here

    NASA Astrophysics Data System (ADS)

    Handley, Heather; Turner, Simon; Afonso, Juan; Turner, Michael; Hesse, Paul

    2015-04-01

    Quantifying the rates of landscape evolution in response to climate change is inhibited by the difficulty of dating the formation of continental detrital sediments. The 'comminution age' dating model of DePaolo et al. (2006) hypothesises that the measured disequilibria between U-series nuclides (234U and 238U) in fine-grained continental (detrital) sediments can be used to calculate the time elapsed since mechanical weathering of a grain to below the threshold size (~50 µm). The comminution age includes the time that a particle has been mobilised in transport, held in temporary storage (e.g., in soils and floodplains), and the time elapsed since final deposition to the present day. Therefore, if the deposition age of a sediment can be constrained independently, for example via optically stimulated luminescence (OSL) dating, the residence time of the sediment (e.g., a palaeochannel deposit) can be determined. Despite the significant potential of this approach, there is still much work to be done before meaningful absolute comminution ages can be obtained. The calculated recoil loss factor and comminution age are highly dependent on the method of recoil loss factor determination used and its inherent assumptions. We present new and recently published uranium isotope data for aeolian sediment deposits, leached and unleached palaeochannel sediments, and bedrock samples from Australia to exemplify areas of current uncertainty in the comminution age approach. In addition to the information gained from natural samples, Monte Carlo simulations have been conducted for a synthetic sediment sample to determine the individual and combined comminution-age uncertainties associated with each input variable. Using a reasonable associated uncertainty for each input factor and including variations in the source rock and measured (234U/238U) ratios, the total combined uncertainty on comminution age in our simulation (for two methods of recoil loss factor estimation: weighted geometric and surface area

  15. Stratospheric ozone time series analysis using dynamical linear models

    NASA Astrophysics Data System (ADS)

    Laine, Marko; Kyrölä, Erkki

    2013-04-01

    We describe a hierarchical statistical state space model for ozone profile time series. The time series come from satellite measurements by the SAGE II and GOMOS instruments spanning the years 1984-2012. The original data sets are combined and gridded monthly using 10-degree latitude bands, covering 20-60 km with 1 km vertical spacing. Model components include a level, a trend, a seasonal effect, and solar activity and the quasi-biennial oscillation as proxy variables. A typical feature of atmospheric time series is that they are not stationary but exhibit both slowly varying and abrupt changes in their distributional properties, caused by external forcing such as changes in solar activity or volcanic eruptions. Further, the data sampling is often nonuniform, there are data gaps, and the uncertainty of the observations can vary. When observations are combined from various sources there will be instrument- and retrieval-method-related biases, and the differences in sampling also lead to uncertainties. Standard classical ARIMA-type statistical time series methods are mostly useless for atmospheric data. A more general approach makes use of dynamical linear models and Kalman-filter-type sequential algorithms. These state space models assume a linear relationship between the unknown state of the system and the observations, and for the process evolution of the hidden states, yet they are flexible enough to model both smooth trends and sudden changes. The above-mentioned methodological challenges are discussed, together with an analysis of change points in trends related to the recovery of stratospheric ozone. This work is part of the ESA SPIN and ozone CCI projects.
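
    The simplest member of the dynamical linear model family described above is the local-level model, estimated by a scalar Kalman filter. The sketch below is illustrative only (simulated data rather than the SAGE II/GOMOS series, and far simpler than the paper's full hierarchical model):

```python
import numpy as np

def kalman_level(y, q, r):
    """Scalar Kalman filter for a local-level model:
    state: level[t] = level[t-1] + N(0, q);  obs: y[t] = level[t] + N(0, r)."""
    level, p = y[0], 1.0
    out = np.empty_like(y)
    for i, obs in enumerate(y):
        p += q                        # predict: the level is a random walk
        k = p / (p + r)               # Kalman gain
        level += k * (obs - level)    # update with the new observation
        p *= 1.0 - k
        out[i] = level
    return out

rng = np.random.default_rng(6)
truth = 10.0 + np.cumsum(rng.normal(scale=0.05, size=500))  # slow drift
y = truth + rng.normal(scale=1.0, size=500)                 # noisy observations
est = kalman_level(y, q=0.05 ** 2, r=1.0)
print(np.abs(est - truth).mean() < np.abs(y - truth).mean())  # filter helps
```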

  16. Time series analysis of Monte Carlo neutron transport calculations

    NASA Astrophysics Data System (ADS)

    Nease, Brian Robert

    A time series based approach is applied to the Monte Carlo (MC) fission source distribution to calculate the non-fundamental mode eigenvalues of the system. The approach applies Principal Oscillation Patterns (POPs) to the fission source distribution, transforming the problem into a simple autoregressive order one (AR(1)) process. Proof is provided that the stationary MC process is linear to first order approximation, which is a requirement for the application of POPs. The autocorrelation coefficient of the resulting AR(1) process corresponds to the ratio of the desired mode eigenvalue to the fundamental mode eigenvalue. All modern k-eigenvalue MC codes calculate the fundamental mode eigenvalue, so the desired mode eigenvalue can be easily determined. The strength of this approach is contrasted against the Fission Matrix method (FMM) in terms of accuracy versus computer memory constraints. Multi-dimensional problems are considered since the approach has strong potential for use in reactor analysis, and the implementation of the method into production codes is discussed. Lastly, the appearance of complex eigenvalues is investigated and solutions are provided.
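
    The final estimation step reduces to fitting an AR(1) coefficient, whose value is the ratio of the desired mode eigenvalue to the fundamental one. The sketch below works on a synthetic AR(1) sequence; the fundamental eigenvalue k0 is a made-up stand-in for the value a k-eigenvalue MC code would report:

```python
import numpy as np

rng = np.random.default_rng(7)
ratio_true = 0.7                 # stand-in for k1 / k0 of some system
x = np.zeros(20_000)
for i in range(1, len(x)):       # synthetic AR(1) sequence with that ratio
    x[i] = ratio_true * x[i - 1] + rng.normal()

# Lag-1 autocorrelation estimate of the AR(1) coefficient.
d = x - x.mean()
ratio_est = np.dot(d[1:], d[:-1]) / np.dot(d[:-1], d[:-1])

k0 = 1.02                        # hypothetical fundamental-mode eigenvalue
k1 = ratio_est * k0              # desired-mode eigenvalue via the ratio
print(round(ratio_est, 2), round(k1, 2))
```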

  17. Multifractal Time Series Analysis Based on Detrended Fluctuation Analysis

    NASA Astrophysics Data System (ADS)

    Kantelhardt, Jan; Stanley, H. Eugene; Zschiegner, Stephan; Bunde, Armin; Koscielny-Bunde, Eva; Havlin, Shlomo

    2002-03-01

    In order to develop an easily applicable method for the multifractal characterization of non-stationary time series, we generalize the detrended fluctuation analysis (DFA), which is a well-established method for the determination of monofractal scaling properties and the detection of long-range correlations. We relate the new multifractal DFA method to the standard partition-function-based multifractal formalism, and compare it to the wavelet transform modulus maxima (WTMM) method, which is a well-established but more difficult procedure for this purpose. We employ the multifractal DFA method to determine whether the heart rhythm during different sleep stages is characterized by different multifractal properties.
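
    The multifractal generalization replaces the quadratic mean of the segment fluctuations with a q-th order mean (q ≠ 0 here; q = 0 needs a logarithmic form). A compact sketch on white-noise input, for which the generalized exponent h(q) should stay flat near 0.5; the scales and q values are illustrative assumptions:

```python
import numpy as np

def mfdfa(x, scales, qs):
    """q-th order fluctuation functions F_q(s); q = 2 is standard DFA."""
    y = np.cumsum(x - x.mean())
    Fq = np.empty((len(qs), len(scales)))
    for j, s in enumerate(scales):
        n_seg = len(y) // s
        segs = y[: n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        f2 = np.array([np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2)
                       for seg in segs])      # squared fluctuation per segment
        for i, q in enumerate(qs):
            Fq[i, j] = np.mean(f2 ** (q / 2.0)) ** (1.0 / q)
    return Fq

rng = np.random.default_rng(8)
x = rng.normal(size=40_000)          # monofractal input: h(q) flat near 0.5
scales = np.array([16, 32, 64, 128])
qs = [-2.0, 2.0]
Fq = mfdfa(x, scales, qs)
h = [np.polyfit(np.log(scales), np.log(Fq[i]), 1)[0] for i in range(len(qs))]
print([round(v, 1) for v in h])
```

    A multifractal series would instead show h(q) decreasing with q; the spread of h(q) across moments is the multifractality measure used in such analyses.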

  18. Assessing Spontaneous Combustion Instability with Nonlinear Time Series Analysis

    NASA Technical Reports Server (NTRS)

    Eberhart, C. J.; Casiano, M. J.

    2015-01-01

    Considerable interest lies in the ability to characterize the onset of spontaneous instabilities within liquid propellant rocket engine (LPRE) combustion devices. Linear techniques, such as fast Fourier transforms, various correlation parameters, and critical damping parameters, have been used at great length for over fifty years. Recently, nonlinear time series methods have been applied to deduce information pertaining to instability incipiency hidden in seemingly stochastic combustion noise. A technique commonly used in the biological sciences, known as Multifractal Detrended Fluctuation Analysis, has been extended to the combustion dynamics field and is introduced here as a data analysis approach complementary to linear ones. Advancing further, a modified technique is leveraged to extract signatures of impending combustion instability that present themselves prior to growth to limit-cycle amplitudes. The analysis is demonstrated on data from J-2X gas generator testing during which a distinct spontaneous instability was observed. Comparisons are made to previous work wherein the data were characterized using linear approaches. Verification of the technique is performed by examining idealized signals and comparing two separate, independently developed tools.

  19. Interstage Flammability Analysis Approach

    NASA Technical Reports Server (NTRS)

    Little, Jeffrey K.; Eppard, William M.

    2011-01-01

    The Interstage of the Ares I launch vehicle houses several key components which are on standby during First Stage operation: the Reaction Control System (ReCS), the Upper Stage (US) Thrust Vector Control (TVC) and the J-2X with the Main Propulsion System (MPS) propellant feed system. Potentially dangerous leaks of propellants could therefore develop. The Interstage leaks analysis addresses the concerns of localized mixing of hydrogen and oxygen gases to produce deflagration zones in the Interstage of the Ares I launch vehicle during First Stage operation. This report details the approach taken to accomplish the analysis. Specified leakage profiles and actual flammability results are not presented due to proprietary and security restrictions. The interior volume formed by the Interstage walls, bounding interfaces with the Upper and First Stages, and surrounding the J-2X engine was modeled using Loci-CHEM to assess the potential for flammable gas mixtures to develop during First Stage operations. The transient analysis included a derived flammability indicator based on mixture ratios to maintain achievable simulation times. Validation of results was based on a comparison to Interstage pressure profiles outlined in prior NASA studies. The approach proved useful in bounding the flammability risk in support of program hazard reviews.

  20. A statistical approach for segregating cognitive task stages from multivariate fMRI BOLD time series.

    PubMed

    Demanuele, Charmaine; Bähner, Florian; Plichta, Michael M; Kirsch, Peter; Tost, Heike; Meyer-Lindenberg, Andreas; Durstewitz, Daniel

    2015-01-01

    Multivariate pattern analysis can reveal new information from neuroimaging data to illuminate human cognition and its disturbances. Here, we develop a methodological approach, based on multivariate statistical/machine learning and time series analysis, to discern cognitive processing stages from functional magnetic resonance imaging (fMRI) blood oxygenation level dependent (BOLD) time series. We apply this method to data recorded from a group of healthy adults whilst performing a virtual reality version of the delayed win-shift radial arm maze (RAM) task. This task has been frequently used to study working memory and decision making in rodents. Using linear classifiers and multivariate test statistics in conjunction with time series bootstraps, we show that different cognitive stages of the task, as defined by the experimenter, namely, the encoding/retrieval, choice, reward and delay stages, can be statistically discriminated from the BOLD time series in brain areas relevant for decision making and working memory. Discrimination of these task stages was significantly reduced during poor behavioral performance in dorsolateral prefrontal cortex (DLPFC), but not in the primary visual cortex (V1). Experimenter-defined dissection of time series into class labels based on task structure was confirmed by an unsupervised, bottom-up approach based on Hidden Markov Models. Furthermore, we show that different groupings of recorded time points into cognitive event classes can be used to test hypotheses about the specific cognitive role of a given brain region during task execution. We found that the DLPFC strongly differentiated between task stages associated with different memory loads, but not between different visual-spatial aspects, whereas the reverse was true for V1.
Our methodology illustrates how different aspects of cognitive information processing during one and the same task can be separated and attributed to specific brain regions based on information contained in

  2. Model independent approach for estimating hydrological model parameters and rainfall time-series using only discharges time-series and coarse data

    NASA Astrophysics Data System (ADS)

    Michon, Timothée; Saulnier, Georges-Marie; Castaings, William

    2013-04-01

    Although hydrological models have progressed in terms of relevancy and efficiency, a calibration step is still required to estimate parameter values that cannot be obtained by field experiments or other physically based reasoning. As a consequence, the robustness of hydrological model results depends on data availability and data accuracy. Furthermore, current calibration procedures often require concomitant forcing and prognostic variable time series at identical time steps (e.g., hourly rainfall and discharge time series for flood hydrological models), which also limits the applicability of hydrological models (for retrospective historical analysis, for poorly gauged catchments, etc.). This communication deals with the question of whether hydrological model calibration is possible with less information content. In particular, it will be shown that rainfall and discharge time series are redundant to some extent, at least for the case study presented here (Ardèche catchment, 2000 km2, Southern France). First, it will be shown that "doing hydrology backward" (Kirchner, 2009) can be generalized to several models based on different and even contradictory assumptions, leading to a model-independent backward hydrology approach. It will also be shown that if a model is reasonably set up on a catchment, the rainfall time series can be accurately inverted using only the discharge time series. This prefigures the idea that discharge time series contain both information on rainfall inputs and information on the rainfall-discharge relationship, i.e. the hydrological behaviour of the considered catchment, and that these coupled pieces of information may be identified separately. In other terms, is it possible to distinguish and quantify, within the discharge time series, the model parameter values on one side and the rainfall time series on the other? This will be illustrated in a second part. Indeed, the presented results will show that, knowing only the hourly

  3. Biological time series analysis using a context free language: applicability to pulsatile hormone data.

    PubMed

    Dean, Dennis A; Adler, Gail K; Nguyen, David P; Klerman, Elizabeth B

    2014-01-01

    We present a novel approach for analyzing biological time-series data using a context-free language (CFL) representation that allows the extraction and quantification of important features from the time series. This representation results in Hierarchically AdaPtive (HAP) analysis, a suite of multiple complementary techniques that enables rapid analysis of data and does not require the user to set parameters. HAP analysis generates hierarchically organized parameter distributions that allow multi-scale components of the time series to be quantified, and includes a data analysis pipeline that applies recursive analyses to generate hierarchically organized results that extend traditional outcome measures such as pharmacokinetics and inter-pulse interval. Pulsicons, a novel text-based time-series representation also derived from the CFL approach, are introduced as an objective qualitative comparison nomenclature. We apply HAP to the analysis of 24 hours of frequently sampled pulsatile cortisol hormone data, which has known analysis challenges, from 14 healthy women. HAP analysis generated results in seconds and produced dozens of figures for each participant. The results quantify the observed qualitative features of cortisol data as a series of pulse clusters, each consisting of one or more embedded pulses, and identify two ultradian phenotypes in this dataset. HAP analysis is designed to be robust to individual differences and to missing data and may be applied to other pulsatile hormones. Future work can extend HAP analysis to other time-series data types, including oscillatory and other periodic physiological signals.

  4. Wavelet transform approach for fitting financial time series data

    NASA Astrophysics Data System (ADS)

    Ahmed, Amel Abdoullah; Ismail, Mohd Tahir

    2015-10-01

    This study investigates a newly developed technique, combining wavelet filtering and a vector error correction (VEC) model, to study the dynamic relationship among financial time series. A wavelet filter is used to remove noise from the daily data set of the NASDAQ stock market of the US and three stock markets of the Middle East and North Africa (MENA) region, namely, Egypt, Jordan, and Istanbul. The data cover the period from 6/29/2001 to 5/5/2009. After that, the returns of the series generated by the wavelet filter and of the original series are analyzed by a cointegration test and the VEC model. The results show that the cointegration test affirms the existence of cointegration between the studied series, and that there is a long-term relationship between the US stock market and the MENA stock markets. A comparison between the proposed model and the traditional model demonstrates that the proposed model (DWT with VEC model) outperforms the traditional model (VEC model) in fitting the financial stock market series, and reveals real information about the relationships among the stock markets.
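The wavelet filtering step can be illustrated with a single-level Haar transform and soft thresholding (a simplification of the DWT used in the study; the threshold value and test signal below are invented for the sketch, and the cointegration/VEC stage is omitted):

```python
import numpy as np

def haar_denoise(x, thresh):
    """One-level Haar DWT, soft-threshold the detail coefficients, invert."""
    x = np.asarray(x, float)
    n = len(x) // 2 * 2
    a = (x[0:n:2] + x[1:n:2]) / np.sqrt(2)       # approximation coefficients
    d = (x[0:n:2] - x[1:n:2]) / np.sqrt(2)       # detail coefficients
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0)   # soft threshold
    y = np.empty(n)
    y[0::2] = (a + d) / np.sqrt(2)               # inverse Haar transform
    y[1::2] = (a - d) / np.sqrt(2)
    return y

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 512)
clean = np.sin(2 * np.pi * 5 * t)                # smooth "price" component
noisy = clean + rng.normal(scale=0.3, size=t.size)
denoised = haar_denoise(noisy, thresh=0.3)
```

The denoised series is closer to the smooth component than the raw series is, which is the point of applying the filter before the cointegration analysis: the long-run relationship is estimated on a less noisy signal.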

  5. Automatising the analysis of stochastic biochemical time-series

    PubMed Central

    2015-01-01

    Background Mathematical and computational modelling of biochemical systems has seen a lot of effort devoted to the definition and implementation of high-performance mechanistic simulation frameworks. Within these frameworks it is possible to analyse complex models under a variety of configurations, eventually selecting the best setting of, e.g., parameters for a target system. Motivation This operational pipeline relies on the ability to interpret the predictions of a model, often represented as simulation time-series. Thus, an efficient data analysis pipeline is crucial to automatise time-series analyses, bearing in mind that errors in this phase might mislead the modeller's conclusions. Results For this reason we have developed an intuitive framework-independent Python tool to automate analyses common to a variety of modelling approaches. These include assessment of useful non-trivial statistics for simulation ensembles, e.g., estimation of master equations. Intuitive and domain-independent batch scripts will allow the researcher to automatically prepare reports, thus speeding up the usual model-definition, testing and refinement pipeline. PMID:26051821

  6. Nonlinear time-series analysis of Hyperion's lightcurves

    NASA Astrophysics Data System (ADS)

    Tarnopolski, M.

    2015-06-01

    Hyperion is a satellite of Saturn that was predicted to remain in a chaotic rotational state. This was confirmed to some extent by the Voyager 2 and Cassini series of images and some ground-based photometric observations. The aim of this article is to explore the conditions that potential observations need to meet in order to estimate a maximal Lyapunov Exponent (mLE), which, being positive, is an indicator of chaos and allows one to characterise it quantitatively. Lightcurves existing in the literature as well as numerical simulations are examined using standard tools of chaos theory. It is found that existing datasets are too short and undersampled to detect a positive mLE, although its presence is not rejected. Analysis of simulated lightcurves leads to the assertion that observations from one site should be performed over a year-long period to reliably detect a positive mLE, if present. Another approach would be to use 2-3 telescopes spread over the world so that observations are distributed more uniformly. This may be achieved without disrupting other observational projects being conducted. The necessity for the time series to be stationary is strongly stressed.

  7. Analysis of complex causal networks through time series

    NASA Astrophysics Data System (ADS)

    Hut, R.; van de Giesen, N.

    2008-12-01

    We introduce a new way of looking at (the relations between) groups of signals. In complex networks, such as landscapes and ecosystems, multiple factors influence each other either through direct causal relations or indirectly through intermediate variables. Puzzling apart the causal relations in a complex network on the basis of measured time series is not trivial. The method developed here allows us to do exactly that. Using relations that can be derived by (classical) multiple-input multiple-output system identification, we construct underlying networks of linear time-invariant systems that describe the direct relations between the different signals. The structure of this underlying network can provide valuable information about which signals are dominant, which relations between signals are dominant, and which signals affect each other through another signal instead of directly. Feedback is easily identified using this approach. We show that the eigenvalues of the underlying network determine the stability of the network as a whole. Applications are foreseen in, for instance, the field of data-driven climate modeling, as well as in other research involving time series analysis in complex networks.
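For a discrete-time linear network, the stability claim can be checked directly: the network is stable when the spectral radius of the interaction matrix is below one. The matrix below is a hypothetical three-signal example, not one from the study:

```python
import numpy as np

# Hypothetical interaction matrix for x_{t+1} = A x_t: entry A[i, j] is the
# direct linear influence of signal j on signal i (the nonzero A[0, 1] and
# A[1, 0] pair forms a feedback loop between signals 0 and 1).
A = np.array([[0.5, 0.3, 0.0],
              [0.2, 0.4, 0.1],
              [0.0, 0.6, 0.3]])

# The network as a whole is stable iff the largest eigenvalue magnitude < 1.
spectral_radius = max(abs(np.linalg.eigvals(A)))
stable = spectral_radius < 1.0
```

Here every Gershgorin disc lies inside the unit circle, so perturbations decay and the coupled system settles rather than diverging.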

  8. [The ethical approach applied to the TV series ER].

    PubMed

    Svandra, Philippe

    2013-05-01

    The television series ER presents an opportunity to reflect on ethical dilemmas. This article discusses the example of an episode in which a patient suffering from an incurable disease, unable to express his views clearly, has a tracheotomy performed on him without the consent of the team or his health care proxy. PMID:23776983

  9. Time series expression analyses using RNA-seq: a statistical approach.

    PubMed

    Oh, Sunghee; Song, Seongho; Grabowski, Gregory; Zhao, Hongyu; Noonan, James P

    2013-01-01

    RNA-seq is becoming the de facto standard approach for transcriptome analysis at ever-decreasing cost. It has considerable advantages over conventional technologies (microarrays) because it allows for direct identification and quantification of transcripts. Many time series RNA-seq datasets have been collected to study the dynamic regulation of transcripts. However, statistically rigorous and computationally efficient methods are needed to explore the time-dependent changes of gene expression in biological systems. These methods should explicitly account for the dependencies of expression patterns across time points. Here, we discuss several methods that can be applied to model time-course RNA-seq data, including statistical evolutionary trajectory index (SETI), autoregressive time-lagged regression (AR(1)), and hidden Markov model (HMM) approaches. We use three real datasets and simulation studies to demonstrate the utility of these dynamic methods in temporal analysis. PMID:23586021
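The AR(1) component can be sketched as an ordinary least-squares regression of each expression value on its predecessor; the synthetic trajectory and coefficient below are invented for illustration:

```python
import numpy as np

def fit_ar1(x):
    """Least-squares fit of x_t = a + b * x_{t-1} + e; returns (a, b)."""
    y = x[1:]
    X = np.column_stack([np.ones(len(x) - 1), x[:-1]])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef  # intercept a, lag coefficient b

rng = np.random.default_rng(3)
# Synthetic "expression" trajectory with genuine temporal dependence (b = 0.7);
# a lag coefficient near zero would indicate no dependence across time points.
x = np.empty(500)
x[0] = 0.0
for t in range(1, x.size):
    x[t] = 1.0 + 0.7 * x[t - 1] + rng.normal(scale=0.2)

a, b = fit_ar1(x)
```

A significant lag coefficient is what distinguishes a time-course model like this from treating each time point as an independent sample.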

  10. Wavelet analysis and scaling properties of time series

    NASA Astrophysics Data System (ADS)

    Manimaran, P.; Panigrahi, Prasanta K.; Parikh, Jitendra C.

    2005-10-01

    We propose a wavelet based method for the characterization of the scaling behavior of nonstationary time series. It makes use of the built-in ability of the wavelets for capturing the trends in a data set, in variable window sizes. Discrete wavelets from the Daubechies family are used to illustrate the efficacy of this procedure. After studying binomial multifractal time series with the present and earlier approaches of detrending for comparison, we analyze the time series of averaged spin density in the 2D Ising model at the critical temperature, along with several experimental data sets possessing multifractal behavior.

  11. Multichannel biomedical time series clustering via hierarchical probabilistic latent semantic analysis.

    PubMed

    Wang, Jin; Sun, Xiangping; Nahavandi, Saeid; Kouzani, Abbas; Wu, Yuchuan; She, Mary

    2014-11-01

    Biomedical time series clustering that automatically groups a collection of time series according to their internal similarity is of importance for medical record management and inspection such as bio-signals archiving and retrieval. In this paper, a novel framework that automatically groups a set of unlabelled multichannel biomedical time series according to their internal structural similarity is proposed. Specifically, we treat a multichannel biomedical time series as a document and extract local segments from the time series as words. We extend a topic model, i.e., the Hierarchical probabilistic Latent Semantic Analysis (H-pLSA), which was originally developed for visual motion analysis to cluster a set of unlabelled multichannel time series. The H-pLSA models each channel of the multichannel time series using a local pLSA in the first layer. The topics learned in the local pLSA are then fed to a global pLSA in the second layer to discover the categories of multichannel time series. Experiments on a dataset extracted from multichannel Electrocardiography (ECG) signals demonstrate that the proposed method performs better than previous state-of-the-art approaches and is relatively robust to the variations of parameters including length of local segments and dictionary size. Although the experimental evaluation used the multichannel ECG signals in a biometric scenario, the proposed algorithm is a universal framework for multichannel biomedical time series clustering according to their structural similarity, which has many applications in biomedical time series management.

  12. Spectral analysis of the Elatina series

    NASA Technical Reports Server (NTRS)

    Bracewell, R. N.

    1988-01-01

    The Elatina formation in South Australia, which provides a rich fossil record of presumptive solar activity in the late Precambrian, is of great potential significance for the physics of the sun because it contains laminae grouped in cycles of about 12, an appearance suggestive of the solar cycle. Here, the laminae are treated as varves laid down yearly and modulated in thickness in accordance with the late Precambrian sunspot activity for the year of deposition. The purpose is to present a simple structure, or intrinsic spectrum, that will be uncovered by appropriate data analysis.
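The kind of spectral analysis described here can be sketched with a simple periodogram; the synthetic varve-thickness series below, with an artificial 12-year modulation, is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)
# Synthetic varve-thickness series: a baseline thickness modulated by a
# 12-sample (i.e. ~12-year) cycle, plus depositional noise.
n = 600
years = np.arange(n)
thickness = 1.0 + 0.4 * np.sin(2 * np.pi * years / 12) \
    + rng.normal(scale=0.1, size=n)

# Periodogram of the zero-mean series; the dominant peak gives the cycle length.
spectrum = np.abs(np.fft.rfft(thickness - thickness.mean())) ** 2
freqs = np.fft.rfftfreq(n, d=1.0)
cycle = 1.0 / freqs[np.argmax(spectrum)]   # recovered period, in years
```

The recovered period matches the 12-year modulation built into the series, which is the "intrinsic spectrum" idea: a cyclic thickness modulation shows up as a sharp peak under appropriate spectral analysis.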

  13. Spectral analysis of the Elatina varve series

    NASA Technical Reports Server (NTRS)

    Bracewell, R. N.

    1988-01-01

    The Elatina formation in South Australia, which provides a rich fossil record of presumptive solar activity in the late Precambrian, is of great potential significance for the physics of the sun because it contains laminae grouped in cycles of about 12, an appearance suggestive of the solar cycle. Here, the laminae are treated as varves laid down yearly and modulated in thickness in accordance with the late Precambrian sunspot activity for the year of deposition. The purpose is to present a simple structure, or intrinsic spectrum, that will be uncovered by appropriate data analysis.

  14. Time Series Analysis of Insar Data: Methods and Trends

    NASA Technical Reports Server (NTRS)

    Osmanoglu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cano-Cabral, Enrique

    2015-01-01

    Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.
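The phase-wrapping problem can be illustrated in one dimension: for repeat-pass InSAR the phase is 4*pi*d/lambda, observed only modulo 2*pi, and temporal unwrapping recovers displacement as long as the motion between acquisitions stays below a quarter wavelength. The wavelength and motion history below are hypothetical:

```python
import numpy as np

lam = 0.056                                # hypothetical C-band wavelength (m)
true_disp = np.linspace(0.0, 0.10, 60)     # 10 cm of cumulative line-of-sight motion

# Two-way travel: displacement d maps to phase 4*pi*d/lam, observed wrapped.
phase = 4 * np.pi * true_disp / lam
wrapped = np.angle(np.exp(1j * phase))     # wrapped into (-pi, pi]

# Temporal unwrapping: valid while per-step phase changes stay below pi,
# i.e. per-step motion below one-quarter wavelength.
unwrapped = np.unwrap(wrapped)
recovered = unwrapped * lam / (4 * np.pi)  # displacement, ambiguity resolved
```

The wrapped phase alone is ambiguous (10 cm is several half-wavelengths), but the unwrapped series reproduces the full displacement history, which is the ambiguity the algorithms surveyed in the article resolve in space and time jointly.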

  15. Visibility graph analysis on quarterly macroeconomic series of China based on complex network theory

    NASA Astrophysics Data System (ADS)

    Wang, Na; Li, Dong; Wang, Qiwen

    2012-12-01

    The visibility graph approach and complex network theory provide a new insight into time series analysis. The inheritance of the visibility graph from the original time series is further explored in this paper. We found that degree distributions of visibility graphs extracted from Pseudo Brownian Motion series obtained by the Frequency Domain algorithm exhibit exponential behavior, in which the exponent is a binomial function of the Hurst index inherited in the time series. Our simulations show that the quantitative relations between the Hurst indexes and the exponents of the degree distribution function are different for different series and that the visibility graph inherits some important features of the original time series. Further, we convert some quarterly macroeconomic series, including the growth rates of value-added of three industry series and the growth rates of Gross Domestic Product series of China, to graphs by the visibility algorithm and explore the topological properties of the graphs associated with the four macroeconomic series, namely, the degree distribution and correlations, the clustering coefficient, the average path length, and community structure. Based on complex network analysis we find that the degree distributions of the networks associated with the growth rates of value-added of the three industry series are almost exponential and that the degree distributions of the networks associated with the growth rates of the GDP series are scale free. We also discuss the assortativity and disassortativity of the four associated networks as they relate to the evolutionary process of the original macroeconomic series. All the constructed networks have “small-world” features. The community structures of the associated networks suggest dynamic changes of the original macroeconomic series. We also detected the relationship among government policy changes, community structures of associated networks and macroeconomic dynamics. We find great influences of government
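The natural visibility algorithm itself is compact: each time point becomes a node, and two points are linked when the straight line between them passes above every intermediate sample. A minimal sketch on an invented five-point series:

```python
def visibility_graph(y):
    """Natural visibility graph: node per time point; edge (a, b) when the
    line from (a, y[a]) to (b, y[b]) clears every intermediate point."""
    n = len(y)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            # Point c blocks visibility if it reaches the connecting line.
            visible = all(
                y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.add((a, b))
    return edges

y = [3.0, 1.0, 2.0, 0.5, 4.0]
g = visibility_graph(y)
degrees = [sum(1 for e in g if i in e) for i in range(len(y))]
```

Adjacent points are always linked; the high end points "see" over the valley at index 3, while the low point at index 1 is blocked from the last point by the peak at index 2. The degree sequence of such graphs is what the paper relates back to the Hurst index of the original series.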

  16. The Effectiveness of Blind Source Separation Using Independent Component Analysis for GNSS Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Yan, Jun; Dong, Danan; Chen, Wen

    2016-04-01

    Due to the development of GNSS technology and the improvement of its positioning accuracy, observational data obtained by GNSS are widely used in space geodesy and geodynamics research. The GNSS time series of observation stations contain a wealth of information, including geographical changes, deformation of the Earth, migration of subsurface material, instantaneous deformation, weak deformation and other blind signals. In order to decompose the instantaneous deformation, weak deformation and other blind signals hidden in GNSS time series, we apply Independent Component Analysis (ICA) to daily station coordinate time series of the Southern California Integrated GPS Network. ICA is based on the statistical characteristics of the observed signal: it uses non-Gaussianity and independence to process the time series and obtain the source signals of the underlying geophysical events. In terms of the post-processing of precise GNSS time series, this paper examines GNSS time series using the principal component analysis (PCA) module of QOCA and an ICA algorithm to separate the source signals. We then focus on comparing these two signal separation technologies, PCA and ICA, for separating the original signals related to geophysical disturbances from the observed signals. After analyzing these separation approaches, we demonstrate that in the case of multiple factors PCA suffers from ambiguity in the separation of source signals, i.e. the attribution of the results is not clear, whereas ICA performs better; this means that ICA is the more suitable choice for GNSS time series in which the combination of source signals is unknown.
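A toy version of the ICA step can be sketched with a small FastICA implementation (tanh nonlinearity, symmetric decorrelation); the sources, mixing matrix, and function below are invented for illustration and are not the QOCA/production processing chain:

```python
import numpy as np

def fastica(X, n_iter=200, seed=0):
    """Tiny symmetric FastICA for a (signals x samples) array."""
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=1, keepdims=True)
    # Whiten via the eigendecomposition of the covariance matrix.
    d, E = np.linalg.eigh(np.cov(X))
    Z = E @ np.diag(d ** -0.5) @ E.T @ X
    n = X.shape[0]
    W = rng.normal(size=(n, n))
    for _ in range(n_iter):
        WZ = W @ Z
        G, Gp = np.tanh(WZ), 1 - np.tanh(WZ) ** 2
        # Fixed-point update: W <- E[g(WZ) Z^T] - E[g'(WZ)] W
        W = (G @ Z.T) / Z.shape[1] - np.diag(Gp.mean(axis=1)) @ W
        # Symmetric decorrelation: W <- (W W^T)^(-1/2) W
        s, U = np.linalg.eigh(W @ W.T)
        W = U @ np.diag(s ** -0.5) @ U.T @ W
    return W @ Z

t = np.linspace(0, 8 * np.pi, 2000)
s1 = np.sin(t)                            # smooth "seasonal" source
s2 = np.sign(np.sin(3 * t))               # step-like, strongly non-Gaussian source
S = np.vstack([s1, s2])
A = np.array([[1.0, 0.6], [0.4, 1.0]])    # unknown mixing
recovered = fastica(A @ S)
```

Each recovered component correlates strongly with exactly one source (up to sign and scale), illustrating why exploiting non-Gaussianity and independence lets ICA resolve the attribution ambiguity that a variance-based decomposition like PCA leaves open.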

  17. J series thruster isolator failure analysis

    NASA Technical Reports Server (NTRS)

    Campbell, J. W.; Bechtel, R. T.; Brophy, J. R.

    1982-01-01

    Three Hg propellant isolators (two cathode and one main) failed during testing in the Mission Profile Life Test. These failures involved contamination of the surface of the alumina insulating body which resulted in heating of the vaporizer by leakage current from the high voltage supply, with subsequent loss of propellant flow rate control. Failure analysis of the isolators showed the surface resistance was temperature dependent and that the alumina could be restored to its original insulating state by grit blasting the surface. The contaminant was identified as carbon and the most likely sources identified as ambient facility hydrocarbons, directed back-sputtered facility materials, and outgassing from organic insulating materials within the thruster envelope. Methods to eliminate contamination from each of these sources are described.

  18. Advanced tools for astronomical time series and image analysis

    NASA Astrophysics Data System (ADS)

    Scargle, Jeffrey D.

    The algorithms described here, which I have developed for applications in X-ray and γ-ray astronomy, will hopefully be of use in other ways, perhaps aiding in the exploration of modern astronomy's data cornucopia. The goal is to describe principled approaches to some ubiquitous problems, such as detection and characterization of periodic and aperiodic signals, estimation of time delays between multiple time series, and source detection in noisy images with noisy backgrounds. The latter problem is related to detection of clusters in data spaces of various dimensions. A goal of this work is to achieve a unifying view of several related topics: signal detection and characterization, cluster identification, classification, density estimation, and multivariate regression. In addition to being useful for analysis of data from space-based and ground-based missions, these algorithms may be a basis for a future automatic science discovery facility, and in turn provide analysis tools for the Virtual Observatory. This chapter has ties to those by Larry Bretthorst, Tom Loredo, Alanna Connors, Fionn Murtagh, Jim Berger, David van Dyk, Vicent Martinez & Enn Saar.

  19. Aroma characterization based on aromatic series analysis in table grapes.

    PubMed

    Wu, Yusen; Duan, Shuyan; Zhao, Liping; Gao, Zhen; Luo, Meng; Song, Shiren; Xu, Wenping; Zhang, Caixi; Ma, Chao; Wang, Shiping

    2016-08-04

    Aroma is an important part of quality in table grape, but the key aroma compounds and the aroma series of table grapes remain unknown. In this paper, we identified 67 aroma compounds in 20 table grape cultivars; 20 in pulp and 23 in skin were active compounds. C6 compounds were the basic background volatiles, but the aroma contents of pulp juice and skin depended mainly on the levels of esters and terpenes, respectively. Most obviously, 'Kyoho' grapevine series showed high contents of esters in pulp, while Muscat/floral cultivars showed abundant monoterpenes in skin. For the aroma series, table grapes were characterized mainly by herbaceous, floral, balsamic, sweet and fruity series. The simple and visualizable aroma profiles were established using aroma fingerprints based on the aromatic series. Hierarchical cluster analysis (HCA) and principal component analysis (PCA) showed that the aroma profiles of pulp juice, skin and whole berries could be classified into 5, 3, and 5 groups, respectively. Combined with sensory evaluation, we could conclude that fatty and balsamic series were the preferred aromatic series, and the contents of their contributors (β-ionone and octanal) may be useful as indicators for the improvement of breeding and cultivation measures for table grapes.

  2. Time series power flow analysis for distribution connected PV generation.

    SciTech Connect

    Broderick, Robert Joseph; Quiroz, Jimmy Edward; Ellis, Abraham; Reno, Matthew J.; Smith, Jeff; Dugan, Roger

    2013-01-01

    Distributed photovoltaic (PV) projects must go through an interconnection study process before connecting to the distribution grid. These studies are intended to identify the likely impacts and mitigation alternatives. In the majority of cases, system impacts can be ruled out or mitigation can be identified without an involved study, through a screening process or a simple supplemental review. For some proposed projects, however, expensive and time-consuming interconnection studies are required. The challenges in performing these studies are twofold. First, every study scenario is potentially unique: the studies are highly specific to the amount of PV generation capacity, which varies greatly from feeder to feeder and is often unevenly distributed along the same feeder, causing location-specific impacts and mitigations. Second, the inherent variability of PV power output can interact with feeder operation in complex ways, for example by affecting the operation of voltage-regulation and protection devices. The simulation tools and methods typically used today for distribution system planning are often not adequate to accurately assess these potential impacts. This report demonstrates how quasi-static time series (QSTS) simulation and high-time-resolution data can be used to assess the potential impacts in a more comprehensive manner. The QSTS simulations are applied to a set of sample feeders with high PV deployment to illustrate the usefulness of the approach. The report describes methods that can help determine how PV affects distribution system operations. The simulation results focus on enhancing the understanding of the underlying technical issues. The examples also highlight the steps needed to perform QSTS simulation and describe the data needed to drive the simulations. The goal of this report is to make the methodology of time series power flow analysis readily accessible to utilities and others responsible for evaluating
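
    The stepping logic of a QSTS study can be sketched compactly: march through synchronized load and PV profiles, evaluate the feeder state at each time step, and log limit violations. The sketch below substitutes a linearized two-bus voltage-rise formula for the full power-flow solution a real study would use, and the profiles, impedance, overvoltage limit, and 100 kW base are illustrative assumptions, not data from the report.

```python
# Quasi-static time-series (QSTS) sketch: step through synchronized load and
# PV profiles and evaluate the feeder at each step. A real study would run a
# full power flow per step; here a linearized two-bus voltage-rise formula
# stands in. All numeric values are illustrative assumptions.

def far_bus_voltage(p_load_kw, p_pv_kw, v0=1.0, r_pu=0.1, s_base_kw=100.0):
    """Per-unit voltage at the far bus for one time step (linearized)."""
    net_injection = p_pv_kw - p_load_kw       # kW exported back toward source
    return v0 + r_pu * net_injection / s_base_kw

def run_qsts(load_profile, pv_profile, v_max=1.05):
    """Return (step, voltage) pairs where the overvoltage limit is exceeded."""
    violations = []
    for t, (pl, pv) in enumerate(zip(load_profile, pv_profile)):
        v = far_bus_voltage(pl, pv)
        if v > v_max:
            violations.append((t, round(v, 4)))
    return violations

# Hypothetical profiles (kW), e.g. one sample per 10 minutes around midday
load = [40, 35, 30, 30, 35, 45]
pv   = [ 0, 60, 90, 95, 50,  5]

violations = run_qsts(load, pv)
print(violations)
```

The midday steps, where PV export exceeds load, are the ones flagged; this is exactly the kind of time-coincident effect a snapshot study can miss.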

  3. Emergent Approaches to Mental Health Problems. The Century Psychology Series.

    ERIC Educational Resources Information Center

    Cowen, Emory L., Ed.; And Others

    Innovative approaches to mental health problems are described. Conceptualizations about the following areas are outlined: psychiatry, the universe, and the community; theoretical malaise and community mental health; the relation of conceptual models to manpower needs; and mental health manpower and institutional change. Community programs and new…

  4. Activity Approach to Just Beyond the Classroom. Environmental Education Series.

    ERIC Educational Resources Information Center

    Skliar, Norman; La Mantia, Laura

    To provide teachers with some of the many activities that can be carried on "just beyond the classroom," the booklet presents plans for more than 40 outdoor education activities, all emphasizing a multidisciplinary, inquiry-based approach to learning. The school grounds offer optimum conditions for initiating studies in the out-of-doors. While every…

  5. Impacts of age-dependent tree sensitivity and dating approaches on dendrogeomorphic time series of landslides

    NASA Astrophysics Data System (ADS)

    Šilhán, Karel; Stoffel, Markus

    2015-05-01

    Different approaches and thresholds have been used in the past to date landslides with the growth-ring series of disturbed trees. Past work was based mostly on conifer species because of their well-defined ring boundaries and the easy identification of compression wood after stem tilting. More recently, work has been expanded to include broad-leaved trees, which are thought to produce fewer and less evident reactions after landsliding. This contribution reviews recent progress in dendrogeomorphic landslide analysis and introduces a new approach in which landslides are dated via the ring eccentricity formed after tilting. We compare results of this new approach with those of the more conventional approaches. In addition, the paper addresses tree sensitivity to landslide disturbance as a function of tree age and trunk diameter, using 119 common beech (Fagus sylvatica L.) and 39 Crimean pine (Pinus nigra ssp. pallasiana) trees growing on two landslide bodies. The landslide events reconstructed with the classical approach (reaction wood) also appear as events in the eccentricity analysis, but the inclusion of eccentricity allowed considerably more landslides (162%) to be detected in the tree-ring series. With respect to tree sensitivity, conifers and broad-leaved trees show the strongest reactions to landslides at ages between 40 and 60 years, with a second phase of increased sensitivity in P. nigra at ages of ca. 120-130 years. These phases of highest sensitivity correspond to trunk diameters at breast height of 6-8 and 18-22 cm, respectively (P. nigra). This study thus calls for the inclusion of eccentricity analyses in future landslide reconstructions, as well as for the selection of trees belonging to different age and diameter classes, to allow for a well-balanced and more complete reconstruction of past events.
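
    The eccentricity signal at the heart of such dating can be illustrated with a toy computation: from ring widths measured along the upslope and downslope radii, form a yearly index (upslope − downslope)/(upslope + downslope) and flag abrupt increases. The index formulation, the 0.3 jump threshold, and the ring widths below are illustrative assumptions, not the authors' protocol.

```python
# Ring-eccentricity sketch for dendrogeomorphic dating: given ring widths
# measured along the upslope and downslope radii of a tilted tree, compute a
# yearly eccentricity index and flag years where it jumps abruptly.
# The index definition and the 0.3 jump threshold are illustrative.

def eccentricity(upslope, downslope):
    return [(u - d) / (u + d) for u, d in zip(upslope, downslope)]

def flag_events(index, years, jump=0.3):
    """Years whose eccentricity rises by more than `jump` over the prior year."""
    return [years[i] for i in range(1, len(index))
            if index[i] - index[i - 1] > jump]

years     = list(range(2000, 2008))
upslope   = [1.0, 1.1, 1.0, 2.0, 2.1, 2.0, 1.2, 1.1]  # mm, hypothetical
downslope = [1.0, 1.0, 1.1, 0.8, 0.9, 0.9, 1.1, 1.0]  # a tilt simulated in 2003

idx = eccentricity(upslope, downslope)
events = flag_events(idx, years)
print(events)
```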

  6. Viewing effects of 3-D images synthesized from a series of 2-D tomograms by VAP and HAP approaches

    NASA Astrophysics Data System (ADS)

    Zhai, H. C.; Wang, M. W.; Liu, F. M.; Hsu, Ken Y.

    We report, for the first time, experimental results, with analysis, on synthesizing a series of simulated 2-D tomograms into a 3-D monochromatic image. Our results clearly show the advantage in monochromaticity of a vertical area-partition (VAP) approach over a horizontal area-partition (HAP) approach during the final white-light reconstruction. This monochromaticity ensures a 3-D image synthesis without distortion in gray level or positional recovery.

  7. Clinical immunology review series: an approach to desensitization

    PubMed Central

    Krishna, M T; Huissoon, A P

    2011-01-01

    Allergen immunotherapy describes the treatment of allergic disease through administration of gradually increasing doses of allergen. This form of immune tolerance induction is now safer, more reliably efficacious and better understood than when it was first formally described in 1911. In this paper the authors aim to summarize the current state of the art in immunotherapy in the treatment of inhalant, venom and drug allergies, with specific reference to its practice in the United Kingdom. A practical approach has been taken, with reference to current evidence and guidelines, including illustrative protocols and vaccine schedules. A number of novel approaches and techniques are likely to change considerably the way in which we select and treat allergy patients in the coming decade, and these advances are previewed. PMID:21175592

  8. A Monte Carlo Approach to Biomedical Time Series Search

    PubMed Central

    Woodbridge, Jonathan; Mortazavi, Bobak; Sarrafzadeh, Majid; Bui, Alex A.T.

    2016-01-01

    Time series subsequence matching (or signal searching) is important in a variety of areas in health care informatics, including case-based diagnosis and treatment as well as the discovery of trends and correlations between data. Much of the traditional research in signal searching has focused on high-dimensional R-NN matching. However, the results of R-NN are often small and yield minimal information gain, especially with higher-dimensional data. This paper proposes a randomized Monte Carlo sampling method to broaden the search criteria such that the query results are an accurate sampling of the complete result set. The proposed method is shown both theoretically and empirically to improve information gain. The number of query results is increased by several orders of magnitude over approximate exact-matching schemes and falls within a Gaussian distribution. The proposed method also shows excellent performance, as the majority of the overhead added by sampling can be mitigated through parallelization. Experiments are run on both simulated and real-world biomedical datasets.
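
    The sampling idea can be sketched in a few lines: instead of exhaustively ranking all subsequences, draw random offsets and keep those whose distance to the query falls within a (possibly relaxed) radius, so the hits approximate a uniform sample of the full result set. The Euclidean distance, radius, and toy signal below are assumptions for illustration; this is not the authors' implementation.

```python
import math
import random

# Monte Carlo subsequence search sketch: sample random subsequence offsets
# and keep those within radius R of the query, yielding an approximately
# uniform sample of the complete match set rather than only the R nearest
# neighbors. Distance measure, radius, and the toy signal are illustrative.

def euclid(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def mc_subsequence_search(series, query, radius, n_samples, seed=0):
    rng = random.Random(seed)
    m = len(query)
    offsets = range(len(series) - m + 1)
    hits = set()
    for _ in range(n_samples):
        i = rng.choice(offsets)
        if euclid(series[i:i + m], query) <= radius:
            hits.add(i)
    return sorted(hits)

# Toy signal: three copies of a bump [1, 2, 1] on a flat baseline
series = [0.0] * 50
for start in (5, 20, 35):
    for k, v in enumerate([1.0, 2.0, 1.0]):
        series[start + k] = v

hits = mc_subsequence_search(series, [1.0, 2.0, 1.0], radius=0.5, n_samples=2000)
print(hits)
```

With enough samples the three planted occurrences are recovered; widening `radius` broadens the result set, which is the paper's point.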

  9. A Corpus Analysis of Vocabulary Coverage and Vocabulary Learning Opportunities within a Children's Story Series

    ERIC Educational Resources Information Center

    Sun, Yu-Chih

    2016-01-01

    Extensive reading for second language learners has been widely documented over the past few decades. However, few studies, if any, have used a corpus analysis approach to analyze the vocabulary coverage within a single-author story series, its repetition of vocabulary, and the incidental and intentional vocabulary learning opportunities therein.…

  10. Optimal trading strategies—a time series approach

    NASA Astrophysics Data System (ADS)

    Bebbington, Peter A.; Kühn, Reimer

    2016-05-01

    Motivated by recent advances in the spectral theory of auto-covariance matrices, we are led to revisit a reformulation of Markowitz’ mean-variance portfolio optimization approach in the time domain. In its simplest incarnation it applies to a single traded asset and allows an optimal trading strategy to be found which—for a given return—is minimally exposed to market price fluctuations. The model is initially investigated for a range of synthetic price processes, taken to be either second order stationary, or to exhibit second order stationary increments. Attention is paid to consequences of estimating auto-covariance matrices from small finite samples, and auto-covariance matrix cleaning strategies to mitigate against these are investigated. Finally we apply our framework to real world data.

  11. Automatic differentiation for Fourier series and the radii polynomial approach

    NASA Astrophysics Data System (ADS)

    Lessard, Jean-Philippe; Mireles James, J. D.; Ransford, Julian

    2016-11-01

    In this work we develop a computer-assisted technique for proving existence of periodic solutions of nonlinear differential equations with non-polynomial nonlinearities. We exploit ideas from the theory of automatic differentiation in order to formulate an augmented polynomial system. We compute a numerical Fourier expansion of the periodic orbit for the augmented system, and prove the existence of a true solution nearby using an a-posteriori validation scheme (the radii polynomial approach). The problems considered here are given in terms of locally analytic vector fields (i.e. the field is analytic in a neighborhood of the periodic orbit) hence the computer-assisted proofs are formulated in a Banach space of sequences satisfying a geometric decay condition. In order to illustrate the use and utility of these ideas we implement a number of computer-assisted existence proofs for periodic orbits of the Planar Circular Restricted Three-Body Problem (PCRTBP).

  12. Scale-space analysis of time series in circulatory research.

    PubMed

    Mortensen, Kim Erlend; Godtliebsen, Fred; Revhaug, Arthur

    2006-12-01

    Statistical analysis of time series is still inadequate within circulation research. With the advent of increasing computational power and real-time recordings from hemodynamic studies, one is increasingly dealing with vast amounts of data in time series. This paper aims to illustrate how statistical analysis using the significant nonstationarities (SiNoS) method may complement traditional repeated-measures ANOVA and linear mixed models. We applied these methods on a dataset of local hepatic and systemic circulatory changes induced by aortoportal shunting and graded liver resection. We found SiNoS analysis more comprehensive when compared with traditional statistical analysis in the following four ways: 1) the method allows better signal-to-noise detection; 2) including all data points from real time recordings in a statistical analysis permits better detection of significant features in the data; 3) analysis with multiple scales of resolution facilitates a more differentiated observation of the material; and 4) the method affords excellent visual presentation by combining group differences, time trends, and multiscale statistical analysis allowing the observer to quickly view and evaluate the material. It is our opinion that SiNoS analysis of time series is a very powerful statistical tool that may be used to complement conventional statistical methods.

  13. Introduction to the Special Series on Research Synthesis: A Cross-Disciplinary Approach.

    PubMed

    Robinson, Lisa A; Hammitt, James K

    2015-06-01

    To estimate the effects of a policy change, analysts must often rely on available data, as time and resource constraints limit their ability to commission new primary research. Research synthesis methods, including systematic review, meta-analysis, and expert elicitation, play an important role in ensuring that this evidence is appropriately weighed and considered. We present the conclusions of a multidisciplinary Harvard Center for Risk Analysis project that evaluated and applied these methods, and introduce the resulting series of articles. The first step in any analysis is to clearly define the problem to be addressed; the second is a systematic review of the literature. Whether additional analysis is needed depends on the quality and relevance of the available data to the policy question, and on the likely effect of uncertainty on the policy decision. Meta-analysis promotes understanding of the variation between studies and may be used to combine the estimates to develop values for application in policy analysis. Formal, structured expert elicitation promotes careful consideration of the evidence when data are limited or inconsistent, and aids in extrapolating to the policy context. Regardless of the methods used, clear communication of the approach, assumptions, and uncertainty is essential. PMID:26081936
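
    The pooling step of a fixed-effect meta-analysis is compact enough to show inline: each study estimate is weighted by its inverse variance, so more precise studies dominate the combined value. The study estimates and variances below are hypothetical.

```python
import math

# Fixed-effect (inverse-variance) meta-analysis sketch: each study estimate
# is weighted by the reciprocal of its variance, so precise studies count
# more. The study values below are hypothetical.

def pooled_estimate(estimates, variances):
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    mean = sum(w * e for w, e in zip(weights, estimates)) / total
    se = math.sqrt(1.0 / total)          # standard error of the pooled mean
    return mean, se

study_effects   = [0.30, 0.10, 0.25]    # effect sizes from three studies
study_variances = [0.010, 0.040, 0.020]

mean, se = pooled_estimate(study_effects, study_variances)
print(round(mean, 4), round(se, 4))
```

A random-effects variant would add a between-study variance component to each weight; the structure of the calculation is otherwise the same.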

  14. Improvements in Accurate GPS Positioning Using Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Koyama, Yuichiro; Tanaka, Toshiyuki

    Although the Global Positioning System (GPS) is used widely in car navigation systems, cell phones, surveying, and other areas, several issues remain. We focus on the continuous data received in everyday public use of GPS and propose a new positioning algorithm based on time series analysis. By fitting an autoregressive model to the time series of the pseudorange, we obtain an appropriate state-space model. We apply the Kalman filter to this state-space model and use the pseudorange estimated by the filter in our positioning calculations. The results of our positioning experiment show that the accuracy of the proposed method is much better than that of the standard method. In addition, since valid estimates can be obtained through time series analysis with the state-space model, the proposed state-space model may be applicable in several other fields.
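
    A scalar version of this scheme can be sketched as follows: model the slowly varying pseudorange error as an AR(1) state and run a Kalman filter over noisy measurements. The AR coefficient, noise variances, and simulated data below are illustrative assumptions, not values fitted in the paper.

```python
import random

# Kalman-filter sketch for a pseudorange error modeled as an AR(1) state:
#   x[t] = phi * x[t-1] + w[t],  w ~ N(0, q)   (process model)
#   z[t] = x[t] + v[t],          v ~ N(0, r)   (measurement model)
# phi, q, r, and the simulated data are illustrative assumptions.

def kalman_ar1(z, phi=0.95, q=0.05, r=1.0):
    x, p = 0.0, 1.0                        # state estimate and its variance
    estimates = []
    for meas in z:
        x, p = phi * x, phi * phi * p + q  # predict
        k = p / (p + r)                    # Kalman gain
        x = x + k * (meas - x)             # update with the new measurement
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

# Simulate an AR(1) truth and noisy measurements of it
rng = random.Random(1)
truth, x = [], 0.0
for _ in range(200):
    x = 0.95 * x + rng.gauss(0.0, 0.05 ** 0.5)
    truth.append(x)
noisy = [t + rng.gauss(0.0, 1.0) for t in truth]

filtered = kalman_ar1(noisy)
mse_raw = sum((a - b) ** 2 for a, b in zip(noisy, truth)) / len(truth)
mse_filtered = sum((a - b) ** 2 for a, b in zip(filtered, truth)) / len(truth)
print(mse_filtered < mse_raw)
```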

  15. Simulation of active and passive millimeter-wave (35 GHz) sensors by time series analysis

    NASA Astrophysics Data System (ADS)

    Strenzwilk, D. F.; Maruyama, R. T.

    1982-11-01

    Analog voltage signals from a millimeter-wave (MMW) radiometer (passive sensor) and radar (active sensor) were collected over varying grassy terrains at Aberdeen Proving Ground (APG), Maryland, in July 1980. These measurements were made as part of continuing studies of MMW sensors for smart munitions. The signals were digitized at a rate of 2,000 observations per second and then analyzed by the Box-Jenkins time series approach. This analysis reports on the characterization of these data sets. The passive time series signals were well described by a simple autoregressive-moving-average process, similar to a previous set of data taken at Rome Air Development Center in Rome, N.Y., by the Ballistic Research Laboratory. The radar data (active sensor), on the other hand, required a data transformation to facilitate the analysis. In both cases the signals were well characterized using the Box-Jenkins time series approach.

  16. ADAPTIVE DATA ANALYSIS OF COMPLEX FLUCTUATIONS IN PHYSIOLOGIC TIME SERIES

    PubMed Central

    PENG, C.-K.; COSTA, MADALENA; GOLDBERGER, ARY L.

    2009-01-01

    We introduce a generic framework of dynamical complexity to understand and quantify fluctuations of physiologic time series. In particular, we discuss the importance of applying adaptive data analysis techniques, such as the empirical mode decomposition algorithm, to address the challenges of nonlinearity and nonstationarity that are typically exhibited in biological fluctuations. PMID:20041035

  17. Analysis of Complex Intervention Effects in Time-Series Experiments.

    ERIC Educational Resources Information Center

    Bower, Cathleen

    An iterative least squares procedure for analyzing the effect of various kinds of intervention in time-series data is described. There are numerous applications of this design in economics, education, and psychology, although until recently, no appropriate analysis techniques had been developed to deal with the model adequately. This paper…

  18. Handbook of Job Analysis for Reasonable Accommodation. Personnel Management Series.

    ERIC Educational Resources Information Center

    Yuspeh, Sheldon

    This is the second in a series of booklets on reasonable accommodation. It focuses on a job analysis process that can be used to plan and select appropriate actions necessary to accommodate handicapped persons in specific jobs and work environments. The guide is aimed especially at federal agencies, which are required to make reasonable…

  19. Nonlinear Analysis of Surface EMG Time Series of Back Muscles

    NASA Astrophysics Data System (ADS)

    Dolton, Donald C.; Zurcher, Ulrich; Kaufman, Miron; Sung, Paul

    2004-10-01

    A nonlinear analysis of surface electromyography time series of subjects with and without low back pain is presented. The mean-square displacement and entropy show anomalous diffusive behavior over the intermediate time range 10 ms < t < 1 s. This behavior implies the presence of correlations in the signal. We discuss the shape of the power spectrum of the signal.
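
    The mean-square displacement statistic is simple to compute from a sampled signal: MSD(τ) is the average of (x[t+τ] − x[t])² over t, and the exponent of its growth with τ separates ordinary from anomalous diffusion. The random-walk test signal below is illustrative, not EMG data.

```python
import random

# Mean-square displacement (MSD) sketch: MSD(tau) = <(x[t+tau] - x[t])^2>,
# averaged over all t. Ordinary diffusion gives MSD ~ tau; anomalous
# diffusion gives MSD ~ tau^alpha with alpha != 1. The random-walk test
# signal below is illustrative, not an EMG recording.

def msd(x, max_lag):
    out = []
    for tau in range(1, max_lag + 1):
        diffs = [(x[t + tau] - x[t]) ** 2 for t in range(len(x) - tau)]
        out.append(sum(diffs) / len(diffs))
    return out

rng = random.Random(42)
walk = [0.0]
for _ in range(5000):
    walk.append(walk[-1] + rng.gauss(0.0, 1.0))

m = msd(walk, 20)
print(round(m[0], 2), round(m[9], 2))   # MSD(1) and MSD(10); ~tau for a walk
```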

  20. Structured Time Series Analysis for Human Action Segmentation and Recognition.

    PubMed

    Dian Gong; Medioni, Gerard; Xuemei Zhao

    2014-07-01

    We address the problem of structure learning of human motion in order to recognize actions from a continuous monocular motion sequence of an arbitrary person seen from an arbitrary viewpoint. Human motion sequences are represented by multivariate time series in the joint-trajectories space. Under this structured time series framework, we first propose Kernelized Temporal Cut (KTC), an extension of previous work on change-point detection that incorporates Hilbert-space embedding of distributions to handle the nonparametric and high-dimensionality issues of human motion. Experimental results demonstrate the effectiveness of our approach, which yields real-time segmentation and high action-segmentation accuracy. Second, a spatio-temporal manifold framework is proposed to model the latent structure of time series data, and an efficient spatio-temporal alignment algorithm, Dynamic Manifold Warping (DMW), is proposed for multivariate time series to calculate motion similarity between action sequences (segments). Furthermore, by combining the temporal segmentation and alignment algorithms, online human action recognition can be performed by associating a few labeled examples from motion capture data. The results on human motion capture data and 3D depth sensor data demonstrate the effectiveness of the proposed approach in automatically segmenting and recognizing motion sequences, and its ability to handle noisy and partially occluded data in the transfer-learning module. PMID:26353312

  1. Wavelet analysis for non-stationary, nonlinear time series

    NASA Astrophysics Data System (ADS)

    Schulte, Justin A.

    2016-08-01

    Methods for detecting and quantifying nonlinearities in nonstationary time series are introduced and developed. In particular, higher-order wavelet analysis was applied to an ideal time series and the quasi-biennial oscillation (QBO) time series. Multiple-testing problems inherent in wavelet analysis were addressed by controlling the false discovery rate. A new local autobicoherence spectrum facilitated the detection of local nonlinearities and the quantification of cycle geometry. The local autobicoherence spectrum of the QBO time series showed that the QBO time series contained a mode with a period of 28 months that was phase coupled to a harmonic with a period of 14 months. An additional nonlinearly interacting triad was found among modes with periods of 10, 16 and 26 months. Local biphase spectra determined that the nonlinear interactions were not quadratic and that the effect of the nonlinearities was to produce non-smoothly varying oscillations. The oscillations were found to be skewed so that negative QBO regimes were preferred, and also asymmetric in the sense that phase transitions between the easterly and westerly phases occurred more rapidly than those from westerly to easterly regimes.

  2. Interglacial climate dynamics and advanced time series analysis

    NASA Astrophysics Data System (ADS)

    Mudelsee, Manfred; Bermejo, Miguel; Köhler, Peter; Lohmann, Gerrit

    2013-04-01

    Studying the climate dynamics of past interglacials (IGs) helps to better assess the anthropogenically influenced dynamics of the current IG, the Holocene. We select the IG portions from the EPICA Dome C ice core archive, which covers the past 800 ka, to apply methods of statistical time series analysis (Mudelsee 2010). The analysed variables are deuterium/H (indicating temperature) (Jouzel et al. 2007), greenhouse gases (Siegenthaler et al. 2005, Loulergue et al. 2008, Lüthi et al. 2008) and a model-co-derived climate radiative forcing (Köhler et al. 2010). We additionally select high-resolution sea-surface-temperature records from the marine sedimentary archive. The first statistical method, persistence time estimation (Mudelsee 2002), lets us infer the 'climate memory' property of IGs. Second, linear regression informs about long-term climate trends during IGs. Third, ramp function regression (Mudelsee 2000) is adapted to look at abrupt climate changes during IGs. We compare the Holocene with previous IGs in terms of these mathematical approaches, interpret the results in a climate context, and assess uncertainties and the requirements on data from old IGs for yielding results of 'acceptable' accuracy. This work receives financial support from the Deutsche Forschungsgemeinschaft (Project ClimSens within the DFG Research Priority Program INTERDYNAMIK) and the European Commission (Marie Curie Initial Training Network LINC, No. 289447, within the 7th Framework Programme). References Jouzel J, Masson-Delmotte V, Cattani O, Dreyfus G, Falourd S, Hoffmann G, Minster B, Nouet J, Barnola JM, Chappellaz J, Fischer H, Gallet JC, Johnsen S, Leuenberger M, Loulergue L, Luethi D, Oerter H, Parrenin F, Raisbeck G, Raynaud D, Schilt A, Schwander J, Selmo E, Souchez R, Spahni R, Stauffer B, Steffensen JP, Stenni B, Stocker TF, Tison JL, Werner M, Wolff EW (2007) Orbital and millennial Antarctic climate variability over the past 800,000 years. Science 317:793. Köhler P, Bintanja R

  3. Calculation of series capacitance for transient analysis of windings

    SciTech Connect

    Chowdhuri, P.

    1985-01-01

    The analysis of transient voltage oscillations inside a winding (transformer, reactor or magnet) is performed by representing the winding as a lumped ladder network. The capacitances to the return lead (or ground), C_gi, can be computed by assuming either an equivalent parallel-plate or a cylindrical-symmetry configuration. The equivalent series capacitances of a winding, C_si, however, form complex capacitance networks, consisting of intra- and intercoil turn-to-turn capacitances of the coil sections. Many researchers have proposed methods of computing the equivalent series capacitance of a winding, but as each method is quite different from the others, the analysis of the transient behavior of a winding becomes uncertain. A method of computing the series capacitance of a winding is proposed here that is mathematically rigorous but simple to execute. As a check, the equivalent series capacitance of a four-turn double-disk winding was computed by the proposed method as well as by simple circuit analysis; the computations corroborated each other.
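
    The elementary combination rules underlying any such capacitance calculation are worth keeping in view: reciprocals add for capacitances in series, and values add in parallel. The sketch below applies them to a hypothetical four-turn disk; the 100 pF turn-to-turn value is an illustrative assumption, not the paper's four-turn double-disk example.

```python
# Series-capacitance sketch for a winding section: n turn-to-turn
# capacitances in series combine as 1/C_eq = sum(1/C_i), so n equal
# capacitances C give C/n; parallel branches simply add. Values are
# illustrative (farads), not the example computed in the paper.

def series_capacitance(caps):
    return 1.0 / sum(1.0 / c for c in caps)

def parallel_capacitance(caps):
    return sum(caps)

# One disk: four equal turn-to-turn capacitances of 100 pF in series
disk = series_capacitance([100e-12] * 4)        # 100 pF / 4 = 25 pF

# Two such disks side by side act (roughly) in parallel
winding = parallel_capacitance([disk, disk])    # 25 pF + 25 pF = 50 pF
print(round(disk * 1e12, 1), round(winding * 1e12, 1))
```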

  4. Modelling trends in climatic time series using the state space approach

    NASA Astrophysics Data System (ADS)

    Laine, Marko; Kyrölä, Erkki

    2014-05-01

    A typical feature of atmospheric time series is that they are not stationary but exhibit both slowly varying and abrupt changes in their distributional properties. These are caused by external forcing such as changes in solar activity or volcanic eruptions. Further, the data sampling is often nonuniform, there are data gaps, and the uncertainty of the observations can vary. When observations are combined from various sources, there will be instrument- and retrieval-method-related biases, and the differences in sampling introduce additional uncertainties. Dynamic regression with a state-space representation of the underlying processes provides flexible tools for meeting these challenges in the analysis. By explicitly allowing for variability in the regression coefficients, we let the system properties change in time; this change can itself be modelled and estimated. Furthermore, the use of unobservable state variables allows modelling of the processes that drive the observed variability, such as seasonality or external forcing, and we can explicitly allow for some modelling error. The state-space approach provides a well-defined hierarchical statistical model for assessing trends, defined as long-term background changes in the time series. The modelling assumptions can be evaluated, and the method provides realistic uncertainty estimates for the model-based statements on the quantities of interest. We show that a dynamic linear model (DLM) provides a very flexible tool for trend and change-point analysis in time series. Given the structural parameters of the model, the Kalman filter and Kalman smoother formulas can be used to estimate the model states. Further, we provide an efficient way to account for structural parameter uncertainty by using an adaptive Markov chain Monte Carlo (MCMC) algorithm. The trend-related statistics can then be estimated by simulating realizations of the estimated processes with fully quantified uncertainties. This presentation will provide a

  5. Learning the Conditional Independence Structure of Stationary Time Series: A Multitask Learning Approach

    NASA Astrophysics Data System (ADS)

    Jung, Alexander

    2015-11-01

    We propose a method for inferring the conditional independence graph (CIG) of a high-dimensional Gaussian vector time series (a discrete-time process) from a finite-length observation. In contrast to existing approaches, we do not rely on a parametric process model (such as an autoregressive model) for the observed random process. Instead, we only require certain smoothness properties (in the Fourier domain) of the process. The proposed inference scheme works even for sample sizes much smaller than the number of scalar process components if the true underlying CIG is sufficiently sparse. A theoretical performance analysis provides conditions which guarantee that the probability of the proposed inference method delivering a wrong CIG is below a prescribed value. These conditions imply lower bounds on the sample size such that the new method is asymptotically consistent. Numerical experiments validate our theoretical performance analysis and demonstrate the superior performance of our scheme compared to an existing (parametric) approach in the case of model mismatch.

  6. Data-adaptive unfolding of nuclear excitation spectra: a time-series approach

    NASA Astrophysics Data System (ADS)

    Torres Vargas, G.; Fossion, R.; Velázquez, V.; López Vieyra, J. C.

    2014-03-01

    A common problem in the statistical characterization of the excitation spectrum of quantum systems is the adequate separation of global, system-dependent properties from the local fluctuations that are universal. In this process, called unfolding, the functional form used to describe the global behaviour is often imposed externally on the data and can introduce arbitrariness into the statistical results. In this contribution, we show that a quantum excitation spectrum can readily be interpreted as a time series, prior to any unfolding. An advantage of the time-series approach is that specialized methods such as Singular Spectrum Analysis (SSA) can be used to perform the unfolding procedure in a data-adaptive way. We show how SSA separates the components that describe the global properties from the components that describe the local fluctuations. The partial variances associated with the fluctuations follow a definite power law that distinguishes between soft and rigid excitation spectra. The data-adaptive fluctuation and trend components can be used to reconstruct customary fluctuation measures without the ambiguities or artifacts introduced by an arbitrary unfolding, and also to define the global level density of the excitation spectrum. The method is applied to nuclear shell-model calculations for 48Ca, using a realistic force and Two-Body Random Ensemble (TBRE) interactions. We show that the statistical results are very robust against variation of the parameters of the SSA method.

  7. Time-series analysis in operant research

    PubMed Central

    Jones, Richard R.; Vaught, Russell S.; Weinrott, Mark

    1977-01-01

    A time-series method is presented, nontechnically, for analysis of data generated in individual-subject operant studies, and is recommended as a supplement to visual analysis of behavior change in reversal or multiple-baseline experiments. The method can be used to identify three kinds of statistically significant behavior change: (a) changes in score levels from one experimental phase to another, (b) reliable upward or downward trends in scores, and (c) changes in trends between phases. The detection of, and reliance on, serial dependency (autocorrelation among temporally adjacent scores) in individual-subject behavioral scores is emphasized. Examples of published data from the operant literature are used to illustrate the time-series method. PMID:16795544
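The three kinds of change can be estimated with a phase-indicator regression. This plain-OLS sketch ignores the serial dependency that the article emphasizes (a proper analysis would model the autocorrelation); the data are invented:

```python
import numpy as np

def phase_change_fit(scores, phase):
    """Fit level + trend per phase: y = b0 + b1*t + b2*phase + b3*t*phase.
    b2 estimates the change in level, b3 the change in trend between phases.
    (Plain OLS; the published method also accounts for autocorrelation.)"""
    t = np.arange(len(scores), dtype=float)
    X = np.column_stack([np.ones_like(t), t, phase, t * phase])
    beta, *_ = np.linalg.lstsq(X, scores, rcond=None)
    return beta  # [intercept, baseline trend, level change, trend change]

# Baseline of 10 sessions, then an intervention raises the level by ~5.
rng = np.random.default_rng(0)
phase = np.r_[np.zeros(10), np.ones(10)]
y = 2.0 + 5.0 * phase + rng.normal(0, 0.3, 20)
beta = phase_change_fit(y, phase)
```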

  8. The application of complex network time series analysis in turbulent heated jets

    SciTech Connect

    Charakopoulos, A. K.; Karakasidis, T. E.; Liakopoulos, A.; Papanicolaou, P. N.

    2014-06-15

    In the present study, we applied the methodology of complex network-based time series analysis to experimental temperature time series from a vertical turbulent heated jet. More specifically, we approach the hydrodynamic problem of discriminating time series corresponding to various regions relative to the jet axis, i.e., distinguishing time series from regions close to the jet axis from time series originating in regions with a different dynamical regime, based on the constructed network properties. Applying the phase-space transformation method (k nearest neighbors) and also the visibility algorithm, we transformed time series into networks and evaluated topological properties of the networks such as degree distribution, average path length, diameter, modularity, and clustering coefficient. The results show that the complex network approach allows us to distinguish, identify, and explore in detail various dynamical regions of the jet flow, and to associate them with the corresponding physical behavior. In addition, in order to reject the hypothesis that the studied networks originate from a stochastic process, we generated random networks and compared their statistical properties with those originating from the experimental data. As far as the efficiency of the two methods for network construction is concerned, we conclude that both methodologies lead to network properties that present almost the same qualitative behavior and allow us to reveal the underlying system dynamics.
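Of the two construction methods mentioned, the (natural) visibility algorithm is the simpler to sketch: two samples are linked when the straight line between them clears every sample in between.

```python
def visibility_graph(x):
    """Natural visibility algorithm: nodes are samples (t, x_t); an edge
    (a, b) exists iff every sample strictly between them lies below the
    line of sight from (a, x_a) to (b, x_b)."""
    n = len(x)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                x[c] < x[a] + (x[b] - x[a]) * (c - a) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.add((a, b))
    return edges

def degrees(edges, n):
    """Node degrees of the resulting undirected graph."""
    deg = [0] * n
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    return deg

edges = visibility_graph([3.0, 1.0, 2.0, 0.5, 4.0])
```

Properties such as degree distribution, path length, and clustering are then read off the graph, as in the abstract.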

  9. Satellite time series analysis using Empirical Mode Decomposition

    NASA Astrophysics Data System (ADS)

    Pannimpullath, R. Renosh; Doolaeghe, Diane; Loisel, Hubert; Vantrepotte, Vincent; Schmitt, Francois G.

    2016-04-01

    Geophysical fields possess large fluctuations over many spatial and temporal scales. Successive satellite images provide an interesting sampling of this spatio-temporal multiscale variability. Here we propose to study such variability by performing satellite time series analysis, pixel by pixel, using Empirical Mode Decomposition (EMD). EMD is a time series analysis technique able to decompose an original time series into a sum of modes, each one having a different mean frequency. It can be used to smooth signals and to extract trends. It is built in a data-adaptive way, and is able to extract information from nonlinear signals. Here we use MERIS Suspended Particulate Matter (SPM) data, on a weekly basis, over 10 years, giving 458 successive time steps. We have selected 5 different regions of coastal waters for the present study: Vietnam coastal waters, the Brahmaputra region, the St. Lawrence, the English Channel and the McKenzie. These regions have high SPM concentrations due to large-scale river run-off. Trend and Hurst exponents are derived for each pixel in each region. The energy of each mode is also extracted using Hilbert Spectral Analysis (HSA) together with the EMD method, and is normalised, for each region, by the total energy computed over all the modes extracted by EMD.
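A minimal sifting sketch of EMD (cubic-spline envelopes of local extrema; dedicated implementations such as PyEMD add careful boundary handling and stopping criteria, and the demo signal is invented):

```python
import numpy as np
from scipy.interpolate import CubicSpline

def _mean_envelope(h, t):
    # Local extrema; we need a handful of each to build spline envelopes.
    maxi = [i for i in range(1, len(h) - 1) if h[i] >= h[i - 1] and h[i] > h[i + 1]]
    mini = [i for i in range(1, len(h) - 1) if h[i] <= h[i - 1] and h[i] < h[i + 1]]
    if len(maxi) < 4 or len(mini) < 4:
        return None
    upper = CubicSpline(t[maxi], h[maxi])(t)
    lower = CubicSpline(t[mini], h[mini])(t)
    return (upper + lower) / 2

def emd(x, t, max_imfs=6, n_sifts=8):
    """Minimal EMD: repeatedly sift out the fastest oscillation (an IMF),
    leaving a smoother residual.  By construction the IMFs plus the
    residual sum back to the original signal."""
    imfs, resid = [], x.astype(float).copy()
    for _ in range(max_imfs):
        if _mean_envelope(resid, t) is None:   # residual ~ monotone trend: stop
            break
        h = resid.copy()
        for _ in range(n_sifts):
            m = _mean_envelope(h, t)
            if m is None:
                break
            h = h - m
        imfs.append(h)
        resid = resid - h
    return imfs, resid

t = np.linspace(0.0, 1.0, 512)
x = np.sin(2 * np.pi * 10 * t) + 2 * t       # oscillation + trend
imfs, resid = emd(x, t)
```

The residual plays the role of the trend, and mode energies (as in the abstract's HSA step) can be computed mode by mode.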

  10. Scaling detection in time series: diffusion entropy analysis.

    PubMed

    Scafetta, Nicola; Grigolini, Paolo

    2002-09-01

    The methods currently used to determine the scaling exponent of a complex dynamic process described by a time series are based on the numerical evaluation of variance. This means that all of them can be safely applied only to the case where ordinary statistical properties hold true even if strange kinetics are involved. We illustrate a method of statistical analysis based on the Shannon entropy of the diffusion process generated by the time series, called diffusion entropy analysis (DEA). We adopt artificial Gauss and Lévy time series, as prototypes of ordinary and anomalous statistics, respectively, and we analyze them with the DEA and four ordinary methods of analysis, some of which are very popular. We show that the DEA determines the correct scaling exponent even when the statistical properties, as well as the dynamic properties, are anomalous. The other four methods produce correct results in the Gauss case but fail to detect the correct scaling in the case of Lévy statistics. PMID:12366207
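The DEA recipe (Shannon entropy of the diffusion process generated by window sums, plotted against the logarithm of the window length) can be sketched as follows; the window lengths and bin count are illustrative choices:

```python
import numpy as np

def diffusion_entropy(x, window_lengths, bins=60):
    """Diffusion Entropy Analysis: differential Shannon entropy of the
    window-sum 'diffusion' variable as a function of window length l.
    For a scaling process S(l) = A + delta*ln(l); the slope delta is the
    scaling exponent (0.5 for ordinary Gaussian diffusion)."""
    s = []
    csum = np.concatenate([[0.0], np.cumsum(x)])
    for l in window_lengths:
        y = csum[l:] - csum[:-l]              # all window sums of length l
        counts, edges = np.histogram(y, bins=bins)
        w = edges[1] - edges[0]
        p = counts[counts > 0] / counts.sum()
        s.append(-(p * np.log(p)).sum() + np.log(w))
    delta = np.polyfit(np.log(window_lengths), s, 1)[0]
    return delta, np.array(s)

rng = np.random.default_rng(1)
x = rng.normal(size=200_000)                  # ordinary (Gauss) prototype
delta, _ = diffusion_entropy(x, window_lengths=[10, 20, 40, 80, 160])
```

For the Gaussian prototype the slope recovers the ordinary scaling exponent 0.5; the point of DEA is that the same entropy-based slope remains correct for Lévy-type anomalous statistics where variance-based methods fail.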

  11. Time series analysis as a tool for karst water management

    NASA Astrophysics Data System (ADS)

    Fournier, Matthieu; Massei, Nicolas; Duran, Léa

    2015-04-01

    Karst hydrosystems are well known for their vulnerability to turbidity due to their complex and unique characteristics, which make them very different from other aquifers. Moreover, many parameters can affect their functioning. This makes the characterization of their vulnerability difficult and requires the use of statistical analyses. Time series analyses of turbidity, electrical conductivity and water discharge datasets, such as correlation and spectral analyses, have proven useful in improving our understanding of karst systems. However, the loss of information on time localization is a major drawback of those Fourier spectral methods; this problem has been overcome by the development of wavelet analysis (continuous or discrete) for hydrosystems, offering the possibility to better characterize the complex modalities of variation inherent to non-stationary processes. Nevertheless, in the wavelet transform the signal is decomposed into continuous wavelet components, which may not be appropriate for the local-time processes frequently observed in karst aquifers. More recently, a new approach associating empirical mode decomposition and the Hilbert transform was presented for hydrosystems. It allows an orthogonal decomposition of the analyzed signal and provides a more accurate estimation of changing variability scales across time for highly transient signals. This study aims to identify the natural and anthropogenic parameters which control the turbidity released at a well used for drinking water supply. The well is located in the chalk karst aquifer near the Seine river, 40 km from the Seine estuary in the western Paris Basin. At this location, tidal variations greatly affect the level of the water in the Seine. Continuous wavelet analysis of the turbidity dataset has been used to decompose turbidity release at the well into three components: i) the rain event periods, ii) the pumping periods and iii) the tidal range of the Seine river. 
Time-domain reconstruction by inverse wavelet transform allows

  12. KALREF—A Kalman filter and time series approach to the International Terrestrial Reference Frame realization

    NASA Astrophysics Data System (ADS)

    Wu, Xiaoping; Abbondanza, Claudio; Altamimi, Zuheir; Chin, T. Mike; Collilieux, Xavier; Gross, Richard S.; Heflin, Michael B.; Jiang, Yan; Parker, Jay W.

    2015-05-01

    The current International Terrestrial Reference Frame is based on a piecewise linear site motion model and realized by reference epoch coordinates and velocities for a global set of stations. Although linear motions due to tectonic plates and glacial isostatic adjustment dominate geodetic signals, at today's millimeter precisions, nonlinear motions due to earthquakes, volcanic activities, ice mass losses, sea level rise, hydrological changes, and other processes become significant. Monitoring these (sometimes rapid) changes demands consistent and precise realization of the terrestrial reference frame (TRF) quasi-instantaneously. Here, we use a Kalman filter and smoother approach to combine time series from four space geodetic techniques to realize an experimental TRF through weekly time series of geocentric coordinates. In addition to secular, periodic, and stochastic components for station coordinates, the Kalman filter state variables also include daily Earth orientation parameters and transformation parameters from input data frames to the combined TRF. Local tie measurements among colocated stations are used at their known or nominal epochs of observation, with comotion constraints applied to almost all colocated stations. The filter/smoother approach unifies different geodetic time series in a single geocentric frame. Fragmented and multitechnique tracking records at colocation sites are bridged together to form longer and coherent motion time series. While the time series approach to TRF reflects the reality of a changing Earth more closely than the linear approximation model, the filter/smoother is computationally powerful and flexible enough to facilitate incorporation of other data types and more advanced characterization of the stochastic behavior of geodetic time series.
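A minimal single-component sketch of the filtering idea (constant-velocity state, position-only observations; the KALREF filter additionally carries periodic and stochastic terms, Earth orientation parameters, and frame transformation parameters, and the noise settings here are invented):

```python
import numpy as np

def kalman_cv(z, dt=1.0, q=1e-4, r=1.0):
    """Constant-velocity Kalman filter for one coordinate component:
    state = [position, velocity]; z are noisy weekly positions."""
    F = np.array([[1.0, dt], [0.0, 1.0]])        # state transition
    H = np.array([[1.0, 0.0]])                   # we observe position only
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],    # white-acceleration noise
                      [dt**2 / 2, dt]])
    R = np.array([[r]])
    xhat = np.array([z[0], 0.0])
    P = np.eye(2)
    out = []
    for zk in z:
        # predict
        xhat = F @ xhat
        P = F @ P @ F.T + Q
        # update
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        xhat = xhat + K @ (np.array([zk]) - H @ xhat)
        P = (np.eye(2) - K @ H) @ P
        out.append(xhat.copy())
    return np.array(out)

rng = np.random.default_rng(2)
t = np.arange(300.0)
truth = 0.05 * t                       # secular motion, 0.05 units/step
z = truth + rng.normal(0, 1.0, t.size)
est = kalman_cv(z)
```

A Rauch-Tung-Striebel backward pass over the same model would give the filter/smoother combination the abstract refers to.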

  13. Time series analysis using semiparametric regression on oil palm production

    NASA Astrophysics Data System (ADS)

    Yundari, Pasaribu, U. S.; Mukhaiyar, U.

    2016-04-01

    This paper presents a semiparametric kernel regression method, which has shown flexibility and ease of mathematical calculation, especially in estimating density and regression functions. The kernel function is continuous and produces a smooth estimate. The classical kernel density estimator is constructed by completely nonparametric analysis and works reasonably well for all functional forms. Here, we discuss parameter estimation in time series analysis. First, we assume that the parameters exist; we then combine this with nonparametric estimation, an approach called semiparametric. The optimal bandwidth is selected by minimising an approximation of the Mean Integrated Squared Error (MISE).
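A sketch of Nadaraya-Watson kernel regression with the bandwidth chosen by leave-one-out cross-validation (a practical stand-in for the MISE-based selection mentioned above; the candidate grid and test function are invented):

```python
import numpy as np

def nw_regress(x, y, x_eval, h):
    """Nadaraya-Watson kernel regression with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

def loo_cv_bandwidth(x, y, candidates):
    """Pick the bandwidth minimising leave-one-out squared error."""
    best_h, best_err = None, np.inf
    for h in candidates:
        w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
        np.fill_diagonal(w, 0.0)          # leave the point itself out
        yhat = (w @ y) / w.sum(axis=1)
        err = np.mean((y - yhat) ** 2)
        if err < best_err:
            best_h, best_err = h, err
    return best_h

rng = np.random.default_rng(3)
x = np.sort(rng.uniform(0, 1, 300))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, x.size)
h = loo_cv_bandwidth(x, y, candidates=[0.005, 0.02, 0.05, 0.15, 0.5])
yhat = nw_regress(x, y, x, h)
```

Cross-validation steers the bandwidth away from both the undersmoothed and the oversmoothed extremes of the candidate grid.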

  14. Multi-complexity ensemble measures for gait time series analysis: application to diagnostics, monitoring and biometrics.

    PubMed

    Gavrishchaka, Valeriy; Senyukova, Olga; Davis, Kristina

    2015-01-01

    Previously, we have proposed to use complementary complexity measures discovered by boosting-like ensemble learning for the enhancement of quantitative indicators dealing with necessarily short physiological time series. We have confirmed the robustness of such multi-complexity measures for heart rate variability analysis, with an emphasis on the detection of emerging and intermittent cardiac abnormalities. Recently, we presented preliminary results suggesting that such an ensemble-based approach could also be effective in discovering universal meta-indicators for early detection and convenient monitoring of neurological abnormalities using gait time series. Here, we argue and demonstrate that these multi-complexity ensemble measures for gait time series analysis could have a significantly wider application scope, ranging from diagnostics and early detection of physiological regime change to gait-based biometrics applications.

  15. Metagenomics meets time series analysis: unraveling microbial community dynamics.

    PubMed

    Faust, Karoline; Lahti, Leo; Gonze, Didier; de Vos, Willem M; Raes, Jeroen

    2015-06-01

    The recent increase in the number of microbial time series studies offers new insights into the stability and dynamics of microbial communities, from the world's oceans to human microbiota. Dedicated time series analysis tools allow taking full advantage of these data. Such tools can reveal periodic patterns, help to build predictive models or, on the contrary, quantify irregularities that make community behavior unpredictable. Microbial communities can change abruptly in response to small perturbations, linked to changing conditions or the presence of multiple stable states. With sufficient samples or time points, such alternative states can be detected. In addition, temporal variation of microbial interactions can be captured with time-varying networks. Here, we apply these techniques on multiple longitudinal datasets to illustrate their potential for microbiome research.

  16. Multifractal analysis of validated wind speed time series

    NASA Astrophysics Data System (ADS)

    García-Marín, A. P.; Estévez, J.; Jiménez-Hornero, F. J.; Ayuso-Muñoz, J. L.

    2013-03-01

    Multifractal properties of 30 min wind data series recorded at six locations in Cadiz (Southern Spain) have been studied in this work with the aim of obtaining detailed information for a range of time scales. Wind speed records have been validated, applying various quality control tests as a pre-requisite before their use, improving the reliability of the results due to the identification of incorrect values which have been discarded in the analysis. The scaling of the wind speed moments has been analysed and empirical moments scaling exponent functions K(q) have been obtained. Although the same critical moment (qcrit) has been obtained for all the places, some differences appear in other multifractal parameters like γmax and the value of K(0). These differences have been related to the presence of extreme events and zero data values in the data series analysed, respectively.

  17. Diagnosis of nonlinear systems using time series analysis

    SciTech Connect

    Hunter, N.F. Jr.

    1991-01-01

    Diagnosis and analysis techniques for linear systems have been developed and refined to a high degree of precision. In contrast, techniques for the analysis of data from nonlinear systems are in the early stages of development. This paper describes a time series technique for the analysis of data from nonlinear systems. The input and response time series resulting from excitation of the nonlinear system are embedded in a state space. The form of the embedding is optimized using local canonical variate analysis and singular value decomposition techniques. From the state space model, future system responses are estimated. The expected degree of predictability of the system is investigated using the state transition matrix. The degree of nonlinearity present is quantified using the geometry of the transfer function poles in the z plane. Examples of application to a linear single-degree-of-freedom system, a single-degree-of-freedom Duffing Oscillator, and linear and nonlinear three degree of freedom oscillators are presented. 11 refs., 9 figs.
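The embedding step described above can be sketched generically (a plain time-delay embedding; the paper's optimized canonical-variate/SVD version is more elaborate, and the sine demo is illustrative):

```python
import numpy as np

def embed(x, dim, delay):
    """Time-delay embedding: map a scalar series into a dim-dimensional
    state space with vectors (x_t, x_{t+delay}, ..., x_{t+(dim-1)*delay})."""
    n = len(x) - (dim - 1) * delay
    return np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])

t = np.arange(2000) * 0.05
x = np.sin(t)
Y = embed(x, dim=2, delay=31)   # delay ~ a quarter period of the sine
```

For a pure sine with a quarter-period delay the reconstructed trajectory is (nearly) a circle, i.e. the state space recovers the oscillator's phase portrait.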

  18. Enveloping Spectral Surfaces: Covariate Dependent Spectral Analysis of Categorical Time Series.

    PubMed

    Krafty, Robert T; Xiong, Shuangyan; Stoffer, David S; Buysse, Daniel J; Hall, Martica

    2012-09-01

    Motivated by problems in Sleep Medicine and Circadian Biology, we present a method for the analysis of cross-sectional categorical time series collected from multiple subjects where the effect of static continuous-valued covariates is of interest. Toward this goal, we extend the spectral envelope methodology for the frequency domain analysis of a single categorical process to cross-sectional categorical processes that are possibly covariate dependent. The analysis introduces an enveloping spectral surface for describing the association between the frequency domain properties of qualitative time series and covariates. The resulting surface offers an intuitively interpretable measure of association between covariates and a qualitative time series by finding the maximum possible conditional power at a given frequency from scalings of the qualitative time series conditional on the covariates. The optimal scalings that maximize the power provide scientific insight by identifying the aspects of the qualitative series which have the most pronounced periodic features at a given frequency conditional on the value of the covariates. To facilitate the assessment of the dependence of the enveloping spectral surface on the covariates, we include a theory for analyzing the partial derivatives of the surface. Our approach is entirely nonparametric, and we present estimation and asymptotics in the setting of local polynomial smoothing.

  19. Investigation of Bank Filtration in Gravel and Sand Aquifers using Time-Series Analysis

    NASA Astrophysics Data System (ADS)

    Vogt, T.; Hoehn, E.; Schneider, P.; Cirpka, O. A.

    2009-04-01

    Drinking-water wells in the vicinity of rivers may be influenced by infiltration of river water. In the context of drinking-water protection, the decisive questions concern the fraction of river infiltrate in the pumped water and the residence time in the aquifer. For this purpose, tracer experiments may be performed. At larger rivers, however, such tests require the injection of large amounts of the tracer. As an alternative to artificial-tracer tests, we present methods in which time series of electric conductivity and temperature are used for quantitative statements regarding mixing ratios and residence times. We recommend a multi-step approach consisting of: (1) a qualitative analysis of the time series, (2) the spectral determination of the seasonal temperature and conductivity signals, (3) a cross-correlation analysis, and (4) the non-parametric deconvolution of the time series. We apply these methods to two sites in the aquifer of the Thur valley in the Swiss Plateau. At sites without a good connection between river and groundwater, or where the river gains groundwater, the elaborate methods of time-series analysis are not applicable, but the time series indicate such conditions. At sites with continuous river-water infiltration, we can reconstruct the breakthrough curve of a tracer test without releasing an artificial tracer into the river.
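Step (3) of the multi-step approach, the cross-correlation analysis, might look like the following sketch on synthetic data; the 12-day delay, mixing fraction, and noise levels are invented for illustration:

```python
import numpy as np

def xcorr_lag(upstream, downstream, max_lag):
    """Lag (in samples) at which the cross-correlation between the river
    signal and the well signal peaks -- a rough residence-time estimate."""
    a = (upstream - upstream.mean()) / upstream.std()
    b = (downstream - downstream.mean()) / downstream.std()
    lags = np.arange(1, max_lag + 1)
    cc = np.array([np.mean(a[:-k] * b[k:]) for k in lags])
    return int(lags[np.argmax(cc)]), cc

# Synthetic daily temperatures: the well sees the river signal delayed
# by 12 days, attenuated and with added local noise.
rng = np.random.default_rng(4)
t = np.arange(730)
river = 10 + 8 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 2.0, t.size)
well = np.empty_like(river)
well[12:] = 0.4 * river[:-12]
well[:12] = well[12]
well = well + rng.normal(0, 0.2, t.size)
lag, cc = xcorr_lag(river, well, max_lag=60)
```

The peak lag recovers the imposed residence time; step (4), the non-parametric deconvolution, would refine this into a full transfer function rather than a single lag.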

  20. Characterization of chaos in air pollutants: A Volterra-Wiener-Korenberg series and numerical titration approach

    NASA Astrophysics Data System (ADS)

    Kumar, Ujjwal; Prakash, Amit; Jain, V. K.

    The present study attempts to provide an insight into the chaotic nature of air pollutants by applying recent developments in the field of nonlinear dynamics. The Volterra-Wiener-Korenberg (VWK) series approach of Barahona and Poon [1996. Detection of nonlinear dynamics in short, noisy time series. Nature 381, 215-217] has been used to investigate the nonlinearity of O3, NO, NO2 and CO time series at two stations, namely Hohenpeissenberg and Jungfraujoch. Nonlinearity has been detected in the NO2 and CO time series at both stations. The numerical titration technique [Poon, C., Barahona, M., 2001. Titration of chaos with added noise. Proceedings of the National Academy of Sciences 98, 7107-7112] reveals that the dynamics of NO2 and CO are indeed governed by deterministic chaos. Cao's method [Cao, L., 1997. Practical method for determining the minimum embedding dimension of a scalar time series. Physica D 110, 43-50] for determining the minimum embedding dimension further reveals that the dynamics of both the NO2 and CO time series are probably manifestations of high-dimensional chaos. It is interesting to note that similar chaotic characteristics of the NO2 and CO time series have been observed at both sites, indicating a possible universality in their chaotic nature in the ambient environment.

  1. Phase synchronization based minimum spanning trees for analysis of financial time series with nonlinear correlations

    NASA Astrophysics Data System (ADS)

    Radhakrishnan, Srinivasan; Duvvuru, Arjun; Sultornsanee, Sivarit; Kamarthi, Sagar

    2016-02-01

    The cross correlation coefficient has been widely applied in financial time series analysis, specifically for understanding chaotic behaviour in terms of stock price and index movements during crisis periods. To better understand time series correlation dynamics, the cross correlation matrices are represented as networks, in which a node stands for an individual time series and a link indicates cross correlation between a pair of nodes. These networks are converted into simpler trees using different schemes. In this context, Minimum Spanning Trees (MST) are the most favoured tree structures because of their ability to preserve all the nodes and thereby retain essential information embedded in the network. Although the cross correlations underlying MSTs capture essential information, they do not faithfully capture the dynamic behaviour embedded in the time series data of financial systems, because cross correlation is a reliable measure only if the relationship between the time series is linear. To address this issue, this work investigates a new measure called phase synchronization (PS) for establishing correlations among different time series which relate to one another, linearly or nonlinearly. In this approach the strength of a link between a pair of time series (nodes) is determined by the level of phase synchronization between them. We compare the performance of the phase synchronization based MST with the cross correlation based MST along selected network measures across a temporal frame that includes economically good and crisis periods. We observe agreement in the directionality of the results across these two methods. They show similar trends, upward or downward, when comparing selected network measures. 
Though both the methods give similar trends, the phase synchronization based MST is a more reliable representation of the dynamic behaviour of financial systems than the cross correlation based MST because of the former's ability to quantify nonlinear relationships among time
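A sketch of the cross-correlation-based MST construction that this work takes as its baseline, using the common distance d_ij = sqrt(2 (1 - rho_ij)); the two-factor toy "market" is invented for illustration:

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def correlation_mst(returns):
    """Build an MST from a correlation matrix via the distance
    d_ij = sqrt(2 * (1 - rho_ij))."""
    rho = np.corrcoef(returns)
    d = np.sqrt(2.0 * (1.0 - rho))
    np.fill_diagonal(d, 0.0)
    mst = minimum_spanning_tree(d)        # sparse (n x n) with n-1 edges
    return rho, mst

# Toy market: assets 0-2 share factor A, assets 3-5 share factor B.
rng = np.random.default_rng(5)
fA, fB = rng.normal(size=(2, 1000))
returns = np.vstack([fA + 0.3 * rng.normal(size=(3, 1000)),
                     fB + 0.3 * rng.normal(size=(3, 1000))])
rho, mst = correlation_mst(returns)
```

The MST keeps every node while reducing the dense correlation network to its backbone; here the two factor groups end up connected by exactly one bridging edge. A phase-synchronization-based MST would replace rho with a synchronization strength before the same tree-extraction step.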

  2. Time series analysis for psychological research: examining and forecasting change

    PubMed Central

    Jebb, Andrew T.; Tay, Louis; Wang, Wei; Huang, Qiming

    2015-01-01

    Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials. PMID:26106341

  4. Chaotic time series analysis in economics: Balance and perspectives

    SciTech Connect

    Faggini, Marisa

    2014-12-15

    The aim of the paper is not to review the large body of work concerning nonlinear time series analysis in economics, about which much has been written, but rather to focus on the new techniques developed to detect chaotic behaviours in economic data. More specifically, our attention will be devoted to reviewing some of these techniques and their application to economic and financial data in order to understand why chaos theory, after a period of growing interest, appears now not to be such an interesting and promising research area.

  5. Automatic change detection in time series of Synthetic Aperture Radar data using a scale-driven approach

    NASA Astrophysics Data System (ADS)

    Ajadi, O. A.; Meyer, F. J.

    2013-12-01

    Automatic change detection and change classification from Synthetic Aperture Radar (SAR) images is a difficult task, mostly due to the high level of speckle noise inherent to SAR data and the highly non-Gaussian nature of the SAR amplitude information. Several approaches have been developed in recent years to deal with SAR-specific change detection problems from image pairs and time series of images. Despite these considerable efforts, no satisfying solution to this problem has been found so far. In this paper we present a promising new algorithm for change detection from SAR that is based on a multi-scale analysis of a time series of SAR images. Our approach is composed of three steps, including (1) data enhancement and filtering, (2) multi-scale change detection, and (3) time-series analysis of change detection maps. In the data enhancement and filtering step, we first form time series of ratio images by dividing all SAR images by a reference acquisition to suppress stationary image information and enhance change signatures. Several methods for reference image selection will be discussed in the paper. The generated ratio images are further log-transformed to create near-Gaussian data and to convert the originally multiplicative noise into additive noise. A subsequent fast non-local mean filter is applied to reduce image noise whilst preserving most of the image details. The filtered log-ratio images are then inserted into a multi-scale change detection algorithm that is composed of: (1) a multi-scale decomposition of the input images using a two-dimensional discrete stationary wavelet transform (2D-SWT); (2) a multi-resolution classification into 'change' and 'no-change' areas; and (3) a scale-driven fusion of the classification results. In a final time-series analysis step the multi-temporal change detection maps are analyzed to identify seasonal, gradual, and abrupt changes. The performance of the developed approach will be demonstrated by application to the
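The ratio-then-log step of the data-enhancement stage can be sketched as follows; a bare k-sigma threshold stands in for the non-local-mean filtering and wavelet-based multiscale fusion described above, and the simulated 16-look speckle and change factor are invented:

```python
import numpy as np

def log_ratio_change_map(img, ref, k=3.0):
    """Log-ratio against a reference acquisition turns multiplicative
    speckle into additive, near-Gaussian noise; a k-sigma threshold
    then flags 'change' pixels."""
    lr = np.log(img) - np.log(ref)
    mu, sigma = lr.mean(), lr.std()
    return np.abs(lr - mu) > k * sigma

# Synthetic multi-looked SAR intensities: gamma-distributed multiplicative
# speckle on a flat background, with one bright changed patch.
rng = np.random.default_rng(6)
sz = (128, 128)
ref = rng.gamma(16.0, 1.0 / 16.0, sz)       # ~16-look speckle, mean 1
img = rng.gamma(16.0, 1.0 / 16.0, sz)
img[40:60, 40:60] *= 6.0                    # simulated backscatter change
mask = log_ratio_change_map(img, ref)
```

On the log-ratio image the changed patch stands out as an additive offset, so even this crude threshold recovers it with few false alarms; the paper's wavelet decomposition and scale-driven fusion make the decision robust across scales.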

  6. On-line analysis of reactor noise using time-series analysis

    SciTech Connect

    McGevna, V.G.

    1981-10-01

    A method to allow use of time series analysis for on-line noise analysis has been developed. On-line analysis of noise in nuclear power reactors has been limited primarily to spectral analysis and related frequency domain techniques. Time series analysis has many distinct advantages over spectral analysis in the automated processing of reactor noise. However, fitting an autoregressive-moving average (ARMA) model to time series data involves non-linear least squares estimation. Unless a high speed, general purpose computer is available, the calculations become too time consuming for on-line applications. To eliminate this problem, a special purpose algorithm was developed for fitting ARMA models. While it is based on a combination of steepest descent and Taylor series linearization, properties of the ARMA model are used so that the auto- and cross-correlation functions can be used to eliminate the need for estimating derivatives.
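In the same correlation-function spirit (though the paper's special-purpose algorithm fits full ARMA models with a steepest-descent/linearization scheme), a pure AR(p) model can be fitted from the sample autocorrelations alone via the Yule-Walker equations, with no derivative estimation at all:

```python
import numpy as np

def yule_walker_ar(x, order):
    """Fit an AR(p) model by solving the Yule-Walker equations built
    from sample autocovariances -- a linear-algebra-only fit."""
    x = x - x.mean()
    n = len(x)
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:order + 1])

# Simulated AR(2) with known coefficients 0.6 and -0.3.
rng = np.random.default_rng(7)
e = rng.normal(size=50_000)
x = np.zeros_like(e)
for t in range(2, len(e)):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + e[t]
phi = yule_walker_ar(x, order=2)
```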

  7. Patient Prognosis from Vital Sign Time Series: Combining Convolutional Neural Networks with a Dynamical Systems Approach

    PubMed Central

    Lehman, Li-wei; Ghassemi, Mohammad; Snoek, Jasper; Nemati, Shamim

    2016-01-01

    In this work, we propose a stacked switching vector-autoregressive (SVAR)-CNN architecture to model the changing dynamics in physiological time series for patient prognosis. The SVAR-layer extracts dynamical features (or modes) from the time-series, which are then fed into the CNN-layer to extract higher-level features representative of transition patterns among the dynamical modes. We evaluate our approach using 8-hours of minute-by-minute mean arterial blood pressure (BP) from over 450 patients in the MIMIC-II database. We modeled the time-series using a third-order SVAR process with 20 modes, resulting in first-level dynamical features of size 20×480 per patient. A fully connected CNN is then used to learn hierarchical features from these inputs, and to predict hospital mortality. The combined CNN/SVAR approach using BP time-series achieved a median and interquartile-range AUC of 0.74 [0.69, 0.75], significantly outperforming CNN-alone (0.54 [0.46, 0.59]), and SVAR-alone with logistic regression (0.69 [0.65, 0.72]). Our results indicate that including an SVAR layer improves the ability of CNNs to classify nonlinear and nonstationary time-series.

  8. Tidal Analysis of Very Long Gravity Time Series

    NASA Astrophysics Data System (ADS)

    Calvo, M.; Hinderer, J.; Rosat, S.; Legros, H.; Boy, J.; Riccardi, U.; Ducarme, B.; Zuern, W. E.

    2012-12-01

    We report on the tidal analyses carried out on very long gravity time series collected at three European permanent gravity observatories. According to the Nyquist criterion, very long gravity series enable a high-resolution spectral analysis in the tidal bands, allowing us to separate small-amplitude waves in the major tidal groups and also to attempt to detect very long period (18.6 and 9 yr) tides that have never been observed in gravity data. For this study we use two long data sets recorded by spring gravimeters at BFO (Germany) (1980-2012) and at Walferdange (Luxemburg) (1980-1995), as well as two time series (1987-1996 and 1996-2012) from two superconducting gravimeters located at the Strasbourg station (France). It is well known that temporal changes of the instrumental sensitivity may introduce a related error in the tidal analysis. Hence the sensitivity of each instrument is investigated using the temporal variations of the delta factor for the main tidal waves (O1, K1, M2, and S2) as well as the M2/O1 delta factor ratio. Our findings demonstrate that the lack of long-term stability of the spring instruments precludes more detailed spectral analysis; on the contrary, promising results have been obtained from gravity data collected by the two superconducting gravimeters operating at different consecutive epochs at Strasbourg. We checked the stability of the instrumental sensitivity using numerous calibration experiments carried out during the last 15 years by co-located absolute gravity measurements. It turns out that the SG stability is much better than the one that can be achieved by SG/AG calibration repetitions. The observed temporal evolution of the tidal delta factors in Strasbourg is also compared with results from other European SG stations. Finally, we compare the observed parameters with those theoretically estimated from solid Earth tide models. The results demonstrate that long series of precise SG observations are a powerful

  9. Classification of cardiovascular time series based on different coupling structures using recurrence networks analysis.

    PubMed

    Ramírez Ávila, Gonzalo Marcelo; Gapelyuk, Andrej; Marwan, Norbert; Walther, Thomas; Stepan, Holger; Kurths, Jürgen; Wessel, Niels

    2013-08-28

    We analyse cardiovascular time series with the aim of performing early prediction of preeclampsia (PE), a pregnancy-specific disorder causing maternal and foetal morbidity and mortality. The analysis is made using a novel approach, namely the ε-recurrence networks applied to a phase space constructed by means of the time series of the variabilities of the heart rate and the blood pressure (systolic and diastolic). All the possible coupling structures among these variables are considered for the analysis. Network measures such as average path length, mean coreness, global clustering coefficient and scale-local transitivity dimension are computed and constitute the parameters for the subsequent quadratic discriminant analysis. This allows us to predict PE with a sensitivity of 91.7 per cent and a specificity of 68.1 per cent, thus validating the use of this method for classifying healthy and preeclamptic patients. PMID:23858486
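The network construction step described here, thresholding pairwise phase-space distances at a radius ε, can be sketched compactly. The following is not the authors' pipeline (which embeds heart rate and blood pressure variabilities jointly); it is a minimal scalar illustration in Python/NumPy with hypothetical function names, showing how an ε-recurrence adjacency matrix and one of the cited measures, the global clustering coefficient, could be computed:

```python
import numpy as np

def recurrence_network(x, eps):
    """Adjacency matrix of an epsilon-recurrence network for a scalar series:
    states are linked when their distance is below eps (no self-loops)."""
    x = np.asarray(x, dtype=float)
    d = np.abs(x[:, None] - x[None, :])
    return (d < eps) & ~np.eye(len(x), dtype=bool)

def global_clustering(a):
    """Global (transitivity) clustering coefficient: 3 * triangles / triples."""
    a = a.astype(int)
    triangles = np.trace(a @ a @ a) / 6.0   # each triangle gives 6 closed walks
    deg = a.sum(axis=1)
    triples = (deg * (deg - 1) / 2).sum()
    return 3.0 * triangles / triples if triples else 0.0
```

Average path length and coreness, the other measures used in the paper, would be computed on the same adjacency matrix with standard graph routines.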

  10. Weighted permutation entropy based on different symbolic approaches for financial time series

    NASA Astrophysics Data System (ADS)

    Yin, Yi; Shang, Pengjian

    2016-02-01

    In this paper, we introduce weighted permutation entropy (WPE) and three different symbolic approaches to investigate the complexities of stock time series containing amplitude-coded information, and we explore the influence of the choice of symbolic approach on the resulting WPE. We apply WPE based on these symbolic approaches to the US and Chinese stock markets and compare the results. All three symbolic approaches reduce the complexity of the stock time series as measured by WPE, regardless of the embedding dimension. The similarity between these stock markets can be detected by WPE based on the Binary Δ-coding method, while the differences between them are revealed by WPE based on the σ-method and the Max-min method. Combining the σ-method or the Max-min method with WPE reflects the multiscale structure of complexity through different time delays and allows the differences between the complexities of stock time series to be analyzed in more detail and more accurately. Furthermore, the correlations between stock markets in the same region, and the similarities hidden in the S&P500 and DJI, and in ShangZheng and ShenCheng, are uncovered by comparing the Binary Δ-coding-based WPE of the six stock markets.
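Weighted permutation entropy itself is well defined independently of the symbolization choices compared in this entry: each embedding window contributes its ordinal pattern, weighted (in one common variant) by the variance of the window, so that large-amplitude structures count more than noise-level ones. A minimal sketch, with a hypothetical function name:

```python
import math
import numpy as np

def weighted_permutation_entropy(x, m=3, tau=1):
    """Weighted permutation entropy: ordinal patterns of embedding windows,
    each weighted by the variance of its window; normalized to [0, 1]."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (m - 1) * tau
    weights = {}
    for i in range(n):
        w = x[i:i + (m - 1) * tau + 1:tau]
        pattern = tuple(np.argsort(w))          # ordinal pattern of the window
        weights[pattern] = weights.get(pattern, 0.0) + np.var(w)
    total = sum(weights.values())
    if total == 0.0:                            # constant series: no information
        return 0.0
    probs = [v / total for v in weights.values()]
    h = -sum(p * math.log(p) for p in probs if p > 0)
    return h / math.log(math.factorial(m))      # normalize by log(m!)
```

A monotone series has a single ordinal pattern and therefore zero entropy, while an unstructured series approaches 1.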

  11. Hybrid grammar-based approach to nonlinear dynamical system identification from biological time series

    NASA Astrophysics Data System (ADS)

    McKinney, B. A.; Crowe, J. E., Jr.; Voss, H. U.; Crooke, P. S.; Barney, N.; Moore, J. H.

    2006-02-01

    We introduce a grammar-based hybrid approach to reverse engineering nonlinear ordinary differential equation models from observed time series. This hybrid approach combines a genetic algorithm to search the space of model architectures with a Kalman filter to estimate the model parameters. Domain-specific knowledge is used in a context-free grammar to restrict the search space for the functional form of the target model. We find that the hybrid approach outperforms a pure evolutionary algorithm method, and we observe features in the evolution of the dynamical models that correspond with the emergence of favorable model components. We apply the hybrid method to both artificially generated time series and experimentally observed protein levels from subjects who received the smallpox vaccine. From the observed data, we infer a cytokine protein interaction network for an individual’s response to the smallpox vaccine.

  12. Monthly hail time series analysis related to agricultural insurance

    NASA Astrophysics Data System (ADS)

    Tarquis, Ana M.; Saa, Antonio; Gascó, Gabriel; Díaz, M. C.; Garcia Moreno, M. R.; Burgaz, F.

    2010-05-01

    Hail is one of the most important crop insurance risks in Spain, accounting for more than 50% of total insurance in cereal crops. The purpose of the present study is to analyze hail damage in cereals. Four provinces were chosen, those with the highest production values: Burgos and Zaragoza for wheat, and Cuenca and Valladolid for barley. The data available for studying the evolution and intensity of hail damage include an analysis of the correlation between the agricultural insurance ratios provided by ENESA and the number of annual hail days (from 1981 to 2007). In addition, several weather stations per province were selected for having the longest and most complete records (from 1963 to 2007) to perform an analysis of the monthly time series of the number of hail days (HD). The results show that the relation between the agricultural insurance ratio and the number of hail days is not clear. Several observations are discussed to explain these results, as well as whether a change in tendency in the HD time series can be determined.

  13. Time series analysis of gold production in Malaysia

    NASA Astrophysics Data System (ADS)

    Muda, Nora; Hoon, Lee Yuen

    2012-05-01

    Gold is a soft, malleable, bright yellow metallic element that is unaffected by air and most reagents. It is highly valued as an asset or investment commodity and is extensively used in jewellery, industrial applications, dentistry and medicine. In Malaysia, gold mining is limited to several areas such as Pahang, Kelantan, Terengganu, Johor and Sarawak. The main purpose of this case study is to obtain a suitable model for the production of gold in Malaysia, which can also be used to predict Malaysia's future gold production. The Box-Jenkins time series method was used to perform the analysis in the following steps: identification, estimation, diagnostic checking and forecasting. In addition, the accuracy of prediction is tested using the mean absolute percentage error (MAPE). From the analysis, the ARIMA (3,1,1) model was found to be the best fitted model with a MAPE of 3.704%, indicating that the prediction is very accurate. Hence, this model can be used for forecasting. This study is expected to help the private and public sectors understand the gold production scenario and plan gold mining activities in Malaysia.
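The accuracy criterion quoted in this entry is straightforward to reproduce. As a sketch (not the authors' code), MAPE compares forecasts against actual values; a fitted ARIMA(3,1,1) from any standard time series library could supply the forecasts:

```python
def mape(actual, forecast):
    """Mean absolute percentage error, in percent; zero actuals are skipped
    to avoid division by zero."""
    pairs = [(a, f) for a, f in zip(actual, forecast) if a != 0]
    return 100.0 * sum(abs((a - f) / a) for a, f in pairs) / len(pairs)
```

A MAPE below roughly 10% is conventionally read as highly accurate forecasting, which is the interpretation the abstract gives to its 3.704% figure.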

  14. Time series clustering analysis of health-promoting behavior

    NASA Astrophysics Data System (ADS)

    Yang, Chi-Ta; Hung, Yu-Shiang; Deng, Guang-Feng

    2013-10-01

    Health promotion must be emphasized to achieve the World Health Organization goal of health for all. Since the global population is aging rapidly, the ComCare elder health-promoting service was developed by the Taiwan Institute for Information Industry in 2011. Based on the Pender health promotion model, the ComCare service offers five categories of health-promoting functions to address the everyday needs of seniors: nutrition management, social support, exercise management, health responsibility and stress management. To assess the overall ComCare service and to improve understanding of the health-promoting behavior of elders, this study analyzed health-promoting behavioral data automatically collected by the ComCare monitoring system. In the 30,638 session records collected for 249 elders from January 2012 to March 2013, behavior patterns were identified by a fuzzy c-means time series clustering algorithm combined with autocorrelation-based representation schemes. The analysis showed that the time series data for elder health-promoting behavior can be classified into four clusters, each revealing different health-promoting needs, frequencies, numbers of functions used and behaviors. The results can assist policymakers, health-care providers, and experts in medicine, public health, nursing and psychology, and have been provided to the Taiwan National Health Insurance Administration to assess elder health-promoting behavior.
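The clustering machinery named in this entry, an autocorrelation-based representation fed into fuzzy c-means, can be sketched generically. This is an illustrative Python/NumPy version with hypothetical names, not the ComCare system's implementation:

```python
import numpy as np

def autocorr_features(x, max_lag=5):
    """Represent a series by its first max_lag autocorrelation coefficients."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    denom = (x * x).sum()
    return np.array([(x[:-k] * x[k:]).sum() / denom
                     for k in range(1, max_lag + 1)])

def fuzzy_cmeans(data, c=2, m=2.0, iters=100, seed=0):
    """Plain fuzzy c-means; returns (membership matrix U, cluster centers)."""
    data = np.asarray(data, dtype=float)
    rng = np.random.default_rng(seed)
    u = rng.random((len(data), c))
    u /= u.sum(axis=1, keepdims=True)           # memberships sum to 1 per point
    for _ in range(iters):
        um = u ** m
        centers = (um.T @ data) / um.sum(axis=0)[:, None]
        d = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))           # standard membership update
        u = inv / inv.sum(axis=1, keepdims=True)
    return u, centers
```

Each elder's session series would first be mapped to its autocorrelation feature vector, and the fuzzy memberships would then define the behavior clusters.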

  15. Period analysis of hydrologic series through moving-window correlation analysis method

    NASA Astrophysics Data System (ADS)

    Xie, Yangyang; Huang, Qiang; Chang, Jianxia; Liu, Saiyan; Wang, Yimin

    2016-07-01

    Period analysis is of great significance for understanding various hydrologic processes and predicting the future hydrological regime of a watershed or region. Hence, many period analysis methods, including the fast Fourier transform (FFT), maximum entropy spectral analysis (MESA) and wavelet analysis (WA), have been developed to study this issue. However, due to the complex components of hydrologic series and the limitations of these conventional methods, the problem remains difficult to solve. In this paper, the moving-window correlation analysis method (MWCA) is proposed for analyzing the periodic components of hydrologic series; it includes construction of periodic processes, significance testing of periods and time-frequency analysis. Three commonly used methods (FFT, MESA and WA) and MWCA are employed to investigate the periods of synthetic series and observed hydrologic series, respectively. The results show that FFT, MESA and WA are not always as good as expected when detecting periods of a time series. By contrast, MWCA performs better: it can identify the true periods of a time series, extract the reliable periodic components, find the active time ranges of periodic components and resist the disturbance of noise. In conclusion, MWCA is suitable for identifying the true periods of hydrologic time series.
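The full MWCA procedure (periodic-process construction, significance testing, time-frequency analysis) is specific to the paper, but its core intuition, correlating windows of the series with copies shifted by a candidate period, can be illustrated simply. A hedged sketch with a hypothetical function name, not the authors' algorithm:

```python
import numpy as np

def moving_window_period_score(x, period, window):
    """Average correlation between each window and the window shifted by one
    candidate period; a score near +1 across windows suggests a real period."""
    x = np.asarray(x, dtype=float)
    scores = []
    for start in range(0, len(x) - window - period, window):
        a = x[start:start + window]
        b = x[start + period:start + period + window]
        if a.std() == 0.0 or b.std() == 0.0:
            continue                            # skip flat windows
        scores.append(np.corrcoef(a, b)[0, 1])
    return float(np.mean(scores))
```

For a pure 20-sample sinusoid, the score is near +1 at the true period and near -1 at a half-period shift, which is the kind of contrast a moving-window correlation test exploits.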

  16. On statistical inference in time series analysis of the evolution of road safety.

    PubMed

    Commandeur, Jacques J F; Bijleveld, Frits D; Bergel-Hayat, Ruth; Antoniou, Constantinos; Yannis, George; Papadimitriou, Eleonora

    2013-11-01

    Data collected for building a road safety observatory usually include observations made sequentially through time. Examples of such data, called time series data, include the annual (or monthly) number of road traffic accidents, traffic fatalities or vehicle kilometers driven in a country, as well as the corresponding values of safety performance indicators (e.g., data on speeding, seat belt use, alcohol use, etc.). Some commonly used statistical techniques imply assumptions that are often violated by the special properties of time series data, namely serial dependency among the disturbances associated with the observations. The first objective of this paper is to demonstrate the impact of such violations on the applicability of standard methods of statistical inference, which leads to under- or overestimation of the standard error and consequently may produce erroneous inferences. Moreover, having established the adverse consequences of ignoring serial dependency issues, the paper aims to describe rigorous statistical techniques used to overcome them. In particular, appropriate time series analysis techniques of varying complexity are employed to describe the development over time, relating the accident occurrences to explanatory factors such as exposure measures or safety performance indicators, and forecasting the development into the near future. Traditional regression models (whether linear, generalized linear or nonlinear) are shown not to naturally capture the inherent dependencies in time series data. Dedicated time series analysis techniques, such as the ARMA-type and DRAG approaches, are discussed next, followed by structural time series models, which are a subclass of state space methods. The paper concludes with general recommendations and practice guidelines for the use of time series models in road safety research.
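The serial-dependency problem described in this entry is easy to diagnose in practice. One standard first check (a generic diagnostic, not something prescribed by this paper) is the Durbin-Watson statistic on regression residuals, which is near 2 for independent residuals and approaches 0 under strong positive autocorrelation:

```python
import numpy as np

def durbin_watson(resid):
    """Durbin-Watson statistic: sum of squared successive differences of the
    residuals divided by their sum of squares. ~2 means no first-order serial
    correlation; values near 0 (or 4) flag positive (or negative) correlation."""
    resid = np.asarray(resid, dtype=float)
    return float(np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2))
```

A value far from 2 is exactly the situation in which naive standard errors from ordinary regression become unreliable and the dedicated time series models discussed in the paper are needed.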

  18. Microvascular decompression for glossopharyngeal neuralgia through a microasterional approach: A case series

    PubMed Central

    Revuelta-Gutiérrez, Rogelio; Morales-Martínez, Andres Humberto; Mejías-Soto, Carolina; Martínez-Anda, Jaime Jesús; Ortega-Porcayo, Luis Alberto

    2016-01-01

    Background: Glossopharyngeal neuralgia (GPN) is an uncommon craniofacial pain syndrome. It is characterized by sudden-onset lancinating pain, usually localized in the sensory distribution of the IX cranial nerve, associated with excessive vagal outflow, which leads to bradycardia, hypotension, syncope, or cardiac arrest. This study aims to review our surgical experience performing microvascular decompression (MVD) in patients with GPN. Methods: Over the last 20 years, 14 consecutive cases were diagnosed with GPN. MVD using a microasterional approach was performed in all patients. Demographic data, clinical presentation, surgical findings, clinical outcome, complications, and long-term follow-up were reviewed. Results: The median age of onset was 58.7 years. The mean time from onset of symptoms to treatment was 8.8 years. Glossopharyngeal and vagus nerve compression was from the posterior inferior cerebellar artery in eleven cases (78.5%), the vertebral artery in two cases (14.2%), and the choroid plexus in one case (7.1%). Postoperative mean follow-up was 26 months (3–180 months). Pain analysis demonstrated long-term pain improvement of 114 ± 27.1 months and pain remission in 13 patients (92.9%) (P = 0.0001). Two complications were documented: one patient had a cerebrospinal fluid leak, and another had bacterial meningitis. There was no surgical mortality. Conclusions: GPN is a rare entity, and secondary causes should be discarded. MVD through a retractorless microasterional approach is a safe and effective technique. Our series demonstrated an excellent clinical outcome, with pain remission in 92.9%. PMID:27213105

  19. Homogenization of atmospheric pressure time series recorded at VLBI stations using a segmentation LASSO approach

    NASA Astrophysics Data System (ADS)

    Balidakis, Kyriakos; Heinkelmann, Robert; Lu, Cuixian; Soja, Benedikt; Karbon, Maria; Nilsson, Tobias; Glaser, Susanne; Andres Mora-Diaz, Julian; Anderson, James; Liu, Li; Raposo-Pulido, Virginia; Xu, Minghui; Schuh, Harald

    2015-04-01

    Time series of meteorological parameters recorded at VLBI (Very Long Baseline Interferometry) observatories allow us to realistically model, and consequently to eliminate to a large extent, the atmosphere-induced effects in the VLBI products. Nevertheless, this advantage of VLBI is not fully exploited, since such information is contaminated with inconsistencies: uncertainties regarding the calibration and location of the meteorological sensors, outliers, missing data points, and breaks. It has been shown that such inconsistencies in the meteorological data used for VLBI data analysis cause problems in the geodetic products (e.g., the vertical site position) and lead to errors in geophysical interpretation. The aim of the procedure followed here is to optimally model the tropospheric delay and bending effects that are still the main sources of error in VLBI data analysis. In this study, the meteorological data recorded with sensors mounted in the vicinity of VLBI stations have been homogenized over the period from 1979 until today. To meet this objective, inhomogeneities were detected and adjusted using test results and metadata. The approaches employed include Alexandersson's Standard Normal Homogeneity Test and an iterative procedure whose segmentation part is based on a dynamic programming algorithm and whose functional part is based on a LASSO (Least Absolute Shrinkage and Selection Operator) estimator. For the provision of the reference time series necessary to apply these methods, ECMWF's (European Centre for Medium-Range Weather Forecasts) ERA-Interim reanalysis surface data were employed. Special care was taken regarding the datum definition of this model. Due to the significant height difference between the VLBI antenna's reference point and the elevation included in the geopotential fields of the specific numerical weather models, a hypsometric adjustment is applied using the absolute pressure level from the WMO
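Alexandersson's Standard Normal Homogeneity Test, one of the methods cited in this entry, has a compact closed form: the candidate series (usually a difference or ratio against a homogeneous reference) is standardized, and for each candidate break point k the statistic T(k) = k*mean(z1)^2 + (n-k)*mean(z2)^2 compares the mean levels before and after k. A minimal sketch; the full homogenization pipeline, reference-series construction and critical values are not reproduced:

```python
import numpy as np

def snht(q):
    """SNHT statistic series for a standardized candidate series q.
    T(k) is large when the means before and after index k differ;
    a break is suspected at argmax(T) if the maximum exceeds a
    tabulated critical value."""
    q = np.asarray(q, dtype=float)
    z = (q - q.mean()) / q.std()
    n = len(z)
    t = np.empty(n - 1)
    for k in range(1, n):
        z1 = z[:k].mean()
        z2 = z[k:].mean()
        t[k - 1] = k * z1 ** 2 + (n - k) * z2 ** 2
    return t
```

For a series with a clean step change, T(k) peaks exactly at the break, which is how sensor relocations or recalibrations show up in pressure records.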

  20. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    NASA Technical Reports Server (NTRS)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

    The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. Using the TSPT program, MODIS metadata are used to remove and/or correct bad and suspect data. Bad-pixel removal, multiple-satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating.
The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify

  1. A non-subjective approach to the GP algorithm for analysing noisy time series

    NASA Astrophysics Data System (ADS)

    Harikrishnan, K. P.; Misra, R.; Ambika, G.; Kembhavi, A. K.

    2006-03-01

    We present an adaptation of the standard Grassberger-Procaccia (GP) algorithm for estimating the correlation dimension of a time series in a non-subjective manner. The validity and accuracy of this approach are tested using different types of time series, such as those from standard chaotic systems, pure white and colored noise, and chaotic systems with added noise. The effectiveness of the scheme in analysing noisy time series, particularly those involving colored noise, is investigated. One interesting result is that, for the same percentage of noise addition, data with colored noise are more distinguishable from the corresponding surrogates than data with white noise. As examples of real-life applications, analyses of data from an astrophysical X-ray object and a human brain EEG are presented.
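The quantity at the heart of the GP algorithm is the correlation sum C(r), the fraction of pairs of delay-embedded vectors closer than r; the correlation dimension D2 is then estimated from the slope of log C(r) versus log r in the scaling region. A minimal sketch of the embedding and correlation sum (the paper's contribution, selecting the scaling region non-subjectively, is not reproduced here):

```python
import numpy as np

def correlation_sum(x, m, tau, r):
    """Correlation sum C(r): fraction of pairs of (m, tau) delay-embedded
    vectors whose Euclidean distance is below r."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (m - 1) * tau
    emb = np.array([x[i:i + (m - 1) * tau + 1:tau] for i in range(n)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
    iu = np.triu_indices(n, k=1)                # count each pair once
    return float(np.mean(d[iu] < r))
```

In practice C(r) is evaluated over a range of r and several embedding dimensions m, and D2 is read off where the log-log slope saturates.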

  2. Inorganic chemical analysis of environmental materials—A lecture series

    USGS Publications Warehouse

    Crock, J.G.; Lamothe, P.J.

    2011-01-01

    At the request of the faculty of the Colorado School of Mines, Golden, Colorado, the authors prepared and presented a lecture series to the students of a graduate-level advanced instrumental analysis class. The slides and text presented in this report are a compilation and condensation of this series of lectures. The purpose of this report is to present the slides and notes and to emphasize the thought processes that should be used by a scientist submitting samples for analysis in order to procure analytical data to answer a research question. First and foremost, the analytical data generated can be no better than the samples submitted. The questions to be answered must first be well defined, and the appropriate samples collected from the population that will answer the question. The proper methods of analysis, including proper sample preparation and digestion techniques, must then be applied. Care must be taken to achieve the required limits of detection of the critical analytes to yield detectable analyte concentrations (above "action" levels) for the majority of the study's samples, and to address what portion of those analytes answers the research question: total or partial concentrations. To guarantee a robust analytical result that answers the research question(s), a well-defined quality assurance and quality control (QA/QC) plan must be employed. This QA/QC plan must include the collection and analysis of field and laboratory blanks, sample duplicates, and matrix-matched standard reference materials (SRMs). The proper SRMs may include in-house materials and/or a selection of widely available commercial materials. A discussion of the preparation and applicability of in-house reference materials is also presented. Only when all these analytical issues are sufficiently addressed can the research questions be answered with known certainty.

  3. Feature extraction for change analysis in SAR time series

    NASA Astrophysics Data System (ADS)

    Boldt, Markus; Thiele, Antje; Schulz, Karsten; Hinz, Stefan

    2015-10-01

    In remote sensing, change detection represents a broad field of research. If time series data are available, change detection can be used for monitoring applications. These applications require regular image acquisitions at an identical time of day over a defined period. Among remote sensing sensors, radar is especially well suited for applications requiring regularity, since it is independent of most weather and atmospheric influences. Furthermore, the time of day of the acquisitions plays no role because radar is independent of daylight. Since 2007, the German SAR (Synthetic Aperture Radar) satellite TerraSAR-X (TSX) has permitted the acquisition of high-resolution radar images suitable for the analysis of dense built-up areas. In a former study, we presented a change analysis of the Stuttgart (Germany) airport. The aim of this study is the categorization of detected changes in the time series, motivated by the fact that it is a poor statement only to describe where and when a specific area has changed; at least as important is a statement about what has caused the change. The focus is set on the analysis of so-called high activity areas (HAA), representing areas that change at least four times over the investigated period. As a first step toward categorizing these HAAs, the matching HAA changes (blobs) have to be identified. Afterwards, operating at this object-based blob level, several features are extracted, comprising shape-based, radiometric, statistical and morphological values, and one context feature based on a segmentation of the HAAs. This segmentation builds on morphological differential attribute profiles (DAPs). Seven context classes are established: urban, infrastructure, rural stable, rural unstable, natural, water and unclassified. A specific HA blob is assigned to one of these classes by analyzing the CovAmCoh time series signature of the surrounding segments.
In combination, also surrounding GIS information

  4. STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS

    SciTech Connect

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-02-20

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it (an improved and generalized version of Bayesian Blocks) that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.
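The dynamic programming at the core of Bayesian Blocks is short enough to sketch: the best partition of the first r+1 cells is the best final block plus the best partition of what precedes it. The version below follows the published recurrence for unbinned event data, with the event-block fitness N(log N - log T) and the empirical block prior of Scargle et al. (2013); it is an illustrative reimplementation assuming distinct event times, not the authors' released code:

```python
import numpy as np

def bayesian_blocks(t, p0=0.05):
    """Optimal piecewise-constant segmentation of event times via the
    Bayesian Blocks dynamic programming recurrence (Scargle et al. 2013).
    Returns the change-point positions, starting with the data origin."""
    t = np.sort(np.asarray(t, dtype=float))
    n = len(t)
    # candidate block edges: data extremes plus midpoints between events
    edges = np.concatenate([[t[0]], 0.5 * (t[1:] + t[:-1]), [t[-1]]])
    # empirical prior on the number of blocks (eq. 21 of Scargle et al.)
    ncp_prior = 4.0 - np.log(73.53 * p0 * n ** -0.478)
    best = np.zeros(n)
    last = np.zeros(n, dtype=int)
    for r in range(n):
        width = edges[r + 1] - edges[:r + 1]    # widths of candidate last blocks
        count = r + 1 - np.arange(r + 1)        # event counts in those blocks
        fitness = count * (np.log(count) - np.log(width)) - ncp_prior
        fitness[1:] += best[:r]                 # add best partition of the rest
        last[r] = int(np.argmax(fitness))
        best[r] = fitness[last[r]]
    cps = []                                    # backtrack the change points
    r = n
    while r > 0:
        cps.append(last[r - 1])
        r = last[r - 1]
    return edges[np.array(cps[::-1])]
```

Applied to a sequence whose event rate jumps, the recurrence places a single change point at the rate transition rather than fragmenting the data, which is the behavior the prior term enforces.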

  5. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it (an improved and generalized version of Bayesian Blocks [Scargle 1998]) that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by [Arias-Castro, Donoho and Huo 2003]. In the spirit of Reproducible Research [Donoho et al. (2008)] all of the code and data necessary to reproduce all of the figures in this paper are included as auxiliary material.

  6. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    NASA Astrophysics Data System (ADS)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-02-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it—an improved and generalized version of Bayesian Blocks—that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.

  7. Visibility graph network analysis of gold price time series

    NASA Astrophysics Data System (ADS)

    Long, Yu

    2013-08-01

    Mapping time series into a visibility graph network, the characteristics of the gold price time series and the return series, and the mechanism underlying gold price fluctuation, are explored from the perspective of complex network theory. The network degree distribution, which changes from a power law to an exponential law when the series is shuffled from its original sequence, and the average path length, which changes from L∼lnN to lnL∼lnN as the sequence is shuffled, demonstrate that the price series and the return series are both long-range dependent fractal series. The relation of the Hurst exponent to the power-law exponent of the degree distribution demonstrates that the logarithmic price series is a fractional Brownian series and the logarithmic return series is a fractional Gaussian series. Power-law exponents of the degree distribution in a moving time window demonstrate that the logarithmic gold price series is a multifractal series. The power-law average clustering coefficient demonstrates that the gold price visibility graph is a hierarchy network. The hierarchy character, in light of the correspondence of the graph to price fluctuation, means that gold price fluctuation has a hierarchy structure, which appears to be in agreement with Elliott's empirical Wave Theory of stock price fluctuation, and the local-rule growth theory of a hierarchy network suggests that the hierarchy structure of gold price fluctuation originates from persistent short-term factors, such as short-term speculation.
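The natural visibility graph mapping used in this entry has a one-line criterion: samples i and j are linked when every intermediate sample lies strictly below the straight line joining (i, y[i]) and (j, y[j]). A minimal sketch, with a hypothetical function name:

```python
def visibility_edges(y):
    """Natural visibility graph of a series: nodes are sample indices;
    i and j are connected if every sample between them lies strictly
    below the line joining (i, y[i]) and (j, y[j])."""
    n = len(y)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            slope = (y[j] - y[i]) / (j - i)
            # adjacent samples always see each other (empty intermediate range)
            if all(y[k] < y[i] + slope * (k - i) for k in range(i + 1, j)):
                edges.add((i, j))
    return edges
```

The degree distribution, path lengths and clustering analyzed in the paper are then computed on this graph; the naive construction above is O(n^3), and faster divide-and-conquer variants exist for long series.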

  8. Analysis of temperature time-series: Embedding dynamics into the MDS method

    NASA Astrophysics Data System (ADS)

    Lopes, António M.; Tenreiro Machado, J. A.

    2014-04-01

    Global warming and the associated climate changes are the subject of intensive research due to their major impact on social, economic and health aspects of human life. Surface temperature time-series characterise Earth as a slow-dynamics spatiotemporal system, evidencing long-memory behaviour typical of fractional-order systems. Such phenomena are difficult to model and analyse, demanding alternative approaches. This paper studies the complex correlations between global temperature time-series using the multidimensional scaling (MDS) approach. MDS provides a graphical representation of the pattern of climatic similarities between regions around the globe. The similarities are quantified through two mathematical indices that correlate the monthly average temperatures observed at meteorological stations over a given period of time. Furthermore, time dynamics is analysed by performing the MDS analysis over slices sampling the time series. MDS generates maps describing the stations' loci such that, if stations are perceived to be similar to each other, they are placed on the map forming clusters. We show that MDS provides an intuitive and useful visual representation of the complex relationships present among temperature time-series, which are not perceived on traditional geographic maps. Moreover, MDS avoids sensitivity to the irregular distribution density of the meteorological stations.
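
    Classical (Torgerson) MDS recovers a low-dimensional embedding from a pairwise distance matrix by double-centering and eigen-decomposition. A self-contained sketch (the paper's two similarity indices between station temperature series are not reproduced; any distance matrix, e.g. one derived from 1 − correlation, can be fed in):

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical (Torgerson) MDS: embed points in R^k from a
    symmetric matrix of pairwise distances D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centered Gram matrix
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:k]              # top-k eigenpairs
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))
```

For exactly Euclidean distances the embedding reproduces the original pairwise distances up to rotation and reflection, which is why clusters on the MDS map are interpretable as groups of mutually similar stations.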

  9. Multifractal detrended fluctuation analysis of Pannonian earthquake magnitude series

    NASA Astrophysics Data System (ADS)

    Telesca, Luciano; Toth, Laszlo

    2016-04-01

    The multifractality of the series of magnitudes of earthquakes that occurred in the Pannonian region from 2002 to 2012 has been investigated. The shallow (depth less than 40 km) and deep (depth larger than 70 km) seismic catalogues were analysed using multifractal detrended fluctuation analysis. The two catalogues are characterized by different multifractal properties: (i) the magnitudes of the shallow events are weakly persistent, while those of the deep events are almost uncorrelated; (ii) the deep catalogue is more multifractal than the shallow one; (iii) the magnitudes of the deep catalogue are characterized by a right-skewed multifractal spectrum, while that of the shallow catalogue is rather symmetric; and (iv) a direct relationship between the b-value of the Gutenberg-Richter law and the multifractality of the magnitudes is suggested.
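
    Multifractal DFA generalizes ordinary detrended fluctuation analysis by varying the moment order q; the q = 2 member is plain DFA, whose scaling exponent distinguishes persistent (α > 0.5) from uncorrelated (α ≈ 0.5) series, the contrast reported between the shallow and deep catalogues. A sketch of that base case only (the full MF-DFA q-sweep and spectrum are not reproduced):

```python
import numpy as np

def dfa(x, scales):
    """Ordinary detrended fluctuation analysis (the q = 2 member of the
    MF-DFA family): returns the scaling exponent alpha."""
    y = np.cumsum(x - np.mean(x))             # integrated profile
    F = []
    for s in scales:
        nseg = len(y) // s
        rms = []
        for i in range(nseg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)      # local linear detrending
            rms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(rms)))
    # slope of log F(s) vs log s is the scaling exponent
    return np.polyfit(np.log(scales), np.log(F), 1)[0]
```

White noise yields α ≈ 0.5 (uncorrelated, like the deep-catalogue magnitudes), while its running sum yields α ≈ 1.5; weak persistence sits between 0.5 and 1.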

  10. Visual analytics for model selection in time series analysis.

    PubMed

    Bögl, Markus; Aigner, Wolfgang; Filzmoser, Peter; Lammarsch, Tim; Miksch, Silvia; Rind, Alexander

    2013-12-01

    Model selection in time series analysis is a challenging task for domain experts in many application areas such as epidemiology, economy, or environmental sciences. The methodology used for this task demands a close combination of human judgement and automated computation. However, statistical software tools do not adequately support this combination through interactive visual interfaces. We propose a Visual Analytics process to guide domain experts in this task. For this purpose, we developed the TiMoVA prototype that implements this process based on user stories and iterative expert feedback on user experience. The prototype was evaluated by usage scenarios with an example dataset from epidemiology and interviews with two external domain experts in statistics. The insights from the experts' feedback and the usage scenarios show that TiMoVA is able to support domain experts in model selection tasks through interactive visual interfaces with short feedback cycles.

  11. Time series analysis for minority game simulations of financial markets

    NASA Astrophysics Data System (ADS)

    Ferreira, Fernando F.; Francisco, Gerson; Machado, Birajara S.; Muruganandam, Paulsamy

    2003-04-01

    The recently introduced minority game (MG) model provides promising insights into the evolution of prices, indices, and rates in financial markets. In this paper we perform a time series analysis of the model employing tools from statistics, dynamical systems theory, and stochastic processes. Using benchmark systems and a financial index for comparison, several conclusions are obtained about the generating mechanism for this kind of evolution. The motion is deterministic, driven by occasional random external perturbations. When the interval between two successive perturbations is sufficiently large, low-dimensional chaos can be found in this regime. However, the full motion of the MG model is found to be similar to that of the first differences of the S&P 500 index: stochastic, nonlinear, and (unit root) stationary.

  12. Time series analysis of diverse extreme phenomena: universal features

    NASA Astrophysics Data System (ADS)

    Eftaxias, K.; Balasis, G.

    2012-04-01

    The field of complex systems holds that the dynamics of complex systems are founded on universal principles that may be used to describe a great variety of scientific and technological approaches to different types of natural, artificial, and social systems. We suggest that the dynamics of earthquakes, epileptic seizures, solar flares, and magnetic storms can be analyzed within similar mathematical frameworks. A central property of the generation of these extreme events is the occurrence of coherent large-scale collective behavior with very rich structure, resulting from repeated nonlinear interactions among the corresponding constituents. Consequently, we apply Tsallis nonextensive statistical mechanics, as it provides an appropriate framework for investigating universal principles of their generation. First, we examine the data in terms of Tsallis entropy, aiming to discover common "pathological" symptoms of the transition to a significant shock. By monitoring the temporal evolution of the degree of organization in the time series, we observe similar distinctive features revealing a significant reduction of complexity during their emergence. Second, a model for earthquake dynamics derived from first principles within the nonextensive Tsallis formalism has recently been introduced. This approach leads to an energy distribution function (a Gutenberg-Richter-type law) for the magnitude distribution of earthquakes, providing an excellent fit to seismicities generated in various large geographic areas usually identified as seismic regions. We show that this function is able to describe the energy distribution (with a similar nonextensive q-parameter) of solar flares, magnetic storms, epileptic seizures, and earthquake shocks. This evidence of a universal statistical behavior suggests the possibility of a common approach for studying space weather, earthquakes, and epileptic seizures.
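
    The monitored quantity is the Tsallis entropy S_q = (1 − Σ p_i^q)/(q − 1), which recovers the Shannon entropy as q → 1; a drop in S_q over sliding windows signals increasing organization. A minimal sketch of the entropy itself (windowing and probability estimation, which the papers tailor per dataset, are not reproduced):

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1); recovers the
    Boltzmann-Gibbs-Shannon entropy in the limit q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                         # drop empty states
    if q == 1.0:
        return -np.sum(p * np.log(p))    # Shannon limit
    return (1.0 - np.sum(p ** q)) / (q - 1.0)
```

For a uniform distribution over n states, S_q = (1 − n^(1−q))/(q − 1), its maximum for given n; more organized (peaked) distributions give smaller values, which is the "reduction of complexity" tracked before a shock.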

  13. Fuzzy Inference System Approach for Locating Series, Shunt, and Simultaneous Series-Shunt Faults in Double Circuit Transmission Lines.

    PubMed

    Swetapadma, Aleena; Yadav, Anamika

    2015-01-01

    Many schemes have been reported for shunt fault location estimation, but fault location estimation for series (open conductor) faults has not been dealt with so far. Existing numerical relays only detect an open conductor (series) fault and indicate the faulty phase(s); they are unable to locate the series fault, so the repair crew must patrol the complete line to find its location. In this paper, fuzzy-based fault detection/classification and location schemes in the time domain are proposed for series faults, shunt faults, and simultaneous series-shunt faults. The fault simulation studies and the fault location algorithm were developed using Matlab/Simulink. Synchronized voltage and current phasors from both ends of the line are used as input to the proposed fuzzy-based fault location scheme. The percentage error in locating series faults is within 1%, and in locating shunt faults within 5%, for all tested fault cases. The location-error percentages are validated using a chi-square test at both the 1% and 5% significance levels.

  15. A non linear analysis of human gait time series based on multifractal analysis and cross correlations

    NASA Astrophysics Data System (ADS)

    Muñoz-Diosdado, A.

    2005-01-01

    We analyzed databases containing gait time series of healthy adults and of persons with Parkinson's disease, Huntington's disease, and amyotrophic lateral sclerosis (ALS). We obtained staircase graphs of accumulated events that can be bounded by a straight line, whose slope can be used to distinguish between gait time series from healthy and ill persons. The global Hurst exponent of these series does not show clear tendencies; we suggest this is because some gait time series have monofractal behavior and others have multifractal behavior, so they cannot be characterized by a single Hurst exponent. We calculated the multifractal spectra and their widths, and found that the spectra of healthy young persons are almost monofractal, while the spectra of ill persons are wider than those of healthy persons. In contrast to interbeat time series, where pathology implies loss of multifractality, in gait time series multifractal behavior emerges with pathology. Data were collected from healthy and ill subjects as they walked along a roughly circular path, with sensors on both feet, so there is one time series for the left foot and another for the right foot. First we analyzed these time series separately, and then we compared the results, both directly and with a cross-correlation analysis, attempting to find differences between the two series that could serve as indicators of equilibrium problems.

  16. Forced response approach of a parametric vibration with a trigonometric series

    NASA Astrophysics Data System (ADS)

    Huang, Dishan

    2015-02-01

    A forced vibration problem with parametric stiffness is modeled by a feedback structure in this manuscript, and the forced response is expressed as a special trigonometric series. The forced response is determined by an algebraic equation. By applying harmonic balance and a limiting operation, all coefficients of the harmonic components in the forced-response solution are fully determined. The investigation shows that the new approach has advantages in computational time and accuracy, and it is significant for theoretical research and engineering applications dealing with forced parametric vibration.
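
    The paper's special series and limiting operation are not reproduced here, but the idea of a truncated trigonometric-series (harmonic-balance) solution of a parametrically stiffened, forced oscillator can be sketched generically. All names and parameter values below are ours; the equation x'' + (k0 + k1·cos νt)·x = F·cos νt is a standard Mathieu-type stand-in, solved by least-squares collocation over one forcing period:

```python
import numpy as np

def parametric_forced_response(k0, k1, F, nu, N=12, M=400):
    """Truncated trigonometric-series (harmonic balance) solution of
    x'' + (k0 + k1*cos(nu*t)) x = F*cos(nu*t), enforced by least-squares
    collocation at M times over one forcing period."""
    ns = np.arange(N + 1)

    def basis(t):
        t = np.atleast_1d(np.asarray(t, dtype=float))
        C = np.cos(np.outer(t, ns * nu))        # cos(n*nu*t), n = 0..N
        S = np.sin(np.outer(t, ns[1:] * nu))    # sin(n*nu*t), n = 1..N
        return np.hstack([C, S])

    def basis_dd(t):
        t = np.atleast_1d(np.asarray(t, dtype=float))
        C = np.cos(np.outer(t, ns * nu)); S = np.sin(np.outer(t, ns[1:] * nu))
        return np.hstack([-(ns * nu) ** 2 * C, -(ns[1:] * nu) ** 2 * S])

    ts = np.linspace(0.0, 2 * np.pi / nu, M, endpoint=False)
    A = basis_dd(ts) + (k0 + k1 * np.cos(nu * ts))[:, None] * basis(ts)
    coef, *_ = np.linalg.lstsq(A, F * np.cos(nu * ts), rcond=None)

    def residual(t):
        t = np.atleast_1d(np.asarray(t, dtype=float))
        A_t = basis_dd(t) + (k0 + k1 * np.cos(nu * t))[:, None] * basis(t)
        return A_t @ coef - F * np.cos(nu * t)

    return coef, (lambda t: basis(t) @ coef), residual
```

Because the parametric term couples each harmonic only to its neighbours, the series coefficients decay rapidly away from resonance and a modest truncation already drives the residual to numerical noise.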

  17. Hypnobehavioral approaches for school-age children with dysphagia and food aversion: a case series.

    PubMed

    Culbert, T P; Kajander, R L; Kohen, D P; Reaney, J B

    1996-10-01

    The purpose of this article is to describe hypnobehavioral treatment of five school-age children with maladaptive eating behaviors, including functional dysphagia, food aversion, globus hystericus, and conditioned fear of eating (phagophobia). The unique treatment approach described emphasizes the successful use of self-management techniques, particularly hypnosis, by all five children. Common etiological factors, treatment strategies, and proposed mechanisms of change are discussed. To the authors' knowledge, this is the first such case series in the mainstream pediatric literature describing the use of a hypnobehavioral approach for children with these maladaptive eating problems.

  18. Interrupted time-series analysis: studying trends in neurosurgery.

    PubMed

    Wong, Ricky H; Smieliauskas, Fabrice; Pan, I-Wen; Lam, Sandi K

    2015-12-01

    OBJECT Neurosurgery studies traditionally have evaluated the effects of interventions on health care outcomes by studying overall changes in measured outcomes over time. Yet, this type of linear analysis is limited due to lack of consideration of the trend's effects both pre- and postintervention and the potential for confounding influences. The aim of this study was to illustrate interrupted time-series analysis (ITSA) as applied to an example in the neurosurgical literature and highlight ITSA's potential for future applications. METHODS The methods used in previous neurosurgical studies were analyzed and then compared with the methodology of ITSA. RESULTS The ITSA method was identified in the neurosurgical literature as an important technique for isolating the effect of an intervention (such as a policy change or a quality and safety initiative) on a health outcome independent of other factors driving trends in the outcome. The authors determined that ITSA allows for analysis of the intervention's immediate impact on outcome level and on subsequent trends and enables a more careful measure of the causal effects of interventions on health care outcomes. CONCLUSIONS ITSA represents a significant improvement over traditional observational study designs in quantifying the impact of an intervention. ITSA is a useful statistical procedure to understand, consider, and implement as the field of neurosurgery evolves in sophistication in big-data analytics, economics, and health services research. PMID:26621420
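
    The workhorse of ITSA is segmented regression: level and trend before the intervention, plus a level change and a trend change at the intervention time. A minimal sketch via ordinary least squares (function name and coefficients are ours; published analyses additionally correct standard errors for autocorrelation, which is not shown):

```python
import numpy as np

def itsa_fit(y, t0):
    """Segmented (interrupted time-series) regression:
    y_t = b0 + b1*t + b2*step_t + b3*(t - t0)*step_t + e_t,
    where step_t = 1 from the intervention time t0 onward.
    Returns (b0, b1, b2, b3): baseline level, baseline trend,
    level change, and trend change at the intervention."""
    t = np.arange(len(y), dtype=float)
    step = (t >= t0).astype(float)
    X = np.column_stack([np.ones_like(t), t, step, (t - t0) * step])
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return beta
```

b2 captures the immediate impact on the outcome level and b3 the change in subsequent trend, which is exactly the decomposition the abstract credits ITSA with providing.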

  20. An Alternative Approach to Atopic Dermatitis: Part I—Case-Series Presentation

    PubMed Central

    2004-01-01

    Atopic dermatitis (AD) is a complex disease of obscure pathogenesis. A substantial portion of AD patients treated with conventional therapy become intractable after several cycles of recurrence. Over the last 20 years we have developed an alternative approach to treat many of these patients with diet and Kampo herbal medicine. However, as our approach is highly individualized and the Kampo formulae are sometimes complicated, it is not easy to provide evidence establishing the usefulness of this approach. In this review, to demonstrate the effectiveness of individualized Kampo therapy, results are presented for a series of patients who had failed with conventional therapy but were treated afterwards in our institution. Based on these data, we contend that there exists a definite subgroup of AD patients whom conventional therapy fails to heal but the ‘Diet and Kampo’ approach succeeds in healing. Therefore, this approach should be considered seriously as a second-line treatment for AD patients. In the Discussion, we review the evidential status of current conventional strategies for AD treatment in general, and then specifically discuss the possibility of integrating Kampo regimens into them, taking our case series presented here as the evidential basis. We emphasize that Kampo therapy for AD is more ‘art’ than technology, for which expertise is an essential prerequisite. PMID:15257326

  1. Multidimensional stock network analysis: An Escoufier's RV coefficient approach

    NASA Astrophysics Data System (ADS)

    Lee, Gan Siew; Djauhari, Maman A.

    2013-09-01

    The current practice of stock network analysis is based on the assumption that the time series of closing prices can represent the behaviour of each stock. This assumption has led to the minimal spanning tree (MST) and the sub-dominant ultrametric (SDU) being considered indispensable tools for filtering the economic information contained in the network. Recently, researchers have represented a stock not as a univariate time series of closing prices but as a bivariate time series of closing price and volume, developing a so-called multidimensional MST to filter the important economic information. In this paper, however, we show that their approach is applicable only to bivariate time series. This leads us to introduce a new methodology for constructing an MST in which each stock is represented by a multivariate time series. An example from the Malaysian stock exchange is presented and discussed to illustrate the advantages of the method.
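
    The RV coefficient generalizes squared correlation to pairs of multivariate series, so each stock can carry several variables (price, volume, ...) and still yield a scalar similarity for the MST. A sketch under our own conventions (rows are time points; the distance d = sqrt(2(1 − RV)) mirrors the Mantegna correlation distance, and the MST is built with Kruskal's algorithm):

```python
import numpy as np

def rv_coeff(X, Y):
    """Escoufier's RV coefficient between two multivariate series
    (rows = time points, columns = variables); RV lies in [0, 1]."""
    Xc = X - X.mean(axis=0); Yc = Y - Y.mean(axis=0)
    Sxy = Xc.T @ Yc; Sxx = Xc.T @ Xc; Syy = Yc.T @ Yc
    return np.trace(Sxy @ Sxy.T) / np.sqrt(np.trace(Sxx @ Sxx) * np.trace(Syy @ Syy))

def rv_mst(stocks):
    """Minimal spanning tree over stocks, each a multivariate time series,
    using d_ij = sqrt(2*(1 - RV_ij)) and Kruskal's algorithm."""
    n = len(stocks)
    pairs = sorted((np.sqrt(max(2.0 * (1.0 - rv_coeff(stocks[i], stocks[j])), 0.0)), i, j)
                   for i in range(n) for j in range(i + 1, n))
    parent = list(range(n))
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a
    tree = []
    for w, i, j in pairs:
        ri, rj = find(i), find(j)
        if ri != rj:                      # accept edge only if no cycle
            parent[ri] = rj
            tree.append((i, j, w))
    return tree
```

RV is invariant under rotations of a stock's variables, so two stocks carrying the same information in different coordinates sit at distance zero in the tree.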

  2. Finite element techniques in computational time series analysis of turbulent flows

    NASA Astrophysics Data System (ADS)

    Horenko, I.

    2009-04-01

    In recent years there has been a considerable increase of interest in the mathematical modeling and analysis of complex systems that undergo transitions between several phases or regimes. Such systems can be found, e.g., in weather forecasting (transitions between weather conditions), climate research (ice ages and warm ages), computational drug design (conformational transitions) and in econometrics (e.g., transitions between different phases of the market). In all cases, the accumulation of sufficiently detailed time series has led to the formation of huge databases, containing enormous but still undiscovered treasures of information. However, the extraction of essential dynamics and the identification of phases is usually hindered by the multidimensional nature of the signal, i.e., the information is "hidden" in the time series. Standard filtering approaches (e.g., wavelet-based spectral methods) have in general unfeasible numerical complexity in high dimensions, while other standard methods (e.g., the Kalman filter, MVAR, ARCH/GARCH, etc.) impose strong assumptions about the type of the underlying dynamics. An approach based on optimization of a specially constructed regularized functional (describing the quality of data description in terms of a certain number of specified models) will be introduced. Based on this approach, several new adaptive mathematical methods for simultaneous EOF/SSA-like data-based dimension reduction and identification of hidden phases in high-dimensional time series will be presented. The methods exploit the topological structure of the analysed data and do not impose severe assumptions on the underlying dynamics. Special emphasis will be placed on the mathematical assumptions and the numerical cost of the constructed methods. The application of the presented methods will first be demonstrated on a toy example, and the results will be compared with those obtained by standard approaches.

  3. REDFIT-X: Cross-spectral analysis of unevenly spaced paleoclimate time series

    NASA Astrophysics Data System (ADS)

    Björg Ólafsdóttir, Kristín; Schulz, Michael; Mudelsee, Manfred

    2016-06-01

    Cross-spectral analysis is commonly used in climate research to identify joint variability between two variables and to assess the phase (lead/lag) between them. Here we present a Fortran 90 program (REDFIT-X) specially developed to perform cross-spectral analysis of unevenly spaced paleoclimate time series. Data properties of climate time series that must be taken into account include, for example, the data spacing (unequal time scales and/or uneven spacing between time points) and the persistence in the data. The Lomb-Scargle Fourier transform is used for cross-spectral analyses between two time series with unequal and/or uneven time scales, and the persistence in the data is taken into account when estimating the uncertainty associated with the cross-spectral estimates. We use a Monte Carlo approach to estimate the uncertainty associated with coherency and phase: the false-alarm level is estimated from the empirical distribution of the coherency estimates, and confidence intervals for the phase angle are formed from the empirical distribution of the phase estimates. The method is validated by comparing the Monte Carlo uncertainty estimates with the traditionally used measures. Examples are given where the method is applied to paleoceanographic time series.
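
    The univariate building block of REDFIT-X is the Lomb-Scargle periodogram, which handles uneven sampling without interpolation. A minimal sketch of the classical formula (the program's cross-spectral, bias-correction, and Monte Carlo machinery are not reproduced; the function name is ours):

```python
import numpy as np

def lomb_scargle(t, x, freqs):
    """Classical Lomb-Scargle periodogram of an unevenly sampled series;
    freqs are ordinary frequencies (cycles per time unit)."""
    t = np.asarray(t, dtype=float)
    x = np.asarray(x, dtype=float) - np.mean(x)
    P = np.empty(len(freqs))
    for k, f in enumerate(freqs):
        w = 2.0 * np.pi * f
        # time offset tau makes the sine and cosine terms orthogonal
        tau = np.arctan2(np.sum(np.sin(2 * w * t)),
                         np.sum(np.cos(2 * w * t))) / (2 * w)
        c = np.cos(w * (t - tau)); s = np.sin(w * (t - tau))
        P[k] = 0.5 * ((x @ c) ** 2 / (c @ c) + (x @ s) ** 2 / (s @ s))
    return P
```

Running it on both series at the same frequency grid, with phase information retained, is the starting point for the coherency and phase estimates described in the abstract.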

  4. Graphical Data Analysis on the Circle: Wrap-Around Time Series Plots for (Interrupted) Time Series Designs.

    PubMed

    Rodgers, Joseph Lee; Beasley, William Howard; Schuelke, Matthew

    2014-01-01

    Many data structures, particularly time series data, are naturally seasonal, cyclical, or otherwise circular. Past graphical methods for time series have focused on linear plots. In this article, we move graphical analysis onto the circle. We focus on 2 particular methods, one old and one new. Rose diagrams are circular histograms and can be produced in several different forms using the RRose software system. In addition, we propose, develop, illustrate, and provide software support for a new circular graphical method, called Wrap-Around Time Series Plots (WATS Plots), which is a graphical method useful to support time series analyses in general but in particular in relation to interrupted time series designs. We illustrate the use of WATS Plots with an interrupted time series design evaluating the effect of the Oklahoma City bombing on birthrates in Oklahoma County during the 10 years surrounding the bombing of the Murrah Building in Oklahoma City. We compare WATS Plots with linear time series representations and overlay them with smoothing and error bands. Each method is shown to have advantages in relation to the other; in our example, the WATS Plots more clearly show the existence and effect size of the fertility differential.

  5. A Systematic Review of Methodology: Time Series Regression Analysis for Environmental Factors and Infectious Diseases

    PubMed Central

    Imai, Chisato; Hashizume, Masahiro

    2015-01-01

    Background: Time series analysis is suitable for investigations of relatively direct and short-term effects of exposures on outcomes. In environmental epidemiology studies, this method has been one of the standard approaches for assessing the impacts of environmental factors on acute non-infectious diseases (e.g. cardiovascular deaths), conventionally with generalized linear or additive models (GLM and GAM). However, the same analysis practices are often observed with infectious diseases despite the substantial differences from non-infectious diseases that may result in analytical challenges. Methods: Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, a systematic review was conducted to elucidate important issues in assessing the associations between environmental factors and infectious diseases using time series analysis with GLM and GAM. Published studies on the associations between weather factors and malaria, cholera, dengue, and influenza were targeted. Findings: Our review raised issues regarding the estimation of susceptible populations and exposure lag times, the adequacy of seasonal adjustments, the presence of strong autocorrelations, and the lack of a smaller observation time unit for outcomes (i.e. daily data). These concerns may be attributable to features specific to infectious diseases, such as transmission among individuals and complicated causal mechanisms. Conclusion: The consequence of not taking adequate measures to address these issues is distortion of the appropriate risk quantification of exposure factors. Future studies should pay careful attention to details and examine alternative models or methods that improve studies using time series regression analysis for environmental determinants of infectious diseases. PMID:25859149

  6. An approach for estimating time-variable rates from geodetic time series

    NASA Astrophysics Data System (ADS)

    Didova, Olga; Gunter, Brian; Riva, Riccardo; Klees, Roland; Roese-Koerner, Lutz

    2016-11-01

    There has been considerable research in the literature focused on computing and forecasting sea-level changes in terms of constant trends or rates. The Antarctic ice sheet is one of the main contributors to sea-level change with highly uncertain rates of glacial thinning and accumulation. Geodetic observing systems such as the Gravity Recovery and Climate Experiment (GRACE) and the Global Positioning System (GPS) are routinely used to estimate these trends. In an effort to improve the accuracy and reliability of these trends, this study investigates a technique that allows the estimated rates, along with co-estimated seasonal components, to vary in time. For this, state space models are defined and then solved by a Kalman filter (KF). The reliable estimation of noise parameters is one of the main problems encountered when using a KF approach; it is solved here by numerically optimizing the likelihood. Since the optimization problem is non-convex, it is challenging to find an optimal solution. To address this issue, we limited the parameter search space using classical least-squares adjustment (LSA). In this context, we also tested the use of inequality constraints by directly verifying whether they are supported by the data. The suggested technique for time-series analysis is expanded to classify and handle time-correlated observational noise within the state space framework. The performance of the method is demonstrated using GRACE and GPS data at the CAS1 station located in East Antarctica and compared to the commonly used LSA. The results suggest that the outlined technique allows for more reliable trend estimates, as well as for more physically meaningful interpretations, while validating independent observing systems.
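
    The simplest state space model with a time-variable rate is the local linear trend: the state carries a level and a slope, and the Kalman filter updates both from each observation. A sketch of that core recursion only (the paper's co-estimated seasonal terms, noise-parameter optimization, and constraint handling are not reproduced; the noise variances below are hand-set):

```python
import numpy as np

def local_linear_trend_kf(y, q_level=1e-4, q_slope=1e-4, r=1.0):
    """Kalman filter for a local linear trend model: state = [level, slope],
    level_t = level_{t-1} + slope_{t-1} + w1, slope_t = slope_{t-1} + w2,
    y_t = level_t + v.  Returns filtered levels and slopes."""
    F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition
    H = np.array([[1.0, 0.0]])               # we observe the level only
    Q = np.diag([q_level, q_slope])          # process noise covariance
    x = np.zeros(2)
    P = np.eye(2) * 1e4                      # diffuse initial uncertainty
    levels, slopes = [], []
    for obs in y:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update
        S = H @ P @ H.T + r
        K = (P @ H.T) / S
        x = x + (K * (obs - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
        levels.append(x[0]); slopes.append(x[1])
    return np.array(levels), np.array(slopes)
```

The filtered slope is the time-variable rate; the q parameters control how quickly it is allowed to change, which is exactly the trade-off the noise-parameter estimation in the paper resolves from the data.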

  8. Two Machine Learning Approaches for Short-Term Wind Speed Time-Series Prediction.

    PubMed

    Ak, Ronay; Fink, Olga; Zio, Enrico

    2016-08-01

    The increasing liberalization of European electricity markets, the growing proportion of intermittent renewable energy being fed into the energy grids, and also new challenges in the patterns of energy consumption (such as electric mobility) require flexible and intelligent power grids capable of providing efficient, reliable, economical, and sustainable energy production and distribution. From the supplier side, particularly, the integration of renewable energy sources (e.g., wind and solar) into the grid imposes an engineering and economic challenge because of the limited ability to control and dispatch these energy sources due to their intermittent characteristics. Time-series prediction of wind speed for wind power production is a particularly important and challenging task, wherein prediction intervals (PIs) are preferable results of the prediction, rather than point estimates, because they provide information on the confidence in the prediction. In this paper, two different machine learning approaches to assess PIs of time-series predictions are considered and compared: 1) multilayer perceptron neural networks trained with a multiobjective genetic algorithm and 2) extreme learning machines combined with the nearest neighbors approach. The proposed approaches are applied for short-term wind speed prediction from a real data set of hourly wind speed measurements for the region of Regina in Saskatchewan, Canada. Both approaches demonstrate good prediction precision and provide complementary advantages with respect to different evaluation criteria. PMID:25910257
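
    The second approach in the paper pairs a point predictor with a nearest-neighbors step to turn point forecasts into intervals. As a loose, much-simplified sketch of the nearest-neighbors idea only (not the paper's ELM-based method; function name, k, and alpha are ours), a prediction interval can be read off the empirical quantiles of the targets of the k nearest training inputs:

```python
import numpy as np

def knn_prediction_interval(X_train, y_train, x_new, k=30, alpha=0.1):
    """Nearest-neighbour prediction interval: take the k training inputs
    closest to x_new and use empirical quantiles of their targets as the
    (1 - alpha) interval bounds."""
    d = np.linalg.norm(X_train - x_new, axis=1)
    nn = np.argsort(d)[:k]
    lo, hi = np.quantile(y_train[nn], [alpha / 2, 1 - alpha / 2])
    return lo, hi
```

On held-out data the empirical coverage of such intervals should sit near the nominal 1 − alpha level, which is one of the evaluation criteria used to compare PI methods.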

  10. The EarthLabs Climate Series: Approaching Climate Literacy From Multiple Contexts

    NASA Astrophysics Data System (ADS)

    Haddad, N.; Ledley, T. S.; Ellins, K.; McNeal, K.; Bardar, E. W.; Youngman, E.; Lockwood, J.; Dunlap, C.

    2015-12-01

    The EarthLabs Climate Series is a set of four distinct but related high school curriculum modules that help build student and teacher understanding of our planet's complex climate system. The web-based, freely available curriculum modules include a rich set of resources for teachers, and are tied together by a common set of climate related themes that include: 1) the Earth system with the complexities of its positive and negative feedback loops; 2) the range of temporal and spatial scales at which climate, weather, and other Earth system processes occur; and 3) the recurring question, "How do we know what we know about Earth's past and present climate?" which addresses proxy data and scientific instrumentation. The four modules (Climate and the Cryosphere; Climate and the Biosphere; Climate and the Carbon Cycle; and Climate Detectives) approach climate literacy from different contexts, and have provided teachers of biology, chemistry, marine science, environmental science, and earth science with opportunities to address climate science by selecting a module that best supplements the content of their particular course. This presentation will highlight the four curriculum modules in the Climate Series, the multiple pathways they offer teachers for introducing climate science into their existing courses, and the two newest elements of the series: the Climate Series Intro, which holds an extensive set of climate related resources for teachers; and the Climate Detectives module, which is based on the 2013 expedition of the Joides Resolution to collect cores from the seafloor below the Gulf of Alaska.

  11. Three approaches to reliability analysis

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.

    1989-01-01

    It is noted that current reliability analysis tools differ not only in their solution techniques, but also in their approach to model abstraction. The analyst must be satisfied with the constraints that are intrinsic to any combination of solution technique and model abstraction. To get a better idea of the nature of these constraints, three reliability analysis tools (HARP, ASSIST/SURE, and CAME) were used to model portions of the Integrated Airframe/Propulsion Control System architecture. When presented with the example problem, all three tools failed to produce correct results. In all cases, either the tool or the model had to be modified. It is suggested that most of the difficulty is rooted in the large model size and long computational times which are characteristic of Markov model solutions.
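
    The Markov-model solutions mentioned above can be illustrated on a toy two-unit parallel system (an illustrative failure rate and a hand-rolled integrator; none of HARP, ASSIST/SURE, or CAME is involved):

```python
import numpy as np

lam = 1e-3  # per-hour failure rate of each unit (illustrative value)
# States: 0 = both units up, 1 = one up, 2 = system failed (absorbing).
Q = np.array([[-2 * lam, 2 * lam, 0.0],
              [0.0, -lam, lam],
              [0.0, 0.0, 0.0]])

p = np.array([1.0, 0.0, 0.0])  # start with both units working
dt, T = 0.1, 1000.0
for _ in range(int(T / dt)):
    p = p + dt * (p @ Q)  # forward-Euler integration of dp/dt = p Q

# Closed-form unreliability of a two-unit parallel system without repair.
analytic = 1.0 - (2 * np.exp(-lam * T) - np.exp(-2 * lam * T))
print(p[2], analytic)  # numerical vs closed-form probability of system failure
```

Even this three-state chain hints at the scaling problem the abstract describes: realistic architectures multiply states combinatorially, driving up model size and solution time.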

  12. New insights into time series analysis. I. Correlated observations

    NASA Astrophysics Data System (ADS)

    Ferreira Lopes, C. E.; Cross, N. J. G.

    2016-02-01

    indices computed in this new approach allow us to reduce misclassification and these will be implemented in an automatic classifier which will be addressed in a forthcoming paper in this series.

  13. Evaluating disease management program effectiveness: an introduction to time-series analysis.

    PubMed

    Linden, Ariel; Adams, John L; Roberts, Nancy

    2003-01-01

    Currently, the most widely used method in the disease management (DM) industry for evaluating program effectiveness is referred to as the "total population approach." This model is a pretest-posttest design, with the most basic limitation being that without a control group, there may be sources of bias and/or competing extraneous confounding factors that offer a plausible rationale explaining the change from baseline. Furthermore, with the current inclination of DM programs to use financial indicators rather than program-specific utilization indicators as the principal measure of program success, additional biases are introduced that may cloud evaluation results. This paper presents a non-technical introduction to time-series analysis (using disease-specific utilization measures) as an alternative, and more appropriate, approach to evaluating DM program effectiveness than the current total population approach.
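
    A minimal form of the time-series alternative is segmented (interrupted time-series) regression on utilization data (a sketch with simulated data and assumed effect sizes, not an analysis of any DM program):

```python
import numpy as np

rng = np.random.default_rng(1)
n, t0 = 48, 24  # 48 monthly observations, program starts at month 24
t = np.arange(n)
post = (t >= t0).astype(float)
# Simulated utilization: baseline trend plus a level drop after the program starts.
y = 10 + 0.05 * t - 1.5 * post + 0.3 * rng.standard_normal(n)

# Design: intercept, baseline trend, level change, post-intervention slope change.
X = np.column_stack([np.ones(n), t, post, post * (t - t0)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # beta[2] estimates the immediate level change at the intervention
```

The pre-intervention segment acts as the comparison the total population approach lacks: the program effect is judged against the projected baseline trend, not against a single pretest mean.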

  14. Spatiotemporal analysis of GPS time series in vertical direction using independent component analysis

    NASA Astrophysics Data System (ADS)

    Liu, Bin; Dai, Wujiao; Peng, Wei; Meng, Xiaolin

    2015-11-01

    GPS has been widely used in the field of geodesy and geodynamics thanks to technological development and improved positioning accuracy. A time series observed by GPS in the vertical direction usually contains tectonic signals, non-tectonic signals, residual atmospheric delay, measurement noise, etc. Analyzing this information is the basis of crustal deformation research. Furthermore, analyzing the GPS time series and extracting the non-tectonic information is helpful for studying the effect of various geophysical events. Principal component analysis (PCA) is an effective tool for spatiotemporal filtering and GPS time series analysis. But because it cannot extract statistically independent components, PCA is ill-suited to recovering the information implicit in the time series. Independent component analysis (ICA) is a statistical method of blind source separation (BSS) and can separate original signals from mixed observations. In this paper, ICA is used as a spatiotemporal filtering method to analyze the spatial and temporal features of vertical GPS coordinate time series in the UK and the Sichuan-Yunnan region in China. Meanwhile, the contributions from atmospheric and soil moisture mass loading are evaluated. Analysis of the correlation between the independent components and the mass loading, together with their spatial distribution, shows that the signals extracted by ICA have a strong correlation with the non-tectonic deformation, indicating that ICA performs better in spatiotemporal analysis.
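
    The PCA-vs-ICA distinction can be demonstrated with a compact FastICA implementation on a synthetic mixture (a minimal sketch of the blind source separation idea, not the GPS processing chain; sources, mixing matrix, and iteration count are all assumptions):

```python
import numpy as np

def fastica(X, n_comp, n_iter=200, seed=0):
    """Minimal symmetric FastICA (tanh nonlinearity); rows of X are samples."""
    X = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)   # whitening via SVD
    K = (Vt[:n_comp] / s[:n_comp, None]) * np.sqrt(len(X))
    Z = X @ K.T                                        # whitened data
    W = np.random.default_rng(seed).standard_normal((n_comp, n_comp))
    for _ in range(n_iter):
        G = np.tanh(Z @ W.T)
        # Fixed-point update: E[z g(w'z)] - E[g'(w'z)] w, per component.
        W_new = (G.T @ Z) / len(Z) - np.diag((1 - G**2).mean(axis=0)) @ W
        u, _, vt = np.linalg.svd(W_new)                # symmetric decorrelation
        W = u @ vt
    return Z @ W.T  # estimated independent components (up to sign and order)

# Two independent sources mixed into three synthetic "stations".
t = np.linspace(0, 10, 2000)
S = np.column_stack([np.sign(np.sin(3 * t)), np.sin(7 * t)])
A = np.array([[1.0, 0.5], [0.5, 1.0], [0.3, 0.8]])
est = fastica(S @ A.T, 2)
```

PCA applied to the same mixture would return orthogonal variance-ranked combinations of the sources; ICA recovers the sources themselves, which is why it is preferred for isolating physically distinct loading signals.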

  15. Discovery of progenitor cell signatures by time-series synexpression analysis during Drosophila embryonic cell immortalization

    PubMed Central

    Dequéant, Mary-Lee; Fagegaltier, Delphine; Hu, Yanhui; Spirohn, Kerstin; Simcox, Amanda; Hannon, Gregory J.; Perrimon, Norbert

    2015-01-01

    The use of time series profiling to identify groups of functionally related genes (synexpression groups) is a powerful approach for the discovery of gene function. Here we apply this strategy during RasV12 immortalization of Drosophila embryonic cells, a phenomenon not well characterized. Using high-resolution transcriptional time-series datasets, we generated a gene network based on temporal expression profile similarities. This analysis revealed that common immortalized cells are related to adult muscle precursors (AMPs), a stem cell-like population contributing to adult muscles and sharing properties with vertebrate satellite cells. Remarkably, the immortalized cells retained the capacity for myogenic differentiation when treated with the steroid hormone ecdysone. Further, we validated in vivo the transcription factor CG9650, the ortholog of mammalian Bcl11a/b, as a regulator of AMP proliferation predicted by our analysis. Our study demonstrates the power of time series synexpression analysis to characterize Drosophila embryonic progenitor lines and identify stem/progenitor cell regulators. PMID:26438832

  16. Three Approaches to Environmental Resources Analysis.

    ERIC Educational Resources Information Center

    Harvard Univ., Cambridge, MA. Graduate School of Design.

    This booklet, the first of a projected series related to the development of methodologies and techniques for environmental planning and design, examines three approaches that are currently being used to identify, analyze, and evaluate the natural and man-made resources that comprise the physical environment. One approach by G. Angus Hills uses a…

  17. Eigen analysis of series compensation schemes reducing the potential of subsynchronous resonance

    SciTech Connect

    Iravani, M.R.; Edris, A.A.

    1995-05-01

    A previous paper describes a new concept for mitigation of the phenomenon of subsynchronous resonance (SSR) based on asymmetrical series capacitor compensation at SSR frequencies. The studies reported in that paper are based on a digital time-domain simulation technique. This paper provides a quantitative evaluation of the concept using a novel eigen analysis approach. The eigen analysis approach represents the mathematical models of power system components on a three-phase basis, and can evaluate the impacts of asymmetry and imbalance on the system dynamics in the subsynchronous frequency range. The study results confirm the technical feasibility of the proposed SSR countermeasure. This paper opens an avenue for the examination of active power filter topologies that introduce artificial asymmetry at SSR frequencies to counteract torsional oscillations.

  18. Use of recurrence plot and recurrence quantification analysis in Taiwan unemployment rate time series

    NASA Astrophysics Data System (ADS)

    Chen, Wei-Shing

    2011-04-01

    The aim of this article is to answer the question of whether Taiwan's unemployment rate dynamics are generated by a non-linear deterministic process. The paper applies a recurrence plot and recurrence quantification approach based on the analysis of non-stationary hidden transition patterns in the unemployment rate of Taiwan. The case study uses time series data of Taiwan's unemployment rate from 1978/01 to 2010/06. The results show that recurrence techniques are able to identify various phases in the evolution of unemployment transitions in Taiwan.
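
    The recurrence-plot construction itself reduces to thresholding pairwise distances; a minimal sketch (scalar series, assumed threshold, no phase-space embedding):

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence plot: R[i, j] = 1 when |x_i - x_j| < eps."""
    d = np.abs(x[:, None] - x[None, :])
    return (d < eps).astype(int)

def recurrence_rate(R):
    """Fraction of recurrent points, the simplest RQA measure."""
    return R.mean()

# A periodic series recurs far more often than white noise at the same threshold.
t = np.arange(400)
periodic = np.sin(2 * np.pi * t / 25)
noise = np.random.default_rng(2).standard_normal(400)
rr_p = recurrence_rate(recurrence_matrix(periodic, 0.1))
rr_n = recurrence_rate(recurrence_matrix(noise, 0.1))
print(rr_p, rr_n)
```

Deterministic structure shows up as diagonal line segments in R; recurrence quantification analysis turns those line-length statistics into the indicators used to stage the unemployment transitions.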

  19. Traffic time series analysis by using multiscale time irreversibility and entropy

    NASA Astrophysics Data System (ADS)

    Wang, Xuejiao; Shang, Pengjian; Fang, Jintang

    2014-09-01

    Traffic systems, especially urban traffic systems, are regulated by different kinds of interacting mechanisms which operate across multiple spatial and temporal scales. Traditional approaches, such as the empirical probability distribution function and detrended fluctuation analysis, fail to account for the multiple time scales inherent in time series and have led to conflicting results. The role of multiscale analytical methods in traffic time series is a frontier area of investigation. In this paper, our main purpose is to introduce a new method, multiscale time irreversibility, which helps extract information from the traffic time series we studied. In addition, to analyse the complexity of traffic volume time series of the Beijing Ring 2, 3, and 4 roads on workdays versus weekends, covering August 18, 2012 to October 26, 2012, we also compare the results of this new method with those of the well-known multiscale entropy method. The results show that a higher asymmetry index corresponds to a higher traffic congestion level, in accord with the results obtained by multiscale entropy.
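
    The core quantity, time irreversibility across scales, can be sketched as a simple up/down increment imbalance after coarse-graining (an illustrative index on synthetic series, not the exact statistic or traffic data used in the paper):

```python
import numpy as np

def coarse_grain(x, scale):
    """Non-overlapping window averages, as in multiscale entropy."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def asymmetry_index(x):
    """Simple irreversibility measure: imbalance between up and down increments."""
    d = np.diff(x)
    up, down = (d > 0).sum(), (d < 0).sum()
    return abs(up - down) / max(up + down, 1)

rng = np.random.default_rng(3)
noise = rng.standard_normal(5000)                     # reversible by construction
sawtooth = np.tile(np.linspace(0, 1, 10), 500)        # slow rise, abrupt fall

for scale in (1, 2, 4):
    print(scale, asymmetry_index(coarse_grain(noise, scale)),
          asymmetry_index(coarse_grain(sawtooth, scale)))
```

A reversible process looks statistically the same forwards and backwards, so its index stays near zero at every scale; asymmetric build-up/release dynamics, like congestion, do not.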

  1. Water Resources Management Plan for Ganga River using SWAT Modelling and Time series Analysis

    NASA Astrophysics Data System (ADS)

    Satish, L. N. V.

    2015-12-01

    Water resources management of the Ganga River is one of the primary objectives of the National Ganga River Basin Environmental Management Plan. The present study aims to carry out a water balance study and to develop appropriate methodologies to compute environmental flow in the middle Ganga river basin between Patna and Farakka, India. The methodology adopted here is to set up a hydrological model to estimate monthly discharge in the tributaries under natural conditions, to perform hydrological alteration analysis of both observed and simulated discharge series, to carry out flow health analysis to assess the status of stream health over the last 4 decades, and to estimate the e-flow using flow health indicators. ArcSWAT was used to simulate 8 tributaries, including the Kosi and Gandak. This modelling is quite encouraging and provides the monthly water balance analysis for all tributaries in this study. The water balance analysis indicates a significant change in the surface water and groundwater interaction pattern within the study period. Indicators of hydrological alteration have been used for both observed and simulated data series to quantify the hydrological alteration that occurred in the tributaries and the main river over the last 4 decades. For the temporal variation of stream health, the flow health tool has been used on observed and simulated discharge data. A detailed stream health analysis has been performed using 3 approaches based on i) observed flow time series, ii) observed and simulated flow time series, and iii) simulated flow time series at the small upland basin, major tributary, and main Ganga river basin levels. At the upland basin level, these approaches show that stream health is good, with non-significant temporal variation. At the major tributary level, stream health is found to have been deteriorating since the 1970s. At the main Ganga reach level, river health does not show any declining trend. Finally, E-flows…

  2. Chaotic time series analysis of vision evoked EEG

    NASA Astrophysics Data System (ADS)

    Zhang, Ningning; Wang, Hong

    2010-01-01

    To investigate human brain activity during aesthetic processing, a beautiful woman's face picture and an ugly buffoon's face picture were presented. Twelve subjects were assigned the aesthetic processing task while the electroencephalogram (EEG) was recorded. Event-related brain potentials (ERPs) were acquired from the 32 scalp electrodes, and the ugly buffoon picture produced larger amplitudes for the N1, P2, N2, and late slow wave components. Average ERPs from the ugly buffoon picture were larger than those from the beautiful woman picture. The ERP signals show that the ugly buffoon picture elicited stronger emotional responses than the beautiful woman's face, because of the expression on the buffoon's face. Then, chaotic time series analysis was carried out to calculate the largest Lyapunov exponent using the small-data-set method and the correlation dimension using the G-P algorithm. The results show that the largest Lyapunov exponents of the ERP signals are greater than zero, which indicates that the ERP signals may be chaotic. The correlation dimensions from the beautiful woman picture are larger than those from the ugly buffoon picture. The comparison of the correlation dimensions shows that the beautiful face can excite the brain nerve cells. The research in this paper lends support to the view that the cerebrum's activity is chaotic under certain picture stimuli.

  4. On the Fourier and Wavelet Analysis of Coronal Time Series

    NASA Astrophysics Data System (ADS)

    Auchère, F.; Froment, C.; Bocchialini, K.; Buchlin, E.; Solomon, J.

    2016-07-01

    Using Fourier and wavelet analysis, we critically re-assess the significance of our detection of periodic pulsations in coronal loops. We show that the proper identification of the frequency dependence and statistical properties of the different components of the power spectra provides a strong argument against the common practice of data detrending, which tends to produce spurious detections around the cut-off frequency of the filter. In addition, the white and red noise models built into the widely used wavelet code of Torrence & Compo cannot, in most cases, adequately represent the power spectra of coronal time series, thus also possibly causing false positives. Both effects suggest that several reports of periodic phenomena should be re-examined. The Torrence & Compo code nonetheless effectively computes rigorous confidence levels if provided with pertinent models of mean power spectra, and we describe the appropriate manner in which to call its core routines. We recall the meaning of the default confidence levels output from the code, and we propose new Monte-Carlo-derived levels that take into account the total number of degrees of freedom in the wavelet spectra. These improvements allow us to confirm that the power peaks that we detected have a very low probability of being caused by noise.
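
    The key point, testing spectral peaks against a pertinent mean-spectrum model rather than against white noise, can be sketched as follows (an illustrative AR(1) red-noise test with a known autoregressive parameter; the Torrence & Compo wavelet machinery itself is not reproduced):

```python
import numpy as np

rng = np.random.default_rng(4)
n, phi = 1024, 0.7
x = np.zeros(n)
for i in range(1, n):
    x[i] = phi * x[i - 1] + rng.standard_normal()  # AR(1) red noise
f0 = 100 / n  # injected oscillation frequency (an exact FFT bin)
x = x + 1.5 * np.sin(2 * np.pi * f0 * np.arange(n))

f = np.fft.rfftfreq(n)
power = np.abs(np.fft.rfft(x - x.mean()))**2 / n

# Theoretical AR(1) ("red noise") spectral shape, scaled to the data's mean power.
red = (1 - phi**2) / (1 + phi**2 - 2 * phi * np.cos(2 * np.pi * f))
red = red * power[1:].mean() / red[1:].mean()

# Each periodogram ordinate is ~ (mean/2) * chi-square with 2 dof,
# so the 95% confidence level is mean * (-ln 0.05), about 3x the mean.
thresh = red * (-np.log(0.05))
sig = f[1:][power[1:] > thresh[1:]]
print(sig)  # frequencies exceeding the 95% red-noise level
```

Testing the same peak against a white-noise background would set the low-frequency threshold far too low, which is exactly the false-positive mechanism the abstract warns about.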

  5. Permutation Entropy Analysis of Geomagnetic Indices Time Series

    NASA Astrophysics Data System (ADS)

    De Michelis, Paola; Consolini, Giuseppe

    2013-04-01

    The Earth's magnetospheric dynamics displays a very complex nature in response to solar wind changes, as widely documented in the scientific literature. This complex dynamics manifests in various physical processes occurring in different regions of the Earth's magnetosphere, as clearly revealed by previous analyses of geomagnetic indices (AE indices, Dst, Sym-H, etc.). One of the most interesting features of the geomagnetic indices as proxies of the Earth's magnetospheric dynamics is the multifractional nature of their time series. This aspect has been interpreted as the occurrence of intermittency and dynamical phase transitions in the Earth's magnetosphere. Here, we investigate the Markovian nature of different geomagnetic indices (AE indices, Sym-H, Asy-H) and their fluctuations by means of permutation entropy analysis. The results clearly show the non-Markovian and distinct nature of the different sets of geomagnetic indices, pointing towards diverse underlying physical processes. A discussion in connection with the nature of the physical processes responsible for each set of indices and their multifractional character is attempted.
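
    Permutation entropy itself is short to implement (a minimal Bandt & Pompe sketch on synthetic series, not the geomagnetic indices):

```python
import numpy as np
from math import factorial

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy (Bandt & Pompe) of a 1-D series."""
    n = len(x) - (order - 1) * delay
    # Ordinal pattern of each embedded window of length `order`.
    patterns = np.array([x[i:i + (order - 1) * delay + 1:delay].argsort()
                         for i in range(n)])
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log(p)).sum() / np.log(factorial(order))

rng = np.random.default_rng(5)
pe_noise = permutation_entropy(rng.standard_normal(5000))
pe_sine = permutation_entropy(np.sin(0.05 * np.arange(5000)))
print(pe_noise, pe_sine)  # near 1 for white noise, much lower for a smooth signal
```

Because it works on ordinal patterns rather than amplitudes, the measure is robust to monotone distortions of the signal, which makes it attractive for heterogeneous index data.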

  6. Detrended fluctuation analysis of laser Doppler flowmetry time series.

    PubMed

    Esen, Ferhan; Aydin, Gülsün Sönmez; Esen, Hamza

    2009-12-01

    Detrended fluctuation analysis (DFA) of laser Doppler flow (LDF) time series appears to yield improved prognostic power in microvascular dysfunction, through calculation of the scaling exponent, α. In the present study, the change in microvascular function induced by long-lasting strenuous activity was evaluated by DFA in basketball players compared with sedentary controls. Forearm skin blood flow was measured at rest and during local heating. Three scaling exponents, the slopes of the three regression lines, were identified, corresponding to cardiac, cardio-respiratory, and local factors. The local scaling exponent was always approximately one in the control group (α = 1.01 ± 0.15) and did not change with local heating. However, we found a broken line with two scaling exponents (α1 = 1.06 ± 0.01 and α2 = 0.75 ± 0.01) in basketball players. The broken line became a single line with one scaling exponent (αT = 0.94 ± 0.01) under local heating. The scaling exponents α2 and αT, being smaller than 1, indicate reduced long-range correlation in blood flow due to a loss of integration in local mechanisms, and suggest endothelial dysfunction as the most likely candidate. The ability to evaluate microvascular function from a baseline LDF signal at rest is the advantage of DFA over other methods, spectral or not, that use the amplitude changes of an evoked relative signal. PMID:19660479
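
    The scaling exponent α can be computed with a compact DFA routine (a generic sketch on synthetic noise, not the LDF data; window sizes are assumptions):

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis: returns the scaling exponent alpha."""
    y = np.cumsum(x - np.mean(x))  # integrated profile
    F = []
    for s in scales:
        n = len(y) // s
        segs = y[:n * s].reshape(n, s)
        t = np.arange(s)
        # Remove a linear trend from each window and collect the residual power.
        rms = []
        for seg in segs:
            c = np.polyfit(t, seg, 1)
            rms.append(np.mean((seg - np.polyval(c, t))**2))
        F.append(np.sqrt(np.mean(rms)))
    # alpha is the slope of log F(s) versus log s.
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

rng = np.random.default_rng(6)
white = rng.standard_normal(4096)
scales = np.array([8, 16, 32, 64, 128, 256])
a_white = dfa(white, scales)          # ~0.5 for uncorrelated noise
a_walk = dfa(np.cumsum(white), scales)  # ~1.5 for its random-walk integral
print(a_white, a_walk)
```

The benchmark values α ≈ 0.5 (uncorrelated) and α ≈ 1.0 (1/f-like, as in the controls) are what give the reported exponents their physiological interpretation.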

  8. A combined NSMC and pole series expansion approach for UXO discrimination

    NASA Astrophysics Data System (ADS)

    Shubitidze, F.; Barrowes, B. E.; O'Neill, K.; Shamatava, I.; Fernández, J. P.

    2007-04-01

    This paper combines the normalized surface magnetic charge (NSMC) model and a pole series expansion method to determine the scattered-field singularities directly from measured EMI data, i.e. to find a buried object's location and orientation without solving a time-consuming inverse-scattering problem. The NSMC is very simple to program and robust for predicting the EMI responses of various objects. The technique is applicable to any combination of magnetic or electromagnetic induction data for any arbitrary homogeneous or heterogeneous 3-D object or set of objects. In the proposed approach, EMI responses are first collected at a measurement surface. The NSMC approach, which distributes magnetic charge on a surface conformal to, but not coincident with, the measurement surface, is then used to extend the measured EMI magnetic field above the data-collection surface, generating spatially distributed data. The pole series expansion approach is then employed to localize the scattered-field singularities, i.e. to determine the object's location and orientation. Once the object's location and orientation are found, the total NSMC, which is characteristic of the object, is calculated and used for discriminating between UXO and non-UXO items. The algorithm is tested against actual EM-63 time-domain EMI data collected at the ERDC test-stand site for actual UXO. Several numerical results are presented and discussed to demonstrate the applicability of the proposed method for determining a buried object's location as well as for discriminating between objects of interest and non-hazardous items.

  9. COST-ES0601: Advances in homogenisation methods of climate series: an integrated approach (HOME)

    NASA Astrophysics Data System (ADS)

    Mestre, O.; Auer, I.; Venema, V.; Stepanek, P.; Szentimrey, T.; Grimvall, A.; Aguilar, E.

    2009-04-01

    The COST Action ES0601, Advances in homogenisation methods of climate series: an integrated approach, is nearing the end of its second year. The action is intended to provide the best possible tools for the homogenisation of time series to the climate research community. The scientists involved have made remarkable progress since COST Action ES0601 was launched (see www.homogenisation.org). HOME started with a literature review and a survey of the research community to identify the climatic elements and homogenisation techniques to be considered during the action. This allowed the preparation of the benchmark monthly dataset to be used during the remaining time of the action. This monthly benchmark contains real temperature and precipitation data (with real inhomogeneities), as well as synthetic and surrogate networks, including artificially produced missing values, outliers, local trends, and break inhomogeneities inserted at the rate, size, and distribution found in actual networks. The locations of the outliers and change points are undisclosed to the HOME scientists, who are at present applying different homogenisation approaches and uploading the results so that the performance of their techniques can be analysed. Everyone who works on the homogenisation of climate data is cordially invited to join this exercise. HOME is also working on the production of a daily benchmark dataset, to reproduce the experiment described above at a higher temporal resolution, and on the preparation of freely available homogenisation tools, including the best performing approaches.

  10. The Prediction of Teacher Turnover Employing Time Series Analysis.

    ERIC Educational Resources Information Center

    Costa, Crist H.

    The purpose of this study was to combine knowledge of teacher demographic data with time-series forecasting methods to predict teacher turnover. Moving averages and exponential smoothing were used to forecast discrete time series. The study used data collected from the 22 largest school districts in Iowa, designated as FACT schools. Predictions…

  11. Examining deterrence of adult sex crimes: A semi-parametric intervention time series approach

    PubMed Central

    Park, Jin-Hong; Bandyopadhyay, Dipankar; Letourneau, Elizabeth

    2013-01-01

    Motivated by recent developments in dimension reduction (DR) techniques for time series data, the association of a general deterrent effect with South Carolina (SC)'s registration and notification (SORN) policy for preventing sex crimes was examined. Using adult sex crime arrestee data from 1990 to 2005, the idea of the Central Mean Subspace (CMS) is extended to intervention time series analysis (CMS-ITS) to model the sequential intervention effects of 1995 (the year SC's SORN policy was initially implemented) and 1999 (the year the policy was revised to include online notification) on the time series spectrum. The CMS-ITS model estimation was achieved via kernel smoothing techniques and compared to interrupted auto-regressive integrated moving average (ARIMA) models. Simulation studies and application to the real data underscore our model's ability to achieve parsimony and to detect intervention effects not identified by traditional ARIMA models. From a public health perspective, findings from this study draw attention to the potential general deterrent effects of SC's SORN policy. These findings are considered in light of the overall body of research on sex crime arrestee registration and notification policies, which remain controversial. PMID:24795489

  12. Time series analysis of collective motions in proteins

    NASA Astrophysics Data System (ADS)

    Alakent, Burak; Doruker, Pemra; Çamurdan, Mehmet C.

    2004-01-01

    The dynamics of α-amylase inhibitor tendamistat around its native state is investigated using time series analysis of the principal components of the Cα atomic displacements obtained from molecular dynamics trajectories. Collective motion along a principal component is modeled as a homogeneous nonstationary process, which is the result of damped oscillations in local minima superimposed on a random walk. The motion in local minima is described by a stationary autoregressive moving average model, consisting of the frequency, damping factor, moving average parameters, and random shock terms. Frequencies for the first 50 principal components are found to be in the 3-25 cm-1 range, which are well correlated with the principal component indices and also with atomistic normal mode analysis results. Damping factors, though their correlation is less pronounced, decrease as principal component indices increase, indicating that low frequency motions are less affected by friction. The existence of a positive moving average parameter indicates that the stochastic force term is likely to disturb the mode in opposite directions for two successive sampling times, showing the mode's tendency to stay close to the minimum. All four of these parameters affect the mean square fluctuations of a principal mode within a single minimum. The inter-minima transitions are described by a random walk model, which is driven by a random shock term considerably smaller than that for the intra-minimum motion. The principal modes are classified into three subspaces based on their dynamics: essential, semiconstrained, and constrained, at least in partial consistency with previous studies. The Gaussian-type distributions of the intermediate modes, called "semiconstrained" modes, are explained by asserting that this random walk behavior is not completely free but confined between energy barriers.
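
    The mapping from a fitted autoregressive model to a mode's frequency and damping can be sketched for the AR(2) case (synthetic data with assumed pole parameters, not the tendamistat trajectories; the moving-average terms are omitted):

```python
import numpy as np

rng = np.random.default_rng(7)
# Simulate a damped stochastic oscillator as an AR(2) process:
# x_t = 2 r cos(w) x_{t-1} - r^2 x_{t-2} + e_t, pole radius r, angle w.
r_true, w_true = 0.95, 0.4
a1, a2 = 2 * r_true * np.cos(w_true), -r_true**2
x = np.zeros(5000)
for i in range(2, len(x)):
    x[i] = a1 * x[i - 1] + a2 * x[i - 2] + rng.standard_normal()

# Least-squares fit of the AR(2) coefficients, then recover frequency and
# damping from the complex roots of the characteristic polynomial.
X = np.column_stack([x[1:-1], x[:-2]])
coef, *_ = np.linalg.lstsq(X, x[2:], rcond=None)
roots = np.roots([1.0, -coef[0], -coef[1]])
r_est, w_est = np.abs(roots[0]), np.abs(np.angle(roots[0]))
print(r_est, w_est)  # should recover the assumed pole radius and frequency
```

The pole angle gives the oscillation frequency per sampling step and the pole radius the damping: the closer r is to 1, the less friction-damped the mode, mirroring the trend reported for the low-frequency components.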

  13. Nonlinear time series analysis of normal and pathological human walking

    NASA Astrophysics Data System (ADS)

    Dingwell, Jonathan B.; Cusumano, Joseph P.

    2000-12-01

    Characterizing locomotor dynamics is essential for understanding the neuromuscular control of locomotion. In particular, quantifying dynamic stability during walking is important for assessing people who have a greater risk of falling. However, traditional biomechanical methods of defining stability have not quantified the resistance of the neuromuscular system to perturbations, suggesting that more precise definitions are required. For the present study, average maximum finite-time Lyapunov exponents were estimated to quantify the local dynamic stability of human walking kinematics. Local scaling exponents, defined as the local slopes of the correlation sum curves, were also calculated to quantify the local scaling structure of each embedded time series. Comparisons were made between overground and motorized treadmill walking in young healthy subjects and between diabetic neuropathic (NP) patients and healthy controls (CO) during overground walking. A modification of the method of surrogate data was developed to examine the stochastic nature of the fluctuations overlying the nominally periodic patterns in these data sets. Results demonstrated that having subjects walk on a motorized treadmill artificially stabilized their natural locomotor kinematics by small but statistically significant amounts. Furthermore, a paradox previously present in the biomechanical literature that resulted from mistakenly equating variability with dynamic stability was resolved. By slowing their self-selected walking speeds, NP patients adopted more locally stable gait patterns, even though they simultaneously exhibited greater kinematic variability than CO subjects. Additionally, the loss of peripheral sensation in NP patients was associated with statistically significant differences in the local scaling structure of their walking kinematics at those length scales where it was anticipated that sensory feedback would play the greatest role. 
Lastly, stride-to-stride fluctuations in the
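
The finite-time Lyapunov exponents used above are commonly estimated with a Rosenstein-style nearest-neighbor divergence scheme; the sketch below is a generic reimplementation under stated assumptions, not the authors' code, and the chaotic logistic-map series is a hypothetical stand-in for gait kinematics.

```python
import numpy as np

def largest_lyapunov(x, dim=5, tau=1, horizon=30, theiler=10):
    """Rosenstein-style estimate of the largest finite-time Lyapunov
    exponent from a scalar series (a minimal sketch)."""
    # Delay embedding of the scalar series.
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
    m = n - horizon
    emb_short = emb[:m]
    # Nearest neighbour of each point, excluding temporally close points.
    d = np.linalg.norm(emb_short[:, None] - emb_short[None, :], axis=2)
    idx = np.arange(m)
    d[np.abs(idx[:, None] - idx[None, :]) < theiler] = np.inf
    nn = d.argmin(axis=1)
    # Average log divergence of neighbour pairs as a function of time.
    div = []
    for k in range(1, horizon):
        sep = np.linalg.norm(emb[idx + k] - emb[nn + k], axis=1)
        good = sep > 0
        div.append(np.mean(np.log(sep[good])))
    # Slope of the early, roughly linear part estimates the exponent.
    k = np.arange(1, horizon)
    return np.polyfit(k[:10], np.array(div)[:10], 1)[0]

# Hypothetical test signal: the chaotic logistic map (positive exponent).
x = np.empty(700)
x[0] = 0.4
for t in range(1, 700):
    x[t] = 4.0 * x[t - 1] * (1.0 - x[t - 1])
x = x[100:]  # discard transient
lam = largest_lyapunov(x)
print(round(lam, 2))
```

A positive estimate indicates local divergence of nearby trajectories; for gait data the exponent is interpreted relative to stride time rather than samples.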

  14. Stochastic approaches for time series forecasting of boron: a case study of Western Turkey.

    PubMed

    Durdu, Omer Faruk

    2010-10-01

    In the present study, seasonal and non-seasonal predictions of boron concentration time series data for the period 1996-2004 from the Büyük Menderes river in western Turkey are addressed by means of linear stochastic models. The methodology presented here is to develop adequate linear stochastic models, known as autoregressive integrated moving average (ARIMA) and multiplicative seasonal autoregressive integrated moving average (SARIMA) models, to predict boron content in the Büyük Menderes catchment. Initially, Box-Whisker plots and Kendall's tau test are used to identify trends during the study period. The measurement locations do not show a significant overall trend in boron concentrations, though marginal increasing and decreasing trends are observed for certain periods at some locations. The ARIMA modeling approach involves three steps: model identification, parameter estimation, and diagnostic checking. In the model identification step, considering the autocorrelation function (ACF) and partial autocorrelation function (PACF) results of the boron data series, different ARIMA models are identified. The model giving the minimum Akaike information criterion (AIC) is selected as the best-fit model. The parameter estimation step indicates that the estimated model parameters are significantly different from zero. The diagnostic check step is applied to the residuals of the selected ARIMA models and the results indicate that the residuals are independent, normally distributed, and homoscedastic. For model validation purposes, the predicted results using the best ARIMA models are compared to the observed data. The predicted data show reasonably good agreement with the actual data. The comparison of the mean and variance of 3-year (2002-2004) observed data versus predicted data from the selected best models shows that the boron model from the ARIMA modeling approach could be used in a safe manner, since the predicted values from these models preserve the basic

  15. Volterra Series Approach for Nonlinear Aeroelastic Response of 2-D Lifting Surfaces

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Marzocca, Piergiovanni; Librescu, Liviu

    2001-01-01

    The determination of the subcritical aeroelastic response and flutter instability of nonlinear two-dimensional lifting surfaces in an incompressible flow-field via a Volterra series approach is addressed. The aeroelastic governing equations include structural nonlinearities, linear unsteady aerodynamics, and an arbitrary time-dependent external pressure pulse. Unsteady aeroelastic nonlinear kernels are determined, and based on these, frequency and time histories of the subcritical aeroelastic response are obtained; in this context the influence of geometric nonlinearities is emphasized. Conclusions and results displaying the implications of the considered effects are supplied.

  16. Approximate Symmetry Reduction Approach: Infinite Series Reductions to the KdV-Burgers Equation

    NASA Astrophysics Data System (ADS)

    Jiao, Xiaoyu; Yao, Ruoxia; Zhang, Shunli; Lou, Sen Y.

    2009-11-01

    For weak dispersion and weak dissipation cases, the (1+1)-dimensional KdV-Burgers equation is investigated in terms of approximate symmetry reduction approach. The formal coherence of similarity reduction solutions and similarity reduction equations of different orders enables series reduction solutions. For the weak dissipation case, zero-order similarity solutions satisfy the Painlevé II, Painlevé I, and Jacobi elliptic function equations. For the weak dispersion case, zero-order similarity solutions are in the form of Kummer, Airy, and hyperbolic tangent functions. Higher-order similarity solutions can be obtained by solving linear variable coefficients ordinary differential equations.

  17. Use of a Principal Components Analysis for the Generation of Daily Time Series.

    NASA Astrophysics Data System (ADS)

    Dreveton, Christine; Guillou, Yann

    2004-07-01

    A new approach for generating daily time series is considered in response to the weather-derivatives market. This approach consists of performing a principal components analysis to create independent variables, the values of which are then generated separately with a random process. Weather derivatives are financial or insurance products that give companies the opportunity to cover themselves against adverse climate conditions. The aim of a generator is to provide a wider range of feasible situations to be used in an assessment of risk. Generation of a temperature time series is required by insurers or bankers for pricing weather options. The provision of conditional probabilities and a good representation of the interannual variance are the main challenges of a generator when used for weather derivatives. The generator was developed according to this new approach using a principal components analysis and was applied to the daily average temperature time series of the Paris-Montsouris station in France. The observed dataset was homogenized and the trend was removed to represent correctly the present climate. The results obtained with the generator show that it represents correctly the interannual variance of the observed climate; this is the main result of the work, because one of the main discrepancies of other generators is their inability to represent accurately the observed interannual climate variance, a discrepancy that is not acceptable for an application to weather derivatives. The generator was also tested to calculate conditional probabilities: for example, knowledge of the aggregated value of heating degree-days in the middle of the heating season allows one to estimate the probability of reaching a threshold at the end of the heating season. This represents the main application of a climate generator for use with weather derivatives.
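
The PCA-then-generate scheme might be sketched as follows; the synthetic "years × days" temperature matrix and its covariance kernel are hypothetical stand-ins, not the Paris-Montsouris data, and per-component Gaussian sampling is an assumption about the random process used.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical stand-in for detrended daily temperatures: 40 "years" of
# a 365-day seasonal cycle plus day-to-day correlated noise.
days = np.arange(365)
seasonal = 10 * np.cos(2 * np.pi * (days - 200) / 365)
cov = 4 * np.exp(-np.abs(days[:, None] - days[None, :]) / 5)
years = seasonal + rng.multivariate_normal(np.zeros(365), cov, size=40)

# PCA via SVD of the anomaly matrix: rows are years, columns are days.
mean = years.mean(axis=0)
anom = years - mean
u, s, vt = np.linalg.svd(anom, full_matrices=False)
scores = u * s  # PC scores: approximately independent variables

# Generate a synthetic year: draw each PC score independently from a
# Gaussian with the observed per-component variance, then reconstruct.
new_scores = rng.standard_normal(scores.shape[1]) * scores.std(axis=0)
synthetic_year = mean + new_scores @ vt
print(synthetic_year.shape)
```

Because the PC scores are generated independently with their observed variances, the between-year (interannual) variance of the ensemble is preserved by construction.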


  18. Independent Component Analysis (ICA) as a tool for exploring geodetic time series

    NASA Astrophysics Data System (ADS)

    Forootan, E.; Kusche, J.

    2012-04-01

    Long-term geodetic and geophysical observations offer the possibility of studying the behaviour of geophysical or climatic phenomena embedded in the observed time series. These observations, however, usually exhibit non-linear and complex physical interactions with many inherent time scales. Therefore, simple time series approaches are inefficient for exploring the sources of variability in such observed mixtures of signals. Independent Component Analysis (ICA) is a higher-order statistical technique that separates a mixture of random non-Gaussian signals into statistically independent sources. Its benefit is that it relies only on the information contained in the observations; no a priori models are prescribed to extract the source signals. However, justifications of ICA are usually rooted in the theory of random signals. This study discusses the possibility of using ICA to separate a mixture of stochastic random signals and deterministic sinusoidal signals in the presence of a trend. Theoretical as well as numerical investigations are presented. As a specific application, the performance of ICA on a synthetic example based on the hydrological signals detected by the Gravity Recovery and Climate Experiment (GRACE) satellite gravimetry mission is presented. We also present the results of ICA when it was applied to separate the real GRACE-derived water storage signals over the landmass of Australia from the surrounding oceans. Our results show that ICA is a reliable analysis tool which can be used for exploring geodetic signals. Keywords: ICA; geodetic time series; GRACE-derived water storage
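
A minimal separation of the kind described, a sinusoid, a trend, and a non-Gaussian stochastic signal mixed linearly, might look like this; the mixing matrix and signals are hypothetical, and scikit-learn's FastICA stands in for whatever ICA variant the authors used.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
t = np.linspace(0, 10, 2000)

# Hypothetical sources: an annual-like sinusoid, a trend, and a
# non-Gaussian (Laplacian) stochastic signal.
s1 = np.sin(2 * np.pi * t)
s2 = 0.3 * t
s3 = rng.laplace(size=t.size)
S = np.column_stack([s1, s2, s3])

# Mix the sources linearly, as the observed time series would be.
A = np.array([[1.0, 0.5, 0.3],
              [0.4, 1.0, 0.2],
              [0.3, 0.6, 1.0]])
X = S @ A.T

# Unmix with FastICA; component order and sign are arbitrary.
ica = FastICA(n_components=3, random_state=0, max_iter=1000)
S_est = ica.fit_transform(X)

# Each estimated component should correlate strongly with one source.
corr = np.abs(np.corrcoef(S.T, S_est.T)[:3, 3:])
print(corr.max(axis=1).round(2))
```

Note that ICA recovers sources only up to order, sign, and scale, which is why the check above uses absolute correlations.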

  19. Time-series analysis reveals genetic responses to intensive management of razorback sucker (Xyrauchen texanus).

    PubMed

    Dowling, Thomas E; Turner, Thomas F; Carson, Evan W; Saltzgiver, Melody J; Adams, Deborah; Kesner, Brian; Marsh, Paul C

    2014-03-01

    Time-series analysis is used widely in ecology to study complex phenomena and may have considerable potential to clarify relationships of genetic and demographic processes in natural and exploited populations. We explored the utility of this approach to evaluate population responses to management in razorback sucker, a long-lived and fecund, but declining freshwater fish species. A core population in Lake Mohave (Arizona-Nevada, USA) has experienced no natural recruitment for decades and is maintained by harvesting naturally produced larvae from the lake, rearing them in protective custody, and repatriating them at sizes less vulnerable to predation. Analyses of mtDNA and 15 microsatellites characterized for sequential larval cohorts collected over a 15-year time series revealed no changes in geographic structuring but indicated significant increase in mtDNA diversity for the entire population over time. Likewise, ratios of annual effective breeders to annual census size (Nb/Na) increased significantly despite sevenfold reduction of Na. These results indicated that conservation actions diminished near-term extinction risk due to genetic factors and should now focus on increasing numbers of fish in Lake Mohave to ameliorate longer-term risks. More generally, time-series analysis permitted robust testing of trends in genetic diversity, despite low precision of some metrics. PMID:24665337

  20. Time-series analysis reveals genetic responses to intensive management of razorback sucker (Xyrauchen texanus)

    PubMed Central

    Dowling, Thomas E; Turner, Thomas F; Carson, Evan W; Saltzgiver, Melody J; Adams, Deborah; Kesner, Brian; Marsh, Paul C

    2014-01-01

    Time-series analysis is used widely in ecology to study complex phenomena and may have considerable potential to clarify relationships of genetic and demographic processes in natural and exploited populations. We explored the utility of this approach to evaluate population responses to management in razorback sucker, a long-lived and fecund, but declining freshwater fish species. A core population in Lake Mohave (Arizona-Nevada, USA) has experienced no natural recruitment for decades and is maintained by harvesting naturally produced larvae from the lake, rearing them in protective custody, and repatriating them at sizes less vulnerable to predation. Analyses of mtDNA and 15 microsatellites characterized for sequential larval cohorts collected over a 15-year time series revealed no changes in geographic structuring but indicated significant increase in mtDNA diversity for the entire population over time. Likewise, ratios of annual effective breeders to annual census size (Nb/Na) increased significantly despite sevenfold reduction of Na. These results indicated that conservation actions diminished near-term extinction risk due to genetic factors and should now focus on increasing numbers of fish in Lake Mohave to ameliorate longer-term risks. More generally, time-series analysis permitted robust testing of trends in genetic diversity, despite low precision of some metrics. PMID:24665337

  1. A Kalman-Filter-Based Approach to Combining Independent Earth-Orientation Series

    NASA Technical Reports Server (NTRS)

    Gross, Richard S.; Eubanks, T. M.; Steppe, J. A.; Freedman, A. P.; Dickey, J. O.; Runge, T. F.

    1998-01-01

    An approach, based upon the use of a Kalman filter, that is currently employed at the Jet Propulsion Laboratory (JPL) for combining independent measurements of the Earth's orientation is presented. Since changes in the Earth's orientation can be described as a randomly excited stochastic process, the uncertainty in our knowledge of the Earth's orientation grows rapidly in the absence of measurements. The Kalman-filter methodology allows for an objective accounting of this uncertainty growth, thereby facilitating the intercomparison of measurements taken at different epochs (not necessarily uniformly spaced in time) and with different precision. As an example of this approach to combining Earth-orientation series, a description is given of a combination, SPACE95, that has been generated recently at JPL.
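
The core predict/update cycle of such a combination can be sketched for a scalar random-walk state; the two measurement series, their epochs, and all variances are hypothetical, not JPL's Earth-orientation setup.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical setup: a randomly excited state (random walk) observed
# by two independent techniques with different noise and epochs.
n = 200
truth = np.cumsum(0.1 * rng.standard_normal(n))
obs_a = truth + 0.5 * rng.standard_normal(n)            # every epoch
obs_b = truth[::5] + 0.2 * rng.standard_normal(n // 5)  # every 5th epoch

q, ra, rb = 0.1**2, 0.5**2, 0.2**2  # process and measurement variances
x, p = 0.0, 10.0                    # state estimate and its variance
combined = np.empty(n)
for k in range(n):
    # Predict: uncertainty grows in the absence of measurements.
    p += q
    # Update with whichever measurements exist at this epoch.
    updates = [(obs_a[k], ra)]
    if k % 5 == 0:
        updates.append((obs_b[k // 5], rb))
    for z, r in updates:
        gain = p / (p + r)
        x += gain * (z - x)
        p *= 1 - gain
    combined[k] = x

rmse = np.sqrt(np.mean((combined - truth) ** 2))
print(round(rmse, 3))
```

The combined series is more accurate than either input alone because each measurement is weighted by its variance relative to the predicted uncertainty.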

  2. Nonlinear Time Series Analysis of White Dwarf Light Curves

    NASA Astrophysics Data System (ADS)

    Jevtic, N.; Zelechoski, S.; Feldman, H.; Peterson, C.; Schweitzer, J.

    2001-12-01

    We use nonlinear time series analysis methods to examine the light intensity curves of white dwarf PG1351+489 obtained by the Whole Earth Telescope (WET). Though these methods were originally introduced to study chaotic systems, when a clear signature of determinism is found for the process generating an observable and it couples the active degrees of freedom of the system, then the notion of phase space provides a framework for exploring the system dynamics of nonlinear systems in general. With a pronounced single frequency, its harmonics and other frequencies of lower amplitude on a broadband background, the PG1351 light curve lends itself to the use of time delay coordinates. Our phase space reconstruction yields a triangular, toroidal three-dimensional shape. This differs from earlier results of a circular toroidal representation. We find a morphological similarity to a magnetic dynamo model developed for fast rotators that yields a union of both results: the circular phase space structure for the ascending portion of the cycle, and the triangular structure for the declining portion. The rise and fall of the dynamo cycle yield both different phase space representations and different correlation dimensions. Since PG1351 is known to have no significant fields, these results may stimulate the observation of light curves of known magnetic white dwarfs for comparison. Using other data obtained by the WET, we compare the phase space reconstruction of DB white dwarf PG1351 with that of GD 358 which has a more complex power spectrum. We also compare these results with those for PG1159. There is some general similarity between the results of the phase space reconstruction for the DB white dwarfs. As expected, the difference between the results for the DB white dwarfs and PG1159 is great.

  3. Uniform approach to linear and nonlinear interrelation patterns in multivariate time series.

    PubMed

    Rummel, Christian; Abela, Eugenio; Müller, Markus; Hauf, Martinus; Scheidegger, Olivier; Wiest, Roland; Schindler, Kaspar

    2011-06-01

    Currently, a variety of linear and nonlinear measures is in use to investigate spatiotemporal interrelation patterns of multivariate time series. Whereas the former are by definition insensitive to nonlinear effects, the latter detect both nonlinear and linear interrelation. In the present contribution we employ a uniform surrogate-based approach, which is capable of disentangling interrelations that significantly exceed random effects and interrelations that significantly exceed linear correlation. The bivariate version of the proposed framework is explored using a simple model allowing for separate tuning of coupling and nonlinearity of interrelation. To demonstrate applicability of the approach to multivariate real-world time series we investigate resting state functional magnetic resonance imaging (rsfMRI) data of two healthy subjects as well as intracranial electroencephalograms (iEEG) of two epilepsy patients with focal onset seizures. The main findings are that for our rsfMRI data interrelations can be described by linear cross-correlation. Rejection of the null hypothesis of linear iEEG interrelation occurs predominantly for epileptogenic tissue as well as during epileptic seizures.
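
A surrogate-based test of this kind, here Fourier phase-randomized surrogates with a crude nonlinear statistic, can be sketched as follows; the bivariate model and the statistic are illustrative assumptions, not the paper's exact measures.

```python
import numpy as np

rng = np.random.default_rng(5)

def phase_surrogate(x, rng):
    """Fourier phase randomization: preserves the power spectrum (hence
    linear autocorrelation) but destroys nonlinear structure."""
    f = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, f.size)
    phases[0] = 0.0
    return np.fft.irfft(np.abs(f) * np.exp(1j * phases), n=len(x))

# Hypothetical bivariate system: y depends nonlinearly on x.
n = 4096
x = rng.standard_normal(n)
x = np.convolve(x, np.ones(5) / 5, mode="same")  # some autocorrelation
y = x**2 + 0.1 * rng.standard_normal(n)

def nonlin_corr(a, b):
    # A crude nonlinear interrelation measure: correlation of a² with b.
    return abs(np.corrcoef(a**2, b)[0, 1])

stat = nonlin_corr(x, y)
null = [nonlin_corr(phase_surrogate(x, rng), y) for _ in range(99)]
# If the observed statistic exceeds all surrogate values, linear
# correlation alone cannot explain the interrelation.
print(stat > max(null))
```

Here the linear cross-correlation of x and y is near zero, so only the surrogate comparison reveals the (quadratic) coupling.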

  4. Uniform approach to linear and nonlinear interrelation patterns in multivariate time series

    NASA Astrophysics Data System (ADS)

    Rummel, Christian; Abela, Eugenio; Müller, Markus; Hauf, Martinus; Scheidegger, Olivier; Wiest, Roland; Schindler, Kaspar

    2011-06-01

    Currently, a variety of linear and nonlinear measures is in use to investigate spatiotemporal interrelation patterns of multivariate time series. Whereas the former are by definition insensitive to nonlinear effects, the latter detect both nonlinear and linear interrelation. In the present contribution we employ a uniform surrogate-based approach, which is capable of disentangling interrelations that significantly exceed random effects and interrelations that significantly exceed linear correlation. The bivariate version of the proposed framework is explored using a simple model allowing for separate tuning of coupling and nonlinearity of interrelation. To demonstrate applicability of the approach to multivariate real-world time series we investigate resting state functional magnetic resonance imaging (rsfMRI) data of two healthy subjects as well as intracranial electroencephalograms (iEEG) of two epilepsy patients with focal onset seizures. The main findings are that for our rsfMRI data interrelations can be described by linear cross-correlation. Rejection of the null hypothesis of linear iEEG interrelation occurs predominantly for epileptogenic tissue as well as during epileptic seizures.

  5. Applications of time-series analysis to mood fluctuations in bipolar disorder to promote treatment innovation: a case series.

    PubMed

    Holmes, E A; Bonsall, M B; Hales, S A; Mitchell, H; Renner, F; Blackwell, S E; Watson, P; Goodwin, G M; Di Simplicio, M

    2016-01-01

    Treatment innovation for bipolar disorder has been hampered by a lack of techniques to capture a hallmark symptom: ongoing mood instability. Mood swings persist during remission from acute mood episodes and impair daily functioning. The last significant treatment advance remains Lithium (in the 1970s), which aids only the minority of patients. There is no accepted way to establish proof of concept for a new mood-stabilizing treatment. We suggest that combining insights from mood measurement with applied mathematics may provide a step change: repeated daily mood measurement (depression) over a short time frame (1 month) can create individual bipolar mood instability profiles. A time-series approach allows comparison of mood instability pre- and post-treatment. We test a new imagery-focused cognitive therapy treatment approach (MAPP; Mood Action Psychology Programme) targeting a driver of mood instability, and apply these measurement methods in a non-concurrent multiple baseline design case series of 14 patients with bipolar disorder. Weekly mood monitoring and treatment target data improved for the whole sample combined. Time-series analyses of daily mood data, sampled remotely (mobile phone/Internet) for 28 days pre- and post-treatment, demonstrated improvements in individuals' mood stability for 11 of 14 patients. Thus the findings offer preliminary support for a new imagery-focused treatment approach. They also indicate a step in treatment innovation without the requirement for trials in illness episodes or relapse prevention. Importantly, daily measurement offers a description of mood instability at the individual patient level in a clinically meaningful time frame. 
This costly, chronic and disabling mental illness demands innovation in both treatment approaches (whether pharmacological or psychological) and measurement tools: this work indicates that daily measurements can be used to detect improvement in individual mood stability for treatment innovation (MAPP

  6. Applications of time-series analysis to mood fluctuations in bipolar disorder to promote treatment innovation: a case series.

    PubMed

    Holmes, E A; Bonsall, M B; Hales, S A; Mitchell, H; Renner, F; Blackwell, S E; Watson, P; Goodwin, G M; Di Simplicio, M

    2016-01-26

    Treatment innovation for bipolar disorder has been hampered by a lack of techniques to capture a hallmark symptom: ongoing mood instability. Mood swings persist during remission from acute mood episodes and impair daily functioning. The last significant treatment advance remains Lithium (in the 1970s), which aids only the minority of patients. There is no accepted way to establish proof of concept for a new mood-stabilizing treatment. We suggest that combining insights from mood measurement with applied mathematics may provide a step change: repeated daily mood measurement (depression) over a short time frame (1 month) can create individual bipolar mood instability profiles. A time-series approach allows comparison of mood instability pre- and post-treatment. We test a new imagery-focused cognitive therapy treatment approach (MAPP; Mood Action Psychology Programme) targeting a driver of mood instability, and apply these measurement methods in a non-concurrent multiple baseline design case series of 14 patients with bipolar disorder. Weekly mood monitoring and treatment target data improved for the whole sample combined. Time-series analyses of daily mood data, sampled remotely (mobile phone/Internet) for 28 days pre- and post-treatment, demonstrated improvements in individuals' mood stability for 11 of 14 patients. Thus the findings offer preliminary support for a new imagery-focused treatment approach. They also indicate a step in treatment innovation without the requirement for trials in illness episodes or relapse prevention. Importantly, daily measurement offers a description of mood instability at the individual patient level in a clinically meaningful time frame. 
This costly, chronic and disabling mental illness demands innovation in both treatment approaches (whether pharmacological or psychological) and measurement tools: this work indicates that daily measurements can be used to detect improvement in individual mood stability for treatment innovation (MAPP).

  7. Applications of time-series analysis to mood fluctuations in bipolar disorder to promote treatment innovation: a case series

    PubMed Central

    Holmes, E A; Bonsall, M B; Hales, S A; Mitchell, H; Renner, F; Blackwell, S E; Watson, P; Goodwin, G M; Di Simplicio, M

    2016-01-01

    Treatment innovation for bipolar disorder has been hampered by a lack of techniques to capture a hallmark symptom: ongoing mood instability. Mood swings persist during remission from acute mood episodes and impair daily functioning. The last significant treatment advance remains Lithium (in the 1970s), which aids only the minority of patients. There is no accepted way to establish proof of concept for a new mood-stabilizing treatment. We suggest that combining insights from mood measurement with applied mathematics may provide a step change: repeated daily mood measurement (depression) over a short time frame (1 month) can create individual bipolar mood instability profiles. A time-series approach allows comparison of mood instability pre- and post-treatment. We test a new imagery-focused cognitive therapy treatment approach (MAPP; Mood Action Psychology Programme) targeting a driver of mood instability, and apply these measurement methods in a non-concurrent multiple baseline design case series of 14 patients with bipolar disorder. Weekly mood monitoring and treatment target data improved for the whole sample combined. Time-series analyses of daily mood data, sampled remotely (mobile phone/Internet) for 28 days pre- and post-treatment, demonstrated improvements in individuals' mood stability for 11 of 14 patients. Thus the findings offer preliminary support for a new imagery-focused treatment approach. They also indicate a step in treatment innovation without the requirement for trials in illness episodes or relapse prevention. Importantly, daily measurement offers a description of mood instability at the individual patient level in a clinically meaningful time frame. 
This costly, chronic and disabling mental illness demands innovation in both treatment approaches (whether pharmacological or psychological) and measurement tools: this work indicates that daily measurements can be used to detect improvement in individual mood stability for treatment innovation (MAPP

  8. VAET: A Visual Analytics Approach for E-Transactions Time-Series.

    PubMed

    Xie, Cong; Chen, Wei; Huang, Xinxin; Hu, Yueqi; Barlowe, Scott; Yang, Jing

    2014-12-01

    Previous studies on E-transaction time-series have mainly focused on finding temporal trends of transaction behavior. Interesting transactions that are time-stamped and situation-relevant may easily be obscured in a large amount of information. This paper proposes a visual analytics system, Visual Analysis of E-transaction Time-Series (VAET), that allows the analysts to interactively explore large transaction datasets for insights about time-varying transactions. With a set of analyst-determined training samples, VAET automatically estimates the saliency of each transaction in a large time-series using a probabilistic decision tree learner. It provides an effective time-of-saliency (TOS) map where the analysts can explore a large number of transactions at different time granularities. Interesting transactions are further encoded with KnotLines, a compact visual representation that captures both the temporal variations and the contextual connection of transactions. The analysts can thus explore, select, and investigate knotlines of interest. A case study and user study with a real E-transactions dataset (26 million records) demonstrate the effectiveness of VAET. PMID:26356888

  9. Multitask Gaussian processes for multivariate physiological time-series analysis.

    PubMed

    Dürichen, Robert; Pimentel, Marco A F; Clifton, Lei; Schweikard, Achim; Clifton, David A

    2015-01-01

    Gaussian process (GP) models are a flexible means of performing nonparametric Bayesian regression. However, GP models in healthcare are often only used to model a single univariate output time series, denoted as single-task GPs (STGP). Due to an increasing prevalence of sensors in healthcare settings, there is an urgent need for robust multivariate time-series tools. Here, we propose a method using multitask GPs (MTGPs) which can model multiple correlated multivariate physiological time series simultaneously. The flexible MTGP framework can learn the correlation between multiple signals even though they might be sampled at different frequencies and have training sets available for different intervals. Furthermore, prior knowledge of any relationship between the time series such as delays and temporal behavior can be easily integrated. A novel normalization is proposed to allow interpretation of the various hyperparameters used in the MTGP. We investigate MTGPs for physiological monitoring with synthetic data sets and two real-world problems from the field of patient monitoring and radiotherapy. The results are compared with standard Gaussian processes and other existing methods in the respective biomedical application areas. In both cases, we show that our framework learned the correlation between physiological time series efficiently, outperforming the existing state of the art.

  10. On fractal analysis of cardiac interbeat time series

    NASA Astrophysics Data System (ADS)

    Guzmán-Vargas, L.; Calleja-Quevedo, E.; Angulo-Brown, F.

    2003-09-01

    In recent years the complexity of a cardiac beat-to-beat time series has been taken as an auxiliary tool to identify the health status of human hearts. Several methods have been employed to characterize time series complexity. In this work we calculate the fractal dimension of interbeat time series arising from three groups: 10 young healthy persons, 8 elderly healthy persons, and 10 patients with congestive heart failure. Our numerical results reflect evident differences in the dynamic behavior corresponding to each group. We discuss these results within the context of the neuroautonomic control of heart rate dynamics. We also propose a numerical simulation which reproduces aging effects in heart rate behavior.
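
Higuchi's method is one standard way to compute the fractal dimension of a series like an interbeat sequence; the paper's exact estimator may differ, and the white-noise and random-walk inputs below are hypothetical test signals with known dimensions (about 2 and 1.5 respectively).

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Higuchi's fractal dimension estimate for a 1-D time series."""
    n = len(x)
    lk = []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)
            if idx.size < 2:
                continue
            # Normalized curve length at scale k and offset m.
            length = np.abs(np.diff(x[idx])).sum() * (n - 1) / ((idx.size - 1) * k)
            lengths.append(length / k)
        lk.append(np.mean(lengths))
    k = np.arange(1, kmax + 1)
    # Slope of log L(k) versus log(1/k) estimates the fractal dimension.
    return np.polyfit(np.log(1.0 / k), np.log(lk), 1)[0]

rng = np.random.default_rng(6)
white = rng.standard_normal(4096)  # fractal dimension near 2
walk = np.cumsum(white)            # fractal dimension near 1.5
fd_white = higuchi_fd(white)
fd_walk = higuchi_fd(walk)
print(round(fd_white, 2), round(fd_walk, 2))
```

Healthy interbeat series typically fall between these two extremes, which is what makes the dimension useful as a discriminator.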

  11. Minimum entropy density method for the time series analysis

    NASA Astrophysics Data System (ADS)

    Lee, Jeong Won; Park, Joongwoo Brian; Jo, Hang-Hyun; Yang, Jae-Suk; Moon, Hie-Tae

    2009-01-01

    The entropy density is an intuitive and powerful concept to study the complicated nonlinear processes derived from physical systems. We develop the minimum entropy density method (MEDM) to detect the structure scale of a given time series, which is defined as the scale in which the uncertainty is minimized, hence the pattern is revealed most. The MEDM is applied to the financial time series of Standard and Poor’s 500 index from February 1983 to April 2006. Then the temporal behavior of structure scale is obtained and analyzed in relation to the information delivery time and efficient market hypothesis.

  12. An introduction to chaotic and random time series analysis

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey D.

    1989-01-01

    The origin of chaotic behavior and the relation of chaos to randomness are explained. Two mathematical results are described: (1) a representation theorem guarantees the existence of a specific time-domain model for chaos and addresses the relation between chaotic, random, and strictly deterministic processes; (2) a theorem assures that information on the behavior of a physical system in its complete state space can be extracted from time-series data on a single observable. Focus is placed on an important connection between the dynamical state space and an observable time series. These two results lead to a practical deconvolution technique combining standard random process modeling methods with new embedded techniques.

  13. Period04: Statistical analysis of large astronomical time series

    NASA Astrophysics Data System (ADS)

    Lenz, Patrick; Breger, Michel

    2014-07-01

    Period04 statistically analyzes large astronomical time series containing gaps. It calculates formal uncertainties, can extract the individual frequencies from the multiperiodic content of time series, and provides a flexible interface to perform multiple-frequency fits with a combination of least-squares fitting and the discrete Fourier transform algorithm. Period04, written in Java/C++, supports the SAMP communication protocol to provide interoperability with other applications of the Virtual Observatory. It is a reworked and extended version of Period98 (Sperl 1998) and PERIOD/PERDET (Breger 1990).
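Period04's core operation, locating peaks in the Fourier amplitude spectrum of gapped data, can be illustrated with direct Fourier sums over arbitrary observation times; this is a simplified stand-in for the program's DFT plus least-squares workflow, and the gap pattern and frequency grid below are illustrative:

```python
import math

def amplitude_spectrum(t, x, freqs):
    """Direct Fourier sums at trial frequencies; valid for gapped sampling."""
    n = len(x)
    amps = []
    for f in freqs:
        c = sum(xi * math.cos(2 * math.pi * f * ti) for ti, xi in zip(t, x))
        s = sum(xi * math.sin(2 * math.pi * f * ti) for ti, xi in zip(t, x))
        amps.append(2.0 * math.sqrt(c * c + s * s) / n)
    return amps

# a sine at 0.1 cycles per time unit, observed with a gap in the record
t = [ti for ti in range(300) if not 100 <= ti < 160]
x = [math.sin(2 * math.pi * 0.1 * ti) for ti in t]
freqs = [0.01 + 0.001 * i for i in range(490)]
amps = amplitude_spectrum(t, x, freqs)
best = freqs[amps.index(max(amps))]
```

The peak frequency and amplitude are recovered despite the gap; a real multi-frequency analysis would then subtract the fitted sinusoid and iterate, which is what Period04 automates along with the uncertainty estimates.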

  14. Scaling behaviour of heartbeat intervals obtained by wavelet-based time-series analysis

    NASA Astrophysics Data System (ADS)

    Ivanov, Plamen Ch.; Rosenblum, Michael G.; Peng, C.-K.; Mietus, Joseph; Havlin, Shlomo; Stanley, H. Eugene; Goldberger, Ary L.

    1996-09-01

    Biological time-series analysis is used to identify hidden dynamical patterns which could yield important insights into underlying physiological mechanisms. Such analysis is complicated by the fact that biological signals are typically both highly irregular and non-stationary, that is, their statistical character changes slowly or intermittently as a result of variations in background influences [1-3]. Previous statistical analyses of heartbeat dynamics [4-6] have identified long-range correlations and power-law scaling in the normal heartbeat, but not the phase interactions between the different frequency components of the signal. Here we introduce a new approach, based on the wavelet transform and an analytic signal approach, which can characterize non-stationary behaviour and elucidate such phase interactions. We find that, when suitably rescaled, the distributions of the variations in the beat-to-beat intervals for all healthy subjects are described by a single function stable over a wide range of timescales. However, a similar scaling function does not exist for a group with cardiopulmonary instability caused by sleep apnoea. We attribute the functional form of the scaling observed in the healthy subjects to underlying nonlinear dynamics, which seem to be essential to normal heart function. The approach introduced here should be useful in the analysis of other nonstationary biological signals.

  15. Investigation on Law and Economics Based on Complex Network and Time Series Analysis

    PubMed Central

    Yang, Jian; Qu, Zhao; Chang, Hui

    2015-01-01

    The research focuses on the cooperative relationship and the strategy tendency among three mutually interactive parties in financing: small enterprises, commercial banks and micro-credit companies. Complex network theory and time series analysis were applied to figure out the quantitative evidence. Moreover, this paper built up a fundamental model describing the particular interaction among them through evolutionary game. Combining the results of data analysis and current situation, it is justifiable to put forward reasonable legislative recommendations for regulations on lending activities among small enterprises, commercial banks and micro-credit companies. The approach in this research provides a framework for constructing mathematical models and applying econometrics and evolutionary game in the issue of corporation financing. PMID:26076460

  16. Investigation on Law and Economics Based on Complex Network and Time Series Analysis.

    PubMed

    Yang, Jian; Qu, Zhao; Chang, Hui

    2015-01-01

    The research focuses on the cooperative relationship and the strategy tendency among three mutually interactive parties in financing: small enterprises, commercial banks and micro-credit companies. Complex network theory and time series analysis were applied to figure out the quantitative evidence. Moreover, this paper built up a fundamental model describing the particular interaction among them through evolutionary game. Combining the results of data analysis and current situation, it is justifiable to put forward reasonable legislative recommendations for regulations on lending activities among small enterprises, commercial banks and micro-credit companies. The approach in this research provides a framework for constructing mathematical models and applying econometrics and evolutionary game in the issue of corporation financing.

  17. Engage: The Science Speaker Series - A novel approach to improving science outreach and communication

    NASA Astrophysics Data System (ADS)

    Mitchell, R.; Hilton, E.; Rosenfield, P.

    2011-12-01

    Communicating the results and significance of basic research to the general public is of critical importance. Federal funding and university budgets are under substantial pressure, and taxpayer support of basic research is critical. Public outreach by ecologists is an important vehicle for increasing support and understanding of science in an era of anthropogenic global change. At present, very few programs or courses exist to allow young scientists the opportunity to hone and practice their public outreach skills. Although the need for science outreach and communication is recognized, graduate programs often fail to provide any training in making science accessible. Engage: The Science Speaker Series represents a unique, graduate student-led effort to improve public outreach skills. Founded in 2009, Engage was created by three science graduate students at the University of Washington. The students developed a novel, interdisciplinary curriculum to investigate why science outreach often fails, to improve graduate student communication skills, and to help students create a dynamic, public-friendly talk. The course incorporates elements of story-telling, improvisational arts, and development of analogy, all with a focus on clarity, brevity and accessibility. This course was offered to graduate students and post-doctoral researchers from a wide variety of sciences in the autumn of 2010. Students who participated in the Engage course were then given the opportunity to participate in Engage: The Science Speaker Series. This free, public-friendly speaker series is hosted on the University of Washington campus and has had substantial public attendance and participation. The growing success of Engage illustrates the need for such programs throughout graduate level science curricula. We present the impetus for the development of the program, elements of the curriculum covered in the Engage course, the importance of an interdisciplinary approach, and discuss strategies for

  18. Engage: The Science Speaker Series - A novel approach to improving science outreach and communication

    NASA Astrophysics Data System (ADS)

    Mitchell, R.; Hilton, E.; Rosenfield, P.

    2012-12-01

    Communicating the results and significance of basic research to the general public is of critical importance. Federal funding and university budgets are under substantial pressure, and taxpayer support of basic research is critical. Public outreach by ecologists is an important vehicle for increasing support and understanding of science in an era of anthropogenic global change. At present, very few programs or courses exist to allow young scientists the opportunity to hone and practice their public outreach skills. Although the need for science outreach and communication is recognized, graduate programs often fail to provide any training in making science accessible. Engage: The Science Speaker Series represents a unique, graduate student-led effort to improve public outreach skills. Founded in 2009, Engage was created by three science graduate students at the University of Washington. The students developed a novel, interdisciplinary curriculum to investigate why science outreach often fails, to improve graduate student communication skills, and to help students create a dynamic, public-friendly talk. The course incorporates elements of story-telling, improvisational arts, and development of analogy, all with a focus on clarity, brevity and accessibility. This course was offered to graduate students and post-doctoral researchers from a wide variety of sciences in the autumn of 2010 and 2011, and will be retaught in 2012. Students who participated in the Engage course were then given the opportunity to participate in Engage: The Science Speaker Series. This free, public-friendly speaker series has been hosted at the University of Washington campus and Seattle Town Hall, and has had substantial public attendance and participation. The growing success of Engage illustrates the need for such programs throughout graduate level science curricula. We present the impetus for the development of the program, elements of the curriculum covered in the Engage course, the

  19. A Time-Series Analysis of Hispanic Unemployment.

    ERIC Educational Resources Information Center

    Defreitas, Gregory

    1986-01-01

    This study undertakes the first systematic time-series research on the cyclical patterns and principal determinants of Hispanic joblessness in the United States. The principal findings indicate that Hispanics tend to bear a disproportionate share of increases in unemployment during recessions. (Author/CT)

  20. Complexity analysis of the turbulent environmental fluid flow time series

    NASA Astrophysics Data System (ADS)

    Mihailović, D. T.; Nikolić-Đorić, E.; Drešković, N.; Mimić, G.

    2014-02-01

    We have used Kolmogorov complexity, sample entropy, and permutation entropy to quantify the degree of randomness in the river flow time series of two mountain rivers in Bosnia and Herzegovina, representing turbulent environmental fluids, for the period 1926-1990. In particular, we examined the monthly river flow time series of two rivers (the Miljacka and the Bosnia) in the mountain part of their course and calculated the Kolmogorov complexity (KL) based on the Lempel-Ziv Algorithm (LZA) (lower, KLL, and upper, KLU), sample entropy (SE), and permutation entropy (PE) values for each time series. The results indicate that the KLL, KLU, SE, and PE values of the two rivers are close to each other regardless of the amplitude differences in their monthly flow rates. We illustrated the changes in mountain river flow complexity by experiments using (i) the data set for the Bosnia River and (ii) anticipated human activities and projected climate changes. We also explored the sensitivity of the considered measures to the length of the time series. In addition, we divided the period 1926-1990 into three subintervals, (a) 1926-1945, (b) 1946-1965, and (c) 1966-1990, and calculated the KLL, KLU, SE, and PE values for the time series in these subintervals. We found that during 1946-1965 the complexities decreased, with corresponding changes in SE and PE, compared to the full period 1926-1990. This complexity loss may be attributed primarily to (i) human interventions on these two rivers after the Second World War, when they came into use for water consumption, and (ii) more recent climate change.
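Of the measures listed, permutation entropy has the most compact definition: the Shannon entropy of ordinal patterns of consecutive values. A minimal sketch (the order parameter is illustrative):

```python
import math
import random
from collections import Counter

def permutation_entropy(x, order=3, normalize=True):
    """Shannon entropy of ordinal patterns (permutation entropy).
    Each window of `order` values is reduced to its rank pattern."""
    patterns = Counter(
        tuple(sorted(range(order), key=lambda j: x[i + j]))
        for i in range(len(x) - order + 1)
    )
    total = sum(patterns.values())
    h = -sum((c / total) * math.log2(c / total) for c in patterns.values())
    # normalize by the maximum entropy, log2(order!)
    return h / math.log2(math.factorial(order)) if normalize else h

# a monotonic ramp has a single pattern; white noise uses all patterns
ramp = list(range(100))
random.seed(1)
noise = [random.random() for _ in range(2000)]
```

Values near 0 indicate a fully ordered series and values near 1 a maximally random one, which is how such a measure separates regulated from degraded flow regimes.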

  1. Short-term forecasting of meteorological time series using Nonparametric Functional Data Analysis (NPFDA)

    NASA Astrophysics Data System (ADS)

    Curceac, S.; Ternynck, C.; Ouarda, T.

    2015-12-01

    Over the past decades, a substantial amount of research has been conducted to model and forecast climatic variables. In this study, Nonparametric Functional Data Analysis (NPFDA) methods are applied to forecast air temperature and wind speed time series in Abu Dhabi, UAE. The dataset consists of hourly measurements recorded over a period of 29 years, 1982-2010. The novelty of the Functional Data Analysis approach is in expressing the data as curves. In the present work, the focus is on daily forecasting, and the functional observations (curves) express the daily measurements of the above-mentioned variables. We apply a non-linear regression model with a functional non-parametric kernel estimator. The computation of the estimator is performed using an asymmetrical quadratic kernel function for local weighting, with the bandwidth obtained by a cross-validation procedure. The proximities between functional objects are calculated by families of semi-metrics based on derivatives and Functional Principal Component Analysis (FPCA). Additionally, functional conditional mode and functional conditional median estimators are applied, and the advantages of combining their results are analysed. A different approach employs a SARIMA model selected according to the minimum Akaike (AIC) and Bayesian (BIC) Information Criteria and based on the residuals of the model. The performance of the models is assessed by calculating error indices such as the root mean square error (RMSE), relative RMSE, BIAS and relative BIAS. The results indicate that the NPFDA models provide more accurate forecasts than the SARIMA models. Key words: Nonparametric functional data analysis, SARIMA, time series forecast, air temperature, wind speed
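The functional kernel estimator can be sketched as a Nadaraya-Watson weighted average over curves. This toy version predicts a scalar response and uses a plain L2 semi-metric, standing in for the derivative- and FPCA-based semi-metrics of the study; all names and data are illustrative:

```python
import math

def l2_dist(c1, c2):
    """L2 semi-metric between two daily curves sampled on the same grid."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(c1, c2)))

def functional_kernel_forecast(curves, responses, new_curve, h):
    """Nadaraya-Watson estimator with an asymmetrical quadratic kernel:
    K(u) = 1 - u**2 on [0, 1), zero otherwise; h is the bandwidth."""
    num, den = 0.0, 0.0
    for c, y in zip(curves, responses):
        u = l2_dist(c, new_curve) / h
        if u < 1.0:
            w = 1.0 - u * u
            num += w * y
            den += w
    return num / den if den > 0 else None
```

In the paper the bandwidth h is chosen by cross-validation; here it is passed in directly. Only training curves within distance h of the new curve contribute, which is what makes the kernel "asymmetrical" (one-sided in u).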

  2. Improvements in chronic diseases with a comprehensive natural medicine approach: a review and case series.

    PubMed

    Nader, T; Rothenberg, S; Averbach, R; Charles, B; Fields, J Z; Schneider, R H

    2000-01-01

    Approximately 40% of the US population report using complementary and alternative medicine, including Maharishi Vedic Medicine (MVM), a traditional, comprehensive system of natural medicine, for relief from chronic and other disorders. Although many reports suggest health benefits from individual MVM techniques, reports on integrated holistic approaches are rare. This case series, designed to investigate the effectiveness of an integrated, multimodality MVM program in an ideal clinical setting, describes the outcomes in four patients: one with sarcoidosis; one with Parkinson's disease; a third with renal hypertension; and a fourth with diabetes/essential hypertension/anxiety disorder. Standard symptom reports and objective markers of disease were evaluated before, during, and after the treatment period. Results suggested substantial improvements as indicated by reductions in major signs, symptoms, and use of conventional medications in the four patients during the 3-week in-residence treatment phase and continuing through the home follow-up program.

  3. Seasonal and annual precipitation time series trend analysis in North Carolina, United States

    NASA Astrophysics Data System (ADS)

    Sayemuzzaman, Mohammad; Jha, Manoj K.

    2014-02-01

    The present study performs spatial and temporal trend analysis of the annual and seasonal precipitation time series from a set of 249 uniformly distributed stations across the state of North Carolina, United States, over the period 1950-2009. The Mann-Kendall (MK) test, the Theil-Sen approach (TSA) and the Sequential Mann-Kendall (SQMK) test were applied to quantify the significance of trends, the magnitude of trends, and trend shifts, respectively. Regional (mountain, piedmont and coastal) precipitation trends were also analyzed using the above-mentioned tests. Prior to the application of the statistical tests, a pre-whitening technique was used to eliminate the effect of autocorrelation in the precipitation data series. These procedures revealed a notable statewide increasing trend in winter precipitation and a decreasing trend in fall precipitation. Statewide mixed (increasing/decreasing) trends were detected in the annual, spring, and summer precipitation time series. Significant trends (confidence level ≥ 95%) were detected at only 8, 7, 4 and 10 of the 249 stations in winter, spring, summer, and fall, respectively. The magnitude of the largest increasing (decreasing) precipitation trend was about 4 mm/season (- 4.50 mm/season), in the fall (summer) season. Annual precipitation trend magnitudes varied between - 5.50 mm/year and 9 mm/year. Regional trend analysis found increasing precipitation in the mountain and coastal regions in general, except during the winter. The piedmont region was found to have increasing trends in summer and fall, but decreasing trends in winter, spring and on an annual basis. The SQMK "trend shift analysis" identified a significant shift during 1960 - 70 in most parts of the state. Finally, the comparison between winter (summer) precipitation and the North Atlantic Oscillation (Southern Oscillation) indices concluded that the variability and trend of precipitation can be explained by the
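The MK test and the Theil-Sen slope reduce to short formulas. A minimal pure-Python sketch, without the tie correction or the pre-whitening step the study applies:

```python
import math

def mann_kendall(x):
    """Mann-Kendall S statistic and its normal-approximation Z score
    (no tie correction): S sums the signs of all pairwise differences."""
    n = len(x)
    s = sum(
        (x[j] > x[i]) - (x[j] < x[i])
        for i in range(n - 1) for j in range(i + 1, n)
    )
    var = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var)     # continuity correction
    elif s < 0:
        z = (s + 1) / math.sqrt(var)
    else:
        z = 0.0
    return s, z

def theil_sen(x):
    """Median of all pairwise slopes: a robust trend-magnitude estimate."""
    slopes = sorted(
        (x[j] - x[i]) / (j - i)
        for i in range(len(x) - 1) for j in range(i + 1, len(x))
    )
    m = len(slopes)
    mid = m // 2
    return slopes[mid] if m % 2 else 0.5 * (slopes[mid - 1] + slopes[mid])
```

A trend is significant at the 95% level when |Z| > 1.96, which is the criterion behind the station counts quoted in the abstract.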

  4. Extensive mapping of coastal change in Alaska by Landsat time-series analysis, 1972-2013

    NASA Astrophysics Data System (ADS)

    Reynolds, J.; Macander, M. J.; Swingley, C. S.; Spencer, S. R.

    2014-12-01

    The landscape-scale effects of coastal storms on Alaska's Bering Sea and Gulf of Alaska coasts include coastal erosion, migration of spits and barrier islands, breaching of coastal lakes and lagoons, and inundation and salt-kill of vegetation. Large changes in coastal storm frequency and intensity are expected due to climate change and reduced sea-ice extent. Storms have a wide range of impacts on carbon fluxes and on fish and wildlife resources, infrastructure siting and operation, and emergency response planning. In areas experiencing moderate to large effects, changes can be mapped by analyzing trends in time series of Landsat imagery from Landsat 1 through Landsat 8. The authors are performing a time-series trend analysis for over 22,000 kilometers of coastline along the Bering Sea and Gulf of Alaska. Ice- and cloud-free Landsat imagery from Landsat 1-8, covering 1972-2013, was analyzed using a combination of regression, changepoint detection, and classification tree approaches to detect, classify, and map changes in near-infrared reflectance. Areas with significant changes in coastal features were identified, along with the timing of dominant changes and, in some cases, rates of change. The approach captured many coastal changes over the 42-year study period, including coastal erosion exceeding the 60-m pixel resolution of the Multispectral Scanner (MSS) data and migrations of coastal spits and estuarine channels.

  5. The LDA beamformer: Optimal estimation of ERP source time series using linear discriminant analysis.

    PubMed

    Treder, Matthias S; Porbadnigk, Anne K; Shahbazi Avarvand, Forooz; Müller, Klaus-Robert; Blankertz, Benjamin

    2016-04-01

    We introduce a novel beamforming approach for estimating event-related potential (ERP) source time series based on regularized linear discriminant analysis (LDA). The optimization problems in LDA and linearly-constrained minimum-variance (LCMV) beamformers are formally equivalent. The approaches differ in that, in LCMV beamformers, the spatial patterns are derived from a source model, whereas in an LDA beamformer the spatial patterns are derived directly from the data (i.e., the ERP peak). Using a formal proof and MEG simulations, we show that the LDA beamformer is robust to correlated sources and offers a higher signal-to-noise ratio than the LCMV beamformer and PCA. As an application, we use EEG data from an oddball experiment to show how the LDA beamformer can be harnessed to detect single-trial ERP latencies and estimate connectivity between ERP sources. Concluding, the LDA beamformer optimally reconstructs ERP sources by maximizing the ERP signal-to-noise ratio. Hence, it is a highly suited tool for analyzing ERP source time series, particularly in EEG/MEG studies wherein a source model is not available.

  6. On the Interpretation of Running Trends as Summary Statistics for Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Vigo, Isabel M.; Trottini, Mario; Belda, Santiago

    2016-04-01

    In recent years, running trends analysis (RTA) has been widely used in climate applied research as summary statistics for time series analysis. There is no doubt that RTA might be a useful descriptive tool, but despite its general use in applied research, precisely what it reveals about the underlying time series is unclear and, as a result, its interpretation is unclear too. This work contributes to such interpretation in two ways: 1) an explicit formula is obtained for the set of time series with a given series of running trends, making it possible to show that running trends, alone, perform very poorly as summary statistics for time series analysis; and 2) an equivalence is established between RTA and the estimation of a (possibly nonlinear) trend component of the underlying time series using a weighted moving average filter. Such equivalence provides a solid ground for RTA implementation and interpretation/validation.
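The equivalence stated in point 2) can be made concrete with a sliding-window OLS slope; this sketch uses an unweighted window, an illustrative simplification of the weighted moving-average filter described above:

```python
def running_trends(x, window):
    """OLS slope of x against time over each sliding window of given length."""
    n = window
    t = list(range(n))
    tbar = sum(t) / n
    den = sum((ti - tbar) ** 2 for ti in t)   # same for every window
    trends = []
    for s in range(len(x) - n + 1):
        seg = x[s:s + n]
        xbar = sum(seg) / n
        num = sum((ti - tbar) * (xi - xbar) for ti, xi in zip(t, seg))
        trends.append(num / den)
    return trends
```

For a purely linear series every running trend equals the global slope, which also illustrates the paper's point that many different series can share the same sequence of running trends.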

  7. Modeling Philippine Stock Exchange Composite Index Using Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Gayo, W. S.; Urrutia, J. D.; Temple, J. M. F.; Sandoval, J. R. D.; Sanglay, J. E. A.

    2015-06-01

    This study was conducted to develop a time series model of the Philippine Stock Exchange Composite Index (PSEi) and its volatility using a finite mixture of ARIMA models with conditional variance equations such as the ARCH, GARCH, EGARCH, TARCH and PARCH models. The study also aimed to find out what drives the behavior of PSEi, that is, which of the economic variables - Consumer Price Index, crude oil price, foreign exchange rate, gold price, interest rate, money supply, price-earnings ratio, Producers' Price Index and terms of trade - can be used in projecting future values of PSEi; this was examined using the Granger causality test. The findings showed that the best time series model for the Philippine Stock Exchange Composite Index is ARIMA(1,1,5)-ARCH(1). The Consumer Price Index, crude oil price and foreign exchange rate were found to Granger-cause the Philippine Stock Exchange Composite Index.

  8. Multifractal analysis of time series generated by discrete Ito equations

    SciTech Connect

    Telesca, Luciano; Czechowski, Zbigniew; Lovallo, Michele

    2015-06-15

    In this study, we show that discrete Ito equations with short-tail Gaussian marginal distribution function generate multifractal time series. The multifractality is due to the nonlinear correlations, which are hidden in Markov processes and are generated by the interrelation between the drift and the multiplicative stochastic forces in the Ito equation. A link between the range of the generalized Hurst exponents and the mean of the squares of all averaged net forces is suggested.
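A discrete Ito equation of the kind studied can be simulated directly. The drift and diffusion functions below are illustrative choices, not those of the paper; they show the interrelation between a deterministic drift and a multiplicative stochastic force:

```python
import math
import random

def discrete_ito(n, drift, diffusion, x0=0.0, dt=1.0, seed=42):
    """Iterate x[k+1] = x[k] + a(x[k])*dt + b(x[k])*sqrt(dt)*xi[k],
    with xi[k] drawn i.i.d. from N(0, 1) (a discrete Ito equation)."""
    rng = random.Random(seed)
    x = [x0]
    for _ in range(n - 1):
        xk = x[-1]
        x.append(xk + drift(xk) * dt
                    + diffusion(xk) * math.sqrt(dt) * rng.gauss(0, 1))
    return x

# illustrative: mean-reverting drift with a multiplicative stochastic force
series = discrete_ito(1000, lambda x: -0.5 * x, lambda x: 0.1 * (1 + abs(x)))
```

Series generated this way have a short-tail Gaussian-like marginal but, per the abstract, can still be multifractal because the state-dependent noise induces nonlinear correlations.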

  9. Dynamical analysis and visualization of tornadoes time series.

    PubMed

    Lopes, António M; Tenreiro Machado, J A

    2015-01-01

    In this paper we analyze the behavior of tornado time series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long-range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series spanning 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions, and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns.
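Approximating an amplitude spectrum by a power law reduces to a least-squares line fit in log-log coordinates; a minimal sketch (the synthetic spectrum in the test is illustrative):

```python
import math

def fit_power_law(freqs, amps):
    """Fit A(f) ~ c * f**(-beta) by linear regression of log A on log f."""
    lx = [math.log(f) for f in freqs]
    ly = [math.log(a) for a in amps]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(lx, ly))
             / sum((a - mx) ** 2 for a in lx))
    c = math.exp(my - slope * mx)
    return c, -slope   # prefactor and exponent beta
```

The fitted exponent beta is the "parameter read as an underlying signature of the system dynamics" in the abstract.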

  10. Dynamical Analysis and Visualization of Tornadoes Time Series

    PubMed Central

    2015-01-01

    In this paper we analyze the behavior of tornado time series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long-range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series spanning 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions, and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns. PMID:25790281

  11. Financial time series analysis based on information categorization method

    NASA Astrophysics Data System (ADS)

    Tian, Qiang; Shang, Pengjian; Feng, Guochen

    2014-12-01

    The paper applies the information categorization method to analyze financial time series. The method examines the similarity of different sequences by calculating the distances between them. We apply this method to quantify the similarity of different stock markets, and we report similarity results for the US and Chinese stock markets in the periods 1991-1998 (before the Asian currency crisis), 1999-2006 (after the Asian currency crisis and before the global financial crisis), and 2007-2013 (during and after the global financial crisis). The results show how the similarity between the stock markets differs across time periods and that the similarity of the two markets becomes larger after these two crises. We also obtain similarity results for 10 stock indices in three regions, showing that the method can distinguish the markets of different regions in the resulting phylogenetic trees. The results show that we can extract useful information from financial markets with this method. The information categorization method can be used not only for physiologic time series but also for financial time series.

  13. Seasonality of tuberculosis in Delhi, India: a time series analysis.

    PubMed

    Kumar, Varun; Singh, Abhay; Adhikary, Mrinmoy; Daral, Shailaja; Khokhar, Anita; Singh, Saudan

    2014-01-01

    Background. It is highly cost effective to detect a seasonal trend in tuberculosis in order to optimize disease control and intervention. Although seasonal variation of tuberculosis has been reported from different parts of the world, no definite and consistent pattern has been observed. Therefore, this study was designed to find the seasonal variation of tuberculosis in Delhi, India. Methods. A retrospective record-based study was undertaken at a Directly Observed Treatment, Short-course (DOTS) centre located in the south district of Delhi. Six years of data, from January 2007 to December 2012, were analyzed. The Expert Modeler of SPSS ver. 21 was used to fit the most suitable model to the time series data. Results. The autocorrelation function (ACF) and partial autocorrelation function (PACF) at lag 12 show significant peaks, suggesting a seasonal component in the TB series. Seasonal adjustment factors (SAF) showed peak seasonal variation from March to May. Univariate modeling with the SPSS Expert Modeler showed that Winters' multiplicative model best fit the time series data, explaining 69.8% of the variability. The forecast shows a declining trend with seasonality. Conclusion. A seasonal pattern and a declining trend with variable amplitudes of fluctuation were observed in the incidence of tuberculosis.

  14. Characterization of Ground Deformation above an Urban Tunnel by Means of InSAR Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Ferretti, A.; Iannacone, J.; Falorni, G.; Berti, M.; Corsini, A.

    2013-12-01

    Ground deformation produced by tunnel excavation in urban areas can cause damage to buildings and infrastructure. In these contexts, monitoring systems are required to determine the surface area affected by displacement and the rates of movement. Advanced multi-image satellite-based InSAR approaches are uniquely suited for this purpose as they provide an overview of the entire affected area and can measure movement rates with millimeter precision. Persistent scatterer approaches such as SqueeSAR™ use reflections off buildings, lampposts, roads, etc., to produce a high-density point cloud in which each point has a time series of deformation spanning the period covered by the imagery. We investigated an area of about 10 km2 in North Vancouver, Canada, where shaft excavation for the Seymour-Capilano water filtration plant started in 2004. As part of the project, twin tunnels in bedrock were excavated to transfer water from the Capilano Reservoir to the treatment plant. A radar dataset comprising 58 images (spanning March 2001 - June 2008) acquired by the Radarsat-1 satellite and covering the period of excavation was processed with the SqueeSAR™ algorithm (Ferretti et al., 2011) to assess the ground deformation caused by the tunnel excavation. To better characterize the deformation in the time and space domains and correlate ground movement with excavation, an in-depth time series analysis was carried out. Berti et al. (2013) developed an automatic procedure for the analysis of InSAR time series based on a sequence of statistical tests. The tool classifies time series into six distinctive types (uncorrelated; linear; quadratic; bilinear; discontinuous without constant velocity; discontinuous with change in velocity) which can be linked to different physical phenomena. It also provides a series of descriptive parameters which can be used to characterize the temporal changes of ground motion. We processed the movement time series with PSTime to determine the

  15. Advantages of the Multiple Case Series Approach to the Study of Cognitive Deficits in Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Towgood, Karren J.; Meuwese, Julia D. I.; Gilbert, Sam J.; Turner, Martha S.; Burgess, Paul W.

    2009-01-01

    In the neuropsychological case series approach, tasks are administered that tap different cognitive domains, and differences within rather than across individuals are the basis for theorising; each individual is effectively their own control. This approach is a mainstay of cognitive neuropsychology, and is particularly suited to the study of…

  16. Cross-recurrence quantification analysis of categorical and continuous time series: an R package

    PubMed Central

    Coco, Moreno I.; Dale, Rick

    2014-01-01

    This paper describes the R package crqa to perform cross-recurrence quantification analysis of two time series of either a categorical or continuous nature. Streams of behavioral information, from eye movements to linguistic elements, unfold over time. When two people interact, such as in conversation, they often adapt to each other, leading these behavioral levels to exhibit recurrent states. In dialog, for example, interlocutors adapt to each other by exchanging interactive cues: smiles, nods, gestures, choice of words, and so on. In order to capture closely the goings-on of dynamic interaction, and uncover the extent of coupling between two individuals, we need to quantify how much recurrence is taking place at these levels. Methods available in crqa allow researchers in cognitive science to pose such questions as how much two people recur at some level of analysis, what the characteristic lag time is for one person to maximally match another, or whether one person is leading another. First, we lay the theoretical groundwork for understanding the difference between “correlation” and “co-visitation” when comparing two time series, using an aggregative or cross-recurrence approach. Then, we describe more formally the principles of cross-recurrence, and show with the current package how to carry out analyses applying them. We end the paper by comparing the computational efficiency and consistency of results of the crqa R package with the benchmark MATLAB toolbox crptoolbox (Marwan, 2013). We show perfect comparability between the two libraries on both levels. PMID:25018736
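    The core object in cross-recurrence analysis is a binary matrix marking where the states of two series coincide; diagonal-wise recurrence then reveals the characteristic lag at which one stream maximally matches the other. The following is a minimal Python sketch of that idea for categorical series (an illustration only, not the crqa R package; function names are hypothetical):

```python
import numpy as np

def cross_recurrence(x, y):
    """Binary cross-recurrence matrix: R[i, j] = 1 where x[i] matches y[j]."""
    x = np.asarray(x)
    y = np.asarray(y)
    return (x[:, None] == y[None, :]).astype(int)

def recurrence_rate(R):
    """Fraction of recurrent points in the matrix."""
    return R.mean()

def lag_profile(R, max_lag):
    """Mean recurrence on each diagonal; the peak lag indicates how far
    one series must be shifted to maximally match the other."""
    return {k: np.diagonal(R, offset=k).mean()
            for k in range(-max_lag, max_lag + 1)}

# Two nominal event streams where y repeats x two steps later
x = [1, 2, 3, 1, 2, 3, 1, 2, 3, 1]
y = [3, 3, 1, 2, 3, 1, 2, 3, 1, 2]

R = cross_recurrence(x, y)
prof = lag_profile(R, 3)
best_lag = max(prof, key=prof.get)   # lag of maximal matching
```

    Because `y` trails `x` by two steps, the profile peaks at lag +2; crqa computes these and further recurrence measures with optimized routines.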

  17. Time-series tropical forest change detection: a visual and quantitative approach

    NASA Astrophysics Data System (ADS)

    Sader, Steven A.; Sever, Thomas; Smoot, James C.

    1996-11-01

    Forest change detection over a decadal time frame was conducted for the Maya Biosphere Reserve in northern Guatemala. A simple and logical method of visualizing and quantifying forest change is presented. Analysis of time-series Landsat Thematic Mapper imagery provided estimates of forest change at three time periods: prior to 1990, 1990 to 1993, and 1993 to 1995. Four dates of Landsat imagery were pre-processed and co-registered to a UTM projection, and the normalized difference vegetation index (NDVI) was computed for each date. An unsupervised classification was performed and cluster classes were grouped into time-series change/no-change categories. A color-coded image was generated which resembled the RGB-NDVI color composite of the 1990, 1993, and 1995 imagery. Land cover information and Geographic Information System (GIS) editing techniques were applied to resolve some confusion between forest change and change in non-forest types. Results indicated that forest clearing rates in the reserve were less than 0.5 percent per year in the early to mid 1990s, but buffer zone clearing rates, at over two percent, were much higher.
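    The NDVI-and-difference step at the heart of such change detection is easy to sketch. Below is a minimal numpy illustration; the fixed-threshold flagging stands in for the paper's unsupervised clustering, and the band values and 0.3 threshold are invented for the example:

```python
import numpy as np

def ndvi(red, nir):
    """Normalized difference vegetation index, computed per pixel."""
    red = red.astype(float)
    nir = nir.astype(float)
    return (nir - red) / (nir + red + 1e-9)   # small eps avoids 0/0

def forest_change(ndvi_t1, ndvi_t2, drop=0.3):
    """Flag pixels whose NDVI fell by more than `drop` between dates
    (a crude stand-in for the unsupervised clustering step)."""
    return (ndvi_t1 - ndvi_t2) > drop

# 2x2 toy scene: pixel (0, 1) is cleared between the two dates
red1 = np.array([[0.05, 0.05], [0.30, 0.05]])
nir1 = np.array([[0.50, 0.50], [0.35, 0.50]])
red2 = np.array([[0.05, 0.30], [0.30, 0.05]])
nir2 = np.array([[0.50, 0.35], [0.35, 0.50]])

n1, n2 = ndvi(red1, nir1), ndvi(red2, nir2)
cleared = forest_change(n1, n2)
```

    Repeating the NDVI step for each acquisition date and classifying the per-pixel sequences gives the time-series change/no-change categories described above.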

  18. Knowledge fusion: An approach to time series model selection followed by pattern recognition

    SciTech Connect

    Bleasdale, S.A.; Burr, T.L.; Scovel, J.C.; Strittmatter, R.B.

    1996-03-01

    This report describes work done during FY 95 that was sponsored by the Department of Energy, Office of Nonproliferation and National Security, Knowledge Fusion Project. The project team selected satellite sensor data to use as the one main example for the application of its analysis algorithms. The specific sensor-fusion problem has many generic features, which make it a worthwhile problem to attempt to solve in a general way. The generic problem is to recognize events of interest from multiple time series that define a possibly noisy background. By implementing a suite of time series modeling and forecasting methods and using well-chosen alarm criteria, we reduce the number of false alarms. We then further reduce the number of false alarms by analyzing all suspicious sections of data, as judged by the alarm criteria, with pattern recognition methods. An accompanying report (Ref. 1) describes the implementation and application of this two-step process for separating events from unusual background, applying a suite of forecasting methods followed by a suite of pattern recognition methods. This report goes into more detail about one of the forecasting methods and one of the pattern recognition methods, which are applied to the same kind of satellite-sensor data described in Ref. 1.

  19. Class D Management Implementation Approach of the First Orbital Mission of the Earth Venture Series

    NASA Technical Reports Server (NTRS)

    Wells, James E.; Scherrer, John; Law, Richard; Bonniksen, Chris

    2013-01-01

    A key element of the National Research Council's Earth Science and Applications Decadal Survey called for the creation of the Venture Class line of low-cost research and application missions within NASA (National Aeronautics and Space Administration). One key component of the architecture chosen by NASA within the Earth Venture line is a series of self-contained stand-alone spaceflight science missions called "EV-Mission". The first mission chosen for this competitively selected, cost and schedule capped, Principal Investigator-led opportunity is the CYclone Global Navigation Satellite System (CYGNSS). As specified in the defining Announcement of Opportunity, the Principal Investigator is held responsible for successfully achieving the science objectives of the selected mission and the management approach that he/she chooses to obtain those results has a significant amount of freedom as long as it meets the intent of key NASA guidance like NPR 7120.5 and 7123. CYGNSS is classified under NPR 7120.5E guidance as a Category 3 (low priority, low cost) mission and carries a Class D risk classification (low priority, high risk) per NPR 8705.4. As defined in the NPR guidance, Class D risk classification allows for a relatively broad range of implementation strategies. The management approach that will be utilized on CYGNSS is a streamlined implementation that starts with a higher risk tolerance posture at NASA and that philosophy flows all the way down to the individual part level.

  20. Class D management implementation approach of the first orbital mission of the Earth Venture series

    NASA Astrophysics Data System (ADS)

    Wells, James E.; Scherrer, John; Law, Richard; Bonniksen, Chris

    2013-09-01

    A key element of the National Research Council's Earth Science and Applications Decadal Survey called for the creation of the Venture Class line of low-cost research and application missions within NASA (National Aeronautics and Space Administration). One key component of the architecture chosen by NASA within the Earth Venture line is a series of self-contained stand-alone spaceflight science missions called "EV-Mission". The first mission chosen for this competitively selected, cost and schedule capped, Principal Investigator-led opportunity is the CYclone Global Navigation Satellite System (CYGNSS). As specified in the defining Announcement of Opportunity, the Principal Investigator is held responsible for successfully achieving the science objectives of the selected mission and the management approach that he/she chooses to obtain those results has a significant amount of freedom as long as it meets the intent of key NASA guidance like NPR 7120.5 and 7123. CYGNSS is classified under NPR 7120.5E guidance as a Category 3 (low priority, low cost) mission and carries a Class D risk classification (low priority, high risk) per NPR 8705.4. As defined in the NPR guidance, Class D risk classification allows for a relatively broad range of implementation strategies. The management approach that will be utilized on CYGNSS is a streamlined implementation that starts with a higher risk tolerance posture at NASA and that philosophy flows all the way down to the individual part level.

  1. Time-Series Analysis of Supergranule Characteristics at Solar Minimum

    NASA Technical Reports Server (NTRS)

    Williams, Peter E.; Pesnell, W. Dean

    2013-01-01

    Sixty days of Doppler images from the Solar and Heliospheric Observatory (SOHO) / Michelson Doppler Imager (MDI) investigation during the 1996 and 2008 solar minima have been analyzed to show that certain supergranule characteristics (size, size range, and horizontal velocity) exhibit fluctuations of three to five days. Cross-correlating parameters showed a good, positive correlation between supergranulation size and size range, and a moderate, negative correlation between size range and velocity. The size and velocity do exhibit a moderate, negative correlation, but with a small time lag (less than 12 hours). Supergranule sizes during five days of co-temporal data from MDI and the Solar Dynamics Observatory (SDO) / Helioseismic Magnetic Imager (HMI) exhibit similar fluctuations with a high level of correlation between them. This verifies the solar origin of the fluctuations, which cannot be caused by instrumental artifacts according to these observations. Similar fluctuations are also observed in data simulations that model the evolution of the MDI Doppler pattern over a 60-day period. Correlations between the supergranule size and size range time-series derived from the simulated data are similar to those seen in MDI data. A simple toy-model using cumulative, uncorrelated exponential growth and decay patterns at random emergence times produces a time-series similar to the data simulations. The qualitative similarities between the simulated and the observed time-series suggest that the fluctuations arise from stochastic processes occurring within the solar convection zone. This behavior, propagating to surface manifestations of supergranulation, may assist our understanding of magnetic-field-line advection, evolution, and interaction.

  2. Bilinear System Characteristics from Nonlinear Time Series Analysis

    SciTech Connect

    Hunter, N.F. Jr.

    1999-02-08

    Detection of changes in the resonant frequencies and mode shapes of a system is a fundamental problem in dynamics. This paper describes a time series method of detecting and quantifying changes in these parameters for a ten degree-of-freedom bilinear system excited by narrow band random noise. The method partitions the state space and computes mode frequencies and mode shapes for each region. Different regions of the space may exhibit different mode shapes, allowing diagnosis of stiffness changes at structural discontinuities. The method is useful for detecting changes in the properties of joints in mechanical systems or for detection of damage as the properties of a structure change during use.

  3. Bicomponent Trend Maps: A Multivariate Approach to Visualizing Geographic Time Series

    PubMed Central

    Schroeder, Jonathan P.

    2012-01-01

    The most straightforward approaches to temporal mapping cannot effectively illustrate all potentially significant aspects of spatio-temporal patterns across many regions and times. This paper introduces an alternative approach, bicomponent trend mapping, which employs a combination of principal component analysis and bivariate choropleth mapping to illustrate two distinct dimensions of long-term trend variations. The approach also employs a bicomponent trend matrix, a graphic that illustrates an array of typical trend types corresponding to different combinations of scores on two principal components. This matrix is useful not only as a legend for bicomponent trend maps but also as a general means of visualizing principal components. To demonstrate and assess the new approach, the paper focuses on the task of illustrating population trends from 1950 to 2000 in census tracts throughout major U.S. urban cores. In a single static display, bicomponent trend mapping is not able to depict as wide a variety of trend properties as some other multivariate mapping approaches, but it can make relationships among trend classes easier to interpret, and it offers some unique flexibility in classification that could be particularly useful in an interactive data exploration environment. PMID:23504193
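    The principal-component step of bicomponent trend mapping can be sketched with plain numpy. Here three invented tract trajectories (steady decline, steady growth, rise-then-fall) are scored on two leading components; in the mapping stage those two score axes would drive a bivariate color scheme. Illustrative data only, not the paper's census tracts:

```python
import numpy as np

def pca_scores(X, k=2):
    """Principal component scores via SVD on mean-centered rows
    (each row = one tract's population time series)."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :k] * S[:k]                 # projection onto first k PCs
    explained = S ** 2 / np.sum(S ** 2)       # variance share per component
    return scores, explained

years = np.arange(1950, 2001, 10)             # 6 decennial censuses
decline = 100 - 1.5 * (years - 1950)          # steadily declining tract
growth = 40 + 1.2 * (years - 1950)            # steadily growing tract
boom_bust = 60 + 40 * np.sin((years - 1950) / 50 * np.pi)  # rise then fall

X = np.vstack([decline, growth, boom_bust])
scores, explained = pca_scores(X)
```

    Each tract's pair of scores locates it in the bicomponent trend matrix; opposite-signed scores on the first component separate the declining and growing tracts.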

  4. Anatomy of the ICDS series: A bibliometric analysis

    NASA Astrophysics Data System (ADS)

    Cardona, Manuel; Marx, Werner

    2007-12-01

    In this article, the proceedings of the International Conferences on Defects in Semiconductors (ICDS) have been analyzed by bibliometric methods. The papers of these conferences have been published as articles in regular journals or special proceedings journals and in books with diverse publishers. The conference name/title changed several times. Many of the proceedings did not appear in the so-called “source journals” covered by the Thomson/ISI citation databases, in particular by the Science Citation Index (SCI). But the number of citations within these source journals can be determined using the Cited Reference Search mode under the Web of Science (WoS) and the SCI offered by the host STN International. The search functions of both systems were needed to select the papers published as different document types and to cover the full time span of the series. The most cited ICDS papers were identified, and the overall numbers of citations, as well as the time-dependent impact of these papers, of single conferences, and of the complete series, were established. The complete set of citing papers was analyzed with respect to the countries of the citing authors, the citing journals, and the ISI subject categories.

  5. A new complexity measure for time series analysis and classification

    NASA Astrophysics Data System (ADS)

    Nagaraj, Nithin; Balasubramanian, Karthi; Dey, Sutirth

    2013-07-01

    Complexity measures are used in a number of applications, including extraction of information from data such as ecological time series, detection of non-random structure in biomedical signals, testing of random number generators, language recognition, and authorship attribution. Different complexity measures proposed in the literature, like Shannon entropy, relative entropy, Lempel-Ziv, Kolmogorov, and algorithmic complexity, are mostly ineffective in analyzing short sequences that are further corrupted with noise. To address this problem, we propose a new complexity measure, ETC, defined as the "Effort To Compress" the input sequence by a lossless compression algorithm. Here, we employ the lossless compression algorithm known as Non-Sequential Recursive Pair Substitution (NSRPS) and define ETC as the number of iterations needed for NSRPS to transform the input sequence to a constant sequence. We demonstrate the utility of ETC in two applications. ETC is shown to have better correlation with the Lyapunov exponent than Shannon entropy, even with relatively short and noisy time series. The measure also has a greater rate of success in automatic identification and classification of short noisy sequences, compared to entropy and a popular measure based on Lempel-Ziv compression (implemented by Gzip).
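    The "Effort To Compress" idea is concrete enough to sketch: repeatedly substitute the most frequent adjacent pair with a fresh symbol and count the iterations until the sequence becomes constant. A minimal Python reading of NSRPS follows; tie-breaking details may differ from the authors' implementation:

```python
from collections import Counter

def nsrps_step(seq):
    """One Non-Sequential Recursive Pair Substitution step: replace every
    non-overlapping occurrence of the most frequent adjacent pair with a
    single new symbol."""
    pair = Counter(zip(seq, seq[1:])).most_common(1)[0][0]
    new = max(seq) + 1            # fresh symbol not yet in the sequence
    out, i = [], 0
    while i < len(seq):
        if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
            out.append(new)
            i += 2
        else:
            out.append(seq[i])
            i += 1
    return out

def etc(seq):
    """Effort To Compress: NSRPS iterations until the sequence is constant."""
    seq = list(seq)
    steps = 0
    while len(seq) > 1 and len(set(seq)) > 1:
        seq = nsrps_step(seq)
        steps += 1
    return steps

etc_const = etc([1, 1, 1, 1, 1])        # already constant: zero effort
etc_mixed = etc([1, 2, 1, 2, 1, 2])     # strictly periodic: one substitution
```

    A constant sequence needs no effort, a perfectly periodic one collapses in a single substitution, and less regular sequences take more steps, which is the sense in which ETC measures complexity.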

  6. Spectral analysis of time series of categorical variables in earth sciences

    NASA Astrophysics Data System (ADS)

    Pardo-Igúzquiza, Eulogio; Rodríguez-Tovar, Francisco J.; Dorador, Javier

    2016-10-01

    Time series of categorical variables often appear in Earth Science disciplines and there is considerable interest in studying their cyclic behavior. This is true, for example, when the type of facies, petrofabric features, ichnofabrics, fossil assemblages or mineral compositions are measured continuously over a core or throughout a stratigraphic succession. Here we deal with the problem of applying spectral analysis to such sequences. A full indicator approach is proposed to complement the spectral envelope often used in other disciplines. Additionally, a stand-alone computer program is provided for calculating the spectral envelope, in this case implementing the permutation test to assess the statistical significance of the spectral peaks. We studied simulated sequences as well as real data in order to illustrate the methodology.
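    The full-indicator idea reduces a categorical series to one 0/1 series per category, each of which can then be analyzed with ordinary spectral tools. A minimal numpy sketch follows; summing the indicator periodograms is a crude stand-in for the spectral envelope, and the permutation significance test is omitted:

```python
import numpy as np

def indicator_series(seq, category):
    """0/1 indicator: 1 where the categorical sequence equals `category`."""
    return (np.asarray(seq) == category).astype(float)

def summed_periodogram(seq):
    """Sum the periodograms of all category indicators (a simple
    full-indicator stand-in for the spectral envelope)."""
    seq = np.asarray(seq)
    n = len(seq)
    total = np.zeros(n // 2 + 1)
    for c in np.unique(seq):
        ind = indicator_series(seq, c)
        total += np.abs(np.fft.rfft(ind - ind.mean())) ** 2 / n
    return total

# Synthetic cyclic facies log: two facies alternating with period 8
facies = np.tile([0, 0, 0, 0, 1, 1, 1, 1], 8)     # 64 samples
power = summed_periodogram(facies)
dominant = int(np.argmax(power[1:]) + 1)           # skip the zero frequency
cycle_length = len(facies) // dominant             # samples per cycle
```

    The dominant peak falls at 8 cycles over the 64-sample record, recovering the 8-sample facies cycle.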

  7. Flow Analysis: A Novel Approach For Classification.

    PubMed

    Vakh, Christina; Falkova, Marina; Timofeeva, Irina; Moskvin, Alexey; Moskvin, Leonid; Bulatov, Andrey

    2016-09-01

    We suggest a novel approach for classification of flow analysis methods according to the conditions under which the mass transfer processes and chemical reactions take place in the flow mode: dispersion-convection flow methods and forced-convection flow methods. The first group includes continuous flow analysis, flow injection analysis, all injection analysis, sequential injection analysis, sequential injection chromatography, cross injection analysis, multi-commutated flow analysis, multi-syringe flow injection analysis, multi-pumping flow systems, loop flow analysis, and simultaneous injection effective mixing flow analysis. The second group includes segmented flow analysis, zone fluidics, flow batch analysis, sequential injection analysis with a mixing chamber, stepwise injection analysis, and multi-commutated stepwise injection analysis. The offered classification allows systematizing a large number of flow analysis methods. Recent developments and applications of dispersion-convection flow methods and forced-convection flow methods are presented.

  8. Flow Analysis: A Novel Approach For Classification.

    PubMed

    Vakh, Christina; Falkova, Marina; Timofeeva, Irina; Moskvin, Alexey; Moskvin, Leonid; Bulatov, Andrey

    2016-09-01

    We suggest a novel approach for classification of flow analysis methods according to the conditions under which the mass transfer processes and chemical reactions take place in the flow mode: dispersion-convection flow methods and forced-convection flow methods. The first group includes continuous flow analysis, flow injection analysis, all injection analysis, sequential injection analysis, sequential injection chromatography, cross injection analysis, multi-commutated flow analysis, multi-syringe flow injection analysis, multi-pumping flow systems, loop flow analysis, and simultaneous injection effective mixing flow analysis. The second group includes segmented flow analysis, zone fluidics, flow batch analysis, sequential injection analysis with a mixing chamber, stepwise injection analysis, and multi-commutated stepwise injection analysis. The offered classification allows systematizing a large number of flow analysis methods. Recent developments and applications of dispersion-convection flow methods and forced-convection flow methods are presented. PMID:26364745

  9. Analysis of Zenith Tropospheric Delay above Europe based on long time series derived from the EPN data

    NASA Astrophysics Data System (ADS)

    Baldysz, Zofia; Nykiel, Grzegorz; Figurski, Mariusz; Szafranek, Karolina; Kroszczynski, Krzysztof; Araszkiewicz, Andrzej

    2015-04-01

    In recent years, GNSS has begun to play an increasingly important role in research related to climate monitoring. Based on the GPS system, which has the longest operational capability in comparison with other systems, and a common computational strategy applied to all observations, long and homogeneous ZTD (Zenith Tropospheric Delay) time series were derived. This paper presents results of an analysis of 16-year ZTD time series obtained from the EPN (EUREF Permanent Network) reprocessing performed by the Military University of Technology. To maintain the uniformity of the data, the analyzed period of time (1998-2013) is exactly the same for all stations: observations carried out before 1998 were removed from the time series, and observations processed using a different strategy were recalculated according to the MUT LAC approach. For all 16-year time series (59 stations), Lomb-Scargle periodograms were created to obtain information about the oscillations in the ZTD time series. Because strong annual oscillations disturb oscillations with smaller amplitudes and thus hinder their investigation, Lomb-Scargle periodograms were also created for time series with the annual oscillation removed, in order to verify the presence of semi-annual, ter-annual, and quarter-annual oscillations. The linear trend and seasonal components were estimated using LSE (Least Squares Estimation), and the Mann-Kendall trend test was used to confirm the presence of the linear trend identified by the LSE method. In order to verify the effect of the length of the time series on the estimated size of the linear trend, a comparison between two different lengths of ZTD time series was performed. To carry out a comparative analysis, 30 stations which have been operating since 1996 were selected. For these stations, two periods of time were analyzed: a shortened 16-year period (1998-2013) and the full 18-year period (1996-2013).
For some stations an additional two years of observations have significant impact on changing the size of linear
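    Lomb-Scargle periodograms are used here because GNSS series are rarely complete, and the method handles unevenly spaced epochs directly. A compact numpy implementation of the classic (Scargle 1982) periodogram, applied to a synthetic ZTD-like annual cycle, is sketched below; the data values are invented, and this is not the MUT processing chain:

```python
import numpy as np

def lomb_scargle(t, y, freqs):
    """Classic Lomb-Scargle periodogram for unevenly sampled data
    (Scargle 1982 normalization, mean removed)."""
    y = y - y.mean()
    power = np.empty(len(freqs))
    for i, f in enumerate(freqs):
        w = 2.0 * np.pi * f
        # time offset tau makes the sine/cosine terms orthogonal
        tau = np.arctan2(np.sum(np.sin(2 * w * t)),
                         np.sum(np.cos(2 * w * t))) / (2 * w)
        c = np.cos(w * (t - tau))
        s = np.sin(w * (t - tau))
        power[i] = 0.5 * ((y @ c) ** 2 / (c @ c) + (y @ s) ** 2 / (s @ s))
    return power

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 6 * 365.25, 500))      # ~6 years, uneven sampling
y = 2.4 + 0.05 * np.sin(2 * np.pi * t / 365.25)   # ZTD-like annual cycle (m)

freqs = np.linspace(1 / 2000.0, 1 / 30.0, 4000)   # periods of 30-2000 days
best_period = 1.0 / freqs[np.argmax(lomb_scargle(t, y, freqs))]
```

    The periodogram peaks at a period of about 365 days, recovering the annual oscillation despite the irregular sampling.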

  10. Methods for the Analysis of interferometric Time Series Non-linearity

    NASA Astrophysics Data System (ADS)

    Pasquali, Paolo; Cantone, Alessio; Riccardi, Paolo

    2014-05-01

    Interferometric stacking techniques emerged as methods to obtain very precise measurements of small terrain displacements. In particular, the so-called Persistent Scatterers and Small BASeline methods can be considered the two most representative stacking approaches. In both cases, the exploitation of 20 or more satellite Synthetic Aperture Radar (SAR) acquisitions obtained from the same satellite sensor with similar geometries over the area of interest makes it possible to measure average displacement rates with an accuracy on the order of a few mm/year, and to derive the full displacement history of "good" pixels with an accuracy of 1 cm or better for every available date. Although the temporal component of these measurements provides very rich information for investigating the evolution of complex phenomena, this wealth of data can be difficult to interpret once the area of investigation reaches a certain size and several million valid pixels can be identified. The typical approach is then to focus the analysis on the average displacement rate: one evident advantage is that it can be easily displayed, and regions showing different average behaviours can be easily identified by simple visual analysis. Limitations of this approach become evident as soon as more complex, non-linear behaviours are to be expected (as is natural) in a given region, and different methods must be sought to visualise the time series in a synoptic way and to identify areas with similar, non-linear characteristics. The paper focuses on identifying descriptive parameters that, complementing the average displacement rate, could be synthesized from the displacement time series and exploited in this analysis. It should be noted that this approach is particularly applicable to time series obtained with the SBAS method, which, by design, depends less on linearity assumptions than the PS method.
A first

  11. Taxation in Public Education. Analysis and Bibliography Series, No. 12.

    ERIC Educational Resources Information Center

    Ross, Larry L.

    Intended for both researchers and practitioners, this analysis and bibliography cites approximately 100 publications on educational taxation, including general texts and reports, statistical reports, taxation guidelines, and alternative proposals for taxation. Topics covered in the analysis section include State and Federal aid, urban and suburban…

  12. Student Stress: A Classroom Management System. Analysis and Action Series.

    ERIC Educational Resources Information Center

    Swick, Kevin J.

    This book is concerned with the problem of student stress and the possibility that children and adolescents will internalize ineffective coping strategies used by adult models available to them. The introductory chapter explains a need for an educational plan to promote ways of controlling stress; recommends a systematic approach to managing…

  13. Multiscale InSAR Time Series (MInTS) analysis of surface deformation

    NASA Astrophysics Data System (ADS)

    Hetland, E. A.; Muse, P.; Simons, M.; Lin, Y. N.; Agram, P. S.; DiCaprio, C. J.

    2011-12-01

    We present a new approach to extracting spatially and temporally continuous ground deformation fields from interferometric synthetic aperture radar (InSAR) data. We focus on unwrapped interferograms from a single viewing geometry, estimating ground deformation along the line-of-sight. Our approach is based on a wavelet decomposition in space and a general parametrization in time. We refer to this approach as MInTS (Multiscale InSAR Time Series). The wavelet decomposition efficiently deals with commonly seen spatial covariances in repeat-pass InSAR measurements, such that coefficients of the wavelets are essentially spatially uncorrelated. Our time-dependent parametrization is capable of capturing both recognized and unrecognized processes, and is not arbitrarily tied to the times of the SAR acquisitions. We estimate deformation in the wavelet-domain, using a cross-validated, regularized least-squares inversion. We include a model-resolution-based regularization, in order to more heavily damp the model during periods of sparse SAR acquisitions, compared to during times of dense acquisitions. To illustrate the application of MInTS, we consider a catalog of 92 ERS and Envisat interferograms, spanning 16 years, in the Long Valley caldera, CA, region. MInTS analysis captures the ground deformation with high spatial density over the Long Valley region.

  14. Multiscale InSAR Time Series (MInTS) analysis of surface deformation

    NASA Astrophysics Data System (ADS)

    Hetland, E. A.; Musé, P.; Simons, M.; Lin, Y. N.; Agram, P. S.; Dicaprio, C. J.

    2012-02-01

    We present a new approach to extracting spatially and temporally continuous ground deformation fields from interferometric synthetic aperture radar (InSAR) data. We focus on unwrapped interferograms from a single viewing geometry, estimating ground deformation along the line-of-sight. Our approach is based on a wavelet decomposition in space and a general parametrization in time. We refer to this approach as MInTS (Multiscale InSAR Time Series). The wavelet decomposition efficiently deals with commonly seen spatial covariances in repeat-pass InSAR measurements, since the coefficients of the wavelets are essentially spatially uncorrelated. Our time-dependent parametrization is capable of capturing both recognized and unrecognized processes, and is not arbitrarily tied to the times of the SAR acquisitions. We estimate deformation in the wavelet-domain, using a cross-validated, regularized least squares inversion. We include a model-resolution-based regularization, in order to more heavily damp the model during periods of sparse SAR acquisitions, compared to during times of dense acquisitions. To illustrate the application of MInTS, we consider a catalog of 92 ERS and Envisat interferograms, spanning 16 years, in the Long Valley caldera, CA, region. MInTS analysis captures the ground deformation with high spatial density over the Long Valley region.
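    The reason a wavelet decomposition helps is that it concentrates a spatially smooth, correlated field into a few coefficients that are approximately uncorrelated across scales. MInTS uses specialized wavelets that tolerate data gaps; as a toy stand-in, here is a minimal 1-D orthonormal Haar decomposition showing the energy-preserving multiscale split:

```python
import numpy as np

def haar_step(x):
    """One level of the orthonormal 1-D Haar transform:
    pairwise averages (approximation) and differences (detail)."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def haar_decompose(x, levels):
    """Multilevel decomposition: detail coefficients per level,
    plus the final coarse approximation."""
    a = np.asarray(x, dtype=float)
    details = []
    for _ in range(levels):
        a, d = haar_step(a)
        details.append(d)
    return details, a

# A piecewise-constant "deformation profile": smooth except for one step
signal = np.array([2.0, 2.0, 2.0, 2.0, 6.0, 6.0, 6.0, 6.0])
details, coarse = haar_decompose(signal, 3)

energy_in = np.sum(signal ** 2)
energy_out = sum(np.sum(d ** 2) for d in details) + np.sum(coarse ** 2)
```

    For this field all but one detail coefficient vanish while the total energy is preserved, which is the sparsity/decorrelation property the wavelet-domain estimation exploits (the actual MInTS wavelets and gap handling are more sophisticated).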

  15. A multi-modal treatment approach for the shoulder: A 4 patient case series

    PubMed Central

    Pribicevic, Mario; Pollard, Henry

    2005-01-01

    Background This paper describes the clinical management of four cases of shoulder impingement syndrome using a conservative multimodal treatment approach. Clinical Features Four patients presented to a chiropractic clinic with chronic shoulder pain, tenderness in the shoulder region and a limited range of motion with pain and catching. After physical and orthopaedic examination a clinical diagnosis of shoulder impingement syndrome was reached. The four patients were admitted to a multi-modal treatment protocol including soft tissue therapy (ischaemic pressure and cross-friction massage), 7 minutes of phonophoresis (driving of medication into tissue with ultrasound) with 1% cortisone cream, diversified spinal and peripheral joint manipulation and rotator cuff and shoulder girdle muscle exercises. The outcome measures for the study were subjective/objective visual analogue pain scales (VAS), range of motion (goniometer) and return to normal daily, work and sporting activities. All four subjects at the end of the treatment protocol were symptom free with all outcome measures being normal. At 1 month follow up all patients continued to be symptom free with full range of motion and complete return to normal daily activities. Conclusion This case series demonstrates the potential benefit of a multimodal chiropractic protocol in resolving symptoms associated with a suspected clinical diagnosis of shoulder impingement syndrome. PMID:16168053

  16. Modular Approach to Instrumental Analysis.

    ERIC Educational Resources Information Center

    Deming, Richard L.; And Others

    1982-01-01

    To remedy certain deficiencies, an instrument analysis course was reorganized into six one-unit modules: optical spectroscopy, magnetic resonance, separations, electrochemistry, radiochemistry, and computers and interfacing. Selected aspects of the course are discussed. (SK)

  17. Reduction in the suicide rate during Advent--a time series analysis.

    PubMed

    Ajdacic-Gross, Vladeta; Lauber, Christoph; Bopp, Matthias; Eich, Dominique; Gostynski, Michael; Gutzwiller, Felix; Burns, Tom; Rössler, Wulf

    2008-01-15

    Research has shown that there are different seasonal effects in suicide. The aim of this study is to demonstrate that the decrease in suicide rate at the end of the year is extended over the last weeks of the year and represents a specific type of seasonal effect. Suicide data were extracted from individual records of the Swiss mortality statistics, 1969-2003. The data were aggregated to daily frequencies of suicide across the year. Specifically, the period October-February was examined using time-series analysis, i.e., the Box-Jenkins approach with intervention models. The time series models require a step function to account for the gradual drop in suicide frequencies in December. The decrease in suicide frequencies includes the whole Advent and is accentuated at Christmas. After the New Year, there is a sharp recovery in men's suicide rate but not in women's. The reduction in the suicide rate during the last weeks of the year exceeds the well-recognised effect of reduced rates on major public holidays. It involves valuable challenges for suicide prevention such as timing of campaigns and enhancement of social networks.
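    A step-function intervention of the kind used in the Box-Jenkins analysis can be illustrated with a bare-bones least-squares fit. The daily counts below are synthetic, and plain least squares ignores the ARIMA error structure that the real intervention models include:

```python
import numpy as np

# Day axis for an Oct 1 - Feb 28 window (31+30+31+31+28 = 151 days),
# with a step drop through December (indices 61-91) plus small noise.
rng = np.random.default_rng(1)
n = 151
step = np.zeros(n)
step[61:92] = 1.0                       # December intervention window

counts = 10.0 - 3.0 * step + rng.normal(0, 0.3, n)   # synthetic daily suicides

# Least-squares intervention fit: counts ~ intercept + effect * step
X = np.column_stack([np.ones(n), step])
coef, *_ = np.linalg.lstsq(X, counts, rcond=None)
intercept, effect = coef
```

    The fitted `effect` recovers the imposed drop of about 3 counts per day during the intervention window; in the Box-Jenkins setting the same step regressor enters an ARIMA model rather than an ordinary regression.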

  18. Different approaches of spectral analysis

    NASA Technical Reports Server (NTRS)

    Lacoume, J. L.

    1977-01-01

    Several approaches to the problem of calculating the spectral power density of a random function from an estimate of the autocorrelation function were studied. A comparative study of these different methods was presented, pointing out the principles on which they are based and the hypotheses implied. Some indications on optimizing the length of the estimated correlation function were given. An example of the application of the different methods discussed in this paper was included.
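    The basic route from an estimated autocorrelation function to a spectral power density can be sketched in a few lines. This is a simple Blackman-Tukey style estimate with numpy (one of several possible approaches; the windowing and lag-length choices the abstract alludes to are fixed here for illustration):

```python
import numpy as np

def acf(x, max_lag):
    """Biased sample autocorrelation estimate up to max_lag."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    return np.array([np.sum(x[: n - k] * x[k:]) / n
                     for k in range(max_lag + 1)])

def psd_from_acf(r):
    """Blackman-Tukey style estimate: Fourier transform of the
    (two-sided) autocorrelation estimate."""
    two_sided = np.concatenate([r[::-1], r[1:]])
    return np.abs(np.fft.rfft(two_sided))

t = np.arange(128)
x = np.cos(2 * np.pi * 8 * t / 128)       # 8 cycles across the record
psd = psd_from_acf(acf(x, 64))            # estimated correlation length: 64 lags
peak_bin = int(np.argmax(psd))
```

    Truncating the autocorrelation at 64 lags implicitly applies a triangular window, which broadens the spectral peak; the length of the estimated correlation function is exactly the trade-off the paper discusses.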

  19. A general framework for time series data mining based on event analysis: application to the medical domains of electroencephalography and stabilometry.

    PubMed

    Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P

    2014-10-01

    There are now domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, like medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events, and compares two time series by analyzing the events they have in common. We have applied our framework to time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed scheme has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains generating iconographic time series, such as electrocardiography (ECG).
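    The event-extraction step such a framework relies on can be sketched simply: segment the regions where the signal exceeds a threshold, then compare series by their overlapping events. A toy Python illustration follows; threshold-crossing stands in for the paper's domain-specific event definitions, and the function names are hypothetical:

```python
import numpy as np

def extract_events(x, thresh):
    """Return (start, end) index intervals (end exclusive) where |x| exceeds
    `thresh` -- the 'regions of interest' analyzed instead of the full series."""
    mask = np.abs(np.asarray(x, dtype=float)) > thresh
    padded = np.concatenate([[0], mask.astype(int), [0]])
    d = np.diff(padded)
    starts = np.flatnonzero(d == 1)
    ends = np.flatnonzero(d == -1)
    return list(zip(starts, ends))

def common_events(ev_a, ev_b):
    """Count events whose intervals overlap between two series -- a toy
    version of comparing series by the events they have in common."""
    return sum(1 for a0, a1 in ev_a for b0, b1 in ev_b
               if a0 < b1 and b0 < a1)

sig_a = [0, 0, 5, 6, 0, 0, 7, 0]
sig_b = [0, 0, 0, 4, 5, 0, 0, 0]
ev_a = extract_events(sig_a, 1.0)
ev_b = extract_events(sig_b, 1.0)
shared = common_events(ev_a, ev_b)
```

    Here the two signals share one overlapping event; in the framework, each extracted event would additionally be matched against reference models of representative events before classification.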

  20. Data Reorganization for Optimal Time Series Data Access, Analysis, and Visualization

    NASA Astrophysics Data System (ADS)

    Rui, H.; Teng, W. L.; Strub, R.; Vollmer, B.

    2012-12-01

    The way data are archived is often not optimal for their access by many user communities (e.g., hydrological), particularly if the data volumes and/or number of data files are large. The number of data records of a non-static data set generally increases with time. Therefore, most data sets are commonly archived by time steps, one step per file, often containing multiple variables. However, many research and application efforts need time series data for a given geographical location or area, i.e., a data organization that is orthogonal to the way the data are archived. The retrieval of a time series of the entire temporal coverage of a data set for a single variable at a single data point, in an optimal way, is an important and longstanding challenge, especially for large science data sets (i.e., with volumes greater than 100 GB). Two examples of such large data sets are the North American Land Data Assimilation System (NLDAS) and Global Land Data Assimilation System (GLDAS), archived at the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC; Hydrology Data Holdings Portal, http://disc.sci.gsfc.nasa.gov/hydrology/data-holdings). To date, the NLDAS data set, hourly 0.125x0.125° from Jan. 1, 1979 to present, has a total volume greater than 3 TB (compressed). The GLDAS data set, 3-hourly and monthly 0.25x0.25° and 1.0x1.0° Jan. 1948 to present, has a total volume greater than 1 TB (compressed). Both data sets are accessible, in the archived time step format, via several convenient methods, including Mirador search and download (http://mirador.gsfc.nasa.gov/), GrADS Data Server (GDS; http://hydro1.sci.gsfc.nasa.gov/dods/), direct FTP (ftp://hydro1.sci.gsfc.nasa.gov/data/s4pa/), and Giovanni Online Visualization and Analysis (http://disc.sci.gsfc.nasa.gov/giovanni). However, users who need long time series currently have no efficient way to retrieve them. Continuing a longstanding tradition of facilitating data access, analysis, and
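
    The reorganization the abstract argues for can be sketched in a few lines (simulated arrays, not the actual NLDAS/GLDAS file format): stacking per-time-step 2D grids into a single (time, lat, lon) cube makes a full time series at one grid point a single contiguous slice instead of thousands of file reads.

```python
import numpy as np

# Hypothetical archive: one 2D grid per time step, as in time-step-per-file storage.
# Simulate 48 monthly time steps of a variable on a 10 x 20 grid.
rng = np.random.default_rng(0)
steps = [rng.random((10, 20)) for _ in range(48)]

# Reorganize: stack along time so the array is (time, lat, lon).
cube = np.stack(steps, axis=0)

# A full time series for one grid point is now a single slice.
series = cube[:, 4, 7]
print(series.shape)  # (48,)
```

    Production systems would do this with chunked formats (e.g., netCDF/HDF5 rechunking) rather than in-memory stacking, but the data-layout principle is the same.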

  1. Variance Analysis of Unevenly Spaced Time Series Data

    NASA Technical Reports Server (NTRS)

    Hackman, Christine; Parker, Thomas E.

    1996-01-01

    We have investigated the effect of uneven data spacing on the computation of delta (sub chi)(gamma). Evenly spaced simulated data sets were generated for noise processes ranging from white phase modulation (PM) to random walk frequency modulation (FM). Delta(sub chi)(gamma) was then calculated for each noise type. Data were subsequently removed from each simulated data set using typical two-way satellite time and frequency transfer (TWSTFT) data patterns to create two unevenly spaced sets with average intervals of 2.8 and 3.6 days. Delta(sub chi)(gamma) was then calculated for each sparse data set using two different approaches. First the missing data points were replaced by linear interpolation and delta (sub chi)(gamma) calculated from this now full data set. The second approach ignored the fact that the data were unevenly spaced and calculated delta(sub chi)(gamma) as if the data were equally spaced with average spacing of 2.8 or 3.6 days. Both approaches have advantages and disadvantages, and techniques are presented for correcting errors caused by uneven data spacing in typical TWSTFT data sets.
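
    The first approach described above, restoring an even grid by linear interpolation before computing spacing-sensitive statistics, can be sketched as follows (simulated data with an assumed 1-day nominal spacing, not actual TWSTFT records):

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulated clock-difference data on a daily grid.
t_full = np.arange(0.0, 100.0, 1.0)
x_full = np.cumsum(rng.normal(size=t_full.size))   # random-walk-like signal

# Remove ~60% of points to mimic sparse, unevenly spaced observations.
keep = np.sort(rng.choice(t_full.size, size=40, replace=False))
t_sparse, x_sparse = t_full[keep], x_full[keep]

# Approach 1: rebuild an even grid by linear interpolation, after which
# evenly-spaced variance statistics can be computed as usual.
t_even = np.arange(t_sparse[0], t_sparse[-1] + 1.0, 1.0)
x_even = np.interp(t_even, t_sparse, x_sparse)
```

    The second approach in the abstract simply treats `x_sparse` as if it were evenly spaced at the average interval; both, as noted, bias the result for some noise types and need correction.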

  2. Analysis of Binary Series to Evaluate Astronomical Forcing of a Middle Permian Chert Sequence in South China

    NASA Astrophysics Data System (ADS)

    Hinnov, L. A.; Yao, X.; Zhou, Y.

    2014-12-01

    We describe a Middle Permian radiolarian chert sequence in South China (Chaohu area), with the sequence of chert and mudstone layers formulated into a binary series. Two interpolation approaches were tested: linear interpolation, resulting in a "triangle" series, and staircase interpolation, resulting in a "boxcar" series. Spectral analysis of the triangle series reveals decimeter chert-mudstone cycles that represent theoretical Middle Permian 32 kyr obliquity cycling. Tuning these cycles to a 32-kyr periodicity reveals that other cm-scale cycles fall in the precession index band and have a strong ~400 kyr amplitude modulation. Additional tuning tests further support a hypothesis of astronomical forcing of the chert sequence. Analysis of the boxcar series reveals additional "eccentricity" terms transmitted by the boxcar representation of the modulating precession-scale cycles. An astronomical time scale reconstructed from these results assumes a Roadian/Wordian boundary age of 268.8 Ma for the onset of the first chert layer at the base of the sequence and ends at 264.1 Ma, for a total duration of 4.7 Myrs. We propose that monsoon-controlled upwelling contributed to the development of the chert-mudstone cycles: a seasonal monsoon controlled by astronomical forcing influenced the intensity of upwelling, modulating radiolarian productivity and silica deposition.
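
    The two encodings can be reproduced numerically (hypothetical layer depths, not the Chaohu log): a "triangle" series via linear interpolation between layer codes and a "boxcar" series via staircase interpolation, followed by a periodogram. The boxcar's sharp edges spread power into harmonics, which is how it transmits extra modulation terms.

```python
import numpy as np

# Hypothetical binary lithology log: 1 = chert, 0 = mudstone, coded at
# layer boundaries along a depth axis (cm).
depth = np.array([0, 10, 25, 30, 45, 55, 70, 80, 95, 100], dtype=float)
litho = np.array([1, 0, 1, 0, 1, 0, 1, 0, 1, 0], dtype=float)

grid = np.arange(0.0, 100.0, 1.0)

# "Triangle" series: linear interpolation between layer codes.
triangle = np.interp(grid, depth, litho)

# "Boxcar" series: staircase (previous-value) interpolation.
idx = np.searchsorted(depth, grid, side="right") - 1
boxcar = litho[idx]

# Simple spectral estimates of the two encodings.
spec_tri = np.abs(np.fft.rfft(triangle - triangle.mean())) ** 2
spec_box = np.abs(np.fft.rfft(boxcar - boxcar.mean())) ** 2
```

    In practice the series would be detrended and tapered before spectral analysis; the sketch only contrasts the two interpolation choices.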

  3. Pitfalls in Fractal Time Series Analysis: fMRI BOLD as an Exemplary Case.

    PubMed

    Eke, Andras; Herman, Peter; Sanganahalli, Basavaraju G; Hyder, Fahmeed; Mukli, Peter; Nagy, Zoltan

    2012-01-01

    This article builds on our previous work demonstrating the importance of adhering to a carefully selected set of criteria when choosing, from the available methods, one that performs adequately when applied to real temporal signals, such as fMRI BOLD, to evaluate one important facet of their behavior: fractality. Earlier, we reviewed a range of monofractal tools and evaluated their performance. Given advances in the fractal field, in this article we also discuss the most widely used implementations of multifractal analyses. Our recommended flowchart for the fractal characterization of spontaneous, low-frequency fluctuations in fMRI BOLD is used as the framework of this article, so that it provides the reader with hands-on experience in handling the perplexing issues of fractal analysis. This particular signal modality and its fractal analysis were chosen because of their high impact on today's neuroscience, as fMRI BOLD has powerfully emerged as a new way of interpreting the complex functioning of the brain (see "intrinsic activity"). The reader is first presented with the basic concepts of monofractal and multifractal time series analyses, followed by some of the most relevant implementations and their characterization by numerical approaches. The dichotomy between the fractional Gaussian noise and fractional Brownian motion signal classes, and its impact on fractal time series analyses, is thoroughly discussed as the central theme of our application strategy. Sources of pitfalls, and ways to avoid them, are identified, followed by a demonstration on fractal studies of fMRI BOLD taken from the literature and from our own work, in an attempt to consolidate best practice in the fractal analysis of empirical fMRI BOLD signals mapped throughout the brain as an exemplary case of potentially wide interest. PMID:23227008
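
    As a concrete taste of one monofractal tool of the kind reviewed here (an illustrative sketch, not the authors' recommended flowchart), detrended fluctuation analysis (DFA) estimates a scaling exponent that distinguishes signal classes: white noise, a fractional Gaussian noise with H ≈ 0.5, should yield alpha ≈ 0.5.

```python
import numpy as np

def dfa_exponent(x, scales):
    """Order-1 detrended fluctuation analysis: return the scaling exponent alpha."""
    y = np.cumsum(x - np.mean(x))           # profile (integrated series)
    flucts = []
    for s in scales:
        n = len(y) // s
        f2 = []
        for i in range(n):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)    # local linear trend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return alpha

# White noise: DFA should give alpha close to 0.5.
rng = np.random.default_rng(2)
alpha = dfa_exponent(rng.normal(size=4096), [16, 32, 64, 128, 256])
print(round(alpha, 2))  # close to 0.5
```

    Cumulatively summed (fBm-like) input would instead give alpha near 1.5, which is exactly the fGn/fBm dichotomy the abstract stresses must be resolved before choosing an analysis method.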

  5. Vapor burn analysis for the Coyote series LNG spill experiments

    SciTech Connect

    Rodean, H.C.; Hogan, W.J.; Urtiew, P.A.; Goldwire, H.C. Jr.; McRae, T.G.; Morgan, D.L. Jr.

    1984-04-01

    A major purpose of the Coyote series of field experiments at China Lake, California, in 1981 was to study the burning of vapor clouds from spills of liquefied natural gas (LNG) on water. Extensive arrays of instrumentation were deployed to obtain micrometeorological, gas concentration, and fire-related data. The instrumentation included in situ sensors of various types, high-speed motion picture cameras, and infrared (IR) imagers. Five of the total of ten Coyote spill experiments investigated vapor burns. The first vapor-burn experiment, Coyote 2, was done with a small spill of LNG to assess instrument capability and survivability in vapor cloud fires. The emphasis in this report is on the other four vapor-burn experiments: Coyotes 3, 5, 6, and 7. The data are analyzed to determine fire spread, flame propagation, and heat flux - quantities that are related to the determination of the damage zone for vapor burns. The results of the analyses are given here. 20 references, 57 figures, 7 tables.

  6. Presentations to Emergency Departments for COPD: A Time Series Analysis.

    PubMed

    Rosychuk, Rhonda J; Youngson, Erik; Rowe, Brian H

    2016-01-01

    Background. Chronic obstructive pulmonary disease (COPD) is a common respiratory condition characterized by progressive dyspnea and acute exacerbations which may result in emergency department (ED) presentations. This study examines monthly rates of presentations to EDs in one Canadian province. Methods. Presentations for COPD made by individuals aged ≥55 years during April 1999 to March 2011 were extracted from provincial databases. Data included age, sex, and health zone of residence (North, Central, South, and urban). Crude rates were calculated. Seasonal autoregressive integrated moving average (SARIMA) time series models were developed. Results. ED presentations for COPD totalled 188,824 and the monthly rate of presentation remained relatively stable (from 197.7 to 232.6 per 100,000). Males and seniors (≥65 years) comprised 52.2% and 73.7% of presentations, respectively. The SARIMA(1,0,0)×(1,0,1)12 model was appropriate for the overall rate of presentations and for each sex and for seniors. Zone-specific models showed relatively stable or decreasing rates; the North zone had an increasing trend. Conclusions. ED presentation rates for COPD have been relatively stable in Alberta during the past decade. However, the increases in northern regions deserve further exploration. The SARIMA models quantified the temporal patterns and can help in planning future health care service needs. PMID:27445514
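
    The structure a SARIMA(1,0,0)×(P,0,Q)12 model formalizes, a 12-month seasonal pattern plus month-to-month autoregressive persistence, can be sketched on synthetic data (hypothetical rates, not the Alberta counts; a real fit would use a package such as statsmodels):

```python
import numpy as np

# Simulate 12 years of monthly "presentation rates": seasonal cycle + AR(1) noise.
rng = np.random.default_rng(3)
n = 144
month = np.arange(n) % 12
season = 10 * np.sin(2 * np.pi * month / 12)
e = np.zeros(n)
for t in range(1, n):
    e[t] = 0.6 * e[t - 1] + rng.normal()
y = 215 + season + e

# Estimate seasonal means, then fit an AR(1) to the deseasonalized residuals --
# a hand-rolled stand-in for the seasonal and non-seasonal SARIMA parts.
seas_mean = np.array([y[month == m].mean() for m in range(12)])
resid = y - seas_mean[month]
phi = np.sum(resid[1:] * resid[:-1]) / np.sum(resid[:-1] ** 2)
print(0 < phi < 1)  # True: positive month-to-month persistence
```

    The estimated `phi` recovers the simulated persistence (0.6) approximately; a full SARIMA fit would estimate seasonal AR/MA terms jointly rather than by this two-step decomposition.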

  8. Recurrence network analysis of nonlinear geoscientific time series: Theoretical foundations and applications to paleoclimate

    NASA Astrophysics Data System (ADS)

    Donner, R. V.; Donges, J. F.; Kurths, J.

    2011-12-01

    In the past years, different approaches to studying time series of complex systems from a graph-theoretical point of view have been suggested by several authors. Among the proposed methods, recurrence networks have particularly proven their great potential for characterizing the structural complexity of the system under study and detecting subtle changes in the underlying dynamics. Based on the concept of recurrence plots, recurrence networks encode mutual proximity relationships in the recorded dynamical system's phase space and thus describe the structural backbone of the underlying dynamics. As a consequence, many of the local and global measures traditionally characterizing complex networks have simple geometric interpretations when being considered for recurrence networks. For example, local and global transitivity properties of recurrence networks allow defining sophisticated measures for the effective dimensionality of the system under study. Since a more regular behavior (e.g., phases of laminar or periodic behavior) is less complex from a dynamical system's perspective than fully chaotic or stochastic dynamics, the estimated values of the corresponding graph-theoretic measures (local clustering coefficient and global network transitivity, respectively) serve as easily calculable indicators for dynamic regularity. Other network quantifiers can be interpreted in similar ways. The fact that only "spatial" information is taken into account in the network construction makes recurrence networks especially robust with respect to typical problems one is confronted with in the analysis of nonlinear geoscientific time series. Specifically, since time information is not explicitly considered, recurrence networks are well applicable to time series with nonuniform sampling and/or uncertain timing of observations, which are typical features of paleoclimate records. 
As a particular example, the results of recurrence network analysis are reported for different geological
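
    The transitivity measure described above can be computed in a few lines (a minimal sketch with assumed embedding parameters and a 10%-density recurrence threshold): embed the series, threshold pairwise phase-space distances into an adjacency matrix, and take closed triplets over connected triples. Regular dynamics, such as a periodic signal, yields a highly transitive network.

```python
import numpy as np

def recurrence_network_transitivity(x, dim=3, tau=1, eps=None):
    """Build a recurrence network from a scalar series and return its transitivity."""
    # Time-delay embedding into phase space.
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    if eps is None:
        eps = np.percentile(d, 10)         # recurrence threshold: densest 10% of pairs
    a = (d <= eps).astype(float)
    np.fill_diagonal(a, 0.0)               # no self-loops
    # Transitivity = closed triplets / connected triples.
    triangles = np.trace(a @ a @ a)
    deg = a.sum(axis=1)
    triples = np.sum(deg * (deg - 1))
    return triangles / triples if triples > 0 else 0.0

# A periodic signal (regular dynamics) gives high transitivity.
t = np.linspace(0, 8 * np.pi, 400)
print(recurrence_network_transitivity(np.sin(t)) > 0.5)
```

    Note that only phase-space proximity enters the construction, which is why, as the abstract argues, the method tolerates nonuniform sampling and uncertain timing.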

  9. A Predictive Analysis Approach to Adaptive Testing.

    ERIC Educational Resources Information Center

    Kirisci, Levent; Hsu, Tse-Chi

    The predictive analysis approach to adaptive testing originated in the idea of statistical predictive analysis suggested by J. Aitchison and I.R. Dunsmore (1975). The adaptive testing model proposed is based on parameter-free predictive distribution. Aitchison and Dunsmore define statistical prediction analysis as the use of data obtained from an…

  10. Intra-cholecystic approach for laparoscopic management of Mirizzi's syndrome: A case series

    PubMed Central

    Nag, Hirdaya H.; Gangadhara, Vageesh Bettageri; Dangi, Amit

    2016-01-01

    INTRODUCTION: Laparoscopic management of patients with Mirizzi's syndrome (MS) is not routinely recommended due to the high risk of iatrogenic complications. PATIENTS AND METHODS: Intra-cholecystic (IC) or inside-gall bladder (GB) approach was used for laparoscopic management of 16 patients with MS at a tertiary care referral centre in North India from May 2010 to August 2014; a retrospective analysis of prospectively collected data was performed. RESULTS: Mean age was 40.1 ± 14.7 years, the male-to-female ratio was 1:3, and 9 (56.25%) patients had type 1 MS (MS1) and 7 (43.75%) had type 2 MS (MS2) (McSherry's classification). The laparoscopic intra-cholecystic approach (LICA) was successful in 11 (68.75%) patients, whereas 5 patients (31.25%) required conversion to open method. Median blood loss was 100 mL (range: 50-400 mL), and median duration of surgery was 3.25 h (range: 2-7.5 h). No major complications were encountered except 1 patient (6.5%) who required re-operation for retained bile duct stones. The final histopathology report was benign in all the patients. No remote complications were noted during a mean follow-up of 20.18 months. CONCLUSION: LICA is a feasible and safe approach for selected patients with Mirizzi's syndrome; however, a low threshold for conversion is necessary to avoid iatrogenic complications. PMID:27251843

  11. Student Involvement. Analysis and Bibliography Series, No. 14.

    ERIC Educational Resources Information Center

    Armstrong, Ronald

    Intended primarily for educational administrators, this review presents an analysis of the literature concerning student participation in educational decisionmaking. The educational and legal ramifications of student involvement in several decisionmaking spheres, such as school board and committee membership, student government, extracurricular…

  12. Philosophy: Discipline Analysis. Women in the Curriculum Series.

    ERIC Educational Resources Information Center

    Nye, Andrea

    This essay examines the ways in which philosophy, as a discipline, has been influenced by feminist scholarship in the field. It explains that in the 1970s feminist philosophers introduced questions regarding personal life and sexuality as matters for philosophical analysis, and that scholars began to challenge the notions of the Western canon.…

  13. Educational Attainment: Analysis by Immigrant Generation. IZA Discussion Paper Series.

    ERIC Educational Resources Information Center

    Chiswick, Barry R.; DebBurman, Noyna

    This paper presents a theoretical and empirical analysis of the largely ignored issue of the determinants of the educational attainment of adults by immigrant generation. Using Current Population Survey (CPS) data, differences in educational attainment are analyzed by immigrant generation (first, second, and higher order generations), and among…

  14. Estimating Reliability of Disturbances in Satellite Time Series Data Based on Statistical Analysis

    NASA Astrophysics Data System (ADS)

    Zhou, Z.-G.; Tang, P.; Zhou, M.

    2016-06-01

    Normally, the status of land cover is inherently dynamic and changes continuously on a temporal scale. However, disturbances or abnormal changes of land cover, caused by events such as forest fire, flood, deforestation, and plant diseases, occur worldwide at unknown times and locations. Timely detection and characterization of these disturbances is of importance for land cover monitoring. Recently, many time-series-analysis methods have been developed for near real-time or online disturbance detection using satellite image time series. However, most present methods only label detection results with "Change/No change", while few focus on estimating the reliability (or confidence level) of the detected disturbances in image time series. To this end, this paper proposes a statistical analysis method for estimating the reliability of disturbances in newly available remote sensing image time series, through analysis of the full temporal information contained in the time series data. The method consists of three main steps: (1) segmenting and modelling historical time series data based on Breaks for Additive Seasonal and Trend (BFAST); (2) forecasting and detecting disturbances in new time series data; (3) estimating the reliability of each detected disturbance using statistical analysis based on Confidence Intervals (CI) and Confidence Levels (CL). The method was validated by estimating the reliability of disturbance regions caused by a recent severe flood that occurred around the border of Russia and China. Results demonstrated that the method can estimate the reliability of disturbances detected in satellite images with an estimation error of less than 5% and an overall accuracy of up to 90%.
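
    The forecast-and-grade idea (steps 2 and 3) can be sketched without BFAST itself (synthetic NDVI-like data, assumed harmonic season-plus-trend model): fit the history, forecast the next epoch, and express a new observation's departure in units of the residual scatter, which maps directly to a confidence level.

```python
import numpy as np

rng = np.random.default_rng(4)
t_hist = np.arange(120)                        # 10 years of monthly history
y_hist = (0.5 + 0.001 * t_hist
          + 0.1 * np.sin(2 * np.pi * t_hist / 12)
          + rng.normal(scale=0.02, size=120))

# Season + trend regression on the historical segment.
X = np.column_stack([np.ones(120), t_hist,
                     np.sin(2 * np.pi * t_hist / 12),
                     np.cos(2 * np.pi * t_hist / 12)])
beta, *_ = np.linalg.lstsq(X, y_hist, rcond=None)
sigma = np.std(y_hist - X @ beta)

# New observation: a flood-like drop at t = 120.
t_new, y_new = 120, 0.25
x_new = np.array([1.0, t_new, np.sin(2 * np.pi * t_new / 12),
                  np.cos(2 * np.pi * t_new / 12)])
z = abs(y_new - x_new @ beta) / sigma          # standardized departure
# z beyond 1.96 corresponds to detection at >95% confidence.
print(z > 1.96)
```

    BFAST additionally segments the history at structural breaks before fitting, so the model is estimated only on the stable segment; the confidence-interval grading shown here is the reliability step.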

  15. New Models for Forecasting Enrollments: Fuzzy Time Series and Neural Network Approaches.

    ERIC Educational Resources Information Center

    Song, Qiang; Chissom, Brad S.

    Since university enrollment forecasting is very important, many different methods and models have been proposed by researchers. Two new methods for enrollment forecasting are introduced: (1) the fuzzy time series model; and (2) the artificial neural networks model. Fuzzy time series has been proposed to deal with forecasting problems within a…

  16. Characterizing rainfall of hot arid region by using time-series modeling and sustainability approaches: a case study from Gujarat, India

    NASA Astrophysics Data System (ADS)

    Machiwal, Deepesh; Kumar, Sanjay; Dayal, Devi

    2016-05-01

    This study aimed at characterizing rainfall dynamics in a hot arid region of Gujarat, India by employing time-series modeling techniques and a sustainability approach. Five characteristics, i.e., normality, stationarity, homogeneity, presence/absence of trend, and persistence, of the 34-year (1980-2013) annual rainfall time series of ten stations were identified/detected by applying multiple parametric and non-parametric statistical tests. Furthermore, the study proposes, as a novelty, a sustainability concept for evaluating rainfall time series, and demonstrates the concept, for the first time, by identifying the most sustainable rainfall series following a reliability (Ry), resilience (Re), and vulnerability (Vy) approach. Box-whisker plots, normal probability plots, and histograms indicated that the annual rainfall of Mandvi and Dayapar stations is relatively more positively skewed and non-normal compared with that of other stations, due to the presence of severe outliers and extremes. Results of the Shapiro-Wilk test and Lilliefors test revealed that the annual rainfall series of all stations significantly deviate from the normal distribution. Two parametric t tests and the non-parametric Mann-Whitney test indicated significant non-stationarity in the annual rainfall of Rapar station, where the rainfall was also found to be non-homogeneous based on the results of four parametric homogeneity tests. Four trend tests indicated significantly increasing rainfall trends at Rapar and Gandhidham stations. The autocorrelation analysis suggested the presence of statistically significant persistence in the rainfall series of Bhachau (3-year time lag), Mundra (1- and 9-year time lag), Nakhatrana (9-year time lag), and Rapar (3- and 4-year time lag). Results of the sustainability approach indicated that the annual rainfall of Mundra and Naliya stations (Ry = 0.50 and 0.44; Re = 0.47 and 0.47; Vy = 0.49 and 0.46, respectively) are the most sustainable and dependable
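
    The abstract does not name its four trend tests, but the Mann-Kendall test is the standard non-parametric choice for annual rainfall series, and it is compact enough to sketch (hypothetical rainfall values; ties are ignored in this minimal version):

```python
import math

def mann_kendall_z(x):
    """Mann-Kendall trend test: standardized statistic Z (no ties assumed)."""
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        return (s - 1) / math.sqrt(var_s)
    if s < 0:
        return (s + 1) / math.sqrt(var_s)
    return 0.0

# Hypothetical annual rainfall (mm) with a rising trend:
# |Z| > 1.96 rejects "no trend" at the 5% level.
rain = [310, 295, 330, 340, 325, 360, 355, 380, 370, 400]
print(mann_kendall_z(rain) > 1.96)
```

    Because the test uses only the signs of pairwise differences, it is insensitive to the skewness and outliers that the abstract reports for several stations.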

  17. DAHITI - An Innovative Approach for Estimating Water Level Time Series over Inland Water using Multi-Mission Satellite Altimetry

    NASA Astrophysics Data System (ADS)

    Schwatke, Christian; Dettmering, Denise

    2016-04-01

    Satellite altimetry has been designed for sea level monitoring over open ocean areas. However, for some years, this technology has also been used to retrieve water levels from lakes, reservoirs, rivers, wetlands and in general any inland water body. In this contribution, a new approach for the estimation of inland water level time series is presented. The method is the basis for the computation of time series of rivers and lakes available through the web service 'Database for Hydrological Time Series over Inland Water' (DAHITI). It is based on an extended outlier rejection and a Kalman filter approach incorporating cross-calibrated multi-mission altimeter data from Envisat, ERS-2, Jason-1, Jason-2, Topex/Poseidon, and SARAL/AltiKa, including their uncertainties. The new approach yields RMS differences with respect to in situ data between 4 cm and 36 cm for lakes and 8 cm and 114 cm for rivers, respectively. Within this presentation, the new approach will be introduced and examples for water level time series for a variety of lakes and rivers will be shown featuring different characteristics such as shape, lake extent, river width, and data coverage. A comprehensive validation is performed by comparisons with in situ gauge data and results from external inland altimeter databases.
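
    The outlier-rejection-plus-Kalman-filter idea can be sketched on synthetic data (a hedged stand-in with assumed noise levels, not the operational DAHITI algorithm): discard gross outliers, then smooth the retained multi-mission heights with a random-walk Kalman filter.

```python
import numpy as np

rng = np.random.default_rng(5)
truth = 100 + 0.5 * np.sin(np.linspace(0, 4 * np.pi, 200))   # water level (m)
obs = truth + rng.normal(scale=0.15, size=200)               # altimeter heights
obs[50] += 3.0                                               # a gross outlier

# Crude outlier rejection: discard points far from the series median.
good = np.abs(obs - np.median(obs)) < 1.5

# Random-walk Kalman filter over the retained observations.
q, r = 5e-3, 0.15 ** 2           # process and measurement variances (assumed)
x, p = obs[good][0], 1.0
est = []
for z in obs[good]:
    p += q                        # predict
    k = p / (p + r)               # Kalman gain
    x += k * (z - x)              # update
    p *= (1 - k)
    est.append(x)
est = np.array(est)
rms = np.sqrt(np.mean((est - truth[good]) ** 2))
print(rms < 0.15)                 # smoother than the raw 0.15 m noise
```

    The operational system additionally cross-calibrates missions and propagates per-observation uncertainties into `r`; the sketch shows only the filtering backbone.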

  18. Modeling Dyadic Processes Using Hidden Markov Models: A Time Series Approach to Mother-Infant Interactions during Infant Immunization

    ERIC Educational Resources Information Center

    Stifter, Cynthia A.; Rovine, Michael

    2015-01-01

    The focus of the present longitudinal study, to examine mother-infant interaction during the administration of immunizations at 2 and 6 months of age, used hidden Markov modelling, a time series approach that produces latent states to describe how mothers and infants work together to bring the infant to a soothed state. Results revealed a…

  19. Models for Planning. Analysis of Literature and Selected Bibliography. Analysis and Bibliography Series, No. 5.

    ERIC Educational Resources Information Center

    ERIC Clearinghouse on Educational Management, Eugene, OR.

    This review analyzes current research trends in the application of planning models to broad educational systems. Planning models reviewed include systems approach models, simulation models, operational gaming, linear programing, Markov chain analysis, dynamic programing, and queuing techniques. A 77-item bibliography of recent literature is…

  20. Analysis of the temporal properties in car accident time series

    NASA Astrophysics Data System (ADS)

    Telesca, Luciano; Lovallo, Michele

    2008-05-01

    In this paper we study the time-clustering behavior of sequences of car accidents, using data from a freely available database on the Internet. The Allan Factor analysis, which is a well-suited method for investigating time-dynamical behaviors in point processes, has revealed that the car accident sequences are characterized by a general time-scaling behavior, with the presence of cyclic components. These results indicate that the time dynamics of the events are not Poissonian but long-range correlated, with periodicities ranging from 12 h to 1 year.
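
    The Allan Factor itself is simple to compute: count events in adjacent windows of length T and take AF(T) = E[(N(k+1) − N(k))²] / (2·E[N(k)]). A homogeneous Poisson process gives AF(T) ≈ 1 at every scale, which is the baseline the accident sequences exceed. A minimal sketch on simulated Poisson events:

```python
import numpy as np

def allan_factor(event_times, window):
    """Allan Factor of a point process at counting-window length `window`."""
    edges = np.arange(0.0, event_times.max(), window)
    counts, _ = np.histogram(event_times, bins=edges)
    diffs = np.diff(counts)
    return np.mean(diffs ** 2) / (2.0 * np.mean(counts))

# Homogeneous Poisson process: AF(T) should stay near 1.
rng = np.random.default_rng(6)
poisson_events = np.cumsum(rng.exponential(1.0, size=20000))
af = allan_factor(poisson_events, window=50.0)
print(abs(af - 1.0) < 0.2)
```

    Evaluating AF over a range of window lengths and finding a power-law rise with T is the time-scaling signature reported in the abstract.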

  1. Analysis of cedar pollen time series: no evidence of low-dimensional chaotic behavior.

    PubMed

    Delaunay, J-J; Konishi, R; Seymour, C

    2006-01-01

    Much of the current interest in pollen time series analysis is motivated by the possibility that pollen series arise from low-dimensional chaotic systems. If this is the case, short-range prediction using nonlinear modeling is justified and would produce high-quality forecasts that could be useful in providing pollen alerts to allergy sufferers. To date, contradictory reports about the characterization of the dynamics of pollen series can be found in the literature. Pollen series have been alternatively described as featuring and not featuring deterministic chaotic behavior. We showed that the choice of test for detection of deterministic chaos in pollen series is difficult because pollen series exhibit [see text] power spectra. This is a characteristic that is also produced by colored noise series, which mimic deterministic chaos in most tests. We proposed to apply the Ikeguchi-Aihara test to properly detect the presence of deterministic chaos in pollen series. We examined the dynamics of cedar (Cryptomeria japonica) hourly pollen series by means of the Ikeguchi-Aihara test and concluded that these pollen series cannot be described as low-dimensional deterministic chaos. Therefore, the application of low-dimensional chaotic deterministic models to the prediction of short-range pollen concentration will not result in high-accuracy pollen forecasts even though these models may provide useful forecasts for certain applications. We believe that our conclusion can be generalized to pollen series from other wind-pollinated plant species, as wind speed, the forcing parameter of the pollen emission and transport, is best described as a nondeterministic series that originates in the high dimensionality of the atmosphere.

  2. Mapping mountain pine beetle mortality through growth trend analysis of time-series landsat data

    USGS Publications Warehouse

    Liang, Lu; Chen, Yanlei; Hawbaker, Todd J.; Zhu, Zhi-Liang; Gong, Peng

    2014-01-01

    Disturbances are key processes in the carbon cycle of forests and other ecosystems. In recent decades, mountain pine beetle (MPB; Dendroctonus ponderosae) outbreaks have become more frequent and extensive in western North America. Remote sensing has the ability to fill the data gaps of long-term infestation monitoring, but the elimination of observational noise and attributing changes quantitatively are two main challenges in its effective application. Here, we present a forest growth trend analysis method that integrates Landsat temporal trajectories and decision tree techniques to derive annual forest disturbance maps over an 11-year period. The temporal trajectory component successfully captures the disturbance events as represented by spectral segments, whereas decision tree modeling efficiently recognizes and attributes events based upon the characteristics of the segments. Validated against a point set sampled across a gradient of MPB mortality, 86.74% to 94.00% overall accuracy was achieved with small variability in accuracy among years. In contrast, the overall accuracies of single-date classifications ranged from 37.20% to 75.20% and only became comparable with our approach when the training sample size was increased at least four-fold. This demonstrates that a key advantage of this time series workflow is its small training-sample requirement. The easily understandable, interpretable, and modifiable characteristics of our approach suggest that it could be applicable to other ecoregions.

  3. PX series AMTEC cell design, testing and analysis

    SciTech Connect

    Borkowski, C.A.; Sievers, R.K.; Hendricks, T.J.

    1997-12-31

    PX (Pluto Express) cell testing and analysis has shown that AMTEC (Alkali Metal Thermal to Electric Conversion) cells can reach the power levels required by proposed RPS (Radioisotope Power Supply) system designs. A major PX cell design challenge was to optimize the power and efficiency of the cell while allowing a broad operational power range. These design optimization issues are greatly dependent on the placement of the evaporation zone. Before the PX-2 and PX-4 cells were built, the results from the PX-1, ATC-2 (artery test cell) and design analysis indicated the need for a thermal bridge between the heat input surface of the cell and the structure supporting the evaporation zone. Test and analytic results are presented illustrating the magnitude of the power transfer to the evaporation zone and the effect of this power transfer on the performance of the cell. Comparisons are also made between the cell test data and analytic results of cell performance to validate the analytic models.

  4. Association mechanism between a series of rodenticide and humic acid: a frontal analysis to support the biological data.

    PubMed

    André, Claire; Guyon, Catherine; Thomassin, Mireille; Barbier, Alexandre; Richert, Lysiane; Guillaume, Yves-Claude

    2005-06-01

    The binding constants (K) of a series of anticoagulant rodenticides with the main soil organic component, humic acid (HA), were determined using a frontal analysis approach. The order of the binding constants was identical to that obtained in a previous paper [J. Chromatogr. B 813 (2004) 295], i.e. bromadiolone>brodifacoum>difenacoum>chlorophacinone>diphacinone, confirming the power of this frontal analysis approach for the determination of binding constants. Moreover, and for the first time, the concentration of rodenticide unbound to HA could be determined. Thanks to this approach, we could clearly demonstrate that HA protected the human hepatoma cell line HepG2 against the cytotoxicity of all the rodenticides tested and that the toxicity of the rodenticides was directly linked to the free rodenticide fraction in the medium (i.e. rodenticide unbound to HA). PMID:15866487

  5. Characterization of Ground Displacement Sources from Variational Bayesian Independent Component Analysis of Space Geodetic Time Series

    NASA Astrophysics Data System (ADS)

    Gualandi, Adriano; Serpelloni, Enrico; Elina Belardinelli, Maria; Bonafede, Maurizio; Pezzo, Giuseppe; Tolomei, Cristiano

    2015-04-01

    A critical point in the analysis of ground displacement time series, such as those measured by modern space geodetic techniques (primarily continuous GPS/GNSS and InSAR), is the development of data-driven methods that can discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while preserving most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also has shortcomings: PCA does not perform well at solving the so-called Blind Source Separation (BSS) problem. Recovering and separating the different sources that generate the observed ground deformation is a fundamental task for attaching a physical meaning to each source. PCA fails at the BSS problem because it looks for a new Euclidean space where the projected data are uncorrelated. The uncorrelation condition is usually not strong enough, and it has been shown that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all the fields where PCA is also applied. An ICA approach enables us to explain the displacement time series while imposing fewer constraints on the model, and to reveal anomalies in the data such as transient deformation signals. However, the independence condition is not easy to impose, and it is often necessary to introduce approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mixture of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources.
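
The abstract's method is a variational Bayesian ICA; as a rough illustration of why independence succeeds where decorrelation alone does not, here is a minimal fixed-point (FastICA-style) separation of two synthetic mixed signals using only NumPy. The sources, mixing matrix, and tanh contrast below are arbitrary choices, not the vbICA of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
# Two independent, non-Gaussian "sources" (e.g., transient vs. background)
S = np.c_[np.sign(np.sin(np.linspace(0, 40, n))),  # square-ish wave
          rng.uniform(-1, 1, n)]                   # uniform source
A = np.array([[1.0, 0.6], [0.5, 1.0]])             # mixing ("station response")
X = S @ A.T

# PCA step: whiten so the components are uncorrelated with unit variance
Xc = X - X.mean(0)
d, E = np.linalg.eigh(Xc.T @ Xc / n)
Z = (Xc @ E) / np.sqrt(d)

# Fixed-point ICA with deflation and tanh contrast: seek *independent* axes
W = np.zeros((2, 2))
for i in range(2):
    w = rng.normal(size=2)
    w /= np.linalg.norm(w)
    for _ in range(300):
        wx = Z @ w
        g, gp = np.tanh(wx), 1.0 - np.tanh(wx) ** 2
        w_new = (Z * g[:, None]).mean(0) - gp.mean() * w
        w_new -= W[:i].T @ (W[:i] @ w_new)  # stay orthogonal to found rows
        w_new /= np.linalg.norm(w_new)
        converged = abs(abs(w_new @ w) - 1.0) < 1e-10
        w = w_new
        if converged:
            break
    W[i] = w
S_hat = Z @ W.T  # recovered sources (up to order, sign and scale)
```

The whitened (PCA) components `Z` are uncorrelated but still mix the two sources; the ICA rotation recovers them up to permutation, sign and scale.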

  6. Time-series analysis of networks: Exploring the structure with random walks

    NASA Astrophysics Data System (ADS)

    Weng, Tongfeng; Zhao, Yi; Small, Michael; Huang, Defeng David

    2014-08-01

    We generate time series from scale-free networks based on a finite-memory random walk traversing the network. These time series reveal topological and functional properties of networks via their temporal correlations. Remarkably, networks with different node-degree mixing patterns exhibit distinct self-similar characteristics. In particular, assortative networks are transformed into time series with long-range correlation, while disassortative networks are transformed into time series exhibiting anticorrelation. These relationships are consistent across a diverse variety of real networks. Moreover, we show that multiscale analysis of these time series can describe and classify various physical networks ranging from social and technological to biological networks according to their functional origin. These results suggest that there is a unified dynamical mechanism that governs the structural organization of many seemingly different networks.
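
A minimal pure-Python sketch of the idea: grow a preferential-attachment (scale-free) network and record the degree sequence visited by a random walk. The memoryless walk and degree read-out here are simplifications of the paper's finite-memory walk:

```python
import random

random.seed(42)

def ba_graph(n, m=2):
    """Barabasi-Albert preferential attachment: each new node links to m
    existing nodes chosen roughly in proportion to their current degree."""
    adj = {i: set() for i in range(n)}
    repeated = []              # each node appears once per edge endpoint
    targets = list(range(m))
    for v in range(m, n):
        for t in targets:
            adj[v].add(t)
            adj[t].add(v)
        repeated.extend(targets)
        repeated.extend([v] * m)
        targets = random.sample(repeated, m)  # degree-biased; may collide
    return adj

def walk_degree_series(adj, steps, start=0):
    """Time series: the degree of each node visited by a random walk."""
    series, node = [], start
    for _ in range(steps):
        node = random.choice(sorted(adj[node]))
        series.append(len(adj[node]))
    return series

adj = ba_graph(200)
ts = walk_degree_series(adj, 1000)
```

On such a network the walk repeatedly revisits hubs, so the degree series inherits the network's topology; the paper's analysis then examines the temporal correlations of exactly this kind of series.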

  7. Engine Control Improvement through Application of Chaotic Time Series Analysis

    SciTech Connect

    Green, J.B., Jr.; Daw, C.S.

    2003-07-15

    The objective of this program was to investigate cyclic variations in spark-ignition (SI) engines under lean fueling conditions and to develop options to reduce emissions of nitrogen oxides (NOx) and particulate matter (PM) in compression-ignition direct-injection (CIDI) engines at high exhaust gas recirculation (EGR) rates. The CIDI activity builds upon an earlier collaboration between ORNL and Ford examining combustion instabilities in SI engines. Under the original CRADA, the principal objective was to understand the fundamental causes of combustion instability in spark-ignition engines operating with lean fueling. The results of this earlier activity demonstrated that such combustion instabilities are dominated by the effects of residual gas remaining in each cylinder from one cycle to the next. A very simple, low-order model was developed that explained the observed combustion instability as a noisy nonlinear dynamical process. The model concept led to the development of a real-time control strategy that could be employed to significantly reduce cyclic variations in real engines using existing sensors and engine control systems. This collaboration led to the issuance of a joint patent for spark-ignition engine control. After a few years, the CRADA was modified to focus more on EGR and CIDI engines. The modified CRADA examined relationships between EGR, combustion, and emissions in CIDI engines. Information from CIDI engine experiments, data analysis, and modeling was employed to identify and characterize new combustion regimes where it is possible to simultaneously achieve significant reductions in NOx and PM emissions. These results were also used to develop an on-line combustion diagnostic (virtual sensor) to make cycle-resolved combustion quality assessments for active feedback control. Extensive experiments on engines at Ford and ORNL led to the development of the virtual sensor concept, which may be able to detect simultaneous reductions in NOx and PM.

  8. Singular spectrum analysis and Fisher-Shannon analysis of spring flow time series: An application to Anjar Spring, Lebanon

    NASA Astrophysics Data System (ADS)

    Telesca, Luciano; Lovallo, Michele; Shaban, Amin; Darwich, Talal; Amacha, Nabil

    2013-09-01

    In this study, the time dynamics of the water flow from Anjar Spring, one of the major springs in the central part of Lebanon, were investigated. Like many water sources in Lebanon, this spring has no continuous discharge records, which prevents the application of standard time series analysis tools. Furthermore, the highly nonstationary character of the series means that suitable methodologies must be employed to gain insight into its dynamical features. Therefore, Singular Spectrum Analysis (SSA) and the Fisher-Shannon (FS) method, which are useful for disclosing dynamical features in noisy nonstationary time series with gaps, were jointly applied to analyze the Anjar Spring water flow series. The SSA revealed that the series can be considered the superposition of meteo-climatic periodic components, a low-frequency trend and noise-like high-frequency fluctuations. The FS method made it possible to extract and identify, among all the SSA-reconstructed components, the long-term trend of the series. The long-term trend is characterized by a higher Fisher Information Measure (FIM) and lower Shannon entropy, and thus represents the main informative component of the whole series. Water discharge time series generally present a very complex temporal structure, so the joint application of SSA and the FS method should be very useful in disclosing the main informative part of such data series in view of existing climatic variability and/or anthropogenic pressures.
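
The SSA step can be sketched in a few lines of NumPy: embed the series in a trajectory (Hankel) matrix, take its SVD, and diagonal-average selected components back into series. The synthetic discharge-like series and window length below are illustrative, not the Anjar data:

```python
import numpy as np

rng = np.random.default_rng(1)
n, L = 240, 60                        # series length, SSA window
t = np.arange(n)
trend = 0.02 * t                      # slow low-frequency component
x = trend + np.sin(2 * np.pi * t / 12) + 0.3 * rng.normal(size=n)

# Embedding: trajectory (Hankel) matrix of lagged windows, then SVD
K = n - L + 1
X = np.column_stack([x[i:i + L] for i in range(K)])
U, s, Vt = np.linalg.svd(X, full_matrices=False)

def reconstruct(comps):
    """Diagonal-average the selected rank-1 terms back into a series."""
    Xr = (U[:, comps] * s[comps]) @ Vt[comps]
    out, cnt = np.zeros(n), np.zeros(n)
    for j in range(K):
        out[j:j + L] += Xr[:, j]
        cnt[j:j + L] += 1.0
    return out / cnt

trend_hat = reconstruct([0])          # leading component ~ low-frequency trend
```

Groups of further components recover the periodic (here, period-12) part, leaving noise-like residuals; the FS step of the paper then ranks the reconstructed components by informativeness.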

  9. A statistical method for the analysis of nonlinear temperature time series from compost.

    PubMed

    Yu, Shouhai; Clark, O Grant; Leonard, Jerry J

    2008-04-01

    Temperature is widely accepted as a critical indicator of aerobic microbial activity during composting but, to date, little effort has been made to devise an appropriate statistical approach for the analysis of temperature time series. Nonlinear, time-correlated effects have not previously been considered in the statistical analysis of temperature data from composting, despite their importance and ubiquity. A novel mathematical model is proposed here, based on a modified Gompertz function, which includes nonlinear, time-correlated effects. Methods are shown for estimating initial values of the model parameters. Algorithms in SAS are used to fit the model to different sets of temperature data from passively aerated compost. Methods are then shown for testing the goodness-of-fit of the model to data. Next, a method is described to determine, in a statistically rigorous manner, the significance of differences among the time-correlated characteristics of the datasets as described by the proposed model; an extra-sum-of-squares method was selected for this purpose. Finally, the model and methods are used to analyze a sample dataset and are shown to be useful tools for the statistical comparison of temperature data in composting. PMID:17997302
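
The paper fits its modified Gompertz model in SAS; the basic idea can be sketched in NumPy by profiling the nonlinear parameters on a grid and solving the linear ones by least squares. The functional form, parameter names and synthetic data below are illustrative, not those of the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 20, 80)                       # days of composting

def gompertz(t, T0, A, k, tl):
    """Gompertz-type temperature curve (illustrative parametrization):
    baseline T0, amplitude A, rate k, lag tl."""
    return T0 + A * np.exp(-np.exp(-k * (t - tl)))

y = gompertz(t, 15.0, 45.0, 0.6, 5.0) + 0.5 * rng.normal(size=t.size)

# Profile the nonlinear parameters (k, tl) on a grid; for each pair the
# offset T0 and amplitude A enter linearly and are solved by least squares.
best_sse, best = np.inf, None
for k in np.linspace(0.2, 1.2, 41):
    for tl in np.linspace(1.0, 10.0, 46):
        f = np.exp(-np.exp(-k * (t - tl)))
        B = np.c_[np.ones_like(t), f]
        coef, *_ = np.linalg.lstsq(B, y, rcond=None)
        sse = float(((B @ coef - y) ** 2).sum())
        if sse < best_sse:
            best_sse, best = sse, (coef[0], coef[1], k, tl)
T0_hat, A_hat, k_hat, tl_hat = best
```

The grid estimates make good initial values for a proper nonlinear least-squares fit, which is the role the paper's initial-value methods play before fitting in SAS.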

  10. The Use of Scaffolding Approach to Enhance Students' Engagement in Learning Structural Analysis

    ERIC Educational Resources Information Center

    Hardjito, Djwantoro

    2010-01-01

    This paper presents a reflection on the use of Scaffolding Approach to engage Civil Engineering students in learning Structural Analysis subjects. In this approach, after listening to the lecture on background theory, students are provided with a series of practice problems, each one comes with the steps, formulas, hints, and tables needed to…

  11. Information Retrieval and Graph Analysis Approaches for Book Recommendation.

    PubMed

    Benkoussas, Chahinez; Bellot, Patrice

    2015-01-01

    A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex user queries. We used different theoretical retrieval models: probabilistic models such as InL2 (a Divergence from Randomness model) and a language model, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval approach to a related-document network built from social links. We call a network constructed from documents and the social information provided by each of them a Directed Graph of Documents (DGD). Specifically, this work tackles the problem of book recommendation in the context of the INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrates that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments.
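
For reference, PageRank itself is a short power iteration; a toy version over a DGD-like directed graph (the graph below is made up; real DGD edges come from the social links described in the paper):

```python
def pagerank(adj, d=0.85, iters=100):
    """Power-iteration PageRank on an adjacency dict {node: [out-links]}."""
    nodes = list(adj)
    n = len(nodes)
    rank = {u: 1.0 / n for u in nodes}
    for _ in range(iters):
        new = {u: (1 - d) / n for u in nodes}
        for u in nodes:
            out = adj[u]
            if out:
                share = d * rank[u] / len(out)
                for v in out:
                    new[v] += share
            else:                       # dangling node: spread rank evenly
                for v in nodes:
                    new[v] += d * rank[u] / n
        rank = new
    return rank

# Toy "Directed Graph of Documents": edges stand in for social links
dgd = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
pr = pagerank(dgd)
```

Document "c", which receives the most in-links, ends up with the highest score; in the reranking setting, such scores are interpolated with the retrieval-model scores.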

  12. Bioelectric signal classification using a recurrent probabilistic neural network with time-series discriminant component analysis.

    PubMed

    Hayashi, Hideaki; Shima, Keisuke; Shibanoki, Taro; Kurita, Yuichi; Tsuji, Toshio

    2013-01-01

    This paper outlines a probabilistic neural network developed on the basis of time-series discriminant component analysis (TSDCA) that can be used to classify high-dimensional time-series patterns. TSDCA involves the compression of high-dimensional time series into a lower-dimensional space using a set of orthogonal transformations, and the calculation of posterior probabilities based on a continuous-density hidden Markov model that incorporates a Gaussian mixture model expressed in the reduced-dimensional space. The analysis can be incorporated into a neural network so that the parameters can be obtained as network coefficients using a backpropagation-through-time training algorithm. The network is designed to enable high-accuracy classification of high-dimensional time-series patterns and to reduce the computation time needed for network training. In the experiments conducted during the study, the validity of the proposed network was demonstrated for EEG signals.

  13. CCD Observing and Dynamical Time Series Analysis of Active Galactic Nuclei.

    NASA Astrophysics Data System (ADS)

    Nair, Achotham Damodaran

    1995-01-01

    The properties, operation and observing procedures of the Charge Coupled Device (CCD) on the 30" telescope at Rosemary Hill Observatory (RHO) are discussed, together with the details of data reduction. Several nonlinear techniques of time series analysis, based on the behavior of the nearest neighbors, have been used to analyze the time series of the quasar 3C 345. A technique using Artificial Neural Networks, based on prediction of the time series, is used to study the dynamical properties of 3C 345. Finally, a heuristic model for the variability of Active Galactic Nuclei is discussed.

  14. Approaches to remote sensing data analysis

    USGS Publications Warehouse

    Pettinger, Lawrence R.

    1978-01-01

    Objectives: To present an overview of the essential steps in the remote sensing data analysis process, and to compare and contrast manual (visual) and automated analysis methods. Rationale: This overview is intended to provide a framework for choosing a manual or digital analysis approach to collecting resource information. It can also be used as a basis for understanding and evaluating invited papers and poster sessions during the Symposium.

  15. Providing web-based tools for time series access and analysis

    NASA Astrophysics Data System (ADS)

    Eberle, Jonas; Hüttich, Christian; Schmullius, Christiane

    2014-05-01

    Time series information is widely used in environmental change analyses and is also essential information for stakeholders and governmental agencies. A challenging issue, however, is the processing of raw data and the execution of time series analysis. In most cases, data have to be found, downloaded, processed and even converted to the correct data format before time series analysis tools can be executed, and prepared for use in different existing software packages. Several packages, such as TIMESAT (Jönsson & Eklundh, 2004) for phenological studies, BFAST (Verbesselt et al., 2010) for breakpoint detection, and GreenBrown (Forkel et al., 2013) for trend calculations, are provided as open-source software and can be executed from the command line, which is necessary if data pre-processing and time series analysis are to be automated. To bring both parts together, automated data access and data analysis, a web-based system was developed that provides access to satellite-based time series data and to the analysis tools mentioned above. Users of the web portal specify a point or a polygon and an available dataset (e.g., Vegetation Indices and Land Surface Temperature datasets from NASA MODIS). The data are then processed and provided as a time series CSV file. Afterwards the user can select an analysis tool that is executed on the server. The final data (CSV, plot images, GeoTIFFs) are visualized in the web portal and can be downloaded for further use. As a first use case, we built a complementary web-based system with NASA MODIS products for Germany and parts of Siberia based on the Earth Observation Monitor (www.earth-observation-monitor.net). The aim of this work is to make time series analysis with existing tools as easy as possible, so that users can focus on interpreting the results. References: Jönsson, P. and L. Eklundh (2004). TIMESAT - a program for analysing time-series of satellite sensor data. Computers and Geosciences 30.

  16. An Interactive Analysis of Hyperboles in a British TV Series: Implications For EFL Classes

    ERIC Educational Resources Information Center

    Sert, Olcay

    2008-01-01

    This paper, part of an ongoing study on the analysis of hyperboles in a British TV series, reports findings drawing upon a 90,000 word corpus. The findings are compared to the ones from CANCODE (McCarthy and Carter 2004), a five-million word corpus of spontaneous speech, in order to identify similarities between the two. The analysis showed that…

  17. A Time-Series Model for Academic Library Data Using Intervention Analysis.

    ERIC Educational Resources Information Center

    Naylor, Maiken; Walsh, Kathleen

    1994-01-01

    Discussion of methods for gathering journal use information in academic libraries (for retention decisions) highlights an 8.4-year time-series of weekly library journal pickup data. Use of the autocorrelation function, spectral analysis, and intervention analysis is described.(LRW)

  18. Brief Communication: Earthquake sequencing: analysis of time series constructed from the Markov chain model

    NASA Astrophysics Data System (ADS)

    Cavers, M. S.; Vasudevan, K.

    2015-10-01

    Directed graph representation of a Markov chain model to study global earthquake sequencing leads to a time series of state-to-state transition probabilities that includes the spatio-temporally linked recurrent events in the record-breaking sense. A state refers to a configuration of zones with either the occurrence or non-occurrence of an earthquake in each zone in a pre-determined time interval. Since the time series is derived from non-linear and non-stationary earthquake sequencing, we use known analysis methods to glean new information. We apply decomposition procedures such as ensemble empirical mode decomposition (EEMD) to study the state-to-state fluctuations in each of the intrinsic mode functions, and subject the intrinsic mode functions derived from the time series to a detailed analysis to extract the information content of the time series. We also investigate the influence of random noise on the data-driven state-to-state transition probabilities. A second aspect of earthquake sequencing is closely tied to its time-correlative behaviour; here, we extend the Fano factor and Allan factor analysis to the time series of state-to-state transition frequencies of a Markov chain. Our results support not only the usefulness of the intrinsic mode functions in understanding the time series but also the presence of power-law behaviour exemplified by the Fano factor and the Allan factor.
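
The Fano factor used in the paper is simple to compute: it is the variance-to-mean ratio of event counts in windows of increasing length. For a memoryless Poisson process it stays near 1 at every scale, while power-law growth with window size signals time-correlated clustering. A sketch with synthetic event times:

```python
import numpy as np

rng = np.random.default_rng(3)
events = np.cumsum(rng.exponential(1.0, 5000))   # Poisson-process event times

def fano(event_times, T):
    """Variance-to-mean ratio of event counts in windows of length T."""
    edges = np.arange(0.0, event_times[-1], T)
    counts, _ = np.histogram(event_times, bins=edges)
    return counts.var() / counts.mean()

factors = {T: fano(events, T) for T in (1, 5, 25, 50)}
```

Applied to real state-to-state transition times, a Fano factor growing as a power of T would reproduce the paper's evidence of time-correlated behaviour; the Allan factor is the analogous ratio built from differences of adjacent window counts.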

  19. Documentation of a spreadsheet for time-series analysis and drawdown estimation

    USGS Publications Warehouse

    Halford, Keith J.

    2006-01-01

    Drawdowns during aquifer tests can be obscured by barometric pressure changes, earth tides, regional pumping, and recharge events in the water-level record. These stresses can create water-level fluctuations that should be removed from observed water levels prior to estimating drawdowns. Simple models have been developed for estimating unpumped water levels during aquifer tests that are referred to as synthetic water levels. These models sum multiple time series such as barometric pressure, tidal potential, and background water levels to simulate non-pumping water levels. The amplitude and phase of each time series are adjusted so that synthetic water levels match measured water levels during periods unaffected by an aquifer test. Differences between synthetic and measured water levels are minimized with a sum-of-squares objective function. Root-mean-square errors during fitting and prediction periods were compared multiple times at four geographically diverse sites. Prediction error equaled fitting error when fitting periods were greater than or equal to four times prediction periods. The proposed drawdown estimation approach has been implemented in a spreadsheet application. Measured time series are independent so that collection frequencies can differ and sampling times can be asynchronous. Time series can be viewed selectively and magnified easily. Fitting and prediction periods can be defined graphically or entered directly. Synthetic water levels for each observation well are created with earth tides, measured time series, moving averages of time series, and differences between measured and moving averages of time series. Selected series and fitting parameters for synthetic water levels are stored and drawdowns are estimated for prediction periods. Drawdowns can be viewed independently and adjusted visually if an anomaly skews initial drawdowns away from 0. The number of observations in a drawdown time series can be reduced by averaging across user
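
The synthetic-water-level idea can be sketched with a least-squares fit: regress measured levels on the candidate stress series over a pre-pumping fitting period, then extrapolate the fit and subtract. All series and amplitudes below are synthetic, not from the spreadsheet itself:

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.arange(240.0)                             # hourly observations
baro = np.sin(2 * np.pi * t / 24) + 0.1 * rng.normal(size=t.size)
tide = np.sin(2 * np.pi * t / 12.42)
true_dd = 0.5 * np.log1p(np.clip(t - 119.0, 0.0, None))  # pumping starts t=120
wl = 100.0 - 0.4 * baro - 0.1 * tide - true_dd   # "measured" water level

# Fit stress amplitudes on the pre-pumping ("fitting") period only
fit = t < 120
B = np.c_[np.ones_like(t), baro, tide]
coef, *_ = np.linalg.lstsq(B[fit], wl[fit], rcond=None)
synthetic = B @ coef                             # predicted non-pumping level
drawdown_hat = synthetic - wl                    # drawdown = synthetic - measured
```

In the spreadsheet, the regressors additionally include earth tides, moving averages, and differenced series, and the fitting period is chosen graphically, but the estimate is the same synthetic-minus-measured residual.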

  20. Systemic Analysis Approaches for Air Transportation

    NASA Technical Reports Server (NTRS)

    Conway, Sheila

    2005-01-01

    Air transportation system designers have had only limited success using traditional operations research and parametric modeling approaches in their analyses of innovations. They need a systemic methodology for modeling safety-critical infrastructure that is comprehensive, objective, and sufficiently concrete, yet simple enough to be used with reasonable investment. The methodology must also be amenable to quantitative analysis so that issues of system safety and stability can be rigorously addressed. However, air transportation has proven itself an extensive, complex system whose behavior is difficult to describe, much less predict. There is a wide range of system analysis techniques available, but some are more appropriate for certain applications than others. Specifically in the area of complex system analysis, the literature suggests that both agent-based models and network analysis techniques may be useful. This paper discusses the theoretical basis for each approach in these applications, and explores their historic and potential further use for air transportation analysis.

  1. Nonstationary frequency analysis for the trivariate flood series of the Weihe River

    NASA Astrophysics Data System (ADS)

    Jiang, Cong; Xiong, Lihua

    2016-04-01

    Intensive human activities such as water-soil conservation can significantly alter the natural hydrological processes of rivers. In this study, the effect of water-soil conservation on the trivariate flood series of the Weihe River in northwest China is investigated. The annual maximum daily discharge, annual maximum 3-day flood volume and annual maximum 5-day flood volume are chosen as the study data and used to compose the trivariate flood series. The nonstationarities in both the individual univariate flood series and the corresponding antecedent precipitation series generating the flood events are examined with the Mann-Kendall trend test. All the individual univariate flood series present a significant decreasing trend, while the antecedent precipitation series can be treated as stationary. This indicates that the increase in water-soil conservation land area has altered the rainfall-runoff relationship of the Weihe basin and induced the nonstationarities in the three individual univariate flood series. The time-varying moments model based on the Pearson type III distribution is applied to capture the nonstationarities in the flood frequency distribution, with the water-soil conservation land area introduced as the explanatory variable of the flood distribution parameters. Building on the analysis of each individual univariate flood series, the dependence structure among the three univariate flood series is investigated with a time-varying copula model, again with the water-soil conservation land area as the explanatory variable of the copula parameters. The results indicate that the dependence among the trivariate flood series is strengthened by the increase in water-soil conservation land area.

  2. A Physiological Time Series Dynamics-Based Approach to Patient Monitoring and Outcome Prediction

    PubMed Central

    Lehman, Li-Wei H.; Adams, Ryan P.; Mayaud, Louis; Moody, George B.; Malhotra, Atul; Mark, Roger G.; Nemati, Shamim

    2015-01-01

    Cardiovascular variables such as heart rate (HR) and blood pressure (BP) are regulated by an underlying control system, and therefore, the time series of these vital signs exhibit rich dynamical patterns of interaction in response to external perturbations (e.g., drug administration), as well as pathological states (e.g., onset of sepsis and hypotension). A question of interest is whether “similar” dynamical patterns can be identified across a heterogeneous patient cohort, and be used for prognosis of patients’ health and progress. In this paper, we used a switching vector autoregressive framework to systematically learn and identify a collection of vital sign time series dynamics, which are possibly recurrent within the same patient and may be shared across the entire cohort. We show that these dynamical behaviors can be used to characterize the physiological “state” of a patient. We validate our technique using simulated time series of the cardiovascular system, and human recordings of HR and BP time series from an orthostatic stress study with known postural states. Using the HR and BP dynamics of an intensive care unit (ICU) cohort of over 450 patients from the MIMIC II database, we demonstrate that the discovered cardiovascular dynamics are significantly associated with hospital mortality (dynamic modes 3 and 9, p = 0.001, p = 0.006 from logistic regression after adjusting for the APACHE scores). Combining the dynamics of BP time series and SAPS-I or APACHE-III provided a more accurate assessment of patient survival/mortality in the hospital than using SAPS-I and APACHE-III alone (p = 0.005 and p = 0.045). Our results suggest that the discovered dynamics of vital sign time series may contain additional prognostic value beyond that of the baseline acuity measures, and can potentially be used as an independent predictor of outcomes in the ICU. PMID:25014976

  3. Influence of sampling intake position on suspended solid measurements in sewers: two probability/time-series-based approaches.

    PubMed

    Sandoval, Santiago; Bertrand-Krajewski, Jean-Luc

    2016-06-01

    Total suspended solid (TSS) measurements in urban drainage systems are required for several reasons. Aiming to assess uncertainties in the mean TSS concentration due to the influence of sampling intake vertical position and vertical concentration gradients in a sewer pipe, two methods are proposed: a simplified method based on a theoretical vertical concentration profile (SM) and a time series grouping method (TSM). SM is based on flow rate and water depth time series. TSM requires additional TSS time series as input data. All time series are from the Chassieu urban catchment in Lyon, France (time series from 2007 with 2-min time step, 89 rainfall events). The probability of measuring a TSS value lower than the mean TSS along the vertical cross section (TSS underestimation) is about 0.88 with SM and about 0.64 with TSM. TSM shows more realistic TSS underestimation values (about 39 %) than SM (about 269 %). Interquartile ranges (IQR) over the probability values indicate that SM is more uncertain (IQR = 0.08) than TSM (IQR = 0.02). Differences between the two methods are mainly due to simplifications in SM (absence of TSS measurements). SM assumes a significant asymmetry of the TSS concentration profile along the vertical axis in the cross section, which is compatible with the distribution of TSS measurements found in the TSM approach. The methods are a step towards an indicator of measurement performance and representativeness for TSS sampling protocols. PMID:27178049

  4. Approaches to Literature through Theme. The Oryx Reading Motivation Series No. 1.

    ERIC Educational Resources Information Center

    Montgomery, Paula Kay

    Intended to help teachers and librarians inspire students in grades 5-9 to read and keep reading, this book provides literature theme approaches and teaching strategies for reading and studying literature. Chapter 1 discusses approaches, methods, techniques, and strategies in using literature approaches to motivate reading. Chapter 2 defines a…

  5. Comparison of nonparametric trend analysis according to the types of time series data

    NASA Astrophysics Data System (ADS)

    Heo, J.; Shin, H.; Kim, T.; Jang, H.; Kim, H.

    2013-12-01

    In the analysis of hydrological data, determining the existence of an overall trend due to climate change has been a major concern and an important part of the design and management of water resources for the future. The existence of a trend can be identified by plotting the hydrologic time series, but statistical methods are more accurate and objective tools for trend analysis. Statistical methods are divided into parametric and nonparametric methods. Parametric methods require the population to be normally distributed; however, most hydrological data follow non-normal distributions, so nonparametric methods are considered more suitable. In this study, simulations were performed with different types of time series data, and four nonparametric methods generally used in trend analysis (the Mann-Kendall test, Spearman's rho test, SEN test, and Hotelling-Pabst test) were applied to assess the power of each. The time series data were classified into three types: Trend+Random, Trend+Cycle+Random, and Trend+Non-random. To add a change to the data, 11 different slopes were overlaid in each simulation. As a result, the nonparametric methods have almost similar power for the Trend+Random and Trend+Non-random series, whereas the Mann-Kendall and SEN tests have slightly higher power than the Spearman's rho and Hotelling-Pabst tests for the Trend+Cycle+Random series.
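
The Mann-Kendall test compared in the study fits in a few lines of pure Python (no-ties variance formula, normal approximation with continuity correction); the synthetic series below are illustrative:

```python
import math
import random

def mann_kendall(x):
    """Mann-Kendall trend test (no ties): S statistic and two-sided p-value."""
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18
    z = (s - (s > 0) + (s < 0)) / math.sqrt(var_s)  # continuity correction
    p = math.erfc(abs(z) / math.sqrt(2))            # two-sided normal p-value
    return s, p

random.seed(0)
trending = [0.05 * i + random.gauss(0, 1) for i in range(100)]
noise = [random.gauss(0, 1) for _ in range(100)]
```

A simulation study like the paper's repeats this over many generated series per slope and counts the rejection rate to estimate the power of each test.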

  6. Multiresolution diffusion entropy analysis of time series: an application to births to teenagers in Texas

    NASA Astrophysics Data System (ADS)

    Scafetta, Nicola; West, Bruce J.

    2004-04-01

    The multiresolution diffusion entropy analysis is used to evaluate the stochastic information left in a time series after the systematic removal of certain non-stationarities. This method allows us to establish whether the identified patterns are sufficient to capture all relevant information contained in a time series. If they do not, the method suggests the need for further interpretation to explain the residual memory in the signal. We apply multiresolution diffusion entropy analysis to the daily count of births to teens in Texas from 1964 through 2000 because it is a typical example of a non-stationary time series, having an anomalous trend, an annual variation, and short-time fluctuations. The analysis is repeated for the three main racial/ethnic groups in Texas (White, Hispanic and African American), as well as for married and unmarried teens during the years 1994 to 2000, and we study the differences that emerge among the groups.
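
The core of diffusion entropy analysis is short: aggregate the series into diffusion displacements at increasing window lengths t, estimate the Shannon entropy S(t) of the displacement distribution, and read the scaling exponent δ from S(t) ≈ A + δ ln t (δ = 0.5 for uncorrelated noise). A NumPy sketch on surrogate white noise, not the Texas birth data:

```python
import numpy as np

rng = np.random.default_rng(5)
xi = rng.normal(size=20000)          # surrogate uncorrelated daily fluctuations

def dea_exponent(xi, ts):
    """Diffusion entropy: Shannon entropy S(t) of window sums vs. ln t."""
    S = []
    for t in ts:
        w = np.convolve(xi, np.ones(t), "valid")   # overlapping window sums
        p, edges = np.histogram(w, bins=50, density=True)
        dx = edges[1] - edges[0]
        p = p[p > 0]
        S.append(-np.sum(p * np.log(p)) * dx)      # plug-in entropy estimate
    return np.polyfit(np.log(ts), S, 1)[0]         # slope = delta

delta = dea_exponent(xi, [2, 4, 8, 16, 32, 64])
```

The multiresolution variant of the paper first strips the trend and seasonal patterns (e.g., by wavelet smoothing) and checks whether δ of the residual still departs from 0.5, which would indicate residual memory.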

  7. [Optic neuritis in childhood. A pediatric series, literature review and treatment approach].

    PubMed

    Lopez-Martin, D; Martinez-Anton, J

    2016-08-01

    Introduction. In childhood, the most frequent form of optic neuritis generally presents after an infectious illness, with papilledema; it is usually bilateral and has a good prognosis. Conversion to multiple sclerosis is infrequent. Aim. To present the clinical and laboratory characteristics of a pediatric series of optic neuritis. Patients and methods. We analyze a series of 17 cases of optic neuritis in children and adolescents aged 4 to 14 years, referred between 2000 and 2015. Results. The median age of the series was 11 years. Female patients predominated, and a preceding infection was infrequent; in five patients the involvement was bilateral, and four cases presented as retrobulbar optic neuritis. Magnetic resonance imaging showed T2 hyperintensity in the affected optic nerves in five patients. Cerebrospinal fluid studies and oligoclonal bands were normal in all cases. The patients, treated with intravenous methylprednisolone, recovered well. Only three cases subsequently progressed to multiple sclerosis. Conclusions. In this series, the cases that progressed to multiple sclerosis showed no clinical differences, although they did present a greater number of hyperintense lesions on magnetic resonance imaging. This finding, described in previous studies, supports our diagnostic and therapeutic scheme in an attempt to approach the optimal management of this condition.

  8. Probabilistic prediction of real-world time series: A local regression approach

    NASA Astrophysics Data System (ADS)

    Laio, Francesco; Ridolfi, Luca; Tamea, Stefania

    2007-02-01

    We propose a probabilistic prediction method, based on local polynomial regressions, which complements point forecasts with robust estimates of the corresponding forecast uncertainty. The reliability, practicability and generality of the method are demonstrated by applying it to astronomical, physiological, economic, and geophysical time series.
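    A local-regression point forecast with a crude uncertainty estimate, in the spirit of the method described above, might be sketched as follows (the function name, the neighbour count k, and the residual-based spread are illustrative assumptions, not the authors' implementation):

```python
# Minimal sketch: predict the next value from the k past states most similar
# to the current one, with a naive residual-based uncertainty estimate.
import math

def local_linear_forecast(series, k=5):
    x_now = series[-1]
    # training pairs (x_t, x_{t+1})
    pairs = [(series[i], series[i + 1]) for i in range(len(series) - 1)]
    pairs.sort(key=lambda p: abs(p[0] - x_now))
    k = min(k, len(pairs))
    nearest = pairs[:k]
    # ordinary least-squares line through the k neighbours
    mx = sum(p[0] for p in nearest) / k
    my = sum(p[1] for p in nearest) / k
    sxx = sum((p[0] - mx) ** 2 for p in nearest)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in nearest)
    slope = sxy / sxx if sxx else 0.0
    pred = my + slope * (x_now - mx)
    resid = [p[1] - (my + slope * (p[0] - mx)) for p in nearest]
    spread = math.sqrt(sum(r * r for r in resid) / k)  # crude forecast std
    return pred, spread

pred, spread = local_linear_forecast(list(range(1, 11)))  # linear toy series
```

    For a perfectly linear series the neighbours lie on the fitted line, so the spread collapses to zero; on real data the spread widens where the local dynamics are noisy.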

  9. Discussion Guide for Film Clip Series--"The Team Approach in Education: Twenty Questions on Film."

    ERIC Educational Resources Information Center

    Bowman, Garda W.; And Others

    This discussion guide is part of a multi-media package of audiovisual and written materials designed to assist trainers of teams in a school setting, particularly for use with teams of teachers and auxiliaries (paraprofessionals). The purpose of the film clip series--to stimulate discussion that is geared to problem solving--is discussed, and the…

  10. Education & Income Generation for Women: Non-Formal Approaches. Women's Development Series--3.

    ERIC Educational Resources Information Center

    Tellis-Nayak, Jessie B.

    The third of five publications of the Indian Social Institute (ISI)-Women's Development Series, this book outlines a non-formal education program for girls of India with emphasis on training for income generation. The program stresses the practical skills of home-making, motherhood, social skills, and character formation. The book describes some…

  11. Analysis of Effects of Meteorological Factors on Dengue Incidence in Sri Lanka Using Time Series Data

    PubMed Central

    Goto, Kensuke; Kumarendran, Balachandran; Mettananda, Sachith; Gunasekara, Deepa; Fujii, Yoshito; Kaneko, Satoshi

    2013-01-01

    In tropical and subtropical regions of eastern and South-eastern Asia, dengue fever (DF) and dengue hemorrhagic fever (DHF) outbreaks occur frequently. Previous studies indicate an association between meteorological variables and dengue incidence using time series analyses, and meteorological changes can thus affect dengue outbreaks. However, difficulties in collecting detailed time series data in developing countries have led to the common use of monthly data in most previous studies. In addition, time series analyses are often limited to one area because of the difficulty of collecting meteorological and dengue incidence data in multiple areas. To gain better understanding, we examined the effects of meteorological factors on dengue incidence in three geographically distinct areas (Ratnapura, Colombo, and Anuradhapura) of Sri Lanka by time series analysis of weekly data. The weekly average maximum temperature, the weekly total rainfall, and the weekly total number of dengue cases from 2005 to 2011 (7 years) were used as time series data. Time series analyses were performed on the basis of ordinary least squares regression analysis followed by a vector autoregressive model (VAR). In conclusion, the weekly average maximum temperature did not significantly affect dengue incidence in the three geographically different areas of Sri Lanka, whereas the weekly total rainfall slightly influenced dengue incidence in the cities of Colombo and Anuradhapura. PMID:23671694
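    The first stage of such an analysis, an ordinary least squares regression of weekly case counts on a lagged meteorological variable, can be sketched with synthetic numbers (all data and the two-week lag below are illustrative, not the study's):

```python
# Illustrative OLS fit of weekly dengue counts on lagged rainfall (synthetic
# numbers; variable names and the 2-week lag are assumptions, not study data).
def ols_fit(x, y):
    """Slope and intercept of y = a + b*x by ordinary least squares."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

rain = [10, 40, 20, 80, 60, 30, 90, 50]   # weekly total rainfall (mm)
cases = [4, 6, 8, 15, 10, 30, 24, 13]     # weekly dengue case counts
a, b = ols_fit(rain[:-2], cases[2:])      # regress cases on rain 2 weeks earlier
```

    A positive slope b would suggest that wetter weeks precede higher case counts; the VAR step in the paper generalizes this to mutually lagged variables.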

  13. Single event time series analysis in a binary karst catchment evaluated using a groundwater model (Lurbach system, Austria).

    PubMed

    Mayaud, C; Wagner, T; Benischke, R; Birk, S

    2014-04-16

    The Lurbach karst system (Styria, Austria) is drained by two major springs and replenished by both autogenic recharge from the karst massif itself and a sinking stream that originates in low permeable schists (allogenic recharge). Detailed data from two events recorded during a tracer experiment in 2008 demonstrate that an overflow from one of the sub-catchments to the other is activated if the discharge of the main spring exceeds a certain threshold. Time series analysis (autocorrelation and cross-correlation) was applied to examine to what extent the various available methods support the identification of the transient inter-catchment flow observed in this binary karst system. As inter-catchment flow is found to be intermittent, the evaluation was focused on single events. In order to support the interpretation of the results from the time series analysis a simplified groundwater flow model was built using MODFLOW. The groundwater model is based on the current conceptual understanding of the karst system and represents a synthetic karst aquifer for which the same methods were applied. Using the wetting capability package of MODFLOW, the model simulated an overflow similar to what has been observed during the tracer experiment. Various intensities of allogenic recharge were employed to generate synthetic discharge data for the time series analysis. In addition, geometric and hydraulic properties of the karst system were varied in several model scenarios. This approach helps to identify effects of allogenic recharge and aquifer properties in the results from the time series analysis. Comparing the results from the time series analysis of the observed data with those of the synthetic data a good agreement was found. For instance, the cross-correlograms show similar patterns with respect to time lags and maximum cross-correlation coefficients if appropriate hydraulic parameters are assigned to the groundwater model. The comparable behaviors of the real and the
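    The lagged cross-correlation used in such single-event analyses can be sketched as follows (synthetic discharge series; the function name and the lag search range are arbitrary choices):

```python
# Sketch of the lagged cross-correlation used to compare allogenic input with
# spring discharge (synthetic series; names are illustrative).
def cross_corr(x, y, lag):
    """Pearson correlation between x[t] and y[t+lag]."""
    x, y = x[:len(x) - lag] if lag else x, y[lag:]
    n = len(y)
    x = x[:n]
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

inflow = [0, 5, 9, 4, 1, 0, 6, 10, 5, 1, 0, 0]
spring = [1, 0, 4, 8, 5, 2, 1, 5, 9, 6, 2, 1]   # responds about one step later
best = max(range(4), key=lambda k: cross_corr(inflow, spring, k))
```

    The lag at which the cross-correlogram peaks is a crude estimate of the transit time between the sinking stream and the spring.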

  15. Shielding analysis of the TRUPACT-series casks for transportation of Hanford HLW

    SciTech Connect

    Banjac, V.; Sanchez, P.E.; Hills, C.R.; Heger, A.S.

    1993-01-01

    In this paper, the authors examine the possibility of using the TRUPACT-series casks for the transportation of high-level waste (HLW) from the Hanford reservation. The configurations of the TRUPACT series are a rectangular parallelepiped and a right circular cylinder, the TRUPACT-I and -II, respectively. The TRUPACT series was designed as a Type B contact-handled transuranic (CH-TRU) waste transportation system for use in Waste Isolation Pilot Plant-related operations and was subjected to Type B container accident tests, which it successfully passed. Thus, from a safety standpoint, the TRUPACT series provides double containment, impact limitation, and fire-retardant capabilities. However, the shielding analysis has shown that major modifications are required to allow transport of even a reasonable fraction of Hanford HLW.

  16. Analysis of heterogeneous dengue transmission in Guangdong in 2014 with multivariate time series model

    PubMed Central

    Cheng, Qing; Lu, Xin; Wu, Joseph T.; Liu, Zhong; Huang, Jincai

    2016-01-01

    Guangdong experienced the largest dengue epidemic in recent history. In 2014, the number of dengue cases was the highest of the previous 10 years and comprised more than 90% of all reported cases. To analyze this heterogeneous transmission, a multivariate time series model decomposing dengue risk additively into endemic, autoregressive and spatiotemporal components was used. Moreover, random effects were introduced into the model to deal with heterogeneous transmission and incidence levels, and a power-law approach was embedded to account for spatial interaction. There was little spatial variation in the autoregressive component. In contrast, for the endemic component there was a pronounced heterogeneity between the Pearl River Delta area and the remaining districts. For the spatiotemporal component, there was considerable heterogeneity across districts, with the highest values in some western and eastern districts. Clustering analysis revealed the patterns driving dengue transmission: the endemic component appears to be important in the Pearl River Delta area, where incidence is high (95 per 100,000), while areas with relatively low incidence (4 per 100,000) depend heavily on spatiotemporal spread and local autoregression. PMID:27666657

  17. Accuracy analysis of measurements on a stable power-law distributed series of events

    NASA Astrophysics Data System (ADS)

    Matthews, J. O.; Hopcraft, K. I.; Jakeman, E.; Siviour, G. B.

    2006-11-01

    We investigate how finite measurement time limits the accuracy with which the parameters of a stably distributed random series of events can be determined. The model process is generated by timing the emigration of individuals from a population that is subject to deaths and a particular choice of multiple immigration events. This leads to a scale-free discrete random process where customary measures, such as mean value and variance, do not exist. However, converting the number of events occurring in fixed time intervals to a 1-bit 'clipped' process allows the construction of well-behaved statistics that still retain vestiges of the original power-law and fluctuation properties. These statistics include the clipped mean and correlation function, from measurements of which both the power-law index of the distribution of events and the time constant of its fluctuations can be deduced. We report here a theoretical analysis of the accuracy of measurements of the mean of the clipped process. This indicates that, for a fixed experiment time, the error on measurements of the sample mean is minimized by an optimum choice of the number of samples. It is shown furthermore that this choice is sensitive to the power-law index and that the approach to Poisson statistics is dominated by rare events or 'outliers'. Our results are supported by numerical simulation.
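    The 1-bit clipping described above can be illustrated directly (toy counts; the point is that a single huge burst no longer dominates the statistic):

```python
# Sketch of 1-bit "clipping": counts in fixed windows are mapped to 0/1,
# giving well-behaved statistics even when the raw process has no finite mean.
def clip_counts(counts):
    """1 if any event occurred in the interval, else 0."""
    return [1 if c > 0 else 0 for c in counts]

def clipped_mean(counts):
    bits = clip_counts(counts)
    return sum(bits) / len(bits)

# Heavy-tailed toy counts: one enormous burst contributes the same single bit
# as an ordinary occupied interval.
counts = [0, 3, 0, 0, 1, 900, 0, 2, 0, 0]
m = clipped_mean(counts)
```

    The clipped mean estimates the probability that an interval is occupied, which retains vestiges of the power-law index while remaining finite and measurable.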

  18. Assessing coal-mine safety regulation: A pooled time-series analysis

    SciTech Connect

    Chun Youngpyoung.

    1991-01-01

    This study attempts to assess the independent, relative, and conjoint effects of four types of variables on coal-mine safety: administrative (mine inspections, mine investigations, and mine safety grants); political (state party competition, gubernatorial party affiliation, and deregulation); economic (state per-capita income and unemployment rates); task-related (mine size, technology, and type of mining); and state dummy variables. Trend, Pearson correlation, and pooled time-series analyses are performed on fatal and nonfatal injury rates reported in 25 coal-producing states during the 1975-1985 time period. These are then interpreted in light of three competing theories of regulation: capture, nonmarket failure, and threshold. Analysis reveals: (1) distinctions in the total explanatory power of the model across different types of injuries, as well as across presidential administrations; (2) a consistently more powerful impact on safety of informational implementation tools (safety education grants) over command-and-control approaches (inspections and investigations) or political variables; and (3) limited, albeit conjectural, support for a threshold theory of regulation in the coal-mine safety arena.

  20. The Terror Attacks of 9/11 and Suicides in Germany: A Time Series Analysis.

    PubMed

    Medenwald, Daniel

    2016-04-01

    Data on the effect of the September 11, 2001 (9/11) terror attacks on suicide rates remain inconclusive. Reportedly, even people located far from the attack site have considerable potential for personalizing the events that occurred on 9/11. Durkheim's theory states that suicides decrease during wartime; thus, a decline in suicides might have been expected after 9/11. We conducted a time series analysis of 164,136 officially recorded suicides in Germany between 1995 and 2009 using the algorithm introduced by Box and Jenkins. Compared with the average death rate, we observed no relevant change in the suicide rate of either sex after 9/11. Our estimates of an excess of suicides approached the null effect value on and within a 7-day period after 9/11, which also held when subsamples of deaths in urban or rural settings were examined. No evidence of Durkheim's theory attributable to the 9/11 attacks was found in this sample. PMID:27082561
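    The Box-Jenkins identification step is far richer than this, but its autoregressive ingredient can be caricatured by a Yule-Walker AR(1) estimate (synthetic series, not the study's data):

```python
# Caricature of the autoregressive building block of a Box-Jenkins model:
# estimate the AR(1) coefficient phi as the lag-1 autocorrelation (Yule-Walker).
def ar1_coefficient(x):
    """Lag-1 autocorrelation of x, the Yule-Walker estimate of AR(1) phi."""
    n = len(x)
    m = sum(x) / n
    c0 = sum((v - m) ** 2 for v in x) / n                      # lag-0 autocovariance
    c1 = sum((x[i] - m) * (x[i + 1] - m) for i in range(n - 1)) / n  # lag-1
    return c1 / c0

phi = ar1_coefficient([2, 4, 6, 8, 6, 4, 2, 4, 6, 8, 6, 4])
```

    In a full Box-Jenkins analysis, the fitted model's residuals would then be inspected for remaining structure; here a single coefficient only illustrates the mechanics.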

  1. Daily water and sediment discharges from selected rivers of the eastern United States; a time-series modeling approach

    USGS Publications Warehouse

    Fitzgerald, Michael G.; Karlinger, Michael R.

    1983-01-01

    Time-series models were constructed for analysis of daily runoff and sediment discharge data from selected rivers of the Eastern United States. Logarithmic transformation and first-order differencing of the data sets were necessary to produce second-order stationary time series and remove seasonal trends. Cyclic models accounted for less than 42 percent of the variance in the water series and 31 percent in the sediment series. Analysis of the apparent oscillations of given frequencies occurring in the data indicates that frequently occurring storms can account for as much as 50 percent of the variation in sediment discharge. Components of the frequency analysis indicate that a linear representation is reasonable for the water-sediment system. Models that incorporate lagged water discharge as input prove superior to univariate techniques in modeling and prediction of sediment discharges. The random component of the models includes errors in measurement and model hypothesis and shows no serial correlation. An index of sediment production within or between drainage basins can be calculated from model parameters.
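    The preprocessing described above, a logarithmic transformation followed by first-order differencing, can be sketched as follows (illustrative function name and synthetic flows):

```python
# Sketch of the paper's preprocessing: natural-log transform, then first-order
# differencing, which turns multiplicative growth into a roughly constant series.
import math

def log_diff(series):
    """First differences of the natural log: d[t] = ln x[t+1] - ln x[t]."""
    logs = [math.log(v) for v in series]
    return [b - a for a, b in zip(logs, logs[1:])]

flow = [100.0, 110.0, 121.0, 133.1]   # 10% growth each day
d = log_diff(flow)                     # approximately constant = ln(1.1)
```

    After this transformation the series is closer to second-order stationary, which is what the subsequent cyclic and lagged-input models require.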

  2. A simplified concentration series to produce a pair of 2D asynchronous spectra based on the DAOSD approach

    NASA Astrophysics Data System (ADS)

    Kang, Xiaoyan; He, Anqi; Guo, Ran; Zhai, Yanjun; Xu, Yizhuang; Noda, Isao; Wu, Jinguang

    2016-11-01

    We propose a substantially simplified approach to constructing a pair of 2D asynchronous spectra based on the DAOSD approach proposed in our previous papers. By using a new concentration series, only three 1D spectra, together with two reference spectra, are used to generate a pair of 2D correlation spectra. This method successfully addresses the labor-intensive data requirements of the traditional DAOSD approach. We apply the new approach to characterize the intermolecular interaction between acetonitrile and butanone dissolved in carbon tetrachloride. The existence of an intermolecular interaction between the two solutes is confirmed by the presence of a cross peak in the resultant 2D IR spectra. In addition, the absence of a cross peak around (2254, 2292) in Ψbutanone provides further experimental evidence of the intrinsic relationship between the C≡N stretching band and an overtone band (δCH3+νC-C).

  3. An Automated Approach to Map the History of Forest Disturbance from Insect Mortality and Harvest with Landsat Time-Series Data

    NASA Technical Reports Server (NTRS)

    Rudasill-Neigh, Christopher S.; Bolton, Douglas K.; Diabate, Mouhamad; Williams, Jennifer J.; Carvalhais, Nuno

    2014-01-01

    Forests contain a majority of the aboveground carbon (C) found in ecosystems, and understanding biomass lost from disturbance is essential to improve our C-cycle knowledge. Our study region in the Wisconsin and Minnesota Laurentian Forest had a strong decline in Normalized Difference Vegetation Index (NDVI) from 1982 to 2007, observed with the National Oceanic and Atmospheric Administration's (NOAA) series of Advanced Very High Resolution Radiometer (AVHRR). To understand the potential role of disturbances in the terrestrial C-cycle, we developed an algorithm to map forest disturbances from either harvest or insect outbreak for Landsat time-series stacks. We merged two image analysis approaches into one algorithm to monitor forest change: (1) multiple disturbance index thresholds to capture clear-cut harvest; and (2) a spectral trajectory-based image analysis with multiple confidence interval thresholds to map insect outbreak. We produced 20 maps and evaluated classification accuracy with air photos and insect air-survey data to understand the performance of our algorithm. We achieved overall accuracies ranging from 65% to 75%, with an average accuracy of 72%. The producer's and user's accuracies ranged from a maximum of 32% to 70% for insect disturbance, 60% to 76% for insect mortality and 82% to 88% for harvested forest, which was the dominant disturbance agent. Forest disturbances accounted for 22% of total forested area (7349 km2). Our algorithm provides a basic approach to map disturbance history where large impacts to forest stands have occurred and highlights the limited spectral sensitivity of Landsat time-series to outbreaks of defoliating insects. We found that only harvest and insect mortality events can be mapped with adequate accuracy with a non-annual Landsat time-series, which limited our understanding of the land-cover drivers of the NDVI decline. We demonstrate that capturing more subtle disturbances with spectral trajectories will require denser future observations.
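    The two merged approaches, a disturbance-index threshold for clear-cut harvest and a trajectory test for insect outbreak, might be caricatured as below; all thresholds and the crude trend test are illustrative assumptions, not the paper's algorithm:

```python
# Toy caricature of the merged classifier: (1) an abrupt NDVI drop flags
# harvest; (2) a slow persistent decline flags insect disturbance.
# All thresholds are invented for illustration.
def classify_pixel(ndvi_series, harvest_drop=0.3, decline_rate=-0.02):
    """Label a pixel from its (annual) NDVI time series."""
    drops = [a - b for a, b in zip(ndvi_series, ndvi_series[1:])]
    if max(drops) > harvest_drop:               # abrupt loss -> clear-cut harvest
        return "harvest"
    # crude trend: mean year-to-year change over the record
    trend = (ndvi_series[-1] - ndvi_series[0]) / (len(ndvi_series) - 1)
    if trend < decline_rate:                    # slow persistent decline -> insects
        return "insect"
    return "stable"

labels = [classify_pixel(s) for s in (
    [0.8, 0.8, 0.3, 0.4],        # clear-cut
    [0.8, 0.75, 0.7, 0.66],      # gradual mortality
    [0.8, 0.81, 0.79, 0.8],      # undisturbed
)]
```

    The paper's actual trajectory analysis uses confidence intervals on fitted spectral trajectories rather than a simple endpoint slope; the sketch only conveys the two-branch structure.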

  4. A new approach for agroecosystems monitoring using high-revisit multitemporal satellite data series

    NASA Astrophysics Data System (ADS)

    Diez, M.; Moclán, C.; Romo, A.; Pirondini, F.

    2014-10-01

    With increasing population pressure throughout the world and the need for increased agricultural production, there is a definite need for improved management of the world's agricultural resources. Comprehensive, reliable and timely information on agricultural resources is necessary for effective management decisions. In that sense, the demand for high-quality and high-frequency geo-information for monitoring agriculture and its associated ecosystems has grown in recent decades. Satellite image data enable direct observation of large areas at frequent intervals and therefore allow unprecedented mapping and monitoring of crop evolution. Furthermore, real-time analysis can assist in making timely management decisions that affect the outcome of the crops. The DEIMOS-1 satellite, owned and operated by ELECNOR DEIMOS IMAGING (Spain), provides 22 m, 3-band imagery with a very wide (620 km) swath, and has been specifically designed to produce high-frequency revisits over very large areas. This capability has been proven through the contracts awarded to Airbus Defence and Space every year since 2011, under which DEIMOS-1 has provided the USDA with the bulk of the imagery used to monitor the crop season in the Lower 48, in cooperation with its twin satellite, DMCii's UK-DMC2. Furthermore, high-density agricultural areas have been targeted with increased frequency and analyzed in near real time to monitor their evolution closely. In this paper we present the results obtained from a campaign carried out in 2013 with the DEIMOS-1 and UK-DMC2 satellites. The campaign provided high-frequency revisits of the target areas, with one image every two days on average: almost a ten-fold frequency improvement with respect to Landsat-8. The results clearly show the effectiveness of a high-frequency monitoring approach with high-resolution images compared with classic strategies, whose results are more exposed to weather conditions.

  5. Modified cross sample entropy and surrogate data analysis method for financial time series

    NASA Astrophysics Data System (ADS)

    Yin, Yi; Shang, Pengjian

    2015-09-01

    To research multiscale behaviors from the angle of entropy, we propose a modified cross sample entropy (MCSE) and combine it with surrogate data analysis in order to compute entropy differences between the original dynamics and surrogate series (MCSDiff). MCSDiff is applied to simulated signals to show its accuracy and then employed on US and Chinese stock markets. We illustrate the presence of multiscale behavior in the MCSDiff results and reveal synchrony in the original financial time series, with intrinsic relations that are destroyed by surrogate data analysis. Furthermore, the multifractal behaviors of the cross-correlations between these financial time series are investigated by the multifractal detrended cross-correlation analysis (MF-DCCA) method, since multifractal analysis is a multiscale analysis. We explore the multifractal properties of the cross-correlations between these US and Chinese markets and show the distinctiveness of NQCI and HSI among the markets in their own regions. It can be concluded that the weaker cross-correlation between US markets provides evidence of a better inner mechanism in the US stock markets than in the Chinese stock markets. Studying the multiscale features and properties of financial time series can provide valuable information for understanding the inner mechanism of financial markets.
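    The classical sample entropy on which MCSE builds can be sketched as follows (a minimal SampEn(m, r) for illustration; the paper's modified cross-sample entropy is not reproduced here):

```python
# Minimal sample entropy, SampEn(m, r) = -ln(A/B), where B counts matching
# m-length templates and A counts matching (m+1)-length templates.
import math

def sample_entropy(x, m=2, r=0.2):
    n = len(x)
    def matches(k):
        # use the same number of templates (n - m) for both lengths, as is standard
        templates = [x[i:i + k] for i in range(n - m)]
        hits = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    hits += 1
        return hits
    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a and b else float("inf")

# A strictly periodic series is perfectly predictable: SampEn is 0.
regular = [0, 1] * 10
```

    Lower values indicate more regular, predictable dynamics; the cross-sample variants replace self-matches of one series with matches between two series.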

  6. The F-12 series aircraft approach to design for control system reliability

    NASA Technical Reports Server (NTRS)

    Schenk, F. L.; Mcmaster, J. R.

    1976-01-01

    The F-12 series aircraft control system design philosophy is reviewed as it pertains to functional reliability. The basic control system, i.e., cables, mixer, feel system, trim devices, and hydraulic systems are described and discussed. In addition, the implementation of the redundant stability augmentation system in the F-12 aircraft is described. Finally, the functional reliability record that has been achieved is presented.

  7. Task Analysis: A Top-Down Approach.

    ERIC Educational Resources Information Center

    Harmon, Paul

    1983-01-01

    This approach to task analysis includes descriptions of (1) inputs, outputs, and jobs; (2) flow of materials and decisions between jobs; (3) inputs, major tasks, and outputs of each job; (4) sequence of steps for major tasks; (5) heuristics/algorithms for each sequence step; and (6) information needed to use heuristics algorithms. (EAO)

  8. Heterogeneous Factor Analysis Models: A Bayesian Approach.

    ERIC Educational Resources Information Center

    Ansari, Asim; Jedidi, Kamel; Dube, Laurette

    2002-01-01

    Developed Markov Chain Monte Carlo procedures to perform Bayesian inference, model checking, and model comparison in heterogeneous factor analysis. Tested the approach with synthetic data and data from a consumption emotion study involving 54 consumers. Results show that traditional psychometric methods cannot fully capture the heterogeneity in…

  9. A Mellin transform approach to wavelet analysis

    NASA Astrophysics Data System (ADS)

    Alotta, Gioacchino; Di Paola, Mario; Failla, Giuseppe

    2015-11-01

    The paper proposes a fractional calculus approach to continuous wavelet analysis. Upon introducing a Mellin transform expression of the mother wavelet, it is shown that the wavelet transform of an arbitrary function f(t) can be given a fractional representation involving a suitable number of Riesz integrals of f(t), and corresponding fractional moments of the mother wavelet. This result serves as a basis for an original approach to wavelet analysis of linear systems under arbitrary excitations. In particular, using the proposed fractional representation for the wavelet transform of the excitation, it is found that the wavelet transform of the response can readily be computed by a Mellin transform expression, with fractional moments obtained from a set of algebraic equations whose coefficient matrix applies for any scale a of the wavelet transform. The robustness and computational efficiency of the proposed approach are demonstrated in the paper.

  10. Pharmacogenomic responses of rat liver to methylprednisolone: an approach to mining a rich microarray time series.

    PubMed

    Almon, Richard R; Dubois, Debra C; Jin, Jin Y; Jusko, William J

    2005-08-18

    A data set was generated to examine global changes in gene expression in rat liver over time in response to a single bolus dose of methylprednisolone. Four control animals and 43 drug-treated animals were humanely killed at 16 different time points following drug administration. Total RNA preparations from the livers of these animals were hybridized to 47 individual Affymetrix RU34A gene chips, generating data for 8799 different probe sets for each chip. Data mining techniques applicable to gene-array time series data sets were applied to identify drug-regulated changes in gene expression. A series of 4 sequentially applied filters was developed to eliminate probe sets that were not expressed in the tissue, were not regulated by the drug treatment, or did not meet defined quality-control standards. These filters eliminated 7287 of the 8799 probe sets (82%) from further consideration. Application of judiciously chosen filters is an effective tool for data mining of time series data sets. The remaining data can then be further analyzed by clustering and mathematical modeling techniques.
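    The sequential filtering strategy can be sketched as a pipeline of predicates (field names and thresholds below are invented for illustration):

```python
# Sketch of sequential filtering: each filter drops probe sets that fail its
# test before the next filter runs. Field names/thresholds are illustrative.
def apply_filters(probes, filters):
    """Keep only probe sets that pass every filter, applied in order."""
    kept = probes
    for f in filters:
        kept = [p for p in kept if f(p)]
    return kept

probes = [
    {"id": "a", "expressed": True,  "regulated": True,  "qc": 0.9},
    {"id": "b", "expressed": False, "regulated": True,  "qc": 0.9},
    {"id": "c", "expressed": True,  "regulated": False, "qc": 0.9},
    {"id": "d", "expressed": True,  "regulated": True,  "qc": 0.2},
]
filters = [
    lambda p: p["expressed"],      # filter 1: expressed in the tissue
    lambda p: p["regulated"],      # filter 2: responds to the drug
    lambda p: p["qc"] >= 0.5,      # filters 3-4: quality-control checks
]
survivors = apply_filters(probes, filters)
```

    Only the probe sets surviving all filters proceed to clustering and modeling, mirroring the 8799 → 1512 reduction reported above.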

  11. Detecting Anomaly Regions in Satellite Image Time Series Based on Seasonal Autocorrelation Analysis

    NASA Astrophysics Data System (ADS)

    Zhou, Z.-G.; Tang, P.; Zhou, M.

    2016-06-01

    Anomaly regions in satellite images can reflect unexpected changes of land cover caused by flood, fire, landslide, etc. Detecting anomaly regions in satellite image time series is important for studying the dynamic processes of land cover change as well as for disaster monitoring. Although several methods have been developed to detect land cover changes using satellite image time series, they are generally designed for detecting inter-annual or abrupt land cover changes and do not focus on detecting spatial-temporal changes in continuous images. In order to identify the spatial-temporal dynamic processes of unexpected land cover changes, this study proposes a method for detecting anomaly regions in each image of a satellite image time series based on seasonal autocorrelation analysis. The method was validated with a case study detecting the spatial-temporal processes of a severe flood using Terra/MODIS image time series. Experiments demonstrated the advantages of the method: (1) it can effectively detect anomaly regions in each image of a satellite time series, showing the spatially and temporally varying process of the anomaly regions; (2) it can flexibly meet detection-accuracy requirements (e.g., z-value or significance level), with overall accuracy up to 89% and precision above 90%; and (3) it does not need time series smoothing and can detect anomaly regions in noisy satellite images with high reliability.
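    The core z-score test implied by such seasonal screening might be sketched as follows (a simplification of the paper's method; the NDVI numbers and function name are synthetic):

```python
# Sketch of seasonal anomaly flagging: compare a pixel's current value with
# the mean/std of the same season in previous years via a z-score.
import math

def seasonal_anomaly(history, current, z_crit=2.0):
    """Return (flag, z): flag is True if |z| exceeds z_crit."""
    n = len(history)
    mean = sum(history) / n
    var = sum((v - mean) ** 2 for v in history) / n
    std = math.sqrt(var)
    z = (current - mean) / std if std else 0.0
    return abs(z) > z_crit, z

# NDVI for the same week over past years vs. a flood-suppressed current value.
flag, z = seasonal_anomaly([0.71, 0.68, 0.73, 0.70, 0.69], 0.35)
```

    The significance level mentioned in the abstract corresponds to the choice of z_crit; a flood-darkened pixel yields a strongly negative z and is flagged.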

  12. Determinants of healthcare expenditures in Iran: evidence from a time series analysis

    PubMed Central

    Rezaei, Satar; Fallah, Razieh; Kazemi Karyani, Ali; Daroudi, Rajabali; Zandiyan, Hamed; Hajizadeh, Mohammad

    2016-01-01

    Background: A dramatic increase in healthcare expenditures is a major health policy concern worldwide. Understanding the factors that underlie the growth in healthcare expenditures is essential to assist decision-makers in finding the best policies to manage healthcare costs. We aimed to examine the determinants of healthcare spending in Iran over the period 1978-2011. Methods: A time series analysis was used to examine the effect of selected socio-economic, demographic and health service inputs on per capita healthcare expenditures (HCE) in Iran from 1978 to 2011. Data were retrieved from the Central Bank of Iran, the Iranian Statistical Center and the World Bank. An autoregressive distributed lag approach and an error correction method were employed to examine the long- and short-run effects of the covariates. Results: Our findings indicated that GDP per capita, degree of urbanization and illiteracy rate increase healthcare expenditures, while physicians per 10,000 population and the proportion of the population aged ≥65 years decrease healthcare expenditures. In addition, we found that healthcare spending is a "necessity good," with long- and short-run income (GDP per capita) elasticities of 0.46 (p<0.01) and 0.67 (p = 0.01), respectively. Conclusion: Our analysis identified GDP per capita, illiteracy rate, degree of urbanization and number of physicians as some of the driving forces behind the persistent increase in HCE in Iran. These findings provide important insights into the growth in HCE in Iran. In addition, since we found that health spending is a "necessity good" in Iran, healthcare services should thus be the object of public funding and government intervention. PMID:27390683
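    The notion of a "necessity good" (income elasticity below 1) can be illustrated with a deliberately simplified log-log regression. The study itself uses an ARDL/error-correction model, which this static OLS sketch does not reproduce; the data below are synthetic:

```python
import numpy as np

def income_elasticity(hce, gdp):
    """Estimate the income elasticity of healthcare expenditure as the
    slope of ln(HCE) on ln(GDP per capita) -- a static OLS sketch only,
    not the ARDL/error-correction estimation used in the study."""
    x = np.log(np.asarray(gdp, dtype=float))
    y = np.log(np.asarray(hce, dtype=float))
    slope, intercept = np.polyfit(x, y, 1)
    return slope
```

An elasticity estimate below 1 (as with the study's 0.46 long-run figure) indicates that spending grows more slowly than income, the defining property of a necessity good.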

  13. An Integrated Approach to Life Cycle Analysis

    NASA Technical Reports Server (NTRS)

    Chytka, T. M.; Brown, R. W.; Shih, A. T.; Reeves, J. D.; Dempsey, J. A.

    2006-01-01

    Life Cycle Analysis (LCA) is the evaluation of the impacts that design decisions have on a system and provides a framework for identifying and evaluating design benefits and burdens associated with the life cycles of space transportation systems from a "cradle-to-grave" approach. Sometimes called life cycle assessment, life cycle approach, or "cradle-to-grave analysis", it represents a rapidly emerging family of tools and techniques designed to be a decision support methodology and aid in the development of sustainable systems. The implementation of a Life Cycle Analysis can vary and may take many forms, from global system-level uncertainty-centered analysis to the assessment of individualized discriminatory metrics. This paper will focus on a proven LCA methodology developed by the Systems Analysis and Concepts Directorate (SACD) at NASA Langley Research Center to quantify and assess key LCA discriminatory metrics, in particular affordability, reliability, maintainability, and operability. This paper will address issues inherent in Life Cycle Analysis, including direct impacts, such as system development cost and crew safety, as well as indirect impacts, which often take the form of coupled metrics (e.g., the cost of system unreliability). Since LCA deals with the analysis of space vehicle system conceptual designs, it is imperative to stress that the goal of LCA is not to arrive at the answer but, rather, to provide important inputs to a broader strategic planning process, allowing managers to make risk-informed decisions and increase the likelihood of meeting mission success criteria.

  14. An approach to jointly invert hypocenters and 1D velocity structure and its application to the Lushan earthquake series

    NASA Astrophysics Data System (ADS)

    Qian, Hui; Mechie, James; Li, Haibing; Xue, Guangqi; Su, Heping; Cui, Xiang

    2016-01-01

    Earthquake location is essential when defining fault systems and other geological structures. Many methods have been developed to locate hypocenters within a 1D velocity model. In this study, a new approach, named MatLoc, has been developed which can simultaneously invert for the locations and origin times of the hypocenters and the velocity structure, from the arrival times of local earthquakes. Moreover, it can invert for layer boundary depths, such as Moho depths, which can be well constrained by the Pm and Pn phases. For this purpose, the package was developed to take into account reflected phases, e.g., the Pm phase. The speed of the inversion is acceptable due to the use of optimized matrix calculations. The package has been used to re-locate the Lushan earthquake series which occurred in Sichuan, China, from April 20 to April 22, 2013. The results obtained with the package show that the Lushan earthquake series defines the dip of the Guankou fault, on which most of the series occurred, to be 39° toward the NW. Further, the surface projection of the Lushan earthquake series is consistent with the regional tectonic strike which is about N45° E.

  15. A Comparison of Missing-Data Procedures for Arima Time-Series Analysis

    ERIC Educational Resources Information Center

    Velicer, Wayne F.; Colby, Suzanne M.

    2005-01-01

    Missing data are a common practical problem for longitudinal designs. Time-series analysis is a longitudinal method that involves a large number of observations on a single unit. Four different missing-data methods (deletion, mean substitution, mean of adjacent observations, and maximum likelihood estimation) were evaluated. Computer-generated…
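    One of the four procedures compared, the mean of adjacent observations, can be sketched in a few lines. Missing values are represented here as None; this is an illustration, not the study's code:

```python
def impute_adjacent_mean(series):
    """Replace each missing value (None) with the mean of its nearest
    non-missing neighbours on either side -- one of the four missing-data
    procedures compared (deletion, mean substitution, adjacent mean,
    maximum likelihood estimation)."""
    out = list(series)
    for i, v in enumerate(series):
        if v is None:
            left = next((series[j] for j in range(i - 1, -1, -1)
                         if series[j] is not None), None)
            right = next((series[j] for j in range(i + 1, len(series))
                          if series[j] is not None), None)
            neighbours = [w for w in (left, right) if w is not None]
            out[i] = sum(neighbours) / len(neighbours)
    return out
```

At a boundary, where only one neighbour exists, that neighbour's value is carried over unchanged.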

  16. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package.

    PubMed

    Donges, Jonathan F; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik V; Marwan, Norbert; Dijkstra, Henk A; Kurths, Jürgen

    2015-11-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology. PMID:26627561
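    As an illustration of one of the structures pyunicorn constructs, here is a from-scratch natural visibility graph (this sketch does not use pyunicorn's own API): nodes are time points, and two points are linked when the straight line between them passes above every intermediate sample.

```python
def visibility_graph(series):
    """Build the natural visibility graph of a time series: link (a, b)
    if the straight line from (a, series[a]) to (b, series[b]) lies
    strictly above all intermediate samples."""
    n = len(series)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                series[c] < series[a]
                + (series[b] - series[a]) * (c - a) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.add((a, b))
    return edges
```

Consecutive points are always mutually visible, so every such graph is connected; the degree distribution of the result is what links time series properties to network measures.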

  17. Time series analysis of knowledge of results effects during motor skill acquisition.

    PubMed

    Blackwell, J R; Simmons, R W; Spray, J A

    1991-03-01

    Time series analysis was used to investigate the hypothesis that during acquisition of a motor skill, knowledge of results (KR) information is used to generate a stable internal referent about which response errors are randomly distributed. Sixteen subjects completed 50 acquisition trials of each of three movements whose spatial-temporal characteristics differed. Acquisition trials were either blocked, with each movement being presented in series, or randomized, with the presentation of movements occurring in random order. Analysis of movement time data indicated the contextual interference effect reported in previous studies was replicated in the present experiment. Time series analysis of the acquisition trial data revealed the majority of individual subject response patterns during blocked trials were best described by a model with a temporarily stationary, internal reference of the criterion and systematic, trial-to-trial variation of response errors. During random trial conditions, response patterns were usually best described by a "White-noise" model. This model predicts a permanently stationary, internal reference associated with randomly distributed response errors that are unaffected by KR information. These results are not consistent with previous work using time series analysis to describe motor behavior (Spray & Newell, 1986). PMID:2028084

  18. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan F.; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik V.; Marwan, Norbert; Dijkstra, Henk A.; Kurths, Jürgen

    2015-11-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology.

  19. Detrended partial cross-correlation analysis of two nonstationary time series influenced by common external forces

    NASA Astrophysics Data System (ADS)

    Qian, Xi-Yuan; Liu, Ya-Min; Jiang, Zhi-Qiang; Podobnik, Boris; Zhou, Wei-Xing; Stanley, H. Eugene

    2015-06-01

    When common factors strongly influence two power-law cross-correlated time series recorded in complex natural or social systems, using detrended cross-correlation analysis (DCCA) without considering these common factors will bias the results. We use detrended partial cross-correlation analysis (DPXA) to uncover the intrinsic power-law cross correlations between two simultaneously recorded time series in the presence of nonstationarity after removing the effects of other time series acting as common forces. The DPXA method is a generalization of the detrended cross-correlation analysis that takes into account partial correlation analysis. We demonstrate the method by using bivariate fractional Brownian motions contaminated with a fractional Brownian motion. We find that the DPXA is able to recover the analytical cross Hurst indices, and thus the multiscale DPXA coefficients are a viable alternative to the conventional cross-correlation coefficient. We demonstrate the advantage of the DPXA coefficients over the DCCA coefficients by analyzing contaminated bivariate fractional Brownian motions. We calculate the DPXA coefficients and use them to extract the intrinsic cross correlation between crude oil and gold futures by taking into consideration the impact of the U.S. dollar index. We develop the multifractal DPXA (MF-DPXA) method in order to generalize the DPXA method and investigate multifractal time series. We analyze multifractal binomial measures masked with strong white noises and find that the MF-DPXA method quantifies the hidden multifractal nature while the multifractal DCCA method fails.
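    A minimal sketch of the baseline quantity that DPXA generalizes: the detrended cross-correlation (DCCA) coefficient at a single scale s, computed from box-wise linearly detrended profiles. This is illustrative, not the authors' implementation, and omits the partial-correlation step that removes common external forces:

```python
import numpy as np

def dcca_coefficient(x, y, s):
    """DCCA cross-correlation coefficient rho(s): detrended covariance of
    the integrated profiles, normalized by the two detrended variances."""
    x = np.cumsum(np.asarray(x, float) - np.mean(x))
    y = np.cumsum(np.asarray(y, float) - np.mean(y))
    n_boxes = len(x) // s
    f_xy = f_xx = f_yy = 0.0
    t = np.arange(s)
    for k in range(n_boxes):
        xs, ys = x[k * s:(k + 1) * s], y[k * s:(k + 1) * s]
        # Remove a local linear trend within each box.
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        f_xy += np.mean(rx * ry)
        f_xx += np.mean(rx * rx)
        f_yy += np.mean(ry * ry)
    return f_xy / np.sqrt(f_xx * f_yy)
```

DPXA replaces the raw residuals with residuals after regressing out the external series (e.g., the U.S. dollar index), yielding the intrinsic coefficient.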

  20. Economic Conditions and the Divorce Rate: A Time-Series Analysis of the Postwar United States.

    ERIC Educational Resources Information Center

    South, Scott J.

    1985-01-01

    Challenges the belief that the divorce rate rises during prosperity and falls during economic recessions. Time-series regression analysis of postwar United States reveals small but positive effects of unemployment on divorce rate. Stronger influences on divorce rates are changes in age structure and labor-force participation rate of women.…

  1. Rationale, Development, and Validation of a Series of Self-Instructional Modules in Interaction Analysis.

    ERIC Educational Resources Information Center

    Suiter, Phil Edward; Queen, Bernard

    This study was designed to develop a series of instructional modules to teach inservice teachers the Flanders System of Interaction Analysis. Instructional modules were constructed based on research information, and then modified from feedback from experts and random trials. Two field-test groups were used to provide data for validation testing,…

  2. Young volcanoes in the Chilean Southern Volcanic Zone: A statistical approach to eruption prediction based on time series

    NASA Astrophysics Data System (ADS)

    Dzierma, Y.; Wehrmann, H.

    2010-03-01

    Forecasting volcanic activity has long been an aim of applied volcanology with regard to mitigating the consequences of volcanic eruptions. Effective disaster management requires both information on expected physical eruption behaviour, such as the types and magnitudes of eruptions typical for the individual volcano, usually reconstructed from deposits of past eruptions, and the likelihood that a new eruption will occur within a given time. Here we apply a statistical procedure to provide a probability estimate for future eruptions based on eruption time series, and discuss the limitations of this approach. The statistical investigation encompasses a series of young volcanoes of the Chilean Southern Volcanic Zone. Most of the volcanoes considered have been active in historical times, in addition to several volcanoes with a longer eruption record from the Late Pleistocene to the Holocene. Furthermore, eruption rates of neighbouring volcanoes are compared with the aim of revealing possible regional relations, potentially resulting from local- to medium-scale tectonic dynamics. One special focus is directed to the two currently most active volcanoes of South America, Llaima and Villarrica, whose eruption records comprise about 50 historical eruptions over the past centuries. These two front volcanoes are considered together with Lanín Volcano, situated in the back-arc of Villarrica, for which the analysis is based on eight eruptions in the past 10 ka. For Llaima and Villarrica, tests affirming the independence of the repose times between successive eruptions permit the assumption of Poisson processes; this is hampered for Lanín because of the more limited availability of documented eruptions. The assumption of stationarity reaches varying degrees of confidence depending on the time interval considered, improving towards the more recent and hence probably more complete eruption record. With these prerequisites of the time series, several distribution functions are fit and the goodness of
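    Once repose times pass the independence and stationarity tests, the simplest forecast follows from the Poisson assumption: the probability of at least one eruption within a horizon t is 1 - exp(-λt), with the rate λ estimated from the mean repose time. A sketch with purely illustrative numbers:

```python
import math

def eruption_probability(repose_times_years, horizon_years):
    """Probability of at least one eruption within the horizon, assuming
    eruption onsets follow a stationary Poisson process whose rate is
    the reciprocal of the mean repose time (illustrative values only)."""
    mean_repose = sum(repose_times_years) / len(repose_times_years)
    rate = 1.0 / mean_repose
    return 1.0 - math.exp(-rate * horizon_years)
```

With a mean repose of 5 years, the chance of an eruption within the next 5 years is 1 - e^{-1}, about 63%.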

  3. Evaluation of physiologic complexity in time series using generalized sample entropy and surrogate data analysis

    NASA Astrophysics Data System (ADS)

    Eduardo Virgilio Silva, Luiz; Otavio Murta, Luiz

    2012-12-01

    Complexity in time series is an intriguing feature of living dynamical systems, with potential use for identification of system state. Although various methods have been proposed for measuring physiologic complexity, uncorrelated time series are often assigned high values of complexity, erroneously classifying them as complex physiological signals. Here, we propose and discuss a method for complex system analysis based on generalized statistical formalism and surrogate time series. Sample entropy (SampEn) was rewritten, inspired by the Tsallis generalized entropy, as a function of the q parameter (qSampEn). qSDiff curves were calculated, which consist of the differences between the qSampEn of the original and surrogate series. We evaluated qSDiff for 125 real heart rate variability (HRV) dynamics, divided into groups of 70 healthy, 44 congestive heart failure (CHF), and 11 atrial fibrillation (AF) subjects, and for simulated series of stochastic and chaotic processes. The evaluations showed that, for nonperiodic signals, qSDiff curves have a maximum point (qSDiffmax) for q ≠ 1. Values of q where the maximum point occurs and where qSDiff is zero were also evaluated. Only qSDiffmax values were capable of distinguishing the HRV groups (p-values 5.10×10-3, 1.11×10-7, and 5.50×10-7 for healthy vs. CHF, healthy vs. AF, and CHF vs. AF, respectively), consistently with the concept of physiologic complexity, suggesting a potential use for chaotic system analysis.
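    For reference, standard sample entropy, of which the paper's qSampEn is a Tsallis-type generalization, can be sketched as follows. This is the ordinary q → 1 limit, not the qSampEn itself:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Standard SampEn: negative log of the conditional probability that
    template vectors matching for m points (Chebyshev distance within
    r * std) also match for m + 1 points."""
    x = np.asarray(x, float)
    tol = r * x.std()

    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance from template i to all later templates.
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(d <= tol))
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b)
```

Regular signals (many repeated templates) score low; uncorrelated noise scores high, which is exactly the misclassification the qSDiff surrogate comparison is designed to correct.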

  4. A Recurrent Probabilistic Neural Network with Dimensionality Reduction Based on Time-series Discriminant Component Analysis.

    PubMed

    Hayashi, Hideaki; Shibanoki, Taro; Shima, Keisuke; Kurita, Yuichi; Tsuji, Toshio

    2015-12-01

    This paper proposes a probabilistic neural network (NN) developed on the basis of time-series discriminant component analysis (TSDCA) that can be used to classify high-dimensional time-series patterns. TSDCA involves the compression of high-dimensional time series into a lower dimensional space using a set of orthogonal transformations and the calculation of posterior probabilities based on a continuous-density hidden Markov model with a Gaussian mixture model expressed in the reduced-dimensional space. The analysis can be incorporated into an NN, which is named a time-series discriminant component network (TSDCN), so that parameters of dimensionality reduction and classification can be obtained simultaneously as network coefficients according to a backpropagation through time-based learning algorithm with the Lagrange multiplier method. The TSDCN is considered to enable high-accuracy classification of high-dimensional time-series patterns and to reduce the computation time taken for network training. The validity of the TSDCN is demonstrated for high-dimensional artificial data and electroencephalogram signals in the experiments conducted during the study.

  5. Wavelet analysis for non-stationary, non-linear time series

    NASA Astrophysics Data System (ADS)

    Schulte, J. A.

    2015-12-01

    Methods for detecting and quantifying nonlinearities in nonstationary time series are introduced and developed. In particular, higher-order wavelet analysis was applied to an ideal time series and the Quasi-biennial Oscillation (QBO) time series. Multiple-testing problems inherent in wavelet analysis were addressed by controlling the false discovery rate. A new local autobicoherence spectrum facilitated the detection of local nonlinearities and the quantification of cycle geometry. The local autobicoherence spectrum of the QBO time series showed that the QBO time series contained a mode with a period of 28 months that was phase-coupled to a harmonic with a period of 14 months. An additional nonlinearly interacting triad was found among modes with periods of 10, 16, and 26 months. Local biphase spectra determined that the nonlinear interactions were not quadratic and that the effect of the nonlinearities was to produce non-smoothly varying oscillations. The oscillations were found to be skewed so that negative QBO regimes were preferred, and also asymmetric in the sense that phase transitions between the easterly and westerly phases occurred more rapidly than those from westerly to easterly regimes.
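    Controlling the false discovery rate across the many pointwise tests of a wavelet spectrum is commonly done with the Benjamini-Hochberg step-up procedure, sketched below. The abstract does not name its exact FDR procedure; this is the standard one, assumed for illustration:

```python
def benjamini_hochberg(p_values, alpha=0.05):
    """Benjamini-Hochberg step-up procedure: sort p-values, find the
    largest rank k with p_(k) <= k * alpha / m, and reject the k
    smallest-p hypotheses. Returns a per-hypothesis rejection list."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if p_values[i] <= rank * alpha / m:
            k_max = rank
    reject = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= k_max:
            reject[i] = True
    return reject
```

Unlike a Bonferroni correction, the threshold scales with rank, which preserves power when many wavelet coefficients are genuinely significant.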

  6. Multivariate analysis: A statistical approach for computations

    NASA Astrophysics Data System (ADS)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, cluster evaluation in finance, and more recently in the health-related professions. The objective of this paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks, such as DDoS attacks and network scanning.

  7. Approach to uncertainty in risk analysis

    SciTech Connect

    Rish, W.R.

    1988-08-01

    In the Fall of 1985 EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analyses techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities and on effectively communicating uncertainty analysis results is included. Examples from actual applications are presented.

  8. Systemic and intensifying drought induces collapse and replacement of native fishes: a time-series approach

    NASA Astrophysics Data System (ADS)

    Ruhi, A.; Olden, J. D.; Sabo, J. L.

    2015-12-01

    In the American Southwest, hydrologic drought has become the new normal as a result of increasing human appropriation of freshwater resources and increased aridity associated with global warming. Although drought has often been touted to threaten freshwater biodiversity, connecting drought to the extinction risk of highly imperiled faunas remains a challenge. Here we combine time-series methods from signal processing and econometrics to analyze a spatially comprehensive and long-term dataset linking discharge variation and community abundance of fish across the American Southwest. This novel time series framework identifies ongoing trends in daily discharge anomalies across the Southwest, quantifies the effect of historical hydrologic drivers on fish community abundance, and allows us to simulate species trajectories and range-wide risk of decline (quasiextinction) under scenarios of future climate. Spectral anomalies have declined over the last 30 years in at least a quarter of the stream gaging stations across the American Southwest, and these anomalies are robust predictors of the historical abundance of native and non-native fishes. Quasiextinction probabilities are high (>50%) for nearly three-quarters of the native species across several large river basins in the same region, and the negative trend in annual anomalies increases quasiextinction risk for native fishes but reduces it for non-native fishes. These findings suggest that ongoing drought is causing range-wide collapse and replacement of native fish faunas, and that this homogenization of western fish faunas will continue given the prevailing negative trend in discharge anomalies. Additionally, this combination of methods can be applied elsewhere as long as environmental and biological long-term time-series data are available. Collectively, these methods allow one to identify the link between hydroclimatic forcing and ecological responses and thus may help anticipate the potential impacts of ongoing and future hydrologic

  9. Minimax mutual information approach for independent component analysis.

    PubMed

    Erdogmus, Deniz; Hild, Kenneth E; Rao, Yadunandana N; Príncipe, José C

    2004-06-01

    Minimum output mutual information is regarded as a natural criterion for independent component analysis (ICA) and is used as the performance measure in many ICA algorithms. Two common approaches in information-theoretic ICA algorithms are minimum mutual information and maximum output entropy approaches. In the former approach, we substitute some form of probability density function (pdf) estimate into the mutual information expression, and in the latter we incorporate the source pdf assumption in the algorithm through the use of nonlinearities matched to the corresponding cumulative density functions (cdf). Alternative solutions to ICA use higher-order cumulant-based optimization criteria, which are related to either one of these approaches through truncated series approximations for densities. In this article, we propose a new ICA algorithm motivated by the maximum entropy principle (for estimating signal distributions). The optimality criterion is the minimum output mutual information, where the estimated pdfs are from the exponential family and are approximate solutions to a constrained entropy maximization problem. This approach yields an upper bound for the actual mutual information of the output signals - hence, the name minimax mutual information ICA algorithm. In addition, we demonstrate that for a specific selection of the constraint functions in the maximum entropy density estimation procedure, the algorithm relates strongly to ICA methods using higher-order cumulants. PMID:15130248

  10. Monitoring scale scores over time via quality control charts, model-based approaches, and time series techniques.

    PubMed

    Lee, Yi-Hsuan; von Davier, Alina A

    2013-07-01

    Maintaining a stable score scale over time is critical for all standardized educational assessments. Traditional quality control tools and approaches for assessing scale drift either require special equating designs, or may be too time-consuming to be considered on a regular basis with an operational test that has a short time window between an administration and its score reporting. Thus, the traditional methods are not sufficient to catch unusual testing outcomes in a timely manner. This paper presents a new approach for score monitoring and assessment of scale drift. It involves quality control charts, model-based approaches, and time series techniques to accommodate the following needs of monitoring scale scores: continuous monitoring, adjustment of customary variations, identification of abrupt shifts, and assessment of autocorrelation. Performance of the methodologies is evaluated using manipulated data based on real responses from 71 administrations of a large-scale high-stakes language assessment. PMID:25106404
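    The quality-control-chart component can be illustrated with a basic Shewhart-style chart over mean scale scores per administration; the limits, n-sigma rule, and data below are illustrative, not taken from the assessment studied:

```python
def control_chart_limits(scores, n_sigma=3.0):
    """Shewhart-style control limits for a sequence of mean scale
    scores: flag administrations falling outside mean +/- n_sigma * sd.
    A minimal sketch of continuous score monitoring."""
    n = len(scores)
    mean = sum(scores) / n
    sd = (sum((s - mean) ** 2 for s in scores) / (n - 1)) ** 0.5
    lower, upper = mean - n_sigma * sd, mean + n_sigma * sd
    flags = [s < lower or s > upper for s in scores]
    return lower, upper, flags
```

In practice the paper layers model-based adjustment of customary variation and time-series treatment of autocorrelation on top of charts like this, since raw limits ignore seasonal examinee-population effects.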

  11. Percutaneous trans-ulnar artery approach for coronary angiography and angioplasty; A case series study

    PubMed Central

    Roghani-Dehkordi, Farshad; Hadizadeh, Mahmood; Hadizadeh, Fatemeh

    2015-01-01

    BACKGROUND Coronary angiography is the gold standard method for the diagnosis of coronary heart disease and is usually performed by the femoral approach, which has several complications. To reduce these complications, the upper extremity approach is increasingly used and is becoming the preferred access site for many interventionists. Although the radial approach is relatively well studied, the safety, feasibility and risk of the ulnar approach are not yet clearly known. METHODS We followed 97 patients (men = 56%, mean ± standard deviation of age = 57 ± 18) who had undergone coronary angiography or angioplasty via the ulnar approach for 6-10 months and recorded their outcomes. RESULTS In 97 out of 105 patients (92.38%), the procedure through ulnar access was successfully completed. Unsuccessful puncture (3 patients), wiring (2 patients), passage of the sheath (2 patients), and an anatomically unsuitable ulnar artery (1 patient) were the reasons for failure. In 94 patients (89.52%), the angiography or angioplasty was done without any complications. Five patients (5.1%) developed hematoma and 11 patients (11%) experienced low-grade pain that resolved with painkillers. No infection, amputation or need for surgery was reported. CONCLUSION This study demonstrated that ulnar access was a safe and practical approach for coronary angiography or angioplasty in our patients, without any major complication. Bearing in mind its high success rate, it can be utilized when the radial artery is not suitable for catheterization, such as after prior harvesting of the radial artery (in prior coronary artery bypass grafting). PMID:26715936

  12. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik; Marwan, Norbert; Dijkstra, Henk; Kurths, Jürgen

    2016-04-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology. pyunicorn is available online at https://github.com/pik-copan/pyunicorn. Reference: J.F. Donges, J. Heitzig, B. Beronov, M. Wiedermann, J. Runge, Q.-Y. Feng, L. Tupikina, V. Stolbova, R.V. Donner, N. Marwan, H.A. Dijkstra, and J. Kurths, Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package, Chaos 25, 113101 (2015), DOI: 10.1063/1.4934554, Preprint: arxiv.org:1507.01571 [physics.data-an].

  13. The study of coastal groundwater depth and salinity variation using time-series analysis

    SciTech Connect

    Tularam, G.A. . E-mail: a.tularam@griffith.edu.au; Keeler, H.P. . E-mail: p.keeler@ms.unimelb.edu.au

    2006-10-15

    A time-series approach is applied to study and model tidal intrusion into coastal aquifers. The authors examine the effect of tidal behaviour on groundwater level and salinity intrusion for the coastal Brisbane region using auto-correlation and spectral analyses. The results show a close relationship between tidal behaviour, groundwater depth and salinity levels for the Brisbane coast. The known effect can be quantified and incorporated into new models in order to more accurately map salinity intrusion into coastal groundwater table.
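    The spectral side of such an analysis amounts to locating tidal peaks in a periodogram. A minimal sketch that recovers the dominant period of a semidiurnal signal from an hourly record (synthetic data, not the Brisbane series):

```python
import numpy as np

def dominant_period(series, dt_hours):
    """Return the period (in hours) of the strongest spectral peak in a
    zero-mean FFT periodogram -- the kind of estimate used to relate
    groundwater-level records to tidal forcing."""
    x = np.asarray(series, float)
    x = x - x.mean()
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=dt_hours)
    k = np.argmax(power[1:]) + 1  # skip the zero-frequency bin
    return 1.0 / freqs[k]
```

A peak near 12.4 hours would indicate the principal lunar semidiurnal (M2) tide; the corresponding peak in the groundwater or salinity record, lagged via cross-correlation, quantifies the intrusion effect.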

  14. Modified superposition: A simple time series approach to closed-loop manual controller identification

    NASA Technical Reports Server (NTRS)

    Biezad, D. J.; Schmidt, D. K.; Leban, F.; Mashiko, S.

    1986-01-01

    Single-channel pilot manual control output in closed-loop tracking tasks is modeled in terms of linear discrete transfer functions which are parsimonious and guaranteed stable. The transfer functions are found by applying a modified superposition time-series generation technique. A Levinson-Durbin algorithm is used to determine the filter which prewhitens the input, and a projective (least squares) fit of pulse response estimates is used to guarantee identified model stability. Results from two case studies are compared to previous findings, where the data consist of relatively short records, approximately 25 seconds long. Time delay effects and pilot seasonalities are discussed and analyzed. It is concluded that single-channel time series controller modeling is feasible on short records, and that it is important for the analyst to determine a criterion for best time domain fit which allows association of model parameter values, such as pure time delay, with actual physical and physiological constraints. The purpose of the modeling is thus paramount.
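
    A minimal Levinson-Durbin recursion of the kind used for the prewhitening filter might look like this (prediction-error filter convention with a[0] = 1; the autocorrelation input is assumed already computed):

```python
import numpy as np

def levinson_durbin(r, order):
    """Levinson-Durbin recursion: prediction-error filter a (a[0] = 1) and
    final prediction-error power e from autocorrelations r[0..order]."""
    a = np.zeros(order + 1)
    a[0] = 1.0
    e = r[0]
    for k in range(1, order + 1):
        lam = -np.dot(a[:k], r[k:0:-1]) / e        # reflection coefficient
        prev = a[:k + 1].copy()
        a[:k + 1] = prev + lam * prev[::-1]
        e *= 1.0 - lam * lam
    return a, e

# exact AR(1) autocorrelations with phi = 0.6: r_k = 0.6**k
a, e = levinson_durbin(np.array([1.0, 0.6, 0.36]), 2)
print(round(a[1], 3), round(e, 3))
```

    For this exact AR(1) sequence the recursion recovers a = [1, -0.6, 0] and prediction-error power 0.64; filtering the input through a yields the prewhitened series.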

  15. [Approximation of Time Series of Paramecia caudatum Dynamics by Verhulst and Gompertz Models: Non-traditional Approach].

    PubMed

    Nedorezov, L V

    2015-01-01

    For approximation of some well-known time series of Paramecium caudatum population dynamics (G. F. Gause, The Struggle for Existence, 1934), the Verhulst and Gompertz models were used. The parameters were estimated for each of the models in two different ways: with the least squares method (global fitting) and with a non-traditional approach (a method of extreme points). The results obtained were compared with one another and with those presented by G. F. Gause. Deviations of the theoretical (model) trajectories from the experimental time series were tested using various non-parametric statistical tests. It was shown that the least-squares estimates lead to results that do not always meet the requirements imposed on a "fine" model. In some cases, however, a small modification of the least-squares estimates allows a satisfactory approximation of the experimental data set. PMID:26349222
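
    A global least-squares fit of the Verhulst model of the kind compared above can be sketched with scipy; the "observations" here are synthetic, not Gause's counts:

```python
import numpy as np
from scipy.optimize import curve_fit

def verhulst(t, K, r, x0):
    """Closed-form solution of the Verhulst (logistic) equation."""
    return K / (1.0 + (K - x0) / x0 * np.exp(-r * t))

t = np.arange(0.0, 15.0)
rng = np.random.default_rng(1)
obs = verhulst(t, 200.0, 0.9, 2.0) + rng.normal(scale=3.0, size=t.size)

(K, r, x0), _ = curve_fit(verhulst, t, obs, p0=[150.0, 0.5, 5.0])
print(round(K), round(r, 2))
```

    The fitted carrying capacity and growth rate recover the generating values to within the noise; the method-of-extreme-points estimator mentioned above is a different procedure not shown here.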

  17. Multiscaling comparative analysis of time series and a discussion on "earthquake conversations" in California.

    PubMed

    Scafetta, Nicola; West, Bruce J

    2004-04-01

    Time series are characterized by complex memory and/or distribution patterns. In this Letter we show that stochastic models characterized by different statistics may equally well reproduce some pattern of a time series. In particular, we discuss the difference between Lévy-walk and fractal Gaussian intermittent signals and show that the adoption of complementary scaling analysis techniques may be useful to distinguish the two cases. Finally, we apply this methodology to the earthquake occurrences in California and suggest the possibility that earthquake occurrences are described by a colored ("long-range correlated") generalized Poisson model. PMID:15089646

  18. Time series analysis and Monte Carlo methods for eigenvalue separation in neutron multiplication problems

    SciTech Connect

    Nease, Brian R.; Ueki, Taro

    2009-12-10

    A time series approach has been applied to the nuclear fission source distribution generated by Monte Carlo (MC) particle transport in order to calculate the non-fundamental mode eigenvalues of the system. The novel aspect is the combination of the general technical principle of projection pursuit for multivariate data with the neutron multiplication eigenvalue problem in the nuclear engineering discipline. It is rigorously shown that the stationary MC process is linear to a first-order approximation and that it transforms into one-dimensional autoregressive processes of order one (AR(1)) via the automated choice of projection vectors. The autocorrelation coefficient of the resulting AR(1) process corresponds to the ratio of the desired mode eigenvalue to the fundamental mode eigenvalue. All modern MC codes for nuclear criticality calculate the fundamental mode eigenvalue, so the desired mode eigenvalue can be easily determined. This time series approach was tested for a variety of problems, including multi-dimensional ones. Numerical results show that the time series approach has strong potential for three-dimensional whole-reactor-core problems. The eigenvalue ratio can be updated in an on-the-fly manner without storing the nuclear fission source distributions from all previous iteration cycles for the mean subtraction. Lastly, the effects of degenerate eigenvalues are investigated and solutions are provided.
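
    The statistical core of the method, reading an eigenvalue ratio off the lag-1 autocorrelation of a projected AR(1) series, can be illustrated on synthetic data; here phi = 0.8 stands in for the ratio of the desired mode eigenvalue to the fundamental mode eigenvalue:

```python
import numpy as np

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation; for an AR(1) process this estimates phi."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

# synthetic AR(1) with phi = 0.8 standing in for the eigenvalue ratio k1/k0
rng = np.random.default_rng(0)
phi, n = 0.8, 20000
eps = rng.normal(size=n)
x = np.empty(n)
x[0] = eps[0]
for i in range(1, n):
    x[i] = phi * x[i - 1] + eps[i]
print(round(lag1_autocorr(x), 2))
```

    Multiplying the estimated autocorrelation by the fundamental mode eigenvalue, which every criticality code already computes, would then yield the desired mode eigenvalue.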

  19. Analysis of Lotka's Law: The Simon-Yule Approach.

    ERIC Educational Resources Information Center

    Chen, Ye-Sho

    1989-01-01

    Argues that a major difficulty in using Lotka's law in information science arises from the misuse of goodness of fit tests in parameter estimation. Three approaches for studying Lotka's law are presented: an index approach, a time series approach, and a generating mechanism incorporating these two influential variables to derive an equilibrium…

  20. Quantification and clustering of phenotypic screening data using time-series analysis for chemotherapy of schistosomiasis

    PubMed Central

    2012-01-01

    Background Neglected tropical diseases, especially those caused by helminths, constitute some of the most common infections of the world's poorest people. Development of techniques for automated, high-throughput drug screening against these diseases, especially in whole-organism settings, constitutes one of the great challenges of modern drug discovery. Method We present a method for enabling high-throughput phenotypic drug screening against diseases caused by helminths with a focus on schistosomiasis. The proposed method allows for a quantitative analysis of the systemic impact of a drug molecule on the pathogen as exhibited by the complex continuum of its phenotypic responses. This method consists of two key parts: first, biological image analysis is employed to automatically monitor and quantify shape-, appearance-, and motion-based phenotypes of the parasites. Next, we represent these phenotypes as time-series and show how to compare, cluster, and quantitatively reason about them using techniques of time-series analysis. Results We present results on a number of algorithmic issues pertinent to the time-series representation of phenotypes. These include results on appropriate representation of phenotypic time-series, analysis of different time-series similarity measures for comparing phenotypic responses over time, and techniques for clustering such responses by similarity. Finally, we show how these algorithmic techniques can be used for quantifying the complex continuum of phenotypic responses of parasites. An important corollary is the ability of our method to recognize and rigorously group parasites based on the variability of their phenotypic response to different drugs. Conclusions The methods and results presented in this paper enable automatic and quantitative scoring of high-throughput phenotypic screens focused on helmintic diseases. Furthermore, these methods allow us to analyze and stratify parasites based on their phenotypic response to drugs
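
    One standard time-series similarity measure of the kind such screens evaluate is dynamic time warping (DTW), which tolerates phenotypic responses that unfold at different speeds; a minimal sketch (not the authors' implementation):

```python
import numpy as np

def dtw(a, b):
    """Classic O(len(a) * len(b)) dynamic-time-warping distance."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

slow = [0, 1, 2, 3, 4]
fast = [0, 2, 4]          # same trajectory, different speed
print(dtw(slow, fast))    # prints 2.0: the series align despite the speed difference
```

    Feeding such pairwise distances into a hierarchical clustering routine gives the similarity-based grouping of responses described above.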

  1. Microscopical image analysis: problems and approaches.

    PubMed

    Bradbury, S

    1979-03-01

    This article reviews some of the problems which have been encountered in the application of automatic image analysis to problems in biology. Some of the questions involved in the actual formulation of such a problem for this approach are considered, as well as the difficulties in the analysis due to lack of specific contrast in the image and to its complexity. Various practical methods which have been successful in overcoming these problems are outlined, and the question of the desirability of an opto-manual or semi-automatic system, as opposed to a fully automatic version, is considered.

  2. Graphic analysis and multifractal on percolation-based return interval series

    NASA Astrophysics Data System (ADS)

    Pei, A. Q.; Wang, J.

    2015-05-01

    A financial time series model is developed and investigated using the oriented percolation system (one of the statistical physics systems). The nonlinear and statistical behaviors of the return interval time series are studied for the proposed model and the real stock market by applying the visibility graph (VG) and multifractal detrended fluctuation analysis (MF-DFA). We investigate the fluctuation behaviors of return intervals of the model for different parameter settings, and also comparatively study these fluctuation patterns with those of the real financial data for different threshold values. The empirical research of this work exhibits the multifractal features of the corresponding financial time series. Further, the VGs derived from both the simulated data and the real data show small-world behavior, hierarchy, high clustering, and power-law tails in their degree distributions.
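
    A minimal construction of the natural visibility graph used above, in which two samples are connected whenever every intermediate sample lies strictly below the straight line joining them:

```python
import numpy as np

def visibility_edges(y):
    """Natural visibility graph: nodes i and j are linked when every
    intermediate point lies strictly below the connecting line."""
    y = np.asarray(y, dtype=float)
    edges = set()
    n = len(y)
    for i in range(n):
        for j in range(i + 1, n):
            if all(y[k] < y[i] + (y[j] - y[i]) * (k - i) / (j - i)
                   for k in range(i + 1, j)):
                edges.add((i, j))
    return edges

series = [3.0, 1.0, 2.0, 0.5, 4.0]
E = visibility_edges(series)
print(sorted(E))
```

    Degree distributions, clustering, and hierarchy statistics of the kind reported above are then read off this graph.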

  3. Engage: The Science Speaker Series - A novel approach to improving science outreach and communication

    NASA Astrophysics Data System (ADS)

    von der Linden, Jens; Hilton, Eric; Mitchell, Rachel; Rosenfield, Phil

    2011-10-01

    Communicating the results and significance of basic research to the general public is of critical importance. At present, very few programs exist to allow young scientists the opportunity to practice their public outreach skills. Although the need for science outreach is recognized, graduate programs often fail to provide any training in making science accessible. Engage represents a unique, graduate student-led effort to improve public outreach skills. Founded in 2009, Engage was created by three science graduate students at the University of Washington. The students developed an interdisciplinary curriculum to investigate why science outreach often fails, to improve graduate student communication skills, and to help students create a dynamic, public-friendly talk about their research. The course incorporates story-telling, improvisational arts, and development of analogy, all with a focus on clarity, brevity and accessibility. This free, public-friendly speaker series is hosted at the University of Washington and has substantial public attendance and participation.

  4. Mining biomedical time series by combining structural analysis and temporal abstractions.

    PubMed

    Bellazzi, R; Magni, P; Larizza, C; De Nicolao, G; Riva, A; Stefanelli, M

    1998-01-01

    This paper describes the combination of Structural Time Series analysis and Temporal Abstractions for the interpretation of data coming from home monitoring of diabetic patients. Blood glucose data are analyzed by a novel Bayesian technique for time series analysis. The results obtained are post-processed using Temporal Abstractions in order to extract knowledge that can be exploited "at the point of use" by physicians. The proposed data analysis procedure can be viewed as a Knowledge Discovery in Databases process applied to time-varying data. The work described here is part of a Web-based telemedicine system for the management of Insulin Dependent Diabetes Mellitus patients, called T-IDDM.

  5. Scalable Hyper-parameter Estimation for Gaussian Process Based Time Series Analysis

    SciTech Connect

    Chandola, Varun; Vatsavai, Raju

    2010-01-01

    The Gaussian process (GP) is becoming increasingly popular as a kernel machine learning tool for non-parametric data analysis. Recently, GPs have been applied to model non-linear dependencies in time series data. GP based analysis can be used to solve problems of time series prediction, forecasting, missing data imputation, change point detection, anomaly detection, etc. But the use of GPs to handle massive scientific time series data sets has been limited, owing to their expensive computational complexity. The primary bottleneck is the handling of the covariance matrix, whose size is quadratic in the length of the time series. In this paper we propose a scalable method that exploits the special structure of the covariance matrix for hyper-parameter estimation in GP based learning. The proposed method allows estimation of the hyper-parameters associated with the GP in quadratic time, which is an order of magnitude improvement over standard methods with cubic complexity. Moreover, the proposed method does not require explicit computation of the covariance matrix and hence has a memory requirement linear in the length of the time series, as opposed to the quadratic memory requirement of standard methods. To further improve the computational complexity of the proposed method, we provide a parallel version to concurrently estimate the log likelihood for a set of time series, which is the key step in the hyper-parameter estimation. Performance results on a multi-core system show that our proposed method provides significant speedups, as high as 1000, even when running in serial mode, while maintaining a small memory footprint. The parallel version exploits the natural parallelization potential of the serial algorithm and is shown to perform significantly better than the fast serial algorithm, with speedups as high as 10.
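
    For orientation, the standard O(n^3) Cholesky-based evaluation of the GP negative log marginal likelihood, the quantity whose computation the paper accelerates by exploiting covariance structure, can be sketched as follows (RBF kernel; hyperparameter values arbitrary):

```python
import numpy as np

def gp_neg_log_marglik(t, y, length_scale, signal_var, noise_var):
    """Standard O(n^3) Cholesky evaluation of the GP negative log marginal
    likelihood with an RBF kernel; this is the cost structured methods reduce."""
    t = np.asarray(t, dtype=float)
    K = signal_var * np.exp(-0.5 * ((t[:, None] - t[None, :]) / length_scale) ** 2)
    K[np.diag_indices_from(K)] += noise_var
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (0.5 * float(y @ alpha) + float(np.log(np.diag(L)).sum())
            + 0.5 * len(y) * np.log(2.0 * np.pi))

t = np.linspace(0.0, 10.0, 50)
y = np.sin(t)
print(round(gp_neg_log_marglik(t, y, 1.0, 1.0, 0.1), 2))
```

    Hyper-parameter estimation repeatedly minimizes this quantity, which is why reducing its cost from cubic to quadratic matters for long series.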

  6. Application of the Allan Variance to Time Series Analysis in Astrometry and Geodesy: A Review.

    PubMed

    Malkin, Zinovy

    2016-04-01

    The Allan variance (AVAR) was introduced 50 years ago as a statistical tool for assessing the stability of frequency standards. For the past decades, AVAR has increasingly been used in geodesy and astrometry to assess the noise characteristics in geodetic and astrometric time series. A specific feature of astrometric and geodetic measurements, as compared with clock measurements, is that they are generally associated with uncertainties; thus, an appropriate weighting should be applied during data analysis. In addition, some physically connected scalar time series naturally form series of multidimensional vectors. For example, the three station coordinate time series X, Y, and Z can be combined to analyze 3-D station position variations. The classical AVAR is not intended for processing unevenly weighted and/or multidimensional data. Therefore, AVAR modifications, namely weighted AVAR (WAVAR), multidimensional AVAR (MAVAR), and weighted multidimensional AVAR (WMAVAR), were introduced to overcome these deficiencies. In this paper, a brief review is given of the experience of using AVAR and its modifications in processing astrogeodetic time series.
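
    The classical (unweighted, non-overlapping) AVAR can be computed as below; for white noise AVAR(tau) falls off as 1/tau, which the toy example reproduces:

```python
import numpy as np

def allan_variance(y, tau):
    """Non-overlapping Allan variance at averaging length tau (in samples)."""
    y = np.asarray(y, dtype=float)
    m = y.size // tau
    means = y[:m * tau].reshape(m, tau).mean(axis=1)
    d = np.diff(means)
    return 0.5 * float(np.mean(d * d))

rng = np.random.default_rng(0)
white = rng.normal(size=100_000)
ratio = allan_variance(white, 1) / allan_variance(white, 10)
print(round(ratio, 1))          # white noise: AVAR(tau) ~ 1/tau, so ratio ~ 10
```

    The weighted and multidimensional variants discussed above generalize the same difference-of-averages statistic to unevenly weighted and vector-valued series.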

  7. Wavelet application to the time series analysis of DORIS station coordinates

    NASA Astrophysics Data System (ADS)

    Bessissi, Zahia; Terbeche, Mekki; Ghezali, Boualem

    2009-06-01

    This article concerns the analysis of residual time series of DORIS station coordinates using the wavelet transform. Several analysis techniques, already developed in other disciplines, were employed in the statistical study of the geodetic time series of stations. The wavelet transform allows one, on the one hand, to characterize the residual signals in both time and frequency, and on the other hand, to determine and quantify systematic signals such as periodicity and tendency. Tendency is the short- or long-term change in the signal; it is an average curve which represents the general pace of the signal's evolution. Periodicity, in turn, is a process which repeats, identical to itself, after a time interval called the period. In this context, this article aims, first, to determine the systematic signals by wavelet analysis of time series of DORIS station coordinates, and second, to apply wavelet-packet denoising, which makes it possible to obtain a well-filtered signal, smoother than the original. The DORIS data used are a set of weekly residual time series from 1993 to 2004 from eight stations: DIOA, COLA, FAIB, KRAB, SAKA, SODB, THUB and SYPB. It is the ign03wd01 solution expressed in stcd format, derived by the IGN/JPL analysis center. Although these data are not very recent, the goal of this study is to assess the contribution of the wavelet analysis method on the DORIS data, compared to the other analysis methods already studied.
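
    A one-level Haar decomposition with soft thresholding of the detail coefficients is a minimal sketch of the wavelet-denoising idea described above (a real analysis would use a dedicated wavelet library and several levels; this toy uses numpy only):

```python
import numpy as np

def haar_step(x):
    """One level of the Haar transform: approximation and detail coefficients."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def haar_inverse(a, d):
    """Invert one Haar level; exact reconstruction when d is untouched."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 256)
clean = np.sin(2 * np.pi * 3 * t)              # smooth "tendency + periodicity"
noisy = clean + 0.3 * rng.normal(size=t.size)

a, d = haar_step(noisy)
d = np.sign(d) * np.maximum(np.abs(d) - 0.3, 0.0)   # soft threshold on details
denoised = haar_inverse(a, d)
improved = np.mean((denoised - clean) ** 2) < np.mean((noisy - clean) ** 2)
print(bool(improved))
```

    Thresholding only the detail coefficients suppresses high-frequency noise while leaving the tendency carried by the approximation coefficients intact.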

  8. THE MENTALLY RETARDED CHILD, A PSYCHOLOGICAL APPROACH. MCGRAW-HILL SERIES IN PSYCHOLOGY.

    ERIC Educational Resources Information Center

    ROBINSON, HALBERT B.; ROBINSON, NANCY M.

    PRESENTING A PSYCHOLOGICAL APPROACH TO MENTAL RETARDATION, THIS TEXT BEGINS WITH A DISCUSSION OF THEORIES OF INTELLIGENCE, PROBLEMS OF DEFINITION, AND THE CURRENT STATUS OF THE FIELD OF MENTAL RETARDATION. A SECTION ON ETIOLOGY AND SYNDROMES PRESENTS INFORMATION ON GENETIC FACTORS AND GENETIC SYNDROMES AND THE PHYSICAL AND PSYCHOLOGICAL…

  9. Performed Culture: An Approach to East Asian Language Pedagogy. Pathways to Advanced Skills Series, Volume 11

    ERIC Educational Resources Information Center

    Christensen, Matthew; Warnick, Paul

    2006-01-01

    This book is a general introduction to the performed culture approach, which trains students how to express themselves in a way that native speakers of the target culture feel appropriate in given situations. Target readership includes Chinese, Japanese, and Korean language teachers and graduate students. Chapters of this book include: (1)…

  10. Communication for the Workplace: An Integrated Language Approach. Second Edition. Job Skills. Net Effect Series.

    ERIC Educational Resources Information Center

    Ettinger, Blanche; Perfetto, Edda

    Using a developmental, hands-on approach, this text/workbook helps students master the basic English skills that are essential to write effective business correspondence, to recognize language errors, and to develop decision-making and problem-solving skills. Its step-by-step focus and industry-specific format encourages students to review,…

  11. Phonics, Spelling, and Word Study: A Sensible Approach. The Bill Harp Professional Teachers Library Series.

    ERIC Educational Resources Information Center

    Glazer, Susan Mandel

    This concise book shares several sensible, logical, and meaningful approaches that guide young children to use the written coding system to read, spell, and make meaning of the English language coding system. The book demonstrates that phonics, spelling, and word study are essential parts of literacy learning. After an introduction, chapters are:…

  12. Place as Text: Approaches to Active Learning. 2nd Edition. National Collegiate Honors Council Monograph Series

    ERIC Educational Resources Information Center

    Braid, Bernice, Ed.; Long, Ada, Ed.

    2010-01-01

    The decade since publication of "Place as Text: Approaches to Active Learning" has seen an explosion of interest and productivity in the field of experiential education. This monograph presents a story of an experiment and a blueprint of sorts for anyone interested in enriching an existing program or willing to experiment with pedagogy…

  13. The Adolescent Community Reinforcement Approach for Adolescent Cannabis Users, Cannabis Youth Treatment (CYT) Series, Volume 4.

    ERIC Educational Resources Information Center

    Godley, Susan Harrington; Meyers, Robert J.; Smith, Jane Ellen; Karvinen, Tracy; Titus, Janet C.; Godley, Mark D.; Dent, George; Passetti, Lora; Kelberg, Pamela

    This publication was written for therapists and their supervisors who may want to implement the adolescent community reinforcement approach intervention, which was one of the five interventions tested by the Center for Substance Abuse Treatment's (CSAT's) Cannabis Youth Treatment (CYT) Project. The CYT Project provided funding to support a study…

  14. Increasing Vocational Program Relevance: A Data-based Approach. Research and Development Series No. 264.

    ERIC Educational Resources Information Center

    Starr, Harold

    This guide is intended for local- and state-level vocational education administrators, planners, and evaluators who are responsible for making program planning and evaluation findings and decisions geared toward increasing program relevance. The method differs from more traditional approaches in its heavy reliance upon the selection and…

  15. Drama for Learning: Dorothy Heathcote's Mantle of the Expert Approach to Education. Dimensions of Drama Series.

    ERIC Educational Resources Information Center

    Heathcote, Dorothy; Bolton, Gavin

    This book describes how theater can create an impetus for productive learning across the curriculum. Dorothy Heathcote's "mantle of the expert" approach is discussed in which teachers and students explore, in role, the knowledge they already have about a problem or task while making new discoveries along the way. The book also presents a variety…

  16. Parapharyngeal space tumors: surgical approaches in a series of 13 cases.

    PubMed

    Papadogeorgakis, N; Petsinis, V; Goutzanis, L; Kostakis, G; Alexandridis, C

    2010-03-01

    Tumors originating in the parapharyngeal space are rare; they comprise approximately 0.5% of head and neck tumors. Most (70-80%) are benign, and the most frequent origins are salivary and neurogenic. The aim of this study is to present the surgical procedures used for the treatment of 13 patients with parapharyngeal space tumors; 11 of them were suffering from benign tumors (the most frequent being pleomorphic adenoma; 8 cases) and 2 from malignant lesions. The following surgical approaches were used: intraoral (2 cases), transcervical (4 cases) and transmandibular (7 cases) with different types of mandible osteotomies. The type of surgical approach was dictated by the type of the lesion (malignant or benign), the exact location, the size, the vascularity and the relation of the tumor to the neck neurovascular bundle. In all cases the selected surgical approach allowed the complete resection of the tumor, obtaining clear margins in cases of malignancy, without adding to the patient's preoperative morbidity. It was concluded that the surgical approach to parapharyngeal space tumors must be adjusted to the tumor characteristics and be as wide as necessary to achieve complete removal with safety.

  17. Analysis approaches and interventions with occupational performance

    PubMed Central

    Ahn, Sinae

    2016-01-01

    [Purpose] The purpose of this study was to analyze approaches and interventions with occupational performance in patients with stroke. [Subjects and Methods] In this study, articles published in the past 10 years were searched. The key terms used were “occupational performance AND stroke” and “occupational performance AND CVA”. A total of 252 articles were identified, and 79 articles were selected. All interventions were classified by approach according to 6 theories, and all were analyzed for frequency. [Results] Regarding the approaches, the biomechanical approach was used most frequently, in 25 articles (31.6%); the corresponding interventions included electrical stimulation therapy, robot therapy, and sensory stimulation training, among others. Analysis of the frequency of interventions revealed that the most commonly used interventions, reported in 18 articles (22.8%), made use of the concept of constraint-induced therapy. [Conclusion] The results of this study suggest an approach for use in clinics for selecting an appropriate intervention for occupational performance. PMID:27799719

  18. Efficient transfer entropy analysis of non-stationary neural time series.

    PubMed

    Wollstadt, Patricia; Martínez-Zarzuela, Mario; Vicente, Raul; Díaz-Pernas, Francisco J; Wibral, Michael

    2014-01-01

    Information theory allows us to investigate information processing in neural systems in terms of information transfer, storage and modification. Especially the measure of information transfer, transfer entropy, has seen a dramatic surge of interest in neuroscience. Estimating transfer entropy from two processes requires the observation of multiple realizations of these processes to estimate associated probability density functions. To obtain these necessary observations, available estimators typically assume stationarity of processes to allow pooling of observations over time. This assumption however, is a major obstacle to the application of these estimators in neuroscience as observed processes are often non-stationary. As a solution, Gomez-Herrero and colleagues theoretically showed that the stationarity assumption may be avoided by estimating transfer entropy from an ensemble of realizations. Such an ensemble of realizations is often readily available in neuroscience experiments in the form of experimental trials. Thus, in this work we combine the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series. We present an efficient implementation of the approach that is suitable for the increased computational demand of the ensemble method's practical application. In particular, we use a massively parallel implementation for a graphics processing unit to handle the computationally most heavy aspects of the ensemble method for transfer entropy estimation. We test the performance and robustness of our implementation on data from numerical simulations of stochastic processes. We also demonstrate the applicability of the ensemble method to magnetoencephalographic data. 
While we mainly evaluate the proposed method for neuroscience data, we expect it to be applicable in a variety of fields that are concerned with the analysis of information transfer in complex biological, social, and
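
    For discrete data, a plug-in transfer entropy estimator with history length 1 (a much-simplified cousin of the ensemble estimator discussed above) can be sketched as:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE(X -> Y) in bits for discrete series,
    using history length 1 for both source and target."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    singles = Counter(y[:-1])
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]
        p_cond_self = pairs_yy[(y1, y0)] / singles[y0]
        te += p_joint * np.log2(p_cond_full / p_cond_self)
    return te

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 5000)
y = np.empty_like(x)
y[0] = 0
y[1:] = x[:-1]              # y copies x with a one-step lag
print(round(transfer_entropy(x, y), 2))
```

    Because y simply copies x with a lag, TE(X -> Y) approaches one bit while TE(Y -> X) stays near zero; the ensemble method above addresses the harder continuous, non-stationary case.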

  19. Efficient Transfer Entropy Analysis of Non-Stationary Neural Time Series

    PubMed Central

    Vicente, Raul; Díaz-Pernas, Francisco J.; Wibral, Michael

    2014-01-01

    Information theory allows us to investigate information processing in neural systems in terms of information transfer, storage and modification. Especially the measure of information transfer, transfer entropy, has seen a dramatic surge of interest in neuroscience. Estimating transfer entropy from two processes requires the observation of multiple realizations of these processes to estimate associated probability density functions. To obtain these necessary observations, available estimators typically assume stationarity of processes to allow pooling of observations over time. This assumption however, is a major obstacle to the application of these estimators in neuroscience as observed processes are often non-stationary. As a solution, Gomez-Herrero and colleagues theoretically showed that the stationarity assumption may be avoided by estimating transfer entropy from an ensemble of realizations. Such an ensemble of realizations is often readily available in neuroscience experiments in the form of experimental trials. Thus, in this work we combine the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series. We present an efficient implementation of the approach that is suitable for the increased computational demand of the ensemble method's practical application. In particular, we use a massively parallel implementation for a graphics processing unit to handle the computationally most heavy aspects of the ensemble method for transfer entropy estimation. We test the performance and robustness of our implementation on data from numerical simulations of stochastic processes. We also demonstrate the applicability of the ensemble method to magnetoencephalographic data. 
While we mainly evaluate the proposed method for neuroscience data, we expect it to be applicable in a variety of fields that are concerned with the analysis of information transfer in complex biological, social, and

  20. TIME SERIES ANALYSIS OF REMOTELY-SENSED TIR EMISSION: linking anomalies to physical processes

    NASA Astrophysics Data System (ADS)

    Pavlidou, E.; van der Meijde, M.; Hecker, C.; van der Werff, H.; Ettema, J.

    2013-12-01

    In the last 15 years, remote sensing has been evaluated for detecting thermal anomalies as precursors to earthquakes. Important issues that have yet to be tackled include the definition of: (a) a thermal anomaly, taking into account weather conditions, observation settings and 'natural' variability caused by background sources; (b) the length of observations required for this purpose; and (c) the location of detected anomalies, which should be physically related to the tectonic activity. To determine whether thermal anomalies are statistical noise, mere meteorological conditions, or actual earthquake-related phenomena, we apply a novel approach. We use brightness temperature (top-of-atmosphere) data from thermal infrared imagery acquired at a hypertemporal (sub-hourly) interval from geostationary weather satellites over multiple years. The length of the time series allows for analysis of meteorological effects (diurnal, seasonal or annual trends) and background variability, through the application of a combined spatial and temporal filter to distinguish extreme occurrences from trends. The definition of potential anomalies is based on statistical techniques, taking into account published (geo)physical characteristics of earthquake-related thermal anomalies. We use synthetic data to test the performance of the proposed detection method and track potential factors affecting the results. Subsequently, we apply the method to original data from Iran and Turkey, in quiescent and earthquake-struck periods alike. We present our findings with a main focus on assessing the resulting anomalies in relation to physical processes, considering: (a) meteorological effects, (b) the geographical, geological and environmental settings, and (c) physically realistic distances and potential physical relations with the activity of causative faults.
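
    A crude, temporal-only caricature of the anomaly definition, flagging pixels that deviate by more than z standard deviations from their own temporal mean, might look like this (synthetic brightness temperatures; the combined spatial and temporal filter described above is considerably more involved):

```python
import numpy as np

def anomaly_mask(cube, z=3.0):
    """Flag pixels deviating more than z sigma from their own temporal mean;
    cube has shape (time, rows, cols). A purely temporal filter."""
    mu = cube.mean(axis=0)
    sigma = cube.std(axis=0)
    return np.abs(cube - mu) > z * sigma

rng = np.random.default_rng(0)
cube = rng.normal(290.0, 1.0, size=(500, 8, 8))   # synthetic background, in K
cube[250, 4, 4] += 8.0                            # implanted "anomaly"
hits = anomaly_mask(cube)
print(bool(hits[250, 4, 4]))
```

    Hypertemporal records make the per-pixel mean and spread well constrained, which is what lets genuine extremes be separated from diurnal and seasonal trends.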

  1. A Deliberate Practice Approach to Teaching Phylogenetic Analysis

    PubMed Central

    Hobbs, F. Collin; Johnson, Daniel J.; Kearns, Katherine D.

    2013-01-01

    One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or “one-shot,” in-class activities. Using a deliberate practice instructional approach, we designed a set of five assignments for a 300-level plant systematics course that incrementally introduces the concepts and skills used in phylogenetic analysis. In our assignments, students learned the process of constructing phylogenetic trees through a series of increasingly difficult tasks; thus, skill development served as a framework for building content knowledge. We present results from 5 yr of final exam scores, pre- and postconcept assessments, and student surveys to assess the impact of our new pedagogical materials on student performance related to constructing and interpreting phylogenetic trees. Students improved in their ability to interpret relationships within trees and improved in several aspects related to between-tree comparisons and tree construction skills. Student feedback indicated that most students believed our approach prepared them to engage in tree construction and gave them confidence in their abilities. Overall, our data confirm that instructional approaches implementing deliberate practice address student misconceptions, improve student experiences, and foster deeper understanding of difficult scientific concepts. PMID:24297294

  2. Time series analysis and feature extraction techniques for structural health monitoring applications

    NASA Astrophysics Data System (ADS)

    Overbey, Lucas A.

    Recently, advances in sensing and sensing methodologies have led to the deployment of multiple sensor arrays on structures for structural health monitoring (SHM) applications. Appropriate feature extraction, detection, and classification methods based on measurements obtained from these sensor networks are vital to the SHM paradigm. This dissertation focuses on a multi-input/multi-output approach to novel data processing procedures to produce detailed information about the integrity of a structure in near real-time. The studies employ nonlinear time series analysis techniques to extract three different types of features for damage diagnostics: namely, nonlinear prediction error, transfer entropy, and the generalized interdependence. These features form reliable measures of generalized correlations between multiple measurements to capture aspects of the dynamics related to the presence of damage. Several analyses are conducted on each of these features. Specifically, variations of nonlinear prediction error are introduced, analyzed, and validated, including the use of a stochastic excitation to augment generality, introduction of local state-space models for sensitivity enhancement, and the employment of comparisons between multiple measurements for localization capability. A modification and enhancement to transfer entropy is created and validated for improved sensitivity. In addition, a thorough analysis of the effects of variability to transfer entropy estimation is made. The generalized interdependence is introduced into the literature and validated as an effective measure of damage presence, extent, and location. These features are validated on a multi-degree-of-freedom dynamic oscillator and several different frame experiments. The evaluated features are then fed into four different classification schemes to obtain a concurrent set of outputs that categorize the integrity of the structure, e.g. the presence, extent, location, and type of damage, taking

  3. Towards Solving the Mixing Problem in the Decomposition of Geophysical Time Series by Independent Component Analysis

    NASA Technical Reports Server (NTRS)

    Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)

    2000-01-01

    The use of the Principal Component Analysis technique for the analysis of geophysical time series has been questioned, in particular for its tendency to extract components that mix several physical phenomena even when the signal is just their linear sum. We demonstrate with a data simulation experiment that Independent Component Analysis, a recently developed technique, is able to solve this problem. This new technique requires the statistical independence of components (a stronger constraint that uses higher-order statistics) instead of the classical decorrelation (a weaker constraint that uses only second-order statistics). Furthermore, ICA does not require additional a priori information such as the localization constraint used in Rotational Techniques.
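A toy version of this kind of simulation experiment can be run with scikit-learn's FastICA (the two sources and the mixing matrix below are invented for illustration): two independent sources are mixed linearly, and ICA recovers them up to order and sign, which plain decorrelation cannot guarantee.

```python
import numpy as np
from sklearn.decomposition import FastICA

# two independent sources: a sinusoid and a square wave
rng = np.random.default_rng(1)
t = np.linspace(0, 8, 2000)
s1 = np.sin(2 * np.pi * t)
s2 = np.sign(np.sin(3 * np.pi * t))
S = np.c_[s1, s2]

# linear mixture, as in the "signal is just their linear sum" setting
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
X = S @ A.T

ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)  # recovered sources (order/sign arbitrary)

# match each recovered component to the true source it correlates with most
corr = np.abs(np.corrcoef(S.T, S_hat.T)[:2, 2:])
```

With clean synthetic mixtures each recovered component should correlate almost perfectly with exactly one true source, whereas the leading PCA component of `X` would mix both.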

  4. Time series analysis of long-term data sets of atmospheric mercury concentrations.

    PubMed

    Temme, Christian; Ebinghaus, Ralf; Einax, Jürgen W; Steffen, Alexandra; Schroeder, William H

    2004-10-01

    Different aspects and techniques of time series analysis were used to investigate long-term data sets of atmospheric mercury in the Northern Hemisphere. Two perennial time series from different latitudes with different seasonal behaviour were chosen: first, Mace Head on the west coast of Ireland (53 degrees 20'N, 9 degrees 54'W), representing Northern Hemispherical background conditions in Europe with no indications for so-called atmospheric mercury depletion events (AMDEs); and second, Alert, Canada (82 degrees 28'N, 62 degrees 30'W), showing strong AMDEs during Arctic springtime. Possible trends were extracted and forecasts were performed by using seasonal decomposition procedures, autoregressive integrated moving average (ARIMA) methods and exponential smoothing (ES) techniques. The application of time series analysis to environmental data is demonstrated for atmospheric long-term data sets, and selected advantages are discussed. Neither time series has shown a statistically significant temporal trend in the gaseous elemental mercury (GEM) concentrations since 1995, representing low Northern Hemispherical background concentrations of 1.72+/-0.09 ng m(-3) (Mace Head) and 1.55+/-0.18 ng m(-3) (Alert), respectively. The annual forecasts for the GEM concentrations in 2001 at Alert by two different techniques were in good agreement with the measured concentrations for this year.
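The seasonal-decomposition step can be illustrated with a minimal numpy sketch. This is a stand-in for the full ARIMA/exponential-smoothing machinery used in the study; the synthetic monthly GEM-like series and all names are invented. The monthly climatology is removed and the remaining trend slope is checked against zero, mirroring the "no significant trend" finding.

```python
import numpy as np

def decompose_monthly(x):
    """Classical seasonal decomposition for a monthly series:
    returns (trend_slope_per_month, seasonal_component, residual)."""
    x = np.asarray(x, float)
    n = len(x)
    month = np.arange(n) % 12
    # seasonal component: mean of each calendar month, centred on zero
    seasonal = np.array([x[month == m].mean() for m in range(12)])
    seasonal -= seasonal.mean()
    deseason = x - seasonal[month]
    # linear trend of the deseasonalized series
    slope, intercept = np.polyfit(np.arange(n), deseason, 1)
    resid = deseason - (slope * np.arange(n) + intercept)
    return slope, seasonal[month], resid

# six years of synthetic monthly means: flat background + seasonality + noise
rng = np.random.default_rng(2)
n = 72
gem = 1.72 + 0.05 * np.sin(2 * np.pi * np.arange(n) / 12) + rng.normal(0, 0.02, n)
slope, seas, resid = decompose_monthly(gem)
```

For a trend-free series the fitted slope should be statistically indistinguishable from zero, while the seasonal component recovers the injected annual cycle.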

  5. CANONICAL CORRELATION ANALYSIS BETWEEN TIME SERIES AND STATIC OUTCOMES, WITH APPLICATION TO THE SPECTRAL ANALYSIS OF HEART RATE VARIABILITY.

    PubMed

    Krafty, Robert T; Hall, Martica

    2013-03-01

    Although many studies collect biomedical time series signals from multiple subjects, there is a dearth of models and methods for assessing the association between frequency domain properties of time series and other study outcomes. This article introduces the random Cramér representation as a joint model for collections of time series and static outcomes where power spectra are random functions that are correlated with the outcomes. A canonical correlation analysis between cepstral coefficients and static outcomes is developed to provide a flexible yet interpretable measure of association. Estimates of the canonical correlations and weight functions are obtained from a canonical correlation analysis between the static outcomes and maximum Whittle likelihood estimates of truncated cepstral coefficients. The proposed methodology is used to analyze the association between the spectrum of heart rate variability and measures of sleep duration and fragmentation in a study of older adults who serve as the primary caregiver for their ill spouse.

  7. Quantifying surface water-groundwater interactions using time series analysis of streambed thermal records: Method development

    USGS Publications Warehouse

    Hatch, C.E.; Fisher, A.T.; Revenaugh, J.S.; Constantz, J.; Ruehl, C.

    2006-01-01

    We present a method for determining streambed seepage rates using time series thermal data. The new method is based on quantifying changes in phase and amplitude of temperature variations between pairs of subsurface sensors. For a reasonable range of streambed thermal properties and sensor spacings, the time series method should allow reliable estimation of seepage rates over a range of at least ±10 m d-1 (±1.2 × 10-4 m s-1), with amplitude variations being most sensitive at low flow rates and phase variations retaining sensitivity out to much higher rates. Compared to forward modeling, the new method requires less observational data and less setup and data handling and is faster, particularly when interpreting many long data sets. The time series method is insensitive to streambed scour and sedimentation, which allows for application under a wide range of flow conditions and allows time series estimation of variable streambed hydraulic conductivity. This new approach should facilitate wider use of thermal methods and improve understanding of the complex spatial and temporal dynamics of surface water-groundwater interactions. Copyright 2006 by the American Geophysical Union.
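The first step of such a method, extracting the amplitude ratio and phase lag of the diurnal signal between a sensor pair, can be sketched as below. The function names and the synthetic damped, delayed sensor records are illustrative; converting the amplitude ratio and phase lag into a seepage rate requires the heat-transport relations developed in the paper.

```python
import numpy as np

def diurnal_amp_phase(temp, dt_hours, period_hours=24.0):
    """Amplitude and phase lag (radians, relative to a cosine at t=0)
    of the diurnal harmonic of a temperature record sampled every dt_hours."""
    temp = np.asarray(temp, float) - np.mean(temp)
    t = np.arange(len(temp)) * dt_hours
    w = 2 * np.pi / period_hours
    # least-squares fit of a*cos(wt) + b*sin(wt)
    G = np.c_[np.cos(w * t), np.sin(w * t)]
    (a, b), *_ = np.linalg.lstsq(G, temp, rcond=None)
    # for A*cos(w*(t - t0)): a = A*cos(w*t0), b = A*sin(w*t0)
    return np.hypot(a, b), np.arctan2(b, a)

# synthetic pair: the deeper sensor sees a damped, delayed diurnal wave
dt = 0.25  # hours between samples
t = np.arange(0, 10 * 24, dt)
shallow = 15 + 3.0 * np.cos(2 * np.pi * t / 24)
deep = 15 + 1.2 * np.cos(2 * np.pi * (t - 2.5) / 24)  # 2.5 h lag

A1, p1 = diurnal_amp_phase(shallow, dt)
A2, p2 = diurnal_amp_phase(deep, dt)
Ar = A2 / A1                                             # amplitude ratio
lag_hours = ((p2 - p1) % (2 * np.pi)) * 24 / (2 * np.pi)  # phase lag
```

As the abstract notes, the amplitude ratio `Ar` carries most of the information at low seepage rates, while the phase lag remains sensitive at higher rates.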

  8. Fractal time series analysis of postural stability in elderly and control subjects

    PubMed Central

    Amoud, Hassan; Abadi, Mohamed; Hewson, David J; Michel-Pellegrino, Valérie; Doussot, Michel; Duchêne, Jacques

    2007-01-01

    Background The study of balance using stabilogram analysis is of particular interest in the study of falls. Although simple statistical parameters derived from the stabilogram have been shown to predict risk of falls, such measures offer little insight into the underlying control mechanisms responsible for degradation in balance. In contrast, fractal and non-linear time-series analysis of stabilograms, such as estimations of the Hurst exponent (H), may provide information related to the underlying motor control strategies governing postural stability. In order to be adapted for a home-based follow-up of balance, such methods need to be robust, regardless of the experimental protocol, while producing time-series that are as short as possible. The present study compares two methods of calculating H: Detrended Fluctuation Analysis (DFA) and Stabilogram Diffusion Analysis (SDA) for elderly and control subjects, as well as evaluating the effect of recording duration. Methods Centre of pressure signals were obtained from 90 young adult subjects and 10 elderly subjects. Data were sampled at 100 Hz for 30 s, including stepping onto and off the force plate. Estimations of H were made using sliding windows of 10, 5, and 2.5 s durations, with windows slid forward in 1-s increments. Multivariate analysis of variance was used to test for the effect of time, age and estimation method on the Hurst exponent, while the intra-class correlation coefficient (ICC) was used as a measure of reliability. Results Both SDA and DFA methods were able to identify differences in postural stability between control and elderly subjects for time series as short as 5 s, with ICC values as high as 0.75 for DFA. Conclusion Both methods would be well-suited to non-invasive longitudinal assessment of balance. In addition, reliable estimations of H were obtained from time series as short as 5 s. PMID:17470303
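DFA, one of the two Hurst-exponent estimators compared in this study, reduces to a short algorithm: integrate the centred signal, detrend it in windows of increasing size, and read the scaling exponent off a log-log fit. The sketch below is an independent illustration (not the study's code or stabilogram data); it recovers the expected exponents for white noise (alpha near 0.5) and its running sum (alpha near 1.5).

```python
import numpy as np

def dfa_alpha(x, scales):
    """Detrended Fluctuation Analysis scaling exponent (approximates the
    Hurst exponent H for fractional-Gaussian-noise-like signals)."""
    x = np.asarray(x, float)
    y = np.cumsum(x - x.mean())  # integrated profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[: n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        # detrend each window with a linear fit, collect RMS fluctuation
        rms = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        F.append(np.mean(rms))
    alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return alpha

rng = np.random.default_rng(3)
white = rng.normal(size=4000)
scales = [16, 32, 64, 128, 256]
alpha_white = dfa_alpha(white, scales)            # expect ~0.5
alpha_walk = dfa_alpha(np.cumsum(white), scales)  # expect ~1.5
```

For the short (5 s at 100 Hz, i.e. 500-sample) windows used in the study, the scale list would be shortened accordingly.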

  9. Surgical Approaches to First Branchial Cleft Anomaly Excision: A Case Series

    PubMed Central

    Quintanilla-Dieck, Lourdes; Virgin, Frank; Wootten, Christopher; Goudy, Steven; Penn, Edward

    2016-01-01

    Objectives. First branchial cleft anomalies (BCAs) constitute a rare entity with variable clinical presentations and anatomic findings. Given the high rate of recurrence with incomplete excision, identification of the entire tract during surgical treatment is of paramount importance. The objectives of this paper were to present five anatomic variations of first BCAs and describe the presentation, evaluation, and surgical approach to each one. Methods. A retrospective case review and literature review were performed. We describe patient characteristics, presentation, evaluation, and surgical approach of five patients with first BCAs. Results. Age at definitive surgical treatment ranged from 8 months to 7 years. Various clinical presentations were encountered, some of which were atypical for first BCAs. All had preoperative imaging demonstrating the tract. Four surgical approaches required a superficial parotidectomy with identification of the facial nerve, one of which revealed an aberrant facial nerve. In one case the tract was found to travel into the angle of the mandible, terminating as a mandibular cyst. This required en bloc excision that included the lateral cortex of the mandible. Conclusions. First BCAs have variable presentations. Complete surgical excision can be challenging. Therefore, careful preoperative planning and the recognition of atypical variants during surgery are essential. PMID:27034873

  10. Low Cost Beam-Steering Approach for a Series-Fed Array

    NASA Technical Reports Server (NTRS)

    Host, Nicholas K.; Chen, Chi-Chih; Volakis, John L.; Miranda, Felix A.

    2013-01-01

    Phased array antennas showcase many advantages over mechanically steered systems. However, they are also more complex and costly. This paper presents a concept which overcomes these detrimental attributes by eliminating all of the phased array backend (including phase shifters). Instead, a propagation-constant-reconfigurable transmission line in a series-fed array arrangement is used to allow phase shifting with one small (≤100 mil) linear mechanical motion. A novel slotted coplanar stripline design improves on previous transmission lines by demonstrating a greater control of propagation constant, thus allowing practical prototypes to be built. Also, beam steering pattern control is explored. We show that with correct choice of line impedance, pattern control is possible for all scan angles. A 20-element array scanning over -25° ≤ θ ≤ 21° with mostly uniform gain at 13 GHz is presented. Measured patterns show a reduced scan range of 12° ≤ θ ≤ 25° due to a correctable manufacturing error, as verified by simulation. Beam squint is measured to be ±2.5° for a 600 MHz bandwidth and cross-pol is measured to be at least -15 dB.

  11. Aseismic deformation across the Hilina fault system, Hawaii, revealed by wavelet analysis of InSAR and GPS time series

    NASA Astrophysics Data System (ADS)

    Shirzaei, M.; Bürgmann, R.; Foster, J.; Walter, T. R.; Brooks, B. A.

    2013-08-01

    The Hilina Fault System (HFS) is located on the south flank of Kilauea volcano and is thought to represent the surface expression of an unstable edifice sector that is active during seismic events such as the 1975 Kalapana earthquake. Despite its potential for hazardous landsliding and associated tsunamis, no fault activity had been detected by means of modern geodetic methods since the 1975 earthquake. We present evidence from individual SAR interferograms, as well as cluster analysis and wavelet analysis of GPS and InSAR time series, which suggests differential motion across the HFS. To investigate the effect of atmospheric delay on the observed differential motion, we implement a statistical approach using wavelet transforms. We jointly analyze InSAR and continuous GPS deformation data from 2003 to 2010, to estimate the likelihood that the subtle time-dependent deformation signal about the HFS scarps is not associated with the atmospheric delay. This integrated analysis reveals localized deformation components in the InSAR deformation time series that are superimposed on the coherent motion of Kilauea's south flank. The statistical test suggests that, at the 95% confidence level, the identified differential deformation at the HFS is not due to atmospheric artifacts. Since no significant shallow seismicity is observed over the study period, we suggest that this deformation occurred aseismically.

  12. Analysis of time-series correlation between weighted lifestyle data and health data.

    PubMed

    Takeuchi, Hiroshi; Mayuzumi, Yuuki; Kodama, Naoki

    2011-01-01

    The time-series data analysis described here is based on the simple idea that the accumulation of the effects of lifestyle events, such as ingestion and exercise, could affect personal health with some delay. The delay may reflect complex bio-reactions such as those of metabolism in a human body. In the analysis, the accumulation of the effects of lifestyle events is represented by a summation of daily lifestyle data whose time-series correlation to variations of health data is examined (healthcare-data-mining). The concept of weighting is introduced for the summation of daily lifestyle data. As a result, it is suggested that the nature of personal health could be represented by a weighting pattern characterized by a small number of parameters.
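The weighted-accumulation idea can be sketched as follows. The data are purely synthetic, and the exponential one-parameter weighting is an assumed example of the small-parameter weighting patterns the abstract describes: past lifestyle values are summed with decaying weights, and the decay that maximizes the time-series correlation with the health variable is selected.

```python
import numpy as np

def accumulated(x, decay, window):
    """Weighted accumulation of the last `window` days of lifestyle data.
    Entry i covers days i..i+window-1, with the most recent day weighted
    most (exponentially decaying weights, one free parameter)."""
    w = decay ** np.arange(window)
    w /= w.sum()
    return np.convolve(x, w, mode="valid")

rng = np.random.default_rng(4)
days = 400
lifestyle = rng.normal(0, 1, days)  # e.g. daily dietary/exercise score

# health responds (with noise) to a 14-day accumulation with decay 0.7
true_acc = accumulated(lifestyle, decay=0.7, window=14)
health = 2.0 * true_acc + rng.normal(0, 0.3, true_acc.size)

# scan the weighting parameter and keep the best time-series correlation
decays = np.linspace(0.3, 0.95, 14)
corrs = [np.corrcoef(accumulated(lifestyle, d, 14), health)[0, 1]
         for d in decays]
best = decays[int(np.argmax(corrs))]
```

The recovered weighting parameter is the kind of compact personal-health descriptor the abstract refers to.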

  13. Adventures in Modern Time Series Analysis: From the Sun to the Crab Nebula and Beyond

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey

    2014-01-01

    With the generation of long, precise, and finely sampled time series, the Age of Digital Astronomy is uncovering and elucidating energetic dynamical processes throughout the Universe. Fulfilling these opportunities requires effective data analysis techniques that rapidly and automatically implement advanced concepts. The Time Series Explorer, under development in collaboration with Tom Loredo, provides tools ranging from simple but optimal histograms to time- and frequency-domain analysis for arbitrary data modes with any time sampling. Much of this development owes its existence to Joe Bredekamp and the encouragement he provided over several decades. Sample results for solar chromospheric activity, gamma-ray activity in the Crab Nebula, active galactic nuclei and gamma-ray bursts will be displayed.

  14. Parametric time-series analysis of daily air pollutants of city of Shumen, Bulgaria

    NASA Astrophysics Data System (ADS)

    Ivanov, A.; Voynikova, D.; Gocheva-Ilieva, S.; Boyadzhiev, D.

    2012-10-01

    Urban air pollution is one of the main factors determining ambient air quality, which affects human health and the environment. In this paper, parametric time series models are obtained for studying the distribution over time of primary pollutants such as sulphur and nitrogen oxides and particulate matter, and of a secondary pollutant, ground-level ozone, in the town of Shumen, Bulgaria. The methods of factor analysis and ARIMA are used to carry out the time series analysis based on hourly average data in 2011 and the first quarter of 2012. The constructed models are applied for short-term air pollution forecasting. The results are estimated on the basis of national and European regulation indices. The sources of pollutants in the region and their harmful effects on human health are also discussed.

  15. A nonlinear modeling approach using weighted piecewise series and its applications to predict unsteady flows

    NASA Astrophysics Data System (ADS)

    Yao, Weigang; Liou, Meng-Sing

    2016-08-01

    To preserve nonlinearity of a full-order system over a range of parameters of interest, we propose an accurate and robust nonlinear modeling approach by assembling a set of piecewise linear local solutions expanded about some sampling states. The work by Rewienski and White [1] on micromachined devices inspired our use of piecewise linear local solutions to study nonlinear unsteady aerodynamics. These local approximations are assembled via nonlinear weights of radial basis functions. The efficacy of the proposed procedure is validated for a two-dimensional airfoil moving with different pitching motions, specifically AGARD's CT2 and CT5 problems [27], in which the flows exhibit different nonlinear behaviors. Furthermore, application of the developed aerodynamic model to a two-dimensional aero-elastic system proves the approach is capable of predicting limit cycle oscillations (LCOs) by using AGARD's CT6 [28] as a benchmark test. All results, based on inviscid solutions, confirm that our nonlinear model is stable and accurate, against the full model solutions and measurements, and for predicting not only aerodynamic forces but also detailed flowfields. Moreover, the model is robust for inputs that considerably depart from the base trajectory in form and magnitude. This modeling provides a very efficient way for predicting unsteady flowfields with varying parameters because it needs only a tiny fraction of the cost of a full-order modeling for each new condition: the more cases studied, the more savings rendered. Hence, the present approach is especially useful for parametric studies, such as in the case of design optimization and exploration of flow phenomena.
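The assembly of piecewise linear local solutions via nonlinear RBF weights can be illustrated in one dimension. This is a deliberately simple stand-in for the full-order aerodynamic setting, and all names are hypothetical: tangent-line models of y = x², expanded about three "sampling states", are blended with normalized Gaussian weights.

```python
import numpy as np

def rbf_blend(x, centers, models, eps=1.0):
    """Blend local linear models y ~ a_i*x + b_i with normalized
    Gaussian RBF weights centred at the sampling states."""
    w = np.exp(-(eps * (x - centers[:, None])) ** 2)  # (n_models, n_points)
    w /= w.sum(axis=0)                                # weights sum to 1 per x
    local = models[:, 0:1] * x + models[:, 1:2]       # each row: (slope, b)
    return (w * local).sum(axis=0)

# local linearizations of y = x**2 about x0 in {-1, 0, 1}:
# tangent line at x0 is y ~ 2*x0*x - x0**2
centers = np.array([-1.0, 0.0, 1.0])
models = np.array([[2 * c, -c * c] for c in centers])

x = np.linspace(-1.2, 1.2, 200)
y_hat = rbf_blend(x, centers, models, eps=2.0)
err = np.max(np.abs(y_hat - x ** 2))
```

Near each sampling state the blend reproduces the local model almost exactly; between states the nonlinear weights interpolate, which is the mechanism that lets the assembled model track a nonlinear response away from any single linearization.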

  16. Flood Frequency Analysis For Partial Duration Series In Ganjiang River Basin

    NASA Astrophysics Data System (ADS)

    Sun, Zhangli; Zhu, Xiufang; Pan, Yaozhong

    2016-04-01

    Accurate estimation of flood frequency is key to effective, nationwide flood damage abatement programs. The partial duration series (PDS) method is widely used in hydrologic studies because it considers all events above a certain threshold level as compared to the annual maximum series (AMS) method, which considers only the annual maximum value. However, the PDS has a drawback in that it is difficult to define the thresholds and maintain an independent and identical distribution of the partial duration time series; this drawback is discussed in this paper. The Ganjiang River is the seventh largest tributary of the Yangtze River, the longest river in China. The Ganjiang River covers a drainage area of 81,258 km2 at the Wanzhou hydrologic station as the basin outlet. In this work, 56 years of daily flow data (1954-2009) from the Wanzhou station were used to analyze flood frequency, and the Pearson-III model was employed as the hydrologic probability distribution. Generally, three tasks were accomplished: (1) the threshold of PDS by percentile rank of daily runoff was obtained; (2) trend analysis of the flow series was conducted using PDS; and (3) flood frequency analysis was conducted for partial duration flow series. The results showed a slight upward trend of the annual runoff in the Ganjiang River basin. The maximum flow with a 0.01 exceedance probability (corresponding to a 100-year flood peak under stationary conditions) was 20,000 m3/s, while that with a 0.1 exceedance probability was 15,000 m3/s. These results will serve as a guide to hydrological engineering planning, design, and management for policymakers and decision makers associated with hydrology.
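The percentile-threshold step of building a PDS can be sketched as below. Synthetic gamma-distributed daily flows, an assumed 98th-percentile threshold, and a simple 7-day declustering rule stand in for the paper's actual choices; the declustering is what addresses the independence drawback the abstract discusses.

```python
import numpy as np

def peaks_over_threshold(q, pct=98.0, min_sep=7):
    """Extract a partial-duration series: peaks above the `pct` percentile
    of daily flow, declustered so retained peaks are at least `min_sep`
    days apart (a crude independence criterion)."""
    q = np.asarray(q, float)
    thr = np.percentile(q, pct)
    idx = np.where(q > thr)[0]
    peaks = []
    for i in idx:
        if peaks and i - peaks[-1] < min_sep:
            # same flood event: keep whichever day has the larger flow
            if q[i] > q[peaks[-1]]:
                peaks[-1] = i
        else:
            peaks.append(i)
    return thr, np.array(peaks)

rng = np.random.default_rng(5)
daily = rng.gamma(2.0, 500.0, size=20 * 365)  # synthetic daily flow, m3/s
thr, peaks = peaks_over_threshold(daily, pct=98.0, min_sep=7)
```

Fitting the Pearson-III distribution to `daily[peaks]` would then give the quantile estimates (e.g. the 0.01-exceedance flow) reported in the study.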

  17. Data Rods: High Speed, Time-Series Analysis of Massive Cryospheric Data Sets Using Object-Oriented Database Methods

    NASA Astrophysics Data System (ADS)

    Liang, Y.; Gallaher, D. W.; Grant, G.; Lv, Q.

    2011-12-01

    Change over time is the central driver of climate change detection. The goal is to diagnose the underlying causes and make projections into the future. In an effort to optimize this process we have developed the Data Rod model, an object-oriented approach that provides the ability to query grid cell changes and their relationships to neighboring grid cells through time. The time series data is organized in time-centric structures called "data rods." A single data rod can be pictured as the multi-spectral data history at one grid cell: a vertical column of data through time. This resolves the long-standing problem of managing time-series data and opens new possibilities for temporal data analysis. This structure enables rapid time-centric analysis at any grid cell across multiple sensors and satellite platforms. Collections of data rods can be spatially and temporally filtered, statistically analyzed, and aggregated for use with pattern matching algorithms. Likewise, individual image pixels can be extracted to generate multi-spectral imagery at any spatial and temporal location. The Data Rods project has created a series of prototype databases to store and analyze massive datasets containing multi-modality remote sensing data. Using object-oriented technology, this method overcomes the operational limitations of traditional relational databases. To demonstrate the speed and efficiency of time-centric analysis using the Data Rods model, we have developed a sea ice detection algorithm. This application determines the concentration of sea ice in a small spatial region across a long temporal window. If performed using traditional analytical techniques, this task would typically require extensive data downloads and spatial filtering. Using Data Rods databases, the exact spatio-temporal data set is immediately available. No extraneous data is downloaded, and all selected data querying occurs transparently on the server side. Moreover, fundamental statistical

  18. Wavelet analysis for the 38-year time series of the Earth's Oblateness from SLR

    NASA Astrophysics Data System (ADS)

    Cheng, M.; Tapley, B. D.

    2013-12-01

    The long-term J2 time series contains a broad spectrum of signals produced by global mass transport between the atmosphere, ocean and solid earth. Except for the secular and tidal variations, the variations in J2 are climate related, with a stochastic (non-harmonic) behavior. In addition, the variations in J2 due to the 18.6-year tides in the ocean and solid earth appear different in the time domain, and have different amplitude and phase. To improve our understanding of the nature of these variations, it is necessary to distinguish the signatures of the different frequency components in the time domain. To deal with signals of varying amplitude and phase, wavelet analysis is a suitable technique for time series analysis: it decomposes the signals into individual high- and low-frequency components in the time domain. In this study, the discrete Meyer wavelet (dmey) was applied to analyze the 38-year time series of J2 variations (spanning the interval from May 1975) in order to characterize the interannual and decadal variations. Particular attention is given to the nature of the variations in J2 caused by errors in the models of the 18.6-year ocean tide and the frequency-dependent solid earth tides, as revealed by the wavelet analysis.
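The separation into high- and low-frequency components in the time domain can be sketched with a dependency-free Haar transform. The study uses the discrete Meyer wavelet (typically via a wavelet toolbox such as PyWavelets); Haar is substituted here so the sketch runs with numpy alone, and the two-tone test signal is invented.

```python
import numpy as np

def haar_decompose(x, levels):
    """Multi-level Haar wavelet decomposition.
    Returns (approximation, [detail_level1, ..., detail_levelN])."""
    a = np.asarray(x, float)
    details = []
    for _ in range(levels):
        a_even, a_odd = a[0::2], a[1::2]
        details.append((a_even - a_odd) / np.sqrt(2))  # high-frequency part
        a = (a_even + a_odd) / np.sqrt(2)              # low-frequency part
    return a, details

def haar_reconstruct(a, details):
    """Invert haar_decompose exactly (perfect reconstruction)."""
    for d in reversed(details):
        out = np.empty(2 * len(a))
        out[0::2] = (a + d) / np.sqrt(2)
        out[1::2] = (a - d) / np.sqrt(2)
        a = out
    return a

# a slow "tidal-like" oscillation plus faster interannual-like wiggles
t = np.arange(512)
x = np.sin(2 * np.pi * t / 223) + 0.3 * np.sin(2 * np.pi * t / 12)
approx, det = haar_decompose(x, levels=4)
x_rec = haar_reconstruct(approx, det)
```

The coarse approximation tracks the slow component while the detail bands carry the faster wiggles, which is the time-domain frequency separation the abstract relies on.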

  19. Developing a complex independent component analysis technique to extract non-stationary patterns from geophysical time-series

    NASA Astrophysics Data System (ADS)

    Forootan, Ehsan; Kusche, Jürgen

    2016-04-01

    Geodetic/geophysical observations, such as the time series of global terrestrial water storage change or sea level and temperature change, represent samples of physical processes and therefore contain information about complex physical interactions with many inherent time scales. Extracting relevant information from these samples, for example quantifying the seasonality of a physical process or its variability due to large-scale ocean-atmosphere interactions, is not possible with simple time series approaches. In the last decades, decomposition techniques have found increasing interest for extracting patterns from geophysical observations. Traditionally, principal component analysis (PCA) and more recently independent component analysis (ICA) are common techniques to extract statistically orthogonal (uncorrelated) and independent modes that represent the maximum variance of observations, respectively. PCA and ICA can be classified as stationary signal decomposition techniques since they are based on decomposing the auto-covariance matrix or diagonalizing higher (than two)-order statistical tensors from centered time series. However, the stationarity assumption is obviously not justifiable for many geophysical and climate variables, even after removing cyclic components, e.g., the seasonal cycles. In this paper, we present a new decomposition method, the complex independent component analysis (CICA, Forootan, PhD-2014), which can be applied to extract non-stationary (changing in space and time) patterns from geophysical time series. Here, CICA is derived as an extension of real-valued ICA (Forootan and Kusche, JoG-2012), where we (i) define a new complex data set using a Hilbert transformation. The complex time series contain the observed values in their real part, and the temporal rate of variability in their imaginary part. (ii) An ICA algorithm based on diagonalization of fourth-order cumulants is then applied to decompose the new complex data set in (i
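Step (i), forming the complex data set, amounts to taking the analytic signal of each series: the real part is the observation and the imaginary part (the Hilbert transform) carries the rate/phase information. Below is a minimal FFT-based version, equivalent in intent to scipy.signal.hilbert; the test signal is illustrative.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT: zero out negative frequencies,
    double the positive ones. real part = original series,
    imaginary part = its Hilbert transform."""
    x = np.asarray(x, float)
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1
    if n % 2 == 0:
        h[n // 2] = 1
        h[1 : n // 2] = 2
    else:
        h[1 : (n + 1) // 2] = 2
    return np.fft.ifft(X * h)

t = np.linspace(0, 1, 1024, endpoint=False)
x = np.cos(2 * np.pi * 8 * t)   # stand-in for one observed series
z = analytic_signal(x)          # complex series fed to the complex ICA
```

For a cosine the Hilbert transform is the corresponding sine, so the complex series rotates at the signal's instantaneous phase, which is exactly the non-stationarity information CICA exploits.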

  20. Rapid space trajectory generation using a Fourier series shape-based approach

    NASA Astrophysics Data System (ADS)

    Taheri, Ehsan

    With the insatiable curiosity of human beings to explore the universe and our solar system, it is essential to benefit from larger propulsion capabilities to execute efficient transfers and carry more scientific equipment. In the field of space trajectory optimization, fundamental advances in using low-thrust propulsion and exploiting multi-body dynamics have played a pivotal role in designing efficient space mission trajectories. The former provides larger cumulative momentum change in comparison with conventional chemical propulsion, whereas the latter results in almost ballistic trajectories with a negligible amount of propellant. However, the problem of space trajectory design translates into an optimal control problem which is, in general, time-consuming and very difficult to solve. Therefore, the goal of the thesis is to address the above problem by developing a methodology to simplify and facilitate the process of finding initial low-thrust trajectories in both two-body and multi-body environments. This initial solution will not only provide mission designers with a better understanding of the problem and solution but also serve as a good initial guess for high-fidelity optimal control solvers and increase their convergence rate. Almost all of the high-fidelity solvers benefit from an initial guess that already satisfies the equations of motion and some of the most important constraints. Despite the nonlinear nature of the problem, a robust technique is sought for a wide range of typical low-thrust transfers with reduced computational intensity. Another important aspect of our developed methodology is the representation of low-thrust trajectories by Fourier series, which significantly reduces the number of design variables. Emphasis is given to simplifying the equations of motion to the extent possible and avoiding approximation of the controls. These facts contribute to speeding up the solution finding procedure.
Several example

  1. Surgery-first orthognathic approach case series: Salient features and guidelines.

    PubMed

    Gandedkar, Narayan H; Chng, Chai Kiat; Tan, Winston

    2016-01-01

    Conventional orthognathic surgery treatment involves a prolonged period of orthodontic treatment (pre- and post-surgery), making the total treatment period of 3-4 years too exhaustive. The surgery-first orthognathic approach (SFOA) sees orthognathic surgery being carried out first, followed by orthodontic treatment to align the teeth and occlusion. Following orthognathic surgery, a period of rapid metabolic activity within tissues ensues, known as the regional acceleratory phenomenon (RAP). By performing surgery first, RAP can be harnessed to facilitate efficient orthodontic treatment. This phenomenon is believed to be a key factor in the notable reduction in treatment duration using SFOA. This article presents two cases treated with SFOA with emphasis on "case selection, treatment strategy, merits, and limitations" of SFOA. Further, a comparison of the salient features of "conventional orthognathic surgery" and "SFOA", along with an overview of the author's SFOA treatment protocol, is presented.

  2. Surgery-first orthognathic approach case series: Salient features and guidelines

    PubMed Central

    Gandedkar, Narayan H; Chng, Chai Kiat; Tan, Winston

    2016-01-01

    Conventional orthognathic surgery treatment involves a prolonged period of orthodontic treatment (pre- and post-surgery), making the total treatment period of 3–4 years too exhaustive. The surgery-first orthognathic approach (SFOA) sees orthognathic surgery being carried out first, followed by orthodontic treatment to align the teeth and occlusion. Following orthognathic surgery, a period of rapid metabolic activity within tissues ensues, known as the regional acceleratory phenomenon (RAP). By performing surgery first, RAP can be harnessed to facilitate efficient orthodontic treatment. This phenomenon is believed to be a key factor in the notable reduction in treatment duration using SFOA. This article presents two cases treated with SFOA with emphasis on “case selection, treatment strategy, merits, and limitations” of SFOA. Further, a comparison of the salient features of “conventional orthognathic surgery” and “SFOA”, along with an overview of the author's SFOA treatment protocol, is presented. PMID:26998476

  3. An improved time series approach for estimating groundwater recharge from groundwater level fluctuations

    NASA Astrophysics Data System (ADS)

    Cuthbert, M. O.

    2010-09-01

    An analytical solution to a linearized Boussinesq equation is extended to develop an expression for groundwater drainage using estimates of aquifer parameters. This is then used to develop an improved water table fluctuation (WTF) technique for estimating groundwater recharge. The resulting method extends the standard WTF technique by making it applicable in areas with smoothly varying water tables, provided the aquifer properties of the area are relatively well known, and by removing the reliance on precipitation data. The method is validated against numerical simulations and a case study from a catchment where recharge is "known" a priori from other means. The approach may also be inverted to provide initial estimates of aquifer parameters in areas where recharge can be reliably estimated by other methods.
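    The WTF idea described here can be sketched numerically. A minimal sketch, assuming the linearized drainage model dh/dt = R/Sy - k*h (the function name, parameter values, and synthetic series below are illustrative, not from the paper):

```python
import numpy as np

def wtf_recharge(t, h, sy, k):
    """Water-table-fluctuation recharge estimate with linearized drainage.

    Assumes heads follow dh/dt = R/sy - k*h (a linearized Boussinesq
    approximation), so recharge per interval is the storage change
    plus the drainage loss: R = sy * (dh/dt + k*h).
    """
    t, h = np.asarray(t, float), np.asarray(h, float)
    dt = np.diff(t)
    return sy * (np.diff(h) / dt + k * h[:-1])

# synthetic check: generate heads from a known constant recharge
sy, k, R_true = 0.1, 0.01, 0.002     # specific yield, drainage const (1/d), m/d
t = np.arange(0.0, 200.0)            # daily observations
h = np.empty_like(t); h[0] = 1.0
for i in range(t.size - 1):
    h[i + 1] = h[i] + (R_true / sy - k * h[i])   # forward Euler, dt = 1 day

R_est = wtf_recharge(t, h, sy, k)
```

    Inverting the same relation for sy or k, given recharge estimated independently, corresponds to the inverse use of the approach mentioned at the end of the abstract.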

  4. Surgery-first orthognathic approach case series: Salient features and guidelines.

    PubMed

    Gandedkar, Narayan H; Chng, Chai Kiat; Tan, Winston

    2016-01-01

    Conventional orthognathic surgery involves a prolonged period of orthodontic treatment (pre- and post-surgery), making the total treatment period of 3-4 years exhausting. The surgery-first orthognathic approach (SFOA) sees orthognathic surgery carried out first, followed by orthodontic treatment to align the teeth and occlusion. Following orthognathic surgery, a period of rapid metabolic activity within tissues ensues, known as the regional acceleratory phenomenon (RAP). By performing surgery first, RAP can be harnessed to facilitate efficient orthodontic treatment. This phenomenon is believed to be a key factor in the notable reduction in treatment duration with SFOA. This article presents two cases treated with SFOA, with emphasis on case selection, treatment strategy, merits, and limitations of SFOA. Further, a comparison of the salient features of conventional orthognathic surgery and SFOA is presented, together with an overview of the authors' SFOA treatment protocol. PMID:26998476

  5. Time series analysis of satellite derived surface temperature for Lake Garda

    NASA Astrophysics Data System (ADS)

    Pareeth, Sajid; Metz, Markus; Rocchini, Duccio; Salmaso, Nico; Neteler, Markus

    2014-05-01

    Remotely sensed satellite imagery is the most suitable tool for researchers around the globe to complement in-situ observations. Nonetheless, it is crucial to check quality, validate, and standardize the methodologies used to estimate target variables from sensor data. Satellite imagery with thermal infrared bands provides the opportunity to remotely measure temperature at a very high spatio-temporal scale. Monitoring the surface temperature of large lakes to understand thermal fluctuations over time is considered crucial in the context of global climate change. The main disadvantage of remotely sensed data is the gaps caused by clouds and aerosols. In this study we use statistically reconstructed daily land surface temperature products from MODIS (MOD11A1 and MYD11A1) at an improved spatial resolution of 250 m. The ability of the remotely sensed datasets to capture thermal variations over time is validated against historical monthly ground observations collected for Lake Garda. The correlation between the satellite LST time series (x,y,t) and the field measurements f(x,y,t) is found to be acceptable, with a correlation coefficient of 0.94. We compared multiple time series analysis methods applied to the temperature maps recorded over the last ten years (2002-2012) and monthly field measurements at two sampling points in Lake Garda. The time series methods STL (Seasonal-Trend decomposition based on Loess), DTW (Dynamic Time Warping), and BFAST (Breaks for Additive Season and Trend) are implemented and compared in their ability to derive changes in trends and seasonalities. These methods are mostly applied to time series of vegetation indices from satellite data, but seldom to thermal data because of the temporal incoherence of the data. The preliminary results show that time series methods applied to satellite data are able to reconstruct the seasons on an annual scale while giving us a
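    As a rough, numpy-only illustration of what a seasonal decomposition such as STL extracts from a monthly LST series, here is a naive trend-plus-monthly-means version (real STL uses loess and is available in packages such as statsmodels; all values below are synthetic):

```python
import numpy as np

def decompose_monthly(x):
    """Naive seasonal decomposition of a monthly series: linear trend by
    least squares, seasonal component as per-calendar-month means of the
    detrended values. A simplified stand-in for STL."""
    t = np.arange(x.size)
    slope, intercept = np.polyfit(t, x, 1)
    trend = intercept + slope * t
    detrended = x - trend
    seas12 = np.array([detrended[m::12].mean() for m in range(12)])
    seas12 -= seas12.mean()
    seasonal = np.tile(seas12, x.size // 12 + 1)[:x.size]
    return trend, seasonal, x - trend - seasonal

# synthetic 10-year monthly lake-surface-temperature series (deg C)
rng = np.random.default_rng(0)
m = np.arange(120)
lst = 12 + 0.01 * m + 7 * np.sin(2 * np.pi * m / 12) + rng.normal(0, 0.5, 120)
trend, seasonal, resid = decompose_monthly(lst)
```

    A break in the trend or a shift in the seasonal component of `resid`-cleaned series is the kind of signal BFAST is designed to locate.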

  6. Harmonic analysis of environmental time series with missing data or irregular sample spacing.

    PubMed

    Dilmaghani, Shabnam; Henry, Isaac C; Soonthornnonda, Puripus; Christensen, Erik R; Henry, Ronald C

    2007-10-15

    The Lomb periodogram and discrete Fourier transform are described and applied to harmonic analysis of two typical data sets, one air quality time series and one water quality time series. The air quality data are a 13-year series of 24-hour average particulate elemental carbon data from the IMPROVE station in Washington, D.C. The water quality data are from the stormwater monitoring network in Milwaukee, WI and cover almost 2 years of precipitation events. These data have irregular sampling periods and missing data that preclude the straightforward application of the fast Fourier transform (FFT). In both cases, an anthropogenic periodicity is identified: a 7-day weekday/weekend effect in the Washington elemental carbon series and a 1-month cycle in several constituents of stormwater. Practical aspects of applying the Lomb periodogram are discussed, particularly quantifying the effects of random noise. The proper application of the FFT to data that are irregularly spaced with missing values is demonstrated on the air quality data. Recommendations are given on when to use the Lomb periodogram and when to use the FFT. PMID:17993144
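    A minimal numpy sketch of the classic Lomb periodogram, applied to an irregularly sampled synthetic series with a 7-day cycle like the weekday/weekend effect above (scipy.signal.lombscargle offers a ready-made equivalent; the data below are synthetic, not the paper's):

```python
import numpy as np

def lomb(t, y, omegas):
    """Classic Lomb periodogram for an irregularly sampled series."""
    y = y - y.mean()
    p = np.empty(omegas.size)
    for i, w in enumerate(omegas):
        # phase offset tau makes the estimate invariant to time shifts
        tau = np.arctan2(np.sum(np.sin(2 * w * t)),
                         np.sum(np.cos(2 * w * t))) / (2 * w)
        c, s = np.cos(w * (t - tau)), np.sin(w * (t - tau))
        p[i] = 0.5 * ((y @ c) ** 2 / (c @ c) + (y @ s) ** 2 / (s @ s))
    return p

rng = np.random.default_rng(1)
# keep only 400 of 730 daily samples, at random: gappy, irregular record
t = np.sort(rng.choice(np.arange(730.0), size=400, replace=False))
y = 2.0 * np.sin(2 * np.pi * t / 7.0) + rng.normal(0, 0.5, t.size)

periods = np.linspace(2.0, 40.0, 2000)           # candidate periods (days)
pgram = lomb(t, y, 2 * np.pi / periods)          # angular frequencies
best_period = periods[np.argmax(pgram)]
```

    Unlike the FFT, nothing here requires equal spacing, which is exactly why the paper recommends the Lomb periodogram for records with missing samples.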

  7. Effective low-order models for atmospheric dynamics and time series analysis

    NASA Astrophysics Data System (ADS)

    Gluhovsky, Alexander; Grady, Kevin

    2016-02-01

    The paper focuses on two interrelated problems: developing physically sound low-order models (LOMs) for atmospheric dynamics and employing them as novel time-series models to overcome deficiencies in current atmospheric time series analysis. The first problem is warranted since arbitrary truncations in the Galerkin method (commonly used to derive LOMs) may result in LOMs that violate fundamental conservation properties of the original equations, causing unphysical behaviors such as unbounded solutions. In contrast, the LOMs we offer (G-models) are energy conserving, and some retain the Hamiltonian structure of the original equations. This work examines LOMs from recent publications to show that all of those that are physically sound can be converted to G-models, while those that cannot be converted lack energy conservation. Further, motivated by recent progress in the statistical properties of dynamical systems, we explore G-models in a new role as atmospheric time series models, since their data generating mechanisms are well in line with atmospheric dynamics. Currently used time series models, by contrast, do not specifically utilize the physics of the governing equations and involve strong statistical assumptions rarely met in real data.

  8. Early detection of metabolic and energy disorders by thermal time series stochastic complexity analysis.

    PubMed

    Lutaif, N A; Palazzo, R; Gontijo, J A R

    2014-01-01

    Maintenance of thermal homeostasis in rats fed a high-fat diet (HFD) is associated with changes in their thermal balance. The thermodynamic relationship between heat dissipation and energy storage is altered by the ingestion of a high-energy diet. Observation of thermal registers of core temperature behavior, in humans and rodents, permits identification of characteristics of the time series, such as autoreference and stationarity, that fit adequately to a stochastic analysis. To identify this change, we used, for the first time, a stochastic autoregressive model, the concepts of which match those of the physiological systems involved, applied to male HFD rats compared with age-matched male controls on a standard diet (n = 7 per group). By analyzing a recorded temperature time series, we were able to identify when thermal homeostasis was affected by the new diet. The autoregressive time series (AR) model was used to predict the occurrence of thermal homeostasis, and this model proved to be very effective in distinguishing such a physiological disorder. Thus, we infer from the results of our study that maximum entropy distribution, as a means of stochastic characterization of temperature time series registers, may be established as an important and early tool to aid in the diagnosis and prevention of metabolic diseases, owing to its ability to detect small variations in the thermal profile.
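    The autoregressive modelling step can be sketched as a least-squares AR(1) fit; the paper's stochastic model is more elaborate, and the series below is simulated, not experimental:

```python
import numpy as np

def fit_ar1(x):
    """Least-squares fit of x[t] = c + phi * x[t-1] + e[t]."""
    X = np.column_stack([np.ones(x.size - 1), x[:-1]])
    beta, *_ = np.linalg.lstsq(X, x[1:], rcond=None)
    resid = x[1:] - X @ beta
    c, phi = beta
    return c, phi, resid.std()

# simulate a core-temperature-like AR(1) series around 37 deg C
rng = np.random.default_rng(2)
n, phi_true = 5000, 0.8
x = np.empty(n); x[0] = 37.0
for i in range(1, n):
    x[i] = 37.0 * (1 - phi_true) + phi_true * x[i - 1] + rng.normal(0, 0.1)

c, phi, sigma = fit_ar1(x)
```

    A shift in the fitted (c, phi, sigma) between diet groups is the kind of change the abstract's stochastic characterization aims to detect early.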

  9. Reference manual for generation and analysis of Habitat Time Series: version II

    USGS Publications Warehouse

    Milhous, Robert T.; Bartholow, John M.; Updike, Marlys A.; Moos, Alan R.

    1990-01-01

    The selection of an instream flow requirement for water resource management often requires reviewing how the physical habitat changes through time. This review is referred to as "Time Series Analysis." The Time Series Library (TSLIB) is a group of programs to enter, transform, analyze, and display time series data for use in stream habitat assessment. A time series may be defined as a sequence of data recorded or calculated over time. Examples might be historical monthly flow, predicted monthly weighted usable area, daily electrical power generation, annual irrigation diversion, and so forth. The time series can be analyzed, both descriptively and analytically, to understand the importance of the variation in the events over time. This is especially useful in the development of instream flow needs based on habitat availability. The TSLIB group of programs assumes that you have an adequate study plan to guide you in your analysis. You need to already have knowledge about such things as time period and time step, species and life stages to consider, and appropriate comparisons or statistics to be produced and displayed or tabulated. Knowing your destination, you must first evaluate whether TSLIB can get you there. Remember, data are not answers. This publication is a reference manual for TSLIB and is intended to be a guide to the process of using the various programs in TSLIB. This manual is essentially limited to the hands-on use of the various programs. A TSLIB user interface program (called RTSM) has been developed to provide an integrated working environment, where the user has a brief on-line description of each TSLIB program with the capability to run the TSLIB program while in the user interface. For information on the RTSM program, refer to Appendix F. Before applying the computer models described herein, it is recommended that the user enroll in the short course "Problem Solving with the Instream Flow Incremental Methodology (IFIM)." 
This course is offered

  10. Microarray data analysis and mining approaches.

    PubMed

    Cordero, Francesca; Botta, Marco; Calogero, Raffaele A

    2007-12-01

    Microarray based transcription profiling is now a consolidated methodology and has widespread use in areas such as pharmacogenomics, diagnostics and drug target identification. Large-scale microarray studies are also becoming crucial to a new way of conceiving experimental biology. A main issue in microarray transcription profiling is data analysis and mining. When microarrays became a methodology of general use, considerable effort was made to produce algorithms and methods for the identification of differentially expressed genes. More recently, the focus has switched to algorithms and database development for microarray data mining. Furthermore, the evolution of microarray technology is allowing researchers to grasp the regulative nature of transcription, integrating basic expression analysis with mRNA characteristics, i.e. exon-based arrays, and with DNA characteristics, i.e. comparative genomic hybridization, single nucleotide polymorphism, tiling and promoter structure. In this article, we will review approaches used to detect differentially expressed genes and to link differential expression to specific biological functions.

  11. Subsonic flutter analysis addition to NASTRAN. [for use with CDC 6000 series digital computers

    NASA Technical Reports Server (NTRS)

    Doggett, R. V., Jr.; Harder, R. L.

    1973-01-01

    A subsonic flutter analysis capability has been developed for NASTRAN, and a developmental version of the program has been installed on the CDC 6000 series digital computers at the Langley Research Center. The flutter analysis is of the modal type, uses doublet lattice unsteady aerodynamic forces, and solves the flutter equations by using the k-method. Surface and one-dimensional spline functions are used to transform from the aerodynamic degrees of freedom to the structural degrees of freedom. Some preliminary applications of the method to a beamlike wing, a platelike wing, and a platelike wing with a folded tip are compared with existing experimental and analytical results.

  12. Sediment residence times constrained by uranium-series isotopes: A critical appraisal of the comminution approach

    NASA Astrophysics Data System (ADS)

    Handley, Heather K.; Turner, Simon; Afonso, Juan C.; Dosseto, Anthony; Cohen, Tim

    2013-02-01

    Quantifying the rates of landscape evolution in response to climate change is inhibited by the difficulty of dating the formation of continental detrital sediments. We present uranium isotope data for Cooper Creek palaeochannel sediments from the Lake Eyre Basin in semi-arid South Australia in an attempt to determine the formation ages, and hence residence times, of the sediments. To calculate the amount of recoil loss of 234U, a key input parameter used in the comminution approach, we use two suggested methods (weighted geometric and surface area measurement with an incorporated fractal correction) and typical assumed input parameter values found in the literature. The calculated recoil loss factors and comminution ages are highly dependent on the method of recoil loss factor determination used and the chosen assumptions. To appraise the ramifications of the assumptions inherent in the comminution age approach and determine the individual and combined comminution age uncertainties associated with each variable, Monte Carlo simulations were conducted for a synthetic sediment sample. Using a reasonable associated uncertainty for each input factor and including variations in the source rock and measured (234U/238U) ratios, the total combined uncertainty on comminution age in our simulation (for both methods of recoil loss factor estimation) can amount to ±220-280 ka. The modelling shows that small changes in assumed input values translate into large effects on absolute comminution age. To improve the accuracy of the technique and provide meaningful absolute comminution ages, much tighter constraints are required on the assumptions for input factors such as the fraction of α-recoil lost 234Th and the initial (234U/238U) ratio of the source material. In order to directly compare calculated comminution ages produced by different research groups, the standardisation of pre-treatment procedures, recoil loss factor estimation and assumed input parameter values
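    The uncertainty propagation described can be sketched with a small Monte Carlo over one common form of the comminution-age relation, A_m = (1 - f_a) + (A_0 - (1 - f_a)) * exp(-lambda234 * t). All input distributions below are illustrative assumptions, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(3)
lam234 = np.log(2) / 245_250.0        # 234U decay constant (1/yr)
N = 100_000

# hypothetical input distributions -- purely illustrative
f_a = rng.normal(0.20, 0.05, N)       # recoil loss factor of 234Th
A0  = rng.normal(1.000, 0.005, N)     # initial (234U/238U) of source rock
Am  = rng.normal(0.930, 0.005, N)     # measured (234U/238U)

# invert A_m = (1 - f_a) + (A0 - (1 - f_a)) * exp(-lam * t) for t
num = Am - (1 - f_a)
den = A0 - (1 - f_a)
ok = (num > 0) & (den > num)          # keep physically meaningful draws only
age = -np.log(num[ok] / den[ok]) / lam234   # comminution age in years

age_mean, age_sd = age.mean(), age.std()
```

    The large spread of `age` for modest input uncertainties mirrors the abstract's point that small changes in assumed inputs translate into large effects on absolute comminution age.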

  13. Practical overview of ARIMA models for time-series forecasting

    SciTech Connect

    Pack, D.J.

    1980-01-01

    Single series analysis methodology is illustrated. The commentary summarizes the Box-Jenkins philosophy and the ARIMA model structure, with particular emphasis on practical aspects of application, forecast interpretation, strengths, weaknesses, and comparison to other time series forecasting approaches. (GHT)

  14. Comprehensive Model of Annual Plankton Succession Based on the Whole-Plankton Time Series Approach

    PubMed Central

    Romagnan, Jean-Baptiste; Legendre, Louis; Guidi, Lionel; Jamet, Jean-Louis; Jamet, Dominique; Mousseau, Laure; Pedrotti, Maria-Luiza; Picheral, Marc; Gorsky, Gabriel; Sardet, Christian; Stemmann, Lars

    2015-01-01

    Ecological succession provides a widely accepted description of seasonal changes in phytoplankton and mesozooplankton assemblages in the natural environment, but concurrent changes in smaller (i.e. microbes) and larger (i.e. macroplankton) organisms are not included in the model because plankton ranging from bacteria to jellies are seldom sampled and analyzed simultaneously. Here we studied, for the first time in the aquatic literature, the succession of marine plankton in the whole-plankton assemblage that spanned 5 orders of magnitude in size from microbes to macroplankton predators (not including fish or fish larvae, for which no consistent data were available). Samples were collected in the northwestern Mediterranean Sea (Bay of Villefranche) weekly during 10 months. Simultaneously collected samples were analyzed by flow cytometry, inverse microscopy, FlowCam, and ZooScan. The whole-plankton assemblage underwent sharp reorganizations that corresponded to bottom-up events of vertical mixing in the water-column, and its development was top-down controlled by large gelatinous filter feeders and predators. Based on the results provided by our novel whole-plankton assemblage approach, we propose a new comprehensive conceptual model of the annual plankton succession (i.e. whole plankton model) characterized by both stepwise stacking of four broad trophic communities from early spring through summer, which is a new concept, and progressive replacement of ecological plankton categories within the different trophic communities, as recognised traditionally. PMID:25780912

  15. On the Impact of a Quadratic Acceleration Term in the Analysis of Position Time Series

    NASA Astrophysics Data System (ADS)

    Bogusz, Janusz; Klos, Anna; Bos, Machiel Simon; Hunegnaw, Addisu; Teferle, Felix Norman

    2016-04-01

    The analysis of Global Navigation Satellite System (GNSS) position time series generally assumes that each of the coordinate component series is described by the sum of a linear rate (velocity) and various periodic terms. The residuals, the deviations between the fitted model and the observations, are then a measure of the epoch-to-epoch scatter and have been used for the analysis of the stochastic character (noise) of the time series. Often the parameters of interest in GNSS position time series are the velocities and their associated uncertainties, which have to be determined with the highest reliability. It is clear that not all GNSS position time series follow this simple linear behaviour. Therefore, we have added an acceleration term in the form of a quadratic polynomial function to the model in order to better describe the non-linear motion in the position time series. This non-linear motion could be a response to purely geophysical processes, for example, elastic rebound of the Earth's crust due to ice mass loss in Greenland, artefacts due to deficiencies in bias mitigation models, for example, of the GNSS satellite and receiver antenna phase centres, or any combination thereof. In this study we have simulated 20 time series, 23 years in length, with different stochastic characteristics such as white, flicker or random walk noise. The noise amplitude was assumed to be 1 mm/yr^(-κ/4). Then, we added the deterministic part, consisting of a linear trend of 20 mm/yr (representing the average horizontal velocity) and accelerations ranging from -0.6 to +0.6 mm/yr^2. For all these data we estimated the noise parameters with Maximum Likelihood Estimation (MLE) using the Hector software package without taking the non-linear term into account. In this way we set the benchmark against which to investigate how the noise properties and velocity uncertainty may be affected by any un-modelled, non-linear term. The velocities and their uncertainties versus the accelerations for
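    The deterministic part of such a position model (intercept, velocity, acceleration, annual signal) can be estimated by ordinary least squares. The sketch below uses white noise only, whereas the study's point is precisely that coloured noise changes the uncertainties; all values are synthetic:

```python
import numpy as np

rng = np.random.default_rng(8)
t = np.arange(0.0, 23.0, 7 / 365.25)        # 23 years of weekly positions (yr)
v_true, a_true = 20.0, 0.4                  # mm/yr and mm/yr^2
y = (v_true * t + 0.5 * a_true * t**2
     + 2.0 * np.sin(2 * np.pi * t)          # annual signal, 2 mm amplitude
     + rng.normal(0, 1.0, t.size))          # white noise only (simplification)

# design matrix: intercept, velocity, acceleration, annual sine/cosine
G = np.column_stack([np.ones_like(t), t, 0.5 * t**2,
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
m, *_ = np.linalg.lstsq(G, y, rcond=None)
v_est, a_est = m[1], m[2]
```

    Omitting the 0.5*t^2 column from G while it is present in the data is the "un-modelled non-linear term" whose effect on velocity and noise estimates the study quantifies.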

  16. Applications and development of new algorithms for displacement analysis using InSAR time series

    NASA Astrophysics Data System (ADS)

    Osmanoglu, Batuhan

    Time series analysis of Synthetic Aperture Radar Interferometry (InSAR) data has become an important scientific tool for monitoring and measuring the displacement of Earth's surface due to a wide range of phenomena, including earthquakes, volcanoes, landslides, changes in ground water levels, and wetlands. Time series analysis is a product of interferometric phase measurements, which become ambiguous when the observed motion is larger than half of the radar wavelength. Thus, phase observations must first be unwrapped in order to obtain physically meaningful results. Persistent Scatterer Interferometry (PSI), Stanford Method for Persistent Scatterers (StaMPS), Short Baselines Interferometry (SBAS) and Small Temporal Baseline Subset (STBAS) algorithms solve for this ambiguity using a series of spatio-temporal unwrapping algorithms and filters. In this dissertation, I improve upon current phase unwrapping algorithms, and apply the PSI method to study subsidence in Mexico City. PSI was used to obtain unwrapped deformation rates in Mexico City (Chapter 3), where ground water withdrawal in excess of natural recharge causes subsurface, clay-rich sediments to compact. This study is based on 23 satellite SAR scenes acquired between January 2004 and July 2006. Time series analysis of the data reveals a maximum line-of-sight subsidence rate of 300 mm/yr at a high enough resolution that individual subsidence rates for large buildings can be determined. Differential motion and related structural damage along an elevated metro rail was evident from the results. Comparison of PSI subsidence rates with data from permanent GPS stations indicates root mean square (RMS) agreement of 6.9 mm/yr, about the level expected based on joint data uncertainty. The Mexico City results suggest negligible recharge, implying continuing degradation and loss of the aquifer in the third largest metropolitan area in the world. Chapters 4 and 5 illustrate the link between time series analysis and three

  17. Information Retrieval and Graph Analysis Approaches for Book Recommendation

    PubMed Central

    Benkoussas, Chahinez; Bellot, Patrice

    2015-01-01

    A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex user queries. We used different theoretical retrieval models, probabilistic models such as InL2 (a Divergence from Randomness model) and a language model, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval approach to a network of related documents built from social links. We call this network, constructed from documents and the social information provided by each of them, the Directed Graph of Documents (DGD). Specifically, this work tackles the problem of book recommendation in the context of the INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrates that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments. PMID:26504899
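    PageRank, the graph-analysis component mentioned above, reduces to a damped power iteration over the link matrix. A minimal sketch on a toy four-document graph (illustrative, not the paper's implementation):

```python
import numpy as np

def pagerank(adj, d=0.85, tol=1e-10):
    """Power-iteration PageRank on a dense adjacency matrix
    (adj[i, j] = 1 for a link i -> j)."""
    n = adj.shape[0]
    out = adj.sum(axis=1, keepdims=True)
    # row-stochastic transition matrix; dangling nodes link uniformly
    P = np.where(out > 0, adj / np.maximum(out, 1), 1.0 / n)
    r = np.full(n, 1.0 / n)
    while True:
        r_new = (1 - d) / n + d * (P.T @ r)
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new

# toy DGD-like graph: documents 0, 1 and 3 all point at document 2
links = np.zeros((4, 4))
for i, j in [(0, 2), (1, 2), (3, 2), (2, 0)]:
    links[i, j] = 1.0
rank = pagerank(links)
```

    In the reranking setting described, scores like `rank` would be interpolated with the retrieval-model scores rather than used alone.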

  18. Work-related accidents among the Iranian population: a time series analysis, 2000–2011

    PubMed Central

    Karimlou, Masoud; Imani, Mehdi; Hosseini, Agha-Fatemeh; Dehnad, Afsaneh; Vahabi, Nasim; Bakhtiyari, Mahmood

    2015-01-01

    Background Work-related accidents result in human suffering and economic losses and are considered a major health problem worldwide, especially in the economically developing world. Objectives To introduce seasonal autoregressive integrated moving average (ARIMA) models for time series analysis of work-related accident data for workers insured by the Iranian Social Security Organization (ISSO) between 2000 and 2011. Methods In this retrospective study, all insured people experiencing at least one work-related accident during a 10-year period were included in the analyses. We used Box-Jenkins modeling to develop a time series model of the total number of accidents. Results There was an average of 1476 accidents per month (1476.05 ± 458.77, mean ± SD). The final ARIMA (p,d,q)(P,D,Q)s model fitted to the data was ARIMA(1,1,1)×(0,1,1)_12, consisting of first-order autoregressive, moving average and seasonal moving average parameters, with a mean absolute percentage error (MAPE) of 20.942. Conclusions The final model showed that time series analysis with ARIMA models is useful for forecasting the number of work-related accidents in Iran. In addition, the forecast number of work-related accidents for 2011 reflected the stability of the occurrence of these accidents in recent years, indicating a need for preventive occupational health and safety policies such as safety inspection. PMID:26119774
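    The differencing inside a fitted ARIMA(1,1,1)×(0,1,1)_12 model (d = 1, D = 1, s = 12) can be illustrated on a synthetic accident series; estimating the remaining AR/MA parameters would normally be done with a package such as statsmodels. All numbers below are synthetic, not the ISSO data:

```python
import numpy as np

rng = np.random.default_rng(4)
m = np.arange(240)                                  # 20 years of monthly counts
accidents = (1476 + 2.0 * m                         # slow trend
             + 300 * np.sin(2 * np.pi * m / 12)     # seasonal cycle
             + rng.normal(0, 50, m.size))

seasonal_diff = accidents[12:] - accidents[:-12]    # (1 - B^12): D = 1, s = 12
w = np.diff(seasonal_diff)                          # (1 - B):    d = 1

# w should now be stationary: no trend, no seasonal cycle
slope = np.polyfit(np.arange(w.size), w, 1)[0]
```

    The ARMA part of the model is then fitted to `w`, and forecasts are obtained by inverting the two differencing steps.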

  19. Geospatial Analysis of Near-Surface Soil Moisture Time Series Data Over Indian Region

    NASA Astrophysics Data System (ADS)

    Berwal, P.; Murthy, C. S.; Raju, P. V.; Sesha Sai, M. V. R.

    2016-06-01

    The present study has developed a time series database of surface soil moisture over India for the months of June, July and August for a 20-year period (1991 to 2010), using data products generated under the Climate Change Initiative Programme of the European Space Agency. These three months represent the crop sowing period of the prime cropping season in the country, and soil moisture data during this period are highly useful for detecting drought conditions and assessing drought impact. The time series soil moisture data, at 0.25 degree spatial resolution, were analyzed to generate different indicators. Rainfall data of the same spatial resolution for the same period, generated by the India Meteorological Department, were also procured and analyzed. Geospatial analysis of soil moisture and rainfall derived indicators was carried out to study (1) inter-annual variability of soil moisture and rainfall, (2) soil moisture deviations from normal during prominent drought years, (3) soil moisture and rainfall correlations and (4) drought exposure based on soil moisture and rainfall variability. The study has successfully demonstrated the potential of these soil moisture time series data sets for generating regional drought surveillance information products, drought hazard mapping, drought exposure analysis and detection of drought sensitive areas in the crop planting period.

  20. Monitoring environmental change in the Andes based on low resolution time series analysis

    NASA Astrophysics Data System (ADS)

    Tote, C.; Swinnen, E.; Beringhs, K.; Govers, G.

    2012-04-01

    Environmental change is an important issue in the Andes region, and it is unknown to what extent the ongoing processes are a consequence of human impact and/or climate change. The objectives of this research are to study vegetation dynamics in the Andes region based on time series analysis of SPOT-Vegetation, NOAA-AVHRR and MODIS derived NDVI at low spatial but high temporal resolution, and to recognize to what extent this variability can be attributed to either climatic variability or human induced impacts through assimilation of satellite derived NDVI and rainfall data. Monthly rainfall estimates were available from the European Centre for Medium-Range Weather Forecasts (ECMWF) through MeteoConsult and the Monitoring Agricultural ResourceS (MARS) unit. Deviations from the 'average' situation were calculated for the NDVI time series using the Standardized Difference Vegetation Index (SDVI) and for the precipitation time series using the Standardized Precipitation Index (SPI). Correlation analysis between NDVI and SPI is performed in order to identify the temporal scale at which the environment is most sensitive to precipitation anomalies (the best lag). Trends in SDVI and SPI are investigated using least squares regression, taking into account the accumulated rainfall anomalies over the best lag. Hot spots of human induced environmental change are detected by subtracting the precipitation induced signal from the vegetation dynamics. The model can be used to predict possible effects of climate change in the areas most sensitive to trends in precipitation.
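    The best-lag search described, correlating vegetation anomalies against rainfall accumulated over increasing windows, can be sketched as follows. The data are synthetic, constructed so that NDVI responds to the previous 3 months of rainfall:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 240                               # 20 years of monthly data
rain = rng.normal(0, 1, n)            # standardized rainfall anomaly (SPI-like)
# vegetation responds to rainfall accumulated over the previous 3 months
ndvi = np.array([rain[max(0, i - 2):i + 1].mean() for i in range(n)])
ndvi += rng.normal(0, 0.2, n)         # observation noise

def corr_at_accumulation(k):
    """Correlation of NDVI with the k-month running-mean rainfall."""
    acc = np.convolve(rain, np.ones(k) / k, mode="valid")
    return np.corrcoef(acc, ndvi[k - 1:])[0, 1]

lags = range(1, 7)
r = [corr_at_accumulation(k) for k in lags]
best = list(lags)[int(np.argmax(r))]
```

    Subtracting the best-lag rainfall signal from the NDVI anomaly then leaves the residual used to flag hot spots of human-induced change.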

  1. BSMART: A Matlab/C toolbox for analysis of multichannel neural time series

    PubMed Central

    Cui, Jie; Xu, Lei; Bressler, Steven L.; Ding, Mingzhou; Liang, Hualou

    2008-01-01

    We have developed a Matlab/C toolbox, Brain-SMART (System for Multivariate AutoRegressive Time series, or BSMART), for spectral analysis of continuous neural time series data recorded simultaneously from multiple sensors. Available functions include time series data importing/exporting, preprocessing (normalization and trend removal), AutoRegressive (AR) modeling (multivariate/bivariate model estimation and validation), spectral quantity estimation (auto power, coherence and Granger causality spectra), network analysis (including coherence and causality networks) and visualization (including data, power, coherence and causality views). The tools for investigating causal network structures are unique functions provided by this toolbox. All functionality has been integrated into a simple and user-friendly graphical user interface (GUI) environment designed for easy accessibility. Although we have tested the toolbox only on Windows and Linux operating systems, BSMART itself is system independent. This toolbox is freely available (http://www.sahs.uth.tmc.edu/hliang/software.htm) under the GNU public license for open source development. PMID:18599267
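    Granger causality, the distinctive feature of this toolbox, boils down to asking whether lags of one channel reduce the prediction error of another. A bivariate, time-domain numpy sketch (BSMART itself implements the full multivariate spectral version; the series below are simulated):

```python
import numpy as np

def _rss(y, X):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return r @ r

def granger_stat(cause, effect, p=2):
    """F-like statistic: do lags of `cause` reduce the residual sum of
    squares of an AR(p) model for `effect`?"""
    n = effect.size
    lags = lambda z: [z[p - k:n - k] for k in range(1, p + 1)]
    X_r = np.column_stack([np.ones(n - p)] + lags(effect))   # restricted
    X_u = np.column_stack([X_r] + lags(cause))               # unrestricted
    rss_r, rss_u = _rss(effect[p:], X_r), _rss(effect[p:], X_u)
    dof = (n - p) - X_u.shape[1]
    return ((rss_r - rss_u) / p) / (rss_u / dof)

# simulate two channels where x drives y with a one-sample delay
rng = np.random.default_rng(6)
n = 2000
x, y = np.zeros(n), np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.normal()

f_xy = granger_stat(x, y)   # large: x helps predict y
f_yx = granger_stat(y, x)   # small: y does not help predict x
```

    The asymmetry between `f_xy` and `f_yx` is what a causality network built from such statistics encodes edge by edge.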

  2. Water quality management using statistical analysis and time-series prediction model

    NASA Astrophysics Data System (ADS)

    Parmar, Kulwinder Singh; Bhardwaj, Rashmi

    2014-12-01

    This paper deals with water quality management using statistical analysis and a time-series prediction model. The monthly variation of water quality standards has been used to compare the statistical mean, median, mode, standard deviation, kurtosis, skewness and coefficient of variation at the Yamuna River. The model was validated using R-squared, root mean square error, mean absolute percentage error, maximum absolute percentage error, mean absolute error, maximum absolute error, normalized Bayesian information criterion, Ljung-Box analysis, predicted values and confidence limits. Using an autoregressive integrated moving average model, future values of the water quality parameters have been estimated. It is observed that the predictive model is useful at 95% confidence limits, and that the distribution is platykurtic for potential of hydrogen (pH), free ammonia, total Kjeldahl nitrogen, dissolved oxygen and water temperature (WT), and leptokurtic for chemical oxygen demand and biochemical oxygen demand. Also, the predicted series is close to the original series, which indicates a close fit. All parameters except pH and WT cross the limits prescribed by the World Health Organization/United States Environmental Protection Agency, and thus the water is not fit for drinking, agricultural or industrial use.
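    The screening statistics listed (mean, SD, coefficient of variation, skewness, kurtosis) are straightforward to compute, and the platykurtic/leptokurtic labels correspond to negative/positive excess kurtosis. An illustrative function, not the paper's code, with synthetic samples:

```python
import numpy as np

def describe(x):
    """Basic descriptive statistics used to screen water-quality series."""
    m = x.mean()
    s = x.std()
    skew = ((x - m) ** 3).mean() / s ** 3
    # excess kurtosis: < 0 -> platykurtic, > 0 -> leptokurtic
    kurt = ((x - m) ** 4).mean() / s ** 4 - 3
    return {"mean": m, "std": s, "cv": s / m, "skew": skew, "kurtosis": kurt}

rng = np.random.default_rng(7)
flat   = describe(rng.uniform(6.5, 8.5, 20_000))   # pH-like, platykurtic
peaked = describe(rng.laplace(25.0, 2.0, 20_000))  # COD-like, leptokurtic
```

    Flat, bounded distributions (like the pH sample) come out platykurtic, while heavy-tailed ones (like the Laplace sample) come out leptokurtic, matching the abstract's grouping.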

  3. MODVOLC2: A Hybrid Time Series Analysis for Detecting Thermal Anomalies Applied to Thermal Infrared Satellite Data

    NASA Astrophysics Data System (ADS)

    Koeppen, W. C.; Wright, R.; Pilger, E.

    2009-12-01

    We developed and tested a new, automated algorithm, MODVOLC2, which analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes, fires, and gas flares. MODVOLC2 combines two previously developed algorithms, a simple point operation algorithm (MODVOLC) and a more complex time series analysis (Robust AVHRR Techniques, or RAT) to overcome the limitations of using each approach alone. MODVOLC2 has four main steps: (1) it uses the original MODVOLC algorithm to process the satellite data on a pixel-by-pixel basis and remove thermal outliers, (2) it uses the remaining data to calculate reference and variability images for each calendar month, (3) it compares the original satellite data and any newly acquired data to the reference images normalized by their variability, and it detects pixels that fall outside the envelope of normal thermal behavior, (4) it adds any pixels detected by MODVOLC to those detected in the time series analysis. Using test sites at Anatahan and Kilauea volcanoes, we show that MODVOLC2 was able to detect ~15% more thermal anomalies than using MODVOLC alone, with very few, if any, known false detections. Using gas flares from the Cantarell oil field in the Gulf of Mexico, we show that MODVOLC2 provided results that were unattainable using a time series-only approach. Some thermal anomalies (e.g., Cantarell oil field flares) are so persistent that an additional, semi-automated 12-µm correction must be applied in order to correctly estimate both the number of anomalies and the total excess radiance being emitted by them. Although all available data should be included to make the best possible reference and variability images necessary for the MODVOLC2, we estimate that at least 80 images per calendar month are required to generate relatively good statistics from which to run MODVOLC2, a condition now globally met by a decade of MODIS observations. 
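    Steps (2) and (3) of MODVOLC2, building monthly reference and variability images and flagging pixels outside the normal envelope, can be sketched per pixel as follows (synthetic stack; the threshold K is illustrative):

```python
import numpy as np

rng = np.random.default_rng(9)
n_months, K = 120, 4.0               # 10-yr stack; detection threshold (sigmas)
season = 10 * np.sin(2 * np.pi * np.arange(n_months) / 12)
# synthetic brightness-temperature stack for a 3x3 pixel patch (Kelvin)
stack = 270 + season[:, None, None] + rng.normal(0, 1.0, (n_months, 3, 3))

months = np.arange(n_months) % 12
ref = np.array([stack[months == m].mean(axis=0) for m in range(12)])  # step 2
var = np.array([stack[months == m].std(axis=0) for m in range(12)])

# step 3: compare a new scene (calendar month 5) against the envelope
new = 270 + 10 * np.sin(2 * np.pi * 5 / 12) + rng.normal(0, 1.0, (3, 3))
new[1, 1] += 25.0                    # inject a hot-spot pixel
anom = new > ref[5] + K * var[5]
```

    The abstract's requirement of at least 80 images per calendar month is about making `ref` and `var` stable enough that this comparison does not produce false detections.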
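The envelope test in step (3) can be sketched as a per-pixel z-score against the monthly reference and variability images. This is a minimal illustration on synthetic arrays, not the published implementation; the function name, the threshold k = 3, and the toy brightness temperatures are all assumptions.

```python
import numpy as np

def detect_anomalies(scene, ref, var, k=3.0):
    """Flag pixels whose radiance exceeds the monthly reference
    image by more than k units of the monthly variability."""
    z = (scene - ref) / var
    return z > k

rng = np.random.default_rng(0)
ref = np.full((4, 4), 290.0)        # monthly reference image (K)
var = np.full((4, 4), 2.0)          # monthly variability image (K)
scene = ref + rng.normal(0, 1, (4, 4))
scene[1, 2] = 320.0                 # one persistent hot spot
mask = detect_anomalies(scene, ref, var)
```

In the full scheme this mask would then be combined with the MODVOLC point-operation detections (step 4).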

  4. The Re-Analysis of Ozone Profile Data from a 41-Year Series of SBUV Instruments

    NASA Technical Reports Server (NTRS)

    Kramarova, Natalya; Frith, Stacey; Bhartia, Pawan K.; McPeters, Richard; Labow, Gordon; Taylor, Steven; Fisher, Bradford

    2012-01-01

    In this study we present the validation of ozone profiles from a number of Solar Backscatter Ultraviolet (SBUV) and SBUV/2 instruments that were recently reprocessed using an updated (Version 8.6) algorithm. The SBUV dataset provides the longest available record of global ozone profiles, spanning a 41-year period from 1970 to 2011 (except for a 5-year gap in the 1970s), and includes ozone profile records obtained from the Nimbus-4 BUV and Nimbus-7 SBUV instruments, and a series of SBUV(/2) instruments launched on NOAA operational satellites (NOAA 09, 11, 14, 16, 17, 18, 19). Although modifications in instrument design were made in the evolution from the BUV instrument to the modern SBUV(/2) model, the basic principles of the measurement technique and retrieval algorithm remain the same. The long-term SBUV data record allows us to create a consistent, calibrated dataset of ozone profiles that can be used for climate studies and trend analyses. In particular, we focus on estimating the various sources of error in the SBUV profile ozone retrievals using independent observations and analysis of the algorithm itself. For the first time we include in the metadata a quantitative estimate of the smoothing error, defined as the error due to profile variability that the SBUV observing system cannot inherently measure. The magnitude of the smoothing error varies with altitude, latitude, season and solar zenith angle. Between 10 and 1 hPa the smoothing errors for the SBUV monthly zonal mean retrievals are on the order of 1%, but start to increase above and below this layer. The largest smoothing errors, as large as 15-20%, were detected in the troposphere. The SBUV averaging kernels, provided with the ozone profiles in Version 8.6, help to eliminate the smoothing effect when comparing the SBUV profiles with high vertical resolution measurements, and make it convenient to use the SBUV ozone profiles for data assimilation and model validation purposes.
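Averaging kernels are conventionally applied as x_s = x_a + A(x - x_a) to degrade a high-resolution profile to the retrieval's vertical resolution before comparison. A hedged numerical sketch with a synthetic profile and idealized kernels (the identity kernel reproduces the high-resolution profile; a zero kernel returns the a priori); all values are toy assumptions.

```python
import numpy as np

def apply_averaging_kernel(x_high, x_apriori, A):
    """Degrade a high-resolution profile to the retrieval's
    vertical resolution: x_s = x_a + A @ (x - x_a)."""
    return x_apriori + A @ (x_high - x_apriori)

n = 5
x_a = np.linspace(2.0, 8.0, n)                      # a priori ozone per layer (toy units)
x_hr = x_a + np.array([0.5, -0.3, 0.8, -0.2, 0.1])  # "high-resolution" measurement
x_s_ideal = apply_averaging_kernel(x_hr, x_a, np.eye(n))        # perfect sensitivity
x_s_none = apply_averaging_kernel(x_hr, x_a, np.zeros((n, n)))  # no sensitivity
```

A realistic kernel lies between these two limits, smoothing fine vertical structure toward the a priori.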

  5. Extensive mapping of coastal change in Alaska by Landsat time-series analysis, 1972-2013 (Invited)

    NASA Astrophysics Data System (ADS)

    Macander, M. J.; Swingley, C. S.; Reynolds, J.

    2013-12-01

    The landscape-scale effects of coastal storms on Alaska's Bering Sea and Gulf of Alaska coasts include coastal erosion, migration of spits and barrier islands, breaching of coastal lakes and lagoons, and inundation and salt-kill of vegetation. Large changes in coastal storm frequency and intensity are expected due to climate change and reduced sea-ice extent. Storms have a wide range of impacts on carbon fluxes and on fish and wildlife resources, infrastructure siting and operation, and emergency response planning. In areas experiencing moderate to large effects, changes can be mapped by analyzing trends in time series of Landsat imagery from Landsat 1 through Landsat 8. ABR, Inc.--Environmental Research & Services and the Western Alaska Landscape Conservation Cooperative are performing a time-series trend analysis for over 22,000 kilometers of coastline along the Bering Sea and Gulf of Alaska. The archive of Landsat imagery covers the time period 1972-present. For a pilot study area in Kotzebue Sound, we conducted a regression analysis of changes in near-infrared reflectance to identify areas with significant changes in coastal features, 1972-2011. Suitable ice- and cloud-free Landsat imagery was obtained for 28 of the 40 years during the period. The approach captured several coastal changes over the 40-year study period, including coastal erosion exceeding the 60-m pixel resolution of the Multispectral Scanner (MSS) data and migrations of coastal spits and estuarine channels. In addition, several lake drainage events were identified, mostly inland from the coastal zone. Analysis of shorter, decadal time periods produced noisier results that were generally consistent with the long-term trend analysis. Unusual conditions at the start or end of the time series can strongly influence decadal results. Based on these results, the study is being scaled up to map coastal change for over 22,000 kilometers of coastline along the Bering Sea and Gulf of Alaska coast.
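The per-pixel regression underlying such trend mapping can be sketched as an ordinary least-squares slope of NIR reflectance against year; the reflectance values, noise levels, and thresholds below are synthetic assumptions, not the study's data.

```python
import numpy as np

def pixel_trend(years, nir):
    """Least-squares slope of NIR reflectance vs. year for one pixel."""
    slope, intercept = np.polyfit(years, nir, 1)
    return slope

years = np.arange(1972, 2012)
# eroding pixel: reflectance falls as land converts to water
nir_eroding = (0.35 - 0.004 * (years - 1972)
               + np.random.default_rng(1).normal(0, 0.01, years.size))
nir_stable = 0.35 + np.random.default_rng(2).normal(0, 0.01, years.size)
s_eroding = pixel_trend(years, nir_eroding)
s_stable = pixel_trend(years, nir_stable)
```

Mapping then amounts to thresholding the slope (and its significance) pixel by pixel across the coastline mosaic.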

  6. An enhanced vegetation index time series for the Amazon based on combined gap-filling approaches and quality datasets

    NASA Astrophysics Data System (ADS)

    Bernardes, Sergio

    2010-10-01

    Vegetation indices from MODIS data are subject to residual atmospheric noise, which affects analyses and downstream processes that require data continuity. This work reconstructed a time series of MODIS EVI mosaics for the Amazon using a novel combination of curve-fitting and spatiotemporal gap-filling. TIMESAT was used for initial curve fitting and gap filling, using a Double Logistic method and MODIS Usefulness values as weights. Pixels with large temporal gaps were handled by a spatiotemporal gap-filling approach. The method scans Julian Days before and after the image being gap filled, searching for a good quality pixel (Pg) at the location of the pixel to be replaced. If Pg is found, a window is defined around it and a search for good quality pixels (Px) with spectral characteristics similar to Pg is performed. Window size increases during processing, and pixel similarity uses Euclidean distance based on MOD13A2 reflectances. A good quality EVI value for the image being gap filled, at the location analogous to the minimum-distance Px, replaces the low quality pixel. Results from the spatiotemporal gap filling were then used in TIMESAT for smoothing. The spatiotemporal approach was evaluated by flagging 5,000 randomly selected good-quality pixels as low quality, running the algorithm, and regressing the results against the original EVI values (R2 = 0.62). The combined strategy was able to find replacement pixels and reduce spikes for images with high cloud cover and was used to rebuild a time series of EVI over the Amazon region for the period 2000-2010.
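A toy version of the expanding-window spatiotemporal search described above: scan nearby dates for a good pixel Pg at the same location, find the most similar good pixel Px in a growing window around it, and copy Px's value at the gap date. The array shapes, window limit, and single-band similarity measure are simplifying assumptions (the study uses Euclidean distance over MOD13A2 reflectance bands).

```python
import numpy as np

def fill_gap(stack, quality, t, i, j, max_win=5):
    """Replace low-quality pixel stack[t, i, j] following the
    Pg/Px search idea described in the abstract."""
    nT, nR, nC = stack.shape
    for dt in range(1, nT):
        for tg in (t - dt, t + dt):                  # nearest dates first
            if 0 <= tg < nT and quality[tg, i, j]:
                pg = stack[tg, i, j]                 # good pixel Pg, same location
                for w in range(1, max_win + 1):      # expanding search window
                    best, best_d = None, np.inf
                    for ii in range(max(0, i - w), min(nR, i + w + 1)):
                        for jj in range(max(0, j - w), min(nC, j + w + 1)):
                            if (ii, jj) != (i, j) and quality[tg, ii, jj] and quality[t, ii, jj]:
                                d = abs(stack[tg, ii, jj] - pg)   # similarity to Pg
                                if d < best_d:
                                    best, best_d = (ii, jj), d
                    if best is not None:
                        return stack[t, best[0], best[1]]  # Px's value at the gap date
    return stack[t, i, j]                            # nothing usable found

# 6 dates, 5x5 pixels; reflectance varies by date but is spatially uniform
stack = np.tile(np.linspace(0.2, 0.4, 6)[:, None, None], (1, 5, 5))
quality = np.ones(stack.shape, dtype=bool)
quality[2, 2, 2] = False                             # cloudy pixel at date 2
filled = fill_gap(stack, quality, 2, 2, 2)
```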

  7. Time-Series Data Analysis of Long-Term Home Blood Pressure Measurements in Relation to Lifestyle.

    PubMed

    Takeuchi, Hiroshi; Kodama, Naoki; Takahashi, Shingo

    2015-01-01

    We conducted a long-term time-series analysis of an individual's home blood pressure measurements, stored in a cloud-based personal healthcare system, relative to the individual's lifestyle. In addition to day-to-day scatter, apparent seasonal variations were observed in both systolic and diastolic blood pressure measurements. We examined the effect of seasonal variations on the outcome of a healthcare data mining process that extracts rules between blood pressure measurements and lifestyle components such as exercise and diet, and found that the daily blood pressure data approached a normal distribution when adjusted for the seasonal variations. This implies that an adjustment is desirable in order to produce appropriate rules in the healthcare data mining process.
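The seasonal adjustment step can be sketched by fitting and removing an annual harmonic with least squares; the blood-pressure series below is synthetic, and the single-harmonic model is an assumption rather than the paper's method.

```python
import numpy as np

def seasonal_adjust(days, values, period=365.25):
    """Fit an annual sine/cosine pair by least squares and remove it."""
    X = np.column_stack([np.ones(days.size),
                         np.sin(2 * np.pi * days / period),
                         np.cos(2 * np.pi * days / period)])
    beta, *_ = np.linalg.lstsq(X, values, rcond=None)
    seasonal = X[:, 1:] @ beta[1:]
    return values - seasonal, beta

rng = np.random.default_rng(3)
days = np.arange(4 * 365)                      # four years of daily readings
sbp = 125 + 6 * np.cos(2 * np.pi * days / 365.25) + rng.normal(0, 4, days.size)
adjusted, beta = seasonal_adjust(days, sbp)
```

Removing the fitted harmonic shrinks the spread of the daily values, which is the effect the study exploits before rule mining.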

  8. Sinking Chao Phraya delta plain, Thailand, derived from SAR interferometry time series analysis

    NASA Astrophysics Data System (ADS)

    Tanaka, A.; Mio, A.; Saito, Y.

    2013-12-01

    The Bangkok Metropolitan region and its surrounding provinces are located in a low-lying delta plain of the Chao Phraya River. Extensive groundwater use from the late 1950s has caused the decline of groundwater levels in the aquifers and Holocene clay compaction beneath the Bangkok region, resulting in significant subsidence of the ground. This ground deformation has been monitored using leveling surveys since 1978, and differential InSAR (Interferometric Synthetic Aperture Radar) analysis. These observations show that the Bangkok Metropolitan region has been subsiding at a rate of about 20 mm/year in recent years, now that groundwater pumping is restricted by law, although a subsidence rate as high as 120 mm/year was recorded in 1981. The subsidence rate in the Bangkok area has significantly decreased since the late 1980s; however, the affected area has spread out to the surrounding areas. A maximum subsidence rate of up to 30 mm/year occurred in the outlying southeast and southwest coastal zones in 2002. In this study, we apply a SAR interferometry time series analysis to monitor ground deformation in the lower Chao Phraya delta plain (Lower Central Plain), Thailand, using ALOS (Advanced Land Observing Satellite) PALSAR (Phased Array type L-band SAR) data acquired between July 2007 and September 2010. We derive a single reference time series interferogram from the stacking of unwrapped phases under the assumption that those phases are smoothly and continuously connected, and apply a smoothness-constrained inversion algorithm that optimizes the displacement from the phase unwrapping of multitemporal differential SAR interferograms. The SAR interferometry time series analysis succeeds in monitoring the incremental line-of-sight (LOS) change between SAR scene acquisitions. LOS displacements are converted to vertical displacements, based on the assumption that the ground displacement in this area occurs only in the vertical direction. This reveals an overall pattern of subsidence.

  9. Multifractal analysis of geophysical time series in the urban lake of Créteil (France).

    NASA Astrophysics Data System (ADS)

    Mezemate, Yacine; Tchiguirinskaia, Ioulia; Bonhomme, Celine; Schertzer, Daniel; Lemaire, Bruno Jacques; Vinçon leite, Brigitte; Lovejoy, Shaun

    2013-04-01

    Urban water bodies take part in the environmental quality of cities. They regulate heat, contribute to the beauty of the landscape and provide space for leisure activities (aquatic sports, swimming). As they are often artificial, they are only a few meters deep, which confers on them some specific properties. Indeed, they are particularly sensitive to global environmental changes, including climate change, eutrophication and contamination by micro-pollutants due to the urbanization of the watershed. Monitoring their quality has become a major challenge for urban areas, and the need for a tool for predicting short-term proliferation of potentially toxic phytoplankton therefore arises. In lakes, the behavior of biological and physical (temperature) fields is mainly driven by the turbulence regime in the water. Turbulence is highly nonlinear, nonstationary and intermittent; this is why statistical tools are needed to characterize the evolution of the fields. The knowledge of the probability distribution of all the statistical moments of a given field is necessary to fully characterize it. This possibility is offered by multifractal analysis, based on the assumption of scale invariance. To investigate the effect of the space-time variability of temperature, chlorophyll and dissolved oxygen on cyanobacteria proliferation in the urban lake of Creteil (France), a spectral analysis is first performed on each time series (or on subsamples) to obtain an overall estimate of their scaling behaviors. Then a multifractal analysis (Trace Moment, Double Trace Moment) estimates the statistical moments of different orders. This analysis is adapted to the specific properties of the studied time series, i.e., the presence of large-scale gradients. The nonlinear behavior of the scaling functions K(q) confirms that the investigated aquatic time series are indeed multifractal and highly intermittent. The knowledge of the universal multifractal parameters is the key to calculate the different
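The trace-moment technique estimates K(q) from how the moments of the field scale as it is averaged at coarser and coarser resolutions. A minimal numerical check on a deterministic binomial cascade (a standard multifractal test field, not the lake data; the weights 0.6/1.4 and the scales are assumptions):

```python
import numpy as np

def trace_moment_slope(field, q, scales):
    """Empirical trace moments <eps_lambda^q> over boxes of size s;
    the slope of log(moment) vs. log(lambda) estimates K(q)."""
    field = field / field.mean()
    lam, moments = [], []
    for s in scales:
        coarse = field.reshape(-1, s).mean(axis=1)   # average over boxes of size s
        lam.append(field.size / s)                    # scale ratio lambda
        moments.append((coarse ** q).mean())
    return np.polyfit(np.log(lam), np.log(moments), 1)[0]

# deterministic binomial cascade: conservative weights with mean 1
eps = np.ones(1)
for _ in range(12):
    eps = np.kron(eps, np.array([0.6, 1.4]))

scales = [1, 2, 4, 8, 16, 32]
K1 = trace_moment_slope(eps, 1.0, scales)  # ~0 for a conserved field
K2 = trace_moment_slope(eps, 2.0, scales)  # analytically log2((0.6^2 + 1.4^2)/2)
```

The nonlinearity of K(q) in q, rather than any single value, is what signals multifractality.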

  10. Evaluation of Pleistocene groundwater flow through fractured tuffs using a U-series disequilibrium approach, Pahute Mesa, Nevada, USA

    USGS Publications Warehouse

    Paces, James B.; Nichols, Paul J.; Neymark, Leonid A.; Rajaram, Harihar

    2013-01-01

    Groundwater flow through fractured felsic tuffs and lavas at the Nevada National Security Site represents the most likely mechanism for transport of radionuclides away from underground nuclear tests at Pahute Mesa. To help evaluate fracture flow and matrix–water exchange, we have determined U-series isotopic compositions on more than 40 drill core samples from 5 boreholes that represent discrete fracture surfaces, breccia zones, and interiors of unfractured core. The U-series approach relies on the disruption of radioactive secular equilibrium between isotopes in the uranium-series decay chain due to preferential mobilization of 234U relative to 238U, and U relative to Th. Samples from discrete fractures were obtained by milling fracture surfaces containing thin secondary mineral coatings of clays, silica, Fe–Mn oxyhydroxides, and zeolite. Intact core interiors and breccia fragments were sampled in bulk. In addition, profiles of rock matrix extending 15 to 44 mm away from several fractures that show evidence of recent flow were analyzed to investigate the extent of fracture/matrix water exchange. Samples of rock matrix have 234U/238U and 230Th/238U activity ratios (AR) closest to radioactive secular equilibrium indicating only small amounts of groundwater penetrated unfractured matrix. Greater U mobility was observed in welded-tuff matrix with elevated porosity and in zeolitized bedded tuff. Samples of brecciated core were also in secular equilibrium implying a lack of long-range hydraulic connectivity in these cases. Samples of discrete fracture surfaces typically, but not always, were in radioactive disequilibrium. Many fractures had isotopic compositions plotting near the 230Th-234U 1:1 line indicating a steady-state balance between U input and removal along with radioactive decay. Numerical simulations of U-series isotope evolution indicate that 0.5 to 1 million years are required to reach steady-state compositions. Once attained, disequilibrium 234U/238U

  11. Measurement, time series analysis and source apportionment of inorganic and organic speciated PM(2.5) air pollution in Denver

    NASA Astrophysics Data System (ADS)

    Dutton, Steven James

    Particulate air pollution has demonstrated significant health effects ranging from worsening of asthma to increased rates of respiratory and cardiopulmonary mortality. These results have prompted the US-EPA to include particulate matter (PM) as one of the six criteria air pollutants regulated under the Clean Air Act. The diverse chemical make-up and physical characteristics of PM make it a challenging pollutant to characterize and regulate. Particulate matter less than 2.5 microns in diameter (PM2.5) has the ability to travel deep into the lungs and therefore has been linked with some of the more significant health effects. The toxicity of any given particle is likely dependent on its chemical composition. The goal of this project has been to chemically characterize a long time series of PM2.5 measurements collected at a receptor site in Denver at a level of detail not previously achieved for a dataset of this size. This has involved characterization of inorganic ions using ion chromatography, total elemental and organic carbon using thermal optical transmission, and organic molecular marker species using gas chromatography-mass spectrometry. Methods have been developed to allow for daily measurement and speciation for these compounds over a six-year period. Measurement methods, novel approaches to uncertainty estimation, time series analysis, spectral and pattern analyses, and source apportionment using two multivariate factor analysis models are presented. Analysis results reveal several natural and anthropogenic sources contributing to PM2.5 in Denver. The most distinguishable sources are motor vehicles and biomass combustion. This information will be used in a health effects analysis as part of a larger study called the Denver Aerosol Sources and Health (DASH) study. Such results will inform regulatory decisions and may help create a better understanding of the underlying mechanisms for the observed adverse health effects associated with PM2.5.

  12. Variational approach for nonpolar solvation analysis.

    PubMed

    Chen, Zhan; Zhao, Shan; Chun, Jaehun; Thomas, Dennis G; Baker, Nathan A; Bates, Peter W; Wei, G W

    2012-08-28

    Solvation analysis is one of the most important tasks in chemical and biological modeling. Implicit solvent models are some of the most popular approaches. However, commonly used implicit solvent models rely on unphysical definitions of solvent-solute boundaries. Based on differential geometry, the present work defines the solvent-solute boundary via the variation of the nonpolar solvation free energy. The solvation free energy functional of the system is constructed based on a continuum description of the solvent and the discrete description of the solute, which are dynamically coupled by the solvent-solute boundaries via van der Waals interactions. The first variation of the energy functional gives rise to the governing Laplace-Beltrami equation. The present model predictions of the nonpolar solvation energies are in an excellent agreement with experimental data, which supports the validity of the proposed nonpolar solvation model. PMID:22938212

  13. Random matrix approach to categorical data analysis

    NASA Astrophysics Data System (ADS)

    Patil, Aashay; Santhanam, M. S.

    2015-09-01

    Correlation and similarity measures are widely used in all areas of the sciences and social sciences. Often the variables are not numbers but qualitative descriptors called categorical data. We define and study the similarity matrix as a measure of similarity for categorical data. This is of interest due to the deluge of categorical data, such as movie ratings, top-10 rankings, and data from social media, in the public domain that require analysis. We show that the statistical properties of the spectra of similarity matrices, constructed from categorical data, follow random matrix predictions, with the dominant eigenvalue being an exception. We demonstrate this approach by applying it to data for Indian general elections and sea level pressures in the North Atlantic Ocean.
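A minimal sketch of the approach: build a similarity matrix from categorical data using a simple matching coefficient and inspect its spectrum. Even with purely random categories the dominant eigenvalue separates from the bulk, consistent with the exception the abstract notes; the data and the particular matching measure here are assumptions, not the paper's definitions.

```python
import numpy as np

def similarity_matrix(data):
    """S[i, j] = fraction of categorical variables on which
    rows i and j agree (simple matching coefficient)."""
    n = data.shape[0]
    S = np.empty((n, n))
    for i in range(n):
        S[i] = (data == data[i]).mean(axis=1)
    return S

rng = np.random.default_rng(5)
data = rng.integers(0, 4, size=(60, 200))   # 60 respondents, 200 categorical answers
S = similarity_matrix(data)
evals = np.sort(np.linalg.eigvalsh(S))[::-1]
```

Comparing the bulk of `evals` (excluding `evals[0]`) against random matrix predictions is the kind of test the abstract describes.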

  14. Statistical approach to partial equilibrium analysis

    NASA Astrophysics Data System (ADS)

    Wang, Yougui; Stanley, H. E.

    2009-04-01

    A statistical approach to market equilibrium and efficiency analysis is proposed in this paper. One factor that governs the exchange decisions of traders in a market, termed the willingness price, is highlighted and forms the basis of the whole theory. The supply and demand functions are formulated as the distributions of the corresponding willing exchange over the willingness price. The laws of supply and demand can be derived directly from these distributions. The characteristics of the excess demand function are analyzed and the necessary conditions for the existence and uniqueness of the equilibrium point of the market are specified. The rationing rates of buyers and sellers are introduced to describe the ratio of realized exchange to willing exchange, and their dependence on the market price is studied in the cases of shortage and surplus. The realized market surplus, which is the criterion of market efficiency, can be written as a function of the distributions of willing exchange and the rationing rates. With this approach we can prove rigorously that a market is efficient in the state of equilibrium.

  15. Detection and Characterization of Ground Displacement Sources from Variational Bayesian Independent Component Analysis of GPS Time Series

    NASA Astrophysics Data System (ADS)

    Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.

    2014-12-01

    A critical point in the analysis of ground displacement time series is the development of data-driven methods that allow one to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also has some deficiencies. Indeed, PCA does not perform well in finding the solution to the so-called Blind Source Separation (BSS) problem, i.e. in recovering and separating the original sources that generated the observed data. This is mainly due to the assumptions on which PCA relies: it looks for a new Euclidean space where the projected data are uncorrelated. Usually, the uncorrelatedness condition is not strong enough, and it has been shown that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all fields where PCA is also applied. An ICA approach enables us to explain the time series with fewer constraints on the model, and to reveal anomalies in the data such as transient signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mix of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources, giving a more reliable estimate of them. Here we present the application of the vbICA technique to GPS position time series. First, we use vbICA on synthetic data that simulate a seismic cycle
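vbICA itself is not a standard public library routine; as an illustration of the BSS idea the abstract describes, here is a minimal FastICA (a different, widely used ICA variant: tanh contrast with symmetric decorrelation) separating two synthetic sources from their linear mixtures. Everything below, including the sources and the mixing matrix, is an assumption for illustration.

```python
import numpy as np

def fast_ica(X, n_iter=200, seed=0):
    """Minimal symmetric FastICA on X with shape (n_signals, n_samples)."""
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))              # whiten the mixtures
    Xw = E @ np.diag(d ** -0.5) @ E.T @ X
    n = X.shape[0]
    W = np.random.default_rng(seed).normal(size=(n, n))
    for _ in range(n_iter):
        WX = W @ Xw
        g, gp = np.tanh(WX), 1 - np.tanh(WX) ** 2
        W = (g @ Xw.T) / Xw.shape[1] - np.diag(gp.mean(axis=1)) @ W
        u, _, vt = np.linalg.svd(W)               # symmetric decorrelation
        W = u @ vt
    return W @ Xw

t = np.linspace(0, 8, 2000)
s1 = np.sign(np.sin(3 * t))                       # square wave source
s2 = np.sin(5 * t)                                # sinusoidal source
S = np.vstack([s1, s2])
A = np.array([[1.0, 0.6], [0.4, 1.0]])            # mixing matrix
recovered = fast_ica(A @ S)
```

PCA on the same mixtures would only decorrelate the channels; ICA recovers the sources up to sign and permutation.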

  16. Analysis of radiation-induced microchemical evolution in 300 series stainless steel

    SciTech Connect

    Brager, H.R.; Garner, F.A.

    1980-03-01

    The irradiation of 300 series stainless steel by fast neutrons leads to an evolution of alloy microstructures that involves not only the formation of voids and dislocations, but also an extensive repartitioning of elements between various phases. This latter evolution has been shown to be the primary determinant of the alloy behavior in response to the large number of variables which influence void swelling and irradiation creep. The combined use of scanning transmission electron microscopy and energy-dispersive x-ray analysis has been the key element in the study of this phenomenon. Problems associated with the analysis of radioactive specimens are resolved by minor equipment modifications. Problems associated with spatial resolution limitations and the complexity and heterogeneity of the microchemical evolution have been overcome by using several data acquisition techniques. These include the measurement of compositional profiles near sinks, the use of foil-edge analysis, and the statistical sampling of many matrix and precipitate volumes.

  17. Kinematic and kinetic analysis of two gymnastics acrobatic series to performing the backward stretched somersault.

    PubMed

    Mkaouer, Bessem; Jemni, Monèm; Amara, Samiha; Chaabène, Helmi; Tabka, Zouhair

    2013-01-01

    Back swing connections during gymnastics acrobatic series considerably influence technical performance and difficulties, particularly in the back somersault. The aim of this study was to compare the take-off's kinetic and kinematic variables between two acrobatic series leading to the backward stretched somersault (also called salto): round-off, flic-flac to stretched salto versus round-off, tempo-salto to stretched salto. Five high-level male gymnasts (age 23.17 ± 1.61 yrs; body height 1.65 ± 0.05 m; body mass 56.80 ± 7.66 kg) took part in this investigation. A force plate synchronized with a two-dimensional movement analysis system was used to collect kinetic and kinematic data. Statistical analysis via the non-parametric Wilcoxon rank-sum test showed significant differences between the take-off variables. The backswing connections differed in take-off angle, linear momentum, vertical velocity, and horizontal and vertical displacements. In conclusion, considering that a higher elevation of the centre of mass in the flight phase allows the best performance and lowers the risk of falls, particularly when combined with a great angular momentum, this study demonstrated that the optimal connection series was round-off, flic-flac to stretched salto, which enabled the best height in the somersault. Analysis of the results suggests that both connections facilitate the performance of single and double (or triple) backward somersaults with or without rotations around the longitudinal axis. Gymnasts could perform the latter while gaining height if they choose the round-off, flic-flac technique, or while gaining some backward displacement if they choose the round-off, salto tempo. PMID:24146701


  19. Forecasting malaria cases using climatic factors in delhi, India: a time series analysis.

    PubMed

    Kumar, Varun; Mangal, Abha; Panesar, Sanjeet; Yadav, Geeta; Talwar, Richa; Raut, Deepak; Singh, Saudan

    2014-01-01

    Background. Malaria still remains a public health problem in developing countries, and changing environmental and climatic factors pose the biggest challenge in fighting against the scourge of malaria. Therefore, this study was designed to forecast malaria cases using climatic factors as predictors in Delhi, India. Methods. The total number of monthly cases of malaria slide positives occurring from January 2006 to December 2013 was taken from the register maintained at the malaria clinic at the Rural Health Training Centre (RHTC), Najafgarh, Delhi. Climatic data of monthly mean rainfall, relative humidity, and mean maximum temperature were taken from the Regional Meteorological Centre, Delhi. The Expert Modeler of SPSS ver. 21 was used for analyzing the time series data. Results. An autoregressive integrated moving average model, ARIMA (0,1,1)(0,1,0)12, was the best-fit model and could explain 72.5% of the variability in the time series data. Rainfall (P value = 0.004) and relative humidity (P value = 0.001) were found to be significant predictors of malaria transmission in the study area. The seasonal adjusted factor (SAF) for malaria cases shows a peak during the months of August and September. Conclusion. The ARIMA approach to time series analysis is a simple tool for producing reliable forecasts of malaria in Delhi, India. PMID:25147750
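The forecasting logic of the non-seasonal part of the selected model, ARIMA(0,1,1), can be sketched directly: difference the series, run the MA(1) innovation recursion, and undifference the one-step forecast. The synthetic series, the fixed theta, and the omission of the seasonal (0,1,0)12 term are simplifying assumptions; a real analysis would estimate theta from the data.

```python
import numpy as np

def arima_011_forecast(y, theta):
    """One-step forecast from ARIMA(0,1,1) with known MA coefficient:
    z_t = diff(y); z_t = e_t + theta * e_{t-1}; forecast z = theta * e_T."""
    e = 0.0
    for zt in np.diff(y):
        e = zt - theta * e          # innovation recursion
    return y[-1] + theta * e        # undifference the forecast

# random-walk level plus observation noise, a series this model suits
rng = np.random.default_rng(6)
level = np.cumsum(rng.normal(0, 0.2, 300))
y = 100 + level + rng.normal(0, 1.0, 300)
fc = arima_011_forecast(y, theta=0.5)
```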

  20. Analysis of the mass balance time series of glaciers in the Italian Alps

    NASA Astrophysics Data System (ADS)

    Carturan, Luca; Baroni, Carlo; Brunetti, Michele; Carton, Alberto; Dalla Fontana, Giancarlo; Salvatore, Maria Cristina; Zanoner, Thomas; Zuecco, Giulia

    2016-03-01

    This work presents an analysis of the mass balance series of nine Italian glaciers, which were selected based on the length, continuity and reliability of observations. All glaciers experienced mass loss over their observation periods, which vary between 10 and 47 years. The longest series display increasing mass loss rates, mainly due to increased ablation during longer and warmer ablation seasons. The mean annual mass balance (Ba) in the decade from 2004 to 2013 ranged from -1788 to -763 mm w.e. yr-1. Low-altitude glaciers with a small elevation range are more out of balance than the higher, larger and steeper glaciers, which maintain residual accumulation areas in their upper reaches. The response of the glaciers is mainly controlled by the combination of October-May precipitation and June-September temperature, but rapid geometric adjustments and atmospheric changes lead to modifications in their response to climatic variations. In particular, a decreasing correlation of Ba with June-September temperatures and an increasing correlation with October-May precipitation are observed for some glaciers. In addition, October-May temperatures tend to become significantly correlated with Ba, possibly indicating a decrease in the fraction of solid precipitation, and/or increased ablation, during the accumulation season. Because most of the monitored glaciers no longer have an accumulation area, their observation series are at risk due to the glaciers' impending extinction, and replacements will soon be required.

  1. Wet tropospheric delays forecast based on Vienna Mapping Function time series analysis

    NASA Astrophysics Data System (ADS)

    Rzepecka, Zofia; Kalita, Jakub

    2016-04-01

    It is well known that the dry part of the zenith tropospheric delay (ZTD) is much easier to model than the wet part (ZTW). The aim of this research is to apply stochastic modeling and prediction of ZTW using time series analysis tools. Application of time series analysis enables a closer understanding of ZTW behavior as well as short-term prediction of future ZTW values. The ZTW data used for the studies were obtained from the GGOS service hosted by the Vienna University of Technology; the resolution of the data is six hours, and ZTW values for the years 2010-2013 were adopted for the study. The International GNSS Service (IGS) permanent stations LAMA and GOPE, located at mid-latitudes, were selected for the investigation. Initially, the seasonal part was separated and modeled using periodic signals and frequency analysis. The prominent annual and semi-annual signals were removed using sine and cosine functions. The autocorrelation of the resulting signal remains significant for several days (20-30 samples). The residuals of this fitting were further analyzed and modeled with ARIMA processes. For both stations, optimal ARMA processes were obtained based on several criteria. On this basis, ZTW values were predicted one day ahead, leaving white-noise residuals. Accuracy of the prediction can be estimated at about 3 cm.
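The pipeline described (remove annual and semi-annual harmonics by least squares, model the residual stochastically, predict one day ahead at 6-hour resolution) can be sketched with an AR(1) residual model standing in for the full ARMA fit; the synthetic ZTW series and all parameter values below are assumptions.

```python
import numpy as np

def harmonics(t, period):
    """Sine/cosine regressors for one periodic component."""
    w = 2 * np.pi * t / period
    return np.column_stack([np.sin(w), np.cos(w)])

rng = np.random.default_rng(7)
t = np.arange(0, 4 * 365.25, 0.25)            # 4 years at 6-hour sampling (days)
ar = np.zeros(t.size)
for k in range(1, t.size):                    # slowly decorrelating residual "weather"
    ar[k] = 0.97 * ar[k - 1] + rng.normal(0, 5)
ztw = 120 + 60 * np.cos(2 * np.pi * t / 365.25) + 15 * np.cos(4 * np.pi * t / 365.25) + ar

# 1) remove annual and semi-annual harmonics by least squares
X = np.column_stack([np.ones(t.size), harmonics(t, 365.25), harmonics(t, 182.625)])
beta, *_ = np.linalg.lstsq(X, ztw, rcond=None)
resid = ztw - X @ beta

# 2) fit AR(1) to the residuals and predict one day (4 samples) ahead
phi = resid[1:] @ resid[:-1] / (resid[:-1] @ resid[:-1])
t_f = t[-1] + 0.25 * np.arange(1, 5)
X_f = np.column_stack([np.ones(4), harmonics(t_f, 365.25), harmonics(t_f, 182.625)])
forecast = X_f @ beta + resid[-1] * phi ** np.arange(1, 5)
```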

  2. Time-series analysis for determining vertical air permeability in unsaturated zones

    SciTech Connect

    Lu, N.

    1999-01-01

    The air pressure in the unsaturated subsurface changes dynamically as the barometric pressure varies with time. Depending on the material properties and boundary conditions, the intensity of the correlation between the atmospheric and subsurface pressures may be evidenced in two persistent patterns: (1) the amplitude attenuation; and (2) the phase lag for the principal modes, such as the diurnal, semidiurnal, and 8-h tides. The amplitude attenuation and the phase lag generally depend on properties that can be classified into two categories: (1) The barometric pressure parameters, such as the apparent pressure amplitudes and frequencies controlled by the atmospheric tides and others; and (2) the material properties of porous media, such as the air viscosity, air-filled porosity, and permeability. Based on the principle of superposition and a Fourier time-series analysis, an analytical solution for predicting the subsurface air pressure variation caused by the atmospheric pressure fluctuation is presented. The air permeability (or pneumatic diffusivity) can be quantitatively determined by using the calculated amplitude attenuations (or phase lags) and the appropriate analytical relations among the parameters of the atmosphere and the porous medium. An analysis using the field data shows that the Fourier time-series analysis may provide a potentially reliable and simple method for predicting the subsurface barometric pressure variation and for determining the air permeability of unsaturated zones.
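
    The amplitude attenuation and phase lag of a single tidal mode can be extracted from paired surface/subsurface pressure records as follows. The toy records below (half amplitude and a 3 h lag at depth) are illustrative, not field data:

```python
import numpy as np

# Toy barometric records sampled hourly for 30 days: the subsurface diurnal
# tide is attenuated and delayed relative to the surface (values invented).
n_hours = 30 * 24
t = np.arange(n_hours)                       # hours
f_diurnal = 1.0 / 24.0                       # cycles per hour
surface = 1013 + 2.0 * np.cos(2*np.pi*f_diurnal*t)
subsurf = 1013 + 1.0 * np.cos(2*np.pi*f_diurnal*(t - 3.0))  # 3 h lag, half amp

def mode_coeff(x, f):
    """Complex Fourier coefficient of x at frequency f (least-squares sense)."""
    x = x - x.mean()
    return 2.0 * np.mean(x * np.exp(-2j*np.pi*f*np.arange(x.size)))

c_surf = mode_coeff(surface, f_diurnal)
c_sub = mode_coeff(subsurf, f_diurnal)
attenuation = abs(c_sub) / abs(c_surf)            # amplitude ratio
lag_hours = -np.angle(c_sub / c_surf) / (2*np.pi*f_diurnal)
print(round(attenuation, 3), round(lag_hours, 2))  # → 0.5 3.0
```

With the attenuation (or lag) of each principal mode in hand, the analytical relations in the paper convert it into a pneumatic diffusivity.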

  3. Time-series analysis of temperature profiles from VIRTIS Venus Express data

    NASA Astrophysics Data System (ADS)

    Grassi, D.; Migliorini, A.; Politi, R.; Montabone, L.; Piccioni, G.; Drossart, P.

    2012-04-01

    Nighttime infrared observations of the VIRTIS instrument on board Venus Express have already demonstrated their potential in the study of air temperature fields of the Venusian mesosphere. The entire available dataset acquired by the VIRTIS-M IR channel was processed at moderate spatial resolution (i.e. averaging pixels in 8×8 boxes) to derive an unprecedented dataset of air temperature profiles in the pressure range 100-0.1 mbar, covering mostly the latitudes south of 45S. We presented in Grassi et al. (2010, doi:10.1029/2009JE003553) an analysis of the mean properties of temperature profiles, once binned in the latitude/local time/pressure space. Here we discuss the preliminary findings of time-series analysis of data from individual bins. Despite the sparsity of most series, the Lomb-Scargle periodogram can be effectively applied in the regions south of 70S, where better coverage is made possible by specific properties of the Venus Express orbit. Here the algorithm is able to extract a clear signature related to a period of about 115-120 Earth days, i.e. one Venus solar day, particularly strong at the level around 10 mbar. Further analysis of average temperature fields in the latitude - longitude space demonstrated, for different local times during night, that air temperatures east of Lada Terra (most specifically in a region centered around 130°E and about 60° wide) are about 10K warmer than in other longitudes at 75S.
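
    The Lomb-Scargle periodogram handles exactly this kind of sparse, irregular sampling. The sketch below recovers a ~117-day period from a synthetic, unevenly sampled temperature-like series (the series and its parameters are invented for illustration, not VIRTIS data):

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(2)

# Irregularly sampled synthetic series with a 117-day period
# (roughly one Venus solar day); sampling times and values are invented.
t = np.sort(rng.uniform(0, 1000, 300))           # Earth days, sparse/uneven
period = 117.0
y = 230 + 5*np.sin(2*np.pi*t/period) + rng.normal(0, 1.0, t.size)

# Scan candidate periods and evaluate the periodogram at the
# corresponding angular frequencies.
periods = np.linspace(60, 200, 500)
omega = 2*np.pi / periods
power = lombscargle(t, y - y.mean(), omega)

best = periods[np.argmax(power)]
print(round(best, 1))
```
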

  4. ZWD time series analysis derived from NRT data processing. A regional study of PW in Greece.

    NASA Astrophysics Data System (ADS)

    Pikridas, Christos; Balidakis, Kyriakos; Katsougiannopoulos, Symeon

    2015-04-01

    ZWD (Zenith Wet/non-hydrostatic Delay) estimates have been routinely derived in Near Real Time by the newly established Analysis Center in the Department of Geodesy and Surveying of Aristotle University of Thessaloniki (DGS/AUT-AC), in the framework of E-GVAP (EUMETNET GNSS water vapour project), since October 2014. This process takes place on an hourly basis and yields, among other things, station coordinates and tropospheric parameter estimates for a network of 90+ permanent GNSS (Global Navigation Satellite System) stations distributed across the wider Hellenic region. In this study, the temporal and spatial variability of ZWD estimates was examined, as well as their relation with coordinate series extracted from both float and fixed solutions of the initial phase ambiguities. For this investigation, Bernese GNSS Software v5.2 was used to process the 6-month dataset from the aforementioned network. For time series analysis we employed techniques such as the Generalized Lomb-Scargle periodogram and Burg's maximum entropy method, owing to the inefficiency of applying the Discrete Fourier Transform to the test dataset. The analysis yielded interesting results for further geophysical interpretation. In addition, the spatial and temporal distributions of Precipitable Water vapour (PW) obtained from both ZWD estimates and ERA-Interim reanalysis grids were investigated.
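
    Burg's maximum entropy method can be implemented compactly. Below is a minimal numpy sketch of Burg's recursion, exercised on a synthetic AR(2) series standing in for a deseasonalised ZWD residual (the study itself used Bernese-derived series, not this toy data):

```python
import numpy as np

rng = np.random.default_rng(3)

def burg(x, order):
    """Burg's maximum-entropy AR fit: returns coefficients a such that
    x[n] + a[0]*x[n-1] + ... + a[p-1]*x[n-p] is approximately white."""
    f = np.array(x, float)            # forward prediction errors
    b = np.array(x, float)            # backward prediction errors
    a = np.zeros(0)
    for m in range(order):
        ff, bb = f[m+1:].copy(), b[m:-1].copy()
        k = -2.0 * ff.dot(bb) / (ff.dot(ff) + bb.dot(bb))  # reflection coeff.
        a = np.concatenate([a + k * a[::-1], [k]])          # Levinson update
        f[m+1:] = ff + k * bb
        b[m+1:] = bb + k * ff
    return a

# Synthetic AR(2) test series with a clear spectral peak.
n = 4000
x = np.zeros(n)
e = rng.normal(0, 1, n)
for i in range(2, n):
    x[i] = 1.5*x[i-1] - 0.7*x[i-2] + e[i]

a = burg(x, 2)                        # expect approximately [-1.5, 0.7]

# AR power spectrum 1/|1 + sum_k a[k] e^{-i 2 pi f (k+1)}|^2 and its peak.
freqs = np.linspace(0.0, 0.5, 1000)
H = 1 + a[0]*np.exp(-2j*np.pi*freqs) + a[1]*np.exp(-4j*np.pi*freqs)
f_peak = freqs[np.argmax(1.0 / np.abs(H)**2)]
print(np.round(a, 2), round(f_peak, 3))
```

Unlike the raw periodogram, the Burg spectrum stays smooth for short records, which is the method's appeal for a 6-month dataset.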

  5. Analysis of the gamma spectra of the uranium, actinium, and thorium decay series

    SciTech Connect

    Momeni, M.H.

    1981-09-01

    This report describes the identification of radionuclides in the uranium, actinium, and thorium series by analysis of gamma spectra in the energy range of 40 to 1400 keV. Energies and absolute efficiencies for each gamma line were measured by means of a high-resolution germanium detector and compared with those in the literature. A gamma spectroscopy method, which utilizes an on-line computer for deconvolution of spectra, search and identification of each line, and estimation of activity for each radionuclide, was used to analyze soil, uranium tailings, and ore.

  6. Applications of ARCH and GARCH time series analysis methods in study of Earth rotation

    NASA Astrophysics Data System (ADS)

    Hefty, J.; Kormonikova, M.; Bognár, T.

    Non-linear methods of Autoregressive Conditional Heteroskedasticity (ARCH) and Generalized Autoregressive Conditional Heteroskedasticity (GARCH) modelling are applied to the analysis of short-term (periods <100 days) fluctuations of ERP. It is shown that the 1-day sampled time series of x, y, and UT1R from 1993.0 to 1999.3 can be modelled as a linear autoregressive process with a non-linear, time-dependent variance. The latter is well modelled as a GARCH(1,1) process for x and y and an ARCH(2) process for UT1R.
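
    The defining feature that GARCH(1,1) captures, a time-dependent conditional variance on top of a serially uncorrelated signal, can be illustrated by simulation. The parameter values below are illustrative, not those fitted to the ERP series:

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulate GARCH(1,1): sigma2[t] = w + alpha*x[t-1]^2 + beta*sigma2[t-1],
# x[t] = sigma[t]*z[t] with standard normal z (parameters invented).
n, w, alpha, beta = 20000, 0.1, 0.1, 0.85
x = np.zeros(n)
sigma2 = np.full(n, w / (1 - alpha - beta))    # unconditional variance
for t in range(1, n):
    sigma2[t] = w + alpha * x[t-1]**2 + beta * sigma2[t-1]
    x[t] = np.sqrt(sigma2[t]) * rng.normal()

def lag1_autocorr(z):
    z = z - z.mean()
    return z[1:].dot(z[:-1]) / z.dot(z)

r_x = lag1_autocorr(x)       # raw series: ~0 (no linear predictability)
r_x2 = lag1_autocorr(x**2)   # squared series: > 0 (volatility clustering)
print(round(r_x, 3), round(r_x2, 3))
```

The contrast between the two autocorrelations is the diagnostic that motivates modelling the variance, not the level, of such series.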

  7. Measurement error in time-series analysis: a simulation study comparing modelled and monitored data

    PubMed Central

    2013-01-01

    Background Assessing health effects from background exposure to air pollution is often hampered by the sparseness of pollution monitoring networks. However, regional atmospheric chemistry-transport models (CTMs) can provide pollution data with national coverage at fine geographical and temporal resolution. We used statistical simulation to compare the impact on epidemiological time-series analysis of additive measurement error in sparse monitor data as opposed to geographically and temporally complete model data. Methods Statistical simulations were based on a theoretical area of 4 regions each consisting of twenty-five 5 km × 5 km grid-squares. In the context of a 3-year Poisson regression time-series analysis of the association between mortality and a single pollutant, we compared the error impact of using daily grid-specific model data as opposed to daily regional average monitor data. We investigated how this comparison was affected if we changed the number of grids per region containing a monitor. To inform simulations, estimates (e.g. of pollutant means) were obtained from observed monitor data for 2003–2006 for national network sites across the UK and corresponding model data that were generated by the EMEP-WRF CTM. Average within-site correlations between observed monitor and model data were 0.73 and 0.76 for rural and urban daily maximum 8-hour ozone respectively, and 0.67 and 0.61 for rural and urban loge(daily 1-hour maximum NO2). Results When regional averages were based on 5 or 10 monitors per region, health effect estimates exhibited little bias. However, with only 1 monitor per region, the regression coefficient in our time-series analysis was attenuated by an estimated 6% for urban background ozone, 13% for rural ozone, 29% for urban background loge(NO2) and 38% for rural loge(NO2). For grid-specific model data the corresponding figures were 19%, 22%, 54% and 44% respectively, i.e. similar for rural loge(NO2) but more marked for urban loge(NO2).
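
    The attenuation mechanism studied here can be reproduced in a toy simulation: classical measurement error in the exposure biases the Poisson regression coefficient toward zero, more strongly when fewer monitors are averaged. All numbers below are illustrative, not the paper's UK setup:

```python
import numpy as np

rng = np.random.default_rng(5)

def poisson_fit(x, y, iters=25):
    """Poisson regression log E[y] = b0 + b1*x, fitted by Newton-Raphson."""
    X = np.column_stack([np.ones_like(x), x])
    beta = np.array([np.log(y.mean()), 0.0])      # start at the null model
    for _ in range(iters):
        mu = np.exp(X @ beta)
        beta = beta + np.linalg.solve(X.T @ (X * mu[:, None]), X.T @ (y - mu))
    return beta

# Three years of daily counts driven by a true regional exposure series z.
n = 3 * 365
z = rng.normal(0, 1, n)                            # true exposure, standardised
y = rng.poisson(np.exp(np.log(50.0) + 0.1 * z))    # ~50 events/day

b_true = poisson_fit(z, y)[1]
ratios = {}
for k in (10, 1):                                  # monitors averaged per region
    x_err = z + rng.normal(0, 1.0, (k, n)).mean(axis=0)   # classical error
    ratios[k] = poisson_fit(x_err, y)[1] / b_true
print({k: round(v, 2) for k, v in ratios.items()})
```

Averaging 10 monitors shrinks the error variance tenfold, so the estimated coefficient stays near the truth; a single noisy monitor attenuates it markedly, echoing the paper's 1-monitor results.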

  8. Student Rights and Student Discipline. School Leadership Digest Series, Number 13. ERIC/CEM Research Analysis Series, Number 15.

    ERIC Educational Resources Information Center

    Schofield, Dee

    This analysis of the research outlines the history of the conflict over student rights--a conflict that has its basis in American political and social philosophy. The author views the tension between those who favor the expansion of civil rights for students and those who advocate a return to discipline based on the in loco parentis doctrine as…

  9. Short-term pollution forecasts based on linear and nonlinear methods of time series analysis

    NASA Astrophysics Data System (ADS)

    Russo, A.; Trigo, R. M.

    2012-04-01

    Urban air pollution is a complex mixture of toxic components, which may induce acute and chronic responses in sensitive groups, such as children and people with pre-existing heart and respiratory insufficiencies. Moreover, air pollution presents highly chaotic and non-linear behavior. In this work we analyzed several pollutant time series recorded in the urban area of Lisbon (Portugal) for the 2002-2006 period. Linear and nonlinear methods were applied in order to assess the main trends and fluctuations of NO2, PM10 and O3, and finally to produce daily forecasts of these pollutants. Here we evaluate the potential of linear and non-linear neural networks (NN) to produce short-term forecasts, and also the contribution of meteorological variables (daily mean temperature, radiation, wind speed and direction, boundary layer height, humidity) to pollutant dispersion. Additionally, we assess the role of large-scale circulation patterns, usually referred to as weather types (WT) (from the ERA40/ECMWF and ECMWF SLP databases), in the occurrence of previously identified critical pollution events. The presence and importance of trends and fluctuations are addressed by means of two modelling approaches: (1) raw data modelling; and (2) residuals modelling (after the removal of the trends from the original data). The relative importance of two periodic components, the weekly and monthly cycles, is addressed. For all three pollutants, the approach based on the removal of the weekly cycle gives the best results, compared to the removal of the monthly cycle or the use of raw data. The best predictors are chosen independently for each monitoring station and pollutant through an objective procedure (backward stepwise regression). The analysis reveals that the most significant variables in predicting NO2 concentration are several NO2 measures, wind direction and speed, and global radiation, while for O3 they correspond to several O3 measures, O3 precursors and WT.

  10. Using a neural network approach and time series data from an international monitoring station in the Yellow Sea for modeling marine ecosystems.

    PubMed

    Zhang, Yingying; Wang, Juncheng; Vorontsov, A M; Hou, Guangli; Nikanorova, M N; Wang, Hongliang

    2014-01-01

    The international marine ecological safety monitoring demonstration station in the Yellow Sea was developed as a collaborative project between China and Russia. It is a nonprofit technical workstation designed as a facility for marine scientific research for public welfare. By undertaking long-term monitoring of the marine environment and automatic data collection, this station will provide valuable information for marine ecological protection and disaster prevention and reduction. The results of some initial research by scientists at the research station into predictive modeling of marine ecological environments and early warning are described in this paper. Marine ecological processes are influenced by many factors including hydrological and meteorological conditions, biological factors, and human activities. Consequently, it is very difficult to incorporate all these influences and their interactions in a deterministic or analytical model. A prediction model integrating a time series prediction approach with neural network nonlinear modeling is proposed for marine ecological parameters. The model explores the natural fluctuations in marine ecological parameters by learning from the latest observed data automatically, and then predicting future values of the parameter. The model is updated in a "rolling" fashion with new observed data from the monitoring station. Prediction experiments showed that the neural network prediction model based on time series data is effective for marine ecological prediction and can be used in the development of early warning systems.
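
    The "rolling" retraining scheme can be sketched with a linear autoregression standing in for the neural network (the hourly series below is invented; the station's real parameters are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic hourly "ecological parameter" with a daily cycle plus noise.
n = 24 * 60
t = np.arange(n)
series = 8 + 2*np.sin(2*np.pi*t/24) + rng.normal(0, 0.3, n)

def fit_ar(x, p):
    """Least-squares AR(p) one-step predictor with intercept."""
    X = np.column_stack([x[i:len(x)-p+i] for i in range(p)]
                        + [np.ones(len(x) - p)])
    w, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return w

# Rolling scheme: refit on the latest window whenever new data arrive,
# then predict one step ahead (a linear stand-in for the paper's NN).
p, window = 24, 24 * 14
errs = []
for i in range(window, n):
    w = fit_ar(series[i-window:i], p)
    x_last = np.concatenate([series[i-p:i], [1.0]])
    errs.append(series[i] - x_last @ w)
rmse = float(np.sqrt(np.mean(np.square(errs))))
print(round(rmse, 3))
```

The one-step error settles near the noise floor of the synthetic series, illustrating why continuously re-learning from the latest observations is attractive for slowly drifting environmental signals.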

  11. Approaches to Cycle Analysis and Performance Metrics

    NASA Technical Reports Server (NTRS)

    Parson, Daniel E.

    2003-01-01

    The following notes were prepared as part of an American Institute of Aeronautics and Astronautics (AIAA) sponsored short course entitled Air Breathing Pulse Detonation Engine (PDE) Technology. The course was presented in January of 2003, and again in July of 2004 at two different AIAA meetings. It was taught by seven instructors, each of whom provided information on particular areas of PDE research. These notes cover two areas. The first is titled Approaches to Cycle Analysis and Performance Metrics. Here, the various methods of cycle analysis are introduced. These range from algebraic, thermodynamic equations, to single and multi-dimensional Computational Fluid Dynamic (CFD) solutions. Also discussed are the various means by which performance is measured, and how these are applied in a device which is fundamentally unsteady. The second topic covered is titled PDE Hybrid Applications. Here the concept of coupling a PDE to a conventional turbomachinery based engine is explored. Motivation for such a configuration is provided in the form of potential thermodynamic benefits. This is accompanied by a discussion of challenges to the technology.

  12. A modular approach to linear uncertainty analysis.

    PubMed

    Weathers, J B; Luck, R; Weathers, J W

    2010-01-01

    This paper introduces a methodology to simplify the uncertainty analysis of large-scale problems where many outputs and/or inputs are of interest. The modular uncertainty technique presented here can be utilized to analyze the results spanning a wide range of engineering problems with constant sensitivities within parameter uncertainty bounds. The proposed modular approach provides the same results as the traditional propagation of errors methodology with fewer conceptual steps allowing for a relatively straightforward implementation of a comprehensive uncertainty analysis effort. The structure of the modular technique allows easy integration into most experimental/modeling programs or data acquisition systems. The proposed methodology also provides correlation information between all outputs, thus providing information not easily obtained using the traditional uncertainty process based on analyzing one data reduction equation (DRE)/model at a time. Finally, the paper presents a straightforward methodology to obtain the covariance matrix for the input variables using uncorrelated elemental sources of systematic uncertainties along with uncorrelated sources corresponding to random uncertainties.
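
    The core computation, propagating an input covariance (random plus shared systematic sources) through constant sensitivities to obtain a full output covariance with inter-output correlations, looks like this for a hypothetical two-output example (all values invented):

```python
import numpy as np

# Two data-reduction equations sharing inputs (a hypothetical example):
#   r1 = a * b,  r2 = a / c, evaluated at the measured values below.
a, b, c = 10.0, 2.0, 4.0

# Input covariance: independent random uncertainties plus one shared
# systematic source linking a and b (values are illustrative).
u_rand = np.diag([0.1**2, 0.05**2, 0.08**2])
s = np.array([0.05, 0.02, 0.0])            # sensitivity to the shared source
cov_in = u_rand + np.outer(s, s)

# Jacobian of (r1, r2) w.r.t. (a, b, c), constant near the operating point.
J = np.array([[b,     a,   0.0],
              [1.0/c, 0.0, -a/c**2]])

cov_out = J @ cov_in @ J.T                 # propagated output covariance
u1, u2 = np.sqrt(np.diag(cov_out))
corr12 = cov_out[0, 1] / (u1 * u2)
print(round(u1, 3), round(u2, 3), round(corr12, 3))
```

The off-diagonal term is exactly the between-output correlation information that the abstract notes is hard to obtain when each data reduction equation is analyzed in isolation.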

  13. A time series analysis of multiple ambient pollutants to investigate the underlying air pollution dynamics and interactions.

    PubMed

    Yu, Hwa-Lung; Lin, Yuan-Chien; Kuo, Yi-Ming

    2015-09-01

    Understanding the temporal dynamics and interactions of particulate matter (PM) concentration and composition is important for air quality control. This paper applied a dynamic factor analysis method (DFA) to reveal the underlying mechanisms of nonstationary variations in twelve ambient concentrations of aerosols and gaseous pollutants, and the associations with meteorological factors. This approach can account for the uncertainties and temporal dependences of time series data. The common trends of the yearlong and three selected diurnal variations were obtained to characterize the dominant processes occurring in general and specific scenarios in Taipei during 2009 (i.e., during Asian dust storm (ADS) events, rainfall, and under normal conditions). The results revealed two distinct yearlong NOx transformation processes, and demonstrated that traffic emissions and photochemical reactions both critically influence diurnal variation, depending upon meteorological conditions. During an ADS event, transboundary transport and distinct weather conditions both influenced the temporal pattern of identified common trends. This study shows that the DFA method can effectively extract meaningful latent processes from time series data and provide insight into the dominant associations and interactions in complex air pollution processes. PMID:25600321

  14. The 2009-2010 Guerrero Slow Slip Event Monitored by InSAR, Using Time Series Approach

    NASA Astrophysics Data System (ADS)

    Bacques, G.; Pathier, E.; Lasserre, C.; Cotton, F.; Radiguet, M.; Cycle Sismique et Déformations Transitoires

    2011-12-01

    A time series approach is useful for monitoring the evolution of ground deformation during slow slip events and makes mapping slip propagation on the subduction plane a promising goal. Here we present our first results concerning the 2009-2010 slow slip events, in particular the distribution of the cumulative surface displacement in LOS (satellite Line Of Sight), the associated slip distribution on the fault plane, and the evolution of ground deformation. Finally, we open the discussion with a first comparison between the 2009-2010 and 2006 events, which reveals some differences in the amplitude and distribution of the ground deformation.

  15. Insights into soil carbon dynamics across climatic and geologic gradients from time-series and fraction-specific radiocarbon analysis

    NASA Astrophysics Data System (ADS)

    van der Voort, Tessa Sophia; Hagedorn, Frank; Zell, Claudia; McIntyre, Cameron; Eglinton, Tim

    2016-04-01

    Understanding the interaction between soil organic matter (SOM) and climatic, geologic and ecological factors is essential for the understanding of potential susceptibility and vulnerability to climate and land use change. Radiocarbon constitutes a powerful tool for unraveling SOM dynamics and is increasingly used in studies of carbon turnover. The complex and inherently heterogeneous nature of SOM renders it challenging to assess the processes that govern SOM stability by solely looking at the bulk signature on a plot-scale level. This project combines bulk radiocarbon measurements on a regional scale spanning wide climatic and geologic gradients with a more in-depth approach for a subset of locations. For this subset, time-series and carbon pool-specific radiocarbon data have been acquired for both topsoil and deeper soils. These well-studied sites are part of the Long-Term Forest Ecosystem Research (LWF) program of the Swiss Federal Institute for Forest, Snow and Landscape research (WSL). Statistical analysis was performed to examine relationships of radiocarbon signatures with variables such as temperature, precipitation and elevation. Bomb-curve modeling was applied to determine carbon turnover using time-series data. Results indicate that (1) there is no significant correlation between Δ14C signature and environmental conditions except a weak positive correlation with mean annual temperature, (2) vertical gradients in Δ14C signatures in surface and deeper soils are highly similar despite covering disparate soil types and climatic systems, and (3) radiocarbon signatures vary significantly between time-series samples and carbon pools. Overall, this study provides a uniquely comprehensive dataset that allows for a better understanding of links between carbon dynamics and environmental settings, as well as for pool-specific and long-term trends in carbon (de)stabilization.

  16. Fisher-Shannon information plane analysis of SPOT/VEGETATION Normalized Difference Vegetation Index (NDVI) time series to characterize vegetation recovery after fire disturbance

    NASA Astrophysics Data System (ADS)

    Lanorte, Antonio; Lasaponara, Rosa; Lovallo, Michele; Telesca, Luciano

    2014-02-01

    The time dynamics of SPOT-VEGETATION Normalized Difference Vegetation Index (NDVI) time series are analyzed by using the statistical approach of the Fisher-Shannon (FS) information plane to assess and monitor vegetation recovery after fire disturbance. Fisher-Shannon information plane analysis allows us to gain insight into the complex structure of a time series and to quantify its degree of organization and order. The analysis was carried out using 10-day Maximum Value Composites of NDVI (MVC-NDVI) with a 1 km × 1 km spatial resolution. The investigation was performed on two test sites located in Galizia (North Spain) and the Peloponnese (South Greece), selected for the vast fires which occurred during the summers of 2006 and 2007 and for their different vegetation covers, made up mainly of low shrubland in the Galizia test site and evergreen forest in the Peloponnese. Time series of MVC-NDVI have been analyzed before and after the occurrence of the fire events. Results obtained for both investigated areas clearly point out that the dynamics of the pixel time series before the fire are characterized by a larger degree of disorder and uncertainty, while the pixel time series after the fire are characterized by a higher degree of organization and order. This discrimination is more evident for the Peloponnese fire than for the Galizia fire. This suggests a clear possibility to discriminate the different post-fire behaviors and dynamics exhibited by the different vegetation covers.
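
    The Fisher-Shannon plane places a series by its Shannon entropy power N (disorder) and Fisher information measure I (order). A histogram-based sketch, using invented pre-fire (noisy) and post-fire (regular) NDVI-like series, shows the expected separation; this is a simplified estimator, not the paper's exact procedure:

```python
import numpy as np

rng = np.random.default_rng(7)

def fisher_shannon(x, bins=60):
    """Estimate the Fisher information measure I and Shannon entropy power N
    of a series' amplitude distribution from a histogram density."""
    p, edges = np.histogram(x, bins=bins, density=True)
    dx = edges[1] - edges[0]
    nz = p > 0
    H = -np.sum(p[nz] * np.log(p[nz])) * dx        # differential entropy
    N = np.exp(2 * H) / (2 * np.pi * np.e)         # entropy power
    dp = np.gradient(p, dx)
    I = np.sum(dp[nz]**2 / p[nz]) * dx             # Fisher information
    return I, N

t = np.arange(4096)
pre_fire = rng.normal(0.5, 0.15, t.size)                 # disordered series
post_fire = 0.3 + 0.1*np.sin(2*np.pi*t/365) + rng.normal(0, 0.02, t.size)

I_pre, N_pre = fisher_shannon(pre_fire)
I_post, N_post = fisher_shannon(post_fire)
print(round(N_pre, 4), round(N_post, 4))
```

The ordered post-fire series lands at low entropy power and high Fisher information, mirroring the higher organization the study reports after fire.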

  17. Event coincidence analysis for quantifying statistical interrelationships between event time series. On the role of flood events as triggers of epidemic outbreaks

    NASA Astrophysics Data System (ADS)

    Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.

    2016-05-01

    Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis to provide a framework for quantifying the strength, directionality and time lag of statistical interrelationships between event series. Event coincidence analysis allows one to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
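
    A minimal version of event coincidence analysis counts, for each event in one series, whether a second-series event follows within a tolerance window, and compares that rate against Poisson surrogates. The flood/outbreak numbers below are synthetic, invented purely to exercise the method:

```python
import numpy as np

rng = np.random.default_rng(8)

def coincidence_rate(a_times, b_times, delta=2.0, tau=0.0):
    """Fraction of events in `a_times` followed within (tau, tau+delta]
    by at least one event in `b_times` (trigger coincidence rate)."""
    b = np.asarray(b_times)
    hits = sum(np.any((b > t + tau) & (b <= t + tau + delta)) for t in a_times)
    return hits / len(a_times)

# Synthetic records: floods occur at random; each flood triggers an
# outbreak 0-2 time units later with probability 0.7, on top of background.
floods = np.sort(rng.uniform(0, 1000, 60))
mask = rng.random(60) < 0.7
triggered = floods[mask] + rng.uniform(0, 2, mask.sum())
outbreaks = np.sort(np.concatenate([triggered, rng.uniform(0, 1000, 20)]))

r_obs = coincidence_rate(floods, outbreaks, delta=2.0)

# Null model: Poisson (uniform) surrogate outbreak series of the same size.
surr = [coincidence_rate(floods, rng.uniform(0, 1000, outbreaks.size))
        for _ in range(200)]
pval = np.mean([r >= r_obs for r in surr])
print(round(r_obs, 2), round(pval, 3))
```

A small p-value indicates that the observed flood-to-outbreak coincidence rate is far above what independent Poisson event series would produce.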

  18. Online Time Series Analysis of Land Products over Asia Monsoon Region via Giovanni

    NASA Technical Reports Server (NTRS)

    Shen, Suhung; Leptoukh, Gregory G.; Gerasimov, Irina

    2011-01-01

    Time series analysis is critical to the study of land cover/land use change and climate. Time series studies at local-to-regional scales require data at higher spatial resolution, such as 1 km or less. MODIS land products at 250 m to 1 km resolution enable such studies. However, because of large data volumes, these MODIS land data files are distributed in 10°x10° tiles. Conducting a time series study requires downloading all tiles that include the study area for the time period of interest, and mosaicking the tiles spatially. This can be an extremely time-consuming process. In support of the Monsoon Asia Integrated Regional Study (MAIRS) program, NASA GES DISC (Goddard Earth Sciences Data and Information Services Center) has processed MODIS land products at 1 km resolution over the Asia monsoon region (0°-60°N, 60°-150°E) with a common data structure and format. The processed data have been integrated into the Giovanni system (Goddard Interactive Online Visualization ANd aNalysis Infrastructure), which enables users to easily explore, analyze, and download data over an area and time period of interest. Currently, the following regional MODIS land products are available in Giovanni: 8-day 1 km land surface temperature and active fire, monthly 1 km vegetation index, and yearly 0.05° and 500 m land cover types. More data will be added in the near future. By combining atmospheric and oceanic data products in the Giovanni system, it is possible to further analyze environmental and climate changes associated with the land, ocean, and atmosphere. This presentation demonstrates exploring land products in the Giovanni system with sample case scenarios.

  19. Error Analysis of the IGS repro2 Station Position Time Series

    NASA Astrophysics Data System (ADS)

    Rebischung, P.; Ray, J.; Benoist, C.; Metivier, L.; Altamimi, Z.

    2015-12-01

    Eight Analysis Centers (ACs) of the International GNSS Service (IGS) have completed a second reanalysis campaign (repro2) of the GNSS data collected by the IGS global tracking network back to 1994, using the latest available models and methodology. The AC repro2 contributions include in particular daily terrestrial frame solutions, for the first time with sub-weekly resolution for the full IGS history. The AC solutions, comprising positions for 1848 stations with daily polar motion coordinates, were combined to form the IGS contribution to the next release of the International Terrestrial Reference Frame (ITRF2014). Inter-AC position consistency is excellent, about 1.5 mm horizontal and 4 mm vertical. The resulting daily combined frames were then stacked into a long-term cumulative frame assuming generally linear motions, which constitutes the GNSS input to the ITRF2014 inter-technique combination. A special challenge involved identifying the many position discontinuities, averaging about 1.8 per station. A stacked periodogram of the station position residual time series from this long-term solution reveals a number of unexpected spectral lines (harmonics of the GPS draconitic year, fortnightly tidal lines) on top of a white+flicker background noise and strong seasonal variations. In this study, we will present results from station- and AC-specific analyses of the noise and periodic errors present in the IGS repro2 station position time series. So as to better understand their sources, and in view of developing a spatio-temporal error model, we will focus in particular on the spatial distribution of the noise characteristics and of the periodic errors. By computing AC-specific long-term frames and analyzing the respective residual time series, we will additionally study how the characteristics of the noise and of the periodic errors depend on the adopted analysis strategy and reduction software.

  20. Metals Analysis Results for the Structural Qualification Test Series (SQTS) 01 - 05.

    SciTech Connect

    Zalk, D

    2006-04-11

    Enclosed is the report summarizing the metals analysis results at the Contained Firing Facility (CFF), during SQTS 01 - 05. This metals analysis includes evaluation of a bulk dust and surface swipe sampling protocol during the testing series that obtained samples at 3 primary locations in the CFF chamber area. The sampling protocol for each of the bulk dust samples involves an assessment of the concentration for 20 different metals, the oxidation state of selected metals, a particle size selective analysis, and morphological information. In addition, surface swipes were taken during SQTS 05 on the equipment and personnel door frames to indicate the characteristics of airborne metals due to leakage past the gasket seals. The bulk dust metals analysis indicates a nearly complete conversion of the aluminum casing to an oxide form with an even split between spherical and non-spherical morphology. Size selective analysis shows 83% of the particulates are in the inhalable size range of less than 100 microns and 46% are in the respirable range of less than 10 microns. Combining metals analysis and leakage results indicate the potential for a problematic personal exposure to metals external to the chamber unless modifications are made. Please feel free to call me at 2-8904 if you have any questions or if I may be of further service.

  1. Nitrogen isotopes in Tree-Rings - An approach combining soil biogeochemistry and isotopic long series with statistical modeling

    NASA Astrophysics Data System (ADS)

    Savard, Martine M.; Bégin, Christian; Paré, David; Marion, Joëlle; Laganière, Jérôme; Séguin, Armand; Stefani, Franck; Smirnoff, Anna

    2016-04-01

    Monitoring atmospheric emissions from industrial centers in North America generally started less than 25 years ago. To compensate for the lack of monitoring, previous investigations have interpreted tree-ring N changes using the known chronology of human activities, without facing the challenge of separating climatic effects from potential anthropogenic impacts. Here we document such an attempt conducted in the oil sands (OS) mining region of Northeastern Alberta, Canada. The reactive nitrogen (Nr)-emitting oil extraction operations began in 1967, but air quality measurements were only initiated in 1997. To investigate if the beginning and intensification of OS operations induced changes in the forest N-cycle, we sampled white spruce (Picea glauca (Moench) Voss) stands located at various distances from the main mining area, and receiving low, but different N deposition. Our approach combines soil biogeochemical and metagenomic characterization with long, well dated, tree-ring isotopic series. To objectively delineate the natural N isotopic behaviour in trees, we have characterized tree-ring N isotope (15N/14N) ratios between 1880 and 2009, used statistical analyses of the isotopic values and local climatic parameters of the pre-mining period to calibrate response functions and project the isotopic responses to climate during the extraction period. During that period, the measured series depart negatively from the projected natural trends. In addition, these long-term negative isotopic trends are better reproduced by multiple-regression models combining climatic parameters with the proxy for regional mining Nr emissions. These negative isotopic trends point towards changes in the forest soil biogeochemical N cycle. The biogeochemical data and ultimate soil mechanisms responsible for such changes will be discussed during the presentation.

  2. Beyond Fractals and 1/f Noise: Multifractal Analysis of Complex Physiological Time Series

    NASA Astrophysics Data System (ADS)

    Ivanov, Plamen Ch.; Amaral, Luis A. N.; Ashkenazy, Yosef; Stanley, H. Eugene; Goldberger, Ary L.; Hausdorff, Jeffrey M.; Yoneyama, Mitsuru; Arai, Kuniharu

    2001-03-01

    We investigate time series with 1/f-like spectra generated by two physiologic control systems --- the human heartbeat and human gait. We show that physiological fluctuations exhibit unexpected ``hidden'' structures often described by scaling laws. In particular, our studies indicate that when analyzed on different time scales the heartbeat fluctuations exhibit cascades of branching patterns with self-similar (fractal) properties, characterized by long-range power-law anticorrelations. We find that these scaling features change during sleep and wake phases, and with pathological perturbations. Further, by means of a new wavelet-based technique, we find evidence of multifractality in the healthy human heartbeat even under resting conditions, and show that the multifractal character and nonlinear properties of the healthy heart are encoded in the Fourier phases. We uncover a loss of multifractality for a life-threatening condition, congestive heart failure. In contrast to the heartbeat, we find that the interstride interval time series of healthy human gait, a voluntary process under neural regulation, is described by a single fractal dimension (such as classical 1/f noise) indicating monofractal behavior. Thus our approach can help distinguish physiological and physical signals with comparable frequency spectra and two-point correlations, and guide modeling of their control mechanisms.
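
    A (mono)fractal scaling exponent of the kind discussed here can be estimated with detrended fluctuation analysis (DFA). The sketch below is the simplest monofractal version, not the wavelet-based multifractal method of the paper; it distinguishes uncorrelated noise (alpha ≈ 0.5) from its integral, a random walk (alpha ≈ 1.5):

```python
import numpy as np

rng = np.random.default_rng(9)

def dfa_alpha(x, scales):
    """Detrended fluctuation analysis: slope of log F(s) vs log s."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        rms = []
        for seg in segs:                       # detrend each segment linearly
            c = np.polyfit(t, seg, 1)
            rms.append(np.mean((seg - np.polyval(c, t))**2))
        F.append(np.sqrt(np.mean(rms)))
    slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return slope

scales = np.array([16, 32, 64, 128, 256])
white = rng.normal(size=8192)                  # uncorrelated: alpha ~ 0.5
walk = np.cumsum(rng.normal(size=8192))        # integrated: alpha ~ 1.5
a_white = dfa_alpha(white, scales)
a_walk = dfa_alpha(walk, scales)
print(round(a_white, 2), round(a_walk, 2))
```

Interbeat or interstride series with alpha near 0.5 behave like uncorrelated noise, while values near 1 indicate the long-range power-law correlations discussed in the abstract.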

  3. Load Balancing Using Time Series Analysis for Soft Real Time Systems with Statistically Periodic Loads

    NASA Technical Reports Server (NTRS)

    Hailperin, Max

    1993-01-01

    This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that our techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. Our method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive methods.

  4. Use of a prototype pulse oximeter for time series analysis of heart rate variability

    NASA Astrophysics Data System (ADS)

    González, Erika; López, Jehú; Hautefeuille, Mathieu; Velázquez, Víctor; Del Moral, Jésica

    2015-05-01

    This work presents the development of a low-cost pulse oximeter prototype consisting of pulsed red and infrared commercial LEDs and a broad-spectrum photodetector, used to register time series of heart rate and blood oxygen saturation. Besides providing these values like any other pulse oximeter, the platform processes the signals to compute a power spectrum analysis of the patient's heart rate variability in real time; additionally, the device allows access to all raw and analyzed data in case database construction or further analysis is desired. Since the prototype is capable of acquiring data over long periods of time, it is suitable for collecting data during real-life activities, enabling the development of future wearable applications.
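    The power-spectrum step can be sketched with a direct periodogram of an evenly resampled tachogram. This is a hedged illustration, not the prototype's actual processing: the 4 Hz resampling rate, the 0.25 Hz respiratory-band modulation, and all signal parameters below are invented for the demo.

```python
import cmath
import math

fs = 4.0                 # assumed resampling rate of the tachogram (Hz)
n = 256
t = [i / fs for i in range(n)]
# Synthetic R-R series: 0.8 s mean with a 0.25 Hz (respiratory-band) modulation.
rr = [0.8 + 0.05 * math.sin(2 * math.pi * 0.25 * ti) for ti in t]

mean = sum(rr) / n
x = [v - mean for v in rr]
# One-sided periodogram via a direct DFT (fine for a sketch at this size).
power = []
for k in range(n // 2):
    s = sum(x[j] * cmath.exp(-2j * math.pi * k * j / n) for j in range(n))
    power.append(abs(s) ** 2 / n)
# Dominant HRV frequency: should land on the 0.25 Hz modulation.
peak_hz = max(range(1, n // 2), key=lambda k: power[k]) * fs / n
```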

  5. The ICR142 NGS validation series: a resource for orthogonal assessment of NGS analysis.

    PubMed

    Ruark, Elise; Renwick, Anthony; Clarke, Matthew; Snape, Katie; Ramsay, Emma; Elliott, Anna; Hanks, Sandra; Strydom, Ann; Seal, Sheila; Rahman, Nazneen

    2016-01-01

    To provide a useful community resource for orthogonal assessment of NGS analysis software, we present the ICR142 NGS validation series. The dataset includes high-quality exome sequence data from 142 samples together with Sanger sequence data at 730 sites; 409 sites with variants and 321 sites at which variants were called by an NGS analysis tool, but no variant is present in the corresponding Sanger sequence. The dataset includes 286 indel variants and 275 negative indel sites, and thus the ICR142 validation dataset is of particular utility in evaluating indel calling performance. The FASTQ files and Sanger sequence results can be accessed in the European Genome-phenome Archive under the accession number EGAS00001001332.

  6. Joint statistical analysis of multichannel time series from single quantum dot-(Cy5)n constructs.

    PubMed

    Xu, C Shan; Kim, Hahkjoon; Hayden, Carl C; Yang, Haw

    2008-05-15

    One of the major challenges in single-molecule studies is how to extract reliable information from the inevitably noisy data. Here, we demonstrate the unique capabilities of multichannel joint statistical analysis of multispectral time series using Förster resonance energy transfer (FRET) in single quantum dot (QD)-organic dye hybrids as a model system. The multispectral photon-by-photon registration allows model-free determination of intensity change points of the donor and acceptor channels independently. The subsequent joint analysis of these change points gives high-confidence assignments of acceptor photobleaching events despite the interference from background noise and from intermittent blinking of the QD donors and acceptors themselves. Finally, the excited-state lifetimes of donors and acceptors are calculated using the joint maximum likelihood estimation (MLE) method on the donor and acceptor decay profiles, guided by a four-state kinetics model.

  7. A lengthy look at the daily grind: time series analysis of events, mood, stress, and satisfaction.

    PubMed

    Fuller, Julie A; Stanton, Jeffrey M; Fisher, Gwenith G; Spitzmuller, Christiane; Russell, Steven S; Smith, Patricia C

    2003-12-01

    The present study investigated the processes by which job stress and satisfaction unfold over time by examining the relations between daily stressful events, mood, and these variables. Using a Web-based daily survey of stressor events, perceived strain, mood, and job satisfaction completed by 14 university workers, 1,060 occasions of data were collected. Transfer function analysis, a multivariate version of time series analysis, was used to examine the data for relationships among the measured variables after factoring out the contaminating influences of serial dependency. Results revealed a contrast effect in which a stressful event was positively associated with strain on the same day and negatively associated with strain on the following day. Perceived strain increased over the course of a semester for a majority of participants, suggesting that the effects of stress build over time. Finally, the data were consistent with the notion that job satisfaction is a distal outcome that is mediated by perceived strain. PMID:14640813
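    A full transfer function analysis involves prewhitening and ARIMA modeling, but the flavor of the reported contrast effect can be simulated with a two-lag least-squares regression. The coefficients (+0.6 same-day, -0.3 next-day), the noise level, and the sample size below are invented for illustration, not taken from the study.

```python
import random

random.seed(7)
n = 400
x = [random.gauss(0, 1) for _ in range(n)]          # daily stressor events
noise = [random.gauss(0, 0.2) for _ in range(n)]
# Simulated "strain": responds positively to today's stressor and
# negatively to yesterday's (the contrast effect described above).
y = [0.6 * x[t] - 0.3 * x[t - 1] + noise[t] for t in range(1, n)]

p1, p2 = x[1:], x[:-1]                              # same-day and lag-1 predictors
s11 = sum(a * a for a in p1)
s22 = sum(a * a for a in p2)
s12 = sum(a * b for a, b in zip(p1, p2))
c1 = sum(a * b for a, b in zip(p1, y))
c2 = sum(a * b for a, b in zip(p2, y))
# Solve the 2x2 normal equations by Cramer's rule.
det = s11 * s22 - s12 * s12
b_same = (c1 * s22 - c2 * s12) / det                # estimate of +0.6
b_next = (s11 * c2 - s12 * c1) / det                # estimate of -0.3
```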

  8. [Evolution of child undernutrition in Chile and some of its conditioning factors: a time series analysis].

    PubMed

    Amigo, H; Díaz, L; Pino, P; Vera, G

    1994-06-01

    The objective of this study was to determine the evolution of the nutritional status of the population under five years of age during the period 1975-1990, along with several of its conditioning factors. The information was evaluated through time series analysis using the AREG procedure, which estimates a regression model corrected for the autocorrelation of errors. Results indicate a significant downward trend in undernutrition rates (p < 0.0001). A seasonal effect on undernutrition was observed, with higher prevalences in summer. The selected conditioning factors analyzed, such as family purchasing power, remained stable during the period. An exception to the lack of association between undernutrition and the evaluated conditioning factors was seen during 1975-1982, when a clear inverse relationship was evident. In conclusion, the decrease in child undernutrition in Chile during 1975-1990 was not related to the changes observed in certain socioeconomic indices. PMID:7733798

  9. The ICR142 NGS validation series: a resource for orthogonal assessment of NGS analysis

    PubMed Central

    Ruark, Elise; Renwick, Anthony; Clarke, Matthew; Snape, Katie; Ramsay, Emma; Elliott, Anna; Hanks, Sandra; Strydom, Ann; Seal, Sheila; Rahman, Nazneen

    2016-01-01

    To provide a useful community resource for orthogonal assessment of NGS analysis software, we present the ICR142 NGS validation series. The dataset includes high-quality exome sequence data from 142 samples together with Sanger sequence data at 730 sites; 409 sites with variants and 321 sites at which variants were called by an NGS analysis tool, but no variant is present in the corresponding Sanger sequence. The dataset includes 286 indel variants and 275 negative indel sites, and thus the ICR142 validation dataset is of particular utility in evaluating indel calling performance. The FASTQ files and Sanger sequence results can be accessed in the European Genome-phenome Archive under the accession number EGAS00001001332. PMID:27158454

  10. Multifractal analysis of visibility graph-based Ito-related connectivity time series.

    PubMed

    Czechowski, Zbigniew; Lovallo, Michele; Telesca, Luciano

    2016-02-01

    In this study, we investigate the multifractal properties of connectivity time series resulting from the visibility graph applied to normally distributed time series generated by Ito equations with multiplicative power-law noise. We show that the multifractality of the connectivity time series (i.e., the series of the number of links outgoing from each node) increases with the exponent of the power-law noise. The multifractality of the connectivity time series could be due to the width of the connectivity degree distribution, which can be related to the exit time of the associated Ito time series. Furthermore, the connectivity time series are characterized by persistence, although the original Ito time series are random; this is due to the visibility graph procedure, which, by connecting the values of the time series, generates persistence but destroys most of the nonlinear correlations. Moreover, the visibility graph is sensitive in detecting wide "depressions" in the input time series.
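    The connectivity series analyzed here comes from the natural visibility graph: sample i connects to sample j when every intermediate sample lies strictly below the straight line joining them. A minimal O(n^2) sketch of the degree (connectivity) series, as a generic illustration of the construction rather than the authors' code:

```python
def visibility_degrees(x):
    """Degree of each node in the natural visibility graph of series x.
    Nodes a and b are linked when every intermediate sample sits strictly
    below the straight line joining (a, x[a]) and (b, x[b])."""
    n = len(x)
    deg = [0] * n
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                x[c] < x[a] + (x[b] - x[a]) * (c - a) / (b - a)
                for c in range(a + 1, b)   # empty range => adjacent, visible
            )
            if visible:
                deg[a] += 1
                deg[b] += 1
    return deg
```

For a strictly monotone series only adjacent samples see each other, while a dip between two peaks leaves all three mutually visible.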

  11. [Local fractal analysis of noise-like time series by all permutations method for 1-115 min periods].

    PubMed

    Panchelyuga, V A; Panchelyuga, M S

    2015-01-01

    Results of a local fractal analysis of 329-per-day time series of 239Pu alpha-decay rate fluctuations by means of the all permutations method (APM) are presented. The APM analysis reveals a steady set of frequencies in the time series. The coincidence of this frequency set with the Earth's natural oscillations is demonstrated. A short review of works by different authors who analyzed time series of fluctuations in processes of different natures is given. We show that the periods observed in those works correspond to the periods revealed in our study, which points to a common mechanism underlying the observed phenomenon. PMID:26016038

  13. Quadrantal multi-scale distribution entropy analysis of heartbeat interval series based on a modified Poincaré plot

    NASA Astrophysics Data System (ADS)

    Huo, Chengyu; Huang, Xiaolin; Zhuang, Jianjun; Hou, Fengzhen; Ni, Huangjing; Ning, Xinbao

    2013-09-01

    The Poincaré plot is one of the most important approaches in human cardiac rhythm analysis. However, further investigation is still needed into techniques that can characterize the dispersion of the points displayed in a Poincaré plot. Based on a modified Poincaré plot, we propose a novel measure named distribution entropy (DE) and a quadrantal multi-scale distribution entropy analysis (QMDE) for the quantitative description of scatter distribution patterns in various regions and at various temporal scales. We apply this method to heartbeat interval series derived from healthy subjects and congestive heart failure (CHF) sufferers, respectively, and find that the discrimination between them is most significant in the first quadrant, which implies that CHF significantly impacts vagal regulation. We also investigate the day-night differences of young healthy people, and the results present a clear circadian rhythm, especially in the first quadrant. In addition, the multi-scale analysis indicates that the results of healthy subjects and CHF sufferers fluctuate in different trends as the scale factor varies. The same phenomenon also appears in the circadian rhythm investigation of young healthy subjects, which implies that the cardiac dynamic system is affected differently at various temporal scales by physiological or pathological factors.
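    The underlying construction pairs each interval with its successor, (RR_n, RR_{n+1}). As a loose stand-in for the paper's distribution entropy, the sketch below splits the plot into four quadrants around the mean point and takes the Shannon entropy of the occupancy; this coarse four-bin partition is an assumption for illustration, not the authors' DE definition.

```python
import math

def quadrant_entropy(rr):
    """Shannon entropy of Poincare-plot points over the four quadrants
    around the mean point -- a simplified stand-in for the paper's
    distribution entropy, whose exact binning is not reproduced here."""
    pts = list(zip(rr[:-1], rr[1:]))          # (RR_n, RR_{n+1}) pairs
    mx = sum(p[0] for p in pts) / len(pts)
    my = sum(p[1] for p in pts) / len(pts)
    counts = [0, 0, 0, 0]
    for x, y in pts:
        counts[(0 if x >= mx else 1) + (0 if y >= my else 2)] += 1
    n = len(pts)
    return -sum(c / n * math.log(c / n) for c in counts if c)

# A strictly alternating series fills two quadrants equally -> entropy log(2);
# a constant series collapses onto one quadrant -> entropy 0.
alternating = [1.0, 2.0] * 8 + [1.0]
```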

  14. On the characterization of vegetation recovery after fire disturbance using Fisher-Shannon analysis and SPOT/VEGETATION Normalized Difference Vegetation Index (NDVI) time series

    NASA Astrophysics Data System (ADS)

    Lasaponara, Rosa; Lanorte, Antonio; Lovallo, Michele; Telesca, Luciano

    2015-04-01

    Time series can fruitfully support fire monitoring and management, from statistical analysis of fire occurrence (Tuia et al. 2008) to danger estimation (Lasaponara 2005), damage evaluation (Lanorte et al. 2014) and post-fire recovery (Lanorte et al. 2014). In this paper, the time dynamics of SPOT-VEGETATION Normalized Difference Vegetation Index (NDVI) time series are analyzed using the statistical approach of the Fisher-Shannon (FS) information plane to assess and monitor vegetation recovery after fire disturbance. Fisher-Shannon information plane analysis allows us to gain insight into the complex structure of a time series and to quantify its degree of organization and order. The analysis was carried out using 10-day Maximum Value Composites of NDVI (MVC-NDVI) with a 1 km × 1 km spatial resolution. The investigation was performed on two test sites located in Galicia (northern Spain) and the Peloponnese (southern Greece), selected for the vast fires that occurred during the summers of 2006 and 2007 and for their different vegetation covers, made up mainly of low shrubland at the Galicia test site and evergreen forest in the Peloponnese. Time series of MVC-NDVI were analyzed before and after the occurrence of the fire events. Results obtained for both investigated areas clearly pointed out that the dynamics of the pixel time series before the fire are characterized by a larger degree of disorder and uncertainty, while the pixel time series after the fire feature a higher degree of organization and order. This discrimination is more evident for the Peloponnese fire than for the Galicia fire, which suggests a clear possibility of discriminating the different post-fire behaviors and dynamics exhibited by the different vegetation covers. Reference: Lanorte A, R Lasaponara, M Lovallo, L Telesca 2014 Fisher-Shannon information plane analysis of SPOT/VEGETATION Normalized Difference Vegetation Index (NDVI) time series to

  15. Molecular analysis of HLA class I alleles in the Mexican Seri Indians: implications for their origin.

    PubMed

    Infante, E; Olivo, A; Alaez, C; Williams, F; Middleton, D; de la Rosa, G; Pujol, M J; Durán, C; Navarro, J L; Gorodezky, C

    1999-07-01

    The molecular analysis of HLA class I loci has demonstrated that, although the genetic profile is restricted in Amerindians, several micropolymorphisms may be important in conferring a biological advantage. We analyzed the HLA-A and -B genetic profile of the Seris, a Mexican Indian tribe living in northwestern Mexico in the state of Sonora and presently numbering only 619 individuals. Our study included 100 Seris belonging to nine families. HLA-A and -B locus typing was performed by polymerase chain reaction using the amplification refractory mutation system (PCR-ARMS) on a select group of samples; all samples were typed by polymerase chain reaction using sequence-specific oligonucleotide probes (PCR-SSOP) at a low-intermediate resolution level. The correlation between the techniques was 100%. Only five HLA-A alleles and seven HLA-B alleles were found. A*0201, A*68, A*31, A*24, B*3501, B*40, B*51, B*3512 and B*15 were present in over 5% of the individuals. B*27052 was detected in 2%; B27 is absent in all other Mexican Indian groups previously studied, and its presence here may be the result of a founder effect due to different waves of southward migrations. The B locus is more diverse, and the prevalent haplotypes were A*0201-B*3501, A*0201-B*40, A*0201-B*3512, A*31-B*51, A*68-B*3501 and A*68-B*40. This genetic profile differs from the pattern of other Mexicans. The phylogenetic tree suggests that the Seris are more closely related to the Warao Indians of Venezuela, who live in a similar ecosystem, and to some groups in Argentina than they are to the Mexican Lacandones, who live in the jungle. These data emphasize the relevance of the interaction between genes and environment.

  16. Novel measures based on the Kolmogorov complexity for use in complex system behavior studies and time series analysis

    NASA Astrophysics Data System (ADS)

    Mihailović, Dragutin T.; Mimić, Gordan; Nikolić-Djorić, Emilija; Arsenić, Ilija

    2015-01-01

    We propose novel metrics based on the Kolmogorov complexity for use in complex system behavior studies and time series analysis. We consider the origins of the Kolmogorov complexity and discuss its physical meaning. To get better insights into the nature of complex systems and time series analysis we introduce three novel measures based on the Kolmogorov complexity: (i) the Kolmogorov complexity spectrum, (ii) the Kolmogorov complexity spectrum highest value and (iii) the overall Kolmogorov complexity. The characteristics of these measures have been tested using a generalized logistic equation. Finally, the proposed measures have been applied to different time series originating from: a model output (the biochemical substance exchange in a multi-cell system), four different geophysical phenomena (dynamics of: river flow, long term precipitation, indoor 222Rn concentration and UV radiation dose) and the economy (stock price dynamics). The results obtained offer deeper insights into the complexity of system dynamics and time series analysis with the proposed complexity measures.
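    In practice the (uncomputable) Kolmogorov complexity is approximated by the Lempel-Ziv (1976) complexity of the series binarized at a threshold, which is the usual proxy in this line of work. The median threshold and the normalization below are common conventions assumed for illustration, not details taken from the paper.

```python
import math
import random

def lz76(s):
    """Count the phrases in the Lempel-Ziv (1976) parsing of string s."""
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # Grow the current phrase while it already occurs in the prefix.
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

def complexity(series):
    """Normalized LZ76 complexity of the series binarized at its median;
    tends toward ~1 for random binary strings and is lower for ordered ones."""
    med = sorted(series)[len(series) // 2]
    s = ''.join('1' if v >= med else '0' for v in series)
    n = len(s)
    return lz76(s) * math.log(n, 2) / n

random.seed(0)
periodic = [math.sin(0.2 * i) for i in range(512)]   # ordered signal
noisy = [random.random() for _ in range(512)]        # random signal
```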

  17. Physics-Based Correction of Inhomogeneities in Temperature Series: Model Transferability Testing and Comparison to Statistical Approaches

    NASA Astrophysics Data System (ADS)

    Auchmann, Renate; Brönnimann, Stefan; Croci-Maspoli, Mischa

    2016-04-01

    For the correction of inhomogeneities in sub-daily temperature series, Auchmann and Brönnimann (2012) developed a physics-based model for one specific type of break, the transition from a Wild screen to a Stevenson screen, at one specific station in Basel, Switzerland. The model is based solely on physical considerations; no relationships of the covariates to the differences between the parallel measurements were investigated. The physics-based model requires detailed information on the screen geometry and the location, and includes a variety of covariates; it mainly corrects the radiation error, modified by ambient wind. In this study we test the application of the model to another station, Zurich, which experienced the same type of transition, and we compare the performance of the physics-based correction to purely statistical correction approaches (a constant correction, and correction for the annual cycle using a spline). In Zurich the Wild screen was replaced by a Stevenson screen in 1954; from 1954 to 1960 parallel temperature measurements were taken in both screens, which we use to assess the performance of the applied corrections. For Zurich the required model input is available (thrice-daily observations of wind, cloud cover, pressure and humidity, and local times of sunset and sunrise). However, a large number of stations do not measure the additional input data required by the model, which hampers its transferability and applicability to other stations. Hence, we test possible simplifications and generalizations of the model to make it more easily applicable to stations with the same type of inhomogeneity. In a last step we test whether other types of transitions (e.g., from a Stevenson screen to an automated weather station) can be corrected using the principle of a physics-based approach.

  18. A new approach to detect congestive heart failure using Teager energy nonlinear scatter plot of R-R interval series.

    PubMed

    Kamath, Chandrakar

    2012-09-01

    A novel approach to distinguishing congestive heart failure (CHF) subjects from healthy subjects is proposed. Heart rate variability (HRV) is impaired in CHF subjects. In this work, hypothesizing that capturing the moment-to-moment nonlinear dynamics of HRV will reveal cardiac patterning, we construct the nonlinear scatter plot of the Teager energy of the R-R interval series. The key feature of the Teager energy is that it models the energy of the source that generated the signal rather than the energy of the signal itself. Hence, any deviations in the genesis of HRV, caused by complex interactions of hemodynamic, electrophysiological, and humoral variables, as well as by autonomic and central nervous regulation, become manifest in the Teager energy function. Comparison of the Teager energy scatter plot with the second-order difference plot (SODP) for normal and CHF subjects reveals significant differences, both qualitatively and quantitatively. We introduce the concept of curvilinearity for central tendency measures of the plots and define a radial distance index (RDI) that reveals the efficacy of the Teager energy scatter plot over the SODP in separating CHF subjects from healthy subjects. A k-nearest neighbor classifier with the RDI as its feature achieved an almost 100% classification rate.
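    The discrete Teager-Kaiser energy operator behind this plot is psi[n] = x[n]^2 - x[n-1]*x[n+1]. For a sampled cosine A*cos(w*n) it returns the constant A^2*sin^2(w), i.e. it tracks the energy of the generating source rather than of the signal, as the abstract notes. A minimal sketch (the amplitude and frequency are arbitrary demo values; the scatter plot itself would pair psi[n] with psi[n+1]):

```python
import math

def teager_energy(x):
    """Discrete Teager-Kaiser energy operator:
    psi[n] = x[n]**2 - x[n-1]*x[n+1], defined at interior samples."""
    return [x[n] ** 2 - x[n - 1] * x[n + 1] for n in range(1, len(x) - 1)]

# Sampled cosine: psi should equal A**2 * sin(w)**2 at every interior point.
A, w = 2.0, 0.3
x = [A * math.cos(w * n) for n in range(50)]
psi = teager_energy(x)
# Points of a Teager-energy scatter plot would be (psi[n], psi[n+1]).
```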

  19. The analysis of groundwater levels influenced by dual factors in western Jilin Province by using time series analysis method

    NASA Astrophysics Data System (ADS)

    Lu, Wen Xi; Zhao, Ying; Chu, Hai Bo; Yang, Lei Lei

    2014-09-01

    To enhance our understanding of the dynamic characteristics of groundwater levels in the western Jilin Province of China, two decomposition models from time series analysis, the additive model and the multiplicative model, are employed in this study. The data used in the models are the monthly groundwater levels of three wells observed from 1986 to 2011. Moreover, analysis of the three wells, located upstream, midstream and downstream along the groundwater flow path, reveals the variation in each well and allows mutual comparison among them. The final results indicate that the groundwater levels show a decreasing trend and that the period of variation lasts about 7 years. In addition, the hydrographs of the three wells show that the impact of human activities on groundwater levels has increased since 1995. Furthermore, compared with the autoregressive integrated moving average model, the decomposition method is recommended for the analysis and prediction of groundwater levels.
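    The classical additive decomposition of a monthly series estimates the trend with a centered 2x12-term moving average and the seasonal component as the phase means of the detrended series. The sketch below is the textbook procedure the abstract refers to, not the authors' exact implementation; the synthetic series is invented for the demo.

```python
import math

def additive_decompose(x, period):
    """Classical additive decomposition x = trend + seasonal + residual.
    Trend: centered moving average (2 x period-term MA for even periods).
    Seasonal: mean of the detrended series at each phase, centered to sum to 0."""
    n, half = len(x), period // 2
    trend = [None] * n
    for i in range(half, n - half):
        if period % 2 == 0:
            window = x[i - half:i + half + 1]          # period + 1 points
            trend[i] = ((window[0] + window[-1]) / 2 + sum(window[1:-1])) / period
        else:
            trend[i] = sum(x[i - half:i + half + 1]) / period
    by_phase = [[] for _ in range(period)]
    for i in range(n):
        if trend[i] is not None:
            by_phase[i % period].append(x[i] - trend[i])
    seasonal = [sum(g) / len(g) if g else 0.0 for g in by_phase]
    m = sum(seasonal) / period
    return trend, [s - m for s in seasonal]            # seasonal sums to zero

# Synthetic monthly series: linear trend plus a known period-12 seasonal.
true_seasonal = [math.sin(2 * math.pi * k / 12) for k in range(12)]
series = [0.5 * i + true_seasonal[i % 12] for i in range(120)]
trend, seasonal = additive_decompose(series, 12)
```

With a perfectly linear trend the centered moving average recovers both components exactly, which makes the sketch easy to check.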

  20. A FORTRAN program for the statistical analysis of incomplete time series data sets by a method of partition.

    PubMed

    Patel, M K; Waterhouse, J P

    1993-03-01

    A program written in FORTRAN-77 that performs an analysis for periodicity of a time series data set is presented. Time series analysis now has applicability in a wide range of biomedical studies. The analytical method, termed here a method of partition, is derived from periodogram analysis but uses the principle of analysis of variance (ANOVA). It is effective when used on incomplete data sets; this is demonstrated by an example in which a data set is made progressively more incomplete by the random removal of values. A listing of the program and a sample output are given, in both an abbreviated and a full version.
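    The idea of a partition-plus-ANOVA periodicity test can be sketched in Python: fold the (possibly gappy) series at a trial period, group observations into phase bins, and compute a one-way ANOVA F statistic. The FORTRAN original is not reproduced; the bin count, noise level, and gap pattern below are assumptions for the demo.

```python
import math
import random

def partition_f(times, values, period, bins=8):
    """Fold observations at a trial period, group them into phase bins,
    and return the one-way ANOVA F statistic (between/within variance ratio).
    Works on incomplete records because grouping only needs each sample's time."""
    groups = [[] for _ in range(bins)]
    for t, v in zip(times, values):
        groups[int((t % period) / period * bins)].append(v)
    groups = [g for g in groups if len(g) > 1]
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((v - sum(g) / len(g)) ** 2 for v in g) for g in groups)
    return (ss_between / (len(groups) - 1)) / (ss_within / (n - len(groups)))

random.seed(3)
# Period-20 signal sampled at integer times, with roughly 30% of points dropped.
times = [t for t in range(200) if random.random() > 0.3]
values = [math.sin(2 * math.pi * t / 20) + random.gauss(0, 0.1) for t in times]
f_true = partition_f(times, values, 20)   # fold at the true period
f_off = partition_f(times, values, 13)    # fold at a wrong period
```

A large F at the true period and a small F at an incommensurate one is the signature of periodicity surviving the missing data.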