Science.gov

Sample records for series analysis approach

  1. A multiscale approach to InSAR time series analysis

    NASA Astrophysics Data System (ADS)

    Simons, M.; Hetland, E. A.; Muse, P.; Lin, Y. N.; Dicaprio, C.; Rickerby, A.

    2008-12-01

    We describe a new technique to constrain time-dependent deformation from repeated satellite-based InSAR observations of a given region. This approach, which we call MInTS (Multiscale analysis of InSAR Time Series), relies on a spatial wavelet decomposition to permit the inclusion of distance-based spatial correlations in the observations while maintaining computational tractability. This approach also permits a consistent treatment of all data independent of the presence of localized holes in any given interferogram. In essence, MInTS allows one to consider all data at the same time (as opposed to one pixel at a time), thereby taking advantage of both spatial and temporal characteristics of the deformation field. In terms of the temporal representation, we have the flexibility to explicitly parametrize known processes that are expected to contribute to a given set of observations (e.g., co-seismic steps and post-seismic transients, secular variations, seasonal oscillations, etc.). Our approach also allows for the temporal parametrization to include a set of general functions (e.g., splines) in order to account for unexpected processes. We allow for various forms of model regularization using a cross-validation approach to select penalty parameters. The multiscale analysis allows us to consider various contributions (e.g., orbit errors) that may affect specific scales but not others. The methods described here are all embarrassingly parallel and suitable for implementation on a cluster computer. We demonstrate the use of MInTS using a large suite of ERS-1/2 and Envisat interferograms for Long Valley Caldera, and validate our results by comparing with ground-based observations.

  2. A Multiscale Approach to InSAR Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Hetland, E. A.; Muse, P.; Simons, M.; Lin, N.; Dicaprio, C. J.

    2010-12-01

    We present a technique to constrain time-dependent deformation from repeated satellite-based InSAR observations of a given region. This approach, which we call MInTS (Multiscale InSAR Time Series analysis), relies on a spatial wavelet decomposition to permit the inclusion of distance-based spatial correlations in the observations while maintaining computational tractability. As opposed to single-pixel InSAR time series techniques, MInTS takes advantage of both spatial and temporal characteristics of the deformation field. We use a weighting scheme which accounts for the presence of localized holes due to decorrelation or unwrapping errors in any given interferogram. We represent time-dependent deformation using a dictionary of general basis functions, capable of detecting both steady and transient processes. The estimation is regularized using model-resolution-based smoothing so as to capture rapid deformation where there are temporally dense radar acquisitions and to avoid oscillations during time periods devoid of acquisitions. MInTS also has the flexibility to explicitly parametrize known time-dependent processes that are expected to contribute to a given set of observations (e.g., co-seismic steps and post-seismic transients, secular variations, seasonal oscillations, etc.). We use cross-validation to choose the regularization penalty parameter in the inversion for the time-dependent deformation field. We demonstrate MInTS using a set of 63 ERS-1/2 and 29 Envisat interferograms for Long Valley Caldera.

  3. A Multiscale Approach to InSAR Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Simons, M.; Hetland, E. A.; Muse, P.; Lin, Y.; Dicaprio, C. J.

    2009-12-01

    We describe progress in the development of MInTS (Multiscale analysis of InSAR Time Series), an approach to constructing self-consistent time-dependent deformation observations from repeated satellite-based InSAR images of a given region. MInTS relies on a spatial wavelet decomposition to permit the inclusion of distance-based spatial correlations in the observations while maintaining computational tractability. In essence, MInTS allows one to consider all data at the same time (as opposed to one pixel at a time), thereby taking advantage of both spatial and temporal characteristics of the deformation field. This approach also permits a consistent treatment of all data independent of the presence of localized holes due to unwrapping issues in any given interferogram. Specifically, the presence of holes is accounted for through a weighting scheme that accounts for the extent of actual data versus the area of holes associated with any given wavelet. In terms of the temporal representation, we have the flexibility to explicitly parametrize known processes that are expected to contribute to a given set of observations (e.g., co-seismic steps and post-seismic transients, secular variations, seasonal oscillations, etc.). Our approach also allows for the temporal parametrization to include a set of general functions in order to account for unexpected processes. We allow for various forms of model regularization using a cross-validation approach to select penalty parameters. We also experiment with the use of sparsity-inducing regularization as a way to select from a large dictionary of time functions. The multiscale analysis allows us to consider various contributions (e.g., orbit errors) that may affect specific scales but not others. The methods described here are all embarrassingly parallel and suitable for implementation on a cluster computer.
    We demonstrate the use of MInTS using a large suite of ERS-1/2 and Envisat interferograms for Long Valley Caldera, and validate our results by comparing with ground-based observations.
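
    The wavelet machinery at the heart of MInTS can be illustrated with a minimal sketch: a single-level 2-D Haar decomposition of a toy interferogram, with a per-coefficient weight reflecting how much of each wavelet's support falls inside a decorrelated hole. This is an illustrative Python reconstruction, not the authors' code; the array size, hole location, and simple 2x2 Haar filters are assumptions for demonstration only.

```python
import numpy as np

def haar2d(img):
    """One level of a 2D Haar wavelet transform (approximation + 3 detail bands)."""
    a = (img[0::2, 0::2] + img[0::2, 1::2] + img[1::2, 0::2] + img[1::2, 1::2]) / 4.0
    h = (img[0::2, 0::2] + img[0::2, 1::2] - img[1::2, 0::2] - img[1::2, 1::2]) / 4.0
    v = (img[0::2, 0::2] - img[0::2, 1::2] + img[1::2, 0::2] - img[1::2, 1::2]) / 4.0
    d = (img[0::2, 0::2] - img[0::2, 1::2] - img[1::2, 0::2] + img[1::2, 1::2]) / 4.0
    return a, h, v, d

def coverage_weights(mask):
    """Fraction of valid (non-hole) pixels under each 2x2 Haar support."""
    m = mask.astype(float)
    return (m[0::2, 0::2] + m[0::2, 1::2] + m[1::2, 0::2] + m[1::2, 1::2]) / 4.0

# Toy interferogram with a decorrelated hole.
rng = np.random.default_rng(0)
igram = rng.normal(size=(8, 8))
mask = np.ones((8, 8), dtype=bool)
mask[2:4, 2:4] = False                   # hole: no usable phase here
igram = np.where(mask, igram, 0.0)

approx, _, _, _ = haar2d(igram)
w = coverage_weights(mask)               # down-weight coefficients over the hole
print(w.min(), w.max())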

  4. A Comparison of Alternative Approaches to the Analysis of Interrupted Time-Series.

    ERIC Educational Resources Information Center

    Harrop, John W.; Velicer, Wayne F.

    1985-01-01

    Computer-generated data representative of 16 Autoregressive Integrated Moving Average (ARIMA) models were used to compare the results of interrupted time-series analysis using: (1) the known model identification, (2) an assumed (1,0,0) model, and (3) an assumed (3,0,0) model as an approximation to the General Transformation approach. (Author/BW)
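
    The modeling issue at stake can be sketched by simulating an interrupted (1,0,0) series and recovering its parameters by ordinary least squares. This hypothetical Python example is not the General Transformation approach examined in the paper; the series length, AR coefficient, and intervention size are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate an interrupted ARIMA(1,0,0) series: AR(1) dynamics plus a level
# shift at the intervention point t0 (values chosen for illustration only).
n, t0, phi, shift = 200, 100, 0.5, 5.0
e = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + e[t]
y[t0:] += shift

# Fit y[t] = c + phi*y[t-1] + delta*step[t] by ordinary least squares.
# Under the AR(1) model, the long-run intervention effect is delta / (1 - phi).
step = (np.arange(n) >= t0).astype(float)
X = np.column_stack([np.ones(n - 1), y[:-1], step[1:]])
beta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
c_hat, phi_hat, delta_hat = beta
print(phi_hat, delta_hat / (1 - phi_hat))
```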

  5. Complex networks approach to geophysical time series analysis: Detecting paleoclimate transitions via recurrence networks

    NASA Astrophysics Data System (ADS)

    Donner, R. V.; Zou, Y.; Donges, J. F.; Marwan, N.; Kurths, J.

    2009-12-01

    We present a new approach for analysing structural properties of time series from complex systems. Starting from the concept of recurrences in phase space, the recurrence matrix of a time series is interpreted as the adjacency matrix of an associated complex network, which links different points in time if the evolution of the considered states is very similar. A critical comparison of these recurrence networks with similar existing techniques is presented, revealing strong conceptual benefits of the new approach, which can be considered a unifying framework for transforming time series into complex networks that also includes other methods as special cases. Based on different model systems, we demonstrate that there are fundamental interrelationships between the topological properties of recurrence networks and the statistical properties of the phase space density of the underlying dynamical system. Hence, the network description yields new quantitative characteristics of the dynamical complexity of a time series, which substantially complement existing measures of recurrence quantification analysis. Finally, we illustrate the potential of our approach for detecting hidden dynamical transitions from geoscientific time series by applying it to different paleoclimate records. In particular, we are able to resolve previously unknown climatic regime shifts in East Africa during roughly the last 4 million years, which might have had a considerable influence on the evolution of hominids in the area.
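
    The core construction, interpreting a recurrence matrix as the adjacency matrix of a network, can be sketched in a few lines. The example below uses a one-dimensional state (no delay embedding) and an arbitrary threshold, so it is only a schematic of the method, not the paper's full analysis.

```python
import numpy as np

def recurrence_network(x, eps):
    """Adjacency matrix of the recurrence network of a scalar time series:
    two time points are linked if their states lie within distance eps."""
    d = np.abs(x[:, None] - x[None, :])      # pairwise distances in (1-D) phase space
    A = (d <= eps).astype(int)
    np.fill_diagonal(A, 0)                   # no self-loops
    return A

x = np.sin(np.linspace(0, 4 * np.pi, 200))
A = recurrence_network(x, eps=0.1)
degree = A.sum(axis=1)                       # a first topological characteristic
print(A.mean())                              # link density of the network
```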

  6. Time series analysis of infrared satellite data for detecting thermal anomalies: a hybrid approach

    NASA Astrophysics Data System (ADS)

    Koeppen, W. C.; Pilger, E.; Wright, R.

    2011-07-01

    We developed and tested an automated algorithm that analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes. Our algorithm enhances the previously developed MODVOLC approach, a simple point operation, by adding a more complex time series component based on the methods of the Robust Satellite Techniques (RST) algorithm. Using test sites at Anatahan and Kīlauea volcanoes, the hybrid time series approach detected ~15% more thermal anomalies than MODVOLC with very few, if any, known false detections. We also tested gas flares in the Cantarell oil field in the Gulf of Mexico as an end-member scenario representing very persistent thermal anomalies. At Cantarell, the hybrid algorithm showed only a slight improvement, but it did identify flares that were undetected by MODVOLC. We estimate that at least 80 MODIS images for each calendar month are required to create good reference images necessary for the time series analysis of the hybrid algorithm. The improved performance of the new algorithm over MODVOLC will result in the detection of low temperature thermal anomalies that will be useful in improving our ability to document Earth's volcanic eruptions, as well as detecting low temperature thermal precursors to larger eruptions.
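
    A stripped-down version of such a reference-image test can be sketched as follows: build per-pixel mean and variability from a stack of images for one calendar month, then flag pixels whose normalized deviation exceeds a threshold. The scene size, threshold, and synthetic anomaly below are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(2)

# Stack of reference brightness-temperature images for one calendar month:
# 80 images of a 16x16-pixel scene (values in kelvin), per the rule of thumb
# that at least 80 images are needed for good reference statistics.
stack = 270.0 + rng.normal(scale=2.0, size=(80, 16, 16))

mu = stack.mean(axis=0)          # per-pixel reference mean
sigma = stack.std(axis=0)        # per-pixel reference variability

# New scene with a hot anomaly (e.g. an active vent) at pixel (5, 5).
scene = 270.0 + rng.normal(scale=2.0, size=(16, 16))
scene[5, 5] += 30.0

z = (scene - mu) / sigma         # RST-style normalized anomaly index
hot = np.argwhere(z > 5.0)       # flag pixels far outside reference behavior
print(hot)
```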

  7. Statistical Analysis of fMRI Time-Series: A Critical Review of the GLM Approach.

    PubMed

    Monti, Martin M

    2011-01-01

    Functional magnetic resonance imaging (fMRI) is one of the most widely used tools to study the neural underpinnings of human cognition. Standard analysis of fMRI data relies on a general linear model (GLM) approach to separate stimulus induced signals from noise. Crucially, this approach relies on a number of assumptions about the data which, for inferences to be valid, must be met. The current paper reviews the GLM approach to analysis of fMRI time-series, focusing in particular on the degree to which such data abides by the assumptions of the GLM framework, and on the methods that have been developed to correct for any violation of those assumptions. Rather than biasing estimates of effect size, the major consequence of non-conformity to the assumptions is to introduce bias into estimates of the variance, thus affecting test statistics, power, and false positive rates. Furthermore, this bias can have pervasive effects on both individual subject and group-level statistics, potentially yielding qualitatively different results across replications, especially after the thresholding procedures commonly used for inference-making. PMID:21442013

  8. 3D time series analysis of cell shape using Laplacian approaches

    PubMed Central

    2013-01-01

    Background: Fundamental cellular processes such as cell movement, division or food uptake critically depend on cells being able to change shape. Fast acquisition of three-dimensional image time series has now become possible, but we lack efficient tools for analysing shape deformations in order to understand the real three-dimensional nature of shape changes. Results: We present a framework for 3D+time cell shape analysis. The main contribution is three-fold: First, we develop a fast, automatic random walker method for cell segmentation. Second, a novel topology fixing method is proposed to fix segmented binary volumes without spherical topology. Third, we show that algorithms used for each individual step of the analysis pipeline (cell segmentation, topology fixing, spherical parameterization, and shape representation) are closely related to the Laplacian operator. The framework is applied to the shape analysis of neutrophil cells. Conclusions: The method we propose for cell segmentation is faster than the traditional random walker method or the level set method, and performs better on 3D time-series of neutrophil cells, which are comparatively noisy as stacks have to be acquired fast enough to account for cell motion. Our method for topology fixing outperforms the tools provided by SPHARM-MAT and SPHARM-PDM in terms of their successful fixing rates. The different tasks in the presented pipeline for 3D+time shape analysis of cells can be solved using Laplacian approaches, opening the possibility of eventually combining individual steps in order to speed up computations. PMID:24090312

  9. Detection of chaos: New approach to atmospheric pollen time-series analysis

    NASA Astrophysics Data System (ADS)

    Bianchi, M. M.; Arizmendi, C. M.; Sanchez, J. R.

    1992-09-01

    Pollen and spores are biological particles that are ubiquitous in the atmosphere and are pathologically significant, causing plant diseases and inhalant allergies. One of the main objectives of aerobiological surveys is forecasting. Prediction models are required in order to apply aerobiological knowledge to medical or agricultural practice; a necessary condition for such models is that the underlying dynamics not be chaotic. The existence of chaos is detected through the analysis of a time series. The time series comprises hourly counts of atmospheric pollen grains obtained using a Burkard spore trap from 1987 to 1989 at Mar del Plata. Abraham's method to obtain the correlation dimension was applied. A low and fractal dimension indicates chaotic dynamics. The predictability of models for atmospheric pollen forecasting is discussed.
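
    The correlation-dimension estimate underlying such an analysis can be sketched with the Grassberger-Procaccia correlation sum on a delay embedding; the slope of log C(r) against log r approximates the attractor dimension. The embedding parameters and radii below are illustrative assumptions, and white noise stands in for the pollen counts.

```python
import numpy as np

def correlation_sum(x, r, m=3, tau=1):
    """Grassberger-Procaccia correlation sum C(r) for a delay embedding
    of dimension m and lag tau."""
    n = len(x) - (m - 1) * tau
    emb = np.column_stack([x[i * tau: i * tau + n] for i in range(m)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    iu = np.triu_indices(n, k=1)             # each pair counted once
    return (d[iu] <= r).mean()

rng = np.random.default_rng(3)
x = rng.normal(size=500)                     # stand-in for an hourly count series
radii = [0.5, 1.0, 2.0]
C = [correlation_sum(x, r) for r in radii]

# The slope of log C(r) versus log r estimates the correlation dimension;
# for noise it grows with the embedding dimension instead of saturating.
slope = (np.log(C[2]) - np.log(C[0])) / (np.log(radii[2]) - np.log(radii[0]))
print(C, slope)
```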

  10. A hybrid wavelet analysis-cloud model data-extending approach for meteorologic and hydrologic time series

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Ding, Hao; Singh, Vijay P.; Shang, Xiaosan; Liu, Dengfeng; Wang, Yuankun; Zeng, Xiankui; Wu, Jichun; Wang, Lachun; Zou, Xinqing

    2015-05-01

    For scientific and sustainable management of water resources, hydrologic and meteorologic data series need to be often extended. This paper proposes a hybrid approach, named WA-CM (wavelet analysis-cloud model), for data series extension. Wavelet analysis has time-frequency localization features, known as "mathematics microscope," that can decompose and reconstruct hydrologic and meteorologic series by wavelet transform. The cloud model is a mathematical representation of fuzziness and randomness and has strong robustness for uncertain data. The WA-CM approach first employs the wavelet transform to decompose the measured nonstationary series and then uses the cloud model to develop an extension model for each decomposition layer series. The final extension is obtained by summing the results of extension of each layer. Two kinds of meteorologic and hydrologic data sets with different characteristics and different influence of human activity from six (three pairs) representative stations are used to illustrate the WA-CM approach. The approach is also compared with four other methods, which are conventional correlation extension method, Kendall-Theil robust line method, artificial neural network method (back propagation, multilayer perceptron, and radial basis function), and single cloud model method. To evaluate the model performance completely and thoroughly, five measures are used, which are relative error, mean relative error, standard deviation of relative error, root mean square error, and Thiel inequality coefficient. Results show that the WA-CM approach is effective, feasible, and accurate and is found to be better than other four methods compared. The theory employed and the approach developed here can be applied to extension of data in other areas as well.

  11. A Time Series Approach to Random Number Generation: Using Recurrence Quantification Analysis to Capture Executive Behavior

    PubMed Central

    Oomens, Wouter; Maes, Joseph H. R.; Hasselman, Fred; Egger, Jos I. M.

    2015-01-01

    The concept of executive functions plays a prominent role in contemporary experimental and clinical studies on cognition. One paradigm used in this framework is the random number generation (RNG) task, the execution of which demands aspects of executive functioning, specifically inhibition and working memory. Data from the RNG task are best seen as a series of successive events. However, traditional RNG measures that are used to quantify executive functioning are mostly summary statistics referring to deviations from mathematical randomness. In the current study, we explore the utility of recurrence quantification analysis (RQA), a non-linear method that keeps the entire sequence intact, as a better way to describe executive functioning compared to traditional measures. To this aim, 242 first- and second-year students completed a non-paced RNG task. Principal component analysis of their data showed that traditional and RQA measures convey more or less the same information. However, RQA measures do so more parsimoniously and have a better interpretation. PMID:26097449

  12. A Time Series Approach to Random Number Generation: Using Recurrence Quantification Analysis to Capture Executive Behavior.

    PubMed

    Oomens, Wouter; Maes, Joseph H R; Hasselman, Fred; Egger, Jos I M

    2015-01-01

    The concept of executive functions plays a prominent role in contemporary experimental and clinical studies on cognition. One paradigm used in this framework is the random number generation (RNG) task, the execution of which demands aspects of executive functioning, specifically inhibition and working memory. Data from the RNG task are best seen as a series of successive events. However, traditional RNG measures that are used to quantify executive functioning are mostly summary statistics referring to deviations from mathematical randomness. In the current study, we explore the utility of recurrence quantification analysis (RQA), a non-linear method that keeps the entire sequence intact, as a better way to describe executive functioning compared to traditional measures. To this aim, 242 first- and second-year students completed a non-paced RNG task. Principal component analysis of their data showed that traditional and RQA measures convey more or less the same information. However, RQA measures do so more parsimoniously and have a better interpretation. PMID:26097449
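
    Two of the standard RQA measures, recurrence rate (RR) and determinism (DET), can be sketched directly from a thresholded recurrence matrix; DET is the share of recurrent points lying on diagonal line structures, which is what separates structured from random sequences. The threshold and toy signals below are illustrative assumptions, not the paper's RNG data.

```python
import numpy as np

def rqa_measures(x, eps, lmin=2):
    """Recurrence rate (RR) and determinism (DET) of a scalar series."""
    R = (np.abs(x[:, None] - x[None, :]) <= eps).astype(int)
    np.fill_diagonal(R, 0)                    # exclude the line of identity
    n = len(x)
    rr = R.sum() / (n * (n - 1))
    # DET: fraction of recurrent points on diagonal lines of length >= lmin.
    diag_points = 0
    for k in range(1, n):                     # scan the upper diagonals
        line = 0
        for v in np.append(np.diagonal(R, k), 0):
            if v:
                line += 1
            else:
                if line >= lmin:
                    diag_points += line
                line = 0
    det = 2 * diag_points / R.sum() if R.sum() else 0.0
    return rr, det

rng = np.random.default_rng(4)
periodic = np.sin(np.linspace(0, 20 * np.pi, 300))   # highly deterministic
noise = rng.normal(size=300)                          # unstructured
rr_p, det_p = rqa_measures(periodic, 0.1)
rr_n, det_n = rqa_measures(noise, 0.1)
print(det_p, det_n)
```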

  13. Time series analysis to monitor and assess water resources: a moving average approach.

    PubMed

    Reghunath, Rajesh; Murthy, T R Sreedhara; Raghavan, B R

    2005-10-01

    An understanding of the behavior of the groundwater body and its long-term trends are essential for making any management decision in a given watershed. Geostatistical methods can effectively be used to derive the long-term trends of the groundwater body. Here an attempt has been made to find out the long-term trends of the water table fluctuations of a river basin through a time series approach. The method was found to be useful for demarcating the zones of discharge and of recharge of an aquifer. The recharge of the aquifer is attributed to the return flow from applied irrigation. In the study area, farmers mainly depend on borewells for water and water is pumped from the deep aquifer indiscriminately. The recharge of the shallow aquifer implies excessive pumping of the deep aquifer. Necessary steps have to be taken immediately at appropriate levels to control the irrational pumping of deep aquifer groundwater, which is needed as a future water source. The study emphasizes the use of geostatistics for the better management of water resources and sustainable development of the area. PMID:16240189
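
    The basic smoothing step of such a moving-average trend analysis can be sketched as follows: a centered 12-month window cancels an annual cycle exactly and leaves the long-term trend. The synthetic water-table series below is an assumption for illustration, not data from the study basin.

```python
import numpy as np

def moving_average(x, window):
    """Centered moving average as a simple long-term-trend estimator."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="valid")

# Toy monthly water-table depth: a seasonal cycle around a declining trend.
t = np.arange(120)
depth = 10.0 + 0.02 * t + 1.5 * np.sin(2 * np.pi * t / 12)

trend = moving_average(depth, window=12)   # 12-month window removes seasonality
# The smoothed series should coincide with the underlying linear trend
# (each window average equals the trend value at the window midpoint t + 5.5).
residual = trend - (10.0 + 0.02 * (t[:len(trend)] + 5.5))
print(np.abs(residual).max())
```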

  14. Detecting network modules in fMRI time series: a weighted network analysis approach.

    PubMed

    Mumford, Jeanette A; Horvath, Steve; Oldham, Michael C; Langfelder, Peter; Geschwind, Daniel H; Poldrack, Russell A

    2010-10-01

    Many network analyses of fMRI data begin by defining a set of regions, extracting the mean signal from each region and then analyzing the correlations between regions. One essential question that has not been addressed in the literature is how to best define the network neighborhoods over which a signal is combined for network analyses. Here we present a novel unsupervised method for the identification of tightly interconnected voxels, or modules, from fMRI data. This approach, weighted voxel coactivation network analysis (WVCNA), is based on a method that was originally developed to find modules of genes in gene networks. This approach differs from many of the standard network approaches in fMRI in that connections between voxels are described by a continuous measure, whereas typically voxels are considered to be either connected or not connected depending on whether the correlation between the two voxels survives a hard threshold value. Additionally, instead of simply using pairwise correlations to describe the connection between two voxels, WVCNA relies on a measure of topological overlap, which not only compares how correlated two voxels are but also the degree to which the pair of voxels is highly correlated with the same other voxels. We demonstrate the use of WVCNA to parcellate the brain into a set of modules that are reliably detected across data within the same subject and across subjects. In addition we compare WVCNA to ICA and show that the WVCNA modules have some of the same structure as the ICA components, but tend to be more spatially focused. We also demonstrate the use of some of the WVCNA network metrics for assessing a voxel's membership to a module and also how that voxel relates to other modules. Last, we illustrate how WVCNA modules can be used in a network analysis to find connections between regions of the brain and show that it produces reasonable results. PMID:20553896

  15. Detecting network modules in fMRI time series: A weighted network analysis approach

    PubMed Central

    Mumford, Jeanette A; Horvath, Steve; Oldham, Michael C.; Langfelder, Peter; Geschwind, Daniel H.; Poldrack, Russell A

    2010-01-01

    Many network analyses of fMRI data begin by defining a set of regions, extracting the mean signal from each region and then analyzing the correlations between regions. One essential question that has not been addressed in the literature is how to best define the network neighborhoods over which a signal is combined for network analyses. Here we present a novel unsupervised method for the identification of tightly interconnected voxels, or modules, from fMRI data. This approach, weighted voxel coactivation network analysis (WVCNA) is based on a method that was originally developed to find modules of genes in gene networks. This approach differs from many of the standard network approaches in fMRI in that connections between voxels are described by a continuous measure, whereas typically voxels are considered to be either connected or not connected depending on whether the correlation between the two voxels survives a hard threshold value. Additionally, instead of simply using pairwise correlations to describe the connection between two voxels, WVCNA relies on a measure of topological overlap, which not only compares how correlated two voxels are, but also the degree to which the pair of voxels is highly correlated with the same other voxels. We demonstrate the use of WVCNA to parcellate the brain into a set of modules that are reliably detected across data within the same subject and across subjects. In addition we compare WVCNA to ICA and show that the WVCNA modules have some of the same structure as the ICA components, but tend to be more spatially focused. We also demonstrate the use of some of the WVCNA network metrics for assessing a voxel’s membership to a module and also how that voxel relates to other modules. Last, we illustrate how WVCNA modules can be used in a network analysis to find connections between regions of the brain and show that it produces reasonable results. PMID:20553896
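
    The topological-overlap idea can be sketched with the standard weighted-network recipe: soft-threshold a correlation matrix into continuous adjacencies, then score each voxel pair by the strength of its shared neighborhood. This is a generic topological overlap computation on a toy block-structured correlation matrix, not the WVCNA implementation; the block sizes, correlations, and soft-threshold power are assumptions.

```python
import numpy as np

def topological_overlap(corr, beta=2):
    """Topological overlap matrix from a correlation matrix: soft-threshold
    correlations into continuous adjacencies, then compare each pair's
    shared neighborhood in addition to their direct connection."""
    A = np.abs(corr) ** beta                 # soft thresholding keeps continuity
    np.fill_diagonal(A, 0.0)
    k = A.sum(axis=1)                        # weighted connectivity of each voxel
    shared = A @ A                           # strength of shared neighbors
    T = (shared + A) / (np.minimum(k[:, None], k[None, :]) + 1 - A)
    np.fill_diagonal(T, 1.0)
    return T

# Two blocks of tightly correlated "voxels" with weak cross-block correlation.
n = 6
corr = np.full((n, n), 0.1)
corr[:3, :3] = 0.8
corr[3:, 3:] = 0.8
np.fill_diagonal(corr, 1.0)

T = topological_overlap(corr)
print(T[0, 1], T[0, 4])   # within-module overlap exceeds cross-module overlap
```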

  16. The physiology analysis system: an integrated approach for warehousing, management and analysis of time-series physiology data.

    PubMed

    McKenna, Thomas M; Bawa, Gagandeep; Kumar, Kamal; Reifman, Jaques

    2007-04-01

    The physiology analysis system (PAS) was developed as a resource to support the efficient warehousing, management, and analysis of physiology data, particularly, continuous time-series data that may be extensive, of variable quality, and distributed across many files. The PAS incorporates time-series data collected by many types of data-acquisition devices, and it is designed to free users from data management burdens. This Web-based system allows both discrete (attribute) and time-series (ordered) data to be manipulated, visualized, and analyzed via a client's Web browser. All processes occur on a server, so that the client does not have to download data or any application programs, and the PAS is independent of the client's computer operating system. The PAS contains a library of functions, written in different computer languages that the client can add to and use to perform specific data operations. Functions from the library are sequentially inserted into a function chain-based logical structure to construct sophisticated data operators from simple function building blocks, affording ad hoc query and analysis of time-series data. These features support advanced mining of physiology data. PMID:17287043
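
    The function-chain idea described above can be sketched as simple composition: elementary operators are placed in sequence to build a more sophisticated ad hoc query over a time series. The operators and query below are hypothetical examples, not functions from the PAS library.

```python
from functools import reduce

# Elementary building-block operators over a time series (a list of floats).
def detrend(xs):
    mean = sum(xs) / len(xs)
    return [v - mean for v in xs]

def absolute(xs):
    return [abs(v) for v in xs]

def running_max(xs):
    out, m = [], float("-inf")
    for v in xs:
        m = max(m, v)
        out.append(m)
    return out

def chain(*funcs):
    """Compose operators into a function chain applied left to right."""
    return lambda xs: reduce(lambda acc, f: f(acc), funcs, xs)

# Ad hoc query: largest absolute excursion from the mean, tracked over time.
query = chain(detrend, absolute, running_max)
print(query([3.0, 5.0, 4.0, 9.0, 4.0]))
```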

  17. A genetic programming approach for time-series analysis and prediction in space physics.

    NASA Astrophysics Data System (ADS)

    Jorgensen, A. M.; Brumby, S. P.; Henderson, M. G.

    2004-12-01

    A central theme in space weather prediction is the ability to predict time series of relevant quantities, both empirically and from physics-based models. Empirical models are often based on educated guesses, or intuition. The task of finding an empirical relationship relating quantities can be tedious and time-consuming, especially when a large number of parameters are involved. Genetic Programming (GP) provides a method for automating the guesswork, and can in some instances automatically find functional relationships between data streams. GP is an evolutionary computation technique which is an extension of the Genetic Algorithm framework used for function optimization. In GP an evolutionary algorithm combines elementary function operators in an attempt to build a function which is able to reproduce a training example from a set of input data. We will illustrate how a GP algorithm can be used in space physics by addressing two relevant topics: the prediction of relativistic electron fluxes, and the prediction of Dst.

  18. Novel approaches in Extended Principal Components Analysis to compare spatio-temporal patterns among multiple image time series

    NASA Astrophysics Data System (ADS)

    Neeti, N.; Eastman, R.

    2012-12-01

    Extended Principal Components Analysis (EPCA) aims to examine the patterns of variability shared among multiple image time series. Conventionally, this is done by virtually extending the spatial dimension of the time series by spatially concatenating the different time series and then performing S-mode PCA. In S-mode analysis, samples in space are the statistical variables and samples in time are the statistical observations. This paper introduces the concept of temporal concatenation of multiple image time series to perform EPCA. EPCA can also be done with T-mode orientation, in which samples in time are the statistical variables and samples in space are the statistical observations. This leads to a total of four orientations in which EPCA can be carried out. This research explores these four orientations and their implications in investigating spatio-temporal relationships among multiple time series. This research demonstrates that EPCA carried out with temporal concatenation of the multiple time series with T-mode (tT) is able to identify similar spatial patterns among multiple time series. The conventional S-mode EPCA with spatial concatenation (sS) identifies similar temporal patterns among multiple time series. The other two modes, namely T-mode with spatial concatenation (sT) and S-mode with temporal concatenation (tS), are able to identify patterns which share consistent temporal phase relationships and consistent spatial phase relationships with each other, respectively. In a case study using three sets of precipitation time series data from GPCP, CMAP and NCEP-DOE, the results show that examination of all four modes provides an effective basis for comparison of the series.
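
    The conventional sS orientation described above can be sketched with an SVD: spatially concatenate two series so that pixels are the variables and time samples the observations, and the leading component recovers a temporal pattern the series share. The toy series, sizes, and noise level below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(6)

# Two toy "image time series": 40 time samples over 25 pixels each,
# sharing one common temporal mode with different spatial loadings.
t = np.linspace(0, 4 * np.pi, 40)
common = np.sin(t)
series1 = np.outer(common, rng.normal(size=25)) + 0.1 * rng.normal(size=(40, 25))
series2 = np.outer(common, rng.normal(size=25)) + 0.1 * rng.normal(size=(40, 25))

# S-mode EPCA with spatial concatenation (sS): pixels of both series are the
# variables, time samples the observations; shared temporal patterns emerge.
X = np.hstack([series1, series2])            # 40 x 50 concatenated data matrix
X = X - X.mean(axis=0)                       # center each pixel (variable)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
leading_temporal_mode = U[:, 0] * s[0]

# The leading component should track the common temporal signal closely.
r = np.corrcoef(leading_temporal_mode, common)[0, 1]
print(abs(r))
```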

  19. Model dissection from earthquake time series: A comparative analysis using modern non-linear forecasting and artificial neural network approaches

    NASA Astrophysics Data System (ADS)

    Sri Lakshmi, S.; Tiwari, R. K.

    2009-02-01

    This study utilizes two non-linear approaches to characterize the model behavior of earthquake dynamics in the crucial tectonic regions of Northeast India (NEI). In particular, we have applied (i) a non-linear forecasting technique to assess the dimensionality of the earthquake-generating mechanism using the monthly-frequency earthquake time series (magnitude ⩾4) obtained from NOAA and USGS catalogues for the period 1960-2003 and (ii) artificial neural network (ANN) methods based on the back-propagation algorithm (BPA) to construct a neural network model of the same data set for comparing the two. We have constructed a multilayered feed-forward ANN model with an optimum input-set configuration specially designed to take advantage more completely of the intrinsic relationships among the input and retrieved variables and to arrive at a feasible model for earthquake prediction. The comparative analyses show that the results obtained by the two methods are stable and in good agreement and signify that the optimal embedding dimension obtained from the non-linear forecasting analysis compares well with the optimal number of inputs used for the neural networks. The constructed model suggests that the earthquake dynamics in the NEI region can be characterized by a high-dimensional chaotic plane. Evidence of high-dimensional chaos appears to be associated with "stochastic seasonal" bias in these regions and would provide some useful constraints for testing the model and criteria to assess earthquake hazards on a more rigorous and quantitative basis.

  20. Cabinetmaker. Occupational Analysis Series.

    ERIC Educational Resources Information Center

    Chinien, Chris; Boutin, France

    This document contains the analysis of the occupation of cabinetmaker, or joiner, that is accepted by the Canadian Council of Directors as the national standard for the occupation. The front matter preceding the analysis includes exploration of the development of the analysis, structure of the analysis, validation method, scope of the cabinetmaker…

  1. Permutations and time series analysis.

    PubMed

    Cánovas, Jose S; Guillamón, Antonio

    2009-12-01

    The main aim of this paper is to show how the use of permutations can be useful in the study of time series analysis. In particular, we introduce a test for checking the independence of a time series which is based on the number of admissible permutations on it. The main improvement in our tests is that we are able to give a theoretical distribution for independent time series. PMID:20059199
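
    The admissible-permutation idea can be sketched by counting ordinal patterns: an i.i.d. series realizes all m! patterns of length m with equal probability, while a dependent series forbids some of them. The pattern length and toy series below are illustrative; the paper's actual test statistic and its theoretical distribution are not reproduced here.

```python
import numpy as np
from itertools import permutations

def ordinal_pattern_counts(x, m=3):
    """Count each length-m ordinal (permutation) pattern occurring in x."""
    counts = {p: 0 for p in permutations(range(m))}
    for i in range(len(x) - m + 1):
        counts[tuple(np.argsort(x[i:i + m]))] += 1
    return counts

rng = np.random.default_rng(7)
iid = rng.normal(size=3000)                          # independent series
trend = np.arange(3000) + rng.normal(scale=0.1, size=3000)  # strongly dependent

c_iid = ordinal_pattern_counts(iid)
c_trend = ordinal_pattern_counts(trend)

# For an i.i.d. series all 3! = 6 patterns are admissible (each with
# probability 1/6); a strong trend admits essentially only one pattern.
n_admissible_iid = sum(v > 0 for v in c_iid.values())
n_admissible_trend = sum(v > 0 for v in c_trend.values())
print(n_admissible_iid, n_admissible_trend)
```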

  2. Language time series analysis

    NASA Astrophysics Data System (ADS)

    Kosmidis, Kosmas; Kalampokis, Alkiviadis; Argyrakis, Panos

    2006-10-01

    We use the detrended fluctuation analysis (DFA) and the Grassberger-Procaccia analysis (GP) methods in order to study language characteristics. Although we construct our signals using only word lengths or word frequencies, thereby excluding a huge amount of linguistic information, the application of GP analysis indicates that linguistic signals may be considered the manifestation of a complex system of high dimensionality, different from random signals or systems of low dimensionality such as the Earth climate. The DFA method is additionally able to distinguish a natural language signal from a computer code signal. This last result may be useful in the field of cryptography.
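
    A minimal DFA sketch: integrate the mean-removed series, remove a linear fit within windows of each scale, and read the scaling exponent alpha from the log-log slope of the fluctuation function (white noise gives alpha near 0.5). The window sizes and series length below are arbitrary choices, and white noise stands in for a word-length signal.

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis: fluctuation F(s) of the integrated
    series after removing a linear trend in windows of size s."""
    y = np.cumsum(x - np.mean(x))            # profile (integrated series)
    F = []
    for s in scales:
        n_win = len(y) // s
        sq = []
        for w in range(n_win):
            seg = y[w * s:(w + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)     # local linear trend
            sq.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(sq)))
    return np.array(F)

rng = np.random.default_rng(8)
x = rng.normal(size=4000)                    # white noise: expect alpha ~ 0.5
scales = np.array([8, 16, 32, 64, 128])
F = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(alpha)
```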

  3. FROG: Time-series analysis

    NASA Astrophysics Data System (ADS)

    Allan, Alasdair

    2014-06-01

    FROG performs time series analysis and display. It provides a simple user interface for astronomers wanting to do time-domain astrophysics but still offers the powerful features found in packages such as PERIOD (ascl:1406.005). FROG includes a number of tools for manipulation of time series. Among other things, the user can combine individual time series, detrend series (multiple methods) and perform basic arithmetic functions. The data can also be exported directly into the TOPCAT (ascl:1101.010) application for further manipulation if needed.

  4. Towards a classification approach using meta-biclustering: impact of discretization in the analysis of expression time series.

    PubMed

    Carreiro, André V; Ferreira, Artur J; Figueiredo, Mário A T; Madeira, Sara C

    2012-01-01

    Biclustering has been recognized as a remarkably effective method for discovering local temporal expression patterns and unraveling potential regulatory mechanisms, essential to understanding complex biomedical processes, such as disease progression and drug response. In this work, we propose a classification approach based on meta-biclusters (a set of similar biclusters) applied to prognostic prediction. We use real clinical expression time series to predict the response of patients with multiple sclerosis to treatment with Interferon-β. As compared to previous approaches, the main advantages of this strategy are the interpretability of the results and the reduction of data dimensionality, due to biclustering. This would allow the identification of the genes and time points which are most promising for explaining different types of response profiles, according to clinical knowledge. We assess the impact of different unsupervised and supervised discretization techniques on the classification accuracy. The experimental results show that, in many cases, the use of these discretization methods improves the classification accuracy, as compared to the use of the original features. PMID:22829578

  5. A phase-synchronization and random-matrix based approach to multichannel time-series analysis with application to epilepsy

    NASA Astrophysics Data System (ADS)

    Osorio, Ivan; Lai, Ying-Cheng

    2011-09-01

    We present a general method to analyze multichannel time series that are becoming increasingly common in many areas of science and engineering. Of particular interest is the degree of synchrony among various channels, motivated by the recognition that characterization of synchrony in a system consisting of many interacting components can provide insights into its fundamental dynamics. Often such a system is complex, high-dimensional, nonlinear, nonstationary, and noisy, rendering unlikely complete synchronization in which the dynamical variables from individual components approach each other asymptotically. Nonetheless, a weaker type of synchrony that lasts for a finite amount of time, namely, phase synchronization, can be expected. Our idea is to calculate the average phase-synchronization times from all available pairs of channels and then to construct a matrix. Due to nonlinearity and stochasticity, the matrix is effectively random. Moreover, since the diagonal elements of the matrix can be arbitrarily large, the matrix can be singular. To overcome this difficulty, we develop a random-matrix based criterion for proper choosing of the diagonal matrix elements. Monitoring of the eigenvalues and the determinant provides a powerful way to assess changes in synchrony. The method is tested using a prototype nonstationary noisy dynamical system, electroencephalogram (scalp) data from absence seizures for which enhanced cortico-thalamic synchrony is presumed, and electrocorticogram (intracranial) data from subjects having partial seizures with secondary generalization for which enhanced local synchrony is similarly presumed.
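
    The notion of a phase-synchronization time can be illustrated with a toy sketch (not the authors' estimator): given two phase signals, measure the lengths of consecutive runs during which the wrapped phase difference stays inside a tolerance. Averaging such run lengths over all channel pairs yields the matrix entries described above.

```python
import math

def sync_run_lengths(phi1, phi2, eps=0.5):
    """Lengths of runs where the wrapped phase difference stays below eps (radians)."""
    runs, run = [], 0
    for a, b in zip(phi1, phi2):
        d = abs((a - b + math.pi) % (2 * math.pi) - math.pi)  # wrap to [-pi, pi]
        if d < eps:
            run += 1
        else:
            if run:
                runs.append(run)
            run = 0
    if run:
        runs.append(run)
    return runs

n = 1000
base = [0.01 * t for t in range(n)]
locked = [0.01 * t + 0.1 for t in range(n)]   # same frequency, small constant offset
detuned = [0.03 * t for t in range(n)]        # different frequency: short-lived locking
```

The locked pair stays synchronized for the whole record, while the detuned pair only produces brief locking episodes each time the phases drift past each other.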

  6. Predicting road accidents: Structural time series approach

    NASA Astrophysics Data System (ADS)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-07-01

    In this paper, a model for the occurrence of road accidents in Malaysia between 1970 and 2010 was developed, and with this model the number of road accidents was predicted using the structural time series approach. The models are developed using a stepwise method, and the residuals of each step are analyzed. The accuracy of the model is assessed using the mean absolute percentage error (MAPE), and the best model is chosen based on the smallest Akaike information criterion (AIC) value. The structural time series approach found the local linear trend model to be the best representation of road accidents; this model allows the level and slope components to vary over time. In addition, the approach also provides useful information for improving on conventional time series methods.
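
    The simplest structural time series building block is the local level model, y_t = mu_t + eps_t with mu_t = mu_{t-1} + eta_t, filtered with the standard Kalman recursion; the local linear trend model chosen in the paper adds a slope state. The sketch below is illustrative, with assumed variance values rather than the paper's fitted parameters.

```python
import random

def local_level_filter(y, var_eps, var_eta):
    """Kalman filter for the local level model: y_t = mu_t + eps, mu_t = mu_{t-1} + eta."""
    mu, P = y[0], var_eps          # initialize at the first observation
    filtered = []
    for obs in y:
        P += var_eta               # predict: the level is a random walk
        K = P / (P + var_eps)      # Kalman gain
        mu += K * (obs - mu)       # update with the new observation
        P *= (1 - K)
        filtered.append(mu)
    return filtered

random.seed(3)
y = [5.0 + random.gauss(0, 1.0) for _ in range(500)]   # noisy constant level
est = local_level_filter(y, var_eps=1.0, var_eta=0.01)  # tracks the level near 5
```

In practice the variances are estimated by maximum likelihood and candidate specifications are compared by AIC, as in the paper.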

  7. Hands-On Approach to Structure Activity Relationships: The Synthesis, Testing, and Hansch Analysis of a Series of Acetylcholinesterase Inhibitors

    ERIC Educational Resources Information Center

    Locock, Katherine; Tran, Hue; Codd, Rachel; Allan, Robin

    2015-01-01

    This series of three practical sessions centers on drugs that inhibit the enzyme acetylcholinesterase. This enzyme is responsible for the inactivation of acetylcholine and has been the target of drugs to treat glaucoma and Alzheimer's disease, as well as of a number of insecticides and warfare agents. These sessions relate to a series of carbamate…

  8. Task Analysis Inventories. Series II.

    ERIC Educational Resources Information Center

    Wesson, Carl E.

    This second in a series of task analysis inventories contains checklists of work performed in twenty-two occupations. Each inventory is a comprehensive list of work activities, responsibilities, educational courses, machines, tools, equipment, and work aids used and the products produced or services rendered in a designated occupational area. The…

  9. Visibility Graph Based Time Series Analysis

    PubMed Central

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network-based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility-graph-based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and, at the same time, a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks. PMID:26571115
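
    The natural-visibility mapping such methods build on follows directly from its definition: samples a and b see each other if every intermediate sample lies strictly below the straight line connecting them. The following is an illustrative O(n^2) sketch of that mapping, not the paper's segment-wise temporal-network construction.

```python
def visibility_edges(series):
    """Natural visibility graph: nodes are sample indices, edges are visibility links."""
    n, edges = len(series), set()
    for a in range(n):
        for b in range(a + 1, n):
            line = lambda c: series[a] + (series[b] - series[a]) * (c - a) / (b - a)
            if all(series[c] < line(c) for c in range(a + 1, b)):
                edges.add((a, b))
    return edges

valley = visibility_edges([3.0, 1.0, 3.0])  # the two peaks see each other over the dip
ramp = visibility_edges([1.0, 2.0, 3.0])    # collinear: the middle point blocks (0, 2)
```

Adjacent samples are always mutually visible, so every time series maps to a connected graph.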

  10. Multifractal Analysis of Sunspot Number Time Series

    NASA Astrophysics Data System (ADS)

    Kasde, Satish Kumar; Gwal, Ashok Kumar; Sondhiya, Deepak Kumar

    2016-07-01

    Multifractal-analysis-based approaches have recently been developed as an alternative framework to study the complex dynamical fluctuations in sunspot number data, including solar cycles 20 to 23 and the ascending phase of the current solar cycle 24. To reveal the multifractal nature, the time series of monthly sunspot numbers are analyzed by singularity spectrum and multiresolution wavelet analysis. Generally, the multifractality in sunspot numbers generates turbulence with the typical characteristics of the anomalous processes governing the magnetosphere and the interior of the Sun. Our analysis shows that the singularity spectrum of the sunspot data has a well-defined Gaussian shape, which clearly establishes that the monthly sunspot number has a multifractal character. The multifractal analysis is able to provide a local and adaptive description of the cyclic components of the sunspot number time series, which are non-stationary and the result of nonlinear processes. Keywords: sunspot numbers, magnetic field, multifractal analysis, wavelet transform techniques.

  11. Introduction to Time Series Analysis

    NASA Technical Reports Server (NTRS)

    Hardin, J. C.

    1986-01-01

    The field of time series analysis is explored from its logical foundations to the most modern data analysis techniques. The presentation is developed, as far as possible, for continuous data, so that the inevitable use of discrete mathematics is postponed until the reader has gained some familiarity with the concepts. The monograph seeks to provide the reader with both the theoretical overview and the practical details necessary to correctly apply the full range of these powerful techniques. In addition, the last chapter introduces many specialized areas where research is currently in progress.

  12. Approach to analysis of multiscale space-distributed time series: separation of spatio-temporal modes with essentially different time scales

    NASA Astrophysics Data System (ADS)

    Feigin, Alexander; Mukhin, Dmitry; Gavrilov, Andrey; Volodin, Evgeny; Loskutov, Evgeny

    2014-05-01

    Natural systems are in general space-distributed, and their evolution spans a broad spectrum of temporal scales. This multiscale nature may result from a multiplicity of mechanisms governing the system behaviour and from a large number of feedbacks and nonlinearities. A way to reveal and understand the underlying mechanisms, as well as to model the corresponding sub-systems, is to decompose the full (complex) system into well-separated spatio-temporal patterns ("modes") that evolve on essentially different time scales. In this report a new method for such a decomposition is discussed. The method is based on a generalization of MSSA (Multichannel Singular Spectral Analysis) [1] for expanding space-distributed time series in a basis of spatio-temporal empirical orthogonal functions (STEOF), which makes allowance for delayed correlations of the processes recorded at spatially separated points. The method is applied to decomposition of the Earth's climate system: on the basis of a 156-year time series of SST anomalies distributed over the globe [2], two climatic modes possessing noticeably different time scales (3-5 and 9-11 years) are separated. For more accurate exclusion of "too slow" (and thus not correctly represented) processes from the real data, a numerically produced STEOF basis is used; for this purpose the time series generated by the INM RAS Coupled Climate Model [3] is utilized. Relations of the separated modes to ENSO and PDO are investigated. Possible development of the suggested approach toward the separation of modes that are nonlinearly uncorrelated is discussed. 1. Ghil, M., R. M. Allen, M. D. Dettinger, K. Ide, D. Kondrashov, et al. (2002) "Advanced spectral methods for climatic time series", Rev. Geophys. 40(1), 3.1-3.41. 2. http://iridl.ldeo.columbia.edu/SOURCES/.KAPLAN/.EXTENDED/.v2/.ssta/ 3. http://83.149.207.89/GCM_DATA_PLOTTING/GCM_INM_DATA_XY_en.htm

  13. Analysis of time series from stochastic processes

    PubMed

    Gradisek; Siegert; Friedrich; Grabec

    2000-09-01

    Analysis of time series from stochastic processes governed by a Langevin equation is discussed. Several applications for the analysis are proposed based on estimates of drift and diffusion coefficients of the Fokker-Planck equation. The coefficients are estimated directly from a time series. The applications are illustrated by examples employing various synthetic time series and experimental time series from metal cutting. PMID:11088809
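
    For a Langevin process dx = D1(x) dt + noise, the drift and diffusion coefficients can be estimated from conditional moments of the increments. The sketch below is illustrative: it uses a simple global regression on a simulated Ornstein-Uhlenbeck process rather than the binned conditional-moment estimators typically used in this literature.

```python
import random

random.seed(7)
theta, sigma, dt, n = 1.0, 0.5, 0.01, 200_000

# simulate an Ornstein-Uhlenbeck process: dx = -theta * x dt + sigma dW
x, xs = 0.0, []
for _ in range(n):
    xs.append(x)
    x += -theta * x * dt + sigma * random.gauss(0, dt ** 0.5)

dx = [xs[i + 1] - xs[i] for i in range(n - 1)]

# drift: regress the increments on x; the slope per unit time should be near -theta
xm = sum(xs[:-1]) / (n - 1)
slope = sum((xi - xm) * di for xi, di in zip(xs, dx)) / \
        (dt * sum((xi - xm) ** 2 for xi in xs[:-1]))

# diffusion: the mean squared increment per unit time estimates sigma**2 (small-dt limit)
sigma2_hat = sum(d * d for d in dx) / ((n - 1) * dt)
```

Binning the increments by the current value of x, as in the paper's approach, recovers the full functions D1(x) and D2(x) rather than just these global summaries.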

  14. Nonlinear time-series analysis revisited

    NASA Astrophysics Data System (ADS)

    Bradley, Elizabeth; Kantz, Holger

    2015-09-01

    In 1980 and 1981, two pioneering papers laid the foundation for what became known as nonlinear time-series analysis: the analysis of observed data—typically univariate—via dynamical systems theory. Based on the concept of state-space reconstruction, this set of methods allows us to compute characteristic quantities such as Lyapunov exponents and fractal dimensions, to predict the future course of the time series, and even to reconstruct the equations of motion in some cases. In practice, however, there are a number of issues that restrict the power of this approach: whether the signal accurately and thoroughly samples the dynamics, for instance, and whether it contains noise. Moreover, the numerical algorithms that we use to instantiate these ideas are not perfect; they involve approximations, scale parameters, and finite-precision arithmetic, among other things. Even so, nonlinear time-series analysis has been used to great advantage on thousands of real and synthetic data sets from a wide variety of systems ranging from roulette wheels to lasers to the human heart. Even in cases where the data do not meet the mathematical or algorithmic requirements to assure full topological conjugacy, the results of nonlinear time-series analysis can be helpful in understanding, characterizing, and predicting dynamical systems.
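
    State-space reconstruction, the starting point of these methods, amounts to lifting a scalar series into vectors of delayed coordinates. A minimal sketch follows; choosing the embedding dimension and delay in practice requires criteria such as false nearest neighbours and mutual information, which are not shown here.

```python
def delay_embed(series, dim, tau):
    """Takens-style delay embedding: map a scalar series to dim-dimensional vectors."""
    n_vectors = len(series) - (dim - 1) * tau
    return [tuple(series[i + j * tau] for j in range(dim))
            for i in range(n_vectors)]

vectors = delay_embed([0, 1, 2, 3, 4, 5], dim=3, tau=2)
# -> [(0, 2, 4), (1, 3, 5)]
```

Quantities such as Lyapunov exponents and fractal dimensions are then computed on the cloud of reconstructed vectors rather than on the raw scalar series.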

  15. Nonlinear time-series analysis revisited.

    PubMed

    Bradley, Elizabeth; Kantz, Holger

    2015-09-01

    In 1980 and 1981, two pioneering papers laid the foundation for what became known as nonlinear time-series analysis: the analysis of observed data-typically univariate-via dynamical systems theory. Based on the concept of state-space reconstruction, this set of methods allows us to compute characteristic quantities such as Lyapunov exponents and fractal dimensions, to predict the future course of the time series, and even to reconstruct the equations of motion in some cases. In practice, however, there are a number of issues that restrict the power of this approach: whether the signal accurately and thoroughly samples the dynamics, for instance, and whether it contains noise. Moreover, the numerical algorithms that we use to instantiate these ideas are not perfect; they involve approximations, scale parameters, and finite-precision arithmetic, among other things. Even so, nonlinear time-series analysis has been used to great advantage on thousands of real and synthetic data sets from a wide variety of systems ranging from roulette wheels to lasers to the human heart. Even in cases where the data do not meet the mathematical or algorithmic requirements to assure full topological conjugacy, the results of nonlinear time-series analysis can be helpful in understanding, characterizing, and predicting dynamical systems. PMID:26428563

  16. Hydrodynamic analysis of time series

    NASA Astrophysics Data System (ADS)

    Suciu, N.; Vamos, C.; Vereecken, H.; Vanderborght, J.

    2003-04-01

    It was proved that balance equations for systems with a corpuscular structure can be derived if a kinematic description by piece-wise analytic functions is available [1]. For example, the hydrodynamic equations for one-dimensional systems of inelastic particles, derived in [2], were used to prove the inconsistency of the Fourier law of heat with the microscopic structure of the system. The hydrodynamic description is also possible for single-particle systems. In this case, averages of physical quantities associated with the particle over a space-time window, generalizing the usual "moving averages" performed over time intervals only, were shown to be almost-everywhere continuous space-time functions. Moreover, they obey balance partial differential equations (a continuity equation for the 'concentration', a Navier-Stokes equation, and so on) [3]. Time series can be interpreted as trajectories in the space of the recorded parameter. Their hydrodynamic interpretation is expected to enable deterministic predictions when closure relations can be obtained for the balance equations. For the time being, a first result is the estimation of the probability density for the occurrence of a given parameter value by the normalized concentration field from the hydrodynamic description. The method is illustrated by hydrodynamic analysis of three types of time series: white noise, stock prices from financial markets, and groundwater levels recorded at the Krauthausen experimental field of Forschungszentrum Jülich (Germany). [1] C. Vamoş, A. Georgescu, N. Suciu, I. Turcu, Physica A 227, 81-92, 1996. [2] C. Vamoş, N. Suciu, A. Georgescu, Phys. Rev E 55, 5, 6277-6280, 1997. [3] C. Vamoş, N. Suciu, W. Blaj, Physica A, 287, 461-467, 2000.

  17. Analysis of Polyphonic Musical Time Series

    NASA Astrophysics Data System (ADS)

    Sommer, Katrin; Weihs, Claus

    A general model for pitch tracking of polyphonic musical time series is introduced. Based on a model of Davy and Godsill (Bayesian harmonic models for musical pitch estimation and analysis, Technical Report 431, Cambridge University Engineering Department, 2002), the different pitches of the musical sound are estimated simultaneously with MCMC methods. Additionally, a preprocessing step is designed to improve the estimation of the fundamental frequencies (A comparative study on polyphonic musical time series using MCMC methods. In C. Preisach et al., editors, Data Analysis, Machine Learning, and Applications, Springer, Berlin, 2008). The preprocessing step compares real audio data with an alphabet constructed from the McGill Master Samples (Opolko and Wapnick, McGill University Master Samples [Compact disc], McGill University, Montreal, 1987), which consists of tones of different instruments. The tones with minimal Itakura-Saito distortion (Gray et al., Transactions on Acoustics, Speech, and Signal Processing ASSP-28(4):367-376, 1980) are chosen as first estimates and as starting points for the MCMC algorithms. Furthermore, the implementation of the alphabet is an approach to recognizing the instruments generating the musical time series. Results are presented for mixed monophonic data from McGill and for self-recorded polyphonic audio data.

  18. Complex network approach to fractional time series

    SciTech Connect

    Manshour, Pouya

    2015-10-15

    In order to extract the correlation information inherent in a stochastic time series, the visibility graph algorithm has recently been proposed, by which a time series can be mapped onto a complex network. We demonstrate that the visibility algorithm is not an appropriate one for studying the correlation aspects of a time series. We then employ the horizontal visibility algorithm, a much simpler one, to map fractional processes onto complex networks. The degree distributions are shown to have parabolic-exponential forms with a Hurst-dependent fitting parameter. Further, we take into account other topological properties, such as the maximum eigenvalue of the adjacency matrix and the degree assortativity, and show that such topological quantities can also be used to predict the Hurst exponent, with an exception for anti-persistent fractional Gaussian noises. To solve this problem, we take into account the Spearman correlation coefficient between the nodes' degrees and their corresponding data values in the original time series.

  19. Complex network approach to fractional time series

    NASA Astrophysics Data System (ADS)

    Manshour, Pouya

    2015-10-01

    In order to extract the correlation information inherent in a stochastic time series, the visibility graph algorithm has recently been proposed, by which a time series can be mapped onto a complex network. We demonstrate that the visibility algorithm is not an appropriate one for studying the correlation aspects of a time series. We then employ the horizontal visibility algorithm, a much simpler one, to map fractional processes onto complex networks. The degree distributions are shown to have parabolic-exponential forms with a Hurst-dependent fitting parameter. Further, we take into account other topological properties, such as the maximum eigenvalue of the adjacency matrix and the degree assortativity, and show that such topological quantities can also be used to predict the Hurst exponent, with an exception for anti-persistent fractional Gaussian noises. To solve this problem, we take into account the Spearman correlation coefficient between the nodes' degrees and their corresponding data values in the original time series.

  20. Analysis of series resonant converter with series-parallel connection

    NASA Astrophysics Data System (ADS)

    Lin, Bor-Ren; Huang, Chien-Lan

    2011-02-01

    In this study, a parallel inductor-inductor-capacitor (LLC) resonant converter, series-connected on the primary side and parallel-connected on the secondary side, is presented for server power supply systems. Based on the series resonant behaviour, the power metal-oxide-semiconductor field-effect transistors are turned on at zero-voltage switching and the rectifier diodes are turned off at zero-current switching; thus, the switching losses in the power semiconductors are reduced. In the proposed converter, the primary windings of the two LLC converters are connected in series, so the two converters carry the same primary current, ensuring that they supply balanced load currents. On the output side, the two LLC converters are connected in parallel to share the load current and to reduce the current stress on the secondary windings and the rectifier diodes. In this article, the principle of operation, steady-state analysis and design considerations of the proposed converter are provided and discussed. Experiments with a laboratory prototype with a 24 V/21 A output for a server power supply were performed to verify the effectiveness of the proposed converter.

  1. Nonlinear Analysis of Surface EMG Time Series

    NASA Astrophysics Data System (ADS)

    Zurcher, Ulrich; Kaufman, Miron; Sung, Paul

    2004-04-01

    Applications of nonlinear analysis of surface electromyography time series of patients with and without low back pain are presented. Limitations of the standard methods based on the power spectrum are discussed.

  2. English Grammar, A Combined Tagmemic and Transformational Approach. A Contrastive Analysis of English and Vietnamese, Vol. 1. Linguistic Circle of Canberra Publications, Series C--Books, No. 3.

    ERIC Educational Resources Information Center

    Nguyen, Dang Liem

    This is the first volume of a contrastive analysis of English and Vietnamese in the light of a combined tagmemic and transformational approach. The dialects contrasted are Midwest Standard American English and Standard Saigon Vietnamese. The study has been designed chiefly for pedagogical applications. A general introduction gives the history of…

  3. Allan deviation analysis of financial return series

    NASA Astrophysics Data System (ADS)

    Hernández-Pérez, R.

    2012-05-01

    We perform a scaling analysis of the return series of different financial assets by applying the Allan deviation (ADEV), a quantity used in time and frequency metrology to characterize quantitatively the stability of frequency standards, since it has been demonstrated to be robust for analyzing fluctuations of non-stationary time series over different observation intervals. The data used are daily opening-price series for assets from different markets over a time span of around ten years. We find that the ADEV results for the return series at short scales resemble those expected for an uncorrelated series, consistent with the efficient market hypothesis. On the other hand, the ADEV results for the absolute return series at short scales (the first one or two decades) decrease following an approximate scaling relation up to a point that differs for almost every asset, after which the ADEV deviates from scaling. This suggests that the presence of clustering, long-range dependence and non-stationarity signatures in the series drives the results for large observation intervals.
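
    The ADEV itself is easy to state: average the series over windows of length m, then take half the mean squared difference of averages separated by m. The sketch below is the standard overlapping estimator, offered for illustration rather than as the authors' exact implementation.

```python
import random

def allan_deviation(y, m):
    """Overlapping Allan deviation of a (frequency-like) series y at averaging factor m."""
    avg = [sum(y[i:i + m]) / m for i in range(len(y) - m + 1)]
    diffs = [(avg[i + m] - avg[i]) ** 2 for i in range(len(avg) - m)]
    return (sum(diffs) / (2 * len(diffs))) ** 0.5

random.seed(5)
white = [random.gauss(0, 1) for _ in range(20_000)]
# for uncorrelated noise the ADEV falls off as m**-0.5: quadrupling m roughly halves it
ratio = allan_deviation(white, 1) / allan_deviation(white, 4)
```

Deviations from this m**-0.5 fall-off at large m are precisely the clustering and long-range-dependence signatures discussed in the abstract.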

  4. Time Series Analysis Using Geometric Template Matching.

    PubMed

    Frank, Jordan; Mannor, Shie; Pineau, Joelle; Precup, Doina

    2013-03-01

    We present a novel framework for analyzing univariate time series data. At the heart of the approach is a versatile algorithm for measuring the similarity of two segments of time series called geometric template matching (GeTeM). First, we use GeTeM to compute a similarity measure for clustering and nearest-neighbor classification. Next, we present a semi-supervised learning algorithm that uses the similarity measure with hierarchical clustering in order to improve classification performance when unlabeled training data are available. Finally, we present a boosting framework called TDEBOOST, which uses an ensemble of GeTeM classifiers. TDEBOOST augments the traditional boosting approach with an additional step in which the features used as inputs to the classifier are adapted at each step to improve the training error. We empirically evaluate the proposed approaches on several datasets, such as accelerometer data collected from wearable sensors and ECG data. PMID:22641699

  5. Entropic Analysis of Electromyography Time Series

    NASA Astrophysics Data System (ADS)

    Kaufman, Miron; Sung, Paul

    2005-03-01

    We are in the process of assessing the effectiveness of fractal and entropic measures for the diagnosis of low back pain from surface electromyography (EMG) time series. Surface EMG is used to assess patients with low back pain. In a typical EMG measurement, the voltage is measured every millisecond. We observed back-muscle fatiguing during one minute, which results in a time series with 60,000 entries. We characterize the complexity of the time series by computing the time dependence of the Shannon entropy. The analysis of time series from the relevant muscles of healthy and low-back-pain (LBP) individuals provides evidence that the level of variability of back-muscle activity is much larger for healthy individuals than for individuals with LBP. In general, the time dependence of the entropy shows a crossover from a diffusive regime to a regime characterized by long-time correlations (self-organization) at about 0.01 s.
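
    The entropy computation can be sketched by binning a window of the signal and taking the Shannon entropy of the bin occupancies; the bin count and window length below are arbitrary illustrative choices, not the study's settings.

```python
import math
import random

def shannon_entropy(window, bins=16):
    """Shannon entropy (in nats) of a histogram of the window's values."""
    lo, hi = min(window), max(window)
    width = (hi - lo) / bins or 1.0       # guard against a constant window
    counts = [0] * bins
    for v in window:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    n = len(window)
    return -sum(c / n * math.log(c / n) for c in counts if c)

random.seed(4)
flat = [1.0] * 1000                                   # no variability: entropy 0
noisy = [random.uniform(0, 1) for _ in range(1000)]   # near-uniform: entropy near log(16)
```

Sliding the window across the 60,000-sample record produces the entropy-versus-time curve whose crossover the authors describe.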

  6. 3D QSAR STUDIES ON A SERIES OF QUINAZOLINE DERIVATIVES AS TYROSINE KINASE (EGFR) INHIBITORS: THE K-NEAREST NEIGHBOR MOLECULAR FIELD ANALYSIS APPROACH

    PubMed Central

    Noolvi, Malleshappa N.; Patel, Harun M.

    2010-01-01

    Epidermal growth factor receptor (EGFR) protein tyrosine kinases (PTKs) are known for their role in cancer. Quinazolines have been reported to be molecules of interest, with potent anticancer activity; they act by binding to the ATP site of protein kinases. The ATP binding site of protein kinases provides an extensive opportunity to design newer analogs. With this background, we report an attempt to discern the structural and physicochemical requirements for inhibition of EGFR tyrosine kinase. The k-Nearest Neighbor Molecular Field Analysis (kNN-MFA), a three-dimensional quantitative structure-activity relationship (3D-QSAR) method, has been used in the present case to study the correlation between the molecular properties and the tyrosine kinase (EGFR) inhibitory activities of a series of quinazoline derivatives. kNN-MFA calculations for both electrostatic and steric fields were carried out. The master grid maps derived from the best model have been used to display the contributions of the electrostatic potential and the steric field. The statistical results showed a significant correlation coefficient r2 (q2) of 0.846, an r2 for the external test set (pred_r2) of 0.8029, a coefficient of correlation of the predicted data set (pred_r2se) of 0.6658, 89 degrees of freedom and a k nearest neighbor of 2. Therefore, this study not only casts light on the binding mechanism between EGFR and its inhibitors, but also provides hints for the design of new EGFR inhibitors with observable structural diversity. PMID:24825983
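
    The kNN step at the core of kNN-MFA can be illustrated generically: predict a compound's activity as the distance-weighted average of its k nearest neighbours in descriptor space. The sketch below uses made-up toy descriptors, not actual molecular field values.

```python
def knn_predict(train, query, k=2):
    """Distance-weighted k-nearest-neighbour prediction.

    train: list of (descriptor_vector, activity) pairs; query: a descriptor vector.
    """
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(x, query)) ** 0.5, y) for x, y in train
    )
    nearest = dists[:k]
    if nearest[0][0] == 0.0:              # exact match: return its activity directly
        return nearest[0][1]
    weights = [1.0 / d for d, _ in nearest]
    return sum(w * y for w, (_, y) in zip(weights, nearest)) / sum(weights)

# toy descriptor space in which activity grows with the first descriptor
train = [((0.0, 0.0), 1.0), ((1.0, 0.0), 2.0), ((2.0, 0.0), 3.0)]
pred = knn_predict(train, (0.5, 0.0), k=2)   # interpolates between the two nearest
```

In kNN-MFA, the descriptor vectors are the steric and electrostatic field values at selected grid points, and the grid-point selection is itself optimized against the training activities.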

  7. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, L.M.; Ng, E.G.

    1998-09-29

    Methods and apparatus are disclosed for automatically detecting differences between similar but different states in a nonlinear process by monitoring nonlinear data. The steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time-serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated. 8 figs.

  8. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, Lee M.; Ng, Esmond G.

    1998-01-01

    Methods and apparatus are disclosed for automatically detecting differences between similar but different states in a nonlinear process by monitoring nonlinear data. The steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time-serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated.

  9. Topological analysis of chaotic time series

    NASA Astrophysics Data System (ADS)

    Gilmore, Robert

    1997-10-01

    Topological methods have recently been developed for the classification, analysis, and synthesis of chaotic time series. These methods can be applied to time series with a Lyapunov dimension less than three. The procedure determines the stretching and squeezing mechanisms which operate to create a strange attractor and organize all the unstable periodic orbits in the attractor in a unique way. Strange attractors are identified by a set of integers: topological invariants for a two-dimensional branched manifold, which is the infinite-dissipation limit of the strange attractor. It is remarkable that this topological information can be extracted from chaotic time series. The data required for this analysis need not be extensive or exceptionally clean. The topological invariants: (1) are subject to validation/invalidation tests; (2) describe how to model the data; and (3) do not change as control parameters change. Topological analysis is the first step in a doubly discrete classification scheme for strange attractors. The second discrete classification involves specification of a 'basis set' of periodic orbits whose presence forces the existence of all other periodic orbits in the strange attractor. The basis set of orbits does change as control parameters change. Quantitative models developed to describe time series data are tested by the methods of entrainment. This analysis procedure has been applied to a number of data sets, and several analyses are described.

  10. Nonlinear Time Series Analysis via Neural Networks

    NASA Astrophysics Data System (ADS)

    Volná, Eva; Janošek, Michal; Kocian, Václav; Kotyrba, Martin

    This article deals with a time series analysis based on neural networks in order to make an effective forex market [Moore and Roche, J. Int. Econ. 58, 387-411 (2002)] pattern recognition. Our goal is to find and recognize important patterns which repeatedly appear in the market history to adapt our trading system behaviour based on them.

  11. Modelling road accidents: An approach using structural time series

    NASA Astrophysics Data System (ADS)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-09-01

    In this paper, the trend of road accidents in Malaysia for the years 2001 until 2012 was modelled using a structural time series approach. The structural time series model was identified using a stepwise method, and the residuals for each model were tested. The best-fitted model was chosen based on the smallest Akaike Information Criterion (AIC) and prediction error variance. In order to check the quality of the model, a data validation procedure was performed by predicting the monthly number of road accidents for the year 2012. Results indicate that the best specification of the structural time series model to represent road accidents is the local level with a seasonal model.

  12. Multifractal analysis of polyalanines time series

    NASA Astrophysics Data System (ADS)

    Figueirêdo, P. H.; Nogueira, E.; Moret, M. A.; Coutinho, Sérgio

    2010-05-01

    Multifractal properties of the energy time series of short α-helix structures, specifically from a polyalanine family, are investigated through the MF-DFA technique (multifractal detrended fluctuation analysis). Estimates of the generalized Hurst exponent h(q) and the associated multifractal exponents τ(q) are obtained for several series generated by molecular dynamics simulations of different systems starting from distinct initial conformations. All simulations were performed using the GROMOS force field, implemented in the program THOR. The main results show that all series exhibit multifractal behavior, which depends on the number of residues and on temperature. Moreover, the multifractal spectra reveal important aspects of the time evolution of the system and suggest that the nucleation process of the secondary structures during visits to the energy hypersurface is an essential feature of the folding process.
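
    The MF-DFA procedure named above can be sketched in a few lines of numpy: integrate the series into a profile, detrend it in windows at several scales, and read h(q) off a log-log fit of the q-th order fluctuation function. A minimal sketch on synthetic white noise, not the authors' GROMOS/THOR pipeline:

```python
import numpy as np

def mfdfa_hurst(x, scales, q, order=1):
    """Generalized Hurst exponent h(q) from multifractal DFA."""
    profile = np.cumsum(x - np.mean(x))             # integrated profile
    Fq = []
    for s in scales:
        n_seg = len(profile) // s
        sq = []
        for i in range(n_seg):
            seg = profile[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, order), t)
            sq.append(np.mean((seg - trend) ** 2))  # squared fluctuation per segment
        sq = np.asarray(sq)
        if q == 0:                                  # q -> 0 limit uses a log average
            Fq.append(np.exp(0.5 * np.mean(np.log(sq))))
        else:
            Fq.append(np.mean(sq ** (q / 2.0)) ** (1.0 / q))
    slope, _ = np.polyfit(np.log(scales), np.log(Fq), 1)
    return slope

rng = np.random.default_rng(0)
white = rng.standard_normal(4096)
h2 = mfdfa_hurst(white, scales=[16, 32, 64, 128, 256], q=2)  # near 0.5 for uncorrelated noise
```

    For monofractal white noise h(q) stays near 0.5 for every q, and the multifractal exponents follow as τ(q) = q·h(q) − 1; a genuinely multifractal series shows a clear q-dependence.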

  13. Climate Time Series Analysis and Forecasting

    NASA Astrophysics Data System (ADS)

    Young, P. C.; Fildes, R.

    2009-04-01

    This paper will discuss various aspects of climate time series data analysis, modelling and forecasting being carried out at Lancaster. This will include state-dependent parameter, nonlinear, stochastic modelling of globally averaged atmospheric carbon dioxide; the computation of emission strategies based on modern control theory; and extrapolative time series benchmark forecasts of annual average temperature, both global and local. The key to the forecasting evaluation will be the iterative estimation of forecast error based on rolling origin comparisons, as recommended in the forecasting research literature. The presentation will conclude with a comparison of the time series forecasts with forecasts produced from global circulation models and a discussion of the implications for climate modelling research.
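
    The rolling-origin evaluation mentioned above refits or re-applies the forecaster at each origin and scores only out-of-sample errors. A minimal sketch with a naive last-value benchmark on a synthetic random walk (illustrative, not the Lancaster models):

```python
import numpy as np

def rolling_origin_errors(y, forecaster, min_train=50):
    """Rolling-origin evaluation: at each origin t, forecast y[t+1] from y[:t+1]."""
    errs = [forecaster(y[:t + 1]) - y[t + 1] for t in range(min_train, len(y) - 1)]
    return np.asarray(errs)

naive = lambda history: history[-1]           # benchmark: last observed value

rng = np.random.default_rng(10)
walk = np.cumsum(rng.standard_normal(300))    # synthetic series with unit innovations
rmse = np.sqrt(np.mean(rolling_origin_errors(walk, naive) ** 2))  # near 1 by construction
```

    For a unit-variance random walk the naive forecaster is optimal, so its rolling-origin RMSE estimates the innovation standard deviation; any candidate model can be benchmarked against it the same way.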

  14. Ensemble vs. time averages in financial time series analysis

    NASA Astrophysics Data System (ADS)

    Seemann, Lars; Hua, Jia-Chen; McCauley, Joseph L.; Gunaratne, Gemunu H.

    2012-12-01

    Empirical analysis of financial time series suggests that the underlying stochastic dynamics are not only non-stationary, but also exhibit non-stationary increments. However, financial time series are commonly analyzed using the sliding-interval technique, which assumes stationary increments. We propose an alternative approach that is based on an ensemble over trading days. To determine the effects of time averaging techniques on analysis outcomes, we create an intraday activity model that exhibits periodic variable diffusion dynamics and we assess the model data using both ensemble and time averaging techniques. We find that ensemble averaging techniques detect the underlying dynamics correctly, whereas sliding-interval approaches fail. As many traded assets exhibit characteristic intraday volatility patterns, our work implies that ensemble averaging approaches will yield new insight into the study of financial markets’ dynamics.
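
    The contrast between the two averaging schemes is easy to reproduce: simulate increments whose variance follows a hypothetical intraday profile, then compare a per-time-of-day ensemble variance with a single pooled time-average variance. A sketch under those assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n_days, n_intraday = 500, 100
t = np.arange(n_intraday)
# hypothetical intraday volatility profile (high at the open and the close)
sigma = 1.0 + 2.0 * np.abs(t - n_intraday / 2) / n_intraday
increments = rng.standard_normal((n_days, n_intraday)) * sigma

# ensemble average: variance at a fixed intraday time, taken across days
var_ensemble = increments.var(axis=0)
# sliding-interval style time average: one pooled variance over the whole record
var_time = increments.ravel().var()

# the ensemble estimate tracks the periodic profile; the time average blurs it
recovered = np.corrcoef(var_ensemble, sigma ** 2)[0, 1]
```

    The ensemble variance correlates almost perfectly with the true profile σ(t)², while the single time-averaged number can only report the mean of σ² over the day.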

  15. Delay Differential Analysis of Time Series

    PubMed Central

    Lainscsek, Claudia; Sejnowski, Terrence J.

    2015-01-01

    Nonlinear dynamical system analysis based on embedding theory has been used for modeling and prediction, but it also has applications to signal detection and classification of time series. An embedding creates a multidimensional geometrical object from a single time series. Traditionally either delay or derivative embeddings have been used. The delay embedding is composed of delayed versions of the signal, and the derivative embedding is composed of successive derivatives of the signal. The delay embedding has been extended to nonuniform embeddings to take multiple timescales into account. Both embeddings provide information on the underlying dynamical system without having direct access to all the system variables. Delay differential analysis is based on functional embeddings, a combination of the derivative embedding with nonuniform delay embeddings. Small delay differential equation (DDE) models that best represent relevant dynamic features of time series data are selected from a pool of candidate models for detection or classification. We show that the properties of DDEs support spectral analysis in the time domain where nonlinear correlation functions are used to detect frequencies, frequency and phase couplings, and bispectra. These can be efficiently computed with short time windows and are robust to noise. For frequency analysis, this framework is a multivariate extension of discrete Fourier transform (DFT), and for higher-order spectra, it is a linear and multivariate alternative to multidimensional fast Fourier transform of multidimensional correlations. This method can be applied to short or sparse time series and can be extended to cross-trial and cross-channel spectra if multiple short data segments of the same experiment are available. Together, this time-domain toolbox provides higher temporal resolution, increased frequency and phase coupling information, and it allows an easy and straightforward implementation of higher-order spectra across time
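
    A minimal sketch of the DDE-fitting idea: estimate the derivative numerically and least-squares fit a small model built from delayed terms; the coefficients and goodness of fit then serve as detection/classification features. The delays, the single cross term, and the test signal below are illustrative choices, not the authors' model pool:

```python
import numpy as np

def dde_fit(x, dt, delays):
    """Fit x'(t) = a1*x(t-tau1) + a2*x(t-tau2) + a3*x(t-tau1)*x(t-tau2)
    by least squares; returns coefficients and R^2 of the fit."""
    n, d = len(x), max(delays)
    idx = np.arange(d, n - 1)
    dx = (x[idx + 1] - x[idx - 1]) / (2.0 * dt)   # centered derivative estimate
    r1, r2 = x[idx - delays[0]], x[idx - delays[1]]
    A = np.column_stack([r1, r2, r1 * r2])
    coef, *_ = np.linalg.lstsq(A, dx, rcond=None)
    resid = dx - A @ coef
    return coef, 1.0 - resid.var() / dx.var()

t = np.arange(0.0, 20.0, 0.01)
x = np.sin(np.pi * t)                             # period 2 s
coef, r2 = dde_fit(x, dt=0.01, delays=(50, 25))   # tau1 = quarter period
```

    For a pure sinusoid with tau1 equal to a quarter period, x'(t) = −ω·x(t−tau1) exactly, so the fit recovers a1 ≈ −π here with R² near 1; for real data, different candidate delay sets are compared and the best small model is kept.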

  16. Nonstationary time series prediction combined with slow feature analysis

    NASA Astrophysics Data System (ADS)

    Wang, G.; Chen, X.

    2015-07-01

    Almost all climate time series have some degree of nonstationarity due to external driving forces perturbing the observed system. Therefore, these external driving forces should be taken into account when constructing the climate dynamics. This paper presents a new technique of obtaining the driving forces of a time series from the slow feature analysis (SFA) approach, and then introduces them into a predictive model to predict nonstationary time series. The basic theory of the technique is to consider the driving forces as state variables and to incorporate them into the predictive model. Experiments using a modified logistic time series and winter ozone data in Arosa, Switzerland, were conducted to test the model. The results showed improved prediction skills.
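
    A linear SFA sketch illustrates the extraction step: whiten the multichannel signal, then take the direction whose time derivative has minimal variance as the slow (driving-force) candidate. The two-channel mixture below is illustrative, not the modified logistic map or the Arosa ozone data:

```python
import numpy as np

def slowest_feature(X):
    """Linear SFA: unit-variance linear combination of the channels whose
    time derivative has minimal variance."""
    Xc = X - X.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    Z = Xc @ (vecs / np.sqrt(vals))                  # whiten the channels
    _, dvecs = np.linalg.eigh(np.cov(np.diff(Z, axis=0), rowvar=False))
    return Z @ dvecs[:, 0]                           # smallest derivative variance

t = np.linspace(0.0, 1.0, 2000)
slow = np.sin(2 * np.pi * t)                         # hypothetical slow driving force
fast = np.sin(2 * np.pi * 40 * t)
rng = np.random.default_rng(2)
mix = np.column_stack([slow + 0.5 * fast,
                       0.5 * slow - fast]) + 0.01 * rng.standard_normal((2000, 2))
corr = abs(np.corrcoef(slowest_feature(mix), slow)[0, 1])   # near 1
```

    The recovered slow feature can then be fed into a predictive model as an extra state variable, which is the paper's central idea.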

  17. Time-Series Analysis: A Cautionary Tale

    NASA Technical Reports Server (NTRS)

    Damadeo, Robert

    2015-01-01

    Time-series analysis has often been a useful tool in atmospheric science for deriving long-term trends in various atmospherically important parameters (e.g., temperature or the concentration of trace gas species). In particular, time-series analysis has been repeatedly applied to satellite datasets in order to derive the long-term trends in stratospheric ozone, which is a critical atmospheric constituent. However, many of the potential pitfalls relating to the non-uniform sampling of the datasets were often ignored and the results presented by the scientific community have been unknowingly biased. A newly developed and more robust application of this technique is applied to the Stratospheric Aerosol and Gas Experiment (SAGE) II version 7.0 ozone dataset and the previous biases and newly derived trends are presented.

  18. Homogeneity analysis of precipitation series in Iran

    NASA Astrophysics Data System (ADS)

    Hosseinzadeh Talaee, P.; Kouchakzadeh, Mahdi; Shifteh Some'e, B.

    2014-10-01

    Assessment of the reliability and quality of historical precipitation data is required in the modeling of hydrology and water resource processes and for climate change studies. The homogeneity of the annual and monthly precipitation data sets throughout Iran was tested using the Bayesian, Cumulative Deviations, and von Neumann tests at a significance level of 0.05. The precipitation records from 41 meteorological stations covering the years between 1966 and 2005 were considered. The annual series of Iranian precipitation were found to be homogeneous by applying the Bayesian and Cumulative Deviations tests, while the von Neumann test detected inhomogeneities at seven stations. Almost all the monthly precipitation data sets are homogeneous and considered as "useful." The outputs of the statistical tests for the homogeneity analysis of the precipitation time series had discrepancies in some cases, which are related to the different sensitivities of the tests to breaks in the time series. It was found that the von Neumann test is more sensitive than the Bayesian and Cumulative Deviations tests in the determination of inhomogeneity in the precipitation series.
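
    The von Neumann test used above is compact enough to sketch: the statistic is the ratio of the mean squared successive difference to the variance, close to 2 for a homogeneous series and pulled well below 2 by a mean break. An illustration on synthetic data:

```python
import numpy as np

def von_neumann_ratio(x):
    """Von Neumann ratio; values well below 2 suggest an inhomogeneous series."""
    x = np.asarray(x, dtype=float)
    return np.sum(np.diff(x) ** 2) / np.sum((x - x.mean()) ** 2)

rng = np.random.default_rng(3)
homogeneous = rng.standard_normal(400)
# series with an artificial break in the mean halfway through
broken = np.concatenate([rng.standard_normal(200),
                         rng.standard_normal(200) + 3.0])

n_hom = von_neumann_ratio(homogeneous)   # close to 2
n_brk = von_neumann_ratio(broken)        # well below 2
```

    The break inflates the denominator (total variance) without inflating the successive differences, which is why the ratio drops; critical values for a given significance level are tabulated as a function of series length.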

  19. Exploratory Causal Analysis in Bivariate Time Series Data

    NASA Astrophysics Data System (ADS)

    McCracken, James M.

    Many scientific disciplines rely on observational data of systems for which it is difficult (or impossible) to implement controlled experiments, and data analysis techniques are required for identifying causal information and relationships directly from observational data. This need has led to the development of many different time series causality approaches and tools including transfer entropy, convergent cross-mapping (CCM), and Granger causality statistics. In this thesis, the existing time series causality method of CCM is extended by introducing a new method called pairwise asymmetric inference (PAI). It is found that CCM may provide counter-intuitive causal inferences for simple dynamics with strong intuitive notions of causality, and the CCM causal inference can be a function of physical parameters that are seemingly unrelated to the existence of a driving relationship in the system. For example, a CCM causal inference might alternate between ''voltage drives current'' and ''current drives voltage'' as the frequency of the voltage signal is changed in a series circuit with a single resistor and inductor. PAI is introduced to address both of these limitations. Many of the current approaches in the time series causality literature are not computationally straightforward to apply, do not follow directly from assumptions of probabilistic causality, depend on assumed models for the time series generating process, or rely on embedding procedures. A new approach, called causal leaning, is introduced in this work to avoid these issues. The leaning is found to provide causal inferences that agree with intuition for both simple systems and more complicated empirical examples, including space weather data sets. The leaning may provide a clearer interpretation of the results than those from existing time series causality tools. A practicing analyst can explore the literature to find many proposals for identifying drivers and causal connections in time series data
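
    Of the tools surveyed above, Granger causality is the most compact to sketch: compare the residual sum of squares of an autoregression of y with and without lagged x. A variance-ratio version on a simulated driven pair (illustrative only; it does not implement the thesis' PAI or leaning measures):

```python
import numpy as np

def granger_stat(x, y, p=2):
    """Variance-ratio Granger statistic: does past x improve prediction of y?"""
    n = len(y)
    Y = y[p:]
    lags_y = np.column_stack([y[p - k:n - k] for k in range(1, p + 1)])
    lags_x = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])
    ones = np.ones((n - p, 1))
    restricted = np.hstack([ones, lags_y])
    full = np.hstack([ones, lags_y, lags_x])
    rss = lambda A: np.sum((Y - A @ np.linalg.lstsq(A, Y, rcond=None)[0]) ** 2)
    return (rss(restricted) - rss(full)) / rss(full)   # > 0 when x adds information

rng = np.random.default_rng(4)
x = rng.standard_normal(1000)
y = np.zeros(1000)
for t in range(1, 1000):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

forward = granger_stat(x, y)   # x -> y: large
reverse = granger_stat(y, x)   # y -> x: near zero
```

    The asymmetry forward ≫ reverse is the causal signature; in practice the statistic is turned into an F-test, and the thesis' point is precisely that such model-based measures can mislead in settings where PAI or the leaning behave more intuitively.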

  20. AN ALTERNATIVE APPROACH TO THE TREATMENT OF MENISCAL PATHOLOGIES: A CASE SERIES ANALYSIS OF THE MULLIGAN CONCEPT “SQUEEZE” TECHNIQUE

    PubMed Central

    Richmond, Amy; Sanchez, Belinda; Stevenson, Valerie; Baker, Russell T.; May, James; Nasypany, Alan; Reordan, Don

    2016-01-01

    ABSTRACT Background Partial meniscectomy does not consistently produce the desired positive outcomes intended for meniscal tear lesions; therefore, a need exists for research into alternatives for treating symptoms of meniscal tears. The purpose of this case series was to examine the effect of the Mulligan Concept (MC) “Squeeze” technique in physically active participants who presented with clinical symptoms of meniscal tears. Description of Cases The MC “Squeeze” technique was applied in five cases of clinically diagnosed meniscal tears in a physically active population. The Numeric Pain Rating Scale (NRS), the Patient Specific Functional Scale (PSFS), the Disability in the Physically Active (DPA) Scale, and the Knee injury and Osteoarthritis Outcome Score (KOOS) were administered to assess participant pain level and function. Outcomes Statistically significant improvements were found on cumulative NRS (p ≤ 0.001), current NRS (p ≤ 0.002), PSFS (p ≤ 0.003), DPA (p ≤ 0.019), and KOOS (p ≤ 0.002) scores across all five participants. All participants exceeded the minimal clinically important difference (MCID) on the first treatment and reported an NRS score and current pain score of one point or less at discharge. The MC “Squeeze” technique produced statistically and clinically significant changes across all outcome measures in all five participants. Discussion The use of the MC “Squeeze” technique in this case series indicated positive outcomes in five participants who presented with meniscal tear symptoms. Of importance to the athletic population, each of the participants continued to engage in sport activity as tolerated unless otherwise required during the treatment period. The outcomes reported in this case series exceed those reported when using traditional conservative therapy and the return-to-play timelines for meniscal tears treated with partial meniscectomies. Levels of Evidence Level 4 PMID:27525181

  1. Long-Term Retrospective Analysis of Mackerel Spawning in the North Sea: A New Time Series and Modeling Approach to CPR Data

    PubMed Central

    Jansen, Teunis; Kristensen, Kasper; Payne, Mark; Edwards, Martin; Schrum, Corinna; Pitois, Sophie

    2012-01-01

    We present a unique view of mackerel (Scomber scombrus) in the North Sea based on a new time series of larvae caught by the Continuous Plankton Recorder (CPR) survey from 1948-2005, covering the period both before and after the collapse of the North Sea stock. Hydrographic backtrack modelling suggested that the effect of advection is very limited between spawning and larvae capture in the CPR survey. Using a statistical technique not previously applied to CPR data, we then generated a larval index that accounts for both catchability as well as spatial and temporal autocorrelation. The resulting time series documents the significant decrease of spawning from before 1970 to recent depleted levels. Spatial distributions of the larvae, and thus the spawning area, showed a shift from early to recent decades, suggesting that the central North Sea is no longer as important as the areas further west and south. These results provide a consistent and unique perspective on the dynamics of mackerel in this region and can potentially resolve many of the unresolved questions about this stock. PMID:22737221

  2. Sliced Inverse Regression for Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Chen, Li-Sue

    1995-11-01

    In this thesis, general nonlinear models for time series data are considered. A basic form is x_t = f(beta_1^T X_{t-1}, beta_2^T X_{t-1}, ..., beta_k^T X_{t-1}, epsilon_t), where x_t is the observed time series, X_t is the vector of the first d time lags, (x_t, x_{t-1}, ..., x_{t-d+1}), f is an unknown function, the beta_i are unknown vectors, and the epsilon_t are independently distributed. Special cases include AR and TAR models. We investigate the feasibility of applying SIR/PHD (Li 1990, 1991) (the sliced inverse regression and principal Hessian directions methods) to estimate the beta_i. PCA (principal component analysis) is brought in to check one critical condition for SIR/PHD. Through simulation and a study of three well-known data sets (the Canadian lynx, U.S. unemployment rate, and sunspot numbers), we demonstrate how SIR/PHD can effectively retrieve the interesting low-dimensional structures for time series data.
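
    SIR itself is short to sketch in numpy: slice the response, average the standardized predictors within each slice, and take the leading eigenvector of the weighted covariance of the slice means. Shown on a generic single-index model rather than the thesis' lagged time series setup:

```python
import numpy as np

def sir_direction(X, y, n_slices=10):
    """Sliced inverse regression: leading e.d.r. direction for y = f(b'X, noise)."""
    n = X.shape[0]
    Xc = X - X.mean(axis=0)
    L = np.linalg.cholesky(np.linalg.inv(np.cov(Xc, rowvar=False)))
    Z = Xc @ L                                         # standardized predictors
    slices = np.array_split(np.argsort(y), n_slices)   # slice on the response
    means = np.array([Z[idx].mean(axis=0) for idx in slices])
    weights = np.array([len(idx) for idx in slices]) / n
    M = (means.T * weights) @ means                    # weighted slice-mean covariance
    _, vecs = np.linalg.eigh(M)
    b = L @ vecs[:, -1]                                # map back to the original scale
    return b / np.linalg.norm(b)

rng = np.random.default_rng(5)
X = rng.standard_normal((2000, 5))
beta = np.array([1.0, 0.5, 0.0, 0.0, -0.5])
beta /= np.linalg.norm(beta)
y = np.tanh(X @ beta) + 0.1 * rng.standard_normal(2000)
cos_sim = abs(sir_direction(X, y) @ beta)              # close to 1
```

    For time series one would build the rows of X from lag vectors X_{t-1}; the critical condition checked via PCA in the thesis is the linearity condition on the predictor distribution that SIR requires.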

  3. Singular spectrum analysis for time series with missing data

    USGS Publications Warehouse

    Schoellhamer, D.H.

    2001-01-01

    Geophysical time series often contain missing data, which prevents analysis with many signal processing and multivariate tools. A modification of singular spectrum analysis for time series with missing data is developed and successfully tested with synthetic and actual incomplete time series of suspended-sediment concentration from San Francisco Bay. This method can also be used to low-pass filter incomplete time series.
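
    The missing-data modification can be sketched as: estimate the lag-covariance from a mean-filled trajectory matrix, then project each window onto the leading EOFs using only its available samples (a simplified illustration, not the paper's exact algorithm):

```python
import numpy as np

def ssa_reconstruct(x, window, n_modes=2):
    """SSA reconstruction tolerating NaNs: EOFs from a mean-filled lag-covariance,
    then a per-window least-squares projection on the available samples only."""
    n = len(x)
    K = n - window + 1
    traj = np.array([x[i:i + window] for i in range(K)])      # trajectory matrix
    filled = np.where(np.isnan(traj), np.nanmean(traj, axis=0), traj)
    _, vecs = np.linalg.eigh(np.cov(filled, rowvar=False))
    E = vecs[:, ::-1][:, :n_modes]                            # leading EOFs
    recon, counts = np.zeros(n), np.zeros(n)
    for i in range(K):
        good = ~np.isnan(traj[i])
        coef, *_ = np.linalg.lstsq(E[good], traj[i][good], rcond=None)
        recon[i:i + window] += E @ coef                       # fills the gaps too
        counts[i:i + window] += 1
    return recon / counts

rng = np.random.default_rng(11)
t = np.arange(500)
clean = np.sin(2 * np.pi * t / 50.0)
x = clean + 0.2 * rng.standard_normal(500)
x[rng.choice(500, size=50, replace=False)] = np.nan           # 10% missing
recon = ssa_reconstruct(x, window=50, n_modes=2)
rmse = np.sqrt(np.mean((recon - clean) ** 2))
```

    Keeping only the leading oscillatory pair acts as the low-pass filter mentioned above, and the projection step never needs the missing samples.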

  4. Sutureless clear corneal DSAEK with a modified approach for preventing pupillary block and graft dislocation: case series with retrospective comparative analysis.

    PubMed

    Titiyal, Jeewan S; Tinwala, Sana I; Shekhar, Himanshu; Sinha, Rajesh

    2015-04-01

    The purpose of this study was to describe a modified technique of sutureless DSAEK with continuous pressurized internal air tamponade. This was a prospective interventional case series, single-center, institutional study. Twenty-seven patients with corneal decompensation without scarring were included. Aphakic patients and patients with cataractous lens requiring IOL implantation surgery were excluded. Following preparation of the donor tissue, a corneal tunnel was made nasally with two side ports. All incisions were kept long enough to be overlapped by the peripheral part of the donor tissue. Descemet membrane scoring was done using a reverse Sinskey hook, following which it was removed with the same instrument or by forceps. The donor lenticule was then inserted using Busin's glide. Continuous pressurized internal air tamponade was achieved by means of a 30-gauge needle, inserted through the posterior limbus, for 12-14 min. At the end of the surgery, air was partially replaced with BSS, leaving a moderate-sized mobile air bubble in the anterior chamber. At the 6 month's follow-up, CDVA improved from counting fingers at half meter-6/24 preoperatively to 6/9-6/18 postoperatively, and the mean endothelial cell count decreased: to 1,800 from 2,200 cell/mm(2) preoperatively (18.19 % endothelial cell loss). Donor lenticule thickness as documented on AS-OCT was 70-110 µ on Day 1 and 50-80 µ at 6 months postoperative. None of the cases had flat AC or peripheral anterior synechiae formation. None of the patients required a second intervention. There were no cases of primary graft failure, pupillary block glaucomax or donor lenticule dislocation postoperatively. Our modified technique is simple and effective with reduction in postoperative complications associated with DSAEK, thereby maximizing anatomic and functional outcomes associated. PMID:24728534

  5. A seasonal and heteroscedastic gamma model for hydrological time series: A Bayesian approach

    NASA Astrophysics Data System (ADS)

    Cuervo, Edilberto Cepeda; Andrade, Marinho G.; Achcar, Jorge Alberto

    2012-10-01

    Time series models are often used in hydrology to model streamflow series in order to forecast and generate synthetic series which are inputs for the analysis of complex water resources systems. In this paper, we introduce a new modeling approach for hydrologic time series assuming a gamma distribution for the data, where both the mean and conditional variance are being modeled. Bayesian methods using standard Markov Chain Monte Carlo Methods (MCMC) and a simulation algorithm introduced by [1] are used to simulate samples of the joint posterior distribution of interest. An example is given with a time series of monthly averages of natural streamflows, measured from 1931 to 2010 in Furnas hydroelectric dam, in southeastern Brazil.
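
    A much-reduced sketch of the Bayesian machinery: random-walk Metropolis on the log of the mean of gamma-distributed data, with the shape treated as known and a flat prior on the log mean. The paper's full model also lets the conditional variance evolve, which this sketch omits:

```python
import numpy as np

def metropolis_gamma_mean(y, shape_a=2.0, n_iter=5000, step=0.05, seed=0):
    """Random-walk Metropolis on log(mu) for y_i ~ Gamma(shape_a, scale=mu/shape_a);
    terms free of mu cancel in the acceptance ratio."""
    rng = np.random.default_rng(seed)
    loglik = lambda mu: -shape_a * len(y) * np.log(mu) - shape_a * np.sum(y) / mu
    log_mu = np.log(y.mean())
    ll = loglik(np.exp(log_mu))
    draws = []
    for _ in range(n_iter):
        prop = log_mu + step * rng.standard_normal()
        ll_prop = loglik(np.exp(prop))
        if np.log(rng.random()) < ll_prop - ll:        # symmetric proposal
            log_mu, ll = prop, ll_prop
        draws.append(np.exp(log_mu))
    return np.asarray(draws[n_iter // 2:])             # discard burn-in

rng = np.random.default_rng(12)
flows = rng.gamma(2.0, scale=2.5, size=500)            # synthetic "streamflows", mean 5
posterior_mu = metropolis_gamma_mean(flows)
```

    With a flat prior the posterior for the mean concentrates near the sample mean; in the seasonal, heteroscedastic model both the mean and the variance get their own regression structure and are sampled jointly.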

  6. Compounding approach for univariate time series with nonstationary variances.

    PubMed

    Schäfer, Rudi; Barkhofen, Sonja; Guhr, Thomas; Stöckmann, Hans-Jürgen; Kuhl, Ulrich

    2015-12-01

    A defining feature of nonstationary systems is the time dependence of their statistical parameters. Measured time series may exhibit Gaussian statistics on short time horizons, due to the central limit theorem. The sample statistics for long time horizons, however, averages over the time-dependent variances. To model the long-term statistical behavior, we compound the local distribution with the distribution of its parameters. Here, we consider two concrete, but diverse, examples of such nonstationary systems: the turbulent air flow of a fan and a time series of foreign exchange rates. Our main focus is to empirically determine the appropriate parameter distribution for the compounding approach. To this end, we extract the relevant time scales by decomposing the time signals into windows and determine the distribution function of the thus obtained local variances. PMID:26764768

  7. Compounding approach for univariate time series with nonstationary variances

    NASA Astrophysics Data System (ADS)

    Schäfer, Rudi; Barkhofen, Sonja; Guhr, Thomas; Stöckmann, Hans-Jürgen; Kuhl, Ulrich

    2015-12-01

    A defining feature of nonstationary systems is the time dependence of their statistical parameters. Measured time series may exhibit Gaussian statistics on short time horizons, due to the central limit theorem. The sample statistics for long time horizons, however, averages over the time-dependent variances. To model the long-term statistical behavior, we compound the local distribution with the distribution of its parameters. Here, we consider two concrete, but diverse, examples of such nonstationary systems: the turbulent air flow of a fan and a time series of foreign exchange rates. Our main focus is to empirically determine the appropriate parameter distribution for the compounding approach. To this end, we extract the relevant time scales by decomposing the time signals into windows and determine the distribution function of the thus obtained local variances.
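
    The compounding construction is easy to reproduce: draw one variance per short window from an assumed parameter distribution (gamma here, purely for illustration), generate locally Gaussian data, and check that windowed variances recover the parameter distribution while the pooled series is heavy-tailed:

```python
import numpy as np

def kurtosis(x):
    x = x - x.mean()
    return np.mean(x ** 4) / np.mean(x ** 2) ** 2

rng = np.random.default_rng(6)
n_windows, win = 2000, 50
# time-dependent variances: one per short window, from a hypothetical gamma law
local_var = rng.gamma(shape=2.0, scale=1.0, size=n_windows)
series = (np.sqrt(local_var)[:, None] * rng.standard_normal((n_windows, win))).ravel()

# short horizons: close to Gaussian (kurtosis near 3)
k_local = np.mean([kurtosis(series[i * win:(i + 1) * win]) for i in range(200)])
# long horizon: compounding the Gaussian with the variance law gives heavy tails
k_global = kurtosis(series)

# windowed variances recover the parameter distribution for the compounding ansatz
est_var = series.reshape(n_windows, win).var(axis=1)
recovery = np.corrcoef(est_var, local_var)[0, 1]
```

    Extracting `est_var` and examining its distribution is the empirical step the paper focuses on; the compounded (long-horizon) distribution is then the Gaussian mixed over that variance law.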

  8. Irreversibility of financial time series: A graph-theoretical approach

    NASA Astrophysics Data System (ADS)

    Flanagan, Ryan; Lacasa, Lucas

    2016-04-01

    The relation between time series irreversibility and entropy production has been recently investigated in thermodynamic systems operating away from equilibrium. In this work we explore this concept in the context of financial time series. We make use of visibility algorithms to quantify, in graph-theoretical terms, time irreversibility of 35 financial indices evolving over the period 1998-2012. We show that this metric is complementary to standard measures based on volatility and exploit it to both classify periods of financial stress and to rank companies accordingly. We then validate this approach by finding that a projection in principal components space of financial years, based on time irreversibility features, clusters together periods of financial stress from stable periods. Relations between irreversibility, efficiency and predictability are briefly discussed.
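
    The visibility machinery can be sketched directly: build the directed horizontal visibility graph, then score irreversibility as the KL divergence between the out-degree (forward) and in-degree (backward) distributions. The two series below are illustrative, not the 35 indices studied:

```python
import numpy as np

def hvg_degrees(x):
    """Directed horizontal visibility graph: out-degrees (toward the future)
    and in-degrees (from the past)."""
    n = len(x)
    kout = np.zeros(n, dtype=int)
    kin = np.zeros(n, dtype=int)
    for i in range(n):
        blocker = -np.inf                    # running max of intermediate values
        for j in range(i + 1, n):
            if blocker < x[i] and blocker < x[j]:
                kout[i] += 1
                kin[j] += 1
            blocker = max(blocker, x[j])
            if blocker >= x[i]:
                break
    return kout, kin

def degree_kld(k1, k2):
    """KL divergence between empirical degree distributions (common support only)."""
    m = max(k1.max(), k2.max()) + 1
    p = np.bincount(k1, minlength=m) / len(k1)
    q = np.bincount(k2, minlength=m) / len(k2)
    s = (p > 0) & (q > 0)
    return float(np.sum(p[s] * np.log(p[s] / q[s])))

rng = np.random.default_rng(13)
noise = rng.standard_normal(2000)                      # statistically reversible
saw = np.tile(np.arange(10.0), 200) + 0.01 * rng.standard_normal(2000)  # irreversible

ko, ki = hvg_degrees(noise)
irr_noise = degree_kld(ko, ki)
ko, ki = hvg_degrees(saw)
irr_saw = degree_kld(ko, ki)                           # much larger
```

    A slow-rise/fast-fall sawtooth looks very different under time reversal, so its forward and backward degree distributions diverge, while white noise gives a near-zero score; applied to returns, this is the metric used to classify stress periods and rank companies.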

  9. A simple classification of cranial-nasal-orbital communicating tumors that facilitate choice of surgical approaches: analysis of a series of 32 cases.

    PubMed

    Deng, Yue-Fei; Lei, Bing-Xi; Zheng, Mei-Guang; Zheng, Yi-Qing; Chen, Wei-Liang; Lan, Yu-Qing

    2016-08-01

    Cranial-nasal-orbital communicating tumors involving the anterior and middle skull base are among the most challenging to treat surgically, with high rates of incomplete resection and surgical complications. Currently, there is no recognized classification of tumors with regard to the choice of surgical approaches. From January 2004 to January 2014, we classified 32 cranial-nasal-orbital communicating tumors treated in our center into three types according to the tumor body location, scope of extension and direction of invasion: lateral (type I), central (type II) and extensive (type III). This classification considerably facilitated the choice of surgical routes and significantly influenced the surgical time and amount of hemorrhage during operation. In addition, we emphasized the use of transnasal endoscopy for large and extensive tumors, individualized treatment strategies drafted by a group of multidisciplinary collaborators, and careful reconstruction of the skull base defects. Our treatment strategies achieved good surgical outcomes, with a high ratio of total resection (87.5 %, 28/32, including 16 cases of benign tumors and 12 cases of malignant tumors) and a low percentage of surgical complications (18.8 %, 6/32). Original symptoms were alleviated in 29 patients. The average KPS score improved from 81.25 % preoperatively to 91.25 % at 3 months after surgery. No serious perioperative complications occurred. During the follow-up of 3 years on average, four patients with malignant tumors died, including three who had subtotal resections. The 3-year survival rate of patients with malignant tumors was 78.6 %, and the overall 3-year survival rate was 87.5 %. Our data indicate that the simple classification method has practical significance in guiding the choice of surgical approaches for cranial-nasal-orbital communicating tumors and may be extended to other types of skull base tumors. PMID:27016919

  10. Mixed Spectrum Analysis on fMRI Time-Series.

    PubMed

    Kumar, Arun; Lin, Feng; Rajapakse, Jagath C

    2016-06-01

    Temporal autocorrelation present in functional magnetic resonance image (fMRI) data poses challenges to its analysis. The existing approaches handling autocorrelation in fMRI time-series often presume a specific model of autocorrelation, such as an auto-regressive model. The main limitation here is that the correlation structure of voxels is generally unknown and varies in different brain regions because of different levels of neurogenic noises and pulsatile effects. Enforcing a universal model on all brain regions leads to bias and loss of efficiency in the analysis. In this paper, we propose the mixed spectrum analysis of the voxel time-series to separate the discrete component corresponding to input stimuli and the continuous component carrying temporal autocorrelation. A mixed spectral analysis technique based on the M-spectral estimator is proposed, which effectively removes autocorrelation effects from voxel time-series and identifies significant peaks of the spectrum. As the proposed method does not assume any prior model for the autocorrelation effect in voxel time-series, varying correlation structure among the brain regions does not affect its performance. We have modified the standard M-spectral method for application to a spatial set of time-series by incorporating the contextual information related to the continuous spectrum of neighborhood voxels, thus reducing considerably the computation cost. Likelihood of the activation is predicted by comparing the amplitude of the discrete component at the stimulus frequency of voxels across the brain by using a normal distribution and modeling spatial correlations among the likelihood with a conditional random field. We also demonstrate the application of the proposed method in detecting other desired frequencies. PMID:26800533
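
    The separation of a discrete stimulus line from a continuous autocorrelated background can be illustrated with a plain periodogram (the M-spectral estimator in the paper is considerably more refined than this sketch, and the signal parameters below are invented):

```python
import numpy as np

rng = np.random.default_rng(9)
n = 1024
t = np.arange(n)
f_stim = 0.125                                  # stimulus frequency, on an FFT bin
discrete = np.sin(2 * np.pi * f_stim * t)       # "discrete" stimulus component
cont = np.zeros(n)                              # "continuous" AR(1) background
for i in range(1, n):
    cont[i] = 0.6 * cont[i - 1] + rng.standard_normal()
voxel = discrete + cont

freqs = np.fft.rfftfreq(n, d=1.0)
power = np.abs(np.fft.rfft(voxel - voxel.mean())) ** 2 / n
peak_freq = freqs[np.argmax(power)]             # the stimulus line stands out
```

    The line component concentrates power n·A²/4 in a single bin, while the AR(1) background spreads smoothly across frequencies, so no parametric model of the autocorrelation is needed to locate the stimulus peak.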

  11. Behavior of road accidents: Structural time series approach

    NASA Astrophysics Data System (ADS)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir; Arsad, Zainudin

    2014-12-01

    Road accidents have become a major issue, contributing to an increasing number of deaths. A few researchers suggest that road accidents occur due to road structure and road condition, which may differ according to the area and the volume of traffic at the location. Therefore, this paper examines the behavior of road accidents in four main regions in Peninsular Malaysia by employing a structural time series (STS) approach. STS offers the possibility of modelling unobserved components, such as the trend and seasonal components, which are allowed to vary over time. The results show that the number of road accidents in each region is described by a different model. The results imply that the government, and especially policy makers, should consider implementing different approaches to overcome the increasing number of road accidents in each region.
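
    The simplest structural time series model, the local level model, can be filtered with a few lines of Kalman recursions. A sketch with assumed noise variances on synthetic counts, not the authors' fitted models:

```python
import numpy as np

def local_level_filter(y, var_eps, var_eta):
    """Kalman filter for the local level model:
       y_t = mu_t + eps_t (observation),  mu_{t+1} = mu_t + eta_t (level)."""
    mu = np.zeros(len(y))
    m, p = y[0], var_eps              # initialize at the first observation
    for t, obs in enumerate(y):
        if t > 0:
            p = p + var_eta           # predict: the level diffuses
        k = p / (p + var_eps)         # Kalman gain
        m = m + k * (obs - m)         # update with the new observation
        p = (1.0 - k) * p
        mu[t] = m
    return mu

rng = np.random.default_rng(8)
level = 10.0 + np.cumsum(0.1 * rng.standard_normal(300))   # latent random-walk level
y = level + 0.5 * rng.standard_normal(300)                 # noisy monthly counts, say
mu = local_level_filter(y, var_eps=0.25, var_eta=0.01)
```

    A seasonal component is added the same way as extra state variables; model variants are then compared on AIC and prediction error variance, as in the papers above.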

  12. Time series analysis of temporal networks

    NASA Astrophysics Data System (ADS)

    Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh

    2016-01-01

    A common but important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application-oriented studies only knowing these properties is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without the knowledge of the full network structure; rather, an estimate of some of the properties is sufficient to launch the attack. In this paper, we show that even if the network structure at a future time point is not available one can still manage to estimate its properties. We propose a novel method to map a temporal network to a set of time series instances, analyze them and, using a standard forecast model of time series, try to predict the properties of a temporal network at a later time instance. To this end, we consider eight properties, such as the number of active nodes, average degree, and clustering coefficient, and apply our prediction framework to them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as Auto-Regressive-Integrated-Moving-Average (ARIMA). We use cross-validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency domain properties of the time series obtained from spectrogram analysis could be used to refine the prediction framework by identifying beforehand the cases where the error in prediction is likely to be high. This leads to an improvement of 7.96% (for error level ≤20%) in prediction accuracy on an average across all datasets. As an application we show how such a prediction scheme can be used to launch targeted attacks on temporal networks. Contribution to the Topical Issue
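
    The pipeline can be sketched end to end: generate network snapshots, map them to a property time series (average degree here), and forecast one step ahead with a least-squares autoregression standing in for the full ARIMA machinery; the snapshot model is an invented Erdős–Rényi sequence, not the face-to-face data:

```python
import numpy as np

rng = np.random.default_rng(7)
T, n_nodes = 300, 50
# temporal network: one random snapshot per step with slowly varying density
density = 0.1 + 0.05 * np.sin(2 * np.pi * np.arange(T) / 50)
avg_degree = np.empty(T)
for t in range(T):
    A = np.triu(rng.random((n_nodes, n_nodes)) < density[t], 1)  # upper-triangle edges
    avg_degree[t] = 2 * A.sum() / n_nodes                        # property time series

def ar_forecast(series, p=5):
    """Least-squares AR(p) fit and one-step-ahead forecast."""
    m = len(series)
    X = np.column_stack([np.ones(m - p)] +
                        [series[p - k:m - k] for k in range(1, p + 1)])
    coef, *_ = np.linalg.lstsq(X, series[p:], rcond=None)
    last = np.concatenate([[1.0], series[-1:-p - 1:-1]])
    return float(last @ coef)

pred = ar_forecast(avg_degree[:-1])     # forecast the held-out last snapshot
actual = avg_degree[-1]
```

    Each of the eight properties gets its own series and forecast in the paper; the point of the sketch is that no future snapshot is needed to estimate the property's next value.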

  13. Flutter Analysis for Turbomachinery Using Volterra Series

    NASA Technical Reports Server (NTRS)

    Liou, Meng-Sing; Yao, Weigang

    2014-01-01

    The objective of this paper is to describe an accurate and efficient reduced order modeling method for aeroelastic (AE) analysis and for determining the flutter boundary. Without losing accuracy, we develop a reduced order model based on the Volterra series to achieve significant savings in computational cost. The aerodynamic force is provided by a high-fidelity solution from the Reynolds-averaged Navier-Stokes (RANS) equations; the structural mode shapes are determined from the finite element analysis. The fluid-structure coupling is then modeled by the state-space formulation with the structural displacement as input and the aerodynamic force as output, which in turn acts as an external force to the aeroelastic displacement equation for providing the structural deformation. NASA's rotor 67 blade is used to study its aeroelastic characteristics under the designated operating condition. First, the CFD results are validated against measured data available for the steady state condition. Then, the accuracy of the developed reduced order model is compared with the full-order solutions. Finally the aeroelastic solutions of the blade are computed and a flutter boundary is identified, suggesting that the rotor, with the material property chosen for the study, is structurally stable at the operating condition, free of encountering flutter.

  14. Multifractal Analysis of Aging and Complexity in Heartbeat Time Series

    NASA Astrophysics Data System (ADS)

    Muñoz D., Alejandro; Almanza V., Victor H.; del Río C., José L.

    2004-09-01

    Recently, multifractal analysis has been used intensively in the analysis of physiological time series. In this work we apply multifractal analysis to the study of heartbeat time series from healthy young subjects and other series obtained from healthy elderly subjects. We show that this multifractal formalism could be a useful tool to discriminate between these two kinds of series. We used the algorithm proposed by Chhabra and Jensen, which provides a highly accurate, practical, and efficient method for the direct computation of the singularity spectrum. Aging causes loss of multifractality in the heartbeat time series, meaning that heartbeat time series of elderly persons are less complex than those of young persons. This analysis reveals a new level of complexity characterized by the wide range of exponents necessary to characterize the dynamics of young people.
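
    The Chhabra-Jensen estimator named above computes α(q) and f(q) directly from q-weighted box measures across dyadic scales. A sketch on an exactly multifractal binomial cascade, where the spectrum is known in closed form (heartbeat data would replace the synthetic measure):

```python
import numpy as np

def binomial_measure(n_levels, w=0.6):
    """Dyadic multiplicative cascade with weights (w, 1-w)."""
    p = np.array([1.0])
    for _ in range(n_levels):
        p = (p[:, None] * np.array([w, 1.0 - w])).ravel()
    return p

def singularity_spectrum(p, qs, n_levels):
    """Chhabra-Jensen direct estimates of (alpha(q), f(q)) from box probabilities."""
    out = []
    for q in qs:
        num_a, num_f, log_eps = [], [], []
        m = p.copy()
        for level in range(n_levels, 2, -1):
            nz = m[m > 0]
            mu = nz ** q / np.sum(nz ** q)        # normalized q-weights
            num_a.append(np.sum(mu * np.log(nz)))
            num_f.append(np.sum(mu * np.log(mu)))
            log_eps.append(-level * np.log(2.0))
            m = m.reshape(-1, 2).sum(axis=1)      # coarse-grain to the next level
        out.append((np.polyfit(log_eps, num_a, 1)[0],
                    np.polyfit(log_eps, num_f, 1)[0]))
    return out

p = binomial_measure(12)
spec_m2, spec_0, spec_2 = singularity_spectrum(p, qs=[-2.0, 0.0, 2.0], n_levels=12)
```

    Sweeping q traces out the f(α) curve; a wide α-range signals strong multifractality (the young-subject case above), while a collapsed curve signals its loss.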

  15. Deciding on the best (in this case) approach to time-series forecasting

    SciTech Connect

    Pack, D.J.

    1980-01-01

    This paper was motivated by a Decision Sciences article (v. 10, no. 2, 232-244(April 1979)) that presented comparisons of the adaptive estimation procedure (AEP), adaptive filtering, the Box-Jenkins (BJ) methodology, and multiple regression analysis as they apply to time-series forecasting with single-series models. While such comparisons are to be applauded in general, it is demonstrated that the empirical comparisons of the above paper are quite misleading with respect to choosing between the AEP and BJ approaches. This demonstration is followed by a somewhat philosophical discussion on comparison-of-methods techniques.

  16. Apparatus for statistical time-series analysis of electrical signals

    NASA Technical Reports Server (NTRS)

    Stewart, C. H. (Inventor)

    1973-01-01

    An apparatus for performing statistical time-series analysis of complex electrical signal waveforms, permitting prompt and accurate determination of statistical characteristics of the signal is presented.

  17. Time Series in Education: The Analysis of Daily Attendance in Two High Schools

    ERIC Educational Resources Information Center

    Koopmans, Matthijs

    2011-01-01

    This presentation discusses the use of a time series approach to the analysis of daily attendance in two urban high schools over the course of one school year (2009-10). After establishing that the series for both schools were stationary, they were examined for moving average processes, autoregression, seasonal dependencies (weekly cycles),…

  18. Time-series analysis of offshore-wind-wave groupiness

    SciTech Connect

    Liang, H.B.

    1988-01-01

    This research applies basic time-series-analysis techniques to the complex envelope function, where the study of offshore-wind-wave groupiness is of relevant interest. In constructing the complex envelope function, a phase-unwrapping technique is integrated into the algorithm for estimating the carrier frequency and preserving the phase information for further studies. The Gaussian random wave model forms the basis of the wave-group statistics by envelope-amplitude crossings. Good agreement between the theory and the analysis of field records is found. Other linear models, such as the individual-waves approach and the energy approach, are compared to the envelope approach by analyzing the same set of records. It is found that the character of the filter used in each approach dominates the wave-group statistics. Analyses indicate that deep offshore wind waves are weakly nonlinear and that the Gaussian random assumption remains appropriate for describing the sea state. Wave-group statistics derived from the Gaussian random wave model thus become applicable.
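    A common way to build the complex envelope with a phase-unwrapping step is via the FFT-based analytic signal. The sketch below is a standard textbook construction on a synthetic grouped wave train, not necessarily the thesis's exact algorithm:

```python
import numpy as np

def complex_envelope(eta):
    """Complex envelope of a narrow-band record via the analytic signal.
    Returns the envelope amplitude and a carrier-frequency estimate taken
    from the unwrapped instantaneous phase (rad/sample)."""
    n = len(eta)
    spec = np.fft.fft(eta - np.mean(eta))
    h = np.zeros(n)                  # analytic-signal frequency weights
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    analytic = np.fft.ifft(spec * h)
    amplitude = np.abs(analytic)
    phase = np.unwrap(np.angle(analytic))    # the phase-unwrapping step
    carrier = np.mean(np.diff(phase))        # mean angular frequency
    return amplitude, carrier

t = np.arange(2048)
# grouped wave train: slow envelope modulating a 0.2 rad/sample carrier
eta = (1.0 + 0.3 * np.sin(2 * np.pi * t / 400)) * np.cos(0.2 * t)
amp, w0 = complex_envelope(eta)
```

    Envelope-amplitude crossings of `amp` against a threshold would then define wave groups, as in the envelope approach described above.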

  19. Time-series analysis of Campylobacter incidence in Switzerland.

    PubMed

    Wei, W; Schüpbach, G; Held, L

    2015-07-01

    Campylobacteriosis has been the most common food-associated notifiable infectious disease in Switzerland since 1995. Contact with and ingestion of raw or undercooked broilers are considered the dominant risk factors for infection. In this study, we investigated the temporal relationship between the disease incidence in humans and the prevalence of Campylobacter in broilers in Switzerland from 2008 to 2012. We use a time-series approach to describe the pattern of the disease by incorporating seasonal effects and autocorrelation. The analysis shows that prevalence of Campylobacter in broilers, with a 2-week lag, has a significant impact on disease incidence in humans. Therefore Campylobacter cases in humans can be partly explained by contagion through broiler meat. We also found a strong autoregressive effect in human illness, and a significant increase of illness during Christmas and New Year's holidays. In a final analysis, we corrected for the sampling error of prevalence in broilers and the results gave similar conclusions. PMID:25400006

  20. Three Analysis Examples for Time Series Data

    Technology Transfer Automated Retrieval System (TEKTRAN)

    With improvements in instrumentation and the automation of data collection, plot level repeated measures and time series data are increasingly available to monitor and assess selected variables throughout the duration of an experiment or project. Records and metadata on variables of interest alone o...

  1. Critical Thinking Skills. Analysis and Action Series.

    ERIC Educational Resources Information Center

    Heiman, Marcia; Slomianko, Joshua

    Intended for teachers across grade levels and disciplines, this monograph reviews research on the development of critical thinking skills and introduces a series of these skills that can be incorporated into classroom teaching. Beginning with a definition of critical thinking, the monograph contains two main sections. The first section reviews…

  2. Singular spectrum analysis and forecasting of hydrological time series

    NASA Astrophysics Data System (ADS)

    Marques, C. A. F.; Ferreira, J. A.; Rocha, A.; Castanheira, J. M.; Melo-Gonçalves, P.; Vaz, N.; Dias, J. M.

    The singular spectrum analysis (SSA) technique is applied to some hydrological univariate time series to assess its ability to uncover important information from those series, as well as its forecast skill. The SSA is carried out on annual precipitation, monthly runoff, and hourly water temperature time series. Information is obtained by extracting important components or, when possible, the whole signal from the time series. The extracted components are then subject to forecast by the SSA algorithm. We illustrate the ability of SSA to extract a slowly varying component (i.e. the trend) from the precipitation time series, the trend and oscillatory components from the runoff time series, and the whole signal from the water temperature time series. The SSA was also able to accurately forecast the extracted components of these time series.
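    The decomposition step of SSA (embed, SVD, reconstruct by diagonal averaging) can be written compactly in NumPy. This is a minimal sketch on a synthetic trend-plus-cycle series; window length and component count are illustrative choices, and the forecasting step of the paper is not shown:

```python
import numpy as np

def ssa_decompose(x, window, n_components):
    """Basic SSA: build the trajectory (Hankel) matrix, take its SVD,
    and turn each rank-one term back into a series of length len(x)
    by averaging over anti-diagonals (diagonal averaging)."""
    N = len(x)
    K = N - window + 1
    X = np.column_stack([x[i:i + window] for i in range(K)])  # window x K
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    comps = []
    for k in range(n_components):
        Xk = s[k] * np.outer(U[:, k], Vt[k])
        # anti-diagonal means of Xk reconstruct the k-th component series
        comp = np.array([np.mean(Xk[::-1].diagonal(i - window + 1))
                         for i in range(N)])
        comps.append(comp)
    return np.array(comps)

t = np.arange(200)
x = 0.05 * t + np.sin(2 * np.pi * t / 12)   # trend + annual-like cycle
comps = ssa_decompose(x, window=24, n_components=3)
trend = comps[0]                             # leading component ~ trend
```

    Summing all `min(window, K)` components reproduces the original series exactly, which is a handy correctness check; grouping a few leading components separates trend from oscillations, as described above.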

  3. A Time Series Approach for Soil Moisture Estimation

    NASA Technical Reports Server (NTRS)

    Kim, Yunjin; vanZyl, Jakob

    2006-01-01

    Soil moisture is a key parameter in understanding the global water cycle and in predicting natural hazards. Polarimetric radar measurements have been used for estimating the soil moisture of bare surfaces. In order to estimate soil moisture accurately, the surface roughness effect must be compensated for properly. In addition, these algorithms do not produce accurate results for vegetated surfaces. It is difficult to retrieve the soil moisture of a vegetated surface since the radar backscattering cross section is sensitive to the vegetation structure and to environmental conditions such as the ground slope. Therefore, it is necessary to develop a method to estimate the effects of surface roughness and vegetation reliably. One way to remove the roughness effect and the vegetation contamination is to take advantage of the temporal variation of soil moisture. In order to understand the global hydrologic cycle, it is desirable to measure soil moisture with a one- to two-day revisit. Using these frequent measurements, a time series approach can be implemented to improve the soil moisture retrieval accuracy.

  4. Stratospheric ozone time series analysis using dynamical linear models

    NASA Astrophysics Data System (ADS)

    Laine, Marko; Kyrölä, Erkki

    2013-04-01

    We describe a hierarchical statistical state space model for ozone profile time series. The time series are from satellite measurements by the SAGE II and GOMOS instruments spanning the years 1984-2012. The original data sets are combined and gridded monthly using 10 degree latitude bands, covering 20-60 km with 1 km vertical spacing. Model components include level, trend, and a seasonal effect, with solar activity and the quasi-biennial oscillation as proxy variables. A typical feature of atmospheric time series is that they are not stationary but exhibit both slowly varying and abrupt changes in their distributional properties. These are caused by external forcing such as changes in solar activity or volcanic eruptions. Further, the data sampling is often nonuniform, there are data gaps, and the uncertainty of the observations can vary. When observations are combined from various sources there will be instrument- and retrieval-method-related biases. The differences in sampling also lead to uncertainties. Standard classical ARIMA-type statistical time series methods are mostly useless for atmospheric data. A more general approach makes use of dynamical linear models and Kalman-filter-type sequential algorithms. These state space models assume a linear relationship between the unknown state of the system and the observations, and for the process evolution of the hidden states, yet they are flexible enough to model both smooth trends and sudden changes. The above-mentioned methodological challenges are discussed, together with an analysis of change points in trends related to the recovery of stratospheric ozone. This work is part of the ESA SPIN and ozone CCI projects.
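    The simplest member of the dynamical-linear-model family described here is the local-level model, which already shows how a Kalman-filter recursion copes with the data gaps and varying uncertainty mentioned above. The paper's model is far richer (trend, seasonal, proxies); this NumPy sketch with made-up noise variances is only the skeleton of the idea:

```python
import numpy as np

def kalman_local_level(y, q, r):
    """Kalman filter for the local-level DLM:
        state:  x_t = x_{t-1} + w_t,  w_t ~ N(0, q)
        obs:    y_t = x_t + v_t,      v_t ~ N(0, r)
    NaN observations are skipped (prediction only), one simple way to
    handle nonuniform sampling and data gaps."""
    n = len(y)
    filtered = np.empty(n)
    m = y[~np.isnan(y)][0]           # start at the first valid observation
    P = 1.0                          # initial state variance (assumed)
    for t in range(n):
        P = P + q                    # predict
        if not np.isnan(y[t]):       # update only where data exist
            K = P / (P + r)
            m = m + K * (y[t] - m)
            P = (1.0 - K) * P
        filtered[t] = m
    return filtered

rng = np.random.default_rng(1)
truth = np.cumsum(rng.normal(0.0, 0.1, 300))   # slowly drifting level
obs = truth + rng.normal(0.0, 1.0, 300)        # noisy measurements
obs[50:60] = np.nan                            # a data gap
level = kalman_local_level(obs, q=0.01, r=1.0)
```

    During the gap the filter simply propagates its prediction with growing variance, which is exactly the behavior that makes state space models attractive for irregular satellite records.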

  5. Multifractal Time Series Analysis Based on Detrended Fluctuation Analysis

    NASA Astrophysics Data System (ADS)

    Kantelhardt, Jan; Stanley, H. Eugene; Zschiegner, Stephan; Bunde, Armin; Koscielny-Bunde, Eva; Havlin, Shlomo

    2002-03-01

    In order to develop an easily applicable method for the multifractal characterization of non-stationary time series, we generalize the detrended fluctuation analysis (DFA), which is a well-established method for the determination of the monofractal scaling properties and the detection of long-range correlations. We relate the new multifractal DFA method to the standard partition function-based multifractal formalism, and compare it to the wavelet transform modulus maxima (WTMM) method which is a well-established, but more difficult procedure for this purpose. We employ the multifractal DFA method to determine if the heart rhythm during different sleep stages is characterized by different multifractal properties.
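    The MFDFA recipe the abstract generalizes from DFA follows a fixed sequence: cumulative profile, windowed polynomial detrending, q-th order fluctuation functions, and a log-log fit for the generalized Hurst exponents h(q). The sketch below is a generic implementation on synthetic noise (scales and q values are arbitrary choices), not the authors' code:

```python
import numpy as np

def mfdfa(x, scales, q_values, order=1):
    """Multifractal detrended fluctuation analysis: returns the
    generalized Hurst exponent h(q) for each q in q_values."""
    profile = np.cumsum(x - np.mean(x))
    h = []
    for q in q_values:
        F = []
        for s in scales:
            n = len(profile) // s
            segments = profile[: n * s].reshape(n, s)
            t = np.arange(s)
            # mean squared residual after detrending each window
            f2 = np.array([
                np.mean((seg - np.polyval(np.polyfit(t, seg, order), t)) ** 2)
                for seg in segments
            ])
            if q == 0:
                F.append(np.exp(0.5 * np.mean(np.log(f2))))   # q -> 0 limit
            else:
                F.append(np.mean(f2 ** (q / 2.0)) ** (1.0 / q))
        # slope of log F_q(s) vs log s gives h(q)
        h.append(np.polyfit(np.log(scales), np.log(F), 1)[0])
    return np.array(h)

rng = np.random.default_rng(2)
white = rng.normal(size=8192)   # uncorrelated noise: expect h(q) near 0.5
h = mfdfa(white, scales=[16, 32, 64, 128, 256], q_values=[-2, 0, 2])
```

    For a monofractal series h(q) is flat; a spread of h(q) across q signals multifractality, which is the property compared across sleep stages in the paper.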

  6. Time series analysis of Monte Carlo neutron transport calculations

    NASA Astrophysics Data System (ADS)

    Nease, Brian Robert

    A time series based approach is applied to the Monte Carlo (MC) fission source distribution to calculate the non-fundamental mode eigenvalues of the system. The approach applies Principal Oscillation Patterns (POPs) to the fission source distribution, transforming the problem into a simple autoregressive order one (AR(1)) process. Proof is provided that the stationary MC process is linear to first order approximation, which is a requirement for the application of POPs. The autocorrelation coefficient of the resulting AR(1) process corresponds to the ratio of the desired mode eigenvalue to the fundamental mode eigenvalue. All modern k-eigenvalue MC codes calculate the fundamental mode eigenvalue, so the desired mode eigenvalue can be easily determined. The strength of this approach is contrasted against the Fission Matrix method (FMM) in terms of accuracy versus computer memory constraints. Multi-dimensional problems are considered since the approach has strong potential for use in reactor analysis, and the implementation of the method into production codes is discussed. Lastly, the appearance of complex eigenvalues is investigated and solutions are provided.
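    The core statistical step described here, reading off the ratio of a higher-mode eigenvalue to the fundamental-mode eigenvalue as the autocorrelation coefficient of an AR(1) process, reduces to a lag-1 autocorrelation estimate. The sketch below uses a synthetic AR(1) sequence with an arbitrarily chosen true coefficient as a stand-in for the projected fission-source series; it illustrates the estimator, not the POP projection itself:

```python
import numpy as np

def ar1_coefficient(x):
    """Lag-1 autocorrelation estimate of an AR(1) coefficient. In the
    POP framing above, this ratio plays the role of
    k_higher / k_fundamental."""
    x = x - np.mean(x)
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

# Synthetic stand-in for a projected fission-source series.
rng = np.random.default_rng(3)
rho_true = 0.8                      # pretend eigenvalue ratio (assumed)
y = np.zeros(20000)
for t in range(1, len(y)):
    y[t] = rho_true * y[t - 1] + rng.normal()
rho_hat = ar1_coefficient(y)
```

    Multiplying `rho_hat` by the fundamental-mode eigenvalue, which every k-eigenvalue MC code already reports, would then recover the desired higher-mode eigenvalue.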

  7. A time-series approach to dynamical systems from classical and quantum worlds

    SciTech Connect

    Fossion, Ruben

    2014-01-08

    This contribution discusses some recent applications of time-series analysis in Random Matrix Theory (RMT), and applications of RMT in the statistial analysis of eigenspectra of correlation matrices of multivariate time series.

  8. Assessing Spontaneous Combustion Instability with Nonlinear Time Series Analysis

    NASA Technical Reports Server (NTRS)

    Eberhart, C. J.; Casiano, M. J.

    2015-01-01

    Considerable interest lies in the ability to characterize the onset of spontaneous instabilities within liquid propellant rocket engine (LPRE) combustion devices. Linear techniques, such as fast Fourier transforms, various correlation parameters, and critical damping parameters, have been used at great length for over fifty years. Recently, nonlinear time series methods have been applied to deduce information pertaining to instability incipiency hidden in seemingly stochastic combustion noise. A technique commonly used in the biological sciences, known as Multifractal Detrended Fluctuation Analysis, has been extended to the combustion dynamics field, and is introduced here as a data analysis approach complementary to linear ones. Building on this, a modified technique is leveraged to extract artifacts of impending combustion instability that present themselves prior to growth to limit cycle amplitudes. Analysis is demonstrated on data from J-2X gas generator testing during which a distinct spontaneous instability was observed. Comparisons are made to previous work wherein the data were characterized using linear approaches. Verification of the technique is performed by examining idealized signals and comparing two separate, independently developed tools.

  9. Analysis of Multispectral Time Series for supporting Forest Management Plans

    NASA Astrophysics Data System (ADS)

    Simoniello, T.; Carone, M. T.; Costantini, G.; Frattegiani, M.; Lanfredi, M.; Macchiato, M.

    2010-05-01

    Adequate forest management requires specific plans based on updated and detailed mapping. Multispectral satellite time series have been largely applied to forest monitoring and studies at different scales thanks to their capability of providing synoptic information on some basic parameters descriptive of vegetation distribution and status. As a low-cost tool for supporting forest management plans in an operative context, we tested the use of Landsat-TM/ETM time series (1987-2006) in the high Agri Valley (Southern Italy) for planning field surveys as well as for the integration of existing cartography. As a preliminary activity to make all scenes radiometrically consistent, the no-change regression normalization was applied to the time series; then all the data concerning available forest maps, municipal boundaries, water basins, rivers, and roads were overlapped in a GIS environment. From the 2006 image we elaborated the NDVI map and analyzed the distribution for each land cover class. To separate the physiological variability and identify the anomalous areas, a threshold on the distributions was applied. To label the non-homogeneous areas, a multitemporal analysis was performed by separating heterogeneity due to cover changes from that linked to basilar unit mapping and classification labelling aggregations. Then a map of priority areas was produced to support the field survey plan. To analyze the territorial evolution, the historical land cover maps were elaborated by adopting a hybrid classification approach based on a preliminary segmentation, the identification of training areas, and a subsequent maximum likelihood categorization. Such an analysis was fundamental for the general assessment of the territorial dynamics and, in particular, for the evaluation of the efficacy of past intervention activities.

  10. The U-series comminution approach: where to from here

    NASA Astrophysics Data System (ADS)

    Handley, Heather; Turner, Simon; Afonso, Juan; Turner, Michael; Hesse, Paul

    2015-04-01

    Quantifying the rates of landscape evolution in response to climate change is inhibited by the difficulty of dating the formation of continental detrital sediments. The 'comminution age' dating model of DePaolo et al. (2006) hypothesises that the measured disequilibria between U-series nuclides (234U and 238U) in fine-grained continental (detrital) sediments can be used to calculate the time elapsed since mechanical weathering of a grain to the threshold size ( 50 µm). The comminution age includes the time that a particle has been mobilised in transport, held in temporary storage (e.g., soils and floodplains) and the time elapsed since final deposition to present day. Therefore, if the deposition age of sediment can be constrained independently, for example via optically stimulated luminescence (OSL) dating, the residence time of sediment (e.g., a palaeochannel deposit) can be determined. Despite the significant potential of this approach, there is still much work to be done before meaningful absolute comminution ages can be obtained. The calculated recoil loss factor and comminution age are highly dependent on the method of recoil loss factor determination used and the inherent assumptions. We present new and recently published uranium isotope data for aeolian sediment deposits, leached and unleached palaeochannel sediments and bedrock samples from Australia to exemplify areas of current uncertainty in the comminution age approach. In addition to the information gained from natural samples, Monte Carlo simulations have been conducted for a synthetic sediment sample to determine the individual and combined comminution age uncertainties associated with each input variable. Using a reasonable associated uncertainty for each input factor and including variations in the source rock and measured (234U/238U) ratios, the total combined uncertainty on comminution age in our simulation (for two methods of recoil loss factor estimation: weighted geometric and surface area

  11. Evolutionary factor analysis of replicated time series.

    PubMed

    Motta, Giovanni; Ombao, Hernando

    2012-09-01

    In this article, we develop a novel method that explains the dynamic structure of multi-channel electroencephalograms (EEGs) recorded from several trials in a motor-visual task experiment. Preliminary analyses of our data suggest two statistical challenges. First, the variance at each channel and cross-covariance between each pair of channels evolve over time. Moreover, the cross-covariance profiles display a common structure across all pairs, and these features consistently appear across all trials. In the light of these features, we develop a novel evolutionary factor model (EFM) for multi-channel EEG data that systematically integrates information across replicated trials and allows for smoothly time-varying factor loadings. The individual EEGs series share common features across trials, thus, suggesting the need to pool information across trials, which motivates the use of the EFM for replicated time series. We explain the common co-movements of EEG signals through the existence of a small number of common factors. These latent factors are primarily responsible for processing the visual-motor task which, through the loadings, drive the behavior of the signals observed at different channels. The estimation of the time-varying loadings is based on the spectral decomposition of the estimated time-varying covariance matrix. PMID:22364516

  12. Performance of multifractal detrended fluctuation analysis on short time series

    NASA Astrophysics Data System (ADS)

    López, Juan Luis; Contreras, Jesús Guillermo

    2013-02-01

    The performance of multifractal detrended fluctuation analysis on short time series is evaluated for synthetic samples of several mono- and multifractal models. The reconstruction of the generalized Hurst exponents is used to determine the range of applicability of the method and the precision of its results as a function of the decreasing length of the series. As an application, the series of the daily exchange rate between the U.S. dollar and the euro is studied.

  13. Nonlinear time series analysis of epileptic human electroencephalogram (EEG)

    NASA Astrophysics Data System (ADS)

    Li, Dingzhou

    The problem of seizure anticipation in patients with epilepsy has attracted significant attention in the past few years. In this paper we discuss two approaches, using methods of nonlinear time series analysis applied to scalp electrode recordings, which are able to distinguish between epochs temporally distant from, and just prior to, the onset of a seizure in patients with temporal lobe epilepsy. First we describe a method involving a comparison of recordings taken from electrodes adjacent to and remote from the site of the seizure focus. In particular, we define a nonlinear quantity which we call marginal predictability. This quantity is computed using data from remote and from adjacent electrodes. We find that the difference between the marginal predictabilities computed for the remote and adjacent electrodes decreases several tens of minutes prior to seizure onset, compared to its value interictally. We also show that these differences in marginal predictability are independent of the behavioral state of the patient. Next we examine the phase coherence between different electrodes, both at long range and at short range. When times are distant from seizure onsets ("interictally"), epileptic patients have lower long-range phase coherence in the delta (1-4 Hz) and beta (18-30 Hz) frequency bands compared to nonepileptic subjects. When seizures approach ("preictally"), we observe an increase in phase coherence in the beta band. However, interictally there is no difference in short-range phase coherence between this cohort of patients and non-epileptic subjects. Preictally, short-range phase coherence also increases in the alpha (10-13 Hz) and the beta band. Next we apply the marginal predictability to the phase difference time series. Such marginal predictabilities are lower in the patients than in the non-epileptic subjects. However, when seizures approach, the former moves asymptotically towards the latter.
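    A standard way to quantify the phase coherence between two electrode signals is the phase-locking value computed from analytic-signal phases. The sketch below is a generic construction on synthetic signals (the dissertation's exact coherence measure may differ), with all signal parameters chosen arbitrarily:

```python
import numpy as np

def instantaneous_phase(sig):
    """Phase of the FFT-based analytic signal (standard construction)."""
    n = len(sig)
    spec = np.fft.fft(sig - np.mean(sig))
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.angle(np.fft.ifft(spec * h))

def phase_coherence(x, y):
    """Phase-locking value in [0, 1]: near 1 for a constant phase lag,
    near 0 for unrelated signals."""
    dphi = instantaneous_phase(x) - instantaneous_phase(y)
    return np.abs(np.mean(np.exp(1j * dphi)))

t = np.arange(4096)
rng = np.random.default_rng(6)
a = np.sin(0.1 * t) + 0.1 * rng.normal(size=4096)
b = np.sin(0.1 * t + 0.5) + 0.1 * rng.normal(size=4096)  # fixed phase lag
c = rng.normal(size=4096)                                # unrelated noise
plv_locked = phase_coherence(a, b)
plv_random = phase_coherence(a, c)
```

    In the study's terms, an interictal-to-preictal rise of such a coherence value in the beta band is the signature used for seizure anticipation.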

  14. Biological Time Series Analysis Using a Context Free Language: Applicability to Pulsatile Hormone Data

    PubMed Central

    Dean, Dennis A.; Adler, Gail K.; Nguyen, David P.; Klerman, Elizabeth B.

    2014-01-01

    We present a novel approach for analyzing biological time-series data using a context-free language (CFL) representation that allows the extraction and quantification of important features from the time-series. This representation results in Hierarchically AdaPtive (HAP) analysis, a suite of multiple complementary techniques that enable rapid analysis of data and does not require the user to set parameters. HAP analysis generates hierarchically organized parameter distributions that allow multi-scale components of the time-series to be quantified and includes a data analysis pipeline that applies recursive analyses to generate hierarchically organized results that extend traditional outcome measures such as pharmacokinetics and inter-pulse interval. Pulsicons, a novel text-based time-series representation also derived from the CFL approach, are introduced as an objective qualitative comparison nomenclature. We apply HAP to the analysis of 24 hours of frequently sampled pulsatile cortisol hormone data, which has known analysis challenges, from 14 healthy women. HAP analysis generated results in seconds and produced dozens of figures for each participant. The results quantify the observed qualitative features of cortisol data as a series of pulse clusters, each consisting of one or more embedded pulses, and identify two ultradian phenotypes in this dataset. HAP analysis is designed to be robust to individual differences and to missing data and may be applied to other pulsatile hormones. Future work can extend HAP analysis to other time-series data types, including oscillatory and other periodic physiological signals. PMID:25184442

  15. The bone lamina technique: a novel approach for lateral ridge augmentation--a case series.

    PubMed

    Wachtel, Hannes; Fickl, Stefan; Hinze, Marc; Bolz, Wolfgang; Thalmair, Tobias

    2013-01-01

    The goal of this case series is to present a novel treatment approach for lateral ridge augmentation. Four systemically healthy patients (aged 48 to 59 years) with inadequate dental alveolar ridge widths were selected for inclusion. All ridge defects were augmented using a xenogeneic cortical bone shield in combination with particulated bone substitutes and a thin collagen barrier. At baseline and after 6 months, digital cone beam computed tomography scans were performed. Biopsy specimens were harvested at reentry surgery and processed for histologic analysis. The results revealed a sufficient amount of bone structure for implant placement without additional augmentation procedures. The histologic analysis demonstrated that new bone formation had taken place and the bone shield had resorbed entirely. This case series indicates that the bone lamina technique has the biologic and mechanical properties to successfully achieve hard tissue augmentation of deficient ridges. PMID:23820709

  16. Analysis of Time-Series Quasi-Experiments. Final Report.

    ERIC Educational Resources Information Center

    Glass, Gene V.; Maguire, Thomas O.

    The objective of this project was to investigate the adequacy of statistical models developed by G. E. P. Box and G. C. Tiao for the analysis of time-series quasi-experiments: (1) The basic model developed by Box and Tiao is applied to actual time-series experiment data from two separate experiments, one in psychology and one in educational…

  17. A statistical approach for segregating cognitive task stages from multivariate fMRI BOLD time series.

    PubMed

    Demanuele, Charmaine; Bähner, Florian; Plichta, Michael M; Kirsch, Peter; Tost, Heike; Meyer-Lindenberg, Andreas; Durstewitz, Daniel

    2015-01-01

    Multivariate pattern analysis can reveal new information from neuroimaging data to illuminate human cognition and its disturbances. Here, we develop a methodological approach, based on multivariate statistical/machine learning and time series analysis, to discern cognitive processing stages from functional magnetic resonance imaging (fMRI) blood oxygenation level dependent (BOLD) time series. We apply this method to data recorded from a group of healthy adults whilst performing a virtual reality version of the delayed win-shift radial arm maze (RAM) task. This task has been frequently used to study working memory and decision making in rodents. Using linear classifiers and multivariate test statistics in conjunction with time series bootstraps, we show that different cognitive stages of the task, as defined by the experimenter, namely, the encoding/retrieval, choice, reward and delay stages, can be statistically discriminated from the BOLD time series in brain areas relevant for decision making and working memory. Discrimination of these task stages was significantly reduced during poor behavioral performance in dorsolateral prefrontal cortex (DLPFC), but not in the primary visual cortex (V1). Experimenter-defined dissection of time series into class labels based on task structure was confirmed by an unsupervised, bottom-up approach based on Hidden Markov Models. Furthermore, we show that different groupings of recorded time points into cognitive event classes can be used to test hypotheses about the specific cognitive role of a given brain region during task execution. We found that whilst the DLPFC strongly differentiated between task stages associated with different memory loads, but not between different visual-spatial aspects, the reverse was true for V1. 
Our methodology illustrates how different aspects of cognitive information processing during one and the same task can be separated and attributed to specific brain regions based on information contained in

  18. A statistical approach for segregating cognitive task stages from multivariate fMRI BOLD time series

    PubMed Central

    Demanuele, Charmaine; Bähner, Florian; Plichta, Michael M.; Kirsch, Peter; Tost, Heike; Meyer-Lindenberg, Andreas; Durstewitz, Daniel

    2015-01-01

    Multivariate pattern analysis can reveal new information from neuroimaging data to illuminate human cognition and its disturbances. Here, we develop a methodological approach, based on multivariate statistical/machine learning and time series analysis, to discern cognitive processing stages from functional magnetic resonance imaging (fMRI) blood oxygenation level dependent (BOLD) time series. We apply this method to data recorded from a group of healthy adults whilst performing a virtual reality version of the delayed win-shift radial arm maze (RAM) task. This task has been frequently used to study working memory and decision making in rodents. Using linear classifiers and multivariate test statistics in conjunction with time series bootstraps, we show that different cognitive stages of the task, as defined by the experimenter, namely, the encoding/retrieval, choice, reward and delay stages, can be statistically discriminated from the BOLD time series in brain areas relevant for decision making and working memory. Discrimination of these task stages was significantly reduced during poor behavioral performance in dorsolateral prefrontal cortex (DLPFC), but not in the primary visual cortex (V1). Experimenter-defined dissection of time series into class labels based on task structure was confirmed by an unsupervised, bottom-up approach based on Hidden Markov Models. Furthermore, we show that different groupings of recorded time points into cognitive event classes can be used to test hypotheses about the specific cognitive role of a given brain region during task execution. We found that whilst the DLPFC strongly differentiated between task stages associated with different memory loads, but not between different visual-spatial aspects, the reverse was true for V1. 
Our methodology illustrates how different aspects of cognitive information processing during one and the same task can be separated and attributed to specific brain regions based on information contained in

  19. Automatising the analysis of stochastic biochemical time-series

    PubMed Central

    2015-01-01

    Background Mathematical and computational modelling of biochemical systems has seen a lot of effort devoted to the definition and implementation of high-performance mechanistic simulation frameworks. Within these frameworks it is possible to analyse complex models under a variety of configurations, eventually selecting the best setting of, e.g., parameters for a target system. Motivation This operational pipeline relies on the ability to interpret the predictions of a model, often represented as simulation time-series. Thus, an efficient data analysis pipeline is crucial to automatise time-series analyses, bearing in mind that errors in this phase might mislead the modeller's conclusions. Results For this reason we have developed an intuitive framework-independent Python tool to automate analyses common to a variety of modelling approaches. These include assessment of useful non-trivial statistics for simulation ensembles, e.g., estimation of master equations. Intuitive and domain-independent batch scripts will allow the researcher to automatically prepare reports, thus speeding up the usual model-definition, testing and refinement pipeline. PMID:26051821

  20. Statistical Evaluation of Time Series Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Benignus, V. A.

    1973-01-01

    The performance of a modified version of NASA's multivariate spectrum analysis program is discussed. A multiple regression model was used to make the revisions. Performance improvements were documented and compared to the standard fast Fourier transform by Monte Carlo techniques.

  1. Time series analysis of air pollutants in Beirut, Lebanon.

    PubMed

    Farah, Wehbeh; Nakhlé, Myriam Mrad; Abboud, Maher; Annesi-Maesano, Isabella; Zaarour, Rita; Saliba, Nada; Germanos, Georges; Gerard, Jocelyne

    2014-12-01

    This study reports for the first time a time series analysis of daily urban air pollutant levels (CO, NO, NO2, O3, PM10, and SO2) in Beirut, Lebanon. The study examines data obtained between September 2005 and July 2006, and their descriptive analysis shows long-term variations of daily levels of air pollution concentrations. Strong persistence of these daily levels is identified in the time series using an autocorrelation function, except for SO2. Time series of standardized residual values (SRVs) are also calculated to compare fluctuations of the time series with different levels. Time series plots of the SRVs indicate that NO and NO2 had similar temporal fluctuations. However, NO2 and O3 had opposite temporal fluctuations, attributable to weather conditions and the accumulation of vehicular emissions. The effects of both desert dust storms and airborne particulate matter resulting from the Lebanon War in July 2006 are also discernible in the SRV plots. PMID:25150052
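    The persistence check described here, an autocorrelation function applied to daily pollutant levels, is straightforward to compute. The sketch below runs on a synthetic persistent series (the AR coefficient and length are arbitrary stand-ins for the Beirut data, which we do not have):

```python
import numpy as np

def acf(x, max_lag):
    """Sample autocorrelation function up to max_lag, used to check
    day-to-day persistence of a daily series."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([1.0] + [np.dot(x[:-k], x[k:]) / denom
                             for k in range(1, max_lag + 1)])

rng = np.random.default_rng(7)
# surrogate "daily pollutant" series with strong persistence
z = np.zeros(365)
for t in range(1, 365):
    z[t] = 0.9 * z[t - 1] + rng.normal()
rho = acf(z, max_lag=5)
```

    A slowly decaying `rho`, as found here for all pollutants except SO2, indicates strong persistence; standardized residual values (SRVs) would then be computed from each series' mean and standard deviation to compare fluctuations across pollutants.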

  2. Wavelet transform approach for fitting financial time series data

    NASA Astrophysics Data System (ADS)

    Ahmed, Amel Abdoullah; Ismail, Mohd Tahir

    2015-10-01

    This study investigates a newly developed technique, a combined wavelet filter and vector error correction (VEC) model, to study the dynamic relationships among financial time series. A wavelet filter has been used to remove noise from the daily data of the NASDAQ stock market of the US and of three stock markets of the Middle East and North Africa (MENA) region, namely Egypt, Jordan, and Istanbul. The data cover the period from 6/29/2001 to 5/5/2009. The returns of both the wavelet-filtered series and the original series are then analyzed by a cointegration test and the VEC model. The results show that the cointegration test affirms the existence of cointegration between the studied series, and that there is a long-term relationship between the US stock market and the MENA stock markets. A comparison between the proposed model and the traditional model demonstrates that the proposed model (DWT with VEC model) outperforms the traditional model (VEC model alone) in fitting the financial stock market series, and reveals real information about the relationships among the stock markets.
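To make the filtering step concrete, here is a minimal, hand-rolled one-level Haar wavelet denoiser (an illustrative stand-in for the Daubechies-style filters typically used; not the authors' implementation):

```python
import numpy as np

# One level of a Haar wavelet decomposition used as a crude noise filter:
# small high-frequency detail coefficients are zeroed before inversion.
def haar_denoise(x, threshold):
    x = np.asarray(x, dtype=float)
    n = len(x) // 2 * 2
    approx = (x[0:n:2] + x[1:n:2]) / np.sqrt(2)   # low-frequency content
    detail = (x[0:n:2] - x[1:n:2]) / np.sqrt(2)   # high-frequency "noise"
    detail[np.abs(detail) < threshold] = 0.0      # hard-threshold small details
    out = np.empty(n)
    out[0:n:2] = (approx + detail) / np.sqrt(2)   # inverse transform
    out[1:n:2] = (approx - detail) / np.sqrt(2)
    return out

rng = np.random.default_rng(2)
prices = np.cumsum(rng.normal(0, 1, 512)) + 100   # synthetic price path
noisy = prices + rng.normal(0, 0.2, 512)
smooth = haar_denoise(noisy, threshold=0.5)

# Threshold 0 gives perfect reconstruction (the transform is orthogonal).
print(np.allclose(haar_denoise(noisy, 0.0), noisy))  # True
```
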

  3. Rice-planted area extraction by time series analysis of ENVISAT ASAR WS data using a phenology-based classification approach: A case study for Red River Delta, Vietnam

    NASA Astrophysics Data System (ADS)

    Nguyen, D.; Wagner, W.; Naeimi, V.; Cao, S.

    2015-04-01

    Recent studies have shown the potential of Synthetic Aperture Radar (SAR) for mapping rice fields and some other vegetation types. For rice field classification, conventional techniques have mostly been used, including manual threshold-based and supervised classification approaches. The challenge of the threshold-based approach is to find acceptable thresholds for each individual SAR scene. Furthermore, the influence of the local incidence angle on backscatter hinders using a single threshold for the entire scene. Similarly, the supervised classification approach requires different training samples for different output classes. In the case of rice crops, supervised classification using temporal data requires different training datasets to perform the classification procedure, which might lead to inconsistent mapping results. In this study we present an automatic method to identify rice crop areas by extracting phenological parameters after performing an empirical regression-based normalization of the backscatter to a reference incidence angle. The method is evaluated in the Red River Delta (RRD), Vietnam using the time series of ENVISAT Advanced SAR (ASAR) Wide Swath (WS) mode data. The results of the rice mapping algorithm, compared to the reference data, indicate Completeness (User accuracy), Correctness (Producer accuracy) and Quality (Overall accuracy) values of 88.8%, 92.5% and 83.9%, respectively. The total area of the classified rice fields corresponds to the total rice cultivation area given by the official statistics in Vietnam (R² = 0.96). The results indicate that applying a phenology-based classification approach to backscatter time series with incidence angle normalization can achieve high classification accuracies.
In addition, the method is not only useful for large-scale early mapping of rice fields in the Red River Delta using current and future C-band Sentinel-1A&B backscatter data but might also be applied to other rice

  4. Wavelet analysis and scaling properties of time series

    NASA Astrophysics Data System (ADS)

    Manimaran, P.; Panigrahi, Prasanta K.; Parikh, Jitendra C.

    2005-10-01

    We propose a wavelet based method for the characterization of the scaling behavior of nonstationary time series. It makes use of the built-in ability of the wavelets for capturing the trends in a data set, in variable window sizes. Discrete wavelets from the Daubechies family are used to illustrate the efficacy of this procedure. After studying binomial multifractal time series with the present and earlier approaches of detrending for comparison, we analyze the time series of averaged spin density in the 2D Ising model at the critical temperature, along with several experimental data sets possessing multifractal behavior.

  5. Wavelet analysis and scaling properties of time series.

    PubMed

    Manimaran, P; Panigrahi, Prasanta K; Parikh, Jitendra C

    2005-10-01

    We propose a wavelet based method for the characterization of the scaling behavior of nonstationary time series. It makes use of the built-in ability of the wavelets for capturing the trends in a data set, in variable window sizes. Discrete wavelets from the Daubechies family are used to illustrate the efficacy of this procedure. After studying binomial multifractal time series with the present and earlier approaches of detrending for comparison, we analyze the time series of averaged spin density in the 2D Ising model at the critical temperature, along with several experimental data sets possessing multifractal behavior. PMID:16383481

  6. Time series expression analyses using RNA-seq: a statistical approach.

    PubMed

    Oh, Sunghee; Song, Seongho; Grabowski, Gregory; Zhao, Hongyu; Noonan, James P

    2013-01-01

    RNA-seq is becoming the de facto standard approach for transcriptome analysis with ever-reducing cost. It has considerable advantages over conventional technologies (microarrays) because it allows for direct identification and quantification of transcripts. Many time series RNA-seq datasets have been collected to study the dynamic regulations of transcripts. However, statistically rigorous and computationally efficient methods are needed to explore the time-dependent changes of gene expression in biological systems. These methods should explicitly account for the dependencies of expression patterns across time points. Here, we discuss several methods that can be applied to model timecourse RNA-seq data, including statistical evolutionary trajectory index (SETI), autoregressive time-lagged regression (AR(1)), and hidden Markov model (HMM) approaches. We use three real datasets and simulation studies to demonstrate the utility of these dynamic methods in temporal analysis. PMID:23586021
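The AR(1) idea mentioned here can be sketched with ordinary least squares on a single simulated series (illustrative only; real time-course RNA-seq has far fewer time points and count-valued data):

```python
import numpy as np

# Illustrative AR(1) fit: regress each value on its predecessor.
def fit_ar1(x):
    """Least-squares estimate of phi in x_t = c + phi * x_{t-1} + noise."""
    x = np.asarray(x, dtype=float)
    X = np.column_stack([np.ones(len(x) - 1), x[:-1]])
    coef, *_ = np.linalg.lstsq(X, x[1:], rcond=None)
    return coef[1]

rng = np.random.default_rng(3)
# Simulated expression trace with true phi = 0.7 (a long series is used
# here only so the estimate is stable; real time courses are much shorter).
x = np.zeros(500)
for t in range(1, 500):
    x[t] = 1.0 + 0.7 * x[t - 1] + rng.normal(scale=0.5)

phi = fit_ar1(x)
print(0.5 < phi < 0.9)
```
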

  7. Fractal and natural time analysis of geoelectrical time series

    NASA Astrophysics Data System (ADS)

    Ramirez Rojas, A.; Moreno-Torres, L. R.; Cervantes, F.

    2013-05-01

    In this work we present the analysis of geoelectric time series linked with two earthquakes of M=6.6 and M=7.4. These time series were monitored at the South Pacific Mexican coast, which is the most important active seismic subduction zone in México. The geoelectric time series were analyzed using two complementary methods: a fractal analysis, by means of detrended fluctuation analysis (DFA) in conventional time, and the power spectrum defined in the natural time domain (NTD). In conventional time we found long-range correlations prior to the EQ occurrences; simultaneously, in NTD the behavior of the power spectrum suggests the possible existence of seismic electric signals (SES) similar to those previously reported in equivalent time series monitored in Greece prior to earthquakes of relevant magnitude.

  8. Time Series Analysis of Insar Data: Methods and Trends

    NASA Technical Reports Server (NTRS)

    Osmanoglu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cano-Cabral, Enrique

    2015-01-01

    Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.
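A one-dimensional toy version of the wrapping problem can be demonstrated with numpy (real InSAR unwrapping is a much harder two-dimensional, spatio-temporal problem):

```python
import numpy as np

# Toy illustration of phase wrapping: interferometric phase is only known
# modulo 2*pi, so motion larger than half a wavelength wraps around.
true_phase = np.linspace(0, 6 * np.pi, 200)   # steadily increasing deformation signal
wrapped = np.angle(np.exp(1j * true_phase))   # what the interferogram records, in (-pi, pi]
unwrapped = np.unwrap(wrapped)                # 1-D temporal unwrapping

print(np.allclose(unwrapped, true_phase))  # True: the ramp is recovered
```
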

  9. WUATSA: Weighted usable area time series analysis

    SciTech Connect

    Franc, G.M.

    1995-12-31

    As stated in my paper entitled "FISHN-Minimum Flow Selection Made Easy," there continue to exist differences of opinion between environmental resource agencies (Agencies) and power producers in the interpretation of Weighted Usable Area (WUA) versus flow data as a tool for making minimum flow recommendations. WUA-flow curves are developed from Instream Flow Incremental Methodology (IFIM) studies. Each point on a WUA-flow curve defines the usable habitat area created within a bypassed reach, for a specific species and life stage, due to a specified minimum flow being constantly maintained within that reach. In the FISHN paper I discussed the Federal Energy Regulatory Commission's (FERC's) effort to standardize the use of WUA-flow data to assist in minimum flow selection, as proposed in their article entitled "Evaluating Relicense Proposals at the Federal Energy Regulatory Commission." This FERC paper advanced a technique which has subsequently become known as the FARGO method (named after the primary author). The FISHN paper initially critiqued FARGO and then focused discussion on an alternative approach (FISHN) which is an extension of the IFIM methodology.

  10. The Effectiveness of Blind Source Separation Using Independent Component Analysis for GNSS Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Yan, Jun; Dong, Danan; Chen, Wen

    2016-04-01

    With the development of GNSS technology and the improvement of its positioning accuracy, observational data obtained by GNSS are widely used in Earth space geodesy and geodynamics research. The GNSS time series of observation stations contain a wealth of information, including geographical space changes, deformation of the Earth, migration of subsurface material, instantaneous deformation of the Earth, weak deformation and other blind signals. In order to decompose the instantaneous underground deformation, weak deformation and other blind signals hidden in GNSS time series, we apply Independent Component Analysis (ICA) to daily station coordinate time series of the Southern California Integrated GPS Network. ICA is based on the statistical characteristics of the observed signal: it exploits non-Gaussianity and independence to process the time series and recover the source signals of the underlying geophysical events. As part of the post-processing of precise GNSS time series, this paper examines the series using the principal component analysis (PCA) module of QOCA and an ICA algorithm to separate the source signals, and then compares these two separation techniques, PCA and ICA, for isolating the signals related to geophysical disturbances from the observations. After analyzing the separation results, we demonstrate that in the case of multiple factors PCA suffers from ambiguity in the separation of source signals, i.e., the attribution of the separated components is not clear, whereas ICA performs better; ICA is therefore the more suitable choice for GNSS time series in which the combination of signal sources is unknown.
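The PCA-vs-ICA point can be illustrated with a classic blind-source-separation toy problem using scikit-learn's FastICA (synthetic signals standing in for GNSS coordinate series; not the QOCA workflow described in the abstract):

```python
import numpy as np
from sklearn.decomposition import FastICA

# Conceptual demo (not GNSS data): ICA unmixing two independent sources
# that were linearly mixed, as in blind source separation of station series.
rng = np.random.default_rng(4)
t = np.linspace(0, 8, 2000)
s1 = np.sin(2 * t)                      # e.g. a seasonal-like oscillation
s2 = np.sign(np.sin(3 * t))             # e.g. a step-like deformation signal
S = np.column_stack([s1, s2])
A = np.array([[1.0, 0.5], [0.5, 1.0]])  # mixing: each "station" sees both sources
X = S @ A.T

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(X)

# Each recovered component should correlate strongly with one true source
# (up to sign and ordering, which ICA cannot fix).
corr = np.abs(np.corrcoef(recovered.T, S.T))[:2, 2:]
print((corr.max(axis=1) > 0.9).all())
```
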

  11. Visibility graph analysis on quarterly macroeconomic series of China based on complex network theory

    NASA Astrophysics Data System (ADS)

    Wang, Na; Li, Dong; Wang, Qiwen

    2012-12-01

    The visibility graph approach and complex network theory provide new insight into time series analysis. In this paper we further explore what the visibility graph inherits from the original time series. We found that the degree distributions of visibility graphs extracted from pseudo Brownian motion series, obtained by the frequency domain algorithm, exhibit exponential behavior, in which the exponential exponent is a binomial function of the Hurst index inherent in the time series. Our simulations showed that the quantitative relations between the Hurst indexes and the exponents of the degree distribution function differ across series, and that the visibility graph inherits some important features of the original time series. Further, we convert several quarterly macroeconomic series of China, including the growth rates of value-added of the three industries and the growth rate of Gross Domestic Product (GDP), to graphs by the visibility algorithm and explore the topological properties of the networks associated with the four macroeconomic series, namely the degree distribution and correlations, the clustering coefficient, the average path length, and community structure. Based on complex network analysis we find that the degree distributions of the networks associated with the growth rates of value-added of the three industries are almost exponential, while the degree distributions of the networks associated with the growth rates of GDP are scale free. We also discuss the assortativity and disassortativity of the four associated networks as they relate to the evolutionary process of the original macroeconomic series. All the constructed networks have “small-world” features. The community structures of the associated networks suggest dynamic changes in the original macroeconomic series. We also detected relationships among government policy changes, the community structures of associated networks and macroeconomic dynamics. We find great influences of government
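The natural visibility algorithm itself is compact enough to sketch directly (a brute-force O(n^3) illustration; the function name is made up for this example):

```python
import numpy as np

# Natural visibility: points (a, y[a]) and (b, y[b]) are linked if every
# intermediate point lies strictly below the straight line between them.
def visibility_edges(y):
    y = np.asarray(y, dtype=float)
    edges = set()
    n = len(y)
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                y[c] < y[a] + (y[b] - y[a]) * (c - a) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.add((a, b))
    return edges

series = [3.0, 1.0, 2.0, 0.5, 4.0]
edges = visibility_edges(series)
degree = [sum(1 for e in edges if i in e) for i in range(len(series))]
print(sorted(edges))
print(degree)
```

The degree sequence of such graphs is what the abstract's degree-distribution analysis is built on.
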

  12. Time series data analysis using DFA

    NASA Astrophysics Data System (ADS)

    Okumoto, A.; Akiyama, T.; Sekino, H.; Sumi, T.

    2014-02-01

    Detrended fluctuation analysis (DFA) was originally developed for the evaluation of DNA sequences and of interbeat intervals in heart rate variability (HRV), but it is now used to obtain various kinds of biological information. In this study we perform DFA on artificially generated data for which we already know the relationship between the signal and the physical event causing it. We generate artificial data using molecular dynamics: the Brownian motion of a polymer under an external force is investigated. In order to generate artificial fluctuations in the physical properties, we introduce obstacle pillars fixed to nanostructures. Using different conditions, such as the presence or absence of obstacles, the external field, and the polymer length, we perform DFA on the energies and positions of the polymer.
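For reference, the core of DFA fits in a few lines; the sketch below (first-order detrending, non-overlapping windows) estimates the scaling exponent alpha of white noise, which should come out near the theoretical value 0.5:

```python
import numpy as np

# Minimal DFA sketch: integrate the series, detrend in windows of size s,
# and see how the RMS fluctuation F(s) scales with s (slope = alpha).
def dfa(x, scales):
    y = np.cumsum(np.asarray(x, dtype=float) - np.mean(x))  # profile
    F = []
    for s in scales:
        n_win = len(y) // s
        sq = []
        for w in range(n_win):
            seg = y[w * s:(w + 1) * s]
            t = np.arange(s)
            fit = np.polyval(np.polyfit(t, seg, 1), t)  # local linear trend
            sq.append(np.mean((seg - fit) ** 2))
        F.append(np.sqrt(np.mean(sq)))
    return np.array(F)

rng = np.random.default_rng(5)
white = rng.normal(size=4096)
scales = np.array([8, 16, 32, 64, 128])
F = dfa(white, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]  # theory: ~0.5 for white noise
print(alpha)
```
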

  13. Schoolwide Approaches to Discipline. The Informed Educator Series.

    ERIC Educational Resources Information Center

    Porch, Stephanie

    Although there are no simple solutions for how to turn around a school with serious discipline problems, schoolwide approaches have been effective, according to this report. The report examines research on schoolwide approaches to discipline and discusses the characteristics of programs that promote a culture of safety and support, improved…

  14. Aroma characterization based on aromatic series analysis in table grapes.

    PubMed

    Wu, Yusen; Duan, Shuyan; Zhao, Liping; Gao, Zhen; Luo, Meng; Song, Shiren; Xu, Wenping; Zhang, Caixi; Ma, Chao; Wang, Shiping

    2016-01-01

    Aroma is an important part of quality in table grape, but the key aroma compounds and the aroma series of table grapes remains unknown. In this paper, we identified 67 aroma compounds in 20 table grape cultivars; 20 in pulp and 23 in skin were active compounds. C6 compounds were the basic background volatiles, but the aroma contents of pulp juice and skin depended mainly on the levels of esters and terpenes, respectively. Most obviously, 'Kyoho' grapevine series showed high contents of esters in pulp, while Muscat/floral cultivars showed abundant monoterpenes in skin. For the aroma series, table grapes were characterized mainly by herbaceous, floral, balsamic, sweet and fruity series. The simple and visualizable aroma profiles were established using aroma fingerprints based on the aromatic series. Hierarchical cluster analysis (HCA) and principal component analysis (PCA) showed that the aroma profiles of pulp juice, skin and whole berries could be classified into 5, 3, and 5 groups, respectively. Combined with sensory evaluation, we could conclude that fatty and balsamic series were the preferred aromatic series, and the contents of their contributors (β-ionone and octanal) may be useful as indicators for the improvement of breeding and cultivation measures for table grapes. PMID:27487935

  15. Aroma characterization based on aromatic series analysis in table grapes

    PubMed Central

    Wu, Yusen; Duan, Shuyan; Zhao, Liping; Gao, Zhen; Luo, Meng; Song, Shiren; Xu, Wenping; Zhang, Caixi; Ma, Chao; Wang, Shiping

    2016-01-01

    Aroma is an important part of quality in table grape, but the key aroma compounds and the aroma series of table grapes remains unknown. In this paper, we identified 67 aroma compounds in 20 table grape cultivars; 20 in pulp and 23 in skin were active compounds. C6 compounds were the basic background volatiles, but the aroma contents of pulp juice and skin depended mainly on the levels of esters and terpenes, respectively. Most obviously, ‘Kyoho’ grapevine series showed high contents of esters in pulp, while Muscat/floral cultivars showed abundant monoterpenes in skin. For the aroma series, table grapes were characterized mainly by herbaceous, floral, balsamic, sweet and fruity series. The simple and visualizable aroma profiles were established using aroma fingerprints based on the aromatic series. Hierarchical cluster analysis (HCA) and principal component analysis (PCA) showed that the aroma profiles of pulp juice, skin and whole berries could be classified into 5, 3, and 5 groups, respectively. Combined with sensory evaluation, we could conclude that fatty and balsamic series were the preferred aromatic series, and the contents of their contributors (β-ionone and octanal) may be useful as indicators for the improvement of breeding and cultivation measures for table grapes. PMID:27487935

  16. Time series power flow analysis for distribution connected PV generation.

    SciTech Connect

    Broderick, Robert Joseph; Quiroz, Jimmy Edward; Ellis, Abraham; Reno, Matthew J.; Smith, Jeff; Dugan, Roger

    2013-01-01

    Distributed photovoltaic (PV) projects must go through an interconnection study process before connecting to the distribution grid. These studies are intended to identify the likely impacts and mitigation alternatives. In the majority of the cases, system impacts can be ruled out or mitigation can be identified without an involved study, through a screening process or a simple supplemental review study. For some proposed projects, expensive and time-consuming interconnection studies are required. The challenges to performing the studies are twofold. First, every study scenario is potentially unique, as the studies are often highly specific to the amount of PV generation capacity that varies greatly from feeder to feeder and is often unevenly distributed along the same feeder. This can cause location-specific impacts and mitigations. The second challenge is the inherent variability in PV power output which can interact with feeder operation in complex ways, by affecting the operation of voltage regulation and protection devices. The typical simulation tools and methods in use today for distribution system planning are often not adequate to accurately assess these potential impacts. This report demonstrates how quasi-static time series (QSTS) simulation and high time-resolution data can be used to assess the potential impacts in a more comprehensive manner. The QSTS simulations are applied to a set of sample feeders with high PV deployment to illustrate the usefulness of the approach. The report describes methods that can help determine how PV affects distribution system operations. The simulation results are focused on enhancing the understanding of the underlying technical issues. The examples also highlight the steps needed to perform QSTS simulation and describe the data needed to drive the simulations. The goal of this report is to make the methodology of time series power flow analysis readily accessible to utilities and others responsible for evaluating

  17. Impacts of age-dependent tree sensitivity and dating approaches on dendrogeomorphic time series of landslides

    NASA Astrophysics Data System (ADS)

    Šilhán, Karel; Stoffel, Markus

    2015-05-01

    Different approaches and thresholds have been utilized in the past to date landslides with growth ring series of disturbed trees. Past work was mostly based on conifer species because of their well-defined ring boundaries and the easy identification of compression wood after stem tilting. More recently, work has been expanded to include broad-leaved trees, which are thought to produce less and less evident reactions after landsliding. This contribution reviews recent progress made in dendrogeomorphic landslide analysis and introduces a new approach in which landslides are dated via ring eccentricity formed after tilting. We compare results of this new approach with those of the more conventional approaches. In addition, the paper also addresses tree sensitivity to landslide disturbance as a function of tree age and trunk diameter using 119 common beech (Fagus sylvatica L.) and 39 Crimean pine (Pinus nigra ssp. pallasiana) trees growing on two landslide bodies. The landslide events reconstructed with the classical approach (reaction wood) also appear as events in the eccentricity analysis, but the inclusion of eccentricity allowed considerably more landslides (162% more) to be detected in the tree-ring series. With respect to tree sensitivity, conifers and broad-leaved trees show the strongest reactions to landslides at ages between 40 and 60 years, with a second phase of increased sensitivity in P. nigra at ages of ca. 120-130 years. These phases of highest sensitivity correspond to trunk diameters at breast height of 6-8 and 18-22 cm, respectively (P. nigra). This study thus calls for the inclusion of eccentricity analyses in future landslide reconstructions as well as for the selection of trees belonging to different age and diameter classes to allow for a well-balanced and more complete reconstruction of past events.

  18. Iranian rainfall series analysis by means of nonparametric tests

    NASA Astrophysics Data System (ADS)

    Talaee, P. Hosseinzadeh

    2014-05-01

    The study of the trends and fluctuations in rainfall has received a great deal of attention, since changes in rainfall patterns may lead to floods or droughts. The objective of this study was to analyze the annual, seasonal, and monthly rainfall time series at seven rain gauge stations in the west of Iran for a 40-year period (from October 1969 to September 2009). The homogeneity of the rainfall data sets at the rain gauge stations was checked by using the cumulative deviations test. Three nonparametric tests, namely Kendall, Spearman, and Mann-Kendall, at the 95 % confidence level were used for the trend analysis and the Theil-Sen estimator was applied for determining the magnitudes of the trends. According to the homogeneity analysis, all of the rainfall series except the September series at Vasaj station were found to be homogeneous. The obtained results showed an insignificant trend in the annual and seasonal rainfall series at the majority of the considered stations. Moreover, only three significant trends were observed at the February rainfall of Aghajanbolaghi station, the November series of Vasaj station, and the March rainfall series of Khomigan station. The findings of this study on the temporal trends of rainfall can be implemented to improve the water resources strategies in the study region.
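All three nonparametric tests and the Theil-Sen estimator are available in scipy; the following illustrative run uses synthetic rainfall with a built-in decline, not the Iranian records:

```python
import numpy as np
from scipy import stats

# Illustrative trend check: Kendall and Spearman rank tests plus the
# Theil-Sen slope estimator on a synthetic declining rainfall series.
rng = np.random.default_rng(6)
years = np.arange(1970, 2010)
rainfall = 300 - 1.5 * (years - 1970) + rng.normal(0, 10, len(years))

tau, p_kendall = stats.kendalltau(years, rainfall)
rho, p_spearman = stats.spearmanr(years, rainfall)
slope, intercept, lo, hi = stats.theilslopes(rainfall, years, 0.95)

# Significant downward trend expected; slope magnitude near the true -1.5.
print(p_kendall < 0.05, p_spearman < 0.05)
```
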

  19. Activity Approach to Just Beyond the Classroom. Environmental Education Series.

    ERIC Educational Resources Information Center

    Skliar, Norman; La Mantia, Laura

    To provide teachers with some of the many activities that can be carried on "just beyond the classroom," the booklet presents plans for more than 40 outdoor education activities, all emphasizing multidisciplinary, inquiry approach to learning. The school grounds offer optimum conditions for initiating studies in the out-of-doors. While every…

  20. Instructional Approaches to Slow Learning. Practical Suggestions for Teaching Series.

    ERIC Educational Resources Information Center

    Younie, William J.

    Designed for teachers, the text distinguishes types of slow learners and suggests practical approaches for their educational problems. Slow learning and its types are defined; the slow learner is characterized; stages of educational evaluation and aspects of administration are outlined. Curriculum considerations for different levels are described,…

  1. Emergent Approaches to Mental Health Problems. The Century Psychology Series.

    ERIC Educational Resources Information Center

    Cowen, Emory L., Ed.; And Others

    Innovative approaches to mental health problems are described. Conceptualizations about the following areas are outlined: psychiatry, the universe, and the community; theoretical malaise and community mental health; the relation of conceptual models to manpower needs; and mental health manpower and institutional change. Community programs and new…

  2. A Corpus Analysis of Vocabulary Coverage and Vocabulary Learning Opportunities within a Children's Story Series

    ERIC Educational Resources Information Center

    Sun, Yu-Chih

    2016-01-01

    Extensive reading for second language learners have been widely documented over the past few decades. However, few studies, if any, have used a corpus analysis approach to analyze the vocabulary coverage within a single-author story series, its repetition of vocabulary, and the incidental and intentional vocabulary learning opportunities therein.…

  3. Clinical immunology review series: an approach to desensitization

    PubMed Central

    Krishna, M T; Huissoon, A P

    2011-01-01

    Allergen immunotherapy describes the treatment of allergic disease through administration of gradually increasing doses of allergen. This form of immune tolerance induction is now safer, more reliably efficacious and better understood than when it was first formally described in 1911. In this paper the authors aim to summarize the current state of the art in immunotherapy in the treatment of inhalant, venom and drug allergies, with specific reference to its practice in the United Kingdom. A practical approach has been taken, with reference to current evidence and guidelines, including illustrative protocols and vaccine schedules. A number of novel approaches and techniques are likely to change considerably the way in which we select and treat allergy patients in the coming decade, and these advances are previewed. PMID:21175592

  4. A Monte Carlo Approach to Biomedical Time Series Search

    PubMed Central

    Woodbridge, Jonathan; Mortazavi, Bobak; Sarrafzadeh, Majid; Bui, Alex A.T.

    2016-01-01

    Time series subsequence matching (or signal searching) is important in a variety of areas in health care informatics, including case-based diagnosis and treatment as well as the discovery of trends and correlations between data. Much of the traditional research in signal searching has focused on high-dimensional R-NN matching. However, the results of R-NN are often small and yield minimal information gain, especially with higher-dimensional data. This paper proposes a randomized Monte Carlo sampling method to broaden the search criteria such that the query results are an accurate sampling of the complete result set. The proposed method is shown both theoretically and empirically to improve information gain. The number of query results is increased by several orders of magnitude over approximate exact-matching schemes, and the results fall within a Gaussian distribution. The proposed method also shows excellent performance, as the majority of the overhead added by sampling can be mitigated through parallelization. Experiments are run on both simulated and real-world biomedical datasets.
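The broadened-search idea can be sketched as follows: compute the distance of every query-length window, then sample uniformly among all windows under a relaxed threshold instead of keeping only the nearest few (illustrative code with made-up names, not the paper's method):

```python
import numpy as np

# Sample randomly among all subsequences whose distance to the query
# falls under a relaxed threshold, rather than returning only the nearest.
def sample_matches(series, query, threshold, k, rng):
    m = len(query)
    dists = np.array([
        np.linalg.norm(series[i:i + m] - query)
        for i in range(len(series) - m + 1)
    ])
    candidates = np.flatnonzero(dists <= threshold)
    take = min(k, len(candidates))
    return rng.choice(candidates, size=take, replace=False), dists

rng = np.random.default_rng(9)
# Periodic signal (period 200 samples) with light noise; the query is one cycle.
series = np.sin(np.linspace(0, 40 * np.pi, 4000)) + rng.normal(0, 0.05, 4000)
query = np.sin(np.linspace(0, 2 * np.pi, 200))

idx, dists = sample_matches(series, query, threshold=1.0, k=5, rng=rng)
print(len(idx), dists[idx].max() <= 1.0)
```
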

  5. Improvements in Accurate GPS Positioning Using Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Koyama, Yuichiro; Tanaka, Toshiyuki

    Although the Global Positioning System (GPS) is used widely in car navigation systems, cell phones, surveying, and other areas, several issues still exist. We focus on the continuous data received in public use of GPS, and propose a new positioning algorithm that uses time series analysis. By fitting an autoregressive model to the time series model of the pseudorange, we propose an appropriate state-space model. We apply the Kalman filter to the state-space model and use the pseudorange estimated by the filter in our positioning calculations. The results of the authors' positioning experiment show that the accuracy of the proposed method is much better than that of the standard method. In addition, as we can obtain valid values estimated by time series analysis using the state-space model, the proposed state-space model can be applied to several other fields.
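A scalar toy analogue of the proposed pipeline, replacing the AR-model state space with a simple random-walk state, shows how a Kalman filter suppresses pseudorange-like measurement noise (illustrative only, not the authors' algorithm):

```python
import numpy as np

# Scalar Kalman filter with a random-walk state model.
def kalman_smooth(obs, q, r):
    """q: process noise variance, r: measurement noise variance."""
    xhat, p = obs[0], 1.0
    out = [xhat]
    for z in obs[1:]:
        p = p + q                      # predict: random-walk state
        k = p / (p + r)                # Kalman gain
        xhat = xhat + k * (z - xhat)   # update with the new measurement
        p = (1 - k) * p
        out.append(xhat)
    return np.array(out)

rng = np.random.default_rng(8)
truth = np.cumsum(rng.normal(0, 0.1, 300)) + 2.0e7  # slowly varying "range" (m)
obs = truth + rng.normal(0, 3.0, 300)               # noisy pseudorange-like track
est = kalman_smooth(obs, q=0.01, r=9.0)

# The filtered estimate should track the truth better than the raw observations.
print(np.abs(est - truth).mean() < np.abs(obs - truth).mean())
```
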

  6. Time series analysis and the analysis of aquatic and riparian ecosystems

    USGS Publications Warehouse

    Milhous, R.T.

    2003-01-01

    Time series analysis of physical instream habitat and the riparian zone is not done as frequently as would be beneficial for understanding the fisheries aspects of the aquatic ecosystem. This paper presents two case studies showing how time series analysis may be accomplished. Time series analysis here is the analysis of the variation of the physical habitat or of the hydro-period in the riparian zone (in many situations, the floodplain).

  7. Analysis of Complex Intervention Effects in Time-Series Experiments.

    ERIC Educational Resources Information Center

    Bower, Cathleen

    An iterative least squares procedure for analyzing the effect of various kinds of intervention in time-series data is described. There are numerous applications of this design in economics, education, and psychology, although until recently, no appropriate analysis techniques had been developed to deal with the model adequately. This paper…

  8. Time Series Analysis Based on Running Mann Whitney Z Statistics

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
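A hedged sketch of such a running Mann-Whitney Z statistic, comparing each moving window against the remainder of the series (an assumption about the windowing scheme, since the abstract is truncated):

```python
import numpy as np
from scipy import stats

# Running Mann-Whitney Z: compare each moving window against the rest of
# the series and normalize U to Z using its mean and variance under H0.
def running_mw_z(x, window):
    x = np.asarray(x, dtype=float)
    z = []
    for start in range(len(x) - window + 1):
        inside = x[start:start + window]
        outside = np.concatenate([x[:start], x[start + window:]])
        u, _ = stats.mannwhitneyu(inside, outside, alternative="two-sided")
        n1, n2 = len(inside), len(outside)
        mu = n1 * n2 / 2.0                               # mean of U under H0
        sigma = np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)  # std of U under H0
        z.append((u - mu) / sigma)
    return np.array(z)

rng = np.random.default_rng(7)
x = np.concatenate([rng.normal(0, 1, 100), rng.normal(2, 1, 100)])  # level shift halfway
z = running_mw_z(x, window=30)

# Windows before the shift score negative, windows after it score positive.
print(z[:20].mean() < 0, z[-20:].mean() > 0)
```
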

  9. ADAPTIVE DATA ANALYSIS OF COMPLEX FLUCTUATIONS IN PHYSIOLOGIC TIME SERIES

    PubMed Central

    PENG, C.-K.; COSTA, MADALENA; GOLDBERGER, ARY L.

    2009-01-01

    We introduce a generic framework of dynamical complexity to understand and quantify fluctuations of physiologic time series. In particular, we discuss the importance of applying adaptive data analysis techniques, such as the empirical mode decomposition algorithm, to address the challenges of nonlinearity and nonstationarity that are typically exhibited in biological fluctuations. PMID:20041035

  10. Nonlinear Analysis of Surface EMG Time Series of Back Muscles

    NASA Astrophysics Data System (ADS)

    Dolton, Donald C.; Zurcher, Ulrich; Kaufman, Miron; Sung, Paul

    2004-10-01

A nonlinear analysis of surface electromyography time series of subjects with and without low back pain is presented. The mean-square displacement and entropy show anomalous diffusive behavior in the intermediate time range 10 ms < t < 1 s. This behavior implies the presence of correlations in the signal. We discuss the shape of the power spectrum of the signal.
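The mean-square displacement diagnostic used above can be sketched as follows; the synthetic random walk (where the scaling exponent is 1 by construction) stands in for an EMG record, which is an assumption for illustration only.

```python
import numpy as np

def mean_square_displacement(x, max_lag):
    """MSD(tau) = <(x(t+tau) - x(t))^2>, averaged over all start times."""
    return np.array([np.mean((x[lag:] - x[:-lag]) ** 2)
                     for lag in range(1, max_lag + 1)])

def diffusion_exponent(msd):
    """Slope alpha of log MSD vs log tau; alpha = 1 is normal diffusion,
    alpha != 1 signals anomalous (correlated) behaviour."""
    lags = np.arange(1, len(msd) + 1)
    alpha, _ = np.polyfit(np.log(lags), np.log(msd), 1)
    return alpha

rng = np.random.default_rng(1)
walk = np.cumsum(rng.normal(size=20000))   # ordinary random walk
alpha = diffusion_exponent(mean_square_displacement(walk, 100))
```

Subdiffusive (alpha < 1) or superdiffusive (alpha > 1) exponents over an intermediate lag range are the kind of anomalous behavior the abstract reports.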

  11. Identification of human operator performance models utilizing time series analysis

    NASA Technical Reports Server (NTRS)

    Holden, F. M.; Shinners, S. M.

    1973-01-01

    The results of an effort performed by Sperry Systems Management Division for AMRL in applying time series analysis as a tool for modeling the human operator are presented. This technique is utilized for determining the variation of the human transfer function under various levels of stress. The human operator's model is determined based on actual input and output data from a tracking experiment.

  12. Born-series approach to the calculation of Casimir forces

    NASA Astrophysics Data System (ADS)

    Bennett, Robert

    2014-06-01

    The Casimir force between two objects is notoriously difficult to calculate in anything other than parallel-plate geometries due to its nonadditive nature. This means that for more complicated, realistic geometries one usually has to resort to approaches such as making the crude proximity force approximation (PFA). Another issue with calculation of Casimir forces in real-world situations (such as with realistic materials) is that there are continuing doubts about the status of Lifshitz's original treatment as a true quantum theory. Here we demonstrate an alternative approach to the calculation of Casimir forces for arbitrary geometries which sidesteps both of these problems. Our calculations are based upon a Born expansion of the Green's function of the quantized electromagnetic vacuum field, interpreted as multiple scattering, with the relevant coupling strength being the difference in the dielectric functions of the various materials involved. This allows one to consider arbitrary geometries in single or multiple scattering simply by integrating over the desired shape, meaning that extension beyond the PFA is trivial. This work is mostly dedicated to illustration of the method by reproduction of known parallel-slab results—a process that turns out to be nontrivial and provides several useful insights. We also present a short example of calculation of the Casimir energy for a more complicated geometry; namely, that of two finite slabs.

  13. Wavelet analysis for non-stationary, nonlinear time series

    NASA Astrophysics Data System (ADS)

    Schulte, Justin A.

    2016-08-01

    Methods for detecting and quantifying nonlinearities in nonstationary time series are introduced and developed. In particular, higher-order wavelet analysis was applied to an ideal time series and the quasi-biennial oscillation (QBO) time series. Multiple-testing problems inherent in wavelet analysis were addressed by controlling the false discovery rate. A new local autobicoherence spectrum facilitated the detection of local nonlinearities and the quantification of cycle geometry. The local autobicoherence spectrum of the QBO time series showed that the QBO time series contained a mode with a period of 28 months that was phase coupled to a harmonic with a period of 14 months. An additional nonlinearly interacting triad was found among modes with periods of 10, 16 and 26 months. Local biphase spectra determined that the nonlinear interactions were not quadratic and that the effect of the nonlinearities was to produce non-smoothly varying oscillations. The oscillations were found to be skewed so that negative QBO regimes were preferred, and also asymmetric in the sense that phase transitions between the easterly and westerly phases occurred more rapidly than those from westerly to easterly regimes.

  14. Optimal trading strategies—a time series approach

    NASA Astrophysics Data System (ADS)

    Bebbington, Peter A.; Kühn, Reimer

    2016-05-01

    Motivated by recent advances in the spectral theory of auto-covariance matrices, we are led to revisit a reformulation of Markowitz’ mean-variance portfolio optimization approach in the time domain. In its simplest incarnation it applies to a single traded asset and allows an optimal trading strategy to be found which—for a given return—is minimally exposed to market price fluctuations. The model is initially investigated for a range of synthetic price processes, taken to be either second order stationary, or to exhibit second order stationary increments. Attention is paid to consequences of estimating auto-covariance matrices from small finite samples, and auto-covariance matrix cleaning strategies to mitigate against these are investigated. Finally we apply our framework to real world data.
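Estimating an auto-covariance matrix from a short sample and then "cleaning" it is the step most exposed to noise. The sketch below shows one common cleaning strategy, eigenvalue clipping; this is an illustration, not necessarily the strategy the authors investigate.

```python
import numpy as np

def autocovariance_matrix(x, p):
    """Toeplitz auto-covariance matrix of order p estimated from one
    series, using the biased (divide-by-n) estimator, which keeps the
    matrix positive semidefinite."""
    x = x - x.mean()
    n = len(x)
    acov = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(p)])
    return np.array([[acov[abs(i - j)] for j in range(p)] for i in range(p)])

def clip_eigenvalues(c, floor_quantile=0.5):
    """Replace eigenvalues below a quantile floor by that floor,
    suppressing small-sample noise in the low end of the spectrum."""
    w, v = np.linalg.eigh(c)
    floor = np.quantile(w, floor_quantile)
    return (v * np.maximum(w, floor)) @ v.T

rng = np.random.default_rng(2)
c = autocovariance_matrix(rng.normal(size=500), 20)
c_clean = clip_eigenvalues(c)
```

The cleaned matrix stays symmetric and positive definite, which matters because the optimal trading weights involve (pseudo-)inverting it.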

  15. A three-phase series-parallel resonant converter -- analysis, design, simulation and experimental results

    SciTech Connect

    Bhat, A.K.S.; Zheng, L.

    1995-12-31

    A three-phase dc-to-dc series-parallel resonant converter is proposed and its operating modes for 180{degree} wide gating pulse scheme are explained. A detailed analysis of the converter using constant current model and Fourier series approach is presented. Based on the analysis, design curves are obtained and a design example of 1 kW converter is given. SPICE simulation results for the designed converter and experimental results for a 500 W converter are presented to verify the performance of the proposed converter for varying load conditions. The converter operates in lagging PF mode for the entire load range and requires a narrow variation in switching frequency.

  16. Scaling analysis of multi-variate intermittent time series

    NASA Astrophysics Data System (ADS)

    Kitt, Robert; Kalda, Jaan

    2005-08-01

The scaling properties of the time series of asset prices and trading volumes of stock markets are analysed. It is shown that, similarly to the asset prices, the trading volume data obey a multi-scaling length distribution of low-variability periods. In the case of asset prices, such scaling behaviour can be used for risk forecasts: the probability of observing a large price movement the next day is (super-universally) inversely proportional to the length of the ongoing low-variability period. Finally, a method is devised for a multi-factor scaling analysis. We apply the simplest, two-factor model to equity index and trading volume time series.
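Measuring the length distribution of low-variability periods reduces to run-length extraction. A minimal sketch, with the threshold and the heavy-tailed synthetic returns chosen arbitrarily for illustration:

```python
import numpy as np

def low_variability_lengths(returns, threshold):
    """Lengths of consecutive runs where |return| stays below the
    threshold, i.e. the 'low-variability periods' whose length
    distribution is studied."""
    quiet = np.abs(returns) < threshold
    lengths, run = [], 0
    for q in quiet:
        if q:
            run += 1
        elif run:
            lengths.append(run)
            run = 0
    if run:
        lengths.append(run)
    return np.array(lengths)

rng = np.random.default_rng(3)
r = rng.standard_t(df=3, size=5000)       # heavy-tailed synthetic returns
lengths = low_variability_lengths(r, threshold=1.0)
```

A histogram of `lengths` on log-log axes is then inspected for the multi-scaling behaviour the abstract reports.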

  17. Interglacial climate dynamics and advanced time series analysis

    NASA Astrophysics Data System (ADS)

    Mudelsee, Manfred; Bermejo, Miguel; Köhler, Peter; Lohmann, Gerrit

    2013-04-01

Studying the climate dynamics of past interglacials (IGs) helps to better assess the anthropogenically influenced dynamics of the current IG, the Holocene. We select the IG portions from the EPICA Dome C ice core archive, which covers the past 800 ka, to apply methods of statistical time series analysis (Mudelsee 2010). The analysed variables are deuterium/H (indicating temperature) (Jouzel et al. 2007), greenhouse gases (Siegenthaler et al. 2005, Loulergue et al. 2008, Lüthi et al. 2008) and a model-co-derived climate radiative forcing (Köhler et al. 2010). We additionally select high-resolution sea-surface-temperature records from the marine sedimentary archive. The first statistical method, persistence time estimation (Mudelsee 2002), lets us infer the 'climate memory' property of IGs. Second, linear regression informs about long-term climate trends during IGs. Third, ramp function regression (Mudelsee 2000) is adapted to look at abrupt climate changes during IGs. We compare the Holocene with previous IGs in terms of these mathematical approaches, interpret the results in a climate context, and assess the uncertainties and the requirements on data from old IGs for yielding results of 'acceptable' accuracy. This work receives financial support from the Deutsche Forschungsgemeinschaft (Project ClimSens within the DFG Research Priority Program INTERDYNAMIK) and the European Commission (Marie Curie Initial Training Network LINC, No. 289447, within the 7th Framework Programme). References: Jouzel J, Masson-Delmotte V, Cattani O, Dreyfus G, Falourd S, Hoffmann G, Minster B, Nouet J, Barnola JM, Chappellaz J, Fischer H, Gallet JC, Johnsen S, Leuenberger M, Loulergue L, Luethi D, Oerter H, Parrenin F, Raisbeck G, Raynaud D, Schilt A, Schwander J, Selmo E, Souchez R, Spahni R, Stauffer B, Steffensen JP, Stenni B, Stocker TF, Tison JL, Werner M, Wolff EW (2007) Orbital and millennial Antarctic climate variability over the past 800,000 years. Science 317:793. Köhler P, Bintanja R
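Persistence time estimation in its simplest form fits an AR(1) "climate memory" model; Mudelsee's estimator handles uneven sampling, but the equally spaced sketch below conveys the idea. The AR(1) coefficient of 0.9 and the series length are illustrative assumptions.

```python
import numpy as np

def persistence_time(x, dt=1.0):
    """Persistence ('climate memory') time from an AR(1) fit:
    tau = -dt / ln(a), where a is the lag-1 autocorrelation."""
    x = x - x.mean()
    a = np.dot(x[:-1], x[1:]) / np.dot(x, x)
    return -dt / np.log(a)

# synthetic AR(1) series with known tau = -1/ln(0.9), about 9.5 steps
rng = np.random.default_rng(4)
x = np.zeros(50000)
for i in range(1, len(x)):
    x[i] = 0.9 * x[i - 1] + rng.normal()
tau = persistence_time(x)
```

A long persistence time means neighbouring samples carry little independent information, which is exactly why it must be accounted for before assessing trend significance.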

  18. Theoretical approach for plasma series resonance effect in geometrically symmetric dual radio frequency plasma

    SciTech Connect

    Bora, B.; Bhuyan, H.; Favre, M.; Wyndham, E.; Chuaqui, H.

    2012-02-27

The plasma series resonance (PSR) effect is well known in geometrically asymmetric capacitively coupled radio frequency plasmas. However, the plasma series resonance effect in geometrically symmetric plasmas has not been properly investigated. In this work, a theoretical approach is taken to investigate the plasma series resonance effect and its influence on Ohmic and stochastic heating in a geometrically symmetric discharge. The electrical asymmetry effect, by means of a dual-frequency voltage waveform, is applied to excite the plasma series resonance. The results show considerable variation in heating with the phase difference between the voltage waveforms, which may be applicable in controlling the plasma parameters in such plasmas.

  19. The Novel Transvestibule Approach for Endoscopic Thyroidectomy: A Case Series

    PubMed Central

    Yang, Kai; Ding, Boni; Lin, Changwei; Li, Wanwan

    2016-01-01

Object: To evaluate the feasibility of NOTES for thyroid by the transvestibule approach. Methods: Six patients diagnosed with benign thyroid diseases were enrolled and underwent transvestibule endoscopic thyroidectomy in our hospital from October 2013 to September 2014. Results: All 6 patients completed transvestibule endoscopic thyroidectomy successfully with no conversion to open surgery. The mean operation time was 122 minutes (100 to 150 min). The average blood loss during surgery was 30 mL (10 to 40 mL). The pathologic diagnosis coincided with the preoperative diagnosis, which was 1 case of thyroid adenoma and 5 cases of thyroid goiters. The mean length of hospital stay was 8.2 days (8 to 10 d). No severe complications were reported during the 3 to 13 months’ follow-up. Conclusions: Transvestibule endoscopic thyroidectomy is feasible, with a satisfactory cosmetic effect; yet, further improvement of surgical techniques is required on account of the complexity of the surgical procedure and the prolonged operation time. PMID:26813240

  20. Toward a practical approach for ergodicity analysis

    NASA Astrophysics Data System (ADS)

    Wang, H.; Wang, C.; Zhao, Y.; Lin, X.; Yu, C.

    2015-09-01

It is of importance to perform hydrological forecasts using a finite hydrological time series. Most time series analysis approaches presume a data series to be ergodic without justifying this assumption. This paper presents a practical approach to analyze the mean ergodic property of hydrological processes by means of autocorrelation function evaluation and the Augmented Dickey Fuller test, a radial basis function neural network, and the definition of mean ergodicity. The mean ergodicity of precipitation processes at the Lanzhou Rain Gauge Station in the Yellow River basin, the Ankang Rain Gauge Station in the Han River basin, both in China, and at Newberry, MI, USA is analyzed using the proposed approach. The results indicate that the precipitation of March, July, and August in Lanzhou, and of May, June, and August in Ankang, has mean ergodicity, whereas the precipitation of any other calendar month at these two rain gauge stations does not. The precipitation of February, May, July, and December in Newberry shows the ergodic property, although the precipitation of each month shows a clear increasing or decreasing trend.
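The intuition behind mean ergodicity is that time averages over disjoint segments should converge to a common value. The sketch below is a crude version of that check only; it omits the paper's ADF test and radial basis function network, and the segment count and synthetic series are assumptions.

```python
import numpy as np

def mean_ergodicity_statistic(x, n_segments=10):
    """Spread of segment means relative to the series' own standard
    deviation; small values are consistent with mean ergodicity."""
    segments = np.array_split(x, n_segments)
    seg_means = np.array([s.mean() for s in segments])
    return seg_means.std() / x.std()

rng = np.random.default_rng(5)
stationary = rng.normal(size=10000)                          # ergodic in the mean
trended = rng.normal(size=10000) + np.linspace(0, 5, 10000)  # not ergodic
```

A trending series like the Newberry precipitation in the abstract produces segment means that drift systematically, inflating the statistic.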

  1. Multivariate stochastic analysis for Monthly hydrological time series at Cuyahoga River Basin

    NASA Astrophysics Data System (ADS)

    zhang, L.

    2011-12-01

The copula has become a very powerful statistical and stochastic methodology for multivariate analysis in environmental and water resources engineering. In recent years, the popular one-parameter Archimedean copulas, e.g. the Gumbel-Hougaard copula, Cook-Johnson copula, and Frank copula, and the meta-elliptical copulas, e.g. the Gaussian copula and Student-t copula, have been applied in multivariate hydrological analyses, e.g. multivariate rainfall (rainfall intensity, duration and depth), flood (peak discharge, duration and volume), and drought analyses (drought length, mean and minimum SPI values, and drought mean areal extent). The copula has also been applied in flood frequency analysis at the confluences of river systems by taking into account the dependence among upstream gauge stations rather than by using the hydrological routing technique. In most of the studies above, the annual time series have been treated as stationary signals, in which the observations are assumed to be independent identically distributed (i.i.d.) random variables. But in reality, hydrological time series, especially daily and monthly hydrological time series, cannot be considered i.i.d. random variables due to the periodicity in the data structure. The stationarity assumption is also under question due to climate change and land use and land cover (LULC) change in the past years. To this end, it is necessary to re-evaluate the classic approach to the study of hydrological time series by relaxing the stationarity assumption through a nonstationary approach. As to the study of the dependence structure of hydrological time series, the assumption of the same type of univariate distribution also needs to be relaxed by adopting copula theory. In this paper, the univariate monthly hydrological time series will be studied through the nonstationary time series analysis approach. The dependence structure of the multivariate monthly hydrological time series will be
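For the Gumbel-Hougaard copula named above, the parameter can be fitted from Kendall's tau via the closed-form relation tau = 1 - 1/theta. A minimal sketch with invented, positively dependent synthetic data (not hydrological observations):

```python
import numpy as np

def kendall_tau(u, v):
    """Kendall's rank correlation via the O(n^2) pairwise definition:
    (concordant - discordant) / total pairs."""
    su = np.sign(u[:, None] - u[None, :])
    sv = np.sign(v[:, None] - v[None, :])
    n = len(u)
    return (su * sv).sum() / (n * (n - 1))

def fit_gumbel_theta(u, v):
    """Moment-style fit of the Gumbel-Hougaard parameter from
    Kendall's tau, using tau = 1 - 1/theta (theta >= 1)."""
    return 1.0 / (1.0 - kendall_tau(u, v))

def gumbel_copula_cdf(u, v, theta):
    """C(u, v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta))."""
    s = (-np.log(u)) ** theta + (-np.log(v)) ** theta
    return np.exp(-s ** (1.0 / theta))

rng = np.random.default_rng(6)
z = rng.normal(size=2000)
u = rng.normal(size=2000) + z     # shared factor induces dependence
v = rng.normal(size=2000) + z
theta = fit_gumbel_theta(u, v)
```

In a hydrological application, u and v would be the probability-integral transforms of, say, flood peak and volume; the fitted theta then encodes their upper-tail dependence.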

  2. Recurrence quantification analysis and state space divergence reconstruction for financial time series analysis

    NASA Astrophysics Data System (ADS)

    Strozzi, Fernanda; Zaldívar, José-Manuel; Zbilut, Joseph P.

    2007-03-01

    The application of recurrence quantification analysis (RQA) and state space divergence reconstruction for the analysis of financial time series in terms of cross-correlation and forecasting is illustrated using high-frequency time series and random heavy-tailed data sets. The results indicate that these techniques, able to deal with non-stationarity in the time series, may contribute to the understanding of the complex dynamics hidden in financial markets. The results demonstrate that financial time series are highly correlated. Finally, an on-line trading strategy is illustrated and the results shown using high-frequency foreign exchange time series.
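The determinism measure at the heart of RQA counts recurrent points lying on diagonal line structures. The sketch below is a minimal, embedding-free version on a scalar series; threshold, series, and the omission of delay embedding are all simplifying assumptions.

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence plot: R[i, j] = 1 when |x_i - x_j| < eps."""
    return (np.abs(x[:, None] - x[None, :]) < eps).astype(int)

def determinism(r, lmin=2):
    """Fraction of recurrent points on diagonal lines of length >= lmin
    (line of identity excluded); high determinism distinguishes
    deterministic dynamics from noise."""
    n = len(r)
    on_lines = 0
    for k in range(-(n - 1), n):
        if k == 0:
            continue                              # skip the line of identity
        run = 0
        for v in np.r_[np.diagonal(r, k), 0]:     # trailing 0 flushes last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    on_lines += run
                run = 0
    total = r.sum() - n                           # recurrent points off identity
    return on_lines / total if total else 0.0

rng = np.random.default_rng(12)
t = np.linspace(0, 8 * np.pi, 300)
r_sine = recurrence_matrix(np.sin(t), eps=0.1)
r_noise = recurrence_matrix(rng.normal(size=300), eps=0.1)
```

Periodic data produce long diagonals and hence high determinism; uncorrelated noise produces isolated recurrences and low determinism, which is the contrast RQA exploits on financial data.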

  3. Modelling trends in climatic time series using the state space approach

    NASA Astrophysics Data System (ADS)

    Laine, Marko; Kyrölä, Erkki

    2014-05-01

A typical feature of atmospheric time series is that they are not stationary but exhibit both slowly varying and abrupt changes in their distributional properties. These are caused by external forcing such as changes in solar activity or volcanic eruptions. Further, the data sampling is often nonuniform, there are data gaps, and the uncertainty of the observations can vary. When observations are combined from various sources, there will be instrument- and retrieval-method-related biases; the differences in sampling also lead to uncertainties. Dynamic regression with a state space representation of the underlying processes provides flexible tools for these challenges in the analysis. By explicitly allowing for variability in the regression coefficients we let the system properties change in time, and this change can itself be modelled and estimated. Furthermore, the use of unobservable state variables allows modelling of the processes that drive the observed variability, such as seasonality or external forcing, and we can explicitly allow for some modelling error. The state space approach provides a well-defined hierarchical statistical model for assessing trends defined as long-term background changes in the time series. The modelling assumptions can be evaluated, and the method provides realistic uncertainty estimates for the model-based statements on the quantities of interest. We show that a dynamic linear model (DLM) provides a very flexible tool for trend and change point analysis in time series. Given the structural parameters of the model, the Kalman filter and Kalman smoother formulas can be used to estimate the model states. Further, we provide an efficient way to account for the structural parameter uncertainty by using an adaptive Markov chain Monte Carlo (MCMC) algorithm. Then, the trend-related statistics can be estimated by simulating realizations of the estimated processes with fully quantified uncertainties. This presentation will provide a
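The simplest DLM of the kind described is the local level model, filtered with the scalar Kalman recursions. This is a minimal sketch: real trend analyses add regression, seasonal, and proxy terms, and the noise variances here are chosen to match the synthetic data, not estimated by MCMC.

```python
import numpy as np

def local_level_filter(y, q, r):
    """Kalman filter for the local level DLM
        y_t  = mu_t + eps_t,      eps_t ~ N(0, r)
        mu_t = mu_{t-1} + eta_t,  eta_t ~ N(0, q)."""
    n = len(y)
    mu = np.zeros(n)       # filtered state estimates
    p = 1e6                # diffuse initial state variance
    m = y[0]
    for t in range(n):
        p_pred = p + q                      # predict
        k = p_pred / (p_pred + r)           # Kalman gain
        m = m + k * (y[t] - m)              # update with observation y_t
        p = (1 - k) * p_pred
        mu[t] = m
    return mu

rng = np.random.default_rng(7)
level = np.cumsum(rng.normal(0, 0.05, 500))      # slowly drifting background
y = level + rng.normal(0, 1.0, 500)              # noisy observations
mu = local_level_filter(y, q=0.05 ** 2, r=1.0)
```

The filtered state tracks the slowly varying background far more accurately than the raw observations do, which is the basis for the trend statements in the abstract; a backward smoothing pass would refine it further.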

  4. Time Series Analysis of 3D Coordinates Using Nonstochastic Observations

    NASA Astrophysics Data System (ADS)

    Velsink, Hiddo

    2016-03-01

    Adjustment and testing of a combination of stochastic and nonstochastic observations is applied to the deformation analysis of a time series of 3D coordinates. Nonstochastic observations are constant values that are treated as if they were observations. They are used to formulate constraints on the unknown parameters of the adjustment problem. Thus they describe deformation patterns. If deformation is absent, the epochs of the time series are supposed to be related via affine, similarity or congruence transformations. S-basis invariant testing of deformation patterns is treated. The model is experimentally validated by showing the procedure for a point set of 3D coordinates, determined from total station measurements during five epochs. The modelling of two patterns, the movement of just one point in several epochs, and of several points, is shown. Full, rank deficient covariance matrices of the 3D coordinates, resulting from free network adjustments of the total station measurements of each epoch, are used in the analysis.

  5. Mode Analysis with Autocorrelation Method (Single Time Series) in Tokamak

    NASA Astrophysics Data System (ADS)

    Saadat, Shervin; Salem, Mohammad K.; Goranneviss, Mahmoud; Khorshid, Pejman

    2010-08-01

In this paper, plasma modes are analyzed with a statistical method based on the autocorrelation function. The autocorrelation function is computed from a single time series, so only one Mirnov coil is needed for this purpose. After autocorrelation analysis of the Mirnov coil data, the spectral density diagram is plotted. From its symmetries and trends, the spectral density diagram can be used to analyze the plasma modes. The effects of RHF fields are investigated with this method in the IR-T1 tokamak, and the results correspond with those of multichannel methods such as SVD and FFT.
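The autocorrelation-to-spectral-density step is the Wiener-Khinchin relation: the power spectrum is the Fourier transform of the autocorrelation function. A minimal sketch on a synthetic single-channel signal (the 50 Hz tone and noise level stand in for a Mirnov coil trace):

```python
import numpy as np

def autocorrelation(x, max_lag):
    """Biased sample autocorrelation function up to max_lag."""
    x = x - x.mean()
    c0 = np.dot(x, x) / len(x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / len(x) / c0
                     for k in range(max_lag + 1)])

def spectral_density_from_acf(acf):
    """Wiener-Khinchin: symmetrize the one-sided ACF and Fourier
    transform it; peaks reveal coherent modes."""
    full = np.r_[acf, acf[-2:0:-1]]
    return np.abs(np.fft.rfft(full))

fs = 1000.0
t = np.arange(4096) / fs
signal = (np.sin(2 * np.pi * 50 * t)
          + 0.3 * np.random.default_rng(8).normal(size=t.size))
acf = autocorrelation(signal, 512)
psd = spectral_density_from_acf(acf)
freqs = np.fft.rfftfreq(2 * 512, d=1 / fs)
```

A coherent mode shows up as a sharp spectral peak even though only one coil (one time series) is used.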

  6. Time series analysis as a tool for karst water management

    NASA Astrophysics Data System (ADS)

    Fournier, Matthieu; Massei, Nicolas; Duran, Léa

    2015-04-01

Karst hydrosystems are well known for their vulnerability to turbidity due to their complex and unique characteristics, which make them very different from other aquifers. Moreover, many parameters can affect their functioning. This makes the characterization of their vulnerability difficult and requires the use of statistical analyses. Time series analyses of turbidity, electrical conductivity and water discharge datasets, such as correlation and spectral analyses, have proven to be useful in improving our understanding of karst systems. However, the loss of information on time localization is a major drawback of those Fourier spectral methods; this problem has been overcome by the development of wavelet analysis (continuous or discrete) for hydrosystems, offering the possibility to better characterize the complex modalities of variation inherent to non-stationary processes. Nevertheless, with the wavelet transform the signal is decomposed into several continuous wavelet components, which may not hold for the local-time processes frequently observed in karst aquifers. More recently, a new approach associating empirical mode decomposition and the Hilbert transform was presented for hydrosystems. It allows an orthogonal decomposition of the analyzed signal and provides a more accurate estimation of changing variability scales across time for highly transient signals. This study aims to identify the natural and anthropogenic parameters which control the turbidity released at a well for drinking water supply. The well is located in the chalk karst aquifer near the Seine river, 40 km from the Seine estuary in the western Paris Basin. At this location, tidal variations greatly affect the level of the water in the Seine. Continuous wavelet analysis of the turbidity dataset has been used to decompose turbidity release at the well into three components: i) the rain event periods, ii) the pumping periods and iii) the tidal range of the Seine river.
Time-domain reconstruction by inverse wavelet transform allows

  7. Satellite time series analysis using Empirical Mode Decomposition

    NASA Astrophysics Data System (ADS)

    Pannimpullath, R. Renosh; Doolaeghe, Diane; Loisel, Hubert; Vantrepotte, Vincent; Schmitt, Francois G.

    2016-04-01

Geophysical fields possess large fluctuations over many spatial and temporal scales. Successive satellite images provide an interesting sampling of this spatio-temporal multiscale variability. Here we propose to consider such variability by performing satellite time series analysis, pixel by pixel, using Empirical Mode Decomposition (EMD). EMD is a time series analysis technique able to decompose an original time series into a sum of modes, each one having a different mean frequency. It can be used to smooth signals and to extract trends. It is built in a data-adaptive way, and is able to extract information from nonlinear signals. Here we use MERIS Suspended Particulate Matter (SPM) data, on a weekly basis, over 10 years. There are 458 successive time steps. We have selected 5 different regions of coastal waters for the present study: Vietnam coastal waters, the Brahmaputra region, the St. Lawrence, the English Channel and the McKenzie. These regions have high SPM concentrations due to large-scale river run-off. Trend and Hurst exponents are derived for each pixel in each region. The energy is also extracted using Hilbert Spectral Analysis (HSA) along with the EMD method, and the energy of each mode in each region is normalised by the total energy computed from all the modes extracted by the EMD method.

  8. The application of complex network time series analysis in turbulent heated jets

    SciTech Connect

    Charakopoulos, A. K.; Karakasidis, T. E. Liakopoulos, A.; Papanicolaou, P. N.

    2014-06-15

In the present study, we applied the methodology of the complex network-based time series analysis to experimental temperature time series from a vertical turbulent heated jet. More specifically, we approach the hydrodynamic problem of discriminating time series corresponding to various regions relative to the jet axis, i.e., time series corresponding to regions that are close to the jet axis from time series originating at regions with a different dynamical regime, based on the constructed network properties. Applying the phase space transformation method (k nearest neighbors) and also the visibility algorithm, we transformed time series into networks and evaluated the topological properties of the networks such as degree distribution, average path length, diameter, modularity, and clustering coefficient. The results show that the complex network approach allows distinguishing, identifying, and exploring in detail various dynamical regions of the jet flow, and associating them with the corresponding physical behavior. In addition, in order to reject the hypothesis that the studied networks originate from a stochastic process, we generated random networks and compared their statistical properties with those originating from the experimental data. As far as the efficiency of the two methods for network construction is concerned, we conclude that both methodologies lead to network properties that present almost the same qualitative behavior and allow us to reveal the underlying system dynamics.
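The visibility algorithm mentioned above maps a time series to a network by linking samples that can "see" each other over the intervening values. A minimal sketch of the natural visibility criterion on a tiny invented series:

```python
import numpy as np

def natural_visibility_degrees(x):
    """Natural visibility graph: nodes are samples; i and j are linked
    when the straight line between (i, x_i) and (j, x_j) passes above
    every intermediate sample. Returns each node's degree."""
    n = len(x)
    degree = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            k = np.arange(i + 1, j)
            # visibility: x_k < x_j + (x_i - x_j) * (j - k) / (j - i)
            if np.all(x[k] < x[j] + (x[i] - x[j]) * (j - k) / (j - i)):
                degree[i] += 1
                degree[j] += 1
    return degree

x = np.array([1.0, 0.5, 2.0, 0.3, 0.8])
deg = natural_visibility_degrees(x)
```

Network statistics such as the degree distribution or clustering coefficient are then computed on the resulting graph and compared across jet regions; the tall sample at index 2 blocks visibility across it, so it collects the highest degree.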

  10. Time series analysis using semiparametric regression on oil palm production

    NASA Astrophysics Data System (ADS)

    Yundari, Pasaribu, U. S.; Mukhaiyar, U.

    2016-04-01

This paper presents a semiparametric kernel regression method which has shown its flexibility and ease in mathematical calculation, especially in estimating density and regression functions. The kernel function is continuous and produces a smooth estimate. The classical kernel density estimator is constructed by completely nonparametric analysis and works reasonably well for all forms of functions. Here, we discuss parameter estimation in time series analysis. First, we assume the parameters exist; then we use nonparametric estimation, which makes the overall approach semiparametric. The optimum bandwidth is selected by minimizing an approximation of the Mean Integrated Squared Error (MISE).
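The kernel regression step can be sketched with the classical Nadaraya-Watson estimator (one standard choice, not necessarily the exact estimator of this paper); the Gaussian kernel, bandwidth, and synthetic data are assumptions for illustration.

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, h):
    """Gaussian-kernel (Nadaraya-Watson) regression estimate; the
    bandwidth h controls smoothness and would in practice be chosen
    by minimizing an MISE-type criterion."""
    d = (x_eval[:, None] - x_train[None, :]) / h
    w = np.exp(-0.5 * d ** 2)                  # kernel weights
    return (w @ y_train) / w.sum(axis=1)       # locally weighted average

rng = np.random.default_rng(11)
x = np.sort(rng.uniform(0, 10, 400))
y = np.sin(x) + rng.normal(0, 0.2, 400)
grid = np.linspace(0.5, 9.5, 50)
y_hat = nadaraya_watson(x, y, grid, h=0.4)
```

Too small an h tracks the noise, too large an h oversmooths; the MISE criterion trades off exactly these variance and bias terms.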

  11. Combination of equiprobable symbolization and time reversal asymmetry for heartbeat interval series analysis

    NASA Astrophysics Data System (ADS)

    Hou, Fengzhen; Huang, Xiaolin; Chen, Ying; Huo, Chengyu; Liu, Hongxing; Ning, Xinbao

    2013-01-01

    Symbolic dynamics method and time reversal asymmetry analysis are both important approaches in the study of heartbeat interval series. However, there is limited research work reported on combining these two methods. We provide a method of time reversal asymmetry analysis which focuses on the differences between the forward and backward embedding “m words” after the operation of equiprobable symbolization. To investigate the total amplitude as well as the distribution features of the difference, four indices are proposed. Based on the application to simulation series, we found that these measures can successfully detect time reversal asymmetry in chaos series. With application to human heartbeat interval series (RR series), it is suggested that the distribution features of the forward-backward difference can sensitively capture the dynamical changes caused by diseases or aging. In particular, the index ED, which reflects the random degree of the forward-backward difference distribution, can significantly discriminate healthy subjects from diseased ones. We conclude that RR series from healthy subjects show more asymmetry in temporal structure on the original time scale from the perspective of equiprobable symbolization, whereas diseases account for loss of this asymmetry.
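The combination of equiprobable symbolization and forward/backward comparison can be sketched as follows. This is one possible asymmetry index (total variation distance between word distributions), not a reproduction of the paper's four indices, and the synthetic series are invented stand-ins for RR data.

```python
import numpy as np

def equiprobable_symbols(x, n_symbols=4):
    """Map the series to symbols 0..n_symbols-1 using its own quantiles,
    so every symbol occurs (nearly) equally often."""
    edges = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.searchsorted(edges, x)

def word_distribution(s, m, n_symbols=4):
    """Empirical distribution of embedded m-words of a symbol series."""
    counts = np.zeros(n_symbols ** m)
    powers = n_symbols ** np.arange(m)
    for i in range(len(s) - m + 1):
        counts[np.dot(s[i:i + m], powers)] += 1
    return counts / counts.sum()

def reversal_asymmetry(x, m=3, n_symbols=4):
    """Total variation distance between the forward and time-reversed
    m-word distributions: zero for time-reversible dynamics."""
    s = equiprobable_symbols(x, n_symbols)
    p_fwd = word_distribution(s, m, n_symbols)
    p_bwd = word_distribution(s[::-1], m, n_symbols)
    return 0.5 * np.abs(p_fwd - p_bwd).sum()

rng = np.random.default_rng(9)
iid = rng.normal(size=5000)                     # reversible by construction
saw = np.tile(np.linspace(0, 1, 10), 500)       # slow rise, abrupt fall
```

The sawtooth's slow-rise/fast-fall structure makes its forward and backward word statistics differ strongly, while an i.i.d. series yields only sampling-level asymmetry.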

  12. KALREF—A Kalman filter and time series approach to the International Terrestrial Reference Frame realization

    NASA Astrophysics Data System (ADS)

    Wu, Xiaoping; Abbondanza, Claudio; Altamimi, Zuheir; Chin, T. Mike; Collilieux, Xavier; Gross, Richard S.; Heflin, Michael B.; Jiang, Yan; Parker, Jay W.

    2015-05-01

The current International Terrestrial Reference Frame is based on a piecewise linear site motion model and realized by reference epoch coordinates and velocities for a global set of stations. Although linear motions due to tectonic plates and glacial isostatic adjustment dominate geodetic signals, at today's millimeter precisions, nonlinear motions due to earthquakes, volcanic activities, ice mass losses, sea level rise, hydrological changes, and other processes become significant. Monitoring these (sometimes rapid) changes demands consistent and precise realization of the terrestrial reference frame (TRF) quasi-instantaneously. Here, we use a Kalman filter and smoother approach to combine time series from four space geodetic techniques to realize an experimental TRF through weekly time series of geocentric coordinates. In addition to secular, periodic, and stochastic components for station coordinates, the Kalman filter state variables also include daily Earth orientation parameters and transformation parameters from input data frames to the combined TRF. Local tie measurements among colocated stations are used at their known or nominal epochs of observation, with comotion constraints applied to almost all colocated stations. The filter/smoother approach unifies different geodetic time series in a single geocentric frame. Fragmented and multitechnique tracking records at colocation sites are bridged together to form longer and coherent motion time series. While the time series approach to the TRF reflects the reality of a changing Earth more closely than the linear approximation model, the filter/smoother is computationally powerful and flexible enough to facilitate the incorporation of other data types and more advanced characterization of the stochastic behavior of geodetic time series.

  13. The multiscale analysis between stock market time series

    NASA Astrophysics Data System (ADS)

    Shi, Wenbin; Shang, Pengjian

    2015-11-01

This paper is devoted to multiscale cross-correlation analysis of stock market time series, where the multiscale DCCA cross-correlation coefficient as well as multiscale cross-sample entropy (MSCE) is applied. The multiscale DCCA cross-correlation coefficient is a realization of the DCCA cross-correlation coefficient on multiple scales. The results of this method present a good scaling characterization. More significantly, this method is able to group stock markets by area. Compared to the multiscale DCCA cross-correlation coefficient, MSCE presents a more remarkable scaling characterization, and the value for each log return of the financial time series decreases with increasing scale factor. However, the grouping results are not as good as those of the multiscale DCCA cross-correlation coefficient.

  14. Diagnosis of nonlinear systems using time series analysis

    SciTech Connect

    Hunter, N.F. Jr.

    1991-01-01

Diagnosis and analysis techniques for linear systems have been developed and refined to a high degree of precision. In contrast, techniques for the analysis of data from nonlinear systems are in the early stages of development. This paper describes a time series technique for the analysis of data from nonlinear systems. The input and response time series resulting from excitation of the nonlinear system are embedded in a state space. The form of the embedding is optimized using local canonical variate analysis and singular value decomposition techniques. From the state space model, future system responses are estimated. The expected degree of predictability of the system is investigated using the state transition matrix. The degree of nonlinearity present is quantified using the geometry of the transfer function poles in the z plane. Examples of application to a linear single-degree-of-freedom system, a single-degree-of-freedom Duffing oscillator, and linear and nonlinear three-degree-of-freedom oscillators are presented. 11 refs., 9 figs.

  15. Time series analysis of electron flux at geostationary orbit

    SciTech Connect

    Szita, S.; Rodgers, D.J.; Johnstone, A.D.

    1996-07-01

Time series of energetic (42.9–300 keV) electron flux data from the geostationary satellite Meteosat-3 show variability over various timescales. Of particular interest are the strong local time dependence of the flux data and the large flux peaks associated with particle injection events, which occur over a timescale of a few hours. Fourier analysis has shown that for this energy range, the average diurnal variation of the electron flux can be approximated by a combination of two sine waves with periods of 12 and 24 hours. The data have been further examined using wavelet analysis, which shows how the diurnal variation changes and where it appears most significant. The injection events have a characteristic appearance but do not occur in phase with one another and therefore do not show up in a Fourier spectrum. Wavelet analysis has been used to look for characteristic time scales for these events. © 1996 American Institute of Physics.
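The two-sine approximation of the diurnal variation amounts to a linear least-squares fit with sine/cosine pairs at 24 h and 12 h periods. A sketch on synthetic data (the amplitudes and noise level are invented, not Meteosat values):

```python
import numpy as np

# Fit mean + 24 h + 12 h sinusoids to a synthetic diurnal flux curve.
t = np.arange(0.0, 240.0, 0.5)                       # hours: 10 days at 30 min cadence
rng = np.random.default_rng(2)
flux = (5.0 + 2.0 * np.sin(2*np.pi*t/24 + 0.4)
            + 1.0 * np.sin(2*np.pi*t/12 - 1.0)
            + rng.normal(0.0, 0.3, t.size))

# Design matrix: constant term plus sine/cosine pairs at each period
A = np.column_stack([np.ones_like(t),
                     np.sin(2*np.pi*t/24), np.cos(2*np.pi*t/24),
                     np.sin(2*np.pi*t/12), np.cos(2*np.pi*t/12)])
coef, *_ = np.linalg.lstsq(A, flux, rcond=None)
amp24 = np.hypot(coef[1], coef[2])   # recovered 24 h amplitude (true value 2.0)
amp12 = np.hypot(coef[3], coef[4])   # recovered 12 h amplitude (true value 1.0)
```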

  16. Phase synchronization based minimum spanning trees for analysis of financial time series with nonlinear correlations

    NASA Astrophysics Data System (ADS)

    Radhakrishnan, Srinivasan; Duvvuru, Arjun; Sultornsanee, Sivarit; Kamarthi, Sagar

    2016-02-01

The cross correlation coefficient has been widely applied in financial time series analysis, in particular for understanding chaotic behaviour in terms of stock price and index movements during crisis periods. To better understand time series correlation dynamics, the cross correlation matrices are represented as networks, in which a node stands for an individual time series and a link indicates cross correlation between a pair of nodes. These networks are converted into simpler trees using different schemes. In this context, Minimum Spanning Trees (MST) are the most favoured tree structures because of their ability to preserve all the nodes and thereby retain essential information imbued in the network. Although cross correlations underlying MSTs capture essential information, they do not faithfully capture dynamic behaviour embedded in the time series data of financial systems, because cross correlation is a reliable measure only if the relationship between the time series is linear. To address this issue, this work investigates a new measure called phase synchronization (PS) for establishing correlations among different time series which relate to one another, linearly or nonlinearly. In this approach the strength of a link between a pair of time series (nodes) is determined by the level of phase synchronization between them. We compare the performance of the phase synchronization based MST with the cross correlation based MST along selected network measures, across a temporal frame that includes economically good and crisis periods. We observe agreement in the directionality of the results across these two methods. They show similar trends, upward or downward, when comparing selected network measures. 
Though both the methods give similar trends, the phase synchronization based MST is a more reliable representation of the dynamic behaviour of financial systems than the cross correlation based MST because of the former's ability to quantify nonlinear relationships among time
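A minimal sketch of the phase-synchronization MST idea described above, under the common assumptions that phases come from an FFT-based analytic signal (a numpy-only Hilbert transform) and that link strength is the phase-locking value. The three toy series and the distance 1 - PLV are illustrative choices, not the paper's data or exact metric.

```python
import numpy as np

def analytic_phase(x):
    """Instantaneous phase via an FFT-based analytic signal."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n//2] = 1.0
        h[1:n//2] = 2.0
    else:
        h[1:(n+1)//2] = 2.0
    return np.angle(np.fft.ifft(X * h))

def phase_sync(x, y):
    """Phase-locking value in [0, 1]: 1 means perfect phase synchronization."""
    dphi = analytic_phase(x) - analytic_phase(y)
    return np.abs(np.mean(np.exp(1j * dphi)))

def mst_edges(dist):
    """Prim's algorithm on a dense distance matrix; returns the MST edge list."""
    n = dist.shape[0]
    in_tree = [0]
    edges = []
    while len(in_tree) < n:
        best = None
        for i in in_tree:
            for j in range(n):
                if j not in in_tree and (best is None or dist[i, j] < best[2]):
                    best = (i, j, dist[i, j])
        edges.append(best[:2])
        in_tree.append(best[1])
    return edges

rng = np.random.default_rng(3)
t = np.linspace(0, 20*np.pi, 2000)
series = [np.sin(t) + 0.2*rng.normal(size=t.size),        # 0 and 1: same phase
          np.sin(t + 0.1) + 0.2*rng.normal(size=t.size),
          rng.normal(size=t.size)]                        # 2: unrelated noise
n = len(series)
d = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        if i != j:
            d[i, j] = 1.0 - phase_sync(series[i], series[j])  # strong sync -> short edge
print(mst_edges(d))   # the synchronized pair (0, 1) forms the first MST edge
```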

  17. On-line analysis of reactor noise using time-series analysis

    SciTech Connect

    McGevna, V.G.

    1981-10-01

A method to allow use of time series analysis for on-line noise analysis has been developed. On-line analysis of noise in nuclear power reactors has been limited primarily to spectral analysis and related frequency domain techniques. Time series analysis has many distinct advantages over spectral analysis in the automated processing of reactor noise. However, fitting an autoregressive-moving average (ARMA) model to time series data involves nonlinear least squares estimation. Unless a high speed, general purpose computer is available, the calculations become too time-consuming for on-line applications. To eliminate this problem, a special purpose algorithm was developed for fitting ARMA models. While it is based on a combination of steepest descent and Taylor series linearization, properties of the ARMA model are used so that the auto- and cross-correlation functions can be used to eliminate the need for estimating derivatives.
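The paper's special-purpose ARMA algorithm is not reproduced here, but the general idea of fitting from correlation functions rather than derivative estimates can be illustrated with the simpler Yule-Walker equations for a pure AR model:

```python
import numpy as np

def yule_walker(x, p):
    """Fit AR(p) coefficients from sample autocorrelations (Yule-Walker):
    a correlation-based fit that needs no derivative estimation."""
    x = x - x.mean()
    n = len(x)
    r = np.array([np.dot(x[:n-k], x[k:]) / n for k in range(p + 1)])
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])  # Toeplitz
    return np.linalg.solve(R, r[1:])

rng = np.random.default_rng(4)
n = 20000
x = np.zeros(n)
for t in range(2, n):                     # simulate AR(2): x_t = 0.6 x_{t-1} - 0.3 x_{t-2} + e_t
    x[t] = 0.6*x[t-1] - 0.3*x[t-2] + rng.normal()
phi = yule_walker(x, 2)
print(phi)   # recovers the AR(2) coefficients (0.6, -0.3) to within a few hundredths
```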

  18. Time series analysis for psychological research: examining and forecasting change

    PubMed Central

    Jebb, Andrew T.; Tay, Louis; Wang, Wei; Huang, Qiming

    2015-01-01

    Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials. PMID:26106341

  19. Time series analysis for psychological research: examining and forecasting change.

    PubMed

    Jebb, Andrew T; Tay, Louis; Wang, Wei; Huang, Qiming

    2015-01-01

    Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials. PMID:26106341

  20. Chaotic time series analysis in economics: Balance and perspectives

    SciTech Connect

    Faggini, Marisa

    2014-12-15

The aim of the paper is not to review the large body of work concerning nonlinear time series analysis in economics, about which much has been written, but rather to focus on the new techniques developed to detect chaotic behaviours in economic data. More specifically, our attention will be devoted to reviewing some of these techniques and their application to economic and financial data in order to understand why chaos theory, after a period of growing interest, no longer appears to be such an interesting and promising research area.

  1. Series analysis of Q-state checkerboard Potts models

    SciTech Connect

    Hansel, D.; Maillard, J.M.

    1988-12-01

The series analysis of the low temperature expansion of the checkerboard q-state Potts model in a magnetic field, initiated in two previous papers, is continued. On particular algebraic varieties of the parameter space (corresponding to, or generalizing, the so-called disorder solutions), the checkerboard Potts model and its Bethe approximation are indistinguishable as far as the partition function and its first order derivatives are concerned. The difference between the two models occurs for higher order derivatives. In particular, we give the exact expression of the (low temperature expansion of the) susceptibility of the checkerboard Ising model in zero magnetic field on one of these varieties.

  2. Analysis of some meteorological variables time series relevant in urban environments by applying the multifractal analysis

    NASA Astrophysics Data System (ADS)

    Pavon-Dominguez, Pablo; Ariza-Villaverde, Ana B.; Jimenez-Hornero, Francisco J.; Gutierrez de Rave, Eduardo

    2010-05-01

Time series of climate-related variables have frequently been studied using descriptive statistics. However, as several works have suggested, other approaches such as multifractal analysis can be applied to complement the information about climatic and environmental phenomena obtained from the standard methods. The main aim of this work was therefore to check whether some meteorological variables relevant in urban environments (i.e. air temperature, rainfall, relative humidity, solar radiation, and surface wind velocity and direction) exhibit a multifractal nature. The analysis was extended to several time scales, determining the multifractal parameters and exploring the relationships between them and those reported by the descriptive statistics. The daily time series studied in this work were recorded in Córdoba (37.85°N 4.85°W), southern Spain, from 2001 to 2006. The altitude of this location is 117 m, and its climate can be described as a mixture of Mediterranean characteristics and continental effects. The multifractal spectra showed convex shapes for all the considered variables, confirming the presence of a multifractal type of scaling that held for time resolutions ranging from one day to six years. In the case of rainfall, the range of time scales exhibiting a multifractal nature was narrower, due to the presence of many zeros in the daily data that characterizes the precipitation regime in some places of southern Spain. The multifractal spectra corresponding to surface wind velocity and rainfall showed longer left tails, implying greater heterogeneity in the high values of the time series. The multifractal spectra obtained for the rest of the meteorological variables exhibited the opposite behavior, meaning that the low values in the time series had more influence on the distribution variability. The presence of rare low values was significant for

  3. Three-dimensional Neumann-series approach to model light transport in nonuniform media

    PubMed Central

    Jha, Abhinav K.; Kupinski, Matthew A.; Barrett, Harrison H.; Clarkson, Eric; Hartman, John H.

    2014-01-01

We present the implementation, validation, and performance of a three-dimensional (3D) Neumann-series approach to model photon propagation in nonuniform media using the radiative transport equation (RTE). The RTE is implemented for nonuniform scattering media in a spherical harmonic basis for a diffuse-optical-imaging setup. The method is parallelizable and implemented on a computing system consisting of NVIDIA Tesla C2050 graphics processing units (GPUs). The GPU implementation provides a speedup of up to two orders of magnitude over the non-GPU implementation, which leads to good computational efficiency for the Neumann-series method. The results using the method are compared with results obtained using Monte Carlo simulations for various small-geometry phantoms, and good agreement is observed. We observe that the Neumann-series approach gives accurate results in many cases where the diffusion approximation is not accurate. PMID:23201945
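The core of any Neumann-series solver is the iteration x_{n+1} = b + K x_n for (I - K) x = b, which converges when the spectral radius of K is below one. A toy linear system standing in for the discretized RTE scattering operator:

```python
import numpy as np

# Build a random operator K scaled so the Neumann series b + Kb + K^2 b + ... converges.
rng = np.random.default_rng(5)
K = rng.normal(size=(50, 50))
K *= 0.4 / np.max(np.abs(np.linalg.eigvals(K)))   # spectral radius 0.4 < 1
b = rng.normal(size=50)

x = b.copy()
for _ in range(60):           # accumulate partial sums of the Neumann series
    x = b + K @ x

x_direct = np.linalg.solve(np.eye(50) - K, b)
print(np.max(np.abs(x - x_direct)))   # tiny residual: the series matches the direct solve
```

In the RTE setting each application of K is one more order of scattering, which is why the truncated series remains accurate where scattering is moderate.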

  4. Period analysis of hydrologic series through moving-window correlation analysis method

    NASA Astrophysics Data System (ADS)

    Xie, Yangyang; Huang, Qiang; Chang, Jianxia; Liu, Saiyan; Wang, Yimin

    2016-07-01

Period analysis is of great significance for understanding various hydrologic processes and predicting the future hydrological regime of a watershed or region. Hence, many period analysis methods, including the fast Fourier transform (FFT), maximum entropy spectral analysis (MESA) and wavelet analysis (WA), have been developed to study this issue. However, due to the complex components of hydrologic series and the limitations of these conventional methods, the problem remains difficult to solve. In this paper, the moving-window correlation analysis method (MWCA) is proposed for analyzing the periodic component of hydrologic series; it includes construction of periodic processes, significance testing of periods, and time-frequency analysis. Three commonly used methods (FFT, MESA and WA) and MWCA are employed to investigate the periods of synthetic series and observed hydrologic series, respectively. The results show that FFT, MESA and WA are not always as good as expected when detecting periods of a time series. By contrast, MWCA performs better in application: it can identify the true periods of a time series, extract the reliable periodic components, find the active time ranges of periodic components, and resist the disturbance of noise. In conclusion, MWCA is suitable for identifying the true periods of hydrologic time series.
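One plausible reading of moving-window correlation (correlating each window with the same window shifted by a candidate lag, then averaging) can be sketched as follows. The window length, lag range, and synthetic series are all assumptions, not the paper's MWCA specification.

```python
import numpy as np

def windowed_lag_correlation(x, lag, win):
    """Mean Pearson correlation between a moving window and the same window
    shifted by `lag` (a hypothetical moving-window correlation; sketch)."""
    cs = []
    for start in range(0, len(x) - lag - win, win):
        a = x[start:start + win]
        b = x[start + lag:start + lag + win]
        cs.append(np.corrcoef(a, b)[0, 1])
    return float(np.mean(cs))

rng = np.random.default_rng(6)
t = np.arange(6000)
x = np.sin(2*np.pi*t/36) + 0.3*rng.normal(size=t.size)   # hidden period of 36 steps

scores = {lag: windowed_lag_correlation(x, lag, win=144) for lag in range(10, 60)}
best_lag = max(scores, key=scores.get)
print(best_lag)   # at (or immediately adjacent to) the true period of 36 steps
```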

  5. Approaches to Language Testing. Advances in Language Testing Series: 2. Papers in Applied Linguistics.

    ERIC Educational Resources Information Center

    Spolsky, Bernard, Ed.

This volume, one in a series on modern language testing, collects four essays dealing with current approaches to language testing. The introduction traces the development of language testing theory and examines the role of linguistics in this area. "The Psycholinguistic Basis," by E. Ingram, discusses some interpretations of the term…

  6. Automatic change detection in time series of Synthetic Aperture Radar data using a scale-driven approach

    NASA Astrophysics Data System (ADS)

    Ajadi, O. A.; Meyer, F. J.

    2013-12-01

Automatic change detection and change classification from Synthetic Aperture Radar (SAR) images is a difficult task, mostly due to the high level of speckle noise inherent to SAR data and the highly non-Gaussian nature of the SAR amplitude information. Several approaches were developed in recent years to deal with SAR-specific change detection problems from image pairs and time series of images. Despite these considerable efforts, no satisfying solution to this problem has been found so far. In this paper we present a promising new algorithm for change detection from SAR that is based on a multi-scale analysis of a time series of SAR images. Our approach is composed of three steps: (1) data enhancement and filtering, (2) multi-scale change detection, and (3) time-series analysis of change detection maps. In the data enhancement and filtering step, we first form time series of ratio images by dividing all SAR images by a reference acquisition to suppress stationary image information and enhance change signatures. Several methods for reference image selection are discussed in the paper. The generated ratio images are further log-transformed to create near-Gaussian data and to convert the originally multiplicative noise into additive noise. A subsequent fast non-local means filter is applied to reduce image noise whilst preserving most of the image details. The filtered log-ratio images are then inserted into a multi-scale change detection algorithm that is composed of: (1) a multi-scale decomposition of the input images using a two-dimensional discrete stationary wavelet transform (2D-SWT); (2) a multi-resolution classification into 'change' and 'no-change' areas; and (3) a scale-driven fusion of the classification results. In a final time-series analysis step the multi-temporal change detection maps are analyzed to identify seasonal, gradual, and abrupt changes. The performance of the developed approach will be demonstrated by application to the
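Steps (1) and (2) can be illustrated in miniature: a log-ratio of two speckled images followed by a simple k-sigma threshold standing in for the multi-scale classifier. Synthetic 16-look speckle and a tripled-backscatter patch are invented for the sketch; the 2D-SWT decomposition and scale-driven fusion are omitted.

```python
import numpy as np

rng = np.random.default_rng(7)
shape = (128, 128)
backscatter = np.full(shape, 100.0)
changed = np.zeros(shape, dtype=bool)
changed[40:60, 40:60] = True                       # a patch whose backscatter triples

# multiplicative speckle modeled as unit-mean gamma noise (roughly 16-look intensity)
img1 = backscatter * rng.gamma(16.0, 1/16.0, shape)
img2 = np.where(changed, 3.0*backscatter, backscatter) * rng.gamma(16.0, 1/16.0, shape)

log_ratio = np.log(img2 / img1)                    # noise becomes additive after the log
thresh = log_ratio.mean() + 2.0*log_ratio.std()    # crude 2-sigma change threshold
detected = log_ratio > thresh

hits = (detected & changed).sum() / changed.sum()  # fraction of true change found
false = (detected & ~changed).mean()               # false-alarm rate elsewhere
```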

  7. Time series clustering analysis of health-promoting behavior

    NASA Astrophysics Data System (ADS)

    Yang, Chi-Ta; Hung, Yu-Shiang; Deng, Guang-Feng

    2013-10-01

Health promotion must be emphasized to achieve the World Health Organization goal of health for all. Since the global population is aging rapidly, the ComCare elder health-promoting service was developed by the Taiwan Institute for Information Industry in 2011. Based on the Pender health promotion model, the ComCare service offers five categories of health-promoting functions to address the everyday needs of seniors: nutrition management, social support, exercise management, health responsibility, and stress management. To assess the overall ComCare service and to improve understanding of the health-promoting behavior of elders, this study analyzed health-promoting behavioral data automatically collected by the ComCare monitoring system. In the 30,638 session records collected for 249 elders from January 2012 to March 2013, behavior patterns were identified by a fuzzy c-means time series clustering algorithm combined with autocorrelation-based representation schemes. The analysis showed that time series data for elder health-promoting behavior can be classified into four different clusters. Each type reveals different health-promoting needs, frequencies, function numbers and behaviors. The data analysis result can assist policymakers, health-care providers, and experts in medicine, public health, nursing and psychology, and has been provided to the Taiwan National Health Insurance Administration to assess elder health-promoting behavior.
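A plain fuzzy c-means implementation, of the kind the study combines with autocorrelation-based representations, can be sketched in a few lines. This is the generic algorithm on toy 2-D features, not the ComCare pipeline.

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, iters=100, seed=0):
    """Standard fuzzy c-means: soft memberships U (rows sum to 1) and centers."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)          # random initial memberships
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        # u_ij = 1 / sum_k (d_ij / d_ik)^(2/(m-1))
        U = 1.0 / (d ** (2/(m-1)) * np.sum(d ** (-2/(m-1)), axis=1, keepdims=True))
    return centers, U

rng = np.random.default_rng(8)
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)),   # two well-separated toy groups
               rng.normal(3.0, 0.3, (50, 2))])
centers, U = fuzzy_c_means(X, c=2)
labels = U.argmax(axis=1)                       # hard labels from soft memberships
```

In the study's setting, each row of `X` would be an autocorrelation-based representation of one elder's behavior series rather than raw coordinates.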

  8. On statistical inference in time series analysis of the evolution of road safety.

    PubMed

    Commandeur, Jacques J F; Bijleveld, Frits D; Bergel-Hayat, Ruth; Antoniou, Constantinos; Yannis, George; Papadimitriou, Eleonora

    2013-11-01

Data collected for building a road safety observatory usually include observations made sequentially through time. Examples of such data, called time series data, include the annual (or monthly) number of road traffic accidents, traffic fatalities or vehicle kilometers driven in a country, as well as the corresponding values of safety performance indicators (e.g., data on speeding, seat belt use, alcohol use, etc.). Some commonly used statistical techniques imply assumptions that are often violated by the special properties of time series data, namely serial dependency among disturbances associated with the observations. The first objective of this paper is to demonstrate the impact of such violations on the applicability of standard methods of statistical inference, which leads to an under- or overestimation of the standard error and consequently may produce erroneous inferences. Moreover, having established the adverse consequences of ignoring serial dependency issues, the paper aims to describe rigorous statistical techniques used to overcome them. In particular, appropriate time series analysis techniques of varying complexity are employed to describe the development over time, relating the accident occurrences to explanatory factors such as exposure measures or safety performance indicators, and forecasting the development into the near future. Traditional regression models (whether linear, generalized linear or nonlinear) are shown not to naturally capture the inherent dependencies in time series data. Dedicated time series analysis techniques, such as the ARMA-type and DRAG approaches, are discussed next, followed by structural time series models, which are a subclass of state space methods. The paper concludes with general recommendations and practice guidelines for the use of time series models in road safety research. PMID:23260716
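The serial-dependency point can be made concrete with a small Monte Carlo: when disturbances follow an AR(1) process, the naive OLS standard error of a trend estimate understates the true sampling spread. The parameters below are invented for illustration, not road-safety data.

```python
import numpy as np

rng = np.random.default_rng(9)
n, reps, phi = 100, 1000, 0.7
slopes = np.empty(reps)
naive_se = np.empty(reps)
t = np.arange(n, dtype=float)
A = np.column_stack([np.ones(n), t])            # intercept + linear trend
for r in range(reps):
    e = np.zeros(n)
    for k in range(1, n):
        e[k] = phi*e[k-1] + rng.normal()        # serially dependent disturbances
    y = 1.0 + 0.05*t + e                        # linear trend plus AR(1) noise
    coef, res, *_ = np.linalg.lstsq(A, y, rcond=None)
    slopes[r] = coef[1]
    sigma2 = res[0] / (n - 2)                   # naive residual variance
    naive_se[r] = np.sqrt(sigma2 * np.linalg.inv(A.T @ A)[1, 1])

true_sd = slopes.std()                 # actual sampling spread of the slope
print(true_sd / naive_se.mean())       # comfortably above 1: naive SE understates uncertainty
```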

  9. Analysis of graded-index optical fibers by the spectral parameter power series method

    NASA Astrophysics Data System (ADS)

    Castillo-Pérez, Raúl; Kravchenko, Vladislav V.; Torba, Sergii M.

    2015-02-01

The spectral parameter power series (SPPS) method is a recently introduced technique (Kravchenko 2008 Complex Var. Elliptic Equ. 53 775-89, Kravchenko and Porter 2010 Math. Methods Appl. Sci. 33 459-68) for solving linear differential equations and related spectral problems. In this work we develop an approach based on the SPPS for the analysis of graded-index optical fibers. The characteristic equation of the eigenvalue problem for the calculation of guided modes is obtained in analytical form in terms of SPPS. Truncating the series, and considering in this way the approximate characteristic equation, gives a simple and efficient numerical method for solving the problem. Comparison with the results obtained by other available techniques reveals clear advantages for the SPPS approach, in particular with regard to accuracy. Based on the solution of the eigenvalue problem, parameters describing the dispersion are analyzed as well.

  10. Weighted permutation entropy based on different symbolic approaches for financial time series

    NASA Astrophysics Data System (ADS)

    Yin, Yi; Shang, Pengjian

    2016-02-01

In this paper, we introduce weighted permutation entropy (WPE) and three different symbolic approaches to investigate the complexity of stock time series containing amplitude-coded information, and we explore the influence of the choice of symbolic approach on the resulting WPE. We apply WPE based on these symbolic approaches to the US and Chinese stock markets and compare the results for the two markets. All three symbolic approaches lower the WPE-measured complexity of the stock time series, whatever the embedding dimension. The similarity between stock markets can be detected by WPE based on the Binary Δ-coding method, while the differences between them are revealed by WPE based on the σ-method and the Max-min method. Combining WPE with the σ-method or the Max-min method reflects the multiscale structure of complexity through different time delays and distinguishes the complexities of stock time series in more detail and more accurately. Furthermore, the correlations between stock markets in the same region, and the similarities hidden in the S&P500 and DJI and in ShangZheng and ShenCheng, are uncovered by comparing the Binary Δ-coding based WPE of the six stock markets.
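A generic weighted permutation entropy can be sketched as follows. The variance weighting is one common choice; the paper's σ, Max-min, and Binary Δ-coding symbolizations are not reproduced here.

```python
import numpy as np
from collections import defaultdict
from math import log, factorial

def weighted_permutation_entropy(x, order=3, delay=1):
    """Normalized WPE: ordinal patterns weighted by the variance of their
    window, so large-amplitude excursions contribute more (sketch)."""
    weights = defaultdict(float)
    total = 0.0
    n = len(x) - (order - 1) * delay
    for i in range(n):
        win = x[i:i + (order - 1) * delay + 1:delay]
        pattern = tuple(np.argsort(win))        # ordinal pattern of the window
        w = float(np.var(win))                  # amplitude-sensitive weight
        weights[pattern] += w
        total += w
    probs = [w / total for w in weights.values() if w > 0]
    return -sum(p * log(p) for p in probs) / log(factorial(order))

rng = np.random.default_rng(10)
wpe_noise = weighted_permutation_entropy(rng.normal(size=5000))    # near 1 (all patterns)
wpe_trend = weighted_permutation_entropy(np.linspace(0, 1, 5000))  # 0 (single pattern)
```

Varying `delay` gives the multiscale structure mentioned above; larger embedding `order` refines the pattern alphabet at the cost of needing longer series.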

  11. Hybrid grammar-based approach to nonlinear dynamical system identification from biological time series

    NASA Astrophysics Data System (ADS)

    McKinney, B. A.; Crowe, J. E., Jr.; Voss, H. U.; Crooke, P. S.; Barney, N.; Moore, J. H.

    2006-02-01

    We introduce a grammar-based hybrid approach to reverse engineering nonlinear ordinary differential equation models from observed time series. This hybrid approach combines a genetic algorithm to search the space of model architectures with a Kalman filter to estimate the model parameters. Domain-specific knowledge is used in a context-free grammar to restrict the search space for the functional form of the target model. We find that the hybrid approach outperforms a pure evolutionary algorithm method, and we observe features in the evolution of the dynamical models that correspond with the emergence of favorable model components. We apply the hybrid method to both artificially generated time series and experimentally observed protein levels from subjects who received the smallpox vaccine. From the observed data, we infer a cytokine protein interaction network for an individual’s response to the smallpox vaccine.

  12. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    NASA Technical Reports Server (NTRS)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. Using the TSPT program, MODIS metadata is used to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating.
The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify

  13. Inorganic chemical analysis of environmental materials—A lecture series

    USGS Publications Warehouse

    Crock, J.G.; Lamothe, P.J.

    2011-01-01

At the request of the faculty of the Colorado School of Mines, Golden, Colorado, the authors prepared and presented a lecture series to the students of a graduate-level advanced instrumental analysis class. The slides and text presented in this report are a compilation and condensation of this series of lectures. The purpose of this report is to present the slides and notes and to emphasize the thought processes that should be used by a scientist submitting samples for analyses in order to procure analytical data to answer a research question. First and foremost, the analytical data generated can be no better than the samples submitted. The questions to be answered must first be well defined and the appropriate samples collected from the population that will answer the question. The proper methods of analysis, including proper sample preparation and digestion techniques, must then be applied. Care must be taken to achieve the required limits of detection of the critical analytes to yield detectable analyte concentrations (above "action" levels) for the majority of the study's samples, and to address what portion of those analytes answers the research question: total or partial concentrations. To guarantee a robust analytical result that answers the research question(s), a well-defined quality assurance and quality control (QA/QC) plan must be employed. This QA/QC plan must include the collection and analysis of field and laboratory blanks, sample duplicates, and matrix-matched standard reference materials (SRMs). The proper SRMs may include in-house materials and/or a selection of widely available commercial materials. A discussion of the preparation and applicability of in-house reference materials is also presented. Only when all these analytical issues are sufficiently addressed can the research questions be answered with known certainty.

  14. Feature extraction for change analysis in SAR time series

    NASA Astrophysics Data System (ADS)

    Boldt, Markus; Thiele, Antje; Schulz, Karsten; Hinz, Stefan

    2015-10-01

In remote sensing, change detection represents a broad field of research. If time series data are available, change detection can be used for monitoring applications. These applications require regular image acquisitions at an identical time of day over a defined period. Among remote sensing sensors, radar is especially well suited for applications requiring regularity, since it is independent of most weather and atmospheric influences. Furthermore, regarding the image acquisitions, the time of day plays no role due to the independence from daylight. Since 2007, the German SAR (Synthetic Aperture Radar) satellite TerraSAR-X (TSX) has permitted the acquisition of high resolution radar images suitable for the analysis of dense built-up areas. In a former study, we presented the change analysis of the Stuttgart (Germany) airport. The aim of this study is the categorization of detected changes in the time series. This categorization is motivated by the fact that it is a poor statement only to describe where and when a specific area has changed; at least as important is the statement about what has caused the change. The focus is set on the analysis of so-called high activity areas (HAA), representing areas changing at least four times along the investigated period. As a first step in categorizing these HAAs, the matching HAA changes (blobs) have to be identified. Afterwards, operating at this object-based blob level, several features are extracted, comprising shape-based, radiometric, statistical and morphological values and one context feature based on a segmentation of the HAAs. This segmentation builds on morphological differential attribute profiles (DAPs). Seven context classes are established: urban, infrastructure, rural stable, rural unstable, natural, water and unclassified. A specific HA blob is assigned to one of these classes by analyzing the CovAmCoh time series signature of the surrounding segments. 
In combination, also surrounding GIS information

  15. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations while suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it, an improved and generalized version of Bayesian Blocks [Scargle 1998], that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by [Arias-Castro, Donoho and Huo 2003]. In the spirit of Reproducible Research [Donoho et al. 2008], all of the code and data necessary to reproduce all of the figures in this paper are included as auxiliary material.
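The segmentation described above is driven by a compact dynamic program. The following stdlib-Python sketch implements the optimal-partition recursion for time-tagged event data using the maximum-likelihood block fitness N ln(N/T); the constant prior `ncp_prior = 4.0` is an illustrative choice, not the calibrated value discussed in the paper.

```python
import math

def bayesian_blocks(times, ncp_prior=4.0):
    """Optimal piecewise-constant segmentation of event times.

    Illustrative sketch: the fitness of a block containing N events over
    duration T is N*ln(N/T); ncp_prior penalizes each additional block.
    Returns the start-cell indices of the optimal blocks.
    """
    t = sorted(times)
    n = len(t)
    # Cell edges: the data range split at midpoints between events.
    edges = [t[0]] + [(t[i] + t[i + 1]) / 2 for i in range(n - 1)] + [t[-1]]
    best = [0.0] * n   # best[r]: max total fitness of cells 0..r
    last = [0] * n     # last[r]: start cell of the final block ending at r
    for r in range(n):
        fit_max, l_max = -float("inf"), 0
        for l in range(r + 1):
            width = edges[r + 1] - edges[l]
            if width <= 0:
                continue
            cnt = r - l + 1
            total = cnt * math.log(cnt / width) - ncp_prior
            if l > 0:
                total += best[l - 1]
            if total > fit_max:
                fit_max, l_max = total, l
        best[r], last[r] = fit_max, l_max
    # Backtrack the optimal change points.
    change_points, i = [], n
    while i > 0:
        change_points.append(last[i - 1])
        i = last[i - 1]
    change_points.reverse()
    return change_points
```

Run on events whose rate drops sharply, the algorithm places a change point near the rate transition; on a constant-rate stream it returns a single block.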

  17. Time series analysis of waterfowl species number change

    NASA Astrophysics Data System (ADS)

    Mengjung Chou, Caroline; Da-Wei Tsai, David; Honglay Chen, Paris

    2014-05-01

    The objective of this study is to analyze the time series of waterfowl species numbers in the Da-du estuary, which was designated an Important Bird Area (IBA) by BirdLife International in 2004. The multiplicative decomposition method was adopted to determine the species variations, including the long-term trend (T), seasonal (S), cyclical (C), and irregular (I) components. The results indicated: (1) the long-term trend decreased with time from 1989 to 2012; (2) there were two seasonal peaks, in April and November each year, with the lowest point in June; moreover, since winter visitors dominated the total species numbers, the seasonal changes depended mainly on the winter birds' migration; (3) the waterfowl gradually recovered from the lowest point in 1996, but the difference between 1989 and 2003 indicated that an irreversible effect already existed; (4) the irregular variation was shown to be randomly distributed by several statistical tests, including a normality test, homogeneity of variance, an independence test, and the variation probability method, used to portray the characteristics of the distributions and to demonstrate their randomness. Consequently, this study showed that time series analysis methods present the waterfowl species changes numerically and reasonably well, and the results could be valuable data for research on ecosystem succession and anthropogenic impacts in the estuary.
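A minimal sketch of the classical multiplicative decomposition used above (y = T x S x C x I), assuming a centered moving average for the trend and mean detrended ratios for the seasonal indices; the period and toy data below are illustrative, not the study's bird counts.

```python
def decompose_multiplicative(y, period=12):
    """Classical multiplicative decomposition y = T * S * (C*I).

    Illustrative sketch: trend by centered moving average,
    seasonal indices by averaging detrended ratios per season.
    Returns (trend list with None at the edges, seasonal indices).
    """
    n = len(y)
    half = period // 2
    trend = [None] * n
    for i in range(half, n - half):
        if period % 2 == 0:
            # Centered 2 x period moving average for even periods.
            window = sum(y[i - half:i + half]) + sum(y[i - half + 1:i + half + 1])
            trend[i] = window / (2 * period)
        else:
            trend[i] = sum(y[i - half:i + half + 1]) / period
    # Seasonal index: mean ratio y/T for each position in the cycle.
    ratios = [[] for _ in range(period)]
    for i in range(n):
        if trend[i]:
            ratios[i % period].append(y[i] / trend[i])
    seasonal = [sum(r) / len(r) if r else 1.0 for r in ratios]
    # Normalize the indices so they average to 1.
    mean_s = sum(seasonal) / period
    seasonal = [s / mean_s for s in seasonal]
    return trend, seasonal
```

On a synthetic series with a linear trend and a known seasonal pattern, the recovered seasonal indices closely match the pattern used to generate the data.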

  18. Microvascular decompression for glossopharyngeal neuralgia through a microasterional approach: A case series

    PubMed Central

    Revuelta-Gutiérrez, Rogelio; Morales-Martínez, Andres Humberto; Mejías-Soto, Carolina; Martínez-Anda, Jaime Jesús; Ortega-Porcayo, Luis Alberto

    2016-01-01

    Background: Glossopharyngeal neuralgia (GPN) is an uncommon craniofacial pain syndrome. It is characterized by a sudden-onset lancinating pain usually localized in the sensory distribution of the IX cranial nerve, associated with excessive vagal outflow, which leads to bradycardia, hypotension, syncope, or cardiac arrest. This study aims to review our surgical experience performing microvascular decompression (MVD) in patients with GPN. Methods: Over the last 20 years, 14 consecutive cases were diagnosed with GPN. MVD using a microasterional approach was performed in all patients. Demographic data, clinical presentation, surgical findings, clinical outcome, complications, and long-term follow-up were reviewed. Results: The median age of onset was 58.7 years. The mean time from onset of symptoms to treatment was 8.8 years. Glossopharyngeal and vagus nerve compression was caused by the posterior inferior cerebellar artery in eleven cases (78.5%), the vertebral artery in two cases (14.2%), and the choroid plexus in one case (7.1%). Postoperative mean follow-up was 26 months (3-180 months). Pain analysis demonstrated long-term pain improvement of 114 ± 27.1 months and pain remission in 13 patients (92.9%) (P = 0.0001). Two complications were documented: one patient had a cerebrospinal fluid leak, and another had bacterial meningitis. There was no surgical mortality. Conclusions: GPN is a rare entity, and secondary causes should be discarded. MVD through a retractorless microasterional approach is a safe and effective technique. Our series demonstrated an excellent clinical outcome with pain remission in 92.9%. PMID:27213105

  19. A three-phase series-parallel resonant converter -- analysis, design, simulation, and experimental results

    SciTech Connect

    Bhat, A.K.S.; Zheng, R.L.

    1996-07-01

    A three-phase dc-to-dc series-parallel resonant converter is proposed, and its operating modes for a 180° wide gating pulse scheme are explained. A detailed analysis of the converter using a constant-current model and the Fourier series approach is presented. Based on the analysis, design curves are obtained and a design example of a 1-kW converter is given. SPICE simulation results for the designed converter and experimental results for a 500-W converter are presented to verify the performance of the proposed converter under varying load conditions. The converter operates in lagging power factor (PF) mode for the entire load range and requires only a narrow variation in switching frequency to adequately regulate the output power.

  20. Identification of statistical patterns in complex systems via symbolic time series analysis.

    PubMed

    Gupta, Shalabh; Khatkhate, Amol; Ray, Asok; Keller, Eric

    2006-10-01

    Identification of statistical patterns from observed time series of spatially distributed sensor data is critical for performance monitoring and decision making in human-engineered complex systems, such as electric power generation, petrochemical, and networked transportation. This paper presents an information-theoretic approach to identification of statistical patterns in such systems, where the main objective is to enhance structural integrity and operation reliability. The core concept of pattern identification is built upon the principles of Symbolic Dynamics, Automata Theory, and Information Theory. To this end, a symbolic time series analysis method has been formulated and experimentally validated on a special-purpose test apparatus that is designed for data acquisition and real-time analysis of fatigue damage in polycrystalline alloys. PMID:17063932
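The symbolization step at the core of such analysis can be sketched in a few lines: partition the signal range into a small alphabet, estimate the symbol distribution, and track a divergence from a nominal distribution as the anomaly measure. This is a generic stdlib-Python illustration, not the authors' exact construction; the uniform partitioning and total-variation distance are assumed choices.

```python
def symbolize(signal, n_symbols=8):
    """Map a real-valued signal to symbols via uniform partitioning."""
    lo, hi = min(signal), max(signal)
    width = (hi - lo) / n_symbols or 1.0  # guard against a constant signal
    return [min(int((x - lo) / width), n_symbols - 1) for x in signal]

def symbol_distribution(symbols, n_symbols=8):
    """Relative frequency of each symbol."""
    counts = [0] * n_symbols
    for s in symbols:
        counts[s] += 1
    total = len(symbols)
    return [c / total for c in counts]

def anomaly_measure(p_nominal, p_current):
    """Total-variation distance between two symbol distributions."""
    return 0.5 * sum(abs(a - b) for a, b in zip(p_nominal, p_current))
```

In a monitoring setting, `p_nominal` would be estimated from data collected in the healthy condition, and the measure tracked as new windows of data arrive.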

  1. Motion Artifact Reduction in Ultrasound Based Thermal Strain Imaging of Atherosclerotic Plaques Using Time Series Analysis

    PubMed Central

    Dutta, Debaditya; Mahmoud, Ahmed M.; Leers, Steven A.; Kim, Kang

    2013-01-01

    Large lipid pools in vulnerable plaques can, in principle, be detected using US-based thermal strain imaging (US-TSI). One practical challenge for in vivo cardiovascular application of US-TSI is that the thermal strain is masked by the mechanical strain caused by cardiac pulsation. ECG gating is a widely adopted method for cardiac motion compensation, but it is often susceptible to electrical and physiological noise. In this paper, we present an alternative time series analysis approach to separate thermal strain from mechanical strain without using ECG. The performance and feasibility of the time series analysis technique were tested via numerical simulation as well as in vitro water tank experiments using a vessel-mimicking phantom and an excised human atherosclerotic artery, where cardiac pulsation is simulated by a pulsatile pump. PMID:24808628

  2. Visibility graph network analysis of gold price time series

    NASA Astrophysics Data System (ADS)

    Long, Yu

    2013-08-01

    Mapping time series into a visibility graph network, the characteristics of the gold price time series and the return series, and the mechanism underlying gold price fluctuation, have been explored from the perspective of complex network theory. The degree distribution, which changes from a power law to an exponential law when the original series is shuffled, and the average path length, which changes from L ∼ ln N to ln L ∼ ln N upon shuffling, demonstrate that the price series and the return series are both long-range dependent fractal series. The relation of the Hurst exponent to the power-law exponent of the degree distribution demonstrates that the logarithmic price series is a fractional Brownian series and the logarithmic return series is a fractional Gaussian series. The power-law exponent of the degree distribution in a moving time window demonstrates that the logarithmic gold price series is a multifractal series. The power-law average clustering coefficient demonstrates that the gold price visibility graph is a hierarchy network. The hierarchy character, in light of the correspondence of the graph to price fluctuation, means that gold price fluctuation has a hierarchy structure, which appears to be in agreement with Elliott's empirical Wave Theory of stock price fluctuation, and the local-rule growth theory of hierarchy networks suggests that the hierarchy structure of gold price fluctuation originates from persistent short-term factors, such as short-term speculation.
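The natural visibility graph mapping itself is simple to state: two samples are connected if the straight line between them passes above every intermediate sample. A minimal stdlib-Python sketch, assuming uniform sampling (the O(n^2) scan is for clarity, not efficiency):

```python
def visibility_graph(series):
    """Natural visibility graph of a uniformly sampled series.

    Edge (a, b) exists iff every intermediate point c satisfies
    y_c < y_b + (y_a - y_b) * (b - c) / (b - a).
    """
    n = len(series)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                series[c] < series[b] + (series[a] - series[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.add((a, b))
    return edges

def degrees(edges, n):
    """Degree sequence, from which a degree distribution can be built."""
    deg = [0] * n
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    return deg
```

The degree sequence of the resulting graph is what the power-law versus exponential analysis above operates on.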

  3. Homogenization of atmospheric pressure time series recorded at VLBI stations using a segmentation LASSO approach

    NASA Astrophysics Data System (ADS)

    Balidakis, Kyriakos; Heinkelmann, Robert; Lu, Cuixian; Soja, Benedikt; Karbon, Maria; Nilsson, Tobias; Glaser, Susanne; Andres Mora-Diaz, Julian; Anderson, James; Liu, Li; Raposo-Pulido, Virginia; Xu, Minghui; Schuh, Harald

    2015-04-01

    Time series of meteorological parameters recorded at VLBI (Very Long Baseline Interferometry) observatories allow us to realistically model, and consequently to eliminate to a large extent, the atmosphere-induced effects in the VLBI products. Nevertheless, this advantage of VLBI is not fully exploited, since such information is contaminated with inconsistencies, such as uncertainties regarding the calibration and location of the meteorological sensors, outliers, missing data points, and breaks. It has been shown that such inconsistencies in meteorological data used for VLBI data analysis impose problems on the geodetic products (e.g., vertical site position) and result in errors in geophysical interpretation. The aim of the procedure followed here is to optimally model the tropospheric delay and bending effects that are still the main sources of error in VLBI data analysis. In this study, the meteorological data recorded with sensors mounted in the vicinity of VLBI stations have been homogenized over the period from 1979 until today. To meet this objective, inhomogeneities were detected and adjusted using test results and metadata. The approaches employed include Alexandersson's Standard Normal Homogeneity Test and an iterative procedure whose segmentation part is based on a dynamic programming algorithm and whose functional part is based on a LASSO (Least Absolute Shrinkage and Selection Operator) estimator. To provide the reference time series necessary for the aforementioned methods, ECMWF's (European Centre for Medium-Range Weather Forecasts) ERA-Interim reanalysis surface data were employed. Special care was taken regarding the datum definition of this model. Due to the significant height difference between the VLBI antenna's reference point and the elevation included in the geopotential fields of the specific numerical weather models, a hypsometric adjustment is applied using the absolute pressure level from the WMO
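Alexandersson's Standard Normal Homogeneity Test mentioned above can be sketched compactly: standardize the candidate-minus-reference difference series and scan for the break point k maximizing T(k) = k*z1^2 + (n-k)*z2^2, where z1 and z2 are the means of the standardized series before and after k. A stdlib-Python illustration (the critical-value lookup needed for a significance decision is omitted):

```python
import math

def snht_statistic(diff):
    """SNHT statistic profile for a difference (candidate - reference) series.

    Returns (list of T(k) for k = 1..n-1, index of the most likely break).
    """
    n = len(diff)
    mean = sum(diff) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in diff) / n)
    z = [(x - mean) / std for x in diff]
    t_stat = []
    for k in range(1, n):
        z1 = sum(z[:k]) / k          # mean of standardized values before k
        z2 = sum(z[k:]) / (n - k)    # mean after k
        t_stat.append(k * z1 ** 2 + (n - k) * z2 ** 2)
    k_best = max(range(len(t_stat)), key=t_stat.__getitem__) + 1
    return t_stat, k_best
```

A clean step change in the difference series produces a sharp maximum of T(k) exactly at the break.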

  4. Multifractal detrended fluctuation analysis of Pannonian earthquake magnitude series

    NASA Astrophysics Data System (ADS)

    Telesca, Luciano; Toth, Laszlo

    2016-04-01

    The multifractality of the series of magnitudes of earthquakes that occurred in the Pannonian region from 2002 to 2012 has been investigated. The shallow (depth less than 40 km) and deep (depth greater than 70 km) seismic catalogues were analysed using multifractal detrended fluctuation analysis. The two catalogues are characterized by different multifractal properties: (i) the magnitudes of the shallow events are weakly persistent, while those of the deep events are almost uncorrelated; (ii) the deep catalogue is more multifractal than the shallow one; (iii) the magnitudes of the deep catalogue are characterized by a right-skewed multifractal spectrum, while that of the shallow magnitudes is rather symmetric; and (iv) a direct relationship between the b-value of the Gutenberg-Richter law and the multifractality of the magnitudes is suggested.
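A minimal sketch of the multifractal DFA fluctuation function F_q(s) underlying such an analysis, with linear detrending of the profile in non-overlapping windows; the window handling (e.g., scanning the series from both ends) is simplified relative to full MFDFA.

```python
import math

def mfdfa_fluctuation(series, scale, q):
    """q-th order DFA fluctuation F_q(s) at window size s (linear detrend)."""
    n = len(series)
    mean = sum(series) / n
    # Profile: cumulative sum of deviations from the mean.
    profile, acc = [], 0.0
    for x in series:
        acc += x - mean
        profile.append(acc)
    variances = []
    for start in range(0, n - scale + 1, scale):
        seg = profile[start:start + scale]
        # Least-squares linear fit over t = 0..scale-1.
        t_mean = (scale - 1) / 2
        y_mean = sum(seg) / scale
        sxy = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(seg))
        sxx = sum((t - t_mean) ** 2 for t in range(scale))
        slope = sxy / sxx
        resid = [y - (y_mean + slope * (t - t_mean)) for t, y in enumerate(seg)]
        variances.append(sum(r * r for r in resid) / scale)
    if q == 0:  # logarithmic average, the q -> 0 limit
        return math.exp(0.5 * sum(math.log(v) for v in variances) / len(variances))
    return (sum(v ** (q / 2) for v in variances) / len(variances)) ** (1 / q)
```

Fitting log F_q(s) against log s over a range of scales gives the generalized Hurst exponents h(q); a spread of h(q) across q is the signature of multifractality.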

  5. Multidimensional stock network analysis: An Escoufier's RV coefficient approach

    NASA Astrophysics Data System (ADS)

    Lee, Gan Siew; Djauhari, Maman A.

    2013-09-01

    The current practice of stock network analysis is based on the assumption that the time series of closing prices can represent the behaviour of each stock. This assumption leads to the minimal spanning tree (MST) and the sub-dominant ultrametric (SDU) being considered indispensable tools to filter the economic information contained in the network. Recently, researchers have attempted to represent a stock not as a univariate time series of closing prices but as a bivariate time series of closing price and volume. In that case, they developed the so-called multidimensional MST to filter the important economic information. However, in this paper we show that their approach is applicable only to such bivariate time series. This leads us to introduce a new methodology for constructing an MST in which each stock is represented by a multivariate time series. An example from the Malaysian stock exchange is presented and discussed to illustrate the advantages of the method.
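The univariate filtering step that the paper generalizes can be sketched directly: map pairwise correlations to the metric d = sqrt(2(1 - rho)) and extract the MST, here with Prim's algorithm. The multivariate extension would replace the Pearson correlation with Escoufier's RV coefficient between whole data matrices; that substitution is not shown here.

```python
import math

def pearson(x, y):
    """Pearson correlation of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

def mst_from_series(series_list):
    """MST over the correlation metric d_ij = sqrt(2*(1 - rho_ij)) (Prim)."""
    n = len(series_list)
    dist = [[math.sqrt(2 * (1 - pearson(series_list[i], series_list[j])))
             if i != j else 0.0 for j in range(n)] for i in range(n)]
    in_tree = {0}
    edges = []
    while len(in_tree) < n:
        # Cheapest edge from the tree to a new node.
        best = min(((dist[i][j], i, j) for i in in_tree for j in range(n)
                    if j not in in_tree), key=lambda e: e[0])
        edges.append((best[1], best[2]))
        in_tree.add(best[2])
    return edges
```

Perfectly correlated stocks sit at distance 0 and are joined first; anti-correlated ones sit at the maximal distance 2.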

  6. A non linear analysis of human gait time series based on multifractal analysis and cross correlations

    NASA Astrophysics Data System (ADS)

    Muñoz-Diosdado, A.

    2005-01-01

    We analyzed databases with gait time series of healthy adults and persons with Parkinson's disease, Huntington's disease, and amyotrophic lateral sclerosis (ALS). We obtained staircase graphs of accumulated events that can be bounded by a straight line whose slope can be used to distinguish between gait time series from healthy and ill persons. The global Hurst exponents of these series do not show clear tendencies; we suggest this is because some gait time series have monofractal behavior and others have multifractal behavior, so they cannot be characterized by a single Hurst exponent. We calculated the multifractal spectra and their widths and found that the spectra of healthy young persons are almost monofractal, while the spectra of ill persons are wider than those of healthy persons. In contrast to interbeat time series, where pathology implies a loss of multifractality, in gait time series multifractal behavior emerges with pathology. Data were collected from healthy and ill subjects as they walked in a roughly circular path, with sensors on both feet, giving one time series for the left foot and another for the right foot. First, we analyzed these time series separately; then we compared the results, both directly and with a cross-correlation analysis, to find differences between the two series that could serve as indicators of equilibrium problems.

  7. Time series analysis of diverse extreme phenomena: universal features

    NASA Astrophysics Data System (ADS)

    Eftaxias, K.; Balasis, G.

    2012-04-01

    The field of complex systems holds that the dynamics of complex systems are founded on universal principles that may be used to describe a great variety of scientific and technological approaches to different types of natural, artificial, and social systems. We suggest that earthquake, epileptic seizure, solar flare, and magnetic storm dynamics can be analyzed within similar mathematical frameworks. A central property of the generation of these extreme events is the occurrence of coherent large-scale collective behavior with a very rich structure, resulting from repeated nonlinear interactions among the corresponding constituents. Consequently, we apply Tsallis nonextensive statistical mechanics, as it provides an appropriate framework for investigating universal principles of their generation. First, we examine the data in terms of Tsallis entropy, aiming to discover common "pathological" symptoms of transition to a significant shock. By monitoring the temporal evolution of the degree of organization in the time series, we observe similar distinctive features revealing a significant reduction of complexity during their emergence. Second, a model for earthquake dynamics derived from first principles within the nonextensive Tsallis formalism has recently been introduced. This approach leads to an energy distribution function (a Gutenberg-Richter-type law) for the magnitude distribution of earthquakes, providing an excellent fit to seismicities generated in various large geographic areas usually identified as seismic regions. We show that this function is able to describe the energy distribution (with a similar nonextensive q-parameter) of solar flares, magnetic storms, and epileptic and earthquake shocks. This evidence of universal statistical behavior suggests the possibility of a common approach for studying space weather, earthquakes, and epileptic seizures.
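The Tsallis entropy used to monitor the degree of organization has a one-line definition, S_q = (1 - sum_i p_i^q) / (q - 1), which recovers the Shannon entropy in the limit q -> 1. A stdlib-Python sketch; how the probabilities p_i are obtained from the signal (e.g., by histogram binning) is an assumption left to the application:

```python
import math

def tsallis_entropy(p, q):
    """S_q = (1 - sum(p_i^q)) / (q - 1); Shannon entropy as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p if pi > 0)) / (q - 1.0)
```

A uniform distribution maximizes S_q, while a fully concentrated one gives S_q = 0, so a drop in S_q over time signals increasing organization of the underlying dynamics.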

  8. Time series modeling by a regression approach based on a latent process.

    PubMed

    Chamroukhi, Faicel; Samé, Allou; Govaert, Gérard; Aknin, Patrice

    2009-01-01

    Time series are used in many domains, including finance, engineering, economics, and bioinformatics, generally to represent the change of a measurement over time. Modeling techniques may then be used to give a synthetic representation of such data. A new approach for time series modeling is proposed in this paper. It consists of a regression model incorporating a discrete hidden logistic process that allows different polynomial regression models to be activated smoothly or abruptly. The model parameters are estimated by the maximum likelihood method via a dedicated Expectation Maximization (EM) algorithm. The M step of the EM algorithm uses a multi-class Iterative Reweighted Least-Squares (IRLS) algorithm to estimate the hidden process parameters. To evaluate the proposed approach, an experimental study on simulated data and real-world data was performed using two alternative approaches: a heteroskedastic piecewise regression model using a global optimization algorithm based on dynamic programming, and a Hidden Markov Regression Model whose parameters are estimated by the Baum-Welch algorithm. Finally, in the context of the remote monitoring of components of the French railway infrastructure, and more particularly the switch mechanism, the proposed approach has been applied to modeling and classifying time series representing the condition measurements acquired during switch operations. PMID:19616918

  9. Novex Analysis: A Cognitive Science Approach to Instructional Design.

    ERIC Educational Resources Information Center

    Taylor, James C.

    1994-01-01

    Describes the Novice-Expert (Novex) Analysis, a nine-step approach to instructional design aimed at effecting the shift from novice to expert by creating a series of learning activities to enable novices to construct an expert knowledge base. The "Dimensions of Processing Model" diagram illustrates how learning processes influence memory.…

  10. Finite element techniques in computational time series analysis of turbulent flows

    NASA Astrophysics Data System (ADS)

    Horenko, I.

    2009-04-01

    In recent years there has been a considerable increase of interest in the mathematical modeling and analysis of complex systems that undergo transitions between several phases or regimes. Such systems can be found, e.g., in weather forecasting (transitions between weather conditions), climate research (ice ages and warm ages), computational drug design (conformational transitions), and econometrics (e.g., transitions between different phases of the market). In all cases, the accumulation of sufficiently detailed time series has led to the formation of huge databases, containing enormous but still undiscovered treasures of information. However, the extraction of essential dynamics and the identification of phases is usually hindered by the multidimensional nature of the signal, i.e., the information is "hidden" in the time series. Standard filtering approaches (e.g., wavelet-based spectral methods) have in general infeasible numerical complexity in high dimensions, while other standard methods (e.g., Kalman filter, MVAR, ARCH/GARCH) impose strong assumptions about the type of the underlying dynamics. An approach based on optimization of a specially constructed regularized functional (describing the quality of data description in terms of a certain number of specified models) will be introduced. Based on this approach, several new adaptive mathematical methods for simultaneous EOF/SSA-like data-based dimension reduction and identification of hidden phases in high-dimensional time series will be presented. The methods exploit the topological structure of the analysed data and do not impose severe assumptions on the underlying dynamics. Special emphasis will be placed on the mathematical assumptions and numerical cost of the constructed methods. The application of the presented methods will first be demonstrated on a toy example, and the results will be compared with those obtained by standard approaches. The importance of accounting for the mathematical

  11. REDFIT-X: Cross-spectral analysis of unevenly spaced paleoclimate time series

    NASA Astrophysics Data System (ADS)

    Björg Ólafsdóttir, Kristín; Schulz, Michael; Mudelsee, Manfred

    2016-06-01

    Cross-spectral analysis is commonly used in climate research to identify joint variability between two variables and to assess the phase (lead/lag) between them. Here we present a Fortran 90 program (REDFIT-X) that is specially developed to perform cross-spectral analysis of unevenly spaced paleoclimate time series. The data properties of climate time series that must be taken into account include data spacing (unequal time scales and/or uneven spacing between time points) and persistence in the data. The Lomb-Scargle Fourier transform is used for cross-spectral analysis of two time series with unequal and/or uneven time scales, and the persistence in the data is taken into account when estimating the uncertainty associated with the cross-spectral estimates. We use a Monte Carlo approach to estimate the uncertainty associated with coherency and phase: the false-alarm level is estimated from the empirical distribution of coherency estimates, and confidence intervals for the phase angle are formed from the empirical distribution of the phase estimates. The method is validated by comparing the Monte Carlo uncertainty estimates with traditionally used measures. Examples are given where the method is applied to paleoceanographic time series.
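The Lomb-Scargle periodogram underlying these cross-spectral estimates can be written down directly for a single unevenly sampled series. A stdlib-Python sketch of the classical normalization at one angular frequency, with the time offset tau that makes the estimate invariant to time shifts (this is the single-series building block, not REDFIT-X's full cross-spectral machinery):

```python
import math

def lomb_scargle(t, y, omega):
    """Lomb-Scargle power at angular frequency omega for uneven samples y(t)."""
    # Time offset tau: tan(2*omega*tau) = sum(sin 2wt) / sum(cos 2wt).
    tau = math.atan2(sum(math.sin(2 * omega * ti) for ti in t),
                     sum(math.cos(2 * omega * ti) for ti in t)) / (2 * omega)
    c = [math.cos(omega * (ti - tau)) for ti in t]
    s = [math.sin(omega * (ti - tau)) for ti in t]
    yc = sum(yi * ci for yi, ci in zip(y, c))
    ys = sum(yi * si for yi, si in zip(y, s))
    cc = sum(ci * ci for ci in c)
    ss = sum(si * si for si in s)
    return 0.5 * (yc * yc / cc + ys * ys / ss)
```

Evaluated over a grid of frequencies, the power peaks sharply at the frequency of an embedded sinusoid even when the sampling times are irregular.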

  12. A Systematic Review of Methodology: Time Series Regression Analysis for Environmental Factors and Infectious Diseases

    PubMed Central

    Imai, Chisato; Hashizume, Masahiro

    2015-01-01

    Background: Time series analysis is suitable for investigations of relatively direct and short-term effects of exposures on outcomes. In environmental epidemiology studies, this method has been one of the standard approaches to assess impacts of environmental factors on acute non-infectious diseases (e.g. cardiovascular deaths), conventionally with generalized linear or additive models (GLM and GAM). However, the same analysis practices are often observed with infectious diseases despite substantial differences from non-infectious diseases that may result in analytical challenges. Methods: Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, a systematic review was conducted to elucidate important issues in assessing the associations between environmental factors and infectious diseases using time series analysis with GLM and GAM. Published studies on the associations between weather factors and malaria, cholera, dengue, and influenza were targeted. Findings: Our review raised issues regarding the estimation of susceptible populations and exposure lag times, the adequacy of seasonal adjustments, the presence of strong autocorrelations, and the lack of a smaller observation time unit of outcomes (i.e. daily data). These concerns may be attributable to features specific to infectious diseases, such as transmission among individuals and complicated causal mechanisms. Conclusion: The consequence of not taking adequate measures to address these issues is distortion of the appropriate risk quantification of exposure factors. Future studies should pay careful attention to details and examine alternative models or methods that improve studies using time series regression analysis for environmental determinants of infectious diseases. PMID:25859149

  13. Inverting geodetic time series with a principal component analysis-based inversion method

    NASA Astrophysics Data System (ADS)

    Kositsky, A. P.; Avouac, J.-P.

    2010-03-01

    The Global Positioning System (GPS) now makes it possible to monitor deformation of the Earth's surface along plate boundaries with unprecedented accuracy. In theory, the spatiotemporal evolution of slip on the plate boundary at depth, associated with either seismic or aseismic slip, can be inferred from these measurements through an inversion procedure based on the theory of dislocations in an elastic half-space. We describe and test a principal component analysis-based inversion method (PCAIM), an inversion strategy that relies on principal component analysis of the surface displacement time series. We prove that the fault slip history can be recovered from the inversion of each principal component. Because PCAIM does not require externally imposed temporal filtering, it can deal with any kind of time variation of fault slip. We test the approach by applying the technique to synthetic geodetic time series to show that a complicated slip history combining coseismic, postseismic, and nonstationary interseismic slip can be retrieved. PCAIM produces slip models comparable to those obtained from standard inversion techniques with less computational complexity. We also compare an afterslip model derived from the PCAIM inversion of postseismic displacements following the 2005 Mw 8.6 Nias earthquake with another solution obtained from the extended network inversion filter (ENIF). We introduce several extensions of the algorithm to allow statistically rigorous integration of multiple data sources (e.g., both GPS and interferometric synthetic aperture radar time series) over multiple timescales. PCAIM can be generalized to any linear inversion algorithm.
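The decomposition step of PCAIM (before any fault-slip inversion, which is omitted here) amounts to a PCA of the centered displacement matrix: each principal component factors the data into a spatial pattern times a temporal function. A stdlib-Python sketch using power iteration for the leading component; the rows-as-epochs, columns-as-stations layout is an assumption for illustration:

```python
import math

def first_principal_component(data):
    """Leading principal component of a time-by-station matrix.

    data: list of time samples, each a list of station displacements.
    Returns (unit spatial vector, temporal scores), computed by power
    iteration on the station-covariance matrix.
    """
    n, m = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(m)]
    x = [[row[j] - means[j] for j in range(m)] for row in data]
    # Station-covariance matrix (m x m).
    cov = [[sum(x[i][a] * x[i][b] for i in range(n)) / n for b in range(m)]
           for a in range(m)]
    v = [1.0] * m
    for _ in range(200):  # power iteration toward the top eigenvector
        w = [sum(cov[a][b] * v[b] for b in range(m)) for a in range(m)]
        norm = math.sqrt(sum(c * c for c in w))
        v = [c / norm for c in w]
    # Temporal scores: projection of each epoch onto the spatial vector.
    scores = [sum(x[i][j] * v[j] for j in range(m)) for i in range(n)]
    return v, scores
```

For rank-one synthetic data (one slip pattern modulated in time), the recovered spatial vector matches the generating pattern up to sign, and the scores trace the temporal modulation.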

  14. Distinguishing chaotic and stochastic dynamics from time series by using a multiscale symbolic approach.

    PubMed

    Zunino, L; Soriano, M C; Rosso, O A

    2012-10-01

    In this paper we introduce a multiscale symbolic information-theory approach for discriminating nonlinear deterministic and stochastic dynamics from time series associated with complex systems. More precisely, we show that the multiscale complexity-entropy causality plane is a useful representation space to identify the range of scales at which deterministic or noisy behaviors dominate the system's dynamics. Numerical simulations obtained from the well-known and widely used Mackey-Glass oscillator operating in a high-dimensional chaotic regime were used as test beds. The effect of an increased amount of observational white noise was carefully examined. The results obtained were contrasted with those derived from correlated stochastic processes and continuous stochastic limit cycles. Finally, several experimental and natural time series were analyzed in order to show the applicability of this scale-dependent symbolic approach in practical situations. PMID:23214666
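The ordinal symbolization behind the complexity-entropy causality plane is easy to sketch: count the order patterns (Bandt-Pompe) of embedded vectors and normalize their Shannon entropy; the multiscale part coarse-grains the series first. A stdlib-Python illustration, with embedding order and delay as assumed defaults:

```python
import math
from itertools import permutations

def permutation_entropy(series, order=3, delay=1):
    """Normalized Bandt-Pompe permutation entropy (0 = regular, 1 = noisy)."""
    counts = {p: 0 for p in permutations(range(order))}
    n_vec = len(series) - (order - 1) * delay
    for i in range(n_vec):
        window = [series[i + k * delay] for k in range(order)]
        # Ordinal pattern: index permutation that sorts the window.
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        counts[pattern] += 1
    probs = [c / n_vec for c in counts.values() if c > 0]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(math.factorial(order))

def coarse_grain(series, scale):
    """Non-overlapping averaging, the usual multiscale preprocessing step."""
    return [sum(series[i:i + scale]) / scale
            for i in range(0, len(series) - scale + 1, scale)]
```

Evaluating the entropy (together with a complexity measure) across scales via `coarse_grain` is what locates deterministic versus stochastic regimes in the plane.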

  15. Three approaches to reliability analysis

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.

    1989-01-01

    It is noted that current reliability analysis tools differ not only in their solution techniques, but also in their approach to model abstraction. The analyst must be satisfied with the constraints that are intrinsic to any combination of solution technique and model abstraction. To get a better idea of the nature of these constraints, three reliability analysis tools (HARP, ASSIST/SURE, and CAME) were used to model portions of the Integrated Airframe/Propulsion Control System architecture. When presented with the example problem, all three tools failed to produce correct results. In all cases, either the tool or the model had to be modified. It is suggested that most of the difficulty is rooted in the large model size and long computational times which are characteristic of Markov model solutions.

  16. Learning Rates and States from Biophysical Time Series: A Bayesian Approach to Model Selection and Single-Molecule FRET Data

    PubMed Central

    Bronson, Jonathan E.; Fei, Jingyi; Hofman, Jake M.; Gonzalez, Ruben L.; Wiggins, Chris H.

    2009-01-01

    Abstract Time series data provided by single-molecule Förster resonance energy transfer (smFRET) experiments offer the opportunity to infer not only model parameters describing molecular complexes, e.g., rate constants, but also information about the model itself, e.g., the number of conformational states. Resolving whether such states exist or how many of them exist requires a careful approach to the problem of model selection, here meaning discrimination among models with differing numbers of states. The most straightforward approach to model selection generalizes the common idea of maximum likelihood—selecting the most likely parameter values—to maximum evidence: selecting the most likely model. In either case, such an inference presents a tremendous computational challenge, which we here address by exploiting an approximation technique termed variational Bayesian expectation maximization. We demonstrate how this technique can be applied to temporal data such as smFRET time series; show superior statistical consistency relative to the maximum likelihood approach; compare its performance on smFRET data generated from experiments on the ribosome; and illustrate how model selection in such probabilistic or generative modeling can facilitate analysis of closely related temporal data currently prevalent in biophysics. Source code used in this analysis, including a graphical user interface, is available open source via http://vbFRET.sourceforge.net. PMID:20006957

  17. Hypnobehavioral approaches for school-age children with dysphagia and food aversion: a case series.

    PubMed

    Culbert, T P; Kajander, R L; Kohen, D P; Reaney, J B

    1996-10-01

    The purpose of this article is to describe hypnobehavioral treatment of five school-age children with maladaptive eating behaviors, including functional dysphagia, food aversion, globus hystericus, and conditioned fear of eating (phagophobia). The unique treatment approach described emphasizes the successful use of self-management techniques, particularly hypnosis, by all five children. Common etiological factors, treatment strategies, and proposed mechanisms of change are discussed. To the authors' knowledge, this is the first such case series in the mainstream pediatric literature describing the use of a hypnobehavioral approach for children with these maladaptive eating problems. PMID:8897222

  18. Fuzzy Inference System Approach for Locating Series, Shunt, and Simultaneous Series-Shunt Faults in Double Circuit Transmission Lines

    PubMed Central

    Swetapadma, Aleena; Yadav, Anamika

    2015-01-01

    Many schemes have been reported for shunt fault location estimation, but fault location estimation for series (open conductor) faults has not been dealt with so far. Existing numerical relays only detect an open conductor (series) fault and indicate the faulty phase(s); they are unable to locate the series fault, so the repair crew must patrol the complete line to find it. In this paper, fuzzy-based fault detection/classification and location schemes in the time domain are proposed for series faults, shunt faults, and simultaneous series and shunt faults. The fault simulation studies and the fault location algorithm were developed using Matlab/Simulink. Synchronized voltage and current phasors from both ends of the line are used as input to the proposed fuzzy-based fault location scheme. The location error is within 1% for series faults and within 5% for shunt faults for all tested fault cases. The error percentages are validated using a Chi-square test at both the 1% and 5% significance levels. PMID:26413088

  19. An Alternative Approach to Atopic Dermatitis: Part I—Case-Series Presentation

    PubMed Central

    2004-01-01

    Atopic dermatitis (AD) is a complex disease of obscure pathogenesis. A substantial portion of AD patients treated with conventional therapy become intractable after several cycles of recurrence. Over the last 20 years we have developed an alternative approach to treating many of these patients with diet and Kampo herbal medicine. However, as our approach is highly individualized and the Kampo formulae sometimes complicated, it is not easy to provide evidence establishing the usefulness of this approach. In this Review, to demonstrate the effectiveness of the method of individualized Kampo therapy, results are presented for a series of patients who had failed with conventional therapy but were treated afterwards in our institution. Based on these data, we contend that there exists a definite subgroup of AD patients whom conventional therapy fails to heal but the ‘Diet and Kampo’ approach succeeds in healing. Therefore, this approach should be considered seriously as a second-line treatment for AD patients. In the Discussion, we review the evidential status of the current conventional strategies for AD treatment in general, and then specifically discuss the possibility of integrating Kampo regimens into them, taking the case series presented here as an evidential basis. We emphasize that Kampo therapy for AD is more ‘art’ than technology, for which expertise is an essential prerequisite. PMID:15257326

  20. Spatiotemporal analysis of GPS time series in vertical direction using independent component analysis

    NASA Astrophysics Data System (ADS)

    Liu, Bin; Dai, Wujiao; Peng, Wei; Meng, Xiaolin

    2015-11-01

    GPS has been widely used in the fields of geodesy and geodynamics thanks to its technological development and the improvement of positioning accuracy. A time series observed by GPS in the vertical direction usually contains tectonic signals, non-tectonic signals, residual atmospheric delay, measurement noise, etc. Analyzing this information is the basis of crustal deformation research, and extracting the non-tectonic information from GPS time series helps in studying the effects of various geophysical events. Principal component analysis (PCA) is an effective tool for spatiotemporal filtering and GPS time series analysis, but because it cannot extract statistically independent components, PCA is ill-suited to uncovering the implicit information in a time series. Independent component analysis (ICA) is a statistical method for blind source separation (BSS) that can separate original signals from mixed observations. In this paper, ICA is used as a spatiotemporal filtering method to analyze the spatial and temporal features of vertical GPS coordinate time series in the UK and in the Sichuan-Yunnan region of China. Meanwhile, the contributions from atmospheric and soil moisture mass loading are evaluated. Analysis of the correlation between the independent components and mass loading, together with their spatial distribution, shows that the signals extracted by ICA have a strong correlation with the non-tectonic deformation, indicating that ICA performs well in spatiotemporal analysis.
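
    The PCA baseline that this record contrasts with ICA can be sketched in a few lines of NumPy: stack the per-station series into a time-by-station matrix, take an SVD, and subtract the leading mode(s) as the common-mode signal. This is an illustrative sketch on synthetic data, not the authors' pipeline; the function name and the synthetic seasonal signal are assumptions.

```python
import numpy as np

def pca_common_mode_filter(X, n_modes=1):
    """Remove the leading n_modes principal components from a
    (time x stations) matrix of coordinate residuals."""
    Xc = X - X.mean(axis=0)                   # center each station's series
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    common = (U[:, :n_modes] * s[:n_modes]) @ Vt[:n_modes]
    return Xc - common                        # residuals with common mode removed

# Synthetic example: 5 stations sharing one seasonal common-mode signal
rng = np.random.default_rng(0)
t = np.arange(365)
common_mode = 5.0 * np.sin(2 * np.pi * t / 365)
X = common_mode[:, None] + 0.3 * rng.standard_normal((365, 5))
filtered = pca_common_mode_filter(X)          # variance drops once the mode is removed
```

    An ICA-based filter would replace the SVD step with a blind source separation of the station series, which is what lets it isolate statistically independent (e.g., loading-related) signals rather than merely uncorrelated ones.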

  1. New insights into time series analysis. I. Correlated observations

    NASA Astrophysics Data System (ADS)

    Ferreira Lopes, C. E.; Cross, N. J. G.

    2016-02-01

    indices computed in this new approach allow us to reduce misclassification and these will be implemented in an automatic classifier which will be addressed in a forthcoming paper in this series.

  2. Discovery of progenitor cell signatures by time-series synexpression analysis during Drosophila embryonic cell immortalization

    PubMed Central

    Dequéant, Mary-Lee; Fagegaltier, Delphine; Hu, Yanhui; Spirohn, Kerstin; Simcox, Amanda; Hannon, Gregory J.; Perrimon, Norbert

    2015-01-01

    The use of time series profiling to identify groups of functionally related genes (synexpression groups) is a powerful approach for the discovery of gene function. Here we apply this strategy during RasV12 immortalization of Drosophila embryonic cells, a phenomenon not well characterized. Using high-resolution transcriptional time-series datasets, we generated a gene network based on temporal expression profile similarities. This analysis revealed that common immortalized cells are related to adult muscle precursors (AMPs), a stem cell-like population contributing to adult muscles and sharing properties with vertebrate satellite cells. Remarkably, the immortalized cells retained the capacity for myogenic differentiation when treated with the steroid hormone ecdysone. Further, we validated in vivo the transcription factor CG9650, the ortholog of mammalian Bcl11a/b, as a regulator of AMP proliferation predicted by our analysis. Our study demonstrates the power of time series synexpression analysis to characterize Drosophila embryonic progenitor lines and identify stem/progenitor cell regulators. PMID:26438832

  3. An approach for estimating time-variable rates from geodetic time series

    NASA Astrophysics Data System (ADS)

    Didova, Olga; Gunter, Brian; Riva, Riccardo; Klees, Roland; Roese-Koerner, Lutz

    2016-06-01

    There has been considerable research in the literature focused on computing and forecasting sea-level changes in terms of constant trends or rates. The Antarctic ice sheet is one of the main contributors to sea-level change, with highly uncertain rates of glacial thinning and accumulation. Geodetic observing systems such as the Gravity Recovery and Climate Experiment (GRACE) and the Global Positioning System (GPS) are routinely used to estimate these trends. In an effort to improve the accuracy and reliability of these trends, this study investigates a technique that allows the estimated rates, along with co-estimated seasonal components, to vary in time. For this, state space models are defined and then solved by a Kalman filter (KF). The reliable estimation of noise parameters is one of the main problems encountered when using a KF approach, which we solve by numerically maximizing the likelihood. Since the optimization problem is non-convex, it is challenging to find an optimal solution. To address this issue, we limited the parameter search space using classical least-squares adjustment (LSA). In this context, we also tested the use of inequality constraints by directly verifying whether they are supported by the data. The suggested technique for time-series analysis is extended to classify and handle time-correlated observational noise within the state space framework. The performance of the method is demonstrated using GRACE and GPS data at the CAS1 station located in East Antarctica and compared to commonly used LSA. The results suggest that the outlined technique allows for more reliable trend estimates, as well as for more physically meaningful interpretations, while validating independent observing systems.
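
    The state space idea can be illustrated with a minimal local linear trend model solved by a Kalman filter: the level and its rate form the state, and the rate is allowed to drift as a random walk, so the estimated trend varies in time. This is a generic sketch with hand-picked noise variances, not the authors' model; in their approach the variances would be estimated by maximizing the likelihood.

```python
import numpy as np

def local_linear_trend_kf(y, q_level=1e-4, q_rate=1e-6, r=1.0):
    """Kalman filter for a local linear trend model:
       level[t+1] = level[t] + rate[t] + w1,  rate[t+1] = rate[t] + w2,
       y[t] = level[t] + v.  Returns filtered levels and rates."""
    F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition
    H = np.array([[1.0, 0.0]])               # observation matrix
    Q = np.diag([q_level, q_rate])           # process noise covariance
    x = np.array([y[0], 0.0])                # initial state
    P = np.eye(2)
    levels, rates = [], []
    for obs in y:
        # predict step
        x = F @ x
        P = F @ P @ F.T + Q
        # update step
        S = H @ P @ H.T + r                  # innovation variance
        K = (P @ H.T) / S                    # Kalman gain
        x = x + (K * (obs - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
        levels.append(x[0]); rates.append(x[1])
    return np.array(levels), np.array(rates)

# Noisy series with a constant underlying rate of 0.05 per step
rng = np.random.default_rng(1)
t = np.arange(200.0)
y = 0.05 * t + 0.5 * rng.standard_normal(200)
levels, rates = local_linear_trend_kf(y)     # rates converge toward 0.05
```

    Larger `q_rate` makes the estimated rate more adaptive, which is exactly the trade-off the noise-parameter estimation in the record is meant to resolve.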

  4. Use of recurrence plot and recurrence quantification analysis in Taiwan unemployment rate time series

    NASA Astrophysics Data System (ADS)

    Chen, Wei-Shing

    2011-04-01

    The aim of this article is to answer the question of whether the dynamics of the Taiwan unemployment rate are generated by a non-linear deterministic dynamic process. The paper applies a recurrence plot and recurrence quantification approach based on the analysis of non-stationary hidden transition patterns in the unemployment rate of Taiwan. The case study uses the time series of Taiwan's unemployment rate from 1978/01 to 2010/06. The results show that recurrence techniques are able to identify various phases in the evolution of unemployment transitions in Taiwan.
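
    The core object of the method is simple to construct: a recurrence plot marks the pairs of times at which the system returns close to a former state, and recurrence quantification summarizes its structure. A minimal sketch (scalar series, no embedding, and the recurrence rate as the only quantifier; the threshold value is an illustrative assumption):

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence plot: R[i, j] = 1 when |x_i - x_j| < eps."""
    d = np.abs(x[:, None] - x[None, :])
    return (d < eps).astype(int)

def recurrence_rate(R):
    """Fraction of recurrent points, excluding the main diagonal."""
    n = R.shape[0]
    return (R.sum() - n) / (n * (n - 1))

# A periodic series recurs far more often than white noise
t = np.linspace(0, 20 * np.pi, 500)
periodic = np.sin(t)
noise = np.random.default_rng(2).standard_normal(500)
rr_periodic = recurrence_rate(recurrence_matrix(periodic, 0.1))
rr_noise = recurrence_rate(recurrence_matrix(noise, 0.1))
```

    Full recurrence quantification analysis would add measures built on the diagonal and vertical line structures of R (determinism, laminarity, trapping time), which is what reveals the regime transitions discussed in the record.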

  5. Two Machine Learning Approaches for Short-Term Wind Speed Time-Series Prediction.

    PubMed

    Ak, Ronay; Fink, Olga; Zio, Enrico

    2016-08-01

    The increasing liberalization of European electricity markets, the growing proportion of intermittent renewable energy being fed into the energy grids, and also new challenges in the patterns of energy consumption (such as electric mobility) require flexible and intelligent power grids capable of providing efficient, reliable, economical, and sustainable energy production and distribution. From the supplier side, particularly, the integration of renewable energy sources (e.g., wind and solar) into the grid imposes an engineering and economic challenge because of the limited ability to control and dispatch these energy sources due to their intermittent characteristics. Time-series prediction of wind speed for wind power production is a particularly important and challenging task, wherein prediction intervals (PIs) are preferable results of the prediction, rather than point estimates, because they provide information on the confidence in the prediction. In this paper, two different machine learning approaches to assess PIs of time-series predictions are considered and compared: 1) multilayer perceptron neural networks trained with a multiobjective genetic algorithm and 2) extreme learning machines combined with the nearest neighbors approach. The proposed approaches are applied for short-term wind speed prediction from a real data set of hourly wind speed measurements for the region of Regina in Saskatchewan, Canada. Both approaches demonstrate good prediction precision and provide complementary advantages with respect to different evaluation criteria. PMID:25910257
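
    The nearest-neighbours ingredient of the second approach can be sketched directly: find past lag-vectors similar to the current one and take empirical quantiles of their successors as a prediction interval. This is a bare-bones illustration of the idea, not the paper's extreme-learning-machine pipeline; the function name, toy data, and parameter values are assumptions.

```python
import numpy as np

def knn_prediction_interval(history, query, k=20, alpha=0.1):
    """Empirical (1 - alpha) prediction interval for the next value:
    quantiles of the successors of the k past windows closest to `query`."""
    m = len(query)
    lags = np.array([history[i : i + m] for i in range(len(history) - m)])
    successors = history[m:]                  # value following each window
    d = np.linalg.norm(lags - query, axis=1)  # distance to each past window
    nbrs = successors[np.argsort(d)[:k]]
    return np.quantile(nbrs, alpha / 2), np.quantile(nbrs, 1 - alpha / 2)

# Toy autocorrelated series standing in for hourly wind speed
rng = np.random.default_rng(8)
x = np.zeros(2000)
for i in range(1, 2000):
    x[i] = 0.9 * x[i - 1] + rng.standard_normal()
lo, hi = knn_prediction_interval(x[:-1], x[-4:-1], k=30)
```

    The width of (lo, hi) conveys the confidence information that the record argues makes intervals preferable to point forecasts.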

  6. The EarthLabs Climate Series: Approaching Climate Literacy From Multiple Contexts

    NASA Astrophysics Data System (ADS)

    Haddad, N.; Ledley, T. S.; Ellins, K.; McNeal, K.; Bardar, E. W.; Youngman, E.; Lockwood, J.; Dunlap, C.

    2015-12-01

    The EarthLabs Climate Series is a set of four distinct but related high school curriculum modules that help build student and teacher understanding of our planet's complex climate system. The web-based, freely available curriculum modules include a rich set of resources for teachers, and are tied together by a common set of climate related themes that include: 1) the Earth system with the complexities of its positive and negative feedback loops; 2) the range of temporal and spatial scales at which climate, weather, and other Earth system processes occur; and 3) the recurring question, "How do we know what we know about Earth's past and present climate?" which addresses proxy data and scientific instrumentation. The four modules (Climate and the Cryosphere; Climate and the Biosphere; Climate and the Carbon Cycle; and Climate Detectives) approach climate literacy from different contexts, and have provided teachers of biology, chemistry, marine science, environmental science, and earth science with opportunities to address climate science by selecting a module that best supplements the content of their particular course. This presentation will highlight the four curriculum modules in the Climate Series, the multiple pathways they offer teachers for introducing climate science into their existing courses, and the two newest elements of the series: the Climate Series Intro, which holds an extensive set of climate related resources for teachers; and the Climate Detectives module, which is based on the 2013 expedition of the Joides Resolution to collect cores from the seafloor below the Gulf of Alaska.

  7. Water Resources Management Plan for Ganga River using SWAT Modelling and Time series Analysis

    NASA Astrophysics Data System (ADS)

    Satish, L. N. V.

    2015-12-01

    Water resources management of the Ganga River is one of the primary objectives of the National Ganga River Basin Environmental Management Plan. The present study aims to carry out a water balance study and to develop appropriate methodologies to compute environmental flow in the middle Ganga river basin between Patna and Farakka, India. The methodology adopted here is to set up a hydrological model to estimate monthly discharge in the tributaries under natural conditions, to perform hydrologic alteration analysis of both observed and simulated discharge series, to analyze flow health to obtain the status of stream health over the last four decades, and to estimate the environmental flow using flow health indicators. ArcSWAT was used to simulate eight tributaries, including the Kosi and the Gandak. This modelling is quite encouraging and provides a monthly water balance analysis for all tributaries in this study. The water balance analysis indicates a significant change in the surface water and groundwater interaction pattern within the study period. Indicators of hydrologic alteration have been used for both observed and simulated data series to quantify the hydrologic alteration that occurred in the tributaries and the main river over the last four decades. For the temporal variation of stream health, the flow health tool has been used on observed and simulated discharge data. A detailed stream health analysis has been performed using three approaches based on (i) the observed flow time series, (ii) the observed and simulated flow time series, and (iii) the simulated flow time series, at the levels of small upland basins, major tributaries, and the main Ganga river basin. At the upland basin level, these approaches show that stream health is good, with non-significant temporal variation. At the major tributary level, stream health is found to have been deteriorating since the 1970s. At the main Ganga reach level, river health and its temporal variations do not show any declining trend. Finally, E-flows

  8. Time Series Analysis of the Blazar OJ 287

    NASA Astrophysics Data System (ADS)

    Gamel, Ellen; Ryle, W. T.; Carini, M. T.

    2013-06-01

    Blazars are a subset of active galactic nuclei (AGN) where the light is viewed along the jet of radiation produced by the central supermassive black hole. These very luminous objects vary in brightness and are associated with the cores of distant galaxies. The blazar OJ 287 has been monitored and its brightness tracked over time. From these light curves the relationship between the characteristic “break frequency” and black hole mass can be determined through the use of power density spectra. In order to obtain a well-sampled light curve, this blazar was observed over a wide range of timescales. Long time scales were obtained using archived light curves from published literature. Medium time scales were obtained through a combination of data provided by Western Kentucky University and data collected at The Bank of Kentucky Observatory. Short time scales were achieved via a single night of observation at the 72” Perkins Telescope at Lowell Observatory in Flagstaff, AZ. Using time series analysis, we present a revised mass estimate for the supermassive black hole of OJ 287. This object is of particular interest because it may harbor a binary black hole at its center.

  9. Permutation Entropy Analysis of Geomagnetic Indices Time Series

    NASA Astrophysics Data System (ADS)

    De Michelis, Paola; Consolini, Giuseppe

    2013-04-01

    The Earth's magnetospheric dynamics displays a very complex nature in response to solar wind changes, as widely documented in the scientific literature. This complex dynamics manifests in various physical processes occurring in different regions of the Earth's magnetosphere, as clearly revealed by previous analyses of geomagnetic indices (AE indices, Dst, Sym-H, etc.). One of the most interesting features of the geomagnetic indices, as proxies of the Earth's magnetospheric dynamics, is the multifractal nature of their time series. This aspect has been interpreted as the occurrence of intermittency and dynamical phase transitions in the Earth's magnetosphere. Here, we investigate the Markovian nature of different geomagnetic indices (AE indices, Sym-H, Asy-H) and their fluctuations by means of permutation entropy analysis. The results clearly show the non-Markovian and distinct nature of the different sets of geomagnetic indices, pointing towards diverse underlying physical processes. A discussion in connection with the nature of the physical processes responsible for each set of indices and their multifractal character is attempted.
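
    Permutation entropy itself is straightforward to compute: each window of the series is reduced to the rank order of its values, and the entropy of the resulting ordinal-pattern distribution measures complexity. A minimal sketch (not the authors' code; the normalization by log(order!) is the usual Bandt-Pompe convention):

```python
import numpy as np
from math import factorial, log

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy of a 1-D series (Bandt & Pompe)."""
    n = len(x) - (order - 1) * delay
    counts = {}
    for i in range(n):
        window = x[i : i + order * delay : delay]
        pattern = tuple(np.argsort(window))   # ordinal pattern of the window
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), dtype=float) / n
    return float(-np.sum(p * np.log(p)) / log(factorial(order)))

rng = np.random.default_rng(3)
pe_noise = permutation_entropy(rng.standard_normal(5000))   # near 1 for white noise
pe_ramp = permutation_entropy(np.arange(5000.0))            # 0 for a monotone ramp
```

    Values near 1 indicate noise-like, maximally irregular ordering; departures from 1 at different scales are what such analyses use to probe the Markovian or non-Markovian character of the indices.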

  10. Chaotic time series analysis of vision evoked EEG

    NASA Astrophysics Data System (ADS)

    Zhang, Ningning; Wang, Hong

    2009-12-01

    To investigate human brain activity during aesthetic processing, a beautiful woman's face picture and an ugly buffoon's face picture were applied. Twelve subjects were assigned the aesthetic processing task while the electroencephalogram (EEG) was recorded. Event-related brain potentials (ERPs) were acquired from the 32 scalp electrodes, and the ugly buffoon picture produced larger amplitudes for the N1, P2, N2, and late slow wave components. Average ERPs from the ugly buffoon picture were larger than those from the beautiful woman picture. The ERP signals show that the ugly buffoon face elicited higher emotion waves than the beautiful woman face, because of the expression on the buffoon's face. Then, chaotic time series analysis was carried out, calculating the largest Lyapunov exponent using the small-data-set method and the correlation dimension using the G-P algorithm. The results show that the largest Lyapunov exponents of the ERP signals are greater than zero, which indicates that the ERP signals may be chaotic. The correlation dimensions obtained from the beautiful woman picture are larger than those from the ugly buffoon picture, suggesting that the beautiful face can excite the brain nerve cells. The research in this paper provides persuasive evidence for the view that the cerebrum's activity is chaotic under certain picture stimuli.
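
    The G-P (Grassberger-Procaccia) step rests on the correlation sum: the fraction of pairs of delay-embedded state vectors closer than a radius r, whose log-log slope against r estimates the correlation dimension. A generic sketch on synthetic data (not the EEG data or the authors' code; embedding parameters and radii are illustrative assumptions):

```python
import numpy as np

def correlation_sum(x, m, tau, r):
    """Grassberger-Procaccia correlation sum C(r): fraction of pairs of
    delay-embedded state vectors closer than r."""
    n = len(x) - (m - 1) * tau
    # Delay embedding: each row is one reconstructed state vector
    emb = np.column_stack([x[i * tau : i * tau + n] for i in range(m)])
    d = np.linalg.norm(emb[:, None] - emb[None, :], axis=-1)
    iu = np.triu_indices(n, k=1)              # each distinct pair once
    return float(np.mean(d[iu] < r))

# The slope of log C(r) vs log r estimates the correlation dimension;
# for i.i.d. noise it approaches the embedding dimension.
rng = np.random.default_rng(4)
x = rng.uniform(size=1500)
radii = np.array([0.05, 0.1, 0.2])
C = np.array([correlation_sum(x, m=2, tau=1, r=r) for r in radii])
slope = np.polyfit(np.log(radii), np.log(C), 1)[0]
```

    A low, non-integer slope that saturates as the embedding dimension grows is the usual signature taken as evidence of low-dimensional (possibly chaotic) dynamics.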

  12. On the Fourier and Wavelet Analysis of Coronal Time Series

    NASA Astrophysics Data System (ADS)

    Auchère, F.; Froment, C.; Bocchialini, K.; Buchlin, E.; Solomon, J.

    2016-07-01

    Using Fourier and wavelet analysis, we critically re-assess the significance of our detection of periodic pulsations in coronal loops. We show that the proper identification of the frequency dependence and statistical properties of the different components of the power spectra provides a strong argument against the common practice of data detrending, which tends to produce spurious detections around the cut-off frequency of the filter. In addition, the white and red noise models built into the widely used wavelet code of Torrence & Compo cannot, in most cases, adequately represent the power spectra of coronal time series, thus also possibly causing false positives. Both effects suggest that several reports of periodic phenomena should be re-examined. The Torrence & Compo code nonetheless effectively computes rigorous confidence levels if provided with pertinent models of mean power spectra, and we describe the appropriate manner in which to call its core routines. We recall the meaning of the default confidence levels output from the code, and we propose new Monte-Carlo-derived levels that take into account the total number of degrees of freedom in the wavelet spectra. These improvements allow us to confirm that the power peaks that we detected have a very low probability of being caused by noise.

  13. Detrended fluctuation analysis of laser Doppler flowmetry time series.

    PubMed

    Esen, Ferhan; Aydin, Gülsün Sönmez; Esen, Hamza

    2009-12-01

    Detrended fluctuation analysis (DFA) of laser Doppler flow (LDF) time series appears to yield improved prognostic power in microvascular dysfunction, through calculation of the scaling exponent, alpha. In the present study the change in microvascular function induced by long-lasting strenuous activity was evaluated by DFA in basketball players compared with sedentary controls. Forearm skin blood flow was measured at rest and during local heating. Three scaling exponents, the slopes of the three regression lines, were identified corresponding to cardiac, cardio-respiratory and local factors. The local scaling exponent was always approximately one, alpha=1.01+/-0.15, in the control group and did not change with local heating. However, we found a broken line with two scaling exponents (alpha(1)=1.06+/-0.01 and alpha(2)=0.75+/-0.01) in basketball players. The broken line became a single line having one scaling exponent (alpha(T)=0.94+/-0.01) with local heating. The scaling exponents, alpha(2) and alpha(T), smaller than 1 indicate reduced long-range correlation in blood flow due to a loss of integration in local mechanisms and suggest endothelial dysfunction as the most likely candidate. Evaluation of microvascular function from a baseline LDF signal at rest is an advantage of DFA over other methods, spectral or otherwise, that rely on amplitude changes of an evoked relative signal. PMID:19660479
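
    The scaling exponent reported here is, in outline, the slope of log fluctuation versus log window size after linear detrending within windows. A generic NumPy sketch of first-order DFA (not the authors' implementation; the scale choices are illustrative assumptions):

```python
import numpy as np

def dfa_alpha(x, scales=(8, 16, 32, 64, 128)):
    """Detrended fluctuation analysis: scaling exponent alpha from the
    slope of log F(n) versus log n."""
    y = np.cumsum(x - np.mean(x))             # integrated profile
    F = []
    for n in scales:
        n_seg = len(y) // n
        segs = y[: n_seg * n].reshape(n_seg, n)
        t = np.arange(n)
        f2 = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)      # least-squares line per segment
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(f2)))        # RMS fluctuation at scale n
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

rng = np.random.default_rng(5)
alpha_white = dfa_alpha(rng.standard_normal(4096))   # ~0.5 for white noise
```

    Exponents near 0.5 indicate uncorrelated fluctuations, near 1 indicate 1/f-like long-range correlation; the "broken line" in the record corresponds to different slopes over different scale ranges.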

  14. Time series analysis of the cataclysmic variable V1101 Aquilae

    NASA Astrophysics Data System (ADS)

    Spahn, Alexander C.

    This work reports on the application of various time series analysis techniques to a two-month portion of the light curve of the cataclysmic variable V1101 Aquilae. The system is a Z Cam type dwarf nova with an orbital period of 4.089 hours and an active outburst cycle of 15.15 days due to a high mass transfer rate. The system's light curve also displays higher frequency variations, known as negative superhumps, with a period of 3.891 hours and a period deficit of -5.1%. The amplitude of the negative superhumps varies as an inverse function of system brightness, with an amplitude of 0.70358 during outburst and 0.97718 during quiescence (relative flux units). These variations are believed to be caused by the contrast between the accretion disk and the bright spot. An O-C diagram was constructed and reveals the system's evolution. In general, during the rise to outburst, the disk moment of inertia decreases as mass is lost from the disk, causing the precession period of the tilted disk to increase and with it the negative superhump period. The decline from outburst is associated with the opposite effects. While no standstills were observed in these data, they are present in the AAVSO data, and the results agree with the conditions for Z Cam stars.

  15. Spectral Procedures Enhance the Analysis of Three Agricultural Time Series

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Many agricultural and environmental variables are influenced by cyclic processes that occur naturally. Consequently their time series often have cyclic behavior. This study developed times series models for three different phenomenon: (1) a 60 year-long state average crop yield record, (2) a four ...

  16. Analytical framework for recurrence network analysis of time series

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan F.; Heitzig, Jobst; Donner, Reik V.; Kurths, Jürgen

    2012-04-01

    Recurrence networks are a powerful nonlinear tool for time series analysis of complex dynamical systems. While there are already many successful applications ranging from medicine to paleoclimatology, a solid theoretical foundation for the method has so far been missing. Here, we interpret an ɛ-recurrence network as a discrete subnetwork of a “continuous” graph with uncountably many vertices and edges corresponding to the system's attractor. This step allows us to show that various statistical measures commonly used in complex network analysis can be seen as discrete estimators of newly defined continuous measures of certain complex geometric properties of the attractor on the scale given by ɛ. In particular, we introduce local measures such as the ɛ-clustering coefficient, mesoscopic measures such as ɛ-motif density, path-based measures such as ɛ-betweennesses, and global measures such as ɛ-efficiency. This new analytical basis for the so far heuristically motivated network measures also provides an objective criterion for the choice of ɛ via a percolation threshold, and it shows that estimation can be improved by so-called node splitting invariant versions of the measures. We finally illustrate the framework for a number of archetypical chaotic attractors such as those of the Bernoulli and logistic maps, periodic and two-dimensional quasiperiodic motions, and for hyperballs and hypercubes by deriving analytical expressions for the novel measures and comparing them with data from numerical experiments. More generally, the theoretical framework put forward in this work describes random geometric graphs and other networks with spatial constraints, which appear frequently in disciplines ranging from biology to climate science.
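
    A minimal sketch of the construction (scalar series, no embedding): threshold the distance matrix at ɛ to obtain an adjacency matrix, then read off network measures such as the average local clustering coefficient, the discrete estimator corresponding to the ɛ-clustering coefficient discussed in the record. Function names and parameters here are illustrative assumptions, not the paper's code.

```python
import numpy as np

def recurrence_network(x, eps):
    """Adjacency matrix of an epsilon-recurrence network: nodes are states,
    edges join states closer than eps (no self-loops)."""
    d = np.abs(x[:, None] - x[None, :])
    A = (d < eps).astype(int)
    np.fill_diagonal(A, 0)
    return A

def average_clustering(A):
    """Average local clustering coefficient of the network."""
    n = A.shape[0]
    cc = np.zeros(n)
    for v in range(n):
        nbrs = np.flatnonzero(A[v])
        k = len(nbrs)
        if k >= 2:
            links = A[np.ix_(nbrs, nbrs)].sum() / 2   # edges among neighbors
            cc[v] = 2 * links / (k * (k - 1))
    return cc.mean()

# A smooth periodic signal yields a geometric graph with high clustering
x = np.sin(np.linspace(0, 12 * np.pi, 400))
A = recurrence_network(x, eps=0.2)
C = average_clustering(A)
```

    Because the nodes of such a network live in the attractor's metric space, its clustering and path statistics reflect the attractor's local geometry at scale ɛ, which is the core idea the analytical framework formalizes.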

  17. Nonlinear time series analysis of normal and pathological human walking

    NASA Astrophysics Data System (ADS)

    Dingwell, Jonathan B.; Cusumano, Joseph P.

    2000-12-01

    Characterizing locomotor dynamics is essential for understanding the neuromuscular control of locomotion. In particular, quantifying dynamic stability during walking is important for assessing people who have a greater risk of falling. However, traditional biomechanical methods of defining stability have not quantified the resistance of the neuromuscular system to perturbations, suggesting that more precise definitions are required. For the present study, average maximum finite-time Lyapunov exponents were estimated to quantify the local dynamic stability of human walking kinematics. Local scaling exponents, defined as the local slopes of the correlation sum curves, were also calculated to quantify the local scaling structure of each embedded time series. Comparisons were made between overground and motorized treadmill walking in young healthy subjects and between diabetic neuropathic (NP) patients and healthy controls (CO) during overground walking. A modification of the method of surrogate data was developed to examine the stochastic nature of the fluctuations overlying the nominally periodic patterns in these data sets. Results demonstrated that having subjects walk on a motorized treadmill artificially stabilized their natural locomotor kinematics by small but statistically significant amounts. Furthermore, a paradox previously present in the biomechanical literature that resulted from mistakenly equating variability with dynamic stability was resolved. By slowing their self-selected walking speeds, NP patients adopted more locally stable gait patterns, even though they simultaneously exhibited greater kinematic variability than CO subjects. Additionally, the loss of peripheral sensation in NP patients was associated with statistically significant differences in the local scaling structure of their walking kinematics at those length scales where it was anticipated that sensory feedback would play the greatest role. 
Lastly, stride-to-stride fluctuations in the

  18. A Bayesian approach to estimation of a statistical change-point in the mean parameter for high dimensional non-linear time series

    NASA Astrophysics Data System (ADS)

    Speegle, Darrin; Steward, Robert

    2015-08-01

    We propose a semiparametric approach to infer the existence of, and estimate the location of, a statistical change-point in a nonlinear high dimensional time series contaminated with an additive noise component. In particular, we consider a p-dimensional stochastic process of independent multivariate normal observations where the mean function varies smoothly except at a single change-point. Our approach first involves a dimension reduction of the original time series through a random matrix multiplication. Next, we conduct a Bayesian analysis on the empirical detail coefficients of this dimensionally reduced time series after a wavelet transform. We also present a means of associating confidence bounds with the conclusions of our results. Aside from being computationally efficient and straightforward to implement, the primary advantage of our methods is that they apply to a much larger class of time series whose mean functions are subject to only general smoothness conditions.
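
    The dimension-reduction idea can be illustrated with a much cruder stand-in for the paper's wavelet and Bayesian machinery: project the p-dimensional series onto a few random directions and locate the mean shift with a scaled CUSUM statistic. Everything here (function name, number of projections, the CUSUM estimator itself) is an assumption for illustration, not the authors' method.

```python
import numpy as np

def changepoint_estimate(X, n_proj=8, seed=0):
    """Estimate a single mean change-point of a (T x p) series:
    random 1-D projections, then the argmax of a scaled CUSUM statistic."""
    T, p = X.shape
    rng = np.random.default_rng(seed)
    best_k, best_stat = 1, -np.inf
    for _ in range(n_proj):
        w = rng.standard_normal(p)
        y = X @ w / np.sqrt(p)                   # one random projection
        k = np.arange(1, T)
        cum = np.cumsum(y)
        left = cum[:-1] / k                      # mean of y[:k]
        right = (cum[-1] - cum[:-1]) / (T - k)   # mean of y[k:]
        stat = np.abs(left - right) * np.sqrt(k * (T - k) / T)
        j = int(np.argmax(stat))
        if stat[j] > best_stat:
            best_k, best_stat = int(k[j]), float(stat[j])
    return best_k

# Noisy 50-dimensional series with a mean shift in every coordinate at t = 120
rng = np.random.default_rng(6)
T, p, true_k = 200, 50, 120
X = 0.1 * rng.standard_normal((T, p))
X[true_k:] += 1.0
k_hat = changepoint_estimate(X)
```

    The paper replaces the CUSUM step with a Bayesian analysis of wavelet detail coefficients, which is what accommodates smoothly varying (rather than piecewise constant) mean functions.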

  19. Use of a Principal Components Analysis for the Generation of Daily Time Series.

    NASA Astrophysics Data System (ADS)

    Dreveton, Christine; Guillou, Yann

    2004-07-01

A new approach for generating daily time series is considered in response to the weather-derivatives market. This approach consists of performing a principal components analysis to create independent variables, the values of which are then generated separately with a random process. Weather derivatives are financial or insurance products that give companies the opportunity to cover themselves against adverse climate conditions. The aim of a generator is to provide a wider range of feasible situations to be used in an assessment of risk. Generation of a temperature time series is required by insurers or bankers for pricing weather options. The provision of conditional probabilities and a good representation of the interannual variance are the main challenges of a generator when used for weather derivatives. The generator was developed according to this new approach using a principal components analysis and was applied to the daily average temperature time series of the Paris-Montsouris station in France. The observed dataset was homogenized and the trend was removed to represent correctly the present climate. The results obtained with the generator show that it represents correctly the interannual variance of the observed climate; this is the main result of the work, because one of the main shortcomings of other generators is their inability to represent accurately the observed interannual climate variance, a discrepancy that is not acceptable for an application to weather derivatives. The generator was also tested to calculate conditional probabilities: for example, the knowledge of the aggregated value of heating degree-days in the middle of the heating season allows one to estimate the probability of reaching a threshold at the end of the heating season. This represents the main application of a climate generator for use with weather derivatives.
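
A toy version of this generator idea, assuming Gaussian resampling of the decorrelated scores (the paper's per-component random process is not specified here, so the names and choices below are illustrative):

```python
import numpy as np

def fit_pca_generator(X):
    """Fit: PCA of the (days x variables) anomaly matrix; the scores
    (principal components) are mutually decorrelated."""
    mean = X.mean(axis=0)
    A = X - mean
    vals, vecs = np.linalg.eigh(np.cov(A, rowvar=False))
    scores = A @ vecs
    return mean, vecs, scores

def generate(mean, vecs, scores, n_days, rng):
    """Generate: draw each component independently, here as a Gaussian with
    the observed score variance (a crude stand-in for the per-component
    random process of the paper), and rotate back to physical space."""
    z = rng.standard_normal((n_days, scores.shape[1])) * scores.std(axis=0)
    return z @ vecs.T + mean
```

Because the components are generated independently but rotated back through the PCA basis, the synthetic series reproduces the covariance structure of the observations, which is the property the abstract emphasizes.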


  20. Time-series analysis reveals genetic responses to intensive management of razorback sucker (Xyrauchen texanus)

    PubMed Central

    Dowling, Thomas E; Turner, Thomas F; Carson, Evan W; Saltzgiver, Melody J; Adams, Deborah; Kesner, Brian; Marsh, Paul C

    2014-01-01

Time-series analysis is used widely in ecology to study complex phenomena and may have considerable potential to clarify relationships of genetic and demographic processes in natural and exploited populations. We explored the utility of this approach to evaluate population responses to management in razorback sucker, a long-lived and fecund, but declining freshwater fish species. A core population in Lake Mohave (Arizona-Nevada, USA) has experienced no natural recruitment for decades and is maintained by harvesting naturally produced larvae from the lake, rearing them in protective custody, and repatriating them at sizes less vulnerable to predation. Analyses of mtDNA and 15 microsatellites characterized for sequential larval cohorts collected over a 15-year time series revealed no changes in geographic structuring but indicated a significant increase in mtDNA diversity for the entire population over time. Likewise, ratios of annual effective breeders to annual census size (Nb/Na) increased significantly despite a sevenfold reduction of Na. These results indicated that conservation actions diminished near-term extinction risk due to genetic factors and should now focus on increasing numbers of fish in Lake Mohave to ameliorate longer-term risks. More generally, time-series analysis permitted robust testing of trends in genetic diversity, despite low precision of some metrics. PMID:24665337

  1. Time-series analysis reveals genetic responses to intensive management of razorback sucker (Xyrauchen texanus).

    PubMed

    Dowling, Thomas E; Turner, Thomas F; Carson, Evan W; Saltzgiver, Melody J; Adams, Deborah; Kesner, Brian; Marsh, Paul C

    2014-03-01

Time-series analysis is used widely in ecology to study complex phenomena and may have considerable potential to clarify relationships of genetic and demographic processes in natural and exploited populations. We explored the utility of this approach to evaluate population responses to management in razorback sucker, a long-lived and fecund, but declining freshwater fish species. A core population in Lake Mohave (Arizona-Nevada, USA) has experienced no natural recruitment for decades and is maintained by harvesting naturally produced larvae from the lake, rearing them in protective custody, and repatriating them at sizes less vulnerable to predation. Analyses of mtDNA and 15 microsatellites characterized for sequential larval cohorts collected over a 15-year time series revealed no changes in geographic structuring but indicated a significant increase in mtDNA diversity for the entire population over time. Likewise, ratios of annual effective breeders to annual census size (Nb/Na) increased significantly despite a sevenfold reduction of Na. These results indicated that conservation actions diminished near-term extinction risk due to genetic factors and should now focus on increasing numbers of fish in Lake Mohave to ameliorate longer-term risks. More generally, time-series analysis permitted robust testing of trends in genetic diversity, despite low precision of some metrics. PMID:24665337

  2. Examining deterrence of adult sex crimes: A semi-parametric intervention time series approach

    PubMed Central

    Park, Jin-Hong; Bandyopadhyay, Dipankar; Letourneau, Elizabeth

    2013-01-01

Motivated by recent developments on dimension reduction (DR) techniques for time series data, the association of a general deterrent effect with South Carolina (SC)'s sex offender registration and notification (SORN) policy for preventing sex crimes was examined. Using adult sex crime arrestee data from 1990 to 2005, the idea of the Central Mean Subspace (CMS) is extended to intervention time series analysis (CMS-ITS) to model the sequential intervention effects of 1995 (the year SC's SORN policy was initially implemented) and 1999 (the year the policy was revised to include online notification) on the time series spectrum. The CMS-ITS model estimation was achieved via kernel smoothing techniques, and compared to interrupted autoregressive integrated moving average (ARIMA) models. Simulation studies and application to the real data underscore our model's ability to achieve parsimony and to detect intervention effects not previously identified via traditional ARIMA models. From a public health perspective, findings from this study draw attention to the potential general deterrent effects of SC's SORN policy. These findings are considered in light of the overall body of research on sex crime arrestee registration and notification policies, which remain controversial. PMID:24795489

  3. Correlation between detrended fluctuation analysis and the Lempel-Ziv complexity in nonlinear time series analysis

    NASA Astrophysics Data System (ADS)

    Tang, You-Fu; Liu, Shu-Lin; Jiang, Rui-Hong; Liu, Ying-Hui

    2013-03-01

We study the correlation between detrended fluctuation analysis (DFA) and the Lempel-Ziv complexity (LZC) in nonlinear time series analysis in this paper. Typical dynamic systems including a logistic map and a Duffing model are investigated. Moreover, the influence of Gaussian random noise on both the DFA and LZC is analyzed. The results show a high correlation between the DFA and LZC, which can quantify the non-stationarity and the nonlinearity of the time series, respectively. With the enhancement of the random component, the exponent α and the normalized complexity index C show increasing trends. In addition, C is found to be more sensitive to the fluctuation in the nonlinear time series than α. Finally, the correlation between the DFA and LZC is applied to the extraction of vibration signals for a reciprocating compressor gas valve, and an effective fault diagnosis result is obtained.
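
Both measures are standard and compact to implement; a minimal sketch (conventional first-order DFA and LZ76 parsing, with illustrative window scales):

```python
import numpy as np

def dfa_exponent(x, scales=(4, 8, 16, 32, 64)):
    """Standard DFA-1: RMS of linearly detrended fluctuations of the
    integrated profile at several window sizes; the scaling exponent
    alpha is the log-log slope."""
    y = np.cumsum(x - np.mean(x))                  # integrated profile
    F = []
    for s in scales:
        rms = []
        for i in range(len(y) // s):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            rms.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(rms)))
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

def lempel_ziv_complexity(bits):
    """Lempel-Ziv (1976) parsing: count the distinct phrases in a binary
    string; the normalized index is C = count * log2(n) / n."""
    s = "".join("1" if b else "0" for b in bits)
    i, count, n = 0, 0, len(s)
    while i < n:
        l = 1
        # extend the phrase while it already occurs in the prefix
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        count += 1
        i += l
    return count * np.log2(n) / n
```

White noise gives alpha near 0.5 and C near 1; a constant or strongly structured series gives C near 0, matching the sensitivity contrast the abstract reports.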

  4. Volterra Series Approach for Nonlinear Aeroelastic Response of 2-D Lifting Surfaces

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Marzocca, Piergiovanni; Librescu, Liviu

    2001-01-01

The problem of the determination of the subcritical aeroelastic response and flutter instability of nonlinear two-dimensional lifting surfaces in an incompressible flow-field via a Volterra series approach is addressed. The related aeroelastic governing equations are based upon the inclusion of structural nonlinearities and of the linear unsteady aerodynamics, and consideration of an arbitrary time-dependent external pressure pulse. Unsteady aeroelastic nonlinear kernels are determined, and based on these, frequency and time histories of the subcritical aeroelastic response are obtained, and in this context the influence of geometric nonlinearities is emphasized. Conclusions and results displaying the implications of the considered effects are supplied.

  5. Approximate Symmetry Reduction Approach: Infinite Series Reductions to the KdV-Burgers Equation

    NASA Astrophysics Data System (ADS)

    Jiao, Xiaoyu; Yao, Ruoxia; Zhang, Shunli; Lou, Sen Y.

    2009-11-01

    For weak dispersion and weak dissipation cases, the (1+1)-dimensional KdV-Burgers equation is investigated in terms of approximate symmetry reduction approach. The formal coherence of similarity reduction solutions and similarity reduction equations of different orders enables series reduction solutions. For the weak dissipation case, zero-order similarity solutions satisfy the Painlevé II, Painlevé I, and Jacobi elliptic function equations. For the weak dispersion case, zero-order similarity solutions are in the form of Kummer, Airy, and hyperbolic tangent functions. Higher-order similarity solutions can be obtained by solving linear variable coefficients ordinary differential equations.

  6. A pairwise likelihood-based approach for changepoint detection in multivariate time series models

    PubMed Central

    Ma, Ting Fung; Yau, Chun Yip

    2016-01-01

    This paper develops a composite likelihood-based approach for multiple changepoint estimation in multivariate time series. We derive a criterion based on pairwise likelihood and minimum description length for estimating the number and locations of changepoints and for performing model selection in each segment. The number and locations of the changepoints can be consistently estimated under mild conditions and the computation can be conducted efficiently with a pruned dynamic programming algorithm. Simulation studies and real data examples demonstrate the statistical and computational efficiency of the proposed method. PMID:27279666
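
The segmentation search itself can be illustrated with a scalar analogue: dynamic programming over all segmentations with a per-changepoint penalty (a Gaussian least-squares cost and a BIC/MDL-style penalty stand in for the paper's pairwise likelihood and per-segment model selection):

```python
import numpy as np

def mdl_changepoints(x, penalty=None):
    """Exact dynamic program for multiple mean changepoints in a scalar
    series: minimize segment residual sums of squares plus an MDL-style
    penalty per changepoint (simplified relative to the paper)."""
    x = np.asarray(x, float)
    n = len(x)
    if penalty is None:
        penalty = 2 * np.log(n)
    c1 = np.concatenate([[0.0], np.cumsum(x)])       # prefix sums give
    c2 = np.concatenate([[0.0], np.cumsum(x ** 2)])  # O(1) segment cost
    def cost(i, j):                                  # RSS of x[i:j]
        s, m = c1[j] - c1[i], j - i
        return (c2[j] - c2[i]) - s * s / m
    best = np.zeros(n + 1)
    prev = np.zeros(n + 1, dtype=int)
    for j in range(1, n + 1):
        cands = [(best[i] + cost(i, j) + (penalty if i > 0 else 0.0), i)
                 for i in range(j)]
        best[j], prev[j] = min(cands)
    cps, j = [], n                                   # backtrack the optimum
    while prev[j] > 0:
        cps.append(prev[j])
        j = prev[j]
    return sorted(cps)
```

This O(n^2) recursion is the unpruned form of the pruned dynamic programming the abstract refers to; pruning discards candidate start points i that can never become optimal.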

  7. GNSS Vertical Coordinate Time Series Analysis Using Single-Channel Independent Component Analysis Method

    NASA Astrophysics Data System (ADS)

    Peng, Wei; Dai, Wujiao; Santerre, Rock; Cai, Changsheng; Kuang, Cuilin

    2016-05-01

Daily vertical coordinate time series of Global Navigation Satellite System (GNSS) stations usually contain tectonic and non-tectonic deformation signals, residual atmospheric delay signals, measurement noise, etc. In geophysical studies, it is very important to separate various geophysical signals from the GNSS time series to truthfully reflect the effect of mass loadings on crustal deformation. Based on the independence of mass loadings, we combine the Ensemble Empirical Mode Decomposition (EEMD) with the Phase Space Reconstruction-based Independent Component Analysis (PSR-ICA) method to analyze the vertical time series of GNSS reference stations. In the simulation experiment, the seasonal non-tectonic signal is simulated by the sum of the corrections of atmospheric mass loading and soil moisture mass loading. The simulated seasonal non-tectonic signal can be separated into two independent signals using the PSR-ICA method, which are strongly correlated with atmospheric mass loading and soil moisture mass loading, respectively. Likewise, in the analysis of the vertical time series of GNSS reference stations of the Crustal Movement Observation Network of China (CMONOC), similar results have been obtained using the combined EEMD and PSR-ICA method. All these results indicate that the EEMD and PSR-ICA method can effectively separate the independent atmospheric and soil moisture mass loading signals and illustrate the significant cause of the seasonal variation of GNSS vertical time series in the mainland of China.

  8. Endoscopic Endonasal Transclival Approaches: Case Series and Outcomes for Different Clival Regions

    PubMed Central

    Little, Ryan E.; Taylor, Robert J.; Miller, Justin D.; Ambrose, Emily C.; Germanwala, Anand V.; Sasaki-Adams, Deanna M.; Ewend, Matthew G.; Zanation, Adam M.

    2014-01-01

Objective Transclival endoscopic endonasal approaches to the skull base are novel with few published cases. We report our institution's experience with this technique and discuss outcomes according to the clival region involved. Design Retrospective case series. Setting Tertiary care academic medical center. Participants All patients who underwent endoscopic endonasal transclival approaches for skull base lesions from 2008 to 2012. Main Outcome Measures Pathologies encountered, mean intraoperative time, intraoperative complications, gross total resection, intraoperative cerebrospinal fluid (CSF) leak, postoperative CSF leak, postoperative complications, and postoperative clinical course. Results A total of 49 patients underwent 55 endoscopic endonasal transclival approaches. Pathology included 43 benign and 12 malignant lesions. Mean follow-up was 15.4 months. Mean operative time was 167.9 minutes, with one patient experiencing an intraoperative internal carotid artery injury. Of the 15 cases with intraoperative CSF leaks, 1 developed a postoperative CSF leak (6.7%). There were six other postoperative complications: four systemic complications, one case of meningitis, and one retropharyngeal abscess. Gross total resection was achieved for all malignancies approached with curative intent. Conclusions This study provides evidence that endoscopic endonasal transclival approaches are a safe and effective strategy for the surgical management of a variety of benign and malignant lesions. Level of Evidence 4. PMID:25093148

  9. A Kalman-Filter-Based Approach to Combining Independent Earth-Orientation Series

    NASA Technical Reports Server (NTRS)

    Gross, Richard S.; Eubanks, T. M.; Steppe, J. A.; Freedman, A. P.; Dickey, J. O.; Runge, T. F.

    1998-01-01

An approach, based upon the use of a Kalman filter, that is currently employed at the Jet Propulsion Laboratory (JPL) for combining independent measurements of the Earth's orientation is presented. Since changes in the Earth's orientation can be described as a randomly excited stochastic process, the uncertainty in our knowledge of the Earth's orientation grows rapidly in the absence of measurements. The Kalman-filter methodology allows for an objective accounting of this uncertainty growth, thereby facilitating the intercomparison of measurements taken at different epochs (not necessarily uniformly spaced in time) and with different precision. As an example of this approach to combining Earth-orientation series, a description is given of a combination, SPACE95, that has been generated recently at JPL.
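
The essential mechanics, a random-walk state whose predicted uncertainty grows between measurements of differing precision, can be sketched for a single scalar component (the process-noise parameter q and the merged, time-sorted measurement stream are illustrative assumptions, not JPL's implementation):

```python
import numpy as np

def combine_series(times, values, sigmas, q=0.1):
    """Scalar Kalman filter for a random-walk state (e.g. one
    Earth-orientation component): the state variance grows by q**2 per
    unit time between measurements, and each measurement is weighted by
    its own reported sigma."""
    x, P = values[0], sigmas[0] ** 2        # initialize from first point
    out = [x]
    for k in range(1, len(times)):
        dt = times[k] - times[k - 1]
        P = P + q ** 2 * dt                 # prediction: uncertainty grows
        K = P / (P + sigmas[k] ** 2)        # Kalman gain
        x = x + K * (values[k] - x)         # measurement update
        P = (1 - K) * P
        out.append(x)
    return np.array(out)
```

Because the gain depends on both the elapsed time and each series' sigma, irregularly spaced measurements of different precisions are combined objectively, which is the point the abstract makes.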

  10. Geodetic Time Series: An Overview of UNAVCO Community Resources and Examples of Time Series Analysis Using GPS and Strainmeter Data

    NASA Astrophysics Data System (ADS)

    Phillips, D. A.; Meertens, C. M.; Hodgkinson, K. M.; Puskas, C. M.; Boler, F. M.; Snett, L.; Mattioli, G. S.

    2013-12-01

We present an overview of time series data, tools and services available from UNAVCO along with two specific and compelling examples of geodetic time series analysis. UNAVCO provides a diverse suite of geodetic data products and cyberinfrastructure services to support community research and education. The UNAVCO archive includes data from 2500+ continuous GPS stations, borehole geophysics instruments (strainmeters, seismometers, tiltmeters, pore pressure sensors), and long baseline laser strainmeters. These data span temporal scales from seconds to decades and provide global spatial coverage with regionally focused networks including the EarthScope Plate Boundary Observatory (PBO) and COCONet. This rich, open access dataset is a tremendous resource that enables the exploration, identification and analysis of time varying signals associated with crustal deformation, reference frame determinations, isostatic adjustments, atmospheric phenomena, hydrologic processes and more. UNAVCO provides a suite of time series exploration and analysis resources including static plots, dynamic plotting tools, and data products and services designed to enhance time series analysis. The PBO GPS network allows for the identification of ~1 mm level deformation signals. At some GPS stations, seasonal signals and longer-term trends in both the vertical and horizontal components can be dominated by effects of hydrological loading from natural and anthropogenic sources. Modeling of hydrologic deformation using GLDAS and a variety of land surface models (NOAH, MOSAIC, VIC and CLM) shows promise for independently modeling hydrologic effects and separating them from tectonic deformation as well as anthropogenic loading sources. A major challenge is to identify where loading is dominant, so that corrections from GLDAS apply, and where pumping is the dominant signal, so that corrections are not possible without some other data.
In another arena, the PBO strainmeter network was designed to capture small short

  11. Applications of time-series analysis to mood fluctuations in bipolar disorder to promote treatment innovation: a case series.

    PubMed

    Holmes, E A; Bonsall, M B; Hales, S A; Mitchell, H; Renner, F; Blackwell, S E; Watson, P; Goodwin, G M; Di Simplicio, M

    2016-01-01

    Treatment innovation for bipolar disorder has been hampered by a lack of techniques to capture a hallmark symptom: ongoing mood instability. Mood swings persist during remission from acute mood episodes and impair daily functioning. The last significant treatment advance remains Lithium (in the 1970s), which aids only the minority of patients. There is no accepted way to establish proof of concept for a new mood-stabilizing treatment. We suggest that combining insights from mood measurement with applied mathematics may provide a step change: repeated daily mood measurement (depression) over a short time frame (1 month) can create individual bipolar mood instability profiles. A time-series approach allows comparison of mood instability pre- and post-treatment. We test a new imagery-focused cognitive therapy treatment approach (MAPP; Mood Action Psychology Programme) targeting a driver of mood instability, and apply these measurement methods in a non-concurrent multiple baseline design case series of 14 patients with bipolar disorder. Weekly mood monitoring and treatment target data improved for the whole sample combined. Time-series analyses of daily mood data, sampled remotely (mobile phone/Internet) for 28 days pre- and post-treatment, demonstrated improvements in individuals' mood stability for 11 of 14 patients. Thus the findings offer preliminary support for a new imagery-focused treatment approach. They also indicate a step in treatment innovation without the requirement for trials in illness episodes or relapse prevention. Importantly, daily measurement offers a description of mood instability at the individual patient level in a clinically meaningful time frame. 
This costly, chronic and disabling mental illness demands innovation in both treatment approaches (whether pharmacological or psychological) and measurement tools: this work indicates that daily measurements can be used to detect improvement in individual mood stability for treatment innovation (MAPP

  12. An effective approach for gap-filling continental scale remotely sensed time-series

    PubMed Central

    Weiss, Daniel J.; Atkinson, Peter M.; Bhatt, Samir; Mappin, Bonnie; Hay, Simon I.; Gething, Peter W.

    2014-01-01

    The archives of imagery and modeled data products derived from remote sensing programs with high temporal resolution provide powerful resources for characterizing inter- and intra-annual environmental dynamics. The impressive depth of available time-series from such missions (e.g., MODIS and AVHRR) affords new opportunities for improving data usability by leveraging spatial and temporal information inherent to longitudinal geospatial datasets. In this research we develop an approach for filling gaps in imagery time-series that result primarily from cloud cover, which is particularly problematic in forested equatorial regions. Our approach consists of two, complementary gap-filling algorithms and a variety of run-time options that allow users to balance competing demands of model accuracy and processing time. We applied the gap-filling methodology to MODIS Enhanced Vegetation Index (EVI) and daytime and nighttime Land Surface Temperature (LST) datasets for the African continent for 2000–2012, with a 1 km spatial resolution, and an 8-day temporal resolution. We validated the method by introducing and filling artificial gaps, and then comparing the original data with model predictions. Our approach achieved R2 values above 0.87 even for pixels within 500 km wide introduced gaps. Furthermore, the structure of our approach allows estimation of the error associated with each gap-filled pixel based on the distance to the non-gap pixels used to model its fill value, thus providing a mechanism for including uncertainty associated with the gap-filling process in downstream applications of the resulting datasets. PMID:25642100
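
A toy version of the temporal side of such gap-filling, with the record's distance-based uncertainty idea reduced to distance-to-nearest-valid-observation (per-pixel linear interpolation stands in for the paper's two complementary algorithms):

```python
import numpy as np

def gap_fill(stack):
    """Fill NaN gaps in a (time, y, x) stack by per-pixel linear
    interpolation across time.  Also return, for every value, the distance
    (in time steps) to the nearest valid observation, as a crude proxy for
    the per-pixel uncertainty the paper attaches to filled values."""
    filled = stack.copy()
    t = np.arange(stack.shape[0])
    dist = np.zeros_like(stack)
    for iy in range(stack.shape[1]):
        for ix in range(stack.shape[2]):
            v = stack[:, iy, ix]
            good = ~np.isnan(v)
            if good.sum() < 2:
                continue                      # nothing to anchor a fill on
            filled[:, iy, ix] = np.interp(t, t[good], v[good])
            # distance to the nearest valid time step
            dist[:, iy, ix] = np.abs(t[:, None] - t[good][None, :]).min(axis=1)
    return filled, dist
```

The real method also exploits the spatial neighbourhood of each gap; here only the temporal dimension is used, which is why wide gaps inflate the returned distance (and hence the expected error).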

  13. An effective approach for gap-filling continental scale remotely sensed time-series

    NASA Astrophysics Data System (ADS)

    Weiss, Daniel J.; Atkinson, Peter M.; Bhatt, Samir; Mappin, Bonnie; Hay, Simon I.; Gething, Peter W.

    2014-12-01

    The archives of imagery and modeled data products derived from remote sensing programs with high temporal resolution provide powerful resources for characterizing inter- and intra-annual environmental dynamics. The impressive depth of available time-series from such missions (e.g., MODIS and AVHRR) affords new opportunities for improving data usability by leveraging spatial and temporal information inherent to longitudinal geospatial datasets. In this research we develop an approach for filling gaps in imagery time-series that result primarily from cloud cover, which is particularly problematic in forested equatorial regions. Our approach consists of two, complementary gap-filling algorithms and a variety of run-time options that allow users to balance competing demands of model accuracy and processing time. We applied the gap-filling methodology to MODIS Enhanced Vegetation Index (EVI) and daytime and nighttime Land Surface Temperature (LST) datasets for the African continent for 2000-2012, with a 1 km spatial resolution, and an 8-day temporal resolution. We validated the method by introducing and filling artificial gaps, and then comparing the original data with model predictions. Our approach achieved R2 values above 0.87 even for pixels within 500 km wide introduced gaps. Furthermore, the structure of our approach allows estimation of the error associated with each gap-filled pixel based on the distance to the non-gap pixels used to model its fill value, thus providing a mechanism for including uncertainty associated with the gap-filling process in downstream applications of the resulting datasets.

  14. Uniform approach to linear and nonlinear interrelation patterns in multivariate time series

    NASA Astrophysics Data System (ADS)

    Rummel, Christian; Abela, Eugenio; Müller, Markus; Hauf, Martinus; Scheidegger, Olivier; Wiest, Roland; Schindler, Kaspar

    2011-06-01

    Currently, a variety of linear and nonlinear measures is in use to investigate spatiotemporal interrelation patterns of multivariate time series. Whereas the former are by definition insensitive to nonlinear effects, the latter detect both nonlinear and linear interrelation. In the present contribution we employ a uniform surrogate-based approach, which is capable of disentangling interrelations that significantly exceed random effects and interrelations that significantly exceed linear correlation. The bivariate version of the proposed framework is explored using a simple model allowing for separate tuning of coupling and nonlinearity of interrelation. To demonstrate applicability of the approach to multivariate real-world time series we investigate resting state functional magnetic resonance imaging (rsfMRI) data of two healthy subjects as well as intracranial electroencephalograms (iEEG) of two epilepsy patients with focal onset seizures. The main findings are that for our rsfMRI data interrelations can be described by linear cross-correlation. Rejection of the null hypothesis of linear iEEG interrelation occurs predominantly for epileptogenic tissue as well as during epileptic seizures.
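
The surrogate logic can be sketched in a bivariate setting: phase-randomized surrogates preserve each series' linear properties (power spectrum, hence autocorrelation) while destroying its relation to the other series, so observed interrelation exceeding the surrogate null indicates structure beyond chance. Zero-lag cross-correlation is used here as a simple illustrative statistic, not the measures of the paper:

```python
import numpy as np

def phase_surrogate(x, rng):
    """Phase-randomized surrogate: keeps the amplitude spectrum (all linear
    autocorrelation) but randomizes the Fourier phases."""
    X = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, len(X))
    phases[0] = 0.0                         # keep the mean untouched
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=len(x))

def exceeds_random(x, y, n_surr=200, seed=0):
    """One-sided surrogate p-value: is |cross-correlation| of (x, y) larger
    than for surrogates of x that break the interrelation?"""
    rng = np.random.default_rng(seed)
    def xcorr(a, b):
        a = (a - a.mean()) / a.std()
        b = (b - b.mean()) / b.std()
        return abs(np.mean(a * b))
    obs = xcorr(x, y)
    null = [xcorr(phase_surrogate(x, rng), y) for _ in range(n_surr)]
    return (1 + sum(v >= obs for v in null)) / (1 + n_surr)
```

The "exceeds linear correlation" test of the paper uses a second surrogate family that additionally preserves the linear cross-correlation; the scheme above covers only the first null hypothesis (interrelation beyond random effects).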

  15. On fractal analysis of cardiac interbeat time series

    NASA Astrophysics Data System (ADS)

    Guzmán-Vargas, L.; Calleja-Quevedo, E.; Angulo-Brown, F.

    2003-09-01

In recent years the complexity of a cardiac beat-to-beat time series has been taken as an auxiliary tool to identify the health status of human hearts. Several methods have been employed to characterize the time series complexity. In this work we calculate the fractal dimension of interbeat time series arising from three groups: 10 young healthy persons, 8 elderly healthy persons and 10 patients with congestive heart failure. Our numerical results reflect evident differences in the dynamic behavior corresponding to each group. We discuss these results within the context of the neuroautonomic control of heart rate dynamics. We also propose a numerical simulation which reproduces aging effects of heart rate behavior.
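
The fractal dimension of a waveform such as an interbeat series is commonly estimated with Higuchi's algorithm; a compact sketch (the record does not state which estimator the authors used, so this is one standard option):

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Higuchi's algorithm: average normalized curve length L(k) over
    subsampled versions of the series at delays k = 1..kmax; the fractal
    dimension is minus the slope of log L(k) versus log k."""
    x = np.asarray(x, float)
    n = len(x)
    L = []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):                     # k shifted subsamples
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            dist = np.abs(np.diff(x[idx])).sum()
            norm = (n - 1) / ((len(idx) - 1) * k)   # Higuchi normalization
            lengths.append(dist * norm / k)
        L.append(np.mean(lengths))
    ks = np.arange(1, kmax + 1)
    return -np.polyfit(np.log(ks), np.log(L), 1)[0]
```

A smooth trend gives a dimension near 1 and uncorrelated noise near 2, so the statistic spans exactly the range over which healthy and pathologic interbeat dynamics are expected to differ.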

  16. Multiscale entropy analysis of complex physiologic time series.

    PubMed

    Costa, Madalena; Goldberger, Ary L; Peng, C-K

    2002-08-01

    There has been considerable interest in quantifying the complexity of physiologic time series, such as heart rate. However, traditional algorithms indicate higher complexity for certain pathologic processes associated with random outputs than for healthy dynamics exhibiting long-range correlations. This paradox may be due to the fact that conventional algorithms fail to account for the multiple time scales inherent in healthy physiologic dynamics. We introduce a method to calculate multiscale entropy (MSE) for complex time series. We find that MSE robustly separates healthy and pathologic groups and consistently yields higher values for simulated long-range correlated noise compared to uncorrelated noise. PMID:12190613
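
The MSE procedure is concise: coarse-grain the series by non-overlapping averaging at each scale, then compute sample entropy with a tolerance fixed from the original series. A minimal sketch with illustrative parameters (m = 2, r = 0.2):

```python
import numpy as np

def sample_entropy(x, m=2, tol=0.2):
    """SampEn(m, tol): negative log of the conditional probability that
    sequences matching for m points also match for m + 1 (Chebyshev
    distance, self-matches excluded)."""
    x = np.asarray(x, float)
    def count(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=2)
        return (np.sum(d <= tol) - len(templ)) / 2   # exclude self-matches
    return -np.log(count(m + 1) / count(m))

def multiscale_entropy(x, scales=(1, 2, 4), r=0.2):
    """MSE: sample entropy of coarse-grained (non-overlapping mean)
    versions of the series, with the tolerance fixed from the scale-1
    standard deviation as in Costa et al."""
    x = np.asarray(x, float)
    tol = r * x.std()
    out = []
    for s in scales:
        n = len(x) // s
        coarse = x[:n * s].reshape(n, s).mean(axis=1)
        out.append(sample_entropy(coarse, tol=tol))
    return out
```

For white noise the curve decreases with scale, while long-range correlated (1/f-like) noise stays roughly flat; that contrast is what lets MSE rank correlated noise as more complex.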

  17. Minimum entropy density method for the time series analysis

    NASA Astrophysics Data System (ADS)

    Lee, Jeong Won; Park, Joongwoo Brian; Jo, Hang-Hyun; Yang, Jae-Suk; Moon, Hie-Tae

    2009-01-01

The entropy density is an intuitive and powerful concept to study the complicated nonlinear processes derived from physical systems. We develop the minimum entropy density method (MEDM) to detect the structure scale of a given time series, which is defined as the scale at which the uncertainty is minimized and hence the pattern is most clearly revealed. The MEDM is applied to the financial time series of the Standard and Poor's 500 index from February 1983 to April 2006. The temporal behavior of the structure scale is then obtained and analyzed in relation to the information delivery time and the efficient market hypothesis.
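
A simplified reading of the method: symbolize the series, measure the conditional (per-symbol) entropy at each scale, and take the minimizer as the structure scale. The sketch below uses lagged conditional entropy over equiprobable bins as a stand-in for the paper's exact entropy-density definition:

```python
import numpy as np

def conditional_entropy(x, lag, n_bins=4):
    """H(s_{t+lag} | s_t) in bits for the series symbolized into
    equiprobable bins: low values mean the symbol lag steps ahead is
    highly predictable from the current one."""
    x = np.asarray(x, float)
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    s = np.digitize(x, edges)
    joint = np.zeros((n_bins, n_bins))
    for i, j in zip(s[:-lag], s[lag:]):
        joint[i, j] += 1
    joint /= joint.sum()
    pa = joint.sum(axis=1)
    h_joint = -np.sum(joint[joint > 0] * np.log2(joint[joint > 0]))
    h_a = -np.sum(pa[pa > 0] * np.log2(pa[pa > 0]))
    return h_joint - h_a

def structure_scale(x, max_lag=20):
    """MEDM-style pick (simplified): the lag at which the conditional
    entropy is minimized, i.e. where the series is most predictable."""
    h = [conditional_entropy(x, lag) for lag in range(1, max_lag + 1)]
    return 1 + int(np.argmin(h))
```

For a strongly autocorrelated process the uncertainty is smallest at short lags and grows toward the full symbol entropy as predictability decays.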

  18. Global coseismic deformations, GNSS time series analysis, and earthquake scaling laws

    NASA Astrophysics Data System (ADS)

    Métivier, Laurent; Collilieux, Xavier; Lercier, Daphné; Altamimi, Zuheir; Beauducel, François

    2014-12-01

We investigate how two decades of coseismic deformations affect time series of GPS station coordinates (Global Navigation Satellite System) and what constraints geodetic observations give on earthquake scaling laws. We developed a simple but rapid model for coseismic deformations, assuming different earthquake scaling relations, that we systematically applied on earthquakes with magnitude larger than 4. We found that coseismic displacements accumulated during the last two decades can be larger than 10 m locally and that the cumulative displacement is not only due to large earthquakes but also to the accumulation of many small motions induced by smaller earthquakes. Then, investigating a global network of GPS stations, we demonstrate that a systematic global modeling of coseismic deformations helps greatly to detect discontinuities in GPS coordinate time series, which are still today one of the major sources of error in terrestrial reference frame construction (e.g., the International Terrestrial Reference Frame). We show that numerous discontinuities induced by earthquakes are too small to be visually detected because of seasonal variations and GPS noise that disturb their identification. However, not taking these discontinuities into account has a large impact on the station velocity estimation, considering today's precision requirements. Finally, six groups of earthquake scaling laws were tested. Comparisons with our GPS time series analysis on dedicated earthquakes give insights into the consistency of these scaling laws with geodetic observations and the Okada coseismic approach.

  19. A Rigorous Analysis of Series-Connected, Multi-Bandgap, Tandem Thermophotovoltaic (TPV) Energy Converters

    NASA Astrophysics Data System (ADS)

    Wanlass, M. W.; Albin, D. S.

    2004-11-01

Multi-bandgap, photonic energy conversion is under investigation for nearly every class of photovoltaic materials, with monolithic, series-connected device structures being the preferred mode of implementation. For TPV energy conversion systems, such an approach represents the next wave in TPV converter advancement. In this paper, we focus on a rigorous analysis of series-connected, multi-bandgap, tandem (SCMBT) converter structures according to Kirchhoff's circuit laws. A general formulation is presented, followed by an application of the general formulation to a typical, semi-realistic model for well-behaved, p-n junction, photovoltaic devices. Using results generated from a computer code written in Visual Basic, we then present example calculations for SCMBT TPV converters with two subcells, for a TPV system utilizing a blackbody radiator operating at 954°C (1750°F). A comparison of the results obtained using the rigorous analysis, with those obtained by using the commonly adopted subcell-photocurrent-matching design rule, is discussed in detail. An output power density increase of ~5% is realized in the solution determined by the rigorous analysis, as compared to that obtained with the subcell-photocurrent-matching rule. Additional interesting, non-intuitive results are also highlighted.
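
The series-connection constraint at the heart of the analysis is easy to state: by Kirchhoff's current law every subcell carries the same current, and the terminal voltage is the sum of the subcell voltages. A sketch with idealized one-diode subcells and hypothetical parameters (room-temperature cells, arbitrary saturation currents), not the paper's semi-realistic model:

```python
import numpy as np

def series_tandem_mpp(jph, j0=(1e-9, 1e-9), vt=0.02585):
    """Maximum power point of a two-terminal series tandem.  Each subcell
    obeys an ideal diode law, so at common current J its voltage is
    V_i(J) = vt * ln((jph_i - J) / j0_i + 1); voltages add and J is swept
    up to the photocurrent of the weaker (current-limiting) subcell."""
    j_grid = np.linspace(0.0, min(jph) * 0.999, 2000)
    power = []
    for j in j_grid:
        v = sum(vt * np.log((jp - j) / j0_i + 1.0)
                for jp, j0_i in zip(jph, j0))
        power.append(j * v)
    k = int(np.argmax(power))
    return power[k], j_grid[k]
```

With mismatched photocurrents the optimum operating current sits slightly below the limiting subcell's photocurrent rather than at a naive matched value, which is the kind of non-intuitive result the rigorous analysis exposes.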

  20. Investigation on Law and Economics Based on Complex Network and Time Series Analysis

    PubMed Central

    Yang, Jian; Qu, Zhao; Chang, Hui

    2015-01-01

    The research focuses on the cooperative relationship and the strategy tendency among three mutually interactive parties in financing: small enterprises, commercial banks and micro-credit companies. Complex network theory and time series analysis were applied to figure out the quantitative evidence. Moreover, this paper built up a fundamental model describing the particular interaction among them through evolutionary game. Combining the results of data analysis and current situation, it is justifiable to put forward reasonable legislative recommendations for regulations on lending activities among small enterprises, commercial banks and micro-credit companies. The approach in this research provides a framework for constructing mathematical models and applying econometrics and evolutionary game in the issue of corporation financing. PMID:26076460

  1. A Time-Series Analysis of Hispanic Unemployment.

    ERIC Educational Resources Information Center

    Defreitas, Gregory

    1986-01-01

    This study undertakes the first systematic time-series research on the cyclical patterns and principal determinants of Hispanic joblessness in the United States. The principal findings indicate that Hispanics tend to bear a disproportionate share of increases in unemployment during recessions. (Author/CT)

  2. Time Series Analysis for the Drac River Basin (france)

    NASA Astrophysics Data System (ADS)

    Parra-Castro, K.; Donado-Garzon, L. D.; Rodriguez, E.

    2013-12-01

    This research analyzes discharge time series from four stream-flow gaging stations in the Drac River basin in France: (i) Guinguette Naturelle, (ii) Infernet, (iii) Parassat and (iv) Villard Loubière. Time-series models such as linear regression (single and multiple) and the MORDOR model were implemented to analyze the behavior of the Drac River from 1969 to 2010. Twelve different models were implemented to assess the daily and monthly discharge time series for the four stations. Five selection criteria were used to evaluate the models: the ratio of averages, the ratio of variances, the coefficient R², the Kling-Gupta Efficiency (KGE) and the Nash number. Models were selected so as to retain the strongest models at a meaningful confidence level, according to the best correlation between the station time series and the fitted models. Four of the twelve models were selected: two for the station Guinguette Naturelle, one for the station Infernet and one for the station Villard Loubière, with R² coefficients of 0.87, 0.95, 0.85 and 0.87, respectively. Both the modeled and the empirical confidence levels were then tested on the selected models, with the empirical confidence interval giving the best fit between the discharge time series and the models. Additionally, the models were validated against data for the year 2011, in which extreme hydrologic events and changes in hydrologic regime were identified. Furthermore, two different ways of estimating uncertainty through confidence levels were studied: the modeled and the empirical confidence levels. This research was useful for updating the procedures used and validating the time series at the four stream flow gage stations for the use of the company Électricité de France. Additionally, coefficients for both the models and
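
    The Nash and Kling-Gupta scores named above are standard goodness-of-fit measures for hydrological models. As an illustrative aside (not the authors' code), a minimal plain-Python sketch of the Nash-Sutcliffe efficiency and the 2009 KGE formulation:

```python
import math

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / total variance of the observations."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    var = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / var

def kge(obs, sim):
    """Kling-Gupta efficiency (2009 form): 1 - sqrt((r-1)^2 + (a-1)^2 + (b-1)^2)."""
    n = len(obs)
    mo, ms = sum(obs) / n, sum(sim) / n
    so = math.sqrt(sum((o - mo) ** 2 for o in obs) / n)
    ss = math.sqrt(sum((s - ms) ** 2 for s in sim) / n)
    r = sum((o - mo) * (s - ms) for o, s in zip(obs, sim)) / (n * so * ss)
    alpha, beta = ss / so, ms / mo   # variability ratio, bias ratio
    return 1.0 - math.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

obs = [2.0, 3.5, 4.0, 3.0, 2.5]
print(nse(obs, obs))            # 1.0 for a perfect model
print(round(kge(obs, obs), 6))  # 1.0 for a perfect model
```

    Both scores equal 1 for a perfect simulation and decrease as model error grows; NSE can go negative when the model is worse than the observed mean.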

  3. Complexity analysis of the turbulent environmental fluid flow time series

    NASA Astrophysics Data System (ADS)

    Mihailović, D. T.; Nikolić-Đorić, E.; Drešković, N.; Mimić, G.

    2014-02-01

    We have used Kolmogorov complexity, sample entropy and permutation entropy to quantify the degree of randomness in river flow time series of two mountain rivers in Bosnia and Herzegovina, representing the turbulent environmental fluid, for the period 1926-1990. In particular, we have examined the monthly river flow time series from two rivers (the Miljacka and the Bosnia) in the mountain part of their flow and then calculated the Kolmogorov complexity (KL) based on the Lempel-Ziv Algorithm (LZA) (lower-KLL and upper-KLU), sample entropy (SE) and permutation entropy (PE) values for each time series. The results indicate that the KLL, KLU, SE and PE values of the two rivers are close to each other, regardless of the amplitude differences in their monthly flow rates. We have illustrated the changes in mountain river flow complexity by experiments using (i) the data set for the Bosnia River and (ii) anticipated human activities and projected climate changes. We have also explored the sensitivity of the considered measures as a function of the length of the time series. In addition, we have divided the period 1926-1990 into three subintervals, (a) 1926-1945, (b) 1946-1965 and (c) 1966-1990, and calculated the KLL, KLU, SE and PE values for the time series in these subintervals. We find that during the period 1946-1965 there is a decrease in complexity, with corresponding changes in the SE and PE, in comparison to the period 1926-1990. This complexity loss may be primarily attributed to (i) human interventions on these two rivers after the Second World War, because of their use for water consumption, and (ii) climate change in recent times.
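
    Of the measures above, permutation entropy (Bandt-Pompe) is the simplest to reproduce: count ordinal patterns in sliding windows and take the normalized Shannon entropy of their distribution. A minimal sketch (illustrative only, not the authors' implementation):

```python
from itertools import permutations
from math import log2

def permutation_entropy(x, order=3):
    """Normalized permutation entropy (Bandt-Pompe) of a 1-D series, in [0, 1]."""
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(len(x) - order + 1):
        window = x[i:i + order]
        # ordinal pattern: indices of the window sorted by value (argsort)
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] += 1
    total = sum(counts.values())
    probs = [c / total for c in counts.values() if c > 0]
    h = -sum(p * log2(p) for p in probs)
    return h / log2(len(counts))   # normalize by log2(order!)

print(permutation_entropy([1, 2, 3, 4, 5, 6]))        # 0.0: fully ordered series
print(permutation_entropy([2, 9, 1, 7, 3, 8, 5, 6]))  # larger: irregular series
```

    Values near 0 indicate a highly regular series; values near 1 indicate that all ordinal patterns occur with similar frequency, as for white noise.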

  4. Array magnetics modal analysis for the DIII-D tokamak based on localized time-series modelling

    NASA Astrophysics Data System (ADS)

    Olofsson, K. E. J.; Hanson, J. M.; Shiraki, D.; Volpe, F. A.; Humphreys, D. A.; La Haye, R. J.; Lanctot, M. J.; Strait, E. J.; Welander, A. S.; Kolemen, E.; Okabayashi, M.

    2014-09-01

    Time-series analysis of magnetics data in tokamaks is typically done using block-based fast Fourier transform methods. This work presents the development and deployment of a new set of algorithms for magnetic probe array analysis. The method is based on an estimation technique known as stochastic subspace identification (SSI). Compared with the standard coherence approach or the direct singular value decomposition approach, the new technique exhibits several beneficial properties. For example, the SSI method does not require that frequencies be orthogonal with respect to the timeframe used in the analysis. Frequencies are obtained directly as parameters of localized time-series models, and the parameters are extracted by solving small-scale eigenvalue problems. Applications include maximum-likelihood regularized eigenmode pattern estimation, detection of neoclassical tearing modes (including locked-mode precursors), automatic clustering of modes, and magnetics-pattern characterization of sawtooth pre- and postcursors, edge harmonic oscillations and fishbones.

  5. Wavelet spectrum analysis approach to model validation of dynamic systems

    NASA Astrophysics Data System (ADS)

    Jiang, Xiaomo; Mahadevan, Sankaran

    2011-02-01

    Feature-based validation techniques for dynamic system models could be unreliable for nonlinear, stochastic, and transient dynamic behavior, where the time series is usually non-stationary. This paper presents a wavelet spectral analysis approach to validate a computational model for a dynamic system. Continuous wavelet transform is performed on the time series data for both model prediction and experimental observation using a Morlet wavelet function. The wavelet cross-spectrum is calculated for the two sets of data to construct a time-frequency phase difference map. The Box-plot, an exploratory data analysis technique, is applied to interpret the phase difference for validation purposes. In addition, wavelet time-frequency coherence is calculated using the locally and globally smoothed wavelet power spectra of the two data sets. Significance tests are performed to quantitatively verify whether the wavelet time-varying coherence is significant at a specific time and frequency point, considering uncertainties in both predicted and observed time series data. The proposed wavelet spectrum analysis approach is illustrated with a dynamics validation challenge problem developed at the Sandia National Laboratories. A comparison study is conducted to demonstrate the advantages of the proposed methodologies over classical frequency-independent cross-correlation analysis and time-independent cross-coherence analysis for the validation of dynamic systems.
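
    The core quantity here, the wavelet cross-spectrum, is the product of one series' continuous wavelet transform with the complex conjugate of the other's; its angle gives the time-frequency phase difference. A small NumPy sketch under stated assumptions (a single scale, a Morlet mother wavelet with center frequency w0 = 6, and two synthetic sinusoids; this is not the paper's code):

```python
import numpy as np

def morlet_cwt(x, scale, dt=1.0, w0=6.0):
    """CWT of x at a single scale using a complex Morlet mother wavelet."""
    n = len(x)
    u = (np.arange(n) - n // 2) * dt / scale
    psi = np.pi ** -0.25 * np.exp(1j * w0 * u - 0.5 * u ** 2) * np.sqrt(dt / scale)
    # convolving with the reversed conjugate wavelet = correlating with the wavelet
    return np.convolve(x, np.conj(psi)[::-1], mode="same")

# two unit sinusoids; y lags x by a quarter period (pi/2 radians)
dt, f = 0.01, 1.0
t = np.arange(0.0, 20.0, dt)
x = np.sin(2 * np.pi * f * t)
y = np.sin(2 * np.pi * f * t - np.pi / 2)

s = 6.0 / (2 * np.pi * f)          # scale roughly matched to the signal frequency
wx, wy = morlet_cwt(x, s, dt), morlet_cwt(y, s, dt)
cross = wx * np.conj(wy)           # wavelet cross-spectrum
mid = slice(len(t) // 4, 3 * len(t) // 4)   # discard edge-affected samples
phase = float(np.angle(cross[mid].mean()))
print(phase)                        # recovered phase difference, close to pi/2
```

    The recovered phase (about +π/2, since x leads y) is what a time-frequency phase difference map plots at every scale and time.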

  6. Seasonal and annual precipitation time series trend analysis in North Carolina, United States

    NASA Astrophysics Data System (ADS)

    Sayemuzzaman, Mohammad; Jha, Manoj K.

    2014-02-01

    The present study performs spatial and temporal trend analysis of annual and seasonal precipitation time series from 249 uniformly distributed stations across the state of North Carolina, United States over the period 1950-2009. The Mann-Kendall (MK) test, the Theil-Sen approach (TSA) and the Sequential Mann-Kendall (SQMK) test were applied to quantify the significance of the trend, the magnitude of the trend, and the trend shift, respectively. Regional (mountain, piedmont and coastal) precipitation trends were also analyzed using the above-mentioned tests. Prior to the application of the statistical tests, a pre-whitening technique was used to eliminate the effect of autocorrelation in the precipitation data series. The analysis shows a notable statewide increasing trend in winter precipitation and a decreasing trend in fall precipitation. Mixed (increasing/decreasing) statewide trends were detected in the annual, spring and summer precipitation time series. Significant trends (confidence level ≥ 95%) were detected in only 8, 7, 4 and 10 of the 249 stations in winter, spring, summer and fall, respectively. The magnitude of the largest increasing (decreasing) precipitation trend was about 4 mm/season (-4.50 mm/season) in the fall (summer) season. Annual precipitation trend magnitudes varied between -5.50 mm/year and 9 mm/year. Regional trend analysis found increasing precipitation in the mountain and coastal regions in general, except during the winter. The piedmont region was found to have increasing trends in summer and fall, but decreasing trends in winter, spring and on an annual basis. The SQMK trend-shift analysis identified a significant shift during 1960-70 in most parts of the state. Finally, the comparison of winter (summer) precipitation with the North Atlantic Oscillation (Southern Oscillation) indices concluded that the variability and trend of precipitation can be explained by the
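
    The two core estimators here are easy to state: the Mann-Kendall statistic S counts concordant minus discordant pairs, and the Theil-Sen slope is the median of all pairwise slopes. A minimal plain-Python sketch (ignoring the tie correction and pre-whitening the study applies; not the authors' code):

```python
from itertools import combinations
from math import sqrt
from statistics import median

def mann_kendall(x):
    """Mann-Kendall S statistic and its normal-approximation Z (no tie correction)."""
    s = sum((xj > xi) - (xj < xi) for xi, xj in combinations(x, 2))
    n = len(x)
    var = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / sqrt(var)
    elif s < 0:
        z = (s + 1) / sqrt(var)
    else:
        z = 0.0
    return s, z

def theil_sen(x):
    """Theil-Sen slope estimate: median of all pairwise slopes (unit time step)."""
    slopes = [(x[j] - x[i]) / (j - i) for i, j in combinations(range(len(x)), 2)]
    return median(slopes)

x = [1.0, 2.1, 2.9, 4.2, 5.0, 5.8]     # steadily rising series
s, z = mann_kendall(x)
print(s)                                # 15: all 15 pairs are increasing
print(round(theil_sen(x), 2))           # robust slope estimate, about 0.97
```

    For real use, `scipy.stats.kendalltau` and `scipy.stats.theilslopes` provide tie-corrected versions with confidence intervals.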

  7. Extensive mapping of coastal change in Alaska by Landsat time-series analysis, 1972-2013

    NASA Astrophysics Data System (ADS)

    Reynolds, J.; Macander, M. J.; Swingley, C. S.; Spencer, S. R.

    2014-12-01

    The landscape-scale effects of coastal storms on Alaska's Bering Sea and Gulf of Alaska coasts include coastal erosion, migration of spits and barrier islands, breaching of coastal lakes and lagoons, and inundation and salt-kill of vegetation. Large changes in coastal storm frequency and intensity are expected due to climate change and reduced sea-ice extent. Storms have a wide range of impacts on carbon fluxes and on fish and wildlife resources, infrastructure siting and operation, and emergency response planning. In areas experiencing moderate to large effects, changes can be mapped by analyzing trends in time series of Landsat imagery from Landsat 1 through Landsat 8. The authors are performing a time-series trend analysis for over 22,000 kilometers of coastline along the Bering Sea and Gulf of Alaska. Ice- and cloud-free Landsat imagery from Landsat 1-8, covering 1972-2013, was analyzed using a combination of regression, changepoint detection, and classification tree approaches to detect, classify, and map changes in near-infrared reflectance. Areas with significant changes in coastal features were identified, along with the timing of dominant changes and, in some cases, rates of change. The approach captured many coastal changes over the 42-year study period, including coastal erosion exceeding the 60-m pixel resolution of the Multispectral Scanner (MSS) data and migrations of coastal spits and estuarine channels.

  8. InSAR time series analysis of crustal deformation in southern California from 1992-2010

    NASA Astrophysics Data System (ADS)

    Liu, Z.; Lundgren, P.

    2010-12-01

    Since the early 1990s, Interferometric Synthetic Aperture Radar (InSAR) data have had some success imaging surface deformation of plate boundary deformation zones. The ~18 years of extensive data collection over southern California now make it possible to generate a long-time-interval InSAR-based line-of-sight (LOS) velocity map to examine the resolution of both steady-state and transient deformation processes. We perform InSAR time series analysis on an extensive catalog of ERS-1/2 and Envisat data from 1992 to the present in southern California by applying a variant of the Small Baseline Subset (SBAS) time series analysis approach. Despite the limitation imposed by atmospheric phase delay, the large number of data acquisitions and long duration of data sampling allow us to effectively suppress atmospheric noise through spatiotemporal smoothing in the time series analysis. We integrate an updated version of a California GPS velocity solution with InSAR to constrain the long-wavelength deformation signals while estimating and removing the effect of orbital error. A large number of interferograms (> 800) over 5 tracks in southern California have been processed and analyzed. We examine the time dependency of the resulting deformation patterns. Preliminary results from the ~18-year time series already reveal some interesting features. For example, the InSAR LOS displacements show significant transient variations at greater spatial resolution following the 1999 Mw7.1 Hector Mine earthquake. The 7-year post-seismic rate map demonstrates a broad transient deformation pattern and more localized deformation near the fault surface trace, reflecting a combined effect from afterslip, poroelastic, and viscoelastic relaxation at different spatiotemporal scales. We observe a variation of deformation rate across the Blackwater-Little Lake fault system in the Eastern California Shear Zone, suggesting a possible transient variation over this part of the plate boundary. The In

  9. Simulating photon-transport in uniform media using the radiative transport equation: a study using the Neumann-series approach

    PubMed Central

    Jha, Abhinav K.; Kupinski, Matthew A.; Masumura, Takahiro; Clarkson, Eric; Maslov, Alexey V.; Barrett, Harrison H.

    2014-01-01

    We present the implementation, validation, and performance of a Neumann-series approach for simulating light propagation at optical wavelengths in uniform media using the radiative transport equation (RTE). The RTE is solved for an anisotropic-scattering medium in a spherical harmonic basis for a diffuse-optical-imaging setup. The main objectives of this paper are threefold: to present the theory behind the Neumann-series form for the RTE, to design and develop the mathematical methods and the software to implement the Neumann series for a diffuse-optical-imaging setup, and, finally, to perform an exhaustive study of the accuracy, practical limitations, and computational efficiency of the Neumann-series method. Through our results, we demonstrate that the Neumann-series approach can be used to model light propagation in uniform media with small geometries at optical wavelengths. PMID:23201893
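
    The idea behind a Neumann series is that the solution of x = b + Ax can be written as the sum of successive applications of A (physically, successive scattering orders), converging when the spectral radius of A is below one. A toy NumPy sketch of this truncation, unrelated to the paper's RTE-specific operators:

```python
import numpy as np

def neumann_solve(A, b, terms=50):
    """Approximate (I - A)^-1 b by the truncated Neumann series sum_k A^k b."""
    x = np.zeros_like(b)
    term = b.copy()
    for _ in range(terms):
        x += term          # add the current "scattering order"
        term = A @ term    # next order: one more application of A
    return x

A = np.array([[0.2, 0.1], [0.05, 0.3]])   # spectral radius < 1, so the series converges
b = np.array([1.0, 2.0])
x = neumann_solve(A, b)
print(np.allclose(x, np.linalg.solve(np.eye(2) - A, b)))  # True
```

    The truncation error shrinks geometrically with the spectral radius, which is why the method suits weakly scattering (here: small-geometry) media.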

  10. On the Interpretation of Running Trends as Summary Statistics for Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Vigo, Isabel M.; Trottini, Mario; Belda, Santiago

    2016-04-01

    In recent years, running trends analysis (RTA) has been widely used in applied climate research as a summary statistic for time series analysis. RTA can be a useful descriptive tool, but despite its general use in applied research, precisely what it reveals about the underlying time series is unclear and, as a result, so is its interpretation. This work contributes to that interpretation in two ways: 1) an explicit formula is obtained for the set of time series with a given series of running trends, making it possible to show that running trends, alone, perform very poorly as summary statistics for time series analysis; and 2) an equivalence is established between RTA and the estimation of a (possibly nonlinear) trend component of the underlying time series using a weighted moving average filter. This equivalence provides solid ground for RTA implementation and interpretation/validation.
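
    A running trend is just the least-squares slope computed over a sliding window; since the OLS slope is a fixed linear combination of the window values (weights proportional to t minus the window's mean time), RTA is indeed a weighted moving average filter applied to the series. A plain-Python sketch of the computation (illustrative, not the authors' code):

```python
def ols_slope(y):
    """Least-squares slope of y against t = 0..n-1."""
    n = len(y)
    tbar = (n - 1) / 2.0
    ybar = sum(y) / n
    num = sum((t - tbar) * (yt - ybar) for t, yt in enumerate(y))
    den = sum((t - tbar) ** 2 for t in range(n))
    return num / den

def running_trends(y, window):
    """Series of OLS slopes over consecutive windows of fixed length."""
    return [ols_slope(y[i:i + window]) for i in range(len(y) - window + 1)]

y = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]   # exact linear trend with slope 1
print(running_trends(y, 4))           # every window recovers slope 1.0
```

    The slope weights (t - tbar) / den depend only on the window length, which makes the filter interpretation explicit.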

  11. Modeling Philippine Stock Exchange Composite Index Using Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Gayo, W. S.; Urrutia, J. D.; Temple, J. M. F.; Sandoval, J. R. D.; Sanglay, J. E. A.

    2015-06-01

    This study was conducted to develop a time series model of the Philippine Stock Exchange Composite Index (PSEi) and its volatility using a finite mixture of ARIMA models with conditional variance equations such as the ARCH, GARCH, EGARCH, TARCH and PARCH models. The study also aimed to find out the reason behind the behavior of PSEi, that is, which of the economic variables - Consumer Price Index, crude oil price, foreign exchange rate, gold price, interest rate, money supply, price-earnings ratio, Producers' Price Index and terms of trade - can be used in projecting future values of PSEi; this was examined using the Granger Causality Test. The findings showed that the best time series model for the Philippine Stock Exchange Composite Index is ARIMA(1,1,5) - ARCH(1). Also, the Consumer Price Index, crude oil price and foreign exchange rate were concluded to Granger-cause the Philippine Stock Exchange Composite Index.

  12. Engage: The Science Speaker Series - A novel approach to improving science outreach and communication

    NASA Astrophysics Data System (ADS)

    Mitchell, R.; Hilton, E.; Rosenfield, P.

    2011-12-01

    Communicating the results and significance of basic research to the general public is of critical importance. Federal funding and university budgets are under substantial pressure, and taxpayer support of basic research is critical. Public outreach by ecologists is an important vehicle for increasing support and understanding of science in an era of anthropogenic global change. At present, very few programs or courses exist to allow young scientists the opportunity to hone and practice their public outreach skills. Although the need for science outreach and communication is recognized, graduate programs often fail to provide any training in making science accessible. Engage: The Science Speaker Series represents a unique, graduate student-led effort to improve public outreach skills. Founded in 2009, Engage was created by three science graduate students at the University of Washington. The students developed a novel, interdisciplinary curriculum to investigate why science outreach often fails, to improve graduate student communication skills, and to help students create a dynamic, public-friendly talk. The course incorporates elements of story-telling, improvisational arts, and development of analogy, all with a focus on clarity, brevity and accessibility. This course was offered to graduate students and post-doctoral researchers from a wide variety of sciences in the autumn of 2010. Students who participated in the Engage course were then given the opportunity to participate in Engage: The Science Speaker Series. This free, public-friendly speaker series is hosted on the University of Washington campus and has had substantial public attendance and participation. The growing success of Engage illustrates the need for such programs throughout graduate level science curricula. 
We present the impetus for the development of the program, elements of the curriculum covered in the Engage course, the importance of an interdisciplinary approach, and discuss strategies for

  13. Engage: The Science Speaker Series - A novel approach to improving science outreach and communication

    NASA Astrophysics Data System (ADS)

    Mitchell, R.; Hilton, E.; Rosenfield, P.

    2012-12-01

    Communicating the results and significance of basic research to the general public is of critical importance. Federal funding and university budgets are under substantial pressure, and taxpayer support of basic research is critical. Public outreach by ecologists is an important vehicle for increasing support and understanding of science in an era of anthropogenic global change. At present, very few programs or courses exist to allow young scientists the opportunity to hone and practice their public outreach skills. Although the need for science outreach and communication is recognized, graduate programs often fail to provide any training in making science accessible. Engage: The Science Speaker Series represents a unique, graduate student-led effort to improve public outreach skills. Founded in 2009, Engage was created by three science graduate students at the University of Washington. The students developed a novel, interdisciplinary curriculum to investigate why science outreach often fails, to improve graduate student communication skills, and to help students create a dynamic, public-friendly talk. The course incorporates elements of story-telling, improvisational arts, and development of analogy, all with a focus on clarity, brevity and accessibility. This course was offered to graduate students and post-doctoral researchers from a wide variety of sciences in the autumn of 2010 and 2011, and will be retaught in 2012. Students who participated in the Engage course were then given the opportunity to participate in Engage: The Science Speaker Series. This free, public-friendly speaker series has been hosted at the University of Washington campus and Seattle Town Hall, and has had substantial public attendance and participation. The growing success of Engage illustrates the need for such programs throughout graduate level science curricula. We present the impetus for the development of the program, elements of the curriculum covered in the Engage course, the

  14. Multifractal analysis of time series generated by discrete Ito equations

    SciTech Connect

    Telesca, Luciano; Czechowski, Zbigniew; Lovallo, Michele

    2015-06-15

    In this study, we show that discrete Ito equations with a short-tailed Gaussian marginal distribution function generate multifractal time series. The multifractality is due to nonlinear correlations, which are hidden in Markov processes and are generated by the interrelation between the drift and the multiplicative stochastic forces in the Ito equation. A link between the range of the generalized Hurst exponents and the mean of the squares of all averaged net forces is suggested.
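
    A discrete Ito equation of the kind discussed iterates x[k+1] = x[k] + f(x[k]) + g(x[k])·ξ[k] with Gaussian noise ξ. The drift and noise functions below (linear drift f(x) = -a·x, multiplicative amplitude g(x) = b·sqrt(1 + x²)) are illustrative choices of my own, not the specific equations studied in the paper:

```python
import random

def discrete_ito(n, a=0.2, b=0.1, x0=1.0, seed=42):
    """Generate a time series from an illustrative discrete Ito equation
    x[k+1] = x[k] + f(x[k]) + g(x[k]) * xi[k], with standard Gaussian xi,
    drift f(x) = -a*x and multiplicative noise g(x) = b*sqrt(1 + x*x)."""
    rng = random.Random(seed)
    x, series = x0, [x0]
    for _ in range(n - 1):
        xi = rng.gauss(0.0, 1.0)
        x = x + (-a * x) + b * (1.0 + x * x) ** 0.5 * xi
        series.append(x)
    return series

ts = discrete_ito(10000)
print(len(ts), abs(sum(ts) / len(ts)))   # mean-reverting: sample mean near 0
```

    Series generated this way are what one would then feed to a multifractal detrended fluctuation analysis to estimate the generalized Hurst exponents.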

  15. Financial time series analysis based on information categorization method

    NASA Astrophysics Data System (ADS)

    Tian, Qiang; Shang, Pengjian; Feng, Guochen

    2014-12-01

    This paper applies the information categorization method to analyze financial time series. The method examines the similarity of different sequences by calculating the distances between them. We apply it to quantify the similarity of different stock markets, and we report results on the similarity of the US and Chinese stock markets in the periods 1991-1998 (before the Asian currency crisis), 1999-2006 (after the Asian currency crisis and before the global financial crisis), and 2007-2013 (during and after the global financial crisis). The results show differences in similarity between the stock markets across time periods, and show that the similarity of the two stock markets became larger after these two crises. We also obtain similarity results for 10 stock indices in three regions; the method can distinguish the markets of different regions in the resulting phylogenetic trees. The results show that satisfactory information can be extracted from financial markets by this method. The information categorization method can be applied not only to physiologic time series, but also to financial time series.

  16. Dynamical Analysis and Visualization of Tornadoes Time Series

    PubMed Central

    2015-01-01

    In this paper we analyze the behavior of tornado time series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long-range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series spanning 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns. PMID:25790281

  17. Dynamical analysis and visualization of tornadoes time series.

    PubMed

    Lopes, António M; Tenreiro Machado, J A

    2015-01-01

    In this paper we analyze the behavior of tornado time series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long-range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series spanning 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns. PMID:25790281

  18. Characterization of Ground Deformation above AN Urban Tunnel by Means of Insar Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Ferretti, A.; Iannacone, J.; Falorni, G.; Berti, M.; Corsini, A.

    2013-12-01

    Ground deformation produced by tunnel excavation in urban areas can cause damage to buildings and infrastructure. In these contexts, monitoring systems are required to determine the surface area affected by displacement and the rates of movement. Advanced multi-image satellite-based InSAR approaches are uniquely suited for this purpose, as they provide an overview of the entire affected area and can measure movement rates with millimeter precision. Persistent scatterer approaches such as SqueeSAR™ use reflections off buildings, lampposts, roads, etc., to produce a high-density point cloud in which each point has a time series of deformation spanning the period covered by the imagery. We investigated an area of about 10 km2 in North Vancouver (Canada), where shaft excavation for the Seymour-Capilano water filtration plant started in 2004. As part of the project, twin tunnels in bedrock were excavated to transfer water from the Capilano Reservoir to the treatment plant. A radar dataset comprising 58 images (spanning March 2001 to June 2008) acquired by the Radarsat-1 satellite and covering the period of excavation was processed with the SqueeSAR™ algorithm (Ferretti et al., 2011) to assess the ground deformation caused by the tunnel excavation. To better characterize the deformation in the time and space domains and correlate ground movement with excavation, an in-depth time series analysis was carried out. Berti et al. (2013) developed an automatic procedure for the analysis of InSAR time series based on a sequence of statistical tests. The tool classifies time series into six distinctive types (uncorrelated; linear; quadratic; bilinear; discontinuous without constant velocity; discontinuous with change in velocity), which can be linked to different physical phenomena. It also provides a series of descriptive parameters which can be used to characterize the temporal changes of ground motion. We processed the movement time series with PSTime to determine the

  19. Clinical Immunology Review Series: An approach to the patient with recurrent superficial abscesses

    PubMed Central

    Johnston, S L

    2008-01-01

    ARTICLES PUBLISHED IN THIS CLINICAL IMMUNOLOGY REVIEW SERIES allergy in childhood, allergy diagnosis by use of the clinical immunology laboratory, anaphylaxis, angioedema, management of pulmonary disease in primary antibody deficiency, recurrent infections in childhood, recurrent infections in adulthood, recurrent oro-genital ulceration, recurrent superficial abscesses, urticaria, vasculitis/CTD Patients may be referred to the immunology clinic for investigation of recurrent superficial abscess formation. In the majority of adult patients this clinical presentation does not equate with an underlying primary immune deficiency. Nevertheless, recurrent mucocutaneous abscesses can be associated with significant morbidity and long-term complications, including scarring and fistula formation, and may be associated with underlying immune-mediated disease. This review sets out an approach to the patient with recurrent superficial abscesses, focusing on the differential diagnoses, investigation and management of both the common causes and those associated with specific immune deficiency. PMID:18422735

  20. Catchment classification based on a comparative analysis of time series of natural tracers

    NASA Astrophysics Data System (ADS)

    Lehr, Christian; Lischeid, Gunnar; Tetzlaff, Doerthe

    2014-05-01

    Catchments not only smooth the precipitation signal into the discharge hydrograph, but also transform chemical signals (e.g. contaminations or nutrients) in a characteristic way. Under the assumption of an approximately homogeneous input signal of a conservative tracer in the catchment, the transformation of the signal at different locations can be used to infer hydrological properties of the catchment. For this study, comprehensive data on geology, soils, topography, land use, etc., as well as hydrological knowledge about transit times, mixing ratios of base flow, etc., are available for the catchment of the river Dee (1849 km2) in Scotland, UK. The Dee has its origin in the Cairngorm Mountains in central Scotland and flows towards the eastern coast of Scotland, where it enters the North Sea at Aberdeen. From the source in the west to the coast in the east there is a distinct decrease in precipitation and altitude. For one year, water quality in the Dee was sampled biweekly at 59 sites along the main stem of the river and at the outflows of a number of tributaries. A nonlinear variant of Principal Component Analysis (Isometric Feature Mapping) was applied to time series of different chemical parameters that were assumed to be relatively conservative and applicable as natural tracers. The information in the time series was not used to analyse the temporal development at the different sites; rather, in a snapshot-type approach, the spatial expression of the different solutes was analysed at the 26 sampling dates. For all natural tracers the first component depicted > 89 % of the variance in the series. Subsequently, the spatial expression of the first component was related to the spatial patterns of the catchment characteristics. The presented approach allows a catchment to be characterised in a spatially discrete way according to its hydrologically active properties at the landscape scale, which is often the scale of interest for water management purposes.

  1. Wasserstein distances in the analysis of time series and dynamical systems

    NASA Astrophysics Data System (ADS)

    Muskulus, Michael; Verduyn-Lunel, Sjoerd

    2011-01-01

    A new approach based on Wasserstein distances, which are numerical costs of an optimal transportation problem, allows us to analyze nonlinear phenomena in a robust manner. The long-term behavior is reconstructed from time series, resulting in a probability distribution over phase space. Each pair of probability distributions is then assigned a numerical distance that quantifies the differences in their dynamical properties. From the totality of all these distances a low-dimensional representation in a Euclidean space is derived, in which the time series can be classified and statistically analyzed. This representation shows the functional relationships between the dynamical systems under study. It allows us to assess synchronization properties and also offers a new way of numerical bifurcation analysis. The statistical techniques for this distance-based analysis of dynamical systems are presented, filling a gap in the literature, and their application is discussed in a few examples of datasets arising in physiology and neuroscience, and in the well-known Hénon system.
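
    For one-dimensional empirical distributions with equal sample sizes, the first Wasserstein distance has a closed form: sort both samples and average the absolute differences. The full method reconstructs distributions over phase space first; this is only a 1-D sketch of the distance itself (not the authors' code):

```python
def wasserstein_1d(a, b):
    """First Wasserstein (earth mover's) distance between two equal-size 1-D
    empirical distributions: mean absolute difference of the sorted samples."""
    assert len(a) == len(b), "equal sample sizes assumed in this sketch"
    return sum(abs(x - y) for x, y in zip(sorted(a), sorted(b))) / len(a)

a = [0.0, 1.0, 2.0]
b = [1.0, 2.0, 3.0]
print(wasserstein_1d(a, b))   # 1.0: optimal transport shifts each point by 1
```

    A matrix of such pairwise distances between reconstructed distributions is what multidimensional scaling then embeds into the low-dimensional Euclidean representation described above.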

  2. Combined use of correlation dimension and entropy as discriminating measures for time series analysis

    NASA Astrophysics Data System (ADS)

    Harikrishnan, K. P.; Misra, R.; Ambika, G.

    2009-09-01

    We show that the combined use of correlation dimension (D2) and correlation entropy (K2) as discriminating measures can extract more accurate information regarding the different types of noise present in time series data. For this, we make use of an algorithmic approach for computing D2 and K2 proposed by us recently [Harikrishnan KP, Misra R, Ambika G, Kembhavi AK. Physica D 2006;215:137; Harikrishnan KP, Ambika G, Misra R. Mod Phys Lett B 2007;21:129; Harikrishnan KP, Misra R, Ambika G. Pramana - J Phys, in press], which is a modification of the standard Grassberger-Procaccia scheme. While the presence of white noise can be easily identified by computing D2 of the data and surrogates, K2 is a better discriminating measure for detecting colored noise in the data. Analysis of a time series from a real-world system involving both white and colored noise is presented as evidence. To our knowledge, this is the first time that such a combined analysis has been undertaken on real-world data.
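    The standard Grassberger-Procaccia correlation sum, which the authors' algorithm modifies, can be sketched as follows; the logistic map stands in for observed data, and the embedding and radius choices are illustrative assumptions:

```python
import numpy as np

def delay_embed(x, m, tau):
    """Time-delay embedding of a scalar series into m dimensions."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(m)])

def correlation_sum(Y, r):
    """Grassberger-Procaccia C(r): fraction of point pairs closer than r."""
    d = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1)
    return np.mean(d[np.triu_indices(len(Y), k=1)] < r)

# Chaotic logistic-map series as a stand-in for real data.
x = np.empty(1500)
x[0] = 0.4
for i in range(1, len(x)):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

Y = delay_embed(x, m=2, tau=1)
r1, r2 = 0.05, 0.2
# D2 estimated as the slope of log C(r) versus log r over this range.
D2 = np.log(correlation_sum(Y, r2) / correlation_sum(Y, r1)) / np.log(r2 / r1)
print(round(float(D2), 2))
```

K2 follows from how C(r) decays with embedding dimension; the scheme above only illustrates the D2 half of the pair.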

  3. Cross-recurrence quantification analysis of categorical and continuous time series: an R package

    PubMed Central

    Coco, Moreno I.; Dale, Rick

    2014-01-01

    This paper describes the R package crqa to perform cross-recurrence quantification analysis of two time series of either a categorical or continuous nature. Streams of behavioral information, from eye movements to linguistic elements, unfold over time. When two people interact, such as in conversation, they often adapt to each other, leading these behavioral levels to exhibit recurrent states. In dialog, for example, interlocutors adapt to each other by exchanging interactive cues: smiles, nods, gestures, choice of words, and so on. In order for us to capture closely the goings-on of dynamic interaction, and uncover the extent of coupling between two individuals, we need to quantify how much recurrence is taking place at these levels. Methods available in crqa would allow researchers in cognitive science to pose such questions as how much are two people recurrent at some level of analysis, what is the characteristic lag time for one person to maximally match another, or whether one person is leading another. First, we set the theoretical ground to understand the difference between “correlation” and “co-visitation” when comparing two time series, using an aggregative or cross-recurrence approach. Then, we describe more formally the principles of cross-recurrence, and show with the current package how to carry out analyses applying them. We end the paper by comparing computational efficiency, and results’ consistency, of crqa R package, with the benchmark MATLAB toolbox crptoolbox (Marwan, 2013). We show perfect comparability between the two libraries on both levels. PMID:25018736
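    The core computation behind cross-recurrence (not the crqa package itself, which is an R library) can be sketched in NumPy for a pair of 1-D series; the data, radius, and lag window below are hypothetical:

```python
import numpy as np

def cross_recurrence(x, y, radius):
    """Binary cross-recurrence matrix for two 1-D series:
    CR[i, j] = 1 when x[i] and y[j] fall within `radius` of each other."""
    return (np.abs(x[:, None] - y[None, :]) < radius).astype(int)

rng = np.random.default_rng(2)
base = np.convolve(rng.standard_normal(260), np.ones(15) / 15, mode="valid")
x = base[5:205]                                   # the "leader"
y = base[:200] + 0.01 * rng.standard_normal(200)  # the "follower", 5 steps behind

CR = cross_recurrence(x, y, radius=0.05)
recurrence_rate = CR.mean()
# Diagonal-wise recurrence profile: its peak gives the characteristic
# lag at which one series maximally matches the other.
lags = np.arange(-20, 21)
profile = [np.diagonal(CR, offset=int(k)).mean() for k in lags]
best_lag = int(lags[np.argmax(profile)])
print(best_lag)  # 5
```

The recovered lag answers exactly the "who leads whom, and by how much" question the abstract raises.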

  4. Flow Analysis: A Novel Approach For Classification.

    PubMed

    Vakh, Christina; Falkova, Marina; Timofeeva, Irina; Moskvin, Alexey; Moskvin, Leonid; Bulatov, Andrey

    2016-09-01

    We suggest a novel approach for classification of flow analysis methods according to the conditions under which the mass transfer processes and chemical reactions take place in the flow mode: dispersion-convection flow methods and forced-convection flow methods. The first group includes continuous flow analysis, flow injection analysis, all injection analysis, sequential injection analysis, sequential injection chromatography, cross injection analysis, multi-commutated flow analysis, multi-syringe flow injection analysis, multi-pumping flow systems, loop flow analysis, and simultaneous injection effective mixing flow analysis. The second group includes segmented flow analysis, zone fluidics, flow batch analysis, sequential injection analysis with a mixing chamber, stepwise injection analysis, and multi-commutated stepwise injection analysis. The proposed classification makes it possible to systematize a large number of flow analysis methods. Recent developments and applications of dispersion-convection flow methods and forced-convection flow methods are presented. PMID:26364745

  5. Time-Series Analysis of Supergranule Characteristics at Solar Minimum

    NASA Technical Reports Server (NTRS)

    Williams, Peter E.; Pesnell, W. Dean

    2013-01-01

    Sixty days of Doppler images from the Solar and Heliospheric Observatory (SOHO) / Michelson Doppler Imager (MDI) investigation during the 1996 and 2008 solar minima have been analyzed to show that certain supergranule characteristics (size, size range, and horizontal velocity) exhibit fluctuations of three to five days. Cross-correlating parameters showed a good, positive correlation between supergranulation size and size range, and a moderate, negative correlation between size range and velocity. The size and velocity also exhibit a moderate, negative correlation, but with a small time lag (less than 12 hours). Supergranule sizes during five days of co-temporal data from MDI and the Solar Dynamics Observatory (SDO) / Helioseismic and Magnetic Imager (HMI) exhibit similar fluctuations with a high level of correlation between them, verifying the solar origin of the fluctuations and ruling out instrumental artifacts. Similar fluctuations are also observed in data simulations that model the evolution of the MDI Doppler pattern over a 60-day period. Correlations between the supergranule size and size range time series derived from the simulated data are similar to those seen in MDI data. A simple toy model using cumulative, uncorrelated exponential growth and decay patterns at random emergence times produces a time series similar to the data simulations. The qualitative similarities between the simulated and the observed time series suggest that the fluctuations arise from stochastic processes occurring within the solar convection zone. This behavior, propagating to surface manifestations of supergranulation, may assist our understanding of magnetic-field-line advection, evolution, and interaction.
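    The lagged anti-correlation between size and velocity can be illustrated with a toy 60-day series. The data here are synthetic, with a built-in one-sample lag standing in for the reported sub-12-hour lag; nothing below reproduces the MDI analysis itself:

```python
import numpy as np

def lagged_corr(a, b, lag):
    """Pearson correlation of a[t] with b[t + lag]."""
    if lag >= 0:
        return np.corrcoef(a[:len(a) - lag], b[lag:])[0, 1]
    return np.corrcoef(a[-lag:], b[:len(b) + lag])[0, 1]

rng = np.random.default_rng(8)
# Hypothetical 60-day supergranule series: "velocity" tracks "size"
# negatively, lagging it by one sample.
common = np.convolve(rng.standard_normal(70), np.ones(5) / 5, mode="valid")
size = common[1:61] + 0.1 * rng.standard_normal(60)
velocity = -common[0:60] + 0.1 * rng.standard_normal(60)

lags = list(range(-5, 6))
r = [lagged_corr(size, velocity, k) for k in lags]
best = lags[int(np.argmin(r))]  # lag of the most negative correlation
print(best)
```

Scanning the correlation over lags, rather than reading only the zero-lag value, is what exposes the small offset the abstract mentions.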

  6. Time series analysis of transient chaos: Theory and experiment

    SciTech Connect

    Janosi, I.M.; Tel, T.

    1996-06-01

    A simple method is described for reconstructing nonattracting chaotic sets from time series by gluing together those pieces of many transiently chaotic signals that come close to the invariant set. The method is illustrated both on a map of well-known dynamics, the Hénon map, and on a signal obtained from an experiment, the NMR laser. The strange saddle responsible for the transient chaotic behavior is reconstructed and its characteristics, such as dimension, Lyapunov exponent, and correlation function, are determined. © 1996 American Institute of Physics.

  7. Advantages of the Multiple Case Series Approach to the Study of Cognitive Deficits in Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Towgood, Karren J.; Meuwese, Julia D. I.; Gilbert, Sam J.; Turner, Martha S.; Burgess, Paul W.

    2009-01-01

    In the neuropsychological case series approach, tasks are administered that tap different cognitive domains, and differences within rather than across individuals are the basis for theorising; each individual is effectively their own control. This approach is a mainstay of cognitive neuropsychology, and is particularly suited to the study of…

  8. Anatomy of the ICDS series: A bibliometric analysis

    NASA Astrophysics Data System (ADS)

    Cardona, Manuel; Marx, Werner

    2007-12-01

    In this article, the proceedings of the International Conferences on Defects in Semiconductors (ICDS) have been analyzed by bibliometric methods. The papers of these conferences have been published as articles in regular journals or special proceedings journals and in books with diverse publishers. The conference name/title changed several times. Many of the proceedings did not appear in the so-called “source journals” covered by the Thomson/ISI citation databases, in particular by the Science Citation Index (SCI). But the number of citations within these source journals can be determined using the Cited Reference Search mode under the Web of Science (WoS) and the SCI offered by the host STN International. The search functions of both systems were needed to select the papers published as different document types and to cover the full time span of the series. The most cited ICDS papers were identified, and the overall numbers of citations, as well as the time-dependent impact of these papers, of single conferences, and of the complete series, were established. The complete set of citing papers was analyzed with respect to the countries of the citing authors, the citing journals, and the ISI subject categories.

  9. A new complexity measure for time series analysis and classification

    NASA Astrophysics Data System (ADS)

    Nagaraj, Nithin; Balasubramanian, Karthi; Dey, Sutirth

    2013-07-01

    Complexity measures are used in a number of applications, including extraction of information from data such as ecological time series, detection of non-random structure in biomedical signals, testing of random number generators, language recognition, and authorship attribution. Different complexity measures proposed in the literature, such as Shannon entropy, relative entropy, Lempel-Ziv, Kolmogorov, and algorithmic complexity, are mostly ineffective in analyzing short sequences that are further corrupted with noise. To address this problem, we propose a new complexity measure, ETC, defined as the "Effort To Compress" the input sequence by a lossless compression algorithm. Here, we employ the lossless compression algorithm known as Non-Sequential Recursive Pair Substitution (NSRPS) and define ETC as the number of iterations needed for NSRPS to transform the input sequence to a constant sequence. We demonstrate the utility of ETC in two applications. ETC is shown to have better correlation with the Lyapunov exponent than Shannon entropy, even with relatively short and noisy time series. The measure also has a greater rate of success in automatic identification and classification of short noisy sequences, compared to entropy and a popular measure based on Lempel-Ziv compression (implemented by Gzip).
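    A minimal sketch of NSRPS-based ETC for a symbolic (integer) sequence, assuming the simple convention that each iteration replaces the most frequent adjacent pair with a fresh symbol (the published algorithm may count pairs slightly differently):

```python
def etc(seq):
    """Effort To Compress: number of NSRPS iterations needed to reduce
    `seq` (a sequence of ints) to a constant sequence."""
    seq = list(seq)
    steps = 0
    while len(set(seq)) > 1:
        # Most frequent adjacent pair (simple overlapping count).
        pairs = list(zip(seq, seq[1:]))
        top = max(set(pairs), key=pairs.count)
        new_symbol = max(seq) + 1  # fresh symbol for the substituted pair
        out, i = [], 0
        while i < len(seq):
            if i + 1 < len(seq) and (seq[i], seq[i + 1]) == top:
                out.append(new_symbol)
                i += 2
            else:
                out.append(seq[i])
                i += 1
        seq = out
        steps += 1
    return steps

print(etc([0, 0, 0, 0]))        # 0: already constant
print(etc([0, 1, 0, 1, 0, 1]))  # 1: one substitution collapses the period
```

Low-complexity (periodic) sequences need few substitutions; irregular sequences need many, which is what makes the count usable as a complexity measure.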

  10. Advantages of the multiple case series approach to the study of cognitive deficits in autism spectrum disorder

    PubMed Central

    Towgood, Karren J.; Meuwese, Julia D.I.; Gilbert, Sam J.; Turner, Martha S.; Burgess, Paul W.

    2009-01-01

    In the neuropsychological case series approach, tasks are administered that tap different cognitive domains, and differences within rather than across individuals are the basis for theorising; each individual is effectively their own control. This approach is a mainstay of cognitive neuropsychology, and is particularly suited to the study of populations with heterogeneous deficits. However it has very rarely been applied to the study of cognitive differences in autism spectrum disorder (ASD). Here, we investigate whether this approach can yield information beyond that given by the typical group study method, when applied to an ASD population. Twenty-one high-functioning adult ASD participants and 22 IQ, age, and gender-matched control participants were administered a large battery of neuropsychological tests that would represent a typical neuropsychological assessment for neurological patients in the United Kingdom. The data were analysed using both group and single-case study methods. The group analysis revealed a limited number of deficits, principally on tests with a large executive function component, with no impairment in more routine abilities such as basic attending, language and perception. Single-case study analysis proved more fruitful revealing evidence of considerable variation in abilities both between and within ASD participants. Both sub-normal and supra-normal performance were observed, with the most defining feature of the ASD group being this variability. We conclude that the use of group-level analysis alone in the study of cognitive deficits in ASD risks missing cognitive characteristics that may be vitally important both theoretically and clinically, and even may be misleading because of averaging artifact. PMID:19580821

  11. Advantages of the multiple case series approach to the study of cognitive deficits in autism spectrum disorder.

    PubMed

    Towgood, Karren J; Meuwese, Julia D I; Gilbert, Sam J; Turner, Martha S; Burgess, Paul W

    2009-11-01

    In the neuropsychological case series approach, tasks are administered that tap different cognitive domains, and differences within rather than across individuals are the basis for theorising; each individual is effectively their own control. This approach is a mainstay of cognitive neuropsychology, and is particularly suited to the study of populations with heterogeneous deficits. However it has very rarely been applied to the study of cognitive differences in autism spectrum disorder (ASD). Here, we investigate whether this approach can yield information beyond that given by the typical group study method, when applied to an ASD population. Twenty-one high-functioning adult ASD participants and 22 IQ, age, and gender-matched control participants were administered a large battery of neuropsychological tests that would represent a typical neuropsychological assessment for neurological patients in the United Kingdom. The data were analysed using both group and single-case study methods. The group analysis revealed a limited number of deficits, principally on tests with a large executive function component, with no impairment in more routine abilities such as basic attending, language and perception. Single-case study analysis proved more fruitful revealing evidence of considerable variation in abilities both between and within ASD participants. Both sub-normal and supra-normal performance were observed, with the most defining feature of the ASD group being this variability. We conclude that the use of group-level analysis alone in the study of cognitive deficits in ASD risks missing cognitive characteristics that may be vitally important both theoretically and clinically, and even may be misleading because of averaging artifact. PMID:19580821

  12. Class D Management Implementation Approach of the First Orbital Mission of the Earth Venture Series

    NASA Technical Reports Server (NTRS)

    Wells, James E.; Scherrer, John; Law, Richard; Bonniksen, Chris

    2013-01-01

    A key element of the National Research Council's Earth Science and Applications Decadal Survey called for the creation of the Venture Class line of low-cost research and application missions within NASA (National Aeronautics and Space Administration). One key component of the architecture chosen by NASA within the Earth Venture line is a series of self-contained stand-alone spaceflight science missions called "EV-Mission". The first mission chosen for this competitively selected, cost and schedule capped, Principal Investigator-led opportunity is the CYclone Global Navigation Satellite System (CYGNSS). As specified in the defining Announcement of Opportunity, the Principal Investigator is held responsible for successfully achieving the science objectives of the selected mission and the management approach that he/she chooses to obtain those results has a significant amount of freedom as long as it meets the intent of key NASA guidance like NPR 7120.5 and 7123. CYGNSS is classified under NPR 7120.5E guidance as a Category 3 (low priority, low cost) mission and carries a Class D risk classification (low priority, high risk) per NPR 8705.4. As defined in the NPR guidance, Class D risk classification allows for a relatively broad range of implementation strategies. The management approach that will be utilized on CYGNSS is a streamlined implementation that starts with a higher risk tolerance posture at NASA and that philosophy flows all the way down to the individual part level.

  13. Hydrological alteration along the Missouri River Basin: A time series approach

    USGS Publications Warehouse

    Pegg, M.A.; Pierce, C.L.; Roy, A.

    2003-01-01

    Human alteration of large rivers is commonplace, often resulting in significant changes in flow characteristics. We used a time series approach to examine daily mean flow data from locations throughout the main-stem Missouri River. Data from a pre-alteration period (1925-1948) were compared with a post-alteration period (1967-1996), with separate analyses conducted using either data from the entire year or restricted to the spring fish spawning period (1 April-30 June). Daily mean flows were significantly higher during the post-alteration period at all locations. Flow variability was markedly reduced during the post-alteration period, a probable result of flow regulation and climatological shifts. Daily mean flow during the spring fish spawning period was significantly lower during the post-alteration period at the most highly altered locations in the middle portion of the river, but unchanged at the least altered locations in the upper and lower portions of the river. Our data also corroborate other analyses, based on alternative statistical approaches, that suggest similar changes to the Missouri River system. Our results suggest that human alterations on the Missouri River, particularly in the middle portion most strongly affected by impoundments and channelization, have resulted in changes to the natural flow regime.
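    The pre-/post-alteration comparison can be sketched with generic two-sample tests; the gamma-distributed flows below are purely hypothetical stand-ins for gauge records, and the tests are simple substitutes for the paper's time series methods:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical daily mean flows (m^3/s) for one gauge: regulation raises
# the mean and damps the variability, as reported for the Missouri River.
pre = rng.gamma(shape=2.0, scale=300.0, size=8766)    # ~24 years, 1925-1948
post = rng.gamma(shape=8.0, scale=100.0, size=10957)  # ~30 years, 1967-1996

t, p_mean = stats.ttest_ind(pre, post, equal_var=False)  # shift in mean flow
lev, p_var = stats.levene(pre, post)                     # change in variability
print(post.mean() > pre.mean(), post.std() < pre.std())
```

A higher post-alteration mean together with lower variance is the signature the abstract describes for the regulated reaches.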

  14. Class D management implementation approach of the first orbital mission of the Earth Venture series

    NASA Astrophysics Data System (ADS)

    Wells, James E.; Scherrer, John; Law, Richard; Bonniksen, Chris

    2013-09-01

    A key element of the National Research Council's Earth Science and Applications Decadal Survey called for the creation of the Venture Class line of low-cost research and application missions within NASA (National Aeronautics and Space Administration). One key component of the architecture chosen by NASA within the Earth Venture line is a series of self-contained stand-alone spaceflight science missions called "EV-Mission". The first mission chosen for this competitively selected, cost and schedule capped, Principal Investigator-led opportunity is the CYclone Global Navigation Satellite System (CYGNSS). As specified in the defining Announcement of Opportunity, the Principal Investigator is held responsible for successfully achieving the science objectives of the selected mission and the management approach that he/she chooses to obtain those results has a significant amount of freedom as long as it meets the intent of key NASA guidance like NPR 7120.5 and 7123. CYGNSS is classified under NPR 7120.5E guidance as a Category 3 (low priority, low cost) mission and carries a Class D risk classification (low priority, high risk) per NPR 8705.4. As defined in the NPR guidance, Class D risk classification allows for a relatively broad range of implementation strategies. The management approach that will be utilized on CYGNSS is a streamlined implementation that starts with a higher risk tolerance posture at NASA and that philosophy flows all the way down to the individual part level.

  15. Modular Approach to Instrumental Analysis.

    ERIC Educational Resources Information Center

    Deming, Richard L.; And Others

    1982-01-01

    To remedy certain deficiencies, an instrument analysis course was reorganized into six one-unit modules: optical spectroscopy, magnetic resonance, separations, electrochemistry, radiochemistry, and computers and interfacing. Selected aspects of the course are discussed. (SK)

  16. Different approaches of spectral analysis

    NASA Technical Reports Server (NTRS)

    Lacoume, J. L.

    1977-01-01

    Several approaches to calculating the spectral power density of a random function from an estimate of its autocorrelation function were studied, and a comparative study of these different methods was presented. The principles on which they are based and the hypotheses they imply were pointed out. Some indications on optimizing the length of the estimated correlation function were given. An example applying the different methods discussed in this paper was included.
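    One classical route, estimating the power spectral density as the Fourier transform of the autocorrelation estimate (Wiener-Khinchin), can be sketched as follows; with the circular autocorrelation estimate used here, the result coincides with the periodogram, which makes the identity easy to check:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1024
t = np.arange(n)
x = np.sin(2 * np.pi * 0.1 * t) + 0.5 * rng.standard_normal(n)

# Direct estimate: the periodogram.
periodogram = np.abs(np.fft.fft(x)) ** 2 / n

# Indirect estimate: FFT of the circular autocorrelation estimate.
acf = np.fft.ifft(np.abs(np.fft.fft(x)) ** 2).real / n
psd_from_acf = np.fft.fft(acf).real

print(np.allclose(periodogram, psd_from_acf))  # True
```

The methods the abstract compares differ mainly in how the correlation estimate is truncated and windowed before this final transform, which trades variance against spectral resolution.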

  17. Time series analysis of molecular dynamics simulation using wavelet

    NASA Astrophysics Data System (ADS)

    Toda, Mikito

    2012-08-01

    A new method is presented to extract nonstationary features of slow collective motion from time series data of molecular dynamics simulations of proteins. The method consists of two steps: (1) wavelet transformation and (2) singular value decomposition (SVD). The wavelet transformation characterizes time-varying features of oscillatory motions, and SVD reduces the degrees of freedom of the movement. We apply the method to molecular dynamics simulations of various proteins, such as adenylate kinase from Escherichia coli (AKE) and Thermomyces lanuginosa lipase (TLL). Moreover, we introduce indexes to characterize the collective motion of proteins. These indexes provide information about the nonstationary deformation of protein structures. We discuss future prospects of our study involving "intrinsically disordered proteins".
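    The two-step scheme can be sketched with a hand-rolled Morlet filter bank followed by SVD; the "collective coordinate" below is a synthetic drifting oscillation, not MD output, and the scales and wavelet parameters are illustrative assumptions:

```python
import numpy as np

def morlet_transform(x, scales, w0=6.0):
    """Continuous wavelet transform of a 1-D signal with complex Morlet
    wavelets, via direct convolution (a minimal stand-in for a CWT library)."""
    rows = []
    for s in scales:
        m = int(10 * s)
        t = np.arange(-m, m + 1) / s
        wavelet = np.exp(1j * w0 * t) * np.exp(-t ** 2 / 2) / np.sqrt(s)
        rows.append(np.convolve(x, wavelet, mode="same"))
    return np.array(rows)  # (n_scales, n_samples)

rng = np.random.default_rng(5)
n = 512
t = np.arange(n)
# Hypothetical slow collective coordinate: an oscillation whose frequency
# drifts over time, plus fast noise.
x = np.sin(2 * np.pi * t * (0.02 + 0.00005 * t)) + 0.3 * rng.standard_normal(n)

W = np.abs(morlet_transform(x, scales=np.arange(4, 26, 2)))
# Step 2: SVD of the scale-time matrix reduces the motion to a few
# dominant time-frequency modes.
U, s, Vt = np.linalg.svd(W, full_matrices=False)
print(s[0] > s[1])  # the leading mode dominates
```

The leading singular vectors play the role of the paper's reduced collective modes; their time courses (rows of `Vt`) track the nonstationary behavior.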

  18. Fisher-Shannon analysis of ionization processes and isoelectronic series

    SciTech Connect

    Sen, K. D.; Antolin, J.; Angulo, J. C.

    2007-09-15

    The Fisher-Shannon plane which embodies the Fisher information measure in conjunction with the Shannon entropy is tested in its ability to quantify and compare the informational behavior of the process of atomic ionization. We report the variation of such an information measure and its constituents for a comprehensive set of neutral atoms, and their isoelectronic series including the mononegative ions, using the numerical data generated on 320 atomic systems in position, momentum, and product spaces at the Hartree-Fock level. It is found that the Fisher-Shannon plane clearly reveals shell-filling patterns across the periodic table. Compared to position space, a significantly higher resolution is exhibited in momentum space. Characteristic features in the Fisher-Shannon plane accompanying the ionization process are identified, and the physical reasons for the observed patterns are described.

  19. Flood analysis using generalized logistic models in partial duration series

    NASA Astrophysics Data System (ADS)

    Bhunya, P. K.; Singh, R. D.; Berndtsson, R.; Panda, S. N.

    2012-02-01

    As a generalization of the commonly assumed Poisson distribution (PD) used to estimate the annual number of peaks over threshold in partial duration series (PDS) models, the negative binomial (NB) distribution is proposed in this study. Instead of the generalized Pareto distribution (GPD) and exponential distribution (ED) models popularly applied to predict the probability of exceedances of peaks over threshold, the performance of generalized logistic distribution (GLD) models is analyzed. Two different models for analyzing extreme hydrologic events are compared, based on PDS and annual maximum series (AMS), respectively. The performance of the two models in terms of uncertainty of the T-year event estimator [q(T)] is evaluated for estimation with the method of moments (MOM), maximum likelihood (ML), and probability weighted moments (PWM). The annual maximum distribution corresponding to a PDS model with a Poisson-distributed count of peaks above threshold and GLD for flood exceedances was found to be an extreme value type I (EV1) distribution. The comparison between PDS and AMS is made using the ratio of variances of the T-year event estimates, which is derived analytically after checking the reliability of the expressions with Monte Carlo simulations. The results reveal that the AMS/NB-GLD and PDS/GLD models using the PWM estimation method give the least variance of flood estimates, with the PDS model giving marginally better results. Overall, the Poisson distribution performs better where the difference between the mean (μ) and variance of counts of threshold exceedances is small; otherwise, the NB distribution is efficient when used in combination with the generalized logistic distribution in the PDS model, and this is more prominent for μ < 1.4. Hence, when the PDS counts have a mean below this value, the AMS/NB-GLD and PDS/GLD models should be better for q(T) estimation than PDS/ED.
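    For orientation, the classical PDS/ED baseline that the GLD variants are compared against has a closed-form T-year quantile: with Poisson peak counts (rate λ per year, threshold u) and exponential exceedances (scale β), setting the annual exceedance rate λ·exp(−(q−u)/β) equal to 1/T gives q(T) = u + β·ln(λT). The parameter values below are purely illustrative:

```python
import numpy as np

def qT_pds_exp(u, lam, beta, T):
    """T-year event under a PDS model with a Poisson count of peaks over
    threshold u (rate lam per year) and exponentially distributed
    exceedances (scale beta): q(T) = u + beta * ln(lam * T)."""
    return u + beta * np.log(lam * np.asarray(T, dtype=float))

# Hypothetical gauge: threshold 100 m^3/s, 2 peaks/year, scale 50 m^3/s.
print(qT_pds_exp(u=100.0, lam=2.0, beta=50.0, T=[10, 100]))
```

The GLD-based models in the paper replace the exponential tail (and optionally the Poisson count) in this construction; the quantile then has no such simple closed form and is evaluated via MOM, ML, or PWM fits.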

  20. Taxation in Public Education. Analysis and Bibliography Series, No. 12.

    ERIC Educational Resources Information Center

    Ross, Larry L.

    Intended for both researchers and practitioners, this analysis and bibliography cites approximately 100 publications on educational taxation, including general texts and reports, statistical reports, taxation guidelines, and alternative proposals for taxation. Topics covered in the analysis section include State and Federal aid, urban and suburban…

  1. Bicomponent Trend Maps: A Multivariate Approach to Visualizing Geographic Time Series

    PubMed Central

    Schroeder, Jonathan P.

    2012-01-01

    The most straightforward approaches to temporal mapping cannot effectively illustrate all potentially significant aspects of spatio-temporal patterns across many regions and times. This paper introduces an alternative approach, bicomponent trend mapping, which employs a combination of principal component analysis and bivariate choropleth mapping to illustrate two distinct dimensions of long-term trend variations. The approach also employs a bicomponent trend matrix, a graphic that illustrates an array of typical trend types corresponding to different combinations of scores on two principal components. This matrix is useful not only as a legend for bicomponent trend maps but also as a general means of visualizing principal components. To demonstrate and assess the new approach, the paper focuses on the task of illustrating population trends from 1950 to 2000 in census tracts throughout major U.S. urban cores. In a single static display, bicomponent trend mapping is not able to depict as wide a variety of trend properties as some other multivariate mapping approaches, but it can make relationships among trend classes easier to interpret, and it offers some unique flexibility in classification that could be particularly useful in an interactive data exploration environment. PMID:23504193
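    The two building blocks, two leading principal components plus a bivariate (here simply sign-based) classification of their scores, can be sketched as follows; the tract trajectories are synthetic stand-ins for census data, and the 2×2 classing is a simplification of the paper's bivariate choropleth classes:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
# Hypothetical tract-level population series at 6 census years (1950-2000):
# a mix of decline-then-rebound and steady-growth trajectories.
years = 6
decline = np.linspace(1.0, 0.6, years) + np.linspace(0, 0.2, years) ** 2
growth = np.linspace(0.6, 1.2, years)
tracts = np.vstack(
    [decline + 0.05 * rng.standard_normal(years) for _ in range(40)] +
    [growth + 0.05 * rng.standard_normal(years) for _ in range(40)])

# Two leading components summarize the long-term trend variation...
scores = PCA(n_components=2).fit_transform(tracts)
# ...and a bivariate classification of the two scores yields the classes
# that a bicomponent trend matrix / bivariate choropleth would display.
classes = (scores[:, 0] > 0).astype(int) * 2 + (scores[:, 1] > 0).astype(int)
print(sorted(set(classes.tolist())))
```

Each class corresponds to one cell of the bicomponent trend matrix, i.e. one typical trend shape.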

  2. Analysis of Zenith Tropospheric Delay above Europe based on long time series derived from the EPN data

    NASA Astrophysics Data System (ADS)

    Baldysz, Zofia; Nykiel, Grzegorz; Figurski, Mariusz; Szafranek, Karolina; Kroszczynski, Krzysztof; Araszkiewicz, Andrzej

    2015-04-01

    In recent years, GNSS has begun to play an increasingly important role in research related to climate monitoring. Based on the GPS system, which has the longest operational record among the GNSS constellations, and a common computational strategy applied to all observations, long and homogeneous ZTD (Zenith Tropospheric Delay) time series were derived. This paper presents results of an analysis of 16-year ZTD time series obtained from the EPN (EUREF Permanent Network) reprocessing performed by the Military University of Technology. To maintain the uniformity of the data, the analyzed period (1998-2013) is exactly the same for all stations: observations carried out before 1998 were removed from the time series, and observations processed using a different strategy were recalculated according to the MUT LAC approach. For all 16-year time series (59 stations), Lomb-Scargle periodograms were created to obtain information about the oscillations in the ZTD time series. Because strong annual oscillations disturb oscillations with smaller amplitudes and thus hinder their investigation, Lomb-Scargle periodograms were also created for time series with the annual oscillation removed, in order to verify the presence of semi-annual, ter-annual, and quarter-annual oscillations. The linear trend and seasonal components were estimated using LSE (Least Squares Estimation), and the Mann-Kendall trend test was used to confirm the presence of the linear trend estimated by the LSE method. To verify the effect of the length of the time series on the estimated linear trend, two different lengths of ZTD time series were compared. For this comparative analysis, 30 stations that have been operating since 1996 were selected, and two periods were analyzed for each: a shortened 16-year span (1998-2013) and the full 18-year span (1996-2013). For some stations, the additional two years of observations had a significant impact on the estimated linear trend.
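    The Lomb-Scargle step suits ZTD records because station series have gaps, i.e. uneven sampling. A sketch with SciPy on a synthetic 16-year ZTD-like record (the amplitudes, noise level, and frequency grid are illustrative assumptions, not EPN values):

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(7)
# Hypothetical 16-year ZTD-like record (meters), sampled on a random
# subset of days to mimic data gaps.
t = np.sort(rng.choice(np.arange(16 * 365), size=3000,
                       replace=False)).astype(float)
annual = 2 * np.pi / 365.25
x = (2.40 + 0.10 * np.sin(annual * t) + 0.03 * np.sin(2 * annual * t)
     + 0.02 * rng.standard_normal(t.size))

# Angular frequencies spanning periods from ~2 years down to ~30 days.
freqs = 2 * np.pi / np.linspace(730, 30, 2000)
power = lombscargle(t, x - x.mean(), freqs)

peak_period = 2 * np.pi / freqs[np.argmax(power)]
print(round(peak_period))  # ≈ 365: the annual oscillation dominates
```

Removing the fitted annual term and recomputing the periodogram, as in the abstract, would then expose the weaker semi-annual peak.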

  3. Multiscale InSAR Time Series (MInTS) analysis of surface deformation

    NASA Astrophysics Data System (ADS)

    Hetland, E. A.; Musé, P.; Simons, M.; Lin, Y. N.; Agram, P. S.; Dicaprio, C. J.

    2012-02-01

    We present a new approach to extracting spatially and temporally continuous ground deformation fields from interferometric synthetic aperture radar (InSAR) data. We focus on unwrapped interferograms from a single viewing geometry, estimating ground deformation along the line-of-sight. Our approach is based on a wavelet decomposition in space and a general parametrization in time. We refer to this approach as MInTS (Multiscale InSAR Time Series). The wavelet decomposition efficiently deals with commonly seen spatial covariances in repeat-pass InSAR measurements, since the coefficients of the wavelets are essentially spatially uncorrelated. Our time-dependent parametrization is capable of capturing both recognized and unrecognized processes, and is not arbitrarily tied to the times of the SAR acquisitions. We estimate deformation in the wavelet-domain, using a cross-validated, regularized least squares inversion. We include a model-resolution-based regularization, in order to more heavily damp the model during periods of sparse SAR acquisitions, compared to during times of dense acquisitions. To illustrate the application of MInTS, we consider a catalog of 92 ERS and Envisat interferograms, spanning 16 years, in the Long Valley caldera, CA, region. MInTS analysis captures the ground deformation with high spatial density over the Long Valley region.

  4. Multiscale InSAR Time Series (MInTS) analysis of surface deformation

    NASA Astrophysics Data System (ADS)

    Hetland, E. A.; Muse, P.; Simons, M.; Lin, Y. N.; Agram, P. S.; DiCaprio, C. J.

    2011-12-01

    We present a new approach to extracting spatially and temporally continuous ground deformation fields from interferometric synthetic aperture radar (InSAR) data. We focus on unwrapped interferograms from a single viewing geometry, estimating ground deformation along the line-of-sight. Our approach is based on a wavelet decomposition in space and a general parametrization in time. We refer to this approach as MInTS (Multiscale InSAR Time Series). The wavelet decomposition efficiently deals with commonly seen spatial covariances in repeat-pass InSAR measurements, such that coefficients of the wavelets are essentially spatially uncorrelated. Our time-dependent parametrization is capable of capturing both recognized and unrecognized processes, and is not arbitrarily tied to the times of the SAR acquisitions. We estimate deformation in the wavelet-domain, using a cross-validated, regularized least-squares inversion. We include a model-resolution-based regularization, in order to more heavily damp the model during periods of sparse SAR acquisitions, compared to during times of dense acquisitions. To illustrate the application of MInTS, we consider a catalog of 92 ERS and Envisat interferograms, spanning 16 years, in the Long Valley caldera, CA, region. MInTS analysis captures the ground deformation with high spatial density over the Long Valley region.

  5. A Predictive Analysis Approach to Adaptive Testing.

    ERIC Educational Resources Information Center

    Kirisci, Levent; Hsu, Tse-Chi

The predictive analysis approach to adaptive testing originated in the idea of statistical predictive analysis suggested by J. Aitchison and I.R. Dunsmore (1975). The adaptive testing model proposed is based on a parameter-free predictive distribution. Aitchison and Dunsmore define statistical prediction analysis as the use of data obtained from an…

  6. Health Promotion for Adult Literacy Students: An Empowering Approach. Introduction to the Series.

    ERIC Educational Resources Information Center

    Hudson River Center for Program Development, Glenmont, NY.

    This teaching guide introduces and gives advice on using a series of health promotion materials in adult basic education classes. The series was developed with input from adult learners. This guide describes the series and offers advice on staff preparation and suggestions for lesson preparation. The guide is organized in six sections that cover…

  7. Time series analysis of personal exposure to ambient air pollution and mortality using an exposure simulator

    PubMed Central

    Chang, Howard H.; Fuentes, Montserrat; Frey, H. Christopher

    2013-01-01

This paper describes a modeling framework for estimating the acute effects of personal exposure to ambient air pollution in a time series design. First, a spatial hierarchical model is used to relate Census tract-level daily ambient concentrations and simulated exposures for a subset of the study period. The complete exposure time series is then imputed for risk estimation. Modeling exposure via a statistical model considerably reduces the computational burden associated with simulating personal exposures. This allows us to consider personal exposures at a finer spatial resolution, over a longer study period, to improve exposure assessment. The proposed approach is applied to an analysis of fine particulate matter of <2.5 μm in aerodynamic diameter (PM2.5) and daily mortality in the New York City metropolitan area during the period 2001–2005. Personal PM2.5 exposures were simulated from the Stochastic Human Exposure and Dose Simulation. Accounting for exposure uncertainty, the authors estimated a 2.32% (95% posterior interval: 0.68, 3.94) increase in mortality per 10 μg/m3 increase in personal exposure to PM2.5 from outdoor sources on the previous day. The corresponding estimate per 10 μg/m3 increase in PM2.5 ambient concentration was 1.13% (95% confidence interval: 0.27, 2.00). The risks of mortality associated with PM2.5 were also higher during the summer months. PMID:22669499
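The headline numbers come from a log-linear (Poisson) regression of daily counts on exposure; the percent increase per 10 μg/m3 is 100·(exp(10β)−1). A minimal sketch with simulated data follows; the exposure distribution, baseline rate, and coefficient are hypothetical, and the paper's actual model is Bayesian and hierarchical rather than this plain fit.

```python
import numpy as np

rng = np.random.default_rng(1)

# hypothetical daily PM2.5 (ug/m3) and mortality counts, ~5 years of days
n = 1826
pm = rng.gamma(4.0, 3.0, n)                    # exposure series
b0, b1 = np.log(50.0), 0.0023                  # true log-rate: ~2.3% per 10 ug/m3
y = rng.poisson(np.exp(b0 + b1 * pm))

# Poisson log-linear fit by iteratively reweighted least squares
X = np.column_stack([np.ones(n), pm])
beta = np.array([np.log(y.mean()), 0.0])
for _ in range(25):
    mu = np.exp(X @ beta)                      # current fitted means
    z = X @ beta + (y - mu) / mu               # working response
    W = mu                                     # working weights
    beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))

pct_per_10 = 100.0 * (np.exp(10.0 * beta[1]) - 1.0)
```

A real analysis would also adjust for temperature, seasonality, and day-of-week confounders, which this sketch omits.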

  8. Constraint-based analysis of gene interactions using restricted boolean networks and time-series data

    PubMed Central

    2011-01-01

Background A popular model for gene regulatory networks is the Boolean network model. In this paper, we propose an algorithm to perform an analysis of gene regulatory interactions using the Boolean network model and time-series data. The Boolean network is restricted in the sense that only a subset of all possible Boolean functions is considered. We explore some mathematical properties of the restricted Boolean networks in order to avoid the full search approach. The problem is modeled as a Constraint Satisfaction Problem (CSP) and CSP techniques are used to solve it. Results We applied the proposed algorithm to two data sets. First, we used an artificial dataset obtained from a model for the budding yeast cell cycle. The second data set is derived from experiments performed using HeLa cells. The results show that some interactions can be fully or, at least, partially determined under the Boolean model considered. Conclusions The proposed algorithm can be used as a first step for detection of gene/protein interactions. It is able to infer gene relationships from time-series data of gene expression, and this inference process can be aided by available a priori knowledge. PMID:21554763
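The constraint-satisfaction idea can be illustrated by brute force on a toy network: enumerate a restricted function class and keep only the rules consistent with every observed transition. The three-gene network and the two-literal AND/OR class below are hypothetical simplifications of the paper's restricted Boolean model, which additionally uses CSP propagation to prune the enumeration.

```python
import itertools
import numpy as np

def apply_rule(rule, x):
    """Evaluate one restricted Boolean rule on state vector x."""
    (i, si), (j, sj), op = rule
    va = x[i] if si == 1 else 1 - x[i]
    vb = x[j] if sj == 1 else 1 - x[j]
    return va & vb if op == 'and' else va | vb

def candidates(n_genes):
    """Restricted class: AND/OR of two (possibly negated) regulators --
    a stand-in for the paper's restricted Boolean functions."""
    lits = [(i, s) for i in range(n_genes) for s in (1, -1)]
    for a, b in itertools.combinations(lits, 2):
        for op in ('and', 'or'):
            yield (a, b, op)

# simulate a 3-gene trajectory from known (hypothetical) rules
true_rules = {0: ((1, -1), (2, -1), 'and'),   # g0' = (not g1) and (not g2)
              1: ((0, 1), (2, -1), 'and'),    # g1' = g0 and (not g2)
              2: ((1, 1), (0, -1), 'or')}     # g2' = g1 or (not g0)
x = np.array([1, 0, 0])
traj = [x]
for _ in range(7):
    x = np.array([apply_rule(true_rules[g], x) for g in range(3)])
    traj.append(x)
states = np.array(traj)

# constraint satisfaction by filtering: keep every rule consistent with
# all observed transitions (full enumeration is feasible for this toy class)
consistent = {g: [r for r in candidates(3)
                  if all(apply_rule(r, states[k]) == states[k + 1][g]
                         for k in range(len(states) - 1))]
              for g in range(3)}
```

As in the paper, a short time series typically leaves several rules per gene consistent with the data, so interactions are only partially determined.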

  9. Data Reorganization for Optimal Time Series Data Access, Analysis, and Visualization

    NASA Astrophysics Data System (ADS)

    Rui, H.; Teng, W. L.; Strub, R.; Vollmer, B.

    2012-12-01

The way data are archived is often not optimal for their access by many user communities (e.g., hydrological), particularly if the data volumes and/or number of data files are large. The number of data records of a non-static data set generally increases with time. Therefore, most data sets are commonly archived by time steps, one step per file, often containing multiple variables. However, many research and application efforts need time series data for a given geographical location or area, i.e., a data organization that is orthogonal to the way the data are archived. The retrieval of a time series of the entire temporal coverage of a data set for a single variable at a single data point, in an optimal way, is an important and longstanding challenge, especially for large science data sets (i.e., with volumes greater than 100 GB). Two examples of such large data sets are the North American Land Data Assimilation System (NLDAS) and Global Land Data Assimilation System (GLDAS), archived at the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC; Hydrology Data Holdings Portal, http://disc.sci.gsfc.nasa.gov/hydrology/data-holdings). To date, the NLDAS data set, hourly at 0.125° × 0.125° from Jan. 1, 1979 to present, has a total volume greater than 3 TB (compressed). The GLDAS data set, 3-hourly and monthly at 0.25° × 0.25° and 1.0° × 1.0° from Jan. 1948 to present, has a total volume greater than 1 TB (compressed). Both data sets are accessible, in the archived time step format, via several convenient methods, including Mirador search and download (http://mirador.gsfc.nasa.gov/), GrADS Data Server (GDS; http://hydro1.sci.gsfc.nasa.gov/dods/), direct FTP (ftp://hydro1.sci.gsfc.nasa.gov/data/s4pa/), and Giovanni Online Visualization and Analysis (http://disc.sci.gsfc.nasa.gov/giovanni). However, users who need long time series currently have no efficient way to retrieve them. Continuing a longstanding tradition of facilitating data access, analysis, and

  10. Variance Analysis of Unevenly Spaced Time Series Data

    NASA Technical Reports Server (NTRS)

    Hackman, Christine; Parker, Thomas E.

    1996-01-01

We have investigated the effect of uneven data spacing on the computation of Δχ(γ). Evenly spaced simulated data sets were generated for noise processes ranging from white phase modulation (PM) to random walk frequency modulation (FM). Δχ(γ) was then calculated for each noise type. Data were subsequently removed from each simulated data set using typical two-way satellite time and frequency transfer (TWSTFT) data patterns to create two unevenly spaced sets with average intervals of 2.8 and 3.6 days. Δχ(γ) was then calculated for each sparse data set using two different approaches. First, the missing data points were replaced by linear interpolation and Δχ(γ) was calculated from this now-full data set. The second approach ignored the fact that the data were unevenly spaced and calculated Δχ(γ) as if the data were equally spaced with average spacing of 2.8 or 3.6 days. Both approaches have advantages and disadvantages, and techniques are presented for correcting errors caused by uneven data spacing in typical TWSTFT data sets.
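The first approach can be sketched with NumPy, using the overlapping Allan variance of phase data as a stand-in for the stability statistic; the simulated noise type, gap pattern, and averaging time below are hypothetical rather than the TWSTFT patterns used in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# evenly spaced "truth": random-walk phase (white-FM clock), tau0 = 1 day
n = 512
x_full = np.cumsum(rng.standard_normal(n))          # phase data x(t)
t_full = np.arange(n, dtype=float)

# mimic gappy transfer data: keep an uneven ~1/3 subset of epochs
keep = np.sort(rng.choice(n, size=n // 3, replace=False))
t_obs, x_obs = t_full[keep], x_full[keep]

def avar(x, tau0, m):
    """Overlapping Allan variance of phase data at averaging time m*tau0."""
    d2 = x[2 * m:] - 2 * x[m:-m] + x[:-2 * m]       # second differences
    return np.mean(d2 ** 2) / (2.0 * (m * tau0) ** 2)

# approach 1 from the abstract: fill gaps by linear interpolation,
# then use the standard evenly spaced estimator
x_interp = np.interp(t_full, t_obs, x_obs)
av_interp = avar(x_interp, 1.0, 8)

# reference value from the complete series
av_full = avar(x_full, 1.0, 8)
```

Interpolation smooths the high-frequency noise, which is exactly the kind of bias the paper's correction techniques are designed to quantify.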

  11. Pitfalls in Fractal Time Series Analysis: fMRI BOLD as an Exemplary Case

    PubMed Central

    Eke, Andras; Herman, Peter; Sanganahalli, Basavaraju G.; Hyder, Fahmeed; Mukli, Peter; Nagy, Zoltan

    2012-01-01

This article builds on our previous work demonstrating the importance of adhering to a carefully selected set of criteria when choosing, from the methods available, one whose performance is adequate for real temporal signals, such as fMRI BOLD, in order to evaluate one important facet of their behavior: fractality. Earlier, we reviewed a range of monofractal tools and evaluated their performance. Given advances in the fractal field, in this article we also discuss the most widely used implementations of multifractal analyses. Our recommended flowchart for the fractal characterization of spontaneous, low-frequency fluctuations in fMRI BOLD serves as the framework for this article, to ensure that it provides hands-on experience for the reader in handling the perplexing issues of fractal analysis. This signal modality and its fractal analysis were chosen because of their high impact on today's neuroscience, as they have powerfully emerged as a new way of interpreting the complex functioning of the brain (see "intrinsic activity"). The reader is first presented with the basic concepts of mono- and multifractal time series analyses, followed by some of the most relevant implementations and characterization by numerical approaches. The dichotomy of the fractional Gaussian noise and fractional Brownian motion signal classes, and its impact on fractal time series analyses, is thoroughly discussed as the central theme of our application strategy. Sources of pitfalls, and ways to avoid them, are identified, followed by a demonstration on fractal studies of fMRI BOLD taken from the literature and from our own work, in an attempt to consolidate best practice in the fractal analysis of empirical fMRI BOLD signals mapped throughout the brain as an exemplary case of potentially wide interest. PMID:23227008
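The fGn/fBm dichotomy is usually probed with a scaling estimator; a minimal detrended fluctuation analysis (DFA) sketch, one of the monofractal tools of the kind reviewed, is shown below. The scales and test signals are arbitrary choices: white noise should give α ≈ 0.5 (fGn-like) and its cumulative sum α ≈ 1.5 (fBm-like), which is the class decision the article treats as the pivotal first step.

```python
import numpy as np

def dfa_alpha(x, scales):
    """Detrended fluctuation analysis scaling exponent (monofractal sketch)."""
    y = np.cumsum(x - np.mean(x))                # profile (integrated series)
    F = []
    for s in scales:
        nseg = len(y) // s
        f2 = []
        for k in range(nseg):
            seg = y[k * s:(k + 1) * s]
            tt = np.arange(s)
            c = np.polyfit(tt, seg, 1)           # local linear detrend
            f2.append(np.mean((seg - np.polyval(c, tt)) ** 2))
        F.append(np.sqrt(np.mean(f2)))
    # alpha = slope of log F(s) versus log s
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

rng = np.random.default_rng(3)
scales = [16, 32, 64, 128, 256]
white = rng.standard_normal(4096)                # fGn with H = 0.5
walk = np.cumsum(rng.standard_normal(4096))      # fBm-like with H = 0.5
```

Misclassifying an fBm-type signal as fGn (or vice versa) before estimating H is one of the central pitfalls the article warns against.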

  12. Analysis of Binary Series to Evaluate Astronomical Forcing of a Middle Permian Chert Sequence in South China

    NASA Astrophysics Data System (ADS)

    Hinnov, L. A.; Yao, X.; Zhou, Y.

    2014-12-01

We describe a Middle Permian radiolarian chert sequence in South China (Chaohu area), with the sequence of chert and mudstone layers formulated as a binary series. Two interpolation approaches were tested: linear interpolation, resulting in a "triangle" series, and staircase interpolation, resulting in a "boxcar" series. Spectral analysis of the triangle series reveals decimeter-scale chert-mudstone cycles which represent theoretical Middle Permian 32 kyr obliquity cycling. Tuning these cycles to a 32 kyr periodicity reveals that other cm-scale cycles lie in the precession index band and have a strong ~400 kyr amplitude modulation. Additional tuning tests further support a hypothesis of astronomical forcing of the chert sequence. Analysis of the boxcar series reveals additional "eccentricity" terms transmitted by the boxcar representation of the modulating precession-scale cycles. An astronomical time scale reconstructed from these results assumes a Roadian/Wordian boundary age of 268.8 Ma for the onset of the first chert layer at the base of the sequence and ends at 264.1 Ma, for a total duration of 4.7 Myr. We propose that monsoon-controlled upwelling contributed to the development of the chert-mudstone cycles. A seasonal monsoon controlled by astronomical forcing influenced the intensity of upwelling, modulating radiolarian productivity and silica deposition.
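Recovering a cycle wavelength from a boxcar series reduces to finding a periodogram peak. A toy sketch follows; the 1 cm sampling, 0.4 m couplet wavelength, and 60% chert fraction are hypothetical illustration values, not the Chaohu measurements.

```python
import numpy as np

# hypothetical stratigraphic column: chert-mudstone couplets ~0.4 m thick,
# sampled every 1 cm as a binary (boxcar) series: 1 = chert, 0 = mudstone
dz = 0.01                                   # sample spacing (m)
z = np.arange(0.0, 40.0, dz)                # 40 m of section
cycle = 0.4                                 # couplet wavelength (m)
boxcar = ((z % cycle) < 0.6 * cycle).astype(float)   # 60% chert per couplet

# periodogram of the demeaned series
x = boxcar - boxcar.mean()
P = np.abs(np.fft.rfft(x)) ** 2
f = np.fft.rfftfreq(len(x), d=dz)           # cycles per metre
peak = f[1:][np.argmax(P[1:])]              # skip the zero frequency
wavelength = 1.0 / peak                     # should recover ~0.4 m
```

The square-wave shape of a boxcar series also puts power into harmonics of the couplet frequency, which is the mechanism behind the extra "eccentricity" terms the abstract describes.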

  13. InSAR and GPS time series analysis: Crustal deformation in the Yucca Mountain, Nevada region

    NASA Astrophysics Data System (ADS)

    Li, Z.; Hammond, W. C.; Blewitt, G.; Kreemer, C. W.; Plag, H.

    2010-12-01

Several previous studies have successfully demonstrated that long time series (e.g. >5 years) of GPS measurements can be employed to detect tectonic signals with a vertical rate greater than 0.3 mm/yr (e.g. Hill and Blewitt, 2006; Bennett et al. 2009). However, GPS stations are often sparse, with spacing from a few kilometres to a few hundred kilometres. Interferometric SAR (InSAR) can complement GPS by providing high horizontal spatial resolution (e.g. metres to tens of metres) over large regions (e.g. 100 km × 100 km). A major source of error for repeat-pass InSAR is the phase delay in radio signal propagation through the atmosphere. The portion of this attributable to tropospheric water vapour causes errors as large as 10-20 cm in deformation retrievals. InSAR Time Series analysis with Atmospheric Estimation Models (InSAR TS + AEM), developed at the University of Glasgow, is a robust time series analysis approach, which mainly uses interferograms with small geometric baselines to minimise the effects of decorrelation and inaccuracies in topographic data. In addition, InSAR TS + AEM can be used to separate deformation signals from atmospheric water vapour effects in order to map surface deformation as it evolves in time. The principal purposes of this study are to assess: (1) how consistent InSAR-derived deformation time series are with GPS; and (2) how precise InSAR-derived atmospheric path delays can be. The Yucca Mountain, Nevada region is chosen as the study site because of its excellent GPS network and extensive radar archives (>10 years of dense and high-quality GPS stations, and >17 years of ERS and ENVISAT radar acquisitions), and because of its arid environment. The latter results in coherence that is generally high, even for long periods that span the existing C-band radar archives of ERS and ENVISAT. Preliminary results show that our InSAR LOS deformation map agrees with GPS measurements to within 0.35 mm/yr RMS misfit at the stations which is the
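Independent of the atmospheric estimation step, the time series inversion that links a network of small-baseline interferograms to displacements at acquisition dates is a plain least-squares problem. A sketch with hypothetical dates, pair network, deformation history, and noise level:

```python
import numpy as np

rng = np.random.default_rng(4)

# hypothetical SAR acquisition dates (days) and a small-baseline pair network
t = np.array([0, 35, 70, 140, 175, 245, 280, 350], float)
pairs = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (2, 4),
         (4, 5), (5, 6), (4, 6), (6, 7), (5, 7)]

# "true" LOS displacement history (mm): 10 mm/yr secular + seasonal term
d_true = 10.0 * t / 365.25 + 2.0 * np.sin(2 * np.pi * t / 365.25)

# each interferogram observes the displacement difference between its dates
obs = np.array([d_true[j] - d_true[i] for i, j in pairs])
obs += 0.3 * rng.standard_normal(len(obs))       # measurement noise

# design matrix over displacements at dates 1..n-1 (date 0 is the reference)
G = np.zeros((len(pairs), len(t) - 1))
for r, (i, j) in enumerate(pairs):
    if j > 0:
        G[r, j - 1] += 1.0
    if i > 0:
        G[r, i - 1] -= 1.0

d_hat, *_ = np.linalg.lstsq(G, obs, rcond=None)
```

The redundancy of the pair network is what lets the inversion average down noise; the full InSAR TS + AEM method additionally estimates and removes atmospheric delays, which this sketch ignores.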

  14. Physiological time-series analysis: what does regularity quantify?

    NASA Technical Reports Server (NTRS)

    Pincus, S. M.; Goldberger, A. L.

    1994-01-01

    Approximate entropy (ApEn) is a recently developed statistic quantifying regularity and complexity that appears to have potential application to a wide variety of physiological and clinical time-series data. The focus here is to provide a better understanding of ApEn to facilitate its proper utilization, application, and interpretation. After giving the formal mathematical description of ApEn, we provide a multistep description of the algorithm as applied to two contrasting clinical heart rate data sets. We discuss algorithm implementation and interpretation and introduce a general mathematical hypothesis of the dynamics of a wide class of diseases, indicating the utility of ApEn to test this hypothesis. We indicate the relationship of ApEn to variability measures, the Fourier spectrum, and algorithms motivated by study of chaotic dynamics. We discuss further mathematical properties of ApEn, including the choice of input parameters, statistical issues, and modeling considerations, and we conclude with a section on caveats to ensure correct ApEn utilization.
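ApEn itself is short to implement: compare the log-frequency of length-m template matches with that of length-(m+1) matches. Below is a direct NumPy transcription of the definition (m = 2 and r = 0.2·SD are the conventional parameter choices discussed in the article); the two test series are illustrative.

```python
import numpy as np

def approx_entropy(x, m=2, r_frac=0.2):
    """Approximate entropy ApEn(m, r, N) following Pincus's definition."""
    x = np.asarray(x, float)
    r = r_frac * np.std(x)

    def phi(mm):
        # all length-mm templates
        emb = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        # Chebyshev distance between every template pair (self-matches included)
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        C = np.mean(d <= r, axis=1)          # fraction of templates within r
        return np.mean(np.log(C))

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(5)
n = 300
regular = np.sin(0.5 * np.arange(n))         # highly regular series: low ApEn
irregular = rng.standard_normal(n)           # white noise: high ApEn
```

As the article stresses, the statistic is a relative regularity measure: its value depends on m, r, and N, so comparisons are only meaningful at fixed parameter settings.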

  15. Presentations to Emergency Departments for COPD: A Time Series Analysis

    PubMed Central

    Youngson, Erik; Rowe, Brian H.

    2016-01-01

Background. Chronic obstructive pulmonary disease (COPD) is a common respiratory condition characterized by progressive dyspnea and acute exacerbations which may result in emergency department (ED) presentations. This study examines monthly rates of presentations to EDs in one Canadian province. Methods. Presentations for COPD made by individuals aged ≥55 years during April 1999 to March 2011 were extracted from provincial databases. Data included age, sex, and health zone of residence (North, Central, South, and urban). Crude rates were calculated. Seasonal autoregressive integrated moving average (SARIMA) time series models were developed. Results. ED presentations for COPD totalled 188,824 and the monthly rate of presentation remained relatively stable (from 197.7 to 232.6 per 100,000). Males and seniors (≥65 years) comprised 52.2% and 73.7% of presentations, respectively. The ARIMA(1,0,0) × (1,0,1)₁₂ model was appropriate for the overall rate of presentations and for each sex and for seniors. Zone-specific models showed relatively stable or decreasing rates; the North zone had an increasing trend. Conclusions. ED presentation rates for COPD have been relatively stable in Alberta during the past decade. However, the increases in northern regions deserve further exploration. The SARIMA models quantified the temporal patterns and can help in planning future health care service needs. PMID:27445514
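The lag-1 and lag-12 dependence that a SARIMA model of this form captures can be sketched by least squares on simulated monthly rates. The coefficients, mean, and noise level below are hypothetical, and a full analysis would fit ARIMA(1,0,0) × (1,0,1)₁₂ by maximum likelihood (e.g. with a statespace SARIMAX routine) rather than this AR-only regression.

```python
import numpy as np

rng = np.random.default_rng(6)

# simulate a monthly rate series with multiplicative seasonal AR structure:
# (1 - phi1*B)(1 - Phi12*B^12)(y - mu) = eps
n, phi1, Phi12, mu = 444, 0.5, 0.3, 215.0     # 37 years of months
y = np.full(n, mu)
eps = 6.0 * rng.standard_normal(n)
for k in range(13, n):
    y[k] = (mu + phi1 * (y[k - 1] - mu) + Phi12 * (y[k - 12] - mu)
            - phi1 * Phi12 * (y[k - 13] - mu) + eps[k])

# least-squares fit of the AR part on lags 1, 12 and 13
Y = y[13:] - mu
X = np.column_stack([y[12:-1] - mu, y[1:-12] - mu, y[:-13] - mu])
coef, *_ = np.linalg.lstsq(X, Y, rcond=None)   # ~ [phi1, Phi12, -phi1*Phi12]
```

The lag-12 term is what encodes the seasonality of respiratory presentations; a moving-average seasonal term, as in the paper's model, would additionally absorb short seasonal shocks.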

  16. Vapor burn analysis for the Coyote series LNG spill experiments

    SciTech Connect

    Rodean, H.C.; Hogan, W.J.; Urtiew, P.A.; Goldwire, H.C. Jr.; McRae, T.G.; Morgan, D.L. Jr.

    1984-04-01

    A major purpose of the Coyote series of field experiments at China Lake, California, in 1981 was to study the burning of vapor clouds from spills of liquefied natural gas (LNG) on water. Extensive arrays of instrumentation were deployed to obtain micrometeorological, gas concentration, and fire-related data. The instrumentation included in situ sensors of various types, high-speed motion picture cameras, and infrared (IR) imagers. Five of the total of ten Coyote spill experiments investigated vapor burns. The first vapor-burn experiment, Coyote 2, was done with a small spill of LNG to assess instrument capability and survivability in vapor cloud fires. The emphasis in this report is on the other four vapor-burn experiments: Coyotes 3, 5, 6, and 7. The data are analyzed to determine fire spread, flame propagation, and heat flux - quantities that are related to the determination of the damage zone for vapor burns. The results of the analyses are given here. 20 references, 57 figures, 7 tables.

  17. Presentations to Emergency Departments for COPD: A Time Series Analysis.

    PubMed

    Rosychuk, Rhonda J; Youngson, Erik; Rowe, Brian H

    2016-01-01

Background. Chronic obstructive pulmonary disease (COPD) is a common respiratory condition characterized by progressive dyspnea and acute exacerbations which may result in emergency department (ED) presentations. This study examines monthly rates of presentations to EDs in one Canadian province. Methods. Presentations for COPD made by individuals aged ≥55 years during April 1999 to March 2011 were extracted from provincial databases. Data included age, sex, and health zone of residence (North, Central, South, and urban). Crude rates were calculated. Seasonal autoregressive integrated moving average (SARIMA) time series models were developed. Results. ED presentations for COPD totalled 188,824 and the monthly rate of presentation remained relatively stable (from 197.7 to 232.6 per 100,000). Males and seniors (≥65 years) comprised 52.2% and 73.7% of presentations, respectively. The ARIMA(1,0,0) × (1,0,1)₁₂ model was appropriate for the overall rate of presentations and for each sex and for seniors. Zone-specific models showed relatively stable or decreasing rates; the North zone had an increasing trend. Conclusions. ED presentation rates for COPD have been relatively stable in Alberta during the past decade. However, the increases in northern regions deserve further exploration. The SARIMA models quantified the temporal patterns and can help in planning future health care service needs. PMID:27445514

  18. Calibrative approaches to protein solubility modeling of a mutant series using physicochemical descriptors

    PubMed Central

    Labute, P.

    2010-01-01

A set of physicochemical properties describing a protein of known structure is employed for a calibrative approach to protein solubility. Common hydrodynamic and electrophoretic properties routinely measured in the bio-analytical laboratory, such as zeta potential, dipole moment, and the second osmotic virial coefficient, are first estimated in silico as a function of pH and solution ionic strength, starting from the protein crystal structure. The utility of these descriptors in understanding the solubility of a series of ribonuclease Sa mutants is investigated. A simple two-parameter model was trained using solubility data of the wild-type protein measured at a restricted number of solution pHs. Solubility estimates of the mutants demonstrate that zeta potential and dipole moment may be used to rationalize solubility trends over a wide pH range. Additionally, a calibrative model based on the protein’s second osmotic virial coefficient, B22, was developed. A modified DLVO-type potential along with a simplified representation of the protein allowed for efficient computation of the second virial coefficient. The standard error of prediction for both models was on the order of 0.3 log S units. These results are very encouraging and demonstrate that these models may be trained with a small number of samples and employed extrapolatively for estimating mutant solubilities. PMID:20842408

  19. The lateral extracavitary approach to the thoracolumbar spine: a case series and systematic review.

    PubMed

    Foreman, Paul M; Naftel, Robert P; Moore, Thomas A; Hadley, Mark N

    2016-04-01

OBJECT Since its introduction in 1976, the lateral extracavitary approach (LECA) has been used to access ventral and ventrolateral pathology affecting the thoracolumbar spine. Reporting of outcomes and complications has been inconsistent. A case series and systematic review are presented to summarize the available data. METHODS A retrospective review of medical records was performed, which identified 65 consecutive patients who underwent LECA for the treatment of thoracolumbar spine and spinal cord pathology. Cases were divided according to the presenting pathology. Neurological outcomes and complications were detailed. In addition, a systematic review of outcomes and complications in patients treated with the LECA as reported in the literature was completed. RESULTS Sixty-five patients underwent the LECA for the treatment of thoracic spine and spinal cord pathology. The most common indication for surgery was thoracic disc herniation (23/65, 35.4%). Neurological outcomes were excellent: 69.2% improved, 29.2% experienced no change, and 1.5% were worse. Two patients (3.1%) experienced a complication. The systematic review revealed comparable neurological outcomes (74.9% improved) but a notably higher complication rate (32.2%). CONCLUSIONS The LECA provides dorsal and unilateral ventrolateral access to and exposure of the thoracolumbar spine and spinal cord while allowing for posterior instrumentation through the same incision. Although excellent neurological results can be expected, the risk of pulmonary complications should be considered. PMID:26682602

  20. Systems Building Techniques. Analysis and Bibliography Series, No. 15.

    ERIC Educational Resources Information Center

    Baas, Alan M.

    This review presents an analysis of the literature concerning the growth of systems building programs in education and reports on the conclusions of numerous architects and educators that the systems-built school may well be the only cost-effective answer available to today's educational facilities needs. The terms "building systems" and "systems…

  1. Educational Attainment: Analysis by Immigrant Generation. IZA Discussion Paper Series.

    ERIC Educational Resources Information Center

    Chiswick, Barry R.; DebBurman, Noyna

    This paper presents a theoretical and empirical analysis of the largely ignored issue of the determinants of the educational attainment of adults by immigrant generation. Using Current Population Survey (CPS) data, differences in educational attainment are analyzed by immigrant generation (first, second, and higher order generations), and among…

  2. Estimating Reliability of Disturbances in Satellite Time Series Data Based on Statistical Analysis

    NASA Astrophysics Data System (ADS)

    Zhou, Z.-G.; Tang, P.; Zhou, M.

    2016-06-01

Normally, the status of land cover is inherently dynamic and changes continuously on a temporal scale. However, disturbances or abnormal changes of land cover, caused by events such as forest fires, floods, deforestation, and plant diseases, occur worldwide at unknown times and locations. Timely detection and characterization of these disturbances is of importance for land cover monitoring. Recently, many time-series-analysis methods have been developed for near real-time or online disturbance detection using satellite image time series. However, most present methods label the detection results only with "Change/No change", while few methods focus on estimating the reliability (or confidence level) of the detected disturbances in image time series. To this end, this paper proposes a statistical analysis method for estimating the reliability of disturbances in newly available remote sensing image time series, through analysis of the full temporal information contained in the time series data. The method consists of three main steps. (1) Segmenting and modelling of historical time series data based on Breaks For Additive Season and Trend (BFAST). (2) Forecasting and detecting disturbances in new time series data. (3) Estimating the reliability of each detected disturbance using statistical analysis based on Confidence Intervals (CI) and Confidence Levels (CL). The method was validated by estimating the reliability of disturbance regions caused by recent severe flooding that occurred near the border of Russia and China. Results demonstrated that the method can estimate the reliability of disturbances detected in satellite images with estimation error less than 5% and overall accuracy up to 90%.
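The three steps can be sketched as: fit a trend-plus-harmonic model to the stable history, forecast the new period, and convert each residual into a two-sided confidence level. The synthetic series, disturbance size, and 0.99 cut-off below are hypothetical, and BFAST itself additionally segments the history at structural breaks, which this sketch omits.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(7)

# hypothetical 16-day vegetation-index series: 8 years history + 1 new year
t = np.arange(23 * 9) / 23.0                       # time in years (23 obs/yr)
x = 0.6 + 0.25 * np.sin(2 * np.pi * t) + 0.03 * rng.standard_normal(len(t))
x[23 * 8 + 10:] -= 0.3                             # disturbance (e.g. flood)

hist, new = slice(0, 23 * 8), slice(23 * 8, None)

# step 1: model the stable history (trend + annual harmonic)
def design(tt):
    return np.column_stack([np.ones_like(tt), tt,
                            np.sin(2 * np.pi * tt), np.cos(2 * np.pi * tt)])

c, *_ = np.linalg.lstsq(design(t[hist]), x[hist], rcond=None)
resid_sd = np.std(x[hist] - design(t[hist]) @ c)

# steps 2-3: forecast the new period, then turn each residual into a
# two-sided confidence level P(|Z| < |z|) and flag "reliable" disturbances
pred = design(t[new]) @ c
z = (x[new] - pred) / resid_sd
conf = np.array([erf(abs(zz) / sqrt(2)) for zz in z])
flag = conf > 0.99
```

Reporting `conf` per detection, rather than a bare change/no-change label, is the paper's main point.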

  3. Models for Planning. Analysis of Literature and Selected Bibliography. Analysis and Bibliography Series, No. 5.

    ERIC Educational Resources Information Center

    ERIC Clearinghouse on Educational Management, Eugene, OR.

This review analyzes current research trends in the application of planning models to broad educational systems. Planning models reviewed include systems approach models, simulation models, operational gaming, linear programming, Markov chain analysis, dynamic programming, and queuing techniques. A 77-item bibliography of recent literature is…

  4. PX series AMTEC cell design, testing and analysis

    SciTech Connect

    Borkowski, C.A.; Sievers, R.K.; Hendricks, T.J.

    1997-12-31

PX (Pluto Express) cell testing and analysis have shown that AMTEC (Alkali Metal Thermal to Electric Conversion) cells can reach the power levels required by proposed RPS (Radioisotope Power Supply) system designs. A major PX cell design challenge was to optimize the power and efficiency of the cell while allowing a broad operational power range. These design optimization issues depend greatly on the placement of the evaporation zone. Before the PX-2 and PX-4 cells were built, the results from PX-1, ATC-2 (artery test cell), and design analysis indicated the need for a thermal bridge between the heat input surface of the cell and the structure supporting the evaporation zone. Test and analytic results are presented illustrating the magnitude of the power transfer to the evaporation zone and the effect of this power transfer on the performance of the cell. Comparisons are also made between the cell test data and analytic results of cell performance to validate the analytic models.

  5. Mapping mountain pine beetle mortality through growth trend analysis of time-series landsat data

    USGS Publications Warehouse

    Liang, Lu; Chen, Yanlei; Hawbaker, Todd J.; Zhu, Zhi-Liang; Gong, Peng

    2014-01-01

Disturbances are key processes in the carbon cycle of forests and other ecosystems. In recent decades, mountain pine beetle (MPB; Dendroctonus ponderosae) outbreaks have become more frequent and extensive in western North America. Remote sensing has the ability to fill the data gaps of long-term infestation monitoring, but the elimination of observational noise and the quantitative attribution of changes are two main challenges in its effective application. Here, we present a forest growth trend analysis method that integrates Landsat temporal trajectories and decision tree techniques to derive annual forest disturbance maps over an 11-year period. The temporal trajectory component successfully captures the disturbance events as represented by spectral segments, whereas decision tree modeling efficiently recognizes and attributes events based upon the characteristics of the segments. Validated against a point set sampled across a gradient of MPB mortality, the method achieved 86.74% to 94.00% overall accuracy, with small variability in accuracy among years. In contrast, the overall accuracies of single-date classifications ranged from 37.20% to 75.20% and became comparable with our approach only when the training sample size was increased at least four-fold. This demonstrates that a key advantage of this time series workflow is its small training sample size requirement. The easily understandable, interpretable, and modifiable characteristics of our approach suggest that it could be applicable to other ecoregions.

  6. Association mechanism between a series of rodenticide and humic acid: a frontal analysis to support the biological data.

    PubMed

    André, Claire; Guyon, Catherine; Thomassin, Mireille; Barbier, Alexandre; Richert, Lysiane; Guillaume, Yves-Claude

    2005-06-01

The binding constants (K) of a series of anticoagulant rodenticides with the main soil organic component, humic acid (HA), were determined using a frontal analysis approach. The order of the binding constants was identical to that obtained in a previous paper [J. Chromatogr. B 813 (2004) 295], i.e. bromadiolone>brodifacoum>difenacoum>chlorophacinone>diphacinone, confirming the power of this frontal analysis approach for the determination of binding constants. Moreover, and for the first time, the concentration of rodenticide unbound to HAs could be determined. Thanks to this approach, we could clearly demonstrate that HA protected the human hepatoma cell line HepG2 against the cytotoxicity of all the rodenticides tested, and that the toxicity of the rodenticides was directly linked to the free rodenticide fraction in the medium (i.e. rodenticide unbound to HA). PMID:15866487
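The link between a binding constant and the free (unbound) fraction follows from 1:1 association, where bound/free = K·[HA]. A sketch with hypothetical K values and HA site concentration (the abstract reports only the ordering of the constants, not their magnitudes):

```python
# hypothetical binding constants (M^-1), ordered as in the abstract
K = {'bromadiolone': 8e4, 'brodifacoum': 5e4, 'difenacoum': 3e4,
     'chlorophacinone': 1e4, 'diphacinone': 5e3}

HA = 2e-5   # hypothetical humic-acid binding-site concentration (M)

# 1:1 association: bound/free = K*[HA], so the unbound fraction is
free_fraction = {name: 1.0 / (1.0 + k * HA) for name, k in K.items()}
```

On this model a higher-K compound leaves a smaller free fraction, which is consistent with the paper's finding that toxicity tracks the unbound rodenticide concentration.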

  7. An approach to constructing a homogeneous time series of soil mositure using SMOS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Overlapping soil moisture time series derived from two satellite microwave radiometers (SMOS, Soil Moisture and Ocean Salinity; AMSR-E, Advanced Microwave Scanning Radiometer - Earth Observing System) are used to generate a soil moisture time series from 2003 to 2010. Two statistical methodologies f...

  8. New Models for Forecasting Enrollments: Fuzzy Time Series and Neural Network Approaches.

    ERIC Educational Resources Information Center

    Song, Qiang; Chissom, Brad S.

    Since university enrollment forecasting is very important, many different methods and models have been proposed by researchers. Two new methods for enrollment forecasting are introduced: (1) the fuzzy time series model; and (2) the artificial neural networks model. Fuzzy time series has been proposed to deal with forecasting problems within a…

  9. Characterizing rainfall of hot arid region by using time-series modeling and sustainability approaches: a case study from Gujarat, India

    NASA Astrophysics Data System (ADS)

    Machiwal, Deepesh; Kumar, Sanjay; Dayal, Devi

    2016-05-01

This study aimed at characterizing rainfall dynamics in a hot arid region of Gujarat, India by employing time-series modeling techniques and a sustainability approach. Five characteristics, i.e., normality, stationarity, homogeneity, presence/absence of trend, and persistence, of the 34-year (1980-2013) annual rainfall time series of ten stations were identified/detected by applying multiple parametric and non-parametric statistical tests. Furthermore, the study proposes a sustainability concept for evaluating rainfall time series and demonstrates the concept, for the first time, by identifying the most sustainable rainfall series following the reliability (Ry), resilience (Re), and vulnerability (Vy) approach. Box-whisker plots, normal probability plots, and histograms indicated that the annual rainfall of Mandvi and Dayapar stations is relatively more positively skewed and non-normal compared with that of other stations, which is due to the presence of a severe outlier and an extreme value. Results of the Shapiro-Wilk test and the Lilliefors test revealed that the annual rainfall series of all stations deviated significantly from a normal distribution. Two parametric t tests and the non-parametric Mann-Whitney test indicated significant non-stationarity in the annual rainfall of Rapar station, where the rainfall was also found to be non-homogeneous based on the results of four parametric homogeneity tests. Four trend tests indicated significantly increasing rainfall trends at Rapar and Gandhidham stations. The autocorrelation analysis suggested the presence of statistically significant persistence in the rainfall series of Bhachau (3-year time lag), Mundra (1- and 9-year time lag), Nakhatrana (9-year time lag), and Rapar (3- and 4-year time lag). Results of the sustainability approach indicated that the annual rainfall of Mundra and Naliya stations (Ry = 0.50 and 0.44; Re = 0.47 and 0.47; Vy = 0.49 and 0.46, respectively) are the most sustainable and dependable
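The reliability-resilience-vulnerability triplet has a standard operational definition over a satisfactory/unsatisfactory classification of each year. A sketch with a hypothetical demand threshold and a synthetic 34-year record (the paper's exact formulation of the three indices may differ in scaling):

```python
import numpy as np

def rrv(rain, threshold):
    """Reliability, resilience, vulnerability of an annual rainfall series
    (satisfactory = rainfall at or above a demand threshold)."""
    ok = rain >= threshold
    reliability = ok.mean()                  # fraction of satisfactory years
    fail = ~ok
    # resilience: P(next year satisfactory | current year unsatisfactory)
    n_fail = fail[:-1].sum()
    resilience = (fail[:-1] & ok[1:]).sum() / n_fail if n_fail else 1.0
    # vulnerability: mean shortfall during failures, scaled by the threshold
    vulnerability = ((threshold - rain[fail]).mean() / threshold
                     if fail.any() else 0.0)
    return reliability, resilience, vulnerability

rng = np.random.default_rng(8)
rain = rng.gamma(2.0, 170.0, 34)     # hypothetical 34-year arid record (mm)
R_y, R_e, V_y = rrv(rain, threshold=300.0)
```

High Ry and Re with low Vy is the pattern the study uses to label a station's rainfall as sustainable and dependable.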

  10. Characterization of Ground Displacement Sources from Variational Bayesian Independent Component Analysis of Space Geodetic Time Series

    NASA Astrophysics Data System (ADS)

    Gualandi, Adriano; Serpelloni, Enrico; Elina Belardinelli, Maria; Bonafede, Maurizio; Pezzo, Giuseppe; Tolomei, Cristiano

    2015-04-01

    A critical point in the analysis of ground displacement time series, such as those measured by modern space geodetic techniques (primarily continuous GPS/GNSS and InSAR), is the development of data-driven methods that allow one to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies: PCA does not perform well in finding the solution to the so-called Blind Source Separation (BSS) problem. Recovering and separating the different sources that generate the observed ground deformation is a fundamental task in order to assign a physical meaning to each source. PCA fails in the BSS problem because it looks for a new Euclidean space where the projected data are uncorrelated. The uncorrelatedness condition is usually not strong enough, and it has been proven that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is another popular technique adopted to approach this problem, and it can be used in all fields where PCA is also applied. An ICA approach enables us to explain the displacement time series while imposing fewer constraints on the model, and to reveal anomalies in the data such as transient deformation signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mixture of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources
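    The BSS idea can be illustrated with standard FastICA (scikit-learn) rather than the vbICA method the authors use; the "stations", mixing matrix, and sources below are synthetic stand-ins for geodetic displacement data:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 2000)
s1 = np.sin(2 * np.pi * t)                 # seasonal oscillation
s2 = np.tanh(t - 5)                        # transient, step-like deformation
S = np.c_[s1, s2]
A = np.array([[1.0, 0.5], [0.4, 1.2], [0.8, 0.3]])   # mixing at 3 "stations"
X = S @ A.T + 0.01 * rng.standard_normal((len(t), 3))

# FastICA recovers the independent sources (up to sign and scale)
ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(X)
```

PCA applied to the same X would return orthogonal, uncorrelated components that generally mix the seasonal and transient signals; the independence condition is what separates them.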

  11. Accounting for Errors in Model Analysis Theory: A Numerical Approach

    NASA Astrophysics Data System (ADS)

    Sommer, Steven R.; Lindell, Rebecca S.

    2004-09-01

    By studying the patterns of a group of individuals' responses to a series of multiple-choice questions, researchers can utilize Model Analysis Theory to create a probability distribution of mental models for a student population. The eigenanalysis of this distribution yields information about which mental models the students possess, as well as how consistently they utilize those mental models. Although the theory considers the probabilistic distribution to be fundamental, there exist opportunities for random errors to occur. In this paper we discuss a numerical approach for mathematically accounting for these random errors. As an example of this methodology, an analysis of data obtained from the Lunar Phases Concept Inventory is presented. Limitations and applicability of this numerical approach are discussed.

  12. Contribution of alcohol in accident related mortality in Belarus: a time series approach

    PubMed Central

    Razvodovsky, Yury Evgeny

    2012-01-01

    Abstract: Background: High accidental death rates in the former Soviet republics (FSR) and their profound fluctuation over the past decades have attracted considerable interest. The research evidence emphasizes a binge drinking pattern as a potentially important contributor to the accident mortality crisis in the FSR. In line with this evidence, we assume that a higher level of alcohol consumption in conjunction with a binge drinking pattern results in a close aggregate-level association between alcohol psychoses and accidental death rates in the former Soviet Slavic republic of Belarus. Methods: Trends in the alcohol psychoses rate (as a proxy for alcohol consumption) from 1979 to 2007 were analyzed employing a distributed lag analysis in order to assess the bivariate relationship between the two time series. Results: According to the Bureau of Forensic Medicine autopsy reports, the number of deaths due to accidents and injuries increased by 52.5% (from 62.3 to 95.0 per 100,000 residents), and the fatal alcohol poisoning rate increased by 108.6% (from 12.8 to 26.7 per 100,000 residents) in Belarus between 1979 and 2007. Alcohol in the blood was found in 50.1% of the victims of deaths from accidents and injuries over the whole period, with a minimum of 40% in 1986 and a maximum of 58.2% in 2005. The outcome of the distributed lag analysis indicated a statistically significant association between the number of alcohol psychoses cases and the number of BAC-positive deaths from accidents at zero lag. Conclusion: The outcome of this study supports previous findings suggesting that alcohol and deaths from accidents are closely connected in a culture with a prevailing intoxication-oriented drinking pattern, and adds to the growing body of evidence that a substantial proportion of accidental deaths in Belarus are due to the effects of binge drinking. PMID:21502784

  13. Approaches to remote sensing data analysis

    USGS Publications Warehouse

    Pettinger, Lawrence R.

    1978-01-01

    Objectives: To present an overview of the essential steps in the remote sensing data analysis process, and to compare and contrast manual (visual) and automated analysis methods. Rationale: This overview is intended to provide a framework for choosing a manual or digital analysis approach to collecting resource information. It can also be used as a basis for understanding/evaluating invited papers and poster sessions during the Symposium.

  14. The Use of Scaffolding Approach to Enhance Students' Engagement in Learning Structural Analysis

    ERIC Educational Resources Information Center

    Hardjito, Djwantoro

    2010-01-01

    This paper presents a reflection on the use of the Scaffolding Approach to engage Civil Engineering students in learning Structural Analysis subjects. In this approach, after listening to the lecture on background theory, students are provided with a series of practice problems, each of which comes with the steps, formulas, hints, and tables needed to…

  15. DAHITI - An Innovative Approach for Estimating Water Level Time Series over Inland Water using Multi-Mission Satellite Altimetry

    NASA Astrophysics Data System (ADS)

    Schwatke, Christian; Dettmering, Denise

    2016-04-01

    Satellite altimetry has been designed for sea level monitoring over open ocean areas. However, for some years, this technology has also been used to retrieve water levels from lakes, reservoirs, rivers, wetlands and in general any inland water body. In this contribution, a new approach for the estimation of inland water level time series is presented. The method is the basis for the computation of time series of rivers and lakes available through the web service 'Database for Hydrological Time Series over Inland Water' (DAHITI). It is based on an extended outlier rejection and a Kalman filter approach incorporating cross-calibrated multi-mission altimeter data from Envisat, ERS-2, Jason-1, Jason-2, Topex/Poseidon, and SARAL/AltiKa, including their uncertainties. The new approach yields RMS differences with respect to in situ data of between 4 and 36 cm for lakes and between 8 and 114 cm for rivers. Within this presentation, the new approach will be introduced and examples of water level time series for a variety of lakes and rivers will be shown, featuring different characteristics such as shape, lake extent, river width, and data coverage. A comprehensive validation is performed by comparisons with in situ gauge data and results from external inland altimeter databases.

  16. Modeling Dyadic Processes Using Hidden Markov Models: A Time Series Approach to Mother-Infant Interactions during Infant Immunization

    ERIC Educational Resources Information Center

    Stifter, Cynthia A.; Rovine, Michael

    2015-01-01

    The present longitudinal study examined mother-infant interaction during the administration of immunizations at 2 and 6 months of age using hidden Markov modelling, a time series approach that produces latent states to describe how mothers and infants work together to bring the infant to a soothed state. Results revealed a…

  17. Multifractal detrended cross-correlation analysis on gold, crude oil and foreign exchange rate time series

    NASA Astrophysics Data System (ADS)

    Pal, Mayukha; Madhusudana Rao, P.; Manimaran, P.

    2014-12-01

    We apply the recently developed multifractal detrended cross-correlation analysis (MF-DCCA) method to investigate the cross-correlation behavior and fractal nature of pairs of non-stationary time series. We analyze the daily return prices of gold, West Texas Intermediate and Brent crude oil, and foreign exchange rate data over a period of 18 years. The cross-correlation is quantified through the Hurst scaling exponents and the singularity spectrum. The results show the existence of multifractal cross-correlations among all of these time series. We also find that the cross-correlation between gold and oil prices exhibits uncorrelated behavior, while the remaining bivariate time series exhibit persistent behavior. For five bivariate series, the cross-correlation exponents are less than the calculated average generalized Hurst exponents (GHE) for q < 0 and greater than the GHE for q > 0; for one bivariate series, the cross-correlation exponent is greater than the GHE for all q values.
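    The single-series building block of MF-DCCA is detrended fluctuation analysis, which estimates the Hurst exponent (here for q = 2) from the scaling of detrended segment variances; the cross-correlation variant replaces the squared fluctuations with products of the two series' detrended residuals. A minimal sketch on synthetic data:

```python
import numpy as np

def dfa_hurst(x, scales=None):
    """Order-1 detrended fluctuation analysis: the slope of
    log F(s) vs log s estimates the Hurst exponent."""
    x = np.asarray(x, float)
    y = np.cumsum(x - x.mean())                  # integrated profile
    if scales is None:
        scales = np.unique(np.logspace(np.log10(8), np.log10(len(x) // 4), 12).astype(int))
    F = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        # variance of each segment about its linear trend
        resid = [seg - np.polyval(np.polyfit(t, seg, 1), t) for seg in segs]
        F.append(np.sqrt(np.mean(np.square(resid))))
    slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return slope

rng = np.random.default_rng(1)
wn = rng.standard_normal(4096)
h_noise = dfa_hurst(wn)            # white noise: expect ~0.5 (uncorrelated)
h_walk = dfa_hurst(np.cumsum(wn))  # integrated noise: expect ~1.5 (persistent)
```

Exponents above 0.5 correspond to the "persistent behavior" reported for most of the bivariate series.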

  18. Singular spectrum analysis and Fisher-Shannon analysis of spring flow time series: An application to Anjar Spring, Lebanon

    NASA Astrophysics Data System (ADS)

    Telesca, Luciano; Lovallo, Michele; Shaban, Amin; Darwich, Talal; Amacha, Nabil

    2013-09-01

    In this study, the time dynamics of the water flow from Anjar Spring, one of the major issuing springs in the central part of Lebanon, was investigated. Like many water sources in Lebanon, this spring has no continuous discharge record, which prevents the application of standard time series analysis tools. Furthermore, the highly nonstationary character of the series requires suitable methodologies to gain insight into its dynamical features. Therefore, Singular Spectrum Analysis (SSA) and the Fisher-Shannon (FS) method, both useful for disclosing dynamical features in noisy nonstationary time series with gaps, are jointly applied to analyze the Anjar Spring water flow series. The SSA revealed that the series can be considered the superposition of meteo-climatic periodic components, a low-frequency trend, and noise-like high-frequency fluctuations. The FS method made it possible to extract and identify, among all the SSA reconstructed components, the long-term trend of the series. The long-term trend is characterized by a higher Fisher Information Measure (FIM) and lower Shannon entropy, and thus represents the main informative component of the whole series. Water discharge time series generally present a very complex time structure; therefore, the joint application of SSA and the FS method can be very useful in disclosing the main informative part of such data series in view of existing climatic variability and/or anthropogenic challenges.
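    Basic SSA embeds the series in a trajectory matrix, takes its SVD, and recovers additive components by anti-diagonal averaging; trend and periodic terms then appear as separate reconstructed components. A minimal sketch (the window length and test series are illustrative, not from the paper):

```python
import numpy as np

def ssa(x, L):
    """Basic SSA: return the elementary reconstructed components
    (one per singular triple) of series x, window length L."""
    x = np.asarray(x, float)
    N, K = len(x), len(x) - L + 1
    X = np.column_stack([x[j:j + L] for j in range(K)])   # trajectory matrix (L x K)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    comps = []
    for i in range(len(s)):
        Xi = s[i] * np.outer(U[:, i], Vt[i])
        # anti-diagonal averaging maps the rank-1 matrix back to a series
        rc = np.array([Xi[::-1].diagonal(k).mean() for k in range(-L + 1, K)])
        comps.append(rc)
    return np.array(comps)

t = np.arange(240)
series = 0.02 * t + np.sin(2 * np.pi * t / 12)   # trend + annual cycle
rcs = ssa(series, L=60)
```

Because the averaging is linear, the components sum exactly back to the original series; grouping the leading components recovers the trend and the periodic part separately.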

  19. Teaching time-series analysis. I. Finite Fourier analysis of ocean waves

    NASA Astrophysics Data System (ADS)

    Whitford, Dennis J.; Vieira, Mario E. C.; Waters, Jennifer K.

    2001-04-01

    The introduction of students to methods of time-series analysis is a pedagogical challenge, since the availability of easily manipulated computer software presents an attractive alternative to an understanding of the computations, as well as their assumptions and limitations. A two-part pedagogical tutorial exercise is offered as a hands-on laboratory to complement classroom discussions or as a reference for students involved in independent research projects. The exercises are focused on the analysis of ocean waves, specifically wind-generated surface gravity waves. The exercises are cross-disciplinary in nature and can be extended to any other field dealing with random signal analysis. The first exercise introduces the manual arithmetic steps of a finite Fourier analysis of a wave record, develops a spectrum, and compares these results to the results obtained using a fast Fourier transform (FFT). The second part of the exercise, described in the subsequent article, takes a longer wave record and addresses the theoretical and observed wave probability distributions of wave heights and sea surface elevations. These results are then compared to a FFT, thus linking the two pedagogical laboratory exercise parts for a more complete understanding of both exercises.
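    The first exercise's comparison, hand-computed Fourier coefficients versus an FFT, can be sketched as follows, using a synthetic two-component wave record whose frequencies fall exactly on Fourier bins (an assumption made here so the two results agree bin for bin):

```python
import numpy as np

dt = 0.5                          # sampling interval (s)
N = 128                           # record length: 64 s
n = np.arange(N)
t = n * dt
# synthetic record: 1.5 m swell at 0.125 Hz + 0.5 m sea at 0.3125 Hz
eta = 1.5 * np.cos(2 * np.pi * 0.125 * t) + 0.5 * np.cos(2 * np.pi * 0.3125 * t)

# finite Fourier analysis "by hand": cosine/sine coefficients per harmonic
k = np.arange(N // 2 + 1)
a = np.array([2 / N * np.sum(eta * np.cos(2 * np.pi * ki * n / N)) for ki in k])
b = np.array([2 / N * np.sum(eta * np.sin(2 * np.pi * ki * n / N)) for ki in k])
amp_manual = np.sqrt(a ** 2 + b ** 2)

# the same amplitudes from the FFT
amp_fft = 2 * np.abs(np.fft.rfft(eta)) / N
freqs = np.fft.rfftfreq(N, dt)    # Hz
```

Squaring the amplitudes (and dividing by the bin width) turns either result into the wave spectrum discussed in the exercise.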

  20. Performance analysis of SHE-PWM using Fourier Series and Newton-Raphson analysis

    NASA Astrophysics Data System (ADS)

    Lada, M. Y.; Khiar, M. S. A.; Ghani, S. A.; Nawawi, M. R. M.; Nor, A. S. M.; Yuen, J. G. M.

    2015-05-01

    Inverter performance plays a vital role in today's power systems, and a major issue that reduces it is harmonic distortion, which contributes to power losses. A variety of control techniques have therefore been implemented for inverter switching, such as square wave, SHE-PWM, unipolar, and bipolar. A square-wave inverter produces a square-shaped output voltage and requires only simple logic control and power switches. The unipolar and bipolar techniques use a comparator to compare a reference voltage waveform with a triangular waveform; the difference is that unipolar switching compares two reference signals with the triangular waveform, whereas bipolar switching compares a single reference signal with it. Selective Harmonic Elimination Pulse-Width Modulation (SHE-PWM) is another control technique for inverters. This research proposes SHE-PWM as a low-switching-frequency strategy that uses Fourier series and Newton-Raphson analysis to calculate the switching angles that eliminate harmonic distortion. The Fourier series is used to determine the amplitude of any odd harmonic in the output signal, whereas Newton-Raphson iteration is used to solve the equations for the switching angles. As a result, SHE-PWM can select which low-frequency harmonic components to eliminate and thus reduce the harmonic distortion to which inverter performance is sensitive.
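    A minimal version of the Newton-Raphson step can be sketched with two switching angles: one equation fixes the per-unit fundamental at a modulation index M, the other nulls the 5th harmonic. The harmonic expressions assume one common quarter-wave-symmetric waveform convention and M = 0.8 is illustrative; scipy's fsolve plays the role of the Newton-Raphson solver:

```python
import numpy as np
from scipy.optimize import fsolve

def she_residuals(alpha, M):
    """Two-angle SHE equations for a quarter-wave-symmetric waveform:
    set the per-unit fundamental to M and null the 5th harmonic."""
    a1, a2 = alpha
    return [np.cos(a1) - np.cos(a2) - M * np.pi / 4,   # fundamental target
            np.cos(5 * a1) - np.cos(5 * a2)]           # 5th harmonic = 0

M = 0.8
a1, a2 = fsolve(she_residuals, x0=[0.05, 1.2], args=(M,))
angles_deg = np.degrees([a1, a2])
```

Under this convention the solution lies near 3.7 and 68.3 degrees; a fuller implementation would solve for more angles (eliminating the 7th, 11th, ... harmonics as well) and sweep M over the operating range.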

  1. Systemic Analysis Approaches for Air Transportation

    NASA Technical Reports Server (NTRS)

    Conway, Sheila

    2005-01-01

    Air transportation system designers have had only limited success using traditional operations research and parametric modeling approaches in their analyses of innovations. They need a systemic methodology for modeling safety-critical infrastructure that is comprehensive, objective, and sufficiently concrete, yet simple enough to be used with reasonable investment. The methodology must also be amenable to quantitative analysis so that issues of system safety and stability can be rigorously addressed. However, air transportation has proven itself an extensive, complex system whose behavior is difficult to describe, let alone predict. There is a wide range of system analysis techniques available, but some are more appropriate for certain applications than others. Specifically in the area of complex system analysis, the literature suggests that both agent-based models and network analysis techniques may be useful. This paper discusses the theoretical basis for each approach in these applications, and explores their historic and potential further use for air transportation analysis.

  2. CCD Observing and Dynamical Time Series Analysis of Active Galactic Nuclei.

    NASA Astrophysics Data System (ADS)

    Nair, Achotham Damodaran

    1995-01-01

    The properties, operation, and observing procedures of the Charge-Coupled Device (CCD) on the 30-inch telescope at Rosemary Hill Observatory (RHO) are discussed, together with the details of data reduction. Several nonlinear techniques of time series analysis, based on the behavior of nearest neighbors, have been used to analyze the time series of the quasar 3C 345. A technique using Artificial Neural Networks, based on prediction of the time series, is used to study the dynamical properties of 3C 345. Finally, a heuristic model for the variability of Active Galactic Nuclei is discussed.

  3. Time series analysis of hydraulic head and strain of subsurface formations in the Kanto Plain, Japan

    NASA Astrophysics Data System (ADS)

    Aichi, Masaatsu

    2015-04-01

    The hydraulic head and strain of subsurface formations have been monitored for more than several decades in the Kanto Plain, Japan. Time series analysis of these data revealed that the relation between hydraulic head and strain observed in some monitoring wells can be modeled by linear poroelasticity. Based on the relations in the time series data, the poroelastic coefficients were estimated. The obtained values are consistent with those from laboratory experiments reported in the literature.

  4. FunPat: function-based pattern analysis on RNA-seq time series data

    PubMed Central

    2015-01-01

    Background Dynamic expression data, nowadays obtained using high-throughput RNA sequencing, are essential to monitor transient gene expression changes and to study the dynamics of transcriptional activity in the cell and its response to stimuli. Several methods for data selection, clustering and functional analysis are available; however, these steps are usually performed independently, without exploiting and integrating the information derived from each step of the analysis. Methods Here we present FunPat, an R package for time series RNA sequencing data that integrates gene selection, clustering and functional annotation into a single framework. FunPat exploits functional annotations by performing, for each functional term (e.g. a Gene Ontology term), an integrated selection-clustering analysis to select differentially expressed genes that share, besides annotation, a common dynamic expression profile. Results FunPat performance was assessed on both simulated and real data. With respect to a stand-alone selection step, integrating the clustering step improves the recall without altering the false discovery rate. FunPat also shows high precision and recall in detecting the correct temporal expression patterns; in particular, the recall is significantly higher than that of hierarchical, k-means and a model-based clustering approach specifically designed for RNA sequencing data. Moreover, when biological replicates are missing, FunPat is able to provide reproducible lists of significant genes. The application to real time series expression data shows the ability of FunPat to select differentially expressed genes with high reproducibility, indirectly confirming high precision and recall in gene selection. Moreover, the expression patterns obtained as output allow an easy interpretation of the results.
Conclusions A novel analysis pipeline was developed to search the main temporal patterns in classes of genes similarly annotated, improving the sensitivity of

  5. Providing web-based tools for time series access and analysis

    NASA Astrophysics Data System (ADS)

    Eberle, Jonas; Hüttich, Christian; Schmullius, Christiane

    2014-05-01

    Time series information is widely used in environmental change analyses and is also essential information for stakeholders and governmental agencies. However, a challenging issue is the processing of raw data and the execution of time series analysis. In most cases, data have to be found, downloaded, processed and even converted to the correct data format prior to executing time series analysis tools. Data have to be prepared for use in different existing software packages. Several packages, such as TIMESAT (Jönnson & Eklundh, 2004) for phenological studies, BFAST (Verbesselt et al., 2010) for breakpoint detection, and GreenBrown (Forkel et al., 2013) for trend calculations, are provided as open-source software and can be executed from the command line. This is needed if data pre-processing and time series analysis are to be automated. To bring both parts, automated data access and data analysis, together, a web-based system was developed to provide access to satellite-based time series data and to the above-mentioned analysis tools. Users of the web portal are able to specify a point or a polygon and an available dataset (e.g., Vegetation Indices and Land Surface Temperature datasets from NASA MODIS). The data are then processed and provided as a time series CSV file. Afterwards the user can select an analysis tool that is executed on the server. The final data (CSV, plot images, GeoTIFFs) are visualized in the web portal and can be downloaded for further usage. As a first use case, we built up a complementary web-based system with NASA MODIS products for Germany and parts of Siberia based on the Earth Observation Monitor (www.earth-observation-monitor.net). The aim of this work is to make time series analysis with existing tools as easy as possible, so that users can focus on the interpretation of the results. References: Jönnson, P. and L. Eklundh (2004). TIMESAT - a program for analysing time-series of satellite sensor data. Computers and Geosciences 30

  6. An Interactive Analysis of Hyperboles in a British TV Series: Implications For EFL Classes

    ERIC Educational Resources Information Center

    Sert, Olcay

    2008-01-01

    This paper, part of an ongoing study on the analysis of hyperboles in a British TV series, reports findings drawing upon a 90,000 word corpus. The findings are compared to the ones from CANCODE (McCarthy and Carter 2004), a five-million word corpus of spontaneous speech, in order to identify similarities between the two. The analysis showed that…

  7. Use of interrupted time series analysis in evaluating health care quality improvements.

    PubMed

    Penfold, Robert B; Zhang, Fang

    2013-01-01

    Interrupted time series (ITS) analysis is arguably the strongest quasi-experimental research design. ITS is particularly useful when a randomized trial is infeasible or unethical. The approach usually involves constructing a time series of population-level rates for a particular quality improvement focus (eg, rates of attention-deficit/hyperactivity disorder [ADHD] medication initiation) and testing statistically for a change in the outcome rate in the time periods before and after implementation of a policy/program designed to change the outcome. In parallel, investigators often analyze rates of negative outcomes that might be (unintentionally) affected by the policy/program. We discuss why ITS is a useful tool for quality improvement. Strengths of ITS include the ability to control for secular trends in the data (unlike a 2-period before-and-after t test), the ability to evaluate outcomes using population-level data, clear graphical presentation of results, ease of conducting stratified analyses, and the ability to evaluate both intended and unintended consequences of interventions. Limitations of ITS include the need for a minimum of 8 time periods before and 8 after an intervention to evaluate changes statistically, difficulty in analyzing the independent impact of separate components of a program that are implemented close together in time, and the need for a suitable control population. Investigators must also be careful not to make individual-level inferences when population-level rates are used to evaluate interventions (though ITS can be used with individual-level data). A brief description of ITS is provided, including a fully implemented (but hypothetical) study of the impact of a program to reduce ADHD medication initiation in children younger than 5 years old and insured by Medicaid in Washington State. An example of the database needed to conduct an ITS is provided, as well as SAS code to implement a difference-in-differences model using
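    The core of an ITS evaluation is a segmented regression with a baseline trend, a level-change term, and a slope-change term. A minimal sketch on simulated monthly rates (the rates and effect sizes are invented; in practice one would use a regression package that also reports standard errors and autocorrelation-robust inference):

```python
import numpy as np

rng = np.random.default_rng(42)
t = np.arange(24)                       # 12 periods before, 12 after
post = (t >= 12).astype(float)          # intervention indicator
t_since = np.maximum(t - 11, 0)         # periods since intervention

# simulated outcome rate: secular trend, then a level drop and slope change
rate = 10 + 0.3 * t - 3.0 * post - 0.2 * t_since + rng.normal(0, 0.3, 24)

# segmented regression: rate ~ 1 + t + post + t_since
X = np.column_stack([np.ones(24), t, post, t_since])
beta, *_ = np.linalg.lstsq(X, rate, rcond=None)
baseline_slope, level_change, slope_change = beta[1], beta[2], beta[3]
```

The secular-trend term is exactly what a naive two-period before-and-after comparison would miss.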

  8. Brief Communication: Earthquake sequencing: analysis of time series constructed from the Markov chain model

    NASA Astrophysics Data System (ADS)

    Cavers, M. S.; Vasudevan, K.

    2015-10-01

    Directed graph representation of a Markov chain model to study global earthquake sequencing leads to a time series of state-to-state transition probabilities that includes the spatio-temporally linked recurrent events in the record-breaking sense. A state refers to a configuration comprised of zones with either the occurrence or non-occurrence of an earthquake in each zone in a pre-determined time interval. Since the time series is derived from non-linear and non-stationary earthquake sequencing, we use established analysis methods to glean new information. We apply decomposition procedures such as ensemble empirical mode decomposition (EEMD) to study the state-to-state fluctuations in each of the intrinsic mode functions, and subject the intrinsic mode functions derived from the time series using the EEMD to a detailed analysis to extract the information content of the time series. We also investigate the influence of random noise on the data-driven state-to-state transition probabilities. We consider a second aspect of earthquake sequencing that is closely tied to its time-correlative behaviour: here, we extend the Fano factor and Allan factor analysis to the time series of state-to-state transition frequencies of a Markov chain. Our results support not only the usefulness of the intrinsic mode functions in understanding the time series but also the presence of power-law behaviour exemplified by the Fano factor and the Allan factor.
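    The Fano and Allan factors compare count variability across non-overlapping windows; for a memoryless (Poisson) sequence both are near 1, and systematic growth with window size indicates the time-correlative, power-law behaviour discussed above. A minimal sketch on synthetic counts standing in for state-to-state transition frequencies:

```python
import numpy as np

def window_counts(events, w):
    """Sum an event-count series into non-overlapping windows of length w."""
    n = (len(events) // w) * w
    return events[:n].reshape(-1, w).sum(axis=1)

def fano_factor(events, w):
    c = window_counts(events, w)
    return c.var() / c.mean()             # variance-to-mean ratio

def allan_factor(events, w):
    c = window_counts(events, w)
    return np.mean(np.diff(c) ** 2) / (2 * c.mean())

# memoryless stand-in: both factors should be close to 1 at any window size
rng = np.random.default_rng(7)
counts = rng.poisson(2.0, 10000)
f, a = fano_factor(counts, 10), allan_factor(counts, 10)
```

Plotting both factors against a range of window sizes on log-log axes is how the power-law scaling exponents are read off.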

  9. Process fault detection and nonlinear time series analysis for anomaly detection in safeguards

    SciTech Connect

    Burr, T.L.; Mullen, M.F.; Wangen, L.E.

    1994-02-01

    In this paper we discuss two advanced techniques, process fault detection and nonlinear time series analysis, and apply them to the analysis of vector-valued and single-valued time-series data. We investigate model-based process fault detection methods for analyzing simulated, multivariate, time-series data from a three-tank system. The model predictions are compared with simulated measurements of the same variables to form residual vectors that are tested for the presence of faults (possible diversions, in safeguards terminology). We evaluate two methods, testing all individual residuals with a univariate z-score and testing all variables simultaneously with the Mahalanobis distance, for their ability to detect loss of material in two different leak scenarios from the three-tank system: a leak without, and one with, replacement of the lost volume. Nonlinear time-series analysis tools were compared with the linear methods popularized by Box and Jenkins. We compare prediction results using three nonlinear and two linear modeling methods on each of six simulated time series: two nonlinear and four linear. The nonlinear methods performed better at predicting the nonlinear time series and did as well as the linear methods at predicting the linear ones.
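    The two residual tests compared in the paper can be sketched as follows: a univariate z-score per variable versus a single Mahalanobis distance over the whole residual vector. The three-tank residual statistics and the leak magnitude below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
# residuals (measured minus model-predicted tank volumes) under normal operation
normal = rng.multivariate_normal(np.zeros(3), np.diag([0.4, 0.3, 0.5]), size=500)
mu, cov_inv = normal.mean(axis=0), np.linalg.inv(np.cov(normal.T))

def mahalanobis2(r):
    """Squared Mahalanobis distance of one residual vector."""
    d = r - mu
    return float(d @ cov_inv @ d)

fault = np.array([0.0, -2.5, 0.0])     # hypothetical loss from tank 2
threshold = 11.34                      # chi-square (3 dof) 99th percentile
d2_fault = mahalanobis2(fault)
d2_typical = np.mean([mahalanobis2(r) for r in normal])
# the univariate alternative: a per-tank z-score
z_fault = np.abs((fault - mu) / normal.std(axis=0))
```

The Mahalanobis test raises one alarm for the whole vector, while the z-score approach must be thresholded per variable and corrected for multiple testing.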

  10. Kalman filter approach for estimating water level time series over inland water using multi-mission satellite altimetry

    NASA Astrophysics Data System (ADS)

    Schwatke, C.; Dettmering, D.; Bosch, W.; Seitz, F.

    2015-05-01

    Satellite altimetry has been designed for sea level monitoring over open ocean areas. However, for some years this technology has also been used to observe inland water levels of lakes and rivers. In this paper, a new approach for the estimation of inland water level time series is described. It is used for the computation of time series available through the web service "Database for Hydrological Time Series over Inland Water" (DAHITI). The method is based on a Kalman filter approach incorporating multi-mission altimeter observations and their uncertainties. As input data, cross-calibrated altimeter data from Envisat, ERS-2, Jason-1, Jason-2, Topex/Poseidon, and SARAL/AltiKa are used. The paper presents water level time series for a variety of lakes and rivers in North and South America featuring different characteristics such as shape, lake extent, river width, and data coverage. A comprehensive validation is performed by comparison with in-situ gauge data and results from external inland altimeter databases. The new approach yields RMS differences with respect to in-situ data of between 4 and 38 cm for lakes and between 12 and 139 cm for rivers. For most study cases, more accurate height information than from other available altimeter databases can be achieved.
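    The multi-mission fusion idea can be sketched as a one-dimensional random-walk Kalman filter in which each altimeter pass contributes a height with its own variance. The observation values, variances, and process noise below are illustrative, not DAHITI's actual settings:

```python
import numpy as np

def kalman_level(obs, obs_var, q=1e-3):
    """Random-walk Kalman filter: fuse heights with per-pass variances."""
    x, p = obs[0], obs_var[0]           # initialize from the first pass
    est = [x]
    for z, r in zip(obs[1:], obs_var[1:]):
        p += q                          # predict: random-walk process noise
        k = p / (p + r)                 # Kalman gain
        x += k * (z - x)                # update with the new pass
        p *= (1 - k)
        est.append(x)
    return np.array(est)

truth = 100.0                           # metres, illustrative lake level
# two missions with different accuracies (e.g., 30 cm vs 5 cm)
sd = np.array([0.30, 0.05, 0.30, 0.05, 0.30, 0.05, 0.05, 0.05])
rng = np.random.default_rng(5)
obs = truth + rng.normal(0, sd)
levels = kalman_level(obs, sd ** 2)
```

The gain automatically down-weights the noisier mission, which is the point of carrying per-observation uncertainties through the cross-calibration.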

  11. DAHITI - an innovative approach for estimating water level time series over inland waters using multi-mission satellite altimetry

    NASA Astrophysics Data System (ADS)

    Schwatke, C.; Dettmering, D.; Bosch, W.; Seitz, F.

    2015-10-01

    Satellite altimetry has been designed for sea level monitoring over open ocean areas. However, for some years, this technology has also been used to retrieve water levels from reservoirs, wetlands and in general any inland water body, although the radar altimetry technique has been especially applied to rivers and lakes. In this paper, a new approach for the estimation of inland water level time series is described. It is used for the computation of time series of rivers and lakes available through the web service "Database for Hydrological Time Series over Inland Waters" (DAHITI). The new method is based on an extended outlier rejection and a Kalman filter approach incorporating cross-calibrated multi-mission altimeter data from Envisat, ERS-2, Jason-1, Jason-2, TOPEX/Poseidon, and SARAL/AltiKa, including their uncertainties. The paper presents water level time series for a variety of lakes and rivers in North and South America featuring different characteristics such as shape, lake extent, river width, and data coverage. A comprehensive validation is performed by comparisons with in situ gauge data and results from external inland altimeter databases. The new approach yields rms differences with respect to in situ data between 4 and 36 cm for lakes and 8 and 114 cm for rivers. For most study cases, more accurate height information than from other available altimeter databases can be achieved.

  12. Spectral analysis of hydrological time series of a river basin in southern Spain

    NASA Astrophysics Data System (ADS)

    Luque-Espinar, Juan Antonio; Pulido-Velazquez, David; Pardo-Igúzquiza, Eulogio; Fernández-Chacón, Francisca; Jiménez-Sánchez, Jorge; Chica-Olmo, Mario

    2016-04-01

    Spectral analysis has been applied with the aim to determine the presence and statistical significance of climate cycles in data series from different rainfall, piezometric and gauging stations located in upper Genil River Basin. This river starts in Sierra Nevada Range at 3,480 m a.s.l. and is one of the most important rivers of this region. The study area has more than 2.500 km2, with large topographic differences. For this previous study, we have used more than 30 rain data series, 4 piezometric data series and 3 data series from gauging stations. Considering a monthly temporal unit, the studied period range from 1951 to 2015 but most of the data series have some lacks. Spectral analysis is a methodology widely used to discover cyclic components in time series. The time series is assumed to be a linear combination of sinusoidal functions of known periods but of unknown amplitude and phase. The amplitude is related with the variance of the time series, explained by the oscillation at each frequency (Blackman and Tukey, 1958, Bras and Rodríguez-Iturbe, 1985, Chatfield, 1991, Jenkins and Watts, 1968, among others). The signal component represents the structured part of the time series, made up of a small number of embedded periodicities. Then, we take into account the known result for the one-sided confidence band of the power spectrum estimator. For this study, we established confidence levels of <90%, 90%, 95%, and 99%. Different climate signals have been identified: ENSO, QBO, NAO, Sun Spot cycles, as well as others related to sun activity, but the most powerful signals correspond to the annual cycle, followed by the 6 month and NAO cycles. Nevertheless, significant differences between rain data series and piezometric/flow data series have been pointed out. In piezometric data series and flow data series, ENSO and NAO signals could be stronger than others with high frequencies. 
The climatic peaks in lower frequencies in rain data are smaller and the confidence
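The workflow the abstract describes (decomposing a monthly series into sinusoids and locating the dominant cycles in the power spectrum) can be sketched with SciPy's periodogram; the synthetic series and amplitudes below are illustrative, not the Genil data:

```python
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(0)
n_months = 600  # 50 years of monthly data
t = np.arange(n_months)
# synthetic rainfall-like series: annual cycle + 6-month cycle + noise
series = (10 * np.sin(2 * np.pi * t / 12)
          + 4 * np.sin(2 * np.pi * t / 6)
          + rng.normal(0, 3, n_months))

freqs, power = periodogram(series, fs=1.0)  # frequencies in cycles per month
peak = freqs[np.argmax(power)]
print(f"dominant period: {1 / peak:.1f} months")
```

With real rainfall series, the significance of each peak would then be judged against the one-sided confidence band of the power spectrum estimator mentioned in the abstract.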

  13. Documentation of a spreadsheet for time-series analysis and drawdown estimation

    USGS Publications Warehouse

    Halford, Keith J.

    2006-01-01

    Drawdowns during aquifer tests can be obscured by barometric pressure changes, earth tides, regional pumping, and recharge events in the water-level record. These stresses can create water-level fluctuations that should be removed from observed water levels prior to estimating drawdowns. Simple models have been developed for estimating unpumped water levels during aquifer tests that are referred to as synthetic water levels. These models sum multiple time series such as barometric pressure, tidal potential, and background water levels to simulate non-pumping water levels. The amplitude and phase of each time series are adjusted so that synthetic water levels match measured water levels during periods unaffected by an aquifer test. Differences between synthetic and measured water levels are minimized with a sum-of-squares objective function. Root-mean-square errors during fitting and prediction periods were compared multiple times at four geographically diverse sites. Prediction error equaled fitting error when fitting periods were greater than or equal to four times prediction periods. The proposed drawdown estimation approach has been implemented in a spreadsheet application. Measured time series are independent so that collection frequencies can differ and sampling times can be asynchronous. Time series can be viewed selectively and magnified easily. Fitting and prediction periods can be defined graphically or entered directly. Synthetic water levels for each observation well are created with earth tides, measured time series, moving averages of time series, and differences between measured and moving averages of time series. Selected series and fitting parameters for synthetic water levels are stored and drawdowns are estimated for prediction periods. Drawdowns can be viewed independently and adjusted visually if an anomaly skews initial drawdowns away from 0. The number of observations in a drawdown time series can be reduced by averaging across user
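The synthetic-water-level idea (fit the amplitude of each stress series over a fitting period unaffected by pumping, then predict non-pumped levels) reduces to a least-squares fit. This is a minimal numpy sketch with made-up barometric and tidal series, not Halford's spreadsheet:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0, 30, 1 / 24)              # 30 days of hourly readings
baro = np.sin(2 * np.pi * t / 1.0)        # daily barometric swing (illustrative)
tide = np.sin(2 * np.pi * t / 0.5175)     # semidiurnal earth-tide potential
water = 100 - 0.3 * baro + 0.1 * tide + rng.normal(0, 0.01, t.size)

# fit amplitudes of each stress series during the pre-test fitting period
fit = t < 20
A = np.column_stack([np.ones(fit.sum()), baro[fit], tide[fit]])
coef, *_ = np.linalg.lstsq(A, water[fit], rcond=None)

# synthetic (non-pumping) water level over the whole record;
# the residual during the prediction period is the estimated drawdown
synthetic = coef[0] + coef[1] * baro + coef[2] * tide
drawdown = synthetic - water
print(f"fitted baro/tide amplitudes: {coef[1]:.3f}, {coef[2]:.3f}")
```

In the spreadsheet approach the fitted series can also include moving averages and background water levels, and the fit minimizes the same sum-of-squares objective.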

  14. Nonstationary frequency analysis for the trivariate flood series of the Weihe River

    NASA Astrophysics Data System (ADS)

    Jiang, Cong; Xiong, Lihua

    2016-04-01

    Some intensive human activities, such as water-soil conservation, can significantly alter the natural hydrological processes of rivers. In this study, the effect of water-soil conservation on the trivariate flood series of the Weihe River, located in Northwest China, is investigated. The annual maximum daily discharge, annual maximum 3-day flood volume, and annual maximum 5-day flood volume are chosen as the study data and used to compose the trivariate flood series. The nonstationarities in both the individual univariate flood series and the corresponding antecedent precipitation series generating the flood events are examined with the Mann-Kendall trend test. It is found that all individual univariate flood series present significant decreasing trends, while the antecedent precipitation series can be treated as stationary. This indicates that the increase of the water-soil conservation land area has altered the rainfall-runoff relationship of the Weihe basin and induced the nonstationarities in the three individual univariate flood series. The time-varying moments model based on the Pearson type III distribution is applied to capture the nonstationarities in the flood frequency distribution, with the water-soil conservation land area introduced as the explanatory variable of the flood distribution parameters. Building on the analysis of each individual univariate flood series, the dependence structure among the three univariate flood series is investigated by a time-varying copula model, also with the water-soil conservation land area as the explanatory variable of the copula parameters. The results indicate that the dependence among the trivariate flood series is enhanced by the increase of water-soil conservation land area.
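The Mann-Kendall trend test used to detect the decreasing flood trends is simple to state; a no-ties sketch is below (the full test also corrects the variance for tied values):

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall trend test (no-ties form): S statistic and two-sided p-value."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S sums the signs of all pairwise later-minus-earlier differences
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18          # variance without ties
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return s, 2 * (1 - norm.cdf(abs(z)))

# a strictly decreasing series yields a strongly negative S and tiny p-value
s, p = mann_kendall(np.linspace(10, 2, 40))
print(f"S = {s:.0f}, p = {p:.3g}")
```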

  15. Analysing time-varying trends in stratospheric ozone time series using the state space approach

    NASA Astrophysics Data System (ADS)

    Laine, M.; Latva-Pukkila, N.; Kyrölä, E.

    2014-09-01

    We describe a hierarchical statistical state space model for ozone profile time series. The time series are from satellite measurements by the Stratospheric Aerosol and Gas Experiment (SAGE) II and the Global Ozone Monitoring by Occultation of Stars (GOMOS) instruments spanning the years 1984-2011. Vertical ozone profiles were linearly interpolated onto an altitude grid with 1 km resolution covering 20-60 km. Monthly averages were calculated for each altitude level and for 10° wide latitude bins between 60° S and 60° N. In the analysis, mean densities are studied separately for the 25-35, 35-45, and 45-55 km layers. Model variables include the ozone mean level, a local trend, seasonal oscillations, and proxy variables for solar activity, the Quasi-Biennial Oscillation (QBO), and the El Niño-Southern Oscillation (ENSO). This is a companion paper to an earlier study in which a piecewise linear model was used together with the same proxies as in this work (excluding ENSO). There, the piecewise linear trend was allowed to change at the beginning of 1997 at all latitudes and altitudes. In the modelling of the present paper no such assumption is needed, as the linear trend is allowed to change continuously at each time step. The same freedom is allowed for the seasonal oscillations, whereas the other regression coefficients are taken to be independent of time. According to our analyses, the slowly varying ozone background shows roughly three general development patterns. A continuous decay over the whole period 1984-2011 is evident in the southernmost latitude belt 50-60° S in all altitude regions and in 50-60° N in the lowest altitude region 25-35 km. A second pattern, in which a recovery after an initial decay is followed by a further decay, is found at northern latitudes from the equator to 50° N in the lowest altitude region (25-35 km) and between 40° N and 60° N in the 35-45 km altitude region. Further ozone loss occurred after 2007 in these regions. Everywhere else a decay is followed
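The key difference from a piecewise linear fit is that a state space trend is re-estimated at every time step by a Kalman filter. A minimal local-level filter (a deliberate simplification of the paper's local linear trend model, with hypothetical variances q and r) looks like:

```python
import numpy as np

def local_level_filter(y, q=0.01, r=1.0):
    """Minimal Kalman filter for a local-level state space model:
    mu_t = mu_{t-1} + w_t (variance q),  y_t = mu_t + v_t (variance r)."""
    mu, p = y[0], 1.0
    estimates = []
    for obs in y:
        p = p + q                  # predict: state uncertainty grows each step
        k = p / (p + r)            # Kalman gain
        mu = mu + k * (obs - mu)   # update toward the new observation
        p = (1 - k) * p
        estimates.append(mu)
    return np.array(estimates)

rng = np.random.default_rng(0)
y = 5.0 + rng.normal(0, 1, 300)    # noisy series with a constant background
level = local_level_filter(y)
print(f"final level estimate: {level[-1]:.2f}")
```

A local linear trend model adds a slope state that evolves the same way, which is what lets the trend "change continuously at each time step" as described above.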

  16. Accuracy enhancement of GPS time series using principal component analysis and block spatial filtering

    NASA Astrophysics Data System (ADS)

    He, Xiaoxing; Hua, Xianghong; Yu, Kegen; Xuan, Wei; Lu, Tieding; Zhang, W.; Chen, X.

    2015-03-01

    This paper focuses on performance analysis and accuracy enhancement of long-term position time series of a regional network of GPS stations comprising two nearby sub-blocks: one block of 8 stations in the Cascadia region and another block of 14 stations in Southern California. We have analyzed the seasonal variations of the 22 IGS site positions between 2004 and 2011. The Green's function approach is used to calculate the station-site displacements induced by environmental loading due to atmospheric pressure, soil moisture, snow depth, and nontidal ocean loading. The analysis reveals that these loading factors can result in position shifts at the centimeter level; the displacement time series exhibit a periodic pattern, which can explain about 12.70-21.78% of the seasonal amplitude of the vertical GPS time series; and the loading effect differs significantly between the two nearby geographical regions. After the loading effect is corrected, principal component analysis (PCA)-based block spatial filtering is proposed to filter out the remaining common mode error (CME) of the GPS time series. The results show that PCA-based block spatial filtering can extract the CME more accurately and effectively than the conventional overall filtering method, reducing more of the uncertainty. With the loading correction and block spatial filtering, about 68.34-73.20% of the vertical GPS seasonal power can be separated and removed, improving the reliability of the GPS time series and hence enabling better deformation analysis and higher-precision geodetic applications.
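PCA-based spatial filtering of a station block amounts to removing the leading principal component of the residual matrix, which captures the error common to all stations. A minimal sketch on synthetic residuals (the station counts and noise levels are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n_epochs, n_stations = 365, 8
cme = rng.normal(0, 2.0, n_epochs)      # common-mode error shared by all stations
noise = rng.normal(0, 0.5, (n_epochs, n_stations))
resid = cme[:, None] + noise            # daily position residuals per station

# PCA via SVD on the demeaned residual matrix; PC1 captures the common mode
X = resid - resid.mean(axis=0)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
common_mode = np.outer(U[:, 0] * S[0], Vt[0])
filtered = X - common_mode              # spatially filtered residuals

print(f"std before: {X.std():.2f}, after: {filtered.std():.2f}")
```

Filtering each sub-block separately, as proposed in the paper, simply applies this per block so that a region-specific common mode is not smeared across both regions.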

  17. A hybrid-domain approach for modeling climate data time series

    NASA Astrophysics Data System (ADS)

    Wen, Qiuzi H.; Wang, Xiaolan L.; Wong, Augustine

    2011-09-01

    In order to model climate data time series that often contain periodic variations, trends, and sudden changes in mean (mean shifts, mostly artificial), this study proposes a hybrid-domain (HD) algorithm, which incorporates a time domain test and a newly developed frequency domain test through an iterative procedure that is analogous to the well-known backfitting algorithm. A two-phase competition procedure is developed to address the confounding issue between modeling periodic variations and mean shifts. A variety of distinctive features of climate data time series, including trends, periodic variations, mean shifts, and a dependent noise structure, can be modeled in tandem using the HD algorithm. This is particularly important for homogenization of climate data from a low-density observing network in which reference series are not available to help preserve climatic trends and long-term periodic variations, preventing them from being mistaken for artificial shifts. The HD algorithm is also powerful in estimating trend and periodicity in a homogeneous data time series (i.e., in the absence of any mean shift). The performance of the HD algorithm (in terms of false alarm rate and hit rate in detecting shifts/cycles, and estimation accuracy) is assessed via a simulation study. Its power is further illustrated through its application to a few climate data time series.
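The backfitting idea the HD algorithm builds on (alternately re-estimating each component against the partial residual of the others) can be illustrated for a simple trend-plus-seasonal decomposition; the series here is synthetic and the component models are deliberately basic:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(240)                        # 20 years of monthly data
month = t % 12
y = 0.02 * t + 3 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.5, t.size)

trend = np.zeros_like(y)
for _ in range(10):  # backfit: refit each component on the partial residuals
    detrended = y - trend
    seasonal = np.array([detrended[month == m].mean() for m in range(12)])[month]
    b = np.polyfit(t, y - seasonal, 1)    # linear trend on deseasonalized data
    trend = np.polyval(b, t)
print(f"estimated slope: {b[0]:.4f}")
```

The HD algorithm replaces these two steps with a time domain shift test and a frequency domain periodicity test, but the iteration structure is the same.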

  18. Comparative Performance Analysis of a Hyper-Temporal Ndvi Analysis Approach and a Landscape-Ecological Mapping Approach

    NASA Astrophysics Data System (ADS)

    Ali, A.; de Bie, C. A. J. M.; Scarrott, R. G.; Ha, N. T. T.; Skidmore, A. K.

    2012-07-01

    Both agricultural area expansion and intensification are necessary to cope with the growing demand for food, and the growing threat of food insecurity which is rapidly engulfing poor and under-privileged sections of the global population. Therefore, it is of paramount importance to have the ability to accurately estimate crop area and spatial distribution. Remote sensing has become a valuable tool for estimating and mapping cropland areas, useful in food security monitoring. This work contributes to addressing this broad issue, focusing on the comparative performance analysis of two mapping approaches: (i) a hyper-temporal Normalized Difference Vegetation Index (NDVI) analysis approach and (ii) a landscape-ecological approach. The hyper-temporal NDVI analysis approach utilized SPOT 10-day NDVI imagery from April 1998 to December 2008, whilst the landscape-ecological approach used multitemporal Landsat-7 ETM+ imagery acquired intermittently between 1992 and 2002. Pixels in the time-series NDVI dataset were clustered using an ISODATA clustering algorithm adapted to determine the optimal number of pixel clusters to successfully generalize hyper-temporal datasets. Clusters were then characterized with crop cycle information and flooding information to produce an NDVI unit map of rice classes with flood regime and NDVI profile information. A landscape-ecological map was generated using a combination of digitized homogeneous map units in the Landsat-7 ETM+ imagery, a 2005 land use map of the Mekong delta, and supplementary datasets on the region's terrain, geomorphology and flooding depths. The output maps were validated using reported crop statistics, and regression analyses were used to ascertain the relationship between land use area estimated from maps, and those reported in district crop statistics. 
The regression analysis showed that the hyper-temporal NDVI analysis approach explained 74% and 76% of the variability in reported crop statistics in two rice crop and three

  19. Single event time series analysis in a binary karst catchment evaluated using a groundwater model (Lurbach system, Austria)

    PubMed Central

    Mayaud, C.; Wagner, T.; Benischke, R.; Birk, S.

    2014-01-01

    Summary The Lurbach karst system (Styria, Austria) is drained by two major springs and replenished by both autogenic recharge from the karst massif itself and a sinking stream that originates in low permeable schists (allogenic recharge). Detailed data from two events recorded during a tracer experiment in 2008 demonstrate that an overflow from one of the sub-catchments to the other is activated if the discharge of the main spring exceeds a certain threshold. Time series analysis (autocorrelation and cross-correlation) was applied to examine to what extent the various available methods support the identification of the transient inter-catchment flow observed in this binary karst system. As inter-catchment flow is found to be intermittent, the evaluation was focused on single events. In order to support the interpretation of the results from the time series analysis a simplified groundwater flow model was built using MODFLOW. The groundwater model is based on the current conceptual understanding of the karst system and represents a synthetic karst aquifer for which the same methods were applied. Using the wetting capability package of MODFLOW, the model simulated an overflow similar to what has been observed during the tracer experiment. Various intensities of allogenic recharge were employed to generate synthetic discharge data for the time series analysis. In addition, geometric and hydraulic properties of the karst system were varied in several model scenarios. This approach helps to identify effects of allogenic recharge and aquifer properties in the results from the time series analysis. Comparing the results from the time series analysis of the observed data with those of the synthetic data a good agreement was found. For instance, the cross-correlograms show similar patterns with respect to time lags and maximum cross-correlation coefficients if appropriate hydraulic parameters are assigned to the groundwater model. The comparable behaviors of the real and
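Cross-correlation analysis of the kind applied to the allogenic recharge and spring discharge reduces to locating the lag that maximizes the normalized cross-correlation. A sketch with a synthetic 3-step delay (the series and delay are illustrative, not the Lurbach data):

```python
import numpy as np

def cross_corr(x, y, max_lag):
    """Normalized cross-correlation r_xy(k) for lags 0..max_lag (y lags x)."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    n = len(x)
    return np.array([np.dot(x[:n - k], y[k:]) / n for k in range(max_lag + 1)])

rng = np.random.default_rng(4)
rain = rng.normal(0, 1, 500)                          # input signal (recharge)
spring = np.roll(rain, 3) + rng.normal(0, 0.2, 500)   # spring responds 3 steps later
r = cross_corr(rain, spring, 10)
print(f"lag of maximum correlation: {np.argmax(r)}")
```

Comparing such correlograms computed from observed discharge with those from model-generated synthetic discharge is exactly the evaluation strategy the abstract describes.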

  20. Single event time series analysis in a binary karst catchment evaluated using a groundwater model (Lurbach system, Austria).

    PubMed

    Mayaud, C; Wagner, T; Benischke, R; Birk, S

    2014-04-16

    The Lurbach karst system (Styria, Austria) is drained by two major springs and replenished by both autogenic recharge from the karst massif itself and a sinking stream that originates in low permeable schists (allogenic recharge). Detailed data from two events recorded during a tracer experiment in 2008 demonstrate that an overflow from one of the sub-catchments to the other is activated if the discharge of the main spring exceeds a certain threshold. Time series analysis (autocorrelation and cross-correlation) was applied to examine to what extent the various available methods support the identification of the transient inter-catchment flow observed in this binary karst system. As inter-catchment flow is found to be intermittent, the evaluation was focused on single events. In order to support the interpretation of the results from the time series analysis a simplified groundwater flow model was built using MODFLOW. The groundwater model is based on the current conceptual understanding of the karst system and represents a synthetic karst aquifer for which the same methods were applied. Using the wetting capability package of MODFLOW, the model simulated an overflow similar to what has been observed during the tracer experiment. Various intensities of allogenic recharge were employed to generate synthetic discharge data for the time series analysis. In addition, geometric and hydraulic properties of the karst system were varied in several model scenarios. This approach helps to identify effects of allogenic recharge and aquifer properties in the results from the time series analysis. Comparing the results from the time series analysis of the observed data with those of the synthetic data a good agreement was found. For instance, the cross-correlograms show similar patterns with respect to time lags and maximum cross-correlation coefficients if appropriate hydraulic parameters are assigned to the groundwater model. The comparable behaviors of the real and the

  1. Single event time series analysis in a binary karst catchment evaluated using a groundwater model (Lurbach system, Austria)

    NASA Astrophysics Data System (ADS)

    Mayaud, C.; Wagner, T.; Benischke, R.; Birk, S.

    2014-04-01

    The Lurbach karst system (Styria, Austria) is drained by two major springs and replenished by both autogenic recharge from the karst massif itself and a sinking stream that originates in low permeable schists (allogenic recharge). Detailed data from two events recorded during a tracer experiment in 2008 demonstrate that an overflow from one of the sub-catchments to the other is activated if the discharge of the main spring exceeds a certain threshold. Time series analysis (autocorrelation and cross-correlation) was applied to examine to what extent the various available methods support the identification of the transient inter-catchment flow observed in this binary karst system. As inter-catchment flow is found to be intermittent, the evaluation was focused on single events. In order to support the interpretation of the results from the time series analysis a simplified groundwater flow model was built using MODFLOW. The groundwater model is based on the current conceptual understanding of the karst system and represents a synthetic karst aquifer for which the same methods were applied. Using the wetting capability package of MODFLOW, the model simulated an overflow similar to what has been observed during the tracer experiment. Various intensities of allogenic recharge were employed to generate synthetic discharge data for the time series analysis. In addition, geometric and hydraulic properties of the karst system were varied in several model scenarios. This approach helps to identify effects of allogenic recharge and aquifer properties in the results from the time series analysis. Comparing the results from the time series analysis of the observed data with those of the synthetic data a good agreement was found. For instance, the cross-correlograms show similar patterns with respect to time lags and maximum cross-correlation coefficients if appropriate hydraulic parameters are assigned to the groundwater model. The comparable behaviors of the real and the

  2. Task Analysis: A Top-Down Approach.

    ERIC Educational Resources Information Center

    Harmon, Paul

    1983-01-01

    This approach to task analysis includes descriptions of (1) inputs, outputs, and jobs; (2) flow of materials and decisions between jobs; (3) inputs, major tasks, and outputs of each job; (4) sequence of steps for major tasks; (5) heuristics/algorithms for each sequence step; and (6) information needed to use heuristics algorithms. (EAO)

  3. Heterogeneous Factor Analysis Models: A Bayesian Approach.

    ERIC Educational Resources Information Center

    Ansari, Asim; Jedidi, Kamel; Dube, Laurette

    2002-01-01

    Developed Markov Chain Monte Carlo procedures to perform Bayesian inference, model checking, and model comparison in heterogeneous factor analysis. Tested the approach with synthetic data and data from a consumption emotion study involving 54 consumers. Results show that traditional psychometric methods cannot fully capture the heterogeneity in…

  4. A Mellin transform approach to wavelet analysis

    NASA Astrophysics Data System (ADS)

    Alotta, Gioacchino; Di Paola, Mario; Failla, Giuseppe

    2015-11-01

    The paper proposes a fractional calculus approach to continuous wavelet analysis. Upon introducing a Mellin transform expression of the mother wavelet, it is shown that the wavelet transform of an arbitrary function f(t) can be given a fractional representation involving a suitable number of Riesz integrals of f(t), and corresponding fractional moments of the mother wavelet. This result serves as a basis for an original approach to wavelet analysis of linear systems under arbitrary excitations. In particular, using the proposed fractional representation for the wavelet transform of the excitation, it is found that the wavelet transform of the response can readily be computed by a Mellin transform expression, with fractional moments obtained from a set of algebraic equations whose coefficient matrix applies for any scale a of the wavelet transform. Robustness and computational efficiency of the proposed approach are demonstrated in the paper.

  5. The Terror Attacks of 9/11 and Suicides in Germany: A Time Series Analysis.

    PubMed

    Medenwald, Daniel

    2016-04-01

    Data on the effect of the September 11, 2001 (9/11) terror attacks on suicide rates remain inconclusive. Reportedly, even people located far from the attack site have considerable potential for personalizing the events that occurred on 9/11. Durkheim's theory states that suicides decrease during wartime; thus, a decline in suicides might have been expected after 9/11. We conducted a time series analysis of 164,136 officially recorded suicides in Germany between 1995 and 2009 using the algorithm introduced by Box and Jenkins. Compared with the average death rate, we observed no relevant change in the suicide rate of either sex after 9/11. Our estimates of an excess of suicides approached the null effect value on and within a 7-day period after 9/11, which also held when subsamples of deaths in urban or rural settings were examined. No evidence supporting Durkheim's theory in connection with the 9/11 attacks was found in this sample. PMID:27082561

  6. Assessing coal-mine safety regulation: A pooled time-series analysis

    SciTech Connect

    Chun Youngpyoung.

    1991-01-01

    This study attempts to assess the independent, relative, and conjoint effects of four types of variables on coal-mine safety: administrative (mine inspections, mine investigations, and mine safety grants); political (state party competition, gubernatorial party affiliation, and deregulation); economic (state per-capita income and unemployment rates); task-related (mine size, technology, and type of mining); and state dummy variables. Trend, Pearson correlation, and pooled time-series analyses are performed on fatal and nonfatal injury rates reported in 25 coal-producing states during the 1975-1985 time period. These are then interpreted in light of three competing theories of regulation: capture, nonmarket failure, and threshold. Analysis reveals: (1) distinctions in the total explanatory power of the model across different types of injuries, as well as across presidential administrations; (2) a consistently more powerful impact on safety of informational implementation tools (safety education grants) over command-and-control approaches (inspections and investigations) or political variables; and (3) limited, albeit conjectural, support for a threshold theory of regulation in the coal-mine safety arena.

  7. A Physiological Time Series Dynamics-Based Approach to Patient Monitoring and Outcome Prediction

    PubMed Central

    Lehman, Li-Wei H.; Adams, Ryan P.; Mayaud, Louis; Moody, George B.; Malhotra, Atul; Mark, Roger G.; Nemati, Shamim

    2015-01-01

    Cardiovascular variables such as heart rate (HR) and blood pressure (BP) are regulated by an underlying control system, and therefore, the time series of these vital signs exhibit rich dynamical patterns of interaction in response to external perturbations (e.g., drug administration), as well as pathological states (e.g., onset of sepsis and hypotension). A question of interest is whether “similar” dynamical patterns can be identified across a heterogeneous patient cohort, and be used for prognosis of patients’ health and progress. In this paper, we used a switching vector autoregressive framework to systematically learn and identify a collection of vital sign time series dynamics, which are possibly recurrent within the same patient and may be shared across the entire cohort. We show that these dynamical behaviors can be used to characterize the physiological “state” of a patient. We validate our technique using simulated time series of the cardiovascular system, and human recordings of HR and BP time series from an orthostatic stress study with known postural states. Using the HR and BP dynamics of an intensive care unit (ICU) cohort of over 450 patients from the MIMIC II database, we demonstrate that the discovered cardiovascular dynamics are significantly associated with hospital mortality (dynamic modes 3 and 9, p = 0.001, p = 0.006 from logistic regression after adjusting for the APACHE scores). Combining the dynamics of BP time series and SAPS-I or APACHE-III provided a more accurate assessment of patient survival/mortality in the hospital than using SAPS-I and APACHE-III alone (p = 0.005 and p = 0.045). Our results suggest that the discovered dynamics of vital sign time series may contain additional prognostic value beyond that of the baseline acuity measures, and can potentially be used as an independent predictor of outcomes in the ICU. PMID:25014976
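The building block of the switching framework is a vector autoregressive model of the vital signs; each "dynamic mode" is one such model. A single (non-switching) VAR(1) fit by least squares, on simulated HR/BP-like data with a hypothetical coupling matrix, looks like:

```python
import numpy as np

rng = np.random.default_rng(5)
A = np.array([[0.9, 0.05],
              [0.1, 0.8]])       # hypothetical HR/BP coupling matrix
x = np.zeros((500, 2))
for t in range(1, 500):
    x[t] = A @ x[t - 1] + rng.normal(0, 0.1, 2)   # VAR(1) simulation

# least-squares estimate of the dynamics matrix from the observed series:
# solve x[t] ≈ A_hat @ x[t-1] over all consecutive pairs
X, Y = x[:-1], x[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
print(np.round(A_hat, 2))
```

The switching version additionally infers, at each time step, which of a shared library of such matrices is active, and the time spent in each mode is what carries the prognostic information.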

  8. Influence of sampling intake position on suspended solid measurements in sewers: two probability/time-series-based approaches.

    PubMed

    Sandoval, Santiago; Bertrand-Krajewski, Jean-Luc

    2016-06-01

    Total suspended solid (TSS) measurements in urban drainage systems are required for several reasons. Aiming to assess uncertainties in the mean TSS concentration due to the influence of sampling intake vertical position and vertical concentration gradients in a sewer pipe, two methods are proposed: a simplified method based on a theoretical vertical concentration profile (SM) and a time series grouping method (TSM). SM is based on flow rate and water depth time series. TSM requires additional TSS time series as input data. All time series are from the Chassieu urban catchment in Lyon, France (time series from 2007 with 2-min time step, 89 rainfall events). The probability of measuring a TSS value lower than the mean TSS along the vertical cross section (TSS underestimation) is about 0.88 with SM and about 0.64 with TSM. TSM shows more realistic TSS underestimation values (about 39 %) than SM (about 269 %). Interquartile ranges (IQR) over the probability values indicate that SM is more uncertain (IQR = 0.08) than TSM (IQR = 0.02). Differences between the two methods are mainly due to simplifications in SM (absence of TSS measurements). SM assumes a significant asymmetry of the TSS concentration profile along the vertical axis in the cross section. This is compatible with the distribution of TSS measurements found in the TSM approach. The methods provide insights towards an indicator of the measurement performance and representativeness for a TSS sampling protocol. PMID:27178049
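The SM-style reasoning (compare a point sample against the depth-averaged mean of an assumed vertical concentration profile) can be mimicked numerically; the exponential profile below is a hypothetical stand-in for the theoretical profile used in the paper:

```python
import numpy as np

# hypothetical TSS concentration profile over normalized depth
# (higher concentration near the bed, as in a sewer cross section)
z = np.linspace(0.01, 1.0, 500)      # 0 = bed, 1 = surface
profile = np.exp(-2 * z)             # assumed vertical concentration gradient
mean_c = profile.mean()              # cross-section mean concentration

# fraction of intake positions whose sample underestimates the mean
p_under = (profile < mean_c).mean()
print(f"P(underestimate) = {p_under:.2f}")
```

With an asymmetric profile like this one, most intake positions read below the cross-section mean, which is the mechanism behind the high underestimation probabilities reported above.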

  9. An Integrated Approach to Life Cycle Analysis

    NASA Technical Reports Server (NTRS)

    Chytka, T. M.; Brown, R. W.; Shih, A. T.; Reeves, J. D.; Dempsey, J. A.

    2006-01-01

    Life Cycle Analysis (LCA) is the evaluation of the impacts that design decisions have on a system and provides a framework for identifying and evaluating design benefits and burdens associated with the life cycles of space transportation systems from a "cradle-to-grave" approach. Sometimes called life cycle assessment, life cycle approach, or "cradle-to-grave analysis", it represents a rapidly emerging family of tools and techniques designed to be a decision support methodology and aid in the development of sustainable systems. The implementation of a Life Cycle Analysis can vary and may take many forms, from global system-level uncertainty-centered analysis to the assessment of individualized discriminatory metrics. This paper will focus on a proven LCA methodology developed by the Systems Analysis and Concepts Directorate (SACD) at NASA Langley Research Center to quantify and assess key LCA discriminatory metrics, in particular affordability, reliability, maintainability, and operability. This paper will address issues inherent in Life Cycle Analysis including direct impacts, such as system development cost and crew safety, as well as indirect impacts, which often take the form of coupled metrics (i.e., the cost of system unreliability). Since LCA deals with the analysis of space vehicle system conceptual designs, it is imperative to stress that the goal of LCA is not to arrive at the answer but, rather, to provide important inputs to a broader strategic planning process, allowing managers to make risk-informed decisions and increase the likelihood of meeting mission success criteria.

  10. Mobile Visualization and Analysis Tools for Spatial Time-Series Data

    NASA Astrophysics Data System (ADS)

    Eberle, J.; Hüttich, C.; Schmullius, C.

    2013-12-01

    The Siberian Earth System Science Cluster (SIB-ESS-C) provides access and analysis services for spatial time-series data built on products from the Moderate Resolution Imaging Spectroradiometer (MODIS) and climate data from meteorological stations. A web portal for data access, visualization, and analysis with standards-compliant web services has already been developed for SIB-ESS-C. As a further enhancement, a mobile app was developed to provide easy access to these time-series data during field campaigns. The app sends the current position from the GPS receiver and a user-selected dataset (such as land surface temperature or vegetation indices) to the SIB-ESS-C web service and receives the requested time-series data for the identified pixel in real time. The data are then plotted directly in the app. The user can also analyze the time-series data for breakpoints and other phenological values. These analyses are executed on demand on the SIB-ESS-C web server, and the results are transferred to the app. Any processing can also be done at the SIB-ESS-C web portal. The aim of this work is to make spatial time-series data and analysis functions available to end users without the need for data processing. In this presentation the authors give an overview of this new mobile app: its functionality, the technical infrastructure, and the experience gained during development.

  11. Approaches to Literature through Theme. The Oryx Reading Motivation Series No. 1.

    ERIC Educational Resources Information Center

    Montgomery, Paula Kay

    Intended to help teachers and librarians inspire students in grades 5-9 to read and keep reading, this book provides literature theme approaches and teaching strategies for reading and studying literature. Chapter 1 discusses approaches, methods, techniques, and strategies in using literature approaches to motivate reading. Chapter 2 defines a…

  12. The Reggio Emilia Approach to Early Years Education. Early Education Support Series.

    ERIC Educational Resources Information Center

    Valentine, Marianne

    Noting that the approach to early childhood education from the northern Italian town of Reggio Emilia has become renowned worldwide, this report explains the approach and explores the possible translation or adaptation of aspects of this pedagogical approach to Scotland. Following an introduction, the report is presented in three parts. Part 1…

  13. Modified cross sample entropy and surrogate data analysis method for financial time series

    NASA Astrophysics Data System (ADS)

    Yin, Yi; Shang, Pengjian

    2015-09-01

    For researching multiscale behaviors from the angle of entropy, we propose a modified cross sample entropy (MCSE) and combine it with surrogate data analysis in order to compute entropy differences between the original dynamics and surrogate series (MCSDiff). MCSDiff is applied to simulated signals to demonstrate its accuracy and is then employed for the US and Chinese stock markets. We illustrate the presence of multiscale behavior in the MCSDiff results and reveal that there is synchrony contained in the original financial time series and that the series have some intrinsic relations, which are destroyed by surrogate data analysis. Furthermore, the multifractal behaviors of the cross-correlations between these financial time series are investigated by the multifractal detrended cross-correlation analysis (MF-DCCA) method, since multifractal analysis is a multiscale analysis. We explore the multifractal properties of the cross-correlations between the US and Chinese markets and show the distinctiveness of NQCI and HSI among the markets in their own regions. It can be concluded that the weaker cross-correlation among the US markets gives evidence for a better inner mechanism in the US stock markets than in the Chinese stock markets. Studying the multiscale features and properties of financial time series can provide valuable information for understanding the inner mechanism of financial markets.
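The classical cross sample entropy that MCSE modifies counts matching templates between two standardized series; a compact O(n²) sketch (not the modified statistic itself) is:

```python
import numpy as np

def cross_sampen(x, y, m=2, r=0.2):
    """Cross sample entropy between standardized series x and y
    (template length m, tolerance r): -ln(B/A), where A and B count
    template pairs matching at lengths m and m+1, respectively."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()

    def matches(length):
        n = min(len(x), len(y)) - length
        xt = np.array([x[i:i + length] for i in range(n)])
        yt = np.array([y[i:i + length] for i in range(n)])
        # Chebyshev distance between every template pair
        dist = np.max(np.abs(xt[:, None, :] - yt[None, :, :]), axis=2)
        return (dist <= r).sum()

    return -np.log(matches(m + 1) / matches(m))

# two nearly synchronous signals yield a small (near-zero) entropy value
t = np.linspace(0, 20 * np.pi, 400)
val = cross_sampen(np.sin(t), np.sin(t + 0.1))
print(f"cross-SampEn: {val:.3f}")
```

Lower values indicate stronger synchrony between the two series, which is why destroying the cross-structure with surrogate data raises the entropy and makes MCSDiff informative.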

  14. Holistic approach to analysis of medical data: vulvar cancer.

    PubMed

    Buković, D; Rudan, I; Ivanisević, M; Sostarić, S; Rubala, D

    1997-06-01

    This paper continues a series of studies introducing a holistic approach to the analysis of clinical data. Besides information regarding his/her disease, each hospitalized cancer patient also provides a variety of data regarding his/her psychological, cultural, social, economic, genetic, constitutional and medical background. The aim of this study was to introduce a holistic approach to the analysis of medical data, in this case clinical data regarding cancer of the vulva. Such an approach requires the collection of data regarding different aspects of the cancer patients and, after a satisfactory sample size is obtained (which should be at least five times greater than the number of examined patient characteristics), the performance of factor analysis. In this study, the authors processed data on 25 characteristics of all 755 vulvar cancer patients treated between 1938 and 1990 at the Department for Gynecological Oncology of the University Hospital for Gynecology and Obstetrics, Zagreb, Croatia. In the factor analysis, the principal components were rotated after the initial extraction (the authors recommend the use of oblimin rotation) in order to obtain a better basis for interpretation of the results. The next step in this approach was the stepwise exclusion of the characteristics with the smallest communality according to Kaiser-Meyer-Olkin criteria, retaining the characteristics and components with the most significant impact on the explained system variance. When the number of principal components and initially analyzed characteristics was reduced to 3-4 and 7-10, respectively, the ultimate interpretations and conclusions were made. This approach outlined clusters of correlations between medical data which are difficult to identify using other statistical procedures, primarily the impacts of various socioeconomic and hereditary-constitutional variables on overall survival. PMID:9225511

  15. Series-hybrid bearing - An approach to extending bearing fatigue life at high speeds

    NASA Technical Reports Server (NTRS)

    Anderson, W. J.; Coe, H. H.; Fleming, D. P.; Parker, R. J.

    1971-01-01

    The fluid-film component of this hybrid device consists of an orifice-compensated annular thrust bearing and a self-acting journal bearing. In the series-hybrid bearing, both the ball bearing and the annular thrust bearing carry the full system thrust load, but the two bearings share the speed. Operation of the system is stable and automatically fail-safe.

  16. Discussion Guide for Film Clip Series--"The Team Approach in Education: Twenty Questions on Film."

    ERIC Educational Resources Information Center

    Bowman, Garda W.; And Others

    This discussion guide is part of a multi-media package of audiovisual and written materials designed to assist trainers of teams in a school setting, particularly for use with teams of teachers and auxiliaries (paraprofessionals). The purpose of the film clip series--to stimulate discussion that is geared to problem solving--is discussed, and the…

  17. Multivariate analysis: A statistical approach for computations

    NASA Astrophysics Data System (ADS)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, educational evaluation, clustering in finance, and more recently in the health-related professions. The objective of the paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis provides an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks such as DDoS attacks and network scanning.
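
    The correlation-coefficient-matrix idea can be illustrated with a simple sketch: score each window of multivariate traffic features by how far its correlation matrix drifts from a baseline. The non-overlapping windowing and Frobenius-norm score here are assumptions for illustration, not the paper's exact method:

```python
import numpy as np

def correlation_anomaly_scores(X, window=50):
    """Score each non-overlapping window of multivariate data by how
    far its correlation matrix drifts from the full-sample baseline
    (Frobenius norm of the matrix difference)."""
    X = np.asarray(X, float)
    baseline = np.corrcoef(X.T)
    scores = []
    for start in range(0, len(X) - window + 1, window):
        c = np.corrcoef(X[start:start + window].T)
        scores.append(np.linalg.norm(c - baseline))
    return np.array(scores)
```

    Windows whose score stands far above the rest are candidates for anomalies such as scans or flooding attacks that disturb the usual inter-feature correlations.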

  18. A Molecular Mechanics Approach to Modeling Protein-Ligand Interactions: Relative Binding Affinities in Congeneric Series

    PubMed Central

    Rapp, Chaya S.; Kalyanaraman, Chakrapani; Schiffmiller, Aviva; Schoenbrun, Esther Leah; Jacobson, Matthew P.

    2011-01-01

    We introduce the “Prime-ligand” method for ranking ligands in congeneric series. The method employs a single scoring function, the OPLS-AA/GBSA molecular mechanics/implicit solvent model, for all stages of sampling and scoring. We evaluate the method using 12 test sets of congeneric series for which experimental binding data are available in the literature, as well as the structure of one member of the series bound to the protein. Ligands are ‘docked’ by superimposing a common stem fragment among the compounds in the series using a crystal complex from the Protein Data Bank and sampling the conformational space of the variable region. Our results show good correlation between our predicted rankings and experimental data for cases in which binding affinities differ by at least one order of magnitude. For 11 out of 12 cases, >90% of such ligand pairs could be correctly ranked, while for the remaining case, Factor Xa, 76% of such pairs were correctly ranked. A small number of compounds could not be docked using the current protocol due to large functional groups that could not be accommodated by a rigid receptor. CPU requirements for the method, on the order of CPU-minutes per ligand, are modest compared with more rigorous methods that use similar force fields, such as free energy perturbation. We also benchmark the scoring function using series of ligands bound to the same protein within the CSAR data set. We demonstrate that energy minimization of the ligands in the crystal structures is critical to obtain any correlation with experimentally determined binding affinities. PMID:21780805

  19. Anti-persistence in levels of Lake Naivasha: Assessing effect of human intervention through time-series analysis

    NASA Astrophysics Data System (ADS)

    Tarafdar, Sujata; Harper, David

    2008-01-01

    Lake Naivasha in Kenya is an important natural fresh water reserve, supporting surrounding wildlife as well as agriculture and industry. Uncontrolled use of the lake water for the past few decades is causing concern for environmentalists. In the present paper, fluctuations in the lake level over the last half century are analysed using standard tools of time-series analysis. The intervals 1951-1980 (period I) and 1981-2000 (period II) are treated separately, to look for any difference in their statistical patterns. From period II onwards, increased human consumption is believed to affect the level significantly. We analyse the data using three different approaches: (i) rescaled range analysis (R/S), (ii) roughness scaling analysis and (iii) a Lomb periodogram. R/S analysis shows no difference between the behavior in periods I and II, but the other methods reveal different fluctuation patterns for the two periods. The water level shows stronger fluctuations in period I compared to II. R/S analysis, however, shows an interesting anti-persistence with a Hurst exponent of 0.44, which is not usually observed in natural time series.
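
    A minimal sketch of rescaled-range (R/S) analysis as used above; the chunk sizes and fitting details are conventional choices, not necessarily those of the paper. An estimated slope H below 0.5 indicates the kind of anti-persistence reported for the lake levels:

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Estimate the Hurst exponent by rescaled-range (R/S) analysis.

    For each chunk size n, the series is split into non-overlapping
    chunks; R/S is the range of the cumulative mean-adjusted sum
    divided by the chunk's standard deviation. H is the slope of
    log(mean R/S) versus log(n)."""
    x = np.asarray(x, float)
    sizes = np.unique(np.logspace(np.log10(min_chunk),
                                  np.log10(len(x) // 2), 10).astype(int))
    rs = []
    for n in sizes:
        vals = []
        for start in range(0, len(x) - n + 1, n):
            chunk = x[start:start + n]
            z = np.cumsum(chunk - chunk.mean())   # mean-adjusted profile
            s = chunk.std()
            if s > 0:
                vals.append((z.max() - z.min()) / s)
        rs.append(np.mean(vals))
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope
```

    For white noise the estimate sits near 0.5 (finite-sample bias pushes it slightly higher), while a persistent series such as a random walk yields an estimate near 1.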

  20. A convolution integral approach for performance assessments with uncertainty analysis

    SciTech Connect

    Dawoud, E.; Miller, L.F.

    1999-09-01

    Performance assessments that include uncertainty analyses and risk assessments are typically not obtained for time-dependent releases of radioactive contaminants to the geosphere when a series of sequentially coupled transport models is required for determining results. This is due, in part, to the geophysical complexity of the site, and to the numerical complexity of the fate and transport models. The lack of a practical tool for linking the transport models in a fashion that facilitates uncertainty analysis is another reason for not performing uncertainty analyses in these studies. The multiconvolution integral (MCI) approach presented herein greatly facilitates the practicality of incorporating uncertainty analyses into performance assessments. In this research an MCI approach is developed, and the decoupling of fate and transport processes into an independent system is described. A conceptual model, extracted from the Inactive Tanks project at the Oak Ridge National Laboratory (ORNL), is used to demonstrate the approach. Numerical models are used for transport of ⁹⁰Sr from a disposal facility, WC-1 at ORNL, through the vadose and saturated zones to a downgradient point at Fifth Creek, and an analytical surface water model is used to transport the contaminants to a downstream potential receptor point at White Oak Creek. The probability density functions of the final concentrations obtained by the MCI approach are in excellent agreement with those obtained by a Monte Carlo approach that propagated uncertainties through all submodels for each random sample.
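
    The idea of propagating uncertainty through decoupled submodels by convolution can be sketched for the simplest case, the sum of two independent random inputs, and cross-checked against Monte Carlo sampling. The Gaussian inputs below are illustrative stand-ins, not the ORNL transport models:

```python
import numpy as np

rng = np.random.default_rng(0)
dx = 0.01
x = np.arange(-8, 8, dx)

# two independent Gaussian "submodel" uncertainties (illustrative)
pdf1 = np.exp(-0.5 * (x - 1) ** 2) / np.sqrt(2 * np.pi)
pdf2 = np.exp(-0.5 * (x + 2) ** 2 / 0.5 ** 2) / (0.5 * np.sqrt(2 * np.pi))

# convolution integral: pdf of the combined (summed) output
pdf_sum = np.convolve(pdf1, pdf2) * dx
x_sum = np.arange(len(pdf_sum)) * dx + 2 * x[0]  # support of the sum

# Monte Carlo check: propagate samples through both "submodels"
samples = rng.normal(1, 1, 200_000) + rng.normal(-2, 0.5, 200_000)
mc_mean = samples.mean()
conv_mean = np.sum(x_sum * pdf_sum) * dx
```

    The two means agree, and the convolution route needs no repeated sampling of the coupled chain, which is the practical advantage the MCI approach exploits.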

  1. Daily water and sediment discharges from selected rivers of the eastern United States; a time-series modeling approach

    USGS Publications Warehouse

    Fitzgerald, Michael G.; Karlinger, Michael R.

    1983-01-01

    Time-series models were constructed for analysis of daily runoff and sediment discharge data from selected rivers of the Eastern United States. Logarithmic transformation and first-order differencing of the data sets were necessary to produce second-order stationary time series and remove seasonal trends. Cyclic models accounted for less than 42 percent of the variance in the water series and 31 percent in the sediment series. Analysis of the apparent oscillations of given frequencies occurring in the data indicates that frequently occurring storms can account for as much as 50 percent of the variation in sediment discharge. Components of the frequency analysis indicate that a linear representation is reasonable for the water-sediment system. Models that incorporate lagged water discharge as input prove superior to univariate techniques in modeling and prediction of sediment discharges. The random component of the models includes errors in measurement and model hypothesis and indicates no serial correlation. An index of sediment production within or between drainage basins can be calculated from model parameters.
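
    The preprocessing described above (logarithmic transformation followed by first-order differencing) is easy to sketch; `invert` shows how a modeled differenced series maps back to discharge units. Function names are illustrative:

```python
import numpy as np

def to_stationary(q):
    """Log-transform and first-difference a daily discharge series --
    the preprocessing used above to stabilise variance and remove
    seasonal trend before fitting a time-series model."""
    return np.diff(np.log(np.asarray(q, float)))

def invert(last_log_q, dq):
    """Undo the differencing/log transform: recover the discharge
    path implied by a series of log-differences dq starting from
    the log of the last observed discharge."""
    return np.exp(last_log_q + np.cumsum(dq))
```

    The round trip is exact: differencing a series and re-integrating from its first value reproduces the original discharges.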

  2. Determinants of healthcare expenditures in Iran: evidence from a time series analysis

    PubMed Central

    Rezaei, Satar; Fallah, Razieh; Kazemi Karyani, Ali; Daroudi, Rajabali; Zandiyan, Hamed; Hajizadeh, Mohammad

    2016-01-01

    Background: A dramatic increase in healthcare expenditures is a major health policy concern worldwide. Understanding the factors that underlie the growth in healthcare expenditures is essential to assist decision-makers in finding the best policies to manage healthcare costs. We aimed to examine the determinants of healthcare spending in Iran over the period 1978-2011. Methods: A time series analysis was used to examine the effect of selected socio-economic, demographic and health service inputs on per capita healthcare expenditures (HCE) in Iran from 1978 to 2011. Data were retrieved from the Central Bank of Iran, the Iranian Statistical Center and the World Bank. An autoregressive distributed lag approach and an error correction method were employed to examine long- and short-run effects of covariates. Results: Our findings indicated that GDP per capita, degree of urbanization and illiteracy rate increase healthcare expenditures, while physicians per 10,000 population and the proportion of the population aged ≥65 years decrease healthcare expenditures. In addition, we found that healthcare spending is a "necessity good" with long- and short-run income (GDP per capita) elasticities of 0.46 (p<0.01) and 0.67 (p = 0.01), respectively. Conclusion: Our analysis identified GDP per capita, illiteracy rate, degree of urbanization and number of physicians as some of the driving forces behind the persistent increase in HCE in Iran. These findings provide important insights into the growth in HCE in Iran. In addition, since we found that health spending is a "necessity good" in Iran, healthcare services should thus be the object of public funding and government intervention. PMID:27390683
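
    As a toy illustration of income elasticity (the paper's ARDL/error-correction machinery is omitted), a long-run elasticity can be read off as the slope of a log-log regression of spending on income; values below 1 correspond to the paper's "necessity good" finding. The data below are synthetic:

```python
import numpy as np

def income_elasticity(hce, gdp):
    """OLS slope of ln(HCE) on ln(GDP per capita): the income
    elasticity of healthcare spending. A value below 1 marks
    spending as a "necessity good"."""
    slope, _ = np.polyfit(np.log(gdp), np.log(hce), 1)
    return slope
```

    With data generated as hce ∝ gdp^0.6, the estimator recovers an elasticity near 0.6, i.e. a necessity good in this classification.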

  3. Analysis of initial drainage network evolution from aerial photography and a DEM time series

    NASA Astrophysics Data System (ADS)

    Schneider, Anna; Gerke, Horst H.; Maurer, Thomas; Nenov, Rossen; Raab, Thomas

    2013-04-01

    The evolution of erosion rill or gully networks is a formative process in initial landscape development. Digital representations of drainage networks are often derived from Digital Elevation Models (DEMs) based on morphometric parameters, or mapped in field surveys or from aerial photographs. This study attempted to reconstruct and analyze the first five years of erosion rill network evolution in the 6 ha artificial catchment 'Hühnerwasser', which serves as a real-world laboratory to study patterns and processes of initial ecosystem development. The drainage network was characterized in a twofold approach, based on the analysis of remotely-sensed data. We used high-resolution drone-based aerial photographs to map the actively eroding rill network for four states of development, and a time series of ten Digital Elevation Models to characterize the morphology of the surface. Rill network maps and morphometric parameters were combined to allow for region-specific analyses of morphometry for different parts of the rill network. After a rapid growth of the erosion rill network during the first two years of development, a reduction of the area of actively eroding rills was observed. Region-specific analysis of morphometry indicates an increase in flow accumulation in the central parts of the rill network, which suggests that locally evolving feedback cycles between flow accumulation and erosion affected rill network development, in addition to the effects of precipitation characteristics and the growth of vegetation cover. The combination of drainage network characterization from aerial photography and DEMs could improve analyses of initial drainage network development in experimental studies, as it allows for critical comparisons of flow accumulation patterns and the actual patterns of erosion rills or gullies.

  4. An Automated Approach to Map the History of Forest Disturbance from Insect Mortality and Harvest with Landsat Time-Series Data

    NASA Technical Reports Server (NTRS)

    Rudasill-Neigh, Christopher S.; Bolton, Douglas K.; Diabate, Mouhamad; Williams, Jennifer J.; Carvalhais, Nuno

    2014-01-01

    Forests contain a majority of the aboveground carbon (C) found in ecosystems, and understanding biomass lost to disturbance is essential to improving our C-cycle knowledge. Our study region in the Wisconsin and Minnesota Laurentian Forest had a strong decline in Normalized Difference Vegetation Index (NDVI) from 1982 to 2007, observed with the National Oceanic and Atmospheric Administration's (NOAA) series of Advanced Very High Resolution Radiometers (AVHRR). To understand the potential role of disturbances in the terrestrial C-cycle, we developed an algorithm to map forest disturbances from either harvest or insect outbreak in Landsat time-series stacks. We merged two image analysis approaches into one algorithm to monitor forest change: (1) multiple disturbance index thresholds to capture clear-cut harvest; and (2) a spectral trajectory-based image analysis with multiple confidence interval thresholds to map insect outbreak. We produced 20 maps and evaluated classification accuracy with air photos and insect air-survey data to understand the performance of our algorithm. We achieved overall accuracies ranging from 65% to 75%, with an average accuracy of 72%. The producer's and user's accuracies ranged from a maximum of 32% to 70% for insect disturbance, 60% to 76% for insect mortality and 82% to 88% for harvested forest, which was the dominant disturbance agent. Forest disturbances accounted for 22% of the total forested area (7349 km2). Our algorithm provides a basic approach to map disturbance history where large impacts to forest stands have occurred and highlights the limited spectral sensitivity of Landsat time-series to outbreaks of defoliating insects. We found that only harvest and insect mortality events can be mapped with adequate accuracy with a non-annual Landsat time-series. This limited our land cover understanding of NDVI decline drivers. We demonstrate that to capture more subtle disturbances with spectral trajectories, future observations…

  5. Detecting Anomaly Regions in Satellite Image Time Series Based on Seasonal Autocorrelation Analysis

    NASA Astrophysics Data System (ADS)

    Zhou, Z.-G.; Tang, P.; Zhou, M.

    2016-06-01

    Anomaly regions in satellite images can reflect unexpected changes of land cover caused by flood, fire, landslide, etc. Detecting anomaly regions in satellite image time series is important for studying the dynamic processes of land cover changes as well as for disaster monitoring. Although several methods have been developed to detect land cover changes using satellite image time series, they are generally designed for detecting inter-annual or abrupt land cover changes and do not focus on detecting spatial-temporal changes in continuous images. In order to identify spatial-temporal dynamic processes of unexpected changes of land cover, this study proposes a method for detecting anomaly regions in each image of a satellite image time series based on seasonal autocorrelation analysis. The method was validated with a case study detecting the spatial-temporal process of a severe flood using a Terra/MODIS image time series. Experiments demonstrated the advantages of the method: (1) it can effectively detect anomaly regions in each image of a satellite image time series, showing the spatially and temporally varying process of the anomaly regions; (2) it can flexibly meet detection-accuracy requirements (e.g., a z-value or significance level), with overall accuracy up to 89% and precision above 90%; and (3) it does not require time-series smoothing and can detect anomaly regions in noisy satellite images with high reliability.
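
    A heavily simplified sketch of the per-season screening idea: a plain z-score test of each observation against the same season in other years, rather than the paper's full seasonal-autocorrelation analysis. The period and threshold are illustrative (46 would match 8-day MODIS composites):

```python
import numpy as np

def seasonal_anomalies(series, period=46, z_thresh=3.0):
    """Flag observations that deviate from the mean of the same
    season across years by more than z_thresh standard deviations.
    A simplified per-pixel screen, not the paper's exact test."""
    x = np.asarray(series, float)
    flags = np.zeros(len(x), bool)
    for s in range(period):
        idx = np.arange(s, len(x), period)  # same season, all years
        mu, sd = x[idx].mean(), x[idx].std()
        if sd > 0:
            flags[idx] = np.abs(x[idx] - mu) > z_thresh * sd
    return flags
```

    Applied per pixel, the flagged time steps would then be grouped spatially into anomaly regions.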

  6. Complexity analysis of the air temperature and the precipitation time series in Serbia

    NASA Astrophysics Data System (ADS)

    Mimić, G.; Mihailović, D. T.; Kapor, D.

    2015-11-01

    In this paper, we have analyzed the time series of daily values for three meteorological elements, two continuous and one discontinuous: the maximum and minimum air temperature and the precipitation. The analysis was based on observations from seven stations in Serbia over the period 1951-2010. The main aim of this paper was to quantify the complexity of the annual values of these time series and to calculate the rate of its change. For that purpose, we used the sample entropy and the Kolmogorov complexity as measures that can indicate the variability and irregularity of a given time series. The results show that the maximum temperature has an increasing trend over the period, indicating a warming in the range of 1-2 °C. The increasing temperature indicates higher internal energy of the atmosphere, changing the weather patterns, which is manifested in the time series. The Kolmogorov complexity of the maximum temperature time series has a statistically significant increasing trend, while the sample entropy has an increasing but statistically insignificant trend. The trends of the complexity measures for the minimum temperature depend on the location. Both complexity measures for the precipitation time series have decreasing trends.

  7. Filter-based multiscale entropy analysis of complex physiological time series.

    PubMed

    Xu, Yuesheng; Zhao, Liang

    2013-08-01

    Multiscale entropy (MSE) has been widely and successfully used in analyzing the complexity of physiological time series. We reinterpret the averaging process in MSE as filtering a time series by a filter of a piecewise constant type. From this viewpoint, we introduce filter-based multiscale entropy (FME), which filters a time series to generate multiple frequency components, and then we compute the blockwise entropy of the resulting components. By choosing filters adapted to the feature of a given time series, FME is able to better capture its multiscale information and to provide more flexibility for studying its complexity. Motivated by the heart rate turbulence theory, which suggests that the human heartbeat interval time series can be described in piecewise linear patterns, we propose piecewise linear filter multiscale entropy (PLFME) for the complexity analysis of the time series. Numerical results from PLFME are more robust to data of various lengths than those from MSE. The numerical performance of the adaptive piecewise constant filter multiscale entropy without prior information is comparable to that of PLFME, whose design takes prior information into account. PMID:24032873
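
    The reinterpretation of MSE's averaging step as a piecewise-constant filter refers to the standard coarse-graining procedure, which is compact enough to state directly:

```python
import numpy as np

def coarse_grain(x, scale):
    """Standard MSE coarse-graining: average non-overlapping windows
    of length `scale`. Equivalent to filtering with a piecewise-
    constant (boxcar) filter and then downsampling -- the viewpoint
    that FME generalises by substituting other filters."""
    x = np.asarray(x, float)
    n = (len(x) // scale) * scale  # drop the incomplete tail window
    return x[:n].reshape(-1, scale).mean(axis=1)
```

    FME replaces this boxcar with filters adapted to the series, e.g. piecewise-linear filters in PLFME, before computing the blockwise entropy of each filtered component.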

  8. Detrended partial cross-correlation analysis of two nonstationary time series influenced by common external forces

    NASA Astrophysics Data System (ADS)

    Qian, Xi-Yuan; Liu, Ya-Min; Jiang, Zhi-Qiang; Podobnik, Boris; Zhou, Wei-Xing; Stanley, H. Eugene

    2015-06-01

    When common factors strongly influence two power-law cross-correlated time series recorded in complex natural or social systems, using detrended cross-correlation analysis (DCCA) without considering these common factors will bias the results. We use detrended partial cross-correlation analysis (DPXA) to uncover the intrinsic power-law cross correlations between two simultaneously recorded time series in the presence of nonstationarity after removing the effects of other time series acting as common forces. The DPXA method is a generalization of the detrended cross-correlation analysis that takes into account partial correlation analysis. We demonstrate the method by using bivariate fractional Brownian motions contaminated with a fractional Brownian motion. We find that the DPXA is able to recover the analytical cross Hurst indices, and thus the multiscale DPXA coefficients are a viable alternative to the conventional cross-correlation coefficient. We demonstrate the advantage of the DPXA coefficients over the DCCA coefficients by analyzing contaminated bivariate fractional Brownian motions. We calculate the DPXA coefficients and use them to extract the intrinsic cross correlation between crude oil and gold futures by taking into consideration the impact of the U.S. dollar index. We develop the multifractal DPXA (MF-DPXA) method in order to generalize the DPXA method and investigate multifractal time series. We analyze multifractal binomial measures masked with strong white noises and find that the MF-DPXA method quantifies the hidden multifractal nature while the multifractal DCCA method fails.
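
    The partial-correlation bookkeeping that defines DPXA is not given in the abstract; as a baseline, the DCCA cross-correlation coefficient that it generalizes can be sketched as follows (window size and detrending order are conventional choices, not values from the paper):

```python
import numpy as np

def dcca_coefficient(x, y, n=20, order=1):
    """Detrended cross-correlation (DCCA) coefficient at scale n --
    the baseline that DPXA extends with partial correlation.

    Both profiles (cumulative sums) are detrended per window with a
    polynomial of the given order; rho is the detrended covariance
    normalised by the two detrended variances."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())
    t = np.arange(n)
    f2xy = f2xx = f2yy = 0.0
    for b in range(len(X) // n):
        xs, ys = X[b * n:(b + 1) * n], Y[b * n:(b + 1) * n]
        rx = xs - np.polyval(np.polyfit(t, xs, order), t)
        ry = ys - np.polyval(np.polyfit(t, ys, order), t)
        f2xy += np.mean(rx * ry)
        f2xx += np.mean(rx * rx)
        f2yy += np.mean(ry * ry)
    return f2xy / np.sqrt(f2xx * f2yy)
```

    DPXA would additionally regress out the common external series (e.g. the U.S. dollar index in the crude oil/gold example) before forming this coefficient.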

  9. Time series analysis of knowledge of results effects during motor skill acquisition.

    PubMed

    Blackwell, J R; Simmons, R W; Spray, J A

    1991-03-01

    Time series analysis was used to investigate the hypothesis that during acquisition of a motor skill, knowledge of results (KR) information is used to generate a stable internal referent about which response errors are randomly distributed. Sixteen subjects completed 50 acquisition trials of each of three movements whose spatial-temporal characteristics differed. Acquisition trials were either blocked, with each movement presented in series, or randomized, with the movements presented in random order. Analysis of movement time data indicated that the contextual interference effect reported in previous studies was replicated in the present experiment. Time series analysis of the acquisition trial data revealed that the majority of individual subject response patterns during blocked trials were best described by a model with a temporarily stationary internal reference of the criterion and systematic, trial-to-trial variation of response errors. During random trial conditions, response patterns were usually best described by a "white-noise" model. This model predicts a permanently stationary internal reference associated with randomly distributed response errors that are unaffected by KR information. These results are not consistent with previous work using time series analysis to describe motor behavior (Spray & Newell, 1986). PMID:2028084

  10. A Comparison of Missing-Data Procedures for Arima Time-Series Analysis

    ERIC Educational Resources Information Center

    Velicer, Wayne F.; Colby, Suzanne M.

    2005-01-01

    Missing data are a common practical problem for longitudinal designs. Time-series analysis is a longitudinal method that involves a large number of observations on a single unit. Four different missing-data methods (deletion, mean substitution, mean of adjacent observations, and maximum likelihood estimation) were evaluated. Computer-generated…

  11. Driver Education Task Analysis: Instructional Objectives. HumRRO Safety Series.

    ERIC Educational Resources Information Center

    McKnight, A. James; Hundt, Alan G.

    Developed from a systematic analysis of driving behaviors, this publication contains a set of instructional objectives for driver education courses and a series of tests designed to measure the degree to which the instructional objectives have been met by students. Part 1 provides a description of objectives for 74 learning units, including such…

  12. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan F.; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik V.; Marwan, Norbert; Dijkstra, Henk A.; Kurths, Jürgen

    2015-11-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology.
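
    One of the series-to-network transforms listed above, the natural visibility graph, is compact enough to sketch in plain Python. pyunicorn's own implementation is optimized; this brute-force O(n²) version is illustrative only:

```python
def visibility_graph(x):
    """Natural visibility graph of a time series: nodes are time
    points, with an edge (i, j) whenever every intermediate point
    lies strictly below the straight line of sight between
    (i, x[i]) and (j, x[j])."""
    n = len(x)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            visible = all(
                x[k] < x[i] + (x[j] - x[i]) * (k - i) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                edges.add((i, j))
    return edges
```

    The degree distribution of this graph then serves as a nonlinear characterization of the series, one of several recurrence- and network-based views the package provides.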

  13. Rationale, Development, and Validation of a Series of Self-Instructional Modules in Interaction Analysis.

    ERIC Educational Resources Information Center

    Suiter, Phil Edward; Queen, Bernard

    This study was designed to develop a series of instructional modules to teach inservice teachers the Flanders System of Interaction Analysis. Instructional modules were constructed based on research information, and then modified from feedback from experts and random trials. Two field-test groups were used to provide data for validation testing,…

  14. Detrended partial cross-correlation analysis of two nonstationary time series influenced by common external forces.

    PubMed

    Qian, Xi-Yuan; Liu, Ya-Min; Jiang, Zhi-Qiang; Podobnik, Boris; Zhou, Wei-Xing; Stanley, H Eugene

    2015-06-01

    When common factors strongly influence two power-law cross-correlated time series recorded in complex natural or social systems, using detrended cross-correlation analysis (DCCA) without considering these common factors will bias the results. We use detrended partial cross-correlation analysis (DPXA) to uncover the intrinsic power-law cross correlations between two simultaneously recorded time series in the presence of nonstationarity after removing the effects of other time series acting as common forces. The DPXA method is a generalization of the detrended cross-correlation analysis that takes into account partial correlation analysis. We demonstrate the method by using bivariate fractional Brownian motions contaminated with a fractional Brownian motion. We find that the DPXA is able to recover the analytical cross Hurst indices, and thus the multiscale DPXA coefficients are a viable alternative to the conventional cross-correlation coefficient. We demonstrate the advantage of the DPXA coefficients over the DCCA coefficients by analyzing contaminated bivariate fractional Brownian motions. We calculate the DPXA coefficients and use them to extract the intrinsic cross correlation between crude oil and gold futures by taking into consideration the impact of the U.S. dollar index. We develop the multifractal DPXA (MF-DPXA) method in order to generalize the DPXA method and investigate multifractal time series. We analyze multifractal binomial measures masked with strong white noises and find that the MF-DPXA method quantifies the hidden multifractal nature while the multifractal DCCA method fails. PMID:26172763

  15. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package.

    PubMed

    Donges, Jonathan F; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik V; Marwan, Norbert; Dijkstra, Henk A; Kurths, Jürgen

    2015-11-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology. PMID:26627561

  16. AAMFT Master Series Tapes: An Analysis of the Inclusion of Feminist Principles into Family Therapy Practice.

    ERIC Educational Resources Information Center

    Haddock, Shelley A.; MacPhee, David; Zimmerman, Toni Schindler

    2001-01-01

    Content analysis of 23 American Association for Marriage and Family Therapy Master Series tapes was used to determine how well feminist behaviors have been incorporated into ideal family therapy practice. Feminist behaviors were infrequent, being evident in fewer than 3% of time blocks in event sampling and 10 of 39 feminist behaviors of the…

  17. A new approach for agroecosystems monitoring using high-revisit multitemporal satellite data series

    NASA Astrophysics Data System (ADS)

    Diez, M.; Moclán, C.; Romo, A.; Pirondini, F.

    2014-10-01

    With increasing population pressure throughout the world and the need for increased agricultural production, there is a definite need for improved management of the world's agricultural resources. Comprehensive, reliable and timely information on agricultural resources is necessary for the implementation of effective management decisions. In that sense, the demand for high-quality and high-frequency geo-information for monitoring of agriculture and its associated ecosystems has been growing in recent decades. Satellite image data enable direct observation of large areas at frequent intervals and therefore allow unprecedented mapping and monitoring of crop evolution. Furthermore, real-time analysis can assist in making timely management decisions that affect the outcome of the crops. The DEIMOS-1 satellite, owned and operated by ELECNOR DEIMOS IMAGING (Spain), provides 22 m, 3-band imagery with a very wide (620-km) swath, and has been specifically designed to produce high-frequency revisit on very large areas. This capability has been proven through the contracts awarded to Airbus Defence and Space every year since 2011, under which DEIMOS-1 has provided the USDA with the bulk of the imagery used to monitor the crop season in the Lower 48, in cooperation with its twin satellite, DMCii's UK-DMC2. Furthermore, high-density agricultural areas have been targeted with increased frequency and analyzed in near real time to closely monitor their evolution. In this paper we present the results obtained from a campaign carried out in 2013 with the DEIMOS-1 and UK-DMC2 satellites. These campaigns provided a high-frequency revisit of target areas, with one image every two days on average: almost a ten-fold frequency improvement with respect to Landsat-8. The results clearly show the effectiveness of a high-frequency monitoring approach with high-resolution images with respect to classic strategies, where results are more exposed to weather conditions.

  18. Evaluation of physiologic complexity in time series using generalized sample entropy and surrogate data analysis

    NASA Astrophysics Data System (ADS)

    Eduardo Virgilio Silva, Luiz; Otavio Murta, Luiz

    2012-12-01

    Complexity in time series is an intriguing feature of living dynamical systems, with potential use for identification of system state. Although various methods have been proposed for measuring physiologic complexity, uncorrelated time series are often assigned high values of complexity, erroneously classifying them as complex physiological signals. Here, we propose and discuss a method for complex system analysis based on generalized statistical formalism and surrogate time series. Sample entropy (SampEn) was rewritten, inspired by Tsallis generalized entropy, as a function of the q parameter (qSampEn). qSDiff curves were calculated, which consist of differences between original and surrogate series qSampEn. We evaluated qSDiff for 125 real heart rate variability (HRV) dynamics, divided into groups of 70 healthy, 44 congestive heart failure (CHF), and 11 atrial fibrillation (AF) subjects, and for simulated series of stochastic and chaotic processes. The evaluations showed that, for nonperiodic signals, qSDiff curves have a maximum point (qSDiffmax) for q ≠ 1. Values of q where the maximum point occurs and where qSDiff is zero were also evaluated. Only qSDiffmax values were capable of distinguishing the HRV groups (p-values 5.10×10-3, 1.11×10-7, and 5.50×10-7 for healthy vs. CHF, healthy vs. AF, and CHF vs. AF, respectively), consistently with the concept of physiologic complexity, which suggests a potential use for chaotic system analysis.
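The paper's qSampEn generalizes standard sample entropy via the Tsallis q-formalism; the SampEn baseline itself is standard and can be sketched as follows (a minimal illustration, not the authors' implementation; m = 2 and r = 0.2·SD are conventional defaults):

```python
import numpy as np

def sampen(x, m=2, r=0.2):
    """Standard sample entropy SampEn(m, r); the tolerance is r times
    the standard deviation of the series (Chebyshev distance)."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    n = len(x)

    def count_matches(mm):
        # Same number of templates (n - m) for lengths m and m + 1,
        # self-matches excluded.
        tpl = np.array([x[i:i + mm] for i in range(n - m)])
        c = 0
        for i in range(len(tpl) - 1):
            d = np.max(np.abs(tpl[i + 1:] - tpl[i]), axis=1)
            c += int(np.sum(d <= tol))
        return c

    b = count_matches(m)      # template matches of length m
    a = count_matches(m + 1)  # template matches of length m + 1
    return -np.log(a / b)
```

A regular signal (e.g. a sine wave) yields a much lower SampEn than white noise; the paper's qSDiff additionally compares the original series against surrogates.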

  19. Wavelet analysis for non-stationary, non-linear time series

    NASA Astrophysics Data System (ADS)

    Schulte, J. A.

    2015-12-01

    Methods for detecting and quantifying nonlinearities in nonstationary time series are introduced and developed. In particular, higher-order wavelet analysis was applied to an ideal time series and the Quasi-biennial Oscillation (QBO) time series. Multiple-testing problems inherent in wavelet analysis were addressed by controlling the false discovery rate. A new local autobicoherence spectrum facilitated the detection of local nonlinearities and the quantification of cycle geometry. The local autobicoherence spectrum of the QBO time series showed that the QBO time series contained a mode with a period of 28 months that was phase-coupled to a harmonic with a period of 14 months. An additional nonlinearly interacting triad was found among modes with periods of 10, 16, and 26 months. Local biphase spectra determined that the nonlinear interactions were not quadratic and that the effect of the nonlinearities was to produce non-smoothly varying oscillations. The oscillations were found to be skewed so that negative QBO regimes were preferred, and also asymmetric in the sense that phase transitions from easterly to westerly phases occurred more rapidly than those from westerly to easterly regimes.
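Higher-order (bicoherence-type) wavelet analysis builds on the continuous wavelet transform; a minimal complex Morlet CWT by direct convolution is sketched below (an illustration only; the study's local autobicoherence and false-discovery-rate machinery are beyond this sketch):

```python
import numpy as np

def morlet_cwt(x, scales, w0=6.0):
    """Continuous wavelet transform with a complex Morlet wavelet,
    computed by direct convolution. For w0 = 6 the Fourier period of
    scale s is roughly 1.03 * s."""
    x = np.asarray(x, dtype=float)
    out = np.empty((len(scales), len(x)), dtype=complex)
    for k, s in enumerate(scales):
        half = int(4 * s)                      # truncate at 4 envelope widths
        t = np.arange(-half, half + 1)
        psi = np.exp(1j * w0 * t / s) * np.exp(-0.5 * (t / s) ** 2) / np.sqrt(s)
        # correlation of x with psi* == convolution with reversed conjugate
        out[k] = np.convolve(x, np.conj(psi[::-1]), mode='same')
    return out
```

For a pure oscillation with a 28-month period, time-averaged wavelet power peaks near scale 28/1.03 ≈ 27; phase coupling between modes (such as the 28- and 14-month pair reported here) is then probed through products of coefficients at the coupled scales.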

  20. The F-12 series aircraft approach to design for control system reliability

    NASA Technical Reports Server (NTRS)

    Schenk, F. L.; Mcmaster, J. R.

    1976-01-01

    The F-12 series aircraft control system design philosophy is reviewed as it pertains to functional reliability. The basic control system, i.e., cables, mixer, feel system, trim devices, and hydraulic systems are described and discussed. In addition, the implementation of the redundant stability augmentation system in the F-12 aircraft is described. Finally, the functional reliability record that has been achieved is presented.

  1. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik; Marwan, Norbert; Dijkstra, Henk; Kurths, Jürgen

    2016-04-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology. pyunicorn is available online at https://github.com/pik-copan/pyunicorn. Reference: J.F. Donges, J. Heitzig, B. Beronov, M. Wiedermann, J. Runge, Q.-Y. Feng, L. Tupikina, V. Stolbova, R.V. Donner, N. Marwan, H.A. Dijkstra, and J. Kurths, Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package, Chaos 25, 113101 (2015), DOI: 10.1063/1.4934554, Preprint: arxiv.org:1507.01571 [physics.data-an].
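pyunicorn ships these constructions; for illustration, the natural visibility graph (in the sense of Lacasa et al.) can be built from scratch in a few lines (a naive O(n³) sketch, not pyunicorn's implementation):

```python
import numpy as np

def natural_visibility_graph(x):
    """Adjacency matrix of the natural visibility graph of a series:
    samples i < j are connected when every intermediate sample lies
    strictly below the straight line joining (i, x[i]) and (j, x[j])."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    adj = np.zeros((n, n), dtype=bool)
    for i in range(n - 1):
        for j in range(i + 1, n):
            ks = np.arange(i + 1, j)
            # height of the i-j sight line at each intermediate index
            line = x[j] + (x[i] - x[j]) * (j - ks) / (j - i)
            if np.all(x[ks] < line):
                adj[i, j] = adj[j, i] = True
    return adj
```

Consecutive samples are always mutually visible, so the graph is connected; the degree distribution of such a graph is then analyzed with the usual complex-network measures.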

  2. TWO APPROACHES TO TEACHING SYNTAX. INDIANA UNIVERSITY ENGLISH CURRICULUM STUDY SERIES.

    ERIC Educational Resources Information Center

    BROWN, MARSHALL L.; AND OTHERS

    TWO TRANSFORMATIONAL-GENERATIVE APPROACHES TO TEACHING SYNTAX IN JUNIOR AND SENIOR HIGH SCHOOLS ARE PRESENTED. ONE IS FOR USE WITH AVERAGE AND TALENTED STUDENTS IN GRADES 7-9, AND THE OTHER IS FOR SLOW-LEARNING STUDENTS IN GRADES 7-11. A DISCUSSION OF THE FIRST APPROACH IS DIVIDED BY GRADE LEVEL AND INCLUDES AN EXAMINATION OF BASIC SENTENCE…

  3. Comprehensive Reform for Urban High Schools: A Talent Development Approach. Sociology of Education Series.

    ERIC Educational Resources Information Center

    Legters, Nettie E.; Balfanz, Robert; Jordan, Will J.; McPartland, James M.

    This book offers an alternative to current reform efforts, the talent development approach, detailing organizational, curricular, and instructional strategies that provide practitioners with a blueprint for whole school reform. The book presents the story of what happened in urban high schools when this approach was implemented. There are eight…

  4. The study of coastal groundwater depth and salinity variation using time-series analysis

    SciTech Connect

    Tularam, G.A. (E-mail: a.tularam@griffith.edu.au); Keeler, H.P. (E-mail: p.keeler@ms.unimelb.edu.au)

    2006-10-15

    A time-series approach is applied to study and model tidal intrusion into coastal aquifers. The authors examine the effect of tidal behaviour on groundwater level and salinity intrusion for the coastal Brisbane region using auto-correlation and spectral analyses. The results show a close relationship between tidal behaviour, groundwater depth and salinity levels for the Brisbane coast. The known effect can be quantified and incorporated into new models in order to more accurately map salinity intrusion into coastal groundwater table.
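The auto-correlation and spectral analyses used here are standard; a minimal sketch for recovering a tidal period from a noisy groundwater-level series (illustrative only; the 12.42 h semidiurnal tide is assumed as the test signal, not taken from the Brisbane data):

```python
import numpy as np

def autocorr(x, max_lag):
    """Biased sample autocorrelation up to max_lag (r[0] == 1)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    var = np.sum(x * x)
    return np.array([np.sum(x[:len(x) - k] * x[k:]) / var
                     for k in range(max_lag + 1)])

def dominant_period(x, dt=1.0):
    """Period of the largest periodogram peak (zero frequency excluded)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=dt)
    return 1.0 / freqs[1 + np.argmax(spec[1:])]
```

On an hourly series driven by a 12.42 h tide, the periodogram peak lands on the FFT bin nearest that frequency, and the autocorrelation stays high near integer multiples of the period.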

  5. Ultrasound-guided central cluster approach for the supraclavicular brachial plexus block: a case series.

    PubMed

    Lee, Mi Geum; Lee, Kyung Cheon; Kim, Hong Soon; Park, Seol Ju; Suh, Young Je; Shin, Hyeon Ju

    2015-12-01

    There are many different approaches to ultrasound-guided supraclavicular brachial plexus block (US-SCBPB), and each has a different success rate and complications. The most commonly performed US-SCBPB is the corner pocket approach in which the needle is advanced very close to the subclavian artery and pleura. Therefore, it may be associated with a risk of subclavian artery puncture or pneumothorax. We advanced the needle into the central part of the neural cluster after penetrating the sheath of the brachial plexus in US-SCBPB. We refer to this new method as the "central cluster approach." In this approach, the needle does not have to advance close to the subclavian artery or pleura. The aim of this study was to evaluate the clinical outcomes of the central cluster approach in US-SCBPB. PMID:26634085

  6. Ultrasound-guided central cluster approach for the supraclavicular brachial plexus block: a case series

    PubMed Central

    Lee, Mi Geum; Lee, Kyung Cheon; Kim, Hong Soon; Park, Seol Ju; Suh, Young Je

    2015-01-01

    There are many different approaches to ultrasound-guided supraclavicular brachial plexus block (US-SCBPB), and each has a different success rate and complications. The most commonly performed US-SCBPB is the corner pocket approach in which the needle is advanced very close to the subclavian artery and pleura. Therefore, it may be associated with a risk of subclavian artery puncture or pneumothorax. We advanced the needle into the central part of the neural cluster after penetrating the sheath of the brachial plexus in US-SCBPB. We refer to this new method as the "central cluster approach." In this approach, the needle does not have to advance close to the subclavian artery or pleura. The aim of this study was to evaluate the clinical outcomes of the central cluster approach in US-SCBPB. PMID:26634085

  7. Time series analysis and Monte Carlo methods for eigenvalue separation in neutron multiplication problems

    SciTech Connect

    Nease, Brian R.; Ueki, Taro

    2009-12-10

    A time series approach has been applied to the nuclear fission source distribution generated by Monte Carlo (MC) particle transport in order to calculate the non-fundamental mode eigenvalues of the system. The novel aspect is the combination of the general technical principle of projection pursuit for multivariate data with the neutron multiplication eigenvalue problem in the nuclear engineering discipline. Proof is thoroughly provided that the stationary MC process is linear to first order approximation and that it transforms into one-dimensional autoregressive processes of order one (AR(1)) via the automated choice of projection vectors. The autocorrelation coefficient of the resulting AR(1) process corresponds to the ratio of the desired mode eigenvalue to the fundamental mode eigenvalue. All modern MC codes for nuclear criticality calculate the fundamental mode eigenvalue, so the desired mode eigenvalue can be easily determined. This time series approach was tested for a variety of problems including multi-dimensional ones. Numerical results show that the time series approach has strong potential for three-dimensional whole reactor cores. The eigenvalue ratio can be updated in an on-the-fly manner without storing the nuclear fission source distributions at all previous iteration cycles for the mean subtraction. Lastly, the effects of degenerate eigenvalues are investigated and solutions are provided.
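The key numerical step is reading the eigenvalue ratio off the lag-1 autocorrelation of the projected fission-source series; a sketch with a simulated AR(1) process standing in for the projected MC series (the projection-pursuit step itself is omitted, and the value 0.8 is a hypothetical ratio):

```python
import numpy as np

def ar1_coefficient(y):
    """Lag-1 autocorrelation estimate of a mean-subtracted series; for
    an AR(1) process this estimates the autoregressive coefficient,
    which in the MC eigenvalue setting is read as k1/k0."""
    y = np.asarray(y, dtype=float) - np.mean(y)
    return np.dot(y[:-1], y[1:]) / np.dot(y, y)

rng = np.random.default_rng(0)
rho_true = 0.8          # stands in for the eigenvalue ratio k1/k0
n = 20000
y = np.zeros(n)
for t in range(1, n):
    y[t] = rho_true * y[t - 1] + rng.standard_normal()
rho_hat = ar1_coefficient(y)
```

Multiplying the estimated coefficient by the fundamental-mode eigenvalue (which the MC code already provides) then recovers the desired non-fundamental eigenvalue.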

  8. Primary Dentition Analysis: Exploring a Hidden Approach

    PubMed Central

    Vanjari, Kalasandhya; Kamatham, Rekhalakshmi; Gaddam, Kumar Raja

    2016-01-01

    ABSTRACT Background: Accurate prediction of the mesiodistal widths (MDWs) of canines and premolars in children with primary dentition facilitates interception of malocclusion at an early age. The Boston University (BU) approach is one such method, based on primary teeth, for predicting canine and premolar dimensions. Aim: To predict the canine and premolar dimensions in the contemporary population using the BU approach and compare with the values obtained using the Tanaka-Johnston (T/J) approach. Design: Children in the age range of 7-11 years with all permanent mandibular incisors and primary maxillary and mandibular canines and first molars present were included in the study. Those with interproximal caries or restorations, abnormalities in shape or size, and a history of orthodontic treatment were excluded. Impressions of both arches were made using irreversible hydrocolloid and poured with dental stone. The MDWs of the required teeth were measured on the models using an electronic digital vernier caliper, from which the widths of permanent canines and premolars were predicted using both the T/J and BU approaches. Results: A statistically significant (p = 0.00) positive correlation (r = 0.52-0.55) was observed between the T/J and BU approaches. A statistically significant (p = 0.00) strong positive correlation (r = 0.72-0.77) was observed among girls, whereas boys showed a statistically nonsignificant weak positive correlation (r = 0.17-0.42). Conclusion: The Boston University approach can be further studied prospectively to establish it as a prediction method for permanent tooth dimensions in children in the primary dentition stage. How to cite this article: Nuvvula S, Vanjari K, Kamatham R, Gaddam KR. Primary Dentition Analysis: Exploring a Hidden Approach. Int J Clin Pediatr Dent 2016;9(1):1-4. PMID:27274146
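For reference, the Tanaka-Johnston prediction used as the comparison standard is a simple linear rule: half the summed mesiodistal widths of the four permanent mandibular incisors, plus the commonly cited constants of 11.0 mm (maxillary) or 10.5 mm (mandibular) per quadrant. A worked sketch (the BU approach's own regression constants are not reproduced here):

```python
def tanaka_johnston(sum_mand_incisors_mm):
    """Tanaka-Johnston prediction of the combined mesiodistal width (mm)
    of the unerupted canine and two premolars in one quadrant, from the
    summed widths of the four permanent mandibular incisors."""
    half = sum_mand_incisors_mm / 2.0
    return {"maxillary": half + 11.0, "mandibular": half + 10.5}
```

For example, tanaka_johnston(23.0) predicts 22.5 mm for a maxillary quadrant and 22.0 mm for a mandibular one.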

  9. Fourier Series Applications in Multitemporal Remote Sensing Analysis using Landsat Data

    NASA Astrophysics Data System (ADS)

    Brooks, Evan Beren

    Researchers now have unprecedented access to free Landsat data, enabling detailed monitoring of the Earth's land surface and vegetation. There are gaps in the data, due in part to cloud cover. The gaps are aperiodic and localized, forcing any detailed multitemporal analysis based on Landsat data to compensate. Harmonic regression approximates Landsat data for any point in time with minimal training images and reduced storage requirements. In two study areas in North Carolina, USA, harmonic regression approaches were at least as good at simulating missing data as STAR-FM for images from 2001. Harmonic regression had an R² ≥ 0.9 over three quarters of all pixels. It gave the highest predicted R² values on two thirds of the pixels. Applying harmonic regression with the same number of harmonics to consecutive years yielded an improved fit, R² ≥ 0.99 for most pixels. We next demonstrate a change detection method based on exponentially weighted moving average (EWMA) charts of harmonic residuals. In the process, a data-driven cloud filter is created, enabling use of partially clouded data. The approach is shown capable of detecting thins and subtle forest degradations in Alabama, USA, considerably finer than the Landsat spatial resolution, in an on-the-fly fashion, with new images easily incorporated into the algorithm. EWMA detection accurately showed the location, timing, and magnitude of 85% of known harvests in the study area, verified by aerial imagery. We use harmonic regression to improve the precision of dynamic forest parameter estimates, generating a robust time series of vegetation index values. These values are classified into strata maps in Alabama, USA, depicting regions of similar growth potential. These maps are applied to Forest Service Forest Inventory and Analysis (FIA) plots, generating post-stratified estimates of static and dynamic forest parameters. Improvements to efficiency for all parameters were such that a comparable random sample would require
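Harmonic regression as used here fits an intercept plus paired sine/cosine terms at multiples of the annual frequency by ordinary least squares, and missing (e.g. cloudy) dates are then predicted from the fitted curve. A minimal sketch (not the dissertation's code; a 365.25-day period and two harmonics are assumed):

```python
import numpy as np

def harmonic_design(t, n_harmonics=2, period=365.25):
    """Design matrix: intercept plus cosine/sine pairs at integer
    multiples of the annual frequency."""
    cols = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        w = 2.0 * np.pi * k * t / period
        cols += [np.cos(w), np.sin(w)]
    return np.column_stack(cols)

def fit_harmonics(t, y, n_harmonics=2, period=365.25):
    """Ordinary least-squares fit of the harmonic model to (t, y)."""
    X = harmonic_design(np.asarray(t, float), n_harmonics, period)
    coef, *_ = np.linalg.lstsq(X, np.asarray(y, float), rcond=None)
    return coef

def predict_harmonics(t, coef, n_harmonics=2, period=365.25):
    """Evaluate the fitted harmonic curve at arbitrary dates t (days)."""
    return harmonic_design(np.asarray(t, float), n_harmonics, period) @ coef
```

The fitted coefficients compress a stack of irregularly timed observations into a handful of numbers per pixel, which is what enables the reduced storage and the residual-based EWMA change charts the abstract describes.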

  10. An approach to jointly invert hypocenters and 1D velocity structure and its application to the Lushan earthquake series

    NASA Astrophysics Data System (ADS)

    Qian, Hui; Mechie, James; Li, Haibing; Xue, Guangqi; Su, Heping; Cui, Xiang

    2016-01-01

    Earthquake location is essential when defining fault systems and other geological structures. Many methods have been developed to locate hypocenters within a 1D velocity model. In this study, a new approach, named MatLoc, has been developed which can simultaneously invert for the locations and origin times of the hypocenters and the velocity structure, from the arrival times of local earthquakes. Moreover, it can invert for layer boundary depths, such as Moho depths, which can be well constrained by the Pm and Pn phases. For this purpose, the package was developed to take into account reflected phases, e.g., the Pm phase. The speed of the inversion is acceptable due to the use of optimized matrix calculations. The package has been used to re-locate the Lushan earthquake series which occurred in Sichuan, China, from April 20 to April 22, 2013. The results obtained with the package show that the Lushan earthquake series defines the dip of the Guankou fault, on which most of the series occurred, to be 39° toward the NW. Further, the surface projection of the Lushan earthquake series is consistent with the regional tectonic strike which is about N45° E.

  11. Young volcanoes in the Chilean Southern Volcanic Zone: A statistical approach to eruption prediction based on time series

    NASA Astrophysics Data System (ADS)

    Dzierma, Y.; Wehrmann, H.

    2010-03-01

    Forecasting volcanic activity has long been an aim of applied volcanology with regard to mitigating the consequences of volcanic eruptions. Effective disaster management requires both information on expected physical eruption behaviour, such as the types and magnitudes of eruptions typical of the individual volcano, usually reconstructed from deposits of past eruptions, and the likelihood that a new eruption will occur within a given time. Here we apply a statistical procedure to provide a probability estimate for future eruptions based on eruption time series, and discuss the limitations of this approach. The statistical investigation encompasses a series of young volcanoes of the Chilean Southern Volcanic Zone. Most of the volcanoes considered have been active in historical times, in addition to several volcanoes with a longer eruption record from the Late Pleistocene to the Holocene. Furthermore, eruption rates of neighbouring volcanoes are compared with the aim of revealing possible regional relations, potentially resulting from local- to medium-scale tectonic dynamics. One special focus is directed to the two currently most active volcanoes of South America, Llaima and Villarrica, whose eruption records comprise about 50 historical eruptions over the past centuries. These two front volcanoes are considered together with Lanín Volcano, situated in the back-arc of Villarrica, for which the analysis is based on eight eruptions in the past 10 ka. For Llaima and Villarrica, affirmative tests for independence of the repose times between successive eruptions permit the assumption of Poisson processes; this is hampered for Lanín by the more limited availability of documented eruptions. The assumption of stationarity reaches varying degrees of confidence depending on the time interval considered, improving towards the more recent and hence probably more complete eruption record. With these prerequisites of the time series, several distribution functions are fit and the goodness of
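Once independence of repose times supports a Poisson process, the forecast itself is one line: with the rate λ estimated as eruptions per year, the probability of at least one eruption within a horizon t is 1 − exp(−λt). A sketch (the counts below are hypothetical, not the Llaima or Villarrica records):

```python
import math

def eruption_probability(n_eruptions, record_years, horizon_years):
    """Under a stationary Poisson process with rate lam = n/T, the
    probability of at least one eruption within the given horizon."""
    lam = n_eruptions / record_years
    return 1.0 - math.exp(-lam * horizon_years)
```

For example, 50 eruptions over a 400-year record give λ = 0.125/yr and roughly a 71% chance of at least one eruption within 10 years.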

  12. A deliberate practice approach to teaching phylogenetic analysis.

    PubMed

    Hobbs, F Collin; Johnson, Daniel J; Kearns, Katherine D

    2013-01-01

    One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or "one-shot," in-class activities. Using a deliberate practice instructional approach, we designed a set of five assignments for a 300-level plant systematics course that incrementally introduces the concepts and skills used in phylogenetic analysis. In our assignments, students learned the process of constructing phylogenetic trees through a series of increasingly difficult tasks; thus, skill development served as a framework for building content knowledge. We present results from 5 yr of final exam scores, pre- and postconcept assessments, and student surveys to assess the impact of our new pedagogical materials on student performance related to constructing and interpreting phylogenetic trees. Students improved in their ability to interpret relationships within trees and improved in several aspects related to between-tree comparisons and tree construction skills. Student feedback indicated that most students believed our approach prepared them to engage in tree construction and gave them confidence in their abilities. Overall, our data confirm that instructional approaches implementing deliberate practice address student misconceptions, improve student experiences, and foster deeper understanding of difficult scientific concepts. PMID:24297294

  13. A Deliberate Practice Approach to Teaching Phylogenetic Analysis

    PubMed Central

    Hobbs, F. Collin; Johnson, Daniel J.; Kearns, Katherine D.

    2013-01-01

    One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or “one-shot,” in-class activities. Using a deliberate practice instructional approach, we designed a set of five assignments for a 300-level plant systematics course that incrementally introduces the concepts and skills used in phylogenetic analysis. In our assignments, students learned the process of constructing phylogenetic trees through a series of increasingly difficult tasks; thus, skill development served as a framework for building content knowledge. We present results from 5 yr of final exam scores, pre- and postconcept assessments, and student surveys to assess the impact of our new pedagogical materials on student performance related to constructing and interpreting phylogenetic trees. Students improved in their ability to interpret relationships within trees and improved in several aspects related to between-tree comparisons and tree construction skills. Student feedback indicated that most students believed our approach prepared them to engage in tree construction and gave them confidence in their abilities. Overall, our data confirm that instructional approaches implementing deliberate practice address student misconceptions, improve student experiences, and foster deeper understanding of difficult scientific concepts. PMID:24297294

  14. Quantification and clustering of phenotypic screening data using time-series analysis for chemotherapy of schistosomiasis

    PubMed Central

    2012-01-01

    Background Neglected tropical diseases, especially those caused by helminths, constitute some of the most common infections of the world's poorest people. Development of techniques for automated, high-throughput drug screening against these diseases, especially in whole-organism settings, constitutes one of the great challenges of modern drug discovery. Method We present a method for enabling high-throughput phenotypic drug screening against diseases caused by helminths with a focus on schistosomiasis. The proposed method allows for a quantitative analysis of the systemic impact of a drug molecule on the pathogen as exhibited by the complex continuum of its phenotypic responses. This method consists of two key parts: first, biological image analysis is employed to automatically monitor and quantify shape-, appearance-, and motion-based phenotypes of the parasites. Next, we represent these phenotypes as time-series and show how to compare, cluster, and quantitatively reason about them using techniques of time-series analysis. Results We present results on a number of algorithmic issues pertinent to the time-series representation of phenotypes. These include results on appropriate representation of phenotypic time-series, analysis of different time-series similarity measures for comparing phenotypic responses over time, and techniques for clustering such responses by similarity. Finally, we show how these algorithmic techniques can be used for quantifying the complex continuum of phenotypic responses of parasites. An important corollary is the ability of our method to recognize and rigorously group parasites based on the variability of their phenotypic response to different drugs. Conclusions The methods and results presented in this paper enable automatic and quantitative scoring of high-throughput phenotypic screens focused on helmintic diseases. Furthermore, these methods allow us to analyze and stratify parasites based on their phenotypic response to drugs
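Of the time-series similarity measures such a pipeline can use, dynamic time warping (DTW) is a common choice because phenotypic responses may be shifted or stretched in time; a minimal sketch (the paper evaluates several measures and does not prescribe this exact one):

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D phenotype series
    (classic O(len(a) * len(b)) recursion, no warping window)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible predecessors
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

DTW can return 0 for time-shifted profiles that Euclidean distance would separate; the resulting pairwise distance matrix feeds directly into hierarchical clustering of the parasites' responses.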

  15. Systemic and intensifying drought induces collapse and replacement of native fishes: a time-series approach

    NASA Astrophysics Data System (ADS)

    Ruhi, A.; Olden, J. D.; Sabo, J. L.

    2015-12-01

    In the American Southwest, hydrologic drought has become a new normal as a result of increasing human appropriation of freshwater resources and increased aridity associated with global warming. Although drought has often been touted to threaten freshwater biodiversity, connecting drought to extinction risk of highly-imperiled faunas remains a challenge. Here we combine time-series methods from signal processing and econometrics to analyze a spatially comprehensive and long-term dataset to link discharge variation and community abundance of fish across the American Southwest. This novel time series framework identifies ongoing trends in daily discharge anomalies across the Southwest, quantifies the effect of the historical hydrologic drivers on fish community abundance, and allows us to simulate species trajectories and range-wide risk of decline (quasiextinction) under scenarios of future climate. Spectral anomalies are declining over the last 30 years in at least a quarter of the stream gaging stations across the American Southwest, and these anomalies are robust predictors of historical abundance of native and non-native fishes. Quasiextinction probabilities are high (>50%) for nearly ¾ of the native species across several large river basins in the same region, and the negative trend in annual anomalies increases quasiextinction risk for native but reduces this risk for non-native fishes. These findings suggest that ongoing drought is causing range-wide collapse and replacement of native fish faunas, and that this homogenization of western fish faunas will continue given the prevailing negative trend in discharge anomalies. Additionally, this combination of methods can be applied elsewhere as long as environmental and biological long-term time-series data are available. Collectively, these methods allow identifying the link between hydroclimatic forcing and ecological responses and thus may help anticipate the potential impacts of ongoing and future hydrologic

  16. Graphic analysis and multifractal on percolation-based return interval series

    NASA Astrophysics Data System (ADS)

    Pei, A. Q.; Wang, J.

    2015-05-01

    A financial time series model is developed and investigated using the oriented percolation system (one of the statistical physics systems). The nonlinear and statistical behaviors of the return interval time series are studied for the proposed model and the real stock market by applying the visibility graph (VG) and multifractal detrended fluctuation analysis (MF-DFA). We investigate the fluctuation behaviors of return intervals of the model for different parameter settings, and also comparatively study these fluctuation patterns with those of the real financial data for different threshold values. The empirical research of this work exhibits multifractal features for the corresponding financial time series. Further, the VGs derived from both the simulated data and the real data show small-world, hierarchical and high-clustering behaviors, with power-law tails in the degree distributions.

  17. Mining biomedical time series by combining structural analysis and temporal abstractions.

    PubMed Central

    Bellazzi, R.; Magni, P.; Larizza, C.; De Nicolao, G.; Riva, A.; Stefanelli, M.

    1998-01-01

    This paper describes the combination of Structural Time Series analysis and Temporal Abstractions for the interpretation of data coming from home monitoring of diabetic patients. Blood glucose data are analyzed by a novel Bayesian technique for time series analysis. The results obtained are post-processed using Temporal Abstractions in order to extract knowledge that can be exploited "at the point of use" by physicians. The proposed data analysis procedure can be viewed as a Knowledge Discovery in Databases process applied to time-varying data. The work described here is part of a Web-based telemedicine system for the management of Insulin Dependent Diabetes Mellitus patients, called T-IDDM. PMID:9929202

  18. Statistical Analysis of Sensor Network Time Series at Multiple Time Scales

    NASA Astrophysics Data System (ADS)

    Granat, R. A.; Donnellan, A.

    2013-12-01

    Modern sensor networks often collect data at multiple time scales in order to observe physical phenomena that occur at different scales. Whether collected by heterogeneous or homogeneous sensor networks, measurements at different time scales are usually subject to different dynamics, noise characteristics, and error sources. We explore the impact of these effects on the results of statistical time series analysis methods applied to multi-scale time series data. As a case study, we analyze results from GPS time series position data collected in Japan and the Western United States, which produce raw observations at 1 Hz and orbit-corrected observations at time resolutions of 5 minutes, 30 minutes, and 24 hours. We utilize the GPS analysis package (GAP) software to perform three types of statistical analysis on these observations: hidden Markov modeling, probabilistic principal components analysis, and covariance distance analysis. We compare the results of these methods at the different time scales and discuss the impact on science understanding of earthquake fault systems generally and recent large seismic events specifically, including the Tohoku-Oki earthquake in Japan and the El Mayor-Cucapah earthquake in Mexico.

  19. Percutaneous trans-ulnar artery approach for coronary angiography and angioplasty; A case series study

    PubMed Central

    Roghani-Dehkordi, Farshad; Hadizadeh, Mahmood; Hadizadeh, Fatemeh

    2015-01-01

    BACKGROUND Coronary angiography is the gold-standard method for diagnosis of coronary heart disease and is usually performed via the femoral approach, which has several complications. To reduce these complications, the upper-extremity approach is increasingly used and is becoming the preferred access site for many interventionists. Although the radial approach is relatively well studied, the safety, feasibility, and risk of the ulnar approach are not yet clearly known. METHODS We followed 97 patients (men = 56%, mean ± standard deviation of age = 57 ± 18 years) who had undergone coronary angiography or angioplasty via the ulnar approach for 6-10 months and recorded their outcomes. RESULTS In 97 of 105 patients (92.38%), the procedure was successfully completed through ulnar access. Unsuccessful puncture (3 patients), wiring (2 patients), sheath passage (2 patients), and an anatomically unsuitable ulnar artery (1 patient) were the reasons for failure. In 94 patients (89.52%), the angiography or angioplasty was done without any complications. Five patients (5.1%) developed hematoma and 11 patients (11%) experienced low-grade pain that resolved with painkillers. No infection, amputation, or need for surgery was reported. CONCLUSION This study demonstrated that ulnar access was a safe and practical approach for coronary angiography or angioplasty in our patients, without any major complication. Bearing in mind its high success rate, it can be utilized when the radial artery is not usable for catheterization, for example after prior harvesting of the radial artery (in prior coronary artery bypass grafting). PMID:26715936

  20. Scalable Hyper-parameter Estimation for Gaussian Process Based Time Series Analysis

    SciTech Connect

    Chandola, Varun; Vatsavai, Raju

    2010-01-01

    The Gaussian process (GP) is becoming increasingly popular as a kernel-based machine learning tool for non-parametric data analysis. Recently, GPs have been applied to model non-linear dependencies in time series data. GP-based analysis can be used to solve problems of time series prediction, forecasting, missing-data imputation, change point detection, anomaly detection, etc. But the use of GPs to handle massive scientific time series data sets has been limited, owing to their expensive computational complexity. The primary bottleneck is the handling of the covariance matrix, whose size is quadratic in the length of the time series. In this paper we propose a scalable method that exploits the special structure of the covariance matrix for hyper-parameter estimation in GP-based learning. The proposed method allows estimation of the hyper-parameters associated with a GP in quadratic time, an order-of-magnitude improvement over standard methods with cubic complexity. Moreover, the proposed method does not require explicit computation of the covariance matrix and hence has a memory requirement that is linear in the length of the time series, as opposed to the quadratic memory requirement of standard methods. To further improve the computational complexity of the proposed method, we provide a parallel version to concurrently estimate the log likelihood for a set of time series, which is the key step in hyper-parameter estimation. Performance results on a multi-core system show that our proposed method provides significant speedups, as high as 1000, even when running in serial mode, while maintaining a small memory footprint. The parallel version exploits the natural parallelization potential of the serial algorithm and is shown to perform significantly better than the fast serial algorithm, with speedups as high as 10.
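The structure being exploited can be illustrated for an evenly sampled series with a stationary kernel: the covariance matrix is then Toeplitz, so only its first column need be stored (linear memory) and K⁻¹y can be computed by a Levinson-type solver in O(n²) rather than O(n³). The sketch below shows this idea with SciPy's Toeplitz solver; it illustrates the principle, not the paper's algorithm.

```python
# Exploiting Toeplitz covariance structure for GP computations on an
# evenly sampled time series: K[i, j] depends only on |i - j|.
import numpy as np
from scipy.linalg import solve_toeplitz, toeplitz

def sq_exp_first_column(n, lengthscale, variance, noise):
    """First column of a squared-exponential covariance on a unit-spaced grid."""
    lags = np.arange(n)
    col = variance * np.exp(-0.5 * (lags / lengthscale) ** 2)
    col[0] += noise  # observation-noise term on the diagonal
    return col

n = 200
y = np.sin(0.1 * np.arange(n))
col = sq_exp_first_column(n, lengthscale=5.0, variance=1.0, noise=0.1)

# K^{-1} y via Levinson recursion: O(n^2) time, O(n) memory, K never formed.
alpha = solve_toeplitz(col, y)
quad = y @ alpha   # quadratic term of the GP log marginal likelihood

# Sanity check against the explicit dense solve (O(n^3) time, O(n^2) memory).
alpha_dense = np.linalg.solve(toeplitz(col), y)
```

The log-determinant term of the likelihood needs additional machinery (e.g. a by-product of the Levinson recursion), which is omitted here for brevity.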

  1. Application of the Allan Variance to Time Series Analysis in Astrometry and Geodesy: A Review.

    PubMed

    Malkin, Zinovy

    2016-04-01

    The Allan variance (AVAR) was introduced 50 years ago as a statistical tool for assessing the stability of frequency standards. For the past decades, AVAR has increasingly been used in geodesy and astrometry to assess the noise characteristics in geodetic and astrometric time series. A specific feature of astrometric and geodetic measurements, as compared with clock measurements, is that they are generally associated with uncertainties; thus, an appropriate weighting should be applied during data analysis. In addition, some physically connected scalar time series naturally form series of multidimensional vectors. For example, the three station coordinate time series X, Y, and Z can be combined to analyze 3-D station position variations. The classical AVAR is not intended for processing unevenly weighted and/or multidimensional data. Therefore, AVAR modifications, namely weighted AVAR (WAVAR), multidimensional AVAR (MAVAR), and weighted multidimensional AVAR (WMAVAR), were introduced to overcome these deficiencies. In this paper, a brief review is given of the experience of using AVAR and its modifications in processing astrogeodetic time series. PMID:26540681
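The classical (unweighted, scalar) estimator that the reviewed WAVAR/MAVAR/WMAVAR variants generalize can be sketched as AVAR(τ) = ½⟨(ȳ_{k+1} − ȳ_k)²⟩, where the ȳ_k are consecutive non-overlapping means over the averaging interval τ:

```python
# Minimal sketch of the classical Allan variance; a didactic estimator,
# not the weighted/multidimensional variants discussed in the review.
import numpy as np

def allan_variance(x, m):
    """AVAR of series x at averaging factor m (tau = m * sampling step)."""
    n = len(x) // m
    means = x[:n * m].reshape(n, m).mean(axis=1)   # consecutive m-point means
    return 0.5 * np.mean(np.diff(means) ** 2)

rng = np.random.default_rng(0)
white = rng.standard_normal(100000)
# For unit-variance white noise, AVAR(m) falls as 1/m,
# since the variance of an m-point mean is 1/m.
print(allan_variance(white, 1), allan_variance(white, 10))
```

Plotting AVAR against m on log-log axes is the usual way to read off the noise type (white, flicker, random walk) from the slope.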

  2. Principal components and iterative regression analysis of geophysical series: Application to Sunspot number (1750-2004)

    NASA Astrophysics Data System (ADS)

    Nordemann, D. J. R.; Rigozo, N. R.; de Souza Echer, M. P.; Echer, E.

    2008-11-01

    We present here an implementation of a least-squares iterative regression method applied to the sine functions embedded in the principal components extracted from geophysical time series. This method appears to be a useful improvement for the quantitative analysis of periodicities in non-stationary time series. The principal components determination, followed by the least-squares iterative regression method, was implemented in an algorithm written in the Scilab (2006) language. The main result of the method is the set of sine functions embedded in the series analyzed, in decreasing order of significance: from the most important ones, likely to represent the physical processes involved in the generation of the series, to the least important ones, which represent noise components. Taking into account the need for a deeper knowledge of the Sun's past history and its implications for global climate change, the method was applied to the Sunspot Number series (1750-2004). With the threshold and parameter values used here, the application of the method leads to a total of 441 explicit sine functions, among which 65 were considered significant and were used for a reconstruction that gave a normalized mean squared error of 0.146.
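The core loop of such an iterative sine regression can be sketched as: seed the dominant frequency from the FFT peak of the residual, fit one sinusoid by nonlinear least squares, subtract it, and repeat. This is a simplified Python analogue of the Scilab procedure described above, not the authors' code; the test signal is synthetic.

```python
# Least-squares iterative regression on sine functions: components are
# extracted in decreasing order of significance from the residual.
import numpy as np
from scipy.optimize import curve_fit

def sine(t, amp, freq, phase):
    return amp * np.sin(2 * np.pi * freq * t + phase)

def extract_sines(t, x, n_components):
    residual = x.copy()
    components = []
    for _ in range(n_components):
        # Seed the fit from the strongest FFT peak of the current residual.
        spec = np.abs(np.fft.rfft(residual))
        freqs = np.fft.rfftfreq(len(t), d=t[1] - t[0])
        k = spec[1:].argmax() + 1                 # skip the zero-frequency bin
        p0 = [residual.std() * np.sqrt(2), freqs[k], 0.0]
        params, _ = curve_fit(sine, t, residual, p0=p0)
        components.append(params)
        residual = residual - sine(t, *params)
    return components, residual

t = np.arange(0, 200.0, 1.0)
x = sine(t, 2.0, 0.05, 0.3) + sine(t, 0.5, 0.13, 1.0)
comps, res = extract_sines(t, x, 2)   # recovers amplitudes ~2.0 and ~0.5
```

In the paper the same idea is applied to principal components of the series rather than to the raw signal, and significance thresholds decide how many sines to keep.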

  3. [Approximation of Time Series of Paramecia caudatum Dynamics by Verhulst and Gompertz Models: Non-traditional Approach].

    PubMed

    Nedorezov, L V

    2015-01-01

    For the approximation of some well-known time series of Paramecium caudatum population dynamics (G. F. Gause, The Struggle for Existence, 1934), the Verhulst and Gompertz models were used. The parameters were estimated for each of the models in two different ways: with the least-squares method (global fitting) and with a non-traditional approach (the method of extreme points). The results obtained were compared with each other and with those reported by G. F. Gause. Deviations of the theoretical (model) trajectories from the experimental time series were tested using various non-parametric statistical tests. It was shown that the least-squares estimates lead to results which do not always meet the requirements imposed on a "fine" model. But in some cases a small modification of the least-squares estimates is possible, allowing a satisfactory representation of the experimental data set. PMID:26349222
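The global-fitting route can be sketched for the Verhulst (logistic) model, N(t) = K / (1 + (K/N0 − 1)e^(−rt)), with ordinary least squares. The data below are synthetic, not Gause's Paramecium counts.

```python
# Fitting the Verhulst (logistic) growth model by least squares,
# the "global fitting" route compared in the paper.
import numpy as np
from scipy.optimize import curve_fit

def verhulst(t, K, N0, r):
    """Logistic growth: N(t) = K / (1 + (K/N0 - 1) * exp(-r t))."""
    return K / (1.0 + (K / N0 - 1.0) * np.exp(-r * t))

t = np.arange(0, 20.0)
true = verhulst(t, K=375.0, N0=5.0, r=0.9)
rng = np.random.default_rng(1)
obs = true + rng.normal(0.0, 5.0, size=t.size)   # noisy synthetic counts

params, _ = curve_fit(verhulst, t, obs, p0=[300.0, 10.0, 0.5])
K_hat, N0_hat, r_hat = params   # carrying capacity, initial size, growth rate
```

The method of extreme points, by contrast, pins the parameters to a few characteristic points of the trajectory rather than minimizing the summed squared deviation.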

  4. Efficient Transfer Entropy Analysis of Non-Stationary Neural Time Series

    PubMed Central

    Vicente, Raul; Díaz-Pernas, Francisco J.; Wibral, Michael

    2014-01-01

    Information theory allows us to investigate information processing in neural systems in terms of information transfer, storage, and modification. In particular, the measure of information transfer, transfer entropy, has seen a dramatic surge of interest in neuroscience. Estimating transfer entropy from two processes requires the observation of multiple realizations of these processes to estimate the associated probability density functions. To obtain these necessary observations, available estimators typically assume stationarity of the processes to allow pooling of observations over time. This assumption, however, is a major obstacle to the application of these estimators in neuroscience, as observed processes are often non-stationary. As a solution, Gomez-Herrero and colleagues theoretically showed that the stationarity assumption may be avoided by estimating transfer entropy from an ensemble of realizations. Such an ensemble is often readily available in neuroscience experiments in the form of experimental trials. Thus, in this work we combine the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series. We present an efficient implementation of the approach that is suitable for the increased computational demand of the ensemble method's practical application. In particular, we use a massively parallel implementation for a graphics processing unit to handle the computationally most heavy aspects of the ensemble method for transfer entropy estimation. We test the performance and robustness of our implementation on data from numerical simulations of stochastic processes. We also demonstrate the applicability of the ensemble method to magnetoencephalographic data. 
While we mainly evaluate the proposed method for neuroscience data, we expect it to be applicable in a variety of fields that are concerned with the analysis of information transfer in complex biological, social, and
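The quantity being estimated, TE(X→Y) = Σ p(y_{t+1}, y_t, x_t) log[p(y_{t+1}|y_t, x_t) / p(y_{t+1}|y_t)], can be illustrated with a simple plug-in (histogram) estimator for discrete sequences with history length 1. The paper's estimator is a nearest-neighbour ensemble method; the sketch below only shows the definition at work.

```python
# Plug-in transfer entropy (in bits) for discrete sequences, history 1.
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy X -> Y for equal-length discrete sequences."""
    triples, pairs_yx, pairs_yy, singles = Counter(), Counter(), Counter(), Counter()
    n = len(y) - 1
    for t in range(n):
        triples[(y[t + 1], y[t], x[t])] += 1
        pairs_yx[(y[t], x[t])] += 1
        pairs_yy[(y[t + 1], y[t])] += 1
        singles[y[t]] += 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]            # p(y1 | y0, x0)
        p_cond_hist = pairs_yy[(y1, y0)] / singles[y0]  # p(y1 | y0)
        te += p_joint * math.log2(p_cond_full / p_cond_hist)
    return te

# Y copies X with a one-step lag: information flows X -> Y but not Y -> X.
random.seed(0)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]
print(transfer_entropy(x, y), transfer_entropy(y, x))  # ~1 bit vs ~0 bits
```

Plug-in estimation of this kind needs long stationary recordings; the ensemble method above replaces pooling over time with pooling over trials, which is what makes the estimator usable on non-stationary data.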

  5. TIME SERIES ANALYSIS OF REMOTELY-SENSED TIR EMISSION: linking anomalies to physical processes

    NASA Astrophysics Data System (ADS)

    Pavlidou, E.; van der Meijde, M.; Hecker, C.; van der Werff, H.; Ettema, J.

    2013-12-01

    In the last 15 years, remote sensing has been evaluated for detecting thermal anomalies as precursors to earthquakes. Important issues that have yet to be tackled include the definition of: (a) a thermal anomaly, taking into account weather conditions, observation settings, and 'natural' variability caused by background sources; (b) the length of observations required for this purpose; and (c) the location of detected anomalies, which should be physically related to the tectonic activity. To determine whether thermal anomalies are statistical noise, mere meteorological conditions, or actual earthquake-related phenomena, we apply a novel approach. We use brightness temperature (top-of-atmosphere) data from thermal infrared imagery acquired at a hypertemporal (sub-hourly) interval from geostationary weather satellites over multiple years. The length of the time series allows for analysis of meteorological effects (diurnal, seasonal, or annual trends) and background variability, through the application of a combined spatial and temporal filter to distinguish extreme occurrences from trends. The definition of potential anomalies is based on statistical techniques, taking into account published (geo)physical characteristics of earthquake-related thermal anomalies. We use synthetic data to test the performance of the proposed detection method and track potential factors affecting the results. Subsequently, we apply the method to original data from Iran and Turkey, in quiescent and earthquake-struck periods alike. We present our findings with a main focus on assessing the resulting anomalies in relation to physical processes, considering: (a) meteorological effects, (b) the geographical, geological, and environmental settings, and (c) physically realistic distances and potential physical relations with the activity of causative faults.
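The "extreme occurrences vs. trends" separation can be sketched for a single pixel: remove the mean diurnal cycle and flag residuals beyond a robust threshold. The window sizes and the 3-sigma-equivalent cutoff below are assumptions for illustration, not the authors' published settings.

```python
# One-pixel sketch: remove diurnal climatology, flag robust outliers.
import numpy as np

def flag_anomalies(temps, samples_per_day, k=3.0):
    """Flag samples deviating from the mean diurnal cycle by > k robust sigmas."""
    temps = np.asarray(temps, dtype=float)
    days = temps.reshape(-1, samples_per_day)
    climatology = days.mean(axis=0)                # mean diurnal cycle
    resid = (days - climatology).ravel()
    mad = np.median(np.abs(resid - np.median(resid)))
    sigma = 1.4826 * mad                           # robust sigma from the MAD
    return np.abs(resid) > k * sigma

# 30 synthetic days of a diurnal cycle with noise and one injected spike.
rng = np.random.default_rng(2)
spd = 24                                           # hourly sampling
t = np.arange(30 * spd)
series = 10 * np.sin(2 * np.pi * t / spd) + rng.normal(0, 0.5, t.size)
series[500] += 6.0                                 # artificial thermal anomaly
flags = flag_anomalies(series, spd)                # spike at index 500 is flagged
```

The MAD-based sigma keeps the threshold itself insensitive to the very extremes being sought, which matters when anomalies are rare but large.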

  6. Efficient transfer entropy analysis of non-stationary neural time series.

    PubMed

    Wollstadt, Patricia; Martínez-Zarzuela, Mario; Vicente, Raul; Díaz-Pernas, Francisco J; Wibral, Michael

    2014-01-01

    Information theory allows us to investigate information processing in neural systems in terms of information transfer, storage, and modification. In particular, the measure of information transfer, transfer entropy, has seen a dramatic surge of interest in neuroscience. Estimating transfer entropy from two processes requires the observation of multiple realizations of these processes to estimate the associated probability density functions. To obtain these necessary observations, available estimators typically assume stationarity of the processes to allow pooling of observations over time. This assumption, however, is a major obstacle to the application of these estimators in neuroscience, as observed processes are often non-stationary. As a solution, Gomez-Herrero and colleagues theoretically showed that the stationarity assumption may be avoided by estimating transfer entropy from an ensemble of realizations. Such an ensemble is often readily available in neuroscience experiments in the form of experimental trials. Thus, in this work we combine the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series. We present an efficient implementation of the approach that is suitable for the increased computational demand of the ensemble method's practical application. In particular, we use a massively parallel implementation for a graphics processing unit to handle the computationally most heavy aspects of the ensemble method for transfer entropy estimation. We test the performance and robustness of our implementation on data from numerical simulations of stochastic processes. We also demonstrate the applicability of the ensemble method to magnetoencephalographic data. 
While we mainly evaluate the proposed method for neuroscience data, we expect it to be applicable in a variety of fields that are concerned with the analysis of information transfer in complex biological, social, and

  7. CANONICAL CORRELATION ANALYSIS BETWEEN TIME SERIES AND STATIC OUTCOMES, WITH APPLICATION TO THE SPECTRAL ANALYSIS OF HEART RATE VARIABILITY.

    PubMed

    Krafty, Robert T; Hall, Martica

    2013-03-01

    Although many studies collect biomedical time series signals from multiple subjects, there is a dearth of models and methods for assessing the association between frequency domain properties of time series and other study outcomes. This article introduces the random Cramér representation as a joint model for collections of time series and static outcomes where power spectra are random functions that are correlated with the outcomes. A canonical correlation analysis between cepstral coefficients and static outcomes is developed to provide a flexible yet interpretable measure of association. Estimates of the canonical correlations and weight functions are obtained from a canonical correlation analysis between the static outcomes and maximum Whittle likelihood estimates of truncated cepstral coefficients. The proposed methodology is used to analyze the association between the spectrum of heart rate variability and measures of sleep duration and fragmentation in a study of older adults who serve as the primary caregiver for their ill spouse. PMID:24851143
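The estimation route described above can be sketched end to end: per-subject cepstral coefficients (inverse FFT of the log periodogram) are paired with a static outcome. For brevity the Whittle-likelihood estimates are replaced by a plain periodogram, and because the outcome here is scalar, the first canonical correlation reduces to the multiple correlation of regressing the outcome on the cepstral features; all data are synthetic.

```python
# Cepstral features per subject + canonical correlation with a static outcome.
import numpy as np

def cepstral_coeffs(x, n_coeffs):
    """Leading cepstral coefficients: inverse FFT of the log periodogram."""
    periodogram = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    return np.fft.irfft(np.log(periodogram + 1e-20))[:n_coeffs]

rng = np.random.default_rng(3)
n_subj, n_time = 100, 256
features, outcome = [], []
for _ in range(n_subj):
    a = rng.uniform(0.1, 0.9)          # AR(1) coefficient shapes the spectrum
    x = np.zeros(n_time)
    for t in range(1, n_time):
        x[t] = a * x[t - 1] + rng.standard_normal()
    features.append(cepstral_coeffs(x, 8))
    outcome.append(a + rng.normal(0.0, 0.05))   # outcome tied to the spectrum

X = np.array(features)
y = np.array(outcome)
Xc = X - X.mean(axis=0)
yc = y - y.mean()

# With a single static outcome, the first canonical correlation equals the
# multiple correlation of the least-squares regression of y on the features.
beta, *_ = np.linalg.lstsq(Xc, yc, rcond=None)
r = np.corrcoef(Xc @ beta, yc)[0, 1]   # high: spectrum predicts the outcome
```

With multivariate outcomes (as in the sleep study above) a full CCA between the two blocks replaces the regression, yielding weight functions on the cepstral side that are interpretable as spectral contrasts.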

  8. Endoscopic approaches to brainstem cavernous malformations: Case series and review of the literature

    PubMed Central

    Nayak, Nikhil R.; Thawani, Jayesh P.; Sanborn, Matthew R.; Storm, Phillip B.; Lee, John Y.K.

    2015-01-01

    Background: Symptomatic cavernous malformations involving the brainstem are frequently difficult to access via traditional methods. Conventional skull-base approaches require significant brain retraction or bone removal to provide an adequate operative corridor. While there has been a trend toward limited employment of the most invasive surgical approaches, recent advances in endoscopic technology may complement existing methods to access these difficult to reach areas. Case Descriptions: Four consecutive patients were treated for symptomatic, hemorrhagic brainstem cavernous malformations via fully endoscopic approaches (endonasal, transclival; retrosigmoid; lateral supracerebellar, infratentorial; endonasal, transclival). Together, these lesions encompassed all three segments of the brainstem. Three of the patients had complete resection of the cavernous malformation, while one patient had stable residual at long-term follow up. Associated developmental venous anomalies were preserved in the two patients where one was identified preoperatively. Three of the four patients maintained stable or improved neurological examinations following surgery, while one patient experienced ipsilateral palsies of cranial nerves VII and VIII. The first transclival approach resulted in a symptomatic cerebrospinal fluid leak requiring re-operation, but the second did not. Although there are challenges associated with endoscopic approaches, relative to our prior microsurgical experience with similar cases, visualization and illumination of the surgical corridors were superior without significant limitations on operative mobility. Conclusion: The endoscope is a promising adjunct to the neurosurgeon's ability to approach difficult to access brainstem cavernous malformations. It allows the surgeon to achieve well-illuminated, panoramic views, and by combining approaches, can provide minimally invasive access to most regions of the brainstem. PMID:25984383

  9. Time series analysis and feature extraction techniques for structural health monitoring applications

    NASA Astrophysics Data System (ADS)

    Overbey, Lucas A.

    Recently, advances in sensing and sensing methodologies have led to the deployment of multiple sensor arrays on structures for structural health monitoring (SHM) applications. Appropriate feature extraction, detection, and classification methods based on measurements obtained from these sensor networks are vital to the SHM paradigm. This dissertation focuses on a multi-input/multi-output approach to novel data processing procedures to produce detailed information about the integrity of a structure in near real-time. The studies employ nonlinear time series analysis techniques to extract three different types of features for damage diagnostics: namely, nonlinear prediction error, transfer entropy, and the generalized interdependence. These features form reliable measures of generalized correlations between multiple measurements to capture aspects of the dynamics related to the presence of damage. Several analyses are conducted on each of these features. Specifically, variations of nonlinear prediction error are introduced, analyzed, and validated, including the use of a stochastic excitation to augment generality, introduction of local state-space models for sensitivity enhancement, and the employment of comparisons between multiple measurements for localization capability. A modification and enhancement to transfer entropy is created and validated for improved sensitivity. In addition, a thorough analysis of the effects of variability to transfer entropy estimation is made. The generalized interdependence is introduced into the literature and validated as an effective measure of damage presence, extent, and location. These features are validated on a multi-degree-of-freedom dynamic oscillator and several different frame experiments. The evaluated features are then fed into four different classification schemes to obtain a concurrent set of outputs that categorize the integrity of the structure, e.g. the presence, extent, location, and type of damage, taking

  10. Towards Solving the Mixing Problem in the Decomposition of Geophysical Time Series by Independent Component Analysis

    NASA Technical Reports Server (NTRS)

    Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)

    2000-01-01

    The use of the Principal Component Analysis technique for the analysis of geophysical time series has been questioned, in particular for its tendency to extract components that mix several physical phenomena even when the signal is just their linear sum. We demonstrate with a data simulation experiment that Independent Component Analysis, a recently developed technique, is able to solve this problem. This new technique requires the statistical independence of the components, a stronger constraint that uses higher-order statistics, instead of the classical decorrelation, a weaker constraint that uses only second-order statistics. Furthermore, ICA does not require additional a priori information such as the localization constraint used in Rotational Techniques.
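The mixing problem can be reproduced numerically: after whitening (which uses up all second-order information), the sources are still mixed by an unknown rotation, and it is the higher-order statistics that pin that rotation down. The sketch below scans the 2-D rotation angle for maximal excess kurtosis; it is a didactic stand-in for FastICA-style algorithms, not a production ICA.

```python
# Two independent non-Gaussian sources, linearly mixed; decorrelation alone
# cannot unmix them, maximizing non-Gaussianity can.
import numpy as np

rng = np.random.default_rng(4)
n = 20000
s1 = np.sign(rng.standard_normal(n)) * rng.exponential(1.0, n)  # Laplace-like
s2 = rng.uniform(-1.0, 1.0, n)                                  # sub-Gaussian
X = np.array([[1.0, 0.6], [0.4, 1.0]]) @ np.vstack([s1, s2])    # mixed signals

# Whitening: decorrelate and normalize (all second-order structure removed).
Xc = X - X.mean(axis=1, keepdims=True)
evals, evecs = np.linalg.eigh(Xc @ Xc.T / n)
Z = np.diag(evals ** -0.5) @ evecs.T @ Xc   # white, but still a rotated mixture

def kurt(u):
    return np.mean(u ** 4) - 3.0            # excess kurtosis

# Every rotation of Z stays white: scan angles for maximal non-Gaussianity.
angles = np.linspace(0, np.pi, 1800)
best = max(angles, key=lambda a: abs(kurt(np.cos(a) * Z[0] + np.sin(a) * Z[1])))
est = np.cos(best) * Z[0] + np.sin(best) * Z[1]

# The recovered component matches one true source up to sign and scale.
c1 = abs(np.corrcoef(est, s1)[0, 1])
c2 = abs(np.corrcoef(est, s2)[0, 1])
```

The heavy-tailed source is recovered almost perfectly (c1 near 1), while any single whitened or PCA component remains a blend of both sources.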

  11. Quantifying surface water-groundwater interactions using time series analysis of streambed thermal records: Method development

    USGS Publications Warehouse

    Hatch, C.E.; Fisher, A.T.; Revenaugh, J.S.; Constantz, J.; Ruehl, C.

    2006-01-01

    We present a method for determining streambed seepage rates using time series thermal data. The new method is based on quantifying changes in phase and amplitude of temperature variations between pairs of subsurface sensors. For a reasonable range of streambed thermal properties and sensor spacings, the time series method should allow reliable estimation of seepage rates over a range of at least ±10 m d-1 (±1.2 × 10-4 m s-1), with amplitude variations being most sensitive at low flow rates and phase variations retaining sensitivity out to much higher rates. Compared to forward modeling, the new method requires less observational data and less setup and data handling and is faster, particularly when interpreting many long data sets. The time series method is insensitive to streambed scour and sedimentation, which allows for application under a wide range of flow conditions and allows time series estimation of variable streambed hydraulic conductivity. This new approach should facilitate wider use of thermal methods and improve understanding of the complex spatial and temporal dynamics of surface water-groundwater interactions. Copyright 2006 by the American Geophysical Union.
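The observables the method inverts, the amplitude ratio and phase shift of the diurnal temperature signal between a shallow and a deep sensor, can be extracted from the Fourier coefficient at the diurnal frequency. The sketch below uses synthetic sensor records; mapping (ratio, shift) to a seepage rate requires the conduction-advection model of the paper and is not reproduced here.

```python
# Amplitude ratio and phase lag of the diurnal harmonic between two sensors.
import numpy as np

def diurnal_component(temps, samples_per_day):
    """Complex Fourier coefficient of the once-per-day harmonic."""
    cycles = len(temps) // samples_per_day      # whole days in the record
    return np.fft.rfft(temps - np.mean(temps))[cycles]

spd, days = 96, 10                              # 15-minute sampling, 10 days
t = np.arange(spd * days) / spd                 # time in days
shallow = 5.0 * np.cos(2 * np.pi * t) + 20.0
deep = 2.0 * np.cos(2 * np.pi * (t - 0.125)) + 20.0   # damped, 3-hour lag

c_sh = diurnal_component(shallow, spd)
c_dp = diurnal_component(deep, spd)
amp_ratio = abs(c_dp) / abs(c_sh)                      # 0.4
phase_lag_days = np.angle(c_sh / c_dp) / (2 * np.pi)   # 0.125 (i.e. 3 hours)
```

As the abstract notes, the amplitude ratio carries most of the information at low seepage rates, while the phase lag stays informative at high rates.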

  12. The octave approach to EEG analysis.

    PubMed

    Stassen, H H

    1991-10-01

    A "tonal" approach to EEG spectral analysis is presented which is compatible with the concept of physical octaves, thus providing a constant resolution of partial tones over the full frequency range inherent to human brain waves, rather than at equidistant frequency steps in the spectral domain. The specific advantages of the tonal approach mainly pay off in the field of EEG sleep analysis, where the interesting information is predominantly located in the lower octaves. In such cases the proposed method reveals a fine structure which displays regular maxima possessing typical properties of "overtones" within the three octaves 1-2 Hz, 2-4 Hz, and 4-8 Hz. Accordingly, spectral patterns derived from tonal spectral analyses are particularly suited to measuring the fine gradations of mutual differences between individual EEG sleep patterns and will therefore allow a more efficient investigation of the genetically determined proportion of sleep EEGs. On the other hand, we also tested the efficiency of tonal spectral analysis on the basis of our 5-year follow-up data of 30 healthy volunteers. It turned out that 28 persons (93.3%) could be uniquely recognized after five years by means of their EEG spectral patterns. Hence, tonal spectral analysis proved to be a powerful tool also in cases where the main EEG information is typically located in the medium octave 8-16 Hz. PMID:1762585
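The octave idea can be sketched by aggregating spectral power in bands whose width doubles each octave (1-2, 2-4, 4-8, 8-16 Hz), giving constant relative rather than constant absolute frequency resolution. The band edges follow the octaves named in the abstract; the signal and sampling rate below are illustrative assumptions.

```python
# Octave-band power summary of a periodogram: constant relative resolution.
import numpy as np

def octave_band_power(x, fs, edges=(1, 2, 4, 8, 16)):
    """Mean periodogram power within each octave band [edges[i], edges[i+1])."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / (fs * len(x))
    return [psd[(freqs >= lo) & (freqs < hi)].mean()
            for lo, hi in zip(edges[:-1], edges[1:])]

fs, dur = 128, 30
t = np.arange(fs * dur) / fs
rng = np.random.default_rng(5)
# Alpha-like 10 Hz rhythm on top of weak broadband noise.
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
powers = octave_band_power(eeg, fs)   # the 8-16 Hz octave dominates
```

A full tonal analysis would subdivide each octave further into partial tones, which is what resolves the "overtone" fine structure described above.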

  13. Completion and continuation of nonlinear traffic time series: a probabilistic approach

    NASA Astrophysics Data System (ADS)

    Belomestny, D.; Jentsch, V.; Schreckenberg, M.

    2003-11-01

    When dealing with nonlinear time series of car traffic on highways, one of the outstanding problems to be solved is completion and continuation of data in space and time. To this end, the underlying process is decomposed into stochastic and deterministic components. The former is approximated by Gaussian white noise, while the latter refers, apart from always existing trends, to the space- and time-dependent jam propagation process. Jams are modelled in terms of dynamical Bayesian networks with radial basis functions involved. The models developed are used to tackle travel time estimation and prediction. Results are obtained for one of the most crowded traffic areas of Europe, namely the ring-like highway around Cologne.

  14. Sensitivity analysis for nonrandom dropout: a local influence approach.

    PubMed

    Verbeke, G; Molenberghs, G; Thijs, H; Lesaffre, E; Kenward, M G

    2001-03-01

    Diggle and Kenward (1994, Applied Statistics 43, 49-93) proposed a selection model for continuous longitudinal data subject to nonrandom dropout. It has provoked a large debate about the role for such models. The original enthusiasm was followed by skepticism about the strong but untestable assumptions on which this type of model invariably rests. Since then, the view has emerged that these models should ideally be made part of a sensitivity analysis. This paper presents a formal and flexible approach to such a sensitivity assessment based on local influence (Cook, 1986, Journal of the Royal Statistical Society, Series B 48, 133-169). The influence of perturbing a missing-at-random dropout model in the direction of nonrandom dropout is explored. The method is applied to data from a randomized experiment on the inhibition of testosterone production in rats. PMID:11252620

  15. Responses to Defense Cutbacks: The Worker Mobility Approach. Research and Evaluation Report Series 97-D.

    ERIC Educational Resources Information Center

    Social Policy Research Associates, Menlo Park, CA.

    Of the 19 projects conducted as part of the Defense Conversion Adjustment (DCA) Demonstration administered by the U.S. Department of Labor's Office of Work-Based Learning, 8 tested the worker mobility approach. The projects, which shared the common goal of helping dislocated defense workers find high-quality jobs, tested one or more of the…

  16. Responses to Defense Cutbacks: The Dislocation Aversion Approach. Research and Evaluation Report Series 97-C.

    ERIC Educational Resources Information Center

    Social Policy Research Associates, Menlo Park, CA.

    Of the 19 projects conducted as part of the Defense Conversion Adjustment (DCA) Demonstration administered by the U.S. Department of Labor's Office of Work-Based Learning, 9 tested the dislocation aversion approach. The projects attempted to alleviate the negative impacts of defense cutbacks on communities, firms, and workers. Six projects…

  17. Responses to Defense Cutbacks: The Community Planning Approach. Research and Evaluation Report Series 97-B.

    ERIC Educational Resources Information Center

    Social Policy Research Associates, Menlo Park, CA.

    Of the 19 projects conducted as part of the Defense Conversion Adjustment (DCA) Demonstration administered by the U.S. Department of Labor's Office of Work-Based Learning, 5 tested the community planning approach. The projects attempted to alleviate the negative impacts of defense cutbacks on communities, firms, and workers. The project sites…

  18. Innovative Approaches in Rural Education. Rural Information Center Publication Series, No. 72.

    ERIC Educational Resources Information Center

    Tuthill, Shirley J., Comp.

    This publication contains an annotated bibliography and information on journals, organizations, and resources related to new approaches in rural education. The annotated bibliography describes 97 journal articles, ERIC digests, research reports, programs descriptions and evaluations, government reports, conference papers, and grant guides. Most…

  19. The Public Library and Literacy: A Community Based Approach. Occasional Paper Series 2, No. 3.

    ERIC Educational Resources Information Center

    Szudy, Thomas; Pritchard, Linda L., Ed.

    This report presents the steps and actions that public libraries may take to begin to meet the needs for literacy training in their individual communities. Outlined are three major approaches that libraries can consider when addressing the literacy needs of their communities: (1) initiating a literacy program; (2) cooperating with an existing…

  20. Drama for Learning: Dorothy Heathcote's Mantle of the Expert Approach to Education. Dimensions of Drama Series.

    ERIC Educational Resources Information Center

    Heathcote, Dorothy; Bolton, Gavin

    This book describes how theater can create an impetus for productive learning across the curriculum. Dorothy Heathcote's "mantle of the expert" approach is discussed in which teachers and students explore, in role, the knowledge they already have about a problem or task while making new discoveries along the way. The book also presents a variety…

  1. Faces of Change. Visual Evidence: An Instructional Approach. Instructor's Notes: Film/Essay Series.

    ERIC Educational Resources Information Center

    Miller, Norman N.

    Designed for use with the multidisciplinary film project, "Faces of Change, Five Rural Societies in Transition" for the college social studies curriculum, this manual contains an overview of the material and its underlying philosophy and suggests teaching strategies. The first section discusses the overall approach, the use of films in teaching,…

  2. Assessment of Social Competence, Adaptive Behaviors, and Approaches to Learning with Young Children. Working Paper Series.

    ERIC Educational Resources Information Center

    Meisels, Samuel J.; Atkins-Burnett, Sally; Nicholson, Julie

    Prepared in support of the Early Childhood Longitudinal Study (ECLS), which will examine children's early school experiences beginning with kindergarten, this working paper focuses on research regarding the measurement of young children's social competence, adaptive behavior, and approaches to learning. The paper reviews the key variables and…

  3. The Adolescent Community Reinforcement Approach for Adolescent Cannabis Users, Cannabis Youth Treatment (CYT) Series, Volume 4.

    ERIC Educational Resources Information Center

    Godley, Susan Harrington; Meyers, Robert J.; Smith, Jane Ellen; Karvinen, Tracy; Titus, Janet C.; Godley, Mark D.; Dent, George; Passetti, Lora; Kelberg, Pamela

    This publication was written for therapists and their supervisors who may want to implement the adolescent community reinforcement approach intervention, which was one of the five interventions tested by the Center for Substance Abuse Treatment's (CSAT's) Cannabis Youth Treatment (CYT) Project. The CYT Project provided funding to support a study…

  4. Evaluating Assistive Technology in Early Childhood Education: The Use of a Concurrent Time Series Probe Approach

    ERIC Educational Resources Information Center

    Parette, Howard P.; Blum, Craig; Boeckmann, Nichole M.

    2009-01-01

    As assistive technology applications are increasingly implemented in early childhood settings for children who are at risk or who have disabilities, it is critical that teachers utilize observational approaches to determine whether targeted assistive technology-supported interventions make a difference in children's learning. One structured…

  5. Phonics, Spelling, and Word Study: A Sensible Approach. The Bill Harp Professional Teachers Library Series.

    ERIC Educational Resources Information Center

    Glazer, Susan Mandel

    This concise book shares several sensible, logical, and meaningful approaches that guide young children to use the written coding system to read, spell, and make meaning of the English language coding system. The book demonstrates that phonics, spelling, and word study are essential parts of literacy learning. After an introduction, chapters are:…

  6. THE MENTALLY RETARDED CHILD, A PSYCHOLOGICAL APPROACH. MCGRAW-HILL SERIES IN PSYCHOLOGY.

    ERIC Educational Resources Information Center

    ROBINSON, HALBERT B.; ROBINSON, NANCY M.

    PRESENTING A PSYCHOLOGICAL APPROACH TO MENTAL RETARDATION, THIS TEXT BEGINS WITH A DISCUSSION OF THEORIES OF INTELLIGENCE, PROBLEMS OF DEFINITION, AND THE CURRENT STATUS OF THE FIELD OF MENTAL RETARDATION. A SECTION ON ETIOLOGY AND SYNDROMES PRESENTS INFORMATION ON GENETIC FACTORS AND GENETIC SYNDROMES AND THE PHYSICAL AND PSYCHOLOGICAL…

  7. Place as Text: Approaches to Active Learning. 2nd Edition. National Collegiate Honors Council Monograph Series

    ERIC Educational Resources Information Center

    Braid, Bernice, Ed.; Long, Ada, Ed.

    2010-01-01

    The decade since publication of "Place as Text: Approaches to Active Learning" has seen an explosion of interest and productivity in the field of experiential education. This monograph presents a story of an experiment and a blueprint of sorts for anyone interested in enriching an existing program or willing to experiment with pedagogy…

  8. Positive Approaches to Behavior Management: Monograph 5. Monograph Series in Behavior Disorders.

    ERIC Educational Resources Information Center

    Eyde, Donna R.

    Prevention and problem solving approaches to behavior management in classrooms for behaviorally disordered (BD) students are reviewed. Attention is focused on positive strategies teachers can use to manage inappropriate behavior and to teach students alternative appropriate behaviors. The following components of prevention that contribute to a…

  9. Communication for the Workplace: An Integrated Language Approach. Second Edition. Job Skills. Net Effect Series.

    ERIC Educational Resources Information Center

    Ettinger, Blanche; Perfetto, Edda

    Using a developmental, hands-on approach, this text/workbook helps students master the basic English skills that are essential to write effective business correspondence, to recognize language errors, and to develop decision-making and problem-solving skills. Its step-by-step focus and industry-specific format encourage students to review,…

  10. Breaking the Boundaries: A One-World Approach to Planning Education. Urban Innovation Abroad Series.

    ERIC Educational Resources Information Center

    Sanyal, Bishwapriya, Ed.

    Chapters in this collection present the views of academics and professionals from the Third World who have received their planning education in the west and who now hold posts in western urban and regional planning schools. They discuss the need for a radically changed curriculum based on a comparative, one-world approach to planning education.…

  11. Local Rainfall Forecast System based on Time Series Analysis and Neural Networks

    NASA Astrophysics Data System (ADS)

    Buendia, Fulgencio S.; Tarquis, A. M.; Buendia, G.; Andina, D.

    2010-05-01

    Rainfall is one of the most important weather events in the daily life of human beings. For several decades, scientists have been trying to characterize the weather; current forecasts are based on highly complex dynamic models. This paper presents a local rainfall forecast system based on time series analysis and neural networks. The model aims to complement the current state-of-the-art ensembles from a locally historical perspective, in which the model definition is less dependent on the exact values of the initial conditions. After several years of data collection, expert meteorologists proposed this approximation to characterize local weather behavior, which this system automates in several stages. Although the whole system is introduced, the focus is on the classification of rainfall event situations as well as on time series analysis and forecasting.

  12. Adventures in Modern Time Series Analysis: From the Sun to the Crab Nebula and Beyond

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey

    2014-01-01

    With the generation of long, precise, and finely sampled time series, the Age of Digital Astronomy is uncovering and elucidating energetic dynamical processes throughout the Universe. Exploiting these opportunities requires effective data analysis techniques that rapidly and automatically implement advanced concepts. The Time Series Explorer, under development in collaboration with Tom Loredo, provides tools ranging from simple but optimal histograms to time- and frequency-domain analysis for arbitrary data modes with any time sampling. Much of this development owes its existence to Joe Bredekamp and the encouragement he provided over several decades. Sample results for solar chromospheric activity, gamma-ray activity in the Crab Nebula, active galactic nuclei, and gamma-ray bursts will be displayed.

  13. Parametric time-series analysis of daily air pollutants of city of Shumen, Bulgaria

    NASA Astrophysics Data System (ADS)

    Ivanov, A.; Voynikova, D.; Gocheva-Ilieva, S.; Boyadzhiev, D.

    2012-10-01

    Urban air pollution is one of the main factors determining ambient air quality, which affects human health and the environment. In this paper, parametric time series models are obtained for studying the distribution over time of primary pollutants such as sulphur and nitrogen oxides and particulate matter, and of a secondary pollutant, ground-level ozone, in the town of Shumen, Bulgaria. The methods of factor analysis and ARIMA are used to carry out the time series analysis based on hourly average data from 2011 and the first quarter of 2012. The constructed models are applied to short-term air pollution forecasting. The results are evaluated against national and European regulatory indices. The sources of pollutants in the region and their harmful effects on human health are also discussed.
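
    The autoregressive core of such ARIMA forecasting can be illustrated with a minimal sketch. This is not the authors' model: it fits only a plain AR(1) recurrence by ordinary least squares and iterates it for a short-term forecast; the function names and toy data are illustrative.

```python
def fit_ar1(series):
    """Fit y_t = c + a*y_{t-1} by ordinary least squares."""
    x, y = series[:-1], series[1:]          # lagged values vs. current values
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    a = cov / var
    return my - a * mx, a                   # (intercept c, coefficient a)

def forecast(series, c, a, steps):
    """Iterate the fitted recurrence for a short-term forecast."""
    out, last = [], series[-1]
    for _ in range(steps):
        last = c + a * last
        out.append(last)
    return out
```

    For real pollutant data one would also difference the series and add moving-average terms (the I and MA parts of ARIMA); this bare AR(1) only conveys the autoregressive idea.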

  14. Time-series analysis of nonstationary plasma fluctuations using wavelet transforms

    SciTech Connect

    Santoso, S.; Powers, E.J.; Bengtson, R.D.; Ouroua, A.

    1997-01-01

    A wavelet or time-scale approach to analyzing a single time series and two time series, in which the fluctuating quantities are statistically nonstationary, is presented. The time-scale and scale "power spectra" are introduced and utilized to analyze transient potential fluctuations measured at the core of sawtoothing TEXT-U plasmas. The results show features that have not been previously observed using any Fourier techniques. In addition, the linear time-scale "coherence spectrum" is developed to quantify the degree of linear relationship between two nonstationary fluctuating quantities in the time-scale domain. Such a spectrum is also useful in tracking the time-varying phase difference. A numerical example is provided to demonstrate the efficacy of the time-scale spectra. © 1997 American Institute of Physics.
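
    The idea of a "scale power spectrum" can be sketched with the simplest wavelet, the Haar basis: one decomposition pass per dyadic scale, recording the mean squared detail coefficients. This toy stand-in only conveys the concept and is not the authors' analysis.

```python
def haar_scale_power(x):
    """Crude 'scale power spectrum': mean squared Haar detail
    coefficients at each dyadic scale of the series."""
    powers, approx = [], list(x)
    while len(approx) >= 2:
        pairs = [(approx[i], approx[i + 1]) for i in range(0, len(approx) - 1, 2)]
        detail = [(a - b) / 2 ** 0.5 for a, b in pairs]   # high-pass at this scale
        approx = [(a + b) / 2 ** 0.5 for a, b in pairs]   # low-pass, halved length
        powers.append(sum(d * d for d in detail) / len(detail))
    return powers
```

    A rapidly alternating series concentrates all its power at the finest scale, whereas a slow drift pushes power to the coarse scales; tracking these powers over sliding windows is what makes the approach usable for nonstationary data.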

  15. Flood Frequency Analysis For Partial Duration Series In Ganjiang River Basin

    NASA Astrophysics Data System (ADS)

    zhangli, Sun; xiufang, Zhu; yaozhong, Pan

    2016-04-01

    Accurate estimation of flood frequency is key to effective, nationwide flood damage abatement programs. The partial duration series (PDS) method is widely used in hydrologic studies because it considers all events above a certain threshold level, as compared to the annual maximum series (AMS) method, which considers only the annual maximum value. However, the PDS has a drawback in that it is difficult to define the thresholds and maintain an independent and identical distribution of the partial duration time series; this drawback is discussed in this paper. The Ganjiang River is the seventh largest tributary of the Yangtze River, the longest river in China. The Ganjiang River covers a drainage area of 81,258 km² at the Wanzhou hydrologic station as the basin outlet. In this work, 56 years of daily flow data (1954-2009) from the Wanzhou station were used to analyze flood frequency, and the Pearson-III model was employed as the hydrologic probability distribution. Generally, three tasks were accomplished: (1) the threshold of PDS by percentile rank of daily runoff was obtained; (2) trend analysis of the flow series was conducted using PDS; and (3) flood frequency analysis was conducted for partial duration flow series. The results showed a slight upward trend of the annual runoff in the Ganjiang River basin. The maximum flow with a 0.01 exceedance probability (corresponding to a 100-year flood peak under stationary conditions) was 20,000 m³/s, while that with a 0.1 exceedance probability was 15,000 m³/s. These results will serve as a guide to hydrological engineering planning, design, and management for policymakers and decision makers associated with hydrology.
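
    The threshold-and-declustering step of a PDS extraction can be sketched as follows. The 90th-percentile threshold and 7-day independence gap below are illustrative choices, not the values used in the study, and the declustering rule is deliberately crude.

```python
def percentile(values, q):
    """Linear-interpolation percentile (0 <= q <= 100), pure Python."""
    s = sorted(values)
    idx = (len(s) - 1) * q / 100.0
    lo, hi = int(idx), min(int(idx) + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (idx - lo)

def partial_duration_series(flows, q=90.0, min_gap=7):
    """Return (threshold, peaks): peak flow of each exceedance cluster,
    clusters separated by at least `min_gap` days (a simple independence rule)."""
    thr = percentile(flows, q)
    peaks, last_day = [], None
    for day, flow in enumerate(flows):
        if flow > thr:
            if last_day is not None and day - last_day < min_gap and peaks:
                peaks[-1] = max(peaks[-1], flow)   # same event: keep the larger peak
            else:
                peaks.append(flow)                 # new independent event
            last_day = day
    return thr, peaks
```

    The resulting peaks would then be fitted with a distribution such as the Pearson-III model used in the paper.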

  16. Testing the utility of a geographical profiling approach in three rape series of a single offender: a case study.

    PubMed

    Santtila, Pekka; Zappalà, Angelo; Laukkanen, Manne; Picozzi, Massimo

    2003-01-01

    The purpose of the study was to test the potential utility of a geographical profiling approach for three separate series of rapes committed by a single offender. Two mathematical distance-decay functions were applied to each series: a normal distribution with a mean distance and standard deviation based on previous research, and a truncated negative exponential function based on distances between crime sites in the series under investigation. Each produced prioritised search areas whose accuracy was then assessed. The prioritised area that had to be searched before the home base of the offender could be located varied from 7.60 km² (2.15% of the total search area) at best to 151.10 km² (42.66%) at worst for the normal distribution based on previous research, and from 42.06 km² (11.88%) to no improvement when the truncated negative exponential function was used. The functions showed less predictive ability when the offender was a commuter. Explanations for the variations in the findings, as well as suggestions for improvements, are outlined in the discussion. PMID:12505470
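
    The distance-decay scoring behind such prioritised search areas can be sketched as below. The decay rate and toy coordinates are illustrative, and this simplified score omits the buffer zone of a true truncated negative exponential.

```python
import math

def profile_scores(crime_sites, grid, decay=0.5):
    """Rank candidate cells by a negative-exponential distance-decay score
    summed over crime sites; higher-scoring cells are searched first."""
    def score(cell):
        return sum(math.exp(-decay * math.dist(cell, site)) for site in crime_sites)
    return sorted(grid, key=score, reverse=True)
```

    Walking down the ranked list until the offender's home base is reached gives the "area searched" statistic reported in the abstract.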

  17. Developing a complex independent component analysis technique to extract non-stationary patterns from geophysical time-series

    NASA Astrophysics Data System (ADS)

    Forootan, Ehsan; Kusche, Jürgen

    2016-04-01

    Geodetic/geophysical observations, such as time series of global terrestrial water storage change or of sea level and temperature change, represent samples of physical processes and therefore contain information about complex physical interactions with many inherent time scales. Extracting relevant information from these samples, for example quantifying the seasonality of a physical process or its variability due to large-scale ocean-atmosphere interactions, is not possible with simple time series approaches. In recent decades, decomposition techniques have attracted increasing interest for extracting patterns from geophysical observations. Traditionally, principal component analysis (PCA) and, more recently, independent component analysis (ICA) are common techniques to extract statistically orthogonal (uncorrelated) and independent modes that represent the maximum variance of observations, respectively. PCA and ICA can be classified as stationary signal decomposition techniques, since they are based on decomposing the auto-covariance matrix or on diagonalizing higher (than two)-order statistical tensors from centered time series. However, the stationarity assumption is obviously not justifiable for many geophysical and climate variables, even after removing cyclic components, e.g., the seasonal cycles. In this paper, we present a new decomposition method, complex independent component analysis (CICA, Forootan, PhD-2014), which can be applied to extract non-stationary (changing in space and time) patterns from geophysical time series. Here, CICA is derived as an extension of real-valued ICA (Forootan and Kusche, JoG-2012), where we (i) define a new complex data set using a Hilbert transformation. The complex time series contain the observed values in their real part, and the temporal rate of variability in their imaginary part. (ii) An ICA algorithm based on diagonalization of fourth-order cumulants is then applied to decompose the new complex data set in (i
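
    Step (i), forming a complex series via the Hilbert transform, can be sketched with a direct DFT (O(n²), fine for illustration): zeroing negative frequencies and doubling positive ones yields the analytic signal, whose imaginary part carries the quadrature (rate-of-variability) component. This covers only the Hilbert step, not the fourth-order-cumulant ICA.

```python
import cmath, math

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def analytic_signal(x):
    """Complex series: real part is x, imaginary part its Hilbert transform."""
    n = len(x)
    X = dft(x)
    H = [0.0] * n
    H[0] = 1.0                       # keep DC
    if n % 2 == 0:
        H[n // 2] = 1.0              # keep Nyquist
        for k in range(1, n // 2):
            H[k] = 2.0               # double positive frequencies
    else:
        for k in range(1, (n + 1) // 2):
            H[k] = 2.0
    return idft([Xk * h for Xk, h in zip(X, H)])
```

    For a pure cosine the result is the complex exponential: the imaginary part is the corresponding sine, i.e. the signal shifted by a quarter cycle.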

  18. Surgical Approaches to First Branchial Cleft Anomaly Excision: A Case Series.

    PubMed

    Quintanilla-Dieck, Lourdes; Virgin, Frank; Wootten, Chistopher; Goudy, Steven; Penn, Edward

    2016-01-01

    Objectives. First branchial cleft anomalies (BCAs) constitute a rare entity with variable clinical presentations and anatomic findings. Given the high rate of recurrence with incomplete excision, identification of the entire tract during surgical treatment is of paramount importance. The objectives of this paper were to present five anatomic variations of first BCAs and describe the presentation, evaluation, and surgical approach to each one. Methods. A retrospective case review and literature review were performed. We describe patient characteristics, presentation, evaluation, and surgical approach of five patients with first BCAs. Results. Age at definitive surgical treatment ranged from 8 months to 7 years. Various clinical presentations were encountered, some of which were atypical for first BCAs. All had preoperative imaging demonstrating the tract. Four surgical approaches required a superficial parotidectomy with identification of the facial nerve, one of which revealed an aberrant facial nerve. In one case the tract was found to travel into the angle of the mandible, terminating as a mandibular cyst. This required en bloc excision that included the lateral cortex of the mandible. Conclusions. First BCAs have variable presentations. Complete surgical excision can be challenging. Therefore, careful preoperative planning and the recognition of atypical variants during surgery are essential. PMID:27034873

  19. Surgical Approaches to First Branchial Cleft Anomaly Excision: A Case Series

    PubMed Central

    Quintanilla-Dieck, Lourdes; Virgin, Frank; Wootten, Chistopher; Goudy, Steven; Penn, Edward

    2016-01-01

    Objectives. First branchial cleft anomalies (BCAs) constitute a rare entity with variable clinical presentations and anatomic findings. Given the high rate of recurrence with incomplete excision, identification of the entire tract during surgical treatment is of paramount importance. The objectives of this paper were to present five anatomic variations of first BCAs and describe the presentation, evaluation, and surgical approach to each one. Methods. A retrospective case review and literature review were performed. We describe patient characteristics, presentation, evaluation, and surgical approach of five patients with first BCAs. Results. Age at definitive surgical treatment ranged from 8 months to 7 years. Various clinical presentations were encountered, some of which were atypical for first BCAs. All had preoperative imaging demonstrating the tract. Four surgical approaches required a superficial parotidectomy with identification of the facial nerve, one of which revealed an aberrant facial nerve. In one case the tract was found to travel into the angle of the mandible, terminating as a mandibular cyst. This required en bloc excision that included the lateral cortex of the mandible. Conclusions. First BCAs have variable presentations. Complete surgical excision can be challenging. Therefore, careful preoperative planning and the recognition of atypical variants during surgery are essential. PMID:27034873

  20. A disaggregate approach to crash rate analysis.

    PubMed

    Kam, Booi Hon

    2003-09-01

    This paper presents a disaggregate approach to crash rate analysis. Enumerating crash rates on a per trip-kilometer basis, the proposed method removes the linearity assumption inherent in the conventional quotient indicator of accidents per unit travel distance. The approach involves combining two disparate datasets on a geographic information systems (GIS) platform by matching accident records to a defined travel corridor. As an illustration of the methodology, travel information from the Victorian Activity and Travel Survey (VATS) and accident records contained in CrashStat were used to estimate the crash rates of Melbourne residents in different age-sex groups according to time of the day and day of the week. The results show a polynomial function of a cubic order when crash rates are plotted against age group, which contrasts distinctly with the U-shape curve generated by using the conventional aggregate quotient approach. Owing to the many assumptions adopted in the computation, this study does not claim that the results obtained are conclusive. The methodology, however, is seen as providing a framework upon which future crash risk measures could be based as the use of spatial tracking devices becomes prevalent in travel surveys. PMID:12850070

  1. Low Cost Beam-Steering Approach for a Series-Fed Array

    NASA Technical Reports Server (NTRS)

    Host, Nicholas K.; Chen, Chi-Chih; Volakis, John L.; Miranda, Felix A.

    2013-01-01

    Phased array antennas showcase many advantages over mechanically steered systems. However, they are also more complex and costly. This paper presents a concept which overcomes these detrimental attributes by eliminating all of the phased array backend (including phase shifters). Instead, a propagation-constant-reconfigurable transmission line in a series-fed array arrangement is used to allow phase shifting with one small (≤ 100 mil) linear mechanical motion. A novel slotted coplanar stripline design improves on previous transmission lines by demonstrating greater control of the propagation constant, thus allowing practical prototypes to be built. Also, beam steering pattern control is explored. We show that with the correct choice of line impedance, pattern control is possible for all scan angles. A 20-element array scanning from −25° ≤ θ ≤ 21° with mostly uniform gain at 13 GHz is presented. Measured patterns show a reduced scan range of 12° ≤ θ ≤ 25° due to a correctable manufacturing error, as verified by simulation. Beam squint is measured to be ±2.5° for a 600 MHz bandwidth and cross-pol is measured to be at least −15 dB.

  2. Information Retrieval and Graph Analysis Approaches for Book Recommendation

    PubMed Central

    Benkoussas, Chahinez; Bellot, Patrice

    2015-01-01

    A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation, based on complex user queries. We used different theoretical retrieval models, probabilistic ones such as InL2 (a Divergence from Randomness model) and a language model, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval approach over a network of related documents connected by social links. We call this network, constructed from the documents and the social information each of them provides, the Directed Graph of Documents (DGD). Specifically, this work tackles the problem of book recommendation in the context of the INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrates that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments. PMID:26504899

  3. Information Retrieval and Graph Analysis Approaches for Book Recommendation.

    PubMed

    Benkoussas, Chahinez; Bellot, Patrice

    2015-01-01

    A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation, based on complex user queries. We used different theoretical retrieval models, probabilistic ones such as InL2 (a Divergence from Randomness model) and a language model, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval approach over a network of related documents connected by social links. We call this network, constructed from the documents and the social information each of them provides, the Directed Graph of Documents (DGD). Specifically, this work tackles the problem of book recommendation in the context of the INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrates that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments. PMID:26504899
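
    The PageRank step applied to a DGD-style document graph can be sketched with a plain power iteration; the toy graph and default damping factor are illustrative, not the track data.

```python
def pagerank(links, damping=0.85, iters=50):
    """links: dict node -> list of outgoing neighbours. Returns score dict."""
    nodes = set(links) | {v for outs in links.values() for v in outs}
    n = len(nodes)
    rank = {u: 1.0 / n for u in nodes}
    for _ in range(iters):
        new = {u: (1.0 - damping) / n for u in nodes}   # teleportation mass
        for u in nodes:
            outs = links.get(u, [])
            if outs:
                share = damping * rank[u] / len(outs)
                for v in outs:
                    new[v] += share
            else:
                for v in nodes:                          # dangling node: spread uniformly
                    new[v] += damping * rank[u] / n
        rank = new
    return rank
```

    In a reranking setting, these graph scores would be interpolated with the retrieval-model scores rather than used alone.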

  4. Microcanonical thermostatistics analysis without histograms: Cumulative distribution and Bayesian approaches

    NASA Astrophysics Data System (ADS)

    Alves, Nelson A.; Morero, Lucas D.; Rizzi, Leandro G.

    2015-06-01

    Microcanonical thermostatistics analysis has become an important tool to reveal essential aspects of phase transitions in complex systems. An efficient way to estimate the microcanonical inverse temperature β(E) and the microcanonical entropy S(E) is achieved with the statistical temperature weighted histogram analysis method (ST-WHAM). The strength of this method lies in its flexibility, as it can be used to analyse data produced by algorithms with generalised sampling weights. However, for any sampling weight, ST-WHAM requires the calculation of derivatives of energy histograms H(E), which leads to non-trivial and tedious binning tasks for models with continuous energy spectra, such as those for biomolecular and colloidal systems. Here, we discuss two alternative methods that avoid the need for such energy binning to obtain continuous estimates of H(E) in order to evaluate β(E) using ST-WHAM: (i) a series expansion to estimate probability densities from the empirical cumulative distribution function (CDF), and (ii) a Bayesian approach to model this CDF. Comparison with a simple linear regression method is also carried out. The performance of these approaches is evaluated considering coarse-grained protein models for folding and peptide aggregation.
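
    The histogram-free idea, estimating a density by differentiating a model of the empirical CDF, can be sketched in its crudest form with a central finite difference. The bandwidth h is an illustrative smoothing choice, far simpler than the series-expansion or Bayesian CDF models the paper develops.

```python
import bisect

def ecdf(sample):
    """Return the empirical cumulative distribution function of a sample."""
    s = sorted(sample)
    n = len(s)
    return lambda x: bisect.bisect_right(s, x) / n   # fraction of values <= x

def density_from_cdf(F, x, h=0.25):
    """Histogram-free density estimate: central difference of the CDF."""
    return (F(x + h) - F(x - h)) / (2 * h)
```

    Because no bins are ever chosen, the estimate is defined at any x, which is the property the authors need for continuous energy spectra.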

  5. Time series analysis of satellite derived surface temperature for Lake Garda

    NASA Astrophysics Data System (ADS)

    Pareeth, Sajid; Metz, Markus; Rocchini, Duccio; Salmaso, Nico; Neteler, Markus

    2014-05-01

    Remotely sensed satellite imagery is the most suitable tool for researchers around the globe for complementing in-situ observations. Nonetheless, it is crucial to check data quality and to validate and standardize methodologies for estimating target variables from sensor data. Satellite imagery with thermal infrared bands provides the opportunity to measure temperature remotely at very high spatio-temporal scales. Monitoring the surface temperature of large lakes to understand thermal fluctuations over time is considered crucial given the current global climate change scenario. The main disadvantage of remotely sensed data is the gaps caused by clouds and aerosols. In this study we use statistically reconstructed daily land surface temperature products from MODIS (MOD11A1 and MYD11A1) at an improved spatial resolution of 250 m. The ability of the remotely sensed datasets to capture thermal variations over time is validated against historical monthly ground observation data collected for Lake Garda. The correlation between the satellite LST time series LST(x,y,t) and the field measurements f(x,y,t) is found to be in an acceptable range, with a correlation coefficient of 0.94. We compared multiple time series analysis methods applied to the temperature maps recorded over the last ten years (2002-2012) and monthly field measurements at two sampling points in Lake Garda. The time series methods STL (Seasonal-Trend decomposition based on Loess), DTW (Dynamic Time Warping), and BFAST (Breaks for Additive Season and Trend) are implemented and compared in their ability to derive changes in trends and seasonalities. These methods are mostly applied to time series of vegetation indices from satellite data, but are seldom used on thermal data because of the temporal incoherence of the data. The preliminary results show that time series methods applied to satellite data are able to reconstruct the seasons on an annual scale while giving us a
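
    A drastically simplified STL-style split (moving-average trend, per-position seasonal means) looks like the sketch below. Real STL iterates Loess fits, so this only conveys the decomposition idea; period and data are illustrative.

```python
def decompose(values, period=12):
    """Naive seasonal/trend split for a series with a fixed cycle length."""
    n = len(values)
    # Trend: moving average over one full cycle (phase-neutral).
    trend = [sum(values[i:i + period]) / period for i in range(n - period + 1)]
    # Seasonal: mean deviation of each cycle position from the overall mean.
    mean = sum(values) / n
    seasonal, counts = [0.0] * period, [0] * period
    for i, v in enumerate(values):
        seasonal[i % period] += v - mean
        counts[i % period] += 1
    seasonal = [s / c for s, c in zip(seasonal, counts)]
    return trend, seasonal
```

    For a monthly LST series, the residual left after subtracting trend and seasonal components is what break-detection methods such as BFAST then inspect.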

  6. Trend Estimation and Regression Analysis in Climatological Time Series: An Application of Structural Time Series Models and the Kalman Filter.

    NASA Astrophysics Data System (ADS)

    Visser, H.; Molenaar, J.

    1995-05-01

    The detection of trends in climatological data has become central to the discussion on climate change due to the enhanced greenhouse effect. To establish detection, a method is needed (i) to make inferences on significant rises or declines in trends, (ii) to take into account natural variability in climate series, and (iii) to compare output from GCMs with the trends in observed climate data. To meet these requirements, flexible mathematical tools are needed. A structural time series model is proposed with which a stochastic trend, a deterministic trend, and regression coefficients can be estimated simultaneously. The stochastic trend component is described using the class of ARIMA models. The regression component is assumed to be linear; however, the regression coefficients corresponding with the explanatory variables may be allowed to be time dependent in order to test this assumption. The mathematical technique used to estimate this trend-regression model is the Kalman filter. The main features of the filter are discussed. Examples of trend estimation are given using annual mean temperatures at a single station in the Netherlands (1706-1990) and annual mean temperatures at Northern Hemisphere land stations (1851-1990). The inclusion of explanatory variables is shown by regressing the latter temperature series on four variables: Southern Oscillation index (SOI), volcanic dust index (VDI), sunspot numbers (SSN), and a simulated temperature signal induced by increasing greenhouse gases (GHG). In all analyses, the influence of SSN on global temperatures is found to be negligible. The correlations between temperatures and SOI and VDI appear to be negative. For SOI, this correlation is significant, but for VDI it is not, probably because of a lack of volcanic eruptions during the sample period. The relation between temperatures and GHG is positive, which is in agreement with the hypothesis of a warming climate because of increasing levels of greenhouse gases.
The prediction performance of
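
    For the simplest structural model, a local level (pure stochastic trend) with no regression part, the Kalman filter reduces to a few lines. The variances q and r below are illustrative; the full trend-regression model in the paper augments the state vector with regression coefficients.

```python
def local_level_filter(y, q, r, m0=0.0, p0=1e6):
    """Kalman filter for the local level model:
       mu_t = mu_{t-1} + w_t,  w ~ N(0, q)   (stochastic trend)
       y_t  = mu_t + v_t,      v ~ N(0, r)   (observation noise)
    Returns the filtered trend estimates."""
    m, p, out = m0, p0, []
    for obs in y:
        p = p + q                    # predict: trend variance grows
        k = p / (p + r)              # Kalman gain
        m = m + k * (obs - m)        # update with the new observation
        p = (1 - k) * p
        out.append(m)
    return out
```

    The ratio q/r controls trend smoothness: small q/r gives a nearly deterministic trend, large q/r lets the trend track the data closely.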

  7. Early detection of metabolic and energy disorders by thermal time series stochastic complexity analysis

    PubMed Central

    Lutaif, N.A.; Palazzo, R.; Gontijo, J.A.R.

    2014-01-01

    Maintenance of thermal homeostasis in rats fed a high-fat diet (HFD) is associated with changes in their thermal balance. The thermodynamic relationship between heat dissipation and energy storage is altered by the ingestion of a high-energy diet. Observation of thermal registers of core temperature behavior, in humans and rodents, permits identification of characteristics of the time series, such as autoreference and stationarity, that fit a stochastic analysis well. To identify this change, we used, for the first time, a stochastic autoregressive model whose concepts match those of the physiological systems involved, applied to male HFD rats compared with age-matched male controls on a standard diet (n=7 per group). By analyzing a recorded temperature time series, we were able to identify when thermal homeostasis would be affected by a new diet. The autoregressive (AR) time series model was used to predict the occurrence of thermal homeostasis, and this model proved very effective in distinguishing such a physiological disorder. We therefore infer from our results that maximum entropy distribution, as a means of stochastic characterization of temperature time series registers, may be established as an important early tool to aid in the diagnosis and prevention of metabolic diseases, owing to its ability to detect small variations in the thermal profile. PMID:24519093

  8. Early detection of metabolic and energy disorders by thermal time series stochastic complexity analysis.

    PubMed

    Lutaif, N A; Palazzo, R; Gontijo, J A R

    2014-01-01

    Maintenance of thermal homeostasis in rats fed a high-fat diet (HFD) is associated with changes in their thermal balance. The thermodynamic relationship between heat dissipation and energy storage is altered by the ingestion of a high-energy diet. Observation of thermal registers of core temperature behavior, in humans and rodents, permits identification of characteristics of the time series, such as autoreference and stationarity, that fit a stochastic analysis well. To identify this change, we used, for the first time, a stochastic autoregressive model whose concepts match those of the physiological systems involved, applied to male HFD rats compared with age-matched male controls on a standard diet (n=7 per group). By analyzing a recorded temperature time series, we were able to identify when thermal homeostasis would be affected by a new diet. The autoregressive (AR) time series model was used to predict the occurrence of thermal homeostasis, and this model proved very effective in distinguishing such a physiological disorder. We therefore infer from our results that maximum entropy distribution, as a means of stochastic characterization of temperature time series registers, may be established as an important early tool to aid in the diagnosis and prevention of metabolic diseases, owing to its ability to detect small variations in the thermal profile. PMID:24519093

  9. Generalized Multiscale Entropy Analysis: Application to Quantifying the Complex Volatility of Human Heartbeat Time Series

    PubMed Central

    Costa, Madalena D.; Goldberger, Ary L.

    2016-01-01

    We introduce a generalization of multiscale entropy (MSE) analysis. The method is termed MSEn, where the subscript denotes the moment used to coarse-grain a time series. MSEμ, described previously, uses the mean value (first moment). Here, we focus on MSEσ2, which uses the second moment, i.e., the variance. MSEσ2 quantifies the dynamics of the volatility (variance) of a signal over multiple time scales. We use the method to analyze the structure of heartbeat time series. We find that the dynamics of the volatility of heartbeat time series obtained from healthy young subjects is highly complex. Furthermore, we find that the multiscale complexity of the volatility, not only the multiscale complexity of the mean heart rate, degrades with aging and pathology. The “bursty” behavior of the dynamics may be related to intermittency in energy and information flows, as part of multiscale cycles of activation and recovery. Generalized MSE may also be useful in quantifying the dynamical properties of other physiologic and of non-physiologic time series. PMID:27099455
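
    The σ²-coarse-graining step that distinguishes MSEσ² from ordinary MSE can be sketched directly: at each scale, non-overlapping windows are replaced by their variance rather than their mean. The sample-entropy computation that follows in the full method is omitted here.

```python
def coarse_grain_variance(series, scale):
    """Coarse-grain by the second moment: replace each non-overlapping
    window of length `scale` with its (population) variance."""
    out = []
    for i in range(0, len(series) - scale + 1, scale):
        w = series[i:i + scale]
        m = sum(w) / scale
        out.append(sum((x - m) ** 2 for x in w) / scale)
    return out
```

    The entropy of these variance series across scales is what quantifies the "volatility complexity" described above.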

  10. Nonlinear Time Series Analysis of Nodulation Factor Induced Calcium Oscillations: Evidence for Deterministic Chaos?

    PubMed Central

    Hazledine, Saul; Sun, Jongho; Wysham, Derin; Downie, J. Allan; Oldroyd, Giles E. D.; Morris, Richard J.

    2009-01-01

    Legume plants form beneficial symbiotic interactions with nitrogen fixing bacteria (called rhizobia), with the rhizobia being accommodated in unique structures on the roots of the host plant. The legume/rhizobial symbiosis is responsible for a significant proportion of the global biologically available nitrogen. The initiation of this symbiosis is governed by a characteristic calcium oscillation within the plant root hair cells and this signal is activated by the rhizobia. Recent analyses on calcium time series data have suggested that stochastic effects have a large role to play in defining the nature of the oscillations. The use of multiple nonlinear time series techniques, however, suggests an alternative interpretation, namely deterministic chaos. We provide an extensive, nonlinear time series analysis on the nature of this calcium oscillation response. We build up evidence through a series of techniques that test for determinism, quantify linear and nonlinear components, and measure the local divergence of the system. Chaos is common in nature and it seems plausible that properties of chaotic dynamics might be exploited by biological systems to control processes within the cell. Systems possessing chaotic control mechanisms are more robust in the sense that the enhanced flexibility allows more rapid response to environmental changes with less energetic costs. The desired behaviour could be most efficiently targeted in this manner, supporting some intriguing speculations about nonlinear mechanisms in biological signaling. PMID:19675679

  11. Effective low-order models for atmospheric dynamics and time series analysis

    NASA Astrophysics Data System (ADS)

    Gluhovsky, Alexander; Grady, Kevin

    2016-02-01

    The paper focuses on two interrelated problems: developing physically sound low-order models (LOMs) for atmospheric dynamics and employing them as novel time-series models to overcome deficiencies in current atmospheric time series analysis. The first problem is warranted since arbitrary truncations in the Galerkin method (commonly used to derive LOMs) may result in LOMs that violate fundamental conservation properties of the original equations, causing unphysical behaviors such as unbounded solutions. In contrast, the LOMs we offer (G-models) are energy conserving, and some retain the Hamiltonian structure of the original equations. This work examines LOMs from recent publications to show that all of them that are physically sound can be converted to G-models, while those that cannot lack energy conservation. Further, motivated by recent progress in statistical properties of dynamical systems, we explore G-models for a new role of atmospheric time series models as their data generating mechanisms are well in line with atmospheric dynamics. Currently used time series models, however, do not specifically utilize the physics of the governing equations and involve strong statistical assumptions rarely met in real data.

  12. Effective low-order models for atmospheric dynamics and time series analysis.

    PubMed

    Gluhovsky, Alexander; Grady, Kevin

    2016-02-01

    The paper focuses on two interrelated problems: developing physically sound low-order models (LOMs) for atmospheric dynamics and employing them as novel time-series models to overcome deficiencies in current atmospheric time series analysis. The first problem is warranted since arbitrary truncations in the Galerkin method (commonly used to derive LOMs) may result in LOMs that violate fundamental conservation properties of the original equations, causing unphysical behaviors such as unbounded solutions. In contrast, the LOMs we offer (G-models) are energy conserving, and some retain the Hamiltonian structure of the original equations. This work examines LOMs from recent publications to show that all of them that are physically sound can be converted to G-models, while those that cannot lack energy conservation. Further, motivated by recent progress in statistical properties of dynamical systems, we explore G-models for a new role of atmospheric time series models as their data generating mechanisms are well in line with atmospheric dynamics. Currently used time series models, however, do not specifically utilize the physics of the governing equations and involve strong statistical assumptions rarely met in real data. PMID:26931600

  13. Harmonic analysis of environmental time series with missing data or irregular sample spacing.

    PubMed

    Dilmaghani, Shabnam; Henry, Isaac C; Soonthornnonda, Puripus; Christensen, Erik R; Henry, Ronald C

    2007-10-15

    The Lomb periodogram and discrete Fourier transform are described and applied to harmonic analysis of two typical data sets, one air quality time series and one water quality time series. The air quality data are a 13-year series of 24-hour average particulate elemental carbon data from the IMPROVE station in Washington, D.C. The water quality data are from the stormwater monitoring network in Milwaukee, WI and cover almost 2 years of precipitation events. These data have irregular sampling periods and missing data that preclude the straightforward application of the fast Fourier transform (FFT). In both cases, an anthropogenic periodicity is identified: a 7-day weekday/weekend effect in the Washington elemental carbon series and a 1-month cycle in several constituents of stormwater. Practical aspects of applying the Lomb periodogram are discussed, particularly quantifying the effects of random noise. The proper application of the FFT to data that are irregularly spaced with missing values is demonstrated on the air quality data. Recommendations are given on when to use the Lomb periodogram and when to use the FFT. PMID:17993144
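
    The Lomb periodogram handles irregular sampling directly, with no gap-filling. A small sketch with `scipy.signal.lombscargle` on synthetic data mimicking the 7-day weekday/weekend cycle (the data and grid here are our own, not the paper's):

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(1)
# Irregularly spaced sample times over ~60 days (units: days)
t = np.sort(rng.uniform(0, 60, 200))
# Synthetic signal with a 7-day cycle plus noise
y = np.sin(2 * np.pi * t / 7.0) + 0.3 * rng.normal(size=t.size)
y -= y.mean()  # lombscargle expects a roughly zero-mean series

periods = np.linspace(2, 30, 2000)   # candidate periods in days
omega = 2 * np.pi / periods          # angular frequencies for lombscargle
power = lombscargle(t, y, omega)
print(f"peak period ~ {periods[power.argmax()]:.1f} days")
```

    The periodogram peaks at the 7-day period despite the irregular spacing, which is exactly the situation where the FFT alone fails.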

  14. The application of artificial neural networks to magnetotelluric time-series analysis

    NASA Astrophysics Data System (ADS)

    Manoj, C.; Nagarajan, Nandini

    2003-05-01

    Magnetotelluric (MT) signals are often contaminated with noise from natural or man-made processes that may not fit a normal distribution or may be highly correlated. This can lead to serious errors in the computed MT transfer functions and result in erroneous interpretation. A substantial improvement is possible when the time-series are presented as clean as possible for further processing. Cleaning of MT time-series is often done by manual editing, which is subjective in nature and time consuming. Automation of such a process is difficult to achieve by statistical methods. Artificial neural networks (ANNs) are widely used to automate processes that require human intelligence. The objective here is to automate the editing of MT long-period time-series using an ANN. A three-layer feed-forward artificial neural network (FANN) was adopted for the problem. As ANN-based techniques are computationally intensive, a novel approach was taken that involves editing five simultaneously measured MT time-series subdivided into stacks (a stack = 5 × 256 data points). Neural network training was done at two levels. Signal and noise patterns of individual channels were taught first. Five channel parameters, along with interchannel correlation and amplitude ratios, formed the input for a final network, which predicts the quality of a stack. A large database (5000 traces for pattern training and 900 vectors for interchannel training) was prepared to train the network. There were two error parameters to minimize while training: training error and testing error. Training was stopped when both errors were below an acceptable level. The sensitivity of the neural network to the signal-to-noise ratio and the relative significance of its inputs were tested to ensure that the training was correct. MT time-series from four stations with varying degrees of noise contamination were used to demonstrate the application of the network. The application brought out

  15. Wavelet-based Time Series Bootstrap Approach for Multidecadal Hydrologic Projections Using Observed and Paleo Data of Climate Indicators

    NASA Astrophysics Data System (ADS)

    Erkyihun, S. T.

    2013-12-01

    Understanding streamflow variability and the ability to generate realistic scenarios at multi-decadal time scales are important for robust water resources planning and management in any river basin, and more so on the Colorado River Basin with its semi-arid climate and highly stressed water resources. It is increasingly evident that large-scale climate forcings such as the El Nino Southern Oscillation (ENSO), Pacific Decadal Oscillation (PDO) and Atlantic Multi-decadal Oscillation (AMO) modulate the Colorado River Basin hydrology at multi-decadal time scales. Thus, modeling these large-scale climate indicators is important in order to then conditionally model the multi-decadal streamflow variability. To this end, we developed a simulation model that combines a wavelet-based time series method, Wavelet Auto Regressive Moving Average (WARMA), with a K-nearest neighbor (K-NN) bootstrap approach. In this, for a given time series (climate forcings), dominant periodicities/frequency bands are identified from the wavelet spectrum that pass the 90% significance test. The time series is filtered at these frequencies in each band to create 'components'; the components are orthogonal and, when added to the residual (i.e., noise), recover the original time series. The components, being smooth, are easily modeled using parsimonious Auto Regressive Moving Average (ARMA) time series models. The fitted ARMA models are used to simulate the individual components, which are added to obtain a simulation of the original series. The WARMA approach is applied to all the climate forcing indicators, which are used to simulate multi-decadal sequences of these forcings. For the current year, the simulated forcings are considered the 'feature vector' and the K-NN of this are identified; one of the neighbors (i.e., one of the historical years) is resampled using a weighted probability metric (with more weight to the nearest neighbor and least to the farthest) and the corresponding streamflow is the
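
    The K-NN bootstrap step described above can be sketched compactly. The helper below is a hypothetical illustration of the resampling scheme (k nearest historical feature vectors, with weights decreasing from nearest to farthest, w_i proportional to 1/i); it is not the authors' code.

```python
import numpy as np

def knn_bootstrap_pick(feature, candidates, k=5, rng=None):
    """Pick one historical year index by weighted K-NN resampling."""
    rng = rng or np.random.default_rng()
    dist = np.linalg.norm(candidates - feature, axis=1)
    order = np.argsort(dist)[:k]       # indices of the k nearest years
    w = 1.0 / np.arange(1, k + 1)      # more weight to nearer neighbors
    w /= w.sum()
    return order[rng.choice(k, p=w)]

rng = np.random.default_rng(2)
hist = rng.normal(size=(50, 3))   # 50 historical years, 3 climate indices
year = knn_bootstrap_pick(hist[10], hist, k=5, rng=rng)
print(year)
```

    The streamflow of the resampled year would then be taken as the simulated flow conditioned on the WARMA-simulated forcings.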

  16. Reference manual for generation and analysis of Habitat Time Series: version II

    USGS Publications Warehouse

    Milhous, Robert T.; Bartholow, John M.; Updike, Marlys A.; Moos, Alan R.

    1990-01-01

    The selection of an instream flow requirement for water resource management often requires the review of how the physical habitat changes through time. This review is referred to as "Time Series Analysis." The Time Series Library (TSLIB) is a group of programs to enter, transform, analyze, and display time series data for use in stream habitat assessment. A time series may be defined as a sequence of data recorded or calculated over time. Examples might be historical monthly flow, predicted monthly weighted usable area, daily electrical power generation, annual irrigation diversion, and so forth. The time series can be analyzed, both descriptively and analytically, to understand the importance of the variation in the events over time. This is especially useful in the development of instream flow needs based on habitat availability. The TSLIB group of programs assumes that you have an adequate study plan to guide you in your analysis. You need to already have knowledge about such things as time period and time step, species and life stages to consider, and appropriate comparisons or statistics to be produced and displayed or tabulated. Knowing your destination, you must first evaluate whether TSLIB can get you there. Remember, data are not answers. This publication is a reference manual to TSLIB and is intended to be a guide to the process of using the various programs in TSLIB. This manual is essentially limited to the hands-on use of the various programs. A TSLIB user interface program (called RTSM) has been developed to provide an integrated working environment in which the user has a brief on-line description of each TSLIB program along with the capability to run the TSLIB program from within the user interface. For information on the RTSM program, refer to Appendix F. Before applying the computer models described herein, it is recommended that the user enroll in the short course "Problem Solving with the Instream Flow Incremental Methodology (IFIM)." 
This course is offered

  17. Studies in astronomical time series analysis. I - Modeling random processes in the time domain

    NASA Technical Reports Server (NTRS)

    Scargle, J. D.

    1981-01-01

    Several random process models in the time domain are defined and discussed. Attention is given to the moving average model, the autoregressive model, and relationships between and combinations of these models. Consideration is then given to methods for investigating pulse structure, procedures of model construction, computational methods, and numerical experiments. A FORTRAN algorithm for time series analysis has been developed which is relatively stable numerically. Results of test cases are given to study the effect of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the light curve of the quasar 3C 273 is considered as an example.
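
    The autoregressive model at the heart of this entry can be demonstrated in a few lines: simulate an AR(1) process x_t = φ·x_{t−1} + ε_t, then recover φ by least squares on the lagged series. A generic sketch, not the paper's FORTRAN algorithm:

```python
import numpy as np

rng = np.random.default_rng(3)
phi_true = 0.7
n = 5000
x = np.zeros(n)
for t in range(1, n):
    # AR(1) recursion: each value is a damped copy of the last plus noise
    x[t] = phi_true * x[t - 1] + rng.normal()

# Lag-1 least-squares estimate of phi: regress x[1:] on x[:-1]
phi_hat = (x[:-1] @ x[1:]) / (x[:-1] @ x[:-1])
print(f"estimated phi = {phi_hat:.2f}")
```

    A moving average model would instead express x_t as a weighted sum of past noise terms; the paper discusses how the two representations relate.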

  18. Subsonic flutter analysis addition to NASTRAN. [for use with CDC 6000 series digital computers

    NASA Technical Reports Server (NTRS)

    Doggett, R. V., Jr.; Harder, R. L.

    1973-01-01

    A subsonic flutter analysis capability has been developed for NASTRAN, and a developmental version of the program has been installed on the CDC 6000 series digital computers at the Langley Research Center. The flutter analysis is of the modal type, uses doublet lattice unsteady aerodynamic forces, and solves the flutter equations by using the k-method. Surface and one-dimensional spline functions are used to transform from the aerodynamic degrees of freedom to the structural degrees of freedom. Some preliminary applications of the method to a beamlike wing, a platelike wing, and a platelike wing with a folded tip are compared with existing experimental and analytical results.

  19. A nonlinear modeling approach using weighted piecewise series and its applications to predict unsteady flows

    NASA Astrophysics Data System (ADS)

    Yao, Weigang; Liou, Meng-Sing

    2016-08-01

    To preserve the nonlinearity of a full-order system over a range of parameters of interest, we propose an accurate and robust nonlinear modeling approach that assembles a set of piecewise linear local solutions expanded about some sampling states. The work by Rewienski and White [1] on micromachined devices inspired our use of piecewise linear local solutions to study nonlinear unsteady aerodynamics. These local approximations are assembled via nonlinear weights of radial basis functions. The efficacy of the proposed procedure is validated for a two-dimensional airfoil moving with different pitching motions, specifically AGARD's CT2 and CT5 problems [27], in which the flows exhibit different nonlinear behaviors. Furthermore, application of the developed aerodynamic model to a two-dimensional aero-elastic system proves the approach is capable of predicting limit cycle oscillations (LCOs), using AGARD's CT6 [28] as a benchmark test. All results, based on inviscid solutions, confirm that our nonlinear model is stable and accurate against the full-model solutions and measurements, for predicting not only aerodynamic forces but also detailed flowfields. Moreover, the model is robust for inputs that depart considerably from the base trajectory in form and magnitude. This modeling provides a very efficient way of predicting unsteady flowfields with varying parameters because it needs only a tiny fraction of the cost of full-order modeling for each new condition: the more cases studied, the more savings rendered. Hence, the present approach is especially useful for parametric studies, such as design optimization and exploration of flow phenomena.
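
    The core idea, blending piecewise linear local solutions with radial-basis-function weights, can be illustrated in one dimension. The toy below (our own construction, far simpler than the aerodynamic model) attaches an exact local linearization of sin(x) to each sampling state and combines them with Gaussian RBF weights:

```python
import numpy as np

def rbf_blend(x, centers, slopes, values, eps=1.0):
    """Blend piecewise linear local models with Gaussian RBF weights."""
    w = np.exp(-(eps * (x - centers)) ** 2)
    w /= w.sum()
    local = values + slopes * (x - centers)  # each local linear prediction
    return (w * local).sum()

centers = np.linspace(0, np.pi, 8)     # sampling states
values = np.sin(centers)
slopes = np.cos(centers)               # exact local linearizations of sin
approx = rbf_blend(1.0, centers, slopes, values, eps=2.0)
print(f"sin(1.0) ~ {approx:.3f}")
```

    Away from the sampling states the blend interpolates smoothly between neighboring local models, which is what lets the assembled model track nonlinear behavior across the parameter range.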

  20. Delineation of Surface-Groundwater Interactions Using Statistical Analysis of Temperature Time-Series and Resistivity Methods

    NASA Astrophysics Data System (ADS)

    Scotch, C. G.; Murgulet, D.; Hay, R.

    2013-12-01

    Although surface water and groundwater are often referred to as separate domains, they are intimately related: a change in one domain can ultimately affect the other. Since the two domains act as linked pathways for contaminant transport in the hydrologic cycle, a comprehensive understanding of this relationship is essential for improved SW-GW management practices. The main objective of this study is to develop new statistical methods to better identify and characterize the advective component of water movement between SW and GW in a coastal area along the South Texas coast, adjacent to the Gulf of Mexico (GOM) margin, characterized by low gradients and low-conductivity stream beds. Identifying advection zones using temperature data in regions with low topographic relief and numerous small-scale flow paths is difficult. To overcome this challenge, this study proposes the use of seasonal-trend decomposition (STL) of time series temperature data to analyze exchanges in this type of environment. Seasonal decomposition analysis was used to remove the daily and annual cyclic components, leaving the random or non-cyclic component. It can be inferred that high variances of the random component indicate periods of advection. This statistically derived advective component correlates well with advection periods identified from conventional time-series temperature profile analysis. This correlation is a good validation of the statistical approach as a means of identifying periods of advection and SW-GW interaction. Electrical resistivity imaging will be used for validation of the statistical model.
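
    The decomposition idea can be sketched with plain NumPy (a full STL would use a dedicated routine such as `statsmodels.tsa.seasonal.STL`): strip the diurnal cycle from an hourly temperature series and look for windows where the residual variance spikes. The data and the injected "advection" burst below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(4)
hours = np.arange(30 * 24)                    # 30 days of hourly data
daily = 2.0 * np.sin(2 * np.pi * hours / 24)  # diurnal temperature cycle
noise = 0.2 * rng.normal(size=hours.size)
noise[300:340] += 1.5 * rng.normal(size=40)   # injected "advection" burst
temp = 15.0 + daily + noise

# Subtract the mean profile for each hour of day (the daily cyclic component)
profile = temp.reshape(30, 24).mean(axis=0)
residual = temp - np.tile(profile, 30)

# Rolling variance of the residual highlights the high-variance burst
win = 24
rolling_var = np.array([residual[i:i + win].var()
                        for i in range(len(residual) - win)])
print(f"peak residual variance near hour {rolling_var.argmax()}")
```

    In the study, such high-variance windows of the non-cyclic component are the candidate periods of advection.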

  1. On the Impact of a Quadratic Acceleration Term in the Analysis of Position Time Series

    NASA Astrophysics Data System (ADS)

    Bogusz, Janusz; Klos, Anna; Bos, Machiel Simon; Hunegnaw, Addisu; Teferle, Felix Norman

    2016-04-01

    The analysis of Global Navigation Satellite System (GNSS) position time series generally assumes that each of the coordinate component series is described by the sum of a linear rate (velocity) and various periodic terms. The residuals, the deviations between the fitted model and the observations, are then a measure of the epoch-to-epoch scatter and have been used for the analysis of the stochastic character (noise) of the time series. Often the parameters of interest in GNSS position time series are the velocities and their associated uncertainties, which have to be determined with the highest reliability. It is clear that not all GNSS position time series follow this simple linear behaviour. Therefore, we have added an acceleration term in the form of a quadratic polynomial function to the model in order to better describe the non-linear motion in the position time series. This non-linear motion could be a response to purely geophysical processes, for example, elastic rebound of the Earth's crust due to ice mass loss in Greenland; artefacts due to deficiencies in bias mitigation models, for example, of the GNSS satellite and receiver antenna phase centres; or any combination thereof. In this study we have simulated 20 time series, 23 years in length, with different stochastic characteristics such as white, flicker or random walk noise. The noise amplitude was assumed to be 1 mm/yr^(-κ/4). Then, we added the deterministic part consisting of a linear trend of 20 mm/yr (representing the average horizontal velocity) and accelerations ranging from -0.6 to +0.6 mm/yr^2. For all these data we estimated the noise parameters with Maximum Likelihood Estimation (MLE) using the Hector software package without taking the non-linear term into account. In this way we set the benchmark against which to investigate how the noise properties and velocity uncertainty may be affected by any un-modelled, non-linear term. The velocities and their uncertainties versus the accelerations for
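
    The deterministic model described here, position = a + v·t + ½·acc·t², reduces to an ordinary least-squares fit. A minimal sketch with synthetic data (white noise only; the study's flicker and random walk cases need a package such as Hector):

```python
import numpy as np

rng = np.random.default_rng(5)
t = np.arange(0, 23, 1 / 365.25)   # 23 years of daily epochs (units: years)
v_true, acc_true = 20.0, 0.4       # mm/yr and mm/yr^2
pos = v_true * t + 0.5 * acc_true * t**2 + rng.normal(scale=2.0, size=t.size)

# Design matrix for offset, velocity, and acceleration terms
A = np.column_stack([np.ones_like(t), t, 0.5 * t**2])
coef, *_ = np.linalg.lstsq(A, pos, rcond=None)
print(f"velocity ~ {coef[1]:.2f} mm/yr, acceleration ~ {coef[2]:.2f} mm/yr^2")
```

    The study's point is what happens when the quadratic column is omitted: the un-modelled acceleration leaks into the velocity estimate and biases its uncertainty.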

  2. Population-level administration of AlcoholEdu for college: an ARIMA time-series analysis.

    PubMed

    Wyatt, Todd M; Dejong, William; Dixon, Elizabeth

    2013-08-01

    Autoregressive integrated moving average (ARIMA) modeling is a powerful analytic tool for conducting interrupted time-series analysis, yet it is rarely used in studies of public health campaigns or programs. This study demonstrated the use of ARIMA to assess AlcoholEdu for College, an online alcohol education course for first-year students, and other health and safety programs introduced at a moderate-size public university in the South. From 1992 to 2009, the university administered annual Core Alcohol and Drug Surveys to samples of undergraduates (Ns = 498 to 1032). AlcoholEdu and other health and safety programs that began during the study period were assessed through a series of quasi-experimental ARIMA analyses. Implementation of AlcoholEdu in 2004 was significantly associated with substantial decreases in alcohol consumption and alcohol- or drug-related negative consequences. These improvements were sustained over time as succeeding first-year classes took the course. Previous studies have shown that AlcoholEdu has an initial positive effect on students' alcohol use and associated negative consequences. This investigation suggests that these positive changes may be sustainable over time through yearly implementation of the course with first-year students. ARIMA time-series analysis holds great promise for investigating the effect of program and policy interventions to address alcohol- and drug-related problems on campus. PMID:23742712
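
    A full ARIMA intervention analysis needs a time series package (e.g. `statsmodels.tsa.arima.model.ARIMA` with an exogenous step regressor). As a minimal stand-in, the sketch below shows the interrupted design itself on synthetic survey data: a step "intervention" regressor switched on at the program's start year, fit by ordinary least squares.

```python
import numpy as np

rng = np.random.default_rng(6)
years = np.arange(1992, 2010)
step = (years >= 2004).astype(float)   # intervention begins in 2004
# Synthetic annual outcome: baseline level, a drop at the intervention, noise
drinks = 6.0 - 1.5 * step + 0.3 * rng.normal(size=years.size)

A = np.column_stack([np.ones_like(step), step])
(baseline, effect), *_ = np.linalg.lstsq(A, drinks, rcond=None)
print(f"estimated intervention effect: {effect:.2f}")
```

    ARIMA adds to this design an explicit model of the serial correlation in the pre-intervention series, which is what makes the significance test trustworthy.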

  3. Geospatial Analysis of Near-Surface Soil Moisture Time Series Data Over Indian Region

    NASA Astrophysics Data System (ADS)

    Berwal, P.; Murthy, C. S.; Raju, P. V.; Sesha Sai, M. V. R.

    2016-06-01

    The present study developed a time series database of surface soil moisture over India for June, July and August for a period of 20 years (1991 to 2010), using data products generated under the Climate Change Initiative Programme of the European Space Agency. These three months represent the crop sowing period of the prime cropping season in the country, and soil moisture data during this period are highly useful for detecting drought conditions and assessing drought impact. The time series soil moisture data, at 0.25-degree spatial resolution, were analyzed to generate different indicators. Rainfall data of the same spatial resolution for the same period, generated by the India Meteorological Department, were also procured and analyzed. Geospatial analysis of soil moisture and rainfall derived indicators was carried out to study (1) inter-annual variability of soil moisture and rainfall, (2) soil moisture deviations from normal during prominent drought years, (3) soil moisture and rainfall correlations and (4) drought exposure based on soil moisture and rainfall variability. The study successfully demonstrated the potential of these soil moisture time series data sets for generating regional drought surveillance information products, drought hazard mapping, drought exposure analysis and detection of drought-sensitive areas in the crop planting period.

  4. Water quality management using statistical analysis and time-series prediction model

    NASA Astrophysics Data System (ADS)

    Parmar, Kulwinder Singh; Bhardwaj, Rashmi

    2014-12-01

    This paper deals with water quality management using statistical analysis and a time-series prediction model. The monthly variation of water quality standards has been used to compare the statistical mean, median, mode, standard deviation, kurtosis, skewness, and coefficient of variation at the Yamuna River. The model was validated using R-squared, root mean square error, mean absolute percentage error, maximum absolute percentage error, mean absolute error, maximum absolute error, normalized Bayesian information criterion, Ljung-Box analysis, predicted values and confidence limits. Using an autoregressive integrated moving average model, future values of the water quality parameters have been estimated. It is observed that the predictive model is useful at 95% confidence limits and the curve is platykurtic for potential of hydrogen (pH), free ammonia, total Kjeldahl nitrogen, dissolved oxygen and water temperature (WT), and leptokurtic for chemical oxygen demand and biochemical oxygen demand. Also, it is observed that the predicted series is close to the original series, which provides a good fit. All parameters except pH and WT cross the prescribed limits of the World Health Organization/United States Environmental Protection Agency, and thus the water is not fit for drinking, agricultural and industrial use.
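
    The platykurtic/leptokurtic distinction drawn here is just the sign of the excess kurtosis. A short illustration with `scipy.stats` on synthetic data (not the Yamuna River measurements): a bounded, uniform-like parameter comes out platykurtic, a heavy-tailed one leptokurtic.

```python
import numpy as np
from scipy.stats import kurtosis, skew

rng = np.random.default_rng(7)
ph = rng.uniform(6.5, 8.5, size=120)     # bounded -> platykurtic (kurtosis < 0)
cod = rng.lognormal(3.0, 0.8, size=120)  # heavy-tailed -> leptokurtic (> 0)

for name, x in [("pH", ph), ("COD", cod)]:
    # scipy's kurtosis() returns *excess* kurtosis (0 for a normal curve)
    print(f"{name}: mean={x.mean():.2f} sd={x.std():.2f} "
          f"skew={skew(x):.2f} kurtosis={kurtosis(x):.2f}")
```
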

  5. Nonlinear time series analysis of the fluctuations of the geomagnetic horizontal field

    NASA Astrophysics Data System (ADS)

    George, B.; Renuka, G.; Satheesh Kumar, K.; Kumar, C. P. Anil; Venugopal, C.

    2002-02-01

    A detailed nonlinear time series analysis of the hourly data of the geomagnetic horizontal intensity H measured at Kodaikanal (10.2° N; 77.5° E; mag: dip 3.5° N) has been carried out to investigate the dynamical behaviour of the fluctuations of H. The recurrence plots, spatiotemporal entropy and the result of the surrogate data test show the deterministic nature of the fluctuations, rejecting the hypothesis that H belong to the family of linear stochastic signals. The low dimensional character of the dynamics is evident from the estimated value of the correlation dimension and the fraction of false neighbours calculated for various embedding dimensions. The exponential decay of the power spectrum and the positive Lyapunov exponent indicate chaotic behaviour of the underlying dynamics of H. This is also supported by the results of the comparison of the chaotic characteristics of the time series of H with the pseudo-chaotic characteristics of coloured noise time series. We have also shown that the error involved in the short-term prediction of successive values of H, using a simple but robust, zero-order nonlinear prediction method, increases exponentially. It has also been suggested that there exists the possibility of characterizing the geomagnetic fluctuations in terms of the invariants in chaos theory, such as Lyapunov exponents and correlation dimension. The results of the analysis could also have implications in the development of a suitable model for the daily fluctuations of geomagnetic horizontal intensity.
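
    All of the invariants estimated here (correlation dimension, false nearest neighbors, Lyapunov exponents) start from the same step: reconstructing a state space from the scalar series by time-delay embedding. A minimal sketch of that step, with our own function name:

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Time-delay embedding: rows are [x(t), x(t+tau), ..., x(t+(dim-1)*tau)]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

x = np.sin(np.linspace(0, 20 * np.pi, 1000))  # stand-in for the H series
emb = delay_embed(x, dim=3, tau=5)
print(emb.shape)
```

    The embedding dimension and delay are tuned (e.g. via false nearest neighbors and the first autocorrelation zero) before any invariant is estimated on the embedded points.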

  6. MODVOLC2: A Hybrid Time Series Analysis for Detecting Thermal Anomalies Applied to Thermal Infrared Satellite Data

    NASA Astrophysics Data System (ADS)

    Koeppen, W. C.; Wright, R.; Pilger, E.

    2009-12-01

    We developed and tested a new, automated algorithm, MODVOLC2, which analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes, fires, and gas flares. MODVOLC2 combines two previously developed algorithms, a simple point operation algorithm (MODVOLC) and a more complex time series analysis (Robust AVHRR Techniques, or RAT) to overcome the limitations of using each approach alone. MODVOLC2 has four main steps: (1) it uses the original MODVOLC algorithm to process the satellite data on a pixel-by-pixel basis and remove thermal outliers, (2) it uses the remaining data to calculate reference and variability images for each calendar month, (3) it compares the original satellite data and any newly acquired data to the reference images normalized by their variability, and it detects pixels that fall outside the envelope of normal thermal behavior, (4) it adds any pixels detected by MODVOLC to those detected in the time series analysis. Using test sites at Anatahan and Kilauea volcanoes, we show that MODVOLC2 was able to detect ~15% more thermal anomalies than using MODVOLC alone, with very few, if any, known false detections. Using gas flares from the Cantarell oil field in the Gulf of Mexico, we show that MODVOLC2 provided results that were unattainable using a time series-only approach. Some thermal anomalies (e.g., Cantarell oil field flares) are so persistent that an additional, semi-automated 12-µm correction must be applied in order to correctly estimate both the number of anomalies and the total excess radiance being emitted by them. Although all available data should be included to make the best possible reference and variability images necessary for the MODVOLC2, we estimate that at least 80 images per calendar month are required to generate relatively good statistics from which to run MODVOLC2, a condition now globally met by a decade of MODIS observations. 
We also found
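
    The time-series part of the test (step 3) amounts to comparing each new observation against a per-calendar-month reference image, normalized by its variability. A single-pixel sketch on synthetic radiances, with our own thresholds and array names; the real algorithm also runs the MODVOLC point test and outlier removal first.

```python
import numpy as np

rng = np.random.default_rng(8)
# 9 years of clean monthly observations for one pixel (e.g. brightness temps)
train = rng.normal(290.0, 2.0, size=(9, 12))
reference = train.mean(axis=0)            # per-calendar-month reference
variability = train.std(axis=0, ddof=1)   # per-calendar-month variability

new_year = rng.normal(290.0, 2.0, size=12)
new_year[3] += 15.0                       # a thermal anomaly in month 3

z = (new_year - reference) / variability  # normalized departure
flagged = np.flatnonzero(np.abs(z) > 4)   # outside the "normal" envelope
print(flagged)
```

    The abstract's estimate of ~80 images per calendar month reflects how many samples are needed for stable `reference` and `variability` statistics.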

  7. Rapid space trajectory generation using a Fourier series shape-based approach

    NASA Astrophysics Data System (ADS)

    Taheri, Ehsan

    With the insatiable curiosity of human beings to explore the universe and our solar system, it is essential to benefit from larger propulsion capabilities to execute efficient transfers and carry more scientific equipment. In the field of space trajectory optimization, fundamental advances in using low-thrust propulsion and exploiting multi-body dynamics have played a pivotal role in designing efficient space mission trajectories. The former provides larger cumulative momentum change in comparison with conventional chemical propulsion, whereas the latter results in almost ballistic trajectories with negligible amounts of propellant. However, the problem of space trajectory design translates into an optimal control problem which is, in general, time-consuming and very difficult to solve. Therefore, the goal of this thesis is to address the above problem by developing a methodology to simplify and facilitate the process of finding initial low-thrust trajectories in both two-body and multi-body environments. This initial solution not only provides mission designers with a better understanding of the problem and solution but also serves as a good initial guess for high-fidelity optimal control solvers, increasing their convergence rate. Almost all high-fidelity solvers benefit from an initial guess that already satisfies the equations of motion and some of the most important constraints. Despite the nonlinear nature of the problem, we seek a robust technique for a wide range of typical low-thrust transfers with reduced computational intensity. Another important aspect of the developed methodology is the representation of low-thrust trajectories by Fourier series, with which the number of design variables is reduced significantly. Emphasis is given to simplifying the equations of motion to the extent possible and avoiding approximation of the controls. These facts contribute to speeding up the solution finding procedure. 
Several example

  8. An improved time series approach for estimating groundwater recharge from groundwater level fluctuations

    NASA Astrophysics Data System (ADS)

    Cuthbert, M. O.

    2010-09-01

    An analytical solution to a linearized Boussinesq equation is extended to develop an expression for groundwater drainage using estimations of aquifer parameters. This is then used to develop an improved water table fluctuation (WTF) technique for estimating groundwater recharge. The resulting method extends the standard WTF technique by making it applicable, as long as aquifer properties for the area are relatively well known, in areas with smoothly varying water tables and is not reliant on precipitation data. The method is validated against numerical simulations and a case study from a catchment where recharge is "known" a priori using other means. The approach may also be inverted to provide initial estimates of aquifer parameters in areas where recharge can be reliably estimated by other methods.
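
    The standard WTF estimate that this paper extends is simply recharge = specific yield × water-table rise, summed over rise events. A minimal sketch of that baseline step (the improvement adds a drainage term from the linearized Boussinesq solution, which is not reproduced here); the numbers are illustrative.

```python
import numpy as np

specific_yield = 0.02
# Daily water-table elevations in metres (synthetic)
head = np.array([10.00, 10.05, 10.12, 10.08, 10.20, 10.15])

rises = np.clip(np.diff(head), 0, None)   # keep only the rises
recharge = specific_yield * rises.sum()   # metres of recharge
print(f"recharge ~ {recharge * 1000:.1f} mm")
```

    The limitation the paper addresses is visible even here: between rises the water table also drains, so ignoring drainage underestimates recharge in smoothly varying records.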

  9. Surgery-first orthognathic approach case series: Salient features and guidelines

    PubMed Central

    Gandedkar, Narayan H; Chng, Chai Kiat; Tan, Winston

    2016-01-01

    Conventional orthognathic surgery treatment involves a prolonged period of orthodontic treatment (pre- and post-surgery), making the total treatment period of 3–4 years too exhaustive. The surgery-first orthognathic approach (SFOA) sees orthognathic surgery carried out first, followed by orthodontic treatment to align the teeth and occlusion. Following orthognathic surgery, a period of rapid metabolic activity within tissues ensues, known as the regional acceleratory phenomenon (RAP). By performing surgery first, RAP can be harnessed to facilitate efficient orthodontic treatment. This phenomenon is believed to be a key factor in the notable reduction in treatment duration with SFOA. This article presents two cases treated with SFOA, with emphasis on the case selection, treatment strategy, merits, and limitations of SFOA. Further, a comparison of the salient features of conventional orthognathic surgery and SFOA is presented, with an overview of the authors' SFOA treatment protocol. PMID:26998476

  10. Regenerative Approach to Bilateral Rostral Mandibular Reconstruction in a Case Series of Dogs

    PubMed Central

    Arzi, Boaz; Cissell, Derek D.; Pollard, Rachel E.; Verstraete, Frank J. M.

    2015-01-01

    Extensive rostral mandibulectomy in dogs typically results in instability of the mandibles that may lead to malocclusion, difficulty in prehension, mastication, and pain of the temporomandibular joint. Large rostral mandibular defects are challenging to reconstruct due to the complex geometry of this region. In order to restore mandibular continuity and stability following extensive rostral mandibulectomy, we developed a surgical technique using a combination of intraoral and extraoral approaches, a locking titanium plate, and a compression resistant matrix (CRM) infused with rhBMP-2. Furthermore, surgical planning that consisted of computed tomographic (CT) scanning and 3D model printing was utilized. We describe a regenerative surgical technique for immediate or delayed reconstruction of critical-size rostral mandibular defects in five dogs. Three dogs had healed with intact gingival covering over the mandibular defect and had immediate return to normal function and occlusion. Two dogs had the complication of focal plate exposure and dehiscence, which was corrected with mucosal flaps and suturing; these dogs have since healed with intact gingival covering over the mandibular defect. Mineralized tissue formation was palpated clinically within 2 weeks and solid bone formation within 3 months. CT findings at 6 months postoperatively demonstrated that the newly regenerated mandibular bone had increased in mineral volume with evidence of integration between the native bone, new bone, and CRM compared to the immediate postoperative CT. We conclude that rostral mandibular reconstruction using a regenerative approach provides an excellent solution for restoring mandibular continuity and preventing mandibular instability in dogs. PMID:26664933

  11. Approaches to data analysis of multiple-choice questions

    NASA Astrophysics Data System (ADS)

    Ding, Lin; Beichner, Robert

    2009-12-01

    This paper introduces five commonly used approaches to analyzing multiple-choice test data. They are classical test theory, factor analysis, cluster analysis, item response theory, and model analysis. Brief descriptions of the goals and algorithms of these approaches are provided, together with examples illustrating their applications in physics education research. We minimize mathematics, instead placing emphasis on data interpretation using these approaches.
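One of the approaches named above, classical test theory, can be sketched numerically. The snippet below is an illustrative sketch, not code from the paper; the data and function names are hypothetical. It computes the two standard item statistics: difficulty as the proportion of correct answers, and discrimination as the point-biserial correlation of each item with the rest-of-test score.

```python
import numpy as np

def item_statistics(responses):
    """Classical test theory statistics for a 0/1-scored response matrix.

    responses: (n_students, n_items) array of 0/1 scores.
    Returns per-item difficulty (proportion correct) and discrimination
    (correlation of each item with the rest-of-test score).
    """
    responses = np.asarray(responses, dtype=float)
    difficulty = responses.mean(axis=0)
    total = responses.sum(axis=1)
    discrimination = np.empty(responses.shape[1])
    for j in range(responses.shape[1]):
        rest = total - responses[:, j]   # exclude the item itself to avoid inflation
        discrimination[j] = np.corrcoef(responses[:, j], rest)[0, 1]
    return difficulty, discrimination

# Hypothetical toy data: 6 students, 3 items
scores = [[1, 1, 0],
          [1, 0, 0],
          [1, 1, 1],
          [0, 0, 0],
          [1, 1, 1],
          [0, 0, 1]]
diff, disc = item_statistics(scores)
```

Easy items have difficulty near 1; a well-behaved item has positive discrimination, meaning stronger students answer it correctly more often.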

  12. Hydroelastic vibration analysis of partially liquid-filled shells using a series representation of the liquid

    NASA Technical Reports Server (NTRS)

    Housner, J. M.; Herr, R. W.; Sewall, J. L.

    1980-01-01

    A series representation of the oscillatory behavior of incompressible nonviscous liquids contained in partially filled elastic tanks is presented. Each term is selected on the basis of hydroelastic vibrations in circular cylindrical tanks. Using a complementary energy principle, the superposition of terms is made to approximately satisfy the liquid-tank interface compatibility. This analysis is applied to the gravity sloshing and hydroelastic vibrations of liquids in hemispherical tanks and in a typical elastic aerospace propellant tank. With only a few series terms retained, the results correlate very well with existing analytical results, NASTRAN-generated analytical results, and experimental test results. Hence, although each term is based on a cylindrical tank geometry, the superposition can be successfully applied to noncylindrical tanks.

  13. An application of the Fourier Series in the analysis of waterhammer in pumped storage plant

    SciTech Connect

    Serpas, D.

    1995-12-31

    In a pumped storage scheme, in generating mode water flows from the upper reservoir to the lower reservoir and the turbomachine acts as a turbine; in pumping mode, water is pumped from the lower reservoir to the upper reservoir and the turbomachine acts as a pump. A method using Fourier series to approximate the four-quadrant characteristic curves of the pump is applied to the analysis of waterhammer due to power failure; other conditions may also be analyzed. A program fits the data of the four characteristic curves given in Chaudhry (1987) with Fourier series. This adjustment should permit a closer representation of the pump actually used in the system, resulting in a more economical design.
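The curve-fitting step described above can be illustrated with a least-squares fit of a truncated Fourier series. This is a hypothetical sketch with synthetic data and assumed function names, not the program referenced in the abstract; it shows how a tabulated characteristic curve sampled over one revolution could be approximated.

```python
import numpy as np

def fourier_fit(theta, values, n_harmonics):
    """Least-squares fit of a truncated Fourier series:
    f(theta) ~ a0 + sum_k (a_k cos(k*theta) + b_k sin(k*theta))."""
    cols = [np.ones_like(theta)]
    for k in range(1, n_harmonics + 1):
        cols.append(np.cos(k * theta))
        cols.append(np.sin(k * theta))
    A = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(A, values, rcond=None)
    return A, coeffs

# Synthetic stand-in for a tabulated characteristic curve over one revolution
theta = np.linspace(0, 2 * np.pi, 100, endpoint=False)
curve = 1.2 + 0.8 * np.cos(theta) - 0.3 * np.sin(2 * theta)
A, c = fourier_fit(theta, curve, n_harmonics=3)
residual = np.max(np.abs(A @ c - curve))
```

For real pump data the residual measures how many harmonics are needed for an acceptable representation of the four-quadrant curves.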

  14. Random matrix approach to categorical data analysis.

    PubMed

    Patil, Aashay; Santhanam, M S

    2015-09-01

    Correlation and similarity measures are widely used in all areas of the sciences and social sciences. Often the variables are not numbers but qualitative descriptors called categorical data. We define and study the similarity matrix as a measure of similarity for categorical data. This is of interest owing to the deluge of categorical data in the public domain, such as movie ratings, top-10 rankings, and data from social media, that requires analysis. We show that the statistical properties of the spectra of similarity matrices constructed from categorical data follow random matrix predictions, with the dominant eigenvalue being an exception. We demonstrate this approach by applying it to data for Indian general elections and sea level pressures in the North Atlantic Ocean. PMID:26465449
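A minimal sketch of a similarity matrix for categorical data, assuming a simple-matching definition (the fraction of variables on which two observations agree); the paper's exact construction may differ. The spectrum of such a matrix is what a random-matrix analysis would examine.

```python
import numpy as np

def similarity_matrix(data):
    """Simple-matching similarity: S[i, j] is the fraction of categorical
    variables on which observations i and j take the same value."""
    data = np.asarray(data)
    n = data.shape[0]
    S = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            S[i, j] = np.mean(data[i] == data[j])
    return S

# Hypothetical categorical observations (e.g., ratings A/B/C on 4 questions)
ratings = np.array([
    ["A", "B", "A", "C"],
    ["A", "B", "B", "C"],
    ["C", "A", "A", "B"],
])
S = similarity_matrix(ratings)
eigvals = np.sort(np.linalg.eigvalsh(S))[::-1]   # dominant eigenvalue first
```

The matrix is symmetric with unit diagonal, so its eigenvalues are real; for large data sets the bulk of the spectrum is compared against random-matrix predictions, with the dominant eigenvalue treated separately.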

  15. Variational approach for nonpolar solvation analysis

    PubMed Central

    Chen, Zhan; Zhao, Shan; Chun, Jaehun; Thomas, Dennis G.; Baker, Nathan A.; Bates, Peter W.; Wei, G. W.

    2012-01-01

    Solvation analysis is one of the most important tasks in chemical and biological modeling, and implicit solvent models are among the most popular approaches. However, commonly used implicit solvent models rely on unphysical definitions of solvent-solute boundaries. Based on differential geometry, the present work defines the solvent-solute boundary via the variation of the nonpolar solvation free energy. The solvation free energy functional of the system is constructed based on a continuum description of the solvent and a discrete description of the solute, which are dynamically coupled at the solvent-solute boundaries via van der Waals interactions. The first variation of the energy functional gives rise to the governing Laplace-Beltrami equation. The present model's predictions of nonpolar solvation energies are in excellent agreement with experimental data, which supports the validity of the proposed nonpolar solvation model. PMID:22938212

  16. Random matrix approach to categorical data analysis

    NASA Astrophysics Data System (ADS)

    Patil, Aashay; Santhanam, M. S.

    2015-09-01

    Correlation and similarity measures are widely used in all areas of the sciences and social sciences. Often the variables are not numbers but qualitative descriptors called categorical data. We define and study the similarity matrix as a measure of similarity for categorical data. This is of interest owing to the deluge of categorical data in the public domain, such as movie ratings, top-10 rankings, and data from social media, that requires analysis. We show that the statistical properties of the spectra of similarity matrices constructed from categorical data follow random matrix predictions, with the dominant eigenvalue being an exception. We demonstrate this approach by applying it to data for Indian general elections and sea level pressures in the North Atlantic Ocean.

  17. Statistical approach to partial equilibrium analysis

    NASA Astrophysics Data System (ADS)

    Wang, Yougui; Stanley, H. E.

    2009-04-01

    A statistical approach to market equilibrium and efficiency analysis is proposed in this paper. One factor that governs the exchange decisions of traders in a market, termed the willingness price, is highlighted and underpins the whole theory. The supply and demand functions are formulated as the distributions of the corresponding willing exchange over the willingness price, and the laws of supply and demand can be derived directly from these distributions. The characteristics of the excess demand function are analyzed, and the necessary conditions for the existence and uniqueness of the market's equilibrium point are specified. The rationing rates of buyers and sellers are introduced to describe the ratio of realized exchange to willing exchange, and their dependence on the market price is studied for the cases of shortage and surplus. The realized market surplus, which is the criterion of market efficiency, can be written as a function of the distributions of willing exchange and the rationing rates. With this approach we can strictly prove that a market is efficient in the state of equilibrium.
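The willingness-price idea can be illustrated with a small numerical sketch (the data and variable names are hypothetical, not from the paper): demand at price p counts buyers whose willingness price is at least p, supply counts sellers whose willingness price is at most p, and the equilibrium is where excess demand vanishes.

```python
import numpy as np

# Hypothetical willingness prices
buyers = np.array([10, 9, 8, 7, 6, 5])    # max price each buyer will pay
sellers = np.array([4, 5, 6, 7, 8, 9])    # min price each seller will accept

def excess_demand(p):
    demand = np.sum(buyers >= p)          # willing purchases at price p
    supply = np.sum(sellers <= p)         # willing sales at price p
    return demand - supply

prices = np.arange(4, 11)
z = [excess_demand(p) for p in prices]
equilibrium = prices[np.argmin(np.abs(z))]
```

Here demand falls and supply rises with price, so excess demand crosses zero at a unique price, the equilibrium point of this toy market.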

  18. Vandalism Prevention. ACSA School Management Digest, Series 1, Number 2. ERIC/CEM Research Analysis Series, Number 29.

    ERIC Educational Resources Information Center

    Coursen, David

    Estimates of the precise costs of school vandalism vary widely, but the seriousness of the problem is beyond dispute. It is possible to get a general idea of the nature and motivation of most vandals and, in doing so, begin to understand the problem and devise solutions for it. There are two basic approaches to vandalism prevention. Currently, as…

  19. Extracting tidal frequencies using multivariate harmonic analysis of sea level height time series

    NASA Astrophysics Data System (ADS)

    Amiri-Simkooei, A. R.; Zaminpardaz, S.; Sharifi, M. A.

    2014-10-01

    This contribution is a first attempt to extract tidal frequencies using a multivariate spectral analysis method applied to multiple time series of tide-gauge records. Existing methods are either physics-based, in which the ephemerides of the Moon, Sun and other planets are used, or observation-based, in which univariate analysis methods (Fourier and wavelet, for instance) are applied to tidal observations. The existence of many long tide-gauge records around the world allows one to use tidal observations and extract the main tidal constituents, for which efficient multivariate methods are to be developed. This contribution applies multivariate least-squares harmonic estimation (LS-HE) to the tidal time series of the UK tide-gauge stations. The first 413 harmonics of the tidal constituents and their nonlinear components are provided using the multivariate LS-HE. A few observations of the research are highlighted: (1) the multivariate analysis takes the information of multiple time series into account in an optimal least-squares sense, and thus the tidal frequencies have higher detection power compared to the univariate analysis. (2) Dominant tidal frequencies range from the long-term signals to the sixth-diurnal species interval; higher frequencies have negligible effects. (3) The most important tidal constituents (the first 50 frequencies), ordered by amplitude, range from 212 cm (M2) to 1 cm (OQ2) for the data set considered. This list contains signals that are not among the 145 main tidal frequencies reported in the literature. (4) Tide predictions using different lists of tidal frequencies were compared on five different data sets around the world. Predictions using the first 50 significant constituents gave promising results at these locations.
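The least-squares harmonic estimation step can be sketched as follows for a single record, assuming the candidate frequencies are known. The synthetic record, sampling, and function names are illustrative only, not the paper's implementation (the multivariate case stacks several such records into one system).

```python
import numpy as np

def harmonic_amplitudes(t, y, freqs):
    """Least-squares harmonic estimation: fit a cos/sin pair at each given
    frequency (cycles per unit time) and return one amplitude per frequency."""
    cols = [np.ones_like(t)]
    for f in freqs:
        cols.append(np.cos(2 * np.pi * f * t))
        cols.append(np.sin(2 * np.pi * f * t))
    A = np.column_stack(cols)
    x, *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.hypot(x[1::2], x[2::2])       # amplitude of each cos/sin pair

# Synthetic "tide gauge" record: mean level plus two constituents
# (an M2-like frequency in cycles/day and a diurnal one; values are made up)
t = np.linspace(0, 30, 2000)
y = 1.0 + 2.1 * np.cos(2 * np.pi * 1.9323 * t) + 0.5 * np.sin(2 * np.pi * 1.0 * t)
amps = harmonic_amplitudes(t, y, freqs=[1.9323, 1.0])
```

Because the fit is a linear least-squares problem, it handles irregular sampling and data gaps, which is one reason LS-HE suits tide-gauge records better than a plain FFT.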

  20. Representative locations from time series of soil water content using time stability and wavelet analysis.

    PubMed

    Rivera, Diego; Lillo, Mario; Granda, Stalin

    2014-12-01

    The concept of time stability has been widely used in the design and assessment of soil moisture monitoring networks, as well as in hydrological studies, because it is a technique for identifying particular locations that represent the field-mean soil moisture. In this work, we assess how time stability calculations change as new information is added, and how they are affected over shorter periods, subsampled from the original time series, containing different amounts of precipitation. In doing so, we defined two experiments to explore time stability behavior. The first experiment sequentially adds new data to the previous time series to investigate the long-term influence of new data on the results. The second experiment applies a windowing approach, taking sequential subsamples from the entire time series to investigate the influence of short-term changes associated with the precipitation in each window. Our results from an operating network (seven monitoring points, each equipped with four sensors, in a 2-ha blueberry field) show that as information is added to the time series, the location of the most stable point (MSP) changes, and that in the moving 21-day windows most of the variability of soil water content is associated with both the amount and intensity of rainfall. The changes of the MSP over each window depend on the amount of water entering the soil and the previous state of the soil water content. For our case study, the upper strata are proxies for hourly to daily changes in soil water content, while the deeper strata are proxies for medium-range stored water. Thus, different locations and depths are representative of processes at different time scales. This situation must be taken into account when water management depends on soil water content values from fixed locations. PMID:25249045
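A common way to identify the most stable point is the mean relative difference criterion; the sketch below assumes that is what the time-stability calculation refers to here, and uses hypothetical toy data. Locations are ranked by how closely they track the field-mean soil water content over time.

```python
import numpy as np

def most_stable_point(theta):
    """Time-stability ranking via mean relative difference (MRD).
    theta: (n_times, n_locations) soil water content.
    Returns the MRD per location and the index of the location whose MRD
    is closest to zero, i.e. the most stable point (MSP)."""
    field_mean = theta.mean(axis=1, keepdims=True)   # spatial mean at each time
    rel_diff = (theta - field_mean) / field_mean
    mrd = rel_diff.mean(axis=0)
    return mrd, int(np.argmin(np.abs(mrd)))

# Toy record: 4 sampling dates, 3 monitoring locations
theta = np.array([
    [0.20, 0.25, 0.31],
    [0.18, 0.23, 0.28],
    [0.22, 0.27, 0.33],
    [0.19, 0.24, 0.30],
])
mrd, msp = most_stable_point(theta)
```

Re-running this as new dates are appended, or over moving windows, reproduces the two experiments described in the abstract: the MSP index can shift as the record grows or as wet and dry windows are sampled.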

  1. The Re-Analysis of Ozone Profile Data from a 41-Year Series of SBUV Instruments

    NASA Technical Reports Server (NTRS)

    Kramarova, Natalya; Frith, Stacey; Bhartia, Pawan K.; McPeters, Richard; Labow, Gordon; Taylor, Steven; Fisher, Bradford

    2012-01-01

    In this study we present the validation of ozone profiles from a number of Solar Backscatter Ultraviolet (SBUV) and SBUV/2 instruments that were recently reprocessed using an updated (Version 8.6) algorithm. The SBUV dataset provides the longest available record of global ozone profiles, spanning a 41-year period from 1970 to 2011 (except a 5-year gap in the 1970s), and includes ozone profile records obtained from the Nimbus-4 BUV and Nimbus-7 SBUV instruments and a series of SBUV(/2) instruments launched on NOAA operational satellites (NOAA 09, 11, 14, 16, 17, 18, 19). Although modifications in instrument design were made in the evolution from the BUV instrument to the modern SBUV(/2) model, the basic principles of the measurement technique and retrieval algorithm remain the same. The long-term SBUV data record allows us to create a consistent, calibrated dataset of ozone profiles that can be used for climate studies and trend analyses. In particular, we focus on estimating the various sources of error in the SBUV profile ozone retrievals using independent observations and analysis of the algorithm itself. For the first time we include in the metadata a quantitative estimate of the smoothing error, defined as the error due to profile variability that the SBUV observing system cannot inherently measure. The magnitude of the smoothing error varies with altitude, latitude, season and solar zenith angle. Between 10 and 1 hPa the smoothing errors for the SBUV monthly zonal mean retrievals are of the order of 1%, but start to increase above and below this layer. The largest smoothing errors, as large as 15-20%, were detected in the troposphere. The SBUV averaging kernels, provided with the ozone profiles in version 8.6, help to eliminate the smoothing effect when comparing the SBUV profiles with high vertical resolution measurements, and make it convenient to use the SBUV ozone profiles for data assimilation and model validation purposes. The smoothing error can

  2. Sediment residence times constrained by uranium-series isotopes: A critical appraisal of the comminution approach

    NASA Astrophysics Data System (ADS)

    Handley, Heather K.; Turner, Simon; Afonso, Juan C.; Dosseto, Anthony; Cohen, Tim

    2013-02-01

    Quantifying the rates of landscape evolution in response to climate change is inhibited by the difficulty of dating the formation of continental detrital sediments. We present uranium isotope data for Cooper Creek palaeochannel sediments from the Lake Eyre Basin in semi-arid South Australia in an attempt to determine the formation ages, and hence residence times, of the sediments. To calculate the amount of recoil loss of 234U, a key input parameter used in the comminution approach, we use two suggested methods (weighted geometric, and surface area measurement with an incorporated fractal correction) and typical assumed input parameter values found in the literature. The calculated recoil loss factors and comminution ages are highly dependent on the method of recoil loss factor determination and the chosen assumptions. To appraise the ramifications of the assumptions inherent in the comminution age approach and to determine the individual and combined comminution age uncertainties associated with each variable, Monte Carlo simulations were conducted for a synthetic sediment sample. Using a reasonable uncertainty for each input factor and including variations in the source rock and measured (234U/238U) ratios, the total combined uncertainty on comminution age in our simulation (for both methods of recoil loss factor estimation) can amount to ±220-280 ka. The modelling shows that small changes in assumed input values translate into large effects on absolute comminution age. To improve the accuracy of the technique and provide meaningful absolute comminution ages, much tighter constraints are required on the assumptions for input factors such as the fraction of α-recoil-lost 234Th and the initial (234U/238U) ratio of the source material. In order to directly compare calculated comminution ages produced by different research groups, the standardisation of pre-treatment procedures, recoil loss factor estimation and assumed input parameter values

  3. Extensive mapping of coastal change in Alaska by Landsat time-series analysis, 1972-2013 (Invited)

    NASA Astrophysics Data System (ADS)

    Macander, M. J.; Swingley, C. S.; Reynolds, J.

    2013-12-01

    The landscape-scale effects of coastal storms on Alaska's Bering Sea and Gulf of Alaska coasts include coastal erosion, migration of spits and barrier islands, breaching of coastal lakes and lagoons, and inundation and salt-kill of vegetation. Large changes in coastal storm frequency and intensity are expected due to climate change and reduced sea-ice extent. Storms have a wide range of impacts on carbon fluxes and on fish and wildlife resources, infrastructure siting and operation, and emergency response planning. In areas experiencing moderate to large effects, changes can be mapped by analyzing trends in time series of Landsat imagery from Landsat 1 through Landsat 8. ABR, Inc.--Environmental Research & Services and the Western Alaska Landscape Conservation Cooperative are performing a time-series trend analysis for over 22,000 kilometers of coastline along the Bering Sea and Gulf of Alaska. The archive of Landsat imagery covers the period 1972-present. For a pilot study area in Kotzebue Sound, we conducted a regression analysis of changes in near-infrared reflectance to identify areas with significant changes in coastal features, 1972-2011. Suitable ice- and cloud-free Landsat imagery was obtained for 28 of the 40 years during the period. The approach captured several coastal changes over the 40-year study period, including coastal erosion exceeding the 60-m pixel resolution of the Multispectral Scanner (MSS) data and migrations of coastal spits and estuarine channels. In addition, several lake drainage events were identified, mostly inland from the coastal zone. Analysis of shorter, decadal time periods produced noisier results that were generally consistent with the long-term trend analysis; unusual conditions at the start or end of the time series can strongly influence decadal results. Based on these results, the study is being scaled up to map coastal change for over 22,000 kilometers of coastline along the Bering Sea and Gulf of Alaska coast. The
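The per-pixel regression of near-infrared reflectance can be sketched as a closed-form linear trend fit over an image stack. This is a synthetic toy scene with assumed names, not the study's actual processing chain.

```python
import numpy as np

def pixel_trends(years, stack):
    """Per-pixel linear trend of reflectance through time.
    stack: (n_years, rows, cols). Returns the least-squares slope
    (reflectance change per year) for every pixel, in closed form."""
    t = years - years.mean()
    flat = stack.reshape(len(years), -1)
    slope = (t @ (flat - flat.mean(axis=0))) / (t @ t)
    return slope.reshape(stack.shape[1:])

years = np.array([1972.0, 1985.0, 2000.0, 2013.0])
# Toy 2x2 "scene": one pixel brightens steadily, the others are flat
stack = np.zeros((4, 2, 2))
stack[:, 0, 0] = 0.1 + 0.002 * (years - 1972)   # changing (e.g. eroding) pixel
stack[:, 0, 1] = 0.3                             # stable pixels
stack[:, 1, :] = 0.25
slopes = pixel_trends(years, stack)
```

Pixels with slopes significantly different from zero would then be flagged as candidate coastal-change areas; a real workflow would also mask clouds, ice, and no-data gaps before fitting.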

  4. Comprehensive Model of Annual Plankton Succession Based on the Whole-Plankton Time Series Approach

    PubMed Central

    Romagnan, Jean-Baptiste; Legendre, Louis; Guidi, Lionel; Jamet, Jean-Louis; Jamet, Dominique; Mousseau, Laure; Pedrotti, Maria-Luiza; Picheral, Marc; Gorsky, Gabriel; Sardet, Christian; Stemmann, Lars

    2015-01-01

    Ecological succession provides a widely accepted description of seasonal changes in phytoplankton and mesozooplankton assemblages in the natural environment, but concurrent changes in smaller (i.e. microbes) and larger (i.e. macroplankton) organisms are not included in the model because plankton ranging from bacteria to jellies are seldom sampled and analyzed simultaneously. Here we studied, for the first time in the aquatic literature, the succession of marine plankton in the whole-plankton assemblage that spanned 5 orders of magnitude in size from microbes to macroplankton predators (not including fish or fish larvae, for which no consistent data were available). Samples were collected in the northwestern Mediterranean Sea (Bay of Villefranche) weekly during 10 months. Simultaneously collected samples were analyzed by flow cytometry, inverse microscopy, FlowCam, and ZooScan. The whole-plankton assemblage underwent sharp reorganizations that corresponded to bottom-up events of vertical mixing in the water-column, and its development was top-down controlled by large gelatinous filter feeders and predators. Based on the results provided by our novel whole-plankton assemblage approach, we propose a new comprehensive conceptual model of the annual plankton succession (i.e. whole plankton model) characterized by both stepwise stacking of four broad trophic communities from early spring through summer, which is a new concept, and progressive replacement of ecological plankton categories within the different trophic communities, as recognised traditionally. PMID:25780912

  5. Experimental verification and stability state space analysis of CLL-T series parallel resonant converter with fuzzy controller

    NASA Astrophysics Data System (ADS)

    Nagarajan, Chinnadurai; Madheswaran, Muthusamy

    2012-12-01

    This paper presents a closed-loop CLL-T (capacitor-inductor-inductor) series parallel resonant converter (SPRC); the converter has been simulated and its performance analyzed. A three-element CLL-T SPRC working under load-independent operation (voltage-type and current-type load) is presented. The stability and AC analysis of the CLL-T SPRC has been developed using the state space technique, and the output voltage is regulated by a fuzzy controller. The simulation study indicates the superiority of fuzzy control over conventional control methods. The proposed approach is expected to provide better voltage regulation under dynamic load conditions. A prototype 300 W, 100 kHz converter was designed and built to demonstrate experimentally the dynamic and steady-state performance of the CLL-T SPRC, which is compared with the simulation studies.

  6. Sinking Chao Phraya delta plain, Thailand, derived from SAR interferometry time series analysis

    NASA Astrophysics Data System (ADS)

    Tanaka, A.; Mio, A.; Saito, Y.

    2013-12-01

    The Bangkok Metropolitan region and its surrounding provinces are located in the low-lying delta plain of the Chao Phraya River. Extensive groundwater use from the late 1950s has caused the decline of groundwater levels in the aquifers and compaction of the Holocene clay beneath the Bangkok region, resulting in significant subsidence of the ground. This ground deformation has been monitored using leveling surveys since 1978 and by differential InSAR (Interferometric Synthetic Aperture Radar) analysis. These show that the Bangkok Metropolitan region has been subsiding at a rate of about 20 mm/year in recent years, owing to legal limits on groundwater pumping, although a subsidence rate as high as 120 mm/year was recorded in 1981. The subsidence rate in the Bangkok area has decreased significantly since the late 1980s; however, the affected area has spread to the surrounding regions, and a maximum subsidence rate of up to 30 mm/year occurred in the outlying southeast and southwest coastal zones in 2002. In this study, we apply a SAR interferometry time series analysis to monitor ground deformation in the lower Chao Phraya delta plain (Lower Central Plain), Thailand, using ALOS (Advanced Land Observing Satellite) PALSAR (Phased Array type L-band SAR) data acquired between July 2007 and September 2010. We derive a single-reference time series of interferograms from the stacking of unwrapped phases, under the assumption that those phases are smoothly and continuously connected, and apply a smoothness-constrained inversion algorithm that optimizes the displacement from the phase unwrapping of multitemporal differential SAR interferograms. The SAR interferometry time series analysis succeeds in monitoring the incremental line-of-sight (LOS) change between SAR scene acquisitions. LOS displacements are converted to vertical displacements, based on the assumption that the ground displacement in this area occurs only in the vertical direction. This reveals an overall pattern of subsidence
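The smoothness-constrained inversion of a network of interferograms can be sketched as regularized least squares: each interferogram observes a displacement difference between two acquisitions, and a curvature penalty stabilizes the time series. This is a simplified SBAS-style illustration with hypothetical numbers, not the authors' algorithm.

```python
import numpy as np

def invert_timeseries(pairs, obs, n_epochs, mu=0.1):
    """Smoothness-constrained inversion of an interferogram network.
    Interferogram k spans acquisitions (i, j) and observes d[j] - d[i],
    with d[0] fixed to zero as the reference epoch. A second-difference
    (curvature) penalty weighted by mu regularizes the solution."""
    n = n_epochs - 1                       # unknowns d[1..n_epochs-1]
    A = np.zeros((len(pairs), n))
    for k, (i, j) in enumerate(pairs):
        if j > 0:
            A[k, j - 1] += 1.0
        if i > 0:
            A[k, i - 1] -= 1.0
    R = np.zeros((max(n - 2, 0), n))       # curvature operator
    for r in range(n - 2):
        R[r, r:r + 3] = [1.0, -2.0, 1.0]
    lhs = A.T @ A + mu * R.T @ R
    d = np.linalg.solve(lhs, A.T @ np.asarray(obs))
    return np.concatenate(([0.0], d))

# 4 acquisitions, true steady subsidence of -5 mm per epoch
pairs = [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)]
true = np.array([0.0, -5.0, -10.0, -15.0])
obs = [true[j] - true[i] for i, j in pairs]
d = invert_timeseries(pairs, obs, n_epochs=4, mu=0.01)
```

With noise-free observations and a linear displacement history, the curvature penalty costs nothing and the inversion recovers the true series; with real, noisy interferograms the penalty trades fidelity for smoothness.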

  7. Multifractal analysis of geophysical time series in the urban lake of Créteil (France).

    NASA Astrophysics Data System (ADS)

    Mezemate, Yacine; Tchiguirinskaia, Ioulia; Bonhomme, Celine; Schertzer, Daniel; Lemaire, Bruno Jacques; Vinçon leite, Brigitte; Lovejoy, Shaun

    2013-04-01

    Urban water bodies contribute to the environmental quality of cities: they regulate heat, add to the beauty of the landscape, and provide space for leisure activities (aquatic sports, swimming). As they are often artificial, they are only a few meters deep, which confers some specific properties on them. In particular, they are sensitive to global environmental changes, including climate change, eutrophication and contamination by micro-pollutants due to the urbanization of the watershed. Monitoring their quality has become a major challenge for urban areas, and the need arises for a tool for predicting short-term proliferation of potentially toxic phytoplankton. In lakes, the behavior of biological and physical (temperature) fields is mainly driven by the turbulence regime in the water. Turbulence is highly nonlinear, nonstationary and intermittent, which is why statistical tools are needed to characterize the evolution of the fields. The probability distributions of all the statistical moments of a given field are necessary to fully characterize it, and this possibility is offered by multifractal analysis, based on the assumption of scale invariance. To investigate the effect of space-time variability of temperature, chlorophyll and dissolved oxygen on cyanobacteria proliferation in the urban lake of Créteil (France), a spectral analysis is first performed on each time series (or on subsamples) to obtain an overall estimate of their scaling behavior. A multifractal analysis (Trace Moment, Double Trace Moment) then estimates the statistical moments of different orders. This analysis is adapted to the specific properties of the studied time series, i.e., the presence of large-scale gradients. The nonlinear behavior of the scaling functions K(q) confirms that the investigated aquatic time series are indeed multifractal and highly intermittent. The knowledge of the universal multifractal parameters is the key to calculating the different
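The trace-moment analysis mentioned above can be sketched on a synthetic multiplicative cascade: coarse-grain the field by successive averaging, take moments of order q at each scale, and read K(q) off the log-log slope. For a conservative field the mean is preserved under coarse-graining, so K(1) = 0 exactly; the cascade parameters below are hypothetical and illustrative only.

```python
import numpy as np

def trace_moments(field, qs):
    """Trace-moment estimate of the moment scaling function K(q) for a
    nonnegative 1-D field: <eps_lambda^q> ~ lambda^K(q), where lambda
    is the scale ratio at each level of factor-2 coarse-graining."""
    x = np.asarray(field, float)
    x = x / x.mean()                           # normalize so <eps> = 1
    lams, moms = [], []
    while len(x) >= 2:
        lams.append(len(x))                    # scale ratio lambda
        moms.append([np.mean(x ** q) for q in qs])
        x = 0.5 * (x[0::2] + x[1::2])          # coarse-grain by 2
    log_lam = np.log(lams)
    log_mom = np.log(np.array(moms))
    return np.array([np.polyfit(log_lam, log_mom[:, i], 1)[0]
                     for i in range(len(qs))])

# Synthetic conservative multiplicative cascade, 2^10 points
rng = np.random.default_rng(0)
eps = np.ones(1)
for _ in range(10):
    w = rng.choice([0.4, 1.6], size=eps.size * 2)   # weights with mean 1
    eps = np.repeat(eps, 2) * w
K = trace_moments(eps, qs=[1.0, 2.0])
```

A strictly convex, nonlinear K(q) with K(1) = 0 and K(2) > 0 is the signature of a multifractal, intermittent field, which is what the abstract reports for the lake time series.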

  8. Coenzyme Q(10): a novel therapeutic approach for Fibromyalgia? case series with 5 patients.

    PubMed

    Cordero, Mario D; Alcocer-Gómez, Elísabet; de Miguel, Manuel; Cano-García, Francisco Javier; Luque, Carlos M; Fernández-Riejo, Patricia; Fernández, Ana María Moreno; Sánchez-Alcazar, José Antonio

    2011-07-01

    Coenzyme Q(10) (CoQ(10)) is an essential electron carrier in the mitochondrial respiratory chain and a strong antioxidant. Low CoQ(10) levels have been detected in patients with Fibromyalgia (FM). The purpose of the present work was to assess the effect of CoQ(10) on the symptoms of five patients with FM. Patients were evaluated clinically with the Visual Analogue Scale of pain (VAS) and the Fibromyalgia Impact Questionnaire (FIQ). Patients with CoQ(10) deficiency showed a statistically significant reduction in symptoms after 9 months of CoQ(10) treatment (300 mg/day). Determination of deficiency and consequent supplementation in FM may result in clinical improvement. Further analysis involving more scientifically rigorous methodology will be required to confirm this observation. PMID:21496502

  9. Measurement, time series analysis and source apportionment of inorganic and organic speciated PM(2.5) air pollution in Denver

    NASA Astrophysics Data System (ADS)

    Dutton, Steven James

    Particulate air pollution has demonstrated significant health effects ranging from the worsening of asthma to increased rates of respiratory and cardiopulmonary mortality. These results have prompted the US-EPA to include particulate matter (PM) as one of the six criteria air pollutants regulated under the Clean Air Act. The diverse chemical make-up and physical characteristics of PM make it a challenging pollutant to characterize and regulate. Particulate matter less than 2.5 microns in diameter (PM2.5) can travel deep into the lungs and therefore has been linked with some of the more significant health effects. The toxicity of any given particle is likely dependent on its chemical composition. The goal of this project has been to chemically characterize a long time series of PM2.5 measurements collected at a receptor site in Denver to a level of detail not previously achieved on a data set of this size. This has involved characterization of inorganic ions using ion chromatography, total elemental and organic carbon using thermal optical transmission, and organic molecular marker species using gas chromatography-mass spectrometry. Methods have been developed to allow daily measurement and speciation of these compounds over a six-year period. Measurement methods, novel approaches to uncertainty estimation, time series analysis, spectral and pattern analyses, and source apportionment using two multivariate factor analysis models are presented. The analysis results reveal several natural and anthropogenic sources contributing to PM2.5 in Denver, the most distinguishable being motor vehicles and biomass combustion. This information will be used in a health effect analysis as part of a larger study called the Denver Aerosol Sources and Health (DASH) study. Such results will inform regulatory decisions and may help create a better understanding of the underlying mechanisms for the observed adverse health effects associated with PM2.5.
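Factor-analysis source apportionment can be illustrated with a minimal non-negative matrix factorization. The abstract does not name its two multivariate models, so this NMF is a generic, simplified stand-in for receptor models of that family; the source profiles and contributions below are hypothetical.

```python
import numpy as np

def nmf(X, k, n_iter=500, seed=0):
    """Minimal non-negative matrix factorization by multiplicative updates:
    X (samples x species) ~ G (source contributions) @ F (source profiles)."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    G = rng.random((n, k)) + 0.1
    F = rng.random((k, m)) + 0.1
    for _ in range(n_iter):
        F *= (G.T @ X) / (G.T @ G @ F + 1e-12)
        G *= (X @ F.T) / (G @ F @ F.T + 1e-12)
    return G, F

# Two hypothetical sources (e.g. motor vehicles, biomass burning) defined by
# their chemical profiles over 4 species, mixed over 50 sampling days
profiles = np.array([[0.7, 0.2, 0.1, 0.0],
                     [0.1, 0.1, 0.3, 0.5]])
rng = np.random.default_rng(1)
contrib = rng.random((50, 2)) * 10
X = contrib @ profiles
G, F = nmf(X, k=2)
recon_err = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
```

The non-negativity of both factors is what makes the result interpretable as physical source profiles and daily contributions; operational receptor models add uncertainty weighting and rotational constraints on top of this basic idea.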

  10. A combined U-series, radiocarbon and stable isotope approach for constructing a Pleistocene lake hydrograph: an example from Surprise Valley, California

    NASA Astrophysics Data System (ADS)

    Ibarra, D. E.; Weaver, K. L.; Harris, C.; Maher, K.

    2013-12-01

    Lake records and lake hydrographs provide an integrated record of the hydrologic conditions across a watershed. To provide useful constraints on past changes in climate, robust hydrographs require concordance among multiple geochronologic approaches as well as supporting geochemical and hydrologic evidence. Dating shoreline or near-shore lacustrine carbonates using U-series and radiocarbon methods is one approach for developing the age-elevation constraints needed to construct lake hydrographs. Geochemical analyses (e.g., stable isotopes, elemental ratios, U-series measurements) of modern waters and sediments, as well as the primary carbonate samples, can be used to assess the potential influence of open-system behavior, detrital Th corrections, or pedogenic overprinting on the calculated ages. Additionally, topographic analyses (e.g., basin pour point, shoreline elevations and sample locations) further constrain the spatial relevance and relationships between sample localities. To evaluate the timing and magnitude of the most recent late Pleistocene lake cycle in Surprise Valley, California, we analyzed 111 sub-samples from 22 laminated shoreline tufa samples using U-series disequilibrium geochronology, and paired these analyses with 15 radiocarbon ages. To further assess the radiocarbon and U-series ages, we measured the stable isotope (δ18O and δ13C) and elemental (Sr/Ca) signatures of the tufa samples, and characterized the range of (234U/238U) observed in the modern waters and playas within the watershed. Topographic analysis verified that Lake Surprise is a closed, inward-draining basin, and demonstrated lateral correspondence between samples from the four targeted shoreline sets. Multiple lines of evidence revealed that samples from the highest shorelines are likely from older, higher lake cycles and were influenced by variable amounts of open-system exchange or pedogenic overprinting. 
The measured U concentrations of ~300 to 1200 ng/g, with (238U/232Th) from ~3 to

  11. Approaches to Cycle Analysis and Performance Metrics

    NASA Technical Reports Server (NTRS)

    Parson, Daniel E.

    2003-01-01

    The following notes were prepared as part of an American Institute of Aeronautics and Astronautics (AIAA) sponsored short course entitled Air Breathing Pulse Detonation Engine (PDE) Technology. The course was presented in January of 2003, and again in July of 2004 at two different AIAA meetings. It was taught by seven instructors, each of whom provided information on particular areas of PDE research. These notes cover two areas. The first is titled Approaches to Cycle Analysis and Performance Metrics. Here, the various methods of cycle analysis are introduced. These range from algebraic, thermodynamic equations, to single and multi-dimensional Computational Fluid Dynamic (CFD) solutions. Also discussed are the various means by which performance is measured, and how these are applied in a device which is fundamentally unsteady. The second topic covered is titled PDE Hybrid Applications. Here the concept of coupling a PDE to a conventional turbomachinery based engine is explored. Motivation for such a configuration is provided in the form of potential thermodynamic benefits. This is accompanied by a discussion of challenges to the technology.

  12. Hierarchical approaches to analysis of natural textures

    NASA Astrophysics Data System (ADS)

    Lutsiv, Vadim R.; Malyshev, Igor A.; Novikova, Tatiana A.

    2004-09-01

    The surface textures of natural objects often show visible fractal-like properties. Similar texture patterns can be found when looking at forests in aerial photographs or at trees in outdoor scenes as the image spatial resolution changes, while in aerial photographs of villages the texture patterns differ between spatial resolution levels. This creates difficulties for image segmentation and object recognition, because the level of spatial resolution needed to obtain homogeneously and correctly labeled texture regions differs for different types of landscape. For example, if the spatial resolution is sufficient for distinguishing between the textures of agricultural fields, water, and asphalt, the texture-labeled areas of forest or suburbs become heavily fragmented, because texture features corresponding to two stable levels of texture spatial resolution are visible in this case. Hierarchical texture analysis can solve this problem, and we implemented it in two different ways: we performed texture segmentation simultaneously at several levels of image spatial resolution, or we subjected the texture-labeled image of highest spatial resolution to a recurrent texture segmentation using texture cells of larger sizes. Both approaches proved fruitful for aerial photographs as well as for outdoor images. They generalize and support the hierarchical image analysis technique presented in another of our papers. Some of the applied methods were borrowed from living vision systems.
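
    The first of the two hierarchical strategies above (texture analysis at several levels of spatial resolution) can be sketched with a toy texture measure. This is an illustrative reconstruction, not the authors' algorithm: local variance stands in for their texture features, and 2x2 block averaging stands in for the resolution pyramid:

```python
import numpy as np

def downsample(img):
    """Halve the spatial resolution by 2x2 block averaging."""
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    img = img[:h, :w]
    return 0.25 * (img[0::2, 0::2] + img[1::2, 0::2]
                   + img[0::2, 1::2] + img[1::2, 1::2])

def local_variance(img, cell=8):
    """Toy texture measure: variance of pixel values inside
    non-overlapping cell x cell windows."""
    h, w = img.shape[0] // cell, img.shape[1] // cell
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = img[i*cell:(i+1)*cell, j*cell:(j+1)*cell].var()
    return out

def multiscale_texture(img, levels=3):
    """Texture maps at several resolution levels; a segmenter would
    label each region at whichever level it appears homogeneous."""
    maps = []
    for _ in range(levels):
        maps.append(local_variance(img))
        img = downsample(img)
    return maps
```
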

  13. Detection and Characterization of Ground Displacement Sources from Variational Bayesian Independent Component Analysis of GPS Time Series

    NASA Astrophysics Data System (ADS)

    Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.

    2014-12-01

    A critical point in the analysis of ground displacement time series is the development of data-driven methods that allow us to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also has some deficiencies. Indeed, PCA does not perform well in solving the so-called Blind Source Separation (BSS) problem, i.e. in recovering and separating the original sources that generated the observed data. This is mainly due to the assumptions on which PCA relies: it looks for a new Euclidean space where the projected data are uncorrelated. Uncorrelatedness is usually not a strong enough condition, and it has been shown that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all fields where PCA is also applied. An ICA approach enables us to explain the time series while imposing fewer constraints on the model, and to reveal anomalies in the data such as transient signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mixture of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources, giving a more reliable estimate of them. Here we present the application of the vbICA technique to GPS position time series. First, we use vbICA on synthetic data that simulate a seismic cycle
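
    The PCA step that the abstract contrasts with ICA can be sketched via the SVD on synthetic position series. This is an illustrative sketch, not the vbICA method itself: the station count, signal amplitudes, and noise level are invented, and a common seasonal signal plays the role of a shared deformation source:

```python
import numpy as np

rng = np.random.default_rng(0)
n_stations, n_epochs = 12, 365
t = np.arange(n_epochs)

# Synthetic daily position series (mm): a shared seasonal signal plus
# small station-dependent linear rates and white noise (all invented).
seasonal = 3.0 * np.sin(2 * np.pi * t / 365.25)
rates = rng.normal(0.0, 0.002, n_stations)
X = (np.outer(np.ones(n_stations), seasonal)
     + np.outer(rates, t)
     + rng.normal(0.0, 0.5, (n_stations, n_epochs)))

# PCA via SVD of the centered data matrix.
Xc = X - X.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)

# The leading principal component captures the common seasonal signal;
# its temporal pattern is the first row of Vt (sign is arbitrary).
pc1 = Vt[0]
corr = np.corrcoef(pc1, seasonal)[0, 1]
```

    An ICA step would further rotate such components toward statistical independence; vbICA, as used in the abstract, additionally models each source pdf as a mixture of Gaussians.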

  14. Analysis of trends and breakpoints in observed discharge time series in Lower Saxony, Germany

    NASA Astrophysics Data System (ADS)

    Fangmann, Anne; Belli, Aslan; Haberlandt, Uwe

    2013-04-01

    Historical streamflow in the federal state of Lower Saxony, Germany was analyzed for potential trends and breakpoints. The investigation was based on time series of daily mean discharge values in the periods 1951 to 2005, for which 34 gauging stations offered a sufficient record length, and 1966 to 2005, for which 110 gauges were available. Indices characterizing both high and low flow conditions, as well as the mean discharge within a year and the individual seasons, were extracted from the daily time series and subjected to statistical analyses, including the estimation of trend direction, slope, and local and global significance, as well as a breakpoint analysis. Simultaneously, several precipitation and temperature indices were tested for trends in exactly the same manner, in order to investigate alterations in the atmospheric driving forces as potential causes of changes in the hydrological regime. 263 precipitation and 18 temperature stations provided the daily data from 1951 to 2005. For the discharge, the largest significant changes were noted in summer, where low, high and medium flows decreased throughout. Spatially, these downward trends proved strongest in the eastern half of Lower Saxony. A breakpoint analysis revealed that a large portion of gauging stations feature breaks in the summer indicator time series in 1988, after which a trend reversal, i.e. an increase in discharge, was observed. In spring and fall, a spatial differentiation between an increase in the northwest and a decrease in the southeast was found for the low flow. In winter, an increasing tendency in all discharge portions was noted, but only the trends in the flood indices proved field-significant. Generally, the trends in discharge were found to be consistent with those in temperature and especially precipitation. For the mean temperature, consistently strong, positive, significant trends were detected, while the analysis of the precipitation indices revealed increases in winter
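
    Trend direction and significance of the kind estimated above are commonly assessed with the nonparametric Mann-Kendall test. The abstract does not name its estimator, so the following is an illustrative sketch; the correction for tied values is omitted for brevity:

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test (no tie correction).
    Returns (S statistic, Z score, two-sided p-value)."""
    n = len(x)
    # S counts concordant minus discordant pairs.
    s = sum(
        (x[j] > x[i]) - (x[j] < x[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    # Continuity-corrected normal approximation.
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return s, z, p
```

    Applied per gauge and per index, the sign of Z gives the trend direction and p its local significance; field significance over many gauges requires an additional step.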

  15. Kinematic and kinetic analysis of two gymnastics acrobatic series to performing the backward stretched somersault.

    PubMed

    Mkaouer, Bessem; Jemni, Monèm; Amara, Samiha; Chaabène, Helmi; Tabka, Zouhair

    2013-01-01

    Back swing connections during gymnastics acrobatic series considerably influence technical performance and difficulty, particularly in the back somersault. The aim of this study was to compare the take-off's kinetic and kinematic variables between two acrobatic series leading to the backward stretched somersault (also called salto): round-off, flic-flac to stretched salto versus round-off, tempo-salto to stretched salto. Five high-level male gymnasts (age 23.17 ± 1.61 yrs; body height 1.65 ± 0.05 m; body mass 56.80 ± 7.66 kg) took part in this investigation. A force plate synchronized with a two-dimensional movement analysis system was used to collect kinetic and kinematic data. Statistical analysis via the non-parametric Wilcoxon rank-sum test showed significant differences between the take-off variables. The backswing connections differed in take-off angle, linear momentum, vertical velocity, and horizontal and vertical displacements. In conclusion, considering that a higher elevation of the centre of mass in the flight phase allows better performance and lowers the risk of falls, particularly when combined with a large angular momentum, this study demonstrated that the optimal connection series was round-off, flic-flac to stretched salto, which enabled the greatest height in the somersault. Analysis of the results suggests that both connections facilitate the performance of single and double (or triple) backward somersaults with or without rotations around the longitudinal axis. Gymnasts could perform the latter while gaining height if they choose the round-off, flic-flac technique, or gaining some backward displacement if they choose the round-off, salto tempo. PMID:24146701
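
    Take-off kinetics such as vertical velocity (and hence linear momentum) are typically obtained from the force-plate record via the impulse-momentum relation. The following is an illustrative sketch, not the authors' processing chain; the body mass, sampling interval, and force trace are invented:

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def takeoff_vertical_velocity(fz, mass, dt):
    """Change in vertical velocity over the contact phase from the
    vertical ground reaction force Fz (N), sampled at interval dt (s):
    integrate the net force (Fz - m*g) and divide by the mass
    (impulse-momentum relation)."""
    net = fz - mass * G
    return np.sum(net) * dt / mass
```

    Multiplying the result by the mass gives the take-off linear momentum compared between the two connection series.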

  16. Kinematic and Kinetic Analysis of Two Gymnastics Acrobatic Series to Performing the Backward Stretched Somersault

    PubMed Central

    Mkaouer, Bessem; Jemni, Monèm; Amara, Samiha; Chaabène, Helmi; Tabka, Zouhair

    Back swing connections during gymnastics acrobatic series considerably influence technical performance and difficulty, particularly in the back somersault. The aim of this study was to compare the take-off’s kinetic and kinematic variables between two acrobatic series leading to the backward stretched somersault (also called salto): round-off, flic-flac to stretched salto versus round-off, tempo-salto to stretched salto. Five high-level male gymnasts (age 23.17 ± 1.61 yrs; body height 1.65 ± 0.05 m; body mass 56.80 ± 7.66 kg) took part in this investigation. A force plate synchronized with a two-dimensional movement analysis system was used to collect kinetic and kinematic data. Statistical analysis via the non-parametric Wilcoxon rank-sum test showed significant differences between the take-off variables. The backswing connections differed in take-off angle, linear momentum, vertical velocity, and horizontal and vertical displacements. In conclusion, considering that a higher elevation of the centre of mass in the flight phase allows better performance and lowers the risk of falls, particularly when combined with a large angular momentum, this study demonstrated that the optimal connection series was round-off, flic-flac to stretched salto, which enabled the greatest height in the somersault. Analysis of the results suggests that both connections facilitate the performance of single and double (or triple) backward somersaults with or without rotations around the longitudinal axis. Gymnasts could perform the latter while gaining height if they choose the round-off, flic-flac technique, or gaining some backward displacement if they choose the round-off, salto tempo. PMID:24146701

  17. Analysis of the mass balance time series of glaciers in the Italian Alps

    NASA Astrophysics Data System (ADS)

    Carturan, L.; Baroni, C.; Brunetti, M.; Carton, A.; Dalla Fontana, G.; Salvatore, M. C.; Zanoner, T.; Zuecco, G.

    2015-10-01

    This work presents an analysis of the mass balance series of nine Italian glaciers, which were selected based on the length, continuity and reliability of observations. All glaciers experienced mass loss in the observation period, which varies among the glaciers and ranges between 10 and 47 years. The longest series display increasing mass loss rates, which were mainly due to increased ablation during longer and warmer ablation seasons. The mean annual mass balance (Ba) in the decade from 2004 to 2013 ranged from -1788 to -763 mm w.e. yr-1. Low-altitude glaciers with small elevation ranges are more out of balance than the higher, larger and steeper glaciers, which maintain residual accumulation areas in their upper reaches. The response of glaciers is mainly controlled by the combination of October-May precipitation and June-September temperature, but rapid geometric adjustments and atmospheric changes lead to modifications in their response to climatic variations. In particular, a decreasing correlation of Ba with June-September temperature and an increasing correlation with October-May precipitation are observed for some glaciers. In addition, October-May temperature tends to become significantly correlated with Ba, possibly indicating a decrease in the fraction of solid precipitation, and/or increased ablation, during the accumulation season. Because most of the monitored glaciers no longer have an accumulation area, their observation series are at risk due to the glaciers' impending extinction and will soon require replacement.
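
    The seasonal-climate control described above is commonly quantified by correlating Ba with seasonal aggregates of precipitation and temperature. The following is an illustrative sketch on synthetic data; the sensitivity coefficients and noise level are invented, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(1)
n_years = 40

# Synthetic seasonal drivers in standardized units (invented):
precip_oct_may = rng.normal(0.0, 1.0, n_years)  # accumulation-season precipitation
temp_jun_sep = rng.normal(0.0, 1.0, n_years)    # ablation-season temperature

# Annual balance (mm w.e.) responding positively to winter precipitation
# and negatively to summer temperature, plus noise and a mean loss.
ba = (400.0 * precip_oct_may - 600.0 * temp_jun_sep
      + rng.normal(0.0, 150.0, n_years) - 800.0)

r_precip = np.corrcoef(ba, precip_oct_may)[0, 1]
r_temp = np.corrcoef(ba, temp_jun_sep)[0, 1]
```

    Tracking how such correlations change between sub-periods is one way to detect the shifting climate sensitivity the abstract describes.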

  18. Water quality time series for Big Melen stream (Turkey): its decomposition analysis and comparison to upstream.

    PubMed

    Karakaya, N; Evrendilek, F

    2010-06-01

    Big Melen stream is one of the major water resources providing 0.268 [corrected] km(3) year(-1) of drinking and municipal water for Istanbul. Monthly time series data between 1991 and 2004 for 25 chemical, biological, and physical water properties of Big Melen stream were separated into linear trend, seasonality, and error components using additive decomposition models. A water quality index (WQI) derived from 17 water quality variables was used to compare Aksu upstream and Big Melen downstream water quality. Twenty-six additive decomposition models of water quality time series data including WQI had R(2) values ranging from 88% for log(water temperature) (P < or = 0.001) to 3% for log(total dissolved solids) (P < or = 0.026). Linear trend models revealed that total hardness, calcium concentration, and log(nitrite concentration) had the highest rates of increase over time. Tukey's multiple comparison pointed to significant decreases in 17 water quality variables including WQI of Big Melen downstream relative to those of Aksu upstream (P < or = 0.001). Monitoring changes in water quality on a watershed basis through WQI and decomposition analysis of time series data paves the way for an adaptive management process of water resources that can be tailored in response to the effectiveness and dynamics of management practices. PMID:19444637
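
    An additive decomposition of the kind used above (monthly series = linear trend + seasonality + error) can be sketched with a centered moving average and monthly means. This is an illustrative sketch, not the authors' exact models:

```python
import numpy as np

def additive_decompose(x, period=12):
    """Split a 1-D series (length >= 2*period) into trend, seasonal,
    and residual components of an additive model."""
    n = len(x)
    # Trend: centered moving average over one full period; for an even
    # window, average the two straddling windows.
    kernel = np.ones(period) / period
    ma = np.convolve(x, kernel, mode="valid")
    trend = np.full(n, np.nan)
    half = period // 2
    trend[half:half + len(ma) - 1] = 0.5 * (ma[:-1] + ma[1:])
    # Seasonal: periodic means of the detrended series, forced to zero mean.
    detrended = x - trend
    seasonal = np.array([
        np.nanmean(detrended[i::period]) for i in range(period)
    ])
    seasonal -= seasonal.mean()
    seasonal_full = np.tile(seasonal, n // period + 1)[:n]
    residual = x - trend - seasonal_full
    return trend, seasonal_full, residual
```

    The trend component is what a linear model would then be fitted to, and the residual variance feeds the R(2) values reported above.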

  19. Analysis of the mass balance time series of glaciers in the Italian Alps

    NASA Astrophysics Data System (ADS)

    Carturan, Luca; Baroni, Carlo; Brunetti, Michele; Carton, Alberto; Dalla Fontana, Giancarlo; Salvatore, Maria Cristina; Zanoner, Thomas; Zuecco, Giulia

    2016-03-01

    This work presents an analysis of the mass balance series of nine Italian glaciers, which were selected based on the length, continuity and reliability of observations. All glaciers experienced mass loss in the observation period, which varies among the glaciers and ranges between 10 and 47 years. The longest series display increasing mass loss rates, which were mainly due to increased ablation during longer and warmer ablation seasons. The mean annual mass balance (Ba) in the decade from 2004 to 2013 ranged from -1788 to -763 mm w.e. yr-1. Low-altitude glaciers with small elevation ranges are more out of balance than the higher, larger and steeper glaciers, which maintain residual accumulation areas in their upper reaches. The response of glaciers is mainly controlled by the combination of October-May precipitation and June-September temperature, but rapid geometric adjustments and atmospheric changes lead to modifications in their response to climatic variations. In particular, a decreasing correlation of Ba with June-September temperature and an increasing correlation with October-May precipitation are observed for some glaciers. In addition, October-May temperature tends to become significantly correlated with Ba, possibly indicating a decrease in the fraction of solid precipitation, and/or increased ablation, during the accumulation season. Because most of the monitored glaciers no longer have an accumulation area, their observation series are at risk due to the glaciers' impending extinction and will soon require replacement.

  20. Wet tropospheric delays forecast based on Vienna Mapping Function time series analysis

    NASA Astrophysics Data System (ADS)

    Rzepecka, Zofia; Kalita, Jakub

    2016-04-01

    It is well known that the dry part of the zenith tropospheric delay (ZTD) is much easier to model than the wet part (ZTW). The aim of this research is stochastic modeling and prediction of ZTW using time series analysis tools. Applying time series analysis enables a closer understanding of ZTW behavior as well as short-term prediction of future ZTW values. The ZTW data used for the study were obtained from the GGOS service maintained by the Vienna University of Technology, with a resolution of six hours. ZTW values for the years 2010-2013 were adopted for the study. The International GNSS Service (IGS) permanent stations LAMA and GOPE, located at mid-latitudes, were selected for the investigation. Initially, the seasonal part was separated and modeled using periodic signals and frequency analysis. The prominent annual and semi-annual signals were removed using sine and cosine functions. The autocorrelation of the resulting signal is significant for several days (20-30 samples). The residuals of this fitting were further analyzed and modeled with ARIMA processes. For both stations, optimal ARMA processes were obtained based on several criteria. On this basis, ZTW values were predicted one day ahead, leaving white-noise residuals. The accuracy of the prediction can be estimated at about 3 cm.
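
    The seasonal-removal step described above (annual and semi-annual sinusoids fitted to the series) can be sketched as a harmonic regression; the subsequent ARIMA fit to the residuals is omitted. The amplitudes and phases below are invented, and only the 6-hour sampling follows the text:

```python
import numpy as np

def fit_harmonics(t_years, y):
    """Least-squares fit of mean + annual + semi-annual sinusoids.
    t_years: epochs in decimal years. Returns (coefficients, fitted values)."""
    w = 2.0 * np.pi  # angular frequency of the annual cycle, rad/yr
    A = np.column_stack([
        np.ones_like(t_years),
        np.cos(w * t_years), np.sin(w * t_years),         # annual
        np.cos(2 * w * t_years), np.sin(2 * w * t_years)  # semi-annual
    ])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs, A @ coeffs

# Four years of 6-hourly epochs, matching the stated resolution:
t = np.arange(0.0, 4.0, 0.25 / 365.25)
zwd = (0.15 + 0.05 * np.cos(2 * np.pi * t - 1.0)
       + 0.01 * np.sin(4 * np.pi * t))   # synthetic ZTW in metres
coeffs, fitted = fit_harmonics(t, zwd)
residual = zwd - fitted
```

    On real data the residual would retain the 20-30-sample autocorrelation noted above, which is what the ARMA model then captures for the one-day-ahead forecast.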