A Multiscale Approach to InSAR Time Series Analysis
NASA Astrophysics Data System (ADS)
Hetland, E. A.; Muse, P.; Simons, M.; Lin, N.; Dicaprio, C. J.
2010-12-01
We present a technique to constrain time-dependent deformation from repeated satellite-based InSAR observations of a given region. This approach, which we call MInTS (Multiscale InSAR Time Series analysis), relies on a spatial wavelet decomposition to permit the inclusion of distance based spatial correlations in the observations while maintaining computational tractability. As opposed to single-pixel InSAR time series techniques, MInTS takes advantage of both spatial and temporal characteristics of the deformation field. We use a weighting scheme which accounts for the presence of localized holes due to decorrelation or unwrapping errors in any given interferogram. We represent time-dependent deformation using a dictionary of general basis functions, capable of detecting both steady and transient processes. The estimation is regularized using model-resolution-based smoothing, so as to capture rapid deformation where there are temporally dense radar acquisitions and to avoid oscillations during time periods devoid of acquisitions. MInTS also has the flexibility to explicitly parametrize known time-dependent processes that are expected to contribute to a given set of observations (e.g., co-seismic steps and post-seismic transients, secular variations, seasonal oscillations, etc.). We use cross validation to choose the regularization penalty parameter in the inversion for the time-dependent deformation field. We demonstrate MInTS using a set of 63 ERS-1/2 and 29 Envisat interferograms for Long Valley Caldera.
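The spatial wavelet decomposition at the core of MInTS can be illustrated in miniature. The sketch below implements a single level of a 2-D Haar transform in plain NumPy; the actual MInTS machinery (choice of wavelet family, treatment of holes, multiple decomposition levels) is more involved, so treat this only as an illustration of splitting an interferogram into approximation and detail sub-bands.

```python
import numpy as np

def haar2d_level(img):
    """One level of a 2-D Haar wavelet decomposition.

    Splits an image (with even dimensions) into an approximation (LL)
    and three detail (LH, HL, HH) sub-bands. Illustrative only, not
    the MInTS implementation.
    """
    a = img[0::2, :] + img[1::2, :]   # row-wise sums
    d = img[0::2, :] - img[1::2, :]   # row-wise differences
    ll = (a[:, 0::2] + a[:, 1::2]) / 4.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 4.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 4.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 4.0
    return ll, lh, hl, hh

def haar2d_inverse(ll, lh, hl, hh):
    """Invert one Haar level (exact reconstruction)."""
    ny, nx = ll.shape
    out = np.empty((2 * ny, 2 * nx))
    a_even = ll + lh   # row-sums restricted to even columns
    a_odd = ll - lh    # row-sums restricted to odd columns
    d_even = hl + hh
    d_odd = hl - hh
    out[0::2, 0::2] = a_even + d_even
    out[1::2, 0::2] = a_even - d_even
    out[0::2, 1::2] = a_odd + d_odd
    out[1::2, 1::2] = a_odd - d_odd
    return out
```

Because the transform is exactly invertible, operations such as weighting coefficients by data coverage can be carried out per sub-band and mapped back to the image domain without loss.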
A multiscale approach to InSAR time series analysis
NASA Astrophysics Data System (ADS)
Simons, M.; Hetland, E. A.; Muse, P.; Lin, Y. N.; Dicaprio, C.; Rickerby, A.
2008-12-01
We describe a new technique to constrain time-dependent deformation from repeated satellite-based InSAR observations of a given region. This approach, which we call MInTS (Multiscale analysis of InSAR Time Series), relies on a spatial wavelet decomposition to permit the inclusion of distance based spatial correlations in the observations while maintaining computational tractability. This approach also permits a consistent treatment of all data independent of the presence of localized holes in any given interferogram. In essence, MInTS allows one to consider all data at the same time (as opposed to one pixel at a time), thereby taking advantage of both spatial and temporal characteristics of the deformation field. In terms of the temporal representation, we have the flexibility to explicitly parametrize known processes that are expected to contribute to a given set of observations (e.g., co-seismic steps and post-seismic transients, secular variations, seasonal oscillations, etc.). Our approach also allows for the temporal parametrization to include a set of general functions (e.g., splines) in order to account for unexpected processes. We allow for various forms of model regularization using a cross-validation approach to select penalty parameters. The multiscale analysis allows us to consider various contributions (e.g., orbit errors) that may affect specific scales but not others. The methods described here are all embarrassingly parallel and suitable for implementation on a cluster computer. We demonstrate the use of MInTS using a large suite of ERS-1/2 and Envisat interferograms for Long Valley Caldera, and validate our results by comparing with ground-based observations.
A Multiscale Approach to InSAR Time Series Analysis
NASA Astrophysics Data System (ADS)
Simons, M.; Hetland, E. A.; Muse, P.; Lin, Y.; Dicaprio, C. J.
2009-12-01
We describe progress in the development of MInTS (Multiscale analysis of InSAR Time Series), an approach to constructing self-consistent time-dependent deformation observations from repeated satellite-based InSAR images of a given region. MInTS relies on a spatial wavelet decomposition to permit the inclusion of distance based spatial correlations in the observations while maintaining computational tractability. In essence, MInTS allows one to consider all data at the same time as opposed to one pixel at a time, thereby taking advantage of both spatial and temporal characteristics of the deformation field. This approach also permits a consistent treatment of all data independent of the presence of localized holes due to unwrapping issues in any given interferogram. Specifically, the presence of holes is accounted for through a weighting scheme that accounts for the extent of actual data versus the area of holes associated with any given wavelet. In terms of the temporal representation, we have the flexibility to explicitly parametrize known processes that are expected to contribute to a given set of observations (e.g., co-seismic steps and post-seismic transients, secular variations, seasonal oscillations, etc.). Our approach also allows for the temporal parametrization to include a set of general functions in order to account for unexpected processes. We allow for various forms of model regularization using a cross-validation approach to select penalty parameters. We also experiment with the use of sparsity-inducing regularization as a way to select from a large dictionary of time functions. The multiscale analysis allows us to consider various contributions (e.g., orbit errors) that may affect specific scales but not others. The methods described here are all embarrassingly parallel and suitable for implementation on a cluster computer. We demonstrate the use of MInTS using a large suite of ERS-1/2 and Envisat interferograms for Long Valley Caldera, and validate our results by comparing with ground-based observations.
NASA Astrophysics Data System (ADS)
Donner, R. V.; Zou, Y.; Donges, J. F.; Marwan, N.; Kurths, J.
2009-12-01
We present a new approach for analysing structural properties of time series from complex systems. Starting from the concept of recurrences in phase space, the recurrence matrix of a time series is interpreted as the adjacency matrix of an associated complex network which links different points in time if the evolution of the considered states is very similar. A critical comparison of these recurrence networks with similar existing techniques is presented, revealing strong conceptual benefits of the new approach, which can be considered a unifying framework for transforming time series into complex networks that also includes other methods as special cases. Based on different model systems, we demonstrate that there are fundamental interrelationships between the topological properties of recurrence networks and the statistical properties of the phase space density of the underlying dynamical system. Hence, the network description yields new quantitative characteristics of the dynamical complexity of a time series, which substantially complement existing measures of recurrence quantification analysis. Finally, we illustrate the potential of our approach for detecting hidden dynamical transitions from geoscientific time series by applying it to different paleoclimate records. In particular, we are able to resolve previously unknown climatic regime shifts in East Africa during roughly the last 4 million years, which might have had a considerable influence on the evolution of hominids in the area.
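The recurrence-network construction described above can be sketched directly: the recurrence matrix of a delay-embedded series is reinterpreted as the adjacency matrix of a network. The threshold `eps`, embedding dimension, and delay below are illustrative choices, not the authors' settings.

```python
import numpy as np

def recurrence_network(x, eps, dim=2, tau=1):
    """Adjacency matrix of a recurrence network (a sketch).

    Embeds a scalar series in `dim` dimensions with delay `tau`, then
    links two time points whenever their embedded states lie closer
    than `eps`. Self-loops (the trivial main diagonal of the
    recurrence matrix) are removed.
    """
    n = len(x) - (dim - 1) * tau
    states = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
    dists = np.linalg.norm(states[:, None, :] - states[None, :, :], axis=-1)
    adj = (dists < eps).astype(int)
    np.fill_diagonal(adj, 0)
    return adj

# The degree of node i counts how often state i recurs:
x = np.sin(np.linspace(0, 8 * np.pi, 200))
A = recurrence_network(x, eps=0.1)
degree = A.sum(axis=1)
```

Standard network measures (degree distribution, clustering, path lengths) computed on `A` then characterize the phase-space geometry of the underlying system.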
Li, Cheng; Ding, Guang-Hong; Wu, Guo-Qiang; Poon, Chi-Sang
2009-01-01
A wide variety of methods based on fractal, entropic or chaotic approaches have been applied to the analysis of complex physiological time series. In this paper, we show that fractal and entropy measures are poor indicators of nonlinearity for gait data and heart rate variability data. In contrast, the noise titration method based on Volterra autoregressive modeling represents the most reliable currently available method for testing nonlinear determinism and chaotic dynamics in the presence of measurement noise and dynamic noise.
Time series analysis of infrared satellite data for detecting thermal anomalies: a hybrid approach
NASA Astrophysics Data System (ADS)
Koeppen, W. C.; Pilger, E.; Wright, R.
2011-07-01
We developed and tested an automated algorithm that analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes. Our algorithm enhances the previously developed MODVOLC approach, a simple point operation, by adding a more complex time series component based on the methods of the Robust Satellite Techniques (RST) algorithm. Using test sites at Anatahan and Kīlauea volcanoes, the hybrid time series approach detected ~15% more thermal anomalies than MODVOLC with very few, if any, known false detections. We also tested gas flares in the Cantarell oil field in the Gulf of Mexico as an end-member scenario representing very persistent thermal anomalies. At Cantarell, the hybrid algorithm showed only a slight improvement, but it did identify flares that were undetected by MODVOLC. We estimate that at least 80 MODIS images for each calendar month are required to create good reference images necessary for the time series analysis of the hybrid algorithm. The improved performance of the new algorithm over MODVOLC will result in the detection of low temperature thermal anomalies that will be useful in improving our ability to document Earth's volcanic eruptions, as well as detecting low temperature thermal precursors to larger eruptions.
Onisko, Agnieszka; Druzdzel, Marek J.; Austin, R. Marshall
2016-01-01
Background: Classical statistics is a well-established approach in the analysis of medical data. While the medical community seems to be familiar with the concept of a statistical analysis and its interpretation, the Bayesian approach, argued by many of its proponents to be superior to the classical frequentist approach, is still not well-recognized in the analysis of medical data. Aim: The goal of this study is to encourage data analysts to use the Bayesian approach, such as modeling with graphical probabilistic networks, as an insightful alternative to classical statistical analysis of medical data. Materials and Methods: This paper offers a comparison of two approaches to analysis of medical time series data: (1) classical statistical approach, such as the Kaplan–Meier estimator and the Cox proportional hazards regression model, and (2) dynamic Bayesian network modeling. Our comparison is based on time series cervical cancer screening data collected at Magee-Womens Hospital, University of Pittsburgh Medical Center over 10 years. Results: The main outcomes of our comparison are cervical cancer risk assessments produced by the three models (the Kaplan–Meier estimator, the Cox regression model, and the dynamic Bayesian network). However, our analysis also discusses several aspects of the comparison, such as modeling assumptions, model building, dealing with incomplete data, individualized risk assessment, results interpretation, and model validation. Conclusion: Our study shows that the Bayesian approach is (1) much more flexible in terms of modeling effort, and (2) it offers an individualized risk assessment, which is more cumbersome for classical statistical approaches. PMID:28163973
Detection of chaos: New approach to atmospheric pollen time-series analysis
NASA Astrophysics Data System (ADS)
Bianchi, M. M.; Arizmendi, C. M.; Sanchez, J. R.
1992-09-01
Pollen and spores are biological particles that are ubiquitous to the atmosphere and are pathologically significant, causing plant diseases and inhalant allergies. One of the main objectives of aerobiological surveys is forecasting. Prediction models are required in order to apply aerobiological knowledge to medical or agricultural practice; a necessary condition of these models is that they not be chaotic. The existence of chaos is detected through the analysis of a time series. The time series comprises hourly counts of atmospheric pollen grains obtained using a Burkard spore trap from 1987 to 1989 at Mar del Plata. Abraham's method to obtain the correlation dimension was applied. A low, fractal correlation dimension indicates chaotic dynamics. The predictability of models for atmospheric pollen forecasting is discussed.
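The correlation-dimension estimate referred to above follows the Grassberger-Procaccia idea: count close pairs of embedded states at several radii and read the dimension off the log-log slope of the correlation sum. A sketch with illustrative embedding parameters (not those used in the study):

```python
import numpy as np

def correlation_sum(x, r, dim=3, tau=1):
    """Fraction of embedded-state pairs closer than radius r."""
    n = len(x) - (dim - 1) * tau
    states = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
    d = np.linalg.norm(states[:, None, :] - states[None, :, :], axis=-1)
    iu = np.triu_indices(n, k=1)          # distinct pairs only
    return np.mean(d[iu] < r)

def correlation_dimension(x, radii, dim=3, tau=1):
    """Slope of log C(r) versus log r: the correlation-dimension estimate."""
    radii = np.asarray(radii, dtype=float)
    c = np.array([correlation_sum(x, r, dim, tau) for r in radii])
    mask = c > 0                          # avoid log(0) at tiny radii
    slope, _ = np.polyfit(np.log(radii[mask]), np.log(c[mask]), 1)
    return slope
```

A low, non-integer slope that saturates as the embedding dimension grows is the signature of low-dimensional chaos; for pure noise the slope keeps increasing with the embedding dimension.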
Dynamic analysis of traffic time series at different temporal scales: A complex networks approach
NASA Astrophysics Data System (ADS)
Tang, Jinjun; Wang, Yinhai; Wang, Hua; Zhang, Shen; Liu, Fang
2014-07-01
The analysis of dynamics in traffic flow is an important step to achieve advanced traffic management and control in Intelligent Transportation Systems (ITS). Complexity and periodicity are two fundamental properties of traffic dynamics. In this study, we first measure the complexity of traffic flow data by the Lempel-Ziv algorithm at different temporal scales, using data collected from loop detectors on freeways. Second, to obtain more insight into the complexity and periodicity in traffic time series, we construct complex networks from traffic time series by considering each day as a cycle and each cycle as a single node. The optimal threshold value of the complex networks is estimated from the distribution of density and its derivative. In addition, the complex networks are subsequently analyzed in terms of statistical properties such as average path length, clustering coefficient, density, average degree and betweenness. Finally, taking 2-min aggregated data as an example, we use the correlation coefficient matrix, adjacency matrix and closeness to exploit the periodicity of weekdays and weekends in traffic flow data. The findings in this paper indicate that complex networks are a practical tool for exploring dynamics in traffic time series.
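The Lempel-Ziv complexity used in the first step can be computed by the classic LZ76 phrase-counting scan over a symbolized series. The binarization around the median below is an assumption for illustration; the abstract does not specify the exact symbolization used.

```python
import numpy as np

def binarize(x):
    """Binarize a series around its median (an illustrative symbolization)."""
    return "".join("1" if v > np.median(x) else "0" for v in x)

def lempel_ziv_complexity(s):
    """Lempel-Ziv (LZ76) complexity of a symbol string.

    Counts the distinct phrases found while scanning left to right;
    a new phrase ends as soon as it has not occurred in the preceding
    text. Higher counts mean a less compressible, more complex series.
    """
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # extend the phrase while it already occurs in the prefix
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c
```

A constant series parses into just two phrases, a periodic one into three, while a random binary string of length n yields roughly n / log2(n) phrases, which is what makes the count usable as a complexity measure across temporal aggregation scales.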
Oomens, Wouter; Maes, Joseph H. R.; Hasselman, Fred; Egger, Jos I. M.
2015-01-01
The concept of executive functions plays a prominent role in contemporary experimental and clinical studies on cognition. One paradigm used in this framework is the random number generation (RNG) task, the execution of which demands aspects of executive functioning, specifically inhibition and working memory. Data from the RNG task are best seen as a series of successive events. However, traditional RNG measures that are used to quantify executive functioning are mostly summary statistics referring to deviations from mathematical randomness. In the current study, we explore the utility of recurrence quantification analysis (RQA), a non-linear method that keeps the entire sequence intact, as a better way to describe executive functioning compared to traditional measures. To this aim, 242 first- and second-year students completed a non-paced RNG task. Principal component analysis of their data showed that traditional and RQA measures convey more or less the same information. However, RQA measures do so more parsimoniously and have a better interpretation. PMID:26097449
Advanced approach to the analysis of a series of in-situ nuclear forward scattering experiments
NASA Astrophysics Data System (ADS)
Vrba, Vlastimil; Procházka, Vít; Smrčka, David; Miglierini, Marcel
2017-03-01
This study introduces a sequential fitting procedure as a specific approach to nuclear forward scattering (NFS) data evaluation. The principles and usage of this advanced evaluation method are described in detail and its utilization is demonstrated on NFS in-situ investigations of fast processes. Such experiments frequently consist of hundreds of time spectra that need to be evaluated. The introduced procedure allows the analysis of these experiments and significantly decreases the time needed for the data evaluation. The key contributions of the study are the sequential use of the output fitting parameters of one data set as the input parameters for the next data set, and the model-suitability crosscheck option of applying the procedure in both ascending and descending directions through the data sets. The described fitting methodology is beneficial for checking model validity and the reliability of the obtained results.
McKenna, Thomas M; Bawa, Gagandeep; Kumar, Kamal; Reifman, Jaques
2007-04-01
The physiology analysis system (PAS) was developed as a resource to support the efficient warehousing, management, and analysis of physiology data, particularly, continuous time-series data that may be extensive, of variable quality, and distributed across many files. The PAS incorporates time-series data collected by many types of data-acquisition devices, and it is designed to free users from data management burdens. This Web-based system allows both discrete (attribute) and time-series (ordered) data to be manipulated, visualized, and analyzed via a client's Web browser. All processes occur on a server, so that the client does not have to download data or any application programs, and the PAS is independent of the client's computer operating system. The PAS contains a library of functions, written in different computer languages, that the client can add to and use to perform specific data operations. Functions from the library are sequentially inserted into a function chain-based logical structure to construct sophisticated data operators from simple function building blocks, affording ad hoc query and analysis of time-series data. These features support advanced mining of physiology data.
Hsu, Han-Hsiu; Araki, Michihiro; Mochizuki, Masao; Hori, Yoshimi; Murata, Masahiro; Kahar, Prihardi; Yoshida, Takanobu; Hasunuma, Tomohisa; Kondo, Akihiko
2017-03-02
Chinese hamster ovary (CHO) cells are the primary host used for biopharmaceutical protein production. The engineering of CHO cells to produce higher amounts of biopharmaceuticals has been highly dependent on empirical approaches, but recent high-throughput "omics" methods are changing the situation in a rational manner. Omics data analyses using gene expression or metabolite profiling make it possible to identify key genes and metabolites in antibody production. Systematic omics approaches using different types of time-series data are expected to further enhance understanding of cellular behaviours and molecular networks for rational design of CHO cells. This study developed a systematic method for obtaining and analysing time-dependent intracellular and extracellular metabolite profiles, RNA-seq data (enzymatic mRNA levels) and cell counts from CHO cell cultures to capture an overall view of the CHO central metabolic pathway (CMP). We then calculated correlation coefficients among all the profiles and visualised the whole CMP by heatmap analysis and metabolic pathway mapping, to classify genes and metabolites together. This approach provides an efficient platform to identify key genes and metabolites in CHO cell culture.
Simplification of multiple Fourier series - An example of algorithmic approach
NASA Technical Reports Server (NTRS)
Ng, E. W.
1981-01-01
This paper describes one example of multiple Fourier series which originates from a problem of spectral analysis of time series data. The example is exercised here with an algorithmic approach which can be generalized for other series manipulation on a computer. The generalized approach is presently being pursued for application to a variety of multiple series and toward a general-purpose algorithm for computer algebra implementation.
Bruno, Paula Marta; Pereira, Fernando Duarte; Fernandes, Renato; de Mendonça, Goncalo Vilhena
2011-02-01
The responses to supramaximal exercise testing have been traditionally analyzed by means of standard parametric and nonparametric statistics. Unfortunately, these statistical approaches do not allow insight into the pattern of variation of a given parameter over time. The purpose of this study was to determine if the application of dynamic factor analysis (DFA) allowed discriminating different patterns of power output (PO), during supramaximal exercise, in two groups of children engaged in competitive sports: swimmers and soccer players. Data derived from Wingate testing were used in this study. Analyses were performed on epochs (30 s) of upper and lower body PO obtained from twenty-two healthy boys (11 swimmers and 11 soccer players) aged 11-12 years. DFA revealed two distinct patterns of PO during Wingate testing. Swimmers tended to attain their peak PO (upper and lower body) earlier than soccer players. As importantly, DFA showed that children with a given pattern of upper body PO tend to perform similarly during lower body exercise.
Cabinetmaker. Occupational Analysis Series.
ERIC Educational Resources Information Center
Chinien, Chris; Boutin, France
This document contains the analysis of the occupation of cabinetmaker, or joiner, that is accepted by the Canadian Council of Directors as the national standard for the occupation. The front matter preceding the analysis includes exploration of the development of the analysis, structure of the analysis, validation method, scope of the cabinetmaker…
Permutations and time series analysis.
Cánovas, Jose S; Guillamón, Antonio
2009-12-01
The main aim of this paper is to show how permutations can be useful in time series analysis. In particular, we introduce a test for checking the independence of a time series which is based on the number of admissible permutations in it. The main improvement in our tests is that we are able to give a theoretical distribution for independent time series.
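The idea behind permutation-based independence tests can be sketched as follows: slide a window of order m over the series, record the ordinal pattern (rank permutation) of each window, and compare the pattern counts with the uniform distribution expected for an i.i.d. series. This is a sketch of the general idea, not the authors' exact statistic or its theoretical distribution.

```python
import numpy as np
from itertools import permutations

def ordinal_pattern_counts(x, m=3):
    """Count occurrences of each ordinal pattern of order m.

    For an independent (i.i.d.) series, every one of the m! patterns
    is equally likely, so a strong deviation from uniform counts
    suggests temporal dependence.
    """
    counts = {p: 0 for p in permutations(range(m))}
    for i in range(len(x) - m + 1):
        counts[tuple(np.argsort(x[i:i + m]))] += 1
    return counts

def chi2_uniform(counts):
    """Chi-square-style statistic of the pattern counts against uniformity."""
    obs = np.array(list(counts.values()), dtype=float)
    exp = obs.sum() / len(obs)
    return ((obs - exp) ** 2 / exp).sum()
```

A monotone series concentrates all mass on a single pattern and yields a huge statistic, while white noise spreads counts nearly evenly; note that overlapping windows are correlated, so the exact null distribution is not the textbook chi-square.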
NASA Astrophysics Data System (ADS)
Allan, Alasdair
2014-06-01
FROG performs time series analysis and display. It provides a simple user interface for astronomers wanting to do time-domain astrophysics but still offers the powerful features found in packages such as PERIOD (ascl:1406.005). FROG includes a number of tools for manipulation of time series. Among other things, the user can combine individual time series, detrend series (multiple methods) and perform basic arithmetic functions. The data can also be exported directly into the TOPCAT (ascl:1101.010) application for further manipulation if needed.
Predicting road accidents: Structural time series approach
NASA Astrophysics Data System (ADS)
Junus, Noor Wahida Md; Ismail, Mohd Tahir
2014-07-01
In this paper, a model for the occurrence of road accidents in Malaysia between 1970 and 2010 was developed, and with this model the number of road accidents was predicted using the structural time series approach. The models are developed using a stepwise method, and the residual of each step has been analyzed. The accuracy of the model is assessed using the mean absolute percentage error (MAPE), and the best model is chosen based on the smallest Akaike information criterion (AIC) value. The structural time series approach found that the local linear trend model best represents the road accidents. This model allows the level and slope components to vary over time. In addition, this approach also provides useful information for improving on conventional time series methods.
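The local linear trend model selected above lets both the level and the slope evolve as random walks. A minimal Kalman filter for that model is sketched below; the noise variances are assumed known here, whereas a full structural time series treatment (as in the paper) estimates them by maximum likelihood.

```python
import numpy as np

def local_linear_trend_filter(y, var_eps, var_level, var_slope):
    """Kalman filter for the local linear trend structural model.

    State is (level, slope); both are allowed to vary over time.
    Returns the filtered state at each observation.
    """
    T = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition
    Z = np.array([1.0, 0.0])                 # observation vector
    Q = np.diag([var_level, var_slope])      # state-noise covariance
    a = np.zeros(2)                          # state mean
    P = np.eye(2) * 1e6                      # near-diffuse prior
    filtered = []
    for obs in y:
        # predict
        a = T @ a
        P = T @ P @ T.T + Q
        # update
        f = Z @ P @ Z + var_eps              # innovation variance
        k = P @ Z / f                        # Kalman gain
        a = a + k * (obs - Z @ a)
        P = P - np.outer(k, Z @ P)
        filtered.append(a.copy())
    return np.array(filtered)                # columns: level, slope
```

On a noise-free linear ramp the filtered slope converges to the true slope, which is the sense in which the model "allows the level and slope to vary" while still recovering a steady trend when one is present.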
ERIC Educational Resources Information Center
Locock, Katherine; Tran, Hue; Codd, Rachel; Allan, Robin
2015-01-01
This series of three practical sessions centers on drugs that inhibit the enzyme acetylcholineesterase. This enzyme is responsible for the inactivation of acetylcholine and has been the target of drugs to treat glaucoma and Alzheimer's disease and for a number of insecticides and warfare agents. These sessions relate to a series of carbamate…
Time series analysis of injuries.
Martinez-Schnell, B; Zaidi, A
1989-12-01
We used time series models in the exploratory and confirmatory analysis of selected fatal injuries in the United States from 1972 to 1983. We built autoregressive integrated moving average (ARIMA) models for monthly, weekly, and daily series of deaths and used these models to generate hypotheses. These deaths resulted from six causes of injuries: motor vehicles, suicides, homicides, falls, drownings, and residential fires. For each cause of injury, we estimated calendar effects on the monthly death counts. We confirmed the significant effect of vehicle miles travelled on motor vehicle fatalities with a transfer function model. Finally, we applied intervention analysis to deaths due to motor vehicles.
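The flavor of intervention analysis can be conveyed with a toy regression: an AR(1) model augmented with a step dummy at the intervention date. Full Box-Tiao intervention analysis, as applied to the motor-vehicle deaths above, uses complete ARIMA transfer-function models; this least-squares sketch only illustrates the idea, and all parameter names are illustrative.

```python
import numpy as np

def intervention_effect(y, t0):
    """Estimate a step-intervention effect in an AR(1) series (a sketch).

    Fits y_t = c + phi * y_{t-1} + omega * step_t by least squares,
    where step_t = 1 from time t0 onward. Returns (phi, omega).
    """
    step = (np.arange(len(y)) >= t0).astype(float)
    X = np.column_stack([np.ones(len(y) - 1), y[:-1], step[1:]])
    coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    c, phi, omega = coef
    return phi, omega
```

Given a series with a genuine level shift, the estimated omega measures the immediate impact of the intervention, while phi carries the series' own persistence.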
Task Analysis Inventories. Series II.
ERIC Educational Resources Information Center
Wesson, Carl E.
This second in a series of task analysis inventories contains checklists of work performed in twenty-two occupations. Each inventory is a comprehensive list of work activities, responsibilities, educational courses, machines, tools, equipment, and work aids used and the products produced or services rendered in a designated occupational area. The…
Visibility Graph Based Time Series Analysis
Stephen, Mutua; Gu, Changgui; Yang, Huijie
2015-01-01
Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both its microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series to a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide us rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks. PMID:26571115
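The visibility-graph mapping that underlies this method links two samples whenever the straight line between them clears every intermediate sample (Lacasa et al.'s natural visibility criterion). A direct, unoptimized sketch:

```python
import numpy as np

def visibility_graph(x):
    """Adjacency matrix of the natural visibility graph of a series.

    Samples (a, x[a]) and (b, x[b]) are linked when every sample
    strictly between them lies strictly below the straight line
    joining them. Consecutive samples are therefore always linked.
    """
    n = len(x)
    adj = np.zeros((n, n), dtype=int)
    for a in range(n):
        for b in range(a + 1, n):
            # heights of the connecting line at the intermediate times
            line = x[a] + (x[b] - x[a]) * (np.arange(a + 1, b) - a) / (b - a)
            if np.all(x[a + 1:b] < line):
                adj[a, b] = adj[b, a] = 1
    return adj
```

On a strictly linear series only consecutive samples are mutually visible (intermediate points sit exactly on the line, so the strict criterion fails), while a peak "sees" far in both directions; it is this sensitivity to local geometry that lets graph properties encode series structure.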
NASA Astrophysics Data System (ADS)
Feigin, Alexander; Mukhin, Dmitry; Gavrilov, Andrey; Volodin, Evgeny; Loskutov, Evgeny
2014-05-01
Natural systems are in general space-distributed, and their evolution spans a broad spectrum of temporal scales. The multiscale nature may result from a multiplicity of mechanisms governing the system behaviour, and a large number of feedbacks and nonlinearities. A way to reveal and understand the underlying mechanisms, as well as to model corresponding sub-systems, is decomposition of the full (complex) system into well-separated spatio-temporal patterns ("modes") that evolve with essentially different time scales. In the report a new method of such decomposition is discussed. The method is based on a generalization of MSSA (Multichannel Singular Spectral Analysis) [1] for expanding space-distributed time series in a basis of spatio-temporal empirical orthogonal functions (STEOF), which makes allowance for delayed correlations between processes recorded at spatially separated points. The method is applied to decomposition of the Earth's climate system: on the basis of a 156-year time series of SST anomalies distributed over the globe [2], two climatic modes possessing noticeably different time scales (3-5 and 9-11 years) are separated. For more accurate exclusion of "too slow" (and thus not correctly represented) processes from real data, the numerically produced STEOF basis is used. For doing this, the time series generated by the INM RAS Coupled Climate Model [3] is utilized. Relations of the separated modes to ENSO and PDO are investigated. Possible development of the suggested approach towards the separation of modes that are nonlinearly uncorrelated is discussed. 1. Ghil, M., R. M. Allen, M. D. Dettinger, K. Ide, D. Kondrashov, et al. (2002) "Advanced spectral methods for climatic time series", Rev. Geophys. 40(1), 3.1-3.41. 2. http://iridl.ldeo.columbia.edu/SOURCES/.KAPLAN/.EXTENDED/.v2/.ssta/ 3. http://83.149.207.89/GCM_DATA_PLOTTING/GCM_INM_DATA_XY_en.htm
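The core operation generalized by MSSA/STEOF can be shown in its univariate form: build the trajectory (Hankel) matrix of a series, take its SVD, and reconstruct each leading component by anti-diagonal averaging. This sketch is single-channel SSA only; the multichannel STEOF analysis in the abstract stacks many spatial channels into one trajectory matrix.

```python
import numpy as np

def ssa_modes(x, window, n_modes):
    """Singular spectrum analysis decomposition (a univariate sketch).

    Returns `n_modes` reconstructed components, each the same length
    as the input series. Summing all full-rank components recovers
    the original series exactly.
    """
    n = len(x)
    k = n - window + 1
    traj = np.column_stack([x[i:i + window] for i in range(k)])  # Hankel matrix
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    modes = []
    for m in range(n_modes):
        comp = s[m] * np.outer(u[:, m], vt[m])                   # rank-1 piece
        # Hankelization: average each anti-diagonal back to a series value
        rec = np.array([np.mean(comp[::-1].diagonal(i - (window - 1)))
                        for i in range(n)])
        modes.append(rec)
    return np.array(modes)
```

A pure sinusoid produces a trajectory matrix of exact rank 2, so its two leading SSA components reconstruct it almost perfectly; modes with well-separated singular values correspond to oscillations with distinct time scales, which is the basis for isolating the 3-5 and 9-11 year climatic modes.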
Introduction to Time Series Analysis
NASA Technical Reports Server (NTRS)
Hardin, J. C.
1986-01-01
The field of time series analysis is explored from its logical foundations to the most modern data analysis techniques. The presentation is developed, as far as possible, for continuous data, so that the inevitable use of discrete mathematics is postponed until the reader has gained some familiarity with the concepts. The monograph seeks to provide the reader with both the theoretical overview and the practical details necessary to correctly apply the full range of these powerful techniques. In addition, the last chapter introduces many specialized areas where research is currently in progress.
Comparative Analysis on Time Series with Included Structural Break
NASA Astrophysics Data System (ADS)
Andreeski, Cvetko J.; Vasant, Pandian
2009-08-01
Time series analysis (ARIMA models) is a good approach for identification of time series. However, if there is a structural break in the time series, we cannot create only one model of the time series. Furthermore, if we do not have enough data between two structural breaks, it is impossible to create valid time series models for identification of the time series. This paper explores the possibility of identifying the inflation process dynamics from a system-theoretic perspective, by means of both Box-Jenkins ARIMA methodology and artificial neural networks.
Highly comparative time-series analysis: the empirical structure of time series and their methods.
Fulcher, Ben D; Little, Max A; Jones, Nick S
2013-06-06
The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
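The reduced representations described above boil down to a simple recipe: summarize every series by a fixed feature vector, then compare, cluster, or retrieve series in feature space. The toy sketch below computes three hand-picked features; hctsa, the toolbox behind this work, computes thousands, and the feature names here are illustrative choices, not the paper's set.

```python
import numpy as np

def simple_features(x):
    """A toy feature vector in the spirit of highly comparative analysis.

    Reduces a series to a few interpretable summary statistics after
    z-scoring.
    """
    x = (x - np.mean(x)) / np.std(x)
    lag1 = np.corrcoef(x[:-1], x[1:])[0, 1]   # lag-1 autocorrelation
    above = np.mean(x > 0)                    # fraction of time above mean
    diff_sd = np.std(np.diff(x))              # roughness of increments
    return np.array([lag1, above, diff_sd])

def nearest_series(features, query_idx):
    """Index of the series whose feature vector is closest to the query's."""
    d = np.linalg.norm(features - features[query_idx], axis=1)
    d[query_idx] = np.inf                     # exclude the query itself
    return int(np.argmin(d))
```

Even with three features, smooth oscillatory series land near each other and far from noise, which is the mechanism that lets feature spaces organize datasets and retrieve alternative analysis methods automatically.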
Highly comparative time-series analysis: the empirical structure of time series and their methods
Fulcher, Ben D.; Little, Max A.; Jones, Nick S.
2013-01-01
The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines. PMID:23554344
Analysis of Polyphonic Musical Time Series
NASA Astrophysics Data System (ADS)
Sommer, Katrin; Weihs, Claus
A general model for pitch tracking of polyphonic musical time series is introduced. Based on the model of Davy and Godsill (Bayesian harmonic models for musical pitch estimation and analysis, Technical Report 431, Cambridge University Engineering Department, 2002), the different pitches of the musical sound are estimated simultaneously with MCMC methods. Additionally, a preprocessing step is designed to improve the estimation of the fundamental frequencies (A comparative study on polyphonic musical time series using MCMC methods. In C. Preisach et al., editors, Data Analysis, Machine Learning, and Applications, Springer, Berlin, 2008). The preprocessing step compares real audio data with an alphabet constructed from the McGill Master Samples (Opolko and Wapnick, McGill University Master Samples [Compact disc], McGill University, Montreal, 1987), which consists of tones of different instruments. The tones with minimal Itakura-Saito distortion (Gray et al., Transactions on Acoustics, Speech, and Signal Processing ASSP-28(4):367-376, 1980) are chosen as first estimates and as starting points for the MCMC algorithms. Furthermore, the implementation of the alphabet is an approach to recognizing the instruments generating the musical time series. Results are presented for mixed monophonic data from McGill and for self-recorded polyphonic audio data.
Complex network approach to fractional time series
Manshour, Pouya
2015-10-15
In order to extract correlation information inherent in stochastic time series, the visibility graph algorithm has been recently proposed, by which a time series can be mapped onto a complex network. We demonstrate that the visibility algorithm is not an appropriate tool for studying the correlation aspects of a time series. We then employ the horizontal visibility algorithm, a much simpler one, to map fractional processes onto complex networks. The degree distributions are shown to have parabolic exponential forms with a Hurst-dependent fitting parameter. Further, we take into account other topological properties, such as the maximum eigenvalue of the adjacency matrix and the degree assortativity, and show that such topological quantities can also be used to predict the Hurst exponent, with an exception for anti-persistent fractional Gaussian noises. To solve this problem, we take into account the Spearman correlation coefficient between nodes' degrees and their corresponding data values in the original time series.
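The horizontal visibility mapping used in this abstract is simple enough to state in a few lines; a brute-force sketch (quadratic, for illustration only):

```python
def horizontal_visibility_graph(x):
    """Map a time series onto a graph: samples i and j are linked if every
    intermediate value lies strictly below min(x[i], x[j])."""
    n = len(x)
    edges = set()
    for i in range(n - 1):
        for j in range(i + 1, n):
            if all(x[k] < min(x[i], x[j]) for k in range(i + 1, j)):
                edges.add((i, j))
    return edges

def degrees(edges, n):
    """Node degrees, whose distribution carries the Hurst information."""
    deg = [0] * n
    for i, j in edges:
        deg[i] += 1
        deg[j] += 1
    return deg

x = [1.0, 3.0, 2.0, 4.0]
edges = horizontal_visibility_graph(x)
deg = degrees(edges, len(x))
```

Adjacent samples are always linked, and the local maximum at index 1 "sees over" the dip at index 2, so it acquires the highest degree.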
Analysis of series resonant converter with series-parallel connection
NASA Astrophysics Data System (ADS)
Lin, Bor-Ren; Huang, Chien-Lan
2011-02-01
In this study, a parallel inductor-inductor-capacitor (LLC) resonant converter series-connected on the primary side and parallel-connected on the secondary side is presented for server power supply systems. Based on series resonant behaviour, the power metal-oxide-semiconductor field-effect transistors are turned on at zero voltage switching and the rectifier diodes are turned off at zero current switching. Thus, the switching losses on the power semiconductors are reduced. In the proposed converter, the primary windings of the two LLC converters are connected in series. Thus, the two converters have the same primary currents to ensure that they can supply the balance load current. On the output side, two LLC converters are connected in parallel to share the load current and to reduce the current stress on the secondary windings and the rectifier diodes. In this article, the principle of operation, steady-state analysis and design considerations of the proposed converter are provided and discussed. Experiments with a laboratory prototype with a 24 V/21 A output for server power supply were performed to verify the effectiveness of the proposed converter.
A multivariate time-series approach to marital interaction
Kupfer, Jörg; Brosig, Burkhard; Brähler, Elmar
2005-01-01
Time-series analysis (TSA) is frequently used in order to clarify complex structures of mutually interacting panel data. The method helps in understanding how the course of a dependent variable is predicted by independent time-series with no time lag, as well as by previous observations of that dependent variable (autocorrelation) and of independent variables (cross-correlation). The study analyzes the marital interaction of a married couple under clinical conditions over a period of 144 days by means of TSA. The data were collected within a course of couple therapy. The male partner was affected by a severe condition of atopic dermatitis and the woman suffered from bulimia nervosa. Each of the partners completed a mood questionnaire and a body symptom checklist. After the determination of auto- and cross-correlations between and within the parallel data sets, multivariate time-series models were specified. Mutual and individual patterns of emotional reactions explained 14% (skin) and 33% (bulimia) of the total variance in both dependent variables (adj. R², p<0.0001 for the multivariate models). The question was discussed whether multivariate TSA-models represent a suitable approach to the empirical exploration of clinical marital interaction. PMID:19742066
A multivariate time-series approach to marital interaction.
Kupfer, Jörg; Brosig, Burkhard; Brähler, Elmar
2005-08-02
Time-series analysis (TSA) is frequently used in order to clarify complex structures of mutually interacting panel data. The method helps in understanding how the course of a dependent variable is predicted by independent time-series with no time lag, as well as by previous observations of that dependent variable (autocorrelation) and of independent variables (cross-correlation). The study analyzes the marital interaction of a married couple under clinical conditions over a period of 144 days by means of TSA. The data were collected within a course of couple therapy. The male partner was affected by a severe condition of atopic dermatitis and the woman suffered from bulimia nervosa. Each of the partners completed a mood questionnaire and a body symptom checklist. After the determination of auto- and cross-correlations between and within the parallel data sets, multivariate time-series models were specified. Mutual and individual patterns of emotional reactions explained 14% (skin) and 33% (bulimia) of the total variance in both dependent variables (adj. R², p<0.0001 for the multivariate models). The question was discussed whether multivariate TSA-models represent a suitable approach to the empirical exploration of clinical marital interaction.
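The auto- and cross-correlations at a time lag that this study builds its models from can be sketched as follows (a minimal illustration with made-up daily scores, not the study's data or model):

```python
from math import sqrt

def cross_correlation(x, y, lag=0):
    """Pearson correlation between x[t] and y[t + lag]; with x is y and
    lag > 0 this is the lag-`lag` autocorrelation."""
    n = len(x) - lag
    xs, ys = x[:n], y[lag:lag + n]
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = sqrt(sum((a - mx) ** 2 for a in xs))
    sy = sqrt(sum((b - my) ** 2 for b in ys))
    return cov / (sx * sy)

mood = [1.0, 2.0, 3.0, 4.0, 5.0]
symptoms = [9.0, 1.0, 2.0, 3.0, 4.0]   # tracks mood with a one-day delay
r_cross = cross_correlation(mood, symptoms, lag=1)
r_auto = cross_correlation(mood, mood, lag=1)
```

A large lagged cross-correlation like `r_cross` is what suggests including the partner's previous-day score as a predictor in the multivariate model.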
Short time-series microarray analysis: Methods and challenges
Wang, Xuewei; Wu, Ming; Li, Zheng; Chan, Christina
2008-01-01
The detection and analysis of steady-state gene expression has become routine. Time-series microarrays are of growing interest to systems biologists for deciphering the dynamic nature and complex regulation of biosystems. Most temporal microarray data only contain a limited number of time points, giving rise to short-time-series data, which imposes challenges for traditional methods of extracting meaningful information. To obtain useful information from the wealth of short-time series data requires addressing the problems that arise due to limited sampling. Current efforts have shown promise in improving the analysis of short time-series microarray data, although challenges remain. This commentary addresses recent advances in methods for short-time series analysis including simplification-based approaches and the integration of multi-source information. Nevertheless, further studies and development of computational methods are needed to provide practical solutions to fully exploit the potential of this data. PMID:18605994
Nonlinear Analysis of Surface EMG Time Series
NASA Astrophysics Data System (ADS)
Zurcher, Ulrich; Kaufman, Miron; Sung, Paul
2004-04-01
Applications of nonlinear analysis of surface electromyography time series of patients with and without low back pain are presented. Limitations of the standard methods based on the power spectrum are discussed.
ERIC Educational Resources Information Center
Office of Education (DHEW), Washington, DC.
Comparing or estimating the costs of educational projects by merely using cost-per-student figures is imprecise and ignores area differences in prices. The resource approach to cost analysis begins by determining specific physical resources (such as facilities, staff, equipment, materials, and services) needed for a project. Then the cost of these…
Distinguishing chaotic time series from noise: A random matrix approach
NASA Astrophysics Data System (ADS)
Ye, Bin; Chen, Jianxing; Ju, Chen; Li, Huijun; Wang, Xuesong
2017-03-01
Deterministically chaotic systems can often give rise to random and unpredictable behaviors which make the time series obtained from them almost indistinguishable from noise. Motivated by the fact that data points in a chaotic time series will have intrinsic correlations between them, we propose a random matrix theory (RMT) approach to identify the deterministic or stochastic dynamics of the system. We show that the spectral distributions of the correlation matrices, constructed from the chaotic time series, deviate significantly from the predictions of random matrix ensembles. In contrast, the eigenvalue statistics for a noisy signal follow closely those of random matrix ensembles. Numerical results also indicate that the approach is to some extent robust to additive observational noise which pollutes the data in many practical situations. Our approach is efficient in recognizing the continuous chaotic dynamics underlying the evolution of the time series.
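The core comparison in this abstract can be sketched as follows: build a correlation matrix from segments of a series and compare its largest eigenvalue with the Marchenko-Pastur edge expected for pure noise. This is a minimal illustration under assumed segment counts and lengths, not the authors' implementation:

```python
import numpy as np

def correlation_spectrum(series, n_seg, seg_len):
    """Cut the series into segments, build their correlation matrix, and
    return its sorted eigenvalues. For i.i.d. noise the spectrum stays near
    the Marchenko-Pastur bulk; deterministic structure pushes eigenvalues out."""
    segs = np.array([series[i * seg_len:(i + 1) * seg_len] for i in range(n_seg)])
    segs = (segs - segs.mean(axis=1, keepdims=True)) / segs.std(axis=1, keepdims=True)
    return np.sort(np.linalg.eigvalsh(segs @ segs.T / seg_len))

n_seg, seg_len = 20, 200
mp_upper = (1 + np.sqrt(n_seg / seg_len)) ** 2   # Marchenko-Pastur upper edge

rng = np.random.default_rng(0)
ev_noise = correlation_spectrum(rng.standard_normal(n_seg * seg_len), n_seg, seg_len)
ev_signal = correlation_spectrum(np.sin(0.1 * np.arange(n_seg * seg_len)), n_seg, seg_len)
```

The noise spectrum stays near the bulk edge, while the deterministic (here sinusoidal, standing in for chaotic) series produces eigenvalues far outside it.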
Circulant Matrices and Time-Series Analysis
ERIC Educational Resources Information Center
Pollock, D. S. G.
2002-01-01
This paper sets forth some salient results in the algebra of circulant matrices which can be used in time-series analysis. It provides easy derivations of some results that are central to the analysis of statistical periodograms and empirical spectral density functions. A statistical test for the stationarity or homogeneity of empirical processes…
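The salient result the abstract alludes to is that a circulant matrix is diagonalized by the discrete Fourier transform, so its eigenvalues are the DFT of its first row; a small numerical check (illustrative values):

```python
import numpy as np

def circulant(first_row):
    """Build the circulant matrix whose i-th row is first_row rotated right by i."""
    n = len(first_row)
    return np.array([[first_row[(j - i) % n] for j in range(n)] for i in range(n)])

row = np.array([4.0, 1.0, 0.0, 1.0])      # symmetric pattern -> real spectrum
C = circulant(row)
eig_matrix = np.sort(np.linalg.eigvalsh(C))
eig_dft = np.sort(np.fft.fft(row).real)   # eigenvalues are the DFT of the first row
```

This identity is what makes circulant approximations of autocovariance matrices so convenient for periodogram and spectral-density analysis: eigen-decomposition reduces to an FFT.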
Modelling road accidents: An approach using structural time series
NASA Astrophysics Data System (ADS)
Junus, Noor Wahida Md; Ismail, Mohd Tahir
2014-09-01
In this paper, the trend of road accidents in Malaysia for the years 2001 until 2012 was modelled using a structural time series approach. The structural time series model was identified using a stepwise method, and the residuals for each model were tested. The best-fitted model was chosen based on the smallest Akaike Information Criterion (AIC) and prediction error variance. In order to check the quality of the model, a data validation procedure was performed by predicting the monthly number of road accidents for the year 2012. Results indicate that the best specification of the structural time series model to represent road accidents is the local level with a seasonal model.
Entropic Analysis of Electromyography Time Series
NASA Astrophysics Data System (ADS)
Kaufman, Miron; Sung, Paul
2005-03-01
We are in the process of assessing the effectiveness of fractal and entropic measures for the diagnosis of low back pain from surface electromyography (EMG) time series. Surface electromyography (EMG) is used to assess patients with low back pain. In a typical EMG measurement, the voltage is measured every millisecond. We observed back muscle fatiguing during one minute, which results in a time series with 60,000 entries. We characterize the complexity of time series by computing the time dependence of the Shannon entropy. The analysis of the time series from different relevant muscles from healthy and low back pain (LBP) individuals provides evidence that the level of variability of back muscle activities is much larger for healthy individuals than for individuals with LBP. In general the time dependence of the entropy shows a crossover from a diffusive regime to a regime characterized by long time correlations (self organization) at about 0.01 s.
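A histogram-based Shannon entropy of the kind used here can be sketched in a few lines (bin count and window values are illustrative, not the study's settings):

```python
from math import log2

def shannon_entropy(window, n_bins=8):
    """Histogram estimate of the Shannon entropy (in bits) for one window
    of EMG samples; higher entropy means more signal variability."""
    lo, hi = min(window), max(window)
    width = (hi - lo) / n_bins or 1.0       # guard against a flat window
    counts = [0] * n_bins
    for v in window:
        counts[min(int((v - lo) / width), n_bins - 1)] += 1
    n = len(window)
    return -sum(c / n * log2(c / n) for c in counts if c)

flat = shannon_entropy([5.0] * 60)                        # no variability
spread = shannon_entropy([float(v) for v in range(8)])    # uniform over 8 bins
```

Sliding this estimator along the 60,000-sample recording yields the entropy-versus-time curve whose crossover the abstract reports; low entropy in LBP patients reflects the reduced variability described above.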
Improved singular spectrum analysis for time series with missing data
NASA Astrophysics Data System (ADS)
Shen, Y.; Peng, F.; Li, B.
2015-07-01
Singular spectrum analysis (SSA) is a powerful technique for time series analysis. Based on the property that the original time series can be reproduced from its principal components, this contribution develops an improved SSA (ISSA) for processing the incomplete time series and the modified SSA (SSAM) of Schoellhamer (2001) is its special case. The approach is evaluated with the synthetic and real incomplete time series data of suspended-sediment concentration from San Francisco Bay. The result from the synthetic time series with missing data shows that the relative errors of the principal components reconstructed by ISSA are much smaller than those reconstructed by SSAM. Moreover, when the percentage of the missing data over the whole time series reaches 60 %, the improvements of relative errors are up to 19.64, 41.34, 23.27 and 50.30 % for the first four principal components, respectively. Both the mean absolute error and mean root mean squared error of the reconstructed time series by ISSA are also smaller than those by SSAM. The respective improvements are 34.45 and 33.91 % when the missing data accounts for 60 %. The results from real incomplete time series also show that the standard deviation (SD) derived by ISSA is 12.27 mg L-1, smaller than the 13.48 mg L-1 derived by SSAM.
Improved singular spectrum analysis for time series with missing data
NASA Astrophysics Data System (ADS)
Shen, Y.; Peng, F.; Li, B.
2014-12-01
Singular spectrum analysis (SSA) is a powerful technique for time series analysis. Based on the property that the original time series can be reproduced from its principal components, this contribution will develop an improved SSA (ISSA) for processing the incomplete time series and the modified SSA (SSAM) of Schoellhamer (2001) is its special case. The approach was evaluated with the synthetic and real incomplete time series data of suspended-sediment concentration from San Francisco Bay. The result from the synthetic time series with missing data shows that the relative errors of the principal components reconstructed by ISSA are much smaller than those reconstructed by SSAM. Moreover, when the percentage of the missing data over the whole time series reaches 60%, the improvements of relative errors are up to 19.64, 41.34, 23.27 and 50.30% for the first four principal components, respectively. Besides, both the mean absolute errors and mean root mean squared errors of the reconstructed time series by ISSA are also much smaller than those by SSAM. The respective improvements are 34.45 and 33.91% when the missing data accounts for 60%. The results from real incomplete time series also show that the SD derived by ISSA is 12.27 mg L-1, smaller than 13.48 mg L-1 derived by SSAM.
Noolvi, Malleshappa N.; Patel, Harun M.
2010-01-01
Epidermal growth factor receptor (EGFR) protein tyrosine kinases (PTKs) are known for their role in cancer. Quinazolines have been reported to be molecules of interest, with potent anticancer activity, acting by binding to the ATP site of protein kinases. The ATP binding site of protein kinases provides an extensive opportunity to design newer analogs. With this background, we report an attempt to discern the structural and physicochemical requirements for inhibition of EGFR tyrosine kinase. The k-Nearest Neighbor Molecular Field Analysis (kNN-MFA), a three-dimensional quantitative structure-activity relationship (3D-QSAR) method, has been used in the present case to study the correlation between the molecular properties and the tyrosine kinase (EGFR) inhibitory activities of a series of quinazoline derivatives. kNN-MFA calculations for both electrostatic and steric fields were carried out. The master grid maps derived from the best model have been used to display the contribution of electrostatic potential and steric field. The statistical results showed a significant correlation coefficient r2 (q2) of 0.846, an r2 for the external test set (pred_r2) of 0.8029, a coefficient of correlation of the predicted data set (pred_r2se) of 0.6658, 89 degrees of freedom and a k nearest neighbor of 2. Therefore, this study not only sheds light on the binding mechanism between EGFR and its inhibitors, but also provides hints for the design of new EGFR inhibitors with observable structural diversity. PMID:24825983
Complex network analysis of time series
NASA Astrophysics Data System (ADS)
Gao, Zhong-Ke; Small, Michael; Kurths, Jürgen
2016-12-01
Revealing complicated behaviors from time series constitutes a fundamental problem of continuing interest and it has attracted a great deal of attention from a wide variety of fields on account of its significant importance. The past decade has witnessed a rapid development of complex network studies, which allow one to characterize many types of systems in nature and technology that contain a large number of components interacting with each other in a complicated manner. Recently, complex network theory has been incorporated into the analysis of time series and fruitful achievements have been obtained. Complex network analysis of time series opens up new avenues to address interdisciplinary challenges in climate dynamics, multiphase flow, brain functions, ECG dynamics, economics and traffic systems.
Nonlinear Time Series Analysis via Neural Networks
NASA Astrophysics Data System (ADS)
Volná, Eva; Janošek, Michal; Kocian, Václav; Kotyrba, Martin
This article deals with time series analysis based on neural networks for effective pattern recognition in the forex market [Moore and Roche, J. Int. Econ. 58, 387-411 (2002)]. Our goal is to find and recognize important patterns that repeatedly appear in the market history and to adapt our trading system's behaviour accordingly.
Integrated method for chaotic time series analysis
Hively, L.M.; Ng, E.G.
1998-09-29
Methods and apparatus are disclosed for automatically detecting differences between similar but different states of a nonlinear process by monitoring nonlinear data. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated. 8 figs.
Integrated method for chaotic time series analysis
Hively, Lee M.; Ng, Esmond G.
1998-01-01
Methods and apparatus for automatically detecting differences between similar but different states of a nonlinear process by monitoring nonlinear data. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated.
A probability distribution approach to synthetic turbulence time series
NASA Astrophysics Data System (ADS)
Sinhuber, Michael; Bodenschatz, Eberhard; Wilczek, Michael
2016-11-01
The statistical features of turbulence can be described in terms of multi-point probability density functions (PDFs). The complexity of these statistical objects increases rapidly with the number of points. This raises the question of how much information has to be incorporated into statistical models of turbulence to capture essential features such as inertial-range scaling and intermittency. Using high Reynolds number hot-wire data obtained at the Variable Density Turbulence Tunnel at the Max Planck Institute for Dynamics and Self-Organization, we establish a PDF-based approach to generating synthetic time series that reproduce those features. To do this, we measure three-point conditional PDFs from the experimental data and use an adaption-rejection method to draw random velocities from this distribution to produce synthetic time series. Analyzing these synthetic time series, we find that time series based on even low-dimensional conditional PDFs already capture some essential features of real turbulent flows.
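The generating scheme described here, drawing each new velocity from a conditional PDF by rejection sampling, can be illustrated with a toy one-point conditional Gaussian (the paper conditions on three measured points; the distribution, parameters, and correlation value below are purely illustrative):

```python
import random
from math import exp

random.seed(0)

def rejection_sample(pdf, lo, hi, pdf_max):
    """Draw one sample from an (unnormalized) pdf on [lo, hi] by rejection."""
    while True:
        x = random.uniform(lo, hi)
        if random.uniform(0.0, pdf_max) <= pdf(x):
            return x

def synthetic_series(n, corr=0.8):
    """Toy analogue of the paper's scheme: each new value is drawn from a
    distribution conditioned on the previous one."""
    series = [0.0]
    for _ in range(n - 1):
        prev = series[-1]
        pdf = lambda v: exp(-0.5 * ((v - corr * prev) / 0.5) ** 2)
        series.append(rejection_sample(pdf, prev - 3.0, prev + 3.0, 1.0))
    return series

s = synthetic_series(2000)
```

Because each draw is conditioned on the previous value, the synthetic series inherits the target lag-1 correlation, which is the one-point version of reproducing the measured conditional statistics.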
Time-frequency analysis of electroencephalogram series
NASA Astrophysics Data System (ADS)
Blanco, S.; Quiroga, R. Quian; Rosso, O. A.; Kochen, S.
1995-03-01
In this paper we propose a method, based on the Gabor transform, to quantify and visualize the time evolution of the traditional frequency bands defined in the analysis of electroencephalogram (EEG) series. The information obtained in this way can be used for information transfer analyses of epileptic seizures as well as for their characterization. We found an optimal correlation between EEG visual inspection and the proposed method in the characterization of paroxysms, spikes, and other transient alterations of background activity. The dynamical changes during an epileptic seizure are shown through the phase portrait. The proposed method is exemplified with EEG series obtained with depth electrodes in refractory epileptic patients.
Delay Differential Analysis of Time Series
Lainscsek, Claudia; Sejnowski, Terrence J.
2015-01-01
Nonlinear dynamical system analysis based on embedding theory has been used for modeling and prediction, but it also has applications to signal detection and classification of time series. An embedding creates a multidimensional geometrical object from a single time series. Traditionally either delay or derivative embeddings have been used. The delay embedding is composed of delayed versions of the signal, and the derivative embedding is composed of successive derivatives of the signal. The delay embedding has been extended to nonuniform embeddings to take multiple timescales into account. Both embeddings provide information on the underlying dynamical system without having direct access to all the system variables. Delay differential analysis is based on functional embeddings, a combination of the derivative embedding with nonuniform delay embeddings. Small delay differential equation (DDE) models that best represent relevant dynamic features of time series data are selected from a pool of candidate models for detection or classification. We show that the properties of DDEs support spectral analysis in the time domain where nonlinear correlation functions are used to detect frequencies, frequency and phase couplings, and bispectra. These can be efficiently computed with short time windows and are robust to noise. For frequency analysis, this framework is a multivariate extension of discrete Fourier transform (DFT), and for higher-order spectra, it is a linear and multivariate alternative to multidimensional fast Fourier transform of multidimensional correlations. This method can be applied to short or sparse time series and can be extended to cross-trial and cross-channel spectra if multiple short data segments of the same experiment are available. Together, this time-domain toolbox provides higher temporal resolution, increased frequency and phase coupling information, and it allows an easy and straightforward implementation of higher-order spectra across time
Delay differential analysis of time series.
Lainscsek, Claudia; Sejnowski, Terrence J
2015-03-01
Nonlinear dynamical system analysis based on embedding theory has been used for modeling and prediction, but it also has applications to signal detection and classification of time series. An embedding creates a multidimensional geometrical object from a single time series. Traditionally either delay or derivative embeddings have been used. The delay embedding is composed of delayed versions of the signal, and the derivative embedding is composed of successive derivatives of the signal. The delay embedding has been extended to nonuniform embeddings to take multiple timescales into account. Both embeddings provide information on the underlying dynamical system without having direct access to all the system variables. Delay differential analysis is based on functional embeddings, a combination of the derivative embedding with nonuniform delay embeddings. Small delay differential equation (DDE) models that best represent relevant dynamic features of time series data are selected from a pool of candidate models for detection or classification. We show that the properties of DDEs support spectral analysis in the time domain where nonlinear correlation functions are used to detect frequencies, frequency and phase couplings, and bispectra. These can be efficiently computed with short time windows and are robust to noise. For frequency analysis, this framework is a multivariate extension of discrete Fourier transform (DFT), and for higher-order spectra, it is a linear and multivariate alternative to multidimensional fast Fourier transform of multidimensional correlations. This method can be applied to short or sparse time series and can be extended to cross-trial and cross-channel spectra if multiple short data segments of the same experiment are available. Together, this time-domain toolbox provides higher temporal resolution, increased frequency and phase coupling information, and it allows an easy and straightforward implementation of higher-order spectra across time
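The two embedding constructions this abstract builds on can be sketched directly (a minimal illustration; dimensions and delays are arbitrary choices, not values from the paper):

```python
def delay_embedding(x, dim, tau):
    """Uniform delay embedding: sample t becomes the point
    (x[t], x[t - tau], ..., x[t - (dim - 1) * tau])."""
    start = (dim - 1) * tau
    return [tuple(x[t - k * tau] for k in range(dim)) for t in range(start, len(x))]

def nonuniform_embedding(x, delays):
    """Nonuniform embedding with an explicit delay tuple, e.g. (0, 2, 7),
    so multiple timescales can be captured at once."""
    start = max(delays)
    return [tuple(x[t - d] for d in delays) for t in range(start, len(x))]

x = [float(v) for v in range(10)]
uni = delay_embedding(x, dim=3, tau=2)
non = nonuniform_embedding(x, delays=(0, 1, 5))
```

Delay differential analysis then fits small DDE models on such embedded coordinates rather than on the raw scalar series.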
Time-Series Analysis: A Cautionary Tale
NASA Technical Reports Server (NTRS)
Damadeo, Robert
2015-01-01
Time-series analysis has often been a useful tool in atmospheric science for deriving long-term trends in various atmospherically important parameters (e.g., temperature or the concentration of trace gas species). In particular, time-series analysis has been repeatedly applied to satellite datasets in order to derive the long-term trends in stratospheric ozone, which is a critical atmospheric constituent. However, many of the potential pitfalls relating to the non-uniform sampling of the datasets were often ignored and the results presented by the scientific community have been unknowingly biased. A newly developed and more robust application of this technique is applied to the Stratospheric Aerosol and Gas Experiment (SAGE) II version 7.0 ozone dataset and the previous biases and newly derived trends are presented.
Exploratory Causal Analysis in Bivariate Time Series Data
NASA Astrophysics Data System (ADS)
McCracken, James M.
Many scientific disciplines rely on observational data of systems for which it is difficult (or impossible) to implement controlled experiments, and data analysis techniques are required for identifying causal information and relationships directly from observational data. This need has led to the development of many different time series causality approaches and tools including transfer entropy, convergent cross-mapping (CCM), and Granger causality statistics. In this thesis, the existing time series causality method of CCM is extended by introducing a new method called pairwise asymmetric inference (PAI). It is found that CCM may provide counter-intuitive causal inferences for simple dynamics with strong intuitive notions of causality, and the CCM causal inference can be a function of physical parameters that are seemingly unrelated to the existence of a driving relationship in the system. For example, a CCM causal inference might alternate between ''voltage drives current'' and ''current drives voltage'' as the frequency of the voltage signal is changed in a series circuit with a single resistor and inductor. PAI is introduced to address both of these limitations. Many of the current approaches in the time series causality literature are not computationally straightforward to apply, do not follow directly from assumptions of probabilistic causality, depend on assumed models for the time series generating process, or rely on embedding procedures. A new approach, called causal leaning, is introduced in this work to avoid these issues. The leaning is found to provide causal inferences that agree with intuition for both simple systems and more complicated empirical examples, including space weather data sets. The leaning may provide a clearer interpretation of the results than those from existing time series causality tools. A practicing analyst can explore the literature to find many proposals for identifying drivers and causal connections in time series data.
Richmond, Amy; Sanchez, Belinda; Stevenson, Valerie; Baker, Russell T.; May, James; Nasypany, Alan; Reordan, Don
2016-01-01
Background Partial meniscectomy does not consistently produce the desired positive outcomes intended for meniscal tear lesions; therefore, a need exists for research into alternatives for treating symptoms of meniscal tears. The purpose of this case series was to examine the effect of the Mulligan Concept (MC) “Squeeze” technique in physically active participants who presented with clinical symptoms of meniscal tears. Description of Cases The MC “Squeeze” technique was applied in five cases of clinically diagnosed meniscal tears in a physically active population. The Numeric Pain Rating Scale (NRS), the Patient Specific Functional Scale (PSFS), the Disability in the Physically Active (DPA) Scale, and the Knee injury and Osteoarthritis Outcomes Score (KOOS) were administered to assess participant pain level and function. Outcomes Statistically significant improvements were found on cumulative NRS (p ≤ 0.001), current NRS (p ≤ 0.002), PSFS (p ≤ 0.003), DPA (p ≤ 0.019), and KOOS (p ≤ 0.002) scores across all five participants. All participants exceeded the minimal clinically important difference (MCID) on the first treatment and reported an NRS score and current pain score of one point or less at discharge. The MC “Squeeze” technique produced statistically and clinically significant changes across all outcome measures in all five participants. Discussion The use of the MC “Squeeze” technique in this case series indicated positive outcomes in five participants who presented with meniscal tear symptoms. Of importance to the athletic population, each of the participants continued to engage in sport activity as tolerated unless otherwise required during the treatment period. The outcomes reported in this case series exceed those reported when using traditional conservative therapy and the return to play timelines for meniscal tears treated with partial meniscectomies. Levels of Evidence Level 4 PMID:27525181
NASA Astrophysics Data System (ADS)
Wang, Dong; Singh, Vijay P.; Shang, Xiaosan; Ding, Hao; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Chen, Yuanfang; Chen, Xi; Wang, Shicheng; Wang, Zhenlong
2014-07-01
De-noising meteorologic and hydrologic time series is important for improving the accuracy and reliability of extraction, analysis, simulation, and forecasting. A hybrid approach combining sample entropy and wavelet de-noising is developed to separate noise from the original series; it is named AWDA-SE (adaptive wavelet de-noising approach using sample entropy). The AWDA-SE approach adaptively determines the threshold for wavelet analysis. Two kinds of meteorologic and hydrologic data are used to illustrate the approach: a synthetic data set and three representative field-measured data sets (the annual rainfall record of Jinan station, and two annual streamflow series from typical stations in China: Yingluoxia station on the Heihe River, which is little affected by human activities, and Lijin station on the Yellow River, which is greatly affected by human activities). The AWDA-SE approach is compared with three conventional de-noising methods: the fixed-form threshold algorithm, the Stein unbiased risk estimation algorithm, and the minimax algorithm. Results show that the AWDA-SE approach effectively separates signal from noise in the data sets and performs better than the conventional methods. Assessment measures show that the developed approach can be employed to investigate noisy and short time series and can also be applied in other areas.
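The core wavelet-thresholding step behind such methods can be sketched in a few lines. The sketch below uses a one-level Haar transform with the classical universal threshold; this is a minimal illustration under stated assumptions, not the paper's method — AWDA-SE chooses the threshold adaptively from sample entropy, which is not reproduced here, and all function names are illustrative.

```python
import math

def haar_decompose(x):
    """One-level Haar transform (len(x) must be even):
    returns (approximation, detail) coefficients."""
    approx = [(x[2 * i] + x[2 * i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    detail = [(x[2 * i] - x[2 * i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    return approx, detail

def haar_reconstruct(approx, detail):
    """Invert haar_decompose exactly."""
    out = []
    for a, d in zip(approx, detail):
        out.append((a + d) / math.sqrt(2))
        out.append((a - d) / math.sqrt(2))
    return out

def soft_threshold(coeffs, thr):
    """Shrink each coefficient toward zero by thr (soft thresholding)."""
    return [math.copysign(max(abs(c) - thr, 0.0), c) for c in coeffs]

def denoise(x):
    """De-noise by soft-thresholding the finest-scale detail coefficients.
    The universal threshold sigma*sqrt(2 ln N) is used here for simplicity;
    AWDA-SE instead sets the threshold adaptively via sample entropy."""
    approx, detail = haar_decompose(x)
    sigma = (sum(d * d for d in detail) / len(detail)) ** 0.5  # crude noise scale
    thr = sigma * math.sqrt(2 * math.log(len(x)))
    return haar_reconstruct(approx, soft_threshold(detail, thr))
```

In practice the noise scale is usually estimated robustly (e.g., from the median absolute deviation of the detail coefficients) rather than from the raw RMS used above.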
Jansen, Teunis; Kristensen, Kasper; Payne, Mark; Edwards, Martin; Schrum, Corinna; Pitois, Sophie
2012-01-01
We present a unique view of mackerel (Scomber scombrus) in the North Sea based on a new time series of larvae caught by the Continuous Plankton Recorder (CPR) survey from 1948-2005, covering the period both before and after the collapse of the North Sea stock. Hydrographic backtrack modelling suggested that the effect of advection is very limited between spawning and larvae capture in the CPR survey. Using a statistical technique not previously applied to CPR data, we then generated a larval index that accounts for catchability as well as spatial and temporal autocorrelation. The resulting time series documents the significant decrease of spawning from before 1970 to recent depleted levels. Spatial distributions of the larvae, and thus the spawning area, showed a shift from early to recent decades, suggesting that the central North Sea is no longer as important as the areas further west and south. These results provide a consistent and unique perspective on the dynamics of mackerel in this region and can potentially resolve many of the unresolved questions about this stock. PMID:22737221
Singular spectrum analysis for time series with missing data
Schoellhamer, D.H.
2001-01-01
Geophysical time series often contain missing data, which prevents analysis with many signal processing and multivariate tools. A modification of singular spectrum analysis for time series with missing data is developed and successfully tested with synthetic and actual incomplete time series of suspended-sediment concentration from San Francisco Bay. The method can also be used to low-pass filter incomplete time series.
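The key device for handling gaps — estimating second-order statistics from only the index pairs where both values are present — can be illustrated directly. This is just one ingredient of a gap-tolerant SSA (the eigen-decomposition and reconstruction steps are omitted), and the function name is illustrative, not from the paper.

```python
import statistics

def lag_covariance(x, max_lag):
    """Lag-covariance estimates for a series with gaps (None = missing):
    each lag uses only the index pairs where both values are present."""
    present = [v for v in x if v is not None]
    mean = statistics.mean(present)
    cov = []
    for lag in range(max_lag + 1):
        pairs = [(x[i] - mean) * (x[i + lag] - mean)
                 for i in range(len(x) - lag)
                 if x[i] is not None and x[i + lag] is not None]
        cov.append(sum(pairs) / len(pairs))
    return cov
```

A full SSA would assemble these estimates into a Toeplitz matrix and diagonalize it; the point here is only that no interpolation of the holes is required.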
Tremor classification and tremor time series analysis
NASA Astrophysics Data System (ADS)
Deuschl, Günther; Lauk, Michael; Timmer, Jens
1995-03-01
The separation between physiologic tremor (PT) in normal subjects and the pathological tremors of essential tremor (ET) or Parkinson's disease (PD) was investigated on the basis of monoaxial accelerometric recordings of 35 s hand tremor epochs. Frequency and amplitude were insufficient to separate these conditions, except for the trivial distinction between normal and pathologic tremors that is already defined on the basis of amplitude. We found that waveform analysis revealed highly significant differences between normal and pathologic tremors and, more importantly, among different forms of pathologic tremors. In our group of 25 patients with PT and 15 with ET, we found a reasonable distinction based on the third moment and time-reversal invariance, and a nearly complete distinction between these two conditions on the basis of the asymmetric decay of the autocorrelation function. We conclude that time series analysis can probably be developed into a powerful tool for the objective analysis of tremors.
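Two of the waveform statistics mentioned — a normalized third moment (skewness) and a time-reversal asymmetry measure — are easy to compute. The exact definitions used in the paper may differ, so treat the following as common textbook variants:

```python
def third_moment(x):
    """Normalized third central moment (skewness)."""
    n = len(x)
    m = sum(x) / n
    s = (sum((v - m) ** 2 for v in x) / n) ** 0.5
    return sum((v - m) ** 3 for v in x) / (n * s ** 3)

def time_reversal_asymmetry(x, tau=1):
    """Normalized mean cubed increment E[(x_{t+tau} - x_t)^3]: it vanishes
    for time-reversible series and is nonzero for asymmetric waveforms."""
    d = [x[i + tau] - x[i] for i in range(len(x) - tau)]
    num = sum(v ** 3 for v in d) / len(d)
    den = (sum(v ** 2 for v in d) / len(d)) ** 1.5
    return num / den
```

A sinusoid (time-reversible) gives values near zero for both statistics, while a sawtooth (slow rise, fast fall) gives a strongly negative time-reversal asymmetry.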
Compounding approach for univariate time series with nonstationary variances
NASA Astrophysics Data System (ADS)
Schäfer, Rudi; Barkhofen, Sonja; Guhr, Thomas; Stöckmann, Hans-Jürgen; Kuhl, Ulrich
2015-12-01
A defining feature of nonstationary systems is the time dependence of their statistical parameters. Measured time series may exhibit Gaussian statistics on short time horizons, due to the central limit theorem. The sample statistics for long time horizons, however, averages over the time-dependent variances. To model the long-term statistical behavior, we compound the local distribution with the distribution of its parameters. Here, we consider two concrete, but diverse, examples of such nonstationary systems: the turbulent air flow of a fan and a time series of foreign exchange rates. Our main focus is to empirically determine the appropriate parameter distribution for the compounding approach. To this end, we extract the relevant time scales by decomposing the time signals into windows and determine the distribution function of the thus obtained local variances.
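The first step of the compounding recipe — extracting the distribution of local variances from short windows — can be sketched as follows. The window length and the subsequent parametric fit of this distribution, which the paper determines empirically, are left out, and the function name is illustrative.

```python
def local_variances(x, window):
    """Variance within non-overlapping windows; the empirical distribution
    of these local variances is what the compounding ansatz averages over."""
    out = []
    for start in range(0, len(x) - window + 1, window):
        w = x[start:start + window]
        m = sum(w) / window
        out.append(sum((v - m) ** 2 for v in w) / window)
    return out
```

For a stationary series the local variances cluster around a single value; for a nonstationary series their spread is what shapes the heavy-tailed long-horizon statistics.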
Behavior of road accidents: Structural time series approach
NASA Astrophysics Data System (ADS)
Junus, Noor Wahida Md; Ismail, Mohd Tahir; Arsad, Zainudin
2014-12-01
Road accidents are a major contributor to the increasing number of deaths. Some researchers suggest that road accidents occur due to road structure and road condition, which may differ according to the area and the volume of traffic at the location. Therefore, this paper examines the behavior of road accidents in four main regions in Peninsular Malaysia by employing a structural time series (STS) approach. STS offers the possibility of modelling unobserved components such as trend and seasonal components, and allows them to vary over time. The results show that the number of road accidents in each region is described by a different model. The results imply that the government, and especially policy makers, should consider implementing a different approach in each region to curb the increasing number of road accidents.
Mixed Spectrum Analysis on fMRI Time-Series.
Kumar, Arun; Lin, Feng; Rajapakse, Jagath C
2016-06-01
Temporal autocorrelation present in functional magnetic resonance image (fMRI) data poses challenges to its analysis. The existing approaches handling autocorrelation in fMRI time-series often presume a specific model of autocorrelation, such as an auto-regressive model. The main limitation here is that the correlation structure of voxels is generally unknown and varies in different brain regions because of different levels of neurogenic noises and pulsatile effects. Enforcing a universal model on all brain regions leads to bias and loss of efficiency in the analysis. In this paper, we propose mixed spectrum analysis of the voxel time-series to separate the discrete component corresponding to input stimuli and the continuous component carrying temporal autocorrelation. A mixed spectral analysis technique based on the M-spectral estimator is proposed, which effectively removes autocorrelation effects from voxel time-series and identifies significant peaks of the spectrum. As the proposed method does not assume any prior model for the autocorrelation effect in voxel time-series, varying correlation structure among the brain regions does not affect its performance. We have modified the standard M-spectral method for application to a spatial set of time-series by incorporating contextual information related to the continuous spectrum of neighborhood voxels, thus considerably reducing the computational cost. The likelihood of activation is predicted by comparing the amplitude of the discrete component at the stimulus frequency across voxels of the brain, using a normal distribution and modeling spatial correlations among the likelihoods with a conditional random field. We also demonstrate the application of the proposed method in detecting other desired frequencies.
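The basic object behind any mixed-spectrum separation is the periodogram, in which a stimulus-locked (discrete) component appears as a sharp line on top of the smooth continuous spectrum. A brute-force DFT version is sketched below; the paper's M-spectral estimator is considerably more refined, so this only illustrates the underlying intuition.

```python
import math

def periodogram(x):
    """Raw DFT periodogram I(k) = |X_k|^2 / n for k = 0..n//2.
    A discrete (line) spectral component shows up as a sharp peak
    riding on the continuous (autocorrelated noise) part."""
    n = len(x)
    power = []
    for k in range(n // 2 + 1):
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        power.append((re * re + im * im) / n)
    return power
```

For a pure sinusoid at DFT bin k, essentially all power lands in that single bin, which is what makes the discrete component separable from the continuous background.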
Nonlinear independent component analysis and multivariate time series analysis
NASA Astrophysics Data System (ADS)
Storck, Jan; Deco, Gustavo
1997-02-01
We derive an information-theory-based unsupervised learning paradigm for nonlinear independent component analysis (NICA) with neural networks. We demonstrate that under the constraint of bounded and invertible output transfer functions, the two main goals of unsupervised learning, redundancy reduction and maximization of the transmitted information between input and output (Infomax principle), are equivalent. No assumptions are made concerning the kind of input and output distributions, i.e., the kind of nonlinearity of correlations. An adapted version of the general NICA network is used for the modeling of multivariate time series by unsupervised learning. Given time series of various observables of a dynamical system, our net learns their evolution in time by extracting statistical dependencies between past and present elements of the time series. Multivariate modeling is obtained by making the present value of each time series statistically independent not only of its own past but also of the past of the other series. Therefore, in contrast to univariate methods, the information lying in the couplings between the observables is also used, and a detection of higher-order cross correlations is possible. We apply our method to time series of the two-dimensional Hénon map and to experimental time series obtained from measurements of axial velocities at different locations in weakly turbulent Taylor-Couette flow.
Time series analysis of temporal networks
NASA Astrophysics Data System (ADS)
Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh
2016-01-01
A common but important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application-oriented studies only knowing these properties is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without knowledge of the full network structure; an estimate of some of the properties is sufficient to launch the attack. In this paper we show that even if the network structure at a future time point is not available, one can still estimate its properties. We propose a novel method to map a temporal network to a set of time series instances, analyze them, and, using a standard time series forecast model, predict the properties of the temporal network at a later time instance. To this end, we consider eight properties, such as the number of active nodes, average degree, and clustering coefficient, and apply our prediction framework to them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as an Auto-Regressive Integrated Moving Average (ARIMA) process. We use cross-validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency-domain properties of the time series obtained from spectrogram analysis can be used to refine the prediction framework by identifying beforehand the cases where the prediction error is likely to be high. This leads to an improvement of 7.96% (for error level ≤20%) in prediction accuracy on average across all datasets. As an application, we show how such a prediction scheme can be used to launch targeted attacks on temporal networks.
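The forecasting step can be illustrated with the simplest member of the ARIMA family, an AR(1) model fitted by least squares to one network-property series. The paper selects full ARIMA models per property, which this sketch does not attempt, and the function names are illustrative.

```python
def fit_ar1(x):
    """Least-squares fit of x_t = c + phi * x_{t-1}; returns (c, phi)."""
    xs, ys = x[:-1], x[1:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    phi = (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
           / sum((a - mx) ** 2 for a in xs))
    return my - phi * mx, phi

def forecast(x, steps):
    """Iterate the fitted AR(1) recursion forward from the last observation."""
    c, phi = fit_ar1(x)
    out, last = [], x[-1]
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return out
```

Fed, say, the daily average-degree series of a contact network, `forecast(series, 7)` would give a week-ahead property estimate without knowing the future network structure, which is exactly the use case the abstract describes.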
Flutter Analysis for Turbomachinery Using Volterra Series
NASA Technical Reports Server (NTRS)
Liou, Meng-Sing; Yao, Weigang
2014-01-01
The objective of this paper is to describe an accurate and efficient reduced order modeling method for aeroelastic (AE) analysis and for determining the flutter boundary. Without losing accuracy, we develop a reduced order model based on the Volterra series to achieve significant savings in computational cost. The aerodynamic force is provided by a high-fidelity solution from the Reynolds-averaged Navier-Stokes (RANS) equations; the structural mode shapes are determined from the finite element analysis. The fluid-structure coupling is then modeled by the state-space formulation with the structural displacement as input and the aerodynamic force as output, which in turn acts as an external force to the aeroelastic displacement equation for providing the structural deformation. NASA's rotor 67 blade is used to study its aeroelastic characteristics under the designated operating condition. First, the CFD results are validated against measured data available for the steady state condition. Then, the accuracy of the developed reduced order model is compared with the full-order solutions. Finally the aeroelastic solutions of the blade are computed and a flutter boundary is identified, suggesting that the rotor, with the material property chosen for the study, is structurally stable at the operating condition, free of encountering flutter.
A novel similarity comparison approach for dynamic ECG series.
Yin, Hong; Zhu, Xiaoqian; Ma, Shaodong; Yang, Shuqiang; Chen, Liqian
2015-01-01
The heart sound signal is a reflection of heart and vascular system motion. Long-term continuous electrocardiogram (ECG) recordings contain important information that can help to prevent heart failure. A single piece of a long-term ECG recording usually consists of more than one hundred thousand data points, making it difficult to derive hidden features that may be reflected through dynamic ECG monitoring, and very time-consuming to analyze. In this paper, a Dynamic Time Warping method based on MapReduce (MRDTW) is proposed to make prognoses of possible lesions in patients. Through comparison of a patient's real-time ECG with reference sets of normal and problematic cardiac waveforms, the experimental results reveal that our approach not only retains high accuracy, but also greatly improves the efficiency of the similarity measure in dynamic ECG series.
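The similarity measure at the heart of this approach is classic dynamic time warping (DTW), which aligns two series that run at slightly different speeds. A plain quadratic-time version is sketched below; the MapReduce partitioning that gives MRDTW its scalability is omitted.

```python
def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic time warping distance with
    absolute difference as the local cost."""
    inf = float("inf")
    cost = [[inf] * (len(b) + 1) for _ in range(len(a) + 1)]
    cost[0][0] = 0.0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            d = abs(a[i - 1] - b[j - 1])
            # Best of: match, stretch a, stretch b.
            cost[i][j] = d + min(cost[i - 1][j - 1], cost[i - 1][j], cost[i][j - 1])
    return cost[len(a)][len(b)]
```

Because DTW may repeat samples during alignment, a waveform and a time-stretched copy of it score a distance of zero, whereas a pointwise (Euclidean) comparison would not.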
TSAN: a package for time series analysis.
Wang, D C; Vagnucci, A H
1980-04-01
Many biomedical data are in the form of time series. Analyses of these data include: (1) search for any biorhythm; (2) test of homogeneity of several time series; (3) assessment of stationarity; (4) test of normality of the time series histogram; (5) evaluation of dependence between data points. In this paper we present a subroutine package called TSAN. It is developed to accomplish these tasks. Computational methods, as well as flowcharts, for these subroutines are described. Two sample runs are demonstrated.
Deciding on the best (in this case) approach to time-series forecasting
Pack, D. J.
1980-01-01
This paper was motivated by a Decision Sciences article (v. 10, no. 2, 232-244(April 1979)) that presented comparisons of the adaptive estimation procedure (AEP), adaptive filtering, the Box-Jenkins (BJ) methodology, and multiple regression analysis as they apply to time-series forecasting with single-series models. While such comparisons are to be applauded in general, it is demonstrated that the empirical comparisons of the above paper are quite misleading with respect to choosing between the AEP and BJ approaches. This demonstration is followed by a somewhat philosophical discussion on comparison-of-methods techniques.
The scaling of time series size towards detrended fluctuation analysis
NASA Astrophysics Data System (ADS)
Gao, Xiaolei; Ren, Liwei; Shang, Pengjian; Feng, Guochen
2016-06-01
In this paper, we introduce a modification of detrended fluctuation analysis (DFA), called the multivariate DFA (MNDFA) method, based on the scaling of time series size N. In the traditional DFA method we examined the influence of the sequence segmentation interval s, which inspired us to propose the new MNDFA model to discuss the scaling of time series size in DFA. The effectiveness of the procedure is verified by numerical experiments with both artificial and stock-return series. Results show that the proposed MNDFA method extracts more significant information from a series than the traditional DFA method. The scaling of time series size has an influence on the auto-correlation (AC) in time series. For certain series we obtain an exponential relationship and calculate the slope through the fitting function. Our analysis and a finite-size-effect test demonstrate that an appropriate choice of time series size can avoid unnecessary influences and also makes the testing results more accurate.
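For reference, the scaling object that both DFA and the proposed modification build on is the fluctuation function F(s). A plain DFA-1 version is sketched below (the size-scaling modification of the paper is not reproduced): integrate the mean-centered series into a profile, fit a line in each length-s segment, and take the RMS residual.

```python
def dfa_fluctuation(x, s):
    """DFA-1 fluctuation F(s): RMS residual around a linear fit in each
    non-overlapping length-s segment of the integrated (profile) series."""
    mean = sum(x) / len(x)
    profile, acc = [], 0.0
    for v in x:
        acc += v - mean
        profile.append(acc)
    total, count = 0.0, 0
    for seg in range(len(profile) // s):
        ys = profile[seg * s:(seg + 1) * s]
        ts = list(range(s))
        mt, my = sum(ts) / s, sum(ys) / s
        slope = (sum((t - mt) * (y - my) for t, y in zip(ts, ys))
                 / sum((t - mt) ** 2 for t in ts))
        icept = my - slope * mt
        total += sum((y - (icept + slope * t)) ** 2 for t, y in zip(ts, ys))
        count += s
    return (total / count) ** 0.5
```

The scaling exponent alpha is the slope of log F(s) versus log s; for uncorrelated noise it comes out near 0.5.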
Apparatus for statistical time-series analysis of electrical signals
NASA Technical Reports Server (NTRS)
Stewart, C. H. (Inventor)
1973-01-01
An apparatus for performing statistical time-series analysis of complex electrical signal waveforms, permitting prompt and accurate determination of statistical characteristics of the signal is presented.
NASA Astrophysics Data System (ADS)
Hu, Y. M.; Liang, Z. M.; Jiang, X. L.; Bu, H.
2015-06-01
In this paper, a novel approach for non-stationary hydrological frequency analysis is proposed. The approach is motivated by the consideration that the data series currently available for detecting mutation characteristics are very short and may reflect only part of the characteristics of the population; that is, the mutation characteristics of a short series may not fully represent those of the population, such as the difference in mutation degree between a short sample and the population. In the proposed method, it is assumed that a varying hydrological series within a large time window has an expected vibration center (EVC), defined as a linear combination of the two mean values of the two subsample series obtained by separating the original hydrological series with a novel optimal segmentation technique (the change rate of slope method). The EVC is then used to reconstruct the non-stationary series so that it meets the requirement of stationarity, which in turn ensures that conventional frequency analysis methods remain valid.
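The reconstruction idea — recentring both subseries on a common expected vibration center — reduces to a few lines once the split point is known. The paper's "change rate of slope" segmentation is not reproduced here, so the split index and mixing weight are treated as inputs, and the function name is illustrative.

```python
def reconstruct_with_evc(x, split, w=0.5):
    """Recenter both subseries on a common expected vibration center (EVC),
    a weighted combination of the two subsample means, so the reconstructed
    series is stationary in the mean."""
    a, b = x[:split], x[split:]
    m1, m2 = sum(a) / len(a), sum(b) / len(b)
    evc = w * m1 + (1 - w) * m2
    return [v - m1 + evc for v in a] + [v - m2 + evc for v in b]
```

After reconstruction both segments share the same mean (the EVC), so a conventional frequency analysis can be applied to the combined series.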
Time Series in Education: The Analysis of Daily Attendance in Two High Schools
ERIC Educational Resources Information Center
Koopmans, Matthijs
2011-01-01
This presentation discusses the use of a time series approach to the analysis of daily attendance in two urban high schools over the course of one school year (2009-10). After establishing that the series for both schools were stationary, they were examined for moving average processes, autoregression, seasonal dependencies (weekly cycles),…
A Time Series Approach for Soil Moisture Estimation
NASA Technical Reports Server (NTRS)
Kim, Yunjin; vanZyl, Jakob
2006-01-01
Soil moisture is a key parameter in understanding the global water cycle and in predicting natural hazards. Polarimetric radar measurements have been used for estimating the soil moisture of bare surfaces. In order to estimate soil moisture accurately, the surface roughness effect must be compensated properly. In addition, these algorithms will not produce accurate results for vegetated surfaces. It is difficult to retrieve the soil moisture of a vegetated surface since the radar backscattering cross section is sensitive to the vegetation structure and to environmental conditions such as the ground slope. Therefore, it is necessary to develop a method to estimate the effect of surface roughness and vegetation reliably. One way to remove the roughness effect and the vegetation contamination is to take advantage of the temporal variation of soil moisture. In order to understand the global hydrologic cycle, it is desirable to measure soil moisture with a one- to two-day revisit. Using these frequent measurements, a time series approach can be implemented to improve the soil moisture retrieval accuracy.
Time-series analysis of offshore-wind-wave groupiness
Liang, H.B.
1988-01-01
This research applies basic time-series-analysis techniques to the complex envelope function, for which the study of offshore-wind-wave groupiness is of direct interest. In constructing the complex envelope function, a phase-unwrapping technique is integrated into the algorithm to estimate the carrier frequency and preserve the phase information for further studies. The Gaussian random wave model forms the basis of the wave-group statistics derived from envelope-amplitude crossings. Good agreement between the theory and the analysis of field records is found. Other linear models, such as the individual-waves approach and the energy approach, are compared to the envelope approach by analyzing the same set of records. It is found that the character of the filter used in each approach dominates the wave-group statistics. Analyses indicate that deep offshore wind waves are weakly nonlinear and that the Gaussian random assumption remains appropriate for describing the sea state. Wave-group statistics derived from the Gaussian random wave model are thus applicable.
Approaches for mechanical joining of 7xxx series aluminum alloys
NASA Astrophysics Data System (ADS)
Jäckel, M.; Grimm, T.; Landgrebe, D.
2016-10-01
This paper presents a numerical and experimental analysis of the different problems occurring during or after conventional self-pierce riveting, with semi-tubular and solid rivets, of the high-strength aluminum alloy EN AW-7021 T4. Furthermore, it describes different pre-process methods by which fracture in the high-strength aluminum, caused by the self-pierce riveting processes, can be prevented and proper joining results achieved. On this basis, the different approaches are compared regarding joint strength.
Time-series analysis of Campylobacter incidence in Switzerland.
Wei, W; Schüpbach, G; Held, L
2015-07-01
Campylobacteriosis has been the most common food-associated notifiable infectious disease in Switzerland since 1995. Contact with and ingestion of raw or undercooked broilers are considered the dominant risk factors for infection. In this study, we investigated the temporal relationship between the disease incidence in humans and the prevalence of Campylobacter in broilers in Switzerland from 2008 to 2012. We use a time-series approach to describe the pattern of the disease by incorporating seasonal effects and autocorrelation. The analysis shows that prevalence of Campylobacter in broilers, with a 2-week lag, has a significant impact on disease incidence in humans. Therefore Campylobacter cases in humans can be partly explained by contagion through broiler meat. We also found a strong autoregressive effect in human illness, and a significant increase of illness during Christmas and New Year's holidays. In a final analysis, we corrected for the sampling error of prevalence in broilers and the results gave similar conclusions.
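The reported two-week lag relationship can be probed with a plain lagged Pearson correlation. The paper fits a full count time-series regression with seasonality and autoregression; this sketch is only the exploratory first step, with illustrative data and function names.

```python
def lagged_correlation(x, y, lag):
    """Pearson correlation between x_t and y_{t+lag} (x leads y by `lag`)."""
    xs, ys = x[:len(x) - lag], y[lag:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sx = sum((a - mx) ** 2 for a in xs) ** 0.5
    sy = sum((b - my) ** 2 for b in ys) ** 0.5
    return sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / (sx * sy)
```

Scanning `lag` over a few weeks and looking for the maximum of this correlation between broiler prevalence and human incidence is the simplest way to surface the kind of two-week lead the abstract reports.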
A time-series approach to dynamical systems from classical and quantum worlds
Fossion, Ruben
2014-01-08
This contribution discusses some recent applications of time-series analysis in Random Matrix Theory (RMT), and applications of RMT in the statistical analysis of eigenspectra of correlation matrices of multivariate time series.
Critical Thinking Skills. Analysis and Action Series.
ERIC Educational Resources Information Center
Heiman, Marcia; Slomianko, Joshua
Intended for teachers across grade levels and disciplines, this monograph reviews research on the development of critical thinking skills and introduces a series of these skills that can be incorporated into classroom teaching. Beginning with a definition of critical thinking, the monograph contains two main sections. The first section reviews…
Three Analysis Examples for Time Series Data
Technology Transfer Automated Retrieval System (TEKTRAN)
With improvements in instrumentation and the automation of data collection, plot level repeated measures and time series data are increasingly available to monitor and assess selected variables throughout the duration of an experiment or project. Records and metadata on variables of interest alone o...
Time Series Analysis of Mother-Infant Interaction.
ERIC Educational Resources Information Center
Rosenfeld, Howard M.
A method of studying attachment behavior in infants was devised using time series and time sequence analyses. Time series analysis refers to relationships between events coded over adjacent fixed-time units. Time sequence analysis refers to the distribution of exact times at which particular events happen. Using these techniques, multivariate…
The U-series comminution approach: where to from here
NASA Astrophysics Data System (ADS)
Handley, Heather; Turner, Simon; Afonso, Juan; Turner, Michael; Hesse, Paul
2015-04-01
Quantifying the rates of landscape evolution in response to climate change is inhibited by the difficulty of dating the formation of continental detrital sediments. The 'comminution age' dating model of DePaolo et al. (2006) hypothesises that the measured disequilibria between U-series nuclides (234U and 238U) in fine-grained continental (detrital) sediments can be used to calculate the time elapsed since mechanical weathering reduced a grain below the threshold size (~50 µm). The comminution age includes the time that a particle has spent mobilised in transport, held in temporary storage (e.g., soils and floodplains), and the time elapsed since final deposition to the present day. Therefore, if the deposition age of a sediment can be constrained independently, for example via optically stimulated luminescence (OSL) dating, the residence time of the sediment (e.g., a palaeochannel deposit) can be determined. Despite the significant potential of this approach, there is still much work to be done before meaningful absolute comminution ages can be obtained. The calculated recoil loss factor and comminution age are highly dependent on the method of recoil loss factor determination used and its inherent assumptions. We present new and recently published uranium isotope data for aeolian sediment deposits, leached and unleached palaeochannel sediments, and bedrock samples from Australia to exemplify areas of current uncertainty in the comminution age approach. In addition to the information gained from natural samples, Monte Carlo simulations have been conducted for a synthetic sediment sample to determine the individual and combined comminution age uncertainties associated with each input variable. Using a reasonable associated uncertainty for each input factor and including variations in the source rock and measured (234U/238U) ratios, the total combined uncertainty on comminution age in our simulation (for two methods of recoil loss factor estimation: weighted geometric and surface area
Time series analysis of Monte Carlo neutron transport calculations
NASA Astrophysics Data System (ADS)
Nease, Brian Robert
A time series based approach is applied to the Monte Carlo (MC) fission source distribution to calculate the non-fundamental mode eigenvalues of the system. The approach applies Principal Oscillation Patterns (POPs) to the fission source distribution, transforming the problem into a simple autoregressive order one (AR(1)) process. Proof is provided that the stationary MC process is linear to first order approximation, which is a requirement for the application of POPs. The autocorrelation coefficient of the resulting AR(1) process corresponds to the ratio of the desired mode eigenvalue to the fundamental mode eigenvalue. All modern k-eigenvalue MC codes calculate the fundamental mode eigenvalue, so the desired mode eigenvalue can be easily determined. The strength of this approach is contrasted against the Fission Matrix method (FMM) in terms of accuracy versus computer memory constraints. Multi-dimensional problems are considered since the approach has strong potential for use in reactor analysis, and the implementation of the method into production codes is discussed. Lastly, the appearance of complex eigenvalues is investigated and solutions are provided.
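The heart of this approach is that for an AR(1) process the lag-1 autocorrelation of the POP-projected fission-source coefficients equals the ratio of the desired mode eigenvalue to the fundamental, so the higher mode follows from the fundamental k computed by the MC code. A sketch of the estimator is below; the POP projection itself is omitted, and a synthetic AR(1) series with known ratio stands in for MC output.

```python
import random

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation; for an AR(1) process this estimates
    the autoregressive coefficient, i.e. the eigenvalue ratio k1/k0."""
    m = sum(x) / len(x)
    num = sum((x[i] - m) * (x[i + 1] - m) for i in range(len(x) - 1))
    den = sum((v - m) ** 2 for v in x)
    return num / den

# Synthetic coefficient series with a known ratio of 0.8 standing in
# for the POP-projected fission-source generations:
random.seed(2)
a, series = 0.0, []
for _ in range(20000):
    a = 0.8 * a + random.gauss(0, 1)
    series.append(a)

k0 = 1.0                        # fundamental eigenvalue from the MC code
k1 = lag1_autocorr(series) * k0  # estimated higher-mode eigenvalue
```

With real MC data the series would be one POP coefficient per fission generation, and the estimate sharpens as more stationary generations are accumulated.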
Interstage Flammability Analysis Approach
NASA Technical Reports Server (NTRS)
Little, Jeffrey K.; Eppard, William M.
2011-01-01
The Interstage of the Ares I launch platform houses several key components which are on standby during First Stage operation: the Reaction Control System (ReCS), the Upper Stage (US) Thrust Vector Control (TVC) and the J-2X with the Main Propulsion System (MPS) propellant feed system. Therefore potentially dangerous leaks of propellants could develop. The Interstage leaks analysis addresses the concerns of localized mixing of hydrogen and oxygen gases to produce deflagration zones in the Interstage of the Ares I launch vehicle during First Stage operation. This report details the approach taken to accomplish the analysis. Specified leakage profiles and actual flammability results are not presented due to proprietary and security restrictions. The interior volume formed by the Interstage walls, bounding interfaces with the Upper and First Stages, and surrounding the J2-X engine was modeled using Loci-CHEM to assess the potential for flammable gas mixtures to develop during First Stage operations. The transient analysis included a derived flammability indicator based on mixture ratios to maintain achievable simulation times. Validation of results was based on a comparison to Interstage pressure profiles outlined in prior NASA studies. The approach proved useful in the bounding of flammability risk in supporting program hazard reviews.
Multifractal Time Series Analysis Based on Detrended Fluctuation Analysis
NASA Astrophysics Data System (ADS)
Kantelhardt, Jan; Stanley, H. Eugene; Zschiegner, Stephan; Bunde, Armin; Koscielny-Bunde, Eva; Havlin, Shlomo
2002-03-01
In order to develop an easily applicable method for the multifractal characterization of non-stationary time series, we generalize the detrended fluctuation analysis (DFA), which is a well-established method for the determination of monofractal scaling properties and the detection of long-range correlations. We relate the new multifractal DFA method to the standard partition-function-based multifractal formalism, and compare it to the wavelet transform modulus maxima (WTMM) method, which is a well-established but more difficult procedure for this purpose. We employ the multifractal DFA method to determine whether the heart rhythm during different sleep stages is characterized by different multifractal properties.
A statistical approach for segregating cognitive task stages from multivariate fMRI BOLD time series
Demanuele, Charmaine; Bähner, Florian; Plichta, Michael M.; Kirsch, Peter; Tost, Heike; Meyer-Lindenberg, Andreas; Durstewitz, Daniel
2015-01-01
Multivariate pattern analysis can reveal new information from neuroimaging data to illuminate human cognition and its disturbances. Here, we develop a methodological approach, based on multivariate statistical/machine learning and time series analysis, to discern cognitive processing stages from functional magnetic resonance imaging (fMRI) blood oxygenation level dependent (BOLD) time series. We apply this method to data recorded from a group of healthy adults whilst performing a virtual reality version of the delayed win-shift radial arm maze (RAM) task. This task has been frequently used to study working memory and decision making in rodents. Using linear classifiers and multivariate test statistics in conjunction with time series bootstraps, we show that different cognitive stages of the task, as defined by the experimenter, namely, the encoding/retrieval, choice, reward and delay stages, can be statistically discriminated from the BOLD time series in brain areas relevant for decision making and working memory. Discrimination of these task stages was significantly reduced during poor behavioral performance in dorsolateral prefrontal cortex (DLPFC), but not in the primary visual cortex (V1). Experimenter-defined dissection of time series into class labels based on task structure was confirmed by an unsupervised, bottom-up approach based on Hidden Markov Models. Furthermore, we show that different groupings of recorded time points into cognitive event classes can be used to test hypotheses about the specific cognitive role of a given brain region during task execution. We found that whilst the DLPFC strongly differentiated between task stages associated with different memory loads but not between different visual-spatial aspects, the reverse was true for V1. Our methodology illustrates how different aspects of cognitive information processing during one and the same task can be separated and attributed to specific brain regions based on information contained in
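The combination of a linear classifier with a resampling-based significance test can be sketched as follows. The data, nearest-centroid classifier, and label-permutation test here are hypothetical stand-ins for the multivariate statistics and time-series bootstraps used in the paper:

```python
import numpy as np

def nearest_centroid_acc(X, y):
    """Leave-one-out accuracy of a nearest-centroid (linear) classifier."""
    correct = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        c0 = X[mask & (y == 0)].mean(axis=0)   # class centroids without trial i
        c1 = X[mask & (y == 1)].mean(axis=0)
        pred = int(np.sum((X[i] - c1) ** 2) < np.sum((X[i] - c0) ** 2))
        correct += pred == y[i]
    return correct / len(y)

rng = np.random.default_rng(6)
# toy "BOLD patterns": 40 trials x 20 voxels, two task stages with offset means
X = rng.standard_normal((40, 20))
y = np.repeat([0, 1], 20)
X[y == 1] += 0.8
acc = nearest_centroid_acc(X, y)

# permutation test: how often does a shuffled labelling match the real accuracy?
null = [nearest_centroid_acc(X, rng.permutation(y)) for _ in range(200)]
p = np.mean([a >= acc for a in null])
```

The permutation p-value plays the role of the bootstrap-based significance assessment: if stage labels carry no information, shuffled labels should classify about as well as the real ones.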
Assessing Spontaneous Combustion Instability with Nonlinear Time Series Analysis
NASA Technical Reports Server (NTRS)
Eberhart, C. J.; Casiano, M. J.
2015-01-01
Considerable interest lies in the ability to characterize the onset of spontaneous instabilities within liquid propellant rocket engine (LPRE) combustion devices. Linear techniques, such as fast Fourier transforms, various correlation parameters, and critical damping parameters, have been used at great length for over fifty years. Recently, nonlinear time series methods have been applied to deduce information pertaining to instability incipiency hidden in seemingly stochastic combustion noise. A technique commonly used in the biological sciences, known as Multifractal Detrended Fluctuation Analysis, has been extended to the combustion dynamics field, and is introduced here as a data analysis approach complementary to linear ones. Building on this, a modified technique is leveraged to extract signatures of impending combustion instability that present themselves prior to growth to limit-cycle amplitudes. The analysis is demonstrated on data from J-2X gas generator testing during which a distinct spontaneous instability was observed. Comparisons are made to previous work wherein the data were characterized using linear approaches. The technique is verified by examining idealized signals and comparing two separate, independently developed tools.
Taylor series expansion and modified extended Prony analysis for localization
Mosher, J.C.; Lewis, P.S.
1994-12-01
In the multiple source localization problem, many inverse routines use the rooting of a polynomial to determine the source locations. The authors present a rooting algorithm for locating an unknown number of three-dimensional, near-field, static sources from measurements at an arbitrarily spaced three-dimensional array. Since the sources are near-field and static, the spatial covariance matrix is always rank one, and spatial smoothing approaches are inappropriate due to the spatial diversity. The authors approach the solution through spherical harmonics, essentially replacing the point source function with its Taylor series expansion. They then perform a modified extended Prony analysis of the expansion coefficients to determine the number and location of the sources. The full inverse method is typically ill-conditioned, but a portion of the algorithm is suitable for synthesis analysis. They present a simulation of point charges confined to a spherical region, using an array of voltage potential measurements made outside the region. Future efforts will focus on adapting the analysis to electroencephalography and magnetoencephalography.
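The paper's modified extended Prony analysis is specific to its spherical-harmonic setting, but the classic Prony idea — fit a sum of exponential modes by linear prediction, then root a polynomial to recover the modes — can be sketched as follows (numpy assumed):

```python
import numpy as np

def prony(y, p):
    """Classic Prony fit of y[n] ~ sum_k a_k * z_k**n with p modes."""
    N = len(y)
    # linear prediction: y[n] = -c1*y[n-1] - ... - cp*y[n-p]
    A = np.column_stack([y[p - 1 - k: N - 1 - k] for k in range(p)])
    c = np.linalg.lstsq(A, -y[p:], rcond=None)[0]
    z = np.roots(np.concatenate(([1.0], c)))      # mode poles = polynomial roots
    V = np.vander(z, N, increasing=True).T        # columns z_k**n
    a = np.linalg.lstsq(V, y.astype(complex), rcond=None)[0]
    return z, a

n = np.arange(40, dtype=float)
y = 2.0 * 0.9 ** n + 1.0 * 0.5 ** n               # two decaying modes
z, a = prony(y, p=2)                              # recovers poles 0.9 and 0.5
```

The "rooting of a polynomial" mentioned in the abstract is the `np.roots` step: the linear-prediction coefficients define a characteristic polynomial whose roots are the mode parameters.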
Dean, Dennis A.; Adler, Gail K.; Nguyen, David P.; Klerman, Elizabeth B.
2014-01-01
We present a novel approach for analyzing biological time-series data using a context-free language (CFL) representation that allows the extraction and quantification of important features from the time-series. This representation results in Hierarchically AdaPtive (HAP) analysis, a suite of multiple complementary techniques that enable rapid analysis of data and does not require the user to set parameters. HAP analysis generates hierarchically organized parameter distributions that allow multi-scale components of the time-series to be quantified and includes a data analysis pipeline that applies recursive analyses to generate hierarchically organized results that extend traditional outcome measures such as pharmacokinetics and inter-pulse interval. Pulsicons, a novel text-based time-series representation also derived from the CFL approach, are introduced as an objective qualitative comparison nomenclature. We apply HAP to the analysis of 24 hours of frequently sampled pulsatile cortisol hormone data, which has known analysis challenges, from 14 healthy women. HAP analysis generated results in seconds and produced dozens of figures for each participant. The results quantify the observed qualitative features of cortisol data as a series of pulse clusters, each consisting of one or more embedded pulses, and identify two ultradian phenotypes in this dataset. HAP analysis is designed to be robust to individual differences and to missing data and may be applied to other pulsatile hormones. Future work can extend HAP analysis to other time-series data types, including oscillatory and other periodic physiological signals. PMID:25184442
Performance of multifractal detrended fluctuation analysis on short time series
NASA Astrophysics Data System (ADS)
López, Juan Luis; Contreras, Jesús Guillermo
2013-02-01
The performance of multifractal detrended fluctuation analysis on short time series is evaluated for synthetic samples of several mono- and multifractal models. The reconstruction of the generalized Hurst exponents is used to determine the range of applicability of the method and the precision of its results as a function of the decreasing length of the series. As an application, the series of the daily exchange rate between the U.S. dollar and the euro is studied.
Nonlinear times series analysis of epileptic human electroencephalogram (EEG)
NASA Astrophysics Data System (ADS)
Li, Dingzhou
The problem of seizure anticipation in patients with epilepsy has attracted significant attention in the past few years. In this paper we discuss two approaches, using methods of nonlinear time series analysis applied to scalp electrode recordings, which are able to distinguish between epochs temporally distant from, and just prior to, the onset of a seizure in patients with temporal lobe epilepsy. First we describe a method involving a comparison of recordings taken from electrodes adjacent to and remote from the site of the seizure focus. In particular, we define a nonlinear quantity which we call marginal predictability. This quantity is computed using data from remote and from adjacent electrodes. We find that the difference between the marginal predictabilities computed for the remote and adjacent electrodes decreases several tens of minutes prior to seizure onset, compared to its value interictally. We also show that these differences in marginal predictability are independent of the behavioral state of the patient. Next we examine the phase coherence between different electrodes at both long range and short range. Temporally distant from seizure onset ("interictally"), epileptic patients have lower long-range phase coherence in the delta (1-4 Hz) and beta (18-30 Hz) frequency bands compared to nonepileptic subjects. When seizures approach ("preictally"), we observe an increase in phase coherence in the beta band. However, interictally there is no difference in short-range phase coherence between this cohort of patients and non-epileptic subjects. Preictally, short-range phase coherence also increases in the alpha (10-13 Hz) and beta bands. Next we apply the marginal predictability to the phase-difference time series. Such marginal predictabilities are lower in the patients than in the non-epileptic subjects. However, as a seizure approaches, the former moves asymptotically towards the latter.
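Phase coherence between two channels is commonly quantified by the phase-locking value of their analytic-signal phases; a minimal sketch follows (numpy assumed; the thesis's exact estimator may differ, and the FFT construction below stands in for a Hilbert-transform routine):

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (positive frequencies doubled)."""
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = 1.0
    h[1:N // 2] = 2.0
    if N % 2 == 0:
        h[N // 2] = 1.0
    return np.fft.ifft(X * h)

def phase_coherence(x, y):
    """Mean phase-locking value between two signals (1 = locked, 0 = none)."""
    dphi = np.angle(analytic_signal(x)) - np.angle(analytic_signal(y))
    return abs(np.mean(np.exp(1j * dphi)))

t = np.linspace(0, 1, 1000, endpoint=False)
# constant phase offset -> high coherence; noise -> low coherence
locked = phase_coherence(np.sin(2 * np.pi * 20 * t),
                         np.sin(2 * np.pi * 20 * t + 0.7))
rng = np.random.default_rng(1)
unlocked = phase_coherence(np.sin(2 * np.pi * 20 * t),
                           rng.standard_normal(1000))
```

In practice the signals would first be band-pass filtered (e.g. to the beta band) so that the instantaneous phase is well defined.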
SAGE: A tool for time-series analysis of Greenland
NASA Astrophysics Data System (ADS)
Duerr, R. E.; Gallaher, D. W.; Khalsa, S. S.; Lewis, S.
2011-12-01
The National Snow and Ice Data Center (NSIDC) has developed an operational tool for time-series analysis. This production tool is known as "Services for the Analysis of the Greenland Environment" (SAGE). Using an integrated workspace approach, a researcher has the ability to find relevant data and perform various analysis functions on the data, as well as retrieve the data and analysis results. While there continues to be compelling observational evidence for increased surface melting and rapid thinning along the margins of the Greenland ice sheet, there are still uncertainties with respect to estimates of the mass balance of Greenland's ice sheet as a whole. To better understand the dynamics of these issues, it is important for scientists to have access to a variety of datasets from multiple sources, and to be able to integrate and analyze the data. SAGE provides data from various sources, such as AMSR-E and AVHRR datasets, which can be analyzed individually through various time-series plots and aggregation functions, or together with scatterplots or overlaid time-series plots, to provide quick and useful results supporting various research products. The application is available at http://nsidc.org/data/sage/. SAGE was built on top of NSIDC's existing Searchlight engine. The SAGE interface gives users access to much of NSIDC's relevant Greenland raster data holdings, as well as data from outside sources. Additionally, various web services allow other clients to use the functionality that the SAGE interface provides. Combined, these methods of accessing the tool allow scientists to devote more of their time to their research and less to finding and retrieving the data they need.
Wavelet transform approach for fitting financial time series data
NASA Astrophysics Data System (ADS)
Ahmed, Amel Abdoullah; Ismail, Mohd Tahir
2015-10-01
This study investigates a newly developed technique, combining wavelet filtering with a vector error correction (VEC) model, to study the dynamic relationships among financial time series. A wavelet filter is used to remove noise from daily data for the US NASDAQ stock market and three stock markets of the Middle East and North Africa (MENA) region, namely Egypt, Jordan, and Istanbul. The data cover the period from 6/29/2001 to 5/5/2009. The returns of the wavelet-filtered series and of the original series are then analyzed by a cointegration test and the VEC model. The results affirm the existence of cointegration between the studied series: there is a long-term relationship between the US and MENA stock markets. A comparison between the proposed and traditional models demonstrates that the proposed model (DWT with VEC model) outperforms the traditional model (VEC model alone) in fitting the financial stock market series, and reveals real information about the relationships among the stock markets.
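A minimal sketch of wavelet filtering by soft-thresholding detail coefficients, here with a hand-rolled one-level Haar transform rather than the full DWT used in the study (numpy assumed; threshold and signal are illustrative):

```python
import numpy as np

def haar_dwt(x):
    """One level of the orthonormal Haar wavelet transform (even length)."""
    s = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    return s, d

def haar_idwt(s, d):
    """Exact inverse of haar_dwt."""
    x = np.empty(2 * len(s))
    x[0::2] = (s + d) / np.sqrt(2)
    x[1::2] = (s - d) / np.sqrt(2)
    return x

def denoise(x, thresh):
    """Soft-threshold the detail coefficients to suppress noise."""
    s, d = haar_dwt(x)
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0)
    return haar_idwt(s, d)

rng = np.random.default_rng(2)
clean = np.sin(np.linspace(0, 4 * np.pi, 512))     # smooth "price" component
noisy = clean + 0.3 * rng.standard_normal(512)
filtered = denoise(noisy, thresh=0.3)
```

For a smooth underlying series the detail coefficients are dominated by noise, so shrinking them reduces the error relative to the clean signal; the filtered series would then feed the cointegration/VEC stage.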
Automatising the analysis of stochastic biochemical time-series
2015-01-01
Background Mathematical and computational modelling of biochemical systems has seen a lot of effort devoted to the definition and implementation of high-performance mechanistic simulation frameworks. Within these frameworks it is possible to analyse complex models under a variety of configurations, eventually selecting the best setting of, e.g., parameters for a target system. Motivation This operational pipeline relies on the ability to interpret the predictions of a model, often represented as simulation time-series. Thus, an efficient data analysis pipeline is crucial to automatise time-series analyses, bearing in mind that errors in this phase might mislead the modeller's conclusions. Results For this reason we have developed an intuitive framework-independent Python tool to automate analyses common to a variety of modelling approaches. These include assessment of useful non-trivial statistics for simulation ensembles, e.g., estimation of master equations. Intuitive and domain-independent batch scripts will allow the researcher to automatically prepare reports, thus speeding up the usual model-definition, testing and refinement pipeline. PMID:26051821
Sun, Tao; Liu, Hongbo; Yu, Hong; Chen, C L Philip
2016-06-28
The central time series crystallizes the common patterns of the set it represents. In this paper, we propose a global constrained degree-pruning dynamic programming (g(dp)²) approach to obtain the central time series by minimizing the dynamic time warping (DTW) distance between two time series. The DTW matching-path theory with global constraints is proved theoretically for our degree-pruning strategy, which helps to reduce the time complexity and computational cost. Our approach can achieve the optimal solution between two time series. An approximate method for the central time series of multiple time series [called m_g(dp)²] is presented based on DTW barycenter averaging and our g(dp)² approach, using a hierarchical merging strategy. As illustrated by the experimental results, our approaches provide better within-group sum of squares and robustness than other relevant algorithms.
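The DTW distance minimized by the g(dp)² approach can be computed with the textbook dynamic program; this sketch omits the paper's global constraints and degree pruning (numpy assumed, squared-distance local cost):

```python
import numpy as np

def dtw(x, y):
    """Dynamic time warping distance between two 1-D series."""
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (x[i - 1] - y[j - 1]) ** 2
            # best of insertion, deletion, and match moves
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

a = np.array([0.0, 1.0, 2.0, 1.0, 0.0])
b = np.array([0.0, 0.0, 1.0, 2.0, 1.0, 0.0])   # same shape, time-stretched
```

Because warping can align the stretched copy sample-for-sample, `dtw(a, b)` is exactly zero here, whereas a pointwise Euclidean distance would not be. Global constraints (e.g. a Sakoe-Chiba band) restrict how far `(i, j)` may stray from the diagonal, which is what makes the pruning in g(dp)² effective.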
NASA Technical Reports Server (NTRS)
Gao, X. H.; Stanford, J. L.
1988-01-01
The formulas for performing several statistical calculations based on Fourier coefficients are presented for use in atmospheric observational studies. The calculations discussed include a method for estimating the degrees of temporal freedom of two correlated time series and a method for performing seasonal analyses using a half-year summer/winter projection operator in the frequency domain. A modified lag-correlation calculation is proposed for obtaining lag correlations in the frequency domain. Also, a spectral approach for Empirical Orthogonal Function (EOF) and Extended EOF analysis is given which reduces the size of the matrix to be solved in the eigenproblem.
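Lag correlations in the frequency domain follow from the cross-correlation theorem: transform both series, multiply one spectrum by the conjugate of the other, and invert. A sketch of the standard (unmodified) calculation, with zero-padding to avoid circular wrap-around (numpy assumed):

```python
import numpy as np

def lag_correlation(x, y, max_lag):
    """Normalized cross-correlation of x and y at lags -max_lag..max_lag via FFT."""
    n = len(x)
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    X = np.fft.fft(x, 2 * n)                  # zero-pad to 2n
    Y = np.fft.fft(y, 2 * n)
    cc = np.fft.ifft(np.conj(X) * Y).real / n # cc[m] = sum_t x[t] y[t+m] / n
    lags = np.arange(-max_lag, max_lag + 1)
    return lags, np.concatenate((cc[-max_lag:], cc[:max_lag + 1]))

rng = np.random.default_rng(3)
x = rng.standard_normal(500)
y = np.roll(x, 5)                             # y is x delayed by 5 samples
lags, cc = lag_correlation(x, y, max_lag=10)  # peak near 1 at lag +5
```

The FFT route costs O(n log n) versus O(n·max_lag) for direct summation, which is the point of doing the calculation in the frequency domain.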
Statistical Evaluation of Time Series Analysis Techniques
NASA Technical Reports Server (NTRS)
Benignus, V. A.
1973-01-01
The performance of a modified version of NASA's multivariate spectrum analysis program is discussed. A multiple regression model was used to make the revisions. Performance improvements were documented and compared to the standard fast Fourier transform by Monte Carlo techniques.
Wang, Jin; Sun, Xiangping; Nahavandi, Saeid; Kouzani, Abbas; Wu, Yuchuan; She, Mary
2014-11-01
Biomedical time series clustering that automatically groups a collection of time series according to their internal similarity is of importance for medical record management and inspection such as bio-signals archiving and retrieval. In this paper, a novel framework that automatically groups a set of unlabelled multichannel biomedical time series according to their internal structural similarity is proposed. Specifically, we treat a multichannel biomedical time series as a document and extract local segments from the time series as words. We extend a topic model, i.e., the Hierarchical probabilistic Latent Semantic Analysis (H-pLSA), which was originally developed for visual motion analysis to cluster a set of unlabelled multichannel time series. The H-pLSA models each channel of the multichannel time series using a local pLSA in the first layer. The topics learned in the local pLSA are then fed to a global pLSA in the second layer to discover the categories of multichannel time series. Experiments on a dataset extracted from multichannel Electrocardiography (ECG) signals demonstrate that the proposed method performs better than previous state-of-the-art approaches and is relatively robust to the variations of parameters including length of local segments and dictionary size. Although the experimental evaluation used the multichannel ECG signals in a biometric scenario, the proposed algorithm is a universal framework for multichannel biomedical time series clustering according to their structural similarity, which has many applications in biomedical time series management.
A Dimensionality Reduction Technique for Efficient Time Series Similarity Analysis
Wang, Qiang; Megalooikonomou, Vasileios
2008-01-01
We propose a dimensionality reduction technique for time series analysis that significantly improves the efficiency and accuracy of similarity searches. In contrast to piecewise constant approximation (PCA) techniques that approximate each time series with constant value segments, the proposed method--Piecewise Vector Quantized Approximation--uses the closest (based on a distance measure) codeword from a codebook of key-sequences to represent each segment. The new representation is symbolic and it allows for the application of text-based retrieval techniques into time series similarity analysis. Experiments on real and simulated datasets show that the proposed technique generally outperforms PCA techniques in clustering and similarity searches. PMID:18496587
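The encoding step of such a piecewise vector-quantized approximation can be sketched as follows: split the series into fixed-length segments and replace each with the index of the closest codeword. The toy codebook below is hand-made for illustration; in the actual method the codebook of key-sequences is learned from training data:

```python
import numpy as np

def pvqa_encode(x, seg_len, codebook):
    """Encode a series as the sequence of nearest-codeword indices per segment."""
    n_seg = len(x) // seg_len
    segs = x[:n_seg * seg_len].reshape(n_seg, seg_len)
    # Euclidean distance from every segment to every codeword
    d = ((segs[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)

# toy codebook of key-sequences: flat, rising, and falling shapes
codebook = np.array([
    [0.0, 0.0, 0.0, 0.0],
    [-1.0, -0.33, 0.33, 1.0],
    [1.0, 0.33, -0.33, -1.0],
])
x = np.concatenate([np.linspace(-1, 1, 4), np.zeros(4), np.linspace(1, -1, 4)])
symbols = pvqa_encode(x, 4, codebook)   # rising, flat, falling
```

The resulting index sequence is the symbolic representation that text-retrieval techniques can then operate on.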
Wavelet analysis and scaling properties of time series
NASA Astrophysics Data System (ADS)
Manimaran, P.; Panigrahi, Prasanta K.; Parikh, Jitendra C.
2005-10-01
We propose a wavelet based method for the characterization of the scaling behavior of nonstationary time series. It makes use of the built-in ability of the wavelets for capturing the trends in a data set, in variable window sizes. Discrete wavelets from the Daubechies family are used to illustrate the efficacy of this procedure. After studying binomial multifractal time series with the present and earlier approaches of detrending for comparison, we analyze the time series of averaged spin density in the 2D Ising model at the critical temperature, along with several experimental data sets possessing multifractal behavior.
Nonlinear Time Series Analysis in Earth Sciences - Potentials and Pitfalls
NASA Astrophysics Data System (ADS)
Kurths, Jürgen; Donges, Jonathan F.; Donner, Reik V.; Marwan, Norbert; Zou, Yong
2010-05-01
The application of methods of nonlinear time series analysis has a rich tradition in Earth sciences and has enabled substantially new insights into various complex processes there. However, some approaches and findings have been controversially discussed over the last decades. One reason is that they are often based on strong restrictions, and their violation may lead to pitfalls and misinterpretations. Here, we discuss three general concepts of nonlinear dynamics and statistical physics: synchronization, recurrence, and complex networks, and explain how to use them for data analysis. We show that the corresponding methods can be applied even to rather short and non-stationary data, which are typical in Earth sciences. References Marwan, N., Romano, M., Thiel, M., Kurths, J.: Recurrence plots for the analysis of complex systems, Physics Reports 438, 237-329 (2007) Arenas, A., Diaz-Guilera, A., Kurths, J., Moreno, Y., Zhou, C.: Synchronization in complex networks, Physics Reports 469, 93-153 (2008) Marwan, N., Donges, J.F., Zou, Y., Donner, R. and Kurths, J., Phys. Lett. A 373, 4246 (2009) Donges, J.F., Zou, Y., Marwan, N. and Kurths, J. Europhys. Lett. 87, 48007 (2009) Donner, R., Zou, Y., Donges, J.F., Marwan, N. and Kurths, J., Phys. Rev. E 81, 015101(R) (2010)
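Of the three concepts, recurrence is the most compact to sketch: a recurrence plot marks every pair of delay-embedded state vectors that lie closer than a threshold eps (numpy assumed; the embedding parameters below are illustrative, not taken from the abstract):

```python
import numpy as np

def recurrence_matrix(x, dim, delay, eps):
    """Binary recurrence matrix of a delay-embedded scalar series."""
    n = len(x) - (dim - 1) * delay
    # time-delay embedding: rows are state vectors (x_t, x_{t+delay}, ...)
    emb = np.column_stack([x[i * delay: i * delay + n] for i in range(dim)])
    dist = np.sqrt(((emb[:, None, :] - emb[None, :, :]) ** 2).sum(axis=2))
    return (dist <= eps).astype(int)

t = np.linspace(0, 8 * np.pi, 400)
R = recurrence_matrix(np.sin(t), dim=2, delay=10, eps=0.1)
rr = R.mean()    # recurrence rate, the simplest recurrence quantifier
```

For this periodic signal the plot shows uninterrupted diagonal lines; quantities such as the recurrence rate and diagonal-line statistics are what recurrence quantification analysis extracts from R.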
Detrended fluctuation analysis of multivariate time series
NASA Astrophysics Data System (ADS)
Xiong, Hui; Shang, P.
2017-01-01
In this work, we generalize detrended fluctuation analysis (DFA) to the multivariate case, named multivariate DFA (MVDFA). The validity of the proposed MVDFA is illustrated by numerical simulations on synthetic multivariate processes, considering cases where the initial data are generated independently from the same system, from different systems, and as correlated variates from one system. Moreover, the proposed MVDFA works well when applied to the multi-scale analysis of the returns of stock indices in the Chinese and US stock markets. Generally, connections between the multivariate system and the individual variates are uncovered, showing the solid performance of MVDFA and the multi-scale MVDFA.
Visibility graph analysis on quarterly macroeconomic series of China based on complex network theory
NASA Astrophysics Data System (ADS)
Wang, Na; Li, Dong; Wang, Qiwen
2012-12-01
The visibility graph approach and complex network theory provide new insight into time series analysis. In this paper we further explore what the visibility graph inherits from the original time series. We found that the degree distributions of visibility graphs extracted from pseudo-Brownian-motion series obtained by the frequency domain algorithm exhibit exponential behavior, in which the exponential exponent is a binomial function of the Hurst index inherited from the time series. Our simulations show that the quantitative relations between the Hurst indexes and the exponents of the degree distribution function differ between series, and that the visibility graph inherits some important features of the original time series. Further, we convert several quarterly macroeconomic series, including the growth rates of value-added of the three industry series and the growth rates of the Gross Domestic Product (GDP) series of China, to graphs by the visibility algorithm and explore the topological properties of the networks associated with the four macroeconomic series, namely the degree distribution and correlations, the clustering coefficient, the average path length, and community structure. Based on complex network analysis we find that the degree distributions of the networks associated with the growth rates of value-added of the three industry series are almost exponential, while those associated with the growth rates of the GDP series are scale free. We also discuss the assortativity and disassortativity of the four associated networks as they relate to the evolutionary process of the original macroeconomic series. All the constructed networks have "small-world" features. The community structures of the associated networks suggest dynamic changes in the original macroeconomic series. We also detected the relationship among government policy changes, community structures of associated networks and macroeconomic dynamics. We find great influences of government
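The natural visibility criterion itself is a short computation: samples i and j see each other if every intermediate sample lies strictly below the straight line connecting (i, y_i) and (j, y_j). A brute-force O(n³) sketch in plain Python:

```python
def visibility_edges(series):
    """Edges of the natural visibility graph of a scalar series."""
    edges = set()
    n = len(series)
    for i in range(n):
        for j in range(i + 1, n):
            # every intermediate sample must sit below the connecting line
            visible = all(
                series[k] < series[i]
                + (series[j] - series[i]) * (k - i) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                edges.add((i, j))
    return edges

e = visibility_edges([1.0, 3.0, 2.0, 4.0])
# adjacent samples always see each other; the peak at index 1 blocks 0 from 2 and 3
```

Degree distributions, clustering coefficients, and community structure are then read off the resulting graph with standard network tools.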
Schoolwide Approaches to Discipline. The Informed Educator Series.
ERIC Educational Resources Information Center
Porch, Stephanie
Although there are no simple solutions for how to turn around a school with serious discipline problems, schoolwide approaches have been effective, according to this report. The report examines research on schoolwide approaches to discipline and discusses the characteristics of programs that promote a culture of safety and support, improved…
Topic Time Series Analysis of Microblogs
2014-10-01
[Fragmentary excerpts from the report's topic listings survive here: Topic 80 (top words: "rawr", "kill", "jurassic", "dinosaur"), identified as content generated by Instagram and traced to a center in Commerce, CA (a subdivision of Los Angeles), along with associated distance scores and word-weight tables.]
Qu, Cheng; Wang, Lin-Yan; Jin, Wen-Tao; Tang, Yu-Ping; Jin, Yi; Shi, Xu-Qin; Shang, Li-Li; Shang, Er-Xin; Duan, Jin-Ao
2016-11-06
The flower of Carthamus tinctorius L. (Carthami Flos, safflower), important in traditional Chinese medicine (TCM), is known for treating blood stasis, coronary heart disease, hypertension, and cerebrovascular disease in clinical and experimental studies. It is widely accepted that hydroxysafflor yellow A (HSYA) and anhydrosafflor yellow B (ASYB) are the major bioactive components of many formulae comprised of safflower. In this study, selective knock-out of target components such as HSYA and ASYB by using preparative high performance liquid chromatography (prep-HPLC) followed by antiplatelet and anticoagulation activities evaluation was used to investigate the roles of bioactive ingredients in safflower series of herb pairs. The results showed that both HSYA and ASYB not only played a direct role in activating blood circulation, but also indirectly made a contribution to the total bioactivity of safflower series of herb pairs. The degree of contribution of HSYA in the safflower and its series herb pairs was as follows: Carthami Flos-Ginseng Radix et Rhizoma Rubra (CF-GR) > Carthami Flos-Sappan Lignum (CF-SL) > Carthami Flos-Angelicae Sinensis Radix (CF-AS) > Carthami Flos-Astragali Radix (CF-AR) > Carthami Flos-Angelicae Sinensis Radix (CF-AS) > Carthami Flos-Glycyrrhizae Radix et Rhizoma (CF-GL) > Carthami Flos-Salviae Miltiorrhizae Radix et Rhizoma (CF-SM) > Carthami Flos (CF), and the contribution degree of ASYB in the safflower and its series herb pairs: CF-GL > CF-PS > CF-AS > CF-SL > CF-SM > CF-AR > CF-GR > CF. So, this study provided a significant and effective approach to elucidate the contribution of different herbal components to the bioactivity of the herb pair, and clarification of the variation of herb-pair compatibilities. In addition, this study provides guidance for investigating the relationship between herbal compounds and the bioactivities of herb pairs. It also provides a scientific basis for reasonable clinical applications and new drug
Time Series Analysis of Insar Data: Methods and Trends
NASA Technical Reports Server (NTRS)
Osmanoglu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cano-Cabral, Enrique
2015-01-01
Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate first-order deformation rates, which can be compared to one another. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.
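The wrap-around of interferometric phase, and its repair, can be demonstrated in a few lines (numpy assumed; the C-band wavelength is an assumed example, and 1-D np.unwrap stands in for the spatio-temporal unwrapping the reviewed algorithms actually perform):

```python
import numpy as np

def wrap(phi):
    """Wrap phase into (-pi, pi], as an interferogram records it."""
    return np.angle(np.exp(1j * phi))

wavelength = 0.056                       # metres; assumed C-band radar
disp = np.linspace(0.0, 0.10, 200)       # line-of-sight motion profile (m)
phase = 4 * np.pi * disp / wavelength    # two-way path: 4*pi*d/lambda
wrapped = wrap(phase)                    # what the interferogram shows
unwrapped = np.unwrap(wrapped)           # 1-D unwrapping recovers the profile
```

Because 10 cm of motion far exceeds half the 5.6 cm wavelength, the recorded phase cycles through (-pi, pi] several times; unwrapping succeeds here only because adjacent samples differ by less than pi, which is exactly the smoothness assumption real spatio-temporal unwrappers rely on.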
NASA Astrophysics Data System (ADS)
Yan, Jun; Dong, Danan; Chen, Wen
2016-04-01
With the development of GNSS technology and the improvement of its positioning accuracy, observational data obtained by GNSS are widely used in space geodesy and geodynamics research. The GNSS time series of observation stations contain a wealth of information, including geographical space changes, deformation of the Earth, migration of subsurface material, instantaneous deformation of the Earth, weak deformation and other blind signals. In order to extract the instantaneous deformation, weak deformation and other blind signals hidden in GNSS time series, we apply Independent Component Analysis (ICA) to daily station coordinate time series of the Southern California Integrated GPS Network. ICA is based on the statistical characteristics of the observed signal: it exploits non-Gaussianity and independence to process the time series and recover the source signals of the underlying geophysical events. As part of the post-processing of precise GNSS time series, this paper examines the series using the principal component analysis (PCA) module of QOCA and an ICA algorithm to separate the source signals. We then compare these two signal-separation techniques, PCA and ICA, for isolating the original signals related to geophysical disturbances from the observations. The analysis demonstrates that, when multiple factors are present, PCA suffers from ambiguity in the separation of source signals, so that the attribution of the results is unclear, whereas ICA performs better; that is, ICA is the more suitable choice when GNSS time series contain an unknown combination of source signals.
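The PCA-versus-ICA comparison described in this abstract can be sketched on synthetic data with scikit-learn. The sources, mixing matrix and correlation threshold below are illustrative; QOCA itself is not used:

```python
import numpy as np
from sklearn.decomposition import FastICA, PCA

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 2000)

# Two hypothetical "geophysical" sources hidden in station coordinate series:
# a seasonal oscillation and a sawtooth-like transient (different periods).
s1 = np.sin(2 * np.pi * t)                       # seasonal signal
s2 = 2 * ((1.3 * t) % 1) - 1                     # non-Gaussian transient
S = np.c_[s1, s2]
A = np.array([[1.0, 0.5], [0.4, 1.0]])           # mixing across two stations
X = S @ A.T + 0.05 * rng.standard_normal((2000, 2))

S_ica = FastICA(n_components=2, random_state=0).fit_transform(X)
S_pca = PCA(n_components=2).fit_transform(X)     # orthogonal components only

# ICA components should correlate strongly with the true sources
# (up to sign and permutation); PCA components generally mix them.
corr = np.corrcoef(S.T, S_ica.T)[:2, 2:]
print(np.abs(corr).max(axis=1))                  # near 1 for each source
```

PCA finds uncorrelated directions of maximal variance, which need not coincide with physically independent sources; ICA's independence criterion is what allows the unmixing here.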
Spectral analysis of the Elatina varve series
NASA Technical Reports Server (NTRS)
Bracewell, R. N.
1988-01-01
The Elatina formation in South Australia, which provides a rich fossil record of presumptive solar activity in the late Precambrian, is of great potential significance for the physics of the Sun because it contains laminae grouped in cycles of about 12, an appearance suggestive of the solar cycle. Here, the laminae are treated as varves laid down yearly and modulated in thickness in accordance with the late Precambrian sunspot activity for the year of deposition. The purpose is to present a simple structure, or intrinsic spectrum, that can be uncovered by appropriate data analysis.
Automated analysis of brachial ultrasound time series
NASA Astrophysics Data System (ADS)
Liang, Weidong; Browning, Roger L.; Lauer, Ronald M.; Sonka, Milan
1998-07-01
Atherosclerosis begins in childhood with the accumulation of lipid in the intima of arteries to form fatty streaks, and advances through adult life, when occlusive vascular disease may result in coronary heart disease, stroke and peripheral vascular disease. Non-invasive B-mode ultrasound has been found useful in studying risk factors in the symptom-free population. A large amount of data is acquired from continuous imaging of the vessels in a large study population. A high-quality brachial vessel diameter measurement method is necessary so that accurate diameters can be measured consistently in all frames of a sequence and across different observers. Though a human expert has the advantage over automated computer methods in recognizing noise during diameter measurement, manual measurement suffers from inter- and intra-observer variability and is also time-consuming. An automated measurement method is presented in this paper which utilizes quality-assurance approaches to adapt to specific image features and to recognize and minimize the effect of noise. Experimental results showed the method's potential for clinical use in epidemiological studies.
Time series data analysis using DFA
NASA Astrophysics Data System (ADS)
Okumoto, A.; Akiyama, T.; Sekino, H.; Sumi, T.
2014-02-01
Detrended fluctuation analysis (DFA) was originally developed for the evaluation of DNA sequences and of interbeat intervals in heart rate variability (HRV), but it is now used to obtain a wide range of biological information. In this study we perform DFA on artificially generated data for which we already know the relationship between the signal and the physical event causing it. We generate the artificial data using molecular dynamics, investigating the Brownian motion of a polymer under an external force. In order to generate artificial fluctuations in the physical properties, we introduce obstacle pillars fixed as nanostructures. Using different conditions, such as the presence or absence of obstacles, the external field, and the polymer length, we perform DFA on the energies and positions of the polymer.
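A minimal first-order DFA, of the kind the abstract applies, can be written directly in NumPy. The white-noise test signal and choice of scales are illustrative:

```python
import numpy as np

def dfa(x, scales):
    """First-order DFA: integrate the series into a profile, split it into
    windows of length n, remove a least-squares line per window, and return
    the RMS fluctuation F(n) for each scale n."""
    y = np.cumsum(x - np.mean(x))                  # profile
    F = []
    for n in scales:
        m = len(y) // n
        windows = y[: m * n].reshape(m, n)
        t = np.arange(n)
        coef = np.polyfit(t, windows.T, 1)         # line per window
        trend = np.outer(coef[0], t) + coef[1][:, None]
        F.append(np.sqrt(np.mean((windows - trend) ** 2)))
    return np.array(F)

rng = np.random.default_rng(1)
white = rng.standard_normal(2**14)
scales = np.unique(np.logspace(2, 3, 10).astype(int))
alpha = np.polyfit(np.log(scales), np.log(dfa(white, scales)), 1)[0]
print(round(alpha, 2))   # scaling exponent, ~0.5 for uncorrelated noise
```

The exponent alpha of the power law F(n) ~ n^alpha is the quantity of interest: 0.5 indicates no correlations, larger values indicate persistent long-range correlations.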
Advanced tools for astronomical time series and image analysis
NASA Astrophysics Data System (ADS)
Scargle, Jeffrey D.
The algorithms described here, which I have developed for applications in X-ray and γ-ray astronomy, will hopefully be of use in other ways, perhaps aiding in the exploration of modern astronomy's data cornucopia. The goal is to describe principled approaches to some ubiquitous problems, such as detection and characterization of periodic and aperiodic signals, estimation of time delays between multiple time series, and source detection in noisy images with noisy backgrounds. The latter problem is related to detection of clusters in data spaces of various dimensions. A goal of this work is to achieve a unifying view of several related topics: signal detection and characterization, cluster identification, classification, density estimation, and multivariate regression. In addition to being useful for analysis of data from space-based and ground-based missions, these algorithms may be a basis for a future automatic science discovery facility, and in turn provide analysis tools for the Virtual Observatory. This chapter has ties to those by Larry Bretthorst, Tom Loredo, Alanna Connors, Fionn Murtagh, Jim Berger, David van Dyk, Vicent Martinez & Enn Saar.
Aroma characterization based on aromatic series analysis in table grapes.
Wu, Yusen; Duan, Shuyan; Zhao, Liping; Gao, Zhen; Luo, Meng; Song, Shiren; Xu, Wenping; Zhang, Caixi; Ma, Chao; Wang, Shiping
2016-08-04
Aroma is an important part of quality in table grape, but the key aroma compounds and the aroma series of table grapes remain unknown. In this paper, we identified 67 aroma compounds in 20 table grape cultivars; 20 in pulp and 23 in skin were active compounds. C6 compounds were the basic background volatiles, but the aroma contents of pulp juice and skin depended mainly on the levels of esters and terpenes, respectively. Most notably, the 'Kyoho' grapevine series showed high contents of esters in pulp, while Muscat/floral cultivars showed abundant monoterpenes in skin. In terms of aroma series, table grapes were characterized mainly by the herbaceous, floral, balsamic, sweet and fruity series. Simple and visualizable aroma profiles were established using aroma fingerprints based on the aromatic series. Hierarchical cluster analysis (HCA) and principal component analysis (PCA) showed that the aroma profiles of pulp juice, skin and whole berries could be classified into 5, 3, and 5 groups, respectively. Combined with sensory evaluation, we conclude that the fatty and balsamic series were the preferred aromatic series, and that the contents of their contributors (β-ionone and octanal) may be useful as indicators for the improvement of breeding and cultivation measures for table grapes.
NASA Astrophysics Data System (ADS)
Šilhán, Karel; Stoffel, Markus
2015-05-01
Different approaches and thresholds have been utilized in the past to date landslides with the growth-ring series of disturbed trees. Past work was mostly based on conifer species because of their well-defined ring boundaries and the easy identification of compression wood after stem tilting. More recently, work has been expanded to include broad-leaved trees, which are thought to produce fewer and less evident reactions after landsliding. This contribution reviews recent progress made in dendrogeomorphic landslide analysis and introduces a new approach in which landslides are dated via the ring eccentricity formed after tilting. We compare results of this new approach with those of the more conventional approaches. In addition, the paper addresses tree sensitivity to landslide disturbance as a function of tree age and trunk diameter, using 119 common beech (Fagus sylvatica L.) and 39 Crimean pine (Pinus nigra ssp. pallasiana) trees growing on two landslide bodies. The landslide events reconstructed with the classical approach (reaction wood) also appear as events in the eccentricity analysis, but the inclusion of eccentricity clearly allowed more (162%) landslides to be detected in the tree-ring series. With respect to tree sensitivity, conifers and broad-leaved trees show the strongest reactions to landslides at ages between 40 and 60 years, with a second phase of increased sensitivity in P. nigra at ages of ca. 120-130 years. These phases of highest sensitivity correspond to trunk diameters at breast height of 6-8 and 18-22 cm, respectively (P. nigra). This study thus calls for the inclusion of eccentricity analyses in future landslide reconstructions, as well as for the selection of trees belonging to different age and diameter classes, to allow for a well-balanced and more complete reconstruction of past events.
Time series power flow analysis for distribution connected PV generation.
Broderick, Robert Joseph; Quiroz, Jimmy Edward; Ellis, Abraham; Reno, Matthew J.; Smith, Jeff; Dugan, Roger
2013-01-01
Distributed photovoltaic (PV) projects must go through an interconnection study process before connecting to the distribution grid. These studies are intended to identify the likely impacts and mitigation alternatives. In the majority of the cases, system impacts can be ruled out or mitigation can be identified without an involved study, through a screening process or a simple supplemental review study. For some proposed projects, expensive and time-consuming interconnection studies are required. The challenges to performing the studies are twofold. First, every study scenario is potentially unique, as the studies are often highly specific to the amount of PV generation capacity that varies greatly from feeder to feeder and is often unevenly distributed along the same feeder. This can cause location-specific impacts and mitigations. The second challenge is the inherent variability in PV power output which can interact with feeder operation in complex ways, by affecting the operation of voltage regulation and protection devices. The typical simulation tools and methods in use today for distribution system planning are often not adequate to accurately assess these potential impacts. This report demonstrates how quasi-static time series (QSTS) simulation and high time-resolution data can be used to assess the potential impacts in a more comprehensive manner. The QSTS simulations are applied to a set of sample feeders with high PV deployment to illustrate the usefulness of the approach. The report describes methods that can help determine how PV affects distribution system operations. The simulation results are focused on enhancing the understanding of the underlying technical issues. The examples also highlight the steps needed to perform QSTS simulation and describe the data needed to drive the simulations. The goal of this report is to make the methodology of time series power flow analysis readily accessible to utilities and others responsible for evaluating
Activity Approach to Just Beyond the Classroom. Environmental Education Series.
ERIC Educational Resources Information Center
Skliar, Norman; La Mantia, Laura
To provide teachers with some of the many activities that can be carried on "just beyond the classroom," the booklet presents plans for more than 40 outdoor education activities, all emphasizing a multidisciplinary, inquiry-based approach to learning. The school grounds offer optimum conditions for initiating studies in the out-of-doors. While every…
Emergent Approaches to Mental Health Problems. The Century Psychology Series.
ERIC Educational Resources Information Center
Cowen, Emory L., Ed.; And Others
Innovative approaches to mental health problems are described. Conceptualizations about the following areas are outlined: psychiatry, the universe, and the community; theoretical malaise and community mental health; the relation of conceptual models to manpower needs; and mental health manpower and institutional change. Community programs and new…
Clinical immunology review series: an approach to desensitization
Krishna, M T; Huissoon, A P
2011-01-01
Allergen immunotherapy describes the treatment of allergic disease through administration of gradually increasing doses of allergen. This form of immune tolerance induction is now safer, more reliably efficacious and better understood than when it was first formally described in 1911. In this paper the authors aim to summarize the current state of the art in immunotherapy in the treatment of inhalant, venom and drug allergies, with specific reference to its practice in the United Kingdom. A practical approach has been taken, with reference to current evidence and guidelines, including illustrative protocols and vaccine schedules. A number of novel approaches and techniques are likely to change considerably the way in which we select and treat allergy patients in the coming decade, and these advances are previewed. PMID:21175592
Multiscale multifractal diffusion entropy analysis of financial time series
NASA Astrophysics Data System (ADS)
Huang, Jingjing; Shang, Pengjian
2015-02-01
This paper introduces a multiscale multifractal diffusion entropy analysis (MMDEA) method to analyze long-range correlations and then applies this method to stock index series. The method combines the techniques of the diffusion process and Rényi entropy to focus on the scaling behaviors of stock index series across multiple time scales, which allows us to extend the description of stock index variability to include its dependence on the magnitude of the variability and on the time scale. Compared to multifractal diffusion entropy analysis, MMDEA reveals more details of the scale properties and provides a more reliable analysis. In this paper, we concentrate not only on the fact that stock index series have multifractal properties but also on the fact that these properties depend on the time scale at which the multifractality is measured. This time scale is related to the frequency band of the signal. We find that stock index variability appears to be far more complex than reported in studies using a fixed time scale.
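The diffusion entropy idea underlying the method can be sketched in its plain (Shannon, single-scale) form; the paper's MMDEA variant replaces Shannon with Rényi entropy and sweeps the scale, which is not reproduced here. The white-noise test signal and window sizes are illustrative:

```python
import numpy as np

def diffusion_entropy(x, windows):
    """Plain diffusion entropy analysis (sketch). Sums over overlapping
    windows of length t form diffusion trajectories y(t); the Shannon
    entropy of their histogram-estimated pdf scales as S(t) = A + delta*ln(t)."""
    cs = np.concatenate(([0.0], np.cumsum(x)))
    S = []
    for t in windows:
        y = cs[t:] - cs[:-t]                       # all overlapping window sums
        p, edges = np.histogram(y, bins=50, density=True)
        dx = edges[1] - edges[0]
        p = p[p > 0]
        S.append(-np.sum(p * np.log(p)) * dx)      # differential entropy estimate
    return np.array(S)

rng = np.random.default_rng(9)
x = rng.standard_normal(2**14)                     # uncorrelated "returns"
windows = np.arange(10, 110, 10)
delta = np.polyfit(np.log(windows), diffusion_entropy(x, windows), 1)[0]
print(round(delta, 2))   # scaling exponent, ~0.5 for uncorrelated noise
```

Departures of delta from 0.5 signal anomalous diffusion, i.e. long-range correlation in the underlying series.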
ERIC Educational Resources Information Center
Sun, Yu-Chih
2016-01-01
Extensive reading for second language learners has been widely documented over the past few decades. However, few studies, if any, have used a corpus analysis approach to analyze the vocabulary coverage within a single-author story series, its repetition of vocabulary, and the incidental and intentional vocabulary learning opportunities therein.…
A Non-parametric Approach to the Overall Estimate of Cognitive Load Using NIRS Time Series
Keshmiri, Soheil; Sumioka, Hidenobu; Yamazaki, Ryuji; Ishiguro, Hiroshi
2017-01-01
We present a non-parametric approach to prediction of the n-back n ∈ {1, 2} task as a proxy measure of mental workload using Near Infrared Spectroscopy (NIRS) data. In particular, we focus on measuring the mental workload through hemodynamic responses in the brain induced by these tasks, thereby realizing the potential that they offer for detection in real-world scenarios (e.g., the difficulty of a conversation). Our approach takes advantage of the intrinsic linearity of the components of the NIRS time series to adopt a one-step regression strategy. We demonstrate the correctness of our approach through its mathematical analysis. Furthermore, we study the performance of our model in an inter-subject setting in contrast with state-of-the-art techniques in the literature and show a significant improvement in the prediction of these tasks (82.50 and 86.40% for female and male participants, respectively). Moreover, our empirical analysis suggests a gender-difference effect on the performance of the classifiers (with male data exhibiting higher non-linearity), along with left-lateralized activation in both genders, with higher specificity in females. PMID:28217088
Optimal trading strategies—a time series approach
NASA Astrophysics Data System (ADS)
Bebbington, Peter A.; Kühn, Reimer
2016-05-01
Motivated by recent advances in the spectral theory of auto-covariance matrices, we are led to revisit a reformulation of Markowitz’ mean-variance portfolio optimization approach in the time domain. In its simplest incarnation it applies to a single traded asset and allows an optimal trading strategy to be found which—for a given return—is minimally exposed to market price fluctuations. The model is initially investigated for a range of synthetic price processes, taken to be either second order stationary, or to exhibit second order stationary increments. Attention is paid to consequences of estimating auto-covariance matrices from small finite samples, and auto-covariance matrix cleaning strategies to mitigate against these are investigated. Finally we apply our framework to real world data.
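The time-domain portfolio idea above can be caricatured as a quadratic program: choose trading weights w over T periods that minimize the exposure w'Cw to price fluctuations, where C is the estimated auto-covariance matrix of the increment process. This sketch uses a toy AR(1) increment process and a unit-budget constraint; the paper's return constraint and covariance-cleaning strategies are omitted:

```python
import numpy as np

rng = np.random.default_rng(2)
T, n_samples, phi = 50, 4000, 0.6

# Simulate AR(1) price increments (second-order stationary after burn-in)
# and estimate their T x T auto-covariance matrix from samples.
eps = rng.standard_normal((n_samples, T + 100))
incr = np.zeros_like(eps)
for k in range(1, eps.shape[1]):
    incr[:, k] = phi * incr[:, k - 1] + eps[:, k]
X = incr[:, -T:]                       # keep the stationary segment
C = np.cov(X, rowvar=False)

# Minimum-variance weights under the budget constraint 1'w = 1:
# the minimizer of w'Cw is w proportional to C^{-1} 1.
ones = np.ones(T)
w = np.linalg.solve(C, ones)
w /= ones @ w
print(round(float(ones @ w), 6), round(float(w @ C @ w), 4))
```

By construction the optimized weights have variance no larger than the uniform strategy ones/T, and the gap grows with the strength of the temporal correlations that C encodes.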
Automatic differentiation for Fourier series and the radii polynomial approach
NASA Astrophysics Data System (ADS)
Lessard, Jean-Philippe; Mireles James, J. D.; Ransford, Julian
2016-11-01
In this work we develop a computer-assisted technique for proving existence of periodic solutions of nonlinear differential equations with non-polynomial nonlinearities. We exploit ideas from the theory of automatic differentiation in order to formulate an augmented polynomial system. We compute a numerical Fourier expansion of the periodic orbit for the augmented system, and prove the existence of a true solution nearby using an a-posteriori validation scheme (the radii polynomial approach). The problems considered here are given in terms of locally analytic vector fields (i.e. the field is analytic in a neighborhood of the periodic orbit) hence the computer-assisted proofs are formulated in a Banach space of sequences satisfying a geometric decay condition. In order to illustrate the use and utility of these ideas we implement a number of computer-assisted existence proofs for periodic orbits of the Planar Circular Restricted Three-Body Problem (PCRTBP).
Appropriate Algorithms for Nonlinear Time Series Analysis in Psychology
NASA Astrophysics Data System (ADS)
Scheier, Christian; Tschacher, Wolfgang
Chaos theory has a strong appeal for psychology because it allows for the investigation of the dynamics and nonlinearity of psychological systems. Consequently, chaos-theoretic concepts and methods have recently gained increasing attention among psychologists, and positive claims for chaos have been published in nearly every field of psychology. Less attention, however, has been paid to the appropriateness of chaos-theoretic algorithms for psychological time series. An appropriate algorithm can deal with short, noisy data sets and yields 'objective' results. In the present paper it is argued that most of the classical nonlinear techniques do not satisfy these constraints and thus are not appropriate for psychological data. A methodological approach is introduced that is based on nonlinear forecasting and the method of surrogate data. Using artificial data sets and empirical time series, we show that this methodology reliably assesses nonlinearity and chaos in time series even if they are short and contaminated by noise.
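The surrogate-data methodology the authors advocate can be sketched as follows: generate phase-randomized surrogates that preserve the linear correlations of the series, then compare a nonlinear-forecasting statistic of the data against the surrogate null distribution. The logistic-map test signal and the quadratic one-step forecast are illustrative choices, not the authors' exact procedure:

```python
import numpy as np

def phase_randomized_surrogate(x, rng):
    """Surrogate with the same power spectrum as x but randomized Fourier
    phases, which destroys nonlinear structure but keeps linear correlations."""
    n = len(x)
    X = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, len(X))
    phases[0] = 0.0                      # keep the mean component real
    if n % 2 == 0:
        phases[-1] = 0.0                 # Nyquist bin must stay real
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n)

def forecast_error(x):
    """One-step nonlinear forecasting statistic: fit x[t+1] = f(x[t]) with a
    quadratic and return the RMS prediction error."""
    coef = np.polyfit(x[:-1], x[1:], 2)
    return np.sqrt(np.mean((np.polyval(coef, x[:-1]) - x[1:]) ** 2))

rng = np.random.default_rng(3)
x = np.empty(2048)                       # strongly nonlinear test signal:
x[0] = 0.4                               # the (noise-free) logistic map
for i in range(1, len(x)):
    x[i] = 3.9 * x[i - 1] * (1 - x[i - 1])

stat = forecast_error(x)                 # tiny: the dynamics are deterministic
null = [forecast_error(phase_randomized_surrogate(x, rng)) for _ in range(99)]
print(stat < min(null))                  # True: nonlinearity detected
```

A statistic falling outside the surrogate null rejects the hypothesis of a linear Gaussian process; with 99 surrogates this gives a two-sided test at roughly the 2% level.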
The QuakeSim System for GPS Time Series Analysis
NASA Astrophysics Data System (ADS)
Granat, R. A.; Gao, X.; Pierce, M.; Wang, J.
2010-12-01
We present a system for the analysis of GPS time series data, available to geoscience users through a web services / web portal interface. The system provides two time series analysis methods, one based on hidden Markov model (HMM) segmentation, the other based on covariance descriptor analysis (CDA). In addition, it provides data pre-processing routines that perform spike noise removal, linear de-trending, sum-of-sines removal, and common-mode removal using probabilistic principal components analysis (PPCA). These components can be composed by the user into the desired series of processing steps through an intuitive graphical interface. The system is accessed through a web portal that allows both micro-scale (individual station) and macro-scale (whole network) exploration of data sets and analysis results via Google Maps. Users can focus in on or scroll through particular spatial or temporal windows, or observe dynamic behavior by creating movies that display the system state. Analysis results can be exported to KML format for easy combination with other sources of data, such as fault databases and InSAR interferograms. GPS solutions for California member stations of the Plate Boundary Observatory from both the SOPAC and JPL (GIPSY) processing groups are automatically imported into the system as the data become available. We show the results of the methods as applied to these data sets for an assortment of case studies, and show how the system can be used to analyze both seismic and aseismic signals.
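The de-trending and sum-of-sines removal steps mentioned above amount to a least-squares fit of an offset, a linear trend, and annual/semi-annual sinusoids, which is then subtracted. The synthetic series and amplitudes below are illustrative; this is not QuakeSim's actual code:

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.arange(0, 8, 1 / 365.25)                       # 8 years, daily, in years
series = (2.0 + 3.5 * t                               # offset + trend (mm)
          + 1.2 * np.sin(2 * np.pi * t + 0.3)         # annual term
          + 0.4 * np.sin(4 * np.pi * t - 1.0)         # semi-annual term
          + 0.5 * rng.standard_normal(t.size))        # white noise

# Design matrix: [1, t, sin/cos annual, sin/cos semi-annual]; a phase-shifted
# sinusoid is a linear combination of its sin and cos terms.
G = np.column_stack([np.ones_like(t), t,
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t),
                     np.sin(4 * np.pi * t), np.cos(4 * np.pi * t)])
m, *_ = np.linalg.lstsq(G, series, rcond=None)
residual = series - G @ m                             # de-trended, de-seasoned

print(round(m[1], 1))   # recovered trend, ~3.5 mm/yr
```

The residual series is what the segmentation or covariance-descriptor analysis would then operate on.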
Time Series Analysis Based on Running Mann Whitney Z Statistics
Technology Transfer Automated Retrieval System (TEKTRAN)
A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
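A sketch of the running Mann-Whitney Z idea follows: at each time step, compare the ranks of the values in the window before against the window after, and convert the U statistic to Z. The window size and synthetic step change are illustrative, and the Monte-Carlo normalization mentioned in the abstract is replaced here by the standard large-sample formula for U:

```python
import numpy as np
from scipy.stats import mannwhitneyu

def running_mw_z(x, half):
    """Running Mann-Whitney Z: at each index, compare the `half` values after
    with the `half` values before, converting U to Z via its normal
    approximation (mu = n1*n2/2, sigma = sqrt(n1*n2*(n1+n2+1)/12))."""
    n = len(x)
    mu = half * half / 2.0
    sigma = np.sqrt(half * half * (2 * half + 1) / 12.0)
    z = np.full(n, np.nan)
    for i in range(half, n - half):
        u = mannwhitneyu(x[i:i + half], x[i - half:i],
                         alternative="two-sided").statistic
        z[i] = (u - mu) / sigma
    return z

rng = np.random.default_rng(5)
x = np.r_[rng.normal(0, 1, 300), rng.normal(2, 1, 300)]  # step change at t=300
z = running_mw_z(x, half=50)
print(int(np.nanargmax(np.abs(z))))                      # peaks near t = 300
```

Because the statistic is rank-based, the detection is insensitive to outliers and to the marginal distribution of the data, which is the appeal of the method.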
Nonlinear Analysis of Surface EMG Time Series of Back Muscles
NASA Astrophysics Data System (ADS)
Dolton, Donald C.; Zurcher, Ulrich; Kaufman, Miron; Sung, Paul
2004-10-01
A nonlinear analysis of surface electromyography time series of subjects with and without low back pain is presented. The mean-square displacement and entropy show anomalous diffusive behavior in the intermediate time range 10 ms < t < 1 s. This behavior implies the presence of correlations in the signal. We discuss the shape of the power spectrum of the signal.
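The mean-square displacement statistic used above is straightforward to compute; for ordinary Brownian motion the scaling exponent is 1, and anomalous diffusion of the kind reported shows up as a departure from that value. A synthetic example:

```python
import numpy as np

def mean_square_displacement(x, lags):
    """MSD(tau) = <(x(t+tau) - x(t))^2>; the scaling MSD ~ tau^alpha
    distinguishes normal (alpha = 1) from anomalous diffusion."""
    return np.array([np.mean((x[lag:] - x[:-lag]) ** 2) for lag in lags])

rng = np.random.default_rng(6)
walk = np.cumsum(rng.standard_normal(100_000))   # ordinary Brownian motion
lags = np.array([1, 2, 4, 8, 16, 32, 64])
msd = mean_square_displacement(walk, lags)
alpha = np.polyfit(np.log(lags), np.log(msd), 1)[0]
print(round(alpha, 1))  # ~1.0 for normal diffusion
```

Sub-diffusion (alpha < 1) indicates anti-persistent correlations, super-diffusion (alpha > 1) persistent ones.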
ADAPTIVE DATA ANALYSIS OF COMPLEX FLUCTUATIONS IN PHYSIOLOGIC TIME SERIES
PENG, C.-K.; COSTA, MADALENA; GOLDBERGER, ARY L.
2009-01-01
We introduce a generic framework of dynamical complexity to understand and quantify fluctuations of physiologic time series. In particular, we discuss the importance of applying adaptive data analysis techniques, such as the empirical mode decomposition algorithm, to address the challenges of nonlinearity and nonstationarity that are typically exhibited in biological fluctuations. PMID:20041035
Wavelet analysis for non-stationary, nonlinear time series
NASA Astrophysics Data System (ADS)
Schulte, Justin A.
2016-08-01
Methods for detecting and quantifying nonlinearities in nonstationary time series are introduced and developed. In particular, higher-order wavelet analysis was applied to an ideal time series and the quasi-biennial oscillation (QBO) time series. Multiple-testing problems inherent in wavelet analysis were addressed by controlling the false discovery rate. A new local autobicoherence spectrum facilitated the detection of local nonlinearities and the quantification of cycle geometry. The local autobicoherence spectrum of the QBO time series showed that the QBO time series contained a mode with a period of 28 months that was phase coupled to a harmonic with a period of 14 months. An additional nonlinearly interacting triad was found among modes with periods of 10, 16 and 26 months. Local biphase spectra determined that the nonlinear interactions were not quadratic and that the effect of the nonlinearities was to produce non-smoothly varying oscillations. The oscillations were found to be skewed so that negative QBO regimes were preferred, and also asymmetric in the sense that phase transitions between the easterly and westerly phases occurred more rapidly than those from westerly to easterly regimes.
NASA Astrophysics Data System (ADS)
Assireu, A. T.; Rosa, R. R.; Vijaykumar, N. L.; Lorenzzetti, J. A.; Rempel, E. L.; Ramos, F. M.; Abreu Sá, L. D.; Bolzan, M. J. A.; Zanandrea, A.
2002-08-01
Based on the gradient pattern analysis (GPA) technique we introduce a new methodology for analyzing short nonstationary time series. Using the asymmetric amplitude fragmentation (AAF) operator from GPA we analyze Lagrangian data observed as velocity time series for ocean flow. The results show that quasi-periodic, chaotic and turbulent regimes can be well characterized by means of this new geometrical approach.
Interglacial climate dynamics and advanced time series analysis
NASA Astrophysics Data System (ADS)
Mudelsee, Manfred; Bermejo, Miguel; Köhler, Peter; Lohmann, Gerrit
2013-04-01
Studying the climate dynamics of past interglacials (IGs) helps to better assess the anthropogenically influenced dynamics of the current IG, the Holocene. We select the IG portions from the EPICA Dome C ice core archive, which covers the past 800 ka, to apply methods of statistical time series analysis (Mudelsee 2010). The analysed variables are deuterium/H (indicating temperature) (Jouzel et al. 2007), greenhouse gases (Siegenthaler et al. 2005, Loulergue et al. 2008, Lüthi et al. 2008) and a model-co-derived climate radiative forcing (Köhler et al. 2010). We additionally select high-resolution sea-surface-temperature records from the marine sedimentary archive. The first statistical method, persistence time estimation (Mudelsee 2002), lets us infer the 'climate memory' property of IGs. Second, linear regression informs about long-term climate trends during IGs. Third, ramp function regression (Mudelsee 2000) is adapted to look at abrupt climate changes during IGs. We compare the Holocene with previous IGs in terms of these mathematical approaches, interpret the results in a climate context, and assess the uncertainties and the requirements on data from old IGs for yielding results of 'acceptable' accuracy. This work receives financial support from the Deutsche Forschungsgemeinschaft (Project ClimSens within the DFG Research Priority Program INTERDYNAMIK) and the European Commission (Marie Curie Initial Training Network LINC, No. 289447, within the 7th Framework Programme). References: Jouzel J, Masson-Delmotte V, Cattani O, Dreyfus G, Falourd S, Hoffmann G, Minster B, Nouet J, Barnola JM, Chappellaz J, Fischer H, Gallet JC, Johnsen S, Leuenberger M, Loulergue L, Luethi D, Oerter H, Parrenin F, Raisbeck G, Raynaud D, Schilt A, Schwander J, Selmo E, Souchez R, Spahni R, Stauffer B, Steffensen JP, Stenni B, Stocker TF, Tison JL, Werner M, Wolff EW (2007) Orbital and millennial Antarctic climate variability over the past 800,000 years. Science 317:793. Köhler P, Bintanja R
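Persistence time estimation in the AR(1) sense (Mudelsee 2002) can be sketched as follows for an evenly spaced series: fit the lag-1 autocorrelation a and convert it to a persistence time tau = -dt / ln(a). The synthetic process and parameter values are illustrative, not EPICA data:

```python
import numpy as np

rng = np.random.default_rng(7)
dt, tau_true, n = 1.0, 20.0, 200_000
a_true = np.exp(-dt / tau_true)            # AR(1) coefficient for memory tau

x = np.zeros(n)
for i in range(1, n):                      # AR(1) 'climate' series
    x[i] = a_true * x[i - 1] + rng.standard_normal()

a_hat = np.corrcoef(x[:-1], x[1:])[0, 1]   # lag-1 autocorrelation estimate
tau_hat = -dt / np.log(a_hat)
print(round(tau_hat, 1))                   # close to the true value of 20
```

Real palaeoclimate series are unevenly spaced, so the published method generalizes this by fitting exp(-|t_i - t_j| / tau) to the actual sampling times; the even-spacing case above conveys the idea.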
Multivariate stochastic analysis for Monthly hydrological time series at Cuyahoga River Basin
NASA Astrophysics Data System (ADS)
zhang, L.
2011-12-01
Copulas have become a powerful statistical and stochastic methodology for multivariate analysis in environmental and water resources engineering. In recent years, the popular one-parameter Archimedean copulas (e.g., the Gumbel-Hougaard, Cook-Johnson and Frank copulas) and the meta-elliptical copulas (e.g., the Gaussian and Student-t copulas) have been applied in multivariate hydrological analyses, e.g., multivariate rainfall (rainfall intensity, duration and depth), flood (peak discharge, duration and volume), and drought analyses (drought length, mean and minimum SPI values, and drought mean areal extent). Copulas have also been applied in flood frequency analysis at the confluences of river systems, by taking into account the dependence among upstream gauge stations rather than by using the hydrological routing technique. In most of the studies above, the annual time series have been treated as stationary signals, i.e., the time series have been assumed to consist of independent identically distributed (i.i.d.) random variables. But in reality hydrological time series, especially daily and monthly hydrological time series, cannot be considered i.i.d. random variables due to the periodicity in the data structure. The stationarity assumption is also in question due to climate change and land use and land cover (LULC) change in past years. To this end, it is necessary to re-evaluate the classic approach to the study of hydrological time series by relaxing the stationarity assumption through a nonstationary approach. Likewise, for the study of the dependence structure of hydrological time series, the assumption of a common type of univariate distribution also needs to be relaxed by adopting copula theory. In this paper, the univariate monthly hydrological time series will be studied through a nonstationary time series analysis approach. The dependence structure of the multivariate monthly hydrological time series will be
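For the one-parameter Archimedean family mentioned above, a common moment-style fit inverts the Kendall's-tau relation; for the Gumbel-Hougaard copula, tau = 1 - 1/theta. A sketch, in which the synthetic "duration/depth" data and the check at (0.5, 0.5) are illustrative:

```python
import numpy as np
from scipy.stats import kendalltau, rankdata

def gumbel_copula_cdf(u, v, theta):
    """Gumbel-Hougaard copula C(u,v) = exp(-[(-ln u)^t + (-ln v)^t]^(1/t))."""
    return np.exp(-(((-np.log(u)) ** theta
                     + (-np.log(v)) ** theta) ** (1.0 / theta)))

rng = np.random.default_rng(8)
# Dependent, skewed toy variables standing in for rainfall duration and depth.
z = rng.multivariate_normal([0, 0], [[1, 0.7], [0.7, 1]], size=5000)
duration, depth = np.exp(z[:, 0]), np.exp(0.5 * z[:, 1])

tau, _ = kendalltau(duration, depth)
theta = 1.0 / (1.0 - tau)                        # invert tau = 1 - 1/theta

# Compare the fitted copula with the empirical copula at (0.5, 0.5),
# using rank-based pseudo-observations for the margins.
u = rankdata(duration) / (len(duration) + 1)
v = rankdata(depth) / (len(depth) + 1)
emp = np.mean((u <= 0.5) & (v <= 0.5))
model = gumbel_copula_cdf(0.5, 0.5, theta)
print(round(theta, 2), round(float(model), 3))
```

Working on pseudo-observations is what lets each margin keep its own univariate distribution, which is exactly the flexibility the abstract argues for.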
Time series segmentation: a new approach based on Genetic Algorithm and Hidden Markov Model
NASA Astrophysics Data System (ADS)
Toreti, A.; Kuglitsch, F. G.; Xoplaki, E.; Luterbacher, J.
2009-04-01
The subdivision of a time series into homogeneous segments has been performed using various methods applied to different disciplines. In climatology, for example, it accompanies the well-known homogenization problem and the detection of artificial change points. In this context, we present a new method (GAMM) based on a Hidden Markov Model (HMM) and a Genetic Algorithm (GA), applicable to series of independent observations (and easily adaptable to autoregressive processes). A left-to-right hidden Markov model is applied, estimating the parameters and the best state sequence with the Baum-Welch and Viterbi algorithms, respectively. In order to avoid the well-known dependence of the Baum-Welch algorithm on its initial conditions, a Genetic Algorithm was developed, characterized by mutation, elitism, and a crossover procedure implemented with some restrictive rules. Moreover, the function to be minimized follows the approach of Kehagias (2004), i.e., it is the so-called complete log-likelihood. The number of states is determined by a two-fold cross-validation procedure (Celeux and Durand, 2008). Since this last issue is complex and influences the whole analysis, a Multi-Response Permutation Procedure (MRPP; Mielke et al., 1981) was added: it tests the model with K+1 states (where K is the number of states of the best model) whenever its likelihood is close to that of the K-state model. Finally, an evaluation of the GAMM performance, applied as a break detection method in the field of climate time series homogenization, is shown. 1. G. Celeux and J.B. Durand, Comput Stat 2008. 2. A. Kehagias, Stoch Envir Res 2004. 3. P.W. Mielke, K.J. Berry, G.W. Brier, Monthly Wea Rev 1981.
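The segmentation machinery above rests on Viterbi decoding of a left-to-right HMM. A minimal sketch, assuming Gaussian emissions with unit variance and illustrative transition probabilities (all numeric values here are hypothetical test values, not the paper's estimates):

```python
import numpy as np

def viterbi(obs_loglik, log_trans, log_init):
    """Most-likely state sequence of an HMM.

    obs_loglik : (T, K) log-likelihood of each observation under each state
    log_trans  : (K, K) log transition matrix (row = from-state)
    log_init   : (K,)   log initial state distribution
    """
    T, K = obs_loglik.shape
    delta = np.empty((T, K))
    psi = np.zeros((T, K), dtype=int)
    delta[0] = log_init + obs_loglik[0]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_trans  # (from, to)
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + obs_loglik[t]
    states = np.empty(T, dtype=int)
    states[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):
        states[t] = psi[t + 1, states[t + 1]]
    return states

# Two-segment toy series: the mean shifts from 0 to 5 halfway through.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 50), rng.normal(5, 1, 50)])
means = np.array([0.0, 5.0])
loglik = -0.5 * (x[:, None] - means[None, :]) ** 2  # Gaussian, unit variance
# Left-to-right transitions: a state can only persist or advance.
log_trans = np.log(np.array([[0.95, 0.05], [1e-12, 1.0 - 1e-12]]))
log_init = np.log(np.array([1.0 - 1e-12, 1e-12]))
path = viterbi(loglik, log_trans, log_init)
```

The left-to-right constraint (near-zero probability of moving back) is what turns the decoded state sequence into a segmentation: each state change marks a candidate break point.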
A perturbative approach for enhancing the performance of time series forecasting.
de Mattos Neto, Paulo S G; Ferreira, Tiago A E; Lima, Aranildo R; Vasconcelos, Germano C; Cavalcanti, George D C
2017-04-01
This paper proposes a method to perform time series prediction based on perturbation theory. The approach continuously adjusts an initial forecasting model to asymptotically approximate a desired time series model. First, a predictive model generates an initial forecast for a time series. Second, a residual time series is calculated as the difference between the original time series and the initial forecast. If that residual series is not white noise, it can be used to improve the accuracy of the initial model, and a new predictive model is fitted to the residual series. The whole process is repeated until convergence or until the residual series becomes white noise. The output of the method is then given by summing the outputs of all trained predictive models in a perturbative sense. To test the method, an experimental investigation was conducted on six real-world time series. A comparison was made with six other methods and with ten results found in the literature. The results show that the proposed method not only significantly improves the performance of the initial model but also outperforms the previously published results.
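A minimal sketch of the iterative residual-correction loop, using a least-squares AR(1) fit as a stand-in base model (the paper does not prescribe a specific predictor; the AR(1) choice and the stopping rule here are illustrative assumptions):

```python
import numpy as np

def ar1_predictor(series):
    """Least-squares fit of x[t] ~ a*x[t-1] + b; returns a one-step predictor."""
    a, b = np.polyfit(series[:-1], series[1:], 1)
    return lambda prev: a * prev + b

def perturbative_forecast(series, rounds=3):
    """Chain of models, each trained on the previous round's residual series;
    the final forecast is the sum of all models' one-step-ahead outputs."""
    residual = np.asarray(series, dtype=float)
    forecast = 0.0
    for _ in range(rounds):
        if len(residual) < 3:
            break
        model = ar1_predictor(residual)
        forecast += model(residual[-1])      # each model forecasts its own series
        fitted = model(residual[:-1])        # in-sample one-step predictions
        residual = residual[1:] - fitted     # residual series for the next round
        if np.allclose(residual, 0.0, atol=1e-10):  # ~white noise: stop early
            break
    return forecast

# A noise-free ramp 0..9 is captured exactly in one round; forecast ~ 10.
forecast_next = perturbative_forecast(np.arange(10.0))
```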
The application of complex network time series analysis in turbulent heated jets.
Charakopoulos, A Κ; Karakasidis, T E; Papanicolaou, P N; Liakopoulos, A
2014-06-01
In the present study, we applied complex network-based time series analysis to experimental temperature time series from a vertical turbulent heated jet. More specifically, we approach the hydrodynamic problem of discriminating time series corresponding to various regions relative to the jet axis, i.e., separating time series from regions close to the jet axis from those originating in regions with a different dynamical regime, based on the constructed network properties. Applying the phase space transformation method (k nearest neighbors) as well as the visibility algorithm, we transformed the time series into networks and evaluated topological properties of the networks such as degree distribution, average path length, diameter, modularity, and clustering coefficient. The results show that the complex network approach allows one to distinguish, identify, and explore in detail the various dynamical regions of the jet flow and to associate them with the corresponding physical behavior. In addition, in order to reject the hypothesis that the constructed networks originate from a stochastic process, we generated random networks and compared their statistical properties with those obtained from the experimental data. As far as the efficiency of the two network-construction methods is concerned, we conclude that both methodologies lead to network properties with almost the same qualitative behavior and allow us to reveal the underlying system dynamics.
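The visibility algorithm mentioned above maps a series to a network in a few lines. A sketch of the natural visibility criterion (quadratic in series length, so suitable only for short records):

```python
import numpy as np

def visibility_graph(x):
    """Natural visibility graph: samples (i, x[i]) and (j, x[j]) are linked if
    the straight line between them stays strictly above every intermediate
    sample (Lacasa et al.'s visibility criterion)."""
    n = len(x)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            visible = all(
                x[k] < x[i] + (x[j] - x[i]) * (k - i) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                edges.add((i, j))
    return edges

def degrees(edges, n):
    """Node degrees, the basis of the degree distribution analyzed above."""
    d = [0] * n
    for i, j in edges:
        d[i] += 1
        d[j] += 1
    return d
```

Adjacent samples are always mutually visible, so the graph is connected; topological measures such as average path length or clustering can then be computed on the resulting edge set.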
The application of complex network time series analysis in turbulent heated jets
Charakopoulos, A. K.; Karakasidis, T. E.; Liakopoulos, A.; Papanicolaou, P. N.
2014-06-15
In the present study, we applied complex network-based time series analysis to experimental temperature time series from a vertical turbulent heated jet. More specifically, we approach the hydrodynamic problem of discriminating time series corresponding to various regions relative to the jet axis, i.e., separating time series from regions close to the jet axis from those originating in regions with a different dynamical regime, based on the constructed network properties. Applying the phase space transformation method (k nearest neighbors) as well as the visibility algorithm, we transformed the time series into networks and evaluated topological properties of the networks such as degree distribution, average path length, diameter, modularity, and clustering coefficient. The results show that the complex network approach allows one to distinguish, identify, and explore in detail the various dynamical regions of the jet flow and to associate them with the corresponding physical behavior. In addition, in order to reject the hypothesis that the constructed networks originate from a stochastic process, we generated random networks and compared their statistical properties with those obtained from the experimental data. As far as the efficiency of the two network-construction methods is concerned, we conclude that both methodologies lead to network properties with almost the same qualitative behavior and allow us to reveal the underlying system dynamics.
Mode Analysis with Autocorrelation Method (Single Time Series) in Tokamak
NASA Astrophysics Data System (ADS)
Saadat, Shervin; Salem, Mohammad K.; Goranneviss, Mahmoud; Khorshid, Pejman
2010-08-01
In this paper, plasma modes are analyzed with a statistical method based on the autocorrelation function. The autocorrelation function is computed from a single time series, so only one Mirnov coil is needed for this purpose. After autocorrelation analysis of the Mirnov coil data, the spectral density diagram is plotted; from its symmetries and trends the plasma modes can be identified. The effects of RHF fields are investigated with this method in the IR-T1 tokamak, and the results correspond with those of multichannel methods such as SVD and FFT.
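The single-coil procedure reduces to two steps: the autocorrelation of one Mirnov signal, then its spectral density. A sketch on a synthetic signal with one dominant mode (the 25 Hz mode frequency and noise level are made-up test values, not tokamak data):

```python
import numpy as np

def autocorr(x):
    """Biased sample autocorrelation of a single zero-meaned time series."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    full = np.correlate(x, x, mode="full")[len(x) - 1:]  # non-negative lags
    return full / full[0]

# Synthetic single-coil signal: one dominant mode at 25 Hz plus noise.
fs = 1000.0
t = np.arange(0, 1, 1 / fs)
rng = np.random.default_rng(1)
sig = np.sin(2 * np.pi * 25 * t) + 0.3 * rng.normal(size=t.size)

rho = autocorr(sig)
# Spectral density estimate: magnitude FFT of the (one-sided) autocorrelation,
# in the spirit of the Wiener-Khinchin relation.
psd = np.abs(np.fft.rfft(rho))
freqs = np.fft.rfftfreq(rho.size, 1 / fs)
peak = freqs[np.argmax(psd)]  # dominant mode frequency recovered from one coil
```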
Profile Analysis: Multidimensional Scaling Approach.
ERIC Educational Resources Information Center
Ding, Cody S.
2001-01-01
Outlines an exploratory multidimensional scaling-based approach to profile analysis called Profile Analysis via Multidimensional Scaling (PAMS) (M. Davison, 1994). The PAMS model has the advantages of being easily applied to samples of any size, classifying persons on a continuum, and using a person profile index for further hypothesis studies, but…
A Markovian Entropy Measure for the Analysis of Calcium Activity Time Series.
Marken, John P; Halleran, Andrew D; Rahman, Atiqur; Odorizzi, Laura; LeFew, Michael C; Golino, Caroline A; Kemper, Peter; Saha, Margaret S
2016-01-01
Methods to analyze the dynamics of calcium activity often rely on visually distinguishable features in time series data such as spikes, waves, or oscillations. However, systems such as the developing nervous system display a complex, irregular type of calcium activity which makes the use of such methods less appropriate. Instead, for such systems there exists a class of methods (including information theoretic, power spectral, and fractal analysis approaches) which use more fundamental properties of the time series to analyze the observed calcium dynamics. We present a new analysis method in this class, the Markovian Entropy measure, which is an easily implementable calcium time series analysis method which represents the observed calcium activity as a realization of a Markov Process and describes its dynamics in terms of the level of predictability underlying the transitions between the states of the process. We applied our and other commonly used calcium analysis methods on a dataset from Xenopus laevis neural progenitors which displays irregular calcium activity and a dataset from murine synaptic neurons which displays activity time series that are well-described by visually-distinguishable features. We find that the Markovian Entropy measure is able to distinguish between biologically distinct populations in both datasets, and that it can separate biologically distinct populations to a greater extent than other methods in the dataset exhibiting irregular calcium activity. These results support the benefit of using the Markovian Entropy measure to analyze calcium dynamics, particularly for studies using time series data which do not exhibit easily distinguishable features.
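A sketch of the core construction described above: discretize the calcium trace into states, estimate the first-order Markov transition matrix, and score predictability as an entropy rate. The equal-width binning and the normalization here are illustrative assumptions, not the authors' exact definitions:

```python
import numpy as np

def markov_entropy(series, n_states=4):
    """Entropy rate of a first-order Markov chain fitted to the discretized
    series: H = -sum_i pi_i sum_j P_ij log2 P_ij (a sketch of the idea)."""
    x = np.asarray(series, dtype=float)
    edges = np.linspace(x.min(), x.max(), n_states + 1)[1:-1]
    s = np.digitize(x, edges)                      # symbol sequence in 0..n-1
    counts = np.zeros((n_states, n_states))
    for a, b in zip(s[:-1], s[1:]):
        counts[a, b] += 1.0
    row = counts.sum(axis=1, keepdims=True)
    P = counts / np.maximum(row, 1.0)              # transition probabilities
    pi = counts.sum(axis=1) / counts.sum()         # empirical state weights
    logP = np.zeros_like(P)
    mask = P > 0
    logP[mask] = np.log2(P[mask])
    return float(-np.sum(pi[:, None] * P * logP))
```

A perfectly predictable (periodic) trace scores zero, while an irregular trace with near-uniform transitions approaches the maximum log2(n_states), which is the sense in which the measure quantifies the predictability of state transitions.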
A Markovian Entropy Measure for the Analysis of Calcium Activity Time Series
Rahman, Atiqur; Odorizzi, Laura; LeFew, Michael C.; Golino, Caroline A.; Kemper, Peter; Saha, Margaret S.
2016-01-01
Methods to analyze the dynamics of calcium activity often rely on visually distinguishable features in time series data such as spikes, waves, or oscillations. However, systems such as the developing nervous system display a complex, irregular type of calcium activity which makes the use of such methods less appropriate. Instead, for such systems there exists a class of methods (including information theoretic, power spectral, and fractal analysis approaches) which use more fundamental properties of the time series to analyze the observed calcium dynamics. We present a new analysis method in this class, the Markovian Entropy measure, which is an easily implementable calcium time series analysis method which represents the observed calcium activity as a realization of a Markov Process and describes its dynamics in terms of the level of predictability underlying the transitions between the states of the process. We applied our and other commonly used calcium analysis methods on a dataset from Xenopus laevis neural progenitors which displays irregular calcium activity and a dataset from murine synaptic neurons which displays activity time series that are well-described by visually-distinguishable features. We find that the Markovian Entropy measure is able to distinguish between biologically distinct populations in both datasets, and that it can separate biologically distinct populations to a greater extent than other methods in the dataset exhibiting irregular calcium activity. These results support the benefit of using the Markovian Entropy measure to analyze calcium dynamics, particularly for studies using time series data which do not exhibit easily distinguishable features. PMID:27977764
Satellite time series analysis using Empirical Mode Decomposition
NASA Astrophysics Data System (ADS)
Pannimpullath, R. Renosh; Doolaeghe, Diane; Loisel, Hubert; Vantrepotte, Vincent; Schmitt, Francois G.
2016-04-01
Geophysical fields possess large fluctuations over many spatial and temporal scales. Successive satellite images provide an interesting sampling of this spatio-temporal multiscale variability. Here we propose to study this variability by performing satellite time series analysis, pixel by pixel, using Empirical Mode Decomposition (EMD). EMD is a time series analysis technique that decomposes an original time series into a sum of modes, each with a different mean frequency. It can be used to smooth signals and to extract trends; it is built in a data-adaptive way and is able to extract information from nonlinear signals. Here we use MERIS Suspended Particulate Matter (SPM) data on a weekly basis over 10 years, giving 458 successive time steps. We selected five different regions of coastal waters for the present study: Vietnamese coastal waters, the Brahmaputra region, the St. Lawrence, the English Channel, and the McKenzie. These regions have high SPM concentrations due to large-scale river runoff. Trend and Hurst exponents are derived for each pixel in each region. The energy is also extracted using Hilbert Spectral Analysis (HSA) together with the EMD method, and for each region the energy of each mode is normalized by the total energy computed over all modes.
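The per-pixel Hurst exponent mentioned above can be estimated in several ways; a common one is detrended fluctuation analysis (DFA), sketched here as a generic stand-in (the paper's actual pipeline pairs EMD with Hilbert spectral analysis, which is substantially more involved):

```python
import numpy as np

def dfa(x, scales=(16, 32, 64, 128, 256)):
    """Detrended fluctuation analysis: slope of log F(n) versus log n.
    White noise gives an exponent near 0.5; persistent signals exceed it."""
    y = np.cumsum(np.asarray(x, dtype=float) - np.mean(x))  # integrated profile
    F = []
    for n in scales:
        t = np.arange(n)
        rms = []
        for s in range(len(y) // n):
            seg = y[s * n:(s + 1) * n]
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # linear detrend
            rms.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(rms)))  # fluctuation at scale n
    slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return slope
```

Applied pixel by pixel to the 458-step SPM series, such an exponent map distinguishes persistent (trend-dominated) pixels from noise-dominated ones.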
Gavrishchaka, Valeriy; Senyukova, Olga; Davis, Kristina
2015-01-01
Previously, we have proposed to use complementary complexity measures discovered by boosting-like ensemble learning for the enhancement of quantitative indicators dealing with necessarily short physiological time series. We have confirmed robustness of such multi-complexity measures for heart rate variability analysis with the emphasis on detection of emerging and intermittent cardiac abnormalities. Recently, we presented preliminary results suggesting that such ensemble-based approach could be also effective in discovering universal meta-indicators for early detection and convenient monitoring of neurological abnormalities using gait time series. Here, we argue and demonstrate that these multi-complexity ensemble measures for gait time series analysis could have significantly wider application scope ranging from diagnostics and early detection of physiological regime change to gait-based biometrics applications.
Observability of nonlinear dynamics: Normalized results and a time-series approach
NASA Astrophysics Data System (ADS)
Aguirre, Luis A.; Bastos, Saulo B.; Alves, Marcela A.; Letellier, Christophe
2008-03-01
This paper investigates the observability of nonlinear dynamical systems. Two difficulties associated with previous studies are dealt with. First, a normalized degree of observability is defined; this permits the comparison of different systems, which was not generally possible before. Second, a time-series approach is proposed, based on omnidirectional nonlinear correlation functions, to rank the time series of a system by their potential for reconstructing the original dynamics without requiring knowledge of the system equations. The two approaches proposed in this paper, together with a former method, were applied to five benchmark systems, and an overall agreement of over 92% was found.
Time series analysis using semiparametric regression on oil palm production
NASA Astrophysics Data System (ADS)
Yundari, Pasaribu, U. S.; Mukhaiyar, U.
2016-04-01
This paper presents a semiparametric kernel regression method, which offers flexibility and ease of mathematical calculation, especially in estimating density and regression functions. The kernel function is continuous and produces a smooth estimate. The classical kernel density estimator is constructed by completely nonparametric analysis and works reasonably well for functions of any form. Here, we discuss parameter estimation in time series analysis: we first assume that the parameters exist and then apply nonparametric estimation, which makes the approach semiparametric. The optimal bandwidth is selected by minimizing an approximation of the Mean Integrated Squared Error (MISE).
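A minimal sketch of the kernel regression step with a Gaussian kernel (the Nadaraya-Watson estimator); in practice the bandwidth h would be chosen by minimizing an approximation of the MISE, e.g. via cross-validation:

```python
import numpy as np

def nw_kernel_regression(x_train, y_train, x_eval, h):
    """Nadaraya-Watson estimator with a Gaussian kernel and bandwidth h:
    a locally weighted average of y_train around each evaluation point."""
    d = (x_eval[:, None] - x_train[None, :]) / h
    w = np.exp(-0.5 * d ** 2)              # Gaussian kernel weights
    return (w @ y_train) / w.sum(axis=1)   # weighted average per eval point
```

Smaller h tracks the data closely (low bias, high variance); larger h smooths more, which is exactly the trade-off the MISE criterion balances.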
The multiscale analysis between stock market time series
NASA Astrophysics Data System (ADS)
Shi, Wenbin; Shang, Pengjian
2015-11-01
This paper is devoted to multiscale cross-correlation analysis of stock market time series, applying both the multiscale DCCA cross-correlation coefficient and multiscale cross-sample entropy (MSCE). The multiscale DCCA cross-correlation coefficient is a realization of the DCCA cross-correlation coefficient on multiple scales. The results of this method present a good scaling characterization; more significantly, the method is able to group stock markets by area. Compared to the multiscale DCCA cross-correlation coefficient, MSCE presents a more remarkable scaling characterization, and the value for each log-return series decreases as the scale factor increases, but its grouping results are not as good as those of the multiscale DCCA cross-correlation coefficient.
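A sketch of the DCCA cross-correlation coefficient at a single scale (Zebende's rho: the detrended covariance of the integrated profiles, normalized by the two detrended variances); the multiscale version simply evaluates it over a range of scales:

```python
import numpy as np

def detrended_cov(x, y, n):
    """Average detrended covariance of the integrated profiles at scale n."""
    X = np.cumsum(np.asarray(x, float) - np.mean(x))
    Y = np.cumsum(np.asarray(y, float) - np.mean(y))
    t = np.arange(n)
    covs = []
    for s in range(len(X) // n):
        xs, ys = X[s * n:(s + 1) * n], Y[s * n:(s + 1) * n]
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)  # linear detrending
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        covs.append(np.mean(rx * ry))
    return np.mean(covs)

def rho_dcca(x, y, n):
    """DCCA cross-correlation coefficient at scale n, bounded in [-1, 1]."""
    return detrended_cov(x, y, n) / np.sqrt(
        detrended_cov(x, x, n) * detrended_cov(y, y, n))
```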
Nonlinear time series analysis of solar and stellar data
NASA Astrophysics Data System (ADS)
Jevtic, Nada
2003-06-01
Nonlinear time series analysis was developed to study chaotic systems. Its utility is investigated here for the study of solar and stellar data time series. The sunspot record is the longest astronomical time series, and it reflects the long-term variation of the solar magnetic field. Due to periods of low solar activity, such as the Maunder minimum, and the quasiperiodicity of the solar cycle, it has been postulated that the solar dynamo is a chaotic system. We show that, due to the definition of sunspot number, it is not possible to test this postulate using nonlinear time series methods. To complement the sunspot data analysis, theoretically generated data for the α-Ω solar dynamo with meridional circulation were analyzed. The effects of stochastic fluctuations on the energy of such a dynamo were investigated, which gave a clearer understanding of the effect of dynamical noise on the unperturbed system and proved useful in the study of the light intensity curve of the white dwarf PG 1351+489. Dynamical resetting was identified for PG 1351+489 using phase space methods; then, using nonlinear noise reduction methods, the white noise tail of the power spectrum was lowered by a factor of 40, allowing the identification of 10 new lines in the power spectrum. Finally, using Poincaré section return times, a periodicity in the light curve of the cataclysmic variable SS Cygni was identified. We initially expected time delay methods to serve only as a qualitative comparison tool; however, under the proper set of constraints on the data sets, they were capable of providing quantitative information about the signal source.
Three-dimensional Neumann-series approach to model light transport in nonuniform media
Jha, Abhinav K.; Kupinski, Matthew A.; Barrett, Harrison H.; Clarkson, Eric; Hartman, John H.
2014-01-01
We present the implementation, validation, and performance of a three-dimensional (3D) Neumann-series approach to model photon propagation in nonuniform media using the radiative transport equation (RTE). The RTE is implemented for nonuniform scattering media in a spherical harmonic basis for a diffuse-optical-imaging setup. The method is parallelizable and implemented on a computing system consisting of NVIDIA Tesla C2050 graphics processing units (GPUs). The GPU implementation provides a speedup of up to two orders of magnitude over non-GPU implementation, which leads to good computational efficiency for the Neumann-series method. The results using the method are compared with the results obtained using the Monte Carlo simulations for various small-geometry phantoms, and good agreement is observed. We observe that the Neumann-series approach gives accurate results in many cases where the diffusion approximation is not accurate. PMID:23201945
NASA Astrophysics Data System (ADS)
Radhakrishnan, Srinivasan; Duvvuru, Arjun; Sultornsanee, Sivarit; Kamarthi, Sagar
2016-02-01
The cross correlation coefficient has been widely applied in financial time series analysis, specifically for understanding chaotic behaviour of stock price and index movements during crisis periods. To better understand time series correlation dynamics, the cross correlation matrices are represented as networks, in which a node stands for an individual time series and a link indicates cross correlation between a pair of nodes. These networks are converted into simpler trees using different schemes. In this context, Minimum Spanning Trees (MSTs) are the most favoured tree structures because of their ability to preserve all the nodes and thereby retain the essential information imbued in the network. Although the cross correlations underlying MSTs capture essential information, they do not faithfully capture the dynamic behaviour embedded in the time series data of financial systems, because cross correlation is a reliable measure only if the relationship between the time series is linear. To address this issue, this work investigates a new measure, phase synchronization (PS), for establishing correlations among time series that relate to one another linearly or nonlinearly. In this approach the strength of a link between a pair of time series (nodes) is determined by the level of phase synchronization between them. We compare the performance of the phase synchronization based MST with the cross correlation based MST along selected network measures, across a temporal frame that includes economically good and crisis periods. We observe agreement in the directionality of the results across the two methods: they show similar trends, upward or downward, when comparing selected network measures. Though both methods give similar trends, the phase synchronization based MST is a more reliable representation of the dynamic behaviour of financial systems than the cross correlation based MST because of the former's ability to quantify nonlinear relationships among time
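The MST construction common to both measures can be sketched with Prim's algorithm on the standard correlation-to-distance map d_ij = sqrt(2(1 - c_ij)), where c_ij is either the cross-correlation or the phase-synchronization strength between series i and j:

```python
import numpy as np

def correlation_mst(R):
    """Prim's algorithm on the metric d_ij = sqrt(2*(1 - c_ij)), the standard
    map from a correlation-like matrix to edge lengths; returns MST edges."""
    D = np.sqrt(2.0 * (1.0 - np.asarray(R, dtype=float)))
    n = len(D)
    in_tree = {0}
    edges = []
    while len(in_tree) < n:
        best = None  # (from, to, distance)
        for i in in_tree:
            for j in range(n):
                if j not in in_tree and (best is None or D[i, j] < best[2]):
                    best = (i, j, D[i, j])
        edges.append((best[0], best[1]))
        in_tree.add(best[1])
    return edges
```

Strongly related series get short distances and end up adjacent in the tree, which is why the MST preserves all nodes while keeping only the n-1 most informative links.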
Time series analysis for psychological research: examining and forecasting change.
Jebb, Andrew T; Tay, Louis; Wang, Wei; Huang, Qiming
2015-01-01
Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials.
Time series analysis for psychological research: examining and forecasting change
Jebb, Andrew T.; Tay, Louis; Wang, Wei; Huang, Qiming
2015-01-01
Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials. PMID:26106341
FROG: Time Series Analysis for the Web Service Era
NASA Astrophysics Data System (ADS)
Allan, A.
2005-12-01
The FROG application is part of the next-generation Starlink{http://www.starlink.ac.uk} software work (Draper et al. 2005) and is released under the GNU General Public License{http://www.gnu.org/copyleft/gpl.html} (GPL). Written in Java, it has been designed for the Web and Grid Service era as an extensible, pluggable tool for time series analysis and display. With an integrated SOAP server, the package's functionality is exposed for use in the user's own code and can be used remotely over the Grid as part of the Virtual Observatory (VO).
Chaotic time series analysis in economics: Balance and perspectives
Faggini, Marisa
2014-12-15
The aim of the paper is not to review the large body of work concerning nonlinear time series analysis in economics, about which much has been written, but rather to focus on the new techniques developed to detect chaotic behaviours in economic data. More specifically, our attention will be devoted to reviewing some of these techniques and their application to economic and financial data in order to understand why chaos theory, after a period of growing interest, appears now not to be such an interesting and promising research area.
Gompel, Jamie J. Van; Alikhani, Puya; Youssef, A. Samy; Loveren, Harry R. van; Boyev, K. Paul; Agazzi, Sivero
2015-01-01
Objective Anterior petrosectomy (AP) was popularized in the 1980s and 1990s as micro-neurosurgery proliferated. Original reports concentrated on the anatomy of the approach and small case series. Recently, with the advent of additional endonasal approaches to the petrous apex, the morbidity of AP remains unclear. This report details approach-related morbidity around and under the temporal lobe. Methods A total of 46 consecutive patients identified from our surgical database were reviewed retrospectively. Results Of the 46 patients, 61% were women. Median age of the patients was 50 years (mean: 48 ± 2 years). Median follow-up of this cohort was 66 months. Most procedures dealt with intradural pathology (n = 40 [87%]). Approach-related morbidity consisted of only two patients (4%) with new postoperative seizures. There were only two significant postoperative hemorrhages (4%). Cerebrospinal fluid leakage occurred in two patients (4%) requiring reoperation. Conclusion Approach-related complications such as seizures and hematoma were infrequent in this series, < 4%. This report describes a contemporary group of patients treated with open AP and should serve as a comparison for approach-related morbidity of endoscopic approaches. Given the pathologies treated with this approach, the morbidity appears acceptable. PMID:26401480
Van Gompel, Jamie J; Alikhani, Puya; Youssef, A Samy; Loveren, Harry R van; Boyev, K Paul; Agazzi, Sivero
2015-09-01
Objective Anterior petrosectomy (AP) was popularized in the 1980s and 1990s as micro-neurosurgery proliferated. Original reports concentrated on the anatomy of the approach and small case series. Recently, with the advent of additional endonasal approaches to the petrous apex, the morbidity of AP remains unclear. This report details approach-related morbidity around and under the temporal lobe. Methods A total of 46 consecutive patients identified from our surgical database were reviewed retrospectively. Results Of the 46 patients, 61% were women. Median age of the patients was 50 years (mean: 48 ± 2 years). Median follow-up of this cohort was 66 months. Most procedures dealt with intradural pathology (n = 40 [87%]). Approach-related morbidity consisted of only two patients (4%) with new postoperative seizures. There were only two significant postoperative hemorrhages (4%). Cerebrospinal fluid leakage occurred in two patients (4%) requiring reoperation. Conclusion Approach-related complications such as seizures and hematoma were infrequent in this series, < 4%. This report describes a contemporary group of patients treated with open AP and should serve as a comparison for approach-related morbidity of endoscopic approaches. Given the pathologies treated with this approach, the morbidity appears acceptable.
Discriminant Analysis of Time Series in the Presence of Within-Group Spectral Variability.
Krafty, Robert T
2016-07-01
Many studies record replicated time series epochs from different groups with the goal of using frequency domain properties to discriminate between the groups. In many applications, there exists variation in cyclical patterns from time series in the same group. Although a number of frequency domain methods for the discriminant analysis of time series have been explored, there is a dearth of models and methods that account for within-group spectral variability. This article proposes a model for groups of time series in which transfer functions are modeled as stochastic variables that can account for both between-group and within-group differences in spectra that are identified from individual replicates. An ensuing discriminant analysis of stochastic cepstra under this model is developed to obtain parsimonious measures of relative power that optimally separate groups in the presence of within-group spectral variability. The approach possesses favorable properties in classifying new observations and can be consistently estimated through a simple discriminant analysis of a finite number of estimated cepstral coefficients. The benefits of accounting for within-group spectral variability are empirically illustrated in a simulation study and through an analysis of gait variability.
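Cepstral coefficients, the quantities the discriminant analysis operates on, are the inverse Fourier transform of the log-spectrum. A minimal estimator from the log-periodogram (a textbook sketch, not the article's stochastic-cepstra model):

```python
import numpy as np

def cepstral_coefficients(x, n_coef=10):
    """First cepstral coefficients of a series: inverse FFT of the
    log-periodogram, a compact summary of the spectral shape."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    periodogram = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    log_spec = np.log(np.maximum(periodogram, 1e-12))  # floor avoids log(0)
    return np.fft.irfft(log_spec, n=len(x))[:n_coef]
```

Because the log-spectrum of most stationary processes is smooth, a small number of such coefficients captures most of the discriminatory spectral information, which is the parsimony the article exploits.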
Lehman, Li-wei; Ghassemi, Mohammad; Snoek, Jasper; Nemati, Shamim
2016-01-01
In this work, we propose a stacked switching vector-autoregressive (SVAR)-CNN architecture to model the changing dynamics in physiological time series for patient prognosis. The SVAR-layer extracts dynamical features (or modes) from the time-series, which are then fed into the CNN-layer to extract higher-level features representative of transition patterns among the dynamical modes. We evaluate our approach using 8 hours of minute-by-minute mean arterial blood pressure (BP) from over 450 patients in the MIMIC-II database. We modeled the time-series using a third-order SVAR process with 20 modes, resulting in first-level dynamical features of size 20×480 per patient. A fully connected CNN is then used to learn hierarchical features from these inputs, and to predict hospital mortality. The combined CNN/SVAR approach using BP time-series achieved a median and interquartile-range AUC of 0.74 [0.69, 0.75], significantly outperforming CNN-alone (0.54 [0.46, 0.59]), and SVAR-alone with logistic regression (0.69 [0.65, 0.72]). Our results indicate that including an SVAR layer improves the ability of CNNs to classify nonlinear and nonstationary time-series. PMID:27790623
Weighted permutation entropy based on different symbolic approaches for financial time series
NASA Astrophysics Data System (ADS)
Yin, Yi; Shang, Pengjian
2016-02-01
In this paper, we introduce weighted permutation entropy (WPE) and three different symbolic approaches to investigate the complexity of stock time series containing amplitude-coded information, and we explore the influence of the choice of symbolic approach on the resulting WPE. We apply WPE based on these symbolic approaches to the US and Chinese stock markets and compare the results between the two. All three symbolic approaches reduce the WPE-measured complexity of the stock time series, whatever the embedding dimension. The similarity between stock markets can be detected by WPE based on the Binary Δ-coding method, while the differences between them are revealed by WPE based on the σ-method and the Max-min method. Combining the σ-method or the Max-min method with WPE reflects the multiscale structure of complexity through different time delays and allows the differences between the complexities of stock time series to be analyzed in more detail and more accurately. Furthermore, the comparison of the Binary Δ-coding-based WPE across six stock markets uncovers the correlations between stock markets in the same region and the similarities hidden in the S&P500 and DJI, and in ShangZheng and ShenCheng.
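A sketch of the WPE computation itself, using the common variance-based weighting of ordinal patterns (the paper's three symbolizations would be applied to the amplitudes before this step; this weighting choice is an assumption, not the paper's exact definition):

```python
import math
from collections import defaultdict

import numpy as np

def weighted_permutation_entropy(x, m=3, delay=1):
    """Normalized weighted permutation entropy: each ordinal pattern of
    embedding dimension m is weighted by the variance of its window, so
    large-amplitude structures count more than noise-level ones."""
    x = np.asarray(x, dtype=float)
    weights = defaultdict(float)
    for i in range(len(x) - (m - 1) * delay):
        window = x[i:i + (m - 1) * delay + 1:delay]
        pattern = tuple(np.argsort(window))       # ordinal (rank) pattern
        weights[pattern] += np.var(window)        # amplitude-aware weight
    total = sum(weights.values())
    p = np.array([w / total for w in weights.values()])
    # Normalize by log2(m!) so the result lies in [0, 1].
    return float(-np.sum(p * np.log2(p)) / math.log2(math.factorial(m)))
```

A monotone series yields a single pattern (entropy 0), while an unstructured series spreads weight over all m! patterns (entropy near 1); varying the delay parameter gives the multiscale view discussed above.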
McKinney, B A; Crowe, J E; Voss, H U; Crooke, P S; Barney, N; Moore, J H
2006-02-01
We introduce a grammar-based hybrid approach to reverse engineering nonlinear ordinary differential equation models from observed time series. This hybrid approach combines a genetic algorithm to search the space of model architectures with a Kalman filter to estimate the model parameters. Domain-specific knowledge is used in a context-free grammar to restrict the search space for the functional form of the target model. We find that the hybrid approach outperforms a pure evolutionary algorithm method, and we observe features in the evolution of the dynamical models that correspond with the emergence of favorable model components. We apply the hybrid method to both artificially generated time series and experimentally observed protein levels from subjects who received the smallpox vaccine. From the observed data, we infer a cytokine protein interaction network for an individual's response to the smallpox vaccine.
Permutation approach, high frequency trading and variety of micro patterns in financial time series
NASA Astrophysics Data System (ADS)
Aghamohammadi, Cina; Ebrahimian, Mehran; Tahmooresi, Hamed
2014-11-01
The permutation approach is suggested as a method to investigate financial time series at micro scales. The method is used to examine how high-frequency trading in recent years has affected the micro patterns that may be seen in financial time series. Tick-by-tick exchange rates are considered as examples. A variety of patterns is seen to evolve through time, and the scale over which the target markets show no dominant patterns has decreased steadily with the emergence of higher-frequency trading.
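The "variety" of micro patterns at a given scale can be sketched as the fraction of the m! possible ordinal patterns that actually occur in the series (a hedged illustration; the function name and normalization are my own, not the paper's):

```python
import numpy as np
from math import factorial

def pattern_variety(x, m=4, tau=1):
    """Fraction of the m! possible ordinal patterns observed in the series."""
    x = np.asarray(x, dtype=float)
    seen = set()
    for i in range(len(x) - (m - 1) * tau):
        window = x[i:i + m * tau:tau]
        seen.add(tuple(np.argsort(window)))   # ordinal pattern of this window
    return len(seen) / factorial(m)
```

A trending series exhibits a single pattern, whereas a noisy high-frequency series quickly exhausts the pattern alphabet; scanning `m` and `tau` probes the scale dependence the abstract describes.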
Amirjamshidi, Abbas; Abbasioun, Kazem; Amiri, Rouzbeh Shams; Ardalan, Ali; Hashemi, Seyyed Mahmood Ramak
2015-01-01
Background: Sphenoid wing meningiomas extending to the orbit (ePMSW) are currently removed through several transcranial approaches. Presenting the largest surgical cohort of hyperostosing ePMSW with the longest follow-up period, we provide data supporting a mini lateral orbitotomy with excellent exposure for wide resection of all compartments of the tumor. Methods: A retrospective survival analysis was made of data accumulated prospectively over a period of 34 years, comprising 88 cases of ePMSW with a mean follow-up period of 136.4 months. The impact of preoperative variables on different outcome measures is evaluated. Standard pterional craniotomy was performed in 12 patients (C), while the other 76 cases underwent the proposed modified lateral miniorbitotomy (LO). Results: There were 31 men and 57 women. Ages ranged between 12 and 70 years. Patients presented with unilateral exophthalmos (Uex) ranging between 3 and 16 mm. The duration of proptosis before operation varied between 6 months and 16 years. Visual acuity (VA) prior to operation was: no light perception (NLP) in 16, light perception (LP) up to 0.2 in 3, 0.3–0.5 in 22, 0.6–0.9 in 24, and full vision in 23 patients. Postoperatively, acceptable cosmetic appearance of the eyes was seen in 38 cases, and in 46 a mild inequality of < 2 mm was detected. Four cases had mild enophthalmos (En). Among those who had the worst VA, two improved and one became almost blind after operation. The cases with VA in the range of 0.3–0.5 improved. Among those with good VA (0.5 to full vision), 2 became blind, vision diminished in 10, and vision improved or remained full in the other 35 cases. Tumor recurrence occurred in 33.3% of group C and 10.5% of group LO (P = 0.05). The major determinant of tumor regrowth was the technique of LO (P = 0.008). Conclusion: Using the LO technique, the risky corners involved by the tumor are visualized from the latero-inferior side rather than from the latero-superior avenue
Time series clustering analysis of health-promoting behavior
NASA Astrophysics Data System (ADS)
Yang, Chi-Ta; Hung, Yu-Shiang; Deng, Guang-Feng
2013-10-01
Health promotion must be emphasized to achieve the World Health Organization goal of health for all. Since the global population is aging rapidly, the ComCare elder health-promoting service was developed by the Taiwan Institute for Information Industry in 2011. Based on the Pender health promotion model, the ComCare service offers five categories of health-promoting functions to address the everyday needs of seniors: nutrition management, social support, exercise management, health responsibility, and stress management. To assess the overall ComCare service and to improve understanding of the health-promoting behavior of elders, this study analyzed health-promoting behavioral data automatically collected by the ComCare monitoring system. In the 30,638 session records collected for 249 elders from January 2012 to March 2013, behavior patterns were identified by a fuzzy c-means time series clustering algorithm combined with autocorrelation-based representation schemes. The analysis showed that the time series data for elder health-promoting behavior can be classified into four different clusters. Each cluster reveals different health-promoting needs, frequencies, function numbers and behaviors. The results can assist policymakers, health-care providers, and experts in medicine, public health, nursing and psychology, and have been provided to the Taiwan National Health Insurance Administration to assess elder health-promoting behavior.
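The clustering step itself is standard fuzzy c-means; a minimal numpy sketch of the algorithm (on generic feature vectors, without the paper's autocorrelation-based representation, and with illustrative parameter defaults) is:

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means: returns (cluster centers, membership matrix U)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1 per point
    for _ in range(iters):
        Um = U ** m
        # Centers are membership-weighted means of the points
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        # Standard FCM membership update: u_ik proportional to d_ik^(-2/(m-1))
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U
```

Unlike hard k-means, each elder's behavior series would receive a graded membership in every cluster, which suits overlapping behavioral profiles.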
Monthly hail time series analysis related to agricultural insurance
NASA Astrophysics Data System (ADS)
Tarquis, Ana M.; Saa, Antonio; Gascó, Gabriel; Díaz, M. C.; Garcia Moreno, M. R.; Burgaz, F.
2010-05-01
Hail causes one of the most important crop insurance losses in Spain, accounting for more than 50% of the total insurance in cereal crops. The purpose of the present study is to analyze hail damage in cereals. Four provinces were chosen, those with the highest production values: Burgos and Zaragoza for wheat, and Cuenca and Valladolid for barley. The data available for studying the evolution and intensity of hail damage include the ratios of agricultural insurance provided by ENESA and the number of annual hail days (from 1981 to 2007), between which we analyze the correlation. In addition, several weather stations per province were selected for having the longest and most complete records (from 1963 to 2007) in order to perform an analysis of monthly time series of the number of hail days (HD). The results show that the relation between the agricultural insurance ratios and the number of hail days is not clear. Several observations are discussed to explain these results, as well as whether it is possible to determine a change of tendency in the HD time series.
On statistical inference in time series analysis of the evolution of road safety.
Commandeur, Jacques J F; Bijleveld, Frits D; Bergel-Hayat, Ruth; Antoniou, Constantinos; Yannis, George; Papadimitriou, Eleonora
2013-11-01
Data collected for building a road safety observatory usually include observations made sequentially through time. Examples of such data, called time series data, include the annual (or monthly) number of road traffic accidents, traffic fatalities or vehicle kilometers driven in a country, as well as the corresponding values of safety performance indicators (e.g., data on speeding, seat belt use, alcohol use, etc.). Some commonly used statistical techniques imply assumptions that are often violated by the special properties of time series data, namely serial dependency among the disturbances associated with the observations. The first objective of this paper is to demonstrate the impact of such violations on the applicability of standard methods of statistical inference, which leads to an underestimation or overestimation of the standard error and consequently may produce erroneous inferences. Moreover, having established the adverse consequences of ignoring serial dependency issues, the paper aims to describe rigorous statistical techniques used to overcome them. In particular, appropriate time series analysis techniques of varying complexity are employed to describe the development over time, relating the accident occurrences to explanatory factors such as exposure measures or safety performance indicators, and forecasting the development into the near future. Traditional regression models (whether linear, generalized linear or nonlinear) are shown not to naturally capture the inherent dependencies in time series data. Dedicated time series analysis techniques, such as the ARMA-type and DRAG approaches, are discussed next, followed by structural time series models, which are a subclass of state space methods. The paper concludes with general recommendations and practice guidelines for the use of time series models in road safety research.
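The serial-dependency point can be made concrete with a small simulation (illustrative, not from the paper): for positively autocorrelated AR(1) disturbances, the naive i.i.d. standard error of the mean badly understates the true sampling variability.

```python
import numpy as np

rng = np.random.default_rng(42)
n, rho, trials = 200, 0.8, 2000

means = []
for _ in range(trials):
    x = np.empty(n)
    x[0] = rng.normal()
    for t in range(1, n):
        x[t] = rho * x[t - 1] + rng.normal()   # serially dependent disturbances
    means.append(x.mean())

true_se = float(np.std(means))                  # empirical SE of the sample mean
naive_se = float(x.std(ddof=1) / np.sqrt(n))    # i.i.d. formula on one series
print(true_se, naive_se)                        # the naive SE is several times too small
```

With rho = 0.8 the true standard error is roughly three times the i.i.d. value, so confidence intervals built from the naive formula would be far too narrow, which is exactly the erroneous-inference risk the abstract warns about.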
An iterative approach to optimize change classification in SAR time series data
NASA Astrophysics Data System (ADS)
Boldt, Markus; Thiele, Antje; Schulz, Karsten; Hinz, Stefan
2016-10-01
The detection of changes using remote sensing imagery has become a broad field of research, with many approaches for many different applications. Besides the simple detection of changes between at least two images acquired at different times, analyses that aim at the change type or category are at least equally important. In this study, an approach for the semi-automatic classification of change segments is presented. A sparse dataset is considered to ensure fast and simple applicability in practice. The dataset is given by 15 high resolution (HR) TerraSAR-X (TSX) amplitude images acquired over a time period of one year (11/2013 to 11/2014). The scenery contains the airport of Stuttgart (GER) and its surroundings, including urban, rural, and suburban areas. Time series imagery offers the advantage of analyzing the change frequency of selected areas. In this study, the focus is set on the analysis of small, frequently changing regions like parking areas, construction sites, and collecting points, which consist of high activity (HA) change objects. For each HA change object, suitable features are extracted, and k-means clustering is applied as the categorization step. The resulting clusters are finally compared to a previously introduced knowledge-based class catalogue, which is modified until an optimal class description results. In other words, the subjective understanding of the scenery semantics is optimized against the reality given by the data. In this way, even a sparse dataset containing only amplitude imagery can be evaluated without requiring comprehensive training datasets. Falsely defined classes can be rejected; classes defined too coarsely can be divided into sub-classes; and, conversely, classes defined too narrowly can be merged. An optimal classification results when the combination of previously defined key indicators (e.g., the number of clusters per class) reaches an optimum.
Revuelta-Gutiérrez, Rogelio; Morales-Martínez, Andres Humberto; Mejías-Soto, Carolina; Martínez-Anda, Jaime Jesús; Ortega-Porcayo, Luis Alberto
2016-01-01
Background: Glossopharyngeal neuralgia (GPN) is an uncommon craniofacial pain syndrome. It is characterized by sudden-onset lancinating pain, usually localized in the sensory distribution of the IX cranial nerve, associated with excessive vagal outflow, which leads to bradycardia, hypotension, syncope, or cardiac arrest. This study aims to review our surgical experience performing microvascular decompression (MVD) in patients with GPN. Methods: Over the last 20 years, 14 consecutive cases were diagnosed with GPN. MVD using a microasterional approach was performed in all patients. Demographic data, clinical presentation, surgical findings, clinical outcome, complications, and long-term follow-up were reviewed. Results: The median age of onset was 58.7 years. The mean time from onset of symptoms to treatment was 8.8 years. Glossopharyngeal and vagus nerve compression was from the posterior inferior cerebellar artery in eleven cases (78.5%), the vertebral artery in two cases (14.2%), and the choroid plexus in one case (7.1%). Postoperative mean follow-up was 26 months (3–180 months). Pain analysis demonstrated long-term pain improvement of 114 ± 27.1 months and pain remission in 13 patients (92.9%) (P = 0.0001). Two complications were documented: one patient had a cerebrospinal fluid leak, and another had bacterial meningitis. There was no surgical mortality. Conclusions: GPN is a rare entity, and secondary causes should be discarded. MVD through a retractorless microasterional approach is a safe and effective technique. Our series demonstrated an excellent clinical outcome, with pain remission in 92.9%. PMID:27213105
[Causal analysis approaches in epidemiology].
Dumas, O; Siroux, V; Le Moual, N; Varraso, R
2014-02-01
Epidemiological research is mostly based on observational studies. Whether such studies can provide evidence of causation remains debated. Several causal analysis methods have been developed in epidemiology. This paper aims at presenting an overview of these methods: graphical models, path analysis and its extensions, and models based on the counterfactual approach, with a special emphasis on marginal structural models. Graphical approaches have been developed to allow synthetic representations of supposed causal relationships in a given problem. They serve as qualitative support in the study of causal relationships. The sufficient-component cause model has been developed to deal with the issue of multicausality raised by the emergence of chronic multifactorial diseases. Directed acyclic graphs are mostly used as a visual tool to identify possible confounding sources in a study. Structural equation models, the main extension of path analysis, combine a system of equations and a path diagram, representing a set of possible causal relationships. They allow quantifying direct and indirect effects in a general model in which several relationships can be tested simultaneously. Dynamic path analysis further takes into account the role of time. The counterfactual approach defines causality by comparing the observed event and the counterfactual event (the event that would have been observed if, contrary to the fact, the subject had received a different exposure than the one he actually received). This theoretical approach has shown the limits of traditional methods to address some causality questions. In particular, in longitudinal studies, when there is time-varying confounding, classical methods (regressions) may be biased. Marginal structural models have been developed to address this issue. In conclusion, "causal models", though they were developed partly independently, are based on equivalent logical foundations. A crucial step in the application of these models is the
NASA Astrophysics Data System (ADS)
Balidakis, Kyriakos; Heinkelmann, Robert; Lu, Cuixian; Soja, Benedikt; Karbon, Maria; Nilsson, Tobias; Glaser, Susanne; Andres Mora-Diaz, Julian; Anderson, James; Liu, Li; Raposo-Pulido, Virginia; Xu, Minghui; Schuh, Harald
2015-04-01
Time series of meteorological parameters recorded at VLBI (Very Long Baseline Interferometry) observatories allow us to realistically model and consequently to eliminate, to a large extent, the atmosphere-induced effects in the VLBI products. Nevertheless, this advantage of VLBI is not fully exploited, since such information is contaminated with inconsistencies, such as uncertainties regarding the calibration and location of the meteorological sensors, outliers, missing data points, and breaks. It has been shown that such inconsistencies in meteorological data used for VLBI data analysis impose problems on the geodetic products (e.g., the vertical site position) and result in mistakes in geophysical interpretation. The aim of the procedure followed here is to optimally model the tropospheric delay and bending effects that are still the main sources of error in VLBI data analysis. In this study, the meteorological data recorded with sensors mounted in the vicinity of VLBI stations have been homogenized, spanning the period from 1979 until today. In order to meet this objective, inhomogeneities were detected and adjusted using test results and metadata. The approaches employed include Alexandersson's Standard Normal Homogeneity Test and an iterative procedure whose segmentation part is based on a dynamic programming algorithm and whose functional part is based on a LASSO (Least Absolute Shrinkage and Selection Operator) estimator. For the provision of the reference time series necessary to apply the aforementioned methods, ECMWF's (European Centre for Medium-Range Weather Forecasts) ERA-Interim reanalysis surface data were employed. Special care was taken regarding the datum definition of this model. Due to the significant height difference between the VLBI antenna's reference point and the elevation included in the geopotential fields of the specific numerical weather models, a hypsometric adjustment is applied using the absolute pressure level from the WMO
Time series modeling by a regression approach based on a latent process.
Chamroukhi, Faicel; Samé, Allou; Govaert, Gérard; Aknin, Patrice
2009-01-01
Time series are used in many domains, including finance, engineering, economics and bioinformatics, generally to represent the change of a measurement over time. Modeling techniques may then be used to give a synthetic representation of such data. A new approach for time series modeling is proposed in this paper. It consists of a regression model incorporating a discrete hidden logistic process that allows different polynomial regression models to be activated smoothly or abruptly. The model parameters are estimated by the maximum likelihood method, performed by a dedicated Expectation-Maximization (EM) algorithm. The M-step of the EM algorithm uses a multi-class Iteratively Reweighted Least Squares (IRLS) algorithm to estimate the hidden process parameters. To evaluate the proposed approach, an experimental study on simulated data and real-world data was performed using two alternative approaches: a heteroskedastic piecewise regression model using a global optimization algorithm based on dynamic programming, and a hidden Markov regression model whose parameters are estimated by the Baum-Welch algorithm. Finally, in the context of the remote monitoring of components of the French railway infrastructure, and more particularly the switch mechanism, the proposed approach has been applied to modeling and classifying time series representing the condition measurements acquired during switch operations.
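As a hedged sketch of the generative idea (all parameter values invented for illustration; the paper estimates them by EM with an IRLS inner loop), a logistic gate can switch smoothly between two polynomial regression regimes:

```python
import numpy as np

t = np.linspace(0.0, 10.0, 400)
# Logistic activation of regime 2 around t = 5 (steepness 3 is arbitrary)
pi2 = 1.0 / (1.0 + np.exp(-3.0 * (t - 5.0)))
regime1 = 2.0 + 0.5 * t                    # first polynomial regression model
regime2 = 12.0 - 0.2 * (t - 5.0) ** 2      # second polynomial regression model
noise = np.random.default_rng(1).normal(0.0, 0.1, t.size)
# Observed series: smooth mixture of the two regimes plus noise
y = (1.0 - pi2) * regime1 + pi2 * regime2 + noise
```

Making the logistic steepness large turns the smooth transition into an abrupt regime switch, which is the flexibility the abstract highlights.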
Practical approaches in accident analysis
NASA Astrophysics Data System (ADS)
Stock, M.
An accident analysis technique based on the successive application of structural response, explosion dynamics, gas cloud formation, and plant operation failure mode models is proposed. The method takes into account the nonideal explosion characteristics of a deflagration in an unconfined cloud. The resulting pressure wave differs significantly from a shock wave, and the response of structures like lamp posts and walls differs correspondingly. This gives more realistic insight into the course of explosions than a simple TNT-equivalent approach.
Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool
NASA Technical Reports Server (NTRS)
McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall
2008-01-01
The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. The TSPT uses MODIS metadata to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify
A new approach to calibrate steady groundwater flow models with time series of head observations
NASA Astrophysics Data System (ADS)
Obergfell, C.; Bakker, M.; Maas, C.
2012-04-01
We developed a new method to calibrate aquifer parameters of steady-state well field models using measured time series of head fluctuations. Our method is an alternative to standard pumping tests and is based on time series analysis using parametric impulse response functions. First, the pumping influence is isolated from the overall groundwater fluctuation observed at monitoring wells around the well field, and response functions are determined for each individual well. Time series parameters are optimized using a quasi-Newton algorithm. For one monitoring well, time series model parameters are also optimized by means of SCEM-UA, a Markov Chain Monte Carlo algorithm, as a control on the validity of the parameters obtained by the faster quasi-Newton method. Subsequently, the drawdown corresponding to an average yearly pumping rate is calculated from the response functions determined by time series analysis. The drawdown values estimated with acceptable confidence intervals are used as calibration targets of a steady groundwater flow model. A case study is presented of the drinking water supply well field of Waalwijk (Netherlands). In this case study, a uniform aquifer transmissivity is optimized together with the conductance of ditches in the vicinity of the well field. Groundwater recharge or boundary heads do not have to be entered, which eliminates two important sources of uncertainty. The method constitutes a cost-efficient alternative to pumping tests and allows the determination of pumping influences without changes in well field operation.
Inorganic chemical analysis of environmental materials—A lecture series
Crock, J.G.; Lamothe, P.J.
2011-01-01
At the request of the faculty of the Colorado School of Mines, Golden, Colorado, the authors prepared and presented a lecture series to the students of a graduate-level advanced instrumental analysis class. The slides and text presented in this report are a compilation and condensation of this series of lectures. The purpose of this report is to present the slides and notes and to emphasize the thought processes that should be used by a scientist submitting samples for analyses in order to procure analytical data to answer a research question. First and foremost, the analytical data generated can be no better than the samples submitted. The questions to be answered must first be well defined and the appropriate samples collected from the population that will answer the question. The proper methods of analysis, including proper sample preparation and digestion techniques, must then be applied. Care must be taken to achieve the required limits of detection of the critical analytes to yield detectable analyte concentrations (above "action" levels) for the majority of the study's samples, and to address which portion of those analytes answers the research question: total or partial concentrations. To guarantee a robust analytical result that answers the research question(s), a well-defined quality assurance and quality control (QA/QC) plan must be employed. This QA/QC plan must include the collection and analysis of field and laboratory blanks, sample duplicates, and matrix-matched standard reference materials (SRMs). The proper SRMs may include in-house materials and/or a selection of widely available commercial materials. A discussion of the preparation and applicability of in-house reference materials is also presented. Only when all these analytical issues are sufficiently addressed can the research questions be answered with known certainty.
Analysis of temperature time-series: Embedding dynamics into the MDS method
NASA Astrophysics Data System (ADS)
Lopes, António M.; Tenreiro Machado, J. A.
2014-04-01
Global warming and the associated climate changes are the subject of intensive research due to their major impact on social, economic and health aspects of human life. Surface temperature time series characterize the Earth as a slow-dynamics spatiotemporal system, evidencing long-memory behaviour typical of fractional-order systems. Such phenomena are difficult to model and analyse, demanding alternative approaches. This paper studies the complex correlations between global temperature time series using the multidimensional scaling (MDS) approach. MDS provides a graphical representation of the pattern of climatic similarities between regions around the globe. The similarities are quantified through two mathematical indices that correlate the monthly average temperatures observed in meteorological stations over a given period of time. Furthermore, time dynamics are analysed by performing the MDS analysis over slices sampling the time series. MDS generates maps describing the stations' loci in the sense that stations perceived to be similar to each other are placed on the map forming clusters. We show that MDS provides an intuitive and useful visual representation of the complex relationships present among temperature time series, which are not perceived on traditional geographic maps. Moreover, MDS avoids sensitivity to the irregular distribution density of the meteorological stations.
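The MDS step itself is standard; a minimal classical (Torgerson) MDS on a distance matrix, such as one built from a correlation index via d = sqrt(2(1 - r)) (a common choice, not necessarily the paper's), can be sketched as:

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical MDS: double-center the squared distances, eigendecompose,
    and keep the top-k coordinates."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J                   # Gram matrix of centered points
    vals, vecs = np.linalg.eigh(B)                # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:k]
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))
```

For a distance matrix that is exactly Euclidean, the embedded points reproduce the pairwise distances; stations with similar temperature histories land close together, forming the clusters the abstract describes.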
Multidimensional stock network analysis: An Escoufier's RV coefficient approach
NASA Astrophysics Data System (ADS)
Lee, Gan Siew; Djauhari, Maman A.
2013-09-01
The current practice of stock network analysis is based on the assumption that the time series of closing stock prices can represent the behaviour of each stock. This assumption leads to considering the minimal spanning tree (MST) and sub-dominant ultrametric (SDU) as indispensable tools to filter the economic information contained in the network. Recently, researchers have attempted to represent a stock not only as a univariate time series of closing prices but as a bivariate time series of closing price and volume. In this case, they developed the so-called multidimensional MST to filter the important economic information. However, in this paper, we show that their approach is applicable only to such bivariate time series. This leads us to introduce a new methodology to construct the MST where each stock is represented by a multivariate time series. An example from the Malaysian stock exchange is presented and discussed to illustrate the advantages of the method.
Charakopoulos, A K; Karakasidis, T E; Papanicolaou, P N; Liakopoulos, A
2014-03-01
In the present work we approach the hydrodynamic problem of discriminating the state of the turbulent fluid region as a function of the distance from the axis of a turbulent jet. More specifically, we analyzed temperature fluctuations in vertical turbulent heated jets, where temperature time series were recorded along a horizontal line through the jet axis. We employed data from different sets of experiments with various initial conditions, out of circular and elliptically shaped nozzles, in order to identify time series taken at the jet axis and discriminate them from those taken near the boundary with the ambient fluid using nonconventional hydrodynamic methods. For each temperature time series measured at a different distance from the jet axis, we estimated mainly nonlinear measures such as mutual information, combined with descriptive statistical measures, as well as linear and nonlinear dynamic descriptors such as the Hurst exponent, detrended fluctuation analysis, and the Hjorth parameters. The results obtained in all cases show that the proposed methodology allows us to distinguish the flow regime around the jet axis and identify the time series corresponding to the jet axis, in agreement with the conventional statistical hydrodynamic method. Furthermore, in order to reject the null hypothesis that the time series originate from a stochastic process, we applied the surrogate data method.
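Of the descriptors listed, the Hjorth parameters are particularly compact; a sketch of their standard definitions (not the authors' code):

```python
import numpy as np

def hjorth(x):
    """Hjorth activity, mobility, and complexity of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    dx, ddx = np.diff(x), np.diff(x, n=2)
    activity = np.var(x)                              # signal power
    mobility = np.sqrt(np.var(dx) / np.var(x))        # proxy for mean frequency
    complexity = np.sqrt(np.var(ddx) / np.var(dx)) / mobility
    return activity, mobility, complexity
```

A pure sinusoid has complexity close to 1 while broadband noise scores higher, so the parameter discriminates smooth near-axis fluctuations from intermittent boundary signals along the lines the abstract describes.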
STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS
Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James
2013-02-20
This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations while suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it (an improved and generalized version of Bayesian Blocks) that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research, all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.
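A compact reimplementation of the events-mode dynamic program (with a constant prior per change point; the paper's full algorithm covers more data modes, fitness functions, and priors) might look like:

```python
import numpy as np

def bayesian_blocks(t, ncp_prior=4.0):
    """Optimal piecewise-constant segmentation of distinct event times.
    Block fitness: N * log(N / T) (events mode). Returns block edges."""
    t = np.sort(np.asarray(t, dtype=float))
    n = len(t)
    # Candidate edges: data bounds plus midpoints between consecutive events
    edges = np.concatenate([t[:1], 0.5 * (t[1:] + t[:-1]), t[-1:]])
    block_length = t[-1] - edges
    best = np.zeros(n)
    last = np.zeros(n, dtype=int)
    for r in range(n):
        width = block_length[:r + 1] - block_length[r + 1]   # T of block j..r
        counts = np.arange(r + 1, 0, -1)                     # N of block j..r
        fit = counts * (np.log(counts) - np.log(width)) - ncp_prior
        fit[1:] += best[:r]            # add best partition of the cells before j
        last[r] = int(np.argmax(fit))
        best[r] = fit[last[r]]
    # Backtrack the optimal change points
    cp = [n]
    while cp[-1] > 0:
        cp.append(int(last[cp[-1] - 1]))
    return edges[cp[::-1]]
```

Raising `ncp_prior` penalizes each additional change point more heavily and so yields fewer blocks; the paper calibrates this prior against a target false-positive rate.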
Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations
NASA Technical Reports Server (NTRS)
Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James
2013-01-01
This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it (an improved and generalized version of Bayesian Blocks [Scargle 1998]) that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by [Arias-Castro, Donoho and Huo 2003]. In the spirit of Reproducible Research [Donoho et al. (2008)] all of the code and data necessary to reproduce all of the figures in this paper are included as auxiliary material.
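The segmentation the abstract describes can be sketched as a dynamic-programming recurrence over candidate change points. The following is a minimal Python illustration, not the authors' released code: the Cash-statistic fitness for binned Poisson counts and the fixed `ncp_prior` value are simplifying assumptions of this sketch.

```python
import numpy as np

def bayesian_blocks(t, x, ncp_prior=4.0):
    """Optimal piecewise-constant segmentation of positive binned counts x at times t.

    Returns the left edges of the optimal blocks. Uses the dynamic-programming
    scheme: best[r] = max_j (fitness of block j..r) + best[j-1] - prior.
    """
    n = len(t)
    # candidate block edges: midpoints between samples, plus the two ends
    edges = np.concatenate(([t[0]], 0.5 * (t[1:] + t[:-1]), [t[-1]]))
    best = np.zeros(n)
    last = np.zeros(n, dtype=int)
    for r in range(n):
        widths = edges[r + 1] - edges[:r + 1]          # widths of blocks j..r
        counts = np.cumsum(x[:r + 1][::-1])[::-1]      # suffix sums: counts in j..r
        # Cash-statistic fitness N*log(N/T) for Poisson counts (counts must be > 0)
        fitness = counts * np.log(counts / widths)
        total = fitness - ncp_prior
        total[1:] += best[:r]
        last[r] = np.argmax(total)
        best[r] = total[last[r]]
    # backtrack the optimal change points
    cps, i = [], n
    while i > 0:
        cps.append(last[i - 1])
        i = last[i - 1]
    return edges[np.array(cps[::-1])]
```

On a step signal the recurrence recovers the single change point: for counts of 1 per bin followed by 10 per bin, the optimum is two blocks split at the step.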
Forced response approach of a parametric vibration with a trigonometric series
NASA Astrophysics Data System (ADS)
Huang, Dishan
2015-02-01
In this manuscript, a forced vibration problem with parametric stiffness is modeled as a feedback structure, and the forced response is expressed as a special trigonometric series. The forced response is then determined by an algebraic equation. By applying harmonic balance and a limiting operation, all coefficients of the harmonic components in the forced response solution are obtained. The results show that the new approach offers advantages in computational time and accuracy, and that it is valuable for theoretical research and engineering applications dealing with forced parametric vibration.
Swetapadma, Aleena; Yadav, Anamika
2015-01-01
Many schemes have been reported for shunt fault location estimation, but location estimation of series (open conductor) faults has not been dealt with so far. Existing numerical relays only detect an open conductor (series) fault and indicate the faulty phase(s); they are unable to locate the series fault, so the repair crew must patrol the complete line to find it. In this paper, fuzzy-based fault detection/classification and location schemes in the time domain are proposed for series faults, shunt faults, and simultaneous series and shunt faults. The fault simulation studies and fault location algorithm have been developed using Matlab/Simulink. Synchronized phasors of voltage and current signals from both ends of the line are used as input to the proposed fuzzy-based fault location scheme. The location error is within 1% for series faults and within 5% for shunt faults for all tested fault cases. The location error estimates are validated using a chi-square test at both the 1% and 5% levels of significance.
NASA Astrophysics Data System (ADS)
Muñoz-Diosdado, A.
2005-01-01
We analyzed databases with gait time series of healthy adults and of persons with Parkinson's disease, Huntington's disease, and amyotrophic lateral sclerosis (ALS). We obtained staircase graphs of accumulated events that can be bounded by a straight line, whose slope can be used to distinguish gait time series of healthy persons from those of ill persons. The global Hurst exponents of these series show no clear tendencies; we contend that this is because some gait time series have monofractal behavior and others have multifractal behavior, so they cannot be characterized by a single Hurst exponent. We calculated the multifractal spectra and obtained the spectra widths, and found that the spectra of healthy young persons are almost monofractal. The spectra of ill persons are wider than those of healthy persons. In contrast to interbeat time series, where pathology implies a loss of multifractality, in gait time series multifractal behavior emerges with the pathology. Data were collected from healthy and ill subjects as they walked along a roughly circular path, with sensors on both feet, giving one time series for the left foot and another for the right foot. We first analyzed these time series separately and then compared the results, both directly and with a cross-correlation analysis, seeking differences between the two series that could serve as indicators of equilibrium problems.
Time series analysis for minority game simulations of financial markets
NASA Astrophysics Data System (ADS)
Ferreira, Fernando F.; Francisco, Gerson; Machado, Birajara S.; Muruganandam, Paulsamy
2003-04-01
The recently introduced minority game (MG) model provides promising insights into the evolution of prices, indices and rates in financial markets. In this paper we perform a time series analysis of the model employing tools from statistics, dynamical systems theory and stochastic processes. Using benchmark systems and a financial index for comparison, several conclusions are obtained about the generating mechanism for this kind of evolution. The motion is deterministic, driven by occasional random external perturbations. When the interval between two successive perturbations is sufficiently large, low-dimensional chaos can be found in this regime. However, the full motion of the MG model is found to be similar to that of the first differences of the S&P 500 index: stochastic, nonlinear and (unit root) stationary.
An Alternative Approach to Atopic Dermatitis: Part I—Case-Series Presentation
2004-01-01
Atopic dermatitis (AD) is a complex disease of obscure pathogenesis. A substantial portion of AD patients treated with conventional therapy become intractable after several cycles of recurrence. Over the last 20 years we have developed an alternative approach to treating many of these patients with diet and Kampo herbal medicine. However, because our approach is highly individualized and the Kampo formulae sometimes complicated, it is not easy to provide evidence establishing its usefulness. In this Review, to demonstrate the effectiveness of individualized Kampo therapy, results are presented for a series of patients who had failed with conventional therapy but were subsequently treated in our institution. Based on these data, we contend that there exists a definite subgroup of AD patients in whom conventional therapy fails but the 'Diet and Kampo' approach succeeds. This approach should therefore be considered seriously as a second-line treatment for AD. In the Discussion, we review the evidential status of current conventional strategies for AD treatment in general, and then specifically discuss the possibility of integrating Kampo regimens into them, taking the case series presented here as the evidential basis. We emphasize that Kampo therapy for AD is more 'art' than technology, for which expertise is an essential prerequisite. PMID:15257326
Geostatistical analysis as applied to two environmental radiometric time series.
Dowdall, Mark; Lind, Bjørn; Gerland, Sebastian; Rudjord, Anne Liv
2003-03-01
This article details the results of an investigation into the application of geostatistical data analysis to two environmental radiometric time series. The data series consist of 99Tc values for seaweed (Fucus vesiculosus) and seawater samples taken as part of a marine monitoring program conducted on the coast of northern Norway by the Norwegian Radiation Protection Authority. Geostatistical methods were selected in order to provide information on values of the variables at unsampled times and to investigate the temporal correlation exhibited by the data sets. This information is of use in the optimisation of future sampling schemes and in characterising the temporal behaviour of the variables in ways that a cursory analysis may miss. The results indicate a high degree of temporal correlation within the data sets. The semi-variogram for the seawater data was best described by an exponential function, with a temporal range of correlation of approximately 395 days and no apparent random component in the overall variance structure. The temporal structure of the seaweed data was best modelled by a linear function with a small nugget component. Evidence of drift was present in both semi-variograms. Interpolation of the data sets using the fitted models and a simple kriging procedure was compared, via cross-validation, with simple linear interpolation. The results indicate that, for the seawater data, kriging outperformed simple interpolation with respect to error distribution and correlation of estimates with actual values. Using the unbounded linear model with the seaweed data produced estimates that were only marginally better than those produced by simple interpolation.
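The two building blocks of the analysis above, an empirical semi-variogram and a fitted model such as the exponential, can be sketched compactly. This is a minimal Python illustration under simplified assumptions (regular lag bins, scalar tolerance); the function names are this sketch's, not any geostatistics package's API:

```python
import numpy as np

def empirical_semivariogram(t, z, lags, tol):
    """gamma(h) = 0.5 * mean[(z(t_i) - z(t_j))^2] over pairs with |t_i - t_j| within tol of h."""
    t = np.asarray(t, float)
    z = np.asarray(z, float)
    dt = np.abs(t[:, None] - t[None, :])          # all pairwise time separations
    dz2 = (z[:, None] - z[None, :]) ** 2          # all pairwise squared differences
    gamma = []
    for h in lags:
        mask = (np.abs(dt - h) <= tol) & (dt > 0)
        gamma.append(0.5 * dz2[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

def exponential_model(h, nugget, sill, rng):
    """Exponential semi-variogram model: nugget + sill * (1 - exp(-h / range))."""
    return nugget + sill * (1.0 - np.exp(-np.asarray(h, float) / rng))
```

Fitting `exponential_model` to the empirical values (e.g., by least squares over nugget, sill, and range) yields the temporal range of correlation quoted in the abstract.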
Visibility graph analysis for re-sampled time series from auto-regressive stochastic processes
NASA Astrophysics Data System (ADS)
Zhang, Rong; Zou, Yong; Zhou, Jie; Gao, Zhong-Ke; Guan, Shuguang
2017-01-01
Visibility graph (VG) and horizontal visibility graph (HVG) methods play a crucial role in modern complex network approaches to nonlinear time series analysis. However, depending on the underlying dynamic process, the exponents of the presumably exponential degree distributions remain to be characterized. It has recently been conjectured that there is a critical value of the exponent, λc = ln(3/2), which separates chaotic from correlated stochastic processes. Here, we systematically apply (H)VG analysis to time series from autoregressive (AR) models, which confirms the hypothesis that an increased correlation length results in larger values of λ > λc. On the other hand, we numerically find a regime of negatively correlated process increments where λ < λc, which is in contrast to this hypothesis. Furthermore, by constructing graphs based on re-sampled time series, we find that network measures show non-trivial dependencies on the autocorrelation functions of the processes. We propose to choose the decorrelation time as the maximal re-sampling delay for the algorithm. Our results are detailed for time series from AR(1) and AR(2) processes.
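The HVG construction underlying this analysis is simple enough to sketch directly: two samples are linked when every sample between them lies strictly below both. A minimal Python illustration (quadratic-time for clarity; linear-time algorithms exist but are not shown):

```python
import numpy as np

def horizontal_visibility_degrees(x):
    """Degree sequence of the horizontal visibility graph of series x.

    Nodes i < j are linked iff x[k] < min(x[i], x[j]) for all i < k < j.
    The exponent lambda of an exponential degree distribution P(k) ~ exp(-lambda*k)
    can then be estimated from the slope of log P(k) versus k.
    """
    n = len(x)
    deg = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            if all(x[k] < min(x[i], x[j]) for k in range(i + 1, j)):
                deg[i] += 1
                deg[j] += 1
    return deg
```

For example, a monotone triple links only the adjacent pairs, while a series whose middle value is the smallest also links its endpoints.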
Multiscale multifractal detrended cross-correlation analysis of financial time series
NASA Astrophysics Data System (ADS)
Shi, Wenbin; Shang, Pengjian; Wang, Jing; Lin, Aijing
2014-06-01
In this paper, we introduce a method called multiscale multifractal detrended cross-correlation analysis (MM-DCCA). The method extends the description of the cross-correlation properties between two time series. MM-DCCA may provide new ways of measuring the nonlinearity of two signals, and it presents much richer information than multifractal detrended cross-correlation analysis (MF-DCCA) by sweeping the entire range of scales over which the multifractal structure of a complex system is analyzed. To illustrate the advantages of this approach, we use MM-DCCA to analyze the cross-correlation properties between financial time series. We show that the new method provides a more faithful and more interpretable description of the dynamic mechanism linking financial time series than traditional MF-DCCA. We also propose reducing the scale ranges when analyzing short time series; some inherent properties that remain hidden when a wide range is used may be revealed in this way.
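At the core of any DCCA-family method is a detrended covariance computed in boxes and swept across scales; MM-DCCA's contribution is sweeping the scale range systematically. A minimal Python sketch of the basic DCCA fluctuation function (linear detrending, non-overlapping boxes; a simplified illustration, not the paper's MM-DCCA procedure):

```python
import numpy as np

def dcca_fluctuation(x, y, scales):
    """DCCA fluctuation F(s): detrended covariance of the integrated profiles
    of x and y, averaged over non-overlapping boxes of size s."""
    x = np.cumsum(np.asarray(x, float) - np.mean(x))   # integrated profile of x
    y = np.cumsum(np.asarray(y, float) - np.mean(y))   # integrated profile of y
    F = []
    for s in scales:
        covs = []
        for start in range(0, len(x) - s + 1, s):
            idx = np.arange(start, start + s)
            # local linear detrend within each box
            rx = x[idx] - np.polyval(np.polyfit(idx, x[idx], 1), idx)
            ry = y[idx] - np.polyval(np.polyfit(idx, y[idx], 1), idx)
            covs.append(np.mean(rx * ry))
        F.append(np.sqrt(np.abs(np.mean(covs))))
    return np.array(F)
```

With x = y this reduces to ordinary DFA; the scaling of F(s) with s then carries the (cross-)correlation exponent.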
Finite element techniques in computational time series analysis of turbulent flows
NASA Astrophysics Data System (ADS)
Horenko, I.
2009-04-01
In recent years there has been a considerable increase of interest in the mathematical modeling and analysis of complex systems that undergo transitions between several phases or regimes. Such systems can be found, e.g., in weather forecasting (transitions between weather conditions), climate research (ice ages and warm ages), computational drug design (conformational transitions) and in econometrics (e.g., transitions between different phases of the market). In all cases, the accumulation of sufficiently detailed time series has led to the formation of huge databases, containing enormous but still undiscovered treasures of information. However, the extraction of essential dynamics and identification of the phases is usually hindered by the multidimensional nature of the signal, i.e., the information is "hidden" in the time series. Standard filtering approaches (e.g., wavelet-based spectral methods) have in general unfeasible numerical complexity in high dimensions; other standard methods (e.g., Kalman filters, MVAR, ARCH/GARCH) impose strong assumptions about the type of the underlying dynamics. An approach based on optimization of a specially constructed regularized functional (describing the quality of the data description in terms of a certain number of specified models) will be introduced. Based on this approach, several new adaptive mathematical methods for simultaneous EOF/SSA-like data-based dimension reduction and identification of hidden phases in high-dimensional time series will be presented. The methods exploit the topological structure of the analysed data and do not impose severe assumptions on the underlying dynamics. Special emphasis will be placed on the mathematical assumptions and numerical cost of the constructed methods. The application of the presented methods will first be demonstrated on a toy example and the results compared with those obtained by standard approaches. The importance of accounting for the mathematical
Interrupted time-series analysis: studying trends in neurosurgery.
Wong, Ricky H; Smieliauskas, Fabrice; Pan, I-Wen; Lam, Sandi K
2015-12-01
OBJECT Neurosurgery studies traditionally have evaluated the effects of interventions on health care outcomes by studying overall changes in measured outcomes over time. Yet, this type of linear analysis is limited due to lack of consideration of the trend's effects both pre- and postintervention and the potential for confounding influences. The aim of this study was to illustrate interrupted time-series analysis (ITSA) as applied to an example in the neurosurgical literature and highlight ITSA's potential for future applications. METHODS The methods used in previous neurosurgical studies were analyzed and then compared with the methodology of ITSA. RESULTS The ITSA method was identified in the neurosurgical literature as an important technique for isolating the effect of an intervention (such as a policy change or a quality and safety initiative) on a health outcome independent of other factors driving trends in the outcome. The authors determined that ITSA allows for analysis of the intervention's immediate impact on outcome level and on subsequent trends and enables a more careful measure of the causal effects of interventions on health care outcomes. CONCLUSIONS ITSA represents a significant improvement over traditional observational study designs in quantifying the impact of an intervention. ITSA is a useful statistical procedure to understand, consider, and implement as the field of neurosurgery evolves in sophistication in big-data analytics, economics, and health services research.
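The ITSA design described above is commonly estimated as a segmented regression: a baseline level and trend, plus a level change and a slope change at the intervention. A minimal Python sketch (ordinary least squares; the function name and coefficient ordering are this sketch's assumptions, not a standard package API):

```python
import numpy as np

def its_fit(y, intervention):
    """Segmented (interrupted time series) regression.

    Model: y_t = b0 + b1*t + b2*post_t + b3*post_t*(t - intervention),
    where post_t = 1 for t >= intervention. Returns [baseline level,
    baseline trend, level change at intervention, slope change after].
    """
    n = len(y)
    t = np.arange(n, dtype=float)
    post = (t >= intervention).astype(float)
    X = np.column_stack([np.ones(n), t, post, post * (t - intervention)])
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, float), rcond=None)
    return beta
```

On noiseless data generated from the model, the four coefficients are recovered exactly, which is what makes the design useful for isolating an intervention's immediate effect (b2) from its effect on the subsequent trend (b3).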
Inverting geodetic time series with a principal component analysis-based inversion method
NASA Astrophysics Data System (ADS)
Kositsky, A. P.; Avouac, J.-P.
2010-03-01
The Global Positioning System (GPS) now makes it possible to monitor deformation of the Earth's surface along plate boundaries with unprecedented accuracy. In theory, the spatiotemporal evolution of slip on the plate boundary at depth, associated with either seismic or aseismic slip, can be inferred from these measurements through some inversion procedure based on the theory of dislocations in an elastic half-space. We describe and test a principal component analysis-based inversion method (PCAIM), an inversion strategy that relies on principal component analysis of the surface displacement time series. We prove that the fault slip history can be recovered from the inversion of each principal component. Because PCAIM does not require externally imposed temporal filtering, it can deal with any kind of time variation of fault slip. We test the approach by applying the technique to synthetic geodetic time series to show that a complicated slip history combining coseismic, postseismic, and nonstationary interseismic slip can be retrieved from this approach. PCAIM produces slip models comparable to those obtained from standard inversion techniques with less computational complexity. We also compare an afterslip model derived from the PCAIM inversion of postseismic displacements following the 2005 Mw 8.6 Nias earthquake with another solution obtained from the extended network inversion filter (ENIF). We introduce several extensions of the algorithm to allow statistically rigorous integration of multiple data sources (e.g., both GPS and interferometric synthetic aperture radar time series) over multiple timescales. PCAIM can be generalized to any linear inversion algorithm.
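The core idea, decompose the displacement time series into principal components, invert each spatial component separately through the Green's function matrix, and recombine, can be sketched in a few lines. This is a minimal Python illustration of that structure, not the released PCAIM package; the function name and the plain least-squares inversion (no smoothing or positivity constraints) are assumptions of this sketch:

```python
import numpy as np

def pca_inversion(displacements, G, n_components):
    """PCA-based inversion sketch.

    displacements: stations x epochs matrix of surface displacements.
    G: stations x patches matrix of elastic Green's functions.
    Decompose displacements = U S V^T, invert each spatial component U[:, k]
    for a slip pattern m_k, then recombine slip(t) = sum_k m_k (S_k V_k(t)).
    Returns the patches x epochs slip history.
    """
    U, S, Vt = np.linalg.svd(displacements, full_matrices=False)
    slip = np.zeros((G.shape[1], displacements.shape[1]))
    for k in range(n_components):
        # least-squares slip model reproducing the k-th spatial pattern
        m_k, *_ = np.linalg.lstsq(G, U[:, k], rcond=None)
        slip += np.outer(m_k, S[k] * Vt[k])
    return slip
```

For rank-one synthetic data (a single slip pattern modulated in time) and a full-column-rank G, one component recovers the slip history exactly, which mirrors the paper's claim that slip can be recovered component by component.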
Rodgers, Joseph Lee; Beasley, William Howard; Schuelke, Matthew
2014-01-01
Many data structures, particularly time series data, are naturally seasonal, cyclical, or otherwise circular. Past graphical methods for time series have focused on linear plots. In this article, we move graphical analysis onto the circle. We focus on 2 particular methods, one old and one new. Rose diagrams are circular histograms and can be produced in several different forms using the RRose software system. In addition, we propose, develop, illustrate, and provide software support for a new circular graphical method, called Wrap-Around Time Series Plots (WATS Plots), which is a graphical method useful to support time series analyses in general but in particular in relation to interrupted time series designs. We illustrate the use of WATS Plots with an interrupted time series design evaluating the effect of the Oklahoma City bombing on birthrates in Oklahoma County during the 10 years surrounding the bombing of the Murrah Building in Oklahoma City. We compare WATS Plots with linear time series representations and overlay them with smoothing and error bands. Each method is shown to have advantages in relation to the other; in our example, the WATS Plots more clearly show the existence and effect size of the fertility differential.
An approach for estimating time-variable rates from geodetic time series
NASA Astrophysics Data System (ADS)
Didova, Olga; Gunter, Brian; Riva, Riccardo; Klees, Roland; Roese-Koerner, Lutz
2016-11-01
There has been considerable research in the literature focused on computing and forecasting sea-level changes in terms of constant trends or rates. The Antarctic ice sheet is one of the main contributors to sea-level change, with highly uncertain rates of glacial thinning and accumulation. Geodetic observing systems such as the Gravity Recovery and Climate Experiment (GRACE) and the Global Positioning System (GPS) are routinely used to estimate these trends. In an effort to improve the accuracy and reliability of these trends, this study investigates a technique that allows the estimated rates, along with co-estimated seasonal components, to vary in time. For this, state space models are defined and then solved by a Kalman filter (KF). The reliable estimation of noise parameters is one of the main problems encountered when using a KF approach; it is solved here by numerically optimizing the likelihood. Since the optimization problem is non-convex, finding an optimal solution is challenging. To address this issue, we limited the parameter search space using classical least-squares adjustment (LSA). In this context, we also tested the use of inequality constraints by directly verifying whether they are supported by the data. The suggested technique for time-series analysis is extended to classify and handle time-correlated observational noise within the state space framework. The performance of the method is demonstrated using GRACE and GPS data at the CAS1 station located in East Antarctica and compared to commonly used LSA. The results suggest that the outlined technique allows for more reliable trend estimates, as well as more physically meaningful interpretations, while validating independent observing systems.
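A state space model with a time-variable rate, before adding the co-estimated seasonal terms, is the classic local linear trend model: the state holds a level and a rate, both allowed to drift. A minimal Python Kalman-filter sketch (fixed, hand-picked noise variances rather than the likelihood-optimized ones the study describes; the function name is this sketch's):

```python
import numpy as np

def kalman_trend(y, q_level=1e-4, q_trend=1e-6, r=1e-2):
    """Local linear trend model filtered with a Kalman filter.

    State x = [level, rate]; the rate is the time-variable trend.
    q_level, q_trend: process noise variances; r: observation noise variance.
    Returns the filtered levels and rates.
    """
    F = np.array([[1.0, 1.0], [0.0, 1.0]])   # level += rate each step
    Q = np.diag([q_level, q_trend])          # process noise
    H = np.array([[1.0, 0.0]])               # only the level is observed
    x = np.array([y[0], 0.0])
    P = np.eye(2)
    levels, rates = [], []
    for z in y:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update
        S = H @ P @ H.T + r
        K = (P @ H.T) / S
        x = x + (K * (z - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
        levels.append(x[0])
        rates.append(x[1])
    return np.array(levels), np.array(rates)
```

On a noiseless linear ramp the filtered rate converges to the true slope, and with larger `q_trend` the same filter tracks a slowly changing rate instead of forcing a constant one, which is the advantage over a constant-trend LSA fit.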
Two Machine Learning Approaches for Short-Term Wind Speed Time-Series Prediction.
Ak, Ronay; Fink, Olga; Zio, Enrico
2016-08-01
The increasing liberalization of European electricity markets, the growing proportion of intermittent renewable energy being fed into the energy grids, and also new challenges in the patterns of energy consumption (such as electric mobility) require flexible and intelligent power grids capable of providing efficient, reliable, economical, and sustainable energy production and distribution. From the supplier side, particularly, the integration of renewable energy sources (e.g., wind and solar) into the grid imposes an engineering and economic challenge because of the limited ability to control and dispatch these energy sources due to their intermittent characteristics. Time-series prediction of wind speed for wind power production is a particularly important and challenging task, wherein prediction intervals (PIs) are preferable results of the prediction, rather than point estimates, because they provide information on the confidence in the prediction. In this paper, two different machine learning approaches to assess PIs of time-series predictions are considered and compared: 1) multilayer perceptron neural networks trained with a multiobjective genetic algorithm and 2) extreme learning machines combined with the nearest neighbors approach. The proposed approaches are applied for short-term wind speed prediction from a real data set of hourly wind speed measurements for the region of Regina in Saskatchewan, Canada. Both approaches demonstrate good prediction precision and provide complementary advantages with respect to different evaluation criteria.
Three approaches to reliability analysis
NASA Technical Reports Server (NTRS)
Palumbo, Daniel L.
1989-01-01
It is noted that current reliability analysis tools differ not only in their solution techniques, but also in their approach to model abstraction. The analyst must be satisfied with the constraints that are intrinsic to any combination of solution technique and model abstraction. To get a better idea of the nature of these constraints, three reliability analysis tools (HARP, ASSIST/SURE, and CAME) were used to model portions of the Integrated Airframe/Propulsion Control System architecture. When presented with the example problem, all three tools failed to produce correct results. In all cases, either the tool or the model had to be modified. It is suggested that most of the difficulty is rooted in the large model size and long computational times which are characteristic of Markov model solutions.
Time Series Analysis of Symbiotic Stars and Cataclysmic Variables
NASA Astrophysics Data System (ADS)
Ren, Jiaying; MacLachlan, G.; Panchmal, A.; Dhuga, K.; Morris, D.
2010-01-01
Symbiotic stars (SSs) and Cataclysmic Variables (CVs) are two families of binary systems which occasionally vary in brightness because of accretion from the secondary star. High frequency oscillations, also known as flickering, are thought to occur because of turbulence in the accretion disk, especially in and near the vicinity of the boundary layer between the surface of the compact object and the inner edge of the disk. Lower frequency oscillations are also observed, but these are typically associated with the orbital and spin motions of the binary system and may be modulated by the presence of a magnetic field. By studying these variations, we probe the emission regions in these compact systems and gain a better understanding of the accretion process. Time-ordered series of apparent magnitudes for several SSs and CVs, obtained from the American Association of Variable Star Observers (AAVSO), have been analyzed. The analysis techniques include Power Spectral Densities, Rescaled Range (R/S) Analysis, and Discrete Wavelet Transforms. The results are used to estimate a Hurst exponent, which is a measure of long-range memory dependence and self-similarity.
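The R/S route to the Hurst exponent mentioned above is short enough to sketch: compute the rescaled range in windows of increasing size and fit the slope of log(R/S) against log(window size). A minimal Python illustration (non-overlapping windows, no small-sample bias correction such as Anis-Lloyd):

```python
import numpy as np

def rescaled_range(x):
    """R/S statistic: range of the mean-adjusted cumulative sum, over the std."""
    x = np.asarray(x, float)
    z = np.cumsum(x - x.mean())
    return (z.max() - z.min()) / x.std()

def hurst_rs(x, window_sizes):
    """Hurst exponent estimate: slope of log(mean R/S) versus log(window size)."""
    log_rs, log_n = [], []
    for n in window_sizes:
        rs = [rescaled_range(x[i:i + n]) for i in range(0, len(x) - n + 1, n)]
        log_rs.append(np.log(np.mean(rs)))
        log_n.append(np.log(n))
    slope, _ = np.polyfit(log_n, log_rs, 1)
    return slope
```

Uncorrelated noise gives an exponent near 0.5, while a random walk (strong long-range dependence in the increments' cumulative sum) gives a markedly larger value, which is the basis for using H as a memory diagnostic in the light curves.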
Multi-Granular Trend Detection for Time-Series Analysis.
van Goethem, Arthur; Staals, Frank; Löffler, Maarten; Dykes, Jason; Speckmann, Bettina
2017-01-01
Time series (such as stock prices) and ensembles (such as model runs for weather forecasts) are two important types of one-dimensional time-varying data. Such data is readily available in large quantities but visual analysis of the raw data quickly becomes infeasible, even for moderately sized data sets. Trend detection is an effective way to simplify time-varying data and to summarize salient information for visual display and interactive analysis. We propose a geometric model for trend-detection in one-dimensional time-varying data, inspired by topological grouping structures for moving objects in two- or higher-dimensional space. Our model gives provable guarantees on the trends detected and uses three natural parameters: granularity, support-size, and duration. These parameters can be changed on-demand. Our system also supports a variety of selection brushes and a time-sweep to facilitate refined searches and interactive visualization of (sub-)trends. We explore different visual styles and interactions through which trends, their persistence, and evolution can be explored.
Financial time series analysis based on effective phase transfer entropy
NASA Astrophysics Data System (ADS)
Yang, Pengbo; Shang, Pengjian; Lin, Aijing
2017-02-01
Transfer entropy is a powerful technique that quantifies the impact of one dynamic system on another. In this paper, we propose the effective phase transfer entropy method, built on the transfer entropy method. We use simulated data to test its performance, and the experimental results confirm that the proposed approach is capable of detecting the information transfer between systems. We also explore the relationship between effective phase transfer entropy and variables such as data size, coupling strength and noise. The effective phase transfer entropy is positively correlated with the data size and the coupling strength. Even in the presence of a large amount of noise, it can detect the information transfer between systems, and it is very robust to noise. Moreover, this measure is able to estimate the information flow between systems more accurately than phase transfer entropy. To demonstrate its use in practice, we apply the method to financial time series and gain new insight into the interactions between systems. It is shown that the effective phase transfer entropy can detect some economic fluctuations in the financial market. To summarize, the effective phase transfer entropy method is a very efficient tool for estimating the information flow between systems.
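The quantity underlying the paper's method is the plain transfer entropy, which asks how much the past of one series improves prediction of another beyond the target's own past. A minimal Python sketch of transfer entropy with history length 1 on quantile-binned data (a plug-in estimator; this is the base quantity, not the paper's effective *phase* transfer entropy):

```python
from collections import Counter
import numpy as np

def transfer_entropy(x, y, bins=2):
    """Plug-in transfer entropy from y to x with history length 1:
    TE = sum p(x1, x0, y0) * log[ p(x1 | x0, y0) / p(x1 | x0) ],
    where x1 is the next value of x and x0, y0 are the current values."""
    def sym(v):
        # discretize into `bins` symbols by quantiles
        edges = np.quantile(v, np.linspace(0, 1, bins + 1)[1:-1])
        return np.digitize(v, edges)
    xs, ys = sym(np.asarray(x, float)), sym(np.asarray(y, float))
    triples = Counter(zip(xs[1:], xs[:-1], ys[:-1]))
    pairs_xx = Counter(zip(xs[1:], xs[:-1]))
    pairs_xy = Counter(zip(xs[:-1], ys[:-1]))
    singles = Counter(xs[:-1])
    n = len(xs) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_xy[(x0, y0)]          # p(x1 | x0, y0)
        p_cond_self = pairs_xx[(x1, x0)] / singles[x0]  # p(x1 | x0)
        te += p_joint * np.log(p_cond_full / p_cond_self)
    return te
```

When x simply copies y with a one-step delay, the estimate approaches the entropy rate of the binned driver (about log 2 for two bins), while for independent series it stays near zero, matching the directional-coupling interpretation in the abstract.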
2011-01-01
Background Thousands of children experience cardiac arrest events every year in pediatric intensive care units. Most of these children die. Cardiac arrest prediction tools are used as part of medical emergency team evaluations to identify patients in standard hospital beds that are at high risk for cardiac arrest. There are no models to predict cardiac arrest in pediatric intensive care units though, where the risk of an arrest is 10 times higher than for standard hospital beds. Current tools are based on a multivariable approach that does not characterize deterioration, which often precedes cardiac arrests. Characterizing deterioration requires a time series approach. The purpose of this study is to propose a method that will allow for time series data to be used in clinical prediction models. Successful implementation of these methods has the potential to bring arrest prediction to the pediatric intensive care environment, possibly allowing for interventions that can save lives and prevent disabilities. Methods We reviewed prediction models from nonclinical domains that employ time series data, and identified the steps that are necessary for building predictive models using time series clinical data. We illustrate the method by applying it to the specific case of building a predictive model for cardiac arrest in a pediatric intensive care unit. Results Time course analysis studies from genomic analysis provided a modeling template that was compatible with the steps required to develop a model from clinical time series data. The steps include: 1) selecting candidate variables; 2) specifying measurement parameters; 3) defining data format; 4) defining time window duration and resolution; 5) calculating latent variables for candidate variables not directly measured; 6) calculating time series features as latent variables; 7) creating data subsets to measure model performance effects attributable to various classes of candidate variables; 8) reducing the number of
Evaluating disease management program effectiveness: an introduction to time-series analysis.
Linden, Ariel; Adams, John L; Roberts, Nancy
2003-01-01
Currently, the most widely used method in the disease management (DM) industry for evaluating program effectiveness is referred to as the "total population approach." This model is a pretest-posttest design, with the most basic limitation being that without a control group, there may be sources of bias and/or competing extraneous confounding factors that offer a plausible rationale explaining the change from baseline. Furthermore, with the current inclination of DM programs to use financial indicators rather than program-specific utilization indicators as the principal measure of program success, additional biases are introduced that may cloud evaluation results. This paper presents a non-technical introduction to time-series analysis (using disease-specific utilization measures) as an alternative, and more appropriate, approach to evaluating DM program effectiveness than the current total population approach.
Dequéant, Mary-Lee; Fagegaltier, Delphine; Hu, Yanhui; Spirohn, Kerstin; Simcox, Amanda; Hannon, Gregory J.; Perrimon, Norbert
2015-01-01
The use of time series profiling to identify groups of functionally related genes (synexpression groups) is a powerful approach for the discovery of gene function. Here we apply this strategy during RasV12 immortalization of Drosophila embryonic cells, a phenomenon not well characterized. Using high-resolution transcriptional time-series datasets, we generated a gene network based on temporal expression profile similarities. This analysis revealed that common immortalized cells are related to adult muscle precursors (AMPs), a stem cell-like population contributing to adult muscles and sharing properties with vertebrate satellite cells. Remarkably, the immortalized cells retained the capacity for myogenic differentiation when treated with the steroid hormone ecdysone. Further, we validated in vivo the transcription factor CG9650, the ortholog of mammalian Bcl11a/b, as a regulator of AMP proliferation predicted by our analysis. Our study demonstrates the power of time series synexpression analysis to characterize Drosophila embryonic progenitor lines and identify stem/progenitor cell regulators. PMID:26438832
NASA Astrophysics Data System (ADS)
Liu, Bin; Dai, Wujiao; Peng, Wei; Meng, Xiaolin
2015-11-01
GPS has been widely used in geodesy and geodynamics thanks to advances in the technology and improvements in positioning accuracy. A vertical coordinate time series observed by GPS usually contains tectonic signals, non-tectonic signals, residual atmospheric delay, measurement noise, etc. Analyzing this information is the basis of crustal deformation research, and extracting the non-tectonic information from GPS time series helps in studying the effects of various geophysical events. Principal component analysis (PCA) is an effective tool for spatiotemporal filtering and GPS time series analysis, but because it cannot extract statistically independent components, PCA is ill-suited to recovering the information implicit in the time series. Independent component analysis (ICA) is a statistical method of blind source separation (BSS) that can separate original signals from mixed observations. In this paper, ICA is used as a spatiotemporal filtering method to analyze the spatial and temporal features of vertical GPS coordinate time series in the UK and in the Sichuan-Yunnan region of China. Meanwhile, the contributions from atmospheric and soil moisture mass loading are evaluated. Analysis of the correlation between the independent components and mass loading, together with their spatial distribution, shows that the signals extracted by ICA are strongly correlated with the non-tectonic deformation, indicating that ICA performs better in spatiotemporal analysis.
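As a minimal illustration of the spatiotemporal filtering idea in this abstract, the sketch below implements the PCA baseline that the authors contrast with ICA: the leading principal component of a (time × station) matrix approximates the common-mode, largely non-tectonic signal, and subtracting it filters the series. All names and the synthetic data are illustrative; an ICA variant would substitute an ICA decomposition (e.g., FastICA) for the SVD step.

```python
import numpy as np

def pca_common_mode_filter(X, n_modes=1):
    """Spatiotemporal filtering of a (time x station) coordinate matrix.

    Removes the first `n_modes` principal components, which typically
    capture the common-mode (non-tectonic) signal shared by stations.
    """
    Xc = X - X.mean(axis=0)                                # remove per-station mean
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    common = (U[:, :n_modes] * s[:n_modes]) @ Vt[:n_modes]  # rank-k reconstruction
    return Xc - common, common

# synthetic demo: a shared seasonal loading signal plus independent station noise
rng = np.random.default_rng(0)
t = np.arange(365)
shared = 5.0 * np.sin(2 * np.pi * t / 365.25)              # common-mode signal
X = shared[:, None] + 0.3 * rng.standard_normal((365, 6))  # 6 hypothetical stations
filtered, common = pca_common_mode_filter(X)
```

On this toy input the filtered residuals are much smaller than the raw series, and the reconstructed common mode tracks the shared seasonal signal closely.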
Li, Shi; Mukherjee, Bhramar; Batterman, Stuart; Ghosh, Malay
2013-12-01
Case-crossover designs are widely used to study short-term exposure effects on the risk of acute adverse health events. While the frequentist literature on this topic is vast, there is no Bayesian work in this general area. The contribution of this paper is twofold. First, the paper establishes Bayesian equivalence results that require characterization of the set of priors under which the posterior distributions of the risk ratio parameters based on a case-crossover and time-series analysis are identical. Second, the paper studies inferential issues under case-crossover designs in a Bayesian framework. Traditionally, a conditional logistic regression is used for inference on risk-ratio parameters in case-crossover studies. We consider instead a more general full likelihood-based approach which makes less restrictive assumptions on the risk functions. Formulation of a full likelihood leads to growth in the number of parameters proportional to the sample size. We propose a semi-parametric Bayesian approach using a Dirichlet process prior to handle the random nuisance parameters that appear in a full likelihood formulation. We carry out a simulation study to compare the Bayesian methods based on full and conditional likelihood with the standard frequentist approaches for case-crossover and time-series analysis. The proposed methods are illustrated through the Detroit Asthma Morbidity, Air Quality and Traffic study, which examines the association between acute asthma risk and ambient air pollutant concentrations.
A New Approach To Teaching Dimensional Analysis.
ERIC Educational Resources Information Center
Churchill, Stuart W.
1997-01-01
Explains an approach to teaching dimensional analysis that differs slightly from the traditional approach. The difference lies in the novelty of exposition in the presentation and interpretation of dimensional analysis as a speculative process. (DDR)
NASA Astrophysics Data System (ADS)
Chen, Wei-Shing
2011-04-01
The aim of this article is to answer the question of whether the dynamics of Taiwan's unemployment rate are generated by a non-linear deterministic process. The paper applies a recurrence plot and recurrence quantification approach to the analysis of non-stationary hidden transition patterns in the unemployment rate of Taiwan. The case study uses time series data on Taiwan's unemployment rate for the period 1978/01 to 2010/06. The results show that recurrence techniques are able to identify various phases in the evolution of unemployment transitions in Taiwan.
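The recurrence-plot machinery used in this abstract can be sketched in a few lines of NumPy. The code below is an illustrative sketch (not the author's implementation): it builds a thresholded recurrence matrix from a time-delay embedding and computes determinism, a standard recurrence quantification measure based on diagonal line structures.

```python
import numpy as np

def embed(x, dim=3, tau=1):
    """Time-delay embedding of a scalar series into dim-dimensional state vectors."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def recurrence_matrix(x, dim=3, tau=1, eps=0.2):
    """Binary recurrence matrix: R[i, j] = 1 if states i and j are closer than eps."""
    X = embed(x, dim, tau)
    d = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    return (d < eps).astype(int)

def determinism(R, lmin=2):
    """Fraction of recurrence points lying on diagonal lines of length >= lmin."""
    n = R.shape[0]
    in_lines = 0
    for k in range(1, n):                      # upper-triangle diagonals only
        run = 0
        for v in np.append(np.diagonal(R, offset=k), 0):  # sentinel flushes last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    in_lines += run
                run = 0
    total = (R.sum() - np.trace(R)) / 2        # recurrence points above main diagonal
    return in_lines / total if total else 0.0

x = np.sin(np.linspace(0, 12 * np.pi, 300))   # a periodic series is highly deterministic
R = recurrence_matrix(x, eps=0.3)
det = determinism(R)
```

For a deterministic (here periodic) series the determinism measure is close to 1; for white noise it would be much lower.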
Traffic time series analysis by using multiscale time irreversibility and entropy.
Wang, Xuejiao; Shang, Pengjian; Fang, Jintang
2014-09-01
Traffic systems, especially urban traffic systems, are regulated by different kinds of interacting mechanisms that operate across multiple spatial and temporal scales. Traditional approaches, such as the empirical probability distribution function and detrended fluctuation analysis, fail to account for the multiple time scales inherent in such time series and have led to conflicting results. The role of multiscale analytical methods in traffic time series is a frontier area of investigation. In this paper, our main purpose is to introduce a new method, multiscale time irreversibility, which helps extract information from the traffic time series we studied. In addition, to analyse the complexity of traffic volume time series for Beijing's Ring 2, 3, and 4 roads on workdays versus weekends, covering August 18, 2012 to October 26, 2012, we compare the results of this new method with those of the well-established multiscale entropy method. The results show that the higher the asymmetry index, the higher the traffic congestion level, in accordance with the results obtained by multiscale entropy.
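A minimal sketch of the multiscale irreversibility idea described above, assuming the common construction: coarse-grain the series at each scale with non-overlapping window averages, then measure the imbalance between rising and falling increments (a simple asymmetry index; the paper's exact index may differ). Data and names are synthetic.

```python
import numpy as np

def coarse_grain(x, scale):
    """Non-overlapping window averages (the standard multiscale step)."""
    n = len(x) // scale
    return x[: n * scale].reshape(n, scale).mean(axis=1)

def asymmetry_index(x):
    """Simple irreversibility measure: imbalance of rising vs falling increments."""
    d = np.diff(x)
    up, down = np.sum(d > 0), np.sum(d < 0)
    return (up - down) / (up + down) if up + down else 0.0

def multiscale_irreversibility(x, scales=range(1, 6)):
    return [asymmetry_index(coarse_grain(x, s)) for s in scales]

rng = np.random.default_rng(1)
reversible = rng.standard_normal(5000)                      # symmetric (reversible) noise
sawtooth = np.tile(np.r_[np.linspace(0, 1, 9), 0.0], 500)   # slow rise, sharp fall
rev_idx = multiscale_irreversibility(reversible)
saw_idx = multiscale_irreversibility(sawtooth)
```

A time-reversible series gives an asymmetry index near zero, while the sawtooth (whose rises and falls are statistically different) gives a clearly positive index.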
Water Resources Management Plan for Ganga River using SWAT Modelling and Time series Analysis
NASA Astrophysics Data System (ADS)
Satish, L. N. V.
2015-12-01
Water resources management of the Ganga River is one of the primary objectives of the National Ganga River Basin Environmental Management Plan. The present study aims to carry out a water balance study and to develop appropriate methodologies to compute environmental flow in the middle Ganga river basin between Patna and Farakka, India. The methodology adopted here is to set up a hydrological model to estimate monthly discharge of the tributaries under natural conditions, to perform hydrological alteration analysis of both observed and simulated discharge series, to carry out flow health analysis to determine the status of stream health over the last four decades, and to estimate the e-flow using flow health indicators. ArcSWAT was used to simulate 8 tributaries, namely the Kosi, the Gandak, and others. This modelling is quite encouraging and provides the monthly water balance analysis for all tributaries in this study. The water balance analysis indicates a significant change in the surface water and groundwater interaction pattern within the study period. Indicators of hydrological alteration have been used for both observed and simulated data series to quantify the hydrological alteration that occurred in the tributaries and the main river over the last four decades. For temporal variation of stream health, the flow health tool has been used on observed and simulated discharge data. A detailed stream health analysis has been performed using three approaches based on (i) observed flow time series, (ii) observed and simulated flow time series, and (iii) simulated flow time series, at the small upland basin, major tributary, and main Ganga river basin levels. At the upland basin level, these approaches show that stream health is good, with non-significant temporal variation. At the major tributary level, stream health is found to have been deteriorating since the 1970s. At the main Ganga reach level, river health and its temporal variations do not show any declining trend. Finally, E-flows
Stochastic approaches for time series forecasting of boron: a case study of Western Turkey.
Durdu, Omer Faruk
2010-10-01
In the present study, seasonal and non-seasonal predictions of boron concentration time series data for the period 1996-2004 from the Büyük Menderes river in western Turkey are addressed by means of linear stochastic models. The methodology presented here is to develop adequate linear stochastic models, known as autoregressive integrated moving average (ARIMA) and multiplicative seasonal autoregressive integrated moving average (SARIMA) models, to predict boron content in the Büyük Menderes catchment. Initially, Box-Whisker plots and Kendall's tau test are used to identify trends during the study period. The measurement locations do not show a significant overall trend in boron concentrations, though marginal increasing and decreasing trends are observed for certain periods at some locations. The ARIMA modeling approach involves three steps: model identification, parameter estimation, and diagnostic checking. In the model identification step, considering the autocorrelation function (ACF) and partial autocorrelation function (PACF) results for the boron data series, different ARIMA models are identified. The model giving the minimum Akaike information criterion (AIC) is selected as the best-fit model. The parameter estimation step indicates that the estimated model parameters are significantly different from zero. The diagnostic check step is applied to the residuals of the selected ARIMA models, and the results indicate that the residuals are independent, normally distributed, and homoscedastic. For model validation purposes, the predictions from the best ARIMA models are compared to the observed data. The predicted data show reasonably good agreement with the actual data. The comparison of the mean and variance of three years (2002-2004) of observed data vs predicted data from the selected best models shows that the boron model from the ARIMA modeling approach could be used in a safe manner, since the predicted values from these models preserve the basic
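The identification step described in this abstract (fit candidate orders, pick the model with minimum AIC) can be sketched with plain least squares. This is an illustrative AR-only simplification of full ARIMA fitting, with synthetic data; a real analysis would use a dedicated ARIMA implementation.

```python
import numpy as np

def fit_ar(x, p):
    """Least-squares fit of a zero-mean AR(p) model; returns coefficients and residual variance."""
    X = np.column_stack([x[p - k - 1 : len(x) - k - 1] for k in range(p)])  # lags 1..p
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return coef, resid.var()

def aic(x, p):
    """Gaussian AIC for an AR(p) fit: n * log(sigma^2) + 2 * (p + 1)."""
    n = len(x) - p
    _, s2 = fit_ar(x, p)
    return n * np.log(s2) + 2 * (p + 1)

# simulate an AR(2) process: x_t = 0.6 x_{t-1} - 0.3 x_{t-2} + e_t
rng = np.random.default_rng(2)
e = rng.standard_normal(3000)
x = np.zeros(3000)
for t in range(2, 3000):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + e[t]

best_p = min(range(1, 6), key=lambda p: aic(x, p))   # minimum-AIC order
coef2, _ = fit_ar(x, 2)
```

With 3000 samples the estimated AR(2) coefficients land close to the true values, and AIC prefers order 2 over the underfit AR(1).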
Approximate Symmetry Reduction Approach: Infinite Series Reductions to the KdV-Burgers Equation
NASA Astrophysics Data System (ADS)
Jiao, Xiaoyu; Yao, Ruoxia; Zhang, Shunli; Lou, Sen Y.
2009-11-01
For weak dispersion and weak dissipation cases, the (1+1)-dimensional KdV-Burgers equation is investigated in terms of approximate symmetry reduction approach. The formal coherence of similarity reduction solutions and similarity reduction equations of different orders enables series reduction solutions. For the weak dissipation case, zero-order similarity solutions satisfy the Painlevé II, Painlevé I, and Jacobi elliptic function equations. For the weak dispersion case, zero-order similarity solutions are in the form of Kummer, Airy, and hyperbolic tangent functions. Higher-order similarity solutions can be obtained by solving linear variable coefficients ordinary differential equations.
Volterra Series Approach for Nonlinear Aeroelastic Response of 2-D Lifting Surfaces
NASA Technical Reports Server (NTRS)
Silva, Walter A.; Marzocca, Piergiovanni; Librescu, Liviu
2001-01-01
The problem of the determination of the subcritical aeroelastic response and flutter instability of nonlinear two-dimensional lifting surfaces in an incompressible flow-field via Volterra series approach is addressed. The related aeroelastic governing equations are based upon the inclusion of structural nonlinearities, of the linear unsteady aerodynamics and consideration of an arbitrary time-dependent external pressure pulse. Unsteady aeroelastic nonlinear kernels are determined, and based on these, frequency and time histories of the subcritical aeroelastic response are obtained, and in this context the influence of geometric nonlinearities is emphasized. Conclusions and results displaying the implications of the considered effects are supplied.
Stochastic time series analysis of fetal heart-rate variability
NASA Astrophysics Data System (ADS)
Shariati, M. A.; Dripps, J. H.
1990-06-01
Fetal Heart Rate (FHR) is one of the important features of fetal biophysical activity, and its long-term monitoring is used for the antepartum (the period of pregnancy before labour) assessment of fetal well-being. But as yet no successful method has been proposed to quantitatively represent the variety of random non-white patterns seen in FHR. The objective of this paper is to address this issue. In this study the Box-Jenkins method of model identification and diagnostic checking was used on phonocardiographically derived (averaged) FHR time series. Models remained exclusively autoregressive (AR). Kalman filtering in conjunction with a maximum likelihood estimation technique forms the parametric estimator. Diagnostics performed on the residuals indicated that a second-order model may be adequate in capturing the type of variability observed in 1 to 2 min data windows of FHR. The scheme may be viewed as a means of data reduction of a highly redundant information source. This allows a much more efficient transmission of FHR information from remote locations to places with the facilities and expertise for closer analysis. The extracted parameters are intended to reflect numerically the important FHR features that are normally picked up visually by experts for their assessments. As a result, long-term FHR recordings made during the antepartum period could then be screened quantitatively for the detection of patterns considered normal or abnormal.
On the Fourier and Wavelet Analysis of Coronal Time Series
NASA Astrophysics Data System (ADS)
Auchère, F.; Froment, C.; Bocchialini, K.; Buchlin, E.; Solomon, J.
2016-07-01
Using Fourier and wavelet analysis, we critically re-assess the significance of our detection of periodic pulsations in coronal loops. We show that the proper identification of the frequency dependence and statistical properties of the different components of the power spectra provides a strong argument against the common practice of data detrending, which tends to produce spurious detections around the cut-off frequency of the filter. In addition, the white and red noise models built into the widely used wavelet code of Torrence & Compo cannot, in most cases, adequately represent the power spectra of coronal time series, thus also possibly causing false positives. Both effects suggest that several reports of periodic phenomena should be re-examined. The Torrence & Compo code nonetheless effectively computes rigorous confidence levels if provided with pertinent models of mean power spectra, and we describe the appropriate manner in which to call its core routines. We recall the meaning of the default confidence levels output from the code, and we propose new Monte-Carlo-derived levels that take into account the total number of degrees of freedom in the wavelet spectra. These improvements allow us to confirm that the power peaks that we detected have a very low probability of being caused by noise.
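To make this abstract's point about background models concrete, here is a hedged NumPy sketch of the Fourier-domain analogue: testing periodogram peaks against a theoretical AR(1) (red-noise) spectrum rather than detrending. The normalization and the 2.996 factor (χ² at 95% with 2 degrees of freedom, divided by 2) follow standard practice for periodogram significance testing; the data and function names are illustrative, not the paper's code.

```python
import numpy as np

def ar1_spectrum(alpha, n):
    """Theoretical normalized power spectrum of an AR(1) (red-noise) process."""
    freqs = np.arange(1, n // 2 + 1) / n
    return (1 - alpha**2) / (1 + alpha**2 - 2 * alpha * np.cos(2 * np.pi * freqs))

def periodogram(x):
    """Periodogram scaled so its mean equals the series variance for white noise."""
    x = x - x.mean()
    n = len(x)
    return (np.abs(np.fft.rfft(x))[1 : n // 2 + 1] ** 2) / n

def red_noise_threshold(x, factor=2.996):
    """95% false-alarm level for periodogram peaks against an AR(1) background.

    Each ordinate ~ background * chi2_2 / 2, and chi2_{0.95,2} / 2 = 2.996.
    """
    x = x - x.mean()
    alpha = np.corrcoef(x[:-1], x[1:])[0, 1]      # lag-1 autocorrelation estimate
    background = ar1_spectrum(alpha, len(x)) * x.var()
    return background * factor

# red noise plus one genuine oscillation at 40 cycles per record
rng = np.random.default_rng(3)
n = 1024
red = np.zeros(n)
for t in range(1, n):
    red[t] = 0.7 * red[t - 1] + rng.standard_normal()
signal = red + 3.0 * np.sin(2 * np.pi * 40 * np.arange(n) / n)
exceed = periodogram(signal) > red_noise_threshold(signal)
```

The injected oscillation stands far above the red-noise confidence level, while only a small fraction of background bins exceed it, illustrating why a pertinent noise model matters more than detrending.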
Chaotic time series analysis of vision evoked EEG
NASA Astrophysics Data System (ADS)
Zhang, Ningning; Wang, Hong
2009-12-01
To investigate human brain activity during aesthetic processing, pictures of a beautiful woman's face and of an ugly buffoon's face were used as stimuli. Twelve subjects were assigned the aesthetic processing task while the electroencephalogram (EEG) was recorded. Event-related brain potentials (ERPs) were acquired from the 32 scalp electrodes, and the ugly buffoon picture produced larger amplitudes for the N1, P2, N2, and late slow-wave components. Average ERPs from the ugly buffoon picture were larger than those from the beautiful woman picture. The ERP signals show that the ugly buffoon elicits stronger emotional waves than the beautiful woman's face, because of the expression on the buffoon's face. Chaotic time series analysis was then carried out to calculate the largest Lyapunov exponent, using the small-data-set method, and the correlation dimension, using the Grassberger-Procaccia (G-P) algorithm. The results show that the largest Lyapunov exponents of the ERP signals are greater than zero, which indicates that the ERP signals may be chaotic. The correlation dimensions obtained from the beautiful woman picture are larger than those from the ugly buffoon picture; this comparison suggests that the beautiful face excites more of the brain's nerve cells. The work presented here supports the view that the cerebrum's activity is chaotic under such picture stimuli.
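The Grassberger-Procaccia (G-P) correlation-dimension estimate invoked in this abstract can be sketched as follows. This is an illustrative toy applied to a pure sine wave (whose limit-cycle attractor has dimension ≈ 1), not the authors' EEG pipeline; the embedding parameters and radii are assumptions.

```python
import numpy as np

def embed(x, dim, tau):
    """Time-delay embedding of a scalar series."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def correlation_dimension(x, dim=3, tau=5, radii=(0.1, 0.2, 0.4, 0.8)):
    """G-P estimate: slope of log C(r) vs log r, with C(r) the correlation sum."""
    X = embed(x, dim, tau)
    d = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    pairs = d[np.triu_indices(len(X), k=1)]          # all distinct point pairs
    lr = np.log(radii)
    lc = np.log([np.mean(pairs < r) for r in radii])  # correlation sums C(r)
    return np.polyfit(lr, lc, 1)[0]

t = np.linspace(0, 40 * np.pi, 2000)
d_est = correlation_dimension(np.sin(t))   # a limit cycle has dimension ~ 1
```

On this periodic toy signal the estimated dimension comes out near 1, as expected for a one-dimensional attractor.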
The Prediction of Teacher Turnover Employing Time Series Analysis.
ERIC Educational Resources Information Center
Costa, Crist H.
The purpose of this study was to combine knowledge of teacher demographic data with time-series forecasting methods to predict teacher turnover. Moving averages and exponential smoothing were used to forecast discrete time series. The study used data collected from the 22 largest school districts in Iowa, designated as FACT schools. Predictions…
Analytical framework for recurrence network analysis of time series.
Donges, Jonathan F; Heitzig, Jobst; Donner, Reik V; Kurths, Jürgen
2012-04-01
Recurrence networks are a powerful nonlinear tool for time series analysis of complex dynamical systems. While there are already many successful applications ranging from medicine to paleoclimatology, a solid theoretical foundation for the method has so far been missing. Here, we interpret an ɛ-recurrence network as a discrete subnetwork of a "continuous" graph with uncountably many vertices and edges corresponding to the system's attractor. This step allows us to show that various statistical measures commonly used in complex network analysis can be seen as discrete estimators of newly defined continuous measures of certain complex geometric properties of the attractor on the scale given by ɛ. In particular, we introduce local measures such as the ɛ-clustering coefficient, mesoscopic measures such as ɛ-motif density, path-based measures such as ɛ-betweennesses, and global measures such as ɛ-efficiency. This new analytical basis for the so far heuristically motivated network measures also provides an objective criterion for the choice of ɛ via a percolation threshold, and it shows that estimation can be improved by so-called node splitting invariant versions of the measures. We finally illustrate the framework for a number of archetypical chaotic attractors such as those of the Bernoulli and logistic maps, periodic and two-dimensional quasiperiodic motions, and for hyperballs and hypercubes by deriving analytical expressions for the novel measures and comparing them with data from numerical experiments. More generally, the theoretical framework put forward in this work describes random geometric graphs and other networks with spatial constraints, which appear frequently in disciplines ranging from biology to climate science.
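A minimal sketch of the ɛ-recurrence network construction described above, applied to a 2-D uniform random sample (a random geometric graph, one of the spatially constrained networks the abstract mentions, whose interior mean clustering coefficient is known to be about 0.59). Names and parameters are illustrative.

```python
import numpy as np

def recurrence_network(X, eps):
    """Adjacency matrix of the eps-recurrence network of a set of state vectors."""
    d = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    A = (d < eps).astype(int)
    np.fill_diagonal(A, 0)                       # no self-loops
    return A

def clustering_coefficients(A):
    """Local clustering: fraction of a node's neighbor pairs that are themselves linked."""
    k = A.sum(axis=1)
    triangles = np.diag(A @ A @ A) / 2           # closed triangles through each node
    pairs = k * (k - 1) / 2
    with np.errstate(invalid="ignore", divide="ignore"):
        c = np.where(pairs > 0, triangles / pairs, 0.0)
    return c

rng = np.random.default_rng(4)
X = rng.random((400, 2))                          # uniform sample of a 2-D set
A = recurrence_network(X, eps=0.15)
C = clustering_coefficients(A)
```

The mean local clustering of this graph sits near the theoretical random-geometric-graph value, illustrating how network measures estimate geometric properties of the underlying set at scale ɛ.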
Time series analysis of collective motions in proteins
NASA Astrophysics Data System (ADS)
Alakent, Burak; Doruker, Pemra; Çamurdan, Mehmet C.
2004-01-01
The dynamics of α-amylase inhibitor tendamistat around its native state is investigated using time series analysis of the principal components of the Cα atomic displacements obtained from molecular dynamics trajectories. Collective motion along a principal component is modeled as a homogeneous nonstationary process, which is the result of the damped oscillations in local minima superimposed on a random walk. The motion in local minima is described by a stationary autoregressive moving average model, consisting of the frequency, damping factor, moving average parameters and random shock terms. Frequencies for the first 50 principal components are found to be in the 3-25 cm-1 range, which are well correlated with the principal component indices and also with atomistic normal mode analysis results. Damping factors, though their correlation is less pronounced, decrease as principal component indices increase, indicating that low frequency motions are less affected by friction. The existence of a positive moving average parameter indicates that the stochastic force term is likely to disturb the mode in opposite directions for two successive sampling times, showing the mode's tendency to stay close to the minimum. All these four parameters affect the mean square fluctuations of a principal mode within a single minimum. The inter-minima transitions are described by a random walk model, which is driven by a random shock term considerably smaller than that for the intra-minimum motion. The principal modes are classified into three subspaces based on their dynamics: essential, semiconstrained, and constrained, at least in partial consistency with previous studies. The Gaussian-type distributions of the intermediate modes, called "semiconstrained" modes, are explained by asserting that this random walk behavior is not completely free but between energy barriers.
Dowling, Thomas E; Turner, Thomas F; Carson, Evan W; Saltzgiver, Melody J; Adams, Deborah; Kesner, Brian; Marsh, Paul C
2014-01-01
Time-series analysis is used widely in ecology to study complex phenomena and may have considerable potential to clarify relationships of genetic and demographic processes in natural and exploited populations. We explored the utility of this approach to evaluate population responses to management in razorback sucker, a long-lived and fecund, but declining freshwater fish species. A core population in Lake Mohave (Arizona-Nevada, USA) has experienced no natural recruitment for decades and is maintained by harvesting naturally produced larvae from the lake, rearing them in protective custody, and repatriating them at sizes less vulnerable to predation. Analyses of mtDNA and 15 microsatellites characterized for sequential larval cohorts collected over a 15-year time series revealed no changes in geographic structuring but indicated significant increase in mtDNA diversity for the entire population over time. Likewise, ratios of annual effective breeders to annual census size (Nb/Na) increased significantly despite sevenfold reduction of Na. These results indicated that conservation actions diminished near-term extinction risk due to genetic factors and should now focus on increasing numbers of fish in Lake Mohave to ameliorate longer-term risks. More generally, time-series analysis permitted robust testing of trends in genetic diversity, despite low precision of some metrics. PMID:24665337
Steed, Chad A.; Halsey, William; Dehoff, Ryan; ...
2017-02-16
Flexible visual analysis of long, high-resolution, and irregularly sampled time series data from multiple sensor streams is a challenge in several domains. In the field of additive manufacturing, this capability is critical for realizing the full potential of large-scale 3D printers. Here, we propose a visual analytics approach that helps additive manufacturing researchers acquire a deep understanding of patterns in log and imagery data collected by 3D printers. Our specific goals include discovering patterns related to defects and system performance issues, optimizing build configurations to avoid defects, and increasing production efficiency. We introduce Falcon, a new visual analytics system that allows users to interactively explore large, time-oriented data sets from multiple linked perspectives. Falcon provides overviews, detailed views, and unique segmented time series visualizations, all with adjustable scale options. To illustrate the effectiveness of Falcon at providing thorough and efficient knowledge discovery, we present a practical case study involving experts in additive manufacturing and data from a large-scale 3D printer. The techniques described are applicable to the analysis of any quantitative time series, though the focus of this paper is on additive manufacturing.
A Kalman-Filter-Based Approach to Combining Independent Earth-Orientation Series
NASA Technical Reports Server (NTRS)
Gross, Richard S.; Eubanks, T. M.; Steppe, J. A.; Freedman, A. P.; Dickey, J. O.; Runge, T. F.
1998-01-01
An approach, based upon the use of a Kalman filter, that is currently employed at the Jet Propulsion Laboratory (JPL) for combining independent measurements of the Earth's orientation is presented. Since changes in the Earth's orientation can be described as a randomly excited stochastic process, the uncertainty in our knowledge of the Earth's orientation grows rapidly in the absence of measurements. The Kalman-filter methodology allows for an objective accounting of this uncertainty growth, thereby facilitating the intercomparison of measurements taken at different epochs (not necessarily uniformly spaced in time) and with different precisions. As an example of this approach to combining Earth-orientation series, a description is given of a combination, SPACE95, that has been generated recently at JPL.
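A toy version of the combination idea in this abstract: the state is modeled as a random walk whose variance grows between measurement epochs, and measurements from techniques of different precision, irregularly spaced in time, are merged in time order. This is an illustrative scalar sketch under assumed noise parameters, not JPL's SPACE95 machinery.

```python
import numpy as np

def combine_series(times, values, sigmas, q, x0=0.0, p0=1e6):
    """Scalar Kalman filter for a random-walk state observed by heterogeneous series.

    times/values/sigmas: merged, time-sorted measurements from all series.
    q: random-walk power (state variance grows by q * dt between epochs).
    """
    x, P, t_prev = x0, p0, times[0]
    est, var = [], []
    for t, z, s in zip(times, values, sigmas):
        P += q * (t - t_prev)          # prediction: uncertainty grows with the data gap
        K = P / (P + s**2)             # Kalman gain weights precise data more heavily
        x += K * (z - x)               # update with the measurement
        P *= 1 - K
        t_prev = t
        est.append(x)
        var.append(P)
    return np.array(est), np.array(var)

rng = np.random.default_rng(5)
t = np.sort(rng.uniform(0, 100, 200))                  # irregular epochs, two techniques merged
truth = np.cumsum(np.r_[0, rng.standard_normal(199) * np.sqrt(np.diff(t))])
sig = np.where(rng.random(200) < 0.5, 0.3, 1.0)        # precise vs noisy technique
z = truth + sig * rng.standard_normal(200)
est, var = combine_series(t, z, sig, q=1.0)
```

The combined estimate tracks the truth more closely than the raw merged measurements, because measurements from the more precise technique are automatically weighted more strongly.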
A Radial Basis Function Approach to Financial Time Series Analysis
1993-12-01
could lead to higher fidelity models in regions of high data density by dropping the constraint of obtaining a spatially uniform fit. Finally, it may be... However, we would like to point out the advantage of looking at eliminating multiple parameters simultaneously. In RBF networks, for instance, it makes... using the learning networks introduced in Chapter 1. Although not a substitute for the more traditional arbitrage-based pricing formulas, network
Dai, Hongliang; Lu, Xiwu; Peng, Yonghong; Zou, Haiming; Shi, Jing
2016-12-01
Homogeneous nucleation of hydroxyapatite (HAP) crystallization in high levels of supersaturation solution has a negative effect on phosphorus recovery efficiency because of the poor settleability of the generated HAP microcrystalline. In this study, a new high-performance approach for phosphorus recovery from anaerobic supernatant using three series-coupled air-agitated crystallization reactors was developed and characterized. During 30-day operation, the proposed process showed a high recovery efficiency (∼95.82%) and low microcrystalline ratio (∼3.11%). Particle size analysis showed that the microcrystalline size was successively increased (from 5.81 to 26.32 μm) with the sequence of series-coupled reactors, confirming the conjectural mechanism that a multistage-induced crystallization system provided an appropriate condition for the growth, aggregation, and precipitation of crystallized products. Furthermore, the new process showed a broad spectrum of handling ability for different concentrations of phosphorus-containing solution in the range of 5-350 mg L(-1), and the obtained results of phosphorus conversion ratio and recovery efficiency were more than 92% and 80%, respectively. Overall, these results showed that the new process exhibited an excellent ability of efficient phosphorus recovery as well as wide application scope, and might be used as an effective approach for phosphorus removal and recovery from wastewater.
Time-series intervention analysis of pedestrian countdown timer effects.
Huitema, Bradley E; Van Houten, Ron; Manal, Hana
2014-11-01
Pedestrians account for 40-50% of traffic fatalities in large cities. Several previous studies based on relatively small samples have concluded that Pedestrian Countdown Timers (PCT) may reduce pedestrian crashes at signalized intersections, but other studies report no reduction. The purposes of the present article are to (1) describe a new methodology for evaluating the effectiveness of introducing PCT signals and (2) present the results of applying this methodology to pedestrian crash data collected in a large study carried out in Detroit, Michigan. The study design incorporated within-unit as well as between-unit components. The main focus was on dynamic effects that occurred within the PCT unit of 362 treated sites during the 120 months of the study. An interrupted time-series analysis was developed to evaluate whether the change in crash frequency depended upon the degree to which the countdown timers penetrated the treatment unit. The between-unit component involved comparisons between the treatment unit and a control unit. The overall conclusion is that the introduction of PCT signals in Detroit reduced pedestrian crashes to approximately one-third of the preintervention level. The evidence for this reduction is strong, and the change over time was shown to be a function of the extent to which the timers were introduced during the intervention period. There was no general drop-off in crash frequency throughout the baseline interval of over five years; only when the PCT signals were introduced in large numbers was a consistent and convincing crash reduction observed. Correspondingly, there was little evidence of change in the control unit.
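The interrupted time-series logic can be illustrated with a standard segmented regression, which estimates a pre-intervention level and trend plus a level change and trend change at the intervention point. This is a simplified stand-in for the article's dynamic-penetration model; the data and the size of the drop are synthetic.

```python
import numpy as np

def interrupted_ts_fit(y, t0):
    """Segmented-regression fit: level, pre-trend, and level/trend changes at t0."""
    n = len(y)
    t = np.arange(n)
    step = (t >= t0).astype(float)                   # intervention indicator
    X = np.column_stack([np.ones(n), t, step, step * (t - t0)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta                                      # [level, trend, level shift, trend shift]

rng = np.random.default_rng(6)
n, t0 = 120, 60
t = np.arange(n)
# synthetic monthly crash counts: flat baseline of 10, then a drop of 6 at the intervention
y = 10.0 - 6.0 * (t >= t0) + rng.standard_normal(n)
level, trend, shift, trend_shift = interrupted_ts_fit(y, t0)
```

The fitted level shift recovers the simulated drop, while the pre-intervention trend correctly comes out near zero, mirroring the article's finding of no baseline drop-off.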
Pose Estimation from Line Correspondences: A Complete Analysis and A Series of Solutions.
Xu, Chi; Zhang, Lilian; Cheng, Li; Koch, Reinhard
2016-06-20
In this paper we deal with the camera pose estimation problem from a set of 2D/3D line correspondences, which is also known as PnL (Perspective-n-Line) problem. We carry out our study by comparing PnL with the well-studied PnP (Perspective-n-Point) problem, and our contributions are threefold: (1) We provide a complete 3D configuration analysis for P3L, which includes the well-known P3P problem as well as several existing analyses as special cases. (2) By exploring the similarity between PnL and PnP, we propose a new subset-based PnL approach as well as a series of linear-formulation-based PnL approaches inspired by their PnP counterparts. (3) The proposed linear-formulation-based methods can be easily extended to deal with the line and point features simultaneously.
VAET: A Visual Analytics Approach for E-Transactions Time-Series.
Xie, Cong; Chen, Wei; Huang, Xinxin; Hu, Yueqi; Barlowe, Scott; Yang, Jing
2014-12-01
Previous studies on E-transaction time-series have mainly focused on finding temporal trends of transaction behavior. Interesting transactions that are time-stamped and situation-relevant may easily be obscured in a large amount of information. This paper proposes a visual analytics system, Visual Analysis of E-transaction Time-Series (VAET), that allows the analysts to interactively explore large transaction datasets for insights about time-varying transactions. With a set of analyst-determined training samples, VAET automatically estimates the saliency of each transaction in a large time-series using a probabilistic decision tree learner. It provides an effective time-of-saliency (TOS) map where the analysts can explore a large number of transactions at different time granularities. Interesting transactions are further encoded with KnotLines, a compact visual representation that captures both the temporal variations and the contextual connection of transactions. The analysts can thus explore, select, and investigate knotlines of interest. A case study and user study with a real E-transactions dataset (26 million records) demonstrate the effectiveness of VAET.
Holmes, E A; Bonsall, M B; Hales, S A; Mitchell, H; Renner, F; Blackwell, S E; Watson, P; Goodwin, G M; Di Simplicio, M
2016-01-26
Treatment innovation for bipolar disorder has been hampered by a lack of techniques to capture a hallmark symptom: ongoing mood instability. Mood swings persist during remission from acute mood episodes and impair daily functioning. The last significant treatment advance remains lithium (in the 1970s), which helps only a minority of patients. There is no accepted way to establish proof of concept for a new mood-stabilizing treatment. We suggest that combining insights from mood measurement with applied mathematics may provide a step change: repeated daily mood measurement (depression) over a short time frame (1 month) can create individual bipolar mood instability profiles. A time-series approach allows comparison of mood instability pre- and post-treatment. We test a new imagery-focused cognitive therapy treatment approach (MAPP; Mood Action Psychology Programme) targeting a driver of mood instability, and apply these measurement methods in a non-concurrent multiple-baseline-design case series of 14 patients with bipolar disorder. Weekly mood monitoring and treatment target data improved for the whole sample combined. Time-series analyses of daily mood data, sampled remotely (mobile phone/Internet) for 28 days pre- and post-treatment, demonstrated improvements in individuals' mood stability for 11 of 14 patients. The findings thus offer preliminary support for a new imagery-focused treatment approach. They also indicate a step in treatment innovation without the requirement for trials in illness episodes or relapse prevention. Importantly, daily measurement offers a description of mood instability at the individual patient level in a clinically meaningful time frame. This costly, chronic and disabling mental illness demands innovation in both treatment approaches (whether pharmacological or psychological) and measurement tools: this work indicates that daily measurements can be used to detect improvement in individual mood stability for treatment innovation (MAPP).
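The study's time-series methods are more elaborate than any single index, but the core idea of comparing pre- and post-treatment mood instability can be sketched with a simple instability measure. The RMSSD metric and the simulated 28-day mood series below are illustrative assumptions, not the authors' analysis:

```python
import numpy as np

def rmssd(series):
    """Root mean square of successive differences: a simple day-to-day instability index."""
    d = np.diff(np.asarray(series, dtype=float))
    return float(np.sqrt(np.mean(d ** 2)))

rng = np.random.default_rng(0)
t = np.arange(28)                                        # 28 daily mood ratings
pre = 10 + 4 * np.sin(t / 2) + rng.normal(0, 2.0, 28)    # unstable pre-treatment month
post = 10 + 1 * np.sin(t / 2) + rng.normal(0, 0.5, 28)   # steadier post-treatment month

print(rmssd(pre) > rmssd(post))   # True: instability fell after treatment
```

Any per-patient instability statistic computed this way on the pre and post windows supports the kind of individual-level comparison the abstract describes.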
NASA Astrophysics Data System (ADS)
Phillips, D. A.; Meertens, C. M.; Hodgkinson, K. M.; Puskas, C. M.; Boler, F. M.; Snett, L.; Mattioli, G. S.
2013-12-01
We present an overview of time series data, tools and services available from UNAVCO, along with two specific and compelling examples of geodetic time series analysis. UNAVCO provides a diverse suite of geodetic data products and cyberinfrastructure services to support community research and education. The UNAVCO archive includes data from 2500+ continuous GPS stations, borehole geophysics instruments (strainmeters, seismometers, tiltmeters, pore pressure sensors), and long-baseline laser strainmeters. These data span temporal scales from seconds to decades and provide global spatial coverage with regionally focused networks, including the EarthScope Plate Boundary Observatory (PBO) and COCONet. This rich, open-access dataset is a tremendous resource that enables the exploration, identification and analysis of time-varying signals associated with crustal deformation, reference frame determinations, isostatic adjustments, atmospheric phenomena, hydrologic processes and more. UNAVCO provides a suite of time series exploration and analysis resources, including static plots, dynamic plotting tools, and data products and services designed to enhance time series analysis. The PBO GPS network allows identification of deformation signals at the ~1 mm level. At some GPS stations, seasonal signals and longer-term trends in both the vertical and horizontal components can be dominated by the effects of hydrological loading from natural and anthropogenic sources. Modeling of hydrologic deformation using GLDAS and a variety of land surface models (NOAH, MOSAIC, VIC and CLM) shows promise for independently modeling hydrologic effects and separating them from tectonic deformation as well as anthropogenic loading sources. A major challenge is to identify where loading is dominant and GLDAS-based corrections can be applied, and where pumping is the dominant signal and corrections are not possible without some other data. In another arena, the PBO strainmeter network was designed to capture small short
NASA Astrophysics Data System (ADS)
Peng, Wei; Dai, Wujiao; Santerre, Rock; Cai, Changsheng; Kuang, Cuilin
2017-02-01
Daily vertical coordinate time series of Global Navigation Satellite System (GNSS) stations usually contain tectonic and non-tectonic deformation signals, residual atmospheric delay signals, measurement noise, etc. In geophysical studies, it is very important to separate the various geophysical signals from the GNSS time series so that they truthfully reflect the effect of mass loadings on crustal deformation. Based on the independence of the mass loadings, we combine Ensemble Empirical Mode Decomposition (EEMD) with Phase Space Reconstruction-based Independent Component Analysis (PSR-ICA) to analyze the vertical time series of GNSS reference stations. In a simulation experiment, the seasonal non-tectonic signal is simulated as the sum of the corrections for atmospheric mass loading and soil moisture mass loading. The simulated seasonal non-tectonic signal can be separated into two independent signals using the PSR-ICA method, which are strongly correlated with atmospheric mass loading and soil moisture mass loading, respectively. Likewise, in the analysis of the vertical time series of GNSS reference stations of the Crustal Movement Observation Network of China (CMONOC), similar results are obtained using the combined EEMD and PSR-ICA method. All these results indicate that the EEMD and PSR-ICA method can effectively separate the independent atmospheric and soil moisture mass loading signals and illustrate the significant causes of the seasonal variation of GNSS vertical time series in mainland China.
NASA Astrophysics Data System (ADS)
Bakr, Mahmoud I.; Butler, Adrian P.
2005-01-01
Nonstationarity of flow fields due to pumping wells and its impact on advective transport is of particular interest in well capture zone design and wellhead protection. However, techniques based on Monte Carlo methods to characterize the associated capture zone uncertainty are time consuming and cumbersome. This paper introduces an alternative approach. The mean and covariance of system state variables (i.e., head, pore water velocity, and particle trajectory) are approximated using a first-order Taylor's series with sensitivity coefficients estimated from the adjoint operator for a system of discrete equations. The approach allows nonstationarity due to several sources (e.g., transmissivity, pumping, boundary conditions) to be treated. By employing numerical solution methods, it is able to handle irregular geometry, varying boundary conditions, complicated sink/source terms, and different covariance functions, all of which are important factors for real-world applications. A comparison of results for the Taylor's series approximation with those from Monte Carlo analysis showed, in general, good agreement for most of the tested particles. Particle trajectory variance calculated using Taylor's series approximation is then used to predict well capture zone probabilities under the assumption of normality of the mass transport's state variables. Verification of this assumption showed that not all particle trajectories (depending on their starting location) are normally or log-normally distributed. However, the risk of using the first-order method to delineate the confidence interval of a well capture zone is minimal since it marginally overestimates the 2.5% probability contour. Furthermore, this should be balanced against its greater computation efficiency over the Monte Carlo approach.
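Stripped of the groundwater details, the first-order Taylor (delta-method) idea the abstract describes is to propagate input variance through the model's derivative and compare against Monte Carlo. The scalar exponential model below is only a stand-in chosen to make the comparison concrete, not the authors' flow model:

```python
import numpy as np

def taylor_var(fprime, mu, var_x):
    """First-order (delta-method) approximation: Var[f(X)] ~ f'(mu)^2 Var[X]."""
    return fprime(mu) ** 2 * var_x

rng = np.random.default_rng(1)
mu, sigma = 1.0, 0.05                  # modest input uncertainty
x = rng.normal(mu, sigma, 200_000)

# nonlinear "model output" f(x) = exp(x), whose derivative is also exp
approx = taylor_var(np.exp, mu, sigma ** 2)
mc = np.var(np.exp(x))                 # Monte Carlo reference value

print(abs(approx - mc) / mc < 0.05)    # True: close agreement for mild nonlinearity
```

As in the paper, the first-order estimate is far cheaper than the Monte Carlo reference and agrees well when the nonlinearity is mild relative to the input spread.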
Jha, Abhinav K.; Kupinski, Matthew A.; Masumura, Takahiro; Clarkson, Eric; Maslov, Alexey V.; Barrett, Harrison H.
2014-01-01
We present the implementation, validation, and performance of a Neumann-series approach for simulating light propagation at optical wavelengths in uniform media using the radiative transport equation (RTE). The RTE is solved for an anisotropic-scattering medium in a spherical harmonic basis for a diffuse-optical-imaging setup. The main objectives of this paper are threefold: to present the theory behind the Neumann-series form for the RTE, to design and develop the mathematical methods and the software to implement the Neumann series for a diffuse-optical-imaging setup, and, finally, to perform an exhaustive study of the accuracy, practical limitations, and computational efficiency of the Neumann-series method. Through our results, we demonstrate that the Neumann-series approach can be used to model light propagation in uniform media with small geometries at optical wavelengths. PMID:23201893
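Stripped of the radiative-transport details, a Neumann series approximates the solution of x = b + Kx by summing powers of the operator K. A minimal sketch, with a small random contraction as a stand-in for the discretized RTE operator:

```python
import numpy as np

def neumann_solve(K, b, n_terms=50):
    """Approximate (I - K)^-1 b by the truncated Neumann series b + Kb + K^2 b + ..."""
    x = np.zeros_like(b)
    term = b.copy()
    for _ in range(n_terms):
        x += term
        term = K @ term                 # next term of the series
    return x

rng = np.random.default_rng(2)
K = 0.1 * rng.standard_normal((5, 5))   # contraction: spectral radius well below 1
b = rng.standard_normal(5)

x_series = neumann_solve(K, b)
x_direct = np.linalg.solve(np.eye(5) - K, b)
print(np.allclose(x_series, x_direct))  # True: the series converges to the direct solution
```

The series converges when the spectral radius of K is below 1, which in the RTE setting corresponds to physically sensible (lossy) scattering; truncation error then decays geometrically with the number of terms.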
Scaling behaviour of heartbeat intervals obtained by wavelet-based time-series analysis
NASA Astrophysics Data System (ADS)
Ivanov, Plamen Ch.; Rosenblum, Michael G.; Peng, C.-K.; Mietus, Joseph; Havlin, Shlomo; Stanley, H. Eugene; Goldberger, Ary L.
1996-09-01
BIOLOGICAL time-series analysis is used to identify hidden dynamical patterns which could yield important insights into underlying physiological mechanisms. Such analysis is complicated by the fact that biological signals are typically both highly irregular and non-stationary, that is, their statistical character changes slowly or intermittently as a result of variations in background influences [1-3]. Previous statistical analyses of heartbeat dynamics [4-6] have identified long-range correlations and power-law scaling in the normal heartbeat, but not the phase interactions between the different frequency components of the signal. Here we introduce a new approach, based on the wavelet transform and an analytic signal approach, which can characterize non-stationary behaviour and elucidate such phase interactions. We find that, when suitably rescaled, the distributions of the variations in the beat-to-beat intervals for all healthy subjects are described by a single function stable over a wide range of timescales. However, a similar scaling function does not exist for a group with cardiopulmonary instability caused by sleep apnoea. We attribute the functional form of the scaling observed in the healthy subjects to underlying nonlinear dynamics, which seem to be essential to normal heart function. The approach introduced here should be useful in the analysis of other nonstationary biological signals.
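The analytic-signal step can be sketched with an FFT-based Hilbert transform, which recovers the instantaneous amplitude of a modulated signal. The synthetic carrier and envelope below are illustrative, not heartbeat data:

```python
import numpy as np

def analytic_signal(x):
    """FFT-based analytic signal: zero negative frequencies, double positive ones."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0                # Nyquist bin for even-length signals
    return np.fft.ifft(X * h)

t = np.linspace(0.0, 1.0, 1000, endpoint=False)
envelope = 1.0 + 0.5 * np.sin(2 * np.pi * 2 * t)   # slow amplitude modulation
x = envelope * np.cos(2 * np.pi * 50 * t)          # fast carrier
amp = np.abs(analytic_signal(x))                   # instantaneous amplitude

print(np.max(np.abs(amp - envelope)) < 1e-6)       # True: envelope recovered
```

The phase of the same complex signal, np.angle(analytic_signal(x)), gives the instantaneous phase used to study interactions between frequency components.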
On fractal analysis of cardiac interbeat time series
NASA Astrophysics Data System (ADS)
Guzmán-Vargas, L.; Calleja-Quevedo, E.; Angulo-Brown, F.
2003-09-01
In recent years the complexity of a cardiac beat-to-beat time series has been taken as an auxiliary tool to identify the health status of human hearts. Several methods have been employed to characterize time series complexity. In this work we calculate the fractal dimension of interbeat time series arising from three groups: 10 young healthy persons, 8 elderly healthy persons and 10 patients with congestive heart failure. Our numerical results reflect evident differences in the dynamic behavior corresponding to each group. We discuss these results within the context of the neuroautonomic control of heart rate dynamics. We also propose a numerical simulation which reproduces aging effects in heart rate behavior.
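The abstract does not say which fractal-dimension estimator was used; as one common choice for 1-D series, Higuchi's method can be sketched as follows, with white noise (dimension near 2) and a smooth ramp (dimension near 1) as sanity checks:

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Higuchi's estimator of the fractal dimension of a 1-D time series."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    ks = np.arange(1, kmax + 1)
    lengths = []
    for k in ks:
        lm = []
        for m in range(k):
            idx = np.arange(m, n, k)
            # normalised curve length at scale k and offset m
            lm.append(np.sum(np.abs(np.diff(x[idx]))) * (n - 1) / ((len(idx) - 1) * k * k))
        lengths.append(np.mean(lm))
    # L(k) ~ k^(-D), so the slope of log L(k) against log(1/k) estimates D
    slope, _ = np.polyfit(np.log(1.0 / ks), np.log(lengths), 1)
    return float(slope)

rng = np.random.default_rng(3)
white = rng.standard_normal(5000)      # white noise: dimension close to 2
ramp = np.linspace(0.0, 1.0, 5000)     # smooth curve: dimension close to 1
print(round(higuchi_fd(white)), round(higuchi_fd(ramp)))   # 2 1
```

Healthy and pathological interbeat series would fall between these two extremes, which is what makes the estimated dimension a usable group discriminator.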
NASA Astrophysics Data System (ADS)
Mitchell, R.; Hilton, E.; Rosenfield, P.
2011-12-01
Communicating the results and significance of basic research to the general public is of critical importance. Federal funding and university budgets are under substantial pressure, and taxpayer support of basic research is critical. Public outreach by ecologists is an important vehicle for increasing support and understanding of science in an era of anthropogenic global change. At present, very few programs or courses exist to allow young scientists the opportunity to hone and practice their public outreach skills. Although the need for science outreach and communication is recognized, graduate programs often fail to provide any training in making science accessible. Engage: The Science Speaker Series represents a unique, graduate student-led effort to improve public outreach skills. Founded in 2009, Engage was created by three science graduate students at the University of Washington. The students developed a novel, interdisciplinary curriculum to investigate why science outreach often fails, to improve graduate student communication skills, and to help students create a dynamic, public-friendly talk. The course incorporates elements of story-telling, improvisational arts, and development of analogy, all with a focus on clarity, brevity and accessibility. This course was offered to graduate students and post-doctoral researchers from a wide variety of sciences in the autumn of 2010. Students who participated in the Engage course were then given the opportunity to participate in Engage: The Science Speaker Series. This free, public-friendly speaker series is hosted on the University of Washington campus and has had substantial public attendance and participation. The growing success of Engage illustrates the need for such programs throughout graduate level science curricula. We present the impetus for the development of the program, elements of the curriculum covered in the Engage course, the importance of an interdisciplinary approach, and discuss strategies for
NASA Astrophysics Data System (ADS)
Mitchell, R.; Hilton, E.; Rosenfield, P.
2012-12-01
Communicating the results and significance of basic research to the general public is of critical importance. Federal funding and university budgets are under substantial pressure, and taxpayer support of basic research is critical. Public outreach by ecologists is an important vehicle for increasing support and understanding of science in an era of anthropogenic global change. At present, very few programs or courses exist to allow young scientists the opportunity to hone and practice their public outreach skills. Although the need for science outreach and communication is recognized, graduate programs often fail to provide any training in making science accessible. Engage: The Science Speaker Series represents a unique, graduate student-led effort to improve public outreach skills. Founded in 2009, Engage was created by three science graduate students at the University of Washington. The students developed a novel, interdisciplinary curriculum to investigate why science outreach often fails, to improve graduate student communication skills, and to help students create a dynamic, public-friendly talk. The course incorporates elements of story-telling, improvisational arts, and development of analogy, all with a focus on clarity, brevity and accessibility. This course was offered to graduate students and post-doctoral researchers from a wide variety of sciences in the autumn of 2010 and 2011, and will be retaught in 2012. Students who participated in the Engage course were then given the opportunity to participate in Engage: The Science Speaker Series. This free, public-friendly speaker series has been hosted at the University of Washington campus and Seattle Town Hall, and has had substantial public attendance and participation. The growing success of Engage illustrates the need for such programs throughout graduate level science curricula. We present the impetus for the development of the program, elements of the curriculum covered in the Engage course, the
Investigation on Law and Economics Based on Complex Network and Time Series Analysis.
Yang, Jian; Qu, Zhao; Chang, Hui
2015-01-01
The research focuses on the cooperative relationships and strategic tendencies among three mutually interacting parties in financing: small enterprises, commercial banks and micro-credit companies. Complex network theory and time series analysis were applied to obtain quantitative evidence. Moreover, this paper builds a fundamental model describing the particular interaction among them through evolutionary game theory. Combining the results of the data analysis with the current situation, it is justifiable to put forward reasonable legislative recommendations for regulating lending activities among small enterprises, commercial banks and micro-credit companies. The approach in this research provides a framework for constructing mathematical models and applying econometrics and evolutionary games to the issue of corporate financing.
An introduction to chaotic and random time series analysis
NASA Technical Reports Server (NTRS)
Scargle, Jeffrey D.
1989-01-01
The origin of chaotic behavior and the relation of chaos to randomness are explained. Two mathematical results are described: (1) a representation theorem guarantees the existence of a specific time-domain model for chaos and addresses the relation between chaotic, random, and strictly deterministic processes; (2) a theorem assures that information on the behavior of a physical system in its complete state space can be extracted from time-series data on a single observable. Focus is placed on an important connection between the dynamical state space and an observable time series. These two results lead to a practical deconvolution technique combining standard random-process modeling methods with new embedding techniques.
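Result (2) is the basis of delay embedding: state-space structure is recovered from a single observable. A minimal sketch, using the Hénon map as an assumed example system, checks that nearest neighbours in the reconstructed space have nearly identical futures, i.e. that the embedding captures the deterministic dynamics:

```python
import numpy as np

def delay_embed(x, dim, tau=1):
    """Delay vectors [x(t), x(t+tau), ..., x(t+(dim-1)tau)] from a single observable."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

# Hénon map, observing only the x coordinate (an assumed example system)
a, b = 1.4, 0.3
x, y = 0.1, 0.1
obs = []
for _ in range(3000):
    x, y = 1.0 - a * x * x + y, b * x
    obs.append(x)
obs = np.array(obs[500:])              # drop the transient

E = delay_embed(obs[:-1], dim=2)       # reconstructed states (x_t, x_{t+1})
future = obs[2:]                       # observable one step beyond each state

# determinism in the embedding: nearest neighbours share near-identical futures
errs = []
for q in range(0, len(E), 25):
    d = np.linalg.norm(E - E[q], axis=1)
    d[q] = np.inf                      # exclude the point itself
    errs.append(abs(future[q] - future[int(np.argmin(d))]))
print(np.mean(errs) < 0.1)             # True: one observable predicts the dynamics
```

For a genuinely random series, nearest neighbours in the embedding carry no information about the next value, which is one practical way to distinguish chaos from randomness.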
3-D, bluff body drag estimation using a Green's function/Gram-Charlier series approach.
Barone, Matthew Franklin; De Chant, Lawrence Justin
2004-05-01
In this study, we describe the extension of the 2-d preliminary design bluff body drag estimation tool developed by De Chant to apply to 3-d flows. As with the 2-d method, the 3-d extension uses a combined approximate Green's function/Gram-Charlier series approach to retain the body geometry information. Whereas the 2-d methodology relied solely upon small disturbance theory for the inviscid flow field associated with the body of interest to estimate the near-field initial conditions, e.g. the velocity defect, the 3-d methodology uses both analytical (where available) and numerical inviscid solutions. The defect solution is then used as an initial condition in an approximate 3-d Green's function solution. Finally, the Green's function solution is matched to the 3-d analog of the classical 2-d Gram-Charlier series and then integrated to yield the net form drag on the bluff body. Preliminary results indicate that the drag estimates computed are of accuracy equivalent to the 2-d method for flows with large separation, i.e. less than 20% relative error. As with the lower-dimensional method, the 3-d concept is intended to be a supplement to turbulent Navier-Stokes and experimental solutions for estimating drag coefficients over blunt bodies.
Time series analysis of sferics rate data associated with severe weather patterns
NASA Technical Reports Server (NTRS)
Wang, P. P.; Burns, R. C.
1976-01-01
Data obtained by an electronic transducer measuring the rate of occurrence of electrical disturbances in the atmosphere (the sferics rate, in the form of a time series) over the life of electrical storms are analyzed. It is found that the sferics rate time series is not stationary: it has a complete life cycle associated with a particular storm. The approach to recognition of a spectral pattern is somewhat similar to real-time recognition of the spoken word.
Hirano, Shoji; Tsumoto, Shusaku
2002-01-01
In this paper, we present an analysis method for time-series laboratory examination data based on multiscale matching and rough clustering. We obtain the similarity of sequences by multiscale matching, which compares two sequences across various scales of view. It has the advantage that the connectivity of segments is preserved in the matching results even when the partial segments are obtained from different scales. Given the relative similarity of the sequences, we cluster them with a rough-set-based clustering technique. It groups the sequences based on their indiscernibility and is able to produce interpretable clusters without calculating the centroid or variance of a cluster. In the experiments we demonstrate that the features of the patterns were successfully captured by this hybrid approach.
Nader, T; Rothenberg, S; Averbach, R; Charles, B; Fields, J Z; Schneider, R H
2000-01-01
Approximately 40% of the US population report using complementary and alternative medicine, including Maharishi Vedic Medicine (MVM), a traditional, comprehensive system of natural medicine, for relief from chronic and other disorders. Although many reports suggest health benefits from individual MVM techniques, reports on integrated holistic approaches are rare. This case series, designed to investigate the effectiveness of an integrated, multimodality MVM program in an ideal clinical setting, describes the outcomes in four patients: one with sarcoidosis; one with Parkinson's disease; a third with renal hypertension; and a fourth with diabetes/essential hypertension/anxiety disorder. Standard symptom reports and objective markers of disease were evaluated before, during, and after the treatment period. Results suggested substantial improvements as indicated by reductions in major signs, symptoms, and use of conventional medications in the four patients during the 3-week in-residence treatment phase and continuing through the home follow-up program.
NASA Astrophysics Data System (ADS)
Curceac, S.; Ternynck, C.; Ouarda, T.
2015-12-01
Over the past decades, a substantial amount of research has been conducted to model and forecast climatic variables. In this study, Nonparametric Functional Data Analysis (NPFDA) methods are applied to forecast air temperature and wind speed time series in Abu Dhabi, UAE. The dataset consists of hourly measurements recorded over a period of 29 years, 1982-2010. The novelty of the Functional Data Analysis approach lies in expressing the data as curves. In the present work, the focus is on daily forecasting, and the functional observations (curves) express the daily measurements of the above-mentioned variables. We apply a non-linear regression model with a functional non-parametric kernel estimator. The computation of the estimator is performed using an asymmetrical quadratic kernel function for local weighting, with the bandwidth obtained by a cross-validation procedure. The proximities between functional objects are calculated by families of semi-metrics based on derivatives and Functional Principal Component Analysis (FPCA). Additionally, functional conditional mode and functional conditional median estimators are applied, and the advantages of combining their results are analysed. A different approach employs a SARIMA model selected according to the minimum Akaike (AIC) and Bayesian (BIC) information criteria and based on the residuals of the model. The performance of the models is assessed by calculating error indices such as the root mean square error (RMSE), relative RMSE, BIAS and relative BIAS. The results indicate that the NPFDA models provide more accurate forecasts than the SARIMA models. Key words: nonparametric functional data analysis, SARIMA, time series forecasting, air temperature, wind speed
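A minimal sketch of a functional Nadaraya-Watson estimator with a quadratic kernel and an L2 semi-metric follows; the sinusoidal "daily curves", the scalar target, and the bandwidth are illustrative assumptions, not the study's data or tuning:

```python
import numpy as np

def nw_forecast(curves, targets, query, h):
    """Nadaraya-Watson functional kernel estimator with a quadratic kernel;
    curve-to-curve distance is an L2 semi-metric on the sampled values."""
    d = np.sqrt(np.mean((curves - query) ** 2, axis=1))
    u = d / h
    w = np.where(u < 1.0, 1.0 - u ** 2, 0.0)       # quadratic kernel, compact support
    if w.sum() == 0.0:
        return float(targets.mean())               # no neighbours: fall back to the mean
    return float(np.sum(w * targets) / np.sum(w))

rng = np.random.default_rng(7)
tgrid = np.linspace(0.0, 2.0 * np.pi, 24, endpoint=False)
phase = rng.uniform(0.0, 2.0 * np.pi, 2000)

curves = np.sin(tgrid[None, :] + phase[:, None])   # one 24-point "daily curve" per phase
targets = np.cos(phase)                            # scalar response, smooth in the curve

query = np.sin(tgrid + 1.0)                        # new curve with (hidden) phase 1.0
pred = nw_forecast(curves, targets, query, h=0.3)
print(abs(pred - np.cos(1.0)) < 0.1)               # True: prediction close to cos(1.0)
```

In practice the bandwidth h would be chosen by cross-validation and the semi-metric could be based on derivatives or FPCA scores, as in the abstract.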
Multiscale multifractal multiproperty analysis of financial time series based on Rényi entropy
NASA Astrophysics Data System (ADS)
Yujun, Yang; Jianping, Li; Yimei, Yang
This paper introduces a multiscale multifractal multiproperty analysis based on Rényi entropy (3MPAR) method to analyze short-range and long-range characteristics of financial time series, and then applies this method to five time series of five properties in four stock indices. Combining the two analysis techniques of Rényi entropy and multifractal detrended fluctuation analysis (MFDFA), the 3MPAR method focuses on the curves of the Rényi entropy and the generalized Hurst exponent for five properties of the four stock time series, which allows us to study more universal and subtle fluctuation characteristics of financial time series. By analyzing the curves of the Rényi entropy and the profiles of the logarithmic distribution of the MFDFA for five properties of the four stock indices, the 3MPAR method reveals fluctuation characteristics of the financial time series and the stock markets. It also reveals richer information in the financial time series by comparing the profiles of the five properties of the four stock indices. In this paper, we focus not only on the multifractality of the time series but also on the fluctuation characteristics of the financial time series and on subtle differences among the time series of different properties. We find that financial time series are far more complex than reported in some research works that use only one property of the time series.
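The Rényi entropy at the heart of the method generalizes Shannon entropy with an order parameter q; a minimal sketch for a discrete distribution (an assumption, since the paper works with continuous financial series binned in some way) is:

```python
import numpy as np

def renyi_entropy(p, q):
    """Rényi entropy of order q for a discrete distribution p (q -> 1 gives Shannon)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return float(-np.sum(p * np.log(p)))       # Shannon limit
    return float(np.log(np.sum(p ** q)) / (1.0 - q))

uniform = np.full(4, 0.25)
skewed = np.array([0.85, 0.05, 0.05, 0.05])

# the uniform distribution maximises the entropy at every order q
for q in (0.5, 1.0, 2.0):
    print(renyi_entropy(uniform, q) > renyi_entropy(skewed, q))   # True for each q
```

Sweeping q emphasizes rare events (q < 1) or dominant events (q > 1), which is what lets the Rényi curve expose fluctuation structure that a single entropy value would miss.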
Time-Series Analyses of Air Pollution and Mortality in the United States: A Subsampling Approach
McClellan, Roger O.; Dewanji, Anup; Turim, Jay; Luebeck, E. Georg; Edwards, Melanie
2012-01-01
Background: Hierarchical Bayesian methods have been used in previous papers to estimate national mean effects of air pollutants on daily deaths in time-series analyses. Objectives: We obtained maximum likelihood estimates of the common national effects of the criteria pollutants on mortality based on time-series data from ≤ 108 metropolitan areas in the United States. Methods: We used a subsampling bootstrap procedure to obtain the maximum likelihood estimates and confidence bounds for common national effects of the criteria pollutants, as measured by the percentage increase in daily mortality associated with a unit increase in daily 24-hr mean pollutant concentration on the previous day, while controlling for weather and temporal trends. We considered five pollutants [PM10, ozone (O3), carbon monoxide (CO), nitrogen dioxide (NO2), and sulfur dioxide (SO2)] in single- and multipollutant analyses. Flexible ambient concentration–response models for the pollutant effects were considered as well. We performed limited sensitivity analyses with different degrees of freedom for time trends. Results: In single-pollutant models, we observed significant associations of daily deaths with all pollutants. The O3 coefficient was highly sensitive to the degree of smoothing of time trends. Among the gases, SO2 and NO2 were most strongly associated with mortality. The flexible ambient concentration–response curve for O3 showed evidence of nonlinearity and a threshold at about 30 ppb. Conclusions: Differences between the results of our analyses and those reported from using the Bayesian approach suggest that estimates of the quantitative impact of pollutants depend on the choice of statistical approach, although results are not directly comparable because they are based on different data. In addition, the estimate of the O3-mortality coefficient depends on the amount of smoothing of time trends. PMID:23108284
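The subsampling idea, re-estimating a common effect on random subsets of cities to obtain confidence bounds, can be sketched on simulated data; the pooled least-squares estimator and the Gaussian toy data below are illustrative assumptions, not the study's health model:

```python
import numpy as np

rng = np.random.default_rng(4)
true_effect = 0.5                      # illustrative common effect, not a value from the paper

# simulated per-city data sharing one effect of "pollutant" x on "mortality" y
cities = []
for _ in range(40):
    x = rng.normal(0.0, 1.0, 200)
    y = true_effect * x + rng.normal(0.0, 1.0, 200)
    cities.append((x, y))

def pooled_effect(sample):
    """Common-slope estimate under iid Gaussian errors: pooled least squares."""
    sxy = sum(np.dot(x, y) for x, y in sample)
    sxx = sum(np.dot(x, x) for x, y in sample)
    return sxy / sxx

est = pooled_effect(cities)

# subsampling: re-estimate on random subsets of cities to obtain confidence bounds
subs = [pooled_effect([cities[i] for i in rng.choice(len(cities), 20, replace=False)])
        for _ in range(500)]
lo, hi = np.percentile(subs, [2.5, 97.5])
print(lo < est < hi)                   # True: the bounds straddle the full-sample estimate
```

The spread of the subset estimates stands in for the sampling variability of the full maximum likelihood estimate, which is the role the subsampling bootstrap plays in the abstract.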
A Time-Series Analysis of Hispanic Unemployment.
ERIC Educational Resources Information Center
Defreitas, Gregory
1986-01-01
This study undertakes the first systematic time-series research on the cyclical patterns and principal determinants of Hispanic joblessness in the United States. The principal findings indicate that Hispanics tend to bear a disproportionate share of increases in unemployment during recessions. (Author/CT)
Dynamic Factor Analysis of Nonstationary Multivariate Time Series.
ERIC Educational Resources Information Center
Molenaar, Peter C. M.; And Others
1992-01-01
The dynamic factor model proposed by P. C. Molenaar (1985) is exhibited, and a dynamic nonstationary factor model (DNFM) is constructed with latent factor series that have time-varying mean functions. The use of a DNFM is illustrated using data from a television viewing habits study. (SLD)
Model Identification in Time-Series Analysis: Some Empirical Results.
ERIC Educational Resources Information Center
Padia, William L.
Model identification of time-series data is essential to valid statistical tests of intervention effects. Model identification is, at best, inexact in the social and behavioral sciences where one is often confronted with small numbers of observations. These problems are discussed, and the results of independent identifications of 130 social and…
On the Interpretation of Running Trends as Summary Statistics for Time Series Analysis
NASA Astrophysics Data System (ADS)
Vigo, Isabel M.; Trottini, Mario; Belda, Santiago
2016-04-01
In recent years, running trends analysis (RTA) has been widely used in climate applied research as summary statistics for time series analysis. There is no doubt that RTA might be a useful descriptive tool, but despite its general use in applied research, precisely what it reveals about the underlying time series is unclear and, as a result, its interpretation is unclear too. This work contributes to such interpretation in two ways: 1) an explicit formula is obtained for the set of time series with a given series of running trends, making it possible to show that running trends, alone, perform very poorly as summary statistics for time series analysis; and 2) an equivalence is established between RTA and the estimation of a (possibly nonlinear) trend component of the underlying time series using a weighted moving average filter. Such equivalence provides a solid ground for RTA implementation and interpretation/validation.
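The equivalence claimed above (running trends as a weighted moving-average filter) is easy to verify numerically: the OLS slope over a sliding window is a fixed linear combination of the window's values. A short sketch with synthetic data and an assumed window length:

```python
import numpy as np

def running_trends(x, window):
    """Slope of an OLS line fitted in each sliding window of the series."""
    t = np.arange(window)
    # The OLS slope is a fixed linear filter: slope = sum(w_i * x_i)
    w = (t - t.mean()) / ((t - t.mean()) ** 2).sum()
    return np.array([w @ x[i:i + window] for i in range(len(x) - window + 1)])

rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(size=200))      # synthetic "climate" series
trends = running_trends(x, 30)

# Same numbers as fitting a line in each window with np.polyfit: running
# trend analysis is exactly a weighted moving average of the series.
check = np.array([np.polyfit(np.arange(30), x[i:i + 30], 1)[0]
                  for i in range(len(x) - 29)])
```

Because the weights `w` are the same for every window, the running-trend series inherits all the usual properties (and pitfalls) of linear filtering.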
ERIC Educational Resources Information Center
Towgood, Karren J.; Meuwese, Julia D. I.; Gilbert, Sam J.; Turner, Martha S.; Burgess, Paul W.
2009-01-01
In the neuropsychological case series approach, tasks are administered that tap different cognitive domains, and differences within rather than across individuals are the basis for theorising; each individual is effectively their own control. This approach is a mainstay of cognitive neuropsychology, and is particularly suited to the study of…
On statistical approaches to climate change analysis
NASA Astrophysics Data System (ADS)
Lee, Terry Chun Kit
Evidence for a human contribution to climatic changes during the past century is accumulating rapidly. Given the strength of the evidence, it seems natural to ask whether forcing projections can be used to forecast climate change. A Bayesian method for post-processing forced climate model simulations that produces probabilistic hindcasts of inter-decadal temperature changes on large spatial scales is proposed. Hindcasts produced for the last two decades of the 20th century are shown to be skillful. The suggestion that skillful decadal forecasts can be produced on large regional scales by exploiting the response to anthropogenic forcing provides additional evidence that anthropogenic change in the composition of the atmosphere has influenced our climate. In the absence of large negative volcanic forcing on the climate system (which cannot presently be forecast), the global mean temperature for the decade 2000-2009 is predicted to lie above the 1970-1999 normal with probability 0.94. The global mean temperature anomaly for this decade relative to 1970-1999 is predicted to be 0.35°C (5-95% confidence range: 0.21°C--0.48°C). Reconstruction of temperature variability of the past centuries using climate proxy data can also provide important information on the role of anthropogenic forcing in the observed 20th century warming. A state-space model approach that allows incorporation of additional non-temperature information, such as the estimated response to external forcing, to reconstruct historical temperature is proposed. An advantage of this approach is that it permits simultaneous reconstruction and detection analysis as well as future projection. A difficulty in using this approach is that estimation of several unknown state-space model parameters is required. To take advantage of the data structure in the reconstruction problem, the existing parameter estimation approach is modified, resulting in two new estimation approaches. The competing estimation approaches
Catchment classification based on a comparative analysis of time series of natural tracers
NASA Astrophysics Data System (ADS)
Lehr, Christian; Lischeid, Gunnar; Tetzlaff, Doerthe
2014-05-01
Catchments not only smooth the precipitation signal into the discharge hydrograph, but also transform chemical signals (e.g. contaminations or nutrients) in a characteristic way. Under the assumption of an approximately homogeneous input signal of a conservative tracer in the catchment, the transformation of the signal at different locations can be used to infer hydrological properties of the catchment. For this study, comprehensive data on geology, soils, topography, land use, etc., as well as hydrological knowledge about transit times, mixing ratios of base flow, etc., are available for the catchment of the river Dee (1849 km2) in Scotland, UK. The Dee has its origin in the Cairngorm Mountains in central Scotland and flows towards the eastern coast of Scotland, where it enters the North Sea at Aberdeen. From the source in the west to the coast in the east there is a distinct decrease in precipitation and altitude. For one year, water quality in the Dee was sampled biweekly at 59 sites along the main stem of the river and the outflows of a number of tributaries. A nonlinear variant of Principal Component Analysis (Isometric Feature Mapping) was applied to time series of different chemical parameters that were assumed to be relatively conservative and applicable as natural tracers. Here, the information in the time series was not used to analyse the temporal development at the different sites; instead, in a snapshot kind of approach, the spatial expression of the different solutes at the 26 sampling dates was analysed. For all natural tracers the first component depicted > 89 % of the variance in the series. Subsequently, the spatial expression of the first component was related to the spatial patterns of the catchment characteristics. The presented approach makes it possible to characterise a catchment in a spatially discrete way according to the hydrologically active properties of the catchment on the landscape scale, which is often the scale of interest for water management purposes.
Modeling Philippine Stock Exchange Composite Index Using Time Series Analysis
NASA Astrophysics Data System (ADS)
Gayo, W. S.; Urrutia, J. D.; Temple, J. M. F.; Sandoval, J. R. D.; Sanglay, J. E. A.
2015-06-01
This study was conducted to develop a time series model of the Philippine Stock Exchange Composite Index (PSEi) and its volatility using a finite mixture of ARIMA models with conditional variance equations such as the ARCH, GARCH, EGARCH, TARCH and PARCH models. The study also aimed to find out which economic variables (Consumer Price Index, crude oil price, foreign exchange rate, gold price, interest rate, money supply, price-earnings ratio, Producers' Price Index and terms of trade) can be used in projecting future values of the PSEi; this was examined using the Granger causality test. The findings showed that the best time series model for the Philippine Stock Exchange Composite Index is ARIMA(1,1,5)-ARCH(1). Consumer Price Index, crude oil price and foreign exchange rate were found to Granger-cause the PSEi.
Multifractal analysis of time series generated by discrete Ito equations
Telesca, Luciano; Czechowski, Zbigniew; Lovallo, Michele
2015-06-15
In this study, we show that discrete Ito equations with short-tail Gaussian marginal distribution function generate multifractal time series. The multifractality is due to the nonlinear correlations, which are hidden in Markov processes and are generated by the interrelation between the drift and the multiplicative stochastic forces in the Ito equation. A link between the range of the generalized Hurst exponents and the mean of the squares of all averaged net forces is suggested.
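A discrete Ito equation of the kind discussed above can be iterated directly. The drift and forcing below are illustrative choices (a linear drift with constant stochastic forcing, i.e. a discrete Ornstein-Uhlenbeck process), not the specific equations of the paper:

```python
import numpy as np

def discrete_ito(n, a=-0.5, b=1.0, dt=0.01, seed=0):
    """Iterate x_{k+1} = x_k + a*x_k*dt + b*sqrt(dt)*xi_k, a discrete Ito
    equation with linear drift and constant stochastic forcing. The
    resulting Markov process has a short-tail Gaussian marginal."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    for k in range(n - 1):
        x[k + 1] = x[k] + a * x[k] * dt + b * np.sqrt(dt) * rng.normal()
    return x

series = discrete_ito(10_000)
```

A multifractal analysis (e.g. MFDFA) of such series would then probe the nonlinear correlations induced by the interplay of drift and forcing.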
Acoustic neuroma surgery as an interdisciplinary approach: a neurosurgical series of 508 patients
Tonn, J.; Schlake, H.; Goldbrunner, R.; Milewski, C.; Helms, J.; Roosen, K.
2000-01-01
OBJECTIVES: To evaluate an interdisciplinary concept (neurosurgery/ear, nose, and throat (ENT)) of treating acoustic neuromas with extrameatal extension via the retromastoidal approach. To analyse whether monitoring both facial nerve EMG and BAEP improved the functional outcome in acoustic neuroma surgery. METHODS: In a series of 508 patients consecutively operated on over a period of 7 years, functional outcome of the facial nerve was evaluated according to the House/Brackmann scale and hearing preservation was classified using the Gardner/Robertson system. RESULTS: Facial monitoring (396 of 508 operations) and continuous BAEP recording (229 of 399 cases with preserved hearing preoperatively) were performed routinely. With intraoperative monitoring, the rate of excellent/good facial nerve function (House/Brackmann I-II) was 88.7%. Good functional hearing (Gardner/Robertson 1-3) was preserved in 39.8%. CONCLUSION: Acoustic neuroma surgery via a retrosigmoidal approach is a safe and effective treatment for tumours with extrameatal extension. Functional results can be substantially improved by intraoperative monitoring. The interdisciplinary concept of surgery performed by ENT and neurosurgeons was particularly convincing as each pathoanatomical phase of the operation is performed by a surgeon best acquainted with the regional specialties. PMID:10896686
Class D management implementation approach of the first orbital mission of the Earth Venture series
NASA Astrophysics Data System (ADS)
Wells, James E.; Scherrer, John; Law, Richard; Bonniksen, Chris
2013-09-01
A key element of the National Research Council's Earth Science and Applications Decadal Survey called for the creation of the Venture Class line of low-cost research and application missions within NASA (National Aeronautics and Space Administration). One key component of the architecture chosen by NASA within the Earth Venture line is a series of self-contained stand-alone spaceflight science missions called "EV-Mission". The first mission chosen for this competitively selected, cost and schedule capped, Principal Investigator-led opportunity is the CYclone Global Navigation Satellite System (CYGNSS). As specified in the defining Announcement of Opportunity, the Principal Investigator is held responsible for successfully achieving the science objectives of the selected mission and the management approach that he/she chooses to obtain those results has a significant amount of freedom as long as it meets the intent of key NASA guidance like NPR 7120.5 and 7123. CYGNSS is classified under NPR 7120.5E guidance as a Category 3 (low priority, low cost) mission and carries a Class D risk classification (low priority, high risk) per NPR 8705.4. As defined in the NPR guidance, Class D risk classification allows for a relatively broad range of implementation strategies. The management approach that will be utilized on CYGNSS is a streamlined implementation that starts with a higher risk tolerance posture at NASA and that philosophy flows all the way down to the individual part level.
Dynamical Analysis and Visualization of Tornadoes Time Series
Lopes, António M.; Tenreiro Machado, J. A.
2015-01-01
In this paper we analyze the behavior of tornado time series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series spanning 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns. PMID:25790281
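The impulse-plus-spectrum recipe of the record above can be sketched directly: events become Dirac-like impulses, and a power law is fitted to the Fourier amplitude spectrum in log-log coordinates. The event times and sizes below are synthetic stand-ins, not the U.S. tornado catalog:

```python
import numpy as np

# Event catalog as a sequence of Dirac-like impulses: zeros everywhere
# except at event times, with amplitude proportional to event size.
rng = np.random.default_rng(1)
n = 4096
signal = np.zeros(n)
events = rng.choice(n, size=300, replace=False)
signal[events] = rng.pareto(2.0, size=300) + 1.0   # heavy-tailed sizes

# Amplitude spectrum via the Fourier transform, then a power-law fit
# |F(f)| ~ c * f**alpha obtained as a straight line in log-log space.
freqs = np.fft.rfftfreq(n, d=1.0)[1:]              # drop the DC bin
amp = np.abs(np.fft.rfft(signal))[1:]
alpha, log_c = np.polyfit(np.log(freqs), np.log(amp), 1)
```

For real catalogs the fitted exponent `alpha` is the "signature" parameter the authors compare across regions; for the uncorrelated synthetic events here the spectrum is essentially flat.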
Financial time series analysis based on information categorization method
NASA Astrophysics Data System (ADS)
Tian, Qiang; Shang, Pengjian; Feng, Guochen
2014-12-01
The paper applies the information categorization method to the analysis of financial time series. The method examines the similarity of different sequences by calculating the distances between them. We apply this method to quantify the similarity of different stock markets, and we report results for the US and Chinese stock markets in the periods 1991-1998 (before the Asian currency crisis), 1999-2006 (after the Asian currency crisis and before the global financial crisis), and 2007-2013 (during and after the global financial crisis). The results show how the similarity between the markets differs across time periods, and that the similarity of the two stock markets became larger after these two crises. We also obtain similarity results for 10 stock indices in three areas, showing that the method can distinguish different areas' markets in the resulting phylogenetic trees. These results show that we can extract useful information from financial markets by this method. The information categorization method can be used not only on physiologic time series, but also on financial time series.
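One concrete way to realize "distances between categorized sequences" is sketched below. The categorization scheme (quantile bins, length-2 "words", L1 distance) is an illustrative assumption, not necessarily the paper's exact definition, and the market returns are synthetic:

```python
import numpy as np

def word_distribution(returns, n_bins=8):
    """Categorize returns into quantile bins and tabulate the empirical
    distribution of length-2 bin "words" (an information-based summary;
    the paper's exact categorization scheme may differ)."""
    edges = np.quantile(returns, np.linspace(0, 1, n_bins + 1)[1:-1])
    sym = np.digitize(returns, edges)
    words = sym[:-1] * n_bins + sym[1:]
    p = np.bincount(words, minlength=n_bins * n_bins).astype(float)
    return p / p.sum()

def info_distance(r1, r2):
    """L1 distance between word distributions; small = similar dynamics."""
    return np.abs(word_distribution(r1) - word_distribution(r2)).sum()

rng = np.random.default_rng(2)
markets = {"US": rng.normal(0, 1.0, 1000),
           "CN": rng.normal(0, 1.2, 1000),
           "HK": rng.normal(0, 1.2, 1000)}
names = list(markets)
D = np.array([[info_distance(markets[a], markets[b]) for b in names]
              for a in names])
```

The symmetric distance matrix `D` is exactly the input a hierarchical clustering routine needs to build the phylogenetic trees mentioned in the abstract.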
Bicomponent Trend Maps: A Multivariate Approach to Visualizing Geographic Time Series
Schroeder, Jonathan P.
2012-01-01
The most straightforward approaches to temporal mapping cannot effectively illustrate all potentially significant aspects of spatio-temporal patterns across many regions and times. This paper introduces an alternative approach, bicomponent trend mapping, which employs a combination of principal component analysis and bivariate choropleth mapping to illustrate two distinct dimensions of long-term trend variations. The approach also employs a bicomponent trend matrix, a graphic that illustrates an array of typical trend types corresponding to different combinations of scores on two principal components. This matrix is useful not only as a legend for bicomponent trend maps but also as a general means of visualizing principal components. To demonstrate and assess the new approach, the paper focuses on the task of illustrating population trends from 1950 to 2000 in census tracts throughout major U.S. urban cores. In a single static display, bicomponent trend mapping is not able to depict as wide a variety of trend properties as some other multivariate mapping approaches, but it can make relationships among trend classes easier to interpret, and it offers some unique flexibility in classification that could be particularly useful in an interactive data exploration environment. PMID:23504193
Time-Series Analysis of Supergranule Characteristics at Solar Minimum
NASA Technical Reports Server (NTRS)
Williams, Peter E.; Pesnell, W. Dean
2013-01-01
Sixty days of Doppler images from the Solar and Heliospheric Observatory (SOHO) / Michelson Doppler Imager (MDI) investigation during the 1996 and 2008 solar minima have been analyzed to show that certain supergranule characteristics (size, size range, and horizontal velocity) exhibit fluctuations of three to five days. Cross-correlating parameters showed a good, positive correlation between supergranulation size and size range, and a moderate, negative correlation between size range and velocity. The size and velocity do exhibit a moderate, negative correlation, but with a small time lag (less than 12 hours). Supergranule sizes during five days of co-temporal data from MDI and the Solar Dynamics Observatory (SDO) / Helioseismic Magnetic Imager (HMI) exhibit similar fluctuations with a high level of correlation between them. This verifies the solar origin of the fluctuations, which cannot be caused by instrumental artifacts according to these observations. Similar fluctuations are also observed in data simulations that model the evolution of the MDI Doppler pattern over a 60-day period. Correlations between the supergranule size and size range time-series derived from the simulated data are similar to those seen in MDI data. A simple toy-model using cumulative, uncorrelated exponential growth and decay patterns at random emergence times produces a time-series similar to the data simulations. The qualitative similarities between the simulated and the observed time-series suggest that the fluctuations arise from stochastic processes occurring within the solar convection zone. This behavior, propagating to surface manifestations of supergranulation, may assist our understanding of magnetic-field-line advection, evolution, and interaction.
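The lagged cross-correlation used to detect the "moderate, negative correlation with a small time lag" can be sketched as follows; the series below are synthetic stand-ins for the supergranule size and velocity time series:

```python
import numpy as np

def best_lag(a, b, max_lag):
    """Lag (in samples) at which the cross-correlation of two standardized
    series is strongest in absolute value; positive means b lags a."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    lags = np.arange(-max_lag, max_lag + 1)
    cc = [np.mean(a[max(0, -L):len(a) - max(0, L)] *
                  b[max(0, L):len(b) - max(0, -L)]) for L in lags]
    return lags[np.argmax(np.abs(cc))]

t = np.arange(500, dtype=float)
size_series = np.sin(2 * np.pi * t / 50)     # stand-in for supergranule size
vel_series = -np.roll(size_series, 3)        # anti-correlated, lagging by 3
lag = best_lag(size_series, vel_series, max_lag=10)
```

With real MDI/HMI parameter series the lag would be read off in units of the image cadence (e.g. fractions of a day).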
Surrogate-assisted network analysis of nonlinear time series
NASA Astrophysics Data System (ADS)
Laut, Ingo; Räth, Christoph
2016-10-01
The performance of recurrence networks and symbolic networks to detect weak nonlinearities in time series is compared to the nonlinear prediction error. For the synthetic data of the Lorenz system, the network measures show a comparable performance. In the case of relatively short and noisy real-world data from active galactic nuclei, the nonlinear prediction error yields more robust results than the network measures. The tests are based on surrogate data sets. The correlations in the Fourier phases of data sets from some surrogate generating algorithms are also examined. The phase correlations are shown to have an impact on the performance of the tests for nonlinearity.
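Surrogate tests of the kind described above rest on Fourier-phase randomization: the surrogate keeps the amplitude spectrum (hence the linear correlations) of the data while destroying nonlinear structure. A minimal sketch of the classic phase-randomized surrogate generator:

```python
import numpy as np

def phase_surrogate(x, seed=None):
    """Fourier surrogate: keep the amplitude spectrum, randomize the phases.
    Shares the linear correlations of x but destroys nonlinear structure,
    providing the null ensemble for nonlinearity tests."""
    rng = np.random.default_rng(seed)
    X = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, X.size)
    phases[0] = 0.0                     # keep the mean real
    if x.size % 2 == 0:
        phases[-1] = 0.0                # Nyquist bin must stay real
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=x.size)

rng = np.random.default_rng(6)
x = np.sin(np.linspace(0, 40 * np.pi, 1024)) + 0.1 * rng.normal(size=1024)
s = phase_surrogate(x, seed=7)
```

A nonlinearity test then compares a discriminating statistic (nonlinear prediction error, network measure, etc.) on the data against its distribution over an ensemble of such surrogates. Note that some surrogate-generating algorithms leave residual phase correlations, which is exactly the caveat the abstract raises.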
Analysis of the in-orbit behavior of two series of US early-warning system geostationary satellites
NASA Astrophysics Data System (ADS)
Sukhov, P. P.; Epishev, V. P.; Sukhov, K. P.; Motrunych, I. I.
2016-09-01
Satellites of the US Air Force SBIRS-class early-warning system are to replace the earlier DSP series in GEO. During 2014-2016 the authors obtained more than 30 light curves of "DSP-18" and "SBIRS-GEO 2". The behavior of these satellites in orbit is analyzed using coordinate and photometric data. It is shown that four SBIRS units placed in GEO, spaced 90 degrees apart, are sufficient for monitoring the Earth's surface.
Spectral analysis of time series of categorical variables in earth sciences
NASA Astrophysics Data System (ADS)
Pardo-Igúzquiza, Eulogio; Rodríguez-Tovar, Francisco J.; Dorador, Javier
2016-10-01
Time series of categorical variables often appear in Earth Science disciplines and there is considerable interest in studying their cyclic behavior. This is true, for example, when the type of facies, petrofabric features, ichnofabrics, fossil assemblages or mineral compositions are measured continuously over a core or throughout a stratigraphic succession. Here we deal with the problem of applying spectral analysis to such sequences. A full indicator approach is proposed to complement the spectral envelope often used in other disciplines. Additionally, a stand-alone computer program is provided for calculating the spectral envelope, in this case implementing the permutation test to assess the statistical significance of the spectral peaks. We studied simulated sequences as well as real data in order to illustrate the methodology.
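The full indicator approach mentioned above expands the categorical sequence into one 0/1 indicator series per category and inspects each indicator's periodogram. A sketch with a synthetic facies log (a period-5 cycle plus random noise; the cycle and noise level are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1000
pattern = "AABBC"                     # synthetic facies cycle, period 5
cats = np.array([pattern[i % len(pattern)] for i in range(n)])
flip = rng.random(n) < 0.1            # 10% of samples randomly perturbed
cats[flip] = rng.choice(list("ABC"), flip.sum())

# One indicator (0/1) series per category; periodogram of each indicator.
freqs = np.fft.rfftfreq(n)
power = {}
for c in "ABC":
    ind = (cats == c).astype(float)
    power[c] = np.abs(np.fft.rfft(ind - ind.mean())) ** 2

peak = freqs[np.argmax(power["A"])]   # recovers the 1/5 cycles-per-sample rhythm
```

In practice the statistical significance of such peaks would be assessed with a permutation test, as the abstract describes for the spectral envelope.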
A Global Approach to Image Texture Analysis
1990-03-01
segmented images based on texture by convolution with small masks ranging from 3 x 3 to 7 x 7 pixels. The local approach is not optimal for the sea ice... image, then differences of texture will clearly be reflected in the two-dimensional power spectrum of the image. To look at spectral distribution...resulting from convolutions with Laws' masks are actually the values of image energy falling in a series of spectral bins. Consider the seventh-order
Anatomy of the ICDS series: A bibliometric analysis
NASA Astrophysics Data System (ADS)
Cardona, Manuel; Marx, Werner
2007-12-01
In this article, the proceedings of the International Conferences on Defects in Semiconductors (ICDS) have been analyzed by bibliometric methods. The papers of these conferences have been published as articles in regular journals or special proceedings journals and in books with diverse publishers. The conference name/title changed several times. Many of the proceedings did not appear in the so-called "source journals" covered by the Thomson/ISI citation databases, in particular by the Science Citation Index (SCI). But the number of citations within these source journals can be determined using the Cited Reference Search mode under the Web of Science (WoS) and the SCI offered by the host STN International. The search functions of both systems were needed to select the papers published as different document types and to cover the full time span of the series. The most cited ICDS papers were identified, and the overall numbers of citations, as well as the time-dependent impact of these papers, of single conferences, and of the complete series, were established. The complete set of citing papers was analyzed with respect to the countries of the citing authors, the citing journals, and the ISI subject categories.
A new complexity measure for time series analysis and classification
NASA Astrophysics Data System (ADS)
Nagaraj, Nithin; Balasubramanian, Karthi; Dey, Sutirth
2013-07-01
Complexity measures are used in a number of applications, including extraction of information from data such as ecological time series, detection of non-random structure in biomedical signals, testing of random number generators, language recognition, and authorship attribution. Different complexity measures proposed in the literature, such as Shannon entropy, relative entropy, Lempel-Ziv, Kolmogorov and algorithmic complexity, are mostly ineffective in analyzing short sequences that are further corrupted with noise. To address this problem, we propose a new complexity measure ETC and define it as the "Effort To Compress" the input sequence by a lossless compression algorithm. Here, we employ the lossless compression algorithm known as Non-Sequential Recursive Pair Substitution (NSRPS) and define ETC as the number of iterations needed for NSRPS to transform the input sequence to a constant sequence. We demonstrate the utility of ETC in two applications. ETC is shown to have better correlation with the Lyapunov exponent than Shannon entropy, even for relatively short and noisy time series. The measure also has a greater rate of success in automatic identification and classification of short noisy sequences, compared to entropy and a popular measure based on Lempel-Ziv compression (implemented by Gzip).
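The ETC measure can be sketched directly from its definition: count NSRPS iterations until the sequence is constant. This is a simplified sketch assuming integer-coded symbols; in particular, the most frequent pair is found with overlapping counts, whereas a strict NSRPS implementation counts non-overlapping occurrences:

```python
from collections import Counter

def etc(seq):
    """Effort To Compress: number of NSRPS iterations needed to reduce an
    integer-coded sequence to a constant sequence. Each iteration replaces
    the most frequent adjacent pair with a fresh symbol."""
    seq = list(seq)
    steps = 0
    new_symbol = max(seq, default=0) + 1
    while len(set(seq)) > 1:
        pair = Counter(zip(seq, seq[1:])).most_common(1)[0][0]
        out, i = [], 0
        while i < len(seq):                 # left-to-right, non-overlapping
            if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
                out.append(new_symbol)
                i += 2
            else:
                out.append(seq[i])
                i += 1
        seq, new_symbol, steps = out, new_symbol + 1, steps + 1
    return steps
```

A constant sequence needs zero effort, a strictly periodic one collapses in a few substitutions, and noisy or chaotic sequences require many more, which is what makes ETC usable as a complexity measure for short series.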
NASA Astrophysics Data System (ADS)
Baldysz, Zofia; Nykiel, Grzegorz; Figurski, Mariusz; Szafranek, Karolina; Kroszczynski, Krzysztof; Araszkiewicz, Andrzej
2015-04-01
In recent years, the GNSS system has begun to play an increasingly important role in research related to climate monitoring. Based on the GPS system, which has the longest operational history in comparison with other systems, and a common computational strategy applied to all observations, long and homogeneous ZTD (Zenith Tropospheric Delay) time series were derived. This paper presents results of an analysis of 16-year ZTD time series obtained from the EPN (EUREF Permanent Network) reprocessing performed by the Military University of Technology. To maintain the uniformity of the data, the analyzed period of time (1998-2013) is exactly the same for all stations: observations carried out before 1998 were removed from the time series, and observations processed using a different strategy were recalculated according to the MUT LAC approach. For all 16-year time series (59 stations), Lomb-Scargle periodograms were created to obtain information about the oscillations in the ZTD time series. Because strong annual oscillations disturb the character of oscillations with smaller amplitude and thus hinder their investigation, Lomb-Scargle periodograms were also created for time series with the annual oscillations removed, in order to verify the presence of semi-annual, ter-annual and quarto-annual oscillations. The linear trend and seasonal components were estimated using LSE (Least Squares Estimation), and the Mann-Kendall trend test was used to confirm the presence of the linear trend designated by the LSE method. In order to verify the effect of the length of the time series on the estimated size of the linear trend, a comparison between two different lengths of ZTD time series was performed. To carry out the comparative analysis, 30 stations which have been operating since 1996 were selected. For these stations two periods of time were analyzed: a shortened 16-year period (1998-2013) and the full 18-year period (1996-2013). For some stations an additional two years of observations have significant impact on changing the size of linear
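The LSE step described above (linear trend plus seasonal sinusoids) is an ordinary least-squares fit with a harmonic design matrix. A sketch on a synthetic daily ZTD series; the trend, amplitudes and noise level are illustrative values, not the EPN results:

```python
import numpy as np

# Synthetic 16-year daily ZTD series: trend + annual + semi-annual + noise.
t = np.arange(0, 16 * 365.25) / 365.25          # time in years
rng = np.random.default_rng(4)
ztd = (2400 + 1.2 * t + 50 * np.cos(2 * np.pi * t)
       + 8 * np.sin(4 * np.pi * t) + rng.normal(0, 10, t.size))  # mm

# Design matrix: intercept, trend, annual and semi-annual sine/cosine pairs.
A = np.column_stack([np.ones_like(t), t,
                     np.cos(2 * np.pi * t), np.sin(2 * np.pi * t),
                     np.cos(4 * np.pi * t), np.sin(4 * np.pi * t)])
coef, *_ = np.linalg.lstsq(A, ztd, rcond=None)
trend_mm_per_year = coef[1]
annual_amplitude = np.hypot(coef[2], coef[3])
```

Repeating the fit on the 1998-2013 and 1996-2013 subsets of a real station series would reproduce the length-sensitivity comparison the abstract describes; the Mann-Kendall test would then be applied to the detrended residual check.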
Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P
2014-10-01
There are now domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, like medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events, and compares two time series by analyzing the events they have in common. We have applied our framework to time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed schema has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains generating iconographic time series, such as electrocardiography (ECG).
Fourier series analysis of fractal lenses: theory and experiments with a liquid-crystal display.
Davis, Jeffrey A; Sigarlaki, Sean P; Craven, Julia M; Calvo, María Luisa
2006-02-20
We report on a Fourier series approach that predicts the focal points and intensities produced by fractal zone plate lenses. This approach allows us to separate the effects of the fractal order from those of the lens aperture. We implement these fractal lenses onto a liquid-crystal display and show experimental verification of our theory.
Different approaches of spectral analysis
NASA Technical Reports Server (NTRS)
Lacoume, J. L.
1977-01-01
Several approaches to the problem of calculating the spectral power density of a random function from an estimate of the autocorrelation function were studied. A comparative study of these different methods was presented. The principles on which they are based and the hypotheses implied were pointed out. Some indications on the optimization of the length of the estimated correlation function were given. An example of the application of the different methods discussed in this paper was included.
Multiscale InSAR Time Series (MInTS) analysis of surface deformation
NASA Astrophysics Data System (ADS)
Hetland, E. A.; Muse, P.; Simons, M.; Lin, Y. N.; Agram, P. S.; DiCaprio, C. J.
2011-12-01
We present a new approach to extracting spatially and temporally continuous ground deformation fields from interferometric synthetic aperture radar (InSAR) data. We focus on unwrapped interferograms from a single viewing geometry, estimating ground deformation along the line-of-sight. Our approach is based on a wavelet decomposition in space and a general parametrization in time. We refer to this approach as MInTS (Multiscale InSAR Time Series). The wavelet decomposition efficiently deals with commonly seen spatial covariances in repeat-pass InSAR measurements, such that coefficients of the wavelets are essentially spatially uncorrelated. Our time-dependent parametrization is capable of capturing both recognized and unrecognized processes, and is not arbitrarily tied to the times of the SAR acquisitions. We estimate deformation in the wavelet-domain, using a cross-validated, regularized least-squares inversion. We include a model-resolution-based regularization, in order to more heavily damp the model during periods of sparse SAR acquisitions, compared to during times of dense acquisitions. To illustrate the application of MInTS, we consider a catalog of 92 ERS and Envisat interferograms, spanning 16 years, in the Long Valley caldera, CA, region. MInTS analysis captures the ground deformation with high spatial density over the Long Valley region.
Multiscale InSAR Time Series (MInTS) analysis of surface deformation
NASA Astrophysics Data System (ADS)
Hetland, E. A.; Musé, P.; Simons, M.; Lin, Y. N.; Agram, P. S.; Dicaprio, C. J.
2012-02-01
We present a new approach to extracting spatially and temporally continuous ground deformation fields from interferometric synthetic aperture radar (InSAR) data. We focus on unwrapped interferograms from a single viewing geometry, estimating ground deformation along the line-of-sight. Our approach is based on a wavelet decomposition in space and a general parametrization in time. We refer to this approach as MInTS (Multiscale InSAR Time Series). The wavelet decomposition efficiently deals with commonly seen spatial covariances in repeat-pass InSAR measurements, since the coefficients of the wavelets are essentially spatially uncorrelated. Our time-dependent parametrization is capable of capturing both recognized and unrecognized processes, and is not arbitrarily tied to the times of the SAR acquisitions. We estimate deformation in the wavelet-domain, using a cross-validated, regularized least squares inversion. We include a model-resolution-based regularization, in order to more heavily damp the model during periods of sparse SAR acquisitions, compared to during times of dense acquisitions. To illustrate the application of MInTS, we consider a catalog of 92 ERS and Envisat interferograms, spanning 16 years, in the Long Valley caldera, CA, region. MInTS analysis captures the ground deformation with high spatial density over the Long Valley region.
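The temporal step of such a wavelet-domain inversion can be sketched in a few lines: each interferogram constrains the difference in deformation between two SAR epochs, and each wavelet coefficient gets its own small damped least-squares problem. This is an illustrative toy, not the authors' MInTS implementation; the epoch pairs, the plain Tikhonov damping (standing in for the paper's model-resolution-based regularization), and all numbers are assumptions.

```python
import numpy as np

def build_design(pairs, n_epochs):
    """G[i, j] = +1 at the later epoch, -1 at the earlier epoch of pair i."""
    G = np.zeros((len(pairs), n_epochs))
    for i, (early, late) in enumerate(pairs):
        G[i, early] = -1.0
        G[i, late] = 1.0
    return G

def regularized_lsq(G, d, lam):
    """Damped least squares: minimize ||G m - d||^2 + lam * ||m||^2."""
    n = G.shape[1]
    return np.linalg.solve(G.T @ G + lam * np.eye(n), G.T @ d)

# Toy example: 4 epochs, 4 interferometric pairs, 1 mm/epoch steady deformation.
# Only relative deformation is constrained (G has a constant-shift nullspace),
# so the damped solution is the minimum-norm one; epoch differences are exact.
pairs = [(0, 1), (1, 2), (2, 3), (0, 2)]
m_true = np.array([0.0, 1.0, 2.0, 3.0])
G = build_design(pairs, 4)
d = G @ m_true                       # noise-free synthetic range changes
m = regularized_lsq(G, d, lam=1e-6)  # recovers m_true up to a constant shift
```

In the full scheme this inversion would be repeated independently for every spatial wavelet coefficient, with the damping weight chosen by cross-validation as the abstract describes.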
A multi-modal treatment approach for the shoulder: A 4 patient case series
Pribicevic, Mario; Pollard, Henry
2005-01-01
Background This paper describes the clinical management of four cases of shoulder impingement syndrome using a conservative multimodal treatment approach. Clinical Features Four patients presented to a chiropractic clinic with chronic shoulder pain, tenderness in the shoulder region and a limited range of motion with pain and catching. After physical and orthopaedic examination a clinical diagnosis of shoulder impingement syndrome was reached. The four patients were admitted to a multi-modal treatment protocol including soft tissue therapy (ischaemic pressure and cross-friction massage), 7 minutes of phonophoresis (driving of medication into tissue with ultrasound) with 1% cortisone cream, diversified spinal and peripheral joint manipulation and rotator cuff and shoulder girdle muscle exercises. The outcome measures for the study were subjective/objective visual analogue pain scales (VAS), range of motion (goniometer) and return to normal daily, work and sporting activities. All four subjects at the end of the treatment protocol were symptom free with all outcome measures being normal. At 1 month follow up all patients continued to be symptom free with full range of motion and complete return to normal daily activities. Conclusion This case series demonstrates the potential benefit of a multimodal chiropractic protocol in resolving symptoms associated with a suspected clinical diagnosis of shoulder impingement syndrome. PMID:16168053
A sequential approach to calibrate ecosystem models with multiple time series data
NASA Astrophysics Data System (ADS)
Oliveros-Ramos, Ricardo; Verley, Philippe; Echevin, Vincent; Shin, Yunne-Jai
2017-02-01
When models are intended to support decision-making, their credibility is essential. Model fit to observed data is one major criterion for assessing such credibility. However, because the complexity of ecosystem models makes their calibration challenging, the scientific community has given more attention to the exploration of model behavior than to rigorous comparison with observations. This work highlights some issues related to the comparison of complex ecosystem models to data and proposes a methodology for sequential multi-phase calibration (or parameter estimation) of ecosystem models. We first propose two criteria to classify the parameters of a model: the model dependency and the time variability of the parameters. These criteria, together with the availability of approximate initial estimates, are then used as decision rules to determine which parameters need to be estimated and their precedence order in the sequential calibration process. The end-to-end (E2E) ecosystem model ROMS-PISCES-OSMOSE applied to the Northern Humboldt Current Ecosystem is used as an illustrative case study. The model is calibrated using an evolutionary algorithm and a likelihood approach to fit time series data of landings, abundance indices, and catch-at-length distributions from 1992 to 2008. Testing different calibration schemes with respect to the number of phases, the precedence of parameter estimation, and the treatment of time-varying parameters, the results show that multiple-phase calibration conducted under our criteria improved the model fit.
Cluster analysis of long time-series medical datasets
NASA Astrophysics Data System (ADS)
Hirano, Shoji; Tsumoto, Shusaku
2004-04-01
This paper presents a comparative study of the characteristics of clustering methods for inhomogeneous time-series medical datasets. Using various combinations of comparison methods and grouping methods, we performed clustering experiments on the hepatitis data set and evaluated the validity of the results. The results suggested that (1) the complete-linkage (CL) criterion in agglomerative hierarchical clustering (AHC) outperformed the average-linkage (AL) criterion in terms of the interpretability of a dendrogram and of the clustering results, (2) the combination of dynamic time warping (DTW) and CL-AHC consistently produced interpretable results, (3) the combination of DTW and rough clustering (RC) can be used to find the core sequences of the clusters, and (4) multiscale matching may suffer from the treatment of 'no-match' pairs; however, this problem may be avoided by using RC as a subsequent grouping method.
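The two ingredients highlighted in finding (2), dynamic time warping and complete-linkage agglomerative clustering, can be sketched as follows. This is a generic textbook implementation, not the authors' code; the toy series and cluster count are assumptions.

```python
import numpy as np

def dtw(a, b):
    """Classic O(len(a)*len(b)) dynamic time warping distance."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def complete_linkage(dist, k):
    """Greedily merge clusters until k remain, using max (complete) linkage."""
    clusters = [[i] for i in range(len(dist))]
    while len(clusters) > k:
        best = None
        for p in range(len(clusters)):
            for q in range(p + 1, len(clusters)):
                d = max(dist[i][j] for i in clusters[p] for j in clusters[q])
                if best is None or d < best[0]:
                    best = (d, p, q)
        _, p, q = best
        clusters[p] += clusters.pop(q)
    return clusters

# Inhomogeneous toy data: two sinusoids and two linear trends, all of
# different lengths -- exactly the case where DTW helps.
series = [np.sin(np.linspace(0, 6, 50)),
          np.sin(np.linspace(0, 6, 60)),
          np.linspace(0, 1, 55),
          np.linspace(0, 1.1, 45)]
n = len(series)
dist = [[dtw(series[i], series[j]) for j in range(n)] for i in range(n)]
groups = complete_linkage(dist, k=2)  # sinusoids vs trends
```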
Joint analysis of celestial pole offset and free core nutation series
NASA Astrophysics Data System (ADS)
Malkin, Zinovy
2016-10-01
Three combined celestial pole offset (CPO) series computed at the Paris Observatory (C04), the United States Naval Observatory (USNO), and the International VLBI Service for Geodesy and Astrometry (IVS), as well as six free core nutation (FCN) models, were compared from different perspectives, such as stochastic and systematic differences, and FCN amplitude and phase variations. The differences between the C04 and IVS CPO series were mostly stochastic, whereas a low-frequency bias at the level of several tens of μas was found between the C04 and USNO CPO series. The stochastic differences between the C04 and USNO series became considerably smaller when computed at the IVS epochs, which can indicate possible problems with the interpolation of the IVS data at the midnight epochs during the computation of the C04 and USNO series. The comparison of the FCN series showed that the series computed with similar window widths of 1.1-1.2 years were close to one another at a level of 10-20 μas, whereas the differences between these series and the series computed with a larger window width of 4 and 7 years reached 100 μas. The dependence of the FCN model on the underlying CPO series was investigated. The RMS differences between the FCN models derived from the C04, USNO, and IVS CPO series were at a level of approximately 15 μas, which was considerably smaller than the differences among the CPO series. The analysis of the differences between the IVS, C04, and USNO CPO series suggested that the IVS series would be preferable for both precession-nutation and FCN-related studies.
CADDIS Volume 4. Data Analysis: Selecting an Analysis Approach
An approach for selecting statistical analyses to inform causal analysis. Describes methods for determining whether test site conditions differ from reference expectations. Describes an approach for estimating stressor-response relationships.
NASA Astrophysics Data System (ADS)
Hinnov, L. A.; Yao, X.; Zhou, Y.
2014-12-01
We describe a Middle Permian radiolarian chert sequence in South China (Chaohu area), with the sequence of chert and mudstone layers encoded as binary series. Two interpolation approaches were tested: linear interpolation resulting in a "triangle" series, and staircase interpolation resulting in a "boxcar" series. Spectral analysis of the triangle series reveals decimeter-scale chert-mudstone cycles that represent theoretical Middle Permian 32-kyr obliquity cycling. Tuning these cycles to a 32-kyr periodicity reveals that other cm-scale cycles fall in the precession index band and have a strong ~400 kyr amplitude modulation. Additional tuning tests further support a hypothesis of astronomical forcing of the chert sequence. Analysis of the boxcar series reveals additional "eccentricity" terms transmitted by the boxcar representation of the modulating precession-scale cycles. An astronomical time scale reconstructed from these results assumes a Roadian/Wordian boundary age of 268.8 Ma for the onset of the first chert layer at the base of the sequence and ends at 264.1 Ma, for a total duration of 4.7 Myr. We propose that monsoon-controlled upwelling contributed to the development of the chert-mudstone cycles. A seasonal monsoon controlled by astronomical forcing influenced the intensity of upwelling, modulating radiolarian productivity and silica deposition.
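The two interpolation schemes can be illustrated with a toy binary lithology log: a chert (1) / mudstone (0) series known at irregular depths is resampled onto an even grid either linearly ("triangle") or as a staircase ("boxcar"), and the dominant cycle is read from the amplitude spectrum. The depths, the 8-unit cycle, and the FFT-based period pick are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)
irregular = np.sort(rng.uniform(0, 200, 400))                     # layer depths
binary = (np.sin(2 * np.pi * irregular / 8.0) > 0).astype(float)  # 8-unit cycle

grid = np.arange(0, 200, 0.25)                     # even resampling grid
triangle = np.interp(grid, irregular, binary)      # linear ("triangle") series
idx = np.searchsorted(irregular, grid, side="right") - 1
boxcar = binary[np.clip(idx, 0, len(binary) - 1)]  # staircase ("boxcar") series

def dominant_period(series, dx):
    """Period of the largest peak in the amplitude spectrum (mean removed)."""
    amp = np.abs(np.fft.rfft(series - series.mean()))
    freqs = np.fft.rfftfreq(len(series), d=dx)
    return 1.0 / freqs[np.argmax(amp)]

p_tri = dominant_period(triangle, 0.25)  # both recover the 8-unit cycle;
p_box = dominant_period(boxcar, 0.25)    # the boxcar spectrum carries extra
                                         # harmonics, as the abstract notes
```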
Taxation in Public Education. Analysis and Bibliography Series, No. 12.
ERIC Educational Resources Information Center
Ross, Larry L.
Intended for both researchers and practitioners, this analysis and bibliography cites approximately 100 publications on educational taxation, including general texts and reports, statistical reports, taxation guidelines, and alternative proposals for taxation. Topics covered in the analysis section include State and Federal aid, urban and suburban…
Supported Employment: Review of Literature. Policy Analysis Series, No. 26.
ERIC Educational Resources Information Center
Minnesota Governor's Planning Council on Developmental Disabilities, St. Paul.
This review of the literature on supported employment for individuals with severe disabilities begins by outlining two Federal definitions of supported employment and noting their similarities. Literature on approaches to supported employment is examined, focusing on individual jobs at distributed or scattered sites, enclaves, mobile crews, and…
Data Reorganization for Optimal Time Series Data Access, Analysis, and Visualization
NASA Astrophysics Data System (ADS)
Rui, H.; Teng, W. L.; Strub, R.; Vollmer, B.
2012-12-01
The way data are archived is often not optimal for their access by many user communities (e.g., hydrological), particularly if the data volumes and/or number of data files are large. The number of data records of a non-static data set generally increases with time. Therefore, most data sets are commonly archived by time steps, one step per file, often containing multiple variables. However, many research and application efforts need time series data for a given geographical location or area, i.e., a data organization that is orthogonal to the way the data are archived. The retrieval of a time series of the entire temporal coverage of a data set for a single variable at a single data point, in an optimal way, is an important and longstanding challenge, especially for large science data sets (i.e., with volumes greater than 100 GB). Two examples of such large data sets are the North American Land Data Assimilation System (NLDAS) and Global Land Data Assimilation System (GLDAS), archived at the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC; Hydrology Data Holdings Portal, http://disc.sci.gsfc.nasa.gov/hydrology/data-holdings). To date, the NLDAS data set, hourly 0.125x0.125° from Jan. 1, 1979 to present, has a total volume greater than 3 TB (compressed). The GLDAS data set, 3-hourly and monthly 0.25x0.25° and 1.0x1.0° Jan. 1948 to present, has a total volume greater than 1 TB (compressed). Both data sets are accessible, in the archived time step format, via several convenient methods, including Mirador search and download (http://mirador.gsfc.nasa.gov/), GrADS Data Server (GDS; http://hydro1.sci.gsfc.nasa.gov/dods/), direct FTP (ftp://hydro1.sci.gsfc.nasa.gov/data/s4pa/), and Giovanni Online Visualization and Analysis (http://disc.sci.gsfc.nasa.gov/giovanni). However, users who need long time series currently have no efficient way to retrieve them. Continuing a longstanding tradition of facilitating data access, analysis, and
Pitfalls in Fractal Time Series Analysis: fMRI BOLD as an Exemplary Case
Eke, Andras; Herman, Peter; Sanganahalli, Basavaraju G.; Hyder, Fahmeed; Mukli, Peter; Nagy, Zoltan
2012-01-01
This article builds on our previous work demonstrating the importance of adhering to a carefully selected set of criteria when choosing, from the methods available, one that performs adequately when applied to real temporal signals such as fMRI BOLD, in order to evaluate one important facet of their behavior: fractality. Earlier, we reviewed a range of monofractal tools and evaluated their performance. Given the advances in the fractal field, in this article we also discuss the most widely used implementations of multifractal analyses. Our recommended flowchart for the fractal characterization of spontaneous, low-frequency fluctuations in fMRI BOLD is used as the framework for this article, to ensure that it provides hands-on experience for the reader in handling the perplexing issues of fractal analysis. This particular signal modality and its fractal analysis were chosen because of their high impact on today's neuroscience, fMRI BOLD having powerfully emerged as a new way of interpreting the complex functioning of the brain (see "intrinsic activity"). The reader is first presented with the basic concepts of mono- and multifractal time series analyses, followed by some of the most relevant implementations and their characterization by numerical approaches. The notion of the dichotomy of the fractional Gaussian noise and fractional Brownian motion signal classes, and its impact on fractal time series analyses, is thoroughly discussed as the central theme of our application strategy. Sources of pitfalls and ways to avoid them are identified, followed by a demonstration on fractal studies of fMRI BOLD taken from the literature and from our own work, in an attempt to consolidate best practice in the fractal analysis of empirical fMRI BOLD signals mapped throughout the brain as an exemplary case of potentially wide interest. PMID:23227008
InSAR and GPS time series analysis: Crustal deformation in the Yucca Mountain, Nevada region
NASA Astrophysics Data System (ADS)
Li, Z.; Hammond, W. C.; Blewitt, G.; Kreemer, C. W.; Plag, H.
2010-12-01
Several previous studies have successfully demonstrated that long time series (e.g. >5 years) of GPS measurements can be employed to detect tectonic signals with a vertical rate greater than 0.3 mm/yr (e.g. Hill and Blewitt, 2006; Bennett et al. 2009). However, GPS stations are often sparse, with spacing from a few kilometres to a few hundred kilometres. Interferometric SAR (InSAR) can complement GPS by providing high horizontal spatial resolution (e.g. metres to tens of metres) over large regions (e.g. 100 km × 100 km). A major source of error for repeat-pass InSAR is the phase delay in radio signal propagation through the atmosphere. The portion of this attributable to tropospheric water vapour causes errors as large as 10-20 cm in deformation retrievals. InSAR Time Series analysis with Atmospheric Estimation Models (InSAR TS + AEM), developed at the University of Glasgow, is a robust time series analysis approach, which mainly uses interferograms with small geometric baselines to minimise the effects of decorrelation and inaccuracies in topographic data. In addition, InSAR TS + AEM can be used to separate deformation signals from atmospheric water vapour effects in order to map surface deformation as it evolves in time. The principal purposes of this study are to assess: (1) how consistent InSAR-derived deformation time series are with GPS; and (2) how precise InSAR-derived atmospheric path delays can be. The Yucca Mountain, Nevada region is chosen as the study site because of its excellent GPS network and extensive radar archives (>10 years of dense and high-quality GPS stations, and >17 years of ERS and ENVISAT radar acquisitions), and because of its arid environment. The latter results in coherence that is generally high, even for long periods that span the existing C-band radar archives of ERS and ENVISAT. Preliminary results show that our InSAR LOS deformation map agrees with GPS measurements to within 0.35 mm/yr RMS misfit at the stations which is the
A novel water quality data analysis framework based on time-series data mining.
Deng, Weihui; Wang, Guoyin
2017-03-18
The rapid development of time-series data mining provides an emerging method for water resource management research. In this paper, based on the time-series data mining methodology, we propose a novel and general analysis framework for water quality time-series data. It consists of two parts: implementation components and common tasks of time-series data mining in water quality data. In the first part, we propose to granulate the time series into several two-dimensional normal clouds and calculate the similarities at the granulated level. On the basis of the similarity matrix, the similarity search, anomaly detection, and pattern discovery tasks on the water quality time-series dataset can be easily implemented in the second part. We present a case study of this analysis framework on weekly Dissolved Oxygen (DO) time-series data collected from five monitoring stations on the upper reaches of the Yangtze River, China. The case study uncovered the relationship between water quality in the mainstream and its tributaries, as well as the main changing patterns of DO. The experimental results show that the proposed analysis framework is a feasible and efficient method for mining hidden and valuable knowledge from historical water quality time-series data.
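The granulate-then-compare idea can be sketched generically. Here per-window mean and standard deviation stand in for the paper's two-dimensional normal clouds, and Euclidean distance stands in for its cloud similarity measure; these substitutions, and the toy data with an injected anomaly, are illustrative assumptions.

```python
import numpy as np

def granulate(x, w):
    """Summarize non-overlapping windows of width w by (mean, std)."""
    segs = x[:len(x) // w * w].reshape(-1, w)
    return np.column_stack([segs.mean(axis=1), segs.std(axis=1)])

rng = np.random.default_rng(4)
x = np.sin(np.linspace(0, 20 * np.pi, 520)) + rng.normal(0, 0.1, 520)
x[260:300] += 3.0                   # injected anomalous excursion

g = granulate(x, w=26)              # 20 granules of 26 samples each
# pairwise distance matrix between granule summaries (similarity stand-in)
d = np.linalg.norm(g[:, None, :] - g[None, :, :], axis=-1)
# anomaly detection: the granule least similar to all the others
anomaly = int(np.argmax(d.mean(axis=1)))   # falls in the shifted region
```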
Shallice, Tim; Buiatti, Tania
2011-10-01
The paper addresses a weakness in the Schwartz and Dell paper (2010)-namely, its discussion of the inclusion criteria for case series. The paper distinguishes the different types that exist and how they constrain the theoretical conclusions that one can draw about the organization of the normal cognitive system. Four different types of inclusion criteria are considered. Two are those treated by Schwartz and Dell-namely, theoretically derived clinical criteria, such as the example of semantic dementia, and broad clinical criteria such as the presence of aphasia. In addition, in the present paper two different types of anatomically based criteria are assessed-those using anatomical regions selected a priori and also regions selected as a result of an anatomical group study analysis. Putative functional syndromes are argued to be the empirical building blocks for cognitive neuropsychology. Anatomically based case series can aid in their construction or in their fractionation.
Methodology Series Module 6: Systematic Reviews and Meta-analysis
Setia, Maninder Singh
2016-01-01
Systematic reviews and meta-analyses have become an important part of the biomedical literature, and they provide the "highest level of evidence" for various clinical questions. There are many studies – sometimes with contradictory conclusions – on a particular topic in the literature. Hence, as a clinician, which results will you believe? What will you tell your patient? Which drug is better? A systematic review or a meta-analysis may help us answer these questions. In addition, it may also help us understand the quality of the articles in the literature or the type of studies that have been conducted and published (for example, randomized trials or observational studies). The first step is to identify a research question for the systematic review or meta-analysis. The next step is to identify the articles that will be included in the study. This is done by searching various databases; it is important that the researcher search for articles in more than one database. It is also useful to form a group of researchers and statisticians with expertise in conducting systematic reviews and meta-analyses before initiating them. We strongly encourage readers to register their proposed review/meta-analysis with PROSPERO. Finally, these studies should be reported according to the Preferred Reporting Items for Systematic Reviews and Meta-analysis checklist. PMID:27904176
Variance Analysis of Unevenly Spaced Time Series Data
NASA Technical Reports Server (NTRS)
Hackman, Christine; Parker, Thomas E.
1996-01-01
We have investigated the effect of uneven data spacing on the computation of delta(sub chi)(gamma). Evenly spaced simulated data sets were generated for noise processes ranging from white phase modulation (PM) to random walk frequency modulation (FM). Delta(sub chi)(gamma) was then calculated for each noise type. Data were subsequently removed from each simulated data set using typical two-way satellite time and frequency transfer (TWSTFT) data patterns to create two unevenly spaced sets with average intervals of 2.8 and 3.6 days. Delta(sub chi)(gamma) was then calculated for each sparse data set using two different approaches. First, the missing data points were replaced by linear interpolation and delta(sub chi)(gamma) was calculated from this now-full data set. The second approach ignored the fact that the data were unevenly spaced and calculated delta(sub chi)(gamma) as if the data were equally spaced with an average spacing of 2.8 or 3.6 days. Both approaches have advantages and disadvantages, and techniques are presented for correcting errors caused by uneven data spacing in typical TWSTFT data sets.
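The first approach (fill the gaps by linear interpolation, then treat the record as evenly spaced) can be sketched with a standard overlapping Allan variance standing in for the paper's statistic, which is an assumption; the gap fraction, noise type, and numbers are also illustrative.

```python
import numpy as np

def allan_variance(x, tau0, m):
    """Overlapping Allan variance from evenly spaced phase data x."""
    x = np.asarray(x, dtype=float)
    d = x[2 * m:] - 2 * x[m:-m] + x[:-2 * m]     # second differences of phase
    return np.mean(d ** 2) / (2.0 * (m * tau0) ** 2)

rng = np.random.default_rng(1)
t = np.arange(200.0)                     # nominal daily epochs
x = rng.normal(0.0, 1.0, t.size)         # white phase modulation (white PM)

# Mimic a TWSTFT-like gap pattern by dropping ~30% of epochs at random,
# then restore an even grid by linear interpolation.
keep = np.sort(rng.choice(t.size, size=140, replace=False))
x_filled = np.interp(t, t[keep], x[keep])

avar_full = allan_variance(x, tau0=1.0, m=1)
avar_filled = allan_variance(x_filled, tau0=1.0, m=1)
# Interpolated points smooth the series, so for white PM the filled estimate
# is biased low -- the kind of error the abstract's corrections address.
```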
A unified nonlinear stochastic time series analysis for climate science.
Moon, Woosok; Wettlaufer, John S
2017-03-13
Earth's orbit and axial tilt imprint a strong seasonal cycle on climatological data. Climate variability is typically viewed in terms of fluctuations in the seasonal cycle induced by higher frequency processes. We can interpret this as a competition between the orbitally enforced monthly stability and the fluctuations/noise induced by weather. Here we introduce a new time-series method that determines these contributions from monthly-averaged data. We find that the spatio-temporal distribution of the monthly stability and the magnitude of the noise reveal key fingerprints of several important climate phenomena, including the evolution of the Arctic sea ice cover, the El Niño Southern Oscillation (ENSO), the Atlantic Niño and the Indian Dipole Mode. In analogy with the classical destabilising influence of the ice-albedo feedback on summertime sea ice, we find that during some time interval of the season a destabilising process operates in all of these climate phenomena. The interaction between the destabilisation and the accumulation of noise, which we term the memory effect, underlies phase locking to the seasonal cycle and the statistical nature of seasonal predictability.
Physiological time-series analysis: what does regularity quantify?
NASA Technical Reports Server (NTRS)
Pincus, S. M.; Goldberger, A. L.
1994-01-01
Approximate entropy (ApEn) is a recently developed statistic quantifying regularity and complexity that appears to have potential application to a wide variety of physiological and clinical time-series data. The focus here is to provide a better understanding of ApEn to facilitate its proper utilization, application, and interpretation. After giving the formal mathematical description of ApEn, we provide a multistep description of the algorithm as applied to two contrasting clinical heart rate data sets. We discuss algorithm implementation and interpretation and introduce a general mathematical hypothesis of the dynamics of a wide class of diseases, indicating the utility of ApEn to test this hypothesis. We indicate the relationship of ApEn to variability measures, the Fourier spectrum, and algorithms motivated by study of chaotic dynamics. We discuss further mathematical properties of ApEn, including the choice of input parameters, statistical issues, and modeling considerations, and we conclude with a section on caveats to ensure correct ApEn utilization.
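A minimal ApEn implementation following the standard definition (template matching under a Chebyshev tolerance r, with self-matches counted) might look like this; the signals and parameter choices (m = 2, r = 0.2) are illustrative, and the contrast shown is the one the abstract describes: regular signals score lower than irregular ones.

```python
import numpy as np

def apen(x, m, r):
    """Approximate entropy ApEn(m, r) = Phi^m(r) - Phi^(m+1)(r)."""
    x = np.asarray(x, dtype=float)
    def phi(m):
        n = len(x) - m + 1
        templates = np.array([x[i:i + m] for i in range(n)])
        # count templates within Chebyshev distance r (self-matches included)
        counts = np.array([
            np.sum(np.max(np.abs(templates - t), axis=1) <= r)
            for t in templates
        ])
        return np.mean(np.log(counts / n))
    return phi(m) - phi(m + 1)

rng = np.random.default_rng(2)
regular = np.sin(np.linspace(0, 20 * np.pi, 300))   # highly regular signal
noisy = rng.normal(0, 1, 300)                       # irregular signal

a_reg = apen(regular, m=2, r=0.2)
a_noise = apen(noisy, m=2, r=0.2)   # larger: less regularity
```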
A unified nonlinear stochastic time series analysis for climate science
Moon, Woosok; Wettlaufer, John S.
2017-01-01
Earth’s orbit and axial tilt imprint a strong seasonal cycle on climatological data. Climate variability is typically viewed in terms of fluctuations in the seasonal cycle induced by higher frequency processes. We can interpret this as a competition between the orbitally enforced monthly stability and the fluctuations/noise induced by weather. Here we introduce a new time-series method that determines these contributions from monthly-averaged data. We find that the spatio-temporal distribution of the monthly stability and the magnitude of the noise reveal key fingerprints of several important climate phenomena, including the evolution of the Arctic sea ice cover, the El Niño Southern Oscillation (ENSO), the Atlantic Niño and the Indian Dipole Mode. In analogy with the classical destabilising influence of the ice-albedo feedback on summertime sea ice, we find that during some time interval of the season a destabilising process operates in all of these climate phenomena. The interaction between the destabilisation and the accumulation of noise, which we term the memory effect, underlies phase locking to the seasonal cycle and the statistical nature of seasonal predictability. PMID:28287128
Structural analysis of a series of strontium-substituted apatites.
O'Donnell, M D; Fredholm, Y; de Rouffignac, A; Hill, R G
2008-09-01
A series of Sr-substituted hydroxyapatites, (Sr(x)Ca(1-x))(5)(PO(4))(3)OH, where x=0.00, 0.25, 0.50, 0.75 and 1.00, were made by a standard wet chemical route and investigated using X-ray diffraction (XRD), Rietveld refinement and Raman spectroscopy. We report apatites manufactured by two synthesis routes under 90 degrees C; only the fully Sr-substituted sample had a small amount of an impurity phase, believed to be strontium pyrophosphate. Lattice parameters (a and c), unit cell volume and density were shown to increase linearly with strontium addition and were consistent with the addition of a slightly larger and heavier ion (Sr) in place of Ca. XRD Lorentzian peak widths increased to a maximum at x=0.50, then decreased with increasing Sr content. This indicated an increase in crystallite size when moving away from the x=0.50 composition (d approximately 9.4 nm). There was a slight preference for strontium to enter the Ca(II) site in the mixed apatites (6 to 12%, depending on composition). The position of the Raman band attributed to v(1)PO(4)(3-), at around 963 cm(-1) in hydroxyapatite, decreased linearly to 949 cm(-1) at full Sr-substitution. The full width at half maximum of this peak also correlated well and increased linearly with increasing crystallite size calculated from XRD.
A unified nonlinear stochastic time series analysis for climate science
NASA Astrophysics Data System (ADS)
Moon, Woosok; Wettlaufer, John S.
2017-03-01
Earth’s orbit and axial tilt imprint a strong seasonal cycle on climatological data. Climate variability is typically viewed in terms of fluctuations in the seasonal cycle induced by higher frequency processes. We can interpret this as a competition between the orbitally enforced monthly stability and the fluctuations/noise induced by weather. Here we introduce a new time-series method that determines these contributions from monthly-averaged data. We find that the spatio-temporal distribution of the monthly stability and the magnitude of the noise reveal key fingerprints of several important climate phenomena, including the evolution of the Arctic sea ice cover, the El Niño Southern Oscillation (ENSO), the Atlantic Niño and the Indian Dipole Mode. In analogy with the classical destabilising influence of the ice-albedo feedback on summertime sea ice, we find that during some time interval of the season a destabilising process operates in all of these climate phenomena. The interaction between the destabilisation and the accumulation of noise, which we term the memory effect, underlies phase locking to the seasonal cycle and the statistical nature of seasonal predictability.
Detrended fluctuation analysis of time series of a firing fusimotor neuron
NASA Astrophysics Data System (ADS)
Blesić, S.; Milošević, S.; Stratimirović, Dj.; Ljubisavljević, M.
We study the interspike interval (ISI) time series of spontaneous fusimotor neuron activity by applying detrended fluctuation analysis, a modification of random walk analysis. We have found evidence for white-noise characteristics of the ISI time series, which means that the fusimotor activity does not possess temporal correlations. We conclude that such activity represents the requisite noisy component for the occurrence of the stochastic resonance mechanism in the neural coordination of muscle spindles.
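A compact DFA sketch reproduces the diagnostic used here: for uncorrelated (white-noise-like) data the scaling exponent comes out close to 0.5. The implementation below is a generic first-order DFA, not the authors' code; the scales and the synthetic "ISI-like" data are assumptions.

```python
import numpy as np

def dfa(x, scales):
    """First-order DFA: returns the scaling exponent alpha."""
    y = np.cumsum(x - np.mean(x))            # profile (integrated series)
    F = []
    for s in scales:
        n = len(y) // s
        msq = []
        for k in range(n):
            seg = y[k * s:(k + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)     # local linear trend
            msq.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(msq)))      # fluctuation function F(s)
    # alpha is the slope of log F(s) vs log s
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

rng = np.random.default_rng(3)
white = rng.normal(0, 1, 4096)               # uncorrelated data
alpha = dfa(white, scales=[8, 16, 32, 64, 128])   # ~0.5 for white noise
```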
NASA Astrophysics Data System (ADS)
Machiwal, Deepesh; Kumar, Sanjay; Dayal, Devi
2016-05-01
This study aimed at characterizing rainfall dynamics in a hot arid region of Gujarat, India by employing time-series modeling techniques and a sustainability approach. Five characteristics (normality, stationarity, homogeneity, presence/absence of trend, and persistence) of the 34-year (1980-2013) annual rainfall time series of ten stations were identified/detected by applying multiple parametric and non-parametric statistical tests. Furthermore, the study proposes, as a novelty, a sustainability concept for evaluating rainfall time series, and demonstrates the concept, for the first time, by identifying the most sustainable rainfall series following a reliability (Ry), resilience (Re), and vulnerability (Vy) approach. Box-whisker plots, normal probability plots, and histograms indicated that the annual rainfall of Mandvi and Dayapar stations is relatively more positively skewed and non-normal compared with that of other stations, which is due to the presence of a severe outlier and an extreme value. Results of the Shapiro-Wilk test and Lilliefors test revealed that the annual rainfall series of all stations deviated significantly from a normal distribution. Two parametric t tests and the non-parametric Mann-Whitney test indicated significant non-stationarity in the annual rainfall of Rapar station, where the rainfall was also found to be non-homogeneous based on the results of four parametric homogeneity tests. Four trend tests indicated significantly increasing rainfall trends at Rapar and Gandhidham stations. The autocorrelation analysis suggested statistically significant persistence in the rainfall series of Bhachau (3-year time lag), Mundra (1- and 9-year time lag), Nakhatrana (9-year time lag), and Rapar (3- and 4-year time lag). Results of the sustainability approach indicated that the annual rainfall of Mundra and Naliya stations (Ry = 0.50 and 0.44; Re = 0.47 and 0.47; Vy = 0.49 and 0.46, respectively) are the most sustainable and dependable
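The reliability-resilience-vulnerability computation can be sketched with the usual Hashimoto-style definitions; the threshold choice (long-term mean as the satisfactory/failure boundary) and the toy rainfall values are assumptions, not the study's data.

```python
import numpy as np

def rrv(series, threshold):
    """Reliability (Ry), resilience (Re), vulnerability (Vy) of a series."""
    x = np.asarray(series, dtype=float)
    ok = x >= threshold                       # satisfactory years
    Ry = np.mean(ok)                          # fraction of satisfactory years
    # resilience: fraction of failure years followed by a satisfactory year
    fail_idx = np.where(~ok[:-1])[0]
    Re = np.mean(ok[fail_idx + 1]) if len(fail_idx) else 1.0
    # vulnerability: mean relative shortfall during failure years
    deficits = (threshold - x[~ok]) / threshold
    Vy = deficits.mean() if deficits.size else 0.0
    return Ry, Re, Vy

# toy annual rainfall record (mm), threshold = long-term mean (347 mm)
rain = np.array([420, 180, 510, 300, 90, 450, 380, 210, 600, 330], float)
Ry, Re, Vy = rrv(rain, threshold=rain.mean())  # Ry = 0.5, Re = 0.75
```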
Intra-cholecystic approach for laparoscopic management of Mirizzi's syndrome: A case series
Nag, Hirdaya H.; Gangadhara, Vageesh Bettageri; Dangi, Amit
2016-01-01
INTRODUCTION: Laparoscopic management of patients with Mirizzi's syndrome (MS) is not routinely recommended due to the high risk of iatrogenic complications. PATIENTS AND METHODS: Intra-cholecystic (IC) or inside-gall bladder (GB) approach was used for laparoscopic management of 16 patients with MS at a tertiary care referral centre in North India from May 2010 to August 2014; a retrospective analysis of prospectively collected data was performed. RESULTS: Mean age was 40.1 ± 14.7 years, the male-to-female ratio was 1:3, and 9 (56.25%) patients had type 1 MS (MS1) and 7 (43.75%) had type 2 MS (MS2) (McSherry's classification). The laparoscopic intra-cholecystic approach (LICA) was successful in 11 (68.75%) patients, whereas 5 patients (31.25%) required conversion to open method. Median blood loss was 100 mL (range: 50-400 mL), and median duration of surgery was 3.25 h (range: 2-7.5 h). No major complications were encountered except 1 patient (6.5%) who required re-operation for retained bile duct stones. The final histopathology report was benign in all the patients. No remote complications were noted during a mean follow-up of 20.18 months. CONCLUSION: LICA is a feasible and safe approach for selected patients with Mirizzi's syndrome; however, a low threshold for conversion is necessary to avoid iatrogenic complications. PMID:27251843
New Models for Forecasting Enrollments: Fuzzy Time Series and Neural Network Approaches.
ERIC Educational Resources Information Center
Song, Qiang; Chissom, Brad S.
Since university enrollment forecasting is very important, many different methods and models have been proposed by researchers. Two new methods for enrollment forecasting are introduced: (1) the fuzzy time series model; and (2) the artificial neural networks model. Fuzzy time series has been proposed to deal with forecasting problems within a…
An approach to constructing a homogeneous time series of soil moisture using SMOS
Technology Transfer Automated Retrieval System (TEKTRAN)
Overlapping soil moisture time series derived from two satellite microwave radiometers (SMOS, Soil Moisture and Ocean Salinity; AMSR-E, Advanced Microwave Scanning Radiometer - Earth Observing System) are used to generate a soil moisture time series from 2003 to 2010. Two statistical methodologies f...
NASA Astrophysics Data System (ADS)
Meroni, M.; Fasbender, D.; Kayitakire, F.; Pini, G.; Rembold, F.; Urbano, F.; Verstraete, M. M.
2013-12-01
Timely information on vegetation development at regional scale is needed in arid and semiarid African regions where rainfall variability leads to high inter-annual fluctuations in crop and pasture productivity, as well as to high risk of food crisis in the presence of severe drought events. The present study aims at developing and testing an automatic procedure to estimate the probability of experiencing a seasonal biomass production deficit solely on the basis of historical and near real-time remote sensing observations. The method is based on the extraction of vegetation phenology from SPOT-VEGETATION time series of the Fraction of Absorbed Photosynthetically Active Radiation (FAPAR) and the subsequent computation of seasonally cumulated FAPAR as a proxy for vegetation gross primary production. Within-season forecasts of the overall seasonal performance, expressed in terms of probability of experiencing a critical deficit, are based on a statistical approach taking into account two factors: i) the similarity between the current FAPAR profile and past profiles observable in the 15-year FAPAR time series; ii) the uncertainty of past predictions of season outcome as derived using a jack-knifing technique. The method is applicable at the regional to continental scale and can be updated regularly during the season (whenever a new satellite observation is made available) to provide a synoptic view of the hot spots of likely production deficit. The specific objective of the procedure described here is to deliver to the food security analyst, as early as possible within the season, only the relevant information (e.g., masking out areas without active vegetation at the time of analysis), expressed through a reliable and easily interpretable measure of impending risk. Evaluation of method performance and examples of application in the Sahel region are discussed.
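The analog-year logic can be sketched as follows. The Gaussian similarity kernel, its bandwidth, and the deficit threshold are illustrative assumptions, not the paper's calibrated choices:

```python
import numpy as np

def deficit_probability(past, current, threshold, sigma=0.05):
    """Probability of a seasonal cumulated-FAPAR deficit, from analog years.

    past    : (n_years, n_dekads) historical FAPAR profiles
    current : FAPAR observed so far this season (first len(current) dekads)
    """
    t = len(current)
    dist = np.sqrt(((past[:, :t] - current) ** 2).mean(axis=1))
    w = np.exp(-dist ** 2 / (2.0 * sigma ** 2))   # similarity weights
    deficit = past.sum(axis=1) < threshold        # past seasonal outcomes
    return float((w * deficit).sum() / w.sum())

# two hypothetical past seasons: a failed one and a good one
past = np.array([[0.2, 0.2, 0.2, 0.2],
                 [0.6, 0.6, 0.6, 0.6]])
p_bad = deficit_probability(past, np.array([0.2, 0.2]), threshold=1.0)
p_good = deficit_probability(past, np.array([0.6, 0.6]), threshold=1.0)
```

A current profile tracking a past deficit season yields a probability near one; tracking a good season yields a probability near zero, which is the "impending risk" signal delivered to the analyst.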
ERIC Educational Resources Information Center
Stifter, Cynthia A.; Rovine, Michael
2015-01-01
The present longitudinal study examined mother-infant interaction during the administration of immunizations at 2 and 6 months of age, using hidden Markov modelling, a time series approach that produces latent states to describe how mothers and infants work together to bring the infant to a soothed state. Results revealed a…
NASA Astrophysics Data System (ADS)
Schwatke, Christian; Dettmering, Denise
2016-04-01
Satellite altimetry has been designed for sea level monitoring over open ocean areas. However, for some years, this technology has also been used to retrieve water levels from lakes, reservoirs, rivers, wetlands and in general any inland water body. In this contribution, a new approach for the estimation of inland water level time series is presented. The method is the basis for the computation of time series of rivers and lakes available through the web service 'Database for Hydrological Time Series over Inland Water' (DAHITI). It is based on an extended outlier rejection and a Kalman filter approach incorporating cross-calibrated multi-mission altimeter data from Envisat, ERS-2, Jason-1, Jason-2, Topex/Poseidon, and SARAL/AltiKa, including their uncertainties. The new approach yields RMS differences with respect to in situ data between 4 cm and 36 cm for lakes and 8 cm and 114 cm for rivers, respectively. Within this presentation, the new approach will be introduced and examples for water level time series for a variety of lakes and rivers will be shown featuring different characteristics such as shape, lake extent, river width, and data coverage. A comprehensive validation is performed by comparisons with in situ gauge data and results from external inland altimeter databases.
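The core of such a filter can be sketched in a few lines. This is a generic 1-D random-walk Kalman filter with a 3-sigma innovation gate standing in for the paper's extended outlier rejection; the process and measurement variances are illustrative assumptions, not DAHITI's tuned values:

```python
def kalman_level(observations, q=0.01, r=0.04, gate=3.0):
    # 1-D random-walk Kalman filter with innovation-based outlier rejection
    x, p = observations[0], 1.0
    for z in observations[1:]:
        p = p + q                       # predict: level follows a random walk
        nu = z - x                      # innovation
        s = p + r                       # innovation variance
        if abs(nu) > gate * s ** 0.5:   # reject gross outliers
            continue
        k = p / s                       # Kalman gain
        x += k * nu
        p *= 1.0 - k
    return x

# a hypothetical lake at 10 m with one corrupted altimeter pass
obs = [10.0] * 20
obs[10] = 100.0
level = kalman_level(obs)
```

In the multi-mission setting, `r` would vary per observation with each mission's cross-calibrated uncertainty rather than being a single constant.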
Structure in Photon Maps and Time Series: A New Approach to Bayesian Blocks
NASA Astrophysics Data System (ADS)
Scargle, J. D.; Norris, J. P.
2000-10-01
The Bayesian Blocks algorithm finds the most probable piecewise constant ("blocky") representation for time series in the form of binned, time-tagged, or time-to-spill photon counting data. In (Scargle, 1998, ApJ 504, 405) the number of blocks was determined in an ad hoc iterative procedure. Another approach maximizes the posterior -- after marginalizing all parameters except the number of blocks -- computed with Markov Chain Monte Carlo methods. A new, better algorithm starts with the Voronoi tessellation of the individual events in an arbitrarily dimensioned data space. (This generalization allows solution of problems such as detection of clusters in high dimensional parameter spaces, and identification of structures in images.) In successive steps, these many cells are merged to form fewer, larger ones. The decision to merge two cells or keep them apart is based on comparison of the corresponding posterior probabilities. Let $P(N, V)$ be the posterior for a Poisson model of a volume of size $V$ containing $N$ events, a function easily calculated explicitly. Then cells $j$ and $k$ are merged if $P(N_j + N_k, V_j + V_k) > P(N_j, V_j)\,P(N_k, V_k)$ and kept separate otherwise. When this criterion favors the merging of no further cells, computation halts. Local structures ("shots") in the variability of Cygnus X-1 and RXTE 1118+480 were detected in this way, using time-tagged photon data from the USA X-ray Telescope. Since no time bins are invoked, the full sub-millisecond time resolution of the USA instrument is maintained. The method contains no parameters other than those defining prior probability distributions, and therefore yields objective structure estimates. For image data, the cells need not be restricted to be simply connected, e.g. in order to treat background regions surrounding sources. Partly funded by the NASA Applied Information Systems Research Program.
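The merge criterion is easy to state in code. The sketch below assumes the binned-data block posterior from Scargle (1998), P(N, M) = B(N+1, M-N+1) for N events in M data cells (the 2000 abstract leaves P(N, V) unspecified), and makes a single greedy left-to-right pass over adjacent 1-D cells rather than the full iterative merging:

```python
from math import lgamma

def log_post(n, m):
    # log marginal posterior of a constant-rate block with n events in m data
    # cells: log Beta(n + 1, m - n + 1), following Scargle (1998)
    return lgamma(n + 1) + lgamma(m - n + 1) - lgamma(m + 2)

def should_merge(n1, m1, n2, m2):
    # merge two cells when the one-block posterior beats the two-block product
    return log_post(n1 + n2, m1 + m2) > log_post(n1, m1) + log_post(n2, m2)

def greedy_merge(counts, sizes):
    """One left-to-right pass of pairwise merging over adjacent 1-D cells."""
    merged = [(counts[0], sizes[0])]
    for n, m in zip(counts[1:], sizes[1:]):
        pn, pm = merged[-1]
        if should_merge(pn, pm, n, m):
            merged[-1] = (pn + n, pm + m)
        else:
            merged.append((n, m))
    return merged

blocks = greedy_merge(counts=[2, 2, 4, 0], sizes=[4, 4, 4, 4])
```

Cells with similar event rates coalesce (the first two above), while a rate change stops the merge, which is exactly how the algorithm localizes "shots" without imposing time bins.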
Sociology: Discipline Analysis. Women in the Curriculum Series.
ERIC Educational Resources Information Center
Johnson, Jacqueline; Risman, Barbara J.
This essay examines the ways in which sociology, as a discipline, has been influenced by feminist scholarship in the field, and three major contributions of feminist scholarship are presented: the introduction of women into sociological theory and research during the era of "sex role" analysis; the shift to analyzing gender as a basic axis of…
Philosophy: Discipline Analysis. Women in the Curriculum Series.
ERIC Educational Resources Information Center
Nye, Andrea
This essay examines the ways in which philosophy, as a discipline, has been influenced by feminist scholarship in the field. It explains that in the 1970s feminist philosophers introduced questions regarding personal life and sexuality as matters for philosophical analysis, and that scholars began to challenge the notions of the Western canon.…
Educational Attainment: Analysis by Immigrant Generation. IZA Discussion Paper Series.
ERIC Educational Resources Information Center
Chiswick, Barry R.; DebBurman, Noyna
This paper presents a theoretical and empirical analysis of the largely ignored issue of the determinants of the educational attainment of adults by immigrant generation. Using Current Population Survey (CPS) data, differences in educational attainment are analyzed by immigrant generation (first, second, and higher order generations), and among…
Lecca, Paola; Mura, Ivan; Re, Angela; Barker, Gary C.; Ihekwaba, Adaoha E. C.
2016-01-01
Chaotic behavior refers to behavior which, albeit irregular, is generated by an underlying deterministic process. Therefore, a chaotic behavior is potentially controllable. This possibility becomes practically amenable especially when chaos is shown to be low-dimensional, i.e., attributable to a small fraction of the total system's components. In this case, indeed, including the major drivers of chaos in a system into the modeling approach allows us to improve predictability of the system's dynamics. Here, we analyzed the numerical simulations of an accurate ordinary differential equation model of the gene network regulating sporulation initiation in Bacillus subtilis to explore whether the non-linearity underlying time series data is due to low-dimensional chaos. Low-dimensional chaos is expectedly common in systems with few degrees of freedom, but rare in systems with many degrees of freedom such as the B. subtilis sporulation network. The estimation of a number of indices, which reflect the chaotic nature of a system, indicates that the dynamics of this network is affected by deterministic chaos. The neat separation between the indices obtained from the time series simulated from the model and those obtained from time series generated by Gaussian white and colored noise confirmed that the B. subtilis sporulation network dynamics is affected by low-dimensional chaos rather than by noise. Furthermore, our analysis identifies the principal driver of the network's chaotic dynamics to be sporulation initiation phosphotransferase B (Spo0B). We then analyzed the parameters and the phase space of the system to characterize the instability points of the network dynamics, and, in turn, to identify the ranges of values of Spo0B and of the other drivers of the chaotic dynamics for which the whole system is highly sensitive to minimal perturbation. In summary, we described an unappreciated source of complexity in the B. subtilis sporulation network by gathering
Analysis of cervical ribs in a series of human fetuses.
Bots, Jessica; Wijnaendts, Liliane C D; Delen, Sofie; Van Dongen, Stefan; Heikinheimo, Kristiina; Galis, Frietson
2011-09-01
In humans, an increasing body of evidence has linked the frequency of cervical ribs to stillbirths, other malformations and early childhood cancers. However, the frequency of cervical ribs in a putatively healthy fetal population is not sufficiently known to assess the actual medical risks of these prenatal findings. We therefore analyzed the presence of skeletal anomalies in a series of 199 electively aborted fetuses, which were whole-mount stained with alizarin red specific for skeletal tissues. Results show that approximately 40% of the fetuses had cervical ribs, even though external congenital abnormalities such as craniofacial and limb defects were absent. A literature overview indicates that the observed frequency of cervical ribs is comparable to results previously obtained for deceased fetuses with no or minor congenital anomalies, and higher than expected for healthy fetuses. This unexpected result can probably in part be explained by a higher detection rate of small cervical ribs when using alizarin red staining instead of radiographs. Additionally, studies in the literature suggest that the size of a cervical rib may indicate the severity of abnormalities, but this possibility requires further research. Anomalies of the axial skeleton are known to be caused by a disturbance of early development, which alters Hox gene expression, but in this study the origin of the stress could not be verified as maternal medical data were not available. The co-occurrence of rudimentary or absent 12th ribs in 23.6% of the cases with cervical ribs indicates that in approximately 8% of the fetuses a homeotic shift occurred over a larger part of the vertebral column. This suggests that the expression of multiple Hox genes may have been affected in these fetuses. Together, the high incidence of cervical ribs and also their co-occurrence with rudimentary or absent 12th ribs suggests that there may have been a disturbance of early development such that the studied fetuses are
André, Claire; Guyon, Catherine; Thomassin, Mireille; Barbier, Alexandre; Richert, Lysiane; Guillaume, Yves-Claude
2005-06-05
The binding constants (K) of a series of anticoagulant rodenticides with the main soil organic component, humic acid (HA), were determined using a frontal analysis approach. The order of the binding constants was identical to that obtained in a previous paper [J. Chromatogr. B 813 (2004) 295], i.e. bromadiolone>brodifacoum>difenacoum>chlorophacinone>diphacinone, confirming the power of this frontal analysis approach for the determination of binding constants. Moreover, and for the first time, the concentration of rodenticide unbound to HA could be determined. Thanks to this approach, we could clearly demonstrate that HA protected the human hepatoma cell line HepG2 against the cytotoxicity of all the rodenticides tested and that the toxicity of rodenticides was directly linked to the free rodenticide fraction in the medium (i.e. rodenticide unbound to HA).
Mapping mountain pine beetle mortality through growth trend analysis of time-series landsat data
Liang, Lu; Chen, Yanlei; Hawbaker, Todd J.; Zhu, Zhi-Liang; Gong, Peng
2014-01-01
Disturbances are key processes in the carbon cycle of forests and other ecosystems. In recent decades, mountain pine beetle (MPB; Dendroctonus ponderosae) outbreaks have become more frequent and extensive in western North America. Remote sensing has the ability to fill the data gaps of long-term infestation monitoring, but the elimination of observational noise and attributing changes quantitatively are two main challenges in its effective application. Here, we present a forest growth trend analysis method that integrates Landsat temporal trajectories and decision tree techniques to derive annual forest disturbance maps over an 11-year period. The temporal trajectory component successfully captures disturbance events as represented by spectral segments, whereas decision tree modeling efficiently recognizes and attributes events based upon the characteristics of the segments. Validated against a point set sampled across a gradient of MPB mortality, 86.74% to 94.00% overall accuracy was achieved with small variability in accuracy among years. In contrast, the overall accuracies of single-date classifications ranged from 37.20% to 75.20% and only became comparable with our approach when the training sample size was increased at least four-fold. This demonstrates that a key advantage of this time-series workflow is its small training-sample requirement. The easily understandable, interpretable and modifiable characteristics of our approach suggest that it could be applicable to other ecoregions.
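A greatly simplified stand-in for the trajectory-segmentation-plus-decision-tree workflow can illustrate the trajectory idea: flag the year with the largest spectral decline, subject to a noise threshold. The NDVI values, years, and threshold below are hypothetical:

```python
import numpy as np

def disturbance_year(years, index, min_drop=0.1):
    # flag the year with the largest year-over-year decline in a vegetation
    # index trajectory, if it exceeds a noise threshold; else no disturbance
    drops = np.diff(index)
    i = int(np.argmin(drops))
    return years[i + 1] if drops[i] <= -min_drop else None

# hypothetical annual NDVI trajectory with beetle mortality in 2003
years = list(range(2000, 2006))
ndvi = [0.80, 0.81, 0.79, 0.50, 0.48, 0.52]
```

The paper's approach goes further by fitting spectral segments to the whole trajectory and letting a decision tree attribute each detected change, but the noise-thresholded temporal differencing above is the kernel that single-date classification lacks.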
The Use of Scaffolding Approach to Enhance Students' Engagement in Learning Structural Analysis
ERIC Educational Resources Information Center
Hardjito, Djwantoro
2010-01-01
This paper presents a reflection on the use of Scaffolding Approach to engage Civil Engineering students in learning Structural Analysis subjects. In this approach, after listening to the lecture on background theory, students are provided with a series of practice problems, each one comes with the steps, formulas, hints, and tables needed to…
Genetic Programming Based Approach for Modeling Time Series Data of Real Systems
NASA Astrophysics Data System (ADS)
Ahalpara, Dilip P.; Parikh, Jitendra C.
Analytic models of a computer generated time series (logistic map) and three real time series (ion saturation current in Aditya Tokamak plasma, NASDAQ composite index and Nifty index) are constructed using Genetic Programming (GP) framework. In each case, the optimal map that results from fitting part of the data set also provides a very good description of the rest of the data. Predictions made using the map iteratively are very good for computer generated time series but not for the data of real systems. For such cases, an extended GP model is proposed and illustrated. A comparison of these results with those obtained using Artificial Neural Network (ANN) is also carried out.
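A full GP implementation is long, but the "optimal map" idea can be illustrated with a simpler stand-in: when the generating rule lies in the span of a candidate basis, an ordinary least-squares fit of x[t+1] against basis functions of x[t] recovers the map exactly. This is shown here for the computer-generated logistic map; GP would instead search an expression tree, which is what makes it applicable when no basis is known:

```python
import numpy as np

# generate the logistic map x[t+1] = r * x[t] * (1 - x[t])
r = 3.7
x = np.empty(500)
x[0] = 0.4
for t in range(499):
    x[t + 1] = r * x[t] * (1.0 - x[t])

# fit x[t+1] as a quadratic in x[t]; design columns are x^2, x, 1
design = np.vander(x[:-1], 3)
(c2, c1, c0), *_ = np.linalg.lstsq(design, x[1:], rcond=None)
```

The recovered coefficients (c2 = -r, c1 = r, c0 = 0) reproduce the map, so iterating the fitted model predicts the rest of the series, which is exactly the behavior the abstract reports for computer-generated data but not for noisy real-world series.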
A Mixed Approach Of Automated ECG Analysis
NASA Astrophysics Data System (ADS)
De, A. K.; Das, J.; Majumder, D. Dutta
1982-11-01
ECG is one of the non-invasive and risk-free techniques for collecting data about the functional state of the heart. However, all these data-processing techniques can be classified into two basically different approaches -- the first- and second-generation ECG computer programs. Not the opposition, but the symbiosis of these two approaches will lead to systems with the highest accuracy. In our paper we describe a mixed approach which shows higher accuracy with less computational work. Key Words: Primary features, Patients' parameter matrix, Screening, Logical comparison technique, Multivariate statistical analysis, Mixed approach.
Analysis of the temporal properties in car accident time series
NASA Astrophysics Data System (ADS)
Telesca, Luciano; Lovallo, Michele
2008-05-01
In this paper we study the time-clustering behavior of sequences of car accidents, using data from a freely available database on the internet. The Allan Factor analysis, which is a well-suited method to investigate time-dynamical behaviors in point processes, has revealed that the car accident sequences are characterized by a general time-scaling behavior, with the presence of cyclic components. These results indicate that the time dynamics of the events are not Poissonian but long range correlated, with periodicities ranging from 12 h to 1 year.
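The Allan Factor itself is a one-line statistic: the mean squared difference of event counts in adjacent windows of duration T, normalized by twice the mean count. A minimal sketch on synthetic event times (not the car accident data):

```python
import numpy as np

def allan_factor(event_times, T):
    # Allan Factor AF(T): mean squared difference of adjacent-window counts
    # over twice the mean count, for contiguous windows of duration T
    edges = np.arange(0.0, event_times.max() + T, T)
    counts, _ = np.histogram(event_times, bins=edges)
    d = np.diff(counts)
    return (d ** 2).mean() / (2.0 * counts.mean())

# a perfectly periodic process shows no clustering: AF = 0
periodic = np.arange(0.5, 100.0, 1.0)
af_periodic = allan_factor(periodic, T=10.0)

# a homogeneous Poisson process has AF close to 1 at all timescales
rng = np.random.default_rng(1)
poisson = np.cumsum(rng.exponential(1.0, size=5000))
af_poisson = allan_factor(poisson, T=5.0)
```

Time-scaling behavior of the kind the paper reports shows up as AF(T) growing as a power law in T, in contrast to the flat AF of a Poisson process.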
NASA Astrophysics Data System (ADS)
Gualandi, Adriano; Serpelloni, Enrico; Elina Belardinelli, Maria; Bonafede, Maurizio; Pezzo, Giuseppe; Tolomei, Cristiano
2015-04-01
A critical point in the analysis of ground displacement time series, such as those measured by modern space geodetic techniques (primarily continuous GPS/GNSS and InSAR), is the development of data-driven methods that allow one to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies: PCA does not perform well in finding the solution to the so-called Blind Source Separation (BSS) problem. Recovering and separating the different sources that generate the observed ground deformation is a fundamental task in order to attach a physical meaning to each possible source. PCA fails in the BSS problem because it looks for a new Euclidean space where the projected data are uncorrelated. Usually, the uncorrelatedness condition is not strong enough, and it has been proven that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach enables us to explain the displacement time series while imposing fewer constraints on the model, and to reveal anomalies in the data such as transient deformation signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mix of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources
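As an illustration of why independence succeeds where uncorrelatedness fails, here is a minimal NumPy sketch of the BSS setting: a basic symmetric FastICA with a tanh nonlinearity, not the vbICA method of the abstract, applied to two synthetic sources with a hypothetical mixing matrix:

```python
import numpy as np

def whiten(X):
    # center and whiten the rows of X (mixtures x samples)
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(X @ X.T / X.shape[1])
    return E @ np.diag(d ** -0.5) @ E.T @ X

def fastica(X, n_iter=200, seed=0):
    # symmetric FastICA fixed-point iteration on whitened data
    Z = whiten(X)
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((Z.shape[0], Z.shape[0]))
    for _ in range(n_iter):
        G = np.tanh(W @ Z)
        W1 = G @ Z.T / Z.shape[1] - np.diag((1 - G ** 2).mean(axis=1)) @ W
        d, E = np.linalg.eigh(W1 @ W1.T)        # symmetric decorrelation:
        W = E @ np.diag(d ** -0.5) @ E.T @ W1   # W <- (W W^T)^(-1/2) W
    return W @ Z

# two independent sources mixed linearly; ICA recovers them, PCA would not
t = np.linspace(0, 8 * np.pi, 2000)
S = np.vstack([np.sin(2 * t), np.sign(np.sin(3 * t))])
A = np.array([[1.0, 0.6], [0.5, 1.0]])          # hypothetical mixing matrix
Y = fastica(A @ S)
```

Whitening alone (the PCA step) only decorrelates the mixtures; the fixed-point iteration additionally maximizes non-Gaussianity, which is the extra constraint that separates the sources up to sign and order.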
Mixed Multifractal Analysis of Crude Oil, Gold and Exchange Rate Series
NASA Astrophysics Data System (ADS)
Dai, Meifeng; Shao, Shuxiang; Gao, Jianyu; Sun, Yu; Su, Weiyi
2016-11-01
Multifractal analysis of a single time series, e.g. a crude oil, gold or exchange rate series, is often reported. In this paper, we apply the classical multifractal and mixed multifractal spectrum to study multifractal properties of crude oil, gold and exchange rate series and their inner relationships. The obtained results show that, in general, the fractal dimension of gold and crude oil is larger than that of the exchange rate (RMB against the US dollar), reflecting the fact that the price series of gold and crude oil are more heterogeneous. Their mixed multifractal spectra have a drift and the plot is not symmetric, so there is a low level of mixed multifractality between each pair of the crude oil, gold and exchange rate series.
NASA Astrophysics Data System (ADS)
Schwatke, C.; Dettmering, D.; Bosch, W.; Seitz, F.
2015-05-01
Satellite altimetry has been designed for sea level monitoring over open ocean areas. However, for some years, this technology has also been used for observing inland water levels of lakes and rivers. In this paper, a new approach for the estimation of inland water level time series is described. It is used for the computation of time series available through the web service "Database for Hydrological Time Series over Inland Water" (DAHITI). The method is based on a Kalman filter approach incorporating multi-mission altimeter observations and their uncertainties. As input data, cross-calibrated altimeter data from Envisat, ERS-2, Jason-1, Jason-2, Topex/Poseidon, and SARAL/AltiKa are used. The paper presents water level time series for a variety of lakes and rivers in North and South America featuring different characteristics such as shape, lake extent, river width, and data coverage. A comprehensive validation is performed by comparison with in-situ gauge data and results from external inland altimeter databases. The new approach yields RMS differences with respect to in-situ data between 4 and 38 cm for lakes and 12 and 139 cm for rivers, respectively. For most study cases, more accurate height information than from other available altimeter databases can be achieved.
NASA Astrophysics Data System (ADS)
Schwatke, C.; Dettmering, D.; Bosch, W.; Seitz, F.
2015-10-01
Satellite altimetry has been designed for sea level monitoring over open ocean areas. However, for some years, this technology has also been used to retrieve water levels from reservoirs, wetlands and in general any inland water body, although the radar altimetry technique has been especially applied to rivers and lakes. In this paper, a new approach for the estimation of inland water level time series is described. It is used for the computation of time series of rivers and lakes available through the web service "Database for Hydrological Time Series over Inland Waters" (DAHITI). The new method is based on an extended outlier rejection and a Kalman filter approach incorporating cross-calibrated multi-mission altimeter data from Envisat, ERS-2, Jason-1, Jason-2, TOPEX/Poseidon, and SARAL/AltiKa, including their uncertainties. The paper presents water level time series for a variety of lakes and rivers in North and South America featuring different characteristics such as shape, lake extent, river width, and data coverage. A comprehensive validation is performed by comparisons with in situ gauge data and results from external inland altimeter databases. The new approach yields rms differences with respect to in situ data between 4 and 36 cm for lakes and 8 and 114 cm for rivers. For most study cases, more accurate height information than from other available altimeter databases can be achieved.
[Analysis of a case series of workers with mobbing syndrome].
Marinoni, B; Minelli, C M; Franzina, B; Martellosio, V; Scafa, F; Giorgi, I; Mazzacane, F; Stancanelli, M; Mennoia, N V; Candura, S M
2007-01-01
Mobbing nowadays represents a major challenge for Occupational Medicine. Over the last seven years, we examined 253 patients who sought medical assistance for psychopathological problems that they ascribed to mobbing in the working environment. All patients underwent an occupational health visit, psychological counselling (including administration of personality tests), and psychiatric evaluation. A clinical picture probably due to mobbing was diagnosed in 37 workers: 2 cases of Post-Traumatic Stress Disorder (PTSD), 33 of Adjustment Disorder (AD), and 2 of anxiety disorder. Regarding mobbing typology, we found 19 cases of vertical mobbing (by an employer/manager toward employees), 14 cases of strategic mobbing, 3 cases of horizontal mobbing (among colleagues), and one case of non-intentional mobbing. In conclusion, a pure mobbing syndrome was diagnosed in a lower proportion than that reported by other investigators. The described interdisciplinary approach appears useful for the diagnostic assessment of suspected mobbing cases, which in turn is crucial for prognosis and treatment, as well as in relation to medico-legal issues and work-related compensation claims.
Mariani, Luigi; Zavatti, Franco
2017-03-24
The spectral periods in the North Atlantic Oscillation (NAO), Atlantic Multidecadal Oscillation (AMO) and El Niño Southern Oscillation (ENSO) were analyzed, and it was verified how they imprint a time series of European temperature anomalies (ETA), two European temperature time series and some phenological series (dates of cherry flowering and grapevine harvest). This work had as its reference scenario the linear causal chain MCTP (Macroscale Circulation→Temperature→Phenology of crops) that links oceanic and atmospheric circulation to surface air temperature, which in turn determines the earliness of appearance of phenological phases of plants. Results show that in the three segments of the MCTP causal chain, cycles are present with the following central periods in years (the % of the 12 analyzed time series affected by these cycles is in brackets): 65 (58%), 24 (58%), 20.5 (58%), 13.5 (50%), 11.5 (58%), 7.7 (75%), 5.5 (58%), 4.1 (58%), 3 (50%), 2.4 (67%). A comparison with short-term spectral peaks of the four El Niño regions (nino1+2, nino3, nino3.4 and nino4) shows that 10 of the 12 series are imprinted by periods around 2.3-2.4 years, while 50-58% of the series are imprinted by El Niño periods of 4-4.2, 3.8-3.9 and 3-3.1 years. The analysis highlights the links among physical and biological variables of the climate system at scales that range from macroscale to microscale, knowledge of which is crucial to reach a suitable understanding of ecosystem behavior. The spectral analysis was also applied to a time series of spring-summer precipitation in order to evaluate the presence of peaks in common with the other 12 selected series, with substantially negative results, which leads us to rule out the existence of a linear causal chain MCPP (Macroscale Circulation→Precipitation→Phenology).
Crystallographic analysis of a series of inorganic compounds
NASA Astrophysics Data System (ADS)
Borisov, S. V.; Magarill, S. A.; Pervukhina, N. V.
2015-04-01
The method of crystallographic analysis relies on the mechanical-wave concept that treats the crystalline state as the result of ordering of atomic positions by families of parallel equidistant planes. Using this method, a large set of fluoride, oxide and sulfide structures was analyzed. The pseudo-translational ordering of various atomic groups (including the presence of cation and anion sublattices) in the structures of various classes of inorganic compounds was established. The crucial role of local ordering of heavy cations (coherent assembly) in the structures comprising large cluster fragments (Keggin polyanions, polyoxoniobates, etc.) is discussed. The role of symmetry and the regular distribution of heavy atoms in the formation of stable crystal structures, which is to be taken into account in the targeted design, is considered. The universality of configurations of atomic positions in the structures of various classes of inorganic compounds resulting from the ordering mechanism organized by mechanical (elastic) forces is demonstrated. The bibliography includes 158 references.
Approaches to remote sensing data analysis
Pettinger, Lawrence R.
1978-01-01
Objectives: To present an overview of the essential steps in the remote sensing data analysis process, and to compare and contrast manual (visual) and automated analysis methods. Rationale: This overview is intended to provide a framework for choosing a manual or digital analysis approach to collecting resource information. It can also be used as a basis for understanding/evaluating invited papers and poster sessions during the Symposium.
A quantitative approach to scar analysis.
Khorasani, Hooman; Zheng, Zhong; Nguyen, Calvin; Zara, Janette; Zhang, Xinli; Wang, Joyce; Ting, Kang; Soo, Chia
2011-02-01
Analysis of collagen architecture is essential to wound healing research. However, to date no consistent methodologies exist for quantitatively assessing dermal collagen architecture in scars. In this study, we developed a standardized approach for quantitative analysis of scar collagen morphology by confocal microscopy using fractal dimension and lacunarity analysis. Full-thickness wounds were created on adult mice, closed by primary intention, and harvested at 14 days after wounding for morphometrics and standard Fourier transform-based scar analysis as well as fractal dimension and lacunarity analysis. In addition, transmission electron microscopy was used to evaluate collagen ultrastructure. We demonstrated that fractal dimension and lacunarity analysis were superior to Fourier transform analysis in discriminating scar versus unwounded tissue in a wild-type mouse model. To fully test the robustness of this scar analysis approach, a fibromodulin-null mouse model that heals with increased scar was also used. Fractal dimension and lacunarity analysis effectively discriminated unwounded fibromodulin-null versus wild-type skin as well as healing fibromodulin-null versus wild-type wounds, whereas Fourier transform analysis failed to do so. Furthermore, fractal dimension and lacunarity data also correlated well with transmission electron microscopy collagen ultrastructure analysis, adding to their validity. These results demonstrate that fractal dimension and lacunarity are more sensitive than Fourier transform analysis for quantification of scar morphology.
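Box-counting versions of both measures are short. This sketch assumes a square binary image and non-overlapping boxes (gliding-box lacunarity normally uses overlapping boxes), so it is an illustration of the measures rather than the paper's confocal pipeline:

```python
import numpy as np

def box_count(img, size):
    # number of size x size boxes containing at least one foreground pixel
    n = img.shape[0] // size
    blocks = img[:n * size, :n * size].reshape(n, size, n, size)
    return np.count_nonzero(blocks.sum(axis=(1, 3)))

def fractal_dimension(img, sizes=(1, 2, 4, 8, 16)):
    # slope of log N(s) versus log(1/s) across box sizes
    counts = [box_count(img, s) for s in sizes]
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

def lacunarity(img, size):
    # second-moment ratio of the box-mass distribution: var/mean^2 + 1
    n = img.shape[0] // size
    m = img[:n * size, :n * size].reshape(n, size, n, size).sum(axis=(1, 3))
    return m.var() / m.mean() ** 2 + 1.0

solid = np.ones((64, 64))   # a filled square: dimension 2, lacunarity 1
```

Scar versus unwounded tissue would differ in both numbers: denser, more uniform collagen lowers lacunarity, while architectural complexity raises the estimated dimension.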
Engine Control Improvement through Application of Chaotic Time Series Analysis
Green, J.B., Jr.; Daw, C.S.
2003-07-15
The objective of this program was to investigate cyclic variations in spark-ignition (SI) engines under lean fueling conditions and to develop options to reduce emissions of nitrogen oxides (NOx) and particulate matter (PM) in compression-ignition direct-injection (CIDI) engines at high exhaust gas recirculation (EGR) rates. The CIDI activity builds upon an earlier collaboration between ORNL and Ford examining combustion instabilities in SI engines. Under the original CRADA, the principal objective was to understand the fundamental causes of combustion instability in spark-ignition engines operating with lean fueling. The results of this earlier activity demonstrated that such combustion instabilities are dominated by the effects of residual gas remaining in each cylinder from one cycle to the next. A very simple, low-order model was developed that explained the observed combustion instability as a noisy nonlinear dynamical process. The model concept led to the development of a real-time control strategy that could be employed to significantly reduce cyclic variations in real engines using existing sensors and engine control systems. This collaboration led to the issuance of a joint patent for spark-ignition engine control. After a few years, the CRADA was modified to focus more on EGR and CIDI engines. The modified CRADA examined relationships between EGR, combustion, and emissions in CIDI engines. Information from CIDI engine experiments, data analysis, and modeling was employed to identify and characterize new combustion regimes where it is possible to simultaneously achieve significant reductions in NOx and PM emissions. These results were also used to develop an on-line combustion diagnostic (virtual sensor) to make cycle-resolved combustion quality assessments for active feedback control. Extensive experiments on engines at Ford and ORNL led to the development of the virtual sensor concept that may be able to detect simultaneous reductions in NOx and PM
Discrete Fourier analysis of ultrasound RF time series for detection of prostate cancer.
Moradi, M; Mousavi, P; Siemens, D R; Sauerbrei, E E; Isotalo, P; Boag, A; Abolmaesumi, P
2007-01-01
In this paper, we demonstrate that a set of six features extracted from the discrete Fourier transform of ultrasound Radio-Frequency (RF) time series can be used to detect prostate cancer with high sensitivity and specificity. Ultrasound RF time series refer to a series of echoes received from one spatial location of tissue while the imaging probe and the tissue are fixed in position. Our previous investigations have shown that at least one feature of these signals, the fractal dimension, demonstrates strong correlation with the tissue microstructure. In the current paper, six new features that represent the frequency spectrum of the RF time series have been used, in conjunction with a neural network classification approach, to detect prostate cancer in regions of tissue as small as 0.03 cm². Based on pathology results used as the gold standard, we achieved a mean accuracy of 91%, a mean sensitivity of 92%, and a mean specificity of 90% on seven human prostates.
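Features of this kind can be sketched generically as band-averaged magnitudes of the discrete Fourier transform of each RF time series. The paper's exact six feature definitions are not reproduced here, so the band layout below is an illustrative assumption:

```python
import numpy as np

def spectral_features(ts, n_bands=6):
    """Summarize a (mean-removed) RF time series by the mean DFT magnitude
    in n_bands equal-width frequency bands; the DC bin is excluded."""
    spectrum = np.abs(np.fft.rfft(ts - np.mean(ts)))
    bands = np.array_split(spectrum[1:], n_bands)
    return np.array([b.mean() for b in bands])
```

The resulting fixed-length vector, one per spatial location, is what would be fed to a classifier such as the neural network mentioned in the abstract.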
Chaos in Electronic Circuits: Nonlinear Time Series Analysis
Wheat, Jr., Robert M.
2003-07-01
Chaos in electronic circuits is a phenomenon that was largely ignored by engineers, manufacturers, and researchers until the early 1990s and the work of Chua, Matsumoto, and others. As the world becomes more dependent on electronic devices, the detrimental effects of non-normal operation of these devices become more significant. Developing a better understanding of the mechanisms involved in the chaotic behavior of electronic circuits is a logical step toward the prediction and prevention of any potentially catastrophic occurrence of this phenomenon. Also, a better understanding of chaotic behavior, in a general sense, could potentially lead to better accuracy in the prediction of natural events such as weather, volcanic activity, and earthquakes. As a first step in this improvement of understanding, and as part of the research being reported here, methods for modeling, identifying, analyzing, and producing chaotic behavior in simple electronic circuits have been developed. The computer models were developed using both the Alternative Transient Program (ATP) and Spice, the analysis techniques were implemented using the C and C++ programming languages, and the chaotically behaving circuits were developed using "off the shelf" electronic components.
CCD Observing and Dynamical Time Series Analysis of Active Galactic Nuclei.
NASA Astrophysics Data System (ADS)
Nair, Achotham Damodaran
1995-01-01
The properties, operation, and observing procedures of the Charge Coupled Device (CCD) at the 30" telescope at Rosemary Hill Observatory (RHO) are discussed, together with the details of data reduction. Several nonlinear techniques of time series analysis, based on the behavior of nearest neighbors, have been used to analyze the time series of the quasar 3C 345. A technique using Artificial Neural Networks, based on prediction of the time series, is used to study the dynamical properties of 3C 345. Finally, a heuristic model for the variability of Active Galactic Nuclei is discussed.
Highway Subsidence Analysis Based on the Advanced InSAR Time Series Analysis Method
NASA Astrophysics Data System (ADS)
Zhang, Qingyun; Zhang, Jingfa; Liu, Guolin; Li, Yongsheng
2016-08-01
Interferometric synthetic aperture radar (InSAR) measurements offer all-weather coverage, wide spatial extent, and high precision for surface deformation monitoring. Highways are an important index of modern social and economic development, and their deformation in service has a significant impact on social development and on the safety of people's lives and property. In practical applications, InSAR processing must include a variety of error-correction and analysis steps. We apply a new analysis method, the FRAM-SBAS time-series analysis method, to analyze highway settlement in the Yanzhou area using ALOS PALSAR data. Using FRAM-SBAS time-series analysis, we obtained the surface deformation history of the Jining area from 2008-09-21 to 2010-07-18 with good results: the maximum cumulative settlement in the Jining area is 60 mm, with a maximum settlement rate of 30 mm/yr, while the maximum settlement along the highway section is 53 mm, with a maximum settlement rate of 32 mm/yr. The worst-settling highway sections lie within areas of severe ground subsidence, indicating the influence of mining and vehicle loads on highway settlement and demonstrating that the time-series method is feasible for monitoring ground and highway subsidence.
AnalogExplorer2 – Stereochemistry sensitive graphical analysis of large analog series
Hu, Ye; Zhang, Bijun; Vogt, Martin; Bajorath, Jürgen
2015-01-01
AnalogExplorer is a computational methodology for the extraction and organization of series of structural analogs from compound data sets and their graphical analysis. The method is suitable for the analysis of large analog series originating from lead optimization programs. Herein we report AnalogExplorer2 designed to explicitly take stereochemical information during graphical analysis into account and describe a freely available deposition of the original AnalogExplorer program, AnalogExplorer2, and exemplary compound sets to illustrate their use. PMID:26913194
Providing web-based tools for time series access and analysis
NASA Astrophysics Data System (ADS)
Eberle, Jonas; Hüttich, Christian; Schmullius, Christiane
2014-05-01
Time series information is widely used in environmental change analyses and is also essential information for stakeholders and governmental agencies. However, a challenging issue is the processing of raw data and the execution of time series analysis. In most cases, data have to be found, downloaded, processed and even converted to the correct data format prior to executing time series analysis tools. Data have to be prepared for use in different existing software packages. Several packages, like TIMESAT (Jönnson & Eklundh, 2004) for phenological studies, BFAST (Verbesselt et al., 2010) for breakpoint detection, and GreenBrown (Forkel et al., 2013) for trend calculations, are provided as open-source software and can be executed from the command line. This is needed if data pre-processing and time series analysis are to be automated. To bring both parts, automated data access and data analysis, together, a web-based system was developed to provide access to satellite-based time series data and to the above mentioned analysis tools. Users of the web portal are able to specify a point or a polygon and an available dataset (e.g., Vegetation Indices and Land Surface Temperature datasets from NASA MODIS). The data are then processed and provided as a time series CSV file. Afterwards the user can select an analysis tool that is executed on the server. The final data (CSV, plot images, GeoTIFFs) are visualized in the web portal and can be downloaded for further usage. As a first use case, we built up a complementary web-based system with NASA MODIS products for Germany and parts of Siberia based on the Earth Observation Monitor (www.earth-observation-monitor.net). The aim of this work is to make time series analysis with existing tools as easy as possible, so that users can focus on the interpretation of the results. References: Jönnson, P. and L. Eklundh (2004). TIMESAT - a program for analysing time-series of satellite sensor data. Computers and Geosciences 30
A hybrid-domain approach for modeling climate data time series
NASA Astrophysics Data System (ADS)
Wen, Qiuzi H.; Wang, Xiaolan L.; Wong, Augustine
2011-09-01
In order to model climate data time series that often contain periodic variations, trends, and sudden changes in mean (mean shifts, mostly artificial), this study proposes a hybrid-domain (HD) algorithm, which incorporates a time domain test and a newly developed frequency domain test through an iterative procedure that is analogous to the well-known backfitting algorithm. A two-phase competition procedure is developed to address the confounding issue between modeling periodic variations and mean shifts. A variety of distinctive features of climate data time series, including trends, periodic variations, mean shifts, and a dependent noise structure, can be modeled in tandem using the HD algorithm. This is particularly important for homogenization of climate data from a low density observing network in which reference series are not available to help preserve climatic trends and long-term periodic variations, preventing them from being mistaken for artificial shifts. The HD algorithm is also powerful in estimating trend and periodicity in a homogeneous data time series (i.e., in the absence of any mean shift). The performance of the HD algorithm (in terms of false alarm rate and hit rate in detecting shifts/cycles, and estimation accuracy) is assessed via a simulation study. Its power is further illustrated through its application to a few climate data time series.
Attrition and Augmentation Biases in Time Series Analysis: Evaluation of Clinical Programs.
ERIC Educational Resources Information Center
Fitz, Don; Tryon, Warren W.
1989-01-01
Methods of using simplified time series analysis (STSA) in evaluating clinical programs are discussed. STSA assists in addressing problems of attrition/augmentation of subjects in programs with changing populations. Combining individually calculated "C" statistics in a simple aggregate analysis of restraint usage by nursing home staff…
An Interactive Analysis of Hyperboles in a British TV Series: Implications For EFL Classes
ERIC Educational Resources Information Center
Sert, Olcay
2008-01-01
This paper, part of an ongoing study on the analysis of hyperboles in a British TV series, reports findings drawing upon a 90,000 word corpus. The findings are compared to the ones from CANCODE (McCarthy and Carter 2004), a five-million word corpus of spontaneous speech, in order to identify similarities between the two. The analysis showed that…
Time series analysis of Mexico City subsidence constrained by radar interferometry
NASA Astrophysics Data System (ADS)
López-Quiroz, Penélope; Doin, Marie-Pierre; Tupin, Florence; Briole, Pierre; Nicolas, Jean-Marie
2009-09-01
In Mexico City, subsidence rates reach up to 40 cm/yr, mainly due to soil compaction driven by the overexploitation of the Mexico Basin aquifer. In this paper, we map the spatial and temporal patterns of the Mexico City subsidence by differential radar interferometry, using 38 ENVISAT images acquired between the end of 2002 and the beginning of 2007. We present the severe interferogram unwrapping problems, partly due to coherence loss but mostly due to the high fringe rates. These difficulties are overcome by designing a new methodology that aids the unwrapping step. Our approach is based on the fact that the deformation shape is stable for similar time intervals during the studied period. As a result, a stack of the five best interferograms can be used to compute an average deformation rate for a fixed time interval. Before unwrapping, the number of fringes in the wrapped interferograms is decreased using a scaled version of the stack together with an estimate of the atmospheric phase contribution related to the vertical stratification of the troposphere. The residual phase, containing fewer fringes, is more easily unwrapped than the original interferogram. The unwrapping procedure is applied in three iterative steps. The 71 small-baseline unwrapped interferograms are inverted to obtain increments of radar propagation delay between the 38 acquisition dates. Based on the redundancy of the interferometric database, we quantify the unwrapping errors and show that they are strongly decreased by iterating the unwrapping process. A map of the RMS interferometric misclosure allows the unwrapping reliability to be defined for each pixel. Finally, we present a new algorithm for time series analysis that differs from classical SVD decomposition and is better suited to the present database. Accurate deformation time series are then derived over the metropolitan area of the city with a spatial resolution of 30 × 30 m.
Systemic Analysis Approaches for Air Transportation
NASA Technical Reports Server (NTRS)
Conway, Sheila
2005-01-01
Air transportation system designers have had only limited success using traditional operations research and parametric modeling approaches in their analyses of innovations. They need a systemic methodology for modeling of safety-critical infrastructure that is comprehensive, objective, and sufficiently concrete, yet simple enough to be used with reasonable investment. The methodology must also be amenable to quantitative analysis so issues of system safety and stability can be rigorously addressed. However, air transportation has proven itself an extensive, complex system whose behavior is difficult to describe, let alone predict. There is a wide range of system analysis techniques available, but some are more appropriate for certain applications than others. Specifically in the area of complex system analysis, the literature suggests that both agent-based models and network analysis techniques may be useful. This paper discusses the theoretical basis for each approach in these applications, and explores their historic and potential further use for air transportation analysis.
Spectral analysis of hydrological time series of a river basin in southern Spain
NASA Astrophysics Data System (ADS)
Luque-Espinar, Juan Antonio; Pulido-Velazquez, David; Pardo-Igúzquiza, Eulogio; Fernández-Chacón, Francisca; Jiménez-Sánchez, Jorge; Chica-Olmo, Mario
2016-04-01
Spectral analysis has been applied with the aim of determining the presence and statistical significance of climate cycles in data series from different rainfall, piezometric and gauging stations located in the upper Genil River Basin. This river starts in the Sierra Nevada Range at 3,480 m a.s.l. and is one of the most important rivers of this region. The study area covers more than 2,500 km², with large topographic differences. For this preliminary study, we used more than 30 rainfall data series, 4 piezometric data series and 3 data series from gauging stations. Considering a monthly temporal unit, the studied period ranges from 1951 to 2015, although most of the data series have gaps. Spectral analysis is a methodology widely used to discover cyclic components in time series. The time series is assumed to be a linear combination of sinusoidal functions of known periods but of unknown amplitude and phase. The amplitude is related to the variance of the time series explained by the oscillation at each frequency (Blackman and Tukey, 1958; Bras and Rodríguez-Iturbe, 1985; Chatfield, 1991; Jenkins and Watts, 1968; among others). The signal component represents the structured part of the time series, made up of a small number of embedded periodicities. We then take into account the known result for the one-sided confidence band of the power spectrum estimator. For this study, we established confidence levels of <90%, 90%, 95%, and 99%. Different climate signals have been identified: ENSO, QBO, NAO, and sunspot cycles, as well as others related to solar activity, but the most powerful signals correspond to the annual cycle, followed by the 6-month and NAO cycles. Nevertheless, significant differences between the rainfall data series and the piezometric/flow data series have been pointed out. In the piezometric and flow data series, the ENSO and NAO signals can be stronger than others at high frequencies. The climatic peaks at lower frequencies in the rainfall data are smaller and the confidence
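The classical periodogram at the heart of such an analysis can be sketched as follows. This is a minimal raw periodogram; the Blackman-Tukey smoothing and the confidence-band construction mentioned in the abstract are omitted:

```python
import numpy as np

def periodogram(x):
    """Raw periodogram: squared DFT magnitude per sample at each
    positive Fourier frequency (frequencies in cycles per sample)."""
    x = np.asarray(x, float)
    x = x - x.mean()
    n = len(x)
    power = np.abs(np.fft.rfft(x)) ** 2 / n
    freqs = np.fft.rfftfreq(n, d=1.0)
    return freqs[1:], power[1:]  # drop the zero-frequency (mean) bin
```

With monthly data, an annual cycle shows up as a peak at frequency 1/12 cycles per month, and a semi-annual cycle at 1/6.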
NASA Astrophysics Data System (ADS)
Shen, Chenhua
2017-02-01
We applied traditional principal component analysis (TPCA) and nonstationary principal component analysis (NSPCA) to determine the principal components of six daily air-pollutant concentration series (SO2, NO2, CO, O3, PM2.5 and PM10) in Nanjing from January 2013 to March 2016. The results show that using TPCA, two principal components can reflect the variance of these series: primary pollutants (SO2, NO2, CO, PM2.5 and PM10) and secondary pollutants (e.g., O3). However, using NSPCA, three principal components can be determined to reflect the detrended variance of these series: 1) a mixture of primary and secondary pollutants, 2) primary pollutants and 3) secondary pollutants. Different approaches thus yield different principal components, a discrepancy closely related to how the cross-correlation between each pair of air pollutants is calculated. NSPCA is a more applicable, reliable method than TPCA for analyzing the principal components of series in the presence of nonstationarity and long-range correlation. Moreover, using detrended cross-correlation analysis (DCCA), the cross-correlation between O3 and NO2 is negative at short timescales and positive at long timescales. At hourly timescales, O3 is negatively correlated with NO2 due to a photochemical interaction, and at daily timescales, O3 is positively correlated with NO2 because of the decomposition of O3. At monthly timescales, the cross-correlation of O3 with NO2 behaves similarly to that of O3 with meteorological elements. DCCA is again shown to be more appropriate than Pearson's method for disclosing the cross-correlation between series in the presence of nonstationarity. DCCA can improve our understanding of their interactional mechanisms.
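The DCCA cross-correlation coefficient used to make such timescale-dependent comparisons can be sketched as follows: integrate both mean-removed series, linearly detrend them in non-overlapping windows of one scale, and normalize the detrended covariance by the detrended variances. This is a bare-bones single-scale sketch, not the specific implementation behind the study:

```python
import numpy as np

def dcca_coefficient(x, y, scale):
    """Detrended cross-correlation coefficient rho_DCCA at one window scale."""
    X = np.cumsum(x - np.mean(x))  # integrated profiles
    Y = np.cumsum(y - np.mean(y))
    n = (len(X) // scale) * scale  # drop the incomplete trailing window
    t = np.arange(scale)
    f2x = f2y = f2xy = 0.0
    for start in range(0, n, scale):
        xs, ys = X[start:start + scale], Y[start:start + scale]
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)  # linear-detrend residuals
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        f2x += (rx * rx).mean()
        f2y += (ry * ry).mean()
        f2xy += (rx * ry).mean()
    return f2xy / np.sqrt(f2x * f2y)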
Documentation of a spreadsheet for time-series analysis and drawdown estimation
Halford, Keith J.
2006-01-01
Drawdowns during aquifer tests can be obscured by barometric pressure changes, earth tides, regional pumping, and recharge events in the water-level record. These stresses can create water-level fluctuations that should be removed from observed water levels prior to estimating drawdowns. Simple models have been developed for estimating unpumped water levels during aquifer tests that are referred to as synthetic water levels. These models sum multiple time series such as barometric pressure, tidal potential, and background water levels to simulate non-pumping water levels. The amplitude and phase of each time series are adjusted so that synthetic water levels match measured water levels during periods unaffected by an aquifer test. Differences between synthetic and measured water levels are minimized with a sum-of-squares objective function. Root-mean-square errors during fitting and prediction periods were compared multiple times at four geographically diverse sites. Prediction error equaled fitting error when fitting periods were greater than or equal to four times prediction periods. The proposed drawdown estimation approach has been implemented in a spreadsheet application. Measured time series are independent so that collection frequencies can differ and sampling times can be asynchronous. Time series can be viewed selectively and magnified easily. Fitting and prediction periods can be defined graphically or entered directly. Synthetic water levels for each observation well are created with earth tides, measured time series, moving averages of time series, and differences between measured and moving averages of time series. Selected series and fitting parameters for synthetic water levels are stored and drawdowns are estimated for prediction periods. Drawdowns can be viewed independently and adjusted visually if an anomaly skews initial drawdowns away from 0. The number of observations in a drawdown time series can be reduced by averaging across user
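The core fitting step, scaling component time series so their sum matches observed water levels and then predicting synthetic (unpumped) levels, can be sketched as a linear least-squares problem. This handles amplitudes only; the spreadsheet's phase adjustments, moving averages, and earth-tide generation are omitted, and the function names are illustrative:

```python
import numpy as np

def fit_synthetic(components, observed):
    """Least-squares coefficients scaling each component series (plus a
    constant offset) so their sum matches observed levels over the
    fitting period. `components` is a list of equal-length 1-D arrays."""
    A = np.column_stack(components + [np.ones(len(observed))])
    coeffs, *_ = np.linalg.lstsq(A, observed, rcond=None)
    return coeffs

def synthetic_level(components, coeffs):
    """Predict synthetic water levels from fitted coefficients."""
    A = np.column_stack(components + [np.ones(len(components[0]))])
    return A @ coeffs
```

Drawdown is then estimated as measured minus synthetic levels over the prediction period, matching the differencing step described in the abstract.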
A Physiological Time Series Dynamics-Based Approach to Patient Monitoring and Outcome Prediction
Lehman, Li-Wei H.; Adams, Ryan P.; Mayaud, Louis; Moody, George B.; Malhotra, Atul; Mark, Roger G.; Nemati, Shamim
2015-01-01
Cardiovascular variables such as heart rate (HR) and blood pressure (BP) are regulated by an underlying control system, and therefore, the time series of these vital signs exhibit rich dynamical patterns of interaction in response to external perturbations (e.g., drug administration), as well as pathological states (e.g., onset of sepsis and hypotension). A question of interest is whether “similar” dynamical patterns can be identified across a heterogeneous patient cohort, and be used for prognosis of patients’ health and progress. In this paper, we used a switching vector autoregressive framework to systematically learn and identify a collection of vital sign time series dynamics, which are possibly recurrent within the same patient and may be shared across the entire cohort. We show that these dynamical behaviors can be used to characterize the physiological “state” of a patient. We validate our technique using simulated time series of the cardiovascular system, and human recordings of HR and BP time series from an orthostatic stress study with known postural states. Using the HR and BP dynamics of an intensive care unit (ICU) cohort of over 450 patients from the MIMIC II database, we demonstrate that the discovered cardiovascular dynamics are significantly associated with hospital mortality (dynamic modes 3 and 9, p = 0.001, p = 0.006 from logistic regression after adjusting for the APACHE scores). Combining the dynamics of BP time series and SAPS-I or APACHE-III provided a more accurate assessment of patient survival/mortality in the hospital than using SAPS-I and APACHE-III alone (p = 0.005 and p = 0.045). Our results suggest that the discovered dynamics of vital sign time series may contain additional prognostic value beyond that of the baseline acuity measures, and can potentially be used as an independent predictor of outcomes in the ICU. PMID:25014976
Approaches to Literature through Theme. The Oryx Reading Motivation Series No. 1.
ERIC Educational Resources Information Center
Montgomery, Paula Kay
Intended to help teachers and librarians inspire students in grades 5-9 to read and keep reading, this book provides literature theme approaches and teaching strategies for reading and studying literature. Chapter 1 discusses approaches, methods, techniques, and strategies in using literature approaches to motivate reading. Chapter 2 defines a…
Comparison of nonparametric trend analysis according to the types of time series data
NASA Astrophysics Data System (ADS)
Heo, J.; Shin, H.; Kim, T.; Jang, H.; Kim, H.
2013-12-01
In the analysis of hydrological data, determining the existence of an overall trend due to climate change has been a major concern and an important part of the design and management of water resources for the future. The existence of a trend can be identified by plotting hydrologic time series; however, statistical methods are more accurate and objective tools for trend analysis. Statistical methods are divided into parametric and nonparametric methods. Parametric methods assume that the population is normally distributed, but most hydrological data tend to follow non-normal distributions, so nonparametric methods are considered more suitable. In this study, simulations were performed with different types of time series data, and four nonparametric methods generally used in trend analysis (the Mann-Kendall test, Spearman's rho test, SEN test, and Hotelling-Pabst test) were applied to assess the power of each. The time series data were classified into three types: Trend+Random, Trend+Cycle+Random, and Trend+Non-random. In order to add a change to the data, 11 different slopes were superimposed in each simulation. As a result, the nonparametric methods have almost similar power for the Trend+Random and Trend+Non-random series. On the other hand, the Mann-Kendall and SEN tests have slightly higher power than the Spearman's rho and Hotelling-Pabst tests for the Trend+Cycle+Random series.
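The Mann-Kendall test named above can be sketched as follows: the S statistic counts concordant minus discordant pairs, and a normal approximation converts it to a Z score. This minimal version omits the tie correction used on real hydrological records:

```python
import numpy as np

def mann_kendall(x):
    """Mann-Kendall trend test: S statistic and normal-approximation
    Z score (no correction for tied values in this sketch)."""
    x = np.asarray(x, float)
    n = len(x)
    s = 0.0
    for i in range(n - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)  # continuity correction
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, z
```

At the 5% significance level, |Z| > 1.96 indicates a trend; a monotonically increasing series gives the maximum S of n(n-1)/2.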
NASA Astrophysics Data System (ADS)
Scafetta, Nicola; West, Bruce J.
2004-04-01
The multiresolution diffusion entropy analysis is used to evaluate the stochastic information left in a time series after systematic removal of certain non-stationarities. This method allows us to establish whether the identified patterns are sufficient to capture all relevant information contained in a time series. If they do not, the method suggests the need for further interpretation to explain the residual memory in the signal. We apply the multiresolution diffusion entropy analysis to the daily count of births to teens in Texas from 1964 through 2000 because it is a typical example of a non-stationary time series, having an anomalous trend, an annual variation, as well as short-time fluctuations. The analysis is repeated for the three main racial/ethnic groups in Texas (White, Hispanic and African American), as well as for married and unmarried teens during the years 1994 to 2000, and we study the differences that emerge among the groups.
Application of nonlinear time series analysis techniques to high-frequency currency exchange data
NASA Astrophysics Data System (ADS)
Strozzi, Fernanda; Zaldívar, José-Manuel; Zbilut, Joseph P.
2002-09-01
In this work we have applied nonlinear time series analysis to high-frequency currency exchange data. The time series studied are the exchange rates between the US Dollar and 18 other foreign currencies from within and without the Euro zone. Our goal was to determine if their dynamical behaviours were in some way correlated. The lack of stationarity called for the application of recurrence quantification analysis, which is based on the definition of several parameters that allow for the quantification of recurrence plots. The method was checked using the European Monetary System currency exchanges. The results show, as expected, a high correlation between the currencies that are part of the Euro, but also a strong correlation between the Japanese Yen, the Canadian Dollar and the British Pound. Singularities of the series in the Euro zone in 1996 are also demonstrated, taking historical events into account.
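The simplest of the recurrence-plot parameters behind such an analysis, the recurrence rate, can be sketched as follows. Real recurrence quantification uses delay-embedded state vectors and further measures (determinism, laminarity, and others); this is a bare 1-D version:

```python
import numpy as np

def recurrence_rate(x, threshold):
    """Density of the recurrence plot: fraction of distinct point pairs
    in a 1-D series whose distance is below the threshold."""
    x = np.asarray(x, float)
    dist = np.abs(x[:, None] - x[None, :])  # pairwise distance matrix
    R = dist < threshold                    # recurrence matrix
    n = len(x)
    return (R.sum() - n) / (n * (n - 1))    # exclude the trivial diagonal
```

A constant series recurs everywhere (rate 1), while a steadily drifting series with steps larger than the threshold never recurs (rate 0); comparing such rates across series is one way their dynamics can be contrasted.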
Optimal Approaches to Microcomputer Implementation in the Schools. [CREATS Monograph Series.
ERIC Educational Resources Information Center
Bakke, Thomas W.
Fifth in a series of six monographs on the use of new technologies in the instruction of learning disabled students, the paper describes how schools can plan for the acquisition of computer hardware and software, and how they can provide district-level staff training in its use. Discussion focuses on the development of a technology implementation…
Series-hybrid bearing - An approach to extending bearing fatigue life at high speeds
NASA Technical Reports Server (NTRS)
Anderson, W. J.; Coe, H. H.; Fleming, D. P.; Parker, R. J.
1971-01-01
Fluid film bearing of hybrid device consists of orifice compensated annular thrust bearing and self-acting journal bearing. In series hybrid bearing, both ball bearing and annular thrust bearing carry full system thrust load, but two bearings share speed. Operation of system is stable and automatically fail-safe.
[Optic neuritis in childhood. A pediatric series, literature review and treatment approach].
Lopez-Martin, D; Martinez-Anton, J
2016-08-01
Introduction. In the pediatric age group, the most frequent form of optic neuritis generally presents after an infectious illness, with papilledema; it is usually bilateral and has a good prognosis. Conversion to multiple sclerosis is infrequent. Aim. To present the clinical and laboratory characteristics of a pediatric series of optic neuritis. Patients and methods. We analyze a series of 17 cases of optic neuritis in children and adolescents aged 4 to 14 years, referred between 2000 and 2015. Results. The median age of the series was 11 years. Female patients predominated, and a history of infection was uncommon; in five patients the involvement was bilateral, and four cases presented as retrobulbar optic neuritis. Magnetic resonance imaging showed T2 hyperintensity in the affected optic nerves of five patients. Cerebrospinal fluid studies and oligoclonal bands were normal in all cases. The patients, treated with intravenous methylprednisolone, recovered well. Only three cases later evolved to multiple sclerosis. Conclusions. In this series, the cases that evolved to multiple sclerosis showed no clinical differences, although they did present a greater number of hyperintense lesions on magnetic resonance imaging. This finding, described in previous studies, supports our diagnostic and therapeutic scheme in an attempt to approach the optimal management of this condition.
Mayaud, C.; Wagner, T.; Benischke, R.; Birk, S.
2014-01-01
The Lurbach karst system (Styria, Austria) is drained by two major springs and replenished by both autogenic recharge from the karst massif itself and a sinking stream that originates in low permeable schists (allogenic recharge). Detailed data from two events recorded during a tracer experiment in 2008 demonstrate that an overflow from one of the sub-catchments to the other is activated if the discharge of the main spring exceeds a certain threshold. Time series analysis (autocorrelation and cross-correlation) was applied to examine to what extent the various available methods support the identification of the transient inter-catchment flow observed in this binary karst system. As inter-catchment flow is found to be intermittent, the evaluation was focused on single events. In order to support the interpretation of the results from the time series analysis a simplified groundwater flow model was built using MODFLOW. The groundwater model is based on the current conceptual understanding of the karst system and represents a synthetic karst aquifer for which the same methods were applied. Using the wetting capability package of MODFLOW, the model simulated an overflow similar to what has been observed during the tracer experiment. Various intensities of allogenic recharge were employed to generate synthetic discharge data for the time series analysis. In addition, geometric and hydraulic properties of the karst system were varied in several model scenarios. This approach helps to identify effects of allogenic recharge and aquifer properties in the results from the time series analysis. Comparing the results from the time series analysis of the observed data with those of the synthetic data a good agreement was found. For instance, the cross-correlograms show similar patterns with respect to time lags and maximum cross-correlation coefficients if appropriate hydraulic parameters are assigned to the groundwater model. The comparable behaviors of the real and
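A minimal version of the lagged cross-correlation behind such cross-correlograms can be sketched as follows (a plain lagged-Pearson sketch; hydrological practice typically also standardizes and windows the series):

```python
import numpy as np

def cross_correlogram(x, y, max_lag):
    """Pearson correlation of y against x at lags 0..max_lag;
    a peak at lag k means y responds about k samples after x."""
    n = len(x)
    return [np.corrcoef(x[:n - k], y[k:])[0, 1] for k in range(max_lag + 1)]
```

For recharge and spring-discharge series, the lag of the peak estimates the system's response time, and the peak height its strength, which is how the observed and synthetic correlograms are compared in the study.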
Mayaud, C; Wagner, T; Benischke, R; Birk, S
2014-04-16
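As a rough illustration of the event-based cross-correlation analysis described above, the sketch below recovers the lag between a recharge input and a spring-discharge response. The series, the 5-step delay, and the noise level are all invented for illustration; this is not Lurbach data or the authors' MODFLOW-coupled workflow.

```python
import numpy as np

def cross_correlogram(x, y, max_lag):
    """Normalized cross-correlation r_xy(k) for lags k = 0..max_lag;
    a peak at k > 0 means x leads y by k time steps."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    n = len(x)
    return np.array([np.mean(x[: n - k] * y[k:]) for k in range(max_lag + 1)])

# Invented example: spring discharge echoes the sinking-stream input
# with a 5-step delay plus noise.
rng = np.random.default_rng(0)
recharge = rng.gamma(2.0, 1.0, 500)
discharge = np.roll(recharge, 5) + 0.1 * rng.normal(size=500)

r = cross_correlogram(recharge, discharge, max_lag=20)
best_lag = int(np.argmax(r))
```

The lag of the cross-correlogram maximum is the quantity the abstract compares between observed and synthetic discharge series.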
Goto, Kensuke; Kumarendran, Balachandran; Mettananda, Sachith; Gunasekara, Deepa; Fujii, Yoshito; Kaneko, Satoshi
2013-01-01
In tropical and subtropical regions of eastern and South-eastern Asia, dengue fever (DF) and dengue hemorrhagic fever (DHF) outbreaks occur frequently. Previous studies using time series analyses indicate an association between meteorological variables and dengue incidence; changes in meteorological conditions can therefore affect dengue outbreaks. However, difficulties in collecting detailed time series data in developing countries have led most previous studies to rely on monthly data. In addition, time series analyses are often limited to a single area because of the difficulty of collecting meteorological and dengue incidence data in multiple areas. To gain a better understanding, we examined the effects of meteorological factors on dengue incidence in three geographically distinct areas (Ratnapura, Colombo, and Anuradhapura) of Sri Lanka by time series analysis of weekly data. The weekly average maximum temperature, weekly total rainfall, and total number of dengue cases from 2005 to 2011 (7 years) were used as time series data in this study. Time series analyses were then performed using ordinary least squares regression followed by a vector autoregressive (VAR) model. In conclusion, the weekly average maximum temperature and weekly total rainfall did not significantly affect dengue incidence in the three geographically different areas of Sri Lanka, although the weekly total rainfall slightly influenced dengue incidence in the cities of Colombo and Anuradhapura.
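The regression step described above (ordinary least squares with lagged predictors, in the spirit of a VAR) can be sketched with plain least squares. The weekly series, lag structure, and coefficients below are invented for illustration and are not the Sri Lankan data.

```python
import numpy as np

def lagged_design(y, x, p):
    """Design matrix with an intercept, p lags of y and p lags of x,
    aligned with the target y[p:] (ordinary least squares setup)."""
    n = len(y)
    cols = [np.ones(n - p)]
    cols += [y[p - k : n - k] for k in range(1, p + 1)]
    cols += [x[p - k : n - k] for k in range(1, p + 1)]
    return np.column_stack(cols), y[p:]

# Invented weekly series: case counts driven by rainfall two weeks
# earlier plus autoregression (all coefficients hypothetical).
rng = np.random.default_rng(1)
rain = rng.gamma(2.0, 10.0, 364)
cases = np.empty(364)
cases[:2] = 10.0
for t in range(2, 364):
    cases[t] = 5.0 + 0.6 * cases[t - 1] + 0.3 * rain[t - 2] + rng.normal(0.0, 1.0)

X, target = lagged_design(cases, rain, p=2)
beta = np.linalg.lstsq(X, target, rcond=None)[0]
# Columns: intercept, cases(t-1), cases(t-2), rain(t-1), rain(t-2);
# the fit recovers coefficients near the generating 0.6 and 0.3.
```

Testing whether the rainfall-lag coefficients differ from zero is exactly the kind of significance question the abstract reports on.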
Methods for serial analysis of long time series in the study of biological rhythms
2013-01-01
When one is faced with the analysis of long time series, one often finds that the characteristics of circadian rhythms vary with time throughout the series. To cope with this situation, the whole series can be fragmented into successive sections which are analyzed one after the other; this constitutes a serial analysis. This article discusses serial analysis techniques, beginning with the characteristics that the sections must have and how they can affect the results. After consideration of the effects of some simple filters, different types of serial analysis are discussed systematically according to the variable analyzed or the parameters estimated: scalar magnitudes, angular magnitudes (time or phase), magnitudes related to frequencies (or periods), periodograms, and derived and/or special magnitudes and variables. The use of wavelet analysis and convolutions in long time series is also discussed. In all cases the fundamentals of each method are presented, together with practical considerations and graphic examples. The final section provides information about software available to perform this type of analysis. PMID:23867052
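The fragmentation into successive sections can be sketched as follows: a section-by-section periodogram tracks how the dominant period changes along the series. The hourly record and the drifting "circadian" period are invented for illustration, not a method from the article itself.

```python
import numpy as np

def serial_periodogram(x, section_len, step):
    """Serial analysis: fragment the series into overlapping sections
    and return the dominant FFT period of each section."""
    periods = []
    for start in range(0, len(x) - section_len + 1, step):
        seg = x[start : start + section_len]
        seg = seg - seg.mean()
        power = np.abs(np.fft.rfft(seg)) ** 2
        freqs = np.fft.rfftfreq(section_len, d=1.0)
        k = 1 + np.argmax(power[1:])       # skip the zero-frequency bin
        periods.append(1.0 / freqs[k])
    return np.array(periods)

# Invented hourly record whose "circadian" period lengthens from 24 h
# to 25 h halfway through; the serial analysis tracks the change.
t = np.arange(240 * 24)                    # 240 days of hourly samples
phase = np.where(t < t.size // 2, t / 24.0, t / 25.0)
x = np.sin(2.0 * np.pi * phase)

periods = serial_periodogram(x, section_len=600, step=240)
```

Section length and overlap are the design choices the article warns about: here 600-sample sections make both 24 h and 25 h fall on exact FFT bins.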
Shielding analysis of the TRUPACT-series casks for transportation of Hanford HLW
Banjac, V.; Sanchez, P.E.; Hills, C.R.; Heger, A.S.
1993-01-01
In this paper, the authors propose the possibility of utilizing the TRUPACT-series casks for the transportation of high-level waste (HLW) from the Hanford reservation. The configurations of the TRUPACT series are a rectangular parallelepiped and a right circular cylinder, the TRUPACT-I and TRUPACT-II, respectively. The TRUPACT series was designed as a type B contact-handled transuranic (CH-TRU) waste transportation system for use in Waste Isolation Pilot Plant-related operations and was subjected to type B container accident tests, which it successfully passed. Thus, from a safety standpoint, the TRUPACT series is provided with double containment, impact limitation, and fire-retardant capabilities. However, the shielding analysis has shown that major modifications are required to allow for the transport of even a reasonable fraction of Hanford HLW.
3D analysis of the performance degradation caused by series resistance in concentrator solar cells
Daliento, Santolo; Lancellotti, Laura
2010-01-15
This paper deals with the modeling of series resistance components in silicon concentrator solar cells. The main components of the macroscopic series resistance are analyzed by means of one-dimensional (1D), two-dimensional (2D), and three-dimensional (3D) numerical simulations. It is shown that the contribution of the lateral current flux, flowing along the emitter region, and of the transverse current flux, flowing along the metal grid, cannot be neglected and, hence, the operation of solar cells subjected to high current densities cannot be described by simple one-dimensional models. The percentage weight of 2D and 3D components on the total value of the series resistance is evaluated and rules for the proper design of the cell geometries are given. An analysis of the effectiveness of the most popular methods for the extraction of the series resistance from the I-V curves of solar cells is also proposed. (author)
Fitzgerald, Michael G.; Karlinger, Michael R.
1983-01-01
Time-series models were constructed for analysis of daily runoff and sediment discharge data from selected rivers of the Eastern United States. Logarithmic transformation and first-order differencing of the data sets were necessary to produce second-order stationary time series and remove seasonal trends. Cyclic models accounted for less than 42 percent of the variance in the water series and 31 percent in the sediment series. Analysis of the apparent oscillations of given frequencies occurring in the data indicates that frequently occurring storms can account for as much as 50 percent of the variation in sediment discharge. Components of the frequency analysis indicate that a linear representation is reasonable for the water-sediment system. Models that incorporate lagged water discharge as input prove superior to univariate techniques in modeling and prediction of sediment discharges. The random component of the models includes errors in measurement and model hypothesis and indicates no serial correlation. An index of sediment production within or between drainage basins can be calculated from the model parameters.
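The stationarity transformation named above (logarithmic transformation followed by first-order differencing) can be sketched directly; the synthetic discharge series below, with its exponential trend and seasonal cycle, is invented for illustration.

```python
import numpy as np

def log_difference(q):
    """First difference of the logarithms -- the transformation the
    study applies to produce a second-order stationary series."""
    return np.diff(np.log(q))

# Invented daily discharge with an exponential trend and a seasonal
# cycle: the raw series is non-stationary, its log-difference is not.
t = np.arange(3 * 365)
q = 100.0 * np.exp(0.001 * t + 0.5 * np.sin(2.0 * np.pi * t / 365.0))

w = log_difference(q)
# The transformation is invertible: the original series is recovered
# exactly from the differenced series and the first value.
recovered = q[0] * np.exp(np.cumsum(w))
```

The differenced series `w` is what an ARMA-type cyclic model would then be fitted to.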
NASA Technical Reports Server (NTRS)
Rudasill-Neigh, Christopher S.; Bolton, Douglas K.; Diabate, Mouhamad; Williams, Jennifer J.; Carvalhais, Nuno
2014-01-01
Forests contain a majority of the aboveground carbon (C) found in ecosystems, and understanding biomass lost from disturbance is essential to improve our C-cycle knowledge. Our study region in the Wisconsin and Minnesota Laurentian Forest had a strong decline in Normalized Difference Vegetation Index (NDVI) from 1982 to 2007, observed with the National Oceanic and Atmospheric Administration's (NOAA) series of Advanced Very High Resolution Radiometer (AVHRR) instruments. To understand the potential role of disturbances in the terrestrial C-cycle, we developed an algorithm to map forest disturbances from either harvest or insect outbreak in Landsat time-series stacks. We merged two image analysis approaches into one algorithm to monitor forest change: (1) multiple disturbance index thresholds to capture clear-cut harvest; and (2) a spectral trajectory-based image analysis with multiple confidence interval thresholds to map insect outbreak. We produced 20 maps and evaluated classification accuracy with air photos and insect air-survey data to understand the performance of our algorithm. We achieved overall accuracies ranging from 65% to 75%, with an average accuracy of 72%. The producer's and user's accuracies ranged from 32% to 70% for insect disturbance, 60% to 76% for insect mortality, and 82% to 88% for harvested forest, which was the dominant disturbance agent. Forest disturbances accounted for 22% of the total forested area (7349 km²). Our algorithm provides a basic approach to map disturbance history where large impacts to forest stands have occurred and highlights the limited spectral sensitivity of Landsat time series to outbreaks of defoliating insects. We found that only harvest and insect mortality events can be mapped with adequate accuracy with a non-annual Landsat time series. This limited our land cover understanding of NDVI decline drivers. We demonstrate that to capture more subtle disturbances with spectral trajectories, future observations
NASA Astrophysics Data System (ADS)
Kang, Xiaoyan; He, Anqi; Guo, Ran; Zhai, Yanjun; Xu, Yizhuang; Noda, Isao; Wu, Jinguang
2016-11-01
We propose a substantially simplified approach to constructing a pair of 2D asynchronous spectra based on the DAOSD approach proposed in our previous papers. By using a new concentration series, only three 1D spectra, together with two reference spectra, are used to generate a pair of 2D correlation spectra. This method removes the labor-intensive measurements required by the traditional DAOSD approach. We apply the new approach to characterize the intermolecular interaction between acetonitrile and butanone dissolved in carbon tetrachloride. The existence of intermolecular interaction between the two solutes is confirmed by the presence of a cross peak in the resultant 2D IR spectra. In addition, the absence of a cross peak around (2254, 2292) in Ψbutanone provides further experimental evidence of the intrinsic relationship between the C≡N stretching band and an overtone band (δCH3 + νC-C).
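For readers unfamiliar with 2D correlation spectroscopy, the asynchronous spectrum underlying approaches of this kind can be sketched with Noda's Hilbert-Noda matrix formulation. This is the generic computation, not the authors' DAOSD procedure, and the toy two-band intensity series is invented.

```python
import numpy as np

def hilbert_noda(m):
    """Hilbert-Noda transformation matrix N_jk = 1/(pi*(k-j)), zero on
    the diagonal, used in generalized 2D correlation spectroscopy."""
    j, k = np.meshgrid(np.arange(m), np.arange(m), indexing="ij")
    with np.errstate(divide="ignore"):
        n = 1.0 / (np.pi * (k - j))
    np.fill_diagonal(n, 0.0)
    return n

def async_spectrum(y):
    """Asynchronous 2D correlation spectrum of a series of m spectra
    (rows) over n spectral variables (columns), mean-centred."""
    m = y.shape[0]
    dy = y - y.mean(axis=0)
    return dy.T @ hilbert_noda(m) @ dy / (m - 1)

# Toy "concentration series": band 0 varies linearly, band 1 varies
# quadratically, so their perturbation profiles differ and an
# asynchronous cross peak appears (all intensities invented).
conc = np.linspace(0.1, 1.0, 5)
psi = async_spectrum(np.column_stack([conc, conc ** 2]))

# Two bands sharing the same profile give no asynchronous cross peak.
psi_sync = async_spectrum(np.column_stack([conc, 2.0 * conc]))
```

The presence or absence of the off-diagonal element plays the role of the cross peak discussed in the abstract.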
Assessing coal-mine safety regulation: A pooled time-series analysis
Chun Youngpyoung.
1991-01-01
This study attempts to assess the independent, relative, and conjoint effects of four types of variables on coal-mine safety: administrative (mine inspections, mine investigations, and mine safety grants); political (state party competition, gubernatorial party affiliation, and deregulation); economic (state per-capita income and unemployment rates); and task-related (mine size, technology, and type of mining), together with state dummy variables. Trend, Pearson correlation, and pooled time-series analyses are performed on fatal and nonfatal injury rates reported in 25 coal-producing states during the 1975-1985 time period. These are then interpreted in light of three competing theories of regulation: capture, nonmarket failure, and threshold. Analysis reveals: (1) distinctions in the total explanatory power of the model across different types of injuries, as well as across presidential administrations; (2) a consistently more powerful impact on safety of informational implementation tools (safety education grants) over command-and-control approaches (inspections and investigations) or political variables; and (3) limited, albeit conjectural, support for a threshold theory of regulation in the coal-mine safety arena.
Cheng, Qing; Lu, Xin; Wu, Joseph T.; Liu, Zhong; Huang, Jincai
2016-01-01
Guangdong experienced the largest dengue epidemic in recent history. In 2014, the number of dengue cases was the highest of the previous 10 years and comprised more than 90% of all cases. To analyze the heterogeneous transmission of dengue, a multivariate time series model decomposing dengue risk additively into endemic, autoregressive, and spatiotemporal components was used to model dengue transmission. Moreover, random effects were introduced into the model to deal with heterogeneous dengue transmission and incidence levels, and a power-law approach was embedded into the model to account for spatial interaction. There was little spatial variation in the autoregressive component. In contrast, for the endemic component there was a pronounced heterogeneity between the Pearl River Delta area and the remaining districts. For the spatiotemporal component, there was considerable heterogeneity across districts, with the highest values in some western and eastern districts. Clustering analysis revealed the patterns driving dengue transmission: the endemic component appears important in the Pearl River Delta area, where incidence is high (95 per 100,000), while areas with relatively low incidence (4 per 100,000) depend strongly on spatiotemporal spread and local autoregression. PMID:27666657
Vibration-based damage detection in plates by using time series analysis
NASA Astrophysics Data System (ADS)
Trendafilova, Irina; Manoach, Emil
2008-07-01
This paper deals with the problem of vibration health monitoring (VHM) in structures with nonlinear dynamic behaviour. It aims to introduce two viable VHM methods that use large amplitude vibrations and are based on nonlinear time series analysis. The methods suggested explore some changes in the state space geometry/distribution of the structural dynamic response with damage and their use for damage detection purposes. One of the methods uses the statistical distribution of state space points on the attractor of a vibrating structure, while the other one is based on the Poincaré map of the state space projected dynamic response. In this paper both methods are developed and demonstrated for a thin vibrating plate. The investigation is based on finite element modelling of the plate vibration response. The results obtained demonstrate the influence of damage on the local dynamic attractor of the plate state space and the applicability of the proposed strategies for damage assessment. The approach taken in this study and the suggested VHM methods are rather generic and permit development and applications for other more complex nonlinear structures.
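The state-space reconstruction underlying both methods can be sketched by time-delay embedding. The plate responses and the simple spread feature below are invented stand-ins for the finite-element signals and attractor statistics used in the paper.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Reconstruct a state-space trajectory from a scalar response by
    time-delay embedding (the attractor used for damage features)."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def attractor_spread(x, dim=2, tau=25):
    """Total spread of the point distribution on the reconstructed
    attractor -- a simple scalar damage-sensitive feature."""
    return delay_embed(x, dim, tau).std(axis=0).sum()

# Invented plate responses: a clean oscillation for the healthy state,
# a noisier one for the "damaged" state; the attractor spread differs.
rng = np.random.default_rng(2)
t = np.linspace(0.0, 40.0 * np.pi, 4000)
healthy = np.sin(t)
damaged = np.sin(t) + 0.3 * rng.normal(size=t.size)

feature_healthy = attractor_spread(healthy)
feature_damaged = attractor_spread(damaged)
```

Comparing such attractor statistics between baseline and inspection signals is the generic form of the damage-detection strategies the paper develops.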
The Terror Attacks of 9/11 and Suicides in Germany: A Time Series Analysis.
Medenwald, Daniel
2016-04-01
Data on the effect of the September 11, 2001 (9/11) terror attacks on suicide rates remain inconclusive. Reportedly, even people located far from the attack site have considerable potential for personalizing the events of 9/11. Durkheim's theory states that suicides decrease during wartime; thus, a decline in suicides might have been expected after 9/11. We conducted a time series analysis of 164,136 officially recorded suicides in Germany between 1995 and 2009 using the algorithm introduced by Box and Jenkins. Compared with the average death rate, we observed no relevant change in the suicide rate of either sex after 9/11. Our estimates of an excess of suicides approached the null effect value on 9/11 and within the 7-day period after it, which also held when subsamples of deaths in urban or rural settings were examined. No evidence supporting Durkheim's theory was found in this sample with respect to the 9/11 attacks.
O'Neill, I K; Fishbein, L
1986-01-01
Since 1975, the IARC has been preparing a series of volumes entitled "Environmental Carcinogens: Selected Methods of Analysis" (IARC Manual series), whose purpose is to assist analysts, epidemiologists and regulatory authorities in planning or performing exposure measurements that are truly comparable between different studies. The Manual series provides expert information within each volume on multi-media sampling, methods of analysis and some background on the epidemiology, metabolism, and use/occurrence of a group of known or suspect carcinogens. So far, eleven volumes have been published or are in preparation on the following subjects: N-nitrosamines, vinyl chloride, PAH, aromatic amines, mycotoxins, N-nitroso compounds, volatile halogenated hydrocarbons, metals, passive smoking, benzene and alkylated benzenes, and dioxins, PCDFs and PCBs. The presentation will discuss needs and priorities for the use of analytical chemistry in estimating exposures of apparently greatest relevance to cancer causation, i.e., the approach taken in developing this series. Indications from epidemiology, evaluations of carcinogenic risk to humans, and recent developments in total exposure assessment are that new methods and matrices need more emphasis, e.g., biochemical dosimetry, exhaled breath, and indoor air.
Task Analysis: A Top-Down Approach.
ERIC Educational Resources Information Center
Harmon, Paul
1983-01-01
This approach to task analysis includes descriptions of (1) inputs, outputs, and jobs; (2) flow of materials and decisions between jobs; (3) inputs, major tasks, and outputs of each job; (4) sequence of steps for major tasks; (5) heuristics/algorithms for each sequence step; and (6) information needed to use the heuristics/algorithms. (EAO)
NASA Astrophysics Data System (ADS)
Sheng, X.; Li, M.; Jones, C. J. C.; Thompson, D. J.
2007-06-01
In this paper, the Fourier-series approach is employed to study wheel-rail interactions generated by a single wheel, or by multiple wheels, moving at a constant speed along a railway track. This approach has been previously explored by other researchers and what is presented here is an improved version. In this approach, the track is represented by an infinitely long periodic structure with the period equal to the sleeper spacing, and the vertical irregular profile (roughness) of the railhead is assumed to be periodic in the track direction with the period equal to the length of an integer number, N, of sleeper bays. By assuming linear dynamics for the wheel/track system and steady state, each wheel/rail force is a periodic function of time and can be expressed as a Fourier series. The Fourier coefficients are then shown to be determined by solving, separately, N sets of linear algebraic equations. The coefficient matrix of each set of equations is independent of rail roughness and therefore this approach is particularly useful in modelling the generation and growth of rail roughness of short wavelengths. Excitation purely from the axle loads moving over the periodic track structure is realised by assuming a smooth railhead surface, and subsequently roughness equivalent to such an excitation is defined and evaluated. This equivalent roughness may, in addition to the actual rail roughness, be input into models in which the effect of moving axle loads has been excluded, so that the predictions from those models can be improved. Results are produced using the improved Fourier-series approach to investigate the effects of wheel speeds, roughness wavelengths and interactions between multiple wheels on wheel/rail contact forces.
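The core idea, a steady-state periodic wheel/rail force expanded in a Fourier series, can be sketched numerically. The load magnitudes, sleeper spacing, and speed below are invented, and an FFT over one period stands in for the paper's solution of the N sets of algebraic equations.

```python
import numpy as np

def fourier_coefficients(f, period, n_harmonics, n_samples=4096):
    """Complex Fourier coefficients c_0..c_n of a periodic function,
    estimated by uniform sampling over one period and the FFT."""
    t = np.arange(n_samples) * period / n_samples
    c = np.fft.fft(f(t)) / n_samples
    return c[: n_harmonics + 1]

# Invented steady-state wheel/rail force: a static axle load plus one
# harmonic at the sleeper-passing frequency (magnitudes hypothetical).
P0, P1 = 100e3, 5e3                  # N
T = 0.6 / 80.0                        # 0.6 m sleeper bay at 80 m/s

def force(t):
    return P0 + P1 * np.cos(2.0 * np.pi * t / T)

c = fourier_coefficients(force, T, n_harmonics=3)
# c[0] is the mean (static) load; |c[1]| is half the harmonic amplitude.
```

In the paper the coefficients come from the track dynamics rather than a prescribed force, but the periodic representation is the same.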
Bosnjak, Roman; Benedicic, Mitja; Vittori, Alenka
2013-01-01
Background The choice of an endoscopic expanded endonasal approach introduces the possibility of improved gross total resection of craniopharyngioma while minimizing surgical morbidity in a significant subset of patients. Methods From our trans-sphenoidal surgical series of 331 cases, we retrospectively reviewed visual, endocrine and neuro-cognitive outcomes in the first consecutive eight patients (median age 63 years; range 47–73 years) with newly diagnosed supradiaphragmatic craniopharyngioma (median tumour height 23 mm; range 15–34 mm) removed by the expanded endonasal approach (median follow-up 27 months; range 10–69 months). Gross total resection was attempted in all patients. Results Gross total resection was achieved in 6 of 8 patients. Visual improvement was present in 6 of 8 patients, or 14 of 16 eyes. New endocrinopathy, including diabetes insipidus, appeared in 5 of 8 patients. The stalk was preserved in 4 patients. Cognitive decline was present in 2 cases. Five of 8 patients retained their previous quality of life. Conclusions Our early outcome results are comparable to the few recent expanded endonasal approach series, except for the incidence of new endocrinopathy and the cerebrospinal fluid leak rate. This was influenced by the higher number of transinfundibular tumours in our series, where stalk preservation is less likely, and by not using a nasoseptal flap or gasket closure in the first half of cases. Combining data from the literature with ours, the expanded endonasal approach shows a trend toward an improved gross total resection rate with less morbidity, more obviously for visual outcome and quality of life than for endocrine outcome. However, the validity of the expanded endonasal approach should be confirmed in a larger number of patients with a longer follow-up period. PMID:24133392
Mobile Visualization and Analysis Tools for Spatial Time-Series Data
NASA Astrophysics Data System (ADS)
Eberle, J.; Hüttich, C.; Schmullius, C.
2013-12-01
The Siberian Earth System Science Cluster (SIB-ESS-C) provides access and analysis services for spatial time-series data built on products from the Moderate Resolution Imaging Spectroradiometer (MODIS) and climate data from meteorological stations. A web portal for data access, visualization and analysis with standards-compliant web services has already been developed for SIB-ESS-C. As a further enhancement, a mobile app was developed to provide easy access to these time-series data during field campaigns. The app sends the current position from the GPS receiver and a dataset selected by the user (such as land surface temperature or vegetation indices) to our SIB-ESS-C web service and receives the requested time-series data for the identified pixel in real time. The data are then plotted directly in the app. The user can also analyze the time-series data for breakpoints and other phenological values. These analyses are executed on demand on our SIB-ESS-C web server and the results are transferred to the app. Any processing can also be done through the SIB-ESS-C web portal. The aim of this work is to make spatial time-series data and analysis functions available to end users without the need for local data processing. In this presentation the author gives an overview of this new mobile app, its functionalities, the technical infrastructure, and technological issues (how the app was developed and the experience gained).
NASA Astrophysics Data System (ADS)
Li, Y.; Bai, C.
2013-12-01
Predicting the distribution of engineered nanomaterials (ENMs) in the environment will provide critical information for risk assessment and policy development to regulate these emerging contaminants. The fate and transport of ENMs in the natural subsurface environment is a function of time and subject to various uncertainties. Here, we explore the feasibility of applying advanced statistical methodologies (i.e., time series analysis) to forecast the ENM concentration distribution in porous media over time. Hypothetical scenarios for the release of nanoparticles into a subsurface aquifer were simulated using randomly generated permeability fields that were based on a mildly heterogeneous field site in Oscoda, MI. A modified Modular Three-Dimensional Multispecies Transport Model (MT3DMS) with the capability to simulate ENM transport was used for the simulations. The time series data of five ENM distribution parameters, including the far-front of the aqueous phase ENM plume, the far-front of the attached phase ENM distribution, the x-centroid of the aqueous phase ENM plume, and the x-centroid of the attached phase ENM distribution, were calculated based on the simulated results of fifteen random fields. Time series analysis was then applied to forecast the future values of these ENM distribution parameters. The future values and confidence intervals predicted by the time series analysis were found to be in good agreement with the numerically simulated values. This proof-of-concept effort demonstrates the possibility of applying time series analysis to predict the ENM distribution at a field site.
A new approach for agroecosystems monitoring using high-revisit multitemporal satellite data series
NASA Astrophysics Data System (ADS)
Diez, M.; Moclán, C.; Romo, A.; Pirondini, F.
2014-10-01
With increasing population pressure throughout the world and the need for increased agricultural production, there is a definite need for improved management of the world's agricultural resources. Comprehensive, reliable and timely information on agricultural resources is necessary for the implementation of effective management decisions. In that sense, the demand for high-quality and high-frequency geo-information for monitoring of agriculture and its associated ecosystems has been growing in recent decades. Satellite image data enable direct observation of large areas at frequent intervals and therefore allow unprecedented mapping and monitoring of crop evolution. Furthermore, real-time analysis can assist in making timely management decisions that affect the outcome of the crops. The DEIMOS-1 satellite, owned and operated by ELECNOR DEIMOS IMAGING (Spain), provides 22-m, 3-band imagery with a very wide (620-km) swath, and has been specifically designed to produce high-frequency revisits over very large areas. This capability has been proven through the contracts awarded to Airbus Defence and Space every year since 2011, under which DEIMOS-1 has provided the USDA with the bulk of the imagery used to monitor the crop season in the Lower 48, in cooperation with its twin satellite, DMCii's UK-DMC2. Furthermore, high-density agricultural areas have been targeted with increased frequency and analyzed in near real time to monitor their evolution closely. In this paper we present the results obtained from a campaign carried out in 2013 with the DEIMOS-1 and UK-DMC2 satellites. This campaign provided high-frequency revisits of target areas, with one image every two days on average: almost a ten-fold frequency improvement with respect to Landsat-8. The results clearly show the effectiveness of a high-frequency monitoring approach with high-resolution images compared with classic strategies, where results are more exposed to weather conditions.
Modified cross sample entropy and surrogate data analysis method for financial time series
NASA Astrophysics Data System (ADS)
Yin, Yi; Shang, Pengjian
2015-09-01
For researching multiscale behaviors from the angle of entropy, we propose a modified cross sample entropy (MCSE) and combine it with surrogate data analysis in order to compute entropy differences between the original dynamics and surrogate series (MCSDiff). MCSDiff is applied to simulated signals to demonstrate its accuracy and then employed on US and Chinese stock markets. We illustrate the presence of multiscale behavior in the MCSDiff results and reveal that there is synchrony contained in the original financial time series, with intrinsic relations between them that are destroyed by surrogate data analysis. Furthermore, the multifractal behaviors of the cross-correlations between these financial time series are investigated by the multifractal detrended cross-correlation analysis (MF-DCCA) method, since multifractal analysis is a multiscale analysis. We explore the multifractal properties of the cross-correlations between the US and Chinese markets and show the distinctiveness of NQCI and HSI among the markets in their own regions. It can be concluded that the weaker cross-correlation between the US markets provides evidence of a better inner mechanism in the US stock markets than in the Chinese stock markets. Studying the multiscale features and properties of financial time series can provide valuable information for understanding the inner mechanism of financial markets.
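A minimal cross sample entropy sketch helps make the quantity concrete. This is the unmodified textbook form, not the authors' MCSE variant or the surrogate-data differencing, and the sine and noise series are invented to show that synchronous series score lower.

```python
import numpy as np

def cross_sample_entropy(u, v, m=2, r=0.2):
    """Cross sample entropy of two standardized series: -ln(A/B), where
    B and A count template cross-matches of length m and m+1 within
    tolerance r under the Chebyshev (maximum) distance."""
    u = (u - u.mean()) / u.std()
    v = (v - v.mean()) / v.std()

    def matches(length):
        tu = np.lib.stride_tricks.sliding_window_view(u, length)
        tv = np.lib.stride_tricks.sliding_window_view(v, length)
        d = np.abs(tu[:, None, :] - tv[None, :, :]).max(axis=2)
        return np.count_nonzero(d <= r)

    return -np.log(matches(m + 1) / matches(m))

# Invented illustration: a sine tracked by a slightly noisy copy is far
# more "synchronous" (lower cross-SampEn) than a sine versus noise.
rng = np.random.default_rng(3)
t = np.linspace(0.0, 20.0 * np.pi, 600)
sig = np.sin(t)
e_sync = cross_sample_entropy(sig, sig + 0.02 * rng.normal(size=600))
e_noise = cross_sample_entropy(sig, rng.normal(size=600))
```

MCSDiff, as described in the abstract, would then subtract the same statistic computed on surrogate series from the original-series value.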
An Integrated Approach to Life Cycle Analysis
NASA Technical Reports Server (NTRS)
Chytka, T. M.; Brown, R. W.; Shih, A. T.; Reeves, J. D.; Dempsey, J. A.
2006-01-01
Life Cycle Analysis (LCA) is the evaluation of the impacts that design decisions have on a system and provides a framework for identifying and evaluating design benefits and burdens associated with the life cycles of space transportation systems from a "cradle-to-grave" approach. Sometimes called life cycle assessment, life cycle approach, or "cradle to grave analysis", it represents a rapidly emerging family of tools and techniques designed to be a decision support methodology and aid in the development of sustainable systems. The implementation of a Life Cycle Analysis can vary and may take many forms; from global system-level uncertainty-centered analysis to the assessment of individualized discriminatory metrics. This paper will focus on a proven LCA methodology developed by the Systems Analysis and Concepts Directorate (SACD) at NASA Langley Research Center to quantify and assess key LCA discriminatory metrics, in particular affordability, reliability, maintainability, and operability. This paper will address issues inherent in Life Cycle Analysis including direct impacts, such as system development cost and crew safety, as well as indirect impacts, which often take the form of coupled metrics (i.e., the cost of system unreliability). Since LCA deals with the analysis of space vehicle system conceptual designs, it is imperative to stress that the goal of LCA is not to arrive at the answer but, rather, to provide important inputs to a broader strategic planning process, allowing the managers to make risk-informed decisions, and increase the likelihood of meeting mission success criteria.
ERIC Educational Resources Information Center
Chambers, Jay G.
This report describes two alternative approaches to measuring resources in K-12 education. One approach relies heavily on traditional accounting data, whereas the other draws on detailed information about the jobs and assignments of individual school personnel. It outlines the differences between accounting and economics and discusses how each…
Using Time Series Analysis to Predict Cardiac Arrest in a Pediatric Intensive Care Unit
Kennedy, Curtis E; Aoki, Noriaki; Mariscalco, Michele; Turley, James P
2015-01-01
Objectives To build and test cardiac arrest prediction models in a pediatric intensive care unit, using time series analysis as input, and to measure changes in prediction accuracy attributable to different classes of time series data. Methods A retrospective cohort study of pediatric intensive care patients over a 30 month study period. All subjects identified by code documentation sheets with matches in hospital physiologic and laboratory data repositories and who underwent chest compressions for two minutes were included as arrest cases. Controls were randomly selected from patients that did not experience arrest and who survived to discharge. Modeling data was based on twelve hours of data preceding the arrest (reference time for controls). Measurements and Main Results 103 cases of cardiac arrest and 109 control cases were used to prepare a baseline data set that consisted of 1025 variables in four data classes: multivariate, raw time series, clinical calculations, and time series trend analysis. We trained 20 arrest prediction models using a matrix of five feature sets (combinations of data classes) with four modeling algorithms: linear regression, decision tree, neural network and support vector machine. The reference model (multivariate data with regression algorithm) had an accuracy of 78% and 87% area under the receiver operating characteristic curve (AUROC). The best model (multivariate + trend analysis data with support vector machine algorithm) had an accuracy of 94% and 98% AUROC. Conclusions Cardiac arrest predictions based on a traditional model built with multivariate data and a regression algorithm misclassified cases 3.7 times more frequently than predictions that included time series trend analysis and built with a support vector machine algorithm. Although the final model lacks the specificity necessary for clinical application, we have demonstrated how information from time series data can be used to increase the accuracy of clinical
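The gain from time-series trend features can be sketched as follows. The heart-rate windows, drift rate, and slope threshold are all invented, and a simple threshold classifier stands in for the study's support vector machine; the point is only that the fitted slope separates groups that raw values alone would not.

```python
import numpy as np

def trend_features(series):
    """Least-squares slope and residual spread of a monitoring window;
    these are the 'time series trend analysis' class of model inputs."""
    t = np.arange(len(series))
    slope, intercept = np.polyfit(t, series, 1)
    residuals = series - (slope * t + intercept)
    return slope, residuals.std()

# Invented physiology: heart-rate windows drifting upward before an
# arrest versus stable control windows, both noisy around 120 bpm.
rng = np.random.default_rng(4)

def window(drift):
    return 120.0 + drift * np.arange(60) + rng.normal(0.0, 5.0, 60)

case_slopes = np.array([trend_features(window(0.5))[0] for _ in range(50)])
ctrl_slopes = np.array([trend_features(window(0.0))[0] for _ in range(50)])

# A simple threshold on the slope separates the two groups.
threshold = 0.25
accuracy = 0.5 * (np.mean(case_slopes > threshold)
                  + np.mean(ctrl_slopes <= threshold))
```

In the study, such trend features were one of four data classes fed to the classifier, and their addition drove the accuracy gain reported above.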
Approach to uncertainty in risk analysis
Rish, W.R.
1988-08-01
In the Fall of 1985, EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analysis techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities, and on effectively communicating uncertainty analysis results, is included. Examples from actual applications are presented.
Multivariate analysis: A statistical approach for computations
NASA Astrophysics Data System (ADS)
Michu, Sachin; Kaushik, Vandana
2014-10-01
Multivariate analysis is a statistical approach commonly used in automotive diagnosis, in education, in evaluating clusters in finance, and, more recently, in the health-related professions. The objective of the paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks on the network, such as DDoS attacks and network scanning.
Detecting Anomaly Regions in Satellite Image Time Series Based on Seasonal Autocorrelation Analysis
NASA Astrophysics Data System (ADS)
Zhou, Z.-G.; Tang, P.; Zhou, M.
2016-06-01
Anomaly regions in satellite images can reflect unexpected changes of land cover caused by flood, fire, landslide, etc. Detecting anomaly regions in satellite image time series is important for studying the dynamic processes of land cover changes as well as for disaster monitoring. Although several methods have been developed to detect land cover changes using satellite image time series, they are generally designed for detecting inter-annual or abrupt land cover changes, and do not focus on detecting spatio-temporal changes in continuous images. In order to identify spatio-temporal dynamic processes of unexpected changes of land cover, this study proposes a method for detecting anomaly regions in each image of a satellite image time series based on seasonal autocorrelation analysis. The method was validated with a case study detecting the spatio-temporal progression of a severe flood using Terra/MODIS image time series. Experiments demonstrated the advantages of the method: (1) it can effectively detect anomaly regions in each image of a satellite image time series, showing the spatio-temporal evolution of the anomaly regions; (2) it can be tuned (e.g., via z-value or significance level) to meet detection accuracy requirements, with overall accuracy up to 89% and precision above 90%; and (3) it does not need time series smoothing and can detect anomaly regions in noisy satellite images with high reliability.
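The abstract does not spell out the algorithm; as a rough illustration of the general idea — comparing each observation against the values at the same phase of the seasonal cycle in other years — a leave-one-out z-score sketch might look like this (hypothetical pixel values, not MODIS data):

```python
from statistics import mean, stdev

def seasonal_anomalies(series, period, z_thresh=3.0):
    """Flag observations deviating strongly from the other years' values
    at the same phase of the seasonal cycle (leave-one-out z-score).
    This is an illustrative stand-in, not the paper's method."""
    phase = [t % period for t in range(len(series))]
    flags = []
    for t, v in enumerate(series):
        hist = [u for i, u in enumerate(series) if phase[i] == phase[t] and i != t]
        if len(hist) < 2:
            flags.append(False)
            continue
        mu, sd = mean(hist), stdev(hist)
        if sd == 0:
            flags.append(v != mu)   # identical history: any change is anomalous
        else:
            flags.append(abs(v - mu) > z_thresh * sd)
    return flags

# Three "years" (period 4) of a seasonal signal, with one flood-like drop at t=10
series = [10, 20, 30, 20, 10, 20, 30, 20, 10, 20, 2, 20]
flags = seasonal_anomalies(series, period=4)
print([t for t, f in enumerate(flags) if f])  # [10]
```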
Filter-based multiscale entropy analysis of complex physiological time series.
Xu, Yuesheng; Zhao, Liang
2013-08-01
Multiscale entropy (MSE) has been widely and successfully used in analyzing the complexity of physiological time series. We reinterpret the averaging process in MSE as filtering a time series by a filter of a piecewise constant type. From this viewpoint, we introduce filter-based multiscale entropy (FME), which filters a time series to generate multiple frequency components, and then we compute the blockwise entropy of the resulting components. By choosing filters adapted to the feature of a given time series, FME is able to better capture its multiscale information and to provide more flexibility for studying its complexity. Motivated by the heart rate turbulence theory, which suggests that the human heartbeat interval time series can be described in piecewise linear patterns, we propose piecewise linear filter multiscale entropy (PLFME) for the complexity analysis of the time series. Numerical results from PLFME are more robust to data of various lengths than those from MSE. The numerical performance of the adaptive piecewise constant filter multiscale entropy without prior information is comparable to that of PLFME, whose design takes prior information into account.
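The reinterpretation of coarse-graining as piecewise-constant filtering can be made concrete: MSE's scale-τ step averages non-overlapping blocks (a boxcar filter followed by downsampling), and FME replaces that boxcar with an arbitrary filter. A minimal sketch on an illustrative series, not the paper's implementation:

```python
def coarse_grain(x, tau):
    """MSE coarse-graining at scale tau: boxcar-filter (block average),
    then downsample by tau."""
    return [sum(x[i * tau:(i + 1) * tau]) / tau for i in range(len(x) // tau)]

def filtered_grain(x, weights):
    """FME-style generalization: any FIR filter in place of the boxcar,
    downsampled by the filter length."""
    tau = len(weights)
    return [sum(w * v for w, v in zip(weights, x[i * tau:(i + 1) * tau]))
            for i in range(len(x) // tau)]

x = [1, 3, 2, 4, 6, 8, 5, 7]
print(coarse_grain(x, 2))              # [2.0, 3.0, 7.0, 6.0]
print(filtered_grain(x, [0.5, 0.5]))   # identical to coarse_grain at scale 2
print(filtered_grain(x, [1.0, -1.0]))  # [-2.0, -2.0, -2.0, -2.0] (detail component)
```

The blockwise entropy of each filtered component would then be computed with a sample-entropy-style estimator.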
Complexity analysis of the air temperature and the precipitation time series in Serbia
NASA Astrophysics Data System (ADS)
Mimić, G.; Mihailović, D. T.; Kapor, D.
2017-02-01
In this paper, we have analyzed the time series of daily values for three meteorological elements, two continuous and one discontinuous: the maximum and minimum air temperature and the precipitation. The analysis was based on observations from seven stations in Serbia over the period 1951-2010. The main aim of this paper was to quantify the complexity of the annual values of these time series and to calculate the rate of its change. For that purpose, we used the sample entropy and the Kolmogorov complexity as measures which can indicate the variability and irregularity of a given time series. The results show that the maximum temperature has an increasing trend over the period, indicating a warming in the range of 1-2 °C. The increasing temperature indicates higher internal energy of the atmosphere, changing the weather patterns manifested in the time series. The Kolmogorov complexity of the maximum temperature time series has a statistically significant increasing trend, while the sample entropy has an increasing but statistically insignificant trend. The trends of the complexity measures for the minimum temperature depend on the location. Both complexity measures for the precipitation time series have decreasing trends.
The Effect of Divorce on Suicide in Japan: A Time Series Analysis, 1950-1980.
ERIC Educational Resources Information Center
Stack, Steve
1992-01-01
Explored relationship between divorce and suicide in Japan. Time series analysis was unable to substantiate divorce-suicide pattern for Japan. Although research did not offer support for relationship between divorce and suicide which Durkheim predicted, it did corroborate Durkheim's general theory of family integration. (Author/NB)
ERIC Educational Resources Information Center
Haddock, Shelley A.; MacPhee, David; Zimmerman, Toni Schindler
2001-01-01
Content analysis of 23 American Association for Marriage and Family Therapy Master Series tapes was used to determine how well feminist behaviors have been incorporated into ideal family therapy practice. Feminist behaviors were infrequent, being evident in fewer than 3% of time blocks in event sampling and 10 of 39 feminist behaviors of the…
Donges, Jonathan F; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik V; Marwan, Norbert; Dijkstra, Henk A; Kurths, Jürgen
2015-11-01
We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology.
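Among the methods listed, the natural visibility graph has a particularly compact definition: two samples are linked if the straight line between them clears every intermediate sample. The sketch below implements that criterion from scratch; it is an independent illustration, not pyunicorn's own API (pyunicorn provides its own visibility-graph classes):

```python
def visibility_edges(y):
    """Natural visibility graph (Lacasa et al.): connect samples (a, b) if
    every intermediate sample lies strictly below the straight line
    joining (a, y[a]) and (b, y[b])."""
    n = len(y)
    edges = []
    for a in range(n):
        for b in range(a + 1, n):
            if all(y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                   for c in range(a + 1, b)):
                edges.append((a, b))
    return edges

print(visibility_edges([1, 3, 2, 4]))  # [(0, 1), (1, 2), (1, 3), (2, 3)]
```

Note that the peak at index 1 "sees" index 3 over the dip at index 2, while index 0 is blocked by that same peak.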
A Comparison of Missing-Data Procedures for Arima Time-Series Analysis
ERIC Educational Resources Information Center
Velicer, Wayne F.; Colby, Suzanne M.
2005-01-01
Missing data are a common practical problem for longitudinal designs. Time-series analysis is a longitudinal method that involves a large number of observations on a single unit. Four different missing-data methods (deletion, mean substitution, mean of adjacent observations, and maximum likelihood estimation) were evaluated. Computer-generated…
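Two of the four evaluated procedures are easy to state concretely: series-mean substitution and the mean of adjacent observations. A minimal sketch with `None` marking missing interior observations (hypothetical data, not the study's simulations):

```python
def mean_substitution(x):
    """Replace each missing value (None) with the mean of the observed points."""
    obs = [v for v in x if v is not None]
    mu = sum(obs) / len(obs)
    return [mu if v is None else v for v in x]

def adjacent_mean(x):
    """Replace each interior missing value with the mean of the nearest
    observed neighbours; assumes the endpoints are observed."""
    out = list(x)
    for i, v in enumerate(out):
        if v is None:
            left = next(out[j] for j in range(i - 1, -1, -1) if out[j] is not None)
            right = next(out[j] for j in range(i + 1, len(out)) if out[j] is not None)
            out[i] = (left + right) / 2
    return out

x = [2, None, 4, None, 8]
print(adjacent_mean(x))      # [2, 3.0, 4, 6.0, 8]
print(mean_substitution(x))  # missing points replaced by the overall mean 14/3
```

Adjacent-mean imputation preserves local trend, which matters for ARIMA models whose parameters are driven by autocorrelation; mean substitution flattens it.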
Rinnsal: Exercises in Location Analysis. Instructional Activities Series IA/S-6.
ERIC Educational Resources Information Center
Croft, Jerry
This activity is one of a series of 17 teacher-developed instructional activities for geography at the secondary-grade level described in SO 009 140. The activity investigates economic change in a developing region in the United States. "Rinnsal" is a geographical simulation game lasting three weeks that involves location analysis concepts.…
Measuring teaching through hormones and time series analysis: Towards a comparative framework.
Ravignani, Andrea; Sonnweber, Ruth
2015-01-01
Arguments about the nature of teaching have depended principally on naturalistic observation and some experimental work. Additional measurement tools, and physiological variations and manipulations can provide insights on the intrinsic structure and state of the participants better than verbal descriptions alone: namely, time-series analysis, and examination of the role of hormones and neuromodulators on the behaviors of teacher and pupil.
Qian, Xi-Yuan; Liu, Ya-Min; Jiang, Zhi-Qiang; Podobnik, Boris; Zhou, Wei-Xing; Stanley, H Eugene
2015-06-01
When common factors strongly influence two power-law cross-correlated time series recorded in complex natural or social systems, using detrended cross-correlation analysis (DCCA) without considering these common factors will bias the results. We use detrended partial cross-correlation analysis (DPXA) to uncover the intrinsic power-law cross correlations between two simultaneously recorded time series in the presence of nonstationarity after removing the effects of other time series acting as common forces. The DPXA method is a generalization of the detrended cross-correlation analysis that takes into account partial correlation analysis. We demonstrate the method by using bivariate fractional Brownian motions contaminated with a fractional Brownian motion. We find that the DPXA is able to recover the analytical cross Hurst indices, and thus the multiscale DPXA coefficients are a viable alternative to the conventional cross-correlation coefficient. We demonstrate the advantage of the DPXA coefficients over the DCCA coefficients by analyzing contaminated bivariate fractional Brownian motions. We calculate the DPXA coefficients and use them to extract the intrinsic cross correlation between crude oil and gold futures by taking into consideration the impact of the U.S. dollar index. We develop the multifractal DPXA (MF-DPXA) method in order to generalize the DPXA method and investigate multifractal time series. We analyze multifractal binomial measures masked with strong white noises and find that the MF-DPXA method quantifies the hidden multifractal nature while the multifractal DCCA method fails.
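DPXA builds on the DCCA coefficient, which is straightforward to sketch: integrate both series, detrend them in non-overlapping boxes, and form the ratio of the detrended cross-covariance to the detrended variances. The following illustrates the plain DCCA coefficient only, not the partial (DPXA) variant that additionally regresses out common driving series:

```python
import math

def _detrend_residuals(z):
    """Residuals of an ordinary least-squares line fit within one box."""
    n = len(z)
    t = list(range(n))
    tm, zm = sum(t) / n, sum(z) / n
    b = sum((ti - tm) * (zi - zm) for ti, zi in zip(t, z)) / \
        sum((ti - tm) ** 2 for ti in t)
    a = zm - b * tm
    return [zi - (a + b * ti) for ti, zi in zip(t, z)]

def dcca_coefficient(x, y, s):
    """DCCA coefficient at box size s: rho = F2_xy / sqrt(F2_xx * F2_yy),
    computed from the linearly detrended profiles of the two series in
    non-overlapping boxes of length s (s >= 3)."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    X, Y, cx, cy = [], [], 0.0, 0.0
    for xi, yi in zip(x, y):            # integrated (profile) series
        cx += xi - mx; cy += yi - my
        X.append(cx); Y.append(cy)
    fxy = fxx = fyy = 0.0
    for k in range(len(X) // s):
        rx = _detrend_residuals(X[k * s:(k + 1) * s])
        ry = _detrend_residuals(Y[k * s:(k + 1) * s])
        fxy += sum(a * b for a, b in zip(rx, ry))
        fxx += sum(a * a for a in rx)
        fyy += sum(b * b for b in ry)
    return fxy / (fxx * fyy) ** 0.5

x = [math.sin(0.3 * t) + 0.01 * t for t in range(100)]
print(dcca_coefficient(x, x, 10))               # 1.0 (within float error)
print(dcca_coefficient(x, [-v for v in x], 10)) # -1.0 (within float error)
```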
Driver Education Task Analysis: Instructional Objectives. HumRRO Safety Series.
ERIC Educational Resources Information Center
McKnight, A. James; Hundt, Alan G.
Developed from a systematic analysis of driving behaviors, this publication contains a set of instructional objectives for driver education courses and a series of tests designed to measure the degree to which the instructional objectives have been met by students. Part 1 provides a description of objectives for 74 learning units, including such…
Time series analysis of Mexico City subsidence constrained by radar interferometry
NASA Astrophysics Data System (ADS)
Doin, Marie-Pierre; Lopez-Quiroz, Penelope; Yan, Yajing; Bascou, Pascale; Pinel, Virginie
2010-05-01
unwrapping errors for each pixel and show that they are strongly decreased by iterations in the unwrapping process. (3) Finally, we present a new algorithm for time series analysis that differs from classical SVD decomposition and is best suited to the present database. Accurate deformation time series are then derived over the metropolitan area of the city with a spatial resolution of 30 × 30 m. We also use the Gamma-PS software on the same data set. The phase differences are unwrapped within small patches with respect to a reference point chosen in each patch, whose phase is in turn unwrapped relative to a reference point common to the whole area of interest. After removing the modelled contribution of the linear displacement rate and DEM error, some residual interferograms, presenting unwrapping errors because of a strong residual orbital ramp or atmospheric phase screen, are spatially unwrapped by a minimum cost-flow algorithm. The next steps are to estimate and remove the residual orbital ramp and to apply a temporal low-pass filter to remove atmospheric contributions. The step-by-step comparison of the SBAS and PS approaches shows the complementarity of the two methods. The SBAS analysis provides subsidence rates with mm/yr accuracy over the whole basin, together with the nonlinear behavior of the subsidence through time, at the expense of some spatial regularization. The PS method provides locally accurate, pointwise deformation rates, but fails in this case to yield a good large-scale map or the nonlinear temporal behavior of the subsidence. We conclude that the relative contrast in subsidence between individual buildings and infrastructure must be relatively small, on average on the order of 5 mm/yr.
NASA Astrophysics Data System (ADS)
Eduardo Virgilio Silva, Luiz; Otavio Murta, Luiz
2012-12-01
Complexity in time series is an intriguing feature of living dynamical systems, with potential use for identification of system state. Although various methods have been proposed for measuring physiologic complexity, uncorrelated time series are often assigned high values of complexity, erroneously classifying them as complex physiological signals. Here, we propose and discuss a method for complex system analysis based on generalized statistical formalism and surrogate time series. Sample entropy (SampEn) was rewritten, inspired by Tsallis generalized entropy, as a function of the parameter q (qSampEn). qSDiff curves, which consist of the differences between the qSampEn of the original and surrogate series, were calculated. We evaluated qSDiff for 125 real heart rate variability (HRV) dynamics, divided into groups of 70 healthy, 44 congestive heart failure (CHF), and 11 atrial fibrillation (AF) subjects, and for simulated series of stochastic and chaotic processes. The evaluations showed that, for nonperiodic signals, qSDiff curves have a maximum point (qSDiffmax) for q ≠ 1. The values of q at which the maximum occurs and at which qSDiff is zero were also evaluated. Only the qSDiffmax values were capable of distinguishing the HRV groups (p-values 5.10×10⁻³, 1.11×10⁻⁷, and 5.50×10⁻⁷ for healthy vs. CHF, healthy vs. AF, and CHF vs. AF, respectively), consistent with the concept of physiologic complexity and suggesting a potential use for chaotic system analysis.
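The qSampEn of the abstract generalizes standard sample entropy by replacing the natural logarithm with a q-logarithm; the standard SampEn it starts from can be sketched directly (the tolerance r is in absolute units here, and the sketch assumes at least one template match at length m+1; applications usually scale r by the series' standard deviation):

```python
from math import log

def sample_entropy(x, m=2, r=0.2):
    """Standard SampEn = -ln(A/B): B counts pairs of length-m templates
    within Chebyshev tolerance r, A counts the same pairs extended to
    length m+1. Self-matches are excluded."""
    n = len(x)
    def pairs(mm):
        c = 0
        for i in range(n - m):
            for j in range(i + 1, n - m):
                if max(abs(x[i + k] - x[j + k]) for k in range(mm)) <= r:
                    c += 1
        return c
    return -log(pairs(m + 1) / pairs(m))

# A strictly periodic series is maximally regular, so SampEn is zero
print(sample_entropy([0, 1] * 50, m=2, r=0.1))  # -0.0 (i.e., zero)
```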
Hayashi, Hideaki; Shibanoki, Taro; Shima, Keisuke; Kurita, Yuichi; Tsuji, Toshio
2015-12-01
This paper proposes a probabilistic neural network (NN) developed on the basis of time-series discriminant component analysis (TSDCA) that can be used to classify high-dimensional time-series patterns. TSDCA involves the compression of high-dimensional time series into a lower dimensional space using a set of orthogonal transformations and the calculation of posterior probabilities based on a continuous-density hidden Markov model with a Gaussian mixture model expressed in the reduced-dimensional space. The analysis can be incorporated into an NN, which is named a time-series discriminant component network (TSDCN), so that parameters of dimensionality reduction and classification can be obtained simultaneously as network coefficients according to a backpropagation through time-based learning algorithm with the Lagrange multiplier method. The TSDCN is considered to enable high-accuracy classification of high-dimensional time-series patterns and to reduce the computation time taken for network training. The validity of the TSDCN is demonstrated for high-dimensional artificial data and electroencephalogram signals in the experiments conducted during the study.
NASA Astrophysics Data System (ADS)
Ruhi, A.; Olden, J. D.; Sabo, J. L.
2015-12-01
In the American Southwest, hydrologic drought has become a new normal as a result of increasing human appropriation of freshwater resources and increased aridity associated with global warming. Although drought has often been touted to threaten freshwater biodiversity, connecting drought to extinction risk of highly-imperiled faunas remains a challenge. Here we combine time-series methods from signal processing and econometrics to analyze a spatially comprehensive and long-term dataset to link discharge variation and community abundance of fish across the American Southwest. This novel time series framework identifies ongoing trends in daily discharge anomalies across the Southwest, quantifies the effect of the historical hydrologic drivers on fish community abundance, and allows us to simulate species trajectories and range-wide risk of decline (quasiextinction) under scenarios of future climate. Spectral anomalies are declining over the last 30 years in at least a quarter of the stream gaging stations across the American Southwest and these anomalies are robust predictors of historical abundance of native and non-native fishes. Quasiextinction probabilities are high (>50 %) for nearly ¾ of the native species across several large river basins in the same region; and the negative trend in annual anomalies increases quasiextinction risk for native but reduces this risk for non-native fishes. These findings suggest that ongoing drought is causing range-wide collapse and replacement of native fish faunas, and that this homogenization of western fish faunas will continue given the prevailing negative trend in discharge anomalies. Additionally, this combination of methods can be applied elsewhere as long as environmental and biological long-term time-series data are available. Collectively, these methods allow identifying the link between hydroclimatic forcing and ecological responses and thus may help anticipating the potential impacts of ongoing and future hydrologic
Lee, Yi-Hsuan; von Davier, Alina A
2013-07-01
Maintaining a stable score scale over time is critical for all standardized educational assessments. Traditional quality control tools and approaches for assessing scale drift either require special equating designs, or may be too time-consuming to be considered on a regular basis with an operational test that has a short time window between an administration and its score reporting. Thus, the traditional methods are not sufficient to catch unusual testing outcomes in a timely manner. This paper presents a new approach for score monitoring and assessment of scale drift. It involves quality control charts, model-based approaches, and time series techniques to accommodate the following needs of monitoring scale scores: continuous monitoring, adjustment of customary variations, identification of abrupt shifts, and assessment of autocorrelation. Performance of the methodologies is evaluated using manipulated data based on real responses from 71 administrations of a large-scale high-stakes language assessment.
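One of the named quality control tools, the EWMA control chart, illustrates how abrupt scale shifts can be flagged while customary variation is absorbed. A stdlib sketch with an assumed in-control baseline window and hypothetical scores (the paper's actual monitoring models are more elaborate):

```python
def ewma_flags(scores, lam=0.2, L=3.0, baseline_n=20):
    """EWMA control chart: smooth the score series and flag points whose
    smoothed value leaves mu0 +/- L*sigma*sqrt(lam/(2-lam)), with mu0
    and sigma estimated from an assumed in-control baseline window."""
    base = scores[:baseline_n]
    mu0 = sum(base) / len(base)
    var = sum((s - mu0) ** 2 for s in base) / (len(base) - 1)
    limit = L * (var * lam / (2 - lam)) ** 0.5
    z, flags = mu0, []
    for s in scores:
        z = lam * s + (1 - lam) * z
        flags.append(abs(z - mu0) > limit)
    return flags

# 20 in-control administrations oscillating around 100, then an abrupt +3 shift
scores = [100 + (0.5 if i % 2 == 0 else -0.5) for i in range(20)] + [103.0] * 10
flags = ewma_flags(scores)
print(next(i for i, f in enumerate(flags) if f))  # 20: flagged right after the shift
```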
A New Approach for High Efficiency Buck-Boost DC/DC Converters Using Series Compensation
NASA Astrophysics Data System (ADS)
Itoh, Jun-Ichi; Fujii, Takashi
This paper proposes a novel concept for a non-isolated buck-boost DC/DC converter and its control method. The proposed concept uses a series-connected converter that regulates only the differential voltage between the input and output. As a result, the required power converter capacity is decreased. Moreover, the proposed circuit has advantages such as improved efficiency and reduced losses. The fundamental operation, control method, and design method of the proposed circuit are described in this paper. In addition, the validity of the proposed circuit is confirmed by simulations and experiments.
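The capacity reduction can be illustrated with first-order arithmetic: a series stage supplying only the differential voltage at the load current processes just a fraction of the output power. The numbers below are hypothetical, not the paper's design values:

```python
def series_converter_power(v_in, v_out, p_out):
    """Power handled by a series compensation stage that supplies only the
    differential voltage |v_out - v_in| at the full load current
    (first-order estimate, losses ignored)."""
    return abs(v_out - v_in) * (p_out / v_out)

# Hypothetical operating point: 48 V in, 52 V out, 520 W load
p = series_converter_power(48.0, 52.0, 520.0)
print(p, p / 520.0)  # 40.0 W processed, i.e. ~7.7% of the output power
```

A full-power converter at the same operating point would have to be rated for all 520 W, which is why the series approach shrinks the converter capacity.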
Nedorezov, L V
2015-01-01
For the approximation of some well-known time series of Paramecium caudatum population dynamics (G. F. Gause, The Struggle for Existence, 1934), the Verhulst and Gompertz models were used. The parameters of each model were estimated in two different ways: with the least-squares method (global fitting) and with a non-traditional approach (the method of extreme points). The results were compared with each other and with those reported by G. F. Gause. Deviations of the theoretical (model) trajectories from the experimental time series were tested using various non-parametric statistical tests. It was shown that the least-squares estimates do not always meet the requirements imposed on a "fine" model, but in some cases a small modification of the least-squares estimates allows a satisfactory representation of the experimental data set.
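The Verhulst (logistic) fit can be sketched with the closed-form solution and a crude grid search standing in for global least squares; the data below are synthetic, not Gause's counts:

```python
from math import exp

def logistic(t, r, K, n0):
    """Closed-form Verhulst solution N(t) = K / (1 + ((K - n0)/n0) e^(-r t))."""
    return K / (1 + (K - n0) / n0 * exp(-r * t))

def fit_logistic(ts, ns, n0):
    """Least-squares (r, K) by brute-force grid search -- a crude stand-in
    for proper global fitting, adequate when the grid brackets the optimum."""
    best = None
    for ri in range(1, 21):              # r in 0.1 .. 2.0
        for K in range(50, 301, 10):     # K in 50 .. 300
            r = ri / 10
            sse = sum((logistic(t, r, K, n0) - n) ** 2 for t, n in zip(ts, ns))
            if best is None or sse < best[0]:
                best = (sse, r, K)
    return best[1], best[2]

ts = list(range(0, 20, 2))
ns = [logistic(t, 0.5, 100, 2) for t in ts]   # synthetic "observations"
print(fit_logistic(ts, ns, n0=2))  # (0.5, 100): recovered exactly, truth on grid
```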
Eddy Currents Signatures Classification by Using Time Series: a System Modeling Approach
2014-12-23
maintenance. In this paper we propose to classify flaws in ferromagnetic materials by measuring Eddy currents. Our approach consists of two steps. First... materials without causing damage. This is useful for predictive maintenance. The most important methods for non-destructive detection of structural flaws... monitoring of ferromagnetic materials based on Eddy currents. Our approach is based on two steps. First, using the measured data, we find a parametric
The study of coastal groundwater depth and salinity variation using time-series analysis
Tularam, G.A. (E-mail: a.tularam@griffith.edu.au); Keeler, H.P. (E-mail: p.keeler@ms.unimelb.edu.au)
2006-10-15
A time-series approach is applied to study and model tidal intrusion into coastal aquifers. The authors examine the effect of tidal behaviour on groundwater level and salinity intrusion for the coastal Brisbane region using auto-correlation and spectral analyses. The results show a close relationship between tidal behaviour, groundwater depth and salinity levels for the Brisbane coast. The known effect can be quantified and incorporated into new models in order to more accurately map salinity intrusion into coastal groundwater table.
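The autocorrelation side of such an analysis is simple to sketch: a dominant tidal period shows up as the lag maximizing the sample autocorrelation. Illustrative sine-wave "water levels" with an assumed 12-hour constituent, not the Brisbane data:

```python
from math import sin, pi

def autocorr(x, k):
    """Biased sample autocorrelation at lag k."""
    n = len(x)
    mu = sum(x) / n
    c0 = sum((v - mu) ** 2 for v in x) / n
    ck = sum((x[t] - mu) * (x[t + k] - mu) for t in range(n - k)) / n
    return ck / c0

# Hourly water levels driven by a hypothetical 12-h tidal constituent
levels = [sin(2 * pi * t / 12) for t in range(240)]
dominant = max(range(2, 37), key=lambda k: autocorr(levels, k))
print(dominant)  # 12: the tidal period recovered from the autocorrelation
```

A spectral (periodogram) analysis of the same series would show the corresponding peak at a frequency of 1/12 cycles per hour.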
NASA Astrophysics Data System (ADS)
Donges, Jonathan; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik; Marwan, Norbert; Dijkstra, Henk; Kurths, Jürgen
2016-04-01
We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology. pyunicorn is available online at https://github.com/pik-copan/pyunicorn. Reference: J.F. Donges, J. Heitzig, B. Beronov, M. Wiedermann, J. Runge, Q.-Y. Feng, L. Tupikina, V. Stolbova, R.V. Donner, N. Marwan, H.A. Dijkstra, and J. Kurths, Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package, Chaos 25, 113101 (2015), DOI: 10.1063/1.4934554, Preprint: arxiv.org:1507.01571 [physics.data-an].
NASA Technical Reports Server (NTRS)
Biezad, D. J.; Schmidt, D. K.; Leban, F.; Mashiko, S.
1986-01-01
Single-channel pilot manual control output in closed-loop tracking tasks is modeled in terms of linear discrete transfer functions which are parsimonious and guaranteed stable. The transfer functions are found by applying a modified superposition time series generation technique. A Levinson-Durbin algorithm is used to determine the filter which prewhitens the input, and a projective (least squares) fit of pulse response estimates is used to guarantee identified model stability. Results from two case studies are compared to previous findings, where the source data are relatively short records, approximately 25 seconds long. Time delay effects and pilot seasonalities are discussed and analyzed. It is concluded that single-channel time series controller modeling is feasible on short records, and that it is important for the analyst to determine a criterion for best time-domain fit which allows association of model parameter values, such as pure time delay, with actual physical and physiological constraints. The purpose of the modeling is thus paramount.
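The Levinson-Durbin step can be sketched directly: given sample autocorrelations, the recursion yields the AR coefficients of the prewhitening filter. A stdlib sketch, exercised on the textbook AR(1) autocorrelation sequence rather than pilot data:

```python
def levinson_durbin(r, order):
    """Levinson-Durbin recursion: from autocorrelations r[0..order], return
    the AR coefficients a[1..order] of the one-step predictor
    x[n] ~ sum_k a[k] x[n-k], plus the final prediction-error power."""
    a, err = [], r[0]
    for m in range(1, order + 1):
        k = (r[m] - sum(a[i] * r[m - 1 - i] for i in range(m - 1))) / err
        a = [a[i] - k * a[m - 2 - i] for i in range(m - 1)] + [k]
        err *= 1 - k * k
    return a, err

# Autocorrelation of an AR(1) process with coefficient 0.8: r[k] = 0.8**k
r = [0.8 ** k for k in range(4)]
a, err = levinson_durbin(r, 3)
print(a)  # ~[0.8, 0.0, 0.0]: higher-order terms vanish for an AR(1) process
```

Filtering the input with these coefficients removes its predictable part, which is exactly the prewhitening role the abstract describes.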
Nease, Brian R.; Ueki, Taro
2009-12-10
A time series approach has been applied to the nuclear fission source distribution generated by Monte Carlo (MC) particle transport in order to calculate the non-fundamental mode eigenvalues of the system. The novel aspect is the combination of the general technical principle of projection pursuit for multivariate data with the neutron multiplication eigenvalue problem in the nuclear engineering discipline. Proof is provided that the stationary MC process is linear to first-order approximation and that it transforms into one-dimensional autoregressive processes of order one (AR(1)) via the automated choice of projection vectors. The autocorrelation coefficient of the resulting AR(1) process corresponds to the ratio of the desired mode eigenvalue to the fundamental mode eigenvalue. All modern MC codes for nuclear criticality calculate the fundamental mode eigenvalue, so the desired mode eigenvalue can be easily determined. This time series approach was tested on a variety of problems, including multi-dimensional ones. Numerical results show that the approach has strong potential for three-dimensional whole reactor cores. The eigenvalue ratio can be updated on the fly without storing the nuclear fission source distributions from all previous iteration cycles for the mean subtraction. Lastly, the effects of degenerate eigenvalues are investigated and solutions are provided.
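The key estimate, the autocorrelation coefficient of the projected AR(1) process, reduces to a lag-1 least-squares regression. A sketch on a synthetic AR(1) sequence (the fission-source projection itself is not reproduced here):

```python
import random

def ar1_coefficient(x):
    """Least-squares estimate of the lag-1 autoregressive coefficient --
    the quantity the paper maps to the ratio of a higher-mode eigenvalue
    to the fundamental-mode eigenvalue."""
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(v * v for v in x[:-1])
    return num / den

# Synthetic stationary AR(1) sequence with true coefficient 0.6
random.seed(1)
x, v = [0.0], 0.0
for _ in range(5000):
    v = 0.6 * v + random.gauss(0.0, 1.0)
    x.append(v)
print(ar1_coefficient(x))  # close to 0.6
```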
Microscopic saw mark analysis: an empirical approach.
Love, Jennifer C; Derrick, Sharon M; Wiersema, Jason M; Peters, Charles
2015-01-01
Microscopic saw mark analysis is a well-published and generally accepted qualitative analytical method. However, little research has focused on identifying and mitigating potential sources of error associated with the method. The presented study proposes the use of classification trees and random forest classifiers as an optimal, statistically sound approach to mitigating variability and outcome error in microscopic saw mark analysis. The statistical model was applied to 58 experimental saw marks created with four types of saws. The saw marks were made in fresh human femurs obtained through anatomical gift and were analyzed using a Keyence digital microscope. The statistical approach weighted the variables based on discriminatory value and produced decision trees with an associated outcome error rate of 8.62-17.82%.
Endoscopic approaches to brainstem cavernous malformations: Case series and review of the literature
Nayak, Nikhil R.; Thawani, Jayesh P.; Sanborn, Matthew R.; Storm, Phillip B.; Lee, John Y.K.
2015-01-01
Background: Symptomatic cavernous malformations involving the brainstem are frequently difficult to access via traditional methods. Conventional skull-base approaches require significant brain retraction or bone removal to provide an adequate operative corridor. While there has been a trend toward limited employment of the most invasive surgical approaches, recent advances in endoscopic technology may complement existing methods to access these difficult to reach areas. Case Descriptions: Four consecutive patients were treated for symptomatic, hemorrhagic brainstem cavernous malformations via fully endoscopic approaches (endonasal, transclival; retrosigmoid; lateral supracerebellar, infratentorial; endonasal, transclival). Together, these lesions encompassed all three segments of the brainstem. Three of the patients had complete resection of the cavernous malformation, while one patient had stable residual at long-term follow up. Associated developmental venous anomalies were preserved in the two patients where one was identified preoperatively. Three of the four patients maintained stable or improved neurological examinations following surgery, while one patient experienced ipsilateral palsies of cranial nerves VII and VIII. The first transclival approach resulted in a symptomatic cerebrospinal fluid leak requiring re-operation, but the second did not. Although there are challenges associated with endoscopic approaches, relative to our prior microsurgical experience with similar cases, visualization and illumination of the surgical corridors were superior without significant limitations on operative mobility. Conclusion: The endoscope is a promising adjunct to the neurosurgeon's ability to approach difficult to access brainstem cavernous malformations. It allows the surgeon to achieve well-illuminated, panoramic views, and by combining approaches, can provide minimally invasive access to most regions of the brainstem. PMID:25984383
A Deliberate Practice Approach to Teaching Phylogenetic Analysis
Hobbs, F. Collin; Johnson, Daniel J.; Kearns, Katherine D.
2013-01-01
One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or “one-shot,” in-class activities. Using a deliberate practice instructional approach, we designed a set of five assignments for a 300-level plant systematics course that incrementally introduces the concepts and skills used in phylogenetic analysis. In our assignments, students learned the process of constructing phylogenetic trees through a series of increasingly difficult tasks; thus, skill development served as a framework for building content knowledge. We present results from 5 yr of final exam scores, pre- and postconcept assessments, and student surveys to assess the impact of our new pedagogical materials on student performance related to constructing and interpreting phylogenetic trees. Students improved in their ability to interpret relationships within trees and improved in several aspects related to between-tree comparisons and tree construction skills. Student feedback indicated that most students believed our approach prepared them to engage in tree construction and gave them confidence in their abilities. Overall, our data confirm that instructional approaches implementing deliberate practice address student misconceptions, improve student experiences, and foster deeper understanding of difficult scientific concepts. PMID:24297294
Analysis approaches and interventions with occupational performance
Ahn, Sinae
2016-01-01
[Purpose] The purpose of this study was to analyze approaches and interventions targeting occupational performance in patients with stroke. [Subjects and Methods] In this study, articles published in the past 10 years were searched. The key terms used were “occupational performance AND stroke” and “occupational performance AND CVA”. A total of 252 articles were identified, and 79 articles were selected. All interventions were classified by approach according to six theories, and all were analyzed for frequency. [Results] Regarding the approaches, the most common was the biomechanical approach, used in 25 articles (31.6%); interventions in this group included electrical stimulation therapy, robot therapy, and sensory stimulation training, among others. Analysis of the frequency of interventions revealed that the most commonly used, appearing in 18 articles (22.8%), made use of the concept of constraint-induced therapy. [Conclusion] The results of this study suggest an approach for use in clinics when selecting an appropriate intervention for occupational performance. PMID:27799719
Primary Dentition Analysis: Exploring a Hidden Approach
Vanjari, Kalasandhya; Kamatham, Rekhalakshmi; Gaddam, Kumar Raja
2016-01-01
ABSTRACT Background: Accurate prediction of the mesiodistal widths (MDWs) of canines and premolars in children with primary dentition facilitates interception of malocclusion at an early age. The Boston University (BU) approach is one such method, based on primary teeth, for predicting canine and premolar dimensions. Aim: To predict the canine and premolar dimensions in the contemporary population using the BU approach and compare the results with values obtained using the Tanaka-Johnston (T/J) approach. Design: Children in the age range of 7-11 years with all permanent mandibular incisors and primary maxillary and mandibular canines and first molars present were included in the study. Those with interproximal caries or restorations, abnormalities in shape or size, or a history of orthodontic treatment were excluded. Impressions of both arches were made using irreversible hydrocolloid and poured with dental stone. The MDWs of the required teeth were measured on the models using an electronic digital vernier caliper, from which the widths of permanent canines and premolars were predicted using both the T/J and BU approaches. Results: A statistically significant (p = 0.00) positive correlation (r = 0.52-0.55) was observed between the T/J and BU approaches. When analyzed by gender, girls showed a statistically significant (p = 0.00) strong positive correlation (r = 0.72-0.77), whereas boys showed a statistically nonsignificant weak positive correlation (r = 0.17-0.42). Conclusion: The BU approach merits further prospective study as a method of predicting permanent tooth dimensions for children in the primary dentition stage. How to cite this article: Nuvvula S, Vanjari K, Kamatham R, Gaddam KR. Primary Dentition Analysis: Exploring a Hidden Approach. Int J Clin Pediatr Dent 2016;9(1):1-4. PMID:27274146
NASA Astrophysics Data System (ADS)
von der Linden, Jens; Hilton, Eric; Mitchell, Rachel; Rosenfield, Phil
2011-10-01
Communicating the results and significance of basic research to the general public is of critical importance. At present, very few programs exist to give young scientists the opportunity to practice their public outreach skills. Although the need for science outreach is recognized, graduate programs often fail to provide any training in making science accessible. Engage represents a unique, graduate student-led effort to improve public outreach skills. Founded in 2009 by three science graduate students at the University of Washington, Engage developed an interdisciplinary curriculum to investigate why science outreach often fails, to improve graduate student communication skills, and to help students create a dynamic, public-friendly talk about their research. The course incorporates story-telling, improvisational arts, and development of analogy, all with a focus on clarity, brevity, and accessibility. It culminates in a free, public-friendly speaker series hosted at the University of Washington that draws substantial public attendance and participation.
Time Series Analysis of Sound Data on Interactive Calling Behavior of Japanese Tree Frogs
NASA Astrophysics Data System (ADS)
Horai, Shunsuke; Aihara, Ikkyu; Aihara, Kazuyuki
We have analyzed time series data of sound from the interactive calling behavior of two male Japanese tree frogs (Hyla japonica, Nihon-Ama-Gaeru). First, we extracted two time series, each mainly corresponding to one frog, from the single recording of the two frogs' calls using the free, cross-platform sound editor Audacity. We then quantitatively analyzed the timing and inter-call intervals of the respective frogs. Finally, we characterized the nonstationary temporal changes in the interactive calling behavior of the two frogs using cross recurrence plot analysis. The results show that a pair of male frogs called in nearly anti-phase synchronization after a short period of nearly in-phase synchronization, which implies the existence of complex interactive calling behavior between two male frogs.
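The cross recurrence plot used in such an analysis has a compact definition: entry (i, j) is 1 when the state of one series at time i lies within a tolerance eps of the other series at time j. A minimal scalar sketch for illustration (the published analysis may embed the call signals in a higher-dimensional state space):

```python
def cross_recurrence_plot(x, y, eps):
    """Binary cross recurrence matrix for two scalar time series:
    R[i][j] = 1 when |x[i] - y[j]| <= eps, else 0."""
    return [[1 if abs(xi - yj) <= eps else 0 for yj in y] for xi in x]

def recurrence_rate(matrix):
    """Fraction of recurrent points: a simple summary of how often
    the two series visit each other's neighbourhoods."""
    return sum(map(sum, matrix)) / (len(matrix) * len(matrix[0]))
```

Diagonal line structures in the matrix indicate episodes where the two series evolve similarly at a fixed lag, which is how phase relationships such as in-phase versus anti-phase calling become visible.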
Graphic analysis and multifractal on percolation-based return interval series
NASA Astrophysics Data System (ADS)
Pei, A. Q.; Wang, J.
2015-05-01
A financial time series model based on the oriented percolation system (one of the statistical physics systems) is developed and investigated. The nonlinear and statistical behaviors of return interval time series are studied for the proposed model and for the real stock market by applying the visibility graph (VG) and multifractal detrended fluctuation analysis (MF-DFA). We investigate the fluctuation behaviors of return intervals of the model for different parameter settings, and comparatively study these fluctuation patterns against those of real financial data for different threshold values. The empirical research in this work exhibits multifractal features in the corresponding financial time series. Further, the VGs derived from both the simulated data and the real data exhibit small-world behavior, hierarchy, high clustering, and power-law tails in their degree distributions.
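The natural visibility graph used in this kind of analysis is easy to state: two samples are linked when the straight line between them passes above every intermediate sample. A minimal O(n^2) sketch of the criterion (an illustration, not the authors' implementation):

```python
def natural_visibility_graph(series):
    """Natural visibility graph of a time series: nodes are time indices;
    samples (a, y[a]) and (b, y[b]) are linked if the line between them
    clears every intermediate sample (Lacasa et al.'s criterion)."""
    n = len(series)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                series[c] < series[b]
                + (series[a] - series[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.add((a, b))
    return edges

def degree_sequence(edges, n):
    """Node degrees of the graph; this is what tail analyses of the
    degree distribution operate on."""
    deg = [0] * n
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    return deg
```

Power-law-tail and clustering statistics are then computed on the resulting graph rather than on the raw series.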
Exploring Two Inflationary Regimes in Latin-American Economies:. a Binary Time Series Analysis
NASA Astrophysics Data System (ADS)
Brida, Juan Gabriel; Garrido, Nicolas
The aim of this paper is to apply the methods of Symbolic Time Series Analysis (STSA) to inflation series from a group of Latin-American economies. Starting with a partition into two inflation regimes, we use data symbolization to identify temporal patterns. The statistical information obtained from the patterns is then used to estimate the parameters of a nonlinear model proposed by Brida (2000). We compare the performance of the model against a naive benchmark predictor to verify its power to anticipate the qualitative behavior of the inflation time series. When the STSA partition is chosen by pure optimization criteria, the performance of the model is poor. However, when the partition of the space of states is made according to economic intuition, the performance of the model increases considerably.
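The first two steps of such an analysis, symbolizing the series against a regime threshold and counting temporal words, can be sketched as follows (a generic illustration; the threshold and word length are placeholders, not the paper's choices):

```python
from collections import Counter

def symbolize(series, threshold):
    """Map each observation to 0 (low-inflation regime) or 1 (high)."""
    return [1 if x >= threshold else 0 for x in series]

def pattern_counts(symbols, length):
    """Frequency of each temporal word (tuple of consecutive symbols)
    of the given length, via a sliding window."""
    words = [tuple(symbols[i:i + length])
             for i in range(len(symbols) - length + 1)]
    return Counter(words)
```

The word frequencies are the statistics from which regime-transition behavior can then be estimated.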
Statistical Analysis of Sensor Network Time Series at Multiple Time Scales
NASA Astrophysics Data System (ADS)
Granat, R. A.; Donnellan, A.
2013-12-01
Modern sensor networks often collect data at multiple time scales in order to observe physical phenomena that occur at different scales. Whether collected by heterogeneous or homogeneous sensor networks, measurements at different time scales are usually subject to different dynamics, noise characteristics, and error sources. We explore the impact of these effects on the results of statistical time series analysis methods applied to multi-scale time series data. As a case study, we analyze results from GPS position time series collected in Japan and the Western United States, which produce raw observations at 1 Hz and orbit-corrected observations at time resolutions of 5 minutes, 30 minutes, and 24 hours. We utilize the GPS analysis package (GAP) software to perform three types of statistical analysis on these observations: hidden Markov modeling, probabilistic principal components analysis, and covariance distance analysis. We compare the results of these methods at the different time scales and discuss the impact on scientific understanding of earthquake fault systems generally and recent large seismic events specifically, including the Tohoku-Oki earthquake in Japan and the El Mayor-Cucapah earthquake in Mexico.
ERIC Educational Resources Information Center
Schofield, Dee
The first of a series of 13 monthly reports, this paper reviews the issue of the year-round school -- a variety of calendar changes aimed at increasing the educational and economic efficiency of the school system. The author first reviews the major pros and cons of the year-round controversy, focusing on the questions of potential learning…
ERIC Educational Resources Information Center
New York State Education Dept., Albany.
This book is designed to assist those who work with non-English dominant students by providing resource information relevant to second language teaching and learning. The articles in the series encompass both theory and practical learning techniques in six general topics. The articles in the first text of the series, concerning background and…
Application of the Allan Variance to Time Series Analysis in Astrometry and Geodesy: A Review.
Malkin, Zinovy
2016-04-01
The Allan variance (AVAR) was introduced 50 years ago as a statistical tool for assessing the stability of frequency standards. Over the past decades, AVAR has increasingly been used in geodesy and astrometry to assess the noise characteristics of geodetic and astrometric time series. A specific feature of astrometric and geodetic measurements, as compared with clock measurements, is that they are generally associated with uncertainties; thus, an appropriate weighting should be applied during data analysis. In addition, some physically connected scalar time series naturally form series of multidimensional vectors. For example, the three station coordinate time series X, Y, and Z can be combined to analyze 3-D station position variations. The classical AVAR is not intended for processing unevenly weighted and/or multidimensional data. Therefore, AVAR modifications, namely weighted AVAR (WAVAR), multidimensional AVAR (MAVAR), and weighted multidimensional AVAR (WMAVAR), were introduced to overcome these deficiencies. In this paper, a brief review is given of the experience of using AVAR and its modifications in processing astrogeodetic time series.
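For reference, the classical (unweighted, scalar) AVAR that these modifications build on can be computed in a few lines; this is a textbook sketch, not code from the review:

```python
def allan_variance(y, m=1):
    """Classic (non-overlapped) Allan variance at averaging factor m.

    y : equally spaced measurements; m : number of samples averaged per
    cluster. Returns AVAR at averaging time m * tau0, i.e. half the mean
    squared difference of successive cluster averages."""
    k = len(y) // m
    means = [sum(y[i * m:(i + 1) * m]) / m for i in range(k)]
    diffs = [(means[i + 1] - means[i]) ** 2 for i in range(k - 1)]
    return sum(diffs) / (2 * (k - 1))
```

The weighted variants discussed in the review replace the plain averages and differences with uncertainty-weighted ones.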
Social network analysis of character interaction in the Stargate and Star Trek television series
NASA Astrophysics Data System (ADS)
Tan, Melody Shi Ai; Ujum, Ephrance Abu; Ratnavelu, Kuru
This paper undertakes a social network analysis of two science fiction television series, Stargate and Star Trek. Television series convey stories through character interaction, which can be represented as “character networks”. We connect each pair of characters that exchanged spoken dialogue in any given scene demarcated in the television series transcripts. These networks are then used to characterize the overall structure and topology of each series. We find that the character networks of both series have structure and topology similar to those found in previous work on mythological and fictional networks. The character networks exhibit small-world effects, but we found no significant support for power-law degree distributions. Since the progression of an episode depends to a large extent on the interaction between its characters, the underlying network structure tells us something about the complexity of that episode’s storyline. We assessed this complexity using techniques from spectral graph theory and found that the episode networks are structured as (1) closed networks, (2) networks containing bottlenecks that connect otherwise disconnected clusters, or (3) a mixture of both.
Sample entropy applied to the analysis of synthetic time series and tachograms
NASA Astrophysics Data System (ADS)
Muñoz-Diosdado, A.; Gálvez-Coyt, G. G.; Solís-Montufar, E.
2017-01-01
Entropy is a method of non-linear analysis that allows an estimate of the irregularity of a system. There are, however, different types of computational entropy; these were considered and tested in order to obtain one that gives an index of signal complexity while taking into account the length of the analysed time series, the computational resources demanded by the method, and the accuracy of the calculation. An algorithm for generating fractal time series with a given spectral exponent β was used to characterize the different entropy algorithms. We observed significant variation with series size for most of the algorithms, which could prove counterproductive for the study of real signals of different lengths. The chosen method was sample entropy, which shows great independence of series size. With this method, time series of heart interbeat intervals, or tachograms, of healthy subjects and patients with congestive heart failure were analysed. Sample entropy was calculated for 24-hour tachograms and for 6-hour subseries corresponding to sleep and wakefulness. The comparison between the two populations shows a significant difference that is accentuated when the patient is sleeping.
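A common formulation of sample entropy, SampEn(m, r), counts template matches of lengths m and m + 1 under the Chebyshev distance with self-matches excluded. A minimal sketch of that formulation (the defaults m = 2 and r = 0.2 follow common practice, not necessarily the paper's exact settings):

```python
import math

def sample_entropy(series, m=2, r=0.2):
    """SampEn(m, r): negative log of the conditional probability that
    sequences matching for m points (Chebyshev distance <= r, excluding
    self-matches) also match for m + 1 points."""
    n = len(series)
    num = n - m  # same number of templates for both lengths

    def count_matches(length):
        hits = 0
        for i in range(num):
            for j in range(i + 1, num):
                ti = series[i:i + length]
                tj = series[j:j + length]
                if max(abs(a - b) for a, b in zip(ti, tj)) <= r:
                    hits += 1
        return hits

    b = count_matches(m)
    a = count_matches(m + 1)
    if a == 0 or b == 0:
        return float('inf')  # undefined for too-short or too-irregular data
    return -math.log(a / b)
```

Because both counts use the same number of templates, a perfectly regular series yields SampEn = 0, and the statistic is largely insensitive to series length, which is the property the abstract highlights.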
ERIC Educational Resources Information Center
Social Policy Research Associates, Menlo Park, CA.
Of the 19 projects conducted as part of the Defense Conversion Adjustment (DCA) Demonstration administered by the U.S. Department of Labor's Office of Work-Based Learning, 8 tested the worker mobility approach. The projects, which shared the common goal of helping dislocated defense workers find high-quality jobs, tested one or more of the…
ERIC Educational Resources Information Center
Glazer, Susan Mandel
This concise book shares several sensible, logical, and meaningful approaches that guide young children to use the written coding system to read, spell, and make meaning of the English language coding system. The book demonstrates that phonics, spelling, and word study are essential parts of literacy learning. After an introduction, chapters are:…
THE MENTALLY RETARDED CHILD, A PSYCHOLOGICAL APPROACH. MCGRAW-HILL SERIES IN PSYCHOLOGY.
ERIC Educational Resources Information Center
ROBINSON, HALBERT B.; ROBINSON, NANCY M.
PRESENTING A PSYCHOLOGICAL APPROACH TO MENTAL RETARDATION, THIS TEXT BEGINS WITH A DISCUSSION OF THEORIES OF INTELLIGENCE, PROBLEMS OF DEFINITION, AND THE CURRENT STATUS OF THE FIELD OF MENTAL RETARDATION. A SECTION ON ETIOLOGY AND SYNDROMES PRESENTS INFORMATION ON GENETIC FACTORS AND GENETIC SYNDROMES AND THE PHYSICAL AND PSYCHOLOGICAL…
ERIC Educational Resources Information Center
Parette, Howard P.; Blum, Craig; Boeckmann, Nichole M.
2009-01-01
As assistive technology applications are increasingly implemented in early childhood settings for children who are at risk or who have disabilities, it is critical that teachers utilize observational approaches to determine whether targeted assistive technology-supported interventions make a difference in children's learning. One structured…
Faces of Change. Visual Evidence: An Instructional Approach. Instructor's Notes: Film/Essay Series.
ERIC Educational Resources Information Center
Miller, Norman N.
Designed for use with the multidisciplinary film project, "Faces of Change, Five Rural Societies in Transition" for the college social studies curriculum, this manual contains an overview of the material and its underlying philosophy and suggests teaching strategies. The first section discusses the overall approach, the use of films in…
ERIC Educational Resources Information Center
Braid, Bernice, Ed.; Long, Ada, Ed.
2010-01-01
The decade since publication of "Place as Text: Approaches to Active Learning" has seen an explosion of interest and productivity in the field of experiential education. This monograph presents a story of an experiment and a blueprint of sorts for anyone interested in enriching an existing program or willing to experiment with pedagogy…
ERIC Educational Resources Information Center
Godley, Susan Harrington; Meyers, Robert J.; Smith, Jane Ellen; Karvinen, Tracy; Titus, Janet C.; Godley, Mark D.; Dent, George; Passetti, Lora; Kelberg, Pamela
This publication was written for therapists and their supervisors who may want to implement the adolescent community reinforcement approach intervention, which was one of the five interventions tested by the Center for Substance Abuse Treatment's (CSAT's) Cannabis Youth Treatment (CYT) Project. The CYT Project provided funding to support a study…
ERIC Educational Resources Information Center
Social Policy Research Associates, Menlo Park, CA.
Of the 19 projects conducted as part of the Defense Conversion Adjustment (DCA) Demonstration administered by the U.S. Department of Labor's Office of Work-Based Learning, 9 tested the dislocation aversion approach. The projects attempted to alleviate the negative impacts of defense cutbacks on communities, firms, and workers. Six projects…
ERIC Educational Resources Information Center
Heathcote, Dorothy; Bolton, Gavin
This book describes how theater can create an impetus for productive learning across the curriculum. Dorothy Heathcote's "mantle of the expert" approach is discussed in which teachers and students explore, in role, the knowledge they already have about a problem or task while making new discoveries along the way. The book also presents a…
ERIC Educational Resources Information Center
Ettinger, Blanche; Perfetto, Edda
Using a developmental, hands-on approach, this text/workbook helps students master the basic English skills that are essential to write effective business correspondence, to recognize language errors, and to develop decision-making and problem-solving skills. Its step-by-step focus and industry-specific format encourages students to review,…
TIME SERIES ANALYSIS OF REMOTELY-SENSED TIR EMISSION: linking anomalies to physical processes
NASA Astrophysics Data System (ADS)
Pavlidou, E.; van der Meijde, M.; Hecker, C.; van der Werff, H.; Ettema, J.
2013-12-01
In the last 15 years, remote sensing has been evaluated for detecting thermal anomalies as precursors to earthquakes. Important issues yet to be tackled include the definition of: (a) the thermal anomaly itself, taking into account weather conditions, observation settings, and 'natural' variability caused by background sources; (b) the length of observations required for this purpose; and (c) the location of detected anomalies, which should be physically related to the tectonic activity. To determine whether thermal anomalies are statistical noise, mere meteorological conditions, or actual earthquake-related phenomena, we apply a novel approach. We use brightness temperature (top-of-atmosphere) data from thermal infrared imagery acquired at a hypertemporal (sub-hourly) interval from geostationary weather satellites over multiple years. The length of the time series allows for analysis of meteorological effects (diurnal, seasonal, or annual trends) and background variability, through the application of a combined spatial and temporal filter to distinguish extreme occurrences from trends. The definition of potential anomalies is based on statistical techniques, taking into account published (geo)physical characteristics of earthquake-related thermal anomalies. We use synthetic data to test the performance of the proposed detection method and to track potential factors affecting the results. Subsequently, we apply the method to original data from Iran and Turkey, in quiescent and earthquake-struck periods alike. We present our findings with a focus on assessing the resulting anomalies in relation to physical processes, considering: (a) meteorological effects; (b) the geographical, geological, and environmental settings; and (c) physically realistic distances and potential physical relations with the activity of causative faults.
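One generic way to separate "extreme occurrences" from trends in a hypertemporal series is a robust moving threshold. The sketch below flags samples exceeding the trailing-window median by several interquartile ranges; it is a standard robust-outlier illustration, not the paper's actual spatio-temporal filter:

```python
def thermal_anomalies(series, window=48, k=3.0):
    """Flag samples in a brightness-temperature series that exceed a
    robust threshold (median + k * IQR) computed over a trailing window.
    Using median and IQR keeps slow diurnal/seasonal trends and noise
    from inflating the threshold the way a mean/stddev rule would."""
    flags = []
    for i, x in enumerate(series):
        lo = max(0, i - window)
        past = sorted(series[lo:i]) or [x]  # no history: nothing to flag
        n = len(past)
        median = past[n // 2]
        iqr = past[(3 * n) // 4] - past[n // 4]
        flags.append(x > median + k * max(iqr, 1e-9))
    return flags
```

In the published approach this kind of temporal criterion is combined with a spatial one, so that isolated single-pixel exceedances are not declared anomalous.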
Efficient Transfer Entropy Analysis of Non-Stationary Neural Time Series
Vicente, Raul; Díaz-Pernas, Francisco J.; Wibral, Michael
2014-01-01
Information theory allows us to investigate information processing in neural systems in terms of information transfer, storage and modification. Especially the measure of information transfer, transfer entropy, has seen a dramatic surge of interest in neuroscience. Estimating transfer entropy between two processes requires the observation of multiple realizations of these processes to estimate the associated probability density functions. To obtain these necessary observations, available estimators typically assume stationarity of the processes to allow pooling of observations over time. This assumption, however, is a major obstacle to the application of these estimators in neuroscience, as observed processes are often non-stationary. As a solution, Gomez-Herrero and colleagues theoretically showed that the stationarity assumption may be avoided by estimating transfer entropy from an ensemble of realizations. Such an ensemble of realizations is often readily available in neuroscience experiments in the form of experimental trials. Thus, in this work we combine the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series. We present an efficient implementation of the approach that is suitable for the increased computational demand of the ensemble method's practical application. In particular, we use a massively parallel implementation for a graphics processing unit to handle the most computationally demanding aspects of the ensemble method for transfer entropy estimation. We test the performance and robustness of our implementation on data from numerical simulations of stochastic processes. We also demonstrate the applicability of the ensemble method to magnetoencephalographic data. While we mainly evaluate the proposed method for neuroscience data, we expect it to be applicable in a variety of fields that are concerned with the analysis of information transfer in complex biological, social, and
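As a toy illustration of the quantity being estimated (not the nearest-neighbour, ensemble-based estimator the paper develops), a plug-in transfer entropy for discretized series with history length 1 looks like this:

```python
import math
from collections import Counter

def transfer_entropy(source, target, bins=2):
    """Plug-in transfer entropy (bits) from source to target, history 1:
    TE = sum p(x_{t+1}, x_t, y_t) * log2[ p(x_{t+1}|x_t, y_t) / p(x_{t+1}|x_t) ].
    Series are discretized into equal-width bins and probabilities are
    naive frequency estimates, so this is only a demonstration."""
    def discretize(s):
        lo, hi = min(s), max(s)
        width = (hi - lo) / bins or 1.0
        return [min(int((v - lo) / width), bins - 1) for v in s]

    x, y = discretize(target), discretize(source)
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))   # (x_{t+1}, x_t, y_t)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))         # (x_t, y_t)
    pairs_xx = Counter(zip(x[1:], x[:-1]))          # (x_{t+1}, x_t)
    singles = Counter(x[:-1])                       # x_t
    n = len(x) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_xy = c / pairs_xy[(x0, y0)]
        p_cond_x = pairs_xx[(x1, x0)] / singles[x0]
        te += p_joint * math.log2(p_cond_xy / p_cond_x)
    return te
```

A series that perfectly drives another yields a clearly positive TE in the driving direction and zero in the reverse direction, which is the asymmetry the measure is designed to capture.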
Zapata-Fonseca, Leonardo; Dotov, Dobromir; Fossion, Ruben; Froese, Tom
2016-01-01
There is a growing consensus that a fuller understanding of social cognition depends on more systematic studies of real-time social interaction. Such studies require methods that can deal with the complex dynamics taking place at multiple interdependent temporal and spatial scales, spanning sub-personal, personal, and dyadic levels of analysis. We demonstrate the value of adopting an extended multi-scale approach by re-analyzing movement time-series generated in a study of embodied dyadic interaction in a minimal virtual reality environment (a perceptual crossing experiment). Reduced movement variability revealed an interdependence between social awareness and social coordination that cannot be accounted for by either subjective or objective factors alone: it picks out interactions in which subjective and objective conditions are convergent (i.e., elevated coordination is perceived as clearly social, and impaired coordination is perceived as socially ambiguous). This finding is consistent with the claim that interpersonal interaction can be partially constitutive of direct social perception. Clustering statistics (Allan Factor) of salient events revealed fractal scaling. Complexity matching defined as the similarity between these scaling laws was significantly more pronounced in pairs of participants as compared to surrogate dyads. This further highlights the multi-scale and distributed character of social interaction and extends previous complexity matching results from dyadic conversation to non-verbal social interaction dynamics. Trials with successful joint interaction were also associated with an increase in local coordination. Consequently, a local coordination pattern emerges on the background of complex dyadic interactions in the PCE task and makes joint successful performance possible. PMID:28018274
Metaplastic Breast Carcinoma: Analysis of Clinical and Pathologic Characteristics - A Case Series
Salimoğlu, Semra; Sert, İsmail; Emiroğlu, Mustafa; Karaali, Cem; Kuzukıran, Dilek; Kırmızı, Yasemin Akyüz; Diniz, Gülden; Aydın, Cengiz
2016-01-01
Objective: Metaplastic breast cancer (MBC) is a rare type of breast cancer that is considered to be clinically aggressive. Data on the clinical significance and prognostic risk factors of MBC are limited. This study comprises a retrospective analysis of the clinical and pathologic findings of a series of patients treated for MBC. Materials and Methods: The files of 657 patients who underwent surgery for breast cancer at our clinic were examined, and the data of 11 patients diagnosed as having MBC were analyzed. Results: With a median age of 56 years, all patients were postmenopausal and presented with a palpable mass on physical examination. Ulceration and skin involvement were seen in only one patient. Eight patients were diagnosed as having squamous cell carcinoma (SCC), and 3 had both SCC and osseous differentiation. The median tumor diameter was 3.8 cm (range, 1.5-14 cm). Lymph node metastasis was detected in 5 (45%) patients. Progesterone (PR) and estrogen (ER) receptors were negative in 11 (100%) and 10 (90.9%) patients, respectively, and CerbB2 was negative in 7 (63.6%) patients. Patients were followed up for a median period of 15 months (range, 6-40 months), at the end of which 10 patients survived and one had died of cardiac arrest at 7 months post-operatively. No instances of local recurrence or distant organ metastasis were found in any patient. The overall survival rate was 90%. Conclusion: There is no consensus on the clinical significance of or best treatment approach for metaplastic carcinoma. In our study, patients with MBC were of advanced age and had large tumors, high negativity for hormone receptors, and moderate- to well-differentiated histology.
Computational approaches to fMRI analysis.
Cohen, Jonathan D; Daw, Nathaniel; Engelhardt, Barbara; Hasson, Uri; Li, Kai; Niv, Yael; Norman, Kenneth A; Pillow, Jonathan; Ramadge, Peter J; Turk-Browne, Nicholas B; Willke, Theodore L
2017-02-23
Analysis methods in cognitive neuroscience have not always matched the richness of fMRI data. Early methods focused on estimating neural activity within individual voxels or regions, averaged over trials or blocks and modeled separately in each participant. This approach mostly neglected the distributed nature of neural representations over voxels, the continuous dynamics of neural activity during tasks, the statistical benefits of performing joint inference over multiple participants and the value of using predictive models to constrain analysis. Several recent exploratory and theory-driven methods have begun to pursue these opportunities. These methods highlight the importance of computational techniques in fMRI analysis, especially machine learning, algorithmic optimization and parallel computing. Adoption of these techniques is enabling a new generation of experiments and analyses that could transform our understanding of some of the most complex-and distinctly human-signals in the brain: acts of cognition such as thoughts, intentions and memories.
Hatch, C.E.; Fisher, A.T.; Revenaugh, J.S.; Constantz, J.; Ruehl, C.
2006-01-01
We present a method for determining streambed seepage rates using time series thermal data. The new method is based on quantifying changes in phase and amplitude of temperature variations between pairs of subsurface sensors. For a reasonable range of streambed thermal properties and sensor spacings, the time series method should allow reliable estimation of seepage rates for a range of at least ±10 m d-1 (±1.2 × 10-4 m s-1), with amplitude variations being most sensitive at low flow rates and phase variations retaining sensitivity out to much higher rates. Compared to forward modeling, the new method requires less observational data and less setup and data handling and is faster, particularly when interpreting many long data sets. The time series method is insensitive to streambed scour and sedimentation, which allows for application under a wide range of flow conditions and allows time series estimation of variable streambed hydraulic conductivity. This new approach should facilitate wider use of thermal methods and improve understanding of the complex spatial and temporal dynamics of surface water-groundwater interactions. Copyright 2006 by the American Geophysical Union.
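The amplitude-ratio and phase-shift quantities at the heart of the method can be extracted from a sensor pair by least-squares fitting of the diurnal harmonic. This is a hedged sketch on synthetic records; converting the ratio and lag into seepage rates requires the conduction-advection model of the paper, which is not reproduced here.

```python
import numpy as np

def diurnal_amp_phase(t_hours, temp, period=24.0):
    """Least-squares fit of temp(t) = c0 + A*cos(wt) + B*sin(wt);
    returns the diurnal amplitude and phase (radians, cosine convention)."""
    w = 2 * np.pi / period
    X = np.column_stack([np.ones_like(t_hours),
                         np.cos(w * t_hours),
                         np.sin(w * t_hours)])
    coef, *_ = np.linalg.lstsq(X, temp, rcond=None)
    A, B = coef[1], coef[2]
    return np.hypot(A, B), np.arctan2(B, A)

# Synthetic sensor pair: the deeper record is damped and delayed by 3 h.
t = np.arange(0, 10 * 24, 0.25)                       # 10 days, 15-min sampling
shallow = 15 + 3.0 * np.sin(2 * np.pi * t / 24.0)
deep = 15 + 1.2 * np.sin(2 * np.pi * (t - 3.0) / 24.0)

A_s, ph_s = diurnal_amp_phase(t, shallow)
A_d, ph_d = diurnal_amp_phase(t, deep)
amp_ratio = A_d / A_s                                  # Ar: sensitive at low rates
phase_lag_hours = ((ph_d - ph_s) % (2 * np.pi)) * 24.0 / (2 * np.pi)  # delta-phi
```

On this noiseless example the fit recovers the damping ratio (0.4) and the 3-hour lag exactly; field records would add noise and non-diurnal variability.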
Spaeder, M C; Fackler, J C
2012-04-01
Respiratory syncytial virus (RSV) is the most common cause of documented viral respiratory infections, and the leading cause of hospitalization, in young children. We performed a retrospective time-series analysis of all patients aged <18 years with laboratory-confirmed RSV within a network of multiple affiliated academic medical institutions. Forecasting models of weekly RSV incidence for the local community, inpatient paediatric hospital and paediatric intensive-care unit (PICU) were created. Ninety-five percent confidence intervals calculated around our models' 2-week forecasts were accurate to ±9.3, ±7.5 and ±1.5 cases/week for the local community, inpatient hospital and PICU, respectively. Our results suggest that time-series models may be useful tools in forecasting the burden of RSV infection at the local and institutional levels, helping communities and institutions to optimize distribution of resources based on the changing burden and severity of illness in their respective communities.
NASA Technical Reports Server (NTRS)
Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)
2000-01-01
The use of the Principal Component Analysis technique for the analysis of geophysical time series has been questioned, in particular for its tendency to extract components that mix several physical phenomena even when the signal is just their linear sum. We demonstrate with a data simulation experiment that Independent Component Analysis, a recently developed technique, is able to solve this problem. This new technique requires the statistical independence of components, a stronger constraint that uses higher-order statistics, instead of the classical decorrelation, a weaker constraint that uses only second-order statistics. Furthermore, ICA does not require additional a priori information such as the localization constraint used in Rotational Techniques.
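The distinction drawn above can be made concrete in a toy two-channel experiment: whitening (the second-order, PCA step) leaves a rotation undetermined, and only a non-Gaussianity criterion (here, summed absolute excess kurtosis, one simple ICA contrast) pins down the independent sources. This is a minimal sketch, not the authors' algorithm or data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000
# Two independent, unit-variance, non-Gaussian sources.
s1 = rng.uniform(-np.sqrt(3), np.sqrt(3), n)      # sub-Gaussian (kurtosis < 0)
s2 = rng.laplace(0, 1 / np.sqrt(2), n)            # super-Gaussian (kurtosis > 0)
S = np.vstack([s1, s2])
A = np.array([[0.7, 0.3], [0.4, 0.6]])            # unknown mixing matrix
X = A @ S                                         # the "observed" two channels

# PCA step: decorrelate and whiten using second-order statistics only.
vals, vecs = np.linalg.eigh(np.cov(X))
Z = np.diag(vals ** -0.5) @ vecs.T @ (X - X.mean(axis=1, keepdims=True))

# Whitening leaves an arbitrary rotation: any rotated Z is also uncorrelated.
# ICA resolves it by maximizing non-Gaussianity (higher-order statistics).
def excess_kurtosis(y):
    return np.mean(y ** 4) / np.mean(y ** 2) ** 2 - 3.0

def contrast(a):
    u = np.cos(a) * Z[0] + np.sin(a) * Z[1]
    v = -np.sin(a) * Z[0] + np.cos(a) * Z[1]
    return abs(excess_kurtosis(u)) + abs(excess_kurtosis(v))

best = max(np.linspace(0, np.pi / 2, 181), key=contrast)
R = np.array([[np.cos(best), np.sin(best)],
              [-np.sin(best), np.cos(best)]])
Y = R @ Z                                         # estimated sources

# Each recovered component should match one true source (up to order/sign).
match = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
```

The rotation-grid search is only workable in two dimensions; practical ICA algorithms (e.g., cumulant diagonalization or fixed-point iterations) solve the same problem in higher dimensions.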
Bayesian time-series analysis of a repeated-measures poisson outcome with excess zeroes.
Murphy, Terrence E; Van Ness, Peter H; Araujo, Katy L B; Pisani, Margaret A
2011-12-01
In this article, the authors demonstrate a time-series analysis based on a hierarchical Bayesian model of a Poisson outcome with an excessive number of zeroes. The motivating example for this analysis comes from the intensive care unit (ICU) of an urban university teaching hospital (New Haven, Connecticut, 2002-2004). Studies of medication use among older patients in the ICU are complicated by statistical factors such as an excessive number of zero doses, periodicity, and within-person autocorrelation. Whereas time-series techniques adjust for autocorrelation and periodicity in outcome measurements, Bayesian analysis provides greater precision for small samples and the flexibility to conduct posterior predictive simulations. By applying elements of time-series analysis within both frequentist and Bayesian frameworks, the authors evaluate differences in shift-based dosing of medication in a medical ICU. From a small sample and with adjustment for excess zeroes, linear trend, autocorrelation, and clinical covariates, both frequentist and Bayesian models provide evidence of a significant association between a specific nursing shift and dosing level of a sedative medication. Furthermore, the posterior distributions from a Bayesian random-effects Poisson model permit posterior predictive simulations of related results that are potentially difficult to model.
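The zero-inflated Poisson outcome model underlying the analysis can be sketched in a few lines. This illustrates only the likelihood and a sampler with illustrative parameter values, not the hierarchical Bayesian fit or the autocorrelation structure of the paper.

```python
import math
import random

def zip_pmf(k, lam, pi):
    """P(Y = k) under a zero-inflated Poisson: with probability pi the count
    is a structural zero, otherwise it is Poisson(lam)."""
    pois = math.exp(-lam) * lam ** k / math.factorial(k)
    return pi * (k == 0) + (1.0 - pi) * pois

def zip_sample(lam, pi, n, seed=0):
    """Draw n ZIP variates (Poisson part sampled by CDF inversion)."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        if rng.random() < pi:
            out.append(0)
            continue
        u, k = rng.random(), 0
        p = cdf = math.exp(-lam)
        while u > cdf:
            k += 1
            p *= lam / k
            cdf += p
        out.append(k)
    return out

y = zip_sample(lam=2.0, pi=0.4, n=50000)
zero_frac = y.count(0) / len(y)
mean_y = sum(y) / len(y)
# Moments: E[Y] = (1 - pi) * lam ;  P(Y = 0) = pi + (1 - pi) * exp(-lam)
```

The excess-zero fraction pi and the Poisson rate lam would, in the paper's setting, each be given regression structure and priors.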
Cluster analysis of resting-state fMRI time series.
Mezer, Aviv; Yovel, Yossi; Pasternak, Ofer; Gorfine, Tali; Assaf, Yaniv
2009-05-01
Functional MRI (fMRI) has become one of the leading methods for brain mapping in neuroscience. Recent advances in fMRI analysis were used to define the default state of brain activity, functional connectivity and basal activity. Basal activity measured with fMRI raised tremendous interest among neuroscientists since synchronized brain activity patterns could be retrieved while the subject rests (resting state fMRI). During recent years, a few signal processing schemes have been suggested to analyze the resting state blood oxygenation level dependent (BOLD) signal, from simple correlations to spectral decomposition. In most of these analysis schemes, the question asked was which brain areas "behave" in the time domain similarly to a pre-specified ROI. In this work we applied short time frequency analysis and clustering to study the spatial signal characteristics of resting state fMRI time series. Such analysis revealed that clusters of similar BOLD fluctuations are found in the cortex but also in the white matter and sub-cortical gray matter regions (thalamus). We found high similarities between the BOLD clusters and the neuro-anatomical appearance of brain regions. Additional analysis of the BOLD time series revealed a strong correlation between head movements and clustering quality. Experiments performed with T1-weighted time series also provided similar quality of clustering. These observations led us to the conclusion that non-functional contributions to the BOLD time series can also account for the symmetric appearance of signal fluctuations. These contributions may include head motions, the underlying microvasculature anatomy and cellular morphology.
Fractal time series analysis of postural stability in elderly and control subjects
Amoud, Hassan; Abadi, Mohamed; Hewson, David J; Michel-Pellegrino, Valérie; Doussot, Michel; Duchêne, Jacques
2007-01-01
Background The study of balance using stabilogram analysis is of particular interest in the study of falls. Although simple statistical parameters derived from the stabilogram have been shown to predict risk of falls, such measures offer little insight into the underlying control mechanisms responsible for degradation in balance. In contrast, fractal and non-linear time-series analysis of stabilograms, such as estimations of the Hurst exponent (H), may provide information related to the underlying motor control strategies governing postural stability. In order to be adapted for a home-based follow-up of balance, such methods need to be robust, regardless of the experimental protocol, while producing time-series that are as short as possible. The present study compares two methods of calculating H: Detrended Fluctuation Analysis (DFA) and Stabilogram Diffusion Analysis (SDA) for elderly and control subjects, as well as evaluating the effect of recording duration. Methods Centre of pressure signals were obtained from 90 young adult subjects and 10 elderly subjects. Data were sampled at 100 Hz for 30 s, including stepping onto and off the force plate. Estimations of H were made using sliding windows of 10, 5, and 2.5 s durations, with windows slid forward in 1-s increments. Multivariate analysis of variance was used to test for the effect of time, age and estimation method on the Hurst exponent, while the intra-class correlation coefficient (ICC) was used as a measure of reliability. Results Both SDA and DFA methods were able to identify differences in postural stability between control and elderly subjects for time series as short as 5 s, with ICC values as high as 0.75 for DFA. Conclusion Both methods would be well-suited to non-invasive longitudinal assessment of balance. In addition, reliable estimations of H were obtained from time series as short as 5 s. PMID:17470303
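Of the two estimators compared above, DFA is the more compact to sketch. The version below (linear detrending, non-overlapping windows, synthetic input) is a generic implementation, not the study's sliding-window protocol on centre-of-pressure data.

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis with linear detrending in
    non-overlapping windows; returns the scaling exponent alpha."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())              # integrated profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        ms_res = []
        for seg in segs:
            a, b = np.polyfit(t, seg, 1)     # local linear trend
            ms_res.append(np.mean((seg - (a * t + b)) ** 2))
        F.append(np.sqrt(np.mean(ms_res)))
    alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return alpha

rng = np.random.default_rng(2)
white = rng.standard_normal(10000)
scales = [16, 32, 64, 128, 256]
alpha_white = dfa(white, scales)             # near 0.5 for uncorrelated noise
alpha_walk = dfa(np.cumsum(white), scales)   # near 1.5 for a random walk
```

For fractional Gaussian noise, alpha estimates the Hurst exponent H; for integrated (random-walk-like) signals such as stabilograms, alpha is approximately H + 1.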
Low Cost Beam-Steering Approach for a Series-Fed Array
NASA Technical Reports Server (NTRS)
Host, Nicholas K.; Chen, Chi-Chih; Volakis, John L.; Miranda, Felix A.
2013-01-01
Phased array antennas showcase many advantages over mechanically steered systems. However, they are also more complex and costly. This paper presents a concept which overcomes these detrimental attributes by eliminating all of the phased array backend (including phase shifters). Instead, a propagation constant reconfigurable transmission line in a series-fed array arrangement is used to allow phase shifting with one small (≤ 100 mil) linear mechanical motion. A novel slotted coplanar stripline design improves on previous transmission lines by demonstrating greater control of the propagation constant, thus allowing practical prototypes to be built. Also, beam steering pattern control is explored. We show that with the correct choice of line impedance, pattern control is possible for all scan angles. A 20-element array scanning from -25° ≤ θ ≤ 21° with mostly uniform gain at 13 GHz is presented. Measured patterns show a reduced scan range of 12° ≤ θ ≤ 25° due to a correctable manufacturing error, as verified by simulation. Beam squint is measured to be ±2.5° for a 600 MHz bandwidth and cross-pol is measured to be at least -15 dB.
Ignotti, Eliane; Hacon, Sandra de Souza; Junger, Washington Leite; Mourão, Dennys; Longo, Karla; Freitas, Saulo; Artaxo, Paulo; Leon, Antônio Carlos Monteiro Ponce de
2010-04-01
The objective of the study is to evaluate the effect of the daily variation in concentrations of fine particulate matter (diameter less than 2.5 μm; PM2.5) resulting from the burning of biomass on the daily number of hospitalizations of children and elderly people for respiratory diseases, in Alta Floresta and Tangará da Serra in the Brazilian Amazon in 2005. This is an ecological time series study that uses data on the daily number of hospitalizations of children and the elderly for respiratory diseases, and estimated concentrations of PM2.5. In Alta Floresta, the percentage increases in the relative risk (%RR) of hospitalization for respiratory diseases in children were significant for the whole year and for the dry season with 3-4 day lags. In the dry season these measurements reach 6% (95%CI: 1.4-10.8). The associations were significant for moving averages of 3-5 days. The %RR for the elderly was significant for the current day of the drought, with a 6.8% increase (95%CI: 0.5-13.5) for each additional 10 μg/m³ of PM2.5. No associations were verified for Tangará da Serra. The PM2.5 from the burning of biomass increased hospitalizations for respiratory diseases in children and the elderly.
NASA Astrophysics Data System (ADS)
Erkyihun, S. T.
2013-12-01
Understanding streamflow variability and the ability to generate realistic scenarios at multi-decadal time scales is important for robust water resources planning and management in any river basin, more so in the Colorado River Basin with its semi-arid climate and highly stressed water resources. It is increasingly evident that large scale climate forcings such as the El Nino Southern Oscillation (ENSO), Pacific Decadal Oscillation (PDO) and Atlantic Multi-decadal Oscillation (AMO) modulate the Colorado River Basin hydrology at multi-decadal time scales. Thus, modeling these large scale climate indicators is important for then conditionally modeling the multi-decadal streamflow variability. To this end, we developed a simulation model that combines a wavelet-based time series method, Wavelet Auto Regressive Moving Average (WARMA), with a K-nearest neighbor (K-NN) bootstrap approach. In this, for a given time series (climate forcings), dominant periodicities/frequency bands are identified from the wavelet spectrum as those that pass the 90% significance test. The time series is filtered at these frequencies in each band to create 'components'; the components are orthogonal and, when added to the residual (i.e., noise), result in the original time series. The components, being smooth, are easily modeled using parsimonious Auto Regressive Moving Average (ARMA) time series models. The fitted ARMA models are used to simulate the individual components, which are added to obtain a simulation of the original series. The WARMA approach is applied to all the climate forcing indicators, which are used to simulate multi-decadal sequences of these forcings. For the current year, the simulated forcings are considered the 'feature vector' and the K nearest neighbors of this vector are identified; one of the neighbors (i.e., one of the historical years) is resampled using a weighted probability metric (with more weight to the nearest neighbor and least to the farthest) and the corresponding streamflow is the
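The filter-then-fit idea (split the series into narrowband components, then fit a parsimonious autoregressive model to each smooth component) can be sketched with a plain Fourier bandpass standing in for the wavelet filtering and AR(1) standing in for ARMA; both are simplifying assumptions, as is the synthetic index series.

```python
import numpy as np

def band_components(x, bands, dt=1.0):
    """Split a series into near-orthogonal narrowband components by zeroing
    Fourier coefficients outside each band (a stand-in for the paper's
    wavelet filtering at significant periodicities)."""
    xc = x - x.mean()
    X = np.fft.rfft(xc)
    f = np.fft.rfftfreq(len(x), dt)
    comps = [np.fft.irfft(np.where((f >= lo) & (f < hi), X, 0), n=len(x))
             for lo, hi in bands]
    residual = xc - np.sum(comps, axis=0)
    return comps, residual

def fit_ar1(c):
    """Yule-Walker AR(1) coefficient (lag-1 autocorrelation) of a component."""
    c = c - c.mean()
    return float(np.dot(c[1:], c[:-1]) / np.dot(c, c))

# Toy climate-index series: a slow oscillation (period 64) buried in noise.
rng = np.random.default_rng(3)
n = 2048
t = np.arange(n)
x = np.sin(2 * np.pi * t / 64) + 0.5 * rng.standard_normal(n)

comps, resid = band_components(x, bands=[(1 / 80, 1 / 50)])
phi = fit_ar1(comps[0])   # close to 1: the component is smooth and persistent
```

Because the components are smooth, low-order AR/ARMA models fit them well; simulating each fitted model and summing the simulated components reverses the decomposition.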
Factors influencing crime rates: an econometric analysis approach
NASA Astrophysics Data System (ADS)
Bothos, John M. A.; Thomopoulos, Stelios C. A.
2016-05-01
The scope of the present study is to research the dynamics that determine the commission of crimes in US society. Our study is part of a model we are developing to understand urban crime dynamics and to enhance citizens' "perception of security" in large urban environments. The main targets of our research are to highlight the dependence of crime rates on certain social and economic factors and on basic elements of state anticrime policies. In conducting our research, we use as guides previous relevant studies on crime dependence that have been performed with similar quantitative analyses in mind, regarding the dependence of crime on certain social and economic factors using statistics and econometric modelling. Our first approach consists of conceptual state space dynamic cross-sectional econometric models that incorporate a feedback loop describing crime as a feedback process. In order to define the model variables dynamically, we use statistical analysis of crime records and of records about social and economic conditions and policing characteristics (like police force and policing results, i.e. crime arrests), to determine their influence as independent variables on crime, the dependent variable of our model. The econometric models we apply in this first approach are an exponential log-linear model and a logit model. In a second approach, we try to study the evolution of violent crime through time in the US, independently as an autonomous social phenomenon, using autoregressive and moving average time-series econometric models. Our findings show that there are certain social and economic characteristics that affect the formation of crime rates in the US, either positively or negatively. Furthermore, the results of our time-series econometric modelling show that violent crime, viewed solely and independently as a social phenomenon, correlates with previous years' crime rates and depends on the social and economic environment's conditions during previous years.
Applying fractal analysis to heart rate time series of sheep experiencing pain.
Stubsjøen, Solveig M; Bohlin, Jon; Skjerve, Eystein; Valle, Paul S; Zanella, Adroaldo J
2010-08-04
The objective assessment of pain is difficult in animals and humans alike. Detrended fluctuation analysis (DFA) is a method which extracts "hidden" information from heart rate time series, and may offer a novel way of assessing the subjective experience associated with pain. The aim of this study was to investigate whether any fractal differences could be detected in heart rate time series of sheep due to the infliction of ischaemic pain. Heart rate variability (HRV) was recorded continuously in five ewes during treatment sequences of baseline, intervention and post-intervention for up to 60 min. Heart rate time series were subjected to a DFA, and the median of the scaling coefficients (alpha) was found to be alpha = 1.10 for the baseline sequences, 1.01 for the intervention sequences and 1.00 for the post-intervention sequences. The complexity in the regulation of heartbeats decreased between baseline and intervention (p ≈ 0.03) and baseline and post-intervention (p ≈ 0.01), indicating reperfusion pain and nociceptive sensitization in the post-intervention sequence. Random time series based on Gaussian white noise were generated, with similar mean and variance to the HRV sequences. No difference was found between these series (p ≈ 0.28), pointing to a true difference in complexity in the original data. We found no difference in the scaling coefficient alpha between the different treatments, possibly due to the small sample size or a fear-induced sympathetic arousal during test day 1 confounding the results. The decrease in the scaling coefficient alpha may be due to sympathetic activation and vagal withdrawal. DFA of heart rate time series may be a useful method to evaluate the progressive shift of cardiac regulation toward sympathetic activation and vagal withdrawal produced by pain or negative emotional responses such as fear.
Adventures in Modern Time Series Analysis: From the Sun to the Crab Nebula and Beyond
NASA Astrophysics Data System (ADS)
Scargle, Jeffrey
2014-01-01
With the generation of long, precise, and finely sampled time series the Age of Digital Astronomy is uncovering and elucidating energetic dynamical processes throughout the Universe. Fulfilling these opportunities requires data effective analysis techniques rapidly and automatically implementing advanced concepts. The Time Series Explorer, under development in collaboration with Tom Loredo, provides tools ranging from simple but optimal histograms to time and frequency domain analysis for arbitrary data modes with any time sampling. Much of this development owes its existence to Joe Bredekamp and the encouragement he provided over several decades. Sample results for solar chromospheric activity, gamma-ray activity in the Crab Nebula, active galactic nuclei and gamma-ray bursts will be displayed.
Higher Order Residual Analysis for Nonlinear Time Series with Autoregressive Correlation Structures.
1984-09-25
Lewis, P. A. W. (Professor of Operations Research, Naval Postgraduate School); Lawrance, A. J. (University of Birmingham, England). Approved for public release; reproduction of all or part of this report is authorized.
Time series analysis and long range correlations of Nordic spot electricity market data
NASA Astrophysics Data System (ADS)
Erzgräber, Hartmut; Strozzi, Fernanda; Zaldívar, José-Manuel; Touchette, Hugo; Gutiérrez, Eugénio; Arrowsmith, David K.
2008-11-01
The electricity system price of the Nord Pool spot market is analysed. Different time scale analysis tools are assessed, with a focus on the Hurst exponent and long range correlations. Daily and weekly periodicities of the spot market are identified. Even though space-time separation plots suggest more stationary behaviour than other financial time series, we find large fluctuations of the spot market price which suggest time-dependent scaling parameters.
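A classical rescaled-range (R/S) estimate of the Hurst exponent, one of the time scale tools assessed above, can be sketched as follows on synthetic data (not the Nord Pool series); note the well-known upward bias of the raw R/S statistic on short windows.

```python
import numpy as np

def hurst_rs(x, window_sizes):
    """Rescaled-range (R/S) Hurst estimate: slope of log(mean R/S)
    against log(window size) over non-overlapping windows."""
    x = np.asarray(x, dtype=float)
    mean_rs = []
    for w in window_sizes:
        vals = []
        for start in range(0, len(x) - w + 1, w):
            seg = x[start:start + w]
            dev = np.cumsum(seg - seg.mean())      # cumulative deviations
            R = dev.max() - dev.min()              # range
            S = seg.std()                          # scale
            if S > 0:
                vals.append(R / S)
        mean_rs.append(np.mean(vals))
    H, _ = np.polyfit(np.log(window_sizes), np.log(mean_rs), 1)
    return H

rng = np.random.default_rng(4)
increments = rng.standard_normal(20000)            # uncorrelated increments
H = hurst_rs(increments, [32, 64, 128, 256, 512])
# H near 0.5 for uncorrelated data (finite-sample bias pushes the raw
# estimate slightly above 0.5); persistent series give H > 0.5.
```

Time-dependent scaling, as suggested by the abstract, would be probed by computing H in a sliding window rather than over the full record.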
NASA Astrophysics Data System (ADS)
Forootan, Ehsan; Kusche, Jürgen
2016-04-01
Geodetic/geophysical observations, such as time series of global terrestrial water storage change or of sea level and temperature change, represent samples of physical processes and therefore contain information about complex physical interactions with many inherent time scales. Extracting relevant information from these samples, for example quantifying the seasonality of a physical process or its variability due to large-scale ocean-atmosphere interactions, is not possible with simple time series approaches. In recent decades, decomposition techniques have found increasing interest for extracting patterns from geophysical observations. Traditionally, principal component analysis (PCA) and, more recently, independent component analysis (ICA) are common techniques to extract statistically orthogonal (uncorrelated) and independent modes that represent the maximum variance of observations, respectively. PCA and ICA can be classified as stationary signal decomposition techniques since they are based on decomposing the auto-covariance matrix or diagonalizing higher (than two)-order statistical tensors from centered time series. However, the stationarity assumption is obviously not justifiable for many geophysical and climate variables, even after removing cyclic components, e.g., the seasonal cycles. In this paper, we present a new decomposition method, the complex independent component analysis (CICA, Forootan, PhD-2014), which can be applied to extract non-stationary (changing in space and time) patterns from geophysical time series. Here, CICA is derived as an extension of real-valued ICA (Forootan and Kusche, JoG-2012), where we (i) define a new complex data set using a Hilbert transformation. The complex time series contain the observed values in their real part, and the temporal rate of variability in their imaginary part. (ii) An ICA algorithm based on diagonalization of fourth-order cumulants is then applied to decompose the new complex data set in (i
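Step (i), forming a complex series whose imaginary part carries the temporal rate of variability, is the standard analytic-signal construction. The sketch below implements the discrete Hilbert transform in the frequency domain; it covers only this building block, not the CICA decomposition itself.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the frequency domain: keep DC (and Nyquist for
    even n), double positive frequencies, zero negative ones, then invert.
    The imaginary part of the result is the discrete Hilbert transform."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

# For a cosine the analytic signal is cos + i*sin: unit envelope, and the
# imaginary part is the quadrature (90-degree shifted) series.
t = np.arange(1024)
x = np.cos(2 * np.pi * t / 64)
z = analytic_signal(x)
envelope = np.abs(z)
```

Applying this to each grid-point series yields the complex data set on which a cumulant-based complex ICA would then operate.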
Flood Frequency Analysis For Partial Duration Series In Ganjiang River Basin
NASA Astrophysics Data System (ADS)
zhangli, Sun; xiufang, Zhu; yaozhong, Pan
2016-04-01
Accurate estimation of flood frequency is key to effective, nationwide flood damage abatement programs. The partial duration series (PDS) method is widely used in hydrologic studies because it considers all events above a certain threshold level as compared to the annual maximum series (AMS) method, which considers only the annual maximum value. However, the PDS has a drawback in that it is difficult to define the thresholds and maintain an independent and identical distribution of the partial duration time series; this drawback is discussed in this paper. The Ganjiang River is the seventh largest tributary of the Yangtze River, the longest river in China. The Ganjiang River covers a drainage area of 81,258 km2 at the Wanzhou hydrologic station as the basin outlet. In this work, 56 years of daily flow data (1954-2009) from the Wanzhou station were used to analyze flood frequency, and the Pearson-III model was employed as the hydrologic probability distribution. Generally, three tasks were accomplished: (1) the threshold of PDS by percentile rank of daily runoff was obtained; (2) trend analysis of the flow series was conducted using PDS; and (3) flood frequency analysis was conducted for partial duration flow series. The results showed a slight upward trend of the annual runoff in the Ganjiang River basin. The maximum flow with a 0.01 exceedance probability (corresponding to a 100-year flood peak under stationary conditions) was 20,000 m3/s, while that with a 0.1 exceedance probability was 15,000 m3/s. These results will serve as a guide to hydrological engineering planning, design, and management for policymakers and decision makers associated with hydrology.
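The PDS extraction step described above (threshold by a percentile of daily flow, keep one peak per exceedance cluster, enforce a minimum separation for independence) can be sketched as follows. The percentile, gap, and synthetic record are illustrative assumptions; the paper's Pearson-III fitting is not reproduced.

```python
import numpy as np

def pds_extract(daily_flow, pct=99.0, min_gap=7):
    """Partial duration series: threshold at a percentile of daily flow,
    keep one peak per exceedance cluster, and enforce a minimum gap between
    retained peaks as a crude independence criterion."""
    q = float(np.percentile(daily_flow, pct))
    above = daily_flow > q
    peaks = []
    i = 0
    while i < len(daily_flow):
        if above[i]:
            j = i
            while j < len(daily_flow) and above[j]:
                j += 1
            day = i + int(np.argmax(daily_flow[i:j]))   # cluster maximum
            if not peaks or day - peaks[-1] >= min_gap:
                peaks.append(day)
            i = j
        else:
            i += 1
    return q, daily_flow[np.array(peaks)]

def empirical_exceedance(peaks):
    """Weibull plotting positions: P(X > x_(i)) = i / (n + 1)."""
    x_sorted = np.sort(peaks)[::-1]
    exc_prob = np.arange(1, len(x_sorted) + 1) / (len(x_sorted) + 1)
    return x_sorted, exc_prob

rng = np.random.default_rng(5)
flow = rng.gamma(shape=2.0, scale=500.0, size=20 * 365)   # synthetic daily flows
threshold, pds = pds_extract(flow)
x_sorted, exc_prob = empirical_exceedance(pds)
```

A design quantile such as the 0.01-exceedance flood would then come from fitting a distribution (Pearson-III in the paper) to the extracted peaks rather than from these empirical positions alone.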
Microfluidic Approaches for Protein Crystal Structure Analysis.
Maeki, Masatoshi; Yamaguchi, Hiroshi; Tokeshi, Manabu; Miyazaki, Masaya
2016-01-01
This review summarizes two microfluidic-based protein crystallization methods, protein crystallization behavior in microfluidic devices, and their applications for X-ray crystal structure analysis. Microfluidic devices provide many advantages for protein crystallography; they require small sample volumes, provide high-throughput screening, and allow control of protein crystallization. A droplet-based protein crystallization method is a useful technique for high-throughput screening and the formation of a single crystal without any complicated device fabrication process. Well-based microfluidic platforms also enable effective protein crystallization. This review also summarizes protein crystal growth behavior in microfluidic devices, as known from theoretical and experimental approaches. Finally, we introduce applications of microfluidic devices for on-chip crystal structure analysis.
Surgery-first orthognathic approach case series: Salient features and guidelines
Gandedkar, Narayan H; Chng, Chai Kiat; Tan, Winston
2016-01-01
Conventional orthognathic surgery treatment involves a prolonged period of orthodontic treatment (pre- and post-surgery), making the total treatment period of 3–4 years exhausting. The surgery-first orthognathic approach (SFOA) sees orthognathic surgery being carried out first, followed by orthodontic treatment to align the teeth and occlusion. Following orthognathic surgery, a period of rapid metabolic activity within tissues ensues, known as the regional acceleratory phenomenon (RAP). By performing surgery first, RAP can be harnessed to facilitate efficient orthodontic treatment. This phenomenon is believed to be a key factor in the notable reduction in treatment duration with SFOA. This article presents two cases treated with SFOA, with emphasis on "case selection, treatment strategy, merits, and limitations" of SFOA. Further, the salient features of "conventional orthognathic surgery" and "SFOA" are compared, with an overview of the authors' SFOA treatment protocol. PMID:26998476
Regenerative Approach to Bilateral Rostral Mandibular Reconstruction in a Case Series of Dogs
Arzi, Boaz; Cissell, Derek D.; Pollard, Rachel E.; Verstraete, Frank J. M.
2015-01-01
Extensive rostral mandibulectomy in dogs typically results in instability of the mandibles that may lead to malocclusion, difficulty in prehension, mastication, and pain of the temporomandibular joint. Large rostral mandibular defects are challenging to reconstruct due to the complex geometry of this region. In order to restore mandibular continuity and stability following extensive rostral mandibulectomy, we developed a surgical technique using a combination of intraoral and extraoral approaches, a locking titanium plate, and a compression resistant matrix (CRM) infused with rhBMP-2. Furthermore, surgical planning that consisted of computed tomographic (CT) scanning and 3D model printing was utilized. We describe a regenerative surgical technique for immediate or delayed reconstruction of critical-size rostral mandibular defects in five dogs. Three dogs had healed with intact gingival covering over the mandibular defect and had immediate return to normal function and occlusion. Two dogs had the complication of focal plate exposure and dehiscence, which was corrected with mucosal flaps and suturing; these dogs have since healed with intact gingival covering over the mandibular defect. Mineralized tissue formation was palpated clinically within 2 weeks and solid bone formation within 3 months. CT findings at 6 months postoperatively demonstrated that the newly regenerated mandibular bone had increased in mineral volume with evidence of integration between the native bone, new bone, and CRM compared to the immediate postoperative CT. We conclude that rostral mandibular reconstruction using a regenerative approach provides an excellent solution for restoring mandibular continuity and preventing mandibular instability in dogs. PMID:26664933
The application of artificial neural networks to magnetotelluric time-series analysis
NASA Astrophysics Data System (ADS)
Manoj, C.; Nagarajan, Nandini
2003-05-01
Magnetotelluric (MT) signals are often contaminated with noise from natural or man-made processes that may not fit a normal distribution or are highly correlated. This may lead to serious errors in computed MT transfer functions and result in erroneous interpretation. A substantial improvement is possible when the time-series are presented as clean as possible for further processing. Cleaning of MT time-series is often done by manual editing. Editing of magnetotelluric time-series is subjective in nature and time consuming. Automation of such a process is difficult to achieve by statistical methods. Artificial neural networks (ANNs) are widely used to automate processes that require human intelligence. The objective here is to automate MT long-period time-series editing using an ANN. A three-layer feed-forward artificial neural network (FANN) was adopted for the problem. As ANN-based techniques are computationally intensive, a novel approach was adopted, which involves editing of five simultaneously measured MT time-series that have been subdivided into stacks (a stack = 5 × 256 data points). Neural network training was done at two levels. Signal and noise patterns of individual channels were taught first. Five channel parameters along with interchannel correlation and amplitude ratios formed the input for a final network, which predicts the quality of a stack. A large database (5000 traces for pattern training and 900 vectors for interchannel training) was prepared to train the network. There were two error parameters to minimize while training: training error and testing error. Training was stopped when both errors were below an acceptable level. The sensitivity of the neural network to the signal-to-noise ratio and the relative significance of its inputs were tested to ensure that the training was correct. MT time-series from four stations with varying degrees of noise contamination were used to demonstrate the application of the network. The application brought out
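A small feed-forward network of the kind described, trained by gradient descent to flag noisy stacks from a few per-trace statistics, can be sketched as follows. The features, layer sizes, and training data here are toy assumptions, not the study's channel parameters or database.

```python
import numpy as np

rng = np.random.default_rng(6)

def make_trace(noisy):
    """Toy stack: a narrowband signal, optionally hit by spike noise.
    Features per trace: variance, lag-1 autocorrelation, peak-to-std ratio."""
    t = np.arange(256)
    x = np.sin(2 * np.pi * t / 32) + 0.2 * rng.standard_normal(256)
    if noisy:
        x[rng.integers(0, 256, 10)] += rng.normal(0, 8, 10)
    x = x - x.mean()
    return np.array([x.var(),
                     np.dot(x[1:], x[:-1]) / np.dot(x, x),
                     np.abs(x).max() / x.std()])

X = np.array([make_trace(i % 2 == 1) for i in range(400)])
y = np.array([i % 2 for i in range(400)], dtype=float)
X = (X - X.mean(axis=0)) / X.std(axis=0)

# Three-layer feed-forward net: 3 inputs -> 5 tanh hidden units -> 1 logistic
# output, trained by full-batch gradient descent on the log-loss.
W1 = rng.normal(0, 0.5, (3, 5)); b1 = np.zeros(5)
W2 = rng.normal(0, 0.5, (5, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(2000):
    H = np.tanh(X @ W1 + b1)
    out = sigmoid(H @ W2 + b2).ravel()
    err = out - y                      # d(log-loss)/d(output logit)
    gH = (err[:, None] @ W2.T) * (1.0 - H ** 2)
    W2 -= lr * H.T @ err[:, None] / len(y)
    b2 -= lr * err.mean()
    W1 -= lr * X.T @ gH / len(y)
    b1 -= lr * gH.mean(axis=0)

accuracy = float(np.mean((out > 0.5) == (y == 1)))
```

The study's two-level scheme would chain such networks: per-channel pattern scores first, then a final network fed with interchannel correlations and amplitude ratios.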
Harmonic analysis of environmental time series with missing data or irregular sample spacing.
Dilmaghani, Shabnam; Henry, Isaac C; Soonthornnonda, Puripus; Christensen, Erik R; Henry, Ronald C
2007-10-15
The Lomb periodogram and discrete Fourier transform are described and applied to harmonic analysis of two typical data sets, one air quality time series and one water quality time series. The air quality data is a 13-year series of 24-hour average particulate elemental carbon data from the IMPROVE station in Washington, D.C. The water quality data are from the stormwater monitoring network in Milwaukee, WI and cover almost 2 years of precipitation events. These data have irregular sampling periods and missing data that preclude the straightforward application of the fast Fourier transform (FFT). In both cases, an anthropogenic periodicity is identified: a 7-day weekday/weekend effect in the Washington elemental carbon series and a 1-month cycle in several constituents of stormwater. Practical aspects of the application of the Lomb periodogram are discussed, particularly quantifying the effects of random noise. The proper application of the FFT to data that are irregularly spaced with missing values is demonstrated on the air quality data. Recommendations are given on when to use the Lomb periodogram and when to use the FFT.
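The Lomb periodogram handles irregular sampling directly because it least-squares fits sinusoids at each trial frequency rather than assuming evenly spaced data. Below is a minimal sketch recovering a weekly cycle from randomly timed samples; the data are synthetic, not the monitoring records above.

```python
import numpy as np

def lomb_periodogram(t, y, freqs):
    """Lomb-Scargle normalized periodogram for unevenly sampled data."""
    y = y - y.mean()
    var = y.var()
    P = []
    for f in freqs:
        w = 2 * np.pi * f
        tau = np.arctan2(np.sum(np.sin(2 * w * t)),
                         np.sum(np.cos(2 * w * t))) / (2 * w)
        c = np.cos(w * (t - tau))
        s = np.sin(w * (t - tau))
        P.append((np.dot(y, c) ** 2 / np.dot(c, c)
                  + np.dot(y, s) ** 2 / np.dot(s, s)) / (2 * var))
    return np.array(P)

# Irregular sampling with no fixed spacing, as in field monitoring data.
rng = np.random.default_rng(7)
t = np.sort(rng.uniform(0, 365, 300))                  # 300 random sample times
y = np.sin(2 * np.pi * t / 7) + 0.5 * rng.standard_normal(300)  # weekly cycle
freqs = np.linspace(0.01, 0.5, 2000)                   # cycles per day
P = lomb_periodogram(t, y, freqs)
peak_period = 1.0 / freqs[np.argmax(P)]                # should be near 7 days
```

With this normalization (power divided by twice the series variance), the false-alarm probability of a peak can be assessed analytically, which is what makes the effect of random noise quantifiable.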
Hazledine, Saul; Sun, Jongho; Wysham, Derin; Downie, J. Allan; Oldroyd, Giles E. D.; Morris, Richard J.
2009-01-01
Legume plants form beneficial symbiotic interactions with nitrogen fixing bacteria (called rhizobia), with the rhizobia being accommodated in unique structures on the roots of the host plant. The legume/rhizobial symbiosis is responsible for a significant proportion of the global biologically available nitrogen. The initiation of this symbiosis is governed by a characteristic calcium oscillation within the plant root hair cells and this signal is activated by the rhizobia. Recent analyses on calcium time series data have suggested that stochastic effects have a large role to play in defining the nature of the oscillations. The use of multiple nonlinear time series techniques, however, suggests an alternative interpretation, namely deterministic chaos. We provide an extensive, nonlinear time series analysis on the nature of this calcium oscillation response. We build up evidence through a series of techniques that test for determinism, quantify linear and nonlinear components, and measure the local divergence of the system. Chaos is common in nature and it seems plausible that properties of chaotic dynamics might be exploited by biological systems to control processes within the cell. Systems possessing chaotic control mechanisms are more robust in the sense that the enhanced flexibility allows more rapid response to environmental changes with less energetic costs. The desired behaviour could be most efficiently targeted in this manner, supporting some intriguing speculations about nonlinear mechanisms in biological signaling. PMID:19675679
Spatial analysis of precipitation time series over the Upper Indus Basin
NASA Astrophysics Data System (ADS)
Latif, Yasir; Yaoming, Ma; Yaseen, Muhammad
2016-12-01
The upper Indus basin (UIB) holds one of the most substantial river systems in the world, contributing roughly half of the available surface water in Pakistan. This water provides essential support for agriculture, domestic consumption, and hydropower generation, all critical for a stable economy in Pakistan. This study identified trends, analyzed variability, and assessed changes in both annual and seasonal precipitation during four time series, identified herein as: (first) 1961-2013, (second) 1971-2013, (third) 1981-2013, and (fourth) 1991-2013, over the UIB. The study investigated the spatial characteristics of the precipitation time series over 15 weather stations and provides strong evidence of changing annual precipitation by determining significant trends at 6 of the 15 studied stations (Astore, Chilas, Dir, Drosh, Gupis, and Kakul), which reveal a significant negative trend during the fourth time series. Our study also showed significantly increased precipitation at Bunji, Chitral, and Skardu, whereas such trends at the remaining stations appear insignificant. Moreover, we found that seasonal precipitation decreased at some locations (at a high level of significance), as well as periods of scarce precipitation during all four seasons. The observed decreases in precipitation appear stronger and more significant in autumn, with 10 stations exhibiting decreasing precipitation during the fourth time series, with respect to time and space. Furthermore, the observed decreases appear more robust and more significant for regions at high elevation (>1300 m). This analysis concludes that decreasing precipitation dominated the UIB, both temporally and spatially, including in the higher areas.
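The abstract does not name the significance test used. A common nonparametric choice for precipitation trend detection is the Mann-Kendall test; the sketch below is a minimal version (no tie correction) applied to a synthetic drying series, not the study's station data:

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall trend test (no tie correction): returns the S statistic,
    the normal-approximation Z score, and a two-sided p-value."""
    x = np.asarray(x, dtype=float)
    n = x.size
    sgn = np.sign(x[None, :] - x[:, None])      # element (i, j) = sign(x[j] - x[i])
    s = np.triu(sgn, k=1).sum()                 # sum over all pairs i < j
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2.0 * (1.0 - norm.cdf(abs(z)))
    return s, z, p

# Hypothetical 43-year annual precipitation series (mm) with a drying trend,
# standing in for one of the stations discussed above.
rng = np.random.default_rng(0)
years = np.arange(1971, 2014)
precip = 600.0 - 3.0 * (years - years[0]) + 40.0 * rng.standard_normal(years.size)
s, z, p = mann_kendall(precip)
print(f"S={s:.0f}, Z={z:.2f}, p={p:.4f}")
```

A negative Z with p below the chosen significance level corresponds to the "significant negative trend" language used in the abstract.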
Effective low-order models for atmospheric dynamics and time series analysis.
Gluhovsky, Alexander; Grady, Kevin
2016-02-01
The paper focuses on two interrelated problems: developing physically sound low-order models (LOMs) for atmospheric dynamics and employing them as novel time-series models to overcome deficiencies in current atmospheric time series analysis. The first problem is warranted since arbitrary truncations in the Galerkin method (commonly used to derive LOMs) may result in LOMs that violate fundamental conservation properties of the original equations, causing unphysical behaviors such as unbounded solutions. In contrast, the LOMs we offer (G-models) are energy conserving, and some retain the Hamiltonian structure of the original equations. This work examines LOMs from recent publications to show that all of them that are physically sound can be converted to G-models, while those that cannot lack energy conservation. Further, motivated by recent progress in statistical properties of dynamical systems, we explore G-models for a new role of atmospheric time series models as their data generating mechanisms are well in line with atmospheric dynamics. Currently used time series models, however, do not specifically utilize the physics of the governing equations and involve strong statistical assumptions rarely met in real data.
Costa, Madalena D.; Goldberger, Ary L.
2016-01-01
We introduce a generalization of multiscale entropy (MSE) analysis. The method is termed MSEn, where the subscript denotes the moment used to coarse-grain a time series. MSEμ, described previously, uses the mean value (first moment). Here, we focus on MSEσ2, which uses the second moment, i.e., the variance. MSEσ2 quantifies the dynamics of the volatility (variance) of a signal over multiple time scales. We use the method to analyze the structure of heartbeat time series. We find that the dynamics of the volatility of heartbeat time series obtained from healthy young subjects are highly complex. Furthermore, we find that the multiscale complexity of the volatility, not only that of the mean heart rate, degrades with aging and pathology. The “bursty” behavior of the dynamics may be related to intermittency in energy and information flows, as part of multiscale cycles of activation and recovery. Generalized MSE may also be useful in quantifying the dynamical properties of other physiologic and non-physiologic time series. PMID:27099455
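A minimal sketch of the MSEσ2 idea: coarse-grain with the second moment (the variance of non-overlapping windows) and then compute sample entropy at each scale. The SampEn implementation below is a simplified textbook version written for illustration, not the authors' code:

```python
import numpy as np

def coarse_grain(x, scale, moment="var"):
    """Non-overlapping windows of length `scale`; the variance (second
    moment) of each window gives the MSE_sigma2 coarse-graining."""
    n = (len(x) // scale) * scale
    w = x[:n].reshape(-1, scale)
    return w.var(axis=1) if moment == "var" else w.mean(axis=1)

def sample_entropy(x, m=2, r=None):
    """Simplified SampEn(m, r): -log of the ratio of (m+1)-point to m-point
    template matches under the Chebyshev distance."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.15 * x.std()
    def matches(mm):
        # Equal template counts for m and m+1, self-matches excluded.
        emb = np.lib.stride_tricks.sliding_window_view(x, mm)[:len(x) - m]
        d = np.abs(emb[:, None, :] - emb[None, :, :]).max(axis=2)
        return (np.sum(d <= r) - len(emb)) / 2.0
    b, a = matches(m), matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

# Volatility dynamics of white noise across scales 2..6 (scale 1 is
# degenerate for the variance moment: a single-sample variance is 0).
rng = np.random.default_rng(1)
x = rng.standard_normal(2000)
mse_var = [sample_entropy(coarse_grain(x, s)) for s in range(2, 7)]
print(mse_var)
```

For real heartbeat data the interesting signal is how this curve differs between healthy and pathologic groups, which is what the abstract reports.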
Lutaif, N A; Palazzo, R; Gontijo, J A R
2014-01-01
Maintenance of thermal homeostasis in rats fed a high-fat diet (HFD) is associated with changes in their thermal balance. The thermodynamic relationship between heat dissipation and energy storage is altered by the ingestion of a high-energy diet. Observation of recorded core temperature behavior, in humans and rodents, permits identification of time series characteristics, such as autoregressive structure and stationarity, that lend themselves to stochastic analysis. To identify this change, we used, for the first time, a stochastic autoregressive model, the concepts of which match those of the physiological systems involved, applied to male HFD rats compared with age-matched male controls on a standard food intake (n=7 per group). By analyzing the recorded temperature time series, we were able to identify when thermal homeostasis was affected by the new diet. The autoregressive time series model (AR model) was used to predict the occurrence of thermal homeostasis, and this model proved to be very effective in distinguishing such a physiological disorder. Thus, we infer from the results of our study that maximum entropy distribution, as a means for stochastic characterization of recorded temperature time series, may be established as an important and early tool to aid in the diagnosis and prevention of metabolic diseases, owing to its ability to detect small variations in the thermal profile.
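A hedged illustration of the AR-model idea: fit an autoregressive model to a reference segment of a temperature-like series, then watch one-step prediction errors grow when the underlying dynamics shift. All values here are synthetic; the 37.0/37.4 °C baselines and AR(1) structure are illustrative stand-ins, not the study's data or model order:

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical core-temperature-like series: AR(1) fluctuations around a
# baseline that shifts after sample 500 (a stand-in for the diet effect).
n = 1000
e = 0.05 * rng.standard_normal(n)
x = np.empty(n)
x[0] = 37.0
for t in range(1, n):
    base = 37.0 if t < 500 else 37.4
    x[t] = base + 0.8 * (x[t - 1] - base) + e[t]

# Fit an AR(1) model, x[t] = c + phi * x[t-1], on the reference half.
ref = x[:500]
A = np.column_stack([np.ones(len(ref) - 1), ref[:-1]])
c, phi = np.linalg.lstsq(A, ref[1:], rcond=None)[0]

# One-step prediction errors grow once the dynamics change.
pred = c + phi * x[:-1]
resid = x[1:] - pred
first_half = np.abs(resid[:499]).mean()
second_half = np.abs(resid[500:]).mean()
print(first_half, second_half)
```

The jump in mean absolute prediction error after the regime change is the kind of signal the AR-model approach exploits to flag a disturbed thermal profile.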
Aldahak, Nouman; El Tantowy, Mohamed; Dupre, Derrick; Yu, Alexander; Keller, Jeffrey T.; Froelich, Sebastien; Aziz, Khaled M.
2016-01-01
Background: The marginal tubercle (MT) of the zygomatic bone can be an obstacle in the standard mini-pterional (MPT) craniotomy; we aim to evaluate the effect of drilling this MT on enhancing the exposure of the MPT craniotomy for resection of sphenoid wing meningiomas (SWMs). Methods: The authors utilized 60 dry skulls for the anatomical part of the study. The MT size was reflected by the AB distance, wherein point A is the most prominent part of the MT and point B is located on the orbital rim in the same axial plane as point A. The authors analyzed the effect of MT size in masking the sphenozygomatic suture (SZS), which is the most anterior part of the MPT craniotomy. One silicone-injected embalmed specimen was used to demonstrate other modifications to the standard MPT approach. The results of the anatomical analysis were translated into the second part of the study, which consisted of the resection of 25 SWMs. Results: The MT obscured visualization when the AB distance measured 13 mm or greater. In the clinical series of SWMs, drilling such a prominent MT maximized exposure during the MPT approach. Conclusion: The MPT approach can be used for the resection of SWMs. Drilling of prominent MTs can enhance and optimize exposure of SWMs through standard MPT approaches. PMID:28144471
Reference manual for generation and analysis of Habitat Time Series: version II
Milhous, Robert T.; Bartholow, John M.; Updike, Marlys A.; Moos, Alan R.
1990-01-01
The selection of an instream flow requirement for water resource management often requires reviewing how the physical habitat changes through time. This review is referred to as 'time series analysis.' The Time Series Library (TSLIB) is a group of programs to enter, transform, analyze, and display time series data for use in stream habitat assessment. A time series may be defined as a sequence of data recorded or calculated over time. Examples might be historical monthly flow, predicted monthly weighted usable area, daily electrical power generation, annual irrigation diversion, and so forth. The time series can be analyzed, both descriptively and analytically, to understand the importance of the variation in the events over time. This is especially useful in the development of instream flow needs based on habitat availability. The TSLIB group of programs assumes that you have an adequate study plan to guide you in your analysis. You need to already have knowledge about such things as time period and time step, species and life stages to consider, and appropriate comparisons or statistics to be produced and displayed or tabulated. Knowing your destination, you must first evaluate whether TSLIB can get you there. Remember, data are not answers. This publication is a reference manual for TSLIB and is intended to be a guide to the process of using the various programs in TSLIB. The manual is essentially limited to the hands-on use of the various programs. A TSLIB user interface program (called RTSM) has been developed to provide an integrated working environment in which the user has a brief on-line description of each TSLIB program with the capability to run the TSLIB program from within the user interface. For information on the RTSM program, refer to Appendix F. Before applying the computer models described herein, it is recommended that the user enroll in the short course "Problem Solving with the Instream Flow Incremental Methodology (IFIM)." This course is offered
Studies in astronomical time series analysis. I - Modeling random processes in the time domain
NASA Technical Reports Server (NTRS)
Scargle, J. D.
1981-01-01
Several random process models in the time domain are defined and discussed. Attention is given to the moving average model, the autoregressive model, and relationships between and combinations of these models. Consideration is then given to methods for investigating pulse structure, procedures of model construction, computational methods, and numerical experiments. A FORTRAN algorithm for time series analysis has been developed which is relatively stable numerically. Results of test cases are given to study the effect of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the light curve of the quasar 3C 273 is considered as an example.
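One relationship of the kind discussed above, the equivalence of an AR(1) process to an infinite-order moving average of the driving noise, can be checked numerically. The sketch below truncates the MA representation at 50 lags; the coefficient 0.6 is an arbitrary illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(2)
n, phi = 5000, 0.6
e = rng.standard_normal(n)

# AR(1) recursion: x[t] = phi * x[t-1] + e[t], seeded so the two forms align.
x = np.empty(n)
x[0] = e[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + e[t]

# The same process as a moving average, x[t] = sum_k phi**k * e[t-k],
# truncated here at 50 lags (phi**50 is negligible).
weights = phi ** np.arange(50)
x_ma = np.convolve(e, weights)[:n]

err = np.max(np.abs(x - x_ma))
print(err)
```

The maximum discrepancy is set by the truncated tail, of order phi**50, so the two representations agree to numerical precision.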
Subsonic flutter analysis addition to NASTRAN. [for use with CDC 6000 series digital computers
NASA Technical Reports Server (NTRS)
Doggett, R. V., Jr.; Harder, R. L.
1973-01-01
A subsonic flutter analysis capability has been developed for NASTRAN, and a developmental version of the program has been installed on the CDC 6000 series digital computers at the Langley Research Center. The flutter analysis is of the modal type, uses doublet lattice unsteady aerodynamic forces, and solves the flutter equations by using the k-method. Surface and one-dimensional spline functions are used to transform from the aerodynamic degrees of freedom to the structural degrees of freedom. Some preliminary applications of the method to a beamlike wing, a platelike wing, and a platelike wing with a folded tip are compared with existing experimental and analytical results.
Comprehensive Model of Annual Plankton Succession Based on the Whole-Plankton Time Series Approach
Romagnan, Jean-Baptiste; Legendre, Louis; Guidi, Lionel; Jamet, Jean-Louis; Jamet, Dominique; Mousseau, Laure; Pedrotti, Maria-Luiza; Picheral, Marc; Gorsky, Gabriel; Sardet, Christian; Stemmann, Lars
2015-01-01
Ecological succession provides a widely accepted description of seasonal changes in phytoplankton and mesozooplankton assemblages in the natural environment, but concurrent changes in smaller (i.e. microbes) and larger (i.e. macroplankton) organisms are not included in the model because plankton ranging from bacteria to jellies are seldom sampled and analyzed simultaneously. Here we studied, for the first time in the aquatic literature, the succession of marine plankton in the whole-plankton assemblage that spanned 5 orders of magnitude in size from microbes to macroplankton predators (not including fish or fish larvae, for which no consistent data were available). Samples were collected in the northwestern Mediterranean Sea (Bay of Villefranche) weekly during 10 months. Simultaneously collected samples were analyzed by flow cytometry, inverse microscopy, FlowCam, and ZooScan. The whole-plankton assemblage underwent sharp reorganizations that corresponded to bottom-up events of vertical mixing in the water-column, and its development was top-down controlled by large gelatinous filter feeders and predators. Based on the results provided by our novel whole-plankton assemblage approach, we propose a new comprehensive conceptual model of the annual plankton succession (i.e. whole plankton model) characterized by both stepwise stacking of four broad trophic communities from early spring through summer, which is a new concept, and progressive replacement of ecological plankton categories within the different trophic communities, as recognised traditionally. PMID:25780912
On the Impact of a Quadratic Acceleration Term in the Analysis of Position Time Series
NASA Astrophysics Data System (ADS)
Bogusz, Janusz; Klos, Anna; Bos, Machiel Simon; Hunegnaw, Addisu; Teferle, Felix Norman
2016-04-01
The analysis of Global Navigation Satellite System (GNSS) position time series generally assumes that each coordinate component series is described by the sum of a linear rate (velocity) and various periodic terms. The residuals, the deviations between the fitted model and the observations, are then a measure of the epoch-to-epoch scatter and have been used for the analysis of the stochastic character (noise) of the time series. Often the parameters of interest in GNSS position time series are the velocities and their associated uncertainties, which have to be determined with the highest reliability. It is clear that not all GNSS position time series follow this simple linear behaviour. Therefore, we have added an acceleration term in the form of a quadratic polynomial function to the model in order to better describe the non-linear motion in the position time series. This non-linear motion could be a response to purely geophysical processes, for example, elastic rebound of the Earth's crust due to ice mass loss in Greenland, artefacts due to deficiencies in bias mitigation models, for example, of the GNSS satellite and receiver antenna phase centres, or any combination thereof. In this study we have simulated 20 time series, each 23 years long, with different stochastic characteristics such as white, flicker, or random walk noise. The noise amplitude was assumed to be 1 mm/yr^(-κ/4). Then, we added the deterministic part consisting of a linear trend of 20 mm/yr (representing the average horizontal velocity) and accelerations ranging from -0.6 to +0.6 mm/yr². For all these data we estimated the noise parameters with Maximum Likelihood Estimation (MLE) using the Hector software package without taking the non-linear term into account. In this way we set the benchmark against which to investigate how the noise properties and velocity uncertainty may be affected by any un-modelled, non-linear term. The velocities and their uncertainties versus the accelerations for
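A minimal version of the extended deterministic model, x(t) = x₀ + vt + ½at², fitted by ordinary least squares. The synthetic series below uses white noise only; the study itself also uses flicker and random-walk noise and estimates parameters by MLE with Hector:

```python
import numpy as np

rng = np.random.default_rng(3)

# Daily positions over 23 years: rate 20 mm/yr, acceleration 0.4 mm/yr^2,
# plus 1 mm white noise (illustrative values within the study's ranges).
t = np.arange(0, 23, 1 / 365.25)               # time in years
truth_v, truth_a = 20.0, 0.4
pos = truth_v * t + 0.5 * truth_a * t**2 + rng.standard_normal(t.size)

# Least-squares fit of offset + velocity + quadratic acceleration term.
A = np.column_stack([np.ones_like(t), t, 0.5 * t**2])
coef, *_ = np.linalg.lstsq(A, pos, rcond=None)
offset, vel, acc = coef
print(vel, acc)
```

Because t and t² are strongly correlated over a finite span, leaving the acceleration term out of the design matrix biases the estimated velocity, which is the effect the study quantifies.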
Applications and development of new algorithms for displacement analysis using InSAR time series
NASA Astrophysics Data System (ADS)
Osmanoglu, Batuhan
Time series analysis of Synthetic Aperture Radar Interferometry (InSAR) data has become an important scientific tool for monitoring and measuring the displacement of Earth's surface due to a wide range of phenomena, including earthquakes, volcanoes, landslides, changes in ground water levels, and wetlands. Time series analysis is a product of interferometric phase measurements, which become ambiguous when the observed motion is larger than half of the radar wavelength. Thus, phase observations must first be unwrapped in order to obtain physically meaningful results. Persistent Scatterer Interferometry (PSI), the Stanford Method for Persistent Scatterers (StaMPS), Small Baseline Subset (SBAS), and Small Temporal Baseline Subset (STBAS) algorithms resolve this ambiguity using a series of spatio-temporal unwrapping algorithms and filters. In this dissertation, I improve upon current phase unwrapping algorithms and apply the PSI method to study subsidence in Mexico City. PSI was used to obtain unwrapped deformation rates in Mexico City (Chapter 3), where ground water withdrawal in excess of natural recharge causes subsurface, clay-rich sediments to compact. This study is based on 23 satellite SAR scenes acquired between January 2004 and July 2006. Time series analysis of the data reveals a maximum line-of-sight subsidence rate of 300 mm/yr at a high enough resolution that individual subsidence rates for large buildings can be determined. Differential motion and related structural damage along an elevated metro rail were evident from the results. Comparison of PSI subsidence rates with data from permanent GPS stations indicates root mean square (RMS) agreement of 6.9 mm/yr, about the level expected based on joint data uncertainty. The Mexico City results suggest negligible recharge, implying continuing degradation and loss of the aquifer in the third largest metropolitan area in the world. Chapters 4 and 5 illustrate the link between time series analysis and three
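The phase ambiguity described above can be illustrated in one dimension with NumPy's `unwrap`. The real spatio-temporal unwrapping problem tackled by PSI/SBAS-style algorithms is far harder; this sketch only shows the modulo-2π wrapping itself on a synthetic displacement ramp:

```python
import numpy as np

# A steady displacement ramp observed as phase, wrapped into (-pi, pi].
t = np.linspace(0, 10, 200)
true_phase = 2.5 * t                            # radians, grows past +/- pi
wrapped = np.angle(np.exp(1j * true_phase))     # what the interferogram sees
recovered = np.unwrap(wrapped)                  # undo the 2*pi jumps
err = np.max(np.abs(recovered - true_phase))
print(err)
```

One-dimensional unwrapping succeeds here only because consecutive samples differ by less than π; noise, decorrelation holes, and 2-D geometry are what make InSAR unwrapping a research problem.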
NASA Astrophysics Data System (ADS)
Kaiser, O.; Horenko, I.
2012-04-01
Given an observed series of extreme events, we are interested in capturing the significant trend in the underlying dynamics. Since the character of such systems is strongly non-linear and non-stationary, the detection of significant characteristics and their attribution is a complex task. A standard statistical tool to describe the probability distribution of extreme events is the generalized extreme value (GEV) distribution. While the univariate stationary GEV distribution is well studied, with the data fitted to the model parameters using likelihood techniques and Bayesian methods (Coles, '01; Davison, Ramesh, '00), analysis of non-stationary extremes is based on a priori assumptions about the trend behavior (e.g., a linear combination of external factors/polynomials (Coles, '01)). Additionally, the analysis of multivariate, non-stationary extreme events remains a strong challenge, since analysis without strong a priori assumptions is limited to low-dimensional cases (Nychka, Cooley, '09). We introduce the FEM-GEV approach, which is based on the GEV distribution and advanced Finite Element time series analysis Methods (FEM) (Horenko, '10-11). The main idea of the FEM framework is to interpolate the non-stationary model parameters adaptively by a linear convex combination of K local stationary models and a switching process between them. To apply the FEM framework to a time series of extremes, we extend FEM by defining the model parameters with respect to the GEV distribution; as external factors we consider global atmospheric patterns. The optimal number of local models K and the best combination of external factors are estimated using the Akaike Information Criterion. The FEM-GEV approach allows us to study the non-stationary dynamics of the GEV parameters without a priori assumptions on the trend behavior and also captures the non-linear, non-stationary dependence on external factors. Since a series of extremes has by definition no connection to a real time scale, the results of FEM-GEV can be only
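The stationary building block of FEM-GEV, a maximum-likelihood GEV fit, can be sketched with SciPy. Note that SciPy's `genextreme` shape parameter c follows the convention c = -ξ relative to the usual GEV shape; all data below are synthetic:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(4)

# Block maxima drawn from a known GEV (shape c=0.1 in SciPy's convention).
c_true, loc_true, scale_true = 0.1, 30.0, 5.0
maxima = genextreme.rvs(c_true, loc=loc_true, scale=scale_true,
                        size=2000, random_state=rng)

# Stationary maximum-likelihood fit; FEM-GEV goes further by letting
# these parameters vary in time via K local stationary models.
c_hat, loc_hat, scale_hat = genextreme.fit(maxima)
print(c_hat, loc_hat, scale_hat)
```

The FEM extension essentially replaces this single (c, loc, scale) triple with a convex combination of K such triples plus a switching process, selected by the Akaike Information Criterion.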
NASA Astrophysics Data System (ADS)
Koeppen, W. C.; Wright, R.; Pilger, E.
2009-12-01
We developed and tested a new, automated algorithm, MODVOLC2, which analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes, fires, and gas flares. MODVOLC2 combines two previously developed algorithms, a simple point operation algorithm (MODVOLC) and a more complex time series analysis (Robust AVHRR Techniques, or RAT), to overcome the limitations of using each approach alone. MODVOLC2 has four main steps: (1) it uses the original MODVOLC algorithm to process the satellite data on a pixel-by-pixel basis and remove thermal outliers, (2) it uses the remaining data to calculate reference and variability images for each calendar month, (3) it compares the original satellite data and any newly acquired data to the reference images normalized by their variability, detecting pixels that fall outside the envelope of normal thermal behavior, and (4) it adds any pixels detected by MODVOLC to those detected in the time series analysis. Using test sites at Anatahan and Kilauea volcanoes, we show that MODVOLC2 was able to detect ~15% more thermal anomalies than MODVOLC alone, with very few, if any, known false detections. Using gas flares from the Cantarell oil field in the Gulf of Mexico, we show that MODVOLC2 provided results that were unattainable using a time series-only approach. Some thermal anomalies (e.g., Cantarell oil field flares) are so persistent that an additional, semi-automated 12-µm correction must be applied in order to correctly estimate both the number of anomalies and the total excess radiance being emitted by them. Although all available data should be included to make the best possible reference and variability images necessary for MODVOLC2, we estimate that at least 80 images per calendar month are required to generate relatively good statistics from which to run MODVOLC2, a condition now globally met by a decade of MODIS observations. We also found
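Steps (2) and (3) can be sketched per pixel as a reference image, a variability image, and a normalized threshold test. The images below are synthetic and the 4-sigma threshold is purely illustrative, not MODVOLC2's actual detection criterion:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical stack of one calendar month's brightness-temperature
# images: 120 scenes of 50x50 pixels around a 270 K background.
stack = 270.0 + 3.0 * rng.standard_normal((120, 50, 50))

# Step (2): per-pixel reference and variability images.
reference = stack.mean(axis=0)
variability = stack.std(axis=0)

# Step (3): normalize a new scene by the variability and flag pixels
# outside the envelope of normal thermal behavior.
new_scene = 270.0 + 3.0 * rng.standard_normal((50, 50))
new_scene[10, 10] += 40.0                      # implant a thermal anomaly

z = (new_scene - reference) / variability
anomalies = np.argwhere(z > 4.0)               # illustrative threshold
print(anomalies)
```

The implanted hot pixel sits roughly 13 standard deviations above its reference value, so it is flagged while the background pixels stay inside the envelope.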
Coenzyme Q(10): a novel therapeutic approach for Fibromyalgia? case series with 5 patients.
Cordero, Mario D; Alcocer-Gómez, Elísabet; de Miguel, Manuel; Cano-García, Francisco Javier; Luque, Carlos M; Fernández-Riejo, Patricia; Fernández, Ana María Moreno; Sánchez-Alcazar, José Antonio
2011-07-01
Coenzyme Q(10) (CoQ(10)) is an essential electron carrier in the mitochondrial respiratory chain and a strong antioxidant. Low CoQ(10) levels have been detected in patients with fibromyalgia (FM). The purpose of the present work was to assess the effect of CoQ(10) on the symptoms of five patients with FM. Patients were evaluated clinically with the Visual Analogue Scale of pain (VAS) and the Fibromyalgia Impact Questionnaire (FIQ). Patients with CoQ(10) deficiency showed a statistically significant reduction in symptoms after CoQ(10) treatment for 9 months (300 mg/day). Determination of deficiency and consequent supplementation in FM may result in clinical improvement. Further analysis involving more scientifically rigorous methodology will be required to confirm this observation.
Geospatial Analysis of Near-Surface Soil Moisture Time Series Data Over Indian Region
NASA Astrophysics Data System (ADS)
Berwal, P.; Murthy, C. S.; Raju, P. V.; Sesha Sai, M. V. R.
2016-06-01
The present study developed a time series database of surface soil moisture over India for the months of June, July, and August for the 20-year period 1991-2010, using data products generated under the Climate Change Initiative Programme of the European Space Agency. These three months represent the crop sowing period of the prime cropping season in the country, and soil moisture data during this period are highly useful for detecting drought conditions and assessing drought impact. The time series soil moisture data, at 0.25° spatial resolution, were analyzed to generate different indicators. Rainfall data at the same spatial resolution for the same period, generated by the India Meteorological Department, were also procured and analyzed. Geospatial analysis of soil moisture and rainfall derived indicators was carried out to study (1) the interannual variability of soil moisture and rainfall, (2) soil moisture deviations from normal during prominent drought years, (3) soil moisture and rainfall correlations, and (4) drought exposure based on soil moisture and rainfall variability. The study successfully demonstrated the potential of these soil moisture time series data sets for generating regional drought surveillance information products, drought hazard mapping, drought exposure analysis, and detection of drought-sensitive areas in the crop planting period.
Water quality management using statistical analysis and time-series prediction model
NASA Astrophysics Data System (ADS)
Parmar, Kulwinder Singh; Bhardwaj, Rashmi
2014-12-01
This paper deals with water quality management using statistical analysis and a time-series prediction model. The monthly variation of water quality standards has been used to compare the statistical mean, median, mode, standard deviation, kurtosis, skewness, and coefficient of variation at the Yamuna River. The model was validated using R-squared, root mean square error, mean absolute percentage error, maximum absolute percentage error, mean absolute error, maximum absolute error, normalized Bayesian information criterion, Ljung-Box analysis, predicted values, and confidence limits. Using an autoregressive integrated moving average (ARIMA) model, future water quality parameter values have been estimated. It is observed that the predictive model is useful at the 95% confidence limits and that the distribution is platykurtic for potential of hydrogen (pH), free ammonia, total Kjeldahl nitrogen, dissolved oxygen, and water temperature (WT), and leptokurtic for chemical oxygen demand and biochemical oxygen demand. Also, it is observed that the predicted series is close to the original series, which indicates a very good fit. All parameters except pH and WT cross the prescribed limits of the World Health Organization/United States Environmental Protection Agency, and thus the water is not fit for drinking, agriculture, or industrial use.
Time-series analysis of the transcriptome and proteome of Escherichia coli upon glucose repression.
Borirak, Orawan; Rolfe, Matthew D; de Koning, Leo J; Hoefsloot, Huub C J; Bekker, Martijn; Dekker, Henk L; Roseboom, Winfried; Green, Jeffrey; de Koster, Chris G; Hellingwerf, Klaas J
2015-10-01
Time-series transcript and protein profiles were measured upon initiation of carbon catabolite repression (CCR) in Escherichia coli, in order to investigate the extent of post-transcriptional control in this prototypical response. A glucose-limited chemostat culture was used as the CCR-free reference condition. Stopping the pump and simultaneously adding a pulse of glucose that saturated the cells for at least 1 h was used to initiate the glucose response. Samples were collected and subjected to quantitative time-series analysis of both the transcriptome (using microarray analysis) and the proteome (through a combination of 15N metabolic labeling and mass spectrometry). Changes in the transcriptome and the corresponding proteome were analyzed using statistical procedures designed specifically for time-series data. By comparison of the two data sets, a total of 96 genes were identified that are post-transcriptionally regulated. This gene list provides candidates for future in-depth investigation of the molecular mechanisms involved in post-transcriptional regulation during carbon catabolite repression in E. coli, such as the involvement of small RNAs.
Langevin equations from time series.
Racca, E; Porporato, A
2005-02-01
We discuss the link between the approach to obtain the drift and diffusion of one-dimensional Langevin equations from time series, and Pope and Ching's relationship for stationary signals. The two approaches are based on different interpretations of conditional averages of the time derivatives of the time series at given levels. The analysis provides a useful indication for the correct application of Pope and Ching's relationship to obtain stochastic differential equations from time series and shows its validity, in a generalized sense, for nondifferentiable processes originating from Langevin equations.
Rivera, Diego; Lillo, Mario; Granda, Stalin
2014-12-01
The concept of time stability has been widely used in the design and assessment of soil moisture monitoring networks, as well as in hydrological studies, because it is a technique that allows the identification of particular locations that represent mean values of soil moisture in the field. In this work, we assess how time stability calculations change as new information is added, and how they are affected over shorter periods, subsampled from the original time series, containing different amounts of precipitation. In doing so, we defined two experiments to explore time stability behavior. The first experiment sequentially adds new data to the previous time series to investigate the long-term influence of new data on the results. The second experiment applies a windowing approach, taking sequential subsamples from the entire time series to investigate the influence of short-term changes associated with the precipitation in each window. Our results from an operating network (seven monitoring points, each equipped with four sensors, in a 2-ha blueberry field) show that as information is added to the time series, the location of the most stable point (MSP) changes, and that for the moving 21-day windows most of the variability in soil water content changes is associated with both the amount and the intensity of rainfall. The changes of the MSP over each window depend on the amount of water entering the soil and the previous state of the soil water content. For our case study, the upper strata are proxies for hourly to daily changes in soil water content, while the deeper strata are proxies for medium-range stored water. Thus, different locations and depths are representative of processes at different time scales. This situation must be taken into account when water management depends on soil water content values from fixed locations.
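A minimal sketch of a time-stability calculation: compute relative differences from the spatial mean and rank locations by mean relative difference and its spread. The 7-point network below is synthetic (mirroring the network size described above), and the combined |MRD| + spread ranking is one common choice among several:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical soil-moisture matrix: 7 monitoring points x 90 days.
# Point 3 is constructed to track the field mean closely.
field_mean = 0.25 + 0.05 * np.sin(np.linspace(0, 6, 90))
theta = field_mean + rng.normal(0, 0.02, size=(7, 90))
theta[3] = field_mean + rng.normal(0, 0.002, size=90)

# Relative differences of each point from the spatial mean, per day.
spatial_mean = theta.mean(axis=0)
delta = (theta - spatial_mean) / spatial_mean
mrd = delta.mean(axis=1)                # mean relative difference per point
sdrd = delta.std(axis=1)                # spread of the relative difference

# The most stable point (MSP) has MRD near zero with a small spread.
msp = int(np.argmin(np.abs(mrd) + sdrd))
print(msp)
```

Re-running this calculation on growing or windowed subsets of the columns is exactly the pair of experiments the paper describes: the identity of the MSP can shift as new wet or dry periods enter the record.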
The Re-Analysis of Ozone Profile Data from a 41-Year Series of SBUV Instruments
NASA Technical Reports Server (NTRS)
Kramarova, Natalya; Frith, Stacey; Bhartia, Pawan K.; McPeters, Richard; Labow, Gordon; Taylor, Steven; Fisher, Bradford
2012-01-01
In this study we present the validation of ozone profiles from a number of Solar Backscatter Ultraviolet (SBUV) and SBUV/2 instruments that were recently reprocessed using an updated (Version 8.6) algorithm. The SBUV dataset provides the longest available record of global ozone profiles, spanning a 41-year period from 1970 to 2011 (except for a 5-year gap in the 1970s), and includes ozone profile records obtained from the Nimbus-4 BUV and Nimbus-7 SBUV instruments, and a series of SBUV(/2) instruments launched on NOAA operational satellites (NOAA 09, 11, 14, 16, 17, 18, 19). Although modifications in instrument design were made in the evolution from the BUV instrument to the modern SBUV(/2) model, the basic principles of the measurement technique and retrieval algorithm remain the same. The long-term SBUV data record allows us to create a consistent, calibrated dataset of ozone profiles that can be used for climate studies and trend analyses. In particular, we focus on estimating the various sources of error in the SBUV profile ozone retrievals using independent observations and analysis of the algorithm itself. For the first time we include in the metadata a quantitative estimate of the smoothing error, defined as the error due to profile variability that the SBUV observing system cannot inherently measure. The magnitude of the smoothing error varies with altitude, latitude, season and solar zenith angle. Between 10 and 1 hPa the smoothing errors for the SBUV monthly zonal mean retrievals are of the order of 1%, but start to increase above and below this layer. The largest smoothing errors, as large as 15-20%, were detected in the troposphere. The SBUV averaging kernels, provided with the ozone profiles in version 8.6, help to eliminate the smoothing effect when comparing the SBUV profiles with high vertical resolution measurements, and make it convenient to use the SBUV ozone profiles for data assimilation and model validation purposes. The smoothing error can
ERIC Educational Resources Information Center
Dunn, Pierre
This review of the literature briefly outlines the history of management by objectives (MBO), describing the Madison, Wisconsin, school district's application of this systems approach to school management, as well as alternatives to the Madison MBO plan developed by other districts across the country. Although there are many variations of MBO and…
ERIC Educational Resources Information Center
Coursen, David
Estimates of the precise costs of school vandalism vary widely, but the seriousness of the problem is beyond dispute. It is possible to get a general idea of the nature and motivation of most vandals and, in doing so, begin to understand the problem and devise solutions for it. There are two basic approaches to vandalism prevention. Currently, as…
Extensive mapping of coastal change in Alaska by Landsat time-series analysis, 1972-2013 (Invited)
NASA Astrophysics Data System (ADS)
Macander, M. J.; Swingley, C. S.; Reynolds, J.
2013-12-01
The landscape-scale effects of coastal storms on Alaska's Bering Sea and Gulf of Alaska coasts include coastal erosion, migration of spits and barrier islands, breaching of coastal lakes and lagoons, and inundation and salt-kill of vegetation. Large changes in coastal storm frequency and intensity are expected due to climate change and reduced sea-ice extent. Storms have a wide range of impacts on carbon fluxes and on fish and wildlife resources, infrastructure siting and operation, and emergency response planning. In areas experiencing moderate to large effects, changes can be mapped by analyzing trends in time series of Landsat imagery from Landsat 1 through Landsat 8. ABR, Inc.--Environmental Research & Services and the Western Alaska Landscape Conservation Cooperative are performing a time-series trend analysis for over 22,000 kilometers of coastline along the Bering Sea and Gulf of Alaska. The archive of Landsat imagery covers the period 1972-present. For a pilot study area in Kotzebue Sound, we conducted a regression analysis of changes in near-infrared reflectance to identify areas with significant changes in coastal features, 1972-2011. Suitable ice- and cloud-free Landsat imagery was obtained for 28 of the 40 years in the period. The approach captured several coastal changes over the 40-year study period, including coastal erosion exceeding the 60-m pixel resolution of the Multispectral Scanner (MSS) data and migrations of coastal spits and estuarine channels. In addition, several lake drainage events were identified, mostly inland from the coastal zone. Analysis of shorter, decadal time periods produced noisier results that were generally consistent with the long-term trend analysis. Unusual conditions at the start or end of the time series can strongly influence decadal results. Based on these results, the study is being scaled up to map coastal change for over 22,000 kilometers of coastline along the Bering Sea and Gulf of Alaska coast. The
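A per-pixel trend analysis of the kind described can be sketched as follows (synthetic reflectance values and an assumed significance threshold, not the study's data or criteria): regress near-infrared reflectance on acquisition year for each pixel and flag those with a significant, sizeable trend.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

years = np.arange(1972, 2012)                 # one usable scene per year
# Stable background reflectance for a small 4x4 pixel window
nir = 0.30 + 0.02 * rng.standard_normal((len(years), 4, 4))

# Impose one eroding pixel: land (bright NIR) converting to water (dark NIR)
nir[:, 1, 2] = np.linspace(0.35, 0.05, len(years)) \
               + 0.02 * rng.standard_normal(len(years))

changed = np.zeros((4, 4), dtype=bool)
for i in range(4):
    for j in range(4):
        res = stats.linregress(years, nir[:, i, j])
        # Require both statistical significance and a meaningful rate of change
        changed[i, j] = res.pvalue < 0.01 and abs(res.slope) > 0.002
```

Only the eroding pixel survives both tests; over a full coastline the same regression is simply applied to every pixel of the Landsat stack.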
Paces, James B.; Nichols, Paul J.; Neymark, Leonid A.; Rajaram, Harihar
2013-01-01
Groundwater flow through fractured felsic tuffs and lavas at the Nevada National Security Site represents the most likely mechanism for transport of radionuclides away from underground nuclear tests at Pahute Mesa. To help evaluate fracture flow and matrix–water exchange, we have determined U-series isotopic compositions on more than 40 drill core samples from 5 boreholes that represent discrete fracture surfaces, breccia zones, and interiors of unfractured core. The U-series approach relies on the disruption of radioactive secular equilibrium between isotopes in the uranium-series decay chain due to preferential mobilization of 234U relative to 238U, and of U relative to Th. Samples from discrete fractures were obtained by milling fracture surfaces containing thin secondary mineral coatings of clays, silica, Fe–Mn oxyhydroxides, and zeolite. Intact core interiors and breccia fragments were sampled in bulk. In addition, profiles of rock matrix extending 15 to 44 mm away from several fractures that show evidence of recent flow were analyzed to investigate the extent of fracture/matrix water exchange. Samples of rock matrix have 234U/238U and 230Th/238U activity ratios (AR) closest to radioactive secular equilibrium, indicating that only small amounts of groundwater penetrated the unfractured matrix. Greater U mobility was observed in welded-tuff matrix with elevated porosity and in zeolitized bedded tuff. Samples of brecciated core were also in secular equilibrium, implying a lack of long-range hydraulic connectivity in these cases. Samples of discrete fracture surfaces were typically, but not always, in radioactive disequilibrium. Many fractures had isotopic compositions plotting near the 230Th-234U 1:1 line, indicating a steady-state balance between U input and removal, along with radioactive decay. Numerical simulations of U-series isotope evolution indicate that 0.5 to 1 million years are required to reach steady-state compositions. Once attained, disequilibrium 234U/238U
The flyby anomaly: a multivariate analysis approach
NASA Astrophysics Data System (ADS)
Acedo, L.
2017-02-01
The flyby anomaly is the unexpected variation of the asymptotic post-encounter velocity of a spacecraft with respect to its pre-encounter velocity as it performs a slingshot manoeuvre. This effect has been detected in at least six flybys of the Earth, but it has not appeared in other recent flybys. In order to find a pattern in these apparently contradictory data, several phenomenological formulas have been proposed, but all have failed to predict a new result in agreement with the observations. In this paper we use a multivariate dimensional analysis approach to propose a fitting of the data in terms of the local parameters at perigee, as would occur if this anomaly arises from an unknown fifth force with latitude dependence. Under this assumption, we estimate the range of this force to be around 300 km.
Random matrix approach to categorical data analysis
NASA Astrophysics Data System (ADS)
Patil, Aashay; Santhanam, M. S.
2015-09-01
Correlation and similarity measures are widely used in all areas of the sciences and social sciences. Often the variables are not numbers but qualitative descriptors called categorical data. We define and study the similarity matrix as a measure of similarity for categorical data. This is of interest due to a deluge of categorical data, such as movie ratings, top-10 rankings, and data from social media, in the public domain that require analysis. We show that the statistical properties of the spectra of similarity matrices, constructed from categorical data, follow random matrix predictions, with the dominant eigenvalue being an exception. We demonstrate this approach by applying it to data for Indian general elections and sea level pressures in the North Atlantic ocean.
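One plausible construction of such a similarity matrix (my reading, since the abstract does not give the formula): score a pair of categorical variables by the fraction of observations on which their categories agree, then examine the eigenvalue spectrum.

```python
import numpy as np

rng = np.random.default_rng(3)

n_obs, n_vars, n_cats = 500, 40, 5
# Uncorrelated categorical data: each entry is one of n_cats category labels
data = rng.integers(0, n_cats, size=(n_obs, n_vars))

# S[a, b] = fraction of observations where variables a and b share a category
S = (data[:, :, None] == data[:, None, :]).mean(axis=0)

eigvals = np.linalg.eigvalsh(S)   # ascending order
```

Even for uncorrelated data the largest eigenvalue (roughly 1 + (n_vars - 1)/n_cats here, the analogue of the "market mode" in correlation-matrix studies) separates sharply from the bulk, illustrating why the dominant eigenvalue is the exception to the random matrix predictions.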
Variational approach for nonpolar solvation analysis
Chen, Zhan; Zhao, Shan; Chun, Jaehun; Thomas, Dennis G.; Baker, Nathan A.; Bates, Peter W.; Wei, G. W.
2012-01-01
Solvation analysis is one of the most important tasks in chemical and biological modeling. Implicit solvent models are some of the most popular approaches. However, commonly used implicit solvent models rely on unphysical definitions of solvent-solute boundaries. Based on differential geometry, the present work defines the solvent-solute boundary via the variation of the nonpolar solvation free energy. The solvation free energy functional of the system is constructed based on a continuum description of the solvent and a discrete description of the solute, which are dynamically coupled through the solvent-solute boundaries via van der Waals interactions. The first variation of the energy functional gives rise to the governing Laplace-Beltrami equation. The present model predictions of the nonpolar solvation energies are in excellent agreement with experimental data, which supports the validity of the proposed nonpolar solvation model. PMID:22938212
Analysis and evaluation of EHR approaches.
Blobel, Bernd G M E; Pharow, Peter
2008-01-01
EHR systems are core applications in any eHealth/pHealth environment and represent basic services for health telematics platforms. Many projects are performed at the level of Standards Developing Organizations or national programs, respectively, for defining EHR architectures as well as related design, implementation, and deployment processes. Claiming to meet the challenge for semantic interoperability and offering the right pathway, the resulting documents and specifications are sometimes controversial and even inconsistent. Based on a long tradition in the EHR domain, on the collective experience of academic groups such as the EFMI EHR Working Group, and on an active involvement at CEN, ISO, HL7 and several national projects around the globe, an analysis and evaluation study has been performed using the Generic Component Model reference architecture. Strengths and weaknesses of the different approaches as well as migration pathways for re-using and harmonizing the available materials are offered.
NASA Astrophysics Data System (ADS)
Dutton, Steven James
Particulate air pollution has been associated with significant health effects ranging from the worsening of asthma to increased rates of respiratory and cardiopulmonary mortality. These results have prompted the US-EPA to include particulate matter (PM) as one of the six criteria air pollutants regulated under the Clean Air Act. The diverse chemical make-up and physical characteristics of PM make it a challenging pollutant to characterize and regulate. Particulate matter less than 2.5 microns in diameter (PM2.5) can travel deep into the lungs and therefore has been linked with some of the more significant health effects. The toxicity of any given particle is likely dependent on its chemical composition. The goal of this project has been to chemically characterize a long time series of PM2.5 measurements collected at a receptor site in Denver at a level of detail not previously achieved for a dataset of this size. This has involved characterization of inorganic ions using ion chromatography, total elemental and organic carbon using thermal optical transmission, and organic molecular marker species using gas chromatography-mass spectrometry. Methods have been developed to allow daily measurement and speciation of these compounds over a six-year period. Measurement methods, novel approaches to uncertainty estimation, time series analysis, spectral and pattern analyses, and source apportionment using two multivariate factor analysis models are presented. The analysis results reveal several natural and anthropogenic sources contributing to PM2.5 in Denver. The most distinguishable sources are motor vehicles and biomass combustion. This information will be used in a health effects analysis as part of a larger study called the Denver Aerosol Sources and Health (DASH) study. Such results will inform regulatory decisions and may help create a better understanding of the underlying mechanisms for the observed adverse health effects associated with PM2.5.
Sinking Chao Phraya delta plain, Thailand, derived from SAR interferometry time series analysis
NASA Astrophysics Data System (ADS)
Tanaka, A.; Mio, A.; Saito, Y.
2013-12-01
The Bangkok Metropolitan region and its surrounding provinces are located in the low-lying delta plain of the Chao Phraya River. Extensive groundwater use from the late 1950s has caused the decline of groundwater levels in the aquifers and Holocene clay compaction beneath the Bangkok region, resulting in significant ground subsidence. This deformation has been monitored by leveling surveys since 1978 and by differential InSAR (Interferometric Synthetic Aperture Radar) analysis, which show that the Bangkok Metropolitan region has been subsiding at a rate of about 20 mm/year in recent years, owing to legal limits on groundwater pumping, although rates as high as 120 mm/year were recorded in 1981. The subsidence rate in the Bangkok area has decreased significantly since the late 1980s; however, the affected area has spread to the surrounding regions. Maximum subsidence rates of up to 30 mm/year occurred in the outlying southeast and southwest coastal zones in 2002. In this study, we apply a SAR interferometry time series analysis to monitor ground deformation in the lower Chao Phraya delta plain (Lower Central Plain), Thailand, using ALOS (Advanced Land Observing Satellite) PALSAR (Phased Array type L-band SAR) data acquired between July 2007 and September 2010. We derive a single-reference time series interferogram from the stacking of unwrapped phases, under the assumption that those phases are smoothly and continuously connected, and apply a smoothness-constrained inversion algorithm that optimizes the displacement from the phase unwrapping of multitemporal differential SAR interferograms. The SAR interferometry time series analysis succeeds in monitoring the incremental line-of-sight (LOS) change between SAR scene acquisitions. LOS displacements are converted to vertical displacements, based on the assumption that ground displacement in this area occurs only in the vertical direction. This reveals an overall pattern of subsidence
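The smoothness-constrained inversion mentioned above can be sketched in simplified form (a synthetic single-pixel example with assumed noise levels, not the authors' implementation): each interferogram constrains the displacement difference between two acquisitions, and a second-difference roughness penalty regularizes the recovered time series.

```python
import numpy as np

rng = np.random.default_rng(4)

t = np.linspace(0.0, 3.2, 12)            # acquisition epochs (years)
truth = -20.0 * t                        # steady LOS subsidence, mm (20 mm/yr)

# Interferograms: each measures displacement between two acquisitions
pairs = [(i, j) for i in range(len(t)) for j in range(i + 1, min(i + 4, len(t)))]
G = np.zeros((len(pairs), len(t)))
for k, (i, j) in enumerate(pairs):
    G[k, i], G[k, j] = -1.0, 1.0
d = G @ truth + 2.0 * rng.standard_normal(len(pairs))   # ~2 mm phase noise

# Second-difference roughness operator implements the smoothness constraint:
# minimize ||G m - d||^2 + lam^2 ||L m||^2
L = np.diff(np.eye(len(t)), n=2, axis=0)
lam = 1.0
A = np.vstack([G, lam * L])
b = np.concatenate([d, np.zeros(L.shape[0])])
m, *_ = np.linalg.lstsq(A, b, rcond=None)
m -= m[0]                                # displacement relative to first epoch

rate = np.polyfit(t, m, 1)[0]            # recovered subsidence rate, mm/yr
```

The recovered rate is close to the imposed -20 mm/yr; the roughness penalty is what suppresses oscillations where acquisitions are sparse.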
Multifractal analysis of geophysical time series in the urban lake of Créteil (France).
NASA Astrophysics Data System (ADS)
Mezemate, Yacine; Tchiguirinskaia, Ioulia; Bonhomme, Celine; Schertzer, Daniel; Lemaire, Bruno Jacques; Vinçon leite, Brigitte; Lovejoy, Shaun
2013-04-01
Urban water bodies contribute to the environmental quality of cities. They regulate heat, contribute to the beauty of the landscape and provide space for leisure activities (aquatic sports, swimming). As they are often artificial, they are only a few metres deep, which confers specific properties on them. Indeed, they are particularly sensitive to global environmental changes, including climate change, eutrophication and contamination by micro-pollutants due to the urbanization of the watershed. Monitoring their quality has become a major challenge for urban areas, and hence the need arises for a tool for predicting short-term proliferation of potentially toxic phytoplankton. In lakes, the behavior of biological and physical (temperature) fields is mainly driven by the turbulence regime in the water. Turbulence is highly nonlinear, nonstationary and intermittent, which is why statistical tools are needed to characterize the evolution of the fields. Knowledge of the probability distribution of all the statistical moments of a given field is necessary to fully characterize it. This possibility is offered by multifractal analysis, based on the assumption of scale invariance. To investigate the effect of the space-time variability of temperature, chlorophyll and dissolved oxygen on cyanobacteria proliferation in the urban lake of Créteil (France), a spectral analysis is first performed on each time series (or on subsamples) to obtain an overall estimate of their scaling behavior. Then a multifractal analysis (Trace Moment, Double Trace Moment) estimates the statistical moments of different orders. This analysis is adapted to the specific properties of the studied time series, i.e., the presence of large-scale gradients. The nonlinear behavior of the scaling functions K(q) confirms that the investigated aquatic time series are indeed multifractal and highly intermittent. The knowledge of the universal multifractal parameters is the key to calculate the different
Construction and Analysis of an Allelic Series of Ccn1 Knockin Mice.
Monzon, Ricardo I; Kim, Ki-Hyun; Lau, Lester F
2017-01-01
The embryonic lethality of mice with conventional global knockout of Ccn1 (Cyr61) precludes analysis of Ccn1 functions in late embryonic development or in adulthood. To circumvent this limitation, we have generated conditional knockout mice that allow cell type-specific deletion of Ccn1, and constructed an allelic series of Ccn1 knockin mice that express CCN1 defective for binding specific integrins in lieu of the wild type protein. Here we describe the construction of these mice and discuss how analysis of these animals can provide unique insights into Ccn1 functions mediated through specific integrin receptors. It is anticipated that future analysis of mice carrying specific mutations in genes of the Ccn family will be greatly facilitated by application of the CRISPR/Cas9 gene editing methodology.
Karakaya, N; Evrendilek, F
2010-06-01
Big Melen stream is one of the major water resources, providing 0.268 [corrected] km³ year⁻¹ of drinking and municipal water for Istanbul. Monthly time series data between 1991 and 2004 for 25 chemical, biological, and physical water properties of Big Melen stream were separated into linear trend, seasonality, and error components using additive decomposition models. A water quality index (WQI) derived from 17 water quality variables was used to compare Aksu upstream and Big Melen downstream water quality. The 26 additive decomposition models of the water quality time series data, including WQI, had R² values ranging from 88% for log(water temperature) (P ≤ 0.001) to 3% for log(total dissolved solids) (P ≤ 0.026). Linear trend models revealed that total hardness, calcium concentration, and log(nitrite concentration) had the highest rates of increase over time. Tukey's multiple comparison pointed to significant decreases in 17 water quality variables, including WQI, of Big Melen downstream relative to those of Aksu upstream (P ≤ 0.001). Monitoring changes in water quality on a watershed basis through WQI and decomposition analysis of time series data paves the way for an adaptive management process of water resources that can be tailored in response to the effectiveness and dynamics of management practices.
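An additive decomposition of the kind used above can be sketched on invented monthly data: the series is split into a linear trend, a monthly seasonal component, and an error term, following the standard additive model y = trend + seasonal + error.

```python
import numpy as np

rng = np.random.default_rng(5)

months = np.arange(14 * 12)                          # 1991-2004, monthly
trend_true = 8.0 + 0.01 * months                     # slow upward trend
season_true = 2.0 * np.sin(2 * np.pi * months / 12)  # annual cycle
y = trend_true + season_true + 0.3 * rng.standard_normal(months.size)

# 1. Linear trend by least squares
slope, intercept = np.polyfit(months, y, 1)
trend = intercept + slope * months

# 2. Seasonal component: mean of detrended values for each calendar month
detrended = y - trend
season = np.array([detrended[months % 12 == m].mean() for m in range(12)])
season -= season.mean()                              # identifiability constraint
seasonal = season[months % 12]

# 3. Remainder (error term) and variance explained
error = y - trend - seasonal
r2 = 1.0 - error.var() / y.var()
```

On this synthetic series the fitted slope recovers the imposed trend and the model explains most of the variance, analogous to the high-R² variables in the abstract; a mostly unexplained variable would leave most of the variance in `error`.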
Forecasting malaria cases using climatic factors in delhi, India: a time series analysis.
Kumar, Varun; Mangal, Abha; Panesar, Sanjeet; Yadav, Geeta; Talwar, Richa; Raut, Deepak; Singh, Saudan
2014-01-01
Background. Malaria still remains a public health problem in developing countries, and changing environmental and climatic factors pose the biggest challenge in fighting the scourge of malaria. Therefore, this study was designed to forecast malaria cases using climatic factors as predictors in Delhi, India. Methods. The total number of monthly malaria slide positives occurring from January 2006 to December 2013 was taken from the register maintained at the malaria clinic of the Rural Health Training Centre (RHTC), Najafgarh, Delhi. Climatic data of monthly mean rainfall, relative humidity, and mean maximum temperature were taken from the Regional Meteorological Centre, Delhi. The Expert Modeler of SPSS ver. 21 was used for analyzing the time series data. Results. The seasonal autoregressive integrated moving average model, ARIMA(0,1,1)(0,1,0)12, was the best-fit model, and it could explain 72.5% of the variability in the time series data. Rainfall (P value = 0.004) and relative humidity (P value = 0.001) were found to be significant predictors of malaria transmission in the study area. The seasonal adjusted factor (SAF) for malaria cases shows a peak during the months of August and September. Conclusion. ARIMA time series modeling is a simple and reliable tool for producing forecasts of malaria in Delhi, India.
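A rough sketch of the model structure named above, on invented monthly counts (an illustration of the mechanics, not a refit of the study's data): seasonal differencing at lag 12 implements the (0,1,0)12 part, and the remaining ARIMA(0,1,1) component is forecast-equivalent to simple exponential smoothing, with the smoothing constant determined by the MA coefficient.

```python
import numpy as np

rng = np.random.default_rng(6)

months = np.arange(96)                               # 2006-2013, monthly
# Synthetic case counts with an August-September peak (months 7-8)
season = 40.0 * np.exp(-0.5 * ((months % 12 - 7.5) / 1.5) ** 2)
cases = np.rint(20.0 + season + 3.0 * rng.standard_normal(96)).astype(int)

# Seasonal differencing at lag 12 removes the annual malaria cycle
z = cases[12:] - cases[:-12]

# ARIMA(0,1,1) on the differenced series: forecasts follow simple
# exponential smoothing (alpha chosen here for illustration)
alpha, level = 0.2, float(z[0])
for v in z[1:]:
    level = alpha * v + (1.0 - alpha) * level

# One-step forecast: last year's value for this month, plus the smoothed
# level of the seasonally differenced series
forecast = cases[-12] + level
```

For this stable synthetic series the forecast lands near the underlying off-season level (~20 cases), since the seasonal difference carries last year's cycle forward unchanged.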
Li, Shuying; Zhuang, Jun; Shen, Shifei
2016-08-23
In recent years, terrorist attacks of various types have occurred, causing catastrophes worldwide. According to the Global Terrorism Database (GTD), among all attack tactics, bombing attacks happened most frequently, followed by armed assaults. In this article, a model for analyzing and forecasting the conditional probability of bombing attacks (CPBAs) based on time-series methods is developed. In addition, intervention analysis is used to analyze a sudden increase in the time-series process. The results show that the CPBA increased dramatically at the end of 2011; during that time, the CPBA increased by 16.0% over a two-month period to reach its peak value, and it remains 9.0% greater than the predicted level after the temporary effect gradually decays. By contrast, no significant fluctuation can be found in the conditional probability process of armed assault. It can be inferred that social unrest, such as America's troop withdrawal from Afghanistan and Iraq, could have led to the increase of the CPBA in Afghanistan, Iraq, and Pakistan. The integrated time-series and intervention model is used to forecast the monthly CPBA in 2014 and through 2064. The average relative error compared with the real data in 2014 is 3.5%. The model is also applied to the total number of attacks recorded by the GTD between 2004 and 2014.
Analysis of the mass balance time series of glaciers in the Italian Alps
NASA Astrophysics Data System (ADS)
Carturan, Luca; Baroni, Carlo; Brunetti, Michele; Carton, Alberto; Dalla Fontana, Giancarlo; Salvatore, Maria Cristina; Zanoner, Thomas; Zuecco, Giulia
2016-03-01
This work presents an analysis of the mass balance series of nine Italian glaciers, which were selected based on the length, continuity and reliability of observations. All glaciers experienced mass loss in the observation period, which is variable for the different glaciers and ranges between 10 and 47 years. The longest series display increasing mass loss rates, which were mainly due to increased ablation during longer and warmer ablation seasons. The mean annual mass balance (Ba) in the decade from 2004 to 2013 ranged from -1788 to -763 mm w.e. yr-1. Low-altitude glaciers with low range of elevation are more out of balance than the higher, larger and steeper glaciers, which maintain residual accumulation areas in their upper reaches. The response of glaciers is mainly controlled by the combination of October-May precipitations and June-September temperatures, but rapid geometric adjustments and atmospheric changes lead to modifications in their response to climatic variations. In particular, a decreasing correlation of Ba with the June-September temperatures and an increasing correlation with October-May precipitations are observed for some glaciers. In addition, the October-May temperatures tend to become significantly correlated with Ba, possibly indicating a decrease in the fraction of solid precipitation, and/or increased ablation, during the accumulation season. Because most of the monitored glaciers have no more accumulation area, their observations series are at risk due to their impending extinction, thus requiring a replacement soon.
Analysis of the mass balance time series of glaciers in the Italian Alps
NASA Astrophysics Data System (ADS)
Carturan, L.; Baroni, C.; Brunetti, M.; Carton, A.; Dalla Fontana, G.; Salvatore, M. C.; Zanoner, T.; Zuecco, G.
2015-10-01
This work presents an analysis of the mass balance series of nine Italian glaciers, which were selected based on the length, continuity and reliability of observations. All glaciers experienced mass loss in the observation period, which is variable for the different glaciers and ranges between 10 and 47 years. The longest series display increasing mass loss rates, which were mainly due to increased ablation during longer and warmer ablation seasons. The mean annual mass balance (Ba) in the decade from 2004 to 2013 ranged from -1788 to -763 mm w.e. yr-1. Low-altitude glaciers with low elevation ranges are more out of balance than the higher, larger and steeper glaciers, which maintain residual accumulation areas in their upper reaches. The response of glaciers is mainly controlled by the combination of October-May precipitation and June-September temperature, but rapid geometric adjustments and atmospheric changes lead to modifications in their response to climatic variations. In particular, a decreasing correlation of Ba with the June-September temperature and an increasing correlation with October-May precipitation are observed for some glaciers. In addition, the October-May temperature tends to become significantly correlated with Ba, possibly indicating a decrease in the fraction of solid precipitation, and/or increased ablation, during the accumulation season. Because most of the monitored glaciers have no more accumulation area, their observation series are at risk due to their impending extinction, thus requiring a replacement soon.
Mkaouer, Bessem; Jemni, Monèm; Amara, Samiha; Chaabène, Helmi; Tabka, Zouhair
2013-01-01
Back swing connections during gymnastics acrobatic series considerably influence technical performance and difficulty, particularly in the back somersault. The aim of this study was to compare the take-off kinetic and kinematic variables between two acrobatic series leading to the backward stretched somersault (also called salto): round-off, flic-flac to stretched salto versus round-off, tempo-salto to stretched salto. Five high-level male gymnasts (age 23.17 ± 1.61 yrs; body height 1.65 ± 0.05 m; body mass 56.80 ± 7.66 kg) took part in this investigation. A force plate synchronized with a two-dimensional movement analysis system was used to collect kinetic and kinematic data. Statistical analysis via the non-parametric Wilcoxon rank-sum test showed significant differences between the take-off variables. The backswing connections differed in take-off angle, linear momentum, vertical velocity, and horizontal and vertical displacements. In conclusion, considering that a higher elevation of the centre of mass in the flight phase allows better performance and lowers the risk of falls, particularly when combined with a large angular momentum, this study demonstrated that the optimal connection series was round-off, flic-flac to stretched salto, which enabled the greatest height in the somersault. Analysis of the results suggests that both connections facilitate the performance of single and double (or triple) backward somersaults with or without rotations around the longitudinal axis. Gymnasts could perform the latter while gaining height if they choose the round-off, flic-flac technique, or while gaining some backward displacement if they choose the round-off, salto tempo.
Approaches to Cycle Analysis and Performance Metrics
NASA Technical Reports Server (NTRS)
Parson, Daniel E.
2003-01-01
The following notes were prepared as part of an American Institute of Aeronautics and Astronautics (AIAA) sponsored short course entitled Air Breathing Pulse Detonation Engine (PDE) Technology. The course was presented in January of 2003, and again in July of 2004, at two different AIAA meetings. It was taught by seven instructors, each of whom provided information on particular areas of PDE research. These notes cover two areas. The first is titled Approaches to Cycle Analysis and Performance Metrics. Here, the various methods of cycle analysis are introduced. These range from algebraic thermodynamic equations to single- and multi-dimensional Computational Fluid Dynamics (CFD) solutions. Also discussed are the various means by which performance is measured, and how these are applied in a device which is fundamentally unsteady. The second topic covered is titled PDE Hybrid Applications. Here the concept of coupling a PDE to a conventional turbomachinery-based engine is explored. Motivation for such a configuration is provided in the form of potential thermodynamic benefits. This is accompanied by a discussion of challenges to the technology.
ZWD time series analysis derived from NRT data processing. A regional study of PW in Greece.
NASA Astrophysics Data System (ADS)
Pikridas, Christos; Balidakis, Kyriakos; Katsougiannopoulos, Symeon
2015-04-01
ZWD (Zenith Wet/non-hydrostatic Delay) estimates have been routinely derived in Near Real Time by the newly established Analysis Center in the Department of Geodesy and Surveying of Aristotle University of Thessaloniki (DGS/AUT-AC), in the framework of E-GVAP (EUMETNET GNSS water vapour project), since October 2014. This process takes place on an hourly basis and yields, among other things, station coordinates and tropospheric parameter estimates for a network of 90+ permanent GNSS (Global Navigation Satellite System) stations distributed across the wider Hellenic region. In this study, the temporal and spatial variability of the ZWD estimates was examined, as well as their relation to coordinate series extracted from both float and fixed solutions of the initial phase ambiguities. For this investigation, Bernese GNSS Software v5.2 was used to process the 6-month dataset from the aforementioned network. For the time series analysis we employed techniques such as the generalized Lomb-Scargle periodogram and Burg's maximum entropy method, owing to the shortcomings of applying the Discrete Fourier Transform to the test dataset. Through the analysis, interesting results for further geophysical interpretation were drawn. In addition, the spatial and temporal distributions of Precipitable Water vapour (PW) obtained from both the ZWD estimates and ERA-Interim reanalysis grids were investigated.
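The preference for the Lomb-Scargle periodogram is natural for gappy, unevenly sampled GNSS series, where a plain DFT would require interpolation. A minimal sketch on a synthetic ZWD record with invented values recovers an imposed diurnal line:

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(7)

# Hourly epochs over ~6 months, with 30% of solutions missing at random
t = np.arange(0.0, 180.0, 1.0 / 24.0)
t = t[rng.random(t.size) > 0.3]

# Synthetic ZWD (cm): mean level, a diurnal oscillation, and noise
zwd = 15.0 + 0.8 * np.sin(2.0 * np.pi * t) + 0.3 * rng.standard_normal(t.size)

# Lomb-Scargle handles the irregular epochs directly;
# scipy expects angular frequencies, so convert from cycles/day
freqs = np.linspace(0.05, 3.0, 2000)                 # cycles per day
pgram = lombscargle(t, zwd - zwd.mean(), 2.0 * np.pi * freqs)

peak = freqs[np.argmax(pgram)]                       # dominant period, cycles/day
```

The periodogram peaks at 1 cycle/day despite the 30% random gaps, which is precisely the property that makes the method suitable for NRT ZWD series.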
Time series analysis of Adaptive Optics wave-front sensor telemetry data
Poyneer, L A; Palmer, D
2004-03-22
Time series analysis techniques are applied to wave-front sensor telemetry data from the Lick Adaptive Optics System. For 28 fully-illuminated subapertures, telemetry data of 4096 consecutive slope estimates for each subaperture are available. The primary problem is performance comparison of alternative wave-front sensing algorithms. Using direct comparison of data from open-loop and closed-loop trials, we analyze algorithm performance in terms of gain, noise and residual power. We also explore the benefits of multi-input Wiener filtering and analyze the open-loop and closed-loop spatial correlations of the sensor measurements.
Interpretation of engine cycle-to-cycle variation by chaotic time series analysis
Daw, C.S.; Kahl, W.K.
1990-01-01
In this paper we summarize preliminary results from applying a new mathematical technique, chaotic time series analysis (CTSA), to cylinder pressure data from a spark-ignition (SI) four-stroke engine fueled with both methanol and iso-octane. Our objective is to look for the presence of "deterministic chaos" dynamics in peak pressure variations and to investigate the potential usefulness of CTSA as a diagnostic tool. Our results suggest that sequential peak cylinder pressures exhibit some characteristic features of deterministic chaos and that CTSA can extract previously unrecognized information from such data. 18 refs., 11 figs., 2 tabs.
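The core step of chaotic time series analysis, time-delay embedding, can be sketched on a standard chaotic toy system (the logistic map standing in for the peak-pressure sequence; this is an illustration of the technique, not the paper's engine data):

```python
import numpy as np

# Generate a scalar series from the chaotic logistic map x -> 4x(1-x)
n = 5000
x = np.empty(n)
x[0] = 0.3
for i in range(n - 1):
    x[i + 1] = 4.0 * x[i] * (1.0 - x[i])

# Time-delay embedding: vectors (x[i], x[i+tau], ..., x[i+(m-1)*tau])
def embed(series, m, tau):
    n_vec = len(series) - (m - 1) * tau
    return np.column_stack([series[k * tau : k * tau + n_vec]
                            for k in range(m)])

E = embed(x, m=2, tau=1)

# Determinism shows up as structure in the embedded cloud: for this map the
# points fall exactly on the parabola x[i+1] = 4 x[i] (1 - x[i]) instead of
# filling the plane as uncorrelated noise would
residual = np.abs(E[:, 1] - 4.0 * E[:, 0] * (1.0 - E[:, 0])).max()
```

Applied to measured peak cylinder pressures, the same embedding would reveal any low-dimensional deterministic structure as geometry in the reconstructed state space, which is what distinguishes chaotic variation from pure noise.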