Multiscale structure of time series revealed by the monotony spectrum.
Vamoş, Călin
2017-03-01
Observation of complex systems produces time series with specific dynamics at different time scales. The majority of the existing numerical methods for multiscale analysis first decompose the time series into several simpler components, and the multiscale structure is then given by the properties of these components. We present a numerical method which describes the multiscale structure of arbitrary time series without decomposing them. It is based on the monotony spectrum, defined as the variation of the mean amplitude of the monotonic segments with respect to the mean local time scale during successive averagings of the time series, the local time scales being the durations of the monotonic segments. The maxima of the monotony spectrum indicate the time scales which dominate the variations of the time series. We show that the monotony spectrum can correctly analyze a diversity of artificial time series and can discriminate deterministic variations at large time scales from random fluctuations. As an application we analyze the multifractal structure of some hydrological time series.
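The segmentation underlying the monotony spectrum can be sketched in a few lines (a minimal illustration that assumes no long plateaus; the iterative averaging that produces the full spectrum is omitted, and the function names are illustrative, not the authors' implementation):

```python
def monotonic_segments(x):
    """Split a series into maximal monotonic segments and return a
    (duration, amplitude) pair for each: duration is the segment length
    in samples, amplitude the absolute endpoint difference."""
    signs = [(b > a) - (b < a) for a, b in zip(x, x[1:])]
    ends = [0]
    for i in range(1, len(signs)):
        # a nonzero sign change marks the end of a monotonic segment
        if signs[i] != 0 and signs[i - 1] != 0 and signs[i] != signs[i - 1]:
            ends.append(i)
    ends.append(len(x) - 1)
    return [(b - a, abs(x[b] - x[a])) for a, b in zip(ends, ends[1:])]

def monotony_point(x):
    """One point of the monotony spectrum: mean local time scale
    (mean segment duration) versus mean segment amplitude."""
    segs = monotonic_segments(x)
    return (sum(d for d, _ in segs) / len(segs),
            sum(a for _, a in segs) / len(segs))
```

Repeating `monotony_point` after successive smoothings of the series would trace out the spectrum whose maxima mark the dominant time scales.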
Identifying hidden common causes from bivariate time series: a method using recurrence plots.
Hirata, Yoshito; Aihara, Kazuyuki
2010-01-01
We propose a method for inferring the existence of hidden common causes from observations of bivariate time series. We detect related time series by excessive simultaneous recurrences in the corresponding recurrence plots. We also use a noncoverage property of a recurrence plot by the other to deny the existence of a directional coupling. We apply the proposed method to real wind data.
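For a scalar series, a recurrence plot reduces to a thresholded distance matrix, and simultaneous recurrences in two plots can be counted directly. A minimal sketch (function names illustrative; the paper works with embedded states, not raw scalars):

```python
import numpy as np

def recurrence_plot(x, eps):
    """Binary recurrence matrix: R[i, j] = 1 when |x_i - x_j| < eps."""
    x = np.asarray(x, dtype=float)
    return (np.abs(x[:, None] - x[None, :]) < eps).astype(int)

def joint_recurrence_rate(rx, ry):
    """Fraction of time-index pairs that recur simultaneously in both
    plots; an excessive rate suggests the two series are related."""
    return (rx & ry).mean()
```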
NASA Astrophysics Data System (ADS)
Hermosilla, Txomin; Wulder, Michael A.; White, Joanne C.; Coops, Nicholas C.; Hobart, Geordie W.
2017-12-01
The use of time series satellite data allows for the temporally dense, systematic, transparent, and synoptic capture of land dynamics over time. Subsequent to the opening of the Landsat archive, several time series approaches for characterizing landscape change have been developed, often representing a particular analytical time window. The information richness and widespread utility of these time series data have created a need to maintain the currency of time series information via the addition of new data, as it becomes available. When an existing time series is temporally extended, it is critical that previously generated change information remains consistent, thereby not altering reported change statistics or science outcomes based on that change information. In this research, we investigate the impacts and implications of adding additional years to an existing 29-year annual Landsat time series for forest change. To do so, we undertook a spatially explicit comparison of the 29 overlapping years of a time series representing 1984-2012, with a time series representing 1984-2016. Surface reflectance values, and presence, year, and type of change were compared. We found that the addition of years to extend the time series had minimal effect on the annual surface reflectance composites, with slight band-specific differences (r ≥ 0.1) in the final years of the original time series being updated. The area of stand replacing disturbances and determination of change year are virtually unchanged for the overlapping period between the two time-series products. Over the overlapping temporal period (1984-2012), the total area of change differs by 0.53%, equating to an annual difference in change area of 0.019%. Overall, the spatial and temporal agreement of the changes detected by both time series was 96%. Further, our findings suggest that the entire pre-existing historic time series does not need to be re-processed during the update process. 
Critically, given the time series change detection and update approach followed here, science outcomes or reports representing one temporal epoch can be considered stable and will not be altered when a time series is updated with newly available data.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-11
... Series, any adjusted option series, and any option series until the time to expiration for such series is... existing requirement may at times discourage liquidity in particular options series because a market maker... the option is subject to the Price/Time execution algorithm, the Directed Market Maker shall receive...
A novel time series link prediction method: Learning automata approach
NASA Astrophysics Data System (ADS)
Moradabadi, Behnaz; Meybodi, Mohammad Reza
2017-09-01
Link prediction is a key challenge in social network analysis that uses the network structure to predict future links. Common link prediction approaches for predicting hidden links use a static graph representation, in which a snapshot of the network is analyzed to find hidden or future links. For example, similarity-metric-based link prediction is a traditional approach that calculates a similarity metric for each non-connected pair, sorts the pairs by their similarity metrics, and labels those with higher similarity scores as future links. Because people's activities in social networks are dynamic and uncertain, and the structure of the networks changes over time, using deterministic graphs for modeling and analysis of a social network may not be appropriate. In the time-series link prediction problem, the time series of link occurrences is used to predict future links. In this paper, we propose a new time series link prediction method based on learning automata. In the proposed algorithm, for each link that must be predicted there is one learning automaton, and each learning automaton tries to predict the existence or non-existence of the corresponding link. To predict the link occurrence at time T, there is a chain consisting of stages 1 through T - 1, and the learning automaton passes through these stages to learn the existence or non-existence of the corresponding link. Our preliminary link prediction experiments with co-authorship and email networks have provided satisfactory results when time series of link occurrences are considered.
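The per-link automaton can be illustrated with a minimal two-action linear reward-inaction scheme (a sketch of the general idea under simplified assumptions, not the paper's exact update rules; the class and history are hypothetical):

```python
import random

class LearningAutomaton:
    """Two-action linear reward-inaction (L_RI) automaton for one link:
    action 1 = 'link will exist', action 0 = 'link will not exist'."""
    def __init__(self, alpha=0.1):
        self.p = [0.5, 0.5]   # action probabilities
        self.alpha = alpha

    def choose(self):
        return 0 if random.random() < self.p[0] else 1

    def reward(self, action):
        # shift probability mass toward the rewarded action
        self.p[action] += self.alpha * (1 - self.p[action])
        self.p[1 - action] = 1 - self.p[action]

random.seed(0)
la = LearningAutomaton(alpha=0.2)
history = [1] * 50            # toy occurrence record of one link over time
for occurred in history:
    a = la.choose()
    if a == occurred:         # correct prediction -> reward; else inaction
        la.reward(a)
prediction = int(la.p[1] > la.p[0])   # 1: predict the link will appear
```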
FATS: Feature Analysis for Time Series
NASA Astrophysics Data System (ADS)
Nun, Isadora; Protopapas, Pavlos; Sim, Brandon; Zhu, Ming; Dave, Rahul; Castro, Nicolas; Pichara, Karim
2017-11-01
FATS facilitates and standardizes feature extraction for time series data; it quickly and efficiently calculates a compilation of many existing light curve features. Users can characterize or analyze an astronomical photometric database, though this library is not necessarily restricted to the astronomical domain and can also be applied to any kind of time series data.
Multifractal analysis of the Korean agricultural market
NASA Astrophysics Data System (ADS)
Kim, Hongseok; Oh, Gabjin; Kim, Seunghwan
2011-11-01
We have studied the long-term memory effects of the Korean agricultural market using the detrended fluctuation analysis (DFA) method. In general, the return time series of various financial data, including stock indices, foreign exchange rates, and commodity prices, are uncorrelated in time, while the volatility time series are strongly correlated. However, we found that the return time series of Korean agricultural commodity prices are anti-correlated in time, while the volatility time series are correlated. The n-point correlations of time series were also examined, and it was found that a multifractal structure exists in Korean agricultural market prices.
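The DFA procedure used here fits in a few lines; a minimal first-order DFA sketch, assuming evenly sampled data (function name illustrative, not the study's code):

```python
import numpy as np

def dfa(x, scales):
    """First-order detrended fluctuation analysis: return the scaling
    exponent alpha from a log-log fit of fluctuation F(n) vs window n.
    Uncorrelated noise gives alpha near 0.5; anti-correlation, below 0.5."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())          # integrated profile
    F = []
    for n in scales:
        m = len(y) // n
        f = 0.0
        for i in range(m):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            c = np.polyfit(t, seg, 1)    # linear detrend per window
            f += np.mean((seg - np.polyval(c, t)) ** 2)
        F.append(np.sqrt(f / m))
    alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return alpha

rng = np.random.default_rng(0)
alpha = dfa(rng.standard_normal(4000), [8, 16, 32, 64, 128])
```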
Exploratory Causal Analysis in Bivariate Time Series Data
NASA Astrophysics Data System (ADS)
McCracken, James M.
Many scientific disciplines rely on observational data of systems for which it is difficult (or impossible) to implement controlled experiments, and data analysis techniques are required for identifying causal information and relationships directly from observational data. This need has led to the development of many different time series causality approaches and tools, including transfer entropy, convergent cross-mapping (CCM), and Granger causality statistics. In this thesis, the existing time series causality method of CCM is extended by introducing a new method called pairwise asymmetric inference (PAI). It is found that CCM may provide counter-intuitive causal inferences for simple dynamics with strong intuitive notions of causality, and that the CCM causal inference can be a function of physical parameters that are seemingly unrelated to the existence of a driving relationship in the system. For example, a CCM causal inference might alternate between "voltage drives current" and "current drives voltage" as the frequency of the voltage signal is changed in a series circuit with a single resistor and inductor. PAI is introduced to address both of these limitations. Many of the current approaches in the time series causality literature are not computationally straightforward to apply, do not follow directly from assumptions of probabilistic causality, depend on assumed models for the time series generating process, or rely on embedding procedures. A new approach, called causal leaning, is introduced in this work to avoid these issues. The leaning is found to provide causal inferences that agree with intuition for both simple systems and more complicated empirical examples, including space weather data sets. The leaning may provide a clearer interpretation of the results than those from existing time series causality tools.
A practicing analyst can explore the literature to find many proposals for identifying drivers and causal connections in time series data sets, but little research exists on how these tools compare to each other in practice. This work introduces and defines exploratory causal analysis (ECA) to address this issue, along with the concept of data causality in the taxonomy of causal studies introduced in this work. The motivation is to provide a framework for exploring potential causal structures in time series data sets. ECA is used on several synthetic and empirical data sets, and it is found that all of the tested time series causality tools agree with each other (and with intuitive notions of causality) for many simple systems but can provide conflicting causal inferences for more complicated systems. It is proposed that such disagreements between different time series causality tools during ECA might provide deeper insight into the data than could be found otherwise.
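As an example of the kind of tool such a comparison covers, a one-lag Granger-style score can be written directly as the relative variance reduction from adding the other series' past (a simplified sketch of the idea, not the thesis code; function name and toy system are hypothetical):

```python
import numpy as np

def granger_score(x, y):
    """One-lag Granger-style score: relative reduction in residual
    variance when x[t-1] is added to y[t-1] for predicting y[t]."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    Y = y[1:]
    A1 = np.column_stack([y[:-1], np.ones(len(Y))])          # restricted
    A2 = np.column_stack([y[:-1], x[:-1], np.ones(len(Y))])  # augmented
    r1 = Y - A1 @ np.linalg.lstsq(A1, Y, rcond=None)[0]
    r2 = Y - A2 @ np.linalg.lstsq(A2, Y, rcond=None)[0]
    v1, v2 = np.var(r1), np.var(r2)
    return (v1 - v2) / v1    # large values suggest x helps predict y

# toy unidirectionally coupled pair: x drives y with one step of lag
rng = np.random.default_rng(1)
n = 2000
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.standard_normal()
score_xy = granger_score(x, y)   # x's past helps predict y
score_yx = granger_score(y, x)   # but not the reverse
```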
GrammarViz 3.0: Interactive Discovery of Variable-Length Time Series Patterns
Senin, Pavel; Lin, Jessica; Wang, Xing; ...
2018-02-23
The problems of recurrent and anomalous pattern discovery in time series, e.g., motifs and discords, respectively, have received a lot of attention from researchers in the past decade. However, since the pattern search space is usually intractable, most existing detection algorithms require that the patterns have discriminative characteristics and that their length be known in advance and provided as input, which is an unreasonable requirement for many real-world problems. In addition, patterns of similar structure but of different lengths may co-exist in a time series. In order to address these issues, we have developed algorithms for variable-length time series pattern discovery that are based on symbolic discretization and grammar inference—two techniques whose combination enables the structured reduction of the search space and discovery of the candidate patterns in linear time. In this work, we present GrammarViz 3.0—a software package that provides implementations of the proposed algorithms and a graphical user interface for interactive variable-length time series pattern discovery. The current version of the software provides an alternative grammar inference algorithm that improves the time series motif discovery workflow, and introduces an experimental procedure for automated discretization parameter selection that builds upon the minimum cardinality maximum cover principle and aids the time series recurrent and anomalous pattern discovery.
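The symbolic discretization step that this line of work builds on is SAX; a minimal sketch (not the package's implementation; the breakpoints shown are the rounded equiprobable Gaussian cut points for a 4-letter alphabet):

```python
import numpy as np

def sax(x, word_len, alphabet="abcd"):
    """Symbolic Aggregate approXimation: z-normalize, average into
    word_len frames (PAA), then map each frame mean to a symbol via
    Gaussian breakpoints so symbols are roughly equiprobable."""
    x = np.asarray(x, dtype=float)
    z = (x - x.mean()) / x.std()
    paa = [seg.mean() for seg in np.array_split(z, word_len)]
    breakpoints = [-0.67, 0.0, 0.67]   # approximate N(0,1) quartiles
    return "".join(alphabet[np.searchsorted(breakpoints, v)] for v in paa)
```

Grammar induction is then run over the resulting symbol strings to expose repeated (motif) and rare (discord) substructures.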
Nkiaka, E; Nawaz, N R; Lovett, J C
2016-07-01
Hydro-meteorological data is an important asset that can enhance management of water resources, but existing records often contain gaps, leading to uncertainties and compromising their use. Although many methods exist for infilling gaps in hydro-meteorological time series, many of them require inputs from neighbouring stations, which are often not available, while other methods are computationally demanding. Computing techniques such as artificial intelligence can be used to address this challenge. Self-organizing maps (SOMs), a type of artificial neural network, were used for infilling gaps in a hydro-meteorological time series from a Sudano-Sahel catchment. The coefficients of determination obtained were above 0.75 and 0.65, and the average topographic errors were 0.008 and 0.02, for the rainfall and river discharge time series, respectively. These results indicate that SOMs are a robust and efficient method for infilling gaps in hydro-meteorological time series.
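The idea of SOM-based infilling can be illustrated with a tiny 1-D map: train prototypes on complete records, then fill a gap from the best-matching unit found using only the observed components (a minimal sketch under simplified assumptions; function names, map size, and training schedule are illustrative, not the study's configuration):

```python
import numpy as np

def train_som(data, n_units=10, epochs=200, seed=0):
    """Minimal 1-D SOM: prototype vectors are pulled toward samples
    with a neighborhood radius that shrinks over training."""
    rng = np.random.default_rng(seed)
    w = data[rng.integers(0, len(data), n_units)].astype(float)
    for e in range(epochs):
        lr = 0.5 * (1 - e / epochs) + 0.01
        radius = max(n_units / 2 * (1 - e / epochs), 1.0)
        for v in data[rng.permutation(len(data))]:
            bmu = np.argmin(((w - v) ** 2).sum(axis=1))
            dist = np.abs(np.arange(n_units) - bmu)
            h = np.exp(-(dist ** 2) / (2 * radius ** 2))[:, None]
            w += lr * h * (v - w)
    return w

def infill(record, w):
    """Fill NaNs with the matching components of the BMU chosen
    using only the observed components."""
    obs = ~np.isnan(record)
    bmu = np.argmin(((w[:, obs] - record[obs]) ** 2).sum(axis=1))
    filled = record.copy()
    filled[~obs] = w[bmu, ~obs]
    return filled

# toy data: two well-separated clusters of complete 2-D records
rng = np.random.default_rng(42)
data = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal(5, 0.1, (20, 2))])
w = train_som(data)
filled = infill(np.array([5.0, np.nan]), w)
```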
Information retrieval for nonstationary data records
NASA Technical Reports Server (NTRS)
Su, M. Y.
1971-01-01
A review and a critical discussion are made of the existing methods for analysis of nonstationary time series, and a new algorithm for splitting nonstationary time series is applied to the analysis of sunspot data.
Study of Track Irregularity Time Series Calibration and Variation Pattern at Unit Section
Jia, Chaolong; Wei, Lili; Wang, Hanning; Yang, Jiulin
2014-01-01
Focusing on data quality problems in track irregularity time series, this paper first presents algorithms for abnormal data identification, data offset correction, local outlier data identification, and noise cancellation. It then proposes track irregularity time series decomposition and reconstruction through a wavelet decomposition and reconstruction approach. Finally, the patterns and features of the track irregularity standard deviation data sequence in unit sections are studied, and the changing trend of the track irregularity time series is discovered and described.
A Doubly Stochastic Change Point Detection Algorithm for Noisy Biological Signals.
Gold, Nathan; Frasch, Martin G; Herry, Christophe L; Richardson, Bryan S; Wang, Xiaogang
2017-01-01
Experimentally and clinically collected time series data are often contaminated with significant confounding noise, creating short, noisy time series. This noise, due to natural variability and measurement error, poses a challenge to conventional change point detection methods. We propose a novel and robust statistical method for change point detection in noisy biological time series. Our method is a significant improvement over traditional change point detection methods, which only examine a potential anomaly at a single time point. In contrast, our method considers all suspected anomaly points and the joint probability distribution of the number of change points and the elapsed time between two consecutive anomalies. We validate our method with three simulated time series, a widely accepted benchmark data set, two geological time series, a data set of ECG recordings, and a physiological data set of heart rate variability measurements from a fetal sheep model of human labor, comparing it to three existing methods. Our method demonstrates significantly improved performance over the existing point-wise detection methods.
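For contrast, the kind of single-change, point-wise detector that such a method improves upon can be sketched as a CUSUM-style statistic (a baseline sketch, not the proposed doubly stochastic method):

```python
def cusum_change_point(x):
    """Point-wise CUSUM baseline: return the index maximizing the
    absolute cumulative deviation from the overall mean, i.e. the
    single most likely change point."""
    mean = sum(x) / len(x)
    cum, s = [], 0.0
    for v in x:
        s += v - mean
        cum.append(s)
    best_i, best = 0, 0.0
    for i, c in enumerate(cum):
        if abs(c) > best:
            best, best_i = abs(c), i
    return best_i
```

On a clean level shift this works well; the abstract's point is that on short, noisy series such single-point statistics become unreliable, motivating the joint treatment of all suspected anomalies.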
Rainfall disaggregation for urban hydrology: Effects of spatial consistence
NASA Astrophysics Data System (ADS)
Müller, Hannes; Haberlandt, Uwe
2015-04-01
For urban hydrology, rainfall time series with a high temporal resolution are crucial. Observed time series of this kind are in most cases too short to be used, whereas time series with lower temporal resolution (daily measurements) exist for much longer periods. The objective is to derive long, high-resolution time series by disaggregating the time series of non-recording stations with information from the time series of recording stations. The multiplicative random cascade model is a well-known disaggregation model for daily time series. For urban hydrology it is often assumed that a day consists of only 1280 minutes in total as the starting point for the disaggregation process. We introduce a new variant of the cascade model which works without this assumption and also outperforms the existing approach regarding time series characteristics such as wet and dry spell duration, average intensity, fraction of dry intervals, and extreme value representation. However, in both approaches the rainfall time series of different stations are disaggregated without consideration of surrounding stations. This yields unrealistic spatial patterns of rainfall. We apply a simulated annealing algorithm that has previously been used successfully for hourly values. Relative diurnal cycles of the disaggregated time series are resampled to reproduce the spatial dependence of rainfall. To describe spatial dependence we use bivariate characteristics such as probability of occurrence, continuity ratio, and coefficient of correlation. The investigation area is a sewage system in Northern Germany. We show that the algorithm has the capability to improve spatial dependence. The influence of the chosen disaggregation routine and of the spatial dependence on overflow occurrences and volumes of the sewage system will be analyzed.
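The branching that the 1280-minute assumption enables can be illustrated with a toy micro-canonical cascade: eight halvings turn one daily value into 256 five-minute intervals while conserving mass exactly (a simplified sketch; operational cascade models also include intermittency, i.e. a probability of fully dry halves, which is omitted here):

```python
import random

def cascade_disaggregate(total, levels, seed=0):
    """Micro-canonical multiplicative random cascade: repeatedly split
    each interval's rainfall into two halves with a random weight w and
    its complement 1 - w, so the total is preserved at every level."""
    random.seed(seed)
    series = [total]
    for _ in range(levels):
        nxt = []
        for v in series:
            w = random.random()
            nxt += [v * w, v * (1 - w)]
        series = nxt
    return series

# one 1280-minute "day" of 42 mm -> 2**8 = 256 five-minute values
day = cascade_disaggregate(42.0, 8)
```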
NASA Astrophysics Data System (ADS)
Ningrum, R. W.; Surarso, B.; Farikhin; Safarudin, Y. M.
2018-03-01
This paper proposes a combination of the Firefly Algorithm (FA) and Chen fuzzy time series forecasting. Most existing fuzzy forecasting methods based on fuzzy time series use a static interval length. Therefore, we apply an artificial intelligence technique, the Firefly Algorithm (FA), to set varying interval lengths for each cluster in Chen's method. The method is evaluated by applying it to the Jakarta Composite Index (IHSG) and comparing it with classical Chen fuzzy time series forecasting. Its performance is verified through simulation using Matlab.
2017-01-04
response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining...configurations with a restrained manikin, was evaluated in four different test series. Test Series 1 was conducted to determine the materials and...5 ms TTP. Test Series 2 was conducted to determine the materials and drop heights required for energy attenuation of the seat pan to generate a 4 m
Charles H. Luce; Daniele Tonina; Frank Gariglio; Ralph Applebee
2013-01-01
Work over the last decade has documented methods for estimating fluxes between streams and streambeds from time series of temperature at two depths in the streambed. We present a substantial extension to the existing theory and practice of using temperature time series to estimate streambed water fluxes and thermal properties, including (1) a new explicit analytical...
Cross-Sectional Time Series Designs: A General Transformation Approach.
ERIC Educational Resources Information Center
Velicer, Wayne F.; McDonald, Roderick P.
1991-01-01
The general transformation approach to time series analysis is extended to the analysis of multiple unit data by the development of a patterned transformation matrix. The procedure includes alternatives for special cases and requires only minor revisions in existing computer software. (SLD)
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-26
... 500 Index option series in the pilot: (1) A time series analysis of open interest; and (2) an analysis... issue's total market share value, which is the share price times the number of shares outstanding. These... other series. Strike price intervals would be set no less than 5 points apart. Consistent with existing...
Application of an Entropic Approach to Assessing Systems Integration
2012-03-01
two econometric measures of information efficiency – Shannon entropy and Hurst exponent. Shannon entropy (which is explained in Chapter III) can be...applied to evaluate long-term correlation of time series, while the Hurst exponent can be applied to classify the time series in accordance to existence...of trend. The Hurst exponent is the statistical measure of time series long-range dependence, and its value falls in the interval [0, 1] – a value in
77 FR 6685 - Airworthiness Directives; The Boeing Company Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-09
... proposed AD reduces compliance times for Model 767-400ER series airplanes. In addition, this proposed AD...). This proposed AD would reduce the compliance times for Model 767-400ER series airplanes. In addition... airplanes, the existing AD also requires a one- time inspection to determine if a tool runout option has...
Developing a comprehensive time series of GDP per capita for 210 countries from 1950 to 2015
2012-01-01
Background Income has been extensively studied and utilized as a determinant of health. There are several sources of income expressed as gross domestic product (GDP) per capita, but there are no time series that are complete for the years between 1950 and 2015 for the 210 countries for which data exist. It is in the interest of population health research to establish a global time series that is complete from 1950 to 2015. Methods We collected GDP per capita estimates expressed in either constant US dollar terms or international dollar terms (corrected for purchasing power parity) from seven sources. We applied several stages of models, including ordinary least-squares regressions and mixed effects models, to complete each of the seven source series from 1950 to 2015. The three US dollar and four international dollar series were each averaged to produce two new GDP per capita series. Results and discussion Nine complete series from 1950 to 2015 for 210 countries are available for use. These series can serve various analytical purposes and can illustrate myriad economic trends and features. The derivation of the two new series allows for researchers to avoid any series-specific biases that may exist. The modeling approach used is flexible and will allow for yearly updating as new estimates are produced by the source series. Conclusion GDP per capita is a necessary tool in population health research, and our development and implementation of a new method has allowed for the most comprehensive known time series to date.
a Method of Time-Series Change Detection Using Full Polsar Images from Different Sensors
NASA Astrophysics Data System (ADS)
Liu, W.; Yang, J.; Zhao, J.; Shi, H.; Yang, L.
2018-04-01
Most of the existing change detection methods using full polarimetric synthetic aperture radar (PolSAR) are limited to detecting change between two points in time. In this paper, a novel method is proposed to detect change based on time-series data from different sensors. Firstly, the overall difference image of a time-series PolSAR data set is calculated by an omnibus statistic test. Secondly, difference images between any two images at different times are acquired by the Rj statistic test. A generalized Gaussian mixture model (GGMM) is used to obtain the time-series change detection maps in the last step of the proposed method. To verify the effectiveness of the proposed method, we carried out a change detection experiment using time-series PolSAR images acquired by Radarsat-2 and Gaofen-3 over the city of Wuhan, China. Results show that the proposed method can detect time-series change from different sensors.
Sea change: Charting the course for biogeochemical ocean time-series research in a new millennium
NASA Astrophysics Data System (ADS)
Church, Matthew J.; Lomas, Michael W.; Muller-Karger, Frank
2013-09-01
Ocean time-series provide vital information needed for assessing ecosystem change. This paper summarizes the historical context, major program objectives, and future research priorities for three contemporary ocean time-series programs: The Hawaii Ocean Time-series (HOT), the Bermuda Atlantic Time-series Study (BATS), and the CARIACO Ocean Time-Series. These three programs operate in physically and biogeochemically distinct regions of the world's oceans, with HOT and BATS located in the open-ocean waters of the subtropical North Pacific and North Atlantic, respectively, and CARIACO situated in the anoxic Cariaco Basin of the tropical Atlantic. All three programs sustain near-monthly shipboard occupations of their field sampling sites, with HOT and BATS beginning in 1988, and CARIACO initiated in 1996. The resulting data provide some of the only multi-disciplinary, decadal-scale determinations of time-varying ecosystem change in the global ocean. Facilitated by a scoping workshop (September 2010) sponsored by the Ocean Carbon Biogeochemistry (OCB) program, leaders of these time-series programs sought community input on existing program strengths and on future research directions. Themes that emerged from these discussions included: 1. Shipboard time-series programs are key to informing our understanding of the connectivity between changes in ocean-climate and biogeochemistry. 2. The scientific and logistical support provided by shipboard time-series programs forms the backbone for numerous research and education programs. Future studies should be encouraged that seek mechanistic understanding of the ecological interactions underlying the biogeochemical dynamics at these sites. 3. Detecting time-varying trends in ocean properties and processes requires consistent, high-quality measurements. Time-series must carefully document analytical procedures and, where possible, trace the accuracy of analyses to certified standards and internal reference materials. 4. Leveraged implementation, testing, and validation of autonomous and remote observing technologies at time-series sites provide new insights into spatiotemporal variability underlying ecosystem changes. 5. The value of existing time-series data for formulating and validating ecosystem models should be promoted. In summary, the scientific underpinnings of ocean time-series programs remain as strong and important today as when these programs were initiated. The emerging data inform our knowledge of the ocean's biogeochemistry and ecology, and improve our predictive capacity about planetary change.
Incremental fuzzy C medoids clustering of time series data using dynamic time warping distance
Liu, Yongli; Chen, Jingli; Wu, Shuai; Liu, Zhizhong; Chao, Hao
2018-01-01
Clustering time series data is of great significance since it can extract meaningful statistics and other characteristics. Especially in biomedical engineering, outstanding clustering algorithms for time series may help improve people's health. Considering the data scale and time shifts of time series, in this paper we introduce two incremental fuzzy clustering algorithms based on a Dynamic Time Warping (DTW) distance. By adopting Single-Pass and Online processing patterns, our algorithms can handle large-scale time series data by splitting it into a set of chunks which are processed sequentially. Besides, our algorithms use DTW to measure the distance between pairs of time series and achieve higher clustering accuracy, because DTW can determine an optimal match between any two time series by stretching or compressing segments of temporal data. Our new algorithms are compared to some existing prominent incremental fuzzy clustering algorithms on 12 benchmark time series datasets. The experimental results show that the proposed approaches yield high-quality clusters and are better than all the competitors in terms of clustering accuracy.
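The DTW distance at the core of such algorithms is a standard dynamic program; a minimal O(nm) sketch with absolute-difference local cost:

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two sequences: the minimal
    accumulated local cost over all monotone alignments, allowing one
    series to stretch or compress against the other."""
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three allowed predecessor cells
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]
```

Unlike Euclidean distance, the warping lets `[1, 2, 3]` match `[1, 2, 2, 3]` at zero cost, which is exactly the time-shift tolerance the abstract appeals to.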
Using forbidden ordinal patterns to detect determinism in irregularly sampled time series.
Kulp, C W; Chobot, J M; Niskala, B J; Needhammer, C J
2016-02-01
It is known that when symbolizing a time series into ordinal patterns using the Bandt-Pompe (BP) methodology, there will be ordinal patterns, called forbidden patterns, that do not occur in a deterministic series. The existence of forbidden patterns can be used to identify deterministic dynamics. In this paper, the ability to use forbidden patterns to detect determinism in irregularly sampled time series is tested on data generated from a continuous model system. The study is done in three parts. First, the effects of sampling time on the number of forbidden patterns are studied on regularly sampled time series. The next two parts focus on two types of irregular sampling: missing data and timing jitter. It is shown that forbidden patterns can be used to detect determinism in irregularly sampled time series for low degrees of sampling irregularity (as defined in the paper). In addition, comments are made about the appropriateness of using the BP methodology to symbolize irregularly sampled time series.
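Counting ordinal patterns and identifying the missing ones takes only a few lines; a sketch of the BP symbolization for order 3, with ties broken by index (function names illustrative):

```python
from itertools import permutations

def ordinal_patterns(x, order=3):
    """Count occurrences of each Bandt-Pompe ordinal pattern: each
    length-'order' window is mapped to the permutation that sorts it."""
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(len(x) - order + 1):
        window = x[i:i + order]
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] += 1
    return counts

def forbidden_patterns(x, order=3):
    """Patterns that never occur; an excess of forbidden patterns
    suggests deterministic rather than stochastic dynamics."""
    return [p for p, c in ordinal_patterns(x, order).items() if c == 0]
```

A monotonically increasing series realizes only the pattern `(0, 1, 2)`, so the other five order-3 patterns are forbidden; for long stochastic series all six eventually appear.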
Boolean network inference from time series data incorporating prior biological knowledge.
Haider, Saad; Pal, Ranadip
2012-01-01
Numerous approaches exist for modeling genetic regulatory networks (GRNs), but the low sampling rates often employed in biological studies prevent the inference of detailed models from experimental data. In this paper, we analyze the issues involved in estimating a model of a GRN from single cell line time series data with limited time points. We present an inference approach for a Boolean Network (BN) model of a GRN from limited transcriptomic or proteomic time series data, based on prior biological knowledge of connectivity, constraints on attractor structure, and robust design. We applied our inference approach to 6-time-point transcriptomic data on a Human Mammary Epithelial Cell line (HMEC) after application of Epidermal Growth Factor (EGF) and generated a BN with a plausible biological structure satisfying the data. We further defined and applied a similarity measure to compare synthetic BNs with BNs generated through the proposed approach from transitions of various paths of the synthetic BNs. We have also compared the performance of our algorithm with two existing BN inference algorithms. Through theoretical analysis and simulations, we showed the rarity of arriving at a BN with plausible biological structure from limited time series data using random connectivity and in the absence of structure in the data. The framework, when applied to experimental data and data generated from synthetic BNs, was able to estimate BNs with high similarity scores. Comparison with existing BN inference algorithms showed the better performance of our proposed algorithm for limited time series data. The proposed framework can also be applied to optimize the connectivity of a GRN from experimental data when the prior biological knowledge on regulators is limited or not unique.
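A synchronous Boolean network update and its attractor search can be sketched in a few lines; the three-gene wiring below is a toy invented for illustration, not the HMEC network inferred in the paper:

```python
def bn_attractor(update_fns, state):
    """Iterate a synchronous Boolean network until a state repeats; the
    repeating cycle is the attractor reached from `state`.
    update_fns[i] maps the full state tuple to the next value of gene i."""
    seen = {}
    trajectory = []
    while state not in seen:
        seen[state] = len(trajectory)
        trajectory.append(state)
        state = tuple(f(state) for f in update_fns)
    return trajectory[seen[state]:]  # the cycle (a singleton is a fixed point)

# Hypothetical 3-gene wiring: g0 activates g1, g1 activates g2,
# and g2 represses g0 (a repressilator-like toy).
fns = [
    lambda s: int(not s[2]),  # g0 := NOT g2
    lambda s: s[0],           # g1 := g0
    lambda s: s[1],           # g2 := g1
]
cycle = bn_attractor(fns, (0, 0, 0))
print(len(cycle))  # 6 -- a six-state limit cycle
```

Enumerating attractors like this over all 2^n start states is what the paper's attractor-structure constraints restrict during inference.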
Sub- and Quasi-Centurial Cycles in Solar and Geomagnetic Activity Data Series
NASA Astrophysics Data System (ADS)
Komitov, B.; Sello, S.; Duchlev, P.; Dechev, M.; Penev, K.; Koleva, K.
2016-07-01
The subject of this paper is the existence and stability of solar cycles with durations in the range of 20-250 years. Five types of data series are used: 1) the Zurich series (1749-2009 AD), the mean annual International sunspot number Ri; 2) the Group sunspot number series Rh (1610-1995 AD); 3) the simulated extended sunspot number from the Extended time series of Solar Activity Indices (ESAI) (1090-2002 AD); 4) the simulated extended geomagnetic aa-index from ESAI (1099-2002 AD); 5) the Meudon filament series (1919-1991 AD). Two principally independent methods of time series analysis are used: the T-R periodogram analysis (both in standard and "scanning window" regimes) and wavelet analysis. The obtained results are very similar. A strong cycle with a mean duration of 55-60 years is found to exist in all series. On the other hand, strong and stable quasi-110-120-year and ~200-year cycles are obtained in all of these series except the Ri one. The high importance of long-term solar activity dynamics for the aims of solar dynamo modeling and prediction is especially noted.
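The kind of cycle hunting described here can be illustrated with a plain discrete periodogram, a simple stand-in for the T-R periodogram and wavelet analysis actually used (naive DFT, no FFT, to stay dependency-free):

```python
import cmath, math

def periodogram_peak(x):
    """Return the nonzero DFT frequency index with maximal power,
    i.e. the dominant cycle of a mean-removed series."""
    n = len(x)
    mean = sum(x) / n
    best_k, best_p = 1, -1.0
    for k in range(1, n // 2 + 1):
        s = sum((x[t] - mean) * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        p = abs(s) ** 2
        if p > best_p:
            best_k, best_p = k, p
    return best_k

# Synthetic "annual index" with a 60-year cycle sampled over 600 years.
series = [math.sin(2 * math.pi * t / 60) for t in range(600)]
k = periodogram_peak(series)
print(600 / k)  # recovered period in years: 60.0
```

A "scanning window" variant would apply the same estimator to sliding sub-spans to check the stability of a cycle over time.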
Time-series modeling of long-term weight self-monitoring data.
Helander, Elina; Pavel, Misha; Jimison, Holly; Korhonen, Ilkka
2015-08-01
Long-term self-monitoring of weight is beneficial for weight maintenance, especially after weight loss. Connected weight scales accumulate time series information over the long term and hence enable time series analysis of the data. The analysis can reveal individual patterns, provide more sensitive detection of significant weight trends, and enable more accurate and timely prediction of weight outcomes. However, long-term self-weighing data present several challenges that complicate the analysis; in particular, irregular sampling, missing data, and the existence of periodic (e.g. diurnal and weekly) patterns are common. In this study, we apply a time series modeling approach to daily weight time series from two individuals and describe the information that can be extracted from this kind of data. We study the properties of weight time series data, missing data and its link to individuals' behavior, periodic patterns, and weight series segmentation. Understanding behavior through weight data and giving relevant feedback is expected to lead to positive interventions on health behaviors.
A high-fidelity weather time series generator using the Markov Chain process on a piecewise level
NASA Astrophysics Data System (ADS)
Hersvik, K.; Endrerud, O.-E. V.
2017-12-01
A method is developed for generating a set of unique weather time-series based on an existing weather series. The method allows statistically valid weather variations to take place within repeated simulations of offshore operations. The numerous generated time series need to share the same statistical qualities as the original time series. Statistical qualities here refer mainly to the distribution of weather windows available for work, including durations and frequencies of such weather windows, and seasonal characteristics. The method is based on the Markov chain process. The core new development lies in how the Markov Process is used, specifically by joining small pieces of random length time series together rather than joining individual weather states, each from a single time step, which is a common solution found in the literature. This new Markov model shows favorable characteristics with respect to the requirements set forth and all aspects of the validation performed.
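The piecewise joining idea can be sketched as follows. This is our simplified reading of the approach: discrete weather states, piece lengths drawn uniformly, and no seasonal stratification, none of which is claimed to match the paper's implementation:

```python
import random

def piecewise_markov(series, length, min_len=3, max_len=10, seed=0):
    """Generate a synthetic series by chaining random-length pieces of the
    original record. Each new piece starts just after an occurrence of the
    state at which the previous piece ended, so every transition in the
    output is a transition observed in the original."""
    rng = random.Random(seed)
    out = [rng.choice(series)]
    while len(out) < length:
        # positions where the current state occurs (excluding the last sample)
        starts = [i for i, v in enumerate(series[:-1]) if v == out[-1]]
        i = rng.choice(starts)
        n = rng.randint(min_len, max_len)
        out.extend(series[i + 1:i + 1 + n])
    return out[:length]

# Toy "sea state" record with states 0..2.
hist = [0, 0, 1, 1, 2, 2, 1, 0, 0, 1, 2, 1, 0]
synthetic = piecewise_markov(hist, 50)
```

Joining multi-step pieces rather than single time steps, as the abstract argues, better preserves the durations of weather windows than a state-by-state Markov chain.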
OceanSITES: Sustained Ocean Time Series Observations in the Global Ocean.
NASA Astrophysics Data System (ADS)
Weller, R. A.; Gallage, C.; Send, U.; Lampitt, R. S.; Lukas, R.
2016-02-01
Time series observations at critical or representative locations are an essential element of a global ocean observing system that is unique and complements other approaches to sustained observing. OceanSITES is an international group of oceanographers associated with such time series sites. OceanSITES exists to promote the continuation and extension of ocean time series sites around the globe, and to plan and oversee the global array of sites in order to address the needs of research, climate change detection, operational applications, and policy makers. OceanSITES is a voluntary group that sits as an Action Group of the JCOMM-OPS Data Buoy Cooperation Panel, where JCOMM-OPS is the operational ocean observing oversight group of the Joint Technical Commission for Oceanography and Marine Meteorology of the Intergovernmental Oceanographic Commission and the World Meteorological Organization. The way forward includes working to complete the global array, moving toward multidisciplinary instrumentation on a subset of the sites, and increasing utilization of the time series data, which are freely available from two Global Data Assembly Centers, one at the National Data Buoy Center and one at Coriolis at IFREMER. One recent OceanSITES initiative and several results from OceanSITES time series sites are presented. The recent initiative was the assembly of a pool of temperature/conductivity recorders for provision to OceanSITES sites in order to provide deep ocean temperature and salinity time series. Examples from specific sites include: a 15-year record of surface meteorology and air-sea fluxes from off northern Chile that shows evidence of long-term trends in surface forcing; changes in upper ocean salinity and stratification, associated with regional change in the hydrological cycle, seen at the Hawaii time series site; results from monitoring Atlantic meridional transport; and results from a European multidisciplinary time series site.
A general statistical test for correlations in a finite-length time series.
Hanson, Jeffery A; Yang, Haw
2008-06-07
The statistical properties of the autocorrelation function from a time series composed of independently and identically distributed stochastic variables have been studied. Analytical expressions for the autocorrelation function's variance have been derived. It has been found that two common ways of calculating the autocorrelation, moving-average and Fourier transform, exhibit different uncertainty characteristics. For periodic time series, the Fourier transform method is preferred because it gives smaller uncertainties that are uniform through all time lags. Based on these analytical results, a statistically robust method has been proposed to test the existence of correlations in a time series. The statistical test is verified by computer simulations, and an application to single-molecule fluorescence spectroscopy is discussed.
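The Fourier-transform route to the autocorrelation mentioned here follows the Wiener-Khinchin relation: the circular ACF is the inverse transform of the power spectrum. A dependency-free sketch (a naive O(n^2) DFT stands in for the FFT one would normally use):

```python
import cmath, math

def autocorr_fourier(x):
    """Circular autocorrelation via the DFT (Wiener-Khinchin):
    ACF = IDFT(|DFT(x - mean)|^2), normalized so the lag-0 value is 1."""
    n = len(x)
    mean = sum(x) / n
    xc = [v - mean for v in x]
    X = [sum(xc[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
         for k in range(n)]
    power = [abs(v) ** 2 for v in X]
    acf = [sum(power[k] * cmath.exp(2j * math.pi * k * t / n)
               for k in range(n)).real / n for t in range(n)]
    return [v / acf[0] for v in acf]

# For a sinusoid with period 8 over whole cycles, ACF(lag) = cos(2*pi*lag/8).
x = [math.sin(2 * math.pi * t / 8) for t in range(32)]
acf = autocorr_fourier(x)
```

For a periodic series like this one, the Fourier estimate is uniform across lags, which is the regime the paper recommends it for.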
Highly comparative time-series analysis: the empirical structure of time series and their methods.
Fulcher, Ben D; Little, Max A; Jones, Nick S
2013-06-06
The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
Rodgers, Joseph Lee; Beasley, William Howard; Schuelke, Matthew
2014-01-01
Many data structures, particularly time series data, are naturally seasonal, cyclical, or otherwise circular. Past graphical methods for time series have focused on linear plots. In this article, we move graphical analysis onto the circle. We focus on two particular methods, one old and one new. Rose diagrams are circular histograms and can be produced in several different forms using the RRose software system. In addition, we propose, develop, illustrate, and provide software support for a new circular graphical method, called Wrap-Around Time Series Plots (WATS Plots), which is useful for supporting time series analyses in general and interrupted time series designs in particular. We illustrate the use of WATS Plots with an interrupted time series design evaluating the effect of the Oklahoma City bombing on birthrates in Oklahoma County during the 10 years surrounding the bombing of the Murrah Building. We compare WATS Plots with linear time series representations and overlay them with smoothing and error bands. Each method is shown to have advantages in relation to the other; in our example, the WATS Plots more clearly show the existence and effect size of the fertility differential.
Superstatistical fluctuations in time series: Applications to share-price dynamics and turbulence
NASA Astrophysics Data System (ADS)
van der Straeten, Erik; Beck, Christian
2009-09-01
We report a general technique to study a given experimental time series with superstatistics. Crucial for the applicability of the superstatistics concept is the existence of a parameter β that fluctuates on a large time scale as compared to the other time scales of the complex system under consideration. The proposed method extracts the main superstatistical parameters out of a given data set and examines the validity of the superstatistical model assumptions. We test the method thoroughly with surrogate data sets. Then the applicability of the superstatistical approach is illustrated using real experimental data. We study two examples, velocity time series measured in turbulent Taylor-Couette flows and time series of log returns of the closing prices of some stock market indices.
United States forest disturbance trends observed with landsat time series
Jeffrey G. Masek; Samuel N. Goward; Robert E. Kennedy; Warren B. Cohen; Gretchen G. Moisen; Karen Schleweiss; Chengquan Huang
2013-01-01
Disturbance events strongly affect the composition, structure, and function of forest ecosystems; however, existing US land management inventories were not designed to monitor disturbance. To begin addressing this gap, the North American Forest Dynamics (NAFD) project has examined a geographic sample of 50 Landsat satellite image time series to assess trends in forest...
Handbook for Using the Intensive Time-Series Design.
ERIC Educational Resources Information Center
Mayer, Victor J.; Monk, John S.
Work on the development of the intensive time-series design was initiated because of dissatisfaction with existing research designs. This dissatisfaction resulted from the paucity of data obtained from designs such as the pre-post and randomized posttest-only designs. All have the common characteristic of yielding data from only one or two…
Closed-Loop Optimal Control Implementations for Space Applications
2016-12-01
Through the analyses of a series of optimal control problems, several real-time optimal control algorithms are developed that continuously adapt to feedback on the...
Sensor-Generated Time Series Events: A Definition Language
Anguera, Aurea; Lara, Juan A.; Lizcano, David; Martínez, Maria Aurora; Pazos, Juan
2012-01-01
There are now a great many domains where information is recorded by sensors over a limited time period or on a permanent basis. This data flow leads to sequences of data known as time series. In many domains, like seismography or medicine, time series analysis focuses on particular regions of interest, known as events, whereas the remainder of the time series contains hardly any useful information. In these domains, there is a need for mechanisms to identify and locate such events. In this paper, we propose an events definition language that is general enough to be used to easily and naturally define events in time series recorded by sensors in any domain. The proposed language has been applied to the definition of time series events generated within the branch of medicine dealing with balance-related functions in human beings. A device called a posturograph is used to study balance-related functions. The platform has four sensors that record the pressure intensity being exerted on the platform, generating four interrelated time series. As opposed to the existing ad hoc proposals, the results confirm that the proposed language is valid, that is, generally applicable and accurate, for identifying the events contained in the time series.
Marwaha, Puneeta; Sunkaria, Ramesh Kumar
2016-09-01
The sample entropy (SampEn) has been widely used to quantify the complexity of RR-interval time series. It is a fact that higher complexity, and hence entropy, is associated with the RR-interval time series of healthy subjects. But SampEn suffers from the disadvantage that it assigns higher entropy to randomized surrogate time series, as well as to certain pathological time series, which is a misleading observation. This wrong estimation of the complexity of a time series may be due to the fact that the existing SampEn technique updates the threshold value as a function of the long-term standard deviation (SD) of a time series. However, time series of certain pathologies exhibit substantial variability in beat-to-beat fluctuations. So the SD of the first-order difference (short-term SD) of the time series should be considered while updating the threshold value, to account for the period-to-period variations inherent in a time series. In the present work, improved sample entropy (I-SampEn), a new methodology, is proposed in which the threshold value is updated by considering the period-to-period variations of a time series. The I-SampEn technique results in assigning higher entropy values to age-matched healthy subjects than to patients suffering from atrial fibrillation (AF) and diabetes mellitus (DM). Our results are in agreement with the theory of reduced complexity of RR-interval time series in patients suffering from chronic cardiovascular and non-cardiovascular diseases.
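Standard SampEn, the baseline being improved on here, follows directly from its definition. The sketch below uses the conventional default r = 0.2 times the long-term SD; the paper's I-SampEn would instead derive r from the SD of the first differences:

```python
import math

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): -log of the conditional probability that sequences
    matching for m points (Chebyshev distance < r) also match for m+1.
    Default threshold r = 0.2 * long-term SD, the choice I-SampEn revises."""
    n = len(x)
    if r is None:
        mean = sum(x) / n
        r = 0.2 * math.sqrt(sum((v - mean) ** 2 for v in x) / n)

    def count(m):
        # number of template pairs within tolerance r
        c = 0
        for i in range(n - m):
            for j in range(i + 1, n - m):
                if max(abs(x[i + k] - x[j + k]) for k in range(m)) < r:
                    c += 1
        return c

    B, A = count(m), count(m + 1)
    return -math.log(A / B)

periodic = [0.0, 1.0] * 50
print(sample_entropy(periodic))  # small (~0.02): a perfectly regular series
```

Swapping the long-term SD for the SD of successive differences in the `r` computation is, per the abstract, the essence of the I-SampEn modification.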
Hanson, Jeffery A; Yang, Haw
2008-11-06
The statistical properties of the cross correlation between two time series has been studied. An analytical expression for the cross correlation function's variance has been derived. On the basis of these results, a statistically robust method has been proposed to detect the existence and determine the direction of cross correlation between two time series. The proposed method has been characterized by computer simulations. Applications to single-molecule fluorescence spectroscopy are discussed. The results may also find immediate applications in fluorescence correlation spectroscopy (FCS) and its variants.
The virtual enhancements - solar proton event radiation (VESPER) model
NASA Astrophysics Data System (ADS)
Aminalragia-Giamini, Sigiava; Sandberg, Ingmar; Papadimitriou, Constantinos; Daglis, Ioannis A.; Jiggens, Piers
2018-02-01
A new probabilistic model introducing a novel paradigm for the modelling of the solar proton environment at 1 AU is presented. The virtual enhancements - solar proton event radiation model (VESPER) uses the European Space Agency's Solar Energetic Particle Environment Modelling (SEPEM) Reference Dataset and produces virtual time-series of proton differential fluxes. In this regard it fundamentally diverges from the approach of existing SPE models, which are based on probabilistic descriptions of SPE macroscopic characteristics such as peak flux and cumulative fluence. It is shown that VESPER reproduces well the characteristics of the dataset it uses, and further comparisons with existing models are made with respect to their results. The production of time-series as the main output of the model opens a straightforward way for the calculation of solar proton radiation effects in terms of time-series, and for pairing them with effects caused by trapped radiation and galactic cosmic rays.
Network structure of multivariate time series.
Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito
2015-10-21
Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range of tools and techniques for time series analysis already exists, the increasing availability of massive data structures calls for new approaches for multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on the mapping of a multidimensional time series into a multilayer network, which allows us to extract information on a high-dimensional dynamical system through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow us to extract and quantify nontrivial properties of coupled chaotic maps, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail.
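One concrete series-to-network mapping in this family is the horizontal visibility graph, which could serve as a single layer of such a multiplex construction (a sketch of the standard HVG rule, not necessarily the paper's exact pipeline):

```python
def horizontal_visibility_edges(x):
    """Horizontal visibility graph of one series: nodes are time points,
    and i, j are linked when every sample strictly between them lies
    below both x[i] and x[j]."""
    edges = set()
    n = len(x)
    for i in range(n - 1):
        edges.add((i, i + 1))  # consecutive points always see each other
        top = x[i + 1]         # running max of the intermediate samples
        for j in range(i + 2, n):
            if x[i] > top and x[j] > top:
                edges.add((i, j))
            top = max(top, x[j])
    return edges

print(sorted(horizontal_visibility_edges([3, 1, 2, 1, 3])))  # 7 edges
```

Degree sequences and inter-layer overlaps of such graphs are the kind of simple structural descriptors the abstract refers to.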
Multi-Step Time Series Forecasting with an Ensemble of Varied Length Mixture Models.
Ouyang, Yicun; Yin, Hujun
2018-05-01
Many real-world problems require modeling and forecasting of time series, such as weather temperature, electricity demand, stock prices and foreign exchange (FX) rates. Often, the tasks involve predicting over a long-term period, e.g. several weeks or months. Most existing time series models are inherently one-step models, that is, they predict one time point ahead. Multi-step or long-term prediction is difficult and challenging due to the lack of information and the accumulation of uncertainty or error. The main existing approaches, iterative and independent, either use a one-step model recursively or treat the multi-step task with an independent model; they generally perform poorly in practical applications. In this paper, as an extension of the self-organizing mixture autoregressive (AR) model, varied length mixture (VLM) models are proposed to model and forecast time series over multiple steps. The key idea is to preserve the dependencies between the time points within the prediction horizon. Training data are segmented to various lengths corresponding to various forecasting horizons, and the VLM models are trained in a self-organizing fashion on these segments to capture these dependencies in their component AR models of various prediction horizons. The VLM models form a probabilistic mixture of these varied length models. A combination of short and long VLM models, and an ensemble of them, are proposed to further enhance the prediction performance. The effectiveness of the proposed methods and their marked improvements over the existing methods are demonstrated through a number of experiments on synthetic data, real-world FX rates and weather temperatures.
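The "iterative" baseline the paper contrasts with is easy to sketch with an AR(1) model; this is a deliberately minimal illustration of the recursive strategy, not the VLM model itself:

```python
def fit_ar1(x):
    """Least-squares AR(1) slope for a zero-mean series: x[t+1] ~ a * x[t]."""
    num = sum(x[t] * x[t + 1] for t in range(len(x) - 1))
    den = sum(v * v for v in x[:-1])
    return num / den

def iterated_forecast(x, a, h):
    """One-step model applied recursively for h steps: each prediction is
    fed back in as input, so errors compound with the horizon."""
    y = x[-1]
    preds = []
    for _ in range(h):
        y = a * y
        preds.append(y)
    return preds

# A pure AR(1) process x[t+1] = 0.5 x[t] is recovered exactly.
x = [8.0]
for _ in range(20):
    x.append(0.5 * x[-1])
a = fit_ar1(x)          # a == 0.5
preds = iterated_forecast(x, a, 3)
```

On noisy data the recursion amplifies one-step errors over the horizon, which is the weakness the varied-length mixture is designed to avoid by modeling each horizon's dependencies directly.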
Fifth-order complex Korteweg-de Vries-type equations
NASA Astrophysics Data System (ADS)
Khanal, Netra; Wu, Jiahong; Yuan, Juan-Ming
2012-05-01
This paper studies spatially periodic complex-valued solutions of the fifth-order Korteweg-de Vries (KdV)-type equations. The aim is at several fundamental issues including the existence, uniqueness and finite-time blowup problems. Special attention is paid to the Kawahara equation, a fifth-order KdV-type equation. When a Burgers dissipation is attached to the Kawahara equation, we establish the existence and uniqueness of the Fourier series solution with the Fourier modes decaying algebraically in terms of the wave numbers. We also examine a special series solution to the Kawahara equation and prove the convergence and global regularity of such solutions associated with a single mode initial data. In addition, finite-time blowup results are discussed for the special series solution of the Kawahara equation.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-12
... also proposes to identify EOW and EOM trading patterns by undertaking a time series analysis of open... a Friday, the Exchange will list an End of Month expiration series and not an End of Week expiration... continue to exist. However, any further trading in those series would be restricted to transactions where...
Improvements to surrogate data methods for nonstationary time series.
Lucio, J H; Valdés, R; Rodríguez, L R
2012-05-01
The method of surrogate data has been extensively applied to hypothesis testing of system linearity, when only one realization of the system, a time series, is known. Normally, surrogate data should preserve the linear stochastic structure and the amplitude distribution of the original series. Classical surrogate data methods (such as random permutation, amplitude adjusted Fourier transform, or iterative amplitude adjusted Fourier transform) are successful at preserving one or both of these features in stationary cases. However, they always produce stationary surrogates, hence existing nonstationarity could be interpreted as dynamic nonlinearity. Certain modifications have been proposed that additionally preserve some nonstationarity, at the expense of reproducing a great deal of nonlinearity. However, even those methods generally fail to preserve the trend (i.e., global nonstationarity in the mean) of the original series. This is the case of time series with unit roots in their autoregressive structure. Additionally, those methods, based on Fourier transform, either need first and last values in the original series to match, or they need to select a piece of the original series with matching ends. These conditions are often inapplicable and the resulting surrogates are adversely affected by the well-known artefact problem. In this study, we propose a simple technique that, applied within existing Fourier-transform-based methods, generates surrogate data that jointly preserve the aforementioned characteristics of the original series, including (even strong) trends. Moreover, our technique avoids the negative effects of end mismatch. Several artificial and real, stationary and nonstationary, linear and nonlinear time series are examined, in order to demonstrate the advantages of the methods. Corresponding surrogate data are produced with the classical and with the proposed methods, and the results are compared.
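The classical Fourier-transform surrogate that these methods build on can be sketched as follows (a naive DFT keeps the sketch dependency-free; note this is exactly the stationary baseline whose trend-handling shortcomings the paper addresses):

```python
import cmath, math, random

def ft_surrogate(x, seed=0):
    """Classical FT surrogate: keep the DFT amplitudes (hence the linear
    correlation structure), randomize the phases with Hermitian symmetry
    so the inverse transform is real. Stationarity is implicitly assumed."""
    n = len(x)
    X = [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
         for k in range(n)]
    rng = random.Random(seed)
    Y = X[:]
    for k in range(1, (n + 1) // 2):
        phi = rng.uniform(0, 2 * math.pi)
        Y[k] = abs(X[k]) * cmath.exp(1j * phi)
        Y[n - k] = Y[k].conjugate()  # Hermitian pair keeps the result real
    return [sum(Y[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)).real / n for t in range(n)]

x = [0.1, 1.3, 0.7, -0.4, 0.0, 2.1, -1.2, 0.5]
s = ft_surrogate(x)  # same mean and power spectrum, scrambled phases
```

Because the mean (bin 0) and all amplitudes are untouched, the surrogate preserves the series' energy exactly (Parseval), while any trend is destroyed by the phase scrambling, which is the artefact the proposed technique mitigates.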
NASA Astrophysics Data System (ADS)
Gemitzi, Alexandra; Stefanopoulos, Kyriakos
2011-06-01
Groundwaters and their dependent ecosystems are affected both by meteorological conditions and by human interventions, mainly in the form of groundwater abstractions for irrigation needs. This work aims at investigating the quantitative effects of meteorological conditions and human intervention on groundwater resources and their dependent ecosystems. Various seasonal Auto-Regressive Integrated Moving Average (ARIMA) models with external predictor variables were used in order to model the influence of meteorological conditions and human intervention on the groundwater level time series. Initially, a seasonal ARIMA model that simulates the abstraction time series using temperature (T) as an external predictor variable was prepared. Thereafter, seasonal ARIMA models were developed in order to simulate the groundwater level time series at 8 monitoring locations, using the appropriate predictor variables determined for each individual case. The spatial component was introduced through the use of Geographical Information Systems (GIS). Application of the proposed methodology took place in the Neon Sidirochorion alluvial aquifer (Northern Greece), for which a 7-year long time series (i.e., 2003-2010) of piezometric and groundwater abstraction data exists. According to the developed ARIMA models, three distinct groups of groundwater level time series exist: the first proves to be dependent only on the meteorological parameters; the second group demonstrates a mixed dependence both on meteorological conditions and on human intervention; whereas the third group shows a clear influence of human intervention. Moreover, there is evidence that groundwater abstraction has affected an important protected ecosystem.
Code of Federal Regulations, 2010 CFR
2010-04-01
... can be sold at or near their carrying value within a reasonably short period of time and either: (i... portion. (6) Series of a series company means any class or series of a registered investment company that issues two or more classes or series of preferred or special stock, each of which is preferred over all...
Computing the multifractal spectrum from time series: an algorithmic approach.
Harikrishnan, K P; Misra, R; Ambika, G; Amritkar, R E
2009-12-01
We show that the existing methods for computing the f(alpha) spectrum from a time series can be improved by using a new algorithmic scheme. The scheme relies on the basic idea that the smooth convex profile of a typical f(alpha) spectrum can be fitted with an analytic function involving a set of four independent parameters. While the standard existing schemes [P. Grassberger et al., J. Stat. Phys. 51, 135 (1988); A. Chhabra and R. V. Jensen, Phys. Rev. Lett. 62, 1327 (1989)] generally compute only an incomplete f(alpha) spectrum (usually the top portion), we show that this can be overcome by an algorithmic approach, which is automated to compute the D(q) and f(alpha) spectra from a time series for any embedding dimension. The scheme is first tested with the logistic attractor with known f(alpha) curve and subsequently applied to higher-dimensional cases. We also show that the scheme can be effectively adapted for analyzing practical time series involving noise, with examples from two widely different real world systems. Moreover, some preliminary results indicating that the set of four independent parameters may be used as diagnostic measures are also included.
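A closed-form test case is useful when validating any D(q) estimator: the binomial multiplicative cascade's partition sums are exact at every depth. This is a generic sanity check of our own, unrelated to the authors' four-parameter fit:

```python
import math

def cascade(p, depth):
    """Binomial multiplicative cascade: a toy multifractal measure whose
    D(q) spectrum is known in closed form, D(q) = -log2(p^q + (1-p)^q)/(q-1)."""
    weights = [1.0]
    for _ in range(depth):
        weights = [w * f for w in weights for f in (p, 1 - p)]
    return weights

def dq(weights, q):
    """Generalized dimension D(q), q != 1, from the partition sum over
    boxes of size 2^-depth: D(q) = log(sum w^q) / ((q-1) * log(box size))."""
    depth = round(math.log2(len(weights)))
    z = sum(w ** q for w in weights)
    return math.log(z) / ((q - 1) * math.log(2.0 ** -depth))

w = cascade(0.7, 10)
print(dq(w, 2))  # -log2(0.58), about 0.786
```

The f(alpha) curve follows from D(q) by the usual Legendre transform; an estimator that cannot reproduce this analytic case on clean data should not be trusted on noisy series.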
An introduction to chaotic and random time series analysis
NASA Technical Reports Server (NTRS)
Scargle, Jeffrey D.
1989-01-01
The origin of chaotic behavior and the relation of chaos to randomness are explained. Two mathematical results are described: (1) a representation theorem guarantees the existence of a specific time-domain model for chaos and addresses the relation between chaotic, random, and strictly deterministic processes; (2) a theorem assures that information on the behavior of a physical system in its complete state space can be extracted from time-series data on a single observable. Focus is placed on an important connection between the dynamical state space and an observable time series. These two results lead to a practical deconvolution technique combining standard random process modeling methods with new embedding techniques.
A Space Affine Matching Approach to fMRI Time Series Analysis.
Chen, Liang; Zhang, Weishi; Liu, Hongbo; Feng, Shigang; Chen, C L Philip; Wang, Huili
2016-07-01
For fMRI time series analysis, an important challenge is to overcome the potential delay between the hemodynamic response signal and the cognitive stimuli signal, namely the same frequency but different phase (SFDP) problem. In this paper, a novel space affine matching feature is presented by introducing time domain and frequency domain features. The time domain feature is used to discern different stimuli, while the frequency domain feature is used to eliminate the delay. We then propose a space affine matching (SAM) algorithm to match fMRI time series by our affine feature, in which a normal vector is estimated using gradient descent to find the optimal time series match. The experimental results illustrate that the SAM algorithm is insensitive to the delay between the hemodynamic response signal and the cognitive stimuli signal. Our approach significantly outperforms the GLM method when such a delay exists. The approach can help us solve the SFDP problem in fMRI time series matching and is thus of great promise for revealing brain dynamics.
NASA Technical Reports Server (NTRS)
Menenti, M.; Azzali, S.; Verhoef, W.; Van Swol, R.
1993-01-01
Examples are presented of applications of a fast Fourier transform algorithm to analyze time series of images of Normalized Difference Vegetation Index values. The results obtained for a case study on Zambia indicated that differences in vegetation development among map units of an existing agroclimatic map were not significant, while reliable differences were observed among the map units obtained using the Fourier analysis.
Time Series Modelling of Syphilis Incidence in China from 2005 to 2012
Zhang, Xingyu; Zhang, Tao; Pei, Jiao; Liu, Yuanyuan; Li, Xiaosong; Medrano-Gracia, Pau
2016-01-01
Background The infection rate of syphilis in China has increased dramatically in recent decades, becoming a serious public health concern. Early prediction of syphilis is therefore of great importance for health planning and management. Methods In this paper, we analyzed surveillance time series data for primary, secondary, tertiary, congenital and latent syphilis in mainland China from 2005 to 2012. Seasonality and long-term trend were explored with decomposition methods. Autoregressive integrated moving average (ARIMA) was used to fit a univariate time series model of syphilis incidence. A separate multivariable time series for each syphilis type was also tested using an autoregressive integrated moving average model with exogenous variables (ARIMAX). Results The syphilis incidence rates increased three-fold from 2005 to 2012. All syphilis time series showed strong seasonality and an increasing long-term trend. Both ARIMA and ARIMAX models fitted and estimated syphilis incidence well. All univariate time series showed the highest goodness-of-fit with the ARIMA(0,0,1)×(0,1,1) model. Conclusion Time series analysis was an effective tool for modelling the historical and future incidence of syphilis in China. The ARIMAX model showed superior performance to the ARIMA model for modelling syphilis incidence. Time series correlations existed between the models for primary, secondary, tertiary, congenital and latent syphilis. PMID:26901682
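The decomposition step used to explore seasonality and long-term trend can be sketched with the classical 2x12 centered moving average; the monthly series below is synthetic, not the actual surveillance data:

```python
# Classical decomposition: trend via a 2x12 centered moving average, then
# seasonal indices as monthly averages of the detrended series.
PERIOD = 12
season = [5, 3, 1, -1, -3, -5, -5, -3, -1, 1, 3, 5]   # sums to zero
series = [20 + 0.1 * t + season[t % PERIOD] for t in range(8 * PERIOD)]

def centered_ma(x, period=PERIOD):
    # 2x12 centered moving average: half weight on the two outermost months.
    half = period // 2
    weights = [0.5] + [1.0] * (period - 1) + [0.5]
    trend = []
    for t in range(half, len(x) - half):
        window = x[t - half:t + half + 1]
        trend.append(sum(w * v for w, v in zip(weights, window)) / period)
    return trend

trend = centered_ma(series)
# Seasonal indices: average detrended value for each calendar month.
detrended = [series[t + PERIOD // 2] - trend[t] for t in range(len(trend))]
by_month = {}
for t, d in enumerate(detrended):
    by_month.setdefault((t + PERIOD // 2) % PERIOD, []).append(d)
indices = {m: sum(v) / len(v) for m, v in by_month.items()}
```

Because the synthetic seasonal component sums to zero over a year, the moving average recovers the linear trend exactly and the indices recover the seasonal pattern; on real incidence data the same construction gives approximate estimates that an ARIMA/ARIMAX fit then models formally.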
Time irreversibility and intrinsics revealing of series with complex network approach
NASA Astrophysics Data System (ADS)
Xiong, Hui; Shang, Pengjian; Xia, Jianan; Wang, Jing
2018-06-01
In this work, we analyze time series on the basis of the visibility graph algorithm that maps the original series into a graph. By taking into account the full information carried by the signals, the time irreversibility and fractal behavior of the series are evaluated from a complex network perspective, and the signals considered are further classified from different aspects. The reliability of the proposed analysis is supported by numerical simulations on synthesized uncorrelated random noise, short-term correlated chaotic systems and long-term correlated fractal processes, and by empirical analysis on daily closing prices of eleven worldwide stock indices. The obtained results suggest that finite size has a significant effect on the evaluation, and that there might be no direct relation between the time irreversibility and long-range correlation of series. Similarity and dissimilarity between stock indices are also indicated from regional and global perspectives, showing the existence of multiple features of the underlying systems.
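The natural visibility graph mapping that underlies this analysis can be sketched in a few lines (the irreversibility and fractal measures built on top of it are not reproduced here):

```python
def visibility_edges(x):
    # Natural visibility graph (Lacasa et al.): nodes i < j are linked when
    # every intermediate sample lies strictly below the line joining them.
    n = len(x)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if all(x[k] < x[j] + (x[i] - x[j]) * (j - k) / (j - i)
                   for k in range(i + 1, j)):
                edges.add((i, j))
    return edges

def degrees(x):
    # Degree sequence of the visibility graph, the basic network observable.
    edges = visibility_edges(x)
    deg = [0] * len(x)
    for i, j in edges:
        deg[i] += 1
        deg[j] += 1
    return deg
```

Consecutive samples are always mutually visible, so the graph is connected; peaks acquire high degree, which is what makes the degree distribution informative about the series' structure.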
Providing web-based tools for time series access and analysis
NASA Astrophysics Data System (ADS)
Eberle, Jonas; Hüttich, Christian; Schmullius, Christiane
2014-05-01
Time series information is widely used in environmental change analyses and is also essential information for stakeholders and governmental agencies. However, a challenging issue is the processing of raw data and the execution of time series analysis. In most cases, data have to be found, downloaded, processed and even converted to the correct data format prior to executing time series analysis tools. Data have to be prepared for use in different existing software packages. Several packages like TIMESAT (Jönsson & Eklundh, 2004) for phenological studies, BFAST (Verbesselt et al., 2010) for breakpoint detection, and GreenBrown (Forkel et al., 2013) for trend calculations are provided as open-source software and can be executed from the command line. This is needed if data pre-processing and time series analysis are to be automated. To bring both parts, automated data access and data analysis, together, a web-based system was developed to provide access to satellite-based time series data and to the above-mentioned analysis tools. Users of the web portal are able to specify a point or a polygon and an available dataset (e.g., Vegetation Indices and Land Surface Temperature datasets from NASA MODIS). The data are then processed and provided as a time series CSV file. Afterwards the user can select an analysis tool, which is then executed on the server. The final data (CSV, plot images, GeoTIFFs) are visualized in the web portal and can be downloaded for further usage. As a first use case, we built a complementary web-based system with NASA MODIS products for Germany and parts of Siberia based on the Earth Observation Monitor (www.earth-observation-monitor.net). The aim of this work is to make time series analysis with existing tools as easy as possible so that users can focus on the interpretation of the results. References: Jönsson, P. and L. Eklundh (2004). TIMESAT - a program for analysing time-series of satellite sensor data.
Computers and Geosciences 30, 833-845. Verbesselt, J., R. Hyndman, G. Newnham and D. Culvenor (2010). Detecting trend and seasonal changes in satellite image time series. Remote Sensing of Environment, 114, 106-115. DOI: 10.1016/j.rse.2009.08.014 Forkel, M., N. Carvalhais, J. Verbesselt, M. Mahecha, C. Neigh and M. Reichstein (2013). Trend Change Detection in NDVI Time Series: Effects of Inter-Annual Variability and Methodology. Remote Sensing 5, 2113-2144.
Analysis of Site Position Time Series Derived From Space Geodetic Solutions
NASA Astrophysics Data System (ADS)
Angermann, D.; Meisel, B.; Kruegel, M.; Tesmer, V.; Miller, R.; Drewes, H.
2003-12-01
This presentation deals with the analysis of station coordinate time series obtained from VLBI, SLR, GPS and DORIS solutions. We also present time series for the origin and scale derived from these solutions and discuss their contribution to the realization of the terrestrial reference frame. For these investigations we used SLR and VLBI solutions computed at DGFI with the software systems DOGS (SLR) and OCCAM (VLBI). The GPS and DORIS time series were obtained from weekly station coordinates solutions provided by the IGS, and from the joint DORIS analysis center (IGN-JPL). We analysed the time series with respect to various aspects, such as non-linear motions, periodic signals and systematic differences (biases). A major focus is on a comparison of the results at co-location sites in order to identify technique- and/or solution related problems. This may also help to separate and quantify possible effects, and to understand the origin of still existing discrepancies. Technique-related systematic effects (biases) should be reduced to the highest possible extent, before using the space geodetic solutions for a geophysical interpretation of seasonal signals in site position time series.
Empirical intrinsic geometry for nonlinear modeling and time series filtering.
Talmon, Ronen; Coifman, Ronald R
2013-07-30
In this paper, we present a method for time series analysis based on empirical intrinsic geometry (EIG). EIG enables one to reveal the low-dimensional parametric manifold as well as to infer the underlying dynamics of high-dimensional time series. By incorporating concepts of information geometry, this method extends existing geometric analysis tools to support stochastic settings and parametrizes the geometry of empirical distributions. However, the statistical models are not required as priors; hence, EIG may be applied to a wide range of real signals without existing definitive models. We show that the inferred model is noise-resilient and invariant under different observation and instrumental modalities. In addition, we show that it can be extended efficiently to newly acquired measurements in a sequential manner. These two advantages enable us to revisit the Bayesian approach and incorporate empirical dynamics and intrinsic geometry into a nonlinear filtering framework. We show applications to nonlinear and non-Gaussian tracking problems as well as to acoustic signal localization.
NASA Astrophysics Data System (ADS)
Pal, Mayukha; Madhusudana Rao, P.; Manimaran, P.
2014-12-01
We apply the recently developed multifractal detrended cross-correlation analysis method to investigate the cross-correlation behavior and fractal nature between two non-stationary time series. We analyze the daily return prices of gold, West Texas Intermediate and Brent crude oil, and foreign exchange rate data, over a period of 18 years. The cross correlation has been measured quantitatively from the Hurst scaling exponents and the singularity spectrum. From the results, the existence of multifractal cross-correlation between all of these time series is found. We also found that the cross correlation between gold and oil prices possesses uncorrelated behavior, while the remaining bivariate time series possess persistent behavior. It was observed for five bivariate series that the cross-correlation exponents are less than the calculated average generalized Hurst exponents (GHE) for q<0 and greater than the GHE for q>0, and for one bivariate series the cross-correlation exponent is greater than the GHE for all q values.
Multivariable nonlinear analysis of foreign exchange rates
NASA Astrophysics Data System (ADS)
Suzuki, Tomoya; Ikeguchi, Tohru; Suzuki, Masuo
2003-05-01
We analyze multivariable time series of foreign exchange rates: price movements, which have often been analyzed, as well as dealing time intervals and spreads between bid and ask prices. Treating dealing time intervals as event timings, analogous to neuronal firings, we use raster plots (RPs) and peri-stimulus time histograms (PSTHs), which are popular methods in the field of neurophysiology. Introducing special processing to obtain RPs and PSTHs for exchange rate time series, we discover that there exists dynamical interaction among the three variables. We also find that adopting multiple variables improves prediction accuracy.
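The PSTH construction borrowed from neurophysiology can be sketched as follows, with event and stimulus times as plain numbers; the window and bin width are illustrative:

```python
def psth(event_times, stimulus_times, window=5.0, bin_width=1.0):
    # Peri-stimulus time histogram: for each stimulus, count events falling
    # in fixed-width lag bins during the window after the stimulus.
    nbins = int(window / bin_width)
    counts = [0] * nbins
    for s in stimulus_times:
        for e in event_times:
            lag = e - s
            if 0 <= lag < window:
                counts[int(lag // bin_width)] += 1
    return counts
```

In the paper's setting the "events" would be dealing times and the "stimuli" could be, for example, large price moves; peaks in the histogram then reveal dealing activity locked to such triggers.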
77 FR 18872 - Availability of Electric Power Sources
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-28
... the first time that document is referenced. Revision 1 of Regulatory Guide 1.93 is available.... Introduction The NRC is issuing a revision to an existing guide in the NRC's ``Regulatory Guide'' series. This series was developed to describe and make available to the public information such as methods that are...
78 FR 36278 - Fuel Oil Systems for Emergency Power Supplies
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-17
... this notice (if that document is available in ADAMS) is provided the first time that a document is..., Office of Nuclear Regulatory Research, U.S. Nuclear Regulatory Commission, Washington, DC 20555-0001... issuing a revision to an existing guide in the NRC's ``Regulatory Guide'' series. This series was...
Adventures in Modern Time Series Analysis: From the Sun to the Crab Nebula and Beyond
NASA Technical Reports Server (NTRS)
Scargle, Jeffrey
2014-01-01
With the generation of long, precise, and finely sampled time series, the Age of Digital Astronomy is uncovering and elucidating energetic dynamical processes throughout the Universe. Fulfilling these opportunities requires effective data analysis techniques that rapidly and automatically implement advanced concepts. The Time Series Explorer, under development in collaboration with Tom Loredo, provides tools ranging from simple but optimal histograms to time and frequency domain analysis for arbitrary data modes with any time sampling. Much of this development owes its existence to Joe Bredekamp and the encouragement he provided over several decades. Sample results for solar chromospheric activity, gamma-ray activity in the Crab Nebula, active galactic nuclei and gamma-ray bursts will be displayed.
Deconvolution of mixing time series on a graph
Blocker, Alexander W.; Airoldi, Edoardo M.
2013-01-01
In many applications we are interested in making inference on latent time series from indirect measurements, which are often low-dimensional projections resulting from mixing or aggregation. Positron emission tomography, super-resolution, and network traffic monitoring are some examples. Inference in such settings requires solving a sequence of ill-posed inverse problems, yt = Axt, where the projection mechanism provides information on A. We consider problems in which A specifies mixing on a graph of time series that are bursty and sparse. We develop a multilevel state-space model for mixing time series and an efficient approach to inference. A simple model is used to calibrate regularization parameters that lead to efficient inference in the multilevel state-space model. We apply this method to the problem of estimating point-to-point traffic flows on a network from aggregate measurements. Our solution outperforms existing methods for this problem, and our two-stage approach suggests an efficient inference strategy for multilevel models of multivariate time series. PMID:25309135
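A minimal sketch of the regularized inverse step for y_t = A x_t, assuming a small dense A; the paper's multilevel state-space model and its calibration are not reproduced here:

```python
def solve(M, b):
    # Gaussian elimination with partial pivoting for small dense systems.
    n = len(M)
    A = [row[:] + [b[i]] for i, row in enumerate(M)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n + 1):
                A[r][c] -= f * A[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (A[r][n] - sum(A[r][c] * x[c] for c in range(r + 1, n))) / A[r][r]
    return x

def ridge_deconvolve(A, y, lam=1e-6):
    # Regularized least squares: solve (A^T A + lam I) x = A^T y.
    m, n = len(A), len(A[0])
    AtA = [[sum(A[r][i] * A[r][j] for r in range(m)) + (lam if i == j else 0.0)
            for j in range(n)] for i in range(n)]
    Aty = [sum(A[r][i] * y[r] for r in range(m)) for i in range(n)]
    return solve(AtA, Aty)
```

The ridge term lam stabilizes the ill-posed inversion; in the traffic-flow application, sparsity and burstiness priors replace this generic penalty.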
Classification of time-series images using deep convolutional neural networks
NASA Astrophysics Data System (ADS)
Hatami, Nima; Gavet, Yann; Debayle, Johan
2018-04-01
Convolutional Neural Networks (CNNs) have achieved great success in image recognition by automatically learning a hierarchical feature representation from raw data. While the majority of the Time-Series Classification (TSC) literature is focused on 1D signals, this paper uses Recurrence Plots (RPs) to transform time series into 2D texture images and then takes advantage of a deep CNN classifier. Image representation of time series introduces feature types that are not available for 1D signals, so TSC can be treated as a texture image recognition task. The CNN model also allows learning different levels of representation together with a classifier, jointly and automatically. Therefore, using RPs and CNNs in a unified framework is expected to boost the recognition rate of TSC. Experimental results on the UCR time-series classification archive demonstrate competitive accuracy of the proposed approach, compared not only to existing deep architectures but also to state-of-the-art TSC algorithms.
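The RP transform that turns a 1D series into the 2D texture image consumed by the CNN can be sketched in one function; the threshold eps is a free parameter:

```python
def recurrence_plot(x, eps):
    # Binary recurrence matrix: R[i][j] = 1 when samples i and j are within
    # eps of each other (embedding dimension 1 for simplicity).
    n = len(x)
    return [[1 if abs(x[i] - x[j]) <= eps else 0 for j in range(n)]
            for i in range(n)]
```

The matrix is symmetric with a unit diagonal; its texture (diagonal lines for determinism, blocks for laminar states) is exactly the structure a CNN can learn to discriminate.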
How bootstrap can help in forecasting time series with more than one seasonal pattern
NASA Astrophysics Data System (ADS)
Cordeiro, Clara; Neves, M. Manuela
2012-09-01
The search for the future is an appealing challenge in time series analysis. The diversity of forecasting methodologies is inevitable and still expanding. Exponential smoothing methods are the launch platform for modelling and forecasting in time series analysis. Recently this methodology has been combined with bootstrapping, revealing good performance. The Boot.EXPOS algorithm, combining exponential smoothing and bootstrap methodologies, has shown promising results for forecasting time series with one seasonal pattern. For more than one seasonal pattern, the double seasonal Holt-Winters methods and the corresponding exponential smoothing methods were developed. A new challenge was to combine these seasonal methods with the bootstrap and carry over a resampling scheme similar to that used in the Boot.EXPOS procedure. The performance of this partnership is illustrated on some well-known data sets available in software.
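The core idea, exponential smoothing plus a residual bootstrap, can be sketched for the simplest non-seasonal case; Boot.EXPOS itself selects the smoothing model and resamples more carefully, so this is only a schematic:

```python
import random

def ses(x, alpha=0.5):
    # Simple exponential smoothing; returns the level path and final level
    # (the one-step-ahead point forecast).
    level = x[0]
    levels = [level]
    for v in x[1:]:
        level = alpha * v + (1 - alpha) * level
        levels.append(level)
    return levels, level

def boot_forecast(x, alpha=0.5, reps=200, seed=42):
    # Residual bootstrap around the SES point forecast: resample one-step
    # in-sample errors and add them to the point forecast.
    levels, level = ses(x, alpha)
    resid = [v - f for v, f in zip(x[1:], levels[:-1])]
    rng = random.Random(seed)
    return level, [level + rng.choice(resid) for _ in range(reps)]
```

The simulated forecasts give an empirical predictive distribution from which intervals can be read off; the seasonal extension resamples residuals of the double seasonal Holt-Winters fit instead.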
Rapid Calculation of Spacecraft Trajectories Using Efficient Taylor Series Integration
NASA Technical Reports Server (NTRS)
Scott, James R.; Martini, Michael C.
2011-01-01
A variable-order, variable-step Taylor series integration algorithm was implemented in NASA Glenn's SNAP (Spacecraft N-body Analysis Program) code. SNAP is a high-fidelity trajectory propagation program that can propagate the trajectory of a spacecraft about virtually any body in the solar system. The Taylor series algorithm's very high order accuracy and excellent stability properties lead to large reductions in computer time relative to the code's existing 8th order Runge-Kutta scheme. Head-to-head comparison on near-Earth, lunar, Mars, and Europa missions showed that Taylor series integration is 15.8 times faster than Runge-Kutta on average, and is more accurate. These speedups were obtained for calculations involving central body, other body, thrust, and drag forces. Similar speedups have been obtained for calculations that include the J2 spherical harmonic for central body gravitation. The algorithm includes a step size selection method that directly calculates the step size and never requires a repeat step. High-order Taylor series integration algorithms have been shown to provide major reductions in computer time over conventional integration methods in numerous scientific applications. The objective here was to directly implement Taylor series integration in an existing trajectory analysis code and demonstrate that large reductions in computer time (order of magnitude) could be achieved while simultaneously maintaining high accuracy. This software greatly accelerates the calculation of spacecraft trajectories. At each time level, the spacecraft position, velocity, and mass are expanded in a high-order Taylor series whose coefficients are obtained through efficient differentiation arithmetic. This makes it possible to take very large time steps at minimal cost, resulting in large savings in computer time.
The Taylor series algorithm is implemented primarily through three subroutines: (1) a driver routine that automatically introduces auxiliary variables and sets up initial conditions and integrates; (2) a routine that calculates system reduced derivatives using recurrence relations for quotients and products; and (3) a routine that determines the step size and sums the series. The order of accuracy used in a trajectory calculation is arbitrary and can be set by the user. The algorithm directly calculates the motion of other planetary bodies and does not require ephemeris files (except to start the calculation). The code also runs with Taylor series and Runge-Kutta used interchangeably for different phases of a mission.
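The coefficient-recurrence idea can be illustrated on a toy problem. For the harmonic oscillator u' = v, v' = -u, the Taylor coefficients obey simple recurrences, and one step sums the series, much as the SNAP routines do for the full force model (this sketch is not the SNAP algorithm):

```python
import math

def taylor_step(u, v, h, order=20):
    # One Taylor step for u' = v, v' = -u using the coefficient recurrences
    # u_{k+1} = v_k / (k+1),  v_{k+1} = -u_k / (k+1).
    uc, vc = [u], [v]
    for k in range(order):
        uc.append(vc[k] / (k + 1))
        vc.append(-uc[k] / (k + 1))
    pu = sum(c * h ** k for k, c in enumerate(uc))
    pv = sum(c * h ** k for k, c in enumerate(vc))
    return pu, pv

u, v = 1.0, 0.0              # u(0) = cos(0), v(0) = -sin(0)
h, steps = math.pi / 8, 8
for _ in range(steps):
    u, v = taylor_step(u, v, h)
# After t = pi the exact solution gives u = cos(pi) = -1, v = -sin(pi) = 0.
```

With order 20, each large step is accurate to near machine precision, which is the property that lets Taylor integrators take very long steps cheaply compared with fixed-order Runge-Kutta schemes.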
Copulas and time series with long-ranged dependencies.
Chicheportiche, Rémy; Chakraborti, Anirban
2014-04-01
We review ideas on temporal dependencies and recurrences in discrete time series from several areas of natural and social sciences. We revisit existing studies and redefine the relevant observables in the language of copulas (joint laws of the ranks). We propose that copulas provide an appropriate mathematical framework to study nonlinear time dependencies and related concepts, like aftershocks, the Omori law, recurrences, and waiting times. We also critically argue, using this global approach, that previous phenomenological attempts involving only a long-ranged autocorrelation function lacked complexity in that they were essentially monoscale.
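Working "in the language of copulas" starts from the rank transform. A minimal sketch of normalized ranks and the rank correlation they induce (assuming no ties):

```python
def ranks(x):
    # Ranks normalized to (0, 1]; copulas are joint laws of these ranks.
    order = sorted(range(len(x)), key=lambda i: x[i])
    r = [0.0] * len(x)
    for pos, i in enumerate(order):
        r[i] = (pos + 1) / len(x)
    return r

def spearman(x, y):
    # Spearman's rho: Pearson correlation computed on the rank series,
    # a copula-based (margin-free) dependence measure.
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den
```

Because the measure depends only on ranks, it is invariant under any monotone transformation of the marginals, which is precisely why copula-based observables separate dependence structure from marginal behavior.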
Robust extrema features for time-series data analysis.
Vemulapalli, Pramod K; Monga, Vishal; Brennan, Sean N
2013-06-01
The extraction of robust features for comparing and analyzing time series is a fundamentally important problem. Research efforts in this area encompass dimensionality reduction using popular signal analysis tools such as the discrete Fourier and wavelet transforms, various distance metrics, and the extraction of interest points from time series. Recently, extrema features for analysis of time-series data have assumed increasing significance because of their natural robustness under a variety of practical distortions, their economy of representation, and their computational benefits. Invariably, the process of encoding extrema features is preceded by filtering of the time series with an intuitively motivated filter (e.g., for smoothing), and subsequent thresholding to identify robust extrema. We define the properties of robustness, uniqueness, and cardinality as a means to identify the design choices available in each step of the feature generation process. Unlike existing methods, which utilize filters "inspired" from either domain knowledge or intuition, we explicitly optimize the filter based on training time series to optimize robustness of the extracted extrema features. We demonstrate further that the underlying filter optimization problem reduces to an eigenvalue problem and has a tractable solution. An encoding technique that enhances control over cardinality and uniqueness is also presented. Experimental results obtained for the problem of time series subsequence matching establish the merits of the proposed algorithm.
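The filter-then-threshold pipeline described above can be sketched as follows; the fixed smoothing kernel here stands in for the optimized filter the paper derives:

```python
def extrema_features(x, kernel=(0.25, 0.5, 0.25), threshold=0.5):
    # Smooth with a (fixed, not optimized) filter, then keep local extrema
    # whose deviation from both neighbors exceeds the threshold.
    k = len(kernel) // 2
    smooth = [sum(kernel[j] * x[i + j - k] for j in range(len(kernel)))
              for i in range(k, len(x) - k)]
    feats = []
    for i in range(1, len(smooth) - 1):
        left, mid, right = smooth[i - 1], smooth[i], smooth[i + 1]
        if (mid - left) * (mid - right) > 0 and \
           min(abs(mid - left), abs(mid - right)) >= threshold:
            feats.append((i + k, mid, 'max' if mid > left else 'min'))
    return feats
```

Each feature records the index, smoothed amplitude, and polarity of a robust extremum; subsequence matching then compares these sparse feature lists rather than the raw samples.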
Using SAR satellite data time series for regional glacier mapping
NASA Astrophysics Data System (ADS)
Winsvold, Solveig H.; Kääb, Andreas; Nuth, Christopher; Andreassen, Liss M.; van Pelt, Ward J. J.; Schellenberger, Thomas
2018-03-01
With dense SAR satellite data time series it is possible to map surface and subsurface glacier properties that vary in time. On Sentinel-1A and RADARSAT-2 backscatter time series images over mainland Norway and Svalbard, we outline how to map glaciers using descriptive methods. We present five application scenarios. The first shows potential for tracking transient snow lines with SAR backscatter time series and correlates with both optical satellite images (Sentinel-2A and Landsat 8) and equilibrium line altitudes derived from in situ surface mass balance data. In the second application scenario, time series representation of glacier facies corresponding to SAR glacier zones shows potential for a more accurate delineation of the zones and how they change in time. The third application scenario investigates the firn evolution using dense SAR backscatter time series together with a coupled energy balance and multilayer firn model. We find strong correlation between backscatter signals with both the modeled firn air content and modeled wetness in the firn. In the fourth application scenario, we highlight how winter rain events can be detected in SAR time series, revealing important information about the area extent of internal accumulation. In the last application scenario, averaged summer SAR images were found to have potential in assisting the process of mapping glaciers outlines, especially in the presence of seasonal snow. Altogether we present examples of how to map glaciers and to further understand glaciological processes using the existing and future massive amount of multi-sensor time series data.
Forbidden patterns in financial time series
NASA Astrophysics Data System (ADS)
Zanin, Massimiliano
2008-03-01
The existence of forbidden patterns, i.e., certain missing sequences in a given time series, is a recently proposed instrument of potential application in the study of time series. Forbidden patterns are related to the permutation entropy, which has the basic properties of classic chaos indicators, such as the Lyapunov exponent or Kolmogorov entropy, thus allowing one to separate deterministic (usually chaotic) from random series; however, it requires fewer values of the series to be calculated, and it is suitable for use with small datasets. In this paper, the appearance of forbidden patterns is studied in different economic indicators such as stock indices (Dow Jones Industrial Average and Nasdaq Composite), NYSE stocks (IBM and Boeing), and others (ten-year Bond interest rate), to find evidence of deterministic behavior in their evolution. Moreover, the rate of appearance of the forbidden patterns is calculated, and some considerations about the underlying dynamics are suggested.
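Counting ordinal patterns, and spotting the forbidden ones, takes only a few lines. The fully chaotic logistic map is a standard example in which the doubly decreasing length-3 pattern cannot occur:

```python
from collections import Counter

def ordinal_patterns(x, m=3):
    # Count order patterns of length m; patterns with zero count are
    # "forbidden" for the underlying dynamics (up to finite-sample effects).
    c = Counter()
    for t in range(len(x) - m + 1):
        w = x[t:t + m]
        c[tuple(sorted(range(m), key=w.__getitem__))] += 1
    return c

# Fully chaotic logistic map: x -> 4x(1 - x).
x, series = 0.4, []
for _ in range(2000):
    series.append(x)
    x = 4.0 * x * (1.0 - x)
patterns = ordinal_patterns(series)
# The pattern (2, 1, 0), i.e. two consecutive decreases, never appears:
# a decrease requires x > 3/4, but then f(x) < 3/4, so it cannot decrease again.
```

A random series of the same length would, with overwhelming probability, show all six length-3 patterns, which is the basis of the deterministic-versus-random discrimination applied to the financial series.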
Inhomogeneous scaling behaviors in Malaysian foreign currency exchange rates
NASA Astrophysics Data System (ADS)
Muniandy, S. V.; Lim, S. C.; Murugan, R.
2001-12-01
In this paper, we investigate the fractal scaling behaviors of foreign currency exchange rates with respect to the Malaysian currency, Ringgit Malaysia. These time series are examined piecewise, before and after the currency control imposed on 1 September 1998, using the monofractal model based on fractional Brownian motion. The global Hurst exponents are determined using R/S analysis, detrended fluctuation analysis and the method of second moment using the correlation coefficients. The limitations of these monofractal analyses are discussed. The usual multifractal analysis reveals that there exists a wide range of Hurst exponents in each of the time series. A new method of modelling multifractal time series based on multifractional Brownian motion with time-varying Hurst exponents is studied.
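A minimal sketch of the R/S estimate of the global Hurst exponent used in such analyses; the window sizes and sample length are illustrative:

```python
import math
import random

def rescaled_range(x):
    # R/S statistic of one window: range of cumulative deviations from the
    # window mean, divided by the window standard deviation.
    n = len(x)
    mean = sum(x) / n
    cum, peak, trough = 0.0, 0.0, 0.0
    for v in x:
        cum += v - mean
        peak = max(peak, cum)
        trough = min(trough, cum)
    std = (sum((v - mean) ** 2 for v in x) / n) ** 0.5
    return (peak - trough) / std

def hurst_rs(x, sizes=(8, 16, 32, 64, 128)):
    # Global Hurst exponent: slope of log mean(R/S) against log window size.
    pts = []
    for w in sizes:
        rs = [rescaled_range(x[i:i + w]) for i in range(0, len(x) - w + 1, w)]
        pts.append((math.log(w), math.log(sum(rs) / len(rs))))
    mx = sum(a for a, _ in pts) / len(pts)
    my = sum(b for _, b in pts) / len(pts)
    return (sum((a - mx) * (b - my) for a, b in pts) /
            sum((a - mx) ** 2 for a, _ in pts))

rng = random.Random(7)
noise = [rng.gauss(0.0, 1.0) for _ in range(1024)]
H = hurst_rs(noise)   # white noise: near 0.5, biased slightly upward at finite n
```

Persistent series give H above 0.5 and anti-persistent series below; the multifractal extension in the paper replaces this single global exponent with a whole spectrum of local ones.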
Clinical time series prediction: Toward a hierarchical dynamical system framework.
Liu, Zitao; Hauskrecht, Milos
2015-09-01
Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding the patient's condition, the dynamics of a disease, the effect of various patient management interventions, and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Our hierarchical dynamical system framework for modeling clinical time series combines the advantages of two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. We tested our framework by first learning the time series model from data for the patients in the training set, and then using it to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance.
2016-09-01
... (BET) Method, Scientific Operating Procedure Series: SOP-C. Jonathon Brame and Chris Griggs, Environmental Laboratory, U.S. Army Engineer Research and Development Center, September 2016.
NASA Technical Reports Server (NTRS)
Mlynczak, Martin G.; Martin-Torres, F. Javier; Mertens, Christopher J.; Marshall, B. Thomas; Thompson, R. Earl; Kozyra, Janet U.; Remsberg, Ellis E.; Gordley, Larry L.; Russell, James M.; Woods, Thomas
2008-01-01
We examine time series of the daily global power (W) radiated by carbon dioxide (at 15 microns) and by nitric oxide (at 5.3 microns) from the Earth's thermosphere between 100 km and 200 km altitude. Also examined is a time series of the daily absorbed solar ultraviolet power in the same altitude region in the wavelength span 0 to 175 nm. The infrared data are derived from the SABER instrument and the solar data from the SEE instrument, both on the NASA TIMED satellite. The time series cover nearly 5 years, from 2002 through 2006. The infrared and solar time series exhibit a decrease in radiated and absorbed power consistent with the declining phase of the 11-year solar cycle. The infrared time series also exhibits high-frequency variations that are not evident in the solar power time series. Spectral analysis shows a statistically significant 9-day periodicity in the infrared data but not in the solar data. A very strong 9-day periodicity is also found in the time series of the daily Ap and Kp geomagnetic indices. These 9-day periodicities are linked to the recurrence of coronal holes on the Sun. These results demonstrate a direct coupling between the upper atmosphere of the Sun and the infrared energy budget of the thermosphere.
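The spectral-analysis step, locating a significant periodicity in a daily series, can be sketched with a plain discrete periodogram on synthetic data containing a 9-day cycle (the SABER/SEE data and the significance testing are not reproduced):

```python
import math

N = 360                      # roughly a year of daily samples
series = [math.cos(2 * math.pi * t / 9.0) +
          0.3 * math.cos(2 * math.pi * t / 30.0)
          for t in range(N)]

def periodogram(x):
    # Discrete Fourier power at integer frequencies k cycles per record.
    n = len(x)
    power = []
    for k in range(1, n // 2):
        re = sum(v * math.cos(2 * math.pi * k * t / n) for t, v in enumerate(x))
        im = sum(v * math.sin(2 * math.pi * k * t / n) for t, v in enumerate(x))
        power.append((k, (re * re + im * im) / n))
    return power

power = periodogram(series)
k_peak = max(power, key=lambda p: p[1])[0]
period_days = N / k_peak     # dominant period; here it recovers 9 days
```

For unevenly sampled or gappy satellite records, a Lomb-Scargle periodogram would replace this plain DFT, but the peak-finding logic is the same.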
High Speed Solution of Spacecraft Trajectory Problems Using Taylor Series Integration
NASA Technical Reports Server (NTRS)
Scott, James R.; Martini, Michael C.
2008-01-01
Taylor series integration is implemented in a spacecraft trajectory analysis code, the Spacecraft N-body Analysis Program (SNAP), and compared with the code's existing eighth-order Runge-Kutta-Fehlberg time integration scheme. Nine trajectory problems, including near-Earth, lunar, Mars and Europa missions, are analyzed. Head-to-head comparison at five different error tolerances shows that, on average, Taylor series integration is faster than Runge-Kutta-Fehlberg by a factor of 15.8. Results further show that Taylor series integration has superior convergence properties. Taylor series integration can thus provide rapid, highly accurate solutions to spacecraft trajectory problems.
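As a toy illustration of the technique (not SNAP's implementation), a Taylor-series integrator for the oscillator y'' = -y uses the recurrence between series coefficients to take large, high-order steps:

```python
import math

def taylor_step(y, v, h, order=12):
    """One Taylor-series step for y'' = -y, using the coefficient
    recurrence c[k+2] = -c[k] / ((k+1)(k+2))."""
    c = [y, v]
    for k in range(order - 1):
        c.append(-c[k] / ((k + 1) * (k + 2)))
    y_new = sum(ck * h ** k for k, ck in enumerate(c))
    v_new = sum(k * ck * h ** (k - 1) for k, ck in enumerate(c) if k > 0)
    return y_new, v_new

# integrate one-plus period of the oscillator starting at (cos, -sin) = (1, 0)
y, v, h = 1.0, 0.0, 0.25
for _ in range(int(2 * math.pi / h) + 1):   # 26 steps, ending at t = 6.5
    y, v = taylor_step(y, v, h)
```

With order 12 and h = 0.25 the per-step truncation error is near machine precision, which is the property that lets Taylor methods outpace fixed-order Runge-Kutta schemes at tight tolerances.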
NASA Astrophysics Data System (ADS)
Cherednichenko, A. V.; Cherednichenko, A. V.; Cherednichenko, V. S.
2018-01-01
It is shown that a significant connection exists between the amount of runoff and the most important harmonics extracted by harmonic analysis of precipitation time series in river catchment areas. This allowed us to predict the runoff volume for a period of up to 20 years, assuming that the main parameters of the harmonics are preserved over the forecast interval. The results of such a forecast for three river basins of Kazakhstan are presented.
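The extrapolation idea can be sketched as follows: project the record onto its Fourier harmonics, keep the strongest, and evaluate them beyond the record. The 60-point series and its single harmonic are invented for illustration, not taken from the Kazakh records.

```python
import math

def fit_harmonics(x, k):
    """Project a series onto its Fourier harmonics and keep the k largest."""
    n = len(x)
    mean = sum(x) / n
    comps = []
    for j in range(1, n // 2):
        a = 2 / n * sum((x[t] - mean) * math.cos(2 * math.pi * j * t / n) for t in range(n))
        b = 2 / n * sum((x[t] - mean) * math.sin(2 * math.pi * j * t / n) for t in range(n))
        comps.append((a * a + b * b, j, a, b))
    comps.sort(reverse=True)
    return mean, comps[:k]

def extrapolate(mean, comps, n, t):
    """Evaluate the retained harmonics at (possibly future) time t."""
    return mean + sum(a * math.cos(2 * math.pi * j * t / n)
                      + b * math.sin(2 * math.pi * j * t / n)
                      for _, j, a, b in comps)

n = 60  # e.g. 60 years of annual precipitation (synthetic)
series = [10.0 + 3.0 * math.sin(2 * math.pi * 5 * t / n) for t in range(n)]
mean, comps = fit_harmonics(series, 1)
forecast = extrapolate(mean, comps, n, n + 10)  # 10 steps beyond the record
```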
Crossing trend analysis methodology and application for Turkish rainfall records
NASA Astrophysics Data System (ADS)
Şen, Zekâi
2018-01-01
Trend analyses are necessary tools for depicting a possible general increase or decrease in a given time series. There are many trend identification methodologies, such as the Mann-Kendall trend test, Spearman's tau, Sen's slope, regression lines, and Şen's innovative trend analysis. The literature has many papers about the use, pros and cons, and comparisons of these methodologies. In this paper, a completely new approach is proposed based on the crossing properties of a time series. It is suggested that the suitable trend through the centroid of the given time series should have the maximum number of crossings (total number of up-crossings or down-crossings). This approach is applicable whether the time series has a dependent or independent structure, and it does not depend on the type of the probability distribution function. The validity of this method is demonstrated through extensive Monte Carlo simulations and comparison with other existing trend identification methodologies. The methodology is applied to a set of annual daily extreme rainfall time series from different parts of Turkey, which have a physically independent structure.
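A minimal sketch of the crossing idea, assuming candidate trend lines pass through the series centroid and the best slope maximizes the number of up-crossings (the paper's significance machinery is omitted):

```python
import random

def crossing_count(x, slope):
    """Up-crossings of the series about a trend line through its centroid."""
    n = len(x)
    tc = (n - 1) / 2.0
    xc = sum(x) / n
    resid = [x[t] - (xc + slope * (t - tc)) for t in range(n)]
    return sum(1 for t in range(n - 1) if resid[t] <= 0.0 < resid[t + 1])

random.seed(1)
series = [0.5 * t + random.gauss(0.0, 1.0) for t in range(200)]  # true slope 0.5
slopes = [i / 100.0 for i in range(-100, 101)]                   # candidate slopes
best_slope = max(slopes, key=lambda s: crossing_count(series, s))
```

A badly misspecified slope leaves a residual ramp that pins the residuals on one side, so crossings peak when the trend line matches the data.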
Detection of chaos: New approach to atmospheric pollen time-series analysis
NASA Astrophysics Data System (ADS)
Bianchi, M. M.; Arizmendi, C. M.; Sanchez, J. R.
1992-09-01
Pollen and spores are biological particles that are ubiquitous in the atmosphere and are pathologically significant, causing plant diseases and inhalant allergies. One of the main objectives of aerobiological surveys is forecasting. Prediction models are required in order to apply aerobiological knowledge to medical or agricultural practice; a necessary condition for such models is that the dynamics not be chaotic. The existence of chaos is detected through the analysis of a time series. The time series comprises hourly counts of atmospheric pollen grains obtained using a Burkard spore trap from 1987 to 1989 at Mar del Plata. Abraham's method to obtain the correlation dimension was applied. A low, fractal dimension indicates chaotic dynamics. The predictability of models for atmospheric pollen forecasting is discussed.
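A generic correlation-sum estimate of the correlation dimension (in the Grassberger-Procaccia spirit; Abraham's specific procedure differs in detail) looks like this. The uniform-noise series is illustrative: a dimension near the embedding dimension 2 signals noise rather than low-dimensional chaos.

```python
import math, random

def embed(x, dim, tau):
    """Delay-coordinate embedding of a scalar series."""
    return [tuple(x[i + k * tau] for k in range(dim))
            for i in range(len(x) - (dim - 1) * tau)]

def correlation_sum(points, r):
    """Fraction of point pairs closer than r in the max-norm."""
    n = len(points)
    close = sum(1 for i in range(n) for j in range(i + 1, n)
                if max(abs(a - b) for a, b in zip(points[i], points[j])) < r)
    return 2.0 * close / (n * (n - 1))

random.seed(0)
x = [random.uniform(0.0, 1.0) for _ in range(600)]
pts = embed(x, 2, 1)
r1, r2 = 0.05, 0.2
# slope of log C(r) vs log r approximates the correlation dimension
d_est = (math.log(correlation_sum(pts, r2)) -
         math.log(correlation_sum(pts, r1))) / math.log(r2 / r1)
```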
2017-12-29
…indicated as shaded intervals in cyan) is shown in the context of the 5-6 August 2011 storm energetics. These are depicted by the time series of [b… …documented in a series of journal articles [Horvath and Lovell, 2017A; 2017B; 2017C; 2017D]. Our findings contribute to the better understanding of …
New Ground Truth Capability from InSAR Time Series Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buckley, S; Vincent, P; Yang, D
2005-07-13
We demonstrate that next-generation interferometric synthetic aperture radar (InSAR) processing techniques applied to existing data provide rich InSAR ground truth content for exploitation in seismic source identification. InSAR time series analyses utilize tens of interferograms and can be implemented in different ways. In one such approach, conventional InSAR displacement maps are inverted in a final post-processing step. Alternatively, computationally intensive data reduction can be performed with specialized InSAR processing algorithms. The typical final result of these approaches is a synthesized set of cumulative displacement maps. Examples from our recent work demonstrate that these InSAR processing techniques can provide appealing new ground truth capabilities. We construct movies showing the areal and temporal evolution of deformation associated with previous nuclear tests. In other analyses, we extract time histories of centimeter-scale surface displacement associated with tunneling. The potential exists to identify millimeter per year surface movements when sufficient data exists for InSAR techniques to isolate and remove phase signatures associated with digital elevation model errors and the atmosphere.
NASA Astrophysics Data System (ADS)
Piburn, J.; Stewart, R.; Morton, A.
2017-10-01
Identifying erratic or unstable time series is of interest to many fields, and there have recently been successful developments towards this goal. These newly developed methodologies, however, come from domains where several thousand or more temporal observations are typical. This creates a challenge when applying them to time series with far fewer observations, such as in socio-cultural analysis, where a typical time series of interest might consist of only 20-30 annual observations. Most existing methodologies simply cannot say anything interesting with so few data points, yet researchers are still tasked to work within the confines of the data. Recently, a method for characterizing instability in a time series with limited temporal observations was published. This method, the Attribute Stability Index (ASI), uses an approximate-entropy-based approach to characterize a time series' instability. In this paper we propose an explicitly spatially weighted extension of the Attribute Stability Index. By including a mechanism to account for spatial autocorrelation, this work represents a novel approach for the characterization of space-time instability. As a case study we explore national youth male unemployment across the world from 1991 to 2014.
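The approximate-entropy computation underlying an ASI-style index can be sketched directly from Pincus' definition; the tolerance r and the two toy 30-point series are illustrative choices, matching the short-series setting the abstract describes:

```python
import math, random

def approx_entropy(x, m=2, r=0.2):
    """Approximate entropy ApEn(m, r): lower values mean more regularity.
    r is an absolute tolerance here; in practice it is often 0.2 * std(x)."""
    def phi(mm):
        n = len(x) - mm + 1
        templates = [x[i:i + mm] for i in range(n)]
        fractions = []
        for a in templates:
            c = sum(1 for b in templates
                    if max(abs(u - w) for u, w in zip(a, b)) <= r)
            fractions.append(c / n)
        return sum(math.log(f) for f in fractions) / n
    return phi(m) - phi(m + 1)

regular = [float(i % 2) for i in range(30)]        # perfectly alternating
random.seed(3)
erratic = [random.random() for _ in range(30)]     # uniform noise
```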
Xiong, Lihua; Jiang, Cong; Du, Tao
2014-01-01
Time-varying moments models based on Pearson Type III and normal distributions respectively are built under the generalized additive model in location, scale and shape (GAMLSS) framework to analyze the nonstationarity of the annual runoff series of the Weihe River, the largest tributary of the Yellow River. The detection of nonstationarities in hydrological time series (annual runoff, precipitation and temperature) from 1960 to 2009 is carried out using a GAMLSS model, and then the covariate analysis for the annual runoff series is implemented with GAMLSS. Finally, the attribution of each covariate to the nonstationarity of annual runoff is analyzed quantitatively. The results demonstrate that (1) obvious change-points exist in all three hydrological series, (2) precipitation, temperature and irrigated area are all significant covariates of the annual runoff series, and (3) temperature increase plays the main role in leading to the reduction of the annual runoff series in the study basin, followed by the decrease of precipitation and the increase of irrigated area.
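The location part of such a time-varying-moments model reduces, in its simplest form, to regressing the distribution's mean on a covariate. The sketch below, with invented runoff numbers, is a bare-bones stand-in for the GAMLSS machinery (which additionally models scale and shape and supports smooth terms):

```python
def linear_location_fit(x, y):
    """Least-squares fit of a time-varying location mu_t = b0 + b1 * x_t,
    a bare-bones stand-in for the location submodel of GAMLSS."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b1 = (sum((a - mx) * (b - my) for a, b in zip(x, y))
          / sum((a - mx) ** 2 for a in x))
    return my - b1 * mx, b1

# invented annual runoff with a declining mean plus a deterministic wiggle
years = list(range(1960, 2010))
runoff = [80.0 - 0.6 * (yr - 1960) + ((yr * 37) % 11 - 5) for yr in years]
b0, b1 = linear_location_fit(years, runoff)   # b1 recovers the downward trend
```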
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-24
... time that a document is referenced. Revision 3 of Regulatory Guide 1.129 is available in ADAMS under...-251-7455; email: [email protected] . Both of the Office of Nuclear Regulatory Research, U.S... NRC is issuing a revision to an existing guide in the NRC's ``Regulatory Guide'' series. This series...
Multivariate stochastic analysis for Monthly hydrological time series at Cuyahoga River Basin
NASA Astrophysics Data System (ADS)
zhang, L.
2011-12-01
Copulas have become a very powerful statistical and stochastic methodology for multivariate analysis in environmental and water resources engineering. In recent years, the popular one-parameter Archimedean copulas (e.g., the Gumbel-Hougaard, Cook-Johnson, and Frank copulas) and the meta-elliptical copulas (e.g., the Gaussian and Student-t copulas) have been applied in multivariate hydrological analyses, e.g., multivariate rainfall (rainfall intensity, duration and depth), flood (peak discharge, duration and volume), and drought analyses (drought length, mean and minimum SPI values, and drought mean areal extent). Copulas have also been applied in flood frequency analysis at the confluences of river systems by taking into account the dependence among upstream gauge stations rather than by using the hydrological routing technique. In most of the studies above, the annual time series have been treated as stationary signals, i.e., the time series have been assumed to consist of independent identically distributed (i.i.d.) random variables. In reality, however, hydrological time series, especially daily and monthly series, cannot be considered i.i.d. random variables due to the periodicity in the data structure. The stationarity assumption is also in question due to climate change and land use and land cover (LULC) change in past years. To this end, it is necessary to re-evaluate the classic approach for the study of hydrological time series by relaxing the stationarity assumption through a nonstationary approach. Also, for the study of the dependence structure of hydrological time series, the assumption of the same type of univariate distribution needs to be relaxed by adopting copula theory. In this paper, the univariate monthly hydrological time series will be studied through a nonstationary time series analysis approach.
The dependence structure of the multivariate monthly hydrological time series will be studied through copula theory. For parameter estimation, maximum likelihood estimation (MLE) will be applied. To illustrate the method, the univariate time series model and the dependence structure will be determined and tested using the monthly discharge time series of the Cuyahoga River Basin.
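For one-parameter Archimedean families, a dependence fit often starts from Kendall's tau: for the Gumbel-Hougaard copula, tau = 1 - 1/theta, which gives the moment-style estimator sketched below on synthetic dependent data (MLE, as the abstract proposes, would refine this):

```python
import random

def kendall_tau(x, y):
    """Kendall's rank correlation (no tie handling, for illustration)."""
    n = len(x)
    conc = disc = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                conc += 1
            elif s < 0:
                disc += 1
    return (conc - disc) / (n * (n - 1) / 2.0)

def gumbel_theta(tau):
    """Invert tau = 1 - 1/theta for the Gumbel-Hougaard parameter."""
    return 1.0 / (1.0 - tau)

random.seed(7)
x = [random.gauss(0.0, 1.0) for _ in range(300)]
y = [xi + random.gauss(0.0, 1.0) for xi in x]   # positively dependent pair
theta = gumbel_theta(kendall_tau(x, y))          # theta > 1 means dependence
```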
Clinical time series prediction: towards a hierarchical dynamical system framework
Liu, Zitao; Hauskrecht, Milos
2014-01-01
Objective Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding of the patient condition, the dynamics of a disease, the effect of various patient management interventions, and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Materials and methods Our hierarchical dynamical system framework for modeling clinical time series combines the advantages of two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. Results We tested our framework by first learning the time series model from data for the patients in the training set, and then applying it to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. 
Conclusion A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. PMID:25534671
Long series of geomagnetic measurements - unique at satellite era
NASA Astrophysics Data System (ADS)
Mandea, Mioara; Balasis, Georgios
2017-04-01
We have long appreciated that magnetic measurements obtained at Earth's surface are of great value for characterizing geomagnetic field behavior and probing the deep interior of our planet. Data from the new magnetic satellite missions offer a detailed global understanding of the geomagnetic field. However, when our interest moves to long time scales, the very long series of measurements play an important role. Here, we first provide an updated series of geomagnetic declination in Paris, shortly after a very special occasion: its value reached zero after some 350 years of westerly values. We take this occasion to emphasize the importance of long series of continuous measurements, mainly when various techniques are used to detect abrupt changes in the geomagnetic field, the geomagnetic jerks. Many novel concepts originating in dynamical systems or information theory have been developed, partly motivated by specific research questions from the geosciences. This continuously extending toolbox of nonlinear time series analysis is a key to understanding the complexity of the geomagnetic field. Motivated by these efforts, a series of entropy analyses is applied to geomagnetic field time series, aiming to detect complex dynamical changes associated with geomagnetic jerks.
Temporal evolution of total ozone and circulation patterns over European mid-latitudes
NASA Astrophysics Data System (ADS)
Monge Sanz, B. M.; Casale, G. R.; Palmieri, S.; Siani, A. M.
2003-04-01
Linear correlation analysis and the running correlation technique are used to investigate the interannual and interdecadal variations of total ozone (TO) over several mid-latitude European locations. The study includes the longest series of ozone data, that of the Swiss station of Arosa. TO series have been related to time series of two circulation indices, the North Atlantic Oscillation Index (NAOI) and the Arctic Oscillation Index (AOI). The analysis has been performed with monthly data, using both series containing all the months of the year and winter (DJFM) series. Special attention has been given to the winter series, which exhibit very high correlation coefficients with NAOI and AOI; interannual variations of this relationship are studied by applying the running correlation technique. TO and circulation index data series have also been partitioned into their different time-scale components with the Kolmogorov-Zurbenko method. Long-term components indicate a strong inverse connection between total ozone and circulation patterns over the studied region during the last three decades. However, this relation has not always held: in earlier periods, differences in the correlation amplitude and sign are detected.
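The Kolmogorov-Zurbenko partition mentioned above is built on a very simple primitive, an iterated moving average; a minimal sketch (window and iteration counts are illustrative):

```python
def moving_average(x, window):
    """Centered moving average; edge windows use whatever points exist."""
    half = window // 2
    out = []
    for i in range(len(x)):
        seg = x[max(0, i - half):i + half + 1]
        out.append(sum(seg) / len(seg))
    return out

def kolmogorov_zurbenko(x, window, iterations):
    """KZ(window, k) filter: the centered moving average applied k times."""
    for _ in range(iterations):
        x = moving_average(x, window)
    return x

constant = kolmogorov_zurbenko([5.0] * 24, window=5, iterations=3)      # unchanged
oscillation = kolmogorov_zurbenko([float((-1) ** i) for i in range(24)],
                                  window=5, iterations=3)                # damped
```

Different window/iteration pairs isolate different time-scale components, which is how the ozone and index series are partitioned in the study.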
Anomaly on Superspace of Time Series Data
NASA Astrophysics Data System (ADS)
Capozziello, Salvatore; Pincak, Richard; Kanjamapornkul, Kabin
2017-11-01
We apply the G-theory and the anomaly of ghost and antighost fields in the theory of supersymmetry to study a superspace over time series data for the detection of hidden general supply and demand equilibrium in the financial market. We provide proof of the existence of a general equilibrium point over the 14 extra dimensions of the new G-theory, compared with the 11-dimensional M-theory model of Edward Witten. We find that the process of coupling between nonequilibrium and equilibrium spinor fields of expectation ghost fields in the superspace of time series data induces an infinitely long exact sequence of cohomology from a short exact sequence of the moduli state space model. If we assume that the financial market is separated into two topological spaces of supply and demand, as in the D-brane and anti-D-brane model, then we can use a cohomology group to compute the stability of the market as a stable point of the general equilibrium of the interaction between D-branes of the market. We obtain the result that the general equilibrium will exist if and only if the 14th Batalin-Vilkovisky cohomology group, with the negative dimensions underlying the 14 major hidden factors influencing the market, is zero.
Observing climate change trends in ocean biogeochemistry: when and where.
Henson, Stephanie A; Beaulieu, Claudie; Lampitt, Richard
2016-04-01
Understanding the influence of anthropogenic forcing on the marine biosphere is a high priority. Climate change-driven trends need to be accurately assessed and detected in a timely manner. As part of the effort towards detection of long-term trends, a network of ocean observatories and time series stations provides high-quality data for a number of key parameters, such as pH, oxygen concentration or primary production (PP). Here, we use an ensemble of global coupled climate models to assess the temporal and spatial scales over which observations of eight biogeochemically relevant variables must be made to robustly detect a long-term trend. We find that, as a global average, continuous time series are required for between 14 (pH) and 32 (PP) years to distinguish a climate change trend from natural variability. Regional differences are extensive, with low latitudes and the Arctic generally needing shorter time series (<~30 years) to detect trends than other areas. In addition, we quantify the 'footprint' of existing and planned time series stations, that is, the area over which a station is representative of a broader region. Footprints are generally largest for pH and sea surface temperature, but nevertheless the existing network of observatories represents only 9-15% of the global ocean surface. Our results present a quantitative framework for assessing the adequacy of current and future ocean observing networks for detection and monitoring of climate change-driven responses in the marine ecosystem. © 2016 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Gábor Hatvani, István; Kern, Zoltán; Leél-Őssy, Szabolcs; Demény, Attila
2018-01-01
Uneven spacing is a common feature of sedimentary paleoclimate records, in many cases causing difficulties in the application of classical statistical and time series methods. Although special statistical tools do exist to assess unevenly spaced data directly, the transformation of such data into a temporally equidistant time series, which may then be examined using commonly employed statistical tools, remains an unachieved goal. The present paper therefore introduces an approach to obtain evenly spaced time series (using cubic spline fitting) from unevenly spaced speleothem records, applying spectral guidance to avoid the spectral bias caused by interpolation and to retain the original spectral characteristics of the data. The methodology was applied to stable carbon and oxygen isotope records derived from two stalagmites from the Baradla Cave (NE Hungary) dating back to the late 18th century. To show the benefit of the equally spaced records to climate studies, their coherence with climate parameters is explored using wavelet transform coherence and discussed. The obtained equally spaced time series are available at https://doi.org/10.1594/PANGAEA.875917.
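Stripped of the spectral-guidance step, the resampling itself is a natural cubic spline evaluated on an even grid. A self-contained sketch follows; the knot times are invented, and a linear test signal is used because a natural spline reproduces it exactly.

```python
import bisect

def natural_cubic_spline(t, y):
    """Return an evaluator for the natural cubic spline through (t[i], y[i])."""
    n = len(t)
    h = [t[i + 1] - t[i] for i in range(n - 1)]
    # tridiagonal system for knot second derivatives (natural ends: m[0] = m[n-1] = 0)
    a, b, c, d = [0.0] * n, [1.0] * n, [0.0] * n, [0.0] * n
    for i in range(1, n - 1):
        a[i], b[i], c[i] = h[i - 1], 2.0 * (h[i - 1] + h[i]), h[i]
        d[i] = 6.0 * ((y[i + 1] - y[i]) / h[i] - (y[i] - y[i - 1]) / h[i - 1])
    for i in range(1, n):                  # Thomas algorithm: forward elimination
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    m = [0.0] * n
    for i in range(n - 2, 0, -1):          # back substitution
        m[i] = (d[i] - c[i] * m[i + 1]) / b[i]
    def evaluate(x):
        i = min(max(bisect.bisect_right(t, x) - 1, 0), n - 2)
        hi = h[i]
        return (m[i] * (t[i + 1] - x) ** 3 / (6 * hi)
                + m[i + 1] * (x - t[i]) ** 3 / (6 * hi)
                + (y[i] / hi - m[i] * hi / 6) * (t[i + 1] - x)
                + (y[i + 1] / hi - m[i + 1] * hi / 6) * (x - t[i]))
    return evaluate

# hypothetical unevenly dated record; linear signal, so the spline is exact
t_obs = [0.0, 0.7, 1.1, 2.4, 3.0, 4.6, 5.0]
y_obs = [2.0 * ti + 1.0 for ti in t_obs]
spline = natural_cubic_spline(t_obs, y_obs)
evenly_spaced = [spline(0.5 * k) for k in range(11)]   # resample on an even grid
```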
Discriminant Analysis of Time Series in the Presence of Within-Group Spectral Variability.
Krafty, Robert T
2016-07-01
Many studies record replicated time series epochs from different groups with the goal of using frequency domain properties to discriminate between the groups. In many applications, there exists variation in cyclical patterns from time series in the same group. Although a number of frequency domain methods for the discriminant analysis of time series have been explored, there is a dearth of models and methods that account for within-group spectral variability. This article proposes a model for groups of time series in which transfer functions are modeled as stochastic variables that can account for both between-group and within-group differences in spectra that are identified from individual replicates. An ensuing discriminant analysis of stochastic cepstra under this model is developed to obtain parsimonious measures of relative power that optimally separate groups in the presence of within-group spectral variability. The approach possesses favorable properties in classifying new observations and can be consistently estimated through a simple discriminant analysis of a finite number of estimated cepstral coefficients. Benefits in accounting for within-group spectral variability are empirically illustrated in a simulation study and through an analysis of gait variability.
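The cepstral coefficients at the heart of this discriminant analysis are, in one common definition, the inverse Fourier coefficients of the log periodogram. A small pure-Python sketch (the paper's stochastic cepstral model goes well beyond this); an impulse input gives a flat spectrum, so all coefficients beyond the first vanish:

```python
import cmath, math

def dft(x):
    """Naive discrete Fourier transform (fine for short series)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def cepstral_coefficients(x, n_keep):
    """First n_keep cepstral coefficients: cosine transform of the log periodogram."""
    n = len(x)
    log_per = [math.log(abs(v) ** 2 / n + 1e-12) for v in dft(x)]
    return [sum(log_per[k] * math.cos(2 * math.pi * k * q / n) for k in range(n)) / n
            for q in range(n_keep)]

impulse = [1.0] + [0.0] * 63   # flat (white-noise-like) spectrum
ceps = cepstral_coefficients(impulse, 3)
```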
Robust and Adaptive Online Time Series Prediction with Long Short-Term Memory
Yang, Haimin; Pan, Zhisong; Tao, Qing
2017-01-01
Online time series prediction is the mainstream method in a wide range of fields, ranging from speech analysis and noise cancellation to stock market analysis. However, the data often contain many outliers as the length of a real-world time series increases. These outliers can mislead the learned model if treated as normal points during prediction. To address this issue, we propose a robust and adaptive online gradient learning method, RoAdam (Robust Adam), for long short-term memory (LSTM) networks to predict time series with outliers. The method tunes the learning rate of the stochastic gradient algorithm adaptively during prediction, which reduces the adverse effect of outliers. It tracks the relative prediction error of the loss function with a weighted average by modifying Adam, a popular stochastic gradient algorithm for training deep neural networks. In our algorithm, a large relative prediction error corresponds to a small learning rate, and vice versa. Experiments on both synthetic data and real time series show that our method achieves better performance than existing LSTM-based methods. PMID:29391864
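The adaptive-learning-rate idea can be sketched in a few lines, though this is an illustration of the mechanism, not the published RoAdam pseudocode: a clipped moving ratio of successive absolute losses divides the Adam step, so a sudden loss spike (an outlier) shrinks the update. All constants and the toy stream are invented.

```python
import random

random.seed(5)
xs = [random.uniform(-1.0, 1.0) for _ in range(400)]
# toy stream: y = 3x, with a gross outlier every 40th observation
data = [(x, 3.0 * x + (50.0 if i % 40 == 0 else 0.0)) for i, x in enumerate(xs)]

w = 0.0                  # single weight learned online
m = v = 0.0              # Adam first/second moment accumulators
d, prev_loss = 1.0, 1.0  # loss-ratio tracker: the "robust" part
lr, b1, b2, b3, eps = 0.05, 0.9, 0.999, 0.999, 1e-8
for t, (x, y) in enumerate(data, start=1):
    err = w * x - y
    loss = abs(err)
    g = (1.0 if err > 0 else -1.0) * x              # subgradient of |w*x - y|
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g * g
    r = loss / max(prev_loss, eps)                  # relative prediction error
    d = min(10.0, max(0.1, b3 * d + (1 - b3) * r))  # clipped moving ratio
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    w -= (lr / d) * m_hat / (v_hat ** 0.5 + eps)    # loss spike => smaller step
    prev_loss = max(loss, eps)
```

Despite the periodic outliers, the weight settles near the true slope of 3 because the damped steps keep single corrupted observations from dragging it away.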
Fast and unbiased estimator of the time-dependent Hurst exponent.
Pianese, Augusto; Bianchi, Sergio; Palazzo, Anna Maria
2018-03-01
We combine two existing estimators of the local Hurst exponent to improve both the goodness of fit and the computational speed of the algorithm. An application with simulated time series is implemented, and a Monte Carlo simulation is performed to provide evidence of the improvement.
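A compact rescaled-range (R/S) version of such an estimator is sketched below; the dyadic window sizes and the white-noise test signal are illustrative, and the combined estimator of the paper is not reproduced:

```python
import math, random

def rs_hurst(x):
    """Hurst exponent from the rescaled-range statistic: slope of
    log(R/S) against log(window length) over a few dyadic windows."""
    n = len(x)
    sizes = [n // 8, n // 4, n // 2, n]
    log_w, log_rs = [], []
    for w in sizes:
        rs_vals = []
        for start in range(0, n - w + 1, w):
            seg = x[start:start + w]
            mean = sum(seg) / w
            dev = [v - mean for v in seg]
            cum, walk = 0.0, [0.0]
            for v in dev:                       # cumulative deviation walk
                cum += v
                walk.append(cum)
            spread = max(walk) - min(walk)      # range R
            sd = math.sqrt(sum(v * v for v in dev) / w)
            if sd > 0:
                rs_vals.append(spread / sd)     # rescaled range R/S
        log_w.append(math.log(w))
        log_rs.append(math.log(sum(rs_vals) / len(rs_vals)))
    k = len(sizes)
    mw, mr = sum(log_w) / k, sum(log_rs) / k
    return (sum((a - mw) * (b - mr) for a, b in zip(log_w, log_rs))
            / sum((a - mw) ** 2 for a in log_w))

random.seed(2)
noise = [random.gauss(0.0, 1.0) for _ in range(1024)]
h_estimate = rs_hurst(noise)   # white noise: near 0.5 (R/S biases slightly high)
```

A time-dependent estimate follows by sliding a window, e.g. evaluating rs_hurst(x[t - 256:t]) for each t.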
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-09
... Airworthiness Directives; The Boeing Company Model 737-100, -200, -200C, -300, -400, and -500 Series Airplanes..., -200, -200C, -300, -400, and - 500 series airplanes. That AD currently requires a one-time inspection... 16211, March 31, 2006). The existing AD applies to all Model 737-100, -200, -200C, -300, -400, and -500...
Wavelet transform approach for fitting financial time series data
NASA Astrophysics Data System (ADS)
Ahmed, Amel Abdoullah; Ismail, Mohd Tahir
2015-10-01
This study investigates a newly developed technique, a combined wavelet filtering and VEC model, to study the dynamic relationship among financial time series. A wavelet filter has been used to remove noise from daily data of the US NASDAQ stock market and three stock markets of the Middle East and North Africa (MENA) region, namely Egypt, Jordan, and Istanbul. The data cover 6/29/2001 to 5/5/2009. The returns of the wavelet-filtered series and the original series are then analyzed by a cointegration test and a VEC model. The results affirm the existence of cointegration between the studied series: there is a long-term relationship between the US stock market and the MENA stock markets. A comparison demonstrates that the proposed model (DWT with VEC) outperforms the traditional model (VEC alone) in fitting the financial stock market series and reveals real information about the relationships among the stock markets.
CI2 for creating and comparing confidence-intervals for time-series bivariate plots.
Mullineaux, David R
2017-02-01
Currently no method exists for calculating and comparing the confidence-intervals (CI) for the time-series of a bivariate plot. The study's aim was to develop 'CI2' as a method to calculate the CI on time-series bivariate plots, and to identify if the CI between two bivariate time-series overlap. The test data were the knee and ankle angles from 10 healthy participants running on a motorised standard-treadmill and non-motorised curved-treadmill. For a recommended 10+ trials, CI2 involved calculating 95% confidence-ellipses at each time-point, then taking as the CI the points on the ellipses that were perpendicular to the direction vector between the means of two adjacent time-points. Consecutive pairs of CI created convex quadrilaterals, and any overlap of these quadrilaterals at the same time or ±1 frame as a time-lag calculated using cross-correlations, indicated where the two time-series differed. CI2 showed no group differences between left and right legs on both treadmills, but the same legs between treadmills for all participants showed differences of less knee extension on the curved-treadmill before heel-strike. To improve and standardise the use of CI2 it is recommended to remove outlier time-series, use 95% confidence-ellipses, and scale the ellipse by the fixed Chi-square value as opposed to the sample-size dependent F-value. For practical use, and to aid in standardisation or future development of CI2, Matlab code is provided. CI2 provides an effective method to quantify the CI of bivariate plots, and to explore the differences in CI between two bivariate time-series. Copyright © 2016 Elsevier B.V. All rights reserved.
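The per-time-point confidence ellipse that CI2 builds on can be computed from the 2x2 sample covariance, scaled by the fixed chi-square value the authors recommend. A sketch on synthetic data (a real CI2 run would repeat this at every time point of the bivariate trajectory):

```python
import math, random

CHI2_95_2DOF = 5.991  # 95% point of the chi-square distribution, 2 d.o.f.

def confidence_ellipse(xs, ys, n_points=36):
    """Points on the 95% confidence ellipse of a bivariate sample, scaled by
    the fixed chi-square value rather than the sample-size-dependent F-value."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    # eigenvalues and axis angle of the 2x2 covariance matrix
    half_tr = (sxx + syy) / 2.0
    det = sxx * syy - sxy * sxy
    disc = math.sqrt(max(half_tr * half_tr - det, 0.0))
    l1, l2 = half_tr + disc, max(half_tr - disc, 0.0)
    theta = 0.5 * math.atan2(2.0 * sxy, sxx - syy)
    a, b = math.sqrt(CHI2_95_2DOF * l1), math.sqrt(CHI2_95_2DOF * l2)
    pts = []
    for k in range(n_points):
        ang = 2.0 * math.pi * k / n_points
        u, v = a * math.cos(ang), b * math.sin(ang)
        pts.append((mx + u * math.cos(theta) - v * math.sin(theta),
                    my + u * math.sin(theta) + v * math.cos(theta)))
    return pts

random.seed(11)
xs = [random.gauss(0.0, 1.0) for _ in range(2000)]
ys = [random.gauss(0.0, 1.0) for _ in range(2000)]
ellipse = confidence_ellipse(xs, ys)   # roughly a circle of radius sqrt(5.991)
```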
A probabilistic method for constructing wave time-series at inshore locations using model scenarios
Long, Joseph W.; Plant, Nathaniel G.; Dalyander, P. Soupy; Thompson, David M.
2014-01-01
Continuous time-series of wave characteristics (height, period, and direction) are constructed using a base set of model scenarios and simple probabilistic methods. This approach utilizes an archive of computationally intensive, highly spatially resolved numerical wave model output to develop time-series of historical or future wave conditions without performing additional, continuous numerical simulations. The archive of model output contains wave simulations from a set of model scenarios derived from an offshore wave climatology. Time-series of wave height, period, direction, and associated uncertainties are constructed at locations included in the numerical model domain. The confidence limits are derived using statistical variability of oceanographic parameters contained in the wave model scenarios. The method was applied to a region in the northern Gulf of Mexico and assessed using wave observations at 12 m and 30 m water depths. Prediction skill for significant wave height is 0.58 and 0.67 at the 12 m and 30 m locations, respectively, with similar performance for wave period and direction. The skill of this simplified, probabilistic time-series construction method is comparable to existing large-scale, high-fidelity operational wave models but provides higher spatial resolution output at low computational expense. The constructed time-series can be developed to support a variety of applications including climate studies and other situations where a comprehensive survey of wave impacts on the coastal area is of interest.
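The scenario-archive idea reduces, in its crudest form, to a lookup: match each offshore observation to the closest precomputed scenario and read off the stored inshore result. The scenario table and numbers below are hypothetical, and the paper's probabilistic weighting across scenarios is not reproduced:

```python
def nearest_scenario(offshore_obs, scenarios):
    """Return the stored inshore output of the scenario closest to an
    offshore (height, period, direction) observation. In practice the
    three components would be normalized before measuring distance."""
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    key = min(scenarios, key=lambda k: dist(k, offshore_obs))
    return scenarios[key]

# hypothetical archive: offshore (Hs, Tp, dir) -> inshore Hs at one grid point
scenarios = {(1.0, 6.0, 90.0): 0.8,
             (2.0, 8.0, 90.0): 1.5,
             (3.0, 10.0, 120.0): 2.1}
offshore_record = [(1.1, 6.2, 88.0), (2.9, 9.8, 118.0)]
inshore_series = [nearest_scenario(obs, scenarios) for obs in offshore_record]
```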
BGFit: management and automated fitting of biological growth curves.
Veríssimo, André; Paixão, Laura; Neves, Ana Rute; Vinga, Susana
2013-09-25
Existing tools to model cell growth curves do not offer a flexible, integrative approach to manage large datasets and automatically estimate parameters. Due to the increase of experimental time-series from microbiology and oncology, software that allows researchers to easily organize experimental data and simultaneously extract relevant parameters in an efficient way is crucial. BGFit provides a web-based unified platform where a rich set of dynamic models can be fitted to experimental time-series data, further allowing users to efficiently manage the results in a structured and hierarchical way. The data management system allows users to organize projects, experiments and measurement data and to define teams with different editing and viewing permissions. Several dynamic and algebraic models are already implemented, such as polynomial regression and the Gompertz, Baranyi, Logistic and Live Cell Fraction models, and the user can easily add new models, thus expanding the current ones. BGFit allows users to easily manage their data and models in an integrated way, even if they are not familiar with databases or existing computational tools for parameter estimation. BGFit is designed with a flexible architecture that focuses on extensibility and leverages free software with existing tools and methods, allowing users to compare and evaluate different data modeling techniques. The application is described in the context of fitting bacterial and tumor cell growth data, but it is applicable to any type of two-dimensional data, e.g. physical chemistry and macroeconomic time series, and is fully scalable to a high number of projects, data and model complexity.
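Fitting one of the growth models BGFit implements, e.g. a Gompertz curve, can be illustrated with a deliberately crude grid search (a real platform would use proper nonlinear least squares); the data and parameter ranges are invented:

```python
import math

def gompertz(t, a, b, c):
    """Gompertz growth curve: a is the asymptote, b the offset, c the rate."""
    return a * math.exp(-b * math.exp(-c * t))

def fit_gompertz(ts, ys, a):
    """Crude grid-search fit for (b, c) with the asymptote a held fixed."""
    best = (None, None, float("inf"))
    for b in [0.5 + 0.1 * i for i in range(100)]:
        for c in [0.01 + 0.01 * j for j in range(100)]:
            sse = sum((gompertz(t, a, b, c) - y) ** 2 for t, y in zip(ts, ys))
            if sse < best[2]:
                best = (b, c, sse)
    return best

ts = list(range(0, 48, 2))                       # e.g. hourly OD sampling times
ys = [gompertz(t, 1.0, 5.0, 0.2) for t in ts]    # synthetic noise-free readings
b_fit, c_fit, sse = fit_gompertz(ts, ys, a=1.0)  # recovers b = 5.0, c = 0.2
```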
A tool for NDVI time series extraction from wide-swath remotely sensed images
NASA Astrophysics Data System (ADS)
Li, Zhishan; Shi, Runhe; Zhou, Cong
2015-09-01
Normalized Difference Vegetation Index (NDVI) is one of the most widely used indicators for monitoring vegetation coverage on the land surface. The time series features of NDVI are capable of reflecting dynamic changes of various ecosystems. Calculating NDVI from Moderate Resolution Imaging Spectrometer (MODIS) and other wide-swath remotely sensed images provides an important way to monitor the spatial and temporal characteristics of large-scale NDVI. However, difficulties still exist for ecologists in extracting such information correctly and efficiently, because the original remote sensing images require several specialized processing steps, including radiometric calibration, geometric correction, multiple data composition and curve smoothing. In this study, we developed an efficient and convenient online toolbox with a friendly graphical user interface for non-remote-sensing professionals who want to extract NDVI time series. Technically, it is based on Java Web and Web GIS, and the Struts, Spring and Hibernate frameworks (SSH) are integrated in the system for ease of maintenance and expansion. Latitude, longitude and time period are the key inputs that users need to provide, and the NDVI time series are calculated automatically.
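The index itself is a simple band ratio once calibrated red and near-infrared reflectances are available; a generic NumPy sketch (not the toolbox's code) is:

```python
import numpy as np

def ndvi(nir, red):
    # NDVI = (NIR - Red) / (NIR + Red); NaN where the denominator vanishes.
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    denom = nir + red
    safe = np.where(denom == 0, 1.0, denom)
    return np.where(denom == 0, np.nan, (nir - red) / safe)
```

Dense vegetation yields values near +1, bare soil values near 0, and water negative values.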
Wetting and drying of soil in response to precipitation: Data analysis, modeling, and forecasting
Basak, Aniruddha; Kulkarni, Chinmay; Schmidt, Kevin M.; Mengshoel, Ole
2016-01-01
This paper investigates methods to analyze and forecast soil moisture time series. We extend an existing Antecedent Water Index (AWI) model, which expresses soil moisture as a function of time and rainfall. Unfortunately, the existing AWI model does not forecast effectively for time periods beyond a few hours. To overcome this limitation, we develop a novel AWI-based model. Our model accumulates rainfall over a time interval and can fit a diverse range of wetting and drying curves. In addition, parameters in our model reflect the hydrologic redistribution processes of gravity and suction. We validate our models using experimental soil moisture and rainfall time series data collected from steep-gradient post-wildfire sites in Southern California, where rapid landscape change was observed in response to small to moderate rain storms. We found that our novel model fits the data for three distinct soil textures occurring at different depths below the ground surface (5, 15, and 30 cm). Our model also successfully forecasts soil moisture trends, such as drying and wetting rates.
Statistical tests for power-law cross-correlated processes
NASA Astrophysics Data System (ADS)
Podobnik, Boris; Jiang, Zhi-Qiang; Zhou, Wei-Xing; Stanley, H. Eugene
2011-12-01
For stationary time series, the cross-covariance and the cross-correlation as functions of time lag n serve to quantify the similarity of two time series. The latter measure is also used to assess whether the cross-correlations are statistically significant. For nonstationary time series, the analogous measures are detrended cross-correlation analysis (DCCA) and the recently proposed detrended cross-correlation coefficient, ρDCCA(T,n), where T is the total length of the time series and n the window size. The Cauchy inequality -1≤ρDCCA(T,n)≤1 had previously been verified numerically; here we derive -1≤ρDCCA(T,n)≤1 for a standard variance-covariance approach and for a detrending approach. For overlapping windows, we find the range of ρDCCA within which the cross-correlations become statistically significant. For overlapping windows we numerically determine—and for nonoverlapping windows we derive—that the standard deviation of ρDCCA(T,n) tends with increasing T to 1/T. Using ρDCCA(T,n) we show that the Chinese financial market's tendency to follow the U.S. market is extremely weak. We also propose an additional statistical test that can be used to quantify the existence of cross-correlations between two power-law correlated time series.
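For non-overlapping windows, the detrended cross-correlation coefficient can be sketched as follows (a generic implementation with linear detrending, not the authors' code):

```python
import numpy as np

def detrended_covariance(x, y, n):
    # Profiles (cumulative sums of the mean-centered series).
    X = np.cumsum(x - x.mean())
    Y = np.cumsum(y - y.mean())
    covs = []
    t = np.arange(n)
    # Non-overlapping windows of size n; detrend each with a linear fit.
    for start in range(0, len(X) - n + 1, n):
        xs = X[start:start + n]
        ys = Y[start:start + n]
        px = np.polyval(np.polyfit(t, xs, 1), t)
        py = np.polyval(np.polyfit(t, ys, 1), t)
        covs.append(np.mean((xs - px) * (ys - py)))
    return np.mean(covs)

def rho_dcca(x, y, n):
    # Detrended cross-correlation coefficient: F2_xy / (F_xx * F_yy).
    f_xy = detrended_covariance(x, y, n)
    f_xx = detrended_covariance(x, x, n)
    f_yy = detrended_covariance(y, y, n)
    return f_xy / np.sqrt(f_xx * f_yy)
```

For x = y the coefficient equals 1 by construction; values near 0 at scale n indicate no detrended cross-correlation at that scale.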
NASA Astrophysics Data System (ADS)
Pohle, Ina; Niebisch, Michael; Müller, Hannes; Schümberg, Sabine; Zha, Tingting; Maurer, Thomas; Hinz, Christoph
2018-07-01
To simulate the impacts of within-storm rainfall variability on fast hydrological processes, long precipitation time series with high temporal resolution are required. Due to the limited availability of observed data, such time series are typically obtained from stochastic models. However, most existing rainfall models are limited in their ability to conserve the rainfall event statistics which are relevant for hydrological processes. Poisson rectangular pulse models are widely applied to generate long time series of alternating precipitation event durations and mean intensities as well as interstorm period durations. Multiplicative microcanonical random cascade (MRC) models are used to disaggregate precipitation time series from coarse to fine temporal resolution. To overcome the inconsistencies between the temporal structure of the Poisson rectangular pulse model and the MRC model, we developed a new coupling approach by introducing two modifications to the MRC model. These modifications comprise (a) a modified cascade model ("constrained cascade") which preserves the event durations generated by the Poisson rectangular pulse model by constraining the first and last interval of a precipitation event to contain precipitation and (b) continuous sigmoid functions of the multiplicative weights to account for the scale dependency in the disaggregation of precipitation events of different durations. The constrained cascade model was evaluated for its ability to disaggregate observed precipitation events in comparison to existing MRC models. For that, we used a 20-year record of hourly precipitation at six stations across Germany. The constrained cascade model showed markedly better agreement with the observed data in terms of both the temporal pattern of the precipitation time series (e.g. the dry and wet spell durations and autocorrelations) and event characteristics (e.g. intra-event intermittency and intensity fluctuation within events).
The constrained cascade model also slightly outperformed the other MRC models with respect to the intensity-frequency relationship. To assess the performance of the coupled Poisson rectangular pulse and constrained cascade model, precipitation events were stochastically generated by the Poisson rectangular pulse model and then disaggregated by the constrained cascade model. We found that the coupled model performs satisfactorily in terms of the temporal pattern of the precipitation time series, event characteristics and the intensity-frequency relationship.
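The branching step at the heart of an MRC model can be sketched generically; the uniform weight distribution below is an illustrative assumption, standing in for the calibrated (and, in the constrained cascade, scale-dependent) weight laws of the study.

```python
import numpy as np

def disaggregate(volumes, levels, rng):
    # Microcanonical random cascade: each level splits every interval's
    # rainfall volume into two halves with weights (w, 1 - w), so total
    # volume is conserved exactly while temporal resolution doubles.
    series = np.asarray(volumes, dtype=float)
    for _ in range(levels):
        w = rng.uniform(0.2, 0.8, size=series.size)  # illustrative weight law
        series = np.column_stack((series * w, series * (1.0 - w))).ravel()
    return series
```

Three levels turn each coarse interval into eight fine intervals, with the total rainfall depth preserved exactly at every level.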
NASA Astrophysics Data System (ADS)
Jolivet, R.; Simons, M.
2018-02-01
Interferometric synthetic aperture radar time series methods aim to reconstruct time-dependent ground displacements over large areas from sets of interferograms in order to detect transient, periodic, or small-amplitude deformation. Because of computational limitations, most existing methods consider each pixel independently, ignoring important spatial covariances between observations. We describe a framework to reconstruct time series of ground deformation while considering all pixels simultaneously, allowing us to account for spatial covariances, imprecise orbits, and residual atmospheric perturbations. We describe spatial covariances by an exponential decay function dependent on pixel-to-pixel distance. We approximate the impact of imprecise orbit information and residual long-wavelength atmosphere as a low-order polynomial function. Tests on synthetic data illustrate the importance of incorporating full covariances between pixels in order to avoid biased parameter reconstruction. An example of application to the northern Chilean subduction zone highlights the potential of this method.
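The exponential spatial covariance used to couple pixels can be written generically as follows; sigma and lam stand for the covariance amplitude and e-folding distance, which in practice would be estimated from the data.

```python
import numpy as np

def exp_covariance(coords, sigma, lam):
    # Exponential spatial covariance: C_ij = sigma^2 * exp(-d_ij / lam),
    # where d_ij is the distance between pixels i and j. The result is a
    # symmetric, positive-definite matrix coupling all pixels at once.
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    return sigma**2 * np.exp(-d / lam)
```

Nearby pixels get covariance close to sigma^2, and the coupling decays smoothly with distance, which is what allows all pixels to be inverted simultaneously.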
Panel data analysis of cardiotocograph (CTG) data.
Horio, Hiroyuki; Kikuchi, Hitomi; Ikeda, Tomoaki
2013-01-01
Panel data analysis is a statistical method, widely used in econometrics, which deals with two-dimensional panel data collected over time and over individuals. The cardiotocograph (CTG), which monitors fetal heart rate (FHR) by Doppler ultrasound and uterine contractions by strain gage, is commonly used in the intrapartum treatment of pregnant women. Although the relationship between FHR waveform patterns and outcomes such as umbilical blood gas data at delivery has long been analyzed, no accumulated collection of FHR patterns from a large number of cases exists. As time-series economic fluctuations in econometrics, such as consumption trends, have been studied using panel data consisting of time-series and cross-sectional data, we tried to apply this method to CTG data. Panel data composed of symbolized segments of the FHR pattern can be easily handled, and a perinatologist can obtain a view of the whole FHR pattern from the microscopic level of the time-series FHR data.
Correlations of stock price fluctuations under multi-scale and multi-threshold scenarios
NASA Astrophysics Data System (ADS)
Sui, Guo; Li, Huajiao; Feng, Sida; Liu, Xueyong; Jiang, Meihui
2018-01-01
The multi-scale method is widely used in analyzing time series of financial markets, as it can provide market information for different economic entities that focus on different periods. By constructing multi-scale networks of price-fluctuation correlations in the stock market, we can detect the topological relationships between the time series. Previous research has not addressed the problem that the original fluctuation correlation networks are fully connected networks, and that more information exists within these networks than is currently being utilized. Here we use listed coal companies as a case study. First, we decompose the original stock price fluctuation series into different time scales. Second, we construct the stock price fluctuation correlation networks at each time scale. Third, we delete the edges of the network based on thresholds and analyze the network indicators. By combining the multi-scale method with the multi-threshold method, we bring to light the implicit information of fully connected networks.
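A minimal, generic sketch of the thresholding step, building an adjacency matrix from pairwise correlations (not the authors' code):

```python
import numpy as np

def threshold_network(series_matrix, threshold):
    # Correlation network: nodes are time series (rows of series_matrix);
    # edges connect pairs whose absolute Pearson correlation meets the
    # threshold. Returns the adjacency matrix and the edge density.
    corr = np.corrcoef(series_matrix)
    adj = (np.abs(corr) >= threshold).astype(int)
    np.fill_diagonal(adj, 0)          # no self-loops
    n = adj.shape[0]
    density = adj.sum() / (n * (n - 1))
    return adj, density
```

Sweeping the threshold from 0 to 1 moves the network from fully connected to sparse, which is how the implicit structure of the fully connected correlation network is revealed.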
Linden, Ariel; Yarnold, Paul R
2016-12-01
Single-group interrupted time series analysis (ITSA) is a popular evaluation methodology in which a single unit of observation is studied, the outcome variable is serially ordered as a time series, and the intervention is expected to 'interrupt' the level and/or trend of the time series subsequent to its introduction. Given that the internal validity of the design rests on the premise that the interruption in the time series is associated with the introduction of the treatment, treatment effects may seem less plausible if a parallel trend already exists in the time series prior to the actual intervention. Thus, sensitivity analyses should focus on detecting structural breaks in the time series before the intervention. In this paper, we introduce a machine-learning algorithm called optimal discriminant analysis (ODA) as an approach to determine whether structural breaks can be identified in years prior to the initiation of the intervention, using data from California's 1988 voter-initiated Proposition 99 to reduce smoking rates. The ODA analysis indicates that numerous structural breaks occurred prior to the actual initiation of Proposition 99 in 1989, including perfect structural breaks in 1983 and 1985, thereby casting doubt on the validity of treatment effects estimated for the actual intervention when using a single-group ITSA design. Given the widespread use of ITSA for evaluating observational data and the increasing use of machine-learning techniques in traditional research, we recommend that structural break sensitivity analysis be routinely incorporated in all research using the single-group ITSA design. © 2016 John Wiley & Sons, Ltd.
Jung, Inuk; Jo, Kyuri; Kang, Hyejin; Ahn, Hongryul; Yu, Youngjae; Kim, Sun
2017-12-01
Identifying biologically meaningful gene expression patterns from time series gene expression data is important for understanding the underlying biological mechanisms. To identify significantly perturbed gene sets between different phenotypes, analysis of time series transcriptome data requires consideration of both the time and sample dimensions. Thus, the analysis of such data seeks gene sets that exhibit similar or different expression patterns between two or more sample conditions, constituting three-dimensional data, i.e. gene-time-condition. The computational complexity of analyzing such data is very high, even compared to the already NP-hard two-dimensional biclustering problem. Because of this challenge, traditional time series clustering algorithms are designed to capture co-expressed genes with similar expression patterns in two sample conditions. We present a triclustering algorithm, TimesVector, specifically designed for clustering three-dimensional time series data to capture distinctively similar or different gene expression patterns between two or more sample conditions. TimesVector identifies clusters with distinctive expression patterns in three steps: (i) dimension reduction and clustering of time-condition concatenated vectors, (ii) post-processing clusters to detect similar and distinct expression patterns and (iii) rescuing genes from unclassified clusters. Using four sets of time series gene expression data, generated by both microarray and high-throughput sequencing platforms, we demonstrate that TimesVector successfully detects biologically meaningful clusters of high quality. TimesVector improved clustering quality compared to existing triclustering tools, and only TimesVector successfully detected clusters with differential expression patterns across conditions. The TimesVector software is available at http://biohealth.snu.ac.kr/software/TimesVector/. sunkim.bioinfo@snu.ac.kr.
Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
Neuronal and network computation in the brain
NASA Astrophysics Data System (ADS)
Babloyantz, A.
1999-03-01
The concepts and methods of non-linear dynamics have been a powerful tool for studying several aspects of brain dynamics. In this paper we show how, from time series analysis of electroencephalograms in sick and healthy subjects, the chaotic nature of brain activity could be unveiled. This finding gave rise to the concept of spatiotemporal cortical chaotic networks, which in turn was the foundation for a simple brain-like device able to become attentive and perform pattern recognition and motion detection. A new method of time series analysis is also proposed which demonstrates for the first time the existence of a neuronal code in the interspike intervals of cochlear cells.
Statistical regularities of Carbon emission trading market: Evidence from European Union allowances
NASA Astrophysics Data System (ADS)
Zheng, Zeyu; Xiao, Rui; Shi, Haibo; Li, Guihong; Zhou, Xiaofeng
2015-05-01
As an emerging financial market, the carbon emission trading market has seen its trading value increase markedly. In recent years, carbon emission allowances have become a form of investment: they are bought and sold not only by carbon emitters but also by investors. In this paper, we analyze the price fluctuations of European Union Allowances (EUA) futures in the European Climate Exchange (ECX) market from 2007 to 2011. The return time series displays a symmetric, power-law probability density function. We find only short-range correlations in the price changes (returns), but long-range correlations in the absolute values of the price changes (volatility). Further, the detrended fluctuation analysis (DFA) approach is applied with a focus on long-range autocorrelations and the Hurst exponent. We observe long-range power-law autocorrelations in the volatility, which quantifies risk, and find that they decay much more slowly than the autocorrelations of the return time series. Our analysis also shows that significant cross-correlations exist between the EUA return time series and many other returns, spanning a wide range of fields including stock markets, energy-related commodity futures, and financial futures. The significant cross-correlations between energy-related futures and EUA indicate the physical relationship between carbon emission and the energy production process. Additionally, the cross-correlations between financial futures and EUA indicate that speculation may have become an important factor affecting the price of EUA. Finally, we model the volatility time series of EUA with a particular version of the GARCH process, and the result also suggests long-range volatility autocorrelations.
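The DFA estimate of the Hurst exponent used above can be sketched generically (first-order polynomial detrending; not the authors' implementation):

```python
import numpy as np

def dfa_hurst(x, scales):
    # Detrended fluctuation analysis: the slope of log F(n) vs log n
    # estimates the Hurst exponent (0.5 = uncorrelated fluctuations,
    # > 0.5 = long-range persistence).
    X = np.cumsum(x - np.mean(x))           # the profile
    F = []
    for n in scales:
        segs = len(X) // n
        t = np.arange(n)
        msq = []
        for s in range(segs):
            seg = X[s * n:(s + 1) * n]
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            msq.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(msq)))
    slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return slope
```

Applied to returns and to absolute returns, such an exponent near 0.5 versus well above 0.5 is what distinguishes short-range from long-range correlated behavior.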
Analyzing Single-Molecule Time Series via Nonparametric Bayesian Inference
Hines, Keegan E.; Bankston, John R.; Aldrich, Richard W.
2015-01-01
The ability to measure the properties of proteins at the single-molecule level offers an unparalleled glimpse into biological systems at the molecular scale. The interpretation of single-molecule time series has often been rooted in statistical mechanics and the theory of Markov processes. While existing analysis methods have been useful, they are not without significant limitations including problems of model selection and parameter nonidentifiability. To address these challenges, we introduce the use of nonparametric Bayesian inference for the analysis of single-molecule time series. These methods provide a flexible way to extract structure from data instead of assuming models beforehand. We demonstrate these methods with applications to several diverse settings in single-molecule biophysics. This approach provides a well-constrained and rigorously grounded method for determining the number of biophysical states underlying single-molecule data. PMID:25650922
NASA Astrophysics Data System (ADS)
Manimaran, P.; Narayana, A. C.
2018-07-01
In this paper, we study the multifractal characteristics and cross-correlation behaviour of Air Pollution Index (API) time series data through the multifractal detrended cross-correlation analysis method. We analyse the daily API records of nine air pollutants on the University of Hyderabad campus for a period of three years (2013-2016). The cross-correlation behaviour has been measured quantitatively from the Hurst scaling exponents and the singularity spectrum. From the results, it is found that the cross-correlation analysis shows anti-correlation behaviour for all 36 possible bivariate time series. We also observe the existence of multifractal nature in all the bivariate time series, many of which show strong multifractal behaviour. In particular, the hazardous particulate matter PM2.5 and inhalable particulate matter PM10 show anti-correlated behaviour with all air pollutants.
Tuning the Voices of a Choir: Detecting Ecological Gradients in Time-Series Populations.
Buras, Allan; van der Maaten-Theunissen, Marieke; van der Maaten, Ernst; Ahlgrimm, Svenja; Hermann, Philipp; Simard, Sonia; Heinrich, Ingo; Helle, Gerd; Unterseher, Martin; Schnittler, Martin; Eusemann, Pascal; Wilmking, Martin
2016-01-01
This paper introduces a new approach, the Principal Component Gradient Analysis (PCGA), to detect ecological gradients in time-series populations, i.e. several time-series originating from different individuals of a population. Detection of ecological gradients is of particular importance when dealing with time-series from heterogeneous populations which express differing trends. PCGA makes use of polar coordinates of loadings from the first two axes obtained by principal component analysis (PCA) to define groups of similar trends. Based on the mean inter-series correlation (rbar), the gain in a common underlying signal achieved by PCGA groups is quantified using Monte Carlo simulations. In terms of validation, PCGA is compared to three other existing approaches. Focusing on dendrochronological examples, PCGA is shown to correctly determine population gradients and in particular cases to be advantageous over the other considered methods. Furthermore, PCGA groups in each example allowed for enhancing the strength of a common underlying signal, performing comparably to hierarchical cluster analysis. Our results indicate that PCGA potentially allows for a better understanding of mechanisms causing time-series population gradients as well as objectively enhancing the performance of climate transfer functions in dendroclimatology. While our examples highlight the relevance of PCGA to the field of dendrochronology, we believe that other disciplines working with data of comparable structure may also benefit from PCGA.
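The loading computation at the core of PCGA can be sketched generically (this is not the authors' implementation; the SVD-based PCA and the angle convention are assumptions):

```python
import numpy as np

def pcga_loadings(series_matrix):
    # Rows are individual time series, columns are time points.
    # PCA treats each series as a variable; every series receives a
    # loading on the first two principal components, from which a
    # polar angle is computed to group similar trends.
    X = series_matrix - series_matrix.mean(axis=1, keepdims=True)
    _, s, Vt = np.linalg.svd(X.T, full_matrices=False)
    loadings = Vt[:2].T * s[:2]            # one (PC1, PC2) pair per series
    angles = np.degrees(np.arctan2(loadings[:, 1], loadings[:, 0]))
    return loadings, angles
```

Series with opposing trends receive opposite-sign loadings on the first axis (polar angles roughly 180 degrees apart), which is the kind of gradient PCGA groups on.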
Jo, Kyuri; Kwon, Hawk-Bin; Kim, Sun
2014-06-01
Measuring gene expression at the whole-genome level can be useful for many purposes, especially for revealing the biological pathways underlying specific phenotype conditions. When gene expression is measured over a time period, we have the opportunity to understand how organisms react to stress conditions over time. Thus many biologists routinely measure whole-genome gene expression at multiple time points. However, there are several technical difficulties in analyzing such whole-genome expression data. In addition, gene expression is now often measured by RNA-sequencing rather than microarray technologies, which makes the analysis considerably more complicated, since the process should start with mapping short reads and produce differentially activated pathways and, possibly, interactions among pathways. Moreover, many useful tools for analyzing microarray gene expression data are not applicable to RNA-seq data. Thus a comprehensive package for analyzing time series transcriptome data is much needed. In this article, we present such a comprehensive package, the Time-series RNA-seq Analysis Package (TRAP), integrating all necessary tasks, such as mapping short reads, measuring gene expression levels, finding differentially expressed genes (DEGs), clustering and pathway analysis for time-series data, in a single environment. In addition to implementing useful algorithms that are not available for RNA-seq data, we extended the existing pathway analysis methods ORA and SPIA for time series analysis and estimate statistical values for the combined dataset with an advanced metric. TRAP also produces a visual summary of pathway interactions. Gene expression change labeling, a practical clustering method used in TRAP, enables more accurate interpretation of the data when combined with pathway analysis. We applied our methods to a real dataset for the analysis of rice (Oryza sativa L. Japonica nipponbare) under drought stress.
The result showed that TRAP was able to detect pathways more accurately than several existing methods. TRAP is available at http://biohealth.snu.ac.kr/software/TRAP/. Copyright © 2014 Elsevier Inc. All rights reserved.
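The ORA component that TRAP extends reduces to a hypergeometric enrichment test; a generic sketch with hypothetical counts (not TRAP's code):

```python
from scipy.stats import hypergeom

def ora_pvalue(n_total, n_pathway, n_deg, n_overlap):
    # Over-representation analysis: probability of seeing at least
    # n_overlap differentially expressed genes (DEGs) inside a pathway
    # of n_pathway genes, when n_deg DEGs are drawn from n_total genes
    # (hypergeometric upper tail).
    return hypergeom.sf(n_overlap - 1, n_total, n_pathway, n_deg)
```

With 100 DEGs out of 1000 genes, a 50-gene pathway is expected to contain about 5 DEGs by chance; observing 20 gives a far smaller p-value than observing 5.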
CauseMap: fast inference of causality from complex time series.
Maher, M Cyrus; Hernandez, Ryan D
2015-01-01
Background. Establishing health-related causal relationships is a central pursuit in biomedical research. Yet, the interdependent non-linearity of biological systems renders causal dynamics laborious and at times impractical to disentangle. This pursuit is further impeded by the dearth of time series that are sufficiently long to observe and understand recurrent patterns of flux. However, as data generation costs plummet and technologies like wearable devices democratize data collection, we anticipate a coming surge in the availability of biomedically-relevant time series data. Given the life-saving potential of these burgeoning resources, it is critical to invest in the development of open source software tools that are capable of drawing meaningful insight from vast amounts of time series data. Results. Here we present CauseMap, the first open source implementation of convergent cross mapping (CCM), a method for establishing causality from long time series data (≳25 observations). Compared to existing time series methods, CCM has the advantage of being model-free and robust to unmeasured confounding that could otherwise induce spurious associations. CCM builds on Takens' Theorem, a well-established result from dynamical systems theory that requires only mild assumptions. This theorem allows us to reconstruct high-dimensional system dynamics using a time series of only a single variable. These reconstructions can be thought of as shadows of the true causal system. If reconstructed shadows can predict points from an opposing time series, we can infer that the corresponding variables are providing views of the same causal system, and so are causally related. Unlike traditional metrics, this test can establish the directionality of causation, even in the presence of feedback loops. Furthermore, since CCM can extract causal relationships from the time series of, e.g., a single individual, it may be a valuable tool for personalized medicine.
We implement CCM in Julia, a high-performance programming language designed for facile technical computing. Our software package, CauseMap, is platform-independent and freely available as an official Julia package. Conclusions. CauseMap is an efficient implementation of a state-of-the-art algorithm for detecting causality from time series data. We believe this tool will be a valuable resource for biomedical research and personalized medicine.
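The core of CCM, time-delay embedding plus nearest-neighbour cross-mapping, can be sketched in a few lines. This is a generic Python sketch, not CauseMap's Julia implementation, and the coupled logistic maps below are a standard toy example rather than data from the paper.

```python
import numpy as np

def shadow_manifold(y, E=3, tau=1):
    # Takens delay embedding: points [y_t, y_(t-tau), ..., y_(t-(E-1)tau)].
    idx = np.arange((E - 1) * tau, len(y))
    M = np.column_stack([y[idx - j * tau] for j in range(E)])
    return M, idx

def cross_map(x, y, E=3, tau=1):
    # Predict x from the shadow manifold of y. High prediction skill
    # suggests that x influences y, since y's dynamics then encode x.
    M, idx = shadow_manifold(y, E, tau)
    preds = np.empty(len(idx))
    for i in range(len(idx)):
        d = np.linalg.norm(M - M[i], axis=1)
        d[i] = np.inf                       # exclude the target point itself
        nn = np.argsort(d)[:E + 1]          # E + 1 nearest neighbours
        w = np.exp(-d[nn] / max(d[nn][0], 1e-12))
        w /= w.sum()
        preds[i] = np.sum(w * x[idx[nn]])
    return np.corrcoef(preds, x[idx])[0, 1]

# Toy example: x drives y, so y's shadow manifold should recover x.
n = 1000
x = np.empty(n); y = np.empty(n)
x[0], y[0] = 0.4, 0.2
for t in range(n - 1):
    x[t + 1] = x[t] * (3.8 - 3.8 * x[t])
    y[t + 1] = y[t] * (3.5 - 3.5 * y[t] - 0.2 * x[t])
rho = cross_map(x, y)
```

Because the coupling runs from x into y, the cross-map skill rho of estimating x from y's manifold is high, while the reverse direction would be much weaker.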
Multifractal behavior of an air pollutant time series and the relevance to the predictability.
Dong, Qingli; Wang, Yong; Li, Peizhi
2017-03-01
Compared with the traditional method of detrended fluctuation analysis, which is used to characterize fractal scaling properties and long-range correlations, this research provides new insight into the multifractality and predictability of a nonstationary air pollutant time series using the methods of spectral analysis and multifractal detrended fluctuation analysis. First, the existence of a significant power-law behavior and long-range correlations for such series are verified. Then, by employing shuffling and surrogating procedures and estimating the scaling exponents, the major source of multifractality in these pollutant series is found to be the fat-tailed probability density function. Long-range correlations also partly contribute to the multifractal features. The relationship between the predictability of the pollutant time series and their multifractal nature is then investigated with extended analyses from the quantitative perspective, and it is found that the contribution of the multifractal strength of long-range correlations to the overall multifractal strength can affect the predictability of a pollutant series in a specific region to some extent. The findings of this comprehensive study can help to better understand the mechanisms governing the dynamics of air pollutant series and aid in performing better meteorological assessment and management. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Clarke, Hannah; Done, Fay; Casadio, Stefano; Mackin, Stephen; Dinelli, Bianca Maria; Castelli, Elisa
2016-08-01
The long time-series of observations made by the Along Track Scanning Radiometer (ATSR) missions represents a valuable resource for a wide range of research and EO applications. With the advent of ESA's Long-Term Data Preservation (LTDP) programme, thought has turned to the preservation and improved understanding of such long time-series, to support their continued exploitation in both existing and new areas of research, bringing the possibility of improving the existing data set and informing and contributing towards future missions. For this reason, the 'Long Term Stability of the ATSR Instrument Series: SWIR Calibration, Cloud Masking and SAA' project, commonly known as the ATSR Long Term Stability (or ALTS) project, is designed to explore the key characteristics of the data set and new and innovative ways of enhancing and exploiting it. Work has focussed on: a new approach to the assessment of Short Wave Infra-Red (SWIR) channel calibration; development of a new method for Total Column Water Vapour (TCWV) retrieval; study of the South Atlantic Anomaly (SAA); Radiative Transfer (RT) modelling for ATSR; providing AATSR observations with their location in the original instrument grid; strategies for the retrieval and archiving of historical ATSR documentation; study of TCWV retrieval over land; and development of new methods for cloud masking. This paper provides an overview of these activities and illustrates the importance of preserving and understanding 'old' data for continued use in the future.
Cui, Yiqian; Shi, Junyou; Wang, Zili
2015-11-01
Quantum Neural Network (QNN) models have attracted great attention because they introduce a new manner of neural computing based on quantum entanglement. However, existing QNN models are mainly based on real-valued quantum operations, and the potential of quantum entanglement is not fully exploited. In this paper, we propose a novel quantum neuron model called the Complex Quantum Neuron (CQN) that realizes deep quantum entanglement. We also propose a novel hybrid network model, Complex Rotation Quantum Dynamic Neural Networks (CRQDNN), based on the CQN. CRQDNN is a three-layer model with both CQNs and classical neurons. An infinite impulse response (IIR) filter is embedded in the network model to provide the memory needed to process time series inputs. The Levenberg-Marquardt (LM) algorithm is used for fast parameter learning. The network model is developed to conduct time series predictions. Two application studies are presented in this paper: chaotic time series prediction and electronic remaining useful life (RUL) prediction. Copyright © 2015 Elsevier Ltd. All rights reserved.
Hensman, James; Lawrence, Neil D; Rattray, Magnus
2013-08-20
Time course data from microarrays and high-throughput sequencing experiments require simple, computationally efficient and powerful statistical models to extract meaningful biological signal, and for tasks such as data fusion and clustering. Existing methodologies fail to capture either the temporal or replicated nature of the experiments, and often impose constraints on the data collection process, such as regularly spaced samples, or similar sampling schema across replications. We propose hierarchical Gaussian processes as a general model of gene expression time-series, with application to a variety of problems. In particular, we illustrate the method's capacity for missing data imputation, data fusion and clustering. The method can impute data which is missing both systematically and at random: in a hold-out test on real data, performance is significantly better than commonly used imputation methods. The method's ability to model inter- and intra-cluster variance leads to more biologically meaningful clusters. The approach removes the necessity for evenly spaced samples, an advantage illustrated on a developmental Drosophila dataset with irregular replications. The hierarchical Gaussian process model provides an excellent statistical basis for several gene-expression time-series tasks. It has only a few additional parameters over a regular GP, has negligible additional complexity, is easily implemented and can be integrated into several existing algorithms. Our experiments were implemented in Python, and are available from the authors' website: http://staffwww.dcs.shef.ac.uk/people/J.Hensman/.
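The hierarchical extension is beyond a short snippet, but the core ingredient, GP regression over irregularly spaced time points, can be sketched in plain NumPy. The RBF kernel, length scale, noise level, and time points below are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def rbf(a, b, length=1.0):
    """Squared-exponential covariance between two sets of time points."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

# Irregularly spaced samples -- no resampling or interpolation needed.
rng = np.random.default_rng(0)
t = np.array([0.0, 0.7, 1.1, 2.5, 4.0, 4.2, 6.9])
y = np.sin(t) + 0.1 * rng.normal(size=t.size)

noise = 0.1 ** 2
K = rbf(t, t) + noise * np.eye(t.size)   # train covariance + observation noise
t_new = np.linspace(0.0, 7.0, 50)
K_s = rbf(t_new, t)                      # test/train cross-covariance

# GP posterior mean and variance at new (possibly missing) time points.
alpha = np.linalg.solve(K, y)
mean = K_s @ alpha
var = 1.0 - np.sum(K_s * np.linalg.solve(K, K_s.T).T, axis=1) + noise
```

The posterior variance quantifies imputation uncertainty, which is what allows principled downstream clustering and data fusion.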
A Markovian Entropy Measure for the Analysis of Calcium Activity Time Series.
Marken, John P; Halleran, Andrew D; Rahman, Atiqur; Odorizzi, Laura; LeFew, Michael C; Golino, Caroline A; Kemper, Peter; Saha, Margaret S
2016-01-01
Methods to analyze the dynamics of calcium activity often rely on visually distinguishable features in time series data such as spikes, waves, or oscillations. However, systems such as the developing nervous system display a complex, irregular type of calcium activity which makes the use of such methods less appropriate. Instead, for such systems there exists a class of methods (including information theoretic, power spectral, and fractal analysis approaches) which use more fundamental properties of the time series to analyze the observed calcium dynamics. We present a new analysis method in this class, the Markovian Entropy measure: an easily implemented approach that represents the observed calcium activity as a realization of a Markov process and describes its dynamics in terms of the level of predictability underlying the transitions between the states of the process. We applied our and other commonly used calcium analysis methods on a dataset from Xenopus laevis neural progenitors which displays irregular calcium activity and a dataset from murine synaptic neurons which displays activity time series that are well-described by visually-distinguishable features. We find that the Markovian Entropy measure is able to distinguish between biologically distinct populations in both datasets, and that it can separate biologically distinct populations to a greater extent than other methods in the dataset exhibiting irregular calcium activity. These results support the benefit of using the Markovian Entropy measure to analyze calcium dynamics, particularly for studies using time series data which do not exhibit easily distinguishable features.
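The idea of scoring the predictability of state transitions can be sketched in a few lines; the quantile binning and three-state discretization below are illustrative assumptions, not the authors' exact recipe:

```python
import numpy as np

def markov_entropy(series, n_states=3):
    """Conditional entropy (bits) of a Markov chain fitted to a discretized
    series. Lower values mean more predictable state transitions."""
    # Discretize amplitudes into roughly equally populated states.
    cuts = np.quantile(series, np.linspace(0, 1, n_states + 1)[1:-1])
    states = np.digitize(series, cuts)
    # Count transitions between consecutive states.
    counts = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    p = counts / counts.sum()                                 # joint P(s_t, s_{t+1})
    row = counts / np.maximum(counts.sum(axis=1, keepdims=True), 1)  # P(s_{t+1}|s_t)
    with np.errstate(divide="ignore", invalid="ignore"):
        return float(-np.sum(np.where(p > 0, p * np.log2(row), 0.0)))

rng = np.random.default_rng(1)
h_random = markov_entropy(rng.normal(size=2000))            # irregular activity
h_regular = markov_entropy(np.sin(np.arange(2000) * 0.1))   # oscillation
```

An uncorrelated series approaches the maximum entropy log2(n_states), while an oscillation yields near-deterministic transitions and a much lower value.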
Coronal Mass Ejection Data Clustering and Visualization of Decision Trees
NASA Astrophysics Data System (ADS)
Ma, Ruizhe; Angryk, Rafal A.; Riley, Pete; Filali Boubrahimi, Soukaina
2018-05-01
Coronal mass ejections (CMEs) can be categorized as either “magnetic clouds” (MCs) or non-MCs. Features such as a large magnetic field, low plasma-beta, and low proton temperature suggest that a CME event is also an MC event; however, so far there is neither a definitive method nor an automatic process to distinguish the two. Human labeling is time-consuming, and results can fluctuate owing to the imprecise definition of such events. In this study, we approach the problem of MC and non-MC distinction from a time series data analysis perspective and show how clustering can shed some light on this problem. Although many algorithms exist for traditional data clustering in the Euclidean space, they are not well suited for time series data. Problems such as inadequate distance measure, inaccurate cluster center description, and lack of intuitive cluster representations need to be addressed for effective time series clustering. Our data analysis in this work is twofold: clustering and visualization. For clustering we compared the results from the popular hierarchical agglomerative clustering technique to a distance density clustering heuristic we developed previously for time series data clustering. In both cases, dynamic time warping is used as the similarity measure. For classification as well as visualization, we use decision trees to aggregate single-dimensional clustering results to form a multidimensional time series decision tree, with averaged time series to present each decision. In this study, we achieved modest accuracy and, more importantly, an intuitive interpretation of how different parameters contribute to an MC event.
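The dynamic time warping similarity used here can be computed with the classic dynamic program; this is a textbook sketch (not the distance density clustering heuristic itself), illustrating why DTW tolerates phase shifts that a pointwise distance penalizes:

```python
import numpy as np

def dtw_distance(x, y):
    """Classic dynamic-programming DTW between two 1-D series."""
    n, m = len(x), len(y)
    # cost[i, j] = best alignment cost of x[:i] with y[:j]
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(x[i - 1] - y[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

# A phase-shifted copy of a series is far pointwise but close under DTW.
a = np.sin(np.linspace(0, 2 * np.pi, 50))
b = np.sin(np.linspace(0, 2 * np.pi, 50) + 0.3)
d_dtw = dtw_distance(a, b)
d_direct = float(np.sum(np.abs(a - b)))   # unwarped L1 distance
```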
Flicker Noise in GNSS Station Position Time Series: How much is due to Crustal Loading Deformations?
NASA Astrophysics Data System (ADS)
Rebischung, P.; Chanard, K.; Metivier, L.; Altamimi, Z.
2017-12-01
The presence of colored noise in GNSS station position time series was detected 20 years ago. It has been shown since then that the background spectrum of non-linear GNSS station position residuals closely follows a power-law process (known as flicker noise, 1/f noise or pink noise), with some white noise taking over at the highest frequencies. However, the origin of the flicker noise present in GNSS station position time series is still unclear. Flicker noise is often described as intrinsic to the GNSS system, i.e. due to errors in the GNSS observations or in their modeling, but no such error source has been identified so far that could explain the level of observed flicker noise, nor its spatial correlation. We investigate another possible contributor to the observed flicker noise, namely real crustal displacements driven by surface mass transports, i.e. non-tidal loading deformations. This study is motivated by the presence of power-law noise in the time series of low-degree (≤ 40) and low-order (≤ 12) Stokes coefficients observed by GRACE - power-law noise might also exist at higher degrees and orders, but be obscured by GRACE observational noise. By comparing GNSS station position time series with loading deformation time series derived from GRACE gravity fields, both with their periodic components removed, we therefore assess whether GNSS and GRACE both plausibly observe the same flicker behavior of surface mass transports / loading deformations. Taking into account GRACE observability limitations, we also quantify the amount of flicker noise in GNSS station position time series that could be explained by such flicker loading deformations.
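For intuition, power-law noise such as flicker noise can be synthesized by shaping the spectrum of white noise. This frequency-domain recipe is a standard illustration, not the paper's method; the series length and unit variance are arbitrary choices:

```python
import numpy as np

def power_law_noise(n, alpha, rng):
    """Gaussian noise with power spectrum ~ 1/f^alpha.
    alpha = 0: white, alpha = 1: flicker (pink), alpha = 2: random walk."""
    freqs = np.fft.rfftfreq(n, d=1.0)
    amp = np.zeros_like(freqs)
    amp[1:] = freqs[1:] ** (-alpha / 2.0)      # amplitude ~ f^{-alpha/2}
    phases = rng.uniform(0, 2 * np.pi, size=len(freqs))
    x = np.fft.irfft(amp * np.exp(1j * phases), n=n)
    return x / x.std()                          # normalize to unit variance

rng = np.random.default_rng(42)
flicker = power_law_noise(4096, 1.0, rng)   # 1/f: between white and random walk
white = power_law_noise(4096, 0.0, rng)
```

Flicker noise concentrates variance at low frequencies, which is why it biases station velocity estimates far more than white noise of the same variance.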
Structure of public transit costs in the presence of multiple serial correlation
DOT National Transportation Integrated Search
1999-12-01
Most studies indicate that public transit systems operate under increasing returns to capital stock utilization and are significantly overcapitalized. Existing flexible form time series analyses, however, fail to correct for serial correlation. In th...
Shao, Ying-Hui; Gu, Gao-Feng; Jiang, Zhi-Qiang; Zhou, Wei-Xing; Sornette, Didier
2012-01-01
Notwithstanding the significant efforts to develop estimators of long-range correlations (LRC) and to compare their performance, no clear consensus exists on what is the best method and under which conditions. In addition, synthetic tests suggest that the performance of LRC estimators varies when using different generators of LRC time series. Here, we compare the performances of four estimators [Fluctuation Analysis (FA), Detrended Fluctuation Analysis (DFA), Backward Detrending Moving Average (BDMA), and Centred Detrending Moving Average (CDMA)]. We use three different generators [Fractional Gaussian Noises, and two ways of generating Fractional Brownian Motions]. We find that CDMA has the best performance and DFA is only slightly worse in some situations, while FA performs the worst. In addition, CDMA and DFA are less sensitive to the scaling range than FA. Hence, CDMA and DFA remain “The Methods of Choice” in determining the Hurst index of time series. PMID:23150785
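A minimal DFA estimator (first-order detrending, non-overlapping windows) can be sketched as follows; the window sizes and series length are illustrative, and production implementations add overlapping windows and tie-handling refinements:

```python
import numpy as np

def dfa(x, scales):
    """Detrended Fluctuation Analysis: returns F(s) for each window size s."""
    profile = np.cumsum(x - np.mean(x))          # integrated series
    fluct = []
    for s in scales:
        n_win = len(profile) // s
        f2 = []
        for k in range(n_win):
            seg = profile[k * s:(k + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)         # linear detrending per window
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))
    return np.array(fluct)

rng = np.random.default_rng(0)
scales = np.array([16, 32, 64, 128, 256])
F = dfa(rng.normal(size=8192), scales)
# The slope of log F(s) vs log s estimates the scaling exponent
# (~0.5 for uncorrelated white noise).
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
```

The choice of scaling range (here 16-256) matters, which is exactly the sensitivity the abstract compares across estimators.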
Sun, Bruce Qiang; Zhang, Jie
2016-03-01
Conclusions about the effects of social integration on suicide have differed and even contradicted one another. In this study, selected economic and social risks of suicide for different age groups and genders in the United Kingdom were identified, and the effects were estimated by multilevel time series analyses. To our knowledge, no previous studies have estimated a dynamic model of suicides on time series data combining multilevel analysis with autoregressive distributed lags. The investigation indicated that the unemployment rate, inflation rate, and divorce rate are all significantly and positively related to the national suicide rates in the United Kingdom from 1981 to 2011. Furthermore, the suicide rates of almost all groups above 40 years are significantly associated with the risk factors of unemployment and inflation rate, in comparison with the younger groups. © 2016 American Academy of Forensic Sciences.
Error-based Extraction of States and Energy Landscapes from Experimental Single-Molecule Time-Series
NASA Astrophysics Data System (ADS)
Taylor, J. Nicholas; Li, Chun-Biu; Cooper, David R.; Landes, Christy F.; Komatsuzaki, Tamiki
2015-03-01
Characterization of states, the essential components of the underlying energy landscapes, is one of the most intriguing subjects in single-molecule (SM) experiments due to the existence of noise inherent to the measurements. Here we present a method to extract the underlying state sequences from experimental SM time-series. Taking into account empirical error and the finite sampling of the time-series, the method extracts a steady-state network which provides an approximation of the underlying effective free energy landscape. The core of the method is the application of rate-distortion theory from information theory, allowing the individual data points to be assigned to multiple states simultaneously. We demonstrate the method's proficiency in its application to simulated trajectories as well as to experimental SM fluorescence resonance energy transfer (FRET) trajectories obtained from isolated agonist binding domains of the AMPA receptor, an ionotropic glutamate receptor that is prevalent in the central nervous system.
Tewatia, D K; Tolakanahalli, R P; Paliwal, B R; Tomé, W A
2011-04-07
The underlying requirements for successful implementation of any efficient tumour motion management strategy are regularity and reproducibility of a patient's breathing pattern. The physiological act of breathing is controlled by multiple nonlinear feedback and feed-forward couplings. It would therefore be appropriate to analyse the breathing pattern of lung cancer patients in the light of nonlinear dynamical system theory. The purpose of this paper is to analyse the one-dimensional respiratory time series of lung cancer patients based on nonlinear dynamics and delay coordinate state space embedding. It is very important to select a suitable pair of embedding dimension 'm' and time delay 'τ' when performing a state space reconstruction. Appropriate time delay and embedding dimension were obtained using well-established methods, namely mutual information and the false nearest neighbour method, respectively. Establishing stationarity and determinism in a given scalar time series is a prerequisite to demonstrating that the nonlinear dynamical system that gave rise to the scalar time series exhibits a sensitive dependence on initial conditions, i.e. is chaotic. Hence, once an appropriate state space embedding of the dynamical system has been reconstructed, we show that the time series of the nonlinear dynamical systems under study are both stationary and deterministic in nature. Once both criteria are established, we proceed to calculate the largest Lyapunov exponent (LLE), which is an invariant quantity under time delay embedding. The LLE for all 16 patients is positive, which along with stationarity and determinism establishes the fact that the time series of a lung cancer patient's breathing pattern is not random or irregular, but rather it is deterministic in nature albeit chaotic. These results indicate that chaotic characteristics exist in the respiratory waveform and techniques based on state space dynamics should be employed for tumour motion management.
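The delay-coordinate reconstruction at the heart of this analysis is simple to implement. The values of m and τ below are illustrative; the paper derives them per patient from mutual information and the false-nearest-neighbour method:

```python
import numpy as np

def delay_embed(x, m, tau):
    """Reconstruct an m-dimensional state space from a scalar series via
    delay coordinates: v_i = (x_i, x_{i+tau}, ..., x_{i+(m-1)tau})."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(m)])

# Illustrative stand-in for a respiratory trace sampled at 10 Hz.
t = np.arange(0, 200, 0.1)
x = np.sin(t)
states = delay_embed(x, m=3, tau=16)   # each row is one reconstructed state
```

Invariants such as the largest Lyapunov exponent are then computed from nearest-neighbour divergence in this reconstructed state space.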
Solar Environmental Disturbances
2007-11-02
Sun-like stars were examined, extending the previous 7-12 year time series to 13-20 years by combining Strömgren b, y photometry from Lowell Observatory... explanations for how these physical processes affect the production of solar activity, both on short and long time scales. Solar cycle variation...
NASA Astrophysics Data System (ADS)
Korotkov, E. V.; Korotkova, M. A.
2017-01-01
The purpose of this study was to detect latent periodicity in the presence of deletions or insertions in the analyzed data, when the points of deletions or insertions are unknown. A mathematical method was developed to search for periodicity in numerical series, using dynamic programming and random matrices. The developed method was applied to search for periodicity in the Euro/Dollar exchange rate since 2001. The presence of periodicity with a period length equal to 24 h was shown in the analyzed financial series. The periodicity can be detected only when insertions and deletions are taken into account. The results of this study show that the phase of the periodicity shifts depending on the observation time. The reasons for the existence of the periodicity in the financial series are discussed.
A Filtering of Incomplete GNSS Position Time Series with Probabilistic Principal Component Analysis
NASA Astrophysics Data System (ADS)
Gruszczynski, Maciej; Klos, Anna; Bogusz, Janusz
2018-04-01
For the first time, we introduced probabilistic principal component analysis (pPCA) for the spatio-temporal filtering of Global Navigation Satellite System (GNSS) position time series to estimate and remove the Common Mode Error (CME) without interpolation of missing values. We used data from the International GNSS Service (IGS) stations which contributed to the latest International Terrestrial Reference Frame (ITRF2014). The efficiency of the proposed algorithm was tested on simulated incomplete time series; then the CME was estimated for a set of 25 stations located in Central Europe. The newly applied pPCA was compared with previously used algorithms, which showed that this method is capable of resolving the problem of proper spatio-temporal filtering of GNSS time series characterized by different observation time spans. We showed that filtering can be carried out with the pPCA method even when two time series in the dataset have fewer than 100 common epochs of observations. The 1st Principal Component (PC) explained more than 36% of the total variance represented by the time series residuals (series with the deterministic model removed), which, compared to the variances of the other PCs (less than 8%), means that common signals are significant in GNSS residuals. A clear improvement in the spectral indices of the power-law noise was noticed for the Up component, which is reflected by an average shift towards white noise from - 0.98 to - 0.67 (30%). We observed a significant average reduction in the uncertainty of stations' velocities estimated from filtered residuals, by 35, 28 and 69% for the North, East, and Up components, respectively. The CME series were also analyzed in the context of the influence of environmental mass loading on the filtering results. Subtracting the environmental loading models from the GNSS residuals leads to a reduction of the estimated CME variance by 20 and 65% for the horizontal and vertical components, respectively.
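For intuition, the classical (non-probabilistic) PCA approach to common mode error estimation can be sketched on complete, simulated series. Handling missing epochs is precisely what pPCA adds and is not shown here; the station count, noise levels, and random-walk CME are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(11)

# Residual position series for 25 stations sharing a common mode error (CME).
n_epochs, n_sta = 1000, 25
cme = np.cumsum(rng.normal(size=n_epochs)) * 0.05   # common, correlated part
local = rng.normal(size=(n_epochs, n_sta))          # station-specific noise
resid = local + cme[:, None]

# Classical PCA via SVD of the centered residual matrix: the 1st PC captures
# the spatially coherent signal, which is then removed from every station.
U, S, Vt = np.linalg.svd(resid - resid.mean(axis=0), full_matrices=False)
common = np.outer(U[:, 0] * S[0], Vt[0])   # rank-1 common-mode estimate
filtered = resid - common
```

In the simulation the leading temporal mode tracks the injected CME closely, and subtracting it lowers the residual scatter at every station.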
Scaling Exponents in Financial Markets
NASA Astrophysics Data System (ADS)
Kim, Kyungsik; Kim, Cheol-Hyun; Kim, Soo Yong
2007-03-01
We study the dynamical behavior of four exchange rates in foreign exchange markets. A detrended fluctuation analysis (DFA) is applied to detect the long-range correlation embedded in the non-stationary time series. For our case, it is found that there exists a persistent long-range correlation in volatilities, which implies a deviation from the efficient market hypothesis. In particular, a crossover is shown to exist in the scaling behaviors of the volatilities.
Segmentation of time series with long-range fractal correlations.
Bernaola-Galván, P; Oliver, J L; Hackenberg, M; Coronado, A V; Ivanov, P Ch; Carpena, P
2012-06-01
Segmentation is a standard method of data analysis to identify change-points dividing a nonstationary time series into homogeneous segments. However, for long-range fractal correlated series, most of the segmentation techniques detect spurious change-points which are simply due to the heterogeneities induced by the correlations and not to real nonstationarities. To avoid this oversegmentation, we present a segmentation algorithm which takes as a reference for homogeneity, instead of a random i.i.d. series, a correlated series modeled by a fractional noise with the same degree of correlations as the series to be segmented. We apply our algorithm to artificial series with long-range correlations and show that it systematically detects only the change-points produced by real nonstationarities and not those created by the correlations of the signal. Further, we apply the method to the sequence of the long arm of human chromosome 21, which is known to have long-range fractal correlations. We obtain only three segments that clearly correspond to the three regions of different G + C composition revealed by means of a multi-scale wavelet plot. Similar results have been obtained when segmenting all human chromosome sequences, showing the existence of previously unknown huge compositional superstructures in the human genome.
NASA Astrophysics Data System (ADS)
Suhaila, Jamaludin; Yusop, Zulkifli
2017-06-01
Most trend analyses conducted to date have not considered the existence of a change point in the time series. When one occurs, the trend analysis will not be able to detect an obvious increasing or decreasing trend over certain parts of the series. Furthermore, the lack of discussion on the possible factors that influenced either the decreasing or the increasing trend in the series needs to be addressed in any trend analysis. Hence, this study investigates the trends and change point detection of mean, maximum and minimum temperature series, both annually and seasonally, in Peninsular Malaysia, and determines the possible factors that could contribute to the significant trends. In this study, the Pettitt and sequential Mann-Kendall (SQ-MK) tests were used to examine the occurrence of any abrupt climate changes in the independent series. The analyses of the abrupt changes in temperature series suggested that most of the change points in Peninsular Malaysia were detected during the years 1996, 1997 and 1998. These change points captured by the Pettitt and SQ-MK tests are possibly related to climatic factors, such as El Niño and La Niña events. The findings also showed that the majority of the significant change points that exist in the series are related to the significant trends of the stations. Significant increasing trends of annual and seasonal mean, maximum and minimum temperatures in Peninsular Malaysia were found, with a range of 2-5 °C/100 years during the last 32 years. It was observed that the magnitudes of the increasing trend in minimum temperatures were larger than those of the maximum temperatures for most of the studied stations, particularly the urban stations. These increases are suspected to be linked with the urban heat island effect in addition to the El Niño event.
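A minimal Mann-Kendall trend test (normal approximation, no tie correction) can be sketched as follows; the simulated 32-year warming series is hypothetical and stands in for a station temperature record:

```python
import numpy as np

def mann_kendall_s(x):
    """Mann-Kendall S statistic: sum of signs of all pairwise differences.
    S >> 0 suggests an increasing trend, S << 0 a decreasing one."""
    x = np.asarray(x)
    s = 0
    for i in range(len(x) - 1):
        s += np.sum(np.sign(x[i + 1:] - x[i]))
    return int(s)

def mann_kendall_z(x):
    """Z score under the no-trend null (no tie correction, for brevity)."""
    n = len(x)
    s = mann_kendall_s(x)
    var = n * (n - 1) * (2 * n + 5) / 18.0
    return (s - np.sign(s)) / np.sqrt(var)   # continuity correction

rng = np.random.default_rng(3)
# Hypothetical annual means: ~2 degrees of warming over 32 years plus noise.
warming = np.linspace(25.0, 27.0, 32) + 0.3 * rng.normal(size=32)
z = mann_kendall_z(warming)   # |z| > 1.96 -> significant at the 5% level
```

The sequential (SQ-MK) variant applies this statistic progressively forward and backward through the series; the crossing point of the two curves flags the change point.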
Chen, C P; Wan, J Z
1999-01-01
A fast learning algorithm is proposed to find the optimal weights of flat neural networks (especially the functional-link network). Although flat networks are used for nonlinear function approximation, they can be formulated as linear systems. Thus, the weights of the networks can be solved easily using a linear least-squares method. This formulation makes it easy to update the weights instantly when a new pattern or a new enhancement node is added. A dynamic stepwise updating algorithm is proposed to update the weights of the system on the fly. The model is tested on several time-series data sets, including an infrared laser data set, a chaotic time-series, a monthly flour price data set, and a nonlinear system identification problem. The simulation results are compared to existing models that require more complex architectures and more costly training. The results indicate that the proposed model is very attractive for real-time processes.
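The key point, that a flat network with fixed nonlinear enhancement nodes is linear in its output weights and therefore solvable in one least-squares step, can be sketched as follows. The random tanh enhancement, toy target, and node count are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy nonlinear target on 3-dimensional input windows.
X = rng.uniform(-1, 1, size=(200, 3))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] * X[:, 2]

# Functional-link "enhancement": fixed random nonlinear features of the input.
W = rng.normal(size=(3, 30))
b = rng.normal(size=30)
H = np.hstack([X, np.tanh(X @ W + b)])   # flat: inputs + enhancement nodes

# Because the output is linear in H, the weights solve a least-squares problem
# in closed form -- no iterative backpropagation needed.
beta, *_ = np.linalg.lstsq(H, y, rcond=None)
mse = float(np.mean((H @ beta - y) ** 2))
```

The stepwise updating idea follows from the same linearity: adding a pattern appends a row to H (and y), adding an enhancement node appends a column, and the solution can be updated incrementally rather than recomputed.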
AQUAdexIM: highly efficient in-memory indexing and querying of astronomy time series images
NASA Astrophysics Data System (ADS)
Hong, Zhi; Yu, Ce; Wang, Jie; Xiao, Jian; Cui, Chenzhou; Sun, Jizhou
2016-12-01
Astronomy has always been, and will continue to be, a data-based science, and astronomers nowadays are faced with increasingly massive datasets; one key problem is to efficiently retrieve the desired cup of data from the ocean. AQUAdexIM, an innovative spatial indexing and querying method, performs highly efficient on-the-fly queries at users' request, searching for Time Series Images in existing observation data on the server side and returning only the desired FITS images, so users no longer need to download entire datasets to their local machines, which will only become more and more impractical as data sizes keep increasing. Moreover, AQUAdexIM maintains a very low storage space overhead, and its specially designed in-memory index structure enables it to search for Time Series Images of a given area of the sky 10 times faster than using Redis, a state-of-the-art in-memory database.
Kusev, Petko; van Schaik, Paul; Tsaneva-Atanasova, Krasimira; Juliusson, Asgeir; Chater, Nick
2018-01-01
When attempting to predict future events, people commonly rely on historical data. One psychological characteristic of judgmental forecasting of time series, established by research, is that when people make forecasts from series, they tend to underestimate future values for upward trends and overestimate them for downward ones, so-called trend-damping (modeled by anchoring on, and insufficient adjustment from, the average of recent time series values). Events in a time series can be experienced sequentially (dynamic mode), or they can be viewed retrospectively and simultaneously (static mode), rather than experienced individually in real time. In one experiment, we studied the influence of presentation mode (dynamic and static) on two sorts of judgment: (a) predictions of the next event (forecast) and (b) estimation of the average value of all the events in the presented series (average estimation). Participants' responses in dynamic mode were anchored on more recent events than in static mode for all types of judgment, but with different consequences; hence, dynamic presentation improved prediction accuracy, but not estimation. These results are not anticipated by existing theoretical accounts; we develop and present an agent-based model, the adaptive anchoring model (ADAM), to account for the difference between processing sequences of dynamically and statically presented stimuli (visually presented data). ADAM captures how variation in presentation mode produces variation in responses (and the accuracy of these responses) in both forecasting and judgment tasks. ADAM's predictions for the forecasting and judgment tasks fit the response data better than a linear-regression time series model. Moreover, ADAM outperformed autoregressive-integrated-moving-average (ARIMA) and exponential-smoothing models, while neither of these models accounts for people's responses on the average estimation task. Copyright © 2017 The Authors.
Cognitive Science published by Wiley Periodicals, Inc. on behalf of Cognitive Science Society.
Detection of anomalous signals in temporally correlated data (Invited)
NASA Astrophysics Data System (ADS)
Langbein, J. O.
2010-12-01
Detection of transient tectonic signals in data obtained from large geodetic networks requires the ability to detect signals that are both temporally and spatially coherent. In this report I will describe a modification to an existing method that estimates both the coefficients of a temporally correlated noise model and an efficient filter based on the noise model. This filter, when applied to the original time-series, effectively whitens (or flattens) the power spectrum. The filtered data provide the means to calculate running averages which are then used to detect deviations from the background trends. For large networks, time-series of signal-to-noise ratio (SNR) can be easily constructed since, by filtering, each of the original time-series has been transformed into one that is closer to having a Gaussian distribution with a variance of 1.0. Anomalous intervals may be identified by counting the number of GPS sites for which the SNR exceeds a specified value. For example, during one time interval, if there were 5 out of 20 time-series with SNR>2, this would be considered anomalous; typically, at 95% confidence one would expect about 1 out of 20 time-series to have an SNR>2. For time intervals with an anomalously large number of high SNR, the spatial distribution of the SNR is mapped to identify the location of the anomalous signal(s) and their degree of spatial clustering. Estimating the filter that should be used to whiten the data requires modification of the existing methods that employ maximum likelihood estimation to determine the temporal covariance of the data. In these methods, it is assumed that the noise components in the data are a combination of white, flicker and random-walk processes and that they are derived from three different and independent sources. 
Instead, in this new method, the covariance matrix is constructed assuming that only one source is responsible for the noise and that source can be represented as a white-noise random-number generator convolved with a filter whose spectral properties are frequency (f) independent at the highest frequencies, 1/f at the middle frequencies, and 1/f² at the lowest frequencies. For data sets with no gaps in their time-series, construction of covariance and inverse covariance matrices is extremely efficient. Application of the above algorithm to real data potentially involves several iterations, as small tectonic signals of interest are often indistinguishable from background noise. Consequently, simply plotting the time-series of each GPS site is used to identify the largest outliers and signals independent of their cause. Any analysis of the background noise levels must factor in these other signals, while the gross outliers need to be removed.
Finding hidden periodic signals in time series - an application to stock prices
NASA Astrophysics Data System (ADS)
O'Shea, Michael
2014-03-01
Data in the form of time series appear in many areas of science. In cases where the periodicity is apparent and the only other contribution to the time series is stochastic in origin, the data can be 'folded' to improve the signal-to-noise ratio, and this has been done for light curves of variable stars, with the folding resulting in a cleaner light curve signal. Stock index prices versus time are classic examples of time series. Repeating patterns have been claimed by many workers and include unusually large returns on small-cap stocks during the month of January, and small returns on the Dow Jones Industrial Average (DJIA) in the months June through September compared to the rest of the year. Such observations imply that these prices have a periodic component. We investigate this for the DJIA. If such a component exists it is hidden in a large non-periodic variation and a large stochastic variation. We show how to extract this periodic component and for the first time reveal its yearly (averaged) shape. This periodic component leads directly to the 'Sell in May and buy at Halloween' adage. We also drill down and show that this yearly variation emerges from approximately half of the underlying stocks making up the DJIA index.
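The folding procedure can be sketched as follows; the monthly sine-plus-noise series is a synthetic stand-in for index returns, with the period, amplitude, and noise level chosen for illustration:

```python
import numpy as np

def fold(t, x, period, n_bins=12):
    """Phase-fold a series: average x in bins of (t mod period).
    Averaging across many cycles suppresses the stochastic component."""
    phase = np.mod(t, period) / period
    bins = np.minimum((phase * n_bins).astype(int), n_bins - 1)
    return np.array([x[bins == k].mean() for k in range(n_bins)])

rng = np.random.default_rng(5)
t = np.arange(400.0)                          # e.g. 400 months
signal = np.sin(2 * np.pi * t / 12.0)         # hidden yearly cycle
x = signal + 2.0 * rng.normal(size=len(t))    # buried in noise
profile = fold(t, x, period=12.0)             # recovered yearly shape
```

With ~33 cycles averaged, the bin noise shrinks by roughly the square root of the cycle count, so the yearly shape emerges even though it is invisible in any single year.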
Visplause: Visual Data Quality Assessment of Many Time Series Using Plausibility Checks.
Arbesser, Clemens; Spechtenhauser, Florian; Muhlbacher, Thomas; Piringer, Harald
2017-01-01
Trends like decentralized energy production lead to an exploding number of time series from sensors and other sources that need to be assessed regarding their data quality (DQ). While the identification of DQ problems for such routinely collected data is typically based on existing automated plausibility checks, an efficient inspection and validation of check results for hundreds or thousands of time series is challenging. The main contribution of this paper is the validated design of Visplause, a system to support an efficient inspection of DQ problems for many time series. The key idea of Visplause is to utilize meta-information concerning the semantics of both the time series and the plausibility checks for structuring and summarizing results of DQ checks in a flexible way. Linked views enable users to inspect anomalies in detail and to generate hypotheses about possible causes. The design of Visplause was guided by goals derived from a comprehensive task analysis with domain experts in the energy sector. We reflect on the design process by discussing design decisions at four stages and we identify lessons learned. We also report feedback from domain experts after using Visplause for a period of one month. This feedback suggests significant efficiency gains for DQ assessment, increased confidence in the DQ, and the applicability of Visplause to summarize indicators also outside the context of DQ.
Functional MRI and Multivariate Autoregressive Models
Rogers, Baxter P.; Katwal, Santosh B.; Morgan, Victoria L.; Asplund, Christopher L.; Gore, John C.
2010-01-01
Connectivity refers to the relationships that exist between different regions of the brain. In the context of functional magnetic resonance imaging (fMRI), it implies a quantifiable relationship between hemodynamic signals from different regions. One aspect of this relationship is the existence of small timing differences in the signals in different regions. Delays of 100 ms or less may be measured with fMRI, and these may reflect important aspects of the manner in which brain circuits respond as well as the overall functional organization of the brain. The multivariate autoregressive time series model has features to recommend it for measuring these delays, and is straightforward to apply to hemodynamic data. In this review, we describe the current usage of the multivariate autoregressive model for fMRI, discuss the issues that arise when it is applied to hemodynamic time series, and consider several extensions. Connectivity measures like Granger causality that are based on the autoregressive model do not always reflect true neuronal connectivity; however, we conclude that careful experimental design could make this methodology quite useful in extending the information obtainable using fMRI. PMID:20444566
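As a sketch of the underlying model (not the review's own code), a first-order multivariate autoregressive process X_t = A X_{t-1} + e_t can be fit by ordinary least squares; off-diagonal entries of A are the basis of Granger-style directed influence measures:

```python
import numpy as np

def fit_var1(X):
    """Least-squares fit of X[t] = A @ X[t-1] + e[t] for a
    multivariate time series X with shape (T, k); returns A."""
    past, pres = X[:-1], X[1:]
    # Each present row satisfies pres_row ~= past_row @ A.T,
    # so lstsq on (past, pres) estimates A.T.
    A_T, *_ = np.linalg.lstsq(past, pres, rcond=None)
    return A_T.T

# Simulate from a known coefficient matrix and recover it.
rng = np.random.default_rng(0)
A = np.array([[0.5, 0.2],
              [0.0, 0.4]])   # channel 2 drives channel 1, not vice versa
X = np.zeros((2000, 2))
for t in range(1, 2000):
    X[t] = A @ X[t - 1] + 0.01 * rng.standard_normal(2)
A_hat = fit_var1(X)
```

Here the near-zero estimate of `A_hat[1, 0]` versus the nonzero `A_hat[0, 1]` is the kind of asymmetry Granger-causality measures formalize.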
Time series modeling of live-cell shape dynamics for image-based phenotypic profiling.
Gordonov, Simon; Hwang, Mun Kyung; Wells, Alan; Gertler, Frank B; Lauffenburger, Douglas A; Bathe, Mark
2016-01-01
Live-cell imaging can be used to capture spatio-temporal aspects of cellular responses that are not accessible to fixed-cell imaging. As the use of live-cell imaging continues to increase, new computational procedures are needed to characterize and classify the temporal dynamics of individual cells. For this purpose, here we present the general experimental-computational framework SAPHIRE (Stochastic Annotation of Phenotypic Individual-cell Responses) to characterize phenotypic cellular responses from time series imaging datasets. Hidden Markov modeling is used to infer and annotate morphological state and state-switching properties from image-derived cell shape measurements. Time series modeling is performed on each cell individually, making the approach broadly useful for analyzing asynchronous cell populations. Two-color fluorescent cells simultaneously expressing actin and nuclear reporters enabled us to profile temporal changes in cell shape following pharmacological inhibition of cytoskeleton-regulatory signaling pathways. Results are compared with existing approaches conventionally applied to fixed-cell imaging datasets, and indicate that time series modeling captures heterogeneous dynamic cellular responses that can improve drug classification and offer additional important insight into mechanisms of drug action. The software is available at http://saphire-hcs.org.
Analysis on Difference of Forest Phenology Extracted from EVI and LAI Based on PhenoCams
NASA Astrophysics Data System (ADS)
Wang, C.; Jing, L.; Qinhuo, L.
2017-12-01
Land surface phenology can make up for the deficiency of field observation, with the advantage of capturing the continuous expression of phenology on a large scale. However, there is some variability in phenological metrics derived from different satellite time-series data of vegetation parameters. This paper aims at assessing the difference in phenology information extracted from EVI and LAI time series. To achieve this, several web-camera sites were selected to analyze the characteristics of MODIS-EVI and MODIS-LAI time series from 2010 to 2014 for different forest types, including evergreen coniferous forest, evergreen broadleaf forest, deciduous coniferous forest and deciduous broadleaf forest. At the same time, satellite-based phenological metrics were extracted by the Logistic algorithm and compared with camera-based phenological metrics. Results show that the SOS and EOS extracted from LAI are close to bud burst and leaf defoliation respectively, while the SOS and EOS extracted from EVI are close to leaf unfolding and leaf coloring respectively. Thus the SOS extracted from LAI is earlier than that from EVI, while the EOS extracted from LAI is later than that from EVI at deciduous forest sites. Although the seasonal variation characteristics of evergreen forests are not apparent, significant discrepancies exist between LAI time series and EVI time series. In addition, satellite- and camera-based phenological metrics generally agree well, but EVI has a higher correlation with the camera-based canopy greenness (green chromatic coordinate, gcc) than LAI.
2017-08-14
the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing this...place; c) Site visits took place for two of the candidate technologies, T- SERIES by ZeroBase and Sol-Char by the University of Colorado, within the...visits during the planned timeframe within the SLB-STO-D master plan; d) The T- Series by Zero-Base appears to be the most mature of all the industry
Streamflow properties from time series of surface velocity and stage
Plant, W.J.; Keller, W.C.; Hayes, K.; Spicer, K.
2005-01-01
Time series of surface velocity and stage have been collected simultaneously. Surface velocity was measured using an array of newly developed continuous-wave microwave sensors. Stage was obtained from the standard U.S. Geological Survey (USGS) measurements. The depth of the river was measured several times during our experiments using sounding weights. The data clearly showed that the point of zero flow was not the bottom at the measurement site, indicating that a downstream control exists. Fathometer measurements confirmed this finding. A model of the surface velocity expected at a site having a downstream control was developed. The model showed that the standard form for the friction velocity does not apply to sites where a downstream control exists. This model fit our measured surface velocity versus stage plots very well with reasonable values of the parameters. Discharges computed using the surface velocities and measured depths matched the USGS rating curve for the site. Values of depth-weighted mean velocities derived from our data did not agree with those expected from Manning's equation due to the downstream control. These results suggest that if real-time surface velocities were available at a gauging station, unstable stream beds could be monitored. Journal of Hydraulic Engineering © ASCE.
Taylor Series Trajectory Calculations Including Oblateness Effects and Variable Atmospheric Density
NASA Technical Reports Server (NTRS)
Scott, James R.
2011-01-01
Taylor series integration is implemented in NASA Glenn's Spacecraft N-body Analysis Program, and compared head-to-head with the code's existing 8th-order Runge-Kutta Fehlberg time integration scheme. This paper focuses on trajectory problems that include oblateness and/or variable atmospheric density. Taylor series is shown to be significantly faster and more accurate for oblateness problems up through a 4x4 field, with speedups ranging from a factor of 2 to 13. For problems with variable atmospheric density, speedups average 24 for atmospheric density alone, and average 1.6 to 8.2 when density and oblateness are combined.
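Taylor series integration propagates the solution with a truncated power series whose coefficients are generated by a recurrence rather than by numerical differencing. A toy illustration for the scalar problem y' = y² (exact solution y = y0/(1 - y0·t)), whose Taylor coefficients obey a simple Cauchy-product recurrence — this is only a sketch of the technique, not the NASA code:

```python
def taylor_step(y0, h, order=10):
    """One Taylor-series step for y' = y**2. The series coefficients
    satisfy a_{k+1} = (sum_j a_j * a_{k-j}) / (k + 1)."""
    a = [y0]
    for k in range(order):
        a.append(sum(a[j] * a[k - j] for j in range(k + 1)) / (k + 1))
    # Evaluate the truncated series at t = h (Horner's scheme).
    y = 0.0
    for c in reversed(a):
        y = y * h + c
    return y

# Integrate y' = y^2, y(0) = 1 up to t = 0.5; exact value is 2.
y = 1.0
for _ in range(10):
    y = taylor_step(y, 0.05)
```

Because the step uses a high-order local series, it can take larger steps at a given accuracy than a fixed-order Runge-Kutta scheme — the source of the speedups reported above.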
Autoregressive-model-based missing value estimation for DNA microarray time series data.
Choong, Miew Keen; Charbit, Maurice; Yan, Hong
2009-01-01
Missing value estimation is important in DNA microarray data analysis. A number of algorithms have been developed to solve this problem, but they have several limitations. Most existing algorithms are not able to deal with the situation where a particular time point (column) of the data is missing entirely. In this paper, we present an autoregressive-model-based missing value estimation method (ARLSimpute) that takes into account the dynamic property of microarray temporal data and the local similarity structures in the data. ARLSimpute is especially effective for the situation where a particular time point contains many missing values or where the entire time point is missing. Experiment results suggest that our proposed algorithm is an accurate missing value estimator in comparison with other imputation methods on simulated as well as real microarray time series datasets.
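The essence of autoregressive imputation — fit an AR(p) model to the observed samples and forecast the missing entry — can be sketched as follows (a simplified single-gap, single-series illustration; ARLSimpute itself also exploits local similarity structures across genes):

```python
import numpy as np

def ar_impute(series, missing_idx, p=2):
    """Fill series[missing_idx] with a one-step AR(p) forecast fitted
    by least squares to the values before the gap."""
    head = series[:missing_idx]
    n = len(head)
    # Lagged design matrix: row t holds [y_{t-1}, ..., y_{t-p}],
    # and the target is y_t.
    X = np.column_stack([head[p - 1 - j: n - 1 - j] for j in range(p)])
    coef, *_ = np.linalg.lstsq(X, head[p:], rcond=None)
    filled = series.copy()
    # Forecast from the last p observed values, most recent first.
    filled[missing_idx] = coef @ head[::-1][:p]
    return filled
```

For a noiseless sinusoid the AR(2) relation y_t = 2cos(ω)y_{t-1} - y_{t-2} holds exactly, so the forecast recovers the missing value almost to machine precision.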
Intermittent sea-level acceleration
NASA Astrophysics Data System (ADS)
Olivieri, M.; Spada, G.
2013-10-01
Using instrumental observations from the Permanent Service for Mean Sea Level (PSMSL), we provide a new assessment of the global sea-level acceleration for the last ~2 centuries (1820-2010). Our results, obtained by a stack of tide gauge time series, confirm the existence of a global sea-level acceleration (GSLA) and, consistently with independent assessments so far, point to a value close to 0.01 mm/yr². However, unlike previous studies, we discuss how change points or abrupt inflections in individual sea-level time series have contributed to the GSLA. Our analysis, based on methods borrowed from econometrics, suggests the existence of two distinct driving mechanisms for the GSLA, both involving a minority of tide gauges globally. The first effectively implies a gradual increase in the rate of sea-level rise at individual tide gauges, while the second is manifest through a sequence of catastrophic variations of the sea-level trend. These occurred intermittently since the end of the 19th century and have become more frequent during the last four decades.
Gene regulatory network identification from the yeast cell cycle based on a neuro-fuzzy system.
Wang, B H; Lim, J W; Lim, J S
2016-08-30
Many studies exist for reconstructing gene regulatory networks (GRNs). In this paper, we propose a method based on an advanced neuro-fuzzy system, for gene regulatory network reconstruction from microarray time-series data. This approach uses a neural network with a weighted fuzzy function to model the relationships between genes. Fuzzy rules, which determine the regulators of genes, are very simplified through this method. Additionally, a regulator selection procedure is proposed, which extracts the exact dynamic relationship between genes, using the information obtained from the weighted fuzzy function. Time-series related features are extracted from the original data to employ the characteristics of temporal data that are useful for accurate GRN reconstruction. The microarray dataset of the yeast cell cycle was used for our study. We measured the mean squared prediction error for the efficiency of the proposed approach and evaluated the accuracy in terms of precision, sensitivity, and F-score. The proposed method outperformed the other existing approaches.
Application of data cubes for improving detection of water cycle extreme events
NASA Astrophysics Data System (ADS)
Teng, W. L.; Albayrak, A.
2015-12-01
As part of an ongoing NASA-funded project to remove a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series), for the hydrology and other point-time series-oriented communities, "data cubes" are created from which time series files (aka "data rods") are generated on-the-fly and made available as Web services from the Goddard Earth Sciences Data and Information Services Center (GES DISC). Data cubes are data as archived rearranged into spatio-temporal matrices, which allow for easy access to the data, both spatially and temporally. A data cube is a specific case of the general optimal strategy of reorganizing data to match the desired means of access. The gain from such reorganization is greater the larger the data set. As a use case for our project, we are leveraging existing software to explore the application of the data cubes concept to machine learning, for the purpose of detecting water cycle extreme (WCE) events, a specific case of anomaly detection, requiring time series data. We investigate the use of the sequential probability ratio test (SPRT) for anomaly detection and support vector machines (SVM) for anomaly classification. We show an example of detection of WCE events, using the Global Land Data Assimilation Systems (GLDAS) data set.
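Wald's sequential probability ratio test mentioned above accumulates a log-likelihood ratio over incoming samples until one of two decision thresholds is crossed. A minimal sketch for detecting a shift in a Gaussian mean (illustrative parameters and function name, not the project's configuration):

```python
import math

def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Wald's SPRT for H0: mean=mu0 vs H1: mean=mu1 on i.i.d. Gaussian
    samples. Returns ('H0'|'H1'|'undecided', samples consumed)."""
    upper = math.log((1 - beta) / alpha)   # accept H1 above this
    lower = math.log(beta / (1 - alpha))   # accept H0 below this
    llr, n = 0.0, 0
    for n, x in enumerate(samples, 1):
        # Log-likelihood ratio increment of one Gaussian observation.
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", n
```

The appeal for streaming anomaly detection is that the test stops as soon as the evidence is decisive, typically after far fewer samples than a fixed-size test.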
Application of Data Cubes for Improving Detection of Water Cycle Extreme Events
NASA Technical Reports Server (NTRS)
Albayrak, Arif; Teng, William
2015-01-01
As part of an ongoing NASA-funded project to remove a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series), for the hydrology and other point-time series-oriented communities, "data cubes" are created from which time series files (aka "data rods") are generated on-the-fly and made available as Web services from the Goddard Earth Sciences Data and Information Services Center (GES DISC). Data cubes are data as archived rearranged into spatio-temporal matrices, which allow for easy access to the data, both spatially and temporally. A data cube is a specific case of the general optimal strategy of reorganizing data to match the desired means of access. The gain from such reorganization is greater the larger the data set. As a use case of our project, we are leveraging existing software to explore the application of the data cubes concept to machine learning, for the purpose of detecting water cycle extreme events, a specific case of anomaly detection, requiring time series data. We investigate the use of support vector machines (SVM) for anomaly classification. We show an example of detection of water cycle extreme events, using data from the Tropical Rainfall Measuring Mission (TRMM).
Dobson total ozone series of Oxford: Reevaluation and applications
NASA Astrophysics Data System (ADS)
Vogler, C.; BröNnimann, S.; Staehelin, J.; Griffin, R. E. M.
2007-10-01
We have reevaluated the original total ozone measurements made in Oxford between 1924 and 1957, with a view to extending backward in time the existing total ozone series from 1957 to 1975. The Oxford measurements are the oldest Dobson observations in the world. Their prime importance, when coupled with the series from Arosa (since 1926) and Tromsø (since 1935), is for increasing basic understanding of stratospheric ozone and dynamics, while in relation to studies of the recent ozone depletion they constitute a baseline of considerable (and unique) significance and value. However, the reevaluation was made difficult on account of changes to the instruments and wavelengths as the early data collection methods evolved, while unknowns due to the influence of aerosols and the possible presence of dioxides of sulphur and nitrogen created additional problems. Our reevaluation was based on statistical procedures (comparisons with meteorological upper air data and ozone series from Arosa) and also on corrections suggested by Dobson himself. The comparisons demonstrate that the data are internally consistent and of good quality. Nevertheless, as post-1957 data were not assessed in this study, the series cannot be recommended at present for trend analysis, though the series can be used for climatological studies. By supplementing the Oxford data with other existing series, we present a European total ozone climatology for 1924-1939, 1950-1965, and 1988-2000 and analyze the data with respect to variables measuring the strength and the temperature of the polar vortex.
Long-time predictions in nonlinear dynamics
NASA Technical Reports Server (NTRS)
Szebehely, V.
1980-01-01
It is known that nonintegrable dynamical systems do not allow precise predictions concerning their behavior for arbitrary long times. The available series solutions are not uniformly convergent according to Poincare's theorem and numerical integrations lose their meaningfulness after the elapse of arbitrary long times. Two approaches are the use of existing global integrals and statistical methods. This paper presents a generalized method along the first approach. As examples long-time predictions in the classical gravitational satellite and planetary problems are treated.
Mariani, Luigi; Zavatti, Franco
2017-09-01
The spectral periods of the North Atlantic Oscillation (NAO), Atlantic Multidecadal Oscillation (AMO) and El Niño Southern Oscillation (ENSO) were analyzed to verify how they imprint a time series of European temperature anomalies (ETA), two European temperature time series and some phenological series (dates of cherry flowering and grapevine harvest). The reference scenario for this work was the linear causal chain MCTP (Macroscale Circulation→Temperature→Phenology of crops), which links oceanic and atmospheric circulation to surface air temperature, which in turn determines the earliness of appearance of phenological phases of plants. Results show that cycles with the following central periods, in years, are present in the three segments of the MCTP causal chain (the percentage of the 12 analyzed time series affected by each cycle is in brackets): 65 (58%), 24 (58%), 20.5 (58%), 13.5 (50%), 11.5 (58%), 7.7 (75%), 5.5 (58%), 4.1 (58%), 3 (50%), 2.4 (67%). A comparison with short-term spectral peaks of the four El Niño regions (nino1+2, nino3, nino3.4 and nino4) shows that 10 of the 12 series are imprinted by periods around 2.3-2.4 yr, while 50-58% of the series are imprinted by El Niño periods of 4-4.2, 3.8-3.9 and 3-3.1 years. The analysis highlights the links among physical and biological variables of the climate system at scales that range from macroscale to microscale, knowledge of which is crucial to reach a suitable understanding of ecosystem behavior. The spectral analysis was also applied to a time series of spring-summer precipitation in order to evaluate the presence of peaks common with the other 12 selected series, with substantially negative results, which leads us to rule out the existence of a linear causal chain MCPP (Macroscale Circulation→Precipitation→Phenology). Copyright © 2017 Elsevier B.V. All rights reserved.
77 FR 65506 - Airworthiness Directives; The Boeing Company Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-29
...We propose to supersede an existing airworthiness directive (AD) that applies to certain The Boeing Company Model 757-200 and - 200PF series airplanes. The existing AD currently requires modification of the nacelle strut and wing structure, and repair of any damage found during the modification. Since we issued that AD, a compliance time error involving the optional threshold formula was discovered, which could allow an airplane to exceed the acceptable compliance time for addressing the unsafe condition. This proposed AD would specify a maximum compliance time limit that overrides the optional threshold formula results. We are proposing this AD to prevent fatigue cracking in primary strut structure and consequent reduced structural integrity of the strut.
Arismendi, Ivan; Dunham, Jason B.; Heck, Michael; Schultz, Luke; Hockman-Wert, David
2017-01-01
Intermittent and ephemeral streams represent more than half of the length of the global river network. Dryland freshwater ecosystems are especially vulnerable to changes in human-related water uses as well as shifts in terrestrial climates. Yet, the description and quantification of patterns of flow permanence in these systems is challenging mostly due to difficulties in instrumentation. Here, we took advantage of existing stream temperature datasets in dryland streams in the northwest Great Basin desert, USA, to extract critical information on climate-sensitive patterns of flow permanence. We used a signal detection technique, Hidden Markov Models (HMMs), to extract information from daily time series of stream temperature to diagnose patterns of stream drying. Specifically, we applied HMMs to time series of daily standard deviation (SD) of stream temperature (i.e., dry stream channels typically display highly variable daily temperature records compared to wet stream channels) between April and August (2015–2016). We used information from paired stream and air temperature data loggers as well as co-located stream temperature data loggers with electrical resistors as confirmatory sources of the timing of stream drying. We expanded our approach to an entire stream network to illustrate the utility of the method to detect patterns of flow permanence over a broader spatial extent. We successfully identified and separated signals characteristic of wet and dry stream conditions and their shifts over time. Most of our study sites within the entire stream network exhibited a single state over the entire season (80%), but a portion of them showed one or more shifts among states (17%). We provide recommendations to use this approach based on a series of simple steps. Our findings illustrate a successful method that can be used to rigorously quantify flow permanence regimes in streams using existing records of stream temperature.
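The decoding step of such an approach — inferring a wet/dry state sequence from daily temperature variability under a two-state Gaussian HMM — can be sketched with a log-space Viterbi pass (a simplified illustration with hand-set emission parameters, not the authors' fitted model):

```python
import numpy as np

def viterbi_2state(obs, means, sds, p_stay=0.95):
    """Most likely state path (0 = wet, 1 = dry) for a sequence of
    daily temperature SD values under a 2-state Gaussian HMM."""
    means, sds = np.asarray(means, float), np.asarray(sds, float)
    logA = np.log(np.array([[p_stay, 1 - p_stay],
                            [1 - p_stay, p_stay]]))

    def loglik(x):  # Gaussian log-density (up to a constant), both states
        return -0.5 * ((x - means) / sds) ** 2 - np.log(sds)

    T = len(obs)
    delta = np.zeros((T, 2))
    back = np.zeros((T, 2), dtype=int)
    delta[0] = np.log([0.5, 0.5]) + loglik(obs[0])
    for t in range(1, T):
        scores = delta[t - 1][:, None] + logA  # scores[i, j]: i -> j
        back[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + loglik(obs[t])
    path = np.zeros(T, dtype=int)
    path[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):
        path[t] = back[t + 1, path[t + 1]]
    return path
```

Dry channels show large daily temperature swings (high SD), wet ones small swings, so the two emission distributions separate the states; the sticky transition matrix suppresses day-to-day flicker.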
Schaefer, Alexander; Brach, Jennifer S.; Perera, Subashan; Sejdić, Ervin
2013-01-01
Background: The time evolution and complex interactions of many nonlinear systems, such as in the human body, result in fractal types of parameter outcomes that exhibit self-similarity over long time scales, described by a power law in the frequency spectrum S(f) = 1/f^β. The scaling exponent β is thus often interpreted as a "biomarker" of relative health and decline. New Method: This paper presents a thorough comparative numerical analysis of fractal characterization techniques with specific consideration given to experimentally measured gait stride interval time series. The ideal fractal signals generated in the numerical analysis are constrained under varying lengths and biases indicative of a range of physiologically conceivable fractal signals. This analysis complements previous investigations of fractal characteristics in healthy and pathological gait stride interval time series, with which this study is compared. Results: The results of our analysis showed that the averaged wavelet coefficient method consistently yielded the most accurate results. Comparison with Existing Methods: Class-dependent methods proved to be unsuitable for physiological time series. Detrended fluctuation analysis, the most prevalent method in the literature, exhibited large estimation variances. Conclusions: The comparative numerical analysis and experimental applications provide a thorough basis for determining an appropriate and robust method for measuring and comparing a physiologically meaningful biomarker, the spectral index β. In consideration of the constraints of application, we note the significant drawbacks of detrended fluctuation analysis and conclude that the averaged wavelet coefficient method can provide reasonable consistency and accuracy for characterizing these fractal time series. PMID:24200509
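For concreteness, a bare-bones version of detrended fluctuation analysis, one of the estimators compared above, is sketched below (illustrative only; scale choices and fitting details vary across implementations):

```python
import numpy as np

def dfa_alpha(x, scales=(8, 16, 32, 64, 128)):
    """DFA scaling exponent alpha: slope of log F(n) vs log n, where
    F(n) is the RMS of linearly detrended profile segments of size n."""
    x = np.asarray(x, float)
    y = np.cumsum(x - x.mean())            # integrated profile
    logs_n, logs_F = [], []
    t_cache = {n: np.arange(n) for n in scales}
    for n in scales:
        t = t_cache[n]
        mse = []
        for i in range(len(y) // n):
            seg = y[i * n:(i + 1) * n]
            coef = np.polyfit(t, seg, 1)   # local linear trend
            mse.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        logs_n.append(np.log(n))
        logs_F.append(0.5 * np.log(np.mean(mse)))  # log of RMS
    return np.polyfit(logs_n, logs_F, 1)[0]
```

For uncorrelated noise the expected exponent is alpha ≈ 0.5; long-range correlated (1/f-type) series push alpha toward 1. The large estimation variance noted in the abstract shows up as scatter in the log-log fit.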
Segmentation of time series with long-range fractal correlations
Bernaola-Galván, P.; Oliver, J.L.; Hackenberg, M.; Coronado, A.V.; Ivanov, P.Ch.; Carpena, P.
2012-01-01
Segmentation is a standard method of data analysis to identify change-points dividing a nonstationary time series into homogeneous segments. However, for long-range fractal correlated series, most of the segmentation techniques detect spurious change-points which are simply due to the heterogeneities induced by the correlations and not to real nonstationarities. To avoid this oversegmentation, we present a segmentation algorithm which takes as a reference for homogeneity, instead of a random i.i.d. series, a correlated series modeled by a fractional noise with the same degree of correlations as the series to be segmented. We apply our algorithm to artificial series with long-range correlations and show that it systematically detects only the change-points produced by real nonstationarities and not those created by the correlations of the signal. Further, we apply the method to the sequence of the long arm of human chromosome 21, which is known to have long-range fractal correlations. We obtain only three segments that clearly correspond to the three regions of different G + C composition revealed by means of a multi-scale wavelet plot. Similar results have been obtained when segmenting all human chromosome sequences, showing the existence of previously unknown huge compositional superstructures in the human genome. PMID:23645997
NASA Astrophysics Data System (ADS)
Juckett, David A.
2001-09-01
A more complete understanding of the periodic dynamics of the Sun requires continued exploration of non-11-year oscillations in addition to the benchmark 11-year sunspot cycle. In this regard, several solar, geomagnetic, and cosmic ray time series were examined to identify common spectral components and their relative phase relationships. Several non-11-year oscillations were identified within the near-decadal range with periods of ~8, 10, 12, 15, 18, 22, and 29 years. To test whether these frequency components were simply low-level noise or were related to a common source, the phases were extracted for each component in each series. The phases were nearly identical across the solar and geomagnetic series, while the corresponding components in four cosmic ray surrogate series exhibited inverted phases, similar to the known phase relationship with the 11-year sunspot cycle. Cluster analysis revealed that this pattern was unlikely to occur by chance. It was concluded that many non-11-year oscillations truly exist in the solar dynamical environment and that these contribute to the complex variations observed in geomagnetic and cosmic ray time series. Using the different energy sensitivities of the four cosmic ray surrogate series, a preliminary indication of the relative intensities of the various solar-induced oscillations was observed. It provides evidence that many of the non-11-year oscillations result from weak interplanetary magnetic field/solar wind oscillations that originate from corresponding variations in the open-field regions of the Sun.
Visibility graphlet approach to chaotic time series
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mutua, Stephen; Computer Science Department, Masinde Muliro University of Science and Technology, P.O. Box 190-50100, Kakamega; Gu, Changgui, E-mail: gu-changgui@163.com, E-mail: hjyang@ustc.edu.cn
Many novel methods have been proposed for mapping time series into complex networks. Although some dynamical behaviors can be effectively captured by existing approaches, the preservation and tracking of the temporal behaviors of a chaotic system remains an open problem. In this work, we extended the visibility graphlet approach to investigate both discrete and continuous chaotic time series. We applied visibility graphlets to capture the reconstructed local states, so that each is treated as a node and tracked downstream to create a temporal chain link. Our empirical findings show that the approach accurately captures the dynamical properties of chaotic systems. Networks constructed from periodic dynamic phases all converge to regular networks and to unique network structures for each model in the chaotic zones. Furthermore, our results show that the characterization of chaotic and non-chaotic zones in the Lorenz system corresponds to the maximal Lyapunov exponent, thus providing a simple and straightforward way to analyze chaotic systems.
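The underlying natural visibility criterion — two samples are linked if the straight line between them clears every intermediate sample — can be written directly (a sketch of the basic visibility graph, not the graphlet extension of this paper):

```python
def visibility_edges(series):
    """Natural visibility graph of a time series: nodes a < b are
    linked if every intermediate sample lies strictly below the
    straight line joining (a, series[a]) and (b, series[b])."""
    edges = set()
    n = len(series)
    for a in range(n):
        for b in range(a + 1, n):
            ya, yb = series[a], series[b]
            visible = all(
                series[c] < ya + (yb - ya) * (c - a) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.add((a, b))
    return edges
```

Consecutive samples are always mutually visible; a peak between two samples blocks their line of sight, so the network topology encodes the series' fluctuation structure.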
State energy data report 1996: Consumption estimates
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The State Energy Data Report (SEDR) provides annual time series estimates of State-level energy consumption by major economic sectors. The estimates are developed in the Combined State Energy Data System (CSEDS), which is maintained and operated by the Energy Information Administration (EIA). The goal in maintaining CSEDS is to create historical time series of energy consumption by State that are defined as consistently as possible over time and across sectors. CSEDS exists for two principal reasons: (1) to provide State energy consumption estimates to Members of Congress, Federal and State agencies, and the general public and (2) to provide the historical series necessary for EIA's energy models. To the degree possible, energy consumption has been assigned to five sectors: residential, commercial, industrial, transportation, and electric utility sectors. Fuels covered are coal, natural gas, petroleum, nuclear electric power, hydroelectric power, biomass, and other, defined as electric power generated from geothermal, wind, photovoltaic, and solar thermal energy. 322 tabs.
NASA Astrophysics Data System (ADS)
Lindholm, D. M.; Wilson, A.
2010-12-01
The Laboratory for Atmospheric and Space Physics at the University of Colorado has developed an Open Source, OPeNDAP compliant, Java Servlet based, RESTful web service to serve time series data. In addition to handling OPeNDAP style requests and returning standard responses, existing modules for alternate output formats can be reused or customized. It is also simple to reuse or customize modules to directly read various native data sources and even to perform some processing on the server. The server is built around a common data model based on the Unidata Common Data Model (CDM) which merges the NetCDF, HDF, and OPeNDAP data models. The server framework features a modular architecture that supports pluggable Readers, Writers, and Filters via the common interface to the data, enabling a workflow that reads data from their native form, performs some processing on the server, and presents the results to the client in its preferred form. The service is currently being used operationally to serve time series data for the LASP Interactive Solar Irradiance Data Center (LISIRD, http://lasp.colorado.edu/lisird/) and as part of the Time Series Data Server (TSDS, http://tsds.net/). I will present the data model and how it enables reading, writing, and processing concerns to be separated into loosely coupled components. I will also share thoughts for evolving beyond the time series abstraction and providing a general purpose data service that can be orchestrated into larger workflows.
Common mode error in Antarctic GPS coordinate time series and its effect on bedrock-uplift estimates
NASA Astrophysics Data System (ADS)
Liu, Bin; King, Matt; Dai, Wujiao
2018-05-01
Spatially correlated common mode error always exists in regional or larger GPS networks. We applied independent component analysis (ICA) to GPS vertical coordinate time series in Antarctica from 2010 to 2014 and made a comparison with principal component analysis (PCA). Using PCA/ICA, the time series can be decomposed into a set of temporal components and their spatial responses. We assume the components with common spatial responses are common mode error (CME). An average reduction of ~40% in the RMS values was achieved with both PCA and ICA filtering. However, the common mode components obtained from the two approaches have different spatial and temporal features. ICA time series present interesting correlations with modeled atmospheric and non-tidal ocean loading displacements. A white noise (WN) plus power law noise (PL) model was adopted in the GPS velocity estimation using maximum likelihood estimation (MLE) analysis, with a ~55% reduction of the velocity uncertainties after filtering using ICA. Meanwhile, spatiotemporal filtering reduces the amplitude of the PL and periodic terms in the GPS time series. Finally, we compare the GPS uplift velocities, after correction for elastic effects, with recent models of glacial isostatic adjustment (GIA). The agreement between the GPS observed velocities and four GIA models is generally improved after the spatiotemporal filtering, with a mean reduction of ~0.9 mm/yr in the WRMS values, possibly allowing for more confident separation of various GIA model predictions.
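The PCA variant of spatiotemporal filtering can be sketched in a few lines: stack the station residuals column-wise, take the leading principal component(s) as the common mode, and subtract them (an illustration of the generic technique, not the authors' ICA pipeline):

```python
import numpy as np

def remove_common_mode(X, n_comp=1):
    """Subtract the first n_comp principal components (treated as the
    network-wide common mode) from a residual matrix X of shape
    (n_epochs, n_stations), one station per column."""
    Xc = X - X.mean(axis=0)                      # center each station
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    # Rank-n_comp reconstruction = estimated common mode error.
    cme = U[:, :n_comp] * s[:n_comp] @ Vt[:n_comp]
    return Xc - cme
```

If all stations share one signal, the first principal component absorbs it entirely and the filtered residuals collapse to numerical noise; ICA differs in seeking statistically independent, rather than merely orthogonal, components.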
Abduallah, Yasser; Turki, Turki; Byron, Kevin; Du, Zongxuan; Cervantes-Cervantes, Miguel; Wang, Jason T L
2017-01-01
Gene regulation is a series of processes that control gene expression and its extent. The connections among genes and their regulatory molecules, usually transcription factors, and a descriptive model of such connections are known as gene regulatory networks (GRNs). Elucidating GRNs is crucial to understand the inner workings of the cell and the complexity of gene interactions. To date, numerous algorithms have been developed to infer gene regulatory networks. However, as the number of identified genes increases and the complexity of their interactions is uncovered, networks and their regulatory mechanisms become cumbersome to test. Furthermore, sifting through experimental results requires an enormous amount of computation, resulting in slow data processing. Therefore, new approaches are needed to expeditiously analyze copious amounts of experimental data resulting from cellular GRNs. To meet this need, cloud computing is promising, as reported in the literature. Here, we propose new MapReduce algorithms for inferring gene regulatory networks on a Hadoop cluster in a cloud environment. These algorithms employ an information-theoretic approach to infer GRNs using time-series microarray data. Experimental results show that our MapReduce program is much faster than an existing tool while also achieving slightly better prediction accuracy.
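An information-theoretic GRN score of the kind mentioned above is often a pairwise mutual information between expression profiles; the histogram estimator below is a minimal stand-in for the paper's MapReduce method, applied to synthetic regulator/target series:

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram estimate of mutual information (nats) between two
    expression time series; higher values suggest a regulatory link."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p = joint / joint.sum()
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0                         # avoid log(0) on empty cells
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(1)
n = 300
regulator = rng.normal(size=n)
target = 0.9 * regulator + 0.2 * rng.normal(size=n)   # regulated gene
unrelated = rng.normal(size=n)

mi_linked = mutual_information(regulator, target)
mi_null = mutual_information(regulator, unrelated)
```

In a MapReduce setting, each mapper would score one gene pair with a function like this, and reducers would assemble the network from high-scoring pairs.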
An Iterative Time Windowed Signature Algorithm for Time Dependent Transcription Module Discovery
Meng, Jia; Gao, Shou-Jiang; Huang, Yufei
2010-01-01
An algorithm for the discovery of time-varying modules using genome-wide expression data is presented here. When applied to large-scale time series data, our method is designed to discover not only the transcription modules but also their timing information, which is rarely annotated by existing approaches. Rather than assuming the commonly defined time-constant transcription modules, a module is depicted as a set of genes that are co-regulated during a specific period of time, i.e., a time dependent transcription module (TDTM). A rigorous mathematical definition of TDTM is provided, which serves as an objective function for retrieving modules. Based on the definition, an effective signature algorithm is proposed that iteratively searches for transcription modules in the time series data. The proposed method was tested on simulated systems and applied to human time series microarray data during Kaposi's sarcoma-associated herpesvirus (KSHV) infection. The results have been verified with Expression Analysis Systematic Explorer. PMID:21552463
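The iterative signature idea can be sketched as follows: starting from a seed gene set, alternate between scoring time points against the current genes and re-selecting genes against the active time points, so the module and its timing emerge together. All sizes, thresholds, and the planted module below are assumptions for illustration, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(2)
n_genes, n_times = 60, 40
E = rng.normal(size=(n_genes, n_times))
E[:10, 10:20] += 6.0        # planted module: genes 0-9 up at times 10-19

# z-score each gene across time
Ez = (E - E.mean(axis=1, keepdims=True)) / E.std(axis=1, keepdims=True)

genes = np.zeros(n_genes, bool)
genes[:3] = True                         # seed with a few module genes
for _ in range(20):                      # iterate toward a fixed point
    t_score = Ez[genes].mean(axis=0)     # score each time point
    times = t_score > 1.0                # the module's active period
    g_score = Ez[:, times].mean(axis=1)  # score genes on that period
    new_genes = g_score > 1.0
    if np.array_equal(new_genes, genes):
        break
    genes = new_genes
```

On convergence, `genes` holds the module membership and `times` its timing, which is the extra information a time-constant module definition cannot provide.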
NASA Astrophysics Data System (ADS)
Meshram, Sarita Gajbhiye; Singh, Sudhir Kumar; Meshram, Chandrashekhar; Deo, Ravinesh C.; Ambade, Balram
2017-12-01
Trend analysis of long-term rainfall records can be used to facilitate better agricultural water management decisions and climate risk studies. The main objective of this study was to identify the existing trends in the long-term rainfall time series over the period 1901-2010, utilizing 12 hydrological stations located in the Ken River basin (KRB) in Madhya Pradesh, India. To investigate the different trends, the rainfall time series data were divided into annual and seasonal (i.e., pre-monsoon, monsoon, post-monsoon, and winter season) sub-sets, and a statistical analysis using the non-parametric Mann-Kendall (MK) test and Sen's slope approach was applied to identify the nature of the existing trends in the rainfall series for the Ken River basin. The obtained results were further interpolated with the aid of the Quantum Geographic Information System (QGIS), employing inverse distance weighting. The results showed that the monsoon and the winter season exhibited a negative trend in rainfall changes over the period of study, and this was true for all stations, although the changes during the pre- and post-monsoon seasons were less significant. The outcomes of this research also suggest significant decreases in the seasonal and annual trends of rainfall amounts over the study period. These findings, showing a clear signature of climate change impacts on the KRB region, potentially have implications for the climate risk management strategies to be developed during major growing and harvesting seasons and can also aid the appropriate water resource management strategies that must be implemented in the decision-making process.
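A no-ties version of the Mann-Kendall test and Sen's slope is short enough to state directly; the synthetic declining rainfall series below is illustrative, not the KRB data:

```python
import numpy as np

def mann_kendall(x):
    """Mann-Kendall S statistic, normal-approximation Z (no ties
    assumed), and Sen's slope (median of all pairwise slopes)."""
    x = np.asarray(x, float)
    n = len(x)
    s = 0
    slopes = []
    for i in range(n - 1):
        diff = x[i + 1:] - x[i]
        s += int(np.sign(diff).sum())
        slopes.extend(diff / (np.arange(i + 1, n) - i))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0   # no-ties variance
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return s, float(z), float(np.median(slopes))

rng = np.random.default_rng(3)
years = np.arange(110)                       # 110 annual values, hypothetical
rain = 1200.0 - 1.5 * years + rng.normal(0, 30, years.size)  # mm, declining
s_stat, z_stat, sen = mann_kendall(rain)
```

|Z| > 1.96 marks a trend significant at the 5% level, and Sen's slope gives its magnitude in mm per year, robust to outliers.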
Flight test experience using advanced airborne equipment in a time-based metered traffic environment
NASA Technical Reports Server (NTRS)
Morello, S. A.
1980-01-01
A series of test flights has demonstrated that time-based metering guidance and control was acceptable to pilots and air traffic controllers. The descent algorithm of the technique, with good representation of aircraft performance and wind modeling, yielded arrival time accuracy within 12 sec. This is expected to represent significant fuel savings (1) through a reduction of the time error dispersions at the metering fix for the entire fleet, and (2) for individual aircraft as well, through the presentation of guidance for a fuel-efficient descent. Air traffic controller workloads were also reduced, in keeping with the reduction of required communications resulting from the transfer of navigation responsibilities to pilots. A second series of test flights demonstrated that an existing flight management system could be modified to operate in the new mode.
Quasi-experimental study designs series-paper 6: risk of bias assessment.
Waddington, Hugh; Aloe, Ariel M; Becker, Betsy Jane; Djimeu, Eric W; Hombrados, Jorge Garcia; Tugwell, Peter; Wells, George; Reeves, Barney
2017-09-01
Rigorous and transparent bias assessment is a core component of high-quality systematic reviews. We assess modifications to existing risk of bias approaches to incorporate rigorous quasi-experimental approaches with selection on unobservables. These are nonrandomized studies using design-based approaches to control for unobservable sources of confounding such as difference studies, instrumental variables, interrupted time series, natural experiments, and regression-discontinuity designs. We review existing risk of bias tools. Drawing on these tools, we present domains of bias and suggest directions for evaluation questions. The review suggests that existing risk of bias tools provide, to different degrees, incompletely transparent criteria to assess the validity of these designs. The paper then presents an approach to evaluating the internal validity of quasi-experiments with selection on unobservables. We conclude that tools for nonrandomized studies of interventions need to be further developed to incorporate evaluation questions for quasi-experiments with selection on unobservables. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Teng, William; Rui, Hualan; Strub, Richard; Vollmer, Bruce
2015-01-01
A "Digital Divide" has long stood between how NASA and other satellite-derived data are typically archived (time-step arrays or "maps") and how hydrology and other point-time series oriented communities prefer to access those data. In essence, the desired method of data access is orthogonal to the way the data are archived. Our approach to bridging the Divide is part of a larger NASA-supported "data rods" project to enhance access to and use of NASA and other data by the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Hydrologic Information System (HIS) and the larger hydrology community. Our main objective was to determine a way to reorganize data that is optimal for these communities. Two related objectives were to optimally reorganize data in a way that (1) is operational and fits in and leverages the existing Goddard Earth Sciences Data and Information Services Center (GES DISC) operational environment and (2) addresses the scaling up of data sets available as time series from those archived at the GES DISC to potentially include those from other Earth Observing System Data and Information System (EOSDIS) data archives. Through several prototype efforts and lessons learned, we arrived at a non-database solution that satisfied our objectives/constraints. We describe, in this presentation, how we implemented the operational production of pre-generated data rods and, considering the tradeoffs between length of time series (or number of time steps), resources needed, and performance, how we implemented the operational production of on-the-fly ("virtual") data rods. For the virtual data rods, we leveraged a number of existing resources, including the NASA Giovanni Cache and NetCDF Operators (NCO), and used data cubes processed in parallel. Our current benchmark performance for virtual generation of data rods is about a year's worth of time series for hourly data (~9,000 time steps) in ~90 seconds.
Our approach is a specific implementation of the general optimal strategy of reorganizing data to match the desired means of access. Results from our project have already significantly extended NASA data to the large and important hydrology user community that has been, heretofore, mostly unable to easily access and use NASA data.
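The core of the data-rods reorganization can be shown with plain arrays: store the archive so that time, not space, is the fastest-varying axis, and a point time series becomes one contiguous read instead of one read per time-step file. The array sizes below are arbitrary:

```python
import numpy as np

# Hypothetical archive: one 2-D grid per time step, i.e. an array of
# "maps" with shape (time, lat, lon), as satellite products are stored.
n_times, n_lat, n_lon = 1000, 20, 30
maps = np.arange(n_times * n_lat * n_lon, dtype=np.float32).reshape(
    n_times, n_lat, n_lon)

# "Data rods" reorganization: move time to the last (fastest) axis so a
# full time series for one grid cell is a single contiguous block.
rods = np.ascontiguousarray(np.moveaxis(maps, 0, -1))  # (lat, lon, time)

series = rods[5, 7]          # entire time series at cell (5, 7)
```

The same values are stored either way; only the layout changes, which is why pre-generated rods trade storage and preprocessing time for fast time-series access.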
Random walker in temporally deforming higher-order potential forces observed in a financial crisis.
Watanabe, Kota; Takayasu, Hideki; Takayasu, Misako
2009-11-01
Basic peculiarities of market price fluctuations are known to be well described by a recently developed random-walk model in a temporally deforming quadratic potential force whose center is given by a moving average of past price traces [M. Takayasu, T. Mizuno, and H. Takayasu, Physica A 370, 91 (2006)]. By analyzing high-frequency financial time series of exceptional events, such as bubbles and crashes, we confirm the appearance of a higher-order potential force in the markets. We show the statistical significance of its existence by applying an information criterion. This time series analysis is expected to be applied widely for detecting a nonstationary symptom in random phenomena.
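The dynamics described above can be simulated directly: a random walker pulled toward the moving average of its own past by the gradient of a potential with quadratic and higher-order terms. Window length and coefficients below are assumed values, not those fitted in the paper:

```python
import numpy as np

rng = np.random.default_rng(4)
n, K = 5000, 10        # steps and moving-average window (assumed)
b2, b4 = 0.3, 0.01     # quadratic and higher-order (quartic) strengths

p = np.zeros(n)
for t in range(K, n - 1):
    m = p[t - K:t].mean()             # potential center: moving average
    x = p[t] - m                      # displacement from the center
    # step = -dU/dx for U(x) = (b2/2) x^2 + (b4/4) x^4, plus a shock
    p[t + 1] = p[t] - (b2 * x + b4 * x ** 3) + rng.normal(0.0, 1.0)

# the restoring force keeps the displacement from the center bounded
disp = p[K:] - np.array([p[t - K:t].mean() for t in range(K, n)])
var_disp = float(disp.var())
```

With b4 = 0 this reduces to the quadratic-potential model of the cited reference; a nonzero b4 is the kind of higher-order term whose presence the paper tests for during bubbles and crashes.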
Aerosol Index Dynamics over Athens and Beijing
NASA Astrophysics Data System (ADS)
Christodoulakis, J.; Varotsos, C.; Tzanis, C.; Xue, Y.
2014-11-01
We present the analysis of monthly mean Aerosol Index (AI) values over Athens, Greece, and Beijing, China, for the period 1979-2012. The aim of the analysis is the identification of time scaling in the AI time series, using a data analysis technique that is not affected by the non-stationarity of the data. The appropriate technique satisfying this criterion is Detrended Fluctuation Analysis (DFA). For the deseasonalization of the time series, the classic Wiener method was applied, filtering out the seasonal (3-month), semiannual (6-month), and annual (12-month) periods. The data analysis for both Athens and Beijing revealed that the exponents α for both time periods are greater than 0.5, indicating that persistence of the correlations in the fluctuations of the deseasonalized AI values exists for time scales between about 4 months and 3.5 years (for the period 1979-1993) or 4 years (for the period 1996-2012).
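DFA itself is compact: integrate the centered series, detrend it in windows of increasing size, and read the scaling exponent α off the log-log slope of the fluctuation function. The sketch below recovers the textbook values α ≈ 0.5 for uncorrelated noise (the boundary against which the AI exponents above are compared) and α ≈ 1.5 for a random walk:

```python
import numpy as np

def dfa_exponent(x, scales):
    """First-order DFA: slope of log F(n) versus log n."""
    y = np.cumsum(x - np.mean(x))            # integrated profile
    flucts = []
    for n in scales:
        n_seg = len(y) // n
        segs = y[:n_seg * n].reshape(n_seg, n)
        t = np.arange(n)
        # linear least-squares detrend of every segment at once
        coef = np.polyfit(t, segs.T, 1)
        trend = np.outer(coef[0], t) + coef[1][:, None]
        flucts.append(np.sqrt(np.mean((segs - trend) ** 2)))
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return float(alpha)

rng = np.random.default_rng(5)
scales = [4, 8, 16, 32, 64]
alpha_wn = dfa_exponent(rng.normal(size=4096), scales)             # ~0.5
alpha_rw = dfa_exponent(np.cumsum(rng.normal(size=4096)), scales)  # ~1.5
```

Exponents between 0.5 and 1.0, as found for the deseasonalized AI series, indicate persistent long-range correlations.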
NASA Technical Reports Server (NTRS)
Keeling, Ralph F.; Campbell, J. A. (Technical Monitor)
2002-01-01
We successfully initiated a program to obtain continuous time series of atmospheric O2 concentrations at a semi-remote coastal site in Trinidad, California. The installation, which was completed in September 1999, consists of commercially available O2 and CO2 analyzers interfaced to a custom gas handling system and housed in a dedicated building at the Trinidad site. Ultimately, the data from this site are expected to provide constraints, complementing satellite data, on variations in ocean productivity and carbon exchange on annual and interannual time scales, in the context of human-induced changes in global climate and other perturbations. The existing time series, of limited duration, have been used in support of studies of the O2/CO2 exchange from a wild fire (which fortuitously occurred nearby in October 1999) and to quantify air-sea N2O and O2 exchanges related to coastal upwelling events. More generally, the project demonstrates the feasibility of obtaining semi-continuous O2 time series at moderate cost from strategic locations globally.
An M-estimator for reduced-rank system identification.
Chen, Shaojie; Liu, Kai; Yang, Yuguang; Xu, Yuting; Lee, Seonjoo; Lindquist, Martin; Caffo, Brian S; Vogelstein, Joshua T
2017-01-15
High-dimensional time-series data from a wide variety of domains, such as neuroscience, are being generated every day. Fitting statistical models to such data, to enable parameter estimation and time-series prediction, is an important computational primitive. Existing methods, however, are unable to cope with the high-dimensional nature of these data, due to both computational and statistical reasons. We mitigate both kinds of issues by proposing an M-estimator for Reduced-rank System IDentification (MR. SID). A combination of low-rank approximations, ℓ1 and ℓ2 penalties, and some numerical linear algebra tricks, yields an estimator that is computationally efficient and numerically stable. Simulations and real data examples demonstrate the usefulness of this approach in a variety of problems. In particular, we demonstrate that MR. SID can accurately estimate spatial filters, connectivity graphs, and time-courses from native resolution functional magnetic resonance imaging data. MR. SID therefore enables big time-series data to be analyzed using standard methods, readying the field for further generalizations including non-linear and non-Gaussian state-space models.
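A minimal reduced-rank system-identification baseline, omitting MR. SID's ℓ1/ℓ2 penalties, is ordinary least squares followed by a truncated-SVD projection of the transition matrix; the rank-3 VAR(1) system below is synthetic:

```python
import numpy as np

rng = np.random.default_rng(6)
p_dim, r, T = 30, 3, 2000        # observed dim, true rank, length (assumed)
# Stable rank-3 VAR(1): X_{t+1} = A X_t + noise, with A a scaled product
U = rng.normal(size=(p_dim, r))
V = rng.normal(size=(r, p_dim))
A = U @ V
A *= 0.9 / np.abs(np.linalg.eigvals(A)).max()   # spectral radius 0.9

X = np.zeros((T, p_dim))
for t in range(T - 1):
    X[t + 1] = X[t] @ A.T + rng.normal(0.0, 0.1, p_dim)

# Full least-squares estimate, then truncated-SVD projection to rank r
# (an illustrative stand-in for the penalized M-estimator).
A_ls = np.linalg.lstsq(X[:-1], X[1:], rcond=None)[0].T
Uh, sh, Vh = np.linalg.svd(A_ls)
A_rr = (Uh[:, :r] * sh[:r]) @ Vh[:r]

err_ls = float(np.linalg.norm(A_ls - A))
err_rr = float(np.linalg.norm(A_rr - A))
```

Discarding the trailing singular values removes estimation noise in the directions where the true dynamics have no signal, which is the statistical payoff of the reduced-rank constraint.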
Inverse sequential procedures for the monitoring of time series
NASA Technical Reports Server (NTRS)
Radok, Uwe; Brown, Timothy J.
1995-01-01
When one or more new values are added to a developing time series, they change its descriptive parameters (mean, variance, trend, coherence). A 'change index (CI)' is developed as a quantitative indicator that the changed parameters remain compatible with the existing 'base' data. CI formulae are derived, in terms of normalized likelihood ratios, for small samples from Poisson, Gaussian, and Chi-Square distributions, and for regression coefficients measuring linear or exponential trends. A substantial parameter change creates a rapid or abrupt CI decrease which persists when the length of the base is changed. Except for a special Gaussian case, the CI has no simple explicit regions for tests of hypotheses. However, its design ensures that the series sampled need not conform strictly to the distribution form assumed for the parameter estimates. The use of the CI is illustrated with both constructed and observed data samples, processed with the Fortran code 'Sequitor'.
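One Gaussian instance of such a change index can be sketched as a per-observation normalized likelihood ratio: how well does the base-fitted distribution explain the new values, relative to their own best fit? This is an illustrative stand-in, not the paper's exact small-sample formulae:

```python
import numpy as np

def change_index(base, new):
    """Normalized Gaussian likelihood ratio in (0, 1]: compatibility of
    the new sample with the mean/std fitted to the base data."""
    mu0, s0 = base.mean(), base.std()
    mu1, s1 = new.mean(), new.std()       # MLE under the new sample itself
    n = len(new)
    ll0 = -0.5 * np.sum(((new - mu0) / s0) ** 2) - n * np.log(s0)
    ll1 = -0.5 * np.sum(((new - mu1) / s1) ** 2) - n * np.log(s1)
    return float(np.exp((ll0 - ll1) / n))  # 1 = fully compatible

rng = np.random.default_rng(7)
base = rng.normal(0.0, 1.0, 200)
ci_same = change_index(base, rng.normal(0.0, 1.0, 30))   # same regime
ci_shift = change_index(base, rng.normal(3.0, 1.0, 30))  # abrupt mean shift
```

A value near 1 means the new segment is compatible with the base; a rapid drop, as for the shifted sample, is the signal of a substantial parameter change.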
Effect of spatial image support in detecting long-term vegetation change from satellite time-series
USDA-ARS?s Scientific Manuscript database
Context Arid rangelands have been severely degraded over the past century. Multi-temporal remote sensing techniques are ideally suited to detect significant changes in ecosystem state; however, considerable uncertainty exists regarding the effects of changing image resolution on their ability to de...
78 FR 28152 - Airworthiness Directives; Airbus Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-14
... series airplanes. The existing AD currently requires repetitive inspections of the 80VU rack lower lateral fittings for damage; repetitive inspections of the 80VU rack lower central support for cracking... fittings of the 80VU rack. This proposed AD would reduce the inspection compliance time, add an inspection...
The State's Formula for Success 2006
ERIC Educational Resources Information Center
Colorado Department of Education, 2006
2006-01-01
This standards review is the second in a series of annual reviews of the Colorado Model Content Standards. Its purpose is to identify student performance over time on measures of existing science standards, identify ways to affirm and strengthen standards and more clearly articulate the practices used by Colorado schools to promote student…
Identification of varying time scales in sediment transport using the Hilbert-Huang Transform method
NASA Astrophysics Data System (ADS)
Kuai, Ken Z.; Tsai, Christina W.
2012-02-01
Sediment transport processes vary at a variety of time scales - from seconds, hours, days to months and years. Multiple time scales exist in the system of flow, sediment transport, and bed elevation change processes. As such, identification and selection of appropriate time scales for flow and sediment processes can assist in formulating a system of flow and sediment governing equations representative of the dynamic interaction of flow and particles at the desired level of detail. Recognizing the importance of the different varying time scales in the fluvial processes of sediment transport, we introduce the Hilbert-Huang Transform (HHT) method to the field of sediment transport for time scale analysis. The HHT uses the Empirical Mode Decomposition (EMD) method to decompose a time series into a collection of Intrinsic Mode Functions (IMFs), and uses the Hilbert Spectral Analysis (HSA) to obtain instantaneous frequency data. The EMD extracts the variability of data with different time scales and improves the analysis of data series. The HSA can display the succession of time-varying time scales, which cannot be captured by the often-used Fast Fourier Transform (FFT) method. This study is one of the earlier attempts to introduce this state-of-the-art technique for multiple time scale analysis of sediment transport processes. Three practical applications of the HHT method for data analysis of both suspended sediment and bedload transport time series are presented. The analysis results show the strong impact of flood waves on the variations of flow and sediment time scales at a large sampling time scale, as well as the impact of flow turbulence on those time scales at a smaller sampling time scale. Our analysis reveals that the existence of multiple time scales in sediment transport processes may be attributed to the fractal nature of sediment transport.
It can be demonstrated by the HHT analysis that the bedload motion time scale is better represented by the ratio of the water depth to the settling velocity, h/w. In the final part, HHT results are compared with an available time scale formula from the literature.
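The first EMD sifting step is easy to sketch: subtract the mean of the upper and lower extrema envelopes until a fast intrinsic mode function remains. The version below uses linear rather than the usual cubic-spline envelopes for brevity, on a synthetic two-scale signal:

```python
import numpy as np

def envelope_mean(h):
    """Mean of linearly interpolated upper/lower extrema envelopes
    (a simplified stand-in for the cubic-spline envelopes of full EMD)."""
    t = np.arange(len(h))
    up = np.flatnonzero((h[1:-1] >= h[:-2]) & (h[1:-1] > h[2:])) + 1
    lo = np.flatnonzero((h[1:-1] <= h[:-2]) & (h[1:-1] < h[2:])) + 1
    if len(up) < 2 or len(lo) < 2:
        return None                      # too few extrema to continue
    return (np.interp(t, up, h[up]) + np.interp(t, lo, h[lo])) / 2

def sift_imf(x, n_sift=10):
    """Extract the first Intrinsic Mode Function by repeated sifting."""
    h = x.copy()
    for _ in range(n_sift):
        m = envelope_mean(h)
        if m is None:
            break
        h = h - m
    return h

t = np.linspace(0.0, 1.0, 2000)
fast = np.sin(2 * np.pi * 50 * t)   # short-time-scale component
slow = np.sin(2 * np.pi * 3 * t)    # long-time-scale component
imf1 = sift_imf(fast + slow)        # should isolate the fast oscillation
resid = fast + slow - imf1          # residual carries the slow variation
```

Repeating the procedure on the residual yields the full set of IMFs, each carrying one band of time scales, which is what makes the decomposition suited to separating turbulence-scale from flood-wave-scale variability.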
NASA Astrophysics Data System (ADS)
Shi, Wenhui; Feng, Changyou; Qu, Jixian; Zha, Hao; Ke, Dan
2018-02-01
Most existing studies of wind power output focus on the fluctuation of wind farms, while the spatial self-complementarity of wind power output time series is ignored. As a result, the existing probability models cannot reflect the features of power systems incorporating wind farms. This paper analyzes the spatial self-complementarity of wind power and proposes a probability model which can reflect the temporal characteristics of wind power on seasonal and diurnal timescales, based on sufficient measured data and an improved clustering method. This model can provide an important reference for the simulation of power systems incorporating wind farms.
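A clustering-based diurnal probability model of the kind proposed can be sketched with k-means over daily output profiles, the cluster frequencies serving as regime probabilities. The two synthetic regimes below are assumptions for illustration, not the paper's measured data or its improved clustering method:

```python
import numpy as np

rng = np.random.default_rng(10)
# Hypothetical measurements: 365 daily profiles (24 hourly values of
# normalized output), mixing a night-peaking and a flat diurnal regime.
hours = np.arange(24)
night = 0.6 + 0.3 * np.cos(2 * np.pi * hours / 24)
flat = np.full(24, 0.5)
days = np.array([night if rng.random() < 0.5 else flat for _ in range(365)])
days = days + rng.normal(0.0, 0.05, days.shape)

# Plain k-means over daily profiles; initialize with the extreme-mean
# days so the two centers start in different regimes.
k = 2
centers = days[[days.mean(axis=1).argmax(), days.mean(axis=1).argmin()]]
for _ in range(25):
    dist = ((days[:, None, :] - centers[None]) ** 2).sum(axis=-1)
    labels = dist.argmin(axis=1)
    centers = np.array([days[labels == j].mean(axis=0) for j in range(k)])

# Regime probabilities: fraction of days in each diurnal cluster
probs = np.bincount(labels, minlength=k) / len(days)
```

Sampling a cluster by its probability and then a profile near its center gives a generator of diurnal output patterns for power-system simulation.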
Time-Resolved Transposon Insertion Sequencing Reveals Genome-Wide Fitness Dynamics during Infection.
Yang, Guanhua; Billings, Gabriel; Hubbard, Troy P; Park, Joseph S; Yin Leung, Ka; Liu, Qin; Davis, Brigid M; Zhang, Yuanxing; Wang, Qiyao; Waldor, Matthew K
2017-10-03
Transposon insertion sequencing (TIS) is a powerful high-throughput genetic technique that is transforming functional genomics in prokaryotes, because it enables genome-wide mapping of the determinants of fitness. However, current approaches for analyzing TIS data assume that selective pressures are constant over time and thus do not yield information regarding changes in the genetic requirements for growth in dynamic environments (e.g., during infection). Here, we describe structured analysis of TIS data collected as a time series, termed pattern analysis of conditional essentiality (PACE). From a temporal series of TIS data, PACE derives a quantitative assessment of each mutant's fitness over the course of an experiment and identifies mutants with related fitness profiles. In so doing, PACE circumvents major limitations of existing methodologies, specifically the need for artificial effect size thresholds and enumeration of bacterial population expansion. We used PACE to analyze TIS samples of Edwardsiella piscicida (a fish pathogen) collected over a 2-week infection period from a natural host (the flatfish turbot). PACE uncovered more genes that affect E. piscicida's fitness in vivo than were detected using a cutoff at a terminal sampling point, and it identified subpopulations of mutants with distinct fitness profiles, one of which informed the design of new live vaccine candidates. Overall, PACE enables efficient mining of time series TIS data and enhances the power and sensitivity of TIS-based analyses. IMPORTANCE Transposon insertion sequencing (TIS) enables genome-wide mapping of the genetic determinants of fitness, typically based on observations at a single sampling point. Here, we move beyond analysis of endpoint TIS data to create a framework for analysis of time series TIS data, termed pattern analysis of conditional essentiality (PACE).
We applied PACE to identify genes that contribute to colonization of a natural host by the fish pathogen Edwardsiella piscicida. PACE uncovered more genes that affect E. piscicida's fitness in vivo than were detected using a terminal sampling point, and its clustering of mutants with related fitness profiles informed design of new live vaccine candidates. PACE yields insights into patterns of fitness dynamics and circumvents major limitations of existing methodologies. Finally, the PACE method should be applicable to additional "omic" time series data, including screens based on clustered regularly interspaced short palindromic repeats with Cas9 (CRISPR/Cas9). Copyright © 2017 Yang et al.
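The input to a PACE-style analysis, a per-mutant fitness profile over the sampling series, can be sketched as median-normalized log read-count ratios; the planted "steadily depleted" and "depleted late" mutant groups below are synthetic, not the E. piscicida data:

```python
import numpy as np

rng = np.random.default_rng(11)
# Hypothetical TIS experiment: read counts for 300 insertion mutants at
# 5 in-vivo sampling times; most mutants are neutral.
n_mut, n_t = 300, 5
fitness = np.zeros((n_mut, n_t))                 # log-scale fitness effect
fitness[:30] = -1.0 * np.arange(1, n_t + 1)      # steadily depleted group
fitness[30:60, 2:] = -2.0                        # depleted only late
counts0 = rng.integers(500, 1500, n_mut).astype(float)
counts = counts0[:, None] * np.exp(fitness + rng.normal(0, 0.1, (n_mut, n_t)))

# Fitness profile per mutant: log ratio to the input library, normalized
# by the per-timepoint median so neutral mutants sit near zero.
logratio = np.log(counts / counts0[:, None])
profile = logratio - np.median(logratio, axis=0)

mean_steady = profile[:30].mean(axis=0)   # trajectory of the first group
mean_late = profile[30:60].mean(axis=0)   # trajectory of the second group
```

Clustering the rows of `profile` separates mutants by the shape of their depletion over time, the distinction an endpoint cutoff cannot make.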
Genetic programming and serial processing for time series classification.
Alfaro-Cid, Eva; Sharman, Ken; Esparcia-Alcázar, Anna I
2014-01-01
This work describes an approach devised by the authors for time series classification. In our approach, genetic programming is used in combination with a serial processing of data, where the last output is the result of the classification. The use of genetic programming for classification, although still a field where more research is needed, is not new. However, the application of genetic programming to classification tasks is normally done by considering the input data as a feature vector. That is, to the best of our knowledge, there are no examples in the genetic programming literature of approaches where the time series data are processed serially and the last output is considered as the classification result. The serial processing approach presented here fills a gap in the existing literature. This approach was tested on three different problems. Two of them are real-world problems whose data were gathered for online or conference competitions. As there are published results for these two problems, this gives us the chance to compare the performance of our approach against top-performing methods. The serial processing of data in combination with genetic programming obtained competitive results in both competitions, showing its potential for solving time series classification problems. The main advantage of our serial processing approach is that it can easily handle very large datasets.
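The serial-processing step is the distinctive part: the evolved program is applied at every time step with its output fed back in, and only the last output is read as the class label, so memory requirements do not grow with series length. The hand-written program below is an assumed stand-in for a GP-evolved expression:

```python
import numpy as np

def serial_classify(series, program):
    """Process the series element by element, feeding the program's
    output back in as state; the LAST output gives the class."""
    state = 0.0
    for value in series:
        state = program(value, state)
    return 1 if state > 0 else 0

# Hand-written stand-in for a GP-evolved expression: a leaky
# accumulator of evidence that the series level exceeds 0.5.
program = lambda x, s: 0.9 * s + (x - 0.5)

rng = np.random.default_rng(12)
low = rng.normal(0.2, 0.1, 100)     # class 0: level below the threshold
high = rng.normal(0.8, 0.1, 100)    # class 1: level above the threshold
label_low = serial_classify(low, program)
label_high = serial_classify(high, program)
```

In the GP setting, evolution searches over expressions like `program`, scoring each candidate by its classification accuracy under this serial evaluation.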
Capturing Context-Related Change in Emotional Dynamics via Fixed Moderated Time Series Analysis.
Adolf, Janne K; Voelkle, Manuel C; Brose, Annette; Schmiedek, Florian
2017-01-01
Much of recent affect research relies on intensive longitudinal studies to assess daily emotional experiences. The resulting data are analyzed with dynamic models to capture regulatory processes involved in emotional functioning. Daily contexts, however, are commonly ignored. This may not only result in biased parameter estimates and wrong conclusions, but also ignores the opportunity to investigate contextual effects on emotional dynamics. With fixed moderated time series analysis, we present an approach that resolves this problem by estimating context-dependent change in dynamic parameters in single-subject time series models. The approach examines parameter changes of known shape and thus addresses the problem of observed intra-individual heterogeneity (e.g., changes in emotional dynamics due to observed changes in daily stress). In comparison to existing approaches to unobserved heterogeneity, model estimation is facilitated and different forms of change can readily be accommodated. We demonstrate the approach's viability given relatively short time series by means of a simulation study. In addition, we present an empirical application, targeting the joint dynamics of affect and stress and how these co-vary with daily events. We discuss potentials and limitations of the approach and close with an outlook on the broader implications for understanding emotional adaption and development.
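A fixed moderated time series model of the simplest kind is an AR(1) whose autoregressive parameter shifts with an observed context variable; both parameters are then recovered jointly by regressing on the lagged value and its interaction with the context. The stress covariate and parameter values below are assumed, not the paper's empirical data:

```python
import numpy as np

rng = np.random.default_rng(8)
T = 400
stress = (np.arange(T) % 7 < 2).astype(float)   # hypothetical stress days
phi0, phi1 = 0.3, 0.4        # affect inertia rises on stress days (assumed)

affect = np.zeros(T)
for t in range(T - 1):
    phi_t = phi0 + phi1 * stress[t]             # context-moderated dynamics
    affect[t + 1] = phi_t * affect[t] + rng.normal(0.0, 1.0)

# Fixed moderated AR(1): regress x_{t+1} on x_t and stress_t * x_t, so
# the interaction coefficient recovers the context-driven change in phi.
design = np.column_stack([affect[:-1], stress[:-1] * affect[:-1]])
(phi0_hat, phi1_hat), *_ = np.linalg.lstsq(design, affect[1:], rcond=None)
```

Because the moderation has a known (here, step-function) shape, estimation stays a single regression within one subject's series, which is what keeps the approach feasible for relatively short time series.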
Sinha, Shriprakash
2017-12-04
Ever since the accidental discovery of Wingless [Sharma R.P., Drosophila information service, 1973, 50, p 134], research in the field of the Wnt signaling pathway has taken significant strides in wet lab experiments and various cancer clinical trials, augmented by recent developments in advanced computational modeling of the pathway. Information-rich gene expression profiles reveal various aspects of the signaling pathway and help in studying different issues simultaneously. Hitherto, not many computational studies exist which incorporate the simultaneous study of these issues. This manuscript ∙ explores the strength of contributing factors in the signaling pathway, ∙ analyzes the existing causal relations among the inter/extracellular factors affecting the pathway based on prior biological knowledge and ∙ investigates the deviations in fold changes in the recently found prevalence of psychophysical laws working in the pathway. To achieve this goal, local and global sensitivity analysis is conducted on the (non)linear responses between the factors obtained from static and time series expression profiles using the density (Hilbert-Schmidt Information Criterion) and variance (Sobol) based sensitivity indices. The results show the advantage of using density based indices over variance based indices mainly due to the former's employment of distance measures & the kernel trick via Reproducing kernel Hilbert space (RKHS) that capture nonlinear relations among various intra/extracellular factors of the pathway in a higher dimensional space. In time series data, using these indices it is now possible to observe where in time, which factors get influenced & contribute to the pathway, as changes in concentration of the other factors are made.
This synergy of prior biological knowledge, sensitivity analysis & representations in higher dimensional spaces can facilitate time-based administration of targeted therapeutic drugs & reveal hidden biological information within colorectal cancer samples.
Inspiring Examples in Rearrangements of Infinite Series
ERIC Educational Resources Information Center
Ramasinghe, W.
2002-01-01
Simple examples are really encouraging in the understanding of rearrangements of infinite series, since many texts and teachers provide only existence theorems. In the absence of examples, an existence theorem is just a statement and lends little confidence to understanding. Iterated sums of double series seem to have a similar spirit of…
Revising time series of the Elbe river discharge for flood frequency determination at gauge Dresden
NASA Astrophysics Data System (ADS)
Bartl, S.; Schümberg, S.; Deutsch, M.
2009-11-01
The German research programme RIsk MAnagement of eXtreme flood events has accomplished the improvement of regional hazard assessment for the large rivers in Germany. Here we focus on the Elbe river at its gauge Dresden, which is among the oldest gauges in Europe, with officially available daily discharge time series beginning on 1 January 1890. The project aimed, on the one hand, to extend and revise the existing time series and, on the other hand, to examine the variability of the Elbe river discharge conditions on a greater time scale. Therefore one major task was the historical search for and examination of the retrieved documents and the information they contain. After analysing this information, the development of the river course and the discharge conditions are discussed. Using the provided knowledge, a historical hydraulic model was established in another subproject; its results in turn were used here. A further purpose was to determine flood frequency based on all pre-processed data. The obtained knowledge about historical changes was also used to get an idea about possible future variations under climate change conditions. In particular, variations in the runoff characteristic of the Elbe river over the course of the year were analysed. We succeeded in obtaining a much longer discharge time series containing fewer errors and uncertainties. Hence an optimized regional hazard assessment was realised.
Construction of regulatory networks using expression time-series data of a genotyped population.
Yeung, Ka Yee; Dombek, Kenneth M; Lo, Kenneth; Mittler, John E; Zhu, Jun; Schadt, Eric E; Bumgarner, Roger E; Raftery, Adrian E
2011-11-29
The inference of regulatory and biochemical networks from large-scale genomics data is a basic problem in molecular biology. The goal is to generate testable hypotheses of gene-to-gene influences and subsequently to design bench experiments to confirm these network predictions. Coexpression of genes in large-scale gene-expression data implies coregulation and potential gene-gene interactions, but provides little information about the direction of influences. Here, we use both time-series data and genetics data to infer directionality of edges in regulatory networks: time-series data contain information about the chronological order of regulatory events and genetics data allow us to map DNA variations to variations at the RNA level. We generate microarray data measuring time-dependent gene-expression levels in 95 genotyped yeast segregants subjected to a drug perturbation. We develop a Bayesian model averaging regression algorithm that incorporates external information from diverse data types to infer regulatory networks from the time-series and genetics data. Our algorithm is capable of generating feedback loops. We show that our inferred network recovers existing and novel regulatory relationships. Following network construction, we generate independent microarray data on selected deletion mutants to prospectively test network predictions. We demonstrate the potential of our network to discover de novo transcription-factor binding sites. Applying our construction method to previously published data demonstrates that our method is competitive with leading network construction algorithms in the literature.
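As a toy illustration of the model-averaging idea (not the authors' algorithm, which incorporates genetics data and supports feedback loops), one can weight candidate regressor sets by BIC and read off a posterior inclusion probability for each putative regulator; all data and names here are synthetic assumptions:

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
# Toy expression data: gene y is driven by candidate regulator 0 only.
n = 200
X = rng.normal(size=(n, 3))
y = 2.0 * X[:, 0] + rng.normal(scale=0.5, size=n)

def bic(subset):
    """BIC of a least-squares regression of y on the chosen regulators."""
    A = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    rss = float(np.sum((y - A @ beta) ** 2))
    return n * np.log(rss / n) + A.shape[1] * np.log(n)

# Enumerate all regulator subsets and convert BIC scores to weights.
models = [s for r in range(4) for s in itertools.combinations(range(3), r)]
scores = np.array([bic(s) for s in models])
weights = np.exp(-0.5 * (scores - scores.min()))
weights /= weights.sum()
# Posterior inclusion probability of each candidate regulator.
incl = [sum(w for w, s in zip(weights, models) if j in s) for j in range(3)]
```

The true driver accumulates nearly all the weight, while spurious regulators are penalized by the BIC complexity term.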
NASA Technical Reports Server (NTRS)
Chao, Benjamin F.; Cox, Christopher M.; Au, Andrew Y.
2004-01-01
Recent Satellite Laser Ranging derived long wavelength gravity time series analysis has focused to a large extent on the effects of the recent large changes in the Earth's J2, and the potential causes. However, it is difficult to determine whether there are corresponding signals in the shorter wavelength zonals from the existing SLR-derived time variable gravity results, although it appears that geophysical fluid transport is being observed. For example, the recovered J3 time series shows remarkable agreement with NCEP-derived estimates of atmospheric gravity variations. Likewise, some of the non-zonal spherical harmonic coefficient series have significant interannual signal that appears to be related to mass transport. The non-zonal degree 2 terms show reasonable correlation with atmospheric signals, as well as climatic effects such as El Nino Southern Oscillation. While the formal uncertainty of these terms is significantly higher than that for J2, it is also clear that there is useful signal to be extracted. Consequently, the SLR time series is being reprocessed to improve the time variable gravity field recovery. We will present recent updates on the J2 evolution, as well as a look at other components of the interannual variations of the gravity field, complete through degree 4, and possible geophysical and climatic causes.
Towards a New Generation of Time-Series Visualization Tools in the ESA Heliophysics Science Archives
NASA Astrophysics Data System (ADS)
Perez, H.; Martinez, B.; Cook, J. P.; Herment, D.; Fernandez, M.; De Teodoro, P.; Arnaud, M.; Middleton, H. R.; Osuna, P.; Arviset, C.
2017-12-01
During the last decades a varied set of Heliophysics missions have allowed the scientific community to gain a better knowledge of the solar atmosphere and activity. The remote sensing images of missions such as SOHO have paved the ground for Helio-based spatial data visualization software such as JHelioViewer/Helioviewer. On the other hand, the huge amount of in-situ measurements provided by other missions such as Cluster provides a wide base for plot visualization software whose reach is still far from being fully exploited. The Heliophysics Science Archives within the ESAC Science Data Center (ESDC) already provide a first generation of tools for time-series visualization focusing on each mission's needs: visualization of quicklook plots, cross-calibration time series, pre-generated/on-demand multi-plot stacks (Cluster), basic plot zoom in/out options (Ulysses) and easy navigation through the plots in time (Ulysses, Cluster, ISS-Solaces). However, the needs evolve, and scientists involved in new missions require multi-variable plotting, heat-map stacks, interactive synchronization and axis-variable selection, among other improvements. The new Heliophysics archives (such as Solar Orbiter) and the evolution of existing ones (Cluster) intend to address these new challenges. This paper provides an overview of the different approaches for visualizing time-series followed within the ESA Heliophysics Archives and their foreseen evolution.
Water Column Variability in Coastal Regions
1997-09-30
Andrews, Woods, and Kester deployed a spar buoy at a central location in Narragansett Bay to obtain time-series variations at multiple depths (1, 4
NASA Astrophysics Data System (ADS)
Reyes, J. C.; Vernon, F. L.; Newman, R. L.; Steidl, J. H.
2010-12-01
The Waveform Server is an interactive web-based interface to multi-station, multi-sensor and multi-channel high-density time-series data stored in Center for Seismic Studies (CSS) 3.0 schema relational databases (Newman et al., 2009). In the last twelve months, based on expanded specifications and current user feedback, both the server-side infrastructure and client-side interface have been extensively rewritten. The Python Twisted server-side code-base has been fundamentally modified to now present waveform data stored in cluster-based databases using a multi-threaded architecture, in addition to supporting the pre-existing single database model. This allows interactive web-based access to high-density (broadband @ 40Hz to strong motion @ 200Hz) waveform data that can span multiple years, the common lifetime of broadband seismic networks. The client-side interface expands on its use of simple JSON-based AJAX queries to now incorporate a variety of User Interface (UI) improvements including standardized calendars for defining time ranges, applying on-the-fly data calibration to display SI-unit data, and increased rendering speed. This presentation will outline the various cyber infrastructure challenges we have faced while developing this application, the use-cases currently in existence, and the limitations of web-based application development.
Econophysics — complex correlations and trend switchings in financial time series
NASA Astrophysics Data System (ADS)
Preis, T.
2011-03-01
This article focuses on the analysis of financial time series and their correlations. A method is used for quantifying pattern based correlations of a time series. With this methodology, evidence is found that typical behavioral patterns of financial market participants manifest over short time scales, i.e., that reactions to given price patterns are not entirely random, but that similar price patterns also cause similar reactions. Based on the investigation of the complex correlations in financial time series, the question arises of which properties change when a positive trend switches to a negative one. An empirical quantification by rescaling provides the result that new price extrema coincide with a significant increase in transaction volume and a significant decrease in the length of corresponding time intervals between transactions. These findings are independent of the time scale over 9 orders of magnitude, and they exhibit characteristics which one can also find in other complex systems in nature (and in physical systems in particular). These properties are independent of the markets analyzed. Trends that exist only for a few seconds show the same characteristics as trends on time scales of several months. Thus, it is possible to study financial bubbles and their collapses in more detail, because trend switching processes occur with higher frequency on small time scales. In addition, a Monte Carlo based simulation of financial markets is analyzed and extended in order to reproduce empirical features and to gain insight into their causes. These causes include both financial market microstructure and the risk aversion of market participants.
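The pattern-based correlation idea can be illustrated with a toy conditional-averaging sketch: bucket each return by the sign pattern of its predecessors and compare conditional means. The data and the injected bias below are purely illustrative assumptions, not the article's method or results:

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic return series with an injected pattern dependence purely for
# illustration: after two down-moves the next return is biased upward.
n = 20000
r = rng.normal(size=n)
for t in range(2, n):
    if r[t - 1] < 0 and r[t - 2] < 0:
        r[t] += 0.3

# Pattern-conditional averaging: bucket each return by the sign pattern
# of its two predecessors and compare the conditional means.
buckets = {}
for t in range(2, n):
    key = (r[t - 2] > 0, r[t - 1] > 0)
    buckets.setdefault(key, []).append(r[t])
cond_mean = {k: float(np.mean(v)) for k, v in buckets.items()}
```

If reactions to price patterns were entirely random, all four conditional means would coincide; a systematic gap between them is exactly the kind of pattern correlation the article quantifies.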
Memory and long-range correlations in chess games
NASA Astrophysics Data System (ADS)
Schaigorodsky, Ana L.; Perotti, Juan I.; Billoni, Orlando V.
2014-01-01
In this paper we report the existence of long-range memory in the opening moves of a chronologically ordered set of chess games using an extensive chess database. We used two mapping rules to build discrete time series and analyzed them using two methods for detecting long-range correlations: rescaled range analysis and detrended fluctuation analysis. We found that long-range memory is related to the level of the players. When the database is filtered according to player levels we found differences in the persistence of the different subsets. For high level players, correlations are stronger at long time scales; whereas in intermediate and low level players they reach the maximum value at shorter time scales. This can be interpreted as a signature of the different strategies used by players with different levels of expertise. These results are robust against the mapping rules and the method employed in the analysis of the time series.
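A minimal sketch of the first method named above, rescaled range (R/S) analysis: the R/S statistic of an uncorrelated series grows roughly as the square root of the window size, so the log-log slope estimates the Hurst exponent (about 0.5 for no memory, larger under persistence). The data here are white noise, not chess move series:

```python
import numpy as np

def rescaled_range(x, window):
    """Mean R/S statistic over non-overlapping windows of a given size."""
    rs = []
    for i in range(0, len(x) - window + 1, window):
        seg = x[i:i + window]
        dev = np.cumsum(seg - seg.mean())       # cumulative deviations
        s = seg.std()
        if s > 0:
            rs.append((dev.max() - dev.min()) / s)
    return float(np.mean(rs))

# White noise: the fitted slope (Hurst exponent) should be near 0.5;
# persistent long-range memory would push it above 0.5.
rng = np.random.default_rng(3)
x = rng.normal(size=2 ** 14)
windows = np.array([2 ** k for k in range(4, 11)])
rs_vals = np.array([rescaled_range(x, w) for w in windows])
hurst = np.polyfit(np.log(windows), np.log(rs_vals), 1)[0]
```

Note that the small-window R/S statistic is known to be biased slightly upward, which is one reason studies such as this one cross-check with detrended fluctuation analysis.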
NASA Astrophysics Data System (ADS)
Sultana, Tahmina; Takagi, Hiroaki; Morimatsu, Miki; Teramoto, Hiroshi; Li, Chun-Biu; Sako, Yasushi; Komatsuzaki, Tamiki
2013-12-01
We present a novel scheme to extract a multiscale state space network (SSN) from single-molecule time series. The multiscale SSN is a type of hidden Markov model that takes into account both multiple states buried in the measurement and memory effects in the process of the observable whenever they exist. Most biological systems function in a nonstationary manner across multiple timescales. Combined with a recently established nonlinear time series analysis based on information theory, a simple scheme is proposed to deal with the properties of multiscale and nonstationarity for a discrete time series. We derived an explicit analytical expression of the autocorrelation function in terms of the SSN. To demonstrate the potential of our scheme, we investigated single-molecule time series of dissociation and association kinetics between epidermal growth factor receptor (EGFR) on the plasma membrane and its adaptor protein Ash/Grb2 (Grb2) in an in vitro reconstituted system. We found that our formula successfully reproduces their autocorrelation function for a wide range of timescales (up to 3 s), and the underlying SSNs change their topographical structure as a function of the timescale; while the corresponding SSN is simple at the short timescale (0.033-0.1 s), the SSN at the longer timescales (0.1 s to ˜3 s) becomes rather complex in order to capture multiscale nonstationary kinetics emerging at longer timescales. It is also found that visiting the unbound form of the EGFR-Grb2 system approximately resets all information of history or memory of the process.
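The paper's analytical autocorrelation expression is specific to its SSN formalism; as a generic illustration, the autocorrelation of an observable on a two-state Markov model can be written directly from the transition matrix. The states and numbers below are illustrative, not the EGFR-Grb2 system:

```python
import numpy as np

# Two-state transition matrix and an observable value per state
# (illustrative numbers only).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
v = np.array([1.0, 0.0])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

mu = pi @ v                     # stationary mean of the observable
var = pi @ (v - mu) ** 2        # stationary variance

def acf(tau):
    """Autocorrelation of the observable at lag tau, exactly from P."""
    Pt = np.linalg.matrix_power(P, tau)
    c = sum(pi[i] * (v[i] - mu) * Pt[i, j] * (v[j] - mu)
            for i in range(2) for j in range(2))
    return c / var

# For a two-state chain this decays geometrically with the second
# eigenvalue of P, here 0.9 + 0.8 - 1 = 0.7.
```

A single geometric decay is the memoryless baseline; the multiscale SSNs in the paper are needed precisely when the measured autocorrelation cannot be reproduced by one such simple chain.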
77 FR 34870 - Airworthiness Directives; Bombardier, Inc. Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-12
... Bombardier, Inc. Model CL-600-2B19 (Regional Jet Series 100 & 440) airplanes. The existing AD currently requires a one-time inspection of the shafts of the main landing gear (MLG) side-brace fittings to detect...-brace fitting and replacing the side-brace fitting shaft with the re-designed side-brace fitting shaft...
Techniques for Forecasting Air Passenger Traffic
NASA Technical Reports Server (NTRS)
Taneja, N.
1972-01-01
The basic techniques of forecasting the air passenger traffic are outlined. These techniques can be broadly classified into four categories: judgmental, time-series analysis, market analysis and analytical. The differences between these methods exist, in part, due to the degree of formalization of the forecasting procedure. Emphasis is placed on describing the analytical method.
Swetapadma, Aleena; Yadav, Anamika
2015-01-01
Many schemes are reported for shunt fault location estimation, but fault location estimation of series or open conductor faults has not been dealt with so far. The existing numerical relays only detect the open conductor (series) fault and give an indication of the faulty phase(s), but they are unable to locate the series fault. The repair crew needs to patrol the complete line to find the location of a series fault. In this paper fuzzy based fault detection/classification and location schemes in the time domain are proposed for series faults, shunt faults, and simultaneous series and shunt faults. The fault simulation studies and fault location algorithm have been developed using Matlab/Simulink. Synchronized phasors of voltage and current signals from both ends of the line have been used as input to the proposed fuzzy based fault location scheme. The percentage error in locating series faults is within 1%, and within 5% for shunt faults, for all the tested fault cases. The percentage error in location estimation is validated using the chi-square test at both the 1% and 5% levels of significance. PMID:26413088
DNAism: exploring genomic datasets on the web with Horizon Charts.
Rio Deiros, David; Gibbs, Richard A; Rogers, Jeffrey
2016-01-27
Computational biologists daily face the need to explore massive amounts of genomic data. New visualization techniques can help researchers navigate and understand these big data. Horizon Charts are a relatively new visualization method that, under the right circumstances, maximizes data density without losing graphical perception. Horizon Charts have been successfully applied to understand multi-metric time series data. We have adapted an existing JavaScript library (Cubism) that implements Horizon Charts for the time series domain so that it works effectively with genomic datasets. We call this new library DNAism. Horizon Charts can be an effective visual tool to explore complex and large genomic datasets. Researchers can use our library to leverage these techniques to extract additional insights from their own datasets.
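The band-folding trick behind Horizon Charts can be sketched in a few lines. The helper below is a hypothetical illustration, not part of Cubism or DNAism: each value's magnitude is split across stacked layers and negatives are mirrored, so a tall chart collapses into a short strip without losing resolution:

```python
def horizon_bands(values, bands=3, ceiling=None):
    """Fold a series into horizon-chart layers.

    Returns, per value, a flag marking mirrored (negative) values and
    the 0..1 fill fraction of each stacked band.
    """
    if ceiling is None:
        ceiling = max(abs(v) for v in values)
    band_height = ceiling / bands
    folded = []
    for v in values:
        mag = abs(v)
        # Each band absorbs up to band_height of the remaining magnitude.
        layers = [min(max(mag - b * band_height, 0.0), band_height) / band_height
                  for b in range(bands)]
        folded.append((v < 0, layers))
    return folded
```

Rendering then overplots the layers in increasingly saturated colors (and a second hue for the mirrored flag), which is what keeps graphical perception intact at high data density.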
Earth Observing System, Conclusions and Recommendations
NASA Technical Reports Server (NTRS)
1984-01-01
The following Earth Observing Systems (E.O.S.) recommendations were suggested: (1) a program must be initiated to ensure that present time series of Earth science data are maintained and continued. (2) A data system that provides easy, integrated, and complete access to past, present, and future data must be developed as soon as possible. (3) A long term research effort must be sustained to study and understand these time series of Earth observations. (4) The E.O.S. should be established as an information system to carry out those aspects of the above recommendations which go beyond existing and currently planned activities. (5) The scientific direction of the E.O.S. should be established and continued through an international scientific steering committee.
Tracking signal test to monitor an intelligent time series forecasting model
NASA Astrophysics Data System (ADS)
Deng, Yan; Jaraiedi, Majid; Iskander, Wafik H.
2004-03-01
Extensive research has been conducted on the subject of intelligent time series forecasting, including many variations on the use of neural networks. However, investigation of model adequacy over time, after the training process is completed, remains to be fully explored. In this paper we demonstrate how a smoothed-error tracking signal test can be incorporated into a neuro-fuzzy model to monitor the forecasting process and serve as a statistical measure for keeping the forecasting model up-to-date. The proposed monitoring procedure is effective in the detection of nonrandom changes due to model inadequacy, lack of unbiasedness in the estimation of model parameters, or deviations from the existing patterns. This powerful detection device will result in improved forecast accuracy in the long run. An example data set has been used to demonstrate the application of the proposed method.
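A minimal sketch of the classic smoothed-error tracking signal (Trigg's ratio of exponentially smoothed error to smoothed absolute error; the smoothing constant is an illustrative choice, and this is the generic statistic rather than the paper's neuro-fuzzy integration):

```python
def tracking_signal(errors, alpha=0.2):
    """Trigg's tracking signal: smoothed error over smoothed |error|."""
    smoothed_err, smoothed_abs = 0.0, 1e-9
    out = []
    for e in errors:
        smoothed_err = alpha * e + (1 - alpha) * smoothed_err
        smoothed_abs = alpha * abs(e) + (1 - alpha) * smoothed_abs
        out.append(smoothed_err / smoothed_abs)
    return out

# Unbiased forecast errors keep the signal near zero; once the errors
# become systematically positive the signal drifts toward +1.
sig = tracking_signal([1.0, -1.0] * 50 + [1.0] * 30)
```

A monitoring rule then flags the model for re-estimation whenever the signal leaves a control band around zero, which is the nonrandom-change detection the abstract describes.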
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, J.; Feng, W., E-mail: fengwen69@sina.cn
Extended time series of Solar Activity Indices (ESAI) extended the Greenwich series of sunspot area from the year 1874 back to 1821. The ESAI's yearly sunspot area in the northern and southern hemispheres from 1821 to 2013 is utilized to investigate characteristics of the north–south hemispherical asymmetry of sunspot activity. Periodical behavior of about 12 solar cycles is also confirmed from the ESAI data set to exist in dominant hemispheres, linear regression lines of yearly asymmetry values, and cumulative counts of yearly sunspot areas in the hemispheres for solar cycles. The period is also inferred to appear both in the cumulative difference in the yearly sunspot areas in the hemispheres over the entire time interval and in its statistical Student's t-test. The hemispherical bias of sunspot activity is therefore unlikely to be a stochastic phenomenon over a long time period.
Abrupt Shift in the Observed Runoff from the Southwest Greenland Ice Sheet?
NASA Astrophysics Data System (ADS)
Ahlstrom, A.; Petersen, D.; Box, J.; Langen, P. P.; Citterio, M.
2016-12-01
Mass loss of the Greenland ice sheet has contributed significantly to sea level rise in recent years and is considered a crucial parameter when estimating the impact of future climate change. Few observational records of sufficient length exist to validate surface mass balance models, especially the estimated runoff. Here we present an observational time series from 1975-2014 of discharge from a large proglacial lake, Tasersiaq, in West Greenland (66.3°N, 50.4°W) with a mainly ice-covered catchment. We argue that the discharge time series is a representative measure of ice sheet runoff, making it the only observational record of runoff to exceed the 30-year period needed to assess the climatological state of the ice sheet. We proceed to isolate the runoff part of the signal from precipitation and from identified glacial lake outburst floods originating in a small sub-catchment. Similarly, the impact of major volcanic eruptions is clearly identified. We examine the trend and annual variability in the annual discharge, relating it to likely atmospheric forcing mechanisms, and compare the observational time series with modelled runoff from the regional climate model HIRHAM.
Cabrieto, Jedelyn; Tuerlinckx, Francis; Kuppens, Peter; Hunyadi, Borbála; Ceulemans, Eva
2018-01-15
Detecting abrupt correlation changes in multivariate time series is crucial in many application fields such as signal processing, functional neuroimaging, climate studies, and financial analysis. To detect such changes, several promising correlation change tests exist, but they may suffer from severe loss of power when there is actually more than one change point underlying the data. To deal with this drawback, we propose a permutation based significance test for Kernel Change Point (KCP) detection on the running correlations. Given a requested number of change points K, KCP divides the time series into K + 1 phases by minimizing the within-phase variance. The new permutation test looks at how the average within-phase variance decreases when K increases and compares this to the results for permuted data. The results of an extensive simulation study and applications to several real data sets show that, depending on the setting, the new test performs either at par or better than the state-of-the-art significance tests for detecting the presence of correlation changes, implying that its use can be generally recommended.
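A toy sketch of the KCP principle on running correlations, with K = 1 and a brute-force search for the split minimizing within-phase variance. The window length, data, and the omission of the kernel and permutation machinery are all simplifying assumptions:

```python
import numpy as np

def running_correlation(x, y, window):
    """Pearson correlation of x and y inside a sliding window."""
    return np.array([np.corrcoef(x[i:i + window], y[i:i + window])[0, 1]
                     for i in range(len(x) - window + 1)])

def best_split(r):
    """K = 1 change point: the split minimizing within-phase variance."""
    def within(seg):
        return float(np.sum((seg - seg.mean()) ** 2))
    costs = [within(r[:k]) + within(r[k:]) for k in range(2, len(r) - 1)]
    return int(np.argmin(costs)) + 2

# Two phases: uncorrelated noise, then strongly coupled series.
rng = np.random.default_rng(4)
x = rng.normal(size=400)
y = np.concatenate([rng.normal(size=200),
                    x[200:] + 0.3 * rng.normal(size=200)])
r = running_correlation(x, y, window=25)
cp = best_split(r)   # index in the running-correlation series
```

The proposed test wraps this minimization in a permutation scheme: the drop in within-phase variance as K grows is compared against the drop obtained on permuted data, which is what guards against spurious change points.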
Inverse sequential procedures for the monitoring of time series
NASA Technical Reports Server (NTRS)
Radok, Uwe; Brown, Timothy
1993-01-01
Climate changes traditionally have been detected from long series of observations and long after they happened. The 'inverse sequential' monitoring procedure is designed to detect changes as soon as they occur. Frequency distribution parameters are estimated both from the most recent existing set of observations and from the same set augmented by 1,2,...j new observations. Individual-value probability products ('likelihoods') are then calculated which yield probabilities for erroneously accepting the existing parameter(s) as valid for the augmented data set and vice versa. A parameter change is signaled when these probabilities (or a more convenient and robust compound 'no change' probability) show a progressive decrease. New parameters are then estimated from the new observations alone to restart the procedure. The detailed algebra is developed and tested for Gaussian means and variances, Poisson and chi-square means, and linear or exponential trends; a comprehensive and interactive Fortran program is provided in the appendix.
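The likelihood comparison at the heart of the procedure can be sketched for a Gaussian mean with known variance. This is a simplified single-step illustration of the idea, not the paper's full sequential algebra:

```python
import math

def no_change_probability(existing_mean, new_obs, sigma=1.0):
    """Likelihood ratio of 'old mean still valid' vs the re-estimated mean.

    Stays near 1 while the mean is unchanged and decays toward 0 as
    observations drawn from a shifted mean accumulate.
    """
    new_mean = sum(new_obs) / len(new_obs)

    def loglik(mu):
        return sum(-0.5 * ((x - mu) / sigma) ** 2 for x in new_obs)

    return math.exp(loglik(existing_mean) - loglik(new_mean))
```

Monitoring then amounts to recomputing this ratio as each new observation arrives and signaling a change when it decreases progressively, after which the parameters are re-estimated from the new observations alone.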
A graph-based approach to detect spatiotemporal dynamics in satellite image time series
NASA Astrophysics Data System (ADS)
Guttler, Fabio; Ienco, Dino; Nin, Jordi; Teisseire, Maguelonne; Poncelet, Pascal
2017-08-01
Enhancing the frequency of satellite acquisitions represents a key issue for the Earth Observation community nowadays. Repeated observations are crucial for monitoring purposes, particularly when intra-annual processes should be taken into account. Time series of images constitute a valuable source of information in these cases. The goal of this paper is to propose a new methodological framework to automatically detect and extract spatiotemporal information from satellite image time series (SITS). Existing methods dealing with such kind of data are usually classification-oriented and cannot provide information about evolutions and temporal behaviors. In this paper we propose a graph-based strategy that combines object-based image analysis (OBIA) with data mining techniques. Image objects computed at each individual timestamp are connected across the time series and generate a set of evolution graphs. Each evolution graph is associated with a particular area within the study site and stores information about its temporal evolution. Such information can be deeply explored at the evolution graph scale or used to compare the graphs and supply a general picture at the study site scale. We validated our framework on two study sites located in the South of France and involving different types of natural, semi-natural and agricultural areas. The results obtained from a Landsat SITS support the quality of the methodological approach and illustrate how the framework can be employed to extract and characterize spatiotemporal dynamics.
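The object-linking step can be sketched as follows: image objects at consecutive timestamps are connected when they overlap spatially, and the resulting chains form evolution graphs. The pixel sets, labels, and overlap criterion below are toy assumptions, not the paper's OBIA pipeline:

```python
def build_evolution_edges(objects_per_time):
    """Link image objects at consecutive timestamps that share pixels.

    objects_per_time: list over timestamps of dicts id -> set of pixels.
    Chains of links form the evolution graph of an area.
    """
    edges = []
    for t in range(len(objects_per_time) - 1):
        for a, pixels_a in objects_per_time[t].items():
            for b, pixels_b in objects_per_time[t + 1].items():
                if pixels_a & pixels_b:        # spatial overlap
                    edges.append(((t, a), (t + 1, b)))
    return edges

objs = [
    {"A": {1, 2, 3}, "B": {10, 11}},   # timestamp 0
    {"C": {2, 3, 4}, "D": {10, 12}},   # timestamp 1
    {"E": {4, 5}},                     # timestamp 2
]
edges = build_evolution_edges(objs)
```

A real implementation would threshold the overlap fraction and attach object attributes to each node, so that splits, merges and gradual land-cover changes are readable from the graph topology.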
NASA Technical Reports Server (NTRS)
Cox, C.; Au, A.; Klosko, S.; Chao, B.; Smith, David E. (Technical Monitor)
2001-01-01
The upcoming GRACE mission promises to open a window on details of the global mass budget that will have remarkable clarity, but it will not directly answer the question of what the state of the Earth's mass budget is over the critical last quarter of the 20th century. To address that problem we must draw upon existing technologies such as SLR, DORIS, and GPS, and climate modeling runs in order to improve our understanding. Analysis of long-period geopotential changes based on SLR and DORIS tracking has shown that addition of post 1996 satellite tracking data has a significant impact on the recovered zonal rates and long-period tides. Interannual effects such as those causing the post 1996 anomalies must be better characterized before refined estimates of the decadal period changes in the geopotential can be derived from the historical database of satellite tracking. A possible cause of this anomaly is variations in ocean mass distribution, perhaps associated with the recent large El Nino/La Nina. In this study, a low-degree spherical harmonic gravity time series derived from satellite tracking is compared with a TOPEX/POSEIDON-derived sea surface height time series. Corrections for atmospheric mass effects, continental hydrology, snowfall accumulation, and ocean steric model predictions will be considered.
Ye, Yu; Kerr, William C
2011-01-01
To explore various model specifications in estimating relationships between liver cirrhosis mortality rates and per capita alcohol consumption in aggregate-level cross-section time-series data. Using a series of liver cirrhosis mortality rates from 1950 to 2002 for 47 U.S. states, the effects of alcohol consumption were estimated from pooled autoregressive integrated moving average (ARIMA) models and 4 types of panel data models: generalized estimating equation, generalized least square, fixed effect, and multilevel models. Various specifications of error term structure under each type of model were also examined. Different approaches to controlling for time trends and to using concurrent or accumulated consumption as predictors were also evaluated. When cirrhosis mortality was predicted by total alcohol, highly consistent estimates were found between ARIMA and panel data analyses, with an average overall effect of 0.07 to 0.09. Less consistent estimates were derived using spirits, beer, and wine consumption as predictors. When multiple geographic time series are combined as panel data, none of the existing models can accommodate all sources of heterogeneity, such that any type of panel model must employ some form of generalization. Different types of panel data models should thus be estimated to examine the robustness of findings. We also suggest cautious interpretation when beverage-specific volumes are used as predictors. Copyright © 2010 by the Research Society on Alcoholism.
Interpretable Categorization of Heterogeneous Time Series Data
NASA Technical Reports Server (NTRS)
Lee, Ritchie; Kochenderfer, Mykel J.; Mengshoel, Ole J.; Silbermann, Joshua
2017-01-01
We analyze data from simulated aircraft encounters to validate and inform the development of a prototype aircraft collision avoidance system. The high-dimensional and heterogeneous time series dataset is analyzed to discover properties of near mid-air collisions (NMACs) and categorize the NMAC encounters. Domain experts use these properties to better organize and understand NMAC occurrences. Existing solutions either are not capable of handling high-dimensional and heterogeneous time series datasets or do not provide explanations that are interpretable by a domain expert. The latter is critical to the acceptance and deployment of safety-critical systems. To address this gap, we propose grammar-based decision trees along with a learning algorithm. Our approach extends decision trees with a grammar framework for classifying heterogeneous time series data. A context-free grammar is used to derive decision expressions that are interpretable, application-specific, and support heterogeneous data types. In addition to classification, we show how grammar-based decision trees can also be used for categorization, which is a combination of clustering and generating interpretable explanations for each cluster. We apply grammar-based decision trees to a simulated aircraft encounter dataset and evaluate the performance of four variants of our learning algorithm. The best algorithm is used to analyze and categorize near mid-air collisions in the aircraft encounter dataset. We describe each discovered category in detail and discuss its relevance to aircraft collision avoidance.
Review of current GPS methodologies for producing accurate time series and their error sources
NASA Astrophysics Data System (ADS)
He, Xiaoxing; Montillet, Jean-Philippe; Fernandes, Rui; Bos, Machiel; Yu, Kegen; Hua, Xianghong; Jiang, Weiping
2017-05-01
The Global Positioning System (GPS) is an important tool to observe and model geodynamic processes such as plate tectonics and post-glacial rebound. In the last three decades, GPS has seen tremendous advances in the precision of the measurements, which allow researchers to study geophysical signals through a careful analysis of daily time series of GPS receiver coordinates. However, the GPS observations contain errors and the time series can be described as the sum of a real signal and noise. The signal itself can again be divided into station displacements due to geophysical causes and to disturbing factors. Examples of the latter are errors in the realization and stability of the reference frame and corrections due to ionospheric and tropospheric delays and GPS satellite orbit errors. There is an increasing demand for detecting millimeter to sub-millimeter level ground displacement signals in order to further understand regional scale geodetic phenomena, hence requiring further improvements in the sensitivity of the GPS solutions. This paper provides a review spanning over 25 years of advances in processing strategies, error mitigation methods and noise modeling for the processing and analysis of GPS daily position time series. The processing of the observations is described step-by-step and mainly with three different strategies in order to explain the weaknesses and strengths of the existing methodologies. In particular, we focus on the choice of the stochastic model in the GPS time series, which directly affects the estimation of the functional model including, for example, tectonic rates, seasonal signals and co-seismic offsets. Moreover, the geodetic community continues to develop computational methods to fully automate all phases of the analysis of GPS time series.
This idea is greatly motivated by the large number of GPS receivers installed around the world for diverse applications, ranging from surveying small deformations of civil engineering structures (e.g., subsidence of a highway bridge) to the detection of particular geophysical signals.
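The functional model mentioned in the abstract (a secular rate plus seasonal harmonics) can be sketched with ordinary least squares. The sketch below is illustrative only: it uses synthetic data with white noise, whereas real GPS position series require the colored-noise stochastic models the review discusses; the rate, amplitudes, and series length are all assumed values.

```python
import numpy as np

# Synthetic daily position series (mm): linear tectonic rate plus annual and
# semi-annual seasonal terms plus white noise. All parameter values are illustrative.
rng = np.random.default_rng(0)
t = np.arange(2000) / 365.25             # time in years, daily sampling
true_rate = 5.0                          # mm/yr
y = (true_rate * t
     + 2.0 * np.sin(2 * np.pi * t) + 1.0 * np.cos(2 * np.pi * t)
     + 0.5 * np.sin(4 * np.pi * t)
     + rng.normal(0.0, 2.0, t.size))

# Functional model: intercept, rate, annual and semi-annual harmonics
A = np.column_stack([np.ones_like(t), t,
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t),
                     np.sin(4 * np.pi * t), np.cos(4 * np.pi * t)])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
est_rate = coef[1]                       # estimated station velocity, mm/yr
```

With white noise the least-squares rate is unbiased; the review's point is that the uncertainty of `est_rate` is badly underestimated unless the stochastic model accounts for temporal correlation.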
NASA Astrophysics Data System (ADS)
Gladwin, D.; Stewart, P.; Stewart, J.
2011-02-01
This article addresses the problem of maintaining a stable rectified DC output from the three-phase AC generator in a series-hybrid vehicle powertrain. The series-hybrid prime power source generally comprises an internal combustion (IC) engine driving a three-phase permanent magnet generator whose output is rectified to DC. A recent development has been to control the engine/generator combination by an electronically actuated throttle. This system can be represented as a nonlinear system with significant time delay. Previously, voltage control of the generator output has been achieved by model predictive methods such as the Smith Predictor. These methods rely on the incorporation of an accurate system model and time delay into the control algorithm, with a consequent increase in computational complexity in the real-time controller, and as a necessity relies to some extent on the accuracy of the models. Two complementary performance objectives exist for the control system. Firstly, to maintain the IC engine at its optimal operating point, and secondly, to supply a stable DC supply to the traction drive inverters. Achievement of these goals minimises the transient energy storage requirements at the DC link, with a consequent reduction in both weight and cost. These objectives imply constant velocity operation of the IC engine under external load disturbances and changes in both operating conditions and vehicle speed set-points. In order to achieve these objectives, and reduce the complexity of implementation, in this article a controller is designed by the use of Genetic Programming methods in the Simulink modelling environment, with the aim of obtaining a relatively simple controller for the time-delay system which does not rely on the implementation of real time system models or time delay approximations in the controller. 
A methodology is presented to utilise the myriad of existing control blocks in the Simulink libraries to automatically evolve optimal control structures.
Network Analyses for Space-Time High Frequency Wind Data
NASA Astrophysics Data System (ADS)
Laib, Mohamed; Kanevski, Mikhail
2017-04-01
Recently, network science has made an important contribution to the analysis, modelling and visualization of complex time series. Numerous methods have been proposed for constructing networks. This work studies spatio-temporal wind data by using networks based on the Granger causality test. Furthermore, a visual comparison is carried out for several data frequencies and different sizes of moving window. The main attention is paid to the temporal evolution of connectivity intensity. The Hurst exponent is applied to the resulting time series in order to explore whether a long connectivity memory exists. The results explore the space-time structure of wind data and can be applied to other environmental data. The dataset used presents a challenging case study. It consists of high-frequency (10 minutes) wind data from 120 measuring stations in Switzerland, for the period 2012-2013. The distribution of stations covers different geomorphological zones and elevation levels. The results are compared with the Pearson correlation network as well.
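A network edge here corresponds to a significant pairwise Granger test between two stations. The sketch below is a generic bivariate Granger F-test implemented from scratch (not the paper's exact pipeline; moving windows, multiple-testing control, and the Hurst analysis are omitted), applied to a toy pair of series where one drives the other with a one-step delay.

```python
import numpy as np

def granger_f(x, y, p=2):
    """F-statistic: do p lags of x help predict y beyond y's own p lags?"""
    n = len(y)
    Y = y[p:]
    own = np.column_stack([y[p - k:n - k] for k in range(1, p + 1)])
    cross = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])
    ones = np.ones((n - p, 1))
    Xr = np.hstack([ones, own])            # restricted model (own lags only)
    Xf = np.hstack([ones, own, cross])     # full model (adds lags of x)
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    rss_r, rss_f = rss(Xr), rss(Xf)
    df2 = (n - p) - Xf.shape[1]
    return ((rss_r - rss_f) / p) / (rss_f / df2)

# Toy pair of "stations": b is driven by a with a one-step delay
rng = np.random.default_rng(1)
a = rng.normal(size=2000)
b = np.empty_like(a)
b[0] = 0.0
for t in range(1, len(a)):
    b[t] = 0.8 * a[t - 1] + 0.3 * rng.normal()

f_ab = granger_f(a, b)   # large: a Granger-causes b
f_ba = granger_f(b, a)   # near 1: no causality from b to a
```

A directed network is then obtained by thresholding the F-statistics (or their p-values) over all ordered station pairs.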
NASA Astrophysics Data System (ADS)
Schultz, Michael; Verbesselt, Jan; Herold, Martin; Avitabile, Valerio
2013-10-01
Researchers who use remotely sensed data can spend half of their total effort analysing prior data. If this data preprocessing does not match the application, the time spent on data analysis can increase considerably and can lead to inaccuracies. Despite the existence of a number of methods for preprocessing Landsat time series, each method has shortcomings, particularly for mapping forest changes under varying illumination, data availability and atmospheric conditions. Based on the requirements of mapping forest changes as defined by the United Nations (UN) Reducing Emissions from Forest Degradation and Deforestation (REDD) program, the accurate reporting of the spatio-temporal properties of these changes is necessary. We compared the impact of three fundamentally different radiometric preprocessing techniques: Moderate Resolution Atmospheric TRANsmission (MODTRAN), Second Simulation of a Satellite Signal in the Solar Spectrum (6S), and simple Dark Object Subtraction (DOS), on mapping forest changes using Landsat time series data. A modification of Breaks For Additive Season and Trend (BFAST) monitor was used to jointly map the spatial and temporal agreement of forest changes at test sites in Ethiopia and Viet Nam. The suitability of the preprocessing methods for the occurring forest change drivers was assessed using recently captured ground truth and high-resolution data (1000 points). A method for creating robust generic forest maps used for the sampling design is presented. An assessment of error sources has been performed, identifying haze as a major source of commission error in time series analysis.
Financing School Construction. Educational Facilities Review Series, Number 12.
ERIC Educational Resources Information Center
Piele, Philip K.
The combination of defeated bond issues and rising building costs is contributing to a decline in both the construction of new school buildings and the remodeling of existing buildings. For the first time in many years, debt service and capital outlay expenditures actually declined on a per pupil basis. No change in either voter preferences or…
Does Expanding Higher Education Reduce Income Inequality in Emerging Economy? Evidence from Pakistan
ERIC Educational Resources Information Center
Qazi, Wasim; Raza, Syed Ali; Jawaid, Syed Tehseen; Karim, Mohd Zaini Abd
2018-01-01
This study investigates the impact of development in the higher education sector, on the Income Inequality in Pakistan, by using the annual time series data from 1973 to 2012. The autoregressive distributed lag bound testing co-integration approach confirms the existence of long-run relationship between higher education and income inequality.…
Detectability of Granger causality for subsampled continuous-time neurophysiological processes.
Barnett, Lionel; Seth, Anil K
2017-01-01
Granger causality is well established within the neurosciences for inference of directed functional connectivity from neurophysiological data. These data usually consist of time series which subsample a continuous-time biophysiological process. While it is well known that subsampling can lead to imputation of spurious causal connections where none exist, less is known about the effects of subsampling on the ability to reliably detect causal connections which do exist. We present a theoretical analysis of the effects of subsampling on Granger-causal inference. Neurophysiological processes typically feature signal propagation delays on multiple time scales; accordingly, we base our analysis on a distributed-lag, continuous-time stochastic model, and consider Granger causality in continuous time at finite prediction horizons. Via exact analytical solutions, we identify relationships among sampling frequency, underlying causal time scales and detectability of causalities. We reveal complex interactions between the time scale(s) of neural signal propagation and sampling frequency. We demonstrate that detectability decays exponentially as the sample time interval increases beyond causal delay times, identify detectability "black spots" and "sweet spots", and show that downsampling may potentially improve detectability. We also demonstrate that the invariance of Granger causality under causal, invertible filtering fails at finite prediction horizons, with particular implications for inference of Granger causality from fMRI data. Our analysis emphasises that sampling rates for causal analysis of neurophysiological time series should be informed by domain-specific time scales, and that state-space modelling should be preferred to purely autoregressive modelling. 
On the basis of a very general model that captures the structure of neurophysiological processes, we are able to help identify confounds, and offer practical insights, for successful detection of causal connectivity from neurophysiological recordings. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Mélice, J. L.; Roucou, P.
The spectral characteristics of the δ18O isotopic ratio time series of the Quelccaya ice cap summit core are investigated with the multi-taper method (MTM), singular spectrum analysis (SSA) and wavelet transform (WT) techniques for the 500-year-long 1485-1984 period. The most significant (at the 99.8% level) cycle according to the MTM F-test has a period centered at 14.4 y, while the oscillation explaining the largest variance according to the SSA technique has a period centered at 12.9 y. The stability over time of these periods is investigated by performing evolutive MTM and SSA on the 500-year-long δ18O series with a 100-year-wide moving window. It is shown that the cycles with the largest amplitude and the oscillations explaining the largest variance have corresponding periods aggregated around 13.5 y and are very stable over the period between 1485 and 1984. The WT of the same isotopic time series reveals the existence of a main oscillation around 12 y which is also very stable in time. The relation between the isotopic data at Quelccaya and the annual sea surface temperature (SST) field anomalies is then evaluated for the overlapping 1919-1984 period. Significant global correlation and significant coherency at 12.1 y are found between the isotopic series and the annual global sea surface temperature (GSST) series. Moreover, the correlation between the low-frequency (over 8 y) component of the isotopic time series and the annual SST field points to significant values in the tropical North Atlantic. This region is characterized by a main SST variability at 12.8 y. The Quelccaya δ18O isotopic ratio series may therefore be considered a good recorder of the tropical North Atlantic SSTs. This may be explained by the following mechanism: the amount of water vapor evaporated by the tropical North Atlantic is a function of the SST, and so is the water vapor δ18O isotopic ratio. 
This water vapor is advected during the rainy season by northeast winds and precipitates at the Quelccaya summit with its tropical North Atlantic isotopic signature. The stability of the decadal time scale variability observed in the Quelccaya isotopic series also suggests that the decadal time scale GSST variability was stable during the last five centuries.
The Sunspot Number and beyond : reconstructing detailed solar information over centuries
NASA Astrophysics Data System (ADS)
Lefevre, L.
2014-12-01
With four centuries of solar evolution, the International Sunspot Number (SSN) forms the longest solar time series currently available. It provides an essential reference for understanding and quantifying how the solar output has varied over decades and centuries, and thus for assessing the variations of the main natural forcing on the Earth's climate. Because of its importance, this unique time series must be closely monitored for any possible biases and drifts. Here, we report on recent disagreements between solar indices, for example the Sunspot Number and the 10.7 cm radio flux. Recent analyses indicate that while part of this divergence may be due to a calibration drift in the SSN, it also results from an intrinsic change in the global magnetic parameters of sunspots and solar active regions, suggesting a possible transition to a new activity regime. Going beyond the SSN series, in the framework of the TOSCA (www.cost-tosca.eu/) and SOLID (projects.pmodwrc.ch/solid/) projects, we produced a survey of all existing catalogs providing detailed sunspot information (Lefevre & Clette, 2014: 10.1007/s11207-012-0184-5) and we also located various primary solar image and drawing collections that can be exploited to complement the existing catalogs. These are first steps towards the construction of a multi-parametric time series of multiple sunspot and sunspot-group properties over more than a century, allowing us to reconstruct and extend the current 1-D SSN series. By bringing new spatial, morphological and evolutionary information, such a data set should enable major advances in the modeling of the solar dynamo and solar irradiance. We will present here the current status of this work. The preliminary version of the catalog now extends over the last 150 years. 
It makes use of data from DPD (http://fenyi.solarobs.unideb.hu/DPD/index.html), from the Uccle Solar Equatorial Table (USET: http://sidc.oma.be/uset/) operated by the Royal Observatory of Belgium, the Greenwich Catalog (RGO: http://www.ngdc.noaa.gov/), as well as the Kodaikanal white light data.
NASA Astrophysics Data System (ADS)
Staniec, Allison; Vlahos, Penny
2017-12-01
Long-term time series represent a critical part of the oceanographic community's efforts to discern natural and anthropogenically forced variations in the environment. They provide regular measurements of climate-relevant indicators including temperature, oxygen concentrations, and salinity. When evaluating time series, it is essential to isolate long-term trends from autocorrelation in data and noise due to natural variability. Herein we apply a statistical approach, well established in atmospheric time series, to key parameters in the U.S. east coast's Long Island Sound estuary (LIS). Analysis shows that the LIS time series (established in the early 1990s) is sufficiently long to detect significant trends in physical-chemical parameters including temperature (T) and dissolved oxygen (DO). Over the last two decades, overall (combined surface and deep) LIS T has increased at an average rate of 0.08 ± 0.03 °C yr-1 while overall DO has dropped at an average rate of 0.03 ± 0.01 mg L-1yr-1 since 1994 at the 95% confidence level. This trend is notably faster than the global open ocean T trend (0.01 °C yr-1), as might be expected for a shallower estuarine system. T and DO trends were always significant for the existing time series using four-month data increments. Rates of change of DO and T in LIS are strongly correlated, and the rate of decrease of DO concentrations is consistent with the expected reduced solubility of DO at these higher temperatures. Thus, changes in T alone across decadal timescales can account for between 33 and 100% of the observed decrease in DO. This has significant implications for other dissolved gases and the long-term management of LIS hypoxia.
Can We Speculate Running Application With Server Power Consumption Trace?
Li, Yuanlong; Hu, Han; Wen, Yonggang; Zhang, Jun
2018-05-01
In this paper, we propose to detect the applications running on a server by classifying the observed power consumption series, for the purpose of data center energy consumption monitoring and analysis. The time series classification problem has been extensively studied and various distance measurements have been developed; recently, deep learning-based sequence models have also proved promising. In this paper, we propose a novel distance measurement and build a time series classification algorithm hybridizing the nearest neighbor and long short-term memory (LSTM) neural network approaches. More specifically, we first propose a new distance measurement termed local time warping (LTW), which utilizes a user-specified index set for local warping, and is designed to be noncommutative and to avoid dynamic programming. Second, we hybridize 1-nearest neighbor (1NN)-LTW and LSTM together. In particular, we combine the prediction probability vectors of 1NN-LTW and LSTM to determine the label of the test cases. Finally, using power consumption data from a real data center, we show that the proposed LTW can improve the classification accuracy of dynamic time warping (DTW) from about 84% to 90%. Our experimental results show that the proposed LTW is competitive on our data set compared with existing DTW variants and that its noncommutative feature is indeed beneficial. We also test a linear version of LTW and find that it can perform similarly to state-of-the-art DTW-based methods while running as fast as linear-runtime lower-bound methods like LB_Keogh for our problem. With the hybrid algorithm, we achieve an accuracy of up to about 93% for the power series classification task. Our research may inspire more studies on time series distance measurement and on hybrids of deep learning models with other traditional models.
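The paper's LTW measure is not reproduced here; as a reference point, the sketch below implements the standard DTW baseline it is compared against, together with the 1-nearest-neighbor classifier, on toy waveforms. The series, labels, and query are invented for illustration.

```python
import numpy as np

def dtw(a, b):
    """Classic dynamic-programming DTW distance between two 1-D series."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def knn1(train, labels, query):
    """1-nearest-neighbor classification under the DTW distance."""
    dists = [dtw(s, query) for s in train]
    return labels[int(np.argmin(dists))]

# Toy "power traces": three labeled waveforms and a phase-shifted query
t = np.linspace(0, 2 * np.pi, 50)
train = [np.sin(t), np.sin(2 * t), np.sign(np.sin(t))]
labels = ["1Hz", "2Hz", "square"]
pred = knn1(train, labels, np.sin(2 * t + 0.3))
```

DTW's warping absorbs the 0.3 rad phase shift, so the query is matched to the 2 Hz template; LTW replaces the full dynamic program with warping over a user-specified index set to cut the cost.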
Duliu, Octavian G; Varlam, Carmen; Shnawaw, Muataz Dheyaa
2018-05-16
To obtain more information on the origin of tritium and to detect any possible presence of anthropogenic sources, between January 1999 and December 2016 the precipitation level and tritium concentration were recorded monthly and investigated by the Cryogenic Institute of Ramnicu Valcea, Romania. Compared with similar data covering a radius of about 1200 km westward, the measurements gave similar results concerning the time evolution of tritium content and precipitation level for the entire time interval, except for the period between 2009 and 2011, when the tritium concentrations showed a slight increase, most probably due to the activity of a neighboring experimental pilot plant for tritium and deuterium separation. Regardless of this fact, all data pointed towards a steady tendency of tritium concentrations to decrease at an annual rate of about 1.4 ± 0.05%. The experimental data on precipitation levels and tritium concentrations form two complete time series whose analysis showed, at p < 0.01, the presence of a single one-year periodicity; its coincident maxima, corresponding to the late spring and early summer months, suggest the existence of the Spring Leak mechanism with a possible contribution of soil moisture remobilization during the warm period. Copyright © 2018 Elsevier Ltd. All rights reserved.
Permutation entropy with vector embedding delays
NASA Astrophysics Data System (ADS)
Little, Douglas J.; Kane, Deb M.
2017-12-01
Permutation entropy (PE) is a statistic used widely for the detection of structure within a time series. Embedding delay times at which the PE is reduced are characteristic timescales for which such structure exists. Here, a generalized scheme is investigated where embedding delays are represented by vectors rather than scalars, permitting PE to be calculated over a (D -1 ) -dimensional space, where D is the embedding dimension. This scheme is applied to numerically generated noise, sine wave and logistic map series, and experimental data sets taken from a vertical-cavity surface emitting laser exhibiting temporally localized pulse structures within the round-trip time of the laser cavity. Results are visualized as PE maps as a function of embedding delay, with low PE values indicating combinations of embedding delays where correlation structure is present. It is demonstrated that vector embedding delays enable identification of structure that is ambiguous or masked, when the embedding delay is constrained to scalar form.
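The vector-delay idea can be sketched directly: instead of a single scalar delay, the D samples of each embedding window are separated by a (D-1)-vector of gaps. The sketch below is a minimal normalized-PE implementation under that reading of the scheme (the paper's exact normalization and mapping conventions may differ); the test series are illustrative.

```python
import numpy as np
from math import factorial

def permutation_entropy(x, delays=(1,)):
    """Normalized permutation entropy; `delays` is the (D-1)-vector of gaps
    between the D samples of each embedding window (scalar PE: all gaps equal)."""
    x = np.asarray(x)
    offsets = np.concatenate(([0], np.cumsum(delays)))   # sample positions
    D = len(offsets)
    n = len(x) - offsets[-1]
    counts = {}
    for i in range(n):
        pattern = tuple(np.argsort(x[i + offsets]))       # ordinal pattern
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), dtype=float) / n
    return float(-(p * np.log(p)).sum() / np.log(factorial(D)))

rng = np.random.default_rng(0)
pe_noise = permutation_entropy(rng.normal(size=5000), delays=(1, 1))  # near 1
pe_ramp = permutation_entropy(np.arange(100.0), delays=(1, 1))        # exactly 0
```

A PE map as described in the abstract is then obtained by evaluating `permutation_entropy` over a grid of delay vectors (tau1, tau2) and looking for low-PE combinations.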
NASA Astrophysics Data System (ADS)
Buyadzhi, V. V.; Glushkov, A. V.; Khetselius, O. Yu; Bunyakova, Yu Ya; Florko, T. A.; Agayar, E. V.; Solyanikova, E. P.
2017-10-01
The present paper presents the results of a computational study of the dynamics of atmospheric pollutant concentrations (nitrogen dioxide, sulphur dioxide, etc.) in the atmosphere of industrial cities (Odessa) using dynamical systems and chaos theory methods. Chaotic behaviour in the nitrogen dioxide and sulphurous anhydride concentration time series at several sites of the Odessa city is numerically investigated. As usual, to reconstruct the corresponding attractor, the time delay and embedding dimension are needed. The former is determined by the methods of the autocorrelation function and average mutual information, and the latter is calculated by means of a correlation dimension method and the algorithm of false nearest neighbours. Further, the spectrum of Lyapunov exponents, the Kaplan-Yorke dimension and the Kolmogorov entropy are computed. The existence of low-dimensional chaos in the time series of the atmospheric pollutant concentrations has been found.
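The first two steps of this pipeline (choosing a delay from the autocorrelation function, then forming the delay embedding) can be sketched in a few lines. This is a minimal illustration on a clean sine wave standing in for a pollutant series, using the common 1/e autocorrelation criterion; the paper also uses average mutual information and false nearest neighbours, which are omitted here.

```python
import numpy as np

def acf_delay(x, max_lag=100):
    """First lag where the autocorrelation drops below 1/e (a common delay choice)."""
    x = x - x.mean()
    var = np.dot(x, x)
    for lag in range(1, max_lag):
        r = np.dot(x[:-lag], x[lag:]) / var
        if r < 1 / np.e:
            return lag
    return max_lag

def embed(x, dim, tau):
    """Time-delay embedding: rows are delay vectors [x(t), x(t+tau), ...]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

t = np.arange(0, 200, 0.1)
x = np.sin(t)                 # toy stand-in for a concentration series
tau = acf_delay(x)            # decorrelation-based delay
X = embed(x, dim=3, tau=tau)  # reconstructed attractor points in R^3
```

Lyapunov exponents, the Kaplan-Yorke dimension, and the Kolmogorov entropy are then estimated from neighbor dynamics in the embedded point cloud `X`.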
[The effect of tobacco prices on consumption: a time series data analysis for Mexico].
Olivera-Chávez, Rosa Itandehui; Cermeño-Bazán, Rodolfo; de Miera-Juárez, Belén Sáenz; Jiménez-Ruiz, Jorge Alberto; Reynales-Shigematsu, Luz Myriam
2010-01-01
To estimate the price elasticity of the demand for cigarettes in Mexico based on data sources and a methodology different from the ones used in previous studies on the topic. Quarterly time series of consumption, income and price for the period 1994 to 2005 were used. A long-run demand model was estimated using Ordinary Least Squares (OLS) and the existence of a cointegration relationship was investigated. Also, a model using Dynamic Ordinary Least Squares (DOLS) was estimated to correct for potential endogeneity of the independent variables and autocorrelation of the residuals. DOLS estimates showed that a 10% increase in cigarette prices could reduce consumption by 2.5% (p<0.05) and increase government revenue by 16.11%. The results confirmed the effectiveness of taxes as an instrument for tobacco control in Mexico. An increase in taxes can be used to increase cigarette prices and therefore to reduce consumption and increase government revenue.
Bussewitz, Bradly; DeVries, J George; Dujela, Michael; McAlister, Jeffrey E; Hyer, Christopher F; Berlet, Gregory C
2014-07-01
Large bone defects present a difficult task for surgeons performing single-stage, complex combined hindfoot and ankle reconstruction. Little data exist in case series format in the literature evaluating the use of frozen femoral head allograft during tibiotalocalcaneal arthrodesis in various populations. The authors evaluated 25 patients from 2003 to 2011 who required a femoral head allograft and an intramedullary nail. The average time to the final follow-up visit was 83 ± 63.6 weeks (range, 10-265). Twelve patients healed the fusion (48%). Twenty-one patients achieved a braceable limb (84%). Four patients required major amputation (16%). This series may allow surgeons to more accurately predict the success and clinical outcome of these challenging cases. Level IV, case series. © The Author(s) 2014.
On the multifractal effects generated by monofractal signals
NASA Astrophysics Data System (ADS)
Grech, Dariusz; Pamuła, Grzegorz
2013-12-01
We study quantitatively the level of false multifractal signal one may encounter while analyzing multifractal phenomena in time series within multifractal detrended fluctuation analysis (MF-DFA). The investigated effect appears as a result of the finite length of the data series used and is additionally amplified by any long-term memory the data may contain. We provide a detailed quantitative description of this apparent multifractal background signal as a threshold in the spread of generalized Hurst exponent values Δh, or a threshold in the width of the multifractal spectrum Δα, below which multifractal properties of the system are only apparent, i.e. do not exist, despite Δα≠0 or Δh≠0. We find this effect quite important for shorter or persistent series and we argue it is linear with respect to the autocorrelation exponent γ. Its strength decays as a power law with respect to the length of the time series. The influence of basic linear and nonlinear transformations applied to initial data in finite time series with various levels of long memory is also investigated. This provides an additional set of semi-analytical results. The obtained formulas are significant in any interdisciplinary application of multifractality, including physics, financial data analysis and physiology, because they allow one to separate the 'true' multifractal phenomena from apparent (artificial) multifractal effects. They should be a helpful tool of first choice in deciding whether, in a particular case, we are dealing with a signal with genuine multiscaling properties or not.
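The monofractal building block of MF-DFA can be sketched compactly: integrate the series into a profile, detrend it in windows at each scale, and read the Hurst exponent off the log-log slope of the fluctuation function. The sketch below is DFA of order 1 only (full MF-DFA generalizes the fluctuation function to q-th moments, which is where apparent multifractality enters); scales and series length are illustrative.

```python
import numpy as np

def dfa_hurst(x, scales=None):
    """DFA(1) estimate of the Hurst exponent of a 1-D series."""
    x = np.asarray(x, dtype=float)
    if scales is None:
        scales = np.unique(np.logspace(1, np.log10(len(x) // 4), 12).astype(int))
    profile = np.cumsum(x - x.mean())            # integrated profile
    F = []
    for s in scales:
        n_seg = len(profile) // s
        segs = profile[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        res = []
        for seg in segs:                          # linear detrend per window
            c = np.polyfit(t, seg, 1)
            res.append(np.mean((seg - np.polyval(c, t)) ** 2))
        F.append(np.sqrt(np.mean(res)))           # fluctuation at scale s
    slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return slope                                  # F(s) ~ s^H

rng = np.random.default_rng(3)
h_noise = dfa_hurst(rng.normal(size=4000))        # white noise: H near 0.5
```

The paper's point is precisely that for finite series the q-dependent generalization of `F` produces a nonzero spread Δh even for monofractal inputs like this one, so an empirical Δh must exceed the derived threshold before multifractality is claimed.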
Development of advanced avionics systems applicable to terminal-configured vehicles
NASA Technical Reports Server (NTRS)
Heimbold, R. L.; Lee, H. P.; Leffler, M. F.
1980-01-01
A technique to add the time constraint to the automatic descent feature of the existing L-1011 aircraft Flight Management System (FMS) was developed. Software modifications were incorporated in the FMS computer program and the results checked by lab simulation and on a series of eleven test flights. An arrival time dispersion (2 sigma) of 19 seconds was achieved. The 4 D descent technique can be integrated with the time-based metering method of air traffic control. Substantial reductions in delays at today's busy airports should result.
Time-series analysis of delta13C from tree rings. I. Time trends and autocorrelation.
Monserud, R A; Marshall, J D
2001-09-01
Univariate time-series analyses were conducted on stable carbon isotope ratios obtained from tree-ring cellulose. We looked for the presence and structure of autocorrelation. Significant autocorrelation violates the statistical independence assumption and biases hypothesis tests. Its presence would indicate the existence of lagged physiological effects that persist for longer than the current year. We analyzed data from 28 trees (60-85 years old; mean = 73 years) of western white pine (Pinus monticola Dougl.), ponderosa pine (Pinus ponderosa Laws.), and Douglas-fir (Pseudotsuga menziesii (Mirb.) Franco var. glauca) growing in northern Idaho. Material was obtained by the stem analysis method from rings laid down in the upper portion of the crown throughout each tree's life. The sampling protocol minimized variation caused by changing light regimes within each tree. Autoregressive moving average (ARMA) models were used to describe the autocorrelation structure over time. Three time series were analyzed for each tree: the stable carbon isotope ratio (delta(13)C); discrimination (delta); and the difference between ambient and internal CO(2) concentrations (c(a) - c(i)). The effect of converting from ring cellulose to whole-leaf tissue did not affect the analysis because it was almost completely removed by the detrending that precedes time-series analysis. A simple linear or quadratic model adequately described the time trend. The residuals from the trend had a constant mean and variance, thus ensuring stationarity, a requirement for autocorrelation analysis. The trend over time for c(a) - c(i) was particularly strong (R(2) = 0.29-0.84). Autoregressive moving average analyses of the residuals from these trends indicated that two-thirds of the individual tree series contained significant autocorrelation, whereas the remaining third were random (white noise) over time. 
We were unable to distinguish between individuals with and without significant autocorrelation beforehand. Significant ARMA models were all of low order, with either first- or second-order (i.e., lagged 1 or 2 years, respectively) models performing well. A simple autoregressive (AR(1)) model was the most common. The most useful generalization was that the same ARMA model holds for each of the three series (delta(13)C, delta, c(a) - c(i)) for an individual tree, if the time trend has been properly removed for each series. The mean series for the two pine species were described by first-order ARMA models (1-year lags), whereas the Douglas-fir mean series were described by second-order models (2-year lags) with negligible first-order effects. Apparently, the process of constructing a mean time series for a species preserves an underlying signal related to delta(13)C while canceling some of the random individual tree variation. Furthermore, the best model for the overall mean series (e.g., for a species) cannot be inferred from a consensus of the individual tree model forms, nor can its parameters be estimated reliably from the mean of the individual tree parameters. Because two-thirds of the individual tree time series contained significant autocorrelation, the normal assumption of a random structure over time is unwarranted, even after accounting for the time trend. The residuals of an appropriate ARMA model satisfy the independence assumption, and can be used to make hypothesis tests.
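The two-step analysis described above (remove the time trend, then estimate the autocorrelation of the residuals) can be sketched on a synthetic ring series. The series length, trend slope, and AR(1) coefficient of 0.6 below are invented for illustration; the paper fits full ARMA models rather than the simple lag-1 moment estimator used here.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 80                                  # ~80 annual rings, as in the sampled trees
years = np.arange(n, dtype=float)

# Synthetic delta13C-like series: linear time trend plus AR(1) residuals (phi = 0.6)
eps = np.empty(n)
eps[0] = rng.normal()
for t in range(1, n):
    eps[t] = 0.6 * eps[t - 1] + rng.normal()
x = -0.01 * years + eps

# Step 1: remove the time trend (linear here, as in the paper's protocol)
coef = np.polyfit(years, x, 1)
resid = x - np.polyval(coef, years)

# Step 2: estimate the AR(1) coefficient from the detrended residuals
r = resid - resid.mean()
phi_hat = np.dot(r[1:], r[:-1]) / np.dot(r[:-1], r[:-1])
```

A `phi_hat` significantly different from zero is exactly the situation the paper warns about: hypothesis tests must then be run on the whitened ARMA residuals, not on the detrended series itself.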
Wu, Wei-Sheng; Jhou, Meng-Jhun
2017-01-13
Missing value imputation is important for microarray data analyses because microarray data with missing values would significantly degrade the performance of the downstream analyses. Although many microarray missing value imputation algorithms have been developed, an objective and comprehensive performance comparison framework is still lacking. To solve this problem, we previously proposed a framework which can perform a comprehensive performance comparison of different existing algorithms. Also the performance of a new algorithm can be evaluated by our performance comparison framework. However, constructing our framework is not an easy task for the interested researchers. To save researchers' time and efforts, here we present an easy-to-use web tool named MVIAeval (Missing Value Imputation Algorithm evaluator) which implements our performance comparison framework. MVIAeval provides a user-friendly interface allowing users to upload the R code of their new algorithm and select (i) the test datasets among 20 benchmark microarray (time series and non-time series) datasets, (ii) the compared algorithms among 12 existing algorithms, (iii) the performance indices from three existing ones, (iv) the comprehensive performance scores from two possible choices, and (v) the number of simulation runs. The comprehensive performance comparison results are then generated and shown as both figures and tables. MVIAeval is a useful tool for researchers to easily conduct a comprehensive and objective performance evaluation of their newly developed missing value imputation algorithm for microarray data or any data which can be represented as a matrix form (e.g. NGS data or proteomics data). Thus, MVIAeval will greatly expedite the progress in the research of missing value imputation algorithms.
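The evaluation loop behind such a framework is simple to sketch: mask entries whose true values are known, impute them, and score each algorithm by RMSE over repeated runs. The sketch below is a generic illustration of that mask-impute-score design, not MVIAeval's actual code; the two imputers, the toy expression matrix, and all parameters are invented for illustration.

```python
import numpy as np

def col_mean_impute(x):
    """Fill NaNs with the column (sample) mean."""
    x = x.copy()
    mu = np.nanmean(x, axis=0)
    r, c = np.where(np.isnan(x))
    x[r, c] = mu[c]
    return x

def row_mean_impute(x):
    """Fill NaNs with the row (gene) mean."""
    x = x.copy()
    mu = np.nanmean(x, axis=1)
    r, c = np.where(np.isnan(x))
    x[r, c] = mu[r]
    return x

def evaluate_imputers(data, imputers, missing_rate=0.1, runs=5, seed=0):
    """Mask known entries at random, impute, and score each algorithm by RMSE."""
    rng = np.random.default_rng(seed)
    scores = {name: [] for name in imputers}
    for _ in range(runs):
        mask = rng.random(data.shape) < missing_rate
        corrupted = data.copy()
        corrupted[mask] = np.nan
        for name, impute in imputers.items():
            filled = impute(corrupted)
            scores[name].append(np.sqrt(np.mean((filled[mask] - data[mask]) ** 2)))
    return {name: float(np.mean(v)) for name, v in scores.items()}

# Toy "expression matrix": each gene (row) has a level shared across 20 samples
rng = np.random.default_rng(1)
genes = rng.normal(size=(100, 1)) * np.ones((1, 20)) + rng.normal(0, 0.1, (100, 20))
results = evaluate_imputers(genes, {"row_mean": row_mean_impute,
                                    "col_mean": col_mean_impute})
```

Because each row carries a shared gene-level signal, the row-mean imputer scores a much lower RMSE than the column-mean imputer here, which is the kind of ranking such a framework is designed to surface.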
On fractality and chaos in Moroccan family business stock returns and volatility
NASA Astrophysics Data System (ADS)
Lahmiri, Salim
2017-05-01
The purpose of this study is to examine the existence of fractality and chaos in the returns and volatilities of family business companies listed on the Casablanca Stock Exchange (CSE) in Morocco, and also in the returns and volatility of the CSE market index. The detrended fluctuation analysis-based Hurst exponent and the fractionally integrated generalized autoregressive conditional heteroskedasticity (FIGARCH) model are used to quantify fractality in the return and volatility time series, respectively. In addition, the largest Lyapunov exponent is employed to quantify chaos in both time series. The empirical results from sixteen family business companies follow. For the return series, fractality analysis shows that most family business returns listed on the CSE exhibit anti-persistent dynamics, whilst market returns have persistent dynamics. Besides, chaos tests show that family business stock returns are not chaotic while market returns exhibit evidence of chaotic behaviour. For the volatility series, fractality analysis shows that most family business stocks and the market index exhibit long memory in volatility. Furthermore, results from chaos tests show that the volatility of family business returns is not chaotic, whilst the volatility of the market index is chaotic. These results may help in understanding irregular patterns in Moroccan family business stock returns and volatility, and how they differ from market dynamics.
Applications and development of new algorithms for displacement analysis using InSAR time series
NASA Astrophysics Data System (ADS)
Osmanoglu, Batuhan
Time series analysis of Synthetic Aperture Radar Interferometry (InSAR) data has become an important scientific tool for monitoring and measuring the displacement of Earth's surface due to a wide range of phenomena, including earthquakes, volcanoes, landslides, changes in ground water levels, and wetlands. Time series analysis is a product of interferometric phase measurements, which become ambiguous when the observed motion is larger than half of the radar wavelength. Thus, phase observations must first be unwrapped in order to obtain physically meaningful results. Persistent Scatterer Interferometry (PSI), Stanford Method for Persistent Scatterers (StaMPS), Short Baselines Interferometry (SBAS) and Small Temporal Baseline Subset (STBAS) algorithms solve for this ambiguity using a series of spatio-temporal unwrapping algorithms and filters. In this dissertation, I improve upon current phase unwrapping algorithms, and apply the PSI method to study subsidence in Mexico City. PSI was used to obtain unwrapped deformation rates in Mexico City (Chapter 3), where ground water withdrawal in excess of natural recharge causes subsurface, clay-rich sediments to compact. This study is based on 23 satellite SAR scenes acquired between January 2004 and July 2006. Time series analysis of the data reveals a maximum line-of-sight subsidence rate of 300 mm/yr at a high enough resolution that individual subsidence rates for large buildings can be determined. Differential motion and related structural damage along an elevated metro rail were evident from the results. Comparison of PSI subsidence rates with data from permanent GPS stations indicates a root mean square (RMS) agreement of 6.9 mm/yr, about the level expected based on joint data uncertainty. The Mexico City results suggest negligible recharge, implying continuing degradation and loss of the aquifer in the third largest metropolitan area in the world. 
Chapters 4 and 5 illustrate the link between time series analysis and three-dimensional (3-D) phase unwrapping. Chapter 4 focuses on the unwrapping path. Unwrapping algorithms can be divided into two groups, path-dependent and path-independent algorithms. Path-dependent algorithms use local unwrapping functions applied pixel-by-pixel to the dataset. In contrast, path-independent algorithms use global optimization methods such as least squares, and return a unique solution. However, when aliasing and noise are present, path-independent algorithms can underestimate the signal in some areas due to global fitting criteria. Path-dependent algorithms do not underestimate the signal, but, as the name implies, the unwrapping path can affect the result. Comparison between existing path algorithms and a newly developed algorithm based on Fisher information theory was conducted. Results indicate that Fisher information theory does indeed produce lower misfit results for most tested cases. Chapter 5 presents a new time series analysis method based on 3-D unwrapping of SAR data using extended Kalman filters. Existing methods for time series generation using InSAR data employ special filters to combine two-dimensional (2-D) spatial unwrapping with one-dimensional (1-D) temporal unwrapping results. The new method, however, combines observations in azimuth, range and time for repeat pass interferometry. Due to the pixel-by-pixel characteristic of the filter, the unwrapping path is selected based on a quality map. This unwrapping algorithm is the first application of extended Kalman filters to the 3-D unwrapping problem. Time series analyses of InSAR data are used in a variety of applications with different characteristics. Consequently, it is difficult to develop a single algorithm that can provide optimal results in all cases, given that different algorithms possess a unique set of strengths and weaknesses. 
Nonetheless, filter-based unwrapping algorithms such as the one presented in this dissertation have the capability of joining multiple observations into a uniform solution, which is becoming an important feature with continuously growing datasets.
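The 2π ambiguity at the heart of these unwrapping algorithms is easy to demonstrate in one dimension. A sketch of the standard path-dependent rule (integrate phase differences, correcting any sample-to-sample jump larger than π), using NumPy's `unwrap`; the 3-D Kalman-filter approach developed in the dissertation is far more involved:

```python
import numpy as np

# A true phase ramp spanning three wavelengths: the wrapped phase is ambiguous.
t = np.linspace(0.0, 6.0 * np.pi, 200)
wrapped = np.angle(np.exp(1j * t))     # folded into (-pi, pi]

# Path-dependent 1-D unwrapping: add +/- 2*pi wherever a sample-to-sample
# jump exceeds pi, accumulating the corrections along the path.
unwrapped = np.unwrap(wrapped)
```

This succeeds only because the sampled phase changes by less than π between neighbouring samples; aliasing (faster motion) breaks exactly this assumption, which is why unwrapping path and filtering matter in the 2-D and 3-D cases.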
Eye Movement Control during Reading: II. Frequency of Refixating a Word. Technical Report No. 469.
ERIC Educational Resources Information Center
McConkie, G. W.; And Others
As part of a series of studies describing the oculomotor behavior of skilled readers, a study investigated whether a word refixation curve exists. Subjects, 66 college students fixating over 40,000 times, read lines of text from a computer screen and were instructed to read for meaning without regard to errors. Results of eye movement control…
ERIC Educational Resources Information Center
Ashkenas, Ron; And Others
The process of reexamining and reinventing a company demands a new organizational theory and, at the same time, a critical evaluation of the limits of existing theory. This book argues for change strategies that are aimed at creating more permeable boundaries within organizations. Four major sections focus on one of the four types of…
Virtual Schools in the U.S. 2014: Politics, Performance, Policy, and Research Evidence
ERIC Educational Resources Information Center
Huerta, Luis; Rice, Jennifer King; Shafer, Sheryl Rankin; Barbour, Michael K.; Miron, Gary; Gulosino, Charisse; Horvitz, Brian
2014-01-01
This report is the second of a series of annual reports by the National Education Policy Center (NEPC) on virtual education in the U.S. The NEPC reports contribute to the existing evidence and discourse on virtual education by providing an objective analysis of the evolution and performance of full-time, publicly funded K-12 virtual schools. This…
L.N. Hudson; T. Newbold; S. Contu
2014-01-01
Biodiversity continues to decline in the face of increasing anthropogenic pressures such as habitat destruction, exploitation, pollution and introduction of alien species. Existing global databases of species' threat status or population time series are dominated by charismatic species. The collation of datasets with broad taxonomic and biogeographic extents, and that...
ERIC Educational Resources Information Center
Haddad, Caroline, Ed.; Rennie, Luisa, Ed.
2005-01-01
Although many excellent materials now exist that detail the full range of potential uses of Information Communication Technologies (ICTs) in education, already overworked policy makers and others often lack the time it takes to surf the Internet, or access libraries and other sources of information on their own in search of ideas and material…
Foreman, Brady Z; Straub, Kyle M
2017-09-01
Terrestrial paleoclimate records rely on proxies hosted in alluvial strata whose beds are deposited by unsteady and nonlinear geomorphic processes. It is broadly assumed that this renders the resultant time series of terrestrial paleoclimatic variability noisy and incomplete. We evaluate this assumption using a model of oscillating climate and the precise topographic evolution of an experimental alluvial system. We find that geomorphic stochasticity can create aliasing in the time series and spurious climate signals, but these issues are eliminated when the period of climate oscillation is longer than a key time scale of internal dynamics in the geomorphic system. This emergent autogenic geomorphic behavior imparts regularity to deposition and represents a natural discretization interval of the continuous climate signal. We propose that this time scale in nature could be in excess of 10^4 years but would still allow assessments of the rates of climate change at resolutions finer than the existing age model techniques in isolation.
Weber, Juliane; Zachow, Christopher; Witthaut, Dirk
2018-03-01
Wind power generation exhibits a strong temporal variability, which is crucial for system integration in highly renewable power systems. Different methods exist to simulate wind power generation but they often cannot represent the crucial temporal fluctuations properly. We apply the concept of additive binary Markov chains to model a wind generation time series consisting of two states: periods of high and low wind generation. The only input parameter for this model is the empirical autocorrelation function. The two-state model is readily extended to stochastically reproduce the actual generation per period. To evaluate the additive binary Markov chain method, we introduce a coarse model of the electric power system to derive backup and storage needs. We find that the temporal correlations of wind power generation, the backup need as a function of the storage capacity, and the resting time distribution of high and low wind events for different shares of wind generation can be reconstructed.
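The two-state generation model can be sketched with an ordinary first-order Markov chain; note that the paper's additive binary Markov chains additionally encode long memory through the empirical autocorrelation function, which this simple baseline does not capture. The transition probabilities below are illustrative:

```python
import numpy as np

def simulate_two_state(p_stay_high, p_stay_low, n, seed=0):
    """Simulate a first-order two-state chain: 1 = high wind, 0 = low wind."""
    rng = np.random.default_rng(seed)
    x = np.empty(n, dtype=int)
    x[0] = 1
    for t in range(1, n):
        p_stay = p_stay_high if x[t - 1] == 1 else p_stay_low
        x[t] = x[t - 1] if rng.random() < p_stay else 1 - x[t - 1]
    return x

def run_lengths(x):
    """Durations of consecutive high/low periods (resting times)."""
    changes = np.flatnonzero(np.diff(x) != 0)
    edges = np.concatenate(([0], changes + 1, [len(x)]))
    return np.diff(edges)
```

With symmetric persistence 0.9 the mean resting time is about 10 steps; matching the full resting-time distribution of real wind data is precisely what requires the longer memory of the additive chain.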
Fractal dimension and nonlinear dynamical processes
NASA Astrophysics Data System (ADS)
McCarty, Robert C.; Lindley, John P.
1993-11-01
Mandelbrot, Falconer and others have demonstrated the existence of dimensionally invariant geometrical properties of non-linear dynamical processes known as fractals. Barnsley defines fractal geometry as an extension of classical geometry. Such an extension, however, is not mathematically trivial. Of specific interest to those engaged in signal processing is the potential use of fractal geometry to facilitate the analysis of non-linear signal processes often referred to as non-linear time series. Fractal geometry has been used in the modeling of non-linear time series represented by radar signals in the presence of ground clutter or interference generated by spatially distributed reflections around the target or a radar system. It was recognized by Mandelbrot that the fractal geometries represented by man-made objects had different dimensions than the geometries of the familiar objects that abound in nature such as leaves, clouds, ferns, trees, etc. The invariant dimensional property of non-linear processes suggests that in the case of acoustic signals (active or passive) generated within a dispersive medium such as the ocean environment, there exists much rich structure that will aid in the detection and classification of various objects, man-made or natural, within the medium.
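The dimensional invariant referred to above can be estimated directly with box counting. A sketch for a 2-D point cloud, assuming points in the unit square and dyadic box sizes (a generic estimator, not tied to the radar application):

```python
import numpy as np

def box_count(points, eps):
    """Number of eps-sized grid boxes occupied by the point set."""
    cells = set(map(tuple, np.floor(points / eps).astype(int)))
    return len(cells)

def box_dimension(points, eps_list=(0.25, 0.125, 0.0625, 0.03125)):
    """Slope of log N(eps) versus log(1/eps): the box-counting dimension."""
    counts = [box_count(points, eps) for eps in eps_list]
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(eps_list)), np.log(counts), 1)
    return slope
```

A straight line yields a dimension near 1, a filled square near 2, and fractal sets fall in between; this non-integer value is the signature used to discriminate man-made from natural scatterers.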
A framework for periodic outlier pattern detection in time-series sequences.
Rasheed, Faraz; Alhajj, Reda
2014-05-01
Periodic pattern detection in time-ordered sequences is an important data mining task, which discovers in the time series all patterns that exhibit temporal regularities. Periodic pattern mining has a large number of applications in real life; it helps understanding the regular trend of the data along time, and enables the forecast and prediction of future events. An interesting related and vital problem that has not received enough attention is to discover outlier periodic patterns in a time series. Outlier patterns are defined as those which are different from the rest of the patterns; outliers are not noise. While noise does not belong to the data and it is mostly eliminated by preprocessing, outliers are actual instances in the data but have exceptional characteristics compared with the majority of the other instances. Outliers are unusual patterns that rarely occur, and, thus, have lesser support (frequency of appearance) in the data. Outlier patterns may hint toward discrepancy in the data such as fraudulent transactions, network intrusion, change in customer behavior, recession in the economy, epidemic and disease biomarkers, severe weather conditions like tornados, etc. We argue that detecting the periodicity of outlier patterns might be more important in many sequences than the periodicity of regular, more frequent patterns. In this paper, we present a robust and time efficient suffix tree-based algorithm capable of detecting the periodicity of outlier patterns in a time series by giving more significance to less frequent yet periodic patterns. Several experiments have been conducted using both real and synthetic data; all aspects of the proposed approach are compared with the existing algorithm InfoMiner; the reported results demonstrate the effectiveness and applicability of the proposed approach.
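The notion of a rare-but-periodic pattern can be made concrete with a simple periodicity-confidence score; the paper's suffix-tree algorithm generalizes this to all patterns, periods, and phases at once. A toy sketch for single-symbol patterns (names are illustrative):

```python
def periodicity_confidence(seq, symbol, period, phase):
    """Fraction of expected periodic positions actually occupied by `symbol`."""
    positions = range(phase, len(seq), period)
    hits = sum(1 for i in positions if seq[i] == symbol)
    return hits / len(positions)

# 'X' is rare (support 25%) yet perfectly periodic with period 4, phase 1.
seq = "aXbb" * 6
```

A frequency-only miner would rank 'b' above 'X'; scoring by periodicity confidence instead surfaces the low-support outlier pattern.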
GPS Imaging of Time-Dependent Seasonal Strain in Central California
NASA Astrophysics Data System (ADS)
Kraner, M.; Hammond, W. C.; Kreemer, C.; Borsa, A. A.; Blewitt, G.
2016-12-01
Recent studies suggest that crustal deformation can be time-dependent and nontectonic. Continuous global positioning system (cGPS) measurements are now showing how steady long-term deformation can be influenced by factors such as fluctuations in loading and temperature variations. Here we model the seasonal time-dependent dilatational and shear strain in Central California, specifically surrounding the Parkfield region, and try to uncover the sources of these deformation patterns. We use 8 years of cGPS data (2008-2016) processed by the Nevada Geodetic Laboratory and carefully select the cGPS stations for our analysis based on the vertical position of the cGPS time series during the drought period. In building our strain model, we first detrend the selected station time series using a set of velocities from the robust MIDAS trend estimator. This estimation algorithm is insensitive to common problems such as step discontinuities, outliers, and seasonality. We use these detrended time series to estimate the median cGPS positions for each month of the 8-year period and filter displacement differences between these monthly median positions using a technique called "GPS Imaging." This technique improves the overall robustness and spatial resolution of the input displacements for the strain model. We then model the dilatational and shear strain field for each month of the time series. We also test a variety of a priori constraints, which control the style of faulting within the strain model. Upon examining our strain maps, we find that a seasonal strain signal exists in Central California. We investigate how this signal compares to thermoelastic, hydrologic, and atmospheric loading models during the 8-year period. We additionally determine whether the drought played a role in influencing the seasonal signal.
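The MIDAS idea referenced above, the median of slopes between samples separated by exactly one year, which cancels seasonality and resists steps and outliers, can be sketched for a complete daily series; the real MIDAS estimator also handles data gaps and documented steps, which this sketch omits:

```python
import numpy as np

def midas_like_rate(times, values, steps_per_year=365):
    """Median of slopes over pairs exactly one year apart (MIDAS-style)."""
    dt = times[steps_per_year:] - times[:-steps_per_year]
    dv = values[steps_per_year:] - values[:-steps_per_year]
    return np.median(dv / dt)
```

Because every pair spans exactly one annual cycle, a sinusoidal seasonal term cancels pair by pair, and the median ignores the minority of pairs that straddle a step discontinuity.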
NASA Technical Reports Server (NTRS)
Gao, Feng; DeColstoun, Eric Brown; Ma, Ronghua; Weng, Qihao; Masek, Jeffrey G.; Chen, Jin; Pan, Yaozhong; Song, Conghe
2012-01-01
Cities have been expanding rapidly worldwide, especially over the past few decades. Mapping the dynamic expansion of impervious surface in both space and time is essential for an improved understanding of the urbanization process, land-cover and land-use change, and their impacts on the environment. Landsat and other medium-resolution satellites provide the necessary spatial details and temporal frequency for mapping impervious surface expansion over the past four decades. Since the US Geological Survey opened the historical record of the Landsat image archive for free access in 2008, the decades-old bottleneck of data limitation is gone. Remote-sensing scientists are now rich with data, and the challenge is how to make the best use of this precious resource. In this article, we develop an efficient algorithm to map the continuous expansion of impervious surface using a time series of four decades of medium-resolution satellite images. The algorithm is based on a supervised classification of the time-series image stack using a decision tree. Each impervious class represents urbanization starting in a different image. The algorithm also allows us to remove inconsistent training samples, because impervious expansion is not reversible during the study period. The objective is to extract a time series of complete and consistent impervious surface maps from a corresponding time series of images collected from multiple sensors, and with a minimal amount of image preprocessing effort. The approach was tested in the lower Yangtze River Delta region, one of the fastest urban growth areas in China. Results from nearly four decades of medium-resolution satellite data from the Landsat Multispectral Scanner (MSS), Thematic Mapper (TM), Enhanced Thematic Mapper plus (ETM+) and China-Brazil Earth Resources Satellite (CBERS) reveal an urbanization process that is consistent with economic development plans and policies. 
The time-series impervious spatial extent maps derived from this study agree well with an existing urban extent polygon data set that was previously developed independently. The overall mapping accuracy was estimated at about 92.5% with 3% commission error and 12% omission error for the impervious type from all images regardless of image quality and initial spatial resolution.
Sector Identification in a Set of Stock Return Time Series Traded at the London Stock Exchange
NASA Astrophysics Data System (ADS)
Coronnello, C.; Tumminello, M.; Lillo, F.; Micciche, S.; Mantegna, R. N.
2005-09-01
We compare some methods recently used in the literature to detect the existence of a certain degree of common behavior among stock returns belonging to the same economic sector. Specifically, we discuss methods based on random matrix theory and hierarchical clustering techniques. We apply these methods to a portfolio of stocks traded at the London Stock Exchange. The investigated time series are recorded both at a daily time horizon and at a 5-minute time horizon. The correlation coefficient matrix is very different at different time horizons, confirming that more structured correlation coefficient matrices are observed for long time horizons. All the considered methods are able to detect economic information and the presence of clusters characterized by the economic sector of the stocks. However, different methods present a different degree of sensitivity with respect to different sectors. Our comparative analysis suggests that the application of just a single method may not be able to extract all the economic information present in the correlation coefficient matrix of a stock portfolio.
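A minimal version of the hierarchical-clustering approach: convert the correlation matrix to Mantegna's distance d_ij = sqrt(2(1 - rho_ij)) and cluster. The two-factor toy portfolio below (NumPy/SciPy) stands in for the LSE sector data, which is not reproduced here:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

rng = np.random.default_rng(1)
# two synthetic "sectors": three stocks each, sharing a common sector factor
factor_a, factor_b = rng.normal(size=(2, 500))
returns = np.empty((6, 500))
for k in range(3):
    returns[k] = factor_a + 0.5 * rng.normal(size=500)
    returns[3 + k] = factor_b + 0.5 * rng.normal(size=500)

corr = np.corrcoef(returns)
# Mantegna's metric; clip guards against tiny negative values from rounding
dist = np.sqrt(np.clip(2.0 * (1.0 - corr), 0.0, None))
np.fill_diagonal(dist, 0.0)
Z = linkage(squareform(dist, checks=False), method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
```

On real data the recovered clusters are then compared against the known sector classification, which is how the sensitivity of each method is judged.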
NASA Astrophysics Data System (ADS)
Massah, Mozhdeh; Kantz, Holger
2016-04-01
As we have one and only one Earth and no replicas, climate characteristics are usually computed as time averages from a single time series. For understanding climate variability, it is essential to understand how close a single time average will typically be to an ensemble average. To answer this question, we study large deviation probabilities (LDP) of stochastic processes and characterize them by their dependence on the time window. In contrast to iid variables, for which there exists an analytical expression for the rate function, correlated variables such as auto-regressive (short memory) and auto-regressive fractionally integrated moving average (long memory) processes do not admit an analytical LDP. We study the LDP for these processes in order to see how correlation affects this probability in comparison to iid data. Although short-range correlations lead to a simple correction of the sample size, long-range correlations lead to a sub-exponential decay of the LDP and hence to a very slow convergence of time averages. This effect is demonstrated for a 120-year-long time series of daily temperature anomalies measured in Potsdam (Germany).
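The slower convergence of time averages under correlation can be checked by brute force. A Monte Carlo sketch for a stationary AR(1) (short-memory) process with unit marginal variance; the long-memory ARFIMA case would replace the recursion:

```python
import numpy as np

def tail_prob_time_average(phi, n, eps, trials=2000, seed=0):
    """Estimate P(|time average over n steps| > eps) for a stationary AR(1)."""
    rng = np.random.default_rng(seed)
    sigma = np.sqrt(1.0 - phi ** 2)        # keeps the marginal variance at 1
    state = rng.normal(size=trials)        # start in the stationary distribution
    total = np.zeros(trials)
    for _ in range(n):
        total += state
        state = phi * state + sigma * rng.normal(size=trials)
    return np.mean(np.abs(total / n) > eps)
```

With the same marginal variance, the correlated process produces far more large deviations of the time average than the iid case, which is the effect quantified by the LDP analysis above.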
Causal strength induction from time series data.
Soo, Kevin W; Rottman, Benjamin M
2018-04-01
One challenge when inferring the strength of cause-effect relations from time series data is that the cause and/or effect can exhibit temporal trends. If temporal trends are not accounted for, a learner could infer that a causal relation exists when it does not, or even infer that there is a positive causal relation when the relation is negative, or vice versa. We propose that learners use a simple heuristic to control for temporal trends: they focus not on the states of the cause and effect at a given instant, but on how the cause and effect change from one observation to the next, which we call transitions. Six experiments were conducted to understand how people infer causal strength from time series data. We found that participants indeed use transitions in addition to states, which helps them to reach more accurate causal judgments (Experiments 1A and 1B). Participants use transitions more when the stimuli are presented in a naturalistic visual format than a numerical format (Experiment 2), and the effect of transitions is not driven by primacy or recency effects (Experiment 3). Finally, we found that participants primarily use the direction in which variables change rather than the magnitude of the change for estimating causal strength (Experiments 4 and 5). Collectively, these studies provide evidence that people often use a simple yet effective heuristic for inferring causal strength from time series data. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
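The states-versus-transitions contrast is easy to reproduce numerically: two trending but causally unrelated variables look strongly related in their states, while their transitions (first differences) are nearly uncorrelated. A sketch with illustrative variable names:

```python
import numpy as np

def state_strength(cause, effect):
    """Correlation of raw states: fooled by a shared temporal trend."""
    return np.corrcoef(cause, effect)[0, 1]

def transition_strength(cause, effect):
    """Correlation of changes between consecutive observations."""
    return np.corrcoef(np.diff(cause), np.diff(effect))[0, 1]

rng = np.random.default_rng(0)
trend = 0.05 * np.arange(200)
cause = trend + rng.normal(size=200)    # no causal link, shared trend only
effect = trend + rng.normal(size=200)
```

Differencing removes the common trend, so the transition-based estimate stays near zero where the state-based estimate spuriously suggests a strong relation.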
The influence of biomass energy consumption on CO2 emissions: a wavelet coherence approach.
Bilgili, Faik; Öztürk, İlhan; Koçak, Emrah; Bulut, Ümit; Pamuk, Yalçın; Muğaloğlu, Erhan; Bağlıtaş, Hayriye H
2016-10-01
Observations from the energy literature suggest that (i) one of the main contributors to global warming is carbon dioxide emissions, (ii) fossil fuel energy usage contributes greatly to carbon dioxide emissions, and (iii) simulations from energy models draw the attention of policy makers to renewable energy as an alternative energy source to mitigate carbon dioxide emissions. Although there is intensive work on renewable energy in the related literature regarding renewables' efficiency and impact on environmental quality, a researcher might still need further studies to assess the significance of renewables for the environment since (i) the existing seminal papers employ time series models and/or panel data models or other statistical tools to detect the role of renewables in the environment, and (ii) existing papers mostly consider aggregated renewable energy sources rather than examining the major components of aggregated renewables. This paper examines in detail the impact of biomass on carbon dioxide emissions through time series and frequency analyses; hence, the paper follows wavelet coherence analyses. The data cover US monthly observations ranging from 1984:1 to 2015 for the variables of total energy carbon dioxide emissions, biomass energy consumption, coal consumption, petroleum consumption, and natural gas consumption. Through wavelet coherence and wavelet partial coherence analyses, the paper observes frequency properties as well as time series properties of the relevant variables to reveal the possible significant influence of biomass usage on emissions in the USA in both short-term and long-term cycles. Finally, the paper reveals that biomass consumption mitigates CO2 emissions in long-run cycles after the year 2005 in the USA.
Trend analysis for daily rainfall series of Barcelona
NASA Astrophysics Data System (ADS)
Ortego, M. I.; Gibergans-Báguena, J.; Tolosana-Delgado, R.; Egozcue, J. J.; Llasat, M. C.
2009-09-01
Frequency analysis of hydrological series is a key point to acquire an in-depth understanding of the behaviour of hydrologic events. The occurrence of extreme hydrologic events in an area may imply great social and economical impacts. A good understanding of hazardous events improves the planning of human activities. A useful model for hazard assessment of extreme hydrologic events in an area is the point-over-threshold (POT) model. Time-occurrence of events is assumed to be Poisson distributed, and the magnitude X of each event is modeled as an arbitrary random variable, whose excesses over the threshold x0, Y = X - x0 given X > x0, have a Generalized Pareto Distribution (GPD), F_Y(y | β, ξ) = 1 - (1 + ξ y / β)^(-1/ξ), for 0 ≤ y < y_sup, where y_sup = +∞ if ξ ≥ 0, and y_sup = -β/ξ if ξ < 0. The limiting distribution for ξ = 0 is an exponential one. Independence between this magnitude and occurrence in time is assumed, as well as independence from event to event. In order to take account of the uncertainty of the estimation of the GPD parameters, a Bayesian approach is chosen. This approach allows us to include necessary conditions on the parameters of the distribution for our particular phenomena, as well as to propagate adequately the uncertainty of estimations to the hazard parameters, such as return periods. A common concern is to know whether magnitudes of hazardous events have changed in the last decades. Long data series are much appreciated in order to properly study these issues. The series of daily rainfall in Barcelona (1854-2006) has been selected. This is one of the longest European daily rainfall series available. Daily rainfall is better described using a relative scale and therefore it is suitably treated in a log-scale. Accordingly, log-precipitation is identified with X. Excesses over a threshold are modeled by a GPD with a limited maximum value. An additional assumption is that the distribution of the excesses Y has a limited upper tail and, therefore, ξ < 0 and y_sup = -β/ξ. 
Such a long data series provides valuable information about the phenomena at hand, and therefore a first step is to check its reliability. The first part of the work focuses on the possible existence of abrupt changes in the parameters of the GPD. These abrupt changes may be due to changes in the location of the observatories and/or technological advances introduced in the measuring instruments. The second part of the work examines the possible existence of trends. The parameters of the model are considered as a function of time. A new parameterisation of the GPD distribution is suggested in order to parsimoniously deal with this climate variation: the classical scale and shape parameters of the GPD (β, ξ) are reformulated as a location parameter μ, linked to the upper limit of the distribution, and a shape parameter ν. In this reparameterisation, the parsimonious choice is to consider the shape as a linear function of time, ν(t) = ν0 + ν1 t, while keeping the location fixed, μ(t) = μ0. Then, climate change is assessed by checking the hypothesis ν1 = 0. Results show no significant abrupt changes in the excesses distribution of the Barcelona daily rainfall series, but suggest a significant change of the parameters, and therefore the existence of a trend in daily rainfall for this period.
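The POT machinery described here maps directly onto `scipy.stats.genpareto`, whose shape parameter `c` follows the same sign convention as ξ (support bounded above when `c` < 0). A sketch fitting synthetic excesses with the threshold (location) fixed at zero; the Bayesian estimation used in the study is replaced here by plain maximum likelihood:

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
xi_true, beta_true = -0.2, 1.0     # bounded upper tail since xi < 0

# synthetic excesses Y over the threshold, location fixed at zero
excesses = genpareto.rvs(c=xi_true, scale=beta_true, size=5000, random_state=rng)

xi_hat, _, beta_hat = genpareto.fit(excesses, floc=0.0)
upper_limit = -beta_hat / xi_hat   # finite upper bound y_sup when xi_hat < 0
```

The fitted `upper_limit` plays the role of y_sup above; a Bayesian treatment would instead put a prior enforcing ξ < 0 and propagate the posterior to return periods.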
Long-Range Correlations and Memory in the Dynamics of Internet Interdomain Routing
Kitsak, Maksim; Elmokashfi, Ahmed; Havlin, Shlomo; Krioukov, Dmitri
2015-01-01
Data transfer is one of the main functions of the Internet. The Internet consists of a large number of interconnected subnetworks or domains, known as Autonomous Systems (ASes). Due to privacy and other reasons, the information about what route to use to reach devices within other ASes is not readily available to any given AS. The Border Gateway Protocol (BGP) is responsible for discovering and distributing this reachability information to all ASes. Since the topology of the Internet is highly dynamic, all ASes constantly exchange and update this reachability information in small chunks, known as routing control packets or BGP updates. In view of the rapid growth of the Internet, there are significant concerns about the scalability of the BGP updates and the efficiency of BGP routing in general. Motivated by these issues we conduct a systematic time series analysis of BGP update rates. We find that BGP update time series are extremely volatile and exhibit long-term correlations and memory effects, similar to seismic time series, or temperature and stock market price fluctuations. The presented statistical characterization of BGP update dynamics could serve as a basis for validating existing models and developing better models of Internet interdomain routing. PMID:26529312
Forecasting the portuguese stock market time series by using artificial neural networks
NASA Astrophysics Data System (ADS)
Isfan, Monica; Menezes, Rui; Mendes, Diana A.
2010-04-01
In this paper, we show that neural networks can be used to uncover the non-linearity that exists in financial data. First, we follow a traditional approach by analysing the deterministic/stochastic characteristics of the Portuguese stock market data, and some typical features are studied, like the Hurst exponents, among others. We also apply a BDS test to investigate nonlinearities, and the results are as expected: the financial time series do not exhibit linear dependence. Secondly, we trained four types of neural networks for the stock markets and used the models to make forecasts. The artificial neural networks were obtained using a three-layer feed-forward topology and the back-propagation learning algorithm. The quite large number of parameters that must be selected to develop a neural network forecasting model involves some trial and error, and as a consequence the error is not small enough. In order to improve this, we use a nonlinear optimization algorithm to minimize the error. Finally, the outputs of the four models are quite similar, leading to a qualitative forecast that we compare with the results of applying the k-nearest-neighbour method to the same time series.
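A scaled-down version of the three-layer feed-forward/back-propagation setup, trained by plain gradient descent on one-step-ahead prediction of a synthetic series; the layer sizes, learning rate, and data are illustrative and not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0.0, 8.0 * np.pi, 400))

# one-step-ahead samples: 5 lagged values predict the next value
lags = 5
X = np.stack([series[i:i + lags] for i in range(len(series) - lags)])
y = series[lags:]

# three-layer feed-forward network trained by plain back-propagation
W1 = rng.normal(scale=0.1, size=(lags, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(16, 1));    b2 = np.zeros(1)

losses = []
for epoch in range(500):
    h = np.tanh(X @ W1 + b1)                 # hidden layer
    pred = (h @ W2 + b2).ravel()             # linear output layer
    err = pred - y
    losses.append(np.mean(err ** 2))
    g_out = (2.0 * err / len(y))[:, None]    # gradient of the MSE w.r.t. pred
    gW2, gb2 = h.T @ g_out, g_out.sum(0)
    g_h = (g_out @ W2.T) * (1.0 - h ** 2)    # back-propagate through tanh
    gW1, gb1 = X.T @ g_h, g_h.sum(0)
    for p, g in ((W1, gW1), (b1, gb1), (W2, gW2), (b2, gb2)):
        p -= 0.05 * g                        # gradient descent step
```

Even in this toy setting, the initialisation and learning rate noticeably affect the final error, which mirrors the trial-and-error parameter selection discussed above.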
A daily Azores-Iceland North Atlantic Oscillation index back to 1850.
Cropper, Thomas; Hanna, Edward; Valente, Maria Antónia; Jónsson, Trausti
2015-07-01
We present the construction of a continuous, daily (09:00 UTC), station-based (Azores-Iceland) North Atlantic Oscillation (NAO) Index back to 1871 which is extended back to 1850 with additional daily mean data. The constructed index more than doubles the length of previously existing, widely available, daily NAO time series. The index is created using entirely observational sea-level pressure (SLP) data from Iceland and 73.5% of observational SLP data from the Azores - the remainder being filled in via reanalysis (Twentieth Century Reanalysis Project and European Mean Sea Level Pressure) SLP data. Icelandic data are taken from the Southwest Iceland pressure series. We construct and document a new Ponta Delgada SLP time series based on recently digitized and newly available data that extend back to 1872. The Ponta Delgada time series is created by splicing together several fractured records (from Ponta Delgada, Lajes, and Santa Maria) and filling in the major gaps (pre-1872, 1888-1905, and 1940-1941) and occasional days (145) with reanalysis data. Further homogeneity corrections are applied to the Azores record, and the daily (09:00 UTC) NAO index is then calculated. The resulting index, with its extended temporal length and daily resolution, is the first reconstruction of daily NAO back into the 19th Century and therefore is useful for researchers across multiple disciplines.
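The core of a station-based NAO index of this kind is the difference of standardized SLP anomalies, Azores minus Iceland. A minimal sketch (the three-day pressure values are made up; the real index standardizes against a long climatological base period rather than the series itself):

```python
import statistics

def nao_index(azores_slp, iceland_slp):
    """Daily station-based NAO index: standardized Azores SLP anomaly
    minus standardized Iceland SLP anomaly (inputs in hPa, same length)."""
    def standardize(series):
        m = statistics.mean(series)
        s = statistics.pstdev(series)
        return [(v - m) / s for v in series]
    return [a - i for a, i in zip(standardize(azores_slp),
                                  standardize(iceland_slp))]

# Positive NAO: anomalously high pressure in the Azores, low in Iceland.
idx = nao_index([1020.0, 1025.0, 1015.0], [1000.0, 995.0, 1005.0])
```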
Nonlinear multi-analysis of agent-based financial market dynamics by epidemic system
NASA Astrophysics Data System (ADS)
Lu, Yunfan; Wang, Jun; Niu, Hongli
2015-10-01
Based on an epidemic dynamical system, we construct a new agent-based financial time series model. To check and verify its validity, we compare the statistical properties of the time series model with those of real stock market indices, the Shanghai Stock Exchange Composite Index and the Shenzhen Stock Exchange Component Index. For analyzing the statistical properties, we combine the multi-parameter analysis with the tail distribution analysis, the modified rescaled range analysis, and the multifractal detrended fluctuation analysis. For a better perspective, three-dimensional diagrams are used to present the analysis results. The empirical research in this paper indicates that the long-range dependence property and the multifractal phenomenon exist in both the real returns and the proposed model. Therefore, the new agent-based financial model can reproduce some important features of real stock markets.
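Rescaled range (R/S) analysis, mentioned above in its modified form, estimates the Hurst exponent from how R/S scales with the window size. A rough sketch of the classic (unmodified) version, with a white-noise test series generated here for illustration:

```python
import math
import random
import statistics

def rescaled_range(x):
    """R/S statistic: range of the cumulative mean-adjusted sum,
    divided by the standard deviation of the window."""
    m = statistics.mean(x)
    cum, partial = 0.0, []
    for v in x:
        cum += v - m
        partial.append(cum)
    return (max(partial) - min(partial)) / statistics.pstdev(x)

def hurst_rs(x, sizes=(8, 16, 32, 64)):
    """Hurst exponent as the least-squares slope of log(R/S) vs log(n)."""
    pts = []
    for n in sizes:
        rs = [rescaled_range(x[i:i + n]) for i in range(0, len(x) - n + 1, n)]
        pts.append((math.log(n), math.log(statistics.mean(rs))))
    mx = statistics.mean(p[0] for p in pts)
    my = statistics.mean(p[1] for p in pts)
    return (sum((px - mx) * (py - my) for px, py in pts)
            / sum((px - mx) ** 2 for px, py in pts))

rng = random.Random(0)
h_white = hurst_rs([rng.gauss(0.0, 1.0) for _ in range(512)])  # near 0.5
```

White noise gives an exponent near 0.5 (the small-sample R/S estimate is known to be biased slightly upward), while long-range dependent returns give values above 0.5.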
Using terrestrial radar to explore lava channel erosion on Momotombo volcano, Nicaragua
NASA Astrophysics Data System (ADS)
Gallant, E.; Deng, F.; Xie, S.; Connor, L.; Connor, C.; Saballos, J. A.; Dixon, T. H.; Myhre, D.
2017-12-01
We explore the application of terrestrial radar as a tool for imaging topography on Momotombo volcano, Nicaragua. A major feature of the edifice is an incised lava flow channel (possibly created by the 1904 eruption) that measures 150 m in width and up to 60 m in depth. This feature is unusual because most lava channels are constructional in nature and constrained by levees on their margins. The radar elevation model was used alongside a TerraSAR-X/TanDEM-X DEM to help create a topographic time series. We consider the possibility that the channel was formed during the 1904 eruption by thermal and/or mechanical erosion. We aim to quantify the energy required to create the observed topography by merging this topographic time series with existing field observations and mathematical models of erosion via lava flow.
NASA Astrophysics Data System (ADS)
Garcin, Matthieu
2017-10-01
Hurst exponents depict the long memory of a time series. For human-dependent phenomena, as in finance, this feature may vary over time. This justifies modelling the dynamics by multifractional Brownian motions, which are consistent with time-dependent Hurst exponents. We improve the existing literature on estimating time-dependent Hurst exponents by proposing a smooth estimate obtained by variational calculus. This method is very general and not restricted to the Hurst framework alone. It is globally more accurate and simpler than other existing non-parametric estimation techniques. Moreover, in the field of Hurst exponents, it makes it possible to produce forecasts based on the estimated multifractional Brownian motion. The application to high-frequency foreign exchange markets (GBP, CHF, SEK, USD, CAD, AUD, JPY, CNY and SGD, all against EUR) shows significantly good forecasts. When the Hurst exponent is higher than 0.5, which indicates a long-memory feature, the accuracy is higher.
Key Technology of Real-Time Road Navigation Method Based on Intelligent Data Research
Tang, Haijing; Liang, Yu; Huang, Zhongnan; Wang, Taoyi; He, Lin; Du, Yicong; Ding, Gangyi
2016-01-01
The effect of traffic flow prediction plays an important role in route selection. Traditional traffic flow forecasting methods mainly include linear, nonlinear, neural network, and time series analysis methods. However, all of them have some shortcomings. This paper analyzes the existing algorithms for traffic flow prediction and the characteristics of city traffic flow, and proposes a road traffic flow prediction method based on transfer probability. This method first analyzes the transfer probability of the roads upstream of the target road and then predicts the traffic flow at the next time step using the traffic flow equation. The Newton interior-point method is used to obtain the optimal values of the parameters. Finally, the proposed model is used to predict the traffic flow at the next time step. Compared with existing prediction methods, the proposed model shows good performance: it obtains the optimal parameter values faster and has higher prediction accuracy, and so can be used for real-time traffic flow prediction. PMID:27872637
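A hypothetical reading of the transfer-probability idea (function names, data, and the estimation rule are invented for illustration; the paper fits its parameters with the Newton interior-point method): estimate, per upstream road, the fraction of vehicles that continue onto the target road, then predict the next-interval target flow as the probability-weighted sum of current upstream flows.

```python
def estimate_probs(flows, contribs):
    """flows[i][t]: vehicle counts on upstream road i at interval t;
    contribs[i][t]: the subset of those that continued to the target road."""
    return [sum(c) / sum(f) for f, c in zip(flows, contribs)]

def predict_flow(upstream_flows, probs):
    """Simplified traffic flow equation: weighted sum of upstream flows."""
    return sum(f * p for f, p in zip(upstream_flows, probs))

probs = estimate_probs([[10, 20], [30, 10]], [[5, 10], [6, 2]])
nxt = predict_flow([40, 50], probs)   # 40 * 0.5 + 50 * 0.2
```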
Relating interesting quantitative time series patterns with text events and text features
NASA Astrophysics Data System (ADS)
Wanner, Franz; Schreck, Tobias; Jentner, Wolfgang; Sharalieva, Lyubka; Keim, Daniel A.
2013-12-01
In many application areas, the key to successful data analysis is the integrated analysis of heterogeneous data. One example is the financial domain, where time-dependent and highly frequent quantitative data (e.g., trading volume and price information) and textual data (e.g., economic and political news reports) need to be considered jointly. Data analysis tools need to support an integrated analysis, which allows studying the relationships between textual news documents and quantitative properties of the stock market price series. In this paper, we describe a workflow and tool that allow a flexible formation of hypotheses about text features and their combinations, which reflect quantitative phenomena observed in stock data. To support such an analysis, we combine the analysis steps of frequent quantitative and text-oriented data using an existing a priori method. First, based on heuristics, we extract interesting intervals and patterns in large time series data. The visual analysis supports the analyst in exploring parameter combinations and their results. The identified time series patterns are then input for the second analysis step, in which all identified intervals of interest are analyzed for frequent patterns co-occurring with financial news. An a priori method supports the discovery of such sequential temporal patterns. Then, various text features, like the degree of sentence nesting, noun phrase complexity, the vocabulary richness, etc., are extracted from the news to obtain meta patterns. Meta patterns are defined by a specific combination of text features which significantly differ from the text features of the remaining news data. Our approach combines a portfolio of visualization and analysis techniques, including time-, cluster- and sequence visualization and analysis functionality. We provide two case studies showing the effectiveness of our combined quantitative and textual analysis workflow.
The workflow can also be generalized to other application domains such as data analysis of smart grids, cyber-physical systems, or the security of critical infrastructure, where the data consist of a combination of quantitative and textual time series.
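The first step, heuristic extraction of interesting intervals from a price series, can be sketched generically (the threshold rule and data here are invented for illustration, not the paper's heuristics): flag maximal runs where the absolute one-step change exceeds a threshold.

```python
def interesting_intervals(series, threshold):
    """Return (start, end) index pairs of maximal runs with large one-step moves."""
    intervals, start = [], None
    for i in range(1, len(series)):
        if abs(series[i] - series[i - 1]) > threshold:
            if start is None:
                start = i - 1          # open a new interval at the jump
        elif start is not None:
            intervals.append((start, i - 1))
            start = None               # close the interval on a calm step
    if start is not None:
        intervals.append((start, len(series) - 1))
    return intervals

ivals = interesting_intervals([0, 0, 5, 10, 10, 0], threshold=2)
```

Each flagged interval would then be cross-referenced with the news published during it, which is where the sequential-pattern mining takes over.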
NASA Astrophysics Data System (ADS)
Hentze, Konrad; Thonfeld, Frank; Menz, Gunter
2017-10-01
In the discourse on land reform assessments, a significant lack of spatial and time-series data has been identified, especially with respect to Zimbabwe's "Fast-Track Land Reform Programme" (FTLRP). At the same time, interest persists among land use change scientists in evaluating causes of land use change and thereby increasing the explanatory power of remote sensing products. This study recognizes these demands and aims to provide input on both levels: evaluating the potential of satellite remote sensing time series to answer questions which evolved after intensive land redistribution efforts in Zimbabwe; and investigating how time-series analysis of the Normalized Difference Vegetation Index (NDVI) can be enhanced to provide information on land use change induced by land reform. To achieve this, two time-series methods are applied to MODIS NDVI data: Seasonal Trend Analysis (STA) and Breakpoint Analysis for Additive Season and Trend (BFAST). In our first analysis, linking agricultural productivity trends to different land tenure regimes shows that regional clustering of trends is more dominant than a relationship between tenure and trend, with a slightly negative slope for all regimes. We demonstrate that clusters of strong negative and positive productivity trends are results of changing irrigation patterns. To locate emerging and fallow irrigation schemes in semi-arid Zimbabwe, a new multi-method approach is developed which allows mapping changes from bimodal seasonal phenological patterns to unimodal ones and vice versa. With an enhanced breakpoint analysis through the combination of STA and BFAST, we are able to provide a technique that can be applied at large scale to map the status and development of highly productive cropping systems, which are key for food production, national export, and local employment. We therefore conclude that the combination of existing and accessible time-series analysis methods is able to achieve both: overcoming demonstrated limitations of MODIS-based trend analysis and enhancing knowledge of Zimbabwe's FTLRP.
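Distinguishing unimodal from bimodal seasonal phenology ultimately reduces to counting significant peaks in the annual NDVI curve. A toy sketch (the prominence rule and the sample curves are invented; the study's actual STA/BFAST combination is far more robust):

```python
def count_seasonal_peaks(ndvi, min_prominence=0.1):
    """Count local maxima rising at least min_prominence above the annual
    minimum; 1 suggests a unimodal season, 2 a bimodal (double-cropping) one."""
    base = min(ndvi)
    peaks = 0
    for i in range(1, len(ndvi) - 1):
        if (ndvi[i] > ndvi[i - 1] and ndvi[i] >= ndvi[i + 1]
                and ndvi[i] - base > min_prominence):
            peaks += 1
    return peaks

unimodal = [0.2, 0.5, 0.8, 0.5, 0.2]   # single rain-fed growing season
bimodal = [0.2, 0.6, 0.3, 0.7, 0.2]    # irrigated second season
```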
Optimizing Use of Water Management Systems during Changes of Hydrological Conditions
NASA Astrophysics Data System (ADS)
Výleta, Roman; Škrinár, Andrej; Danáčová, Michaela; Valent, Peter
2017-10-01
When designing water management systems and their components, there is a need for more detailed research on the hydrological conditions of the river basin whose runoff creates the main source of water in the reservoir. Over the lifetime of a water management system, the hydrological time series that served as the input for the design of the system components are never repeated in the same form. The design assumes the observed time series to be representative at the time of the system's use. However, this is a rather unrealistic assumption, because the hydrological past will not be exactly repeated over the design lifetime. When designing water management systems, specialists may therefore occasionally face insufficient or oversized capacity designs, or a wrong specification of the management rules, which may lead to non-optimal use. It is therefore necessary to establish a comprehensive approach to simulating the fluctuations in the interannual runoff (taking into account the current dry and wet periods) in the form of stochastic modelling techniques in water management practice. The paper deals with the methodological procedure of modelling mean monthly flows using the stochastic Thomas-Fiering model, modified by applying the Wilson-Hilferty transformation to the independent random numbers. This transformation is usually applied in the case of significant asymmetry (skewness) in the observed time series. The methodological procedure was applied to data acquired at the gauging station of Horné Orešany on the Parná Stream. Observed mean monthly flows for the period 1.11.1980 - 31.10.2012 served as the model input. After estimating the model parameters and the Wilson-Hilferty transformation parameters, synthetic time series of mean monthly flows were simulated. These were compared with the observed hydrological time series using basic statistical characteristics (e.g. mean, standard deviation and skewness) to test the quality of the model simulation. The synthetic hydrological series of monthly flows have the same statistical properties as the time series observed in the past. The compiled model was able to take into account the diversity of extreme hydrological situations in the form of synthetic series of mean monthly flows, confirming the occurrence of flow sequences that could occur in the future. The results of the stochastic modelling, in the form of synthetic time series of mean monthly flows that take into account the seasonal fluctuations of runoff within the year, could be applicable in engineering hydrology (e.g. for optimal use of an existing water management system in connection with reassessment of the economic risks of the system).
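The Thomas-Fiering recursion generates each month's flow from the previous month via a lag-one regression plus a random innovation. A minimal sketch with plain Gaussian innovations (the paper replaces these with Wilson-Hilferty-transformed variates to handle skewness; the monthly statistics below are invented, not the Horné Orešany values):

```python
import math
import random

def thomas_fiering(means, stds, lag1, years, seed=0):
    """means, stds, lag1: 12 monthly means, standard deviations and
    month-to-next-month lag-one correlations.
    Returns 12 * years synthetic mean monthly flows."""
    rng = random.Random(seed)
    out, month, q = [], 0, means[0]
    for _ in range(12 * years):
        nxt = (month + 1) % 12
        b = lag1[month] * stds[nxt] / stds[month]        # regression slope
        eps = rng.gauss(0.0, 1.0) * stds[nxt] * math.sqrt(1.0 - lag1[month] ** 2)
        q = means[nxt] + b * (q - means[month]) + eps    # Thomas-Fiering step
        out.append(q)
        month = nxt
    return out

flows = thomas_fiering([10.0] * 12, [1.0] * 12, [0.5] * 12, years=200)
```

By construction the synthetic series preserves the specified monthly means, standard deviations, and lag-one correlations, which is exactly the property checked against the observed record in the paper.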
Biogeochemical Response to Mesoscale Physical Forcing in the California Current System
NASA Technical Reports Server (NTRS)
Niiler, Pearn P.; Letelier, Ricardo; Moisan, John R.; Marra, John A. (Technical Monitor)
2001-01-01
In the first part of the project, we investigated the local response of the coastal ocean ecosystems (changes in chlorophyll concentration and chlorophyll fluorescence quantum yield) to physical forcing by developing and deploying Autonomous Drifting Ocean Stations (ADOS) within several mesoscale features along the U.S. west coast. We also compared the temporal and spatial variability registered by sensors mounted on the drifters to that registered by sensors mounted on the satellites, in order to assess the scales of variability that are not resolved by the ocean color satellite. The second part of the project used the existing WOCE SVP Surface Lagrangian drifters to track individual water parcels through time. The individual drifter tracks were used to generate multivariate time series by interpolating/extracting the biological and physical data fields retrieved by remote sensors (ocean color, SST, wind speed and direction, wind stress curl, and sea level topography). The individual time series of the physical data (AVHRR, TOPEX, NCEP) were analyzed against the ocean color (SeaWiFS) time series to determine the time scale of the biological response to the physical forcing. The results from this part of the research are being used to compare the decorrelation scales of chlorophyll in a Lagrangian versus an Eulerian framework. The results from both parts of this research augmented the time series data needed to investigate the interactions between ocean mesoscale features, wind, and biogeochemical processes. Using the historical Lagrangian data sets, we have completed a comparison of the decorrelation scales in both the Eulerian and Lagrangian reference frames for the SeaWiFS data set. We are continuing to investigate how these results might be used in objective mapping efforts.
IDCDACS: IDC's Distributed Application Control System
NASA Astrophysics Data System (ADS)
Ertl, Martin; Boresch, Alexander; Kianička, Ján; Sudakov, Alexander; Tomuta, Elena
2015-04-01
The Preparatory Commission for the CTBTO is an international organization based in Vienna, Austria. Its mission is to establish a global verification regime to monitor compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), which bans all nuclear explosions. For this purpose, time series data from a global network of seismic, hydro-acoustic and infrasound (SHI) sensors are transmitted to the International Data Centre (IDC) in Vienna in near-real-time, where they are processed to locate events that may be nuclear explosions. We designed a new distributed application control system that glues together the various components of the automatic waveform data processing system at the IDC (IDCDACS). Our highly scalable solution preserves the existing architecture of the IDC processing system, which has proved successful over many years of operational use, but replaces proprietary components with open-source solutions and custom-developed software. Existing code was refactored and extended to obtain a reusable software framework that is flexibly adaptable to different types of processing workflows. Automatic data processing is organized in series of self-contained processing steps, each series being referred to as a processing pipeline. Pipelines process data by time intervals, i.e. the time series data received from monitoring stations are organized in segments based on the time when the data were recorded. So-called data monitor applications queue the data for processing in each pipeline based on specific conditions, e.g. data availability, elapsed time, or completion states of preceding processing pipelines. IDCDACS consists of a configurable number of distributed monitoring and controlling processes, a message broker, and a relational database. All processes communicate through message queues hosted on the message broker. Persistent state information is stored in the database.
A configurable processing controller instantiates and monitors all data processing applications. Due to decoupling by message queues, the system is highly versatile and failure tolerant. The implementation utilizes the RabbitMQ open-source messaging platform, which is based upon the Advanced Message Queuing Protocol (AMQP), an on-the-wire protocol (like HTTP) and open industry standard. IDCDACS uses the high-availability capabilities provided by RabbitMQ and is equipped with failure recovery features to survive network and server outages. It is implemented in C and Python and is operated in a Linux environment at the IDC. Although IDCDACS was specifically designed for the existing IDC processing system, its architecture is generic and reusable for different automatic processing workflows, e.g. similar to those described in (Olivieri et al. 2012, Kværna et al. 2012). Major advantages are its independence of the specific data processing applications used and the possibility to reconfigure IDCDACS for different types of processing, data, and trigger logic. A possible future development would be to use the IDCDACS framework for different scientific domains, e.g. for processing of Earth observation satellite data, extending the one-dimensional time-series intervals to spatio-temporal data cubes. REFERENCES: Olivieri, M., J. Clinton (2012) An almost fair comparison between Earthworm and SeisComp3, Seismological Research Letters, 83(4), 720-727. Kværna, T., S. J. Gibbons, D. B. Harris, D. A. Dodge (2012) Adapting pipeline architectures to track developing aftershock sequences and recurrent explosions, Proceedings of the 2012 Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring Technologies, 776-785.
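The data-monitor idea, queuing a time interval for a pipeline only once its trigger condition holds, can be mimicked with the standard library's queue module (the real system uses RabbitMQ/AMQP message queues between distributed processes; the function name, interval labels, and availability threshold below are invented):

```python
import queue

def monitor_intervals(intervals, availability, pipeline_q, threshold=0.8):
    """Queue each data interval whose station-data availability has reached
    the threshold; the rest wait for a later monitor pass."""
    queued = []
    for iv in intervals:
        if availability.get(iv, 0.0) >= threshold:
            pipeline_q.put(iv)      # hand the interval to the pipeline workers
            queued.append(iv)
    return queued

q = queue.Queue()
ready = monitor_intervals(["08:00", "08:10", "08:20"],
                          {"08:00": 1.0, "08:10": 0.5, "08:20": 0.9}, q)
```

Decoupling producer (monitor) from consumers (processing applications) through the queue is what makes the design tolerant to slow or restarted workers.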
Toward the Real-Time Tsunami Parameters Prediction
NASA Astrophysics Data System (ADS)
Lavrentyev, Mikhail; Romanenko, Alexey; Marchuk, Andrey
2013-04-01
Today, a wide, well-developed system of deep-ocean tsunami detectors operates over the Pacific. Direct measurements of tsunami-wave time series are available. However, tsunami-warning systems fail to predict the basic parameters of tsunami waves on time; dozens of examples could be provided. In our view, the lack of computational power is the main reason for these failures. At the same time, modern computer technologies, such as the GPU (graphics processing unit) and FPGA (field-programmable gate array), can dramatically improve data processing performance, which may enable timely tsunami-warning prediction. Thus, it is possible to address the challenge of real-time tsunami forecasting for selected geographic regions. We propose to use three new techniques in the existing tsunami warning systems to achieve real-time calculation of tsunami wave parameters. First, the measurement system (e.g., DART buoy locations) should be optimized (both in terms of wave arrival time and the amplitude parameter). The corresponding software application exists today and is ready for use [1]. We consider the example of the coastal line of Japan. Numerical tests show that optimal installation of only 4 DART buoys (taking into account the existing seabed cable) will reduce the tsunami wave detection time to only 10 min after an underwater earthquake. Second, as shown by the authors of this paper, the use of GPU/FPGA technologies accelerates the execution of the MOST (method of splitting tsunami) code by 100 times [2]. Therefore, tsunami wave propagation over an ocean area of 2000*2000 km (wave propagation simulation: time step 10 s, recording each 4th spatial point and 4th time step) could be calculated in 3 s with a 4' mesh, 50 s with a 1' mesh, and 5 min with a 0.5' mesh. The algorithm to switch from the coarse mesh to the fine-grained one is also available. Finally, we propose a new algorithm for determining tsunami source parameters by real-time processing of the time series obtained at DART buoys.
It is possible to approximate the measured time series by a linear combination of synthetic marigrams. The coefficients of such a linear combination are calculated with the help of an orthogonal decomposition. The algorithm is very fast and demonstrates good accuracy. Summing up, using the example of the coastal line of Japan, wave height evaluation will be available 12-14 minutes after the earthquake, even before the wave approaches the nearest shore point (usually, this takes place in about 20 minutes). [1] Astrakova, A.S., D.V. Bannikov, S.G. Cherny, M.M. Lavrentiev, The determination of the optimal sensors' location using genetic algorithm, 3rd Nordic EMW Summer School, Turku, Finland, June 2009: proceedings, TUSC General Publications, N 53, pp. 5-22, 2009. [2] Lavrentiev, M. Jr., A. Romanenko, "Modern Hardware Solutions to Speed Up Tsunami Simulation Codes", Geophysical Research Abstracts, Vol. 12, EGU2010-3835, 2010.
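Approximating a measured DART record by a linear combination of synthetic marigrams is a small least-squares problem. The sketch below solves it via the normal equations with Gaussian elimination (the authors use an orthogonal-decomposition algorithm instead; the marigram data are invented):

```python
def fit_coefficients(measured, synthetics):
    """Least-squares coefficients c minimizing ||measured - sum c_k * s_k||,
    obtained from the normal equations (fine for a handful of marigrams)."""
    m = len(synthetics)
    A = [[sum(u * v for u, v in zip(synthetics[i], synthetics[j]))
          for j in range(m)] for i in range(m)]
    b = [sum(u * y for u, y in zip(synthetics[i], measured)) for i in range(m)]
    for col in range(m):                       # forward elimination with pivoting
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for c in range(col, m):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    x = [0.0] * m
    for r in range(m - 1, -1, -1):             # back substitution
        x[r] = (b[r] - sum(A[r][c] * x[c] for c in range(r + 1, m))) / A[r][r]
    return x

# measured = 2 * s1 + 3 * s2 exactly, so the fit recovers [2, 3]
coeffs = fit_coefficients([2.0, 3.0, 2.0, 3.0],
                          [[1.0, 0.0, 1.0, 0.0], [0.0, 1.0, 0.0, 1.0]])
```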
77 FR 16280 - Capital Research and Management Company, et al.; Notice of Application
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-20
.... APPLICANTS: American Funds Insurance Series (``AFIS''), Capital Research and Management Company (``CRMC.../search.htm or by calling (202) 551-8090. Applicants' Representations 1. AFIS is organized as a... extent necessary to permit any existing or future series of AFIS and any other existing or future...
Yang, Jian-Yi; Peng, Zhen-Ling; Yu, Zu-Guo; Zhang, Rui-Jie; Anh, Vo; Wang, Desheng
2009-04-21
In this paper, we intend to predict protein structural classes (alpha, beta, alpha+beta, or alpha/beta) for low-homology data sets. Two widely used data sets, 1189 (containing 1092 proteins) and 25PDB (containing 1673 proteins), with sequence homology of 40% and 25%, respectively, were employed. We propose to decompose the chaos game representation of proteins into two kinds of time series. Then, a novel and powerful nonlinear analysis technique, recurrence quantification analysis (RQA), is applied to analyze these time series. For a given protein sequence, a total of 16 characteristic parameters can be calculated with RQA, which are treated as a feature representation of the protein sequence. Based on such feature representation, the structural class for each protein is predicted with Fisher's linear discriminant algorithm. The jackknife test is used to test and compare our method with other existing methods. The overall accuracies with the step-by-step procedure are 65.8% and 64.2% for the 1189 and 25PDB data sets, respectively. With the widely used one-against-others procedure, we compare our method with five other existing methods. Notably, the overall accuracies of our method are 6.3% and 4.1% higher for the two data sets, respectively. Furthermore, only 16 parameters are used in our method, which is fewer than in other methods. This suggests that the current method may play a complementary role to the existing methods and is promising for the prediction of protein structural classes.
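Of the RQA parameters mentioned, the simplest is the recurrence rate, the density of points in the thresholded recurrence plot. A bare-bones sketch for a scalar series (real RQA on embedded state vectors, and the other 15 parameters, are more involved):

```python
def recurrence_rate(series, eps):
    """Fraction of index pairs (i, j) with |x_i - x_j| < eps, i.e. the
    density of recurrence points in the recurrence plot."""
    n = len(series)
    hits = sum(1 for i in range(n) for j in range(n)
               if abs(series[i] - series[j]) < eps)
    return hits / (n * n)
```

A constant series recurs everywhere (rate 1.0), while a series of well-separated values only recurs on the diagonal.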
An information-theoretical perspective on weighted ensemble forecasts
NASA Astrophysics Data System (ADS)
Weijs, Steven V.; van de Giesen, Nick
2013-08-01
This paper presents an information-theoretical method for weighting ensemble forecasts with new information. Weighted ensemble forecasts can be used to adjust the distribution that an existing ensemble of time series represents, without modifying the values in the ensemble itself. The weighting can, for example, add new seasonal forecast information to an existing ensemble of historically measured time series that represents climatic uncertainty. A recent article in this journal compared several methods to determine the weights for the ensemble members and introduced the pdf-ratio method. In this article, a new method, the minimum relative entropy update (MRE-update), is presented. Based on the principle of minimum discrimination information, an extension of the principle of maximum entropy (POME), the method ensures that no more information is added to the ensemble than is present in the forecast. This is achieved by minimizing relative entropy, with the forecast information imposed as constraints. From this same perspective, an information-theoretical view on the various weighting methods is presented. The MRE-update is compared with the existing methods and the parallels with the pdf-ratio method are analysed. The paper provides a new, information-theoretical justification for one version of the pdf-ratio method, which turns out to be equivalent to the MRE-update. All other methods result in sets of ensemble weights that, seen from the information-theoretical perspective, add either too little or too much (i.e. fictitious) information to the ensemble.
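Under a single mean constraint, the minimum-relative-entropy update of initially equal weights takes the exponential-tilting form w_i proportional to exp(lambda * x_i). The sketch below finds lambda by bisection; it illustrates the principle only and is not the paper's implementation (function names and data invented):

```python
import math

def mre_weights(values, target_mean, tol=1e-12):
    """Weights minimizing relative entropy to the uniform distribution,
    subject to sum(w_i * x_i) = target_mean."""
    def tilted_mean(lam):
        w = [math.exp(lam * v) for v in values]
        s = sum(w)
        return sum(wi * vi for wi, vi in zip(w, values)) / s
    lo, hi = -50.0, 50.0                # tilted_mean is increasing in lam
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if tilted_mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2.0
    w = [math.exp(lam * v) for v in values]
    s = sum(w)
    return [wi / s for wi in w]

# Shift an ensemble of member values toward a wetter/warmer forecast mean.
w = mre_weights([1.0, 2.0, 3.0, 4.0], target_mean=3.0)
```

Because only the mean constraint is imposed, no information beyond the forecast is injected; members consistent with the forecast simply gain weight.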
A Fast Framework for Abrupt Change Detection Based on Binary Search Trees and Kolmogorov Statistic
Qi, Jin-Peng; Qi, Jie; Zhang, Qing
2016-01-01
Change-Point (CP) detection has attracted considerable attention in the fields of data mining and statistics; it is important to discuss how to quickly and efficiently detect abrupt changes in large-scale bioelectric signals. Currently, most of the existing methods, like the Kolmogorov-Smirnov (KS) statistic, are time-consuming, especially for large-scale datasets. In this paper, we propose a fast framework for abrupt change detection based on binary search trees (BSTs) and a modified KS statistic, named BSTKS (binary search trees and Kolmogorov statistic). In this method, first, two binary search trees, termed BSTcA and BSTcD, are constructed by multilevel Haar Wavelet Transform (HWT); second, three search criteria are introduced in terms of the statistic and variance fluctuations in the diagnosed time series; finally, an optimal search path is detected from the root to the leaf nodes of the two BSTs. The studies on both synthetic time series samples and real electroencephalograph (EEG) recordings indicate that the proposed BSTKS can detect abrupt change more quickly and efficiently than the KS, t-statistic (t), and Singular-Spectrum Analysis (SSA) methods, with the shortest computation time, the highest hit rate, the smallest error, and the highest accuracy of the four methods. This study suggests that the proposed BSTKS is helpful for inspecting all kinds of bioelectric time series signals. PMID:27413364
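The KS comparison at the heart of the framework can be shown with a brute-force baseline, which is exactly the slow scan that BSTKS replaces with a binary-search-tree traversal over Haar wavelet coefficients (function names and the test series here are invented):

```python
import bisect

def ks_statistic(a, b):
    """Two-sample KS statistic: maximum gap between the empirical CDFs."""
    sa, sb = sorted(a), sorted(b)
    return max(abs(bisect.bisect_right(sa, x) / len(sa)
                   - bisect.bisect_right(sb, x) / len(sb))
               for x in set(a) | set(b))

def change_point(series):
    """Exhaustive baseline: the split index maximizing the KS statistic
    between the left and right sides."""
    best_d, best_i = -1.0, None
    for i in range(2, len(series) - 1):
        d = ks_statistic(series[:i], series[i:])
        if d > best_d:
            best_d, best_i = d, i
    return best_i

cp = change_point([0.0] * 20 + [5.0] * 20)   # abrupt level shift at index 20
```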
Globally-Gridded Interpolated Night-Time Marine Air Temperatures 1900-2014
NASA Astrophysics Data System (ADS)
Junod, R.; Christy, J. R.
2016-12-01
Over the past century, climate records have pointed to an increase in global near-surface average temperature. Near-surface air temperature over the oceans is a relatively unused parameter in understanding the current state of the climate, but it is useful as an independent temperature metric over the oceans and serves as a geographical and physical complement to near-surface air temperature over land. Though versions of this dataset exist (e.g., HadMAT1 and HadNMAT2), it has been strongly recommended that various groups generate climate records independently. This University of Alabama in Huntsville (UAH) study began with the construction of monthly night-time marine air temperature (UAHNMAT) values from the early twentieth century to the present. Data from the International Comprehensive Ocean and Atmosphere Data Set (ICOADS) were used to compile a time series of gridded UAHNMAT (20°S-70°N). This time series was homogenized to correct for the many biases such as increasing ship height, solar deck heating, etc. The time series of UAHNMAT, once adjusted to a standard reference height, is gridded to 1.25° pentad grid boxes and interpolated using the kriging technique. This study will present results which quantify the variability and trends and compare them to the trends of other related datasets, including HadNMAT2 and sea-surface temperatures (HadISST and ERSSTv4).
NASA Technical Reports Server (NTRS)
Teng, William; Rui, Hualan; Strub, Richard; Vollmer, Bruce
2016-01-01
A long-standing "Digital Divide" in data representation exists between the preferred way of data access by the hydrology community and the common way of data archival by earth science data centers. Typically, in hydrology, earth surface features are expressed as discrete spatial objects (e.g., watersheds), and time-varying data are contained in associated time series. Data in earth science archives, although stored as discrete values (of satellite swath pixels or geographical grids), represent continuous spatial fields, one file per time step. This Divide has been an obstacle, specifically, between the Consortium of Universities for the Advancement of Hydrologic Science, Inc. and NASA earth science data systems. In essence, the way data are archived is conceptually orthogonal to the desired method of access. Our recent work has shown an optimal method of bridging the Divide, by enabling operational access to long-time series (e.g., 36 years of hourly data) of selected NASA datasets. These time series, which we have termed "data rods," are pre-generated or generated on-the-fly. This optimal solution was arrived at after extensive investigations of various approaches, including one based on "data curtains." The on-the-fly generation of data rods uses "data cubes," NASA Giovanni, and parallel processing. The optimal reorganization of NASA earth science data has significantly enhanced the access to and use of the data for the hydrology user community.
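The "data rods" reorganization is, conceptually, a transpose: from one-grid-per-time-step files to one-time-series-per-grid-cell records. A toy sketch (the function name and the 2x2 grids are invented; the operational system works on NASA archive files at scale):

```python
def build_data_rods(grids):
    """grids[t][i][j]: value at grid cell (i, j) for time step t.
    Returns a dict mapping each cell to its full time series ('rod')."""
    rods = {}
    for grid in grids:
        for i, row in enumerate(grid):
            for j, value in enumerate(row):
                rods.setdefault((i, j), []).append(value)
    return rods

# Two time steps of a 2x2 field become four rods of length 2.
rods = build_data_rods([[[1, 2], [3, 4]], [[5, 6], [7, 8]]])
```

A hydrologist can then read the entire history of one watershed cell in a single access, instead of opening one file per time step.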
NASA Technical Reports Server (NTRS)
Mehta, Vikram M.; Delworth, Thomas
1995-01-01
Sea surface temperature (SST) variability was investigated in a 200-yr integration of a global model of the coupled oceanic and atmospheric general circulations developed at the Geophysical Fluid Dynamics Laboratory (GFDL). The second 100 yr of SST in the coupled model's tropical Atlantic region were analyzed with a variety of techniques. Analyses of SST time series, averaged over approximately the same subregions as the Global Ocean Surface Temperature Atlas (GOSTA) time series, showed that the GFDL SST anomalies also undergo pronounced quasi-oscillatory decadal and multidecadal variability but at somewhat shorter timescales than the GOSTA SST anomalies. Further analyses of the horizontal structures of the decadal timescale variability in the GFDL coupled model showed the existence of two types of variability in general agreement with results of the GOSTA SST time series analyses. One type, characterized by timescales between 8 and 11 yr, has high spatial coherence within each hemisphere but not between the two hemispheres of the tropical Atlantic. A second type, characterized by timescales between 12 and 20 yr, has high spatial coherence between the two hemispheres. The second type of variability is considerably weaker than the first. As in the GOSTA time series, the multidecadal variability in the GFDL SST time series has approximately opposite phases between the tropical North and South Atlantic Oceans. Empirical orthogonal function analyses of the tropical Atlantic SST anomalies revealed a north-south bipolar pattern as the dominant pattern of decadal variability. It is suggested that the bipolar pattern can be interpreted as decadal variability of the interhemispheric gradient of SST anomalies. 
The decadal and multidecadal timescale variability of tropical Atlantic SST, both in the observations and in the GFDL model, stands out significantly above the background 'red noise' and is coherent within each of the time series, suggesting that specific sets of processes may be responsible for the selection of the decadal and multidecadal timescales. Finally, it must be emphasized that the GFDL coupled ocean-atmosphere model generates the decadal and multidecadal timescale variability without any externally applied forcing, solar or lunar, at those timescales.
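The empirical orthogonal function (EOF) analysis mentioned above is, computationally, a singular value decomposition of the time-by-space anomaly matrix. A minimal sketch on synthetic data with an imposed north-south dipole (all amplitudes, grid sizes, and periods here are invented, not GFDL or GOSTA values):

```python
import numpy as np

# Synthetic SST anomaly field (time x space) carrying one dominant mode:
# a bipolar spatial pattern modulated by a decadal-like oscillation.
rng = np.random.default_rng(2)
n_time, n_space = 200, 10
dipole = np.concatenate([np.ones(5), -np.ones(5)])   # north vs south hemisphere
pc = np.sin(2 * np.pi * np.arange(n_time) / 40)      # slow oscillation in time
anomalies = np.outer(pc, dipole) + 0.1 * rng.normal(size=(n_time, n_space))

# EOFs are the right singular vectors of the time-mean-removed anomaly matrix.
anomalies -= anomalies.mean(axis=0)
u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
eof1 = vt[0]                          # leading spatial pattern
explained = s**2 / np.sum(s**2)       # fraction of variance per mode
```

With a strong imposed mode, the leading EOF recovers the dipole (up to an arbitrary overall sign) and explains most of the variance.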
Salehi, Ali; Jimenez-Berni, Jose; Deery, David M; Palmer, Doug; Holland, Edward; Rozas-Larraondo, Pablo; Chapman, Scott C; Georgakopoulos, Dimitrios; Furbank, Robert T
2015-01-01
To our knowledge, there is no software or database solution that supports large volumes of biological time series sensor data efficiently and enables data visualization and analysis in real time. Existing solutions for managing data typically use unstructured file systems or relational databases. These systems are not designed to provide instantaneous responses to user queries. Furthermore, they do not support rapid data analysis and visualization to enable interactive experiments. In large-scale experiments, this behaviour slows research discovery and discourages the widespread sharing and reuse of data that could otherwise inform critical decisions in a timely manner and encourage effective collaboration between groups. In this paper we present SensorDB, a web-based virtual laboratory that can manage large volumes of biological time series sensor data while supporting rapid data queries and real-time user interaction. SensorDB is sensor agnostic and uses web-based, state-of-the-art cloud and storage technologies to efficiently gather, analyse and visualize data. Collaboration and data sharing between different agencies and groups is thereby facilitated. SensorDB is available online at http://sensordb.csiro.au.
ERIC Educational Resources Information Center
Hargreaves, Andy
This book examines the personal, moral, cultural, and political dimensions of teaching in the context of rapid and far-reaching change within teachers' work and in the world beyond it. The chapters in Part One examine the powerful forces for change in society and how those forces are exerting pressure on existing institutions. Issues such as the…
Eocene volcanism and the origin of horizon A
Gibson, T.G.; Towe, K.M.
1971-01-01
A series of closely time-equivalent deposits that correlate with seismic reflector horizon A exists along the coast of eastern North America. These sediments of Late-Early to Early-Middle Eocene age contain an authigenic mineral suite indicative of the alteration of volcanic glass. A volcanic origin for these siliceous deposits onshore is consistent with a volcanic origin for the cherts of horizon A offshore.
Proliferation of Shadow Education Institutions (SEI's) in the Philippines: A Time Series Analysis
ERIC Educational Resources Information Center
de Castro, Belinda V.; de Guzman, Allan B.
2013-01-01
While the issue on the existence of shadow education institutions (SEI's) has only recently been the subject of investigation in studies in various countries worldwide, it is clear that its market is a huge industry in much of Asia and is growing fast elsewhere. Capitalizing on the annual number of SEI's gathered from key government agencies and…
STEM connections to the GOES-R Satellite Series
NASA Astrophysics Data System (ADS)
Mooney, M. E.; Schmit, T.
2015-12-01
GOES-R, the next Geostationary Operational Environmental Satellite (GOES), is scheduled to be launched in October of 2016. Its role is to continue western hemisphere satellite coverage as the existing GOES series winds down its 20-year operation. Instruments on the next-generation GOES-R satellite series will provide major improvements over the current GOES, both in the frequency of images acquired and in the spectral and spatial resolution of the images, providing a perfect conduit for STEM education. Most of these improvements will be provided by the Advanced Baseline Imager (ABI), which will provide three times more spectral information, four times the spatial resolution, and more than five times faster temporal coverage than the current GOES. Another exciting addition to the GOES-R satellite series will be the Geostationary Lightning Mapper (GLM). The all-new GLM on GOES-R will measure total lightning activity continuously over the Americas and adjacent ocean regions with a near-uniform spatial resolution of approximately 10 km. Due to the ABI, the GLM, and improved spacecraft calibration and navigation, the next-generation GOES-R satellite series will usher in an exciting era of satellite applications and opportunities for STEM education. This session will present and demonstrate next-generation imagery advancements and new HTML5 WebApps that demonstrate STEM connections to these improvements. Participants will also be invited to join the GOES-R Education Proving Ground, a national network of educators who will receive stipends to attend 4 webinars during the spring of 2016, pilot a STEM lesson plan, and organize a school-wide launch awareness event.
Deterministic chaos in atmospheric radon dynamics
NASA Astrophysics Data System (ADS)
Cuculeanu, Vasile; Lupu, Alexandru
2001-08-01
The correlation dimension and Lyapunov exponents have been calculated for two time series of atmospheric radon daughter concentrations obtained from four daily measurements during the period 1993-1996. About 6000 activity concentration values of 222Rn and 220Rn daughters were used. The measuring method is based on aerosol collection on filters. To determine the filter activity, a low-background gross beta measuring device with Geiger-Müller counter tubes in anticoincidence was used. The small non-integer value of the correlation dimension (≃2.2) and the existence of a positive Lyapunov exponent prove that deterministic chaos is present in the time series of atmospheric 220Rn daughters. This shows that a simple diffusion equation with a parameterized turbulent diffusion coefficient is insufficient for describing the dynamics in the near-ground layer, where turbulence is not fully developed and coherent structures dominate. The analysis of the 222Rn series confirms that the dynamics of the boundary layer cannot be described by a system of ordinary differential equations with a low number of independent variables.
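The correlation dimension used above is typically estimated with the Grassberger-Procaccia correlation sum C(r), whose log-log slope against r gives the dimension. A minimal sketch on a synthetic point set of known dimension (a uniform cloud in the plane, dimension 2), rather than a reconstructed radon attractor:

```python
import numpy as np

def correlation_sum(points, r):
    """Grassberger-Procaccia C(r): fraction of point pairs closer than r."""
    dists = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    iu = np.triu_indices(len(points), k=1)   # each pair counted once
    return np.mean(dists[iu] < r)

def correlation_dimension(points, r1, r2):
    """Two-scale estimate of the slope of log C(r) versus log r."""
    c1, c2 = correlation_sum(points, r1), correlation_sum(points, r2)
    return (np.log(c2) - np.log(c1)) / (np.log(r2) - np.log(r1))

rng = np.random.default_rng(3)
cloud = rng.random((800, 2))                 # fills the unit square: dim ~ 2
dim = correlation_dimension(cloud, 0.05, 0.2)
```

In practice the scaling region must be chosen carefully, and a time-delay embedding of the scalar series precedes this step; edge effects bias the toy estimate slightly below 2.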
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, Christopher J; Ahrens, James P; Wang, Jun
2010-10-15
Petascale simulations compute at resolutions ranging into billions of cells and write terabytes of data for visualization and analysis. Interactive visualization of this time series is a desired step before starting a new run. The I/O subsystem and associated network often are a significant impediment to interactive visualization of time-varying data, as they are not configured or provisioned to provide the necessary I/O read rates. In this paper, we propose a new I/O library for visualization applications: VisIO. Visualization applications commonly use N-to-N reads within their parallel-enabled readers, which provides an incentive for a shared-nothing approach to I/O, similar to other data-intensive approaches such as Hadoop. However, unlike other data-intensive applications, visualization requires: (1) interactive performance for large data volumes, (2) compatibility with MPI and POSIX file system semantics for compatibility with existing infrastructure, and (3) use of existing file formats and their stipulated data partitioning rules. VisIO provides a mechanism for using a non-POSIX distributed file system to provide linear scaling of I/O bandwidth. In addition, we introduce a novel scheduling algorithm that helps to co-locate visualization processes on nodes with the requested data. Testing using VisIO integrated into ParaView was conducted using the Hadoop Distributed File System (HDFS) on TACC's Longhorn cluster. A representative dataset, VPIC, across 128 nodes showed a 64.4% read performance improvement compared to the provided Lustre installation. Also tested was a dataset representing a global ocean salinity simulation, which showed a 51.4% improvement in read performance over Lustre when using our VisIO system. VisIO provides powerful high-performance I/O services to visualization applications, allowing for interactive performance with ultra-scale, time-series data.
From blackbirds to black holes: Investigating capture-recapture methods for time domain astronomy
NASA Astrophysics Data System (ADS)
Laycock, Silas G. T.
2017-07-01
In time domain astronomy, recurrent transients present a special problem: how to infer total populations from limited observations. Monitoring observations may give a biased view of the underlying population due to limitations on observing time, visibility and instrumental sensitivity. A similar problem exists in the life sciences, where animal populations (such as migratory birds) or disease prevalence must be estimated from sparse and incomplete data. The class of methods termed Capture-Recapture is used to reconstruct population estimates from time-series records of encounters with the study population. This paper investigates the performance of Capture-Recapture methods in astronomy via a series of numerical simulations. The Blackbirds code simulates monitoring of populations of transients, in this case accreting binary stars (a neutron star or black hole accreting from a stellar companion), under a range of observing strategies. We first generate realistic light curves for populations of binaries with contrasting orbital period distributions. These models are then randomly sampled at observing cadences typical of existing and planned monitoring surveys. The classical capture-recapture methods (the Lincoln-Petersen and Schnabel estimators and related techniques) and newer methods implemented in the Rcapture package are compared. A general exponential model based on the radioactive decay law is introduced and demonstrated to recover (at 95% confidence) the underlying population abundance and duty cycle in a fraction of the observing visits (10-50%) required to discover all the sources in the simulation. Capture-Recapture is a promising addition to the toolbox of time domain astronomy, and methods implemented in R by the biostatistics community can be readily called from within Python.
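The simplest of the classical estimators mentioned above is the Lincoln-Petersen two-visit estimate (shown here in Chapman's bias-corrected form). The visit counts below are invented for illustration, not taken from the paper's simulations:

```python
def lincoln_petersen(n1, n2, m2):
    """Chapman's bias-corrected Lincoln-Petersen population estimate.

    n1 -- sources detected ("marked") on the first monitoring epoch
    n2 -- sources detected on the second epoch
    m2 -- sources detected on both epochs ("recaptures")
    """
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

# Hypothetical survey: 30 transients seen in epoch 1, 25 in epoch 2,
# 15 seen in both -> estimated total population just under 50.
estimate = lincoln_petersen(30, 25, 15)
```

The intuition is the mark-recapture proportion: if half of epoch 2's detections were already known, roughly half of the population has been seen.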
NASA Astrophysics Data System (ADS)
DeWalle, David R.; Boyer, Elizabeth W.; Buda, Anthony R.
2016-12-01
Forecasts of ecosystem changes due to variations in atmospheric emissions policies require a fundamental understanding of lag times between changes in chemical inputs and watershed response. Impacts of changes in atmospheric deposition in the United States have been documented using national and regional long-term environmental monitoring programs beginning several decades ago. Consequently, time series of weekly NADP atmospheric wet deposition and monthly EPA Long-Term Monitoring stream chemistry now exist for much of the Northeast, which may provide insights into lag times. In this study of Appalachian forest basins, we estimated lag times for S, N and Cl by cross-correlating monthly data from four pairs of stream and deposition monitoring sites during the period from 1978 to 2012. A systems or impulse response function approach to cross-correlation was used to estimate lag times, where the input deposition time series was pre-whitened using regression modeling and the stream response time series was filtered using the deposition regression model prior to cross-correlation. Cross-correlations for S were greatest at annual intervals over a relatively well-defined range of lags, with the maximum correlations occurring at mean lags of 48 months. Chloride results were similar but more erratic, with a mean lag of 57 months. Few high-correlation lags for N were indicated. Given the growing availability of atmospheric deposition and surface water chemistry monitoring data and our results for four Appalachian basins, further testing of cross-correlation as a method of estimating lag times on other basins appears justified.
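The pre-whitening step described above can be sketched as follows: fit (here, assume) an AR(1) model for the input series, apply the same filter to the response, and cross-correlate the filtered series so that the input's own autocorrelation does not smear the lag estimate. The AR coefficient, lag, and noise level below are invented for illustration:

```python
import numpy as np

def prewhiten_ccf(x, y, phi, max_lag):
    """Impulse-response-style cross-correlation at positive lags
    (y lagging x), after filtering BOTH series with the input's
    AR(1) model x[t] = phi * x[t-1] + e[t]."""
    xf = x[1:] - phi * x[:-1]             # pre-whitened input
    yf = y[1:] - phi * y[:-1]             # response filtered identically
    xf = (xf - xf.mean()) / xf.std()
    yf = (yf - yf.mean()) / yf.std()
    n = len(xf)
    return [np.mean(xf[: n - k] * yf[k:]) for k in range(max_lag + 1)]

rng = np.random.default_rng(4)
lag_true, n = 4, 500
x = np.empty(n)
x[0] = rng.normal()
for t in range(1, n):                     # autocorrelated "deposition" input
    x[t] = 0.7 * x[t - 1] + rng.normal()
y = np.roll(x, lag_true) + 0.3 * rng.normal(size=n)   # delayed "stream" response
ccf = prewhiten_ccf(x, y, phi=0.7, max_lag=10)
best_lag = int(np.argmax(ccf))            # recovers the imposed 4-step delay
```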
The impact of seasonal signals on spatio-temporal filtering
NASA Astrophysics Data System (ADS)
Gruszczynski, Maciej; Klos, Anna; Bogusz, Janusz
2016-04-01
The existence of Common Mode Errors (CMEs) in permanent GNSS networks contributes to spatial and temporal correlation in residual time series. Time series from permanent GNSS stations separated by less than 2000 km are similarly influenced by CME sources such as mismodelling (of Earth Orientation Parameters - EOP, satellite orbits or antenna phase center variations) during the reference frame realization, large-scale atmospheric and hydrospheric effects, and small-scale crustal deformations. Residuals obtained by detrending and deseasonalising topocentric GNSS time series, arranged epoch by epoch, form an observation matrix for each component independently (North, East, Up). The CME is treated as an internal structure of the data. Assuming a temporal function uniform across the network, it is possible to filter the CME out using the PCA (Principal Component Analysis) approach. Some of the above CME sources may appear across a wide range of frequencies in GPS residual time series. To determine the impact of seasonal-signal modeling on the spatial correlation in the network, and consequently on the results of CME filtration, we chose two ways of modeling. The first approach, commonly adopted by previous authors, models only the annual and semi-annual oscillations with Least-Squares Estimation (LSE). In the second, the residuals result from modeling a deterministic part that includes fortnightly periods plus up to the 9th harmonics of the Chandlerian, tropical and draconitic oscillations. Correlation coefficients for the residuals, together with the KMO (Kaiser-Meyer-Olkin) statistic and Bartlett's test of sphericity, were determined. For this research we used time series expressed in ITRF2008 provided by JPL (Jet Propulsion Laboratory). GPS processing was performed with the GIPSY-OASIS software in PPP (Precise Point Positioning) mode.
To form a GPS station network with a uniform spatial response to the CME, we chose 18 stations located in Central Europe. The resulting network extends up to 1500 kilometers. The KMO statistic indicates whether a component analysis may be useful for a chosen data set. We obtained KMO values of 0.87 and 0.62 for the residuals of the Up component after the first and second approaches were applied, which means that in both cases the residuals share common errors. Bartlett's test of sphericity confirmed that in both cases correlations exist in the residuals. Other important results are the eigenvalues expressed as a percentage of the total variance explained by the first few components in PCA. For the North, East and Up components we obtained 68%, 75%, 65% and 47%, 54%, 52%, respectively, after the first and second approaches were applied. The results of CME filtration using the PCA approach performed on both residual time series directly influence the velocity uncertainty of permanent stations. In our case, spatial filtering reduces the velocity uncertainty by 0.5 to 0.8 mm for the horizontal components and by 0.6 to 0.9 mm on average for the Up component when annual and semi-annual signals were assumed. With the second approach to modelling the deterministic part, however, a deterioration of velocity uncertainty was noticed only for the Up component, probably due to much higher autocorrelation in that time series compared to the horizontal components.
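The PCA-based spatial filtering described above can be sketched on synthetic residuals: stack stations as columns, extract the first principal component as the network-wide common mode, and subtract its reconstruction. Station count, noise level, and the perfectly uniform spatial response are simplifying assumptions:

```python
import numpy as np

# Residual matrix (epochs x stations) sharing one network-wide error mode,
# a stand-in for the Common Mode Error of a regional GNSS network.
rng = np.random.default_rng(5)
n_epochs, n_stations = 1000, 18
cme = rng.normal(size=n_epochs)                  # common temporal function
residuals = (cme[:, None] * np.ones(n_stations)
             + 0.3 * rng.normal(size=(n_epochs, n_stations)))

# PCA via SVD: the first component approximates the CME; subtracting
# its rank-1 reconstruction spatially filters the whole network.
centered = residuals - residuals.mean(axis=0)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
explained = s[0] ** 2 / np.sum(s**2)             # variance in mode 1
filtered = centered - np.outer(u[:, 0] * s[0], vt[0])
```

With a strong common mode, mode 1 dominates the eigenvalue spectrum (mirroring the 47-75% figures reported above) and the filtered residuals shrink to near the station-local noise level.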
Zhao, Yu Xi; Xie, Ping; Sang, Yan Fang; Wu, Zi Yi
2018-04-01
Hydrological processes are temporally dependent. Hydrological time series containing dependence components do not meet the consistency assumption of hydrological computation. Both factors cause great difficulty for water research. Given the existence of hydrological dependence variability, we proposed a correlation-coefficient-based method for evaluating the significance of hydrological dependence, built on an auto-regression model. By calculating the correlation coefficient between the original series and its dependence component, and selecting reasonable thresholds of the correlation coefficient, this method divides the significance of dependence into no variability, weak variability, mid variability, strong variability, and drastic variability. By deriving the relationship between the correlation coefficient and the auto-correlation coefficients of the series at each order, we found that the correlation coefficient is mainly determined by the magnitude of the auto-correlation coefficients from order 1 to order p, which clarifies the theoretical basis of the method. Taking the first-order and second-order auto-regression models as examples, the reasonableness of the derived formula was verified through Monte-Carlo experiments classifying the relationship between the correlation coefficient and the auto-correlation coefficients. The method was used to analyze three observed hydrological time series. The results indicated the coexistence of stochastic and dependence characteristics in the hydrological process.
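A minimal Monte-Carlo illustration of the quantity at the heart of the method: for an AR(1) series, the correlation between the series and its dependence component phi * x[t-1] should equal the lag-1 autocorrelation (phi). This is a sketch of the general idea, not the paper's exact formula or thresholds:

```python
import numpy as np

def dependence_correlation(x, phi):
    """Correlation between a series and its AR(1) dependence
    component phi * x[t-1]. Scaling by phi does not change the
    correlation, so in theory this equals the lag-1 autocorrelation."""
    dep = phi * x[:-1]
    return np.corrcoef(x[1:], dep)[0, 1]

rng = np.random.default_rng(6)
phi, n = 0.6, 20000
x = np.empty(n)
x[0] = rng.normal()
for t in range(1, n):                 # simulate AR(1): x[t] = phi*x[t-1] + e[t]
    x[t] = phi * x[t - 1] + rng.normal()

r = dependence_correlation(x, phi)    # Monte-Carlo estimate, ~ phi
```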
What physicists should know about finance
NASA Astrophysics Data System (ADS)
Schmidt, Anatoly B.
2005-05-01
There has been growing interest in econophysics, i.e. the analysis and modeling of financial time series using theoretical physics concepts (scaling, fractals, chaos). Besides the scientific stimuli, this interest is backed by the perception that the financial industry is a viable alternative for those physicists who are not able or not willing to pursue an academic career. However, the times when any Ph.D. in physics had a chance to find a job on Wall Street are gone (if they ever existed). Indeed, not every physicist wields stochastic calculus, non-normal statistical distributions, and the methods of time series analysis. Moreover, now that many universities offer courses in mathematical finance, applicants for quantitative positions in finance are expected to know such concepts as option pricing, portfolio management, and risk measurement. Here I describe a synthetic course based on my book [1] that outlines both worlds: econophysics and mathematical finance. The course may be offered as an elective for senior undergraduate or graduate physics majors.
Comparison of ocean mass content change from direct and inversion based approaches
NASA Astrophysics Data System (ADS)
Uebbing, Bernd; Kusche, Jürgen; Rietbroek, Roelof
2017-04-01
The GRACE satellite mission provides an indispensable tool for measuring oceanic mass variations. Such time series are essential to separate global mean sea level rise into thermosteric and mass-driven contributions, and thus to constrain ocean heat content and (deep) ocean warming when viewed together with altimetry and Argo data. However, published estimates over the GRACE era differ, and not only because of the time window considered. Here, we look into the sources of such differences with direct and inverse approaches. Deriving an ocean mass time series requires several processing steps: a GRACE (and altimetry and Argo) product must be chosen; data coverage, masks and filters must be applied in either the spatial or spectral domain; and corrections related to spatial leakage, GIA and geocenter motion need to be accounted for. In this study, we quantify the effects of individual processing choices and assumptions of the direct and inversion-based approaches to deriving ocean mass content change. Furthermore, we compile the different estimates from the existing literature and other sources to highlight the differences.
Testing for Granger Causality in the Frequency Domain: A Phase Resampling Method.
Liu, Siwei; Molenaar, Peter
2016-01-01
This article introduces phase resampling, an existing but rarely used surrogate data method for making statistical inferences of Granger causality in frequency domain time series analysis. Granger causality testing is essential for establishing causal relations among variables in multivariate dynamic processes. However, testing for Granger causality in the frequency domain is challenging due to the nonlinear relation between frequency domain measures (e.g., partial directed coherence, generalized partial directed coherence) and time domain data. Through a simulation study, we demonstrate that phase resampling is a general and robust method for making statistical inferences even with short time series. With Gaussian data, phase resampling yields satisfactory type I and type II error rates in all but one condition we examine: when a small effect size is combined with an insufficient number of data points. Violations of normality lead to slightly higher error rates but are mostly within acceptable ranges. We illustrate the utility of phase resampling with two empirical examples involving multivariate electroencephalography (EEG) and skin conductance data.
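The surrogate construction behind phase resampling can be sketched in a few lines: randomize the Fourier phases while keeping the amplitude spectrum, which preserves each series' autocorrelation but destroys any phase-dependent (e.g. Granger-causal) structure. This shows only the single-series surrogate step, not the full inferential procedure of the article:

```python
import numpy as np

def phase_randomize(x, rng):
    """Surrogate with the same power spectrum as x but random phases.

    The DC term (and, for even length, the Nyquist term) must stay real
    for the inverse transform to return a real-valued series.
    """
    spec = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, len(spec))
    phases[0] = 0.0                       # keep the DC term real
    if len(x) % 2 == 0:
        phases[-1] = 0.0                  # keep the Nyquist term real
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=len(x))

rng = np.random.default_rng(7)
x = np.sin(np.linspace(0, 20 * np.pi, 1024)) + 0.2 * rng.normal(size=1024)
s = phase_randomize(x, rng)

# Same amplitude spectrum (hence same autocovariance), different waveform.
spec_match = np.allclose(np.abs(np.fft.rfft(s)), np.abs(np.fft.rfft(x)))
```

Repeating this many times yields a null distribution for the causality statistic, against which the observed value is compared.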
Memory effects in stock price dynamics: evidences of technical trading
Garzarelli, Federico; Cristelli, Matthieu; Pompa, Gabriele; Zaccaria, Andrea; Pietronero, Luciano
2014-01-01
Technical trading represents a class of investment strategies for financial markets based on the analysis of trends and recurrent patterns in price time series. According to standard economic theories these strategies should not be used because they cannot be profitable. On the contrary, it is well known that technical traders exist and operate on different time scales. In this paper we investigate whether technical trading produces detectable signals in price time series and whether some kind of memory effect is introduced into the price dynamics. In particular, we focus on specific patterns called supports and resistances. We first develop a criterion to detect the potential values of supports and resistances. Then we show that memory effects in the price dynamics are associated with these selected values. In fact, we show that prices more likely re-bounce off than cross these values. Such an effect is quantitative evidence of the so-called self-fulfilling prophecy, that is, the self-reinforcement of agents' beliefs and sentiment about future stock price behavior. PMID:24671011
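A hypothetical reading of the bounce-versus-cross test can be sketched as follows: whenever the price path enters a narrow band around a candidate level, count whether it leaves on the side it came from (a bounce) or the other side (a crossing). On a memoryless random walk the two outcomes are roughly balanced, whereas the paper reports an excess of bounces at detected supports and resistances in real prices. The detection criterion for the levels themselves is not reproduced here.

```python
import numpy as np

def bounce_ratio(prices, level, tol):
    """Fraction of band entries around `level` that end in a bounce
    rather than a crossing (illustrative, not the paper's estimator)."""
    bounces = crossings = 0
    for t in range(1, len(prices) - 1):
        if abs(prices[t] - level) < tol:
            before = np.sign(prices[t - 1] - level)
            after = np.sign(prices[t + 1] - level)
            if before == after and before != 0:
                bounces += 1
            elif before == -after and before != 0:
                crossings += 1
    total = bounces + crossings
    return bounces / total if total else float("nan")

rng = np.random.default_rng(8)
walk = np.cumsum(rng.normal(size=5000))          # memoryless random walk
ratio = bounce_ratio(walk, level=np.median(walk), tol=0.5)
```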
NASA Astrophysics Data System (ADS)
Zingone, Adriana; Harrison, Paul J.; Kraberg, Alexandra; Lehtinen, Sirpa; McQuatters-Gollop, Abigail; O'Brien, Todd; Sun, Jun; Jakobsen, Hans H.
2015-09-01
Phytoplankton diversity and its variation over an extended time scale can provide answers to a wide range of questions relevant to societal needs. These include human health, the safe and sustained use of marine resources and the ecological status of the marine environment, including long-term changes under the impact of multiple stressors. The analysis of phytoplankton data collected at the same place over time, as well as the comparison among different sampling sites, provide key information for assessing environmental change, and evaluating new actions that must be made to reduce human induced pressures on the environment. To achieve these aims, phytoplankton data may be used several decades later by users that have not participated in their production, including automatic data retrieval and analysis. The methods used in phytoplankton species analysis vary widely among research and monitoring groups, while quality control procedures have not been implemented in most cases. Here we highlight some of the main differences in the sampling and analytical procedures applied to phytoplankton analysis and identify critical steps that are required to improve the quality and inter-comparability of data obtained at different sites and/or times. Harmonization of methods may not be a realistic goal, considering the wide range of purposes of phytoplankton time-series data collection. However, we propose that more consistent and detailed metadata and complementary information be recorded and made available along with phytoplankton time-series datasets, including description of the procedures and elements allowing for a quality control of the data. To keep up with the progress in taxonomic research, there is a need for continued training of taxonomists, and for supporting and complementing existing web resources, in order to allow a constant upgrade of knowledge in phytoplankton classification and identification. 
Efforts towards the improvement of metadata recording, data annotation and quality control procedures will ensure the internal consistency of phytoplankton time series and facilitate their comparability and accessibility, thus strongly increasing the value of the precious information they provide. Ultimately, the sharing of quality controlled data will allow one to recoup the high cost of obtaining the data through the multiple use of the time-series data in various projects over many decades.
Cohen, Michael X
2017-09-27
The number of simultaneously recorded electrodes in neuroscience is steadily increasing, providing new opportunities for understanding brain function, but also new challenges for appropriately dealing with the increase in dimensionality. Multivariate source separation analysis methods have been particularly effective at improving signal-to-noise ratio while reducing the dimensionality of the data and are widely used for cleaning, classifying and source-localizing multichannel neural time series data. Most source separation methods produce a spatial component (that is, a weighted combination of channels to produce one time series); here, this is extended to apply source separation to a time series, with the idea of obtaining a weighted combination of successive time points, such that the weights are optimized to satisfy some criteria. This is achieved via a two-stage source separation procedure, in which an optimal spatial filter is first constructed and then its optimal temporal basis function is computed. This second stage is achieved with a time-delay-embedding matrix, in which additional rows of a matrix are created from time-delayed versions of existing rows. The optimal spatial and temporal weights can be obtained by solving a generalized eigendecomposition of covariance matrices. The method is demonstrated in simulated data and in an empirical electroencephalogram study on theta-band activity during response conflict. Spatiotemporal source separation has several advantages, including defining empirical filters without the need to apply sinusoidal narrowband filters. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
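The spatial-filter stage described above reduces to a generalized eigendecomposition S w = lambda R w of two covariance matrices (signal segment S, reference segment R). A minimal two-channel sketch on synthetic data follows; the temporal stage via time-delay embedding is omitted, and all amplitudes and frequencies are invented:

```python
import numpy as np

# Two-channel toy recording: a 6 Hz "theta-like" source is mixed into
# both channels during a task segment and absent in a baseline segment.
rng = np.random.default_rng(9)
n = 5000
t = np.linspace(0, 10, n)
source = np.sin(2 * np.pi * 6 * t)
mixing = np.array([1.0, 0.5])
task = source[:, None] * mixing + 0.3 * rng.normal(size=(n, 2))
base = 0.3 * rng.normal(size=(n, 2))

# Generalized eigendecomposition S w = lambda R w, solved here through
# R^{-1} S (scipy.linalg.eigh(S, R) is the usual symmetric route).
S = np.cov(task.T)
R = np.cov(base.T)
evals, evecs = np.linalg.eig(np.linalg.solve(R, S))
w = np.real(evecs[:, np.argmax(np.real(evals))])   # filter maximizing S/R power
component = task @ w                   # one time series from two channels
r = abs(np.corrcoef(component, source)[0, 1])
```

The filter weights channels so as to maximize task-to-baseline power ratio, recovering the source (up to sign and scale) without any narrowband filtering.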
TIDE TOOL: Open-Source Sea-Level Monitoring Software for Tsunami Warning Systems
NASA Astrophysics Data System (ADS)
Weinstein, S. A.; Kong, L. S.; Becker, N. C.; Wang, D.
2012-12-01
A tsunami warning center (TWC) typically decides to issue a tsunami warning bulletin when initial estimates of earthquake source parameters suggest it may be capable of generating a tsunami. A TWC, however, relies on sea-level data to provide prima facie evidence for the existence or non-existence of destructive tsunami waves and to constrain tsunami wave height forecast models. In the aftermath of the 2004 Sumatra disaster, the International Tsunami Information Center asked the Pacific Tsunami Warning Center (PTWC) to develop a platform-independent, easy-to-use software package to give nascent TWCs the ability to process WMO Global Telecommunications System (GTS) sea-level messages and to analyze the resulting sea-level curves (marigrams). In response PTWC developed TIDE TOOL, which has since steadily grown in sophistication to become PTWC's operational sea-level processing system. TIDE TOOL has two main parts: a decoder that reads GTS sea-level message logs, and a graphical user interface (GUI) written in the open-source platform-independent graphical toolkit scripting language Tcl/Tk. This GUI consists of dynamic map-based clients that allow the user to select and analyze a single station or groups of stations by displaying their marigrams in strip-chart or screen-tiled forms. TIDE TOOL also includes detail maps of each station to show each station's geographical context and reverse tsunami travel time contours to each station. TIDE TOOL can also be coupled to the GEOWARE™ TTT program to plot tsunami travel times and to indicate the expected tsunami arrival time on the marigrams. Because sea-level messages are structured in a rich variety of formats, TIDE TOOL includes a metadata file, COMP_META, that contains all of the information needed by TIDE TOOL to decode sea-level data as well as basic information such as the geographical coordinates of each station.
TIDE TOOL can therefore continuously decode these sea-level messages in real time and display the time-series data in the GUI as well. The GUI also includes mouse-clickable functions such as zooming or expanding the time-series display, measuring tsunami signal characteristics (arrival time, wave period and amplitude, etc.), and removing the tide signal from the time-series data. De-tiding of the time series is necessary to obtain accurate measurements of tsunami wave parameters and to maintain accurate historical tsunami databases. With TIDE TOOL, de-tiding is accomplished with a set of tide harmonic coefficients routinely computed and updated at PTWC for many of the stations in PTWC's inventory (~570). PTWC also uses the decoded time series files (the previous 3-5 days' worth) to compute on-the-fly tide coefficients. The latter is useful in cases where a station is new and a long-term stable set of tide coefficients is not available or cannot be easily obtained due to various non-astronomical effects. The international tsunami warning system is coordinated globally by the UNESCO IOC, and a number of countries in the Pacific, the Indian Ocean, and the Caribbean depend on TIDE TOOL to monitor tsunamis in real time.
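The harmonic de-tiding step can be sketched as a least-squares fit: model the tide as a sum of cosines at known constituent frequencies, solve for the coefficients, and subtract. The constituent set, amplitudes, and the synthetic tsunami pulse below are illustrative, not a real station's coefficients:

```python
import numpy as np

# Synthetic 48-hour record: two tidal constituents plus a short tsunami pulse.
t = np.arange(0.0, 48.0, 0.1)                      # time in hours
m2, k1 = 2 * np.pi / 12.42, 2 * np.pi / 23.93      # M2 and K1 angular frequencies
tide = 0.8 * np.cos(m2 * t + 0.3) + 0.3 * np.cos(k1 * t - 1.0)
tsunami = 0.2 * np.exp(-((t - 30.0) ** 2) / 2) * np.sin(2 * np.pi * t / 0.4)
sea_level = tide + tsunami

# Design matrix with cos/sin columns per constituent (plus a mean term);
# least squares recovers the tide, and subtraction leaves the tsunami.
A = np.column_stack([np.cos(m2 * t), np.sin(m2 * t),
                     np.cos(k1 * t), np.sin(k1 * t),
                     np.ones_like(t)])
coeffs, *_ = np.linalg.lstsq(A, sea_level, rcond=None)
detided = sea_level - A @ coeffs
match = np.corrcoef(detided, tsunami)[0, 1]
```

Because the tsunami is short and fast relative to the tidal constituents, it barely projects onto the design matrix, so the residual is essentially the tsunami signal alone.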
Modeling multivariate time series on manifolds with skew radial basis functions.
Jamshidi, Arta A; Kirby, Michael J
2011-01-01
We present an approach for constructing nonlinear empirical mappings from high-dimensional domains to multivariate ranges. We employ radial basis functions and skew radial basis functions for constructing a model using data that are potentially scattered or sparse. The algorithm progresses iteratively, adding a new function at each step to refine the model. The placement of the functions is driven by a statistical hypothesis test that accounts for correlation in the multivariate range variables. The test is applied on training and validation data and reveals nonstatistical or geometric structure when it fails. At each step, the added function is fit to data contained in a spatiotemporally defined local region to determine the parameters, in particular the scale of the local model. The scale of the function is determined by the zero crossings of the autocorrelation function of the residuals. The model parameters and the number of basis functions are determined automatically from the given data, and there is no need to initialize any ad hoc parameters save for the selection of the skew radial basis functions. Compactly supported skew radial basis functions are employed to improve model accuracy, order, and convergence properties. The extension of the algorithm to higher-dimensional ranges produces reduced-order models by exploiting the existence of correlation in the range variable data. Structure is tested not just in a single time series but between all pairs of time series. We illustrate the new methodologies using several example problems, including modeling data on manifolds and the prediction of chaotic time series.
NASA Astrophysics Data System (ADS)
Rawat, Kishan Singh; Singh, Sudhir Kumar; Jacintha, T. German Amali; Nemčić-Jurec, Jasna; Tripathi, Vinod Kumar
2017-12-01
A review was undertaken to understand the hydrogeochemical behaviour of groundwater through statistical analysis of long-term water quality data (2005-2013). The Water Quality Index (WQI), descriptive statistics, Hurst exponent, fractal dimension and predictability index were estimated for each water parameter. The WQI results showed that the majority of samples fall in the moderate category during 2005-2013, but monitoring site four falls in the severe category (water unfit for domestic use). Brownian time series behaviour (a true random-walk nature) exists between calcium (Ca^{2+}) and electrical conductivity (EC); magnesium (Mg^{2+}) and EC; sodium (Na+) and EC; sulphate (SO4^{2-}) and EC; and total dissolved solids (TDS) and chloride (Cl-) during the pre- (2005-2013) and post- (2006-2013) monsoon seasons. These parameters have Hurst exponent (H) values close to the Brownian time series condition (H = 0.5). The time series analysis of the water quality data shows persistent behaviour (positive autocorrelation) between Cl- and Mg^{2+}, Cl- and Ca^{2+}, TDS and Na+, TDS and SO4^{2-}, and TDS and Ca^{2+} in the pre- and post-monsoon time series, because of the higher values of H (>1), whereas anti-persistent behaviour (negative autocorrelation) was found between Cl- and EC and between TDS and EC during the pre- and post-monsoon seasons, due to low values of H. The results show that the groundwater of a few areas needs treatment before direct consumption, and that it also needs to be protected from contamination.
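The interpretation of the Hurst exponent used above (H near 0.5 for Brownian behaviour, larger values for persistence, smaller for anti-persistence) can be illustrated with a textbook rescaled-range (R/S) estimator. This is a generic sketch, not the exact procedure used in the study; the window sizes are arbitrary choices.

```python
import math

# Textbook rescaled-range (R/S) estimate of the Hurst exponent H.
# H ~ 0.5: Brownian/random-walk increments; H > 0.5: persistence;
# H < 0.5: anti-persistence.

def hurst_rs(series, window_sizes=(8, 16, 32, 64, 128)):
    logs_n, logs_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(series) - n + 1, n):
            w = series[start:start + n]
            mean = sum(w) / n
            dev = [x - mean for x in w]
            # Range of the cumulative deviate series
            z, cum = [], 0.0
            for d in dev:
                cum += d
                z.append(cum)
            r = max(z) - min(z)
            s = math.sqrt(sum(d * d for d in dev) / n)
            if s > 0:
                rs_vals.append(r / s)
        if rs_vals:
            logs_n.append(math.log(n))
            logs_rs.append(math.log(sum(rs_vals) / len(rs_vals)))
    # H is the slope of log(R/S) versus log(n)
    mx = sum(logs_n) / len(logs_n)
    my = sum(logs_rs) / len(logs_rs)
    return (sum((x - mx) * (y - my) for x, y in zip(logs_n, logs_rs))
            / sum((x - mx) ** 2 for x in logs_n))
```

For uncorrelated noise the estimate hovers around 0.5 (with a known upward small-sample bias), which is the benchmark against which the persistent and anti-persistent cases above are judged.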
Arismendi, Ivan; Johnson, Sherri; Dunham, Jason B.; Haggerty, Roy; Hockman-Wert, David
2012-01-01
Temperature is a fundamentally important driver of ecosystem processes in streams. Recent warming of terrestrial climates around the globe has motivated concern about consequent increases in stream temperature. More specifically, observed trends of increasing air temperature and declining stream flow are widely believed to result in corresponding increases in stream temperature. Here, we examined the evidence for this using long-term stream temperature data from minimally and highly human-impacted sites located across the Pacific continental United States. Based on hypothesized climate impacts, we predicted that we should find warming trends in the maximum, mean and minimum temperatures, as well as increasing variability over time. These predictions were not fully realized. Warming trends were most prevalent in a small subset of locations with longer time series beginning in the 1950s. More recent series of observations (1987-2009) exhibited fewer warming trends and more cooling trends in both minimally and highly human-influenced systems. Trends in variability were much less evident, regardless of the length of time series. Based on these findings, we conclude that our perspective of climate impacts on stream temperatures is clouded considerably by a lack of long-term data on minimally impacted streams and by the biased spatio-temporal representation of existing time series. Overall, our results highlight the need to develop a more mechanistic, process-based understanding of the linkages between climate change, other human impacts and stream temperature, and to deploy sensor networks that will provide better information on trends in stream temperatures in the future.
NASA Astrophysics Data System (ADS)
Boudhina, Nissaf; Zitouna-Chebbi, Rim; Mekki, Insaf; Jacob, Frédéric; Ben Mechlia, Nétij; Masmoudi, Moncef; Prévot, Laurent
2018-06-01
Estimating evapotranspiration in hilly watersheds is paramount for managing water resources, especially in semiarid/subhumid regions. The eddy covariance (EC) technique allows continuous measurements of latent heat flux (LE). However, time series of EC measurements often experience large portions of missing data because of instrumental malfunctions or quality filtering. Existing gap-filling methods are questionable over hilly crop fields because of changes in airflow inclination and subsequent aerodynamic properties. We evaluated the performance of different gap-filling methods before and after tailoring them to the conditions of hilly crop fields. The tailoring consisted of splitting the LE time series beforehand on the basis of upslope and downslope winds. The experiment was set up within an agricultural hilly watershed in northeastern Tunisia. EC measurements were collected throughout the growth cycle of three wheat crops, two of them located in adjacent fields on opposite hillslopes, and the third located in a flat field. We considered four gap-filling methods: the REddyProc method, the linear regression between LE and net radiation (Rn), the multi-linear regression of LE against the other energy fluxes, and the use of the evaporative fraction (EF). Regardless of the method, the splitting of the LE time series did not impact the gap-filling rate, and in some cases it improved the accuracy of LE retrievals. Regardless of the method, the accuracies of the LE estimates after gap filling were close to instrumental accuracies, and they were comparable to those reported in previous studies over flat and mountainous terrains. Overall, REddyProc was the most appropriate method for both gap-filling rate and retrieval accuracy. Thus, it seems possible to conduct gap filling for LE time series collected over hilly crop fields, provided the LE time series are split beforehand on the basis of upslope-downslope winds.
Future works should address consecutive vegetation growth cycles for a larger panel of conditions in terms of climate, vegetation, and water status.
Analyses of GIMMS NDVI Time Series in Kogi State, Nigeria
NASA Astrophysics Data System (ADS)
Palka, Jessica; Wessollek, Christine; Karrasch, Pierre
2017-10-01
The value of remote sensing data is particularly evident where areal monitoring is needed to provide information on the development of the earth's surface. Temporally high-resolution time series data allow short-term changes to be detected. In Kogi State in Nigeria different vegetation types can be found. As most of the population in this region lives in rural crop-farming communities, the existing vegetation is slowly being altered. The expansion of agricultural land causes loss of natural vegetation, especially in the regions close to the rivers, which are suitable for crop production. With regard to these facts, two questions arise covering different aspects of vegetation development in Kogi State: the determination and evaluation of the general development of the vegetation in the study area (trend estimation), and analyses of the short-term behaviour of vegetation conditions, which can provide information about seasonal effects in vegetation development. For this purpose, the GIMMS NDVI data set, provided by NOAA, supplies information on the normalized difference vegetation index (NDVI) at a geometric resolution of approx. 8 km. Its temporal resolution of 15 days permits the analyses described above. For the presented analysis, data for the period 1981-2012 (31 years) were used. The implemented workflow mainly applies methods of time series analysis. The results show that, in addition to the classical seasonal development, artefacts of different vegetation periods (several NDVI maxima) can be found in the data. The trend component of the time series shows a consistently positive development across the entire study area over the full investigation period of 31 years. However, the results also show that this development has not been continuous, and a simple linear modeling of the NDVI increase is only possible to a limited extent.
For this reason, the trend modeling was extended by procedures for detecting structural breaks in the time series.
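Extending a trend model with structural-break detection can be sketched minimally as a single-break search: compare the residual sum of squares of one fitted line against the best two-segment fit over all candidate break points. Operational NDVI analyses use far more elaborate procedures (multiple breaks, seasonal terms); the function names here are illustrative only.

```python
# Toy single-structural-break detector for a trend series: the break index
# minimizing the combined two-segment least-squares error is reported,
# along with that error, so it can be compared against the one-line fit.

def _sse_line(xs, ys):
    """Sum of squared residuals of an ordinary least-squares line fit."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx if sxx else 0.0
    a = my - b * mx
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

def best_break(xs, ys, min_seg=5):
    """Return (break index, SSE) of the best two-segment fit, or (None, SSE)
    if no candidate improves on the single-line fit."""
    best_k, best_sse = None, _sse_line(xs, ys)
    for k in range(min_seg, len(xs) - min_seg):
        sse = _sse_line(xs[:k], ys[:k]) + _sse_line(xs[k:], ys[k:])
        if sse < best_sse:
            best_k, best_sse = k, sse
    return best_k, best_sse
```

In practice one would penalize the extra parameters (e.g. with BIC) before accepting a break, and iterate the search for multiple breaks.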
Identification of Boolean Network Models From Time Series Data Incorporating Prior Knowledge.
Leifeld, Thomas; Zhang, Zhihua; Zhang, Ping
2018-01-01
Motivation: Mathematical models play an important role in science and engineering. A model can help scientists to explain the dynamic behavior of a system and to understand the functionality of system components. Since the length of a time series and the number of replicates are limited by the cost of experiments, Boolean networks, as a structurally simple and parameter-free logical model for gene regulatory networks, have attracted the interest of many scientists. In order to fit the biological context and to lower the data requirements, biological prior knowledge is taken into consideration during the inference procedure. In the literature, the existing identification approaches can only deal with a subset of the possible types of prior knowledge. Results: We propose a new approach to identify Boolean networks from time series data incorporating prior knowledge, such as partial network structure, canalizing properties, and positive or negative unateness. Using the vector form of Boolean variables and applying a generalized matrix multiplication called the semi-tensor product (STP), each Boolean function can be equivalently converted into a matrix expression. Based on this, the identification problem is reformulated as an integer linear programming problem to reveal, in a computationally efficient way, the system matrix of the Boolean model whose dynamics are consistent with the important dynamics captured in the data. By using prior knowledge, the number of candidate functions can be reduced during the inference. Hence, identification incorporating prior knowledge is especially suitable for small time series data sets and data without sufficient stimuli. The proposed approach is illustrated with the help of a biological model of the network of the oxidative stress response.
Conclusions: The combination of efficient reformulation of the identification problem with the possibility to incorporate various types of prior knowledge enables the application of computational model inference to systems with limited amount of time series data. The general applicability of this methodological approach makes it suitable for a variety of biological systems and of general interest for biological and medical research.
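The core idea of finding Boolean functions consistent with observed transitions, narrowed by prior knowledge, can be illustrated with a brute-force toy. The paper's semi-tensor-product and integer-programming machinery is replaced here by truth-table enumeration, which is feasible only for very small networks; all names are illustrative.

```python
from itertools import product

# Toy Boolean-network identification: for one target node, enumerate truth
# tables over a candidate regulator set (the "prior knowledge") and keep
# those consistent with every observed state transition.

def consistent_functions(states, target, regulators):
    """states: list of tuples (network state at t = 0, 1, ...).
    Return all truth tables (dicts: regulator-value tuple -> 0/1) that
    reproduce the observed transitions of `target`."""
    inputs = list(product((0, 1), repeat=len(regulators)))
    # Constraints: regulator values at t must map to target value at t+1.
    constraints = {}
    for t in range(len(states) - 1):
        key = tuple(states[t][r] for r in regulators)
        out = states[t + 1][target]
        if constraints.setdefault(key, out) != out:
            return []  # data contradict this regulator set
    # Unconstrained rows of the truth table remain free.
    free = [i for i in inputs if i not in constraints]
    tables = []
    for bits in product((0, 1), repeat=len(free)):
        table = dict(constraints)
        table.update(zip(free, bits))
        tables.append(table)
    return tables
```

Prior knowledge enters through the regulator set: restricting `regulators` shrinks the truth-table space before the data are consulted, mirroring how partial network structure reduces the candidate functions in the paper's formulation.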
Identification and Inference for Econometric Models
NASA Astrophysics Data System (ADS)
Andrews, Donald W. K.; Stock, James H.
2005-07-01
This volume contains the papers presented in honor of the lifelong achievements of Thomas J. Rothenberg on the occasion of his retirement. The authors of the chapters include many of the leading econometricians of our day, and the chapters address topics of current research significance in econometric theory. The chapters cover four themes: identification and efficient estimation in econometrics, asymptotic approximations to the distributions of econometric estimators and tests, inference involving potentially nonstationary time series, such as processes that might have a unit autoregressive root, and nonparametric and semiparametric inference. Several of the chapters provide overviews and treatments of basic conceptual issues, while others advance our understanding of the properties of existing econometric procedures and/or propose new ones. Specific topics include identification in nonlinear models, inference with weak instruments, tests for nonstationarity in time series and panel data, generalized empirical likelihood estimation, and the bootstrap.
NASA Astrophysics Data System (ADS)
Cook, Steve; Watson, Duncan
2017-03-01
Following its introduction in the seminal study of Osborne (1959), a voluminous literature has emerged examining the returns-volume relationship for financial assets. The present paper revisits this relationship in an examination of the FTSE100 which extends the existing literature in two ways. First, alternative daily measures of the FTSE100 index are used to create differing returns and absolute returns series to employ in an examination of returns-volume causality. Second, rolling regression analysis is utilised to explore potential time variation in the returns-volume relationship. The findings reveal a hitherto unconsidered complexity in this relationship, with the type of returns series considered and the financial crisis found to be significant underlying factors. The implications of the newly derived results both for understanding the nature of the returns-volume relationship and for developing theories connected to it are discussed.
mSieve: Differential Behavioral Privacy in Time Series of Mobile Sensor Data.
Saleheen, Nazir; Chakraborty, Supriyo; Ali, Nasir; Mahbubur Rahman, Md; Hossain, Syed Monowar; Bari, Rummana; Buder, Eugene; Srivastava, Mani; Kumar, Santosh
2016-09-01
Differential privacy concepts have been successfully used to protect the anonymity of individuals in population-scale analysis. Sharing of mobile sensor data, especially physiological data, raises a different privacy challenge: protecting the private behaviors that can be revealed from time series of sensor data. Existing privacy mechanisms rely on noise addition and data perturbation. But the accuracy requirements on inferences drawn from physiological data, together with the well-established limits within which these data values occur, render traditional privacy mechanisms inapplicable. In this work, we define a new behavioral privacy metric based on differential privacy and propose a novel data substitution mechanism to protect behavioral privacy. We evaluate the efficacy of our scheme using 660 hours of ECG, respiration, and activity data collected from 43 participants and demonstrate that it is possible to retain meaningful utility, in terms of inference accuracy (90%), while simultaneously preserving the privacy of sensitive behaviors.
Extended space expectation values in quantum dynamical system evolutions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Demiralp, Metin
2014-10-06
The time-variant power series expansion for the expectation value of a given quantum dynamical operator is a well-known and well-investigated issue in quantum dynamics. However, depending on the singularities of the operator and the Hamiltonian, this expansion either may not exist or may not converge for any time instance except the beginning of the evolution. This work focuses on this issue and seeks certain cures for these negativities. We work in the extended space obtained by adding all images of the initial wave function under positive integer powers of the system Hamiltonian. This requires the introduction of certain appropriately defined weight operators. The resulting better convergence in the temporal power series urges us to call the newly defined entities "extended space expectation values", even though they are constructed over certain weight operators and are somehow pseudo expectation values.
Trading Network Predicts Stock Price
Sun, Xiao-Qian; Shen, Hua-Wei; Cheng, Xue-Qi
2014-01-01
Stock price prediction is an important and challenging problem in the study of financial markets. Existing studies are mainly based on the time series of stock prices or the operating performance of listed companies. In this paper, we propose to predict stock price based on investors' trading behavior. For each stock, we characterize the daily trading relationship among its investors using a trading network. We then classify the nodes of the trading network into three roles according to their connectivity pattern. Strong Granger causality is found between stock price and trading relationship indices, i.e., the fraction of trading relationships among nodes with different roles. We further predict stock price by incorporating these trading relationship indices into a neural network based on the time series of stock price. Experimental results on 51 stocks in two Chinese stock exchanges demonstrate that the accuracy of stock price prediction is significantly improved by the inclusion of trading relationship indices.
Sectoral risk research about input-output structure of the United States
NASA Astrophysics Data System (ADS)
Zhang, Mao
2018-02-01
Research on economic risk at the sectoral level is rare, although such research is important for risk early warning. This paper employs a status coefficient to measure the symmetry of the economic subnetwork, which is negatively correlated with sectoral risk. We then conduct empirical research in both the cross-sectional and time series dimensions. In the cross-sectional dimension, we study the correlations between the sectoral status coefficient and sectoral volatility, rate of return, and Sharpe ratio for the year 2015. In the time series dimension, we first investigate how the correlation between the sectoral status coefficient and annual total output changed from 1997 to 2015. We then divide the 71 sectors in the United States into agriculture, manufacturing, services and government, compare the trend terms of the average sectoral status coefficients of the four industries, and discuss the causes behind them. We also find an obvious abnormality in the housing sector. Finally, this paper puts forward some suggestions for the federal government.
78 FR 79333 - Airworthiness Directives; Airbus Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-30
...We propose to supersede airworthiness directive (AD) 2000-12-12, for certain Airbus Model A300, A300-600, and A310 series airplanes. AD 2000-12-12 currently requires inspecting to detect cracks in the lower spar axis of the nacelle pylon between ribs 9 and 10, and repair if necessary. AD 2000-12-12 also provides for optional modification of the pylon, which terminates the inspections for Model A300 series airplanes. Since we issued AD 2000-12-12, we have received reports of cracking of the lower pylon spar after accomplishing the existing modification and have determined that shorter initial and repetitive inspection compliance times are necessary to address the identified unsafe condition. This proposed AD would reduce the initial and repetitive inspection compliance times. We are proposing this AD to detect and correct fatigue cracking, which could result in reduced structural integrity of the lower spar of the nacelle pylon.
Parallel photonic information processing at gigabyte per second data rates using transient states
NASA Astrophysics Data System (ADS)
Brunner, Daniel; Soriano, Miguel C.; Mirasso, Claudio R.; Fischer, Ingo
2013-01-01
The increasing demands on information processing require novel computational concepts and true parallelism. Nevertheless, hardware realizations of unconventional computing approaches have so far had only a marginal existence. While the application of optics in supercomputing is receiving reawakened interest, new concepts, partly neuro-inspired, are being considered and developed. Here we experimentally demonstrate the potential of a simple photonic architecture to process information at unprecedented data rates, implementing a learning-based approach. A semiconductor laser subject to delayed self-feedback and optical data injection is employed to solve computationally hard tasks. We demonstrate simultaneous spoken-digit and speaker recognition and chaotic time series prediction at data rates beyond 1 GByte/s. We identify all digits with very low classification errors and perform chaotic time series prediction with 10% error. Our approach bridges the areas of photonic information processing, cognitive science, and information science.
Predicting Flavonoid UGT Regioselectivity
Jackson, Rhydon; Knisley, Debra; McIntosh, Cecilia; Pfeiffer, Phillip
2011-01-01
Machine learning was applied to a challenging and biologically significant protein classification problem: the prediction of flavonoid UGT acceptor regioselectivity from primary sequence. Novel indices characterizing graphical models of residues were proposed and found to be widely distributed among existing amino acid indices and to cluster residues appropriately. UGT subsequences biochemically linked to regioselectivity were modeled as sets of index sequences. Several learning techniques incorporating these UGT models were compared with classifications based on standard sequence alignment scores. These techniques included an application of time series distance functions to protein classification. Time series distances defined on the index sequences were used in nearest-neighbor and support vector machine classifiers. Additionally, Bayesian neural network classifiers were applied to the index sequences. The experiments identified improvements over the nearest-neighbor and support vector machine classifications relying on standard alignment similarity scores, as well as strong correlations between specific subsequences and regioselectivities.
NASA Astrophysics Data System (ADS)
Wu, Qi
2010-03-01
Demand forecasts play a crucial role in supply chain management. The future demand for a certain product is the basis for the respective replenishment systems. For demand series with small samples, seasonality, nonlinearity, randomness and fuzziness, existing support vector kernels cannot closely approximate the random curve of the sales time series in the space of quadratically integrable functions. In this paper, we present a hybrid intelligent system combining the wavelet kernel support vector machine and particle swarm optimization for demand forecasting. The results of its application to car sales series forecasting show that the forecasting approach based on the hybrid PSOWv-SVM model is effective and feasible. A comparison between the method proposed in this paper and others is also given, which shows that, for the discussed example, this method outperforms hybrid PSOv-SVM and other traditional methods.
Analytical concepts for health management systems of liquid rocket engines
NASA Technical Reports Server (NTRS)
Williams, Richard; Tulpule, Sharayu; Hawman, Michael
1990-01-01
Substantial improvement in health management systems performance can be realized by implementing advanced analytical methods of processing existing liquid rocket engine sensor data. In this paper, such techniques ranging from time series analysis to multisensor pattern recognition to expert systems to fault isolation models are examined and contrasted. The performance of several of these methods is evaluated using data from test firings of the Space Shuttle main engines.
Differentially Private Histogram Publication For Dynamic Datasets: An Adaptive Sampling Approach
Li, Haoran; Jiang, Xiaoqian; Xiong, Li; Liu, Jinfei
2016-01-01
Differential privacy has recently become a de facto standard for private statistical data release. Many algorithms have been proposed to generate differentially private histograms or synthetic data. However, most of them focus on "one-time" release of a static dataset and do not adequately address the increasing need to release series of dynamic datasets in real time. A straightforward application of existing histogram methods to each snapshot of such dynamic datasets will incur high accumulated error due to the composability of differential privacy and the correlations or overlapping users between snapshots. In this paper, we address the problem of releasing series of dynamic datasets in real time with differential privacy, using a novel adaptive distance-based sampling approach. Our first method, DSFT, uses a fixed distance threshold and releases a differentially private histogram only when the current snapshot is sufficiently different from the previous one, i.e., with a distance greater than a predefined threshold. Our second method, DSAT, further improves on DSFT and uses a dynamic threshold adaptively adjusted by a feedback control mechanism to capture the data dynamics. Extensive experiments on real and synthetic datasets demonstrate that our approach achieves better utility than baseline methods and existing state-of-the-art methods.
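The fixed-threshold strategy (DSFT) admits a compact sketch: release a fresh Laplace-noised histogram only when the current snapshot is farther than a threshold from the last released one; otherwise republish the previous release and spend no additional privacy budget. This is a hedged illustration of the sampling idea only; the parameter names are ours, and budget accounting and the adaptive threshold of DSAT are omitted.

```python
import math
import random

# Sketch of fixed-threshold sampled release: a new noisy histogram is
# published only when the raw snapshot moves more than `threshold` (in L1
# distance) away from the last *released* raw snapshot.

def laplace(scale):
    """Draw one sample from a zero-mean Laplace distribution."""
    u = random.random() - 0.5
    sign = 1 if u >= 0 else -1
    return -scale * sign * math.log(1 - 2 * abs(u))

def dsft_release(snapshots, epsilon_per_release, threshold):
    """Return one released histogram per snapshot; unchanged snapshots
    re-publish the previous release (same object, no new noise)."""
    releases = []
    last_raw, current = None, None
    for hist in snapshots:
        if last_raw is None or sum(abs(a - b) for a, b in zip(hist, last_raw)) > threshold:
            # Sufficiently different: spend budget, add Laplace noise per bin.
            current = [c + laplace(1.0 / epsilon_per_release) for c in hist]
            last_raw = hist
        releases.append(current)
    return releases
```

Skipped snapshots cost no privacy budget, which is what lets the accumulated error stay low when the underlying data change slowly.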
Wang, Fang
2016-06-01
In order to detect and quantify the asymmetry of two time series, a novel cross-correlation coefficient is proposed based on the recent asymmetric detrended cross-correlation analysis (A-DXA), which we call the A-DXA coefficient. The A-DXA coefficient, an important extension of the DXA coefficient ρDXA, contains two directional asymmetric cross-correlation indexes, describing upward and downward asymmetric cross-correlations, respectively. By using the directional covariance function of the two time series and the directional variance function of each series itself, instead of the power law between the covariance function and the time scale, the proposed A-DXA coefficient can detect asymmetry between the two series whether or not the cross-correlation is significant. Applying the A-DXA coefficient to the asymmetry of the California electricity market, we found that the asymmetry between prices and loads is not significant for daily average data in the 1999 market (before the electricity crisis) but extremely significant in the 2000 market (during the crisis). To further uncover the difference in asymmetry between the years 1999 and 2000, a modified H statistic (MH) and a ΔMH statistic are proposed. One contribution of the present work is the finding that high MH values calculated for hourly data occur in the majority of months in the 2000 market. Another important conclusion is that downward cross-correlation dominates throughout 1999, whereas upward cross-correlation dominates throughout 2000.
Mei, Jiangyuan; Liu, Meizhu; Wang, Yuan-Fang; Gao, Huijun
2016-06-01
Multivariate time series (MTS) datasets broadly exist in numerous fields, including health care, multimedia, finance, and biometrics. How to classify MTS accurately has become a hot research topic, since it is an important element in many computer vision and pattern recognition applications. In this paper, we propose a Mahalanobis distance-based dynamic time warping (DTW) measure for MTS classification. The Mahalanobis distance builds an accurate relationship between each variable and its corresponding category. It is utilized to calculate the local distance between vectors in MTS. We then use DTW to align those MTS which are out of synchronization or of different lengths. After that, how to learn an accurate Mahalanobis distance function becomes another key problem. This paper establishes a LogDet divergence-based metric learning model with triplet constraints, which can learn the Mahalanobis matrix with high precision and robustness. Furthermore, the proposed method is applied to nine MTS datasets selected from the University of California, Irvine machine learning repository and Robert T. Olszewski's homepage, and the results demonstrate the improved performance of the proposed approach.
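The local-distance construction can be sketched directly: DTW over multivariate sequences where the per-frame distance is Mahalanobis under a positive-definite matrix M. In the paper, M is learned via LogDet-divergence metric learning with triplet constraints; in this sketch it is simply supplied by the caller.

```python
import math

# DTW over multivariate sequences with a Mahalanobis local distance.
# `m` is a positive-definite matrix (e.g. an inverse covariance); learning
# it is outside the scope of this sketch.

def mahalanobis(u, v, m):
    d = [a - b for a, b in zip(u, v)]
    k = len(d)
    return math.sqrt(sum(d[i] * m[i][j] * d[j] for i in range(k) for j in range(k)))

def dtw(seq_a, seq_b, m):
    """Classic O(len_a * len_b) dynamic-programming DTW distance."""
    inf = float("inf")
    na, nb = len(seq_a), len(seq_b)
    cost = [[inf] * (nb + 1) for _ in range(na + 1)]
    cost[0][0] = 0.0
    for i in range(1, na + 1):
        for j in range(1, nb + 1):
            d = mahalanobis(seq_a[i - 1], seq_b[j - 1], m)
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[na][nb]
```

With the identity matrix this reduces to ordinary Euclidean DTW; a learned M reweights and decorrelates the variables before alignment.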
NASA Astrophysics Data System (ADS)
Bildirici, Melike; Sonustun, Fulya Ozaksoy; Sonustun, Bahri
2018-01-01
Within the framework of chaos theory, concepts such as complexity, determinism, quantum mechanics, relativity, multiple equilibria, persistent instability, nonlinearity, heterogeneous agents and irregularity have been widely examined in economics. It has been noticed that linear models are insufficient for analyzing the unpredictable, irregular and noncyclical oscillations of economies, and for predicting bubbles, financial crises and business cycles in financial markets. Therefore, economists have attached great importance to appropriate tools for modelling nonlinear dynamical structures and the chaotic behavior of economies, especially in macroeconomics and financial economics. In this paper, we aim to model the chaotic structure of exchange rates (USD-TL and EUR-TL). To determine nonlinear patterns in the selected time series, daily returns of the exchange rates were tested with the BDS test over the period from January 1, 2002 to May 11, 2017, which covers the era after the 2001 financial crisis. After establishing the nonlinear structure of the selected time series, we examined their chaotic character over the selected time period by means of Lyapunov exponents. The findings verify the existence of a chaotic structure in the exchange rate returns over the analyzed period.
Methods of Reverberation Mapping. I. Time-lag Determination by Measures of Randomness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chelouche, Doron; Pozo-Nuñez, Francisco; Zucker, Shay, E-mail: doron@sci.haifa.ac.il, E-mail: francisco.pozon@gmail.com, E-mail: shayz@post.tau.ac.il
A class of methods for measuring time delays between astronomical time series is introduced in the context of quasar reverberation mapping, which is based on measures of randomness or complexity of the data. Several distinct statistical estimators are considered that do not rely on polynomial interpolations of the light curves nor on their stochastic modeling, and do not require binning in correlation space. Methods based on von Neumann’s mean-square successive-difference estimator are found to be superior to those using other estimators. An optimized von Neumann scheme is formulated, which better handles sparsely sampled data and outperforms current implementations of discrete correlation function methods. This scheme is applied to existing reverberation data of varying quality, and consistency with previously reported time delays is found. In particular, the size–luminosity relation of the broad-line region in quasars is recovered with a scatter comparable to that obtained by other works, yet with fewer assumptions made concerning the process underlying the variability. The proposed method for time-lag determination is particularly relevant for irregularly sampled time series, and in cases where the process underlying the variability cannot be adequately modeled.
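A minimal sketch of the von Neumann idea, on synthetic and (for simplicity) regularly sampled light curves: for each trial lag, the delayed curve is shifted in time, the two curves are merged and time-sorted, and the mean-square successive difference of the merged curve is computed; the true lag minimizes this randomness measure. The optimized scheme in the paper adds refinements not shown here.

```python
def von_neumann(values):
    # Mean-square successive difference of an ordered sequence
    # (von Neumann's randomness estimator, up to normalization).
    return sum((values[i + 1] - values[i]) ** 2
               for i in range(len(values) - 1)) / (len(values) - 1)

def estimate_lag(t1, f1, t2, f2, trial_lags):
    # Shift the second light curve by each trial lag, merge both curves,
    # sort by time, and score the merged curve; return the lag that
    # makes the combined curve "smoothest" (minimum von Neumann statistic).
    best_lag, best_stat = None, float("inf")
    for lag in trial_lags:
        merged = sorted(zip(list(t1) + [t - lag for t in t2],
                            list(f1) + list(f2)))
        stat = von_neumann([f for _, f in merged])
        if stat < best_stat:
            best_lag, best_stat = lag, stat
    return best_lag
```

Because only time-sorting is needed, the approach carries over directly to irregular sampling, which is its main appeal for reverberation data.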
NASA Astrophysics Data System (ADS)
Lanfredi, M.; Simoniello, T.; Cuomo, V.; Macchiato, M.
2009-02-01
This study originated from recent results reported in the literature, which support the existence of long-range (power-law) persistence in atmospheric temperature fluctuations on monthly and inter-annual scales. We investigated the results of Detrended Fluctuation Analysis (DFA) carried out on twenty-two historical daily time series recorded in Europe in order to evaluate the reliability of such findings in depth. More detailed inspection revealed systematic deviations from power-law behavior and high statistical confidence for functional-form misspecification. Rigorous analyses did not support scale-free correlation as an operative concept for Climate modelling, as instead suggested in the literature. In order to better understand the physical implications of our results, we designed a bivariate Markov process, parameterised on the basis of the atmospheric observational data by introducing a slow dummy variable. The time series generated by this model, analysed in both the time and frequency domains, tallied with the real ones very well. They accounted both for the deceptive scaling found in the literature and for the correlation details enhanced by our analysis. Our results point to the presence of slow fluctuations from another climatic sub-system, such as the ocean, which inflate temperature variance on scales up to several months. They call for more careful re-analyses of temperature time series before dynamical paradigms are suggested for Climate modelling and for the assessment of Climate Change.
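The DFA procedure under scrutiny can be sketched in a few lines (a minimal first-order DFA on synthetic data, not the authors' twenty-two-station analysis): integrate the mean-subtracted series, detrend it in non-overlapping windows of each scale with a linear fit, and regress log F(s) on log s; the slope is the scaling exponent.

```python
import math, random

def dfa_exponent(x, scales):
    # First-order Detrended Fluctuation Analysis.
    mean = sum(x) / len(x)
    y, acc = [], 0.0
    for v in x:                       # integrated (profile) series
        acc += v - mean
        y.append(acc)
    logs, logF = [], []
    for s in scales:
        f2, nwin = 0.0, 0
        for start in range(0, len(y) - s + 1, s):
            seg = y[start:start + s]
            t = list(range(s))
            tbar, sbar = (s - 1) / 2.0, sum(seg) / s
            beta = (sum((ti - tbar) * (vi - sbar) for ti, vi in zip(t, seg))
                    / sum((ti - tbar) ** 2 for ti in t))
            alpha = sbar - beta * tbar
            f2 += sum((vi - (alpha + beta * ti)) ** 2 for ti, vi in zip(t, seg))
            nwin += 1
        logs.append(math.log(s))
        logF.append(0.5 * math.log(f2 / (nwin * s)))
    # slope of log F(s) versus log s
    lbar, fbar = sum(logs) / len(logs), sum(logF) / len(logF)
    return (sum((l - lbar) * (f - fbar) for l, f in zip(logs, logF))
            / sum((l - lbar) ** 2 for l in logs))
```

White noise gives an exponent near 0.5 and a random walk near 1.5; the paper's point is that an apparently straight log-log fit can hide systematic curvature, so the residuals of this regression deserve as much attention as its slope.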
Scaling analysis of bilateral hand tremor movements in essential tremor patients.
Blesic, S; Maric, J; Dragasevic, N; Milanovic, S; Kostic, V; Ljubisavljevic, Milos
2011-08-01
Recent evidence suggests that the dynamic-scaling behavior of the time series of signals extracted from separate peaks of tremor spectra may reveal the existence of multiple independent sources of tremor. Here, we have studied the dynamic characteristics of the time series of hand tremor movements in essential tremor (ET) patients using the detrended fluctuation analysis method. Hand accelerometry was recorded with (500 g) and without weight loading under postural conditions in 25 ET patients and 20 normal subjects. Time series comprising peak-to-peak (PtP) intervals were extracted from regions around the first three main frequency components of the power spectra (PwS) of the recorded tremors. The data were compared between the load and no-load conditions on the dominant (related to tremor severity) and non-dominant tremor sides, and with normal (physiological) oscillations in healthy subjects. Our analysis shows that, in ET, the dynamic characteristics of the main frequency component of the recorded tremors exhibit scaling behavior. Furthermore, the two main components of the ET tremor frequency spectra, otherwise indistinguishable without load, become significantly different after inertial loading, and they differ between the tremor sides (related to tremor severity). These results show that scaling analysis, a time-domain method, helps reveal tremor features not disclosed by frequency-domain analysis, and suggest that distinct oscillatory central circuits may generate the tremor in ET patients.
Miró, Conrado; Baeza, Antonio; Madruga, María J; Periañez, Raul
2012-11-01
The objective of this work consisted of analysing the spatial and temporal evolution of two radionuclide concentrations in the Tagus River. Time-series analysis techniques and numerical modelling have been used in this study. (137)Cs and (90)Sr concentrations have been measured from 1994 to 1999 at several sampling points in Spain and Portugal. These radionuclides have been introduced into the river by the liquid releases from several nuclear power plants in Spain, as well as from global fallout. Time-series analysis techniques have allowed the determination of radionuclide transit times along the river, and have also pointed out the existence of temporal cycles of radionuclide concentrations at some sampling points, which are attributed to water management in the reservoirs placed along the Tagus River. A stochastic dispersion model, in which transport with water, radioactive decay and water-sediment interactions are solved through Monte Carlo methods, has been developed. Model results are, in general, in reasonable agreement with measurements. The model has finally been applied to the calculation of mean ages of radioactive content in water and sediments in each reservoir. This kind of model can be a very useful tool to support the decision-making process after an eventual emergency situation. Copyright © 2012 Elsevier Ltd. All rights reserved.
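The particle-tracking core of such a stochastic dispersion model can be sketched as follows (a one-dimensional toy with hypothetical parameter values; the actual model also resolves water-sediment interactions, which are omitted here): each particle is advected with the water velocity, dispersed by a Gaussian random-walk step, and removed by radioactive decay with a per-step survival probability.

```python
import math

def transport_step(positions, u, D, dt, half_life, rng):
    # One Monte Carlo step: advection u*dt, random-walk dispersion with
    # step sqrt(2*D*dt)*N(0,1), and radioactive decay applied as a
    # survival probability exp(-lambda*dt) per particle.
    lam = math.log(2) / half_life
    survive = math.exp(-lam * dt)
    out = []
    for x in positions:
        if rng.random() < survive:           # particle has not decayed
            out.append(x + u * dt + math.sqrt(2 * D * dt) * rng.gauss(0, 1))
    return out
```

Iterating the step moves the particle cloud downstream at the advection speed while it spreads diffusively and loses mass to decay, which is the behavior the full model exploits to compute transit times and mean ages.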
Exploring Low-Amplitude, Long-Duration Deformational Transients on the Cascadia Subduction Zone
NASA Astrophysics Data System (ADS)
Nuyen, C.; Schmidt, D. A.
2017-12-01
The absence of long-term slow slip events (SSEs) in Cascadia is enigmatic on account of the diverse group of subduction zone systems that do experience long-term SSEs. In particular, southwest Japan, Alaska, New Zealand and Mexico have observed long-term SSEs, with some of the larger events exhibiting centimeter-scale surface displacements over the course of multiple years. The conditions that encourage long-term slow slip are not well established due to the variability in thermal parameter and plate dip amongst subduction zones that host long-term events. The Cascadia Subduction Zone likely has the capacity to host long-term SSEs, and the lack of such events motivates further exploration of the observational data. In order to search for the existence of long-duration transients in surface displacements, we examine Cascadia GPS time series from PANGA and PBO to determine whether or not Cascadia has hosted a long-term slow slip event in the past 20 years. A careful review of the time series does not reveal any large-scale multi-year transients. In order to more clearly recognize possible small amplitude long-term SSEs in Cascadia, the GPS time series are reduced with two separate methods. The first method involves manually removing (1) continental water loading terms, (2) transient displacements of known short-term SSEs, and (3) common mode signals that span the network. The second method utilizes a seasonal-trend decomposition procedure (STL) to extract a long-term trend from the GPS time series. Manual inspection of both of these products reveals intriguing long-term changes in the longitudinal component of several GPS stations in central Cascadia. To determine whether these shifts could be due to long-term slow slip, we invert the reduced surface displacement time series for fault slip using a principal component analysis-based inversion method. 
We also utilize forward fault models of various synthetic long-term SSEs to better understand how these events may appear in the time series for a range of magnitudes and durations. Results from this research have direct implications for the possible slip modes in Cascadia and how variations in slip over time can impact stress and strain accumulations along the margin.
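The seasonal-trend reduction step can be illustrated with a minimal additive decomposition (a crude stand-in for STL, which uses iterated loess smoothing; the synthetic daily-like series and period below are assumptions): the trend is a centered moving average over one period, the seasonal component is the set of cycle-position means of the detrended series, and anything left over is the remainder.

```python
def decompose(x, period):
    # Minimal additive seasonal-trend decomposition (stand-in for STL):
    # trend = centered moving average over one period (None at the edges),
    # seasonal = cycle-position means of the detrended interior points.
    n = len(x)
    half = period // 2
    trend = [None] * n
    for i in range(half, n - half):
        window = x[i - half:i + half + 1]
        trend[i] = sum(window) / len(window)
    detr = [x[i] - trend[i] for i in range(n) if trend[i] is not None]
    offs = [i % period for i in range(n) if trend[i] is not None]
    seasonal_means = []
    for k in range(period):
        vals = [d for d, o in zip(detr, offs) if o == k]
        seasonal_means.append(sum(vals) / len(vals))
    m = sum(seasonal_means) / period
    seasonal = [seasonal_means[i % period] - m for i in range(n)]
    return trend, seasonal
```

For GPS work one would use a robust STL implementation (e.g. statsmodels) so that offsets and transients do not leak into the seasonal term, but the decomposition logic is the same.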
NASA Astrophysics Data System (ADS)
Marcos-Garcia, Patricia; Pulido-Velazquez, Manuel; Lopez-Nicolas, Antonio
2016-04-01
Extreme natural phenomena, and more specifically droughts, constitute a serious environmental, economic and social issue in Southern Mediterranean countries, and are common in the Mediterranean Spanish basins due to the high temporal and spatial rainfall variability. Drought events are characterized by their complexity, being often difficult to identify and quantify both in time and space, and no universally accepted definition exists. This fact, along with future uncertainty about the duration and intensity of the phenomena on account of climate change, makes it necessary to improve our understanding of the impacts of climate change on droughts in order to design management plans and mitigation strategies. The present work aims to evaluate the impact of climate change on both meteorological and hydrological droughts, through the use of generalizations of the Standardized Precipitation Index (SPI). We use the Standardized Flow Index (SFI) to assess hydrological drought, using flow time series instead of rainfall time series. For meteorological droughts, the Standardized Precipitation and Evapotranspiration Index (SPEI) has been applied to assess the influence of temperature variability. In order to characterize climate change impacts on droughts, we have used projections from the CORDEX project (Coordinated Regional Climate Downscaling Experiment). Future rainfall and temperature time series for the short (2011-2040) and medium terms (2041-2070) were obtained, applying a quantile mapping method to correct the bias of these time series. Regarding hydrological drought, the Témez hydrological model, a conceptual, lumped model with few parameters, has been applied to simulate the impacts of the future temperature and rainfall time series on runoff and river discharges. Nevertheless, it is necessary to point out the time lag between meteorological and hydrological droughts. 
The case study is the Jucar river basin (Spain), a highly regulated system with a share of 80% of water use for irrigated agriculture. The results show that the climate change would increase the historical drought impacts in the river basin. Acknowledgments The study has been supported by the IMPADAPT project (CGL2013-48424-C2-1-R) with Spanish MINECO (Ministerio de Economía y Competitividad) and European FEDER funds.
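The family of standardized indices used here (SPI, SFI, SPEI) shares one computational core: map each accumulated value through a fitted CDF and then through the inverse standard-normal CDF. The sketch below uses an empirical plotting-position CDF as a stand-in (the classical SPI fits a gamma distribution instead, and the input values are synthetic):

```python
from statistics import NormalDist

def standardized_index(values):
    # SPI/SFI-style standardization: empirical CDF via Weibull plotting
    # positions, then the inverse standard-normal CDF. Negative values
    # flag drier-than-median conditions, positive values wetter ones.
    n = len(values)
    order = sorted(range(n), key=lambda i: values[i])
    nd = NormalDist()
    z = [0.0] * n
    for rank, i in enumerate(order):
        p = (rank + 1) / (n + 1)          # Weibull plotting position
        z[i] = nd.inv_cdf(p)
    return z
```

Applying the same transform to precipitation, streamflow, or a precipitation-minus-evapotranspiration balance yields SPI-, SFI-, and SPEI-like series on a common dimensionless scale, which is what makes meteorological and hydrological droughts directly comparable.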
The Interstellar Gas Dust Streams and Seeds of Life
NASA Astrophysics Data System (ADS)
Oleg, Khavroshkin; Vladislav, Tsyplakov
Gas-Dust Streams from Double Stars and Lunar Seismicity. The time series of seismic events were constructed as follows: the peak amplitudes of events in standard units were placed on the ordinate axis, while the seismogram durations of the moonquakes and the subsequent time intervals between them were placed on the abscissa. The spectrum of this series disclosed peaks at hidden cosmological periodicities of lunar seismicity; part of these peaks correspond to the orbital periods of the double stars nearest to the Solar system. The proposed explanation of these results is that gas-dust streams from binary star systems exist and interact with the lunar surface. The information content of Nakamura's Catalog of moonquakes is very rich, ranging from solar-earth tides to clustering among the meteoroid streams [1, 2]. Histograms of meteoroid-impact seismic data revealed the seismic wave responses of the Moon to solar oscillations and to the action of the dust-gas plasma of meteoroid streams on the lunar surface [3]. The spectrum of the meteoroid-stream seismicity series, constructed in the same way [4], disclosed peaks at the orbital periods of some planets and their satellites and at solar oscillations [4, 5]. The study of the peculiarities of the histogram envelopes [3] and a comparative analysis of solar burst data and the meteoroid mass distribution confirmed [3, 4] and revealed a Forbush effect for gas-dust plasma [6]. Hidden astrophysical periodicities of lunar seismicity were obtained earlier from an analysis of time series [7] similar to those of [4]. Part of the results of [7] is presented in the Table, where the peaks correspond to the orbital periods of the double stars nearest to the Solar system. 
The hypothesis explaining the Table results is that gas-dust streams from binary star systems near the Solar system exist and interact with the lunar surface. The characteristics of the binary star systems and the identified periods of lunar seismicity have been published. Genesis of Life. If the solar system is reached by gas-dust streams from binary stars, then all bodies in space carry particles of star dust on their surfaces and/or in their atmospheres. The solar system has made 8-10 revolutions around the galactic center and has thus captured dust from many thousands of stars. As these stars in turn caught dust particles from other stars, our solar system probably holds dust samples from essentially all objects of our galaxy. The age of the galaxy and its oldest stars is approximately more than 15 billion years, while that of the Earth is only 4.5 Gyr, and the genesis of Life on Earth took no more than 3 billion years. A comparative analysis of this simple balance of timescales thus suggests that the genesis of Life on Earth is the result of galactic processes and objects, and not of the solar system alone. Peculiarity of Genesis. After the formation of the solar system, all old and newly captured dust particles first accumulated in the Oort cloud and were then carried by comets to the planets. The modern state of the Earth has existed for more than 3 billion years, so the conditions for the appearance of Life were always present, and these processes may have happened several times during this period. The size of the universe and of galaxies at t0 < 1 billion years could be much smaller than modern estimates (for example, up to 15 times smaller in diameter), which implies the existence of a common gas-dust exchange. The density of physical fields and radiation at the moment τ0 was many orders of magnitude higher than the density existing now. 
The disintegration of neutron matter and of the nuclei of heavy unstable elements caused constantly existing streams of left-polarized electrons, which determined the chiral asymmetry of the original organic molecules and thus the chirality of the existing biological world. Some types of radiation could functionally replace enzymes during the formation of self-reproducing molecular structures. Humans use only about 10% of their genetic information, which indicates a common overall surplus of genetic material in the Earth's biosphere. Probably, at the moment t0, under unique conditions and with sufficient time, a universal galactic gene was created whose different elements are capable of creating biospheres on planets under the widest set of external conditions and at various stages of development. If a universal uniform galactic genome exists, this universality will appear as redundancy. The universal model of the gene logically connects with the concepts of prediction and design; hence, a model of the origin of life involving a Creator becomes, in this view, logically better grounded. Gas-Dust Streams and the Safety of Life Seeds. A general role in this case is played by gas-dust structures (plasma crystals). Seeds of life and epidemics on the Earth: a strong correlation is claimed between the appearance of comets flying near the Earth, meteoroid impacts on the Earth's day surface, and human epidemics. The death of cosmonaut Serebrov and gas-dust streams: why are epidemics so seldom? References 1. Sadeh D. Possible siderial period for the seismic lunar activity // Nature, 1972. Vol. 240, p. 139. 2. Oberst J. and Nakamura Y. A Search for Clustering among the Meteoroid Impacts Detected by the Apollo Lunar Seismic Network // ICARUS, Vol. 91, 315-325, 1991; Balazin M. and Zetzsche A. // PHYS.STAT.SOL., Vol. 2, 1962, 1670-1674. 3. Khavroshkin O.B. and Tsyplakov V.V. 
Meteoroid stream impacts on the Moon: Information of duration of the seismograms / In: Proceedings of the Conference METEOROID 2001, Swedish Institute of Space Physics, Kiruna, Sweden, 6-10 August 2001. 4. Khavroshkin O.B. and Tsyplakov V.V. Temporal Structure of Meteoroid Streams and Lunar Seismicity according to Nakamura's Catalogue / In: Proceedings of the Conference METEOROID 2001, Swedish Institute of Space Physics, Kiruna, Sweden, 6-10 August 2001. 5. Khavroshkin O.B., Tsyplakov V.V. Moon exogenous seismicity: meteoroid streams, micrometeorites and IDPs, Solar wind // Herald of the DGGGMS RAS: Electr. Sci.-Inf. J., 4(21)'2003, http://www.scgis.ru/russian/cp1251/h_dgggms/1-2003/scpub-3.pdf 6. Khavroshkin O.B., Tsyplakov V.V. Peculiarities of envelops of histograms of lunar impact seismogram durations / In: Geophysical research essays. Schmidt United Institute of Physics of the Earth Press, Moscow, 2003. 471 p. (in Russian). 7. Khavroshkin O.B., Tsyplakov V.V. Hidden astrophysical periodicities of lunar seismicity // Herald of the DGGGMS RAS: Electr. Sci.-Inf. J., 4(14)'2000, http://www.scgis.ru/russian/cp1251/h_dgggms/4-2000/scpub-3.pdf
NASA Astrophysics Data System (ADS)
Molina, A.; Vanacker, V.; Brisson, E.; Balthazar, V.
2012-04-01
Interactions between human activities and the physical environment have increasingly transformed the hydrological functioning of Andean ecosystems. In these human-modified landscapes, land use/-cover change may have a profound effect on riverine water and sediment fluxes. The hydrological impacts of land use/-cover change are diverse, as changes in vegetation affect the various components of the hydrological cycle including evapotranspiration, infiltration and surface runoff. Quantitative data for tropical mountain regions are scarce, as few long time series on rainfall, water discharge and land use are available. Furthermore, time series of rainfall and streamflow data in tropical mountains are often highly influenced by large inter- and intra-annual variability. In this paper, we analyse the hydrological response to complex forest cover change for a catchment of 280 km2 located in the Ecuadorian Andes. Forest cover change in the Pangor catchment was reconstructed based on airphotos (1963, 1977), LANDSAT TM (1991) and ETM+ data (2001, 2009). From 1963, natural vegetation was converted to agricultural land and pine plantations: forests decreased by a factor of 2, and paramo decreased by 20 km2 between 1963 and 2009. For this catchment, there exists an exceptionally long record of rainfall and streamflow data that dates back to the 1970s, but large variability in the hydrometeorological data exists that is partly related to ENSO events. Given the nonstationary and nonlinear character of the ENSO-related changes in rainfall, we used the Hilbert-Huang transformation to detrend the time series of the river flow data from inter- and intra-annual fluctuations in rainfall. After applying adaptive data analysis based on empirical mode decomposition techniques, it becomes apparent that the long-term trend in streamflow is different from the long-term trend in rainfall data. 
While the streamflow data show a long-term decrease in monthly flow, the rainfall data have a trend of increasing and then decreasing precipitation amounts. These results suggest that the land use changes had an important impact on the total water yield of the catchment. Interestingly, the effect of reforestation in the upper part of the catchment with its associated decrease in water yield seems to be dominant over the effect of deforestation in the lower part of the basin.
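The empirical-mode-decomposition step behind the Hilbert-Huang transform can be illustrated with one crude sifting pass (a heavily simplified sketch on a synthetic series: real EMD uses cubic-spline envelopes, repeated sifting, and a stopping criterion): envelopes are interpolated through the local maxima and minima, and their mean, the slowly varying part, is subtracted from the signal.

```python
def sift_once(x):
    # One crude EMD sifting step: linear-interpolate envelopes through
    # local maxima (+1) and minima (-1), subtract their mean envelope.
    def extrema(sign):
        idx = [0] + [i for i in range(1, len(x) - 1)
                     if sign * x[i] > sign * x[i - 1]
                     and sign * x[i] >= sign * x[i + 1]] + [len(x) - 1]
        return idx
    def interp(idx):
        env = [0.0] * len(x)
        for a, b in zip(idx, idx[1:]):
            for i in range(a, b + 1):
                w = 0 if b == a else (i - a) / (b - a)
                env[i] = (1 - w) * x[a] + w * x[b]
        return env
    upper = interp(extrema(+1))
    lower = interp(extrema(-1))
    mean_env = [(u + l) / 2 for u, l in zip(upper, lower)]
    return [xi - m for xi, m in zip(x, mean_env)], mean_env
```

Applied to a fast oscillation riding on a slow trend, the mean envelope recovers the trend and the residual recovers the oscillation; iterating this separation across scales is what lets EMD detrend streamflow from ENSO-related rainfall fluctuations without prescribing a functional form.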
Do regional methods really help reduce uncertainties in flood frequency analyses?
NASA Astrophysics Data System (ADS)
Cong Nguyen, Chi; Payrastre, Olivier; Gaume, Eric
2013-04-01
Flood frequency analyses are often based on continuous measured series at gauge sites. However, the length of the available data sets is usually too short to provide reliable estimates of extreme design floods. To reduce the estimation uncertainties, the analyzed data sets have to be extended either in time, making use of historical and paleoflood data, or in space, merging data sets considered statistically homogeneous to build large regional data samples. Nevertheless, the advantage of regional analyses, the large increase in the size of the studied data sets, may be counterbalanced by possible heterogeneities of the merged sets. The application and comparison of four different flood frequency analysis methods to two regions affected by flash floods in the south of France (Ardèche and Var) illustrates how this balance between the number of records and possible heterogeneities plays out in real-world applications. The four tested methods are: (1) a local statistical analysis based on the existing series of measured discharges, (2) a local analysis incorporating existing information on historical floods, (3) a standard regional flood frequency analysis based on existing measured series at gauged sites and (4) a modified regional analysis including estimated extreme peak discharges at ungauged sites. Monte Carlo simulations are conducted to simulate a large number of discharge series with characteristics similar to the observed ones (type of statistical distribution, number of sites and records) to evaluate to what extent the results obtained on these case studies can be generalized. These two case studies indicate that even small statistical heterogeneities, which are not detected by the standard homogeneity tests implemented in regional flood frequency studies, may drastically limit the usefulness of such approaches. 
On the other hand, these results show that incorporating information on extreme events, either historical flood events at gauged sites or estimated extremes at ungauged sites in the considered region, is an efficient way to reduce uncertainties in flood frequency studies.
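The Monte Carlo comparison of local versus regional estimators can be sketched as follows (a simplified experiment under assumed conditions: perfectly homogeneous Gumbel sites, method-of-moments fitting, and an index-flood pooling scheme; the paper's simulations are more elaborate): when the sites really are homogeneous, pooling sharply reduces the error of the design-flood quantile.

```python
import math

EULER = 0.5772156649  # Euler-Mascheroni constant

def gumbel_sample(mu, beta, n, rng):
    # Inverse-CDF sampling of Gumbel-distributed annual maxima.
    return [mu - beta * math.log(-math.log(rng.random())) for _ in range(n)]

def gumbel_quantile_mom(sample, T):
    # Method-of-moments Gumbel fit and the T-year return-period quantile.
    n = len(sample)
    mean = sum(sample) / n
    var = sum((v - mean) ** 2 for v in sample) / (n - 1)
    beta = math.sqrt(6 * var) / math.pi
    mu = mean - EULER * beta
    return mu - beta * math.log(-math.log(1 - 1 / T))

def regional_quantile(samples, T):
    # Index-flood pooling: scale each site's record by its mean, fit the
    # pooled dimensionless sample, then rescale by the target site's mean.
    pooled = []
    for s in samples:
        m = sum(s) / len(s)
        pooled.extend(v / m for v in s)
    growth = gumbel_quantile_mom(pooled, T)
    target_mean = sum(samples[0]) / len(samples[0])
    return growth * target_mean
```

Repeating the experiment with deliberately heterogeneous site parameters would reproduce the paper's cautionary result: the pooled growth curve then carries a bias that homogeneity tests on short records may fail to flag.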
NASA Astrophysics Data System (ADS)
Daux, V.; Garcia de Cortazar-Atauri, I.; Yiou, P.; Chuine, I.; Garnier, E.; Ladurie, E. Le Roy; Mestre, O.; Tardaguila, J.
2011-11-01
We present a dataset of grape harvest date (GHD) series that has been compiled from international and non-translated French and Spanish literature and from unpublished documentary sources from public organizations and wine-growers. As of June 2011, this GHD dataset comprises 378 series, mainly from France (93% of the data), as well as series from Switzerland, Italy, Spain and Luxembourg. The series have variable length and contain gaps of variable sizes. The longest and most complete ones are from Burgundy, Switzerland, the Southern Rhône valley, Jura and Ile-de-France. The GHD series were grouped into 27 regions according to their location, to geomorphological and geological criteria, and to past and present grape varieties. The GHD regional composite series (GHD-RCS) were calculated and compared pairwise to assess the quality of the series. Significant (p-value < 0.001) and strong correlations exist between most of them. As expected, the correlations tended to be higher when the vineyards are closer together, the highest correlation (R = 0.91) being obtained between the High Loire Valley and the Ile-de-France GHD-RCS. The strong dependence of the vine cycle on temperature and, therefore, the strong link between GHD and the temperature of the growing season was also used to test the quality of the GHD series. The strongest correlations are obtained between the GHD-RCS and the temperature series of the nearest weather stations. Moreover, the GHD-RCS/temperature correlation maps show spatial patterns similar to temperature correlation maps. The stability of the correlations over time is explored. The most striking feature is their generalized deterioration at the turn of the 19th to the 20th century. The possible effects on the GHD of the phylloxera crisis, which took place at this time, are discussed. The median of the standardized GHD-RCS was calculated. The distribution of the extreme years of this general synthetic series is not homogeneous. 
Extremely late years all occur during a two-century long time-window from the early 17th to the early 19th century, while extremely early years are frequent during the 16th and since the mid-19th century. The dataset is made accessible for climate research through the Internet. It should allow a variety of climate studies, including reconstructions of atmospheric circulation over Western Europe.
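Pairwise comparison of gappy composite series reduces to a Pearson correlation computed over the years where both series have data. A minimal sketch (with None marking a gap, and synthetic values rather than actual GHD data):

```python
def pearson_pairwise(a, b):
    # Pearson correlation on pairwise-complete observations:
    # only years where both series have a value (None = gap) are used.
    pairs = [(x, y) for x, y in zip(a, b) if x is not None and y is not None]
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    sxy = sum((x - mx) * (y - my) for x, y in pairs)
    sx = sum((x - mx) ** 2 for x, _ in pairs) ** 0.5
    sy = sum((y - my) ** 2 for _, y in pairs) ** 0.5
    return sxy / (sx * sy)
```

Computing this statistic in a sliding window over the overlap period is one simple way to examine the stability of the correlations over time, such as the deterioration reported around the phylloxera crisis.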
Lunar periodicity in the shell flux of some planktonic foraminifera in the Gulf of Mexico
NASA Astrophysics Data System (ADS)
Jonkers, L.; Reynolds, C. E.; Richey, J.; Hall, I. R.
2014-12-01
Synchronised reproduction offers clear benefits to planktonic foraminifera - an important group of marine calcifiers - as it increases the chances of successful gamete fusion. Such synchrony requires tuning to an internal or external clock. Evidence exists for lunar reproductive cycles in some species, but its recognition in shell flux time series has proven difficult, raising questions about reproductive strategies. Using spectral analysis of a 6 year time series (mostly at weekly resolution) from the northern Gulf of Mexico, we show that the shell flux of Globorotalia menardii, Globigerinella siphonifera, Orbulina universa, Globigerinoides sacculifer and Globigerinoides ruber (both pink and white varieties) is characterised by lunar periodicity. The fluxes of Pulleniatina obliquiloculata, Neogloboquadrina dutertrei, Globigerinella calida, Globorotalia crassaformis and Globigerinita glutinata do not show significant spectral power at the lunar frequency. Where present, lunar periodicity is superimposed on longer-term/seasonal changes in the shell fluxes, but accounts for a significant part of the variance in the fluxes. The amplitude of the lunar cycle increases roughly in proportion to the magnitude of the flux, demonstrating that most of the population is indeed affected by lunar-phased synchronisation. The phasing of peak fluxes appears species-specific, with G. menardii, O. universa and G. sacculifer showing most peaks around the full moon and G. ruber one week later. In contrast, peaks of G. siphonifera occur predominantly around new moon. Very limited literature exists on lunar phasing of foraminiferal export fluxes, but spatial differences in its presence may exist, corroborating the exogenous nature of lunar synchrony in planktonic foraminifera.
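Detecting a lunar (~29.5-day) cycle in a weekly-resolution flux series amounts to checking for excess spectral power at the lunar frequency. A minimal periodogram scan on a synthetic weekly series (not the sediment-trap data) illustrates the idea; note that weekly sampling still resolves the lunar period because it exceeds twice the sampling interval:

```python
import math

def periodogram_peak(x, dt, periods):
    # Evaluate Schuster-periodogram power at candidate periods (in the
    # same time unit as dt) and return the period with the most power.
    n = len(x)
    mean = sum(x) / n
    best_p, best_pow = None, -1.0
    for p in periods:
        w = 2 * math.pi * dt / p          # angular frequency per sample
        c = sum((xi - mean) * math.cos(w * i) for i, xi in enumerate(x))
        s = sum((xi - mean) * math.sin(w * i) for i, xi in enumerate(x))
        power = (c * c + s * s) / n
        if power > best_pow:
            best_p, best_pow = p, power
    return best_p
```

In the synthetic test below, a lunar cycle of amplitude 3 is superimposed on a weaker annual cycle, and the scan correctly singles out the 29.5-day period; for irregular trap sampling a Lomb-Scargle periodogram would be the appropriate generalization.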
17 CFR 232.313 - Identification of investment company type and series and/or class (or contract).
Code of Federal Regulations, 2013 CFR
2013-04-01
... company type and series and/or class (or contract). 232.313 Section 232.313 Commodity and Securities... series and/or class (or contract). (a) Registered investment companies and business development companies... keep current, information concerning their existing and new series and/or classes (or contracts, in the...
17 CFR 232.313 - Identification of investment company type and series and/or class (or contract).
Code of Federal Regulations, 2010 CFR
2010-04-01
... company type and series and/or class (or contract). 232.313 Section 232.313 Commodity and Securities... series and/or class (or contract). (a) Registered investment companies and business development companies... keep current, information concerning their existing and new series and/or classes (or contracts, in the...
Álvarez, Natalí; Gómez, Giovan F; Naranjo-Díaz, Nelson; Correa, Margarita M
2018-06-18
The Arribalzagia Series of the Anopheles Subgenus comprises morphologically similar species or members of species complexes which makes correct species identification difficult. Therefore, the aim of this work was to discriminate the morphospecies of the Arribalzagia Series present in Colombia using a multilocus approach based on ITS2, COI and CAD sequences. Specimens of the Arribalzagia Series collected at 32 localities in nine departments were allocated to seven species. Individual and concatenated Bayesian analyses showed high support for each of the species and reinforced the previous report of the Apicimacula species Complex with distribution in the Pacific Coast and northwestern Colombia. In addition, a new molecular operational taxonomic unit-MOTU was identified, herein denominated near Anopheles peryassui, providing support for the existence of a Peryassui species Complex. Further, the CAD gene, just recently used for Anopheles taxonomy and phylogeny, demonstrated its power in resolving phylogenetic relationships among species of the Arribalzagia Series. The divergence times for these species correspond to the early Pliocene and the Miocene. Considering the epidemiological importance of some species of the Series and their co-occurrence in malaria endemic regions of Colombia, their discrimination constitutes an important step for vector incrimination and control in the country. Copyright © 2018. Published by Elsevier B.V.
Park, Chihyun; Yun, So Jeong; Ryu, Sung Jin; Lee, Soyoung; Lee, Young-Sam; Yoon, Youngmi; Park, Sang Chul
2017-03-15
Cellular senescence irreversibly arrests the growth of human diploid cells. In addition, recent studies have indicated that senescence is a multi-step, evolving process related to important complex biological processes. Most studies analyzed only the genes, and their functions, representing each senescence phase, without considering gene-level interactions and continuously perturbed genes. It is necessary to reveal the genotypic mechanism, inferred from the affected genes and their interactions, underlying the senescence process. We suggest a novel computational approach to identify an integrative network which profiles an underlying genotypic signature from time-series gene expression data. The relatively perturbed genes were selected at each time point based on a proposed scoring measure termed the perturbation score. The selected genes were then integrated with protein-protein interactions to construct a time-point-specific network. From these networks, the edges conserved across time points were extracted to form the common network, and a statistical test was performed to demonstrate that this network could explain the phenotypic alteration. As a result, it was confirmed that the difference in the average perturbation scores of the common network between two time points could explain the phenotypic alteration. We also performed functional enrichment on the common network and identified a high association with the phenotypic alteration. Remarkably, we observed that the identified cell-cycle-specific common network plays an important role in replicative senescence as a key regulator. Heretofore, network analysis of time-series gene expression data has focused on how topological structure changes over time. Conversely, we focused on the structure that is conserved while its context changes over time, and showed that it can explain the phenotypic changes. 
We expect that the proposed method will help to elucidate the biological mechanism unrevealed by the existing approaches.
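The scoring-and-intersection idea described above can be sketched in a few lines. The expression values, the absolute-deviation score, and the PPI edge list below are all toy assumptions for illustration, not the authors' data or exact scoring formula:

```python
# Sketch of the common-network idea (hypothetical expression values and
# PPI edges; the perturbation score here is a stand-in absolute deviation).

def perturbation_scores(expr_t, expr_ctrl):
    # Score each gene by absolute deviation from its control expression.
    return {g: abs(expr_t[g] - expr_ctrl[g]) for g in expr_t}

def timepoint_network(scores, ppi_edges, top_k):
    # Keep the top_k most-perturbed genes, then induce the PPI subnetwork.
    kept = set(sorted(scores, key=scores.get, reverse=True)[:top_k])
    return {e for e in ppi_edges if e[0] in kept and e[1] in kept}

ctrl = {"A": 1.0, "B": 1.0, "C": 1.0, "D": 1.0}
t1   = {"A": 3.0, "B": 2.5, "C": 1.1, "D": 2.8}
t2   = {"A": 2.9, "B": 2.6, "C": 2.7, "D": 1.05}
ppi  = {("A", "B"), ("B", "C"), ("B", "D"), ("A", "D")}

net1 = timepoint_network(perturbation_scores(t1, ctrl), ppi, 3)
net2 = timepoint_network(perturbation_scores(t2, ctrl), ppi, 3)
common = net1 & net2          # edges conserved across both time points
```

The conserved edge set is what the statistical test would then be run on.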
Continuing upward trend in Mt Read Huon pine ring widths - Temperature or divergence?
NASA Astrophysics Data System (ADS)
Allen, K. J.; Cook, E. R.; Buckley, B. M.; Larsen, S. H.; Drew, D. M.; Downes, G. M.; Francey, R. J.; Peterson, M. J.; Baker, P. J.
2014-10-01
To date, no attempt has been made to assess the presence or otherwise of the “Divergence Problem” (DP) in existing multi-millennial Southern Hemisphere tree-ring chronologies. We have updated the iconic Mt Read Huon pine chronology from Tasmania, southeastern Australia, to now include the warmest decade on record, AD 2000-2010, and used the Kalman Filter (KF) to examine it for signs of divergence against four different temperature series available for the region. Ring-width growth for the past two decades is statistically unprecedented for the past 1048 years. Although we have identified a decoupling between temperature and growth in the past two decades, the relationship between some of the temperature records and growth has varied over time since the start of instrumental records. Rather than the special case of ‘divergence', we have identified a more general time-dependence between growth and temperature over the last 100 years. This time-dependence appears particularly problematic at interdecadal time scales. Due to the time-dependent relationships, and uncertainties related to the climate data, the use of any of the individual temperature series examined here potentially complicates temperature reconstruction. Some of the uncertainty in the climate data may be associated with changing climatic conditions, such as the intensification of the sub-tropical ridge (STR) and its impact on the frequency of anticyclonic conditions over the Mt Read site. Increased growth at the site, particularly in the last decade, over and above what would be expected based on a linear temperature model alone, may be consistent with a number of hypotheses. Existing uncertainties in the climate data need to be resolved and independent physiological information obtained before a range of hypotheses for this increased growth can be effectively evaluated.
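The Kalman-filter test for a time-dependent growth-temperature relationship can be illustrated with a minimal scalar filter tracking a random-walk regression coefficient. The synthetic data, coefficient model, and noise settings below are assumptions, not the authors' KF specification:

```python
# Minimal scalar Kalman filter for a time-varying growth-temperature
# coefficient: beta_t = beta_{t-1} + w,  y_t = beta_t * x_t + v.
import numpy as np

def kalman_tvp(y, x, q=0.01, r=0.1):
    beta, p = 0.0, 1.0
    betas = []
    for yt, xt in zip(y, x):
        p += q                          # predict: variance grows by drift q
        k = p * xt / (xt * xt * p + r)  # Kalman gain for observation y_t
        beta += k * (yt - beta * xt)    # update coefficient estimate
        p *= (1 - k * xt)               # update coefficient variance
        betas.append(beta)
    return np.array(betas)

rng = np.random.default_rng(0)
t = np.arange(200)
true_beta = np.where(t < 100, 0.8, 0.2)    # relationship weakens mid-series
temp = rng.normal(1.0, 0.3, 200)
growth = true_beta * temp + rng.normal(0, 0.05, 200)
est = kalman_tvp(growth, temp)
```

The recovered coefficient path makes a decoupling (here, the drop after index 100) directly visible.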
NASA Astrophysics Data System (ADS)
Kasatkina, T. I.; Dushkin, A. V.; Pavlov, V. A.; Shatovkin, R. R.
2018-03-01
Neural network methods have recently been applied in the development of information systems and software for predicting dynamic series. They are more flexible than existing analogues and can take the nonlinearities of a series into account. In this paper, we propose a modified algorithm for predicting dynamic series, which includes a method for training neural networks and an approach to describing and presenting input data, based on prediction by the multilayer perceptron method. To construct the neural network, the values of the series at its extremum points, and the time values corresponding to them, formed using the sliding window method, are used as input data. The proposed algorithm can act as an independent approach to predicting dynamic series, or as one part of a forecasting system. The efficiency of predicting the evolution of dynamic series for short-term one-step and long-term multi-step forecasts by the classical multilayer perceptron method and by the modified algorithm is compared using synthetic and real data. The result of this modification is the minimization of the iterative error that arises when previously predicted values are fed back as inputs to the neural network, as well as an increase in the accuracy of the network's iterative predictions.
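The input construction described above, extremum values and their times arranged by a sliding window, can be sketched as follows. The perceptron training step itself is omitted; the toy series and window length are assumptions:

```python
import numpy as np

def extrema(series):
    # Indices where the series has a local max or min (sign change of diff).
    d = np.diff(series)
    return np.array([i for i in range(1, len(series) - 1)
                     if d[i - 1] * d[i] < 0])

def windows(values, times, w):
    # Sliding windows of (value, time) pairs; the next extremum's value
    # is the prediction target for the perceptron.
    X, y = [], []
    for i in range(len(values) - w):
        X.append(np.column_stack([values[i:i + w], times[i:i + w]]).ravel())
        y.append(values[i + w])
    return np.array(X), np.array(y)

t = np.linspace(0, 6 * np.pi, 300)
s = np.sin(t) + 0.1 * t                  # oscillation with drift
ix = extrema(s)
X, y = windows(s[ix], t[ix], w=4)        # 4 extrema per input pattern
```

Each row of `X` would be fed to the multilayer perceptron; predicting extrema rather than every sample is what keeps the iterative error small.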
Dual-induced multifractality in online viewing activity.
Qin, Yu-Hao; Zhao, Zhi-Dan; Cai, Shi-Min; Gao, Liang; Stanley, H Eugene
2018-01-01
Although recent studies have found that long-term correlations relating to the fat-tailed distribution of inter-event times exist in human activity, and that these correlations indicate the presence of fractality, the property of fractality and its origin have not been analyzed. We use both detrended fluctuation analysis and multifractal detrended fluctuation analysis to analyze the time series of online viewing activity from Movielens and Netflix. We find long-term correlations at both the individual and communal levels, and that the extent of correlation at the individual level is determined by the activity level. These long-term correlations also indicate that there is fractality in the pattern of online viewing. We first find a multifractality that results from the combined effect of the fat-tailed distribution of inter-event times (i.e., the times between successive viewing actions of individuals) and the long-term correlations in online viewing activity, and verify this finding using three synthesized series. It can therefore be concluded that the multifractality in online viewing activity is caused by both the fat-tailed distribution of inter-event times and the long-term correlations, and that this extends the generic property of human activity to include not just physical space but also cyberspace.
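A minimal monofractal DFA, the first of the two methods named above, can be sketched as follows. The input is white noise, for which the expected scaling exponent is about 0.5; the scale choices are arbitrary:

```python
import numpy as np

def dfa(x, scales):
    # Detrended fluctuation analysis: slope of log F(n) versus log n.
    y = np.cumsum(x - np.mean(x))              # integrated profile
    F = []
    for n in scales:
        segs = len(y) // n
        rms = []
        for s in range(segs):
            seg = y[s * n:(s + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # local linear fit
            rms.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(rms)))        # fluctuation at scale n
    alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return alpha

rng = np.random.default_rng(1)
wn = rng.normal(size=4096)                     # uncorrelated noise
alpha = dfa(wn, [16, 32, 64, 128, 256])
```

An exponent near 0.5 signals no long-term correlation; persistent correlations push it above 0.5, and MF-DFA generalizes this with q-th-order moments.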
NASA Astrophysics Data System (ADS)
Dobrovolný, Petr; Brázdil, Rudolf; Kotyza, Oldřich; Valášek, Hubert
2010-05-01
Series of temperature and precipitation indices (on an ordinal scale) based on the interpretation of various sources of documentary evidence (e.g. narrative written reports, visual daily weather records, personal correspondence, special prints, official economic records, etc.) are used as predictors in the reconstruction of mean seasonal temperatures and seasonal precipitation totals for the Czech Lands from A.D. 1500. Long instrumental measurements from 1771 (temperatures) and 1805 (precipitation) are used as target values to calibrate and verify the documentary-based index series. The reconstruction is based on linear regression with variance and mean adjustments. The reconstructed series were compared with similar European documentary-based reconstructions as well as with reconstructions based on different natural proxies, and were analyzed with respect to trends on different time scales and the occurrence of extreme values. We discuss the uncertainties typical of documentary evidence from historical archives. Although reports on weather and climate in documentary archives cover all seasons, our reconstructions provide the best results for winter temperatures and summer precipitation. The explained variance for these seasons is, however, comparable to that of other existing reconstructions for Central Europe.
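The calibration step, a linear regression of instrumental targets on the index series followed by mean and variance adjustment, can be sketched like this (synthetic ordinal indices and temperatures, not the Czech series):

```python
import numpy as np

def calibrate(index, target):
    # OLS fit over the instrumental overlap, then rescale the reconstruction
    # so its mean and variance match the target (variance adjustment).
    b, a = np.polyfit(index, target, 1)
    rec = a + b * index
    rec = (rec - rec.mean()) / rec.std() * target.std() + target.mean()
    return a, b, rec

rng = np.random.default_rng(2)
idx = rng.integers(-3, 4, 120).astype(float)        # ordinal indices -3..+3
temp = 9.0 + 0.8 * idx + rng.normal(0, 0.5, 120)    # instrumental overlap
a, b, rec = calibrate(idx, temp)
```

The adjustment matters because plain regression shrinks variance, which would understate past extremes.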
Updating stand-level forest inventories using airborne laser scanning and Landsat time series data
NASA Astrophysics Data System (ADS)
Bolton, Douglas K.; White, Joanne C.; Wulder, Michael A.; Coops, Nicholas C.; Hermosilla, Txomin; Yuan, Xiaoping
2018-04-01
Vertical forest structure can be mapped over large areas by combining samples of airborne laser scanning (ALS) data with wall-to-wall spatial data, such as Landsat imagery. Here, we use samples of ALS data and Landsat time-series metrics to produce estimates of top height, basal area, and net stem volume for two timber supply areas near Kamloops, British Columbia, Canada, using an imputation approach. Both single-year and time series metrics were calculated from annual, gap-free Landsat reflectance composites representing 1984-2014. Metrics included long-term means of vegetation indices, as well as measures of the variance and slope of the indices through time. Terrain metrics, generated from a 30 m digital elevation model, were also included as predictors. We found that imputation models improved with the inclusion of Landsat time series metrics when compared to single-year Landsat metrics (relative RMSE decreased from 22.8% to 16.5% for top height, from 32.1% to 23.3% for basal area, and from 45.6% to 34.1% for net stem volume). Landsat metrics that characterized 30-years of stand history resulted in more accurate models (for all three structural attributes) than Landsat metrics that characterized only the most recent 10 or 20 years of stand history. To test model transferability, we compared imputed attributes against ALS-based estimates in nearby forest blocks (>150,000 ha) that were not included in model training or testing. Landsat-imputed attributes correlated strongly to ALS-based estimates in these blocks (R2 = 0.62 and relative RMSE = 13.1% for top height, R2 = 0.75 and relative RMSE = 17.8% for basal area, and R2 = 0.67 and relative RMSE = 26.5% for net stem volume), indicating model transferability. 
These findings suggest that in areas with spatially limited ALS data acquisitions, imputation models driven by Landsat time series and terrain metrics can be used effectively to produce wall-to-wall estimates of key inventory attributes, providing an opportunity to update estimates of forest attributes in areas where inventory information is either out of date or non-existent.
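The imputation idea can be sketched with a simple k-nearest-neighbour stand-in: ALS-sampled cells supply reference attribute values, and wall-to-wall cells borrow from their nearest neighbours in Landsat-metric space. All data below are synthetic, and this is not the authors' imputation method or metric set:

```python
import numpy as np

def knn_impute(X_ref, y_ref, X_new, k=3):
    # For each target cell, average the attribute of the k reference cells
    # closest in Landsat-metric space.
    out = []
    for x in X_new:
        d = np.linalg.norm(X_ref - x, axis=1)
        out.append(y_ref[np.argsort(d)[:k]].mean())
    return np.array(out)

rng = np.random.default_rng(3)
X_ref = rng.normal(size=(200, 5))           # e.g. index means, slopes, variances
y_ref = 20 + 3 * X_ref[:, 0] + rng.normal(0, 0.5, 200)   # "top height" (m)
X_new = rng.normal(size=(50, 5))            # cells without ALS coverage
y_hat = knn_impute(X_ref, y_ref, X_new)
rrmse = np.sqrt(np.mean((y_hat - (20 + 3 * X_new[:, 0])) ** 2)) / 20 * 100
```

Relative RMSE against the known synthetic truth plays the role of the accuracy figures quoted in the abstract.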
NASA Astrophysics Data System (ADS)
Evans, K. D.; Early, A. B.; Northup, E. A.; Ames, D. P.; Teng, W. L.; Olding, S. W.; Krotkov, N. A.; Arctur, D. K.; Beach, A. L., III; Silverman, M. L.
2017-12-01
The role of NASA's Earth Science Data Systems Working Groups (ESDSWG) is to make recommendations relevant to NASA's Earth science data systems from users' experiences and community insight. Each group works independently, focusing on a unique topic. Progress of two of the 2017 Working Groups will be presented. In a single airborne field campaign, there can be several different instruments and techniques that measure the same parameter on one or more aircraft platforms. Many of these same parameters are measured during different airborne campaigns using similar or different instruments and techniques. The Airborne Composition Standard Variable Name Working Group is working to create a list of variable standard names that can be used across all airborne field campaigns in order to assist in the transition to the ICARTT Version 2.0 file format. The overall goal is to enhance the usability of ICARTT files and the searchability of airborne field campaign data. The Time Series Working Group (TSWG) is a continuation of the 2015 and 2016 Time Series Working Groups. In 2015, we started TSWG with the intention of exploring the new OGC (Open Geospatial Consortium) WaterML 2 standards as a means for encoding point-based time series data from NASA satellites. In this working group, we realized that WaterML 2 might not be the best solution for this type of data, for a number of reasons. Our discussion with experts from other agencies, who have worked on similar issues, identified several challenges that we would need to address. As a result, we made the recommendation to study the new TimeseriesML 1.0 standard of OGC as a potential NASA time series standard. The 2016 TSWG closely examined TimeseriesML 1.0 and, in coordination with the OGC TimeseriesML Standards Working Group, identified certain gaps in TimeseriesML 1.0 that would need to be addressed for the standard to be applicable to NASA time series data.
An engineering report was drafted based on the OGC Engineering Report template, describing recommended changes to TimeseriesML 1.0, in the form of use cases. In 2017, we are conducting interoperability experiments to implement the use cases and demonstrate the feasibility and suitability of these modifications for NASA and related user communities. The results will be incorporated into the existing draft engineering report.
NASA Astrophysics Data System (ADS)
Ritzberger, D.; Jakubek, S.
2017-09-01
In this work, a data-driven identification method, based on polynomial nonlinear autoregressive models with exogenous inputs (NARX) and the Volterra series, is proposed to describe the dynamic and nonlinear voltage and current characteristics of polymer electrolyte membrane fuel cells (PEMFCs). The structure selection and parameter estimation of the NARX model is performed on broad-band voltage/current data. By transforming the time-domain NARX model into a Volterra series representation using the harmonic probing algorithm, a frequency-domain description of the linear and nonlinear dynamics is obtained. With the Volterra kernels corresponding to different operating conditions, information from existing diagnostic tools in the frequency domain such as electrochemical impedance spectroscopy (EIS) and total harmonic distortion analysis (THDA) are effectively combined. Additionally, the time-domain NARX model can be utilized for fault detection by evaluating the difference between measured and simulated output. To increase the fault detectability, an optimization problem is introduced which maximizes this output residual to obtain proper excitation frequencies. As a possible extension, it is shown that optimizing the periodic signal shape itself further increases the fault detectability.
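Structure selection aside, fitting a polynomial NARX model reduces to linear least squares once the regressor set is fixed. A sketch with an illustrative regressor set (not the paper's selected structure) on a noise-free synthetic system:

```python
import numpy as np

def narx_fit(u, y):
    # Polynomial NARX: y[k] ~ theta . [y[k-1], u[k], u[k-1], y[k-1]**2]
    # (an illustrative regressor set; real structure selection would
    # choose terms from data).
    Y = y[1:]
    Phi = np.column_stack([y[:-1], u[1:], u[:-1], y[:-1] ** 2])
    theta, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
    return theta

rng = np.random.default_rng(4)
u = rng.normal(size=1000)                  # broad-band excitation
y = np.zeros(1000)
for k in range(1, 1000):                   # true mildly nonlinear system
    y[k] = 0.5 * y[k - 1] + 0.3 * u[k] + 0.1 * y[k - 1] ** 2
theta = narx_fit(u, y)
```

With the true terms inside the regressor set, least squares recovers the coefficients exactly; harmonic probing would then convert such a model into Volterra kernels.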
A review on battery thermal management in electric vehicle application
NASA Astrophysics Data System (ADS)
Xia, Guodong; Cao, Lei; Bi, Guanglong
2017-11-01
The global issues of energy crisis and air pollution have offered a great opportunity to develop electric vehicles. So far, however, power-battery cycle life, environmental adaptability, driving range, and charging time fall well short of the levels of traditional vehicles with internal combustion engines. Effective battery thermal management (BTM) is absolutely essential to relieve this situation. This paper reviews the existing literature at two levels: the cell level and the battery module level. For a single battery, specific attention is paid to three important processes: heat generation, heat transport, and heat dissipation. For large-format cells, multi-scale multi-dimensional coupled models have been developed. These facilitate the investigation of factors, such as local irreversible heat generation, thermal resistance, and current distribution, that account for the intrinsic temperature gradients existing in a cell. For battery modules based on air and liquid cooling, series, series-parallel, and parallel cooling configurations are discussed. Liquid cooling strategies, especially direct liquid cooling, are reviewed; they may advance battery thermal management systems to a new generation.
Molecular imaging of cannabis leaf tissue with MeV-SIMS method
NASA Astrophysics Data System (ADS)
Jenčič, Boštjan; Jeromel, Luka; Ogrinc Potočnik, Nina; Vogel-Mikuš, Katarina; Kovačec, Eva; Regvar, Marjana; Siketić, Zdravko; Vavpetič, Primož; Rupnik, Zdravko; Bučar, Klemen; Kelemen, Mitja; Kovač, Janez; Pelicon, Primož
2016-03-01
To broaden our analytical capabilities with molecular imaging in addition to the existing elemental imaging with micro-PIXE, a linear time-of-flight mass spectrometer for MeV Secondary Ion Mass Spectrometry (MeV-SIMS) was constructed and added to the existing nuclear microprobe at the Jožef Stefan Institute. We measured absolute molecular yields and the damage cross-section of reference materials, without significant alteration of the fragile biological samples during the measurements in mapping mode. We explored the analytical capability of the MeV-SIMS technique for chemical mapping of the plant tissue of medicinal cannabis leaves. A series of hand-cut plant tissue slices was prepared by a standard shock-freezing and freeze-drying protocol and deposited on a Si wafer. We present the measured MeV-SIMS spectra, with a series of peaks in the mass region of cannabinoids, as well as their corresponding maps. The indicated molecular distributions at masses of 345.5 u and 359.4 u may be attributed to the protonated THCA and THCA-C4 acids, and show enhancement in the areas with opened trichome morphology.
Analysis of Multispectral Time Series for supporting Forest Management Plans
NASA Astrophysics Data System (ADS)
Simoniello, T.; Carone, M. T.; Costantini, G.; Frattegiani, M.; Lanfredi, M.; Macchiato, M.
2010-05-01
Adequate forest management requires specific plans based on updated and detailed mapping. Multispectral satellite time series have been widely applied to forest monitoring and studies at different scales thanks to their capability of providing synoptic information on some basic parameters describing vegetation distribution and status. As a low-cost tool for supporting forest management plans in an operative context, we tested the use of Landsat-TM/ETM time series (1987-2006) in the high Agri Valley (Southern Italy) for planning field surveys as well as for the integration of existing cartography. As a preliminary step, the no-change regression normalization was applied to the time series to make all scenes radiometrically consistent; then all the available data on forest maps, municipal boundaries, water basins, rivers, and roads were overlaid in a GIS environment. From the 2006 image we derived the NDVI map and analyzed its distribution for each land cover class. To separate physiological variability and identify anomalous areas, a threshold was applied to the distributions. To label the non-homogeneous areas, a multitemporal analysis was performed, separating heterogeneity due to cover changes from that linked to basic mapping units and classification label aggregations. A map of priority areas was then produced to support the field survey plan. To analyze the territorial evolution, historical land cover maps were elaborated by adopting a hybrid classification approach based on preliminary segmentation, the identification of training areas, and a subsequent maximum likelihood categorization. This analysis was fundamental for the general assessment of territorial dynamics and in particular for evaluating the efficacy of past intervention activities.
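The no-change regression normalization step can be sketched as follows: regress the reference scene on the subject scene over assumed invariant pixels and apply the fitted gain and offset band-wide. The reflectance values, drift model, and invariant mask below are all synthetic assumptions; the NDVI helper shows the subsequent index computation:

```python
import numpy as np

def normalize_to_reference(subject, reference, invariant_mask):
    # Fit reference = a + b * subject over no-change pixels, then apply
    # the gain/offset to the whole subject band.
    b, a = np.polyfit(subject[invariant_mask], reference[invariant_mask], 1)
    return a + b * subject

def ndvi(nir, red):
    return (nir - red) / (nir + red)

rng = np.random.default_rng(5)
ref_red = rng.uniform(0.05, 0.3, 1000)       # reference-scene red band
subj_red = 0.9 * ref_red + 0.02              # later scene with linear drift
mask = np.arange(200)                        # assumed no-change pixels
norm_red = normalize_to_reference(subj_red, ref_red, mask)

ref_nir = rng.uniform(0.3, 0.6, 1000)        # vegetation: NIR above red
v = ndvi(ref_nir, ref_red)
```

After normalization, NDVI distributions from different years become directly comparable, which is what the thresholding on class distributions relies on.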
NASA Astrophysics Data System (ADS)
Heimhuber, V.; Tulbure, M. G.; Broich, M.
2017-02-01
Periodically inundated floodplain areas are hot spots of biodiversity and provide a broad range of ecosystem services but have suffered alarming declines in recent history. Despite their importance, their long-term surface water (SW) dynamics and hydroclimatic drivers remain poorly quantified on continental scales. In this study, we used a 26 year time series of Landsat-derived SW maps in combination with river flow data from 68 gauges and spatial time series of rainfall, evapotranspiration and soil moisture to statistically model SW dynamics as a function of key drivers across Australia's Murray-Darling Basin (˜1 million km2). We fitted generalized additive models for 18,521 individual modeling units made up of 10 × 10 km grid cells, each split into floodplain, floodplain-lake, and nonfloodplain area. Average goodness of fit of models was high across floodplains and floodplain-lakes (r2 > 0.65), which were primarily driven by river flow, and was lower for nonfloodplain areas (r2 > 0.24), which were primarily driven by rainfall. Local climate conditions were more relevant for SW dynamics in the northern compared to the southern basin and had the highest influence in the least regulated and most extended floodplains. We further applied the models of two contrasting floodplain areas to predict SW extents of cloud-affected time steps in the Landsat series during the large 2010 floods with high validated accuracy (r2 > 0.97). Our framework is applicable to other complex river basins across the world and enables a more detailed quantification of large floods and drivers of SW dynamics compared to existing methods.
State-space forecasting of Schistosoma haematobium time-series in Niono, Mali.
Medina, Daniel C; Findley, Sally E; Doumbia, Seydou
2008-08-13
Much of the developing world, particularly sub-Saharan Africa, exhibits high levels of morbidity and mortality associated with infectious diseases. The incidence of Schistosoma sp.-which are neglected tropical diseases exposing and infecting more than 500 and 200 million individuals in 77 countries, respectively-is rising because of 1) numerous irrigation and hydro-electric projects, 2) steady shifts from nomadic to sedentary existence, and 3) ineffective control programs. Notwithstanding the colossal scope of these parasitic infections, less than 0.5% of Schistosoma sp. investigations have attempted to predict their spatial and/or temporal distributions. Undoubtedly, public health programs in developing countries could benefit from parsimonious forecasting and early warning systems to enhance management of these parasitic diseases. In this longitudinal retrospective (01/1996-06/2004) investigation, the Schistosoma haematobium time-series for the district of Niono, Mali, was fitted with general-purpose exponential smoothing methods to generate contemporaneous on-line forecasts. These methods, which are encapsulated within a state-space framework, accommodate seasonal and inter-annual time-series fluctuations. Mean absolute percentage error values were circa 25% for 1- to 5-month horizon forecasts. The exponential smoothing state-space framework employed herein produced reasonably accurate forecasts for this time-series, which reflects the incidence of S. haematobium-induced terminal hematuria. It obliquely captured prior non-linear interactions between disease dynamics and exogenous covariates (e.g., climate, irrigation, and public health interventions), thus obviating the need for more complex forecasting methods in the district of Niono, Mali. Therefore, this framework could assist with managing and assessing S. haematobium transmission and intervention impact, respectively, in this district and potentially elsewhere in the Sahel.
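One concrete member of the exponential-smoothing state-space family is additive Holt-Winters, whose level/trend/season recursions and 1- to 5-month-ahead forecasts can be sketched on a synthetic seasonal incidence series. The smoothing constants and the series are illustrative choices, not the study's fitted model:

```python
import numpy as np

def holt_winters_additive(x, m, alpha=0.3, beta=0.05, gamma=0.2, h=5):
    # Level, trend, and additive seasonal state updated by exponential
    # smoothing; forecasts extrapolate level + trend + seasonal index.
    level, trend = x[:m].mean(), 0.0
    season = list(x[:m] - x[:m].mean())
    for t in range(m, len(x)):
        s = season[t % m]
        last_level = level
        level = alpha * (x[t] - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        season[t % m] = gamma * (x[t] - level) + (1 - gamma) * s
    return [level + (i + 1) * trend + season[(len(x) + i) % m]
            for i in range(h)]

t = np.arange(120)                                     # ten years, monthly
cases = 50 + 0.2 * t + 10 * np.sin(2 * np.pi * t / 12) # trend + seasonality
fc = holt_winters_additive(cases[:115], 12, h=5)       # 1- to 5-month horizon
actual = cases[115:120]
mape = float(np.mean(np.abs((np.array(fc) - actual) / actual)) * 100)
```

MAPE over the held-out horizon is the same accuracy measure the abstract reports (circa 25% on the real, noisy series).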
State–Space Forecasting of Schistosoma haematobium Time-Series in Niono, Mali
Medina, Daniel C.; Findley, Sally E.; Doumbia, Seydou
2008-01-01
Background Much of the developing world, particularly sub-Saharan Africa, exhibits high levels of morbidity and mortality associated with infectious diseases. The incidence of Schistosoma sp.—which are neglected tropical diseases exposing and infecting more than 500 and 200 million individuals in 77 countries, respectively—is rising because of 1) numerous irrigation and hydro-electric projects, 2) steady shifts from nomadic to sedentary existence, and 3) ineffective control programs. Notwithstanding the colossal scope of these parasitic infections, less than 0.5% of Schistosoma sp. investigations have attempted to predict their spatial and/or temporal distributions. Undoubtedly, public health programs in developing countries could benefit from parsimonious forecasting and early warning systems to enhance management of these parasitic diseases. Methodology/Principal Findings In this longitudinal retrospective (01/1996–06/2004) investigation, the Schistosoma haematobium time-series for the district of Niono, Mali, was fitted with general-purpose exponential smoothing methods to generate contemporaneous on-line forecasts. These methods, which are encapsulated within a state–space framework, accommodate seasonal and inter-annual time-series fluctuations. Mean absolute percentage error values were circa 25% for 1- to 5-month horizon forecasts. Conclusions/Significance The exponential smoothing state–space framework employed herein produced reasonably accurate forecasts for this time-series, which reflects the incidence of S. haematobium–induced terminal hematuria. It obliquely captured prior non-linear interactions between disease dynamics and exogenous covariates (e.g., climate, irrigation, and public health interventions), thus obviating the need for more complex forecasting methods in the district of Niono, Mali. Therefore, this framework could assist with managing and assessing S.
haematobium transmission and intervention impact, respectively, in this district and potentially elsewhere in the Sahel. PMID:18698361
Modeling commodity salam contract between two parties for discrete and continuous time series
NASA Astrophysics Data System (ADS)
Hisham, Azie Farhani Badrol; Jaffar, Maheran Mohd
2017-08-01
For Islamic finance to remain as competitive as conventional finance, new syariah-compliant products need to be developed, such as Islamic derivatives that can be used to manage risk. However, under syariah principles and regulations, financial instruments must not conflict with five syariah elements, which are riba (interest paid), rishwah (corruption), gharar (uncertainty or unnecessary risk), maysir (speculation or gambling) and jahl (taking advantage of the counterparty's ignorance). This study proposes a traditional Islamic contract, namely salam, that can be built into an Islamic derivative product. Although many studies have discussed and proposed the implementation of the salam contract as an Islamic product, they are mostly qualitative, focusing on legal issues. Since quantitative studies of the salam contract are lacking, this study introduces mathematical models that can value the appropriate salam price for a commodity salam contract between two parties. In modeling the commodity salam contract, this study modifies the existing conventional derivative model, with adjustments to comply with syariah rules and regulations. The cost of carry model is chosen as the foundation for developing the commodity salam model between two parties for discrete and continuous time series. However, the conventional time value of money rests on the concept of interest, which is prohibited in Islam. Therefore, this study adopts the idea of an Islamic time value of money, known as positive time preference, in modeling the commodity salam contract between two parties for discrete and continuous time series.
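The cost-of-carry foundation gives deferred-delivery prices under discrete and continuous compounding; a sketch follows, with `rate` read as the positive-time-preference rate rather than an interest rate. The function names and the storage-cost term are illustrative, not the paper's notation:

```python
import math

def salam_price_discrete(spot, rate, periods, storage=0.0):
    # Discrete-time cost-of-carry for the deferred-delivery price,
    # compounded once per period.
    return (spot + storage) * (1 + rate) ** periods

def salam_price_continuous(spot, rate, T, storage=0.0):
    # Continuous-time analogue of the same carry relationship.
    return (spot + storage) * math.exp(rate * T)

p_d = salam_price_discrete(100.0, 0.04, 2)       # two periods at 4%
p_c = salam_price_continuous(100.0, 0.04, 2.0)
```

Continuous compounding always prices slightly above the discrete case at the same rate, since `exp(rT) > (1+r)**T` for `r > 0`.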
NASA Astrophysics Data System (ADS)
Varotsos, Costas A.; Efstathiou, Maria N.
2017-05-01
A substantial weakness of several climate studies on long-range dependence is the conclusion of long-term memory of climate conditions without considering it necessary to establish the power-law scaling and to reject a simple exponential decay of the autocorrelation function. We show here one paradigmatic case where a strong long-range dependence could be wrongly inferred from incomplete data analysis. We first apply the DFA method to the solar and volcanic forcing time series over the tropical Pacific during the past 1000 years; the results show that a statistically significant straight-line fit to the fluctuation function in a log-log representation is obtained with a slope higher than 0.5, which may wrongly be taken as an indication of persistent long-range correlations in the time series. We argue that long-range dependence cannot be concluded from this straight-line fit alone: it requires the fulfilment of two additional prerequisites, i.e., rejecting an exponential decay of the autocorrelation function and establishing the power-law scaling. In fact, investigating the validity of these prerequisites showed that a DFA exponent higher than 0.5 does not justify the existence of persistent long-range correlations in the temporal evolution of solar and volcanic forcing during the last millennium. In other words, we show that empirical analyses based on these two prerequisites must not be considered a panacea for a direct proof of scaling, but only as evidence that the scaling hypothesis is plausible. We also discuss the scaling behaviour of the solar and volcanic forcing data based on the Haar tool, which has recently proved its ability to reliably detect the existence of scaling in climate series.
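The two prerequisites can be checked directly by comparing straight-line fits of log C(k) against k (exponential decay) and against log k (power law). For the exact autocorrelation of an AR(1) process, a short-memory benchmark chosen here for illustration, the exponential fit wins even though the log-log fit still looks deceptively straight:

```python
import numpy as np

def r2_line(u, v):
    # R^2 of a straight-line least-squares fit of v on u.
    res = v - np.polyval(np.polyfit(u, v, 1), u)
    return 1 - res.var() / v.var()

# Exact ACF of an AR(1) process, C(k) = 0.8**k: short memory, so the
# decay is exponential, not a power law.
lags = np.arange(1, 13)
c = 0.8 ** lags

r2_exp = r2_line(lags, np.log(c))           # log C vs k     (exponential)
r2_pow = r2_line(np.log(lags), np.log(c))   # log C vs log k (power law)
```

The log-log R² is high enough to fool a casual inspection, which is exactly why a DFA slope above 0.5 alone does not establish long-range dependence.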
The acute effect of a plyometric stimulus on jump performance in professional rugby players.
Tobin, Daniel P; Delahunt, Eamonn
2014-02-01
Post-activation potentiation (PAP) is the elevation of motor performance to a higher level in response to a conditioning stimulus. Extensive research exists examining the PAP effect after a heavy resistance exercise. However, there is limited research examining the PAP effect after a plyometric stimulus. This study was designed to examine whether a plyometric stimulus could produce a PAP effect comparable to that typically reported with a heavy resistance protocol. Importantly, it was hypothesized that the PAP effect would exist without the same levels of acute fatigue resulting from a heavy stimulus, thus allowing improvement in performance within a short rest interval range. Twenty professional rugby players were recruited for the study. Subjects performed 2 countermovement jumps (CMJs) at baseline and at 1, 3, and 5 minutes after a plyometric stimulus consisting of 40 jumps. Two separate 1-way repeated-measures analyses of variance were conducted to compare the dependent variables CMJ height and peak force at the 4 time points. Results of the Bonferroni-adjusted pairwise comparisons indicated that jump height and peak force before the plyometric exercises were significantly lower than at all other time points (p < 0.01). The main finding of this study indicates that a series of plyometric exercises causes a significant acute enhancement in CMJ height (p < 0.01) and peak force (p < 0.01) throughout the rest interval range of 1-5 minutes. The plyometric series induced an improvement in CMJ height comparable to that reported elsewhere after a heavy lifting stimulus but without the need for a prolonged rest interval. Performing repeated series of plyometric jumps appears to be an efficient method of taking advantage of the PAP phenomenon, thus possibly eliminating the need for a complex training protocol.
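The Bonferroni-adjusted pairwise comparison can be sketched with a paired t statistic on hypothetical jump heights; the effect size and variability below are assumptions for illustration, not the study's data:

```python
import numpy as np

def paired_t(a, b):
    # Paired t statistic (scipy.stats.ttest_rel would also return the
    # p-value; kept numpy-only here).
    d = np.asarray(a) - np.asarray(b)
    return d.mean() / (d.std(ddof=1) / np.sqrt(len(d)))

# Hypothetical CMJ heights (cm) for 20 players: baseline vs 3 min after
# the plyometric stimulus; Bonferroni across the 3 post-stimulus points.
rng = np.random.default_rng(7)
baseline = rng.normal(40.0, 3.0, 20)
post3 = baseline + rng.normal(1.5, 0.8, 20)   # assumed PAP gain ~1.5 cm
t_stat = paired_t(post3, baseline)
alpha_adj = 0.05 / 3                          # Bonferroni-adjusted alpha
```

Pairing removes between-player variability, which is why the within-subject design can detect a ~1.5 cm change against a 3 cm between-player spread.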
InSAR Time Series Analysis of Dextral Strain Partitioning Across the Burma Plate
NASA Astrophysics Data System (ADS)
Reitman, N. G.; Wang, Y.; Lin, N.; Lindsey, E. O.; Mueller, K. J.
2017-12-01
Oblique convergence between the India and Sunda plates creates partitioning of strike-slip and compressional strain across the Burma plate. GPS data indicate up to 40 mm/yr (Steckler et al. 2016) of dextral strain between the India and Sunda plates. The Sagaing fault in Myanmar accommodates 20 mm/yr at the eastern boundary of the Burma plate, but the location and magnitude of dextral strain on other faults remain an open question, as does the relative importance of seismic vs. aseismic processes. The remaining 20 mm/yr of dextral strain may be accommodated on one or two faults or widely distributed on faults across the Burma plate, scenarios that have a major impact on seismic hazard. However, the dense GPS data necessary for precise determination of which faults accommodate how much strain do not yet exist. Previous studies using GPS data ascribe 10-18 mm/yr of dextral strain to the Churachandpur Mao fault in India (Gahaluat et al. 2013, Steckler et al. 2016) and 18-22 mm/yr to the northern Sagaing fault (Maurin et al. 2010, Steckler et al. 2016), leaving up to 10 mm/yr unconstrained. Several of the GPS results are suggestive of shallow aseismic slip along parts of these faults, which, if confirmed, would have a significant impact on our understanding of hazard in the area. Here, we use differential InSAR analyzed in time series to investigate dextral strain on the Churachandpur Mao fault and across the Burma plate. Ascending ALOS-1 imagery spanning 2007-2010 was processed in time series for three locations. Offsets in phase and a strong gradient in line-of-sight deformation rate are observed across the Churachandpur Mao fault, and work is ongoing to determine whether these are produced by shallow fault movement, topographic effects, or both. The results of this study will provide further constraints on the strain rate on the Churachandpur Mao fault and yield a more complete understanding of strain partitioning across the Burma plate.
Space Fission Propulsion Testing and Development Progress. Phase 1
NASA Technical Reports Server (NTRS)
VanDyke, Melissa; Houts, Mike; Pedersen, Kevin; Godfroy, Tom; Dickens, Ricky; Poston, David; Reid, Bob; Salvail, Pat; Ring, Peter; Rodgers, Stephen L. (Technical Monitor)
2001-01-01
Successful development of space fission systems will require an extensive program of affordable and realistic testing. In addition to tests related to design/development of the fission system, realistic testing of the actual flight unit must also be performed. Testing can be divided into two categories, non-nuclear tests and nuclear tests. Full power nuclear tests of space fission systems are expensive, time consuming, and of limited use, even in the best of programmatic environments. If the system is designed to operate within established radiation damage and fuel burn up limits while simultaneously being designed to allow close simulation of heat from fission using resistance heaters, high confidence in fission system performance and lifetime can be attained through a series of non-nuclear tests. Non-nuclear tests are affordable and timely, and the cause of component and system failures can be quickly and accurately identified. MSFC is leading a Safe Affordable Fission Engine (SAFE) test series whose ultimate goal is the demonstration of a 300 kW flight configuration system using non-nuclear testing. This test series is carried out in collaboration with other NASA centers, other government agencies, industry, and universities. If SAFE-related nuclear tests are desired they will have a high probability of success and can be performed at existing nuclear facilities. The paper describes the SAFE non-nuclear test series, which includes test article descriptions, test results and conclusions, and future test plans.
A Smoothing Technique for the Multifractal Analysis of a Medium Voltage Feeders Electric Current
NASA Astrophysics Data System (ADS)
de Santis, Enrico; Sadeghian, Alireza; Rizzi, Antonello
2017-12-01
The current paper presents a data-driven detrending technique that smooths complex sinusoidal trends from a real-world electric load time series before applying Multifractal Detrended Fluctuation Analysis (MFDFA). The algorithm, which we call Smoothed Sort and Cut Fourier Detrending (SSC-FD), is based on a suitable smoothing of high-power periodicities operating directly in the Fourier spectrum, through a polynomial fitting of the DFT. The main aim is to disambiguate the characteristic slowly varying periodicities, which can impair the MFDFA analysis, from the residual signal, in order to study its correlation properties. The algorithm's performance is evaluated on a simple benchmark consisting of a persistent series with known Hurst exponent, with ten sinusoidal harmonics superimposed. Moreover, the behavior of the algorithm parameters is assessed by computing the MFDFA on the well-known sunspot data, whose correlation characteristics are reported in the literature. In both cases, the SSC-FD method eliminates the apparent crossover induced by the synthetic and natural periodicities. Results are compared with some existing detrending methods within the MFDFA paradigm. Finally, a study of the multifractal characteristics of the electric load time series detrended by the SSC-FD algorithm is provided, showing strongly persistent behavior and an appreciable width of the multifractal spectrum, which allows us to conclude that the series at hand has multifractal characteristics.
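As a much-simplified illustration of spectral detrending (not the SSC-FD algorithm itself, which smooths high-power periodicities through a polynomial fit of the DFT), the sketch below simply zeroes the strongest DFT bin and inverse-transforms. The series, test frequency, and noise level are invented for the example.

```python
import cmath
import math
import random

def remove_strong_harmonics(x, k=1):
    """Zero the k largest-magnitude positive-frequency DFT bins (and
    their conjugates) and inverse-transform. A toy stand-in for SSC-FD,
    which instead attenuates the spectrum via a polynomial fit."""
    n = len(x)
    X = [sum(x[t] * cmath.exp(-2j * math.pi * f * t / n) for t in range(n))
         for f in range(n)]
    order = sorted(range(1, n // 2), key=lambda f: -abs(X[f]))
    for f in order[:k]:
        X[f] = X[n - f] = 0           # remove bin and its conjugate
    return [(sum(X[f] * cmath.exp(2j * math.pi * f * t / n)
                 for f in range(n)) / n).real for t in range(n)]

random.seed(0)
n = 256
noise = [random.gauss(0, 0.1) for _ in range(n)]
series = [math.sin(2 * math.pi * 8 * t / n) + noise[t] for t in range(n)]
residual = remove_strong_harmonics(series, k=1)  # sinusoid removed, noise kept
```

After removal, the residual is essentially the noise component, whose correlation properties could then be studied, mirroring the aim stated in the abstract.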
Phase 1 space fission propulsion system testing and development progress
NASA Astrophysics Data System (ADS)
van Dyke, Melissa; Houts, Mike; Pedersen, Kevin; Godfroy, Tom; Dickens, Ricky; Poston, David; Reid, Bob; Salvail, Pat; Ring, Peter
2001-02-01
Successful development of space fission systems will require an extensive program of affordable and realistic testing. In addition to tests related to design/development of the fission system, realistic testing of the actual flight unit must also be performed. Testing can be divided into two categories, non-nuclear tests and nuclear tests. Full power nuclear tests of space fission systems are expensive, time consuming, and of limited use, even in the best of programmatic environments. If the system is designed to operate within established radiation damage and fuel burn up limits while simultaneously being designed to allow close simulation of heat from fission using resistance heaters, high confidence in fission system performance and lifetime can be attained through a series of non-nuclear tests. Non-nuclear tests are affordable and timely, and the cause of component and system failures can be quickly and accurately identified. MSFC is leading a Safe Affordable Fission Engine (SAFE) test series whose ultimate goal is the demonstration of a 300 kW flight configuration system using non-nuclear testing. This test series is carried out in collaboration with other NASA centers, other government agencies, industry, and universities. If SAFE-related nuclear tests are desired, they will have a high probability of success and can be performed at existing nuclear facilities. The paper describes the SAFE non-nuclear test series, which includes test article descriptions, test results and conclusions, and future test plans.
NASA Astrophysics Data System (ADS)
Demetrescu, C.; Dobrica, V.; Stefan, C.
2017-12-01
A rich scientific literature links length-of-day (LOD) fluctuations to geomagnetic field and flow oscillations in the fluid outer core. We demonstrate that the temporal evolution of the geomagnetic field shows several oscillations at decadal, inter-decadal, and sub-centennial time scales that superimpose on a so-called inter-centennial constituent. We show that while the sub-centennial oscillations of the geomagnetic field, produced by torsional oscillations in the core, could be linked to oscillations of LOD at a similar time scale, the oscillations at decadal and sub-decadal time scales, of external origin, can be found in LOD too. We discuss these issues from the perspective of long time-span main field models (gufm1 - Jackson et al., 2000; COV-OBS - Gillet et al., 2013) that are used to retrieve time series of geomagnetic elements on a 2.5°x2.5° network. The decadal and sub-decadal constituents of the time series of annual values of LOD and the geomagnetic field were separated as the cyclic component of a Hodrick-Prescott filtering applied to the data, and shown to correlate strongly with variations of external sources such as the magnetospheric ring current.
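The Hodrick-Prescott filtering used above to isolate a cyclic component can be sketched as a dense linear solve. This is an illustrative implementation only (production code would use a banded or sparse solver), and the smoothing parameter and test series are invented.

```python
def hp_filter(y, lam=100.0):
    """Hodrick-Prescott decomposition y = trend + cycle. The trend tau
    minimizes sum((y - tau)^2) + lam * sum(second differences of tau)^2,
    i.e. solves (I + lam * D'D) tau = y, with D the second-difference
    matrix. Dense Gauss-Jordan solve for illustration only."""
    n = len(y)
    A = [[float(i == j) for j in range(n)] for i in range(n)]
    for r in range(n - 2):            # accumulate lam * D'D
        d = {r: 1.0, r + 1: -2.0, r + 2: 1.0}
        for i, di in d.items():
            for j, dj in d.items():
                A[i][j] += lam * di * dj
    M = [row[:] + [y[i]] for i, row in enumerate(A)]
    for c in range(n):                # Gauss-Jordan (A is SPD, no pivoting)
        p = M[c][c]
        for j in range(c, n + 1):
            M[c][j] /= p
        for r in range(n):
            if r != c and M[r][c] != 0.0:
                f = M[r][c]
                for j in range(c, n + 1):
                    M[r][j] -= f * M[c][j]
    tau = [M[i][n] for i in range(n)]
    cycle = [yi - ti for yi, ti in zip(y, tau)]
    return tau, cycle

# A purely linear series has zero second differences, so it is returned
# exactly as the trend and the cyclic component vanishes.
trend, cycle = hp_filter([2.0 + 0.5 * i for i in range(60)])
```

In the study above, the cyclic component plays the role of the decadal and sub-decadal constituents separated from the slow secular variation.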
Developing NDE Techniques for Large Cryogenic Tanks
NASA Technical Reports Server (NTRS)
Parker, Don; Starr, Stan
2009-01-01
The Shuttle and Constellation Programs require very large cryogenic ground storage tanks in which to store liquid oxygen and hydrogen. The existing LC-39 pad tanks, which will be passed on to Constellation, are 40 years old and have received minimal refurbishment or even inspection, because they can only be temperature cycled a few times before being overhauled (a costly operation in both time and dollars). Numerous questions exist about the performance and reliability of these old tanks, which could cause a major Program schedule disruption. Consequently, with the first two tanks passing to Constellation this year, there is growing awareness that NDE is needed to detect problems in these tanks early, so that corrective actions can be scheduled when least disruptive. Time series thermal images of two sides of the Pad B LH2 tank have been taken over multiple days to demonstrate the effects of environmental conditions on the solar heating of the tank and therefore the effectiveness of thermal imaging.
The whole earth telescope - A new astronomical instrument
NASA Technical Reports Server (NTRS)
Nather, R. E.; Winget, D. E.; Clemens, J. C.; Hansen, C. J.; Hine, B. P.
1990-01-01
A new multimirror ground-based telescope for time-series photometry of rapid variable stars, designed to minimize or eliminate gaps in the brightness record caused by the rotation of the earth, is described. A sequence of existing telescopes distributed in longitude, coordinated from a single control center, is used to measure designated target stars so long as they are in darkness. Data are returned by electronic mail to the control center, where they are analyzed in real time. This instrument is the first to provide data of continuity and quality that permit true high-resolution power spectroscopy of pulsating white dwarf stars.
Mechanical behaviour of cerclage material consisting of silicon rubber.
Hinrichsen, G; Eberhardt, A; Springer, H
1979-09-01
Silicon rubber specimens of circular or rectangular cross-section (cross-section area between ca. 2 and 7 mm2) are used as cerclage bands. A series of commercial cerclage elements was investigated for mechanical characteristics, such as stress-strain behaviour and modulus of elasticity, using a tensile-testing machine. Large differences in these properties exist among the various specimens. Moreover, time-dependent effects, such as stress-relaxation, retardation, and creep, were analysed by the present investigations. One has to take into consideration that the initial length and stress of the cerclage band vary significantly with time after the operation.
Investigation of Noises in GPS Time Series: Case Study on Epn Weekly Solutions
NASA Astrophysics Data System (ADS)
Klos, Anna; Bogusz, Janusz; Figurski, Mariusz; Kosek, Wieslaw; Gruszczynski, Maciej
2014-05-01
Noise in GPS time series is stated to be best described by a combination of white (Gaussian) and power-law processes. It is mainly the effect of mismodelled satellite orbits, Earth orientation parameters, atmospheric effects, antenna phase centre effects, or monument instability. Because the velocities of permanent stations define the kinematic reference frame, they must fulfil the requirement of being stable at 0.1 mm/yr. Previous research has shown that assuming the wrong noise model leads to underestimation of velocities and their uncertainties by a factor of 2 up to even 11, especially in the Up direction. This presentation focuses on more than 200 EPN (EUREF Permanent Network) stations from the area of Europe with various monument types (concrete pillars, buildings, metal masts, with or without domes, placed on the ground or on rock) and weekly coordinate solutions (GPS weeks 0834-1459). The topocentric components (North, East, Up) in ITRF2005, which come from the EPN Re-Processing made by the Military University of Technology Local Analysis Centre (MUT LAC), were processed with Maximum Likelihood Estimation (MLE) using the CATS software. We assumed several combinations of noise models (white, flicker, and random-walk noise with integer spectral indices, and power-law noise models with fractional spectral indices) and investigated which of them the EPN weekly time series are likely to follow. The results show that noise in GPS time series is best described by a combination of white and flicker noise. This is strictly related to the so-called common mode error (CME), a spatially correlated error that is one of the dominant error sources in GPS solutions. We assumed the CME to be spatially uniform, which is a good approximation for stations located hundreds of kilometres from one another.
Its removal with spatial filtering reduces the amplitudes of white and flicker noise by a factor of 2 or 3. The assumption of white plus flicker plus random-walk noise (the latter considered to be the effect of badly monumented stations) resulted in random-walk amplitudes at the level of single millimetres for some of the stations, while for the majority of them no random walk was detected, because flicker noise prevails in GPS time series. The removal of the CME decreased the flicker noise amplitudes while leading to greater random-walk amplitudes. The assumed combination of white plus power-law noise showed that the spectral indices of the best-fitted noise model are unevenly distributed around -1, which also indicates the existence of flicker noise in the EPN weekly time series. The poster will present all of the assumed noise model combinations, comparing noise amplitudes before and after spatial filtering. Additionally, we will discuss the latitude and longitude dependencies of the noise for the area of Europe, to indicate any similarities between noise amplitudes and station location. Finally, we will focus on the velocities and their uncertainties determined from the EPN weekly solutions and show how a wrong assumption about the noise model changes both.
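The power-law noise families above can be told apart, crudely, by the slope of a straight-line fit to the log-log periodogram (the study itself uses the far more reliable MLE approach in CATS). The sketch below uses synthetic series: white noise should give a spectral index near 0, a random walk a strongly negative one (about -2 at low frequencies), with flicker noise near -1 in between.

```python
import cmath
import math
import random

def spectral_index(x):
    """Estimate the power-law spectral index kappa (P(f) ~ f^kappa)
    from a log-log straight-line fit to the periodogram."""
    n = len(x)
    lf, lp = [], []
    for f in range(1, n // 2):
        X = sum(x[t] * cmath.exp(-2j * math.pi * f * t / n) for t in range(n))
        lf.append(math.log(f / n))
        lp.append(math.log(abs(X) ** 2 / n))
    m = len(lf)
    mf, mp = sum(lf) / m, sum(lp) / m
    return (sum((a - mf) * (b - mp) for a, b in zip(lf, lp))
            / sum((a - mf) ** 2 for a in lf))

random.seed(1)
white = [random.gauss(0, 1) for _ in range(512)]
walk, acc = [], 0.0
for w in white:                 # integrate white noise -> random walk
    acc += w
    walk.append(acc)
kappa_white = spectral_index(white)   # near 0
kappa_walk = spectral_index(walk)     # strongly negative
```

This periodogram-slope estimator is biased for short series, one reason the maximum-likelihood analysis is preferred in the work described above.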
Socolovsky, Mariano; Paez, Miguel Domínguez
2013-11-27
A wide range of results have appeared in the literature for intercostal nerve transfers in brachial plexus patients. Asian countries generally have a lower mean body mass index (BMI) than Western countries. We analyzed published series of intercostal nerve transfers for elbow reinnervation to determine whether a difference in outcomes exists between Eastern and Western series that could be inversely related to BMI. A PubMed search was conducted. Inclusion criteria were: (1) time from trauma to surgery <12 months, (2) minimum follow-up of one year, (3) intercostal to musculocutaneous nerve transfer as the only surgical procedure performed to reestablish elbow flexion, and (4) males comprising more than 75% of cases. Two groups were created: series from Western countries, including America, Europe, and Africa; and series from Asia. Pearson correlation analysis was performed to assess the degree of correlation between percent responders and mean national BMI. A total of 26 series were included, 14 from Western countries and 12 from Eastern countries, encompassing a total of 274 and 432 surgical cases, respectively. The two groups were almost identical in mean age, but quite different in mean national BMI (26.3 vs. 22.5) and in the percentage of patients who achieved at least Medical Research Council (MRC) level 3 (59.5% vs. 79.3%). Time from trauma to surgery was slightly shorter in Eastern (3.4 months) than in Western countries (5.0 months). The percentage of responders to intercostal to musculocutaneous nerve transfer was inversely correlated with the mean national BMI among male residents of the country where the series was performed.
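The correlation analysis reported above amounts to a Pearson coefficient between per-series responder percentages and mean national BMI. A minimal sketch follows; the numbers are hypothetical, not the study's data.

```python
def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x)
           * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# Hypothetical mean national BMI vs. percent of MRC >= 3 responders.
bmi = [22.0, 22.5, 23.0, 26.0, 26.5, 27.0]
responders = [82.0, 78.0, 80.0, 61.0, 58.0, 60.0]
r = pearson_r(bmi, responders)   # strongly negative, as in the study
```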
Advanced methods for modeling water-levels and estimating drawdowns with SeriesSEE, an Excel add-in
Halford, Keith; Garcia, C. Amanda; Fenelon, Joe; Mirus, Benjamin B.
2012-12-21
Water-level modeling is used for multiple-well aquifer tests to reliably differentiate pumping responses from natural water-level changes in wells, or “environmental fluctuations.” Synthetic water levels are created during water-level modeling and represent the summation of multiple component fluctuations, including those caused by environmental forcing and pumping. Pumping signals are modeled by transforming step-wise pumping records into water-level changes by using superimposed Theis functions. Water levels can be modeled robustly with this Theis-transform approach because environmental fluctuations and pumping signals are simulated simultaneously. Water-level modeling with Theis transforms has been implemented in the program SeriesSEE, which is a Microsoft® Excel add-in. Moving average, Theis, pneumatic-lag, and gamma functions transform time series of measured values into water-level model components in SeriesSEE. Earth tides and step transforms are additional computed water-level model components. Water-level models are calibrated by minimizing a sum-of-squares objective function where singular value decomposition and Tikhonov regularization stabilize results. Drawdown estimates from a water-level model are the summation of all Theis transforms minus residual differences between synthetic and measured water levels. The accuracy of drawdown estimates is limited primarily by noise in the data sets, not the Theis-transform approach. Drawdowns much smaller than environmental fluctuations have been detected across major fault structures, at distances of more than 1 mile from the pumping well, and with limited pre-pumping and recovery data at sites across the United States. In addition to water-level modeling, utilities exist in SeriesSEE for viewing, cleaning, manipulating, and analyzing time-series data.
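The Theis-transform idea (superposing a Theis response for each step change in pumping rate) can be sketched directly. This is a generic illustration, not SeriesSEE itself: the well function uses the standard series expansion of the exponential integral, and all parameter values are invented.

```python
import math

def well_function(u, terms=30):
    """Theis well function W(u) = E1(u) via its series expansion,
    adequate for the small u typical of pumping tests."""
    s = -0.5772156649 - math.log(u)   # Euler-Mascheroni constant
    term = 1.0
    for k in range(1, terms + 1):
        term *= -u / k                # term = (-u)^k / k!
        s -= term / k                 # adds (-1)^(k+1) u^k / (k * k!)
    return s

def drawdown(t, pump_steps, T, S, r):
    """Drawdown at time t (days) and radius r (m) from a step-wise
    pumping record, by superposing a Theis response per rate change.
    pump_steps: list of (start_time_days, rate_m3_per_day);
    T: transmissivity (m^2/day), S: storativity (dimensionless)."""
    total, prev_q = 0.0, 0.0
    for t0, q in pump_steps:
        if t <= t0:
            break
        dq = q - prev_q
        u = r * r * S / (4.0 * T * (t - t0))
        total += dq / (4.0 * math.pi * T) * well_function(u)
        prev_q = q
    return total

steps = [(0.0, 100.0), (1.0, 0.0)]   # pump 100 m^3/day for one day, then stop
d_pumping = drawdown(0.9, steps, T=100.0, S=1e-4, r=50.0)
d_recovery = drawdown(5.0, steps, T=100.0, S=1e-4, r=50.0)
```

During pumping the drawdown builds up; after shutoff the negative rate change superposes a recovery response, so residual drawdown decays toward zero, exactly the behavior the superposition approach exploits.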
NASA Astrophysics Data System (ADS)
Wang, Fang
2016-06-01
In order to detect and quantify asymmetry between two time series, a novel cross-correlation coefficient is proposed based on the recent asymmetric detrended cross-correlation analysis (A-DXA), which we call the A-DXA coefficient. The A-DXA coefficient, an important extension of the DXA coefficient ρ_DXA, contains two directional asymmetric cross-correlation indexes, describing upward and downward asymmetric cross-correlations, respectively. By using the directional covariance function of the two time series and the directional variance function of each series itself, instead of the power law between the covariance function and the time scale, the proposed A-DXA coefficient can detect asymmetry between the two series whether or not the cross-correlation is significant. Applying the proposed A-DXA coefficient to the California electricity market, we found that the asymmetry between prices and loads is not significant for daily average data in the 1999 market (before the electricity crisis) but extremely significant in the 2000 market (during the crisis). To further uncover the difference in asymmetry between 1999 and 2000, a modified H statistic (MH) and a ΔMH statistic are proposed. One contribution of the present work is showing that high MH values for hourly data occur in the majority of months in the 2000 market. Another important conclusion is that downward cross-correlation dominates throughout 1999, whereas upward cross-correlation dominates in 2000.
Representations of time coordinates in FITS. Time and relative dimension in space
NASA Astrophysics Data System (ADS)
Rots, Arnold H.; Bunclark, Peter S.; Calabretta, Mark R.; Allen, Steven L.; Manchester, Richard N.; Thompson, William T.
2015-02-01
Context. In a series of three previous papers, formulation and specifics of the representation of world coordinate transformations in FITS data have been presented. This fourth paper deals with encoding time. Aims: Time on all scales and precisions known in astronomical datasets is to be described in an unambiguous, complete, and self-consistent manner. Methods: Employing the well-established World Coordinate System (WCS) framework, and maintaining compatibility with the FITS conventions that are currently in use to specify time, the standard is extended to describe rigorously the time coordinate. Results: World coordinate functions are defined for temporal axes sampled linearly and as specified by a lookup table. The resulting standard is consistent with the existing FITS WCS standards and specifies a metadata set that achieves the aims enunciated above.
Rainfall height stochastic modelling as a support tool for landslides early warning
NASA Astrophysics Data System (ADS)
Capparelli, G.; Giorgio, M.; Greco, R.; Versace, P.
2009-04-01
Occurrence of landslides is difficult to predict, since it is affected by a number of variables, such as mechanical and hydraulic soil properties, slope morphology, vegetation coverage, and rainfall spatial and temporal variability. Although heavy landslides occurred frequently in Campania, southern Italy, during the last decade, no complete data sets are available for the natural slopes where landslides occurred. As a consequence, landslide risk assessment procedures and early warning systems in Campania still rely on simple empirical models based on correlation between daily rainfall records and observed landslides, like the FLAIR model [Versace et al., 2003]. The effectiveness of such systems could be improved by reliable quantitative rainfall prediction. In mountainous areas, rainfall spatial and temporal variability is very pronounced due to orographic effects, making predictions even more complicated. Existing rain gauge networks are not dense enough to resolve the small-scale spatial variability, and the same limitation of spatial resolution affects rainfall height maps provided by radar sensors as well as by physically based meteorological models. Therefore, analysis of on-site recorded rainfall height time series still represents the most effective approach for a reliable prediction of the local temporal evolution of rainfall. Hydrological time series analysis is a widely studied field in hydrology, often carried out by means of autoregressive models, such as AR and ARMA [Box and Jenkins, 1976]. Sometimes exogenous information coming from additional series of observations is also taken into account, and the models are called ARX and ARMAX (e.g. Salas [1992]). Such models give their best results when applied to the analysis of autocorrelated hydrological time series, like river flow or level time series.
Conversely, they are not able to model the behaviour of intermittent time series, as point rainfall height series usually are, especially when recorded with short sampling time intervals. More useful for this purpose are the so-called DRIP (Disaggregated Rectangular Intensity Pulse) and NSRP (Neyman-Scott Rectangular Pulse) models [Heneker et al., 2001; Cowpertwait et al., 2002], usually adopted to generate synthetic point rainfall series. In this paper, the DRIP model approach is adopted in conjunction with the FLAIR model to calculate the probability of flowslide occurrence. The final aim of the study is in fact to provide a useful tool for implementing an early warning system for hydrogeological risk management. Model calibration has been carried out with hourly rainfall height data provided by the rain gauges of the Campania Region civil protection agency meteorological warning network. So far, the model has been applied only to data series recorded at a single rain gauge. Future extensions will deal with spatial correlation between time series recorded at different gauges. ACKNOWLEDGEMENTS The research was co-financed by the Italian Ministry of University, by means of the PRIN 2006 program, within the research project entitled 'Definition of critical rainfall thresholds for destructive landslides for civil protection purposes'. REFERENCES Box, G.E.P. and Jenkins, G.M., 1976. Time Series Analysis: Forecasting and Control, Holden-Day, San Francisco. Cowpertwait, P.S.P., Kilsby, C.G. and O'Connell, P.E., 2002. A space-time Neyman-Scott model of rainfall: Empirical analysis of extremes, Water Resources Research, 38(8):1-14. Salas, J.D., 1992. Analysis and modeling of hydrological time series, in D.R. Maidment, ed., Handbook of Hydrology, McGraw-Hill, New York. Heneker, T.M., Lambert, M.F. and Kuczera, G., 2001. A point rainfall model for risk-based design, Journal of Hydrology, 247(1-2):54-71. Versace, P., Sirangelo, B. and Capparelli, G., 2003.
Forewarning model of landslides triggered by rainfall. Proc. 3rd International Conference on Debris-Flow Hazards Mitigation: Mechanics, Prediction and Assessment, Davos.
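The autoregressive models discussed above can be illustrated with their simplest member, an AR(1) fitted by Yule-Walker. This is a generic sketch on a synthetic autocorrelated series, not the paper's rainfall data, which, being intermittent, is precisely what motivates the pulse models instead.

```python
import random

def fit_ar1(x):
    """Yule-Walker estimate for x_t = c + phi * x_{t-1} + e_t:
    phi is the lag-1 autocorrelation, c = mean * (1 - phi)."""
    n = len(x)
    mean = sum(x) / n
    num = sum((x[t] - mean) * (x[t - 1] - mean) for t in range(1, n))
    den = sum((v - mean) ** 2 for v in x)
    phi = num / den
    return mean * (1.0 - phi), phi

# Synthetic AR(1) series with known phi = 0.8.
random.seed(3)
x, prev = [], 0.0
for _ in range(5000):
    prev = 0.8 * prev + random.gauss(0, 1)
    x.append(prev)
c, phi = fit_ar1(x)           # phi recovered close to 0.8
forecast = c + phi * x[-1]    # one-step-ahead forecast
```

For a well-autocorrelated series like river flow the forecast is informative; for intermittent point rainfall, runs of zeros break the AR assumption, as the abstract notes.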
Inner shelf morphologic controls on the dynamics of the beach and bar system, Fire Island, New York
Hapke, Cheryl J.; Schwab, William C.; Gayes, Paul T.; McCoy, Clay; Viso, Richard; Lentz, Erika E.; Rosati, Julie D.; Wang, Ping; Roberts, Tiffany M.
2011-01-01
The mechanism of sediment exchange between offshore sand ridges and the beach at Fire Island, New York is largely unknown. However, recent evidence from repeat nearshore bathymetry surveys, coupled with the complex but consistent bar morphology and patterns of shoreline change demonstrate that there is a feedback occurring between the regional geologic framework and modern processes. Analysis of bathymetric survey data provides direct confirmation that the offshore ridges are connected to the shoreface and are spatially persistent. The fixed nature of the nearshore morphology is further supported by time series camera data that indicate persistent bars with breaks that re-form in the same locations. A long-term time series of shoreline change shows distinct zones of erosion and accretion that are pervasive over time scales greater than a half-century, and their length-scales are similar to the spacing of the offshore ridge-trough system. The first-order geologic framework is responsible for the existence and locations of the ridges and troughs, which then influence the morphodynamics of the beach and bar system.
Fluctuations and Noise in Stochastic Spread of Respiratory Infection Epidemics in Social Networks
NASA Astrophysics Data System (ADS)
Yulmetyev, Renat; Emelyanova, Natalya; Demin, Sergey; Gafarov, Fail; Hänggi, Peter; Yulmetyeva, Dinara
2003-05-01
For the analysis of the complexity of epidemic and disease dynamics, it is necessary to understand the basic principles and notions of spreading in media with long-time memory. Here we consider the problem from theoretical and practical viewpoints, presenting quantitative evidence confirming the existence of stochastic long-range memory and robust chaos in a real time series of infections of the human upper respiratory tract. We present a new statistical method for analyzing the epidemic spread of influenza (grippe) and acute upper respiratory tract infections by means of the theory of discrete non-Markov stochastic processes. We use the results of our recent theory (Phys. Rev. E 65, 046107 (2002)) to study statistical memory effects in real data series describing the epidemic dynamics of human acute respiratory tract infections and influenza. The obtained results demonstrate the possibility of a strict quantitative description of the regular and stochastic components of the epidemic dynamics of social networks, taking into account time discreteness and the effects of statistical memory.
Predicting hepatitis B monthly incidence rates using weighted Markov chains and time series methods.
Shahdoust, Maryam; Sadeghifar, Majid; Poorolajal, Jalal; Javanrooh, Niloofar; Amini, Payam
2015-01-01
Hepatitis B (HB) is a major cause of mortality worldwide. Accurately predicting the trend of the disease can provide an appropriate basis for health policy on disease prevention. This paper aimed to apply three different methods to predict monthly incidence rates of HB. This historical cohort study was conducted on the HB incidence data of Hamadan Province, in the west of Iran, from 2004 to 2012. The Weighted Markov Chain (WMC) method, based on Markov chain theory, and two time series models, Holt Exponential Smoothing (HES) and SARIMA, were applied to the data. The methods were compared on the percentage of correctly predicted incidence rates. The monthly incidence rates were clustered into two clusters serving as the states of the Markov chain. The correctly predicted percentages for the first and second clusters were (100, 0) for WMC, (84, 67) for HES, and (79, 47) for SARIMA. The overall incidence rate of HBV is estimated to decrease over time. The comparison of the three models indicated that, given the existing seasonal trend and non-stationarity, HES gave the most accurate prediction of the incidence rates.
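A minimal sketch of the Markov-chain ingredient follows (the full WMC method additionally weights predictions from several lags by their autocorrelations). The two-state incidence sequence below is invented for illustration, not the Hamadan data.

```python
def transition_matrix(states, k):
    """Estimate a k-state Markov transition matrix from a state sequence."""
    counts = [[0] * k for _ in range(k)]
    for a, b in zip(states, states[1:]):
        counts[a][b] += 1
    P = []
    for row in counts:
        tot = sum(row)
        P.append([c / tot if tot else 1.0 / k for c in row])
    return P

# Hypothetical low(0)/high(1) monthly incidence clusters.
seq = [0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0]
P = transition_matrix(seq, 2)
# One-step prediction: the most probable next state given the last one.
next_state = max(range(2), key=lambda s: P[seq[-1]][s])
```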
A statistical analysis of flank eruptions on Etna volcano
NASA Astrophysics Data System (ADS)
Mulargia, Francesco; Tinti, Stefano; Boschi, Enzo
1985-02-01
A singularly complete record exists for the eruptive activity of Etna volcano. The time series of occurrence of flank eruptions in the period 1600-1980, in which the record is presumably complete, is found to follow a stationary Poisson process. A revision of the available data shows that eruption durations are rather well correlated with the estimates of the volume of lava flows. This implies that the magnitude of an eruption can be defined directly by its duration. Extreme value statistics are then applied to the time series, using duration as a dependent variable. The probability of occurrence of a very long (300 days) eruption is greater than 50% only in time intervals of the order of 50 years. The correlation found between duration and total output also allows estimation of the probability of occurrence of a major event which exceeds a given duration and total flow of lava. The composite probabilities do not differ considerably from the pure ones. Paralleling a well established application to seismic events, extreme value theory can be profitably used in volcanic risk estimates, provided that appropriate account is also taken of all other variables.
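Under a stationary Poisson model with independent eruption durations, occurrence probabilities of the kind discussed above follow in closed form. The sketch below additionally assumes exponential durations, an assumption made here for illustration only, and all parameter values are hypothetical, not Etna's fitted values.

```python
import math

def prob_at_least_one(rate_per_year, window_years):
    """P(at least one event in the window) for a stationary Poisson process."""
    return 1.0 - math.exp(-rate_per_year * window_years)

def prob_long_eruption(rate_per_year, window_years, mean_dur_days, d_days):
    """P(at least one eruption lasting longer than d_days in the window),
    assuming Poisson occurrences with independent exponential durations:
    long eruptions form a thinned Poisson process of rate
    rate_per_year * exp(-d_days / mean_dur_days)."""
    thinned = rate_per_year * math.exp(-d_days / mean_dur_days)
    return 1.0 - math.exp(-thinned * window_years)

# Hypothetical parameters: 1.5 eruptions/yr, mean duration 65 days.
# With these invented values, a >300-day eruption becomes more likely
# than not only over windows of roughly 50 years.
p10 = prob_long_eruption(1.5, 10.0, 65.0, 300.0)
p50 = prob_long_eruption(1.5, 50.0, 65.0, 300.0)
```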
Lange, Maximilian; Dechant, Benjamin; Rebmann, Corinna; Vohland, Michael; Cuntz, Matthias; Doktor, Daniel
2017-08-11
Quantifying the accuracy of remote sensing products is a timely endeavor given the rapid increase in Earth observation missions. A validation site for Sentinel-2 products was hence established in central Germany. Automatic multispectral and hyperspectral sensor systems were installed in parallel with an existing eddy covariance flux tower, providing spectral information of the vegetation present at high temporal resolution. Normalized Difference Vegetation Index (NDVI) values from ground-based hyperspectral and multispectral sensors were compared with NDVI products derived from Sentinel-2A and Moderate-resolution Imaging Spectroradiometer (MODIS). The influence of different spatial and temporal resolutions was assessed. High correlations and similar phenological patterns between in situ and satellite-based NDVI time series demonstrated the reliability of satellite-based phenological metrics. Sentinel-2-derived metrics showed better agreement with in situ measurements than MODIS-derived metrics. Dynamic filtering with the best index slope extraction algorithm was nevertheless beneficial for Sentinel-2 NDVI time series despite the availability of quality information from the atmospheric correction procedure.
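NDVI, the index compared across sensors in the study above, is a simple band ratio of near-infrared and red reflectance. A minimal sketch with invented reflectance values:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance.
    Ranges from -1 to 1; dense green vegetation gives high values."""
    return (nir - red) / (nir + red)

canopy = ndvi(0.45, 0.05)   # dense vegetation: high NDVI
soil = ndvi(0.30, 0.25)     # bare soil: NDVI near zero
```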
Lange, Maximilian; Rebmann, Corinna; Cuntz, Matthias; Doktor, Daniel
2017-01-01
Quantifying the accuracy of remote sensing products is a timely endeavor given the rapid increase in Earth observation missions. A validation site for Sentinel-2 products was hence established in central Germany. Automatic multispectral and hyperspectral sensor systems were installed in parallel with an existing eddy covariance flux tower, providing spectral information of the vegetation present at high temporal resolution. Normalized Difference Vegetation Index (NDVI) values from ground-based hyperspectral and multispectral sensors were compared with NDVI products derived from Sentinel-2A and Moderate-resolution Imaging Spectroradiometer (MODIS). The influence of different spatial and temporal resolutions was assessed. High correlations and similar phenological patterns between in situ and satellite-based NDVI time series demonstrated the reliability of satellite-based phenological metrics. Sentinel-2-derived metrics showed better agreement with in situ measurements than MODIS-derived metrics. Dynamic filtering with the best index slope extraction algorithm was nevertheless beneficial for Sentinel-2 NDVI time series despite the availability of quality information from the atmospheric correction procedure. PMID:28800065
Why didn't Box-Jenkins win (again)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pack, D.J.; Downing, D.J.
This paper focuses on the forecasting performance of the Box-Jenkins methodology applied to the 111 time series of the Makridakis competition. It considers the influence of the following factors: (1) time series length, (2) time-series information (autocorrelation) content, (3) time-series outliers or structural changes, (4) averaging results over time series, and (5) forecast time origin choice. It is found that the 111 time series contain substantial numbers of very short series, series with obvious structural change, and series whose histories are relatively uninformative. If these series are typical of those that one must face in practice, the real message of the competition is that univariate time series extrapolations will frequently fail regardless of the methodology employed to produce them.
Multifractal analysis of visibility graph-based Ito-related connectivity time series.
Czechowski, Zbigniew; Lovallo, Michele; Telesca, Luciano
2016-02-01
In this study, we investigate multifractal properties of connectivity time series resulting from the visibility graph applied to normally distributed time series generated by the Ito equations with multiplicative power-law noise. We show that multifractality of the connectivity time series (i.e., the series of the number of links outgoing from each node) increases with the exponent of the power-law noise. The multifractality of the connectivity time series could be due to the width of the connectivity degree distribution, which can be related to the exit time of the associated Ito time series. Furthermore, the connectivity time series are characterized by persistence, although the original Ito time series are random; this is due to the visibility graph procedure, which, in connecting the values of the time series, generates persistence but destroys most of the nonlinear correlations. Moreover, the visibility graph is sensitive in detecting wide "depressions" in input time series.
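The natural visibility graph underlying this kind of analysis can be sketched directly: two points of the series are linked when every intermediate point lies strictly below the straight line joining them. This is a minimal O(n²) implementation of the standard construction; plain Gaussian noise stands in here for the Ito-generated series used in the paper.

```python
import random

def visibility_degrees(x):
    """Degree (number of links) of each node in the natural visibility
    graph of series x: points (a, x[a]) and (b, x[b]) are linked if
    every intermediate point lies strictly below the line joining them."""
    n = len(x)
    deg = [0] * n
    for a in range(n):
        for b in range(a + 1, n):
            if all(x[c] < x[a] + (x[b] - x[a]) * (c - a) / (b - a)
                   for c in range(a + 1, b)):
                deg[a] += 1
                deg[b] += 1
    return deg

random.seed(1)
series = [random.gauss(0, 1) for _ in range(200)]
conn = visibility_degrees(series)  # the connectivity time series
```

Peaks of the input series become highly connected hubs, which is what makes the degree sequence an informative "connectivity time series".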
State-level gonorrhea rates and expedited partner therapy laws: insights from time series analyses.
Owusu-Edusei, K; Cramer, R; Chesson, H W; Gift, T L; Leichliter, J S
2017-06-01
In this study, we examined state-level monthly gonorrhea morbidity and assessed the potential impact of existing expedited partner therapy (EPT) laws in relation to the time that the laws were enacted. Longitudinal study. We obtained state-level monthly gonorrhea morbidity (number of cases/100,000 for males, females and total) from the national surveillance data. We used visual examination (of morbidity trends) and an autoregressive time series model in a panel format with intervention (interrupted time series) analysis to assess the impact of state EPT laws based on the months in which the laws were enacted. For over 84% of the states with EPT laws, the monthly morbidity trends did not show any noticeable decreases on or after the laws were enacted. Although we found statistically significant decreases in gonorrhea morbidity within four of the states with EPT laws (Alaska, Illinois, Minnesota, and Vermont), there were no significant decreases when the decreases in the four states were compared contemporaneously with the decreases in states that do not have the laws. We found no impact (decrease in gonorrhea morbidity) attributable exclusively to the EPT law(s). However, these results do not imply that the EPT laws themselves were not effective (or failed to reduce gonorrhea morbidity), because the effectiveness of the EPT law is dependent on necessary intermediate events/outcomes, including sexually transmitted infection service providers' awareness and practice, as well as acceptance by patients and their partners. Published by Elsevier Ltd.
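The core of an interrupted time series analysis can be sketched as a segmented regression with a step term at the month of enactment. This minimal version, unlike the study's autoregressive panel model, ignores serial correlation, and all data below are synthetic.

```python
def its_step_effect(y, t0):
    """Interrupted-time-series sketch: fit y_t = b0 + b1*t + b2*step_t
    by ordinary least squares, where step_t = 1 from the intervention
    time t0 onward; b2 estimates the immediate level change."""
    n = len(y)
    X = [[1.0, float(t), 1.0 if t >= t0 else 0.0] for t in range(n)]
    # Normal equations (X'X) b = X'y, solved by Gauss-Jordan elimination.
    XtX = [[sum(row[r] * row[c] for row in X) for c in range(3)]
           for r in range(3)]
    Xty = [sum(row[r] * yi for row, yi in zip(X, y)) for r in range(3)]
    A = [XtX[r] + [Xty[r]] for r in range(3)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(3):
            if r != col:
                f = A[r][col] / A[col][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[col])]
    return [A[r][3] / A[r][r] for r in range(3)]  # [level, trend, step]

# Synthetic monthly morbidity with a drop of 5 cases at month 30.
y = [20.0 + 0.1 * t - (5.0 if t >= 30 else 0.0) for t in range(60)]
level, trend, effect = its_step_effect(y, 30)
```

On this noise-free series the step coefficient recovers the injected level drop exactly; with real surveillance data the autoregressive error structure the authors model becomes essential.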
NASA Astrophysics Data System (ADS)
Chattopadhyay, Anirban; Khondekar, Mofazzal Hossain; Bhattacharjee, Anup Kumar
2017-09-01
In this paper we search for periodicities in the linear speed of Coronal Mass Ejections (CMEs) in solar cycle 23. Double exponential smoothing and the Discrete Wavelet Transform are used for detrending and filtering the CME linear speed time series. To choose the appropriate statistical methodology, the Smoothed Pseudo Wigner-Ville distribution (SPWVD) was applied beforehand to confirm the non-stationarity of the time series. Time-frequency representation tools, namely the Hilbert-Huang Transform and Empirical Mode Decomposition, were then implemented to unearth the periodicities underlying the non-stationary time series of CME linear speed. Of all the periodicities with more than 95% confidence level, the relevant ones were segregated using an integral peak detection algorithm. The periodicities observed are short, ranging from 2-159 days, with relevant periods at 4, 10, 11, 12, 13.7, 14.5, and 21.6 days. These short-range periodicities point to the active longitudes and the magnetic flux network of the Sun as the probable origin of CMEs. The results also suggest possible mutual influence and causality with other solar activities (such as solar radio emission, Ap index, and solar wind speed), owing to the similarity between their periods and the CME linear speed periods. The periodicities of 4 days and 10 days indicate the possible existence of Rossby-type (planetary) waves in the Sun.
Lu, Wei-Zhen; Wang, Wen-Jian
2005-04-01
Monitoring and forecasting of air quality parameters are popular and important topics of atmospheric and environmental research today, due to the health impact of exposure to air pollutants in urban air. Accurate models for air pollutant prediction are needed because such models allow forecasting and diagnosing potential compliance or non-compliance in both the short and long term. Artificial neural networks (ANN) are regarded as a reliable and cost-effective method for such tasks and have produced some promising results to date. Although ANN has attracted increasing attention from environmental researchers, its inherent drawbacks, e.g., local minima, over-fitting in training, poor generalization performance, and determination of the appropriate network architecture, impede its practical application. The support vector machine (SVM), a novel type of learning machine based on statistical learning theory, can be used for regression and time series prediction and has been reported to perform well, with some promising results. The work presented in this paper examines the feasibility of applying SVM to predict air pollutant levels in advancing time series, based on the monitored air pollutant database in the Hong Kong downtown area. At the same time, the functional characteristics of SVM are investigated. The experimental comparisons between the SVM model and the classical radial basis function (RBF) network demonstrate that the SVM is superior to the conventional RBF network in predicting air quality parameters with different time series, and has better generalization performance than the RBF model.
Recombinant Temporal Aberration Detection Algorithms for Enhanced Biosurveillance
Murphy, Sean Patrick; Burkom, Howard
2008-01-01
Objective: Broadly, this research aims to improve the outbreak detection performance and, therefore, the cost effectiveness of automated syndromic surveillance systems by building novel, recombinant temporal aberration detection algorithms from components of previously developed detectors. Methods: This study decomposes existing temporal aberration detection algorithms into two sequential stages and investigates the individual impact of each stage on outbreak detection performance. The data forecasting stage (Stage 1) generates predictions of time series values a certain number of time steps in the future based on historical data. The anomaly measure stage (Stage 2) compares features of this prediction to corresponding features of the actual time series to compute a statistical anomaly measure. A Monte Carlo simulation procedure is then used to examine the recombinant algorithms’ ability to detect synthetic aberrations injected into authentic syndromic time series. Results: New methods obtained with procedural components of published, sometimes widely used, algorithms were compared to the known methods using authentic datasets with plausible stochastic injected signals. Performance improvements were found for some of the recombinant methods, and these improvements were consistent over a range of data types, outbreak types, and outbreak sizes. For gradual outbreaks, the WEWD MovAvg7+WEWD Z-Score recombinant algorithm performed best; for sudden outbreaks, the HW+WEWD Z-Score performed best. Conclusion: This decomposition was found not only to yield valuable insight into the effects of the aberration detection algorithms but also to produce novel combinations of data forecasters and anomaly measures with enhanced detection performance. PMID:17947614
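The two-stage decomposition can be sketched with one concrete recombination: a moving-average forecaster (Stage 1) feeding a Z-score anomaly measure (Stage 2). This is a generic illustration of the pattern, not the exact WEWD or Holt-Winters variants named in the abstract, and the baseline counts are invented.

```python
import statistics

def moving_avg_forecast(history, window=7):
    """Stage 1 (data forecasting): predict the next value as the mean
    of the last `window` observations."""
    recent = history[-window:]
    return sum(recent) / len(recent)

def z_score_anomaly(history, observed, window=7):
    """Stage 2 (anomaly measure): standardized residual of the new
    observation against the Stage 1 forecast, scaled by the recent
    standard deviation (falling back to 1.0 for a flat baseline)."""
    scale = statistics.stdev(history[-window:]) or 1.0
    return (observed - moving_avg_forecast(history, window)) / scale

# A quiet syndromic baseline followed by a sudden spike.
baseline = [10, 12, 11, 9, 10, 11, 12]
spike_score = z_score_anomaly(baseline, 25)
quiet_score = z_score_anomaly(baseline, 11)
```

Swapping either stage independently (e.g., a Holt-Winters forecaster in Stage 1, or a CUSUM measure in Stage 2) is exactly the recombination space the study explores.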
78 FR 63255 - Arden Investment Series Trust, et al.; Notice of Application
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-23
... Series Trust, et al.; Notice of Application October 17, 2013. AGENCY: Securities and Exchange Commission... Series Trust (the ``Trust'') and Arden Asset Management LLC (``Arden'') (collectively, the ``Applicants... Accounts'') holds shares of Arden Variable Alternative Strategies Fund, an existing portfolio of the Trust...
Celestial mechanics solutions that escape
NASA Astrophysics Data System (ADS)
Gingold, Harry; Solomon, Daniel
2017-08-01
We establish the existence of an open set of initial conditions through which pass solutions without singularities to Newton's gravitational equations in R3 on a semi-infinite interval in forward time, for which every pair of particles separates like At, A > 0, as t → ∞. The solutions are constructable as series with rapid uniform convergence and their asymptotic behavior to any order is prescribed. We show that this family of solutions depends on 6N parameters subject to certain constraints.
Monograph on the use of the multivariate Gram Charlier series Type A
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hatayodom, T.; Heydt, G.
1978-01-01
The Gram-Charlier series is an infinite series expansion for a probability density function (pdf) in which the terms of the series are Hermite polynomials. There are several Gram-Charlier series; the best known is Type A. The Gram-Charlier series, Type A (GCA), exists for both univariate and multivariate random variables. This monograph introduces the multivariate GCA and illustrates its use through several examples. A brief bibliography and discussion of Hermite polynomials is also included. 9 figures, 2 tables.
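A univariate sketch of the Type A series: the standard normal density corrected by third- and fourth-order probabilists' Hermite terms weighted by skewness and excess kurtosis, with the series truncated after the fourth-order term. The multivariate case treated in the monograph follows the same pattern with multivariate Hermite polynomials.

```python
import math

def hermite_prob(k, x):
    """Probabilists' Hermite polynomial He_k(x) via the recurrence
    He_0 = 1, He_1 = x, He_{k+1} = x*He_k - k*He_{k-1}."""
    h0, h1 = 1.0, x
    if k == 0:
        return h0
    for i in range(1, k):
        h0, h1 = h1, x * h1 - i * h0
    return h1

def gram_charlier_a(x, skew=0.0, exkurt=0.0):
    """Truncated univariate Gram-Charlier Type A density: the standard
    normal pdf times (1 + skew/6 * He_3(x) + exkurt/24 * He_4(x))."""
    phi = math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)
    return phi * (1.0
                  + skew / 6.0 * hermite_prob(3, x)
                  + exkurt / 24.0 * hermite_prob(4, x))
```

Because the Hermite correction terms integrate to zero against the Gaussian weight, the truncated density still integrates to one (though it can go slightly negative in the tails for large skewness, a well-known caveat of the expansion).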
Ebhuoma, Osadolor; Gebreslasie, Michael; Magubane, Lethumusa
The change of the malaria control intervention policy in South Africa (SA), namely the re-introduction of dichlorodiphenyltrichloroethane (DDT), may be responsible for the low and sustained malaria transmission in KwaZulu-Natal (KZN). We evaluated the effect of the re-introduction of DDT on malaria in KZN and suggest practical ways the province can strengthen its existing malaria control and elimination efforts to achieve zero malaria transmission. We obtained confirmed monthly malaria cases in KZN from the KZN malaria control program from 1998 to 2014. A seasonal autoregressive integrated moving average (SARIMA) intervention time series analysis (ITSA) was employed to model the effect of the re-introduction of DDT on confirmed monthly malaria cases. The result is an abrupt and permanent decline of monthly malaria cases (w0 = -1174.781, p-value = 0.003) following the implementation of the intervention policy. The sustained low malaria case counts observed over a long period suggest that the continued usage of DDT did not result in insecticide resistance, as earlier anticipated. This may be due to exophagic malaria vectors, which render indoor residual spraying not totally effective. Therefore, reducing malaria transmission to zero in KZN requires other reliable and complementary intervention resources to optimize the existing ones. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Towards a first detailed reconstruction of sunspot information over the last 150 years
NASA Astrophysics Data System (ADS)
Lefevre, Laure; Clette, Frédéric
2013-04-01
With four centuries of solar evolution, the International Sunspot Number (SSN) forms the longest solar time series currently available. It provides an essential reference for understanding and quantifying how the solar output has varied over decades and centuries and thus for assessing the variations of the main natural forcing on the Earth climate. For such a quantitative use, this unique time-series must be closely monitored for any possible biases and drifts. This is the main objective of the Sunspot Workshops organized jointly by the National Solar Observatory (NSO) and the Royal Observatory of Belgium (ROB) since 2010. Here, we will report about some recent outcomes of past workshops, like diagnostics of scaling errors and their proposed corrections, or the recent disagreement between the sunspot number and other solar indices like the 10.7cm radio flux. Our most recent analyses indicate that while part of this divergence may be due to a calibration drift in the SSN, it also results from an intrinsic change in the global magnetic parameters of sunspots and solar active regions, suggesting a possible transition to a new activity regime. Going beyond the SSN series, in the framework of the SOTERIA, TOSCA and SOLID projects, we produced a survey of all existing catalogs providing detailed sunspot information and we also located different primary solar images and drawing collections that can be exploited to complement the existing catalogs (COMESEP project). These are first steps towards the construction of a multi-parametric time series of multiple sunspot group properties over at least the last 150 years, allowing us to reconstruct and extend the current 1-D SSN series. By bringing new spatial, morphological and evolutionary information, such a data set should bring major advances for the modeling of the solar dynamo and solar irradiance. We will present here the current status of this work.
The catalog now extends over the last 3 cycles (Lefevre & Clette 2011,doi:10.1007/s11207-012-0184-5). A partially complete version extends back to 1965, and will soon reach 1940 thanks to the data from the Uccle Solar Equatorial Table (USET) operated by the ROB. We will also present initial applications derived from the present version of the catalog, such as new sunspot-based solar fluxes and proxies that should ultimately help refine our knowledge of the influence of the Sun on our environment, now and throughout the ages. This work has received funding from the European Commission FP7 Project COMESEP (263252).
Field evidences for a Mesozoic palaeo-relief through the northern Tianshan
NASA Astrophysics Data System (ADS)
Gumiaux, Charles; Chen, Ke; Augier, Romain; Chen, Yan; Wang, Qingchen
2010-05-01
The modern Tianshan mountain belt, located in Central Asia, offers a natural laboratory to study orogenic processes linked with convergent geodynamical settings. Most of the previous studies either focused on the Paleozoic evolution of the range - subductions, arc accretions and continental collision - or on its Cenozoic intra-continental evolution linked with the India-Asia collision. At first order, the finite structure of this range obviously displays a remarkable uprising of Paleozoic "basement" rocks - as a crustal-scale ‘pop-up' - surrounded by two Cenozoic foreland basins. The present-day topography of the Tianshan is traditionally related to the latest intra-continental reactivation of the range. In contrast, the present field study of the northern Tianshan brings clear new evidence for the existence of a significant relief in this area during Mesozoic times. The investigation zone is about 250 km long, from Wusu to Urumqi, along the northern flank of the Tianshan where the rivers deeply incised the topography. In such valleys, lithologies and structural relationships between Paleozoic basement rocks, Mesozoic and Cenozoic sedimentary series are particularly well exposed along several sections. Jurassic series are mostly characterized by coal-bearing, coarse-grained continental deposits. Within intra-mountain basins, sedimentary breccias, with clasts of Carboniferous basement rocks, have been locally found at the base of the series. This argues for the presence of a rather proximal palaeo-relief of basement rocks along the range front and the occurrence of proximal intra-mountain basins during the Jurassic. Moreover, while a major thrust is mostly evoked between Jurassic deposits and the Paleozoic units, some of the studied sections show that the Triassic to Jurassic sedimentary series can be followed from the basin to the range.
In these cases, the unconformity of the Mesozoic series on top of the Carboniferous basement has locally been clearly identified quite high in the mountain range or even, surprisingly, directly along the northern Tianshan "front" itself. Combining available information from geological maps, field investigations and numerous drilling wells, regional-scale cross-sections have been built. Some of them show "onlap"-type deposition of the Triassic to Jurassic clastic sediments on top of the Paleozoic basement, which was thus significantly sloping down to the North at that time. Our study clearly demonstrates, at different scales, the existence of a major palaeo-relief along the northern Tianshan range during the Mesozoic, and particularly during Jurassic times. Such results are compatible with previous fission-track and sedimentology studies. From this, the Tianshan's uplift and the movements accommodated along its northern front structures, which are traditionally assigned to its Cenozoic reactivation, must be revised downward. These new results raise questions about the mode and timing of reactivation of structures and about the link between topography and intra-continental collisional settings.
A comparative simulation study of AR(1) estimators in short time series.
Krone, Tanja; Albers, Casper J; Timmerman, Marieke E
2017-01-01
Various estimators of the autoregressive model exist. We compare their performance in estimating the autocorrelation in short time series. In Study 1, under correct model specification, we compare the frequentist r1 estimator, C-statistic, ordinary least squares estimator (OLS) and maximum likelihood estimator (MLE), and a Bayesian method, considering flat (Bf) and symmetrized reference (Bsr) priors. In a completely crossed experimental design we vary the lengths of the time series (i.e., T = 10, 25, 40, 50 and 100) and the autocorrelation (from -0.90 to 0.90 in steps of 0.10). The results show the lowest bias for Bsr, and the lowest variability for r1. The power in different conditions is highest for Bsr and OLS. For T = 10, the absolute performance of all measurements is poor, as expected. In Study 2, we study the robustness of the methods under misspecification by generating the data according to an ARMA(1,1) model, but still analysing the data with an AR(1) model. We use the two methods with the lowest bias for this study, i.e., Bsr and MLE. The bias gets larger when the non-modelled moving average parameter becomes larger. Both the variability and the power depend on the non-modelled parameter. The differences between the two estimation methods are negligible for all measurements.
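Two of the frequentist estimators compared in Study 1 can be sketched directly: the r1 moment estimator (lag-1 sample autocorrelation) and the OLS regression of x_t on x_{t-1}. They are shown here recovering a known coefficient from a long synthetic AR(1) series; the short-series regime studied in the paper is exactly where their small differences in bias and variability start to matter.

```python
import random

def ar1_r1(x):
    """Moment estimator: lag-1 sample autocorrelation r1
    (lagged cross-products over the full sum of squares)."""
    n = len(x)
    m = sum(x) / n
    num = sum((x[t] - m) * (x[t - 1] - m) for t in range(1, n))
    return num / sum((v - m) ** 2 for v in x)

def ar1_ols(x):
    """OLS estimator: slope of the demeaned regression of x_t on
    x_{t-1} (same numerator, lagged sum of squares in the denominator)."""
    n = len(x)
    m = sum(x) / n
    num = sum((x[t] - m) * (x[t - 1] - m) for t in range(1, n))
    return num / sum((x[t - 1] - m) ** 2 for t in range(1, n))

# Recover a known coefficient from a long synthetic AR(1) series.
rng = random.Random(7)
phi, prev, x = 0.6, 0.0, []
for _ in range(5000):
    prev = phi * prev + rng.gauss(0, 1)
    x.append(prev)
est_r1, est_ols = ar1_r1(x), ar1_ols(x)
```

For long series the two estimates are nearly identical; the paper's point is that at T = 10 to 25 their bias and variance diverge enough to prefer one over the other.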
Zhou, Yunyi; Tao, Chenyang; Lu, Wenlian; Feng, Jianfeng
2018-04-20
Functional connectivity is among the most important tools to study the brain. The correlation coefficient between time series of different brain areas is the most popular method to quantify functional connectivity. In practice, the correlation coefficient assumes the data to be temporally independent; however, brain time series can manifest significant temporal auto-correlation. We propose a widely applicable method for correcting temporal auto-correlation. We considered two types of time series models: (1) the auto-regressive-moving-average model, and (2) a nonlinear dynamical system model with noisy fluctuations, and derived their respective asymptotic distributions of the correlation coefficient. These two types of models are the most commonly used in neuroscience studies, and we show that their asymptotic distributions share a unified expression. Our method robustly controls the type I error while maintaining sufficient statistical power for detecting true correlations in numerical experiments, where existing methods measuring association (linear and nonlinear) fail. Employing our method on a real dataset yields a more robust functional network and higher classification accuracy than conventional methods. Empirical results favor the use of our method in functional network analysis. Copyright © 2018. Published by Elsevier B.V.
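A simpler stand-in for the paper's asymptotic correction is the textbook effective-sample-size adjustment for autocorrelated series, a Bartlett/Quenouille-style approximation: n_eff = n(1 - r1x·r1y)/(1 + r1x·r1y). It is not the distribution derived in the paper, but it illustrates why naive correlation tests become anti-conservative on autocorrelated data.

```python
import math
import random

def lag1_ac(x):
    """Lag-1 sample autocorrelation."""
    n = len(x)
    m = sum(x) / n
    num = sum((x[t] - m) * (x[t - 1] - m) for t in range(1, n))
    return num / sum((v - m) ** 2 for v in x)

def corrected_corr(x, y):
    """Pearson r with an AR(1)-style effective-sample-size correction:
    n_eff = n * (1 - r1x*r1y) / (1 + r1x*r1y). Returns (r, z, n_eff),
    where z is the Fisher-transformed r scaled by sqrt(n_eff - 3).
    A textbook approximation, not the paper's asymptotic law."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    r = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)
    g = lag1_ac(x) * lag1_ac(y)
    n_eff = max(4.0, n * (1 - g) / (1 + g))
    z = 0.5 * math.log((1 + r) / (1 - r)) * math.sqrt(n_eff - 3)
    return r, z, n_eff

# Two independent but strongly autocorrelated AR(1) series.
rng = random.Random(5)
a = b = 0.0
xs, ys = [], []
for _ in range(2000):
    a = 0.9 * a + rng.gauss(0, 1)
    b = 0.9 * b + rng.gauss(0, 1)
    xs.append(a)
    ys.append(b)
r, z, n_eff = corrected_corr(xs, ys)
```

For two independent AR(1) series with coefficient 0.9, n_eff shrinks to roughly a tenth of n, deflating the test statistic that a naive Fisher z-test would otherwise inflate.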
Regenerating time series from ordinal networks.
McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael
2017-03-01
Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discrete sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series by using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.
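The construct-and-regenerate loop can be sketched as follows: an ordinal transition network is built from rank patterns of embedded windows, then resampled by a weighted random walk. The logistic map stands in here for the chaotic flows used in the paper, and the regenerated output is the symbol (pattern) sequence rather than amplitudes.

```python
import random

def ordinal_network(x, m=3):
    """Build an ordinal transition network: each window of m
    consecutive values maps to its rank pattern (argsort tuple), and
    directed edge weights count transitions between successive patterns."""
    def pattern(w):
        return tuple(sorted(range(len(w)), key=w.__getitem__))
    pats = [pattern(x[i:i + m]) for i in range(len(x) - m + 1)]
    edges = {}
    for a, b in zip(pats, pats[1:]):
        edges.setdefault(a, {}).setdefault(b, 0)
        edges[a][b] += 1
    return edges

def random_walk(edges, steps, seed=0):
    """Regenerate a symbolic series by a weighted random walk on the
    ordinal network, restarting at a random node if a dead end is hit."""
    rng = random.Random(seed)
    node = rng.choice(list(edges))
    walk = [node]
    for _ in range(steps - 1):
        nbrs = edges.get(node)
        if not nbrs:
            node = rng.choice(list(edges))
        else:
            targets, weights = zip(*nbrs.items())
            node = rng.choices(targets, weights=weights)[0]
        walk.append(node)
    return walk

# A logistic-map time series as a simple chaotic example.
x, v = [], 0.4
for _ in range(500):
    v = 4.0 * v * (1.0 - v)
    x.append(v)
net = ordinal_network(x, m=3)
surrogate = random_walk(net, 200, seed=1)
```

The paper's quantitative comparisons (recurrence quantification, Lyapunov exponents, correlation dimension) then measure how much of the original dynamics survives in such surrogates.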
Segmentation and analysis of mouse pituitary cells with graphic user interface (GUI)
NASA Astrophysics Data System (ADS)
González, Erika; Medina, Lucía.; Hautefeuille, Mathieu; Fiordelisio, Tatiana
2018-02-01
In this work we present a method to perform pituitary cell segmentation in image stacks acquired by fluorescence microscopy from pituitary slice preparations. Although many procedures have been developed for cell segmentation, they are generally based on edge detection and require high-resolution images. In the biological preparations we worked with, however, the cells are not well delineated: experts identify them by their intracellular calcium activity, visible as fluorescence intensity changes in different regions over time. These intensity changes were associated with time series over regions, and because they present a characteristic behavior they were used in a classification procedure to perform cell segmentation. Two logistic regression classifiers were implemented for the time series classification task, using as features the area under the curve and skewness in the first classifier, and skewness and kurtosis in the second. Once both decision boundaries were found in the two feature spaces by training on 120 time series, they were tested over 12 image stacks through a Python graphical user interface (GUI), generating binary images where white pixels correspond to cells and black pixels to background. Results show that the area-skewness classifier reduces the time an expert dedicates to locating cells by up to 75% in some stacks, versus 92% for the kurtosis-skewness classifier, evaluated on the number of regions the method found. Given these promising results, we expect that this method will be improved by adding more relevant features to the classifier.
Tenan, Matthew S; Tweedell, Andrew J; Haynes, Courtney A
2017-01-01
The timing of muscle activity is a commonly applied analytic method to understand how the nervous system controls movement. This study systematically evaluates six classes of standard and statistical algorithms to determine muscle onset in both experimental surface electromyography (EMG) and simulated EMG with a known onset time. Eighteen participants had EMG collected from the biceps brachii and vastus lateralis while performing a biceps curl or knee extension, respectively. Three established methods and three statistical methods for EMG onset were evaluated. Linear envelope, Teager-Kaiser energy operator + linear envelope and sample entropy were the established methods evaluated, while general time series mean/variance, sequential and batch processing of parametric and nonparametric tools, and Bayesian changepoint analysis were the statistical techniques used. Visual EMG onset (experimental data) and objective EMG onset (simulated data) were compared with algorithmic EMG onset via root mean square error and linear regression models for stepwise elimination of inferior algorithms. The top algorithms for both data types were analyzed for their mean agreement with the gold standard onset and for their 95% confidence intervals. The top algorithms were all Bayesian changepoint analysis iterations where the parameter of the prior (p0) was zero. The best performing Bayesian algorithms were p0 = 0 and a posterior probability for onset determination at 60-90%. While existing algorithms performed reasonably, the Bayesian changepoint analysis methodology provides greater reliability and accuracy when determining the singular onset of EMG activity in a time series. Further research is needed to determine if this class of algorithms performs equally well when the time series has multiple bursts of muscle activity.
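The flavor of the Bayesian changepoint stage can be sketched for the simplest case: a single mean shift with known noise variance and a flat prior over changepoint locations. This is far cruder than the algorithms evaluated in the study, but it shows how a posterior over onset times arises from the data.

```python
import math
import random

def changepoint_posterior(x):
    """Posterior over a single changepoint location in a mean-shift
    model with unit-variance Gaussian noise and a flat prior over
    positions; each candidate split is scored by the profile
    log-likelihood with the segment means plugged in."""
    n = len(x)
    logls = []
    for k in range(1, n):
        left, right = x[:k], x[k:]
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        ss = (sum((v - ml) ** 2 for v in left)
              + sum((v - mr) ** 2 for v in right))
        logls.append(-0.5 * ss)
    mmax = max(logls)
    w = [math.exp(v - mmax) for v in logls]
    s = sum(w)
    return [v / s for v in w]  # posterior[k-1] = P(change at index k)

# Simulated quiescent baseline followed by "muscle activity" onset.
random.seed(3)
sig = ([random.gauss(0, 1) for _ in range(60)]
       + [random.gauss(5, 1) for _ in range(60)])
post = changepoint_posterior(sig)
onset = post.index(max(post)) + 1
```

Thresholding this posterior (e.g., declaring onset once cumulative posterior probability exceeds 60-90%) mirrors the decision rule the study found to perform best.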
Dynamics of electricity market correlations
NASA Astrophysics Data System (ADS)
Alvarez-Ramirez, J.; Escarela-Perez, R.; Espinosa-Perez, G.; Urrea, R.
2009-06-01
Electricity market participants rely on demand and price forecasts to decide their bidding strategies, allocate assets, negotiate bilateral contracts, hedge risks, and plan facility investments. However, forecasting is hampered by the non-linear and stochastic nature of price time series. Diverse modeling strategies, from neural networks to traditional transfer functions, have been explored. These approaches are based on the assumption that price series contain correlations that can be exploited for model-based prediction purposes. While many works have been devoted to the demand and price modeling, a limited number of reports on the nature and dynamics of electricity market correlations are available. This paper uses detrended fluctuation analysis to study correlations in the demand and price time series and takes the Australian market as a case study. The results show the existence of correlations in both demand and prices over three orders of magnitude in time ranging from hours to months. However, the Hurst exponent is not constant over time, and its time evolution was computed over a subsample moving window of 250 observations. The computations, also made for two Canadian markets, show that the correlations present important fluctuations over a seasonal one-year cycle. Interestingly, non-linearities (measured in terms of a multifractality index) and reduced price predictability are found for the June-July periods, while the converse behavior is displayed during the December-January period. In terms of forecasting models, our results suggest that non-linear recursive models should be considered for accurate day-ahead price estimation. On the other hand, linear models seem to suffice for demand forecasting purposes.
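The detrended fluctuation analysis used in such studies can be sketched in its standard DFA-1 form: integrate the demeaned series into a profile, remove a least-squares line from each window, and read the Hurst-like exponent off the log-log slope of the fluctuation function. As a sanity check, white noise should give an exponent near 0.5 and its running sum near 1.5; the synthetic series below stand in for the demand and price data analyzed above.

```python
import math
import random

def dfa_exponent(x, scales):
    """DFA-1: integrate the demeaned series into a profile, split it
    into non-overlapping windows of each scale, detrend every window
    with a least-squares line, and return the slope of log F(s)
    versus log s."""
    m = sum(x) / len(x)
    profile, acc = [], 0.0
    for v in x:
        acc += v - m
        profile.append(acc)
    logs, logF = [], []
    for s in scales:
        ss_resid, nwin = 0.0, len(profile) // s
        t_mean = (s - 1) / 2.0
        t_var = sum((t - t_mean) ** 2 for t in range(s))
        for w in range(nwin):
            seg = profile[w * s:(w + 1) * s]
            s_mean = sum(seg) / s
            beta = sum((t - t_mean) * (v - s_mean)
                       for t, v in enumerate(seg)) / t_var
            alpha = s_mean - beta * t_mean
            ss_resid += sum((v - (alpha + beta * t)) ** 2
                            for t, v in enumerate(seg))
        logs.append(math.log(s))
        logF.append(math.log(math.sqrt(ss_resid / (nwin * s))))
    lm, fm = sum(logs) / len(logs), sum(logF) / len(logF)
    return (sum((a - lm) * (b - fm) for a, b in zip(logs, logF))
            / sum((a - lm) ** 2 for a in logs))

rng = random.Random(11)
wn = [rng.gauss(0, 1) for _ in range(4000)]  # uncorrelated noise
rw, acc = [], 0.0
for v in wn:                                 # its running sum
    acc += v
    rw.append(acc)
alpha_wn = dfa_exponent(wn, [8, 16, 32, 64, 128])
alpha_rw = dfa_exponent(rw, [8, 16, 32, 64, 128])
```

Recomputing the exponent over a moving subsample window, as the paper does with 250 observations, turns this single number into the time-varying Hurst exponent whose seasonal cycle is the paper's central finding.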
NASA Astrophysics Data System (ADS)
Paiva, Rodrigo C. D.; Durand, Michael T.; Hossain, Faisal
2015-01-01
Recent efforts have sought to estimate river discharge and other surface water-related quantities using spaceborne sensors, with better spatial coverage but worse temporal sampling as compared with in situ measurements. The Surface Water and Ocean Topography (SWOT) mission will provide river discharge estimates globally from space. However, questions on how to optimally use the spatially distributed but asynchronous satellite observations to generate continuous fields still exist. This paper presents a statistical model, River Kriging (RK), for estimating discharge time series in a river network in the context of the SWOT mission. RK uses discharge estimates at different locations and times to produce a continuous field using spatiotemporal kriging. A key component of RK is the space-time river discharge covariance, which was derived analytically from the diffusive wave approximation of Saint Venant's equations. The RK covariance also accounts for the loss of correlation at confluences. The model performed well in a case study on the Ganges-Brahmaputra-Meghna (GBM) River system in Bangladesh using synthetic SWOT observations. The correlation model reproduced empirically derived values. RK (R2=0.83) outperformed other kriging-based methods (R2=0.80), as well as a simple time series linear interpolation (R2=0.72). RK was used to combine discharge from SWOT and in situ observations, improving estimates when the latter are included (R2=0.91). The proposed statistical concepts may eventually provide a feasible framework to estimate continuous discharge time series across a river network based on SWOT data, other altimetry missions, and/or in situ data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aziz, Azizan; Lasternas, Bertrand; Alschuler, Elena
The American Recovery and Reinvestment Act stimulus funding of 2009 for smart grid projects resulted in the tripling of smart meter deployment. In 2012, the Green Button initiative provided utility customers with access to their real-time energy usage. The availability of finely granular data provides an enormous potential for energy data analytics and energy benchmarking. The sheer volume of time-series utility data from a large number of buildings also poses challenges in data collection, quality control, and database management for rigorous and meaningful analyses. In this paper, we describe a building portfolio-level data analytics tool for operational optimization, business investment, and policy assessment using 15-minute to monthly interval utility data. The analytics tool is developed on top of the U.S. Department of Energy's Standard Energy Efficiency Data (SEED) platform, an open source software application that manages energy performance data of large groups of buildings. To support the significantly large volume of granular interval data, we integrated a parallel time-series database with the existing relational database. The time-series database improves on the current utility data input, focusing on real-time data collection, storage, analytics, and data quality control. The fully integrated data platform supports APIs for utility app development by third-party software developers. These apps will provide actionable intelligence for building owners and facilities managers. Unlike a commercial system, this platform is an open source platform funded by the U.S. Government, accessible to the public, researchers, and other developers, to support initiatives in reducing building energy consumption.
GPS Position Time Series @ JPL
NASA Technical Reports Server (NTRS)
Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen
2013-01-01
Different flavors of GPS time series analysis at JPL all use the same GPS Precise Point Positioning Analysis raw time series; variations in time series analysis and post-processing are driven by different users:
- JPL Global Time Series/Velocities: researchers studying the reference frame, combining with VLBI/SLR/DORIS
- JPL/SOPAC Combined Time Series/Velocities: crustal deformation for tectonic, volcanic, and ground water studies
- ARIA Time Series/Coseismic Data Products: hazard monitoring and response focused
The ARIA data system is designed to integrate GPS and InSAR: GPS tropospheric delay is used for correcting InSAR, and Caltech's GIANT time series analysis uses GPS to correct orbital errors in InSAR. Zhen Liu will speak on InSAR time series analysis tomorrow.
The relationship between the change of magnetic energy and eruption behavior in NOAA AR 11429
NASA Astrophysics Data System (ADS)
Wang, R.; Liu, Y. D.
2013-12-01
We study the evolution of magnetic energy in active region (AR) NOAA 11429, which produced a series of X/M class flares and fast coronal mass ejections (CMEs) in March 2012. In particular, this AR spawned double X-class flares (X5.4/X1.3) within a time interval of only 1 hr on March 7, which are associated with wide and fast CMEs with speeds of ~2000 km/s. A nonlinear force-free field extrapolation method is adopted to reconstruct the coronal magnetic field. We apply this method to a time series of 176 high-cadence vector magnetograms of the AR acquired by the Helioseismic and Magnetic Imager on board the Solar Dynamics Observatory (HMI/SDO), which span a time interval of 1.5 days. We investigate the budgets of the free magnetic energy and relative magnetic helicity. We find relations between the changes in magnetic energy and the flare magnitudes. Compared with previous studies, our results indicate that the magnetic energy decrease occurs before the flare and CME launch time. We also combine images from the Atmospheric Imaging Assembly (AIA) to further explore the detailed process of the eruptions.
Time series inversion of spectra from ground-based radiometers
NASA Astrophysics Data System (ADS)
Christensen, O. M.; Eriksson, P.
2013-02-01
Retrieving time series of atmospheric constituents from ground-based spectrometers often requires different temporal averaging depending on the altitude region in focus. This can lead to several datasets existing for one instrument which complicates validation and comparisons between instruments. This paper puts forth a possible solution by incorporating the temporal domain into the maximum a posteriori (MAP) retrieval algorithm. The state vector is increased to include measurements spanning a time period, and the temporal correlations between the true atmospheric states are explicitly specified in the a priori uncertainty matrix. This allows the MAP method to effectively select the best temporal smoothing for each altitude, removing the need for several datasets to cover different altitudes. The method is compared to traditional averaging of spectra using a simulated retrieval of water vapour in the mesosphere. The simulations show that the method offers a significant advantage compared to the traditional method, extending the sensitivity an additional 10 km upwards without reducing the temporal resolution at lower altitudes. The method is also tested on the OSO water vapour microwave radiometer confirming the advantages found in the simulation. Additionally, it is shown how the method can interpolate data in time and provide diagnostic values to evaluate the interpolated data.
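The retrieval idea above, a MAP estimate whose a priori covariance encodes temporal correlation across an extended state vector, can be sketched with the standard optimal-estimation update x̂ = xa + Sa Kᵀ (K Sa Kᵀ + Se)⁻¹ (y − K xa). The forward model, correlation length, and noise levels below are illustrative assumptions, not the OSO radiometer configuration.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 20, 15                       # state size (profile x time), measurements
K = rng.standard_normal((m, n))     # linearized forward model (assumed)
x_true = np.sin(np.linspace(0, 3, n))
y = K @ x_true + 0.1 * rng.standard_normal(m)

xa = np.zeros(n)                    # a priori state
# a priori covariance with exponential temporal correlation between states
t = np.arange(n)
Sa = np.exp(-np.abs(t[:, None] - t[None, :]) / 3.0)
Se = 0.01 * np.eye(m)               # measurement-noise covariance

G = Sa @ K.T @ np.linalg.inv(K @ Sa @ K.T + Se)   # gain matrix
x_hat = xa + G @ (y - K @ xa)       # MAP estimate
A = G @ K                           # averaging kernel matrix
```

The temporal correlation in Sa is what lets the method trade temporal smoothing against vertical sensitivity per altitude, instead of averaging spectra beforehand.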
Amplitude variations in the sdBV star PG 1605+072: Another beating time scale?
NASA Astrophysics Data System (ADS)
Pereira, T. M. D.; Lopes, I. P.
2004-10-01
PG 1605+072 has a unique and complex oscillation spectrum amongst the pulsating members of the EC 14026 stars. It has the longest periods and the richest, most puzzling frequency spectrum. We present a quantitative analysis of the photometric time series obtained at the 1-m telescope of the South African Astronomical Observatory. Thirteen oscillation parameters (frequencies, amplitudes, and initial phases) were determined from a 45 h time series. Our work confirms previous observational results: the observed frequencies are within 2.7% of the theoretical values, and within 0.1% of other previous studies. We also infer a 4-5 day periodic variation in the amplitude of the observed modes, similar to the yearly time-scale variation found by previous studies. Furthermore, we found a new frequency of 2133 μHz which has not been previously reported, its origin being as yet unclear. Based on observations obtained at the South African Astronomical Observatory (SAAO). This research was supported by a grant from Fundação da Ciência e Tecnologia, grant No. PESO/P/PRO/40142/2000.
NASA Astrophysics Data System (ADS)
Patra, S. R.
2017-12-01
Evapotranspiration (ET0) influences water resources and is considered a vital process in arid hydrologic systems. It is one of the most important measures for identifying drought conditions. Therefore, time series forecasting of evapotranspiration is very important to help decision makers and water system managers build up proper systems to sustain and manage water resources. Time series analysis assumes that history repeats itself; hence, by analysing past values, better choices, or forecasts, can be made for the future. Ten years of ET0 data were used as part of this study to ensure a satisfactory forecast of monthly values. In this study, three models are presented: an autoregressive integrated moving average (ARIMA) model, an artificial neural network (ANN) model, and a support vector machine (SVM) model. These three models are used for forecasting monthly reference crop evapotranspiration based on ten years of past historical records (1991-2001) of measured evaporation at Ganjam region, Odisha, India, without considering climate data. The developed models allow water resource managers to predict up to 12 months ahead, making these predictions very useful for optimizing the resources needed for effective water resources management. In this study, multistep-ahead prediction is performed, which is more complex and troublesome than one-step-ahead prediction. Our investigation suggests that nonlinear relationships may exist among the monthly indices, so that the ARIMA model might not be able to effectively extract the full relationship hidden in the historical data. Support vector machines are potentially helpful time series forecasting strategies on account of their strong nonlinear mapping capability and resistance to complexity in forecasting data. SVMs have greater learning capability in time series modelling than ANNs.
For instance, SVMs implement the structural risk minimization principle, which yields better generalization than neural networks, which use the empirical risk minimization principle. The reliability of these computational models was analysed in light of simulation results, and it was found that the SVM model produces the best results among the three. Future research should extend the validation data set and check the validity of these results on different areas with hybrid intelligence techniques.
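Multistep-ahead forecasting with a recursive model, as discussed above, feeds each prediction back in as an input for the next step. Below is a minimal sketch using a least-squares autoregression on a synthetic monthly cycle; the synthetic series and lag order are illustrative stand-ins for the ET0 data, not the paper's fitted SVM.

```python
import numpy as np

def fit_ar(x, p):
    """Least-squares AR(p) fit: x_t ~ c + sum_i a_i * x_{t-i}."""
    X = np.column_stack([x[p - i - 1:len(x) - i - 1] for i in range(p)])
    X = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return coef

def forecast_recursive(x, coef, steps):
    """Multistep-ahead: each prediction is fed back as an input."""
    p = len(coef) - 1
    hist = list(x[-p:])
    out = []
    for _ in range(steps):
        nxt = coef[0] + sum(coef[i + 1] * hist[-1 - i] for i in range(p))
        out.append(nxt)
        hist.append(nxt)
    return np.array(out)

t = np.arange(240)
x = 5 + 2 * np.sin(2 * np.pi * t / 12)      # synthetic monthly ET0-like cycle
coef = fit_ar(x, p=13)
pred = forecast_recursive(x, coef, steps=12)  # 12-month-ahead forecast
```

Because errors compound through the feedback loop, recursive multistep forecasts degrade faster than one-step-ahead forecasts on noisy data, which is the difficulty the abstract alludes to.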
Analysis of satellite precipitation over East Africa during last decades
NASA Astrophysics Data System (ADS)
Cattani, Elsa; Wenhaji Ndomeni, Claudine; Merino, Andrés; Levizzani, Vincenzo
2016-04-01
Daily accumulated precipitation time series from satellite retrieval algorithms (e.g., ARC2 and TAMSAT) are exploited to extract the spatial and temporal variability of East Africa (EA - 5°S-20°N, 28°E-52°E) precipitation during the last decades (1983-2013). The Empirical Orthogonal Function (EOF) analysis is applied to the precipitation time series to investigate the spatial and temporal variability, in particular for October-November-December, referred to as the short rain season. Moreover, the connection among EA precipitation, sea surface temperature, and soil moisture is analyzed through the correlation with the dominant EOF modes of variability. Preliminary results concern the first two EOF modes for the ARC2 data set. EOF1 is characterized by an inter-annual variability and a positive correlation between precipitation and El Niño, the positive Indian Ocean Dipole mode, and soil moisture, while EOF2 shows a dipole structure of spatial variability associated with a longer-scale temporal variability. This second dominant mode is mostly linked to sea surface temperature variations in the North Atlantic Ocean. Further analyses are carried out by computing the time series of the joint CCI/CLIVAR/JCOMM Expert Team on Climate Change Detection and Indices (ETCCDI, http://etccdi.pacificclimate.org/index.shtml), i.e. RX1day, RX5day, CDD, CWD, SDII, PRCPTOT, R10, R20. The purpose is to identify the occurrences of extreme events (droughts and floods) and extract precipitation temporal variation by trend analysis (Mann-Kendall technique). Results for the ARC2 data set demonstrate the existence of a dipole spatial pattern in the linear trend of the time series of PRCPTOT (annual precipitation considering days with a rain rate > 1 mm) and SDII (average precipitation on wet days over a year). A negative trend is mainly present over West Ethiopia and Sudan, whereas a positive trend is exhibited over East Ethiopia and Somalia.
CDD (maximum number of consecutive dry days) and CWD (maximum number of consecutive wet days) time series do not exhibit a similar behavior and trends are generally weaker with a lower significance level with respect to PRCPTOT and SDII.
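The ETCCDI indices named above have simple operational definitions. A minimal sketch for a single year of daily precipitation follows (a wet day is pr ≥ 1 mm, per the ETCCDI convention; the synthetic series with one 10-day wet spell is illustrative):

```python
import numpy as np

def longest_run(mask):
    """Length of the longest run of True values."""
    best = cur = 0
    for m in mask:
        cur = cur + 1 if m else 0
        best = max(best, cur)
    return best

def etccdi_indices(pr, wet=1.0):
    """CDD, CWD, PRCPTOT, and SDII from one year of daily precip (mm)."""
    wet_days = pr >= wet
    prcptot = pr[wet_days].sum()
    return {
        "CDD": longest_run(pr < wet),              # longest dry spell (days)
        "CWD": longest_run(wet_days),              # longest wet spell (days)
        "PRCPTOT": prcptot,                        # total wet-day precip (mm)
        "SDII": prcptot / max(wet_days.sum(), 1),  # mean precip on wet days
    }

pr = np.zeros(365)
pr[100:110] = 5.0        # one 10-day wet spell of 5 mm/day
idx = etccdi_indices(pr)
```

For this toy year the wet spell gives CWD = 10 and SDII = 5 mm/day, and the longer of the two dry stretches gives CDD = 255.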
Granger causality--statistical analysis under a configural perspective.
von Eye, Alexander; Wiedermann, Wolfgang; Mun, Eun-Young
2014-03-01
The concept of Granger causality can be used to examine putative causal relations between two series of scores. Based on regression models, it is asked whether one series can be considered the cause for the second series. In this article, we propose extending the pool of methods available for testing hypotheses that are compatible with Granger causation by adopting a configural perspective. This perspective allows researchers to assume that effects exist for specific categories only or for specific sectors of the data space, but not for other categories or sectors. Configural Frequency Analysis (CFA) is proposed as the method of analysis from a configural perspective. CFA base models are derived for the exploratory analysis of Granger causation. These models are specified so that they parallel the regression models used for variable-oriented analysis of hypotheses of Granger causation. An example from the development of aggression in adolescence is used. The example shows that only one pattern of change in aggressive impulses over time Granger-causes change in physical aggression against peers.
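The regression-based test behind Granger causation compares a restricted autoregression of y on its own lags with a full model that also includes lags of x; a significant F statistic means the lags of x improve the prediction. A minimal sketch follows (the configural CFA extension proposed in the article is not reproduced here; the lag order and simulated coupling are illustrative):

```python
import numpy as np

def granger_f(x, y, p=2):
    """F-test: do lags of x improve prediction of y beyond lags of y?"""
    n = len(y) - p
    Y = y[p:]
    lag = lambda s, i: s[p - i:len(s) - i]
    restricted = np.column_stack([np.ones(n)] + [lag(y, i) for i in range(1, p + 1)])
    full = np.column_stack([restricted] + [lag(x, i) for i in range(1, p + 1)])
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    rss_r, rss_f = rss(restricted), rss(full)
    return ((rss_r - rss_f) / p) / (rss_f / (n - full.shape[1]))

rng = np.random.default_rng(2)
x = rng.standard_normal(500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.8 * x[t - 1] + 0.3 * rng.standard_normal()  # x Granger-causes y
F_xy = granger_f(x, y)      # large: lags of x predict y
F_yx = granger_f(y, x)      # small: lags of y do not predict x
```

The asymmetry of the two F statistics is what licenses the directional reading of "x Granger-causes y".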
You, Shutang; Hadley, Stanton W.; Shankar, Mallikarjun; ...
2016-01-12
This paper studies the generation and transmission expansion co-optimization problem with a high wind power penetration rate in the US Eastern Interconnection (EI) power grid. In this paper, the generation and transmission expansion problem for the EI system is modeled as a mixed-integer programming (MIP) problem. This paper also analyzes a time series generation method to capture the variation and correlation of both load and wind power across regions. The obtained series can be easily introduced into the expansion planning problem and then solved through existing MIP solvers. Simulation results show that the proposed planning model and series generation method can improve the expansion result significantly by modeling more detailed information of wind and load variation among regions in the US EI system. Moreover, the improved expansion plan that combines generation and transmission will aid system planners and policy makers in maximizing the social welfare in large-scale power grids.
InSAR time series analysis of ALOS-2 ScanSAR data and its implications for NISAR
NASA Astrophysics Data System (ADS)
Liang, C.; Liu, Z.; Fielding, E. J.; Huang, M. H.; Burgmann, R.
2017-12-01
The JAXA's ALOS-2 mission was launched on May 24, 2014. It operates at L-band and can acquire data in multiple modes. ScanSAR is the main operational mode and has a 350 km swath, somewhat larger than the 250 km swath of the SweepSAR mode planned for the NASA-ISRO SAR (NISAR) mission. ALOS-2 has been acquiring a wealth of L-band InSAR data. These data are of particular value in areas of dense vegetation and high relief. The InSAR technical development for ALOS-2 also enables the preparation for the upcoming NISAR mission. We have been developing advanced InSAR processing techniques for ALOS-2 over the past two years. Here, we report the important issues for doing InSAR time series analysis using ALOS-2 ScanSAR data. First, we present ionospheric correction techniques for both regular ScanSAR InSAR and MAI (multiple aperture InSAR) ScanSAR InSAR. We demonstrate the large-scale ionospheric signals in the ScanSAR interferograms. They can be well mitigated by the correction techniques. Second, based on our technical development of burst-by-burst InSAR processing for ALOS-2 ScanSAR data, we find that the azimuth Frequency Modulation (FM) rate error is an important issue not only for MAI, but also for regular InSAR time series analysis. We identify phase errors caused by azimuth FM rate errors during the focusing process of ALOS-2 product. The consequence is mostly a range ramp in the InSAR time series result. This error exists in all of the time series results we have processed. We present the correction techniques for this error following a theoretical analysis. After corrections, we present high quality ALOS-2 ScanSAR InSAR time series results in a number of areas. The development for ALOS-2 can provide important implications for NISAR mission. For example, we find that in most cases the relative azimuth shift caused by ionosphere can be as large as 4 m in a large area imaged by ScanSAR. 
This azimuth shift is half of the 8 m azimuth resolution of the SweepSAR mode planned for NISAR, which implies that a good coregistration strategy for NISAR's SweepSAR mode is geometrical coregistration followed by MAI or spectral diversity analysis. Besides, our development also provides implications for the processing and system parameter requirements of NISAR, such as the accuracy requirement of azimuth FM rate and range timing.
Challenges to Standardization: A Case Study Using Coastal and Deep-Ocean Water Level Data
NASA Astrophysics Data System (ADS)
Sweeney, A. D.; Stroker, K. J.; Mungov, G.; McLean, S. J.
2015-12-01
Sea levels recorded at coastal stations and inferred from deep-ocean pressure observations at the seafloor are submitted for archive in multiple data and metadata formats. These formats include two forms of schema-less XML and a custom binary format accompanied by metadata in a spreadsheet. The authors report on efforts to use existing standards to make this data more discoverable and more useful beyond their initial use in detecting tsunamis. An initial review of data formats for sea level data around the globe revealed heterogeneity in presentation and content. In the absence of a widely-used domain-specific format, we adopted the general model for structuring data and metadata expressed by the Network Common Data Form (netCDF). netCDF has been endorsed by the Open Geospatial Consortium and has the advantages of small size when compared to equivalent plain text representation and provides a standard way of embedding metadata in the same file. We followed the orthogonal time-series profile of the Climate and Forecast discrete sampling geometries as the convention for structuring the data and describing metadata relevant for use. We adhered to the Attribute Convention for Data Discovery for capturing metadata to support user search. Beyond making it possible to structure data and metadata in a standard way, netCDF is supported by multiple software tools in providing programmatic cataloging, access, subsetting, and transformation to other formats. We will describe our successes and failures in adhering to existing standards and provide requirements for either augmenting existing conventions or developing new ones. Some of these enhancements are specific to sea level data, while others are applicable to time-series data in general.
NASA Astrophysics Data System (ADS)
Krietemeyer, Andreas; ten Veldhuis, Marie-claire; van de Giesen, Nick
2017-04-01
Exploiting GNSS signal delays is one possibility to obtain Precipitable Water Vapor (PWV) estimates in the atmosphere. The technique is well known since the early 1990s and by now an established method in the meteorological community. The data is crucial for weather forecasting and its assimilation into numerical weather forecasting models is a topic of ongoing research. However, the spatial resolution of ground based GNSS receivers is usually low, in the order of tens of kilometres. Since severe weather events such as convective storms can be concentrated in spatial extent, existing GNSS networks are often not sufficient to retrieve small scale PWV fluctuations and need to be densified. For economic reasons, the use of low-cost single-frequency receivers is a promising solution. In this study, we will deploy a network of single-frequency receivers to densify an existing dual-frequency network in order to investigate the spatial and temporal PWV variations. We demonstrate a test network consisting of four single-frequency receivers in the Rotterdam area (Netherlands). In order to eliminate the delay caused by the ionosphere, the Satellite-specific Epoch-differenced Ionospheric Delay model (SEID) is applied, using a surrounding dual-frequency network distributed over a radius of approximately 25 km. With the synthesized L2 frequency, the tropospheric delays are estimated using the Precise Point Positioning (PPP) strategy and International GNSS Service (IGS) final orbits. The PWV time series are validated by a comparison of a collocated single-frequency and a dual-frequency receiver. The time series themselves form the basis for potential further studies like data assimilation into numerical weather models and GNSS tomography to study the impact of the increased spatial resolution on local heavy rain forecast.
Artificial neural networks for modeling time series of beach litter in the southern North Sea.
Schulz, Marcus; Matthies, Michael
2014-07-01
In European marine waters, existing monitoring programs of beach litter need to be improved concerning litter items used as indicators of pollution levels, efficiency, and effectiveness. In order to ease and focus future monitoring of beach litter on a few important litter items, feed-forward neural networks consisting of three layers were developed to relate single litter items to general categories of marine litter. The neural networks developed were applied to seven beaches in the southern North Sea and modeled time series of five general categories of marine litter, such as litter from fishing, shipping, and tourism. Results of regression analyses show that general categories were predicted with moderate to good accuracy. Measured and modeled data were of the same order of magnitude, and minima and maxima overlapped well. Neural networks were found to be eligible tools for delivering reliable predictions of marine litter with low computational effort and little input of information. Copyright © 2014 Elsevier Ltd. All rights reserved.
Hopke, P K; Liu, C; Rubin, D B
2001-03-01
Many chemical and environmental data sets are complicated by the existence of fully missing values or censored values known to lie below detection thresholds. For example, week-long samples of airborne particulate matter were obtained at Alert, NWT, Canada, between 1980 and 1991, where some of the concentrations of 24 particulate constituents were coarsened in the sense of being either fully missing or below detection limits. To facilitate scientific analysis, it is appealing to create complete data by filling in missing values so that standard complete-data methods can be applied. We briefly review commonly used strategies for handling missing values and focus on the multiple-imputation approach, which generally leads to valid inferences when faced with missing data. Three statistical models are developed for multiply imputing the missing values of airborne particulate matter. We expect that these models are useful for creating multiple imputations in a variety of incomplete multivariate time series data sets.
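The below-detection-limit case described above can be illustrated with a small conditional-draw sketch: censored values are replaced by draws from a fitted lognormal truncated above the detection limit, yielding several completed copies of the series. This is a simplification of proper multiple imputation, which would also propagate parameter uncertainty between copies (e.g., by redrawing the fitted mean and standard deviation from their posterior); all names and constants here are illustrative, not the authors' three models.

```python
import numpy as np

def multiply_impute(x, detect_limit, n_imp=5, seed=0):
    """Return n_imp completed copies of x, replacing values below the
    detection limit with draws from the fitted lognormal, truncated above
    at the detection limit (via rejection sampling)."""
    rng = np.random.default_rng(seed)
    censored = x < detect_limit
    logs = np.log(x[~censored])
    mu, sd = logs.mean(), logs.std()
    copies = []
    for _ in range(n_imp):
        draw = np.exp(rng.normal(mu, sd, censored.sum()))
        while (draw >= detect_limit).any():          # keep only sub-limit draws
            bad = draw >= detect_limit
            draw[bad] = np.exp(rng.normal(mu, sd, bad.sum()))
        filled = x.copy()
        filled[censored] = draw
        copies.append(filled)
    return copies

rng = np.random.default_rng(1)
x = np.exp(rng.normal(0.0, 1.0, 200))   # synthetic concentration series
copies = multiply_impute(x, detect_limit=0.2)
```

Each completed copy can then be analyzed with standard complete-data methods, and the results combined across copies.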
Comparison of floods non-stationarity detection methods: an Austrian case study
NASA Astrophysics Data System (ADS)
Salinas, Jose Luis; Viglione, Alberto; Blöschl, Günter
2016-04-01
Non-stationarities in flood regimes have a huge impact on any mid- and long-term flood management strategy. In particular, the estimation of design floods is very sensitive to any kind of flood non-stationarity, as design floods should be linked to a return period, a concept that can be ill defined in a non-stationary context. It is therefore crucial, when analyzing existing flood time series, to detect and, where possible, attribute flood non-stationarities to changing hydroclimatic and land-use processes. This work presents the preliminary results of applying different non-stationarity detection methods to annual peak discharge time series from more than 400 gauging stations in Austria. The kinds of non-stationarities analyzed include trends (linear and non-linear), breakpoints, clustering beyond stochastic randomness, and detection of flood-rich/flood-poor periods. Austria presents a large variety of landscapes, elevations, and climates that allow us to interpret the spatial patterns obtained with the non-stationarity detection methods in terms of the dominant flood generation mechanisms.
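Among the detection methods mentioned, the Mann-Kendall test is the standard non-parametric trend check for annual peak series. A minimal sketch follows (normal approximation, no tie correction; operational implementations add both, and the synthetic trended series is illustrative):

```python
import numpy as np
from math import erf, sqrt

def mann_kendall(x):
    """Mann-Kendall trend test: S statistic, Z score, two-sided p-value."""
    n = len(x)
    s = sum(np.sign(x[j] - x[i])        # pairwise concordance count
            for i in range(n - 1) for j in range(i + 1, n))
    var = n * (n - 1) * (2 * n + 5) / 18
    z = (s - np.sign(s)) / sqrt(var) if s != 0 else 0.0   # continuity corr.
    p = 1 - erf(abs(z) / sqrt(2))       # 2 * (1 - Phi(|z|))
    return s, z, p

rng = np.random.default_rng(3)
series = np.arange(100) * 0.05 + rng.standard_normal(100)  # trend + noise
s, z, p = mann_kendall(series)          # strong upward trend: small p
```

A positive S with a small p-value indicates a significant upward trend; breakpoint tests (e.g., Pettitt-type statistics) follow the same rank-based spirit.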
The evolution of monitoring system: the INFN-CNAF case study
NASA Astrophysics Data System (ADS)
Bovina, Stefano; Michelotto, Diego
2017-10-01
Over the past two years, the operations at CNAF, the ICT center of the Italian Institute for Nuclear Physics, have undergone significant changes. The adoption of configuration management tools, such as Puppet, and the constant increase of dynamic and cloud infrastructures have led us to investigate a new monitoring approach. The present work deals with the centralization of the monitoring service at CNAF through a scalable and highly configurable monitoring infrastructure. The selection of tools has been made taking into account the following requirements given by users: (I) adaptability to dynamic infrastructures, (II) ease of configuration and maintenance, with the capability to provide more flexibility, (III) compatibility with the existing monitoring system, and (IV) re-usability and ease of access to information and data. The CNAF monitoring infrastructure and its components are described in the paper: Sensu as monitoring router, InfluxDB as time series database to store data gathered from sensors, Uchiwa as monitoring dashboard, and Grafana as a tool to create dashboards and to visualize time series metrics.
Cosmogenic 36Cl in karst waters: Quantifying contributions from atmospheric and bedrock sources
NASA Astrophysics Data System (ADS)
Johnston, V. E.; McDermott, F.
2009-12-01
Improved reconstructions of cosmogenic isotope production through time are crucial to understand past solar variability. As a preliminary step to derive atmospheric 36Cl/Cl solar proxy time-series from speleothems, we quantify 36Cl sources in cave dripwaters. Atmospheric 36Cl fallout rates are a potential proxy for solar output; however extraneous 36Cl derived from in-situ production in cave host-rocks could complicate the solar signal. Results from numerical modeling and preliminary geochemical data presented here show that the atmospheric 36Cl source dominates in many, but not all cave dripwaters. At favorable low elevation, mid-latitude sites, 36Cl based speleothem solar irradiance reconstructions could extend back to 500 ka, with a possible centennial scale temporal resolution. This would represent a marginal improvement in resolution compared with existing polar ice core records, with the added advantages of a wider geographic range, independent U-series constrained chronology, and the potential for contemporaneous climate signals within the same speleothem material.
Forecasting dengue hemorrhagic fever cases using ARIMA model: a case study in Asahan district
NASA Astrophysics Data System (ADS)
Siregar, Fazidah A.; Makmur, Tri; Saprin, S.
2018-01-01
Time series analysis has been increasingly used to forecast the number of dengue hemorrhagic fever cases in many studies. Since no vaccine exists and public health infrastructure is poor, predicting the occurrence of dengue hemorrhagic fever (DHF) is crucial. This study was conducted to determine the trend of and to forecast the occurrence of DHF in Asahan district, North Sumatera Province. Monthly reported dengue cases for the years 2012-2016 were obtained from the district health offices. A time series analysis was conducted by autoregressive integrated moving average (ARIMA) modeling to forecast the occurrence of DHF. The results demonstrated that the reported DHF cases showed a seasonal variation. The SARIMA (1,0,0)(0,1,1)12 model was the best model and was adequate for the data. The SARIMA model could be applied to predict the incidence of DHF in Asahan district and to assist in designing public health measures to prevent and control the disease.
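A SARIMA(1,0,0)(0,1,1)12 model combines a non-seasonal AR(1) term with seasonal differencing at lag 12 and a seasonal MA(1) term. The sketch below keeps only the seasonal differencing and the AR(1) step (the seasonal MA term and maximum-likelihood fitting are omitted, and the synthetic case counts are illustrative), so it demonstrates the mechanics rather than the study's fitted model:

```python
import numpy as np

def seasonal_ar_forecast(x, steps=12, s=12):
    """Simplified SARIMA-style forecast: seasonally difference at lag s,
    fit AR(1) on the differenced series by least squares, forecast,
    then undo the differencing."""
    d = x[s:] - x[:-s]                                       # seasonal diff
    phi = np.dot(d[1:], d[:-1]) / np.dot(d[:-1], d[:-1])     # AR(1) coeff
    last, hist = d[-1], list(x)
    preds = []
    for _ in range(steps):
        last = phi * last                  # AR(1) step on differenced series
        preds.append(hist[-s] + last)      # re-integrate: add value s ago
        hist.append(preds[-1])
    return np.array(preds)

t = np.arange(60)                          # 5 years of monthly counts
rng = np.random.default_rng(4)
cases = 50 + 30 * np.sin(2 * np.pi * t / 12) + 5 * rng.standard_normal(60)
fc = seasonal_ar_forecast(cases)           # 12-month-ahead forecast
```

The re-integration step is what carries the seasonal shape of the last observed year into the forecast, which is why the model suits strongly seasonal case counts.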
Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P
2014-10-01
There are now domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, like medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events, and compares two time series by analyzing the events they have in common. We have applied our framework to time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed schema has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains generating iconographic time series, such as electrocardiography (ECG). Copyright © 2014 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Mauelshagen, F.
2009-09-01
Switzerland lies almost in the centre of a zone of high frequency of hailstorm occurrence, often causing costly damage to agriculture, motor vehicles, the built environment and, consequentially, to insurance companies. Over the last ten years hailstorms and the resulting damage have been discussed, with some noticeable frequency, in the context of recent climate change. The final report of the Swiss National Research Programme No. 31, "Climate Change and Natural Disasters" (NFP 31: Klimaänderung und Naturkatastrophen), concludes that "the number of days with agricultural hail damage has increased". This can be demonstrated from time series of days with severe hail occurrence in Switzerland between 1920 and 2005. Radar observations provide evidence for a doubling of severe hailstorms (on a scale >100 km) within the twenty-year period 1983 to 2003. More recent large-scale damage resulted from hailstorms on 24 June 2002 (causing damage of approx. 250 million CHF on insured risks) and 8 July 2004 (causing a loss of 100 million on car insurance alone). 2007 was particularly disastrous for crop insurance. The latest OcCC report on Klimaänderung und die Schweiz 2050 ("Climate Change and Switzerland, 2050") concludes that farmers, house owners, and insurers should prepare for more extreme hailstorms to come if the frequency of synoptic weather situations favouring hailstorms develops along the trend of the last two decades. However, the same report argues that hailstorms can hardly be simulated by existing climate models, because hail occurrence is a local phenomenon. In other words, existing models of global warming cannot predict the effect global change is likely to have on hailstorm patterns (frequency, severity, etc.), which is partly due to the limits of existing time series on hailstorm occurrence. For hail, the instrumental period does not begin before the 1950s. As early as 1954, Meteo-Swiss meteorologist M. Bider stated that insurance data were more reliable than observations from the existing network of meteorological offices. Some researchers have even suggested that the entire period before radar observation, beginning in the 1980s, should be classified as pre-instrumental. However, it is undoubted that documents kept in the archives of insurance companies provide valuable proxy information on hailstorm occurrence for, at least, the pre-1950 period (well back into the 19th century). This paper discusses key problems in dealing with these proxy data (reliability, interpretation, and density of records), as well as methodologies that may help extend existing time series on hailstorm occurrence in Switzerland. As a consequence, this paper suggests that, for some meteorological phenomena, the field of reconstruction from documentary archival sources must be extended well up into the 20th century, which cannot simply and statically be categorized as the "instrumental period".
Guan, Hongjun; Dai, Zongli; Zhao, Aiwu; He, Jie
2018-01-01
In this paper, we propose a hybrid method to forecast stock prices, called the High-order-fuzzy-fluctuation-Trends-based Back Propagation (HTBP) Neural Network model. First, we compare each value of the historical training data with the previous day's value to obtain a fluctuation trend time series (FTTS). On this basis, the FTTS is fuzzified into a fuzzy time series (FFTS) based on the amplitude and direction of the increasing, equal, and decreasing fluctuations. Since the relationship between the FFTS and future fluctuation trends is nonlinear, the HTBP neural network algorithm is used to find the mapping rules through self-learning. Finally, the output of the algorithm is used to predict future fluctuations. The proposed model provides some innovative features: (1) It combines fuzzy set theory and a neural network algorithm to avoid the overfitting problems that exist in traditional models. (2) The BP neural network algorithm can intelligently explore the internal rules of the actual sequential data, without the need to analyze the influence factors of specific rules or their paths of action. (3) The hybrid model can reasonably remove noise from the internal rules by proper fuzzy treatment. This paper takes the TAIEX data set of the Taiwan stock exchange as an example, and compares and analyzes the prediction performance of the model. The experimental results show that this method can predict the stock market in a very simple way. At the same time, we use this method to predict the Shanghai stock exchange composite index, further verifying the effectiveness and universality of the method. PMID:29420584
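The first two steps of the method (building the fluctuation trend series, then labelling each fluctuation) can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the paper's exact fuzzification: the function names and the fixed threshold are hypothetical, whereas the paper derives fuzzy memberships from the amplitude distribution of the data.

```python
import numpy as np

def fluctuation_trend_series(prices):
    """Day-over-day differences of a price series (the FTTS idea)."""
    prices = np.asarray(prices, dtype=float)
    return np.diff(prices)

def fuzzify(ftts, threshold):
    """Map each fluctuation to a crisp trend label as a stand-in for
    the paper's fuzzy sets: 1 = increase, 0 = roughly equal, -1 = decrease."""
    labels = np.zeros(len(ftts), dtype=int)
    labels[ftts > threshold] = 1
    labels[ftts < -threshold] = -1
    return labels

prices = [100.0, 101.5, 101.4, 99.0, 99.1]
ftts = fluctuation_trend_series(prices)
labels = fuzzify(ftts, threshold=0.5)
print(labels)  # -> [ 1  0 -1  0]
```

In the full model these labels (or fuzzy memberships) form the input sequence from which the BP network learns the mapping to the next day's fluctuation.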
Fractal scaling analysis of groundwater dynamics in confined aquifers
NASA Astrophysics Data System (ADS)
Tu, Tongbi; Ercan, Ali; Kavvas, M. Levent
2017-10-01
Groundwater closely interacts with surface water and even climate systems in most hydroclimatic settings. Fractal scaling analysis of groundwater dynamics is of significance in modeling hydrological processes by considering potential temporal long-range dependence and scaling crossovers in the groundwater level fluctuations. In this study, it is demonstrated that the groundwater level fluctuations in confined aquifer wells with long observations exhibit site-specific fractal scaling behavior. Detrended fluctuation analysis (DFA) was utilized to quantify the monofractality, and multifractal detrended fluctuation analysis (MF-DFA) and multiscale multifractal analysis (MMA) were employed to examine the multifractal behavior. The DFA results indicated that fractals exist in groundwater level time series, and it was shown that the estimated Hurst exponent is closely dependent on the length and specific time interval of the time series. The MF-DFA and MMA analyses showed that different levels of multifractality exist, which may be partially due to a broad probability density distribution with infinite moments. Furthermore, it is demonstrated that the underlying distribution of groundwater level fluctuations exhibits either non-Gaussian characteristics, which may be fitted by the Lévy stable distribution, or Gaussian characteristics depending on the site characteristics. However, fractional Brownian motion (fBm), which has been identified as an appropriate model to characterize groundwater level fluctuation, is Gaussian with finite moments. Therefore, fBm may be inadequate for the description of physical processes with infinite moments, such as the groundwater level fluctuations in this study. It is concluded that there is a need for generalized governing equations of groundwater flow processes that can model both the long-memory behavior and the Brownian finite-memory behavior.
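The DFA step used in the study can be sketched as follows. This is generic order-1 DFA (integrate, detrend in windows, measure RMS fluctuation versus window size), not the study's exact pipeline; the function name and scale choices are illustrative.

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis with order-1 (linear) detrending.
    Returns F(n) for each window size n; the slope of log F(n) vs log n
    estimates the Hurst-like scaling exponent alpha."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())              # integrated profile
    F = []
    for n in scales:
        n_win = len(y) // n
        ms = []
        for i in range(n_win):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coeffs = np.polyfit(t, seg, 1)   # local linear trend
            ms.append(np.mean((seg - np.polyval(coeffs, t)) ** 2))
        F.append(np.sqrt(np.mean(ms)))
    return np.array(F)

rng = np.random.default_rng(0)
white = rng.standard_normal(4096)
scales = [16, 32, 64, 128, 256]
F = dfa(white, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(round(alpha, 2))  # close to 0.5 for uncorrelated noise
```

Persistent series such as groundwater levels would yield alpha well above 0.5, and a bend in the log-log plot marks the scaling crossovers the abstract refers to.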
Modeling Individual Cyclic Variation in Human Behavior.
Pierson, Emma; Althoff, Tim; Leskovec, Jure
2018-04-01
Cycles are fundamental to human health and behavior. Examples include mood cycles, circadian rhythms, and the menstrual cycle. However, modeling cycles in time series data is challenging because in most cases the cycles are not labeled or directly observed and need to be inferred from multidimensional measurements taken over time. Here, we present Cyclic Hidden Markov Models (CyHMMs) for detecting and modeling cycles in a collection of multidimensional heterogeneous time series data. In contrast to previous cycle modeling methods, CyHMMs deal with a number of challenges encountered in modeling real-world cycles: they can model multivariate data with both discrete and continuous dimensions; they explicitly model and are robust to missing data; and they can share information across individuals to accommodate variation both within and between individual time series. Experiments on synthetic and real-world health-tracking data demonstrate that CyHMMs infer cycle lengths more accurately than existing methods, with 58% lower error on simulated data and 63% lower error on real-world data compared to the best-performing baseline. CyHMMs can also perform functions which baselines cannot: they can model the progression of individual features/symptoms over the course of the cycle, identify the most variable features, and cluster individual time series into groups with distinct characteristics. Applying CyHMMs to two real-world health-tracking datasets (of human menstrual cycle symptoms and physical activity tracking data) yields important insights including which symptoms to expect at each point during the cycle. We also find that people fall into several groups with distinct cycle patterns, and that these groups differ along dimensions not provided to the model. For example, by modeling missing data in the menstrual cycles dataset, we are able to discover a medically relevant group of birth control users even though information on birth control is not given to the model. PMID:29780976
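The structural idea behind a cyclic HMM can be sketched with a ring of hidden states in which each step either stays put or advances to the next state. The full CyHMM also models discrete/continuous emissions and missing data; the sketch below, with assumed names and parameters, only illustrates the cyclic hidden-state backbone that determines cycle length.

```python
import numpy as np

def cyclic_transition_matrix(k, p_advance):
    """K hidden states arranged in a ring; each step either stays
    (prob 1 - p_advance) or advances to the next state, wrapping around.
    The expected cycle length is then K / p_advance."""
    T = np.zeros((k, k))
    for i in range(k):
        T[i, i] = 1.0 - p_advance
        T[i, (i + 1) % k] = p_advance
    return T

def sample_states(T, n_steps, rng):
    """Sample a hidden-state path from the transition matrix."""
    states = [0]
    k = T.shape[0]
    for _ in range(n_steps - 1):
        states.append(rng.choice(k, p=T[states[-1]]))
    return np.array(states)

rng = np.random.default_rng(1)
T = cyclic_transition_matrix(k=4, p_advance=0.5)
path = sample_states(T, 2000, rng)

# estimate the cycle length from successive returns to state 0
returns = np.flatnonzero(np.diff((path == 0).astype(int)) == 1)
mean_cycle = np.mean(np.diff(returns))
print(round(mean_cycle, 1))  # near K / p_advance = 8
```

Fitting such a model to observations (rather than sampling from it) would attach an emission distribution to each ring state and run the usual Baum-Welch updates.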
Identifying stochastic oscillations in single-cell live imaging time series using Gaussian processes
Manning, Cerys; Rattray, Magnus
2017-01-01
Multiple biological processes are driven by oscillatory gene expression at different time scales. Pulsatile dynamics are thought to be widespread, and single-cell live imaging of gene expression has led to a surge of dynamic, possibly oscillatory, data for different gene networks. However, the regulation of gene expression at the level of an individual cell involves reactions between finite numbers of molecules, and this can result in inherent randomness in expression dynamics, which blurs the boundaries between aperiodic fluctuations and noisy oscillators. This poses a new challenge to the experimentalist because neither intuition nor pre-existing methods work well for identifying oscillatory activity in noisy biological time series. Thus, there is an acute need for an objective statistical method for classifying whether an experimentally derived noisy time series is periodic. Here, we present a new data analysis method that combines mechanistic stochastic modelling with the powerful methods of non-parametric regression with Gaussian processes. Our method can distinguish oscillatory gene expression from random fluctuations of non-oscillatory expression in single-cell time series, despite peak-to-peak variability in period and amplitude of single-cell oscillations. We show that our method outperforms the Lomb-Scargle periodogram in successfully classifying cells as oscillatory or non-oscillatory in data simulated from a simple genetic oscillator model and in experimental data. Analysis of bioluminescent live-cell imaging shows a significantly greater number of oscillatory cells when luciferase is driven by a Hes1 promoter (10/19), which has previously been reported to oscillate, than the constitutive MoMuLV 5’ LTR (MMLV) promoter (0/25). The method can be applied to data from any gene network both to quantify the proportion of oscillating cells within a population and to measure the period and quality of oscillations.
It is publicly available as a MATLAB package. PMID:28493880
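The Lomb-Scargle periodogram that the authors use as their comparison baseline is available in SciPy and handles the irregular sampling typical of live imaging. The sketch below screens a noisy oscillatory trace against a flat one; the signal parameters are toy values, and the GP-based classifier itself lives in the authors' MATLAB package and is not reproduced here.

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 50, 300))          # irregular sampling times
osc = np.sin(2 * np.pi * t / 5.0) + 0.3 * rng.standard_normal(len(t))
flat = rng.standard_normal(len(t))            # aperiodic control trace

freqs = np.linspace(0.05, 3.0, 500)           # angular frequencies to scan
p_osc = lombscargle(t, osc - osc.mean(), freqs, normalize=True)
p_flat = lombscargle(t, flat - flat.mean(), freqs, normalize=True)

# dominant period of the oscillatory trace (omega = 2*pi / T)
period = 2 * np.pi / freqs[np.argmax(p_osc)]
print(round(period, 1))                        # close to the true period of 5.0
print(p_osc.max() > p_flat.max())              # the periodogram peak separates the two
```

The abstract's point is that for stochastic single-cell oscillators with drifting period and amplitude, this spectral peak becomes unreliable, which is what motivates the Gaussian-process approach.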
Global Autocorrelation Scales of the Partial Pressure of Oceanic CO2
NASA Technical Reports Server (NTRS)
Li, Zhen; Adamec, David; Takahashi, Taro; Sutherland, Stewart C.
2004-01-01
A global database of approximately 1.7 million observations of the partial pressure of carbon dioxide in surface ocean waters (pCO2) collected between 1970 and 2003 is used to estimate its spatial autocorrelation structure. The patterns of the lag distance where the autocorrelation exceeds 0.8 are similar to patterns in the spatial distribution of the first baroclinic Rossby radius of deformation, indicating that ocean circulation processes play a significant role in determining the spatial variability of pCO2. For example, the global maximum of the distance at which autocorrelations exceed 0.8 averages about 140 km in the equatorial Pacific. Also, the lag distance at which the autocorrelation exceeds 0.8 is greater in the vicinity of the Gulf Stream than near the Kuroshio: approximately 50 km near the Gulf Stream as opposed to 20 km near the Kuroshio. Separate calculations for times when the sun is north and south of the equator revealed no obvious seasonal dependence of the spatial autocorrelation scales. Ocean Weather Station (OWS) 'P', in the eastern subarctic Pacific (50 N, 145 W), is the only fixed location where an uninterrupted pCO2 time series of sufficient length exists to calculate a meaningful temporal autocorrelation function for lags greater than a few days. The estimated temporal autocorrelation function at OWS 'P' is highly variable. A spectral analysis of the longest four pCO2 time series indicates a high level of variability occurring over periods from the atmospheric synoptic scale to the maximum length of the time series, in this case 42 days. It is likely that a relative peak in variability with a period of 3-6 days is related to atmospheric synoptic period variability and ocean mixing events due to wind stirring. However, the short length of available time series makes identifying temporal relationships between pCO2 and atmospheric or ocean processes problematic.
Detection of a sudden change of the field time series based on the Lorenz system.
Da, ChaoJiu; Li, Fang; Shen, BingLu; Yan, PengCheng; Song, Jian; Ma, DeShan
2017-01-01
We conducted an exploratory study of the detection of a sudden change of the field time series based on the numerical solution of the Lorenz system. First, the time when the Lorenz path jumped between the regions on the left and right of the equilibrium point of the Lorenz system was quantitatively marked and the sudden change time of the Lorenz system was obtained. Second, the numerical solution of the Lorenz system was regarded as a vector; thus, this solution could be considered as a vector time series. We transformed the vector time series into a time series using the vector inner product, considering the geometric and topological features of the Lorenz system path. Third, the sudden change of the resulting time series was detected using the sliding t-test method. Comparing the test results with the quantitatively marked time indicated that the method could detect every sudden change of the Lorenz path; thus, the method is effective. Finally, we used the method to detect the sudden change of the pressure field time series and temperature field time series, and obtained good results for both series, which indicates that the method can be applied to high-dimensional vector time series. Mathematically, there is no essential difference between the field time series and vector time series; thus, we provide a new method for the detection of the sudden change of the field time series.
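The core of the detection idea can be sketched with a standard Lorenz integration and a sliding t-test on one coordinate: jumps between the left and right wings flip the sign of x, and a t-test comparing two adjacent windows spikes at those jumps. The window length and threshold below are illustrative assumptions, not the paper's values, and the paper works on an inner-product scalarization of the full vector path rather than on x alone.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy import stats

def lorenz(t, v, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Classic Lorenz system right-hand side."""
    x, y, z = v
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

sol = solve_ivp(lorenz, (0, 40), [1.0, 1.0, 1.0],
                t_eval=np.linspace(0, 40, 4000))
x = sol.y[0]

# sliding t-test: compare the means of two adjacent windows of length w
w = 25
tstat = np.zeros(len(x))
for i in range(w, len(x) - w):
    tstat[i] = stats.ttest_ind(x[i - w:i], x[i:i + w]).statistic

# large |t| flags candidate change points; wing switches flip sign(x)
candidates = np.flatnonzero(np.abs(tstat) > 10)
true_switches = np.flatnonzero(np.diff(np.sign(x)) != 0)
print(len(true_switches), len(candidates))  # both nonzero: switches occur and are flagged
```

Comparing `candidates` against `true_switches` reproduces, in miniature, the paper's validation of the sliding t-test against the quantitatively marked jump times.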
Dissolved organic nitrogen dynamics in the North Sea: A time series analysis (1995-2005)
NASA Astrophysics Data System (ADS)
Van Engeland, T.; Soetaert, K.; Knuijt, A.; Laane, R. W. P. M.; Middelburg, J. J.
2010-09-01
Dissolved organic nitrogen (DON) dynamics in the North Sea were explored by means of long-term time series of nitrogen parameters from the Dutch national monitoring program. Generally, the data quality was good, with few missing data points. Different imputation methods were used to verify the robustness of the patterns against these missing data. No long-term trends in DON concentrations were found over the sampling period (1995-2005). Inter-annual variability in the different time series showed both common and station-specific behavior. The stations could be divided into two regions, based on absolute concentrations and the dominant time scales of variability. Average DON concentrations were 11 μmol l-1 in the coastal region and 5 μmol l-1 in the open sea. Organic fractions of total dissolved nitrogen (TDN) averaged 38% and 71% in the coastal zone and open sea, respectively, but increased over time due to decreasing dissolved inorganic nitrogen (DIN) concentrations. In both regions intra-annual variability dominated over inter-annual variability, but DON variation in the open sea was markedly shifted towards shorter time scales relative to coastal stations. In the coastal zone a consistent seasonal DON cycle existed, with high values in spring-summer and low values in autumn-winter. In the open sea seasonality was weak. A marked shift in the seasonality was found at the Dogger Bank, with DON accumulation towards summer and low values in winter prior to 1999, and accumulation in spring and decline throughout summer after 1999. This study clearly shows that DON is a dynamic actor in the North Sea and should be monitored systematically to enable us to fully understand the functioning of this ecosystem.
Atmospheric extinction in simulation tools for solar tower plants
NASA Astrophysics Data System (ADS)
Hanrieder, Natalie; Wilbert, Stefan; Schroedter-Homscheidt, Marion; Schnell, Franziska; Guevara, Diana Mancera; Buck, Reiner; Giuliano, Stefano; Pitz-Paal, Robert
2017-06-01
Atmospheric extinction causes significant radiation losses between the heliostat field and the receiver in solar tower plants. These losses vary with site and time. The state of the art is that in ray-tracing and plant optimization tools, atmospheric extinction is included by choosing between a few constant standard atmospheric conditions. Even though some tools allow the consideration of site- and time-dependent extinction data, such data sets are nearly never available. This paper summarizes and compares the most common model equations implemented in several ray-tracing tools. Several methods have already been developed and published to measure extinction on-site; an overview of the existing methods is also given here. Ray-tracing simulations of one exemplary tower plant at the Plataforma Solar de Almería (PSA) are presented to estimate the plant yield deviations between simulations using standard model equations instead of extinction time series. For PSA, the effect of atmospheric extinction accounts for losses between 1.6 and 7%. This range is caused by considering overload dumping or not. Applying standard clear or hazy model equations instead of extinction time series leads to an underestimation of the annual plant yield at PSA. The discussion of the effect of extinction in tower plants has to include overload dumping. Situations in which overload dumping occurs are mostly connected to high radiation levels and low atmospheric extinction. Therefore it can be recommended that project developers consider site- and time-dependent extinction data, especially at hazy sites. A reduced uncertainty of the plant yield prediction can significantly reduce costs due to smaller risk margins for financing and EPCs. The generation of extinction data for several locations in the form of representative yearly time series or geographical maps should be further elaborated.
Volatility of linear and nonlinear time series
NASA Astrophysics Data System (ADS)
Kalisky, Tomer; Ashkenazy, Yosef; Havlin, Shlomo
2005-07-01
Previous studies indicated that nonlinear properties of Gaussian distributed time series with long-range correlations, ui, can be detected and quantified by studying the correlations in the magnitude series |ui|, the “volatility.” However, the origin of this empirical observation still remains unclear and the exact relation between the correlations in ui and the correlations in |ui| is still unknown. Here we develop analytical relations between the scaling exponent of a linear series ui and that of its magnitude series |ui|. Moreover, we find that nonlinear time series exhibit stronger (or the same) correlations in the magnitude time series compared with linear time series with the same two-point correlations. Based on these results we propose a simple model that generates multifractal time series by explicitly inserting long-range correlations in the magnitude series; the nonlinear multifractal time series is generated by multiplying a long-range correlated time series (that represents the magnitude series) with an uncorrelated time series [that represents the sign series sgn(ui)]. We apply our techniques to daily deep ocean temperature records from the equatorial Pacific, the region of the El-Niño phenomenon, and find: (i) long-range correlations from several days to several years with 1/f power spectrum, (ii) significant nonlinear behavior as expressed by long-range correlations of the volatility series, and (iii) a broad multifractal spectrum.
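The proposed model (a long-range correlated magnitude series multiplied by an uncorrelated sign series) can be sketched with the Fourier filtering method. The spectral exponent, series length, and function names below are illustrative assumptions; shuffling the magnitudes serves as the linear surrogate against which the volatility correlations are compared.

```python
import numpy as np

def fourier_correlated(n, beta, rng):
    """Gaussian series with power spectrum S(f) ~ 1/f**beta
    (Fourier filtering: power-law amplitudes, random phases)."""
    freqs = np.fft.rfftfreq(n)
    amp = np.zeros_like(freqs)
    amp[1:] = freqs[1:] ** (-beta / 2.0)
    phases = rng.uniform(0, 2 * np.pi, len(freqs))
    x = np.fft.irfft(amp * np.exp(1j * phases), n)
    return (x - x.mean()) / x.std()

def lag1_ac(v):
    """Lag-1 autocorrelation."""
    v = v - v.mean()
    return np.dot(v[:-1], v[1:]) / np.dot(v, v)

rng = np.random.default_rng(3)
n = 2 ** 14
magnitude = np.abs(fourier_correlated(n, beta=0.8, rng=rng))  # long-range correlated
signs = rng.choice([-1.0, 1.0], n)                            # uncorrelated signs
series = magnitude * signs            # nonlinear, multifractal-like series

# linear surrogate: same signs, magnitudes shuffled (correlations destroyed)
shuffled = rng.permutation(magnitude) * signs
print(round(lag1_ac(np.abs(series)), 2),
      round(lag1_ac(np.abs(shuffled)), 2))
# volatility correlations survive only in the constructed nonlinear series
```

Repeating the magnitude-series correlation analysis at longer lags would show the power-law decay that the paper identifies as the signature of nonlinearity.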
Potentiation Following Ballistic and Nonballistic Complexes: The Effect of Strength Level.
Suchomel, Timothy J; Sato, Kimitake; DeWeese, Brad H; Ebben, William P; Stone, Michael H
2016-07-01
Suchomel, TJ, Sato, K, DeWeese, BH, Ebben, WP, and Stone, MH. Potentiation following ballistic and nonballistic complexes: the effect of strength level. J Strength Cond Res 30(7): 1825-1833, 2016-The purpose of this study was to compare the temporal profile of strong and weak subjects during ballistic and nonballistic potentiation complexes. Eight strong (relative back squat = 2.1 ± 0.1 times body mass) and 8 weak (relative back squat = 1.6 ± 0.2 times body mass) males performed squat jumps immediately and every minute up to 10 minutes following potentiation complexes that included ballistic or nonballistic concentric-only half-squats (COHS) performed at 90% of their 1 repetition maximum COHS. Jump height (JH) and allometrically scaled peak power (PPa) were compared using a series of 2 × 12 repeated measures analyses of variance. No statistically significant strength level main effects for JH (p = 0.442) or PPa (p = 0.078) existed during the ballistic condition. In contrast, statistically significant main effects for time existed for both JH (p = 0.014) and PPa (p < 0.001); however, no statistically significant pairwise comparisons were present (p > 0.05). Statistically significant strength level main effects existed for PPa (p = 0.039) but not for JH (p = 0.137) during the nonballistic condition. Post hoc analysis revealed that the strong subjects produced statistically greater PPa than the weaker subjects (p = 0.039). Statistically significant time main effects existed for PPa (p = 0.015), but not for JH (p = 0.178). No statistically significant strength level × time interaction effects for JH (p = 0.319) or PPa (p = 0.203) were present for the ballistic or nonballistic conditions.
Practical significance indicated by effect sizes and the relationships between maximum potentiation and relative strength suggest that stronger subjects potentiate earlier and to a greater extent than weaker subjects during ballistic and nonballistic potentiation complexes.
Duality between Time Series and Networks
Campanharo, Andriana S. L. O.; Sirer, M. Irmak; Malmgren, R. Dean; Ramos, Fernando M.; Amaral, Luís A. Nunes.
2011-01-01
Studying the interaction between a system's components and the temporal evolution of the system are two common ways to uncover and characterize its internal workings. Recently, several maps from a time series to a network have been proposed with the intent of using network metrics to characterize time series. Although these maps demonstrate that different time series result in networks with distinct topological properties, it remains unclear how these topological properties relate to the original time series. Here, we propose a map from a time series to a network with an approximate inverse operation, making it possible to use network statistics to characterize time series and time series statistics to characterize networks. As a proof of concept, we generate an ensemble of time series ranging from periodic to random and confirm that application of the proposed map retains much of the information encoded in the original time series (or networks) after application of the map (or its inverse). Our results suggest that network analysis can be used to distinguish different dynamic regimes in time series and, perhaps more importantly, time series analysis can provide a powerful set of tools that augment the traditional network analysis toolkit to quantify networks in new and useful ways. PMID:21858093
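A map of this flavor, with an approximate inverse, can be sketched by binning the series into quantiles (nodes) and counting transitions between successive bins (weighted edges); a random walk on the resulting network then regenerates a series of bin indices. The bin count and construction details below are assumptions for illustration, not necessarily the paper's exact map.

```python
import numpy as np

def series_to_network(x, q):
    """Map a time series to a weighted transition network: nodes are the
    q quantile bins of x, and edge (i, j) counts transitions from bin i
    to bin j at successive time steps (rows normalized to probabilities)."""
    edges = np.quantile(x, np.linspace(0, 1, q + 1))
    bins = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, q - 1)
    W = np.zeros((q, q))
    for a, b in zip(bins[:-1], bins[1:]):
        W[a, b] += 1
    row = W.sum(axis=1, keepdims=True)
    return np.divide(W, row, out=np.zeros_like(W), where=row > 0)

def network_to_series(W, n, rng):
    """Approximate inverse: a random walk on the transition network yields
    a series of bin indices with similar transition statistics."""
    state, out = 0, []
    for _ in range(n):
        out.append(state)
        state = rng.choice(len(W), p=W[state])
    return np.array(out)

rng = np.random.default_rng(4)
x = np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.1 * rng.standard_normal(2000)
W = series_to_network(x, q=8)
walk = network_to_series(W, 1000, rng)

near = sum(W[i, j] for i in range(8) for j in range(8) if abs(i - j) <= 1)
print(near / W.sum() > 0.9)  # a noisy periodic series gives a nearly band-diagonal network
```

A random series would instead spread its transition weight across the whole matrix, which is the kind of topological difference network metrics can pick up.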
NASA Astrophysics Data System (ADS)
Obulesu, O.; Rama Mohan Reddy, A., Dr; Mahendra, M.
2017-08-01
Detecting regular and efficient cyclic models is a demanding task for data analysts because of the unstructured, dynamic, and enormous volume of raw information produced from the web. Many existing approaches generate large candidate patterns in the presence of huge and complex databases. In this work, two novel algorithms are proposed and a comparative examination is performed with respect to scalability and performance parameters. The first algorithm, EFPMA (Extended Regular Model Detection Algorithm), finds frequent sequential patterns from spatiotemporal datasets, and the second, ETMA (Enhanced Tree-based Mining Algorithm), detects effective cyclic models using a symbolic database representation. EFPMA grows patterns from both ends (prefixes and suffixes) of detected patterns, which results in faster pattern growth because fewer levels of database projection are required compared to existing approaches such as PrefixSpan and SPADE. ETMA uses distinct notions to store and manage transaction data horizontally, such as segments, sequences, and individual symbols. ETMA exploits a partition-and-conquer method to find maximal patterns by using symbolic notations. Using this algorithm, we can mine cyclic models in full-series sequential patterns, including subsection series. ETMA reduces memory consumption and makes use of efficient symbolic operations. Furthermore, ETMA records time-series instances dynamically, in terms of character, series, and section approaches, respectively. Determining the extent of a pattern and proving the efficiency of the reduction and retrieval techniques on synthetic and actual datasets remain open and challenging mining problems. These techniques are useful in data streams, traffic risk analysis, medical diagnosis, DNA sequence mining, and earthquake prediction applications. Extensive experimental results illustrate that the algorithms outperform the ECLAT, STNR, and MAFIA approaches in efficiency and scalability.
Automated Spatio-Temporal Analysis of Remotely Sensed Imagery for Water Resources Management
NASA Astrophysics Data System (ADS)
Bahr, Thomas
2016-04-01
Since 2012, the state of California faces an extreme drought, which impacts water supply in many ways. Advanced remote sensing is an important technology to better assess water resources, monitor drought conditions and water supplies, plan for drought response and mitigation, and measure drought impacts. In the present case study latest time series analysis capabilities are used to examine surface water in reservoirs located along the western flank of the Sierra Nevada region of California. This case study was performed using the COTS software package ENVI 5.3. Integration of custom processes and automation is supported by IDL (Interactive Data Language). Thus, ENVI analytics is running via the object-oriented and IDL-based ENVITask API. A time series from Landsat images (L-5 TM, L-7 ETM+, L-8 OLI) of the AOI was obtained for 1999 to 2015 (October acquisitions). Downloaded from the USGS EarthExplorer web site, they already were georeferenced to a UTM Zone 10N (WGS-84) coordinate system. ENVITasks were used to pre-process the Landsat images as follows: • Triangulation based gap-filling for the SLC-off Landsat-7 ETM+ images. • Spatial subsetting to the same geographic extent. • Radiometric correction to top-of-atmosphere (TOA) reflectance. • Atmospheric correction using QUAC®, which determines atmospheric correction parameters directly from the observed pixel spectra in a scene, without ancillary information. Spatio-temporal analysis was executed with the following tasks: • Creation of Modified Normalized Difference Water Index images (MNDWI, Xu 2006) to enhance open water features while suppressing noise from built-up land, vegetation, and soil. • Threshold based classification of the water index images to extract the water features. • Classification aggregation as a post-classification cleanup process. • Export of the respective water classes to vector layers for further evaluation in a GIS. 
• Animation of the classification series and export to a common video format. • Plotting the time series of water surface area in square kilometers. The automated spatio-temporal analysis introduced here can be embedded in virtually any existing geospatial workflow for operational applications. Three integration options were implemented in this case study: • Integration within any ArcGIS environment whether deployed on the desktop, in the cloud, or online. Execution uses a customized ArcGIS script tool. A Python script file retrieves the parameters from the user interface and runs the precompiled IDL code. That IDL code is used to interface between the Python script and the relevant ENVITasks. • Publishing the spatio-temporal analysis tasks as services via the ENVI Services Engine (ESE). ESE is a cloud-based image analysis solution to publish and deploy advanced ENVI image and data analytics to existing enterprise infrastructures. For this purpose the entire IDL code can be capsuled in a single ENVITask. • Integration in an existing geospatial workflow using the Python-to-IDL Bridge. This mechanism allows calling IDL code within Python on a user-defined platform. The results of this case study verify the drastic decrease of the amount of surface water in the AOI, indicative of the major drought that is pervasive throughout California. Accordingly, the time series analysis was correlated successfully with the daily reservoir elevations of the Don Pedro reservoir (station DNP, operated by CDEC).
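Independent of the ENVI tooling, the MNDWI-plus-threshold step of the workflow reduces to a simple band ratio. The sketch below uses toy reflectance values; the zero threshold is the common starting point from Xu (2006), while the case study tunes it per scene.

```python
import numpy as np

def mndwi(green, swir):
    """Modified Normalized Difference Water Index (Xu 2006):
    (Green - SWIR) / (Green + SWIR); larger over open water because water
    reflects in the green band and absorbs strongly in the SWIR band."""
    green = green.astype(float)
    swir = swir.astype(float)
    return (green - swir) / (green + swir + 1e-12)  # guard against 0/0

def water_mask(green, swir, threshold=0.0):
    """Threshold the index to separate water from land."""
    return mndwi(green, swir) > threshold

# toy 2x2 reflectance patches: top row is water (high green, low SWIR)
green = np.array([[0.12, 0.10], [0.25, 0.30]])
swir = np.array([[0.02, 0.03], [0.35, 0.40]])
print(water_mask(green, swir))
# [[ True  True]
#  [False False]]
```

Counting `True` pixels and multiplying by the pixel area gives the water surface area per scene, which is what the case study plots through time.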
NASA Astrophysics Data System (ADS)
Steenberg, Ryan
Advancements in medicine have allowed surgeons a menu of options in post-mastectomy breast reconstruction. A conundrum exists, however, in flap selection when faced with varying patient body types. In the case of the athletic patient who does not have the appropriate amount of donor site tissue to warrant a Transverse Rectus Abdominis Musculocutaneous Flap (TRAM), the Transverse Musculocutaneous Gracilis Flap (TMG) is an appropriate alternative due to its functional and aesthetic benefits. An intricate and time-consuming process, the TMG procedure can be difficult to understand for the layperson. Therefore, a need for a condensed and standardized description exists. By breaking the process down and illustrating the procedure one can effectively deliver the information for use across all realms of publication and education.
Overall evaluation of Skylab imagery for mapping of Latin America
NASA Technical Reports Server (NTRS)
Staples, J. E.; Eoldan, J. J. M.; Fernandez, O. W.; Alves, M.; Mutis, J.; Fletcher, A. G.; Ferrero, M. B.; Morell, J. J. H.; Romero, L. E.; Garcia, J. A. G. (Principal Investigator)
1975-01-01
The author has identified the following significant results. Skylab imagery is both desired and needed by the Latin American cartographic agencies. The imagery is cost beneficial for the production of new maps and the maintenance of existing maps at national topographic series scales. If this information were available on a near-real-time, routine coverage basis, it would provide an excellent additional data base to the Latin American cartographic community, specifically Argentina, Bolivia, Chile, Colombia, Dominican Republic, Guatemala, Paraguay, and Venezuela.
1987-09-01
Reelfoot Lake was formed in a series of meander scars after the earthquake. Most recent active channels have affected only the western margin of the...with a few prominent meander loop scars, and the lower eastern margin as essentially a collective backswamp. Subsidence of Reelfoot Lake itself as a...open river with the various bankline and sandbar habitats involved there. Open water may well have existed at various times in part of the Reelfoot Lake
Deforestation and Secondary Growth in Rondonia, Brazil from SIR-C SAR and Landsat/SPOT data
NASA Technical Reports Server (NTRS)
Rignot, Eric; Salas, William A.; Skole, David L.
1996-01-01
Covers problems with existing data collected with high-resolution optical sensors, and argues that active microwave sensors could complement other sensors by penetrating obstructions such as cloud cover. The authors analyzed SIR-C data in combination with Landsat TM data, a 9-year time series of SPOT XS data, and a preliminary field survey. They report findings and draw conclusions, including that SARs operating at long radar wavelengths, with both like and cross polarizations, are needed for tropical deforestation studies.
Challenge: How Effective is Routing for Wireless Networking
2017-03-03
sage (called a "hello") to all of its neighbors. If a series of hello messages are exchanged between two users, a link is considered to exist between...these schemes. A brief description of ETX is as follows. For a given window of time, the number of hello packets that a user receives from a neighbor is...counted. A cost is then assigned to the link based on how many hello messages were heard; a link that has fewer hellos successfully transmitted across
How Effective is Routing for Wireless Networking
2016-03-05
Routing (LAR) [31]. The basic mechanism of how link-based routing schemes operate is as follows: a user broadcasts a control message (called a "hello"...to all of its neighbors. If a series of hello messages are exchanged between two users, a link is considered to exist between them. Routes are then be...description of ETX is as follows. For a given window of time, the number of hello packets that a user receives from a neighbor is counted. A cost is then
Challenge: How Effective is Routing for Wireless Networking
2015-09-07
sage (called a "hello") to all of its neighbors. If a series of hello messages are exchanged between two users, a link is considered to exist between...these schemes. A brief description of ETX is as follows. For a given window of time, the number of hello packets that a user receives from a neighbor is...counted. A cost is then assigned to the link based on how many hello messages were heard; a link that has fewer hellos successfully transmitted across
After High School, Then What? A Look at the Postsecondary Sorting-Out Process for American Youth
1991-01-01
then remained stable from 1984 to 1987. The two time series for women show slightly different patterns, in that the college entrance rates in Table 8...standing of the sorting-out process-the process by which young people with widely differing talents and ambitions choose among competing alternatives such...Table 3.1 These differences between the male and female rates underscore the huge gender gap in college enrollment patterns that existed in 1970. Men
Faster Conceptual Blending Predictors on Relational Time Series
2012-07-01
immediately ceases to exist. For example, a percept that describes “a ball hitting the wall” becomes obsolete immediately after it occurred. An interval...percept occurred and remains true until something happens that changes its state. For example, a percept that describes “a ball is in the box” is true...until the ball is removed. The interval percept has a ‘+’ indicator in the predicate as shown in Figure 1. A percept that is true is said to be
The Other End of the Spear: The Tooth-to-Tail Ratio (T3R) in Modern Military Operations
2007-01-01
units. Such a vehicle gave infantry much more of the firepower and survivability inherent in heavy (mechanized infantry and armored) units while...The Other End of the Spear: The Tooth-to-Tail Ratio (T3R) in Modern Military Operations John J. McGrath The Long War Series Occasional Paper 23...collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources
On statistical properties of traded volume in financial markets
NASA Astrophysics Data System (ADS)
de Souza, J.; Moyano, L. G.; Duarte Queirós, S. M.
2006-03-01
In this article we study the dependence degree of the traded volume of the Dow Jones 30 constituent equities by using a nonextensive generalised form of the Kullback-Leibler information measure. Our results show a slow decay of the dependence degree as a function of the lag. This feature is compatible with the existence of non-linearities in this type of time series. In addition, we introduce a dynamical mechanism whose associated stationary probability density function (PDF) presents good agreement with the empirical results.
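The nonextensive generalised Kullback-Leibler measure used by the authors is not reproduced here, but the ordinary (non-generalised) lag-dependence idea can be sketched in the same spirit: estimate the KL divergence between the joint histogram of a series and its lagged copy and the product of the marginals (i.e. the empirical mutual information). The function name `lag_dependence_kl`, the bin count, and the histogram estimator are illustrative choices, not the authors' method.

```python
import numpy as np

def lag_dependence_kl(x, lag, bins=8):
    """Dependence between a series and its lag as the KL divergence between
    the joint histogram and the product of marginals (mutual information, nats)."""
    a, b = x[:-lag], x[lag:]
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    joint = joint / joint.sum()
    px = joint.sum(axis=1, keepdims=True)   # marginal of x(t)
    py = joint.sum(axis=0, keepdims=True)   # marginal of x(t+lag)
    prod = px * py
    mask = joint > 0                        # where joint > 0, both marginals > 0
    return float(np.sum(joint[mask] * np.log(joint[mask] / prod[mask])))
```

A strongly dependent series (e.g. a sinusoid at lag 1) yields a much larger value than independent noise, whose estimate is only a small positive bias term.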
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-11
... Chief, at (202) 551-6821 (Division of Investment Management, Exemptive Applications Office... management investment company currently comprising 23 series (the ``Compass Funds'').\\1\\ Each series of the... series of the Trust and any other existing or future registered open-end management investment company or...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-19
...-0691; Directorate Identifier 2011-NE-26-AD] RIN 2120-AA64 Airworthiness Directives; Lycoming Engines Model TIO 540-A Series Reciprocating Engines AGENCY: Federal Aviation Administration (FAA), DOT. ACTION... directive (AD) for Lycoming Engines model TIO 540-A series reciprocating engines. The existing AD, AD 71-13...
Native Americans; A Bibliography for Young People. Bibliographic Series #7.
ERIC Educational Resources Information Center
Fuson, Elgie M., Comp.
Early in 1969, the Sacramento State College Library began a series of bibliographies designed to aid its patrons in making more effective use of existing library resources. This publication, Number 7 of that series, is directed specifically to areas involved in the college's developing ethnic studies programs and cites materials published between 1905…
Alignment of time-resolved data from high throughput experiments.
Abidi, Nada; Franke, Raimo; Findeisen, Peter; Klawonn, Frank
2016-12-01
To better understand the dynamics of the underlying processes in cells, it is necessary to take measurements over a time course. Modern high-throughput technologies are often used for this purpose to measure the behavior of cell products like metabolites, peptides, proteins, [Formula: see text]RNA or mRNA at different points in time. Compared to classical time series, the number of time points is usually very limited and the measurements are taken at irregular time intervals. The main reasons for this are the costs of the experiments and the fact that the dynamic behavior usually shows a strong reaction and fast changes shortly after a stimulus and then slowly converges to a certain stable state. Another reason might simply be missing values. It is common to repeat the experiments and to have replicates in order to carry out a more reliable analysis. The ideal assumptions that the initial stimulus really started at exactly the same time for all replicates and that the replicates are perfectly synchronized are seldom satisfied. Therefore, there is a need to adjust or align the time-resolved data before further analysis is carried out. Dynamic time warping (DTW) is one of the common alignment techniques for time series data with equidistant time points. In this paper, we modified the DTW algorithm so that it can align sequences with measurements at different, non-equidistant time points with large gaps in between. This type of data is usually known as time-resolved data, characterized by irregular time intervals between measurements as well as non-identical time points for different replicates. The new algorithm can be easily used to align time-resolved data from high-throughput experiments and to overcome problems such as the scarcity of time points and noise in the measurements. We propose a modified form of DTW that adapts to the requirements imposed by time-resolved data through the use of monotone cubic interpolation splines.
The presented approach provides a nonlinear alignment of two sequences that need neither equidistant time points nor measurements at identical time points. The proposed method is evaluated with artificial as well as real data. The software is available as an R package tra (Time-Resolved data Alignment), freely available at: http://public.ostfalia.de/klawonn/tra.zip .
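A minimal sketch of the idea behind the modified DTW: put two irregularly sampled replicates on a common grid with monotone cubic (PCHIP) splines, then run classic dynamic time warping on the regularized sequences. This is a simplification of the algorithm implemented in the tra package; the function names and the grid size are illustrative.

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two equal-step sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def align_irregular(t1, y1, t2, y2, n_grid=50):
    """Put two irregularly sampled replicates on a shared grid using
    monotone cubic (PCHIP) splines, then compare them with DTW."""
    lo, hi = max(t1[0], t2[0]), min(t1[-1], t2[-1])   # common time support
    grid = np.linspace(lo, hi, n_grid)
    g1 = PchipInterpolator(t1, y1)(grid)
    g2 = PchipInterpolator(t2, y2)(grid)
    return dtw_distance(g1, g2)
```

Identical replicates align with zero cost; a vertical offset between replicates produces a strictly positive warping distance.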
Detection of a sudden change of the field time series based on the Lorenz system
Li, Fang; Shen, BingLu; Yan, PengCheng; Song, Jian; Ma, DeShan
2017-01-01
We conducted an exploratory study of the detection of a sudden change of the field time series based on the numerical solution of the Lorenz system. First, the time when the Lorenz path jumped between the regions on the left and right of the equilibrium point of the Lorenz system was quantitatively marked, and the sudden change time of the Lorenz system was obtained. Second, the numerical solution of the Lorenz system was regarded as a vector; thus, this solution could be considered a vector time series. We transformed the vector time series into a scalar time series using the vector inner product, considering the geometric and topological features of the Lorenz system path. Third, the sudden change of the resulting time series was detected using the sliding t-test method. Comparing the test results with the quantitatively marked times indicated that the method could detect every sudden change of the Lorenz path; thus, the method is effective. Finally, we used the method to detect sudden changes in the pressure field time series and temperature field time series, and obtained good results for both, which indicates that the method can be applied to high-dimensional vector time series. Mathematically, there is no essential difference between the field time series and vector time series; thus, we provide a new method for the detection of sudden changes in field time series. PMID:28141832
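The sliding t-test used in the third step can be sketched as follows: at each candidate index, compare the means of the preceding and following windows with a pooled-variance two-sample t statistic and flag indices where it exceeds the critical value. The window length and significance level below are illustrative defaults, not the values used in the paper.

```python
import numpy as np
from scipy import stats

def sliding_t_test(x, window=20, alpha=0.01):
    """Sliding t-test for abrupt-change detection: at each index, compare the
    means of the preceding and following windows; return indices where the
    two-sample t statistic is significant at level alpha."""
    n = len(x)
    crit = stats.t.ppf(1 - alpha / 2, df=2 * window - 2)
    hits = []
    for i in range(window, n - window):
        a, b = x[i - window:i], x[i:i + window]
        sp = np.sqrt(((window - 1) * a.var(ddof=1) + (window - 1) * b.var(ddof=1))
                     / (2 * window - 2))               # pooled standard deviation
        t = (a.mean() - b.mean()) / (sp * np.sqrt(2.0 / window))
        if abs(t) > crit:
            hits.append(i)
    return hits
```

A step change buried in noise is flagged at indices near the true jump.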
Thin film type 248-nm bottom antireflective coatings
NASA Astrophysics Data System (ADS)
Enomoto, Tomoyuki; Nakayama, Keisuke; Mizusawa, Kenichi; Nakajima, Yasuyuki; Yoon, Sangwoong; Kim, Yong-Hoon; Kim, Young-Ho; Chung, Hoesik; Chon, Sang Mun
2003-06-01
A frequent problem encountered by photoresists during the manufacturing of semiconductor devices is that activating radiation is reflected back into the photoresist by the substrate, so it is necessary to reduce light reflection from the substrate. One approach is the use of a bottom anti-reflective coating (BARC) applied to the substrate beneath the photoresist layer. BARC technology has been utilized for several years to minimize reflectivity. As chip feature sizes are reduced below 0.13 micron, the photoresist thickness has to decrease to keep the aspect ratio below 3.0. Therefore, a new organic BARC is strongly required which achieves minimum reflectivity at a thinner BARC thickness together with higher etch selectivity towards the resist. SAMSUNG Electronics developed an advanced organic BARC with Nissan Chemical Industries, Ltd. and Brewer Science, Inc. for this purpose. As a result, the high-performance SNAC2002 series of KrF organic BARCs was developed. Using CF4 gas as etchant, the plasma etch rate of the SNAC2002 series is about 1.4 times higher than that of conventional KrF resists and 1.25 times higher than that of the existing product. The SNAC2002 series can minimize substrate reflectivity at below 40 nm BARC thickness and shows excellent litho performance and coating properties.
Hurst exponent of very long birth time series in XX century Romania. Social and religious aspects
NASA Astrophysics Data System (ADS)
Rotundo, G.; Ausloos, M.; Herteliu, C.; Ileanu, B.
2015-07-01
The Hurst exponent of very long birth time series in Romania has been extracted from official daily records, i.e. over the 97 years between 1905 and 2001 included. The series result from distinguishing between families located in urban (U) or rural (R) areas, and belonging (Ox) or not (NOx) to the orthodox religion. Four time series combining both criteria, (U,R) and (Ox, NOx), are also examined. Statistical information is given on these sub-populations, measuring their XX-th century state as a snapshot. However, the main goal is to investigate whether the "daily" production of babies is purely noisy or fluctuates according to some non-trivial fractional Brownian motion, in the four types of populations, characterized by either their habitat or their religious attitude, yet living within the same political regime. One of the goals was also to find whether combined criteria implied a different behavior. Moreover, we wish to observe whether some seasonal periodicity exists. The detrended fluctuation analysis technique is used for finding the fractal correlation dimension of these (9) signals. Due to two periodic tendencies, it was first necessary to define the range regime in which the Hurst exponent is meaningfully defined. It results that the birth of babies is in all cases a very strongly persistent signal. The signal fractal correlation dimension is found to be weaker (i) for NOx than for Ox, and (ii) for U with respect to R. Moreover, it is observed that the combination of U or R with NOx or Ox enhances the UNOx, UOx, and ROx fluctuations, but smoothens the RNOx signal, thereby suggesting a stronger conditioning by religiosity rituals or rules.
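A compact sketch of the detrended fluctuation analysis used here (first-order DFA): integrate the centered series, detrend it linearly within windows at each scale, and fit the log-log slope of the fluctuation function. An exponent near 0.5 indicates uncorrelated noise, while values approaching 1 or above indicate the strong persistence reported for the birth series. The scale choices below are illustrative.

```python
import numpy as np

def dfa(x, scales):
    """First-order detrended fluctuation analysis: return the fluctuation
    F(s) at each scale and the DFA exponent from a log-log fit."""
    y = np.cumsum(x - np.mean(x))          # integrated profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        rms = []
        for k in range(n_seg):
            seg = y[k * s:(k + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)    # local linear detrend
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        F.append(np.mean(rms))
    alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
    return np.array(F), alpha
```

White noise yields an exponent near 0.5, while its running sum (Brownian motion) yields an exponent near 1.5.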
Mechanical energy fluctuations in granular chains: the possibility of rogue fluctuations or waves.
Han, Ding; Westley, Matthew; Sen, Surajit
2014-09-01
The existence of rogue or freak waves in the ocean has been known for some time. They have been reported in the context of optical lattices and the financial market. We ask whether such waves are generic to late time behavior in nonlinear systems. In that vein, we examine the dynamics of an alignment of spherical elastic beads held within fixed, rigid walls at zero precompression when they are subjected to sufficiently rich initial conditions. Here we define such waves generically as unusually large energy fluctuations that sustain for short periods of time. Our simulations suggest that such unusually large fluctuations ("hot spots") and occasional series of such fluctuations through space and time ("rogue fluctuations") are likely to exist in the late time dynamics of the granular chain system at zero dissipation. We show that while hot spots are common in late time evolution, rogue fluctuations are seen in purely nonlinear systems (i.e., no precompression) at late enough times. We next show that the number of such fluctuations grows exponentially with increasing nonlinearity whereas rogue fluctuations decrease superexponentially with increasing precompression. Dissipation-free granular alignment systems may be possible to realize as integrated circuits and hence our observations may potentially be testable in the laboratory.
NASA Astrophysics Data System (ADS)
Zhang, Qian; Harman, Ciaran J.; Kirchner, James W.
2018-02-01
River water-quality time series often exhibit fractal scaling, which here refers to autocorrelation that decays as a power law over some range of scales. Fractal scaling presents challenges to the identification of deterministic trends because (1) fractal scaling has the potential to lead to false inference about the statistical significance of trends and (2) the abundance of irregularly spaced data in water-quality monitoring networks complicates efforts to quantify fractal scaling. Traditional methods for estimating fractal scaling - in the form of spectral slope (β) or other equivalent scaling parameters (e.g., Hurst exponent) - are generally inapplicable to irregularly sampled data. Here we consider two types of estimation approaches for irregularly sampled data and evaluate their performance using synthetic time series. These time series were generated such that (1) they exhibit a wide range of prescribed fractal scaling behaviors, ranging from white noise (β = 0) to Brown noise (β = 2) and (2) their sampling gap intervals mimic the sampling irregularity (as quantified by both the skewness and mean of gap-interval lengths) in real water-quality data. The results suggest that none of the existing methods fully account for the effects of sampling irregularity on β estimation. First, the results illustrate the danger of using interpolation for gap filling when examining autocorrelation, as the interpolation methods consistently underestimate or overestimate β under a wide range of prescribed β values and gap distributions. Second, the widely used Lomb-Scargle spectral method also consistently underestimates β. A previously published modified form, using only the lowest 5 % of the frequencies for spectral slope estimation, has very poor precision, although the overall bias is small. 
Third, a recent wavelet-based method, coupled with an aliasing filter, generally has the smallest bias and root-mean-squared error among all methods for a wide range of prescribed β values and gap distributions. The aliasing method, however, does not itself account for sampling irregularity, and this introduces some bias in the result. Nonetheless, the wavelet method is recommended for estimating β in irregular time series until improved methods are developed. Finally, all methods' performances depend strongly on the sampling irregularity, highlighting that the accuracy and precision of each method are data specific. Accurately quantifying the strength of fractal scaling in irregular water-quality time series remains an unresolved challenge for the hydrologic community and for other disciplines that must grapple with irregular sampling.
NASA Astrophysics Data System (ADS)
Sun, Dongye; Lin, Xinyou; Qin, Datong; Deng, Tao
2012-11-01
Energy management (EM) is a core technique of the hybrid electric bus (HEB) for optimizing fuel economy, and is unique to the corresponding configuration. Existing control-strategy algorithms seldom take battery power management into account together with internal combustion engine power management. In this paper, a power-balancing instantaneous optimization (PBIO) energy management control strategy is proposed for a novel series-parallel hybrid electric bus. According to the characteristics of the novel series-parallel architecture, the switching boundary condition between series and parallel modes as well as the control rules of the power-balancing strategy are developed. An equivalent fuel model of the battery is implemented and combined with the engine fuel model to constitute the objective function, which minimizes the fuel consumption at each sampled time and coordinates the power distribution in real time between the engine and battery. To validate that the proposed strategy is effective and reasonable, a forward model is built in Matlab/Simulink for simulation, and a dSPACE AutoBox is applied as a controller for hardware-in-the-loop bench testing. Both the simulation and hardware-in-the-loop results demonstrate that the proposed strategy not only sustains the battery SOC within its operational range and keeps the engine operating point in the peak-efficiency region, but also improves the fuel economy of the series-parallel hybrid electric bus (SPHEB) by up to 30.73% compared with the prototype bus; relative to a rule-based strategy, the PBIO strategy reduces fuel consumption by up to 12.38%. The proposed research ensures that the PBIO algorithm is applicable in real time, improves the efficiency of the SPHEB system, and suits complicated configurations well.
Climate change and precipitation evolution in Ifran region (Middle Atlas of Morocco).
NASA Astrophysics Data System (ADS)
Reddad, H.; Bakhat, M.; Damnati, B.
2012-04-01
Climate variability and extreme climatic events pose significant risks to human beings and generate terrestrial ecosystem dysfunctions. These effects are usually amplified by inappropriate use of the existing natural resources. To face the new context of climate change, a rational and efficient use of these resources - particularly water - must be implemented on global and regional scales. Annual precipitation provides an overall amount of water, but the assessment and management of this water are complicated by the spatio-temporal variation of disturbance (aridity, rainfall intensity, length of dry season...). Therefore, understanding rainfall behavior would at least help to plan interventions to manage this resource and protect the ecosystems that depend on it. Time-series analysis has become one of the major tools in hydrology. It is used for building mathematical models to detect trends and shifts in hydrologic records and to forecast hydrologic events. In this paper we present a case study of the Ifran region, situated in the Middle Atlas Mountains in Morocco. This study deals with modeling and forecasting rainfall time series using monthly rainfall data for the period 1970-2005. To determine the seasonal properties of this series, we first used the Box-Jenkins methodology to build an ARIMA model, and we extended the analysis with the Hylleberg-Engle-Granger-Yoo (HEGY) tests. The results of the time series modeling showed the presence of a significant deterministic seasonal pattern and no seasonal unit roots, meaning that the series is stationary at all frequencies. The model can be used to predict rainfall in Ifran and nearby sites; this prediction is not without interest insofar as any information about these random variables could contribute to research on coping with climate change. It does not provide solutions to eradicate precipitation variability, but rather helps in adapting to it.
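The Box-Jenkins/ARIMA modelling itself is not reproduced here, but the reported finding of a significant deterministic seasonal pattern suggests a much simpler baseline: fit a linear trend plus fixed monthly means and extend both over the forecast horizon. This numpy sketch is a hypothetical stand-in for the paper's model, useful only to illustrate forecasting from a deterministic seasonal component; the function name and defaults are illustrative.

```python
import numpy as np

def seasonal_forecast(y, period=12, horizon=12):
    """Fit a deterministic seasonal pattern (mean per month after removing a
    linear trend) and extend trend plus seasonality over the horizon."""
    n = len(y)
    t = np.arange(n)
    slope, intercept = np.polyfit(t, y, 1)           # long-term linear trend
    detrended = y - (slope * t + intercept)
    seasonal = np.array([detrended[k::period].mean() for k in range(period)])
    tf = np.arange(n, n + horizon)
    return slope * tf + intercept + seasonal[tf % period]
```

On a synthetic monthly series with trend, an annual cycle, and noise, the forecast tracks the true continuation closely.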
Multiple Indicator Stationary Time Series Models.
ERIC Educational Resources Information Center
Sivo, Stephen A.
2001-01-01
Discusses the propriety and practical advantages of specifying multivariate time series models in the context of structural equation modeling for time series and longitudinal panel data. For time series data, the multiple indicator model specification improves on classical time series analysis. For panel data, the multiple indicator model…
Efficient Algorithms for Segmentation of Item-Set Time Series
NASA Astrophysics Data System (ADS)
Chundi, Parvathi; Rosenkrantz, Daniel J.
We propose a special type of time series, which we call an item-set time series, to facilitate the temporal analysis of software version histories, email logs, stock market data, etc. In an item-set time series, each observed data value is a set of discrete items. We formalize the concept of an item-set time series and present efficient algorithms for segmenting a given item-set time series. Segmentation of a time series partitions the time series into a sequence of segments where each segment is constructed by combining consecutive time points of the time series. Each segment is associated with an item set that is computed from the item sets of the time points in that segment, using a function which we call a measure function. We then define a concept called the segment difference, which measures the difference between the item set of a segment and the item sets of the time points in that segment. The segment difference values are required to construct an optimal segmentation of the time series. We describe novel and efficient algorithms to compute segment difference values for each of the measure functions described in the paper. We outline a dynamic programming based scheme to construct an optimal segmentation of the given item-set time series. We use the item-set time series segmentation techniques to analyze the temporal content of three different data sets—Enron email, stock market data, and a synthetic data set. The experimental results show that an optimal segmentation of item-set time series data captures much more temporal content than a segmentation constructed based on the number of time points in each segment, without examining the item set data at the time points, and can be used to analyze different types of temporal data.
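The dynamic-programming scheme is easiest to see on a numeric analogue of the item-set case: precompute a segment-cost function (here, within-segment squared error standing in for the paper's segment difference), then fill a table over (time point, segment count). The following sketch is illustrative and does not implement the item-set measure functions.

```python
import numpy as np

def optimal_segmentation(x, k):
    """Split x into k contiguous segments minimizing total within-segment
    squared deviation from the segment mean, via dynamic programming."""
    n = len(x)
    ps = np.concatenate([[0.0], np.cumsum(x)])             # prefix sums
    ps2 = np.concatenate([[0.0], np.cumsum(np.square(x))])  # prefix sums of squares

    def cost(i, j):  # SSE of segment x[i..j], inclusive, in O(1)
        m = j - i + 1
        s = ps[j + 1] - ps[i]
        return (ps2[j + 1] - ps2[i]) - s * s / m

    dp = np.full((k + 1, n), np.inf)
    back = np.zeros((k + 1, n), dtype=int)
    for j in range(n):
        dp[1, j] = cost(0, j)
    for seg in range(2, k + 1):
        for j in range(seg - 1, n):
            for i in range(seg - 1, j + 1):
                c = dp[seg - 1, i - 1] + cost(i, j)
                if c < dp[seg, j]:
                    dp[seg, j], back[seg, j] = c, i
    bounds, j = [], n - 1                                   # recover boundaries
    for seg in range(k, 1, -1):
        i = back[seg, j]
        bounds.append(i)
        j = i - 1
    return sorted(bounds), dp[k, n - 1]
```

On a piecewise-constant series, the recovered boundaries fall exactly at the level changes with zero residual cost.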
A novel weight determination method for time series data aggregation
NASA Astrophysics Data System (ADS)
Xu, Paiheng; Zhang, Rong; Deng, Yong
2017-09-01
Aggregation in time series is of great importance in time series smoothing, prediction and other time series analysis processes, which makes it crucial to determine the weights in time series correctly and reasonably. In this paper, a novel method to obtain the weights in time series is proposed, in which we adopt the induced ordered weighted aggregation (IOWA) operator and the visibility graph averaging (VGA) operator and linearly combine the weights separately generated by the two operators. The IOWA operator is introduced to the weight determination of time series, through which the time decay factor is taken into consideration. The VGA operator generates weights with respect to the degree distribution in the visibility graph constructed from the corresponding time series, which reflects the relative importance of vertices in the time series. The proposed method is applied to two practical datasets to illustrate its merits. The aggregation of the Construction Cost Index (CCI) demonstrates the ability of the proposed method to smooth time series, while the aggregation of the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) illustrates how the proposed method maintains the variation tendency of the original data.
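The VGA half of the method can be sketched directly: build the natural visibility graph of the series (two points are linked when every intermediate point lies strictly below the connecting line) and normalize the vertex degrees into weights. The IOWA operator and the linear combination of the two weight vectors are omitted; the function names are illustrative.

```python
import numpy as np

def visibility_degrees(y):
    """Degree of each point in the natural visibility graph of a series."""
    n = len(y)
    deg = np.zeros(n, dtype=int)
    for a in range(n):
        for b in range(a + 1, n):
            # (a, b) are linked if every point between them lies strictly
            # below the straight line connecting them
            visible = all(
                y[c] < y[a] + (y[b] - y[a]) * (c - a) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                deg[a] += 1
                deg[b] += 1
    return deg

def vga_weights(y):
    """Weights proportional to visibility-graph degree, so structurally
    prominent points in the series receive more weight."""
    d = visibility_degrees(y).astype(float)
    return d / d.sum()
```

On a spiked series, the peak sees past its neighbors and receives the largest weight.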
Generalization bounds of ERM-based learning processes for continuous-time Markov chains.
Zhang, Chao; Tao, Dacheng
2012-12-01
Many existing results on statistical learning theory are based on the assumption that samples are independently and identically distributed (i.i.d.). However, the assumption of i.i.d. samples is not suitable for practical application to problems in which samples are time dependent. In this paper, we are mainly concerned with the empirical risk minimization (ERM) based learning process for time-dependent samples drawn from a continuous-time Markov chain. This learning process covers many kinds of practical applications, e.g., the prediction for a time series and the estimation of channel state information. Thus, it is significant to study its theoretical properties including the generalization bound, the asymptotic convergence, and the rate of convergence. It is noteworthy that, since samples are time dependent in this learning process, the concerns of this paper cannot (at least straightforwardly) be addressed by existing methods developed under the sample i.i.d. assumption. We first develop a deviation inequality for a sequence of time-dependent samples drawn from a continuous-time Markov chain and present a symmetrization inequality for such a sequence. By using the resultant deviation inequality and symmetrization inequality, we then obtain the generalization bounds of the ERM-based learning process for time-dependent samples drawn from a continuous-time Markov chain. Finally, based on the resultant generalization bounds, we analyze the asymptotic convergence and the rate of convergence of the learning process.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-29
... DEPARTMENT OF TRANSPORTATION Federal Aviation Administration 14 CFR Part 39 [Docket No. FAA-2011... installed on Airbus Model A330-200 and -300 series airplanes, Model A340-200 and -300 series airplanes, and Model A340-500 and -600 series airplanes. That NPRM proposed to supersede an existing AD. That NPRM...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-20
... Airworthiness Directives; Teledyne Continental Motors (TCM) and Rolls-Royce Motors Ltd. (R-RM) Series... superseding an existing airworthiness directive (AD) for certain TCM and R-RM series reciprocating engines... adds R-RM C-125, C- 145, O-300, IO-360, TSIO-360, and LTSIO-520-AE series reciprocating engines to the...
Association mining of dependency between time series
NASA Astrophysics Data System (ADS)
Hafez, Alaaeldin
2001-03-01
Time series analysis is considered a crucial component of strategic control over a broad variety of disciplines in business, science and engineering. Time series data is a sequence of observations collected over intervals of time. Each time series describes a phenomenon as a function of time. Analysis of time series data includes discovering trends (or patterns) in a time series sequence. In the last few years, data mining has emerged and been recognized as a new technology for data analysis. Data mining is the process of discovering potentially valuable patterns, associations, trends, sequences and dependencies in data. Data mining techniques can discover information that many traditional business analysis and statistical techniques fail to deliver. In this paper, we adapt and innovate data mining techniques to analyze time series data. By using data mining techniques, maximal frequent patterns are discovered and used in predicting future sequences or trends, where trends describe the behavior of a sequence. In order to include different types of time series (e.g. irregular and non-systematic), we consider past frequent patterns of the same time sequences (local patterns) and of other dependent time sequences (global patterns). We use the word 'dependent' instead of the word 'similar' for emphasis on real-life time series, where two time series sequences could be completely different (in values, shapes, etc.) but still react to the same conditions in a dependent way. In this paper, we propose the Dependence Mining Technique that could be used in predicting time series sequences. The proposed technique consists of three phases: (a) for all time series sequences, generate their trend sequences; (b) discover maximal frequent trend patterns and generate pattern vectors (to keep information on frequent trend patterns); (c) use trend pattern vectors to predict future time series sequences.
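Phase (a) and a toy version of phase (b) can be sketched in a few lines: map consecutive differences to trend symbols, then count sub-patterns of a fixed length and keep those above a support threshold. Real maximal-frequent-pattern mining (and the pattern vectors of the proposed technique) is more involved; this is only an illustrative reduction, with hypothetical function names.

```python
from collections import Counter

def trend_sequence(x):
    """Map a numeric series to trend symbols: 'U' (up), 'D' (down),
    'S' (steady) for each consecutive pair of values."""
    return "".join(
        "U" if b > a else "D" if b < a else "S" for a, b in zip(x, x[1:])
    )

def frequent_trend_patterns(trends, length, min_count):
    """Count all trend sub-patterns of a given length and keep the frequent
    ones -- a toy stand-in for the frequent-pattern discovery phase."""
    c = Counter(trends[i:i + length] for i in range(len(trends) - length + 1))
    return {p: k for p, k in c.items() if k >= min_count}
```

On a zig-zag series the recurring up-up / up-down / down-down motifs are recovered as the frequent patterns.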
Clean Floquet Time Crystals: Models and Realizations in Cold Atoms
NASA Astrophysics Data System (ADS)
Huang, Biao; Wu, Ying-Hai; Liu, W. Vincent
2018-03-01
Time crystals, a phase showing spontaneous breaking of time-translation symmetry, have been an intriguing subject for systems far from equilibrium. Recent experiments found such a phase in both the presence and the absence of localization, while in theories localization by disorder is usually assumed a priori. In this work, we point out that time crystals can generally exist in systems without disorder. A series of clean quasi-one-dimensional models under Floquet driving are proposed to demonstrate this unexpected result in principle. Robust time crystalline orders are found in the strongly interacting regime, along with emergent integrals of motion in the dynamical system, which can be characterized by level statistics and out-of-time-ordered correlators. We propose two cold atom experimental schemes to realize the clean Floquet time crystals, one making use of dipolar gases and another using synthetic dimensions.
Mott Time Crystal: Models and Realizations in Cold Atoms
NASA Astrophysics Data System (ADS)
Huang, Biao; Wu, Ying-Hai; Liu, W. Vincent
2017-04-01
Time crystals, a phase showing spontaneous breaking of time-translation symmetry, have been an intriguing subject for systems far from equilibrium. Recent experiments found such a phase both in the presence and in the absence of localization, while in theories localization is usually assumed a priori. In this work, we point out that time crystals can generally exist in systems without disorder and need not be in a pre-thermal state. A series of driven interacting ladder models is proposed to demonstrate this unexpected result in principle. Robust time crystalline orders are found in the Mott regime due to the emergent integrals of motion in the dynamical system, which can be characterized by out-of-time-order correlators (OTOC). We propose two cold atom experimental schemes to realize the Mott time crystals, one making use of dipolar gases and another using synthetic dimensions. U.S. ARO (W911NF-11-1-0230), AFOSR (FA9550-16-1-0006).
Generating synthetic wave climates for coastal modelling: a linear mixed modelling approach
NASA Astrophysics Data System (ADS)
Thomas, C.; Lark, R. M.
2013-12-01
Numerical coastline morphological evolution models require wave climate properties to drive morphological change through time. Wave climate properties (typically wave height, period and direction) may be temporally fixed, culled from real wave buoy data, or allowed to vary in some way defined by a Gaussian or other pdf. However, to examine sensitivity of coastline morphologies to wave climate change, it seems desirable to be able to modify wave climate time series from a current to some new state along a trajectory, but in a way consistent with, or initially conditioned by, the properties of existing data, or to generate fully synthetic data sets with realistic time series properties. For example, mean or significant wave height time series may have underlying periodicities, as revealed in numerous analyses of wave data. Our motivation is to develop a simple methodology to generate synthetic wave climate time series that can change in some stochastic way through time. We wish to use such time series in a coastline evolution model to test sensitivities of coastal landforms to changes in wave climate over decadal and centennial scales. We have worked initially on time series of significant wave height, based on data from a Waverider III buoy located off the coast of Yorkshire, England. The statistical framework for the simulation is the linear mixed model. The target variable, perhaps after transformation (Box-Cox), is modelled as a multivariate Gaussian, the mean modelled as a function of a fixed effect, and two random components, one of which is independently and identically distributed (iid) and the second of which is temporally correlated. The model was fitted to the data by likelihood methods. We considered the option of a periodic mean, the period either fixed (e.g. at 12 months) or estimated from the data. We considered two possible correlation structures for the second random effect. In one the correlation decays exponentially with time. 
In the second (spherical) model, it cuts off at a temporal range. Having fitted the model, multiple realisations were generated; the random effects were simulated by specifying a covariance matrix for the simulated values, with the estimated parameters. The Cholesky factorisation of the covariance matrix was computed and realisations of the random component of the model generated by pre-multiplying a vector of iid standard Gaussian variables by the lower triangular factor. The resulting random variate was added to the mean value computed from the fixed effects, and the result back-transformed to the original scale of the measurement. Realistic simulations result from the approach described above. Background exploratory data analysis was undertaken on 20-day sets of 30-minute buoy data, selected from days 5-24 of January, April, July, and October 2011, to elucidate daily to weekly variations and to keep the numerical analysis computationally tractable. Work remains to be undertaken to develop suitable models for synthetic directional data. We suggest that the general principles of the method will have applications in other geomorphological modelling endeavours requiring time series of stochastically variable environmental parameters.
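The simulation recipe in the record above (build a covariance matrix from the fitted parameters, Cholesky-factorise it, pre-multiply iid standard Gaussians by the lower triangular factor, then add the fixed-effect mean) can be sketched as follows. The function name and the exponential-covariance parameter values are illustrative, not the fitted Yorkshire-buoy estimates:

```python
import numpy as np

def simulate_correlated_series(n, mean, sigma2_iid, sigma2_corr, range_par, rng=None):
    """One realisation of a linear mixed model: fixed-effect mean +
    iid Gaussian noise + a temporally correlated Gaussian component
    with exponentially decaying correlation (illustrative parameters)."""
    rng = np.random.default_rng() if rng is None else rng
    t = np.arange(n, dtype=float)
    # Exponential covariance: C(h) = sigma2_corr * exp(-|h| / range_par)
    cov = sigma2_corr * np.exp(-np.abs(t[:, None] - t[None, :]) / range_par)
    # Small jitter keeps the factorisation numerically stable
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))
    correlated = L @ rng.standard_normal(n)   # pre-multiply iid normals
    iid = np.sqrt(sigma2_iid) * rng.standard_normal(n)
    return mean + correlated + iid
```

A back-transformation (e.g. inverse Box-Cox) would be applied to the result when the target variable was transformed before fitting.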
Scaling behaviors of precipitation over China
NASA Astrophysics Data System (ADS)
Jiang, Lei; Li, Nana; Zhao, Xia
2017-04-01
Scaling behaviors in precipitation time series from 1951 to 2009 over China are investigated with the detrended fluctuation analysis (DFA) method. The results show that long-term memory exists in the precipitation time series at some stations, where the values of the scaling exponent α are less than 0.62, implying weak persistence; the values of the scaling exponent at other stations indicate random behavior. In addition, the scaling exponent α of the precipitation records varies from station to station over China. A numerical test is made to verify the significance of the DFA exponents by shuffling the data records many times. We consider an exponent significant when the value for the original precipitation record lies above the 95 % confidence threshold obtained from many shuffled versions of that record. By comparison, the daily precipitation records exhibit weak positive long-range correlations in a power-law fashion mainly at stations with zonal distributions in south China, the upper and middle reaches of the Yellow River, and the northern part of northeast China. This may be related to the subtropical high. Furthermore, the values of the scaling exponent that cannot pass the significance test do not show a clear distribution pattern; the corresponding stations are mainly distributed in coastal areas, southwest China, and the southern part of north China. In fact, many complicated factors may affect the scaling behaviors of precipitation, such as the east and south Asian monsoon systems, the interaction between sea and land, and the large landform of the Tibetan Plateau. These results may provide a better prerequisite for long-term prediction of precipitation time series in different regions over China.
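The DFA procedure used above can be illustrated with a minimal implementation: integrate the mean-removed series, detrend non-overlapping segments of each scale with a linear fit, and read α off the slope of log F(s) versus log s. The function name and scale choices are illustrative:

```python
import numpy as np

def dfa_exponent(x, scales):
    """First-order DFA: returns the scaling exponent alpha from a
    least-squares fit of log F(s) against log s."""
    y = np.cumsum(x - np.mean(x))          # integrated profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        rms = []
        for seg in segs:                    # detrend each segment linearly
            coef = np.polyfit(t, seg, 1)
            rms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(rms)))     # fluctuation function at scale s
    return np.polyfit(np.log(scales), np.log(F), 1)[0]
```

White noise yields α ≈ 0.5 (no memory), while α > 0.5 indicates the persistence discussed in the record.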
Tweedell, Andrew J.; Haynes, Courtney A.
2017-01-01
The timing of muscle activity is a commonly applied analytic method to understand how the nervous system controls movement. This study systematically evaluates six classes of standard and statistical algorithms to determine muscle onset in both experimental surface electromyography (EMG) and simulated EMG with a known onset time. Eighteen participants had EMG collected from the biceps brachii and vastus lateralis while performing a biceps curl or knee extension, respectively. Three established methods and three statistical methods for EMG onset were evaluated. Linear envelope, Teager-Kaiser energy operator + linear envelope and sample entropy were the established methods evaluated while general time series mean/variance, sequential and batch processing of parametric and nonparametric tools, and Bayesian changepoint analysis were the statistical techniques used. Visual EMG onset (experimental data) and objective EMG onset (simulated data) were compared with algorithmic EMG onset via root mean square error and linear regression models for stepwise elimination of inferior algorithms. The top algorithms for both data types were analyzed for their mean agreement with the gold standard onset and evaluation of 95% confidence intervals. The top algorithms were all Bayesian changepoint analysis iterations where the parameter of the prior (p0) was zero. The best performing Bayesian algorithms were p0 = 0 and a posterior probability for onset determination at 60–90%. While existing algorithms performed reasonably, the Bayesian changepoint analysis methodology provides greater reliability and accuracy when determining the singular onset of EMG activity in a time series. Further research is needed to determine if this class of algorithms perform equally well when the time series has multiple bursts of muscle activity. PMID:28489897
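A minimal stand-in for the changepoint idea behind the top-performing algorithms above is a two-segment Gaussian likelihood scan over candidate onset times; the actual Bayesian changepoint analysis (with its prior parameter p0 and posterior probability threshold) is richer than this sketch, and the function name and parameters here are hypothetical:

```python
import numpy as np

def single_changepoint(x, min_seg=10):
    """Estimate one changepoint by minimising the two-segment Gaussian
    negative log-likelihood (up to constants, n_i * log(var_i) per segment).
    A variance jump, as at EMG onset, is what this cost detects."""
    n = len(x)
    best_i, best_cost = None, np.inf
    for i in range(min_seg, n - min_seg):
        a, b = x[:i], x[i:]
        cost = i * np.log(a.var() + 1e-12) + (n - i) * np.log(b.var() + 1e-12)
        if cost < best_cost:
            best_cost, best_i = cost, i
    return best_i
```

On a simulated record with a quiet baseline followed by a burst of higher-variance activity, the minimiser lands near the true onset sample.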
On the Selection of Models for Runtime Prediction of System Resources
NASA Astrophysics Data System (ADS)
Casolari, Sara; Colajanni, Michele
Applications and services delivered through large Internet data centers are now feasible thanks to network and server improvements, but also to virtualization, dynamic allocation of resources, and dynamic migrations. The large number of servers and resources involved in these systems requires autonomic management strategies, because no number of human administrators would be capable of cloning and migrating virtual machines in time, or of re-distributing and re-mapping the underlying hardware. At the basis of most autonomic management decisions is the need for systems to evaluate their own global behavior and change it when the evaluation indicates that they are not accomplishing what they were intended to do, or that some relevant anomalies are occurring. Decision algorithms have to satisfy constraints at different time scales. In this chapter we are interested in short-term contexts, where runtime prediction models work on time series coming from samples of monitored system resources, such as disk, CPU, and network utilization. In such environments, we have to address two main issues. First, the original time series have limited predictability because the measurements are affected by noise due to system instability, variable offered load, heavy-tailed distributions, and hardware and software interactions. Moreover, there are no existing criteria that can help us choose a suitable prediction model and related parameters so as to guarantee adequate prediction quality. In this chapter, we evaluate the impact that different choices of prediction model have on different time series, and we suggest how to treat the input data and whether it is convenient to choose the parameters of a prediction model in a static or dynamic way. Our conclusions are supported by a large set of analyses on realistic and synthetic data traces.
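A common baseline for the short-term runtime prediction described above is one-step-ahead exponential smoothing of the noisy utilization samples; the smoothing parameter would be one of the model parameters chosen statically or adapted dynamically, as the chapter discusses. The function name and the value of `alpha` are illustrative:

```python
import numpy as np

def ewma_predict(series, alpha=0.3):
    """One-step-ahead prediction of a noisy resource-utilization series
    by exponentially weighted moving average: each prediction blends the
    last observed sample with the previous prediction."""
    pred = np.empty(len(series))
    pred[0] = series[0]                      # no history: predict the first sample
    for t in range(1, len(series)):
        pred[t] = alpha * series[t - 1] + (1 - alpha) * pred[t - 1]
    return pred
```

Larger `alpha` tracks load changes faster but passes more measurement noise through to the prediction, which is exactly the static-versus-dynamic parameter trade-off the chapter evaluates.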
Information-Theoretical Analysis of EEG Microstate Sequences in Python.
von Wegner, Frederic; Laufs, Helmut
2018-01-01
We present an open-source Python package to compute information-theoretical quantities for electroencephalographic data. Electroencephalography (EEG) measures the electrical potential generated by the cerebral cortex and the set of spatial patterns projected by the brain's electrical potential on the scalp surface can be clustered into a set of representative maps called EEG microstates. Microstate time series are obtained by competitively fitting the microstate maps back into the EEG data set, i.e., by substituting the EEG data at a given time with the label of the microstate that has the highest similarity with the actual EEG topography. As microstate sequences consist of non-metric random variables, e.g., the letters A-D, we recently introduced information-theoretical measures to quantify these time series. In wakeful resting state EEG recordings, we found new characteristics of microstate sequences such as periodicities related to EEG frequency bands. The algorithms used are here provided as an open-source package and their use is explained in a tutorial style. The package is self-contained and the programming style is procedural, focusing on code intelligibility and easy portability. Using a sample EEG file, we demonstrate how to perform EEG microstate segmentation using the modified K-means approach, and how to compute and visualize the recently introduced information-theoretical tests and quantities. The time-lagged mutual information function is derived as a discrete symbolic alternative to the autocorrelation function for metric time series and confidence intervals are computed from Markov chain surrogate data. The software package provides an open-source extension to the existing implementations of the microstate transform and is specifically designed to analyze resting state EEG recordings.
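The time-lagged mutual information described above, the symbolic analogue of the autocorrelation function, can be computed for a labelled microstate sequence as below. This is a minimal sketch, not the package's own implementation, which adds Markov-surrogate confidence intervals:

```python
import numpy as np

def lagged_mutual_information(labels, lag):
    """Mutual information (in bits) between a symbolic sequence and
    itself shifted by `lag` -- a discrete alternative to the
    autocorrelation function for non-metric data such as labels A-D."""
    x, y = np.asarray(labels[:-lag]), np.asarray(labels[lag:])
    mi = 0.0
    for a in np.unique(labels):
        pa = np.mean(x == a)                 # marginal of the unshifted sequence
        for b in np.unique(labels):
            pb = np.mean(y == b)             # marginal of the shifted sequence
            pab = np.mean((x == a) & (y == b))
            if pab > 0:
                mi += pab * np.log2(pab / (pa * pb))
    return mi
```

For a perfectly periodic two-symbol sequence, the mutual information at a lag equal to the period recovers the full one bit of entropy, which is how periodicities related to EEG frequency bands show up as peaks in the lagged-MI function.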
Fast and Flexible Multivariate Time Series Subsequence Search
NASA Technical Reports Server (NTRS)
Bhaduri, Kanishka; Oza, Nikunj C.; Zhu, Qiang; Srivastava, Ashok N.
2010-01-01
Multivariate Time-Series (MTS) are ubiquitous, and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical monitoring, and financial systems. Domain experts are often interested in searching for interesting multivariate patterns in these MTS databases, which often contain several gigabytes of data. Surprisingly, research on MTS search is very limited. Most of the existing work only supports queries of the same length as the indexed data, or queries on a fixed set of variables. In this paper, we propose an efficient and flexible subsequence search framework for massive MTS databases that, for the first time, enables querying on any subset of variables with arbitrary time delays between them. We propose two algorithms to solve this problem: (1) a List Based Search (LBS) algorithm which uses sorted lists for indexing, and (2) an R*-tree Based Search (RBS) which uses Minimum Bounding Rectangles (MBR) to organize the subsequences. Both algorithms guarantee that all matching patterns within the specified thresholds will be returned (no false dismissals). The very few false alarms can be removed by a post-processing step. Since our framework is also capable of Univariate Time-Series (UTS) subsequence search, we first demonstrate the efficiency of our algorithms on several UTS datasets previously used in the literature. We follow this up with experiments using two large MTS databases from the aviation domain, each containing several millions of observations. Both these tests show that our algorithms have very high prune rates (>99%), thus requiring actual disk access for less than 1% of the observations. To the best of our knowledge, MTS subsequence search has never been attempted on datasets of the size we have used in this paper.
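For orientation, the UTS special case of the problem above can be solved by a brute-force scan with early abandonment: a window is pruned as soon as its partial distance exceeds the threshold. This sketch omits the LBS/RBS index structures that give the paper its >99% prune rates; names and the threshold semantics (Euclidean distance) are illustrative:

```python
def subsequence_search(series, query, threshold):
    """Return start offsets of all windows of `series` whose Euclidean
    distance to `query` is within `threshold` (no false dismissals)."""
    m, hits = len(query), []
    thr2 = threshold ** 2                    # compare squared distances
    for start in range(len(series) - m + 1):
        acc = 0.0
        for j in range(m):
            acc += (series[start + j] - query[j]) ** 2
            if acc > thr2:                   # early abandonment: prune window
                break
        else:                                # loop ran to completion: a match
            hits.append(start)
    return hits
```

An index-based method replaces the outer scan with lookups that visit only candidate windows, which is what makes disk access for under 1% of observations possible.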
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brentlinger, L.A.; Hofmann, P.L.; Peterson, R.W.
1989-08-01
The movement of nuclear waste can be accomplished by various transport modal options involving different types of vehicles, transport casks, transport routes, and intermediate intermodal transfer facilities. A series of systems studies are required to evaluate modal/intermodal spent fuel transportation options in a consistent fashion. This report provides total life-cycle cost and life-cycle dose estimates for a series of transport modal options under existing site constraints. 14 refs., 7 figs., 28 tabs.
Duality between QCD perturbative series and power corrections
NASA Astrophysics Data System (ADS)
Narison, S.; Zakharov, V. I.
2009-08-01
We elaborate on the relation between perturbative and power-like corrections to short-distance-sensitive QCD observables. We confront theoretical expectations with explicit perturbative calculations existing in the literature. As expected, the quadratic correction is dual to a long perturbative series, and one should use one of them but not both. However, this might be true only for very long perturbative series, with the number of terms needed in most cases exceeding the number of terms available. What has not been foreseen is that the quartic corrections might also be dual to the perturbative series. If confirmed, this would imply a crucial modification of the dogma. We confront this quadratic correction with existing phenomenology (QCD (spectral) sum rules scales, determinations of the light quark masses and of αs from τ-decay). We find no contradiction and, to some extent, better agreement with the data and with recent lattice calculations.
Simulation Exploration through Immersive Parallel Planes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brunhart-Lupo, Nicholas J; Bush, Brian W; Gruchalla, Kenny M
We present a visualization-driven simulation system that tightly couples systems dynamics simulations with an immersive virtual environment to allow analysts to rapidly develop and test hypotheses in a high-dimensional parameter space. To accomplish this, we generalize the two-dimensional parallel-coordinates statistical graphic as an immersive 'parallel-planes' visualization for multivariate time series emitted by simulations running in parallel with the visualization. In contrast to traditional parallel coordinates' mapping of the multivariate dimensions onto coordinate axes represented by a series of parallel lines, we map pairs of the multivariate dimensions onto a series of parallel rectangles. As in the case of parallel coordinates, each individual observation in the dataset is mapped to a polyline whose vertices coincide with its coordinate values. Regions of the rectangles can be 'brushed' to highlight and select observations of interest; a 'slider' control allows the user to filter the observations by their time coordinate. In an immersive virtual environment, users interact with the parallel planes using a joystick that can select regions on the planes, manipulate selections, and filter time. The brushing and selection actions are used both to explore existing data and to launch additional simulations corresponding to the visually selected portions of the input parameter space. As soon as the new simulations complete, their resulting observations are displayed in the virtual environment. This tight feedback loop between simulation and immersive analytics accelerates users' realization of insights about the simulation and its output.
Simulation Exploration through Immersive Parallel Planes: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brunhart-Lupo, Nicholas; Bush, Brian W.; Gruchalla, Kenny
We present a visualization-driven simulation system that tightly couples systems dynamics simulations with an immersive virtual environment to allow analysts to rapidly develop and test hypotheses in a high-dimensional parameter space. To accomplish this, we generalize the two-dimensional parallel-coordinates statistical graphic as an immersive 'parallel-planes' visualization for multivariate time series emitted by simulations running in parallel with the visualization. In contrast to traditional parallel coordinates' mapping of the multivariate dimensions onto coordinate axes represented by a series of parallel lines, we map pairs of the multivariate dimensions onto a series of parallel rectangles. As in the case of parallel coordinates, each individual observation in the dataset is mapped to a polyline whose vertices coincide with its coordinate values. Regions of the rectangles can be 'brushed' to highlight and select observations of interest; a 'slider' control allows the user to filter the observations by their time coordinate. In an immersive virtual environment, users interact with the parallel planes using a joystick that can select regions on the planes, manipulate selections, and filter time. The brushing and selection actions are used both to explore existing data and to launch additional simulations corresponding to the visually selected portions of the input parameter space. As soon as the new simulations complete, their resulting observations are displayed in the virtual environment. This tight feedback loop between simulation and immersive analytics accelerates users' realization of insights about the simulation and its output.
Guastello, Stephen J; Reiter, Katherine; Shircel, Anton; Timm, Paul; Malon, Matthew; Fabisch, Megan
2014-07-01
This study examined the relationship between performance variability and actual performance of financial decision makers who were working under experimental conditions of increasing workload and fatigue. The rescaled range statistic, also known as the Hurst exponent (H), was used as an index of variability. Although H is defined as having a range between 0 and 1, 45% of the 172 time series generated by undergraduates were negative. Participants in the study chose the optimum investment out of sets of 3 to 5 options that were presented in a series of 350 displays. The sets of options varied in both the complexity of the options and the number of options under simultaneous consideration. One experimental condition required participants to make their choices within 15 sec, and the other condition required them to choose within 7.5 sec. Results showed that (a) negative H was possible and not a result of psychometric error; (b) negative H was associated with negative autocorrelations in a time series; (c) H was the best predictor of performance among the variables studied; (d) three other significant predictors were scores on an anagrams test and ratings of physical demands and performance demands; and (e) persistence, as evidenced by the autocorrelations, was associated with ratings of greater time pressure. It was concluded, furthermore, that persistence and overall performance were correlated, that 'healthy' variability exists only within a limited range, and that other individual differences related to ability and resistance to stress or fatigue are also involved in the prediction of performance.
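The rescaled range statistic named above can be sketched as follows: compute R/S (the range of the mean-adjusted cumulative sum, divided by the standard deviation) over subseries of increasing length and estimate H from the log-log slope. This conventional computation yields H in (0, 1); the negative values reported in the study arise from the authors' specific procedure, which this sketch does not reproduce:

```python
import numpy as np

def rescaled_range(x):
    """R/S statistic: range of the mean-adjusted cumulative sum of a
    series, divided by its standard deviation."""
    z = np.cumsum(x - np.mean(x))
    return (z.max() - z.min()) / np.std(x)

def hurst_exponent(x, min_n=8):
    """Estimate H from the slope of log(R/S) versus log(n) over
    non-overlapping subseries of dyadic lengths (a minimal sketch)."""
    n, sizes, rs = len(x), [], []
    size = min_n
    while size <= n:
        chunks = [x[i:i + size] for i in range(0, n - size + 1, size)]
        vals = [rescaled_range(c) for c in chunks if np.std(c) > 0]
        if vals:
            sizes.append(size)
            rs.append(np.mean(vals))
        size *= 2
    return np.polyfit(np.log(sizes), np.log(rs), 1)[0]
```

Uncorrelated noise gives H near 0.5, persistence gives H above 0.5, and anti-persistence (the negative autocorrelations the study associates with negative H) pushes the estimate below 0.5.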
ERIC Educational Resources Information Center
Ngan, Chun-Kit
2013-01-01
Making decisions over multivariate time series is an important topic which has gained significant interest in the past decade. A time series is a sequence of data points which are measured and ordered over uniform time intervals. A multivariate time series is a set of multiple, related time series in a particular domain in which domain experts…
A Statistical Reappraisal in the Relationship between Global and Greek Seismic Activity
NASA Astrophysics Data System (ADS)
Liritzis, I.; Diagourtas, D.; Makropoulos, C.
1995-01-01
For the period 1917-1987, Greek seismic activity exhibits a very significant positive correlation with the preceding global activity, with a time lag of 15 years. It seems that all of Greece, as well as the two characteristic areas into which we have separated it (Greece without the Arc, and the area of the Greek seismic Arc), follows the global seismic activity but with a time shift of 15 years. Moreover, an intrinsic interaction mechanism seems to exist between the Greek seismic arc and the rest of Greece, which may be deduced from the different behavior the two areas exhibit when correlated with the global activity, as well as from the correlation between themselves, where a very significant positive correlation has been found with a time lag of 3 years, with Greece without the arc preceding. A quasi-periodic term of 30 years is also observed in these four detailed seismic time series. Cross-correlation analysis of seismic time series, as shown, serves as a powerful tool to clarify the complicated space-time pattern of the worldwide mosaic of tectonic plate motions. The implications of the spring-block model of tectonic plate interaction are invoked, considering changes in the earth's rotation rate as their triggering agent. Particular emphasis is given to the potential of such studies in earthquake prediction efforts from local or regional scales to a global scale and vice versa.
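The lagged cross-correlation analysis used above amounts to correlating one activity series against the other shifted by each candidate lag and looking for the maximising lag. A minimal sketch (function name and lag convention are illustrative; positive lag means the first series leads):

```python
import numpy as np

def lagged_correlation(x, y, max_lag):
    """Pearson correlation of y against x for each lag 0..max_lag,
    where positive lag means x precedes y by `lag` steps."""
    out = {}
    for lag in range(max_lag + 1):
        a = x[:len(x) - lag] if lag else x   # x truncated at the end
        b = y[lag:]                          # y shifted forward by lag
        out[lag] = np.corrcoef(a, b)[0, 1]
    return out
```

A pronounced peak of the correlation at a nonzero lag, as the study reports at 15 years (global leading Greek) and 3 years (Greece-without-arc leading the arc), is the signature of a lead-lag relationship.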
Combining Fourier and lagged k-nearest neighbor imputation for biomedical time series data.
Rahman, Shah Atiqur; Huang, Yuxiao; Claassen, Jan; Heintzman, Nathaniel; Kleinberg, Samantha
2015-12-01
Most clinical and biomedical data contain missing values. A patient's record may be split across multiple institutions, devices may fail, and sensors may not be worn at all times. While these missing values are often ignored, this can lead to bias and error when the data are mined. Further, the data are not simply missing at random. Instead, the measurement of a variable such as blood glucose may depend on its prior values as well as those of other variables. These dependencies exist across time as well, but current methods have yet to incorporate these temporal relationships together with multiple types of missingness. To address this, we propose an imputation method (FLk-NN) that incorporates time-lagged correlations both within and across variables by combining two imputation methods, based on an extension to k-NN and on the Fourier transform. This enables imputation of missing values even when all data at a time point are missing and when there are different types of missingness both within and across variables. In comparison to other approaches on three biological datasets (simulated and actual Type 1 diabetes datasets, and multi-modality neurological ICU monitoring), the proposed method has the highest imputation accuracy. This remained true with up to half of the data missing and when consecutive missing values formed a significant fraction of the overall time series length. Copyright © 2015 Elsevier Inc. All rights reserved.
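The lagged k-NN half of the approach above can be illustrated with a univariate, single-lag sketch: impute a gap from the k historical time points whose lagged predecessor most resembles the gap's own predecessor. This is a simplified stand-in, not the FLk-NN algorithm itself; the Fourier component, multiple lags, and cross-variable neighbours are omitted, and the function name and defaults are hypothetical:

```python
import numpy as np

def lagged_knn_impute(series, t_missing, lag=1, k=3):
    """Impute series[t_missing] as the mean of the k observed values
    whose value `lag` steps earlier is closest to the value `lag`
    steps before the gap (time-lagged k-NN, univariate sketch)."""
    anchor = series[t_missing - lag]         # predecessor of the gap
    candidates = []
    for t in range(lag, len(series)):
        if t == t_missing or np.isnan(series[t]) or np.isnan(series[t - lag]):
            continue
        candidates.append((abs(series[t - lag] - anchor), series[t]))
    candidates.sort(key=lambda c: c[0])      # nearest lagged predecessors first
    return float(np.mean([v for _, v in candidates[:k]]))
```

Because the neighbours are matched on lagged values rather than on values at the same time point, the gap can be filled even when every variable is missing at that instant, which is the case the record highlights.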
NASA Astrophysics Data System (ADS)
George, Daniel; Huerta, E. A.
2018-03-01
The recent Nobel-prize-winning detections of gravitational waves from merging black holes and the subsequent detection of the collision of two neutron stars in coincidence with electromagnetic observations have inaugurated a new era of multimessenger astrophysics. To enhance the scope of this emergent field of science, we pioneered the use of deep learning with convolutional neural networks, that take time-series inputs, for rapid detection and characterization of gravitational wave signals. This approach, Deep Filtering, was initially demonstrated using simulated LIGO noise. In this article, we present the extension of Deep Filtering using real data from LIGO, for both detection and parameter estimation of gravitational waves from binary black hole mergers using continuous data streams from multiple LIGO detectors. We demonstrate for the first time that machine learning can detect and estimate the true parameters of real events observed by LIGO. Our results show that Deep Filtering achieves similar sensitivities and lower errors compared to matched-filtering while being far more computationally efficient and more resilient to glitches, allowing real-time processing of weak time-series signals in non-stationary non-Gaussian noise with minimal resources, and also enables the detection of new classes of gravitational wave sources that may go unnoticed with existing detection algorithms. This unified framework for data analysis is ideally suited to enable coincident detection campaigns of gravitational waves and their multimessenger counterparts in real-time.
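The basic building block of a time-series CNN like the one described above is a one-dimensional convolution layer followed by a nonlinearity. The sketch below shows only that forward-pass primitive with placeholder weights; it is not the trained Deep Filtering network, and the function name and shapes are illustrative:

```python
import numpy as np

def conv1d(x, kernels, stride=1):
    """Valid-mode 1-D convolution of a time series with a bank of
    filters, followed by ReLU. `kernels` has shape (n_filters, width);
    weights here are placeholders, not trained parameters."""
    width = kernels.shape[1]
    n_out = (len(x) - width) // stride + 1
    out = np.empty((kernels.shape[0], n_out))
    for f, k in enumerate(kernels):
        for i in range(n_out):
            s = i * stride
            out[f, i] = np.dot(x[s:s + width], k)   # sliding dot product
    return np.maximum(out, 0.0)                      # ReLU
```

Stacking such layers with pooling and a final dense classifier/regressor yields the detection and parameter-estimation outputs; the sliding dot product is also why a single forward pass scans an entire data stream efficiently.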
A Review of Subsequence Time Series Clustering
Teh, Ying Wah
2014-01-01
Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and backgrounds related to subsequence time series clustering. The categorization of the literature reviews is divided into three groups: preproof, interproof, and postproof period. Moreover, various state-of-the-art approaches in performing subsequence time series clustering are discussed under each of the following categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies. PMID:25140332
A review of subsequence time series clustering.
Zolhavarieh, Seyedjamal; Aghabozorgi, Saeed; Teh, Ying Wah
2014-01-01
Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and backgrounds related to subsequence time series clustering. The categorization of the literature reviews is divided into three groups: preproof, interproof, and postproof period. Moreover, various state-of-the-art approaches in performing subsequence time series clustering are discussed under each of the following categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies.
Relativistic Fluid Dynamics Far From Local Equilibrium
NASA Astrophysics Data System (ADS)
Romatschke, Paul
2018-01-01
Fluid dynamics is traditionally thought to apply only to systems near local equilibrium. In this case, the effective theory of fluid dynamics can be constructed as a gradient series. Recent applications of resurgence suggest that this gradient series diverges, but can be Borel resummed, giving rise to a hydrodynamic attractor solution which is well defined even for large gradients. Arbitrary initial data quickly approaches this attractor via nonhydrodynamic mode decay. This suggests the existence of a new theory of far-from-equilibrium fluid dynamics. In this Letter, the framework of fluid dynamics far from local equilibrium for a conformal system is introduced, and the hydrodynamic attractor solutions for resummed Baier-Romatschke-Son-Starinets-Stephanov theory, kinetic theory in the relaxation time approximation, and strongly coupled N =4 super Yang-Mills theory are identified for a system undergoing Bjorken flow.
Shaping low-thrust trajectories with thrust-handling feature
NASA Astrophysics Data System (ADS)
Taheri, Ehsan; Kolmanovsky, Ilya; Atkins, Ella
2018-02-01
Shape-based methods are becoming popular in low-thrust trajectory optimization due to their fast computation speeds. In existing shape-based methods constraints are treated at the acceleration level but not at the thrust level. These two constraint types are not equivalent since spacecraft mass decreases over time as fuel is expended. This paper develops a shape-based method based on a Fourier series approximation that is capable of representing trajectories defined in spherical coordinates and that enforces thrust constraints. An objective function can be incorporated to minimize overall mission cost, i.e., achieve minimum ΔV . A representative mission from Earth to Mars is studied. The proposed Fourier series technique is demonstrated capable of generating feasible and near-optimal trajectories. These attributes can facilitate future low-thrust mission designs where different trajectory alternatives must be rapidly constructed and evaluated.
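The core of a shape-based method like the one above is a parametric shape function, here a truncated Fourier series, whose coefficients are tuned to satisfy boundary and thrust constraints. The evaluation step can be sketched as below; the coefficients are illustrative, not mission values, and the constraint-enforcement machinery is omitted:

```python
import numpy as np

def fourier_shape(coeffs_a, coeffs_b, a0, t):
    """Evaluate a truncated Fourier series
    r(t) = a0 + sum_n [a_n cos(n t) + b_n sin(n t)],
    the kind of shape function used to represent one trajectory
    coordinate as a smooth function of the angular variable."""
    r = np.full_like(t, a0, dtype=float)
    for n, (a, b) in enumerate(zip(coeffs_a, coeffs_b), start=1):
        r += a * np.cos(n * t) + b * np.sin(n * t)
    return r
```

Because the series is smooth and differentiable in closed form, the acceleration, and with a mass model the thrust, along the shaped trajectory can be evaluated analytically, which is what makes enforcing thrust-level (not just acceleration-level) constraints tractable.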
Semiparametric modeling: Correcting low-dimensional model error in parametric models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berry, Tyrus, E-mail: thb11@psu.edu; Harlim, John, E-mail: jharlim@psu.edu; Department of Meteorology, the Pennsylvania State University, 503 Walker Building, University Park, PA 16802-5013
2016-03-01
In this paper, a semiparametric modeling approach is introduced as a paradigm for addressing model error arising from unresolved physical phenomena. Our approach compensates for model error by learning an auxiliary dynamical model for the unknown parameters. Practically, the proposed approach consists of the following steps. Given a physics-based model and a noisy data set of historical observations, a Bayesian filtering algorithm is used to extract a time series of the parameter values. Subsequently, the diffusion forecast algorithm is applied to the retrieved time series in order to construct the auxiliary model for the time-evolving parameters. The semiparametric forecasting algorithm consists of integrating the existing physics-based model with an ensemble of parameters sampled from the probability density function of the diffusion forecast. To specify initial conditions for the diffusion forecast, a Bayesian semiparametric filtering method that extends the Kalman-based filtering framework is introduced. In difficult test examples, which introduce chaotically and stochastically evolving hidden parameters into the Lorenz-96 model, we show that our approach can effectively compensate for model error, with forecasting skill comparable to that of the perfect model.
NASA Astrophysics Data System (ADS)
Anwar, Faizan; Bárdossy, András; Seidel, Jochen
2017-04-01
Estimating missing values in a time series of a hydrological variable is an everyday task for a hydrologist. Existing methods such as inverse distance weighting, multivariate regression, and kriging, though simple to apply, provide no indication of the quality of the estimated value and depend mainly on the values of neighboring stations at a given step in the time series. Copulas have the advantage of representing the pure dependence structure between two or more variables (given that the relationship between them is monotonic). They remove the need to decide how to transform the data before use or which functions should model the relationship between the considered variables. A copula-based approach is suggested to infill discharge, precipitation, and temperature data. As a first step the normal copula is used; subsequently, the necessity of using non-normal / non-symmetrical dependence is investigated. Discharge and temperature are treated as regular continuous variables and can be used without processing for infilling and quality checking. Due to the mixed distribution of precipitation values, precipitation has to be treated differently: a discrete probability is assigned to the zeros and the rest is treated as a continuous distribution. Building on the work of others, along with infilling, the normal copula is also utilized to identify values in a time series that might be erroneous. This is done by treating the available value as missing, infilling it using the normal copula, and checking whether it lies within a confidence band (5 to 95% in our case) of the obtained conditional distribution. Hydrological data from two catchments, the Upper Neckar River (Germany) and the Santa River (Peru), are used to demonstrate the application for datasets with different data quality. The Python code used here is also made available on GitHub. The required input is the time series of a given variable at different stations.
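The bivariate, continuous-variable case of the normal-copula infilling above can be sketched as follows: map target and neighbor to standard-normal scores through their empirical CDFs, condition on the neighbor's score, and back-transform the conditional quantiles to the data scale, yielding both an estimate and the confidence band used for quality checking. This is a minimal sketch under assumed conventions (one neighbor station, rank-based marginals); the function name is hypothetical and the mixed-distribution treatment of precipitation zeros is omitted:

```python
import numpy as np
from statistics import NormalDist

def normal_copula_infill(target, neighbor, t_missing, conf=(0.05, 0.95)):
    """Return [median, lower, upper] for target[t_missing] from a
    bivariate normal copula conditioned on the neighbor station."""
    nd = NormalDist()
    ok = ~np.isnan(target)
    x, y = target[ok], neighbor[ok]

    def to_score(v, sample):
        # empirical CDF -> standard-normal score (Weibull-type plotting position)
        p = (np.sum(sample <= v) + 0.5) / (len(sample) + 1)
        return nd.inv_cdf(p)

    zx = np.array([to_score(v, x) for v in x])
    zy = np.array([to_score(v, y) for v in y])
    rho = np.corrcoef(zx, zy)[0, 1]          # dependence on the normal scale
    z_cond = rho * to_score(neighbor[t_missing], y)
    sd = np.sqrt(1.0 - rho ** 2)             # conditional std of the copula
    qs = [nd.inv_cdf(c) * sd + z_cond for c in (0.5,) + conf]
    return [float(np.quantile(x, nd.cdf(q))) for q in qs]
```

The erroneous-value check in the record follows directly: treat an observed value as missing, infill it, and flag it if it falls outside the returned 5-95% band.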
NASA Astrophysics Data System (ADS)
Oriani, Fabio
2017-04-01
The unpredictable nature of rainfall makes its estimation as difficult as it is essential to hydrological applications. Stochastic simulation is often considered a convenient approach to assess the uncertainty of rainfall processes, but preserving their irregular behavior and variability at multiple scales is a challenge even for the most advanced techniques. In this presentation, an overview of the Direct Sampling technique [1] and its recent application to rainfall and hydrological data simulation [2, 3] is given. The algorithm, having its roots in multiple-point statistics, makes use of a training data set to simulate the outcome of a process without inferring any explicit probability measure: the data are simulated in time or space by sampling the training data set where a sufficiently similar group of neighbor data exists. This approach allows preserving complex statistical dependencies at different scales with a good approximation, while reducing the parameterization to the minimum. The strengths and weaknesses of the Direct Sampling approach are shown through a series of applications to rainfall and hydrological data: from time-series simulation to spatial rainfall fields conditioned by elevation or a climate scenario. In the era of vast databases, is this data-driven approach a valid alternative to parametric simulation techniques? [1] Mariethoz G., Renard P., and Straubhaar J. (2010), The Direct Sampling method to perform multiple-point geostatistical simulations, Water Resour. Res., 46(11), http://dx.doi.org/10.1029/2008WR007621 [2] Oriani F., Straubhaar J., Renard P., and Mariethoz G. (2014), Simulation of rainfall time series from different climatic regions using the direct sampling technique, Hydrol. Earth Syst. Sci., 18, 3015-3031, http://dx.doi.org/10.5194/hess-18-3015-2014 [3] Oriani F., Borghi A., Straubhaar J., Mariethoz G., Renard P. (2016), Missing data simulation inside flow rate time-series using multiple-point statistics, Environ. Model. Softw., vol. 86, pp. 264-276, http://dx.doi.org/10.1016/j.envsoft.2016.10.002
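A minimal one-dimensional sketch of the Direct Sampling idea summarized above: each new value is drawn by scanning the training series for a window similar to the most recently simulated values, without inferring any explicit probability measure. The training series, similarity threshold, and scan budget are illustrative assumptions, not parameters from the cited papers.

```python
import numpy as np

rng = np.random.default_rng(1)

# Training data: a hypothetical intermittent daily-rainfall-like series.
train = np.maximum(0.0, rng.normal(0.0, 1.0, 2000)) * (rng.random(2000) < 0.3)

def direct_sampling(train, n_sim, m=5, threshold=0.1, max_scan=500):
    """Minimal 1-D Direct Sampling sketch: for each new value, randomly scan
    the training series for a window whose last m values resemble the m most
    recent simulated values, and copy the value following the best match."""
    sim = list(train[:m])                       # seed with training values
    scale = np.ptp(train) or 1.0
    for _ in range(n_sim - m):
        pattern = np.array(sim[-m:])
        best_j, best_d = None, np.inf
        for j in rng.integers(m, len(train) - 1, size=max_scan):
            d = np.mean(np.abs(train[j - m:j] - pattern)) / scale
            if d < best_d:
                best_j, best_d = j, d
            if d <= threshold:                  # early stop on a good match
                break
        sim.append(train[best_j])
    return np.array(sim)

sim = direct_sampling(train, 300)
print(sim[:10])
```

Because every simulated value is copied from the training set, marginal statistics and short-range dependence are inherited from the data rather than from a fitted model, which is the core appeal of the approach.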
NASA Astrophysics Data System (ADS)
Elangasinghe, M. A.; Singhal, N.; Dirks, K. N.; Salmond, J. A.; Samarasinghe, S.
2014-09-01
This paper uses artificial neural networks (ANN), combined with k-means clustering, to understand the complex time series of PM10 and PM2.5 concentrations at a coastal location of New Zealand based on data from a single site. Out of the available meteorological parameters from the network (wind speed, wind direction, solar radiation, temperature, relative humidity), key factors governing the pattern of the time series concentrations were identified through input sensitivity analysis performed on the trained neural network model. The transport pathways of particulate matter under these key meteorological parameters were further analysed through bivariate concentration polar plots and k-means clustering techniques. The analysis shows that external sources such as marine aerosols and local sources such as traffic and biomass burning contribute equally to the particulate matter concentrations at the study site. These results are in agreement with the results of receptor modelling by the Auckland Council based on Positive Matrix Factorization (PMF). Our findings also show that contrasting concentration-wind speed relationships exist between marine aerosols and local traffic sources, resulting in very noisy and seemingly random PM10 concentrations. The inclusion of cluster rankings as an input parameter to the ANN model yielded a statistically significant (p < 0.005) improvement in the performance of the ANN time series model and also performed better at picking up high concentrations. For the presented case study, the correlation coefficient between observed and predicted concentrations improved from 0.77 to 0.79 for PM2.5 and from 0.63 to 0.69 for PM10, and the root mean squared error (RMSE) fell from 5.00 to 4.74 for PM2.5 and from 6.77 to 6.34 for PM10.
The techniques presented here enable the user to obtain an understanding of potential sources and their transport characteristics prior to the implementation of costly chemical analysis techniques or advanced air dispersion models.
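The cluster-ranking input described above can be illustrated with a toy example: cluster wind observations with k-means, rank the clusters by mean wind speed, and attach the per-sample rank as an extra model input. The synthetic data and the wind-component encoding are assumptions for illustration; the actual study used trained ANN models and measured meteorology.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical meteorology: wind speed (m/s) and wind direction, encoded as
# u/v components so that directions of 359 and 1 degrees remain close.
ws = rng.gamma(2.0, 2.0, 400)
wd = rng.uniform(0.0, 2.0 * np.pi, 400)
X = np.column_stack([ws * np.cos(wd), ws * np.sin(wd)])

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means clustering; returns labels and centroids."""
    r = np.random.default_rng(seed)
    centroids = X[r.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centroids) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

labels, _ = kmeans(X, k=4)

# Rank clusters by mean wind speed; the per-sample rank then becomes an
# additional input feature for the time-series model.
mean_ws = np.array([ws[labels == j].mean() for j in range(4)])
rank_of_cluster = np.argsort(np.argsort(mean_ws))
cluster_rank_feature = rank_of_cluster[labels]
print(cluster_rank_feature[:10])
```

The rank feature compresses the cluster identity into an ordinal variable that a small network can exploit, which is one plausible reading of why it improved the reported correlation and RMSE figures.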
Atrial fibrillation detection using an iPhone 4S.
Lee, Jinseok; Reyes, Bersain A; McManus, David D; Maitas, Oscar; Mathias, Oscar; Chon, Ki H
2013-01-01
Atrial fibrillation (AF) affects three to five million Americans and is associated with significant morbidity and mortality. Existing methods to diagnose this paroxysmal arrhythmia are cumbersome and/or expensive. We hypothesized that an iPhone 4S can be used to detect AF based on its ability to record a pulsatile photoplethysmogram signal from a fingertip using the built-in camera lens. To investigate the capability of the iPhone 4S for AF detection, we first used two databases, the MIT-BIH AF and normal sinus rhythm (NSR) databases, to derive discriminatory threshold values between the two rhythms. Both databases include RR time series originating from 250 Hz sampled ECG recordings. We rescaled the RR time series to 30 Hz so that the RR time series resolution is 1/30 s, which is equivalent to the resolution from an iPhone 4S. We investigated three statistical methods: the root mean square of successive differences (RMSSD), the Shannon entropy (ShE), and the sample entropy (SampE), which have proven to be useful tools for AF assessment. Using 64-beat segments from the MIT-BIH databases, we found beat-to-beat accuracy values of 0.9405, 0.9300, and 0.9614 for RMSSD, ShE, and SampE, respectively. Using an iPhone 4S, we collected 2-min pulsatile time series from 25 prospectively recruited subjects with AF pre- and post-electrical cardioversion. Using the threshold values of RMSSD, ShE, and SampE derived from the MIT-BIH databases, we found beat-to-beat accuracies of 0.9844, 0.8494, and 0.9522, respectively. It should be recognized that for clinical applications, the most relevant objective is to detect the presence of AF in the data. Using this criterion, we achieved an accuracy of 100% for both the MIT-BIH AF and iPhone 4S databases.
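The three discriminating statistics named above can be computed from an RR-interval series as sketched below. The two synthetic 64-beat rhythms are hypothetical stand-ins for NSR and AF segments, and the metric definitions follow common usage rather than necessarily matching the paper's exact implementations.

```python
import numpy as np

def rmssd(rr):
    """Root mean square of successive RR-interval differences."""
    d = np.diff(rr)
    return np.sqrt(np.mean(d ** 2))

def shannon_entropy(rr, bins=16):
    """Shannon entropy of the histogram of RR intervals."""
    counts, _ = np.histogram(rr, bins=bins)
    p = counts[counts > 0] / len(rr)
    return -np.sum(p * np.log(p))

def sample_entropy(rr, m=2, r_frac=0.2):
    """Sample entropy: negative log of the conditional probability that
    templates matching for m points also match for m + 1 points."""
    rr = np.asarray(rr, float)
    tol = r_frac * rr.std()
    def matches(length):
        tmpl = np.array([rr[i:i + length] for i in range(len(rr) - length)])
        total = 0
        for i in range(len(tmpl)):
            d = np.max(np.abs(tmpl - tmpl[i]), axis=1)   # Chebyshev distance
            total += np.sum(d <= tol) - 1                # exclude self-match
        return total
    return -np.log(matches(m + 1) / matches(m))

# Hypothetical 64-beat RR segments: a regular rhythm versus an irregular one.
rng = np.random.default_rng(3)
nsr = 0.80 + 0.02 * rng.normal(size=64)
af = 0.80 + 0.15 * rng.normal(size=64)
print(rmssd(nsr), rmssd(af))
```

An AF detector of this kind then reduces to comparing each statistic against a threshold derived from labelled segments, as the abstract describes for the MIT-BIH data.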
Stochastic Models for Precipitable Water in Convection
NASA Astrophysics Data System (ADS)
Leung, Kimberly
Atmospheric precipitable water vapor (PWV) is the amount of water vapor in the atmosphere within a vertical column of unit cross-sectional area and is a critically important parameter of precipitation processes. However, accurate high-frequency and long-term observations of PWV in the sky were impossible until the availability of modern instruments such as radar. The United States Department of Energy (DOE)'s Atmospheric Radiation Measurement (ARM) Program facility has made the first systematic, high-resolution observations of PWV at Darwin, Australia, since 2002. At a resolution of 20 seconds, this time series allowed us to examine the volatility of PWV, including fractal behavior with dimension equal to 1.9, higher than the Brownian motion dimension of 1.5. Such strong fractal behavior calls for stochastic differential equation modeling in an attempt to address some of the difficulties of convective parameterization in various kinds of climate models, ranging from general circulation models (GCM) to Weather Research and Forecasting (WRF) models. These high-resolution observations capture the fractal behavior of PWV and enable stochastic exploration in the next generation of climate models, which consider scales from micrometers to thousands of kilometers. As a first step, this thesis explores a simple stochastic differential equation model of water mass balance for PWV and assesses the accuracy, robustness, and sensitivity of the stochastic model. A 1000-day simulation allows for the determination of the best-fitting 25-day period as compared to data from the TWP-ICE field campaign conducted out of Darwin, Australia in early 2006. The observed data and this portion of the simulation had a correlation coefficient of 0.6513 and followed similar statistics and low-resolution temporal trends.
Building on the point model foundation, a similar algorithm was applied to the National Center for Atmospheric Research (NCAR)'s existing single-column model as a proof of concept for eventual inclusion in a general circulation model. The stochastic scheme was designed to be coupled with the deterministic single-column simulation by modifying results of the existing convective scheme (Zhang-McFarlane) and was able to produce a 20-second resolution time series that effectively simulated observed PWV, as measured by correlation coefficient (0.5510), fractal dimension (1.9), statistics, and visual examination of temporal trends. Results indicate that simulation of a highly volatile time series of observed PWV is certainly achievable and has the potential to improve prediction capabilities in climate modeling. Further, this study demonstrates the feasibility of adding a mathematics- and statistics-based stochastic scheme to an existing deterministic parameterization to simulate observed fractal behavior.
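The kind of analysis described, Euler-Maruyama integration of a stochastic water-balance equation followed by a fractal check via increment scaling, can be sketched as below. The relaxation-toward-balance drift and all parameter values are illustrative assumptions, not the thesis model.

```python
import numpy as np

rng = np.random.default_rng(4)

# Euler-Maruyama integration of a toy water-mass balance for PWV:
#   dW = (source - sink) dt + sigma dB_t,
# with the net source modelled here as relaxation toward a mean state.
dt = 20.0 / 86400.0                 # 20-second step, expressed in days
n = 25 * 4320                       # 25 days at 20-s resolution
mu, tau, sigma = 50.0, 2.0, 8.0     # mean PWV (mm), relaxation (d), noise

pwv = np.empty(n)
pwv[0] = mu
for i in range(1, n):
    drift = (mu - pwv[i - 1]) / tau
    pwv[i] = pwv[i - 1] + drift * dt + sigma * np.sqrt(dt) * rng.normal()

# Crude fractal check: for Brownian-like paths the standard deviation of
# increments grows as lag**H, and the graph dimension is D = 2 - H.
lags = np.array([1, 2, 4, 8, 16, 32])
s = np.array([np.std(pwv[lag:] - pwv[:-lag]) for lag in lags])
H = np.polyfit(np.log(lags), np.log(s), 1)[0]
print(f"estimated H ~ {H:.2f}, graph dimension ~ {2 - H:.2f}")
```

For this simple drift-plus-noise model the estimated dimension stays near the Brownian value of 1.5; reproducing the observed 1.9 is precisely what motivates a richer stochastic scheme.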
NASA Astrophysics Data System (ADS)
Cuen-Romero, F. J.; Valdez-Holguín, J. E.; Buitrón-Sánchez, B. E.; Monreal, R.; Enríquez-Ocaña, L. F.; Aguirre-Hinojosa, E.; Ochoa-Granillo, J. A.; Palafox-Reyes, J. J.
2018-04-01
A biostratigraphic analysis based on trilobites of the main Cambrian outcrops from Sonora, Mexico is performed. The data are based on a combination of field work and published sources, including four previously studied locations from northern and eastern Sonora (Caborca, Cananea, Mazatán, and Arivechi) as well as a new location in the central part of the state of Sonora (San José de Gracia). Chronostratigraphic positions are assigned to the Cambrian outcrops according to Peng et al., 2012 and Cohen et al., 2017. In the Caborca area, the Puerto Blanco, Proveedora, Buelna, Cerro Prieto, Arrojos and El Tren formations comprise a wide range of biozones, extending from the Fritzaspis Zone to the Glossopleura walcotti Zone (Begadean-Lincolnian Series, global Stage 3-Stage 5, Series 2-Series 3). The Bolsa Quartzite and the Abrigo Limestone exposed in the Cananea area are assigned to the interval from the Cedaria/Cedarina dakotaensis Zone to the Crepicephalus Zone (Lincolnian Series-Marjuman Stage, global Series 3-Guzhangian). In the San José de Gracia area, the Proveedora, Buelna, Cerro Prieto and El Gavilán formations range from the ?Bristolia mohavensis or ?Bristolia insolens zones to the upper part of the Mexicella mexicana Zone, Albertella highlandensis Subzone (Series 2-Series 3, Stage 4-Stage 5). In the Arivechi area, the La Sata, El Mogallón, La Huerta and Milpillas formations range from the Poliella denticulata Zone to the Elvinia Zone (Lincolnian-Millardan, Delamaran-Steptoean, global Series 3-Furongian, Stage 5-Paibian). The distribution of Paleozoic marine fauna in northwest Mexico and the southwest United States suggests that during this time an extensive faunal province existed, containing a great variety of marine invertebrates with notable intraspecific affinity. The biotic association includes poriferans, archaeocyathids, brachiopods, mollusks, arthropods and echinoderms as predominant elements.
NASA Astrophysics Data System (ADS)
Tamkevičiūtė, Marija; Edvardsson, Johannes; Pukienė, Rūtilė; Taminskas, Julius; Stoffel, Markus; Corona, Christophe; Kibirkštis, Gintautas
2018-03-01
Continuous water-table (WT) measurements from peatlands are scarce and, where they exist at all, very short. Consequently, proxy indicators are critically needed to simulate hydrological changes in peatlands over longer time periods. In this study, we demonstrate that tree-ring width (TRW) records of Scots pine (Pinus sylvestris L.) growing in the Čepkeliai peatland (southern Lithuania) can be used as a proxy to reconstruct hydrological variability in a raised bog environment. A two-step modelling procedure was applied to extend existing measurements and to develop a new and longer peatland WT time series. To this end, we used instrumental WT measurements extending back to 2002, meteorological records, and a P-PET (difference between precipitation and potential evapotranspiration) series covering the period 1935-2014 to construct a tree-ring based time series of WT fluctuations at the site for the period 1870-2014. The strongest correlations were obtained between average annual WT measured at the bog margin and total P-PET over 7 years (r = 0.923, p < 0.00001), as well as between modelled WT and standardized TRW data with a two-year lag (r = -0.602, p < 0.001) for those periods where WT fluctuated at the level of the pine roots, typically at <50 cm depth below the peat surface. Our results suggest that moisture is a limiting factor for tree growth in peatlands, but that below a certain WT level (<50 cm under the soil surface), drought becomes the limiting factor instead. To validate the WT reconstruction from the Čepkeliai bog, results were compared to Nemunas river runoff since CE 1812 (r = 0.39, p < 0.00001, 1870-2014). We conclude that peatlands can act both as sinks and sources of greenhouse gases when hydrological conditions change, but that hydrological lags and complex feedbacks still hamper our understanding of several processes affecting the hydrology and carbon budget in peatlands.
We therefore call for the development of further proxy records of water-table variability in peatlands to improve our understanding of peatland responses to climatic changes.
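The lagged-correlation screening underlying such a reconstruction can be sketched on synthetic data: generate a water-table series and a tree-ring index that responds with a two-year lag, then recover the lag by maximizing the absolute lagged correlation. All series and coefficients below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical annual series (1935-2014): water-table depth (cm) and a
# standardized tree-ring index responding to WT with a two-year lag.
n_years = 80
wt = 40.0 + rng.normal(0.0, 5.0, n_years)
trw = -0.05 * np.roll(wt, 2) + rng.normal(0.0, 0.1, n_years)

def lagged_corr(x, y, lag):
    """Pearson correlation of x(t - lag) against y(t)."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    return np.corrcoef(x, y)[0, 1]

# Screen lags 0..4 and keep the one with the largest absolute correlation.
best_lag = max(range(5), key=lambda k: abs(lagged_corr(wt, trw, k)))
print("best lag:", best_lag, "r =", round(lagged_corr(wt, trw, best_lag), 2))
```

Once the lag is identified, the regression can be inverted to reconstruct WT from TRW over the pre-instrumental period, which is the essence of the two-step procedure described above.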
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-07
... Series, adjusted option series and any options series until the time to expiration for such series is... time to expiration for such series is less than nine months be treated differently. Specifically, under... series until the time to expiration for such series is less than nine months. Accordingly, the...
Time series inversion of spectra from ground-based radiometers
NASA Astrophysics Data System (ADS)
Christensen, O. M.; Eriksson, P.
2013-07-01
Retrieving time series of atmospheric constituents from ground-based spectrometers often requires different temporal averaging depending on the altitude region in focus. This can lead to several datasets existing for one instrument, which complicates validation and comparisons between instruments. This paper puts forth a possible solution by incorporating the temporal domain into the maximum a posteriori (MAP) retrieval algorithm. The state vector is extended to include measurements spanning a time period, and the temporal correlations between the true atmospheric states are explicitly specified in the a priori uncertainty matrix. This allows the MAP method to effectively select the best temporal smoothing for each altitude, removing the need for several datasets to cover different altitudes. The method is compared to traditional averaging of spectra using a simulated retrieval of water vapour in the mesosphere. The simulations show that the method offers a significant advantage compared to the traditional method, extending the sensitivity an additional 10 km upwards without reducing the temporal resolution at lower altitudes. The method is also tested on the Onsala Space Observatory (OSO) water vapour microwave radiometer, confirming the advantages found in the simulations. Additionally, it is shown how the method can interpolate data in time and provide diagnostic values to evaluate the interpolated data.
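The retrieval idea, extending the state vector across time and encoding temporal correlations in the a priori covariance, can be sketched for a toy linear problem. The forward model, grid sizes, and correlation lengths below are arbitrary assumptions; only the MAP estimator structure follows the paper.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy linear retrieval: the state is a constituent profile at T consecutive
# times, stacked into one vector; y = K x + noise. Grid sizes, the forward
# model, and all correlation lengths are illustrative choices.
T, nz = 6, 10                          # time steps, altitude grid points
K1 = rng.normal(size=(4, nz))          # per-time forward model (hypothetical)
K = np.kron(np.eye(T), K1)             # block-diagonal over time

dz = np.abs(np.subtract.outer(np.arange(nz), np.arange(nz)))
dt = np.abs(np.subtract.outer(np.arange(T), np.arange(T)))
Sa = np.kron(np.exp(-dt / 2.0), np.exp(-dz / 3.0))   # a priori covariance
Se = 0.1 * np.eye(T * 4)               # measurement-noise covariance

x_true = np.linalg.cholesky(Sa) @ rng.normal(size=T * nz)
y = K @ x_true + np.sqrt(0.1) * rng.normal(size=T * 4)

# Standard MAP / optimal-estimation solution (zero a priori mean):
G = Sa @ K.T @ np.linalg.inv(K @ Sa @ K.T + Se)
x_hat = G @ y
rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"relative retrieval error: {rel_err:.2f}")
```

Because Sa couples neighbouring time steps, each retrieved profile borrows information from adjacent measurements, which is how the method effectively selects a different temporal smoothing at each altitude.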
Surgery 101: evaluating the use of podcasting in a general surgery clerkship.
White, J S; Sharma, N; Boora, P
2011-01-01
Provision of learning resources online is rapidly becoming a feature of medical education. This study set out to determine how medical students engaged in a 6-week clerkship in General Surgery would make use of a series of audio podcasts designed to meet their educational objectives. Patterns of use and student learning styles were determined using an anonymous survey. Of the 112 students, 93 responded to the survey (83%); 68% of students reported listening to at least one podcast (average number: six). While students reported listening at a variety of times and places, the majority reported listening on a computer in dedicated study time. Of the listeners, 84% agreed the podcasts helped them learn core topics, and over 80% found the recordings interesting and engaging. This study demonstrates that podcasts are an acceptable learning resource for medical students engaged in a surgery clerkship, and can be integrated into existing study habits. We believe that podcasting can help us cater to busy students with a range of learning styles. We have also shown that a free online resource developed by one school can reach a global audience many times larger than its intended target: to date, the 'Surgery 101' podcast series has been downloaded more than 160,000 times worldwide.
Alternative methods of flexible base compaction acceptance.
DOT National Transportation Integrated Search
2012-05-01
In the Texas Department of Transportation, flexible base construction is governed by a series of stockpile and field tests. A series of concerns with these existing methods, along with some premature failures in the field, led to this project inv...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-22
... International Inc. TPE331-10 and TPE331-11 Series Turboprop Engines AGENCY: Federal Aviation Administration (FAA... airworthiness directive (AD) for Honeywell International Inc. TPE331-10 and TPE331-11 series turboprop engines... likely to exist or develop on other products of this same type design. For that reason, we are proposing...
Sedimentation in small reservoirs on the San Rafael Swell, Utah
King, Norman Julius; Mace, Mervyn M.
1953-01-01
Movement of sediment from upland areas and eventually into main drainages and rivers is by no means through continuous transportation of material from the source to the delta. Instead it consists of a series of intermittent erosional and depositional phases that present a pulsating movement. Hence, sediment carried off upland areas may be deposited in lower reaches or along main drainages if an existing combination of factors tends to effect deposition. During this period actual sediment movement out of the basin may be relatively small. Following any change in existing conditions, however, these unconsolidated alluvial fills may be subjected to rapid removal; thus, for a limited time, abnormally high sediment production rates occur until the deposits are either removed or another cycle of deposition is started.
Chemical carcinogens and inhibitors of carcinogenesis in the human diet
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carr, B.I.
1985-01-01
The induction of cancer by chemicals as presently understood involves a series of steps, some of which require the passage of time. Many substances that are potent carcinogens in experimental animals are known to exist in nature and occur as part of the human diet. In addition, many of the substances that are known to inhibit experimental carcinogenesis also exist in the human diet. Thus, in addition to industrially produced carcinogens, humans can be presumed to have evolved in an environment that contains both carcinogens and anti-carcinogens. There is also a great deal of experimental and human epidemiologic data on the influence of lipids, proteins and carbohydrates on cancer incidence rates; however, much of those data are confusing and conflicting.
Farahmand, Touraj; Fleming, Sean W; Quilty, Edward J
2007-10-01
Urbanization often alters catchment storm responses, with a broad range of potentially significant environmental and engineering consequences. At a practical, site-specific management level, efficient and effective assessment and control of such downstream impacts requires a technical capability to rapidly identify development-induced storm hydrograph changes. The method should also speak specifically to alteration of internal watershed dynamics, require few resources to implement, and provide results that are intuitively accessible to all watershed stakeholders. In this short paper, we propose a potential method which might satisfy these criteria. Our emphasis lies upon the integration of existing concepts to provide tools for pragmatic, relatively low-cost environmental monitoring and management. The procedure involves calibration of rainfall-runoff time-series models in each of several successive time windows, which sample varying degrees of watershed urbanization. As implemented here, only precipitation and stream discharge or stage data are required. The readily generated unit impulse response functions of these time-series models might then provide a mathematically formal, yet visually based and intuitive, representation of changes in watershed storm response. Nominally, the empirical response functions capture such changes as soon as they occur, and the assessments of storm hydrograph alteration are independent of variability in meteorological forcing. We provide a preliminary example of how the technique may be applied using a low-order linear ARX model. The technique may offer a fresh perspective on such watershed management issues, and potentially also several advantages over existing approaches. Substantial further testing is required before attempting to apply the concept as a practical environmental management technique; some possible directions for additional work are suggested.
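The procedure of calibrating a low-order ARX model on rainfall-runoff data and inspecting its unit impulse response can be sketched as follows; the synthetic catchment and its coefficients are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

def arx_fit(u, y, na=2, nb=3):
    """Least-squares fit of a low-order linear ARX model
    y[t] = sum_i a[i] * y[t-1-i] + sum_j b[j] * u[t-1-j] + e[t]."""
    start = max(na, nb)
    rows = [np.concatenate([y[t - na:t][::-1], u[t - nb:t][::-1]])
            for t in range(start, len(y))]
    theta, *_ = np.linalg.lstsq(np.array(rows), y[start:], rcond=None)
    return theta[:na], theta[na:]

def impulse_response(a, b, n=20):
    """Unit impulse response of the fitted ARX model."""
    u = np.zeros(n)
    u[0] = 1.0
    h = np.zeros(n)
    for t in range(n):
        h[t] = sum(a[i] * h[t - 1 - i] for i in range(len(a)) if t - 1 - i >= 0)
        h[t] += sum(b[j] * u[t - 1 - j] for j in range(len(b)) if t - 1 - j >= 0)
    return h

# Synthetic catchment: runoff responds to rainfall with known dynamics.
u = np.maximum(0.0, rng.normal(0.0, 1.0, 1000))     # rainfall forcing
y = np.zeros(1000)
for t in range(1, 1000):
    y[t] = 0.7 * y[t - 1] + 0.5 * u[t - 1] + 0.01 * rng.normal()

a, b = arx_fit(u, y)
h = impulse_response(a, b)
print(np.round(h[:5], 2))
```

Fitting such a model in successive time windows and comparing the resulting impulse responses is the proposed visual diagnostic: a flashier post-development catchment would show a taller, faster-decaying response.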
Sylvan, J B; Pyenson, B C; Rouxel, O; German, C R; Edwards, K J
2012-03-01
We deployed sediment traps adjacent to two active hydrothermal vents at 9°50'N on the East Pacific Rise (EPR) to assess the variability in bacterial community structure associated with plume particles on the timescale of weeks to months, to determine whether an endemic population of plume microbes exists, and to establish ecological relationships between bacterial populations and vent chemistry. Automated rRNA intergenic spacer analysis (ARISA) indicated that there are separate communities at the two different vents and temporal community variations between each vent. Correlation analysis between chemistry and microbiology indicated that shifts in the coarse particulate (>1 mm) Fe/(Fe+Mn+Al), Cu, V, Ca, Al, 232Th, and Ti as well as fine-grained particulate (<1 mm) Fe/(Fe+Mn+Al), Fe, Ca, and Co are reflected in shifts in microbial populations. 16S rRNA clone libraries from each trap at three time points revealed a high percentage of Epsilonproteobacteria clones and hyperthermophilic Aquificae. There is a shift toward the end of the experiment to more Gammaproteobacteria and Alphaproteobacteria, many of whom likely participate in Fe and S cycling. The particle-attached plume environment is genetically distinct from the surrounding seawater. While work to date in hydrothermal environments has focused on determining the microbial communities on hydrothermal chimneys and the basaltic lavas that form the surrounding seafloor, little comparable data exist on the plume environment that physically and chemically connects them. By employing sediment traps for a time-series approach to sampling, we show that bacterial community composition on plume particles changes on timescales much shorter than previously known. © 2012 Blackwell Publishing Ltd.