76 FR 2665 - Caribbean Fishery Management Council; Scoping Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-14
... time series of catch data that is considered to be consistently reliable across all islands as defined... based on what the Council considers to be the longest time series of catch data that is consistently... preferred management reference point time series. Action 3b. Recreational Bag Limits Option 1: No action. Do...
Developing consistent time series Landsat data products
USDA-ARS's Scientific Manuscript database
The Landsat series of satellites has provided a continuous earth observation data record since the early 1970s. There is increasing demand for a consistent time series of Landsat data products. In this presentation, I will summarize the work supported by the USGS Landsat Science Team project from 20...
A 40 Year Time Series of SBUV Observations: the Version 8.6 Processing
NASA Technical Reports Server (NTRS)
McPeters, Richard; Bhartia, P. K.; Flynn, L.
2012-01-01
Under a NASA program to produce long term data records from instruments on multiple satellites (MEaSUREs), data from a series of eight SBUV and SBUV/2 instruments have been reprocessed to create a 40 year long ozone time series. Data from the Nimbus 4 BUV, Nimbus 7 SBUV, and SBUV/2 instruments on NOAA 9, 11, 14, 16, 17, and 18 were used, covering the period 1970 to 1972 and 1979 to the present. In past analyses an ozone time series was created from these instruments by adjusting ozone itself, instrument by instrument, for consistency during overlap periods. In the version 8.6 processing adjustments were made to the radiance calibration of each instrument to maintain a consistent calibration over the entire time series. Data for all eight instruments were then reprocessed using the adjusted radiances. Reprocessing is necessary to produce an accurate latitude dependence. Other improvements incorporated in version 8.6 included the use of the ozone cross sections of Brion, Daumont, and Malicet, and the use of a cloud height climatology derived from Aura OMI measurements. The new cross sections have a more accurate temperature dependence than the cross sections previously used. The OMI-based cloud heights account for the penetration of UV into the upper layers of clouds. The consistency of the version 8.6 time series was evaluated by inter-instrument comparisons during overlap periods, comparisons with ground-based instruments, and comparisons with measurements made by instruments on other satellites such as SAGE II and UARS MLS. These comparisons show that for the instruments on NOAA 16, 17 and 18, the instrument calibrations were remarkably stable and consistent from instrument to instrument. The data record from the Nimbus 7 SBUV was also very stable, and SAGE and ground-based comparisons show that the calibration was consistent with measurements made years later by the NOAA 16 instrument. The calibrations of the SBUV/2 instruments on NOAA 9, 11, and 14 were more of a problem. The rapidly drifting orbits of these satellites resulted in relative time and altitude dependent differences that are significant. Despite these problems, total column ozone appears to be consistent to better than 1% over the entire time series, while the ozone vertical distribution is consistent to approximately 5%.
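As a rough illustration of how overlap periods can be used to tie successive instruments together (the simpler approach described above for past analyses, applied to the retrieved quantity rather than to radiances as in version 8.6), the following Python sketch aligns two hypothetical daily records by their mean difference over the common period; all names and data are illustrative, assuming only NumPy.

```python
import numpy as np

def overlap_offset(series_a, series_b):
    """Mean difference between two instruments over their common (overlap) period.

    series_a, series_b: dicts mapping date -> value (hypothetical daily ozone).
    """
    common = sorted(set(series_a) & set(series_b))
    if not common:
        raise ValueError("no overlap period")
    return float(np.mean([series_a[d] - series_b[d] for d in common]))

def merge_records(series_a, series_b):
    """Align instrument B to instrument A using the overlap-period bias,
    then merge into a single record (A where available, adjusted B elsewhere)."""
    offset = overlap_offset(series_a, series_b)
    merged = dict(series_a)
    for d, v in series_b.items():
        merged.setdefault(d, v + offset)
    return merged
```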
NASA Astrophysics Data System (ADS)
Reimer, Janet J.; Cai, Wei-Jun; Xue, Liang; Vargas, Rodrigo; Noakes, Scott; Hu, Xinping; Signorini, Sergio R.; Mathis, Jeremy T.; Feely, Richard A.; Sutton, Adrienne J.; Sabine, Christopher; Musielewicz, Sylvia; Chen, Baoshan; Wanninkhof, Rik
2017-08-01
Marine carbonate system monitoring programs often consist of multiple observational methods that include underway cruise data, moored autonomous time series, and discrete water bottle samples. Monitored parameters include all, or some of the following: partial pressure of CO2 of the water (pCO2w) and air, dissolved inorganic carbon (DIC), total alkalinity (TA), and pH. Any combination of at least two of the aforementioned parameters can be used to calculate the others. In this study at the Gray's Reef (GR) mooring in the South Atlantic Bight (SAB) we: examine the internal consistency of pCO2w from underway cruise, moored autonomous time series, and calculated from bottle samples (DIC-TA pairing); describe the seasonal to interannual pCO2w time series variability and air-sea flux (FCO2), as well as describe the potential sources of pCO2w variability; and determine the source/sink for atmospheric pCO2. Over the 8.5 years of GR mooring time series, mooring-underway and mooring-bottle calculated-pCO2w strongly correlate with r-values > 0.90. pCO2w and FCO2 time series follow seasonal thermal patterns; however, seasonal non-thermal processes, such as terrestrial export, net biological production, and air-sea exchange also influence variability. The linear slope of time series pCO2w increases by 5.2 ± 1.4 μatm y^-1 with FCO2 increasing 51-70 mmol m^-2 y^-1. The net FCO2 sign can switch interannually with the magnitude varying greatly. Non-thermal pCO2w is also increasing over the time series, likely indicating that terrestrial export and net biological processes drive the long term pCO2w increase.
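A minimal sketch, assuming NumPy and synthetic data, of how a long-term linear slope such as the reported 5.2 μatm y^-1 increase can be estimated from a monthly pCO2w record by ordinary least squares; the series below is fabricated for illustration only.

```python
import numpy as np

# Hypothetical monthly pCO2w record (uatm); time in decimal years.
t = np.arange(2006.0, 2014.5, 1.0 / 12.0)
pco2w = (370 + 5.2 * (t - t[0])                 # long-term increase
         + 40 * np.sin(2 * np.pi * t)           # seasonal cycle
         + np.random.default_rng(0).normal(0, 5, t.size))

# Ordinary least-squares slope, as in "pCO2w increases by x uatm per year".
slope, intercept = np.polyfit(t, pco2w, 1)
print(f"trend: {slope:.2f} uatm/yr")
```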
Code of Federal Regulations, 2014 CFR
2014-07-01
... pressure relief device. This release can be one release or a series of releases over a short time period... reduces the mass of HAP emitted to the air. The equipment may consist of an individual device or a series... do not occur simultaneously in a batch operation. A batch process consists of a series of batch...
Code of Federal Regulations, 2012 CFR
2012-07-01
... can be one release or a series of releases over a short time period due to a malfunction in the... reduces the mass of HAP emitted to the air. The equipment may consist of an individual device or a series... do not occur simultaneously in a batch operation. A batch process consists of a series of batch...
Interactive Digital Signal Processor
NASA Technical Reports Server (NTRS)
Mish, W. H.
1985-01-01
The Interactive Digital Signal Processor (IDSP) consists of a set of time series analysis "operators" based on various algorithms commonly used for digital signal analysis. Processing of digital signal time series to extract information is usually achieved by applying a number of fairly standard operations. IDSP is an excellent teaching tool for demonstrating the application of time series operators to artificially generated signals.
Sumi, Ayako; Kobayashi, Nobumichi
2017-01-01
In this report, we present a short review of applications of time series analysis, which consists of spectral analysis based on the maximum entropy method in the frequency domain and the least squares method in the time domain, to the incidence data of infectious diseases. This report consists of three parts. First, we present our results obtained by collaborative research on infectious disease epidemics with Chinese, Indian, Filipino and North European research organizations. Second, we present the results obtained with the Japanese infectious disease surveillance data and the time series numerically generated from a mathematical model, called the susceptible/exposed/infectious/recovered (SEIR) model. Third, we present an application of the time series analysis to pathologic tissues to examine the usefulness of time series analysis for investigating the spatial pattern of pathologic tissue. It is anticipated that time series analysis will become a useful tool for investigating not only infectious disease surveillance data but also immunological and genetic tests.
The rationale for chemical time-series sampling has its roots in the same fundamental relationships as govern well hydraulics. Samples of ground water are collected as a function of increasing time of pumpage. The most efficient pattern of collection consists of logarithmically s...
Space Object Classification Using Fused Features of Time Series Data
NASA Astrophysics Data System (ADS)
Jia, B.; Pham, K. D.; Blasch, E.; Shen, D.; Wang, Z.; Chen, G.
In this paper, a fused feature vector consisting of raw time series and texture feature information is proposed for space object classification. The time series data includes historical orbit trajectories and asteroid light curves. The texture feature is derived from recurrence plots using Gabor filters for both unsupervised learning and supervised learning algorithms. The simulation results show that the classification algorithms using the fused feature vector achieve better performance than those using raw time series or texture features only.
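The following sketch illustrates the feature-fusion idea in Python: a recurrence plot is built from a series and a few summary statistics of the plot stand in for the Gabor-filter texture responses used in the paper; the function names, the epsilon threshold, and the statistics chosen are illustrative assumptions, not the published pipeline.

```python
import numpy as np

def recurrence_plot(x, eps=None):
    """Binary recurrence plot R[i, j] = 1 if |x[i] - x[j]| < eps."""
    x = np.asarray(x, dtype=float)
    d = np.abs(x[:, None] - x[None, :])
    if eps is None:
        eps = 0.2 * np.std(x)          # illustrative threshold choice
    return (d < eps).astype(float)

def fused_features(x):
    """Concatenate the raw series with simple texture statistics of its
    recurrence plot (stand-ins for Gabor-filter responses)."""
    rp = recurrence_plot(x)
    texture = np.array([rp.mean(),                 # recurrence rate
                        np.diag(rp, k=1).mean(),   # short diagonal structure
                        rp.std()])
    return np.concatenate([np.asarray(x, dtype=float), texture])

# Example: fused feature vector for a hypothetical light curve.
features = fused_features(np.sin(np.linspace(0, 20, 200)))
```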
Warren B. Cohen; Hans-Erik Andersen; Sean P. Healey; Gretchen G. Moisen; Todd A. Schroeder; Christopher W. Woodall; Grant M. Domke; Zhiqiang Yang; Robert E. Kennedy; Stephen V. Stehman; Curtis Woodcock; Jim Vogelmann; Zhe Zhu; Chengquan Huang
2015-01-01
We are developing a system that provides temporally consistent biomass estimates for national greenhouse gas inventory reporting to the United Nations Framework Convention on Climate Change. Our model-assisted estimation framework relies on remote sensing to scale from plot measurements to lidar strip samples, to Landsat time series-based maps. As a demonstration, new...
Rainfall disaggregation for urban hydrology: Effects of spatial consistence
NASA Astrophysics Data System (ADS)
Müller, Hannes; Haberlandt, Uwe
2015-04-01
For urban hydrology, rainfall time series with a high temporal resolution are crucial. Observed time series of this kind are very short in most cases, so they cannot be used. In contrast, time series with a lower temporal resolution (daily measurements) exist for much longer periods. The objective is to derive time series with a long duration and a high resolution by disaggregating the time series of non-recording stations with information from the time series of recording stations. The multiplicative random cascade model is a well-known disaggregation model for daily time series. For urban hydrology it is often assumed that a day consists of only 1280 minutes in total as the starting point for the disaggregation process. We introduce a new variant of the cascade model, which works without this assumption and also outperforms the existing approach regarding time series characteristics such as wet and dry spell duration, average intensity, fraction of dry intervals and extreme value representation. However, in both approaches rainfall time series of different stations are disaggregated without consideration of surrounding stations. This results in unrealistic spatial patterns of rainfall. We apply a simulated annealing algorithm that has been used successfully for hourly values before. Relative diurnal cycles of the disaggregated time series are resampled to reproduce the spatial dependence of rainfall. To describe spatial dependence we use bivariate characteristics like probability of occurrence, continuity ratio and coefficient of correlation. The investigation area is a sewage system in Northern Germany. We show that the algorithm has the capability to improve spatial dependence. The influence of the chosen disaggregation routine and the spatial dependence on overflow occurrences and volumes of the sewage system will be analyzed.
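A toy dyadic multiplicative cascade, assuming NumPy, shows the basic disaggregation mechanics: a daily total is split repeatedly into halves, so a 1280-minute day reaches 5-minute intervals after eight levels (1280 = 2^8 x 5). The weight distribution here is an arbitrary stand-in for the calibrated branching probabilities of the actual model.

```python
import numpy as np

rng = np.random.default_rng(42)

def cascade_disaggregate(total, levels):
    """Dyadic multiplicative random cascade: split a coarse total into 2**levels
    finer intervals. Weights are a toy symmetric choice, not calibrated ones."""
    values = np.array([total], dtype=float)
    for _ in range(levels):
        w = rng.uniform(0.2, 0.8, size=values.size)   # fraction given to the left child
        values = np.column_stack([values * w, values * (1 - w)]).ravel()
    return values

# A 25.6 mm day split into 256 five-minute intervals (1280 min = 2**8 * 5 min).
fine = cascade_disaggregate(25.6, levels=8)
assert np.isclose(fine.sum(), 25.6)   # mass is conserved by construction
```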
Testing the shape of distributions of weather data
NASA Astrophysics Data System (ADS)
Baccon, Ana L. P.; Lunardi, José T.
2016-08-01
The characterization of the statistical distributions of observed weather data is of crucial importance both for the construction and for the validation of weather models, such as weather generators (WGs). An important class of WGs (e.g., the Richardson-type generators) reduce the time series of each variable to a time series of its residual elements, and the residuals are often assumed to be normally distributed. In this work we propose an approach to investigate whether the shape assumed for the distribution of residuals is consistent with the observed data of a given site. Specifically, this procedure tests whether the same distribution shape for the residual noise is maintained over time. The proposed approach is an adaptation to climate time series of a procedure first introduced to test the shapes of distributions of growth rates of business firms aggregated in large panels of short time series. We illustrate the procedure by applying it to the residual time series of maximum temperature at a given location, and investigate the empirical consistency of two assumptions, namely i) the most common assumption that the distribution of the residuals is Gaussian, and ii) that the residual noise has a time-invariant shape which coincides with the empirical distribution of all the residual noise of the whole time series pooled together.
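As a simple stand-in for the shape test described above (not the panel-based procedure of the paper), one can check whether standardized residuals are compatible with a Gaussian using a Kolmogorov-Smirnov test; the sketch below assumes NumPy/SciPy and synthetic residuals.

```python
import numpy as np
from scipy import stats

# Hypothetical daily maximum-temperature residuals (deg C) after removing the seasonal mean.
residuals = np.random.default_rng(0).normal(0.0, 2.5, size=3650)

# Standardize and test against a standard normal distribution.
z = (residuals - residuals.mean()) / residuals.std(ddof=1)
stat, p_value = stats.kstest(z, "norm")
print(f"KS statistic = {stat:.3f}, p = {p_value:.3f}")  # large p: no evidence against normality
```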
Seasonal to multi-decadal trends in apparent optical properties in the Sargasso Sea
NASA Astrophysics Data System (ADS)
Allen, James G.; Nelson, Norman B.; Siegel, David A.
2017-01-01
Multi-decadal, monthly observations of optical and biogeochemical properties, made as part of the Bermuda Bio-Optics Project (BBOP) at the Bermuda Atlantic Time-series Study (BATS) site in the Sargasso Sea, allow for the examination of temporal trends in vertical light attenuation and their potential controls. Trends in the magnitude of the diffuse attenuation coefficient, Kd(λ), and a proxy for its spectral shape reflect changes in phytoplankton and chromophoric dissolved organic matter (CDOM) characteristics. The length and methodological consistency of this time series provide an excellent opportunity to extend analyses of seasonal cycles of apparent optical properties to interannual and decadal time scales. Here, we characterize changes in the magnitude and spectral shape proxy of diffuse attenuation coefficient spectra and compare them to available biological and optical data from the BATS time series program. The time series analyses reveal a 1.01%±0.18% annual increase of the magnitude of the diffuse attenuation coefficient at 443 nm over the upper 75 m of the water column while showing no significant change in selected spectral characteristics over the study period. These and other observations indicate that changes in phytoplankton rather than changes in CDOM abundance are the primary driver for the diffuse attenuation trends on multi-year timescales for this region. Our findings are inconsistent with previous decadal-scale global ocean water clarity and global satellite ocean color analyses yet are consistent with recent analyses of the BATS time series and highlight the value of long-term consistent observation at ocean time series sites.
A theoretically consistent stochastic cascade for temporal disaggregation of intermittent rainfall
NASA Astrophysics Data System (ADS)
Lombardo, F.; Volpi, E.; Koutsoyiannis, D.; Serinaldi, F.
2017-06-01
Generating fine-scale time series of intermittent rainfall that are fully consistent with any given coarse-scale totals is a key and open issue in many hydrological problems. We propose a stationary disaggregation method that simulates rainfall time series with given dependence structure, wet/dry probability, and marginal distribution at a target finer (lower-level) time scale, preserving full consistency with variables at a parent coarser (higher-level) time scale. We account for the intermittent character of rainfall at fine time scales by merging a discrete stochastic representation of intermittency and a continuous one of rainfall depths. This approach yields a unique and parsimonious mathematical framework providing general analytical formulations of mean, variance, and autocorrelation function (ACF) for a mixed-type stochastic process in terms of mean, variance, and ACFs of both continuous and discrete components, respectively. To achieve full consistency between variables at finer and coarser time scales in terms of marginal distribution and coarse-scale totals, the generated lower-level series are adjusted according to a procedure that does not affect the stochastic structure implied by the original model. To assess model performance, we study the rainfall process as intermittent with both independent and dependent occurrences, where dependence is quantified by the probability that two consecutive time intervals are dry. In either case, we provide analytical formulations of the main statistics of our mixed-type disaggregation model and show their clear accordance with Monte Carlo simulations. An application to real-world rainfall time series is shown as a proof of concept.
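The final adjusting step, forcing generated fine-scale values to reproduce the given coarse-scale totals, can be illustrated with a simple proportional rescaling per coarse interval (the paper's procedure is constructed so the adjustment does not distort the model's stochastic structure; this only shows the bookkeeping). Assumes NumPy; names are illustrative.

```python
import numpy as np

def adjust_to_coarse_totals(fine, coarse, block):
    """Rescale each consecutive block of `block` fine-scale values so that it
    reproduces the corresponding coarse-scale total exactly (proportional adjustment)."""
    fine = np.asarray(fine, dtype=float).reshape(-1, block)
    sums = fine.sum(axis=1, keepdims=True)
    factors = np.where(sums > 0, np.asarray(coarse, dtype=float)[:, None] / sums, 0.0)
    return (fine * factors).ravel()

# Example: hourly values generated by a model, forced to match two daily totals.
hourly = np.random.default_rng(1).exponential(0.4, size=48)
daily_totals = np.array([10.0, 6.0])
adjusted = adjust_to_coarse_totals(hourly, daily_totals, block=24)
assert np.allclose(adjusted.reshape(2, 24).sum(axis=1), daily_totals)
```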
NASA Technical Reports Server (NTRS)
Verger, Aleixandre; Baret, F.; Weiss, M.; Kandasamy, S.; Vermote, E.
2013-01-01
Consistent, continuous, and long time series of global biophysical variables derived from satellite data are required for global change research. A novel climatology fitting approach called CACAO (Consistent Adjustment of the Climatology to Actual Observations) is proposed to reduce noise and fill gaps in time series by scaling and shifting the seasonal climatological patterns to the actual observations. The shift and scale CACAO parameters adjusted for each season allow quantifying shifts in the timing of seasonal phenology and inter-annual variations in magnitude as compared to the average climatology. CACAO was assessed first over simulated daily Leaf Area Index (LAI) time series with varying fractions of missing data and noise. Then, performances were analyzed over actual satellite LAI products derived from the AVHRR Long-Term Data Record for the 1981-2000 period over the BELMANIP2 globally representative sample of sites. Comparison with two widely used temporal filtering methods, the asymmetric Gaussian (AG) model and the Savitzky-Golay (SG) filter as implemented in TIMESAT, revealed that CACAO achieved better performance for smoothing AVHRR time series characterized by a high level of noise and frequent missing observations. The resulting smoothed time series captures the vegetation dynamics well and shows no gaps, as compared to the 50-60% of data still missing after AG or SG reconstruction. Results of simulation experiments as well as comparison with actual AVHRR time series indicate that the proposed CACAO method is more robust to noise and missing data than the AG and SG methods for phenology extraction.
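A minimal sketch of the shift-and-scale idea behind CACAO, assuming NumPy: for one season, a brute-force search finds the temporal shift and amplitude scaling of a climatological LAI curve that best matches sparse, noisy observations, yielding a gap-free reconstruction. The search range and least-squares scaling are illustrative choices, not the published algorithm.

```python
import numpy as np

def fit_shift_scale(clim, obs, max_shift=30):
    """Brute-force (shift, scale) fit of a climatological curve `clim` to sparse
    observations `obs` (NaN = missing), in the CACAO spirit."""
    days = np.arange(clim.size)
    valid = ~np.isnan(obs)
    best = (0, 1.0, np.inf)
    for shift in range(-max_shift, max_shift + 1):
        shifted = np.interp(days, days + shift, clim)       # time-shifted climatology
        denom = np.sum(shifted[valid] ** 2)
        scale = np.sum(shifted[valid] * obs[valid]) / denom if denom > 0 else 1.0
        rmse = np.sqrt(np.mean((scale * shifted[valid] - obs[valid]) ** 2))
        if rmse < best[2]:
            best = (shift, scale, rmse)
    shift, scale, _ = best
    # Gap-free reconstruction from the adjusted climatology.
    return shift, scale, scale * np.interp(days, days + shift, clim)
```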
76 FR 51239 - North American Industry Classification System; Revision for 2012
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-17
... definitional and economic changes so that they can create continuous time series and accurately analyze data changes over time. The inclusion of revenues from FGP activities in manufacturing will effectively change...) to exclude production that occurs in a foreign country for historical consistency in time series...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-26
... 500 Index option series in the pilot: (1) A time series analysis of open interest; and (2) an analysis... issue's total market share value, which is the share price times the number of shares outstanding. These... other series. Strike price intervals would be set no less than 5 points apart. Consistent with existing...
Interactive digital signal processor
NASA Technical Reports Server (NTRS)
Mish, W. H.; Wenger, R. M.; Behannon, K. W.; Byrnes, J. B.
1982-01-01
The Interactive Digital Signal Processor (IDSP) is examined. It consists of a set of time series analysis Operators, each of which operates on an input file to produce an output file. The operators can be executed in any order that makes sense and recursively, if desired. The operators are the various algorithms used in digital time series analysis work. User-written operators can be easily interfaced to the system. The system can be operated both interactively and in batch mode. In IDSP a file can consist of up to n (currently n=8) simultaneous time series. IDSP currently includes over thirty standard operators that range from Fourier transform operations, design and application of digital filters, and eigenvalue analysis, to operators that provide graphical output, allow batch operation, editing, and display of information.
A 16-year time series of 1 km AVHRR satellite data of the conterminous United States and Alaska
Eidenshink, Jeff
2006-01-01
The U.S. Geological Survey (USGS) has developed a 16-year time series of vegetation condition information for the conterminous United States and Alaska using 1 km Advanced Very High Resolution Radiometer (AVHRR) data. The AVHRR data have been processed using consistent methods that account for radiometric variability due to calibration uncertainty, the effects of the atmosphere on surface radiometric measurements obtained from wide field-of-view observations, and the geometric registration accuracy. The conterminous United States and Alaska data sets have an atmospheric correction for water vapor, ozone, and Rayleigh scattering and include a cloud mask derived using the Clouds from AVHRR (CLAVR) algorithm. In comparison with other AVHRR time series data sets, the conterminous United States and Alaska data are processed using similar techniques. The primary difference is that the conterminous United States and Alaska data are at 1 km resolution, while others are at 8 km resolution. The time series consists of weekly and biweekly maximum normalized difference vegetation index (NDVI) composites.
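The weekly maximum-value compositing rule mentioned above can be written in a few lines: for each pixel the largest NDVI within the window is kept, which suppresses cloud-contaminated (low-NDVI) observations. A hedged sketch assuming NumPy and a small synthetic stack.

```python
import numpy as np

def max_value_composite(ndvi_stack, period=7):
    """Maximum-value composite: ndvi_stack has shape (days, rows, cols);
    returns one composite per `period`-day window (NaN treated as missing)."""
    days = ndvi_stack.shape[0]
    composites = [np.nanmax(ndvi_stack[start:start + period], axis=0)
                  for start in range(0, days, period)]
    return np.stack(composites)

# Example: 14 daily scenes of a 3x3 window -> 2 weekly composites.
stack = np.random.default_rng(2).uniform(-0.1, 0.9, size=(14, 3, 3))
weekly = max_value_composite(stack)
```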
A Time Series of Mean Global Sea Surface Temperature from the Along-Track Scanning Radiometers
NASA Astrophysics Data System (ADS)
Veal, Karen L.; Corlett, Gary; Remedios, John; Llewellyn-Jones, David
2010-12-01
A climate data set requires a long time series of consistently processed data with suitably long periods of overlap between different instruments, which allows characterization of any inter-instrument biases. The data obtained from ESA's three Along-Track Scanning Radiometers (ATSRs) together comprise an 18 year record of SST with overlap periods of at least 6 months. The data from all three ATSRs have been consistently processed. These factors, together with the stability of the instruments and the precision of the derived SST, make this data set eminently suitable for the construction of a time series of SST that complies with many of the GCOS requirements for a climate data set. A time series of global and regional average SST anomalies has been constructed from the ATSR version 2 data set. An analysis of the overlap periods of successive instruments was used to remove intra-series biases and align the series to a common reference. An ATSR climatology has been developed and has been used to calculate the SST anomalies. The ATSR-1 time series and the AATSR time series have been aligned to ATSR-2. The largest adjustment is ~0.2 K between ATSR-2 and AATSR, which is suspected to be due to a shift of the 12 μm filter function for AATSR. An uncertainty of 0.06 K is assigned to the relative anomaly record that is derived from the dual three-channel night-time data. A relative uncertainty of 0.07 K is assigned to the dual night-time two-channel record, except in the ATSR-1 period (1994-1996) where it is larger.
Complex dynamics in ecological time series
Peter Turchin; Andrew D. Taylor
1992-01-01
Although the possibility of complex dynamical behaviors (limit cycles, quasiperiodic oscillations, and aperiodic chaos) has been recognized theoretically, most ecologists are skeptical of their importance in nature. In this paper we develop a methodology for reconstructing endogenous (or deterministic) dynamics from ecological time series. Our method consists of fitting...
Code of Federal Regulations, 2013 CFR
2013-07-01
... can be one release or a series of releases over a short time period due to a malfunction in the... or a series of devices. Examples include incinerators, carbon adsorption units, condensers, flares... do not occur simultaneously in a batch operation. A batch process consists of a series of batch...
Time-reversibility in seismic sequences: Application to the seismicity of Mexican subduction zone
NASA Astrophysics Data System (ADS)
Telesca, L.; Flores-Márquez, E. L.; Ramírez-Rojas, A.
2018-02-01
In this paper we investigate the time-reversibility of series associated with the seismicity of five seismic areas of the subduction zone beneath the Southwest Pacific Mexican coast, applying the horizontal visibility graph method to the series of earthquake magnitudes, interevent times, interdistances and magnitude increments. We applied the Kullback-Leibler divergence D, which is a metric for quantifying the degree of time-irreversibility in time series. Our findings suggest that among the five seismic areas, Jalisco-Colima is characterized by time-reversibility in all four seismic series. Our results are consistent with the peculiar seismo-tectonic characteristics of Jalisco-Colima, which is the closest to the Middle American Trench and belongs to the Mexican volcanic arc.
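A compact, unoptimized sketch of the degree-based irreversibility measure: build the horizontal visibility graph of a series, compare the forward (out-) and backward (in-) degree distributions with the Kullback-Leibler divergence, and read values near zero as time-reversibility. Assumes NumPy; bins where either distribution is empty are skipped, which is a simplification.

```python
import numpy as np

def hvg_degrees(x):
    """Forward and backward degrees of the directed horizontal visibility graph:
    i and j (i < j) are linked if x[i], x[j] > x[k] for all i < k < j."""
    n = len(x)
    k_out, k_in = np.zeros(n, int), np.zeros(n, int)
    for i in range(n):
        top = -np.inf                       # running max of intermediate values
        for j in range(i + 1, n):
            if x[i] > top and x[j] > top:
                k_out[i] += 1
                k_in[j] += 1
            top = max(top, x[j])
            if top >= x[i]:                 # no later node can see i anymore
                break
    return k_out, k_in

def kl_irreversibility(x):
    """Kullback-Leibler divergence between out- and in-degree distributions."""
    k_out, k_in = hvg_degrees(np.asarray(x, dtype=float))
    kmax = max(k_out.max(), k_in.max())
    p = np.bincount(k_out, minlength=kmax + 1) / len(x)
    q = np.bincount(k_in, minlength=kmax + 1) / len(x)
    mask = (p > 0) & (q > 0)                # skip empty bins (simplification)
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))
```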
A study of stationarity in time series by using wavelet transform
NASA Astrophysics Data System (ADS)
Dghais, Amel Abdoullah Ahmed; Ismail, Mohd Tahir
2014-07-01
In this work the core objective is to apply discrete wavelet transform (DWT) functions, namely the Haar, Daubechies, Symmlet, Coiflet and discrete approximation of the Meyer wavelets, to non-stationary financial time series data from the US stock market (DJIA30). The data consist of 2048 daily closing index values from December 17, 2004 until October 23, 2012. The unit root test shows that the data are non-stationary in level. In order to study the stationarity of a time series, the autocorrelation function (ACF) is used. Results indicate that the Haar function yields the least noisy series compared with the Daubechies, Symmlet, Coiflet and discrete Meyer wavelets. In addition, the DWT decomposition of the original data is less noisy than the DWT decomposition of the return time series.
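A sketch of the workflow, assuming the PyWavelets (pywt) package is available: decompose a synthetic price series with each wavelet family, keep only the smooth approximation, and inspect the autocorrelation function of the result; the random-walk series and the chosen decomposition level are illustrative.

```python
import numpy as np
import pywt

def acf(x, nlags=20):
    """Sample autocorrelation function up to nlags."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    full = np.correlate(x, x, mode="full")[x.size - 1:]
    return full[:nlags + 1] / full[0]

# Hypothetical daily closing index (a random walk stands in for DJIA30 here).
prices = np.cumsum(np.random.default_rng(3).normal(0, 1, 2048)) + 10000

for name in ["haar", "db4", "sym8", "coif3", "dmey"]:
    coeffs = pywt.wavedec(prices, name, level=4)
    # Keep only the level-4 approximation (smooth part); zero out the details.
    coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    smooth = pywt.waverec(coeffs, name)[: prices.size]
    print(name, np.round(acf(smooth, nlags=3), 3))
```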
A novel water quality data analysis framework based on time-series data mining.
Deng, Weihui; Wang, Guoyin
2017-07-01
The rapid development of time-series data mining provides an emerging method for water resource management research. In this paper, based on the time-series data mining methodology, we propose a novel and general analysis framework for water quality time-series data. It consists of two parts: implementation components and common tasks of time-series data mining in water quality data. In the first part, we propose to granulate the time series into several two-dimensional normal clouds and calculate the similarities at the granulated level. On the basis of the similarity matrix, the similarity search, anomaly detection, and pattern discovery tasks in the water quality time-series instance dataset can be easily implemented in the second part. We present a case study of this analysis framework on weekly Dissolved Oxygen (DO) time-series data collected from five monitoring stations on the upper reaches of the Yangtze River, China. The case study revealed the relationship between water quality in the mainstream and the tributary, as well as the main changing patterns of DO. The experimental results show that the proposed analysis framework is a feasible and efficient method to mine hidden and valuable knowledge from historical water quality time-series data. Copyright © 2017 Elsevier Ltd. All rights reserved.
Association mining of dependency between time series
NASA Astrophysics Data System (ADS)
Hafez, Alaaeldin
2001-03-01
Time series analysis is considered a crucial component of strategic control over a broad variety of disciplines in business, science and engineering. Time series data is a sequence of observations collected over intervals of time. Each time series describes a phenomenon as a function of time. Analysis of time series data includes discovering trends (or patterns) in a time series sequence. In the last few years, data mining has emerged and been recognized as a new technology for data analysis. Data mining is the process of discovering potentially valuable patterns, associations, trends, sequences and dependencies in data. Data mining techniques can discover information that many traditional business analysis and statistical techniques fail to deliver. In this paper, we adapt and extend data mining techniques to analyze time series data. By using data mining techniques, maximal frequent patterns are discovered and used in predicting future sequences or trends, where trends describe the behavior of a sequence. In order to include different types of time series (e.g. irregular and non-systematic), we consider past frequent patterns of the same time sequences (local patterns) and of other dependent time sequences (global patterns). We use the word 'dependent' instead of the word 'similar' to emphasize real-life time series where two time series sequences could be completely different (in values, shapes, etc.), but still react to the same conditions in a dependent way. In this paper, we propose the Dependence Mining Technique that can be used in predicting time series sequences. The proposed technique consists of three phases: (a) for all time series sequences, generate their trend sequences; (b) discover maximal frequent trend patterns and generate pattern vectors (to keep information about frequent trend patterns); (c) use trend pattern vectors to predict future time series sequences.
NASA Astrophysics Data System (ADS)
Li, Xiuming; Sun, Mei; Gao, Cuixia; Han, Dun; Wang, Minggang
2018-02-01
This paper presents the parametric modified limited penetrable visibility graph (PMLPVG) algorithm for constructing complex networks from time series. We modify the penetrable visibility criterion of limited penetrable visibility graph (LPVG) in order to improve the rationality of the original penetrable visibility and preserve the dynamic characteristics of the time series. The addition of view angle provides a new approach to characterize the dynamic structure of the time series that is invisible in the previous algorithm. The reliability of the PMLPVG algorithm is verified by applying it to three types of artificial data as well as the actual data of natural gas prices in different regions. The empirical results indicate that PMLPVG algorithm can distinguish the different time series from each other. Meanwhile, the analysis results of natural gas prices data using PMLPVG are consistent with the detrended fluctuation analysis (DFA). The results imply that the PMLPVG algorithm may be a reasonable and significant tool for identifying various time series in different fields.
Kuntzelman, Karl; Jack Rhodes, L; Harrington, Lillian N; Miskovic, Vladimir
2018-06-01
There is a broad family of statistical methods for capturing time series regularity, with increasingly widespread adoption by the neuroscientific community. A common feature of these methods is that they permit investigators to quantify the entropy of brain signals - an index of unpredictability/complexity. Despite the proliferation of algorithms for computing entropy from neural time series data there is scant evidence concerning their relative stability and efficiency. Here we evaluated several different algorithmic implementations (sample, fuzzy, dispersion and permutation) of multiscale entropy in terms of their stability across sessions, internal consistency and computational speed, accuracy and precision using a combination of electroencephalogram (EEG) and synthetic 1/ƒ noise signals. Overall, we report fair to excellent internal consistency and longitudinal stability over a one-week period for the majority of entropy estimates, with several caveats. Computational timing estimates suggest distinct advantages for dispersion and permutation entropy over other entropy estimates. Considered alongside the psychometric evidence, we suggest several ways in which researchers can maximize computational resources (without sacrificing reliability), especially when working with high-density M/EEG data or multivoxel BOLD time series signals. Copyright © 2018 Elsevier Inc. All rights reserved.
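For reference, a straightforward O(n^2) implementation of single-scale sample entropy, the building block behind the multiscale variants compared above; the tolerance convention, parameters, and toy signals are illustrative, assuming NumPy.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r) of a 1-D series; r is given as a fraction
    of the series standard deviation (a common convention)."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)

    def count_matches(length):
        # Use the same number of templates (x.size - m) for both lengths.
        templates = np.array([x[i:i + length] for i in range(x.size - m)])
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        # Count pairs within tolerance, excluding self-matches on the diagonal.
        return np.sum(dist <= tol) - templates.shape[0]

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

# Example: white noise has higher SampEn than a slowly varying sine.
rng = np.random.default_rng(4)
print(sample_entropy(rng.normal(size=500)),
      sample_entropy(np.sin(np.linspace(0, 10, 500))))
```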
Analysis and generation of groundwater concentration time series
NASA Astrophysics Data System (ADS)
Crăciun, Maria; Vamoş, Călin; Suciu, Nicolae
2018-01-01
Concentration time series are provided by simulated concentrations of a nonreactive solute transported in groundwater, integrated over the transverse direction of a two-dimensional computational domain and recorded at the plume center of mass. The analysis of a statistical ensemble of time series reveals subtle features that are not captured by the first two moments which characterize the approximate Gaussian distribution of the two-dimensional concentration fields. The concentration time series exhibit a complex preasymptotic behavior driven by a nonstationary trend and correlated fluctuations with time-variable amplitude. Time series with almost the same statistics are generated by successively adding to a time-dependent trend a sum of linear regression terms, accounting for correlations between fluctuations around the trend and their increments in time, and terms of an amplitude modulated autoregressive noise of order one with time-varying parameter. The algorithm generalizes mixing models used in probability density function approaches. The well-known interaction by exchange with the mean mixing model is a special case consisting of a linear regression with constant coefficients.
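A toy generator in the spirit of the description above, assuming NumPy: a deterministic trend plus an order-one autoregressive noise whose coefficient and amplitude vary slowly in time. The parameter choices are invented for illustration and are not the values estimated from the simulated ensemble.

```python
import numpy as np

def generate_series(n=1000, seed=5):
    """Trend + amplitude-modulated AR(1) noise with a time-varying coefficient
    (illustrative parameters, not those estimated in the study)."""
    rng = np.random.default_rng(seed)
    t = np.arange(n)
    trend = 1.0 * np.exp(-t / 400.0)                      # decaying concentration trend
    phi = 0.9 - 0.3 * t / n                               # slowly varying AR(1) coefficient
    amp = 0.1 * (1.0 + 0.5 * np.sin(2 * np.pi * t / n))   # modulated noise amplitude
    noise = np.zeros(n)
    for k in range(1, n):
        noise[k] = phi[k] * noise[k - 1] + amp[k] * rng.normal()
    return trend + noise

series = generate_series()
```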
Sequential Monte Carlo for inference of latent ARMA time-series with innovations correlated in time
NASA Astrophysics Data System (ADS)
Urteaga, Iñigo; Bugallo, Mónica F.; Djurić, Petar M.
2017-12-01
We consider the problem of sequential inference of latent time-series with innovations correlated in time and observed via nonlinear functions. We accommodate time-varying phenomena with diverse properties by means of a flexible mathematical representation of the data. We characterize statistically such time-series by a Bayesian analysis of their densities. The density that describes the transition of the state from time t to the next time instant t+1 is used for implementation of novel sequential Monte Carlo (SMC) methods. We present a set of SMC methods for inference of latent ARMA time-series with innovations correlated in time for different assumptions in knowledge of parameters. The methods operate in a unified and consistent manner for data with diverse memory properties. We show the validity of the proposed approach by comprehensive simulations of the challenging stochastic volatility model.
ERIC Educational Resources Information Center
Schmitz, Bernhard; Wiese, Bettina S.
2006-01-01
The present study combines a standardized diary approach with time-series analysis methods to investigate the process of self-regulated learning. Based on a process-focused adaptation of Zimmerman's (2000) learning model, an intervention (consisting of four weekly training sessions) to increase self-regulated learning was developed. The diaries…
Land science with Sentinel-2 and Sentinel-3 data series synergy
NASA Astrophysics Data System (ADS)
Moreno, Jose; Guanter, Luis; Alonso, Luis; Gomez, Luis; Amoros, Julia; Camps, Gustavo; Delegido, Jesus
2010-05-01
Although the GMES/Sentinel satellite series were primarily designed to provide observations for operational services and routine applications, there is a growing interest in the scientific community towards the usage of Sentinel data for more advanced and innovative science. Apart from the improved spatial and spectral capabilities, the availability of consistent time series covering a period of over 20 years opens possibilities never explored before, such as systematic data assimilation approaches exploiting the time-series concept, or the incorporation in the modelling approaches of processes covering time scales from weeks to decades. Sentinel-3 will provide continuity to current ENVISAT MERIS/AATSR capabilities. The results already derived from MERIS/AATSR will be more systematically exploited by using OLCI in synergy with SLSTR. Particularly innovative is the case of Sentinel-2, which is specifically designed for land applications. Built on a constellation of two satellites operating simultaneously to provide a 5-day geometric revisit time, the Sentinel-2 system will provide global and systematic acquisitions with high spatial resolution and a high revisit time tailored towards the needs of land monitoring. Apart from providing continuity to the Landsat and SPOT time series, the Sentinel-2 Multi-Spectral Instrument (MSI) incorporates new narrow bands around the red-edge for improved retrievals of biophysical parameters. The limitations imposed by the need for proper cloud screening and atmospheric correction have represented a serious constraint in the past for optical data. The fact that both Sentinel-2 and 3 have dedicated bands to allow such needed corrections for optical data represents an important step towards proper exploitation, guaranteeing consistent time series showing actual variability in land surface conditions without the artefacts introduced by the atmosphere. Expected operational products (such as Land Cover maps, Leaf Area Index, Fractional Vegetation Cover, Fraction of Absorbed Photosynthetically Active Radiation, and Leaf Chlorophyll and Water Contents) will be enhanced with new scientific applications. Higher level products will also be provided, by means of mosaicking, averaging, synthesising or compositing of spatially and temporally resampled data. A key element in the exploitation of the Sentinel series will be the adequate use of data synergy, which will open new possibilities for improved land models. This paper analyses in particular the possibilities offered by mosaicking and compositing information derived from Sentinel-2 observations at high spatial resolution to complement dense time series derived from Sentinel-3 data with more frequent coverage. Interpolation of gaps in high spatial resolution time series (from Sentinel-2 data) using medium/low resolution data from Sentinel-3 (OLCI and SLSTR) is also a way of making the series more temporally consistent at high spatial resolution. The primary goal of such temporal interpolation / spatial mosaicking techniques is to derive consistent surface reflectance data virtually for every date and geographical location, no matter the initial spatial/temporal coverage of the original data used to produce the composite. As a result, biophysical products can be derived in a more consistent way from the spectral information of Sentinel-3 data by making use of a description of surface heterogeneity derived from Sentinel-2 data.
Using data from dedicated experiments (SEN2FLEX, CEFLES2, SEN3EXP), which include a large dataset of satellite and airborne data and of ground-based measurements of atmospheric and vegetation parameters, different techniques are tested, including empirical/statistical approaches that build nonlinear regressions by mapping spectra to a high dimensional space, up to model inversion / data assimilation scenarios. Exploitation of the temporal domain and the spatial multi-scale domain then becomes a driver for the systematic exploitation of GMES/Sentinels data time series. This paper reviews the current status and identifies research priorities in this direction.
Detecting chaos in irregularly sampled time series.
Kulp, C W
2013-09-01
Recently, Wiebe and Virgin [Chaos 22, 013136 (2012)] developed an algorithm which detects chaos by analyzing a time series' power spectrum, which is computed using the Discrete Fourier Transform (DFT). Their algorithm, like other time series characterization algorithms, requires that the time series be regularly sampled. Real-world data, however, are often irregularly sampled, thus making the detection of chaotic behavior difficult or impossible with those methods. In this paper, a characterization algorithm is presented, which effectively detects chaos in irregularly sampled time series. The work presented here is a modification of Wiebe and Virgin's algorithm and uses the Lomb-Scargle Periodogram (LSP) to compute a series' power spectrum instead of the DFT. The DFT is not appropriate for irregularly sampled time series. However, the LSP is capable of computing the frequency content of irregularly sampled data. Furthermore, a new method of analyzing the power spectrum is developed, which can be useful for differentiating between chaotic and non-chaotic behavior. The new characterization algorithm is successfully applied to irregularly sampled data generated by a model as well as data consisting of observations of variable stars.
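A minimal example of the periodogram step for irregular sampling, using scipy.signal.lombscargle on a synthetic two-tone signal; the chaos-detection analysis of the spectrum described in the paper is not reproduced here.

```python
import numpy as np
from scipy.signal import lombscargle

# Irregular sampling of a two-tone signal (stand-in for variable-star observations).
rng = np.random.default_rng(6)
t = np.sort(rng.uniform(0, 100, 400))
x = (np.sin(2 * np.pi * 0.11 * t)
     + 0.5 * np.sin(2 * np.pi * 0.27 * t)
     + 0.1 * rng.normal(size=t.size))

freqs = np.linspace(0.01, 0.5, 2000)                 # cycles per time unit
pgram = lombscargle(t, x - x.mean(), 2 * np.pi * freqs)  # expects angular frequencies
print("dominant frequency:", freqs[np.argmax(pgram)])
```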
Comparison between four dissimilar solar panel configurations
NASA Astrophysics Data System (ADS)
Suleiman, K.; Ali, U. A.; Yusuf, Ibrahim; Koko, A. D.; Bala, S. I.
2017-12-01
Several studies on photovoltaic systems have focused on how they operate and the energy required to operate them. Little attention has been paid to their configurations, modeling of mean time to system failure, availability, cost benefit, and comparisons of parallel and series-parallel designs. In this research work, four system configurations were studied. Configuration I consists of two sub-components arranged in parallel with 24 V each, configuration II consists of four sub-components arranged logically in parallel with 12 V each, configuration III consists of four sub-components arranged in series-parallel with 8 V each, and configuration IV has six sub-components with 6 V each arranged in series-parallel. Comparative analysis was made using the Chapman-Kolmogorov method. Explicit expressions for the mean time to system failure, steady-state availability and cost-benefit analysis were derived and used in the comparison. A ranking method was used to determine the optimal configuration of the systems. The results of analytical and numerical solutions of system availability and mean time to system failure were determined, and it was found that configuration I is the optimal configuration.
The Version 8.6 SBUV Ozone Data Record: An Overview
NASA Technical Reports Server (NTRS)
McPeters, Richard D.; Bhartia, P. K.; Haffner, D.; Labow, Gordon J.; Flynn, Larry
2013-01-01
Under a NASA program to produce long-term data records from instruments on multiple satellites, data from a series of nine Solar Backscatter Ultraviolet (SBUV and SBUV/2) instruments have been re-processed to create a coherent ozone time series. Data from the BUV instrument on Nimbus 4, SBUV on Nimbus 7, and SBUV/2 instruments on NOAA 9, 11, 14, 16, 17, 18, and 19 covering the period 1970-1972 and 1979-2011 were used to create a long-term data set. The goal is an ozone Earth Science Data Record - a consistent, calibrated ozone time series that can be used for trend analyses and other studies. In order to create this ozone data set, the radiances were adjusted and used to re-process the entire data records for each of the nine instruments. Inter-instrument comparisons during periods of overlap as well as comparisons with data from other satellite and ground-based instruments were used to evaluate the consistency of the record and make calibration adjustments as needed. Additional improvements in this version 8.6 processing included the use of the Brion, Daumont, and Malicet ozone cross sections, and a cloud-height climatology derived from Aura OMI measurements. Validation of the re-processed ozone shows that total column ozone is consistent with the Brewer and Dobson network to within about 1% for the new time series. Comparisons with MLS, SAGE, sondes, and lidar show that ozone at individual levels in the stratosphere is generally consistent to within 5 percent.
NASA Technical Reports Server (NTRS)
Jackson, T. J.; Shiue, J.; O'Neill, P.; Wang, J.; Fuchs, J.; Owe, M.
1984-01-01
The verification of a multi-sensor aircraft system developed to study soil moisture applications is discussed. This system consisted of a three-beam push broom L-band microwave radiometer, a thermal infrared scanner, a multispectral scanner, video and photographic cameras, and an onboard navigational instrument. Ten flights were made over agricultural sites in Maryland and Delaware with little or no vegetation cover. Comparisons of aircraft and ground measurements showed that the system was reliable and consistent. Time series analysis of microwave and evaporation data showed a strong similarity that indicates a potential direction for future research.
Quality and Consistency of the NASA Ocean Color Data Record
NASA Technical Reports Server (NTRS)
Franz, Bryan A.
2012-01-01
The NASA Ocean Biology Processing Group (OBPG) recently reprocessed the multimission ocean color time-series from SeaWiFS, MODIS-Aqua, and MODIS-Terra using common algorithms and improved instrument calibration knowledge. Here we present an analysis of the quality and consistency of the resulting ocean color retrievals, including spectral water-leaving reflectance, chlorophyll a concentration, and diffuse attenuation. Statistical analysis of satellite retrievals relative to in situ measurements will be presented for each sensor, as well as an assessment of consistency in the global time-series for the overlapping periods of the missions. Results will show that the satellite retrievals are in good agreement with in situ measurements, and that the sensor ocean color data records are highly consistent over the common mission lifespan for the global deep oceans, but with degraded agreement in higher productivity, higher complexity coastal regions.
Retrieving hydrological connectivity from empirical causality in karst systems
NASA Astrophysics Data System (ADS)
Delforge, Damien; Vanclooster, Marnik; Van Camp, Michel; Poulain, Amaël; Watlet, Arnaud; Hallet, Vincent; Kaufmann, Olivier; Francis, Olivier
2017-04-01
Because of their complexity, karst systems exhibit nonlinear dynamics. Moreover, if one attempts to model a karst, the hidden behavior complicates the choice of the most suitable model. Therefore, both intense investigation methods and nonlinear data analysis are needed to reveal the underlying hydrological connectivity as a prior for a consistent physically based modelling approach. Convergent Cross Mapping (CCM), a recent method, promises to identify causal relationships between time series belonging to the same dynamical systems. The method is based on phase space reconstruction and is suitable for nonlinear dynamics. As an empirical causation detection method, it could be used to highlight the hidden complexity of a karst system by revealing its inner hydrological and dynamical connectivity. Hence, if one can link causal relationships to physical processes, the method should show great potential to support physically based model structure selection. We present the results of numerical experiments using karst model blocks combined in different structures to generate time series from actual rainfall series. CCM is applied between the time series to investigate if the empirical causation detection is consistent with the hydrological connectivity suggested by the karst model.
Modeling Geodetic Processes with Levy α-Stable Distribution and FARIMA
NASA Astrophysics Data System (ADS)
Montillet, Jean-Philippe; Yu, Kegen
2015-04-01
Over the last few years the scientific community has been using the auto-regressive moving average (ARMA) model in modeling the noise in global positioning system (GPS) time series (daily solutions). This work starts with an investigation of the limitations of the ARMA model, which is widely used in signal processing when the measurement noise is white. Since a typical GPS time series consists of geophysical signals (e.g., seasonal signals) and stochastic processes (e.g., coloured and white noise), the ARMA model may be inappropriate. Therefore, the application of the fractional auto-regressive integrated moving average (FARIMA) model is investigated. The simulation results using simulated time series as well as real GPS time series from a few selected stations around Australia show that the FARIMA model fits the time series better than other models when the coloured noise is larger than the white noise. The second part of this work focuses on fitting the GPS time series with the family of Levy α-stable distributions. Using this distribution, a hypothesis test is developed to effectively eliminate coarse outliers from GPS time series, achieving better performance than using the rule of thumb of n standard deviations (with n chosen empirically).
A Synthesis of VIIRS Solar and Lunar Calibrations
NASA Technical Reports Server (NTRS)
Eplee, Robert E.; Turpie, Kevin R.; Meister, Gerhard; Patt, Frederick S.; Fireman, Gwyn F.; Franz, Bryan A.; McClain, Charles R.
2013-01-01
The NASA VIIRS Ocean Science Team (VOST) has developed two independent calibrations of the SNPP VIIRS moderate resolution reflective solar bands using solar diffuser and lunar observations through June 2013. Fits to the solar calibration time series show mean residuals per band of 0.078-0.10%. There are apparent residual lunar libration correlations in the lunar calibration time series that are not accounted for by the ROLO photometric model of the Moon. Fits to the lunar time series that account for residual librations show mean residuals per band of 0.071-0.17%. Comparison of the solar and lunar time series shows that the relative differences in the two calibrations are 0.12-0.31%. Relative uncertainties in the VIIRS solar and lunar calibration time series are comparable to those achieved for SeaWiFS, Aqua MODIS, and Terra MODIS. Intercomparison of the VIIRS lunar time series with those from SeaWiFS, Aqua MODIS, and Terra MODIS shows that the scatter in the VIIRS lunar observations is consistent with that observed for the heritage instruments. Based on these analyses, the VOST has derived a calibration lookup table for VIIRS ocean color data based on fits to the solar calibration time series.
NASA Astrophysics Data System (ADS)
Cardille, J. A.; Lee, J.
2017-12-01
With the opening of the Landsat archive, there is a dramatically increased potential for creating high-quality time series of land use/land-cover (LULC) classifications derived from remote sensing. Although LULC time series are appealing, their creation is typically challenging in two fundamental ways. First, there is a need to create maximally correct LULC maps for consideration at each time step; and second, there is a need to have the elements of the time series be consistent with each other, without pixels that flip improbably between covers due only to unavoidable, stray classification errors. We have developed the Bayesian Updating of Land Cover - Unsupervised (BULC-U) algorithm to address these challenges simultaneously, and introduce and apply it here for two related but distinct purposes. First, with minimal human intervention, we produced an internally consistent, high-accuracy LULC time series in rapidly changing Mato Grosso, Brazil for a time interval (1986-2000) in which cropland area more than doubled. The spatial and temporal resolution of the 59 LULC snapshots allows users to witness the establishment of towns and farms at the expense of forest. The new time series could be used by policy-makers and analysts to unravel important considerations for conservation and management, including the timing and location of past development, the rate and nature of changes in forest connectivity, the connection with road infrastructure, and more. The second application of BULC-U is to sharpen the well-known GlobCover 2009 classification from 300m to 30m, while improving accuracy measures for every class. The greatly improved resolution and accuracy permits a better representation of the true LULC proportions, the use of this map in models, and quantification of the potential impacts of changes. Given that there may easily be thousands and potentially millions of images available to harvest for an LULC time series, it is imperative to build useful algorithms requiring minimal human intervention. Through image segmentation and classification, BULC-U allows us to use both the spectral and spatial characteristics of imagery to sharpen classifications and create time series. It is hoped that this study may allow us and other users of this new method to consider time series across ever larger areas.
Time Series Data Visualization in World Wide Telescope
NASA Astrophysics Data System (ADS)
Fay, J.
WorldWide Telescope provides a rich set of time series visualizations for both archival and real-time data. WWT consists of interactive desktop tools for immersive visualization and HTML5 web-based controls that can be utilized in customized web pages. WWT supports a range of display options including full dome, power walls, stereo and virtual reality headsets.
NASA Technical Reports Server (NTRS)
Gao, Feng; DeColstoun, Eric Brown; Ma, Ronghua; Weng, Qihao; Masek, Jeffrey G.; Chen, Jin; Pan, Yaozhong; Song, Conghe
2012-01-01
Cities have been expanding rapidly worldwide, especially over the past few decades. Mapping the dynamic expansion of impervious surface in both space and time is essential for an improved understanding of the urbanization process, land-cover and land-use change, and their impacts on the environment. Landsat and other medium-resolution satellites provide the necessary spatial details and temporal frequency for mapping impervious surface expansion over the past four decades. Since the US Geological Survey opened the historical record of the Landsat image archive for free access in 2008, the decades-old bottleneck of data limitation has gone. Remote-sensing scientists are now rich with data, and the challenge is how to make best use of this precious resource. In this article, we develop an efficient algorithm to map the continuous expansion of impervious surface using a time series of four decades of medium-resolution satellite images. The algorithm is based on a supervised classification of the time-series image stack using a decision tree. Each impervious class represents urbanization starting in a different image. The algorithm also allows us to remove inconsistent training samples, because impervious expansion is not reversible during the study period. The objective is to extract a time series of complete and consistent impervious surface maps from a corresponding time series of images collected from multiple sensors, and with a minimal amount of image preprocessing effort. The approach was tested in the lower Yangtze River Delta region, one of the fastest urban growth areas in China. Results from nearly four decades of medium-resolution satellite data from the Landsat Multispectral Scanner (MSS), Thematic Mapper (TM), Enhanced Thematic Mapper plus (ETM+) and China-Brazil Earth Resources Satellite (CBERS) show a steady urbanization process that is consistent with economic development plans and policies. The time-series impervious spatial extent maps derived from this study agree well with an existing urban extent polygon data set that was previously developed independently. The overall mapping accuracy was estimated at about 92.5% with 3% commission error and 12% omission error for the impervious type from all images regardless of image quality and initial spatial resolution.
NASA Astrophysics Data System (ADS)
Beirle, Steffen; Lampel, Johannes; Wang, Yang; Mies, Kornelia; Dörner, Steffen; Grossi, Margherita; Loyola, Diego; Dehn, Angelika; Danielczok, Anja; Schröder, Marc; Wagner, Thomas
2018-03-01
We present time series of the global distribution of water vapor columns over more than 2 decades based on measurements from the satellite instruments GOME, SCIAMACHY, and GOME-2 in the red spectral range. A particular focus is the consistency amongst the different sensors to avoid jumps from one instrument to another. This is reached by applying robust and simple retrieval settings consistently. Potentially systematic effects due to differences in ground pixel size are avoided by merging SCIAMACHY and GOME-2 observations to GOME spatial resolution, which also allows for a consistent treatment of cloud effects. In addition, the GOME-2 swath is reduced to that of GOME and SCIAMACHY to have consistent viewing geometries. Remaining systematic differences between the different sensors are investigated during overlap periods and are corrected for in the homogenized time series. The resulting climate product v2.2 (https://doi.org/10.1594/WDCC/GOME-EVL_water_vapor_clim_v2.2) allows the study of the temporal evolution of water vapor over the last 20 years on a global scale.
Allan deviation analysis of financial return series
NASA Astrophysics Data System (ADS)
Hernández-Pérez, R.
2012-05-01
We perform a scaling analysis for the return series of different financial assets applying the Allan deviation (ADEV), which is used in time and frequency metrology to characterize quantitatively the stability of frequency standards, since it has been shown to be a robust quantity for analyzing fluctuations of non-stationary time series over different observation intervals. The data used are daily opening-price series for assets from different markets over a time span of around ten years. We found that the ADEV results for the return series at short scales resemble those expected for an uncorrelated series, consistent with the efficient market hypothesis. On the other hand, the ADEV results for the absolute return series at short scales (the first one or two decades) decrease following an approximate scaling relation up to a point that differs for almost each asset, after which the ADEV deviates from scaling, which suggests that the presence of clustering, long-range dependence and non-stationarity signatures in the series drives the results for large observation intervals.
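A direct implementation of the non-overlapping Allan deviation at a set of averaging factors, the quantity whose scaling with observation interval is examined above; for uncorrelated returns the values should fall off roughly as the square root of the averaging factor. Assumes NumPy; the return series is synthetic.

```python
import numpy as np

def allan_deviation(y, m):
    """Non-overlapping Allan deviation at averaging factor m:
    sigma^2(m) = 0.5 * mean squared difference of consecutive m-sample block means."""
    y = np.asarray(y, dtype=float)
    n_blocks = y.size // m
    means = y[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
    return np.sqrt(0.5 * np.mean(np.diff(means) ** 2))

# Daily returns of a hypothetical asset; ADEV computed for several observation intervals.
returns = np.random.default_rng(7).normal(0, 0.01, size=2500)
for m in (1, 2, 4, 8, 16, 32):
    print(m, allan_deviation(returns, m))
```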
NASA Astrophysics Data System (ADS)
Eduardo Virgilio Silva, Luiz; Otavio Murta, Luiz
2012-12-01
Complexity in time series is an intriguing feature of living dynamical systems, with potential use for identification of system state. Although various methods have been proposed for measuring physiologic complexity, uncorrelated time series are often assigned high values of complexity, erroneously classifying them as complex physiological signals. Here, we propose and discuss a method for complex system analysis based on generalized statistical formalism and surrogate time series. Sample entropy (SampEn) was rewritten, inspired by the Tsallis generalized entropy, as a function of the parameter q (qSampEn). qSDiff curves were calculated, which consist of the differences between the qSampEn of the original and surrogate series. We evaluated qSDiff for 125 real heart rate variability (HRV) recordings, divided into groups of 70 healthy, 44 congestive heart failure (CHF), and 11 atrial fibrillation (AF) subjects, and for simulated series of stochastic and chaotic processes. The evaluations showed that, for nonperiodic signals, qSDiff curves have a maximum point (qSDiffmax) at q ≠ 1. The values of q at which the maximum point occurs and at which qSDiff is zero were also evaluated. Only qSDiffmax values were capable of distinguishing the HRV groups (p-values of 5.10×10⁻³, 1.11×10⁻⁷, and 5.50×10⁻⁷ for healthy vs. CHF, healthy vs. AF, and CHF vs. AF, respectively), consistent with the concept of physiologic complexity, and this suggests a potential use for chaotic system analysis.
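For orientation, the classical (q = 1) sample entropy that the paper generalizes can be sketched as follows; the Tsallis q-deformed variant (qSampEn) is not reproduced here, and the parameter choices m = 2 and r = 0.2·SD are conventional defaults rather than necessarily those used by the authors.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy (Richman & Moorman form); r is a fraction of the series SD."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    tol = r * np.std(x)

    def count_matches(dim):
        # N - m templates for both template lengths, excluding self-matches
        templates = np.array([x[i:i + dim] for i in range(n - m)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance from template i to all later templates
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= tol)
        return count

    b = count_matches(m)      # matches of length m
    a = count_matches(m + 1)  # matches of length m + 1
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

# toy usage on a synthetic RR-like series (hypothetical data)
rng = np.random.default_rng(0)
rr = 0.8 + 0.05 * np.sin(np.arange(1000) / 20) + 0.02 * rng.normal(size=1000)
print(sample_entropy(rr, m=2, r=0.2))
```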
Testing for nonlinearity in non-stationary physiological time series.
Guarín, Diego; Delgado, Edilson; Orozco, Álvaro
2011-01-01
Testing for nonlinearity is one of the most important preprocessing steps in nonlinear time series analysis. Typically, this is done by means of the linear surrogate data methods, but it is a known fact that the validity of the results heavily depends on the stationarity of the time series. Since most physiological signals are non-stationary, it is easy to falsely detect nonlinearity using the linear surrogate data methods. In this document, we propose a methodology that extends the procedure for generating constrained surrogate time series in order to assess nonlinearity in non-stationary data. The method is based on band-phase-randomized surrogates, which consist (contrary to the linear surrogate data methods) in randomizing only a portion of the Fourier phases, namely those in the high-frequency band. Analysis of simulated time series showed that, in comparison to the linear surrogate data method, our method is able to discriminate between linear stationary, linear non-stationary and nonlinear time series. Applying our methodology to heart rate variability (HRV) records of five healthy patients, we found that nonlinear correlations are present in these non-stationary physiological signals.
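A minimal sketch of the band-phase-randomization idea: only Fourier phases above a chosen cutoff frequency are replaced with random values, so the slow, non-stationary component of the signal is preserved while high-frequency correlations are destroyed. The cutoff value below is an arbitrary assumption; the paper's criterion for selecting the band is not reproduced.

```python
import numpy as np

def band_phase_randomized_surrogate(x, f_cut=0.1, rng=None):
    """Surrogate that randomizes Fourier phases only above a normalized cutoff
    frequency f_cut (cycles/sample), keeping the power spectrum and the
    low-frequency (trend-like) phase structure intact."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x, dtype=float)
    n = len(x)
    spec = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(n)            # normalized frequencies in [0, 0.5]
    amps = np.abs(spec)
    phases = np.angle(spec)
    high = freqs > f_cut                  # only these phases are randomized
    phases[high] = rng.uniform(0.0, 2.0 * np.pi, high.sum())
    return np.fft.irfft(amps * np.exp(1j * phases), n=n)
```

A discriminating statistic computed for the original series is then compared against its distribution over an ensemble of such surrogates, as in the standard surrogate data test.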
How fast do stock prices adjust to market efficiency? Evidence from a detrended fluctuation analysis
NASA Astrophysics Data System (ADS)
Reboredo, Juan C.; Rivera-Castro, Miguel A.; Miranda, José G. V.; García-Rubio, Raquel
2013-04-01
In this paper we analyse price fluctuations with the aim of measuring how long the market takes to adjust prices to weak-form efficiency, i.e., how long it takes for prices to adjust to a fractional Brownian motion with a Hurst exponent of 0.5. The Hurst exponent is estimated for different time horizons using detrended fluctuation analysis, a method suitable for non-stationary series with trends, in order to identify the time scale at which the Hurst exponent is consistent with the efficient market hypothesis. Using high-frequency share price, exchange rate and stock data, we show that price dynamics exhibited important deviations from efficiency for time periods of up to 15 min; thereafter, price dynamics were consistent with a geometric Brownian motion. The intraday behaviour of the series also indicated that price dynamics at trade opening and close were hardly consistent with efficiency, which would enable investors to exploit price deviations from fundamental values. This result is consistent with intraday volume, volatility and transaction time duration patterns.
NASA Astrophysics Data System (ADS)
Hermosilla, Txomin; Wulder, Michael A.; White, Joanne C.; Coops, Nicholas C.; Hobart, Geordie W.
2017-12-01
The use of time series satellite data allows for the temporally dense, systematic, transparent, and synoptic capture of land dynamics over time. Subsequent to the opening of the Landsat archive, several time series approaches for characterizing landscape change have been developed, often representing a particular analytical time window. The information richness and widespread utility of these time series data have created a need to maintain the currency of time series information via the addition of new data as it becomes available. When an existing time series is temporally extended, it is critical that previously generated change information remains consistent, thereby not altering reported change statistics or science outcomes based on that change information. In this research, we investigate the impacts and implications of adding additional years to an existing 29-year annual Landsat time series for forest change. To do so, we undertook a spatially explicit comparison of the 29 overlapping years of a time series representing 1984-2012 with a time series representing 1984-2016. Surface reflectance values and the presence, year, and type of change were compared. We found that the addition of years to extend the time series had minimal effect on the annual surface reflectance composites, with slight band-specific differences (r ≥ 0.1) in the final years of the original time series being updated. The area of stand-replacing disturbances and the determination of change year are virtually unchanged for the overlapping period between the two time-series products. Over the overlapping temporal period (1984-2012), the total area of change differs by 0.53%, equating to an annual difference in change area of 0.019%. Overall, the spatial and temporal agreement of the changes detected by both time series was 96%. Further, our findings suggest that the entire pre-existing historic time series does not need to be re-processed during the update process. Critically, given the time series change detection and update approach followed here, science outcomes or reports representing one temporal epoch can be considered stable and will not be altered when a time series is updated with newly available data.
Schaefer, Alexander; Brach, Jennifer S.; Perera, Subashan; Sejdić, Ervin
2013-01-01
Background: The time evolution and complex interactions of many nonlinear systems, such as the human body, result in fractal types of parameter outcomes that exhibit self-similarity over long time scales, following a power law in the frequency spectrum S(f) = 1/f^β. The scaling exponent β is thus often interpreted as a "biomarker" of relative health and decline. New Method: This paper presents a thorough comparative numerical analysis of fractal characterization techniques with specific consideration given to experimentally measured gait stride interval time series. The ideal fractal signals generated in the numerical analysis are constrained under varying lengths and biases indicative of a range of physiologically conceivable fractal signals. This analysis complements previous investigations of fractal characteristics in healthy and pathological gait stride interval time series, with which this study is compared. Results: Our analysis showed that the averaged wavelet coefficient method consistently yielded the most accurate results. Comparison with Existing Methods: Class-dependent methods proved to be unsuitable for physiological time series, and detrended fluctuation analysis, the most prevalent method in the literature, exhibited large estimation variances. Conclusions: The comparative numerical analysis and experimental applications provide a thorough basis for determining an appropriate and robust method for measuring and comparing a physiologically meaningful biomarker, the spectral index β. In consideration of the constraints of application, we note the significant drawbacks of detrended fluctuation analysis and conclude that the averaged wavelet coefficient method can provide reasonable consistency and accuracy for characterizing these fractal time series. PMID:24200509
Schaefer, Alexander; Brach, Jennifer S; Perera, Subashan; Sejdić, Ervin
2014-01-30
The time evolution and complex interactions of many nonlinear systems, such as the human body, result in fractal types of parameter outcomes that exhibit self-similarity over long time scales, following a power law in the frequency spectrum S(f) = 1/f^β. The scaling exponent β is thus often interpreted as a "biomarker" of relative health and decline. This paper presents a thorough comparative numerical analysis of fractal characterization techniques with specific consideration given to experimentally measured gait stride interval time series. The ideal fractal signals generated in the numerical analysis are constrained under varying lengths and biases indicative of a range of physiologically conceivable fractal signals. This analysis complements previous investigations of fractal characteristics in healthy and pathological gait stride interval time series, with which this study is compared. The results of our analysis showed that the averaged wavelet coefficient method consistently yielded the most accurate results. Class-dependent methods proved to be unsuitable for physiological time series, and detrended fluctuation analysis, the most prevalent method in the literature, exhibited large estimation variances. The comparative numerical analysis and experimental applications provide a thorough basis for determining an appropriate and robust method for measuring and comparing a physiologically meaningful biomarker, the spectral index β. In consideration of the constraints of application, we note the significant drawbacks of detrended fluctuation analysis and conclude that the averaged wavelet coefficient method can provide reasonable consistency and accuracy for characterizing these fractal time series. Copyright © 2013 Elsevier B.V. All rights reserved.
Overhead tray for cable test system
NASA Technical Reports Server (NTRS)
Saltz, K. T.
1976-01-01
System consists of overhead slotted tray, series of compatible adapter cables, and automatic test set which consists of control console and cable-switching console. System reduces hookup time and also reduces cost of fabricating and storing test cables.
NASA Astrophysics Data System (ADS)
Hamid, Nor Zila Abd; Adenan, Nur Hamiza; Noorani, Mohd Salmi Md
2017-08-01
Forecasting and analyzing the ozone (O3) concentration time series is important because the pollutant is harmful to health. This study is a pilot study for forecasting and analyzing the O3 time series in a Malaysian educational area, namely Shah Alam, using a chaotic approach. In this approach, the observed hourly scalar time series is reconstructed into a multi-dimensional phase space, which is then used to forecast future values through the local linear approximation method. The main purpose is to forecast high O3 concentrations. The original method performed poorly, but the improved method addressed this weakness, enabling the high concentrations to be successfully forecast. The correlation coefficient between the observed and forecast time series obtained with the improved method is 0.9159, and both the mean absolute error and the root mean squared error are low; thus, the improved method is advantageous. The time series analysis by means of the phase space plot and the Cao method identified the presence of low-dimensional chaotic dynamics in the observed O3 time series. Results showed that at least seven factors affect the studied O3 time series, which is consistent with the factors identified by the diurnal variation investigation and the sensitivity analysis in past studies. In conclusion, the chaotic approach has successfully forecast and analyzed the O3 time series in the educational area of Shah Alam. These findings are expected to help stakeholders such as the Ministry of Education and the Department of Environment achieve better air pollution management.
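A bare-bones sketch of the chaotic forecasting approach described above: the scalar series is embedded with time delays and the next value is predicted from a local linear map fitted to the nearest neighbours of the current state in phase space. The embedding dimension (seven, echoing the abstract), delay and neighbour count are illustrative choices, and none of the paper's improvements for high concentrations are included.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Time-delay embedding of a scalar series into `dim` dimensions with lag `tau`."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def local_linear_forecast(x, dim=7, tau=1, k=20):
    """One-step forecast from the last embedded state using a local linear map
    fitted to its k nearest neighbours (a sketch of the local approximation idea)."""
    x = np.asarray(x, dtype=float)
    emb = delay_embed(x, dim, tau)
    query = emb[-1]
    candidates = emb[:-1]                    # states whose successor is known
    targets = x[(dim - 1) * tau + 1:]        # value following each candidate state
    dists = np.linalg.norm(candidates - query, axis=1)
    idx = np.argsort(dists)[:k]
    # least-squares linear map from neighbour states to their successors
    A = np.column_stack([candidates[idx], np.ones(k)])
    coef, *_ = np.linalg.lstsq(A, targets[idx], rcond=None)
    return float(np.append(query, 1.0) @ coef)
```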
Detecting a periodic signal in the terrestrial cratering record
NASA Technical Reports Server (NTRS)
Grieve, Richard A. F.; Rupert, James D.; Goodacre, Alan K.; Sharpton, Virgil L.
1988-01-01
A time-series analysis of model periodic data, where the period and phase are known, has been performed in order to investigate whether a significant period can be detected consistently from a mix of random and periodic impacts. Special attention is given to the effect of age uncertainties and random ages in the detection of a periodic signal. An equivalent analysis is performed with observed data on crater ages and compared with the model data, and the effects of the temporal distribution of crater ages on the results from the time-series analysis are studied. Evidence for a consistent 30-m.y. period is found to be weak.
NASA Astrophysics Data System (ADS)
Cannata, Massimiliano; Neumann, Jakob; Cardoso, Mirko; Rossetto, Rudy; Foglia, Laura; Borsi, Iacopo
2017-04-01
In situ time series are an important aspect of environmental modelling, especially with the advancement of numerical simulation techniques and increased model complexity. In order to make use of the increasing amount of data available through the requirements of the EU Water Framework Directive, the FREEWAT GIS environment incorporates the newly developed Observation Analysis Tool (OAT) for time-series analysis. The tool is used to import time-series data into QGIS from local CSV files, online sensors using the istSOS service, or MODFLOW model result files, and it enables visualisation, pre-processing of data for model development, and post-processing of model results. OAT can be used as a pre-processor for calibration observations, integrating the creation of observations for calibration directly from sensor time series. The tool consists of an expandable Python library of processing methods and an interface integrated into the QGIS FREEWAT plug-in, which includes a large number of modelling capabilities, data management tools and calibration capacity.
Moran, John L; Solomon, Patricia J
2011-02-01
Time series analysis has seen limited application in the biomedical literature. The utility of conventional and advanced time series estimators was explored for intensive care unit (ICU) outcome series. Monthly mean time series, 1993-2006, for hospital mortality, severity-of-illness score (APACHE III), ventilation fraction and patient type (medical and surgical) were generated from the Australia and New Zealand Intensive Care Society adult patient database. Analyses encompassed geographical seasonal mortality patterns, structural time changes in the series, mortality series volatility using autoregressive moving average and Generalized Autoregressive Conditional Heteroscedasticity models, in which predicted variances are updated adaptively, and bivariate and multivariate (vector error correction models) cointegrating relationships between series. The mortality series exhibited marked seasonality, a declining mortality trend and substantial autocorrelation beyond 24 lags. Mortality increased in winter months (July-August); the medical series featured annual cycling, whereas the surgical series demonstrated long and short (3-4 month) cycling. Structural breaks in the series were apparent in January 1995 and December 2002. The covariance-stationary first-differenced mortality series was consistent with a seasonal autoregressive moving average process; the observed conditional-variance volatility (1993-1995) and residual Autoregressive Conditional Heteroscedasticity effects entailed a Generalized Autoregressive Conditional Heteroscedasticity model, which was preferred on the basis of information criteria and mean model forecast performance. Bivariate cointegration, indicating long-term equilibrium relationships, was established between mortality and severity-of-illness scores at the database level and for categories of ICUs. Multivariate cointegration was demonstrated for {log APACHE III score, log ICU length of stay, ICU mortality and ventilation fraction}. A system approach to understanding series time-dependence may be established using conventional and advanced econometric time series estimators. © 2010 Blackwell Publishing Ltd.
The long-term changes in total ozone, as derived from Dobson measurements at Arosa (1948-2001)
NASA Astrophysics Data System (ADS)
Krzyscin, J. W.
2003-04-01
The longest possible total ozone time series (Arosa, Switzerland) is examined for the detection of trends. A two-step procedure is proposed to estimate the long-term (decadal) variations in the ozone time series. The first step consists of a standard least-squares multiple regression applied to the total ozone monthly means to parameterize the "natural" (related to oscillations in atmospheric dynamics) variations in the analyzed time series. The standard proxies for the dynamical ozone variations are used, including the 11-year solar activity cycle and indices of the QBO, ENSO and NAO. We use the detrended time series of temperature at 100 hPa and 500 hPa over Arosa to parameterize short-term variations (with periods < 1 year) in total ozone related to local changes in the meteorological conditions over the station. The second step consists of a smooth-curve fit to the total ozone residuals (the original minus the modeled "natural" time series), time differentiation of this curve to obtain local trends, and bootstrapping of the residual time series to estimate the standard error of the local trends. Locally weighted regression and wavelet analysis are used to extract the smooth component from the residual time series. The time integral over the local trend values provides the cumulative long-term change since the beginning of the data. Examining the pattern of the cumulative change, we see periods of total ozone loss (the end of the 1950s up to the early 1960s, probably the effect of the nuclear bomb tests), recovery (the mid-1960s up to the beginning of the 1970s), apparent decrease (the beginning of the 1970s lasting to the mid-1990s, probably the effect of contamination of the atmosphere by anthropogenic substances containing chlorine), and a period of stabilization or recovery (starting in the mid-1990s, probably the effect of the Montreal Protocol eliminating substances that deplete the ozone layer). We also estimate that a full ozone recovery (a return to the undisturbed total ozone level of the beginning of the 1970s) can be expected around 2050. We propose to calculate both the time series of local trends and the cumulative long-term change instead of a single trend value derived as the slope of a straight-line fit to the data.
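The second step of the procedure can be sketched roughly as below, with statsmodels' lowess standing in for the paper's combination of locally weighted regression and wavelet smoothing: the regression residuals are smoothed, local trends are taken as the time derivative of the smooth curve, and a simple residual bootstrap gives standard errors. All parameter values here are illustrative assumptions.

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

def local_trends(residuals, months, frac=0.3, n_boot=500, rng=None):
    """Local trends (per unit time) of a residual ozone series and their
    bootstrap standard errors; a sketch of the smoothing/derivative/bootstrap step."""
    rng = np.random.default_rng(rng)
    t = np.asarray(months, dtype=float)
    r = np.asarray(residuals, dtype=float)
    smooth = lowess(r, t, frac=frac, return_sorted=False)   # smooth component
    trend = np.gradient(smooth, t)                          # local trend = derivative
    noise = r - smooth
    boot = np.empty((n_boot, len(t)))
    for b in range(n_boot):
        # resample the residual noise, re-smooth and re-differentiate
        r_b = smooth + rng.choice(noise, size=len(t), replace=True)
        boot[b] = np.gradient(lowess(r_b, t, frac=frac, return_sorted=False), t)
    return trend, boot.std(axis=0)
```

Integrating the returned local trends over time reproduces the cumulative long-term change discussed in the abstract.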
Model Performance Evaluation and Scenario Analysis ...
This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit measures that capture magnitude-only, sequence-only, and combined magnitude-and-sequence errors. The performance measures include error analysis, the coefficient of determination, Nash-Sutcliffe efficiency, and a new weighted rank method. These performance metrics only provide useful information about overall model performance. Note that MPESA is based on the separation of observed and simulated time series into magnitude and sequence components. The separation of time series into magnitude and sequence components and the reconstruction back into time series provides diagnostic insights to modelers. For example, traditional approaches lack the capability to identify whether the source of uncertainty in the simulated data is the quality of the input data or the way the analyst adjusted the model parameters. This report presents a suite of model diagnostics that identify whether mismatches between observed and simulated data result from magnitude- or sequence-related errors. MPESA offers graphical and statistical options that allow HSPF users to compare observed and simulated time series and identify the parameter values to adjust or the input data to modify. The scenario analysis part of the tool
Evenly spaced Detrended Fluctuation Analysis: Selecting the number of points for the diffusion plot
NASA Astrophysics Data System (ADS)
Liddy, Joshua J.; Haddad, Jeffrey M.
2018-02-01
Detrended Fluctuation Analysis (DFA) has become a widely used tool for examining the correlation structure of a time series and has provided insights into neuromuscular health and disease states. As the popularity of DFA in the human behavioral sciences has grown, understanding its limitations and how to properly determine its parameters is becoming increasingly important. DFA examines the correlation structure of variability in a time series by computing α, the slope of the log SD versus log n diffusion plot. When using the traditional DFA algorithm, the timescales, n, are often selected as a set of integers between a minimum and maximum length based on the number of data points in the time series. This produces values of n that are non-uniformly distributed on a logarithmic scale, which influences the estimation of α due to a disproportionate weighting of the long-timescale regions of the diffusion plot. Recently, the evenly spaced DFA and evenly spaced average DFA algorithms were introduced. Both algorithms compute α by selecting k points for the diffusion plot based on the minimum and maximum timescales of interest, and they improve the consistency of α estimates for simulated fractional Gaussian noise and fractional Brownian motion time series. Two issues remain unaddressed: (1) how to select k, and (2) whether the evenly spaced DFA algorithms show similar benefits when assessing human behavioral data. We manipulated k and examined its effects on the accuracy, consistency, and confidence limits of α in simulated and experimental time series. We demonstrate that the accuracy and consistency of α are relatively unaffected by the selection of k. However, the confidence limits of α narrow as k increases, dramatically reducing measurement uncertainty for single trials. We provide guidelines for selecting k and discuss potential uses of the evenly spaced DFA algorithms when assessing human behavioral data.
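The sketch below illustrates the evenly spaced idea under simple assumptions: k box sizes are placed evenly in log(n) between the minimum and maximum timescales before the usual DFA-1 fluctuation computation. The defaults (k = 18, first-order detrending) are arbitrary, and the averaging variant of the algorithm is not reproduced.

```python
import numpy as np

def dfa(x, n_min=4, n_max=None, k=18, order=1):
    """DFA with k box sizes spaced evenly in log(n); returns the scaling exponent
    alpha plus the points of the diffusion plot (a sketch of the evenly spaced variant)."""
    x = np.asarray(x, dtype=float)
    if n_max is None:
        n_max = len(x) // 4
    # k log-evenly spaced, unique integer box sizes between n_min and n_max
    sizes = np.unique(np.geomspace(n_min, n_max, k).astype(int))
    profile = np.cumsum(x - x.mean())          # integrated (profile) series
    flucts = []
    for n in sizes:
        n_boxes = len(profile) // n
        segs = profile[:n_boxes * n].reshape(n_boxes, n)
        t = np.arange(n)
        msq = []
        for seg in segs:
            coef = np.polyfit(t, seg, order)   # local polynomial trend
            msq.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(msq)))
    alpha = np.polyfit(np.log(sizes), np.log(flucts), 1)[0]
    return alpha, sizes, np.array(flucts)
```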
Series transistors isolate amplifier from flyback voltage
NASA Technical Reports Server (NTRS)
Banks, W.
1967-01-01
Circuit enables high sawtooth currents to be passed through a deflection coil and isolate the coil driving amplifier from the flyback voltage. It incorporates a switch consisting of transistors in series with the driving amplifier and deflection coil. The switch disconnects the deflection coil from the amplifier during the retrace time.
Testing for nonlinearity in time series: The method of surrogate data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Theiler, J.; Galdrikian, B.; Longtin, A.
1991-01-01
We describe a statistical approach for identifying nonlinearity in time series; in particular, we want to avoid claims of chaos when simpler models (such as linearly correlated noise) can explain the data. The method requires a careful statement of the null hypothesis which characterizes a candidate linear process, the generation of an ensemble of "surrogate" data sets which are similar to the original time series but consistent with the null hypothesis, and the computation of a discriminating statistic for the original and for each of the surrogate data sets. The idea is to test the original time series against the null hypothesis by checking whether the discriminating statistic computed for the original time series differs significantly from the statistics computed for each of the surrogate sets. We present algorithms for generating surrogate data under various null hypotheses, and we show the results of numerical experiments on artificial data using correlation dimension, Lyapunov exponent, and forecasting error as discriminating statistics. Finally, we consider a number of experimental time series -- including sunspots, electroencephalogram (EEG) signals, and fluid convection -- and evaluate the statistical significance of the evidence for nonlinear structure in each case. 56 refs., 8 figs.
Kong, Deguo; MacLeod, Matthew; Hung, Hayley; Cousins, Ian T
2014-11-04
During recent decades concentrations of persistent organic pollutants (POPs) in the atmosphere have been monitored at multiple stations worldwide. We used three statistical methods to analyze a total of 748 time series of selected POPs in the atmosphere to determine if there are statistically significant reductions in levels of POPs that have had control actions enacted to restrict or eliminate manufacture, use and emissions. Significant decreasing trends were identified in 560 (75%) of the 748 time series collected from the Arctic, North America, and Europe, indicating that the atmospheric concentrations of these POPs are generally decreasing, consistent with the overall effectiveness of emission control actions. Statistically significant trends in synthetic time series could be reliably identified with the improved Mann-Kendall (iMK) test and the digital filtration (DF) technique in time series longer than 5 years. The temporal trends of new (or emerging) POPs in the atmosphere are often unclear because time series are too short. A statistical detrending method based on the iMK test was not able to identify abrupt changes in the rates of decline of atmospheric POP concentrations encoded into synthetic time series.
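For context on the trend testing mentioned above, a plain classical Mann-Kendall test is sketched below. The improved (iMK) variant used in the paper additionally corrects for serial correlation in the series, which is omitted here, and ties are ignored for brevity.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Classical Mann-Kendall trend test: returns the standardized statistic z
    and a two-sided p-value (no correction for autocorrelation or ties)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S statistic: sum of signs of all pairwise forward differences
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2.0 * (1.0 - norm.cdf(abs(z)))
    return z, p

# toy usage on a synthetic declining concentration series (hypothetical data)
rng = np.random.default_rng(0)
series = 10.0 * np.exp(-0.05 * np.arange(20)) + rng.normal(0, 0.5, 20)
print(mann_kendall(series))
```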
NASA Technical Reports Server (NTRS)
Mlynczak, Martin G.; Martin-Torres, F. Javier; Mertens, Christopher J.; Marshall, B. Thomas; Thompson, R. Earl; Kozyra, Janet U.; Remsberg, Ellis E.; Gordley, Larry L.; Russell, James M.; Woods, Thomas
2008-01-01
We examine time series of the daily global power (W) radiated by carbon dioxide (at 15 microns) and by nitric oxide (at 5.3 microns) from the Earth's thermosphere between 100 km and 200 km altitude. Also examined is a time series of the daily absorbed solar ultraviolet power in the same altitude region in the wavelength span 0 to 175 nm. The infrared data are derived from the SABER instrument and the solar data are derived from the SEE instrument, both on the NASA TIMED satellite. The time series cover nearly 5 years from 2002 through 2006. The infrared and solar time series exhibit a decrease in radiated and absorbed power consistent with the declining phase of the current 11-year solar cycle. The infrared time series also exhibits high frequency variations that are not evident in the solar power time series. Spectral analysis shows a statistically significant 9-day periodicity in the infrared data but not in the solar data. A very strong 9-day periodicity is also found to exist in the time series of the daily Ap and Kp geomagnetic indices. These 9-day periodicities are linked to the recurrence of coronal holes on the Sun. These results demonstrate a direct coupling between the upper atmosphere of the Sun and the infrared energy budget of the thermosphere.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-26
... provides a consistent time series according to which groundfish resources may be managed more efficiently...: Business or other for-profit organizations. Estimated Number of Respondents: 166. Estimated Time per...
NASA Technical Reports Server (NTRS)
Scargle, Jeffrey D.
1989-01-01
This paper develops techniques to evaluate the discrete Fourier transform (DFT), the autocorrelation function (ACF), and the cross-correlation function (CCF) of time series which are not evenly sampled. The series may consist of quantized point data (e.g., yes/no processes such as photon arrival). The DFT, which can be inverted to recover the original data and the sampling, is used to compute correlation functions by means of a procedure which is effectively, but not explicitly, an interpolation. The CCF can be computed for two time series not even sampled at the same set of times. Techniques for removing the distortion of the correlation functions caused by the sampling, determining the value of a constant component to the data, and treating unequally weighted data are also discussed. FORTRAN code for the Fourier transform algorithm and numerical examples of the techniques are given.
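As a small numerical illustration of transforming unevenly sampled data, the sketch below evaluates a Fourier transform by direct summation at arbitrary sample times. This is only the naive sum; the normalization, inversion, and correlation-function machinery developed in the paper is not reproduced.

```python
import numpy as np

def dft_uneven(t, x, freqs):
    """Direct evaluation of the Fourier transform of samples x taken at arbitrary
    times t, for the requested frequencies (a plain sum over the samples)."""
    t = np.asarray(t, dtype=float)
    x = np.asarray(x, dtype=float)
    return np.array([np.sum(x * np.exp(-2j * np.pi * f * t)) for f in freqs])

# usage on an irregularly sampled sinusoid (synthetic data)
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 100, 300))
x = np.sin(2 * np.pi * 0.05 * t) + 0.3 * rng.normal(size=t.size)
freqs = np.linspace(0.01, 0.2, 200)
power = np.abs(dft_uneven(t, x, freqs)) ** 2
print(freqs[np.argmax(power)])   # the spectrum should peak near 0.05
```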
48 CFR 814.201 - Preparation of invitations for bids.
Code of Federal Regulations, 2010 CFR
2010-10-01
... numbered at the time of issue. Numbers assigned locally must consist of the facility or VA National... Year 2007. A series beginning with the number 1 must be started each fiscal year. Numbers assigned from... is numbered locally must be numbered in the series of the year in which it is issued, will be...
48 CFR 814.201 - Preparation of invitations for bids.
Code of Federal Regulations, 2012 CFR
2012-10-01
... numbered at the time of issue. Numbers assigned locally must consist of the facility or VA National... Year 2007. A series beginning with the number 1 must be started each fiscal year. Numbers assigned from... is numbered locally must be numbered in the series of the year in which it is issued, will be...
48 CFR 814.201 - Preparation of invitations for bids.
Code of Federal Regulations, 2014 CFR
2014-10-01
... numbered at the time of issue. Numbers assigned locally must consist of the facility or VA National... Year 2007. A series beginning with the number 1 must be started each fiscal year. Numbers assigned from... is numbered locally must be numbered in the series of the year in which it is issued, will be...
Fontana, Fabio; Rixen, Christian; Jonas, Tobias; Aberegg, Gabriel; Wunderle, Stefan
2008-01-01
This study evaluates the ability to track grassland growth phenology in the Swiss Alps with NOAA-16 Advanced Very High Resolution Radiometer (AVHRR) Normalized Difference Vegetation Index (NDVI) time series. Three growth parameters from 15 alpine and subalpine grassland sites were investigated between 2001 and 2005: Melt-Out (MO), Start Of Growth (SOG), and End Of Growth (EOG). We tried to estimate these phenological dates from yearly NDVI time series by identifying the dates at which certain fractions (thresholds) of the maximum annual NDVI amplitude were crossed for the first time. For this purpose, the NDVI time series were smoothed using two commonly used approaches (Fourier adjustment or, alternatively, Savitzky-Golay filtering). Moreover, the AVHRR NDVI time series were compared against data from the newer-generation sensors SPOT VEGETATION and TERRA MODIS. All remote-sensing NDVI time series were highly correlated with single-point ground measurements and therefore accurately represented the growth dynamics of alpine grassland. The newer-generation sensors VGT and MODIS performed better than AVHRR; however, the differences were minor. Thresholds for the determination of MO, SOG, and EOG were similar across sensors and smoothing methods, which demonstrated the robustness of the results. For our purpose, the Fourier adjustment algorithm created better NDVI time series than the Savitzky-Golay filter, since the latter appeared to be more sensitive to noisy NDVI time series. The findings show that the application of various thresholds to NDVI time series allows the observation of the temporal progression of vegetation growth at the selected sites with high consistency. Hence, we believe that our study helps to better understand large-scale vegetation growth dynamics above the tree line in the European Alps. PMID:27879852
Fontana, Fabio; Rixen, Christian; Jonas, Tobias; Aberegg, Gabriel; Wunderle, Stefan
2008-04-23
This study evaluates the ability to track grassland growth phenology in the Swiss Alps with NOAA-16 Advanced Very High Resolution Radiometer (AVHRR) Normalized Difference Vegetation Index (NDVI) time series. Three growth parameters from 15 alpine and subalpine grassland sites were investigated between 2001 and 2005: Melt-Out (MO), Start Of Growth (SOG), and End Of Growth (EOG). We tried to estimate these phenological dates from yearly NDVI time series by identifying the dates at which certain fractions (thresholds) of the maximum annual NDVI amplitude were crossed for the first time. For this purpose, the NDVI time series were smoothed using two commonly used approaches (Fourier adjustment or, alternatively, Savitzky-Golay filtering). Moreover, the AVHRR NDVI time series were compared against data from the newer-generation sensors SPOT VEGETATION and TERRA MODIS. All remote-sensing NDVI time series were highly correlated with single-point ground measurements and therefore accurately represented the growth dynamics of alpine grassland. The newer-generation sensors VGT and MODIS performed better than AVHRR; however, the differences were minor. Thresholds for the determination of MO, SOG, and EOG were similar across sensors and smoothing methods, which demonstrated the robustness of the results. For our purpose, the Fourier adjustment algorithm created better NDVI time series than the Savitzky-Golay filter, since the latter appeared to be more sensitive to noisy NDVI time series. The findings show that the application of various thresholds to NDVI time series allows the observation of the temporal progression of vegetation growth at the selected sites with high consistency. Hence, we believe that our study helps to better understand large-scale vegetation growth dynamics above the tree line in the European Alps.
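The threshold approach described in the two entries above amounts to finding the first date at which a smoothed NDVI curve crosses a given fraction of its annual amplitude (separate fractions for MO, SOG, and EOG). A minimal sketch under that assumption follows; the linear interpolation between composites is an illustrative choice, not necessarily the authors' exact procedure.

```python
import numpy as np

def threshold_crossing_day(doy, ndvi, fraction):
    """First day of year at which a smoothed annual NDVI curve first reaches
    `fraction` of its annual amplitude above the annual minimum."""
    doy = np.asarray(doy, dtype=float)
    ndvi = np.asarray(ndvi, dtype=float)
    target = ndvi.min() + fraction * (ndvi.max() - ndvi.min())
    above = np.nonzero(ndvi >= target)[0]
    if len(above) == 0:
        return None
    i = above[0]
    if i == 0:
        return doy[0]
    # linear interpolation between the composites before and after the crossing
    f = (target - ndvi[i - 1]) / (ndvi[i] - ndvi[i - 1])
    return doy[i - 1] + f * (doy[i] - doy[i - 1])
```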
A comparison of the stochastic and machine learning approaches in hydrologic time series forecasting
NASA Astrophysics Data System (ADS)
Kim, T.; Joo, K.; Seo, J.; Heo, J. H.
2016-12-01
Hydrologic time series forecasting is an essential task in water resources management, and it becomes more difficult due to the complexity of the runoff process. Traditional stochastic models such as the ARIMA family have been used as a standard approach for time series modeling and forecasting of hydrological variables. Because of the nonlinearity in hydrologic time series data, machine learning approaches have been studied for their advantage of discovering relevant features in nonlinear relations among variables. This study aims to compare the predictive performance of a traditional stochastic model and a machine learning approach. A seasonal ARIMA model was used as the traditional time series model, and a Random Forest model, an ensemble of decision trees using a multiple-predictor approach, was applied as the machine learning approach. In the application, monthly inflow data from 1986 to 2015 for Chungju dam in South Korea were used for modeling and forecasting. In order to evaluate the performance of the models, one-step-ahead and multi-step-ahead forecasting were applied, and the root mean squared error and mean absolute error of the two models were compared.
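A compact sketch of such a comparison under stated assumptions: a seasonal ARIMA model (statsmodels) against a Random Forest trained on lagged values (scikit-learn), evaluated with RMSE and MAE on a hold-out year. The synthetic monthly series, model orders, and lag count are placeholders, not the Chungju dam data or the paper's configuration.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, mean_absolute_error

# synthetic stand-in for a monthly inflow series with annual seasonality
rng = np.random.default_rng(0)
y = 50 + 30 * np.sin(2 * np.pi * np.arange(360) / 12) + rng.normal(0, 5, 360)
train, test = y[:-12], y[-12:]

# stochastic model: seasonal ARIMA (orders are illustrative)
sarima = SARIMAX(train, order=(1, 0, 1), seasonal_order=(1, 1, 1, 12)).fit(disp=False)
sarima_fc = sarima.forecast(steps=12)

# machine learning model: Random Forest on the previous 12 months as predictors
lags = 12
X = np.array([train[i - lags:i] for i in range(lags, len(train))])
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, train[lags:])
history = list(train)
rf_fc = []
for _ in range(12):                      # recursive multi-step-ahead forecasting
    pred = rf.predict(np.array(history[-lags:]).reshape(1, -1))[0]
    rf_fc.append(pred)
    history.append(pred)

for name, fc in [("SARIMA", np.asarray(sarima_fc)), ("RF", np.array(rf_fc))]:
    print(name,
          "RMSE", np.sqrt(mean_squared_error(test, fc)),
          "MAE", mean_absolute_error(test, fc))
```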
Brenčič, Mihael
2016-01-01
Northern hemisphere elementary circulation mechanisms, defined with the Dzerdzeevski classification and published on a daily basis from 1899–2012, are analysed with statistical methods as continuous categorical time series. The classification consists of 41 elementary circulation mechanisms (ECMs), which are assigned to calendar days. Empirical marginal probabilities of each ECM were determined. Seasonality and periodicity effects were investigated with moving dispersion filters and a randomisation procedure on the ECM categories, as well as with time analyses of the ECM mode. The time series were determined to be non-stationary with strong time-dependent trends. During the investigated period, periodicity interchanges with periods when no seasonality is present. In the time series structure, the strongest division is visible at the milestone of 1986, showing that the atmospheric circulation pattern reflected in the ECMs has changed significantly. This change is a result of a change in the frequency of ECM categories; before 1986 the ECMs that appeared were more diverse, whereas afterwards fewer ECMs appear. The statistical approach applied to categorical climatic time series opens up new potential insights for climate variability and change studies to be performed in the future. PMID:27116375
[Introduction and some problems of the rapid time series laboratory reporting system].
Kanao, M; Yamashita, K; Kuwajima, M
1999-09-01
We introduced an on-line system for biochemical, hematological, serological, urinary, bacteriological, and emergency examinations and the associated office work, using a client-server system (NEC PC-LACS), based on centralized outpatient blood collection, centralized outpatient reception, and outpatient examination by reservation. Using this on-line system, results for 71 items in chemical, serological, hematological, and urinary examinations are rapidly reported within 1 hour. Since the ordering system at our hospital has not yet been completed, we constructed a rapid time-series reporting system in which time-series data obtained on 5 serial occasions are printed on 2 sheets of A4 paper at the time of the final report. In each consultation room of the medical outpatient clinic, at the neuromedical outpatient clinic, and at the kidney center, where examinations are frequently performed, terminal equipment and a printer for inquiries were installed for real-time output of time-series reports. Results are reported by FAX to the other outpatient clinics and wards, and subsequently time-series reports are output at the clinical laboratory department. This system allowed rapid examination, especially preconsultation examination. It was also useful for reducing office work and for effective utilization of examination data.
Brenčič, Mihael
2016-01-01
Northern hemisphere elementary circulation mechanisms, defined with the Dzerdzeevski classification and published on a daily basis from 1899-2012, are analysed with statistical methods as continuous categorical time series. The classification consists of 41 elementary circulation mechanisms (ECMs), which are assigned to calendar days. Empirical marginal probabilities of each ECM were determined. Seasonality and periodicity effects were investigated with moving dispersion filters and a randomisation procedure on the ECM categories, as well as with time analyses of the ECM mode. The time series were determined to be non-stationary with strong time-dependent trends. During the investigated period, periodicity interchanges with periods when no seasonality is present. In the time series structure, the strongest division is visible at the milestone of 1986, showing that the atmospheric circulation pattern reflected in the ECMs has changed significantly. This change is a result of a change in the frequency of ECM categories; before 1986 the ECMs that appeared were more diverse, whereas afterwards fewer ECMs appear. The statistical approach applied to categorical climatic time series opens up new potential insights for climate variability and change studies to be performed in the future.
Chen, Chi-Kan
2017-07-26
The identification of genetic regulatory networks (GRNs) provides insights into complex cellular processes. A class of recurrent neural networks (RNNs) captures the dynamics of GRNs. Algorithms combining the RNN and machine learning schemes have been proposed to reconstruct small-scale GRNs using gene expression time series. We present new GRN reconstruction methods based on neural networks. The RNN is extended to a class of recurrent multilayer perceptrons (RMLPs) with latent nodes. Our methods contain two steps: an edge rank assignment step and a network construction step. The former assigns ranks to all possible edges by a recursive procedure based on the estimated weights of the wires of the RNN/RMLP (RE_RNN/RE_RMLP), and the latter constructs a network consisting of the top-ranked edges under which the optimized RNN simulates the gene expression time series. Particle swarm optimization (PSO) is applied to optimize the parameters of the RNNs and RMLPs in a two-step algorithm. The proposed RE_RNN-RNN and RE_RMLP-RNN algorithms are tested on synthetic and experimental gene expression time series of small GRNs of about 10 genes. The experimental time series are from studies of yeast cell cycle regulated genes and E. coli DNA repair genes. The unstable estimation of an RNN using experimental time series with limited data points can lead to fairly arbitrary predicted GRNs. Our methods incorporate the RNN and RMLP into a two-step structure learning procedure. Results show that RE_RMLP, using an RMLP with a suitable number of latent nodes to reduce the parameter dimension, often yields more accurate edge ranks than RE_RNN using the regularized RNN on short simulated time series. When the networks derived by RE_RMLP-RNN using different numbers of latent nodes in step one are combined by a weighted majority voting rule to infer the GRN, the method performs consistently and outperforms published algorithms for GRN reconstruction on most benchmark time series. The framework of two-step algorithms can potentially incorporate different nonlinear differential equation models to reconstruct the GRN.
Muhs, Daniel R.; Simmons, Kathleen R.; Porat, Naomi
2015-01-01
We analyzed corals from the Neotyrrhenian beds on Mallorca, which gave U-series ages from ~ 126 ka to ~ 118 ka. These ages are consistent with previously published amino acid data that show that the Neotyrrhenian and Eutyrrhenian deposits are not significantly different in age. A fossil molluscan fauna from the Neotyrrhenian deposits on Mallorca has a warm-water paleozoogeographic aspect, with nine southward-ranging species and four extralimital southern species. When compared with sea surface temperatures obtained from planktonic foraminifera and alkenones from ODP core 977 in the nearby Alboran Sea, the only time period that shows comparable warmth is MIS 5.5/5e, consistent with the U-series ages of corals from the Neotyrrhenian deposits. We propose that the Neotyrrhenian deposits are a beachrock facies of the same age as the Eutyrrhenian deposits. This interpretation is consistent with the differences in physical sedimentology of the two deposits, explains the U-series and amino acid data indicating the same age, is consistent with the very slight elevation difference of the Neotyrrhenian and Eutyrrhenian beds, and explains the similar, though not identical paleozoogeographic aspects of their fossil faunas.
Combinations of Earth Orientation Measurements: SPACE2001, COMB2001, and POLE2001
NASA Technical Reports Server (NTRS)
Gross, Richard S.
2002-01-01
Independent Earth-orientation measurements taken by the space-geodetic techniques of lunar and satellite laser ranging, very long baseline interferometry, and the global positioning system have been combined using a Kalman filter. The resulting combined Earth-orientation series, SPACE2001, consists of values and uncertainties for Universal Time, polar motion, and their rates that span from September 28.0, 1976 to January 19.0, 2002 at daily intervals. The space-geodetic measurements used to generate SPACE2001 have been combined with optical astrometric measurements to form two additional combined Earth-orientation series: (1) COMB2001, consisting of values and uncertainties for Universal Time, polar motion, and their rates that span from January 20.0, 1962 to January 15.0, 2002 at five-day intervals, and (2) POLE2001, consisting of values and uncertainties for polar motion and its rates that span from January 20, 1900 to December 21, 2001 at 30.4375-day intervals.
Ye, Yu; Kerr, William C
2011-01-01
To explore various model specifications for estimating relationships between liver cirrhosis mortality rates and per capita alcohol consumption in aggregate-level cross-section time-series data. Using a series of liver cirrhosis mortality rates from 1950 to 2002 for 47 U.S. states, the effects of alcohol consumption were estimated from pooled autoregressive integrated moving average (ARIMA) models and 4 types of panel data models: generalized estimating equation, generalized least squares, fixed effects, and multilevel models. Various specifications of the error term structure under each type of model were also examined, as were different approaches to controlling for time trends and to using concurrent or accumulated consumption as predictors. When cirrhosis mortality was predicted by total alcohol, highly consistent estimates were found between the ARIMA and panel data analyses, with an average overall effect of 0.07 to 0.09. Less consistent estimates were derived using spirits, beer, and wine consumption as predictors. When multiple geographic time series are combined as panel data, none of the existing models can accommodate all sources of heterogeneity, such that any type of panel model must employ some form of generalization. Different types of panel data models should thus be estimated to examine the robustness of findings. We also suggest cautious interpretation when beverage-specific volumes are used as predictors. Copyright © 2010 by the Research Society on Alcoholism.
2011-01-01
Background Network inference methods reconstruct mathematical models of molecular or genetic networks directly from experimental data sets. We have previously reported a mathematical method which is exclusively data-driven, does not involve any heuristic decisions within the reconstruction process, and delivers all possible alternative minimal networks in terms of simple place/transition Petri nets that are consistent with a given discrete time series data set. Results We fundamentally extended the previously published algorithm to consider catalysis and inhibition of the reactions that occur in the underlying network. The results of the reconstruction algorithm are encoded in the form of an extended Petri net involving control arcs. This allows the consideration of processes involving mass flow and/or regulatory interactions. As a non-trivial test case, the phosphate regulatory network of enterobacteria was reconstructed using in silico-generated time-series data sets on the wild type and in silico mutants. Conclusions The new exact algorithm reconstructs extended Petri nets from time series data sets by finding all alternative minimal networks that are consistent with the data. It suggested alternative molecular mechanisms for certain reactions in the network. The algorithm is useful for combining data from wild-type and mutant cells and may potentially integrate physiological, biochemical, pharmacological, and genetic data in the form of a single model. PMID:21762503
Dst and a map of average equivalent ring current: 1958-2007
NASA Astrophysics Data System (ADS)
Love, J. J.
2008-12-01
A new Dst index construction is made using the original hourly magnetic-observatory data collected over the years 1958-2007; stations: Hermanus South Africa, Kakioka Japan, Honolulu Hawaii, and San Juan Puerto Rico. The construction method we use is generally consistent with the algorithm defined by Sugiura (1964), and which forms the basis for the standard Kyoto Dst index. This involves corrections for observatory baseline shifts, subtraction of the main-field secular variation, and subtraction of specific harmonics that approximate the solar-quiet (Sq) variation. Fourier analysis of the observatory data reveals the nature of Sq: it consists primarily of periodic variation driven by the Earth's rotation, the Moon's orbit, the Earth's orbit, and, to some extent, the solar cycle. Cross coupling of the harmonics associated with each of the external periodic driving forces results in a seemingly complicated Sq time series that is sometimes considered to be relatively random and unpredictable, but which is, in fact, well described in terms of Fourier series. Working in the frequency domain, Sq can be filtered out, and, upon return to the time domain, the local disturbance time series (Dist) for each observatory can be recovered. After averaging the local disturbance time series from each observatory, the global magnetic disturbance time series Dst is obtained. Analysis of this new Dst index is compared with that produced by Kyoto, and various biases and differences are discussed. The combination of the Dist and Dst time series can be used to explore the local-time/universal-time symmetry of an equivalent ring current. Individual magnetic storms can have a complicated disturbance field that is asymmetrical in longitude, presumably due to partial ring currents. Using 50 years of data we map the average local-time magnetic disturbance, finding that it is very nearly proportional to Dst. To our surprise, the primary asymmetry in mean magnetic disturbance is not between midnight and noon, but rather between dawn and dusk, with greatest mean disturbance occurring at dusk. As a result, proposed corrections to Dst for magnetopause and tail currents might be reasonably reconsidered.
The potential of using Landsat time-series to extract tropical dry forest phenology
NASA Astrophysics Data System (ADS)
Zhu, X.; Helmer, E.
2016-12-01
Vegetation phenology is the timing of seasonal developmental stages in plant life cycles. Due to the persistent cloud cover in tropical regions, current studies often use satellite data with high temporal frequency, such as AVHRR and MODIS, to detect vegetation phenology. However, the spatial resolution of these data is from 250 m to 1 km, which does not provide enough spatial detail and is difficult to relate to field observations. To produce maps of phenology at a finer spatial resolution, this study explores the feasibility of using Landsat images to detect tropical forest phenology by reconstructing a high-quality, seasonal time series of images, tested on Mona Island, Puerto Rico. First, an automatic method was applied to detect cloud and cloud shadow, and a spatial interpolator was used to retrieve pixels covered by clouds, shadows, and SLC-off gaps. Second, enhanced vegetation index time series derived from the reconstructed Landsat images were used to detect 11 phenology variables. The detected phenology is consistent with field investigations, and its spatial pattern is consistent with the rainfall distribution on the island. In addition, because phenology can be expected to correlate with forest biophysical attributes, 47 plots with field measurements of biophysical attributes were used to indirectly validate the phenology product. Results show that the phenology variables can explain much of the variation in biophysical attributes. This study suggests that Landsat time series have great potential for detecting phenology in tropical areas.
NASA Astrophysics Data System (ADS)
Piburn, J.; Stewart, R.; Morton, A.
2017-10-01
Identifying erratic or unstable time series is an area of interest to many fields, and there have recently been successful developments towards this goal. These newly developed methodologies, however, come from domains where it is typical to have several thousand or more temporal observations. This creates a challenge when attempting to apply them to time series with far fewer temporal observations, such as in socio-cultural understanding, a domain where a typical time series of interest might consist of only 20-30 annual observations. Most existing methodologies simply cannot say anything interesting with so few data points, yet researchers are still tasked to work within the confines of the data. Recently, a method for characterizing instability in a time series with limited temporal observations was published. This method, the Attribute Stability Index (ASI), uses an approximate-entropy-based approach to characterize a time series' instability. In this paper we propose an explicitly spatially weighted extension of the Attribute Stability Index. By including a mechanism to account for spatial autocorrelation, this work represents a novel approach for the characterization of space-time instability. As a case study we explore national youth male unemployment across the world from 1991 to 2014.
NASA Astrophysics Data System (ADS)
Müller, H.; Haberlandt, U.
2018-01-01
Rainfall time series of high temporal resolution and spatial density are crucial for urban hydrology. The multiplicative random cascade model can be used for temporal disaggregation of daily rainfall data to generate such time series. Here, the uniform splitting approach with a branching number of 3 in the first disaggregation step is applied. To achieve a final resolution of 5 min, subsequent steps after disaggregation are necessary, and three modifications at different disaggregation levels are tested in this investigation (uniform splitting at Δt = 15 min, linear interpolation at Δt = 7.5 min and at Δt = 3.75 min). Results are compared both with observations and with an often used approach based on the assumption that time steps of Δt = 5.625 min, which result if a branching number of 2 is applied throughout, can be replaced by Δt = 5 min (called the 1280 min approach). Spatial consistence is implemented in the disaggregated time series using a resampling algorithm. In total, 24 recording stations in Lower Saxony, Northern Germany, with a 5 min resolution have been used for the validation of the disaggregation procedure. The urban-hydrological suitability is tested with an artificial combined sewer system of about 170 hectares. The results show that all three variations outperform the 1280 min approach regarding the reproduction of wet spell duration, average intensity, fraction of dry intervals and lag-1 autocorrelation. Extreme values with durations of 5 min are also better represented. For durations of 1 h, all approaches show only slight deviations from the observed extremes. The applied resampling algorithm is capable of achieving sufficient spatial consistence. The effects on the urban hydrological simulations are significant: without spatial consistence, the flood volumes of manholes and the combined sewer overflow are strongly underestimated, whereas after resampling, results using disaggregated time series as input are in the range of those using observed time series. The best overall performance regarding rainfall statistics is obtained by the method in which the disaggregation process ends at time steps of 7.5 min duration, with the 5 min time steps derived by linear interpolation. With subsequent resampling, this method leads to a good representation of manhole flooding and combined sewer overflow volume in the hydrological simulations and outperforms the 1280 min approach.
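A micro-canonical multiplicative cascade in its simplest form is sketched below: each interval's rainfall depth is split among b sub-intervals with random weights that sum to one, conserving the daily total. The real model conditions the weights on wet/dry state, volume class and position, and the final interpolation to 5 min and the spatial resampling step are not shown; the usage values are purely illustrative.

```python
import numpy as np

def cascade_disaggregate(values, branching, n_steps, rng=None):
    """Micro-canonical multiplicative cascade: at every step each interval's depth
    is split among `branching` sub-intervals with random weights summing to one,
    so the total amount is conserved (a minimal sketch of the cascade idea)."""
    rng = np.random.default_rng(rng)
    x = np.asarray(values, dtype=float)
    for _ in range(n_steps):
        # draw weights for each interval and normalise them to sum to 1
        w = rng.uniform(size=(len(x), branching))
        w /= w.sum(axis=1, keepdims=True)
        x = (x[:, None] * w).ravel()
    return x

# daily totals (mm); one split by 3 (-> 480 min), then five splits by 2 (-> 15 min)
daily = np.array([12.0, 0.0, 5.4])
x = cascade_disaggregate(daily, branching=3, n_steps=1, rng=0)
x = cascade_disaggregate(x, branching=2, n_steps=5, rng=1)
print(len(x), x.sum(), daily.sum())   # 96 intervals per day, mass conserved
```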
Sea change: Charting the course for biogeochemical ocean time-series research in a new millennium
NASA Astrophysics Data System (ADS)
Church, Matthew J.; Lomas, Michael W.; Muller-Karger, Frank
2013-09-01
Ocean time-series provide vital information needed for assessing ecosystem change. This paper summarizes the historical context, major program objectives, and future research priorities for three contemporary ocean time-series programs: The Hawaii Ocean Time-series (HOT), the Bermuda Atlantic Time-series Study (BATS), and the CARIACO Ocean Time-Series. These three programs operate in physically and biogeochemically distinct regions of the world's oceans, with HOT and BATS located in the open-ocean waters of the subtropical North Pacific and North Atlantic, respectively, and CARIACO situated in the anoxic Cariaco Basin of the tropical Atlantic. All three programs sustain near-monthly shipboard occupations of their field sampling sites, with HOT and BATS beginning in 1988, and CARIACO initiated in 1996. The resulting data provide some of the only multi-disciplinary, decadal-scale determinations of time-varying ecosystem change in the global ocean. Facilitated by a scoping workshop (September 2010) sponsored by the Ocean Carbon Biogeochemistry (OCB) program, leaders of these time-series programs sought community input on existing program strengths and for future research directions. Themes that emerged from these discussions included: 1. Shipboard time-series programs are key to informing our understanding of the connectivity between changes in ocean-climate and biogeochemistry 2. The scientific and logistical support provided by shipboard time-series programs forms the backbone for numerous research and education programs. Future studies should be encouraged that seek mechanistic understanding of ecological interactions underlying the biogeochemical dynamics at these sites. 3. Detecting time-varying trends in ocean properties and processes requires consistent, high-quality measurements. Time-series must carefully document analytical procedures and, where possible, trace the accuracy of analyses to certified standards and internal reference materials. 4. Leveraged implementation, testing, and validation of autonomous and remote observing technologies at time-series sites provide new insights into spatiotemporal variability underlying ecosystem changes. 5. The value of existing time-series data for formulating and validating ecosystem models should be promoted. In summary, the scientific underpinnings of ocean time-series programs remain as strong and important today as when these programs were initiated. The emerging data inform our knowledge of the ocean's biogeochemistry and ecology, and improve our predictive capacity about planetary change.
Properties of Asymmetric Detrended Fluctuation Analysis in the time series of RR intervals
NASA Astrophysics Data System (ADS)
Piskorski, J.; Kosmider, M.; Mieszkowski, D.; Krauze, T.; Wykretowicz, A.; Guzik, P.
2018-02-01
Heart rate asymmetry is a phenomenon by which the accelerations and decelerations of heart rate behave differently, and this difference is consistent and unidirectional, i.e. in most of the analyzed recordings the inequalities have the same directions. So far, it has been established for variance- and runs-based descriptors of RR intervals time series. In this paper we apply the newly developed method of Asymmetric Detrended Fluctuation Analysis, which so far has mainly been used with economic time series, to a set of 420 stationary 30 min time series of RR intervals from young, healthy individuals aged between 20 and 40. This asymmetric approach introduces separate scaling exponents for rising and falling trends. We systematically study the presence of asymmetry in both global and local versions of this method. In this study, global means "applying to the whole time series" and local means "applying to windows jumping along the recording". It is found that the correlation structure of the fluctuations left over after detrending in physiological time series shows strong asymmetric features both in magnitude, with α+ < α-, where α+ is related to heart rate decelerations and α- to heart rate accelerations, and in the proportion of the signal in which the above inequality holds. A very similar effect is observed if asymmetric noise is added to a symmetric self-affine function. No such phenomena are observed in the same physiological data after shuffling or with a group of symmetric synthetic time series.
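The following Python sketch shows the core of an asymmetric DFA computation as described above: the integrated profile is detrended window by window, and fluctuations are pooled separately for windows whose original-series trend is rising or falling, yielding separate α+ and α- exponents. The scale choices and the synthetic RR series are illustrative assumptions, not the study's data or exact implementation.

```python
import numpy as np

def adfa(rr, scales):
    """Asymmetric DFA sketch: integrate the series, split it into windows,
    detrend each window with a linear fit, and accumulate fluctuations
    separately for windows with a rising (+) or falling (-) trend in the
    original RR series. Returns the alpha+ and alpha- scaling exponents."""
    x = np.asarray(rr, dtype=float)
    y = np.cumsum(x - x.mean())                  # integrated profile
    f_plus, f_minus = [], []
    for s in scales:
        n_win = len(x) // s
        fp, fm = [], []
        for i in range(n_win):
            seg_y = y[i * s:(i + 1) * s]
            seg_x = x[i * s:(i + 1) * s]
            t = np.arange(s)
            # detrend the profile with a linear fit
            cy = np.polyfit(t, seg_y, 1)
            resid = seg_y - np.polyval(cy, t)
            f2 = np.mean(resid ** 2)
            # the sign of the trend of the *original* series decides the class
            slope = np.polyfit(t, seg_x, 1)[0]
            (fp if slope > 0 else fm).append(f2)
        f_plus.append(np.sqrt(np.mean(fp)) if fp else np.nan)
        f_minus.append(np.sqrt(np.mean(fm)) if fm else np.nan)
    log_s = np.log(scales)
    a_plus = np.polyfit(log_s, np.log(f_plus), 1)[0]
    a_minus = np.polyfit(log_s, np.log(f_minus), 1)[0]
    return a_plus, a_minus

rr = np.random.default_rng(0).normal(800, 50, 1800)   # synthetic RR intervals (ms)
print(adfa(rr, scales=[10, 20, 40, 80, 160]))
```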
10 CFR 300.5 - Submission of an entity statement.
Code of Federal Regulations, 2010 CFR
2010-01-01
... report as a large emitter in all future years in order to ensure a consistent time series of reports... average annual emissions over a continuous period not to exceed four years of time ending in its chosen... necessary, update their entity statements. (2) From time to time, a reporting entity may choose to change...
10 CFR 300.5 - Submission of an entity statement.
Code of Federal Regulations, 2011 CFR
2011-01-01
... report as a large emitter in all future years in order to ensure a consistent time series of reports... average annual emissions over a continuous period not to exceed four years of time ending in its chosen... necessary, update their entity statements. (2) From time to time, a reporting entity may choose to change...
10 CFR 300.5 - Submission of an entity statement.
Code of Federal Regulations, 2012 CFR
2012-01-01
... report as a large emitter in all future years in order to ensure a consistent time series of reports... average annual emissions over a continuous period not to exceed four years of time ending in its chosen... necessary, update their entity statements. (2) From time to time, a reporting entity may choose to change...
10 CFR 300.5 - Submission of an entity statement.
Code of Federal Regulations, 2014 CFR
2014-01-01
... report as a large emitter in all future years in order to ensure a consistent time series of reports... average annual emissions over a continuous period not to exceed four years of time ending in its chosen... necessary, update their entity statements. (2) From time to time, a reporting entity may choose to change...
10 CFR 300.5 - Submission of an entity statement.
Code of Federal Regulations, 2013 CFR
2013-01-01
... report as a large emitter in all future years in order to ensure a consistent time series of reports... average annual emissions over a continuous period not to exceed four years of time ending in its chosen... necessary, update their entity statements. (2) From time to time, a reporting entity may choose to change...
NASA Astrophysics Data System (ADS)
Laib, Mohamed; Telesca, Luciano; Kanevski, Mikhail
2018-02-01
In this paper, we study the periodic fluctuations of connectivity density time series of a wind speed-monitoring network in Switzerland. Using the correlogram-based robust periodogram, annual periodic oscillations were found in the correlation-based network. The intensity of these annual oscillations is larger for lower correlation thresholds and smaller for higher ones. The annual periodicity in the connectivity density seems reasonably consistent with the seasonal meteo-climatic cycle.
NASA Astrophysics Data System (ADS)
Mizukami, N.; Smith, M. B.
2010-12-01
It is common for the error characteristics of long-term precipitation data to change over time due to various factors such as gauge relocation and changes in data processing methods. The temporal consistency of precipitation data error characteristics is as important as data accuracy itself for hydrologic model calibration and the subsequent use of the calibrated model for streamflow prediction. In mountainous areas, the generation of precipitation grids relies on sparse gage networks, the makeup of which often varies over time. This causes a change in the error characteristics of the long-term precipitation data record. We will discuss the diagnostic analysis of the consistency of gridded precipitation time series and illustrate the adverse effect of inconsistent precipitation data on a hydrologic model simulation. We used hourly 4 km gridded precipitation time series over a mountainous basin in the Sierra Nevada Mountains of California from October 1988 through September 2006. The basin is part of the broader study area that served as the focus of the second phase of the Distributed Model Intercomparison Project (DMIP-2), organized by the U.S. National Weather Service (NWS) of the National Oceanic and Atmospheric Administration (NOAA). To check the consistency of the gridded precipitation time series, double mass analysis was performed using single-pixel and basin mean areal precipitation (MAP) values derived from gridded DMIP-2 and Parameter-Elevation Regressions on Independent Slopes Model (PRISM) precipitation data. The analysis leads to the conclusion that, over the entire study time period, a clear change in the error characteristics of the DMIP-2 data occurred at the beginning of 2003. This matches the timing of one of the major gage network changes. The inconsistency of two MAP time series computed from the gridded precipitation fields over two elevation zones was corrected by adjusting hourly values based on the double mass analysis. We show that model simulations using the adjusted MAP data produce improved streamflow compared to simulations using the inconsistent MAP input data.
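A minimal sketch of the double mass analysis mentioned above: the cumulative sums of a test series are regressed against those of a reference series, the break point is located by a two-segment fit, and the ratio of the segment slopes gives an adjustment factor. The break-search range and the synthetic undercatch scenario are illustrative assumptions.

```python
import numpy as np

def double_mass_breakpoint(test, reference):
    """Double mass analysis sketch: cumulate the test series against a
    reference series and find the point where the slope changes most, by
    fitting two lines and minimising the summed squared residuals."""
    cx = np.cumsum(reference)
    cy = np.cumsum(test)
    best_k, best_sse = None, np.inf
    for k in range(10, len(cx) - 10):            # avoid very short segments
        sse = 0.0
        for xs, ys in ((cx[:k], cy[:k]), (cx[k:], cy[k:])):
            coef = np.polyfit(xs, ys, 1)
            sse += np.sum((ys - np.polyval(coef, xs)) ** 2)
        if sse < best_sse:
            best_k, best_sse = k, sse
    s1 = np.polyfit(cx[:best_k], cy[:best_k], 1)[0]
    s2 = np.polyfit(cx[best_k:], cy[best_k:], 1)[0]
    return best_k, s1 / s2                       # index of break, adjustment factor

# synthetic monthly MAP with a 20 % undercatch after month 120
rng = np.random.default_rng(1)
ref = rng.gamma(2.0, 50.0, 216)
tst = ref * np.where(np.arange(216) < 120, 1.0, 0.8) + rng.normal(0, 5, 216)
print(double_mass_breakpoint(tst, ref))
```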
Discriminant Analysis of Time Series in the Presence of Within-Group Spectral Variability.
Krafty, Robert T
2016-07-01
Many studies record replicated time series epochs from different groups with the goal of using frequency domain properties to discriminate between the groups. In many applications, there exists variation in cyclical patterns from time series in the same group. Although a number of frequency domain methods for the discriminant analysis of time series have been explored, there is a dearth of models and methods that account for within-group spectral variability. This article proposes a model for groups of time series in which transfer functions are modeled as stochastic variables that can account for both between-group and within-group differences in spectra that are identified from individual replicates. An ensuing discriminant analysis of stochastic cepstra under this model is developed to obtain parsimonious measures of relative power that optimally separate groups in the presence of within-group spectral variability. The approach possesses favorable properties in classifying new observations and can be consistently estimated through a simple discriminant analysis of a finite number of estimated cepstral coefficients. Benefits of accounting for within-group spectral variability are empirically illustrated in a simulation study and through an analysis of gait variability.
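A toy version of the cepstral discriminant idea, assuming a raw periodogram-based cepstrum and scikit-learn's linear discriminant analysis rather than the article's stochastic-cepstra model: each replicate epoch is reduced to a few cepstral coefficients, and those coefficients are used as classification features.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)

def cepstral_coefficients(x, n_coef=10):
    """First n_coef cepstral coefficients from a log-periodogram
    (a simple stand-in for the model-based estimates in the paper)."""
    x = np.asarray(x, float) - np.mean(x)
    n = len(x)
    periodogram = np.abs(np.fft.rfft(x)) ** 2 / n
    log_spec = np.log(periodogram[1:])           # drop the zero frequency
    cep = np.fft.irfft(log_spec)
    return cep[:n_coef]

def ar1(phi, n=512):
    """Generate an AR(1) epoch; the two groups differ in spectral shape."""
    e = rng.normal(size=n)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = phi * y[t - 1] + e[t]
    return y

X = np.array([cepstral_coefficients(ar1(0.7)) for _ in range(40)] +
             [cepstral_coefficients(ar1(-0.5)) for _ in range(40)])
labels = np.array([0] * 40 + [1] * 40)
clf = LinearDiscriminantAnalysis().fit(X, labels)
print(clf.score(X, labels))
```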
Analysis of Vlbi, Slr and GPS Site Position Time Series
NASA Astrophysics Data System (ADS)
Angermann, D.; Krügel, M.; Meisel, B.; Müller, H.; Tesmer, V.
Conventionally, the IERS terrestrial reference frame (ITRF) is realized by the adoption of a set of epoch coordinates and linear velocities for a set of global tracking stations. Due to the remarkable progress of the space geodetic observation techniques (e.g., VLBI, SLR, GPS), the accuracy and consistency of the ITRF have increased continuously. The accuracy achieved today is mainly limited by technique-related systematic errors, which are often poorly characterized or quantified. Therefore it is essential to analyze the individual techniques' solutions with respect to systematic differences, models, parameters, datum definition, etc. The main subject of this presentation is the analysis of GPS, SLR and VLBI time series of site positions. The investigations are based on SLR and VLBI solutions computed at DGFI with the software systems DOGS (SLR) and OCCAM (VLBI). The GPS time series are based on weekly IGS station coordinate solutions. We analyze the time series with respect to the issues mentioned above. In particular, we characterize the noise in the time series, identify periodic signals, and investigate non-linear effects that complicate the assignment of linear velocities for global tracking sites. One important aspect is the comparison of results obtained by different techniques at colocation sites.
Sentinel 2 products and data quality status
NASA Astrophysics Data System (ADS)
Clerc, Sebastien; Gascon, Ferran; Bouzinac, Catherine; Touli-Lebreton, Dimitra; Francesconi, Benjamin; Lafrance, Bruno; Louis, Jerome; Alhammoud, Bahjat; Massera, Stephane; Pflug, Bringfried; Viallefont, Francoise; Pessiot, Laetitia
2017-04-01
Since July 2015, Sentinel-2A has provided high-quality multi-spectral images with 10 m spatial resolution. With the launch of Sentinel-2B scheduled for early March 2017, the mission will provide a consistent time series with a revisit time of 5 days. The consistency of the time series is ensured by specific performance requirements, such as multi-temporal spatial co-registration and radiometric stability, routinely monitored by the Sentinel-2 Mission Performance Centre (S2MPC). The products also provide a rich set of metadata and auxiliary data to support higher-level processing. This presentation will focus on the current status of the Sentinel-2 L1C and L2A products, including dissemination and product format aspects. Up-to-date mission performance estimations will be presented. Finally, we will provide an outlook on future evolutions: commissioning tasks for Sentinel-2B, geometric refinement, and product format and processing improvements.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-10
... Mexico stocks of gag and greater amberjack will consist of two workshops and a series of webinars: a Data Workshop, an Assessment process conducted via webinars, and a Review Workshop. This series of workshops and.... eastern time, will last approximately four hours, and will be conducted using GoToWebinar. Participants...
Using Time Series Analysis to Predict Cardiac Arrest in a PICU.
Kennedy, Curtis E; Aoki, Noriaki; Mariscalco, Michele; Turley, James P
2015-11-01
To build and test cardiac arrest prediction models in a PICU, using time series analysis as input, and to measure changes in prediction accuracy attributable to different classes of time series data. Retrospective cohort study. Thirty-one-bed academic PICU that provides care for medical and general surgical (not congenital heart surgery) patients. Patients experiencing a cardiac arrest in the PICU and requiring external cardiac massage for at least 2 minutes. None. One hundred three cases of cardiac arrest and 109 control cases were used to prepare a baseline dataset that consisted of 1,025 variables in four data classes: multivariate, raw time series, clinical calculations, and time series trend analysis. We trained 20 arrest prediction models using a matrix of five feature sets (combinations of data classes) with four modeling algorithms: linear regression, decision tree, neural network, and support vector machine. The reference model (multivariate data with regression algorithm) had an accuracy of 78% and an area under the receiver operating characteristic curve of 87%. The best model (multivariate + trend analysis data with support vector machine algorithm) had an accuracy of 94% and an area under the receiver operating characteristic curve of 98%. Cardiac arrest predictions based on a traditional model built with multivariate data and a regression algorithm misclassified cases 3.7 times more frequently than predictions that included time series trend analysis and were built with a support vector machine algorithm. Although the final model lacks the specificity necessary for clinical application, we have demonstrated how information from time series data can be used to increase the accuracy of clinical prediction models.
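The sketch below illustrates the general idea of augmenting snapshot features with time series trend features and classifying with a support vector machine; the vital-sign generator, the feature set, and the window length are hypothetical stand-ins, not the study's 1,025-variable dataset.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)

def trend_features(series):
    """Simple time-series trend features of the kind the study combined with
    multivariate snapshots: last value, mean, slope and variability."""
    t = np.arange(len(series))
    slope = np.polyfit(t, series, 1)[0]
    return [series[-1], series.mean(), slope, series.std()]

def make_window(case):
    """Hypothetical 60-sample vital-sign window: controls are stable,
    cases drift downward before the event."""
    base = rng.normal(100, 5)
    drift = -0.4 if case else 0.0
    return base + drift * np.arange(60) + rng.normal(0, 2, 60)

X = np.array([trend_features(make_window(False)) for _ in range(109)] +
             [trend_features(make_window(True)) for _ in range(103)])
y = np.array([0] * 109 + [1] * 103)

model = SVC(kernel="rbf", gamma="scale")
print(cross_val_score(model, X, y, cv=5).mean())
```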
RankExplorer: Visualization of Ranking Changes in Large Time Series Data.
Shi, Conglei; Cui, Weiwei; Liu, Shixia; Xu, Panpan; Chen, Wei; Qu, Huamin
2012-12-01
For many applications involving time series data, people are often interested in the changes of item values over time as well as their ranking changes. For example, people search many words via search engines like Google and Bing every day. Analysts are interested in both the absolute searching number for each word as well as their relative rankings. Both sets of statistics may change over time. For very large time series data with thousands of items, how to visually present ranking changes is an interesting challenge. In this paper, we propose RankExplorer, a novel visualization method based on ThemeRiver to reveal the ranking changes. Our method consists of four major components: 1) a segmentation method which partitions a large set of time series curves into a manageable number of ranking categories; 2) an extended ThemeRiver view with embedded color bars and changing glyphs to show the evolution of aggregation values related to each ranking category over time as well as the content changes in each ranking category; 3) a trend curve to show the degree of ranking changes over time; 4) rich user interactions to support interactive exploration of ranking changes. We have applied our method to some real time series data and the case studies demonstrate that our method can reveal the underlying patterns related to ranking changes which might otherwise be obscured in traditional visualizations.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-11
... queen triggerfish will consist of a series of workshops and webinars: This notice is for a webinar... time. The established times may be adjusted as necessary to accommodate the timely completion of..., or completed prior to, the time established by this notice. ADDRESSES: The meeting will be held via...
A 305-year continuous monthly rainfall series for the island of Ireland (1711-2016)
NASA Astrophysics Data System (ADS)
Murphy, Conor; Broderick, Ciaran; Burt, Timothy P.; Curley, Mary; Duffy, Catriona; Hall, Julia; Harrigan, Shaun; Matthews, Tom K. R.; Macdonald, Neil; McCarthy, Gerard; McCarthy, Mark P.; Mullan, Donal; Noone, Simon; Osborn, Timothy J.; Ryan, Ciara; Sweeney, John; Thorne, Peter W.; Walsh, Seamus; Wilby, Robert L.
2018-03-01
A continuous 305-year (1711-2016) monthly rainfall series (IoI_1711) is created for the Island of Ireland. The post-1850 series draws on an existing quality-assured rainfall network for Ireland, while pre-1850 values come from instrumental and documentary series compiled, but not published, by the UK Met Office. The series is evaluated by comparison with independent long-term observations and reconstructions of precipitation, temperature and circulation indices from across the British-Irish Isles. Strong decadal consistency of IoI_1711 with other long-term observations is evident throughout the annual, boreal spring and autumn series. Annually, the most recent decade (2006-2015) is found to be the wettest in over 300 years. The winter series is probably too dry between the 1740s and 1780s, but strong consistency with other long-term observations strengthens confidence from 1790 onwards. The IoI_1711 series has remarkably wet winters during the 1730s, concurrent with a period of strong westerly airflow, glacial advance throughout Scandinavia and near-unprecedented warmth in the Central England Temperature record - all consistent with a strongly positive phase of the North Atlantic Oscillation. Unusually wet summers occurred in the 1750s, consistent with proxy (tree-ring) reconstructions of summer precipitation in the region. Our analysis shows that inter-decadal variability of precipitation is much larger than previously thought, while relationships with key modes of climate variability are time-variant. The IoI_1711 series reveals statistically significant multi-centennial trends in winter (increasing) and summer (decreasing) seasonal precipitation. However, given uncertainties in the early winter record, the former finding should be regarded as tentative. The derived record, one of the longest continuous series in Europe, offers valuable insights for understanding multi-decadal and centennial rainfall variability in Ireland, and provides a firm basis for benchmarking other long-term records and reconstructions of past climate. Correlation of Irish rainfall with other parts of Europe increases the utility of the series for understanding historical climate in regions beyond Ireland.
NASA Astrophysics Data System (ADS)
Nechad, Bouchra; Alvera-Azcaràte, Aida; Ruddick, Kevin; Greenwood, Naomi
2011-08-01
In situ measurements of total suspended matter (TSM) over the period 2003-2006, collected with two autonomous platforms from the Centre for Environment, Fisheries and Aquaculture Science (Cefas) measuring optical backscatter (OBS) in the southern North Sea, are used to assess the accuracy of TSM time series extracted from satellite data. Since there are gaps in the remote sensing (RS) data, due mainly to cloud cover, the Data Interpolating Empirical Orthogonal Functions (DINEOF) method is used to fill in the TSM time series and build a continuous daily "recoloured" dataset. The RS datasets consist of TSM maps derived from MODIS imagery using the bio-optical model of Nechad et al. (Rem Sens Environ 114: 854-866, 2010). In this study, the DINEOF time series are compared to the in situ OBS measured in moderately to very turbid waters, respectively at West Gabbard and Warp Anchorage, in the southern North Sea. The discrepancies between instantaneous RS, DINEOF-filled RS data and Cefas data are analysed in terms of TSM algorithm uncertainties, space-time variability and DINEOF reconstruction uncertainty.
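A minimal EOF-based gap-filling sketch in the spirit of DINEOF (without the cross-validated choice of modes or the other refinements of the actual method): missing values are initialized with column means and then iteratively replaced by a truncated SVD reconstruction. The synthetic TSM-like field and the mode count are illustrative assumptions.

```python
import numpy as np

def eof_fill(data, n_modes=3, n_iter=50):
    """EOF-based gap filling sketch: fill gaps with column means, then
    iterate a truncated SVD reconstruction, updating only the gap values."""
    X = np.array(data, float)
    mask = np.isnan(X)
    col_mean = np.nanmean(X, axis=0)
    X[mask] = np.take(col_mean, np.where(mask)[1])   # first guess
    for _ in range(n_iter):
        mean = X.mean(axis=0)
        A = X - mean
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        recon = (U[:, :n_modes] * s[:n_modes]) @ Vt[:n_modes] + mean
        X[mask] = recon[mask]                        # update only the gaps
    return X

# synthetic field: one year of daily values at 20 pixels with 30 % cloud gaps
rng = np.random.default_rng(4)
t = np.arange(365)
truth = 5 + 2 * np.sin(2 * np.pi * t / 365)[:, None] + rng.normal(0, 0.3, (365, 20))
obs = truth.copy()
obs[rng.random(truth.shape) < 0.3] = np.nan
filled = eof_fill(obs)
print(np.sqrt(np.nanmean((filled - truth) ** 2)))    # reconstruction RMSE
```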
NASA Astrophysics Data System (ADS)
Wu, Xiaoping; Abbondanza, Claudio; Altamimi, Zuheir; Chin, T. Mike; Collilieux, Xavier; Gross, Richard S.; Heflin, Michael B.; Jiang, Yan; Parker, Jay W.
2015-05-01
The current International Terrestrial Reference Frame is based on a piecewise linear site motion model and realized by reference epoch coordinates and velocities for a global set of stations. Although linear motions due to tectonic plates and glacial isostatic adjustment dominate geodetic signals, at today's millimeter precisions, nonlinear motions due to earthquakes, volcanic activities, ice mass losses, sea level rise, hydrological changes, and other processes become significant. Monitoring these (sometimes rapid) changes requires consistent and precise realization of the terrestrial reference frame (TRF) quasi-instantaneously. Here, we use a Kalman filter and smoother approach to combine time series from four space geodetic techniques to realize an experimental TRF through weekly time series of geocentric coordinates. In addition to secular, periodic, and stochastic components for station coordinates, the Kalman filter state variables also include daily Earth orientation parameters and transformation parameters from the input data frames to the combined TRF. Local tie measurements among colocated stations are used at their known or nominal epochs of observation, with comotion constraints applied to almost all colocated stations. The filter/smoother approach unifies different geodetic time series in a single geocentric frame. Fragmented and multitechnique tracking records at colocation sites are bridged together to form longer and more coherent motion time series. While the time series approach to the TRF reflects the reality of a changing Earth more closely than the linear approximation model, the filter/smoother is computationally powerful and flexible, facilitating the incorporation of other data types and more advanced characterization of the stochastic behavior of geodetic time series.
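A stripped-down sketch of the filter idea for combining several coordinate time series of the same quantity, assuming a scalar local-level (random-walk) state and sequential measurement updates; the real combination also carries secular and periodic terms, Earth orientation parameters, and frame transformation parameters in the state vector.

```python
import numpy as np

def kalman_combine(obs_list, var_list, q=1e-4):
    """Minimal Kalman filter combining several weekly coordinate series of
    one station into a single random-walk state. q is the process noise;
    observations may contain NaN for missing weeks."""
    n = len(obs_list[0])
    x, p = 0.0, 1e6                              # diffuse initial state
    out = np.empty(n)
    for t in range(n):
        p += q                                   # prediction step
        for obs, r in zip(obs_list, var_list):   # sequential updates
            z = obs[t]
            if np.isnan(z):
                continue
            k = p / (p + r)
            x += k * (z - x)
            p *= (1 - k)
        out[t] = x
    return out

rng = np.random.default_rng(5)
truth = np.cumsum(rng.normal(0, 0.01, 520)) + 2.0          # slow station motion
gps  = truth + rng.normal(0, 0.5, 520)                     # noisier series
vlbi = truth + rng.normal(0, 0.3, 520)
vlbi[::3] = np.nan                                          # sparser sessions
combined = kalman_combine([gps, vlbi], [0.25, 0.09])
print(np.sqrt(np.mean((combined[52:] - truth[52:]) ** 2)))  # post-spin-up RMSE
```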
LAI, FAPAR and FCOVER products derived from AVHRR long time series: principles and evaluation
NASA Astrophysics Data System (ADS)
Verger, A.; Baret, F.; Weiss, M.; Lacaze, R.; Makhmara, H.; Pacholczyk, P.; Smets, B.; Kandasamy, S.; Vermote, E.
2012-04-01
Continuous and long-term global monitoring of the terrestrial biosphere has drawn intense interest in recent years in the context of climate and global change. Developing methodologies for generating historical data records from data collected with different satellite sensors over the past three decades, while taking advantage of the improvements identified in the processing of the new generation of sensors, is a central issue for the remote sensing community. In this context, the Bio-geophysical Parameters (BioPar) service within the Geoland2 project (http://www.geoland2.eu) aims at developing pre-operational infrastructures for providing global land products both in near real time and off-line with long time series. In this contribution, we describe the principles of the GEOLAND algorithm for generating long-term datasets of three key biophysical variables, leaf area index (LAI), Fraction of Absorbed Photosynthetically Active Radiation (FAPAR) and cover fraction (FCOVER), which play a key role in several processes, including photosynthesis, respiration and transpiration. LAI, FAPAR and FCOVER are produced globally from the AVHRR Long Term Data Record (LTDR) for the 1981-2000 period at 0.05° spatial resolution and 10-day temporal sampling frequency. The proposed algorithm aims to ensure robustness of the derived long time series and consistency with the ones developed in recent years, particularly the GEOLAND products derived from the VEGETATION sensor. The approach is based on the capacity of neural networks to learn a particular biophysical product (GEOLAND) from reflectances from another sensor (AVHRR normalized reflectances in the red and near-infrared bands). Outliers due to possible cloud contamination or residual atmospheric correction are iteratively eliminated. Prior information based on the climatology is used to obtain more robust estimates. A specific gap-filling and smoothing procedure was applied to generate continuous and smooth time series of decadal products. Finally, quality assessment information as well as tentative quantitative uncertainties were proposed. The comparison of the resulting AVHRR LTDR products with the actual GEOLAND series derived from VEGETATION demonstrates that they are very consistent, providing continuous time series of global observations of LAI, FAPAR and FCOVER for the last 30-year period, with continuation after 2011.
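The sketch below mimics the training step described above with scikit-learn's MLPRegressor: a small network learns a biophysical variable from red and near-infrared reflectances. The NDVI-based "truth" relation and the sample sizes are purely illustrative assumptions, not the GEOLAND training database.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic stand-in for the training step: learn LAI from red/NIR
# normalized reflectances. The target here follows a toy NDVI-based relation.
rng = np.random.default_rng(6)
red = rng.uniform(0.02, 0.15, 5000)
nir = rng.uniform(0.1, 0.5, 5000)
ndvi = (nir - red) / (nir + red)
lai = -2.0 * np.log(np.clip(1 - ndvi, 0.05, None)) + rng.normal(0, 0.1, 5000)

X = np.column_stack([red, nir])
net = MLPRegressor(hidden_layer_sizes=(20, 10), max_iter=2000, random_state=0)
net.fit(X[:4000], lai[:4000])
print(net.score(X[4000:], lai[4000:]))          # R^2 on held-out samples
```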
40 CFR 86.1725-99 - Maintenance.
Code of Federal Regulations, 2012 CFR
2012-07-01
...) through (e) and subsequent model year provisions. (b) Manufacturers of series hybrid electric vehicles and... the first time the minimum performance level is observed for all battery system components. Possible... system consisting of a light that shall illuminate the first time the battery system is unable to achieve...
40 CFR 86.1725-99 - Maintenance.
Code of Federal Regulations, 2013 CFR
2013-07-01
...) through (e) and subsequent model year provisions. (b) Manufacturers of series hybrid electric vehicles and... the first time the minimum performance level is observed for all battery system components. Possible... system consisting of a light that shall illuminate the first time the battery system is unable to achieve...
40 CFR 86.1725-99 - Maintenance.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) through (e) and subsequent model year provisions. (b) Manufacturers of series hybrid electric vehicles and... the first time the minimum performance level is observed for all battery system components. Possible... system consisting of a light that shall illuminate the first time the battery system is unable to achieve...
40 CFR 86.1725-99 - Maintenance.
Code of Federal Regulations, 2011 CFR
2011-07-01
...) through (e) and subsequent model year provisions. (b) Manufacturers of series hybrid electric vehicles and... the first time the minimum performance level is observed for all battery system components. Possible... system consisting of a light that shall illuminate the first time the battery system is unable to achieve...
Panel data analysis of cardiotocograph (CTG) data.
Horio, Hiroyuki; Kikuchi, Hitomi; Ikeda, Tomoaki
2013-01-01
Panel data analysis is a statistical method, widely used in econometrics, which deals with two-dimensional panel data collected over time and over individuals. Cardiotocography (CTG), which monitors fetal heart rate (FHR) using Doppler ultrasound and uterine contraction by strain gage, is commonly used in the intrapartum treatment of pregnant women. Although the relationship between FHR waveform patterns and outcomes such as umbilical blood gas data at delivery has long been analyzed, there is no accumulation of FHR patterns from a large number of cases. As time-series economic fluctuations, such as consumption trends, have been studied in econometrics using panel data consisting of time-series and cross-sectional data, we tried to apply this method to CTG data. Panel data composed of symbolized segments of the FHR pattern can be easily handled, and a perinatologist can obtain a view of the whole FHR pattern from the microscopic level of the time-series FHR data.
Chen, Hungyen; Chen, Ching-Yi; Shao, Kwang-Tsao
2018-05-08
Long-term time series datasets with consistent sampling methods are rather rare, especially for non-target coastal fishes. Here we describe a long-term time series dataset of fish collected by trammel net sampling and observed by an underwater diving visual census near the thermal discharges at two nuclear power plants on the northern coast of Taiwan. Both experimental and control stations of these two investigations were monitored four times per year in the surrounding seas at both plants from 2000 to 2017. The underwater visual census mainly monitored reef fish assemblages, and the trammel net samples monitored pelagic or demersal fishes above the muddy/sandy bottom. In total, 508 samples containing 203,863 individuals from 347 taxa were recorded in both investigations at both plants. These data can be used by ecologists and fishery biologists interested in the elucidation of the temporal patterns of species abundance and composition.
The Validation of Version 8 Ozone Profiles: Is SBUV Ready for Prime Time?
NASA Technical Reports Server (NTRS)
McPeters, R. D.; Wellemeyer, C. G.; Ahn, C.
2004-01-01
Ozone profile data are now available from a series of BUV instruments - SBUV on Nimbus 7 and SBUV/2 instruments on NOAA 9, NOAA 11, and NOAA 16. The data have been processed through the new version 8 algorithm, which is designed to be more accurate and, more importantly, to reduce the influence of the a priori on ozone trends. As a part of the version 8 reprocessing we have attempted to apply a consistent calibration to the individual instruments so that their data records can be used together in a time series analysis. Validation consists of examining not only the mean difference from external datasets (i.e., trends) but also consistency in the interannual variability of the data. Here we validate the v8 BUV data through comparison with ECC sondes, lidar and microwave measurements, and with SAGE II and HALOE satellite data records. We find that individual profiles generally agree with external data sets within +/-10% between 30 hPa and 1 hPa (approx. 24 - 50 km) and frequently agree within +/-5%. The interannual variability of the BUV ozone time series agrees well with that of SAGE II. On average, different BUV instruments usually agree within +/-5% with each other, though the relative error increases near the ends of the Nimbus 7 and NOAA 16 data records as a result of instrument problems. The combined v8 BUV data sets cover the 1979-2003 time period, giving daily global coverage of the ozone vertical distribution to better accuracy than has ever been possible before.
NASA Astrophysics Data System (ADS)
Lambert, S. B.; Ziegler, Y.; Rosat, S.; Bizouard, C.
2017-12-01
Nutation time series derived from very long baseline interferometry (VLBI) and time-varying surface gravity data recorded by superconducting gravimeters (SG) have long been used separately to assess the Earth's interior via the estimation of the free core and inner core resonance effects on nutation or tidal gravity. The results obtained from these two techniques have recently been shown to be consistent, making relevant the combination of VLBI and SG observables and the estimation of Earth's interior parameters in a single inversion. We present here the results of combining nutation and surface gravity time series to improve estimates of the Earth's core and inner core resonant frequencies. We use VLBI nutation time series spanning 1984-2016 derived by several analysis centers affiliated with the International VLBI Service for Geodesy and Astrometry, together with surface gravity data from about 15 SG stations. We address the resonance model used for describing the Earth's interior response to tidal excitation, the data preparation consisting of error recalibration and amplitude fitting for the nutation data, and the processing of SG time-varying gravity to remove any gaps, spikes, steps and other disturbances, followed by tidal analysis with the ETERNA 3.4 software package. New estimates of the resonant periods are proposed and correlations between the parameters are investigated.
NASA Astrophysics Data System (ADS)
Ziegler, Yann; Lambert, Sébastien; Rosat, Séverine; Nurul Huda, Ibnu; Bizouard, Christian
2017-04-01
Nutation time series derived from very long baseline interferometry (VLBI) and time varying surface gravity data recorded by superconducting gravimeters (SG) have long been used separately to assess the Earth's interior via the estimation of the free core and inner core resonance effects on nutation or tidal gravity. The results obtained from these two techniques have been shown recently to be consistent, making relevant the combination of VLBI and SG observables and the estimation of Earth's interior parameters in a single inversion. We present here the intermediate results of the ongoing project of combining nutation and surface gravity time series to improve estimates of the Earth's core and inner core resonant frequencies. We use VLBI nutation time series spanning 1984-2016 derived by the International VLBI Service for geodesy and astrometry (IVS) as the result of a combination of inputs from various IVS analysis centers, and surface gravity data from about 15 SG stations. We address here the resonance model used for describing the Earth's interior response to tidal excitation, the data preparation consisting of the error recalibration and amplitude fitting for nutation data, and processing of SG time-varying gravity to remove any gaps, spikes, steps and other disturbances, followed by the tidal analysis with the ETERNA 3.4 software package, the preliminary estimates of the resonant periods, and the correlations between parameters.
40 CFR 86.115-78 - EPA urban dynamometer driving schedule.
Code of Federal Regulations, 2013 CFR
2013-07-01
... PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES Emission.... time relationships. They each consist of a distinct nonrepetitive series of idle, acceleration, cruise...
40 CFR 86.115-78 - EPA urban dynamometer driving schedule.
Code of Federal Regulations, 2011 CFR
2011-07-01
... PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES Emission.... time relationships. They each consist of a distinct nonrepetitive series of idle, acceleration, cruise...
40 CFR 86.115-78 - EPA urban dynamometer driving schedule.
Code of Federal Regulations, 2012 CFR
2012-07-01
... PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES Emission.... time relationships. They each consist of a distinct nonrepetitive series of idle, acceleration, cruise...
Layered Ensemble Architecture for Time Series Forecasting.
Rahman, Md Mustafizur; Islam, Md Monirul; Murase, Kazuyuki; Yao, Xin
2016-01-01
Time series forecasting (TSF) has been widely used in many application areas such as science, engineering, and finance. The phenomena generating time series are usually unknown, and the information available for forecasting is limited to the past values of the series. It is, therefore, necessary to use an appropriate number of past values, termed the lag, for forecasting. This paper proposes a layered ensemble architecture (LEA) for TSF problems. Our LEA consists of two layers, each of which uses an ensemble of multilayer perceptron (MLP) networks. While the first ensemble layer tries to find an appropriate lag, the second ensemble layer employs the obtained lag for forecasting. Unlike most previous work on TSF, the proposed architecture considers both accuracy and diversity of the individual networks in constructing an ensemble. LEA trains different networks in the ensemble by using different training sets, with the aim of maintaining diversity among the networks. However, it uses the appropriate lag and combines the best trained networks to construct the ensemble. This reflects LEA's emphasis on the accuracy of the networks. The proposed architecture has been tested extensively on time series data from the NN3 and NN5 competitions. It has also been tested on several standard benchmark time series data sets. In terms of forecasting accuracy, our experimental results have revealed clearly that LEA is better than other ensemble and nonensemble methods.
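A compact sketch in the spirit of the two-layer idea: the first layer scores candidate lags with a small MLP on a validation split, and the second layer trains a bootstrap ensemble of MLPs using the selected lag. The network sizes, the candidate lags, and the bootstrap scheme are simplifying assumptions rather than the exact LEA training procedure.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def make_xy(series, lag):
    """Turn a 1-D series into (lagged inputs, next value) pairs."""
    X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
    return X, series[lag:]

def lea_forecast(series, candidate_lags=(2, 4, 8, 12), n_members=5):
    split = int(0.8 * len(series))
    best_lag, best_err = None, np.inf
    for lag in candidate_lags:                       # layer 1: pick the lag
        X, y = make_xy(series[:split], lag)
        Xv, yv = make_xy(series[split - lag:], lag)
        m = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
        m.fit(X, y)
        err = np.mean((m.predict(Xv) - yv) ** 2)
        if err < best_err:
            best_lag, best_err = lag, err
    X, y = make_xy(series, best_lag)                 # layer 2: the ensemble
    rng = np.random.default_rng(0)
    members = []
    for i in range(n_members):
        idx = rng.integers(0, len(X), len(X))        # bootstrap for diversity
        m = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=i)
        members.append(m.fit(X[idx], y[idx]))
    last = series[-best_lag:].reshape(1, -1)
    return best_lag, np.mean([m.predict(last)[0] for m in members])

t = np.arange(300)
series = np.sin(2 * np.pi * t / 12) + 0.1 * np.random.default_rng(7).normal(size=300)
print(lea_forecast(series))   # (chosen lag, one-step-ahead forecast)
```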
Discovering time-lagged rules from microarray data using gene profile classifiers
2011-01-01
Background Gene regulatory networks have an essential role in every process of life. In this regard, the amount of genome-wide time series data is becoming increasingly available, providing the opportunity to discover the time-delayed gene regulatory networks that govern the majority of these molecular processes. Results This paper aims at reconstructing gene regulatory networks from multiple genome-wide microarray time series datasets. In this sense, a new model-free algorithm called GRNCOP2 (Gene Regulatory Network inference by Combinatorial OPtimization 2), which is a significant evolution of the GRNCOP algorithm, was developed using combinatorial optimization of gene profile classifiers. The method is capable of inferring potential time-delay relationships with any span of time between genes from various time series datasets given as input. The proposed algorithm was applied to time series data composed of twenty yeast genes that are highly relevant for the cell-cycle study, and the results were compared against several related approaches. The outcomes have shown that GRNCOP2 outperforms the contrasted methods in terms of the proposed metrics, and that the results are consistent with previous biological knowledge. Additionally, a genome-wide study on multiple publicly available time series data was performed. In this case, the experimentation has exhibited the soundness and scalability of the new method which inferred highly-related statistically-significant gene associations. Conclusions A novel method for inferring time-delayed gene regulatory networks from genome-wide time series datasets is proposed in this paper. The method was carefully validated with several publicly available data sets. The results have demonstrated that the algorithm constitutes a usable model-free approach capable of predicting meaningful relationships between genes, revealing the time-trends of gene regulation. PMID:21524308
Ghalyan, Najah F; Miller, David J; Ray, Asok
2018-06-12
Estimation of a generating partition is critical for symbolization of measurements from discrete-time dynamical systems, where a sequence of symbols from a (finite-cardinality) alphabet may uniquely specify the underlying time series. Such symbolization is useful for computing measures (e.g., Kolmogorov-Sinai entropy) to identify or characterize the (possibly unknown) dynamical system. It is also useful for time series classification and anomaly detection. The seminal work of Hirata, Judd, and Kilminster (2004) derives a novel objective function, akin to a clustering objective, that measures the discrepancy between a set of reconstruction values and the points from the time series. They cast estimation of a generating partition via the minimization of their objective function. Unfortunately, their proposed algorithm is nonconvergent, with no guarantee of finding even locally optimal solutions with respect to their objective. The difficulty is a heuristic nearest-neighbor symbol assignment step. Alternatively, we develop a novel, locally optimal algorithm for their objective. We apply iterative nearest-neighbor symbol assignments with guaranteed discrepancy descent, by which joint, locally optimal symbolization of the entire time series is achieved. While most previous approaches frame generating partition estimation as a state-space partitioning problem, we recognize that minimizing the Hirata et al. (2004) objective function does not induce an explicit partitioning of the state space, but rather of the space consisting of the entire time series (effectively, clustering in a (countably) infinite-dimensional space). Our approach also amounts to a novel type of sliding-block lossy source coding. Improvement, with respect to several measures, is demonstrated over popular methods for symbolizing chaotic maps. We also apply our approach to time-series anomaly detection, considering both chaotic maps and a failure application in a polycrystalline alloy material.
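The sketch below conveys the flavour of symbolization by iterative nearest-neighbour assignment, alternating assignments and re-estimation of reconstruction points on a delay embedding of the series; it is essentially k-means on the embedding and does not implement the Hirata et al. (2004) objective or the paper's descent guarantees.

```python
import numpy as np

def symbolize(series, m=2, n_symbols=4, n_iter=20, seed=0):
    """Crude symbolization sketch: embed the series in m dimensions, pick
    initial reconstruction points, then alternate (a) assigning each embedded
    point to its nearest reconstruction point and (b) re-estimating the
    reconstruction points from the assigned points."""
    emb = np.column_stack([series[i:len(series) - m + 1 + i] for i in range(m)])
    rng = np.random.default_rng(seed)
    centers = emb[rng.choice(len(emb), n_symbols, replace=False)]
    for _ in range(n_iter):
        d = np.linalg.norm(emb[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)                 # nearest-neighbour assignment
        for k in range(n_symbols):
            if np.any(labels == k):
                centers[k] = emb[labels == k].mean(axis=0)
    return labels

# logistic map as the test dynamical system
x = np.empty(2000); x[0] = 0.4
for t in range(1999):
    x[t + 1] = 3.9 * x[t] * (1 - x[t])
symbols = symbolize(x)
print(np.bincount(symbols))                       # occupancy of each symbol
```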
Big Data Analytics for Demand Response: Clustering Over Space and Time
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chelmis, Charalampos; Kolte, Jahanvi; Prasanna, Viktor K.
The pervasive deployment of advanced sensing infrastructure in Cyber-Physical systems, such as the Smart Grid, has resulted in an unprecedented data explosion. Such data exhibit both large volumes and high velocity, two of the three pillars of Big Data, and have a time-series notion, as datasets in this context typically consist of successive measurements made over a time interval. Time-series data can be valuable for data mining and analytics tasks such as identifying the "right" customers among a diverse population to target for Demand Response programs. However, time series are challenging to mine due to their high dimensionality. In this paper, we motivate this problem using a real application from the smart grid domain. We explore novel representations of time-series data for Big Data analytics, and propose a clustering technique for determining a natural segmentation of customers and identification of temporal consumption patterns. Our method is generalizable to large-scale, real-world scenarios, without making any assumptions about the data. We evaluate our technique using real datasets from smart meters, totaling ~ 18,200,000 data points, and show the efficacy of our technique in efficiently detecting the optimal number of clusters.
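A small-scale sketch of shape-based clustering of load profiles, assuming synthetic 24-hour consumption curves and scikit-learn's k-means; the paper's time-series representations and its procedure for choosing the number of clusters are not reproduced here.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical daily load profiles (kWh per hour) for 1,000 customers drawn
# from three usage shapes; the real datasets in the paper are much larger.
rng = np.random.default_rng(8)
hours = np.arange(24)
shapes = [np.exp(-0.5 * ((hours - 19) / 2.5) ** 2),   # evening-peak households
          np.full(24, 0.6),                           # flat commercial load
          np.exp(-0.5 * ((hours - 2) / 3.0) ** 2)]    # night-time usage
labels_true = rng.integers(0, 3, 1000)
profiles = np.array([shapes[k] * rng.uniform(1, 3) + rng.normal(0, 0.05, 24)
                     for k in labels_true])

# Normalise each profile by its daily total so clusters reflect the *shape*
# of consumption over time rather than its volume, then run k-means.
X = profiles / profiles.sum(axis=1, keepdims=True)
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print(np.bincount(km.labels_))                        # cluster sizes
```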
Time Series Observations of the 2015 Eclipse of b Persei (not beta Persei) (Abstract)
NASA Astrophysics Data System (ADS)
Collins, D. F.
2016-06-01
(Abstract only) The bright (V = 4.6) ellipsoidal variable b Persei consists of a close non-eclipsing binary pair that shows a nearly sinusoidal light curve with a ~1.5 day period. This system also contains a third star that orbits the binary pair every 702 days. AAVSO observers recently detected the first-ever optical eclipse of the A-B binary pair by the third star as a series of snapshots (D. Collins, R. Zavala, J. Sanborn - AAVSO Spring Meeting, 2013); the abstract was published in Collins, JAAVSO, 41, 2, 391 (2013), with b Per misprinted as beta Per therein. A follow-up eclipse campaign in mid-January 2015 recorded time-series observations. These new time-series observations clearly show multiple ingresses and egresses of each component of the binary system by the third star over the eclipse duration of 2 to 3 days. A simulation of the eclipse was created. Orbital and some astrophysical parameters were adjusted within constraints to give a reasonable fit to the observed light curve.
Zeng, Nianyin; Wang, Zidong; Li, Yurong; Du, Min; Cao, Jie; Liu, Xiaohui
2013-12-01
In this paper, the expectation maximization (EM) algorithm is applied to the modeling of the nano-gold immunochromatographic assay (nano-GICA) via available time series of the measured signal intensities of the test and control lines. The model for the nano-GICA is developed as the stochastic dynamic model that consists of a first-order autoregressive stochastic dynamic process and a noisy measurement. By using the EM algorithm, the model parameters, the actual signal intensities of the test and control lines, as well as the noise intensity can be identified simultaneously. Three different time series data sets concerning the target concentrations are employed to demonstrate the effectiveness of the introduced algorithm. Several indices are also proposed to evaluate the inferred models. It is shown that the model fits the data very well.
Detecting Abrupt Changes in a Piecewise Locally Stationary Time Series
Last, Michael; Shumway, Robert
2007-01-01
Non-stationary time series arise in many settings, such as seismology, speech-processing, and finance. In many of these settings we are interested in points where a model of local stationarity is violated. We consider the problem of how to detect these change-points, which we identify by finding sharp changes in the time-varying power spectrum. Several different methods are considered, and we find that the symmetrized Kullback-Leibler information discrimination performs best in simulation studies. We derive asymptotic normality of our test statistic, and consistency of estimated change-point locations. We then demonstrate the technique on the problem of detecting arrival phases in earthquakes. PMID:19190715
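A minimal sketch of the spectral change-point idea: two adjacent windows slide along the series, the symmetrised Kullback-Leibler discrimination between their normalised spectral estimates is computed, and the peak score marks a candidate change-point. The window length, tapering, and the synthetic "seismogram" are illustrative choices, not the paper's estimator or its asymptotic test.

```python
import numpy as np

def symmetric_kl(p, q):
    """Symmetrised Kullback-Leibler discrimination between two normalised
    spectral estimates."""
    p = p / p.sum()
    q = q / q.sum()
    return np.sum((p - q) * (np.log(p) - np.log(q)))

def spectrum(x):
    """Tapered periodogram of one window (small offset avoids log(0))."""
    x = x - x.mean()
    return np.abs(np.fft.rfft(x * np.hanning(len(x)))) ** 2 + 1e-12

def scan_changepoint(series, win=128):
    """Slide two adjacent windows along the series and score the spectral
    discrepancy between them; the peak marks a candidate change-point."""
    centers = range(win, len(series) - win)
    scores = [symmetric_kl(spectrum(series[c - win:c]),
                           spectrum(series[c:c + win])) for c in centers]
    return list(centers)[int(np.argmax(scores))]

# synthetic signal whose dominant frequency changes at t = 600
rng = np.random.default_rng(9)
t = np.arange(1200)
x = np.where(t < 600, np.sin(2 * np.pi * 0.05 * t), np.sin(2 * np.pi * 0.15 * t))
x = x + 0.3 * rng.normal(size=1200)
print(scan_changepoint(x))   # close to 600
```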
NASA Astrophysics Data System (ADS)
Zingone, Adriana; Harrison, Paul J.; Kraberg, Alexandra; Lehtinen, Sirpa; McQuatters-Gollop, Abigail; O'Brien, Todd; Sun, Jun; Jakobsen, Hans H.
2015-09-01
Phytoplankton diversity and its variation over an extended time scale can provide answers to a wide range of questions relevant to societal needs. These include human health, the safe and sustained use of marine resources and the ecological status of the marine environment, including long-term changes under the impact of multiple stressors. The analysis of phytoplankton data collected at the same place over time, as well as the comparison among different sampling sites, provide key information for assessing environmental change, and evaluating new actions that must be made to reduce human induced pressures on the environment. To achieve these aims, phytoplankton data may be used several decades later by users that have not participated in their production, including automatic data retrieval and analysis. The methods used in phytoplankton species analysis vary widely among research and monitoring groups, while quality control procedures have not been implemented in most cases. Here we highlight some of the main differences in the sampling and analytical procedures applied to phytoplankton analysis and identify critical steps that are required to improve the quality and inter-comparability of data obtained at different sites and/or times. Harmonization of methods may not be a realistic goal, considering the wide range of purposes of phytoplankton time-series data collection. However, we propose that more consistent and detailed metadata and complementary information be recorded and made available along with phytoplankton time-series datasets, including description of the procedures and elements allowing for a quality control of the data. To keep up with the progress in taxonomic research, there is a need for continued training of taxonomists, and for supporting and complementing existing web resources, in order to allow a constant upgrade of knowledge in phytoplankton classification and identification. Efforts towards the improvement of metadata recording, data annotation and quality control procedures will ensure the internal consistency of phytoplankton time series and facilitate their comparability and accessibility, thus strongly increasing the value of the precious information they provide. Ultimately, the sharing of quality controlled data will allow one to recoup the high cost of obtaining the data through the multiple use of the time-series data in various projects over many decades.
Meteorological factors for PM10 concentration levels in Northern Spain
NASA Astrophysics Data System (ADS)
Santurtún, Ana; Mínguez, Roberto; Villar-Fernández, Alejandro; González Hidalgo, Juan Carlos; Zarrabeitia, María Teresa
2013-04-01
Atmospheric particulate matter (PM) is made up of a mixture of solid and aqueous species which enter the atmosphere by anthropogenic and natural pathways. The levels and composition of ambient air PM depend on the climatology and on the geography (topography, soil cover, proximity to arid zones or to the coast) of a given region. Spain has particular difficulties in achieving compliance with the limit values established by the European Union (based on recommendations from the World Health Organization) for particulate matter with a diameter of 10 micrometers or less (PM10), but anthropogenic emissions are not solely responsible for this: some studies show that PM10 concentrations originating from these kinds of sources are similar to what is found in other European countries, while some of the geographical features of the Iberian Peninsula (such as African mineral dust intrusion, soil aridity or rainfall) are proven to be a factor in higher PM concentrations. This work aims to describe PM10 concentration levels in Cantabria (Northern Spain) and their relationship with the following meteorological variables: rainfall, solar radiation, temperature, barometric pressure and wind speed. The data consist of daily series of PM10 concentrations from 4 different urban-background stations, obtained from hourly data records for the 2000-2010 period, and daily series of the meteorological variables provided by the Spanish National Meteorology Agency. The method used for establishing the relationships between these variables consists of several steps: i) fitting a non-stationary probability density function for each variable, accounting for long-term trends, seasonality during the year and possible seasonality during the week to distinguish between work and weekend days; ii) using the marginal distribution function obtained, transforming the time series of historical values of each variable into a normalized Gaussian time series, which allows time series models to be used consistently; iii) fitting a time series model (autoregressive moving average, ARMA) to the transformed historical values in order to eliminate the temporal autocorrelation structure of each stochastic process, obtaining a white noise for each variable; and finally, iv) calculating cross correlations between the white noises at different time lags. These cross correlations allow characterization of the true correlation between signals, avoiding the problems induced by data scaling or autocorrelations inherent to each signal. Results provide the relationship and possible contribution to PM10 concentration levels associated with each meteorological variable. This information can be used to improve the forecasting of PM10 concentration levels using existing meteorological forecasts.
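Steps i)-iv) above can be imitated in a few lines; the sketch below uses rank-based normal scores for the Gaussian transform and a low-order ARMA fit from statsmodels for pre-whitening, which are simplifications of the non-stationary marginal fits described in the abstract. The PM10 and wind data are synthetic.

```python
import numpy as np
from scipy import stats
from statsmodels.tsa.arima.model import ARIMA

def to_gaussian(x):
    """Empirical version of step ii): map each value through its empirical
    CDF and then through the inverse normal CDF (rank-based normal scores)."""
    ranks = stats.rankdata(x)
    return stats.norm.ppf(ranks / (len(x) + 1))

def prewhiten(x, order=(1, 0, 0)):
    """Step iii): fit a low-order ARMA model and keep the residual noise."""
    return ARIMA(x, order=order).fit().resid

# synthetic daily PM10 and wind speed sharing a common signal at lag 0
rng = np.random.default_rng(10)
common = np.convolve(rng.normal(size=1100), np.ones(5) / 5, mode="valid")
pm10 = np.exp(1 + 0.8 * common + 0.3 * rng.normal(size=len(common)))
wind = -common + 0.5 * rng.normal(size=len(common))

# step iv): cross correlations between the pre-whitened series at several lags
e1 = prewhiten(to_gaussian(pm10))
e2 = prewhiten(to_gaussian(wind))
lags = range(-5, 6)
cc = [np.corrcoef(e1[max(0, -k):len(e1) - max(0, k)],
                  e2[max(0, k):len(e2) - max(0, -k)])[0, 1] for k in lags]
print(dict(zip(lags, np.round(cc, 2))))   # strongest (negative) link at lag 0
```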
Ramifications of a potential gap in passive microwave data for the long-term sea ice climate record
NASA Astrophysics Data System (ADS)
Meier, W.; Stewart, J. S.
2017-12-01
The time series of sea ice concentration and extent from passive microwave sensors is one of the longest satellite-derived climate records, and the significant decline in Arctic sea ice extent is one of the most iconic indicators of climate change. However, this continuous and consistent record is under threat due to the looming gap in passive microwave sensor coverage. The record started in late 1978 with the launch of the Scanning Multichannel Microwave Radiometer (SMMR) and has continued with a series of Special Sensor Microwave Imager (SSMI) and Special Sensor Microwave Imager and Sounder (SSMIS) instruments on U.S. Defense Meteorological Satellite Program (DMSP) satellites. The data from the different sensors are intercalibrated at the algorithm level by adjusting algorithm coefficients so that the output sea ice data are as consistent as possible between the older and the newer sensor. A key aspect in constructing the time series is to have at least two sensors operating simultaneously so that data from the older and newer sensor can be obtained from the same locations. However, with the recent losses of DMSP F19 and F20, the remaining SSMIS sensors are all well beyond their planned mission lifetime. This means that the risk of failure is not small and is increasing with each day of operation. The newest passive microwave sensor, the JAXA Advanced Microwave Scanning Radiometer-2 (AMSR2), is a potential contributor to the time series (though it too is now beyond its planned 5-year mission lifetime). However, AMSR2's larger antenna and higher spatial resolution present a challenge in integrating its data with the rest of the sea ice record, because the ice edge is quite sensitive to the sensor resolution, which substantially affects the total sea ice extent and area estimates. This will need to be adjusted for if AMSR2 is used to continue the time series. Here we will discuss efforts at NSIDC to integrate AMSR2 estimates into the sea ice climate record if needed. We will also discuss potential contingency plans, such as using operational sea ice charts, to fill any gaps. This would allow the record to continue, but the consistency of the time series would be degraded because the ice charts use human analysis and differing sources, amounts and quality of input data, which makes them sub-optimal for long-term climate records.
Consistent Long-Time Series of GPS Satellite Antenna Phase Center Corrections
NASA Astrophysics Data System (ADS)
Steigenberger, P.; Schmid, R.; Rothacher, M.
2004-12-01
The current IGS processing strategy disregards satellite antenna phase center variations (pcvs) depending on the nadir angle and applies block-specific phase center offsets only. However, the transition from relative to absolute receiver antenna corrections presently under discussion necessitates the consideration of satellite antenna pcvs. Moreover, studies of several groups have shown that the offsets are not homogeneous within a satellite block. Manufacturer specifications seem to confirm this assumption. In order to get best possible antenna corrections, consistent ten-year time series (1994-2004) of satellite-specific pcvs and offsets were generated. This challenging effort became possible as part of the reprocessing of a global GPS network currently performed by the Technical Universities of Munich and Dresden. The data of about 160 stations since the official start of the IGS in 1994 have been reprocessed, as today's GPS time series are mostly inhomogeneous and inconsistent due to continuous improvements in the processing strategies and modeling of global GPS solutions. An analysis of the signals contained in the time series of the phase center offsets demonstrates amplitudes on the decimeter level, at least one order of magnitude worse than the desired accuracy. The periods partly arise from the GPS orbit configuration, as the orientation of the orbit planes with regard to the inertial system repeats after about 350 days due to the rotation of the ascending nodes. In addition, the rms values of the X- and Y-offsets show a high correlation with the angle between the orbit plane and the direction to the sun. The time series of the pcvs mainly point at the correlation with the global terrestrial scale. Solutions with relative and absolute phase center corrections, with block- and satellite-specific satellite antenna corrections demonstrate the effect of this parameter group on other global GPS parameters such as the terrestrial scale, station velocities, the geocenter position or the tropospheric delays. Thus, deeper insight into the so-called `Bermuda triangle' of several highly correlated parameters is given.
ERIC Educational Resources Information Center
LEVISON, MELVIN E.
This project tested a method for developing "audio-visual literacy" and, at the same time, an empathic understanding of another civilization through the use of a series of select films. The population consisted of 28 teachers in an in-service course and classes later taught by the in-service trained teachers in five secondary schools--three…
Assessing the Suitability of Historical PM(2.5) Element Measurements for Trend Analysis.
Hyslop, Nicole P; Trzepla, Krystyna; White, Warren H
2015-08-04
The IMPROVE (Interagency Monitoring of Protected Visual Environments) network has characterized fine particulate matter composition at locations throughout the United States since 1988. A main objective of the network is to evaluate long-term trends in aerosol concentrations. Measurements inevitably advance over time, but changes in measurement technique have the potential to confound the interpretation of long-term trends. Problems of interpretation typically arise from changing biases, and changes in bias can be difficult to identify without comparison data that are consistent throughout the measurement series, which rarely exist. We created a consistent measurement series for exactly this purpose by reanalyzing the 15-year archives (1995-2009) of aerosol samples from three sites (Great Smoky Mountains National Park, Mount Rainier National Park, and Point Reyes National Seashore) as single batches using consistent analytical methods. In most cases, trend estimates based on the original and reanalysis measurements are statistically different for elements that were not measured above the detection limit consistently over the years (e.g., Na, Cl, Si, Ti, V, Mn). The original trends are more reliable for elements consistently measured above the detection limit. All but one of the 23 site-element series with detection rates >80% had statistically indistinguishable original and reanalysis trends (overlapping 95% confidence intervals).
30 CFR 18.98 - Enclosures, joints, and fastenings; pressure testing.
Code of Federal Regulations, 2013 CFR
2013-07-01
... consistent with unyielding components during a pressure-time history as derived from a series of oscillograms...; pressure testing. (a) Cast or welded enclosures shall be designed to withstand a minimum internal pressure...
30 CFR 18.98 - Enclosures, joints, and fastenings; pressure testing.
Code of Federal Regulations, 2012 CFR
2012-07-01
... consistent with unyielding components during a pressure-time history as derived from a series of oscillograms...; pressure testing. (a) Cast or welded enclosures shall be designed to withstand a minimum internal pressure...
30 CFR 18.98 - Enclosures, joints, and fastenings; pressure testing.
Code of Federal Regulations, 2014 CFR
2014-07-01
... consistent with unyielding components during a pressure-time history as derived from a series of oscillograms...; pressure testing. (a) Cast or welded enclosures shall be designed to withstand a minimum internal pressure...
30 CFR 18.98 - Enclosures, joints, and fastenings; pressure testing.
Code of Federal Regulations, 2010 CFR
2010-07-01
... consistent with unyielding components during a pressure-time history as derived from a series of oscillograms...; pressure testing. (a) Cast or welded enclosures shall be designed to withstand a minimum internal pressure...
30 CFR 18.98 - Enclosures, joints, and fastenings; pressure testing.
Code of Federal Regulations, 2011 CFR
2011-07-01
... consistent with unyielding components during a pressure-time history as derived from a series of oscillograms...; pressure testing. (a) Cast or welded enclosures shall be designed to withstand a minimum internal pressure...
Derivation of GNSS derived station velocities for a surface deformation model in the Austrian region
NASA Astrophysics Data System (ADS)
Umnig, Elke; Weber, Robert; Maras, Jadre; Brückl, Ewald
2016-04-01
This contribution presents the first comprehensive analysis of GNSS-derived surface velocities computed within an observation network of about 100 stations covering the whole Austrian territory and parts of the neighbouring countries. Coordinate time series are now available spanning a period of 5 years (2010.0-2015.0) for one focus area in East Austria and one and a half years (2013.5-2015.0) for the remaining part of the tracking network. The data series stem from two different GNSS campaigns. The former was set up to investigate intra-plate tectonic movements within the framework of the project ALPAACT (seismological and geodetic monitoring of ALpine-PAnnonian ACtive Tectonics), while the latter was designed to support a variety of requests, e.g. the derivation of GNSS-derived water vapour fields, but also to expand the aforementioned tectonic studies. In addition, the activities within the ALPAACT project supplement the educational initiative SCHOOLS & QUAKES, where scholars contribute to seismological research. For the whole period of the processed coordinate time series, daily solutions have been computed by means of the Bernese software. The processed coordinate time series are tied to the global reference frame ITRF2000 as well as to the frame ITRF2008. Due to the transition of the reference frame from ITRF2000 to ITRF2008 within the processing period, but also due to updates of the Bernese software from version 5.0 to 5.2, the time series were initially not fully consistent and had to be re-aligned to a common frame. The goal of this investigation is therefore to derive a nationwide consistent horizontal motion field from GNSS reference station data within the ITRF2008 frame, but also with respect to the Eurasian plate. In this presentation we focus on the set-up of the coordinate time series and on the problem of frame alignment. Special attention is also paid to the separation into linear and periodic motion signals, originating from tectonic or non-tectonic sources.
Ren, Meng; Li, Na; Wang, Zhan; Liu, Yisi; Chen, Xi; Chu, Yuanyuan; Li, Xiangyu; Zhu, Zhongmin; Tian, Liqiao; Xiang, Hao
2017-01-01
Few studies have compared different methods when exploring the short-term effects of air pollutants on respiratory disease mortality in Wuhan, China. This study assesses the association between air pollutants and respiratory disease mortality with both time-series and time-stratified case-crossover designs. The generalized additive model (GAM) and the conditional logistic regression model were used to assess the short-term effects of air pollutants on respiratory disease mortality. Stratified analyses were performed by age, sex, and diseases. A 10 μg/m3 increment in SO2 level was associated with an increase in relative risk for all respiratory disease mortality of 2.4% and 1.9% in the case-crossover and time-series analyses in single pollutant models, respectively. Strong evidence of an association between NO2 and daily respiratory disease mortality among men or people older than 65 years was found in the case-crossover study. There was a positive association between air pollutants and respiratory disease mortality in Wuhan, China. Both time-series and case-crossover analyses consistently reveal the association between three air pollutants and respiratory disease mortality. The estimates of the association between air pollution and respiratory disease mortality from the case-crossover analysis displayed greater variation than those from the time-series analysis. PMID:28084399
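As a minimal sketch of the time-series arm of such an analysis, the snippet below fits a Poisson regression of synthetic daily respiratory deaths on a hypothetical SO2 series with crude harmonic seasonal terms (a simplification of the smooth functions of a full GAM), then converts the pollutant coefficient into a relative risk per 10 μg/m3. It assumes statsmodels is available; all data, names and parameter values are illustrative, not the study's dataset or model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_days = 1500
t = np.arange(n_days)

# Hypothetical daily SO2 concentrations (ug/m3) and respiratory death counts.
so2 = 20 + 10 * rng.random(n_days)
season = np.column_stack([np.sin(2 * np.pi * t / 365.25),
                          np.cos(2 * np.pi * t / 365.25)])
true_log_rate = np.log(5) + 0.0024 * so2 + 0.2 * season[:, 0]
deaths = rng.poisson(np.exp(true_log_rate))

# Time-series design: intercept, pollutant, crude seasonal harmonics.
X = sm.add_constant(np.column_stack([so2, season]))
fit = sm.GLM(deaths, X, family=sm.families.Poisson()).fit()

beta_so2 = fit.params[1]
rr_per_10 = np.exp(10 * beta_so2)  # relative risk per 10 ug/m3 increment
print(f"Estimated RR per 10 ug/m3 SO2: {rr_per_10:.3f}")
```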
Impact of Sensor Degradation on the MODIS NDVI Time Series
NASA Technical Reports Server (NTRS)
Wang, Dongdong; Morton, Douglas; Masek, Jeffrey; Wu, Aisheng; Nagol, Jyoteshwar; Xiong, Xiaoxiong; Levy, Robert; Vermote, Eric; Wolfe, Robert
2011-01-01
Time series of satellite data provide unparalleled information on the response of vegetation to climate variability. Detecting subtle changes in vegetation over time requires consistent satellite-based measurements. Here, we evaluated the impact of sensor degradation on trend detection using Collection 5 data from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensors on the Terra and Aqua platforms. For Terra MODIS, the impact of blue band (Band 3, 470 nm) degradation on simulated surface reflectance was most pronounced at near-nadir view angles, leading to a 0.001-0.004/yr decline in Normalized Difference Vegetation Index (NDVI) under a range of simulated aerosol conditions and surface types. Observed trends in MODIS NDVI over North America were consistent with simulated results, with nearly a threefold difference in negative NDVI trends derived from the Terra (17.4%) and Aqua (6.7%) MODIS sensors during 2002-2010. Planned adjustments to the Terra MODIS calibration for the Collection 6 data reprocessing will largely eliminate this negative bias in NDVI trends over vegetation.
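A hedged sketch of the kind of per-pixel trend calculation such a comparison rests on: ordinary least-squares NDVI slopes over 2002-2010 computed on a synthetic image stack. The array shapes and the imposed decline are assumptions for illustration, not MODIS data.

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(2002, 2011)                                   # 2002-2010, as in the study period
ndvi = 0.6 + 0.01 * rng.standard_normal((years.size, 50, 50))   # hypothetical stack (year, row, col)
ndvi -= 0.002 * (years - years[0])[:, None, None]               # impose a small artificial decline

# Per-pixel linear trend (NDVI units per year) via least squares on the time axis.
t = years - years.mean()
flat = ndvi.reshape(years.size, -1)
slopes = (t @ (flat - flat.mean(axis=0))) / (t @ t)
slopes = slopes.reshape(50, 50)

frac_negative = (slopes < 0).mean()
print(f"Fraction of pixels with negative NDVI trend: {frac_negative:.2%}")
```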
Time series modeling by a regression approach based on a latent process.
Chamroukhi, Faicel; Samé, Allou; Govaert, Gérard; Aknin, Patrice
2009-01-01
Time series are used in many domains, including finance, engineering, economics and bioinformatics, generally to represent the change of a measurement over time. Modeling techniques may then be used to give a synthetic representation of such data. A new approach for time series modeling is proposed in this paper. It consists of a regression model incorporating a discrete hidden logistic process that allows switching, smoothly or abruptly, between different polynomial regression models. The model parameters are estimated by the maximum likelihood method performed by a dedicated Expectation Maximization (EM) algorithm. The M-step of the EM algorithm uses a multi-class Iterative Reweighted Least-Squares (IRLS) algorithm to estimate the hidden process parameters. To evaluate the proposed approach, an experimental study on simulated data and real-world data was performed using two alternative approaches: a heteroskedastic piecewise regression model using a global optimization algorithm based on dynamic programming, and a Hidden Markov Regression Model whose parameters are estimated by the Baum-Welch algorithm. Finally, in the context of the remote monitoring of components of the French railway infrastructure, and more particularly the switch mechanism, the proposed approach has been applied to modeling and classifying time series representing condition measurements acquired during switch operations.
Spectral signatures of jumps and turbulence in interplanetary speed and magnetic field data
NASA Technical Reports Server (NTRS)
Roberts, D. A.; Goldstein, M. L.
1987-01-01
It is shown here that, consistent with a suggestion of Burlaga and Mish (1987), the f^-2 spectra in the magnitudes of the magnetic and velocity fields in the solar wind result from jumps due to various rapid changes in the time series for these quantities. If these jumps are removed from the data, the spectra of the resulting 'difference' time series have the f^-5/3 form. It is concluded that f^-2 spectra in these magnitudes arise from phase coherent structures that can be distinguished clearly from incoherent turbulent fluctuations.
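The sketch below only illustrates the mechanics suggested by this result, under synthetic assumptions: build a series containing a few large jumps, remove the jumps from the increment ("difference") series, and compare the log-log spectral slopes before and after. With real solar-wind magnitudes the raw slope is near -2 and the jump-free slope near -5/3; the toy Gaussian increments used here do not reproduce those exponents.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4096
increments = rng.standard_normal(n)
jump_times = rng.choice(n, 20, replace=False)
increments[jump_times] += rng.choice([-1, 1], 20) * 15.0   # a few large jumps
series = np.cumsum(increments)

def power_spectrum(x):
    """Return frequencies and raw FFT power, skipping the zero-frequency bin."""
    x = x - x.mean()
    p = np.abs(np.fft.rfft(x)) ** 2
    f = np.fft.rfftfreq(x.size)
    return f[1:], p[1:]

# "Difference" series with the largest increments (the jumps) clipped out.
threshold = 5 * np.median(np.abs(increments))
cleaned = np.where(np.abs(increments) > threshold, 0.0, increments)

f, p_raw = power_spectrum(series)
_, p_clean = power_spectrum(np.cumsum(cleaned))

# Compare spectral slopes from a log-log fit before and after jump removal.
slope_raw = np.polyfit(np.log(f), np.log(p_raw), 1)[0]
slope_clean = np.polyfit(np.log(f), np.log(p_clean), 1)[0]
print(slope_raw, slope_clean)
```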
Sun, J
1995-09-01
In this paper we discuss the non-parametric estimation of a distribution function based on incomplete data for which the measurement origin of a survival time or the date of enrollment in a study is known only to belong to an interval. Also, the survival time of interest itself is observed from a truncated distribution and is known only to lie in an interval. To estimate the distribution function, a simple self-consistency algorithm, a generalization of Turnbull's (1976, Journal of the Royal Statistical Society, Series B 38, 290-295) self-consistency algorithm, is proposed. This method is then used to analyze two AIDS cohort studies, for which direct use of the EM algorithm (Dempster, Laird and Rubin, 1977, Journal of the Royal Statistical Society, Series B 39, 1-38), which is computationally complicated, has previously been the usual method of analysis.
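A minimal sketch of a Turnbull-style self-consistency iteration for interval-censored observations, ignoring the truncation handling that the paper's generalization addresses; the intervals and support grid are illustrative assumptions.

```python
import numpy as np

# Interval-censored observations: each survival time is only known to lie in [L, R].
L = np.array([1.0, 2.0, 0.0, 3.0, 1.5, 2.5])
R = np.array([3.0, 4.0, 2.0, 6.0, 3.5, 5.0])

# Candidate support points (here simply the interval endpoints).
support = np.unique(np.concatenate([L, R]))
alpha = (support >= L[:, None]) & (support <= R[:, None])   # membership indicators

p = np.full(support.size, 1.0 / support.size)               # initial probability mass
for _ in range(500):
    # Split each observation's unit mass over the support points it allows.
    weights = alpha * p
    weights /= weights.sum(axis=1, keepdims=True)
    # Self-consistency update: average the redistributed mass over observations.
    p_new = weights.mean(axis=0)
    if np.max(np.abs(p_new - p)) < 1e-10:
        break
    p = p_new

survival = 1.0 - np.cumsum(p)   # estimated survival function at the support points
print(dict(zip(support.round(2), survival.round(3))))
```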
Approximate scaling properties of RNA free energy landscapes
NASA Technical Reports Server (NTRS)
Baskaran, S.; Stadler, P. F.; Schuster, P.
1996-01-01
RNA free energy landscapes are analysed by means of "time series" that are obtained from random walks restricted to excursion sets. The power spectra, the scaling of the jump size distribution, and the scaling of the curve length measured with different yardstick lengths are used to describe the structure of these "time series". Although they are stationary by construction, we find that their local behavior is consistent with both AR(1) and self-affine processes. Random walks confined to excursion sets (i.e., with the restriction that the fitness value exceeds a certain threshold at each step) exhibit essentially the same statistics as free random walks. We find that an AR(1) time series is in general approximately self-affine on timescales up to approximately the correlation length. We present an empirical relation between the correlation parameter rho of the AR(1) model and the exponents characterizing self-affinity.
The construction of a Central Netherlands temperature
NASA Astrophysics Data System (ADS)
van der Schrier, G.; van Ulden, A.; van Oldenborgh, G. J.
2011-05-01
The Central Netherlands Temperature (CNT) is a monthly daily mean temperature series constructed from homogenized time series from the centre of the Netherlands. The purpose of this series is to offer a homogeneous time series representative of a larger area in order to study large-scale temperature changes. It will also facilitate comparison with climate models, which resolve similar scales. From 1906 onwards, temperature measurements in the Netherlands have been sufficiently standardized to construct a high-quality series. Long time series have been constructed by merging nearby stations and using the overlap to calibrate the differences. These long time series, and a few time series of only a few decades in length, have been subjected to a homogeneity analysis in which significant breaks and artificial trends have been corrected. Many of the detected breaks correspond to changes in the observations that are documented in the station metadata. This version of the CNT, to which we attach the version number 1.1, is constructed as the unweighted average of four stations (De Bilt, Winterswijk/Hupsel, Oudenbosch/Gilze-Rijen and Gemert/Volkel) with the stations Eindhoven and Deelen added from 1951 and 1958 onwards, respectively. The global gridded datasets used for detecting and attributing climate change are based on raw observational data. Although some homogeneity adjustments are made, these are not based on knowledge of local circumstances but only on statistical evidence. Despite this handicap, and the fact that these datasets use grid boxes far larger than the area associated with the Central Netherlands Temperature, the temperature interpolated to the CNT region shows a warming trend that is broadly consistent with the CNT trend in all of these datasets. The actual trends differ from the CNT trend by up to 30%, which highlights the need to base future global gridded temperature datasets on homogenized time series.
Jasper Seamount: seven million years of volcanism
Pringle, M.S.; Staudigel, H.; Gee, J.
1991-01-01
Jasper Seamount is a young, mid-sized (690 km3) oceanic intraplate volcano located about 500 km west-southwest of San Diego, California. Reliable 40Ar/39Ar age data were obtained for several milligram-sized samples of 4 to 10 Ma plagioclase by using a defocused laser beam to clean the samples before fusion. Gee and Staudigel suggested that Jasper Seamount consists of a transitional to tholeiitic shield volcano formed by flank transitional series lavas, overlain by flank alkalic series lavas and summit alkalic series lavas. Twenty-nine individual 40Ar/39Ar laser fusion analyses on nine samples confirm the stratigraphy: 10.3-10.0 Ma for the flank transitional series, 8.7-7.5 Ma for the flank alkalic series, and 4.8-4.1 Ma for the summit alkalic series. The alkalinity of the lavas clearly increases with time, and there appear to be 1 to 3 m.y. hiatuses between each series. -from Authors
A novel time series link prediction method: Learning automata approach
NASA Astrophysics Data System (ADS)
Moradabadi, Behnaz; Meybodi, Mohammad Reza
2017-09-01
Link prediction is a key social network problem that uses the network structure to predict future links. Common link prediction approaches predict hidden links from a static graph representation, where a snapshot of the network is analyzed to find hidden or future links. For example, similarity-metric-based link prediction is a common traditional approach that calculates a similarity metric for each non-connected pair, sorts the pairs by their similarity metrics and labels those with higher similarity scores as future links. Because people's activities in social networks are dynamic and uncertain, and the structure of the networks changes over time, using deterministic graphs for modeling and analysis of a social network may not be appropriate. In the time-series link prediction problem, the time series of link occurrences is used to predict future links. In this paper, we propose a new time-series link prediction method based on learning automata. In the proposed algorithm, for each link that must be predicted there is one learning automaton, and each learning automaton tries to predict the existence or non-existence of the corresponding link. To predict the link occurrence at time T, there is a chain consisting of stages 1 through T - 1, and the learning automaton passes through these stages to learn the existence or non-existence of the corresponding link. Our preliminary link prediction experiments with co-authorship and email networks have provided satisfactory results when time series of link occurrences are considered.
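A hedged sketch of the core update for one link: a two-action learning automaton with a linear reward-inaction scheme walks through the link's occurrence history and reinforces whichever action matches each observation. The function name, learning rate and reward rule are illustrative assumptions, not the authors' exact scheme.

```python
import numpy as np

def predict_link(occurrences, learning_rate=0.1):
    """Two-action learning automaton: action 0 = 'link exists', action 1 = 'link absent'.

    `occurrences` is the 0/1 history of the link over stages 1..T-1; the automaton walks
    through the stages and reinforces whichever action matches the observation (linear
    reward-inaction scheme), then reports its final probability that the link exists.
    """
    p = np.array([0.5, 0.5])
    rng = np.random.default_rng(0)
    for observed in occurrences:
        action = rng.choice(2, p=p)
        rewarded = (action == 0) == bool(observed)
        if rewarded:                                  # reward: move probability toward chosen action
            p[action] += learning_rate * (1.0 - p[action])
            p[1 - action] = 1.0 - p[action]
        # inaction on penalty: probabilities stay unchanged
    return p[0]

history = [1, 1, 0, 1, 1, 1, 0, 1]                    # hypothetical link occurrences over T-1 stages
print(f"P(link at time T) ~ {predict_link(history):.2f}")
```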
Combinations of Earth Orientation Measurements: SPACE2005, COMB2005, and POLE2005
NASA Technical Reports Server (NTRS)
Gross, Richard S.
2006-01-01
Independent Earth orientation measurements taken by the space-geodetic techniques of lunar and satellite laser ranging, by very long baseline interferometry, and by the Global Positioning System have been combined using a Kalman filter. The resulting combined Earth orientation series, SPACE2005, consists of values and uncertainties for Universal Time, polar motion, and their rates that span from September 28, 1976, to January 7, 2006, at daily intervals and is available in versions whose epochs are given at either midnight or noon. The space-geodetic measurements used to generate SPACE2005 have then been combined with optical astrometric measurements to form two additional combined Earth orientation series: (1) COMB2005, consisting of values and uncertainties for Universal Time, polar motion, and their rates that span from January 20, 1962, to January 7, 2006, at daily intervals and which is also available in versions whose epochs are given at either midnight or noon; and (2) POLE2005, consisting of values and uncertainties for polar motion and its rate that span from January 20, 1900, to December 21, 2005, at 30.4375-day intervals.
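A hedged one-dimensional sketch of the combination idea: two independent, differently noisy measurement series of the same slowly varying quantity are fused with a random-walk Kalman filter. The noise levels and series are synthetic stand-ins, not the space-geodetic inputs to SPACE2005.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 365
truth = np.cumsum(0.05 * rng.standard_normal(n))        # random-walk "polar motion" signal
series_a = truth + 0.10 * rng.standard_normal(n)        # e.g., a VLBI-like series
series_b = truth + 0.20 * rng.standard_normal(n)        # e.g., a noisier SLR-like series

q = 0.05 ** 2                                           # process (random-walk) variance per day
r = np.array([0.10 ** 2, 0.20 ** 2])                    # measurement variances

x, P = 0.0, 1.0
combined = np.empty(n)
for k in range(n):
    P += q                                              # predict step (state expectation unchanged)
    for z, rv in zip((series_a[k], series_b[k]), r):    # sequentially update with both series
        K = P / (P + rv)
        x += K * (z - x)
        P *= (1.0 - K)
    combined[k] = x

print("rms error:", np.sqrt(np.mean((combined - truth) ** 2)))
```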
Combinations of Earth Orientation Measurements: SPACE2003, COMB2003, and POLE2003
NASA Technical Reports Server (NTRS)
Gross, Richard S.
2004-01-01
Independent Earth orientation measurements taken by the space-geodetic techniques of lunar and satellite laser ranging, very long baseline interferometry, and the global positioning system have been combined using a Kalman filter. The resulting combined Earth orientation series, SPACE2003, consists of values and uncertainties for Universal Time, polar motion, and their rates that span from September 28.0, 1976 to January 31.0, 2004 at daily intervals and is available in versions whose epochs are given at either midnight or noon. The space-geodetic measurements used to generate SPACE2003 have then been combined with optical astrometric measurements to form two additional combined Earth orientation series: (1) COMB2003, consisting of values and uncertainties for Universal Time, polar motion, and their rates that span from January 20.0, 1962 to January 31.0, 2004 at daily intervals and which is also available in versions whose epochs are given at either midnight or noon, and (2) POLE2003, consisting of values and uncertainties for polar motion and its rate that span from January 20, 1900 to January 21, 2004 at 30.4375-day intervals.
Combinations of Earth Orientation Measurements: SPACE2004, COMB2004, and POLE2004
NASA Technical Reports Server (NTRS)
Gross, Richard R.
2005-01-01
Independent Earth orientation measurements taken by the space-geodetic techniques of lunar and satellite laser ranging, very long baseline interferometry, and the global positioning system have been combined using a Kalman filter. The resulting combined Earth orientation series, SPACE2004, consists of values and uncertainties for Universal Time, polar motion, and their rates that span from September 28, 1976, to January 22, 2005, at daily intervals and is available in versions whose epochs are given at either midnight or noon. The space-geodetic measurements used to generate SPACE2004 have then been combined with optical astrometric measurements to form two additional combined Earth orientation series: (1) COMB2004, consisting of values and uncertainties for Universal Time, polar motion, and their rates that span from January 20, 1962, to January 22, 2005, at daily intervals and which is also available in versions whose epochs are given at either midnight or noon, and (2) POLE2004, consisting of values and uncertainties for polar motion and its rate that span from January 20, 1900, to January 20, 2005, at 30.4375-day intervals.
Combinations of Earth Orientation Measurements: SPACE2014, COMB2014, and POLE2014
NASA Technical Reports Server (NTRS)
Ratcliff, J. T.; Gross, R. S.
2015-01-01
Independent Earth orientation measurements taken by the space-geodetic techniques of lunar and satellite laser ranging, very long baseline interferometry, and the Global Positioning System have been combined using a Kalman filter. The resulting combined Earth orientation series, SPACE2013, consists of values and uncertainties for Universal Time, polar motion, and their rates that span from September 28, 1976, to June 30, 2014, at daily intervals and is available in versions with epochs given at either midnight or noon. The space-geodetic measurements used to generate SPACE2013 have then been combined with optical astrometric measurements to form two additional combined Earth orientation series: (1) COMB2013, consisting of values and uncertainties for Universal Time, polar motion, and their rates that span from January 20, 1962, to June 30, 2014, at daily intervals and which are also available in versions with epochs given at either midnight or noon; and (2) POLE2013, consisting of values and uncertainties for polar motion and its rate that span from January 20, 1900, to June 22, 2014, at 30.4375-day intervals.
Combinations of Earth Orientation Measurements: SPACE2011, COMB2011, and POLE2011
NASA Technical Reports Server (NTRS)
Ratcliff, J. T.; Gross, R. S.
2013-01-01
Independent Earth orientation measurements taken by the space-geodetic techniques of lunar and satellite laser ranging, very long baseline interferometry, and the Global Positioning System have been combined using a Kalman filter. The resulting combined Earth orientation series, SPACE2011, consists of values and uncertainties for Universal Time, polar motion, and their rates that span from September 28, 1976, to July 13, 2012, at daily intervals and is available in versions with epochs given at either midnight or noon. The space-geodetic measurements used to generate SPACE2011 have then been combined with optical astrometric measurements to form two additional combined Earth orientation series: (1) COMB2011, consisting of values and uncertainties for Universal Time, polar motion, and their rates that span from January 20, 1962, to July 13, 2012, at daily intervals and which are also available in versions with epochs given at either midnight or noon; and (2) POLE2011, consisting of values and uncertainties for polar motion and its rate that span from January 20, 1900, to June 21, 2012, at 30.4375-day intervals.
Combinations of Earth Orientation Measurements: SPACE2013, COMB2013, and POLE2013
NASA Technical Reports Server (NTRS)
Ratcliff, J. T.; Gross, R. S.
2015-01-01
Independent Earth orientation measurements taken by the space-geodetic techniques of lunar and satellite laser ranging, very long baseline interferometry, and the Global Positioning System have been combined using a Kalman filter. The resulting combined Earth orientation series, SPACE2013, consists of values and uncertainties for Universal Time, polar motion, and their rates that span from September 28, 1976, to June 30, 2014, at daily intervals and is available in versions with epochs given at either midnight or noon. The space-geodetic measurements used to generate SPACE2013 have then been combined with optical astrometric measurements to form two additional combined Earth orientation series: (1) COMB2013, consisting of values and uncertainties for Universal Time, polar motion, and their rates that span from January 20, 1962, to June 30, 2014, at daily intervals and which are also available in versions with epochs given at either midnight or noon; and (2) POLE2013, consisting of values and uncertainties for polar motion and its rate that span from January 20, 1900, to June 22, 2014, at 30.4375-day intervals.
Combinations of Earth Orientation Measurements: SPACE2016, COMB2016, and POLE2016
NASA Technical Reports Server (NTRS)
Ratcliff, J. T.; Gross, R. S.
2017-01-01
Independent Earth orientation measurements taken by the space-geodetic techniques of lunar and satellite laser ranging, very long baseline interferometry, and the Global Positioning System have been combined using a Kalman filter. The resulting combined Earth orientation series, SPACE2016, consists of values and uncertainties for Universal Time, polar motion, and their rates that span from September 28, 1976, to June 30, 2017, at daily intervals and is available in versions with epochs given at either midnight or noon. The space-geodetic measurements used to generate SPACE2016 have then been combined with optical astrometric measurements to form two additional combined Earth orientation series: (1) COMB2016, consisting of values and uncertainties for Universal Time, polar motion, and their rates that span from January 20, 1962, to June 30, 2017, at daily intervals and which are also available in versions with epochs given at either midnight or noon; and (2) POLE2016, consisting of values and uncertainties for polar motion and its rate that span from January 20, 1900, to June 22, 2017, at 30.4375-day intervals.
Combinations of Earth Orientation Measurements: SPACE2012, COMB2012, and POLE2012
NASA Technical Reports Server (NTRS)
Ratcliff, J. T.; Gross, R. S.
2013-01-01
Independent Earth orientation measurements taken by the space-geodetic techniques of lunar and satellite laser ranging, very long baseline interferometry, and the Global Positioning System have been combined using a Kalman filter. The resulting combined Earth orientation series, SPACE2012, consists of values and uncertainties for Universal Time, polar motion, and their rates that span from September 28, 1976, to April 26, 2013, at daily intervals and is available in versions with epochs given at either midnight or noon. The space-geodetic measurements used to generate SPACE2012 have then been combined with optical astrometric measurements to form two additional combined Earth orientation series: (1) COMB2012, consisting of values and uncertainties for Universal Time, polar motion, and their rates that span from January 20, 1962, to April 26, 2013, at daily intervals and which are also available in versions with epochs given at either midnight or noon; and (2) POLE2012, consisting of values and uncertainties for polar motion and its rate that span from January 20, 1900, to May 22, 2013, at 30.4375-day intervals.
Detection of long term persistence in time series of the Neuquen River (Argentina)
NASA Astrophysics Data System (ADS)
Seoane, Rafael; Paz González, Antonio
2014-05-01
In the Patagonian region (Argentina), previous hydrometeorological studies developed using general circulation models show variations in annual mean flows. Future climate scenarios obtained from high-resolution models indicate decreases in total annual precipitation, and these decreases are most important in the Neuquén river basin (23,000 km2). The aim of this study was the estimation of long-term persistence in the Neuquén River basin (Argentina). The detection of variations in long-range dependence and long memory of the time series was evaluated with the Hurst exponent. We applied rescaled adjusted range (R/S) analysis to time series of river discharges measured from 1903 to 2011; this time series was divided into two subperiods, the first from 1903 to 1970 and the second from 1970 to 2011. Results show a small increase in persistence for the second period. Our results are consistent with those obtained by Koch and Markovic (2007), who observed and estimated an increase of the H exponent for the period 1960-2000 in the Elbe River (Germany). References: Hurst, H. (1951). Long-term storage capacity of reservoirs. Trans. Am. Soc. Civil Engrs., 116:776-808. Koch and Markovic (2007). Evidences for Climate Change in Germany over the 20th Century from the Stochastic Analysis of hydro-meteorological Time Series, MODSIM07, International Congress on Modelling and Simulation, Christchurch, New Zealand.
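A minimal sketch of rescaled-range (R/S) estimation of the Hurst exponent, the quantity used above to diagnose long-term persistence; the discharge series here is synthetic and the dyadic window scheme is one common convention, not necessarily the authors' exact implementation.

```python
import numpy as np

def hurst_rs(x, min_window=8):
    """Estimate the Hurst exponent by rescaled range (R/S) analysis."""
    x = np.asarray(x, dtype=float)
    n = 2 ** int(np.log2(x.size))              # use the largest power-of-two length
    x = x[:n]
    sizes, rs_values = [], []
    window = min_window
    while window <= n // 2:
        rs = []
        for start in range(0, n - window + 1, window):
            seg = x[start:start + window]
            dev = np.cumsum(seg - seg.mean())   # cumulative departures from the segment mean
            r = dev.max() - dev.min()           # range of cumulative departures
            s = seg.std()
            if s > 0:
                rs.append(r / s)
        sizes.append(window)
        rs_values.append(np.mean(rs))
        window *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_values), 1)   # H is the log-log slope
    return slope

rng = np.random.default_rng(4)
annual_flow = np.cumsum(rng.standard_normal(1024)) * 0.1 + rng.standard_normal(1024)
print(f"H = {hurst_rs(annual_flow):.2f}")
```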
NASA Astrophysics Data System (ADS)
Zhou, Wei-Xing; Sornette, Didier
2007-07-01
We have recently introduced the “thermal optimal path” (TOP) method to investigate the real-time lead-lag structure between two time series. The TOP method consists in searching for a robust noise-averaged optimal path of the distance matrix along which the two time series have the greatest similarity. Here, we generalize the TOP method by introducing a more general definition of distance which takes into account possible regime shifts between positive and negative correlations. This generalization to track possible changes of correlation signs is able to identify possible transitions from one convention (or consensus) to another. Numerical simulations on synthetic time series verify that the new TOP method performs as expected even in the presence of substantial noise. We then apply it to investigate changes of convention in the dependence structure between the historical volatilities of the USA inflation rate and economic growth rate. Several measures show that the new TOP method significantly outperforms standard cross-correlation methods.
Merging climate and multi-sensor time-series data in real-time drought monitoring across the U.S.A.
Brown, Jesslyn F.; Miura, T.; Wardlow, B.; Gu, Yingxin
2011-01-01
Droughts occur repeatedly in the United States, resulting in billions of dollars of damage. Monitoring and reporting on drought conditions is a necessary function of government agencies at multiple levels. A team of Federal and university partners developed a drought decision-support tool with higher spatial resolution relative to traditional climate-based drought maps. The Vegetation Drought Response Index (VegDRI) indicates general canopy vegetation condition through the assimilation of climate, satellite, and biophysical data via geospatial modeling. In VegDRI, complementary drought-related data are merged to provide a comprehensive, detailed representation of drought stress on vegetation. Time-series data from daily polar-orbiting earth observing systems [Advanced Very High Resolution Radiometer (AVHRR) and Moderate Resolution Imaging Spectroradiometer (MODIS)] providing global measurements of land surface conditions are ingested into VegDRI. Inter-sensor compatibility is required to extend multi-sensor data records; thus, translations were developed using overlapping observations to create consistent, long-term time series.
NASA Astrophysics Data System (ADS)
Bock, Y.; Fang, P.; Moore, A. W.; Kedar, S.; Liu, Z.; Owen, S. E.; Glasscoe, M. T.
2016-12-01
Detection of time-dependent crustal deformation relies on the availability of accurate surface displacements, proper time series analysis to correct for secular motion, coseismic and non-tectonic instrument offsets, periodic signatures at different frequencies, and a realistic estimate of uncertainties for the parameters of interest. As part of the NASA Solid Earth Science ESDR System (SESES) project, daily displacement time series are estimated for about 2500 stations, focused on tectonic plate boundaries and having a global distribution for accessing the terrestrial reference frame. The "combined" time series are optimally estimated from independent JPL GIPSY and SIO GAMIT solutions, using a consistent set of input epoch-date coordinates and metadata. The longest time series began in 1992; more than 30% of the stations have experienced one or more of 35 major earthquakes with significant postseismic deformation. Here we present three examples of time-dependent deformation that have been detected in the SESES displacement time series. (1) Postseismic deformation is a fundamental time-dependent signal that indicates a viscoelastic response of the crust/mantle lithosphere, afterslip, or poroelastic effects at different spatial and temporal scales. It is critical to identify and estimate the extent of postseismic deformation in both space and time not only for insight into the crustal deformation and earthquake cycles and their underlying physical processes, but also to reveal other time-dependent signals. We report on our database of characterized postseismic motions using a principal component analysis to isolate different postseismic processes. (2) Starting with the SESES combined time series and applying a time-dependent Kalman filter, we examine episodic tremor and slow slip (ETS) in the Cascadia subduction zone. We report on subtle slip details, allowing investigation of the spatiotemporal relationship between slow slip transients and tremor and their underlying physical mechanisms. (3) We present evolving strain dilatation and shear rates based on the SESES velocities for regional subnetworks as a metric for assigning earthquake probabilities and detection of possible time-dependent deformation related to underlying physical processes.
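A hedged sketch of the kind of trajectory model such displacement time series are routinely fit with: a secular rate, one instrument offset, and annual plus semiannual terms estimated by least squares. The epochs, offset date and amplitudes are synthetic assumptions, not the SESES combination itself.

```python
import numpy as np

rng = np.random.default_rng(5)
t = np.arange(0, 8, 1 / 365.25)                                 # 8 years of daily epochs (yr)
offset_epoch = 4.0                                              # hypothetical antenna change
truth = (3.0 * t + 5.0 * (t >= offset_epoch)
         + 2.0 * np.sin(2 * np.pi * t) + 0.5 * np.cos(4 * np.pi * t))
disp = truth + rng.standard_normal(t.size)                      # displacement (mm) with noise

# Design matrix: intercept, rate, step offset, annual and semiannual sine/cosine terms.
A = np.column_stack([
    np.ones_like(t), t, (t >= offset_epoch).astype(float),
    np.sin(2 * np.pi * t), np.cos(2 * np.pi * t),
    np.sin(4 * np.pi * t), np.cos(4 * np.pi * t),
])
params, *_ = np.linalg.lstsq(A, disp, rcond=None)
print(f"velocity = {params[1]:.2f} mm/yr, offset = {params[2]:.2f} mm")
```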
Generalized multiplicative error models: Asymptotic inference and empirical analysis
NASA Astrophysics Data System (ADS)
Li, Qian
This dissertation consists of two parts. The first part focuses on extended Multiplicative Error Models (MEM) that include two extreme cases for nonnegative series. These extreme cases are common phenomena in high-frequency financial time series. The Location MEM(p,q) model incorporates a location parameter so that the series are required to have positive lower bounds. The estimator for the location parameter turns out to be the minimum of all the observations and is shown to be consistent. The second case captures a nontrivial fraction of zero outcomes in a series and combines a so-called Zero-Augmented general F distribution with a linear MEM(p,q). Under certain strict stationarity and moment conditions, we establish consistency and asymptotic normality of the semiparametric estimators for these two new models. The second part of this dissertation examines the differences and similarities between trades in the home market and trades in the foreign market of cross-listed stocks. We exploit the multiplicative framework to model trading duration, volume per trade and price volatility for Canadian shares that are cross-listed on the New York Stock Exchange (NYSE) and the Toronto Stock Exchange (TSX). We explore the clustering effect, the interaction between trading variables, and the time needed for price equilibrium after a perturbation in each market. The clustering effect is studied through the use of a univariate MEM(1,1) on each variable, while the interactions among duration, volume and price volatility are captured by a multivariate system of MEM(p,q). After estimating these models by a standard QMLE procedure, we use the impulse response function to compute the calendar time for a perturbation in these variables to be absorbed into price variance, and use common statistical tests to identify the differences between the two markets in each aspect. These differences are of considerable interest to traders, stock exchanges and policy makers.
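A minimal sketch of the baseline model this work extends: simulate a MEM(1,1), x_t = mu_t * eps_t with mu_t = omega + alpha * x_{t-1} + beta * mu_{t-1} and unit-mean positive innovations. The parameter values and the exponential innovation choice are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
omega, alpha, beta = 0.1, 0.2, 0.7          # assumed MEM(1,1) parameters
n = 5000

x = np.empty(n)
mu = omega / (1 - alpha - beta)             # start at the unconditional mean
for t in range(n):
    eps = rng.exponential(1.0)              # unit-mean positive innovation
    x[t] = mu * eps
    mu = omega + alpha * x[t] + beta * mu   # conditional mean for the next observation

# Quick check: the sample mean should be close to the implied unconditional mean.
print(x.mean(), omega / (1 - alpha - beta))
```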
Application of dynamic topic models to toxicogenomics data.
Lee, Mikyung; Liu, Zhichao; Huang, Ruili; Tong, Weida
2016-10-06
All biological processes are inherently dynamic. Biological systems evolve transiently or sustainably according to sequential time points after perturbation by environmental insults, drugs and chemicals. Investigating the temporal behavior of molecular events has been an important subject for understanding the underlying mechanisms governing the biological system in response to perturbations such as drug treatment. The intrinsic complexity of time series data requires appropriate computational algorithms for data interpretation. In this study, we propose, for the first time, the application of dynamic topic models (DTM) for analyzing time-series gene expression data. A large time-series toxicogenomics dataset was studied. It contains over 3144 microarrays of gene expression data corresponding to rat livers treated with 131 compounds (mostly drugs) at two doses (control and high dose) in a repeated schedule containing four separate time points (4-, 8-, 15- and 29-day). We analyzed, with DTM, the topics (each consisting of a set of genes) and their biological interpretations over these four time points. We identified hidden patterns embedded in these time-series gene expression profiles. From the topic distribution for each compound-time condition, a number of drugs were successfully clustered by their shared mode of action, such as PPARα agonists and COX inhibitors. The biological meaning underlying each topic was interpreted using diverse sources of information such as functional analysis of the pathways and therapeutic uses of the drugs. Additionally, we found that sample clusters produced by DTM are much more coherent in terms of functional categories when compared to traditional clustering algorithms. We demonstrated that DTM, a text mining technique, can be a powerful computational approach for clustering time-series gene expression profiles with a probabilistic representation of their dynamic features along sequential time frames. The method offers an alternative way of uncovering hidden patterns embedded in time-series gene expression profiles to gain an enhanced understanding of the dynamic behavior of gene regulation in the biological system.
Global trends in vegetation phenology from 32-year GEOV1 leaf area index time series
NASA Astrophysics Data System (ADS)
Verger, Aleixandre; Baret, Frédéric; Weiss, Marie; Filella, Iolanda; Peñuelas, Josep
2013-04-01
Phenology is a critical component in understanding ecosystem response to climate variability. Long-term data records from global mapping satellite platforms are valuable tools for monitoring vegetation responses to climate change at the global scale. Phenology satellite products and trend detection from satellite time series are expected to contribute to improving our understanding of climate forcing on vegetation dynamics. The capacity to monitor ecosystem responses to global climate change was evaluated in this study from the 32-year time series of global Leaf Area Index (LAI) that has recently been produced within the geoland2 project. The long-term GEOV1 LAI products were derived from NOAA/AVHRR (1981 to 2000) and SPOT/VGT (1999 to the present) with specific emphasis on consistency and continuity. Since mid-November, GEOV1 LAI products are freely available to the scientific community at the geoland2 portal (www.geoland2.eu/core-mapping-services/biopar.html). These products are distributed at a dekadal time step for the periods 1981-2000 and 2000-2012 at 0.05° and 1/112°, respectively. The use of GEOV1 data covering a long time period and providing information at dense time steps is expected to increase the reliability of trend detection. In this study, GEOV1 LAI time series aggregated to 0.5° spatial resolution are used. The CACAO (Consistent Adjustment of the Climatology to Actual Observations) method (Verger et al., 2013) was applied to characterize seasonal anomalies as well as to identify trends. For a given pixel, CACAO computes, for each season, the time shift and the amplitude difference between the current temporal profile and the climatology computed over the 32 years. These CACAO parameters quantify shifts in the timing of seasonal phenology and inter-annual variations in magnitude relative to the average climatology. Inter-annual variations in the timing of the Start of Season and End of Season, the Season Length and the LAI level at the peak of the growing season are analyzed. Trend analysis with a robust statistical test of significance is conducted. Climate variables (precipitation, temperature, radiation) are then used to interpret the anomaly patterns detected in the vegetation response.
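A hedged sketch of the two CACAO-style diagnostics for a single pixel and season: a time shift found by cross-correlating the current seasonal profile against the multi-year climatology, and an amplitude ratio between the two. The dekadal profiles are synthetic and the shift search is a simplification of the published method.

```python
import numpy as np

rng = np.random.default_rng(7)
dekads = np.arange(36)                                         # 36 dekads per year
climatology = 1 + 2 * np.exp(-0.5 * ((dekads - 18) / 5) ** 2)  # mean seasonal LAI profile

# A hypothetical year: the season starts two dekads late with a weaker peak.
current = 1 + 1.7 * np.exp(-0.5 * ((dekads - 20) / 5) ** 2) + 0.05 * rng.standard_normal(36)

# Time shift: lag that maximizes correlation between shifted climatology and the current year.
shifts = np.arange(-6, 7)
scores = [np.corrcoef(np.roll(climatology, s), current)[0, 1] for s in shifts]
time_shift = shifts[int(np.argmax(scores))]                    # dekads of phenological delay

# Amplitude difference expressed as a ratio of seasonal ranges.
amplitude_ratio = (current.max() - current.min()) / (climatology.max() - climatology.min())

print(f"shift = {time_shift} dekads, amplitude ratio = {amplitude_ratio:.2f}")
```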
A Smoothing Technique for the Multifractal Analysis of a Medium Voltage Feeders Electric Current
NASA Astrophysics Data System (ADS)
de Santis, Enrico; Sadeghian, Alireza; Rizzi, Antonello
2017-12-01
The current paper presents a data-driven detrending technique that allows complex sinusoidal trends to be smoothed out of a real-world electric load time series before applying Multifractal Detrended Fluctuation Analysis (MFDFA). The algorithm, which we call Smoothed Sort and Cut Fourier Detrending (SSC-FD), is based on a suitable smoothing of high-power periodicities operating directly in the Fourier spectrum through a polynomial fitting technique applied to the DFT. The main aim is to disambiguate the characteristic slowly varying periodicities, which can impair the MFDFA analysis, from the residual signal in order to study its correlation properties. The algorithm's performance is evaluated on a simple benchmark test consisting of a persistent series with known Hurst exponent and ten superimposed sinusoidal harmonics. Moreover, the behavior of the algorithm parameters is assessed by computing the MFDFA on the well-known sunspot data, whose correlation characteristics are reported in the literature. In both cases, the SSC-FD method eliminates the apparent crossover induced by the synthetic and natural periodicities. Results are compared with some existing detrending methods within the MFDFA paradigm. Finally, a study of the multifractal characteristics of the electric load time series detrended by the SSC-FD algorithm is provided, showing strongly persistent behavior and an appreciable width of the multifractal spectrum, which allows us to conclude that the series at hand has multifractal characteristics.
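A hedged sketch of the Fourier-domain idea: locate the highest-power harmonics of a synthetic load series in the DFT, suppress them, and invert to obtain a residual for subsequent fluctuation analysis. Dropping the k strongest bins outright is a crude stand-in for the paper's smoothing and polynomial-fitting step, and the load series is illustrative.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 4096
t = np.arange(n)
# Synthetic "electric load": a few strong sinusoidal periodicities plus correlated noise.
load = (10 * np.sin(2 * np.pi * t / 24) + 5 * np.sin(2 * np.pi * t / 168)
        + np.cumsum(rng.standard_normal(n)) * 0.05)

spectrum = np.fft.rfft(load - load.mean())
power = np.abs(spectrum) ** 2

# Crude stand-in for the sort-and-cut step: zero out the k strongest harmonics.
k = 4
strongest = np.argsort(power)[-k:]
spectrum[strongest] = 0.0
residual = np.fft.irfft(spectrum, n)       # detrended series for subsequent MFDFA

print("variance before/after:", load.var(), residual.var())
```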
a Landsat Time-Series Stacks Model for Detection of Cropland Change
NASA Astrophysics Data System (ADS)
Chen, J.; Chen, J.; Zhang, J.
2017-09-01
Global, timely, accurate and cost-effective cropland monitoring with a fine spatial resolution will dramatically improve our understanding of the effects of agriculture on greenhouse gas emissions, food safety, and human health. Time-series remote sensing imagery has shown particular potential for describing land cover dynamics. Traditional change detection techniques are often not capable of detecting land cover changes within time series that are strongly influenced by seasonal differences, and are therefore more likely to generate pseudo-changes. Here, we introduce and test LTSM (Landsat time-series stacks model), an improved version of the previously proposed Continuous Change Detection and Classification (CCDC) approach, to extract spectral trajectories of land surface change using dense Landsat time-series stacks (LTS). The method is expected to eliminate pseudo-changes caused by seasonal phenology. The main idea of the method is that, using all available Landsat 8 images within a year, an LTSM consisting of a two-term harmonic function is estimated iteratively for each pixel in each spectral band. The LTSM defines change areas by differencing the predicted and observed Landsat images. The LTSM approach was compared with the change vector analysis (CVA) method. The results indicate that the LTSM method correctly detected the "true changes" without overestimating the "false" ones, while CVA identified "true change" pixels along with a large number of "false changes". The detection of change areas achieved an overall accuracy of 92.37%, with a kappa coefficient of 0.676.
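A minimal sketch of the per-pixel, per-band building block described here: fit a two-term harmonic model to one year of observations by least squares, then flag change when a new observation departs from the prediction by more than a residual-based threshold. The day-of-year sampling, reflectance values and 3-RMSE rule are illustrative assumptions, not the published LTSM thresholds.

```python
import numpy as np

def harmonic_design(doy, n_harmonics=2):
    """Design matrix for a two-term harmonic (annual + semiannual) model."""
    cols = [np.ones_like(doy, dtype=float)]
    for k in range(1, n_harmonics + 1):
        cols += [np.sin(2 * np.pi * k * doy / 365.25),
                 np.cos(2 * np.pi * k * doy / 365.25)]
    return np.column_stack(cols)

rng = np.random.default_rng(9)
doy = np.sort(rng.choice(np.arange(1, 366), 23, replace=False))     # clear observations in one year
nir = 0.3 + 0.1 * np.sin(2 * np.pi * doy / 365.25) + 0.01 * rng.standard_normal(doy.size)

A = harmonic_design(doy)
coeffs, *_ = np.linalg.lstsq(A, nir, rcond=None)
rmse = np.sqrt(np.mean((nir - A @ coeffs) ** 2))

# Flag change when a new observation departs from the model prediction by > 3 RMSE.
new_doy, new_obs = np.array([200.0]), 0.12                          # hypothetical post-clearing value
predicted = (harmonic_design(new_doy) @ coeffs)[0]
print("change detected:", abs(new_obs - predicted) > 3 * rmse)
```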
Results of Russian geomagnetic observatories in the 19th century: magnetic activity, 1841-1862
NASA Astrophysics Data System (ADS)
Nevanlinna, H.; Häkkinen, L.
2010-04-01
Hourly (spot readings) magnetic data (H- and D-components) were digitized from Russian yearbook tables for the years 1850-1862 from four observatories. The images used for digitization were taken with an ordinary digital camera. The database obtained consists of about 900 000 single data points. The time series of hourly magnetic values reveal slow secular variations (declination only) as well as transient and regular geomagnetic variations of external origin. The quality and homogeneity of the data are satisfactory. Daily Ak-indices were calculated using the index algorithm that has previously been applied to 19th century data from Helsinki (Finland) as well as to modern magnetic observatory recordings. The activity index series derived from the Russian data is consistent with earlier activity index series for 1850-1862. The digitized index data series derived in this study was extended back to 1841 by including magnetic C9 activity index data available from a Russian observatory (St. Petersburg). The magnetic data rescued here are well suited to various reconstructions for studies of the long-term variation of space weather in the 19th century.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Authorities Program 1. General. The planning process described in the ER 1105-2-200 series of regulations... at the same time keeping the requirements for information and analyses consistent with the scope of the study, solutions recommended, and the Program completion-time objectives outlined in § 263.18 of...
37 CFR 1.78 - Claiming benefit of earlier filing date and cross-references to other applications.
Code of Federal Regulations, 2012 CFR
2012-07-01
... such prior-filed application, identifying it by application number (consisting of the series code and.... These time periods are not extendable. Except as provided in paragraph (a)(3) of this section, the... application. The time periods in this paragraph do not apply if the later-filed application is: (A) An...
Code of Federal Regulations, 2012 CFR
2012-07-01
... Authorities Program 1. General. The planning process described in the ER 1105-2-200 series of regulations... at the same time keeping the requirements for information and analyses consistent with the scope of the study, solutions recommended, and the Program completion-time objectives outlined in § 263.18 of...
Code of Federal Regulations, 2011 CFR
2011-07-01
... Authorities Program 1. General. The planning process described in the ER 1105-2-200 series of regulations... at the same time keeping the requirements for information and analyses consistent with the scope of the study, solutions recommended, and the Program completion-time objectives outlined in § 263.18 of...
Code of Federal Regulations, 2010 CFR
2010-07-01
... Authorities Program 1. General. The planning process described in the ER 1105-2-200 series of regulations... at the same time keeping the requirements for information and analyses consistent with the scope of the study, solutions recommended, and the Program completion-time objectives outlined in § 263.18 of...
Code of Federal Regulations, 2013 CFR
2013-07-01
... Authorities Program 1. General. The planning process described in the ER 1105-2-200 series of regulations... at the same time keeping the requirements for information and analyses consistent with the scope of the study, solutions recommended, and the Program completion-time objectives outlined in § 263.18 of...
The Influence of Time on the Relationship of Precognition and Creativity.
ERIC Educational Resources Information Center
Costello, Francis J.
The possibility of a time sensitive relationship between scores on the Dean Miholasky precognition test and those on Activity 2 of the Torrance Test for Creative Thinking was investigated with 58 freshmen students in a school of engineering. The Torrance Test consists of three activities that present a series of figural stimuli to the subject who…
NASA Astrophysics Data System (ADS)
Baisden, W. T.
2011-12-01
Time-series radiocarbon measurements have a substantial ability to constrain the size and residence time of the soil C pools commonly represented in ecosystem models. Radiocarbon remains unique in its ability to constrain the large stabilized C pool with decadal residence times. Radiocarbon also contributes usefully to constraining the size and turnover rate of the passive pool, but typically struggles to constrain pools with residence times of less than a few years. Overall, the number of pools and associated turnover rates that can be constrained depends upon the number of time-series samples available, the appropriateness of chemical or physical fractions to isolate unequivocal pools, and the utility of additional C flux data to provide further constraints. In New Zealand pasture soils, we demonstrate the ability to constrain decadal turnover times to within a few years for the stabilized pool and to reasonably constrain the passive fraction. Good constraint is obtained with two time-series samples spaced 10 or more years apart after 1970. Three or more time-series samples further improve the level of constraint. Work within this context shows that a two-pool model does explain soil radiocarbon data for the most detailed profiles available (11 time-series samples), and identifies clear and consistent differences in rates of C turnover and passive fraction in Andisols vs non-Andisols. Furthermore, samples from multiple horizons can commonly be combined, yielding consistent residence times and passive fraction estimates that are stable with, or increase with, depth at different sites. Radiocarbon generally fails to quantify rapid C turnover, however. Given that the strength of radiocarbon is estimating the size and turnover of the stabilized (decadal) and passive (millennial) pools, the magnitude of fast-cycling pool(s) can be estimated by subtracting the radiocarbon-based estimates of turnover within the stabilized and passive pools from total estimates of NPP. In grazing land, these estimates can be derived primarily from measured aboveground NPP and calculated belowground NPP. Results suggest that only 19-36% of heterotrophic soil respiration is derived from soil C with rapid turnover times. A final logical step in synthesis is the analysis of temporal variation in NPP, primarily due to climate, as a driver of changes in plant inputs resulting in dynamic changes in the rapid and decadal soil C pools. In sites with good time-series samples from 1959-1975, we examine the apparent impacts of measured or modelled (Biome-BGC) NPP on soil Δ14C. Ultimately, these approaches have the ability to empirically constrain, and provide limited verification of, the soil C cycle as commonly depicted in ecosystem biogeochemistry models.
Consistency assessment of rating curve data in various locations using Bidirectional Reach (BReach)
NASA Astrophysics Data System (ADS)
Van Eerdenbrugh, Katrien; Van Hoey, Stijn; Coxon, Gemma; Freer, Jim; Verhoest, Niko E. C.
2017-10-01
When estimating discharges through rating curves, temporal data consistency is a critical issue. In this research, consistency in stage-discharge data is investigated using a methodology called Bidirectional Reach (BReach), which starts from a definition of consistency commonly used in operational hydrology. A period is considered to be consistent if no consecutive and systematic deviations from the current situation occur that exceed observational uncertainty. Therefore, the capability of a rating curve model to describe a subset of the (chronologically sorted) data is assessed at each observation by indicating the outermost data points for which the rating curve model behaves satisfactorily. These points are called the maximum left or right reach, depending on the direction of the investigation. This temporal reach should not be confused with a spatial reach (indicating a part of a river). Changes in these reaches throughout the data series indicate possible changes in data consistency which, if not resolved, could introduce additional errors and biases. In this research, various measurement stations in the UK, New Zealand and Belgium are selected based on their significant historical ratings information and their specific characteristics related to data consistency. For each country, regional information is used as fully as possible to estimate observational uncertainty. Based on this uncertainty, a BReach analysis is performed and, subsequently, results are validated against available knowledge about the history and behavior of the site. For all investigated cases, the methodology provides results that appear to be consistent with this knowledge of historical changes and thus facilitates a reliable assessment of (in)consistent periods in stage-discharge measurements. This assessment is not only useful for the analysis and determination of discharge time series, but also enhances applications based on these data (e.g., by informing hydrological and hydraulic model evaluation design about consistent time periods to analyze).
The annual cycles of phytoplankton biomass
Winder, M.; Cloern, J.E.
2010-01-01
Terrestrial plants are powerful climate sentinels because their annual cycles of growth, reproduction and senescence are finely tuned to the annual climate cycle having a period of one year. Consistency in the seasonal phasing of terrestrial plant activity provides a relatively low-noise background from which phenological shifts can be detected and attributed to climate change. Here, we ask whether phytoplankton biomass also fluctuates over a consistent annual cycle in lake, estuarine-coastal and ocean ecosystems and whether there is a characteristic phenology of phytoplankton as a consistent phase and amplitude of variability. We compiled 125 time series of phytoplankton biomass (chlorophyll a concentration) from temperate and subtropical zones and used wavelet analysis to extract their dominant periods of variability and the recurrence strength at those periods. Fewer than half (48%) of the series had a dominant 12-month period of variability, commonly expressed as the canonical spring-bloom pattern. About 20 per cent had a dominant six-month period of variability, commonly expressed as the spring and autumn or winter and summer blooms of temperate lakes and oceans. These annual patterns varied in recurrence strength across sites, and did not persist over the full series duration at some sites. About a third of the series had no component of variability at either the six- or 12-month period, reflecting a series of irregular pulses of biomass. These findings show that there is high variability of annual phytoplankton cycles across ecosystems, and that climate-driven annual cycles can be obscured by other drivers of population variability, including human disturbance, aperiodic weather events and strong trophic coupling between phytoplankton and their consumers. Regulation of phytoplankton biomass by multiple processes operating at multiple time scales adds complexity to the challenge of detecting climate-driven trends in aquatic ecosystems where the noise to signal ratio is high. © 2010 The Royal Society.
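A hedged sketch of a simpler FFT-periodogram version of this analysis: recover the dominant period of a synthetic monthly chlorophyll-a series. The paper's wavelet approach additionally localizes the period in time and measures its recurrence strength; the series below is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(10)
months = np.arange(240)                                  # 20 years of monthly chlorophyll-a
chl = (3 + 2 * np.sin(2 * np.pi * months / 12)           # spring-bloom (12-month) component
         + 1 * np.sin(2 * np.pi * months / 6)            # weaker semi-annual component
         + 0.5 * rng.standard_normal(months.size))

power = np.abs(np.fft.rfft(chl - chl.mean())) ** 2
freqs = np.fft.rfftfreq(months.size, d=1.0)              # cycles per month
dominant = 1.0 / freqs[1:][np.argmax(power[1:])]         # skip the zero-frequency bin

print(f"dominant period ~ {dominant:.1f} months")
```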
Use of a prototype pulse oximeter for time series analysis of heart rate variability
NASA Astrophysics Data System (ADS)
González, Erika; López, Jehú; Hautefeuille, Mathieu; Velázquez, Víctor; Del Moral, Jésica
2015-05-01
This work presents the development of a low-cost pulse oximeter prototype consisting of pulsed red and infrared commercial LEDs and a broad-spectrum photodetector used to register time series of heart rate and blood oxygen saturation. This platform, besides providing these values like any other pulse oximeter, processes the signals to compute a power spectrum analysis of the patient's heart rate variability in real time; additionally, the device allows access to all raw and analyzed data if database construction or any other kind of further analysis is desired. Since the prototype is capable of acquiring data for long periods of time, it is suitable for collecting data during real-life activities, enabling the development of future wearable applications.
NASA Astrophysics Data System (ADS)
Walker, Gary E.
2015-01-01
We observed the long-period (5.6 years) Eclipsing Binary Variable Star EE Cep during its 2014 eclipse. It was observed on every clear night from the Maria Mitchell Observatory as well as from remote sites, for a total of 25 nights. Each night consisted of a detailed time series in BVRI looking for short-term variations, for a total of >9000 observations. The data were transformed to the Standard System. In addition, a time series was captured during the night of the eclipse. These data provide an alternative to the traditional method of determining the Time of Minimum. The TOM varied with color. Several strong correlations are seen between colors, substantiating the detection of variations on a time scale of hours. The long-term light curve shows five distinct phases with different characteristics.
The high order dispersion analysis based on first-passage-time probability in financial markets
NASA Astrophysics Data System (ADS)
Liu, Chenggong; Shang, Pengjian; Feng, Guochen
2017-04-01
The study of first-passage-time (FPT) events in financial time series has attracted broad research interest recently, as it can provide a reference for risk management and investment. In this paper, a new measure, high-order dispersion (HOD), is developed based on the FPT probability to explore financial time series. The tick-by-tick data of three Chinese stock markets and three American stock markets are investigated. We classify the financial markets successfully by analyzing the scaling properties of the FPT probabilities of the six stock markets and employing the HOD method to compare the differences between FPT decay curves. Applying the HOD method, it can be concluded that long-range correlation and a fat-tailed broad probability density function, together with their coupling with nonlinearity, mainly lead to the multifractality of financial time series. Furthermore, we use the fluctuation function of multifractal detrended fluctuation analysis (MF-DFA) to distinguish markets and obtain results consistent with the HOD method, whereas the HOD method is capable of differentiating stock markets within the same region more effectively. We believe that such explorations are relevant for a better understanding of financial market mechanisms.
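A minimal sketch of the FPT ingredient of such an analysis is shown below; the threshold definition used here (the first time the absolute log-return from the start point exceeds a fixed level) is an assumption for illustration, not necessarily the thresholding used in the paper.

```python
import numpy as np

def first_passage_times(prices, threshold=0.01):
    """For each start time t0, count the number of steps until the absolute
    log-return relative to t0 first exceeds `threshold` (hypothetical value)."""
    logp = np.log(np.asarray(prices, dtype=float))
    fpts = []
    for t0 in range(len(logp) - 1):
        excursion = np.abs(logp[t0 + 1:] - logp[t0])
        hits = np.nonzero(excursion >= threshold)[0]
        if hits.size:
            fpts.append(hits[0] + 1)
    return np.array(fpts)

# Empirical FPT probability distribution (decay curve):
# values, counts = np.unique(first_passage_times(prices), return_counts=True)
# prob = counts / counts.sum()
```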
NASA Astrophysics Data System (ADS)
Brigatti, E.; Vieira, M. V.; Kajin, M.; Almeida, P. J. A. L.; de Menezes, M. A.; Cerqueira, R.
2016-02-01
We study the population-size time series of a Neotropical small mammal with the intent of detecting and modelling population regulation processes generated by density-dependent factors and their possible delayed effects. The application of analysis tools based on principles of statistical generality is nowadays common practice for describing these phenomena, but, in general, such tools are better at producing a clear diagnosis than at providing valuable models. For this reason, in our approach, we detect the principal temporal structures on the basis of different correlation measures, and from these results we build an ad-hoc minimalist autoregressive model that incorporates the main drivers of the dynamics. Surprisingly, our model reproduces the temporal patterns of the empirical series very well and, for the first time, clearly outlines the importance of the time to attain sexual maturity as a central temporal scale for the dynamics of this species. In fact, an important advantage of this analysis scheme is that all the model parameters are directly biologically interpretable and potentially measurable, allowing a consistency check between model outputs and independent measurements.
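A minimal version of such an ad-hoc autoregressive model can be fitted by ordinary least squares; in the sketch below the delayed lag L (9 time steps) is a hypothetical stand-in for the time to sexual maturity, not a value from the paper.

```python
import numpy as np

def fit_delayed_ar(n, lags=(1, 9)):
    """Fit x_t = c + a1*x_{t-1} + a2*x_{t-L} + e_t on log abundance,
    where L is a delayed density-dependence lag (hypothetical here)."""
    x = np.log(np.asarray(n, dtype=float))
    L = max(lags)
    X = np.column_stack([np.ones(len(x) - L)] +
                        [x[L - k:len(x) - k] for k in lags])
    y = x[L:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef  # [intercept, coefficient for each lag]
```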
A method for analyzing temporal patterns of variability of a time series from Poincare plots.
Fishman, Mikkel; Jacono, Frank J; Park, Soojin; Jamasebi, Reza; Thungtong, Anurak; Loparo, Kenneth A; Dick, Thomas E
2012-07-01
The Poincaré plot is a popular two-dimensional time series analysis tool because of its intuitive display of dynamic system behavior. Poincaré plots have been used to visualize heart rate and respiratory pattern variabilities. However, conventional quantitative analysis relies primarily on statistical measurements of the cumulative distribution of points, making it difficult to interpret irregular or complex plots. Moreover, the plots are constructed to reflect highly correlated regions of the time series, reducing the amount of nonlinear information that is presented and thereby hiding potentially relevant features. We propose temporal Poincaré variability (TPV), a novel analysis methodology that uses standard techniques to quantify the temporal distribution of points and to detect nonlinear sources responsible for physiological variability. In addition, the analysis is applied across multiple time delays, yielding a richer insight into system dynamics than the traditional circle return plot. The method is applied to data sets of R-R intervals and to synthetic point process data extracted from the Lorenz time series. The results demonstrate that TPV complements the traditional analysis, can be applied more generally (including to Poincaré plots with multiple clusters) and more consistently than the conventional measures, and can address questions regarding potential structure underlying the variability of a data set.
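For orientation, the sketch below computes the conventional lag-m Poincaré descriptors (SD1/SD2) over several delays; the TPV measure proposed in the paper goes further by quantifying the temporal distribution of points, which is not reproduced here.

```python
import numpy as np

def poincare_descriptors(rr, lag=1):
    """Standard lag-m Poincaré plot descriptors for an RR-interval series."""
    rr = np.asarray(rr, dtype=float)
    x, y = rr[:-lag], rr[lag:]
    sd1 = np.std((y - x) / np.sqrt(2), ddof=1)  # spread across identity line
    sd2 = np.std((y + x) / np.sqrt(2), ddof=1)  # spread along identity line
    return sd1, sd2

# Scan several delays, as the multi-delay analysis suggests:
# results = {m: poincare_descriptors(rr, m) for m in range(1, 11)}
```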
MEM spectral analysis for predicting influenza epidemics in Japan.
Sumi, Ayako; Kamo, Ken-ichi
2012-03-01
The prediction of influenza epidemics has long been the focus of attention in epidemiology and mathematical biology. In this study, we tested whether time series analysis was useful for predicting the incidence of influenza in Japan. The method of time series analysis we used consists of spectral analysis based on the maximum entropy method (MEM) in the frequency domain and the nonlinear least squares method in the time domain. Using this time series analysis, we analyzed the incidence data of influenza in Japan from January 1948 to December 1998; these data are unique in that they covered the periods of pandemics in Japan in 1957, 1968, and 1977. On the basis of the MEM spectral analysis, we identified the periodic modes explaining the underlying variations of the incidence data. The optimum least squares fitting (LSF) curve calculated with the periodic modes reproduced the underlying variation of the incidence data. An extension of the LSF curve could be used to predict the incidence of influenza quantitatively. Our study suggested that MEM spectral analysis would allow us to model temporal variations of influenza epidemics with multiple periodic modes much more effectively than by using the method of conventional time series analysis, which has been used previously to investigate the behavior of temporal variations in influenza data.
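The least squares fitting step can be illustrated as follows: given periods identified beforehand (e.g. from an MEM spectrum), a sum of sinusoids is fitted with scipy's curve_fit. This is a generic stand-in, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import curve_fit

def lsf_periodic_modes(t, y, periods):
    """Fit y(t) = mean + sum_k A_k sin(2*pi*t/T_k + phi_k) for periods T_k
    identified by a prior spectral analysis (periods are inputs, not fitted)."""
    t = np.asarray(t, dtype=float)
    y = np.asarray(y, dtype=float)

    def model(t, *p):
        out = np.full_like(t, p[0])
        for k, T in enumerate(periods):
            A, phi = p[1 + 2 * k], p[2 + 2 * k]
            out = out + A * np.sin(2 * np.pi * t / T + phi)
        return out

    p0 = [np.mean(y)] + [np.std(y), 0.0] * len(periods)
    popt, _ = curve_fit(model, t, y, p0=p0)
    return popt, model(t, *popt)   # parameters and fitted (extendable) curve
```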
Acoustical Applications of the HHT Method
NASA Technical Reports Server (NTRS)
Huang, Norden E.
2003-01-01
A document discusses applications of a method based on the Hilbert-Huang transform (HHT). The method was described, without the HHT name, in Analyzing Time Series Using EMD and Hilbert Spectra (GSC-13817), NASA Tech Briefs, Vol. 24, No. 10 (October 2000), page 63. To recapitulate: the method is especially suitable for analyzing time-series data that represent nonstationary and nonlinear physical phenomena. The method involves empirical mode decomposition (EMD), in which a complicated signal is decomposed into a finite number of functions, called intrinsic mode functions (IMFs), that admit well-behaved Hilbert transforms. The HHT is the combination of EMD and Hilbert spectral analysis.
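A bare-bones HHT pipeline might look like the following sketch, which assumes the third-party PyEMD (EMD-signal) package for the decomposition and scipy for the Hilbert transform; it is illustrative only, not the code referenced in the Tech Brief.

```python
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD   # assumes the EMD-signal (PyEMD) package is installed

def hht(signal, dt=1.0):
    """Empirical mode decomposition into IMFs, then instantaneous amplitude
    and frequency of each IMF via the analytic (Hilbert) signal."""
    imfs = EMD().emd(np.asarray(signal, dtype=float))
    results = []
    for imf in imfs:
        analytic = hilbert(imf)
        amplitude = np.abs(analytic)
        phase = np.unwrap(np.angle(analytic))
        inst_freq = np.diff(phase) / (2 * np.pi * dt)
        results.append((amplitude, inst_freq))
    return imfs, results
```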
NASA Astrophysics Data System (ADS)
Caro Cuenca, Miguel; Esfahany, Sami Samiei; Hanssen, Ramon F.
2010-12-01
Persistent scatterer radar interferometry (PSI) can provide a wealth of information on surface motion. These methods overcome the major limitations of the predecessor technique, interferometric SAR (InSAR), such as atmospheric disturbances, by detecting scatterers that are only slightly affected by noise. The time span over which surface deformation processes can be observed is limited by the satellite lifetime, which is usually less than 10 years; however, most deformation phenomena last longer. In order to fully monitor and comprehend the observed signal, acquisitions from different sensors can be merged. This is a complex task for one main reason: PSI methods provide estimates that are relative in time to one of the acquisitions, referred to as the master or reference image. Therefore, time series acquired by different sensors will have different reference images and cannot be directly compared or joined unless they are set to the same time reference system. In global terms, the operation of translating from one reference system to another consists of calculating a vertical offset, which is the total deformation that occurs between the two master times. To estimate this offset, different strategies can be applied, for example using additional data such as leveling or GPS measurements. In this contribution we propose a least-squares approach to merge PSI time series without any ancillary information. This method treats the time series individually, i.e. per PS, and requires some knowledge of the deformation signal, for example whether a polynomial would fairly describe the expected behavior. To test the proposed approach, we applied it to the southern Netherlands, where the surface is affected by groundwater processes in abandoned mines. The time series were obtained after processing images provided by ERS-1/2 and Envisat. The results were validated using in-situ water measurements, which show very high correlation with the deformation time series.
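The offset estimation can be illustrated per persistent scatterer with a joint least-squares fit: below, a single polynomial deformation model (degree chosen arbitrarily) is fitted to both series while an indicator column absorbs the unknown vertical offset of the second series. This is a toy version of the approach, under the assumption that a polynomial describes the deformation.

```python
import numpy as np

def merge_psi_series(t1, d1, t2, d2, poly_deg=2):
    """Fit one polynomial in time to two PSI series of the same scatterer,
    solving simultaneously for the vertical offset of the second series
    (caused by its different master acquisition)."""
    t = np.concatenate([t1, t2])
    d = np.concatenate([d1, d2])
    # design matrix: polynomial terms plus an indicator column for series 2
    A = np.vander(t, poly_deg + 1, increasing=True)
    A = np.column_stack([A, np.concatenate([np.zeros(len(t1)), np.ones(len(t2))])])
    coef, *_ = np.linalg.lstsq(A, d, rcond=None)
    return coef[-1], coef[:-1]   # offset, polynomial coefficients
```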
Wavelet entropy of BOLD time series: An application to Rolandic epilepsy.
Gupta, Lalit; Jansen, Jacobus F A; Hofman, Paul A M; Besseling, René M H; de Louw, Anton J A; Aldenkamp, Albert P; Backes, Walter H
2017-12-01
To assess the wavelet entropy for the characterization of intrinsic aberrant temporal irregularities in the time series of resting-state blood-oxygen-level-dependent (BOLD) signal fluctuations; further, to evaluate the temporal irregularities (disorder/order) on a voxel-by-voxel basis in the brains of children with Rolandic epilepsy. The BOLD time series was decomposed using the discrete wavelet transform and the wavelet entropy was calculated. Using a model time series consisting of multiple harmonics and nonstationary components, the wavelet entropy was compared with Shannon and spectral (Fourier-based) entropy. As an application, the wavelet entropy in 22 children with Rolandic epilepsy was compared to that in 22 age-matched healthy controls. The images were obtained by performing resting-state functional magnetic resonance imaging (fMRI) using a 3T system, an 8-element receive-only head coil, and an echo planar imaging pulse sequence (T2*-weighted). The wavelet entropy was also compared to spectral entropy, regional homogeneity, and Shannon entropy. Wavelet entropy was found to identify the nonstationary components of the model time series. In Rolandic epilepsy patients, a significantly elevated wavelet entropy was observed relative to controls for the whole cerebrum (P = 0.03). Spectral entropy (P = 0.41), regional homogeneity (P = 0.52), and Shannon entropy (P = 0.32) did not reveal significant differences. The wavelet entropy measure appeared more sensitive in detecting abnormalities in cerebral fluctuations represented by nonstationary effects in the BOLD time series than more conventional measures. This effect was observed in the model time series as well as in Rolandic epilepsy. These observations suggest that the brains of children with Rolandic epilepsy exhibit stronger nonstationary temporal signal fluctuations than controls. Level of Evidence: 2. Technical Efficacy: Stage 3. J. Magn. Reson. Imaging 2017;46:1728-1737. © 2017 International Society for Magnetic Resonance in Medicine.
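For reference, a generic wavelet-entropy calculation on a single BOLD (or any) time series could look like the sketch below; the wavelet family and decomposition depth are assumptions, not the parameters used in the study.

```python
import numpy as np
import pywt

def wavelet_entropy(x, wavelet='db4', level=5):
    """Discrete wavelet decomposition, relative energy per decomposition
    level, then Shannon entropy of those relative energies."""
    coeffs = pywt.wavedec(np.asarray(x, dtype=float), wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    p = energies / energies.sum()
    return -np.sum(p * np.log(p + 1e-12))
```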
NASA Astrophysics Data System (ADS)
Rivera Landa, Rogelio; Cardenas Cardenas, Eduardo; Fossion, Ruben; Pérez Zepeda, Mario Ulises
2014-11-01
Technological advances in the last few decades allow the monitoring of many physiological observables in a continuous way, producing what in physics is called a "time series". The best-studied physiological time series is that of the heart rhythm, which can be derived from an electrocardiogram (ECG). Studies have shown that a healthy heart is characterized by a complex time series and high heart rate variability (HRV). In adverse conditions, the cardiac time series degenerates towards randomness (as seen in, e.g., fibrillation) or rigidity (as seen in, e.g., ageing), both corresponding to a loss of HRV, as described by, e.g., Goldberger et al. [1]. Cardiac and digestive rhythms are regulated by the autonomic nervous system (ANS), which consists of two antagonistic branches: the orthosympathetic branch (ONS), which accelerates the cardiac rhythm but decelerates the digestive system, and the parasympathetic branch (PNS), which works in the opposite way. For this reason, one might expect that the statistics of gastro-esophageal time series, as described by Gardner et al. [2,3], reflect the health state of the digestive system in a similar way as HRV does in the cardiac case, as described by Minocha et al. In the present project, we apply statistical methods derived from HRV analysis to time series of esophageal acidity (24h pH-metry). The study is realized on data from a large patient population from the Instituto Nacional de Ciencias Médicas y Nutrición Salvador Zubirán. Our focus is on patients with functional disease (symptoms but no anatomical damage). We find that traditional statistical approaches (e.g. Fourier spectral analysis) are unable to distinguish between different degenerations of the digestive system, such as gastro-esophageal reflux disease (GERD) or functional gastrointestinal disorder (FGID).
Araújo, Ricardo de A
2010-12-01
This paper presents a hybrid intelligent methodology to design increasing translation invariant morphological operators applied to Brazilian stock market prediction (overcoming the random walk dilemma). The proposed Translation Invariant Morphological Robust Automatic phase-Adjustment (TIMRAA) method consists of a hybrid intelligent model composed of a Modular Morphological Neural Network (MMNN) with a Quantum-Inspired Evolutionary Algorithm (QIEA), which searches for the best time lags to reconstruct the phase space of the time series generator phenomenon and determines the initial (sub-optimal) parameters of the MMNN. Each individual of the QIEA population is further trained by the Back Propagation (BP) algorithm to improve the MMNN parameters supplied by the QIEA. Also, for each prediction model generated, it uses a behavioral statistical test and a phase fix procedure to adjust time phase distortions observed in stock market time series. Furthermore, an experimental analysis is conducted with the proposed method through four Brazilian stock market time series, and the achieved results are discussed and compared to results found with random walk models and the previously introduced Time-delay Added Evolutionary Forecasting (TAEF) and Morphological-Rank-Linear Time-lag Added Evolutionary Forecasting (MRLTAEF) methods. Copyright © 2010 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Shean, D. E.; Joughin, I.; Smith, B.; Floricioiu, D.
2015-12-01
Greenland's large marine-terminating outlet glaciers have displayed marked retreat, speedup, and thinning in recent decades. Jakobshavn Isbrae, one of Greenland's largest outlet glaciers, has retreated ~15 km, accelerated ~150%, and thinned ~200 m since the early 1990s. Here, we present a comprehensive analysis of high-resolution elevation (~2-5 m/px) and velocity (~100 m/px) time series with dense temporal coverage (daily-monthly). The Jakobshavn DEM time series consists of >70 WorldView-1/2/3 stereo DEMs and >11 TanDEM-X DEMs spanning 2008-2015. Complementary point elevation data from Operation IceBridge (ATM, LVIS), pre-IceBridge ATM flights, and ICESat-1 GLAS extend the surface elevation record to 1999 and provide essential absolute control data, enabling sub-meter horizontal/vertical accuracy for gridded DEMs. Velocity data are primarily derived from TerraSAR-X/TanDEM-X image pairs with 11-day interval from 2009-2015. These elevation and velocity data capture outlet glacier evolution with unprecedented detail during the post-ICESat era. The lower trunk of Jakobshavn displays significant seasonal velocity variations, with recent rates of ~8 km/yr during winter and >17 km/yr during summer. DEM data show corresponding seasonal elevation changes of -30 to -45 m in summer and +15 to +20 m in winter, with decreasing magnitude upstream. Seasonal discharge varies from ~30-35 Gt/yr in winter to ~45-55 Gt/yr in summer, and we integrate these measurements for improved long-term mass-balance estimates. Recent interannual trends show increased discharge, velocity, and thinning (-15 to -20 m/yr), which is consistent with long-term altimetry records. The DEM time series also reveal new details about calving front and mélange evolution during the seasonal cycle. Similar time series are available for Kangerdlugssuaq and Helheim Glaciers. These observations are improving our understanding of outlet glacier dynamics, while complementing ongoing efforts to constrain estimates for ice-sheet mass balance and present/future sea level rise contributions.
This study applied a phenology-based land-cover classification approach across the Laurentian Great Lakes Basin (GLB) using time-series data consisting of 23 Moderate Resolution Imaging Spectroradiometer (MODIS) Normalized Difference Vegetation Index (NDVI) composite images (250 ...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-15
... Atlantic stock of black sea bass and golden tilefish will consist of a series of three workshops: A Data Workshop, an Assessment Workshop, and a Review Workshop. The Review Workshop date, time, and location will...
NASA Astrophysics Data System (ADS)
Zhu, Ning; Sun, Shouguang; Li, Qiang; Zou, Hua
2016-05-01
When a train runs at high speed, the external exciting frequencies approach the natural frequencies of critical bogie components, thereby inducing strong elastic vibrations. The present international reliability test evaluation standards and design criteria for bogie frames are all based on the quasi-static deformation hypothesis; structural fatigue damage generated by elastic vibrations has not yet been included. In this paper, theoretical research and experimental validation are carried out on the elastic dynamic load spectra of the bogie frame of a high-speed train. The construction of the load series that correspond to the elastic dynamic deformation modes is studied, and the simplified form of the load series is obtained. A theory of simplified dynamic load-time histories is then deduced. Measured data from the Beijing-Shanghai Dedicated Passenger Line are introduced to derive the simplified dynamic load-time histories, and the simplified dynamic discrete load spectra of the bogie frame are established. Based on the damage consistency criterion and a genetic algorithm, damage consistency calibration of the simplified dynamic load spectra is finally performed. The computed result proves that the simplified load series is reasonable. The calibrated damage corresponding to the elastic dynamic discrete load spectra can cover the actual damage under operating conditions and satisfies the safety requirement of the damage consistency criterion for the bogie frame. This research is helpful for investigating standardized load spectra for the bogie frames of high-speed trains.
Appropriate use of the increment entropy for electrophysiological time series.
Liu, Xiaofeng; Wang, Xue; Zhou, Xu; Jiang, Aimin
2018-04-01
The increment entropy (IncrEn) is a new measure for quantifying the complexity of a time series. There are three critical parameters in the IncrEn calculation: N (length of the time series), m (dimensionality), and q (quantifying precision). However, the question of how to choose the most appropriate combination of IncrEn parameters for short datasets has not been extensively explored. The purpose of this research was to provide guidance on choosing suitable IncrEn parameters for short datasets by exploring the effects of varying the parameter values. We used simulated data, epileptic EEG data and cardiac interbeat (RR) data to investigate the effects of the parameters on the calculated IncrEn values. The results reveal that IncrEn is sensitive to changes in m, q and N for short datasets (N≤500). However, IncrEn reaches stability at a data length of N=1000 with m=2 and q=2, and for short datasets (N=100), it shows better relative consistency with 2≤m≤6 and 2≤q≤8. We suggest that the value of N should be no less than 100. To enable a clear distinction between different classes based on IncrEn, we recommend that m and q take values between 2 and 4. With appropriate parameters, IncrEn enables the effective detection of complexity variations in physiological time series, suggesting that IncrEn should be useful for the analysis of physiological time series in clinical applications. Copyright © 2018 Elsevier Ltd. All rights reserved.
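For orientation only, the sketch below implements one plausible reading of IncrEn with parameters m and q: increments are coded by their sign plus a magnitude quantized to q levels (the exact quantization rule here is an assumption), consecutive codes form words of length m, and the normalized Shannon entropy of the word distribution is returned.

```python
import numpy as np
from collections import Counter

def increment_entropy(x, m=2, q=2):
    """Illustrative increment entropy: words of length m built from
    (sign, quantized magnitude) codes of the increment series."""
    x = np.asarray(x, dtype=float)
    v = np.diff(x)
    sign = np.sign(v).astype(int)
    std = np.std(v)
    # assumed quantization: scale by the increment standard deviation, cap at q-1
    mag = (np.zeros_like(sign) if std == 0 else
           np.minimum(np.floor(np.abs(v) * q / std), q - 1).astype(int))
    words = [tuple(zip(sign[i:i + m], mag[i:i + m]))
             for i in range(len(v) - m + 1)]
    counts = np.array(list(Counter(words).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p)) / m
```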
The nature of turbulence in a triangular lattice gas automaton
NASA Astrophysics Data System (ADS)
Duong-Van, Minh; Feit, M. D.; Keller, P.; Pound, M.
1986-12-01
Power spectra calculated from the coarse-graining of a simple lattice gas automaton, and those from time-averaging other stochastic time series that we have investigated, have exponents in the range -1.6 to -2, consistent with observations of fully developed turbulence. This power spectrum is a natural consequence of coarse-graining; the exponent -2 represents the continuum limit.
A Bayesian account of quantum histories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marlow, Thomas
2006-05-15
We investigate whether quantum history theories can be consistent with Bayesian reasoning and whether such an analysis helps clarify the interpretation of such theories. First, we summarise and extend recent work categorising two different approaches to formalising multi-time measurements in quantum theory. The standard approach consists of describing an ordered series of measurements in terms of history propositions with non-additive 'probabilities'. The non-standard approach consists of defining multi-time measurements to consist of sets of exclusive and exhaustive history propositions and recovering the single-time exclusivity of results when discussing single-time history propositions. We analyse whether such history propositions can be consistent with Bayes' rule. We show that a certain class of histories is given a natural Bayesian interpretation, namely, the linearly positive histories originally introduced by Goldstein and Page. Thus, we argue that this gives a certain amount of interpretational clarity to the non-standard approach. We also attempt a justification of our analysis using Cox's axioms of probability theory.
NASA Technical Reports Server (NTRS)
Beckley, Brian D.; Ray, Richard D.; Lemoine, Frank G.; Zelensky, N. P.; Holmes, S. A.; Desai, Shailen D.; Brown, Shannon; Mitchum, G. T.; Jacob, Samuel; Luthcke, Scott B.
2010-01-01
The science value of satellite altimeter observations has grown dramatically over time as enabling models and technologies have increased the value of data acquired on both past and present missions. With the prospect of an observational time series extending into several decades from TOPEX/Poseidon through Jason-1 and the Ocean Surface Topography Mission (OSTM), and further in time with a future set of operational altimeters, researchers are pushing the bounds of current technology and modeling capability in order to monitor the global sea level rate at an accuracy of a few tenths of a mm/yr. The measurement of mean sea-level change from satellite altimetry requires extreme stability of the altimeter measurement system, since the signal being measured is at the level of a few mm/yr. This means that the orbit and reference frame within which the altimeter measurements are situated, and the associated altimeter corrections, must be stable and accurate enough to permit a robust MSL estimate. Foremost, orbit quality and consistency are critical to satellite altimeter measurement accuracy. The orbit defines the altimeter reference frame, and orbit error directly affects the altimeter measurement. Orbit error remains a major component in the error budget of all past and present altimeter missions. For example, inconsistencies in the International Terrestrial Reference Frame (ITRF) used to produce the precision orbits at different times cause systematic inconsistencies to appear in the multi-mission time frame between TOPEX and Jason-1, and can affect the inter-mission calibration of these data. In an effort to maintain cross-mission consistency, we have generated the full time series of orbits for TOPEX/Poseidon (TP), Jason-1, and OSTM based on recent improvements in the satellite force models, reference systems, and modeling strategies. The recent release of the entire revised Jason-1 Geophysical Data Records, and the recalibration of the microwave radiometer correction, also require further re-examination of inter-mission consistency issues. Here we present an assessment of these recent improvements to the accuracy of the 17-year sea surface height time series, and evaluate the subsequent impact on global and regional mean sea level estimates.
Representations of time coordinates in FITS. Time and relative dimension in space
NASA Astrophysics Data System (ADS)
Rots, Arnold H.; Bunclark, Peter S.; Calabretta, Mark R.; Allen, Steven L.; Manchester, Richard N.; Thompson, William T.
2015-02-01
Context. In a series of three previous papers, formulation and specifics of the representation of world coordinate transformations in FITS data have been presented. This fourth paper deals with encoding time. Aims: Time on all scales and precisions known in astronomical datasets is to be described in an unambiguous, complete, and self-consistent manner. Methods: Employing the well-established World Coordinate System (WCS) framework, and maintaining compatibility with the FITS conventions that are currently in use to specify time, the standard is extended to describe rigorously the time coordinate. Results: World coordinate functions are defined for temporal axes sampled linearly and as specified by a lookup table. The resulting standard is consistent with the existing FITS WCS standards and specifies a metadata set that achieves the aims enunciated above.
Time-Scale Modification of Complex Acoustic Signals in Noise
1994-02-04
[Front-matter list of figures, partially recovered: "...of a response from a closing stapler"; "Short-time processing of long waveforms"; "Time-scale expansion (x2) of sequence of transients using filter bank/overlap-add"; "Time-scale expansion (x2) of a closing stapler using filter bank/overlap-add"; "Composite subband time-scale..."] INTRODUCTION: Short-duration complex sounds, as from the closing of a stapler or the tapping of a drum stick, often consist of a series of brief
NASA Astrophysics Data System (ADS)
Lyons, Mitchell B.; Roelfsema, Chris M.; Phinn, Stuart R.
2013-03-01
The spatial and temporal dynamics of seagrasses have been well studied at the leaf to patch scales; however, the link to landscape- and population-scale dynamics over large spatial extents is still unresolved in seagrass ecology. Traditional remote sensing approaches have lacked the temporal resolution and consistency to appropriately address this issue. This study uses two high temporal resolution time-series of thematic seagrass cover maps to examine the spatial and temporal dynamics of seagrass at both inter- and intra-annual time scales, one of the first globally to do so at this scale. Previous work by the authors developed an object-based approach to map seagrass cover level distribution from a long-term archive of Landsat TM and ETM+ images on the Eastern Banks (≈200 km2), Moreton Bay, Australia. In this work a range of trend and time-series analysis methods are demonstrated for a time-series of 23 annual maps from 1988 to 2010 and a time-series of 16 monthly maps during 2008-2010. Significant new insight was presented regarding the inter- and intra-annual dynamics of seagrass persistence over time, seagrass cover level variability, seagrass cover level trajectory, and change in area of seagrass and cover levels over time. Overall we found that there was no significant decline in total seagrass area on the Eastern Banks, but there was a significant decline in seagrass cover level condition. A case study of two smaller communities within the Eastern Banks that experienced a decline in both overall seagrass area and condition is examined in detail, highlighting possible differences in environmental and process drivers. We demonstrate how trend and time-series analysis enabled seagrass distribution to be appropriately assessed in the context of its spatial and temporal history and provides the ability to not only quantify change but also describe the type of change. We also demonstrate the potential use of time-series analysis products to investigate seagrass growth and decline as well as the processes that drive it. This study demonstrates clear benefits over traditional seagrass mapping and monitoring approaches, and provides a proof of concept for the use of trend and time-series analysis of remotely sensed seagrass products to benefit current endeavours in seagrass ecology.
Visibility graph analysis of heart rate time series and bio-marker of congestive heart failure
NASA Astrophysics Data System (ADS)
Bhaduri, Anirban; Bhaduri, Susmita; Ghosh, Dipak
2017-09-01
The RR-interval time series in congestive heart failure has been studied with a variety of methods, including non-linear ones. In this article the cardiac dynamics of the heart beat are explored in the light of complex network analysis, viz. the visibility graph method. Heart beat (RR interval) time series data taken from the Physionet database [46, 47], belonging to two groups of subjects, diseased (congestive heart failure; 29 subjects) and normal (54 subjects), are analyzed with the technique. The overall results show that a quantitative parameter can significantly differentiate between the diseased subjects and the normal subjects as well as between different stages of the disease. Further, when the data are split into periods of around 1 hour each and analyzed separately, the same consistent differences are observed. This quantitative parameter obtained using the visibility graph analysis can therefore be used as a potential bio-marker as well as a subsequent alarm generation mechanism for predicting the onset of congestive heart failure.
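The core construction can be sketched as follows: a brute-force natural visibility graph of an RR-interval series, returning node degrees, from which degree-based summary statistics (one plausible choice of quantitative parameter; the paper's specific parameter may differ) can be derived.

```python
import numpy as np

def natural_visibility_degrees(y):
    """Natural visibility graph of a time series (brute force, O(n^2)):
    samples a and b are connected if every intermediate sample lies strictly
    below the straight line joining (a, y[a]) and (b, y[b]).
    Returns the degree of each node."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    degree = np.zeros(n, dtype=int)
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                y[c] < y[a] + (y[b] - y[a]) * (c - a) / (b - a)
                for c in range(a + 1, b))
            if visible:
                degree[a] += 1
                degree[b] += 1
    return degree
```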
Proisy, Christophe; Viennois, Gaëlle; Sidik, Frida; Andayani, Ariani; Enright, James Anthony; Guitet, Stéphane; Gusmawati, Niken; Lemonnier, Hugues; Muthusankar, Gowrappan; Olagoke, Adewole; Prosperi, Juliana; Rahmania, Rinny; Ricout, Anaïs; Soulard, Benoit; Suhardjono
2018-06-01
Revegetation of abandoned aquaculture regions should be a priority for any integrated coastal zone management (ICZM). This paper examines the potential of a matchless time series of 20 very high spatial resolution (VHSR) optical satellite images acquired for mapping trends in the evolution of mangrove forests from 2001 to 2015 in an estuary fragmented into aquaculture ponds. Evolution of mangrove extent was quantified through robust multitemporal analysis based on supervised image classification. Results indicated that mangroves are expanding inside and outside ponds and over pond dykes. However, the yearly expansion rate of vegetation cover varied greatly between replanted ponds. Ground truthing showed that only Rhizophora species had been planted, whereas natural mangroves consist of Avicennia and Sonneratia species. In addition, the dense Rhizophora plantations present very low regeneration capabilities compared with natural mangroves. Time series of VHSR images provide a comprehensive and intuitive level of information in support of ICZM. Copyright © 2017 Elsevier Ltd. All rights reserved.
State energy data report 1996: Consumption estimates
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The State Energy Data Report (SEDR) provides annual time series estimates of State-level energy consumption by major economic sectors. The estimates are developed in the Combined State Energy Data System (CSEDS), which is maintained and operated by the Energy Information Administration (EIA). The goal in maintaining CSEDS is to create historical time series of energy consumption by State that are defined as consistently as possible over time and across sectors. CSEDS exists for two principal reasons: (1) to provide State energy consumption estimates to Members of Congress, Federal and State agencies, and the general public and (2) to provide the historical series necessary for EIA's energy models. To the degree possible, energy consumption has been assigned to five sectors: residential, commercial, industrial, transportation, and electric utility. Fuels covered are coal, natural gas, petroleum, nuclear electric power, hydroelectric power, biomass, and other, defined as electric power generated from geothermal, wind, photovoltaic, and solar thermal energy. 322 tabs.
Variation in organic matter and water color in Lake Mälaren during the past 70 years.
Johansson, L; Temnerud, J; Abrahamsson, J; Berggren Kleja, D
2010-03-01
Interest in long time series of organic matter data has recently increased due to concerns about the effects of global climate change on aquatic ecosystems. This study presents and evaluates unique time series of chemical oxygen demand (COD) and water color from Lake Mälaren, Sweden, stretching over almost seven decades (1935-2004). A negative linear trend was found in COD, but not in water color. The decrease was mainly due to the installation of sewage works around 1970. The time series of COD and water color showed cyclic patterns, strongest for COD, with a 23-year periodicity. Similar periodicity observed in air temperature and precipitation in Sweden has been attributed to the North Atlantic Oscillation index and the solar system orbit, suggesting that COD in Lake Mälaren is partly derived from algae. Discharge influenced water color more than COD, possibly because water color consists of colored substances brought into the lake from surrounding soils.
Weekly Solutions of Time-Variable Gravity from 1993 to 2010
NASA Technical Reports Server (NTRS)
Lemoine, F.; Chinn, D.; Le Bail, K.; Zelensky, N.; Melachroinos, S.; Beall, J.
2011-01-01
The GRACE mission has been highly successful in determining the time-variable gravity field of the Earth, producing monthly or even more frequent (e.g., 10-day) solutions using both spherical harmonics and mascons. However, the GRACE time series only commences in 2002-2003, and a gap of several years may occur in the series before a GRACE follow-on satellite is launched. Satellites tracked by SLR and DORIS have also been used to study time variations in the Earth's gravitational field. These include (most recently) the solutions of Cox and Chao (2002), Cheng et al. (2004, 2007) and Lemoine et al. (2007). In this paper we discuss the development of a new time series of low-degree spherical harmonic fields based on the available SLR, DORIS and GPS data. We develop simultaneous solutions for both the geocenter and the low-degree harmonics up to 5x5. The solutions integrate data from SLR geodetic satellites (e.g., Lageos-1, Lageos-2, Starlette, Stella, Ajisai, Larets, Westpac), altimetry satellites (TOPEX/Poseidon, Envisat, Jason-1, Jason-2), and satellites tracked solely by DORIS (e.g., SPOT-2 through SPOT-5). We discuss some pertinent aspects of the satellite-specific modeling, and we include altimeter crossovers in the weekly solutions where feasible and as time permits. The resulting geocenter time series is compared with geophysical model predictions and other independently derived solutions. Over the GRACE time period, the fidelity and consistency with the GRACE solutions are presented.
Swarzenski, Peter; Reich, Chris; Rudnick, David
2009-01-01
Estimates of submarine ground-water discharge (SGD) into Florida Bay remain one of the least understood components of a regional water balance. To quantify the magnitude and seasonality of SGD into upper Florida Bay, research activities included the use of the natural geochemical tracer 222Rn to examine potential SGD hotspots (222Rn surveys) and to quantify total (saline + fresh water) SGD rates at select sites (222Rn time-series). To obtain a synoptic map of the 222Rn distribution within our study site in Florida Bay, we set up a flow-through system on a small boat that consisted of a Differential Global Positioning System, a calibrated YSI, Inc. CTD sensor with a sampling rate of 0.5 min, and a submersible pump (z = 0.5 m) that continuously fed water into an air/water exchanger plumbed simultaneously into four RAD7 222Rn air monitors. To obtain local advective ground-water flux estimates, 222Rn time-series experiments were deployed at strategic positions across hydrologic and geologic gradients within our study site. These time-series stations consisted of a submersible pump, a Solinist DIVER (to record continuous CTD parameters) and two RAD7 222Rn air monitors plumbed into an air/water exchanger. Repeat time-series 222Rn measurements were conducted for 3-4 days across several tidal excursions. Radon was also measured in the air during each sampling campaign by a dedicated RAD7. We obtained ground-water discharge information by calculating a 222Rn mass balance that accounted for lateral and horizontal exchange, as well as an appropriate ground-water 222Rn end-member activity. Another research component utilized marine continuous resistivity profiling (CRP) surveys to examine the subsurface salinity structure within Florida Bay sediments. This system consisted of an AGI SuperSting 8-channel receiver attached to a streamer cable that had two current (A, B) electrodes and nine potential electrodes spaced 10 m apart. A separate DGPS continuously sent position information to the SuperSting. Results indicate that the 222Rn maps provide a useful gauge of relative ground-water discharge into upper Florida Bay. The 222Rn time-series measurements provide a reasonable estimate of site-specific total (saline and fresh) ground-water discharge (mean = 12.5 ± 11.8 cm d-1), while the saline nature of the shallow ground water at our study site, as evidenced by the CRP results, indicates that most of this discharge must be recycled sea water. The CRP data show some interesting trends that appear to be consistent with the subsurface geologic and hydrologic characterization. For example, some of the highest resistivity (inverse of electrical conductivity) values were recorded where one would expect a slight subsurface freshening (for example, bayside Key Largo, or below the C-111 canal).
Performance of time-series methods in forecasting the demand for red blood cell transfusion.
Pereira, Arturo
2004-05-01
Planning the future blood collection efforts must be based on adequate forecasts of transfusion demand. In this study, univariate time-series methods were investigated for their performance in forecasting the monthly demand for RBCs at one tertiary-care, university hospital. Three time-series methods were investigated: autoregressive integrated moving average (ARIMA), the Holt-Winters family of exponential smoothing models, and one neural-network-based method. The time series consisted of the monthly demand for RBCs from January 1988 to December 2002 and was divided into two segments: the older one was used to fit or train the models, and the younger to test the accuracy of predictions. Performance was compared across forecasting methods by calculating goodness-of-fit statistics, the percentage of months in which forecast-based supply would have met the RBC demand (coverage rate), and the outdate rate. The RBC transfusion series was best fitted by a seasonal ARIMA(0,1,1)(0,1,1)(12) model. Over 1-year time horizons, forecasts generated by ARIMA or exponential smoothing lay within the +/- 10 percent interval of the real RBC demand in 79 percent of months (62% in the case of neural networks). The coverage rate for the three methods was 89, 91, and 86 percent, respectively. Over 2-year time horizons, exponential smoothing largely outperformed the other methods. Predictions by exponential smoothing lay within the +/- 10 percent interval of real values in 75 percent of the 24 forecasted months, and the coverage rate was 87 percent. Over 1-year time horizons, predictions of RBC demand generated by ARIMA or exponential smoothing are accurate enough to be of help in the planning of blood collection efforts. For longer time horizons, exponential smoothing outperforms the other forecasting methods.
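With statsmodels, the two classical approaches compared here can be reproduced in a few lines; the sketch below assumes a hypothetical monthly series `monthly_rbc` (a pandas Series with a monthly index) and mirrors the seasonal ARIMA(0,1,1)(0,1,1)12 specification reported, alongside an additive Holt-Winters model.

```python
from statsmodels.tsa.statespace.sarimax import SARIMAX
from statsmodels.tsa.holtwinters import ExponentialSmoothing

def forecast_rbc(monthly_rbc, horizon=12):
    """Fit on all but the last `horizon` months, forecast the held-out span."""
    train = monthly_rbc[:-horizon]
    arima = SARIMAX(train, order=(0, 1, 1),
                    seasonal_order=(0, 1, 1, 12)).fit(disp=False)
    hw = ExponentialSmoothing(train, trend='add', seasonal='add',
                              seasonal_periods=12).fit()
    return arima.forecast(horizon), hw.forecast(horizon)
```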
Dual redundant core memory systems
NASA Technical Reports Server (NTRS)
Hull, F. E.
1972-01-01
An electronic memory system consisting of series-redundant drive switch circuits, triple-redundant majority-voted memory timing functions, and two data registers to provide functional dual redundancy is described. Signal flow through the circuits is illustrated, and the sequence of events that occurs within the memory system is explained.
Wavelet application to the time series analysis of DORIS station coordinates
NASA Astrophysics Data System (ADS)
Bessissi, Zahia; Terbeche, Mekki; Ghezali, Boualem
2009-06-01
The topic developed in this article is the analysis of residual time series of DORIS station coordinates using the wavelet transform. Several analysis techniques, already developed in other disciplines, were employed in the statistical study of the geodetic time series of stations. The wavelet transform allows one, on the one hand, to characterize the residual signals in both time and frequency, and on the other hand, to determine and quantify systematic signals such as periodicity and tendency. Tendency is the short- or long-term change in a signal; it is an average curve that represents the general pace of the signal's evolution. Periodicity, on the other hand, is a process that repeats itself identically after a time interval called the period. In this context, the work of this article consists, on the one hand, in determining the systematic signals by wavelet analysis of the time series of DORIS station coordinates, and on the other hand, in applying wavelet-packet denoising, which makes it possible to obtain a well-filtered signal, smoother than the original. The DORIS data used in the treatment are a set of weekly residual time series from 1993 to 2004 from eight stations: DIOA, COLA, FAIB, KRAB, SAKA, SODB, THUB and SYPB. It is the ign03wd01 solution expressed in stcd format, derived by the IGN/JPL analysis center. Although these data are not very recent, the goal of this study is to assess the contribution of the wavelet analysis method to the DORIS data, compared with the other analysis methods already studied.
NASA Astrophysics Data System (ADS)
Butler, P. G.; Scourse, J. D.; Richardson, C. A.; Wanamaker, A. D., Jr.
2009-04-01
Determinations of the local correction (ΔR) to the globally averaged marine radiocarbon reservoir age are often isolated in space and time, derived from heterogeneous sources and constrained by significant uncertainties. Although time series of ΔR at single sites can be obtained from sediment cores, these are subject to multiple uncertainties related to sedimentation rates, bioturbation and interspecific variations in the source of radiocarbon in the analysed samples. Coral records provide better resolution, but these are available only for tropical locations. It is shown here that it is possible to use the shell of the long-lived bivalve mollusc Arctica islandica as a source of high-resolution time series of absolutely dated marine radiocarbon determinations for the shelf seas surrounding the North Atlantic Ocean. Annual growth increments in the shell can be crossdated and chronologies can be constructed in precise analogy with the use of tree rings. Because the calendar dates of the samples are known, ΔR can be determined with high precision and accuracy, and because all the samples are from the same species, the time series of ΔR values possesses a high degree of internal consistency. Presented here is a multi-centennial (AD 1593 - AD 1933) time series of 31 ΔR values for a site in the Irish Sea close to the Isle of Man. The mean value of ΔR (-62 14C yrs) does not change significantly during this period, but increased variability is apparent before AD 1750.
Performance evaluation of the croissant production line with reparable machines
NASA Astrophysics Data System (ADS)
Tsarouhas, Panagiotis H.
2015-03-01
In this study, analytical probability models were developed for an automated, bufferless serial production system that consists of n machines in series with a common transfer mechanism and control system. Both the time to failure and the time to repair a failure are assumed to follow exponential distributions. Applying these models, the effect of system parameters on system performance in an actual croissant production line was studied. The production line consists of six workstations with different numbers of repairable machines in series. Mathematical models of the croissant production line have been developed using a Markov process. The strength of this study lies in the classification of the whole system into states representing failures of the different machines. Failure and repair data from the actual production environment have been used to estimate reliability and maintainability for each machine, each workstation, and the entire line based on the analytical models. The analysis provides useful insight into the system's behaviour, helps to find inherent design faults, and suggests optimal modifications to upgrade the system and improve its performance.
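A drastically simplified version of such a Markov availability model is shown below: one 'all machines up' state plus one 'machine i down' state per machine, exponential rates, and the assumption that the stopped line suffers no further failures; the closed-form steady-state availability follows directly. The rates in the usage comment are hypothetical, not the paper's field data.

```python
def line_availability(failure_rates, repair_rates):
    """Steady-state availability of a bufferless n-machine series line.
    States: 0 = all up, i = machine i down (line stopped, no further failures).
    Balance equations give P_i = P_0 * lambda_i / mu_i, so
    availability = P_0 = 1 / (1 + sum(lambda_i / mu_i))."""
    rho = sum(lam / mu for lam, mu in zip(failure_rates, repair_rates))
    return 1.0 / (1.0 + rho)

# e.g. six workstations with hypothetical failure/repair rates per hour:
# line_availability([0.01, 0.02, 0.015, 0.01, 0.03, 0.02], [0.5] * 6)
```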
Matsunaga, Yasuhiro; Sugita, Yuji
2018-05-03
Single-molecule experiments and molecular dynamics (MD) simulations are indispensable tools for investigating protein conformational dynamics. The former provide time-series data, such as donor-acceptor distances, whereas the latter give atomistic information, although this information is often biased by model parameters. Here, we devise a machine-learning method to combine the complementary information from the two approaches and construct a consistent model of conformational dynamics. It is applied to the folding dynamics of the formin-binding protein WW domain. MD simulations over 400 μs led to an initial Markov state model (MSM), which was then "refined" using single-molecule Förster resonance energy transfer (FRET) data through hidden Markov modeling. The refined or data-assimilated MSM reproduces the FRET data and features hairpin one in the transition-state ensemble, consistent with mutation experiments. The folding pathway in the data-assimilated MSM suggests interplay between hydrophobic contacts and turn formation. Our method provides a general framework for investigating conformational transitions in other proteins. © 2018, Matsunaga et al.
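The hidden Markov modeling half of such a data-assimilation scheme can be illustrated with the third-party hmmlearn package; this is a generic Gaussian HMM on a FRET-efficiency trace, not the authors' code, and it omits the coupling to the MD-derived Markov state model.

```python
import numpy as np
from hmmlearn import hmm   # assumes the hmmlearn package is installed

def fit_fret_hmm(efficiency, n_states=2):
    """Fit a Gaussian HMM to a single-molecule FRET efficiency trace and
    return the transition matrix, state means, and decoded state sequence."""
    X = np.asarray(efficiency, dtype=float).reshape(-1, 1)
    model = hmm.GaussianHMM(n_components=n_states, covariance_type='diag',
                            n_iter=200)
    model.fit(X)
    states = model.predict(X)   # most likely hidden-state sequence
    return model.transmat_, model.means_.ravel(), states
```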
Agatha: Disentangling period signals from correlated noise in a periodogram framework
NASA Astrophysics Data System (ADS)
Feng, F.; Tuomi, M.; Jones, H. R. A.
2018-04-01
Agatha is a framework of periodograms to disentangle periodic signals from correlated noise and to solve the two-dimensional model selection problem: signal dimension and noise model dimension. These periodograms are calculated by applying likelihood maximization and marginalization and combined in a self-consistent way. Agatha can be used to select the optimal noise model and to test the consistency of signals in time and can be applied to time series analyses in other astronomical and scientific disciplines. An interactive web implementation of the software is also available at http://agatha.herts.ac.uk/.
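As a point of comparison, a plain generalized Lomb-Scargle periodogram (via astropy) finds the strongest periodic signal in an unevenly sampled series; Agatha's periodograms additionally maximize or marginalize the likelihood over correlated-noise models, which this sketch does not attempt.

```python
import numpy as np
from astropy.timeseries import LombScargle

def strongest_period(t, rv, rv_err):
    """Return the period of the highest peak in a Lomb-Scargle periodogram
    of an unevenly sampled series (e.g. radial velocities with errors)."""
    frequency, power = LombScargle(t, rv, rv_err).autopower()
    return 1.0 / frequency[np.argmax(power)]
```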
A Methodology for the Parametric Reconstruction of Non-Steady and Noisy Meteorological Time Series
NASA Astrophysics Data System (ADS)
Rovira, F.; Palau, J. L.; Millán, M.
2009-09-01
Climatic and meteorological time series often show some persistence (in time) in the variability of certain features. One could regard annual, seasonal and diurnal variability as trivial persistence in the variability of some meteorological magnitudes (e.g., global radiation, air temperature above the surface, etc.). In these cases, the traditional Fourier transform into frequency space will show the principal harmonics as the components with the largest amplitude. Nevertheless, meteorological measurements often show other, non-steady (in time) variability. Some fluctuations in measurements (at different time scales) are driven by processes that prevail on some days (or months) of the year but disappear on others. By decomposing a time series into time-frequency space through the continuous wavelet transformation, one is able to determine both the dominant modes of variability and how those modes vary in time. This study is based on a numerical methodology to analyse non-steady principal harmonics in noisy meteorological time series. This methodology combines the continuous wavelet transform with the development of a parametric model that includes the time evolution of the principal and most statistically significant harmonics of the original time series. The parameterisation scheme proposed in this study consists of reproducing the original time series by means of a statistically significant finite sum of sinusoidal signals (waves), each defined by the three usual parameters: amplitude, frequency and phase. To ensure the statistical significance of the parametric reconstruction of the original signal, we propose a standard Student's t analysis of the confidence level of the amplitude of each wave component in the parametric spectrum. Once the level of significance of the different wave components has been established, the statistically significant principal harmonics (in time) of the original time series can be obtained by using the Fourier transform of the modelled signal. Acknowledgements: The CEAM Foundation is supported by the Generalitat Valenciana and BANCAIXA (València, Spain). This study has been partially funded by the European Commission (FP VI, Integrated Project CIRCE - No. 036961) and by the Ministerio de Ciencia e Innovación, research projects "TRANSREG" (CGL2007-65359/CLI) and "GRACCIE" (CSD2007-00067, Program CONSOLIDER-INGENIO 2010).
NASA Astrophysics Data System (ADS)
Burgette, Reed J.; Watson, Christopher S.; Church, John A.; White, Neil J.; Tregoning, Paul; Coleman, Richard
2013-08-01
We quantify the rate of sea level rise around the Australian continent from an analysis of tide gauge and Global Positioning System (GPS) data sets. To estimate the underlying linear rates of sea level change in the presence of significant interannual and decadal variability (treated here as noise), we adopt and extend a novel network adjustment approach. We simultaneously estimate time-correlated noise as well as linear model parameters and realistic uncertainties from sea level time series at individual gauges, as well as from time-series differences computed between pairs of gauges. The noise content at individual gauges is consistent with a combination of white and time-correlated noise. We find that the noise in time series from the western coast of Australia is best described by a first-order Gauss-Markov model, whereas east coast stations generally exhibit lower levels of time-correlated noise that is better described by a power-law process. These findings suggest several decades of monthly tide gauge data are needed to reduce rate uncertainties to <0.5 mm yr-1 for undifferenced single site time series with typical noise characteristics. Our subsequent adjustment strategy exploits the more precise differential rates estimated from differenced time series from pairs of tide gauges to estimate rates among the network of 43 tide gauges that passed a stability analysis. We estimate relative sea level rates over three temporal windows (1900-2011, 1966-2011 and 1993-2011), accounting for covariance between time series. The resultant adjustment reduces the rate uncertainty across individual gauges, and partially mitigates the need for century-scale time series at all sites in the network. Our adjustment reveals a spatially coherent pattern of sea level rise around the coastline, with the highest rates in northern Australia. Over the time periods beginning in 1900, 1966 and 1993, we find weighted average rates of sea level rise of 1.4 ± 0.6, 1.7 ± 0.6 and 4.6 ± 0.8 mm yr-1, respectively. While the temporal pattern of the rate estimates is consistent with acceleration in sea level rise, it may not be significant, as the uncertainties for the shorter analysis periods may not capture the full range of temporal variation. Analysis of the available continuous GPS records that have been collected within 80 km of Australian tide gauges suggests that rates of vertical crustal motion are generally low, with the majority of sites showing motion statistically insignificant from zero. A notable exception is the significant component of vertical land motion that contributes to the rapid rate of relative sea level change (>4 mm yr-1) at the Hillarys site in the Perth area. This corresponds to crustal subsidence that we estimate in our GPS analysis at a rate of -3.1 ± 0.7 mm yr-1, and appears linked to groundwater withdrawal. Uncertainties on the rates of vertical displacement at GPS sites collected over a decade are similar to what we measure in several decades of tide gauge data. Our results motivate continued observations of relative sea level using tide gauges, maintained with high-accuracy terrestrial and continuous co-located satellite-based surveying.
Validation of a national hydrological model
NASA Astrophysics Data System (ADS)
McMillan, H. K.; Booker, D. J.; Cattoën, C.
2016-10-01
Nationwide predictions of flow time-series are valuable for development of policies relating to environmental flows, calculating reliability of supply to water users, or assessing risk of floods or droughts. This breadth of model utility is possible because various hydrological signatures can be derived from simulated flow time-series. However, producing national hydrological simulations can be challenging due to strong environmental diversity across catchments and a lack of data available to aid model parameterisation. A comprehensive and consistent suite of test procedures to quantify spatial and temporal patterns in performance across various parts of the hydrograph is described and applied to quantify the performance of an uncalibrated national rainfall-runoff model of New Zealand. Flow time-series observed at 485 gauging stations were used to calculate Nash-Sutcliffe efficiency and percent bias when simulating between-site differences in daily series, between-year differences in annual series, and between-site differences in hydrological signatures. The procedures were used to assess the benefit of applying a correction to the modelled flow duration curve based on an independent statistical analysis. They were used to aid understanding of climatological, hydrological and model-based causes of differences in predictive performance by assessing multiple hypotheses that describe where and when the model was expected to perform best. As the procedures produce quantitative measures of performance, they provide an objective basis for model assessment that could be applied when comparing observed daily flow series with competing simulated flow series from any region-wide or nationwide hydrological model. Model performance varied in space and time with better scores in larger and medium-wet catchments, and in catchments with smaller seasonal variations. Surprisingly, model performance was not sensitive to aquifer fraction or rain gauge density.
NASA Technical Reports Server (NTRS)
Keeling, Ralph F.; Campbell, J. A. (Technical Monitor)
2002-01-01
We successfully initiated a program to obtain continuous time series of atmospheric O2 concentrations at a semi-remote coastal site in Trinidad, California. The installation, which was completed in September 1999, consists of commercially available O2 and CO2 analyzers interfaced to a custom gas handling system and housed in a dedicated building at the Trinidad site. Ultimately, the data from this site are expected to provide constraints, complementing satellite data, on variations in ocean productivity and carbon exchange on annual and interannual time scales, in the context of human-induced changes in global climate and other perturbations. The existing time-series, of limited duration, have been used in support of studies of the O2/CO2 exchange from a wildfire (which fortuitously occurred nearby in October 1999) and to quantify air-sea N2O and O2 exchanges related to coastal upwelling events. More generally, the project demonstrates the feasibility of obtaining semi-continuous O2 time series at moderate cost from strategic locations globally.
Building Change Detection in Very High Resolution Satellite Stereo Image Time Series
NASA Astrophysics Data System (ADS)
Tian, J.; Qin, R.; Cerra, D.; Reinartz, P.
2016-06-01
There is an increasing demand for robust methods for urban sprawl monitoring. The steadily increasing number of high resolution and multi-view sensors allows producing datasets with high temporal and spatial resolution; however, less effort has been dedicated to employing very high resolution (VHR) satellite image time series (SITS) to monitor building changes with higher accuracy. In addition, these VHR data are often acquired from different sensors. The objective of this research is to propose a robust time-series data analysis method for VHR stereo imagery. Firstly, the spatial-temporal information of the stereo imagery and the Digital Surface Models (DSMs) generated from them are combined, and building probability maps (BPM) are calculated for all acquisition dates. In the second step, an object-based change analysis is performed based on the derivative features of the BPM sets. The consistency of the detected changes between the object level and the pixel level is checked to remove outlier pixels. Results are assessed on six pairs of VHR satellite images acquired within a time span of 7 years. The evaluation results demonstrate the effectiveness of the proposed method.
Berris, Steven N.; Hess, Glen W.; Bohman, Larry R.
2000-01-01
Title II of Public Law 101-618, the Truckee-Carson-Pyramid Lake Water Rights Settlement Act of 1990, provides direction, authority, and a mechanism for resolving conflicts over water rights in the Truckee and Carson River Basins. The Truckee-Carson Program of the U.S. Geological Survey, to support implementation of Public Law 101-618, has developed an operations model to simulate lake/reservoir and river operations for the Truckee River Basin including diversion of Truckee River water to the Truckee Canal for transport to the Carson River Basin. Several types of hydrologic data, formatted in a chronological order with a daily time interval called 'time series,' are described in this report. Time series from water years 1933 to 1997 can be used to run the operations model. Auxiliary hydrologic data not currently used by the model are also described. The time series of hydrologic data consist of flow, lake/reservoir elevation and storage, precipitation, evaporation, evapotranspiration, municipal and industrial (M&I) demand, and streamflow and lake/reservoir level forecast data.
Pridemore, William Alex; Chamlin, Mitchell B; Cochran, John K
2007-06-01
The dissolution of the Soviet Union resulted in sudden, widespread, and fundamental changes to Russian society. The former social welfare system-with its broad guarantees of employment, healthcare, education, and other forms of social support-was dismantled in the shift toward democracy, rule of law, and a free-market economy. This unique natural experiment provides a rare opportunity to examine the potentially disintegrative effects of rapid social change on deviance, and thus to evaluate one of Durkheim's core tenets. We took advantage of this opportunity by performing interrupted time-series analyses of annual age-adjusted homicide, suicide, and alcohol-related mortality rates for the Russian Federation using data from 1956 to 2002, with 1992-2002 as the postintervention time-frame. The ARIMA models indicate that, controlling for the long-term processes that generated these three time series, the breakup of the Soviet Union was associated with an appreciable increase in each of the cause-of-death rates. We interpret these findings as being consistent with the Durkheimian hypothesis that rapid social change disrupts social order, thereby increasing the level of crime and deviance.
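A hedged sketch of the kind of interrupted time-series (intervention) ARIMA analysis described above, using statsmodels with synthetic annual rates; the ARIMA order and mortality values below are illustrative only, not those of the study.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Synthetic annual mortality-rate series, 1956-2002, with a level shift in 1992
years = np.arange(1956, 2003)
rng = np.random.default_rng(1)
rate = 10 + 0.05 * (years - 1956) + rng.normal(0, 0.5, years.size)
rate[years >= 1992] += 3.0                      # hypothetical post-breakup jump

step = (years >= 1992).astype(float)            # intervention (step) regressor
model = ARIMA(rate, exog=step, order=(1, 0, 0), trend="c")
res = model.fit()
print(res.params)        # the exog coefficient estimates the post-1992 shift
```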
Scaling and efficiency determine the irreversible evolution of a market
Baldovin, F.; Stella, A. L.
2007-01-01
In setting up a stochastic description of the time evolution of a financial index, the challenge consists in devising a model compatible with all stylized facts emerging from the analysis of financial time series and providing a reliable basis for simulating such series. Based on constraints imposed by market efficiency and on an inhomogeneous-time generalization of standard simple scaling, we propose an analytical model which accounts simultaneously for empirical results like the linear decorrelation of successive returns, the power law dependence on time of the volatility autocorrelation function, and the multiscaling associated to this dependence. In addition, our approach gives a justification and a quantitative assessment of the irreversible character of the index dynamics. This irreversibility enters as a key ingredient in a novel simulation strategy of index evolution which demonstrates the predictive potential of the model.
EE Cep Winks in Full Color (Abstract)
NASA Astrophysics Data System (ADS)
Walker, G.
2015-06-01
(Abstract only) We observed the long-period (5.6 years) Eclipsing Binary Variable Star EE Cep during its 2014 eclipse. It was observed on every clear night from the Maria Mitchell Observatory, as well as from remote sites, for a total of 25 nights. Each night consisted of a detailed time series in BVRI looking for short-term variations, for a total of >10,000 observations. The data were transformed to the Standard System. In addition, a time series was captured during the night of the eclipse. These data provide an alternative to the traditional method of determining the Time of Minimum. The TOM varied with color. Several strong correlations are seen between colors, substantiating the detection of variations on a time scale of hours. The long-term light curve shows five interesting and different Phases with different characteristics.
Goldstein, Steven J; Abdel-Fattah, Amr I; Murrell, Michael T; Dobson, Patrick F; Norman, Deborah E; Amato, Ronald S; Nunn, Andrew J
2010-03-01
Uranium-series data for groundwater samples from the Nopal I uranium ore deposit were obtained to place constraints on radionuclide transport and hydrologic processes for a nuclear waste repository located in fractured, unsaturated volcanic tuff. Decreasing uranium concentrations for wells drilled in 2003 are consistent with a simple physical mixing model that indicates that groundwater velocities are low ( approximately 10 m/y). Uranium isotopic constraints, well productivities, and radon systematics also suggest limited groundwater mixing and slow flow in the saturated zone. Uranium isotopic systematics for seepage water collected in the mine adit show a spatial dependence which is consistent with longer water-rock interaction times and higher uranium dissolution inputs at the front adit where the deposit is located. Uranium-series disequilibria measurements for mostly unsaturated zone samples indicate that (230)Th/(238)U activity ratios range from 0.005 to 0.48 and (226)Ra/(238)U activity ratios range from 0.006 to 113. (239)Pu/(238)U mass ratios for the saturated zone are <2 x 10(-14), and Pu mobility in the saturated zone is >1000 times lower than the U mobility. Saturated zone mobility decreases in the order (238)U approximately (226)Ra > (230)Th approximately (239)Pu. Radium and thorium appear to have higher mobility in the unsaturated zone based on U-series data from fractures and seepage water near the deposit.
NASA Astrophysics Data System (ADS)
Murray, J. R.; Svarc, J. L.
2016-12-01
Constant secular velocities estimated from Global Positioning System (GPS)-derived position time series are a central input for modeling interseismic deformation in seismically active regions. Both postseismic motion and temporally correlated noise produce long-period signals that are difficult to separate from secular motion and can bias velocity estimates. For GPS sites installed post-earthquake it is especially challenging to uniquely estimate velocities and postseismic signals and to determine when the postseismic transient has decayed sufficiently to enable use of subsequent data for estimating secular rates. Within 60 km of the 2003 M6.5 San Simeon and 2004 M6 Parkfield earthquakes in California, 16 continuous GPS sites (group 1) were established prior to mid-2001, and 52 stations (group 2) were installed following the events. We use group 1 data to investigate how early in the post-earthquake time period one may reliably begin using group 2 data to estimate velocities. For each group 1 time series, we obtain eight velocity estimates using observation time windows with successively later start dates (2006 - 2013) and a parameterization that includes constant velocity, annual, and semi-annual terms but no postseismic decay. We compare these to velocities estimated using only pre-San Simeon data to find when the pre- and post-earthquake velocities match within uncertainties. To obtain realistic velocity uncertainties, for each time series we optimize a temporally correlated noise model consisting of white, flicker, random walk, and, in some cases, band-pass filtered noise contributions. Preliminary results suggest velocities can be reliably estimated using data from 2011 to the present. Ongoing work will assess velocity bias as a function of epicentral distance and length of post-earthquake time series as well as explore spatio-temporal filtering of detrended group 1 time series to provide empirical corrections for postseismic motion in group 2 time series.
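The windowing test described above can be illustrated with a small sketch: refit a constant-velocity-plus-seasonal model over successively later start dates and watch the estimates converge once the transient has decayed. The positions below are synthetic, and the noise is white, unlike the correlated-noise models optimized in the study.

```python
import numpy as np

# Sketch of the windowing test: re-estimate a site velocity using observation
# windows with successively later start dates and no postseismic term, then
# compare against a reference rate. All numbers are synthetic.
rng = np.random.default_rng(2)
t = np.arange(2004.75, 2016.0, 1 / 365.25)           # daily epochs (years)
secular = 22.0                                        # mm/yr "true" rate
postseis = 40.0 * np.log(1 + (t - 2004.73) / 0.1)     # logarithmic transient
pos = secular * (t - t[0]) + postseis + rng.normal(0, 2.0, t.size)

def fit_velocity(t, pos):
    G = np.column_stack([np.ones_like(t), t,
                         np.sin(2 * np.pi * t), np.cos(2 * np.pi * t),
                         np.sin(4 * np.pi * t), np.cos(4 * np.pi * t)])
    return np.linalg.lstsq(G, pos, rcond=None)[0][1]  # mm/yr

for start in range(2006, 2014):
    sel = t >= start
    print(start, round(fit_velocity(t[sel], pos[sel]), 2))
# Estimates decrease toward the secular rate as the start date moves later.
```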
Toward automatic time-series forecasting using neural networks.
Yan, Weizhong
2012-07-01
Over the past few decades, application of artificial neural networks (ANN) to time-series forecasting (TSF) has been growing rapidly due to several unique features of ANN models. However, to date, a consistent ANN performance over different studies has not been achieved. Many factors contribute to the inconsistency in the performance of neural network models. One such factor is that ANN modeling involves determining a large number of design parameters, and the current design practice is essentially heuristic and ad hoc; this does not exploit the full potential of neural networks. Systematic ANN modeling processes and strategies for TSF are, therefore, greatly needed. Motivated by this need, this paper attempts to develop an automatic ANN modeling scheme. It is based on the generalized regression neural network (GRNN), a special type of neural network. By taking advantage of several GRNN properties (i.e., a single design parameter and fast learning) and by incorporating several design strategies (e.g., fusing multiple GRNNs), we have been able to make the proposed modeling scheme effective for modeling large-scale business time series. The initial model was entered into the NN3 time-series competition. It was awarded the best prediction on the reduced dataset among approximately 60 different models submitted by scholars worldwide.
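The "single design parameter" property of the GRNN follows from its kernel-regression form; a minimal sketch (one GRNN, not the paper's fused multi-GRNN scheme, applied to a toy series with an arbitrary smoothing parameter) is:

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    """Generalized regression neural network = Gaussian kernel regression.
    sigma is the single smoothing (design) parameter."""
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

# One-step-ahead forecasting of a toy series from its previous 3 values
rng = np.random.default_rng(3)
series = np.sin(np.arange(200) * 0.2) + 0.1 * rng.standard_normal(200)
lag = 3
X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
y = series[lag:]
X_train, y_train, X_test, y_test = X[:150], y[:150], X[150:], y[150:]
pred = grnn_predict(X_train, y_train, X_test, sigma=0.3)
print("RMSE:", np.sqrt(np.mean((pred - y_test) ** 2)))
```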
Comparison of time-series registration methods in breast dynamic infrared imaging
NASA Astrophysics Data System (ADS)
Riyahi-Alam, S.; Agostini, V.; Molinari, F.; Knaflitz, M.
2015-03-01
Automated motion reduction in dynamic infrared imaging is in demand in clinical applications, since movement disarranges the time-temperature series of each pixel, originating thermal artifacts that might bias the clinical decision. All previously proposed registration methods are feature-based algorithms requiring manual intervention. The aim of this work is to optimize the registration strategy specifically for Breast Dynamic Infrared Imaging and to make it user-independent. We implemented and evaluated 3 different 3D time-series registration methods: 1. linear affine, 2. non-linear B-spline, 3. Demons, applied to 12 datasets of healthy breast thermal images. The results are evaluated through normalized mutual information, with average values of 0.70 ±0.03, 0.74 ±0.03 and 0.81 ±0.09 (out of 1) for affine, B-spline and Demons registration, respectively, as well as breast boundary overlap and the Jacobian determinant of the deformation field. The statistical analysis of the results showed that the symmetric diffeomorphic Demons registration method outperforms the others, yielding the best breast alignment and non-negative Jacobian values, which guarantee image similarity and anatomical consistency of the transformation; this is attributed to homologous forces that shorten the pixel geometric disparities across all frames. We propose Demons registration as an effective technique for time-series dynamic infrared registration, to stabilize the local temperature oscillation.
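Normalized mutual information, the similarity score reported above, can be computed from a joint histogram; a minimal sketch with hypothetical frames follows (note that several NMI normalizations are in use, and the one below may differ from the authors').

```python
import numpy as np

def normalized_mutual_information(img_a, img_b, bins=64):
    """NMI = 2*I(A;B)/(H(A)+H(B)); other normalizations exist."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    h_xy = -np.sum(pxy[nz] * np.log(pxy[nz]))
    h_x = -np.sum(px[px > 0] * np.log(px[px > 0]))
    h_y = -np.sum(py[py > 0] * np.log(py[py > 0]))
    return 2.0 * (h_x + h_y - h_xy) / (h_x + h_y)

# Hypothetical frames: a reference thermal image and a shifted, noisier copy
rng = np.random.default_rng(4)
ref = rng.integers(0, 256, (128, 128)).astype(float)
moved = np.roll(ref, 3, axis=1) + rng.normal(0, 5, ref.shape)
print(normalized_mutual_information(ref, ref),
      normalized_mutual_information(ref, moved))
```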
NASA Astrophysics Data System (ADS)
Duveiller, G.; Donatelli, M.; Fumagalli, D.; Zucchini, A.; Nelson, R.; Baruth, B.
2017-02-01
Coupled atmosphere-ocean general circulation models (GCMs) simulate different realizations of possible future climates at global scale under contrasting scenarios of land-use and greenhouse gas emissions. Such data require several additional processing steps before they can be used to drive impact models. Spatial downscaling, typically by regional climate models (RCM), and bias-correction are two such steps that have already been addressed for Europe. Yet, the errors in resulting daily meteorological variables may be too large for specific model applications. Crop simulation models are particularly sensitive to these inconsistencies and thus require further processing of GCM-RCM outputs. Moreover, crop models are often run in a stochastic manner by using various plausible weather time series (often generated using stochastic weather generators) to represent the climate of a period of interest (e.g. 2000 ± 15 years), while GCM simulations typically provide a single time series for a given emission scenario. To inform agricultural policy-making, data on near- and medium-term decadal time scales are most often requested, e.g. for 2020 or 2030. Taking a sample of multiple years from these unique time series to represent time horizons in the near future is particularly problematic because selecting overlapping years may lead to spurious trends, creating artefacts in the results of the impact model simulations. This paper presents a database of consolidated and coherent future daily weather data for Europe that addresses these problems. Input data consist of daily temperature and precipitation from three dynamically downscaled and bias-corrected regional climate simulations of the IPCC A1B emission scenario created within the ENSEMBLES project. Solar radiation is estimated from temperature based on an auto-calibration procedure. Wind speed and relative air humidity are collected from historical series. From these variables, reference evapotranspiration and vapour pressure deficit are estimated ensuring consistency within daily records. The weather generator ClimGen is then used to create 30 synthetic years of all variables to characterize the time horizons of 2000, 2020 and 2030, which can readily be used for crop modelling studies.
Autoregressive modeling for the spectral analysis of oceanographic data
NASA Technical Reports Server (NTRS)
Gangopadhyay, Avijit; Cornillon, Peter; Jackson, Leland B.
1989-01-01
Over the last decade there has been a dramatic increase in the number and volume of data sets useful for oceanographic studies. Many of these data sets consist of long temporal or spatial series derived from satellites and large-scale oceanographic experiments. These data sets are, however, often 'gappy' in space, irregular in time, and always of finite length. The conventional Fourier transform (FT) approach to the spectral analysis is thus often inapplicable, or where applicable, it provides questionable results. Here, through comparative analysis with the FT for different oceanographic data sets, the possibilities offered by autoregressive (AR) modeling to perform spectral analysis of gappy, finite-length series, are discussed. The applications demonstrate that as the length of the time series becomes shorter, the resolving power of the AR approach as compared with that of the FT improves. For the longest data sets examined here, 98 points, the AR method performed only slightly better than the FT, but for the very short ones, 17 points, the AR method showed a dramatic improvement over the FT. The application of the AR method to a gappy time series, although a secondary concern of this manuscript, further underlines the value of this approach.
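A minimal sketch of AR spectral estimation via the Yule-Walker equations, applied to a 17-point toy series, illustrates the resolving-power comparison discussed above; the AR order and data are illustrative only.

```python
import numpy as np

def yule_walker_psd(x, order, nfreq=256):
    """AR power spectrum from the Yule-Walker equations (a minimal sketch)."""
    x = np.asarray(x, float) - np.mean(x)
    n = len(x)
    # Biased autocovariance estimates at lags 0..order
    r = np.array([np.dot(x[: n - k], x[k:]) / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1: order + 1])          # AR coefficients
    sigma2 = r[0] - np.dot(a, r[1: order + 1])       # innovation variance
    freqs = np.linspace(0, 0.5, nfreq)
    k = np.arange(1, order + 1)
    denom = np.abs(1 - np.exp(-2j * np.pi * freqs[:, None] * k[None, :]) @ a) ** 2
    return freqs, sigma2 / denom

# Short (17-point) toy series containing a single oscillation
t = np.arange(17)
x = np.sin(2 * np.pi * 0.2 * t) + 0.3 * np.random.default_rng(5).standard_normal(17)
freqs, psd = yule_walker_psd(x, order=4)
print(freqs[np.argmax(psd)])   # frequency of the spectral peak (true value: 0.2)
```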
Multi-scale clustering of functional data with application to hydraulic gradients in wetlands
Greenwood, Mark C.; Sojda, Richard S.; Sharp, Julia L.; Peck, Rory G.; Rosenberry, Donald O.
2011-01-01
A new set of methods is developed to perform cluster analysis of functions, motivated by a data set consisting of hydraulic gradients at several locations distributed across a wetland complex. The methods build on previous work on clustering of functions, such as Tarpey and Kinateder (2003) and Hitchcock et al. (2007), but explore functions generated from an additive model decomposition (Wood, 2006) of the original time series. Our decomposition targets two aspects of the series, using an adaptive smoother for the trend and a circular spline for the diurnal variation in the series. Different measures for comparing locations are discussed, including a method for efficiently clustering time series that are of different lengths using a functional data approach. The complicated nature of these wetlands is highlighted by shifting group memberships depending on which scale of variation and which year of the study are considered.
NASA Astrophysics Data System (ADS)
Zhang, Qian; Harman, Ciaran J.; Kirchner, James W.
2018-02-01
River water-quality time series often exhibit fractal scaling, which here refers to autocorrelation that decays as a power law over some range of scales. Fractal scaling presents challenges to the identification of deterministic trends because (1) fractal scaling has the potential to lead to false inference about the statistical significance of trends and (2) the abundance of irregularly spaced data in water-quality monitoring networks complicates efforts to quantify fractal scaling. Traditional methods for estimating fractal scaling - in the form of spectral slope (β) or other equivalent scaling parameters (e.g., Hurst exponent) - are generally inapplicable to irregularly sampled data. Here we consider two types of estimation approaches for irregularly sampled data and evaluate their performance using synthetic time series. These time series were generated such that (1) they exhibit a wide range of prescribed fractal scaling behaviors, ranging from white noise (β = 0) to Brown noise (β = 2) and (2) their sampling gap intervals mimic the sampling irregularity (as quantified by both the skewness and mean of gap-interval lengths) in real water-quality data. The results suggest that none of the existing methods fully account for the effects of sampling irregularity on β estimation. First, the results illustrate the danger of using interpolation for gap filling when examining autocorrelation, as the interpolation methods consistently underestimate or overestimate β under a wide range of prescribed β values and gap distributions. Second, the widely used Lomb-Scargle spectral method also consistently underestimates β. A previously published modified form, using only the lowest 5 % of the frequencies for spectral slope estimation, has very poor precision, although the overall bias is small. Third, a recent wavelet-based method, coupled with an aliasing filter, generally has the smallest bias and root-mean-squared error among all methods for a wide range of prescribed β values and gap distributions. The aliasing method, however, does not itself account for sampling irregularity, and this introduces some bias in the result. Nonetheless, the wavelet method is recommended for estimating β in irregular time series until improved methods are developed. Finally, all methods' performances depend strongly on the sampling irregularity, highlighting that the accuracy and precision of each method are data specific. Accurately quantifying the strength of fractal scaling in irregular water-quality time series remains an unresolved challenge for the hydrologic community and for other disciplines that must grapple with irregular sampling.
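The mechanics of the Lomb-Scargle approach evaluated above amount to fitting a line to the log-log periodogram of an irregularly sampled series; as the study notes, this estimator tends to be biased, so the sketch below (with a synthetic record) only illustrates the procedure.

```python
import numpy as np
from scipy.signal import lombscargle

# Sketch: estimate the spectral slope (beta) of an irregularly sampled series
# by fitting a line to the log-log Lomb-Scargle periodogram.
rng = np.random.default_rng(6)
t = np.sort(rng.uniform(0, 20 * 365, 400))         # irregular sample times (days)
x = np.cumsum(rng.standard_normal(400))            # Brown-like noise surrogate
x = (x - x.mean()) / x.std()

freqs = np.linspace(1 / (20 * 365), 0.5, 500)       # cycles per day (exclude zero)
power = lombscargle(t, x, 2 * np.pi * freqs)        # lombscargle takes angular freqs
slope, intercept = np.polyfit(np.log10(freqs), np.log10(power), 1)
print("estimated spectral slope beta ~", -slope)    # power ~ f^(-beta)
```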
NASA Astrophysics Data System (ADS)
Ma, Lin; Chabaux, Francois; Pelt, Eric; Blaes, Estelle; Jin, Lixin; Brantley, Susan
2010-08-01
In the Critical Zone where rocks and life interact, bedrock equilibrates to Earth surface conditions, transforming to regolith. The factors that control the rates and mechanisms of formation of regolith, defined here as material that can be augered, are still not fully understood. To quantify regolith formation rates on shale lithology, we measured uranium-series (U-series) isotopes (238U, 234U, and 230Th) in three weathering profiles along a planar hillslope at the Susquehanna/Shale Hills Observatory (SSHO) in central Pennsylvania. All regolith samples show significant U-series disequilibrium: (234U/238U) and (230Th/238U) activity ratios range from 0.934 to 1.072 and from 0.903 to 1.096, respectively. These values display depth trends that are consistent with fractionation of U-series isotopes during chemical weathering and element transport, i.e., the relative mobility decreases in the order 234U > 238U > 230Th. The activity ratios observed in the regolith samples are explained by i) loss of U-series isotopes during water-rock interactions and ii) re-deposition of U-series isotopes downslope. Loss of U and Th initiates in the meter-thick zone of "bedrock" that cannot be augered but that nonetheless consists of up to 40% clay/silt/sand inferred to have lost K, Mg, Al, and Fe. Apparent equivalent regolith production rates calculated with these isotopes for these profiles decrease exponentially from 45 m/Myr to 17 m/Myr, with increasing regolith thickness from the ridge top to the valley floor. With increasing distance from the ridge top toward the valley, apparent equivalent regolith residence times increase from 7 kyr to 40 kyr. Given that the SSHO experienced peri-glacial climate ~15 kyr ago and has a catchment-wide averaged erosion rate of ~15 m/Myr as inferred from cosmogenic 10Be, we conclude that the hillslope retains regolith formed before the peri-glacial period and is not at geomorphologic steady state. Both chemical weathering reactions of clay minerals and translocation of fine particles/colloids are shown to contribute to mass loss of U and Th from the regolith, consistent with major element data at SSHO. This research documents a case study where U-series isotopes are used to constrain the time scales of chemical weathering and regolith production rates. Regolith production rates at the SSHO should be useful as a reference value for future work at other weathering localities.
Taxation, regulation, and addiction: a demand function for cigarettes based on time-series evidence.
Keeler, T E; Hu, T W; Barnett, P G; Manning, W G
1993-04-01
This work analyzes the effects of prices, taxes, income, and anti-smoking regulations on the consumption of cigarettes in California (a 25-cent-per-pack state tax increase in 1989 enhances the usefulness of this exercise). Analysis is based on monthly time-series data for 1980 through 1990. Results show a price elasticity of demand for cigarettes in the short run of -0.3 to -0.5 at mean data values, and -0.5 to -0.6 in the long run. We find at least some support for two further hypotheses: that antismoking regulations reduce cigarette consumption, and that consumers behave consistently with the model of rational addiction.
An architecture for consolidating multidimensional time-series data onto a common coordinate grid
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shippert, Tim; Gaustad, Krista
Consolidating measurement data for use by data models or in inter-comparison studies frequently requires transforming the data onto a common grid. Standard methods for interpolating multidimensional data are often not appropriate for data with non-homogenous dimensionality, and are hard to implement in a consistent manner for different datastreams. These challenges are increased when dealing with the automated procedures necessary for use with continuous, operational datastreams. In this paper we introduce a method of applying a series of one-dimensional transformations to merge data onto a common grid, examine the challenges of ensuring consistent application of data consolidation methods, present a framework for addressing those challenges, and describe the implementation of such a framework for the Atmospheric Radiation Measurement (ARM) program.
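The core idea of applying a series of one-dimensional transformations can be illustrated with a small sketch that regrids a (time x height) field one dimension at a time; this is only an illustration with hypothetical values, not the ARM framework's actual interface.

```python
import numpy as np

# Minimal illustration of one-dimensional regridding applied per dimension:
# interpolate a (time x height) field first onto a common time grid, then
# onto a common height grid.
rng = np.random.default_rng(7)
src_time = np.cumsum(rng.uniform(50, 70, 40))          # irregular sample times (s)
src_height = np.array([10.0, 50.0, 100.0, 200.0, 400.0])
field = rng.normal(280, 5, (src_time.size, src_height.size))

common_time = np.arange(src_time[0], src_time[-1], 60.0)   # 1-min grid
common_height = np.array([25.0, 75.0, 150.0, 300.0])

# Pass 1: transform along time for each height level
tmp = np.stack([np.interp(common_time, src_time, field[:, j])
                for j in range(src_height.size)], axis=1)
# Pass 2: transform along height for each common time step
out = np.stack([np.interp(common_height, src_height, tmp[i, :])
                for i in range(common_time.size)], axis=0)
print(out.shape)      # (len(common_time), len(common_height))
```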
Intercomparison of field measurements of nitrous acid (HONO) during the SHARP Campaign
Because of the importance of HONO as a radical reservoir, consistent and accurate measurements of its concentration are needed. As part of the SHARP (Study of Houston Atmospheric Radical Precursors), time series of HONO were obtained by five different measurement techniques on th...
Curriculum for Discussion Time.
ERIC Educational Resources Information Center
Steinhoff, Mary E.
This curriculum guide consists of materials for use in implementing two 10-meeting series of group discussions designed to enhance the process of the socialization of students enrolled in an associate degree nursing program. Addressed in the discussion sessions are the following topics: developing an awareness of self-concept and gaining…
Dishman, J Donald; Weber, Kenneth A; Corbin, Roger L; Burke, Jeanmarie R
2012-09-30
The purpose of this research was to characterize unique neurophysiologic events following a high velocity, low amplitude (HVLA) spinal manipulation (SM) procedure. Descriptive time series analysis techniques of time plots, outlier detection and autocorrelation functions were applied to time series of tibial nerve H-reflexes that were evoked at 10-s intervals from 100 s before until 100 s after one of three distinct events: an L5-S1 HVLA SM, an L5-S1 joint pre-loading procedure, or the control condition. Sixty-six subjects were randomly assigned to three procedures, i.e., 22 time series per group. If the detection of outliers and correlograms revealed a pattern of non-randomness that was only time-locked to a single, specific event in the normalized time series, then an experimental effect would be inferred beyond the inherent variability of H-reflex responses. Tibial nerve F-wave responses were included to determine if any new information about central nervous function following an HVLA SM procedure could be ascertained. Time series analyses of H(max)/M(max) ratios, pre-post L5-S1 HVLA SM, substantiated the hypothesis that the specific aspects of the manipulative thrust lead to a greater attenuation of the H(max)/M(max) ratio as compared to the non-specific aspects related to the postural perturbation and joint pre-loading. The attenuation of the H(max)/M(max) ratio following the HVLA SM procedure was reliable and may hold promise as a translational tool to measure the consistency and accuracy of protocol implementation involving SM in clinical trials research. F-wave responses were not sensitive to mechanical perturbations of the lumbar spine. Copyright © 2012 Elsevier B.V. All rights reserved.
Two-pass imputation algorithm for missing value estimation in gene expression time series.
Tsiporkova, Elena; Boeva, Veselka
2007-10-01
Gene expression microarray experiments frequently generate datasets with multiple values missing. However, most of the analysis, mining, and classification methods for gene expression data require a complete matrix of gene array values. Therefore, the accurate estimation of missing values in such datasets has been recognized as an important issue, and several imputation algorithms have already been proposed to the biological community. Most of these approaches, however, are not particularly suitable for time series expression profiles. In view of this, we propose a novel imputation algorithm, which is specially suited for the estimation of missing values in gene expression time series data. The algorithm utilizes Dynamic Time Warping (DTW) distance in order to measure the similarity between time expression profiles, and subsequently selects for each gene expression profile with missing values a dedicated set of candidate profiles for estimation. Three different DTW-based imputation (DTWimpute) algorithms have been considered: position-wise, neighborhood-wise, and two-pass imputation. These have initially been prototyped in Perl, and their accuracy has been evaluated on yeast expression time series data using several different parameter settings. The experiments have shown that the two-pass algorithm consistently outperforms, in particular for datasets with a higher level of missing entries, the neighborhood-wise and the position-wise algorithms. The performance of the two-pass DTWimpute algorithm has further been benchmarked against the weighted K-Nearest Neighbors algorithm, which is widely used in the biological community; the former algorithm has appeared superior to the latter one. Motivated by these findings, indicating clearly the added value of the DTW techniques for missing value estimation in time series data, we have built an optimized C++ implementation of the two-pass DTWimpute algorithm. The software also provides for a choice between three different initial rough imputation methods.
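The DTW distance at the core of the imputation scheme can be written as a short dynamic program; a minimal sketch (without the windowing constraints used in tuned implementations) is:

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-programming DTW distance between two 1-D profiles."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Two hypothetical expression profiles: similar shape, shifted in time
p1 = [0.1, 0.5, 1.2, 2.0, 1.1, 0.4, 0.2]
p2 = [0.1, 0.2, 0.6, 1.3, 2.1, 1.0, 0.3]
print(dtw_distance(p1, p2))   # smaller than the point-by-point (Euclidean-style) cost
```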
NASA Astrophysics Data System (ADS)
Piecuch, Christopher G.; Landerer, Felix W.; Ponte, Rui M.
2018-05-01
Monthly ocean bottom pressure solutions from the Gravity Recovery and Climate Experiment (GRACE), derived using surface spherical cap mass concentration (MC) blocks and spherical harmonics (SH) basis functions, are compared to tide gauge (TG) monthly averaged sea level data over 2003-2015 to evaluate improved gravimetric data processing methods near the coast. MC solutions can explain ≳ 42% of the monthly variance in TG time series over broad shelf regions and in semi-enclosed marginal seas. MC solutions also generally explain ~5-32% more TG data variance than SH estimates. Applying a coastline resolution improvement algorithm in the GRACE data processing leads to ~31% more variance in TG records explained by the MC solution on average compared to not using this algorithm. Synthetic observations sampled from an ocean general circulation model exhibit similar patterns of correspondence between modeled TG and MC time series and differences between MC and SH time series in terms of their relationship with TG time series, suggesting that observational results here are generally consistent with expectations from ocean dynamics. This work demonstrates the improved quality of recent MC solutions compared to earlier SH estimates over the coastal ocean, and suggests that the MC solutions could be a useful tool for understanding contemporary coastal sea level variability and change.
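One common way of computing the "percent of TG variance explained" statistic quoted above is shown below with hypothetical detrended series; the exact definition used by the authors may differ.

```python
import numpy as np

def percent_variance_explained(tg, model):
    """Share of tide gauge (TG) monthly variance explained by a model series."""
    tg, model = np.asarray(tg, float), np.asarray(model, float)
    return 100.0 * (1.0 - np.var(tg - model) / np.var(tg))

# Hypothetical detrended monthly series (cm): TG, an MC-like and an SH-like estimate
rng = np.random.default_rng(8)
signal = 5 * np.sin(2 * np.pi * np.arange(156) / 12)
tg = signal + rng.normal(0, 2, 156)
mc = signal + rng.normal(0, 1.5, 156)
sh = 0.8 * signal + rng.normal(0, 2.5, 156)
# The MC-like series explains more TG variance than the SH-like one
print(percent_variance_explained(tg, mc), percent_variance_explained(tg, sh))
```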
NASA Astrophysics Data System (ADS)
Peltoniemi, Mikko; Aurela, Mika; Böttcher, Kristin; Kolari, Pasi; Loehr, John; Karhu, Jouni; Linkosalmi, Maiju; Melih Tanis, Cemal; Tuovinen, Juha-Pekka; Nadir Arslan, Ali
2018-01-01
In recent years, monitoring of the status of ecosystems using low-cost web (IP) or time lapse cameras has received wide interest. With broad spatial coverage and high temporal resolution, networked cameras can provide information about snow cover and vegetation status, serve as ground truths to Earth observations and be useful for gap-filling of cloudy areas in Earth observation time series. Networked cameras can also play an important role in supplementing laborious phenological field surveys and citizen science projects, which also suffer from observer-dependent observation bias. We established a network of digital surveillance cameras for automated monitoring of phenological activity of vegetation and snow cover in the boreal ecosystems of Finland. Cameras were mounted at 14 sites, each site having 1-3 cameras. Here, we document the network, basic camera information and access to images in the permanent data repository (http://www.zenodo.org/communities/phenology_camera/). Individual DOI-referenced image time series consist of half-hourly images collected between 2014 and 2016 (https://doi.org/10.5281/zenodo.1066862). Additionally, we present an example of a colour index time series derived from images from two contrasting sites.
NASA Astrophysics Data System (ADS)
Lee, Minsuk; Won, Youngjae; Park, Byungjun; Lee, Seungrag
2017-02-01
Not only the static characteristics but also the dynamic characteristics of the red blood cell (RBC) contain useful information for blood diagnosis. Quantitative phase imaging (QPI) can capture sample images with subnanometer-scale depth resolution and millisecond-scale temporal resolution. Various studies have used QPI for RBC diagnosis, and recent work has aimed to decrease the processing time of RBC information extraction from QPI using parallel computing algorithms; however, previous studies have focused on static parameters such as cell morphology or simple dynamic parameters such as the root mean square (RMS) of the membrane fluctuations. Previously, we presented a practical blood test method using time series correlation analysis of RBC membrane flickering with QPI. However, this method has shown that there is a limit to its clinical application because of the long computation time. In this study, we present an accelerated time series correlation analysis of RBC membrane flickering using a parallel computing algorithm. This method showed fractal scaling exponent results for the surrounding medium and for normal RBCs that are consistent with our previous research.
Circular analysis in complex stochastic systems
Valleriani, Angelo
2015-01-01
Ruling out observations can lead to wrong models. This danger occurs inadvertently when one selects observations, experiments, simulations or time-series based on their outcome. In stochastic processes, conditioning on the future outcome biases all local transition probabilities and makes them consistent with the selected outcome. This circular self-consistency leads to models that are inconsistent with physical reality. It is also the reason why models built solely on macroscopic observations are prone to this fallacy. PMID:26656656
Lee, Hyokyeong; Moody-Davis, Asher; Saha, Utsab; Suzuki, Brian M; Asarnow, Daniel; Chen, Steven; Arkin, Michelle; Caffrey, Conor R; Singh, Rahul
2012-01-01
Neglected tropical diseases, especially those caused by helminths, constitute some of the most common infections of the world's poorest people. Development of techniques for automated, high-throughput drug screening against these diseases, especially in whole-organism settings, constitutes one of the great challenges of modern drug discovery. We present a method for enabling high-throughput phenotypic drug screening against diseases caused by helminths with a focus on schistosomiasis. The proposed method allows for a quantitative analysis of the systemic impact of a drug molecule on the pathogen as exhibited by the complex continuum of its phenotypic responses. This method consists of two key parts: first, biological image analysis is employed to automatically monitor and quantify shape-, appearance-, and motion-based phenotypes of the parasites. Next, we represent these phenotypes as time-series and show how to compare, cluster, and quantitatively reason about them using techniques of time-series analysis. We present results on a number of algorithmic issues pertinent to the time-series representation of phenotypes. These include results on appropriate representation of phenotypic time-series, analysis of different time-series similarity measures for comparing phenotypic responses over time, and techniques for clustering such responses by similarity. Finally, we show how these algorithmic techniques can be used for quantifying the complex continuum of phenotypic responses of parasites. An important corollary is the ability of our method to recognize and rigorously group parasites based on the variability of their phenotypic response to different drugs. The methods and results presented in this paper enable automatic and quantitative scoring of high-throughput phenotypic screens focused on helmintic diseases. Furthermore, these methods allow us to analyze and stratify parasites based on their phenotypic response to drugs. Together, these advancements represent a significant breakthrough for the process of drug discovery against schistosomiasis in particular and can be extended to other helmintic diseases which together afflict a large part of humankind.
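The clustering step described above can be sketched as hierarchical clustering of a precomputed distance matrix; for brevity the sketch below uses Euclidean distance on synthetic responder/non-responder profiles, whereas the paper evaluates several time-series similarity measures.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Sketch: cluster per-parasite phenotype time-series by similarity.
# Hypothetical responses: 20 "responders" decay over time, 20 "non-responders" do not.
rng = np.random.default_rng(9)
t = np.linspace(0, 1, 50)
responders = np.exp(-3 * t)[None, :] + 0.05 * rng.standard_normal((20, 50))
nonresponders = np.ones(50)[None, :] + 0.05 * rng.standard_normal((20, 50))
profiles = np.vstack([responders, nonresponders])

# Any time-series distance could be plugged in here (e.g. DTW); Euclidean for brevity
dist = pdist(profiles, metric="euclidean")
labels = fcluster(linkage(dist, method="average"), t=2, criterion="maxclust")
print(labels)       # the two phenotypic groups separate cleanly
```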
NASA Astrophysics Data System (ADS)
Vyhnalek, Brian; Zurcher, Ulrich; O'Dwyer, Rebecca; Kaufman, Miron
2009-10-01
A wide range of heart rate irregularities have been reported in small studies of patients with temporal lobe epilepsy [TLE]. We hypothesize that patients with TLE display cardiac dysautonomia in either a subclinical or clinical manner. In a small study, we retrospectively identified two groups of patients (2003-2008) from the epilepsy monitoring unit [EMU] at the Cleveland Clinic. No patients were diagnosed with cardiovascular morbidities. The control group consisted of patients with confirmed pseudoseizures and the experimental group had confirmed right temporal lobe epilepsy through a seizure free outcome after temporal lobectomy. We quantified the heart rate variability using the approximate entropy [ApEn]. We found similar values of the ApEn in all three states of consciousness (awake, sleep, and preceding seizure onset). In the TLE group, there is some evidence for greater variability in the awake state than in either sleep or the period preceding seizure onset. Here we present results for mathematically generated time series: the heart rate fluctuations ξ follow gamma statistics, i.e., p(ξ) = ξ^(k-1) exp(-ξ)/Γ(k). This probability function has well-known properties and its Shannon entropy can be expressed in terms of the Γ function. The parameter k allows us to generate a family of heart rate time series with different statistics. The ApEn calculated for the generated time series for different values of k mimics the properties found for the TLE and pseudoseizure groups. Our results suggest that the ApEn is an effective tool to probe differences in statistics of heart rate fluctuations.
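Approximate entropy has a compact definition that can be applied directly to such gamma-distributed surrogate series; a minimal sketch (with arbitrary shape parameters and the conventional m = 2, r = 0.2·SD choices) is:

```python
import numpy as np

def approximate_entropy(x, m=2, r=None):
    """Approximate entropy ApEn(m, r) of a 1-D series (Pincus' definition)."""
    x = np.asarray(x, float)
    n = len(x)
    if r is None:
        r = 0.2 * np.std(x)

    def phi(m):
        # Embed the series into overlapping vectors of length m
        emb = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distances between all pairs of template vectors
        dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        c = np.mean(dist <= r, axis=1)      # fraction of matches per template
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

# Hypothetical RR-interval-like series drawn from gamma distributions (shape k)
rng = np.random.default_rng(10)
for k in (2.0, 8.0):
    rr = rng.gamma(shape=k, scale=1.0, size=500)
    print(k, round(approximate_entropy(rr), 3))
```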
NASA Astrophysics Data System (ADS)
Socquet, Anne; Déprez, Aline; Cotte, Nathalie; Maubant, Louise; Walpersdorf, Andrea; Bato, Mary Grace
2017-04-01
We present here a new pan-European velocity field, obtained by processing 500+ cGPS stations in double difference, in the framework of the implementation phase of the European Plate Observing System (EPOS) project. This prototype solution spans the 2000-2016 period, and includes data from the RING, NOA, RENAG and European Permanent Network (EPN) cGPS networks. The data set is first split into daily sub-networks (between 8 and 14 sub-networks). Each sub-network consists of about 40 stations, with 2 overlapping stations. For each day and for each sub-network, the GAMIT processing is conducted independently. Once each sub-network achieves satisfactory results, a daily combination is performed in order to produce SINEX files. The Chi-square value associated with the combination allows us to evaluate its quality. Finally, a multi-year combination generates position time series for each station. Each time series is visualized and the jumps associated with equipment changes (antenna or receiver) are estimated and corrected. This procedure allows us to generate daily solutions and position time series for all stations. The associated "interseismic" velocity field was then estimated from a time series analysis using the MIDAS software, and compared to an independent estimate obtained by Kalman filtering with the globk software. In addition to this velocity field, we zoom in on Italy and present a strain rate map as well as time series showing co- and post-seismic movements associated with the 2016 Amatrice and Norcia earthquakes.
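The MIDAS velocity estimate referred to above is, in essence, the median of slopes formed from data pairs separated by about one year; a simplified sketch on synthetic daily positions (not the published MIDAS algorithm, which adds outlier trimming and uncertainty scaling) is:

```python
import numpy as np

def midas_like_velocity(t, pos, pair_sep=1.0, tol=0.01):
    """Median of slopes from data pairs separated by ~one year (a simplified
    MIDAS-style estimator; the published algorithm adds further refinements)."""
    t, pos = np.asarray(t, float), np.asarray(pos, float)
    slopes = []
    for i in range(len(t)):
        target = t[i] + pair_sep
        j = np.searchsorted(t, target - tol, side="left")
        if j < len(t) and abs(t[j] - target) <= tol:
            slopes.append((pos[j] - pos[i]) / (t[j] - t[i]))
    return np.median(slopes)

# Synthetic daily position series: 2 mm/yr trend + annual signal + a 5 mm step
rng = np.random.default_rng(11)
t = np.arange(2000.0, 2016.0, 1 / 365.25)
pos = 2.0 * (t - 2000) + 3.0 * np.sin(2 * np.pi * t) + rng.normal(0, 1.0, t.size)
pos[t > 2008.5] += 5.0                                # antenna-change offset
print(round(midas_like_velocity(t, pos), 2), "mm/yr")  # close to 2.0; step and
                                                       # seasonal signal largely rejected
```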
Near-Surface Flow Fields Deduced Using Correlation Tracking and Time-Distance Analysis
NASA Technical Reports Server (NTRS)
DeRosa, Marc; Duvall, T. L., Jr.; Toomre, Juri
1999-01-01
Near-photospheric flow fields on the Sun are deduced using two independent methods applied to the same time series of velocity images observed by SOI-MDI on SOHO. Differences in travel times between f modes entering and leaving each pixel measured using time-distance helioseismology are used to determine sites of supergranular outflows. Alternatively, correlation tracking analysis of mesogranular scales of motion applied to the same time series is used to deduce the near-surface flow field. These two approaches provide the means to assess the patterns and evolution of horizontal flows on supergranular scales even near disk center, which is not feasible with direct line-of-sight Doppler measurements. We find that the locations of the supergranular outflows seen in flow fields generated from correlation tracking coincide well with the locations of the outflows determined from the time-distance analysis, with a mean correlation coefficient after smoothing of r̄_s = 0.840. Near-surface velocity field measurements can be used to study the evolution of the supergranular network, as merging and splitting events are observed to occur in these images. The data consist of one 2048-minute time series of high-resolution (0.6" pixels) line-of-sight velocity images taken by MDI on 1997 January 16-18 at a cadence of one minute.
Historical gaseous and primary aerosol emissions in the United States from 1990-2010
An accurate description of emissions is crucial for model simulations to reproduce and interpret observed phenomena over extended time periods. In this study, we used an approach based on activity data to develop a consistent series of spatially resolved emissions in the United S...
The Intern Studio: A Pilot Study.
ERIC Educational Resources Information Center
Wix, Linney
1995-01-01
Describes and discusses the Intern Studio Project, which consists of the provision of regular open studio time for art therapy interns in a state university graduate program. Psychological and artistic bases for the open studio approach are discussed, and include the relational approach, Hillman's essentialist paradigm, and series and context…
Collective Security and the Demand for Legal Handguns.
ERIC Educational Resources Information Center
McDowall, David; Loftin, Colin
1983-01-01
Low confidence in collective security contributes to the need for and the resistance to gun control policies. Time-series data on legal gun demand in Detroit from 1951 to 1977 are consistent with a model in which individuals respond to high violent crime rates, civil disorders, and police strength. (Author/RM)
USDA-ARS?s Scientific Manuscript database
The intracellular circadian clock consists of a series of transcriptional modulators that together allow the cell to perceive the time of day. Circadian clocks have been identified within various components of the cardiovascular system (e.g., cardiomyocytes, vascular smooth muscle cells) and possess...
Time Series ARIMA Models of Undergraduate Grade Point Average.
ERIC Educational Resources Information Center
Rogers, Bruce G.
The Auto-Regressive Integrated Moving Average (ARIMA) Models, often referred to as Box-Jenkins models, are regression methods for analyzing sequential dependent observations with large amounts of data. The Box-Jenkins approach, a three-stage procedure consisting of identification, estimation and diagnosis, was used to select the most appropriate…
NASA Astrophysics Data System (ADS)
Miller, R. J.; Reed, D.; Washburn, L.; Bell, T. W.; Blanchette, C. A.
2016-02-01
Time series data collected by the Santa Barbara Coastal Long-Term Ecological Research program on giant kelp forests and the environmental factors that influence them provide a unique opportunity to examine the extent and ecological consequences of recent anomalies in physical and chemical properties of a shallow water benthic marine ecosystem. Positive temperature anomalies have been recorded in all but two months since early 2013 with deviations ranging as high as 3.8 °C above the 14-year monthly mean, which is unprecedented in the time series. Positive anomalies in salinity (ΔS) were also observed every month since late 2012 and ΔS exceeded 0.3 for several months in 2013 and 2014. Positive ΔS values occurred in previous years, but were weaker and shorter in duration. Apart from 1-2 months, anomalies in nitrate, phosphate, and silicate turned consistently negative in late 2012. However, comparable anomalies in these nutrients occurred earlier in the record, especially before 2008 for nitrate and phosphate. Anomalies in key ecological characteristics of giant kelp forests associated with the large positive temperature anomalies have been much less striking. Water column chlorophyll a, the standing biomass of giant kelp and densities of many kelp forest consumers have been lower than normal in recent years, but not markedly so compared to other years in the time series. Shorter time series data on pigment concentrations in giant kelp revealed a declining trend in recent years, consistent with the below normal levels observed in kelp tissue nitrogen. The most dramatic change in kelp forests that coincided with the onset of the temperature anomalies was observed in sea stars, which first showed signs of a wasting disease in fall of 2013. The disease spread rapidly from north to south and by spring 2014 infections were prevalent throughout southern California. Large corresponding increases in the abundance of starfish prey have yet to be observed.
Ostrowski, M; Paulevé, L; Schaub, T; Siegel, A; Guziolowski, C
2016-11-01
Boolean networks (and more general logic models) are useful frameworks to study signal transduction across multiple pathways. Logic models can be learned from a prior knowledge network structure and multiplex phosphoproteomics data. However, most efficient and scalable training methods focus on the comparison of two time-points and assume that the system has reached an early steady state. In this paper, we generalize such a learning procedure to take into account the time series traces of phosphoproteomics data in order to discriminate Boolean networks according to their transient dynamics. To that end, we identify a necessary condition that must be satisfied by the dynamics of a Boolean network to be consistent with a discretized time series trace. Based on this condition, we use Answer Set Programming to compute an over-approximation of the set of Boolean networks which fit best with experimental data and provide the corresponding encodings. Combined with model-checking approaches, we end up with a global learning algorithm. Our approach is able to learn logic models with a true positive rate higher than 78% in two case studies of mammalian signaling networks; for a larger case study, our method provides optimal answers after 7 min of computation. We quantified the gain in prediction precision of our method compared to learning approaches based on static data. Finally, as an application, our method proposes erroneous time-points in the time series data with respect to the optimal learned logic models. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
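The idea of filtering candidate Boolean networks against a discretized time series trace can be illustrated with a toy synchronous-update check; the sketch below is not the paper's ASP encoding and ignores asynchronous semantics and over-approximation.

```python
# Toy check: can a candidate Boolean network reproduce a binarized time-series
# trace under synchronous updates? Network rules and the trace are hypothetical.
def step(state, rules):
    return tuple(int(rules[node](state)) for node in range(len(state)))

def consistent_with_trace(rules, trace):
    return all(step(s, rules) == s_next for s, s_next in zip(trace, trace[1:]))

# Hypothetical 3-node network: A (node 0), B (node 1), C (node 2)
candidate_1 = {0: lambda s: s[0],                 # A holds its value
               1: lambda s: s[0] and not s[2],    # B = A AND NOT C
               2: lambda s: s[1]}                 # C copies B
candidate_2 = {0: lambda s: s[0],
               1: lambda s: s[0] or s[2],         # alternative regulation of B
               2: lambda s: s[1]}

# Binarized phosphoproteomics-like trace of (A, B, C) states over time
trace = [(1, 0, 0), (1, 1, 0), (1, 1, 1), (1, 0, 1)]
for name, rules in [("candidate_1", candidate_1), ("candidate_2", candidate_2)]:
    print(name, consistent_with_trace(rules, trace))   # only candidate_1 fits
```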
Halliday, David M; Senik, Mohd Harizal; Stevenson, Carl W; Mason, Rob
2016-08-01
The ability to infer network structure from multivariate neuronal signals is central to computational neuroscience. Directed network analyses typically use parametric approaches based on auto-regressive (AR) models, where networks are constructed from estimates of AR model parameters. However, the validity of using low order AR models for neurophysiological signals has been questioned. A recent article introduced a non-parametric approach to estimate directionality in bivariate data; non-parametric approaches are free from concerns over model validity. We extend the non-parametric framework to include measures of directed conditional independence, using scalar measures that decompose the overall partial correlation coefficient summatively by direction, and a set of functions that decompose the partial coherence summatively by direction. A time domain partial correlation function allows both time and frequency views of the data to be constructed. The conditional independence estimates are conditioned on a single predictor. The framework is applied to simulated cortical neuron networks and mixtures of Gaussian time series data with known interactions. It is applied to experimental data consisting of local field potential recordings from bilateral hippocampus in anaesthetised rats. The framework offers a novel, non-parametric alternative for estimating directed interactions in multivariate neuronal recordings, with increased flexibility in dealing with both spike train and time series data.
Dausey, David J; Chandra, Anita; Schaefer, Agnes G; Bahney, Ben; Haviland, Amelia; Zakowski, Sarah; Lurie, Nicole
2008-09-01
We tested telephone-based disease surveillance systems in local health departments to identify system characteristics associated with consistent and timely responses to urgent case reports. We identified a stratified random sample of 74 health departments and conducted a series of unannounced tests of their telephone-based surveillance systems. We used regression analyses to identify system characteristics that predicted fast connection with an action officer (an appropriate public health professional). Optimal performance in consistently connecting callers with an action officer in 30 minutes or less was achieved by 31% of participating health departments. Reaching a live person upon dialing, regardless of who that person was, was the strongest predictor of optimal performance both in being connected with an action officer and in consistency of connection times. Health departments can achieve optimal performance in consistently connecting a caller with an action officer in 30 minutes or less and may improve performance by using a telephone-based disease surveillance system in which the phone is answered by a live person at all times.
NASA Astrophysics Data System (ADS)
Susanto, R. D.; Setiawan, A.; Zheng, Q.; Sulistyo, B.; Adi, T. R.; Agustiadi, T.; Trenggono, M.; Triyono, T.; Kuswardani, A.
2016-12-01
The seasonal variability of the full-lifetime Aquarius sea surface salinity time series, from August 25, 2011 to June 7, 2015, is compared to salinity time series obtained from in situ observations in the Karimata Strait. The Karimata Strait plays dual roles in water exchange between the Pacific and the Indian Ocean. The salinity in the Karimata Strait is strongly affected by seasonal monsoon winds. During the boreal winter monsoon, northwesterly winds draw low salinity water from the South China Sea into the Java Sea and, at the same time, the Java Sea receives an influx of Indian Ocean water via the Sunda Strait. The Java Sea water reduces the main Indonesian throughflow in the Makassar Strait. Conditions are reversed during the summer monsoon. Low salinity water from the South China Sea also controls the vertical structure of water properties in the upper layer of the Makassar Strait and the Lombok Strait. As a part of the South China Sea and Indonesian Seas Transport/Exchange (SITE) program, a trawl-resistant bottom-mounted CTD was deployed in the Karimata Strait from mid-2010 to mid-2016 at a water depth of 40 m. CTD casts during the mooring recoveries and deployments are used to compare the bottom salinity data. This in situ salinity time series is compared with various Aquarius NASA salinity products (the level 2, level 3 ascending and descending tracks, and the seven-day rolling average) to check consistency and to perform correlation and statistical analyses. The preliminary results show that the Aquarius salinity time series has a larger seasonal amplitude than the in situ data.
NASA Technical Reports Server (NTRS)
Ulsig, Laura; Nichol, Caroline J.; Huemmrich, Karl F.; Landis, David R.; Middleton, Elizabeth M.; Lyapustin, Alexei I.; Mammarella, Ivan; Levula, Janne; Porcar-Castell, Albert
2017-01-01
Long-term observations of vegetation phenology can be used to monitor the response of terrestrial ecosystems to climate change. Satellite remote sensing provides the most efficient means to observe phenological events through time series analysis of vegetation indices such as the Normalized Difference Vegetation Index (NDVI). This study investigates the potential of a Photochemical Reflectance Index (PRI), which has been linked to vegetation light use efficiency, to improve the accuracy of MODIS-based estimates of phenology in an evergreen conifer forest. Timings of the start and end of the growing season (SGS and EGS) were derived from a 13-year-long time series of PRI and NDVI based on a MAIAC (multi-angle implementation of atmospheric correction) processed MODIS dataset and standard MODIS NDVI product data. The derived dates were validated with phenology estimates from ground-based flux tower measurements of ecosystem productivity. Significant correlations were found between the MAIAC time series and ground-estimated SGS (R² = 0.36-0.8), which is remarkable since previous studies have found it difficult to observe inter-annual phenological variations in evergreen vegetation from satellite data. The considerably noisier NDVI product could not accurately predict SGS, and EGS could not be derived successfully from any of the time series. While the strongest relationship overall was found between SGS derived from the ground data and PRI, MAIAC NDVI exhibited high correlations with SGS more consistently (R² > 0.6 in all cases). The results suggest that PRI can serve as an effective indicator of spring seasonal transitions; however, additional work is necessary to confirm the relationships observed and to further explore the usefulness of MODIS PRI for detecting phenology.
Chaos control in delayed phase space constructed by the Takens embedding theory
NASA Astrophysics Data System (ADS)
Hajiloo, R.; Salarieh, H.; Alasty, A.
2018-01-01
In this paper, the problem of chaos control in discrete-time chaotic systems with unknown governing equations and limited measurable states is investigated. Using the time-series of only one measurable state, an algorithm is proposed to stabilize unstable fixed points. The approach consists of three steps: first, using Takens embedding theory, a delayed phase space preserving the topological characteristics of the unknown system is reconstructed. Second, a dynamic model is identified by recursive least squares method to estimate the time-series data in the delayed phase space. Finally, based on the reconstructed model, an appropriate linear delayed feedback controller is obtained for stabilizing unstable fixed points of the system. Controller gains are computed using a systematic approach. The effectiveness of the proposed algorithm is examined by applying it to the generalized hyperchaotic Henon system, prey-predator population map, and the discrete-time Lorenz system.
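As a rough illustration of the first two steps described above (a minimal sketch only; the logistic-map series, embedding parameters, and function names are illustrative assumptions, not taken from the paper), a delay embedding and a recursive least-squares fit of a one-step linear model in the reconstructed phase space might look like:

```python
import numpy as np

def takens_embed(x, dim, tau):
    """Reconstruct a delayed phase space from a scalar time series:
    row t is [x[t], x[t+tau], ..., x[t+(dim-1)*tau]]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[k * tau: k * tau + n] for k in range(dim)])

def rls_fit(Z, forget=0.99):
    """Recursive least squares estimate of A in z[t+1] ~ A z[t]
    over the reconstructed (delayed) phase space."""
    d = Z.shape[1]
    A = np.zeros((d, d))
    P = 1e3 * np.eye(d)                       # inverse-correlation estimate
    for t in range(len(Z) - 1):
        z, z_next = Z[t], Z[t + 1]
        k = P @ z / (forget + z @ P @ z)      # RLS gain vector
        A += np.outer(z_next - A @ z, k)      # row-wise parameter update
        P = (P - np.outer(k, z @ P)) / forget
    return A

# Example on a logistic-map series (only one "measured" state is available).
x = np.empty(2000); x[0] = 0.4
for t in range(1999):
    x[t + 1] = 3.9 * x[t] * (1.0 - x[t])
Z = takens_embed(x, dim=3, tau=1)
A = rls_fit(Z)   # identified local linear model in the delayed phase space
```

The identified model could then be used, as in the paper's third step, to design a linear delayed feedback gain that stabilizes an unstable fixed point; that design step is not shown here.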
Steps towards a consistent Climate Forecast System Reanalysis wave hindcast (1979-2016)
NASA Astrophysics Data System (ADS)
Stopa, Justin E.; Ardhuin, Fabrice; Huchet, Marion; Accensi, Mickael
2017-04-01
Surface gravity waves are being increasingly recognized as playing an important role within the climate system. Wave hindcasts and reanalysis products of long time series (>30 years) have been instrumental in understanding and describing the wave climate for the past several decades and have allowed a better understanding of extreme waves and inter-annual variability. Wave hindcasts have the advantage of covering the oceans at higher space-time resolution than is possible with conventional observations from satellites and buoys. Wave reanalysis systems like ECMWF's ERA-Interim directly include a wave model coupled to the ocean and atmosphere; otherwise, reanalysis wind fields are used to drive a wave model to reproduce the wave field as a long time series. The ERA-Interim dataset is consistent in time, but cannot adequately resolve extreme waves. On the other hand, the NCEP Climate Forecast System (CFSR) wind field better resolves the extreme wind speeds, but suffers from discontinuous features in time, which are due to the quantity and quality of the remote sensing data incorporated into the product. Therefore, a consistent hindcast that resolves the extreme waves still eludes us, limiting our understanding of the wave climate. In this study, we systematically correct the CFSR wind field to reproduce a wave field that is homogeneous in time. To verify the homogeneity of our hindcast, we compute error metrics on a monthly basis using observations from a merged altimeter wave database that has been calibrated and quality controlled over 1985-2016. Before 1985, only a few wave observations exist, and these are limited to a select number of wave buoys, mostly in the Northern Hemisphere. We therefore supplement our wave observations with seismic data, which respond to nonlinear wave interactions created by opposing waves with nearly equal wavenumbers. Within the CFSR wave hindcast, we find both spatial and temporal discontinuities in the error metrics. The Southern Hemisphere often has wind speed biases larger than the Northern Hemisphere, and we propose a simple correction that reduces these biases by applying a taper shaped by a half-Hanning window. The discontinuous features in time are corrected by scaling the entire wind field by percentages typically ranging from 1 to 3%. Our analysis is performed on monthly time series, and we expect the monthly statistics to be adequate for climate studies.
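A latitude taper shaped by a half-Hanning window, as mentioned above, could be implemented as a multiplicative wind-speed correction; the grid, the 3% ceiling, and the equator cut-off below are illustrative assumptions rather than the study's actual parameters:

```python
import numpy as np

lat = np.linspace(-90.0, 90.0, 181)     # hypothetical 1-degree latitude grid
max_reduction = 0.03                    # assumed maximum correction (~3%)

# Rising half of a Hann window (0 at the first sample, 1 at the centre).
half_hann = np.hanning(181)[:91]

taper = np.zeros_like(lat)
taper[lat <= 0] = half_hann[::-1]       # 1 at 90S, smoothly decaying to 0 at the equator

scale = 1.0 - max_reduction * taper     # multiplicative wind-speed correction per latitude
# corrected_wind = scale[:, None] * wind_speed   # broadcast along the longitude axis
```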
Studying the Motivated Agent Through Time: Personal Goal Development During the Adult Life Span.
Dunlop, William L; Bannon, Brittany L; McAdams, Dan P
2017-04-01
This research examined the rank-order and mean-level consistency of personal goals at two periods in the adult life span. Personal goal continuity was considered among a group of young adults (N = 145) who reported their goals three times over a 3-year period and among a group of midlife adults (N = 163) who specified their goals annually over a 4-year period. Goals were coded for a series of motive-based (viz., achievement, affiliation, intimacy, power) and domain-based (viz., finance, generativity, health, travel) categories. In both samples, we noted a moderate degree of rank-order consistency across assessment periods. In addition, the majority of goal categories exhibited a high degree of mean-level consistency. The results of this research suggest that (a) the content of goals exhibits a modest degree of rank-order consistency and a substantial degree of mean-level consistency over time, and (b) considering personality continuity and development as manifest via goals represents a viable strategy for personality psychologists. © 2015 Wiley Periodicals, Inc.
Per capita alcohol consumption in Australia: will the real trend please step forward?
Chikritzhs, Tanya N; Allsop, Steve J; Moodie, A Rob; Hall, Wayne D
2010-11-15
To estimate the national trend in per capita consumption (PCC) of alcohol for Australians aged 15 years and older for the financial years 1990-91 to 2008-09. With the use of data obtained from Australian Bureau of Statistics' catalogues and World Advertising Research Centre reports, three alternative series of annual totals of PCC of alcohol for the past 20 years (1990-91 to 2008-09) were estimated based on different assumptions about the alcohol content of wine. For the "old" series, the alcohol content of wine was assumed to have been stable over time. For the "new" series, the alcohol content of wine was assumed to have increased once in 2004-05 and then to have remained stable to 2008-09. For the "adjusted" series, the alcohol content of wine was assumed to have gradually increased over time, beginning in 1998-99. Linear trend analysis was applied to identify significant trends. National trend in annual PCC of alcohol 1990-91 to 2008-09. The new and adjusted series of annual totals of PCC of alcohol showed increasing trends; the old series was stable. Until recently, official national annual totals of PCC of alcohol were underestimated and led to the mistaken impression that levels of alcohol consumption had been stable since the early 1990s. In fact, Australia's total PCC has been increasing significantly over time because of a gradual increase in the alcohol content and market share of wine and is now at one of its highest points since 1991-92. This new information is consistent with evidence of increasing alcohol-related harm and highlights the need for timely and accurate data on alcohol sales and harms across Australia.
NASA Astrophysics Data System (ADS)
Csatho, B. M.; Schenk, A. F.; Babonis, G. S.; van den Broeke, M. R.; Kuipers Munneke, P.; van der Veen, C. J.; Khan, S. A.; Porter, D. F.
2016-12-01
This study presents a new, comprehensive reconstruction of Greenland Ice Sheet elevation changes, generated using the Surface Elevation And Change detection (SERAC) approach. 35-year-long elevation-change time series (1980-2015) were obtained at more than 150,000 locations from observations acquired by NASA's airborne and spaceborne laser altimeters (ATM, LVIS, ICESat), PROMICE laser altimetry data (2007-2011) and a DEM covering the ice sheet margin derived from stereo aerial photographs (1970s-80s). After removing the effect of Glacial Isostatic Adjustment (GIA) and the elastic crustal response to changes in ice loading, the time series were partitioned into changes due to surface processes and ice dynamics and then converted into mass change histories. Using gridded products, we examined ice sheet elevation and mass change patterns and compared them with other estimates at different scales, from individual outlet glaciers through large drainage basins to the entire ice sheet. Both the SERAC time series and the grids derived from these time series revealed significant spatial and temporal variations of dynamic mass loss and widespread intermittent thinning, indicating the complexity of ice sheet response to climate forcing. To investigate the regional and local controls of ice dynamics, we examined thickness change time series near outlet glacier grounding lines. Changes on most outlet glaciers were consistent with one or more episodes of dynamic thinning that propagate upstream from the glacier terminus. The spatial pattern of the onset, duration, and termination of these dynamic thinning events suggests a regional control, such as warming ocean and air temperatures. However, the intricate spatiotemporal pattern of dynamic thickness change suggests that, regardless of the forcing responsible for initial glacier acceleration and thinning, the response of individual glaciers is modulated by local conditions. We use statistical methods, such as principal component analysis and multivariate regression, to analyze the dynamic ice-thickness change time series derived by SERAC and to investigate the primary forcings and controls on outlet glacier changes.
NASA Astrophysics Data System (ADS)
Huang, Liang; Ni, Xuan; Ditto, William L.; Spano, Mark; Carney, Paul R.; Lai, Ying-Cheng
2017-01-01
We develop a framework to uncover and analyse dynamical anomalies from massive, nonlinear and non-stationary time series data. The framework consists of three steps: preprocessing of massive datasets to eliminate erroneous data segments, application of the empirical mode decomposition and Hilbert transform paradigm to obtain the fundamental components embedded in the time series at distinct time scales, and statistical/scaling analysis of the components. As a case study, we apply our framework to detecting and characterizing high-frequency oscillations (HFOs) from a big database of rat electroencephalogram recordings. We find a striking phenomenon: HFOs exhibit on-off intermittency that can be quantified by algebraic scaling laws. Our framework can be generalized to big data-related problems in other fields such as large-scale sensor data and seismic data analysis.
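The Hilbert-transform step of such a framework can be sketched as follows; the EMD step itself (which would normally supply the oscillatory components) is not shown, and the sampling rate, burst signal and function names are illustrative assumptions:

```python
import numpy as np
from scipy.signal import hilbert

def instantaneous_amplitude_frequency(component, fs):
    """Given one oscillatory component (e.g., an intrinsic mode function
    from EMD), return its instantaneous amplitude and frequency via the
    analytic signal."""
    analytic = hilbert(component)
    amplitude = np.abs(analytic)
    phase = np.unwrap(np.angle(analytic))
    freq = np.diff(phase) / (2.0 * np.pi) * fs   # instantaneous frequency (Hz)
    return amplitude, freq

# Toy component: a burst-like oscillation standing in for a candidate HFO.
fs = 2000.0                                      # assumed sampling rate (Hz)
t = np.arange(0, 1.0, 1.0 / fs)
component = np.exp(-((t - 0.5) ** 2) / 0.001) * np.sin(2 * np.pi * 250 * t)
amp, f_inst = instantaneous_amplitude_frequency(component, fs)
```

Scaling analysis of the on-off intermittency would then operate on the amplitude envelope (e.g., durations of laminar phases), which is beyond this sketch.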
Small area population forecasting: some experience with British models.
Openshaw, S; Van Der Knaap, G A
1983-01-01
This study is concerned with the evaluation of the various models including time-series forecasts, extrapolation, and projection procedures, that have been developed to prepare population forecasts for planning purposes. These models are evaluated using data for the Netherlands. "As part of a research project at the Erasmus University, space-time population data has been assembled in a geographically consistent way for the period 1950-1979. These population time series are of sufficient length for the first 20 years to be used to build models and then evaluate the performance of the model for the next 10 years. Some 154 different forecasting models for 832 municipalities have been evaluated. It would appear that the best forecasts are likely to be provided by either a Holt-Winters model, or a ratio-correction model, or a low order exponential-smoothing model." excerpt
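One of the better-performing model families mentioned above, Holt-type exponential smoothing, can be evaluated with a fit-then-forecast split in the spirit of the study; the population values below are invented for illustration and statsmodels is assumed to be available:

```python
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Hypothetical annual population counts (thousands) for one municipality,
# 1950-1979: the first 20 values are used for fitting and the last 10 for
# evaluation, mirroring the paper's split.
pop = np.array([52.1, 52.8, 53.4, 54.1, 54.9, 55.2, 55.8, 56.5, 57.1, 57.6,
                58.0, 58.6, 59.3, 59.9, 60.2, 60.8, 61.5, 62.1, 62.4, 63.0,
                63.5, 64.2, 64.8, 65.1, 65.9, 66.4, 67.0, 67.3, 68.1, 68.6])

train, test = pop[:20], pop[20:]

# Additive-trend (Holt-type) exponential smoothing, one of the model
# families evaluated in the study.
fit = ExponentialSmoothing(train, trend="add", seasonal=None).fit()
forecast = fit.forecast(len(test))

mape = np.mean(np.abs((test - forecast) / test)) * 100.0
print(f"10-year-ahead MAPE: {mape:.1f}%")
```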
Beck, R.A.; Rettig, A.J.; Ivenso, C.; Eisner, Wendy R.; Hinkel, Kenneth M.; Jones, Benjamin M.; Arp, C.D.; Grosse, G.; Whiteman, D.
2010-01-01
Ice formation and breakup on Arctic rivers strongly influence river flow, sedimentation, river ecology, winter travel, and subsistence fishing and hunting by Alaskan Natives. We use time-series ground imagery of the Meade River to examine the process at high temporal and spatial resolution. Freezeup from complete liquid cover to complete ice cover of the Meade River at Atqasuk, Alaska in the fall of 2008 occurred in less than three days between 28 September and 2 October 2008. Breakup in 2009 occurred in less than two hours between 23:47 UTC on 23 May 2009 and 01:27 UTC on 24 May 2009. All times in UTC. Breakup in 2009 and 2010 was of the thermal style in contrast to the mechanical style observed in 1966 and is consistent with a warming Arctic. © 2010 Taylor & Francis.
An architecture for consolidating multidimensional time-series data onto a common coordinate grid
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shippert, Tim; Gaustad, Krista
Consolidating measurement data for use by data models or in inter-comparison studies frequently requires transforming the data onto a common grid. Standard methods for interpolating multidimensional data are often not appropriate for data with non-homogeneous dimensionality, and are hard to implement in a consistent manner for different datastreams. In addition, these challenges are increased when dealing with the automated procedures necessary for use with continuous, operational datastreams. In this paper we introduce a method of applying a series of one-dimensional transformations to merge data onto a common grid, examine the challenges of ensuring consistent application of data consolidation methods, present a framework for addressing those challenges, and describe the implementation of such a framework for the Atmospheric Radiation Measurement (ARM) program.
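The basic idea of a one-dimensional transformation onto a common grid can be sketched as a per-coordinate interpolation; the function name, the NaN handling and the example numbers below are illustrative assumptions, not the ARM implementation:

```python
import numpy as np

def regrid_1d(values, src_coord, dst_coord):
    """Map a 1-D series of measurements from its native coordinate
    (e.g., an instrument's time or height grid) onto a common grid by
    linear interpolation; points outside the source range become NaN."""
    out = np.interp(dst_coord, src_coord, values)
    out[(dst_coord < src_coord[0]) | (dst_coord > src_coord[-1])] = np.nan
    return out

# Hypothetical example: a datastream sampled at irregular times is
# consolidated onto a regular 60 s grid before inter-comparison.
src_time = np.array([0.0, 47.0, 131.0, 188.0, 251.0, 340.0])
obs      = np.array([3.1, 3.4,  2.9,   3.0,   3.3,   3.6])
common_time = np.arange(0.0, 360.0, 60.0)
on_grid = regrid_1d(obs, src_time, common_time)
```

For multidimensional data, such a transformation would be applied one dimension at a time, which is the essence of the approach the abstract describes.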
Empirical analysis of the effects of cyber security incidents.
Davis, Ginger; Garcia, Alfredo; Zhang, Weide
2009-09-01
We analyze the time series associated with web traffic for a representative set of online businesses that have suffered widely reported cyber security incidents. Our working hypothesis is that cyber security incidents may prompt (security conscious) online customers to opt out and conduct their business elsewhere or, at the very least, to refrain from accessing online services. For companies relying almost exclusively on online channels, this presents an important business risk. We test for structural changes in these time series that may have been caused by these cyber security incidents. Our results consistently indicate that cyber security incidents do not affect the structure of web traffic for the set of online businesses studied. We discuss various public policy considerations stemming from our analysis.
Male unemployment and cause-specific mortality in postwar Scotland.
Forbes, J F; McGregor, A
1987-01-01
This article reports a time-series analysis of male unemployment and mortality in postwar Scotland. The results provide little evidence to support the hypothesis that unemployment exerts a significant and consistent positive impact on mortality from all causes, lung cancer, ischemic heart disease, and cerebrovascular disease. Although significant positive associations between unemployment and mortality from lung cancer and ischemic heart disease were detected for older males in the short term, the long-term association between unemployment and mortality tends to be negative. Further progress on establishing possible causal relationships between unemployment and health requires both the collaboration of medical and social scientists and a well designed prospective study that avoids many of the problems associated with time-series and cross-sectional analyses.
Status of CSR RL06 GRACE reprocessing and preliminary results
NASA Astrophysics Data System (ADS)
Save, H.
2017-12-01
The GRACE project plans to re-process the GRACE mission data in order to be consistent with the first gravity products released by the GRACE-FO project. The RL06 reprocessing will harmonize the GRACE time-series with the first release of GRACE-FO. This paper catalogues the changes in the upcoming RL06 release and discusses the quality improvements as compared to the current RL05 release. The processing and parameterization changes as compared to the current release are also discussed. This paper discusses the evolution of the quality of the GRACE solutions and characterizes the errors over the past few years. The possible challenges associated with connecting the GRACE time series with that from GRACE-FO are also discussed.
Jansen, Teunis; Kristensen, Kasper; Payne, Mark; Edwards, Martin; Schrum, Corinna; Pitois, Sophie
2012-01-01
We present a unique view of mackerel (Scomber scombrus) in the North Sea based on a new time series of larvae caught by the Continuous Plankton Recorder (CPR) survey from 1948-2005, covering the period both before and after the collapse of the North Sea stock. Hydrographic backtrack modelling suggested that the effect of advection is very limited between spawning and larvae capture in the CPR survey. Using a statistical technique not previously applied to CPR data, we then generated a larval index that accounts for both catchability as well as spatial and temporal autocorrelation. The resulting time series documents the significant decrease of spawning from before 1970 to recent depleted levels. Spatial distributions of the larvae, and thus the spawning area, showed a shift from early to recent decades, suggesting that the central North Sea is no longer as important as the areas further west and south. These results provide a consistent and unique perspective on the dynamics of mackerel in this region and can potentially resolve many of the unresolved questions about this stock.
Modeling sports highlights using a time-series clustering framework and model interpretation
NASA Astrophysics Data System (ADS)
Radhakrishnan, Regunathan; Otsuka, Isao; Xiong, Ziyou; Divakaran, Ajay
2005-01-01
In our past work on sports highlights extraction, we have shown the utility of detecting audience reaction using an audio classification framework. The audio classes in the framework were chosen based on intuition. In this paper, we present a systematic way of identifying the key audio classes for sports highlights extraction using a time series clustering framework. We treat the low-level audio features as a time series and model the highlight segments as "unusual" events in a background of a "usual" process. The set of audio classes to characterize the sports domain is then identified by analyzing the consistent patterns in each of the clusters output from the time series clustering framework. The distribution of features from the training data so obtained for each of the key audio classes is parameterized by a Minimum Description Length Gaussian Mixture Model (MDL-GMM). We also interpret the meaning of each of the mixture components of the MDL-GMM for the key audio class (the "highlight" class) that is correlated with highlight moments. Our results show that the "highlight" class is a mixture of audience cheering and commentator's excited speech. Furthermore, we show that the precision-recall performance for highlights extraction based on this "highlight" class is better than that of our previous approach, which uses only audience cheering as the key highlight class.
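Fitting a Gaussian mixture to class features with an information-criterion-based choice of the number of components can be sketched as follows; MDL is closely related to BIC, which is used here as a stand-in, and the synthetic features and parameter values are illustrative assumptions only:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Stand-in for low-level audio features of the "highlight" class
# (e.g., frames of cheering and of excited speech); two synthetic clusters.
X = np.vstack([rng.normal(0.0, 1.0, size=(300, 4)),
               rng.normal(3.0, 0.5, size=(200, 4))])

# Select the number of mixture components by an information criterion
# (BIC here, as a proxy for the MDL criterion named in the abstract).
models = {k: GaussianMixture(n_components=k, random_state=0).fit(X) for k in range(1, 7)}
best_k = min(models, key=lambda k: models[k].bic(X))
gmm = models[best_k]
print(best_k, gmm.weights_)   # mixture weights and components can then be interpreted
```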
NASA Astrophysics Data System (ADS)
Tesmer, Volker; Boehm, Johannes; Heinkelmann, Robert; Schuh, Harald
2007-06-01
This paper compares estimated terrestrial reference frames (TRF) and celestial reference frames (CRF) as well as position time-series in terms of systematic differences, scale, annual signals and station position repeatabilities using four different tropospheric mapping functions (MF): The NMF (Niell Mapping Function) and the recently developed GMF (Global Mapping Function) consist of easy-to-handle stand-alone formulae, whereas the IMF (Isobaric Mapping Function) and the VMF1 (Vienna Mapping Function 1) are determined from numerical weather models. All computations were performed at the Deutsches Geodätisches Forschungsinstitut (DGFI) using the OCCAM 6.1 and DOGS-CS software packages for Very Long Baseline Interferometry (VLBI) data from 1984 until 2005. While it turned out that CRF estimates only slightly depend on the MF used, showing small systematic effects up to 0.025 mas, some station heights of the computed TRF change by up to 13 mm. The best agreement was achieved for the VMF1 and GMF results concerning the TRFs, and for the VMF1 and IMF results concerning scale variations and position time-series. The amplitudes of the annual periodic signals in the time-series of estimated heights differ by up to 5 mm. The best precision in terms of station height repeatability is found for the VMF1, which is 5-7% better than for the other MFs.
Work-related accidents among the Iranian population: a time series analysis, 2000–2011
Karimlou, Masoud; Imani, Mehdi; Hosseini, Agha-Fatemeh; Dehnad, Afsaneh; Vahabi, Nasim; Bakhtiyari, Mahmood
2015-01-01
Background Work-related accidents result in human suffering and economic losses and are considered as a major health problem worldwide, especially in the economically developing world. Objectives To introduce seasonal autoregressive moving average (ARIMA) models for time series analysis of work-related accident data for workers insured by the Iranian Social Security Organization (ISSO) between 2000 and 2011. Methods In this retrospective study, all insured people experiencing at least one work-related accident during a 10-year period were included in the analyses. We used Box–Jenkins modeling to develop a time series model of the total number of accidents. Results There was an average of 1476 accidents per month (1476·05±458·77, mean±SD). The final ARIMA (p,d,q) (P,D,Q)s model for fitting to data was: ARIMA(1,1,1)×(0,1,1)12 consisting of the first ordering of the autoregressive, moving average and seasonal moving average parameters with 20·942 mean absolute percentage error (MAPE). Conclusions The final model showed that time series analysis of ARIMA models was useful for forecasting the number of work-related accidents in Iran. In addition, the forecasted number of work-related accidents for 2011 explained the stability of occurrence of these accidents in recent years, indicating a need for preventive occupational health and safety policies such as safety inspection. PMID:26119774
Work-related accidents among the Iranian population: a time series analysis, 2000-2011.
Karimlou, Masoud; Salehi, Masoud; Imani, Mehdi; Hosseini, Agha-Fatemeh; Dehnad, Afsaneh; Vahabi, Nasim; Bakhtiyari, Mahmood
2015-01-01
Work-related accidents result in human suffering and economic losses and are considered as a major health problem worldwide, especially in the economically developing world. To introduce seasonal autoregressive moving average (ARIMA) models for time series analysis of work-related accident data for workers insured by the Iranian Social Security Organization (ISSO) between 2000 and 2011. In this retrospective study, all insured people experiencing at least one work-related accident during a 10-year period were included in the analyses. We used Box-Jenkins modeling to develop a time series model of the total number of accidents. There was an average of 1476 accidents per month (1476·05±458·77, mean±SD). The final ARIMA (p,d,q) (P,D,Q)s model for fitting to data was: ARIMA(1,1,1)×(0,1,1)12 consisting of the first ordering of the autoregressive, moving average and seasonal moving average parameters with 20·942 mean absolute percentage error (MAPE). The final model showed that time series analysis of ARIMA models was useful for forecasting the number of work-related accidents in Iran. In addition, the forecasted number of work-related accidents for 2011 explained the stability of occurrence of these accidents in recent years, indicating a need for preventive occupational health and safety policies such as safety inspection.
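A seasonal ARIMA model with the structure reported above can be fitted with standard tools; the sketch below assumes statsmodels is available and uses synthetic monthly counts in place of the ISSO data:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical monthly accident counts standing in for the ISSO series;
# the real data are not reproduced here.
rng = np.random.default_rng(1)
idx = pd.date_range("2000-01", periods=120, freq="MS")
counts = pd.Series(1476 + 300 * np.sin(2 * np.pi * idx.month / 12) +
                   rng.normal(0, 150, size=120), index=idx)

# The ARIMA(1,1,1)x(0,1,1)12 structure reported in the abstract.
model = SARIMAX(counts, order=(1, 1, 1), seasonal_order=(0, 1, 1, 12))
result = model.fit(disp=False)
forecast = result.forecast(steps=12)    # one-year-ahead forecast of monthly accidents
```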
An algebraic method for constructing stable and consistent autoregressive filters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harlim, John, E-mail: jharlim@psu.edu; Department of Meteorology, the Pennsylvania State University, University Park, PA 16802; Hong, Hoon, E-mail: hong@ncsu.edu
2015-02-15
In this paper, we introduce an algebraic method to construct stable and consistent univariate autoregressive (AR) models of low order for filtering and predicting nonlinear turbulent signals with memory depth. By stable, we refer to the classical stability condition for the AR model. By consistent, we refer to the classical consistency constraints of Adams–Bashforth methods of order-two. One attractive feature of this algebraic method is that the model parameters can be obtained without directly knowing any training data set, as opposed to many standard, regression-based parameterization methods. It takes only long-time average statistics as inputs. The proposed method provides a discretization time step interval which guarantees the existence of a stable and consistent AR model and simultaneously produces the parameters for the AR models. In our numerical examples with two chaotic time series with different characteristics of decaying time scales, we find that the proposed AR models produce significantly more accurate short-term predictive skill and comparable filtering skill relative to the linear regression-based AR models. These encouraging results are robust across wide ranges of discretization times, observation times, and observation noise variances. Finally, we also find that the proposed model produces an improved short-time prediction relative to the linear regression-based AR models in forecasting a data set that characterizes the variability of the Madden–Julian Oscillation, a dominant tropical atmospheric wave pattern.
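The classical AR stability condition referred to above can be checked numerically from the characteristic polynomial; the sketch below shows only that check (not the paper's algebraic construction or the Adams–Bashforth consistency constraints), and the coefficient values are arbitrary examples:

```python
import numpy as np

def ar_is_stable(coeffs):
    """Classical stability check for an AR(p) model
    x[t] = a1*x[t-1] + ... + ap*x[t-p] + noise: all roots of
    z**p - a1*z**(p-1) - ... - ap must lie strictly inside the unit circle."""
    char_poly = np.concatenate(([1.0], -np.asarray(coeffs, dtype=float)))
    roots = np.roots(char_poly)
    return bool(np.all(np.abs(roots) < 1.0))

print(ar_is_stable([0.5, 0.3]))    # True: a stable AR(2)
print(ar_is_stable([1.2, 0.1]))    # False: an explosive AR(2)
```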
Synthetic wind speed scenarios generation for probabilistic analysis of hybrid energy systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Jun; Rabiti, Cristian
Hybrid energy systems consisting of multiple energy inputs and multiple energy outputs have been proposed to be an effective element to enable ever increasing penetration of clean energy. In order to better understand the dynamic and probabilistic behavior of hybrid energy systems, this paper proposes a model combining Fourier series and autoregressive moving average (ARMA) to characterize historical weather measurements and to generate synthetic weather (e.g., wind speed) data. In particular, Fourier series is used to characterize the seasonal trend in historical data, while ARMA is applied to capture the autocorrelation in the residual time series (e.g., measurements minus seasonal trends). The generated synthetic wind speed data is then utilized to perform probabilistic analysis of a particular hybrid energy system configuration, which consists of a nuclear power plant, a wind farm, battery storage, a natural gas boiler, and a chemical plant. As a result, requirements on component ramping rate, economic and environmental impacts of hybrid energy systems, and the effects of deploying different sizes of batteries in smoothing renewable variability are all investigated.
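A generic Fourier-plus-ARMA generator in the spirit of the abstract might look like the sketch below; the synthetic hourly wind record, the chosen periods, and the ARMA order are illustrative assumptions rather than the paper's settings, and statsmodels is assumed to be available:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)
# Hypothetical hourly wind-speed record (m/s) over one year, standing in
# for the historical measurements described in the paper.
n = 24 * 365
t = np.arange(n)
wind = (6.0 + 2.0 * np.sin(2 * np.pi * t / (24 * 365)) +
        1.0 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.8, n))

# Step 1: least-squares fit of a low-order Fourier series for the seasonal/diurnal trend.
periods = [24.0 * 365.0, 24.0]                      # assumed annual and daily cycles
X = np.column_stack([np.ones(n)] +
                    [f(2 * np.pi * t / p) for p in periods for f in (np.sin, np.cos)])
coef, *_ = np.linalg.lstsq(X, wind, rcond=None)
trend = X @ coef

# Step 2: ARMA model for the residual (measurement minus seasonal trend).
resid = wind - trend
arma = ARIMA(resid, order=(2, 0, 1)).fit()

# Step 3: a synthetic scenario = Fourier trend + simulated ARMA residual.
synthetic = trend + arma.simulate(nsimulations=n)
```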
NASA Astrophysics Data System (ADS)
Zhu, Ning; Sun, Shou-Guang; Li, Qiang; Zou, Hua
2014-12-01
One of the major problems in structural fatigue life analysis is establishing structural load spectra under actual operating conditions. This study conducts theoretical research and experimental validation of quasi-static load spectra on bogie frame structures of high-speed trains. The quasi-static load series that correspond to quasi-static deformation modes are identified according to the structural form and bearing conditions of high-speed train bogie frames. Moreover, a force-measuring frame is designed and manufactured based on the quasi-static load series. The load decoupling model of the quasi-static load series is then established via calibration tests. Quasi-static load-time histories, together with online tests and decoupling analysis, are obtained for the intermediate range of the Beijing-Shanghai dedicated passenger line. The damage consistency calibration of the quasi-static discrete load spectra is performed according to a damage consistency criterion and a genetic algorithm. The calibrated damage that corresponds with the quasi-static discrete load spectra satisfies the safety requirements of bogie frames.
Pridemore, William Alex; Chamlin, Mitchell B.; Cochran, John K.
2009-01-01
The dissolution of the Soviet Union resulted in sudden, widespread, and fundamental changes to Russian society. The former social welfare system, with its broad guarantees of employment, healthcare, education, and other forms of social support, was dismantled in the shift toward democracy, rule of law, and a free-market economy. This unique natural experiment provides a rare opportunity to examine the potentially disintegrative effects of rapid social change on deviance, and thus to evaluate one of Durkheim's core tenets. We took advantage of this opportunity by performing interrupted time-series analyses of annual age-adjusted homicide, suicide, and alcohol-related mortality rates for the Russian Federation using data from 1956 to 2002, with 1992-2002 as the postintervention time-frame. The ARIMA models indicate that, controlling for the long-term processes that generated these three time series, the breakup of the Soviet Union was associated with an appreciable increase in each of the cause-of-death rates. We interpret these findings as being consistent with the Durkheimian hypothesis that rapid social change disrupts social order, thereby increasing the level of crime and deviance. PMID:20165565
NASA Astrophysics Data System (ADS)
Yu, Hongjuan; Guo, Jinyun; Kong, Qiaoli; Chen, Xiaodong
2018-04-01
The static observation data from a relative gravimeter contain noise and signals such as gravity tides. This paper focuses on the extraction of the gravity tides from static relative gravimeter data for the first time by applying the combined method of empirical mode decomposition (EMD) and independent component analysis (ICA), called the EMD-ICA method. The experimental results from the CG-5 gravimeter (SCINTREX Limited, Ontario, Canada) data show that the gravity tides time series derived by EMD-ICA are consistent with the theoretical reference (Longman formula) and that the RMS of their differences reaches only 4.4 μGal. The time series of the gravity tides derived by EMD-ICA have a strong correlation with the theoretical time series, and the correlation coefficient is greater than 0.997. The accuracy of the gravity tides estimated by EMD-ICA is comparable to the theoretical model and is slightly higher than that of independent component analysis (ICA). EMD-ICA could overcome ICA's requirement to process multiple observations and slightly improve the extraction accuracy and reliability of gravity tides from relative gravimeter data compared to that estimated with ICA.
Modeling Polio Data Using the First Order Non-Negative Integer-Valued Autoregressive, INAR(1), Model
NASA Astrophysics Data System (ADS)
Vazifedan, Turaj; Shitan, Mahendran
Time series data may consist of counts, such as the number of road accidents, the number of patients in a certain hospital, or the number of customers waiting for service at a certain time. When the values of the observations are large, it is usual to use a Gaussian Autoregressive Moving Average (ARMA) process to model the time series. However, if the observed counts are small, it is not appropriate to use an ARMA process to model the observed phenomenon. In such cases we need to model the time series data by using a Non-Negative Integer-valued Autoregressive (INAR) process. The modeling of count data is based on the binomial thinning operator. In this paper we illustrate the modeling of count data using the monthly number of poliomyelitis cases in the United States between January 1970 and December 1983. We applied the AR(1) model, the Poisson regression model and the INAR(1) model, and the suitability of these models was assessed by using the Index of Agreement (I.A.). We found that the INAR(1) model is more appropriate in the sense that it had a better I.A., and it is natural since the data are counts.
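The binomial thinning operator that underlies INAR(1) is easy to simulate; the sketch below uses Poisson innovations and illustrative parameter values (not the fitted polio-data parameters):

```python
import numpy as np

def simulate_inar1(alpha, lam, n, seed=0):
    """Simulate an INAR(1) process X[t] = alpha o X[t-1] + eps[t], where
    'o' is binomial thinning (each of the X[t-1] counts survives with
    probability alpha) and eps[t] ~ Poisson(lam)."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n, dtype=int)
    x[0] = rng.poisson(lam / (1.0 - alpha))          # start near the stationary mean
    for t in range(1, n):
        survivors = rng.binomial(x[t - 1], alpha)    # binomial thinning operator
        x[t] = survivors + rng.poisson(lam)          # new arrivals
    return x

# Illustrative low-count series, loosely in the spirit of monthly polio counts.
series = simulate_inar1(alpha=0.4, lam=1.0, n=168)
```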
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-11
... by significant amounts in a very short time period before suddenly reversing to prices consistent... circuit breaker pilot program, which was implemented through a series of rule filings by the equity exchanges and by FINRA.\\6\\ The single-stock circuit breaker was designed to reduce extraordinary market...
NASA Astrophysics Data System (ADS)
Geiger, Tobias
2018-04-01
Gross domestic product (GDP) represents a widely used metric to compare economic development across time and space. GDP estimates have been routinely assembled only since the beginning of the second half of the 20th century, making comparisons with prior periods cumbersome or even impossible. In recent years various efforts have been put forward to re-estimate national GDP for specific years in the past centuries and even millennia, providing new insights into past economic development on a snapshot basis. In order to make this wealth of data utilizable across research disciplines, we here present a first continuous and consistent data set of GDP time series for 195 countries from 1850 to 2009, based mainly on data from the Maddison Project and other population and GDP sources. The GDP data are consistent with Penn World Tables v8.1 and future GDP projections from the Shared Socio-economic Pathways (SSPs), and are freely available at http://doi.org/10.5880/pik.2018.010 (Geiger and Frieler, 2018). To ease usability, we additionally provide GDP per capita data and further supplementary and data description files in the online archive. We utilize various methods to handle missing data and discuss the advantages and limitations of our methodology. Despite known shortcomings this data set provides valuable input, e.g., for climate impact research, in order to consistently analyze economic impacts from pre-industrial times to the future.
Post-Flight Estimation of Motion of Space Structures: Part 2
NASA Technical Reports Server (NTRS)
Brugarolas, Paul; Breckenridge, William
2008-01-01
A computer program related to the one described in the immediately preceding article estimates the relative position of two space structures that are hinged to each other. The input to the program consists of time-series data on distances, measured by two range finders at different positions on one structure, to a corner-cube retroreflector on the other structure. Given a Cartesian (x,y,z) coordinate system and the known x coordinate of the retroreflector relative to the y,z plane that contains the range finders, the program estimates the y and z coordinates of the retroreflector. The estimation process involves solving for the y,z coordinates of the intersection between (1) the y,z plane that contains the retroreflector and (2) spheres, centered on the range finders, having radii equal to the measured distances. In general, there are two such solutions and the program chooses the one consistent with the design of the structures. The program implements a Kalman filter. The output of the program is a time series of estimates of the relative position of the structures.
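The geometric core of that estimate, intersecting the two measurement spheres with the known plane of the retroreflector, reduces to a circle-circle intersection in the y,z plane. The sketch below is a simplified illustration with hypothetical geometry and variable names, not the flight software:

```python
import numpy as np

def estimate_yz(rf1_yz, rf2_yz, range1, range2, x_offset):
    """Estimate the (y, z) position of the retroreflector.

    rf1_yz, rf2_yz : (y, z) positions of the two range finders (their plane is x = 0)
    range1, range2 : measured distances from each range finder to the retroreflector
    x_offset       : known x coordinate of the retroreflector relative to that plane

    In the plane x = x_offset each measurement constrains the reflector to a
    circle; the two circle-intersection solutions are returned.
    """
    c1, c2 = np.asarray(rf1_yz, float), np.asarray(rf2_yz, float)
    # Projected (in-plane) radii of the two measurement spheres.
    r1 = np.sqrt(range1 ** 2 - x_offset ** 2)
    r2 = np.sqrt(range2 ** 2 - x_offset ** 2)
    d = np.linalg.norm(c2 - c1)
    a = (r1 ** 2 - r2 ** 2 + d ** 2) / (2.0 * d)
    h = np.sqrt(r1 ** 2 - a ** 2)              # NaN if the circles do not intersect
    base = c1 + a * (c2 - c1) / d
    perp = np.array([-(c2 - c1)[1], (c2 - c1)[0]]) / d
    # Two candidates; the one consistent with the structures' design is kept.
    return base + h * perp, base - h * perp

# Hypothetical geometry and measurements (not the actual flight data).
sol_a, sol_b = estimate_yz(rf1_yz=(0.0, 0.0), rf2_yz=(1.0, 0.0),
                           range1=2.2, range2=2.0, x_offset=1.5)
```

A Kalman filter, as mentioned in the abstract, would then smooth these per-epoch geometric solutions into a time series of relative-position estimates.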
Groen, Thomas A; L'Ambert, Gregory; Bellini, Romeo; Chaskopoulou, Alexandra; Petric, Dusan; Zgomba, Marija; Marrama, Laurence; Bicout, Dominique J
2017-10-26
Culex pipiens is the major vector of West Nile virus in Europe, and is causing frequent outbreaks throughout the southern part of the continent. Proper empirical modelling of the population dynamics of this species can help in understanding West Nile virus epidemiology, optimizing vector surveillance and mosquito control efforts. But modelling results may differ from place to place. In this study we look at which type of models and weather variables can be consistently used across different locations. Weekly mosquito trap collections from eight functional units located in France, Greece, Italy and Serbia for several years were combined. Additionally, rainfall, relative humidity and temperature were recorded. Correlations between lagged weather conditions and Cx. pipiens dynamics were analysed. Also seasonal autoregressive integrated moving-average (SARIMA) models were fitted to describe the temporal dynamics of Cx. pipiens and to check whether the weather variables could improve these models. Correlations were strongest between mean temperatures at short time lags, followed by relative humidity, most likely due to collinearity. Precipitation alone had weak correlations and inconsistent patterns across sites. SARIMA models could also make reasonable predictions, especially when longer time series of Cx. pipiens observations are available. Average temperature was a consistently good predictor across sites. When only short time series (~ < 4 years) of observations are available, average temperature can therefore be used to model Cx. pipiens dynamics. When longer time series (~ > 4 years) are available, SARIMAs can provide better statistical descriptions of Cx. pipiens dynamics, without the need for further weather variables. This suggests that density dependence is also an important determinant of Cx. pipiens dynamics.
A large set of potential past, present and future hydro-meteorological time series for the UK
NASA Astrophysics Data System (ADS)
Guillod, Benoit P.; Jones, Richard G.; Dadson, Simon J.; Coxon, Gemma; Bussi, Gianbattista; Freer, James; Kay, Alison L.; Massey, Neil R.; Sparrow, Sarah N.; Wallom, David C. H.; Allen, Myles R.; Hall, Jim W.
2018-01-01
Hydro-meteorological extremes such as drought and heavy precipitation can have large impacts on society and the economy. With potentially increasing risks associated with such events due to climate change, properly assessing the associated impacts and uncertainties is critical for adequate adaptation. However, the application of risk-based approaches often requires large sets of extreme events, which are not commonly available. Here, we present such a large set of hydro-meteorological time series for recent past and future conditions for the United Kingdom based on weather@home 2, a modelling framework consisting of a global climate model (GCM) driven by observed or projected sea surface temperature (SST) and sea ice which is downscaled to 25 km over the European domain by a regional climate model (RCM). Sets of 100 time series are generated for each of (i) a historical baseline (1900-2006), (ii) five near-future scenarios (2020-2049) and (iii) five far-future scenarios (2070-2099). The five scenarios in each future time slice all follow the Representative Concentration Pathway 8.5 (RCP8.5) and sample the range of sea surface temperature and sea ice changes from CMIP5 (Coupled Model Intercomparison Project Phase 5) models. Validation of the historical baseline highlights good performance for temperature and potential evaporation, but substantial seasonal biases in mean precipitation, which are corrected using a linear approach. For extremes in low precipitation over a long accumulation period (> 3 months) and shorter-duration high precipitation (1-30 days), the time series generally represents past statistics well. Future projections show small precipitation increases in winter but large decreases in summer on average, leading to an overall drying, consistent with the most recent UK Climate Projections (UKCP09) but larger in magnitude. Both drought and high-precipitation events are projected to increase in frequency and intensity in most regions, highlighting the need for appropriate adaptation measures. Overall, the presented dataset is a useful tool for assessing the risk associated with drought and more generally with hydro-meteorological extremes in the UK.
Reliability models applicable to space telescope solar array assembly system
NASA Technical Reports Server (NTRS)
Patil, S. A.
1986-01-01
A complex system may consist of a number of subsystems with several components in series, parallel, or a combination of both series and parallel. In order to predict how well the system will perform, it is necessary to know the reliabilities of the subsystems and the reliability of the whole system. The objective of the present study is to develop mathematical models of the reliability which are applicable to complex systems. The models are determined by assuming k failures out of n components in a subsystem. By taking k = 1 and k = n, these models reduce to parallel and series models; hence, the models can be specialized to parallel, series and combination systems. The models are developed by assuming the failure rates of the components as functions of time and, as such, can be applied to processes with or without aging effects. The reliability models are further specialized to the Space Telescope Solar Array (STSA) System. The STSA consists of 20 identical solar panel assemblies (SPA's). The reliabilities of the SPA's are determined by the reliabilities of solar cell strings, interconnects, and diodes. The estimates of the reliability of the system for one to five years are calculated by using the reliability estimates of solar cells and interconnects given in ESA documents. Aging effects in relation to breaks in interconnects are discussed.
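A k-out-of-n style reliability calculation of this kind can be sketched as below. The convention here is that a subsystem survives while fewer than k of its n identical components have failed, under which k = 1 recovers the series case and k = n the parallel case; the component reliabilities and subsystem sizes are illustrative assumptions, not the STSA estimates, and time dependence would enter through a time-varying component reliability r(t):

```python
from math import comb

def subsystem_reliability(n, k, r):
    """Reliability of a subsystem of n identical components that survives
    as long as fewer than k components have failed, given component
    reliability r. With this convention, k = 1 gives the series case
    (r**n) and k = n the parallel case (1 - (1 - r)**n)."""
    q = 1.0 - r
    return sum(comb(n, j) * q**j * r**(n - j) for j in range(k))

# Illustrative numbers only: 20 identical assemblies in series, each
# assembly assumed to tolerate up to 2 failed strings out of 8.
spa = subsystem_reliability(n=8, k=3, r=0.98)    # one assembly survives < 3 string failures
system = spa ** 20                               # 20 identical assemblies in series
```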
NASA Astrophysics Data System (ADS)
Staniec, Allison; Vlahos, Penny
2017-12-01
Long-term time series represent a critical part of the oceanographic community's efforts to discern natural and anthropogenically forced variations in the environment. They provide regular measurements of climate-relevant indicators including temperature, oxygen concentrations, and salinity. When evaluating time series, it is essential to isolate long-term trends from autocorrelation in data and noise due to natural variability. Herein we apply a statistical approach, well-established in atmospheric time series, to key parameters in the U.S. east coast's Long Island Sound estuary (LIS). Analysis shows that the LIS time series (established in the early 1990s) is sufficiently long to detect significant trends in physical-chemical parameters including temperature (T) and dissolved oxygen (DO). Over the last two decades, overall (combined surface and deep) LIS T has increased at an average rate of 0.08 ± 0.03 °C yr-1 while overall DO has dropped at an average rate of 0.03 ± 0.01 mg L-1 yr-1 since 1994 at the 95% confidence level. This trend is notably faster than the global open ocean T trend (0.01 °C yr-1), as might be expected for a shallower estuarine system. T and DO trends were always significant for the existing time series using four-month data increments. Rates of change of DO and T in LIS are strongly correlated, and the rate of decrease of DO concentrations is consistent with the expected reduced solubility of DO at these higher temperatures. Thus, changes in T alone across decadal timescales can account for between 33 and 100% of the observed decrease in DO. This has significant implications for other dissolved gases and the long-term management of LIS hypoxia.
Explosive Infrasonic Events: Sensor Comparison Experiment (SCE)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schnurr, J. M.; Garces, M.; Rodgers, A. J.
SCE (Sensor Comparison Experiment) 1 through 4 consist of a series of four controlled above-ground explosions designed to provide new data on overpressure propagation. Infrasound data were collected by LLNL iPhones and other sensors. Origin times, locations, HOB, and yields are not being released at this time and are therefore not included in this report. This preliminary report will be updated as access to additional data changes or as instrument responses are determined.
Moran, John L; Solomon, Patricia J
2013-05-24
Statistical process control (SPC), an industrial sphere initiative, has recently been applied in health care and public health surveillance. SPC methods assume independent observations, and process autocorrelation has been associated with an increase in false-alarm frequency. Monthly mean raw mortality (at hospital discharge) time series, 1995-2009, at the individual Intensive Care Unit (ICU) level, were generated from the Australia and New Zealand Intensive Care Society adult patient database. Evidence for (i) series autocorrelation and seasonality was demonstrated using (partial) autocorrelation ((P)ACF) function displays and classical series decomposition, and (ii) "in-control" status was sought using risk-adjusted (RA) exponentially weighted moving average (EWMA) control limits (3 sigma). Risk adjustment was achieved using a random coefficient (intercept as ICU site and slope as APACHE III score) logistic regression model, generating an expected mortality series. Application of time-series methods to an exemplar complete ICU series (1995 to end-2009) was via Box-Jenkins methodology: autoregressive moving average (ARMA) and (G)ARCH ((Generalised) Autoregressive Conditional Heteroscedasticity) models, the latter addressing volatility of the series variance. The overall data set, 1995-2009, consisted of 491324 records from 137 ICU sites; average raw mortality was 14.07%; average (SD) raw and expected mortalities ranged from 0.012 (0.113) and 0.013 (0.045) to 0.296 (0.457) and 0.278 (0.247), respectively. For the raw mortality series: 71 sites had continuous data for assessment up to or beyond lag 40, and 35% had autocorrelation through to lag 40; of 36 sites with continuous data for ≥ 72 months, all demonstrated marked seasonality. Similar numbers and percentages were seen with the expected series. Out-of-control signalling was evident for the raw mortality series with respect to RA-EWMA control limits; a seasonal ARMA model, with GARCH effects, displayed white-noise residuals which were in-control with respect to EWMA control limits and one-step prediction error limits (3SE). The expected series was modelled with a multiplicative seasonal autoregressive model. The data-generating process of monthly raw mortality series at the ICU level displayed autocorrelation, seasonality and volatility. False-positive signalling of the raw mortality series was evident with respect to RA-EWMA control limits. A time series approach using residual control charts resolved these issues.
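A basic EWMA control chart of the kind referred to above can be sketched as follows; this is a generic (not risk-adjusted) chart with conventional parameter choices, and in the risk-adjusted setting the fixed centre line would be replaced by the expected mortality from the logistic regression model:

```python
import numpy as np

def ewma_chart(x, target, sigma, lam=0.2, L=3.0):
    """Exponentially weighted moving average control chart.

    x      : series of monthly mortality proportions
    target : in-control centre line (e.g., expected mortality)
    sigma  : in-control standard deviation of individual observations
    lam    : EWMA smoothing constant
    L      : control-limit width in multiples of the EWMA standard deviation
    Returns the EWMA statistic and an out-of-control flag for each point.
    """
    z = np.empty(len(x))
    z_prev = target
    for i, xi in enumerate(x):
        z_prev = lam * xi + (1.0 - lam) * z_prev
        z[i] = z_prev
    i = np.arange(1, len(x) + 1)
    se = sigma * np.sqrt(lam / (2.0 - lam) * (1.0 - (1.0 - lam) ** (2 * i)))
    out_of_control = (z > target + L * se) | (z < target - L * se)
    return z, out_of_control

# Illustrative use on synthetic monthly mortality proportions.
rng = np.random.default_rng(4)
obs = rng.normal(0.14, 0.02, size=60)
ewma, flags = ewma_chart(obs, target=0.14, sigma=0.02)
```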
NASA Astrophysics Data System (ADS)
Rodriguez, Nicolas B.; McGuire, Kevin J.; Klaus, Julian
2017-04-01
Transit time distributions, residence time distributions and StorAge Selection functions are fundamental integrated descriptors of water storage, mixing, and release in catchments. In this contribution, we determined these time-variant functions in four neighboring forested catchments in the H.J. Andrews Experimental Forest, Oregon, USA, by employing a two-year time series of 18O in precipitation and discharge. Previous studies in these catchments assumed stationary, exponentially distributed transit times and complete mixing/random sampling to explore the influence of various catchment properties on the mean transit time. Here we relaxed such assumptions to relate transit time dynamics and the variability of StorAge Selection functions to catchment characteristics, catchment storage, and meteorological forcing seasonality. Conceptual models of the catchments, consisting of two reservoirs combined in series-parallel, were calibrated to discharge and stable isotope tracer data. We assumed randomly sampled/fully mixed conditions for each reservoir, which resulted in an incompletely mixed system overall. Based on the results, we solved the Master Equation, which describes the dynamics of water ages in storage and in catchment outflows. Consistently across all catchments, we found that transit times were generally shorter during wet periods, indicating the contribution of shallow storage (soil, saprolite) to discharge. During extended dry periods, transit times increased significantly, indicating the contribution of deeper storage (bedrock) to discharge. Our work indicated that the strong seasonality of precipitation impacted transit times by leading to a dynamic selection of stored water ages, whereas catchment size was not a control on transit times. In general, this work showed the usefulness of using time-variant transit times with conceptual models and confirmed the existence of the catchment age mixing behaviors emerging from other similar studies.
Network Analyses for Space-Time High Frequency Wind Data
NASA Astrophysics Data System (ADS)
Laib, Mohamed; Kanevski, Mikhail
2017-04-01
Recently, network science has made an important contribution to the analysis, modelling and visualization of complex time series. Numerous methods have been proposed for constructing networks. This work studies spatio-temporal wind data by using networks based on the Granger causality test. Furthermore, a visual comparison is carried out for several data frequencies and different sizes of moving window. The main attention is paid to the temporal evolution of connectivity intensity. The Hurst exponent is applied to the resulting time series in order to explore whether there is long memory in the connectivity. The results explore the space-time structure of wind data, and the approach can be applied to other environmental data. The dataset used presents a challenging case study. It consists of high-frequency (10-minute) wind data from 120 measuring stations in Switzerland for the period 2012-2013. The distribution of stations covers different geomorphological zones and elevation levels. The results are also compared with the Pearson correlation network.
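A single directed edge of such a Granger-causality network can be tested with standard tools; the sketch below assumes statsmodels and uses a synthetic pair of station series in which one station leads the other:

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(3)
# Two hypothetical station wind-speed series; station A leads station B by one step.
n = 500
a = rng.normal(size=n)
b = np.r_[0.0, 0.6 * a[:-1]] + 0.5 * rng.normal(size=n)

# Test whether the second column Granger-causes the first (A -> B here),
# as would be repeated for every ordered pair of stations to build the network.
res = grangercausalitytests(np.column_stack([b, a]), maxlag=2)
p_value = res[1][0]["ssr_ftest"][1]      # p-value of the F-test at lag 1
edge_exists = p_value < 0.05             # directed edge A -> B if significant
```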
Davis, Lindsay E
2014-12-15
To utilize a skills-based workshop series to develop pharmacy students' drug information, writing, critical-thinking, and evaluation skills during the final didactic year of training. A workshop series was implemented to focus on written (researched) responses to drug information questions. These workshops used blinded peer-grading to facilitate timely feedback and strengthen assessment skills. Each workshop was aligned to the didactic coursework content to complement and extend learning, while bridging and advancing research, writing, and critical thinking skills. Attainment of knowledge and skills was assessed by rubric-facilitated peer grades, faculty member grading, peer critique, and faculty member-guided discussion of drug information responses. Annual instructor and course evaluations consistently revealed favorable student feedback regarding workshop value. A drug information workshop series using peer-grading as the primary assessment tool was successfully implemented and was well received by pharmacy students.
Jehu, Deborah A; Lajoie, Yves; Paquet, Nicole
2017-12-21
The purpose of this study was to investigate obstacle clearance and reaction time parameters when crossing a series of six obstacles in older adults. A second aim was to examine the repeated exposure of this testing protocol once per week for 5 weeks. In total, 10 older adults (five females; age: 67.0 ± 6.9 years) walked onto and over six obstacles of varying heights (range: 100-200 mm) while completing no reaction time, simple reaction time, and choice reaction time tasks once per week for 5 weeks. The highest obstacles elicited the lowest toe clearance, and the first three obstacles revealed smaller heel clearance compared with the last three obstacles. Dual tasking negatively impacted obstacle clearance parameters when information processing demands were high. Longer and less consistent time to completion was observed in Session 1 compared with Sessions 2-5. Finally, improvements in simple reaction time were displayed after Session 2, but choice reaction time gradually improved and did not reach a plateau after repeated testing.
Lippmann, M.
1964-04-01
A cascade particle impactor capable of collecting particles and distributing them according to size is described. In addition, the device can collect a series of different samples on a pair of slides, so that less time is required for changing slides. Other features of the device are its compactness and ruggedness, making it useful under field conditions. Essentially, the unit consists of a main body with a series of transverse jets discharging onto a pair of parallel, spaced glass plates. The plates can be moved incrementally in steps to obtain the multiple samples. (AEC)
NASA Astrophysics Data System (ADS)
Lacaze, Roselyne; Smets, Bruno; Calvet, Jean-Christophe; Camacho, Fernando; Swinnen, Else; Verger, Aleixandre
2017-04-01
The Global component of the Copernicus Land Monitoring Service (CGLS) provides continuously a set of bio-geophysical variables describing the dynamics of vegetation, the energy budget at the continental surface, the water cycle and the cryosphere. Products are generated on a reliable and automatic basis from Earth Observation satellite data, at a frequency ranging from one hour to 10 days. They are accessible free of charge through the CGLS website (http://land.copernicus.eu/global/), associated with documentation describing the physical methodologies, the technical properties of products, and the quality of variables based on the results of validation exercises. The portfolio of the CGLS contains some Essential Climate Variables (ECV) like the Leaf Area Index (LAI), the Fraction of PAR absorbed by the vegetation (FAPAR), the surface albedo, and additional vegetation indices. These products were derived from SPOT/VEGETATION sensor data till December 2013, are currently derived from PROBA-V sensor data, and will be derived in the future from Sentinel-3 data. This talk will show how challenging the transition between sensors is when ensuring the sustainability of the production while keeping the consistency of the time series. We will discuss the various sources of differences from input data, the impact of these differences on the biophysical variables and, in turn, on some end users' applications, such as those based upon anomalies or assimilation of time series. We will present the mitigation measures taken to reduce this impact as much as possible. We will conclude with the lessons learnt and how this experience will be exploited to manage the transition towards Sentinel-3.
NASA Technical Reports Server (NTRS)
Douglass, A. R.; Schoeberl, M. R.; Kawa, S. R.; Browell, E. V.
2000-01-01
The processes which contribute to the ozone evolution in the high latitude northern lower stratosphere are evaluated using a three dimensional model simulation and ozone observations. The model uses winds and temperatures from the Goddard Earth Observing System Data Assimilation System. The simulation results are compared with ozone observations from three platforms: the differential absorption lidar (DIAL) which was flown on the NASA DC-8 as part of the Vortex Ozone Transport Experiment; the Microwave Limb Sounder (MLS); the Polar Ozone and Aerosol Measurement (POAM II) solar occultation instrument. Time series for the different data sets are consistent with each other, and diverge from model time series during December and January. The model ozone in December and January is shown to be much less sensitive to the model photochemistry than to the model vertical transport, which depends on the model vertical motion as well as the model vertical gradient. We evaluate the dependence of model ozone evolution on the model ozone gradient by comparing simulations with different initial conditions for ozone. The modeled ozone throughout December and January most closely resembles observed ozone when the vertical profiles between 12 and 20 km within the polar vortex closely match December DIAL observations. We make a quantitative estimate of the uncertainty in the vertical advection using diabatic trajectory calculations. The net transport uncertainty is significant, and should be accounted for when comparing observations with model ozone. The observed and modeled ozone time series during December and January are consistent when these transport uncertainties are taken into account.
Statistical inference methods for sparse biological time series data.
Ndukum, Juliet; Fonseca, Luís L; Santos, Helena; Voit, Eberhard O; Datta, Susmita
2011-04-25
Comparing metabolic profiles under different biological perturbations has become a powerful approach to investigating the functioning of cells. The profiles can be taken as single snapshots of a system, but more information is gained if they are measured longitudinally over time. The results are short time series consisting of relatively sparse data that cannot be analyzed effectively with standard time series techniques, such as autocorrelation and frequency domain methods. In this work, we study longitudinal time series profiles of glucose consumption in the yeast Saccharomyces cerevisiae under different temperatures and preconditioning regimens, which we obtained with methods of in vivo nuclear magnetic resonance (NMR) spectroscopy. For the statistical analysis we first fit several nonlinear mixed effect regression models to the longitudinal profiles and then used an ANOVA likelihood ratio method in order to test for significant differences between the profiles. The proposed methods are capable of distinguishing metabolic time trends resulting from different treatments and associate significance levels to these differences. Among several nonlinear mixed-effects regression models tested, a three-parameter logistic function represents the data with highest accuracy. ANOVA and likelihood ratio tests suggest that there are significant differences between the glucose consumption rate profiles for cells that had been--or had not been--preconditioned by heat during growth. Furthermore, pair-wise t-tests reveal significant differences in the longitudinal profiles for glucose consumption rates between optimal conditions and heat stress, optimal and recovery conditions, and heat stress and recovery conditions (p-values <0.0001). We have developed a nonlinear mixed effects model that is appropriate for the analysis of sparse metabolic and physiological time profiles. The model permits sound statistical inference procedures, based on ANOVA likelihood ratio tests, for testing the significance of differences between short time course data under different biological perturbations.
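To make the comparison strategy described above more concrete, the following is a minimal sketch (not the authors' code): it fits a three-parameter logistic curve to two short, sparse profiles and compares a "separate parameters per condition" model against a "shared parameters" model with a likelihood-ratio test, assuming Gaussian errors. All data values and parameter names are synthetic placeholders.

```python
# Minimal sketch (not the authors' code): fit a three-parameter logistic to
# sparse glucose-consumption profiles and compare two conditions with a
# likelihood-ratio test. Data below are synthetic placeholders.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import chi2

def logistic3(t, a, b, c):
    # a: asymptote, b: steepness, c: inflection time
    return a / (1.0 + np.exp(-b * (t - c)))

t = np.array([0, 5, 10, 15, 20, 25, 30], dtype=float)      # minutes (synthetic)
y_ctrl = np.array([0.2, 0.9, 2.8, 6.5, 8.8, 9.6, 9.9])      # optimal conditions
y_heat = np.array([0.1, 0.4, 1.2, 3.0, 5.5, 7.0, 7.6])      # heat stress

def neg2_loglik(y, yhat):
    # -2*log-likelihood up to an additive constant, assuming Gaussian errors
    n, rss = len(y), np.sum((y - yhat) ** 2)
    return n * np.log(rss / n)

# Full model: separate logistic parameters per condition
p_ctrl, _ = curve_fit(logistic3, t, y_ctrl, p0=[10, 0.3, 12])
p_heat, _ = curve_fit(logistic3, t, y_heat, p0=[10, 0.3, 12])
ll_full = neg2_loglik(y_ctrl, logistic3(t, *p_ctrl)) + neg2_loglik(y_heat, logistic3(t, *p_heat))

# Reduced model: one common set of parameters for both conditions
t_all, y_all = np.concatenate([t, t]), np.concatenate([y_ctrl, y_heat])
p_all, _ = curve_fit(logistic3, t_all, y_all, p0=[10, 0.3, 12])
ll_red = neg2_loglik(y_all, logistic3(t_all, *p_all))

lr = ll_red - ll_full              # likelihood-ratio statistic
p_value = chi2.sf(lr, df=3)        # three extra logistic parameters (variance terms ignored in this sketch)
print(f"LR = {lr:.2f}, p = {p_value:.4f}")
```

The mixed-effects structure and ANOVA machinery of the paper are not reproduced here; the sketch only illustrates the nested-model comparison that underlies the reported significance tests.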
NASA Astrophysics Data System (ADS)
Lhermitte, S.; Tips, M.; Verbesselt, J.; Jonckheere, I.; Van Aardt, J.; Coppin, Pol
2005-10-01
Large-scale wild fires have direct impacts on natural ecosystems and play a major role in vegetation ecology and the carbon budget. Accurate methods for describing post-fire development of vegetation are therefore essential for the understanding and monitoring of terrestrial ecosystems. Time series analysis of satellite imagery offers the potential to quantify these parameters with spatial and temporal accuracy. Current research focuses on the potential of time series analysis of SPOT Vegetation S10 data (1999-2001) to quantify the vegetation recovery of large-scale burns detected in the framework of GBA2000. The objective of this study was to provide quantitative estimates of the spatio-temporal variation of vegetation recovery based on remote sensing indicators. Southern Africa was used as a pilot study area, given the availability of ground and satellite data. An automated technique was developed to extract consistent indicators of vegetation recovery from the SPOT-VGT time series. Reference areas were used to quantify the vegetation regrowth by means of Regeneration Indices (RI). Two kinds of recovery indicators (time- and value-based) were tested for RIs of NDVI, SR, SAVI, NDWI, and pure band information. The effects of vegetation structure and temporal fire regime features on the recovery indicators were subsequently analyzed. Statistical analyses were conducted to assess whether the recovery indicators differed between vegetation types and depended on the timing of the burning season. Results highlighted the importance of appropriate reference areas and of correct normalization of the SPOT-VGT data.
Zhao, Jiang Yan; Xie, Ping; Sang, Yan Fang; Xui, Qiang Qiang; Wu, Zi Yi
2018-04-01
Under the influence of both global climate change and frequent human activities, the variability of the second moment in hydrological time series has become obvious, indicating changes in the consistency of hydrological data samples. Therefore, the traditional hydrological series analysis methods, which only consider the variability of mean values, are not suitable for handling all hydrological non-consistency problems. Traditional synthetic duration curve methods for the design of the lowest navigable water level, which are based on the consistency of samples, would add risk to navigation, especially under low water levels in dry seasons. Here, we detected both mean variation and variance variation using the hydrological variation diagnosis system. Furthermore, combining the principle of decomposition and composition of time series, we proposed a synthetic duration curve method for designing the lowest navigable water level with inconsistent characteristics in dry seasons. With the Yunjinghong Station in the Lancang River Basin as an example, we analyzed its designed water levels for the present, the distant past and the recent past, as well as the differences among three situations (i.e., considering second-moment variation, only considering mean variation, and not considering any variation). Results showed that the variability of the second moment changed the trend of designed water level alteration at the Yunjinghong Station. Between considering the first two moments and considering only the mean variation, the difference in designed water levels was as large as -1.11 m. Between considering the first two moments and considering no variation, the difference in designed water levels was as large as -1.01 m. Our results indicate the strong effects of variance variation on the designed water levels, and highlight the importance of second-moment variation analysis for channel planning and design.
Understanding Human Motion Skill with Peak Timing Synergy
NASA Astrophysics Data System (ADS)
Ueno, Ken; Furukawa, Koichi
The careful observation of motion phenomena is important in understanding skillful human motion. However, this is a difficult task due to the complexities in timing involved in the skillful control of anatomical structures. To investigate the dexterity of human motion, we decided to concentrate on timing with respect to motion, and we have proposed a method to extract the peak timing synergy from multivariate motion data. The peak timing synergy is defined as a frequent ordered graph with time stamps, whose nodes consist of turning points in motion waveforms. A proposed algorithm, PRESTO, automatically extracts the peak timing synergy. PRESTO comprises the following three processes: (1) detecting peak sequences with polygonal approximation; (2) generating peak-event sequences; and (3) finding frequent peak-event sequences using a sequential pattern mining method, generalized sequential patterns (GSP). Here, we measured right arm motion during the task of cello bowing and prepared a data set of right shoulder and arm motion. We successfully extracted the peak timing synergy from the cello bowing data set using the PRESTO algorithm, and it consisted of skills common among cellists as well as personal skill differences. To evaluate the sequential pattern mining algorithm GSP within PRESTO, we compared the peak timing synergy obtained using the GSP algorithm with that obtained using a filtering-by-reciprocal-voting (FRV) algorithm as a non-time-series method. We found that the support is 95-100% for GSP versus 83-96% for FRV, and that the GSP results reproduce human motion better than those of FRV. We therefore show that the sequential pattern mining approach is more effective for extracting the peak timing synergy than a non-time-series analysis approach.
A multiple-fan active control wind tunnel for outdoor wind speed and direction simulation
NASA Astrophysics Data System (ADS)
Wang, Jia-Ying; Meng, Qing-Hao; Luo, Bing; Zeng, Ming
2018-03-01
This article presents a new type of actively controlled multiple-fan wind tunnel. The wind tunnel consists of swivel plates and arrays of direct current fans, and the rotation speed of each fan and the shaft angle of each swivel plate can be controlled independently to simulate different kinds of outdoor wind fields. To measure the similarity between the simulated wind field and the outdoor wind field, wind speed and direction time series of the two kinds of wind fields are recorded by nine two-dimensional ultrasonic anemometers, and statistical properties of the wind signals at different time scales are then analyzed based on empirical mode decomposition. In addition, the complexity of the wind speed and direction time series is investigated using multiscale entropy and multivariate multiscale entropy. Results suggest that the simulated wind field in the multiple-fan wind tunnel has a high degree of similarity to the outdoor wind field.
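For readers unfamiliar with the complexity measure mentioned above, the following is a compact sketch of one common formulation of multiscale entropy (coarse-graining followed by sample entropy). It is not the authors' implementation, and the "wind speed" series is a synthetic placeholder.

```python
# Sketch of multiscale entropy: coarse-grain the series at several scales and
# compute sample entropy at each scale. One common formulation; not the
# authors' implementation. The wind data below are synthetic.
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    def count_matches(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        count = 0
        for i in range(len(templ)):
            d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)  # Chebyshev distance
            count += np.sum(d <= tol)
        return count
    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, scales=range(1, 6)):
    out = []
    for s in scales:
        n = len(x) // s
        coarse = np.asarray(x[:n * s]).reshape(n, s).mean(axis=1)  # coarse-graining
        out.append(sample_entropy(coarse))
    return out

rng = np.random.default_rng(0)
t = np.arange(3000)
wind_speed = 5 + 0.8 * np.sin(2 * np.pi * t / 200) + rng.normal(0, 0.3, t.size)
print(multiscale_entropy(wind_speed))
```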
Using exogenous variables in testing for monotonic trends in hydrologic time series
Alley, William M.
1988-01-01
One approach that has been used in performing a nonparametric test for monotonic trend in a hydrologic time series consists of a two-stage analysis. First, a regression equation is estimated for the variable being tested as a function of an exogenous variable. A nonparametric trend test such as the Kendall test is then performed on the residuals from the equation. By analogy to stagewise regression and through Monte Carlo experiments, it is demonstrated that this approach will tend to underestimate the magnitude of the trend and to result in some loss in power as a result of ignoring the interaction between the exogenous variable and time. An alternative approach, referred to as the adjusted variable Kendall test, is demonstrated to generally have increased statistical power and to provide more reliable estimates of the trend slope. In addition, the utility of including an exogenous variable in a trend test is examined under selected conditions.
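A minimal sketch of the two-stage residual approach discussed above follows: regress the hydrologic variable on an exogenous covariate, then apply Kendall's tau against time to the residuals. The adjusted-variable Kendall test proposed in the paper differs from this baseline and is not reproduced; the data and variable names are synthetic.

```python
# Sketch of the two-stage procedure: regress flow on an exogenous variable
# (precipitation), then run a Kendall trend test on the residuals against
# time. This is the baseline approach discussed above, not Alley's
# adjusted-variable Kendall test; all data are synthetic.
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(1)
years = np.arange(1950, 1990)
precip = rng.normal(800, 100, size=years.size)                       # exogenous variable (mm)
flow = 0.5 * precip + 0.8 * (years - years[0]) + rng.normal(0, 20, years.size)

# Stage 1: remove the part of flow explained by precipitation
slope, intercept = np.polyfit(precip, flow, 1)
residuals = flow - (slope * precip + intercept)

# Stage 2: Kendall test of the residuals against time
tau, p_value = kendalltau(years, residuals)
print(f"tau = {tau:.3f}, p = {p_value:.4f}")
```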
Fractal structure of the interplanetary magnetic field
NASA Technical Reports Server (NTRS)
Burlaga, L. F.; Klein, L. W.
1985-01-01
Under some conditions, time series of the interplanetary magnetic field strength and components have the properties of fractal curves. Magnetic field measurements made near 8.5 AU by Voyager 2 from June 5 to August 24, 1981 were self-similar over time scales from approximately 20 s to approximately 3 x 10^5 s, and the fractal dimension of the time series of the strength and components of the magnetic field was D = 5/3, corresponding to a power spectrum P(f) ~ f^(-5/3). Since the Kolmogorov spectrum for homogeneous, isotropic, stationary turbulence is also f^(-5/3), the Voyager 2 measurements are consistent with the observation of an inertial range of turbulence extending over approximately four decades in frequency. Interaction regions probably contributed most of the power in this interval. As an example, one interaction region is discussed in which the magnetic field had a fractal dimension D = 5/3.
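The kind of spectral-slope check behind a P(f) ~ f^(-5/3) statement can be illustrated with a short sketch: estimate the power spectral density with Welch's method and fit a line in log-log space over a chosen frequency band. The series below is a crude synthetic surrogate, not Voyager data.

```python
# Sketch: estimate the power-spectrum slope of a time series with Welch's
# method and a log-log fit. A slope near -5/3 would indicate Kolmogorov-like
# scaling; the input here is a synthetic red-noise surrogate, not Voyager data.
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(2)
b_field = np.cumsum(rng.normal(size=2 ** 16))        # integrated white noise (surrogate)

freqs, psd = welch(b_field, fs=1.0, nperseg=4096)
mask = (freqs > 1e-3) & (freqs < 1e-1)               # restrict to an "inertial range"
slope, _ = np.polyfit(np.log(freqs[mask]), np.log(psd[mask]), 1)
print(f"estimated spectral slope: {slope:.2f}")
```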
NASA Astrophysics Data System (ADS)
Heinemeier, Jan; Jungner, Högne; Lindroos, Alf; Ringbom, Åsa; von Konow, Thorborg; Rud, Niels
1997-03-01
A method for refining lime mortar samples for 14C dating has been developed. It includes mechanical and chemical separation of mortar carbonate with optical control of the purity of the samples. The method has been applied to a large series of AMS datings on lime mortar from three medieval churches on the Åland Islands, Finland. The datings show convincing internal consistency and confine the construction time of the churches to AD 1280-1380 with a most probable date just before AD 1300. We have also applied the method to the controversial Newport Tower, Rhode Island, USA. Our mortar datings confine the building to colonial time in the 17th century and thus refute claims of Viking origin of the tower. For the churches, a parallel series of datings of organic (charcoal) inclusions in the mortar show less reliable results than the mortar samples, which is ascribed to poor association with the construction time.
ERIC Educational Resources Information Center
Kvetan, Vladimir, Ed.
2014-01-01
Reliable and consistent time series are essential to any kind of economic forecasting. Skills forecasting needs to combine data from national accounts and labour force surveys, with the pan-European dimension of Cedefop's skills supply and demand forecasts, relying on different international classification standards. Sectoral classification (NACE)…
Alternative method to validate the seasonal land cover regions of the conterminous United States
Zhiliang Zhu; Donald O. Ohlen; Raymond L. Czaplewski; Robert E. Burgan
1996-01-01
An accuracy assessment method involving double sampling and the multivariate composite estimator has been used to validate the prototype seasonal land cover characteristics database of the conterminous United States. The database consists of 159 land cover classes, classified using time series of 1990 1-km satellite data and augmented with ancillary data including...
New Ways in Teaching Young Children. New Ways in TESOL Series II. Innovative Classroom Techniques.
ERIC Educational Resources Information Center
Schinke-Llano, Linda, Ed.; Rauff, Rebecca, Ed.
The collection of class activities for teaching English as a second language (ESL) to young children consists of ideas contributed by classroom teachers. The book is divided into 14 sections: (1) social interaction, including activities ranging from first-time classroom encounters to learning about and working with special-needs children; (2)…
Time Series Evaluation of Race Relations Improvement.
1985-07-01
Perceiving sexism is not equivalent to perceiving racism. Throughout the project, our consistent finding has been that white women perceive racial... Keywords: perceived racism, upward mobility, personnel committees, intergroup theory, dialectical conflict, team development, archival information. ...eleven-year period to determine whether the race relations improvement program was associated with changes in perceived racism, mobility patterns, and
EDUCATIONAL TELEVISION IN THE SMALL SCHOOL.
ERIC Educational Resources Information Center
LEDFORD, LOWELL E.
Hensley Elementary School, consisting of 72 students and 3 teachers, has incorporated 12 educational television programs as a regular part of the curriculum in the first 6 grades. Grades 1 and 2 viewed programs in science, speech, art, music, and story time. Grades 3 and 4 viewed series in music, science, art, and speech, while grades 5 and 6 were…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-11
... time period before suddenly reversing to prices consistent with their pre-decline levels. ... implemented through a series of rule filings by the equity exchanges and by FINRA. The single-stock circuit breaker was designed to reduce extraordinary market volatility in NMS stocks by imposing a five-minute...
Depletions in winter total ozone values over southern England
NASA Technical Reports Server (NTRS)
Lapworth, A.
1994-01-01
A study has been made of the recently re-evaluated time series of daily total ozone values for the period 1979 to 1992 for southern England. The series consists of measurements made at two stations, Bracknell and Camborne. The series shows a steady decline in springtime ozone values over the period, and this is consistent with data from an earlier decade that have been published but not re-evaluated. Of exceptional note is the monthly mean for January 1992, which was very significantly reduced from the normal value and was the lowest so far measured for this month. This winter was also noteworthy for a prolonged period during which a blocking anticyclone dominated the region, raising the possibility that this was related to the ozone anomaly. It was possible to determine whether the origin of the low ozone value lay in ascending stratospheric motions. A linear regression analysis of ozone deviations against 100 hPa temperature deviations was used to reduce ozone values to those expected in the absence of high pressure, on the assumption that the normal regression relation was not affected by atmospheric anomalies during the winter. This showed that vertical motions in the stratosphere accounted for only part of the ozone anomaly and that the main cause of the ozone deficit lay either in a reduced stratospheric circulation, to which the anticyclone may be related, or in chemical effects in the reduced stratospheric temperatures above the high-pressure area. A study of the ozone time series adjusted to remove variations correlated with meteorological quantities showed that, during the period since 1979, one other winter, that of 1982/3, showed a similar although less well defined deficit in total ozone values.
Pan, Yuanjin; Shen, Wen-Bin; Ding, Hao; Hwang, Cheinway; Li, Jin; Zhang, Tengxu
2015-10-14
Modeling nonlinear vertical components of a GPS time series is critical to separating sources contributing to mass displacements. Improved vertical precision in GPS positioning at stations for velocity fields is key to resolving the mechanism of certain geophysical phenomena. In this paper, we use ensemble empirical mode decomposition (EEMD) to analyze the daily GPS time series at 89 continuous GPS stations, spanning from 2002 to 2013. EEMD decomposes a GPS time series into different intrinsic mode functions (IMFs), which are used to identify different kinds of signals and secular terms. Our study suggests that the GPS records contain not only the well-known signals (such as semi-annual and annual signals) but also the seldom-noted quasi-biennial oscillations (QBS). The quasi-biennial signals are explained by modeled loadings of atmosphere, non-tidal and hydrology that deform the surface around the GPS stations. In addition, the loadings derived from GRACE gravity changes are also consistent with the quasi-biennial deformations derived from the GPS observations. By removing the modeled components, the weighted root-mean-square (WRMS) variation of the GPS time series is reduced by 7.1% to 42.3%, and especially, after removing the seasonal and QBO signals, the average improvement percentages for seasonal and QBO signals are 25.6% and 7.5%, respectively, suggesting that it is significant to consider the QBS signals in the GPS records to improve the observed vertical deformations.
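A small sketch of the decomposition step described above follows. It assumes the third-party PyEMD package (installable as EMD-signal) for ensemble empirical mode decomposition, decomposes a synthetic daily height series into IMFs, and reports a rough dominant period for each IMF; it is not the authors' processing chain.

```python
# Sketch of EEMD decomposition of a daily GPS-like height series into IMFs,
# assuming the third-party PyEMD package (pip install EMD-signal). Not the
# authors' processing chain; the series below is synthetic.
import numpy as np
from PyEMD import EEMD

t = np.arange(0, 8 * 365)                              # ~8 years of daily epochs
annual = 3.0 * np.sin(2 * np.pi * t / 365.25)
qbo = 1.0 * np.sin(2 * np.pi * t / (2.3 * 365.25))     # quasi-biennial component
trend = 0.002 * t
rng = np.random.default_rng(3)
height = trend + annual + qbo + rng.normal(0, 1.0, t.size)   # mm, synthetic

eemd = EEMD(trials=20)
imfs = eemd.eemd(height, t.astype(float))              # rows: IMFs plus residual trend

for k, imf in enumerate(imfs):
    # rough dominant period of each IMF, estimated from zero crossings
    crossings = np.sum(np.diff(np.sign(imf)) != 0)
    period = 2 * len(imf) / max(crossings, 1)
    print(f"IMF {k}: approx. period {period / 365.25:.2f} yr")
```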
NASA Astrophysics Data System (ADS)
Lynne, Bridget Y.; Heasler, Henry; Jaworowski, Cheryl; Smith, Gary J.; Smith, Isaac J.; Foley, Duncan
2018-04-01
In April 2015, Ground Penetrating Radar (GPR) was used to characterize the shallow subsurface (< 5 m depth) of the western sinter slope immediately adjacent to Old Faithful Geyser and near the north side of an inferred geyser cavity. A series of time-sequence images were collected between two eruptions of Old Faithful Geyser. Each set of time-sequence GPR recordings consisted of four transects aligned to provide coverage near the potential location of the inferred 15 m deep geyser chamber. However, the deepest penetration we could achieve with a 200 MHz GPR antennae was 5 m. Seven time-sequence events were collected over a 48-minute interval to image changes in the near-surface, during pre- and post-eruptive cycles. Time-sequence GPR images revealed a series of possible micro-fractures in a highly porous siliceous sinter in the near-surface that fill and drain repetitively, immediately after an eruption and during the recharge period prior to the next main eruptive event.
Weber, Juliane; Zachow, Christopher; Witthaut, Dirk
2018-03-01
Wind power generation exhibits a strong temporal variability, which is crucial for system integration in highly renewable power systems. Different methods exist to simulate wind power generation but they often cannot represent the crucial temporal fluctuations properly. We apply the concept of additive binary Markov chains to model a wind generation time series consisting of two states: periods of high and low wind generation. The only input parameter for this model is the empirical autocorrelation function. The two-state model is readily extended to stochastically reproduce the actual generation per period. To evaluate the additive binary Markov chain method, we introduce a coarse model of the electric power system to derive backup and storage needs. We find that the temporal correlations of wind power generation, the backup need as a function of the storage capacity, and the resting time distribution of high and low wind events for different shares of wind generation can be reconstructed.
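To give a feel for the two-state idea, the following sketch simulates a binary high/low wind-generation sequence with a first-order Markov chain whose transition probabilities are estimated from an observed binary series, and then measures run (resting-time) lengths. The paper's additive binary Markov chain additionally matches the full empirical autocorrelation function; that extension is not reproduced, and the input series is a synthetic placeholder.

```python
# Sketch of the two-state wind-generation model: estimate first-order
# transition probabilities from a binary (high/low) series, simulate a new
# sequence, and compute resting-time (run-length) statistics. Synthetic data;
# the additive (long-memory) extension from the paper is not reproduced.
import numpy as np

rng = np.random.default_rng(4)
observed = (rng.random(5000) < 0.4).astype(int)      # placeholder binary series

# Estimate transition probabilities P(next = 1 | current = s)
p = np.zeros(2)
for s in (0, 1):
    idx = np.where(observed[:-1] == s)[0]
    p[s] = observed[idx + 1].mean()

# Simulate a new binary generation sequence
n = 5000
sim = np.empty(n, dtype=int)
sim[0] = observed[0]
for i in range(1, n):
    sim[i] = int(rng.random() < p[sim[i - 1]])

def resting_times(x, state):
    # lengths of consecutive runs spent in the given state
    runs, count = [], 0
    for v in x:
        if v == state:
            count += 1
        elif count:
            runs.append(count); count = 0
    if count:
        runs.append(count)
    return np.array(runs)

print("mean low-wind run length:", resting_times(sim, 0).mean())
print("mean high-wind run length:", resting_times(sim, 1).mean())
```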
A lengthy look at the daily grind: time series analysis of events, mood, stress, and satisfaction.
Fuller, Julie A; Stanton, Jeffrey M; Fisher, Gwenith G; Spitzmuller, Christiane; Russell, Steven S; Smith, Patricia C
2003-12-01
The present study investigated the processes by which job stress and satisfaction unfold over time by examining the relations between daily stressful events, mood, and these variables. Using a Web-based daily survey of stressor events, perceived strain, mood, and job satisfaction completed by 14 university workers, 1,060 occasions of data were collected. Transfer function analysis, a multivariate version of time series analysis, was used to examine the data for relationships among the measured variables after factoring out the contaminating influences of serial dependency. Results revealed a contrast effect in which a stressful event was associated with higher strain on the same day and with lower strain on the following day. Perceived strain increased over the course of a semester for a majority of participants, suggesting that the effects of stress build over time. Finally, the data were consistent with the notion that job satisfaction is a distal outcome that is mediated by perceived strain. ((c) 2003 APA, all rights reserved)
Phase correlation of foreign exchange time series
NASA Astrophysics Data System (ADS)
Wu, Ming-Chya
2007-03-01
Correlation of foreign exchange rates in currency markets is investigated based on empirical USD/DEM and USD/JPY exchange-rate data for the period from February 1, 1986 to December 31, 1996. The return time series of the exchange rates are first decomposed into a number of intrinsic mode functions (IMFs) by the empirical mode decomposition method. The instantaneous phases of the resultant IMFs, calculated by the Hilbert transform, are then used to characterize the behaviors of pricing transmissions, and the correlation is probed by measuring the phase differences between two IMFs of the same order. From the distribution of phase differences, our results show explicitly that the correlations are stronger on the daily time scale than on longer time scales. A comparison of the periods 1986-1989 and 1990-1993 indicates that the two exchange rates were more correlated in the former period than in the latter. This result is consistent with the observations from the cross-correlation calculation.
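The phase-difference idea can be sketched as follows: take the instantaneous phase of two band-limited series via the Hilbert transform and inspect the distribution of wrapped phase differences. The EMD step is omitted here, and the two series are synthetic stand-ins for same-order IMFs rather than the actual exchange-rate data.

```python
# Sketch: instantaneous phases via the Hilbert transform and the distribution
# of phase differences between two band-limited series, as used above to probe
# pricing-transmission correlation. The EMD step is omitted; the two series
# are synthetic stand-ins for same-order IMFs.
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(5)
t = np.arange(2000)
imf_dem = np.sin(2 * np.pi * t / 20.0) + 0.3 * rng.normal(size=t.size)
imf_jpy = np.sin(2 * np.pi * t / 20.0 + 0.4) + 0.3 * rng.normal(size=t.size)

phase_dem = np.angle(hilbert(imf_dem))
phase_jpy = np.angle(hilbert(imf_jpy))
dphi = np.angle(np.exp(1j * (phase_dem - phase_jpy)))    # wrap to (-pi, pi]

hist, edges = np.histogram(dphi, bins=36, range=(-np.pi, np.pi), density=True)
peak = edges[np.argmax(hist)]
print(f"phase-difference distribution peaks near {peak:.2f} rad "
      f"(a narrow, peaked distribution indicates stronger correlation)")
```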
On the Impact of a Quadratic Acceleration Term in the Analysis of Position Time Series
NASA Astrophysics Data System (ADS)
Bogusz, Janusz; Klos, Anna; Bos, Machiel Simon; Hunegnaw, Addisu; Teferle, Felix Norman
2016-04-01
The analysis of Global Navigation Satellite System (GNSS) position time series generally assumes that each of the coordinate component series is described by the sum of a linear rate (velocity) and various periodic terms. The residuals, the deviations between the fitted model and the observations, are then a measure of the epoch-to-epoch scatter and have been used for the analysis of the stochastic character (noise) of the time series. Often the parameters of interest in GNSS position time series are the velocities and their associated uncertainties, which have to be determined with the highest reliability. It is clear that not all GNSS position time series follow this simple linear behaviour. Therefore, we have added an acceleration term in the form of a quadratic polynomial function to the model in order to better describe the non-linear motion in the position time series. This non-linear motion could be a response to purely geophysical processes, for example, elastic rebound of the Earth's crust due to ice mass loss in Greenland, artefacts due to deficiencies in bias mitigation models, for example, of the GNSS satellite and receiver antenna phase centres, or any combination thereof. In this study we simulated 20 time series, each 23 years long, with different stochastic characteristics such as white, flicker or random walk noise. The noise amplitude was assumed at 1 mm/y-/4. We then added a deterministic part consisting of a linear trend of 20 mm/y (representing the average horizontal velocity) and accelerations ranging from -0.6 to +0.6 mm/y^2. For all these data we estimated the noise parameters with Maximum Likelihood Estimation (MLE) using the Hector software package without taking the non-linear term into account. In this way we set a benchmark against which to investigate how the noise properties and velocity uncertainty may be affected by an un-modelled non-linear term. The velocities and their uncertainties versus the accelerations for different types of noise were determined. Furthermore, we selected 40 globally distributed stations with clear non-linear behaviour from two different International GNSS Service (IGS) analysis centers: JPL (Jet Propulsion Laboratory) and BLT (British Isles continuous GNSS Facility and University of Luxembourg Tide Gauge Benchmark Monitoring (TIGA) Analysis Center). We obtained maximum accelerations of -1.8±1.2 mm/y^2 and -4.5±3.3 mm/y^2 for the horizontal and vertical components, respectively. The noise analysis tests show that the addition of the non-linear term significantly whitens the power spectra of the position time series, i.e. shifts the spectral index from flicker towards white noise.
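The deterministic part of such a model can be written compactly as intercept + velocity*t + 0.5*acceleration*t^2. The sketch below fits that quadratic model to a synthetic daily position series by ordinary least squares; the colored-noise MLE analysis done with Hector is not reproduced.

```python
# Sketch: least-squares fit of a linear-plus-acceleration model to a daily
# position series (the deterministic part discussed above). The colored-noise
# MLE analysis with Hector is not reproduced; data are synthetic.
import numpy as np

rng = np.random.default_rng(6)
t = np.arange(0, 23 * 365.25) / 365.25           # epochs in years
velocity, accel = 20.0, 0.3                       # mm/yr and mm/yr^2 (synthetic truth)
pos = velocity * t + 0.5 * accel * t ** 2 + rng.normal(0, 3.0, t.size)   # mm

# Design matrix: intercept, velocity term, acceleration term
A = np.column_stack([np.ones_like(t), t, 0.5 * t ** 2])
coef, *_ = np.linalg.lstsq(A, pos, rcond=None)
print(f"estimated velocity = {coef[1]:.2f} mm/yr, acceleration = {coef[2]:.2f} mm/yr^2")
```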
Feng, Zhujing; Schilling, Keith E; Chan, Kung-Sik
2013-06-01
Nitrate-nitrogen concentrations in rivers represent challenges for water supplies that use surface water sources. Nitrate concentrations are often modeled using time-series approaches, but previous efforts have typically relied on monthly time steps. In this study, we developed a dynamic regression model of daily nitrate concentrations in the Raccoon River, Iowa, that incorporated contemporaneous and lagged values of precipitation and discharge at several locations around the basin. Results suggested that 95% of the variation in daily nitrate concentrations measured at the outlet of a large agricultural watershed can be explained by time-series patterns of precipitation and discharge occurring in the basin. Discharge was found to be a more important regression variable than precipitation in our model, but both regression parameters were strongly correlated with nitrate concentrations. The time-series model was consistent with known patterns of nitrate behavior in the watershed, successfully identifying contemporaneous dilution mechanisms from higher-relief and urban areas of the basin while incorporating the delayed contribution of nitrate from tile-drained regions in a lagged response. The first difference of the model errors was modeled as an AR(16) process, suggesting that daily nitrate concentration changes remain temporally correlated for more than 2 weeks, although the temporal correlation was stronger in the first few days before tapering off. Consequently, daily nitrate concentrations are non-stationary, i.e., they exhibit strong memory. Using time-series models to reliably forecast daily nitrate concentrations in a river based on patterns of precipitation and discharge occurring in its basin may be of great interest to water suppliers.
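The general structure of such a dynamic regression, a regression on contemporaneous and lagged covariates with autoregressive errors on first differences, can be approximated with statsmodels' SARIMAX, as in the sketch below. This is not the authors' fitted model, and the discharge and nitrate series are synthetic placeholders.

```python
# Sketch: dynamic regression of daily nitrate on contemporaneous and lagged
# discharge with AR errors on first differences, using statsmodels' SARIMAX.
# This only approximates the structure described above (not the authors'
# fitted model); all data are synthetic placeholders.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(7)
n = 1000
discharge = np.abs(np.cumsum(rng.normal(0, 1, n))) + 10                  # m^3/s, synthetic
nitrate = 5 + 0.05 * discharge + 0.03 * np.roll(discharge, 3) + rng.normal(0, 0.5, n)

exog = pd.DataFrame({
    "q": discharge,
    "q_lag3": np.roll(discharge, 3),   # a lagged covariate (wrap-around is fine for a sketch)
})
model = SARIMAX(nitrate, exog=exog, order=(16, 1, 0))   # AR(16) on first-differenced errors
res = model.fit(disp=False)
print(res.params[:4])
```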
Discovering significant evolution patterns from satellite image time series.
Petitjean, François; Masseglia, Florent; Gançarski, Pierre; Forestier, Germain
2011-12-01
Satellite Image Time Series (SITS) provide us with precious information on land cover evolution. By studying these series of images we can both understand the changes of specific areas and discover global phenomena that spread over larger areas. Changes that occur throughout the sensing time can spread over very long periods and may have different start and end times depending on the location, which complicates the mining and the analysis of series of images. This work focuses on frequent sequential pattern mining (FSPM) methods, since this family of methods fits the above-mentioned issues. It consists of finding the most frequent evolution behaviors, and is able to extract long-term changes as well as short-term ones, whenever the change may start and end. However, applying FSPM methods to SITS means confronting two main challenges related to the characteristics of SITS and the domain's constraints. First, satellite images associate multiple measures with a single pixel (the radiometric levels of different wavelengths corresponding to infra-red, red, etc.), which makes the search space multi-dimensional and thus requires specific mining algorithms. Furthermore, the non-evolving regions, which are the vast majority and overwhelm the evolving ones, challenge the discovery of these patterns. We propose a SITS mining framework that enables discovery of these patterns despite these constraints and characteristics. Our proposal is inspired by FSPM and provides a relevant visualization principle. Experiments carried out on 35 images sensed over 20 years show that the proposed approach makes it possible to extract relevant evolution behaviors.
Use of age-adjusted rates of suicide in time series studies in Israel.
Bridges, F Stephen; Tankersley, William B
2009-01-01
Durkheim's modified theory of suicide was examined to explore how consistent it was in predicting Israeli rates of suicide from 1965 to 1997 when using age-adjusted rates rather than crude ones. In this time-series study, Israeli male and female rates of suicide increased and decreased, respectively, between 1965 and 1997. Conforming to Durkheim's modified theory, the Israeli male rate of suicide was lower in years when rates of marriage and birth were higher and higher in years when rates of divorce were higher; the opposite held for Israeli women. The corrected regression coefficients suggest that the Israeli female rate of suicide remained lower in years when the rate of divorce was higher, again the opposite of what Durkheim's modified theory suggests. These results may indicate that divorce affects the mental health of Israeli women as suggested by their lower rate of suicide. Perhaps the "multiple roles held by Israeli females creates suicidogenic stress" and divorce provides some sense of stress relief, mentally speaking. The results were not as consistent with predictions from Durkheim's modified theory of suicide as were rates from the United States for the same period, nor were they consistent with rates based on "crude" suicide data. Thus, using age-adjusted rates of suicide had an influence on the prediction of the Israeli rate of suicide during this period.
Pouch, Alison M; Aly, Ahmed H; Lai, Eric K; Yushkevich, Natalie; Stoffers, Rutger H; Gorman, Joseph H; Cheung, Albert T; Gorman, Joseph H; Gorman, Robert C; Yushkevich, Paul A
2017-09-01
Transesophageal echocardiography is the primary imaging modality for preoperative assessment of mitral valves with ischemic mitral regurgitation (IMR). While there are well known echocardiographic insights into the 3D morphology of mitral valves with IMR, such as annular dilation and leaflet tethering, less is understood about how quantification of valve dynamics can inform surgical treatment of IMR or predict short-term recurrence of the disease. As a step towards filling this knowledge gap, we present a novel framework for 4D segmentation and geometric modeling of the mitral valve in real-time 3D echocardiography (rt-3DE). The framework integrates multi-atlas label fusion and template-based medial modeling to generate quantitatively descriptive models of valve dynamics. The novelty of this work is that temporal consistency in the rt-3DE segmentations is enforced during both the segmentation and modeling stages with the use of groupwise label fusion and Kalman filtering. The algorithm is evaluated on rt-3DE data series from 10 patients: five with normal mitral valve morphology and five with severe IMR. In these 10 data series that total 207 individual 3DE images, each 3DE segmentation is validated against manual tracing and temporal consistency between segmentations is demonstrated. The ultimate goal is to generate accurate and consistent representations of valve dynamics that can both visually and quantitatively provide insight into normal and pathological valve function.
Discovering monotonic stemness marker genes from time-series stem cell microarray data.
Wang, Hsei-Wei; Sun, Hsing-Jen; Chang, Ting-Yu; Lo, Hung-Hao; Cheng, Wei-Chung; Tseng, George C; Lin, Chin-Teng; Chang, Shing-Jyh; Pal, Nikhil; Chung, I-Fang
2015-01-01
Identification of genes with ascending or descending monotonic expression patterns over time or stages of stem cells is an important issue in time-series microarray data analysis. We propose a method named Monotonic Feature Selector (MFSelector) based on a concept of total discriminating error (DEtotal) to identify monotonic genes. MFSelector considers various time stages in stage order (i.e., Stage One vs. other stages, Stages One and Two vs. remaining stages and so on) and computes DEtotal of each gene. MFSelector can successfully identify genes with monotonic characteristics. We have demonstrated the effectiveness of MFSelector on two synthetic data sets and two stem cell differentiation data sets: embryonic stem cell neurogenesis (ESCN) and embryonic stem cell vasculogenesis (ESCV) data sets. We have also performed extensive quantitative comparisons of the three monotonic gene selection approaches. Some of the monotonic marker genes such as OCT4, NANOG, BLBP, discovered from the ESCN dataset exhibit consistent behavior with that reported in other studies. The role of monotonic genes found by MFSelector in either stemness or differentiation is validated using information obtained from Gene Ontology analysis and other literature. We justify and demonstrate that descending genes are involved in the proliferation or self-renewal activity of stem cells, while ascending genes are involved in differentiation of stem cells into variant cell lineages. We have developed a novel system, easy to use even with no pre-existing knowledge, to identify gene sets with monotonic expression patterns in multi-stage as well as in time-series genomics matrices. The case studies on ESCN and ESCV have helped to get a better understanding of stemness and differentiation. The novel monotonic marker genes discovered from a data set are found to exhibit consistent behavior in another independent data set, demonstrating the utility of the proposed method. The MFSelector R function and data sets can be downloaded from: http://microarray.ym.edu.tw/tools/MFSelector/.
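As an illustration of monotonic-marker screening in ordered stage data, the sketch below ranks genes by the Spearman correlation between expression and stage order. This is only a simplified stand-in for the idea: MFSelector's actual DEtotal criterion differs, and the expression matrix here is synthetic (the real R function is available at the URL given above).

```python
# Illustrative stand-in (not MFSelector's DE_total criterion): rank genes by
# Spearman correlation between expression and stage order, a simple way to
# screen for ascending/descending monotonic patterns. Synthetic matrix.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(8)
n_genes, n_stages, reps = 200, 5, 3
stage = np.repeat(np.arange(n_stages), reps)          # stage order per sample
expr = rng.normal(0, 1, (n_genes, stage.size))
expr[0] += 2.0 * stage                                # plant an ascending gene
expr[1] -= 2.0 * stage                                # plant a descending gene

scores = np.array([spearmanr(stage, expr[g])[0] for g in range(n_genes)])
order = np.argsort(-np.abs(scores))
print("top monotonic candidates (gene index, rho):",
      [(int(g), round(float(scores[g]), 2)) for g in order[:5]])
```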
van der Krieke, Lian; Emerencia, Ando C; Bos, Elisabeth H; Rosmalen, Judith Gm; Riese, Harriëtte; Aiello, Marco; Sytema, Sjoerd; de Jonge, Peter
2015-08-07
Health promotion can be tailored by combining ecological momentary assessments (EMA) with time series analysis. This combined method allows for studying the temporal order of dynamic relationships among variables, which may provide concrete indications for intervention. However, application of this method in health care practice is hampered because analyses are conducted manually and advanced statistical expertise is required. This study aims to show how this limitation can be overcome by introducing automated vector autoregressive modeling (VAR) of EMA data and to evaluate its feasibility through comparisons with results of previously published manual analyses. We developed a Web-based open source application, called AutoVAR, which automates time series analyses of EMA data and provides output that is intended to be interpretable by nonexperts. The statistical technique we used was VAR. AutoVAR tests and evaluates all possible VAR models within a given combinatorial search space and summarizes their results, thereby replacing the researcher's tasks of conducting the analysis, making an informed selection of models, and choosing the best model. We compared the output of AutoVAR to the output of a previously published manual analysis (n=4). An illustrative example consisting of 4 analyses was provided. Compared to the manual output, the AutoVAR output presents similar model characteristics and statistical results in terms of the Akaike information criterion, the Bayesian information criterion, and the test statistic of the Granger causality test. Results suggest that automated analysis and interpretation of time series is feasible. Compared to a manual procedure, the automated procedure is more robust and can save days of time. These findings may pave the way for using time series analysis for health promotion on a larger scale. AutoVAR was evaluated using the results of a previously conducted manual analysis. Analysis of additional datasets is needed in order to validate and refine the application for general use.
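The kind of analysis AutoVAR automates can be sketched with statsmodels: lag-order selection over a search space by information criteria, VAR fitting, and a Granger causality test. The sketch below is not the AutoVAR application itself, and the "mood"/"activity" EMA series are simulated placeholders.

```python
# Sketch of the analysis that AutoVAR automates, here done with statsmodels:
# lag-order selection (AIC/BIC), VAR fitting, and a Granger causality test.
# Not the AutoVAR application itself; the EMA data are simulated.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(9)
n = 90                                             # ~90 daily EMA measurements
activity = rng.normal(0, 1, n)
mood = np.zeros(n)
for t in range(1, n):
    mood[t] = 0.5 * mood[t - 1] + 0.4 * activity[t - 1] + rng.normal(0, 0.5)

data = pd.DataFrame({"mood": mood, "activity": activity})
model = VAR(data)
sel = model.select_order(maxlags=7)                # compare information criteria over lags
res = model.fit(sel.aic or 1)                      # fall back to 1 lag if AIC picks 0
gc = res.test_causality("mood", ["activity"], kind="f")
print(sel.summary())
print(gc.summary())
```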
Relating interesting quantitative time series patterns with text events and text features
NASA Astrophysics Data System (ADS)
Wanner, Franz; Schreck, Tobias; Jentner, Wolfgang; Sharalieva, Lyubka; Keim, Daniel A.
2013-12-01
In many application areas, the key to successful data analysis is the integrated analysis of heterogeneous data. One example is the financial domain, where time-dependent and highly frequent quantitative data (e.g., trading volume and price information) and textual data (e.g., economic and political news reports) need to be considered jointly. Data analysis tools need to support an integrated analysis, which allows studying the relationships between textual news documents and quantitative properties of the stock market price series. In this paper, we describe a workflow and tool that allows a flexible formation of hypotheses about text features and their combinations, which reflect quantitative phenomena observed in stock data. To support such an analysis, we combine the analysis steps of frequent quantitative and text-oriented data using an existing a-priori method. First, based on heuristics we extract interesting intervals and patterns in large time series data. The visual analysis supports the analyst in exploring parameter combinations and their results. The identified time series patterns are then input for the second analysis step, in which all identified intervals of interest are analyzed for frequent patterns co-occurring with financial news. An a-priori method supports the discovery of such sequential temporal patterns. Then, various text features like the degree of sentence nesting, noun phrase complexity, the vocabulary richness, etc. are extracted from the news to obtain meta patterns. Meta patterns are defined by a specific combination of text features which significantly differ from the text features of the remaining news data. Our approach combines a portfolio of visualization and analysis techniques, including time-, cluster- and sequence visualization and analysis functionality. We provide two case studies, showing the effectiveness of our combined quantitative and textual analysis work flow. The workflow can also be generalized to other application domains such as data analysis of smart grids, cyber physical systems or the security of critical infrastructure, where the data consists of a combination of quantitative and textual time series data.
Fractal analysis of GPS time series for early detection of disastrous seismic events
NASA Astrophysics Data System (ADS)
Filatov, Denis M.; Lyubushin, Alexey A.
2017-03-01
A new method of fractal analysis of time series for estimating the chaoticity of behaviour of open stochastic dynamical systems is developed. The method is a modification of the conventional detrended fluctuation analysis (DFA) technique. We start from analysing both methods from the physical point of view and demonstrate the difference between them which results in a higher accuracy of the new method compared to the conventional DFA. Then, applying the developed method to estimate the measure of chaoticity of a real dynamical system - the Earth's crust, we reveal that the latter exhibits two distinct mechanisms of transition to a critical state: while the first mechanism has already been known due to numerous studies of other dynamical systems, the second one is new and has not previously been described. Using GPS time series, we demonstrate efficiency of the developed method in identification of critical states of the Earth's crust. Finally we employ the method to solve a practically important task: we show how the developed measure of chaoticity can be used for early detection of disastrous seismic events and provide a detailed discussion of the numerical results, which are shown to be consistent with outcomes of other researches on the topic.
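For reference, the conventional DFA baseline that the new method modifies can be written compactly, as in the sketch below: integrate the mean-removed series, detrend it piecewise over windows of increasing size, and read the scaling exponent from the log-log slope of the fluctuation function. The modified estimator from the paper is not reproduced; the input is synthetic.

```python
# Compact sketch of conventional detrended fluctuation analysis (DFA), the
# baseline that the paper modifies; the modified estimator itself is not
# reproduced. Input below is synthetic.
import numpy as np

def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())              # integrated, mean-removed series
    flucts = []
    for s in scales:
        n_seg = len(profile) // s
        segs = profile[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        rms = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)           # local linear detrending
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        flucts.append(np.mean(rms))
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return alpha                                    # DFA scaling exponent

rng = np.random.default_rng(10)
print("white noise, alpha ~ 0.5:", round(dfa_exponent(rng.normal(size=4096)), 2))
```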
Modeling time-series data from microbial communities.
Ridenhour, Benjamin J; Brooker, Sarah L; Williams, Janet E; Van Leuven, James T; Miller, Aaron W; Dearing, M Denise; Remien, Christopher H
2017-11-01
As sequencing technologies have advanced, the amount of information regarding the composition of bacterial communities from various environments (for example, skin or soil) has grown exponentially. To date, most work has focused on cataloging taxa present in samples and determining whether the distribution of taxa shifts with exogenous covariates. However, important questions regarding how taxa interact with each other and their environment remain open thus preventing in-depth ecological understanding of microbiomes. Time-series data from 16S rDNA amplicon sequencing are becoming more common within microbial ecology, but methods to infer ecological interactions from these longitudinal data are limited. We address this gap by presenting a method of analysis using Poisson regression fit with an elastic-net penalty that (1) takes advantage of the fact that the data are time series; (2) constrains estimates to allow for the possibility of many more interactions than data; and (3) is scalable enough to handle data consisting of thousands of taxa. We test the method on gut microbiome data from white-throated woodrats (Neotoma albigula) that were fed varying amounts of the plant secondary compound oxalate over a period of 22 days to estimate interactions between OTUs and their environment.
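The core regression step described above can be sketched as a penalized Poisson regression of one taxon's counts at time t on all taxa at time t-1, here using statsmodels' GLM with an elastic-net penalty. The published method scales this to thousands of OTUs and includes additional structure; the count matrix below is synthetic.

```python
# Sketch of the core step: Poisson regression of one taxon's counts at time t
# on all taxa at t-1, with an elastic-net penalty, via statsmodels'
# GLM.fit_regularized. Synthetic counts; the published method adds more
# structure and scales to thousands of taxa.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
n_days, n_taxa = 22, 30
counts = rng.poisson(20, size=(n_days, n_taxa)).astype(float)

X = np.log1p(counts[:-1])             # predictors: all taxa at t-1 (log-transformed)
X = sm.add_constant(X)
y = counts[1:, 0]                     # response: focal taxon at t

fit = sm.GLM(y, X, family=sm.families.Poisson()).fit_regularized(
    alpha=0.1,    # overall penalty strength
    L1_wt=0.5,    # elastic-net mix: 0 = ridge, 1 = lasso
)
nonzero = np.flatnonzero(np.abs(fit.params[1:]) > 1e-8)
print("taxa with non-zero estimated influence on taxon 0:", nonzero)
```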
Long-Range Correlations and Memory in the Dynamics of Internet Interdomain Routing.
Kitsak, Maksim; Elmokashfi, Ahmed; Havlin, Shlomo; Krioukov, Dmitri
2015-01-01
Data transfer is one of the main functions of the Internet. The Internet consists of a large number of interconnected subnetworks or domains, known as Autonomous Systems (ASes). Due to privacy and other reasons, the information about which route to use to reach devices within other ASes is not readily available to any given AS. The Border Gateway Protocol (BGP) is responsible for discovering and distributing this reachability information to all ASes. Since the topology of the Internet is highly dynamic, all ASes constantly exchange and update this reachability information in small chunks, known as routing control packets or BGP updates. In view of the rapid growth of the Internet, there are significant concerns about the scalability of BGP updates and the efficiency of BGP routing in general. Motivated by these issues, we conduct a systematic time series analysis of BGP update rates. We find that BGP update time series are extremely volatile and exhibit long-term correlations and memory effects, similar to seismic time series or temperature and stock market price fluctuations. The presented statistical characterization of BGP update dynamics could serve as a basis for validating existing models of Internet interdomain routing and for developing better ones.
Schubert, Thomas W; Zickfeld, Janis H; Seibt, Beate; Fiske, Alan Page
2018-02-01
Feeling moved or touched can be accompanied by tears, goosebumps, and sensations of warmth in the centre of the chest. The experience has been described frequently, but psychological science knows little about it. We propose that labelling one's feeling as being moved or touched is a component of a social-relational emotion that we term kama muta (its Sanskrit label). We hypothesise that it is caused by appraising an intensification of communal sharing relations. Here, we test this by investigating people's moment-to-moment reports of feeling moved and touched while watching six short videos. We compare these to six other sets of participants' moment-to-moment responses watching the same videos: respectively, judgements of closeness (indexing communal sharing), reports of weeping, goosebumps, warmth in the centre of the chest, happiness, and sadness. Our eighth time series is expert ratings of communal sharing. Time series analyses show strong and consistent cross-correlations of feeling moved and touched and closeness with each other and with each of the three physiological variables and expert-rated communal sharing - but distinctiveness from happiness and sadness. These results support our model.
A Spatiotemporal Prediction Framework for Air Pollution Based on Deep RNN
NASA Astrophysics Data System (ADS)
Fan, J.; Li, Q.; Hou, J.; Feng, X.; Karimian, H.; Lin, S.
2017-10-01
Time series data in practical applications always contain missing values due to sensor malfunction, network failure, outliers, etc. In order to handle missing values in time series, as well as the lack of consideration of temporal properties in machine learning models, we propose a spatiotemporal prediction framework based on missing value processing algorithms and a deep recurrent neural network (DRNN). By using a missing tag and missing interval to represent time series patterns, we implement three different missing value fixing algorithms, which are further incorporated into a deep neural network that consists of LSTM (Long Short-term Memory) layers and fully connected layers. Real-world air quality and meteorological datasets (Jingjinji area, China) are used for model training and testing. Deep feed-forward neural networks (DFNN) and gradient boosting decision trees (GBDT) are trained as baseline models against the proposed DRNN. The performance of the three missing value fixing algorithms, as well as of the different machine learning models, is evaluated and analysed. Experiments show that the proposed DRNN framework outperforms both DFNN and GBDT, thereby validating the capacity of the proposed framework. Our results also provide useful insights for better understanding of different strategies for handling missing values.
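The general idea of feeding a missing tag and a missing interval alongside the observed values into a recurrent regressor can be sketched with TensorFlow/Keras as below. This is not the paper's exact DRNN architecture or data pipeline; the pollutant windows, shapes, and values are synthetic placeholders.

```python
# Minimal sketch (assuming TensorFlow/Keras): feed the observed value, a
# missing-value tag, and the time since the last valid observation into an
# LSTM regressor. Not the paper's exact DRNN; data are synthetic placeholders.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(12)
n_samples, window = 512, 24                          # 24-hour input windows
pm25 = np.abs(rng.normal(60, 20, (n_samples, window, 1))).astype(np.float32)
mask = (rng.random((n_samples, window, 1)) < 0.1)    # 10% of readings missing
pm25_filled = np.where(mask, 0.0, pm25)              # one simple value-fixing strategy
missing_tag = mask.astype(np.float32)
# "missing interval": time steps since the last valid observation
interval = np.zeros_like(missing_tag)
for t in range(1, window):
    interval[:, t, 0] = np.where(mask[:, t, 0], interval[:, t - 1, 0] + 1, 0)

X = np.concatenate([pm25_filled, missing_tag, interval], axis=-1)
y = np.abs(rng.normal(60, 20, (n_samples, 1))).astype(np.float32)   # next-hour target (synthetic)

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=64, verbose=0)
print(model.predict(X[:3], verbose=0).ravel())
```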
Persistent topological features of dynamical systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maletić, Slobodan, E-mail: slobodan@hitsz.edu.cn; Institute of Nuclear Sciences Vinča, University of Belgrade, Belgrade; Zhao, Yi, E-mail: zhao.yi@hitsz.edu.cn
Inspired by an early work of Muldoon et al., Physica D 65, 1–16 (1993), we present a general method for constructing a simplicial complex from observed time series of dynamical systems based on the delay coordinate reconstruction procedure. The obtained simplicial complex preserves all pertinent topological features of the reconstructed phase space, and it may be analyzed from topological, combinatorial, and algebraic aspects. The focus of this study is the computation of homology of the invariant set of some well known dynamical systems that display chaotic behavior. Persistent homology of the simplicial complex and its relationship with the embedding dimensions are examined by studying the lifetime of topological features and topological noise. The consistency of topological properties for different dynamic regimes and embedding dimensions is examined. The obtained results shed new light on the topological properties of the reconstructed phase space and open up new possibilities for application of advanced topological methods. The method presented here may be used as a generic method for constructing a simplicial complex from a scalar time series, and it has a number of advantages compared to the mapping of the same time series to a complex network.
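A related, simpler computation can illustrate the pipeline: build a delay-coordinate point cloud from a scalar series and compute persistent homology of that cloud, here with the third-party ripser package. The paper constructs an explicit simplicial complex rather than this Vietoris-Rips filtration, so the sketch is only an analogous illustration on a synthetic signal.

```python
# Sketch: delay-coordinate reconstruction of a scalar time series and
# persistent homology of the resulting point cloud, assuming the third-party
# ripser package (pip install ripser). The paper builds an explicit simplicial
# complex; this Vietoris-Rips computation is only an analogous illustration.
import numpy as np
from ripser import ripser

def delay_embed(x, dim=3, tau=8):
    # stack delayed copies of x into a (n_points, dim) point cloud
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

# Scalar observable from a quasi-periodic oscillation (synthetic stand-in)
t = np.linspace(0, 60, 1500)
x = np.sin(t) + 0.5 * np.sin(np.sqrt(2) * t)

cloud = delay_embed(x, dim=3, tau=20)[::5]        # subsample to keep it fast
diagrams = ripser(cloud, maxdim=1)["dgms"]
h1 = diagrams[1]
lifetimes = h1[:, 1] - h1[:, 0] if len(h1) else np.array([0.0])
print("longest H1 lifetime (most persistent loop):", float(lifetimes.max()))
```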
Hierarchical organization of brain functional networks during visual tasks.
Zhuo, Zhao; Cai, Shi-Min; Fu, Zhong-Qian; Zhang, Jie
2011-09-01
The functional network of the brain is known to demonstrate modular structure over different hierarchical scales. In this paper, we systematically investigated the hierarchical modular organizations of the brain functional networks that are derived from the extent of phase synchronization among high-resolution EEG time series during a visual task. In particular, we compare the modular structure of the functional network from EEG channels with that of the anatomical parcellation of the brain cortex. Our results show that the modular architectures of brain functional networks correspond well to those from the anatomical structures over different levels of hierarchy. Most importantly, we find that the consistency between the modular structures of the functional network and the anatomical network becomes more pronounced in terms of vision, sensory, vision-temporal, motor cortices during the visual task, which implies that the strong modularity in these areas forms the functional basis for the visual task. The structure-function relationship further reveals that the phase synchronization of EEG time series in the same anatomical group is much stronger than that of EEG time series from different anatomical groups during the task and that the hierarchical organization of functional brain network may be a consequence of functional segmentation of the brain cortex.
Wong, Raymond
2013-01-01
Voice biometrics is a physiological characteristic: each person's voice is unique. Owing to this uniqueness, voice classification has found useful applications in classifying speakers' gender, mother tongue or ethnicity (accent), and emotional state, as well as in identity verification, verbal command control, and so forth. In this paper, we adopt a new preprocessing method named Statistical Feature Extraction (SFX) for extracting important features for training a classification model, based on a piecewise transformation that treats an audio waveform as a time series. Using SFX we can faithfully remodel the statistical characteristics of the time series; together with spectral analysis, a substantial number of features are extracted in combination. An ensemble is utilized to select only the influential features to be used in classification model induction. We focus on comparing the effects of various popular data mining algorithms on multiple datasets. Our experiment consists of classification tests over four typical categories of human voice data, namely Female and Male, Emotional Speech, Speaker Identification, and Language Recognition. The experiments yield encouraging results supporting the conclusion that heuristically choosing significant features from both the time and frequency domains produces better performance in voice classification than traditional signal processing techniques alone, such as wavelets and LPC-to-CC. PMID:24288684
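The piecewise statistical treatment of a waveform can be illustrated with a short sketch. The specific descriptors below are illustrative stand-ins, not the exact SFX feature set.

```python
import numpy as np

def piecewise_stats(signal, n_segments=20):
    """Piecewise statistical descriptors of an audio waveform treated as a time series."""
    segments = np.array_split(np.asarray(signal, dtype=float), n_segments)
    feats = []
    for seg in segments:
        feats.extend([seg.mean(), seg.std(), seg.min(), seg.max(),
                      np.mean(np.abs(np.diff(seg)))])   # mean absolute slope per segment
    return np.array(feats)

waveform = np.random.randn(16000)           # stand-in for one second of 16 kHz audio
print(piecewise_stats(waveform).shape)      # (100,) features for the classifier
```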
An assessment of optical and biogeochemical multi-decadal trends in the Sargasso Sea
NASA Astrophysics Data System (ADS)
Allen, J. G.; Siegel, D.; Nelson, N. B.
2016-02-01
Observations of optical and biogeochemical data, made as part of the Bermuda Bio-Optics Project (BBOP) at the Bermuda Atlantic Time-series Study (BATS) site in the Sargasso Sea, allow for the examination of temporal trends in vertical light attenuation and their potential controls. Trends in both the magnitude and spectral slope of the diffuse attenuation coefficient should reflect changes in chlorophyll and chromophoric dissolved organic matter (CDOM) concentrations in the Sargasso Sea. The length and methodological consistency of this time series provide an excellent opportunity to extend analyses of seasonal cycles of apparent optical properties to interannual and multi-year time scales. Here, we characterize changes in the size and shape of diffuse attenuation coefficient spectra and compare them to temperature, to chlorophyll a concentration, and to discrete measurements of phytoplankton and CDOM absorption. The time series analyses reveal up to a 1.2% annual increase in the magnitude of the diffuse attenuation coefficient over the upper 70 m of the water column while showing no significant change in the spectral slope of diffuse attenuation over the course of the study. These observations indicate that increases in phytoplankton pigment concentration rather than changes in CDOM are the primary driver for the attenuation trends on multi-year timescales for this region.
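A per-year trend of the kind quoted above (roughly 1.2% per year) is commonly estimated from a log-linear fit. The following sketch uses synthetic data, not the BBOP/BATS record.

```python
import numpy as np

# Percent-per-year trend of an attenuation-like quantity from a log-linear fit
# (illustrative synthetic data only).
years = np.arange(1992, 2012, 1 / 12)                      # monthly sampling
kd = 0.05 * np.exp(0.012 * (years - years[0]))             # ~1.2 %/yr underlying increase
kd *= np.exp(np.random.normal(0, 0.02, kd.size))           # multiplicative noise

slope, _ = np.polyfit(years, np.log(kd), 1)
print(f"trend: {100 * (np.exp(slope) - 1):.2f} % per year")
```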
What does the structure of its visibility graph tell us about the nature of the time series?
NASA Astrophysics Data System (ADS)
Franke, Jasper G.; Donner, Reik V.
2017-04-01
Visibility graphs are a recently introduced method to construct complex network representations based upon univariate time series in order to study their dynamical characteristics [1]. In recent years, this approach has been successfully applied to studying a considerable variety of geoscientific research questions and data sets, including non-trivial temporal patterns in complex earthquake catalogs [2] or time-reversibility in climate time series [3]. It has been shown that several characteristic features of the thus constructed networks differ between stochastic and deterministic (possibly chaotic) processes, which is, however, relatively hard to exploit in the case of real-world applications. In this study, we propose studying two new measures related to the network complexity of visibility graphs constructed from time series, one being a special type of network entropy [4] and the other a recently introduced measure of the heterogeneity of the network's degree distribution [5]. For paradigmatic model systems exhibiting bifurcation sequences between regular and chaotic dynamics, both properties clearly trace the transitions between both types of regimes and exhibit marked quantitative differences for regular and chaotic dynamics. Moreover, for dynamical systems with a small amount of additive noise, the considered properties demonstrate gradual changes prior to the bifurcation point. This finding appears closely related to the subsequent loss of stability of the current state, which is known to lead to a critical slowing down as the transition point is approached. In this spirit, both considered visibility graph characteristics provide alternative tracers of dynamical early warning signals consistent with classical indicators. Our results demonstrate that measures of visibility graph complexity (i) provide a potentially useful means of tracing changes in the dynamical patterns encoded in a univariate time series that originate from increasing autocorrelation and (ii) allow regular dynamics to be systematically distinguished from deterministic-chaotic dynamics. We demonstrate the application of our method for different model systems as well as selected paleoclimate time series from the North Atlantic region. Notably, visibility graph based methods are particularly suited for studying the latter type of geoscientific data, since they do not impose intrinsic restrictions or assumptions on the nature of the time series under investigation in terms of noise process, linearity and sampling homogeneity. [1] Lacasa, Lucas, et al. "From time series to complex networks: The visibility graph." Proceedings of the National Academy of Sciences 105.13 (2008): 4972-4975. [2] Telesca, Luciano, and Michele Lovallo. "Analysis of seismic sequences by using the method of visibility graph." EPL (Europhysics Letters) 97.5 (2012): 50002. [3] Donges, Jonathan F., Reik V. Donner, and Jürgen Kurths. "Testing time series irreversibility using complex network methods." EPL (Europhysics Letters) 102.1 (2013): 10004. [4] Small, Michael. "Complex networks from time series: capturing dynamics." 2013 IEEE International Symposium on Circuits and Systems (ISCAS2013), Beijing (2013): 2509-2512. [5] Jacob, Rinku, K.P. Harikrishnan, Ranjeev Misra, and G. Ambika. "Measure for degree heterogeneity in complex networks and its application to recurrence network analysis." arXiv preprint 1605.06607 (2016).
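For reference, the natural visibility graph construction of [1] can be sketched in a few lines. The degree-spread measure shown alongside it is a simplified stand-in for the heterogeneity index of [5], not its exact definition.

```python
import numpy as np

def visibility_graph(x):
    """Edge set of the natural visibility graph of an evenly sampled time series."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    edges = set()
    for i in range(n - 1):
        for j in range(i + 1, n):
            # (i, j) are connected if every intermediate sample lies below the line i-j
            visible = all(
                x[k] < x[j] + (x[i] - x[j]) * (j - k) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                edges.add((i, j))
    return edges

def degree_spread(edges, n):
    """Coefficient of variation of the degree distribution (simplified heterogeneity proxy)."""
    deg = np.zeros(n)
    for i, j in edges:
        deg[i] += 1; deg[j] += 1
    return deg.std() / deg.mean()

x = np.sin(np.linspace(0, 20, 200)) + 0.1 * np.random.randn(200)
e = visibility_graph(x)
print(len(e), degree_spread(e, len(x)))
```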
NASA Astrophysics Data System (ADS)
Di Piazza, A.; Cordano, E.; Eccel, E.
2012-04-01
The issue of climate change detection is considered a major challenge. In particular, high temporal resolution climate change scenarios are required in the evaluation of the effects of climate change on agricultural management (crop suitability, yields, risk assessment, etc.), energy production and water management. In this work, a "Weather Generator" technique was used for downscaling climate change scenarios for temperature. An R package (RMAWGEN, Cordano and Eccel, 2011 - available on http://cran.r-project.org) was developed that aims to generate synthetic daily weather conditions by using the theory of vector auto-regressive models (VAR). The VAR model was chosen for its ability to maintain the temporal and spatial correlations among variables. In particular, observed time series of daily maximum and minimum temperature are transformed into "new" normally-distributed variable time series which are used to calibrate the parameters of a VAR model by using ordinary least squares methods. Therefore, the implemented algorithm, applied to monthly mean climatic values downscaled from Global Climate Model predictions, can generate several stochastic daily scenarios in which the statistical consistency among series is preserved. Further details are presented in the RMAWGEN documentation. An application is presented here using a dataset of daily temperature time series recorded at 41 different sites of the Trentino region for the period 1958-2010. Temperature time series were pre-processed to fill missing values (by a site-specific calibrated Inverse Distance Weighting algorithm, corrected with elevation) and to remove inhomogeneities. Several climatic indices were taken into account, useful for several impact assessment applications, and their time trends within the time series were analyzed. The indices range from more classical ones, such as annual mean temperatures, seasonal mean temperatures and their anomalies (relative to the reference period 1961-1990), to the climate change indices selected from the list recommended by the World Meteorological Organization Commission for Climatology (WMO-CCL) and the Research Programme on Climate Variability and Predictability (CLIVAR) project's Expert Team on Climate Change Detection, Monitoring and Indices (ETCCDMI). Each index was applied to both observed (and processed) data and to synthetic time series produced by the Weather Generator, over the thirty-year reference period 1981-2010, in order to validate the procedure. Climate projections were statistically downscaled for a selection of sites for the two 30-year periods 2021-2050 and 2071-2099 of the European project "Ensembles" multi-model output (scenario A1B). The use of several climatic indices strengthens the trend analysis of both the generated synthetic series and future climate projections.
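RMAWGEN itself is an R package; as a language-neutral illustration of the underlying idea, the following Python sketch fits a VAR(1) model to multi-site series by least squares and simulates synthetic data from it. The data are a synthetic stand-in, not the Trentino series.

```python
import numpy as np

def fit_var1(Y):
    """Least-squares VAR(1) fit: Y[t] = c + A @ Y[t-1] + e[t]."""
    X = np.hstack([np.ones((len(Y) - 1, 1)), Y[:-1]])
    B, *_ = np.linalg.lstsq(X, Y[1:], rcond=None)
    resid = Y[1:] - X @ B
    return B[0], B[1:].T, np.cov(resid.T)       # intercept, coefficient matrix, noise cov

def simulate_var1(c, A, Sigma, y0, steps, rng):
    out = [y0]
    for _ in range(steps):
        out.append(c + A @ out[-1] + rng.multivariate_normal(np.zeros(len(c)), Sigma))
    return np.array(out)

rng = np.random.default_rng(0)
# Stand-in for normalized daily Tmin/Tmax at two stations (4 coupled series)
Y = rng.multivariate_normal(np.zeros(4), np.eye(4), size=1000).cumsum(0) * 0.01
c, A, Sigma = fit_var1(Y)
synthetic = simulate_var1(c, A, Sigma, Y[-1], 365, rng)
print(synthetic.shape)
```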
NASA Astrophysics Data System (ADS)
Katselis, George; Koukou, Katerina; Dimitriou, Evagelos; Koutsikopoulos, Constantin
2007-07-01
In the present study we analysed the daily seaward migratory behaviour of four dominant euryhaline fish species (Mugilidae: Liza saliens, Liza aurata, Mugil cephalus and Sparidae: Sparus aurata) in the Messolonghi-Etoliko lagoon system (western Greek coast) based on the daily landings' time series of barrier traps, and assessed the relationship between their migratory behaviour and various climatic variables (air temperature and atmospheric pressure) and the lunar cycle. A 2-year time series of daily fish landings (1993 and 1994), a long time series of daily air temperature and daily temperature range (1991-1998), as well as a 4-year time series of daily atmospheric pressure (1994-1997) and daily pressure range were used. Harmonic models (HM) consisting of annual and lunar cycle harmonic components explained most (R2 > 0.80) of the mean daily species landings and temperature variations, while a rather low part of the variation (0.18 < R2 < 0.27) was explained for pressure, daily pressure range and daily temperature range. In all the time series sets the amplitude of the annual component was highest. The model values of all species revealed two important migration periods (summer and winter) corresponding to the spawning and refuge migrations. The lunar cycle effect on species' daily migration rates and the short-term fluctuation of daily migration rates were rather low. However, the short-term fluctuation of some species' daily migration rates during winter was greater than during summer. In all species, the main migration was the spawning migration. The model lunar components of the species landings showed a monthly oscillation synchronous with the full moon (S. aurata and M. cephalus) or a semi-monthly oscillation synchronous with the new and full moon (L. aurata and L. saliens). Bispectral analysis of the model values and the model residuals' time series revealed that the species' daily migrations were correlated (coherencies > 0.6) with the daily fluctuations of the climatic variables at seasonal, mid- and short-term scales.
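The harmonic-model fit can be sketched as a least-squares regression on sine/cosine pairs at the annual and lunar periods. The daily landings below are synthetic and only illustrate the mechanics.

```python
import numpy as np

def harmonic_design(t_days, periods=(365.25, 29.53)):
    """Design matrix with a constant plus sine/cosine pairs for each cycle period (days)."""
    cols = [np.ones_like(t_days)]
    for P in periods:
        w = 2 * np.pi * t_days / P
        cols += [np.sin(w), np.cos(w)]
    return np.column_stack(cols)

t = np.arange(730.0)                               # two years of daily data
landings = (5 + 3 * np.sin(2 * np.pi * t / 365.25)
              + 0.5 * np.cos(2 * np.pi * t / 29.53)
              + np.random.normal(0, 0.8, t.size))  # synthetic daily landings

X = harmonic_design(t)
beta, *_ = np.linalg.lstsq(X, landings, rcond=None)
fitted = X @ beta
r2 = 1 - np.var(landings - fitted) / np.var(landings)
print(f"R^2 = {r2:.2f}")
```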
Waning of "conditioned pain modulation": a novel expression of subtle pronociception in migraine.
Nahman-Averbuch, Hadas; Granovsky, Yelena; Coghill, Robert C; Yarnitsky, David; Sprecher, Elliot; Weissman-Fogel, Irit
2013-01-01
To assess the decay of the conditioned pain modulation (CPM) response along repeated applications as a possible expression of subtle pronociception in migraine. One of the most explored mechanisms underlying the pain modulation system is "diffuse noxious inhibitory controls," which is measured psychophysically in the lab by the CPM paradigm. There are contradicting reports on CPM response in migraine, questioning whether migraineurs express pronociceptive pain modulation. Migraineurs (n = 26) and healthy controls (n = 35), all females, underwent 3 stimulation series, consisting of repeated (1) "test-stimulus" (Ts) alone that was given first followed by (2) parallel CPM application (CPM-parallel), and (3) sequential CPM application (CPM-sequential), in which the Ts is delivered during or following the conditioning-stimulus, respectively. In all series, the Ts repeated 4 times (0-3). In the CPM series, repetition "0" consisted of the Ts-alone that was followed by 3 repetitions of the Ts with a conditioning-stimulus application. Although there was no difference between migraineurs and controls for the first CPM response in each series, we found waning of CPM-parallel efficiency along the series for migraineurs (P = .005 for third vs first CPM), but not for controls. Further, greater CPM waning in the CPM-sequential series was correlated with less reported extent of pain reduction by episodic medication (r = 0.493, P = .028). Migraineurs have subtle deficits in endogenous pain modulation which requires a more challenging test protocol than the commonly used single CPM. Waning of CPM response seems to reveal this pronociceptive state. The clinical relevance of the CPM waning effect is highlighted by its association with clinical parameters of migraine. © 2013 American Headache Society.
Single flux quantum voltage amplifiers
NASA Astrophysics Data System (ADS)
Golomidov, Vladimir; Kaplunenko, Vsevolod; Khabipov, Marat; Koshelets, Valery; Kaplunenko, Olga
The novel elements of the Rapid Single Flux Quantum (RSFQ) logic family, the Quasi Digital Voltage parallel and series Amplifiers (QDVA), have been computer simulated, designed, and experimentally investigated. The parallel QDVA consists of six stages and multiplies the input voltage by a factor of five. The output resistance of the QDVA is five times larger than the input resistance, so this amplifier seems to be a good matching stage between RSFQL and conventional semiconductor electronics. The series QDVA provides a gain factor of four and involves two doublers connected by a transmission line. The proposed parallel QDVA can be integrated on the same chip with a SQUID sensor.
The Age of the 20 Meter Solo River Terrace, Java, Indonesia and the Survival of Homo erectus in Asia
Indriati, Etty; Swisher, Carl C.; Lepre, Christopher; Quinn, Rhonda L.; Suriyanto, Rusyad A.; Hascaryo, Agus T.; Grün, Rainer; Feibel, Craig S.; Pobiner, Briana L.; Aubert, Maxime; Lees, Wendy; Antón, Susan C.
2011-01-01
Homo erectus was the first human lineage to disperse widely throughout the Old World, the only hominin in Asia through much of the Pleistocene, and was likely ancestral to H. sapiens. The demise of this taxon remains obscure because of uncertainties regarding the geological age of its youngest populations. In 1996, some of us co-published electron spin resonance (ESR) and uranium series (U-series) results indicating an age as young as 35–50 ka for the late H. erectus sites of Ngandong and Sambungmacan and the faunal site of Jigar (Indonesia). If correct, these ages favor an African origin for recent humans who would overlap with H. erectus in time and space. Here, we report 40Ar/39Ar incremental heating analyses and new ESR/U-series age estimates from the “20 m terrace" at Ngandong and Jigar. Both data sets are internally consistent and provide no evidence for reworking, yet they are inconsistent with one another. The 40Ar/39Ar analyses give an average age of 546±12 ka (sd±5 se) for both sites, the first reliable radiometric indications of a middle Pleistocene component for the terrace. Given the technical accuracy and consistency of the analyses, the argon ages represent either the actual age or the maximum age for the terrace and are significantly older than previous estimates. Most of the ESR/U-series results are older as well, but the oldest that meets all modeling criteria is 143 ka+20/−17. Most samples indicated leaching of uranium and likely represent either the actual or the minimum age of the terrace. Given known sources of error, the U-series results could be consistent with a middle Pleistocene age. However, the ESR and 40Ar/39Ar ages preclude one another. Regardless, the age of the sites and hominins is at least bracketed between these estimates and is older than currently accepted. PMID:21738710
Consistent response of vegetation dynamics to recent climate change in tropical mountain regions.
Krishnaswamy, Jagdish; John, Robert; Joseph, Shijo
2014-01-01
Global climate change has emerged as a major driver of ecosystem change. Here, we present evidence for globally consistent responses in vegetation dynamics to recent climate change in the world's mountain ecosystems located in the pan-tropical belt (30°N-30°S). We analyzed decadal-scale trends and seasonal cycles of vegetation greenness using monthly time series of satellite greenness (Normalized Difference Vegetation Index) and climate data for the period 1982-2006 for 47 mountain protected areas in five biodiversity hotspots. The time series of annual maximum NDVI for each of five continental regions shows mild greening trends followed by reversal to stronger browning trends around the mid-1990s. During the same period we found increasing trends in temperature but only marginal change in precipitation. The amplitude of the annual greenness cycle increased with time, and was strongly associated with the observed increase in temperature amplitude. We applied dynamic models with time-dependent regression parameters to study the time evolution of NDVI-climate relationships. We found that the relationship between vegetation greenness and temperature weakened over time or was negative. Such loss of positive temperature sensitivity has been documented in other regions as a response to temperature-induced moisture stress. We also used dynamic models to extract the trends in vegetation greenness that remain after accounting for the effects of temperature and precipitation. We found residual browning and greening trends in all regions, which indicate that factors other than temperature and precipitation also influence vegetation dynamics. Browning rates became progressively weaker with increase in elevation as indicated by quantile regression models. Tropical mountain vegetation is considered sensitive to climatic changes, so these consistent vegetation responses across widespread regions indicate persistent global-scale effects of climate warming and associated moisture stresses. © 2013 John Wiley & Sons Ltd.
2013-01-01
Background: Statistical process control (SPC), an industrial sphere initiative, has recently been applied in health care and public health surveillance. SPC methods assume independent observations, and process autocorrelation has been associated with an increase in false alarm frequency. Methods: Monthly mean raw mortality (at hospital discharge) time series, 1995–2009, at the individual Intensive Care Unit (ICU) level, were generated from the Australia and New Zealand Intensive Care Society adult patient database. Evidence for series (i) autocorrelation and seasonality was demonstrated using (partial)-autocorrelation ((P)ACF) function displays and classical series decomposition, and (ii) "in-control" status was sought using risk-adjusted (RA) exponentially weighted moving average (EWMA) control limits (3 sigma). Risk adjustment was achieved using a random coefficient (intercept as ICU site and slope as APACHE III score) logistic regression model, generating an expected mortality series. Application of time-series methods to an exemplar complete ICU series (1995-(end)2009) was via Box-Jenkins methodology: autoregressive moving average (ARMA) and (G)ARCH ((Generalised) Autoregressive Conditional Heteroscedasticity) models, the latter addressing volatility of the series variance. Results: The overall data set, 1995-2009, consisted of 491324 records from 137 ICU sites; average raw mortality was 14.07%; average (SD) raw and expected mortalities ranged from 0.012(0.113) and 0.013(0.045) to 0.296(0.457) and 0.278(0.247) respectively. For the raw mortality series: 71 sites had continuous data for assessment up to or beyond lag40 and 35% had autocorrelation through to lag40; and of 36 sites with continuous data for ≥ 72 months, all demonstrated marked seasonality. Similar numbers and percentages were seen with the expected series. Out-of-control signalling was evident for the raw mortality series with respect to RA-EWMA control limits; a seasonal ARMA model, with GARCH effects, displayed white-noise residuals which were in-control with respect to EWMA control limits and one-step prediction error limits (3SE). The expected series was modelled with a multiplicative seasonal autoregressive model. Conclusions: The data generating process of monthly raw mortality series at the ICU level displayed autocorrelation, seasonality and volatility. False-positive signalling of the raw mortality series was evident with respect to RA-EWMA control limits. A time series approach using residual control charts resolved these issues. PMID:23705957
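A stripped-down EWMA control chart over observed-minus-expected mortality illustrates the signalling mechanism discussed above; it omits the logistic risk-adjustment model and, as the paper argues, would tend to over-signal on autocorrelated series unless applied to model residuals.

```python
import numpy as np

def ewma_chart(p_obs, p_exp, lam=0.2, sigma_mult=3.0):
    """EWMA of observed-minus-expected monthly mortality with +/- 3 sigma limits."""
    diff = np.asarray(p_obs) - np.asarray(p_exp)
    z = np.zeros_like(diff)
    for t in range(len(diff)):
        prev = z[t - 1] if t else 0.0
        z[t] = lam * diff[t] + (1 - lam) * prev
    # asymptotic EWMA standard deviation, using the raw differences as the noise scale
    sd = diff.std() * np.sqrt(lam / (2 - lam))
    limits = sigma_mult * sd
    return z, limits, np.abs(z) > limits          # EWMA, control limit, signal flags

obs = np.random.normal(0.14, 0.02, 180)            # 15 years of monthly raw mortality (synthetic)
exp = np.full(180, 0.14)                           # stand-in for the risk-adjusted expectation
z, lim, flags = ewma_chart(obs, exp)
print(flags.sum(), "months signalling out-of-control")
```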
The EUMETSAT Polar System - Second Generation (EPS-SG) microwave imaging (MWI) mission
NASA Astrophysics Data System (ADS)
Bojkov, B. R.; Accadia, C.; Klaes, D.; Canestri, A.; Cohen, M.
2017-12-01
The EUMETSAT Polar System (EPS) will be followed by a second generation system called EPS-SG. This new family of missions will contribute to the Joint Polar System being jointly set up with NOAA in the timeframe 2020-2040. These satellites will fly, like Metop (EPS), in a sun-synchronous, low Earth orbit at 830 km altitude and 09:30 local time descending node, providing observations over the full globe with revisit times of 12 hours. EPS-SG consists of two different satellite configurations, the EPS-SGa series dedicated to IR and MW sounding, and the EPS-SGb series dedicated to microwave imaging and scatterometry. The EPS-SG family will consist of three successive launches of each satellite type. The Microwave Imager (MWI) will be hosted on the Metop-SGb series of satellites, with the primary objective of supporting Numerical Weather Prediction (NWP) at regional and global scales. Other applications will include the observation of surface parameters such as sea ice concentration, as well as hydrology applications. The 18 MWI instrument frequencies range from 18.7 GHz to 183 GHz. All MWI channels up to 89 GHz will measure V- and H-polarizations. The MWI was also designed to provide continuity of measurements for select heritage microwave imager channels (e.g. SSM/I, AMSR-E). The additional sounding channels, such as the 50-55 and 118 GHz bands, will provide additional cloud and precipitation information over sea and land. This combination of channels was successfully tested on the NPOESS Aircraft Sounder Testbed - Microwave Sounder (NAST-M) airborne radiometer, and this is the first time that it will be implemented in a conical scanning configuration in a single instrument. An overview of the EPS-SG programme and the MWI instrument will be presented.
Comparison of Passive Microwave-Derived Early Melt Onset Records on Arctic Sea Ice
NASA Technical Reports Server (NTRS)
Bliss, Angela C.; Miller, Jeffrey A.; Meier, Walter N.
2017-01-01
Two long records of melt onset (MO) on Arctic sea ice from passive microwave brightness temperatures (Tbs) obtained by a series of satellite-borne instruments are compared. The Passive Microwave (PMW) method and the Advanced Horizontal Range Algorithm (AHRA) detect the increase in emissivity that occurs when liquid water develops around snow grains at the onset of early melting on sea ice. The timing of MO on Arctic sea ice influences the amount of solar radiation absorbed by the ice-ocean system throughout the melt season by reducing surface albedos in the early spring. This work presents a thorough comparison of these two methods for the time series of MO dates from 1979 through 2012. The methods are first compared using the published data as a baseline comparison of the publicly available data products. A second comparison is performed on adjusted MO dates we produced to remove known differences in inter-sensor calibration of Tbs and in the masking techniques used to develop the original MO date products. These adjustments result in a more consistent set of input Tbs for the algorithms. Tests of significance indicate that the trends in the time series of annual mean MO dates for the PMW and AHRA are statistically different for the majority of the Arctic Ocean, including the Laptev, E. Siberian, Chukchi, Beaufort, and central Arctic regions, with mean differences as large as 38.3 days in the Barents Sea. Trend agreement improves for our more consistent MO dates for nearly all regions. Mean differences remain large, primarily due to differing sensitivity of in-algorithm thresholds and larger uncertainties in thin-ice regions.
State Education Indicators with a Focus on Title I: 2003-04
ERIC Educational Resources Information Center
Williams, Andra; Blank, Rolf K.; Toye, Carla; Petermann, Adam
2007-01-01
"State Education Indicators with a Focus on Title I: 2003-04" is the ninth in a series of reports designed to provide: (1) consistent, reliable indicators to allow analysis of trends for each state over time; (2) high quality, comparable state data; and (3) indicator formats designed for use by a diverse audience. Since its inception,…
State Education Indicators with a Focus on Title I: 2002-03
ERIC Educational Resources Information Center
Williams, Andra; Blank, Rolf K.; Toye, Carla; Petermann, Adam
2007-01-01
This paper is the eighth in a series of reports designed to provide (1) consistent, reliable indicators to allow analysis of trends for each state over time, (2) high data quality for comparability from state to state, and (3) accessible indicator formats aimed toward facilitating use by a variety of audiences. Since its inception, the report has…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-21
... assessment of the Gulf of Mexico red snapper fishery will consist of a series of workshops and supplemental... Workshop Webinars 7 and 8 are as follows: Panelists will continue to review the progress of modeling... (see ADDRESSES) at least 10 business days prior to the meeting. Note: The times and sequence specified...
16 CFR 1615.4 - Test procedure.
Code of Federal Regulations, 2012 CFR
2012-01-01
...) Specimen holder. The specimen holder is designed to permit suspension of the specimen in a fixed vertical... series of loads for char length determinations. Suitable metal hooks consist of No. 19 gauge steel wire... least 8 h at 21±1.1 °C (70±2 °F) and 65±2 pct relative humidity. Shorter conditioning times may be used...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-29
... of PHLX FOREX Options may be added consistent with the timing described above for new series of index... prior to the expiration date. Under proposed Rule 1006C, the closing settlement value for PHLX FOREX Options and for the FLEX PHLX FOREX Options on the currencies listed in the rule shall be the spot market...
Predicting future forestland area: a comparison of econometric approaches.
SoEun Ahn; Andrew J. Plantinga; Ralph J. Alig
2000-01-01
Predictions of future forestland area are an important component of forest policy analyses. In this article, we test the ability of econometric land use models to accurately forecast forest area. We construct a panel data set for Alabama consisting of county and time-series observations for the period 1964 to 1992. We estimate models using restricted data sets-namely,...
Music for Elementary Teachers; Self-Help Guide (MUS 370). Adams State College.
ERIC Educational Resources Information Center
Stokes, Cloyce
This self-help guide for the music teacher is one of a series of eight Teacher Education Modules developed by the Adams State College Teacher Corps Program. The guide itself consists of 11 modules, the first five of which focus on the mathematical and scientific aspects of music--pitch, tempo, duration, time, and key. These five modules are…
ERIC Educational Resources Information Center
Bassanini, Andrea; Duval, Romain
2006-01-01
This paper explores the impact of policies and institutions on employment and unemployment of OECD countries in the past decades. Reduced-form unemployment equations, consistent with standard wage setting/price-setting models, are estimated using cross-country/time-series data from 21 OECD countries over the period 1982-2003. In the…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-27
... Perryman (2002) used similar methods to Wade (2002) to update the status assessment of ENP gray whales by...). Subsequently, Laake et al. (2009) reanalyzed all previous abundance data using methods consistent with Wade... time series of abundance estimates. Punt and Wade (2010) used methods similar to those described by...
Marine Boundary Layer Cloud Properties From AMF Point Reyes Satellite Observations
NASA Technical Reports Server (NTRS)
Jensen, Michael; Vogelmann, Andrew M.; Luke, Edward; Minnis, Patrick; Miller, Mark A.; Khaiyer, Mandana; Nguyen, Louis; Palikonda, Rabindra
2007-01-01
Cloud Diameter, C_D, offers a simple measure of Marine Boundary Layer (MBL) cloud organization. The diurnal cycles of cloud-physical properties and C_D at Pt. Reyes are consistent with previous work. The time series of C_D can be used to identify distinct mesoscale organization regimes within the Pt. Reyes observation period.
materials determine the range of applicability of each method. A useful microencapsulation method, based on coagulation by inertial force, was developed... The generation apparatus, consisting of two aerosol generators in series, was utilized to produce many kinds of microcapsules. A fluid energy mill... was found useful for the production of some microcapsules. The permeability of microcapsule films and the effect of exposure time and humidity were
Fully Bayesian Estimation of Data from Single Case Designs
ERIC Educational Resources Information Center
Rindskopf, David
2013-01-01
Single case designs (SCDs) generally consist of a small number of short time series in two or more phases. The analysis of SCDs statistically fits in the framework of a multilevel model, or hierarchical model. The usual analysis does not take into account the uncertainty in the estimation of the random effects. This not only has an effect on the…
A comparative analysis of nurse and physician characters in the entertainment media.
Kalisch, P A; Kalisch, B J
1986-03-01
The results of a large body of research have yielded findings supportive of the view that the mass media have a decisive effect on the formation of public attitudes and behaviours. This study reports the results of a content analysis of 670 nurse and 466 physician characters portrayed in novels, motion pictures and prime-time television series, published or produced from 1920 to 1980. When compared with media physicians, media nurses were consistently found to be less central to the plot, less intelligent, rational, and individualistic, less likely to value scholarliness and achievement and exercise clinical judgement. Moreover in television series nurse characters were depicted as valuing service to others and being helpful to patients less, and as being lower in nurturance and empathy than physician characters. An analysis of these data over time points to a steady and unmistakable decline in the mass media entertainment image of nurses while physician characters have remained consistently high or shown improvement. The implications of this image gap are discussed along with the need for image reshaping efforts which might direct public demand for more collegial and productive 'real world' nurse-physician roles and interprofessional relationships.
Spectroscopic study of shock-induced decomposition in ammonium perchlorate single crystals.
Gruzdkov, Y A; Winey, J M; Gupta, Y M
2008-05-01
Time-resolved Raman scattering measurements were performed on ammonium perchlorate (AP) single crystals under stepwise shock loading. For particular temperature and pressure conditions, the intensity of the Raman spectra in shocked AP decayed exponentially with time. This decay is attributed to shock-induced chemical decomposition in AP. A series of shock experiments, reaching peak stresses from 10-18 GPa, demonstrated that higher stresses inhibit decomposition while higher temperatures promote it. No orientation dependence was found when AP crystals were shocked normal to the (210) and (001) crystallographic planes. VISAR (velocity interferometer system for any reflector) particle velocity measurements and time-resolved optical extinction measurements carried out to verify these observations are consistent with the Raman data. The combined kinetic and spectroscopic results are consistent with a proton-transfer reaction as the first decomposition step in shocked AP.
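The exponential intensity decay reported above can be summarized by a single decay constant obtained from a straight-line fit in log space. The numbers below are synthetic and only illustrate the fitting step.

```python
import numpy as np

# First-order decay constant from a (synthetic) time-resolved intensity trace,
# I(t) = I0 * exp(-t / tau); the fit is a straight line in log space.
t = np.linspace(0, 400e-9, 40)                     # time, seconds
intensity = 1000 * np.exp(-t / 150e-9) * (1 + 0.03 * np.random.randn(t.size))

slope, intercept = np.polyfit(t, np.log(intensity), 1)
print(f"decay time ~ {-1 / slope * 1e9:.0f} ns")
```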
Diffusive and subdiffusive dynamics of indoor microclimate: a time series modeling.
Maciejewska, Monika; Szczurek, Andrzej; Sikora, Grzegorz; Wyłomańska, Agnieszka
2012-09-01
The indoor microclimate is an issue in modern society, where people spend about 90% of their time indoors. Temperature and relative humidity are commonly used for its evaluation. In this context, the two parameters are usually considered as behaving in the same manner, just inversely correlated. This opinion comes from observation of the deterministic components of temperature and humidity time series. We focus on the dynamics and the dependency structure of the time series of these parameters, without deterministic components. Here we apply the mean square displacement, the autoregressive integrated moving average (ARIMA), and the methodology for studying anomalous diffusion. The analyzed data originated from five monitoring locations inside a modern office building, covering a period of nearly one week. It was found that the temperature data exhibited a transition between diffusive and subdiffusive behavior, when the building occupancy pattern changed from the weekday to the weekend pattern. At the same time the relative humidity consistently showed diffusive character. Also the structures of the dependencies of the temperature and humidity data sets were different, as shown by the different structures of the ARIMA models which were found appropriate. In the space domain, the dynamics and dependency structure of the particular parameter were preserved. This work proposes an approach to describe the very complex conditions of indoor air and it contributes to the improvement of the representative character of microclimate monitoring.
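The mean square displacement analysis mentioned above can be sketched as follows; a scaling exponent near 1 corresponds to ordinary diffusion and values below 1 to subdiffusion. The data are a synthetic stand-in, not the office-building measurements.

```python
import numpy as np

def mean_square_displacement(x, max_lag=200):
    """Sample MSD of a 1-D series; MSD ~ lag**alpha, with alpha < 1 indicating subdiffusion."""
    x = np.asarray(x, dtype=float)
    lags = np.arange(1, max_lag + 1)
    msd = np.array([np.mean((x[l:] - x[:-l]) ** 2) for l in lags])
    alpha = np.polyfit(np.log(lags), np.log(msd), 1)[0]   # scaling exponent
    return lags, msd, alpha

temp = np.cumsum(np.random.normal(0, 0.05, 5000))          # stand-in temperature residuals
print(f"alpha = {mean_square_displacement(temp)[2]:.2f}")   # ~1 indicates ordinary diffusion
```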
Results from the JPL IGS Analysis Center IGS14 Reprocessing Campaign
NASA Astrophysics Data System (ADS)
Ries, P. A.; Amiri, N.; Heflin, M. B.; Sakumura, C.; Sibois, A. E.; Sibthorpe, A.; David, M. W.
2017-12-01
The JPL IGS analysis center has begun a campaign to reprocess GPS orbits and clocks in the IGS14 reference frame. Though the new frame is only a few millimeters offset from the previous IGb08 frame, a reprocessing is required for consistent use of the new frame due to a change in the satellite phase center offsets between the frames. We will present results on the reprocessing campaign from 2002 to present in order to evaluate any effects caused by the new frame. We also create long-term time-series and periodograms of translation, rotation, and scale parameters to see if there is any divergence between the frames. We will also process long-term PPP time series and derived velocities for a well-distributed set of stations in each frame to compare with the published frame offsets.
Estimating terrestrial snow depth with the Topex-Poseidon altimeter and radiometer
Papa, F.; Legresy, B.; Mognard, N.M.; Josberger, E.G.; Remy, F.
2002-01-01
Active and passive microwave measurements obtained by the dual-frequency Topex-Poseidon radar altimeter from the Northern Great Plains of the United States are used to develop a snow pack radar backscatter model. The model results are compared with daily time series of surface snow observations made by the U.S. National Weather Service. The model results show that Ku-band provides more accurate snow depth determinations than does C-band. Comparing the snow depth determinations derived from the Topex-Poseidon nadir-looking passive microwave radiometers with the oblique-looking Satellite Sensor Microwave Imager (SSM/I) passive microwave observations and surface observations shows that both instruments accurately portray the temporal characteristics of the snow depth time series. While both retrievals consistently underestimate the actual snow depths, the Topex-Poseidon results are more accurate.
Satellite Emission Range Inferred Earth Survey (SERIES) project
NASA Technical Reports Server (NTRS)
Buennagel, L. A.; Macdoran, P. F.; Neilan, R. E.; Spitzmesser, D. J.; Young, L. E.
1984-01-01
The Global Positioning System (GPS) was developed by the Department of Defense primarily for navigation use by the United States Armed Forces. The system will consist of a constellation of 18 operational Navigation Satellite Timing and Ranging (NAVSTAR) satellites by the late 1980's. During the last four years, the Satellite Emission Range Inferred Earth Surveying (SERIES) team at the Jet Propulsion Laboratory (JPL) has developed a novel receiver which is the heart of the SERIES geodetic system designed to use signals broadcast from the GPS. This receiver does not require knowledge of the exact code sequence being transmitted. In addition, when two SERIES receivers are used differentially to determine a baseline, few cm accuracies can be obtained. The initial engineering test phase has been completed for the SERIES Project. Baseline lengths, ranging from 150 meters to 171 kilometers, have been measured with 0.3 cm to 7 cm accuracies. This technology, which is sponsored by the NASA Geodynamics Program, has been developed at JPL to meet the challenge for high precision, cost-effective geodesy, and to complement the mobile Very Long Baseline Interferometry (VLBI) system for Earth surveying.
Describing temporal variability of the mean Estonian precipitation series in climate time scale
NASA Astrophysics Data System (ADS)
Post, P.; Kärner, O.
2009-04-01
Applicability of random walk type models to represent the temporal variability of various atmospheric temperature series has been successfully demonstrated recently (e.g. Kärner, 2002). The main problem in temperature modeling is connected to the scale break in the generally self-similar air temperature anomaly series (Kärner, 2005). The break separates short-range strong non-stationarity from a nearly stationary longer-range variability region. This is an indication of the fact that several geophysical time series show a short-range non-stationary behaviour and a stationary behaviour in the longer range (Davis et al., 1996). In order to model series like that, the choice of time step appears to be crucial. To characterize the long-range variability we can neglect the short-range non-stationary fluctuations, provided that we are able to model properly the long-range tendencies. The structure function (Monin and Yaglom, 1975) was used to determine an approximate segregation line between the short and the long scale in terms of modeling. The longer scale can be called the climate scale, because such models are applicable on scales over some decades. In order to get rid of the short-range fluctuations in daily series, the variability can be examined using a sufficiently long time step. In the present paper, we show that the same philosophy is useful for finding a model to represent the climate-scale temporal variability of the Estonian daily mean precipitation amount series over 45 years (1961-2005). Temporal variability of the obtained daily time series is examined by means of an autoregressive integrated moving average (ARIMA) family model of the type (0,1,1). This model is applicable for simulating daily precipitation if an appropriate time step is selected that enables us to neglect the short-range non-stationary fluctuations. A considerably longer time step than one day (30 days) is used in the current paper to model the precipitation time series variability. Each ARIMA (0,1,1) model can be interpreted as consisting of a random walk in a noisy environment (Box and Jenkins, 1976). The fitted model appears to be weakly non-stationary, which gives us the possibility of using a stationary approximation if only the noise component of the white-noise-plus-random-walk sum is exploited. We get a convenient routine to generate a stationary precipitation climatology with reasonable accuracy, since the noise component variance is much larger than the dispersion of the random walk generator. This interpretation emphasizes the dominating role of a random component in the precipitation series. The result is understandable due to the small territory of Estonia, which is situated in the mid-latitude cyclone track. References: Box, G.E.P. and G. Jenkins 1976: Time Series Analysis, Forecasting and Control (revised edn.), Holden Day, San Francisco, CA, 575 pp. Davis, A., Marshak, A., Wiscombe, W. and R. Cahalan 1996: Multifractal characterizations of intermittency in nonstationary geophysical signals and fields. In G. Trevino et al. (eds), Current Topics in Nonstationarity Analysis, World Scientific, Singapore, 97-158. Kärner, O. 2002: On nonstationarity and antipersistency in global temperature series. J. Geophys. Res. D107; doi:10.1029/2001JD002024. Kärner, O. 2005: Some examples on negative feedback in the Earth climate system. Centr. European J. Phys. 3; 190-208. Monin, A.S. and A.M. Yaglom 1975: Statistical Fluid Mechanics, Vol. 2: Mechanics of Turbulence, MIT Press, Boston, Mass., 886 pp.
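An ARIMA(0,1,1) fit of the kind described can be reproduced with statsmodels. The series below is a synthetic white-noise-plus-weak-random-walk stand-in for the 30-day aggregated precipitation totals, not the Estonian data.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Synthetic stand-in: strong noise component plus a weak random-walk component,
# aggregated to a 30-day time step (45 years x 12 "months").
rng = np.random.default_rng(1)
noise = rng.gamma(shape=2.0, scale=20.0, size=540)
walk = np.cumsum(rng.normal(0, 1.0, 540))
precip = noise + walk

res = ARIMA(precip, order=(0, 1, 1)).fit()
# For an IMA(1,1) process the MA(1) coefficient approaches -1 when the noise variance
# dominates the random-walk increment variance, i.e. when the random component rules.
print(res.params)
```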
Sippel, Sebastian; Mahecha, Miguel D.; Hauhs, Michael; Bodesheim, Paul; Kaminski, Thomas; Gans, Fabian; Rosso, Osvaldo A.
2016-01-01
Data analysis and model-data comparisons in the environmental sciences require diagnostic measures that quantify time series dynamics and structure, and are robust to noise in observational data. This paper investigates the temporal dynamics of environmental time series using measures quantifying their information content and complexity. The measures are used to classify natural processes on one hand, and to compare models with observations on the other. The present analysis focuses on the global carbon cycle as an area of research in which model-data integration and comparisons are key to improving our understanding of natural phenomena. We investigate the dynamics of observed and simulated time series of Gross Primary Productivity (GPP), a key variable in terrestrial ecosystems that quantifies ecosystem carbon uptake. However, the dynamics, patterns and magnitudes of GPP time series, both observed and simulated, vary substantially on different temporal and spatial scales. We demonstrate here that information content and complexity, or Information Theory Quantifiers (ITQ) for short, serve as robust and efficient data-analytical and model benchmarking tools for evaluating the temporal structure and dynamical properties of simulated or observed time series at various spatial scales. At continental scale, we compare GPP time series simulated with two models and an observations-based product. This analysis reveals qualitative differences between model evaluation based on ITQ compared to traditional model performance metrics, indicating that good model performance in terms of absolute or relative error does not imply that the dynamics of the observations is captured well. Furthermore, we show, using an ensemble of site-scale measurements obtained from the FLUXNET archive in the Mediterranean, that model-data or model-model mismatches as indicated by ITQ can be attributed to and interpreted as differences in the temporal structure of the respective ecological time series. At global scale, our understanding of C fluxes relies on the use of consistently applied land models. Here, we use ITQ to evaluate model structure: The measures are largely insensitive to climatic scenarios, land use and atmospheric gas concentrations used to drive them, but clearly separate the structure of 13 different land models taken from the CMIP5 archive and an observations-based product. In conclusion, diagnostic measures of this kind provide data-analytical tools that distinguish different types of natural processes based solely on their dynamics, and are thus highly suitable for environmental science applications such as model structural diagnostics. PMID:27764187
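One widely used information-content quantifier of the kind referred to as ITQ is the normalized permutation entropy. The sketch below shows only this entropy component (the full entropy-complexity analysis also involves a complexity measure) and contrasts white noise with a deterministic chaotic map.

```python
import numpy as np
from itertools import permutations
from math import factorial, log

def permutation_entropy(x, order=4, delay=1):
    """Normalized permutation entropy of a scalar time series (0 = fully regular, 1 = noise-like)."""
    x = np.asarray(x, dtype=float)
    patterns = {p: 0 for p in permutations(range(order))}
    n = len(x) - (order - 1) * delay
    for i in range(n):
        window = x[i : i + order * delay : delay]
        patterns[tuple(np.argsort(window))] += 1      # count ordinal patterns
    probs = np.array([c for c in patterns.values() if c > 0]) / n
    return -np.sum(probs * np.log(probs)) / log(factorial(order))

noise = np.random.randn(5000)
x = np.empty(5000); x[0] = 0.4
for t in range(4999):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])               # deterministic chaotic map
print(permutation_entropy(noise), permutation_entropy(x))  # noise ~1, chaos noticeably lower
```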
Does the Rain fall in our heads?
NASA Astrophysics Data System (ADS)
Costa, M. E. G.; Rodrigues, M. A. S.
2012-04-01
In our school, activities linked with the sciences are developed in partnership with other school subjects. Interdisciplinary projects are always valued from the beginning to the end of a project, and it is common for teachers of different areas to work together on a science project. Researching articles written in English is very important, not only for developing our students' scientific literacy but also for widening their knowledge and exposing them to different perspectives, rather than limiting research to articles in Portuguese. In this work, we study the rainfall trends in our council (Góis, Portugal). The analysis of long-term rainfall time series is essential for evaluating the variability and tendency of the climate on century time scales. This, in turn, results in a better understanding of the regional climate, allowing a prognosis of the future climate, which is of great importance for managing natural and water resources and for planning human activities through scenarios and their impacts. This work consists of an analysis of long-term observed rainfall series for the council of Góis.
New Insights into Signed Path Coefficient Granger Causality Analysis.
Zhang, Jian; Li, Chong; Jiang, Tianzi
2016-01-01
Granger causality analysis, a time series analysis technique derived from econometrics, has been applied in an ever-increasing number of publications in the field of neuroscience, including fMRI, EEG/MEG, and fNIRS. The present study mainly focuses on the validity of "signed path coefficient Granger causality," a Granger-causality-derived analysis method that has been adopted by many fMRI studies in the last few years. This method generally estimates the causal effect among the time series by an order-1 autoregression, and defines a positive or negative coefficient as an "excitatory" or "inhibitory" influence. In the current work we conducted a series of computations on resting-state fMRI data and simulation experiments to illustrate that the signed path coefficient method is flawed and untenable, because the autoregressive coefficients are not always consistent with the real causal relationships, which inevitably leads to erroneous conclusions. Overall, our findings suggest that the applicability of this kind of causality analysis is rather limited; hence, researchers should be more cautious in applying signed path coefficient Granger causality to fMRI data to avoid misinterpretation.
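The "signed path coefficient" is essentially the cross-term of an order-1 autoregression. The sketch below shows how such a coefficient can be nearly zero even when one series strongly and positively drives the other, simply because the true influence acts at a longer lag than the order-1 model assumes; the simulation is illustrative, not the authors' exact setup.

```python
import numpy as np

def signed_path_coefficient(x, y):
    """Order-1 cross-regression coefficient of y[t] on x[t-1] (the 'signed path' value)."""
    X = np.column_stack([np.ones(len(x) - 1), x[:-1], y[:-1]])
    beta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    return beta[1]                                   # coefficient on x[t-1]

rng = np.random.default_rng(2)
x = rng.standard_normal(2000)
y = np.zeros(2000)
for t in range(2, 2000):
    # x drives y strongly and positively, but at lag 2 rather than lag 1
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 2] + 0.3 * rng.standard_normal()

# Near zero despite the strong positive influence of x on y: the sign and size of the
# order-1 coefficient need not reflect the true underlying coupling.
print(signed_path_coefficient(x, y))
```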
A 305 year monthly rainfall series for the Island of Ireland (1711-2016)
NASA Astrophysics Data System (ADS)
Murphy, Conor; Burt, Tim P.; Broderick, Ciaran; Duffy, Catriona; Macdonald, Neil; Matthews, Tom; McCarthy, Mark P.; Mullan, Donal; Noone, Simon; Ryan, Ciara; Thorne, Peter; Walsh, Seamus; Wilby, Robert L.
2017-04-01
This paper derives a continuous 305-year monthly rainfall series for the Island of Ireland (IoI) for the period 1711-2016. Two key data sources are employed: i) a previously unpublished UK Met Office Note which compiled annual rainfall anomalies and corresponding monthly per mille amounts from weather diaries and early observational records for the period 1711-1977; and ii) a long-term, homogenised monthly IoI rainfall series for the period 1850-2016. Using estimates of long-term average precipitation sampled from the quality assured series, the full record is reconstituted and insights drawn regarding notable periods and the range of climate variability and change experienced. Consistency with other long records for the region is examined, including: the England and Wales Precipitation series (EWP; 1766-2016); the early EWP Glasspoole series (1716-1765) and the Central England Temperature series (CET; 1711-2016). Strong correspondence between all records is noted from 1780 onwards. While disparities are evident between the early EWP and Ireland series, the latter shows strong decadal consistency with CET throughout the record. In addition, independent, early observations from Cork and Dublin, along with available documentary sources, corroborate the derived series and add confidence to our reconstruction. The new IoI rainfall record reveals that the wettest decades occurred in the early 18th Century, despite the fact that IoI has experienced a long-term winter wetting trend consistent with climate model projections. These exceptionally wet winters of the 1720s and 1730s were concurrent with almost unprecedented warmth in the CET, glacial advance throughout Scandinavia, and glacial retreat in West Greenland, consistent with a wintertime NAO-type forcing. Our study therefore demonstrates the value of long-term observational records for providing insight to the natural climate variability of the North Atlantic region.
NASA Astrophysics Data System (ADS)
Boudhina, Nissaf; Zitouna-Chebbi, Rim; Mekki, Insaf; Jacob, Frédéric; Ben Mechlia, Nétij; Masmoudi, Moncef; Prévot, Laurent
2018-06-01
Estimating evapotranspiration in hilly watersheds is paramount for managing water resources, especially in semiarid/subhumid regions. The eddy covariance (EC) technique allows continuous measurements of latent heat flux (LE). However, time series of EC measurements often experience large portions of missing data because of instrumental malfunctions or quality filtering. Existing gap-filling methods are questionable over hilly crop fields because of changes in airflow inclination and subsequent aerodynamic properties. We evaluated the performances of different gap-filling methods before and after tailoring them to the conditions of hilly crop fields. The tailoring consisted of splitting the LE time series beforehand on the basis of upslope and downslope winds. The experiment was set up within an agricultural hilly watershed in northeastern Tunisia. EC measurements were collected throughout the growth cycle of three wheat crops, two of them located in adjacent fields on opposite hillslopes, and the third one located in a flat field. We considered four gap-filling methods: the REddyProc method, the linear regression between LE and net radiation (Rn), the multi-linear regression of LE against the other energy fluxes, and the use of evaporative fraction (EF). Regardless of the method, the splitting of the LE time series did not impact the gap-filling rate, and it might improve the accuracy of LE retrievals in some cases. Regardless of the method, the obtained accuracies of LE estimates after gap filling were close to instrumental accuracies, and they were comparable to those reported in previous studies over flat and mountainous terrains. Overall, REddyProc was the most appropriate method, for both gap-filling rate and retrieval accuracy. Thus, it seems possible to conduct gap filling for LE time series collected over hilly crop fields, provided the LE time series are split beforehand on the basis of upslope-downslope winds. Future works should address consecutive vegetation growth cycles for a larger panel of conditions in terms of climate, vegetation, and water status.
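A minimal sketch of one of the simpler gap-filling options listed above, a linear regression of LE against Rn fitted separately for upslope and downslope wind classes; the variable names, the synthetic data and the wind split are illustrative assumptions, not the study's implementation.

```python
# Gap filling LE with a linear regression against net radiation (Rn), fitted
# separately for upslope and downslope wind classes, in the spirit of the
# splitting strategy described above. All inputs here are synthetic.
import numpy as np

def fill_le(le, rn, upslope):
    """Fill NaN gaps in `le` using LE ~ a*Rn + b fitted per wind class."""
    le_filled = le.copy()
    for cls in (True, False):                      # upslope / downslope
        mask = (upslope == cls)
        obs = mask & ~np.isnan(le) & ~np.isnan(rn)
        a, b = np.polyfit(rn[obs], le[obs], 1)     # least-squares fit per class
        gaps = mask & np.isnan(le) & ~np.isnan(rn)
        le_filled[gaps] = a * rn[gaps] + b
    return le_filled

# Synthetic half-hourly example with 20% missing LE values
rng = np.random.default_rng(2)
rn = np.clip(400 * np.sin(np.linspace(0, 20 * np.pi, 2000)), 0, None)
upslope = rng.random(2000) < 0.5
le = 0.4 * rn + 10 * upslope + 15 * rng.standard_normal(2000)
le[rng.random(2000) < 0.2] = np.nan
print(np.isnan(fill_le(le, rn, upslope)).sum(), "gaps left")
```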
Analyses of GIMMS NDVI Time Series in Kogi State, Nigeria
NASA Astrophysics Data System (ADS)
Palka, Jessica; Wessollek, Christine; Karrasch, Pierre
2017-10-01
The value of remote sensing data is particularly evident where an areal monitoring is needed to provide information on the earth's surface development. The use of temporal high resolution time series data allows for detecting short-term changes. In Kogi State in Nigeria different vegetation types can be found. As the major population in this region is living in rural communities with crop farming the existing vegetation is slowly being altered. The expansion of agricultural land causes loss of natural vegetation, especially in the regions close to the rivers which are suitable for crop production. With regard to these facts, two questions can be dealt with covering different aspects of the development of vegetation in the Kogi state, the determination and evaluation of the general development of the vegetation in the study area (trend estimation) and analyses on a short-term behavior of vegetation conditions, which can provide information about seasonal effects in vegetation development. For this purpose, the GIMMS-NDVI data set, provided by the NOAA, provides information on the normalized difference vegetation index (NDVI) in a geometric resolution of approx. 8 km. The temporal resolution of 15 days allows the already described analyses. For the presented analysis data for the period 1981-2012 (31 years) were used. The implemented workflow mainly applies methods of time series analysis. The results show that in addition to the classical seasonal development, artefacts of different vegetation periods (several NDVI maxima) can be found in the data. The trend component of the time series shows a consistently positive development in the entire study area considering the full investigation period of 31 years. However, the results also show that this development has not been continuous and a simple linear modeling of the NDVI increase is only possible to a limited extent. For this reason, the trend modeling was extended by procedures for detecting structural breaks in the time series.
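A minimal sketch of the kind of per-pixel analysis described above, assuming 24 composites per year: a mean seasonal cycle is removed from a synthetic NDVI series and a linear trend is fitted to the anomalies; structural-break detection is omitted.

```python
# Remove a mean seasonal cycle from a 15-day NDVI series (assumed 24 composites
# per year) and fit a linear trend to the anomalies. Data are synthetic.
import numpy as np

rng = np.random.default_rng(3)
per_year = 24                               # ~15-day GIMMS composites (assumption)
years = 31
t = np.arange(per_year * years)
ndvi = (0.4 + 0.2 * np.sin(2 * np.pi * t / per_year)      # seasonal cycle
        + 0.0008 * t                                       # slow greening trend
        + 0.03 * rng.standard_normal(t.size))

seasonal_mean = ndvi.reshape(years, per_year).mean(axis=0) # per-composite climatology
anomaly = ndvi - np.tile(seasonal_mean, years)
slope, intercept = np.polyfit(t, anomaly, 1)
print("NDVI trend per composite:", slope, "=> per decade:", slope * per_year * 10)
```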
Identification of Boolean Network Models From Time Series Data Incorporating Prior Knowledge.
Leifeld, Thomas; Zhang, Zhihua; Zhang, Ping
2018-01-01
Motivation: Mathematical models take an important place in science and engineering. A model can help scientists to explain the dynamic behavior of a system and to understand the functionality of system components. Since the length of a time series and the number of replicates are limited by the cost of experiments, Boolean networks, as a structurally simple and parameter-free logical model for gene regulatory networks, have attracted the interest of many scientists. In order to fit the biological context and to lower the data requirements, biological prior knowledge is taken into consideration during the inference procedure. In the literature, the existing identification approaches can only deal with a subset of possible types of prior knowledge. Results: We propose a new approach to identify Boolean networks from time series data incorporating prior knowledge, such as partial network structure, canalizing property, and positive and negative unateness. Using the vector form of Boolean variables and applying a generalized matrix multiplication called the semi-tensor product (STP), each Boolean function can be equivalently converted into a matrix expression. Based on this, the identification problem is reformulated as an integer linear programming problem to reveal the system matrix of the Boolean model in a computationally efficient way, whose dynamics are consistent with the important dynamics captured in the data. By using prior knowledge the number of candidate functions can be reduced during the inference. Hence, identification incorporating prior knowledge is especially suitable for the case of small-size time series data and data without sufficient stimuli. The proposed approach is illustrated with the help of a biological model of the network of oxidative stress response. Conclusions: The combination of an efficient reformulation of the identification problem with the possibility to incorporate various types of prior knowledge enables the application of computational model inference to systems with a limited amount of time series data. The general applicability of this methodological approach makes it suitable for a variety of biological systems and of general interest for biological and medical research.
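The semi-tensor-product and integer-programming machinery of the paper is not reproduced here; the sketch below only illustrates the underlying identification idea on a toy example, searching Boolean update rules consistent with observed transitions while prior knowledge restricts the candidate regulators.

```python
# Brute-force sketch of the identification idea: for one target gene, enumerate
# Boolean update functions over a small candidate regulator set and keep those
# consistent with the observed state transitions. Prior knowledge enters by
# restricting the candidate regulators. Toy stand-in, not the paper's method.
import itertools

# Observed binary time series: rows are time points, columns are genes A, B, C.
series = [(0, 0, 1), (0, 1, 1), (1, 1, 0), (1, 0, 0), (0, 0, 1)]
target = 2                       # infer the update rule of gene C
candidate_regulators = (0, 1)    # prior knowledge: C is regulated by A and B only

consistent_rules = []
k = len(candidate_regulators)
# A Boolean function of k inputs is a truth table with 2**k output bits.
for outputs in itertools.product([0, 1], repeat=2 ** k):
    ok = True
    for prev, nxt in zip(series[:-1], series[1:]):
        inputs = tuple(prev[g] for g in candidate_regulators)
        row = int("".join(map(str, inputs)), 2)
        if outputs[row] != nxt[target]:
            ok = False
            break
    if ok:
        consistent_rules.append(outputs)

print(len(consistent_rules), "candidate truth tables consistent with the data")
```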
Consistency properties of chaotic systems driven by time-delayed feedback
NASA Astrophysics Data System (ADS)
Jüngling, T.; Soriano, M. C.; Oliver, N.; Porte, X.; Fischer, I.
2018-04-01
Consistency refers to the property of an externally driven dynamical system to respond in similar ways to similar inputs. In a delay system, the delayed feedback can be considered as an external drive to the undelayed subsystem. We analyze the degree of consistency in a generic chaotic system with delayed feedback by means of the auxiliary system approach. In this scheme an identical copy of the nonlinear node is driven by exactly the same signal as the original, allowing us to verify complete consistency via complete synchronization. In the past, the phenomenon of synchronization in delay-coupled chaotic systems has been widely studied using correlation functions. Here, we analytically derive relationships between characteristic signatures of the correlation functions in such systems and unequivocally relate them to the degree of consistency. The analytical framework is illustrated and supported by numerical calculations of the logistic map with delayed feedback for different replica configurations. We further apply the formalism to time series from an experiment based on a semiconductor laser with a double fiber-optical feedback loop. The experiment constitutes a high-quality replica scheme for studying consistency of the delay-driven laser and confirms the general theoretical results.
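A minimal sketch of the auxiliary-system (replica) test using a logistic map with delayed feedback; the coupling form, parameters and initial conditions are illustrative choices, not those of the paper.

```python
# Auxiliary-system (replica) test for consistency with a logistic map driven by
# delayed feedback: the replica has the same map and receives exactly the same
# delayed drive as the original, but starts from a different initial history.
# Convergence of |x - y| indicates (complete) consistency for these settings.
import numpy as np

def f(u):                          # logistic nonlinearity
    return 3.9 * u * (1.0 - u)

tau, eps, n = 20, 0.6, 5000
rng = np.random.default_rng(4)
x = rng.random(n)                  # original system (first tau values are history)
y = rng.random(n)                  # replica with a different initial history
for t in range(tau, n - 1):
    drive = f(x[t - tau])                          # delayed feedback, shared by both
    x[t + 1] = (1 - eps) * f(x[t]) + eps * drive
    y[t + 1] = (1 - eps) * f(y[t]) + eps * drive

err = np.abs(x - y)[-1000:]
print("mean replica error over the last 1000 steps:", err.mean())
```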
Bayesian inference of selection in a heterogeneous environment from genetic time-series data.
Gompert, Zachariah
2016-01-01
Evolutionary geneticists have sought to characterize the causes and molecular targets of selection in natural populations for many years. Although this research programme has been somewhat successful, most statistical methods employed were designed to detect consistent, weak to moderate selection. In contrast, phenotypic studies in nature show that selection varies in time and that individual bouts of selection can be strong. Measurements of the genomic consequences of such fluctuating selection could help test and refine hypotheses concerning the causes of ecological specialization and the maintenance of genetic variation in populations. Herein, I proposed a Bayesian nonhomogeneous hidden Markov model to estimate effective population sizes and quantify variable selection in heterogeneous environments from genetic time-series data. The model is described and then evaluated using a series of simulated data, including cases where selection occurs on a trait with a simple or polygenic molecular basis. The proposed method accurately distinguished neutral loci from non-neutral loci under strong selection, but not from those under weak selection. Selection coefficients were accurately estimated when selection was constant or when the fitness values of genotypes varied linearly with the environment, but these estimates were less accurate when fitness was polygenic or the relationship between the environment and the fitness of genotypes was nonlinear. Past studies of temporal evolutionary dynamics in laboratory populations have been remarkably successful. The proposed method makes similar analyses of genetic time-series data from natural populations more feasible and thereby could help answer fundamental questions about the causes and consequences of evolution in the wild. © 2015 John Wiley & Sons Ltd.
Detection of bifurcations in noisy coupled systems from multiple time series
NASA Astrophysics Data System (ADS)
Williamson, Mark S.; Lenton, Timothy M.
2015-03-01
We generalize a method of detecting an approaching bifurcation in a time series of a noisy system from the special case of one dynamical variable to multiple dynamical variables. For a system described by a stochastic differential equation consisting of an autonomous deterministic part with one dynamical variable and an additive white noise term, small perturbations away from the system's fixed point will decay slower the closer the system is to a bifurcation. This phenomenon is known as critical slowing down and all such systems exhibit this decay-type behaviour. However, when the deterministic part has multiple coupled dynamical variables, the possible dynamics can be much richer, exhibiting oscillatory and chaotic behaviour. In our generalization to the multi-variable case, we find additional indicators to decay rate, such as frequency of oscillation. In the case of approaching a homoclinic bifurcation, there is no change in decay rate but there is a decrease in frequency of oscillations. The expanded method therefore adds extra tools to help detect and classify approaching bifurcations given multiple time series, where the underlying dynamics are not fully known. Our generalisation also allows bifurcation detection to be applied spatially if one treats each spatial location as a new dynamical variable. One may then determine the unstable spatial mode(s). This is also something that has not been possible with the single variable method. The method is applicable to any set of time series regardless of its origin, but may be particularly useful when anticipating abrupt changes in the multi-dimensional climate system.
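For orientation, the sketch below shows the classic single-variable precursor that this work generalizes: lag-1 autocorrelation estimated in sliding windows rising as a synthetic system drifts toward a bifurcation. The multi-variable indicators (decay rates plus oscillation frequencies) introduced in the paper are not reproduced.

```python
# Critical slowing down indicator: lag-1 autocorrelation in sliding windows
# rises as the restoring rate of a noisy one-variable system slowly weakens.
import numpy as np

rng = np.random.default_rng(5)
n = 6000
x = np.zeros(n)
for t in range(n - 1):
    k = 1.0 - 0.9 * t / n                  # restoring rate slowly weakens over time
    x[t + 1] = x[t] - 0.1 * k * x[t] + 0.1 * rng.standard_normal()

def lag1_autocorr(w):
    return np.corrcoef(w[:-1], w[1:])[0, 1]

window = 500
ac1 = [lag1_autocorr(x[i:i + window]) for i in range(0, n - window, window)]
print(np.round(ac1, 3))                    # rises toward 1 as the restoring rate weakens
```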
GNSS seismometer: Seismic phase recognition of real-time high-rate GNSS deformation waves
NASA Astrophysics Data System (ADS)
Nie, Zhaosheng; Zhang, Rui; Liu, Gang; Jia, Zhige; Wang, Dijin; Zhou, Yu; Lin, Mu
2016-12-01
High-rate global navigation satellite systems (GNSS) can potentially be used as seismometers to capture short-period instantaneous dynamic deformation waves from earthquakes. However, the performance and seismic phase recognition of the GNSS seismometer in the real-time mode, which plays an important role in GNSS seismology, are still uncertain. By comparing the results of accuracy and precision of the real-time solution using a shake table test, we found real-time solutions to be consistent with post-processing solutions and independent of sampling rate. In addition, we analyzed the time series of real-time solutions for shake table tests and recent large earthquakes. The results demonstrated that high-rate GNSS have the ability to retrieve most types of seismic waves, including P-, S-, Love, and Rayleigh waves. The main factor limiting its performance in recording seismic phases is the widely used 1-Hz sampling rate. The noise floor also makes recognition of some weak seismic phases difficult. We concluded that the propagation velocities and path of seismic waves, macro characteristics of the high-rate GNSS array, spatial traces of seismic phases, and incorporation of seismographs are all useful in helping to retrieve seismic phases from the high-rate GNSS time series.
Deriving phenological metrics from NDVI through an open source tool developed in QGIS
NASA Astrophysics Data System (ADS)
Duarte, Lia; Teodoro, A. C.; Gonçalves, Hernãni
2014-10-01
Vegetation indices have been commonly used over the past 30 years for studying vegetation characteristics using images collected by remote sensing satellites. One of the most commonly used is the Normalized Difference Vegetation Index (NDVI). The various stages that green vegetation undergoes during a complete growing season can be summarized through time-series analysis of NDVI data. The analysis of such time-series allows for extracting key phenological variables or metrics of a particular season. These characteristics may not necessarily correspond directly to conventional, ground-based phenological events, but do provide indications of ecosystem dynamics. A complete list of the phenological metrics that can be extracted from smoothed, time-series NDVI data is available in the USGS online resources (http://phenology.cr.usgs.gov/methods_deriving.php). This work aims to develop an open source application to automatically extract these phenological metrics from a set of satellite input data. The main advantage of QGIS for this specific application lies in the ease and speed of developing new plug-ins using the Python language, based on the experience of the research group in other related works. QGIS has its own application programming interface (API) with functionalities and programs to develop new features. The toolbar developed for this application was implemented using the plug-in NDVIToolbar.py. The user introduces the raster files as input and obtains a plot and a report with the metrics. The report includes the following eight metrics: SOST (Start Of Season - Time) corresponding to the day of the year identified as having a consistent upward trend in the NDVI time series; SOSN (Start Of Season - NDVI) corresponding to the NDVI value associated with SOST; EOST (End of Season - Time) which corresponds to the day of year identified at the end of a consistent downward trend in the NDVI time series; EOSN (End of Season - NDVI) corresponding to the NDVI value associated with EOST; MAXN (Maximum NDVI) which corresponds to the maximum NDVI value; MAXT (Time of Maximum) which is the day associated with MAXN; DUR (Duration) defined as the number of days between SOST and EOST; and AMP (Amplitude) which is the difference between MAXN and SOSN. This application provides all these metrics in a single step. Initially, the data points are smoothed using moving averages with five and three points. The eight metrics previously described are then obtained from the spline using numpy functions. In the present work, the developed toolbar was applied to MODerate resolution Imaging Spectroradiometer (MODIS) data covering a particular region of Portugal; the approach can be generally applied to other satellite data and study areas. The code is open and can be modified according to user requirements. Another advantage of publishing the plug-in and application code is that other users can improve the application.
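A minimal sketch of how the eight metrics listed above can be computed from one season of smoothed NDVI values; the smoothing window, the synthetic curve and the slope threshold used to define a "consistent" trend are illustrative assumptions, not the plug-in's exact implementation.

```python
# Eight phenological metrics from one season of smoothed NDVI. The "consistent
# trend" criterion is simplified to a minimum change of 0.005 per composite.
import numpy as np

doy = np.arange(1, 366, 8)                                   # acquisition days of year
ndvi = 0.25 + 0.45 * np.exp(-((doy - 200) / 60.0) ** 2)      # synthetic seasonal curve

smooth = np.convolve(ndvi, np.ones(3) / 3, mode="valid")     # simple moving average
d = doy[1:-1]                                                # days aligned with `smooth`

imax = int(np.argmax(smooth))
rising = np.where(np.diff(smooth[:imax + 1]) > 0.005)[0]       # sustained green-up
falling = np.where(np.diff(smooth[imax:]) < -0.005)[0] + imax  # sustained senescence
sos_i, eos_i = rising[0], falling[-1] + 1

metrics = {
    "SOST": d[sos_i], "SOSN": round(smooth[sos_i], 3),
    "EOST": d[eos_i], "EOSN": round(smooth[eos_i], 3),
    "MAXT": d[imax], "MAXN": round(smooth[imax], 3),
    "DUR": d[eos_i] - d[sos_i],
    "AMP": round(smooth[imax] - smooth[sos_i], 3),
}
print(metrics)
```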
Zhao, Yu Xi; Xie, Ping; Sang, Yan Fang; Wu, Zi Yi
2018-04-01
Hydrological process evaluation is temporally dependent. Hydrological time series that include dependence components do not meet the data consistency assumption for hydrological computation. Both of these factors cause great difficulty for water research. Given the existence of hydrological dependence variability, we proposed a correlation-coefficient-based method for significance evaluation of hydrological dependence based on an auto-regression model. By calculating the correlation coefficient between the original series and its dependence component and selecting reasonable thresholds of the correlation coefficient, this method divided the significance of dependence into five grades: no variability, weak variability, mid variability, strong variability, and drastic variability. By deducing the relationship between the correlation coefficient and the auto-correlation coefficients at each order of the series, we found that the correlation coefficient was mainly determined by the magnitude of the auto-correlation coefficients from order 1 to order p, which clarified the theoretical basis of this method. With the first-order and second-order auto-regression models as examples, the reasonability of the deduced formula was verified through Monte-Carlo experiments classifying the relationship between the correlation coefficient and the auto-correlation coefficients. This method was used to analyze three observed hydrological time series. The results indicated the coexistence of stochastic and dependence characteristics in hydrological processes.
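A minimal sketch of the correlation-coefficient diagnostic described above: an order-1 autoregression is fitted to a synthetic series, the fitted autoregressive part is taken as the dependence component, and its correlation with the original series is computed. The grading thresholds of the method are not reproduced.

```python
# Fit an AR(1) dependence component and correlate it with the original series;
# for an AR(1) process this correlation equals the lag-1 autocorrelation, which
# is consistent with the relationship described in the abstract above.
import numpy as np

rng = np.random.default_rng(6)
n = 1000
x = np.zeros(n)
for t in range(n - 1):                       # synthetic AR(1) "hydrological" series
    x[t + 1] = 0.7 * x[t] + rng.standard_normal()

phi = np.corrcoef(x[:-1], x[1:])[0, 1]       # lag-1 autocorrelation as AR(1) coefficient
dependence = phi * x[:-1]                    # fitted dependence component
r = np.corrcoef(x[1:], dependence)[0, 1]     # correlation with the original series
print("AR(1) coefficient:", round(phi, 3), "correlation with dependence component:", round(r, 3))
```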
Structural tailoring of counter rotation propfans
NASA Technical Reports Server (NTRS)
Brown, Kenneth W.; Hopkins, D. A.
1989-01-01
The STAT program was designed for the optimization of single rotation, tractor propfan designs. New propfan designs, however, generally consist of two counter rotating propfan rotors. STAT is constructed to contain two levels of analysis. An interior loop, consisting of accurate, efficient approximate analyses, is used to perform the primary propfan optimization. Once an optimum design has been obtained, a series of refined analyses are conducted. These analyses, while too computer time expensive for the optimization loop, are of sufficient accuracy to validate the optimized design. Should the design prove to be unacceptable, provisions are made for recalibration of the approximate analyses, for subsequent reoptimization.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arendt, Carli A.; Aciego, Sarah M.; Sims, Kenneth W. W.
The residence time of subglacial meltwater impacts aquifer recharge, nutrient production, and chemical signals that reflect underlying bedrock/substrate, but is inaccessible to direct observation. We report the seasonal evolution of subglacial meltwater chemistry from the 2011 melt season at the terminus of the Athabasca Glacier, Canada. We also measured major and trace analytes and U-series isotopes for twenty-nine bulk meltwater samples collected over the duration of the melt season. This dataset, which is the longest time-series record of (²³⁴U/²³⁸U) isotopes in a glacial meltwater system, provides insight into the hydrologic evolution of the subglacial system during active melting. Meltwater samples, measured from the outflow, were analyzed for (²³⁸U), (²²²Rn) and (²³⁴U/²³⁸U) activity, conductivity, alkalinity, pH and major cations. Subglacial meltwater varied in [²³⁸U] and (²²²Rn) from 23 to 832 ppt and 9 to 171 pCi/L, respectively. Activity ratios of (²³⁴U/²³⁸U) ranged from 1.003 to 1.040, with the highest (²³⁸U), (²²²Rn) and (²³⁴U/²³⁸U) activity values occurring in early May when delayed-flow basal meltwater composed a significant portion of the bulk melt. Furthermore, from the chemical evolution of the meltwater, we posit that the relative subglacial water residence times decrease over the course of the melt season. This decrease in qualitative residence time during active melt is consistent with prior field studies and model-predicted channel switching from a delayed, distributed network to a fast, channelized network flow. As such, our study provides support for linking U-series isotopes to storage lengths of meltwater beneath glacial systems as subglacial hydrologic networks evolve with increased melting and channel network efficiency.
Gleason, Jessie A; Fagliano, Jerald A
2015-10-01
Asthma is one of the most common chronic diseases affecting children. This study assesses the associations of ozone and fine particulate matter (PM2.5) with pediatric emergency department visits in the urban environment of Newark, NJ. Two study designs were utilized and evaluated for usability. We obtained daily emergency department visits among children aged 3-17 years with a primary diagnosis of asthma during April to September for 2004-2007. Both a time-stratified case-crossover study design with bi-directional control sampling and a time-series study design were utilized. Lagged effects (1-d through 5-d lag, 3-d average, and 5-d average) of ozone and PM2.5 were explored and a dose-response analysis comparing the bottom 5th percentile of 3-d average lag ozone with each 5 percentile increase was performed. Associations of interquartile range increase in same-day ozone were similar between the time-series and case-crossover study designs (RR = 1.08, 95% CI 1.04-1.12) and (OR = 1.10, 95% CI 1.06-1.14), respectively. Similar associations were seen for 1-day lag and 3-day average lag ozone levels. PM2.5 was not associated with the outcome in either study design. Dose-response assessment indicated a statistically significant and increasing association around 50-55 ppb consistent for both study designs. Ozone was statistically positively associated with pediatric asthma ED visits in Newark, NJ. Our results were generally comparable across the time-series and case-crossover study designs, indicating both are useful to assess local air pollution impacts.
Structural Time Series Model for El Niño Prediction
NASA Astrophysics Data System (ADS)
Petrova, Desislava; Koopman, Siem Jan; Ballester, Joan; Rodo, Xavier
2015-04-01
ENSO is a dominant feature of climate variability on inter-annual time scales destabilizing weather patterns throughout the globe, and having far-reaching socio-economic consequences. It does not only lead to extensive rainfall and flooding in some regions of the world, and anomalous droughts in others, thus ruining local agriculture, but also substantially affects the marine ecosystems and the sustained exploitation of marine resources in particular coastal zones, especially the Pacific South American coast. As a result, forecasting of ENSO and especially of the warm phase of the oscillation (El Niño/EN) has long been a subject of intense research and improvement. Thus, the present study explores a novel method for the prediction of the Niño 3.4 index. In the state-of-the-art the advantageous statistical modeling approach of Structural Time Series Analysis has not been applied. Therefore, we have developed such a model using a State Space approach for the unobserved components of the time series. Its distinguishing feature is that observations consist of various components - level, seasonality, cycle, disturbance, and regression variables incorporated as explanatory covariates. These components are aimed at capturing the various modes of variability of the N3.4 time series. They are modeled separately, then combined in a single model for analysis and forecasting. Customary statistical ENSO prediction models essentially use SST, SLP and wind stress in the equatorial Pacific. We introduce new regression variables - subsurface ocean temperature in the western equatorial Pacific, motivated by recent (Ramesh and Murtugudde, 2012) and classical research (Jin, 1997), (Wyrtki, 1985), showing that subsurface processes and heat accumulation there are fundamental for initiation of an El Niño event; and a southern Pacific temperature-difference tracer, the Rossbell dipole, leading EN by about nine months (Ballester, 2011).
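A generic sketch of a structural (unobserved-components) time series model with an exogenous regressor, using statsmodels as a stand-in; the variable names, the synthetic data and the lagged "subsurface" predictor are illustrative assumptions, and this is not the authors' model.

```python
# Generic structural time series sketch with statsmodels: local level +
# stochastic cycle + one exogenous regressor standing in for a lagged
# subsurface-temperature predictor. Data and names are synthetic placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 240                                                     # 20 years of monthly values
subsurface = np.sin(2 * np.pi * np.arange(n) / 48) + 0.2 * rng.standard_normal(n)
lagged = np.roll(subsurface, 9)                             # predictor leading the index by ~9 months
nino34 = 0.8 * lagged + 0.3 * rng.standard_normal(n)        # synthetic Nino 3.4-like index

model = sm.tsa.UnobservedComponents(
    nino34,
    level="local level",
    cycle=True, stochastic_cycle=True,
    exog=lagged.reshape(-1, 1),
)
result = model.fit(disp=False)
print(result.params)                                            # variance, cycle and regression terms
forecast = result.get_forecast(steps=6, exog=np.zeros((6, 1)))  # placeholder future regressor values
print(forecast.predicted_mean)
```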
NASA Astrophysics Data System (ADS)
Zheng, Jinde; Pan, Haiyang; Cheng, Junsheng
2017-02-01
To timely detect the incipient failure of rolling bearings and find the accurate fault location, a novel rolling bearing fault diagnosis method is proposed based on the composite multiscale fuzzy entropy (CMFE) and ensemble support vector machines (ESVMs). Fuzzy entropy (FuzzyEn), as an improvement of sample entropy (SampEn), is a new nonlinear method for measuring the complexity of time series. Since FuzzyEn (or SampEn) at a single scale cannot reflect the complexity effectively, multiscale fuzzy entropy (MFE) is developed by defining the FuzzyEns of coarse-grained time series, which represent the system dynamics at different scales. However, the MFE values will be affected by the data length, especially when the data are not long enough. By combining the information of multiple coarse-grained time series at the same scale, the CMFE algorithm is proposed in this paper to enhance MFE, as well as FuzzyEn. Compared with MFE, with increasing scale factor, CMFE obtains much more stable and consistent values for a short-term time series. In this paper CMFE is employed to measure the complexity of vibration signals of rolling bearings and is applied to extract the nonlinear features hidden in the vibration signals. The physical reasons why CMFE is suitable for rolling bearing fault diagnosis are also explored. Based on these, to fulfill an automatic fault diagnosis, the ensemble-SVM-based multi-classifier is constructed for the intelligent classification of fault features. Finally, the proposed rolling bearing fault diagnosis method is applied to experimental data analysis and the results indicate that the proposed method can effectively distinguish different fault categories and severities of rolling bearings.
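A simplified sketch of the composite coarse-graining idea behind CMFE combined with a compact fuzzy entropy; the membership function, parameters (m, r) and the random stand-in signal are common conventions used for illustration, not the paper's exact settings.

```python
# Simplified composite multiscale fuzzy entropy (CMFE) sketch: fuzzy entropy is
# computed on every offset of the coarse-grained series at each scale and then
# averaged. Parameters follow common conventions and are illustrative only.
import numpy as np

def fuzzy_entropy(x, m=2, r=0.15):
    r = r * np.std(x)
    def phi(dim):
        vecs = np.array([x[i:i + dim] for i in range(len(x) - dim)])
        vecs = vecs - vecs.mean(axis=1, keepdims=True)       # remove local baseline
        d = np.max(np.abs(vecs[:, None, :] - vecs[None, :, :]), axis=2)
        sim = np.exp(-(d ** 2) / r)                          # fuzzy membership function
        np.fill_diagonal(sim, 0.0)
        return sim.sum() / (len(vecs) * (len(vecs) - 1))
    return np.log(phi(m)) - np.log(phi(m + 1))

def cmfe(x, scale, m=2, r=0.15):
    ents = []
    for offset in range(scale):                              # composite coarse-graining
        trimmed = x[offset:offset + (len(x) - offset) // scale * scale]
        coarse = trimmed.reshape(-1, scale).mean(axis=1)
        ents.append(fuzzy_entropy(coarse, m, r))
    return float(np.mean(ents))

rng = np.random.default_rng(8)
signal = rng.standard_normal(1000)                           # stand-in for a vibration signal
print([round(cmfe(signal, s), 3) for s in range(1, 6)])
```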
NASA Astrophysics Data System (ADS)
Theodorsen, Audun; Garcia, Odd Erik; Kube, Ralph; Labombard, Brian; Terry, Jim
2017-10-01
In the far scrape-off layer (SOL), radial motion of filamentary structures leads to excess transport of particles and heat. Amplitudes and arrival times of these filaments have previously been studied by conditional averaging in single-point measurements from Langmuir probes and Gas Puff Imaging (GPI). Conditional averaging can be problematic: the cutoff for large amplitudes is mostly chosen by convention; the conditional windows used may influence the arrival time distribution; and the amplitudes cannot be separated from a background. Previous work has shown that SOL fluctuations are well described by a stochastic model consisting of a superposition of pulses with fixed shape and randomly distributed amplitudes and arrival times. The model can be formulated as a pulse shape convolved with a train of delta pulses. By choosing a pulse shape consistent with the power spectrum of the fluctuation time series, Richardson-Lucy deconvolution can be used to recover the underlying amplitudes and arrival times of the delta pulses. We apply this technique to both L- and H-mode GPI data from the Alcator C-Mod tokamak. The pulse arrival times are shown to be uncorrelated and uniformly distributed, consistent with a Poisson process, and the amplitude distribution has an exponential tail.
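A minimal sketch of the deconvolution step on synthetic data: delta pulses with exponential amplitudes are convolved with a fixed pulse shape, and plain Richardson-Lucy iterations recover an estimate of the underlying pulse train. The pulse shape, noise level and iteration count are illustrative assumptions.

```python
# Richardson-Lucy deconvolution sketch on synthetic data: delta pulses with
# exponential amplitudes are convolved with a fixed two-sided exponential pulse,
# a small positive noise floor is added, and RL iterations sharpen the estimate
# of the underlying pulse train.
import numpy as np

rng = np.random.default_rng(9)
n = 2000
truth = np.zeros(n)
arrivals = rng.choice(n, size=60, replace=False)             # uniformly distributed arrival times
truth[arrivals] = rng.exponential(1.0, size=60)              # exponentially distributed amplitudes

t = np.arange(-100, 101)
pulse = np.exp(-np.abs(t) / 10.0)                            # fixed pulse shape
pulse /= pulse.sum()

signal = np.convolve(truth, pulse, mode="same") + 1e-3 * rng.random(n)

estimate = np.full(n, signal.mean())                         # flat, positive starting guess
for _ in range(200):                                         # Richardson-Lucy iterations
    blurred = np.convolve(estimate, pulse, mode="same")
    ratio = signal / np.maximum(blurred, 1e-12)
    estimate *= np.convolve(ratio, pulse[::-1], mode="same")

share = estimate[arrivals].sum() / estimate.sum()
print("fraction of recovered mass at the true arrival samples:", round(share, 2))
```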
NASA Astrophysics Data System (ADS)
Baret, F.; Weiss, M.; Lacaze, R.; Camacho, F.; Smets, B.; Pacholczyk, P.; Makhmara, H.
2010-12-01
LAI and fAPAR are recognized as Essential Climate Variables providing key information for the understanding and modeling of canopy functioning. Global remote sensing observations at medium resolution have been routinely acquired since the 1980s, mainly with the AVHRR, SEAWIFS, VEGETATION, MODIS and MERIS sensors. Several operational products have been derived and provide global maps of LAI and fAPAR at daily to monthly time steps. Inter-comparison between MODIS, CYCLOPES, GLOBCARBON and JRC-FAPAR products showed generally consistent seasonality, while large differences in magnitude and smoothness may be observed. One of the objectives of the GEOLAND2 European project is to develop such core products to be used in a range of application services including carbon monitoring. Rather than generating an additional product from scratch, version 1 of the GEOLAND2 products capitalized on existing products by combining them to retain their pros and limit their cons. For these reasons, MODIS and CYCLOPES products were selected since they both include LAI and fAPAR while having relatively close temporal sampling intervals (8 to 10 days). GLOBCARBON products were not used here because their monthly time step is too long, inducing large uncertainties in the seasonality description. JRC-FAPAR was likewise not selected, to preserve better consistency between LAI and fAPAR products. MODIS and CYCLOPES products were then linearly combined to take advantage of the good performances of CYCLOPES products for low to medium values of LAI and fAPAR while benefiting from the better MODIS performances for the highest LAI values. A training database representative of the global variability of vegetation type and conditions was thus built. A back-propagation neural network was then calibrated to estimate the new LAI and fAPAR products from VEGETATION preprocessed observations. Similarly, the vegetation cover fraction (fCover) was also derived by scaling the original CYCLOPES fCover products. Validation results achieved following the principles proposed by CEOS-LPV show that the new product called GEOV1 behaves as expected with good performances over the whole range of LAI and fAPAR in a temporally smooth and spatially consistent manner. These products will be processed and delivered by VITO in near real time at 1 km spatial resolution and 10-day frequency using a pre-operational production quality tracking system. The entire VEGETATION archive, from 1999 onwards, will be processed to provide a consistent time series over both VEGETATION sensors at the same spatial and temporal sampling. A climatology of products computed over the VEGETATION period will also be delivered at the same spatial and temporal sampling, showing average values, between-year variability and possible trends over the decade. Finally, the VEGETATION-derived time series starting in 1999 will be completed with consistent products at 4 km spatial resolution derived from the NOAA/AVHRR series to cover the 1981-2010 period.
Zago, Myrka; Bosco, Gianfranco; Maffei, Vincenzo; Iosa, Marco; Ivanenko, Yuri P; Lacquaniti, Francesco
2004-04-01
Prevailing views on how we time the interception of a moving object assume that the visual inputs are informationally sufficient to estimate the time-to-contact from the object's kinematics. Here we present evidence in favor of a different view: the brain makes the best estimate about target motion based on measured kinematics and an a priori guess about the causes of motion. According to this theory, a predictive model is used to extrapolate time-to-contact from expected dynamics (kinetics). We projected a virtual target moving vertically downward on a wide screen with different randomized laws of motion. In the first series of experiments, subjects were asked to intercept this target by punching a real ball that fell hidden behind the screen and arrived in synchrony with the visual target. Subjects systematically timed their motor responses consistent with the assumption of gravity effects on an object's mass, even when the visual target did not accelerate. With training, the gravity model was not switched off but adapted to nonaccelerating targets by shifting the time of motor activation. In the second series of experiments, there was no real ball falling behind the screen. Instead, the subjects were required to intercept the visual target by clicking a mouse button. In this case, subjects timed their responses consistent with the assumption of uniform motion in the absence of forces, even when the target actually accelerated. Overall, the results are in accord with the theory that motor responses evoked by visual kinematics are modulated by a prior of the target dynamics. The prior appears surprisingly resistant to modifications based on performance errors.
Validation of the CHIRPS Satellite Rainfall Estimates over Eastern of Africa
NASA Astrophysics Data System (ADS)
Dinku, T.; Funk, C. C.; Tadesse, T.; Ceccato, P.
2017-12-01
Long and temporally consistent rainfall time series are essential in climate analyses and applications. Rainfall data from station observations are inadequate over many parts of the world due to sparse or non-existent observation networks, or limited reporting of gauge observations. As a result, satellite rainfall estimates have been used as an alternative or as a supplement to station observations. However, many satellite-based rainfall products with long time series suffer from coarse spatial and temporal resolutions and inhomogeneities caused by variations in satellite inputs. There are some satellite rainfall products with reasonably consistent time series, but they are often limited to specific geographic areas. The Climate Hazards Group Infrared Precipitation (CHIRP) and CHIRP combined with station observations (CHIRPS) are recently produced satellite-based rainfall products with relatively high spatial and temporal resolutions and quasi-global coverage. In this study, CHIRP and CHIRPS were evaluated over East Africa at daily, dekadal (10-day) and monthly time scales. The evaluation was done by comparing the satellite products with rain gauge data from about 1200 stations, an unprecedented number of validation stations for this region. The results provide a unique region-wide understanding of how satellite products perform over different climatic/geographic regions (lowlands, mountainous regions, and coastal areas). The CHIRP and CHIRPS products were also compared with two similar satellite rainfall products: the African Rainfall Climatology version 2 (ARC2) and the latest release of the Tropical Applications of Meteorology using Satellite data (TAMSAT). The results show that both CHIRP and CHIRPS products are significantly better than ARC2, with higher skill and low or no bias. These products were also found to be slightly better than the latest version of the TAMSAT product. A comparison was also done between the latest release of the TAMSAT product (TAMSAT3) and the earlier version (TAMSAT2), which showed that the latest version is a substantial improvement over the previous one, particularly with regard to the bias statistics.
Estimating urban vegetation fraction across 25 cities in pan-Pacific using Landsat time series data
NASA Astrophysics Data System (ADS)
Lu, Yuhao; Coops, Nicholas C.; Hermosilla, Txomin
2017-04-01
Urbanization globally is consistently reshaping the natural landscape to accommodate the growing human population. Urban vegetation plays a key role in moderating environmental impacts caused by urbanization and is critically important for local economic, social and cultural development. The differing patterns of human population growth and varying urban structures and development stages result in highly varied spatial and temporal vegetation patterns, particularly in the pan-Pacific region, which has some of the fastest urbanization rates globally. Yet spatially explicit temporal information on the amount and change of urban vegetation is rarely documented, particularly in less developed nations. Remote sensing offers an exceptional data source and a unique perspective to map urban vegetation and change due to its consistency and ubiquitous nature. In this research, we assess the vegetation fractions of 25 cities across 12 pan-Pacific countries using annual gap-free Landsat surface reflectance products acquired from 1984 to 2012, using sub-pixel, spectral unmixing approaches. Vegetation change trends were then analyzed using Mann-Kendall statistics and Theil-Sen slope estimators. Unmixing results successfully mapped urban vegetation for pixels located in urban parks, forested mountainous regions, as well as agricultural land (correlation coefficient ranging from 0.66 to 0.77). The greatest vegetation loss from 1984 to 2012 was found in Shanghai, Tianjin, and Dalian in China. In contrast, cities including Vancouver (Canada) and Seattle (USA) showed stable vegetation trends through time. Using temporal trend analysis, our results suggest that it is possible to reduce noise and outliers caused by phenological changes, particularly in cropland, using dense new Landsat time series approaches. We conclude that simple yet effective approaches of unmixing Landsat time series data for assessing spatial and temporal changes of urban vegetation at regional scales can provide critical information for urban planners and anthropogenic studies globally.
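A minimal sketch of the trend statistics named above, a Theil-Sen slope and a Kendall tau test as a Mann-Kendall-style significance check, applied to a synthetic annual vegetation-fraction series; scipy is assumed to be available.

```python
# Theil-Sen slope for the magnitude of the vegetation-fraction change and a
# Kendall tau test as a Mann-Kendall-style significance check, on synthetic
# annual vegetation fractions for one pixel.
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
years = np.arange(1984, 2013)
veg_fraction = 0.55 - 0.004 * (years - 1984) + 0.02 * rng.standard_normal(years.size)

slope, intercept, lo, hi = stats.theilslopes(veg_fraction, years)
tau, p_value = stats.kendalltau(years, veg_fraction)
print(f"Theil-Sen slope: {slope:.4f} per year (95% CI {lo:.4f} to {hi:.4f})")
print(f"Kendall tau: {tau:.2f}, p = {p_value:.3f}")
```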
Fractal Dimension Analysis of Transient Visual Evoked Potentials: Optimisation and Applications.
Boon, Mei Ying; Henry, Bruce Ian; Chu, Byoung Sun; Basahi, Nour; Suttle, Catherine May; Luu, Chi; Leung, Harry; Hing, Stephen
2016-01-01
The visual evoked potential (VEP) provides a time series signal response to an external visual stimulus at the location of the visual cortex. The major VEP signal components, peak latency and amplitude, may be affected by disease processes. Additionally, the VEP contains fine detailed and non-periodic structure, of presently unclear relevance to normal function, which may be quantified using the fractal dimension. The purpose of this study is to provide a systematic investigation of the key parameters in the measurement of the fractal dimension of VEPs, to develop an optimal analysis protocol for application. VEP time series were mathematically transformed using delay time, τ, and embedding dimension, m, parameters. The fractal dimension of the transformed data was obtained from a scaling analysis based on straight line fits to the numbers of pairs of points with separation less than r versus log(r) in the transformed space. Optimal τ, m, and scaling analysis were obtained by comparing the consistency of results using different sampling frequencies. The optimised method was then piloted on samples of normal and abnormal VEPs. Consistent fractal dimension estimates were obtained using τ = 4 ms, designating the fractal dimension = D2 of the time series based on embedding dimension m = 7 (for 3606 Hz and 5000 Hz), m = 6 (for 1803 Hz) and m = 5 (for 1000Hz), and estimating D2 for each embedding dimension as the steepest slope of the linear scaling region in the plot of log(C(r)) vs log(r) provided the scaling region occurred within the middle third of the plot. Piloting revealed that fractal dimensions were higher from the sampled abnormal than normal achromatic VEPs in adults (p = 0.02). Variances of fractal dimension were higher from the abnormal than normal chromatic VEPs in children (p = 0.01). A useful analysis protocol to assess the fractal dimension of transformed VEPs has been developed.
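A minimal sketch of the delay-embedding and correlation-sum (Grassberger-Procaccia style) procedure underlying a D2 estimate; the synthetic signal, the radius range and the sample-based delay are illustrative and do not reproduce the study's optimized protocol.

```python
# Delay-embedding / correlation-sum sketch: embed the series with delay tau and
# dimension m, count pairs closer than r, and read the slope of log C(r) versus
# log r as a D2 estimate. Data are synthetic, not a recorded VEP.
import numpy as np

def embed(x, m, tau):
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(m)])

def correlation_sum(vectors, radii):
    d = np.linalg.norm(vectors[:, None, :] - vectors[None, :, :], axis=2)
    dists = d[np.triu_indices(len(vectors), k=1)]            # distinct pairs only
    return np.array([(dists < r).mean() for r in radii])

t = np.linspace(0, 1, 800)                                   # stand-in for a sampled VEP trace
x = np.sin(2 * np.pi * 7 * t) + 0.5 * np.sin(2 * np.pi * 13 * t + 1.0)

vectors = embed(x, m=7, tau=4)                               # tau in samples here, not milliseconds
radii = np.logspace(-1.5, 0, 12)
C = correlation_sum(vectors, radii)
slope = np.polyfit(np.log(radii[C > 0]), np.log(C[C > 0]), 1)[0]
print("estimated correlation dimension D2 ~", round(slope, 2))
```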
Control of molt in birds: association with prolactin and gonadal regression in starlings.
Dawson, Alistair
2006-07-01
Despite the importance of molt to birds, very little is known about its environmental or physiological control. In starlings Sturnus vulgaris, and other species, under both natural conditions and experimental regimes, gonadal regression coincides with peak prolactin secretion. The prebasic molt starts at the same time. The aim of this series of experiments was to keep starlings on photo-schedules that would challenge the normally close relationship between gonadal regression and molt, to determine how closely the start of molt is associated with gonadal regression and/or associated changes in prolactin concentrations. In one series of experiments, photosensitive starlings were moved from a short photoperiod, 8 h light per day (8L), to 13 or 18L, and from 13 to 18L or 13 to 8L during testicular maturation. Later, photorefractory birds under 13L that had finished molting were moved to 18L. In another series of experiments, photorefractory starlings were moved from 18 to 8L for 7 weeks, 4 weeks, 2 weeks, 1 week, 3 days, 1 day, or 0 days, before being returned to 18L. There was no consistent relationship between photoperiod, or the increase in photoperiod, and the timing of the start of molt. Nor was there a consistent relationship between gonadal regression and the start of molt; molt could be triggered in the absence of a gonadal cycle. However, there was always an association between the start of molt and prolactin. In all cases where molt was induced, there had been an earlier increase in prolactin. However, the timing of molt was related to the time of peak prolactin, not the magnitude of that peak. This relationship between peak prolactin and the start of molt could explain the normally close relationship between the end of breeding activity and the start of molt.
Battery Charge Equalizer with Transformer Array
NASA Technical Reports Server (NTRS)
Davies, Francis
2013-01-01
High-power batteries generally consist of a series connection of many cells or cell banks. In order to maintain high performance over battery life, it is desirable to keep the state of charge of all the cell banks equal. A method provides individual charging for battery cells in a large, high-voltage battery array with a minimum number of transformers while maintaining reasonable efficiency. This is designed to augment a simple high-current charger that supplies the main charge energy. The innovation will form part of a larger battery charge system. It consists of a transformer array connected to the battery array through rectification and filtering circuits. The transformer array is connected to a drive circuit and a timing and control circuit that allow individual battery cells or cell banks to be charged. The timing circuit and control circuit connect to a charge controller that uses battery instrumentation to determine which battery bank to charge. It is important to note that the innovation can charge an individual cell bank at the same time that the main battery charger is charging the high-voltage battery. The fact that the battery cell banks are at a non-zero voltage, and that they are all at similar voltages, can be used to allow charging of individual cell banks. A set of transformers can be connected with secondary windings in series to make weighted sums of the voltages on the primaries.
Temporal changes and sexual differences in spatial distribution of Burbot in Lake Erie
Stapanian, Martin A.; Witzel, Larry D.; Cook, Andy
2013-01-01
We used GIS mapping techniques to examine capture data for Burbot Lota lota from annual gill-net surveys in Canadian waters of Lake Erie during late August and September 1994–2011. Adult males were captured over a larger area (3–17% for ≥20% maximum yearly catch [MYC]) than adult females. More males than females were caught in the gill nets in 14 of the 15 study years. Collectively, these results support a hypothesis of greater activity by adult males during summer, when Burbot are actively feeding. The area of capture contracted by more than 60% (for ≥20% MYC) for both sexes during the time period, which is consistent with the documented decrease of the Burbot population in the lake. The sex ratio (females: males) varied over the time series but declined steadily from 0.97 in 2001 to 0.59 in 2011. The overlap in the capture areas of adult males and females was scale dependent. The depth distribution at which adult Burbot were caught did not change over the time series, and there was no difference in the median depths (about 30 m) at which adult male and female Burbot were caught. The last results are consistent with the Burbot's reliance on coldwater habitats. Additional research is recommended, including telemetry to describe daily and seasonal movements and assessment of gender bias in active and passive capture gear.
IDSP- INTERACTIVE DIGITAL SIGNAL PROCESSOR
NASA Technical Reports Server (NTRS)
Mish, W. H.
1994-01-01
The Interactive Digital Signal Processor, IDSP, consists of a set of time series analysis "operators" based on the various algorithms commonly used for digital signal analysis work. The processing of a digital time series to extract information is usually achieved by the application of a number of fairly standard operations. However, it is often desirable to "experiment" with various operations and combinations of operations to explore their effect on the results. IDSP is designed to provide an interactive and easy-to-use system for this type of digital time series analysis. The IDSP operators can be applied in any sensible order (even recursively), and can be applied to single time series or to simultaneous time series. IDSP is being used extensively to process data obtained from scientific instruments onboard spacecraft. It is also an excellent teaching tool for demonstrating the application of time series operators to artificially-generated signals. IDSP currently includes over 43 standard operators. Processing operators provide for Fourier transformation operations, design and application of digital filters, and Eigenvalue analysis. Additional support operators provide for data editing, display of information, graphical output, and batch operation. User-developed operators can be easily interfaced with the system to provide for expansion and experimentation. Each operator application generates one or more output files from an input file. The processing of a file can involve many operators in a complex application. IDSP maintains historical information as an integral part of each file so that the user can display the operator history of the file at any time during an interactive analysis. IDSP is written in VAX FORTRAN 77 for interactive or batch execution and has been implemented on a DEC VAX-11/780 operating under VMS. The IDSP system generates graphics output for a variety of graphics systems. The program requires the use of Versaplot and Template plotting routines and IMSL Math/Library routines. These software packages are not included in IDSP. The virtual memory requirement for the program is approximately 2.36 MB. The IDSP system was developed in 1982 and was last updated in 1986. Versaplot is a registered trademark of Versatec Inc. Template is a registered trademark of Template Graphics Software Inc. IMSL Math/Library is a registered trademark of IMSL Inc.
W. Cohen; H. Andersen; S. Healey; G. Moisen; T. Schroeder; C. Woodall; G. Domke; Z. Yang; S. Stehman; R. Kennedy; C. Woodcock; Z. Zhu; J. Vogelmann; D. Steinwand; C. Huang
2014-01-01
The authors are developing a REDD+ MRV system that tests different biomass estimation frameworks and components. Design-based inference from a costly field plot network was compared to sampling with LiDAR strips and a smaller set of plots in combination with Landsat for disturbance monitoring. Biomass estimation uncertainties associated with these different data sets...
Assessment and maintenance of a 15 year old stress-laminated timber bridge
T. Russell Gentry; Karl N. Brohammer; John Wells; James P. Wacker
2006-01-01
A timber bridge consisting of three 6.7 meter spans with a stress laminated deck was constructed in 1991 in the Spirit Creek State Forest near Augusta, Georgia, USA. The stress laminated bridge uses a series of post-tensioning bars to hold the laminations together. The bridge remained in service until 2001 with no maintenance, at which time the bridge was inspected,...
The Use of Ion Implantation for Materials Processing.
1980-10-06
consists of a series of sections, each section being an annular insulator (glass) and a shaped metal electrode (polished aluminum) cemented together. A...depending on the ion species, semiconductor material, attached materials (such as aluminum leads), implantation energy, and dose; but some devices are...concentration of subsurface carbon. Appearing directly beneath the oxide layer, the C concentration first reaches a maximum of about five times the bulk
ERIC Educational Resources Information Center
Butner, Jonathan; Story, T. Nathan; Berg, Cynthia A.; Wiebe, Deborah J.
2011-01-01
Temporal patterning in blood glucose (BG) consistent with fractals--how BG follows a repetitive pattern through resolutions of time--was used to examine 2 different samples of adolescents with Type 1 diabetes (10-14 years). Sample 1 contained 10 adolescents with long time series for accurate estimations of long-term dependencies associated with…
Spatial Soliton Interactions for Photonic Switching. Part I
2000-03-07
technique, a fully vectorial, first-order nonlinear wave equation that consistently includes terms two orders beyond the slowly-varying amplitude, slowly...by using two tunable mode-locked Er-doped fiber lasers," in Conference on Optical Fiber Communications, OSA Technical Digest Series, vol. 4, 1994...instead, based on optical logic gates. In addition, optical logic could be used for contention resolution, real-time encryption/decryption, and other
Eocene volcanism and the origin of horizon A
Gibson, T.G.; Towe, K.M.
1971-01-01
A series of closely time-equivalent deposits that correlate with seismic reflector horizon A exists along the coast of eastern North America. These sediments of Late-Early to Early-Middle Eocene age contain an authigenic mineral suite indicative of the alteration of volcanic glass. A volcanic origin for these siliceous deposits onshore is consistent with a volcanic origin for the cherts of horizon A offshore.
Impact of Sensor Degradation on the MODIS NDVI Time Series
NASA Technical Reports Server (NTRS)
Wang, Dongdong; Morton, Douglas Christopher; Masek, Jeffrey; Wu, Aisheng; Nagol, Jyoteshwar; Xiong, Xiaoxiong; Levy, Robert; Vermote, Eric; Wolfe, Robert
2012-01-01
Time series of satellite data provide unparalleled information on the response of vegetation to climate variability. Detecting subtle changes in vegetation over time requires consistent satellite-based measurements. Here, the impact of sensor degradation on trend detection was evaluated using Collection 5 data from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensors on the Terra and Aqua platforms. For Terra MODIS, the impact of blue band (Band 3, 470 nm) degradation on simulated surface reflectance was most pronounced at near-nadir view angles, leading to a 0.001-0.004 yr-1 decline in Normalized Difference Vegetation Index (NDVI) under a range of simulated aerosol conditions and surface types. Observed trends in MODIS NDVI over North America were consistent with simulated results, with nearly a threefold difference in negative NDVI trends derived from Terra (17.4%) and Aqua (6.7%) MODIS sensors during 2002-2010. Planned adjustments to Terra MODIS calibration for Collection 6 data reprocessing will largely eliminate this negative bias in detection of NDVI trends.
Murray, Jessica R.; Svarc, Jerry L.
2017-01-01
The U.S. Geological Survey Earthquake Science Center collects and processes Global Positioning System (GPS) data throughout the western United States to measure crustal deformation related to earthquakes and tectonic processes as part of a long‐term program of research and monitoring. Here, we outline data collection procedures and present the GPS dataset built through repeated temporary deployments since 1992. This dataset consists of observations at ∼1950 locations. In addition, this article details our data processing and analysis procedures, which consist of the following. We process the raw data collected through temporary deployments, in addition to data from continuously operating western U.S. GPS stations operated by multiple agencies, using the GIPSY software package to obtain position time series. Subsequently, we align the positions to a common reference frame, determine the optimal parameters for a temporally correlated noise model, and apply this noise model when carrying out time‐series analysis to derive deformation measures, including constant interseismic velocities, coseismic offsets, and transient postseismic motion.
Why didn't Box-Jenkins win (again)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pack, D.J.; Downing, D.J.
This paper focuses on the forecasting performance of the Box-Jenkins methodology applied to the 111 time series of the Makridakis competition. It considers the influence of the following factors: (1) time series length, (2) time-series information (autocorrelation) content, (3) time-series outliers or structural changes, (4) averaging results over time series, and (5) forecast time origin choice. It is found that the 111 time series contain substantial numbers of very short series, series with obvious structural change, and series whose histories are relatively uninformative. If these series are typical of those that one must face in practice, the real message of the competition is that univariate time series extrapolations will frequently fail regardless of the methodology employed to produce them.
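For readers unfamiliar with the methodology being evaluated, the following is a minimal sketch of a Box-Jenkins style univariate extrapolation, assuming the statsmodels package is available; the series is synthetic and the ARIMA order (1,1,1) is an illustrative choice, not a recommendation from the paper.

```python
# Minimal sketch, assuming statsmodels: fit an ARIMA model to a short,
# weakly informative synthetic series and extrapolate a few steps ahead.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
y = np.cumsum(rng.normal(0.1, 1.0, 120))    # short random-walk-like series

model = ARIMA(y, order=(1, 1, 1)).fit()     # Box-Jenkins: identify, estimate, check
forecast = model.forecast(steps=6)          # 6-step-ahead extrapolation
print(forecast)
```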
Deriving crop calendar using NDVI time-series
NASA Astrophysics Data System (ADS)
Patel, J. H.; Oza, M. P.
2014-11-01
Agricultural intensification is defined here in terms of cropping intensity, i.e. the number of crops (single, double or triple) grown per year in a unit of cropland. Information about the crop calendar (the number of crops in a parcel of land, their planting and harvesting dates, and the date of peak vegetative stage) is essential for proper management of agriculture. Remote sensing sensors provide a regular, consistent and reliable measurement of vegetation response at various growth stages of a crop and are therefore ideally suited for monitoring purposes. The spectral response of vegetation, as measured by the Normalized Difference Vegetation Index (NDVI) and its profiles, can provide a new dimension for describing the vegetation growth cycle. Analysis based on NDVI values at regular time intervals provides useful information about the various crop growth stages and the performance of the crop in a season. However, the NDVI data series has a considerable amount of local fluctuation in the time domain and needs to be smoothed so that the dominant seasonal behavior is enhanced. Based on temporal analysis of the smoothed NDVI series, it is possible to extract the number of crop cycles per year and their crop calendar. In the present study, a methodology is developed to extract key elements of the crop growth cycle (i.e. the number of crops per year and their planting, peak and harvesting dates). This is illustrated by analysing a MODIS-NDVI data series for one agricultural year (June 2012 to May 2013) over Gujarat. Such an analysis is very useful for analysing the dynamics of kharif and rabi crops.
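A minimal sketch of the smooth-then-detect idea is given below; the NDVI profile, the Savitzky-Golay smoothing window and the peak-detection thresholds are all hypothetical stand-ins, not the thresholds used in the study.

```python
# Minimal sketch (synthetic data, hypothetical thresholds): smooth a noisy NDVI
# profile and extract the number of crop cycles and their peak dates.
import numpy as np
from scipy.signal import savgol_filter, find_peaks

t = np.arange(0, 365, 16)                                 # 16-day composites over one agricultural year
ndvi = 0.25 + 0.35 * np.exp(-((t - 90) / 35) ** 2) \
            + 0.30 * np.exp(-((t - 270) / 30) ** 2)        # two synthetic crop cycles (kharif, rabi)
ndvi += np.random.default_rng(2).normal(0, 0.02, t.size)   # local fluctuations

smooth = savgol_filter(ndvi, window_length=7, polyorder=2) # suppress noise, keep the seasonal shape
peaks, _ = find_peaks(smooth, height=0.4, distance=5)      # peak vegetative stages

print("crops per year:", len(peaks))
print("peak day-of-year:", t[peaks])
```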
Multifractal analysis of visibility graph-based Ito-related connectivity time series.
Czechowski, Zbigniew; Lovallo, Michele; Telesca, Luciano
2016-02-01
In this study, we investigate multifractal properties of connectivity time series resulting from the visibility graph applied to normally distributed time series generated by Ito equations with multiplicative power-law noise. We show that the multifractality of the connectivity time series (i.e., the series of the numbers of links outgoing from each node) increases with the exponent of the power-law noise. The multifractality of the connectivity time series could be due to the width of the connectivity degree distribution, which can be related to the exit time of the associated Ito time series. Furthermore, the connectivity time series are characterized by persistence, although the original Ito time series are random; this is due to the visibility graph procedure that, by connecting the values of the time series, generates persistence but destroys most of the nonlinear correlations. Moreover, the visibility graph is sensitive to wide "depressions" in the input time series.
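For concreteness, the sketch below builds the natural visibility graph of a time series and returns its degree (connectivity) series; the input here is plain Gaussian noise rather than an Ito process, and the brute-force O(n^2) scan is only suitable for short illustrative series.

```python
# Minimal sketch: natural visibility graph and its connectivity (degree) series.
import numpy as np

def visibility_degree_series(y):
    n = len(y)
    degree = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            # i and j are mutually visible if no intermediate sample blocks the line of sight
            visible = all(
                y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                degree[i] += 1
                degree[j] += 1
    return degree

y = np.random.default_rng(3).normal(size=200)   # toy input; not an Ito-generated series
k_series = visibility_degree_series(y)          # the "connectivity time series"
print(k_series[:10])
```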
NASA Astrophysics Data System (ADS)
Wang, Zhuosen; Schaaf, Crystal B.; Sun, Qingsong; Kim, JiHyun; Erb, Angela M.; Gao, Feng; Román, Miguel O.; Yang, Yun; Petroy, Shelley; Taylor, Jeffrey R.; Masek, Jeffrey G.; Morisette, Jeffrey T.; Zhang, Xiaoyang; Papuga, Shirley A.
2017-07-01
Seasonal vegetation phenology can significantly alter surface albedo which in turn affects the global energy balance and the albedo warming/cooling feedbacks that impact climate change. To monitor and quantify the surface dynamics of heterogeneous landscapes, high temporal and spatial resolution synthetic time series of albedo and the enhanced vegetation index (EVI) were generated from the 500 m Moderate Resolution Imaging Spectroradiometer (MODIS) operational Collection V006 daily BRDF/NBAR/albedo products and 30 m Landsat 5 albedo and near-nadir reflectance data through the use of the Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM). The traditional Landsat Albedo (Shuai et al., 2011) makes use of the MODIS BRDF/Albedo products (MCD43) by assigning appropriate BRDFs from coincident MODIS products to each Landsat image to generate a 30 m Landsat albedo product for that acquisition date. The available cloud free Landsat 5 albedos (due to clouds, generated every 16 days at best) were used in conjunction with the daily MODIS albedos to determine the appropriate 30 m albedos for the intervening daily time steps in this study. These enhanced daily 30 m spatial resolution synthetic time series were then used to track albedo and vegetation phenology dynamics over three Ameriflux tower sites (Harvard Forest in 2007, Santa Rita in 2011 and Walker Branch in 2005). These Ameriflux sites were chosen as they are all quite near new towers coming on line for the National Ecological Observatory Network (NEON), and thus represent locations which will be served by spatially paired albedo measures in the near future. The availability of data from the NEON towers will greatly expand the sources of tower albedometer data available for evaluation of satellite products. At these three Ameriflux tower sites the synthetic time series of broadband shortwave albedos were evaluated using the tower albedo measurements with a Root Mean Square Error (RMSE) less than 0.013 and a bias within the range of ±0.006. These synthetic time series provide much greater spatial detail than the 500 m gridded MODIS data, especially over more heterogeneous surfaces, which improves the efforts to characterize and monitor the spatial variation across species and communities. The mean of the difference between maximum and minimum synthetic time series of albedo within the MODIS pixels over a subset of satellite data of Harvard Forest (16 km by 14 km) was as high as 0.2 during the snow-covered period and reduced to around 0.1 during the snow-free period. Similarly, we have used STARFM to also couple MODIS Nadir BRDF Adjusted Reflectance (NBAR) values with Landsat 5 reflectances to generate daily synthetic time series of NBAR and thus Enhanced Vegetation Index (NBAR-EVI) at a 30 m resolution. While normally STARFM is used with directional reflectances, the use of the view angle corrected daily MODIS NBAR values will provide more consistent time series. These synthetic time series of EVI are shown to capture seasonal vegetation dynamics with finer spatial and temporal details, especially over heterogeneous land surfaces.
Detection of deformation time-series in Miyake-jima using PALSAR/InSAR
NASA Astrophysics Data System (ADS)
Ozawa, T.; Ueda, H.
2010-12-01
Volcano deformation is often complicated in time and space, so deformation mapping by InSAR is useful for understanding it in detail. However, InSAR is affected by atmospheric, ionospheric and other noise, and we can therefore miss important temporal changes of deformation of a few cm. We thus aim to develop an InSAR time-series analysis that detects volcano deformation precisely. Generally, an area of 10×10 km, which covers a typical volcano, is included in several SAR scenes obtained from different orbits or observation modes. First, interferograms are generated for each orbit path. In the InSAR processing, atmospheric noise is reduced using a simulation from a numerical weather model. Long-wavelength noise due to orbit error and ionospheric disturbance is corrected by adjusting to the GPS deformation time series, assuming it to be a plane. Next, we estimate the deformation time series from the obtained interferograms. Radar incidence directions differ for each orbit path, but those for observation modes with 34.3° and 41.5° off-nadir angles lie almost in one plane, so the slant-range change for all orbit paths can be described by the horizontal and vertical components within that plane. We then invert for these components for all epochs with the constraint that the temporal change of deformation is smooth, and simultaneously estimate the DEM error. As a case study, we present an application to Miyake-jima. Miyake-jima is a volcanic island located 200 km south of Tokyo, and a large amount of volcanic gas has been ejected since the 2000 eruption. Crustal deformation associated with this volcanic activity has been observed by continuous GPS observations, but its distribution is complicated; we therefore applied this method to detect a precise deformation time series. At most GPS sites, the obtained time series were in good agreement with the GPS time series, and the root-mean-square of the residuals was less than 1 cm. However, a temporal step of deformation was estimated in 2008 that is not consistent with the GPS time series; we think the effect of an orbit maneuver in 2008 has appeared, and improving the treatment of such noise is one of the next subjects. In the obtained deformation map, contraction around the caldera and uplift along the north-west-south coast were found. It is obvious that this deformation pattern cannot be explained by a single inflation or deflation source, and its interpretation is also one of the next subjects. In the caldera bottom, subsidence of 14 cm/yr was found. Though its subsidence rate was constant until 2008, it decelerated to 20 cm/yr from 2009; furthermore, the subsidence rate in 2010 was 3 cm/yr. Around the same time, low-frequency earthquakes increased just under the caldera. We therefore speculate that the deceleration of subsidence may be directly related to the volcanic activity. Although the result shows the volcano deformation in detail, some mis-estimations remain. We believe that this InSAR time-series analysis is useful, but further improvements are necessary.
Hydrodynamic measurements in Suisun Bay, California, 1992-93
Gartner, Jeffrey W.; Burau, Jon R.
1999-01-01
Sea level, velocity, temperature, and salinity (conductivity and temperature) data collected in Suisun Bay, California, from December 11, 1992, through May 31, 1993, by the U.S. Geological Survey are documented in this report. Sea-level data were collected at four locations and temperature and salinity data were collected at seven locations. Velocity data were collected at three locations using acoustic Doppler current profilers and at four other locations using point velocity meters. Sea-level and velocity data are presented in three forms (1) harmonic analysis results, (2) time-series plots (sea level, current speed, and current direction versus time), and (3) time-series plots of the low-pass filtered data. Temperature and salinity data are presented as plots of raw and low-pass filtered time series. The velocity and salinity data collected during this study document a period when the residual current patterns and salt field were significantly altered by large Delta outflow (three peaks in excess of 2,000 cubic meters per second). Residual current profiles were consistently seaward with magnitudes that fluctuated primarily in concert with Delta outflow and secondarily with the spring-neap tide cycle. The freshwater inputs advected salinity seaward of Suisun Bay for most of this study. Except for a 10-day period at the beginning of the study, dynamically significant salinities (>2) were seaward of Suisun Bay, which resulted in little or no gravitational circulation transport.
Land use intensity trajectories on Amazonian pastures derived from Landsat time series
NASA Astrophysics Data System (ADS)
Rufin, Philippe; Müller, Hannes; Pflugmacher, Dirk; Hostert, Patrick
2015-09-01
Monitoring changes in land use intensity of grazing systems in the Amazon is an important prerequisite to study the complex political and socio-economic forces driving Amazonian deforestation. Remote sensing offers the potential to map pasture vegetation over large areas, but mapping pasture conditions consistently through time is not a trivial task because of seasonal changes associated with phenology and data gaps from clouds and cloud shadows. In this study, we tested spectral-temporal metrics derived from intra-annual Landsat time series to distinguish between grass-dominated and woody pastures. The abundance of woody vegetation on pastures is an indicator of management intensity, since the duration and intensity of land use steer secondary succession rates, apart from climate and soil conditions. We used the developed Landsat-based metrics to analyze pasture intensity trajectories between 1985 and 2012 in Novo Progresso, Brazil, finding that woody vegetation cover generally decreased after four to ten years of grazing activity. Pastures established in the 1980s and early 1990s showed a higher fraction of woody vegetation during their initial land use history than pastures established in the early 2000s. Historic intensity trajectories suggested a trend towards more intensive land use in the last decade, which aligns well with regional environmental policies and market dynamics. This study demonstrates the potential of dense Landsat time series to monitor land-use intensification on Amazonian pastures.
Wan, Huafang; Cui, Yixin; Ding, Yijuan; Mei, Jiaqin; Dong, Hongli; Zhang, Wenxin; Wu, Shiqi; Liang, Ying; Zhang, Chunyu; Li, Jiana; Xiong, Qing; Qian, Wei
2016-01-01
Understanding the regulation of lipid metabolism is vital for genetic engineering of canola (Brassica napus L.) to increase oil yield or modify oil composition. We conducted time-series analyses of transcriptomes and proteomes to uncover the molecular networks associated with oil accumulation and the dynamic changes in these networks in canola. The expression levels of genes and proteins were measured at 2, 4, 6, and 8 weeks after pollination (WAP). Our results show that the biosynthesis of fatty acids is a dominant cellular process from 2 to 6 WAP, while degradation mainly happens after 6 WAP. We found that genes in almost every node of the fatty acid synthesis pathway were significantly up-regulated during oil accumulation. Moreover, significant expression changes of two genes, acetyl-CoA carboxylase and acyl-ACP desaturase, were detected at both the transcriptomic and proteomic levels. We confirmed the temporal expression patterns revealed by the transcriptomic analyses using quantitative real-time PCR experiments. The gene set association analysis shows that the biosynthesis of fatty acids and unsaturated fatty acids are the most significant biological processes from 2-4 WAP and 4-6 WAP, respectively, which is consistent with the results of the time-series analyses. These results not only provide insight into the mechanisms underlying lipid metabolism, but also reveal novel candidate genes that are worth further investigation for their value in the genetic engineering of canola.
Initial Validation of NDVI time series from AVHRR, VEGETATION, and MODIS
NASA Technical Reports Server (NTRS)
Morisette, Jeffrey T.; Pinzon, Jorge E.; Brown, Molly E.; Tucker, Jim; Justice, Christopher O.
2004-01-01
The paper will address Theme 7: Multi-sensor opportunities for VEGETATION. We present analysis of a long-term vegetation record derived from three moderate resolution sensors: AVHRR, VEGETATION, and MODIS. While empirically based manipulation can ensure agreement between the three data sets, there is a need to validate the series. This paper uses atmospherically corrected ETM+ data available over the EOS Land Validation Core Sites as an independent data set with which to compare the time series. We use ETM+ data from 15 globally distributed sites, 7 of which contain repeat coverage in time. These high-resolution data are compared to the values of each sensor by spatially aggregating the ETM+ to each specific sensor's spatial coverage. The aggregated ETM+ value provides a point estimate for a specific site on a specific date. The standard deviation of that point estimate is used to construct a confidence interval for that point estimate. The values from each moderate resolution sensor are then evaluated with respect to that confidence interval. Results show that AVHRR, VEGETATION, and MODIS data can be combined to assess temporal uncertainties and address data continuity issues and that the atmospherically corrected ETM+ data provide an independent source with which to compare that record. The final product is a consistent time series climate record that links historical observations to current and future measurements.
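The aggregation-and-interval check can be illustrated with a minimal sketch; all numbers below are synthetic placeholders, and the normal-approximation interval is a generic choice rather than the exact construction used in the paper.

```python
# Minimal sketch (synthetic numbers): aggregate fine-resolution ETM+ NDVI to a
# coarse pixel, build a confidence interval for that point estimate, and check
# whether the coarse-sensor value falls inside it.
import numpy as np

rng = np.random.default_rng(4)
etm_ndvi = rng.normal(0.62, 0.05, size=33 * 33)    # 30 m pixels inside one ~1 km footprint

mean = etm_ndvi.mean()
se = etm_ndvi.std(ddof=1) / np.sqrt(etm_ndvi.size)
ci = (mean - 1.96 * se, mean + 1.96 * se)          # ~95% interval for the aggregated estimate

coarse_value = 0.60                                # e.g. an AVHRR/VEGETATION/MODIS NDVI for the same site and date
print(ci, ci[0] <= coarse_value <= ci[1])
```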
Ip, Ryan H L; Li, W K; Leung, Kenneth M Y
2013-09-15
Large scale environmental remediation projects applied to sea water always involve large amounts of capital investment. Rigorous effectiveness evaluations of such projects are, therefore, necessary and essential for policy review and future planning. This study aims at investigating the effectiveness of environmental remediation using three different Seemingly Unrelated Regression (SUR) time series models with intervention effects, including Model (1) assuming no correlation within and across variables, Model (2) assuming no correlation across variables but allowing correlations within a variable across different sites, and Model (3) allowing all possible correlations among variables (i.e., an unrestricted model). The results suggested that the unrestricted SUR model is the most reliable one, consistently having the smallest variations of the estimated model parameters. We discussed our results with reference to marine water quality management in Hong Kong while bringing managerial issues into consideration. Copyright © 2013 Elsevier Ltd. All rights reserved.
Limits to detection of generalized synchronization in delay-coupled chaotic oscillators.
Kato, Hideyuki; Soriano, Miguel C; Pereda, Ernesto; Fischer, Ingo; Mirasso, Claudio R
2013-12-01
We study how reliably generalized synchronization can be detected and characterized from time-series analysis. To that end, we analyze synchronization, in a generalized sense, of delay-coupled chaotic oscillators in unidirectional ring configurations. The generalized synchronization condition can be verified via the auxiliary system approach; however, in practice, this might not always be possible. Therefore, in this study, widely used indicators to directly quantify generalized and phase synchronization from noise-free time series of two oscillators are employed complementarily to the auxiliary system approach. In our analysis, none of the indices consistently reproduces the results of the auxiliary system approach. Our findings indicate that it is a major challenge to directly detect synchronization in a generalized sense between two oscillators that are connected via a chain of other oscillators, even if the oscillators are identical. This has major consequences for the interpretation of the dynamics of coupled systems and applications thereof.
NASA Technical Reports Server (NTRS)
Penland, Cecile; Ghil, Michael; Weickmann, Klaus M.
1991-01-01
The spectral resolution and statistical significance of a harmonic analysis obtained by low-order MEM can be improved by subjecting the data to an adaptive filter. This adaptive filter consists of projecting the data onto the leading temporal empirical orthogonal functions obtained from singular spectrum analysis (SSA). The combined SSA-MEM method is applied both to a synthetic time series and to a time series of AAM data. The procedure is very effective when the background noise is white and less so when the background noise is red; the latter is the case for the AAM data. Nevertheless, reliable evidence for intraseasonal and interannual oscillations in AAM is detected. The interannual periods include a quasi-biennial one and a low-frequency one of 5 years, both related to the El Niño/Southern Oscillation. In the intraseasonal band, separate oscillations of about 48.5 and 51 days are ascertained.
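A minimal sketch of the SSA prefiltering step is given below: embed the series in a trajectory matrix, take its SVD, and reconstruct the series from the leading components before any harmonic (MEM) analysis. The window length, the number of retained components and the synthetic signal are illustrative choices, not those of the paper.

```python
# Minimal SSA sketch: reconstruct a series from its leading temporal EOFs.
import numpy as np

def ssa_reconstruct(x, window=40, n_components=4):
    n = len(x)
    k = n - window + 1
    X = np.column_stack([x[i:i + window] for i in range(k)])    # trajectory matrix (window x k)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
    rec = np.zeros(n)                                           # diagonal averaging back to a series
    counts = np.zeros(n)
    for i in range(window):
        for j in range(k):
            rec[i + j] += Xr[i, j]
            counts[i + j] += 1
    return rec / counts

t = np.arange(600)
x = np.sin(2 * np.pi * t / 48.5) + 0.8 * np.sin(2 * np.pi * t / 51) \
    + np.random.default_rng(5).normal(0, 1.0, t.size)           # two close intraseasonal periods in white noise
filtered = ssa_reconstruct(x)                                    # this prefiltered series would then feed MEM
```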
An astronomer's guide to period searching
NASA Astrophysics Data System (ADS)
Schwarzenberg-Czerny, A.
2003-03-01
We concentrate on the analysis of unevenly sampled time series, interrupted by periodic gaps, as often encountered in astronomy. While some of our conclusions may appear surprising, all are based on the classical statistical principles of Fisher and his successors. Except for the discussion of resolution issues, it is best for the reader to forget temporarily about Fourier transforms and to concentrate on the problem of fitting a time series with a model curve. According to their statistical content we divide the issues into several sections, consisting of: (ii) statistical and numerical aspects of model fitting; (iii) evaluation of fitted models as hypothesis testing; (iv) the role of orthogonal models in signal detection; (v) conditions for equivalence of periodograms; and (vi) rating sensitivity by test power. An experienced observer working with individual objects would benefit little from a formalized statistical approach. However, we demonstrate the usefulness of this approach in evaluating the performance of periodograms and in the quantitative design of large variability surveys.
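One standard realisation of the model-fitting view of period searching is the Lomb-Scargle periodogram, sketched below on a synthetic, gapped series; the sampling pattern, the trial period grid and the injected period are all hypothetical.

```python
# Minimal sketch: period search on an unevenly sampled, gapped series,
# treated as least-squares fitting of sinusoids (Lomb-Scargle periodogram).
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(6)
t = np.sort(rng.uniform(0, 200, 300))
t = t[(t % 50) < 35]                                  # periodic gaps, e.g. seasonal visibility
y = np.sin(2 * np.pi * t / 7.3) + rng.normal(0, 0.5, t.size)

periods = np.linspace(2, 30, 2000)
omega = 2 * np.pi / periods                           # angular trial frequencies
power = lombscargle(t, y - y.mean(), omega, normalize=True)
print("best period:", periods[np.argmax(power)])      # should recover ~7.3
```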
Zhang, Ridong; Tao, Jili; Lu, Renquan; Jin, Qibing
2018-02-01
Modeling of distributed parameter systems is difficult because of their nonlinearity and infinite-dimensional characteristics. Based on principal component analysis (PCA), a hybrid modeling strategy that consists of a decoupled linear autoregressive exogenous (ARX) model and a nonlinear radial basis function (RBF) neural network model is proposed. The spatial-temporal output is first decomposed into a few dominant spatial basis functions and finite-dimensional temporal series by PCA. Then, a decoupled ARX model is designed to model the linear dynamics of the dominant modes of the time series. The nonlinear residual part is subsequently parameterized by RBFs, where a genetic algorithm is utilized to optimize the hidden layer structure and the parameters. Finally, the nonlinear spatial-temporal dynamic system is obtained after the time/space reconstruction. Simulation results of a catalytic rod and a heat conduction equation demonstrate the effectiveness of the proposed strategy compared to several other methods.
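The first step of such a strategy, the PCA decomposition of a space-time field into spatial basis functions and temporal coefficient series, can be sketched as follows; the synthetic field and the number of retained modes are illustrative assumptions.

```python
# Minimal sketch of the PCA step: SVD of a synthetic space-time field to get
# dominant spatial basis functions and their temporal coefficient series
# (the low-dimensional series that the ARX + RBF models would then capture).
import numpy as np

rng = np.random.default_rng(7)
x = np.linspace(0, 1, 50)                      # spatial grid (e.g. along a catalytic rod)
t = np.arange(400)                             # time steps
field = (np.outer(np.sin(np.pi * x), np.sin(0.05 * t)) +
         0.3 * np.outer(np.sin(2 * np.pi * x), np.cos(0.11 * t)) +
         0.05 * rng.normal(size=(50, 400)))    # space x time data matrix

U, s, Vt = np.linalg.svd(field - field.mean(axis=1, keepdims=True), full_matrices=False)
n_modes = 2
spatial_modes = U[:, :n_modes]                           # dominant spatial basis functions
temporal_series = s[:n_modes, None] * Vt[:n_modes]       # finite-dimensional temporal series
print(spatial_modes.shape, temporal_series.shape)
```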
[Time series studies of air pollution by fires and the effects on human health].
do Carmo, Cleber Nascimento; Hacon, Sandra de Souza
2013-11-01
Burnoffs (intentional fires for agricultural purposes) and forest fires of large proportions have been observed in various regions of the planet. Exposure to high levels of air pollutants emitted by fires can be responsible for various harmful effects on human health. This article summarizes the literature that uses a time series approach to estimate the acute effects on human health of air pollution from fires in the regions of the planet with the highest number of fires, and attempts to identify gaps in knowledge. The study consisted of a narrative review in which the characteristics of the selected studies were grouped by the regions of the planet with the highest incidence of burnoffs: Amazon, America, Australia and Asia. The results revealed a large number of studies in Australia, few studies in the Amazon, and great heterogeneity in the reported significant effects on human health.
The Wonders of Physics Outreach Program
NASA Astrophysics Data System (ADS)
Sprott, J. C.; Mirus, K. A.; Newman, D. E.; Watts, C.; Feeley, R. E.; Fernandez, E.; Fontana, P. W.; Krajewski, T.; Lovell, T. W.; Oliva, S.; Stoneking, M. R.; Thomas, M. A.; Jaimison, W.; Maas, K.; Milbrandt, R.; Mullman, K.; Narf, S.; Nesnidal, R.; Nonn, P.
1996-11-01
One important step toward public education about fusion energy is to first elevate the public's appreciation of science in general. Toward this end, the Wonders of Physics program was started at the University of Wisconsin-Madison in 1984 as a public lecture and demonstration series in an attempt to stem a growing tide of science illiteracy and to bolster the public's perception of the scientific enterprise. Since that time, it has grown into a public outreach endeavor which consists of a traveling demonstration show, educational pamphlets, videos, software, a website (http://sprott.physics.wisc.edu/wop.htm), and the annual public lecture demonstration series including tours highlighting the Madison Symmetric Torus and departmental facilities. The presentation has been made about 400 times to a total audience in excess of 50,000. Sample educational materials and Lecture Kits will be available at the poster session. Currently at Oak Ridge National Laboratories. Currently at Max Planck Institut fuer Plasmaphysik. *Currently at Johnson Controls.
Homeostasis and Gauss statistics: barriers to understanding natural variability.
West, Bruce J
2010-06-01
In this paper, the concept of knowledge is argued to be the top of a three-tiered system of science. The first tier is that of measurement and data, followed by information consisting of the patterns within the data, and ending with theory that interprets the patterns and yields knowledge. Thus, when a scientific theory ceases to be consistent with the database, the knowledge based on that theory must be re-examined and potentially modified. Consequently, all knowledge, like glory, is transient. Herein we focus on the non-normal statistics of physiologic time series and conclude that the empirical inverse power-law statistics and long-time correlations are inconsistent with the theoretical notion of homeostasis. We suggest replacing the notion of homeostasis with that of Fractal Physiology.
Detection of generalized synchronization using echo state networks
NASA Astrophysics Data System (ADS)
Ibáñez-Soria, D.; Garcia-Ojalvo, J.; Soria-Frisch, A.; Ruffini, G.
2018-03-01
Generalized synchronization between coupled dynamical systems is a phenomenon of relevance in applications that range from secure communications to physiological modelling. Here, we test the capabilities of reservoir computing and, in particular, echo state networks for the detection of generalized synchronization. A nonlinear dynamical system consisting of two coupled Rössler chaotic attractors is used to generate temporal series consisting of time-locked generalized synchronized sequences interleaved with unsynchronized ones. Correctly tuned, echo state networks are able to efficiently discriminate between unsynchronized and synchronized sequences even in the presence of relatively high levels of noise. Compared to other state-of-the-art techniques of synchronization detection, the online capabilities of the proposed Echo State Network based methodology make it a promising choice for real-time applications aiming to monitor dynamical synchronization changes in continuous signals.
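The sketch below is a heavily simplified illustration of the reservoir-computing idea, not the authors' implementation: an echo state network driven by one signal is trained to reproduce another, and a low readout error is taken as an indicator of a functional, synchronization-like relation. The signals are toy sinusoid-based proxies rather than coupled Rössler systems, and the reservoir size, scalings and ridge parameter are arbitrary assumptions.

```python
# Minimal echo-state-network sketch: predict a "response" signal from a "drive"
# signal with a randomly connected reservoir and a ridge-regression readout.
import numpy as np

rng = np.random.default_rng(8)
T = 2000
u = np.sin(0.1 * np.arange(T))                         # drive signal
y = np.tanh(1.5 * u) + 0.01 * rng.normal(size=T)       # response: a static function of the drive

N = 200                                                # reservoir size
Win = rng.uniform(-0.5, 0.5, size=(N, 1))
W = rng.normal(size=(N, N))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()          # rescale to spectral radius 0.9

states = np.zeros((T, N))
x = np.zeros(N)
for t in range(1, T):
    x = np.tanh(Win[:, 0] * u[t] + W @ x)              # leaky-free state update
    states[t] = x

washout = 200
X, target = states[washout:], y[washout:]
ridge = 1e-6
Wout = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ target)   # ridge readout

error = np.sqrt(np.mean((X @ Wout - target) ** 2))
print("readout RMSE:", error)    # small error -> response is predictable from the drive
```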
Distinguishing signatures of determinism and stochasticity in spiking complex systems
Aragoneses, Andrés; Rubido, Nicolás; Tiana-Alsina, Jordi; Torrent, M. C.; Masoller, Cristina
2013-01-01
We describe a method to infer signatures of determinism and stochasticity in the sequence of apparently random intensity dropouts emitted by a semiconductor laser with optical feedback. The method uses ordinal time-series analysis to classify experimental data of inter-dropout intervals (IDIs) into two categories that display statistically significant differences in their features. Despite the apparent randomness of the dropout events, one IDI category is consistent with waiting times in a resting state until noise triggers a dropout, and the other is consistent with dropouts occurring during the return to the resting state, which have a clear deterministic component. The method we describe can be a powerful tool for inferring signatures of determinism in the dynamics of complex systems in noisy environments, at an event-level description of their dynamics.
Regenerating time series from ordinal networks.
McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael
2017-03-01
Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discrete sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series by using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.
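A minimal sketch of the idea follows: encode a series into ordinal patterns, build the ordinal (transition) network, and regenerate a surrogate by a random walk on that network. The toy series, embedding dimension, and the crude resampling of amplitudes per pattern are assumptions of this sketch; the published method handles the value mapping more carefully.

```python
# Minimal sketch: ordinal network construction and random-walk regeneration.
import numpy as np
from itertools import permutations

rng = np.random.default_rng(9)
x = np.sin(0.3 * np.arange(3000)) + 0.3 * rng.normal(size=3000)   # toy chaotic-series stand-in

m = 3                                                              # ordinal pattern dimension
patterns = {p: i for i, p in enumerate(permutations(range(m)))}
symbols = np.array([patterns[tuple(np.argsort(x[i:i + m]))] for i in range(len(x) - m + 1)])

K = len(patterns)
counts = np.zeros((K, K))
for a, b in zip(symbols[:-1], symbols[1:]):
    counts[a, b] += 1
P = counts / np.maximum(counts.sum(axis=1, keepdims=True), 1)      # transition probabilities

walk = [int(symbols[0])]
for _ in range(1000):                                               # random walk on the ordinal network
    row = P[walk[-1]]
    walk.append(int(rng.choice(K, p=row)) if row.sum() > 0 else int(rng.choice(K)))

values_per_symbol = {s: x[:len(symbols)][symbols == s] for s in range(K)}
surrogate = np.array([rng.choice(values_per_symbol[s]) for s in walk if len(values_per_symbol[s])])
print(surrogate[:10])
```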
Methods of Reverberation Mapping. I. Time-lag Determination by Measures of Randomness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chelouche, Doron; Pozo-Nuñez, Francisco; Zucker, Shay, E-mail: doron@sci.haifa.ac.il, E-mail: francisco.pozon@gmail.com, E-mail: shayz@post.tau.ac.il
A class of methods for measuring time delays between astronomical time series is introduced in the context of quasar reverberation mapping, which is based on measures of randomness or complexity of the data. Several distinct statistical estimators are considered that do not rely on polynomial interpolations of the light curves nor on their stochastic modeling, and do not require binning in correlation space. Methods based on von Neumann's mean-square successive-difference estimator are found to be superior to those using other estimators. An optimized von Neumann scheme is formulated, which better handles sparsely sampled data and outperforms current implementations of discrete correlation function methods. This scheme is applied to existing reverberation data of varying quality, and consistency with previously reported time delays is found. In particular, the size-luminosity relation of the broad-line region in quasars is recovered with a scatter comparable to that obtained by other works, yet with fewer assumptions made concerning the process underlying the variability. The proposed method for time-lag determination is particularly relevant for irregularly sampled time series, and in cases where the process underlying the variability cannot be adequately modeled.
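The core of a von Neumann style lag estimator can be sketched as follows: for each trial delay, the two (standardised) light curves are merged on a common time axis, and the mean-square successive difference of the combined, time-ordered curve is computed; the delay that minimises this randomness measure is the lag estimate. The synthetic light curves, the z-score flux scaling and the brute-force lag grid are assumptions of this sketch, and the published optimized scheme is not reproduced here.

```python
# Minimal sketch of a von Neumann-style time-lag estimator on synthetic,
# irregularly sampled light curves.
import numpy as np

rng = np.random.default_rng(10)

def smooth_signal(t):                          # slowly varying toy continuum light curve
    return np.sin(0.05 * t) + 0.5 * np.sin(0.013 * t + 1.0)

t1 = np.sort(rng.uniform(0, 500, 120))
t2 = np.sort(rng.uniform(0, 500, 100))
true_lag = 20.0
a = smooth_signal(t1) + 0.05 * rng.normal(size=t1.size)             # "continuum"
b = smooth_signal(t2 - true_lag) + 0.05 * rng.normal(size=t2.size)  # delayed "line" proxy

def von_neumann(t, f):
    f = f[np.argsort(t)]
    return np.mean(np.diff(f) ** 2)            # mean-square successive difference

def estimate_lag(trial_lags):
    za = (a - a.mean()) / a.std()
    zb = (b - b.mean()) / b.std()
    scores = [von_neumann(np.concatenate([t1, t2 - lag]),
                          np.concatenate([za, zb])) for lag in trial_lags]
    return trial_lags[int(np.argmin(scores))]

lags = np.linspace(-50, 50, 201)
print("estimated lag:", estimate_lag(lags))     # should be close to 20
```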
GPS Position Time Series @ JPL
NASA Technical Reports Server (NTRS)
Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen
2013-01-01
Different flavors of GPS time series analysis at JPL all use the same GPS Precise Point Positioning raw time series; variations in time series analysis and post-processing are driven by different users: JPL Global Time Series/Velocities (researchers studying the reference frame, combining with VLBI/SLR/DORIS); JPL/SOPAC Combined Time Series/Velocities (crustal deformation for tectonic, volcanic, and ground water studies); and ARIA Time Series/Coseismic Data Products (hazard monitoring and response focused). The ARIA data system is designed to integrate GPS and InSAR: GPS tropospheric delay is used for correcting InSAR, and Caltech's GIANT time series analysis uses GPS to correct orbital errors in InSAR. Zhen Liu is talking tomorrow on InSAR time series analysis.
NASA Astrophysics Data System (ADS)
García, Constantino A.; Otero, Abraham; Félix, Paulo; Presedo, Jesús; Márquez, David G.
2018-07-01
In the past few decades, it has been recognized that 1/f fluctuations are ubiquitous in nature. The most widely used mathematical models to capture the long-term memory properties of 1/f fluctuations have been stochastic fractal models. However, physical systems do not usually consist of just stochastic fractal dynamics, but often also show some degree of deterministic behavior. The present paper proposes a model based on fractal stochastic and deterministic components that can provide a valuable basis for the study of complex systems with long-term correlations. The fractal stochastic component is assumed to be a fractional Brownian motion process and the deterministic component is assumed to be a band-limited signal. We also provide a method that, under the assumptions of this model, is able to characterize the fractal stochastic component and to provide an estimate of the deterministic components present in a given time series. The method is based on a Bayesian wavelet shrinkage procedure that exploits the self-similar properties of the fractal processes in the wavelet domain. This method has been validated on simulated signals and on real signals of economic and biological origin. The real examples illustrate how our model may be useful for exploring the deterministic-stochastic duality of complex systems, and for uncovering interesting patterns present in time series.
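As a rough illustration of wavelet-domain separation, the sketch below applies a simple universal-threshold soft shrinkage to a synthetic signal; this is a non-Bayesian stand-in for the shrinkage procedure described above, it assumes the PyWavelets package is available, and the signal and wavelet choices are arbitrary.

```python
# Minimal sketch, assuming PyWavelets: recover a band-limited deterministic
# component from a noisy series by soft-thresholding wavelet coefficients.
import numpy as np
import pywt

rng = np.random.default_rng(11)
n = 2048
t = np.arange(n)
deterministic = np.sin(2 * np.pi * t / 256) + 0.5 * np.sin(2 * np.pi * t / 90)
signal = deterministic + np.cumsum(rng.normal(0, 0.05, n))      # crude 1/f-like stochastic component added

coeffs = pywt.wavedec(signal, 'db4', level=6)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745                  # noise scale from the finest detail level
thr = sigma * np.sqrt(2 * np.log(n))                            # universal threshold
shrunk = [coeffs[0]] + [pywt.threshold(c, thr, mode='soft') for c in coeffs[1:]]
estimate = pywt.waverec(shrunk, 'db4')[:n]
print(np.corrcoef(estimate, deterministic)[0, 1])               # agreement with the deterministic part
```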
Ahn, Yeong Hee; Lee, Yeon Jung; Kim, Sung Ho
2015-01-01
This study describes an MS-based analysis method for monitoring changes in polymer composition during the polyaddition polymerization reaction of toluene diisocyanate (TDI) and ethylene glycol (EG). The polymerization was monitored as a function of reaction time using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI TOF MS). The resulting series of polymer adducts terminated with various end-functional groups were precisely identified and the relative compositions of those series were estimated. A new MALDI MS data interpretation method was developed, consisting of a peak-resolving algorithm for overlapping peaks in MALDI MS spectra, a retrosynthetic analysis for the generation of reduced unit mass peaks, and a Gaussian fit-based selection of the most prominent polymer series among the reconstructed unit mass peaks. This method of data interpretation avoids errors originating from side reactions due to the presence of trace water in the reaction mixture or MALDI analysis. Quantitative changes in the relative compositions of the resulting polymer products were monitored as a function of reaction time. These results demonstrate that the mass data interpretation method described herein can be a powerful tool for estimating quantitative changes in the compositions of polymer products arising during a polymerization reaction.
Improving cluster-based missing value estimation of DNA microarray data.
Brás, Lígia P; Menezes, José C
2007-06-01
We present a modification of the weighted K-nearest neighbours imputation method (KNNimpute) for missing values (MVs) estimation in microarray data based on the reuse of estimated data. The method was called iterative KNN imputation (IKNNimpute) as the estimation is performed iteratively using the recently estimated values. The estimation efficiency of IKNNimpute was assessed under different conditions (data type, fraction and structure of missing data) by the normalized root mean squared error (NRMSE) and the correlation coefficients between estimated and true values, and compared with that of other cluster-based estimation methods (KNNimpute and sequential KNN). We further investigated the influence of imputation on the detection of differentially expressed genes using SAM by examining the differentially expressed genes that are lost after MV estimation. The performance measures give consistent results, indicating that the iterative procedure of IKNNimpute can enhance the prediction ability of cluster-based methods in the presence of high missing rates, in non-time series experiments and in data sets comprising both time series and non-time series data, because the information of the genes having MVs is used more efficiently and the iterative procedure allows refining the MV estimates. More importantly, IKNN has a smaller detrimental effect on the detection of differentially expressed genes.
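The iterative-reuse idea can be sketched as follows (this is not the authors' code): start from row-mean imputation, then repeatedly re-estimate each missing entry from the K most similar genes, reusing previously imputed values. The data matrix, missing-value rate, K and iteration count are all hypothetical.

```python
# Minimal iterative-KNN imputation sketch on a small synthetic gene matrix.
import numpy as np

rng = np.random.default_rng(12)
data = rng.normal(size=(200, 20))                    # genes x conditions
mask = rng.random(data.shape) < 0.1                  # 10% missing values
obs = np.where(mask, np.nan, data)

def iknn_impute(x, k=10, n_iter=5):
    filled = np.where(np.isnan(x), np.nanmean(x, axis=1, keepdims=True), x)
    missing_rows, missing_cols = np.where(np.isnan(x))
    for _ in range(n_iter):                          # each pass reuses the latest estimates
        for i, j in zip(missing_rows, missing_cols):
            d = np.linalg.norm(filled - filled[i], axis=1)       # distance to every gene
            d[i] = np.inf
            neighbours = np.argsort(d)[:k]
            w = 1.0 / (d[neighbours] + 1e-12)
            filled[i, j] = np.sum(w * filled[neighbours, j]) / w.sum()  # weighted neighbour average
    return filled

imputed = iknn_impute(obs)
nrmse = np.sqrt(np.mean((imputed[mask] - data[mask]) ** 2)) / data[mask].std()
print("NRMSE:", nrmse)
```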
NASA Astrophysics Data System (ADS)
Max-Moerbeck, W.; Richards, J. L.; Hovatta, T.; Pavlidou, V.; Pearson, T. J.; Readhead, A. C. S.
2014-11-01
We present a practical implementation of a Monte Carlo method to estimate the significance of cross-correlations in unevenly sampled time series of data, whose statistical properties are modelled with a simple power-law power spectral density. This implementation builds on published methods; we introduce a number of improvements in the normalization of the cross-correlation function estimate and a bootstrap method for estimating the significance of the cross-correlations. A closely related matter is the estimation of a model for the light curves, which is critical for the significance estimates. We present a graphical and quantitative demonstration that uses simulations to show how common it is to get high cross-correlations for unrelated light curves with steep power spectral densities. This demonstration highlights the dangers of interpreting them as signs of a physical connection. We show that by using interpolation and the Hanning sampling window function we are able to reduce the effects of red-noise leakage and to recover steep simple power-law power spectral densities. We also introduce the use of a Neyman construction for the estimation of the errors in the power-law index of the power spectral density. This method provides a consistent way to estimate the significance of cross-correlations in unevenly sampled time series of data.
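The danger highlighted above, that unrelated red-noise light curves often show high cross-correlations, can be illustrated with a minimal sketch: simulate pairs of unrelated curves with a power-law power spectral density (random Fourier phases, in the spirit of Timmer & Koenig 1995) and build the null distribution of the peak correlation. Regular sampling is assumed here for brevity, unlike the unevenly sampled case treated in the paper, and the spectral index and sample sizes are arbitrary.

```python
# Minimal Monte Carlo sketch: null distribution of peak cross-correlation
# between unrelated light curves with steep power-law PSDs.
import numpy as np

rng = np.random.default_rng(13)

def powerlaw_lightcurve(n, beta):
    freqs = np.fft.rfftfreq(n, d=1.0)
    amp = np.zeros_like(freqs)
    amp[1:] = freqs[1:] ** (-beta / 2)                     # PSD ~ f^-beta
    phases = rng.uniform(0, 2 * np.pi, freqs.size)
    lc = np.fft.irfft(amp * np.exp(1j * phases), n)
    return (lc - lc.mean()) / lc.std()

def peak_crosscorr(a, b):
    cc = np.correlate(a, b, mode="full") / len(a)
    return np.max(np.abs(cc))

n, beta = 256, 2.5
null_peaks = [peak_crosscorr(powerlaw_lightcurve(n, beta),
                             powerlaw_lightcurve(n, beta)) for _ in range(500)]
print("95th percentile of peak |CCF| for unrelated red-noise curves:",
      np.percentile(null_peaks, 95))                       # often surprisingly high for steep PSDs
```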
Toward a comprehensive landscape vegetation monitoring framework
NASA Astrophysics Data System (ADS)
Kennedy, Robert; Hughes, Joseph; Neeti, Neeti; Larrue, Tara; Gregory, Matthew; Roberts, Heather; Ohmann, Janet; Kane, Van; Kane, Jonathan; Hooper, Sam; Nelson, Peder; Cohen, Warren; Yang, Zhiqiang
2016-04-01
Blossoming Earth observation resources provide a great opportunity to better understand land vegetation dynamics, but also require new techniques and frameworks to exploit their potential. Here, I describe several parallel projects that leverage time-series Landsat imagery to describe vegetation dynamics at regional and continental scales. At the core of these projects are the LandTrendr algorithms, which distill time-series earth observation data into periods of consistent long- or short-duration dynamics. In one approach, we built an integrated, empirical framework to blend these algorithmically-processed time-series data with field data and lidar data to ascribe yearly change in forest biomass across the US states of Washington, Oregon, and California. In a separate project, we expanded from forest-only monitoring to full landscape land cover monitoring over the same regional scale, including both categorical class labels and continuous-field estimates. In these and other projects, we apply machine-learning approaches to ascribe all changes in vegetation to driving processes such as harvest, fire, urbanization, etc., allowing full description of both disturbance and recovery processes and drivers. Finally, we are moving toward extension of these same techniques to continental and eventually global scales using Google Earth Engine. Taken together, these approaches provide one framework for describing and understanding processes of change in vegetation communities at broad scales.
Miró, Conrado; Baeza, Antonio; Madruga, María J; Periañez, Raul
2012-11-01
The objective of this work consisted of analysing the spatial and temporal evolution of two radionuclide concentrations in the Tagus River. Time-series analysis techniques and numerical modelling have been used in this study. (137)Cs and (90)Sr concentrations have been measured from 1994 to 1999 at several sampling points in Spain and Portugal. These radionuclides have been introduced into the river by the liquid releases from several nuclear power plants in Spain, as well as from global fallout. Time-series analysis techniques have allowed the determination of radionuclide transit times along the river, and have also pointed out the existence of temporal cycles of radionuclide concentrations at some sampling points, which are attributed to water management in the reservoirs placed along the Tagus River. A stochastic dispersion model, in which transport with water, radioactive decay and water-sediment interactions are solved through Monte Carlo methods, has been developed. Model results are, in general, in reasonable agreement with measurements. The model has finally been applied to the calculation of mean ages of radioactive content in water and sediments in each reservoir. This kind of model can be a very useful tool to support the decision-making process after an eventual emergency situation. Copyright © 2012 Elsevier Ltd. All rights reserved.
Waldner, François; Hansen, Matthew C; Potapov, Peter V; Löw, Fabian; Newby, Terence; Ferreira, Stefanus; Defourny, Pierre
2017-01-01
The lack of sufficient ground truth data has always constrained supervised learning, thereby hindering the generation of up-to-date satellite-derived thematic maps. This is all the more true for those applications requiring frequent updates over large areas such as cropland mapping. Therefore, we present a method enabling the automated production of spatially consistent cropland maps at the national scale, based on spectral-temporal features and outdated land cover information. Following an unsupervised approach, this method extracts reliable calibration pixels based on their labels in the outdated map and their spectral signatures. To ensure spatial consistency and coherence in the map, we first propose to generate seamless input images by normalizing the time series and deriving spectral-temporal features that target salient cropland characteristics. Second, we reduce the spatial variability of the class signatures by stratifying the country and by classifying each stratum independently. Finally, we remove speckle with a weighted majority filter accounting for per-pixel classification confidence. Capitalizing on a wall-to-wall validation data set, the method was tested in South Africa using a 16-year old land cover map and multi-sensor Landsat time series. The overall accuracy of the resulting cropland map reached 92%. A spatially explicit validation revealed large variations across the country and suggests that intensive grain-growing areas were better characterized than smallholder farming systems. Informative features in the classification process vary from one stratum to another but features targeting the minimum of vegetation as well as short-wave infrared features were consistently important throughout the country. Overall, the approach showed potential for routinely delivering consistent cropland maps over large areas as required for operational crop monitoring.
Modeling turbidity and flow at daily steps in karst using ARIMA/ARFIMA-GARCH error models
NASA Astrophysics Data System (ADS)
Massei, N.
2013-12-01
Hydrological and physico-chemical variations recorded at karst springs usually reflect highly non-linear processes, and the corresponding time series are then very often also highly non-linear. Among others, turbidity, an important parameter for water quality and management, is a very complex response of karst systems to rain events, involving direct transfer of particles from point-source recharge as well as resuspension of particles previously deposited and stored within the system. For those reasons, turbidity modeling has not been well handled in karst hydrological models so far. Most of the time, the modeling approaches involve stochastic linear models such as ARIMA-type models and their derivatives (ARMA, ARMAX, ARIMAX, ARFIMA...). Yet, linear models usually fail to represent well the whole (stochastic) process variability, and their residuals still contain useful information that can be used either to understand the whole variability or to enhance short-term predictability and forecasting. Model residuals are actually not i.i.d., which can be identified by the fact that squared residuals still present clear and significant serial correlation. Indeed, high (low) amplitudes are followed in time by high (low) amplitudes, which can be seen in the residual time series as periods during which amplitudes are higher (lower) than the mean amplitude. This is known as the ARCH effect (AutoRegressive Conditional Heteroskedasticity), and the corresponding non-linear process affecting the residuals of a linear model can be modeled using ARCH or generalized ARCH (GARCH) non-linear models, approaches that are very well known in econometrics. Here we investigated the capability of ARIMA-GARCH error models to represent a ~20-yr daily turbidity time series recorded at a karst spring used for the water supply of the city of Le Havre (Upper Normandy, France). ARIMA and ARFIMA models were used to represent the mean behavior of the time series, and the residuals clearly appeared to present a pronounced ARCH effect, as confirmed by Ljung-Box and McLeod-Li tests. We then identified and fitted GARCH models to the residuals of the ARIMA and ARFIMA models in order to model the conditional variance and volatility of the turbidity time series. The results eventually showed that serial correlation was successfully removed from the final standardized residuals of the GARCH model, and hence that the ARIMA-GARCH error model appeared consistent for modeling such time series. The approach finally improved short-term (e.g., a few steps ahead) turbidity forecasting.
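The two-stage workflow can be sketched as follows, assuming the statsmodels and third-party arch packages are installed: fit an ARIMA model for the mean behaviour, confirm the ARCH effect on the squared residuals, then fit a GARCH(1,1) model to the residuals. The synthetic turbidity proxy and all model orders below are illustrative only, not the paper's choices.

```python
# Minimal ARIMA + GARCH-error sketch on a synthetic daily turbidity proxy.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox
from arch import arch_model

rng = np.random.default_rng(14)
n = 2000
vol = np.zeros(n); eps = np.zeros(n)
for t in range(1, n):                                   # GARCH-like synthetic innovations
    vol[t] = np.sqrt(0.05 + 0.1 * eps[t - 1] ** 2 + 0.85 * vol[t - 1] ** 2)
    eps[t] = vol[t] * rng.normal()
turbidity = 10 + np.cumsum(0.01 * rng.normal(size=n)) + eps   # mean behaviour + heteroskedastic noise

arima = ARIMA(turbidity, order=(1, 1, 1)).fit()          # mean model
resid = arima.resid
print(acorr_ljungbox(resid ** 2, lags=[10]))             # ARCH effect shows up in the squared residuals

garch = arch_model(resid, vol="GARCH", p=1, q=1, mean="Zero").fit(disp="off")
print(garch.params)                                      # conditional-variance parameters
```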
Empirical Investigation of Critical Transitions in Paleoclimate
NASA Astrophysics Data System (ADS)
Loskutov, E. M.; Mukhin, D.; Gavrilov, A.; Feigin, A.
2016-12-01
In this work we apply a new empirical method for the analysis of complex spatially distributed systems to the analysis of paleoclimate data. The method consists of two general parts: (i) revealing the optimal phase-space variables and (ii) constructing an empirical prognostic model from the observed time series. The construction of the phase-space variables is based on decomposing the data into nonlinear dynamical modes, an approach that was successfully applied to the global SST field and allowed us to clearly separate time scales and reveal a climate shift in the observed data interval [1]. The second part, the Bayesian approach to optimal evolution operator reconstruction from time series, is based on representing the evolution operator as a nonlinear stochastic function implemented by artificial neural networks [2,3]. In this work we focus on the investigation of critical transitions - the abrupt changes in climate dynamics - in much longer time scale processes. It is well known that there were a number of critical transitions on different time scales in the past. In this work, we demonstrate the first results of applying our empirical methods to the analysis of paleoclimate variability. In particular, we discuss the possibility of detecting, identifying and predicting such critical transitions by means of nonlinear empirical modeling using paleoclimate record time series. The study is supported by the Government of the Russian Federation (agreement #14.Z50.31.0033 with the Institute of Applied Physics of RAS). 1. Mukhin, D., Gavrilov, A., Feigin, A., Loskutov, E., & Kurths, J. (2015). Principal nonlinear dynamical modes of climate variability. Scientific Reports, 5, 15510. http://doi.org/10.1038/srep15510 2. Molkov, Ya. I., Mukhin, D. N., Loskutov, E. M., & Feigin, A. M. (2012). Random dynamical models from time series. Phys. Rev. E, 85(3). 3. Mukhin, D., Kondrashov, D., Loskutov, E., Gavrilov, A., Feigin, A., & Ghil, M. (2015). Predicting Critical Transitions in ENSO models. Part II: Spatially Dependent Models. Journal of Climate, 28(5), 1962-1976. http://doi.org/10.1175/JCLI-D-14-00240.1
Superconducting fault current limiter for railway transport
NASA Astrophysics Data System (ADS)
Fisher, L. M.; Alferov, D. F.; Akhmetgareev, M. R.; Budovskii, A. I.; Evsin, D. V.; Voloshin, I. F.; Kalinov, A. V.
2015-12-01
A resistive switching superconducting fault current limiter (SFCL) for DC networks with a voltage of 3.5 kV and a nominal current of 2 kA has been developed. The SFCL consists of two series-connected units: a block of superconducting modules and a high-speed vacuum breaker with a total disconnection time of no more than 8 ms. The results of laboratory tests of the superconducting SFCL modules in current-limiting mode are presented. The recovery time of superconductivity is experimentally determined. The possibility of applying the SFCL at traction substations of Russian Railways is considered.
IR temperatures of Mauna Loa caldera obtained with multispectral thermal imager
NASA Astrophysics Data System (ADS)
Pendergast, Malcolm M.; O'Steen, Byron L.; Kurzeja, Robert J.
2002-01-01
A survey of surface temperatures of the Mauna Loa caldera during 7/14/00 and 7/15/00 was made by SRTC in conjunction with an MTI satellite image collection. The general variation of surface temperature appears quite predictable, responding to solar heating. The analysis of detailed time series of temperature indicates systematic variations in temperature of 5 °C corresponding to time scales of 3-5 minutes and space scales of 10-20 m. The average temperature patterns are consistent with those predicted by the Regional Atmospheric Modeling System (RAMS).
Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P
2014-10-01
There are now domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, like medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events and compares two time series by analyzing the events they have in common. We have applied our framework to time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed schema has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains generating such time series, for example electrocardiography (ECG). Copyright © 2014 Elsevier Inc. All rights reserved.
Persistent homology of time-dependent functional networks constructed from coupled time series
NASA Astrophysics Data System (ADS)
Stolz, Bernadette J.; Harrington, Heather A.; Porter, Mason A.
2017-04-01
We use topological data analysis to study "functional networks" that we construct from time-series data from both experimental and synthetic sources. We use persistent homology with a weight rank clique filtration to gain insights into these functional networks, and we use persistence landscapes to interpret our results. Our first example uses time-series output from networks of coupled Kuramoto oscillators. Our second example consists of biological data in the form of functional magnetic resonance imaging data that were acquired from human subjects during a simple motor-learning task in which subjects were monitored for three days during a five-day period. With these examples, we demonstrate that (1) using persistent homology to study functional networks provides fascinating insights into their properties and (2) the position of the features in a filtration can sometimes play a more vital role than persistence in the interpretation of topological features, even though conventionally the latter is used to distinguish between signal and noise. We find that persistent homology can detect differences in synchronization patterns in our data sets over time, giving insight both on changes in community structure in the networks and on increased synchronization between brain regions that form loops in a functional network during motor learning. For the motor-learning data, persistence landscapes also reveal that on average the majority of changes in the network loops take place on the second of the three days of the learning process.
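As a small illustration of the first step of such an analysis (not the authors' pipeline), the sketch below builds a weighted functional network from multichannel time series via absolute Pearson correlation; sorting the edges by decreasing weight gives the order in which they enter a weight rank clique filtration, while the persistent homology itself would be computed with a dedicated TDA library.

import numpy as np

rng = np.random.default_rng(2)
n_nodes, n_samples = 8, 500
signals = rng.normal(size=(n_nodes, n_samples))   # placeholder for oscillator or fMRI channels
signals[1] += 0.8 * signals[0]                    # induce some coupling between two channels

weights = np.abs(np.corrcoef(signals))            # functional connectivity matrix
np.fill_diagonal(weights, 0.0)

# edge list sorted by decreasing weight = order of entry into the filtration
i_idx, j_idx = np.triu_indices(n_nodes, k=1)
edges = sorted(zip(weights[i_idx, j_idx], i_idx, j_idx), reverse=True)
for w, i, j in edges[:5]:
    print(f"edge ({i},{j}) enters the filtration with weight {w:.2f}")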
Cross-recurrence quantification analysis of categorical and continuous time series: an R package
Coco, Moreno I.; Dale, Rick
2014-01-01
This paper describes the R package crqa to perform cross-recurrence quantification analysis of two time series of either a categorical or continuous nature. Streams of behavioral information, from eye movements to linguistic elements, unfold over time. When two people interact, such as in conversation, they often adapt to each other, leading these behavioral levels to exhibit recurrent states. In dialog, for example, interlocutors adapt to each other by exchanging interactive cues: smiles, nods, gestures, choice of words, and so on. In order for us to capture closely the goings-on of dynamic interaction, and uncover the extent of coupling between two individuals, we need to quantify how much recurrence is taking place at these levels. Methods available in crqa allow researchers in cognitive science to ask how recurrent two people are at some level of analysis, what the characteristic lag time is for one person to maximally match another, or whether one person is leading the other. First, we set the theoretical ground to understand the difference between “correlation” and “co-visitation” when comparing two time series, using an aggregative or cross-recurrence approach. Then, we describe more formally the principles of cross-recurrence, and show with the current package how to carry out analyses applying them. We end the paper by comparing the computational efficiency, and the consistency of results, of the crqa R package with the benchmark MATLAB toolbox crptoolbox (Marwan, 2013). We show perfect comparability between the two libraries on both levels. PMID:25018736
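The cross-recurrence computation itself is simple to sketch outside R; the Python example below (numpy only, with an assumed radius parameter) builds a binary cross-recurrence matrix for two continuous series and reads off the recurrence rate and the lag of maximal recurrence, the kind of quantities crqa reports.

import numpy as np

def cross_recurrence(x, y, radius=0.5):
    """Binary cross-recurrence matrix: CR[i, j] = 1 when |x_i - y_j| <= radius."""
    return (np.abs(x[:, None] - y[None, :]) <= radius).astype(int)

rng = np.random.default_rng(3)
t = np.linspace(0, 8 * np.pi, 400)
x = np.sin(t) + 0.1 * rng.normal(size=t.size)
y = np.sin(t - 0.5) + 0.1 * rng.normal(size=t.size)     # y lags behind x

cr = cross_recurrence(x, y, radius=0.3)
rate = cr.mean()                                        # overall recurrence rate
# diagonal-wise recurrence profile: the lag with maximal recurrence indicates
# the characteristic delay between the two series
lags = range(-50, 51)
profile = [np.diagonal(cr, offset=k).mean() for k in lags]
print(f"recurrence rate = {rate:.3f}, best lag = {list(lags)[int(np.argmax(profile))]}")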
NASA Astrophysics Data System (ADS)
Ai, Jinquan; Gao, Wei; Gao, Zhiqiang; Shi, Runhe; Zhang, Chao
2017-04-01
Spartina alterniflora is an aggressive invasive plant species that replaces native species, changes the structure and function of the ecosystem across coastal wetlands in China, and is thus a major conservation concern. Mapping the spread of its invasion is a necessary first step for the implementation of effective ecological management strategies. The performance of a phenology-based approach for S. alterniflora mapping is explored in the coastal wetland of the Yangtze Estuary using a time series of GaoFen satellite no. 1 wide field of view camera (GF-1 WFV) imagery. First, a time series of the normalized difference vegetation index (NDVI) was constructed to evaluate the phenology of S. alterniflora. Two phenological stages (the senescence stage from November to mid-December and the green-up stage from late April to May) were determined as important for S. alterniflora detection in the study area based on NDVI temporal profiles, spectral reflectance curves of S. alterniflora and its coexistent species, and field surveys. Three phenology feature sets representing three major phenology-based detection strategies were then compared to map S. alterniflora: (1) the single-date imagery acquired within the optimal phenological window, (2) the multitemporal imagery, including four images from the two important phenological windows, and (3) the monthly NDVI time series imagery. Support vector machines and maximum likelihood classifiers were applied to each phenology feature set at different training sample sizes. For all phenology feature sets, the overall results were produced consistently with high mapping accuracies under sufficient training sample sizes, although significantly improved classification accuracies (10%) were obtained when the monthly NDVI time series imagery was employed. The optimal single-date imagery had the lowest accuracies of all detection strategies. The multitemporal analysis demonstrated little reduction in the overall accuracy compared with the use of monthly NDVI time series imagery. These results show the importance of considering the phenological stage for image selection for mapping S. alterniflora using GF-1 WFV imagery. Furthermore, in light of the better tradeoff between the number of images and classification accuracy when using multitemporal GF-1 WFV imagery, we suggest using multitemporal imagery acquired at appropriate phenological windows for S. alterniflora mapping at regional scales.
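A minimal sketch of the classification strategy compared above, assuming a per-pixel monthly NDVI time series as the feature vector and a support vector machine classifier; the band values, class labels and SVM hyperparameters are placeholders.

import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

def ndvi(red, nir):
    return (nir - red) / (nir + red + 1e-9)

rng = np.random.default_rng(4)
n_pixels, n_months = 1000, 12
red = rng.uniform(0.05, 0.3, size=(n_pixels, n_months))
nir = rng.uniform(0.2, 0.6, size=(n_pixels, n_months))
features = ndvi(red, nir)                     # monthly NDVI time series per pixel
labels = rng.integers(0, 2, size=n_pixels)    # 1 = S. alterniflora, 0 = other cover (toy labels)

X_train, X_test, y_train, y_test = train_test_split(features, labels,
                                                    test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_train, y_train)
print(f"overall accuracy on held-out pixels: {clf.score(X_test, y_test):.2f}")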
NASA Astrophysics Data System (ADS)
Broich, M.; Tulbure, M. G.; Wijaya, A.; Weisse, M.; Stolle, F.
2017-12-01
Deforestation and forest degradation form the second-largest source of anthropogenic CO2 emissions. While deforestation is being globally mapped with satellite image time series, degradation remains insufficiently quantified. Previous studies quantified degradation for small-scale, local sites. A method suitable for accurate mapping across large areas has not yet been developed, due to the variability of the low-magnitude and short-lived degradation signal and the absence of data with suitable resolution properties. Here we use a combination of newly available streams of free optical and radar image time series acquired by NASA and ESA, and HPC-based data science algorithms, to quantify degradation consistently across Southeast Asia (SEA). We used Sentinel-1 C-band radar data and NASA's new Harmonized Landsat 8 (L8) Sentinel-2 (S2) product (HLS) for cloud-free optical images. Our results show that dense time series of cloud-penetrating Sentinel-1 C-band radar can provide degradation alarm flags, while the HLS product of cloud-free optical images can unambiguously confirm degradation alarms. The detectability of degradation differed across SEA. In the seasonal forest of continental SEA the reliability of our radar-based alarm flags increased as the variability in landscape moisture decreased in the dry season. We reliably confirmed alarms with optical image time series during the late dry season, when degradation in open-canopy forests becomes detectable once the undergrowth vegetation has died down. Conversely, in insular SEA, where landscape moisture variability is low, the radar time series generated degradation alarm flags with moderate to high reliability throughout the year, further confirmed with the HLS product. Based on the HLS product we can now confirm degradation within < 6 months on average, as opposed to 1 year when using either L8 or S2 alone. In contrast to continental SEA, across insular SEA our degradation maps are not suitable for providing annual maps of total degradation area, but they can pinpoint degradation areas on a rolling basis throughout the year. In both continental and insular SEA, the combination of optical and radar time series provides better results than either one on its own. Our results provide significant information with applications for carbon trading policy and land management.
Detection of a sudden change of the field time series based on the Lorenz system.
Da, ChaoJiu; Li, Fang; Shen, BingLu; Yan, PengCheng; Song, Jian; Ma, DeShan
2017-01-01
We conducted an exploratory study of the detection of a sudden change of the field time series based on the numerical solution of the Lorenz system. First, the time when the Lorenz path jumped between the regions on the left and right of the equilibrium point of the Lorenz system was quantitatively marked and the sudden change time of the Lorenz system was obtained. Second, the numerical solution of the Lorenz system was regarded as a vector; thus, this solution could be considered as a vector time series. We transformed the vector time series into a time series using the vector inner product, considering the geometric and topological features of the Lorenz system path. Third, the sudden change of the resulting time series was detected using the sliding t-test method. Comparing the test results with the quantitatively marked time indicated that the method could detect every sudden change of the Lorenz path; thus, the method is effective. Finally, we used the method to detect the sudden change of the pressure field time series and temperature field time series, and obtained good results for both series, which indicates that the method can be applied to high-dimensional vector time series. Mathematically, there is no essential difference between the field time series and vector time series; thus, we provide a new method for the detection of the sudden change of the field time series.
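A compact Python sketch of the procedure, under assumed parameter choices: integrate the Lorenz system, collapse the vector solution to a scalar series with an inner product against a fixed reference vector, and scan the result with a sliding t-test; the reference vector, window length and detection threshold are illustrative.

import numpy as np
from scipy.integrate import solve_ivp
from scipy.stats import ttest_ind

def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

t_eval = np.linspace(0, 50, 5000)
sol = solve_ivp(lorenz, (0, 50), [1.0, 1.0, 1.0], t_eval=t_eval)
vectors = sol.y.T                                  # the vector time series (x, y, z)

reference = np.array([1.0, 1.0, 0.0])              # assumed reference direction
scalar = vectors @ reference                       # inner-product time series

def sliding_t(series, half_window=50):
    """t statistic between the two sub-samples on either side of each time index."""
    stats = np.full(series.size, np.nan)
    for i in range(half_window, series.size - half_window):
        left = series[i - half_window:i]
        right = series[i:i + half_window]
        stats[i] = ttest_ind(left, right, equal_var=False).statistic
    return stats

tstat = sliding_t(scalar)
jumps = np.where(np.abs(tstat) > 6.0)[0]           # candidate sudden-change times
print(f"{jumps.size} samples flagged as candidate sudden changes")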
Li, Shi; Batterman, Stuart; Wasilevich, Elizabeth; Wahl, Robert; Wirth, Julie; Su, Feng-Chiao; Mukherjee, Bhramar
2011-11-01
Asthma morbidity has been associated with ambient air pollutants in time-series and case-crossover studies. In such study designs, threshold effects of air pollutants on asthma outcomes, which are of potential interest for exploring concentration-response relationships, have been relatively unexplored. This study analyzes daily data on the asthma morbidity experienced by the pediatric Medicaid population (ages 2-18 years) of Detroit, Michigan, and concentrations of the pollutants fine particles (PM2.5), CO, NO2, and SO2 for the 2004-2006 period, using both time-series and case-crossover designs. We use a simple, testable and readily implementable profile likelihood-based approach to estimate threshold parameters in both designs. Evidence of significant increases in daily acute asthma events was found for SO2 and PM2.5, and a significant threshold effect was estimated for PM2.5 at 13 and 11 μg m(-3) using generalized additive models and conditional logistic regression models, respectively. Stronger effect sizes above the threshold were typically noted compared to the standard linear relationship; e.g., in the time series analysis, an interquartile range increase (9.2 μg m(-3)) in PM2.5 (5-day moving average) had a risk ratio of 1.030 (95% CI: 1.001, 1.061) in the generalized additive models, and 1.066 (95% CI: 1.031, 1.102) in the threshold generalized additive models. The corresponding estimates for the case-crossover design were 1.039 (95% CI: 1.013, 1.066) in the conditional logistic regression, and 1.054 (95% CI: 1.023, 1.086) in the threshold conditional logistic regression. This study indicates that the associations of SO2 and PM2.5 concentrations with asthma emergency department visits and hospitalizations, as well as the estimated PM2.5 threshold, were fairly consistent across time-series and case-crossover analyses, and suggests that effect estimates based on linear models (without thresholds) may underestimate the true risk. Copyright © 2011 Elsevier Inc. All rights reserved.
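A hedged sketch of the profile-likelihood threshold idea (not the study code): a Poisson regression of daily counts on a hinge term max(PM2.5 - threshold, 0) is refitted over a grid of candidate thresholds and the log-likelihood is profiled; the simulated exposure series, counts and grid are placeholders.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n_days = 1000
pm25 = rng.gamma(shape=4.0, scale=4.0, size=n_days)            # daily PM2.5 (ug/m3), toy data
true_thr = 12.0
lam = np.exp(1.0 + 0.03 * np.maximum(pm25 - true_thr, 0.0))
counts = rng.poisson(lam)                                      # daily asthma ED visits, toy data

def fit_at_threshold(thr):
    """Log-likelihood of a Poisson GLM with a hinge term above the candidate threshold."""
    X = sm.add_constant(np.maximum(pm25 - thr, 0.0))
    return sm.GLM(counts, X, family=sm.families.Poisson()).fit().llf

grid = np.arange(5.0, 25.0, 0.5)
profile = [fit_at_threshold(t) for t in grid]
print(f"profile-likelihood threshold estimate: {grid[int(np.argmax(profile))]:.1f} ug/m3")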
Volatility of linear and nonlinear time series
NASA Astrophysics Data System (ADS)
Kalisky, Tomer; Ashkenazy, Yosef; Havlin, Shlomo
2005-07-01
Previous studies indicated that nonlinear properties of Gaussian distributed time series with long-range correlations, u_i, can be detected and quantified by studying the correlations in the magnitude series |u_i|, the “volatility.” However, the origin of this empirical observation remains unclear, and the exact relation between the correlations in u_i and the correlations in |u_i| is still unknown. Here we develop analytical relations between the scaling exponent of a linear series u_i and that of its magnitude series |u_i|. Moreover, we find that nonlinear time series exhibit stronger (or the same) correlations in the magnitude time series compared with linear time series with the same two-point correlations. Based on these results we propose a simple model that generates multifractal time series by explicitly inserting long-range correlations in the magnitude series; the nonlinear multifractal time series is generated by multiplying a long-range correlated time series (that represents the magnitude series) with an uncorrelated time series [that represents the sign series sgn(u_i)]. We apply our techniques to daily deep ocean temperature records from the equatorial Pacific, the region of the El Niño phenomenon, and find: (i) long-range correlations from several days to several years with a 1/f power spectrum, (ii) significant nonlinear behavior as expressed by long-range correlations of the volatility series, and (iii) a broad multifractal spectrum.
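The proposed model is easy to sketch numerically: generate a long-range correlated series by 1/f^beta Fourier filtering (beta chosen arbitrarily here), use its absolute value as the magnitude series, and multiply by an uncorrelated random sign series.

import numpy as np

def long_range_correlated(n, beta=0.8, seed=0):
    """Gaussian series with power spectrum ~ 1/f**beta, built by Fourier filtering."""
    rng = np.random.default_rng(seed)
    white = rng.normal(size=n)
    freqs = np.fft.rfftfreq(n)
    freqs[0] = freqs[1]                        # avoid division by zero at f = 0
    spectrum = np.fft.rfft(white) / freqs ** (beta / 2.0)
    series = np.fft.irfft(spectrum, n)
    return (series - series.mean()) / series.std()

n = 2 ** 14
magnitude = np.abs(long_range_correlated(n, beta=0.8))        # long-range correlated |u_i|
signs = np.random.default_rng(1).choice([-1.0, 1.0], size=n)  # uncorrelated sgn(u_i)
u = magnitude * signs                                         # nonlinear (multifractal-like) series

# sanity check: the sign series is uncorrelated while the magnitude retains memory
print(np.corrcoef(signs[:-1], signs[1:])[0, 1],
      np.corrcoef(magnitude[:-1], magnitude[1:])[0, 1])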
When Does Haste Make Waste? Speed-Accuracy Tradeoff, Skill Level, and the Tools of the Trade
ERIC Educational Resources Information Center
Beilock, Sian L.; Bertenthal, Bennett I.; Hoerger, Michael; Carr, Thomas H.
2008-01-01
Novice and skilled golfers took a series of golf putts with a standard putter (Exp. 1) or a distorted "funny putter" (consisting of an s-shaped and arbitrarily weighted putter shaft; Exp. 2) under instructions to either (a) take as much time as needed to be accurate or to (b) putt as fast as possible while still being accurate. Planning and…
ERIC Educational Resources Information Center
Molenaar, Peter C. M.; Nesselroade, John R.
1998-01-01
Pseudo-Maximum Likelihood (p-ML) and Asymptotically Distribution Free (ADF) estimation methods for estimating dynamic factor model parameters within a covariance structure framework were compared through a Monte Carlo simulation. Both methods appear to give consistent model parameter estimates, but only ADF gives standard errors and chi-square…
Hypothesis Testing Using Spatially Dependent Heavy Tailed Multisensor Data
2014-12-01
... consistent with the null hypothesis of linearity and can be used to estimate the distribution of a test statistic that can discriminate between the null... (Figure caption: Test for nonlinearity. The histogram is generated using the surrogate data; the statistic of the original time series is represented by the solid line.)
Area Handbook Series: Chad, A Country Study
1988-12-01
northern and central interests. As disaffection in these regions increased, in the late 1960s dissident groups formed an antigovernment coalition, the...decades-might finally ensue. December 13, 1988 ... After the research for this book was completed, several events occurred that greatly affected ...Chad's geographic position along major trans-Saharan trade routes has also affected its historical development. In early times, trade consisted of
Reynolds, Caitlin E.; Poore, Richard Z.
2013-01-01
The U.S. Geological Survey anchored a sediment trap in the northern Gulf of Mexico to collect seasonal time-series data on the flux and assemblage composition of live planktic foraminifers. This report provides an update of the previous time-series data to include results from 2011. Ten species, or varieties, constituted ~92 percent of the 2011 assemblage: Globigerinoides ruber (pink and white varieties), Globigerinoides sacculifer, Globigerina calida, Globigerinella aequilateralis, Globorotalia menardii group [The Gt. menardii group includes Gt. menardii, Gt. tumida, and Gt. ungulata], Orbulina universa, Globorotalia truncatulinoides, Pulleniatina spp., and Neogloboquadrina dutertrei. The mean daily flux was 205 tests per square meter per day (m-2 day-1), with maximum fluxes of >600 tests m-2 day-1 during mid-February and mid-September and minimum fluxes during mid-March, the beginning of May, and November. Globorotalia truncatulinoides showed a clear preference for the winter, consistent with data from 2008 to 2010. Globigerinoides ruber (white) flux data for 2011 (average 30 tests m-2 day-1) were consistent with data from 2010 (average 29 tests m-2 day-1) and showed a steady threefold increase since 2009 (average 11 tests m-2 day-1) and a tenfold increase from the 2008 flux (3 tests m-2 day-1).
Forecasting daily patient volumes in the emergency department.
Jones, Spencer S; Thomas, Alun; Evans, R Scott; Welch, Shari J; Haug, Peter J; Snow, Gregory L
2008-02-01
Shifts in the supply of and demand for emergency department (ED) resources make the efficient allocation of ED resources increasingly important. Forecasting is a vital activity that guides decision-making in many areas of economic, industrial, and scientific planning, but has gained little traction in the health care industry. There are few studies that explore the use of forecasting methods to predict patient volumes in the ED. The goals of this study are to explore and evaluate the use of several statistical forecasting methods to predict daily ED patient volumes at three diverse hospital EDs and to compare the accuracy of these methods to the accuracy of a previously proposed forecasting method. Daily patient arrivals at three hospital EDs were collected for the period January 1, 2005, through March 31, 2007. The authors evaluated the use of seasonal autoregressive integrated moving average, time series regression, exponential smoothing, and artificial neural network models to forecast daily patient volumes at each facility. Forecasts were made for horizons ranging from 1 to 30 days in advance. The forecast accuracy achieved by the various forecasting methods was compared to the forecast accuracy achieved when using a benchmark forecasting method already available in the emergency medicine literature. All time series methods considered in this analysis provided improved in-sample model goodness of fit. However, post-sample analysis revealed that time series regression models that augment linear regression models by accounting for serial autocorrelation offered only small improvements in terms of post-sample forecast accuracy, relative to multiple linear regression models, while seasonal autoregressive integrated moving average, exponential smoothing, and artificial neural network forecasting models did not provide consistently accurate forecasts of daily ED volumes. This study confirms the widely held belief that daily demand for ED services is characterized by seasonal and weekly patterns. The authors compared several time series forecasting methods to a benchmark multiple linear regression model. The results suggest that the existing methodology proposed in the literature, multiple linear regression based on calendar variables, is a reasonable approach to forecasting daily patient volumes in the ED. However, the authors conclude that regression-based models that incorporate calendar variables, account for site-specific special-day effects, and allow for residual autocorrelation provide a more appropriate, informative, and consistently accurate approach to forecasting daily ED patient volumes.
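For illustration, a hedged Python sketch of two of the approaches compared above: the benchmark multiple linear regression on calendar (day-of-week) variables and a seasonal ARIMA model; the synthetic daily-volume series, model orders and 30-day horizon are assumptions.

import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(6)
dates = pd.date_range("2005-01-01", "2007-03-31", freq="D")
weekly = 10 * np.sin(2 * np.pi * dates.dayofweek / 7.0)        # weekly pattern in arrivals
volumes = 120 + weekly + rng.normal(scale=8, size=dates.size)
y = pd.Series(volumes, index=dates)

train, test = y[:-30], y[-30:]                                 # forecast horizon: 30 days

# benchmark: regression on day-of-week dummy variables
X_train = pd.get_dummies(train.index.dayofweek, drop_first=True).astype(float)
X_test = pd.get_dummies(test.index.dayofweek, drop_first=True).astype(float)
ols = sm.OLS(train.values, sm.add_constant(X_train)).fit()
reg_fc = ols.predict(sm.add_constant(X_test))

# seasonal ARIMA with a weekly period
sarima = SARIMAX(train, order=(1, 0, 1), seasonal_order=(1, 0, 1, 7)).fit(disp=False)
sarima_fc = sarima.forecast(steps=30)

mae = lambda f: np.mean(np.abs(np.asarray(f) - test.values))
print(f"regression MAE: {mae(reg_fc):.1f}, SARIMA MAE: {mae(sarima_fc):.1f}")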
Helzer, Erik G; Fleeson, William; Furr, R Michael; Meindl, Peter; Barranti, Maxwell
2017-08-01
Although individual differences in the application of moral principles, such as utilitarianism, have been documented, so too have powerful context effects-effects that raise doubts about the durability of people's moral principles. In this article, we examine the robustness of individual differences in moral judgment by examining them across time and across different decision contexts. In Study 1, consistency in utilitarian judgment of 122 adult participants was examined over two different survey sessions. In Studies 2A and 2B, large samples (Ns = 130 and 327, respectively) of adult participants made a series of 32 moral judgments across eight different contexts that are known to affect utilitarian endorsement. Contrary to some contemporary theorizing, our results reveal a strong degree of consistency in moral judgment. Across time and experimental manipulations of context, individuals maintained their relative standing on utilitarianism, and aggregated moral decisions reached levels of near-perfect consistency. Results support the view that on at least one dimension (utilitarianism), people's moral judgments are robustly consistent, with context effects tailoring the application of principles to the particulars of any given moral judgment. © 2016 Wiley Periodicals, Inc.
Duality between Time Series and Networks
Campanharo, Andriana S. L. O.; Sirer, M. Irmak; Malmgren, R. Dean; Ramos, Fernando M.; Amaral, Luís A. Nunes.
2011-01-01
Studying the interaction between a system's components and the temporal evolution of the system are two common ways to uncover and characterize its internal workings. Recently, several maps from a time series to a network have been proposed with the intent of using network metrics to characterize time series. Although these maps demonstrate that different time series result in networks with distinct topological properties, it remains unclear how these topological properties relate to the original time series. Here, we propose a map from a time series to a network with an approximate inverse operation, making it possible to use network statistics to characterize time series and time series statistics to characterize networks. As a proof of concept, we generate an ensemble of time series ranging from periodic to random and confirm that application of the proposed map retains much of the information encoded in the original time series (or networks) after application of the map (or its inverse). Our results suggest that network analysis can be used to distinguish different dynamic regimes in time series and, perhaps more importantly, time series analysis can provide a powerful set of tools that augment the traditional network analysis toolkit to quantify networks in new and useful ways. PMID:21858093
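One concrete map in this spirit (a quantile-based transition graph, not necessarily the authors' exact construction) can be sketched as follows: values are binned into quantiles that become network nodes, directed edge weights count observed transitions, and an approximate inverse is a weighted random walk on the resulting graph.

import numpy as np

def series_to_network(x, n_nodes=10):
    """Row-normalized transition matrix between quantile bins of x."""
    edges = np.quantile(x, np.linspace(0, 1, n_nodes + 1))
    states = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_nodes - 1)
    W = np.zeros((n_nodes, n_nodes))
    for a, b in zip(states[:-1], states[1:]):
        W[a, b] += 1
    return W / W.sum(axis=1, keepdims=True)

def network_to_series(W, length, seed=0):
    """Approximate inverse operation: a weighted random walk on the transition graph."""
    rng = np.random.default_rng(seed)
    state, walk = 0, []
    for _ in range(length):
        state = rng.choice(W.shape[0], p=W[state])
        walk.append(state)
    return np.array(walk)

t = np.linspace(0, 20 * np.pi, 2000)
x = np.sin(t) + 0.2 * np.random.default_rng(7).normal(size=t.size)
W = series_to_network(x, n_nodes=8)
reconstructed = network_to_series(W, length=2000)   # symbolic series generated back from the network
print(W.round(2))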
AnClim and ProClimDB software for data quality control and homogenization of time series
NASA Astrophysics Data System (ADS)
Stepanek, Petr
2015-04-01
During the last decade, a software package consisting of AnClim, ProClimDB and LoadData for processing (mainly climatological) data has been created. This software offers a comprehensive solution for the processing of climatological time series, starting from loading the data from a central database (e.g. Oracle, via the LoadData software), through data quality control and homogenization, to time series analysis, extreme value evaluation and RCM output verification and correction (the ProClimDB and AnClim software). The detection of inhomogeneities is carried out on a monthly scale through the application of AnClim, or newly by R functions called from ProClimDB, while quality control, the preparation of reference series and the correction of detected breaks are carried out by the ProClimDB software. The software combines many statistical tests, types of reference series and time scales (monthly, seasonal and annual, daily and sub-daily ones). These can be used to create an "ensemble" of solutions, which may be more reliable than any single method. The AnClim software is suitable for educational purposes, e.g. for students getting acquainted with methods used in climatology. Built-in graphical tools and the comparison of various statistical tests help in better understanding a given method. ProClimDB, by contrast, is a tool aimed at processing large climatological datasets. Recently, R functions can be called from within the software, making it more efficient in data processing and allowing easy inclusion of new methods (when available in R). An example of usage is the easy comparison of methods for the correction of inhomogeneities in daily data (HOM of Paul Della-Marta, the SPLIDHOM method of Olivier Mestre, DAP - our own method, QM of Xiaolan Wang, and others). The software is available, together with further information, at www.climahom.eu . Acknowledgement: this work was partially funded by the project "Building up a multidisciplinary scientific team focused on drought" No. CZ.1.07/2.3.00/20.0248.
Yang, Yi; Maxwell, Andrew; Zhang, Xiaowei; Wang, Nan; Perkins, Edward J; Zhang, Chaoyang; Gong, Ping
2013-01-01
Pathway alterations reflected as changes in gene expression regulation and gene interaction can result from cellular exposure to toxicants. Such information is often used to elucidate toxicological modes of action. From a risk assessment perspective, alterations in biological pathways are a rich resource for setting toxicant thresholds, which may be more sensitive and mechanism-informed than traditional toxicity endpoints. Here we developed a novel differential networks (DNs) approach to connect pathway perturbation with toxicity threshold setting. Our DNs approach consists of 6 steps: time-series gene expression data collection, identification of altered genes, gene interaction network reconstruction, differential edge inference, mapping of genes with differential edges to pathways, and establishment of causal relationships between chemical concentration and perturbed pathways. A one-sample Gaussian process model and a linear regression model were used to identify genes that exhibited significant profile changes across an entire time course and between treatments, respectively. Interaction networks of differentially expressed (DE) genes were reconstructed for different treatments using a state space model and then compared to infer differential edges/interactions. DE genes possessing differential edges were mapped to biological pathways in databases such as KEGG pathways. Using the DNs approach, we analyzed a time-series Escherichia coli live cell gene expression dataset consisting of 4 treatments (control, 10, 100, 1000 mg/L naphthenic acids, NAs) and 18 time points. Through comparison of reconstructed networks and construction of differential networks, 80 genes were identified as DE genes with a significant number of differential edges, and 22 KEGG pathways were altered in a concentration-dependent manner. Some of these pathways were perturbed to a degree as high as 70% even at the lowest exposure concentration, implying a high sensitivity of our DNs approach. Findings from this proof-of-concept study suggest that our approach has a great potential in providing a novel and sensitive tool for threshold setting in chemical risk assessment. In future work, we plan to analyze more time-series datasets with a full spectrum of concentrations and sufficient replications per treatment. The pathway alteration-derived thresholds will also be compared with those derived from apical endpoints such as cell growth rate.
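A much-simplified sketch of the differential-edge idea (thresholded correlation networks in place of the state space model used by the authors): reconstruct a network per treatment and report the edges whose presence changes between control and exposure; the cutoff and toy expression data are assumptions.

import numpy as np

rng = np.random.default_rng(12)
n_genes, n_timepoints = 20, 18

control = rng.normal(size=(n_genes, n_timepoints))                         # toy time-course data
treated = control + rng.normal(scale=0.3, size=control.shape)
treated[3] = -0.9 * treated[5] + rng.normal(scale=0.2, size=n_timepoints)  # rewired gene pair

def network(expr, cutoff=0.6):
    """Adjacency matrix of a thresholded absolute-correlation network."""
    adj = (np.abs(np.corrcoef(expr)) >= cutoff).astype(int)
    np.fill_diagonal(adj, 0)
    return adj

diff_edges = np.argwhere(np.triu(network(control) != network(treated), k=1))
print(f"{len(diff_edges)} differential edges, e.g. {diff_edges[:3].tolist()}")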
Analysis of Multipsectral Time Series for supporting Forest Management Plans
NASA Astrophysics Data System (ADS)
Simoniello, T.; Carone, M. T.; Costantini, G.; Frattegiani, M.; Lanfredi, M.; Macchiato, M.
2010-05-01
Adequate forest management requires specific plans based on updated and detailed mapping. Multispectral satellite time series have been widely applied to forest monitoring and studies at different scales thanks to their capability of providing synoptic information on some basic parameters descriptive of vegetation distribution and status. As a low-cost tool for supporting forest management plans in an operational context, we tested the use of Landsat-TM/ETM time series (1987-2006) in the high Agri Valley (Southern Italy) for planning field surveys as well as for the integration of existing cartography. As a preliminary step, to make all scenes radiometrically consistent, the no-change regression normalization was applied to the time series; then all the data concerning available forest maps, municipal boundaries, water basins, rivers, and roads were overlaid in a GIS environment. From the 2006 image we derived the NDVI map and analyzed the distribution for each land cover class. To separate the physiological variability and identify the anomalous areas, a threshold on the distributions was applied. To label the non-homogeneous areas, a multitemporal analysis was performed by separating heterogeneity due to cover changes from that linked to basilar unit mapping and classification labelling aggregations. Then a map of priority areas was produced to support the field survey plan. To analyze the territorial evolution, the historical land cover maps were produced by adopting a hybrid classification approach based on a preliminary segmentation, the identification of training areas, and a subsequent maximum likelihood categorization. Such an analysis was fundamental for the general assessment of the territorial dynamics and in particular for the evaluation of the efficacy of past intervention activities.
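The anomaly-flagging step can be sketched very simply: for each land cover class, pixels whose NDVI falls outside a percentile band of the class distribution are flagged as priority areas for field checks; the percentile band below is an assumption.

import numpy as np

rng = np.random.default_rng(13)
ndvi = rng.normal(0.7, 0.08, size=10_000)     # NDVI of the pixels belonging to one forest class

low, high = np.percentile(ndvi, [5, 95])      # band taken to represent physiological variability
anomalous = (ndvi < low) | (ndvi > high)      # candidate priority areas for the field survey
print(f"{anomalous.mean() * 100:.1f}% of the class flagged as anomalous")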
NASA Technical Reports Server (NTRS)
Campbell, Petya K. Entcheva; Middleton, Elizabeth M.; Thome, Kurt J.; Kokaly, Raymond F.; Huemmrich, Karl Fred; Lagomasino, David; Novick, Kimberly A.; Brunsell, Nathaniel A.
2013-01-01
This study evaluated Earth Observing 1 (EO-1) Hyperion reflectance time series at established calibration sites to assess the instrument stability and suitability for monitoring vegetation functional parameters. Our analysis using three pseudo-invariant calibration sites in North America indicated that the reflectance time series are devoid of apparent spectral trends and their stability consistently is within 2.5-5 percent throughout most of the spectral range spanning the 12-plus year data record. Using three vegetated sites instrumented with eddy covariance towers, the Hyperion reflectance time series were evaluated for their ability to determine important variables of ecosystem function. A number of narrowband and derivative vegetation indices (VI) closely described the seasonal profiles in vegetation function and ecosystem carbon exchange (e.g., net and gross ecosystem productivity) in three very different ecosystems, including a hardwood forest and tallgrass prairie in North America, and a Miombo woodland in Africa. Our results demonstrate the potential for scaling the carbon flux tower measurements to local and regional landscape levels. The VIs with stronger relationships to the CO2 parameters were derived using continuous reflectance spectra and included wavelengths associated with chlorophyll content and/or chlorophyll fluorescence. Since these indices cannot be calculated from broadband multispectral instrument data, the opportunity to exploit these spectrometer-based VIs in the future will depend on the launch of satellites such as EnMAP and HyspIRI. This study highlights the practical utility of space-borne spectrometers for characterization of the spectral stability and uniformity of the calibration sites in support of sensor cross-comparisons, and demonstrates the potential of narrowband VIs to track and spatially extend ecosystem functional status as well as carbon processes measured at flux towers.
Torheim, Turid; Groendahl, Aurora R; Andersen, Erlend K F; Lyng, Heidi; Malinen, Eirik; Kvaal, Knut; Futsaether, Cecilia M
2016-11-01
Solid tumors are known to be spatially heterogeneous. Detection of treatment-resistant tumor regions can improve clinical outcome, by enabling implementation of strategies targeting such regions. In this study, K-means clustering was used to group voxels in dynamic contrast enhanced magnetic resonance images (DCE-MRI) of cervical cancers. The aim was to identify clusters reflecting treatment resistance that could be used for targeted radiotherapy with a dose-painting approach. Eighty-one patients with locally advanced cervical cancer underwent DCE-MRI prior to chemoradiotherapy. The resulting image time series were fitted to two pharmacokinetic models, the Tofts model (yielding parameters K_trans and ν_e) and the Brix model (A_Brix, k_ep and k_el). K-means clustering was used to group similar voxels based on either the pharmacokinetic parameter maps or the relative signal increase (RSI) time series. The associations between voxel clusters and treatment outcome (measured as locoregional control) were evaluated using the volume fraction or the spatial distribution of each cluster. One voxel cluster based on the RSI time series was significantly related to locoregional control (adjusted p-value 0.048). This cluster consisted of low-enhancing voxels. We found that tumors with poor prognosis had this RSI-based cluster gathered into few patches, making this cluster a potential candidate for targeted radiotherapy. None of the voxel clusters based on Tofts or Brix parameter maps were significantly related to treatment outcome. We identified one group of tumor voxels significantly associated with locoregional relapse that could potentially be used for dose painting. This tumor voxel cluster was identified using the raw MRI time series rather than the pharmacokinetic maps.
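A minimal sketch of the voxel-clustering step (not the study code): K-means applied to relative signal increase (RSI) time series, with the number of clusters, time points and toy enhancement curves chosen arbitrarily.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(8)
n_voxels, n_timepoints = 5000, 60
baseline = rng.uniform(80, 120, size=(n_voxels, 1))
signal = baseline * (1 + rng.uniform(0.0, 1.5, size=(n_voxels, 1))
                     * (1 - np.exp(-np.arange(n_timepoints) / 10.0)))
rsi = (signal - baseline) / baseline                  # relative signal increase per voxel

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(rsi)
labels = km.labels_

# e.g. the volume fraction of each cluster, one candidate outcome predictor
fractions = np.bincount(labels, minlength=4) / n_voxels
print({f"cluster {k}": round(float(f), 3) for k, f in enumerate(fractions)})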
Regular Cycles of Forward and Backward Signal Propagation in Prefrontal Cortex and in Consciousness
Werbos, Paul J.; Davis, Joshua J. J.
2016-01-01
This paper addresses two fundamental questions: (1) Is it possible to develop mathematical neural network models which can explain and replicate the way in which higher-order capabilities like intelligence, consciousness, optimization, and prediction emerge from the process of learning (Werbos, 1994, 2016a; National Science Foundation, 2008)? and (2) How can we use and test such models in a practical way, to track, to analyze and to model high-frequency (≥ 500 hz) many-channel data from recording the brain, just as econometrics sometimes uses models grounded in the theory of efficient markets to track real-world time-series data (Werbos, 1990)? This paper first reviews some of the prior work addressing question (1), and then reports new work performed in MATLAB analyzing spike-sorted and burst-sorted data on the prefrontal cortex from the Buzsaki lab (Fujisawa et al., 2008, 2015) which is consistent with a regular clock cycle of about 153.4 ms and with regular alternation between a forward pass of network calculations and a backwards pass, as in the general form of the backpropagation algorithm which one of us first developed in the period 1968–1974 (Werbos, 1994, 2006; Anderson and Rosenfeld, 1998). In business and finance, it is well known that adjustments for cycles of the year are essential to accurate prediction of time-series data (Box and Jenkins, 1970); in a similar way, methods for identifying and using regular clock cycles offer large new opportunities in neural time-series analysis. This paper demonstrates a few initial footprints on the large “continent” of this type of neural time-series analysis, and discusses a few of the many further possibilities opened up by this new approach to “decoding” the neural code (Heller et al., 1995). PMID:27965547
NASA Astrophysics Data System (ADS)
Baldysz, Zofia; Nykiel, Grzegorz; Araszkiewicz, Andrzej; Figurski, Mariusz; Szafranek, Karolina
2016-09-01
The main purpose of this research was to acquire information about the consistency of ZTD (zenith total delay) linear trends and seasonal components between two consecutive GPS reprocessing campaigns. The analysis concerned two sets of the ZTD time series which were estimated during EUREF (Reference Frame Sub-Commission for Europe) EPN (Permanent Network) reprocessing campaigns according to 2008 and 2015 MUT AC (Military University of Technology Analysis Centre) scenarios. Firstly, Lomb-Scargle periodograms were generated for 57 EPN stations to obtain a characterisation of oscillations occurring in the ZTD time series. Then, the values of seasonal components and linear trends were estimated using the LSE (least squares estimation) approach. The Mann-Kendall trend test was also carried out to verify the presence of linear long-term ZTD changes. Finally, differences in seasonal signals and linear trends between these two data sets were investigated. All these analyses were conducted for the ZTD time series of two lengths: a shortened 16-year series and a full 18-year one. In the case of spectral analysis, amplitudes of the annual and semi-annual periods were almost exactly the same for both reprocessing campaigns. Exceptions were found for only a few stations and they did not exceed 1 mm. The estimated trends were also similar. However, for the reprocessing performed in 2008, the trend values were usually higher. In general, shortening of the analysed time period by 2 years resulted in a decrease of the linear trend values of about 0.07 mm yr-1. This was confirmed by analyses based on both data sets.
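The analysis chain can be sketched in a few lines of Python: a Lomb-Scargle periodogram to characterise the annual signal, a least-squares fit of offset, trend and seasonal terms, and a simple Mann-Kendall test; the synthetic 18-year daily ZTD series and its parameters are placeholders.

import numpy as np
from scipy.signal import lombscargle
from scipy.stats import norm

rng = np.random.default_rng(9)
t_years = np.arange(0, 18, 1 / 365.25)
ztd = 2400 + 0.5 * t_years + 50 * np.sin(2 * np.pi * t_years) + rng.normal(0, 10, t_years.size)

# Lomb-Scargle periodogram over periods between 0.1 and 5 years (angular frequencies)
periods = np.linspace(0.1, 5.0, 2000)
power = lombscargle(t_years, ztd - ztd.mean(), 2 * np.pi / periods)
print(f"dominant period: {periods[int(np.argmax(power))]:.2f} years")

# least-squares fit: offset + trend + annual + semi-annual terms
A = np.column_stack([np.ones_like(t_years), t_years,
                     np.sin(2 * np.pi * t_years), np.cos(2 * np.pi * t_years),
                     np.sin(4 * np.pi * t_years), np.cos(4 * np.pi * t_years)])
coeffs, *_ = np.linalg.lstsq(A, ztd, rcond=None)
print(f"linear trend: {coeffs[1]:.2f} mm/yr")

def mann_kendall_z(x):
    """Normal approximation of the Mann-Kendall statistic (no tie correction)."""
    n = x.size
    s = 0.0
    for i in range(n - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()
    var = n * (n - 1) * (2 * n + 5) / 18.0
    return (s - np.sign(s)) / np.sqrt(var)

annual_means = ztd[:18 * 365].reshape(18, 365).mean(axis=1)   # test the trend on annual means
z = mann_kendall_z(annual_means)
print(f"Mann-Kendall z = {z:.1f}, p = {2 * (1 - norm.cdf(abs(z))):.3g}")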
Use of a Principal Components Analysis for the Generation of Daily Time Series.
NASA Astrophysics Data System (ADS)
Dreveton, Christine; Guillou, Yann
2004-07-01
A new approach for generating daily time series is considered in response to the weather-derivatives market. This approach consists of performing a principal components analysis to create independent variables, the values of which are then generated separately with a random process. Weather derivatives are financial or insurance products that give companies the opportunity to cover themselves against adverse climate conditions. The aim of a generator is to provide a wider range of feasible situations to be used in an assessment of risk. Generation of a temperature time series is required by insurers or bankers for pricing weather options. The provision of conditional probabilities and a good representation of the interannual variance are the main challenges of a generator when used for weather derivatives. The generator was developed according to this new approach using a principal components analysis and was applied to the daily average temperature time series of the Paris-Montsouris station in France. The observed dataset was homogenized and the trend was removed to represent correctly the present climate. The results obtained with the generator show that it represents correctly the interannual variance of the observed climate; this is the main result of the work, because one of the main discrepancies of other generators is their inability to represent accurately the observed interannual climate variance—this discrepancy is not acceptable for an application to weather derivatives. The generator was also tested to calculate conditional probabilities: for example, the knowledge of the aggregated value of heating degree-days in the middle of the heating season allows one to estimate the probability of reaching a threshold at the end of the heating season. This represents the main application of a climate generator for use with weather derivatives.
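A hedged sketch of the generation principle: project the years-by-days temperature matrix onto principal components, simulate each (approximately independent) component score separately with a random process, and back-transform to synthetic years; the toy climatology stands in for the homogenized, detrended observations.

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(10)
n_years, n_days = 40, 365
day = np.arange(n_days)
climatology = 12 + 8 * np.sin(2 * np.pi * (day - 110) / 365.0)
observed = climatology + rng.normal(0, 3, size=(n_years, n_days))   # years x days matrix

pca = PCA(n_components=10).fit(observed)
scores = pca.transform(observed)

# generate new, mutually independent scores with the observed per-component variance
synthetic_scores = rng.normal(0.0, scores.std(axis=0), size=(1000, scores.shape[1]))
synthetic_years = pca.inverse_transform(synthetic_scores)

# the generator should reproduce the interannual variance of, e.g., yearly mean temperature
print(observed.mean(axis=1).std(), synthetic_years.mean(axis=1).std())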
Campbell, P.K.E.; Middleton, E.M.; Thome, K.J.; Kokaly, Raymond F.; Huemmrich, K.F.; Novick, K.A.; Brunsell, N.A.
2013-01-01
This study evaluated Earth Observing 1 (EO-1) Hyperion reflectance time series at established calibration sites to assess the instrument stability and suitability for monitoring vegetation functional parameters. Our analysis using three pseudo-invariant calibration sites in North America indicated that the reflectance time series are devoid of apparent spectral trends and their stability consistently is within 2.5-5 percent throughout most of the spectral range spanning the 12+ year data record. Using three vegetated sites instrumented with eddy covariance towers, the Hyperion reflectance time series were evaluated for their ability to determine important variables of ecosystem function. A number of narrowband and derivative vegetation indices (VI) closely described the seasonal profiles in vegetation function and ecosystem carbon exchange (e.g., net and gross ecosystem productivity) in three very different ecosystems, including a hardwood forest and tallgrass prairie in North America, and a Miombo woodland in Africa. Our results demonstrate the potential for scaling the carbon flux tower measurements to local and regional landscape levels. The VIs with stronger relationships to the CO2 parameters were derived using continuous reflectance spectra and included wavelengths associated with chlorophyll content and/or chlorophyll fluorescence. Since these indices cannot be calculated from broadband multispectral instrument data, the opportunity to exploit these spectrometer-based VIs in the future will depend on the launch of satellites such as EnMAP and HyspIRI. This study highlights the practical utility of space-borne spectrometers for characterization of the spectral stability and uniformity of the calibration sites in support of sensor cross-comparisons, and demonstrates the potential of narrowband VIs to track and spatially extend ecosystem functional status as well as carbon processes measured at flux towers.
Suárez Rodríguez, David; del Valle Soto, Miguel
2017-01-01
Background The aim of this study is to find the differences between two specific interval exercises. We begin with the hypothesis that the use of microintervals of work and rest allow for greater intensity of play and a reduction in fatigue. Methods Thirteen competition-level male tennis players took part in two interval training exercises comprising nine 2 min series, which consisted of hitting the ball with cross-court forehand and backhand shots, behind the service box. One was a high-intensity interval training (HIIT), made up of periods of continuous work lasting 2 min, and the other was intermittent interval training (IIT), this time with intermittent 2 min intervals, alternating periods of work with rest periods. Average heart rate (HR) and lactate levels were registered in order to observe the physiological intensity of the two exercises, along with the Borg Scale results for perceived exertion and the number of shots and errors in order to determine the intensity achieved and the degree of fatigue throughout the exercise. Results There were no significant differences in the average heart rate, lactate or the Borg Scale. Significant differences were registered, on the other hand, with a greater number of shots in the first two HIIT series (series 1 p>0.009; series 2 p>0.056), but not in the third. The number of errors was significantly lower in all the IIT series (series 1 p<0.035; series 2 p<0.010; series 3 p<0.001). Conclusion Our study suggests that high-intensity intermittent training allows for greater intensity of play in relation to the real time spent on the exercise, reduced fatigue levels and the maintaining of greater precision in specific tennis-related exercises. PMID:29021912
Infrared photometry of the black hole candidate Sagittarius A*
NASA Technical Reports Server (NTRS)
Close, Laird M.; Mccarthy, Donald W. JR.; Melia, Fulvio
1995-01-01
An infrared source has been imaged within 0.2 +/- 0.3 arcseconds of the unique Galactic center radio source Sgr A*. High angular resolution (averaged value of the Full Width at Half Maximum (FWHM) approximately 0.55 arcseconds) was achieved by rapid (approximately 50 Hz) real-time image motion compensation. The source's near-infrared magnitudes (K = 12.1 +/- 0.3, H = 13.7 +/- 0.3, and J = 16.6 +/- 0.4) are consistent with a hot object reddened by the local extinction (A(sub v) approximately 27). At the 3 sigma level of confidence, a time series of 80 images limits the source variability to less than 50% on timescales from 3 to 30 minutes. The photometry is consistent with the emission from a simple accretion disk model for an approximately 1 x 10(exp 6) solar mass black hole. However, the fluxes are also consistent with a hot luminous (L approximately 10(exp 3.5) to 10(exp 4-6) solar luminosities) central cluster star positionally coincident with Sgr A*.
Weiss, Jonathan D.
1995-01-01
A shock velocity and damage location sensor providing a means of measuring shock speed and damage location. The sensor consists of a long series of time-of-arrival "points" constructed with fiber optics. The fiber optic sensor apparatus measures shock velocity as the fiber sensor is progressively crushed while a shock wave proceeds in a direction along the fiber. The light received by a receiving means changes as time-of-arrival points are destroyed when the sensor is disturbed by the shock. The sensor may comprise a transmitting fiber bent into a series of loops and fused to a receiving fiber at various places, the time-of-arrival points, along the receiving fiber's length. At the "points" of contact, where a portion of the light leaves the transmitting fiber and enters the receiving fiber, the loops would be required to allow the light to travel backwards through the receiving fiber toward a receiving means. The sensor may also comprise a single optical fiber wherein the time-of-arrival points are comprised of reflection planes distributed along the fiber's length. In this configuration, as the shock front proceeds along the fiber it destroys one reflector after another. The output received by a receiving means from this sensor may be a series of downward steps produced as the shock wave destroys one time-of-arrival point after another, or a nonsequential pattern of steps in the event time-of-arrival points are destroyed at any point along the sensor.
NASA Astrophysics Data System (ADS)
Doubre, C.; Deprez, A.; Masson, F.; Socquet, A.; Ulrich, P.; Ibrahim Ahmed, S.; de Chabalier, J. B.; Ahmadine Omar, A.; Vigny, C.; Ruegg, J. C.
2014-12-01
We present the results of the latest GPS campaign conducted over the Djiboutian part of Eastern Afar. A large and dense geodetic network has been measured regularly since the 1990s, and allows an accurate determination of the velocity field associated with the western tip of the Arabia-Somalia divergent plate boundary. Within the Tadjoura Gulf, the Aden ridge consists of a series of 3 en échelon, submerged spreading segments, except for the Asal segment, which is partly above water. The repetition of 6 to 7 measurements, together with 6 permanent continuous GNSS stations, provides an opportunity to study the spatial distribution of the active extension in relation to these 3 segments, but also to study time variations of the displacements, which are expected to be largely transitory because of the occurrence of dyking events, small to intermediate seismic events, and volcanic activity. The divergent motion of the two margins of the Gulf occurs at ~15 mm/yr, which is consistent with the long-term estimates of the Arabia-Somalia motion. Across the Asal segment, this value confirms that the effect of the dyking event in 1978 has ended. The velocity gradients show that the deformation is distributed from the southern to the northern rift shoulder. As revealed by the InSAR data, however, the along-axis variations of the deformation pattern, i.e. clear superficial active faults in the SE part of the rift and deep opening in the NW part, suggest the remaining influence of the previous dyke intrusions within the segment inner floor. The time series show that the velocity field was more heterogeneous before 2003, when the micro-seismic activity was significant, particularly around the volcanic center. The striking feature of the time evolution of the velocity field is the transition from extension mainly localized across the Asal segment before 2003 to a more distributed extension, implying the influence of the southern Quaternary structures forming the Gaggade and Hanle Basins. This results in a decrease of the opening velocity across the Asal segment. This crucial change suggests that the activity of the volcanic/geothermal centre in the segment is a determining factor in the spatial organization of the deformation, by affecting the activity of the normal faults and thereby favoring the concentration of the extensional deformation.
Simulating extreme low-discharge events for the Rhine using a stochastic model
NASA Astrophysics Data System (ADS)
Macian-Sorribes, Hector; Mens, Marjolein; Schasfoort, Femke; Diermanse, Ferdinand; Pulido-Velazquez, Manuel
2017-04-01
The specific features of hydrological droughts make them more difficult to analyse than other water-related phenomena: their time scales are longer (months to several years), so fewer historical events are available, and the drought severity and associated damage depend on a combination of variables with no clear prevalence (e.g., total water deficit, maximum deficit, and duration). As part of drought risk analysis, which aims to provide insight into the variability of hydrological conditions and associated socio-economic impacts, long synthetic time series should therefore be developed. In this contribution, we extend the available inflow time series using stochastic autoregressive modelling. This extension can improve the characterization of the extreme range and can define extreme droughts with similar return periods but different patterns that can lead to distinctly different damages. The methodology consists of: 1) fitting an autoregressive model (AR, ARMA…) to the available records; 2) generating extended time series (thousands of years); 3) performing a frequency analysis with different characteristic variables (total deficit, maximum deficit, and so on); and 4) selecting extreme drought events associated with different characteristic variables and return periods. The methodology was applied to the Rhine river discharge at Lobith, where the Rhine enters The Netherlands. A monthly ARMA(1,1) autoregressive model with seasonally varying parameters was fitted and successfully validated against the historical records available since 1901. The maximum monthly deficit with respect to a threshold value of 1800 m3/s and the average discharge over a given time span (in m3/s) were chosen as indicators to identify drought periods. A synthetic series of 10,000 years of discharges was generated using the validated ARMA model. Two time spans were considered in the analysis: the whole calendar year and the half-year period between April and September (the summer half-year, when water demands are highest). Frequency analysis was performed for both indicators and time spans for the generated time series and the historical records. The comparison between observed and generated series showed that the ARMA model reproduces the maximum deficits and total discharges well, especially for the summer half-year period. The resulting synthetic series are therefore considered credible. These synthetic series, with their wealth of information, can then be used as inputs to damage assessment models, together with information on precipitation deficits, in order to estimate the risk that lower inflows pose to the urban, agricultural, and shipping sectors, among others. This will help in associating economic losses with return periods, as well as in estimating how droughts with similar return periods but different patterns can lead to different damages. ACKNOWLEDGEMENT: This study has been supported by the European Union's Horizon 2020 research and innovation programme under the IMPREX project (grant agreement no: 641.811), and by the Climate-KIC Pioneers into Practice Program supported by the European Union's EIT.
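As an illustration of the approach described above, the following fragment fits an ARMA(1,1) model to a monthly discharge record, simulates a long synthetic series, and computes empirical return periods of annual maximum deficits below a 1800 m3/s threshold. It is a minimal sketch: the `discharge` series is a synthetic placeholder, the seasonal variation of the ARMA parameters used in the study is ignored, and statsmodels' ARIMA class (with d = 0) stands in for the ARMA fit.

```python
# Minimal sketch: fit an ARMA(1,1) to a monthly discharge record and generate
# a long synthetic series for drought frequency analysis. The discharge data
# below are placeholders, and seasonal parameter variation is ignored.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
# Placeholder for the observed monthly discharge at Lobith (m3/s).
discharge = pd.Series(2200 + 400 * rng.standard_normal(12 * 100))

# ARMA(1,1) == ARIMA(p=1, d=0, q=1)
res = ARIMA(discharge, order=(1, 0, 1)).fit()

# Generate a 10,000-year synthetic monthly series from the fitted model.
synthetic = res.simulate(nsimulations=12 * 10_000)

# Drought indicator: monthly deficit below a 1800 m3/s threshold.
threshold = 1800.0
deficit = (threshold - synthetic).clip(lower=0)
annual_max_deficit = deficit.groupby(np.arange(len(deficit)) // 12).max()

# Empirical return periods of the largest annual deficits.
sorted_def = np.sort(annual_max_deficit.to_numpy())[::-1]
return_period = (len(sorted_def) + 1) / np.arange(1, len(sorted_def) + 1)
print(list(zip(return_period[:5], sorted_def[:5])))
```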
Scale-free avalanche dynamics in the stock market
NASA Astrophysics Data System (ADS)
Bartolozzi, M.; Leinweber, D. B.; Thomas, A. W.
2006-10-01
Self-organized criticality (SOC) has been claimed to play an important role in many natural and social systems. In the present work we empirically investigate the relevance of this theory to stock-market dynamics. Avalanches in stock-market indices are identified using a multi-scale wavelet-filtering analysis designed to remove Gaussian noise from the index. Here, new methods are developed to identify the optimal filtering parameters which maximize the noise removal. The filtered time series is reconstructed and compared with the original time series. A statistical analysis of both high-frequency Nasdaq E-mini Futures and daily Dow Jones data is performed. The results of this new analysis confirm earlier results revealing a robust power-law behaviour in the probability distribution functions of the sizes, durations, and laminar times between avalanches. This power-law behaviour holds the potential to be established as a stylized fact of stock market indices in general. While the memory process implied by the power-law distribution of the laminar times is not consistent with classical models for SOC, we note that a power-law distribution of the laminar times cannot be used to rule out self-organized critical behaviour.
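As a rough illustration of the kind of avalanche statistics described above (not the paper's wavelet-based procedure), the toy sketch below marks avalanches as contiguous runs of |returns| above a fixed threshold, collects the laminar times between them, and estimates a tail exponent with a Hill-type maximum-likelihood formula; the `returns` array is a random placeholder.

```python
# Toy sketch: identify "avalanches" as contiguous runs of |returns| above a
# threshold and estimate a power-law exponent for the laminar times (waiting
# times between avalanches) with a Hill-type MLE. The multi-scale wavelet
# filtering used in the paper is not reproduced here.
import numpy as np

rng = np.random.default_rng(1)
returns = rng.standard_normal(100_000)          # placeholder for index returns
active = np.abs(returns) > 2.0                  # avalanche condition

# Laminar times: lengths of the quiet runs between avalanches.
edges = np.flatnonzero(np.diff(active.astype(int)) != 0) + 1
runs = np.split(active, edges)
laminar = np.array([len(r) for r in runs if not r[0]])

# Maximum-likelihood (Hill-type) estimate of the tail exponent for x >= xmin.
xmin = 2
tail = laminar[laminar >= xmin].astype(float)
alpha = 1.0 + len(tail) / np.sum(np.log(tail / (xmin - 0.5)))
print(f"estimated exponent: {alpha:.2f} from {len(tail)} laminar times")
```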
NASA Astrophysics Data System (ADS)
Barnhart, B. L.; Eichinger, W. E.; Prueger, J. H.
2010-12-01
Hilbert-Huang transform (HHT) is a relatively new data analysis tool which is used to analyze nonstationary and nonlinear time series data. It consists of an algorithm, called empirical mode decomposition (EMD), which extracts the cyclic components embedded within time series data, as well as Hilbert spectral analysis (HSA) which displays the time and frequency dependent energy contributions from each component in the form of a spectrogram. The method can be considered a generalized form of Fourier analysis which can describe the intrinsic cycles of data with basis functions whose amplitudes and phases may vary with time. The HHT will be introduced and compared to current spectral analysis tools such as Fourier analysis, short-time Fourier analysis, wavelet analysis and Wigner-Ville distributions. A number of applications are also presented which demonstrate the strengths and limitations of the tool, including analyzing sunspot number variability and total solar irradiance proxies as well as global averaged temperature and carbon dioxide concentration. Also, near-surface atmospheric quantities such as temperature and wind velocity are analyzed to demonstrate the nonstationarity of the atmosphere.
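A minimal sketch of the HHT workflow, assuming the third-party EMD-signal package (imported as `PyEMD`) and SciPy's Hilbert transform: the signal is decomposed into intrinsic mode functions, and an instantaneous amplitude and frequency are derived for each. The toy signal and parameters are illustrative.

```python
# Minimal HHT sketch (assumes the third-party "EMD-signal" package, imported
# as PyEMD): decompose a nonstationary signal into intrinsic mode functions
# (IMFs), then compute instantaneous amplitude and frequency per IMF with the
# Hilbert transform (the HSA step).
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD

fs = 100.0                                   # sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)
# Toy nonstationary signal: a chirp plus a slow oscillation and a trend.
signal = (np.sin(2 * np.pi * (0.5 + 0.05 * t) * t)
          + 0.5 * np.sin(2 * np.pi * 0.05 * t) + 0.01 * t)

imfs = EMD().emd(signal, t)                  # empirical mode decomposition

for k, imf in enumerate(imfs):
    analytic = hilbert(imf)
    amplitude = np.abs(analytic)
    # Instantaneous frequency from the derivative of the unwrapped phase.
    inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)
    print(f"IMF {k}: mean amplitude {amplitude.mean():.2f}, "
          f"mean frequency {inst_freq.mean():.3f} Hz")
```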
Detection of a sudden change of the field time series based on the Lorenz system
Li, Fang; Shen, BingLu; Yan, PengCheng; Song, Jian; Ma, DeShan
2017-01-01
We conducted an exploratory study of the detection of a sudden change of a field time series based on the numerical solution of the Lorenz system. First, the times when the Lorenz path jumped between the regions on the left and right of the equilibrium point of the Lorenz system were quantitatively marked, and the sudden change times of the Lorenz system were obtained. Second, the numerical solution of the Lorenz system was regarded as a vector; thus, this solution could be considered as a vector time series. We transformed the vector time series into a scalar time series using the vector inner product, considering the geometric and topological features of the Lorenz system path. Third, the sudden changes of the resulting time series were detected using the sliding t-test method. Comparing the test results with the quantitatively marked times indicated that the method could detect every sudden change of the Lorenz path, showing that the method is effective. Finally, we used the method to detect sudden changes in a pressure field time series and a temperature field time series, and obtained good results for both series, which indicates that the method can be applied to high-dimensional vector time series. Mathematically, there is no essential difference between a field time series and a vector time series; thus, we provide a new method for the detection of sudden changes in field time series. PMID:28141832
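A minimal sketch of the procedure under simplifying assumptions: integrate the standard Lorenz-63 equations, reduce the vector solution to a scalar series via an inner product with a fixed reference vector, and scan it with a sliding two-sample t-test. The reference vector, window length, and significance threshold are illustrative choices, not the paper's settings.

```python
# Minimal sketch: integrate the Lorenz-63 system, reduce the vector solution
# to a scalar series via an inner product with a fixed reference vector, and
# flag abrupt changes with a sliding two-sample t-test.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.stats import ttest_ind

def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

t_eval = np.arange(0, 60, 0.01)
sol = solve_ivp(lorenz, (0, 60), [1.0, 1.0, 1.0], t_eval=t_eval, rtol=1e-8)

ref = np.array([1.0, 1.0, 0.0])              # fixed reference vector (illustrative)
series = sol.y.T @ ref                       # vector time series -> scalar series

window = 50                                  # points per sliding sub-sample
changes = []
for i in range(window, len(series) - window):
    left, right = series[i - window:i], series[i:i + window]
    t_stat, p = ttest_ind(left, right, equal_var=False)
    if p < 1e-6:                             # conservative illustrative threshold
        changes.append(t_eval[i])
print(f"{len(changes)} candidate change points, first few: {changes[:5]}")
```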
Hyvärinen, A
1985-01-01
The main purpose of the present study was to describe the statistical behaviour of daily analytical errors in the dimensions of place and time, providing a statistical basis for realistic estimates of the analytical error, and hence allowing the importance of the error and the relative contributions of its different sources to be re-evaluated. The observation material consists of creatinine and glucose results for control sera measured in daily routine quality control in five laboratories for a period of one year. The observation data were processed and computed by means of an automated data processing system. Graphic representations of time series of daily observations, as well as their means and dispersion limits when grouped over various time intervals, were investigated. For partitioning the total variation, several two-way analyses of variance were done with laboratory and various time classifications as factors. Pooled sets of observations were tested for normality of distribution and for consistency of variances, and the distribution characteristics of error variation in different categories of place and time were compared. The time series showed that errors typically varied from day to day. Due to irregular fluctuations in general, and to particular seasonal effects in creatinine, stable estimates of means or of dispersions for errors in individual laboratories could not be easily obtained over short periods of time but only from data sets pooled over long intervals (preferably at least one year). Pooled estimates of proportions of intralaboratory variation were relatively low (less than 33%) when the variation was pooled within days. However, when the variation was pooled over longer intervals this proportion increased considerably, even to a maximum of 89-98% (95-98% in each method category) when an outlying laboratory in glucose was omitted, with a concomitant decrease in the interaction component (representing laboratory-dependent variation with time). This indicates that a substantial part of the variation comes from intralaboratory variation with time rather than from constant interlaboratory differences. Normality and consistency of statistical distributions were best achieved in the long-term intralaboratory sets of the data, under which conditions the statistical estimates of error variability were also most characteristic of the individual laboratories rather than necessarily being similar to one another. Mixing of data from different laboratories may give heterogeneous and nonparametric distributions and hence is not advisable. (ABSTRACT TRUNCATED AT 400 WORDS)
Phase-synchronization, energy cascade, and intermittency in solar-wind turbulence.
Perri, S; Carbone, V; Vecchio, A; Bruno, R; Korth, H; Zurbuchen, T H; Sorriso-Valvo, L
2012-12-14
The energy cascade in solar wind magnetic turbulence is investigated using MESSENGER data in the inner heliosphere. The decomposition of magnetic field time series in intrinsic functions, each characterized by a typical time scale, reveals phase reorganization. This allows for the identification of structures of all sizes generated by the nonlinear turbulent cascade, covering both the inertial and the dispersive ranges of the turbulent magnetic power spectrum. We find that the correlation (or anticorrelation) of phases occurs between pairs of neighboring time scales, whenever localized peaks of magnetic energy are present at both scales, consistent with the local character of the energy transfer process.
Detecting multiple moving objects in crowded environments with coherent motion regions
Cheriyadat, Anil M.; Radke, Richard J.
2013-06-11
Coherent motion regions are defined in a three-dimensional space that includes a time dimension; they extend in time as well as space, enforcing consistency in detected objects over long time periods and making the algorithm robust to noisy or short point tracks. The algorithm enforces the constraint that selected coherent motion regions contain disjoint sets of tracks, operates directly on raw, unconditioned low-level feature point tracks, and minimizes a global measure of the coherent motion regions. At least one discrete moving object is identified in a time series of video images based on trajectory similarity factors, a measure of the maximum distance between a pair of feature point tracks.
3-component time-dependent crustal deformation in Southern California from Sentinel-1 and GPS
NASA Astrophysics Data System (ADS)
Tymofyeyeva, E.; Fialko, Y. A.
2017-12-01
We combine data from the Sentinel-1 InSAR mission collected between 2014 and 2017 with continuous GPS measurements to calculate the three components of the interseismic surface velocity field in Southern California at the resolution of the InSAR data (~100 m). We use overlapping InSAR tracks with two different look geometries (descending tracks 71, 173, and 144, and ascending tracks 64 and 166) to obtain the three orthogonal components of surface motion. Because of the under-determined nature of the problem, we use the local azimuth of the horizontal velocity vector as an additional constraint. The spatially variable azimuths of the horizontal velocity are obtained by interpolating data from the continuous GPS network. We estimate both secular velocities and displacement time series. The latter are obtained by combining InSAR time series from different lines of sight with time-dependent azimuths computed using continuous GPS time series at every InSAR epoch. We use the CANDIS method [Tymofyeyeva and Fialko, 2015], a technique based on iterative common point stacking, to correct the InSAR data for tropospheric and ionospheric artifacts when calculating secular velocities and time series, and to isolate low-amplitude deformation signals in our study region. The obtained horizontal (East and North) components of secular velocity exhibit long-wavelength patterns consistent with strain accumulation on major faults of the Pacific-North America plate boundary. The vertical component of velocity reveals a number of localized uplift and subsidence anomalies, most likely related to hydrologic effects and anthropogenic activity. In particular, in the Los Angeles basin we observe localized uplift of about 10-15 mm/yr near Anaheim, Long Beach, and Redondo Beach, as well as areas of rapid subsidence near Irvine and Santa Monica, which are likely caused by the injection of water in the oil fields and by the pumping and recharge cycles of the aquifers in the basin.
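A hedged sketch of the azimuth-constrained decomposition idea: once the horizontal azimuth is fixed from interpolated GPS, each line-of-sight (LOS) velocity becomes a linear equation in two unknowns per pixel (the horizontal speed along that azimuth and the vertical rate), solvable by least squares. The LOS unit vectors and values below are illustrative, not the actual Sentinel-1 track geometries.

```python
# Hedged sketch of the azimuth-constrained decomposition: with the horizontal
# direction (azimuth) fixed from interpolated GPS, each LOS velocity is a
# linear equation in the horizontal speed h and the vertical rate v_up.
import numpy as np

def decompose(los_velocities, los_unit_vectors, azimuth_rad):
    """Solve h (along the given azimuth) and v_up from >=2 LOS observations.

    los_unit_vectors: rows are (east, north, up) components of each LOS.
    """
    e, n, u = (los_unit_vectors[:, 0], los_unit_vectors[:, 1],
               los_unit_vectors[:, 2])
    # d_los = h * (e*sin(az) + n*cos(az)) + v_up * u
    A = np.column_stack([e * np.sin(azimuth_rad) + n * np.cos(azimuth_rad), u])
    (h, v_up), *_ = np.linalg.lstsq(A, los_velocities, rcond=None)
    return h, v_up

# Example with three illustrative viewing geometries (velocities in mm/yr).
los_vecs = np.array([[-0.60, -0.12, 0.79],     # descending-like
                     [-0.62, -0.10, 0.78],
                     [ 0.61, -0.11, 0.78]])    # ascending-like
d_los = np.array([3.1, 2.8, -4.0])
h, v_up = decompose(d_los, los_vecs, azimuth_rad=np.deg2rad(310.0))
print(f"horizontal speed {h:.1f} mm/yr along azimuth, vertical {v_up:.1f} mm/yr")
```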
Wind, Wave, and Tidal Energy Without Power Conditioning
NASA Technical Reports Server (NTRS)
Jones, Jack A.
2013-01-01
Most present wind, wave, and tidal energy systems require expensive power conditioning systems that reduce overall efficiency. This new design eliminates power conditioning all, or nearly all, of the time. Wind, wave, and tidal energy systems can transmit their energy to pumps that send high-pressure fluid to a central power production area. The central power production area can consist of a series of hydraulic generators. The hydraulic generators can be variable-displacement generators such that the RPM, and thus the voltage, remains constant, eliminating the need for further power conditioning. A series of wind blades is attached to a series of radial piston pumps, which pump fluid to a series of axial piston motors attached to generators. As the wind decreases, the amount of energy is reduced, and the number of active hydraulic generators can be reduced to maintain a nearly constant RPM. If the axial piston motors have variable displacement, an exact RPM can be maintained for all, or nearly all, wind speeds. Analyses have been performed that show over 20% performance improvements with this technique over conventional wind turbines.
New Insights into Signed Path Coefficient Granger Causality Analysis
Zhang, Jian; Li, Chong; Jiang, Tianzi
2016-01-01
Granger causality analysis, a time series analysis technique derived from econometrics, has been applied in an ever-increasing number of publications in the field of neuroscience, including fMRI, EEG/MEG, and fNIRS. The present study mainly focuses on the validity of “signed path coefficient Granger causality,” a Granger-causality-derived analysis method that has been adopted by many fMRI studies in the last few years. This method generally estimates the causal effect among the time series by an order-1 autoregression, and interprets a positive or negative coefficient as an “excitatory” or “inhibitory” influence. In the current work we conducted a series of computations on resting-state fMRI data and simulation experiments to illustrate that the signed path coefficient method is flawed and untenable, because the autoregressive coefficients are not always consistent with the real causal relationships, which would inevitably lead to erroneous conclusions. Overall, our findings suggest that the applicability of this kind of causality analysis is rather limited; hence researchers should be more cautious in applying signed path coefficient Granger causality to fMRI data to avoid misinterpretation. PMID:27833547
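To make the critiqued quantity concrete, the sketch below fits an order-1 vector autoregression to two simulated series and reads off the signed lag-1 coefficient that "signed path coefficient Granger causality" would interpret as an excitatory or inhibitory influence; as the abstract argues, such signs need not reflect the true coupling. The toy coupling and variable names are illustrative.

```python
# Small sketch of the quantity the paper critiques: fit an order-1 vector
# autoregression (VAR(1)) to two series and read the signed off-diagonal
# coefficient that would be labelled an "excitatory" or "inhibitory" influence.
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(2)
n = 2000
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    # Ground truth in this toy simulation: x drives y with a positive coupling.
    x[t] = 0.8 * x[t - 1] + rng.standard_normal()
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.standard_normal()

res = VAR(np.column_stack([x, y])).fit(1)
A1 = res.coefs[0]                 # lag-1 coefficient matrix, shape (2, 2)
# A1[1, 0] is the signed "path coefficient" from x to y; its sign is not a
# reliable indicator of excitatory vs. inhibitory coupling in general.
print("A1[y<-x] =", round(A1[1, 0], 3))
```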
Helder, Onno K; Brug, Johannes; van Goudoever, Johannes B; Looman, Caspar W N; Reiss, Irwin K M; Kornelisse, René F
2014-07-01
Sustained high compliance with hand hygiene (HH) is needed to reduce nosocomial bloodstream infections (NBSIs). However, over time, a wash-out effect often occurs. We studied the long-term effect of sequential HH-promoting interventions. An observational study with an interrupted time series analysis of the occurrence of NBSI was performed in very low-birth weight (VLBW) infants. Interventions consisted of an education program, gain-framed screen saver messages, and an infection prevention week with the introduction of consistent glove use. A total of 1,964 VLBW infants admitted between January 1, 2002, and December 31, 2011, were studied. The proportion of infants with ≥1 NBSI decreased from 47.6% to 21.2% (P < .01); the number of NBSIs per 1,000 patient days decreased from 16.8 to 8.9 (P < .01). Preintervention, the number of NBSIs per 1,000 patient days significantly increased by 0.74 per quartile (95% confidence interval [CI], 0.27-1.22). The first intervention was followed by a significantly declining trend in NBSIs of -1.27 per quartile (95% CI, -2.04 to -0.49). The subsequent interventions were followed by a neutral trend change. The relative contributions of coagulase-negative staphylococci and Staphylococcus aureus as causative pathogens decreased significantly over time. Sequential HH promotion seems to contribute to a sustained low NBSI rate. Copyright © 2014 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Mosby, Inc. All rights reserved.
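A minimal sketch of an interrupted time-series (segmented) regression with a level change and a slope change at a single intervention, the kind of model typically used for analyses like the one described; the quarterly rates below are synthetic placeholders, not the study's data.

```python
# Minimal sketch of an interrupted time-series (segmented) regression with a
# level change and a slope change at one intervention. The quarterly NBSI
# rates below are synthetic placeholders, not the study's data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
quarters = np.arange(40)                       # 10 years of quarterly data
intervention = 16                              # quarter of the first intervention
post = (quarters >= intervention).astype(float)
time_since = np.where(post == 1, quarters - intervention, 0)

# Synthetic rate: rising trend before, declining trend after the intervention.
rate = 10 + 0.7 * quarters - 1.2 * time_since + rng.normal(0, 1.5, quarters.size)

X = sm.add_constant(np.column_stack([quarters, post, time_since]))
fit = sm.OLS(rate, X).fit()
print(fit.params)   # [intercept, pre-trend, level change, trend change]
```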
Multiple Indicator Stationary Time Series Models.
ERIC Educational Resources Information Center
Sivo, Stephen A.
2001-01-01
Discusses the propriety and practical advantages of specifying multivariate time series models in the context of structural equation modeling for time series and longitudinal panel data. For time series data, the multiple indicator model specification improves on classical time series analysis. For panel data, the multiple indicator model…
Perl Modules for Constructing Iterators
NASA Technical Reports Server (NTRS)
Tilmes, Curt
2009-01-01
The Iterator Perl Module provides a general-purpose framework for constructing iterator objects within Perl, and a standard API for interacting with those objects. Iterators are an object-oriented design pattern in which a description of a series of values is used in a constructor. Subsequent queries can request values in that series. These Perl modules build on the standard Iterator framework and provide iterators for some other types of values. Iterator::DateTime constructs iterators from DateTime objects or Date::Parse descriptions and iCal/RFC 2445 style recurrence descriptions. It supports a variety of input parameters, including a start to the sequence, an end to the sequence, an iCal/RFC 2445 recurrence describing the frequency of the values in the series, and a format description that can refine the presentation of the DateTime. Iterator::String constructs iterators from string representations. This module is useful in contexts where the API consists of supplying a string and getting back an iterator where the specific iteration desired is opaque to the caller. It is of particular value to the Iterator::Hash module, which provides nested iterations. Iterator::Hash constructs iterators from Perl hashes that can include multiple iterators. The constructed iterators will return all the permutations of the iterations of the hash by nested iteration of embedded iterators. A hash simply includes a set of keys mapped to values; it is a very common data structure used throughout Perl programming. The Iterator::Hash module allows a hash to include strings defining iterators (parsed and dispatched with Iterator::String) that are used to construct an overall series of hash values.
Model for the respiratory modulation of the heart beat-to-beat time interval series
NASA Astrophysics Data System (ADS)
Capurro, Alberto; Diambra, Luis; Malta, C. P.
2005-09-01
In this study we present a model for the respiratory modulation of the heart beat-to-beat interval series. The model consists of a set of differential equations used to simulate the membrane potential of a single rabbit sinoatrial node cell, excited with a periodic input signal with added correlated noise. This signal, which simulates the input from the autonomic nervous system to the sinoatrial node, was included in the pacemaker equations as a modulation of the iNaK current pump and the potassium current iK. We focus on modeling the heart beat-to-beat time interval series from normal subjects during meditation with the Kundalini Yoga and Chi techniques. The analysis of the experimental data indicates that while the embedding of pre-meditation and control cases has a roughly circular shape, it acquires a polygonal shape during meditation: triangular for the Kundalini Yoga data and quadrangular for the Chi data. The model was used to assess the waveshape of the respiratory signals needed to reproduce the trajectory of the experimental data in the phase space. The embedding of the Chi data could be reproduced using a periodic signal obtained by smoothing a square wave. In the case of Kundalini Yoga data, the embedding was reproduced with a periodic signal obtained by smoothing a triangular wave having a rising branch of longer duration than the decreasing branch. Our study provides an estimation of the respiratory signal using only the heart beat-to-beat time interval series.
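A small sketch of the two respiratory drive waveshapes discussed, built with SciPy's waveform generators and a low-pass filter: a smoothed square wave (Chi) and a smoothed triangular wave with a longer rising branch (Kundalini Yoga). The sinoatrial-node membrane equations themselves are not reproduced, and the frequencies, filter settings, and noise level are illustrative.

```python
# Small sketch of the two respiratory drive waveshapes discussed in the text:
# a smoothed square wave (Chi) and a smoothed asymmetric triangular wave
# (Kundalini Yoga, longer rising branch). The pacemaker ODEs are not included.
import numpy as np
from scipy.signal import square, sawtooth, butter, filtfilt

fs = 100.0                                     # samples per second
t = np.arange(0, 60, 1 / fs)
f_resp = 0.1                                   # breathing frequency (Hz), illustrative

raw_square = square(2 * np.pi * f_resp * t)
# width=0.7 -> rising branch longer than the falling branch
raw_triangle = sawtooth(2 * np.pi * f_resp * t, width=0.7)

b, a = butter(2, 0.5 / (fs / 2))               # gentle low-pass to smooth the edges
chi_drive = filtfilt(b, a, raw_square)
yoga_drive = filtfilt(b, a, raw_triangle)

# Correlated noise added to the periodic drive, as in the model description.
noise = np.cumsum(np.random.default_rng(4).normal(0, 0.01, t.size))
modulation = chi_drive + noise                 # e.g. input to the pacemaker model
print(modulation[:5])
```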
[Winter wheat area estimation with MODIS-NDVI time series based on parcel].
Li, Le; Zhang, Jin-shui; Zhu, Wen-quan; Hu, Tan-gao; Hou, Dong
2011-05-01
Several attributes of MODIS (Moderate Resolution Imaging Spectroradiometer) data, especially the short temporal intervals and the global coverage, provide an extremely efficient way to map cropland and monitor its seasonal change. However, the reliability of the resulting measurements is challenged by the limited spatial resolution. Parcel data provide clear geo-location and obvious boundary information for cropland. Moreover, the spectral differences and the complexity of mixed pixels are weak within parcels. All of these make area estimation based on parcels more advantageous than estimation based on pixels. In the present study, winter wheat area estimation based on MODIS-NDVI time series has been performed with the support of cultivated land parcels in Tongzhou, Beijing. In order to extract the regional winter wheat acreage, multiple regression methods were used to model the stable regression relationship between MODIS-NDVI time series data and TM samples within parcels. In this way, the consistency of the extraction results from MODIS and TM stably reaches up to 96% when the samples account for 15% of the whole area. The results show that the use of parcel data can effectively reduce the errors in recognition results from MODIS-NDVI time series data caused by the low spatial resolution. Therefore, by combining moderate and low resolution data, winter wheat area estimation becomes feasible in large regions that lack complete medium resolution coverage or whose images are contaminated by clouds. It also provides preliminary experiments for the area estimation of other crops.
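A minimal sketch of the parcel-based estimation idea: regress the winter-wheat fraction of a labelled subset of parcels (the "TM samples") on their MODIS-NDVI time-series values, then apply the fitted relation to every parcel and sum area-weighted fractions. All arrays are synthetic placeholders and the feature construction is illustrative, not the study's exact regression.

```python
# Minimal sketch of the parcel-based estimation idea: regress the winter-wheat
# fraction of TM-sampled parcels on their MODIS-NDVI time-series values, then
# apply the fitted relationship to every parcel. All data are placeholders.
import numpy as np

rng = np.random.default_rng(5)
n_parcels, n_dates = 500, 12
ndvi = rng.uniform(0.1, 0.8, size=(n_parcels, n_dates))   # per-parcel NDVI series
true_frac = np.clip(0.2 + 1.5 * (ndvi[:, 3] - ndvi[:, 8]), 0, 1)

# "TM samples": a labelled subset (here 15% of parcels) with known wheat fractions.
sample = rng.choice(n_parcels, size=int(0.15 * n_parcels), replace=False)
X = np.column_stack([np.ones(len(sample)), ndvi[sample]])
coef, *_ = np.linalg.lstsq(X, true_frac[sample], rcond=None)

# Apply to all parcels and sum area-weighted fractions for the regional total.
pred_frac = np.clip(np.column_stack([np.ones(n_parcels), ndvi]) @ coef, 0, 1)
parcel_area_ha = rng.uniform(1, 20, n_parcels)
print("estimated wheat area (ha):", round(float(pred_frac @ parcel_area_ha), 1))
```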
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marzouk, Youssef; Fast P.; Kraus, M.
2006-01-01
Terrorist attacks using an aerosolized pathogen preparation have gained credibility as a national security concern after the anthrax attacks of 2001. The ability to characterize such attacks, i.e., to estimate the number of people infected, the time of infection, and the average dose received, is important when planning a medical response. We address this question of characterization by formulating a Bayesian inverse problem predicated on a short time series of diagnosed patients exhibiting symptoms. To be of relevance to response planning, we limit ourselves to 3-5 days of data. In tests performed with anthrax as the pathogen, we find that these data are usually sufficient, especially if the model of the outbreak used in the inverse problem is an accurate one. In some cases the scarcity of data may initially support outbreak characterizations at odds with the true one, but with sufficient data the correct inferences are recovered; in other words, the inverse problem posed and its solution methodology are consistent. We also explore the effect of model error: situations for which the model used in the inverse problem is only a partially accurate representation of the outbreak; here, the model predictions and the observations differ by more than a random noise. We find that while there is a consistent discrepancy between the inferred and the true characterizations, they are also close enough to be of relevance when planning a response.
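A toy sketch of the characterization idea (not the authors' anthrax-specific model): a grid posterior over the number infected N and the infection time t0, given a few days of newly symptomatic cases, with an assumed lognormal incubation distribution and a Poisson observation model.

```python
# Toy sketch of the characterization idea: a grid posterior over the number
# infected N and the infection time t0, given a few days of new symptomatic
# cases. The lognormal incubation distribution and Poisson likelihood are
# simplifying assumptions, not the authors' anthrax-specific model.
import numpy as np
from scipy.stats import lognorm, poisson

incubation = lognorm(s=0.5, scale=4.0)          # incubation time (days), assumed
observed = np.array([0, 2, 7, 15, 21])          # new cases on days 1..5 (toy data)
days = np.arange(1, len(observed) + 1)

N_grid = np.arange(10, 501)                     # candidate numbers infected
t0_grid = np.linspace(-3.0, 0.5, 36)            # candidate release times (days)

log_post = np.zeros((N_grid.size, t0_grid.size))
for i, N in enumerate(N_grid):
    for j, t0 in enumerate(t0_grid):
        # Expected new symptomatic cases each day under (N, t0).
        mu = N * (incubation.cdf(days - t0) - incubation.cdf(days - 1 - t0))
        log_post[i, j] = poisson.logpmf(observed, mu).sum()

post = np.exp(log_post - log_post.max())
post /= post.sum()
i_best, j_best = np.unravel_index(post.argmax(), post.shape)
print(f"MAP estimate: N ~ {N_grid[i_best]}, t0 ~ {t0_grid[j_best]:.2f} days")
```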
Efficient Algorithms for Segmentation of Item-Set Time Series
NASA Astrophysics Data System (ADS)
Chundi, Parvathi; Rosenkrantz, Daniel J.
We propose a special type of time series, which we call an item-set time series, to facilitate the temporal analysis of software version histories, email logs, stock market data, etc. In an item-set time series, each observed data value is a set of discrete items. We formalize the concept of an item-set time series and present efficient algorithms for segmenting a given item-set time series. Segmentation of a time series partitions the time series into a sequence of segments where each segment is constructed by combining consecutive time points of the time series. Each segment is associated with an item set that is computed from the item sets of the time points in that segment, using a function which we call a measure function. We then define a concept called the segment difference, which measures the difference between the item set of a segment and the item sets of the time points in that segment. The segment difference values are required to construct an optimal segmentation of the time series. We describe novel and efficient algorithms to compute segment difference values for each of the measure functions described in the paper. We outline a dynamic programming based scheme to construct an optimal segmentation of the given item-set time series. We use the item-set time series segmentation techniques to analyze the temporal content of three different data sets—Enron email, stock market data, and a synthetic data set. The experimental results show that an optimal segmentation of item-set time series data captures much more temporal content than a segmentation constructed based on the number of time points in each segment, without examining the item set data at the time points, and can be used to analyze different types of temporal data.
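A hedged sketch of the dynamic-programming segmentation idea, using set union as one illustrative measure function and the total symmetric difference between a segment's item set and the item sets of its time points as the segment difference; the paper defines several measure functions and more efficient difference computations, so this is only a generic instantiation.

```python
# Hedged sketch: dynamic-programming segmentation of an item-set time series
# into k segments. Set union serves as an illustrative measure function; the
# segment difference is the total symmetric difference between the segment's
# item set and each time point's item set.
from functools import lru_cache

series = [{"a"}, {"a", "b"}, {"b"}, {"c"}, {"c", "d"}, {"d"}, {"a", "d"}]
k = 3                                   # number of segments

def seg_cost(i, j):
    """Difference of the segment covering time points i..j-1 (union measure)."""
    union = set().union(*series[i:j])
    return sum(len(union ^ s) for s in series[i:j])

@lru_cache(maxsize=None)
def best(i, segs):
    """Minimal total difference for series[i:] split into `segs` segments."""
    n = len(series)
    if segs == 1:
        return seg_cost(i, n), [n]
    best_cost, best_cuts = float("inf"), None
    for cut in range(i + 1, n - segs + 2):
        tail_cost, tail_cuts = best(cut, segs - 1)
        cost = seg_cost(i, cut) + tail_cost
        if cost < best_cost:
            best_cost, best_cuts = cost, [cut] + tail_cuts
    return best_cost, best_cuts

cost, cuts = best(0, k)
print("optimal total difference:", cost, "segment boundaries:", cuts)
```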
SIMBIOS Project 1998 Annual Report
NASA Technical Reports Server (NTRS)
McClain, Charles R.; Fargion, Giulietta, S.
1999-01-01
The purpose of this series of technical reports is to provide current documentation of the Sensor Intercomparison and Merger for Biological and Interdisciplinary Ocean Studies (SIMBIOS) Project activities, NASA Research Announcement (NRA) research status, satellite data processing, data product validation, and field calibration. This documentation is necessary to ensure that critical information is relayed to the scientific community and NASA management. This critical information includes the technical difficulties and challenges of combining ocean color data from an array of independent satellite systems to form consistent and accurate global bio-optical time series products. This technical report is not meant to substitute for the scientific literature. Instead, it will provide a ready and responsive vehicle for the multitude of technical reports issued by an operational project.
Model for the heart beat-to-beat time series during meditation
NASA Astrophysics Data System (ADS)
Capurro, A.; Diambra, L.; Malta, C. P.
2003-09-01
We present a model for the respiratory modulation of the heart beat-to-beat interval series. The model consists of a pacemaker, that simulates the membrane potential of the sinoatrial node, modulated by a periodic input signal plus correlated noise that simulates the respiratory input. The model was used to assess the waveshape of the respiratory signals needed to reproduce in the phase space the trajectory of experimental heart beat-to-beat interval data. The data sets were recorded during meditation practices of the Chi and Kundalini Yoga techniques. Our study indicates that in the first case the respiratory signal has the shape of a smoothed square wave, and in the second case it has the shape of a smoothed triangular wave.
A novel weight determination method for time series data aggregation
NASA Astrophysics Data System (ADS)
Xu, Paiheng; Zhang, Rong; Deng, Yong
2017-09-01
Aggregation in time series is of great importance in time series smoothing, prediction, and other time series analysis processes, which makes it crucial to determine the weights in time series correctly and reasonably. In this paper, a novel method to obtain the weights in time series is proposed, in which we adopt the induced ordered weighted aggregation (IOWA) operator and the visibility graph averaging (VGA) operator and linearly combine the weights separately generated by the two operators. The IOWA operator is introduced to the weight determination of time series, through which the time decay factor is taken into consideration. The VGA operator generates weights with respect to the degree distribution in the visibility graph constructed from the corresponding time series, which reflects the relative importance of the vertices in the time series. The proposed method is applied to two practical datasets to illustrate its merits. The aggregation of the Construction Cost Index (CCI) demonstrates the ability of the proposed method to smooth time series, while the aggregation of the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) illustrates how the proposed method maintains the variation tendency of the original data.
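A hedged sketch of the aggregation idea: build the natural visibility graph of a short series, derive degree-based weights (the VGA component), linearly combine them with exponentially decaying recency weights standing in for the IOWA time-decay component, and aggregate. The decay factor and mixing coefficient are illustrative, and the exponential weights are a simplification of the IOWA operator, not its exact formulation.

```python
# Hedged sketch: degree-based weights from the natural visibility graph (VGA
# component) combined linearly with exponentially decaying recency weights (a
# simple stand-in for the IOWA time-decay component), then used to aggregate.
import numpy as np

def visibility_degrees(y):
    """Node degrees of the natural visibility graph of series y."""
    n = len(y)
    deg = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            # i and j are mutually visible if every point between them lies
            # below the straight line connecting (i, y[i]) and (j, y[j]).
            visible = all(
                y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                deg[i] += 1
                deg[j] += 1
    return deg

y = np.array([5.0, 7.2, 6.1, 8.4, 6.9, 7.5, 9.0])

w_vga = visibility_degrees(y) / visibility_degrees(y).sum()
decay = 0.9 ** np.arange(len(y) - 1, -1, -1)       # newer points weigh more
w_time = decay / decay.sum()
w = 0.5 * w_vga + 0.5 * w_time                     # linear combination of weights
print("aggregated value:", float(w @ y))
```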
IGS14/igs14.atx: a new Framework for the IGS Products
NASA Astrophysics Data System (ADS)
Rebischung, P.; Schmid, R.
2016-12-01
The International GNSS Service (IGS) is about to switch to a new reference frame (IGS14), based on the latest release of the International Terrestrial Reference Frame (ITRF2014), as the basis for its products. An updated set of satellite and ground antenna calibrations (igs14.atx) will become effective at the same time. IGS14 and igs14.atx will then replace the previous IGS08/igs08.atx framework in use since GPS week 1632 (17 April 2011) and in the second IGS reprocessing campaign (repro2). Despite the negligible scale difference between ITRF2008 and ITRF2014 (0.02 ppb), the radial components of all GPS and GLONASS satellite antenna phase center offsets (z-PCOs) had to be updated in igs14.atx, because of modeling changes recently introduced within the IGS that affect the scale of the IGS products. This was achieved by deriving and averaging time series of satellite z-PCO estimates, consistent with the ITRF2014 scale, from the daily repro2 and latest operational SINEX solutions of seven IGS Analysis Centers (ACs). Compared to igs08.atx, igs14.atx includes robot calibrations for 16 additional ground antenna types, so that the percentage of stations with absolute calibrations in the IGS network will reach 90% after the switch. 19 type-mean robot calibrations were also updated thanks to the availability of calibration results for additional antenna samples. IGS14 is basically an extract of well-suited reference frame stations (i.e., with long and stable position time series) from ITRF2014. However, to make the IGS14 station coordinates consistent with the new igs14.atx ground antenna calibrations, position offsets due to the switch from igs08.atx to igs14.atx were derived for all IGS14 stations affected by ground antenna calibration updates and applied to their ITRF2014 coordinates. This presentation will first detail the different steps of the elaboration of IGS14 and igs14.atx. The impact of the switch on GNSS-derived geodetic parameter time series will then be assessed by re-aligning the daily repro2 and latest operational IGS combined SINEX solutions to IGS14/igs14.atx. A particular focus will finally be given to the biases and trends present in the satellite z-PCO time series derived from the daily AC SINEX solutions, and to their interpretation in terms of scale and scale rate of the terrestrial frame.
Scientific Library to Hold Annual Winter Video Series | Poster
The Scientific Library is getting ready for its Annual Winter Video Series. Beginning on Monday, January 9 and concluding on Friday, February 17, the Winter Video Series will consist of two different PBS programs, each with three episodes.
NASA Astrophysics Data System (ADS)
Gentilucci, Matteo; Bisci, Carlo; Fazzini, Massimiliano; Tognetti, Danilo
2016-04-01
The analysis is focused on more than 100 meteorological recording stations located in the Province of Macerata (Marche region, Adriatic side of Central Italy) and in its surroundings; it aims to check the time series of their climatological data (temperatures and precipitations), covering about one century of observations, in order to remove or rectify any errors. This small area (about 2,800 km2) features many different climate types because of its varied topography, ranging, moving westward, from the Adriatic coast to the Apennines (over 2,100 m of altitude). In this irregular context, it is difficult to establish a common procedure for each sector; therefore, the general guidelines of the WMO have been followed, with some important differences (mostly in the method). Data are classified on the basis of validation codes (VC): missing datum (VC=-1), correct or verified datum (VC=0), datum under investigation (VC=1), datum removed after the analysis (VC=2), datum reconstructed through interpolation or by estimating the errors of digitization (VC=3). The first step was the "Logical Control", consisting in the investigation of gross errors of digitization: the data found in this phase of the analysis have been removed without any other control (VC=2). The second step, the "Internal Consistency Check", leads to the elimination (VC=2) of all the data out of range, estimated on the basis of the climate zone for each investigated variable. The third one is the "Tolerance Test", carried out by comparing each datum with the historical record it belongs to; in order to apply this test, the normal distribution of the data has been evaluated. The "Tolerance Test" usually defines only suspect data (VC=1) to be verified with further tests, such as the "Temporal Consistency" and the "Spatial Consistency". The "Temporal Consistency" allows an evaluation of the time sequence of data, setting a specified range for each station based upon its historical records. Data out of range have been considered under investigation (VC=1). Data are finally compared with the ones contemporaneously recorded in a set of neighboring meteorological stations through the "Spatial Consistency" test, thus either eliminating or accepting every suspicious datum (recoded VC=2 or VC=0, depending upon the results of this analysis). This procedure uses a series of different statistical steps to avoid uncertainties: at its end, all the investigated data are either accepted (VC=0) or refused (VC=2). Refused and missing data (VC=-1 and VC=2) have been reconstructed through interpolation using co-kriging techniques (assigning VC=3), when necessary, in the final stage of the process. All the above procedure has been developed using database management software in a GIS (ESRI ArcGIS®) environment. The refused data amount to 1,286 of 77,021 (1.67%) for precipitation and 375 of 1,821,054 (0.02%) for temperature.
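A minimal sketch of the flagging logic described above: assign validation codes with a climate-based range check ("Internal Consistency") and a tolerance test against the station's own record. The thresholds and the helper function are illustrative, and the temporal and spatial consistency checks are not reproduced here.

```python
# Minimal sketch of the flagging logic: assign validation codes (VC) with a
# climate-based range check ("Internal Consistency") and a tolerance test
# against the station's own record. Thresholds are illustrative; the temporal
# and spatial consistency checks are not reproduced.
import numpy as np

VC_MISSING, VC_OK, VC_SUSPECT, VC_REMOVED = -1, 0, 1, 2

def flag_station(values, valid_range=(-30.0, 45.0), z_tol=4.0):
    """Return an array of validation codes for one station's temperature record."""
    values = np.asarray(values, dtype=float)
    vc = np.full(values.shape, VC_OK)

    finite = ~np.isnan(values)
    vc[~finite] = VC_MISSING

    # Internal Consistency: remove data outside the climate-based range.
    out_of_range = finite & ((values < valid_range[0]) | (values > valid_range[1]))
    vc[out_of_range] = VC_REMOVED

    # Tolerance Test: mark as suspect the data far from the station's own history.
    ok = vc == VC_OK
    mean, std = values[ok].mean(), values[ok].std()
    vc[ok & (np.abs(values - mean) > z_tol * std)] = VC_SUSPECT
    return vc

temps = [12.3, 14.1, np.nan, 55.0, 13.0, 12.8, -2.0, 13.5]
print(flag_station(temps))   # array of validation codes, one per datum
```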
Terminator field-aligned current system: A new finding from model-assimilated data set (MADS)
NASA Astrophysics Data System (ADS)
Zhu, L.; Schunk, R. W.; Scherliess, L.; Sojka, J. J.; Gardner, L. C.; Eccles, J. V.; Rice, D.
2013-12-01
Physics-based data assimilation models have been recognized by the space science community as the most accurate approach to specify and forecast the space weather of the solar-terrestrial environment. The model-assimilated data sets (MADS) produced by these models constitute an internally consistent time series of global three-dimensional fields whose accuracy can be estimated. Because of its internal consistency of physics and completeness of descriptions of the status of global systems, the MADS has also been a powerful tool to identify systematic errors in measurements, reveal missing physics in physical models, and discover important dynamical physical processes that are inadequately observed or missed by measurements due to observational limitations. In recent years, we developed a data assimilation model for high-latitude ionospheric plasma dynamics and electrodynamics. With a set of physical models, an ensemble Kalman filter, and the ingestion of data from multiple observations, the data assimilation model can produce a self-consistent time series of complete descriptions of the global high-latitude ionosphere, which includes the convection electric field, horizontal and field-aligned currents, conductivity, as well as 3-D plasma densities and temperatures. In this presentation, we will show a new field-aligned current system discovered from the analysis of the MADS produced by our data assimilation model. This new current system appears and develops near the ionospheric terminator. The dynamical features of this current system will be described and its connection to the active role of the ionosphere in the M-I coupling will be discussed.
Landsat Surface Reflectance Climate Data Records
,
2014-01-01
Landsat Surface Reflectance Climate Data Records (CDRs) are high level Landsat data products that support land surface change studies. Climate Data Records, as defined by the National Research Council, are a time series of measurements with sufficient length, consistency, and continuity to identify climate variability and change. The U.S. Geological Survey (USGS) is using the valuable 40-year Landsat archive to create CDRs that can be used to document changes to Earth’s terrestrial environment.