Sample records for continuous time series

  1. Continuous time transfer using GPS carrier phase.

    PubMed

    Dach, Rolf; Schildknecht, Thomas; Springer, Tim; Dudle, Gregor; Prost, Leon

    2002-11-01

    The Astronomical Institute of the University of Berne is hosting one of the Analysis Centers (AC) of the International GPS Service (IGS). A network of a few GPS stations in Europe and North America is routinely analyzed for time transfer purposes, using the carrier phase observations. This work is done in the framework of a joint project with the Swiss Federal Office of Metrology and Accreditation (METAS). The daily solutions are computed independently. The resulting time transfer series show jumps of up to 1 ns at the day boundaries. A method to concatenate the daily time transfer solutions to a continuous series was developed. A continuous time series is available for a time span of more than 4 mo. The results were compared with the time transfer results from other techniques such as two-way satellite time and frequency transfer. This concatenation improves the results obtained in a daily computing scheme because a continuous time series better reflects the characteristics of continuously working clocks.

  2. RADON CONCENTRATION TIME SERIES MODELING AND APPLICATION DISCUSSION.

    PubMed

    Stránský, V; Thinová, L

    2017-11-01

    In the year 2010 a continual radon measurement was established at Mladeč Caves in the Czech Republic using a continual radon monitor RADIM3A. In order to model radon time series in the years 2010-15, the Box-Jenkins methodology, often used in econometrics, was applied. Because of the behavior of radon concentrations (RCs), a seasonal integrated autoregressive moving average model with exogenous variables (SARIMAX) was chosen to model the measured time series. This model uses the time series seasonality, previously acquired values and delayed atmospheric parameters to forecast RC. The developed model for the RC time series is called regARIMA(5,1,3). Model residuals could be retrospectively compared with seismic evidence of local or global earthquakes that occurred during the RC measurements. This technique enables us to assess whether continuously measured RC could serve as an earthquake precursor. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
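
    As a rough illustration of the modeling step described above, the sketch below fits a SARIMAX model with a lagged atmospheric covariate using statsmodels. The simulated hourly data, the single temperature covariate and the assumed daily seasonality are placeholders, not the paper's actual configuration; only the (5,1,3) non-seasonal order is taken from the abstract.

      # Hedged sketch: Box-Jenkins / SARIMAX fit for radon concentration with a delayed
      # atmospheric covariate. The data are simulated for illustration only.
      import numpy as np
      import pandas as pd
      from statsmodels.tsa.statespace.sarimax import SARIMAX

      rng = np.random.default_rng(0)
      idx = pd.date_range("2010-01-01", periods=24*120, freq="H")   # ~4 months of hourly samples
      temp = 10 + 8*np.sin(2*np.pi*idx.hour/24) + rng.standard_normal(idx.size)
      radon = 800 + 200*np.sin(2*np.pi*idx.hour/24 + 1.0) - 5*temp + 50*rng.standard_normal(idx.size)
      df = pd.DataFrame({"radon": radon, "temperature": temp}, index=idx)

      exog = df[["temperature"]].shift(1).dropna()                  # delayed atmospheric parameter
      endog = df["radon"].loc[exog.index]

      model = SARIMAX(endog, exog=exog,
                      order=(5, 1, 3),               # regARIMA(5,1,3) as named in the abstract
                      seasonal_order=(1, 0, 1, 24))  # assumed daily cycle for hourly data
      result = model.fit(disp=False)
      residuals = result.resid                       # candidates for comparison with seismic records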

  3. Bispectral Inversion: The Construction of a Time Series from Its Bispectrum

    DTIC Science & Technology

    1988-04-13

    take the inverse transform. Since the goal is to compute a time series given its bispectrum, it would also be nice to stay entirely in the frequency...domain and be able to go directly from the bispectrum to the Fourier transform of the time series without the need to inverse transform continuous...the picture. The approximations arise from representing the bicovariance, which is the inverse transform of a continuous function, by the inverse discrete

  4. EnvironmentalWaveletTool: Continuous and discrete wavelet analysis and filtering for environmental time series

    NASA Astrophysics Data System (ADS)

    Galiana-Merino, J. J.; Pla, C.; Fernandez-Cortes, A.; Cuezva, S.; Ortiz, J.; Benavente, D.

    2014-10-01

    A MATLAB-based computer code has been developed for the simultaneous wavelet analysis and filtering of several environmental time series, particularly focused on the analysis of cave monitoring data. The continuous wavelet transform, the discrete wavelet transform and the discrete wavelet packet transform have been implemented to provide a fast and precise time-period examination of the time series at different period bands. Moreover, statistical methods to examine the relation between two signals have been included. Finally, entropy-of-curves and spline-based methods have also been developed for segmenting and modeling the analyzed time series. All these methods together provide a user-friendly and fast program for environmental signal analysis, with useful, practical and understandable results.
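
    The same kind of analysis can be sketched in Python with PyWavelets rather than the authors' MATLAB code; the synthetic signal, wavelet names and decomposition level below are assumptions chosen only to show the continuous and discrete transforms side by side.

      # Hedged sketch: CWT for a time-period view plus a DWT-based band filter, in the
      # spirit of the tool described above (PyWavelets stands in for the MATLAB code).
      import numpy as np
      import pywt

      t = np.arange(0, 365, 1.0/24.0)                       # one year of hourly samples (illustrative)
      signal = (np.sin(2*np.pi*t) + 0.5*np.sin(2*np.pi*t/30)
                + 0.1*np.random.randn(t.size))              # daily + monthly components + noise

      # Continuous wavelet transform: coefficients over a range of scales (periods)
      scales = np.arange(1, 128)
      coeffs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=1.0/24.0)

      # Discrete wavelet decomposition: zero out the detail bands to keep the slow component
      dwt_coeffs = pywt.wavedec(signal, "db4", level=6)
      dwt_coeffs[1:] = [np.zeros_like(c) for c in dwt_coeffs[1:]]
      slow_component = pywt.waverec(dwt_coeffs, "db4")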

  5. Segmenting the Stream of Consciousness: The Psychological Correlates of Temporal Structures in the Time Series Data of a Continuous Performance Task

    ERIC Educational Resources Information Center

    Smallwood, Jonathan; McSpadden, Merrill; Luus, Bryan; Schooler, Jonathan

    2008-01-01

    Using principal component analysis, we examined whether structural properties in the time series of response time would identify different mental states during a continuous performance task. We examined whether it was possible to identify regular patterns which were present in blocks classified as lacking controlled processing, either…

  6. 27 CFR 20.179 - Package identification number or serial number.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... the time of a change, the numbering series in use at the time of the change may be continued. (Sec... at the same time on which all of the marks required by § 20.178 (a)(1) and (a)(3) through (a)(8) are... number “1” and continuing in regular sequence. The dealer shall use a separate but similar number series...

  7. Closed-Loop Optimal Control Implementations for Space Applications

    DTIC Science & Technology

    2016-12-01

    ...through the analyses of a series of optimal control problems, several real-time optimal control algorithms are developed that continuously adapt to feedback on the...

  8. 27 CFR 20.179 - Package identification number or serial number.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... the time of a change, the numbering series in use at the time of the change may be continued. (Sec... at the same time on which all of the marks required by § 20.178 (a)(1) and (a)(3) through (a)(8) are... number “1” and continuing in regular sequence. The dealer shall use a separate but similar number series...

  9. 27 CFR 20.179 - Package identification number or serial number.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... the time of a change, the numbering series in use at the time of the change may be continued. (Sec... at the same time on which all of the marks required by § 20.178 (a)(1) and (a)(3) through (a)(8) are... number “1” and continuing in regular sequence. The dealer shall use a separate but similar number series...

  10. 27 CFR 20.179 - Package identification number or serial number.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... the time of a change, the numbering series in use at the time of the change may be continued. (Sec... at the same time on which all of the marks required by § 20.178 (a)(1) and (a)(3) through (a)(8) are... number “1” and continuing in regular sequence. The dealer shall use a separate but similar number series...

  11. 27 CFR 20.179 - Package identification number or serial number.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... the time of a change, the numbering series in use at the time of the change may be continued. (Sec... at the same time on which all of the marks required by § 20.178 (a)(1) and (a)(3) through (a)(8) are... number “1” and continuing in regular sequence. The dealer shall use a separate but similar number series...

  12. Prediction of flow dynamics using point processes

    NASA Astrophysics Data System (ADS)

    Hirata, Yoshito; Stemler, Thomas; Eroglu, Deniz; Marwan, Norbert

    2018-01-01

    Describing a time series parsimoniously is the first step to studying the underlying dynamics. For a time-discrete system, a generating partition provides a compact description such that a time series and a symbolic sequence are one-to-one. But for a time-continuous system, such a compact description does not have a solid basis. Here, we propose to describe a time-continuous time series using a local cross section and the times when the orbit crosses the local cross section. We show that if such a series of crossing times and some past observations are given, we can predict the system's dynamics with fine accuracy. This reconstructability depends strongly on neither the size nor the placement of the local cross section if we have a sufficiently long database. We demonstrate the proposed method using the Lorenz model as well as actual measurements of wind speed.

  13. Have the temperature time series a structural change after 1998?

    NASA Astrophysics Data System (ADS)

    Werner, Rolf; Valev, Dimitare; Danov, Dimitar

    2012-07-01

    The global and hemispheric GISS and HadCRUT3 temperature time series were analysed for structural changes. We impose continuity of the piecewise temperature function of time across the thresholds. The slopes are calculated for a sequence of segments delimited by time thresholds, using a standard method, restricted linear regression with dummy variables. We performed the calculations and tests for different numbers of thresholds. The thresholds are searched continuously within prescribed time intervals, and the F-statistic is used to determine the time points of the structural changes.
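
    A minimal sketch of the restricted-regression idea, assuming a synthetic annual temperature anomaly series: one candidate threshold at 1998 is tested with an F-test against a single-slope fit, with continuity across the threshold imposed by construction. Searching the threshold continuously and allowing several thresholds repeats the same step.

      # Hedged sketch: continuous piecewise-linear (broken-line) regression with a slope
      # change at a candidate threshold, compared to a single-slope fit via an F-test.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      years = np.arange(1950.0, 2012.0)
      temp = (0.010*(years - 1950) + 0.008*np.maximum(years - 1998, 0)
              + 0.05*rng.standard_normal(years.size))        # synthetic anomalies, slope change in 1998

      # Unrestricted model: the extra regressor max(t - t0, 0) adds a slope change but no jump
      X_u = sm.add_constant(np.column_stack([years - 1950, np.maximum(years - 1998, 0)]))
      unrestricted = sm.OLS(temp, X_u).fit()

      # Restricted model: a single slope over the whole period
      X_r = sm.add_constant(years - 1950)
      restricted = sm.OLS(temp, X_r).fit()

      f_stat, p_value, df_diff = unrestricted.compare_f_test(restricted)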

  14. Developing consistent time series landsat data products

    USDA-ARS?s Scientific Manuscript database

    The Landsat series of satellites has provided a continuous Earth observation data record since the early 1970s. There are increasing demands for a consistent time series of Landsat data products. In this presentation, I will summarize the work supported by the USGS Landsat Science Team project from 20...

  15. Time Series Discord Detection in Medical Data using a Parallel Relational Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woodbridge, Diane; Rintoul, Mark Daniel; Wilson, Andrew T.

    Recent advances in sensor technology have made continuous real-time health monitoring available in both hospital and non-hospital settings. Because high-frequency medical sensors produce very large volumes of data, storing and processing continuous medical data is an emerging big data area. Detecting anomalies in real time is especially important for detecting and preventing patient emergencies. A time series discord is a subsequence that has the maximum difference to the rest of the time series subsequences, meaning that it shows abnormal or unusual data trends. In this study, we implemented two versions of time series discord detection algorithms on a high performance parallel database management system (DBMS) and applied them to 240 Hz waveform data collected from 9,723 patients. The initial brute force version of the discord detection algorithm takes each possible subsequence and calculates a distance to the nearest non-self match to find the biggest discords in the time series. For the heuristic version of the algorithm, a combination of an array and a trie structure was applied to order the time series data for better time efficiency. The study results showed efficient data loading, decoding and discord searches in a large amount of data, benefiting from the time series discord detection algorithm and the architectural characteristics of the parallel DBMS, including data compression, data pipelining, and task scheduling.
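
    A minimal sketch of the brute-force discord search described above, on a toy NumPy series rather than the 240 Hz waveforms: every subsequence is compared with every non-overlapping subsequence, and the one whose nearest non-self match is farthest away is reported. The window length and implanted anomaly are assumptions; the paper's contribution is running this (and a trie-accelerated variant) inside a parallel DBMS.

      # Hedged sketch: brute-force time series discord detection (largest nearest
      # non-self-match distance), illustrated on a synthetic signal with one anomaly.
      import numpy as np

      def brute_force_discord(x, w):
          n = x.size - w + 1
          subs = np.stack([x[i:i + w] for i in range(n)])
          best_dist, best_idx = -np.inf, -1
          for i in range(n):
              d = np.sqrt(((subs - subs[i])**2).sum(axis=1))
              d[max(0, i - w + 1):i + w] = np.inf          # exclude trivial, overlapping matches
              nearest = d.min()                            # distance to the nearest non-self match
              if nearest > best_dist:
                  best_dist, best_idx = nearest, i
          return best_idx, best_dist

      signal = np.sin(np.linspace(0, 20*np.pi, 2000))
      signal[700:740] += 0.8                               # implant an unusual segment
      print(brute_force_discord(signal, w=50))             # should point near index 700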

  16. Time Series Discord Detection in Medical Data using a Parallel Relational Database [PowerPoint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woodbridge, Diane; Wilson, Andrew T.; Rintoul, Mark Daniel

    Recent advances in sensor technology have made continuous real-time health monitoring available in both hospital and non-hospital settings. Because high-frequency medical sensors produce very large volumes of data, storing and processing continuous medical data is an emerging big data area. Detecting anomalies in real time is especially important for detecting and preventing patient emergencies. A time series discord is a subsequence that has the maximum difference to the rest of the time series subsequences, meaning that it shows abnormal or unusual data trends. In this study, we implemented two versions of time series discord detection algorithms on a high performance parallel database management system (DBMS) and applied them to 240 Hz waveform data collected from 9,723 patients. The initial brute force version of the discord detection algorithm takes each possible subsequence and calculates a distance to the nearest non-self match to find the biggest discords in the time series. For the heuristic version of the algorithm, a combination of an array and a trie structure was applied to order the time series data for better time efficiency. The study results showed efficient data loading, decoding and discord searches in a large amount of data, benefiting from the time series discord detection algorithm and the architectural characteristics of the parallel DBMS, including data compression, data pipelining, and task scheduling.

  17. 77 FR 42038 - Self-Regulatory Organizations; C2 Options Exchange, Incorporated; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-17

    ... market in 60% of the non-adjusted option series \4\ of each registered class that have a time to... time series: C2 (current rule), 99% of the time, 60% of series, class-by-class; NOM, 90% of a trading day... required to provide continuous quotes for the same amount of time in the same percentage of series as...

  18. Functional linear models to test for differences in prairie wetland hydraulic gradients

    USGS Publications Warehouse

    Greenwood, Mark C.; Sojda, Richard S.; Preston, Todd M.; Swayne, David A.; Yang, Wanhong; Voinov, A.A.; Rizzoli, A.; Filatova, T.

    2010-01-01

    Functional data analysis provides a framework for analyzing multiple time series measured frequently in time, treating each series as a continuous function of time. Functional linear models are used to test for effects on hydraulic gradient functional responses collected from three types of land use in Northeastern Montana at fourteen locations. Penalized regression-splines are used to estimate the underlying continuous functions based on the discretely recorded (over time) gradient measurements. Permutation methods are used to assess the statistical significance of effects. A method for accommodating missing observations in each time series is described. Hydraulic gradients may be an initial and fundamental ecosystem process that responds to climate change. We suggest other potential uses of these methods for detecting evidence of climate change.

  19. Time series models on analysing mortality rates and acute childhood lymphoid leukaemia.

    PubMed

    Kis, Maria

    2005-01-01

    In this paper we demonstrate the application of time series models in medical research. Hungarian mortality rates were analysed by autoregressive integrated moving average (ARIMA) models, and seasonal time series models were used to examine data on acute childhood lymphoid leukaemia. Mortality data may be analysed by time series methods such as ARIMA modelling. This method is demonstrated by two examples: analysis of the mortality rates of ischemic heart diseases and analysis of the mortality rates of cancer of the digestive system. Mathematical expressions are given for the results of the analysis. The relationships between time series of mortality rates were studied with ARIMA models. Confidence intervals for the autoregressive parameters were calculated by three methods: the standard normal approximation, estimation based on White's theory, and the continuous-time estimation. Analysing the confidence intervals of the first-order autoregressive parameters, we may conclude that the intervals from the continuous-time estimation model were much smaller than those from the other estimations. We also present a new approach to analysing the occurrence of acute childhood lymphoid leukaemia. We decompose the time series into components. The periodicity of acute childhood lymphoid leukaemia in Hungary was examined using the seasonal decomposition time series method. The cyclic trend of the dates of diagnosis revealed that a higher percentage of the peaks fell within the winter months than in the other seasons. This confirms the seasonal occurrence of childhood leukaemia in Hungary.
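
    The seasonal-decomposition step for the leukaemia series can be sketched with statsmodels as below, using simulated monthly diagnosis counts in place of the Hungarian registry data; the additive model and the 12-month period are assumptions consistent with a monthly series.

      # Hedged sketch: classical seasonal decomposition of a monthly count series, the
      # kind of analysis used to look for a winter peak in diagnoses.
      import numpy as np
      import pandas as pd
      from statsmodels.tsa.seasonal import seasonal_decompose

      rng = np.random.default_rng(1)
      idx = pd.date_range("1988-01-01", periods=120, freq="MS")
      counts = 20 + 5*np.cos(2*np.pi*(idx.month - 1)/12) + rng.poisson(3, size=120)
      series = pd.Series(np.asarray(counts, dtype=float), index=idx)

      decomp = seasonal_decompose(series, model="additive", period=12)
      monthly_effect = decomp.seasonal.groupby(decomp.seasonal.index.month).mean()  # peaks in winter months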

  20. 76 FR 25345 - Annual Assessment of the Status of Competition in the Market for the Delivery of Video Programming

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-04

    ... as of June 30 of the relevant year to monitor trends on an annual basis. To continue our time-series... video programming? 24. MVPD Performance. We seek comment on the information and time- series data we... Television Performance. We seek information and time- series data for the analysis of various performance...

  1. A Continuous Long-Term Record of Magnetic-Storm Occurrence and Intensity

    NASA Astrophysics Data System (ADS)

    Love, J. J.

    2007-05-01

    Hourly magnetometer data have been produced by ground-based magnetic observatories for over a century. These data are used for a wide variety of applications, including many for space physics. In particular, hourly data from a longitudinal necklace of mid-latitude observatories can be used to construct a time series recording the storm-time disturbance index Dst, one of the most useful scalar summaries of magnetic storm intensity which is generally interpreted in terms of an equivalent equatorial magnetospheric ring current. Dst has been routinely calculated in a temporally piece-wise fashion since the IGY using a subset of the available observatories: four or five stations, typically including Honolulu (HON), San Juan (SJG), Kakioka Japan (KAK), Hermanus South Africa (HER), and Alibag India (ABG). In this presentation we discuss a single continuous Dst time series made using a denser and more uniform distribution of observatories than that which is standard: including, additionally, Watheroo Australia (WAT), Apia Samoa (API), and Vassouras Brazil (VSS). Starting first with the data from each individual observatory, we subtract the geomagnetic secular variation, caused primarily by the core dynamo, and the solar-quiet (Sq) variation, caused primarily by the ionospheric dynamo. The latter requires careful spectral analysis, and those intermediate results are, themselves, of scientific interest. Following this, we combine the disturbance residuals from each station to form the continuous Dst time series. Statistics deduced from this model allow us to quantify the likelihood of storm occurrence and intensity, both of which are modulated in time by the solar cycle. This analysis is accomplished using a 50 year Dst time series. The prospects for constructing a longer continuous Dst time series are discussed.

  2. Regenerating time series from ordinal networks.

    PubMed

    McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael

    2017-03-01

    Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discrete sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series by using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.
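
    A minimal sketch of the construction, assuming a chaotic logistic-map series as a stand-in for the sampled flow data: the series is symbolized into ordinal patterns, consecutive patterns define the network's edges, and a random walk over those edges regenerates a surrogate symbol sequence. The embedding dimension and series length are arbitrary choices.

      # Hedged sketch: ordinal (transition) network from a time series and a random-walk
      # surrogate generated on that network.
      import numpy as np
      from collections import defaultdict

      def ordinal_patterns(x, m):
          return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

      rng = np.random.default_rng(0)
      x = np.empty(5000)
      x[0] = 0.4
      for i in range(1, x.size):                    # chaotic logistic map as a toy series
          x[i] = 4.0*x[i-1]*(1.0 - x[i-1])

      patterns = ordinal_patterns(x, m=4)
      edges = defaultdict(list)
      for a, b in zip(patterns[:-1], patterns[1:]): # edge list of the ordinal network
          edges[a].append(b)

      walk = [patterns[0]]
      for _ in range(2000):                         # random walk regenerates a symbolic surrogate
          options = edges.get(walk[-1]) or patterns # fall back if a dead end is reached
          walk.append(options[rng.integers(len(options))])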

  3. Regenerating time series from ordinal networks

    NASA Astrophysics Data System (ADS)

    McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael

    2017-03-01

    Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discrete sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series by using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.

  4. 77 FR 47383 - Annual Assessment of the Status of Competition in the Market for the Delivery of Video Programming

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-08

    ... monitor trends on an annual basis. To continue our time-series analysis, we request data as of June 30... information and time- series data we should collect for the analysis of various MVPD performance metrics. In... revenues, cash flows, and margins. To the extent possible, we seek five-year time-series data to allow us...

  5. Inclusion of mobile telephone numbers into an ongoing population health survey in New South Wales, Australia, using an overlapping dual-frame design: impact on the time series.

    PubMed

    Barr, Margo L; Ferguson, Raymond A; Steel, David G

    2014-08-12

    Since 1997, the NSW Population Health Survey (NSWPHS) had selected its sample using random digit dialing of landline telephone numbers. When the survey began, coverage of the population by landline phone frames was high (96%). As landline coverage in Australia has declined and continues to do so, in 2012 a sample of mobile telephone numbers was added to the survey using an overlapping dual-frame design. Details of the methodology are published elsewhere. This paper discusses the impacts of the sampling frame change on the time series and provides possible approaches to handling these impacts. Prevalence estimates were calculated for type of phone use and for a range of health indicators. Prevalence ratios (PR) for each of the health indicators were also calculated by type of phone use using Poisson regression analysis with robust variance estimation. Health estimates for 2012 were compared to 2011, and the full time series was examined for selected health indicators. From the 2012 NSWPHS it was estimated that 20.0% of the NSW population were mobile-only phone users. Looking at the full time series for overweight or obese and for current smoking, if the NSWPHS had continued to be undertaken using only a landline frame, overweight or obese would have appeared to continue increasing and current smoking to continue decreasing. With the introduction of the overlapping dual-frame design in 2012, however, overweight or obese increased until 2011 and then decreased in 2012, and current smoking decreased until 2011 and then increased in 2012. Our examination of these time series showed that the changes were a consequence of the sampling frame change and were not real changes. Both the backcasting method and the minimal coverage method could adequately adjust for the design change and allow for the continuation of the time series. The inclusion of the mobile telephone numbers, through an overlapping dual-frame design, did affect the time series for some of the health indicators collected through the NSWPHS, but only in that it corrected estimates that were being calculated from a sample frame that was progressively covering less of the population.
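
    The prevalence-ratio calculation described above can be sketched as a Poisson regression with a robust (sandwich) covariance; the simulated data frame and variable names below are placeholders, not NSWPHS data.

      # Hedged sketch: prevalence ratios by phone-use type from Poisson regression with
      # robust variance (the exponentiated coefficients are the prevalence ratios).
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(2)
      df = pd.DataFrame({
          "current_smoker": rng.binomial(1, 0.17, size=5000),
          "phone_use": rng.choice(["landline_only", "dual_user", "mobile_only"], size=5000),
      })

      fit = smf.glm("current_smoker ~ C(phone_use)", data=df,
                    family=sm.families.Poisson()).fit(cov_type="HC0")
      prevalence_ratios = np.exp(fit.params)        # relative to the reference phone-use group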

  6. Analysis of continuous GPS measurements from southern Victoria Land, Antarctica

    USGS Publications Warehouse

    Willis, Michael J.

    2007-01-01

    Several years of continuous data have been collected at remote bedrock Global Positioning System (GPS) sites in southern Victoria Land, Antarctica. Annual to sub-annual variations are observed in the position time-series. An atmospheric pressure loading (APL) effect is calculated from pressure field anomalies supplied by the European Centre for Medium-Range Weather Forecasts (ECMWF) model loading an elastic Earth model. The predicted APL signal has a moderate correlation with the vertical position time-series at McMurdo, Ross Island (International Global Navigation Satellite System Service (IGS) station MCM4), produced using a global solution. In contrast, a local solution in which MCM4 is the fiducial site generates a vertical time series for a remote site in Victoria Land (Cape Roberts, ROB4) which exhibits a low, inverse correlation with the predicted atmospheric pressure loading signal. If, in the future, known and well modeled geophysical loads can be separated from the time-series, then local hydrological loading, of interest for glaciological and climate applications, can potentially be extracted from the GPS time-series.

  7. A New Modified Histogram Matching Normalization for Time Series Microarray Analysis.

    PubMed

    Astola, Laura; Molenaar, Jaap

    2014-07-01

    Microarray data are often used to infer regulatory networks. Quantile normalization (QN) is a popular method to reduce array-to-array variation. We show that in the context of time series measurements QN may not be the best choice for this task, especially not if the inference is based on a continuous-time ODE model. We propose an alternative normalization method that is better suited for network inference from time series data.

  8. Statistical Inference on Memory Structure of Processes and Its Applications to Information Theory

    DTIC Science & Technology

    2016-05-12

    ...valued time series from a sample. (A practical algorithm to compute the estimator is a work in progress.) Third, finitely-valued spatial processes...proved. Second, a statistical method is developed to estimate the memory depth of discrete-time and continuously-valued time series from a sample. (A

  9. 76 FR 51239 - North American Industry Classification System; Revision for 2012

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-17

    ... definitional and economic changes so that they can create continuous time series and accurately analyze data changes over time. The inclusion of revenues from FGP activities in manufacturing will effectively change...) to exclude production that occurs in a foreign country for historical consistency in time series...

  10. A New Modified Histogram Matching Normalization for Time Series Microarray Analysis

    PubMed Central

    Astola, Laura; Molenaar, Jaap

    2014-01-01

    Microarray data are often used to infer regulatory networks. Quantile normalization (QN) is a popular method to reduce array-to-array variation. We show that in the context of time series measurements QN may not be the best choice for this task, especially not if the inference is based on a continuous-time ODE model. We propose an alternative normalization method that is better suited for network inference from time series data. PMID:27600344

  11. Time Series Remote Sensing in Monitoring the Spatio-Temporal Dynamics of Plant Invasions: A Study of Invasive Saltcedar (Tamarix Spp.)

    NASA Astrophysics Data System (ADS)

    Diao, Chunyuan

    In today's big data era, the increasing availability of satellite and airborne platforms at various spatial and temporal scales creates unprecedented opportunities to understand the complex and dynamic systems (e.g., plant invasion). Time series remote sensing is becoming more and more important to monitor the earth system dynamics and interactions. To date, most of the time series remote sensing studies have been conducted with the images acquired at coarse spatial scale, due to their relatively high temporal resolution. The construction of time series at fine spatial scale, however, is limited to few or discrete images acquired within or across years. The objective of this research is to advance the time series remote sensing at fine spatial scale, particularly to shift from discrete time series remote sensing to continuous time series remote sensing. The objective will be achieved through the following aims: 1) Advance intra-annual time series remote sensing under the pure-pixel assumption; 2) Advance intra-annual time series remote sensing under the mixed-pixel assumption; 3) Advance inter-annual time series remote sensing in monitoring the land surface dynamics; and 4) Advance the species distribution model with time series remote sensing. Taking invasive saltcedar as an example, four methods (i.e., phenological time series remote sensing model, temporal partial unmixing method, multiyear spectral angle clustering model, and time series remote sensing-based spatially explicit species distribution model) were developed to achieve the objectives. Results indicated that the phenological time series remote sensing model could effectively map saltcedar distributions through characterizing the seasonal phenological dynamics of plant species throughout the year. The proposed temporal partial unmixing method, compared to conventional unmixing methods, could more accurately estimate saltcedar abundance within a pixel by exploiting the adequate temporal signatures of saltcedar. The multiyear spectral angle clustering model could guide the selection of the most representative remotely sensed image for repetitive saltcedar mapping over space and time. Through incorporating spatial autocorrelation, the species distribution model developed in the study could identify the suitable habitats of saltcedar at a fine spatial scale and locate appropriate areas at high risk of saltcedar infestation. Among 10 environmental variables, the distance to the river and the phenological attributes summarized by the time series remote sensing were regarded as the most important. These methods developed in the study provide new perspectives on how the continuous time series can be leveraged under various conditions to investigate the plant invasion dynamics.

  12. Long series of geomagnetic measurements - unique at satellite era

    NASA Astrophysics Data System (ADS)

    Mandea, Mioara; Balasis, Georgios

    2017-04-01

    We have long appreciated that magnetic measurements obtained at Earth's surface are of great value in characterizing geomagnetic field behavior and in probing the deep interior of our planet. Data from the new magnetic satellite missions offer a detailed global understanding of the geomagnetic field. However, when our interest moves to long time scales, the very long series of measurements play an important role. Here we first provide an updated series of geomagnetic declination in Paris, shortly after a very special occasion: its value has reached zero after some 350 years of westerly values. We take this occasion to emphasize the importance of long series of continuous measurements, mainly when various techniques are used to detect abrupt changes in the geomagnetic field, the geomagnetic jerks. Many novel concepts originating in dynamical systems or information theory have been developed, partly motivated by specific research questions from the geosciences. This continuously extending toolbox of nonlinear time series analysis is a key to understanding the complexity of the geomagnetic field. Motivated by these efforts, a series of entropy analyses are applied here to geomagnetic field time series, aiming to detect dynamical complexity changes associated with geomagnetic jerks.

  13. A probabilistic method for constructing wave time-series at inshore locations using model scenarios

    USGS Publications Warehouse

    Long, Joseph W.; Plant, Nathaniel G.; Dalyander, P. Soupy; Thompson, David M.

    2014-01-01

    Continuous time-series of wave characteristics (height, period, and direction) are constructed using a base set of model scenarios and simple probabilistic methods. This approach utilizes an archive of computationally intensive, highly spatially resolved numerical wave model output to develop time-series of historical or future wave conditions without performing additional, continuous numerical simulations. The archive of model output contains wave simulations from a set of model scenarios derived from an offshore wave climatology. Time-series of wave height, period, direction, and associated uncertainties are constructed at locations included in the numerical model domain. The confidence limits are derived using statistical variability of oceanographic parameters contained in the wave model scenarios. The method was applied to a region in the northern Gulf of Mexico and assessed using wave observations at 12 m and 30 m water depths. Prediction skill for significant wave height is 0.58 and 0.67 at the 12 m and 30 m locations, respectively, with similar performance for wave period and direction. The skill of this simplified, probabilistic time-series construction method is comparable to existing large-scale, high-fidelity operational wave models but provides higher spatial resolution output at low computational expense. The constructed time-series can be developed to support a variety of applications including climate studies and other situations where a comprehensive survey of wave impacts on the coastal area is of interest.
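
    The core lookup idea can be sketched as below: an archive maps offshore wave conditions (from the model scenarios) to inshore wave parameters, and a time series is built by matching each offshore record to its closest scenario. The tiny scenario table and the normalization are invented for illustration; the actual method adds probabilistic interpolation and uncertainty bounds.

      # Hedged sketch: construct inshore wave heights from a pre-computed scenario archive
      # by nearest-scenario matching of offshore conditions.
      import numpy as np

      # columns: offshore Hs (m), Tp (s), direction (deg) -> stored inshore Hs (m) at one point
      scenarios = np.array([
          [1.0,  6.0,  90.0, 0.7],
          [2.0,  8.0, 120.0, 1.3],
          [3.0, 10.0, 150.0, 1.9],
          [4.0, 12.0, 180.0, 2.4],
      ])

      def inshore_hs(offshore_hs, tp, direction):
          query = np.array([offshore_hs, tp, direction])
          scale = scenarios[:, :3].std(axis=0)              # crude normalization across units
          idx = np.argmin(np.linalg.norm((scenarios[:, :3] - query) / scale, axis=1))
          return scenarios[idx, 3]

      offshore_record = [(1.8, 7.5, 110.0), (3.2, 10.5, 160.0)]
      inshore_series = [inshore_hs(*row) for row in offshore_record]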

  14. Using forbidden ordinal patterns to detect determinism in irregularly sampled time series.

    PubMed

    Kulp, C W; Chobot, J M; Niskala, B J; Needhammer, C J

    2016-02-01

    It is known that when symbolizing a time series into ordinal patterns using the Bandt-Pompe (BP) methodology, there will be ordinal patterns called forbidden patterns that do not occur in a deterministic series. The existence of forbidden patterns can be used to identify deterministic dynamics. In this paper, the ability to use forbidden patterns to detect determinism in irregularly sampled time series is tested on data generated from a continuous model system. The study is done in three parts. First, the effects of sampling time on the number of forbidden patterns are studied on regularly sampled time series. The next two parts focus on two types of irregular-sampling, missing data and timing jitter. It is shown that forbidden patterns can be used to detect determinism in irregularly sampled time series for low degrees of sampling irregularity (as defined in the paper). In addition, comments are made about the appropriateness of using the BP methodology to symbolize irregularly sampled time series.
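
    The counting step can be sketched as below for a toy deterministic signal and a noise series, with irregular sampling simulated by deleting samples at random; the embedding dimension and the series themselves are assumptions, not the model system used in the paper.

      # Hedged sketch: count forbidden Bandt-Pompe ordinal patterns; deterministic series
      # tend to retain forbidden patterns under mild sampling irregularity, noise does not.
      import numpy as np
      from itertools import permutations

      def forbidden_pattern_count(x, m=4):
          observed = {tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)}
          return len(set(permutations(range(m))) - observed)

      rng = np.random.default_rng(3)
      deterministic = np.sin(np.linspace(0, 60*np.pi, 3000)) + 0.3*np.sin(np.linspace(0, 17*np.pi, 3000))
      stochastic = rng.standard_normal(3000)

      keep = np.sort(rng.choice(3000, size=2400, replace=False))   # simulate missing data
      print(forbidden_pattern_count(deterministic[keep]),          # expect many forbidden patterns
            forbidden_pattern_count(stochastic[keep]))             # expect few or none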

  15. Improving estimates of ecosystem metabolism by reducing effects of tidal advection on dissolved oxygen time series-Abstract

    EPA Science Inventory

    Continuous time series of dissolved oxygen (DO) have been used to compute estimates of metabolism in aquatic ecosystems. Central to this open water or "Odum" method is the assumption that the DO time series is not strongly affected by advection and that effects due to advection or mixin...

  16. Continuous Remote Measurements of Atmospheric O2 Concentrations in Relation to Interannual Variations in Biological Production and Carbon Cycling in the Oceans

    NASA Technical Reports Server (NTRS)

    Keeling, Ralph F.; Campbell, J. A. (Technical Monitor)

    2002-01-01

    We successfully initiated a program to obtain continuous time series of atmospheric O2 concentrations at a semi-remote coastal site in Trinidad, California. The installation, which was completed in September 1999, consists of commercially available O2 and CO2 analyzers interfaced to a custom gas handling system and housed in a dedicated building at the Trinidad site. Ultimately, the data from this site are expected to provide constraints, complementing satellite data, on variations in ocean productivity and carbon exchange on annual and interannual time scales, in the context of human-induced changes in global climate and other perturbations. The existing time series, of limited duration, have been used in support of studies of the O2/CO2 exchange from a wildfire (which fortuitously occurred nearby in October 1999) and to quantify air-sea N2O and O2 exchanges related to coastal upwelling events. More generally, the project demonstrates the feasibility of obtaining semi-continuous O2 time series at moderate cost from strategic locations globally.

  17. On-off intermittency in time series of spontaneous paroxysmal activity in rats with genetic absence epilepsy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hramov, Alexander; Koronovskii, Alexey A.; Midzyanovskaya, I.S.

    2006-12-15

    In the present paper we consider the on-off intermittency phenomena observed in time series of spontaneous paroxysmal activity in rats with genetic absence epilepsy. The method to register and analyze the electroencephalogram with the help of continuous wavelet transform is also suggested.

  18. 75 FR 61178 - Proposed Collection, Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-04

    ... both time-series indexes and cost levels for industry and occupational groups, thereby increasing the... ensure that requested data can be provided in the desired format, reporting burden (time and financial... Employee Compensation. The data provided will be the same, and the series will be continuous. The NCS will...

  19. Comparison Of In Situ Soil Moisture Measurements: An Examination of the Neutron and Dielectric Measurements within the Illinois Climate Network

    USDA-ARS?s Scientific Manuscript database

    The continuity of soil moisture time series data is crucial for climatic research. Yet, a common problem for continuous data series is the changing of sensors, not only as replacements are necessary, but as technologies evolve. The Illinois Climate Network has one of the longest data records of soi...

  20. 27 CFR 19.594 - Numbering of packages and cases in processing.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... trade name, all series in use at that time shall be continued. However, for a change in proprietorship... spirits and denatured spirits shall, when filled, be consecutively numbered in a separate series by the proprietor commencing with “1” in each series of serial numbers, except that any series of such numbers in...

  1. LAI, FAPAR and FCOVER products derived from AVHRR long time series: principles and evaluation

    NASA Astrophysics Data System (ADS)

    Verger, A.; Baret, F.; Weiss, M.; Lacaze, R.; Makhmara, H.; Pacholczyk, P.; Smets, B.; Kandasamy, S.; Vermote, E.

    2012-04-01

    Continuous and long-term global monitoring of the terrestrial biosphere has drawn intense interest in recent years in the context of climate and global change. Developing methodologies for generating historical data records from data collected with different satellite sensors over the past three decades, while taking advantage of the improvements identified in the processing of the new generation of sensors, is now a central issue in the remote sensing community. In this context, the Bio-geophysical Parameters (BioPar) service within the Geoland2 project (http://www.geoland2.eu) aims at developing pre-operational infrastructures for providing global land products both in near real time and in off-line mode with long time series. In this contribution, we describe the principles of the GEOLAND algorithm for generating long-term datasets of three key biophysical variables, leaf area index (LAI), fraction of absorbed photosynthetically active radiation (FAPAR) and cover fraction (FCOVER), which play a key role in several processes, including photosynthesis, respiration and transpiration. LAI, FAPAR and FCOVER are produced globally from the AVHRR Long Term Data Record (LTDR) for the 1981-2000 period at 0.05° spatial resolution and 10-day temporal sampling frequency. The proposed algorithm aims to ensure robustness of the derived long time series and consistency with the products developed in recent years, particularly the GEOLAND products derived from the VEGETATION sensor. The approach is based on the capacity of neural networks to learn a particular biophysical product (GEOLAND) from reflectances from another sensor (AVHRR normalized reflectances in the red and near-infrared bands). Outliers due to possible cloud contamination or residual atmospheric correction errors are iteratively eliminated. Prior information based on the climatology is used to obtain more robust estimates. A specific gap filling and smoothing procedure was applied to generate continuous and smooth time series of decadal products. Finally, quality assessment information as well as tentative quantitative uncertainties are provided. The comparison of the resulting AVHRR LTDR products with the actual GEOLAND series derived from VEGETATION demonstrates that they are very consistent, providing continuous time series of global observations of LAI, FAPAR and FCOVER for the last 30-year period, with continuation after 2011.

  2. Glucose Prediction Algorithms from Continuous Monitoring Data: Assessment of Accuracy via Continuous Glucose Error-Grid Analysis.

    PubMed

    Zanderigo, Francesca; Sparacino, Giovanni; Kovatchev, Boris; Cobelli, Claudio

    2007-09-01

    The aim of this article was to use continuous glucose error-grid analysis (CG-EGA) to assess the accuracy of two time-series modeling methodologies recently developed to predict glucose levels ahead of time using continuous glucose monitoring (CGM) data. We considered subcutaneous time series of glucose concentration monitored every 3 minutes for 48 hours by the minimally invasive CGM sensor Glucoday® (Menarini Diagnostics, Florence, Italy) in 28 type 1 diabetic volunteers. Two prediction algorithms, based on first-order polynomial and autoregressive (AR) models, respectively, were considered with prediction horizons of 30 and 45 minutes and forgetting factors (ff) of 0.2, 0.5, and 0.8. CG-EGA was used on the predicted profiles to assess their point and dynamic accuracies using original CGM profiles as reference. Continuous glucose error-grid analysis showed that the accuracy of both prediction algorithms is overall very good and that their performance is similar from a clinical point of view. However, the AR model seems preferable for hypoglycemia prevention. CG-EGA also suggests that, irrespective of the time-series model, the use of ff = 0.8 yields the highest accurate readings in all glucose ranges. For the first time, CG-EGA is proposed as a tool to assess clinically relevant performance of a prediction method separately at hypoglycemia, euglycemia, and hyperglycemia. In particular, we have shown that CG-EGA can be helpful in comparing different prediction algorithms, as well as in optimizing their parameters.
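
    One of the two predictor families, the first-order polynomial model with an exponential forgetting factor, can be sketched as below; the 3-minute sampling interval, 30-minute horizon and ff = 0.8 follow the abstract, while the simulated glucose trace and the plain weighted least-squares fit are assumptions.

      # Hedged sketch: first-order (trend) glucose predictor with a forgetting factor,
      # extrapolating a weighted linear fit 30 minutes beyond the last CGM sample.
      import numpy as np

      def weighted_linear_forecast(y, sample_min=3.0, horizon_min=30.0, ff=0.8):
          n = y.size
          t = np.arange(n) * sample_min
          w = np.sqrt(ff ** np.arange(n - 1, -1, -1))       # recent samples weigh most
          A = np.column_stack([np.ones(n), t])
          coef, *_ = np.linalg.lstsq(A * w[:, None], y * w, rcond=None)
          return coef[0] + coef[1] * (t[-1] + horizon_min)

      rng = np.random.default_rng(4)
      glucose = 120 + 30*np.sin(np.linspace(0, 2*np.pi, 200)) + 2*rng.standard_normal(200)
      print(weighted_linear_forecast(glucose[:100]))        # prediction 30 min past sample 100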

  3. A perturbative approach for enhancing the performance of time series forecasting.

    PubMed

    de Mattos Neto, Paulo S G; Ferreira, Tiago A E; Lima, Aranildo R; Vasconcelos, Germano C; Cavalcanti, George D C

    2017-04-01

    This paper proposes a method to perform time series prediction based on perturbation theory. The approach is based on continuously adjusting an initial forecasting model to asymptotically approximate a desired time series model. First, a predictive model generates an initial forecast for a time series. Second, a residual time series is calculated as the difference between the original time series and the initial forecast. If that residual series is not white noise, it can be used to improve the accuracy of the initial model, and a new predictive model is adjusted using the residual series. The whole process is repeated until convergence or until the residual series becomes white noise. The output of the method is then given by summing the outputs of all trained predictive models in a perturbative sense. To test the method, an experimental investigation was conducted on six real-world time series. A comparison was made with six other methods evaluated in the same experiments and with ten other results found in the literature. The results show that not only is the performance of the initial model significantly improved, but the proposed method also outperforms the previously published results. Copyright © 2017 Elsevier Ltd. All rights reserved.
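
    A minimal sketch of the perturbative loop, assuming simple least-squares AR models as the predictive models at every stage (the paper itself is agnostic about the base learners): each stage models the residual left by the previous stages, and the final forecast is the sum of all stages' outputs.

      # Hedged sketch: iterative residual modeling ("perturbative" correction) with AR
      # models; stages keep being added while the residual is not white-noise-like.
      import numpy as np

      def fit_ar(y, p=5):
          X = np.column_stack([y[p - k - 1:len(y) - k - 1] for k in range(p)])
          coef, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
          return coef

      def ar_predict(y, coef):
          p = coef.size
          return np.array([y[i - p:i][::-1] @ coef for i in range(p, len(y))])

      rng = np.random.default_rng(5)
      y = np.sin(np.linspace(0, 40*np.pi, 2000)) + 0.1*rng.standard_normal(2000)

      residual, stages = y.copy(), []
      for _ in range(3):                             # fixed number of stages for illustration
          coef = fit_ar(residual)
          stages.append(coef)
          residual = residual[coef.size:] - ar_predict(residual, coef)  # what this stage missed
      # A white-noise test on `residual` (e.g. Ljung-Box) would decide when to stop adding stages.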

  4. Behaviour of a series of reservoirs separated by drowned gates

    NASA Astrophysics Data System (ADS)

    Kolechkina, Alla; van Nooijen, Ronald

    2017-04-01

    Modern control systems tend to be based on computers and therefore to operate by sending commands to structures at given intervals (discrete time control system). Moreover, for almost all water management control systems there are practical lower limits on the time interval between structure adjustments and even between measurements. The water resource systems that are being controlled are physical systems whose state changes continuously. If we combine a continuously changing system and a discrete time controller we get a hybrid system. We use material from recent control theory literature to examine the behaviour of a series of reservoirs separated by drowned gates where the gates are under computer control.

  5. GPS Time Series and Geodynamic Implications for the Hellenic Arc Area, Greece

    NASA Astrophysics Data System (ADS)

    Hollenstein, Ch.; Heller, O.; Geiger, A.; Kahle, H.-G.; Veis, G.

    The quantification of crustal deformation and its temporal behavior is an important contribution to earthquake hazard assessment. With GPS measurements, especially from continuously operating stations, pre-, co-, post- and interseismic movements can be recorded and monitored. We present results from a continuous GPS network that has been operated in the Hellenic Arc area, Greece, since 1995. In order to obtain coordinate time series of high precision that are representative of crustal deformation, a main goal was to eliminate effects that are not of tectonic origin. By applying different steps of improvement, non-tectonic irregularities were reduced significantly, and the precision was improved by an average of 40%. The improved time series are used to study crustal movements in space and time. They serve as a basis for the estimation of velocities and for the visualization of the movements in terms of trajectories. Special attention is given to large earthquakes (M>6) that occurred near GPS sites during the measuring time span.

  6. Improving Global Mass Flux Solutions from Gravity Recovery and Climate Experiment (GRACE) Through Forward Modeling and Continuous Time Correlation

    NASA Technical Reports Server (NTRS)

    Sabaka, T. J.; Rowlands, D. D.; Luthcke, S. B.; Boy, J.-P.

    2010-01-01

    We describe Earth's mass flux from April 2003 through November 2008 by deriving a time series of mascons on a global 2° x 2° equal-area grid at 10-day intervals. We estimate the mass flux directly from K-band range rate (KBRR) data provided by the Gravity Recovery and Climate Experiment (GRACE) mission. Using regularized least squares, we take into account the underlying process dynamics through continuous space- and time-correlated constraints. In addition, we place the mascon approach in the context of other filtering techniques, showing its equivalence to anisotropic, nonsymmetric filtering, least squares collocation, and Kalman smoothing. We produce mascon time series from KBRR data that have and have not been corrected (forward modeled) for hydrological processes and find that the former produce superior results in oceanic areas by minimizing signal leakage from strong sources on land. By exploiting the structure of the spatiotemporal constraints, we are able to use a much more efficient (in storage and computation) inversion algorithm based upon the conjugate gradient method. This allows us to apply continuous rather than piecewise continuous time-correlated constraints, which we show, via global maps and comparisons with ocean-bottom pressure gauges, produce time series with reduced random variance and full systematic signal. Finally, we present a preferred global model, a hybrid whose oceanic portions are derived using forward modeling of hydrology but whose land portions are not, and which thus represents a pure GRACE-derived signal.
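
    A toy version of the computational core, regularized least squares with a time-correlation (first-difference) penalty solved by conjugate gradients, is sketched below; the random design matrix and penalty weight are placeholders and bear no relation to the actual GRACE KBRR observation equations.

      # Hedged sketch: solve (A^T A + lam * D^T D) x = A^T b with the conjugate gradient
      # method, where D penalizes differences between neighbouring epochs/mascons.
      import numpy as np
      from scipy.sparse.linalg import cg, LinearOperator

      rng = np.random.default_rng(6)
      m, n = 400, 120                                # observations vs mascon parameters (toy sizes)
      A = rng.standard_normal((m, n))
      x_true = np.sin(np.linspace(0, 4*np.pi, n))
      b = A @ x_true + 0.1*rng.standard_normal(m)

      D = np.diff(np.eye(n), axis=0)                 # first-difference (correlation) constraint
      lam = 5.0

      def normal_matvec(v):                          # apply the regularized normal matrix to v
          return A.T @ (A @ v) + lam * (D.T @ (D @ v))

      op = LinearOperator((n, n), matvec=normal_matvec)
      x_hat, info = cg(op, A.T @ b)                  # info == 0 signals convergence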

  7. Microbial oceanography and the Hawaii Ocean Time-series programme.

    PubMed

    Karl, David M; Church, Matthew J

    2014-10-01

    The Hawaii Ocean Time-series (HOT) programme has been tracking microbial and biogeochemical processes in the North Pacific Subtropical Gyre since October 1988. The near-monthly time series observations have revealed previously undocumented phenomena within a temporally dynamic ecosystem that is vulnerable to climate change. Novel microorganisms, genes and unexpected metabolic pathways have been discovered and are being integrated into our evolving ecological paradigms. Continued research, including higher-frequency observations and at-sea experimentation, will help to provide a comprehensive scientific understanding of microbial processes in the largest biome on Earth.

  8. 8 CFR 245a.1 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... having interrupted his or her continuous residence as required at the time of filing an application. (2... in lieu of, the Federal Citizenship Text series); (3) Be designed to provide at least 60 hours of... the alien shall be regarded as having resided continuously in the United States if, at the time of...

  9. 8 CFR 245a.1 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... having interrupted his or her continuous residence as required at the time of filing an application. (2... in lieu of, the Federal Citizenship Text series); (3) Be designed to provide at least 60 hours of... the alien shall be regarded as having resided continuously in the United States if, at the time of...

  10. Estimations of the Global Distribution and Time Series of UV Noontime Irradiance (305, 310, 324, 380 nm, and Erythemal) from TOMS and SeaWiFS Data

    NASA Technical Reports Server (NTRS)

    Herman, J.

    2004-01-01

    The amount of UV irradiance reaching the Earth's surface is estimated from the measured cloud reflectivity, ozone, aerosol amounts, and surface reflectivity time series from 1980 to 1992 and 1997 to 2000, to estimate changes that have occurred over a 21-year period. Recent analysis of the TOMS data shows that there has been an apparent increase in reflectivity (decrease in UV) in the Southern Hemisphere that is related to a calibration error in EP-TOMS. Data from the well-calibrated SeaWiFS satellite instrument have been used to correct the EP-TOMS reflectivity and UV time series. After correction, some of the local trend features seen in the N7 time series (1980 to 1992) are continued in the combined time series, but the overall zonal average and global trends have changed. In addition to correcting the EP-TOMS radiance calibration, the use of SeaWiFS cloud data permits estimation of UV irradiance at higher spatial resolution (1 to 4 km) than is available from TOMS (100 km), under the assumption that ozone is slowly varying over a scale of 100 km. The key results include a continuing decrease in cloud cover over Europe and North America with a corresponding increase in UV, and a decrease in UV irradiance near Antarctica.

  11. Crustal Movements and Gravity Variations in the Southeastern Po Plain, Italy

    NASA Astrophysics Data System (ADS)

    Zerbini, S.; Bruni, S.; Errico, M.; Santi, E.; Wilmes, H.; Wziontek, H.

    2014-12-01

    At the Medicina observatory in the southeastern Po Plain, Italy, we started a project of continuous GPS and gravity observations in mid-1996. The experiment, focused on a comparison between height and gravity variations, is still ongoing; these uninterrupted time series constitute a most important database, not only for observing and reliably estimating long-period behaviors but also for deriving deeper insights into the nature of the crustal deformation. Almost two decades of continuous GPS observations from two closely located receivers have shown that the coordinate time series are characterized by linear and non-linear variations as well as by sudden jumps. Over both long- and short-period time scales, the GPS height series show signals induced by different phenomena, for example those related to mass transport in the Earth system. Seasonal effects are clearly recognizable and are mainly associated with the seasonal behavior of the water table. Understanding and separating the contributions of different forcings is not an easy task; to this end, the information provided by the superconducting gravimeter observations and by absolute gravity measurements offers a most important means to detect and understand mass contributions. In addition to GPS and gravity data, time series of a number of environmental parameters, among them water table levels, are also regularly acquired at Medicina. We present the results of a study investigating correlations between the height, gravity and environmental parameter time series.

  12. A Recurrent Probabilistic Neural Network with Dimensionality Reduction Based on Time-series Discriminant Component Analysis.

    PubMed

    Hayashi, Hideaki; Shibanoki, Taro; Shima, Keisuke; Kurita, Yuichi; Tsuji, Toshio

    2015-12-01

    This paper proposes a probabilistic neural network (NN) developed on the basis of time-series discriminant component analysis (TSDCA) that can be used to classify high-dimensional time-series patterns. TSDCA involves the compression of high-dimensional time series into a lower dimensional space using a set of orthogonal transformations and the calculation of posterior probabilities based on a continuous-density hidden Markov model with a Gaussian mixture model expressed in the reduced-dimensional space. The analysis can be incorporated into an NN, which is named a time-series discriminant component network (TSDCN), so that parameters of dimensionality reduction and classification can be obtained simultaneously as network coefficients according to a backpropagation through time-based learning algorithm with the Lagrange multiplier method. The TSDCN is considered to enable high-accuracy classification of high-dimensional time-series patterns and to reduce the computation time taken for network training. The validity of the TSDCN is demonstrated for high-dimensional artificial data and electroencephalogram signals in the experiments conducted during the study.

  13. POLYNOMIAL-BASED DISAGGREGATION OF HOURLY RAINFALL FOR CONTINUOUS HYDROLOGIC SIMULATION

    EPA Science Inventory

    Hydrologic modeling of urban watersheds for designs and analyses of stormwater conveyance facilities can be performed in either an event-based or continuous fashion. Continuous simulation requires, among other things, the use of a time series of rainfall amounts. However, for urb...

  14. waterData--An R package for retrieval, analysis, and anomaly calculation of daily hydrologic time series data, version 1.0

    USGS Publications Warehouse

    Ryberg, Karen R.; Vecchia, Aldo V.

    2012-01-01

    Hydrologic time series data and associated anomalies (multiple components of the original time series representing variability at longer-term and shorter-term time scales) are useful for modeling trends in hydrologic variables, such as streamflow, and for modeling water-quality constituents. An R package, called waterData, has been developed for importing daily hydrologic time series data from U.S. Geological Survey streamgages into the R programming environment. In addition to streamflow, data retrieval may include gage height and continuous physical property data, such as specific conductance, pH, water temperature, turbidity, and dissolved oxygen. The package allows for importing daily hydrologic data into R, plotting the data, fixing common data problems, summarizing the data, and the calculation and graphical presentation of anomalies.
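    The anomalies described here are components of a daily series that capture variability at different time scales. The exact decomposition used by the waterData package is not reproduced below; the snippet is only an illustrative numpy sketch of the general idea, splitting a log-transformed daily streamflow record into long-term, seasonal-scale, and short-term parts with centered moving averages. Window lengths and names are assumptions.

    ```python
    import numpy as np

    def moving_average(x, window):
        """Centered moving average; edges padded by reflection (assumes a multi-year record)."""
        pad = window // 2
        xp = np.pad(x, pad, mode="reflect")
        kernel = np.ones(window) / window
        return np.convolve(xp, kernel, mode="same")[pad:pad + len(x)]

    def anomalies(daily_flow):
        """Split log-flow into long-term (multi-year), seasonal-scale, and daily anomalies."""
        y = np.log10(daily_flow)
        longterm = moving_average(y, 5 * 365)        # multi-year scale
        seasonal = moving_average(y, 30) - longterm  # roughly monthly scale
        shortterm = y - longterm - seasonal          # day-to-day residual
        return longterm, seasonal, shortterm
    ```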

  15. 75 FR 49010 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Proposed...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-12

    ... also proposes to identify EOW and EOM trading patterns by undertaking a time series analysis of open... a Friday, the Exchange will list an End of Month expiration series and not an End of Week expiration... continue to exist. However, any further trading in those series would be restricted to transactions where...

  16. New Continuous Monitoring Technologies for Vapor Intrusion, Remediation and Site Assessment: Benefits of Time Series Data

    DTIC Science & Technology

    2011-03-31

    [Report documentation-page and figure text; recoverable content: authors Dr Peter Morris and Geoff Hewitt; the presentation covers continuous VOC monitoring (e.g., acetone at an industrial facility with a VOC leak), site characterisation, and real-time monitoring of remediation.]

  17. [MODIS Investigation

    NASA Technical Reports Server (NTRS)

    Abbott, Mark R.

    1998-01-01

    The objectives of the last six months were to: continue analysis of Hawaii Ocean Time-series (HOT) bio-optical mooring data; recover instrumentation from JGOFS cruises in the Southern Ocean and analyze the data; maintain documentation of MOCEAN algorithms and software for use by the MOCEAN and GLI teams; continue chemostat experiments on the relationship of fluorescence quantum yield to environmental factors; and continue to develop and expand a browser-based information system for in situ bio-optical data. Analysis of field data from Hawaii: We are continuing to analyze bio-optical data collected at the Hawaii Ocean Time Series mooring. The HOT bio-optical mooring was recovered in May 1998. After retrieving the data, the sensor package was serviced and redeployed. We now have over 18 months of data. These are being analyzed as part of a larger study of mesoscale processes at this JGOFS time series site. We have had some failures in the data logger which have affected the fluorescence channels. These are being repaired. We also had an instrument housing failure, and minor modifications have been made to avoid subsequent problems. In addition, Ricardo Letelier is funded as part of the SeaWiFS calibration/validation effort (through a subcontract from the University of Hawaii, Dr. John Porter), and he is collecting bio-optical and fluorescence data as part of the HOT activity.

  18. Higher-Order Hurst Signatures: Dynamical Information in Time Series

    NASA Astrophysics Data System (ADS)

    Ferenbaugh, Willis

    2005-10-01

    Understanding and comparing time series from different systems requires characteristic measures of the dynamics embedded in the series. The Hurst exponent is a second-order dynamical measure of a time series which grew up within the blossoming fractal world of Mandelbrot. This characteristic measure is directly related to the behavior of the autocorrelation, the power-spectrum, and other second-order things. And as with these other measures, the Hurst exponent captures and quantifies some but not all of the intrinsic nature of a series. The more elusive characteristics live in the phase spectrum and the higher-order spectra. This research is a continuing quest to (more) fully characterize the dynamical information in time series produced by plasma experiments or models. The goal is to supplement the series information which can be represented by a Hurst exponent, and we would like to develop supplemental techniques in analogy with Hurst's original R/S analysis. These techniques should be another way to plumb the higher-order dynamics.
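    The record refers to Hurst's original rescaled-range (R/S) analysis as the second-order baseline that the proposed higher-order measures would supplement. For background only, a minimal numpy sketch of classical R/S estimation of the Hurst exponent follows; it is a generic textbook implementation, not the author's code.

    ```python
    import numpy as np

    def hurst_rs(x, min_chunk=8):
        """Estimate the Hurst exponent of series x by rescaled-range (R/S) analysis."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        sizes, rs_vals = [], []
        size = n
        while size >= min_chunk:
            rs = []
            for start in range(0, n - size + 1, size):
                seg = x[start:start + size]
                dev = np.cumsum(seg - seg.mean())   # cumulative deviation from the segment mean
                r = dev.max() - dev.min()           # range
                s = seg.std(ddof=1)                 # standard deviation
                if s > 0:
                    rs.append(r / s)
            sizes.append(size)
            rs_vals.append(np.mean(rs))
            size //= 2
        # slope of log(R/S) versus log(window size) estimates H
        h, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
        return h
    ```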

  19. Covariance Function for Nearshore Wave Assimilation Systems

    DTIC Science & Technology

    2018-01-30

    covariance can be modeled by a parameterized Gaussian function, for nearshore wave assimilation applications, the covariance function depends primarily on...case of missing values at the compiled time series, the gaps were filled by weighted interpolation. The weights depend on the number of the...averaging, in order to create the continuous time series, filters out the dependency on the instantaneous meteorological and oceanographic conditions

  20. Continuity of care in community midwifery.

    PubMed

    Bowers, John; Cheyne, Helen; Mould, Gillian; Page, Miranda

    2015-06-01

    Continuity of care is often critical in delivering high quality health care. However, it is difficult to achieve in community health care where shift patterns and a need to minimise travelling time can reduce the scope for allocating staff to patients. Community midwifery is one example of such a challenge in the National Health Service where postnatal care typically involves a series of home visits. Ideally mothers would receive all of their antenatal and postnatal care from the same midwife. Minimising the number of staff-handovers helps ensure a better relationship between mothers and midwives, and provides more opportunity for staff to identify emerging problems over a series of home visits. This study examines the allocation and routing of midwives in the community using a variant of a multiple travelling salesmen problem algorithm incorporating staff preferences to explore trade-offs between travel time and continuity of care. This algorithm was integrated in a simulation to assess the additional effect of staff availability due to shift patterns and part-time working. The results indicate that continuity of care can be achieved with relatively small increases in travel time. However, shift patterns are problematic: perfect continuity of care is impractical but if there is a degree of flexibility in the visit schedule, reasonable continuity is feasible.

  1. Flood mapping in ungauged basins using fully continuous hydrologic-hydraulic modeling

    NASA Astrophysics Data System (ADS)

    Grimaldi, Salvatore; Petroselli, Andrea; Arcangeletti, Ettore; Nardi, Fernando

    2013-04-01

    In this work, a fully-continuous hydrologic-hydraulic modeling framework for flood mapping is introduced and tested. It is characterized by a simulation of a long rainfall time series at sub-daily resolution that feeds a continuous rainfall-runoff model producing a discharge time series that is directly given as an input to a bi-dimensional hydraulic model. The main advantage of the proposed approach is to avoid the use of the design hyetograph and the design hydrograph that constitute the main source of subjective analysis and uncertainty for standard methods. The proposed procedure is optimized for small and ungauged watersheds where empirical models are commonly applied. Results of a simple real case study confirm that this experimental fully-continuous framework may pave the way for the implementation of a less subjective and potentially automated procedure for flood hazard mapping.

  2. State-space prediction model for chaotic time series

    NASA Astrophysics Data System (ADS)

    Alparslan, A. K.; Sayar, M.; Atilgan, A. R.

    1998-08-01

    A simple method for predicting the continuation of scalar chaotic time series ahead in time is proposed. The false nearest neighbors technique in connection with the time-delayed embedding is employed so as to reconstruct the state space. A local forecasting model based upon the time evolution of the topological neighboring in the reconstructed phase space is suggested. A moving root-mean-square error is utilized in order to monitor the error along the prediction horizon. The model is tested for the convection amplitude of the Lorenz model. The results indicate that for approximately 100 cycles of the training data, the prediction follows the actual continuation very closely about six cycles. The proposed model, like other state-space forecasting models, captures the long-term behavior of the system due to the use of spatial neighbors in the state space.
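    The prediction scheme sketched in this abstract, delay embedding followed by forecasting from the evolution of nearby states, is compact enough to illustrate. The numpy code below is a generic local nearest-neighbor predictor under assumed embedding parameters; the false-nearest-neighbors selection of the embedding dimension and the Lorenz integration are omitted.

    ```python
    import numpy as np

    def embed(x, dim, tau):
        """Time-delay embedding of a scalar series into dim-dimensional state vectors."""
        n = len(x) - (dim - 1) * tau
        return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

    def predict_next(x, dim=3, tau=5, k=5):
        """Predict the next value from the average evolution of the k nearest neighbors."""
        states = embed(x, dim, tau)
        query = states[-1]
        candidates = states[:-1]                 # neighbors must have a known successor
        dists = np.linalg.norm(candidates - query, axis=1)
        nearest = np.argsort(dists)[:k]
        # the successor of state i is the scalar one step after its last component
        successors = x[nearest + (dim - 1) * tau + 1]
        return successors.mean()
    ```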

  3. Characterizing the continuously acquired cardiovascular time series during hemodialysis, using median hybrid filter preprocessing noise reduction.

    PubMed

    Wilson, Scott; Bowyer, Andrea; Harrap, Stephen B

    2015-01-01

    The clinical characterization of cardiovascular dynamics during hemodialysis (HD) has important pathophysiological implications in terms of diagnostic, cardiovascular risk assessment, and treatment efficacy perspectives. Currently the diagnosis of significant intradialytic systolic blood pressure (SBP) changes among HD patients is imprecise and opportunistic, reliant upon the presence of hypotensive symptoms in conjunction with coincident but isolated noninvasive brachial cuff blood pressure (NIBP) readings. Considering hemodynamic variables as a time series makes a continuous recording approach more desirable than intermittent measures; however, in the clinical environment, the data signal is susceptible to corruption due to both impulsive and Gaussian-type noise. Signal preprocessing is an attractive solution to this problem. Prospectively collected continuous noninvasive SBP data over the short-break intradialytic period in ten patients was preprocessed using a novel median hybrid filter (MHF) algorithm and compared with 50 time-coincident pairs of intradialytic NIBP measures from routine HD practice. The median hybrid preprocessing technique for continuously acquired cardiovascular data yielded a dynamic regression without significant noise and artifact, suitable for high-level profiling of time-dependent SBP behavior. Signal accuracy is highly comparable with standard NIBP measurement, with the added clinical benefit of dynamic real-time hemodynamic information.
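    The preprocessing step is a median hybrid filter. The paper's specific algorithm is not given in the abstract, so the snippet below is only a generic FIR-median-hybrid sketch: each output sample is the median of the left-window mean, the current sample, and the right-window mean, which suppresses impulsive spikes while following genuine trends. The window length is an assumption.

    ```python
    import numpy as np

    def fir_median_hybrid(x, half_window=5):
        """Generic FIR median hybrid filter: median of {left mean, current sample, right mean}."""
        x = np.asarray(x, dtype=float)
        y = x.copy()
        for i in range(half_window, len(x) - half_window):
            left = x[i - half_window:i].mean()
            right = x[i + 1:i + 1 + half_window].mean()
            y[i] = np.median([left, x[i], right])
        return y
    ```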

  4. Using simulations and data to evaluate mean sensitivity (ζ) as a useful statistic in dendrochronology

    Treesearch

    Andrew G. Bunn; Esther Jansma; Mikko Korpela; Robert D. Westfall; James Baldwin

    2013-01-01

    Mean sensitivity (ζ) continues to be used in dendrochronology despite a literature that shows it to be of questionable value in describing the properties of a time series. We simulate first-order autoregressive models with known parameters and show that ζ is a function of variance and autocorrelation of a time series. We then use 500 random tree-ring...
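    Mean sensitivity has a standard definition, and the simulation described (first-order autoregressive series with known parameters) is straightforward to reproduce in outline. Below is a small numpy sketch, not the authors' code, that computes ζ for simulated AR(1) series and illustrates its dependence on the autocorrelation parameter; series length, mean offset, and parameter values are arbitrary.

    ```python
    import numpy as np

    def mean_sensitivity(x):
        """Mean sensitivity: average relative difference between consecutive values."""
        x = np.asarray(x, dtype=float)
        return np.mean(np.abs(2.0 * (x[1:] - x[:-1]) / (x[1:] + x[:-1])))

    def simulate_ar1(n, phi, sigma=1.0, mean=10.0, seed=0):
        """AR(1) series with autocorrelation phi, shifted to a positive mean."""
        rng = np.random.default_rng(seed)
        x = np.empty(n)
        x[0] = 0.0
        for t in range(1, n):
            x[t] = phi * x[t - 1] + rng.normal(scale=sigma)
        return x + mean

    for phi in (0.0, 0.4, 0.8):
        series = simulate_ar1(2000, phi)
        print(f"phi={phi:.1f}  mean sensitivity={mean_sensitivity(series):.3f}")
    ```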

  5. Diagnosis of digestive functional disease by the statistics of continuous monitoring of esophageal acidity

    NASA Astrophysics Data System (ADS)

    Rivera Landa, Rogelio; Cardenas Cardenas, Eduardo; Fossion, Ruben; Pérez Zepeda, Mario Ulises

    2014-11-01

    Technological advances in the last few decades allow the monitoring of many physiological observables in a continuous way, which in physics is called a "time series". The best studied physiological time series is that of the heart rhythm, which can be derived from an electrocardiogram (ECG). Studies have shown that a healthy heart is characterized by a complex time series and high heart rate variability (HRV). In adverse conditions, the cardiac time series degenerates towards randomness (as seen in, e.g., fibrillation) or rigidity (as seen in, e.g., ageing), both corresponding to a loss of HRV as described by, e.g., Goldberger et al. [1]. Cardiac and digestive rhythms are regulated by the autonomic nervous system (ANS), which consists of two antagonistic branches, the orthosympathetic branch (ONS) that accelerates the cardiac rhythm but decelerates the digestive system, and the parasympathetic branch (PNS) that works in the opposite way. For this reason, one might expect that the statistics of gastro-esophageal time series, as described by Gardner et al. [2,3], reflect the health state of the digestive system in a similar way as HRV in the cardiac case, described by Minocha et al. In the present project, we apply statistical methods derived from HRV analysis to time series of esophageal acidity (24h pHmetry). The study is realized on data from a large patient population from the Instituto Nacional de Ciencias Médicas y Nutrición Salvador Zubirán. Our focus is on patients with functional disease (symptoms but no anatomical damage). We find that traditional statistical approaches (e.g. Fourier spectral analysis) are unable to distinguish between different degenerations of the digestive system, such as gastroesophageal reflux disease (GERD) or functional gastrointestinal disorder (FGID).

  6. Guidelines and Procedures for Computing Time-Series Suspended-Sediment Concentrations and Loads from In-Stream Turbidity-Sensor and Streamflow Data

    USGS Publications Warehouse

    Rasmussen, Patrick P.; Gray, John R.; Glysson, G. Douglas; Ziegler, Andrew C.

    2009-01-01

    In-stream continuous turbidity and streamflow data, calibrated with measured suspended-sediment concentration data, can be used to compute a time series of suspended-sediment concentration and load at a stream site. Development of a simple linear (ordinary least squares) regression model for computing suspended-sediment concentrations from instantaneous turbidity data is the first step in the computation process. If the model standard percentage error (MSPE) of the simple linear regression model meets a minimum criterion, this model should be used to compute a time series of suspended-sediment concentrations. Otherwise, a multiple linear regression model using paired instantaneous turbidity and streamflow data is developed and compared to the simple regression model. If the inclusion of the streamflow variable proves to be statistically significant and the uncertainty associated with the multiple regression model results in an improvement over that for the simple linear model, the turbidity-streamflow multiple linear regression model should be used to compute a suspended-sediment concentration time series. The computed concentration time series is subsequently used with its paired streamflow time series to compute suspended-sediment loads by standard U.S. Geological Survey techniques. Once an acceptable regression model is developed, it can be used to compute suspended-sediment concentration beyond the period of record used in model development with proper ongoing collection and analysis of calibration samples. Regression models to compute suspended-sediment concentrations are generally site specific and should never be considered static, but they represent a set period in a continually dynamic system in which additional data will help verify any change in sediment load, type, and source.
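    The computation procedure, a simple turbidity-only regression that is escalated to a turbidity-plus-streamflow regression when the simple model is inadequate, can be illustrated with ordinary least squares. The numpy sketch below uses log-transformed variables, a common choice in this kind of guidance, but the synthetic data, variable names, and the model-acceptance test are placeholders rather than the report's actual MSPE criteria.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical calibration samples: turbidity (FNU), streamflow (ft^3/s), SSC (mg/L)
    turb = rng.uniform(5, 500, 60)
    flow = rng.uniform(50, 5000, 60)
    ssc = 2.0 * turb**0.9 * (flow / 1000) ** 0.2 * rng.lognormal(sigma=0.15, size=60)

    def ols(X, y):
        """Ordinary least squares; returns coefficients and residual standard error."""
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        rse = np.sqrt(np.sum((y - X @ beta) ** 2) / (len(y) - X.shape[1]))
        return beta, rse

    y = np.log10(ssc)
    X_simple = np.column_stack([np.ones_like(turb), np.log10(turb)])
    X_multi = np.column_stack([X_simple, np.log10(flow)])

    beta_s, rse_s = ols(X_simple, y)
    beta_m, rse_m = ols(X_multi, y)
    # Keep the simple model unless adding streamflow clearly reduces the model error
    model = ("multiple", beta_m) if rse_m < 0.95 * rse_s else ("simple", beta_s)
    print("selected model:", model[0])
    ```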

  7. Stochastic nature of series of waiting times.

    PubMed

    Anvari, Mehrnaz; Aghamohammadi, Cina; Dashti-Naserabadi, H; Salehi, E; Behjat, E; Qorbani, M; Nezhad, M Khazaei; Zirak, M; Hadjihosseini, Ali; Peinke, Joachim; Tabar, M Reza Rahimi

    2013-06-01

    Although fluctuations in the waiting time series have been studied for a long time, some important issues such as its long-range memory and its stochastic features in the presence of nonstationarity have so far remained unstudied. Here we find that the "waiting times" series for a given increment level have long-range correlations with Hurst exponents belonging to the interval 1/2

  8. On the statistical aspects of sunspot number time series and its association with the summer-monsoon rainfall over India

    NASA Astrophysics Data System (ADS)

    Chattopadhyay, Surajit; Chattopadhyay, Goutami

    The present paper reports studies on the association between the mean annual sunspot numbers and the summer monsoon rainfall over India. The cross correlations have been studied. After Box-Cox transformation, a time-spectral analysis has been carried out and it has been found that both of the time series have an important spectrum at the fifth harmonic. An artificial neural network (ANN) model has been developed on the data series continuously averaged over five-year periods, and the neural network could establish a predictor-predictand relationship between the sunspot numbers and the mean yearly summer monsoon rainfall over India.

  9. Long-term variability in Northern Hemisphere snow cover and associations with warmer winters

    USGS Publications Warehouse

    McCabe, Gregory J.; Wolock, David M.

    2010-01-01

    A monthly snow accumulation and melt model is used with gridded monthly temperature and precipitation data for the Northern Hemisphere to generate time series of March snow-covered area (SCA) for the period 1905 through 2002. The time series of estimated SCA for March is verified by comparison with previously published time series of SCA for the Northern Hemisphere. The time series of estimated Northern Hemisphere March SCA shows a substantial decrease since about 1970, and this decrease corresponds to an increase in mean winter Northern Hemisphere temperature. The increase in winter temperature has caused a decrease in the fraction of precipitation that occurs as snow and an increase in snowmelt for some parts of the Northern Hemisphere, particularly the mid-latitudes, thus reducing snow packs and March SCA. In addition, the increase in winter temperature and the decreases in SCA appear to be associated with a contraction of the circumpolar vortex and a poleward movement of storm tracks, resulting in decreased precipitation (and snow) in the low- to mid-latitudes and an increase in precipitation (and snow) in high latitudes. If Northern Hemisphere winter temperatures continue to warm as they have since the 1970s, then March SCA will likely continue to decrease.

  10. Long-term variability in Northern Hemisphere snow cover and associations with warmer winters

    USGS Publications Warehouse

    McCabe, G.J.; Wolock, D.M.

    2010-01-01

    A monthly snow accumulation and melt model is used with gridded monthly temperature and precipitation data for the Northern Hemisphere to generate time series of March snow-covered area (SCA) for the period 1905 through 2002. The time series of estimated SCA for March is verified by comparison with previously published time series of SCA for the Northern Hemisphere. The time series of estimated Northern Hemisphere March SCA shows a substantial decrease since about 1970, and this decrease corresponds to an increase in mean winter Northern Hemisphere temperature. The increase in winter temperature has caused a decrease in the fraction of precipitation that occurs as snow and an increase in snowmelt for some parts of the Northern Hemisphere, particularly the mid-latitudes, thus reducing snow packs and March SCA. In addition, the increase in winter temperature and the decreases in SCA appear to be associated with a contraction of the circumpolar vortex and a poleward movement of storm tracks, resulting in decreased precipitation (and snow) in the low- to mid-latitudes and an increase in precipitation (and snow) in high latitudes. If Northern Hemisphere winter temperatures continue to warm as they have since the 1970s, then March SCA will likely continue to decrease. © 2009 Springer Science+Business Media B.V.

  11. Coastline detection with time series of SAR images

    NASA Astrophysics Data System (ADS)

    Ao, Dongyang; Dumitru, Octavian; Schwarz, Gottfried; Datcu, Mihai

    2017-10-01

    For maritime remote sensing, coastline detection is a vital task. With continuous coastline detection results from satellite image time series, the actual shoreline, the sea level, and environmental parameters can be observed to support coastal management and disaster warning. Established coastline detection methods are often based on SAR images and well-known image processing approaches. These methods involve a lot of complicated data processing, which is a big challenge for remote sensing time series. Additionally, a number of SAR satellites operating with polarimetric capabilities have been launched in recent years, and many investigations of target characteristics in radar polarization have been performed. In this paper, a fast and efficient coastline detection method is proposed which comprises three steps. First, we calculate a modified correlation coefficient of two SAR images of different polarization. This coefficient differs from the traditional computation where normalization is needed. Through this modified approach, the separation between sea and land becomes more prominent. Second, we set a histogram-based threshold to distinguish between sea and land within the given image. The histogram is derived from the statistical distribution of the polarized SAR image pixel amplitudes. Third, we extract continuous coastlines using a Canny image edge detector that is rather immune to speckle noise. Finally, the individual coastlines derived from time series of SAR images can be checked for changes.
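    The three-step procedure (cross-polarization correlation, histogram threshold, Canny edge extraction) can be outlined compactly. The sketch below uses numpy and scikit-image and assumes two co-registered amplitude images of different polarizations as 2-D arrays; the unnormalized local correlation and the Otsu threshold are simplified stand-ins for the paper's exact formulations.

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter
    from skimage.filters import threshold_otsu
    from skimage.feature import canny

    def local_cross_product(img_a, img_b, size=7):
        """Unnormalized local correlation: windowed mean of the pixelwise product
        of the two polarization channels (simplified stand-in for the paper's coefficient)."""
        return uniform_filter(img_a * img_b, size=size)

    def coastline(img_hh, img_hv):
        corr = local_cross_product(img_hh, img_hv)
        # Histogram-based sea/land threshold; Otsu's rule stands in for the paper's choice
        land_mask = corr > threshold_otsu(corr)
        # Canny edge detection on the binary mask traces the sea-land boundary
        return canny(land_mask.astype(float), sigma=2.0)
    ```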

  12. Design Tool for Planning Permanganate Injection Systems

    DTIC Science & Technology

    2010-08-01

    [Acronym-list and section-heading excerpts; recoverable content: Section 6.2, "Simulating Oxidant Distribution Using a Series of CSTRs (continuously stirred tank reactors)", and Section 6.2.1, "Model Development", on the transport and consumption of permanganate.]

  13. NCCAM's 5 Most Searched-For Herbs of 2012: What the Science Says about Evening Primrose Oil, St. John's Wort, Fenugreek,...

    MedlinePlus

    ... has been a source of many folk or traditional remedies and more modern medicinal and cosmetic products. At various times aloe ...

  14. Stochastic nature of series of waiting times

    NASA Astrophysics Data System (ADS)

    Anvari, Mehrnaz; Aghamohammadi, Cina; Dashti-Naserabadi, H.; Salehi, E.; Behjat, E.; Qorbani, M.; Khazaei Nezhad, M.; Zirak, M.; Hadjihosseini, Ali; Peinke, Joachim; Tabar, M. Reza Rahimi

    2013-06-01

    Although fluctuations in the waiting time series have been studied for a long time, some important issues such as its long-range memory and its stochastic features in the presence of nonstationarity have so far remained unstudied. Here we find that the “waiting times” series for a given increment level have long-range correlations with Hurst exponents belonging to the interval 1/2

  15. Inference of scale-free networks from gene expression time series.

    PubMed

    Daisuke, Tominaga; Horton, Paul

    2006-04-01

    Quantitative time-series observation of gene expression is becoming possible, for example by cell array technology. However, there are no practical methods with which to infer network structures using only observed time-series data. As most computational models of biological networks for continuous time-series data have a high degree of freedom, it is almost impossible to infer the correct structures. On the other hand, it has been reported that some kinds of biological networks, such as gene networks and metabolic pathways, may have scale-free properties. We hypothesize that the architecture of inferred biological network models can be restricted to scale-free networks. We developed an inference algorithm for biological networks using only time-series data by introducing such a restriction. We adopt the S-system as the network model, and a distributed genetic algorithm to optimize models to fit its simulated results to observed time series data. We have tested our algorithm on a case study (simulated data). We compared optimization under no restriction, which allows for a fully connected network, and under the restriction that the total number of links must equal that expected from a scale free network. The restriction reduced both false positive and false negative estimation of the links and also the differences between model simulation and the given time-series data.
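    The network model adopted is the S-system, in which each variable's rate of change is a difference of two power-law terms. The record does not include the optimization code, so the snippet below only sketches simulating an S-system with scipy so that candidate parameter sets could be scored against observed time-series data; the parameter values are arbitrary illustrations, and the genetic algorithm and scale-free restriction are indicated only in comments.

    ```python
    import numpy as np
    from scipy.integrate import odeint

    def s_system(x, t, alpha, beta, G, H):
        """S-system ODEs: dx_i/dt = alpha_i * prod_j x_j**G_ij - beta_i * prod_j x_j**H_ij."""
        x = np.clip(x, 1e-6, None)                  # keep power laws well defined
        prod_g = np.prod(x ** G, axis=1)
        prod_h = np.prod(x ** H, axis=1)
        return alpha * prod_g - beta * prod_h

    # Arbitrary 3-gene example; a genetic algorithm would search over these parameters
    alpha = np.array([1.0, 0.8, 0.6])
    beta = np.array([0.5, 0.5, 0.5])
    G = np.array([[0.0, -0.8, 0.0],
                  [0.5,  0.0, 0.0],
                  [0.0,  0.7, 0.0]])
    H = np.array([[0.6, 0.0, 0.0],
                  [0.0, 0.6, 0.0],
                  [0.0, 0.0, 0.6]])

    t = np.linspace(0, 20, 200)
    trajectory = odeint(s_system, [1.0, 0.5, 0.5], t, args=(alpha, beta, G, H))
    # A fitness function would compare `trajectory` with the observed time series
    # (e.g. sum of squared differences) while restricting the number of nonzero
    # entries of G and H to that expected from a scale-free network.
    ```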

  16. DETECT: A MATLAB Toolbox for Event Detection and Identification in Time Series, with Applications to Artifact Detection in EEG Signals

    DTIC Science & Technology

    2013-04-24

    DETECT: A MATLAB Toolbox for Event Detection and Identification in Time Series, with Applications to Artifact Detection in EEG Signals Vernon...datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed...As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and

  17. A Methodology for the Parametric Reconstruction of Non-Steady and Noisy Meteorological Time Series

    NASA Astrophysics Data System (ADS)

    Rovira, F.; Palau, J. L.; Millán, M.

    2009-09-01

    Climatic and meteorological time series often show some persistence (in time) in the variability of certain features. One could regard annual, seasonal and diurnal time variability as trivial persistence in the variability of some meteorological magnitudes (as, e.g., global radiation, air temperature above surface, etc.). In these cases, the traditional Fourier transform into frequency space will show the principal harmonics as the components with the largest amplitude. Nevertheless, meteorological measurements often show other non-steady (in time) variability. Some fluctuations in measurements (at different time scales) are driven by processes that prevail on some days (or months) of the year but disappear on others. By decomposing a time series into time-frequency space through the continuous wavelet transformation, one is able to determine both the dominant modes of variability and how those modes vary in time. This study is based on a numerical methodology to analyse non-steady principal harmonics in noisy meteorological time series. This methodology combines both the continuous wavelet transform and the development of a parametric model that includes the time evolution of the principal and the most statistically significant harmonics of the original time series. The parameterisation scheme proposed in this study consists of reproducing the original time series by means of a statistically significant finite sum of sinusoidal signals (waves), each defined by using the three usual parameters: amplitude, frequency and phase. To ensure the statistical significance of the parametric reconstruction of the original signal, we propose a standard Student's t statistical analysis of the confidence level of the amplitude in the parametric spectrum for the different wave components. Once we have assured the level of significance of the different waves composing the parametric model, we can obtain the statistically significant principal harmonics (in time) of the original time series by using the Fourier transform of the modelled signal. Acknowledgements: The CEAM Foundation is supported by the Generalitat Valenciana and BANCAIXA (València, Spain). This study has been partially funded by the European Commission (FP VI, Integrated Project CIRCE - No. 036961) and by the Ministerio de Ciencia e Innovación, research projects "TRANSREG" (CGL2007-65359/CLI) and "GRACCIE" (CSD2007-00067, Program CONSOLIDER-INGENIO 2010).
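    The parametric model is a finite sum of sinusoids, each described by an amplitude, a frequency, and a phase, with the retained components chosen by a significance test. The code below is only a stripped-down numpy illustration of that reconstruction step: it picks the strongest Fourier components of a series and rebuilds the signal from them. The wavelet-based tracking of time-varying harmonics and the Student's t significance test are not reproduced, and the number of retained waves is an assumption.

    ```python
    import numpy as np

    def parametric_reconstruction(x, dt, n_waves=5):
        """Rebuild a series from its n_waves strongest Fourier components."""
        n = len(x)
        spectrum = np.fft.rfft(x - x.mean())
        freqs = np.fft.rfftfreq(n, d=dt)
        # strongest non-zero-frequency components
        strongest = np.argsort(np.abs(spectrum[1:]))[::-1][:n_waves] + 1
        t = np.arange(n) * dt
        model = np.full(n, x.mean())
        for k in strongest:
            amplitude = 2.0 * np.abs(spectrum[k]) / n
            phase = np.angle(spectrum[k])
            model += amplitude * np.cos(2 * np.pi * freqs[k] * t + phase)
        return model
    ```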

  18. The LTDP ALTS Project: Contributing to the Continued Understanding and Exploitation of the ATSR Time Series

    NASA Astrophysics Data System (ADS)

    Clarke, Hannah; Done, Fay; Casadio, Stefano; Mackin, Stephen; Dinelli, Bianca Maria; Castelli, Elisa

    2016-08-01

    The long time-series of observations made by the Along Track Scanning Radiometers (ATSR) missions represents a valuable resource for a wide range of research and EO applications. With the advent of ESA's Long-Term Data Preservation (LTDP) programme, thought has turned to the preservation and improved understanding of such long time-series, to support their continued exploitation in both existing and new areas of research, bringing the possibility of improving the existing data set and informing and contributing towards future missions. For this reason, the 'Long Term Stability of the ATSR Instrument Series: SWIR Calibration, Cloud Masking and SAA' project, commonly known as the ATSR Long Term Stability (or ALTS) project, is designed to explore the key characteristics of the data set and new and innovative ways of enhancing and exploiting it. Work has focussed on: a new approach to the assessment of Short Wave Infra-Red (SWIR) channel calibration; development of a new method for Total Column Water Vapour (TCWV) retrieval; study of the South Atlantic Anomaly (SAA); Radiative Transfer (RT) modelling for ATSR; providing AATSR observations with their location in the original instrument grid; strategies for the retrieval and archiving of historical ATSR documentation; study of TCWV retrieval over land; and development of new methods for cloud masking. This paper provides an overview of these activities and illustrates the importance of preserving and understanding 'old' data for continued use in the future.

  19. Rainfall disaggregation for urban hydrology: Effects of spatial consistence

    NASA Astrophysics Data System (ADS)

    Müller, Hannes; Haberlandt, Uwe

    2015-04-01

    For urban hydrology, rainfall time series with a high temporal resolution are crucial. Observed time series of this kind are very short in most cases, so they cannot be used. On the contrary, time series with lower temporal resolution (daily measurements) exist for much longer periods. The objective is to derive time series with a long duration and a high resolution by disaggregating time series of the non-recording stations with information from time series of the recording stations. The multiplicative random cascade model is a well-known disaggregation model for daily time series. For urban hydrology it is often assumed that a day consists of only 1280 minutes in total as a starting point for the disaggregation process. We introduce a new variant of the cascade model, which is functional without this assumption and also outperforms the existing approach regarding time series characteristics like wet and dry spell duration, average intensity, fraction of dry intervals and extreme value representation. However, in both approaches rainfall time series of different stations are disaggregated without consideration of surrounding stations. This yields unrealistic spatial patterns of rainfall. We apply a simulated annealing algorithm that has been used successfully for hourly values before. Relative diurnal cycles of the disaggregated time series are resampled to reproduce the spatial dependence of rainfall. To describe spatial dependence we use bivariate characteristics like probability of occurrence, continuity ratio and coefficient of correlation. The investigation area is a sewage system in Northern Germany. We show that the algorithm has the capability to improve spatial dependence. The influence of the chosen disaggregation routine and the spatial dependence on overflow occurrences and volumes of the sewage system will be analyzed.
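    The disaggregation mechanism is a multiplicative random cascade: a daily total is repeatedly split between the two halves of each interval using random weights. As a minimal illustration of that mechanism only (not of the authors' new variant or of estimated parameters), the numpy sketch below disaggregates a daily amount over a chosen number of branching levels with a simple probabilistic splitting rule; the probabilities and weight distribution are placeholders.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def cascade_disaggregate(total, levels=8, p_zero_split=0.3):
        """Split a coarse rainfall total into 2**levels finer intervals.
        At each branching, the amount either goes entirely to one half
        (with probability p_zero_split, preserving dry intervals) or is
        divided between the halves by a random weight."""
        amounts = np.array([total], dtype=float)
        for _ in range(levels):
            children = np.zeros(2 * len(amounts))
            for i, a in enumerate(amounts):
                if a == 0.0:
                    continue
                if rng.random() < p_zero_split:
                    children[2 * i + rng.integers(2)] = a
                else:
                    w = rng.uniform(0.2, 0.8)        # placeholder weight distribution
                    children[2 * i] = w * a
                    children[2 * i + 1] = (1 - w) * a
            amounts = children
        return amounts

    fine_series = cascade_disaggregate(24.0, levels=8)   # 1280 min / 2**8 = 5-min steps
    assert np.isclose(fine_series.sum(), 24.0)           # mass is conserved exactly
    ```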

  20. A fast quadrature-based numerical method for the continuous spectrum biphasic poroviscoelastic model of articular cartilage.

    PubMed

    Stuebner, Michael; Haider, Mansoor A

    2010-06-18

    A new and efficient method for numerical solution of the continuous spectrum biphasic poroviscoelastic (BPVE) model of articular cartilage is presented. Development of the method is based on a composite Gauss-Legendre quadrature approximation of the continuous spectrum relaxation function that leads to an exponential series representation. The separability property of the exponential terms in the series is exploited to develop a numerical scheme that can be reduced to an update rule requiring retention of the strain history at only the previous time step. The cost of the resulting temporal discretization scheme is O(N) for N time steps. Application and calibration of the method is illustrated in the context of a finite difference solution of the one-dimensional confined compression BPVE stress-relaxation problem. Accuracy of the numerical method is demonstrated by comparison to a theoretical Laplace transform solution for a range of viscoelastic relaxation times that are representative of articular cartilage. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
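    The efficiency gain comes from writing the relaxation function as a sum of exponentials, whose convolution with the strain history can then be updated recursively so that only the previous time step needs to be retained. The numpy sketch below illustrates that generic O(N) recursive-convolution update for a hereditary integral with an exponential-series kernel; the weights and relaxation times are placeholders, not the calibrated cartilage spectrum, and the finite difference confined-compression solver is not included.

    ```python
    import numpy as np

    def hereditary_stress(strain, dt, g, tau, g_inf=1.0):
        """O(N) evaluation of sigma(t) = g_inf*eps(t) + sum_k h_k(t), where
        h_k(t) = int_0^t g_k exp(-(t-s)/tau_k) deps/ds ds, updated recursively."""
        g = np.asarray(g, dtype=float)
        tau = np.asarray(tau, dtype=float)
        h = np.zeros_like(g)                      # one internal variable per exponential term
        decay = np.exp(-dt / tau)
        stress = np.empty_like(strain)
        stress[0] = g_inf * strain[0]
        for n in range(1, len(strain)):
            deps = strain[n] - strain[n - 1]
            # strain rate assumed constant over the step (standard recursive convolution)
            h = decay * h + g * tau / dt * (1.0 - decay) * deps
            stress[n] = g_inf * strain[n] + h.sum()
        return stress

    # Placeholder spectrum: weights and relaxation times such as a quadrature rule might give
    g_k = np.array([0.3, 0.2, 0.1])
    tau_k = np.array([0.5, 5.0, 50.0])
    t = np.linspace(0.0, 100.0, 2001)
    strain = np.minimum(t / 10.0, 1.0) * 0.1      # ramp-and-hold stress-relaxation test
    sigma = hereditary_stress(strain, t[1] - t[0], g_k, tau_k)
    ```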

  1. Insights into shallow magmatic processes at Kīlauea Volcano, Hawaíi, from a multiyear continuous gravity time series

    NASA Astrophysics Data System (ADS)

    Poland, Michael P.; Carbone, Daniele

    2016-07-01

    Continuous gravity data collected near the summit eruptive vent at Kīlauea Volcano, Hawaíi, during 2011-2015 show a strong correlation with summit-area surface deformation and the level of the lava lake within the vent over periods of days to weeks, suggesting that changes in gravity reflect variations in volcanic activity. Joint analysis of gravity and lava level time series data indicates that over the entire time period studied, the average density of the lava within the upper tens to hundreds of meters of the summit eruptive vent remained low—approximately 1000-1500 kg/m3. The ratio of gravity change (adjusted for Earth tides and instrumental drift) to lava level change measured over 15 day windows rose gradually over the course of 2011-2015, probably reflecting either (1) a small increase in the density of lava within the eruptive vent or (2) an increase in the volume of lava within the vent due to gradual vent enlargement. Superimposed on the overall time series were transient spikes of mass change associated with inflation and deflation of Kīlauea's summit and coincident changes in lava level. The unexpectedly strong mass variations during these episodes suggest magma flux to and from the shallow magmatic system without commensurate deformation, perhaps indicating magma accumulation within, and withdrawal from, void space—a process that might not otherwise be apparent from lava level and deformation data alone. Continuous gravity data thus provide unique insights into magmatic processes, arguing for continued application of the method at other frequently active volcanoes.

  2. Near real-time monitoring of volcanic surface deformation from GPS measurements at Long Valley Caldera, California

    USGS Publications Warehouse

    Ji, Kang Hyeun; Herring, Thomas A.; Llenos, Andrea L.

    2013-01-01

    Long Valley Caldera in eastern California is an active volcanic area and has shown continued unrest in the last three decades. We have monitored surface deformation from Global Positioning System (GPS) data by using a projection method that we call Targeted Projection Operator (TPO). TPO projects residual time series with secular rates and periodic terms removed onto a predefined spatial pattern. We used the 2009–2010 slow deflation as a target spatial pattern. The resulting TPO time series shows a detailed deformation history including the 2007–2009 inflation, the 2009–2010 deflation, and a recent inflation that started in late-2011 and is continuing at the present time (November 2012). The recent inflation event is about four times faster than the previous 2007–2009 event. A Mogi source of the recent event is located beneath the resurgent dome at about 6.6 km depth at a rate of 0.009 km3/yr volume change. TPO is simple and fast and can provide a near real-time continuous monitoring tool without directly looking at all the data from many GPS sites in this potentially eruptive volcanic system.
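    As described, TPO projects each epoch's residual displacement field onto a predefined spatial pattern, yielding a single scalar time series that tracks how strongly that pattern is expressed. The numpy sketch below shows that projection for an assumed residual matrix (epochs by station components) and target pattern; detrending, removal of periodic terms, data weighting, and the Mogi-source modeling are outside its scope.

    ```python
    import numpy as np

    def tpo_time_series(residuals, pattern):
        """Project residual position time series onto a target spatial pattern.

        residuals : array of shape (n_epochs, n_components), with secular rates
                    and periodic terms already removed
        pattern   : array of shape (n_components,), e.g. displacements of a
                    reference deflation episode at the same stations/components
        """
        pattern = np.asarray(pattern, dtype=float)
        # least-squares amplitude of the pattern at each epoch
        return residuals @ pattern / (pattern @ pattern)
    ```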

  3. Saxon Math. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2010

    2010-01-01

    "Saxon Math" is a textbook series covering grades K-12 based on incremental development and continual review of mathematical concepts to give students time to learn and practice concepts throughout the year. The series is aligned with standards of the National Council of Teachers of Mathematics (NCTM) and various states, and can be…

  4. Advances in Library Administration and Organization Volume 18.

    ERIC Educational Resources Information Center

    Garten, Edward D., Ed.; Williams, Delmus E., Ed.

    Long regarded as the premier monographic series in its area of coverage, "Advances in Library Administration and Organization" offers research perspectives that are both timely and lively. This 18th volume continues the series' long practice of bringing to its professional and academic readership an eclectic mix of scholarship and longish essays.…

  5. Modeling climate change impacts on combined sewer overflow using synthetic precipitation time series.

    PubMed

    Bendel, David; Beck, Ferdinand; Dittmer, Ulrich

    2013-01-01

    In the presented study climate change impacts on combined sewer overflows (CSOs) in Baden-Wuerttemberg, Southern Germany, were assessed based on continuous long-term rainfall-runoff simulations. As input data, synthetic rainfall time series were used. The applied precipitation generator NiedSim-Klima accounts for climate change effects on precipitation patterns. Time series for the past (1961-1990) and future (2041-2050) were generated for various locations. Comparing the simulated CSO activity of both periods we observe significantly higher overflow frequencies for the future. Changes in overflow volume and overflow duration depend on the type of overflow structure. Both values will increase at simple CSO structures that merely divide the flow, whereas they will decrease when the CSO structure is combined with a storage tank. However, there is a wide variation between the results of different precipitation time series (representative for different locations).

  6. An Illustration of Generalised Arma (garma) Time Series Modeling of Forest Area in Malaysia

    NASA Astrophysics Data System (ADS)

    Pillai, Thulasyammal Ramiah; Shitan, Mahendran

    Forestry is the art and science of managing forests, tree plantations, and related natural resources. The main goal of forestry is to create and implement systems that allow forests to continue a sustainable provision of environmental supplies and services. Forest area is land under natural or planted stands of trees, whether productive or not. Forest area of Malaysia has been observed over the years and it can be modeled using time series models. A new class of GARMA models have been introduced in the time series literature to reveal some hidden features in time series data. For these models to be used widely in practice, we illustrate the fitting of GARMA (1, 1; 1, δ) model to the Annual Forest Area data of Malaysia which has been observed from 1987 to 2008. The estimation of the model was done using Hannan-Rissanen Algorithm, Whittle's Estimation and Maximum Likelihood Estimation.

  7. Detecting dynamical changes in time series by using the Jensen Shannon divergence

    NASA Astrophysics Data System (ADS)

    Mateos, D. M.; Riveaud, L. E.; Lamberti, P. W.

    2017-08-01

    Most of the time series in nature are a mixture of signals with deterministic and random dynamics. Thus the distinction between these two characteristics becomes important. Distinguishing between chaotic and aleatory signals is difficult because they have a common wide-band power spectrum, a delta-like autocorrelation function, and share other features as well. In general, signals are presented as continuous records and must be discretized before being analyzed. In this work, we introduce different schemes for discretizing and for detecting dynamical changes in time series. One of the main motivations is to detect transitions between the chaotic and random regimes. The tools used here originate from information theory. The schemes proposed are applied to simulated and real life signals, showing in all cases a high proficiency for detecting changes in the dynamics of the associated time series.
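    The core operation, comparing symbol distributions from discretized signal windows with the Jensen-Shannon divergence, is compact enough to sketch. The numpy code below discretizes a series into quantile bins and computes the divergence between histograms of adjacent sliding windows; the specific discretization schemes proposed in the paper are not reproduced, and window and bin counts are assumptions.

    ```python
    import numpy as np

    def discretize(x, n_bins=8):
        """Map a continuous series onto symbols using quantile bins."""
        edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
        return np.digitize(x, edges)

    def jensen_shannon(p, q, eps=1e-12):
        """Jensen-Shannon divergence between two discrete distributions (natural log)."""
        p = p / p.sum()
        q = q / q.sum()
        m = 0.5 * (p + q)
        kl = lambda a, b: np.sum(a * np.log((a + eps) / (b + eps)))
        return 0.5 * kl(p, m) + 0.5 * kl(q, m)

    def jsd_profile(x, window=500, n_bins=8):
        """JSD between symbol histograms of adjacent windows, slid along the series."""
        symbols = discretize(x, n_bins)
        out = []
        for start in range(0, len(symbols) - 2 * window, window // 4):
            a = np.bincount(symbols[start:start + window], minlength=n_bins)
            b = np.bincount(symbols[start + window:start + 2 * window], minlength=n_bins)
            out.append(jensen_shannon(a.astype(float), b.astype(float)))
        return np.array(out)   # peaks suggest a change in the underlying dynamics
    ```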

  8. A theoretically consistent stochastic cascade for temporal disaggregation of intermittent rainfall

    NASA Astrophysics Data System (ADS)

    Lombardo, F.; Volpi, E.; Koutsoyiannis, D.; Serinaldi, F.

    2017-06-01

    Generating fine-scale time series of intermittent rainfall that are fully consistent with any given coarse-scale totals is a key and open issue in many hydrological problems. We propose a stationary disaggregation method that simulates rainfall time series with given dependence structure, wet/dry probability, and marginal distribution at a target finer (lower-level) time scale, preserving full consistency with variables at a parent coarser (higher-level) time scale. We account for the intermittent character of rainfall at fine time scales by merging a discrete stochastic representation of intermittency and a continuous one of rainfall depths. This approach yields a unique and parsimonious mathematical framework providing general analytical formulations of mean, variance, and autocorrelation function (ACF) for a mixed-type stochastic process in terms of mean, variance, and ACFs of both continuous and discrete components, respectively. To achieve the full consistency between variables at finer and coarser time scales in terms of marginal distribution and coarse-scale totals, the generated lower-level series are adjusted according to a procedure that does not affect the stochastic structure implied by the original model. To assess model performance, we study rainfall process as intermittent with both independent and dependent occurrences, where dependence is quantified by the probability that two consecutive time intervals are dry. In either case, we provide analytical formulations of main statistics of our mixed-type disaggregation model and show their clear accordance with Monte Carlo simulations. An application to rainfall time series from real world is shown as a proof of concept.
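    For orientation only, the basic moment relations for a mixed-type process of the kind described, a continuous rainfall-depth component multiplied by a discrete occurrence indicator, follow from elementary probability; the paper's full formulations, which also involve the autocorrelation functions of both components and the consistency adjustment, are not reproduced here. Writing X_t = B_t Z_t with B_t ~ Bernoulli(p) independent of the depth Z_t (mean \mu_Z, variance \sigma_Z^2):

    \mathrm{E}[X_t] = p\,\mu_Z, \qquad \mathrm{Var}[X_t] = p\,\sigma_Z^{2} + p(1-p)\,\mu_Z^{2}.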

  9. 76 FR 30919 - Marine Mammals; File No. 15844

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-27

    ... minke whales, (2) continue one of the longest and most complete time- series data set on humpback whale... ecotypes could be harassed up to 50 times each, to acquire 30 successful biopsy samples, per year. See the...

  10. Electromagnetic pulse propagation in dispersive planar dielectrics.

    PubMed

    Moten, K; Durney, C H; Stockham, T G

    1989-01-01

    The responses of a plane-wave pulse train irradiating a lossy dispersive dielectric half-space are investigated. The incident pulse train is expressed as a Fourier series with summing done by the inverse fast Fourier transform. The Fourier series technique is adopted to avoid the many difficulties often encountered in finding the inverse Fourier transform when transform analyses are used. Calculations are made for propagation in pure water, and typical waveforms inside the dielectric half-space are presented. Higher harmonics are strongly attenuated, resulting in a single continuous sinusoidal waveform at the frequency of the fundamental deep in the material. The time-averaged specific absorption rate (SAR) for pulse-train propagation is shown to be the sum of the time-averaged SARs of the individual harmonic components of the pulse train. For the same average power, calculated SARs reveal that pulse trains generally penetrate deeper than carrier-frequency continuous waves but not deeper than continuous waves at frequencies approaching the fundamental of the pulse train. The effects of rise time on the propagating pulse train in the dielectrics are shown and explained. Since most practical pulsed systems are very limited in bandwidth, no pronounced differences between their response and continuous wave (CW) response would be expected. Typical results for pulse-train propagation in arrays of dispersive planar dielectric slabs are presented. Expressing the pulse train as a Fourier series provides a practical way of interpreting the dispersion characteristics from the spectral point of view.

  11. Ramifications of a potential gap in passive microwave data for the long-term sea ice climate record

    NASA Astrophysics Data System (ADS)

    Meier, W.; Stewart, J. S.

    2017-12-01

    The time series of sea ice concentration and extent from passive microwave sensors is one of the longest satellite-derived climate records and the significant decline in Arctic sea ice extent is one of the most iconic indicators of climate change. However, this continuous and consistent record is under threat due to the looming gap in passive microwave sensor coverage. The record started in late 1978 with the launch of the Scanning Multichannel Microwave Radiometer (SMMR) and has continued with a series of Special Sensor Microwave Imager (SSMI) and Special Sensor Microwave Imager and Sounder (SSMIS) instruments on U.S. Defense Meteorological Satellite Program (DMSP) satellites. The data from the different sensors are intercalibrated at the algorithm level by adjusting algorithm coefficients so that the output sea ice data is as consistent as possible between the older and the newer sensor. A key aspect in constructing the time series is to have at least two sensors operating simultaneously so that data from the older and newer sensor can be obtained from the same locations. However, with recent losses of the DMSP F19 and F20, the remaining SSMIS sensors are all well beyond their planned mission lifetime. This means that the risk of failure is not small and is increasing with each day of operation. The newest passive microwave sensor, the JAXA Advanced Microwave Scanning Radiometer-2 (AMSR2), is a potential contributor to the time series (though it too is now beyond its planned 5-year mission lifetime). However, AMSR2's larger antenna and higher spatial resolution present a challenge in integrating its data with the rest of the sea ice record because the ice edge is quite sensitive to the sensor resolution, which substantially affects the total sea ice extent and area estimates. This will need to be adjusted for if AMSR2 is used to continue the time series. Here we will discuss efforts at NSIDC to integrate AMSR2 estimates into the sea ice climate record if needed. We will also discuss potential contingency plans, such as using operational sea ice charts, to fill any gaps. This would allow the record to continue, but the consistency of the time series will be degraded because the ice charts use human analysis and differing sources, amounts and quality of input data, which makes them sub-optimal for long-term climate records.

  12. Long-term monitoring of river basins: strengths and weaknesses, opportunities and threats

    NASA Astrophysics Data System (ADS)

    Howden, N. J. K.; Burt, T. P.

    2016-12-01

    In a world where equilibrium is more and more uncommon, monitoring is an essential way to discover whether undesirable change is taking place. Monitoring requires a deliberate plan of action: the regular collection and processing of information. Long-term data reveal important patterns, allowing trends, cycles, and rare events to be identified. This is particularly important for complex systems where signals may be subtle and slow to emerge. Moreover, very long data sets are essential to test hypotheses undreamt of at the time the monitoring was started. This overview includes long time series from UK river basins showing how hydrology and water quality have changed over time - and continue to change. An important conclusion is the long time frame of system recovery, well beyond the normal lifetime of individual governments or research grants. At a time of increasing hydroclimatic variability, long time series remain crucially important; in particular, continuity of observations is vital at key benchmark sites.

  13. "Batch" kinetics in flow: online IR analysis and continuous control.

    PubMed

    Moore, Jason S; Jensen, Klavs F

    2014-01-07

    Currently, kinetic data is either collected under steady-state conditions in flow or by generating time-series data in batch. Batch experiments are generally considered to be more suitable for the generation of kinetic data because of the ability to collect data from many time points in a single experiment. Now, a method that rapidly generates time-series reaction data from flow reactors by continuously manipulating the flow rate and reaction temperature has been developed. This approach makes use of inline IR analysis and an automated microreactor system, which allowed for rapid and tight control of the operating conditions. The conversion/residence time profiles at several temperatures were used to fit parameters to a kinetic model. This method requires significantly less time and a smaller amount of starting material compared to one-at-a-time flow experiments, and thus allows for the rapid generation of kinetic data. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
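    The method converts a continuously ramped flow rate into a residence-time axis, so each recorded conversion corresponds to a different effective reaction time, and temperature steps supply the data for an Arrhenius fit. The snippet below is a schematic numpy/scipy sketch of that parameter-fitting step for an assumed first-order reaction; the reactor volume, flow program, rate-law form, and all numbers are illustrative assumptions, not the authors' system.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    R = 8.314  # J/(mol K)

    def conversion(X, k0, Ea):
        """First-order conversion as a function of residence time and temperature."""
        tau, T = X                                   # residence time (s), temperature (K)
        k = k0 * np.exp(-Ea / (R * T))
        return 1.0 - np.exp(-k * tau)

    # Hypothetical data: a flow-rate ramp gives a sweep of residence times at each temperature
    reactor_volume = 0.12e-3                                     # L (assumed microreactor)
    flow_rate = np.tile(np.linspace(1.0e-3, 0.1e-3, 20), 3)      # L/min, ramped, 3 temperatures
    tau = reactor_volume / (flow_rate / 60.0)                    # seconds
    T = np.repeat([323.0, 343.0, 363.0], 20)                     # K
    rng = np.random.default_rng(0)
    X_obs = conversion((tau, T), 5.0e6, 60.0e3) + rng.normal(0, 0.01, tau.size)

    (k0_fit, Ea_fit), _ = curve_fit(conversion, (tau, T), X_obs, p0=(1e6, 55e3))
    print(f"fitted k0={k0_fit:.2e} 1/s, Ea={Ea_fit / 1000:.1f} kJ/mol")
    ```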

  14. Time-Series Photographs of the Sea Floor in Western Massachusetts Bay: June 1998 to May 1999

    USGS Publications Warehouse

    Butman, Bradford; Alexander, P. Soupy; Bothner, Michael H.

    2004-01-01

    This report presents time-series photographs of the sea floor obtained from an instrumented tripod deployed at Site A in western Massachusetts Bay (42° 22.6' N., 70° 47.0' W., 30 m water depth, figure 1) from June 1998 through May 1999. Site A is approximately 1 km south of an ocean outfall that began discharging treated sewage effluent from the Boston metropolitan area into Massachusetts Bay in September 2000. Time-series photographs and oceanographic observations were initiated at Site A in December 1989 and are anticipated to continue to September 2005. This is one of a series of reports that present these images in digital form. The objective of these reports is to enable easy and rapid viewing of the photographs and to provide a medium-resolution digital archive. The images, obtained every 4 hours, are presented as a movie (in .avi format, which may be viewed using an image viewer such as QuickTime or Windows Media Player) and as individual images (.tif format). The images provide time-series observations of changes of the sea floor and near-bottom water properties.

  15. Human activities and climate variability drive fast-paced change across the world's estuarine-coastal ecosystems

    USGS Publications Warehouse

    Cloern, James E.; Abreu, Paulo C.; Carstensen, Jacob; Chauvaud, Laurent; Elmgren, Ragnar; Grall, Jacques; Greening, Holly; Johansson, John O.R.; Kahru, Mati; Sherwood, Edward T.; Xu, Jie; Yin, Kedong

    2016-01-01

    Time series of environmental measurements are essential for detecting, measuring and understanding changes in the Earth system and its biological communities. Observational series have accumulated over the past 2–5 decades from measurements across the world's estuaries, bays, lagoons, inland seas and shelf waters influenced by runoff. We synthesize information contained in these time series to develop a global view of changes occurring in marine systems influenced by connectivity to land. Our review is organized around four themes: (i) human activities as drivers of change; (ii) variability of the climate system as a driver of change; (iii) successes, disappointments and challenges of managing change at the sea-land interface; and (iv) discoveries made from observations over time. Multidecadal time series reveal that many of the world's estuarine–coastal ecosystems are in a continuing state of change, and the pace of change is faster than we could have imagined a decade ago. Some have been transformed into novel ecosystems with habitats, biogeochemistry and biological communities outside the natural range of variability. Change takes many forms including linear and nonlinear trends, abrupt state changes and oscillations. The challenge of managing change is daunting in the coastal zone where diverse human pressures are concentrated and intersect with different responses to climate variability over land and over ocean basins. The pace of change in estuarine–coastal ecosystems will likely accelerate as the human population and economies continue to grow and as global climate change accelerates. Wise stewardship of the resources upon which we depend is critically dependent upon a continuing flow of information from observations to measure, understand and anticipate future changes along the world's coastlines.

  16. Modeling commodity salam contract between two parties for discrete and continuous time series

    NASA Astrophysics Data System (ADS)

    Hisham, Azie Farhani Badrol; Jaffar, Maheran Mohd

    2017-08-01

    For Islamic finance to remain as competitive as conventional finance, new syariah-compliant products, such as Islamic derivatives, need to be developed to manage risk. However, under syariah principles and regulations, financial instruments must not conflict with five syariah elements, namely riba (interest paid), rishwah (corruption), gharar (uncertainty or unnecessary risk), maysir (speculation or gambling) and jahl (taking advantage of the counterparty's ignorance). This study proposes that a traditional Islamic contract, namely salam, can be built into an Islamic derivative product. Although many studies have discussed and proposed the implementation of the salam contract as an Islamic product, they focus mainly on qualitative and legal issues. Since quantitative studies of the salam contract are lacking, this study introduces mathematical models that can value the appropriate salam price for a commodity salam contract between two parties. In modeling the commodity salam contract, this study modifies the existing conventional derivative model with some adjustments to comply with syariah rules and regulations. The cost of carry model has been chosen as the foundation for developing the commodity salam model between two parties for discrete and continuous time series. However, the conventional time value of money results from the concept of interest, which is prohibited in Islam. Therefore, this study adopts the idea of the Islamic time value of money, known as positive time preference, in modeling the commodity salam contract between two parties for discrete and continuous time series.
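    For reference only, the conventional cost-of-carry relation that the study takes as its foundation is, for spot price S_0, carrying rate r, and delivery time T:

    F_{\text{discrete}} = S_0 (1 + r)^{T}, \qquad F_{\text{continuous}} = S_0\, e^{rT},

    with the study substituting a positive-time-preference (syariah-compliant) rate for the interest-based r; the exact substitution is specified in the paper, not here.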

  17. 49 CFR 563.5 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... dynamic time-series data during the time period just prior to a crash event (e.g., vehicle speed vs. time... EDR data in a temporary, volatile storage medium where it is continuously updated at regular time..., along the lateral axis, starting from crash time zero and ending at 0.25 seconds, recorded every 0.01...

  18. Investigation of aquifer-estuary interaction using wavelet analysis of fiber-optic temperature data

    USGS Publications Warehouse

    Henderson, R.D.; Day-Lewis, Frederick D.; Harvey, Charles F.

    2009-01-01

    Fiber-optic distributed temperature sensing (FODTS) provides sub-minute temporal and meter-scale spatial resolution over kilometer-long cables. Compared to conventional thermistor or thermocouple-based technologies, which measure temperature at discrete (and commonly sparse) locations, FODTS offers nearly continuous spatial coverage, thus providing hydrologic information at spatiotemporal scales previously impossible. Large and information-rich FODTS datasets, however, pose challenges for data exploration and analysis. To date, FODTS analyses have focused on time-series variance as the means to discriminate between hydrologic phenomena. Here, we demonstrate the continuous wavelet transform (CWT) and cross-wavelet transform (XWT) to analyze FODTS in the context of related hydrologic time series. We apply the CWT and XWT to data from Waquoit Bay, Massachusetts to identify the location and timing of tidal pumping of submarine groundwater.
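    A continuous wavelet transform of a temperature time series, of the kind used here to localize tidal-frequency content in time, can be computed with standard tools. The snippet below is a generic PyWavelets example on a synthetic tidally modulated series; it is not the authors' processing chain, and the wavelet choice, scale range, and synthetic signal are assumptions.

    ```python
    import numpy as np
    import pywt

    # Synthetic FODTS-like temperature record: semidiurnal tidal signal plus noise
    dt = 60.0                                    # sampling interval in seconds
    t = np.arange(0, 7 * 86400, dt)              # one week of data
    f_m2 = 1.0 / (12.42 * 3600)                  # M2 semidiurnal tidal frequency, Hz
    temp = 0.5 * np.sin(2 * np.pi * f_m2 * t) * (t > 3 * 86400) \
           + 0.05 * np.random.default_rng(0).standard_normal(t.size)

    # Continuous wavelet transform with a Morlet wavelet
    scales = np.geomspace(10, 2000, 100)
    coeffs, freqs = pywt.cwt(temp, scales, "morl", sampling_period=dt)
    power = np.abs(coeffs) ** 2                  # (scale, time) wavelet power
    # High power near freqs ~ 2.2e-5 Hz appearing after day 3 would mark the
    # onset of semidiurnal (tidal-pumping) variability in the record.
    ```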

  19. Statistical Analysis of Categorical Time Series of Atmospheric Elementary Circulation Mechanisms - Dzerdzeevski Classification for the Northern Hemisphere

    PubMed Central

    Brenčič, Mihael

    2016-01-01

    Northern hemisphere elementary circulation mechanisms, defined with the Dzerdzeevski classification and published on a daily basis from 1899–2012, are analysed with statistical methods as continuous categorical time series. The classification consists of 41 elementary circulation mechanisms (ECM), which are assigned to calendar days. Empirical marginal probabilities of each ECM were determined. Seasonality and the periodicity effect were investigated with moving dispersion filters and a randomisation procedure on the ECM categories, as well as with time analyses of the ECM mode. The time series were determined to be non-stationary with strong time-dependent trends. During the investigated period, periodicity interchanges with periods when no seasonality is present. In the time series structure, the strongest division is visible at the milestone of 1986, showing that the atmospheric circulation pattern reflected in the ECM has significantly changed. This change is a result of a change in the frequency of ECM categories; before 1986, the appearance of ECM was more diverse, and afterwards fewer ECMs appear. The statistical approach applied to the categorical climatic time series opens up potential new insights for future studies of climate variability and change. PMID:27116375

  20. Statistical Analysis of Categorical Time Series of Atmospheric Elementary Circulation Mechanisms - Dzerdzeevski Classification for the Northern Hemisphere.

    PubMed

    Brenčič, Mihael

    2016-01-01

    Northern hemisphere elementary circulation mechanisms, defined with the Dzerdzeevski classification and published on a daily basis from 1899-2012, are analysed with statistical methods as continuous categorical time series. The classification consists of 41 elementary circulation mechanisms (ECM), which are assigned to calendar days. Empirical marginal probabilities of each ECM were determined. Seasonality and the periodicity effect were investigated with moving dispersion filters and a randomisation procedure on the ECM categories, as well as with time analyses of the ECM mode. The time series were determined to be non-stationary with strong time-dependent trends. During the investigated period, periodicity interchanges with periods when no seasonality is present. In the time series structure, the strongest division is visible at the milestone of 1986, showing that the atmospheric circulation pattern reflected in the ECM has significantly changed. This change is a result of a change in the frequency of ECM categories; before 1986, the appearance of ECM was more diverse, and afterwards fewer ECMs appear. The statistical approach applied to the categorical climatic time series opens up potential new insights for future studies of climate variability and change.

  1. Extracting the regional common-mode component of GPS station position time series from dense continuous network

    NASA Astrophysics Data System (ADS)

    Tian, Yunfeng; Shen, Zheng-Kang

    2016-02-01

    We develop a spatial filtering method to remove random noise and extract the spatially correlated transients (i.e., the common-mode component (CMC)) that deviate from zero mean over the span of detrended position time series of a continuous Global Positioning System (CGPS) network. The technique utilizes a weighting scheme that incorporates two factors: the distances between neighboring sites and the correlations of their long-term residual position time series. We use a grid search algorithm to find the optimal thresholds for deriving the CMC that minimizes the root-mean-square (RMS) of the filtered residual position time series. Compared to the principal component analysis technique, our method achieves better (>13% on average) reduction of residual position scatter for the CGPS stations in western North America, eliminating regional transients of all spatial scales. It also has advantages in data manipulation: it requires less intervention and is applicable to a dense network of any spatial extent. Our method can also be used to detect CMC irrespective of its origin (i.e., tectonic or nontectonic), if such signals are of particular interest for further study. By varying the filtering distance range, the long-range CMC related to atmospheric disturbance can be filtered out, uncovering CMC associated with transient tectonic deformation. A correlation-based clustering algorithm is adopted to identify station clusters that share common regional transient characteristics.
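
    A minimal sketch of the kind of weighting described: averaging neighboring stations' detrended residuals with weights built from inter-station distance and correlation to estimate the common-mode value at a target station. The thresholds and the exact weight form below are assumptions, not the authors' published formulation.

```python
import numpy as np

def common_mode(residuals, distances, target, max_dist=500.0, min_corr=0.5):
    """Estimate the common-mode component at `target` as a weighted mean of
    the detrended residual series of neighboring stations.

    residuals : dict station -> 1-D array of daily residuals (equal lengths)
    distances : dict station -> distance (km) from the target station
    """
    target_res = residuals[target]
    weighted_sum = np.zeros_like(target_res, dtype=float)
    weight_total = 0.0
    for sta, res in residuals.items():
        if sta == target or distances[sta] > max_dist:
            continue
        corr = np.corrcoef(target_res, res)[0, 1]
        if corr < min_corr:
            continue
        w = corr / max(distances[sta], 1.0)   # assumed weight: correlation over distance
        weighted_sum += w * res
        weight_total += w
    return weighted_sum / weight_total if weight_total > 0 else np.zeros_like(target_res, dtype=float)
```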

  2. Modeling Non-Gaussian Time Series with Nonparametric Bayesian Model.

    PubMed

    Xu, Zhiguang; MacEachern, Steven; Xu, Xinyi

    2015-02-01

    We present a class of Bayesian copula models whose major components are the marginal (limiting) distribution of a stationary time series and the internal dynamics of the series. We argue that these are the two features with which an analyst is typically most familiar, and hence that these are natural components with which to work. For the marginal distribution, we use a nonparametric Bayesian prior distribution along with a cdf-inverse cdf transformation to obtain large support. For the internal dynamics, we rely on the traditionally successful techniques of normal-theory time series. Coupling the two components gives us a family of (Gaussian) copula transformed autoregressive models. The models provide coherent adjustments of time scales and are compatible with many extensions, including changes in volatility of the series. We describe basic properties of the models, show their ability to recover non-Gaussian marginal distributions, and use a GARCH modification of the basic model to analyze stock index return series. The models are found to provide better fit and improved short-range and long-range predictions than Gaussian competitors. The models are extensible to a large variety of fields, including continuous time models, spatial models, models for multiple series, models driven by external covariate streams, and non-stationary models.
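
    A toy version of the copula idea, assuming a plain empirical cdf for the marginal instead of the nonparametric Bayesian prior used in the paper: map the observations to normal scores through the cdf-inverse cdf transformation, fit an AR(1) to the scores, and map simulated scores back through the empirical quantile function.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.gamma(shape=2.0, scale=1.0, size=1000)        # skewed, non-Gaussian series

# cdf / inverse-cdf transform to normal scores (empirical marginal)
ranks = stats.rankdata(x) / (len(x) + 1.0)
z = stats.norm.ppf(ranks)

# fit AR(1) to the Gaussian scores by least squares
phi = np.dot(z[1:], z[:-1]) / np.dot(z[:-1], z[:-1])

# simulate new scores and map back through the empirical quantile function
z_sim = np.zeros(len(x))
eps = rng.normal(scale=np.sqrt(1 - phi**2), size=len(x))
for t in range(1, len(x)):
    z_sim[t] = phi * z_sim[t - 1] + eps[t]
x_sim = np.quantile(x, stats.norm.cdf(z_sim))
print(phi, x.mean(), x_sim.mean())
```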

  3. Coherent changes of multifractal properties of continuous acoustic emission at failure of heterogeneous materials

    NASA Astrophysics Data System (ADS)

    Panteleev, Ivan; Bayandin, Yuriy; Naimark, Oleg

    2017-12-01

    This work performs a correlation analysis of the statistical properties of continuous acoustic emission recorded in different parts of marble and fiberglass laminate samples under quasi-static deformation. A spectral coherence measure for time series, which generalizes the squared coherence spectrum to multidimensional series, was chosen. The spectral coherence measure was estimated in a sliding time window for two parameters of the acoustic emission multifractal singularity spectrum: the spectrum width and the generalized Hurst exponent realizing the maximum of the singularity spectrum. It is shown that the preparation of the macrofracture focus is accompanied by the synchronization (coherent behavior) of the statistical properties of acoustic emission in selected frequency intervals.
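
    The squared coherence spectrum between two parameter series (for example, singularity-spectrum width and the generalized Hurst exponent) can be estimated with SciPy's Welch-based routine; this is only a generic illustration, not the multidimensional generalization used in the paper, and the synthetic series are placeholders.

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(1)
n = 4096
common = rng.normal(size=n)
width_series = common + 0.5 * rng.normal(size=n)    # e.g. singularity-spectrum width
hurst_series = common + 0.5 * rng.normal(size=n)    # e.g. generalized Hurst exponent

# Welch-type estimate of the magnitude-squared coherence between the two series
f, Cxy = coherence(width_series, hurst_series, fs=1.0, nperseg=256)
print(f[:5], Cxy[:5])
```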

  4. Earth Observing System, Conclusions and Recommendations

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The following Earth Observing Systems (E.O.S.) recommendations were suggested: (1) a program must be initiated to ensure that present time series of Earth science data are maintained and continued. (2) A data system that provides easy, integrated, and complete access to past, present, and future data must be developed as soon as possible. (3) A long term research effort must be sustained to study and understand these time series of Earth observations. (4) The E.O.S. should be established as an information system to carry out those aspects of the above recommendations which go beyond existing and currently planned activities. (5) The scientific direction of the E.O.S. should be established and continued through an international scientific steering committee.

  5. 3-component time-dependent crustal deformation in Southern California from Sentinel-1 and GPS

    NASA Astrophysics Data System (ADS)

    Tymofyeyeva, E.; Fialko, Y. A.

    2017-12-01

    We combine data from the Sentinel-1 InSAR mission collected between 2014 and 2017 with continuous GPS measurements to calculate the three components of the interseismic surface velocity field in Southern California at the resolution of the InSAR data (~100 m). We use overlapping InSAR tracks with two different look geometries (descending tracks 71, 173, and 144, and ascending tracks 64 and 166) to obtain the three orthogonal components of surface motion. Because of the under-determined nature of the problem, we use the local azimuth of the horizontal velocity vector as an additional constraint. The spatially variable azimuths of the horizontal velocity are obtained by interpolating data from the continuous GPS network. We estimate both secular velocities and displacement time series. The latter are obtained by combining InSAR time series from different lines of sight with time-dependent azimuths computed using continuous GPS time series at every InSAR epoch. We use the CANDIS method [Tymofyeyeva and Fialko, 2015], a technique based on iterative common point stacking, to correct the InSAR data for tropospheric and ionospheric artifacts when calculating secular velocities and time series, and to isolate low-amplitude deformation signals in our study region. The obtained horizontal (East and North) components of secular velocity exhibit long-wavelength patterns consistent with strain accumulation on major faults of the Pacific-North America plate boundary. The vertical component of velocity reveals a number of localized uplift and subsidence anomalies, most likely related to hydrologic effects and anthropogenic activity. In particular, in the Los Angeles basin we observe localized uplift of about 10-15 mm/yr near Anaheim, Long Beach, and Redondo Beach, as well as areas of rapid subsidence near Irvine and Santa Monica, which are likely caused by the injection of water in the oil fields, and the pumping and recharge cycles of the aquifers in the basin.

  6. SURMODERR: A MATLAB toolbox for estimation of velocity uncertainties of a non-permanent GPS station

    NASA Astrophysics Data System (ADS)

    Teza, Giordano; Pesci, Arianna; Casula, Giuseppe

    2010-08-01

    SURMODERR is a MATLAB toolbox intended for the estimation of reliable velocity uncertainties of a non-permanent GPS station (NPS), i.e. a GPS receiver used in campaign-style measurements. The implemented method is based on the subsampling of daily coordinate time series of one or more continuous GPS stations located inside or close to the area where the NPSs are installed. The continuous time series are subsampled according to real or planned occupation tables and random errors occurring in antenna replacement on different surveys are taken into account. In order to overcome the uncertainty underestimation that typically characterizes short duration GPS time series, statistical analysis of the simulated data is performed to estimate the velocity uncertainties of this real NPS. The basic hypotheses required are: (i) the signal must be a long-term linear trend plus seasonal and colored noise for each coordinate; (ii) the standard data processing should have already been performed to provide daily data series; and (iii) if the method is applied to survey planning, the future behavior should not be significantly different from the past behavior. In order to show the strength of the approach, two case studies with real data are presented and discussed (Central Apennine and Panarea Island, Italy).
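
    A rough sketch of the subsampling idea, under assumed occupation dates and set-up error magnitudes: repeatedly sample a continuous daily coordinate series at campaign epochs, add a random antenna set-up error per campaign, fit a line to each subsample, and take the scatter of the fitted slopes as the velocity uncertainty.

```python
import numpy as np

rng = np.random.default_rng(2)

# synthetic daily CGPS coordinate series: trend + annual signal + noise (mm)
days = np.arange(0, 8 * 365)
daily = 2.5 / 365.0 * days + 1.5 * np.sin(2 * np.pi * days / 365.25) + rng.normal(0, 2, days.size)

# hypothetical campaign occupation table: 5 consecutive days of measurement each year
occupations = np.concatenate([np.arange(y * 365, y * 365 + 5) for y in range(8)])

setup_error_mm = 3.0          # assumed random antenna set-up error per campaign
slopes = []
for _ in range(1000):
    sample = daily[occupations].copy()
    for c in range(0, occupations.size, 5):        # one set-up error per campaign
        sample[c:c + 5] += rng.normal(0, setup_error_mm)
    slope, _ = np.polyfit(occupations / 365.25, sample, 1)
    slopes.append(slope)

print("velocity uncertainty (mm/yr):", np.std(slopes))
```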

  7. 27 CFR 24.260 - Serial numbers or filling date.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... filling date at the time of filling or when such containers or cases are prepared for removal. Serial numbers will commence with “1” and continue until the numeral “1,000,000” is reached, whereupon the series may recommence with the numeral “1.” However, the proprietor may initiate a new series after the...

  8. Study of Glycemic Variability Through Time Series Analyses (Detrended Fluctuation Analysis and Poincaré Plot) in Children and Adolescents with Type 1 Diabetes.

    PubMed

    García Maset, Leonor; González, Lidia Blasco; Furquet, Gonzalo Llop; Suay, Francisco Montes; Marco, Roberto Hernández

    2016-11-01

    Time series analysis provides information on blood glucose dynamics that is unattainable with conventional glycemic variability (GV) indices. To date, no studies have been published on these parameters in pediatric patients with type 1 diabetes. Our aim is to evaluate the relationship between time series analysis and conventional GV indices, and glycosylated hemoglobin (HbA1c) levels. This is a cross-sectional study of 41 children and adolescents with type 1 diabetes. Glucose monitoring was carried out continuously for 72 h to study the following GV indices: standard deviation (SD) of glucose levels (mg/dL), coefficient of variation (%), interquartile range (IQR; mg/dL), mean amplitude of the largest glycemic excursions (MAGE), and continuous overlapping net glycemic action (CONGA). The time series analysis was conducted by means of detrended fluctuation analysis (DFA) and Poincaré plot. Time series parameters (DFA alpha coefficient and elements of the ellipse of the Poincaré plot) correlated well with the more conventional GV indices. Patients were grouped according to the terciles of these indices, to the terciles of eccentricity (1: 12.56-16.98, 2: 16.99-21.91, 3: 21.92-41.03), and to the value of the DFA alpha coefficient (> or ≤1.5). No differences were observed in the HbA1c of patients grouped by GV index criteria; however, significant differences were found in patients grouped by alpha coefficient and eccentricity, not only in terms of HbA1c, but also in SD glucose, IQR, and CONGA index. The loss of complexity in glycemic homeostasis is accompanied by an increase in variability.
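
    A compact, generic DFA implementation (standard algorithm, not the authors' code) that returns the scaling exponent alpha from a series such as interstitial glucose; the scale range is an assumption.

```python
import numpy as np

def dfa_alpha(x, scales=(4, 8, 16, 32, 64)):
    """Detrended fluctuation analysis: slope of log F(n) versus log n."""
    y = np.cumsum(x - np.mean(x))                 # integrated (profile) series
    fluct = []
    for n in scales:
        n_seg = len(y) // n
        f2 = []
        for i in range(n_seg):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)          # local linear trend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))
    alpha, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
    return alpha

# e.g. alpha > 1.5 vs alpha <= 1.5 was used to group patients in this study
rng = np.random.default_rng(3)
print(dfa_alpha(np.cumsum(rng.normal(size=2000))))   # Brownian-like series, alpha ~ 1.5
```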

  9. Identification of spikes associated with local sources in continuous time series of atmospheric CO, CO2 and CH4

    NASA Astrophysics Data System (ADS)

    El Yazidi, Abdelhadi; Ramonet, Michel; Ciais, Philippe; Broquet, Gregoire; Pison, Isabelle; Abbaris, Amara; Brunner, Dominik; Conil, Sebastien; Delmotte, Marc; Gheusi, Francois; Guerin, Frederic; Hazan, Lynn; Kachroudi, Nesrine; Kouvarakis, Giorgos; Mihalopoulos, Nikolaos; Rivier, Leonard; Serça, Dominique

    2018-03-01

    This study deals with the problem of identifying atmospheric data influenced by local emissions that can result in spikes in time series of greenhouse gases and long-lived tracer measurements. We considered three spike detection methods known as coefficient of variation (COV), robust extraction of baseline signal (REBS) and standard deviation of the background (SD) to detect and filter positive spikes in continuous greenhouse gas time series from four monitoring stations representative of the European ICOS (Integrated Carbon Observation System) Research Infrastructure network. The results of the different methods are compared to each other and against a manual detection performed by station managers. Four stations were selected as test cases to apply the spike detection methods: a continental rural tower of 100 m height in eastern France (OPE), a high-mountain observatory in the south-west of France (PDM), a regional marine background site in Crete (FKL) and a marine clean-air background site in the Southern Hemisphere on Amsterdam Island (AMS). This selection allows us to address spike detection problems in time series with different variability. Two years of continuous measurements of CO2, CH4 and CO were analysed. All methods were found to be able to detect short-term spikes (lasting from a few seconds to a few minutes) in the time series. Analysis of the results of each method leads us to exclude the COV method due to the requirement to arbitrarily specify an a priori percentage of rejected data in the time series, which may over- or underestimate the actual number of spikes. The two other methods freely determine the number of spikes for a given set of parameters, and the values of these parameters were calibrated to provide the best match with spikes known to reflect local emissions episodes that are well documented by the station managers. More than 96 % of the spikes manually identified by station managers were successfully detected both in the SD and the REBS methods after the best adjustment of parameter values. At PDM, measurements made by two analyzers located 200 m from each other allow us to confirm that the CH4 spikes identified in one of the time series but not in the other correspond to a local source from a sewage treatment facility in one of the observatory buildings. From this experiment, we also found that the REBS method underestimates the number of positive anomalies in the CH4 data caused by local sewage emissions. As a conclusion, we recommend the use of the SD method, which also appears to be the easiest one to implement in automatic data processing, used for the operational filtering of spikes in greenhouse gases time series at global and regional monitoring stations of networks like that of the ICOS atmosphere network.
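
    The SD method is calibrated per station in the paper, so the sketch below only illustrates the general idea: flag points that exceed the mean of a moving background window by a multiple of that window's standard deviation. Window length and multiplier are placeholders.

```python
import numpy as np

def sd_spike_flags(conc, window=60, k=3.0):
    """Flag positive spikes: points more than k background standard deviations
    above the mean of the preceding `window` unflagged samples."""
    conc = np.asarray(conc, dtype=float)
    flags = np.zeros(conc.size, dtype=bool)
    for i in range(window, conc.size):
        background = conc[i - window:i][~flags[i - window:i]]
        if background.size < window // 2:
            continue
        if conc[i] > background.mean() + k * background.std():
            flags[i] = True
    return flags

# usage sketch: flags = sd_spike_flags(ch4_minute_means, window=60, k=3.0)
```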

  10. The CACAO Method for Smoothing, Gap Filling, and Characterizing Seasonal Anomalies in Satellite Time Series

    NASA Technical Reports Server (NTRS)

    Verger, Aleixandre; Baret, F.; Weiss, M.; Kandasamy, S.; Vermote, E.

    2013-01-01

    Consistent, continuous, and long time series of global biophysical variables derived from satellite data are required for global change research. A novel climatology fitting approach called CACAO (Consistent Adjustment of the Climatology to Actual Observations) is proposed to reduce noise and fill gaps in time series by scaling and shifting the seasonal climatological patterns to the actual observations. The shift and scale CACAO parameters adjusted for each season allow quantifying shifts in the timing of seasonal phenology and inter-annual variations in magnitude as compared to the average climatology. CACAO was assessed first over simulated daily Leaf Area Index (LAI) time series with varying fractions of missing data and noise. Then, performances were analyzed over actual satellite LAI products derived from AVHRR Long-Term Data Record for the 1981-2000 period over the BELMANIP2 globally representative sample of sites. Comparison with two widely used temporal filtering methods-the asymmetric Gaussian (AG) model and the Savitzky-Golay (SG) filter as implemented in TIMESAT-revealed that CACAO achieved better performances for smoothing AVHRR time series characterized by high level of noise and frequent missing observations. The resulting smoothed time series captures well the vegetation dynamics and shows no gaps as compared to the 50-60% of still missing data after AG or SG reconstructions. Results of simulation experiments as well as confrontation with actual AVHRR time series indicate that the proposed CACAO method is more robust to noise and missing data than AG and SG methods for phenology extraction.
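
    A minimal sketch of the climatology-fitting idea for a single season: search for the shift and scale that best match the seasonal climatology to the available (gappy) observations, then use the adjusted climatology to fill gaps. The plain grid search and parameter ranges are assumptions, not the CACAO implementation.

```python
import numpy as np

def fit_climatology(obs, clim, shifts=range(-30, 31), scales=np.linspace(0.5, 1.5, 21)):
    """Return (shift, scale) minimizing RMSE between the shifted/scaled climatology
    and the valid (non-NaN) observations of one season."""
    valid = ~np.isnan(obs)
    best = (0, 1.0, np.inf)
    for s in shifts:
        shifted = np.roll(clim, s)
        for a in scales:
            rmse = np.sqrt(np.mean((a * shifted[valid] - obs[valid]) ** 2))
            if rmse < best[2]:
                best = (s, a, rmse)
    return best[:2]

def cacao_like_fill(obs, clim):
    """Fill gaps in `obs` with the climatology adjusted by the fitted shift and scale."""
    s, a = fit_climatology(obs, clim)
    adjusted = a * np.roll(clim, s)
    return np.where(np.isnan(obs), adjusted, obs), (s, a)
```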

  11. Further characterization of the time transfer capabilities of precise point positioning (PPP): the Sliding Batch Procedure.

    PubMed

    Guyennon, Nicolas; Cerretto, Giancarlo; Tavella, Patrizia; Lahaye, François

    2009-08-01

    In recent years, many national timing laboratories have installed geodetic Global Positioning System receivers together with their traditional GPS/GLONASS Common View receivers and Two Way Satellite Time and Frequency Transfer equipment. Many of these geodetic receivers operate continuously within the International GNSS Service (IGS), and their data are regularly processed by IGS Analysis Centers. From its global network of over 350 stations and its Analysis Centers, the IGS generates precise combined GPS ephemerides and station and satellite clock time series referred to the IGS Time Scale. A processing method called Precise Point Positioning (PPP) is in use in the geodetic community, allowing precise recovery of GPS antenna position, clock phase, and atmospheric delays by taking advantage of these IGS precise products. Previous assessments, carried out at Istituto Nazionale di Ricerca Metrologica (INRiM; formerly IEN) with a PPP implementation developed at Natural Resources Canada (NRCan), showed that PPP clock solutions have better stability over the short/medium term than the GPS CV and GPS P3 methods and significantly reduce the day-boundary discontinuities when used in multi-day continuous processing, allowing time-limited, campaign-style time-transfer experiments. This paper reports on follow-on work performed at INRiM and NRCan to further characterize and develop the PPP method for time transfer applications, using data from some of the National Metrology Institutes. We develop a processing procedure that takes advantage of the improved stability of the phase-connected multi-day PPP solutions while allowing the generation of continuous clock time series, more applicable to continuous operation/monitoring of timing equipment.

  12. Identification of market trends with string and D2-brane maps

    NASA Astrophysics Data System (ADS)

    Bartoš, Erik; Pinčák, Richard

    2017-08-01

    The multidimensional string objects are introduced as a new alternative for the application of string models to time series forecasting in trading on financial markets. The objects are represented by an open string with two endpoints and a D2-brane, which are continuous enhancements of the 1-endpoint open string model. We show how the properties of the new objects can change the statistics of the predictors, which makes them candidates for modeling a wide range of time series systems. String angular momentum is proposed as a further tool, besides the historical volatility, for analyzing the stability of currency rates. To show the reliability of our approach to the application of string models for time series forecasting, we present the results of real demo simulations for four currency exchange pairs.

  13. Programmable Logic Application Notes

    NASA Technical Reports Server (NTRS)

    Katz, Richard

    2000-01-01

    This column will be provided each quarter as a source for reliability, radiation results, NASA capabilities, and other information on programmable logic devices and related applications. This quarter will continue a series of notes concentrating on analysis techniques with this issue's section discussing: Digital Timing Analysis Tools and Techniques. Articles in this issue include: SX and SX-A Series Devices Power Sequencing; JTAG and SX/SX-A/SX-S Series Devices; Analysis Techniques (i.e., notes on digital timing analysis tools and techniques); Status of the Radiation Hard Reconfigurable Field Programmable Gate Array Program; Input Transition Times; Apollo Guidance Computer Logic Study; RT54SX32S Prototype Data Sets; A54SX32A - 0.22 micron/UMC Test Results; Ramtron FM1608 FRAM; and Analysis of VHDL Code and Synthesizer Output.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saaban, Azizan; Zainudin, Lutfi; Bakar, Mohd Nazari Abu

    This paper intends to reveal the ability of the linear interpolation method to predict missing values in solar radiation time series. A reliable dataset depends on a complete observed time series. The absence or presence of radiation data alters the long-term variation of solar radiation measurement values, and such gaps increase the chance of biased outputs in modelling and in the validation process. The completeness of the observed dataset is therefore important for data analysis. Gaps and unreliable records in solar radiation time series are widespread and remain a major problem, yet only a limited number of studies have focused on estimating missing values in solar radiation datasets.
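
    A minimal example of the gap-filling approach discussed, linear interpolation between the nearest valid neighbours of each missing value, using placeholder daily radiation values.

```python
import numpy as np

time = np.arange(10)                                   # day index
radiation = np.array([5.1, 4.8, np.nan, np.nan, 6.0,
                      5.7, np.nan, 5.2, 4.9, 5.0])     # kWh/m^2, with gaps

valid = ~np.isnan(radiation)
filled = radiation.copy()
# linear interpolation between the nearest valid neighbours of each gap
filled[~valid] = np.interp(time[~valid], time[valid], radiation[valid])
print(filled)
```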

  15. Testing the effectiveness of family therapeutic assessment: a case study using a time-series design.

    PubMed

    Smith, Justin D; Wolf, Nicole J; Handler, Leonard; Nash, Michael R

    2009-11-01

    We describe a family Therapeutic Assessment (TA) case study employing 2 assessors, 2 assessment rooms, and a video link. In the study, we employed a daily measures time-series design with a pretreatment baseline and follow-up period to examine the family TA treatment model. In addition to being an illustrative addition to a number of clinical reports suggesting the efficacy of family TA, this study is the first to apply a case-based time-series design to test whether family TA leads to clinical improvement and also illustrates when that improvement occurs. Results support the trajectory of change proposed by Finn (2007), the TA model's creator, who posits that benefits continue beyond the formal treatment itself.

  16. Insights into shallow magmatic processes at Kīlauea Volcano, Hawaiʻi, from a multiyear continuous gravity time series

    USGS Publications Warehouse

    Poland, Michael P.; Carbone, Daniele

    2016-01-01

    Continuous gravity data collected near the summit eruptive vent at Kīlauea Volcano, Hawaiʻi, during 2011–2015 show a strong correlation with summit-area surface deformation and the level of the lava lake within the vent over periods of days to weeks, suggesting that changes in gravity reflect variations in volcanic activity. Joint analysis of gravity and lava level time series data indicates that over the entire time period studied, the average density of the lava within the upper tens to hundreds of meters of the summit eruptive vent remained low—approximately 1000–1500 kg/m3. The ratio of gravity change (adjusted for Earth tides and instrumental drift) to lava level change measured over 15 day windows rose gradually over the course of 2011–2015, probably reflecting either (1) a small increase in the density of lava within the eruptive vent or (2) an increase in the volume of lava within the vent due to gradual vent enlargement. Superimposed on the overall time series were transient spikes of mass change associated with inflation and deflation of Kīlauea's summit and coincident changes in lava level. The unexpectedly strong mass variations during these episodes suggest magma flux to and from the shallow magmatic system without commensurate deformation, perhaps indicating magma accumulation within, and withdrawal from, void space—a process that might not otherwise be apparent from lava level and deformation data alone. Continuous gravity data thus provide unique insights into magmatic processes, arguing for continued application of the method at other frequently active volcanoes.

  17. 10 CFR 300.5 - Submission of an entity statement.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... report as a large emitter in all future years in order to ensure a consistent time series of reports... average annual emissions over a continuous period not to exceed four years of time ending in its chosen... necessary, update their entity statements. (2) From time to time, a reporting entity may choose to change...

  18. 10 CFR 300.5 - Submission of an entity statement.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... report as a large emitter in all future years in order to ensure a consistent time series of reports... average annual emissions over a continuous period not to exceed four years of time ending in its chosen... necessary, update their entity statements. (2) From time to time, a reporting entity may choose to change...

  19. 10 CFR 300.5 - Submission of an entity statement.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... report as a large emitter in all future years in order to ensure a consistent time series of reports... average annual emissions over a continuous period not to exceed four years of time ending in its chosen... necessary, update their entity statements. (2) From time to time, a reporting entity may choose to change...

  20. 10 CFR 300.5 - Submission of an entity statement.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... report as a large emitter in all future years in order to ensure a consistent time series of reports... average annual emissions over a continuous period not to exceed four years of time ending in its chosen... necessary, update their entity statements. (2) From time to time, a reporting entity may choose to change...

  1. 10 CFR 300.5 - Submission of an entity statement.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... report as a large emitter in all future years in order to ensure a consistent time series of reports... average annual emissions over a continuous period not to exceed four years of time ending in its chosen... necessary, update their entity statements. (2) From time to time, a reporting entity may choose to change...

  2. Extracting Hydrologic Understanding from the Unique Space-time Sampling of the Surface Water and Ocean Topography (SWOT) Mission

    NASA Astrophysics Data System (ADS)

    Nickles, C.; Zhao, Y.; Beighley, E.; Durand, M. T.; David, C. H.; Lee, H.

    2017-12-01

    The Surface Water and Ocean Topography (SWOT) satellite mission is jointly developed by NASA and the French space agency (CNES), with participation from the Canadian and UK space agencies, to serve both the hydrology and oceanography communities. The SWOT mission will sample global surface water extents and elevations (lakes/reservoirs, rivers, estuaries, oceans, sea and land ice) at a finer spatial resolution than is currently possible, enabling hydrologic discovery, model advancements and new applications that are not currently possible or perhaps even conceivable. Although the mission will provide global coverage, analysis and interpolation of the data generated from the irregular space/time sampling represent a significant challenge. In this study, we explore the applicability of the unique space/time sampling for understanding river discharge dynamics throughout the Ohio River Basin. River network topology, SWOT sampling (i.e., orbit and identified SWOT river reaches) and spatial interpolation concepts are used to quantify the fraction of effective sampling of river reaches each day of the three-year mission. Streamflow statistics for SWOT-generated river discharge time series are compared to continuous daily river discharge series. Relationships are presented to transform SWOT-generated streamflow statistics to equivalent continuous daily discharge time series statistics, intended to support hydrologic applications using low-flow and annual flow duration statistics.

  3. A new algorithm for automatic Outlier Detection in GPS Time Series

    NASA Astrophysics Data System (ADS)

    Cannavo', Flavio; Mattia, Mario; Rossi, Massimo; Palano, Mimmo; Bruno, Valentina

    2010-05-01

    Nowadays continuous GPS time series are considered a crucial product of GPS permanent networks, useful in many geoscience fields such as active tectonics, seismology, crustal deformation and volcano monitoring (Altamimi et al. 2002, Elósegui et al. 2006, Aloisi et al. 2009). Although GPS data processing software has increased in reliability, the time series are still affected by different kinds of noise, from intrinsic noise (e.g. tropospheric delay) to un-modeled noise (e.g. cycle slips, satellite faults, parameter changes). Typically, GPS time series present characteristic noise that is a linear combination of white noise and correlated colored noise, and this characteristic is fractal in the sense that it is evident at every considered time scale or sampling rate. The un-modeled noise sources result in spikes, outliers and steps. These kinds of errors can appreciably influence the estimation of velocities of the monitored sites. Outlier detection in generic time series is a widely treated problem in the literature (Wei, 2005), while it is not fully developed for the specific kind of GPS series. We propose a robust automatic procedure for cleaning GPS time series of outliers and, especially for long daily series, of steps due to strong seismic or volcanic events or merely instrumentation changes such as antenna and receiver upgrades. The procedure is basically divided into two steps: a first step for colored noise reduction and a second step for outlier detection through adaptive series segmentation. Both algorithms present novel ideas and are nearly unsupervised. In particular, we propose an algorithm to estimate an autoregressive model for the colored noise in GPS time series in order to subtract the effect of non-Gaussian noise from the series. This step is useful for the subsequent step (i.e. adaptive segmentation), which requires the hypothesis of Gaussian noise. The proposed algorithms are tested in a benchmark case study and the results confirm that the algorithms are effective and reasonable. Bibliography: - Aloisi M., A. Bonaccorso, F. Cannavò, S. Gambino, M. Mattia, G. Puglisi, E. Boschi, A new dyke intrusion style for the Mount Etna May 2008 eruption modelled through continuous tilt and GPS data, Terra Nova, Volume 21, Issue 4, Pages 316-321, doi: 10.1111/j.1365-3121.2009.00889.x (August 2009) - Altamimi Z., Sillard P., Boucher C., ITRF2000: A new release of the International Terrestrial Reference Frame for earth science applications, J Geophys Res-Solid Earth, 107 (B10): art. no. 2214 (Oct 2002) - Elósegui, P., J. L. Davis, D. Oberlander, R. Baena, and G. Ekström, Accuracy of high-rate GPS for seismology, Geophys. Res. Lett., 33, L11308, doi:10.1029/2006GL026065 (2006) - Wei W. S., Time Series Analysis: Univariate and Multivariate Methods, Addison Wesley (2nd edition), ISBN-10: 0321322169 (July 2005)
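
    As a rough sketch of the first step described (estimating an autoregressive model of the colored noise and working with whitened residuals), an AR(p) model can be fit by least squares and points with unusually large one-step residuals flagged; the order and threshold are arbitrary here, and the adaptive segmentation step is not shown.

```python
import numpy as np

def ar_whiten(x, order=5):
    """Least-squares AR(order) fit; returns coefficients and one-step residuals."""
    X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
    y = x[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef, y - X @ coef

def flag_outliers(x, order=5, k=4.0):
    """Flag samples whose whitened residual exceeds k robust standard deviations."""
    _, resid = ar_whiten(np.asarray(x, dtype=float), order)
    sigma = 1.4826 * np.median(np.abs(resid - np.median(resid)))   # robust scale (MAD)
    flags = np.zeros(len(x), dtype=bool)
    flags[order:] = np.abs(resid) > k * sigma
    return flags

# usage sketch: flags = flag_outliers(north_component_mm, order=5, k=4.0)
```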

  4. GPS Time Series Analysis of Southern California Associated with the 2010 M7.2 El Mayor/Cucapah Earthquake

    NASA Technical Reports Server (NTRS)

    Granat, Robert; Donnellan, Andrea

    2011-01-01

    The magnitude 7.2 El Mayor/Cucapah earthquake that occurred in Mexico on April 4, 2010 was well instrumented with continuous GPS stations in California. Large offsets were observed at the GPS stations as a result of deformation from the earthquake, providing information about the co-seismic fault slip as well as fault slip from large aftershocks. Information can also be obtained from the position time series at each station.

  5. Comparison of different synthetic 5-min rainfall time series on the results of rainfall runoff simulations in urban drainage modelling

    NASA Astrophysics Data System (ADS)

    Krämer, Stefan; Rohde, Sophia; Schröder, Kai; Belli, Aslan; Maßmann, Stefanie; Schönfeld, Martin; Henkel, Erik; Fuchs, Lothar

    2015-04-01

    The design of urban drainage systems with numerical simulation models requires long, continuous rainfall time series with high temporal resolution. However, suitable observed time series are rare. As a result, usual design concepts often use uncertain or unsuitable rainfall data, which renders them uneconomic or unsustainable. An expedient alternative to observed data is the use of long, synthetic rainfall time series as input for the simulation models. Within the project SYNOPSE, several different methods to generate synthetic rainfall data as input for urban drainage modelling are advanced, tested, and compared. Synthetic rainfall time series from three different precipitation model approaches (one parametric stochastic model based on an alternating renewal approach, one non-parametric stochastic model based on a resampling approach, and one downscaling approach from a regional climate model) are provided for three catchments with different sewer system characteristics in different climate regions of Germany: Hamburg (northern Germany): maritime climate, mean annual rainfall 770 mm; combined sewer system length 1,729 km (city center of Hamburg), storm water sewer system length (Hamburg Harburg) 168 km. Brunswick (Lower Saxony, northern Germany): transitional climate from maritime to continental, mean annual rainfall 618 mm; sewer system length 278 km, connected impervious area 379 ha, height difference 27 m. Freiburg im Breisgau (southern Germany): Central European transitional climate, mean annual rainfall 908 mm; sewer system length 794 km, connected impervious area 1,546 ha, height difference 284 m. Hydrodynamic models are set up for each catchment to simulate rainfall runoff processes in the sewer systems. Long term event time series are extracted from the three different synthetic rainfall time series (comprising up to 600 years of continuous rainfall) provided for each catchment and from observed gauge rainfall (reference rainfall) according to national hydraulic design standards. The synthetic and reference long term event time series are used as rainfall input for the hydrodynamic sewer models. For comparison of the synthetic rainfall time series against the reference rainfall and against each other, the number of surcharged manholes, the number of surcharges per manhole, and the average surcharge volume per manhole are applied as hydraulic performance criteria. The results are discussed and assessed to answer the following questions: Are the synthetic rainfall approaches suitable to generate high resolution rainfall series, and do they produce, in combination with numerical rainfall runoff models, valid results for the design of urban drainage systems? What are the bounds of uncertainty in the runoff results depending on the synthetic rainfall model and on the climate region? The work is carried out within the SYNOPSE project, funded by the German Federal Ministry of Education and Research (BMBF).

  6. Variations in Stratospheric Inorganic Chlorine Between 1991 and 2006

    NASA Technical Reports Server (NTRS)

    Lary, D. J.; Waugh, D. W.; Douglass, A. R.; Stolarski, R. S.; Newman, P. A.; Mussa, H.

    2007-01-01

    So how quickly will the ozone hole recover? This depends on how quickly the chlorine content (Cly) of the atmosphere will decline. The ozone hole forms over the Antarctic each southern spring (September and October). The extremely small ozone amounts in the ozone hole are there because of chemical reactions of ozone with chlorine. This chlorine originates largely from industrially produced chlorofluorocarbon (CFC) compounds. An international agreement, the Montreal Protocol, is drastically reducing the amount of chlorine-containing compounds that we are releasing into the atmosphere. To be able to attribute changes in stratospheric ozone to changes in chlorine, we need to know the distribution of atmospheric chlorine. However, due to a lack of continuous observations of all the key chlorine gases, producing a continuous time series of stratospheric chlorine has not been achieved to date. We have for the first time devised a technique to make a 17-year time series for stratospheric chlorine that uses the long time series of HCl observations made from several spaceborne instruments and a neural network. The neural networks allow us both to inter-calibrate the various HCl instruments and to infer the total amount of atmospheric chlorine from HCl. These new estimates of Cly provide a much needed critical test for current global models, which predict significant differences in both Cly and ozone recovery. These models exhibit differences in their projection of the recovery time, and our chlorine content time series will help separate the good from the bad in these projections.

  7. Spatiotemporal interpolation of discharge across a river network by using synthetic SWOT satellite data

    NASA Astrophysics Data System (ADS)

    Paiva, Rodrigo C. D.; Durand, Michael T.; Hossain, Faisal

    2015-01-01

    Recent efforts have sought to estimate river discharge and other surface water-related quantities using spaceborne sensors, with better spatial coverage but worse temporal sampling as compared with in situ measurements. The Surface Water and Ocean Topography (SWOT) mission will provide river discharge estimates globally from space. However, questions on how to optimally use the spatially distributed but asynchronous satellite observations to generate continuous fields still exist. This paper presents a statistical model (River Kriging-RK), for estimating discharge time series in a river network in the context of the SWOT mission. RK uses discharge estimates at different locations and times to produce a continuous field using spatiotemporal kriging. A key component of RK is the space-time river discharge covariance, which was derived analytically from the diffusive wave approximation of Saint Venant's equations. The RK covariance also accounts for the loss of correlation at confluences. The model performed well in a case study on Ganges-Brahmaputra-Meghna (GBM) River system in Bangladesh using synthetic SWOT observations. The correlation model reproduced empirically derived values. RK (R2=0.83) outperformed other kriging-based methods (R2=0.80), as well as a simple time series linear interpolation (R2=0.72). RK was used to combine discharge from SWOT and in situ observations, improving estimates when the latter is included (R2=0.91). The proposed statistical concepts may eventually provide a feasible framework to estimate continuous discharge time series across a river network based on SWOT data, other altimetry missions, and/or in situ data.

  8. Utilization of Historic Information in an Optimisation Task

    NASA Technical Reports Server (NTRS)

    Boesser, T.

    1984-01-01

    One of the basic components of a discrete model of motor behavior and decision making, which describes tracking and supervisory control in unitary terms, is assumed to be a filtering mechanism which is tied to the representational principles of human memory for time-series information. In a series of experiments subjects used the time-series information with certain significant limitations: there is a range-effect; asymmetric distributions seem to be recognized, but it does not seem to be possible to optimize performance based on skewed distributions. Thus there is a transformation of the displayed data between the perceptual system and representation in memory involving a loss of information. This rules out a number of representational principles for time-series information in memory and fits very well into the framework of a comprehensive discrete model for control of complex systems, modelling continuous control (tracking), discrete responses, supervisory behavior and learning.

  9. Time lagged ordinal partition networks for capturing dynamics of continuous dynamical systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCullough, Michael; Iu, Herbert Ho-Ching; Small, Michael

    2015-05-15

    We investigate a generalised version of the recently proposed ordinal partition time series to network transformation algorithm. First, we introduce a fixed time lag for the elements of each partition that is selected using techniques from traditional time delay embedding. The resulting partitions define regions in the embedding phase space that are mapped to nodes in the network space. Edges are allocated between nodes based on temporal succession, thus creating a Markov chain representation of the time series. We then apply this new transformation algorithm to time series generated by the Rössler system and find that periodic dynamics translate to ring structures whereas chaotic time series translate to band or tube-like structures, thereby indicating that our algorithm generates networks whose structure is sensitive to system dynamics. Furthermore, we demonstrate that simple network measures including the mean out degree and variance of out degrees can track changes in the dynamical behaviour in a manner comparable to the largest Lyapunov exponent. We also apply the same analysis to experimental time series generated by a diode resonator circuit and show that the network size, mean shortest path length, and network diameter are highly sensitive to the interior crisis captured in this particular data set.
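
    A small sketch of the general construction (not the authors' generalised algorithm): map each lagged window of the series to an ordinal pattern, treat patterns as nodes, add an edge for each temporal succession of patterns, and read off simple measures such as the mean out-degree. The embedding dimension and lag below are placeholders.

```python
import numpy as np
from collections import defaultdict

def ordinal_partition_network(x, dim=4, lag=5):
    """Return transition counts {pattern: {next_pattern: count}} for the
    ordinal-partition network of series x with fixed time lag."""
    patterns = []
    for i in range(len(x) - (dim - 1) * lag):
        window = x[i:i + dim * lag:lag]
        patterns.append(tuple(np.argsort(window)))     # ordinal pattern (permutation)
    edges = defaultdict(lambda: defaultdict(int))
    for a, b in zip(patterns[:-1], patterns[1:]):       # temporal succession -> edge
        edges[a][b] += 1
    return edges

rng = np.random.default_rng(4)
t = np.arange(0, 200, 0.05)
x = np.sin(t) + 0.05 * rng.normal(size=t.size)          # near-periodic test series
net = ordinal_partition_network(x, dim=4, lag=5)
mean_out_degree = np.mean([len(succ) for succ in net.values()])
print(len(net), "nodes, mean out-degree", mean_out_degree)
```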

  10. Statistical process control of mortality series in the Australian and New Zealand Intensive Care Society (ANZICS) adult patient database: implications of the data generating process.

    PubMed

    Moran, John L; Solomon, Patricia J

    2013-05-24

    Statistical process control (SPC), an industrial sphere initiative, has recently been applied in health care and public health surveillance. SPC methods assume independent observations and process autocorrelation has been associated with increase in false alarm frequency. Monthly mean raw mortality (at hospital discharge) time series, 1995-2009, at the individual Intensive Care unit (ICU) level, were generated from the Australia and New Zealand Intensive Care Society adult patient database. Evidence for series (i) autocorrelation and seasonality was demonstrated using (partial)-autocorrelation ((P)ACF) function displays and classical series decomposition and (ii) "in-control" status was sought using risk-adjusted (RA) exponentially weighted moving average (EWMA) control limits (3 sigma). Risk adjustment was achieved using a random coefficient (intercept as ICU site and slope as APACHE III score) logistic regression model, generating an expected mortality series. Application of time-series to an exemplar complete ICU series (1995-(end)2009) was via Box-Jenkins methodology: autoregressive moving average (ARMA) and (G)ARCH ((Generalised) Autoregressive Conditional Heteroscedasticity) models, the latter addressing volatility of the series variance. The overall data set, 1995-2009, consisted of 491324 records from 137 ICU sites; average raw mortality was 14.07%; average(SD) raw and expected mortalities ranged from 0.012(0.113) and 0.013(0.045) to 0.296(0.457) and 0.278(0.247) respectively. For the raw mortality series: 71 sites had continuous data for assessment up to or beyond lag40 and 35% had autocorrelation through to lag40; and of 36 sites with continuous data for ≥ 72 months, all demonstrated marked seasonality. Similar numbers and percentages were seen with the expected series. Out-of-control signalling was evident for the raw mortality series with respect to RA-EWMA control limits; a seasonal ARMA model, with GARCH effects, displayed white-noise residuals which were in-control with respect to EWMA control limits and one-step prediction error limits (3SE). The expected series was modelled with a multiplicative seasonal autoregressive model. The data generating process of monthly raw mortality series at the ICU level displayed autocorrelation, seasonality and volatility. False-positive signalling of the raw mortality series was evident with respect to RA-EWMA control limits. A time series approach using residual control charts resolved these issues.
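
    A bare-bones EWMA control chart for an observed-minus-expected mortality series (illustration only; the paper's charts are risk-adjusted through a random-coefficient logistic regression, which is not reproduced here).

```python
import numpy as np

def ewma_chart(p_observed, p_expected, lam=0.2, sigma=None, k=3.0):
    """EWMA of observed-minus-expected monthly mortality with +/- k*sigma limits.
    Returns the EWMA series and a boolean out-of-control flag per month."""
    diff = np.asarray(p_observed, dtype=float) - np.asarray(p_expected, dtype=float)
    if sigma is None:
        sigma = np.std(diff, ddof=1)
    z = np.zeros(diff.size)
    for t in range(diff.size):
        prev = z[t - 1] if t > 0 else 0.0
        z[t] = lam * diff[t] + (1 - lam) * prev
    limit = k * sigma * np.sqrt(lam / (2 - lam))    # asymptotic EWMA control limit
    return z, np.abs(z) > limit

# usage sketch: z, alarms = ewma_chart(monthly_raw_mortality, monthly_expected_mortality)
```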

  11. Sleep-Dependent Memory Consolidation and Reconsolidation

    PubMed Central

    Stickgold, Robert; Walker, Matthew P.

    2009-01-01

    Molecular, cellular, and systems-level processes convert initial, labile memory representations into more permanent ones, available for continued reactivation and recall over extended periods of time. These processes of memory consolidation and reconsolidation are not all-or-none phenomena, but rather a continuing series of biological adjustments that enhance both the efficiency and utility of stored memories over time. In this chapter, we review the role of sleep in supporting these disparate but related processes. PMID:17470412

  12. Continuous-time system identification of a smoking cessation intervention

    NASA Astrophysics Data System (ADS)

    Timms, Kevin P.; Rivera, Daniel E.; Collins, Linda M.; Piper, Megan E.

    2014-07-01

    Cigarette smoking is a major global public health issue and the leading cause of preventable death in the United States. Toward a goal of designing better smoking cessation treatments, system identification techniques are applied to intervention data to describe smoking cessation as a process of behaviour change. System identification problems that draw from two modelling paradigms in quantitative psychology (statistical mediation and self-regulation) are considered, consisting of a series of continuous-time estimation problems. A continuous-time dynamic modelling approach is employed to describe the response of craving and smoking rates during a quit attempt, as captured in data from a smoking cessation clinical trial. The use of continuous-time models provides benefits of parsimony, ease of interpretation, and the opportunity to work with uneven or missing data.

  13. Development of a Design Tool for Planning Aqueous Amendment Injection Systems Permanganate Design Tool

    DTIC Science & Technology

    2010-08-01

    Excerpt fragments: abbreviations include CSTR (continuously stirred tank reactor), CT (contact time), EDB (ethylene dibromide) and ESTCP (Environmental Security Technology Certification Program). Section 6.2, "Simulating Oxidant Distribution Using a Series of CSTRs" (Model Development): the transport and consumption of permanganate are simulated within the...

  14. 36 CFR 1254.92 - How do I submit a request to microfilm records and donated historical materials?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., records preparation, and other NARA requirements in a shorter time frame. (1) You may include in your request only one project to microfilm a complete body of documents, such as an entire series, a major continuous segment of a very large series which is reasonably divisible, or a limited number of separate...

  15. 36 CFR 1254.92 - How do I submit a request to microfilm records and donated historical materials?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., records preparation, and other NARA requirements in a shorter time frame. (1) You may include in your request only one project to microfilm a complete body of documents, such as an entire series, a major continuous segment of a very large series which is reasonably divisible, or a limited number of separate...

  16. A Method for Oscillation Errors Restriction of SINS Based on Forecasted Time Series.

    PubMed

    Zhao, Lin; Li, Jiushun; Cheng, Jianhua; Jia, Chun; Wang, Qiufan

    2015-07-17

    Continuity, real-time, and accuracy are the key technical indexes of evaluating comprehensive performance of a strapdown inertial navigation system (SINS). However, Schuler, Foucault, and Earth periodic oscillation errors significantly cut down the real-time accuracy of SINS. A method for oscillation error restriction of SINS based on forecasted time series is proposed by analyzing the characteristics of periodic oscillation errors. The innovative method gains multiple sets of navigation solutions with different phase delays in virtue of the forecasted time series acquired through the measurement data of the inertial measurement unit (IMU). With the help of curve-fitting based on least square method, the forecasted time series is obtained while distinguishing and removing small angular motion interference in the process of initial alignment. Finally, the periodic oscillation errors are restricted on account of the principle of eliminating the periodic oscillation signal with a half-wave delay by mean value. Simulation and test results show that the method has good performance in restricting the Schuler, Foucault, and Earth oscillation errors of SINS.
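
    The core cancellation principle, averaging a signal with a copy delayed by half the oscillation period, can be illustrated as below. The forecasting of the delayed series from IMU data, which is the paper's contribution, is not shown; the Schuler period used is the standard nominal value.

```python
import numpy as np

SCHULER_PERIOD_S = 84.4 * 60            # nominal Schuler period, ~84.4 minutes

def suppress_oscillation(x, dt, period=SCHULER_PERIOD_S):
    """Average the series with a half-period-delayed copy so that a sinusoid
    of the given period cancels: (sin(wt) + sin(w*(t + T/2))) / 2 = 0."""
    half = int(round(period / (2 * dt)))
    out = x.astype(float).copy()
    out[:-half] = 0.5 * (x[:-half] + x[half:])
    return out

dt = 1.0
t = np.arange(0, 3 * SCHULER_PERIOD_S, dt)
err = 0.1 * np.sin(2 * np.pi * t / SCHULER_PERIOD_S)     # synthetic Schuler oscillation
print(np.abs(suppress_oscillation(err, dt)[: len(t) // 2]).max())   # close to zero
```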

  17. A Method for Oscillation Errors Restriction of SINS Based on Forecasted Time Series

    PubMed Central

    Zhao, Lin; Li, Jiushun; Cheng, Jianhua; Jia, Chun; Wang, Qiufan

    2015-01-01

    Continuity, real-time, and accuracy are the key technical indexes of evaluating comprehensive performance of a strapdown inertial navigation system (SINS). However, Schuler, Foucault, and Earth periodic oscillation errors significantly cut down the real-time accuracy of SINS. A method for oscillation error restriction of SINS based on forecasted time series is proposed by analyzing the characteristics of periodic oscillation errors. The innovative method gains multiple sets of navigation solutions with different phase delays in virtue of the forecasted time series acquired through the measurement data of the inertial measurement unit (IMU). With the help of curve-fitting based on least square method, the forecasted time series is obtained while distinguishing and removing small angular motion interference in the process of initial alignment. Finally, the periodic oscillation errors are restricted on account of the principle of eliminating the periodic oscillation signal with a half-wave delay by mean value. Simulation and test results show that the method has good performance in restricting the Schuler, Foucault, and Earth oscillation errors of SINS. PMID:26193283

  18. Continuous Change Detection and Classification (CCDC) of Land Cover Using All Available Landsat Data

    NASA Astrophysics Data System (ADS)

    Zhu, Z.; Woodcock, C. E.

    2012-12-01

    A new algorithm for Continuous Change Detection and Classification (CCDC) of land cover using all available Landsat data is developed. This new algorithm is capable of detecting many kinds of land cover change as new images are collected, while at the same time providing land cover maps for any given time. To better identify land cover change, a two-step cloud, cloud shadow, and snow masking algorithm is used to eliminate "noisy" observations. Next, a time series model with components of seasonality, trend, and break estimates the surface reflectance and temperature. The time series model is updated continuously with newly acquired observations. Due to the high variability in spectral response for different kinds of land cover change, the CCDC algorithm uses a data-driven threshold derived from all seven Landsat bands. When the difference between observed and predicted values exceeds the thresholds three consecutive times, a pixel is identified as land cover change. Land cover classification is done after change detection. Coefficients from the time series models and the Root Mean Square Error (RMSE) from model fitting are used as classification inputs for the Random Forest Classifier (RFC). We applied this new algorithm to one Landsat scene (Path 12 Row 31) that includes all of Rhode Island as well as much of eastern Massachusetts and parts of Connecticut. A total of 532 Landsat images acquired between 1982 and 2011 were processed. During this period, 619,924 pixels were detected to change once (91% of all changed pixels) and 60,199 pixels were detected to change twice (8% of all changed pixels). The most frequent land cover change category is from mixed forest to low density residential, which occupies more than 8% of all land cover change pixels.
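
    A simplified, single-band sketch of the time series model component: fit a trend plus annual harmonic to clear observations by least squares, then flag change when new observations deviate from the prediction by more than a threshold (a multiple of the fit RMSE) three consecutive times. The real algorithm uses all seven Landsat bands and a data-driven threshold; the function names here are illustrative.

```python
import numpy as np

def fit_harmonic(t_days, reflectance):
    """Least-squares fit of y = a0 + a1*t + a2*cos(2*pi*t/365.25) + a3*sin(2*pi*t/365.25)."""
    w = 2 * np.pi / 365.25
    A = np.column_stack([np.ones_like(t_days), t_days, np.cos(w * t_days), np.sin(w * t_days)])
    coef, *_ = np.linalg.lstsq(A, reflectance, rcond=None)
    rmse = np.sqrt(np.mean((A @ coef - reflectance) ** 2))
    return coef, rmse

def detect_change(t_new, y_new, coef, rmse, k=3.0, consecutive=3):
    """Return the index of the first change: k*rmse exceeded `consecutive` times in a row."""
    w = 2 * np.pi / 365.25
    pred = coef[0] + coef[1] * t_new + coef[2] * np.cos(w * t_new) + coef[3] * np.sin(w * t_new)
    exceed = np.abs(y_new - pred) > k * rmse
    run = 0
    for i, e in enumerate(exceed):
        run = run + 1 if e else 0
        if run >= consecutive:
            return i - consecutive + 1
    return None
```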

  19. Detectability of Granger causality for subsampled continuous-time neurophysiological processes.

    PubMed

    Barnett, Lionel; Seth, Anil K

    2017-01-01

    Granger causality is well established within the neurosciences for inference of directed functional connectivity from neurophysiological data. These data usually consist of time series which subsample a continuous-time biophysiological process. While it is well known that subsampling can lead to imputation of spurious causal connections where none exist, less is known about the effects of subsampling on the ability to reliably detect causal connections which do exist. We present a theoretical analysis of the effects of subsampling on Granger-causal inference. Neurophysiological processes typically feature signal propagation delays on multiple time scales; accordingly, we base our analysis on a distributed-lag, continuous-time stochastic model, and consider Granger causality in continuous time at finite prediction horizons. Via exact analytical solutions, we identify relationships among sampling frequency, underlying causal time scales and detectability of causalities. We reveal complex interactions between the time scale(s) of neural signal propagation and sampling frequency. We demonstrate that detectability decays exponentially as the sample time interval increases beyond causal delay times, identify detectability "black spots" and "sweet spots", and show that downsampling may potentially improve detectability. We also demonstrate that the invariance of Granger causality under causal, invertible filtering fails at finite prediction horizons, with particular implications for inference of Granger causality from fMRI data. Our analysis emphasises that sampling rates for causal analysis of neurophysiological time series should be informed by domain-specific time scales, and that state-space modelling should be preferred to purely autoregressive modelling. On the basis of a very general model that captures the structure of neurophysiological processes, we are able to help identify confounds, and offer practical insights, for successful detection of causal connectivity from neurophysiological recordings. Copyright © 2016 Elsevier B.V. All rights reserved.
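
    The effect of subsampling on a simple discrete-time Granger test can be explored numerically with statsmodels; this is only an illustration of the issue studied analytically in the paper, which works with continuous-time, state-space models, and the bivariate process below is synthetic.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(5)
n = 20000
x = np.zeros(n)
y = np.zeros(n)
for t in range(2, n):
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 2] + rng.normal()   # x drives y with a 2-step delay

for step in (1, 2, 5):                       # increasing subsampling factors
    data = np.column_stack([y[::step], x[::step]])           # test whether x causes y
    res = grangercausalitytests(data, maxlag=4, verbose=False)
    pval = res[4][0]['ssr_ftest'][1]
    print(f"subsample factor {step}: p-value at lag 4 = {pval:.3g}")
```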

  20. PSO-MISMO modeling strategy for multistep-ahead time series prediction.

    PubMed

    Bao, Yukun; Xiong, Tao; Hu, Zhongyi

    2014-05-01

    Multistep-ahead time series prediction is one of the most challenging research topics in the field of time series modeling and prediction, and remains under active research. Recently, the multiple-input several multiple-outputs (MISMO) modeling strategy has been proposed as a promising alternative for multistep-ahead time series prediction, exhibiting advantages compared with the two currently dominant strategies, the iterated and the direct strategies. Building on the established MISMO strategy, this paper proposes a particle swarm optimization (PSO)-based MISMO modeling strategy, which is capable of determining the number of sub-models in a self-adaptive mode, with varying prediction horizons. Rather than deriving crisp divides with equal-sized prediction horizons from the established MISMO, the proposed PSO-MISMO strategy, implemented with neural networks, employs a heuristic to create flexible divides with varying sizes of prediction horizons and to generate corresponding sub-models, providing considerable flexibility in model construction, which has been validated with simulated and real datasets.

  1. Time-series photographs of the sea floor in western Massachusetts Bay: June 1997 to June 1998

    USGS Publications Warehouse

    Butman, Bradford; Alexander, P. Soupy; Bothner, Michael H.

    2004-01-01

    This report presents time-series photographs of the sea floor obtained from an instrumented tripod deployed at Site A in western Massachusetts Bay (42° 22.6' N., 70° 47.0' W., 30 m water depth) from June 1997 through June 1998. Site A is approximately 1 km south of an ocean outfall that began discharging treated sewage effluent from the Boston metropolitan area into Massachusetts Bay in September 2000. Time-series photographs and oceanographic observations were initiated at Site A in December 1989 and are anticipated to continue to September 2005. This is the first in a series of reports planned to summarize and distribute these images in digital form. The objective of these reports is to enable easy and rapid viewing of the photographs and to provide a medium-resolution digital archive. The images, obtained every 4 hours, are presented as a movie (in .avi format, which may be viewed using an image viewer such as QuickTime or Windows Media Player) and as individual images (.tif format). The images provide time-series observations of changes of the sea floor and near-bottom water properties.

  2. Analysis of Vlbi, Slr and GPS Site Position Time Series

    NASA Astrophysics Data System (ADS)

    Angermann, D.; Krügel, M.; Meisel, B.; Müller, H.; Tesmer, V.

    Conventionally the IERS terrestrial reference frame (ITRF) is realized by the adoption of a set of epoch coordinates and linear velocities for a set of global tracking stations. Due to the remarkable progress of the space geodetic observation techniques (e.g. VLBI, SLR, GPS), the accuracy and consistency of the ITRF have increased continuously. The accuracy achieved today is mainly limited by technique-related systematic errors, which are often poorly characterized or quantified. Therefore it is essential to analyze the individual techniques' solutions with respect to systematic differences, models, parameters, datum definition, etc. The main subject of this presentation is the analysis of GPS, SLR and VLBI time series of site positions. The investigations are based on SLR and VLBI solutions computed at DGFI with the software systems DOGS (SLR) and OCCAM (VLBI). The GPS time series are based on weekly IGS station coordinate solutions. We analyze the time series with respect to the issues mentioned above. In particular we characterize the noise in the time series, identify periodic signals, and investigate non-linear effects that complicate the assignment of linear velocities for global tracking sites. One important aspect is the comparison of results obtained by different techniques at colocation sites.
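
    A minimal sketch of the kind of per-component analysis described above is a least-squares fit of a linear velocity plus annual and semi-annual sinusoids to a station position series, with the residuals retained for subsequent noise characterization. The function name, units and the restriction to a single coordinate component are assumptions for illustration.

      # Fit offset + velocity + annual + semi-annual terms to one position component.
      import numpy as np

      def fit_velocity_and_seasonal(t_years, pos_mm):
          w1, w2 = 2 * np.pi, 4 * np.pi                      # rad / year
          X = np.column_stack([np.ones_like(t_years), t_years,
                               np.cos(w1 * t_years), np.sin(w1 * t_years),
                               np.cos(w2 * t_years), np.sin(w2 * t_years)])
          coef, *_ = np.linalg.lstsq(X, pos_mm, rcond=None)
          residuals = pos_mm - X @ coef                      # input to noise analysis
          velocity_mm_yr = coef[1]
          annual_amp_mm = float(np.hypot(coef[2], coef[3]))
          return velocity_mm_yr, annual_amp_mm, residuals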

  3. The physiology analysis system: an integrated approach for warehousing, management and analysis of time-series physiology data.

    PubMed

    McKenna, Thomas M; Bawa, Gagandeep; Kumar, Kamal; Reifman, Jaques

    2007-04-01

    The physiology analysis system (PAS) was developed as a resource to support the efficient warehousing, management, and analysis of physiology data, particularly, continuous time-series data that may be extensive, of variable quality, and distributed across many files. The PAS incorporates time-series data collected by many types of data-acquisition devices, and it is designed to free users from data management burdens. This Web-based system allows both discrete (attribute) and time-series (ordered) data to be manipulated, visualized, and analyzed via a client's Web browser. All processes occur on a server, so that the client does not have to download data or any application programs, and the PAS is independent of the client's computer operating system. The PAS contains a library of functions, written in different computer languages that the client can add to and use to perform specific data operations. Functions from the library are sequentially inserted into a function chain-based logical structure to construct sophisticated data operators from simple function building blocks, affording ad hoc query and analysis of time-series data. These features support advanced mining of physiology data.

  4. Spectral-decomposition techniques for the identification of radon anomalies temporally associated with earthquakes occurring in the UK in 2002 and 2008.

    NASA Astrophysics Data System (ADS)

    Crockett, R. G. M.; Gillmore, G. K.

    2009-04-01

    During the second half of 2002, the University of Northampton Radon Research Group operated two continuous hourly-sampling radon detectors 2.25 km apart in Northampton, in the (English) East Midlands. This period included the Dudley earthquake (22/09/2002) which was widely noticed by members of the public in the Northampton area. Also, at various periods during 2008 the Group has operated another pair of continuous hourly-sampling radon detectors similar distances apart in Northampton. One such period included the Market Rasen earthquake (27/02/2008) which was also widely noticed by members of the public in the Northampton area. During each period of monitoring, two time-series of radon readings were obtained, one from each detector. These have been analysed for evidence of simultaneous similar anomalies: the premise being that big disturbances occurring at big distances (in relation to the detector separation) should produce simultaneous similar anomalies but that simultaneous anomalies occurring by chance will be dissimilar. As previously reported, cross-correlating the two 2002 time-series over periods of 1-30 days duration, rolled forwards through the time-series at one-hour intervals produced two periods of significant correlation, i.e. two periods of simultaneous similar behaviour in the radon concentrations. One of these periods corresponded in time to the Dudley earthquake, the other corresponded in time to a smaller earthquake which occurred in the English Channel (26/08/2002). We here report subsequent investigation of the 2002 time-series and the 2008 time-series using spectral-decomposition techniques. These techniques have revealed additional simultaneous similar behaviour in the two radon concentrations, not revealed by the rolling correlation on the raw data. These correspond in time to the Manchester earthquake swarm of October 2002 and the Market Rasen earthquake of February 2008. The spectral-decomposition techniques effectively ‘de-noise' the data, and also remove lower-frequency variations (e.g. tidal variations), revealing the simultaneous similarities. Whilst this is very much work in progress, there is the potential that such techniques enhance the possibility that simultaneous real-time monitoring of radon levels - for short-term simultaneous anomalies - at several locations in earthquake areas might provide the core of an earthquake prediction method. Keywords: Radon; earthquakes; time series; cross-correlation; spectral-decomposition; real-time simultaneous monitoring.
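
    The rolling cross-correlation described above can be sketched compactly: correlate the two hourly radon series over a sliding window, stepping one sample at a time, and flag windows with unusually high correlation. The window length and the percentile threshold below are illustrative assumptions, not the values used in the study.

      # Rolling Pearson correlation of two equal-length hourly radon series.
      import pandas as pd

      def rolling_correlation(radon_a, radon_b, window_hours=7 * 24):
          a, b = pd.Series(radon_a), pd.Series(radon_b)
          return a.rolling(window_hours).corr(b)

      # Example usage (hypothetical variable names):
      # corr = rolling_correlation(series_2002_site1, series_2002_site2)
      # anomalous_windows = corr[corr > corr.quantile(0.99)]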

  5. Characterizing and minimizing the effects of noise in tide gauge time series: relative and geocentric sea level rise around Australia

    NASA Astrophysics Data System (ADS)

    Burgette, Reed J.; Watson, Christopher S.; Church, John A.; White, Neil J.; Tregoning, Paul; Coleman, Richard

    2013-08-01

    We quantify the rate of sea level rise around the Australian continent from an analysis of tide gauge and Global Positioning System (GPS) data sets. To estimate the underlying linear rates of sea level change in the presence of significant interannual and decadal variability (treated here as noise), we adopt and extend a novel network adjustment approach. We simultaneously estimate time-correlated noise as well as linear model parameters and realistic uncertainties from sea level time series at individual gauges, as well as from time-series differences computed between pairs of gauges. The noise content at individual gauges is consistent with a combination of white and time-correlated noise. We find that the noise in time series from the western coast of Australia is best described by a first-order Gauss-Markov model, whereas east coast stations generally exhibit lower levels of time-correlated noise that is better described by a power-law process. These findings suggest several decades of monthly tide gauge data are needed to reduce rate uncertainties to <0.5 mm yr-1 for undifferenced single site time series with typical noise characteristics. Our subsequent adjustment strategy exploits the more precise differential rates estimated from differenced time series from pairs of tide gauges to estimate rates among the network of 43 tide gauges that passed a stability analysis. We estimate relative sea level rates over three temporal windows (1900-2011, 1966-2011 and 1993-2011), accounting for covariance between time series. The resultant adjustment reduces the rate uncertainty across individual gauges, and partially mitigates the need for century-scale time series at all sites in the network. Our adjustment reveals a spatially coherent pattern of sea level rise around the coastline, with the highest rates in northern Australia. Over the time periods beginning in 1900, 1966 and 1993, we find weighted average rates of sea level rise of 1.4 ± 0.6, 1.7 ± 0.6 and 4.6 ± 0.8 mm yr-1, respectively. While the temporal pattern of the rate estimates is consistent with acceleration in sea level rise, it may not be significant, as the uncertainties for the shorter analysis periods may not capture the full range of temporal variation. Analysis of the available continuous GPS records that have been collected within 80 km of Australian tide gauges suggests that rates of vertical crustal motion are generally low, with the majority of sites showing motion statistically insignificant from zero. A notable exception is the significant component of vertical land motion that contributes to the rapid rate of relative sea level change (>4 mm yr-1) at the Hillarys site in the Perth area. This corresponds to crustal subsidence that we estimate in our GPS analysis at a rate of -3.1 ± 0.7 mm yr-1, and appears linked to groundwater withdrawal. Uncertainties on the rates of vertical displacement at GPS sites collected over a decade are similar to what we measure in several decades of tide gauge data. Our results motivate continued observations of relative sea level using tide gauges, maintained with high-accuracy terrestrial and continuous co-located satellite-based surveying.

  6. Ice Stream Slowdown Will Drive Long-Term Thinning of the Ross Ice Shelf, With or Without Ocean Warming

    NASA Astrophysics Data System (ADS)

    Campbell, Adam J.; Hulbe, Christina L.; Lee, Choon-Ki

    2018-01-01

    As time series observations of Antarctic change proliferate, it is imperative that mathematical frameworks through which they are understood keep pace. Here we present a new method of interpreting remotely sensed change using spatial statistics and apply it to the specific case of thickness change on the Ross Ice Shelf. First, a numerical model of ice shelf flow is used together with empirical orthogonal function analysis to generate characteristic patterns of response to specific forcings. Because they are continuous and scalable in space and time, the patterns allow short duration observations to be placed in a longer time series context. Second, focusing only on changes that are statistically significant, the synthetic response surfaces are used to extract magnitude and timing of past events from the observational data. Slowdown of Kamb and Whillans Ice Streams is clearly detectable in remotely sensed thickness change. Moreover, those past events will continue to drive thinning into the future.

  7. Variability of daily UV index in Jokioinen, Finland, in 1995-2015

    NASA Astrophysics Data System (ADS)

    Heikkilä, A.; Uusitalo, K.; Kärhä, P.; Vaskuri, A.; Lakkala, K.; Koskela, T.

    2017-02-01

    The UV Index is a measure of UV radiation harmful to human skin, developed and used to promote sun awareness and protection. Monitoring programs conducted around the world have produced a number of long-term time series of UV irradiance. One of the longest time series of solar spectral UV irradiance in Europe has been obtained from the continuous measurements of the Brewer #107 spectrophotometer in Jokioinen (lat. 60°44'N, lon. 23°30'E), Finland, over the years 1995-2015. We have used descriptive statistics and estimates of cumulative distribution functions, quantiles and probability density functions in the analysis of the time series of daily UV Index maxima. Seasonal differences in the estimated distributions and in the trends of the estimated quantiles are found.
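
    A short sketch of the descriptive approach mentioned above is to group the daily UV Index maxima by meteorological season and compute empirical quantiles. The pandas layout (a Series with a DatetimeIndex) and the chosen quantile levels are assumptions.

      # Seasonal quantiles of daily UV Index maxima.
      import pandas as pd

      def seasonal_quantiles(daily_uvi_max: pd.Series, q=(0.05, 0.25, 0.5, 0.75, 0.95)):
          season_code = daily_uvi_max.index.month % 12 // 3   # 0=DJF, 1=MAM, 2=JJA, 3=SON
          labels = season_code.map({0: "DJF", 1: "MAM", 2: "JJA", 3: "SON"})
          return daily_uvi_max.groupby(labels).quantile(list(q)).unstack()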

  8. Mapping the structure of the world economy.

    PubMed

    Lenzen, Manfred; Kanemoto, Keiichiro; Moran, Daniel; Geschke, Arne

    2012-08-07

    We have developed a new series of environmentally extended multi-region input-output (MRIO) tables with applications in carbon, water, and ecological footprinting, and Life-Cycle Assessment, as well as trend and key driver analyses. Such applications have recently been at the forefront of global policy debates, such as assigning responsibility for emissions embodied in internationally traded products. The new time series was constructed using advanced parallelized supercomputing resources, and significantly advances the previous state of the art because of four innovations. First, it is available as a continuous 20-year time series of MRIO tables. Second, it distinguishes 187 individual countries comprising more than 15,000 industry sectors, and hence offers unsurpassed detail. Third, it provides information with a delay of only 1-3 years, therefore significantly improving timeliness. Fourth, it presents MRIO elements with accompanying standard deviations in order to allow users to understand the reliability of data. These advances will lead to material improvements in the capability of applications that rely on input-output tables. The timeliness of information means that analyses are more relevant to current policy questions. The continuity of the time series enables the robust identification of key trends and drivers of global environmental change. The high country and sector detail drastically improves the resolution of Life-Cycle Assessments. Finally, the availability of information on uncertainty allows policy-makers to quantitatively judge the level of confidence that can be placed in the results of analyses.

  9. Multifractals embedded in short time series: An unbiased estimation of probability moment

    NASA Astrophysics Data System (ADS)

    Qiu, Lu; Yang, Tianguang; Yin, Yanhua; Gu, Changgui; Yang, Huijie

    2016-12-01

    An exact estimation of probability moments is the basis for several essential concepts, such as the multifractals, the Tsallis entropy, and the transfer entropy. By means of approximation theory we propose a new method called factorial-moment-based estimation of probability moments. Theoretical predictions and computational results show that it provides an unbiased estimate of probability moments of continuous order. Calculations on a probability redistribution model verify that it can extract multifractal behaviors exactly from several hundred recordings. Its power in monitoring the evolution of scaling behaviors is exemplified by two empirical cases, i.e., the gait time series for fast, normal, and slow trials of a healthy volunteer, and the closing price series for the Shanghai stock market. Using short time series of several hundred points, a comparison with well-established tools shows significant performance advantages over the other methods. The factorial-moment-based estimation can correctly evaluate scaling behaviors over a scale range about three generations wider than the multifractal detrended fluctuation analysis and the basic estimation. The estimation of the partition function given by the wavelet transform modulus maxima shows unacceptable fluctuations. Besides the scaling invariance that is the focus of the present paper, the proposed factorial moment of continuous order can find various uses, such as finding nonextensive behaviors of a complex system and reconstructing the causality relationship network between elements of a complex system.

  10. Continuous measurement of suspended-sediment discharge in rivers by use of optical backscatterance sensors

    USGS Publications Warehouse

    Schoellhamer, D.H.; Wright, S.A.; Bogen, J.; Fergus, T.; Walling, D.

    2003-01-01

    Optical sensors have been used to measure turbidity and suspended-sediment concentration in many marine and estuarine studies, and they can provide automated, continuous time series of suspended-sediment concentration and discharge in rivers. Three potential problems with using optical sensors are biological fouling, particle-size variability, and particle-reflectivity variability. Despite varying particle size, output from an optical backscatterance sensor in the Sacramento River at Freeport, California, USA, was calibrated successfully to discharge-weighted, cross-sectionally averaged suspended-sediment concentration, which was measured with the equal-discharge-increment or equal-width-increment methods and an isokinetic sampler. A correction for sensor drift was applied to the 3-year time series. However, the calibration of an optical backscatterance sensor used in the Colorado River at Cisco, Utah, USA, was affected by particle-size variability. The adjusted time series at Freeport was used to calculate hourly suspended-sediment discharge that compared well with daily values from a sediment station at Freeport. The appropriateness of using optical sensors in rivers should be evaluated on a site-specific basis; measurement objectives, potential particle-size effects, and potential fouling should be considered.
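
    The core of the workflow described above can be sketched as a linear calibration of sensor output against sampled concentrations, followed by conversion of discharge and concentration to suspended-sediment discharge. The linear form, variable names and unit choices are assumptions; operational calibrations may require drift corrections and site-specific functional forms.

      import numpy as np

      def calibrate_obs(obs_at_samples, ssc_samples_mg_l):
          """Fit SSC = a*OBS + b from paired sensor output and sampled concentrations."""
          a, b = np.polyfit(obs_at_samples, ssc_samples_mg_l, 1)
          return lambda obs: a * np.asarray(obs) + b

      def sediment_discharge_kg_s(q_m3_s, ssc_mg_l):
          """Suspended-sediment discharge: mg/L equals g/m3, so Q*C is g/s; divide by 1000 for kg/s."""
          return q_m3_s * ssc_mg_l / 1000.0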

  11. 77 FR 38035 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-26

    ... government spending and 35.8 percent of state government spending. This comprehensive and ongoing, time series collection of local education agency finances maintains historical continuity in the state and...

  12. Curve Number Application in Continuous Runoff Models: An Exercise in Futility?

    NASA Astrophysics Data System (ADS)

    Lamont, S. J.; Eli, R. N.

    2006-12-01

    The suitability of applying the NRCS (Natural Resources Conservation Service) Curve Number (CN) to continuous runoff prediction is examined by studying the dependence of CN on several hydrologic variables in the context of a complex nonlinear hydrologic model. The continuous watershed model Hydrologic Simulation Program-FORTRAN (HSPF) was employed using a simple theoretical watershed in two numerical procedures designed to investigate the influence of soil type, soil depth, storm depth, storm distribution, and initial abstraction ratio value on the calculated CN value. This study stems from a concurrent project involving the design of a hydrologic modeling system to support the Cumulative Hydrologic Impact Assessments (CHIA) of over 230 coal-mined watersheds throughout West Virginia. Because of the large number of watersheds and limited availability of data necessary for HSPF calibration, it was initially proposed that predetermined CN values be used as a surrogate for those HSPF parameters controlling direct runoff. A soil physics model was developed to relate CN values to those HSPF parameters governing soil moisture content and infiltration behavior, with the remaining HSPF parameters being adopted from previous calibrations on real watersheds. A numerical procedure was then adopted to back-calculate CN values from the theoretical watershed using antecedent moisture conditions equivalent to the NRCS Antecedent Runoff Condition (ARC) II. This procedure used the direct runoff produced from a cyclic synthetic storm event time series input to HSPF. A second numerical method of CN determination, using real time series rainfall data, was used to provide a comparison to those CN values determined using the synthetic storm event time series. It was determined that the calculated CN values resulting from both numerical methods demonstrated a nonlinear dependence on all of the computational variables listed above. It was concluded that the use of the Curve Number as a surrogate for the selected subset of HSPF parameters could not be justified. These results suggest that use of the Curve Number in other complex continuous time series hydrologic models may not be appropriate, given the limitations inherent in the definition of the NRCS CN method.

  13. Reconstruction of MODIS total suspended matter time series maps by DINEOF and validation with autonomous platform data

    NASA Astrophysics Data System (ADS)

    Nechad, Bouchra; Alvera-Azcaràte, Aida; Ruddick, Kevin; Greenwood, Naomi

    2011-08-01

    In situ measurements of total suspended matter (TSM) over the period 2003-2006, collected with two autonomous platforms from the Centre for Environment, Fisheries and Aquaculture Science (Cefas) measuring the optical backscatter (OBS) in the southern North Sea, are used to assess the accuracy of TSM time series extracted from satellite data. Since there are gaps in the remote sensing (RS) data, due mainly to cloud cover, the Data Interpolating Empirical Orthogonal Functions (DINEOF) method is used to fill in the TSM time series and build a continuous daily "recoloured" dataset. The RS datasets consist of TSM maps derived from MODIS imagery using the bio-optical model of Nechad et al. (Rem Sens Environ 114: 854-866, 2010). In this study, the DINEOF time series are compared to the in situ OBS measured in moderately and very turbid waters at West Gabbard and Warp Anchorage, respectively, in the southern North Sea. The discrepancies between instantaneous RS, DINEOF-filled RS data and Cefas data are analysed in terms of TSM algorithm uncertainties, space-time variability and DINEOF reconstruction uncertainty.
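
    The gap-filling idea can be sketched as an iterative truncated-EOF (SVD) reconstruction of a space-time matrix with missing pixels. The fixed number of modes and the convergence criterion below are simplifications; the operational DINEOF selects the optimal number of modes by cross-validation.

      # Iterative EOF reconstruction of missing values in a (time x space) matrix.
      import numpy as np

      def dineof_like_fill(data, n_modes=3, tol=1e-4, max_iter=200):
          """data: 2-D array (time x space) with NaNs marking missing pixels."""
          mask = np.isnan(data)
          filled = np.where(mask, np.nanmean(data), data)   # initial guess: global mean
          for _ in range(max_iter):
              u, s, vt = np.linalg.svd(filled, full_matrices=False)
              recon = (u[:, :n_modes] * s[:n_modes]) @ vt[:n_modes]
              change = np.max(np.abs(recon[mask] - filled[mask])) if mask.any() else 0.0
              filled[mask] = recon[mask]
              if change < tol:
                  break
          return filled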

  14. Continuity in fire disturbance between riparian and adjacent sideslopes in the Douglas-fir forest series.

    Treesearch

    Richard L. Everett; Richard Schellhaas; Pete Ohlson

    2000-01-01

    Fire scar and stand cohort records were used to estimate the number and timing of fire disturbance events that impacted riparian and adjacent sideslope forests in the Douglas-fir series. Data were gathered from 49 stream segments on 24 separate streams on the east slope of the Washington Cascade Range. Upslope forests had more traceable disturbance events than riparian...

  15. [Vegetation spatial and temporal dynamic characteristics based on NDVI time series trajectories in grassland opencast coal mining].

    PubMed

    Jia, Duo; Wang, Cang Jiao; Mu, Shou Guo; Zhao, Hua

    2017-06-18

    The spatiotemporal dynamic patterns of vegetation in mining areas are still unclear. This study utilized a time series trajectory segmentation algorithm to fit Landsat NDVI time series generated from ESTARFM-fused images at the peak of the growing season. Combining the shape features of the fitted trajectories, this paper extracted five vegetation dynamic patterns: pre-disturbance type, continuous disturbance type, stabilization after disturbance type, stabilization between disturbance and recovery type, and recovery after disturbance type. The results indicated that the recovery after disturbance type was the dominant vegetation change pattern among the five types, accounting for 55.2% of the total number of pixels. It was followed by the stabilization after disturbance type and the continuous disturbance type, accounting for 25.6% and 11.0%, respectively. The pre-disturbance type and the stabilization between disturbance and recovery type accounted for 3.5% and 4.7%, respectively. Vegetation disturbance mainly occurred from 2004 to 2009 in the Shengli mining area. The onset of the stable state was mainly in 2008, and its spatial locations were mainly distributed in the open-pit stope and waste dump. The recovery state mainly started in 2008 and 2010, while the areas were small and mainly distributed at the periphery of the open-pit stope and waste dump. The duration of disturbance was mainly 1 year, and the stable period usually lasted 7 years. The recovery state of the stabilization between disturbance and recovery type lasted 2 to 5 years, while that of the recovery after disturbance type often lasted 8 years.

  16. Ozone Time Series From GOMOS and SAGE II Measurements

    NASA Astrophysics Data System (ADS)

    Kyrola, E. T.; Laine, M.; Tukiainen, S.; Sofieva, V.; Zawodny, J. M.; Thomason, L. W.

    2011-12-01

    Satellite measurements are essential for monitoring changes in the global stratospheric ozone distribution. Both the natural variation and anthropogenic change are strongly dependent on altitude. Stratospheric ozone has been measured from space with good vertical resolution since 1985 by the SAGE II solar occultation instrument. The advantage of the occultation measurement principle is the self-calibration, which is essential to ensuring stable time series. SAGE II measurements in 1985-2005 have been a valuable data set in investigations of trends in the vertical distribution of ozone. This time series can now be extended by the GOMOS measurements started in 2002. GOMOS is a stellar occultation instrument and offers, therefore, a natural continuation of SAGE II measurements. In this paper we study how well GOMOS and SAGE II measurements agree with each other in the period 2002-2005 when both instruments were measuring. We detail how the different spatial and temporal sampling of these two instruments affect the conformity of measurements. We study also how the retrieval specifics like absorption cross sections and assumed aerosol modeling affect the results. Various combined time series are constructed using different estimators and latitude-time grids. We also show preliminary results from a novel time series analysis based on Markov chain Monte Carlo approach.

  17. 40 CFR 86.115-78 - EPA urban dynamometer driving schedule.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES Emission.... time relationships. They each consist of a distinct nonrepetitive series of idle, acceleration, cruise...

  18. 40 CFR 86.115-78 - EPA urban dynamometer driving schedule.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES Emission.... time relationships. They each consist of a distinct nonrepetitive series of idle, acceleration, cruise...

  19. 40 CFR 86.115-78 - EPA urban dynamometer driving schedule.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES Emission.... time relationships. They each consist of a distinct nonrepetitive series of idle, acceleration, cruise...

  20. Introduction to Time Series Analysis

    NASA Technical Reports Server (NTRS)

    Hardin, J. C.

    1986-01-01

    The field of time series analysis is explored from its logical foundations to the most modern data analysis techniques. The presentation is developed, as far as possible, for continuous data, so that the inevitable use of discrete mathematics is postponed until the reader has gained some familiarity with the concepts. The monograph seeks to provide the reader with both the theoretical overview and the practical details necessary to correctly apply the full range of these powerful techniques. In addition, the last chapter introduces many specialized areas where research is currently in progress.

  1. Comparison of different synthetic 5-min rainfall time series regarding their suitability for urban drainage modelling

    NASA Astrophysics Data System (ADS)

    van der Heijden, Sven; Callau Poduje, Ana; Müller, Hannes; Shehu, Bora; Haberlandt, Uwe; Lorenz, Manuel; Wagner, Sven; Kunstmann, Harald; Müller, Thomas; Mosthaf, Tobias; Bárdossy, András

    2015-04-01

    For the design and operation of urban drainage systems with numerical simulation models, long, continuous precipitation time series with high temporal resolution are necessary. Suitable observed time series are rare. As a result, intelligent design concepts often use uncertain or unsuitable precipitation data, which renders them uneconomic or unsustainable. An expedient alternative to observed data is the use of long, synthetic rainfall time series as input for the simulation models. Within the project SYNOPSE, several different methods to generate synthetic precipitation data for urban drainage modelling are advanced, tested, and compared. The present study compares four different precipitation modelling approaches regarding their ability to reproduce rainfall and runoff characteristics. These include one parametric stochastic model (alternating renewal approach), one non-parametric stochastic model (resampling approach), one downscaling approach from a regional climate model, and one disaggregation approach based on daily precipitation measurements. All four models produce long precipitation time series with a temporal resolution of five minutes. The synthetic time series are first compared to observed rainfall reference time series. Comparison criteria include event-based statistics like mean dry spell and wet spell duration, wet spell amount and intensity, long-term means of precipitation sum and number of events, and extreme value distributions for different durations. Then they are compared regarding simulated discharge characteristics using an urban hydrological model on a fictitious sewage network. First results show that all rainfall models are suitable in principle, but with different strengths and weaknesses regarding the different rainfall and runoff characteristics considered.
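
    A minimal sketch of the event-based comparison criteria listed above: split a 5-minute rainfall series into events separated by a minimum dry period and report mean event duration, mean event depth and event count. The one-hour minimum inter-event time and the wet threshold are illustrative assumptions.

      import numpy as np

      def event_statistics(rain_mm, dt_min=5, min_dry_steps=12, wet_threshold=0.0):
          """Return mean event duration (h), mean event depth (mm) and number of events."""
          rain = np.asarray(rain_mm, float)
          wet = rain > wet_threshold
          events, current, dry_run = [], [], min_dry_steps
          for i, is_wet in enumerate(wet):
              if is_wet:
                  current.append(i)
                  dry_run = 0
              else:
                  dry_run += 1
                  if current and dry_run >= min_dry_steps:
                      events.append(current)
                      current = []
          if current:
              events.append(current)
          durations_h = [(e[-1] - e[0] + 1) * dt_min / 60.0 for e in events]
          depths_mm = [float(rain[e].sum()) for e in events]
          return np.mean(durations_h), np.mean(depths_mm), len(events)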

  2. STEM connections to the GOES-R Satellite Series

    NASA Astrophysics Data System (ADS)

    Mooney, M. E.; Schmit, T.

    2015-12-01

    GOES-R, a new Geostationary Operational Environmental Satellite (GOES), is scheduled to be launched in October 2016. Its role is to continue western hemisphere satellite coverage while the existing GOES series winds down its 20-year operation. However, instruments on the next generation GOES-R satellite series will provide major improvements over the current GOES, both in the frequency of images acquired and in the spectral and spatial resolution of the images, providing a perfect conduit for STEM education. Most of these improvements will be provided by the Advanced Baseline Imager (ABI). ABI will provide three times more spectral information, four times the spatial resolution, and more than five times faster temporal coverage than the current GOES. Another exciting addition to the GOES-R satellite series will be the Geostationary Lightning Mapper (GLM). The all new GLM on GOES-R will measure total lightning activity continuously over the Americas and adjacent ocean regions with near uniform spatial resolution of approximately 10 km! Due to ABI, GLM and improved spacecraft calibration and navigation, the next generation GOES-R satellite series will usher in an exciting era of satellite applications and opportunities for STEM education. This session will present and demonstrate exciting next-gen imagery advancements and new HTML5 WebApps that demonstrate STEM connections to these improvements. Participants will also be invited to join the GOES-R Education Proving Ground, a national network of educators who will receive stipends to attend 4 webinars during the spring of 2016, pilot a STEM lesson plan, and organize a school-wide launch awareness event.

  3. OceanSITES: Sustained Ocean Time Series Observations in the Global Ocean.

    NASA Astrophysics Data System (ADS)

    Weller, R. A.; Gallage, C.; Send, U.; Lampitt, R. S.; Lukas, R.

    2016-02-01

    Time series observations at critical or representative locations are an essential element of a global ocean observing system that is unique and complements other approaches to sustained observing. OceanSITES is an international group of oceanographers associated with such time series sites. OceanSITES exists to promote the continuation and extension of ocean time series sites around the globe. It also exists to plan and oversee the global array of sites in order to address the needs of research, climate change detection, operational applications, and policy makers. OceanSITES is a voluntary group that sits as an Action Group of the JCOMM-OPS Data Buoy Cooperation Panel, where JCOMM-OPS is the operational ocean observing oversight group of the Joint Commission on Oceanography and Marine Meteorology of the International Oceanographic Commission and the World Meteorological Organization. The way forward includes working to complete the global array, moving toward multidisciplinary instrumentation on a subset of the sites, and increasing utilization of the time series data, which are freely available from two Global Data Assembly Centers, one at the National Data Buoy Center and one at Coriolis at IFREMER. One recent OceanSITES initiative and several results from OceanSITES time series sites are presented. The recent initiative was the assembly of a pool of temperature/conductivity recorders for provision to OceanSITES sites in order to provide deep ocean temperature and salinity time series. Examples from specific sites include: a 15-year record of surface meteorology and air-sea fluxes from off northern Chile that shows evidence of long-term trends in surface forcing; changes in upper ocean salinity and stratification at the Hawaii time series site associated with regional change in the hydrological cycle; results from monitoring Atlantic meridional transport; and results from a European multidisciplinary time series site.

  4. 40 CFR 86.515-78 - EPA urban dynamometer driving schedule.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES Emission... nonrepetitive series of idle, acceleration, cruise, and deceleration modes of various time sequences and rates...

  5. 40 CFR 86.515-78 - EPA urban dynamometer driving schedule.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES Emission... nonrepetitive series of idle, acceleration, cruise, and deceleration modes of various time sequences and rates...

  6. 40 CFR 86.515-78 - EPA urban dynamometer driving schedule.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES Emission... nonrepetitive series of idle, acceleration, cruise, and deceleration modes of various time sequences and rates...

  7. 40 CFR 86.515-78 - EPA urban dynamometer driving schedule.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES Emission... nonrepetitive series of idle, acceleration, cruise, and deceleration modes of various time sequences and rates...

  8. Subordinated continuous-time AR processes and their application to modeling behavior of mechanical system

    NASA Astrophysics Data System (ADS)

    Gajda, Janusz; Wyłomańska, Agnieszka; Zimroz, Radosław

    2016-12-01

    Many real data exhibit behavior characteristic of subdiffusion processes. Very often it is manifested by so-called "trapping events". Visible evidence of subdiffusion is observed not only in financial time series but also in technical data. In this paper we propose a model which can be used for the description of such data. The model is based on the continuous time autoregressive time series with stable noise delayed by the infinitely divisible inverse subordinator. The proposed system can be applied to real datasets with short-time dependence, visible jumps and the mentioned periods of stagnation. In this paper we extend the theoretical considerations in the analysis of subordinated processes and propose a new model that exhibits the mentioned properties. We concentrate on the main characteristics of the examined subordinated process, expressed mainly in the language of measures of dependence, which are the main tools used in the statistical investigation of real data. We also present the simulation procedure for the considered system and indicate how to estimate its parameters. We illustrate the theoretical results by the analysis of real technical data.

  9. Evaluating PRISM precipitation grid data as possible surrogates for station data at four sites in Oklahoma

    USDA-ARS?s Scientific Manuscript database

    The development of climate-sensitive decision support for agriculture or water resource management requires long time series of monthly precipitation for specific locations. Archived station data for many locations is available, but time continuity, quality, and spatial coverage of station data rem...

  10. 8 CFR 245a.1 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... may be used in addition to, but not in lieu of, the Federal Citizenship Text series); (3) Be designed... regarded as having resided continuously in the United States if, at the time of filing of the application... the United States could not be accomplished within the time period allowed; (ii) The alien was...

  11. 8 CFR 245a.1 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... may be used in addition to, but not in lieu of, the Federal Citizenship Text series); (3) Be designed... regarded as having resided continuously in the United States if, at the time of filing of the application... the United States could not be accomplished within the time period allowed; (ii) The alien was...

  12. 8 CFR 245a.1 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... may be used in addition to, but not in lieu of, the Federal Citizenship Text series); (3) Be designed... regarded as having resided continuously in the United States if, at the time of filing of the application... the United States could not be accomplished within the time period allowed; (ii) The alien was...

  13. Statistical process control of mortality series in the Australian and New Zealand Intensive Care Society (ANZICS) adult patient database: implications of the data generating process

    PubMed Central

    2013-01-01

    Background Statistical process control (SPC), an industrial sphere initiative, has recently been applied in health care and public health surveillance. SPC methods assume independent observations and process autocorrelation has been associated with increase in false alarm frequency. Methods Monthly mean raw mortality (at hospital discharge) time series, 1995–2009, at the individual Intensive Care unit (ICU) level, were generated from the Australia and New Zealand Intensive Care Society adult patient database. Evidence for series (i) autocorrelation and seasonality was demonstrated using (partial)-autocorrelation ((P)ACF) function displays and classical series decomposition and (ii) “in-control” status was sought using risk-adjusted (RA) exponentially weighted moving average (EWMA) control limits (3 sigma). Risk adjustment was achieved using a random coefficient (intercept as ICU site and slope as APACHE III score) logistic regression model, generating an expected mortality series. Application of time-series to an exemplar complete ICU series (1995-(end)2009) was via Box-Jenkins methodology: autoregressive moving average (ARMA) and (G)ARCH ((Generalised) Autoregressive Conditional Heteroscedasticity) models, the latter addressing volatility of the series variance. Results The overall data set, 1995-2009, consisted of 491324 records from 137 ICU sites; average raw mortality was 14.07%; average(SD) raw and expected mortalities ranged from 0.012(0.113) and 0.013(0.045) to 0.296(0.457) and 0.278(0.247) respectively. For the raw mortality series: 71 sites had continuous data for assessment up to or beyond lag40 and 35% had autocorrelation through to lag40; and of 36 sites with continuous data for ≥ 72 months, all demonstrated marked seasonality. Similar numbers and percentages were seen with the expected series. Out-of-control signalling was evident for the raw mortality series with respect to RA-EWMA control limits; a seasonal ARMA model, with GARCH effects, displayed white-noise residuals which were in-control with respect to EWMA control limits and one-step prediction error limits (3SE). The expected series was modelled with a multiplicative seasonal autoregressive model. Conclusions The data generating process of monthly raw mortality series at the ICU level displayed autocorrelation, seasonality and volatility. False-positive signalling of the raw mortality series was evident with respect to RA-EWMA control limits. A time series approach using residual control charts resolved these issues. PMID:23705957
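
    A stripped-down sketch of the EWMA control-charting step described above: smooth the observed-minus-expected mortality series with an exponentially weighted moving average and compare it with time-varying 3-sigma limits. The smoothing constant, the Gaussian residual assumption and the function name are illustrative; the full method risk-adjusts the expected series with a random-coefficient logistic model and handles autocorrelation with ARMA/GARCH modelling.

      import numpy as np

      def ewma_chart(observed, expected, lam=0.2, k=3.0):
          """Return the EWMA of (observed - expected) and a boolean out-of-control flag."""
          resid = np.asarray(observed, float) - np.asarray(expected, float)
          sigma = np.std(resid, ddof=1)
          z = np.zeros_like(resid)
          z[0] = resid[0]
          for t in range(1, len(resid)):
              z[t] = lam * resid[t] + (1 - lam) * z[t - 1]
          n = np.arange(1, len(resid) + 1)
          limits = k * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * n)))
          return z, np.abs(z) > limits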

  14. Correlates of depression in bipolar disorder

    PubMed Central

    Moore, Paul J.; Little, Max A.; McSharry, Patrick E.; Goodwin, Guy M.; Geddes, John R.

    2014-01-01

    We analyse time series from 100 patients with bipolar disorder for correlates of depression symptoms. As the sampling interval is non-uniform, we quantify the extent of missing and irregular data using new measures of compliance and continuity. We find that uniformity of response is negatively correlated with the standard deviation of sleep ratings (ρ = –0.26, p = 0.01). To investigate the correlation structure of the time series themselves, we apply the Edelson–Krolik method for correlation estimation. We examine the correlation between depression symptoms for a subset of patients and find that self-reported measures of sleep and appetite/weight show a lower average correlation than other symptoms. Using surrogate time series as a reference dataset, we find no evidence that depression is correlated between patients, though we note a possible loss of information from sparse sampling. PMID:24352942

  15. Assessing dry weather flow contribution in TSS and COD storm events loads in combined sewer systems.

    PubMed

    Métadier, M; Bertrand-Krajewski, J L

    2011-01-01

    Continuous high-resolution long-term turbidity measurements, along with continuous discharge measurements, are now recognised as an appropriate technique for the estimation of in-sewer total suspended solids (TSS) and Chemical Oxygen Demand (COD) loads during storm events. In the combined system of the Ecully urban catchment (Lyon, France), this technique has been implemented since 2003, with more than 200 storm events monitored. This paper presents a method for the estimation of the dry weather (DW) contribution to measured total TSS and COD event loads, with special attention devoted to uncertainty assessment. The method accounts for the dynamics of both discharge and turbidity time series at a two-minute time step. The study is based on 180 DW days monitored in 2007-2008. Three distinct classes of DW days were identified. Variability analysis and quantification showed that no seasonal effect and no trend over the year were detectable. The law of propagation of uncertainties is applicable for uncertainty estimation. The method was then applied to all measured storm events. This study confirms the interest of long-term continuous discharge and turbidity time series in sewer systems, especially in the perspective of wet weather quality modelling.

  16. Time-series photographs of the sea floor in western Massachusetts Bay: May 1999 to September 1999; May 2000 to September 2000; and October 2001 to February 2002

    USGS Publications Warehouse

    Butman, Bradford; Alexander, P. Soupy; Bothner, Michael H.

    2004-01-01

    This report presents time-series photographs of the sea floor obtained from an instrumented tripod deployed at Site A in western Massachusetts Bay (42° 22.6' N., 70° 47.0' W., 30 m water depth) from May 1999 to September 1999; May 2000 to September 2000; and October 2001 to February 2002. Site A is approximately 1 km south of an ocean outfall that began discharging treated sewage effluent from the Boston metropolitan area into Massachusetts Bay in September 2000. Time-series photographs and oceanographic observations were initiated at Site A in December 1989 and are anticipated to continue to September 2005. This is one of a series of reports that present these images in digital form. The objective of these reports is to enable easy and rapid viewing of the photographs and to provide a medium-resolution digital archive. The images, obtained every 4 hours, are presented as a movie (in .avi format, which may be viewed using an image viewer such as QuickTime or Windows Media Player) and as individual images (.tif format). The images provide time-series observations of changes of the sea floor and near-bottom water properties.

  17. Assessment of Program Impact Through First Grade, Volume V: Impact on Children. An Evaluation of Project Developmental Continuity. Interim Report X.

    ERIC Educational Resources Information Center

    Berrueta-Clement, John; And Others

    Fifth in a series of six volumes reporting outcomes of the preliminary evaluation of an educational intervention, this report presents the findings of the effects of Project Developmental Continuity (PDC) up to the time the evaluation study's cohort of children completed grade 1. Preliminary findings concerning the relationship between variables…

  18. From mess to mass: a methodology for calculating storm event pollutant loads with their uncertainties, from continuous raw data time series.

    PubMed

    Métadier, M; Bertrand-Krajewski, J-L

    2011-01-01

    With the increasing implementation of continuous monitoring of both discharge and water quality in sewer systems, large data bases are now available. In order to manage large amounts of data and calculate various variables and indicators of interest it is necessary to apply automated methods for data processing. This paper deals with the processing of short time step turbidity time series to estimate TSS (Total Suspended Solids) and COD (Chemical Oxygen Demand) event loads in sewer systems during storm events and their associated uncertainties. The following steps are described: (i) sensor calibration, (ii) estimation of data uncertainties, (iii) correction of raw data, (iv) data pre-validation tests, (v) final validation, and (vi) calculation of TSS and COD event loads and estimation of their uncertainties. These steps have been implemented in an integrated software tool. Examples of results are given for a set of 33 storm events monitored in a stormwater separate sewer system.
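
    Step (vi) above can be sketched as a discrete sum of discharge times concentration over the event, with a simple Monte Carlo propagation of the standard uncertainties of the two series. The two-minute time step, the Gaussian and uncorrelated error model, and the function names are simplifying assumptions; the paper's method propagates uncertainties established in the earlier calibration and validation steps.

      import numpy as np

      def event_load_kg(q_m3_s, tss_mg_l, dt_s=120.0):
          """Event load in kg: mg/L equals g/m3, so sum(Q*C*dt) is in g; divide by 1000."""
          return float(np.sum(np.asarray(q_m3_s) * np.asarray(tss_mg_l)) * dt_s / 1000.0)

      def event_load_uncertainty(q_m3_s, tss_mg_l, u_q, u_c, dt_s=120.0, n_mc=2000, seed=0):
          """Monte Carlo mean and standard uncertainty of the event load."""
          rng = np.random.default_rng(seed)
          q = np.asarray(q_m3_s, float)
          c = np.asarray(tss_mg_l, float)
          loads = [event_load_kg(q + rng.normal(0.0, u_q, q.size),
                                 c + rng.normal(0.0, u_c, c.size), dt_s)
                   for _ in range(n_mc)]
          return float(np.mean(loads)), float(np.std(loads, ddof=1))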

  19. Segmenting Continuous Motions with Hidden Semi-markov Models and Gaussian Processes

    PubMed Central

    Nakamura, Tomoaki; Nagai, Takayuki; Mochihashi, Daichi; Kobayashi, Ichiro; Asoh, Hideki; Kaneko, Masahide

    2017-01-01

    Humans divide perceived continuous information into segments to facilitate recognition. For example, humans can segment speech waves into recognizable morphemes. Analogously, continuous motions are segmented into recognizable unit actions. People can divide continuous information into segments without using explicit segment points. This capacity for unsupervised segmentation is also useful for robots, because it enables them to flexibly learn languages, gestures, and actions. In this paper, we propose a Gaussian process-hidden semi-Markov model (GP-HSMM) that can divide continuous time series data into segments in an unsupervised manner. Our proposed method consists of a generative model based on the hidden semi-Markov model (HSMM), the emission distributions of which are Gaussian processes (GPs). Continuous time series data is generated by connecting segments generated by the GP. Segmentation can be achieved by using forward filtering-backward sampling to estimate the model's parameters, including the lengths and classes of the segments. In an experiment using the CMU motion capture dataset, we tested GP-HSMM with motion capture data containing simple exercise motions; the results of this experiment showed that the proposed GP-HSMM was comparable with other methods. We also conducted an experiment using karate motion capture data, which is more complex than exercise motion capture data; in this experiment, the segmentation accuracy of GP-HSMM was 0.92, which outperformed other methods. PMID:29311889

  20. Long-term stormwater quantity and quality analysis using continuous measurements in a French urban catchment.

    PubMed

    Sun, Siao; Barraud, Sylvie; Castebrunet, Hélène; Aubin, Jean-Baptiste; Marmonier, Pierre

    2015-11-15

    The assessment of urban stormwater quantity and quality is important for evaluating and controlling the impact of the stormwater to natural water and environment. This study mainly addresses long-term evolution of stormwater quantity and quality in a French urban catchment using continuous measured data from 2004 to 2011. Storm event-based data series are obtained (716 rainfall events and 521 runoff events are available) from measured continuous time series. The Mann-Kendall test is applied to these event-based data series for trend detection. A lack of trend is found in rainfall and an increasing trend in runoff is detected. As a result, an increasing trend is present in the runoff coefficient, likely due to growing imperviousness of the catchment caused by urbanization. The event mean concentration of the total suspended solid (TSS) in stormwater does not present a trend, whereas the event load of TSS has an increasing tendency, which is attributed to the increasing event runoff volume. Uncertainty analysis suggests that the major uncertainty in trend detection results lies in uncertainty due to available data. A lack of events due to missing data leads to dramatically increased uncertainty in trend detection results. In contrast, measurement uncertainty in time series data plays a trivial role. The intra-event distribution of TSS is studied based on both M(V) curves and pollutant concentrations of absolute runoff volumes. The trend detection test reveals no significant change in intra-event distributions of TSS in the studied catchment. Copyright © 2015 Elsevier Ltd. All rights reserved.
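
    For reference, a minimal implementation of the Mann-Kendall trend test applied above to the event-based series is sketched below (without the tie or serial-correlation corrections a full analysis would consider). The use of scipy for the normal distribution is an assumption.

      import numpy as np
      from scipy.stats import norm

      def mann_kendall(series):
          """Return the Mann-Kendall S statistic, normal score Z and two-sided p-value."""
          x = np.asarray(series, float)
          n = len(x)
          s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
          var_s = n * (n - 1) * (2 * n + 5) / 18.0
          if s > 0:
              z = (s - 1) / np.sqrt(var_s)
          elif s < 0:
              z = (s + 1) / np.sqrt(var_s)
          else:
              z = 0.0
          p = 2.0 * (1.0 - norm.cdf(abs(z)))
          return float(s), float(z), float(p)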

  1. Using continuous monitoring of physical parameters to better estimate phosphorus fluxes in a small agricultural catchment

    NASA Astrophysics Data System (ADS)

    Minaudo, Camille; Dupas, Rémi; Moatar, Florentina; Gascuel-Odoux, Chantal

    2016-04-01

    Phosphorus fluxes in streams are subject to high temporal variations, which raises the question of whether the monitoring strategies (generally monthly sampling) used to support EU Directives can capture phosphorus fluxes and their variations over time. The objective of this study was to estimate the annual and seasonal P flux uncertainties associated with several monitoring strategies, with varying sampling frequencies, but also taking into account simultaneous and continuous time-series of parameters such as turbidity, conductivity, groundwater level and precipitation. Total Phosphorus (TP), Soluble Reactive Phosphorus (SRP) and Total Suspended Solids (TSS) concentrations were surveyed at a fine temporal frequency between 2007 and 2015 at the outlet of a small agricultural catchment in Brittany (Naizin, 5 km2). Sampling occurred every 3 to 6 days between 2007 and 2012 and daily between 2013 and 2015. Additionally, 61 storms have been intensively surveyed (1 sample every 30 minutes) since 2007. Furthermore, water discharge, turbidity, conductivity, groundwater level and precipitation were monitored on a sub-hourly basis. A strong temporal decoupling between SRP and particulate P (PP) was found (Dupas et al., 2015). The phosphorus-discharge relationships displayed two types of hysteretic patterns (clockwise and counterclockwise). For both cases, time-series of PP and SRP were estimated continuously for the whole period using an empirical model linking P concentrations with the hydrological and physico-chemical variables. The associated errors of the estimated P concentrations were also assessed. These "synthetic" PP and SRP time-series allowed us to discuss the most efficient monitoring strategies, first taking into account different sampling strategies based on Monte Carlo random simulations, and then adding the information from continuous data such as turbidity, conductivity and groundwater depth based on empirical modelling. Reference: Dupas et al. (2015), Distinct export dynamics for dissolved and particulate phosphorus reveal independent transport mechanisms in an arable headwater catchment, Hydrological Processes, 29(14), 3162-3178.
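
    The Monte Carlo sampling-strategy experiment described above can be sketched as follows: subsample a high-frequency reference concentration series at a fixed interval with random start offsets, reconstruct the load with a simple interpolation estimator, and summarise the relative error distribution. The interpolation estimator, variable names and percentile summary are assumptions for illustration.

      import numpy as np

      def subsampling_flux_errors(q, c, step, n_mc=500, seed=0):
          """q, c: discharge and concentration arrays at the fine time step;
          step: sampling interval in number of fine steps. Returns the 5th, 50th
          and 95th percentiles of the relative load error."""
          rng = np.random.default_rng(seed)
          q, c = np.asarray(q, float), np.asarray(c, float)
          t = np.arange(len(c))
          true_load = float(np.sum(q * c))
          errors = []
          for _ in range(n_mc):
              start = int(rng.integers(0, step))
              sampled = t[start::step]
              c_est = np.interp(t, sampled, c[sampled])   # concentration between samples
              errors.append((float(np.sum(q * c_est)) - true_load) / true_load)
          return np.percentile(errors, [5, 50, 95])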

  2. Examining Submarine Ground-Water Discharge into Florida Bay by using 222Rn and Continuous Resistivity Profiling

    USGS Publications Warehouse

    Swarzenski, Peter; Reich, Chris; Rudnick, David

    2009-01-01

    Estimates of submarine ground-water discharge (SGD) into Florida Bay remain one of the least understood components of a regional water balance. To quantify the magnitude and seasonality of SGD into upper Florida Bay, research activities included the use of the natural geochemical tracer, 222Rn, to examine potential SGD hotspots (222Rn surveys) and to quantify the total (saline + fresh water component) SGD rates at select sites (222Rn time-series). To obtain a synoptic map of the 222Rn distribution within our study site in Florida Bay, we set up a flow-through system on a small boat that consisted of a Differential Global Positioning System, a calibrated YSI, Inc. CTD sensor with a sampling interval of 0.5 min, and a submersible pump (z = 0.5 m) that continuously fed water into an air/water exchanger that was plumbed simultaneously into four RAD7 222Rn air monitors. To obtain local advective ground-water flux estimates, 222Rn time-series experiments were deployed at strategic positions across hydrologic and geologic gradients within our study site. These time-series stations consisted of a submersible pump, a Solinst DIVER (to record continuous CTD parameters) and two RAD7 222Rn air monitors plumbed into an air/water exchanger. Repeat time-series 222Rn measurements were conducted for 3-4 days across several tidal excursions. Radon was also measured in the air during each sampling campaign by a dedicated RAD7. We obtained ground-water discharge information by calculating a 222Rn mass balance that accounted for lateral and horizontal exchange, as well as an appropriate ground-water 222Rn end member activity. Another research component utilized marine continuous resistivity profiling (CRP) surveys to examine the subsurface salinity structure within Florida Bay sediments. This system consisted of an AGI SuperSting 8 channel receiver attached to a streamer cable that had two current (A,B) electrodes and nine potential electrodes that were spaced 10 m apart. A separate DGPS continuously sent position information to the SuperSting. Results indicate that the 222Rn maps provide a useful gauge of relative ground-water discharge into upper Florida Bay. The 222Rn time-series measurements provide a reasonable estimate of site-specific total (saline and fresh) ground-water discharge (mean = 12.5 ± 11.8 cm d-1), while the saline nature of the shallow ground-water at our study site, as evidenced by CRP results, indicates that most of this discharge must be recycled sea water. The CRP data show some interesting trends that appear to be consistent with subsurface geologic and hydrologic characterization. For example, some of the highest resistivity (the inverse of electrical conductivity) values were recorded where one would expect a slight subsurface freshening (for example bayside Key Largo, or below the C111 canal).

  3. Toward a Global Horizontal and Vertical Elastic Load Deformation Model Derived from GRACE and GNSS Station Position Time Series

    NASA Astrophysics Data System (ADS)

    Chanard, Kristel; Fleitout, Luce; Calais, Eric; Rebischung, Paul; Avouac, Jean-Philippe

    2018-04-01

    We model surface displacements induced by variations in continental water, atmospheric pressure, and nontidal oceanic loading, derived from the Gravity Recovery and Climate Experiment (GRACE) for spherical harmonic degrees two and higher. Because they are not observable by GRACE, we initially use the degree-1 spherical harmonic coefficients from Swenson et al. (2008, https://doi.org/10.1029/2007JB005338). We compare the predicted displacements with the position time series of 689 globally distributed continuous Global Navigation Satellite System (GNSS) stations. While GNSS vertical displacements are well explained by the model at a global scale, horizontal displacements are systematically underpredicted and out of phase with GNSS station position time series. We then reestimate the degree-1 deformation field from a comparison between our GRACE-derived model, with no a priori degree-1 loads, and the GNSS observations. We show that this approach reconciles GRACE-derived loading displacements and GNSS station position time series at a global scale, particularly in the horizontal components. Assuming that they reflect surface loading deformation only, our degree-1 estimates can be translated into geocenter motion time series. We also assess the impact of systematic errors in GNSS station position time series at the Global Positioning System (GPS) draconitic period and its harmonics on the comparison between GNSS and GRACE-derived annual displacements. Our results confirm that surface mass redistributions observed by GRACE, combined with an elastic spherical and layered Earth model, can be used to provide first-order corrections for loading deformation observed in both horizontal and vertical components of GNSS station position time series.
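
    The comparison of annual displacement amplitude and phase mentioned above can be illustrated with a simple least-squares annual harmonic fit. The sketch below uses synthetic series and is not the authors' processing chain; the station values and noise levels are assumptions.

    ```python
    # Sketch: least-squares annual harmonic fit used to compare the amplitude and phase
    # of a loading-model displacement series with a GNSS position series (synthetic data).
    import numpy as np

    t = np.arange(0, 6, 1 / 365.25)                     # 6 years of daily epochs (years)
    rng = np.random.default_rng(0)
    gnss = 3.0 * np.cos(2*np.pi*(t - 0.15)) + rng.normal(0, 0.8, t.size)   # mm, "observed"
    model = 2.6 * np.cos(2*np.pi*(t - 0.12))                               # mm, "GRACE-derived"

    def annual_fit(y, t):
        """Return (amplitude, phase in years) of the annual term by linear least squares."""
        A = np.column_stack([np.ones_like(t), t,
                             np.cos(2*np.pi*t), np.sin(2*np.pi*t)])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        return np.hypot(coef[2], coef[3]), np.arctan2(coef[3], coef[2]) / (2*np.pi)

    for name, series in (("GNSS", gnss), ("loading model", model)):
        amp, phase = annual_fit(series, t)
        print(f"{name:13s}: annual amplitude {amp:.2f} mm, phase {phase:+.3f} yr")
    ```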

  4. Production and Uses of Multi-Decade Geodetic Earth Science Data Records

    NASA Astrophysics Data System (ADS)

    Bock, Y.; Kedar, S.; Moore, A. W.; Fang, P.; Liu, Z.; Sullivan, A.; Argus, D. F.; Jiang, S.; Marshall, S. T.

    2017-12-01

    The Solid Earth Science ESDR System (SESES) project funded under the NASA MEaSUREs program produces and disseminates mature, long-term, calibrated and validated, GNSS-based Earth Science Data Records (ESDRs) that encompass multiple diverse areas of interest in Earth Science, such as tectonic motion, transient slip and earthquake dynamics, as well as meteorology, climate, and hydrology. The ESDRs now span twenty-five years for the earliest stations and today are available for thousands of global and regional stations. Using a unified metadata database and a combination of GNSS solutions generated by two independent analysis centers, the project currently produces four long-term ESDRs: Geodetic Displacement Time Series: Daily, combined, cleaned and filtered, GIPSY and GAMIT long-term time series of continuous GPS station positions (global and regional) in the latest version of ITRF, automatically updated weekly. Geodetic Velocities: Weekly updated velocity field + velocity field histories in various reference frames; compendium of all model parameters including earthquake catalog, coseismic offsets, and postseismic model parameters (exponential or logarithmic). Troposphere Delay Time Series: Long-term time series of troposphere delay (30-min resolution) at geodetic stations, necessarily estimated during position time series production and automatically updated weekly. Seismogeodetic records for historic earthquakes: High-rate broadband displacement and seismic velocity time series combining 1 Hz GPS displacements and 100 Hz accelerometer data for select large earthquakes and collocated cGPS and seismic instruments from regional networks. We present several recent notable examples of ESDR usage: a transient slip study that uses the combined position time series to unravel "tremor-less" slow tectonic transient events; fault geometry determination from geodetic slip rates; changes in water resources across California's physiographic provinces at a spatial resolution of 75 km; and a retrospective study of a southern California summer monsoon event.

  5. Noise analysis of GPS time series in Taiwan

    NASA Astrophysics Data System (ADS)

    Lee, You-Chia; Chang, Wu-Lung

    2017-04-01

    The Global Positioning System (GPS) is widely used in studies of plate tectonics and crustal deformation. Most studies have treated GPS time series as containing only time-independent noise (white noise), but time-dependent noise (flicker noise, random walk noise), identified over nearly twenty years of observations, is also important for assessing data precision. Station rate uncertainties will be underestimated if GPS time series are assumed to contain only time-independent noise. Studying the noise properties of GPS time series is therefore necessary to assess the precision and reliability of velocity estimates. Our GPS time series come from over 500 stations around Taiwan, with time spans ranging from 2.5 years up to 20 years. The GPS stations include different monument types such as deep drill braced, roof, metal tripod, and concrete pier; the most common type in Taiwan is the metal tripod. We investigated the noise properties of continuous GPS time series by using the spectral index and amplitude of the power-law noise. We first remove data outliers, then estimate the linear trend, the size of offsets, and seasonal signals, and finally estimate the amplitudes of the power-law and white noise simultaneously. Our preliminary results show that the noise amplitudes of the north component are smaller than those of the other two components, and that the largest amplitudes are in the vertical. We also find that the amplitudes of white noise and power-law noise are positively correlated in all three components. Comparisons of noise amplitudes of different monument types in Taiwan reveal that the deep drill braced monuments have smaller data uncertainties and are therefore more stable than other monuments.
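
    A rough feel for the spectral index can be obtained from the log-log slope of the periodogram, as sketched below. This is only an illustration on synthetic data; studies of this kind normally use maximum-likelihood estimation of combined white and power-law noise rather than a simple slope fit.

    ```python
    # Rough illustration: estimate a power-law spectral index from the low-frequency
    # slope of a periodogram. Synthetic series; not a substitute for ML noise estimation.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 2048
    white = rng.normal(0, 1.0, n)
    correlated = np.cumsum(rng.normal(0, 0.05, n))      # crude correlated-noise stand-in
    series = white + correlated

    freqs = np.fft.rfftfreq(n, d=1.0)[1:]               # cycles per day, skip zero frequency
    power = np.abs(np.fft.rfft(series))[1:] ** 2 / n

    band = freqs < 0.05                                  # band where power-law noise dominates
    kappa = np.polyfit(np.log(freqs[band]), np.log(power[band]), 1)[0]
    print(f"estimated spectral index ~ {kappa:.2f} (0 = white, -1 = flicker, -2 = random walk)")
    ```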

  6. GPS data exploration for seismologists and geodesists

    NASA Astrophysics Data System (ADS)

    Webb, F.; Bock, Y.; Kedar, S.; Dong, D.; Jamason, P.; Chang, R.; Prawirodirdjo, L.; MacLeod, I.; Wadsworth, G.

    2007-12-01

    Over the past decade, GPS and seismic networks spanning the western US plate boundaries have produced vast amounts of data that need to be made accessible to both the geodesy and seismology communities. Unlike seismic data, raw geodetic data require significant processing before geophysical interpretations can be made. This requires the generation of data products (time series, velocities and strain maps) and dissemination strategies to bridge these differences and assure efficient use of data across traditionally separate communities. "GPS DATA PRODUCTS FOR SOLID EARTH SCIENCE" (GDPSES) is a multi-year NASA-funded project, designed to produce and deliver high quality GPS time series, velocities, and strain fields, derived from multiple GPS networks along the western US plate boundary, and to make these products easily accessible to geophysicists. Our GPS product dissemination is through modern web-based IT methodology. Product browsing is facilitated through a web tool known as GPS Explorer, and continuous streams of GPS time series are provided using web services to the seismic archive, where they can be accessed by seismologists using traditional seismic data viewing and manipulation tools. GPS Explorer enables users to efficiently browse several layers of data products, from raw data through time series, velocities and strain, by providing a web interface that seamlessly interacts with a continuously updated database of these data products through the use of web services. The current archive contains GDPSES data products beginning in 1995, and includes observations from GPS stations in EarthScope's Plate Boundary Observatory (PBO), as well as from real-time CGPS stations. The generic, standards-based approach used in this project enables GDPSES to expand seamlessly and indefinitely to include other space-time-dependent data products from additional GPS networks. The prototype GPS Explorer provides users with a personalized working environment in which the user may zoom in and access subsets of the data via web services. It provides users with a variety of interactive web tools interconnected in a portlet environment to explore and save datasets of interest to return to at a later date. At the same time the GPS time series are also made available through the seismic data archive, where the GPS networks are treated as regular seismic networks whose data are made available in data formats used by seismic utilities such as SEED readers and SAC. A key challenge, stemming from the fundamental differences between seismic and geodetic time series, is the representation of reprocessed GPS data in the seismic archive. As GPS processing algorithms evolve and their accuracy increases, a periodic complete recreation of the GPS time series archive is necessary.

  7. New Perspectives on Compensation Strategies for the Out-of-School Time Workforce. Working Paper Series. Report.

    ERIC Educational Resources Information Center

    Morgan, Gwen; Harvey, Brooke

    Noting that the quality, continuity, and stability of out-of-school time programs depend, in part, on the presence of a well-trained and fairly compensated staff, this paper examines the unique characteristics of the out-of-school time workforce that contribute to inadequate compensation and explores workforce compensation from an economic…

  8. Agreement evaluation of AVHRR and MODIS 16-day composite NDVI data sets

    USGS Publications Warehouse

    Ji, Lei; Gallo, Kevin P.; Eidenshink, Jeffery C.; Dwyer, John L.

    2008-01-01

    Satellite-derived normalized difference vegetation index (NDVI) data have been used extensively to detect and monitor vegetation conditions at regional and global levels. A combination of NDVI data sets derived from AVHRR and MODIS can be used to construct a long NDVI time series that may also be extended to VIIRS. Comparative analysis of NDVI data derived from AVHRR and MODIS is critical to understanding the data continuity through the time series. In this study, the AVHRR and MODIS 16-day composite NDVI products were compared using regression and agreement analysis methods. The analysis shows a high agreement between the AVHRR-NDVI and MODIS-NDVI observed in 2002 and 2003 for the conterminous United States, but the difference between the two data sets is appreciable. Twenty per cent of the total difference between the two data sets is due to systematic difference, with the remainder due to unsystematic difference. The systematic difference can be eliminated with a linear regression-based transformation between the two data sets, and the unsystematic difference can be reduced partially by applying spatial filters to the data. We conclude that the continuity of NDVI time series from AVHRR to MODIS is satisfactory, but a linear transformation between the two data sets is recommended.
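
    The regression-based transformation amounts to fitting a linear relation between paired composites and applying it to one sensor's values. A minimal sketch with synthetic NDVI values (not the study's data) is shown below.

    ```python
    # Minimal sketch of a regression-based transformation: fit a linear relation between
    # paired AVHRR and MODIS NDVI values and use it to remove the systematic difference.
    import numpy as np

    rng = np.random.default_rng(7)
    modis = rng.uniform(0.1, 0.9, 500)                        # "reference" NDVI
    avhrr = 0.05 + 0.9 * modis + rng.normal(0, 0.03, 500)     # biased, noisier counterpart

    slope, intercept = np.polyfit(avhrr, modis, 1)            # AVHRR -> MODIS transformation
    avhrr_adjusted = intercept + slope * avhrr

    rmsd_before = np.sqrt(np.mean((avhrr - modis) ** 2))
    rmsd_after = np.sqrt(np.mean((avhrr_adjusted - modis) ** 2))
    print(f"RMSD before: {rmsd_before:.3f}, after linear transformation: {rmsd_after:.3f}")
    ```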

  9. [Financial protection in health: updates for Mexico to 2014].

    PubMed

    Knaul, Felicia Marie; Arreola-Ornelas, Héctor; Méndez-Carniado, Oscar

    2016-06-01

    Objective: To document financial protection in health in Mexico up to 2014. We update the measures of impoverishing and catastrophic health expenditure to 2014 and analyse shifts since the implementation of the System for Social Protection in Health and the Seguro Popular, using time series data from the Household Income and Expenditure Survey. Between 2004 and 2014 there has been a continued improvement in levels of financial protection. Excessive expenditure reached its lowest point of 2.0% in 2012 and stood at 2.1% in 2014. Impoverishing expenditure dropped from 1.3% in 2004 to 0.5% in 2014, and catastrophic expenditure from 2.7% to 2.1%. The time series data on financial protection show a clear pattern of improvement between 2000 and 2014, levelling off at low levels in 2012 and 2014. Still, levels continue to be relatively high for households in the poorest quintile, in rural areas and with an elderly person.

  10. Chinese Communicating in the Culture Performance 2

    ERIC Educational Resources Information Center

    Walker, Galal

    2005-01-01

    This is the second text in a series of Mandarin Chinese learning texts. It continues with the theme of learning to communicate in various forms, especially with time and location, and the hanzi writing system is introduced. An MP3 file accompanies this book. Contents include: (1) Acknowledgments; (2) Introduction; (3) Unit Three, Time When:…

  11. Deep learning on temporal-spectral data for anomaly detection

    NASA Astrophysics Data System (ADS)

    Ma, King; Leung, Henry; Jalilian, Ehsan; Huang, Daniel

    2017-05-01

    Detecting anomalies is important for continuous monitoring of sensor systems. One significant challenge is to use sensor data and autonomously detect changes that cause different conditions to occur. Using deep learning methods, we are able to monitor and detect changes as a result of some disturbance in the system. We utilize deep neural networks for sequence analysis of time series. We use a multi-step method for anomaly detection. We train the network to learn spectral and temporal features from the acoustic time series. We test our method using fiber-optic acoustic data from a pipeline.
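
    A much simpler, non-deep baseline conveys the spectral-temporal pipeline the abstract describes: frame the acoustic signal, compute per-frame spectra, and flag frames that deviate from a model of "normal" behaviour. The sketch below is an assumption-laden stand-in (synthetic signal, z-score scoring), not the authors' neural network.

    ```python
    # Simplified spectral-feature anomaly detector (baseline, not the authors' deep model):
    # frame the signal, take per-frame spectra, and flag frames far from the "normal" spectrum.
    import numpy as np

    rng = np.random.default_rng(3)
    fs, n = 1000, 20000
    t = np.arange(n) / fs
    signal = np.sin(2*np.pi*50*t) + 0.1*rng.normal(size=n)
    signal[15000:15500] += 0.8*np.sin(2*np.pi*180*t[15000:15500])   # injected disturbance

    frame, hop = 256, 128
    frames = np.stack([signal[i:i+frame] for i in range(0, n - frame, hop)])
    spectra = np.abs(np.fft.rfft(frames * np.hanning(frame), axis=1))

    train = spectra[:100]                                   # assume early frames are normal
    mean, std = train.mean(axis=0), train.std(axis=0) + 1e-9
    score = np.abs((spectra - mean) / std).mean(axis=1)     # mean |z-score| per frame

    threshold = score[:100].mean() + 5 * score[:100].std()
    anomalous = np.where(score > threshold)[0]
    print("anomalous frames start at samples:", anomalous * hop)
    ```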

  12. Ordinary kriging as a tool to estimate historical daily streamflow records

    USGS Publications Warehouse

    Farmer, William H.

    2016-01-01

    Efficient and responsible management of water resources relies on accurate streamflow records. However, many watersheds are ungaged, limiting the ability to assess and understand local hydrology. Several tools have been developed to alleviate this data scarcity, but few provide continuous daily streamflow records at individual streamgages within an entire region. Building on the history of hydrologic mapping, ordinary kriging was extended to predict daily streamflow time series on a regional basis. Pooling parameters to estimate a single, time-invariant characterization of spatial semivariance structure is shown to produce accurate reproduction of streamflow. This approach is contrasted with a time-varying series of variograms, representing the temporal evolution and behavior of the spatial semivariance structure. Furthermore, the ordinary kriging approach is shown to produce more accurate time series than more common, single-index hydrologic transfers. A comparison between topological kriging and ordinary kriging is less definitive, showing the ordinary kriging approach to be significantly inferior in terms of Nash–Sutcliffe model efficiencies while maintaining significantly superior performance measured by root mean squared errors. Given the similarity of performance and the computational efficiency of ordinary kriging, it is concluded that ordinary kriging is useful for first-order approximation of daily streamflow time series in ungaged watersheds.
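
    The core computation is the ordinary kriging system built from a fitted, time-invariant variogram: solve once for the weights at the ungaged site, then apply the same weights to each day's (normalized) flows. The coordinates, variogram parameters, and flow values below are invented for illustration.

    ```python
    # Sketch of ordinary kriging with a pooled, time-invariant exponential variogram.
    # Synthetic sites and flows; not the study's data or exact formulation.
    import numpy as np

    def exp_variogram(h, sill=1.0, range_=50.0, nugget=0.05):
        return nugget + (sill - nugget) * (1 - np.exp(-3 * h / range_))

    gauges = np.array([[0, 0], [30, 5], [10, 40], [45, 35]], dtype=float)   # km, gaged sites
    target = np.array([20.0, 20.0])                                         # ungaged site

    d_gg = np.linalg.norm(gauges[:, None] - gauges[None, :], axis=2)
    d_gt = np.linalg.norm(gauges - target, axis=1)

    n = len(gauges)
    gamma = exp_variogram(d_gg)
    np.fill_diagonal(gamma, 0.0)               # gamma(0) = 0 by definition
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma
    A[n, n] = 0.0
    b = np.append(exp_variogram(d_gt), 1.0)
    weights = np.linalg.solve(A, b)[:n]        # last unknown is the Lagrange multiplier

    daily_flows = np.array([[1.2, 0.9, 1.5, 1.1],    # day 1, normalized flows at the gauges
                            [2.3, 2.0, 2.8, 2.1]])   # day 2
    print("kriged flows at the ungaged site:", daily_flows @ weights)
    ```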

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piper, Stephen C; Keeling, Ralph F

    The main objective of this project was to continue research to develop carbon cycle relationships related to the land biosphere based on remote measurements of atmospheric CO2 concentration and its isotopic ratios 13C/12C, 18O/16O, and 14C/12C. The project continued time-series observations of atmospheric carbon dioxide and isotopic composition begun by Charles D. Keeling at remote sites, including Mauna Loa, the South Pole, and eight other sites. Using models of varying complexity, the concentration and isotopic measurements were used to study long-term change in the interhemispheric gradients in CO2 and 13C/12C to assess the magnitude and evolution of the northern terrestrial carbon sink, to study the increase in amplitude of the seasonal cycle of CO2, to use isotopic data to refine constraints on large scale changes in isotopic fractionation which may be related to changes in stomatal conductance, and to motivate improvements in terrestrial carbon cycle models. The original proposal called for a continuation of the new time series of 14C measurements but subsequent descoping to meet budgetary constraints required termination of measurements in 2007.

  14. Simulating transient dynamics of the time-dependent time fractional Fokker-Planck systems

    NASA Astrophysics Data System (ADS)

    Kang, Yan-Mei

    2016-09-01

    For a physically realistic type of time-dependent time fractional Fokker-Planck (FP) equation, derived as the continuous limit of the continuous time random walk with time-modulated Boltzmann jumping weight, a semi-analytic iteration scheme based on the truncated (generalized) Fourier series is presented to simulate the resultant transient dynamics when the external time modulation is a piece-wise constant signal. At first, the iteration scheme is demonstrated with a simple time-dependent time fractional FP equation on finite interval with two absorbing boundaries, and then it is generalized to the more general time-dependent Smoluchowski-type time fractional Fokker-Planck equation. The numerical examples verify the efficiency and accuracy of the iteration method, and some novel dynamical phenomena including polarized motion orientations and periodic response death are discussed.
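
    The abstract does not reproduce the governing equation. For orientation, a commonly cited generic form of the time fractional Fokker-Planck equation is shown below; this is an assumption on my part, and the paper's time-dependent variant, in which the Boltzmann jumping weight is modulated in time, may differ in detail.

    ```latex
    \frac{\partial P(x,t)}{\partial t}
      = {}_{0}D_{t}^{\,1-\alpha}
        \left[
          \frac{\partial}{\partial x}\,\frac{U'(x)}{\eta_\alpha}
          + K_\alpha \frac{\partial^{2}}{\partial x^{2}}
        \right] P(x,t),
      \qquad 0 < \alpha < 1,
    ```

    where ${}_{0}D_{t}^{1-\alpha}$ is the Riemann-Liouville fractional derivative, $U(x)$ the external potential, $\eta_\alpha$ a generalized friction coefficient, and $K_\alpha$ the generalized diffusion coefficient.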

  15. Cross-recurrence quantification analysis of categorical and continuous time series: an R package

    PubMed Central

    Coco, Moreno I.; Dale, Rick

    2014-01-01

    This paper describes the R package crqa to perform cross-recurrence quantification analysis of two time series of either a categorical or continuous nature. Streams of behavioral information, from eye movements to linguistic elements, unfold over time. When two people interact, such as in conversation, they often adapt to each other, leading these behavioral levels to exhibit recurrent states. In dialog, for example, interlocutors adapt to each other by exchanging interactive cues: smiles, nods, gestures, choice of words, and so on. In order for us to capture closely the goings-on of dynamic interaction, and uncover the extent of coupling between two individuals, we need to quantify how much recurrence is taking place at these levels. Methods available in crqa would allow researchers in cognitive science to pose such questions as how much are two people recurrent at some level of analysis, what is the characteristic lag time for one person to maximally match another, or whether one person is leading another. First, we set the theoretical ground to understand the difference between “correlation” and “co-visitation” when comparing two time series, using an aggregative or cross-recurrence approach. Then, we describe more formally the principles of cross-recurrence, and show with the current package how to carry out analyses applying them. We end the paper by comparing the computational efficiency and the consistency of results of the crqa R package with the benchmark MATLAB toolbox crptoolbox (Marwan, 2013). We show perfect comparability between the two libraries on both levels. PMID:25018736
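
    The package itself is written in R; as a language-neutral illustration of what a cross-recurrence analysis computes (and not the crqa API), the sketch below builds a cross-recurrence matrix for two short categorical sequences and reports the recurrence rate and a diagonal-wise lag profile, which indicates whether one series leads the other. The example sequences are invented.

    ```python
    # Minimal illustration of cross-recurrence for categorical series (not the crqa R API):
    # cross-recurrence matrix, overall recurrence rate, and diagonal-wise lag profile.
    import numpy as np

    a = np.array(list("ABABBCABCABBACAB"))
    b = np.array(list("BABABCABCABBACBA"))

    cr = (a[:, None] == b[None, :]).astype(int)       # cross-recurrence matrix
    recurrence_rate = cr.mean()

    lags = range(-5, 6)
    profile = [np.mean(np.diagonal(cr, offset=k)) for k in lags]

    print(f"recurrence rate = {recurrence_rate:.2f}")
    for k, p in zip(lags, profile):
        print(f"lag {k:+d}: diagonal recurrence {p:.2f}")
    ```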

  16. Satellite Ocean Color: Present Status, Future Challenges

    NASA Technical Reports Server (NTRS)

    Gregg, Watson W.; McClain, Charles R.; Zukor, Dorothy J. (Technical Monitor)

    2001-01-01

    We are midway into our 5th consecutive year of nearly continuous, high quality ocean color observations from space. The Ocean Color and Temperature Scanner/Polarization and Directionality of the Earth's Reflectances (OCTS/POLDER: Nov. 1996 - Jun. 1997), the Sea-viewing Wide Field-of-view Sensor (SeaWiFS: Sep. 1997 - present), and now the Moderate Resolution Imaging Spectrometer (MODIS: Sep. 2000 - present) have provided, and continue to provide, unprecedented views of chlorophyll dynamics on global scales. Global synoptic views of ocean chlorophyll were once a fantasy for ocean color scientists. It took nearly the entire 8-year lifetime of limited Coastal Zone Color Scanner (CZCS) observations to compile seasonal climatologies; now SeaWiFS produces comparably complete fields in about 8 days. For the first time, scientists may observe spatial and temporal variability never before seen in a synoptic context. Even more exciting, we are beginning to plausibly ask questions of interannual variability. We stand at the beginning of a long time series of ocean color, from which we may begin to ask questions of interdecadal variability and climate change. These are the scientific questions being addressed by users of the 18-year Advanced Very High Resolution Radiometer time series with respect to terrestrial processes and ocean temperatures. The nearly 5-year time series of ocean color observations now being constructed, with possibilities of continued observations, can put us on a comparable footing with our terrestrial and physical oceanographic colleagues, and enable us to understand how ocean biological processes contribute to, and are affected by, global climate change.

  17. Assessment of Program Impact Through First Grade, Volume IV: Impact on Teachers. An Evaluation of Project Developmental Continuity. Interim Report X.

    ERIC Educational Resources Information Center

    Wacker, Sally; And Others

    The fourth in a series reporting evaluation findings on the impact of Project Developmental Continuity (PDC), this volume reports treatment-related and other findings concerning teachers and classrooms up to the time the evaluation study's cohort of children had completed grade 1. Begun at 15 sites in 1974 with the purpose of ensuring that…

  18. Assessment of Program Impact Through First Grade, Volume III: Impact on Parents. An Evaluation of Project Developmental Continuity. Interim Report X.

    ERIC Educational Resources Information Center

    Morris, Mary; And Others

    Third in a series of six, this volume reports findings concerning the impact of Project Developmental Continuity (PDC) on the parents of the evaluation study's cohort of children as well as preliminary findings on the relationship between family characteristics and program outcome variables up to the time the children had completed grade 1. Begun…

  19. Assessment of Program Impact Through First Grade, Volume II: Impact on Institutions. An Evaluation of Project Developmental Continuity. Interim Report X.

    ERIC Educational Resources Information Center

    Rosario, Jose; And Others

    As part of a longitudinal study evaluating program effects, this report, the second in a series of six, describes the impact of Project Developmental Continuity (PDC) on the institutional policies and procedures of participating Head Start centers and elementary schools up to the time the evaluation study's cohort of children had completed grade…

  20. Predictive time-series modeling using artificial neural networks for Linac beam symmetry: an empirical study.

    PubMed

    Li, Qiongge; Chan, Maria F

    2017-01-01

    Over half of cancer patients receive radiotherapy (RT) as partial or full cancer treatment. Daily quality assurance (QA) of RT in cancer treatment closely monitors the performance of the medical linear accelerator (Linac) and is critical for continuous improvement of patient safety and quality of care. Cumulative longitudinal QA measurements are valuable for understanding the behavior of the Linac and allow physicists to identify trends in the output and take preventive actions. In this study, artificial neural networks (ANNs) and autoregressive moving average (ARMA) time-series prediction modeling techniques were both applied to 5-year daily Linac QA data. Verification tests and other evaluations were then performed for all models. Preliminary results showed that ANN time-series predictive modeling has more advantages over ARMA techniques for accurate and effective applicability in the dosimetry and QA field. © 2016 New York Academy of Sciences.
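
    For the ARMA side of such a comparison, the essential idea can be illustrated with a bare-bones autoregressive predictor fit by least squares. The sketch below uses a synthetic daily QA series and is a simplified stand-in, not the study's ANN or ARMA implementation.

    ```python
    # Bare-bones AR(p) least-squares predictor applied to a synthetic daily QA series.
    # Simplified stand-in for ARMA modeling; not the study's models or data.
    import numpy as np

    rng = np.random.default_rng(5)
    n = 500
    qa = (100 + 0.002*np.arange(n)
          + 0.5*np.sin(2*np.pi*np.arange(n)/30)
          + rng.normal(0, 0.2, n))                      # fake daily symmetry readings

    p = 7                                               # model order (one week of lags)
    X = np.column_stack([qa[i:n - p + i] for i in range(p)])   # lagged design matrix
    y = qa[p:]
    coef, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(X)), X]), y, rcond=None)

    next_pred = coef[0] + qa[-p:] @ coef[1:]
    print(f"one-step-ahead prediction: {next_pred:.2f} (last observed {qa[-1]:.2f})")
    ```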

  1. VIIRS On-Orbit Calibration for Ocean Color Data Processing

    NASA Technical Reports Server (NTRS)

    Eplee, Robert E., Jr.; Turpie, Kevin R.; Fireman, Gwyn F.; Meister, Gerhard; Stone, Thomas C.; Patt, Frederick S.; Franz, Bryan; Bailey, Sean W.; Robinson, Wayne D.; McClain, Charles R.

    2012-01-01

    The NASA VIIRS Ocean Science Team (VOST) has the task of evaluating Suomi NPP VIIRS ocean color data for the continuity of the NASA ocean color climate data records. The generation of science quality ocean color data products requires an instrument calibration that is stable over time. Since the VIIRS NIR Degradation Anomaly directly impacts the bands used for atmospheric correction of the ocean color data (Bands M6 and M7), the VOST has adapted the VIIRS on-orbit calibration approach to meet the ocean science requirements. The solar diffuser calibration time series and the solar diffuser stability monitor time series have been used to derive changes in the instrument response and diffuser reflectance over time for bands M1-M11.

  2. Deformation history of Mauna Loa (Hawaii) from 2003 to 2014 through InSAR data: understanding the shorter-term processes

    NASA Astrophysics Data System (ADS)

    La Marra, Daniele; Poland, Michael P.; Acocella, Valerio; Battaglia, Maurizio; Miklius, Asta

    2016-04-01

    Geodesy allows the deformation of volcanoes to be detected, and thus magmatic processes to be understood. This becomes particularly effective when time series are available and volcanoes can be monitored over the medium term (decades), and not only during a specific event. Here we exploit the SBAS technique, using SAR images from ENVISAT (descending and ascending orbits; 2003-2010) and COSMO-SkyMed (descending and ascending orbits; 2012-2014), to study a decade of deformation at Mauna Loa (Hawaii). These data are merged with time series from 24 continuously operating GPS stations, which allows us to calibrate the InSAR time series. Our results show a long-term inflation of the volcano from 2003 to 2014, reaching a peak of ~11 cm/yr in the summit area between mid-2004 and mid-2005 and then slowing down. Within this frame, we were able to identify five main periods with approximately linear deformation behavior. The inversion of the deformation data in the first four periods suggests the repeated, though not constant, intrusion of one or more dikes below the summit caldera and the upper Southwest Rift Zone. Moreover, the dike intrusion coincides with minor acceleration of flank slip. Such behavior is distinctive and, with the exception of the nearby Kilauea, has not been observed at any other volcano over the medium term. It is proposed that continuous, though not constant, instability of the SE flank may promote semi-continuous intrusions in a volcano with a ready magma supply.

  3. When Dread Risks Are More Dreadful than Continuous Risks: Comparing Cumulative Population Losses over Time.

    PubMed

    Bodemer, Nicolai; Ruggeri, Azzurra; Galesic, Mirta

    2013-01-01

    People show higher sensitivity to dread risks, rare events that kill many people at once, compared with continuous risks, relatively frequent events that kill many people over a longer period of time. The different reaction to dread risks is often considered a bias: If the continuous risk causes the same number of fatalities, it should not be perceived as less dreadful. We test the hypothesis that a dread risk may have a stronger negative impact on the cumulative population size over time in comparison with a continuous risk causing the same number of fatalities. This difference should be particularly strong when the risky event affects children and young adults who would have produced future offspring if they had survived longer. We conducted a series of simulations, with varying assumptions about population size, population growth, age group affected by risky event, and the underlying demographic model. Results show that dread risks affect the population more severely over time than continuous risks that cause the same number of fatalities, suggesting that fearing a dread risk more than a continuous risk is an ecologically rational strategy.
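
    The core of the simulation argument can be conveyed with a toy model: apply the same total number of fatalities either at once ("dread") or spread over a decade ("continuous"), remove individuals before they can contribute to growth, and compare cumulative population trajectories. All parameter values below are my own assumptions, not the paper's demographic models.

    ```python
    # Toy simulation of the dread-vs-continuous risk comparison (assumed parameters,
    # not the authors' demographic models): same total fatalities, different timing.
    import numpy as np

    def simulate(kind, years=50, pop0=100_000, birth_rate=0.013, death_rate=0.009,
                 fatalities=5_000):
        pop = float(pop0)
        trajectory = []
        for year in range(years):
            if kind == "dread" and year == 0:
                pop -= fatalities                      # all deaths at once
            elif kind == "continuous" and year < 10:
                pop -= fatalities / 10                 # same total, spread over a decade
            pop += pop * (birth_rate - death_rate)     # growth of the survivors
            trajectory.append(pop)
        return np.array(trajectory)

    dread, cont = simulate("dread"), simulate("continuous")
    print(f"population after 50 yr: dread {dread[-1]:,.0f}, continuous {cont[-1]:,.0f}, "
          f"difference {cont[-1] - dread[-1]:,.0f}")
    ```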

  4. High Resolution Time Series of Plankton Communities: From Early Warning of Harmful Blooms to Sentinels of Climate Change

    NASA Astrophysics Data System (ADS)

    Sosik, H. M.; Campbell, L.; Olson, R. J.

    2016-02-01

    The combination of ocean observatory infrastructure and automated submersible flow cytometry provides an unprecedented capability for sustained high resolution time series of plankton, including taxa that are harmful or early indicators of ecosystem response to environmental change. On-going time series produced with the FlowCytobot series of instruments document important ways this challenge is already being met for phytoplankton and microzooplankton. FlowCytobot and Imaging FlowCytobot use a combination of laser-based scattering and fluorescence measurements and video imaging of individual particles to enumerate and characterize cells ranging from picocyanobacteria to large chain-forming diatoms. Over a decade of observations at the Martha's Vineyard Coastal Observatory (MVCO), a cabled facility on the New England Shelf, have been compiled from repeated instrument deployments, typically 6 months or longer in duration. These multi-year high resolution (hourly to daily) time series are providing new insights into the dynamics of community structure such as blooms, seasonality, and multi-year trends linked to regional climate-related variables. Similar observations in Texas coastal waters at the Texas Observatory for Algal Succession Time series (TOAST) have repeatedly provided early warning of harmful algal bloom events that threaten human and ecosystem health. As coastal ocean observing systems mature and expand, the continued integration of these types of detailed plankton observations will provide unparalleled information about variability and patterns of change at the base of marine food webs, with direct implications for informed ecosystem-based management.

  5. Hydraulic Fatigue-Testing Machine

    NASA Technical Reports Server (NTRS)

    Hodo, James D.; Moore, Dennis R.; Morris, Thomas F.; Tiller, Newton G.

    1987-01-01

    Fatigue-testing machine applies fluctuating tension to number of specimens at same time. When sample breaks, machine continues to test remaining specimens. Series of tensile tests needed to determine fatigue properties of materials performed more rapidly than in conventional fatigue-testing machine.

  6. An Unmanned Spacecraft Subsystem Cost Model for Advanced Mission Planning

    NASA Technical Reports Server (NTRS)

    Madrid, G.

    1998-01-01

    As a NASA center, the Jet Propulsion Laboratory (JPL) is committed to the concept of developing and launching a continuously improving series of smaller robotic space exploration missions in shorter intervals of time (faster, better, cheaper).

  7. Periodicity and Multi-scale Analysis of Runoff and Sediment Load in the Wulanghe River, Jinsha River

    NASA Astrophysics Data System (ADS)

    Chen, Yiming

    2018-01-01

    Based on annual runoff and sediment data (1959-2014) from the Zongguantian hydrological station, the time-frequency characteristics and periodicities of alternating high- and low-flow conditions were analyzed at multiple time scales using the Morlet continuous wavelet transform (CWT). The primary periods of the runoff and sediment load time series at different time scales were 12, 3 and 26 years, and 18, 13 and 5 years, respectively. The analysis predicts that both series will gradually decrease and remain in a high-flow period for about 8 years (from 2014 to 2022) and 10 years (from 2014 to 2020), respectively.
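
    A minimal sketch of the same kind of analysis, here with a synthetic annual runoff series and the PyWavelets implementation of the Morlet CWT (the real study's data and software are not available here):

    ```python
    # Sketch: Morlet continuous wavelet transform of a synthetic annual runoff series
    # to look for dominant periods. Uses PyWavelets; not the study's data.
    import numpy as np
    import pywt

    years = np.arange(1959, 2015)
    rng = np.random.default_rng(11)
    runoff = (np.sin(2*np.pi*years/12) + 0.5*np.sin(2*np.pi*years/3)
              + 0.3*rng.normal(size=years.size))        # fake 12-yr and 3-yr cycles

    scales = np.arange(1, 32)
    coeffs, freqs = pywt.cwt(runoff, scales, "morl", sampling_period=1.0)  # 1-year sampling
    power = np.abs(coeffs) ** 2

    dominant_period = 1.0 / freqs[power.mean(axis=1).argmax()]
    print(f"dominant period in the synthetic series ~ {dominant_period:.1f} years")
    ```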

  8. Validation of the inverse pulse wave transit time series as surrogate of systolic blood pressure in MVAR modeling.

    PubMed

    Giassi, Pedro; Okida, Sergio; Oliveira, Maurício G; Moraes, Raimes

    2013-11-01

    Short-term cardiovascular regulation mediated by the sympathetic and parasympathetic branches of the autonomic nervous system has been investigated by multivariate autoregressive (MVAR) modeling, providing insightful analysis. MVAR models employ, as inputs, heart rate (HR), systolic blood pressure (SBP) and respiratory waveforms. ECG (from which HR series is obtained) and respiratory flow waveform (RFW) can be easily sampled from the patients. Nevertheless, the available methods for acquisition of beat-to-beat SBP measurements during exams hamper the wider use of MVAR models in clinical research. Recent studies show an inverse correlation between pulse wave transit time (PWTT) series and SBP fluctuations. PWTT is the time interval between the ECG R-wave peak and photoplethysmography waveform (PPG) base point within the same cardiac cycle. This study investigates the feasibility of using inverse PWTT (IPWTT) series as an alternative input to SBP for MVAR modeling of the cardiovascular regulation. For that, HR, RFW, and IPWTT series acquired from volunteers during postural changes and autonomic blockade were used as input of MVAR models. Obtained results show that IPWTT series can be used as input of MVAR models, replacing SBP measurements in order to overcome practical difficulties related to the continuous sampling of the SBP during clinical exams.

  9. Outcomes of an intervention to improve hospital antibiotic prescribing: interrupted time series with segmented regression analysis.

    PubMed

    Ansari, Faranak; Gray, Kirsteen; Nathwani, Dilip; Phillips, Gabby; Ogston, Simon; Ramsay, Craig; Davey, Peter

    2003-11-01

    To evaluate an intervention to reduce inappropriate use of key antibiotics with interrupted time series analysis. The intervention is a policy for appropriate use of Alert Antibiotics (carbapenems, glycopeptides, amphotericin, ciprofloxacin, linezolid, piperacillin-tazobactam and third-generation cephalosporins) implemented through concurrent, patient-specific feedback by clinical pharmacists. Statistical significance and effect size were calculated by segmented regression analysis of interrupted time series of drug use and cost for 2 years before and after the intervention started. Use of Alert Antibiotics increased before the intervention started but decreased steadily for 2 years thereafter. The changes in slope of the time series were 0.27 defined daily doses/100 bed-days per month (95% CI 0.19-0.34) and £1,908 per month (95% CI £1,238-£2,578). The cost of development, dissemination and implementation of the intervention (£20,133) was well below the most conservative estimate of the reduction in cost (£133,296), which is the lower 95% CI of effect size assuming that cost would not have continued to increase without the intervention. However, if use had continued to increase, the difference between predicted and actual cost of Alert Antibiotics was £572,448 (95% CI £435,696-£709,176) over the 24 months after the intervention started. Segmented regression analysis of pharmacy stock data is a simple, practical and robust method for measuring the impact of interventions to change prescribing. The Alert Antibiotic Monitoring intervention was associated with significant decreases in total use and cost in the 2 years after the programme was implemented. In our hospital, the value of the data far exceeded the cost of processing and analysis.

  10. Statistical properties and time-frequency analysis of temperature, salinity and turbidity measured by the MAREL Carnot station in the coastal waters of Boulogne-sur-Mer (France)

    NASA Astrophysics Data System (ADS)

    Kbaier Ben Ismail, Dhouha; Lazure, Pascal; Puillat, Ingrid

    2016-10-01

    In marine sciences, many fields display high variability over a large range of spatial and temporal scales, from seconds to thousands of years. The long time series recorded in this field, at ever increasing sampling frequencies, are often nonlinear, nonstationary, multiscale and noisy. Their analysis faces new challenges and thus requires the implementation of adequate and specific methods. The objective of this paper is to bring time series analysis methods already applied in econometrics, signal processing, health, etc. to the environmental marine domain, to assess their advantages and drawbacks, and to compare classical techniques with more recent ones. Temperature, turbidity and salinity are important quantities for ecosystem studies. The authors here consider the fluctuations of sea level, salinity, turbidity and temperature recorded by the MAREL Carnot system of Boulogne-sur-Mer (France), a moored buoy equipped with physico-chemical measuring devices operating in continuous and autonomous conditions. In order to perform adequate statistical and spectral analyses, it is necessary to know the nature of the considered time series. For this purpose, the stationarity of the series and the occurrence of unit roots are addressed with Augmented Dickey-Fuller tests. For example, harmonic analysis is not relevant for temperature, turbidity and salinity because of their nonstationarity, in contrast to the nearly stationary sea level datasets. To identify the dominant frequencies associated with the dynamics, the large number of data points provided by the sensors enables Fourier spectral analysis. The resulting power spectra show a complex variability and reveal an influence of environmental factors such as tides. However, classical spectral analysis, namely the Blackman-Tukey method, requires not only linear and stationary data but also evenly spaced data, and interpolating the time series introduces numerous artifacts. The Lomb-Scargle algorithm, adapted to unevenly spaced data, is therefore used as an alternative, and the limits of the method are also set out. It was found that beyond 50% of missing measurements, few significant frequencies are detected, several seasonalities are no longer visible, and even a whole range of high frequencies disappears progressively. Furthermore, two time-frequency decomposition methods, namely wavelets and the Hilbert-Huang Transform (HHT), are applied to the entire dataset. Using the Continuous Wavelet Transform (CWT), some properties of the time series are determined. Then, the inertial wave and several low-frequency tidal waves are identified by application of the Empirical Mode Decomposition (EMD). Finally, EMD-based Time Dependent Intrinsic Correlation (TDIC) analysis is applied to examine the correlation between two nonstationary time series.
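
    The point about unevenly spaced data can be illustrated with a Lomb-Scargle periodogram, which recovers a tidal-band peak from a gappy record without interpolation. The sketch below uses a synthetic series (not MAREL Carnot data) and scipy.signal.lombscargle.

    ```python
    # Illustration: Lomb-Scargle periodogram of a gappy (unevenly spaced) series,
    # recovering a ~12.4-hour "tidal" peak without interpolation. Synthetic data.
    import numpy as np
    from scipy.signal import lombscargle

    rng = np.random.default_rng(2)
    t_full = np.arange(0, 60, 1/24)                        # 60 days, hourly sampling (days)
    keep = rng.random(t_full.size) > 0.5                   # ~50% of samples missing
    t = t_full[keep]
    y = np.sin(2*np.pi*t/0.5175) + 0.3*rng.normal(size=t.size)   # ~12.42-h signal + noise

    periods = np.linspace(0.3, 2.0, 2000)                  # candidate periods (days)
    angular_freqs = 2*np.pi / periods
    pgram = lombscargle(t, y - y.mean(), angular_freqs)

    print(f"peak at {periods[pgram.argmax()]*24:.1f} h (expected ~12.4 h)")
    ```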

  11. Spatiotemporal Patterns of Precipitation-Modulated Landslide Deformation From Independent Component Analysis of InSAR Time Series

    NASA Astrophysics Data System (ADS)

    Cohen-Waeber, J.; Bürgmann, R.; Chaussard, E.; Giannico, C.; Ferretti, A.

    2018-02-01

    Long-term landslide deformation is disruptive and costly in urbanized environments. We rely on TerraSAR-X satellite images (2009-2014) and an improved data processing algorithm (SqueeSAR™) to produce an exceptionally dense Interferometric Synthetic Aperture Radar ground deformation time series for the San Francisco East Bay Hills. Independent and principal component analyses of the time series reveal four distinct spatial and temporal surface deformation patterns in the area around Blakemont landslide, which we relate to different geomechanical processes. Two components of time-dependent landslide deformation isolate continuous motion and motion driven by precipitation-modulated pore pressure changes controlled by annual seasonal cycles and multiyear drought conditions. Two components capturing more widespread seasonal deformation separate precipitation-modulated soil swelling from annual cycles that may be related to groundwater level changes and thermal expansion of buildings. High-resolution characterization of landslide response to precipitation is a first step toward improved hazard forecasting.

  12. Calculation of Rate Spectra from Noisy Time Series Data

    PubMed Central

    Voelz, Vincent A.; Pande, Vijay S.

    2011-01-01

    As the resolution of experiments to measure folding kinetics continues to improve, it has become imperative to avoid bias that may come with fitting data to a predetermined mechanistic model. Towards this end, we present a rate spectrum approach to analyze timescales present in kinetic data. Computing rate spectra of noisy time series data via numerical discrete inverse Laplace transform is an ill-conditioned inverse problem, so a regularization procedure must be used to perform the calculation. Here, we show the results of different regularization procedures applied to noisy multi-exponential and stretched exponential time series, as well as data from time-resolved folding kinetics experiments. In each case, the rate spectrum method recapitulates the relevant distribution of timescales present in the data, with different priors on the rate amplitudes naturally corresponding to common biases toward simple phenomenological models. These results suggest an attractive alternative to the “Occam’s razor” philosophy of simply choosing models with the fewest number of relaxation rates. PMID:22095854
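
    The generic rate-spectrum calculation can be sketched as follows: discretize the rate axis on a log grid, build a kernel of decaying exponentials, and recover non-negative amplitudes from the noisy decay by regularized least squares. The damping style and parameter values here are my own choices, not the regularization procedures compared in the paper.

    ```python
    # Generic sketch of a rate-spectrum calculation from a noisy multi-exponential decay:
    # log-spaced rate grid, exponential kernel, non-negative least squares with damping.
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(8)
    t = np.linspace(0, 10, 400)
    signal = 0.7*np.exp(-t/0.5) + 0.3*np.exp(-t/5.0) + rng.normal(0, 0.01, t.size)

    rates = np.logspace(-2, 2, 100)                        # candidate rates (1/time)
    K = np.exp(-np.outer(t, rates))                        # kernel of decaying exponentials

    lam = 0.05                                             # Tikhonov-style damping, my choice
    K_aug = np.vstack([K, lam * np.eye(rates.size)])
    y_aug = np.concatenate([signal, np.zeros(rates.size)])
    amplitudes, _ = nnls(K_aug, y_aug)

    significant = amplitudes > 0.1 * amplitudes.max()
    print("timescales (1/rate) with significant amplitude:",
          np.round(1.0 / rates[significant], 2))
    ```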

  13. Solar signals detected within neutral atmospheric and ionospheric parameters

    NASA Astrophysics Data System (ADS)

    Koucka Knizova, Petra; Georgieva, Katya; Mosna, Zbysek; Kozubek, Michal; Kouba, Daniel; Kirov, Boian; Potuzníkova, Katerina; Boska, Josef

    2018-06-01

    We have analyzed time series of solar data together with atmospheric and ionospheric measurements for solar cycles 19 to 23, according to data availability. For the analyses we have used long-term data with 1-day sampling. By means of the Continuous Wavelet Transform (CWT) we have found common spectral domains within the solar, atmospheric and ionospheric time series. We have further identified periods during which particular pairs of signals show high coherence by applying Wavelet Transform Coherence (WTC). Despite the wide oscillation ranges detected in the CWT spectra of the individual time series, we found only limited domains of high coherence by means of WTC. Wavelet Transform Coherence reveals significant high-power domains with a stable phase difference for periods of 1 month, 2 months, 6 months, 1 year, 2 years and 3-4 years between pairs of solar and atmospheric or ionospheric data. The occurrence of the detected domains varies significantly within a particular solar cycle (SC) and from one cycle to the next, indicating changing solar forcing and/or atmospheric sensitivity with time.

  14. AQUAdexIM: highly efficient in-memory indexing and querying of astronomy time series images

    NASA Astrophysics Data System (ADS)

    Hong, Zhi; Yu, Ce; Wang, Jie; Xiao, Jian; Cui, Chenzhou; Sun, Jizhou

    2016-12-01

    Astronomy has always been, and will continue to be, a data-based science, and astronomers nowadays are faced with increasingly massive datasets; one key problem is to efficiently retrieve the desired cup of data from the ocean. AQUAdexIM, an innovative spatial indexing and querying method, performs highly efficient on-the-fly queries on user request, searching for Time Series Images in existing observation data on the server side and returning only the desired FITS images to users, so users no longer need to download entire datasets to their local machines, which will only become more and more impractical as data sizes keep increasing. Moreover, AQUAdexIM maintains a very low storage space overhead, and its specially designed in-memory index structure enables it to search for Time Series Images of a given area of the sky 10 times faster than Redis, a state-of-the-art in-memory database.

  15. Study of spectro-temporal variation in paleo-climatic marine proxy records using wavelet transformations

    NASA Astrophysics Data System (ADS)

    Pandey, Chhavi P.

    2017-10-01

    Wavelet analysis is a powerful mathematical and computational tool to study periodic phenomena in time series, particularly in the presence of potential frequency changes in time. Continuous wavelet transformation (CWT) provides localised spectral information on the analysed dataset and is particularly useful for studying multiscale, nonstationary processes occurring over finite spatial and temporal domains. In the present work, oxygen-isotope ratios from the planktonic foraminifera species Globigerina bulloides and Globigerinoides ruber, acquired from the broad central plateau of the Maldives ridge in the south-eastern Arabian Sea, have been used as climate proxies. CWT of the time series generated from both biofacies indicates spectro-temporal variation of the natural climatic cycles. The dominant period resembles the period of the Milankovitch glacial-interglacial cycle. Apart from that, various other cycles are present in the time series. The results are in good agreement with the astronomical theory of paleoclimates and can provide better visualisation of the Indian summer monsoon in the context of climate change.

  16. Detection and characterization of lightning-based sources using continuous wavelet transform: application to audio-magnetotellurics

    NASA Astrophysics Data System (ADS)

    Larnier, H.; Sailhac, P.; Chambodut, A.

    2018-01-01

    Atmospheric electromagnetic waves created by global lightning activity contain information about electrical processes of the inner and the outer Earth. Large signal-to-noise ratio events are particularly interesting because they convey information about electromagnetic properties along their path. We introduce a new methodology to automatically detect and characterize lightning-based waves using a time-frequency decomposition obtained through the application of continuous wavelet transform. We focus specifically on three types of sources, namely, atmospherics, slow tails and whistlers, that cover the frequency range 10 Hz to 10 kHz. Each wave has distinguishable characteristics in the time-frequency domain due to source shape and dispersion processes. Our methodology allows automatic detection of each type of event in the time-frequency decomposition thanks to their specific signature. Horizontal polarization attributes are also recovered in the time-frequency domain. This procedure is first applied to synthetic extremely low frequency time-series with different signal-to-noise ratios to test for robustness. We then apply it to real data: three stations of audio-magnetotelluric data acquired in Guadeloupe, an overseas French territory. Most of the analysed atmospherics and slow tails display linear polarization, whereas the analysed whistlers are elliptically polarized. The diversity of lightning activity is finally analysed in an audio-magnetotelluric data processing framework, as used in subsurface prospecting, through estimation of the impedance response functions. We show that audio-magnetotelluric processing results depend mainly on the frequency content of electromagnetic waves observed in processed time-series, with an emphasis on the difference between morning and afternoon acquisition. Our new methodology based on the time-frequency signature of lightning-induced electromagnetic waves allows automatic detection and characterization of events in audio-magnetotelluric time-series, providing the means to assess the quality of response functions obtained through processing.

  17. A 305-year continuous monthly rainfall series for the island of Ireland (1711-2016)

    NASA Astrophysics Data System (ADS)

    Murphy, Conor; Broderick, Ciaran; Burt, Timothy P.; Curley, Mary; Duffy, Catriona; Hall, Julia; Harrigan, Shaun; Matthews, Tom K. R.; Macdonald, Neil; McCarthy, Gerard; McCarthy, Mark P.; Mullan, Donal; Noone, Simon; Osborn, Timothy J.; Ryan, Ciara; Sweeney, John; Thorne, Peter W.; Walsh, Seamus; Wilby, Robert L.

    2018-03-01

    A continuous 305-year (1711-2016) monthly rainfall series (IoI_1711) is created for the Island of Ireland. The post-1850 series draws on an existing quality assured rainfall network for Ireland, while pre-1850 values come from instrumental and documentary series compiled, but not published, by the UK Met Office. The series is evaluated by comparison with independent long-term observations and reconstructions of precipitation, temperature and circulation indices from across the British-Irish Isles. Strong decadal consistency of IoI_1711 with other long-term observations is evident throughout the annual, boreal spring and autumn series. Annually, the most recent decade (2006-2015) is found to be the wettest in over 300 years. The winter series is probably too dry between the 1740s and 1780s, but strong consistency with other long-term observations strengthens confidence from 1790 onwards. The IoI_1711 series has remarkably wet winters during the 1730s, concurrent with a period of strong westerly airflow, glacial advance throughout Scandinavia and near unprecedented warmth in the Central England Temperature record - all consistent with a strongly positive phase of the North Atlantic Oscillation. Unusually wet summers occurred in the 1750s, consistent with proxy (tree-ring) reconstructions of summer precipitation in the region. Our analysis shows that inter-decadal variability of precipitation is much larger than previously thought, while relationships with key modes of climate variability are time-variant. The IoI_1711 series reveals statistically significant multi-centennial trends in winter (increasing) and summer (decreasing) seasonal precipitation. However, given uncertainties in the early winter record, the former finding should be regarded as tentative. The derived record, one of the longest continuous series in Europe, offers valuable insights for understanding multi-decadal and centennial rainfall variability in Ireland, and provides a firm basis for benchmarking other long-term records and reconstructions of past climate. Correlation of Irish rainfall with other parts of Europe increases the utility of the series for understanding historical climate in further regions.

  18. 33 CFR Appendix B to Part 263 - Application of Multiobjective Planning Framework to Continuing Authorities Program

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Authorities Program 1. General. The planning process described in the ER 1105-2-200 series of regulations... at the same time keeping the requirements for information and analyses consistent with the scope of the study, solutions recommended, and the Program completion-time objectives outlined in § 263.18 of...

  19. 33 CFR Appendix B to Part 263 - Application of Multiobjective Planning Framework to Continuing Authorities Program

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Authorities Program 1. General. The planning process described in the ER 1105-2-200 series of regulations... at the same time keeping the requirements for information and analyses consistent with the scope of the study, solutions recommended, and the Program completion-time objectives outlined in § 263.18 of...

  20. 33 CFR Appendix B to Part 263 - Application of Multiobjective Planning Framework to Continuing Authorities Program

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Authorities Program 1. General. The planning process described in the ER 1105-2-200 series of regulations... at the same time keeping the requirements for information and analyses consistent with the scope of the study, solutions recommended, and the Program completion-time objectives outlined in § 263.18 of...

  1. 33 CFR Appendix B to Part 263 - Application of Multiobjective Planning Framework to Continuing Authorities Program

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Authorities Program 1. General. The planning process described in the ER 1105-2-200 series of regulations... at the same time keeping the requirements for information and analyses consistent with the scope of the study, solutions recommended, and the Program completion-time objectives outlined in § 263.18 of...

  2. 33 CFR Appendix B to Part 263 - Application of Multiobjective Planning Framework to Continuing Authorities Program

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Authorities Program 1. General. The planning process described in the ER 1105-2-200 series of regulations... at the same time keeping the requirements for information and analyses consistent with the scope of the study, solutions recommended, and the Program completion-time objectives outlined in § 263.18 of...

  3. Amoco Fabric and Fibers. PLATO Evaluation Series.

    ERIC Educational Resources Information Center

    Dennen, Venessa

    This PLATO (registered) mathematics curriculum was used in a pilot study as a continuing education offering for employees of an Amoco Fabric and Fibers plant in North Carolina. Thirty-eight Amoco employees used the PLATO learning system over a 6-month period, during which time their progress, in terms of grade level mastery and time, in terms of…

  4. Challenges to the New Republic: Prelude to the War of 1812. Public Policy Debate in the Classroom. Choices for the 21st Century Education Project. [Student Guidebook and] Teacher's Resource Book.

    ERIC Educational Resources Information Center

    Kampmeier, Scott

    This 4-day curriculum unit explores U.S. foreign policy between 1787 and 1812. During this time the United States faced a series of foreign policy challenges that threatened its survival as an independent, constitutional republic. Between 1793 and 1815, a nearly continuous series of wars pitting the French against the British engulfed the European…

  5. An Approach To Using All Location Tagged Numerical Data Sets As Continuous Fields With User-Assigned Continuity As A Basis For User-Driven Data Assimilation

    NASA Astrophysics Data System (ADS)

    Vernon, F.; Arrott, M.; Orcutt, J. A.; Mueller, C.; Case, J.; De Wardener, G.; Kerfoot, J.; Schofield, O.

    2013-12-01

    Any approach sophisticated enough to handle a variety of data sources and scales, yet easy enough to promote wide use and mainstream adoption, must address the following mappings: - From the authored domain of observation to the requested domain of interest; - From the authored spatiotemporal resolution to the requested resolution; and - From the representation of data placed on a wide variety of discrete mesh types to the use of those data as a continuous field with a selectable continuity. The Open Geospatial Consortium's (OGC) Reference Model[1], with its direct association with the ISO 19000 series standards, provides a comprehensive foundation to represent all data on any type of mesh structure, aka "Discrete Coverages". The Reference Model also provides the specification for the core operations required to utilize any Discrete Coverage. The FEniCS Project[2] provides a comprehensive model for how to represent the Basis Functions on mesh structures as "Degrees of Freedom" to present discrete data as continuous fields with variable continuity. In this talk, we will present the research and development the OOI Cyberinfrastructure Project is pursuing to integrate these approaches into a comprehensive Application Programming Interface (API) to author, acquire and operate on the broad range of data formulations, from time series, trajectories and tables through to time-variant finite difference grids and finite element meshes.

  6. Phytoplankton production in the Sargasso Sea as determined using optical mooring data

    NASA Technical Reports Server (NTRS)

    Waters, K. J.; Smith, R. C.; Marra, J.

    1994-01-01

    Optical measurements from an untended mooring provide high-frequency observations of in-water optical properties and permit the estimation of important biological parameters continuously as a function of time. A 9-month time series, composed of three separate deployments, of optical data from the BIOWATT 1987 deep-sea mooring located in the oligotrophic waters of the Sargasso Sea at 34 deg N, 70 deg W are presented. These data have been tested using several bio-optical models for the purpose of providing a continuous estimate of phytoplankton productivity. The data are discussed in the context of contemporaneous shipboard observations and for future ocean color satellite observations. We present a continuous estimation of phytoplankton productivity for the 9-month time series. Results from the first 70-day deployment are emphasized to demonstrate the utility of optical observations as proxy measures of biological parameters, to present preliminary analysis, and to compare our bio-optical observations with concurrent physical observations. The bio-optical features show variation in response to physical forcings including diel variations of incident solar irradiance, episodic changes corresponding to wind forcing, variability caused by advective mesoscale eddy events in the vicinity of the mooring, and seasonal variability corresponding to changes in solar radiation, shoaling of the mixed layer depth, and succession of phytoplankton populations.

  7. Determination of recharge fraction of injection water in combined abstraction-injection wells using continuous radon monitoring.

    PubMed

    Lee, Kil Yong; Kim, Yong-Chul; Cho, Soo Young; Kim, Seong Yun; Yoon, Yoon Yeol; Koh, Dong Chan; Ha, Kyucheol; Ko, Kyung-Seok

    2016-12-01

    The recharge fractions of injection water in combined abstraction-injection wells (AIW) were determined using continuous radon monitoring and a radon mass balance model. The recharge system consists of three combined abstraction-injection wells, an observation well, a collection tank, an injection tank, and tubing for heating and transferring used groundwater. Groundwater was abstracted from an AIW and sprayed on the water-curtain heating facility, and then the used groundwater was injected into the same AIW by the recharge system. Radon concentrations of fresh groundwater in the AIWs and of used groundwater in the injection tank were measured continuously using a continuous radon monitoring system. Radon concentrations of fresh groundwater in the AIWs and of used groundwater in the injection tank were in the ranges of 10,830-13,530 Bq/m3 and 1500-5600 Bq/m3, respectively. A simple radon mass balance model was developed to estimate the recharge fraction of used groundwater in the AIWs. The recharge fraction in the three AIWs was in the range of 0.595-0.798. The time series of recharge fraction could be obtained using the continuous radon monitoring system with a simple radon mass balance model. The results revealed that the radon mass balance model using continuous radon monitoring was effective for determining the time series of recharge fractions in AIWs as well as for characterizing the recharge system. Copyright © 2016 Elsevier Ltd. All rights reserved.
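
    As an illustration of the kind of mass balance described above, the following minimal sketch (Python, with hypothetical concentration values chosen within the reported ranges; not the authors' code) computes a recharge fraction from a two-endmember radon mixing model.

      import numpy as np

      def recharge_fraction(c_mixed, c_fresh, c_injected):
          """Two-endmember radon mass balance (illustrative assumption).

          Treats the radon concentration measured in the abstraction-injection
          well (c_mixed) as a linear mixture of fresh groundwater (c_fresh) and
          recharged injection water (c_injected):
              c_mixed = f * c_injected + (1 - f) * c_fresh
          so the recharge fraction is
              f = (c_fresh - c_mixed) / (c_fresh - c_injected).
          All concentrations are in Bq/m3.
          """
          return (c_fresh - c_mixed) / (c_fresh - c_injected)

      # Hypothetical continuous-monitoring values within the ranges reported above
      c_fresh = 12000.0                                # fresh groundwater
      c_injected = 3000.0                              # used groundwater in the injection tank
      c_mixed = np.array([6500.0, 5800.0, 5200.0])     # successive AIW readings

      print(recharge_fraction(c_mixed, c_fresh, c_injected))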

  8. Age-Related Differences in Brain Electrical Activity during Extended Continuous Face Recognition in Younger Children, Older Children and Adults

    ERIC Educational Resources Information Center

    Van Strien, Jan W.; Glimmerveen, Johanna C.; Franken, Ingmar H. A.; Martens, Vanessa E. G.; de Bruin, Eveline A.

    2011-01-01

    To examine the development of recognition memory in primary-school children, 36 healthy younger children (8-9 years old) and 36 healthy older children (11-12 years old) participated in an ERP study with an extended continuous face recognition task (Study 1). Each face of a series of 30 faces was shown randomly six times interspersed with…

  9. Analysis and Forecasting of Shoreline Position

    NASA Astrophysics Data System (ADS)

    Barton, C. C.; Tebbens, S. F.

    2007-12-01

    Analysis of historical shoreline positions on sandy coasts, of the geologic record, and of sea-level rise curves reveals that the dynamics of the underlying processes produce temporal/spatial signals that exhibit power scaling and are therefore self-affine fractals. Self-affine time series signals can be quantified over many orders of magnitude in time and space in terms of persistence, a measure of the degree of correlation between adjacent values in the stochastic portion of a time series. Fractal statistics developed for self-affine time series are used to forecast a probability envelope bounding future shoreline positions. The envelope provides the standard deviation as a function of three variables: persistence, a constant equal to the value of the power spectral density when 1/period equals 1, and the number of time increments. The persistence of a twenty-year time series of the mean-high-water (MHW) shoreline positions was measured for four profiles surveyed at Duck, NC at the Field Research Facility (FRF) by the U.S. Army Corps of Engineers. The four MHW shoreline time series signals are self-affine with persistence ranging between 0.8 and 0.9, which indicates that the shoreline position time series is weakly persistent (where zero is uncorrelated), and has highly varying trends for all time intervals sampled. Forecasts of a probability envelope for future MHW positions are made for the 20 years of record and beyond to 50 years from the start of the data records. The forecasts describe the twenty-year data sets well and indicate that, within a 96% confidence envelope, future decadal MHW shoreline excursions should be within 14.6 m of the position at the start of data collection. This is a stable-oscillatory shoreline. The forecasting method introduced here includes the stochastic portion of the time series. The traditional method of predicting shoreline change, by contrast, reduces the time series to a linear trend line fit to historic shoreline positions and extrapolates it to forecast future positions; the resulting linearly increasing mean breaks the confidence envelope eight years into the future and continues to increase. The traditional method is therefore a poor representation of the observed shoreline position time series and a poor basis for extrapolating future shoreline positions.
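
    The spectral quantities used in the probability-envelope construction can be estimated from an evenly sampled shoreline record along the lines of the following sketch (Python; the synthetic series and the fitting choices are illustrative assumptions, and the envelope formula itself is not reproduced here).

      import numpy as np
      from scipy.signal import welch

      rng = np.random.default_rng(0)
      # Stand-in for a monthly mean-high-water shoreline position record (metres)
      shoreline = np.cumsum(rng.normal(size=240))

      # Power spectral density with frequency in cycles per year (fs = 12 samples/yr)
      freqs, psd = welch(shoreline, fs=12.0, nperseg=120)
      mask = freqs > 0

      # Self-affine signals follow S(f) ~ c * f**(-beta); fit the scaling exponent
      slope, intercept = np.polyfit(np.log10(freqs[mask]), np.log10(psd[mask]), 1)
      beta = -slope                       # spectral exponent, related to persistence
      c_at_unit_freq = 10.0 ** intercept  # PSD value where 1/period equals 1

      print(f"beta = {beta:.2f}, PSD at unit frequency = {c_at_unit_freq:.3g}")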

  10. Time series modeling of live-cell shape dynamics for image-based phenotypic profiling.

    PubMed

    Gordonov, Simon; Hwang, Mun Kyung; Wells, Alan; Gertler, Frank B; Lauffenburger, Douglas A; Bathe, Mark

    2016-01-01

    Live-cell imaging can be used to capture spatio-temporal aspects of cellular responses that are not accessible to fixed-cell imaging. As the use of live-cell imaging continues to increase, new computational procedures are needed to characterize and classify the temporal dynamics of individual cells. For this purpose, here we present the general experimental-computational framework SAPHIRE (Stochastic Annotation of Phenotypic Individual-cell Responses) to characterize phenotypic cellular responses from time series imaging datasets. Hidden Markov modeling is used to infer and annotate morphological state and state-switching properties from image-derived cell shape measurements. Time series modeling is performed on each cell individually, making the approach broadly useful for analyzing asynchronous cell populations. Two-color fluorescent cells simultaneously expressing actin and nuclear reporters enabled us to profile temporal changes in cell shape following pharmacological inhibition of cytoskeleton-regulatory signaling pathways. Results are compared with existing approaches conventionally applied to fixed-cell imaging datasets, and indicate that time series modeling captures heterogeneous dynamic cellular responses that can improve drug classification and offer additional important insight into mechanisms of drug action. The software is available at http://saphire-hcs.org.
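
    A minimal sketch of the hidden-Markov step described above, using the third-party hmmlearn package on hypothetical shape-feature data (this is not the SAPHIRE software itself):

      import numpy as np
      from hmmlearn.hmm import GaussianHMM

      rng = np.random.default_rng(1)
      # Hypothetical per-cell time series of image-derived shape features
      # (rows = frames; columns = e.g. area, eccentricity, protrusiveness)
      features = np.vstack([rng.normal(0.0, 1.0, size=(60, 3)),
                            rng.normal(2.0, 1.0, size=(60, 3))])

      # One HMM per cell: hidden states stand in for morphological states and
      # the transition matrix summarizes state-switching behaviour
      model = GaussianHMM(n_components=2, covariance_type="full",
                          n_iter=200, random_state=0)
      model.fit(features)

      states = model.predict(features)   # frame-by-frame morphological annotation
      print(model.transmat_)             # estimated state-switching probabilities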

  11. Re-analysis of Alaskan benchmark glacier mass-balance data using the index method

    USGS Publications Warehouse

    Van Beusekom, Ashley E.; O'Neel, Shad R.; March, Rod S.; Sass, Louis C.; Cox, Leif H.

    2010-01-01

    At Gulkana and Wolverine Glaciers, designated the Alaskan benchmark glaciers, we re-analyzed and re-computed the mass balance time series from 1966 to 2009 to accomplish our goal of making more robust time series. Each glacier's data record was analyzed with the same methods. For surface processes, we estimated missing information with an improved degree-day model. Degree-day models predict ablation from the sum of daily mean temperatures and an empirical degree-day factor. We modernized the traditional degree-day model and derived new degree-day factors in an effort to match the balance time series more closely. We estimated missing yearly-site data with a new balance gradient method. These efforts showed that an additional step needed to be taken at Wolverine Glacier to adjust for non-representative index sites. As with the previously calculated mass balances, the re-analyzed balances showed a continuing trend of mass loss. We noted that the time series, and thus our estimate of the cumulative mass loss over the period of record, was very sensitive to the data input, and suggest the need to add data-collection sites and modernize our weather stations.
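
    The degree-day relation mentioned above can be written compactly; the sketch below (Python, with an illustrative degree-day factor and synthetic temperatures, not the re-analysis code) sums positive daily mean temperatures and scales them by the empirical factor.

      import numpy as np

      def degree_day_ablation(daily_mean_temp_c, degree_day_factor):
          """Classical degree-day model: ablation (m w.e.) equals the sum of
          positive daily mean temperatures times an empirical factor
          (m w.e. per degree C per day)."""
          positive_degree_days = np.clip(daily_mean_temp_c, 0.0, None).sum()
          return degree_day_factor * positive_degree_days

      # Hypothetical summer season at an index site: 90 daily mean temperatures (deg C)
      rng = np.random.default_rng(2)
      temps = rng.normal(3.0, 4.0, size=90)
      print(degree_day_ablation(temps, degree_day_factor=0.006))  # factor is illustrative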

  12. Time-Series Analysis of Remotely-Sensed SeaWiFS Chlorophyll in River-Influenced Coastal Regions

    NASA Technical Reports Server (NTRS)

    Acker, James G.; McMahon, Erin; Shen, Suhung; Hearty, Thomas; Casey, Nancy

    2009-01-01

    The availability of a nearly-continuous record of remotely-sensed chlorophyll a data (chl a) from the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) mission, now longer than ten years, enables examination of time-series trends for multiple global locations. Innovative data analysis technology available on the World Wide Web facilitates such analyses. In coastal regions influenced by river outflows, chl a is not always indicative of actual trends in phytoplankton chlorophyll due to the interference of colored dissolved organic matter and suspended sediments; significant chl a time-series trends for coastal regions influenced by river outflows may nonetheless be indicative of important alterations of the hydrologic and coastal environment. Chl a time-series analysis of nine marine regions influenced by river outflows demonstrates the simplicity and usefulness of this technique. The analyses indicate that coastal time-series are significantly influenced by unusual flood events. Major river systems in regions with relatively low human impact did not exhibit significant trends. Most river systems with demonstrated human impact exhibited significant negative trends, with the noteworthy exception of the Pearl River in China, which has a positive trend.
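
    A deseasonalized linear-trend test of the kind applied to such records might look like the following sketch (Python; the synthetic monthly chlorophyll series and the simple anomaly-plus-regression recipe are assumptions, not the analysis system used in the study).

      import numpy as np
      from scipy.stats import linregress

      rng = np.random.default_rng(3)
      months = np.arange(120)                       # ten hypothetical years
      chl = (1.0 + 0.3 * np.sin(2 * np.pi * months / 12)
             - 0.002 * months + rng.normal(0, 0.1, months.size))

      # Remove the mean seasonal cycle so the trend is not aliased by seasonality
      climatology = np.array([chl[m::12].mean() for m in range(12)])
      anomaly = chl - climatology[months % 12]

      fit = linregress(months / 12.0, anomaly)      # slope in mg m^-3 per year
      print(f"trend = {fit.slope:.4f} mg m^-3 yr^-1, p = {fit.pvalue:.3f}")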

  13. Clustering Multivariate Time Series Using Hidden Markov Models

    PubMed Central

    Ghassempour, Shima; Girosi, Federico; Maeder, Anthony

    2014-01-01

    In this paper we describe an algorithm for clustering multivariate time series with variables taking both categorical and continuous values. Time series of this type are frequent in health care, where they represent the health trajectories of individuals. The problem is challenging because categorical variables make it difficult to define a meaningful distance between trajectories. We propose an approach based on Hidden Markov Models (HMMs), where we first map each trajectory into an HMM, then define a suitable distance between HMMs and finally proceed to cluster the HMMs with a method based on a distance matrix. We test our approach on a simulated, but realistic, data set of 1,255 trajectories of individuals of age 45 and over, on a synthetic validation set with known clustering structure, and on a smaller set of 268 trajectories extracted from the longitudinal Health and Retirement Survey. The proposed method can be implemented quite simply using standard packages in R and Matlab and may be a good candidate for solving the difficult problem of clustering multivariate time series with categorical variables using tools that do not require advanced statistical knowledge, and therefore are accessible to a wide range of researchers. PMID:24662996
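
    The pipeline of mapping each trajectory to an HMM, computing pairwise distances, and clustering the distance matrix can be sketched as follows (Python with hmmlearn and SciPy; continuous-valued trajectories only, so the categorical-variable handling that motivates the paper is not reproduced).

      import numpy as np
      from hmmlearn.hmm import GaussianHMM
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import squareform

      rng = np.random.default_rng(4)
      # Hypothetical trajectories drawn from three groups
      trajectories = [rng.normal(loc, 1.0, size=(50, 2)) for loc in (0, 0, 3, 3, 6)]

      # Step 1: fit one HMM per trajectory
      models = []
      for traj in trajectories:
          m = GaussianHMM(n_components=2, n_iter=100, random_state=0)
          m.fit(traj)
          models.append(m)

      # Step 2: symmetrized cross log-likelihood distance between fitted HMMs
      n = len(trajectories)
      dist = np.zeros((n, n))
      for i in range(n):
          for j in range(i + 1, n):
              dij = (models[i].score(trajectories[i]) - models[j].score(trajectories[i])) / len(trajectories[i])
              dji = (models[j].score(trajectories[j]) - models[i].score(trajectories[j])) / len(trajectories[j])
              dist[i, j] = dist[j, i] = max(0.5 * (dij + dji), 0.0)

      # Step 3: hierarchical clustering of the distance matrix
      labels = fcluster(linkage(squareform(dist, checks=False), method="average"),
                        t=3, criterion="maxclust")
      print(labels)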

  14. a Landsat Time-Series Stacks Model for Detection of Cropland Change

    NASA Astrophysics Data System (ADS)

    Chen, J.; Chen, J.; Zhang, J.

    2017-09-01

    Global, timely, accurate and cost-effective cropland monitoring at a fine spatial resolution will dramatically improve our understanding of the effects of agriculture on greenhouse gas emissions, food safety, and human health. Time-series remote sensing imagery has shown particular potential for describing land cover dynamics. Traditional change detection techniques are often not capable of detecting land cover changes within time series that are severely influenced by seasonal differences, and are therefore prone to generating pseudo-changes. Here, we introduce and test LTSM (Landsat time-series stacks model), an approach that improves on the previously proposed Continuous Change Detection and Classification (CCDC) method to extract spectral trajectories of land surface change using dense Landsat time-series stacks (LTS). The method is expected to eliminate pseudo-changes caused by seasonally driven phenology. The main idea of the method is that, using all available Landsat 8 images within a year, an LTSM consisting of a two-term harmonic function is estimated iteratively for each pixel in each spectral band. The LTSM then defines change areas by differencing the predicted and observed Landsat images. The LTSM approach was compared with the change vector analysis (CVA) method. The results indicated that the LTSM method correctly detected the "true changes" without overestimating the "false" ones, while CVA identified "true change" pixels along with a large number of "false changes". The detection of change areas achieved an overall accuracy of 92.37%, with a kappa coefficient of 0.676.
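
    A per-pixel, per-band harmonic fit of the kind described above can be sketched as follows (Python; the synthetic observations, the two-term design matrix and the 3-sigma residual flag are illustrative assumptions rather than the exact LTSM/CCDC rules).

      import numpy as np

      def fit_harmonic_model(doy, reflectance, n_harmonics=2):
          """Least-squares fit of a two-term (annual + semi-annual) harmonic model
          to one pixel's reflectance time series."""
          t = 2.0 * np.pi * doy / 365.25
          cols = [np.ones_like(t)]
          for k in range(1, n_harmonics + 1):
              cols += [np.cos(k * t), np.sin(k * t)]
          design = np.column_stack(cols)
          coeffs, *_ = np.linalg.lstsq(design, reflectance, rcond=None)
          return coeffs, design @ coeffs

      # Hypothetical Landsat 8 NIR observations for one pixel within a year
      rng = np.random.default_rng(5)
      doy = np.sort(rng.choice(np.arange(1, 366), size=23, replace=False)).astype(float)
      nir = 0.3 + 0.1 * np.sin(2 * np.pi * doy / 365.25) + rng.normal(0, 0.01, doy.size)

      coeffs, predicted = fit_harmonic_model(doy, nir)
      residuals = nir - predicted
      # Observations departing strongly from the phenology-driven prediction
      change_flag = np.abs(residuals) > 3 * residuals.std()
      print(change_flag.sum(), "candidate change observations")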

  15. Analysis on Difference of Forest Phenology Extracted from EVI and LAI Based on PhenoCams

    NASA Astrophysics Data System (ADS)

    Wang, C.; Jing, L.; Qinhuo, L.

    2017-12-01

    Land surface phenology can compensate for the deficiencies of field observation, with the advantage of capturing the continuous expression of phenology at large scales. However, there is some variability in phenological metrics derived from different satellite time-series data of vegetation parameters. This paper aims at assessing the difference between phenology information extracted from EVI and LAI time series. To achieve this, several web-camera (PhenoCam) sites were selected to analyze the characteristics of MODIS-EVI and MODIS-LAI time series from 2010 to 2014 for different forest types, including evergreen coniferous forest, evergreen broadleaf forest, deciduous coniferous forest and deciduous broadleaf forest. At the same time, satellite-based phenological metrics were extracted using the Logistics algorithm and compared with camera-based phenological metrics. Results show that the SOS and EOS extracted from LAI are close to bud burst and leaf defoliation, respectively, while the SOS and EOS extracted from EVI are close to leaf unfolding and leaf coloring, respectively. Thus the SOS extracted from LAI is earlier than that from EVI, while the EOS extracted from LAI is later than that from EVI at deciduous forest sites. Although the seasonal variation characteristics of evergreen forests are not apparent, significant discrepancies exist between the LAI and EVI time series. In addition, satellite- and camera-based phenological metrics agree well in general, but EVI has a higher correlation with the camera-based canopy greenness (green chromatic coordinate, gcc) than LAI.
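
    One common way to extract a start-of-season metric from a VI time series is a logistic fit to the green-up segment, sketched below (Python; the synthetic EVI values and the inflection-point SOS convention are assumptions and only approximate the Logistics algorithm referred to above).

      import numpy as np
      from scipy.optimize import curve_fit

      def logistic(t, vmin, vmax, k, t0):
          """Logistic green-up model commonly used for phenology extraction."""
          return vmin + (vmax - vmin) / (1.0 + np.exp(-k * (t - t0)))

      # Hypothetical spring segment of an EVI (or LAI) time series, 8-day steps
      doy = np.arange(50, 200, 8, dtype=float)
      rng = np.random.default_rng(6)
      evi = logistic(doy, 0.2, 0.6, 0.08, 120.0) + rng.normal(0, 0.01, doy.size)

      popt, _ = curve_fit(logistic, doy, evi,
                          p0=[evi.min(), evi.max(), 0.05, doy.mean()])
      vmin, vmax, k, t0 = popt
      # Here SOS is taken at the inflection point t0; curvature-based definitions differ slightly
      print(f"estimated SOS around day {t0:.1f}")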

  16. Comparisons of molecular karyotype and RAPD patterns of anuran trypanosome isolates during long-term in vitro cultivation.

    PubMed

    Lun, Z R; Desser, S S

    1996-01-01

    The patterns of random amplified fragments and molecular karyotypes of 12 isolates of anuran trypanosomes continuously cultured in vitro were compared by random amplified polymorphic DNA (RAPD) analysis and pulsed field gradient gel electrophoresis (PFGE). The time interval between preparation of two series of samples was one year. Changes were not observed in the number and size of sharp, amplified fragments of DNA samples from both series examined with the ten primers used. Likewise, changes in the molecular karyotypes were not detected between the two samples of these isolates. These results suggest that the molecular karyotype and the RAPD patterns of the anuran trypanosomes remain stable after being cultured continuously in vitro for one year.

  17. Real time wave forecasting using wind time history and numerical model

    NASA Astrophysics Data System (ADS)

    Jain, Pooja; Deo, M. C.; Latha, G.; Rajendran, V.

    Operational activities in the ocean, such as planning for structural repairs or fishing expeditions, require real-time prediction of waves over typical durations of, say, a few hours. Such predictions can be made by using a numerical model or a time series model employing continuously recorded waves. This paper presents another option, based on a different time series approach in which the input is in the form of preceding wind speed and wind direction observations. This would be useful for stations where costly wave buoys are not deployed and only meteorological buoys measuring wind are moored. The technique employs the alternative artificial intelligence approaches of an artificial neural network (ANN), genetic programming (GP) and model tree (MT) to carry out the time series modeling of wind to obtain waves. Wind observations at four offshore sites along the east coast of India were used. For calibration purposes the wave data were generated using a numerical model. The predicted waves obtained using the proposed time series models, when compared with the numerically generated waves, showed good resemblance in terms of the selected error criteria. Large differences among the chosen techniques of ANN, GP and MT were not noticed. Wave hindcasting at the same time step and predictions over shorter lead times were better than predictions over longer lead times. The proposed method is a cost-effective and convenient option when site-specific information is desired.
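
    A minimal stand-in for the wind-to-wave time series modeling described above is sketched below (Python with scikit-learn; the synthetic wind/wave data, the lag and lead choices and the MLP network are assumptions and not the ANN, GP or MT implementations used in the study).

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(7)
      # Hypothetical hourly wind speed (m/s) and a numerically generated
      # significant wave height used as the calibration target
      wind = 5.0 + 3.0 * np.sin(np.arange(2000) / 50.0) + rng.normal(0, 1.0, 2000)
      hs = 0.2 * wind + rng.normal(0, 0.2, 2000)

      lags, lead = 6, 3               # preceding wind window and forecast lead (hours)
      X, y = [], []
      for t in range(lags + lead, len(wind)):
          X.append(wind[t - lead - lags:t - lead])   # preceding wind observations
          y.append(hs[t])                            # wave height 'lead' hours later
      X, y = np.array(X), np.array(y)

      X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, shuffle=False)
      ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
      ann.fit(X_train, y_train)
      print("R^2 on held-out data:", ann.score(X_test, y_test))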

  18. Monitoring of seismic time-series with advanced parallel computational tools and complex networks

    NASA Astrophysics Data System (ADS)

    Kechaidou, M.; Sirakoulis, G. Ch.; Scordilis, E. M.

    2012-04-01

    Earthquakes have been a focus of human and research interest for several centuries due to their catastrophic effects on everyday life; they occur almost all over the world and demonstrate unpredictable behaviour that is hard to model. On the other hand, their monitoring with more or less technologically updated instruments has been almost continuous, and thanks to this several mathematical models have been presented and proposed so far to describe possible connections and patterns found in the resulting seismological time-series. In particular Greece, one of the most seismically active territories on earth, has detailed instrumental seismological data available from the beginning of the past century, providing researchers with valuable knowledge about seismicity levels all over the country. Considering available powerful parallel computational tools, such as Cellular Automata, these data can be further analysed and, most importantly, modelled to reveal possible connections between different parameters of the seismic time-series under study. More specifically, Cellular Automata have proven very effective for composing and modelling nonlinear complex systems, and several corresponding models have been advanced as possible analogues of earthquake fault dynamics. In this work, preliminary results of modelling the seismic time-series with the help of Cellular Automata, so as to compose and develop the corresponding complex networks, are presented. The proposed methodology aims to reveal hidden relations in the examined time-series and to distinguish their intrinsic characteristics, in an effort to transform the examined time-series into complex networks and graphically represent their evolution in time-space. Consequently, based on the presented results, the proposed model will eventually serve as an efficient and flexible computational tool providing a generic understanding of the possible triggering mechanisms, as derived from adequate monitoring and modelling of regional earthquake phenomena.

  19. Monitoring scale scores over time via quality control charts, model-based approaches, and time series techniques.

    PubMed

    Lee, Yi-Hsuan; von Davier, Alina A

    2013-07-01

    Maintaining a stable score scale over time is critical for all standardized educational assessments. Traditional quality control tools and approaches for assessing scale drift either require special equating designs, or may be too time-consuming to be considered on a regular basis with an operational test that has a short time window between an administration and its score reporting. Thus, the traditional methods are not sufficient to catch unusual testing outcomes in a timely manner. This paper presents a new approach for score monitoring and assessment of scale drift. It involves quality control charts, model-based approaches, and time series techniques to accommodate the following needs of monitoring scale scores: continuous monitoring, adjustment of customary variations, identification of abrupt shifts, and assessment of autocorrelation. Performance of the methodologies is evaluated using manipulated data based on real responses from 71 administrations of a large-scale high-stakes language assessment.
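
    For continuous monitoring of scale scores, a Shewhart chart catches abrupt shifts while a CUSUM accumulates small persistent drift; a minimal sketch with hypothetical administration-level means (not the operational procedure evaluated in the paper) is shown below.

      import numpy as np

      rng = np.random.default_rng(8)
      # Hypothetical mean scale scores from successive administrations,
      # with a small upward shift introduced halfway through
      scores = np.concatenate([rng.normal(500, 2, 36), rng.normal(503, 2, 35)])

      baseline = scores[:36]
      mu, sigma = baseline.mean(), baseline.std(ddof=1)

      # Shewhart-style limits flag single administrations outside +/- 3 sigma
      shewhart_flags = np.where(np.abs(scores - mu) > 3 * sigma)[0]

      # A one-sided CUSUM accumulates small, persistent shifts the limits above miss
      k, h = 0.5 * sigma, 4 * sigma
      cusum, first_signal = 0.0, None
      for i, s in enumerate(scores):
          cusum = max(0.0, cusum + (s - mu) - k)
          if cusum > h and first_signal is None:
              first_signal = i

      print("Shewhart flags:", shewhart_flags)
      print("first CUSUM signal at administration:", first_signal)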

  20. DETECT: a MATLAB toolbox for event detection and identification in time series, with applications to artifact detection in EEG signals.

    PubMed

    Lawhern, Vernon; Hairston, W David; Robbins, Kay

    2013-01-01

    Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration.

  1. DETECT: A MATLAB Toolbox for Event Detection and Identification in Time Series, with Applications to Artifact Detection in EEG Signals

    PubMed Central

    Lawhern, Vernon; Hairston, W. David; Robbins, Kay

    2013-01-01

    Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration. PMID:23638169

  2. A general theory on frequency and time-frequency analysis of irregularly sampled time series based on projection methods - Part 2: Extension to time-frequency analysis

    NASA Astrophysics Data System (ADS)

    Lenoir, Guillaume; Crucifix, Michel

    2018-03-01

    Geophysical time series are sometimes sampled irregularly along the time axis. The situation is particularly frequent in palaeoclimatology. Yet, there is so far no general framework for handling the continuous wavelet transform when the time sampling is irregular. Here we provide such a framework. To this end, we define the scalogram as the continuous-wavelet-transform equivalent of the extended Lomb-Scargle periodogram defined in Part 1 of this study (Lenoir and Crucifix, 2018). The signal being analysed is modelled as the sum of a locally periodic component in the time-frequency plane, a polynomial trend, and a background noise. The mother wavelet adopted here is the Morlet wavelet classically used in geophysical applications. The background noise model is a stationary Gaussian continuous autoregressive-moving-average (CARMA) process, which is more general than the traditional Gaussian white and red noise processes. The scalogram is smoothed by averaging over neighbouring times in order to reduce its variance. The Shannon-Nyquist exclusion zone is however defined as the area corrupted by local aliasing issues. The local amplitude in the time-frequency plane is then estimated with least-squares methods. We also derive an approximate formula linking the squared amplitude and the scalogram. Based on this property, we define a new analysis tool: the weighted smoothed scalogram, which we recommend for most analyses. The estimated signal amplitude also gives access to band and ridge filtering. Finally, we design a test of significance for the weighted smoothed scalogram against the stationary Gaussian CARMA background noise, and provide algorithms for computing confidence levels, either analytically or with Monte Carlo Markov chain methods. All the analysis tools presented in this article are available to the reader in the Python package WAVEPAL.
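
    For the irregularly sampled case that motivates this framework, the classical Lomb-Scargle periodogram (the frequency-domain counterpart of the scalogram discussed above) can be computed as in the sketch below (Python/SciPy; the synthetic record is an assumption, and the smoothed scalogram, CARMA noise model and significance testing of WAVEPAL are not reproduced).

      import numpy as np
      from scipy.signal import lombscargle

      rng = np.random.default_rng(9)
      # Hypothetical irregularly sampled palaeoclimate-like record containing a 23 kyr cycle
      t = np.sort(rng.uniform(0, 200, 150))             # age axis in kyr, uneven spacing
      y = np.sin(2 * np.pi * t / 23.0) + rng.normal(0, 0.5, t.size)
      y -= y.mean()

      periods = np.linspace(5, 100, 500)                # trial periods in kyr
      ang_freqs = 2 * np.pi / periods
      power = lombscargle(t, y, ang_freqs, normalize=True)
      print("dominant period ~", periods[np.argmax(power)], "kyr")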

  3. Laboratory and Modeling Studies of Insect Swarms

    DTIC Science & Technology

    2016-03-10

    and measuring the response. These novel methods allowed us for the first time to characterize precisely properties of the swarm at the group level... Time series for a randomly chosen pair as well as its continuous wavelet transform (CWT; bottom panel). Nearly all of the power in the signal for... based time-frequency analysis to identify such transient interactions, as long as they modified the frequency structure of the insect flight

  4. Visualization of synchronization of the uterine contraction signals: running cross-correlation and wavelet running cross-correlation methods.

    PubMed

    Oczeretko, Edward; Swiatecka, Jolanta; Kitlas, Agnieszka; Laudanski, Tadeusz; Pierzynski, Piotr

    2006-01-01

    In physiological research, we often study multivariate data sets containing two or more simultaneously recorded time series. The aim of this paper is to present the cross-correlation and the wavelet cross-correlation methods to assess synchronization between contractions in different topographic regions of the uterus. From a medical point of view, it is important to identify time delays between contractions, which may be of potential diagnostic significance in various pathologies. The cross-correlation was computed in a moving window with a width corresponding to approximately two or three contractions. As a result, the running cross-correlation function was obtained. The propagation% parameter assessed from this function allows quantitative description of synchronization in bivariate time series. In general, the uterine contraction signals are very complicated. Wavelet transforms provide insight into the structure of the time series at various frequencies (scales). To show the changes of the propagation% parameter along scales, a wavelet running cross-correlation was used. First, the continuous wavelet transforms of the uterine contraction signals were computed, and afterwards a running cross-correlation analysis was conducted for each pair of transformed time series. The findings show that running functions are very useful in the analysis of uterine contractions.
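
    The running cross-correlation idea can be sketched directly: slide a window roughly two to three contractions long over both signals and, within each window, record the lag of maximum correlation as the local delay (Python; the synthetic signals, window length and lag range are assumptions).

      import numpy as np

      def running_cross_correlation(x, y, window, max_lag):
          """Cross-correlation in a moving window; the lag with the largest
          correlation in each window approximates the local time delay."""
          delays, peaks = [], []
          for start in range(0, len(x) - window):
              xs = x[start:start + window] - x[start:start + window].mean()
              ys = y[start:start + window] - y[start:start + window].mean()
              best_lag, best_r = 0, -np.inf
              for lag in range(-max_lag, max_lag + 1):
                  if lag >= 0:
                      a, b = xs[lag:], ys[:window - lag]
                  else:
                      a, b = xs[:window + lag], ys[-lag:]
                  r = np.corrcoef(a, b)[0, 1]
                  if r > best_r:
                      best_lag, best_r = lag, r
              delays.append(best_lag)
              peaks.append(best_r)
          return np.array(delays), np.array(peaks)

      # Hypothetical contraction-like signals from two uterine regions, the second delayed
      rng = np.random.default_rng(10)
      x = np.sin(np.arange(600) / 20.0) + rng.normal(0, 0.1, 600)
      y = np.roll(x, 5) + rng.normal(0, 0.1, 600)
      delays, peaks = running_cross_correlation(x, y, window=120, max_lag=20)
      print("median delay (samples):", np.median(delays))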

  5. USGS Menlo Park GPS Data Processing Techniques and Derived North America Velocity Field (Invited)

    NASA Astrophysics Data System (ADS)

    Svarc, J. L.; Murray-Moraleda, J. R.; Langbein, J. O.

    2010-12-01

    The U.S. Geological Survey in Menlo Park routinely conducts repeated GPS surveys of geodetic markers throughout the western United States using dual-frequency geodetic GPS receivers. We combine campaign, continuous, and semi-permanent data to present a North America fixed velocity field for regions in the western United States. Mobile campaign-based surveys require less up-front investment than permanently monumented and telemetered GPS systems, and hence have achieved a broad and dense spatial coverage. The greater flexibility and mobility comes at the cost of greater uncertainties in individual daily position solutions. We also routinely process continuous GPS data collected at PBO stations operated by UNAVCO along with data from other continuous GPS networks such as BARD, PANGA, and CORS operated by other agencies. We have broken the Western US into several subnetworks containing approximately 150-250 stations each. The data are processed using JPL’s GIPSY-OASIS II release 5.0 software using a modified precise positioning strategy (Zumberge and others, 1997). We use the “ambizap” code provided by Geoff Blewitt (Blewitt, 2008) to fix phase ambiguities in continuous networks. To mitigate the effect of common mode noise we use the positions of stations in the network with very long, clean time series (i.e. those with no large outliers or offsets) to transform all position estimates into “regionally filtered” results following the approach of Hammond and Thatcher (2007). Velocity uncertainties from continuously operated GPS stations tend to be about 3 times smaller than those from campaign data. Langbein (2004) presents a maximum likelihood method for fitting a time series employing a variety of temporal noise models. We assume that GPS observations are contaminated by a combination of white, flicker, and random walk noise. For continuous and semi-permanent time series longer than 2 years we estimate these values; otherwise we fix the amplitudes of these processes to 0.85 mm, 1.7 mm/yr^(1/4), and 0.4 mm/yr^(1/2), respectively, for the north components; 0.84 mm, 1.4 mm/yr^(1/4), and 0.6 mm/yr^(1/2), respectively, for the east components; and 3.2 mm, 6.4 mm/yr^(1/4), and 0.0 mm/yr^(1/2), respectively, for the vertical. We have also deployed “semi-permanent” stations in selected regions of California. Semi-permanent stations have the advantage of increasing the density of coverage without the high cost of monumentation and telemetry associated with continuous GPS stations. Also, because of the increased temporal coverage of these stations, accurate estimates of station velocities can be achieved in a far shorter time period than from campaign mode surveys.

  6. Assessment of Program Impact Through First Grade, Volume I: The Context, Conceptual Approach and Methods of the Evaluation. An Evaluation of Project Developmental Continuity. Interim Report X.

    ERIC Educational Resources Information Center

    Rosario, Jose; And Others

    This volume is the first of a series reporting evaluation findings on the impact of Project Developmental Continuity (PDC) on institutions, classroom staff, parents and children from the time the children entered Head Start through the first grade. PDC was begun in 1974 with the purpose of ensuring that disadvantaged children receive continuous…

  7. Classification of time series patterns from complex dynamic systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schryver, J.C.; Rao, N.

    1998-07-01

    An increasing availability of high-performance computing and data storage media at decreasing cost is making possible the proliferation of large-scale numerical databases and data warehouses. Numeric warehousing enterprises on the order of hundreds of gigabytes to terabytes are a reality in many fields such as finance, retail sales, process systems monitoring, biomedical monitoring, surveillance and transportation. Large-scale databases are becoming more accessible to larger user communities through the internet, web-based applications and database connectivity. Consequently, most researchers now have access to a variety of massive datasets. This trend will probably only continue to grow over the next several years. Unfortunately, the availability of integrated tools to explore, analyze and understand the data warehoused in these archives is lagging far behind the ability to gain access to the same data. In particular, locating and identifying patterns of interest in numerical time series data is an increasingly important problem for which there are few available techniques. Temporal pattern recognition poses many interesting problems in classification, segmentation, prediction, diagnosis and anomaly detection. This research focuses on the problem of classification or characterization of numerical time series data. Highway vehicles and their drivers are examples of complex dynamic systems (CDS) which are being used by transportation agencies for field testing to generate large-scale time series datasets. Tools for effective analysis of numerical time series in databases generated by highway vehicle systems are not yet available, or have not been adapted to the target problem domain. However, analysis tools from similar domains may be adapted to the problem of classification of numerical time series data.

  8. Hybrid Forecasting of Daily River Discharges Considering Autoregressive Heteroscedasticity

    NASA Astrophysics Data System (ADS)

    Szolgayová, Elena Peksová; Danačová, Michaela; Komorniková, Magda; Szolgay, Ján

    2017-06-01

    It is widely acknowledged in the hydrological and meteorological communities that there is a continuing need to improve the quality of quantitative rainfall and river flow forecasts. A hybrid (combined deterministic-stochastic) modelling approach is proposed here that combines, in parallel, the advantages offered by modelling the system dynamics with a deterministic model and modelling the deterministic forecasting error series with a data-driven model. Since the processes to be modelled are generally nonlinear and the model error series may exhibit nonstationarity and heteroscedasticity, GARCH-type nonlinear time series models are considered here. The fitting, forecasting and simulation performance of such models has to be explored on a case-by-case basis. The goal of this paper is to test and develop an appropriate methodology for model fitting and forecasting applicable to daily river discharge forecast error data using the GARCH family of time series models. We concentrated on verifying whether the use of a GARCH-type model is suitable for modelling and forecasting a hydrological model error time series on the Hron and Morava Rivers in Slovakia. For this purpose we verified the presence of heteroscedasticity in the simulation error series of the KLN multilinear flow routing model; then we fitted the GARCH-type models to the data and compared their fit with that of an ARMA-type model. We produced one-step-ahead forecasts from the fitted models and again provided comparisons of the models' performance.
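
    A GARCH-type fit to a model-error series of this kind can be sketched with the third-party Python arch package (the simulated error series and the AR(1)+GARCH(1,1) specification are assumptions, not the configuration used for the Hron and Morava data).

      import numpy as np
      from arch import arch_model

      rng = np.random.default_rng(11)
      # Hypothetical daily forecast-error series from a deterministic routing model,
      # with slowly varying variance to mimic heteroscedasticity
      n = 1000
      sigma = 0.5 + 0.4 * np.sin(np.arange(n) / 60.0) ** 2
      errors = rng.normal(0, 1, n) * sigma

      am = arch_model(errors, mean="AR", lags=1, vol="GARCH", p=1, q=1)
      res = am.fit(disp="off")
      print(res.summary())

      # One-step-ahead forecasts of the error mean and conditional variance
      fc = res.forecast(horizon=1)
      print(fc.mean.iloc[-1, 0], fc.variance.iloc[-1, 0])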

  9. Period and phase comparisons of near-decadal oscillations in solar, geomagnetic, and cosmic ray time series

    NASA Astrophysics Data System (ADS)

    Juckett, David A.

    2001-09-01

    A more complete understanding of the periodic dynamics of the Sun requires continued exploration of non-11-year oscillations in addition to the benchmark 11-year sunspot cycle. In this regard, several solar, geomagnetic, and cosmic ray time series were examined to identify common spectral components and their relative phase relationships. Several non-11-year oscillations were identified within the near-decadal range with periods of ~8, 10, 12, 15, 18, 22, and 29 years. To test whether these frequency components were simply low-level noise or were related to a common source, the phases were extracted for each component in each series. The phases were nearly identical across the solar and geomagnetic series, while the corresponding components in four cosmic ray surrogate series exhibited inverted phases, similar to the known phase relationship with the 11-year sunspot cycle. Cluster analysis revealed that this pattern was unlikely to occur by chance. It was concluded that many non-11-year oscillations truly exist in the solar dynamical environment and that these contribute to the complex variations observed in geomagnetic and cosmic ray time series. Using the different energy sensitivities of the four cosmic ray surrogate series, a preliminary indication of the relative intensities of the various solar-induced oscillations was observed. It provides evidence that many of the non-11-year oscillations result from weak interplanetary magnetic field/solar wind oscillations that originate from corresponding variations in the open-field regions of the Sun.

  10. Unobtrusive Indicators of Cultural Change: Neckties, Girdles, Marijuana, Garbage, Magazines, and Urban Sprawl.

    ERIC Educational Resources Information Center

    Felson, Marcus

    1983-01-01

    Different types of time-series data sets can be used to identify cultural change and continuity. Indicators, including musical instruments, clothing, sporting goods, drugs, garbage, telephones, and magazines, are used to study social change since World War II. (Author/RM)

  11. Distinguished Lecture Series - Balancing the Energy & Climate Budget

    ScienceCinema

    None

    2017-12-09

    The average American uses 11,400 watts of power continuously. This is the equivalent of burning 114 100-watt light bulbs all the time. The average person globally uses 2,255 watts of power, or a little less than 23 100-watt light bulbs.

  12. Using chaotic forcing to detect damage in a structure

    USGS Publications Warehouse

    Moniz, L.; Nichols, J.; Trickey, S.; Seaver, M.; Pecora, D.; Pecora, L.

    2005-01-01

    In this work we develop a numerical test for Hölder continuity and apply it, along with another test for continuity, to the difficult problem of detecting damage in structures. We subject a thin metal plate with incremental damage to chaotic excitation of various bandwidths. Damage to the plate changes its filtering properties and therefore the phase space trajectories of the response. Because the data are multivariate (the plate is instrumented with multiple sensors), we use a singular value decomposition of the set of output time series to reduce the embedding dimension of the response time series. We use two geometric tests to compare an attractor reconstructed from data from an undamaged structure to that reconstructed from data from a damaged structure. These two tests translate to testing for both generalized and differentiable synchronization between responses. We show loss of synchronization of responses with damage to the structure. © 2005 American Institute of Physics.

  13. Using chaotic forcing to detect damage in a structure.

    USGS Publications Warehouse

    Moniz, L.; Nichols, J.; Trickey, S.; Seaver, M.; Pecora, D.; Pecora, L.

    2005-01-01

    In this work we develop a numerical test for Hölder continuity and apply it, along with another test for continuity, to the difficult problem of detecting damage in structures. We subject a thin metal plate with incremental damage to chaotic excitation of various bandwidths. Damage to the plate changes its filtering properties and therefore the phase space trajectories of the response. Because the data are multivariate (the plate is instrumented with multiple sensors), we use a singular value decomposition of the set of output time series to reduce the embedding dimension of the response time series. We use two geometric tests to compare an attractor reconstructed from data from an undamaged structure to that reconstructed from data from a damaged structure. These two tests translate to testing for both generalized and differentiable synchronization between responses. We show loss of synchronization of responses with damage to the structure.

  14. Assessment of Current Global and Regional Mean Sea Level Estimates Based on the TOPEX/Poseidon Jason-1 and 2 Climate Data Record

    NASA Technical Reports Server (NTRS)

    Beckley, B. D.; Lemoine, F. G.; Zelensky, N. P.; Yang, X.; Holmes, S.; Ray, R. D.; Mitchum, G. T.; Desai, S.; Brown, S.; Haines, B.

    2011-01-01

    Recent developments in Precise Orbit Determination (POD), due in particular to revisions to the terrestrial reference frame realization and to time variable gravity (TVG) modeling, continue to improve the accuracy and stability of the precise orbits, directly affecting mean sea level (MSL) estimates. Long-term credible MSL estimates require the development and continued maintenance of a stable reference frame, along with vigilant monitoring of the performance of the independent tracking systems used to calculate the orbits for altimeter spacecraft. The stringent MSL accuracy requirements of a few tenths of a mm/yr are particularly essential for mass budget closure analysis over the relatively short time period of coincident Jason-1 and -2, GRACE, and Argo measurements. In an effort to adhere to cross-mission consistency, we have generated a full time series of experimental orbits (GSFC stdlllO) for TOPEX/Poseidon (TP), Jason-1, and OSTM based on an improved terrestrial reference frame (TRF) realization (ITRF2008), a revised static gravity field (GGM03s), and a time variable gravity field (Eigen6s). In this presentation we assess the impact of the revised precision orbits on inter-mission bias estimates and the resultant global and regional MSL trends. Tide gauge verification results are shown to assess the current stability of the Jason-2 sea surface height time series, which suggests a possible discontinuity initiated in early 2010. Although the Jason-2 time series is relatively short (approximately 3 years), a thorough review of the entire suite of geophysical and environmental range corrections is warranted and is underway to maintain the fidelity of the record.

  15. Visibility graphlet approach to chaotic time series

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mutua, Stephen; Computer Science Department, Masinde Muliro University of Science and Technology, P.O. Box 190-50100, Kakamega; Gu, Changgui, E-mail: gu-changgui@163.com, E-mail: hjyang@ustc.edu.cn

    Many novel methods have been proposed for mapping time series into complex networks. Although some dynamical behaviors can be effectively captured by existing approaches, the preservation and tracking of the temporal behaviors of a chaotic system remains an open problem. In this work, we extended the visibility graphlet approach to investigate both discrete and continuous chaotic time series. We applied visibility graphlets to capture the reconstructed local states, so that each is treated as a node and tracked downstream to create a temporal chain link. Our empirical findings show that the approach accurately captures the dynamical properties of chaotic systems. Networks constructed from periodic dynamic phases all converge to regular networks and to unique network structures for each model in the chaotic zones. Furthermore, our results show that the characterization of chaotic and non-chaotic zones in the Lorenz system corresponds to the maximal Lyapunov exponent, thus providing a simple and straightforward way to analyze chaotic systems.
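
    The basic natural visibility graph construction underlying such approaches is compact enough to sketch (Python with NetworkX; this is the standard visibility mapping, not the graphlet/state-tracking extension described above, and the logistic-map series is an illustrative assumption).

      import numpy as np
      import networkx as nx

      def natural_visibility_graph(series):
          """Natural visibility graph: nodes are samples; samples (i, y_i) and
          (j, y_j) are linked if every intermediate sample lies below the
          straight line connecting them."""
          n = len(series)
          g = nx.Graph()
          g.add_nodes_from(range(n))
          for i in range(n - 1):
              for j in range(i + 1, n):
                  between = np.arange(i + 1, j)
                  line = series[i] + (series[j] - series[i]) * (between - i) / (j - i)
                  if np.all(series[between] < line):
                      g.add_edge(i, j)
          return g

      # Hypothetical chaotic series from the logistic map in its chaotic regime
      x = np.empty(200)
      x[0] = 0.4
      for k in range(1, 200):
          x[k] = 4.0 * x[k - 1] * (1.0 - x[k - 1])

      g = natural_visibility_graph(x)
      degrees = np.array([d for _, d in g.degree()])
      print("mean degree:", degrees.mean())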

  16. Safe and effective nursing shift handover with NURSEPASS: An interrupted time series.

    PubMed

    Smeulers, Marian; Dolman, Christine D; Atema, Danielle; van Dieren, Susan; Maaskant, Jolanda M; Vermeulen, Hester

    2016-11-01

    Implementation of a locally developed, evidence-based nursing shift handover blueprint with a bedside-safety-check to determine the effect size on quality of handover. A mixed methods design with: (1) an interrupted time series analysis to determine the effect on handover quality in six domains; (2) descriptive statistics to analyze the discrepancies intercepted by the bedside-safety-check; (3) evaluation sessions to gather experiences with the new handover process. We observed a continued trend of improvement in handover quality and a significant improvement in two domains of handover: organization/efficiency and contents. The bedside-safety-check successfully identified discrepancies in drains, intravenous medications, bandages or general condition and was highly appreciated. Use of the nursing shift handover blueprint showed promising results on effectiveness as well as on feasibility and acceptability. However, to enable long-term measurement of effectiveness, evaluation with large-scale interrupted time series or statistical process control is needed. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Enhancement to Non-Contacting Stress Measurement of Blade Vibration Frequency

    NASA Technical Reports Server (NTRS)

    Platt, Michael; Jagodnik, John

    2011-01-01

    A system for turbo machinery blade vibration has been developed that combines time-of-arrival sensors for blade vibration amplitude measurement and radar sensors for vibration frequency and mode identification. The enabling technology for this continuous blade monitoring system is the radar sensor, which provides a continuous time series of blade displacement over a portion of a revolution. This allows the data reduction algorithms to directly calculate the blade vibration frequency and to correctly identify the active modes of vibration. The work in this project represents a significant enhancement in the mode identification and stress calculation accuracy in non-contacting stress measurement system (NSMS) technology when compared to time-of-arrival measurements alone.

  18. Classification mapping and species identification of salt marshes based on a short-time interval NDVI time-series from HJ-1 optical imagery

    NASA Astrophysics Data System (ADS)

    Sun, Chao; Liu, Yongxue; Zhao, Saishuai; Zhou, Minxi; Yang, Yuhao; Li, Feixue

    2016-03-01

    Salt marshes are seen as the most dynamic and valuable ecosystems in coastal zones, and in these areas, it is crucial to obtain accurate remote sensing information on the spatial distributions of species over time. However, discriminating various types of salt marsh is rather difficult because of their strong spectral similarities. Previous salt marsh mapping studies have focused mainly on high spatial and spectral (i.e., hyperspectral) resolution images combined with auxiliary information; however, the results are often limited to small regions. With a high temporal and moderate spatial resolution, the Chinese HuanJing-1 (HJ-1) satellite optical imagery can be used not only to monitor phenological changes of salt marsh vegetation over short-time intervals, but also to obtain coverage of large areas. Here, we apply HJ-1 satellite imagery to the middle coast of Jiangsu in east China to monitor changes in saltmarsh vegetation cover. First, we constructed a monthly NDVI time-series to classify various types of salt marsh and then we tested the possibility of using compressed time-series continuously, to broaden the applicability of this particular approach. Our principal findings are as follows: (1) the overall accuracy of salt marsh mapping based on the monthly NDVI time-series was 90.3%, which was ∼16.0% higher than the single-phase classification strategy; (2) a compressed time-series, including NDVI from six key months (April, June-September, and November), demonstrated very little reduction (2.3%) in overall accuracy but led to obvious improvements in unstable regions; and (3) a simple rule for Spartina alterniflora identification was established using a scene solely from November, which may provide an effective way for regularly monitoring its distribution.

  19. Starting and Promoting a "First-Time" Association Seminar Series. TECHNIQUES.

    ERIC Educational Resources Information Center

    Paul, Sharon A.

    1984-01-01

    As the competition among providers in the continuing education market intensifies, universities starting new seminars will need to alter their marketing and recruitment procedures drastically. Telemarketing and a two-step marketing approach will undoubtedly become more widespread in the future. Individuals responsible for marketing continuing…

  20. Functional data analysis: An approach for environmental ordination and matching discrete with continuous observations

    EPA Science Inventory

    Investigators are frequently confronted with data sets that include both discrete observations and extended time series of environmental data that had been collected by autonomous recorders. Evaluating the relationships between these two kinds of data is challenging. A common a...

  1. Human factors phase II: design and evaluation of decision aids for control of high-speed trains: experiments and model

    DOT National Transportation Integrated Search

    1996-12-01

    Although the speed of some guided ground transportation systems continues to increase, the reaction time and the sensory : and information processing capacities of railroad personnel remain constant. This second report in a series examining : critica...

  2. Dynamical glucometry: Use of multiscale entropy analysis in diabetes

    NASA Astrophysics Data System (ADS)

    Costa, Madalena D.; Henriques, Teresa; Munshi, Medha N.; Segal, Alissa R.; Goldberger, Ary L.

    2014-09-01

    Diabetes mellitus (DM) is one of the world's most prevalent medical conditions. Contemporary management focuses on lowering mean blood glucose values toward a normal range, but largely ignores the dynamics of glucose fluctuations. We probed analyte time series obtained from continuous glucose monitor (CGM) sensors. We show that the fluctuations in CGM values sampled every 5 min are not uncorrelated noise. Next, using multiscale entropy analysis, we quantified the complexity of the temporal structure of the CGM time series from a group of elderly subjects with type 2 DM and age-matched controls. We further probed the structure of these CGM time series using detrended fluctuation analysis. Our findings indicate that the dynamics of glucose fluctuations from control subjects are more complex than those of subjects with type 2 DM over time scales ranging from about 5 min to 5 h. These findings support consideration of a new framework, dynamical glucometry, to guide mechanistic research and to help assess and compare therapeutic interventions, which should enhance complexity of glucose fluctuations and not just lower mean and variance of blood glucose levels.
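
    A minimal sketch of coarse-graining plus sample entropy, the two ingredients of multiscale entropy analysis, is given below (Python; the synthetic CGM-like series and the parameter choices m = 2 and r = 0.2 times the standard deviation are conventional assumptions, not the study's code).

      import numpy as np

      def sample_entropy(x, m=2, r_factor=0.2):
          """Sample entropy: negative log of the conditional probability that
          sequences matching for m points also match for m + 1 points."""
          x = np.asarray(x, dtype=float)
          r = r_factor * x.std()

          def count_matches(length):
              templates = np.array([x[i:i + length] for i in range(len(x) - length)])
              count = 0
              for i in range(len(templates)):
                  d = np.max(np.abs(templates - templates[i]), axis=1)
                  count += np.sum(d <= r) - 1      # exclude the self-match
              return count

          b, a = count_matches(m), count_matches(m + 1)
          return -np.log(a / b) if a > 0 and b > 0 else np.inf

      def multiscale_entropy(x, scales=range(1, 6)):
          """Coarse-grain the series at each scale, then compute sample entropy."""
          out = []
          for s in scales:
              n = len(x) // s
              coarse = np.asarray(x[:n * s]).reshape(n, s).mean(axis=1)
              out.append(sample_entropy(coarse))
          return np.array(out)

      # Hypothetical CGM series sampled every 5 min (values in mg/dL)
      rng = np.random.default_rng(12)
      cgm = 120 + 15 * np.sin(np.arange(2000) / 80.0) + rng.normal(0, 5, 2000)
      print(multiscale_entropy(cgm))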

  3. Very slow lava extrusion continued for more than five years after the 2011 Shinmoedake eruption observed from SAR interferometry

    NASA Astrophysics Data System (ADS)

    Ozawa, T.; Miyagi, Y.

    2017-12-01

    Shinmoe-dake, located in SW Japan, erupted in January 2011, and lava accumulated in the crater (e.g., Ozawa and Kozono, EPS, 2013). The last Vulcanian eruption occurred in September 2011, and no eruption has occurred since. Miyagi et al. (GRL, 2014) analyzed TerraSAR-X and Radarsat-2 SAR data acquired after the last eruption and found continuous inflation in the crater. The inflation decayed with time but had not terminated by May 2013. Since the time series of the inflation volume change rate fitted well to an exponential function with a constant term, we suggested that lava extrusion had continued over the long term due to deflation of a shallow magma source and to magma supply from a deeper source. To investigate the subsequent deformation, we applied InSAR to Sentinel-1 and ALOS-2 SAR data. Inflation decayed further and had almost terminated by the end of 2016, meaning that this deformation continued for more than five years after the last eruption. We found that the time series of the inflation volume change rate fits better to a double-exponential function than to a single-exponential function with a constant term. The exponential component with the short time constant had almost settled within one year of the last eruption. Since the InSAR result from TerraSAR-X data of November 2011 and May 2013 indicated deflation of a shallow source under the crater, while such deformation has not been obtained from recent SAR data, we suggest that this component was due to deflation of a shallow magma source with excess pressure. In this study, we also found that the long-term component may have decayed exponentially; this factor may be deflation of a deep source or delayed vesiculation.
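
    The single- versus double-exponential comparison described above can be illustrated with a least-squares fit of both forms to a synthetic volume-change-rate series (Python/SciPy; the numbers and time constants below are illustrative assumptions, not the Shinmoe-dake values).

      import numpy as np
      from scipy.optimize import curve_fit

      def single_exp(t, v1, tau1, c):
          """Single exponential decay of the volume-change rate plus a constant term."""
          return v1 * np.exp(-t / tau1) + c

      def double_exp(t, v1, tau1, v2, tau2):
          """Double exponential: fast shallow-source decay plus a slow deep-source term."""
          return v1 * np.exp(-t / tau1) + v2 * np.exp(-t / tau2)

      # Hypothetical inflation-volume change rate versus years since the last eruption,
      # mimicking a fast decay followed by a slow tail
      t = np.linspace(0.1, 5.5, 30)
      rng = np.random.default_rng(13)
      rate = 8 * np.exp(-t / 0.4) + 2 * np.exp(-t / 2.5) + rng.normal(0, 0.1, t.size)

      p1, _ = curve_fit(single_exp, t, rate, p0=[8, 1, 0.5])
      p2, _ = curve_fit(double_exp, t, rate, p0=[8, 0.5, 2, 3])
      rss1 = np.sum((rate - single_exp(t, *p1)) ** 2)
      rss2 = np.sum((rate - double_exp(t, *p2)) ** 2)
      print("RSS single:", rss1, " RSS double:", rss2)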

  4. Dynamic Black-Level Correction and Artifact Flagging for Kepler Pixel Time Series

    NASA Technical Reports Server (NTRS)

    Kolodziejczak, J. J.; Clarke, B. D.; Caldwell, D. A.

    2011-01-01

    Methods applied to the calibration stage of Kepler pipeline data processing [1] (CAL) do not currently use all of the information available to identify and correct several instrument-induced artifacts. These include time-varying crosstalk from the fine guidance sensor (FGS) clock signals, and manifestations of drifting moire pattern as locally correlated nonstationary noise, and rolling bands in the images which find their way into the time series [2], [3]. As the Kepler Mission continues to improve the fidelity of its science data products, we are evaluating the benefits of adding pipeline steps to more completely model and dynamically correct the FGS crosstalk, then use the residuals from these model fits to detect and flag spatial regions and time intervals of strong time-varying black-level which may complicate later processing or lead to misinterpretation of instrument behavior as stellar activity.

  5. How To Dance through Time. Volume V: Victorian Era Couple Dances. [Videotape].

    ERIC Educational Resources Information Center

    Teten, Carol

    This 55-minute VHS videotape is the fifth in a series of "How To Dance Through Time" videos. It continues the tradition of the romance of the mid-19th century couple dances, focusing on Victorian era couple dances. The videotape offers 35 variations of the renowned 19th century couple dances, including the waltz, the polka, the galop,…

  6. Data on copula modeling of mixed discrete and continuous neural time series.

    PubMed

    Hu, Meng; Li, Mingyao; Li, Wu; Liang, Hualou

    2016-06-01

    Copula is an important tool for modeling neural dependence. Recent work on copula has been expanded to jointly model mixed time series in neuroscience ("Hu et al., 2016, Joint Analysis of Spikes and Local Field Potentials using Copula" [1]). Here we present further data for joint analysis of spike and local field potential (LFP) with copula modeling. In particular, the details of different model orders and the influence of possible spike contamination in LFP data from the same and different electrode recordings are presented. To further facilitate the use of our copula model for the analysis of mixed data, we provide the Matlab codes, together with example data.
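
    A rank-based Gaussian-copula sketch of the dependence between a continuous LFP feature and a discrete spike count is shown below (Python/SciPy; the simulated data, the jittering of the discrete margin and the use of a simple Gaussian copula are assumptions that only gesture at the mixed-data copula model of the paper and its Matlab codes).

      import numpy as np
      from scipy.stats import norm, rankdata

      rng = np.random.default_rng(14)
      # Hypothetical paired observations: a continuous LFP feature and a discrete
      # spike count that depends weakly on it
      lfp = rng.normal(0, 1, 2000)
      spikes = rng.poisson(np.exp(0.5 * lfp))

      def normal_scores(x, jitter=False):
          """Rank-based probability-integral transform to uniforms, then to
          standard-normal scores; a tiny jitter breaks ties in the discrete margin."""
          x = np.asarray(x, dtype=float)
          if jitter:
              x = x + rng.uniform(0, 1e-6, size=len(x))
          u = rankdata(x) / (len(x) + 1.0)
          return norm.ppf(u)

      z_lfp = normal_scores(lfp)
      z_spk = normal_scores(spikes, jitter=True)

      # Gaussian-copula dependence parameter (correlation of the normal scores)
      rho = np.corrcoef(z_lfp, z_spk)[0, 1]
      print("copula correlation:", rho)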

  7. Incorporating Satellite Time-Series Data into Modeling

    NASA Technical Reports Server (NTRS)

    Gregg, Watson

    2008-01-01

    In situ time series observations have provided a multi-decadal view of long-term changes in ocean biology. These observations are sufficiently reliable to enable discernment of even relatively small changes, and provide continuous information on a host of variables. Their key drawback is their limited domain. Satellite observations from ocean color sensors do not suffer the drawback of domain, and simultaneously view the global oceans. This attribute lends credence to their use in global and regional model validation and data assimilation. We focus on these applications using the NASA Ocean Biogeochemical Model. The enhancement of the satellite data using data assimilation is featured, and the limitations of long-term satellite data sets are also discussed.

  8. Freshening of the Labrador Sea Surface Waters in the 1990s: Another Great Salinity Anomaly

    NASA Technical Reports Server (NTRS)

    Hakkinen, Sirpa; Koblinsky, Chester J. (Technical Monitor)

    2002-01-01

    Both the observed and simulated time series of Labrador Sea surface salinities show a major freshening event since the mid-1990s. It continues the series of decadal events of the 1970s and 1980s, from which the freshening in the early 1970s was named the Great Salinity Anomaly (GSA). These events are especially distinguishable in the late summer (August and September) time series. The observed data suggest that the 1990s freshening may equal the GSA in magnitude. This recent event is associated with a large reduction in the overturning rate between the early and latter parts of the 1990s. Both the observations and the model results indicate that surface salinity conditions appear to be returning towards normal during 1999 and 2000 in the coastal area, but offshore the model predicts the freshening to linger on after peaking in 1997.

  9. Validation of Vegetation Index Time Series from Suomi NPP Visible Infrared Imaging Radiometer Suite Using Tower Radiation Flux Measurements

    NASA Astrophysics Data System (ADS)

    Miura, T.; Kato, A.; Wang, J.; Vargas, M.; Lindquist, M.

    2015-12-01

    Satellite vegetation index (VI) time series data serve as an important means to monitor and characterize seasonal changes of terrestrial vegetation and their interannual variability. It is, therefore, critical to ensure the quality of such VI products, and one method of validating VI product quality is cross-comparison with in situ flux tower measurements. In this study, we evaluated the quality of VI time series derived from the Visible Infrared Imaging Radiometer Suite (VIIRS) onboard the Suomi National Polar-orbiting Partnership (NPP) spacecraft by cross-comparison with in situ radiation flux measurements at select flux tower sites over North America and Europe. VIIRS is a new polar-orbiting satellite sensor series, slated to replace the National Oceanic and Atmospheric Administration's Advanced Very High Resolution Radiometer in the afternoon overpass and to continue the highly-calibrated data streams initiated with the Moderate Resolution Imaging Spectroradiometer of the National Aeronautics and Space Administration's Earth Observing System. The selected sites covered a wide range of biomes, including croplands, grasslands, evergreen needle forest, woody savanna, and open shrublands. The two VIIRS indices of the Top-of-Atmosphere (TOA) Normalized Difference Vegetation Index (NDVI) and the atmospherically-corrected, Top-of-Canopy (TOC) Enhanced Vegetation Index (EVI) (daily, 375 m spatial resolution) were compared against the TOC NDVI and a two-band version of EVI (EVI2) calculated from tower radiation flux measurements, respectively. VIIRS and Tower VI time series showed comparable seasonal profiles across biomes with statistically significant correlations (> 0.60; p-value < 0.01). "Start-of-season (SOS)" phenological metric values extracted from VIIRS and Tower VI time series were also highly compatible (R2 > 0.95), with mean differences of 2.3 days and 5.0 days for the NDVI and the EVI, respectively. These results indicate that VIIRS VI time series can capture the seasonal evolution of the vegetated land surface as well as in situ radiometric measurements. Future studies that address biophysical or physiological interpretations of Tower VI time series derived from radiation flux measurements are desirable.
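
    As an illustration of the two vegetation indices compared in the record above, the short Python sketch below computes NDVI and the two-band EVI (EVI2) from red and near-infrared reflectances. The EVI2 coefficients follow the common two-band formulation; the reflectance values are hypothetical and are not data from the study.

    import numpy as np

    def ndvi(nir, red):
        # Normalized Difference Vegetation Index
        return (nir - red) / (nir + red)

    def evi2(nir, red):
        # Two-band Enhanced Vegetation Index: 2.5*(NIR-Red)/(NIR + 2.4*Red + 1)
        return 2.5 * (nir - red) / (nir + 2.4 * red + 1.0)

    # Hypothetical tower-derived reflectances for three dates
    nir = np.array([0.35, 0.38, 0.42])
    red = np.array([0.08, 0.07, 0.05])
    print(ndvi(nir, red), evi2(nir, red))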

  10. Global sea turtle conservation successes

    PubMed Central

    Mazaris, Antonios D.; Schofield, Gail; Gkazinou, Chrysoula; Almpanidou, Vasiliki; Hays, Graeme C.

    2017-01-01

    We document a tendency for published estimates of population size in sea turtles to be increasing rather than decreasing across the globe. To examine the population status of the seven species of sea turtle globally, we obtained 299 time series of annual nesting abundance with a total of 4417 annual estimates. The time series ranged in length from 6 to 47 years (mean, 16.2 years). When levels of abundance were summed within regional management units (RMUs) for each species, there were upward trends in 12 RMUs versus downward trends in 5 RMUs. This prevalence of more upward than downward trends was also evident in the individual time series, where we found 95 significant increases in abundance and 35 significant decreases. Adding to this encouraging news for sea turtle conservation, we show that even small sea turtle populations have the capacity to recover, that is, Allee effects appear unimportant. Positive trends in abundance are likely linked to the effective protection of eggs and nesting females, as well as reduced bycatch. However, conservation concerns remain, such as the decline in leatherback turtles in the Eastern and Western Pacific. Furthermore, we also show that, often, time series are too short to identify trends in abundance. Our findings highlight the importance of continued conservation and monitoring efforts that underpin this global conservation success story. PMID:28948215

  11. Connectionist Architectures for Time Series Prediction of Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Weigend, Andreas Sebastian

    We investigate the effectiveness of connectionist networks for predicting the future continuation of temporal sequences. The problem of overfitting, particularly serious for short records of noisy data, is addressed by the method of weight-elimination: a term penalizing network complexity is added to the usual cost function in back-propagation. We describe the dynamics of the procedure and clarify the meaning of the parameters involved. From a Bayesian perspective, the complexity term can be usefully interpreted as an assumption about prior distribution of the weights. We analyze three time series. On the benchmark sunspot series, the networks outperform traditional statistical approaches. We show that the network performance does not deteriorate when there are more input units than needed. In the second example, the notoriously noisy foreign exchange rates series, we pick one weekday and one currency (DM vs. US). Given exchange rate information up to and including a Monday, the task is to predict the rate for the following Tuesday. Weight-elimination manages to extract a significant part of the dynamics and makes the solution interpretable. In the third example, the networks predict the resource utilization of a chaotic computational ecosystem for hundreds of steps forward in time.
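
    The weight-elimination method described above adds a complexity penalty to the usual back-propagation cost. A minimal sketch of that penalty term is given below; the scale w0 and the penalty weight lam are illustrative values, not those used in the thesis.

    import numpy as np

    def weight_elimination_penalty(weights, w0=1.0, lam=1e-4):
        # Complexity term: lam * sum (w/w0)^2 / (1 + (w/w0)^2).
        # Small weights contribute roughly (w/w0)^2; large weights saturate near 1,
        # so the term effectively counts the number of "large" weights.
        r = (np.asarray(weights) / w0) ** 2
        return lam * np.sum(r / (1.0 + r))

    def total_cost(sum_squared_error, weights, w0=1.0, lam=1e-4):
        # Cost minimized by back-propagation: fit error plus complexity penalty
        return sum_squared_error + weight_elimination_penalty(weights, w0, lam)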

  12. A study of the effect of legal settlement on post-concussion symptoms.

    PubMed Central

    Fee, C R; Rutherford, W H

    1988-01-01

    Forty-four consecutive patients with concussion for whom a medico-legal report had been written were followed up for 3-4 years after their accidents. Three cases were still pending at the end of the study. Fifty-seven per cent complained of symptoms when the medico-legal reports were written (mean interval from accident 12.9 months), 39% had symptoms at the time of settlement (mean interval 22.1 months) and 34% had symptoms one year later. When these results were compared with a general series from the same department some years earlier, it was found that the symptoms at the time of writing the reports were not significantly different from symptoms at 6 weeks in the earlier series, but the symptoms one year after settlement were almost two-and-a-half times greater than the symptoms at 12 months in the general series. No evidence could be found to suggest any organic basis for the higher symptom rate in the litigation series. It is suggested that the litigation process itself is a factor in the persistence of symptoms and this effect continues after legal settlement has been reached. Early settlement of the cases might significantly reduce morbidity. PMID:3408521

  13. Land science with Sentinel-2 and Sentinel-3 data series synergy

    NASA Astrophysics Data System (ADS)

    Moreno, Jose; Guanter, Luis; Alonso, Luis; Gomez, Luis; Amoros, Julia; Camps, Gustavo; Delegido, Jesus

    2010-05-01

    Although the GMES/Sentinel satellite series were primarily designed to provide observations for operational services and routine applications, there is a growing interest in the scientific community in the use of Sentinel data for more advanced and innovative science. Apart from the improved spatial and spectral capabilities, the availability of consistent time series covering a period of over 20 years opens possibilities never explored before, such as systematic data assimilation approaches exploiting the time-series concept, or the incorporation in the modelling approaches of processes covering time scales from weeks to decades. Sentinel-3 will provide continuity to current ENVISAT MERIS/AATSR capabilities. The results already derived from MERIS/AATSR will be more systematically exploited by using OLCI in synergy with SLSTR. Particularly innovative is the case of Sentinel-2, which is specifically designed for land applications. Built on a constellation of two satellites operating simultaneously to provide a 5-day geometric revisit time, the Sentinel-2 system will provide global and systematic acquisitions with high spatial resolution and with a high revisit time tailored towards the needs of land monitoring. Apart from providing continuity to Landsat and SPOT time series, the Sentinel-2 Multi-Spectral Instrument (MSI) incorporates new narrow bands around the red-edge for improved retrievals of biophysical parameters. The limitations imposed by the need for proper cloud screening and atmospheric corrections have represented a serious constraint in the past for optical data. The fact that both Sentinel-2 and 3 have dedicated bands to allow such needed corrections for optical data represents an important step towards a proper exploitation, guaranteeing consistent time series showing actual variability in land surface conditions without the artefacts introduced by the atmosphere. Expected operational products (such as Land Cover maps, Leaf Area Index, Fractional Vegetation Cover, Fraction of Absorbed Photosynthetically Active Radiation, and Leaf Chlorophyll and Water Contents) will be enhanced with new scientific applications. Higher level products will also be provided, by means of mosaicking, averaging, synthesising or compositing of spatially and temporally resampled data. A key element in the exploitation of the Sentinel series will be the adequate use of data synergy, which will open new possibilities for improved Land Models. This paper analyses in particular the possibilities offered by mosaicking and compositing information derived from Sentinel-2 observations in high spatial resolution to complement dense time series derived from Sentinel-3 data with more frequent coverage. Interpolation of gaps in high spatial resolution time series (from Sentinel-2 data) by using medium/low resolution data from Sentinel-3 (OLCI and SLSTR) is also a way of making series more temporally consistent with high spatial resolution. The primary goal of such temporal interpolation / spatial mosaicking techniques is to derive consistent surface reflectance data virtually for every date and geographical location, no matter the initial spatial/temporal coverage of the original data used to produce the composite. As a result, biophysical products can be derived in a more consistent way from the spectral information of Sentinel-3 data by making use of a description of surface heterogeneity derived from Sentinel-2 data.
Using data from dedicated experiments (SEN2FLEX, CEFLES2, SEN3EXP), which include a large dataset of satellite and airborne data and of ground-based measurements of atmospheric and vegetation parameters, different techniques are tested, ranging from empirical / statistical approaches that build nonlinear regressions by mapping spectra to a high-dimensional space, up to model inversion / data assimilation scenarios. Exploitation of the temporal domain and the spatial multi-scale domain then becomes a driver for the systematic exploitation of GMES/Sentinel data time series. This paper reviews the current status and identifies research priorities in this direction.

  14. Adaptive Sensing of Time Series with Application to Remote Exploration

    NASA Technical Reports Server (NTRS)

    Thompson, David R.; Cabrol, Nathalie A.; Furlong, Michael; Hardgrove, Craig; Low, Bryan K. H.; Moersch, Jeffrey; Wettergreen, David

    2013-01-01

    We address the problem of adaptive, information-optimal data collection in time series. Here a remote sensor or explorer agent throttles its sampling rate in order to track anomalous events while obeying constraints on time and power. This problem is challenging because the agent has limited visibility -- all collected datapoints lie in the past, but its resource allocation decisions require predicting far into the future. Our solution is to continually fit a Gaussian process model to the latest data and optimize the sampling plan online to maximize information gain. We compare the performance characteristics of stationary and nonstationary Gaussian process models. We also describe an application based on geologic analysis during planetary rover exploration. Here adaptive sampling can improve coverage of localized anomalies and potentially benefit the mission science yield of long autonomous traverses.
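
    A minimal sketch of the core idea, fitting a Gaussian process to past samples and choosing the next sampling time where the predictive uncertainty is largest as a simple proxy for information gain, is shown below. It uses scikit-learn and hypothetical measurement values rather than the mission data or the exact acquisition criterion of the paper.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    # Hypothetical past measurements: times (hours) and sensor values
    t_obs = np.array([[0.0], [1.0], [2.5], [4.0]])
    y_obs = np.array([0.2, 0.5, 0.4, 1.1])

    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0) + WhiteKernel(1e-2),
                                  normalize_y=True)
    gp.fit(t_obs, y_obs)

    # Candidate future sampling times; pick the one with the largest predictive
    # standard deviation as a simple stand-in for expected information gain.
    t_cand = np.linspace(4.5, 10.0, 50).reshape(-1, 1)
    _, std = gp.predict(t_cand, return_std=True)
    t_next = float(t_cand[np.argmax(std)])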

  15. Estimating secular velocities from GPS data contaminated by postseismic motion at sites with limited pre-earthquake data

    NASA Astrophysics Data System (ADS)

    Murray, J. R.; Svarc, J. L.

    2016-12-01

    Constant secular velocities estimated from Global Positioning System (GPS)-derived position time series are a central input for modeling interseismic deformation in seismically active regions. Both postseismic motion and temporally correlated noise produce long-period signals that are difficult to separate from secular motion and can bias velocity estimates. For GPS sites installed post-earthquake it is especially challenging to uniquely estimate velocities and postseismic signals and to determine when the postseismic transient has decayed sufficiently to enable use of subsequent data for estimating secular rates. Within 60 km of the 2003 M6.5 San Simeon and 2004 M6 Parkfield earthquakes in California, 16 continuous GPS sites (group 1) were established prior to mid-2001, and 52 stations (group 2) were installed following the events. We use group 1 data to investigate how early in the post-earthquake time period one may reliably begin using group 2 data to estimate velocities. For each group 1 time series, we obtain eight velocity estimates using observation time windows with successively later start dates (2006 - 2013) and a parameterization that includes constant velocity, annual, and semi-annual terms but no postseismic decay. We compare these to velocities estimated using only pre-San Simeon data to find when the pre- and post-earthquake velocities match within uncertainties. To obtain realistic velocity uncertainties, for each time series we optimize a temporally correlated noise model consisting of white, flicker, random walk, and, in some cases, band-pass filtered noise contributions. Preliminary results suggest velocities can be reliably estimated using data from 2011 to the present. Ongoing work will assess velocity bias as a function of epicentral distance and length of post-earthquake time series as well as explore spatio-temporal filtering of detrended group 1 time series to provide empirical corrections for postseismic motion in group 2 time series.
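
    The velocity parameterization described above (constant velocity plus annual and semi-annual terms, no postseismic decay) can be written as an ordinary least-squares problem. The sketch below shows such a fit for a single position component; times are assumed to be in decimal years and positions in millimetres, and no temporally correlated noise model is included.

    import numpy as np

    def fit_secular_velocity(t_yr, pos_mm):
        # Columns: offset, velocity, annual cos/sin, semi-annual cos/sin
        w = 2.0 * np.pi  # one cycle per year for t in decimal years
        G = np.column_stack([
            np.ones_like(t_yr), t_yr,
            np.cos(w * t_yr), np.sin(w * t_yr),
            np.cos(2 * w * t_yr), np.sin(2 * w * t_yr),
        ])
        m, *_ = np.linalg.lstsq(G, pos_mm, rcond=None)
        return m  # m[1] is the secular velocity in mm/yr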

  16. Modeling the dynamics of metabolism in montane streams using continuous dissolved oxygen measurements

    NASA Astrophysics Data System (ADS)

    Birkel, Christian; Soulsby, Chris; Malcolm, Iain; Tetzlaff, Doerthe

    2013-09-01

    We inferred in-stream ecosystem processes in terms of photosynthetic productivity (P), system respiration (R), and reaeration capacity (RC) from a five-parameter numerical oxygen mass balance model driven by radiation, stream and air temperature, and stream depth. This was calibrated to high-resolution (15 min), long-term (2.5 years) dissolved oxygen (DO) time series for moorland and forest reaches of a third-order montane stream in Scotland. The model was multi-criteria calibrated to continuous 24 h periods within the time series to identify behavioral simulations representative of ecosystem functioning. Results were evaluated using a seasonal regional sensitivity analysis and a collinearity index for parameter sensitivity. This showed that >95% of the behavioral models for the moorland and forest sites were identifiable and able to infer in-stream processes from the DO time series for around 40% and 32% of the time period, respectively. Monthly P/R ratios <1 indicate a heterotrophic system with both sites exhibiting similar temporal patterns, with a maximum in February and a second peak during the summer months. However, the estimated net ecosystem productivity suggests that the moorland reach without riparian tree cover is likely to be a much larger source of carbon to the atmosphere (122 mmol C m-2 d-1) compared to the forested reach (64 mmol C m-2 d-1). We conclude that such process-based oxygen mass balance models may be transferable tools for investigating other systems; specifically, well-oxygenated upland channels with high hydraulic roughness and lacking reaeration measurements.
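
    As a rough illustration of a single-station oxygen mass balance of the kind calibrated above, the sketch below integrates dO/dt = (P - R)/z + k(Osat - O) with a light-scaled production term. The saturation approximation, parameter values and forcing are all hypothetical and much simpler than the five-parameter model of the study.

    import numpy as np

    def simulate_do(light, temp_c, o_init, gpp_max, er, k_rea, depth_m, dt_hr=0.25):
        # dO/dt = (P(t) - R)/z + k * (Osat(T) - O); units mg/L, hours, metres
        def o_sat(t_c):
            # crude saturation curve; a real model would use an established formula
            return 14.6 - 0.4 * t_c + 0.008 * t_c ** 2

        o = np.empty(len(light))
        o[0] = o_init
        for i in range(1, len(light)):
            p = gpp_max * light[i] / (light.max() + 1e-12)  # light-limited production
            dodt = (p - er) / depth_m + k_rea * (o_sat(temp_c[i]) - o[i - 1])
            o[i] = o[i - 1] + dodt * dt_hr
        return o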

  17. 33 CFR 149.315 - What embarkation, launching, and recovery arrangements must rescue boats meet?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) DEEPWATER PORTS DEEPWATER PORTS: DESIGN, CONSTRUCTION... the shortest possible time. (c) If the rescue boat is one of the deepwater port's survival craft, then... disengaging apparatus, approved under approval series 160.170, instead of a lifeboat release mechanism. (f...

  18. 40 CFR 63.742 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... commercial or military service in the capacity for which it was designed. Carbon adsorber means one vessel in a series of vessels in a carbon adsorption system that contains carbon and is used to remove gaseous... process that removes permanent coating in small sections at a time and maintains a continuous vacuum...

  19. 40 CFR 63.742 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... commercial or military service in the capacity for which it was designed. Carbon adsorber means one vessel in a series of vessels in a carbon adsorption system that contains carbon and is used to remove gaseous... process that removes permanent coating in small sections at a time and maintains a continuous vacuum...

  20. National Centers for Environmental Prediction

    Science.gov Websites

    The Environmental Modeling Center continuously monitors its NWP model performance against different performance measures: GFS SSI and forecast fits to RAOBS (and AIRCFT data) for the last 7 days, spatial bias maps for different regions, and GFS SSI and forecast fits to RAOBS for calendar months (time series, spatial and vertical).

  1. The Demand for Higher Education in Puerto Rico.

    ERIC Educational Resources Information Center

    King, Jonathan

    1993-01-01

    Uses time-series data to estimate empirical enrollment functions for three Puerto Rico university systems. Measures opportunity cost and benefits to education as expected wage rates and tests a market segmentation process. Results show that the universities are not substitutes for one another. To cope with continuing revenue shortfalls,…

  2. 40 CFR 63.742 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... commercial or military service in the capacity for which it was designed. Carbon adsorber means one vessel in a series of vessels in a carbon adsorption system that contains carbon and is used to remove gaseous... process that removes permanent coating in small sections at a time and maintains a continuous vacuum...

  3. 40 CFR 63.742 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... commercial or military service in the capacity for which it was designed. Carbon adsorber means one vessel in a series of vessels in a carbon adsorption system that contains carbon and is used to remove gaseous... process that removes permanent coating in small sections at a time and maintains a continuous vacuum...

  4. 40 CFR 63.742 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... commercial or military service in the capacity for which it was designed. Carbon adsorber means one vessel in a series of vessels in a carbon adsorption system that contains carbon and is used to remove gaseous... process that removes permanent coating in small sections at a time and maintains a continuous vacuum...

  5. Mapping the Archives: 2

    ERIC Educational Resources Information Center

    Jackson, Anthony

    2012-01-01

    With this issue, "RiDE" continues its new occasional series of short informational pieces on archives in the field of drama and theatre education and applied theatre and performance. Each instalment includes summaries of one or more collections of significant material in the field. Over time this will build into a readily accessible…

  6. Mapping the Archives: 3

    ERIC Educational Resources Information Center

    Jackson, Anthony

    2013-01-01

    With this issue, "Research in Drama Education" (RiDE) continues its occasional series of short informational pieces on archives in the field of drama and theatre education and applied theatre and performance. Each instalment includes summaries of one or more collections of significant material in the field. Over time, this will build in…

  7. Model-Based Design of Long-Distance Tracer Transport Experiments in Plants.

    PubMed

    Bühler, Jonas; von Lieres, Eric; Huber, Gregor J

    2018-01-01

    Studies of long-distance transport of tracer isotopes in plants offer a high potential for functional phenotyping, but so far measurement time is a bottleneck because continuous time series of at least 1 h are required to obtain reliable estimates of transport properties. Hence, usual throughput values are between 0.5 and 1 samples per hour. Here, we propose to increase sample throughput by introducing temporal gaps in the data acquisition of each plant sample and measuring multiple plants one after another in a rotating scheme. In contrast to common time series analysis methods, mechanistic tracer transport models allow the analysis of interrupted time series. The uncertainties of the model parameter estimates are used as a measure of how much information was lost compared to complete time series. A case study was set up to systematically investigate different experimental schedules for different throughput scenarios ranging from 1 to 12 samples per hour. Selected designs with only a small number of data points were found to be sufficient for an adequate parameter estimation, implying that the presented approach enables a substantial increase of sample throughput. The presented general framework for automated generation and evaluation of experimental schedules allows the determination of a maximal sample throughput and the respective optimal measurement schedule depending on the required statistical reliability of data acquired by future experiments.

  8. Using long time series of Landsat data to monitor impervious surface dynamics: a case study in the Zhoushan Islands

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoping; Pan, Delu; Chen, Jianyu; Zhan, Yuanzeng; Mao, Zhihua

    2013-01-01

    Islands are an important part of the marine ecosystem. Increasing impervious surfaces in the Zhoushan Islands due to new development and increased population have an ecological impact on the runoff and water quality. Based on time-series classification and the complement of vegetation fraction in urban regions, Landsat thematic mapper and other high-resolution satellite images were applied to monitor the dynamics of impervious surface area (ISA) in the Zhoushan Islands from 1986 to 2011. Landsat-derived ISA results were validated by the high-resolution Worldview-2 and aerial photographs. The validation shows that mean relative errors of these ISA maps are <15%. The results reveal that the ISA in the Zhoushan Islands increased from 19.2 km2 in 1986 to 86.5 km2 in 2011, and the period from 2006 to 2011 had the fastest expansion rate of 5.59 km2 per year. The major land conversions to high densities of ISA were from the tidal zone and arable lands. The expansions of ISA were unevenly distributed and most of them were located along the periphery of these islands. Time-series maps revealed that ISA expansions happened continuously over the last 25 years. Our analysis indicated that the policy and the topography were the dominant factors controlling the spatial patterns of ISA and its expansions in the Zhoushan Islands. With continuing urbanization, the rapid ISA expansion is unlikely to stop in the near future.

  9. GPU-accelerated algorithms for many-particle continuous-time quantum walks

    NASA Astrophysics Data System (ADS)

    Piccinini, Enrico; Benedetti, Claudia; Siloi, Ilaria; Paris, Matteo G. A.; Bordone, Paolo

    2017-06-01

    Many-particle continuous-time quantum walks (CTQWs) represent a resource for several tasks in quantum technology, including quantum search algorithms and universal quantum computation. In order to design and implement CTQWs in a realistic scenario, one needs effective simulation tools for Hamiltonians that take into account static noise and fluctuations in the lattice, i.e. Hamiltonians containing stochastic terms. To this aim, we suggest a parallel algorithm based on the Taylor series expansion of the evolution operator, and compare its performance with that of algorithms based on the exact diagonalization of the Hamiltonian or a 4th order Runge-Kutta integration. We prove that both Taylor-series expansion and Runge-Kutta algorithms are reliable and have a low computational cost, the Taylor-series expansion showing the additional advantage of a memory allocation not depending on the precision of calculation. Both algorithms are also highly parallelizable within the SIMT paradigm, and are thus suitable for GPGPU computing. We have benchmarked 4 NVIDIA GPUs and 3 quad-core Intel CPUs for a 2-particle system over lattices of increasing dimension, showing that the speedup provided by GPU computing, with respect to the OPENMP parallelization, lies in the range between 8x and (more than) 20x, depending on the frequency of post-processing. GPU-accelerated codes thus allow one to overcome concerns about the execution time, and make simulations with many interacting particles on large lattices possible, with the only limit being the memory available on the device.
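
    The Taylor-series propagation mentioned above can be sketched in a few lines: the state is advanced by accumulating matrix-vector products of the truncated series for exp(-iHt), which avoids forming the full matrix exponential. The toy Hamiltonian below is a small nearest-neighbour hopping lattice, used only to check the result against scipy's dense expm; it is not the noisy many-particle Hamiltonian of the paper, and the GPU parallelization is not shown.

    import numpy as np
    from scipy.linalg import expm

    def taylor_propagate(H, psi0, t, terms=30):
        # psi(t) = sum_k (-i H t)^k / k! applied to psi0, accumulated term by term
        psi = psi0.astype(complex).copy()
        term = psi0.astype(complex).copy()
        for k in range(1, terms):
            term = (-1j * t / k) * (H @ term)
            psi += term
        return psi

    n = 8
    H = np.zeros((n, n))
    for i in range(n - 1):
        H[i, i + 1] = H[i + 1, i] = -1.0   # nearest-neighbour hopping
    psi0 = np.zeros(n); psi0[n // 2] = 1.0
    assert np.allclose(taylor_propagate(H, psi0, 0.5),
                       expm(-1j * H * 0.5) @ psi0, atol=1e-8)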

  10. A robust algorithm for optimisation and customisation of fractal dimensions of time series modified by nonlinearly scaling their time derivatives: mathematical theory and practical applications.

    PubMed

    Fuss, Franz Konstantin

    2013-01-01

    Standard methods for computing the fractal dimensions of time series are usually tested with continuous nowhere differentiable functions, but not benchmarked with actual signals. Therefore they can produce opposite results in extreme signals. These methods also use different scaling methods, that is, different amplitude multipliers, which makes it difficult to compare fractal dimensions obtained from different methods. The purpose of this research was to develop an optimisation method that computes the fractal dimension of a normalised (dimensionless) and modified time series signal with a robust algorithm and a running average method, and that maximises the difference between two fractal dimensions, for example, a minimum and a maximum one. The signal is modified by transforming its amplitude by a multiplier, which has a non-linear effect on the signal's time derivative. The optimisation method identifies the optimal multiplier of the normalised amplitude for targeted decision making based on fractal dimensions. The optimisation method provides an additional filter effect and makes the fractal dimensions less noisy. The method is exemplified by, and explained with, different signals, such as human movement, EEG, and acoustic signals.

  11. A Robust Algorithm for Optimisation and Customisation of Fractal Dimensions of Time Series Modified by Nonlinearly Scaling Their Time Derivatives: Mathematical Theory and Practical Applications

    PubMed Central

    2013-01-01

    Standard methods for computing the fractal dimensions of time series are usually tested with continuous nowhere differentiable functions, but not benchmarked with actual signals. Therefore they can produce opposite results in extreme signals. These methods also use different scaling methods, that is, different amplitude multipliers, which makes it difficult to compare fractal dimensions obtained from different methods. The purpose of this research was to develop an optimisation method that computes the fractal dimension of a normalised (dimensionless) and modified time series signal with a robust algorithm and a running average method, and that maximises the difference between two fractal dimensions, for example, a minimum and a maximum one. The signal is modified by transforming its amplitude by a multiplier, which has a non-linear effect on the signal's time derivative. The optimisation method identifies the optimal multiplier of the normalised amplitude for targeted decision making based on fractal dimensions. The optimisation method provides an additional filter effect and makes the fractal dimensions less noisy. The method is exemplified by, and explained with, different signals, such as human movement, EEG, and acoustic signals. PMID:24151522

  12. New View of Relativity Theory

    NASA Astrophysics Data System (ADS)

    Martini, Luiz Cesar

    2014-04-01

    This article results from the Dimensional Continuous Space-Time Theory that was published in reference 1. The Dimensional Continuous Space-Time Theory presents a series of facts relating to matter, energy, and space, and concludes that empty space is inelastic, absolutely stationary, motionless, perpetual, and without possibility of deformation; neither can it be destroyed or created. An elementary cell of empty space, or a certain amount of empty space, can be occupied by any quantity of energy or matter without any alteration or deformation. As a consequence of these properties, and as an integral part of the theory, the principles of Relativity Theory must be changed to become simple and intuitive.

  13. Infilling and quality checking of discharge, precipitation and temperature data using a copula based approach

    NASA Astrophysics Data System (ADS)

    Anwar, Faizan; Bárdossy, András; Seidel, Jochen

    2017-04-01

    Estimating missing values in a time series of a hydrological variable is an everyday task for a hydrologist. Existing methods such as inverse distance weighting, multivariate regression, and kriging, though simple to apply, provide no indication of the quality of the estimated value and depend mainly on the values of neighboring stations at a given step in the time series. Copulas have the advantage of representing the pure dependence structure between two or more variables (given the relationship between them is monotonic). They obviate questions such as whether to transform the data before use or which functions to use to model the relationship between the considered variables. A copula-based approach is suggested to infill discharge, precipitation, and temperature data. As a first step the normal copula is used; subsequently, the necessity of using non-normal / non-symmetric dependence is investigated. Discharge and temperature are treated as regular continuous variables and can be used without processing for infilling and quality checking. Due to the mixed distribution of precipitation values, precipitation has to be treated differently. This is done by assigning a discrete probability to the zeros and treating the rest as a continuous distribution. Building on the work of others, along with infilling, the normal copula is also utilized to identify values in a time series that might be erroneous. This is done by treating the available value as missing, infilling it using the normal copula and checking if it lies within a confidence band (5 to 95% in our case) of the obtained conditional distribution. Hydrological data from two catchments, the Upper Neckar River (Germany) and the Santa River (Peru), are used to demonstrate the application for datasets with different data quality. The Python code used here is also made available on GitHub. The required input is the time series of a given variable at different stations.
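
    A minimal sketch of normal-copula infilling for a single missing value, conditioning a target station on one complete neighbour station, is given below. It rank-transforms both series to standard-normal scores, uses the bivariate-normal conditional distribution, and back-transforms the conditional median through the target's empirical quantiles. It illustrates the general idea only; it is not the authors' GitHub code, and it omits multi-neighbour conditioning and the special treatment of precipitation zeros.

    import numpy as np
    from scipy.stats import norm, rankdata

    def infill_normal_copula(target, neighbor, i_missing):
        mask = ~np.isnan(target)
        mask[i_missing] = False

        # Rank-transform observed pairs to standard-normal scores
        u_t = (rankdata(target[mask]) - 0.5) / mask.sum()
        u_n = (rankdata(neighbor[mask]) - 0.5) / mask.sum()
        rho = np.corrcoef(norm.ppf(u_t), norm.ppf(u_n))[0, 1]

        # Normal score of the neighbour value at the missing time step
        u_miss = (np.sum(neighbor[mask] <= neighbor[i_missing]) + 0.5) / (mask.sum() + 1)
        z_cond = rho * norm.ppf(u_miss)      # conditional mean in score space
        # (conditional std is sqrt(1 - rho**2); it gives a 5-95% band if needed)

        # Back-transform the conditional median via the target's empirical quantiles
        return np.quantile(target[mask], norm.cdf(z_cond))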

  14. Snowmelt and Surface Freeze/Thaw Timings over Alaska derived from Passive Microwave Observations using a Wavelet Classifier

    NASA Astrophysics Data System (ADS)

    Steiner, N.; McDonald, K. C.; Dinardo, S. J.; Miller, C. E.

    2015-12-01

    Arctic permafrost soils contain a vast amount of organic carbon that will be released into the atmosphere as carbon dioxide or methane when thawed. Surface-to-air greenhouse gas fluxes are largely dependent on such surface controls as the frozen/thawed state of the snow and soil. Satellite remote sensing is an important means to create continuous mapping of surface properties. Advances in the ability to determine soil and snow freeze/thaw timings from microwave frequency observations improve our ability to predict the response of carbon gas emission to warming through synthesis with in-situ observations, such as those from the 2012-2015 Carbon in Arctic Reservoir Vulnerability Experiment (CARVE). Surface freeze/thaw or snowmelt timings are often derived using a constant or spatially/temporally variable threshold applied to time-series observations. Alternatively, time-series singularity classifiers aim to detect discontinuous changes, or "edges", in time-series data similar to those that occur from the large contrast in dielectric constant during the freezing or thawing of soil or snow. We use multi-scale analysis of continuous wavelet transform spectral gradient brightness temperatures from various channel combinations of passive microwave radiometers, the Advanced Microwave Scanning Radiometer (AMSR-E, AMSR2) and the Special Sensor Microwave Imager (SSM/I F17), gridded at a 10 km posting with resolution proportional to the observational footprint. Channel combinations presented here aim to illustrate and differentiate timings of "edges" from transitions in surface water related to various landscape components (e.g. snow-melt, soil-thaw). To support an understanding of the physical basis of observed "edges", we compare satellite measurements with simple radiative transfer microwave-emission modeling of the snow, soil and vegetation using in-situ observations from the SNOw TELemetry (SNOTEL) automated weather stations. Results of freeze/thaw and snow-melt timings and trends are reported for Alaska and the North-West Canadian Arctic for the period 2002 to 2015.
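
    The edge-detection idea described above, locating abrupt transitions in a brightness-temperature time series with a continuous wavelet transform, can be sketched as follows. This sketch uses PyWavelets with a first-derivative-of-Gaussian wavelet and an arbitrary scale range; it is not the operational multi-channel classifier of the study.

    import numpy as np
    import pywt

    def detect_edge_cwt(tb_series, scales=np.arange(2, 32), wavelet="gaus1"):
        # |CWT| summed over scales peaks at step-like transitions (e.g. melt onset)
        coefs, _ = pywt.cwt(tb_series, scales, wavelet)
        edge_strength = np.abs(coefs).sum(axis=0)
        return int(np.argmax(edge_strength)), edge_strength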

  15. Continuous day-time time series of E-region equatorial electric fields derived from ground magnetic observatory data

    NASA Astrophysics Data System (ADS)

    Alken, P.; Chulliat, A.; Maus, S.

    2012-12-01

    The day-time eastward equatorial electric field (EEF) in the ionospheric E-region plays an important role in equatorial ionospheric dynamics. It is responsible for driving the equatorial electrojet (EEJ) current system, equatorial vertical ion drifts, and the equatorial ionization anomaly (EIA). Due to its importance, there is much interest in accurately measuring and modeling the EEF. However, there are limited sources of direct EEF measurements with full temporal and spatial coverage of the equatorial ionosphere. In this work, we propose a method of estimating a continuous day-time time series of the EEF at any longitude, provided there is a pair of ground magnetic observatories in the region which can accurately track changes in the strength of the EEJ. First, we derive a climatological unit latitudinal current profile from direct overflights of the CHAMP satellite and use delta H measurements from the ground observatory pair to determine the magnitude of the current. The time series of current profiles is then inverted for the EEF by solving the governing electrodynamic equations. While this method has previously been applied and validated in the Peruvian sector, in this work we demonstrate the method using a pair of magnetometers in Africa (Samogossoni, SAM, 0.18 degrees magnetic latitude and Tamanrasset, TAM, 11.5 degrees magnetic latitude) and validate the resulting EEF values against the CINDI ion velocity meter (IVM) instrument on the C/NOFS satellite. We find a very good 80% correlation with C/NOFS IVM measurements and a root-mean-square difference of 9 m/s in vertical drift velocity. This technique can be extended to any pair of ground observatories which can capture the day-time strength of the EEJ. We plan to apply this work to more observatory pairs around the globe and distribute real-time equatorial electric field values to the community.

  16. Searching for the Signature of Wastewater Injection in continuous GPS Data from The Geysers Geothermal Field

    NASA Astrophysics Data System (ADS)

    Terry, R. L.; Funning, G.; Floyd, M.

    2017-12-01

    The Geysers geothermal field in California, which provides a large portion of northern California's power, has seen declining steam pressures over the past three decades, accompanied by surface subsidence. Together, these two phenomena are likely the result of the exploitation of the reservoir without adequate time for natural restoration. To combat the decline in steam pressures, The Geysers began injecting imported wastewater into the geothermal reservoir in 1997 and expanded injection in 2003. In 2012 and 2013, we installed three continuously recording GPS stations in The Geysers to closely monitor crustal deformation due to both the extraction of steam and the injection of wastewater. To assess the impact of the current injection and extraction activities on the geothermal reservoir, we analyze the position time-series from these GPS stations alongside wastewater injection and steam extraction data. We use common-mode filtering to remove any regionally-correlated noise from our GPS time series, and also estimate and subtract any seasonal signals present. To predict the effect of injection and production on surface movement, we summed the monthly time series of well data within a rectangular grid framework. We then use an array of Mogi sources based on each grid cell's total volume change to calculate the expected surface deformation due to these volume changes at depth. The temporal resolution provided by GPS allows us to characterize more accurately the properties of the subsurface geothermal reservoir related to forcing. For example, based on a similar spatiotemporal relationship between injection and seismicity, we hypothesize that there may be a delayed deformation response following injection, related to the permeability of the reservoir, and are undertaking detailed comparisons of our time series data to identify this response. Overall changes in the sense and rate of vertical motion in the field due to injection over time are also expected. We anticipate that the impact of discovering a relationship between injection and surface deformation will be of great importance in maintaining and managing geothermal resources in the future.
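
    The forward calculation described above, summing an array of Mogi point sources to predict surface deformation from gridded volume changes, can be sketched as below. It uses the standard Mogi half-space expression for vertical displacement; the source depths and volume changes are hypothetical and are not The Geysers well data.

    import numpy as np

    def mogi_uz(x, y, xs, ys, depth, dvol, nu=0.25):
        # Vertical displacement from a Mogi point source of volume change dvol (m^3)
        # at (xs, ys, depth): uz = (1 - nu)/pi * dvol * d / (d^2 + r^2)^(3/2)
        r2 = (x - xs) ** 2 + (y - ys) ** 2
        return (1.0 - nu) / np.pi * dvol * depth / (depth ** 2 + r2) ** 1.5

    def summed_uz(x, y, cells):
        # cells: list of (xs, ys, depth, dvol) for each grid cell's net volume change
        return sum(mogi_uz(x, y, *c) for c in cells)

    # Hypothetical example: two cells with net withdrawal (negative dvol) -> subsidence
    x, y = np.meshgrid(np.linspace(-5e3, 5e3, 101), np.linspace(-5e3, 5e3, 101))
    uz = summed_uz(x, y, [(0.0, 0.0, 2500.0, -1e5), (1500.0, 0.0, 2500.0, -5e4)])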

  17. A Fresh Look at Spatio-Temporal Remote Sensing Data: Data Formats, Processing Flow, and Visualization

    NASA Astrophysics Data System (ADS)

    Gens, R.

    2017-12-01

    With an increasing number of experimental and operational satellites in orbit, remote sensing based mapping and monitoring of the dynamic Earth has entered into the realm of `big data'. The Landsat series of satellites alone provides a near continuous archive of 45 years of data. The availability of such spatio-temporal datasets has created opportunities for long-term monitoring of diverse features and processes operating on the Earth's terrestrial and aquatic systems. Processes such as erosion, deposition, subsidence, uplift, evapotranspiration, urbanization, and land-cover regime shifts can not only be monitored, but the associated change can also be quantified using time-series data analysis. This unique opportunity comes with new challenges in management, analysis, and visualization of spatio-temporal datasets. Data need to be stored in a user-friendly format, and relevant metadata needs to be recorded, to allow maximum flexibility for data exchange and use. Specific data processing workflows need to be defined to support time-series analysis for specific applications. Value-added data products need to be generated keeping in mind the needs of the end-users, and using best practices in complex data visualization. This presentation systematically highlights the various steps for preparing spatio-temporal remote sensing data for time series analysis. It showcases a prototype workflow for remote sensing based change detection that can be generically applied while preserving the application-specific fidelity of the datasets. The prototype includes strategies for visualizing change over time. This has been exemplified using a time-series of optical and SAR images for visualizing the changing glacial, coastal, and wetland landscapes in parts of Alaska.

  18. Modelling spatiotemporal change using multidimensional arrays

    NASA Astrophysics Data System (ADS)

    Lu, Meng; Appel, Marius; Pebesma, Edzer

    2017-04-01

    The large variety of remote sensors, model simulations, and in-situ records provide great opportunities to model environmental change. The massive amount of high-dimensional data calls for methods to integrate data from various sources and to analyse spatiotemporal and thematic information jointly. An array is a collection of elements ordered and indexed in arbitrary dimensions, which naturally represents spatiotemporal phenomena that are identified by their geographic locations and recording time. In addition, array regridding (e.g., resampling, down-/up-scaling), dimension reduction, and spatiotemporal statistical algorithms are readily applicable to arrays. However, the role of arrays in big geoscientific data analysis has not been systematically studied: How can arrays discretise continuous spatiotemporal phenomena? How can arrays facilitate the extraction of multidimensional information? How can arrays provide a clean, scalable and reproducible change modelling process that is communicable between mathematicians, computer scientists, Earth system scientists and stakeholders? This study emphasises detecting spatiotemporal change using satellite image time series. Current change detection methods using satellite image time series commonly analyse data in separate steps: 1) forming a vegetation index, 2) conducting time series analysis on each pixel, and 3) post-processing and mapping time series analysis results, which does not consider spatiotemporal correlations and ignores much of the spectral information. Multidimensional information can be better extracted by jointly considering spatial, spectral, and temporal information. To approach this goal, we use principal component analysis to extract multispectral information and spatial autoregressive models to account for spatial correlation in residual-based time series structural change modelling. We also discuss the potential of multivariate non-parametric time series structural change methods, hierarchical modelling, and extreme event detection methods to model spatiotemporal change. We show how array operations can facilitate expressing these methods, and how the open-source array data management and analytics software SciDB and R can be used to scale the process and make it easily reproducible.

  19. Modeling multiple time series annotations as noisy distortions of the ground truth: An Expectation-Maximization approach.

    PubMed

    Gupta, Rahul; Audhkhasi, Kartik; Jacokes, Zach; Rozga, Agata; Narayanan, Shrikanth

    2018-01-01

    Studies of time-continuous human behavioral phenomena often rely on ratings from multiple annotators. Since the ground truth of the target construct is often latent, the standard practice is to use ad-hoc metrics (such as averaging annotator ratings). Despite being easy to compute, such metrics may not provide accurate representations of the underlying construct. In this paper, we present a novel method for modeling multiple time series annotations over a continuous variable that computes the ground truth by modeling annotator-specific distortions. We condition the ground truth on a set of features extracted from the data and further assume that the annotators provide their ratings as modifications of the ground truth, with each annotator having specific distortion tendencies. We train the model using an Expectation-Maximization based algorithm and evaluate it on a study involving natural interaction between a child and a psychologist, to predict confidence ratings of the children's smiles. We compare and analyze the model against two baselines where: (i) the ground truth is considered to be the framewise mean of ratings from the various annotators and (ii) each annotator is assumed to bear a distinct time delay in annotation, and their annotations are aligned before computing the framewise mean.
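
    A toy version of the idea, estimating a latent ground-truth series from several annotators whose ratings are modelled as gain/bias distortions plus noise, alternating between updating the latent series and refitting the per-annotator distortion parameters, is sketched below. It is a simplified stand-in for the paper's Expectation-Maximization formulation: no features conditioning the ground truth and no annotator time delays are included.

    import numpy as np

    def em_ground_truth(Y, n_iter=50):
        # Y[j, t] = a_j * x[t] + b_j + noise(0, s2_j); Y is (annotators, frames)
        J, T = Y.shape
        a, b, s2 = np.ones(J), np.zeros(J), np.ones(J)
        x = Y.mean(axis=0)                      # initialise with the framewise mean
        for _ in range(n_iter):
            # E-like step: per-frame weighted estimate of the latent series
            x = ((a / s2)[:, None] * (Y - b[:, None])).sum(axis=0) / (a ** 2 / s2).sum()
            # M-like step: refit each annotator's gain, bias and noise variance
            vx = x - x.mean()
            for j in range(J):
                a[j] = (vx @ (Y[j] - Y[j].mean())) / (vx @ vx + 1e-12)
                b[j] = Y[j].mean() - a[j] * x.mean()
                s2[j] = np.mean((Y[j] - a[j] * x - b[j]) ** 2) + 1e-9
        return x, a, b, s2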

  20. Record statistics of a strongly correlated time series: random walks and Lévy flights

    NASA Astrophysics Data System (ADS)

    Godrèche, Claude; Majumdar, Satya N.; Schehr, Grégory

    2017-08-01

    We review recent advances on the record statistics of strongly correlated time series, whose entries denote the positions of a random walk or a Lévy flight on a line. After a brief survey of the theory of records for independent and identically distributed random variables, we focus on random walks. During the last few years, it was indeed realized that random walks are a very useful ‘laboratory’ to test the effects of correlations on the record statistics. We start with the simple one-dimensional random walk with symmetric jumps (both continuous and discrete) and discuss in detail the statistics of the number of records, as well as of the ages of the records, i.e. the lapses of time between two successive record breaking events. Then we review the results that were obtained for a wide variety of random walk models, including random walks with a linear drift, continuous time random walks, constrained random walks (like the random walk bridge) and the case of multiple independent random walkers. Finally, we discuss further observables related to records, like the record increments, as well as some questions raised by physical applications of record statistics, like the effects of measurement error and noise.
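
    The basic record-counting observable reviewed above is easy to reproduce numerically. The short sketch below counts upper records of simulated Gaussian random walks and compares the mean with the well-known 2*sqrt(n/pi) growth law for symmetric, continuous jump distributions.

    import numpy as np

    def count_records(walk):
        # A record occurs whenever the running maximum strictly increases;
        # the first entry is counted as a record.
        running_max = np.maximum.accumulate(walk)
        return 1 + int(np.sum(running_max[1:] > running_max[:-1]))

    rng = np.random.default_rng(0)
    n, trials = 1000, 2000
    counts = [count_records(np.cumsum(rng.standard_normal(n))) for _ in range(trials)]
    print(np.mean(counts), 2 * np.sqrt(n / np.pi))   # should be close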

  1. Near-Field Postseismic Deformation Measurements from the Andaman and Nicobar Islands

    NASA Astrophysics Data System (ADS)

    Freymueller, J. T.; Rajendran, C.; Rajendran, K.; Rajamani, A.

    2006-12-01

    Since the December 26, 2004 Sumatra-Andaman Islands earthquake, we have carried out campaign GPS measurements at several sites in the Andaman and Nicobar Islands (India) and installed three continuous GPS sites in the region. Most of these sites had pre-earthquake measurements, which showed slow westward motion relative to the Indian plate. Postseismic measurements, on the other hand, show average westward velocities of several cm/yr to a few decimeters per year relative to the Indian plate. The motion of all sites is strongly non-linear in time, and is not uniform in space. We use a combination of continuous site time series and nearby campaign site time series to construct the most complete possible postseismic displacement records. Postseismic deformation from large earthquakes is likely to be dominated by a combination of afterslip on the deeper subduction interface, and viscoelastic relaxation of the mantle. Afterslip following the (similar magnitude) 1964 Alaska earthquake amounted to 20-50% of the magnitude of the coseismic slip, and smaller subduction zone earthquakes have exhibited the same or even larger proportion of afterslip to coseismic slip. We compare the time decay and spatial pattern of the observed postseismic displacement to postseismic deformation models and to observations from the Alaska earthquake.

  2. A stochastical event-based continuous time step rainfall generator based on Poisson rectangular pulse and microcanonical random cascade models

    NASA Astrophysics Data System (ADS)

    Pohle, Ina; Niebisch, Michael; Zha, Tingting; Schümberg, Sabine; Müller, Hannes; Maurer, Thomas; Hinz, Christoph

    2017-04-01

    Rainfall variability within a storm is of major importance for fast hydrological processes, e.g. surface runoff, erosion and solute dissipation from surface soils. To investigate and simulate the impacts of within-storm variabilities on these processes, long time series of rainfall with high resolution are required. Yet, observed precipitation records of hourly or higher resolution are in most cases available only for a small number of stations and only for a few years. To obtain long time series of alternating rainfall events and interstorm periods while conserving the statistics of observed rainfall events, the Poisson model can be used. Multiplicative microcanonical random cascades have been widely applied to disaggregate rainfall time series from coarse to fine temporal resolution. We present a new coupling approach of the Poisson rectangular pulse model and the multiplicative microcanonical random cascade model that preserves the characteristics of rainfall events as well as inter-storm periods. In the first step, a Poisson rectangular pulse model is applied to generate discrete rainfall events (duration and mean intensity) and inter-storm periods (duration). The rainfall events are subsequently disaggregated to high-resolution time series (user-specified, e.g. 10 min resolution) by a multiplicative microcanonical random cascade model. One of the challenges of coupling these models is to parameterize the cascade model for the event durations generated by the Poisson model. In fact, the cascade model is best suited to downscale rainfall data with constant time step such as daily precipitation data. Without starting from a fixed time step duration (e.g. daily), the disaggregation of events requires some modifications of the multiplicative microcanonical random cascade model proposed by Olsson (1998): Firstly, the parameterization of the cascade model for events of different durations requires continuous functions for the probabilities of the multiplicative weights, which we implemented through sigmoid functions. Secondly, the branching of the first and last box is constrained to preserve the rainfall event durations generated by the Poisson rectangular pulse model. The event-based continuous time step rainfall generator has been developed and tested using 10 min and hourly rainfall data of four stations in North-Eastern Germany. The model performs well in comparison to observed rainfall in terms of event durations and mean event intensities as well as wet spell and dry spell durations. It is currently being tested using data from other stations across Germany and in different climate zones. Furthermore, the rainfall event generator is being applied in modelling approaches aimed at understanding the impact of rainfall variability on hydrological processes. Reference: Olsson, J.: Evaluation of a scaling cascade model for temporal rainfall disaggregation, Hydrology and Earth System Sciences, 2, 19-30, 1998.
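
    A stripped-down illustration of the two coupled components, exponential-parameter Poisson rectangular pulses for events and dry spells, and a mass-conserving microcanonical cascade that splits each event's rain depth into finer intervals, is given below. The distributions, parameter values and the simple branching rule are placeholders, not the calibrated sigmoid-parameterized model of the study.

    import numpy as np

    rng = np.random.default_rng(42)

    def poisson_rectangular_pulses(n_events, mean_dry_h=30.0, mean_dur_h=6.0, mean_int_mm_h=1.5):
        # Alternating dry-spell durations, event durations and mean event intensities
        dry = rng.exponential(mean_dry_h, n_events)
        dur = rng.exponential(mean_dur_h, n_events)
        inten = rng.exponential(mean_int_mm_h, n_events)
        return dry, dur, inten

    def cascade_disaggregate(total_mm, levels=5, p_onesided=0.2):
        # Microcanonical cascade: split each interval's rain mass in two, either
        # entirely to one half (probability p_onesided) or by a uniform weight;
        # total mass is conserved exactly at every level.
        mass = np.array([total_mm])
        for _ in range(levels):
            w = rng.uniform(0.0, 1.0, size=mass.size)
            one_sided = rng.random(mass.size) < p_onesided
            w[one_sided] = rng.choice([0.0, 1.0], size=int(one_sided.sum()))
            mass = np.column_stack([mass * w, mass * (1.0 - w)]).ravel()
        return mass   # 2**levels sub-interval depths summing to total_mm

    dry, dur, inten = poisson_rectangular_pulses(3)
    profile = cascade_disaggregate(dur[0] * inten[0], levels=5)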

  3. An accuracy assessment of realtime GNSS time series toward semi- real time seafloor geodetic observation

    NASA Astrophysics Data System (ADS)

    Osada, Y.; Ohta, Y.; Demachi, T.; Kido, M.; Fujimoto, H.; Azuma, R.; Hino, R.

    2013-12-01

    Large interplate earthquakes have repeatedly occurred in the Japan Trench. Recently, detailed crustal deformation has been revealed by the nationwide inland GPS network known as GEONET, operated by GSI. However, the region of maximum displacement for interplate earthquakes is mainly located offshore. GPS/Acoustic seafloor geodetic observation (hereafter GPS/A) is therefore quite important and useful for understanding the shallower part of interplate coupling between the subducting and overriding plates. We typically conduct GPS/A in a specific ocean area in repeated campaign style using a research vessel or buoy. Therefore, we cannot monitor the temporal variation of seafloor crustal deformation in real time. One of the technical issues in real-time observation is kinematic GPS analysis, because kinematic GPS analysis is based on reference and rover data. If precise kinematic GPS analysis becomes possible in the offshore region, it should be a promising method for real-time GPS/A with a USV (Unmanned Surface Vehicle) and a moored buoy. We assessed the stability, precision and accuracy of the StarFireTM global satellite-based augmentation system. We first tested StarFire under static conditions. In order to assess coordinate precision and accuracy, we compared the 1 Hz StarFire time series with post-processed precise point positioning (PPP) 1 Hz time series computed by the GIPSY-OASIS II processing software Ver. 6.1.2 with three different product types (ultra-rapid, rapid, and final orbits). We also used different clock-interval information (30 and 300 seconds) for the post-processed PPP processing. The standard deviation of the real-time StarFire time series is less than 30 mm (horizontal components) and 60 mm (vertical component) based on 1 month of continuous processing. We also assessed the noise spectrum of the time series estimated by StarFire and by the post-processed GIPSY PPP results. We found that the noise spectrum of the StarFire time series is similar to that of the GIPSY-OASIS II processing result based on JPL rapid orbit products with 300 second interval clock information. We also report the stability, precision and accuracy of StarFire under moving conditions.

  4. Interferometer with Continuously Varying Path Length Measured in Wavelengths to the Reference Mirror

    NASA Technical Reports Server (NTRS)

    Ohara, Tetsuo (Inventor)

    2016-01-01

    An interferometer in which the path length of the reference beam, measured in wavelengths, is continuously changing in sinusoidal fashion and the interference signal created by combining the measurement beam and the reference beam is processed in real time to obtain the physical distance along the measurement beam between the measured surface and a spatial reference frame such as the beam splitter. The processing involves analyzing the Fourier series of the intensity signal at one or more optical detectors in real time and using the time-domain multi-frequency harmonic signals to extract the phase information independently at each pixel position of one or more optical detectors and converting the phase information to distance information.

  5. Investigation of Coastal Hydrogeology Utilizing Geophysical and Geochemical Tools along the Broward County Coast, Florida

    USGS Publications Warehouse

    Reich, Christopher D.; Swarzenski, Peter W.; Greenwood, W. Jason; Wiese, Dana S.

    2008-01-01

    Geophysical (CHIRP, boomer, and continuous direct-current resistivity) and geochemical tracer studies (continuous and time-series 222Radon) were conducted along the Broward County coast from Port Everglades to Hillsboro Inlet, Florida. Simultaneous seismic, direct-current resistivity, and radon surveys in the coastal waters provided information to characterize the geologic framework and identify potential groundwater-discharge sites. Time-series radon at the Nova Southeastern University National Coral Reef Institute (NSU/NCRI) seawall indicated a very strong tidally modulated discharge of ground water with 222Rn activities ranging from 4 to 10 disintegrations per minute per liter depending on tidal stage. CHIRP seismic data provided very detailed bottom profiles (i.e., bathymetry); however, acoustic penetration was poor and resulted in no observed subsurface geologic structure. Boomer data, on the other hand, showed features that are indicative of karst, antecedent topography (buried reefs), and sand-filled troughs. Continuous resistivity profiling (CRP) data showed slight variability in the subsurface along the coast. Subtle changes in subsurface resistivity between nearshore (higher values) and offshore (lower values) profiles may indicate either a freshening of subsurface water nearshore or a change in sediment porosity or lithology. Further lithologic and hydrologic controls from sediment or rock cores or well data are needed to constrain the variability in CRP data.

  6. Detection of contaminated pixels based on the short-term continuity of NDVI and correction using spatio-temporal continuity

    NASA Astrophysics Data System (ADS)

    Cho, A.-Ra; Suh, Myoung-Seok

    2013-08-01

    The present study developed and assessed a correction technique (CSaTC: Correction based on Spatial and Temporal Continuity) for the detection and correction of contaminated Normalized Difference Vegetation Index (NDVI) time series data. Global Inventory Modeling and Mapping Studies (GIMMS) NDVI data from 1982 to 2006 with a 15-day period and an 8-km spatial resolution were used. CSaTC utilizes the short-term continuity of vegetation to detect contaminated pixels and then corrects the detected pixels using the spatio-temporal continuity of vegetation. CSaTC was applied to the NDVI data over the East Asian region, which exhibits diverse seasonal and interannual variations in vegetation activities. The correction skill of CSaTC was compared to two previously applied methods, IDR (iterative Interpolation for Data Reconstruction) and Park et al. (2011), using GIMMS NDVI data. CSaTC reasonably resolved the overcorrection and spreading phenomenon caused by excessive correction in Park et al. (2011). Validation using simulated NDVI time series data showed that CSaTC has systematically better correction skill in bias and RMSE irrespective of vegetation phenology type and noise level. In general, CSaTC showed a good recovery of contaminated data appearing over short periods, at a level similar to that obtained using the IDR technique. In addition, it captured the multiple peaks of NDVI and the germination and defoliation patterns more accurately than IDR, which overcompensates in seasons with high temporal variation and where NDVI data exhibit multiple peaks.
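
    A bare-bones sketch of continuity-based cleaning is shown below: samples that drop sharply below both temporal neighbours (clouds depress NDVI) are flagged, and the flagged samples are filled by linear interpolation in time. The published CSaTC method also uses spatial neighbours and a more elaborate correction; this sketch keeps only the temporal part, and the threshold value is arbitrary.

    import numpy as np

    def detect_and_correct_ndvi(ndvi, drop_threshold=0.15):
        x = ndvi.astype(float).copy()
        prev_, next_ = np.roll(x, 1), np.roll(x, -1)
        contaminated = (prev_ - x > drop_threshold) & (next_ - x > drop_threshold)
        contaminated[[0, -1]] = False          # edges wrap under np.roll; leave them
        t = np.arange(len(x))
        x[contaminated] = np.interp(t[contaminated], t[~contaminated], x[~contaminated])
        return x, contaminated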

  7. Spectral analysis of time series of categorical variables in earth sciences

    NASA Astrophysics Data System (ADS)

    Pardo-Igúzquiza, Eulogio; Rodríguez-Tovar, Francisco J.; Dorador, Javier

    2016-10-01

    Time series of categorical variables often appear in Earth Science disciplines and there is considerable interest in studying their cyclic behavior. This is true, for example, when the type of facies, petrofabric features, ichnofabrics, fossil assemblages or mineral compositions are measured continuously over a core or throughout a stratigraphic succession. Here we deal with the problem of applying spectral analysis to such sequences. A full indicator approach is proposed to complement the spectral envelope often used in other disciplines. Additionally, a stand-alone computer program is provided for calculating the spectral envelope, in this case implementing the permutation test to assess the statistical significance of the spectral peaks. We studied simulated sequences as well as real data in order to illustrate the methodology.
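
    As a rough, hedged sketch of the full indicator approach described above (not the authors' stand-alone program), each category can be expanded into a 0/1 indicator series, a periodogram computed per indicator, and the significance of spectral peaks assessed with a permutation test; all function and variable names below are illustrative assumptions.

```python
import numpy as np

def indicator_periodograms(labels, n_perm=499, rng=None):
    """Periodograms of 0/1 indicator series for a categorical sequence,
    with a permutation test for the largest spectral peak of each category."""
    rng = np.random.default_rng(rng)
    labels = np.asarray(labels)
    n = labels.size
    freqs = np.fft.rfftfreq(n, d=1.0)[1:]              # skip the zero frequency
    out = {}
    for c in np.unique(labels):
        x = (labels == c).astype(float)
        x -= x.mean()
        power = np.abs(np.fft.rfft(x))[1:] ** 2 / n
        peak = power.max()
        # permutation test: shuffling destroys any cyclic ordering
        null_peaks = np.empty(n_perm)
        for i in range(n_perm):
            xs = rng.permutation(x)
            null_peaks[i] = (np.abs(np.fft.rfft(xs))[1:] ** 2 / n).max()
        p_value = (1 + np.sum(null_peaks >= peak)) / (n_perm + 1)
        out[c] = {"freqs": freqs, "power": power, "p_peak": p_value}
    return out

# Example: a facies-like sequence with a built-in cycle of period 8
seq = np.tile(list("AABBCCDD"), 32)
for cat, r in indicator_periodograms(seq, rng=0).items():
    print(cat, "dominant period:", 1 / r["freqs"][np.argmax(r["power"])], "p:", r["p_peak"])
```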

  8. Decadal GPS Time Series and Velocity Fields Spanning the North American Continent and Beyond: New Data Products, Cyberinfrastructure and Case Studies from the EarthScope Plate Boundary Observatory (PBO) and Other Regional Networks

    NASA Astrophysics Data System (ADS)

    Phillips, D. A.; Herring, T.; Melbourne, T. I.; Murray, M. H.; Szeliga, W. M.; Floyd, M.; Puskas, C. M.; King, R. W.; Boler, F. M.; Meertens, C. M.; Mattioli, G. S.

    2017-12-01

    The Geodesy Advancing Geosciences and EarthScope (GAGE) Facility, operated by UNAVCO, provides a diverse suite of geodetic data, derived products and cyberinfrastructure services to support community Earth science research and education. GPS data and products including decadal station position time series and velocities are provided for 2000+ continuous GPS stations from the Plate Boundary Observatory (PBO) and other networks distributed throughout the high Arctic, North America, and Caribbean regions. The position time series contain a multitude of signals in addition to the secular motions, including coseismic and postseismic displacements, interseismic strain accumulation, and transient signals associated with hydrologic and other processes. We present our latest velocity field solutions, new time series offset estimate products, and new time series examples associated with various phenomena. Position time series, and the signals they contain, are inherently dependent upon analysis parameters such as network scaling and reference frame realization. The estimation of scale changes for example, a common practice, has large impacts on vertical motion estimates. GAGE/PBO velocities and time series are currently provided in IGS (IGb08) and North America (NAM08, IGb08 rotated to a fixed North America Plate) reference frames. We are reprocessing all data (1996 to present) as part of the transition from IGb08 to IGS14 that began in 2017. New NAM14 and IGS14 data products are discussed. GAGE/PBO GPS data products are currently generated using onsite computing clusters. As part of an NSF funded EarthCube Building Blocks project called "Deploying MultiFacility Cyberinfrastructure in Commercial and Private Cloud-based Systems (GeoSciCloud)", we are investigating performance, cost, and efficiency differences between local computing resources and cloud based resources. Test environments include a commercial cloud provider (Amazon/AWS), NSF cloud-like infrastructures within XSEDE (TACC, the Texas Advanced Computing Center), and in-house cyberinfrastructures. Preliminary findings from this effort are presented. Web services developed by UNAVCO to facilitate the discovery, customization and dissemination of GPS data and products are also presented.

  9. West-Coast Wide Expansion and Testing of the Geodetic Alarm System (G-larmS)

    NASA Astrophysics Data System (ADS)

    Ruhl, C. J.; Grapenthin, R.; Melgar, D.; Aranha, M. A.; Allen, R. M.

    2016-12-01

    The Geodetic Alarm System (G-larmS) was developed in collaboration between the Berkeley Seismological Laboratory (BSL) and New Mexico Tech for real-time Earthquake Early Warning (EEW). G-larmS has been in continuous operation at the BSL since 2014 using event triggers from the ShakeAlert EEW system and real-time position time series from a fully triangulated network consisting of BARD, PBO and USGS stations across northern California (CA). G-larmS has been extended to include southern CA and Cascadia, providing continuous west-coast wide coverage. G-larmS currently uses high rate (1 Hz), low latency (< 5 s), accurate positioning (cm level) time series data from a regional GPS network and P-wave event triggers from the ShakeAlert EEW system. It extracts static offsets from real-time GPS time series upon S-wave arrival and performs a least squares inversion on these offsets to determine slip on a finite fault. A key issue with geodetic EEW approaches is that unlike seismology-based algorithms that are routinely tested using frequent small-magnitude events, geodetic systems are not regularly exercised. Scenario ruptures are therefore important for testing the performance of G-larmS. We discuss results from scenario events on several large faults (capable of M>6.5) in CA and Cascadia built from realistic 3D geometries. Synthetic long-period 1Hz displacement waveforms were obtained from a new stochastic kinematic slip distribution generation method. Waveforms are validated by direct comparison to peak P-wave displacement scaling laws and to PGD GMPEs obtained from high-rate GPS observations of large events worldwide. We run the scenarios on real-time streams to systematically test the recovery of slip and magnitude by G-larmS. In addition to presenting these results, we will discuss new capabilities, such as implementing 2D geometry and the applicability of these results to GPS enhanced tsunami warning systems.
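
    The inversion step described above (static offsets inverted for slip on a finite fault) can be sketched generically as a damped least-squares problem; the Green's functions, station geometry, patch area and rigidity below are invented for illustration and are not G-larmS code or its configuration.

```python
import numpy as np

# Hypothetical sketch: invert GPS static offsets d (east/north per station) for
# slip s on a few fault patches via damped least squares, d = G s. G would
# normally come from elastic dislocation Green's functions (e.g. Okada); here it
# is a made-up matrix, and patch area / rigidity are assumed values.
rng = np.random.default_rng(1)
n_stations, n_patches = 8, 4
G = rng.normal(scale=1e-2, size=(2 * n_stations, n_patches))     # m of offset per m of slip
true_slip = np.array([0.0, 1.5, 2.0, 0.5])                       # metres
offsets = G @ true_slip + rng.normal(scale=2e-3, size=2 * n_stations)  # cm-level noise

damping = 1e-3                                                   # small Tikhonov damping
slip_hat, *_ = np.linalg.lstsq(
    np.vstack([G, damping * np.eye(n_patches)]),
    np.concatenate([offsets, np.zeros(n_patches)]),
    rcond=None,
)
print("recovered slip (m):", np.round(slip_hat, 2))

# A magnitude estimate follows from slip, patch area and rigidity (assumed values)
mu, patch_area = 30e9, 10e3 * 10e3                               # Pa, m^2
M0 = mu * patch_area * np.sum(np.abs(slip_hat))
Mw = (2.0 / 3.0) * (np.log10(M0) - 9.1)
print("Mw estimate: %.2f" % Mw)
```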

  10. 76 FR 79650 - Proposed Information Collection; Comment Request; Survey of Income and Program Participation...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-22

    ... household-based survey designed as a continuous series of national panels. New panels are introduced every... with questions designed to address specific needs, such as obtaining information on household members... economic well-being and permitted changes in these levels to be measured over time. The 2008 panel is...

  11. 33 CFR 149.315 - What embarkation, launching, and recovery arrangements must rescue boats meet?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) DEEPWATER PORTS DEEPWATER PORTS: DESIGN, CONSTRUCTION... the shortest possible time. (c) If the rescue boat is one of the deepwater port's survival craft, then... apparatus, approved under approval series 160.170, instead of a lifeboat release mechanism. (f) The rescue...

  12. 33 CFR 149.315 - What embarkation, launching, and recovery arrangements must rescue boats meet?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) DEEPWATER PORTS DEEPWATER PORTS: DESIGN, CONSTRUCTION... the shortest possible time. (c) If the rescue boat is one of the deepwater port's survival craft, then... apparatus, approved under approval series 160.170, instead of a lifeboat release mechanism. (f) The rescue...

  13. 40 CFR 797.1950 - Mysid shrimp chronic toxicity test.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... kill 50 percent of a test population during continuous exposure over a specified period of time. (6... with the test design into retention chambers within the test and the control chambers. Mysids in the... the definitive test. (ii) The mysids should be exposed to a series of widely spaced concentrations of...

  14. 76 FR 24457 - Proposed Information Collection; Comment Request; Survey of Income and Program Participation...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-02

    ... the SIPP, which is a household-based survey designed as a continuous series of national panels. New... the panel. The core is supplemented with questions designed to address specific needs, such as... be measured over time. The 2008 panel is currently scheduled for approximately 6 years and will...

  15. 33 CFR 149.315 - What embarkation, launching, and recovery arrangements must rescue boats meet?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) DEEPWATER PORTS DEEPWATER PORTS: DESIGN, CONSTRUCTION... the shortest possible time. (c) If the rescue boat is one of the deepwater port's survival craft, then... apparatus, approved under approval series 160.170, instead of a lifeboat release mechanism. (f) The rescue...

  16. Status of the American Public School Teacher, 2005-2006

    ERIC Educational Resources Information Center

    Wolman, Paul, Ed.

    2010-01-01

    A continuing need for comprehensive and timely information about the public school teachers of the United States led the National Education Association (NEA) Research Division in 1956 to develop the first of a series of surveys and subsequent reports covering various aspects of teachers' professional, family, and civic lives. The NEA has conducted…

  17. Military Enlistments: What Can We Learn from Geographic Variation? Technical Report 620.

    ERIC Educational Resources Information Center

    Brown, Charles

    Some economic variables were examined that affect enlistment decisions and therefore affect the continued success of the All-Volunteer Force. The study used a multiple regression, pooled cross-section/time-series model over the 1975-1982 period, including pay, unemployment, educational benefits, and recruiting resources as independent variables.…

  18. New Roles for New Times: Digital Curation for Preservation

    ERIC Educational Resources Information Center

    Walters, Tyler; Skinner, Katherine

    2011-01-01

    Digital curation refers to the actions people take to maintain and add value to digital information over its lifecycle, including the processes used when creating digital content. Digital preservation focuses on the "series of managed activities necessary to ensure continued access to digital materials for as long as necessary." In this…

  19. Safety of High Speed Ground Transportation Systems - Human Factors Phase II: Design and Evaluation of Decision Aids for Control of High-Speed Trains: Experiments and Model

    DOT National Transportation Integrated Search

    1996-12-01

    Although the speed of some guided ground transportation systems continues to : increase, the reaction time and the sensory and information processing : capacities of railroad personnel remain constant. This second report in a : series examining criti...

  20. State Actions to Advance Teacher Evaluation. Educator Effectiveness Series

    ERIC Educational Resources Information Center

    Gandha, Tysza; Baxter, Andy

    2016-01-01

    This report offers state leaders key areas for action to continue progress in implementing evaluation systems, even as federal policies on teacher evaluation relax state requirements. The Southern Regional Education Board (SREB) offers its current best thinking for how state agencies can make the smartest use of funds, time and partners to refine…

  1. Technology Education and the Arts

    ERIC Educational Resources Information Center

    Roman, Harry T.

    2009-01-01

    One hears quite frequently how the arts continually suffer in the academic day. Many long-time technology education champions certainly know what this is all about; but there may be some ways to use technology education to bring the arts into the classroom. This article offers a series of activities and suggestions that will help students better…

  2. Neural Networks as a Tool for Constructing Continuous NDVI Time Series from AVHRR and MODIS

    NASA Technical Reports Server (NTRS)

    Brown, Molly E.; Lary, David J.; Vrieling, Anton; Stathakis, Demetris; Mussa, Hamse

    2008-01-01

    The long term Advanced Very High Resolution Radiometer-Normalized Difference Vegetation Index (AVHRR-NDVI) record provides a critical historical perspective on vegetation dynamics necessary for global change research. Despite the proliferation of new sources of global, moderate resolution vegetation datasets, the remote sensing community is still struggling to create datasets derived from multiple sensors that allow the simultaneous use of spectral vegetation for time series analysis. To overcome the non-stationary aspect of NDVI, we use an artificial neural network (ANN) to map the NDVI indices from AVHRR to those from MODIS using atmospheric, surface type and sensor-specific inputs to account for the differences between the sensors. The NDVI dynamics and range of MODIS NDVI data at one degree is matched and extended through the AVHRR record. Four years of overlap between the two sensors is used to train a neural network to remove atmospheric and sensor specific effects on the AVHRR NDVI. In this paper, we present the resulting continuous dataset, its relationship to MODIS data, and a validation of the product.
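
    A minimal, hedged sketch of the mapping idea (not the authors' network or inputs): train a small feed-forward regressor on overlap-period samples to predict MODIS NDVI from AVHRR NDVI plus ancillary predictors, then apply it to the historical AVHRR record. The synthetic data, predictor set and network size below are assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Hypothetical sketch: learn AVHRR-to-MODIS NDVI mapping over an overlap period,
# then apply it to the historical record. Synthetic data for illustration only.
rng = np.random.default_rng(0)
n = 5000
avhrr_ndvi = rng.uniform(0.05, 0.9, n)
solar_zenith = rng.uniform(20, 70, n)                 # example ancillary predictor
surface_type = rng.integers(0, 5, n).astype(float)    # example surface-type code
# Pretend MODIS differs from AVHRR by a mild, state-dependent nonlinear bias
modis_ndvi = (avhrr_ndvi
              + 0.05 * np.sin(3 * avhrr_ndvi)
              - 0.0005 * (solar_zenith - 45)
              + 0.01 * surface_type / 5
              + rng.normal(0, 0.01, n))

X = np.column_stack([avhrr_ndvi, solar_zenith, surface_type])
X_tr, X_te, y_tr, y_te = train_test_split(X, modis_ndvi, test_size=0.25, random_state=0)

ann = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
ann.fit(X_tr, y_tr)
print("R^2 on held-out overlap samples:", round(ann.score(X_te, y_te), 3))
# Applying ann.predict to pre-2000 AVHRR inputs would extend the MODIS-like record.
```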

  3. Hypergeometric continuation of divergent perturbation series: II. Comparison with Shanks transformation and Padé approximation

    NASA Astrophysics Data System (ADS)

    Sanders, Sören; Holthaus, Martin

    2017-11-01

    We explore in detail how analytic continuation of divergent perturbation series by generalized hypergeometric functions is achieved in practice. Using the example of strong-coupling perturbation series provided by the two-dimensional Bose-Hubbard model, we compare hypergeometric continuation to Shanks and Padé techniques, and demonstrate that the former yields a powerful, efficient and reliable alternative for computing the phase diagram of the Mott insulator-to-superfluid transition. In contrast to Shanks transformations and Padé approximations, hypergeometric continuation also allows us to determine the exponents which characterize the divergence of correlation functions at the transition points. Therefore, hypergeometric continuation constitutes a promising tool for the study of quantum phase transitions.
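
    For readers unfamiliar with the two benchmark techniques mentioned above, here is a minimal sketch of a Shanks transformation and a Padé approximant applied to a toy series (the alternating series for ln 2), not to the Bose-Hubbard perturbation series; the [6/5] Padé order is an arbitrary illustrative choice.

```python
import numpy as np
from scipy.interpolate import pade

# Toy illustration of the two classical benchmarks named above, applied to the
# slowly converging alternating series for ln(2), not the Bose-Hubbard series.

def partial_sums(n_terms):
    return np.cumsum([(-1) ** k / (k + 1) for k in range(n_terms)])

def shanks(s):
    """One pass of the Shanks transformation on a sequence of partial sums."""
    s = np.asarray(s, dtype=float)
    num = s[2:] * s[:-2] - s[1:-1] ** 2
    den = s[2:] + s[:-2] - 2 * s[1:-1]
    return num / den

S = partial_sums(12)
print("plain partial sum :", S[-1])
print("one Shanks pass   :", shanks(S)[-1])
print("two Shanks passes :", shanks(shanks(S))[-1])

# Pade approximant of ln(1+x) = x - x^2/2 + x^3/3 - ..., evaluated at x = 1
a = [0.0] + [(-1) ** (k + 1) / k for k in range(1, 12)]
p, q = pade(a, 5)                     # [6/5] approximant (p, q are np.poly1d)
print("Pade [6/5] at x=1 :", p(1.0) / q(1.0))
print("target ln(2)      :", np.log(2))
```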

  4. Long-term retrospective analysis of mackerel spawning in the North Sea: a new time series and modeling approach to CPR data.

    PubMed

    Jansen, Teunis; Kristensen, Kasper; Payne, Mark; Edwards, Martin; Schrum, Corinna; Pitois, Sophie

    2012-01-01

    We present a unique view of mackerel (Scomber scombrus) in the North Sea based on a new time series of larvae caught by the Continuous Plankton Recorder (CPR) survey from 1948-2005, covering the period both before and after the collapse of the North Sea stock. Hydrographic backtrack modelling suggested that the effect of advection is very limited between spawning and larvae capture in the CPR survey. Using a statistical technique not previously applied to CPR data, we then generated a larval index that accounts for both catchability as well as spatial and temporal autocorrelation. The resulting time series documents the significant decrease of spawning from before 1970 to recent depleted levels. Spatial distributions of the larvae, and thus the spawning area, showed a shift from early to recent decades, suggesting that the central North Sea is no longer as important as the areas further west and south. These results provide a consistent and unique perspective on the dynamics of mackerel in this region and can potentially resolve many of the unresolved questions about this stock.

  5. Subsidence and current strain patterns on Tenerife Island (Canary Archipelago, Spain) derived from continuous GNSS time series (2008-2015)

    NASA Astrophysics Data System (ADS)

    Sánchez-Alzola, A.; Martí, J.; García-Yeguas, A.; Gil, A. J.

    2016-11-01

    In this paper we present the current crustal deformation model of Tenerife Island derived from daily CGPS time series processing (2008-2015). Our results include the position time series, a global velocity estimation and the current crustal deformation on the island in terms of strain tensors. We detect a measurable subsidence of 1.5-2 mm/yr. in the vicinity of the Cañadas-Teide-Pico Viejo (CTPV) complex. These values are higher in the central part of the complex and could be explained by a lateral spreading of the elastic lithosphere combined with the effect of the drastic descent of the water table experienced on the island during recent decades. The results show that the Anaga massif is stable in both its horizontal and vertical components. The strain tensor analysis shows a 70 nstrain/yr. E-W compression in the central complex, perpendicular to the 2004 seismo-volcanic area, and 50 nstrain/yr. SW-NE extension towards the Northeast ridge. The residual velocity and strain patterns coincide with a decline in volcanic activity since the 2004 unrest.

  6. Long-Term Retrospective Analysis of Mackerel Spawning in the North Sea: A New Time Series and Modeling Approach to CPR Data

    PubMed Central

    Jansen, Teunis; Kristensen, Kasper; Payne, Mark; Edwards, Martin; Schrum, Corinna; Pitois, Sophie

    2012-01-01

    We present a unique view of mackerel (Scomber scombrus) in the North Sea based on a new time series of larvae caught by the Continuous Plankton Recorder (CPR) survey from 1948-2005, covering the period both before and after the collapse of the North Sea stock. Hydrographic backtrack modelling suggested that the effect of advection is very limited between spawning and larvae capture in the CPR survey. Using a statistical technique not previously applied to CPR data, we then generated a larval index that accounts for both catchability as well as spatial and temporal autocorrelation. The resulting time series documents the significant decrease of spawning from before 1970 to recent depleted levels. Spatial distributions of the larvae, and thus the spawning area, showed a shift from early to recent decades, suggesting that the central North Sea is no longer as important as the areas further west and south. These results provide a consistent and unique perspective on the dynamics of mackerel in this region and can potentially resolve many of the unresolved questions about this stock. PMID:22737221

  7. Disentangling the stochastic behavior of complex time series

    NASA Astrophysics Data System (ADS)

    Anvari, Mehrnaz; Tabar, M. Reza Rahimi; Peinke, Joachim; Lehnertz, Klaus

    2016-10-01

    Complex systems involving a large number of degrees of freedom generally exhibit non-stationary dynamics, which can result in either continuous or discontinuous sample paths of the corresponding time series. The latter sample paths may be caused by discontinuous events - or jumps - with some distributed amplitudes, and disentangling effects caused by such jumps from effects caused by normal diffusion processes is a main problem for a detailed understanding of stochastic dynamics of complex systems. Here we introduce a non-parametric method to address this general problem. By means of stochastic dynamical jump-diffusion modelling, we separate deterministic drift terms from different stochastic behaviors, namely diffusive and jumpy ones, and show that all of the unknown functions and coefficients of this modelling can be derived directly from measured time series. We demonstrate the applicability of our method to empirical observations by a data-driven inference of the deterministic drift term and of the diffusive and jumpy behavior in brain dynamics from ten epilepsy patients. In particular, these different stochastic behaviors provide extra information that can be regarded as valuable for diagnostic purposes.
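
    A minimal, hedged sketch of the general conditional-moment idea behind such data-driven inference (estimating drift and diffusion from binned Kramers-Moyal coefficients of a simulated Ornstein-Uhlenbeck series); the paper's separation of the jump contribution uses higher-order moments and is not reproduced here, and all parameter values are assumptions.

```python
import numpy as np

# Toy sketch (not the authors' code): estimate drift and diffusion functions of a
# Langevin-type process from a time series via binned conditional moments (the
# first two Kramers-Moyal coefficients). The jump-diffusion separation in the
# paper additionally uses higher-order moments, which are omitted here.
rng = np.random.default_rng(0)
dt, n = 1e-3, 200_000
x = np.empty(n)
x[0] = 0.0
for i in range(1, n):                       # Ornstein-Uhlenbeck: dx = -x dt + 0.5 dW
    x[i] = x[i - 1] - x[i - 1] * dt + 0.5 * np.sqrt(dt) * rng.normal()

dx = np.diff(x)
bins = np.linspace(-0.6, 0.6, 25)
centers = 0.5 * (bins[:-1] + bins[1:])
idx = np.digitize(x[:-1], bins) - 1

drift = np.full(centers.size, np.nan)
diffusion = np.full(centers.size, np.nan)
for b in range(centers.size):
    sel = idx == b
    if sel.sum() > 50:
        drift[b] = dx[sel].mean() / dt                    # D1(x) ~ <dx | x> / dt
        diffusion[b] = (dx[sel] ** 2).mean() / (2 * dt)   # D2(x) ~ <dx^2 | x> / (2 dt)

ok = ~np.isnan(drift)
print("drift slope (true -1):", round(np.polyfit(centers[ok], drift[ok], 1)[0], 2))
print("mean diffusion (true 0.125):", round(np.nanmean(diffusion), 3))
```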

  8. The Cross-Wavelet Transform and Analysis of Quasi-periodic Behavior in the Pearson-Readhead VLBI Survey Sources

    NASA Astrophysics Data System (ADS)

    Kelly, Brandon C.; Hughes, Philip A.; Aller, Hugh D.; Aller, Margo F.

    2003-07-01

    We introduce an algorithm for applying a cross-wavelet transform to analysis of quasi-periodic variations in a time series and introduce significance tests for the technique. We apply a continuous wavelet transform and the cross-wavelet algorithm to the Pearson-Readhead VLBI survey sources using data obtained from the University of Michigan 26 m paraboloid at observing frequencies of 14.5, 8.0, and 4.8 GHz. Thirty of the 62 sources were chosen to have sufficient data for analysis, having at least 100 data points for a given time series. Of these 30 sources, a little more than half exhibited evidence for quasi-periodic behavior in at least one observing frequency, with a mean characteristic period of 2.4 yr and standard deviation of 1.3 yr. We find that out of the 30 sources, there were about four timescales for every 10 time series, and about half of those sources showing quasi-periodic behavior repeated the behavior in at least one other observing frequency.
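
    A hedged sketch of the underlying machinery (a continuous Morlet wavelet transform and the cross-wavelet spectrum of two toy flux curves); the wavelet normalization, scale grid and the naive phase averaging below are illustrative simplifications, not the algorithm or the significance tests used in the survey analysis.

```python
import numpy as np

def morlet_cwt(x, scales, w0=6.0, dt=1.0):
    """Continuous wavelet transform with a complex Morlet wavelet (illustrative)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    out = np.empty((len(scales), len(x)), dtype=complex)
    for i, s in enumerate(scales):
        t = np.arange(-4 * s, 4 * s + dt, dt)
        wavelet = np.pi ** -0.25 * np.exp(1j * w0 * t / s) * np.exp(-0.5 * (t / s) ** 2)
        wavelet /= np.sqrt(s / dt)                      # crude scale normalization
        out[i] = np.convolve(x, np.conj(wavelet)[::-1], mode="same")
    return out

# Two toy flux curves sharing a quasi-period of ~50 samples, one lagged by 10
n = 1024
t = np.arange(n)
rng = np.random.default_rng(2)
x1 = np.sin(2 * np.pi * t / 50) + 0.5 * rng.normal(size=n)
x2 = np.sin(2 * np.pi * (t - 10) / 50) + 0.5 * rng.normal(size=n)

scales = np.arange(4, 120, 2, dtype=float)
W1, W2 = morlet_cwt(x1, scales), morlet_cwt(x2, scales)
cross = W1 * np.conj(W2)                                # cross-wavelet spectrum
mean_power = np.abs(cross).mean(axis=1)
k = np.argmax(mean_power)
print("scale of peak cross-power:", scales[k])
print("rough mean phase difference at that scale (rad):", np.angle(cross[k]).mean())
```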

  9. Review of current GPS methodologies for producing accurate time series and their error sources

    NASA Astrophysics Data System (ADS)

    He, Xiaoxing; Montillet, Jean-Philippe; Fernandes, Rui; Bos, Machiel; Yu, Kegen; Hua, Xianghong; Jiang, Weiping

    2017-05-01

    The Global Positioning System (GPS) is an important tool to observe and model geodynamic processes such as plate tectonics and post-glacial rebound. In the last three decades, GPS has seen tremendous advances in the precision of the measurements, which allow researchers to study geophysical signals through a careful analysis of daily time series of GPS receiver coordinates. However, the GPS observations contain errors and the time series can be described as the sum of a real signal and noise. The signal itself can again be divided into station displacements due to geophysical causes and to disturbing factors. Examples of the latter are errors in the realization and stability of the reference frame and corrections due to ionospheric and tropospheric delays and GPS satellite orbit errors. There is an increasing demand for detecting millimeter to sub-millimeter level ground displacement signals in order to further understand regional scale geodetic phenomena, hence requiring further improvements in the sensitivity of the GPS solutions. This paper provides a review spanning over 25 years of advances in processing strategies, error mitigation methods and noise modeling for the processing and analysis of GPS daily position time series. The processing of the observations is described step-by-step and mainly with three different strategies in order to explain the weaknesses and strengths of the existing methodologies. In particular, we focus on the choice of the stochastic model in the GPS time series, which directly affects the estimation of the functional model including, for example, tectonic rates, seasonal signals and co-seismic offsets. Moreover, the geodetic community continues to develop computational methods to fully automate all phases of the analysis of GPS time series. This idea is greatly motivated by the large number of GPS receivers installed around the world for diverse applications ranging from surveying small deformations of civil engineering structures (e.g., subsidence of a highway bridge) to the detection of particular geophysical signals.
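
    As a minimal, hedged illustration of the functional model discussed above (not any specific package), a daily position component can be fitted with an intercept, linear rate, annual and semi-annual terms and an offset by ordinary least squares; realistic rate uncertainties would additionally require the coloured-noise (e.g., power-law plus white) stochastic models the review emphasizes. All values below are synthetic.

```python
import numpy as np

# Minimal sketch of the functional model commonly fitted to a daily GPS position
# component: intercept, linear rate, annual + semi-annual terms and one offset.
# Ordinary least squares (white noise) is used here for illustration only.
rng = np.random.default_rng(3)
t = np.arange(0, 8, 1 / 365.25)                      # time in years, daily sampling
offset_epoch = 4.0
truth = (2.0 * t                                     # 2 mm/yr rate
         + 3.0 * np.sin(2 * np.pi * t)               # annual term
         + 1.0 * np.cos(4 * np.pi * t)               # semi-annual term
         + 5.0 * (t >= offset_epoch))                # 5 mm co-seismic/equipment offset
y = truth + rng.normal(scale=2.0, size=t.size)       # 2 mm white noise

A = np.column_stack([
    np.ones_like(t), t,
    np.sin(2 * np.pi * t), np.cos(2 * np.pi * t),
    np.sin(4 * np.pi * t), np.cos(4 * np.pi * t),
    (t >= offset_epoch).astype(float),
])
params, *_ = np.linalg.lstsq(A, y, rcond=None)
print("estimated rate (mm/yr):", round(params[1], 2), " offset (mm):", round(params[6], 2))
```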

  10. Generalization bounds of ERM-based learning processes for continuous-time Markov chains.

    PubMed

    Zhang, Chao; Tao, Dacheng

    2012-12-01

    Many existing results on statistical learning theory are based on the assumption that samples are independently and identically distributed (i.i.d.). However, the assumption of i.i.d. samples is not suitable for practical application to problems in which samples are time dependent. In this paper, we are mainly concerned with the empirical risk minimization (ERM) based learning process for time-dependent samples drawn from a continuous-time Markov chain. This learning process covers many kinds of practical applications, e.g., the prediction for a time series and the estimation of channel state information. Thus, it is significant to study its theoretical properties including the generalization bound, the asymptotic convergence, and the rate of convergence. It is noteworthy that, since samples are time dependent in this learning process, the concerns of this paper cannot (at least straightforwardly) be addressed by existing methods developed under the sample i.i.d. assumption. We first develop a deviation inequality for a sequence of time-dependent samples drawn from a continuous-time Markov chain and present a symmetrization inequality for such a sequence. By using the resultant deviation inequality and symmetrization inequality, we then obtain the generalization bounds of the ERM-based learning process for time-dependent samples drawn from a continuous-time Markov chain. Finally, based on the resultant generalization bounds, we analyze the asymptotic convergence and the rate of convergence of the learning process.

  11. Sensitivity analysis of the GNSS derived Victoria plate motion

    NASA Astrophysics Data System (ADS)

    Apolinário, João; Fernandes, Rui; Bos, Machiel

    2014-05-01

    Fernandes et al. (2013) estimated the angular velocity of the Victoria tectonic block from geodetic data (GNSS derived velocities) only. GNSS observations are sparse in this region and it is therefore of the utmost importance to use the available data (5 sites) in an optimal way. Unfortunately, the existing time-series were/are affected by missing data and offsets. In addition, some time-series were close to the minimal data span considered necessary to compute a reliable velocity solution: 2.5-3.0 years. In this research, we focus on the sensitivity of the derived angular velocity to changes in the data (longer data-span for some stations) by extending the used data-span: Fernandes et al. (2013) used data until September 2011. We also investigate the effect of adding other stations to the solution, which is now possible since more stations became available in the region. In addition, we study whether the conventional power-law plus white noise model is indeed the best stochastic model. In this respect, we apply different noise models using HECTOR (Bos et al. 2013), which can estimate offsets and seasonal signals simultaneously under different noise models. The estimation of seasonal signals is another important issue, since the time-series are rather short or contain large gaps at some stations, which implies that the seasonal signals can still have some effect on the estimated trends, as shown by Blewitt and Lavallee (2002) and Bos et al. (2010). We also quantify the magnitude of such differences in the estimation of the secular velocity and their effect on the derived angular velocity. Concerning the offsets, we investigate how they, both detected and undetected, can influence the estimated plate motion. The epochs of the offsets have been determined by visual inspection of the time-series. The influence of undetected offsets has been assessed by adding small synthetic random walk signals that are too small to be detected visually but might have an effect on the estimated trend (Williams 2003, Langbein 2012). Finally, our preferred angular velocity estimate is used to evaluate the consequences for the kinematics of the Victoria block, namely the magnitude and azimuth of the relative motions with respect to the Nubia and Somalia plates and their tectonic implications. References Agnew, D. C. (2013). Realistic simulations of geodetic network data: The Fakenet package, Seismol. Res. Lett., 84, 426-432, doi:10.1785/0220120185. Blewitt, G. & Lavallee, D. (2002). Effect of annual signals on geodetic velocity, J. Geophys. Res., 107(B7), doi:10.1029/2001JB000570. Bos, M.S., R.M.S. Fernandes, S. Williams, L. Bastos (2012). Fast Error Analysis of Continuous GNSS Observations with Missing Data, Journal of Geodesy, doi:10.1007/s00190-012-0605-0. Bos, M.S., L. Bastos, R.M.S. Fernandes (2009). The influence of seasonal signals on the estimation of the tectonic motion in short continuous GPS time-series, J. of Geodynamics, doi:10.1016/j.jog.2009.10.005. Fernandes, R.M.S., J. M. Miranda, D. Delvaux, D. S. Stamps and E. Saria (2013). Re-evaluation of the kinematics of Victoria Block using continuous GNSS data, Geophysical Journal International, doi:10.1093/gji/ggs071. Langbein, J. (2012). Estimating rate uncertainty with maximum likelihood: differences between power-law and flicker-random-walk models, Journal of Geodesy, Volume 86, Issue 9, pp 775-783. Williams, S. D. P. (2003). Offsets in Global Positioning System time series, J. Geophys. Res., 108(B6), 2310, doi:10.1029/2002JB002156.

  12. Phenological Parameters Estimation Tool

    NASA Technical Reports Server (NTRS)

    McKellip, Rodney D.; Ross, Kenton W.; Spruce, Joseph P.; Smoot, James C.; Ryan, Robert E.; Gasser, Gerald E.; Prados, Donald L.; Vaughan, Ronald D.

    2010-01-01

    The Phenological Parameters Estimation Tool (PPET) is a set of algorithms implemented in MATLAB that estimates key vegetative phenological parameters. For a given year, the PPET software package takes in temporally processed vegetation index data (3D spatio-temporal arrays) generated by the time series product tool (TSPT) and outputs spatial grids (2D arrays) of vegetation phenological parameters. As a precursor to PPET, the TSPT uses quality information for each pixel of each date to remove bad or suspect data, and then interpolates and digitally fills data voids in the time series to produce a continuous, smoothed vegetation index product. During processing, the TSPT displays NDVI (Normalized Difference Vegetation Index) time series plots and images from the temporally processed pixels. Both the TSPT and PPET currently use moderate resolution imaging spectroradiometer (MODIS) satellite multispectral data as a default, but each software package is modifiable and could be used with any high-temporal-rate remote sensing data collection system that is capable of producing vegetation indices. Raw MODIS data from the Aqua and Terra satellites is processed using the TSPT to generate a filtered time series data product. The PPET then uses the TSPT output to generate phenological parameters for desired locations. PPET output data tiles are mosaicked into a Conterminous United States (CONUS) data layer using ERDAS IMAGINE, or equivalent software package. Mosaics of the vegetation phenology data products are then reprojected to the desired map projection using ERDAS IMAGINE
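
    A hedged, minimal stand-in for the TSPT pre-processing step described above (mask suspect samples, fill voids, smooth); the quality flags, interpolation rule and Savitzky-Golay filter below are assumptions for illustration, not the actual TSPT algorithms or MODIS quality handling.

```python
import numpy as np
from scipy.signal import savgol_filter

# Illustrative stand-in for the TSPT pre-processing step: remove suspect samples
# from an NDVI time series, interpolate across the resulting voids, then smooth.
rng = np.random.default_rng(4)
t = np.arange(0, 46)                                   # one year of 8-day composites
ndvi = 0.25 + 0.35 * np.exp(-0.5 * ((t - 23) / 7.0) ** 2) + rng.normal(0, 0.01, t.size)
bad = rng.choice(t.size, size=8, replace=False)        # pretend these are cloud-contaminated
ndvi_obs = ndvi.copy()
ndvi_obs[bad] = np.nan

good = ~np.isnan(ndvi_obs)
filled = np.interp(t, t[good], ndvi_obs[good])         # fill voids by linear interpolation
smoothed = savgol_filter(filled, window_length=7, polyorder=2)

print("estimated peak-greenness composite:", t[np.argmax(smoothed)])
```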

  13. Continuous pCO2 time series from Ocean Networks Canada cabled observatories at the northeast Pacific shelf edge and in the sub-tidal Arctic

    NASA Astrophysics Data System (ADS)

    Juniper, S. Kim; Sastri, Akash; Mihaly, Steven; Duke, Patrick; Else, Brent; Thomas, Helmuth; Miller, Lisa

    2017-04-01

    Marine pCO2 sensor technology has progressed to the point where months-long time series from remotely-deployed pCO2 sensors can be used to document seasonal and higher frequency variability in pCO2 and its relationship to oceanographic processes. Ocean Networks Canada recently deployed pCO2 sensors on two cabled platforms: a bottom-moored (400 m depth), vertical profiler at the edge of the northeast Pacific continental shelf off Vancouver Island, Canada, and a subtidal seafloor platform in the Canadian High Arctic (69˚ N) at Cambridge Bay, Nunavut. Both platforms streamed continuous data to a shore-based archive from Pro-Oceanus pCO2 sensors and other oceanographic instruments. The vertical profiler time series revealed substantial intrusions of corrosive (high CO2/low O2), saltier, colder water masses during the summertime upwelling season and during winter-time reversals of along-slope currents. Step-wise profiles during the downcast provided the most reliable pCO2 data, permitting the sensor to equilibrate to the broad range of pCO2 concentrations encountered over the 400 metre depth interval. The Arctic pCO2 sensor was deployed in August 2015. Reversing seasonal trends in pCO2 and dissolved oxygen values can be related to the changing balance of photosynthesis and respiration under sea ice, as influenced by irradiance. Correlation of pCO2 and dissolved oxygen sensor data and the collection of calibration samples have permitted evaluation of sensor performance in relation to operational conditions encountered in vertical profiling and lengthy exposure to subzero seawater.

  14. Computing Science and Statistics. Volume 24. Graphics and Visualization

    DTIC Science & Technology

    1993-03-01

    ...the dough, turbulent fluid flow, the time between drips of water from a faucet, Brownian motion... behavior changes radically when the population growth... "which clearly is the discrete parameter analogue of continuous parameter time series analysis." One problem that statisticians traditionally seem to...

  15. A Satellite-Derived Climate-Quality Data Record of the Clear-Sky Surface Temperature of the Greenland Ice Sheet

    NASA Technical Reports Server (NTRS)

    Hall, Dorothy K.; Comiso, Josefino C.; DiGirolamo, Nicolo E.; Shuman, Christopher A.; Key, Jeffrey R.; Koenig, Lora S.

    2011-01-01

    We have developed a climate-quality data record of the clear-sky surface temperature of the Greenland Ice Sheet using the Moderate-Resolution Imaging Spectroradiometer (MODIS) Terra ice-surface temperature (IST) algorithm. A climate-data record (CDR) is a time series of measurements of sufficient length, consistency, and continuity to determine climate variability and change. We present daily and monthly Terra MODIS ISTs of the Greenland Ice Sheet beginning on 1 March 2000 and continuing through 31 December 2010 at 6.25-km spatial resolution on a polar stereographic grid within +/-3 hours of 17:00Z or 2:00 PM Local Solar Time. Preliminary validation of the ISTs at Summit Camp, Greenland, during the 2008-09 winter shows a cold bias: the MODIS IST underestimates the measured surface temperature by approximately 3 C when temperatures range from approximately -50 C to approximately -35 C. The ultimate goal is to develop a CDR that starts in 1981 with the Advanced Very High Resolution Radiometer (AVHRR) Polar Pathfinder (APP) dataset and continues with MODIS data from 2000 to the present. Differences in the APP and MODIS cloud masks have so far precluded the current IST records from spanning both the APP and MODIS IST time series in a seamless manner, though this will be revisited when the APP dataset has been reprocessed. The Greenland IST climate-quality data record is suitable for continuation using future Visible Infrared Imager Radiometer Suite (VIIRS) data and will be elevated in status to a CDR when at least 9 more years of climate-quality data become available either from MODIS Terra or Aqua, or from the VIIRS. The complete MODIS IST data record will be available online in the summer of 2011.

  16. Increasing retention in care of HIV-positive women in PMTCT services through continuous quality improvement-breakthrough (CQI-BTS) series in primary and secondary health care facilities in Nigeria: a cluster randomized controlled trial. The Lafiyan Jikin Mata Study.

    PubMed

    Oyeledun, Bolanle; Oronsaye, Frank; Oyelade, Taiwo; Becquet, Renaud; Odoh, Deborah; Anyaike, Chukwuma; Ogirima, Francis; Ameh, Bernice; Ajibola, Abiola; Osibo, Bamidele; Imarhiagbe, Collins; Abutu, Inedu

    2014-11-01

    Rates of retention in care of HIV-positive pregnant women in care programs in Nigeria remain generally poor with rates around 40% reported for specific programs. Poor quality of services in health facilities and long waiting times are among the critical factors militating against retention of these women in care. The aim of the interventions in this study is to assess whether a continuous quality improvement intervention using a Breakthrough Series approach in local district hospitals and primary health care clinics will lead to improved retention of HIV-positive women and mothers. A cluster randomized controlled trial with 32 health facilities randomized to receive a continuous quality improvement/Breakthrough Series intervention or not. The care protocol for HIV-infected pregnant women and mothers is the same in all sites. The quality improvement intervention started 4 months before enrollment of individual HIV-infected pregnant women and initially focused on reducing waiting times for women and also ensuring that antiretroviral drugs are dispensed on the same day as clinic attendance. The primary outcome measure is retention of HIV-positive mothers in care at 6 months postpartum. Results of this trial will inform whether quality improvement interventions are an effective means of improving retention in prevention of mother-to-child transmission of HIV programs and will also guide where health system interventions should focus to improve the quality of care for HIV-positive women. This will benefit policymakers and program managers as they seek to improve retention rates in HIV care programs.

  17. Chaos control by electric current in an enzymatic reaction.

    PubMed

    Lekebusch, A; Förster, A; Schneider, F W

    1996-09-01

    We apply the continuous delayed feedback method of Pyragas to control chaos in the enzymatic Peroxidase-Oxidase (PO) reaction, using the electric current as the control parameter. At each data point in the time series, a time delayed feedback function applies a small amplitude perturbation to inert platinum electrodes, which causes redox processes on the surface of the electrodes. These perturbations are calculated as the difference between the previous (time delayed) signal and the actual signal. Unstable periodic P1, 1(1), and 1(2) orbits (UPOs) were stabilized in the CSTR (continuous stirred tank reactor) experiments. The stabilization is demonstrated by at least three conditions: A minimum in the experimental dispersion function, the equality of the delay time with the period of the stabilized attractor and the embedment of the stabilized periodic attractor in the chaotic attractor.

  18. Test Results for the 2.5-kVA Ground Fault Detector.

    DTIC Science & Technology

    1986-01-01

    ...powered through Q1 is delayed in turning off by capacitor C4 for approximately 12 msec. When the relay does drop out, the triac Q7 (a bidirectional... Note time of completion of test series... TEST DATA FORM B (CONTINUATION)... TEST DATA FORM D: OCEAN OPERABILITY TESTS FOR ELECTRIC FIELD DETECTOR AND GROUND FAULT DETECTORS. This data form is to...

  19. A continuous analog of run length distributions reflecting accumulated fractionation events.

    PubMed

    Yu, Zhe; Sankoff, David

    2016-11-11

    We propose a new, continuous model of the fractionation process (duplicate gene deletion after polyploidization) on the real line. The aim is to infer how much DNA is deleted at a time, based on segment lengths for alternating deleted (invisible) and undeleted (visible) regions. After deriving a number of analytical results for "one-sided" fractionation, we undertake a series of simulations that help us identify the distribution of segment lengths as a gamma with shape and rate parameters evolving over time. This leads to an inference procedure based on observed length distributions for visible and invisible segments. We suggest extensions of this mathematical and simulation work to biologically realistic discrete models, including two-sided fractionation.
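
    A minimal, hedged sketch of the fitting step implied above: observed visible-segment lengths are fitted with a gamma distribution and the shape and rate parameters read off; the simulated lengths and parameter values are placeholders, not the paper's one-sided fractionation model.

```python
import numpy as np
from scipy import stats

# Minimal sketch of the inference idea: observed run lengths of undeleted
# (visible) segments are fitted with a gamma distribution whose shape and rate
# are then tracked over time. The simulated lengths below are placeholders.
rng = np.random.default_rng(5)
true_shape, true_rate = 1.8, 0.04                 # rate in deletions per kb, say
visible_lengths = rng.gamma(true_shape, 1.0 / true_rate, size=2000)

# Fit with the location fixed at zero (segment lengths are strictly positive)
shape_hat, loc, scale_hat = stats.gamma.fit(visible_lengths, floc=0)
print("fitted shape:", round(shape_hat, 2), " fitted rate:", round(1.0 / scale_hat, 4))

# Rough goodness of fit: Kolmogorov-Smirnov against the fitted distribution
ks = stats.kstest(visible_lengths, "gamma", args=(shape_hat, 0, scale_hat))
print("KS statistic:", round(ks.statistic, 3), " p-value:", round(ks.pvalue, 3))
```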

  20. PM₁₀ exposure and non-accidental mortality in Asian populations: a meta-analysis of time-series and case-crossover studies.

    PubMed

    Park, Hye Yin; Bae, Sanghyuk; Hong, Yun-Chul

    2013-01-01

    We investigated the association between particulate matter less than 10 µm in aerodynamic diameter (PM₁₀) exposure and non-accidental mortality in Asian populations by meta-analysis, using both time-series and case-crossover analysis. Among the 819 published studies searched from PubMed and EMBASE using key words related to PM₁₀ exposure and non-accidental mortality in Asian countries, 8 time-series and 4 case-crossover studies were selected for meta-analysis after exclusion by selection criteria. We obtained the relative risk (RR) and 95% confidence intervals (CI) of non-accidental mortality per 10 µg/m³ increase of daily PM₁₀ from each study. We used Q statistics to test the heterogeneity of the results among the different studies and evaluated for publication bias using Begg funnel plot and Egger test. Testing for heterogeneity showed significance (p<0.001); thus, we applied a random-effects model. RR (95% CI) per 10 µg/m³ increase of daily PM₁₀ for both the time-series and case-crossover studies combined, time-series studies relative risk only, and case-crossover studies only, were 1.0047 (1.0033 to 1.0062), 1.0057 (1.0029 to 1.0086), and 1.0027 (1.0010 to 1.0043), respectively. The non-significant Egger test suggested that this analysis was not likely to have a publication bias. We found a significant positive association between PM₁₀ exposure and non-accidental mortality among Asian populations. Continued investigations are encouraged to contribute to the health impact assessment and public health management of air pollution in Asian countries.
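
    A minimal, hedged sketch of random-effects pooling of per-study relative risks on the log scale (DerSimonian-Laird), the standard machinery behind such a meta-analysis; the study RRs and confidence intervals below are placeholders, not the twelve studies analyzed in the paper.

```python
import numpy as np

# Minimal sketch of random-effects pooling (DerSimonian-Laird) of per-study
# relative risks per 10 ug/m3 PM10, on the log scale. Placeholder inputs only.
rr = np.array([1.004, 1.006, 1.003, 1.008, 1.002, 1.005])
ci_lo = np.array([1.001, 1.002, 1.000, 1.003, 0.999, 1.001])
ci_hi = np.array([1.007, 1.010, 1.006, 1.013, 1.005, 1.009])

y = np.log(rr)
se = (np.log(ci_hi) - np.log(ci_lo)) / (2 * 1.96)      # SE of log-RR from the 95% CI
w = 1 / se**2                                          # fixed-effect weights

# Between-study variance tau^2 (DerSimonian-Laird) from Cochran's Q
y_fixed = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - y_fixed) ** 2)
k = len(y)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_star = 1 / (se**2 + tau2)                            # random-effects weights
y_re = np.sum(w_star * y) / np.sum(w_star)
se_re = np.sqrt(1 / np.sum(w_star))
print("pooled RR:", round(np.exp(y_re), 4),
      "95% CI:", round(np.exp(y_re - 1.96 * se_re), 4), "-", round(np.exp(y_re + 1.96 * se_re), 4))
print("Q:", round(Q, 2), " tau^2:", tau2)
```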

  1. The Chaotic Long-term X-ray Variability of 4U 1705-44

    NASA Astrophysics Data System (ADS)

    Phillipson, R. A.; Boyd, P. T.; Smale, A. P.

    2018-04-01

    The low-mass X-ray binary 4U1705-44 exhibits dramatic long-term X-ray time variability with a timescale of several hundred days. The All-Sky Monitor (ASM) aboard the Rossi X-ray Timing Explorer (RXTE) and the Japanese Monitor of All-sky X-ray Image (MAXI) aboard the International Space Station together have continuously observed the source from December 1995 through May 2014. The combined ASM-MAXI data provide a continuous time series over fifty times the length of the timescale of interest. Topological analysis can help us identify 'fingerprints' in the phase-space of a system unique to its equations of motion. The Birman-Williams theorem postulates that if such fingerprints are the same between two systems, then their equations of motion must be closely related. The phase-space embedding of the source light curve shows a strong resemblance to the double-welled nonlinear Duffing oscillator. We explore a range of parameters for which the Duffing oscillator closely mirrors the time evolution of 4U1705-44. We extract low period, unstable periodic orbits from the 4U1705-44 and Duffing time series and compare their topological information. The Duffing and 4U1705-44 topological properties are identical, providing strong evidence that they share the same underlying template. This suggests that we can look to the Duffing equation to help guide the development of a physical model to describe the long-term X-ray variability of this and other similarly behaved X-ray binary systems.
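
    A hedged sketch of the two ingredients named above: integrating a driven double-well Duffing oscillator and forming a time-delay (phase-space) embedding of one coordinate, as one would for the X-ray light curve; the Duffing parameters and the embedding delay are illustrative assumptions, not the values matched to 4U 1705-44.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hedged sketch: integrate a driven double-well Duffing oscillator
#   x'' + delta*x' - x + x^3 = gamma*cos(omega*t)
# and build a time-delay embedding of x(t), as one would for a light curve.
delta, gamma, omega = 0.3, 0.5, 1.2

def duffing(t, state):
    x, v = state
    return [v, -delta * v + x - x**3 + gamma * np.cos(omega * t)]

t_eval = np.arange(0.0, 2000.0, 0.1)
sol = solve_ivp(duffing, (0.0, 2000.0), [0.5, 0.0], t_eval=t_eval, rtol=1e-8, atol=1e-8)
x = sol.y[0][5000:]                      # drop the initial transient

# Time-delay embedding (x(t), x(t - tau), x(t - 2*tau)); tau would normally be
# chosen from the autocorrelation or mutual-information function of the series.
tau = 25
embedded = np.column_stack([x[2 * tau:], x[tau:-tau], x[:-2 * tau]])
print("embedded phase-space points:", embedded.shape)
print("fraction of time spent in the x > 0 well:", round(float(np.mean(x > 0)), 3))
```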

  2. Changes in seasonal streamflow extremes experienced in rivers of Northwestern South America (Colombia)

    NASA Astrophysics Data System (ADS)

    Pierini, J. O.; Restrepo, J. C.; Aguirre, J.; Bustamante, A. M.; Velásquez, G. J.

    2017-04-01

    A measure of the variability in seasonal extreme streamflow was estimated for the Colombian Caribbean coast, using monthly time series of freshwater discharge from ten watersheds. The aim was to detect modifications in the monthly streamflow distribution, seasonal trends, variance and extreme monthly values. A 20-year moving time window, shifted successively by 1 year, was applied to the monthly series to analyze the seasonal variability of streamflow. The seasonal windowed data were statistically fitted with the Gamma distribution function. Scale and shape parameters were computed using Maximum Likelihood Estimation (MLE) and a bootstrap with 1000 resamples. A trend analysis was performed for each windowed series, allowing detection of the window with the maximum absolute trend values. Significant temporal shifts in the seasonal streamflow distribution and quantiles (QT) were obtained for different frequencies. Wet and dry extreme periods increased significantly over the last decades. This increase did not occur simultaneously across the region. Some locations exhibited continuous increases only at minimum QT.
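
    A minimal, hedged sketch of the windowed Gamma fitting with bootstrap uncertainty described above; the synthetic monthly "discharge", the reduced bootstrap size and the reported 95th-percentile quantile are illustrative assumptions, not the Colombian records or the paper's exact outputs.

```python
import numpy as np
from scipy import stats

# Hedged sketch of the windowed analysis: fit a Gamma distribution by MLE to
# 20-year windows of monthly discharge (shifted year by year) and bootstrap an
# extreme quantile. The synthetic "discharge" is a stand-in for the real records,
# and only 200 bootstrap resamples are drawn here (the study uses 1000).
rng = np.random.default_rng(6)
years = np.arange(1960, 2011)
trend = 1.0 + 0.004 * (years - 1960)
monthly = rng.gamma(2.0, (100.0 * trend)[:, None], size=(years.size, 12))   # m3/s, say

window, n_boot = 20, 200
for start in range(0, years.size - window + 1, 10):          # every 10th window, for brevity
    block = monthly[start:start + window].ravel()
    shape, _, scale = stats.gamma.fit(block, floc=0)
    boot_q95 = np.empty(n_boot)
    for b in range(n_boot):
        sample = rng.choice(block, size=block.size, replace=True)
        s, _, sc = stats.gamma.fit(sample, floc=0)
        boot_q95[b] = stats.gamma.ppf(0.95, s, scale=sc)     # extreme (95th) monthly quantile
    lo, hi = np.percentile(boot_q95, [2.5, 97.5])
    q95 = stats.gamma.ppf(0.95, shape, scale=scale)
    print(f"{years[start]}-{years[start + window - 1]}: Q95 = {q95:.0f} (95% CI {lo:.0f}-{hi:.0f})")
```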

  3. Improved detection of radioactive material using a series of measurements

    NASA Astrophysics Data System (ADS)

    Mann, Jenelle

    The goal of this project is to develop improved algorithms for the detection of radioactive sources that have low signal compared to background. The detection of low-signal sources is of interest in national security applications where the source may have weak ionizing radiation emissions, is heavily shielded, or the counting time is short (such as portal monitoring). Traditionally, to distinguish signal from background, the decision threshold (y*) is calculated by taking a long background count and limiting the false positive (alpha) error to 5%. Problems with this method include that the background is constantly changing due to natural environmental fluctuations, and that large amounts of data taken as the detector continuously scans are not utilized. Rather than looking at a single measurement, this work investigates a series of N measurements and develops an appropriate criterion for exceeding the decision threshold n times in a series of N. This methodology is investigated for a rectangular, triangular, sinusoidal, Poisson, and Gaussian distribution.
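
    A hedged sketch of one way to set such an n-of-N criterion for the Poisson case: derive the per-measurement exceedance probability from a single-count decision threshold, then pick the smallest n whose background-only exceedance probability over N trials stays below the overall alpha; the background rate and alpha values are invented for illustration.

```python
from math import comb
from scipy import stats

# Minimal sketch of an n-of-N decision rule (illustrative rates, not the thesis data).
# Step 1: a single-measurement decision threshold y* on Poisson background counts.
bkg_mean = 40.0                            # expected background counts per measurement
alpha_single = 0.05
y_star = stats.poisson.ppf(1 - alpha_single, bkg_mean)      # single-count threshold
p_exceed = 1 - stats.poisson.cdf(y_star, bkg_mean)          # actual per-trial false-alarm rate

# Step 2: for a series of N measurements, find the smallest n such that the
# probability of >= n background-only exceedances stays below the overall alpha.
N, alpha_series = 10, 0.05

def prob_at_least(n, N, p):
    return sum(comb(N, k) * p**k * (1 - p) ** (N - k) for k in range(n, N + 1))

n_required = next(n for n in range(1, N + 1) if prob_at_least(n, N, p_exceed) <= alpha_series)
print(f"y* = {y_star:.0f} counts, per-trial false-alarm rate = {p_exceed:.4f}")
print(f"require >= {n_required} exceedances out of N = {N} "
      f"(series false-alarm rate = {prob_at_least(n_required, N, p_exceed):.4f})")
```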

  4. An application of HOMER and ACMANT for homogenising monthly precipitation records in Ireland

    NASA Astrophysics Data System (ADS)

    Coll, John; Curley, Mary; Domonkos, Peter; Aguilar, Enric; Walsh, Seamus; Sweeney, John

    2015-04-01

    Climate change studies based only on raw long-term data are potentially flawed due to the many breaks introduced from non-climatic sources. Consequently, accurate climate data is an essential prerequisite for basing climate related decision making on; and quality controlled, homogenised climate data are becoming integral to European Union Member State efforts to deliver climate services. Ireland has a good repository of monthly precipitation data at approximately 1900 locations stored in the Met Éireann database. The record length at individual precipitation stations varies greatly. However, an audit of the data established the continuous record length at each station and the number of missing months, and based on this two initial subsets of station series (n = 88 and n = 110) were identified for preliminary homogenisation efforts. The HOMER joint detection algorithm was applied to the combined network of these 198 longer station series on an Ireland-wide basis where contiguous intact monthly records ranged from ~40 to 71 years (1941 - 2010). HOMER detected 91 breaks in total in the country-wide series analysis distributed across 63 (~32%) of the 71 year series records analysed. In a separate approach, four sub-series clusters (n = 38 - 61) for the 1950 - 2010 period were used in a parallel analysis applying both ACMANT and HOMER to a regionalised split of the 198 series. By comparison ACMANT detected a considerably higher number of breaks across the four regional series clusters, 238 distributed across 123 (~62%) of the 61 year series records analysed. These preliminary results indicate a relatively high proportion of detected breaks in the series, a situation not generally reflected in observed later 20th century precipitation records across Europe (Domonkos, 2014). However, this elevated ratio of series with detected breaks (~32% in HOMER and ~62% in ACMANT) parallels the break detection rate in a recent analysis of series in the Netherlands (Buishand et al 2013). In the case of Ireland, the climate is even more markedly maritime than that of the Netherlands and the spatial correlations between the Irish series are high (>0.8). Therefore it is likely that both HOMER and ACMANT are detecting relatively small breaks in the series; e.g. the overall range of correction amplitudes derived by HOMER were small and only applied to sections of the corrected series. As Ireland has a relatively dense network of highly correlated station series, we anticipate continued high detection rates as the analysis is extended to incorporate a greater number of station series, and that the ongoing work will quantify the extent of any breaks in Ireland's monthly precipitation series. KEY WORDS: Ireland, precipitation, time series, homogenisation, HOMER, ACMANT. References Buishand, T.A., DeMartino, G., Spreeuw, J.N., Brandsma, T. (2013). Homogeneity of precipitation series in the Netherlands and their trends in the past century. International Journal of Climatology. 33:815-833 Domonkos, P. (2014). Homogenisation of precipitation time series with ACMANT. Theoretical and Applied Climatology. 118:1-2. DOI 10.1007/s00704-014-1298-5.

  5. Time series of low-degree geopotential coefficients from SLR data: estimation of Earth's figure axis and LOD variations

    NASA Astrophysics Data System (ADS)

    Luceri, V.; Sciarretta, C.; Bianco, G.

    2012-12-01

    The redistribution of mass within the earth system induces changes in the Earth's gravity field. In particular, the second-degree geopotential coefficients reflect the behaviour of the Earth's inertia tensor of order 2, describing the main mass variations of our planet impacting the EOPs. Thanks to the long record of accurate and continuous laser ranging observations to Lageos and other geodetic satellites, SLR is the only current space technique capable of monitoring the long-term variability of the Earth's gravity field with adequate accuracy. Time series of low-degree geopotential coefficients are estimated with our analysis of SLR data (spanning more than 25 years) from several geodetic satellites in order to detect trends and periodic variations related to tidal effects and atmospheric/oceanic mass variations. This study is focused on the variations of the second-degree Stokes coefficients related to the Earth's principal figure axis and oblateness: C21, S21 and C20. On the other hand, surface mass load variations induce excitations in the EOPs that are proportional to the same second-degree coefficients. The time series of direct estimates of the low-degree geopotential and those derived from the EOP excitation functions are compared and presented together with their time and frequency analysis.

  6. 18 years of continuous observation of tritium and atmospheric precipitations in Ramnicu Valcea (Romania): A time series analysis.

    PubMed

    Duliu, Octavian G; Varlam, Carmen; Shnawaw, Muataz Dheyaa

    2018-05-16

    To get more information on the origin of tritium and to detect any possible presence of anthropogenic sources, between January 1999 and December 2016 the precipitation level and tritium concentration were recorded monthly and investigated by the Cryogenic Institute of Ramnicu Valcea, Romania. Compared with similar data covering a radius of about 1200 km westward, the measurements gave similar results concerning the time evolution of tritium content and precipitation level for the entire time interval, except the period between 2009 and 2011, when the tritium concentrations showed a slight increase, most probably due to the activity of a neighboring experimental pilot plant for tritium and deuterium separation. Regardless of this fact, all data pointed towards a steady tendency of tritium concentrations to decrease at an annual rate of about 1.4 ± 0.05%. The experimental data on precipitation levels and tritium concentrations form two complete time series whose analysis showed, at p < 0.01, a single one-year periodicity; the coincident maxima, corresponding to the late spring - early summer months, suggest the existence of the Spring Leak mechanism, with a possible contribution from soil moisture remobilization during the warm period. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Data Rescue for precipitation station network in Slovak Republic

    NASA Astrophysics Data System (ADS)

    Fasko, Pavel; Bochníček, Oliver; Švec, Marek; Paľušová, Zuzana; Markovič, Ladislav

    2016-04-01

    Transparency of archive catalogues is a very important task for data rescue, as it supports further activities such as digitization and homogenization. Visualization of the continuity of time series at precipitation stations (approximately 1250 stations) in the Slovak Republic, from the beginning of observations (meteorological stations gradually began to operate during the second half of the 19th century in Slovakia), is currently under way. Visualization is combined with activities such as verification and accessibility of the data listed in the archive catalogue, station localization according to the historical yearbooks, conversion of coordinates into x-JTSK, y-JTSK and assignment to hydrological catchments. Clustering precipitation stations within a specific hydrological catchment on a map and visualizing the duration of their records (line graph) will allow effective assignment of corresponding precipitation stations for the prolongation of time series. This step should be followed by change-point or trend detection and homogenization. The risks and problems encountered in verifying records from archive catalogues, their digitization, repairs and the manner of visualization are shown in the poster. While searching through the historical and often short time series, we realized the importance mainly of those stations located at middle and higher altitudes. They might be used as replacements for the fictive points quoted up to now in the construction of precipitation maps. Supplementing and extending the time series of individual stations will make it possible to follow changes in precipitation totals over a given period, as well as areal totals for individual catchments in various time periods, of value mainly to hydrologists and agro-climatologists.

  8. The chaotic long-term X-ray variability of 4U 1705-44

    NASA Astrophysics Data System (ADS)

    Phillipson, R. A.; Boyd, P. T.; Smale, A. P.

    2018-07-01

    The low-mass X-ray binary 4U1705-44 exhibits dramatic long-term X-ray time variability with a time-scale of several hundred days. The All-Sky Monitor (ASM) aboard the Rossi X-ray Timing Explorer (RXTE) and the Japanese Monitor of All-sky X-ray Image (MAXI) aboard the International Space Station together have continuously observed the source from 1995 December through 2014 May. The combined ASM-MAXI data provide a continuous time series over 50 times the length of the time-scale of interest. Topological analysis can help us identify `fingerprints' in the phase space of a system unique to its equations of motion. The Birman-Williams theorem postulates that if such fingerprints are the same between two systems, then their equations of motion must be closely related. The phase-space embedding of the source light curve shows a strong resemblance to the double-welled non-linear Duffing oscillator. We explore a range of parameters for which the Duffing oscillator closely mirrors the time evolution of 4U1705-44. We extract low period, unstable periodic orbits from the 4U1705-44 and Duffing time series and compare their topological information. The Duffing and 4U1705-44 topological properties are identical, providing strong evidence that they share the same underlying template. This suggests that we can look to the Duffing equation to help guide the development of a physical model to describe the long-term X-ray variability of this and other similarly behaved X-ray binary systems.

  9. Recent Papers in Parametric Modelling of Time Series.

    DTIC Science & Technology

    1983-04-01

    ...conceptually different ways. First, if one is interested in saving parameters, then a resolution comparable to the... Let's agree to... has been affected most. Continue to next page for Figures 2 through 7. CONCLUSIONS: We have presented a general framework for deriving and...

  10. Relating Factor Models for Longitudinal Data to Quasi-Simplex and NARMA Models

    ERIC Educational Resources Information Center

    Rovine, Michael J.; Molenaar, Peter C. M.

    2005-01-01

    In this article we show the one-factor model can be rewritten as a quasi-simplex model. Using this result along with addition theorems from time series analysis, we describe a common general model, the nonstationary autoregressive moving average (NARMA) model, that includes as a special case, any latent variable model with continuous indicators…

  11. 27 CFR 19.1008 - Marks.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    .... Where there is a change in proprietorship, or in the individual, firm, corporate name or trade name, the series in use at the time of the change may be continued. (Sec. 232, Pub. L. 96-233, 94 Stat. 278, (26 U...) Serial number of container; (4) Name, address (city or town and State) and permit number of the alcohol...

  12. Process air quality data

    NASA Technical Reports Server (NTRS)

    Butler, C. M.; Hogge, J. E.

    1978-01-01

    Air quality sampling was conducted. Data for air quality parameters, recorded on written forms, punched cards or magnetic tape, are available for 1972 through 1975. Computer software was developed to (1) calculate several daily statistical measures of location, (2) plot time histories of data or the calculated daily statistics, (3) calculate simple correlation coefficients, and (4) plot scatter diagrams. Computer software was developed for processing air quality data to include time series analysis and goodness of fit tests. Computer software was developed to (1) calculate a larger number of daily statistical measures of location, and a number of daily, monthly, and yearly measures of location, dispersion, skewness and kurtosis, (2) decompose the extended time series model, and (3) perform some goodness of fit tests. The computer program is described, documented and illustrated by examples. Recommendations are made for continuation of the development of research on processing air quality data.
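
    As an illustration of the kind of processing described (daily measures of location and simple correlations), a short pandas sketch follows; the column names and the synthetic hourly input are assumptions and do not represent the original software.

        import pandas as pd
        import numpy as np

        # Hourly readings for two pollutants (synthetic stand-in for the archived data)
        rng = pd.date_range("1973-01-01", periods=24 * 90, freq="h")
        df = pd.DataFrame({"ozone": np.random.gamma(2.0, 15.0, len(rng)),
                           "no2": np.random.gamma(2.0, 10.0, len(rng))}, index=rng)

        # Daily statistical measures of location and dispersion
        daily = df.resample("D").agg(["mean", "median", "max", "std"])

        # Simple correlation coefficient between the two series
        r = df["ozone"].corr(df["no2"])
        print(daily.head(), "\nr =", round(r, 3))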

  13. Random cascade model in the limit of infinite integral scale as the exponential of a nonstationary 1/f noise: Application to volatility fluctuations in stock markets

    NASA Astrophysics Data System (ADS)

    Muzy, Jean-François; Baïle, Rachel; Bacry, Emmanuel

    2013-04-01

    In this paper we propose a new model for volatility fluctuations in financial time series. This model relies on a nonstationary Gaussian process that exhibits aging behavior. It turns out that its properties, over any finite time interval, are very close to continuous cascade models. These latter models are indeed well known to reproduce faithfully the main stylized facts of financial time series. However, they involve a large-scale parameter (the so-called “integral scale” where the cascade is initiated) that is hard to interpret in finance. Moreover, the empirical value of the integral scale is in general deeply correlated with the overall length of the sample. This feature is precisely predicted by our model, which, as illustrated by various examples from daily stock index data, quantitatively reproduces the empirical observations.

  14. A Geodetic Strain Rate Model for the Pacific-North American Plate Boundary, western United States

    NASA Astrophysics Data System (ADS)

    Kreemer, C.; Hammond, W. C.; Blewitt, G.; Holland, A. A.; Bennett, R. A.

    2012-04-01

    We present a model of crustal strain rates derived from GPS measurements of horizontal station velocities in the Pacific-North American plate boundary in the western United States. The model reflects a best estimate of present-day deformation from the San Andreas fault system in the west to the Basin and Range province in the east. Of the total 2,846 GPS velocities used in the model, 1,197 are derived by ourselves, and 1,649 are taken from (mostly) published results. The velocities derived by ourselves (the "UNR solution") are estimated from GPS position time-series of continuous and semi-continuous stations for which data are publicly available. We estimated ITRF2005 positions from 2002-2011.5 using JPL's GIPSY-OASIS II software with ambiguity resolution applied using our custom Ambizap software. Only stations with time-series that span at least 2.25 years are considered. We removed from the time-series continental-scale common-mode errors using a spatially-varying filtering technique. Velocity uncertainties (typically 0.1-0.3 mm/yr) assume that the time-series contain flicker plus white noise. We used a subset of stations on the stable parts of the Pacific and North American plates to estimate the Pacific-North American pole of rotation. This pole is applied as a boundary condition to the model and the North American - ITRF2005 pole is used to rotate our velocities into a North America fixed reference frame. We do not include parts of the time-series that show curvature due to post-seismic deformation after major earthquakes and we also exclude stations whose time-series display a significant unexplained non-linearity or that are near volcanic centers. Transient effects longer than the observation period (i.e., slow viscoelastic relaxation) are left in the data. We added to the UNR solution velocities from 12 other studies. The velocities are transformed onto the UNR solution's reference frame by estimating and applying a translation and rotation that minimizes the velocities at collocated stations. We removed obvious outliers and velocities in areas that we identified to undergo subsidence likely due to excessive water pumping. For the strain rate calculations we excluded GPS stations with anomalous vertical motion or annual horizontal periodicity, which are indicators of local site instability. First, we used the stations from the UNR solution to create a Delaunay triangulation and estimated the horizontal strain rate components (and rigid body rotation) for each triangle in a linear least-squares inversion using the horizontal velocities as input. Some level of spatial damping was applied to minimize unnecessary spatial variation in the model parameters. The strain rates estimates were then used as a priori strain rate variances in a method that fits continuous bi-cubic Bessel spline functions through the velocity gradient field while minimizing the weighted misfit to all velocities. A minimal level of spatial smoothing of the variances was applied. The strain rate tensor model is shown by contours of the second invariant of the tensor, which is a measure of the amplitude that is coordinate frame independent. We also show a map of the tensor style and of the signal-to-noise ratio of the model.
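
    The first step described above, estimating the horizontal strain-rate components (and rigid rotation) of a Delaunay triangle from the velocities at its vertices by linear least squares, can be sketched as follows. The station coordinates and velocities are made-up placeholders, and the damping and spline-fitting stages of the published model are omitted.

        import numpy as np

        # Vertex coordinates (km) and horizontal velocities (mm/yr) of one Delaunay triangle
        xy = np.array([[0.0, 0.0], [50.0, 5.0], [20.0, 40.0]])
        v = np.array([[1.0, 2.0], [3.5, 1.0], [2.0, 4.5]])

        # Model: v = v0 + L @ x, with L the 2x2 velocity-gradient tensor
        G, d = [], []
        for (x, y), (vx, vy) in zip(xy, v):
            G.append([1, 0, x, y, 0, 0])   # vx = vx0 + Lxx*x + Lxy*y
            G.append([0, 1, 0, 0, x, y])   # vy = vy0 + Lyx*x + Lyy*y
            d.extend([vx, vy])
        m, *_ = np.linalg.lstsq(np.array(G, float), np.array(d, float), rcond=None)
        L = np.array([[m[2], m[3]], [m[4], m[5]]])      # velocity gradient (mm/yr per km)

        strain_rate = 0.5 * (L + L.T)                   # symmetric part: strain-rate tensor
        rotation = 0.5 * (L - L.T)                      # antisymmetric part: rigid rotation
        # One common definition of the second invariant: root of the sum of squared components
        second_invariant = np.sqrt(np.sum(strain_rate**2))
        print(strain_rate, second_invariant)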

  15. Long time series of infrasonic records at open-vent volcanoes (Yasur volcano, Vanuatu, 2003-2014): the remarkable temporal stability of magma viscosity

    NASA Astrophysics Data System (ADS)

    Vergniolle, S.; Souty, V.; Zielinski, C.; Bani, P.; LE Pichon, A.; Lardy, M.; Millier, P.; Herry, P.; Todman, S.; Garaebiti, E.

    2017-12-01

    Open-vent volcanoes, which often present series of Strombolian explosions of various intensity, respond, although with a delay, to any change in the degassing pattern, providing a quasi-direct route to processes at depth. Open-vent volcanoes display persistent volcanic activity, although of variable intensity. Long time series at open-vent volcanoes could therefore be key measurements for unravelling the physical processes at the origin of Strombolian explosions and crucial for monitoring. Continuous infrasonic records can be used to estimate the gas volume expelled at the vent during explosions (bursting of a long slug). The gas volume of each explosion is deduced from two successive integrations of the acoustic pressure (monopole source). Here we analysed more than 4 years of infrasonic records at Yasur volcano (Vanuatu), spanning 2003 to 2014 and organised into 8 main quasi-continuous periods. The relationship between the gas volume of each explosion and its associated maximum positive acoustic pressure, a proxy for the inner gas overpressure at bursting, shows a remarkably stable trend over the 8 periods. Two main trends exist: one covering the full range of acoustic pressures (called "strong explosions") and a second representing explosions with a large gas volume and mild acoustic pressure. The class of "strong explosions" clearly follows the model of Del Bello et al. (2012), which shows that the inner gas overpressure at bursting, here empirically measured by the maximum acoustic pressure, is proportional to the gas volume. Constraints on magma viscosity and conduit radius are deduced from this trend and from the gas volume at the transition between passive and active degassing. The remarkable stability of this trend over time suggests that (1) the magma viscosity is stable at the depth where gas overpressure is produced within the slug and (2) any potential changes in magma viscosity occur very close to the top of the magma column.
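
    The gas-volume estimate mentioned above, two successive time integrations of the excess acoustic pressure under a monopole-source assumption, can be sketched as follows. The synthetic pressure trace, source-receiver distance and air density below are placeholders, not values from the Yasur records.

        import numpy as np
        from scipy.integrate import cumulative_trapezoid

        fs = 100.0                      # sampling rate (Hz) of the infrasound record
        t = np.arange(0, 10, 1 / fs)
        # Synthetic excess pressure pulse (Pa) standing in for one explosion
        p = 50.0 * np.exp(-((t - 2.0) / 0.3) ** 2) * np.sin(2 * np.pi * 1.5 * t)

        r = 300.0                       # source-receiver distance (m)
        rho = 1.2                       # air density (kg/m^3)

        # Monopole source: p(t) = rho/(4*pi*r) * d2V/dt2, so V(t) follows from a double integral of p
        dVdt = cumulative_trapezoid(p, t, initial=0)            # first time integration
        V = 4 * np.pi * r / rho * cumulative_trapezoid(dVdt, t, initial=0)
        print("gas volume at end of pulse: %.1f m^3" % V[-1])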

  16. AIRS Ozone Burden During Antarctic Winter: Time Series from 8/1/2005 to 9/30/2005

    NASA Image and Video Library

    2007-07-24

    The Atmospheric Infrared Sounder (AIRS) provides a daily global 3-dimensional view of Earth's ozone layer. Since AIRS observes in the thermal infrared spectral range, it also allows scientists to view from space the Antarctic ozone hole for the first time continuously during polar winter. This image sequence captures the intensification of the annual ozone hole in the Antarctic Polar Vortex. http://photojournal.jpl.nasa.gov/catalog/PIA09938

  17. A Satellite-Derived Climate-Quality Data Record of the Clear-Sky Surface Temperature of the Greenland Ice Sheet

    NASA Technical Reports Server (NTRS)

    Hall, Dorothy K.; Comiso, Josefino C.; DiGirolamo, Nikolo E.; Shuman, Christopher A.; Key, Jeffrey R.; Koenig, Lora S.

    2012-01-01

    We have developed a climate-quality data record of the clear-sky surface temperature of the Greenland Ice Sheet using the Moderate-Resolution Imaging Spectroradiometer (MODIS) ice-surface temperature (IST) algorithm. A climate-data record (CDR) is a time series of measurements of sufficient length, consistency, and continuity to determine climate variability and change. We present daily and monthly MODIS ISTs of the Greenland Ice Sheet beginning on 1 March 2000 and continuing through 31 December 2010 at 6.25-km spatial resolution on a polar stereographic grid. This record will be elevated in status to a CDR when at least nine more years of data become available either from MODIS Terra or Aqua, or from the Visible Infrared Imaging Radiometer Suite (VIIRS) to be launched in October 2011. Our ultimate goal is to develop a CDR that starts in 1981 with the Advanced Very High Resolution Radiometer (AVHRR) Polar Pathfinder (APP) dataset and continues with MODIS data from 2000 to the present, and into the VIIRS era. Differences in the APP and MODIS cloud masks have so far precluded the current IST records from spanning both the APP and MODIS time series in a seamless manner, though this will be revisited when the APP dataset has been reprocessed. The complete MODIS IST daily and monthly data record is available online.

  18. Promises and Challenges in Continuous Tracking Utilizing Amino Acids in Skin Secretions for Active Multi-Factor Biometric Authentication for Cybersecurity.

    PubMed

    Agudelo, Juliana; Privman, Vladimir; Halámek, Jan

    2017-07-05

    We consider a new concept of biometric-based cybersecurity systems for active authentication by continuous tracking, which utilizes biochemical processing of metabolites present in skin secretions. Skin secretions contain a large number of metabolites and small molecules that can be targeted for analysis. Here we argue that amino acids found in sweat can be exploited for the establishment of an amino acid profile capable of identifying an individual user of a mobile or wearable device. Individual and combinations of amino acids processed by biocatalytic cascades yield physical (optical or electronic) signals, providing a time-series of several outputs that, in their entirety, should suffice to authenticate a specific user based on standard statistical criteria. Initial results, motivated by biometrics, indicate that single amino acid levels can provide analog signals that vary according to the individual donor, albeit with limited resolution versus noise. However, some such assays offer digital separation (into well-defined ranges of values) according to groups such as age, biological sex, race, and physiological state of the individual. Multi-input biocatalytic cascades that handle several amino acid signals to yield a single digital-type output, as well as continuous-tracking time-series data rather than a single-instance sample, should enable active authentication at the level of an individual. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. 78 FR 50128 - Self-Regulatory Organizations; EDGX Exchange, Inc.; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-16

    ... Change To Amend Rule 2.5 To Outline the Continuing Education Requirements for Series 56 Licensees and Its Fee Schedule To Include Fees for the Series 56 Examination and Its Related Continuing Education... Rule 2.5 to: (i) outline the continuing education requirements for Authorized Traders \\3\\ of Members \\4...

  20. 78 FR 50120 - Self-Regulatory Organizations; EDGA Exchange, Inc.; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-16

    ... Change To Amend Rule 2.5 To Outline the Continuing Education Requirements for Series 56 Licensees and Its Fee Schedule To Include Fees for the Series 56 Examination and Its Related Continuing Education... Rule 2.5 to: (i) Outline the continuing education requirements for Authorized Traders \\3\\ of Members \\4...

  1. Data Sets and Data Services at the Northern California Earthquake Data Center

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Zuzlewski, S.; Allen, R. M.

    2014-12-01

    The Northern California Earthquake Data Center (NCEDC) houses a unique and comprehensive data archive and provides real-time services for a variety of seismological and geophysical data sets that encompass northern and central California. We have over 80 terabytes of continuous and event-based time series data from broadband, short-period, strong motion, and strain sensors as well as continuous and campaign GPS data at both standard and high sample rates in both raw and RINEX format. The Northern California Seismic System (NCSS), operated by UC Berkeley and USGS Menlo Park, has recorded over 890,000 events from 1984 to the present, and the NCEDC provides catalog, parametric information, moment tensors and first motion mechanisms, and time series data for these events. We also host and provide event catalogs, parametric information, and event waveforms for DOE enhanced geothermal system monitoring in northern California and Nevada. The NCEDC provides a variety of ways for users to access these data. The most recent development is web services, which provide interactive, command-line, or program-based workflow access to data. Web services use well-established server and client protocols and RESTful software architecture that allow users to easily submit queries and receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, simple text, or MiniSEED depending on the service and selected output format. The NCEDC supports all FDSN-defined web services as well as a number of IRIS-defined and NCEDC-defined services. We also continue to support older email-based and browser-based access to data. NCEDC data and web services can be found at http://www.ncedc.org and http://service.ncedc.org.
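
    For programmatic access of the kind described, the standard FDSN web services can be queried directly over HTTP or through a client library. The sketch below uses ObsPy's FDSN client pointed at the NCEDC service; the network/station codes and the time window are placeholders chosen for illustration, not part of the abstract.

        from obspy import UTCDateTime
        from obspy.clients.fdsn import Client

        client = Client("NCEDC")        # FDSN services hosted at http://service.ncedc.org

        t0 = UTCDateTime("2014-08-24T10:20:00")          # placeholder time window
        # Continuous waveform request via the fdsnws-dataselect service
        st = client.get_waveforms(network="BK", station="CMB", location="*",
                                  channel="BHZ", starttime=t0, endtime=t0 + 600)
        # Event catalog request via the fdsnws-event service
        cat = client.get_events(starttime=t0 - 3600, endtime=t0 + 3600, minmagnitude=3.0)
        print(st)
        print(cat)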

  2. Northern California Earthquake Data Center: Data Sets and Data Services

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Allen, R. M.; Zuzlewski, S.

    2015-12-01

    The Northern California Earthquake Data Center (NCEDC) provides a permanent archive and real-time data distribution services for a unique and comprehensive set of seismological and geophysical data encompassing northern and central California. We provide access to over 85 terabytes of continuous and event-based time series data from broadband, short-period, strong motion, and strain sensors as well as continuous and campaign GPS data at both standard and high sample rates. The Northern California Seismic System (NCSS), operated by UC Berkeley and USGS Menlo Park, has recorded over 900,000 events from 1984 to the present, and the NCEDC serves catalog, parametric information, moment tensors and first motion mechanisms, and time series data for these events. We also serve event catalogs, parametric information, and event waveforms for DOE enhanced geothermal system monitoring in northern California and Nevada. The NCEDC provides several ways for users to access these data. The most recent development is web services, which provide interactive, command-line, or program-based workflow access to data. Web services use well-established server and client protocols and RESTful software architecture that allow users to easily submit queries and receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, simple text, or MiniSEED depending on the service and selected output format. The NCEDC supports all FDSN-defined web services as well as a number of IRIS-defined and NCEDC-defined services. We also continue to support older email-based and browser-based access to data. NCEDC data and web services can be found at http://www.ncedc.org and http://service.ncedc.org.

  3. Applications and development of new algorithms for displacement analysis using InSAR time series

    NASA Astrophysics Data System (ADS)

    Osmanoglu, Batuhan

    Time series analysis of Synthetic Aperture Radar Interferometry (InSAR) data has become an important scientific tool for monitoring and measuring the displacement of Earth's surface due to a wide range of phenomena, including earthquakes, volcanoes, landslides, changes in ground water levels, and wetlands. Time series analysis is a product of interferometric phase measurements, which become ambiguous when the observed motion is larger than half of the radar wavelength. Thus, phase observations must first be unwrapped in order to obtain physically meaningful results. Persistent Scatterer Interferometry (PSI), Stanford Method for Persistent Scatterers (StaMPS), Short Baselines Interferometry (SBAS) and Small Temporal Baseline Subset (STBAS) algorithms solve for this ambiguity using a series of spatio-temporal unwrapping algorithms and filters. In this dissertation, I improve upon current phase unwrapping algorithms, and apply the PSI method to study subsidence in Mexico City. PSI was used to obtain unwrapped deformation rates in Mexico City (Chapter 3),where ground water withdrawal in excess of natural recharge causes subsurface, clay-rich sediments to compact. This study is based on 23 satellite SAR scenes acquired between January 2004 and July 2006. Time series analysis of the data reveals a maximum line-of-sight subsidence rate of 300mm/yr at a high enough resolution that individual subsidence rates for large buildings can be determined. Differential motion and related structural damage along an elevated metro rail was evident from the results. Comparison of PSI subsidence rates with data from permanent GPS stations indicate root mean square (RMS) agreement of 6.9 mm/yr, about the level expected based on joint data uncertainty. The Mexico City results suggest negligible recharge, implying continuing degradation and loss of the aquifer in the third largest metropolitan area in the world. Chapters 4 and 5 illustrate the link between time series analysis and three-dimensional (3-D) phase unwrapping. Chapter 4 focuses on the unwrapping path. Unwrapping algorithms can be divided into two groups, path-dependent and path-independent algorithms. Path-dependent algorithms use local unwrapping functions applied pixel-by-pixel to the dataset. In contrast, path-independent algorithms use global optimization methods such as least squares, and return a unique solution. However, when aliasing and noise are present, path-independent algorithms can underestimate the signal in some areas due to global fitting criteria. Path-dependent algorithms do not underestimate the signal, but, as the name implies, the unwrapping path can affect the result. Comparison between existing path algorithms and a newly developed algorithm based on Fisher information theory was conducted. Results indicate that Fisher information theory does indeed produce lower misfit results for most tested cases. Chapter 5 presents a new time series analysis method based on 3-D unwrapping of SAR data using extended Kalman filters. Existing methods for time series generation using InSAR data employ special filters to combine two-dimensional (2-D) spatial unwrapping with one-dimensional (1-D) temporal unwrapping results. The new method, however, combines observations in azimuth, range and time for repeat pass interferometry. Due to the pixel-by-pixel characteristic of the filter, the unwrapping path is selected based on a quality map. 
This unwrapping algorithm is the first application of extended Kalman filters to the 3-D unwrapping problem. Time series analyses of InSAR data are used in a variety of applications with different characteristics. Consequently, it is difficult to develop a single algorithm that can provide optimal results in all cases, given that different algorithms possess a unique set of strengths and weaknesses. Nonetheless, filter-based unwrapping algorithms such as the one presented in this dissertation have the capability of joining multiple observations into a uniform solution, which is becoming an important feature with continuously growing datasets.
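
    The ambiguity described above, phase observations known only modulo 2π, is what unwrapping resolves. The one-dimensional case (add or subtract multiples of 2π whenever the jump between neighbouring samples exceeds π) is sketched below as background; the 2-D and 3-D algorithms discussed in the dissertation build on this idea with path selection, quality maps and filtering.

        import numpy as np

        true_phase = np.cumsum(np.full(200, 0.35))            # smooth, growing phase (radians)
        wrapped = np.angle(np.exp(1j * true_phase))           # what an interferometer measures: (-pi, pi]

        def unwrap_1d(phi):
            """1-D unwrapping: remove 2*pi jumps between consecutive samples."""
            out = np.array(phi, dtype=float)
            for i in range(1, len(out)):
                d = out[i] - out[i - 1]
                if d > np.pi:
                    out[i:] -= 2 * np.pi
                elif d < -np.pi:
                    out[i:] += 2 * np.pi
            return out

        recovered = unwrap_1d(wrapped)
        print(np.allclose(recovered, true_phase))             # works because true jumps are < pi
        print(np.allclose(recovered, np.unwrap(wrapped)))     # same result as NumPy's built-in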

  4. Creating a monthly time series of the potentiometric surface in the Upper Floridan aquifer, Northern Tampa Bay area, Florida, January 2000-December 2009

    USGS Publications Warehouse

    Lee, Terrie M.; Fouad, Geoffrey G.

    2014-01-01

    In Florida’s karst terrain, where groundwater and surface waters interact, a mapping time series of the potentiometric surface in the Upper Floridan aquifer offers a versatile metric for assessing the hydrologic condition of both the aquifer and overlying streams and wetlands. Long-term groundwater monitoring data were used to generate a monthly time series of potentiometric surfaces in the Upper Floridan aquifer over a 573-square-mile area of west-central Florida between January 2000 and December 2009. Recorded groundwater elevations were collated for 260 groundwater monitoring wells in the Northern Tampa Bay area, and a continuous time series of daily observations was created for 197 of the wells by estimating missing daily values through regression relations with other monitoring wells. Kriging was used to interpolate the monthly average potentiometric-surface elevation in the Upper Floridan aquifer over a decade. The mapping time series gives spatial and temporal coherence to groundwater monitoring data collected continuously over the decade by three different organizations, but at various frequencies. Further, the mapping time series describes the potentiometric surface beneath parts of six regionally important stream watersheds and 11 municipal well fields that collectively withdraw about 90 million gallons per day from the Upper Floridan aquifer. Monthly semivariogram models were developed using monthly average groundwater levels at wells. Kriging was used to interpolate the monthly average potentiometric-surface elevations and to quantify the uncertainty in the interpolated elevations. Drawdown of the potentiometric surface within well fields was likely the cause of a characteristic decrease and then increase in the observed semivariance with increasing lag distance. This characteristic made the use of the hole-effect model appropriate for describing the monthly semivariograms and the interpolated surfaces. Spatial variance reflected in the monthly semivariograms decreased markedly between 2002 and 2003, timing that coincided with decreases in well-field pumping. Cross-validation results suggest that the kriging interpolation may smooth over the drawdown of the potentiometric surface near production wells. The groundwater monitoring network of 197 wells yielded an average kriging error in the potentiometric-surface elevations of 2 feet or less over approximately 70 percent of the map area. Additional data collection within the existing monitoring network of 260 wells and near selected well fields could reduce the error in individual months. Reducing the kriging error in other areas would require adding new monitoring wells. Potentiometric-surface elevations fluctuated by as much as 30 feet over the study period, and the spatially averaged elevation for the entire surface rose by about 2 feet over the decade. Monthly potentiometric-surface elevations describe the lateral groundwater flow patterns in the aquifer and are usable at a variety of spatial scales to describe vertical groundwater recharge and discharge conditions for overlying surface-water features.
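
    The gap-filling step described above, estimating missing daily values at one well from a regression relation with a neighbouring well, can be sketched as follows. The two synthetic series stand in for the real monitoring records; the regression form (simple linear) is an assumption for illustration.

        import numpy as np
        import pandas as pd

        dates = pd.date_range("2005-01-01", periods=365, freq="D")
        neighbor = pd.Series(12 + 2 * np.sin(2 * np.pi * np.arange(365) / 365)
                             + np.random.normal(0, 0.1, 365), index=dates)
        target = 0.8 * neighbor + 3 + np.random.normal(0, 0.1, 365)
        target.iloc[100:140] = np.nan                 # a gap to be filled

        # Fit the regression on days where both wells have observations
        both = pd.concat({"t": target, "n": neighbor}, axis=1).dropna()
        slope, intercept = np.polyfit(both["n"], both["t"], 1)

        # Fill the missing daily values from the neighbouring well
        filled = target.copy()
        missing = filled.isna()
        filled[missing] = slope * neighbor[missing] + intercept
        print(filled.loc["2005-04-15":"2005-04-20"])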

  5. Mapping Impervious Surface Expansion using Medium-resolution Satellite Image Time Series: A Case Study in the Yangtze River Delta, China

    NASA Technical Reports Server (NTRS)

    Gao, Feng; DeColstoun, Eric Brown; Ma, Ronghua; Weng, Qihao; Masek, Jeffrey G.; Chen, Jin; Pan, Yaozhong; Song, Conghe

    2012-01-01

    Cities have been expanding rapidly worldwide, especially over the past few decades. Mapping the dynamic expansion of impervious surface in both space and time is essential for an improved understanding of the urbanization process, land-cover and land-use change, and their impacts on the environment. Landsat and other medium-resolution satellites provide the necessary spatial details and temporal frequency for mapping impervious surface expansion over the past four decades. Since the US Geological Survey opened the historical record of the Landsat image archive for free access in 2008, the decades-old bottleneck of data limitation has gone. Remote-sensing scientists are now rich with data, and the challenge is how to make best use of this precious resource. In this article, we develop an efficient algorithm to map the continuous expansion of impervious surface using a time series of four decades of medium-resolution satellite images. The algorithm is based on a supervised classification of the time-series image stack using a decision tree. Each impervious class represents urbanization starting in a different image. The algorithm also allows us to remove inconsistent training samples because impervious expansion is not reversible during the study period. The objective is to extract a time series of complete and consistent impervious surface maps from a corresponding time series of images collected from multiple sensors, and with a minimal amount of image preprocessing effort. The approach was tested in the lower Yangtze River Delta region, one of the fastest urban growth areas in China. Results from nearly four decades of medium-resolution satellite data from the Landsat Multispectral Scanner (MSS), Thematic Mapper (TM), Enhanced Thematic Mapper plus (ETM+) and China-Brazil Earth Resources Satellite (CBERS) show a consistent urbanization process that is consistent with economic development plans and policies. The time-series impervious spatial extent maps derived from this study agree well with an existing urban extent polygon data set that was previously developed independently. The overall mapping accuracy was estimated at about 92.5% with 3% commission error and 12% omission error for the impervious type from all images regardless of image quality and initial spatial resolution.
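
    The core of the algorithm, a supervised decision-tree classification in which each pixel's feature vector is its full time series of observations and each impervious class corresponds to the image in which urbanization first appears, can be sketched as follows. The reflectance values and class structure below are synthetic placeholders; the real workflow uses the Landsat/CBERS stack and screened training samples.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        n_images = 8                                   # one band value per image date
        rng = np.random.default_rng(0)

        def pixel_series(onset):
            """Reflectance time series: low (vegetated) before onset, high (impervious) after."""
            s = rng.normal(0.15, 0.02, n_images)
            if onset is not None:
                s[onset:] = rng.normal(0.45, 0.02, n_images - onset)
            return s

        # Training samples: label = image index at which the pixel became impervious (0 = never)
        labels, samples = [], []
        for onset, label in [(None, 0), (2, 1), (5, 2)]:
            for _ in range(200):
                samples.append(pixel_series(onset))
                labels.append(label)

        clf = DecisionTreeClassifier(max_depth=5).fit(np.array(samples), labels)
        print(clf.predict([pixel_series(5), pixel_series(None)]))   # expect [2, 0]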

  6. GPS Imaging of Time-Dependent Seasonal Strain in Central California

    NASA Astrophysics Data System (ADS)

    Kraner, M.; Hammond, W. C.; Kreemer, C.; Borsa, A. A.; Blewitt, G.

    2016-12-01

    Recent studies suggest that crustal deformation can be time-dependent and nontectonic. Continuous global positioning system (cGPS) measurements are now showing how steady long-term deformation can be influenced by factors such as fluctuations in loading and temperature variations. Here we model the seasonal time-dependent dilatational and shear strain in Central California, specifically surrounding the Parkfield region, and try to uncover the sources of these deformation patterns. We use 8 years of cGPS data (2008 - 2016) processed by the Nevada Geodetic Laboratory and carefully select the cGPS stations for our analysis based on the vertical position of cGPS time series during the drought period. In building our strain model, we first detrend the selected station time series using a set of velocities from the robust MIDAS trend estimator. This estimation algorithm is a robust approach that is insensitive to common problems such as step discontinuities, outliers, and seasonality. We use these detrended time series to estimate the median cGPS positions for each month of the 8-year period and filter displacement differences between these monthly median positions using a filtering technique called "GPS Imaging." This technique improves the overall robustness and spatial resolution of the input displacements for the strain model. We then model our dilatational and shear strain field for each month of time series. We also test a variety of a priori constraints, which control the style of faulting within the strain model. Upon examining our strain maps, we find that a seasonal strain signal exists in Central California. We investigate how this signal compares to thermoelastic, hydrologic, and atmospheric loading models during the 8-year period. We additionally determine whether the drought played a role in influencing the seasonal signal.
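
    A much-simplified analogue of the MIDAS-style trend estimate used for detrending (the median of slopes computed from pairs of positions separated by about one year, which makes the velocity largely insensitive to steps, outliers and seasonality) is sketched below. It omits the refinements of the published estimator, and the synthetic series and its parameters are placeholders.

        import numpy as np

        t = np.arange(0, 8, 1 / 365.25)                       # 8 years of daily epochs (years)
        pos = (3.0 * t                                        # true rate: 3 mm/yr
               + 2.0 * np.sin(2 * np.pi * t)                  # annual signal
               + np.random.normal(0, 1.0, t.size))            # noise
        pos[t > 4.0] += 15.0                                  # an undocumented step (e.g. equipment change)

        # Median of slopes from position pairs separated by ~1 year (simplified MIDAS idea)
        one_year = 365                                        # samples per year (daily sampling)
        slopes = (pos[one_year:] - pos[:-one_year]) / (t[one_year:] - t[:-one_year])
        rate = np.median(slopes)
        print("estimated rate: %.2f mm/yr" % rate)            # close to 3 despite step and seasonality

        detrended = pos - rate * t                            # detrended series for monthly medians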

  7. Autogenic geomorphic processes determine the resolution and fidelity of terrestrial paleoclimate records.

    PubMed

    Foreman, Brady Z; Straub, Kyle M

    2017-09-01

    Terrestrial paleoclimate records rely on proxies hosted in alluvial strata whose beds are deposited by unsteady and nonlinear geomorphic processes. It is broadly assumed that this renders the resultant time series of terrestrial paleoclimatic variability noisy and incomplete. We evaluate this assumption using a model of oscillating climate and the precise topographic evolution of an experimental alluvial system. We find that geomorphic stochasticity can create aliasing in the time series and spurious climate signals, but these issues are eliminated when the period of climate oscillation is longer than a key time scale of internal dynamics in the geomorphic system. This emergent autogenic geomorphic behavior imparts regularity to deposition and represents a natural discretization interval of the continuous climate signal. We propose that this time scale in nature could be in excess of 10^4 years but would still allow assessments of the rates of climate change at resolutions finer than the existing age model techniques in isolation.

  8. Wavelet Statistical Analysis of Low-Latitude Geomagnetic Measurements

    NASA Astrophysics Data System (ADS)

    Papa, A. R.; Akel, A. F.

    2009-05-01

    Following previous works by our group (Papa et al., JASTP, 2006), where we analyzed a series of records acquired at the Vassouras National Geomagnetic Observatory in Brazil for the month of October 2000, we introduced a wavelet analysis for the same type of data and for other periods. It is well known that wavelets allow a more detailed study in several senses: the time window for analysis can be drastically reduced compared to other traditional methods (Fourier, for example), and at the same time they allow an almost continuous tracking of both the amplitude and frequency of signals as time goes by. This advantage brings some possibilities for potentially useful forecasting methods of the type also advanced by our group in previous works (see for example, Papa and Sosman, JASTP, 2008). However, the simultaneous statistical analysis of both time series (in our case amplitude and frequency) is a challenging matter, and it is in this sense that we have found what we consider our main goal. Some possible directions for future work are advanced.
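
    A minimal sketch of this kind of continuous wavelet analysis, using the PyWavelets package (assumed available) on a synthetic signal whose frequency drifts in time, is given below; amplitude and instantaneous frequency can then be tracked from the scalogram. The sampling rate, scales and signal are placeholders, not the Vassouras records.

        import numpy as np
        import pywt

        fs = 1.0                                    # e.g. one sample per minute
        t = np.arange(0, 2048) / fs
        freq = 0.01 + 0.005 * (t / t[-1])           # slowly drifting frequency
        signal = np.sin(2 * np.pi * np.cumsum(freq) / fs) + 0.2 * np.random.randn(t.size)

        scales = np.arange(1, 128)
        coeffs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=1.0 / fs)

        # Scalogram magnitude: rows are scales (frequencies), columns are time
        power = np.abs(coeffs) ** 2
        dominant = freqs[np.argmax(power, axis=0)]  # frequency of maximum power at each time step
        print(power.shape, dominant[:5])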

  9. Nonlinear stochastic exclusion financial dynamics modeling and time-dependent intrinsic detrended cross-correlation

    NASA Astrophysics Data System (ADS)

    Zhang, Wei; Wang, Jun

    2017-09-01

    In an attempt to reproduce the price dynamics of financial markets, a stochastic agent-based financial price model is proposed and investigated by means of a stochastic exclusion process. The exclusion process, one of the classical interacting particle systems, is usually thought of as modeling particle motion (with a conserved number of particles) in a continuous-time Markov process. In this work, the process is utilized to imitate the trading interactions among the investing agents, in order to explain some stylized facts found in financial time series dynamics. To better understand the correlation behaviors of the proposed model, a new time-dependent intrinsic detrended cross-correlation (TDI-DCC) is introduced and performed; autocorrelation analyses are also applied in the empirical research. Furthermore, to verify the rationality of the financial price model, actual return series are studied comparatively with the simulated ones. The comparison results of return behaviors reveal that this financial price dynamics model can reproduce some correlation features of actual stock markets.

  10. Autogenic geomorphic processes determine the resolution and fidelity of terrestrial paleoclimate records

    PubMed Central

    Foreman, Brady Z.; Straub, Kyle M.

    2017-01-01

    Terrestrial paleoclimate records rely on proxies hosted in alluvial strata whose beds are deposited by unsteady and nonlinear geomorphic processes. It is broadly assumed that this renders the resultant time series of terrestrial paleoclimatic variability noisy and incomplete. We evaluate this assumption using a model of oscillating climate and the precise topographic evolution of an experimental alluvial system. We find that geomorphic stochasticity can create aliasing in the time series and spurious climate signals, but these issues are eliminated when the period of climate oscillation is longer than a key time scale of internal dynamics in the geomorphic system. This emergent autogenic geomorphic behavior imparts regularity to deposition and represents a natural discretization interval of the continuous climate signal. We propose that this time scale in nature could be in excess of 10^4 years but would still allow assessments of the rates of climate change at resolutions finer than the existing age model techniques in isolation. PMID:28924607

  11. Characterisation of hydrogeological connections in a lowland karst network using time series analysis of water levels in ephemeral groundwater-fed lakes (turloughs)

    NASA Astrophysics Data System (ADS)

    Gill, L. W.; Naughton, O.; Johnston, P. M.; Basu, B.; Ghosh, B.

    2013-08-01

    This research used continuous water level measurements from five groundwater-fed lakes (or turloughs) in a linked lowland karst network of south Galway in Ireland over a 3 year period in order to elucidate the hydrogeological controls and conduit configurations forming the flooded karstic hydraulic system beneath the ground. The main spring outflow from this network discharges below mean sea level, making it difficult to determine the hydraulic nature of the network using traditional rainfall-spring flow cross analysis, as has been done in many other studies on karst systems. However, the localised groundwater-surface water interactions (the turloughs) in this flooded lowland karst system can yield information about the nature of the hydraulic connections beneath the ground. Various analytical techniques have been applied to the fluctuating turlough water level time series data in order to determine the nature of the linkage between them, as well as hydraulic pipe configurations at key points, in order to improve the conceptual model of the overall karst network. Initially, simple cross correlations between the different turlough water levels were carried out applying different time lags. Frequency analysis of the signals was then carried out using Fast Fourier transform analysis, and both discrete and continuous wavelet analyses were applied to the data sets to characterise these inherently non-stationary time series of fluctuating water levels. The analysis has indicated which turloughs are on the main line conduit system and which are somewhat off-line, and the relative size of the main conduit in the network including evidence of localised constrictions, as well as clearly showing the tidal influence on the water levels in the three lower turloughs at shallow depths ∼8 km from the main spring outfall at the sea. It has also indicated that the timing of high rainfall events coincident with maximum spring tide levels may promote more consistent, long duration flooding of the turloughs throughout the winter.
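
    The first analysis step described, cross-correlating pairs of turlough water-level series at a range of time lags to find the delay of peak correlation, can be sketched as follows; the two synthetic level records and the assumed lag are placeholders.

        import numpy as np

        n = 1000                                         # hourly water-level samples
        upstream = np.cumsum(np.random.normal(0, 0.01, n)) + 20.0
        lag_true = 36                                    # downstream lake responds ~36 h later
        downstream = np.roll(upstream, lag_true) * 0.8 + np.random.normal(0, 0.02, n)

        def lagged_corr(a, b, max_lag):
            """Correlation coefficient between a and b for each lag in [-max_lag, +max_lag]."""
            lags = np.arange(-max_lag, max_lag + 1)
            r = []
            for k in lags:
                if k >= 0:
                    r.append(np.corrcoef(a[: n - k], b[k:])[0, 1])
                else:
                    r.append(np.corrcoef(a[-k:], b[: n + k])[0, 1])
            return lags, np.array(r)

        lags, r = lagged_corr(upstream, downstream, max_lag=72)
        print("best lag (hours):", lags[np.argmax(r)])   # expect about +36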

  12. An evaluation of grease type ball bearing lubricants operating in various environments

    NASA Technical Reports Server (NTRS)

    Mcmurtrey, E. L.

    1981-01-01

    Because many future spacecraft or space stations will require mechanisms to operate for long periods of time in environments which are adverse to most bearing lubricants, a series of tests is continuing to evaluate 38 grease type lubricants in R-4 size bearings in five different environments for a 1 year period. Four repetitions of each test are made to provide statistical samples. These tests were used to select four lubricants for 5 year tests in selected environments with five repetitions of each test for statistical samples. At the present time, 100 test sets are completed and 22 test sets are underway. Three 5 year tests were started in (1) continuous operation and (2) start-stop operation, with both in vacuum at ambient temperatures, and (3) continuous operation at 93.3 C. In the 1 year tests the best results to date in all environments were obtained with a high viscosity index perfluoroalkylpolyether (PFPE) grease.

  13. Temporal–Spatial Surface Seasonal Mass Changes and Vertical Crustal Deformation in South China Block from GPS and GRACE Measurements

    PubMed Central

    He, Meilin; Shen, Wenbin; Chen, Ruizhi; Ding, Hao; Guo, Guangyi

    2017-01-01

    The solid Earth deforms elastically in response to variations of surface atmosphere, hydrology, and ice/glacier mass loads. Continuous geodetic observations by Global Positioning System (CGPS) stations and the Gravity Recovery and Climate Experiment (GRACE) record such deformations and can be used to estimate seasonal and secular mass changes. In this paper, we present the seasonal variation of the surface mass changes and the crustal vertical deformation in the South China Block (SCB) identified by GPS and GRACE observations with records spanning from 1999 to 2016. We used 33 CGPS stations to construct a time series of coordinate changes, which are decomposed by empirical orthogonal functions (EOFs) in the SCB. The average weighted root-mean-square (WRMS) reduction is 38% when we subtract GRACE-modeled vertical displacements from the GPS time series. The first common mode shows clear seasonal changes, indicating seasonal surface mass re-distribution in and around the South China Block. The correlation between the GRACE and GPS time series is analyzed, which provides a reference for further improvement of the seasonal variation of CGPS time series. Inversion of the GRACE observations yields the surface deformation caused by the surface mass load, at a rate of about −0.4 to −0.8 mm/year, which is used to correct the long-term non-tectonic loading trend in the GPS vertical velocity field and thus to further explain the crustal tectonic movement in the SCB and its surroundings. PMID:29301236

  14. Detection of early postseismic deformation from high-rate GNSS time series

    NASA Astrophysics Data System (ADS)

    Twardzik, C.; Vergnolle, M.; Avallone, A.; Sladen, A.

    2017-12-01

    Postseismic processes after an earthquake contribute to the redistribution of stresses in addition to that induced by the coseismic rupture. With the exception of very few studies (e.g., Miyazaki and Larson, 2008), most postseismic analyses only start one or two days following the mainshock. This leaves a critical part of the postseismic phase unexplored, from a few minutes up to a few hours after the earthquake. In this study, we use kinematic precise point positioning (K-PPP) to analyze continuous GNSS data in order to obtain 30 s position time series. These time series provide information on the surface displacements as soon as the dynamic response of the earthquake is over. Our first analysis focuses on the 2016 Pedernales, Ecuador, earthquake (Mw7.8). Using spectral analysis, we show that the typical logarithmic postseismic displacement trend can be detected as early as one to six hours after the earthquake depending on the station location and the level of noise. This analysis also allows us to estimate the bias on the coseismic offsets usually based on daily pre- and post-earthquake positions. We use the early postseismic time series to test whether rate-and-state friction laws, traditionally used to explain postseismic processes days after the earthquake, still hold right after the mainshock. This study is being extended to two other subduction earthquakes: the 2010 Maule, Chile, earthquake (Mw8.8) and the 2015 Illapel, Chile, earthquake (Mw8.2).
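
    The logarithmic postseismic trend mentioned above is commonly modelled as a constant offset plus a term of the form a*log(1 + t/tau). A minimal sketch of fitting such a curve to an early, 30 s sampled position time series with SciPy is given below; the synthetic data and parameter values are placeholders, not the Pedernales results.

        import numpy as np
        from scipy.optimize import curve_fit

        # Synthetic 30-s position time series for the first 12 h after the mainshock
        t = np.arange(30.0, 12 * 3600.0, 30.0)           # seconds since origin time
        offset_true, a_true, tau_true = 250.0, 40.0, 1800.0
        disp = offset_true + a_true * np.log(1 + t / tau_true) + np.random.normal(0, 5.0, t.size)

        def postseismic(t, offset, a, tau):
            """Coseismic offset plus logarithmic afterslip-like decay (mm)."""
            return offset + a * np.log(1 + t / tau)

        popt, pcov = curve_fit(postseismic, t, disp, p0=[200.0, 10.0, 3600.0])
        print("offset=%.1f mm, a=%.1f mm, tau=%.0f s" % tuple(popt))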

  15. The quasi-biennial vertical oscillations at global GPS stations: identification by ensemble empirical mode decomposition.

    PubMed

    Pan, Yuanjin; Shen, Wen-Bin; Ding, Hao; Hwang, Cheinway; Li, Jin; Zhang, Tengxu

    2015-10-14

    Modeling nonlinear vertical components of a GPS time series is critical to separating sources contributing to mass displacements. Improved vertical precision in GPS positioning at stations for velocity fields is key to resolving the mechanism of certain geophysical phenomena. In this paper, we use ensemble empirical mode decomposition (EEMD) to analyze the daily GPS time series at 89 continuous GPS stations, spanning from 2002 to 2013. EEMD decomposes a GPS time series into different intrinsic mode functions (IMFs), which are used to identify different kinds of signals and secular terms. Our study suggests that the GPS records contain not only the well-known signals (such as semi-annual and annual signals) but also the seldom-noted quasi-biennial oscillations (QBS). The quasi-biennial signals are explained by modeled loadings of atmosphere, non-tidal and hydrology that deform the surface around the GPS stations. In addition, the loadings derived from GRACE gravity changes are also consistent with the quasi-biennial deformations derived from the GPS observations. By removing the modeled components, the weighted root-mean-square (WRMS) variation of the GPS time series is reduced by 7.1% to 42.3%, and especially, after removing the seasonal and QBO signals, the average improvement percentages for seasonal and QBO signals are 25.6% and 7.5%, respectively, suggesting that it is significant to consider the QBS signals in the GPS records to improve the observed vertical deformations.

  16. The Quasi-Biennial Vertical Oscillations at Global GPS Stations: Identification by Ensemble Empirical Mode Decomposition

    PubMed Central

    Pan, Yuanjin; Shen, Wen-Bin; Ding, Hao; Hwang, Cheinway; Li, Jin; Zhang, Tengxu

    2015-01-01

    Modeling nonlinear vertical components of a GPS time series is critical to separating sources contributing to mass displacements. Improved vertical precision in GPS positioning at stations for velocity fields is key to resolving the mechanism of certain geophysical phenomena. In this paper, we use ensemble empirical mode decomposition (EEMD) to analyze the daily GPS time series at 89 continuous GPS stations, spanning from 2002 to 2013. EEMD decomposes a GPS time series into different intrinsic mode functions (IMFs), which are used to identify different kinds of signals and secular terms. Our study suggests that the GPS records contain not only the well-known signals (such as semi-annual and annual signals) but also the seldom-noted quasi-biennial oscillations (QBS). The quasi-biennial signals are explained by modeled loadings of atmosphere, non-tidal and hydrology that deform the surface around the GPS stations. In addition, the loadings derived from GRACE gravity changes are also consistent with the quasi-biennial deformations derived from the GPS observations. By removing the modeled components, the weighted root-mean-square (WRMS) variation of the GPS time series is reduced by 7.1% to 42.3%, and especially, after removing the seasonal and QBO signals, the average improvement percentages for seasonal and QBO signals are 25.6% and 7.5%, respectively, suggesting that it is significant to consider the QBS signals in the GPS records to improve the observed vertical deformations. PMID:26473882

  17. Searching for geodetic transient slip signals along the Parkfield segment of the San Andreas Fault

    NASA Astrophysics Data System (ADS)

    Rousset, B.; Burgmann, R.

    2017-12-01

    The Parkfield section of the San Andreas fault is at the transition between a segment locked since the 1857 Mw 7.9 Fort Tejon earthquake to its south and a creeping segment to the north. It is particularly well instrumented. While many previous studies have focused on the coseismic and postseismic phases of the two most recent earthquake cycles, the interseismic phase exhibits interesting dynamics at the down-dip edge of the seismogenic zone, characterized by a very large number of low frequency earthquakes (LFE) with different behaviors depending on location. Interseismic fault creep rates appear to vary over a wide range of spatial and temporal scales, from the Earth's surface to the base of the crust. In this study, we take advantage of the dense Global Positioning System (GPS) network, with 77 continuous stations located within a circle of radius 80 km centered on Parkfield. We correct these time series for the co- and postseismic signals of the 2003 Mw 6.3 San Simeon and 2004 Mw 6.0 Parkfield earthquakes. We then cross-correlate the residual time series with synthetic slow-slip templates following the approach of Rousset et al. (2017). Synthetic tests with transient events contained in GPS time series with realistic noise show the detection limit of the method. In the application to real GPS time series, the highest correlation amplitudes are compared with micro-seismicity rates, as well as tremor and LFE observations.

  18. Forecasting air quality time series using deep learning.

    PubMed

    Freeman, Brian S; Taylor, Graham; Gharabaghi, Bahram; Thé, Jesse

    2018-04-13

    This paper presents one of the first applications of deep learning (DL) techniques to predict air pollution time series. Air quality management relies extensively on time series data captured at air monitoring stations as the basis of identifying population exposure to airborne pollutants and determining compliance with local ambient air standards. In this paper, 8 hr averaged surface ozone (O3) concentrations were predicted using deep learning consisting of a recurrent neural network (RNN) with long short-term memory (LSTM). Hourly air quality and meteorological data were used to train and forecast values up to 72 hours with low error rates. The LSTM was able to forecast the duration of continuous O3 exceedances as well. Prior to training the network, the dataset was reviewed for missing data and outliers. Missing data were imputed using a novel technique that averaged gaps less than eight time steps with incremental steps based on first-order differences of neighboring time periods. Data were then used to train decision trees to evaluate input feature importance over different time prediction horizons. The number of features used to train the LSTM model was reduced from 25 features to 5 features, resulting in improved accuracy as measured by Mean Absolute Error (MAE). Parameter sensitivity analysis identified that look-back nodes associated with the RNN can be a significant source of error if not aligned with the prediction horizon. Overall, MAEs of less than 2 were calculated for predictions out to 72 hours. Novel deep learning techniques were used to train an 8-hour averaged ozone forecast model. Missing data and outliers within the captured data set were replaced using a new imputation method that generated calculated values closer to the expected value based on the time and season. Decision trees were used to identify input variables with the greatest importance. The methods presented in this paper allow air managers to forecast long range air pollution concentration while only monitoring key parameters and without transforming the data set in its entirety, thus allowing real time inputs and continuous prediction.
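
    As a simplified stand-in for the imputation step described (filling only short gaps, here by time-based interpolation limited to runs of fewer than eight missing steps rather than the paper's difference-based scheme), a pandas sketch follows; the synthetic ozone series and gap positions are placeholders.

        import numpy as np
        import pandas as pd

        rng = pd.date_range("2017-06-01", periods=24 * 14, freq="h")
        o3 = pd.Series(30 + 15 * np.sin(2 * np.pi * np.arange(len(rng)) / 24)
                       + np.random.normal(0, 2, len(rng)), index=rng)
        o3.iloc[50:54] = np.nan       # short gap: will be filled
        o3.iloc[200:260] = np.nan     # long gap: needs a different strategy

        filled = o3.interpolate(method="time", limit=7, limit_area="inside")
        # limit=7 fills at most 7 consecutive values, so most of the 60-sample gap stays missing
        print(int(filled.isna().sum()), "hourly values still missing")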

  19. Capabilities of stochastic rainfall models as data providers for urban hydrology

    NASA Astrophysics Data System (ADS)

    Haberlandt, Uwe

    2017-04-01

    For planning of urban drainage systems using hydrological models, long, continuous precipitation series with high temporal resolution are needed. Since observed time series are often too short or not available everywhere, the use of synthetic precipitation is a common alternative. This contribution compares three precipitation models regarding their suitability to provide 5 minute continuous rainfall time series for a) sizing of drainage networks for urban flood protection and b) dimensioning of combined sewage systems for pollution reduction. The rainfall models are a parametric stochastic model (Haberlandt et al., 2008), a non-parametric probabilistic approach (Bárdossy, 1998) and a stochastic downscaling of dynamically simulated rainfall (Berg et al., 2013); all models are operated both as single site and multi-site generators. The models are applied with regionalised parameters assuming that there is no station at the target location. Rainfall and discharge characteristics are utilised for evaluation of the model performance. The simulation results are compared against results obtained from reference rainfall stations not used for parameter estimation. The rainfall simulations are carried out for the federal states of Baden-Württemberg and Lower Saxony in Germany and the discharge simulations for the drainage networks of the cities of Hamburg, Brunswick and Freiburg. Altogether, the results show comparable simulation performance for the three models, good capabilities for single site simulations but low skills for multi-site simulations. Remarkably, there is no significant difference in simulation performance comparing the tasks flood protection with pollution reduction, so the models are finally able to simulate both the extremes and the long term characteristics of rainfall equally well. Bárdossy, A., 1998. Generating precipitation time series using simulated annealing. Wat. Resour. Res., 34(7): 1737-1744. Berg, P., Wagner, S., Kunstmann, H., Schädler, G., 2013. High resolution regional climate model simulations for Germany: part I — validation. Climate Dynamics, 40(1): 401-414. Haberlandt, U., Ebner von Eschenbach, A.-D., Buchwald, I., 2008. A space-time hybrid hourly rainfall model for derived flood frequency analysis. Hydrol. Earth Syst. Sci., 12: 1353-1367.

  20. Improvement of Linde Kryotechnik's internal purifier

    NASA Astrophysics Data System (ADS)

    Decker, Lutz; Meier, Albert; Wilhelm, Hanspeter

    2014-01-01

    With the recent shortage in the supply of helium, recovery solutions have experienced a new focus, with a tendency to recover streams with higher impurity content. This development calls for purifier systems operating efficiently and with low impact on liquefaction capacity for helium streams with impurity levels in the percentage range. Linde Kryotechnik has answered this demand by improving the performance of its purifier technology. Since 1983, its standardized helium liquefiers of the L- and former TCF-series type have contained an internal purifier which already allows efficient impurity removal with minimized space demand. Along with a line dryer to absorb humidity, it is designed to remove air impurities up to 5 mol%. However, with increasing impurity level, the liquefaction capacity was significantly reduced, was furthermore restricted to an upper level of approx. 180 l/h, and continuous purification became limited in time. With the current redesign of this purifier, the impact on liquefaction capacity is now minimized without any limitation within the capacity range of the L-series plants. Continuous purification is hence ensured beyond the previous maximum impurity content. This paper presents the key design changes and the achievable performance, which has been verified in the recent L-series plants delivered to customers.

  1. Combined cGPS and InSAR time series for observing subsidence in the southern Central Valley due to groundwater exploitation

    NASA Astrophysics Data System (ADS)

    Neely, W.; Borsa, A. A.; Silverii, F.

    2017-12-01

    Recent droughts have increased reliance on groundwater for agricultural production in California's Central Valley. Using Interferometric Synthetic Aperture Radar (InSAR), we observe upwards of 25 cm/yr of subsidence from November 2014 to February 2017 due to intense pumping. However, these observations are contaminated by atmospheric noise and orbital errors. We present a novel method for correcting long wavelength errors in InSAR deformation estimates using time series from continuous Global Positioning System (cGPS) stations within the SAR footprint, which we apply to C-band data from the Sentinel mission. We test our method using 49 SAR acquisitions from the Sentinel 1 satellites and 107 cGPS time series from the Geodesy Advancing Geoscience and EarthScope (GAGE) network in the southern Central Valley. We correct each interferogram separately, implementing an intermittent Small Baseline Subset (ISBAS) technique to produce a time series of line-of-sight surface motion from 276 InSAR pairs. To estimate the vertical component of this motion, we remove horizontal tectonic displacements predicted by the Southern California Earthquake Center's (SCEC) Community Geodetic Model. We validate our method by comparing the corrected InSAR results with independent cGPS data and find a marked improvement in agreement between the two data sets, particularly in the deformation rates. Using this technique, we characterize the time evolution of vertical surface deformation in the southern Central Valley related to human exploitation of local groundwater resources. This methodology is applicable to data from other SAR satellites, including ALOS-2 and the upcoming US-India NISAR mission.

  2. Observing climate change trends in ocean biogeochemistry: when and where.

    PubMed

    Henson, Stephanie A; Beaulieu, Claudie; Lampitt, Richard

    2016-04-01

    Understanding the influence of anthropogenic forcing on the marine biosphere is a high priority. Climate change-driven trends need to be accurately assessed and detected in a timely manner. As part of the effort towards detection of long-term trends, a network of ocean observatories and time series stations provide high quality data for a number of key parameters, such as pH, oxygen concentration or primary production (PP). Here, we use an ensemble of global coupled climate models to assess the temporal and spatial scales over which observations of eight biogeochemically relevant variables must be made to robustly detect a long-term trend. We find that, as a global average, continuous time series are required for between 14 (pH) and 32 (PP) years to distinguish a climate change trend from natural variability. Regional differences are extensive, with low latitudes and the Arctic generally needing shorter time series (<~30 years) to detect trends than other areas. In addition, we quantify the 'footprint' of existing and planned time series stations, that is the area over which a station is representative of a broader region. Footprints are generally largest for pH and sea surface temperature, but nevertheless the existing network of observatories only represents 9-15% of the global ocean surface. Our results present a quantitative framework for assessing the adequacy of current and future ocean observing networks for detection and monitoring of climate change-driven responses in the marine ecosystem. © 2016 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.
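
    The number of years of continuous observations required to distinguish a trend from autocorrelated natural variability is often estimated with the approximation of Weatherhead et al. (1998) for AR(1) noise; a sketch of that calculation is given below. The trend, noise standard deviation and lag-1 autocorrelation values are placeholders for illustration, not results from the model ensemble.

        import numpy as np

        def detection_time(trend_per_year, noise_std, lag1_autocorr):
            """Approximate years of data needed to detect a trend with ~90% probability
            (Weatherhead et al., 1998, assuming AR(1) noise)."""
            phi = lag1_autocorr
            return ((3.3 * noise_std / abs(trend_per_year))
                    * np.sqrt((1 + phi) / (1 - phi))) ** (2.0 / 3.0)

        # Placeholder values, e.g. surface pH: trend -0.002 per year, noise std 0.02, phi 0.5
        print(round(detection_time(-0.002, 0.02, 0.5), 1), "years")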

  3. A daily Azores-Iceland North Atlantic Oscillation index back to 1850.

    PubMed

    Cropper, Thomas; Hanna, Edward; Valente, Maria Antónia; Jónsson, Trausti

    2015-07-01

    We present the construction of a continuous, daily (09:00 UTC), station-based (Azores-Iceland) North Atlantic Oscillation (NAO) Index back to 1871 which is extended back to 1850 with additional daily mean data. The constructed index more than doubles the length of previously existing, widely available, daily NAO time series. The index is created using entirely observational sea-level pressure (SLP) data from Iceland and 73.5% of observational SLP data from the Azores - the remainder being filled in via reanalysis (Twentieth Century Reanalysis Project and European Mean Sea Level Pressure) SLP data. Icelandic data are taken from the Southwest Iceland pressure series. We construct and document a new Ponta Delgada SLP time series based on recently digitized and newly available data that extend back to 1872. The Ponta Delgada time series is created by splicing together several fractured records (from Ponta Delgada, Lajes, and Santa Maria) and filling in the major gaps (pre-1872, 1888-1905, and 1940-1941) and occasional days (145) with reanalysis data. Further homogeneity corrections are applied to the Azores record, and the daily (09:00 UTC) NAO index is then calculated. The resulting index, with its extended temporal length and daily resolution, is the first reconstruction of daily NAO back into the 19th Century and therefore is useful for researchers across multiple disciplines.

  4. A 19-year radar altimeter elevation change time-series of the East and West Antarctic ice sheets

    NASA Astrophysics Data System (ADS)

    Sundal, A. V.; Shepherd, A.; Wingham, D.; Muir, A.; Mcmillan, M.; Galin, N.

    2012-12-01

    We present 19 years of continuous radar altimeter observations of the East and West Antarctic ice sheets acquired by the ERS-1, ERS-2, and ENVISAT satellites between May 1992 and September 2010. Time-series of surface elevation change were developed at 39,375 crossing points of the satellite orbit ground tracks using the method of dual cycle crossovers (Zwally et al., 1989; Wingham et al., 1998). In total, 46.5 million individual measurements were included in the analysis, encompassing 74% and 76% of the East and West Antarctic ice sheets, respectively. The satellites were cross-calibrated by calculating differences between elevation changes occurring during periods of mission overlap. We use the merged time-series to explore spatial and temporal patterns of elevation change and to characterise and quantify the signals of Antarctic ice sheet imbalance. References: Wingham, D., Ridout, A., Scharroo, R., Arthern, R. & Shum, C.K. (1998): Antarctic elevation change from 1992 to 1996. Science, 282, 456-458. Zwally, H. J., Brenner, A. C., Major, J. A., Bindschadler, R. A. & Marsh, J. G. (1989): Growth of Greenland ice-sheet - measurements. Science, 246, 1587-1589.

  5. Current Research at the Endeavour Ridge 2000 Integrated Studies Site

    NASA Astrophysics Data System (ADS)

    Butterfield, D. A.; Kelley, D. S.; Ridge 2000 Community, R.

    2004-12-01

    Integrated geophysical, geological, chemical, and biological studies are being conducted on the Endeavour segment with primary support from NSF, the W.M. Keck Foundation, and NSERC (Canada). The research includes a seismic network, physical and chemical sensors, high-precision mapping and time-series sampling. Several research expeditions have taken place at the Endeavour ISS in the past year. In June 2003, an NSF-sponsored cruise with R.V. T.G. Thompson/ROV Jason2 installed microbial incubators in drill-holes in the sides of active sulfide chimneys and sampled rocks, fluids, and microbes in the Mothra and Main Endeavour Field (MEF). In July 2003, with Thompson/Jason2, an NSF-LEXEN project at Baby Bare on the Endeavour east flank conducted sampling through seafloor-penetrating probes, plus time-series sampling of fluids, microbes, and rocks at the MEF. In September 2003, with Thompson/ROV ROPOS, the Keck Proto-Neptune project installed a seismic network consisting of 1 broadband and 7 short-period seismometers, installed chemical/physical sensors and time-series samplers for chemistry and microbiology at the MEF and Clam Bed sites, and collected rocks, fluids, animals, and microbes. In May/June 2004, an NSF-sponsored Atlantis/Alvin cruise recovered sulfide incubators installed in 2003, redeployed a sulfide incubator, mapped the MEF and Mothra vent fields with high-resolution Imagenix sonar, sampled fluids from MEF, Mothra, and Clam Bed, recovered year-long time-series fluid and microbial samplers from MEF and Clam Bed, recovered and installed hot vent temperature-resistivity monitors, cleaned up the MEF and deployed new markers at major sulfide structures. In August 2004, there were two MBARI/Keck-sponsored cruises with R.V. Western Flyer/ROV Tiburon. The first cruise completed the seismic network with the addition of two more broadband seismometers and serviced all 7 short-period seismometers. Tiburon then performed microbial and chemical investigations at MEF, Mothra, Sasquatch, and Middle Valley, collecting fluid, particle, and animal samples for culture and phylogenetic analysis. Tiburon continued in late August/September with detailed petrological sampling. A Keck-sponsored Thompson/ROPOS cruise in September continued work on chemical/physical sensor deployments and time-series chemical and microbial sampling. A graduate student workshop at Friday Harbor beginning October 2004 will analyze the first year of data from the seismic network and begin to correlate seismic activity with hydrothermal activity. The Endeavour ISS is still in a phase of data collection and sensor development, but moving toward data integration.

  6. Methods used to compute low-flow frequency characteristics for continuous-record streamflow stations in Minnesota, 2006

    USGS Publications Warehouse

    Winterstein, Thomas A.; Arntson, Allan D.; Mitton, Gregory B.

    2007-01-01

    The 1-, 7-, and 30-day low-flow series were determined for 120 continuous-record streamflow stations in Minnesota having at least 20 years of continuous record. The 2-, 5-, 10-, 50-, and 100-year statistics were determined for each series by fitting a log Pearson type III distribution to the data. The methods used to determine the low-flow statistics and to construct the plots of the low-flow frequency curves are described. The low-flow series and the low-flow statistics are presented in tables and graphs.
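
    A minimal sketch of the core calculation described above (fit a log-Pearson Type III distribution to an annual n-day low-flow series and read off the T-year low flows as non-exceedance quantiles), using invented flow values and scipy's pearson3 distribution rather than the USGS procedures:

```python
import numpy as np
from scipy import stats

# Hypothetical annual minimum 7-day mean flows (cubic feet per second)
q7 = np.array([12.0, 8.5, 15.2, 6.1, 9.8, 11.3, 7.4, 13.9, 5.9, 10.4,
               14.1, 8.0, 9.2, 12.7, 6.8, 11.9, 7.9, 10.1, 13.3, 9.5])

logq = np.log10(q7)

# Fit a Pearson Type III distribution to the log-transformed flows
skew, loc, scale = stats.pearson3.fit(logq)

# The T-year low flow corresponds to a non-exceedance probability of 1/T
for T in (2, 5, 10, 50, 100):
    q_T = 10 ** stats.pearson3.ppf(1.0 / T, skew, loc=loc, scale=scale)
    print(f"{T:>3}-year 7-day low flow: {q_T:.2f} cfs")
```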

  7. 78 FR 17357 - Fisheries of the Gulf of Mexico; Southeast Data, Assessment, and Review (SEDAR); Public Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-21

    ... assessment of the Gulf of Mexico red snapper fishery will consist of a series of workshops and supplemental... Workshop Webinars 7 and 8 are as follows: Panelists will continue to review the progress of modeling... (see ADDRESSES) at least 10 business days prior to the meeting. Note: The times and sequence specified...

  8. Information Services; A Survey of the History and Present Status of the Field. MOREL Regional Information System for Educators.

    ERIC Educational Resources Information Center

    Grimes, George

    This document is one of a series describing the background, functions, and utilization of the Regional Information System (RIS) developed by the Michigan-Ohio Regional Educational Laboratory (MOREL). The continuing history of the field of librarianship and information services is reviewed in this report. The first part covers ancient times to the…

  9. Training Adult Educators. Proceedings of a National Conference (2nd, Wodonga, Victoria, Australia, May 25-28, 1985). The AAAE Monograph Series in Adult and Continuing Education Number Two.

    ERIC Educational Resources Information Center

    Peace, Brian, Ed.; Foster, Keith, Ed.

    The following papers are included: "Setting the Scene" (Brian Peace); "Different Training for Different Adult Educators?" (Michael Newman); "The Training of Part-Time Teachers in Adult Education: The UK Experience" (Brian Graham); "Adult Education Tutor Support" (Aileen Kelly); "Six Category…

  10. The PREDICTS database: a global database of how local terrestrial biodiversity responds to human impacts

    Treesearch

    L.N. Hudson; T. Newbold; S. Contu

    2014-01-01

    Biodiversity continues to decline in the face of increasing anthropogenic pressures such as habitat destruction, exploitation, pollution and introduction of alien species. Existing global databases of species’ threat status or population time series are dominated by charismatic species. The collation of datasets with broad taxonomic and biogeographic extents, and that...

  11. Indian Astronomy: History of

    NASA Astrophysics Data System (ADS)

    Mercier, R.; Murdin, P.

    2002-01-01

    From the time of Āryabhaṭa (ca AD 500) there appeared in India a series of Sanskrit treatises on astronomy. Written always in verse, and normally accompanied by prose commentaries, these served to create an Indian tradition of mathematical astronomy which continued into the 18th century. There are as well texts from earlier centuries, grouped under the name Jyotiṣavedāṅga…

  12. Learning as a shared responsibility: Insights from a series of dialogic workshops with practitioners, leaders, and researchers (Abstract)

    Treesearch

    Anne Black; Dave Thomas; Jennifer Ziegler; Jim Saveland

    2012-01-01

    For some time now, the wildland fire community has been interested in 'organizational learning' as a way to improve safety and overall performance. For instance, in the US, federal agencies have established and continue to support the Wildland Fire Lessons Learned Center, sponsored several national conferences and are currently considering how incident...

  13. Transfer Function Identification Using Orthogonal Fourier Transform Modeling Functions

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    2013-01-01

    A method for transfer function identification, including both model structure determination and parameter estimation, was developed and demonstrated. The approach uses orthogonal modeling functions generated from frequency domain data obtained by Fourier transformation of time series data. The method was applied to simulation data to identify continuous-time transfer function models and unsteady aerodynamic models. Model fit error, estimated model parameters, and the associated uncertainties were used to show the effectiveness of the method for identifying accurate transfer function models from noisy data.
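
    The orthogonal-modeling-function method itself is not reproduced here; as a much simpler illustration of frequency-domain identification from time series data, the sketch below estimates a frequency response with Welch cross- and auto-spectra on a simulated input/output pair. All signals and the example filter are invented.

```python
import numpy as np
from scipy import signal

fs = 100.0                                   # sample rate (Hz)
t = np.arange(0, 60, 1 / fs)

# Simulated input: random excitation
rng = np.random.default_rng(0)
u = rng.standard_normal(t.size)

# "Unknown" system standing in for the dynamics to be identified:
# a second-order low-pass filter with 5 Hz cutoff, plus measurement noise
b, a = signal.butter(2, 5.0, fs=fs)
y = signal.lfilter(b, a, u) + 0.05 * rng.standard_normal(t.size)

# H1 frequency-response estimate: cross-spectrum / input auto-spectrum
f, Puu = signal.welch(u, fs=fs, nperseg=1024)
_, Puy = signal.csd(u, y, fs=fs, nperseg=1024)
H = Puy / Puu

# Compare with the true filter response at the same frequencies
_, H_true = signal.freqz(b, a, worN=f, fs=fs)
print("max magnitude error below 10 Hz:",
      np.max(np.abs(np.abs(H) - np.abs(H_true))[f < 10]))
```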

  14. Disentangling Time-series Spectra with Gaussian Processes: Applications to Radial Velocity Analysis

    NASA Astrophysics Data System (ADS)

    Czekala, Ian; Mandel, Kaisey S.; Andrews, Sean M.; Dittmann, Jason A.; Ghosh, Sujit K.; Montet, Benjamin T.; Newton, Elisabeth R.

    2017-05-01

    Measurements of radial velocity variations from the spectroscopic monitoring of stars and their companions are essential for a broad swath of astrophysics; these measurements provide access to the fundamental physical properties that dictate all phases of stellar evolution and facilitate the quantitative study of planetary systems. The conversion of those measurements into both constraints on the orbital architecture and individual component spectra can be a serious challenge, however, especially for extreme flux ratio systems and observations with relatively low sensitivity. Gaussian processes define sampling distributions of flexible, continuous functions that are well-motivated for modeling stellar spectra, enabling proficient searches for companion lines in time-series spectra. We introduce a new technique for spectral disentangling, where the posterior distributions of the orbital parameters and intrinsic, rest-frame stellar spectra are explored simultaneously without needing to invoke cross-correlation templates. To demonstrate its potential, this technique is deployed on red-optical time-series spectra of the mid-M-dwarf binary LP661-13. We report orbital parameters with improved precision compared to traditional radial velocity analysis and successfully reconstruct the primary and secondary spectra. We discuss potential applications for other stellar and exoplanet radial velocity techniques and extensions to time-variable spectra. The code used in this analysis is freely available as an open-source Python package.

  15. On the Impact of a Quadratic Acceleration Term in the Analysis of Position Time Series

    NASA Astrophysics Data System (ADS)

    Bogusz, Janusz; Klos, Anna; Bos, Machiel Simon; Hunegnaw, Addisu; Teferle, Felix Norman

    2016-04-01

    The analysis of Global Navigation Satellite System (GNSS) position time series generally assumes that each of the coordinate component series is described by the sum of a linear rate (velocity) and various periodic terms. The residuals, the deviations between the fitted model and the observations, are then a measure of the epoch-to-epoch scatter and have been used for the analysis of the stochastic character (noise) of the time series. Often the parameters of interest in GNSS position time series are the velocities and their associated uncertainties, which have to be determined with the highest reliability. It is clear that not all GNSS position time series follow this simple linear behaviour. Therefore, we have added an acceleration term in the form of a quadratic polynomial function to the model in order to better describe the non-linear motion in the position time series. This non-linear motion could be a response to purely geophysical processes, for example, elastic rebound of the Earth's crust due to ice mass loss in Greenland, artefacts due to deficiencies in bias mitigation models, for example, of the GNSS satellite and receiver antenna phase centres, or any combination thereof. In this study we have simulated 20 time series, 23 years in length, with different stochastic characteristics such as white, flicker or random walk noise. The noise amplitude was assumed to be 1 mm/yr^(-κ/4). We then added the deterministic part, consisting of a linear trend of 20 mm/yr (which represents the averaged horizontal velocity) and accelerations ranging from -0.6 to +0.6 mm/yr². For all these data we estimated the noise parameters with Maximum Likelihood Estimation (MLE) using the Hector software package without taking the non-linear term into account. In this way we set the benchmark to then investigate how the noise properties and velocity uncertainty may be affected by any un-modelled, non-linear term. The velocities and their uncertainties versus the accelerations for different types of noise are determined. Furthermore, we have selected 40 globally distributed stations that have a clear non-linear behaviour from two different International GNSS Service (IGS) analysis centers: JPL (Jet Propulsion Laboratory) and BLT (British Isles continuous GNSS Facility and University of Luxembourg Tide Gauge Benchmark Monitoring (TIGA) Analysis Center). We obtained maximum accelerations of -1.8±1.2 mm/yr² and -4.5±3.3 mm/yr² for the horizontal and vertical components, respectively. The noise analysis tests have shown that the addition of the non-linear term significantly whitened the power spectra of the position time series, i.e. shifted the spectral index from flicker towards white noise.
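
    A minimal sketch of the deterministic part of the model comparison (synthetic daily positions, ordinary least squares with and without the quadratic acceleration term); it deliberately ignores the colored-noise/MLE analysis that is central to the study.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0, 23, 1 / 365.25)            # 23 years, daily sampling (years)

# Synthetic position: 20 mm/yr velocity, 0.4 mm/yr^2 acceleration,
# annual signal and white noise (colored noise omitted for simplicity)
pos = 20.0 * t + 0.5 * 0.4 * t**2 + 2.0 * np.sin(2 * np.pi * t) \
      + rng.normal(0, 3, t.size)

def fit(design):
    coef, *_ = np.linalg.lstsq(design, pos, rcond=None)
    return coef

ones, s, c = np.ones_like(t), np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)
lin  = fit(np.column_stack([ones, t, s, c]))               # no acceleration term
quad = fit(np.column_stack([ones, t, 0.5 * t**2, s, c]))   # with acceleration

print("velocity, linear model only :", round(lin[1], 2), "mm/yr")
print("velocity, with acceleration :", round(quad[1], 2), "mm/yr",
      "| acceleration:", round(quad[2], 2), "mm/yr^2")
```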

  16. Estimating Unbiased Land Cover Change Areas In The Colombian Amazon Using Landsat Time Series And Statistical Inference Methods

    NASA Astrophysics Data System (ADS)

    Arevalo, P. A.; Olofsson, P.; Woodcock, C. E.

    2017-12-01

    Unbiased estimation of the areas of conversion between land categories ("activity data") and their uncertainty is crucial for providing more robust calculations of carbon emissions to the atmosphere, as well as their removals. This is particularly important for the REDD+ mechanism of UNFCCC, where economic compensation is tied to the magnitude and direction of such fluxes. Dense time series of Landsat data and statistical protocols are becoming an integral part of forest monitoring efforts, but there are relatively few studies in the tropics focused on using these methods to advance operational MRV systems (Monitoring, Reporting and Verification). We present the results of a prototype methodology for continuous monitoring and unbiased estimation of activity data that is compliant with the IPCC Approach 3 for representation of land. We used a break detection algorithm (Continuous Change Detection and Classification, CCDC) to fit pixel-level temporal segments to time series of Landsat data in the Colombian Amazon. The segments were classified using a Random Forest classifier to obtain annual maps of land categories between 2001 and 2016. Using these maps, a biannual stratified sampling approach was implemented and unbiased stratified estimators constructed to calculate area estimates with confidence intervals for each of the stable and change classes. Our results provide evidence of a decrease in primary forest as a result of conversion to pastures, as well as an increase in secondary forest as pastures are abandoned and the forest allowed to regenerate. Estimating areas of other land transitions proved challenging because of their very small mapped areas compared to stable classes like forest, which corresponds to almost 90% of the study area. Implications for remote sensing data processing, sample allocation and uncertainty reduction are also discussed.
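
    A minimal sketch of a stratified area estimator of the kind referred to above (stratum weights from the map, class proportions from a reference sample, standard error from the stratified variance); all numbers are invented and the stratification is simplified to three strata.

```python
import numpy as np

# Strata defined by the map: stable forest, stable pasture, forest-to-pasture
strata = ["stable_forest", "stable_pasture", "forest_loss"]
W = np.array([0.88, 0.10, 0.02])         # map-based stratum weights (proportions)
A_total = 1_000_000.0                    # total study area (ha)

# Reference sample: n_h units per stratum, y_h = count labelled "forest_loss"
n_h = np.array([300, 300, 200])
y_h = np.array([3, 6, 150])

p_h = y_h / n_h                          # stratum-level proportions of the class
p_hat = np.sum(W * p_h)                  # unbiased estimate of the class proportion
area = p_hat * A_total

# Standard error of a stratified proportion estimator
var = np.sum(W**2 * p_h * (1 - p_h) / (n_h - 1))
se_area = np.sqrt(var) * A_total

print(f"forest-loss area: {area:,.0f} ± {1.96 * se_area:,.0f} ha (95% CI)")
```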

  17. Filtering methods in tidal-affected groundwater head measurements: Application of harmonic analysis and continuous wavelet transform

    NASA Astrophysics Data System (ADS)

    Sánchez-Úbeda, Juan Pedro; Calvache, María Luisa; Duque, Carlos; López-Chicano, Manuel

    2016-11-01

    A new methodology has been developed to obtain tidal-filtered time series of groundwater levels in coastal aquifers. Two methods used for oceanographic processing and forecasting of sea level data were adapted for this purpose and compared: HA (Harmonic Analysis) and CWT (Continuous Wavelet Transform). The filtering process generally comprises two main steps: the detection and fitting of the major tide constituents through the decomposition of the original signal and the subsequent extraction of the complete tidal oscillations. The abilities of the HA and CWT methods to decompose and extract the tidal oscillations were assessed by applying them to the data from two piezometers at different depths close to the shoreline of a Mediterranean coastal aquifer (Motril-Salobreña, SE Spain). These methods were applied to three time series of different lengths (one month, one year, and 3.7 years of hourly data) to determine the range of detected frequencies. The different lengths of time series were also used to determine the fit accuracies of the tidal constituents for both the sea level and groundwater head measurements. The detected tidal constituents were better resolved with increasing depth in the aquifer. The application of these methods yielded a detailed resolution of the tidal components, which enabled the extraction of the major tidal constituents of the sea level measurements from the groundwater heads (e.g., semi-diurnal, diurnal, fortnightly, monthly, semi-annual and annual). In the two wells studied, the CWT method was shown to be more effective than HA for extracting the tidal constituents of highest and lowest frequencies from groundwater head measurements.
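
    A minimal sketch of the harmonic-analysis side of such filtering: a least-squares fit of sinusoids at a few standard tidal periods (M2, S2, K1, O1 only) to an hourly head record, followed by subtraction of the fitted tidal part. The data are synthetic and the constituent set is far smaller than in a full HA.

```python
import numpy as np

# Principal tidal constituents and their periods in hours (standard values)
periods = {"M2": 12.4206012, "S2": 12.0, "K1": 23.9344697, "O1": 25.8193417}

t = np.arange(0.0, 24 * 30)                          # one month of hourly samples
rng = np.random.default_rng(2)
head = (0.05 * t / 24                                # slow background trend (m)
        + 0.10 * np.sin(2 * np.pi * t / periods["M2"] + 0.3)
        + 0.04 * np.sin(2 * np.pi * t / periods["K1"] + 1.1)
        + rng.normal(0, 0.005, t.size))

# Least-squares harmonic fit: mean + trend + a cos/sin pair per constituent
cols = [np.ones_like(t), t]
for P in periods.values():
    w = 2 * np.pi / P
    cols += [np.cos(w * t), np.sin(w * t)]
G = np.column_stack(cols)
coef, *_ = np.linalg.lstsq(G, head, rcond=None)

tidal_part = G[:, 2:] @ coef[2:]                     # fitted tidal oscillations
filtered = head - tidal_part                         # tide-filtered groundwater head
print("rms misfit of harmonic model:", np.std(head - G @ coef))
print("rms of removed tidal signal :", np.std(tidal_part))
```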

  18. Non-target time trend screening: a data reduction strategy for detecting emerging contaminants in biological samples.

    PubMed

    Plassmann, Merle M; Tengstrand, Erik; Åberg, K Magnus; Benskin, Jonathan P

    2016-06-01

    Non-targeted mass spectrometry-based approaches for detecting novel xenobiotics in biological samples are hampered by the occurrence of naturally fluctuating endogenous substances, which are difficult to distinguish from environmental contaminants. Here, we investigate a data reduction strategy for datasets derived from a biological time series. The objective is to flag reoccurring peaks in the time series based on increasing peak intensities, thereby reducing peak lists to only those which may be associated with emerging bioaccumulative contaminants. As a result, compounds with increasing concentrations are flagged while compounds displaying random, decreasing, or steady-state time trends are removed. As an initial proof of concept, we created artificial time trends by fortifying human whole blood samples with isotopically labelled standards. Different scenarios were investigated: eight model compounds had a continuously increasing trend in the last two to nine time points, and four model compounds had a trend that reached steady state after an initial increase. Each time series was investigated at three fortification levels and one unfortified series. Following extraction, analysis by ultra performance liquid chromatography high-resolution mass spectrometry, and data processing, a total of 21,700 aligned peaks were obtained. Peaks displaying an increasing trend were filtered from randomly fluctuating peaks using time trend ratios and Spearman's rank correlation coefficients. The first approach was successful in flagging model compounds spiked at only two to three time points, while the latter approach resulted in all model compounds ranking in the top 11 % of the peak lists. Compared to initial peak lists, a combination of both approaches reduced the size of datasets by 80-85 %. Overall, non-target time trend screening represents a promising data reduction strategy for identifying emerging bioaccumulative contaminants in biological samples. Graphical abstract Using time trends to filter out emerging contaminants from large peak lists.
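
    A minimal sketch of the Spearman-based flagging step (rank correlation of each peak's intensity against sampling time, keeping strongly increasing peaks); the peak table is simulated and the thresholds are illustrative, not those of the paper.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
n_peaks, n_timepoints = 1000, 10
time = np.arange(n_timepoints)

# Mostly randomly fluctuating peak intensities ...
intensities = rng.lognormal(mean=10, sigma=0.3, size=(n_peaks, n_timepoints))
# ... plus a handful of "emerging contaminants" with increasing trends
emerging = rng.choice(n_peaks, size=5, replace=False)
intensities[emerging] *= (1 + 0.4 * time)            # steadily increasing

flagged = []
for i, series in enumerate(intensities):
    rho, p = spearmanr(time, series)
    if rho > 0.8 and p < 0.05:                       # strong monotonic increase
        flagged.append(i)

print("flagged peaks:", sorted(flagged), "| true emerging:", sorted(emerging))
```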

  1. 42 years of continuous observations of the solar diameter from 1974 to 2015 - What do they forecast.

    NASA Astrophysics Data System (ADS)

    Humberto Andrei, Alexandre; Penna, Jucira; Boscardin, Sergio; Papa, Andres R. R.; Garcia, Marcos Antonio; Sigismondi, Costantino

    2016-07-01

    Several research groups around the world have run observational programs since 1974 to measure the apparent solar diameter over time with dedicated instruments called solar astrolabes. Their data have been gathered in several observing stations connected in the R2S3 (Réseau de Suivi au Sol du Rayon Solaire) network and through reciprocal visits and exchanges: Nice/Calern Observatoire/France, Rio de Janeiro Observatório Nacional/Brazil, Observatório de São Paulo IAGUSP/Brazil, Observatório Abrahão de Moraes IAGUSP/Brazil, Antalya Observatory/Turkey, San Fernando/Spain. Since all the optics and data treatment of the solar astrolabes were the same, from the oldest, with a single fixed objective prism, to the newest, with a variable-angle objective prism and digital image acquisition, their results could be put together. Each instrument had its own density filter with a prismatic effect responsible for a particular shift. Thus, identical data gathering and just a different prismatic shift made it possible to reconcile all those series by using the common stretches and derive a single additive constant to place each one onto a common average. By doing so, although the value itself of the ground-observed solar diameter is lost, its variations are determined over 35 years. On the combined series of the ground-observed solar diameter a modulation with the 11-year main solar cycle is evident. However, when such modulation is removed, both from the solar diameter compound series and from the solar activity series (as given by the sunspot count), a very strong anticorrelation is revealed. This suggested a larger diameter for the forthcoming cycles. This was very well verified for solar cycle 23, and correctly forecast for cycle 24, in a behavior similar to that of the Dalton and Maunder minima. The ground monitoring continues routinely at Observatório Nacional in Rio de Janeiro, now using the Solar Heliometer, specially built to this end. The Heliometer has the same focal length and aperture as the earlier solar astrolabes, and the diameter determination uses the same physical and mathematical definition of the solar limb. Therefore the same robust, hypothesis-free, simple combination by an additive constant can be used to include the Heliometer measurements in the previous long, continuous series. As a result the series of measurements of the variation of the solar diameter reaches 42 years and also covers solar cycle 24. In this paper we review the chief elements of all the individual series, as well as the calculation and values of the additive constants. We show the earlier comparison that led to an anticorrelation of 0.867 with the solar activity record, when the 11-year modulation is removed, and which gives an impressively accurate description of cycle 23. On the strength of such successful analysis we employ the new, longer series to discuss the current solar cycle 24 and forecast the following solar cycle 25. We thus advocate in favor of continued and continuous ground measurements of the solar diameter, of making these results available to the scientific community at large, and of the matter-of-fact, real variations of the solar diameter over long time periods and/or local places on the Sun, in this case possibly associated with major magnetism-driven solar transients.

  20. Biogeochemical Response to Mesoscale Physical Forcing in the California Current System

    NASA Technical Reports Server (NTRS)

    Niiler, Pearn P.; Letelier, Ricardo; Moisan, John R.; Marra, John A. (Technical Monitor)

    2001-01-01

    In the first part of the project, we investigated the local response of coastal ocean ecosystems (changes in chlorophyll concentration and chlorophyll fluorescence quantum yield) to physical forcing by developing and deploying Autonomous Drifting Ocean Stations (ADOS) within several mesoscale features along the U.S. west coast. Also, we compared the temporal and spatial variability registered by sensors mounted on the drifters to that registered by the sensors mounted on the satellites in order to assess the scales of variability that are not resolved by the ocean color satellite. The second part of the project used the existing WOCE SVP Surface Lagrangian drifters to track individual water parcels through time. The individual drifter tracks were used to generate multivariate time series by interpolating/extracting the biological and physical data fields retrieved by remote sensors (ocean color, SST, wind speed and direction, wind stress curl, and sea level topography). The individual time series of the physical data (AVHRR, TOPEX, NCEP) were analyzed against the ocean color (SeaWiFS) time-series to determine the time scale of biological response to the physical forcing. The results from this part of the research are being used to compare the decorrelation scales of chlorophyll from a Lagrangian and Eulerian framework. The results from both parts of this research augmented the necessary time series data needed to investigate the interactions between ocean mesoscale features, wind, and biogeochemical processes. Using the historical Lagrangian data sets, we have completed a comparison of the decorrelation scales in both the Eulerian and Lagrangian reference frames for the SeaWiFS data set. We are continuing to investigate how these results might be used in objective mapping efforts.

  1. Identification and classification of transient pulses observed in magnetometer array data by time-domain principal component analysis filtering

    NASA Astrophysics Data System (ADS)

    Kappler, Karl N.; Schneider, Daniel D.; MacLean, Laura S.; Bleier, Thomas E.

    2017-08-01

    A method for identification of pulsations in time series of magnetic field data which are simultaneously present in multiple channels of data at one or more sensor locations is described. Candidate pulsations of interest are first identified in geomagnetic time series by inspection. Time series of these "training events" are represented in matrix form and transpose-multiplied to generate time-domain covariance matrices. The ranked eigenvectors of this matrix are stored as a feature of the pulsation. In the second stage of the algorithm, a sliding window (approximately the width of the training event) is moved across the vector-valued time series comprising the channels on which the training event was observed. At each window position, the data covariance matrix and associated eigenvectors are calculated. We compare the orientation of the dominant eigenvectors of the training data to those from the windowed data and flag windows where the dominant eigenvector directions are similar. This was successful in automatically identifying pulses which share polarization and appear to be from the same source process. We apply the method to a case study of continuously sampled (50 Hz) data from six observatories, each equipped with three-component induction coil magnetometers. We examine a 90-day interval of data associated with a cluster of four observatories located within 50 km of Napa, California, together with two remote reference stations, one 100 km to the north of the cluster and the other 350 km south. When the training data contain signals present in the remote reference observatories, we are reliably able to identify and extract global geomagnetic signals such as solar-generated noise. When training data contain pulsations only observed in the cluster of local observatories, we identify several types of non-plane-wave signals having similar polarization.
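
    A minimal sketch of the core idea under simplifying assumptions (three synthetic channels, a single training pulse, dominant-eigenvector comparison by absolute dot product); it is not the authors' implementation.

```python
import numpy as np

def dominant_eigvec(X):
    """Dominant eigenvector of the channel covariance matrix of X
    (X has shape [n_channels, n_samples])."""
    C = X @ X.T / X.shape[1]
    w, V = np.linalg.eigh(C)
    return V[:, -1]                        # eigenvector of the largest eigenvalue

rng = np.random.default_rng(4)
fs, n = 50, 50 * 600                       # 50 Hz, 10 minutes, 3 channels
data = 0.2 * rng.standard_normal((3, n))

# Bury two pulses with a fixed polarization (channel mix) in the noise
pol = np.array([0.8, 0.5, -0.3]); pol /= np.linalg.norm(pol)
pulse = np.sin(2 * np.pi * 1.0 * np.arange(0, 2, 1 / fs)) * np.hanning(2 * fs)
for start in (5000, 21000):
    data[:, start:start + pulse.size] += np.outer(pol, pulse)

# "Training event": the first pulse, cut out by inspection
train_vec = dominant_eigvec(data[:, 5000:5000 + pulse.size])

# Slide a window of the same width and flag similar polarizations
width, flags = pulse.size, []
for s in range(0, n - width, width // 4):
    v = dominant_eigvec(data[:, s:s + width])
    if abs(np.dot(v, train_vec)) > 0.95:   # nearly parallel (sign-invariant)
        flags.append(s)

print("flagged window starts:", flags)
```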

  2. Perspectives on monitoring gradual change across the continuity of Landsat sensors using time-series data

    USGS Publications Warehouse

    Vogelmann, James; Gallant, Alisa L.; Shi, Hua; Zhu, Zhe

    2016-01-01

    There are many types of changes occurring over the Earth's landscapes that can be detected and monitored using Landsat data. Here we focus on monitoring "within-state," gradual changes in vegetation in contrast with traditional monitoring of "abrupt" land-cover conversions. Gradual changes result from a variety of processes, such as vegetation growth and succession, damage from insects and disease, responses to shifts in climate, and other factors. Despite the prevalence of gradual changes across the landscape, they are largely ignored by the remote sensing community. Gradual changes are best characterized and monitored using time-series analysis, and with the successful launch of Landsat 8 we now have appreciable data continuity that extends the Landsat legacy across the previous 43 years. In this study, we conducted three related analyses: (1) comparison of spectral values acquired by Landsats 7 and 8, separated by eight days, to ensure compatibility for time-series evaluation; (2) tracking of multitemporal signatures for different change processes across Landsat 5, 7, and 8 sensors using anniversary-date imagery; and (3) tracking the same type of processes using all available acquisitions. In this investigation, we found that data representing natural vegetation from Landsats 5, 7, and 8 were comparable and did not indicate a need for major modification prior to use for long-term monitoring. Analyses using anniversary-date imagery can be very effective for assessing long-term patterns and trends occurring across the landscape, and are especially good for providing insights into long-term, continuous trends of growth or decline. We found that use of all available data provided a much more comprehensive level of understanding of the trends occurring, providing information about rate, duration, and intra- and inter-annual variability that could not be readily gleaned from the anniversary-date analyses. We observed that using all available clear Landsat 5–8 observations with the new Continuous Change Detection and Classification (CCDC) algorithm was very effective for illuminating vegetation trends. There are a number of potential challenges for assessing gradual changes, including atmospheric impacts, algorithm development and visualization of the changes. One of the biggest challenges for studying gradual change will be the lack of appropriate data for validating results and products.

  3. Coupling Poisson rectangular pulse and multiplicative microcanonical random cascade models to generate sub-daily precipitation timeseries

    NASA Astrophysics Data System (ADS)

    Pohle, Ina; Niebisch, Michael; Müller, Hannes; Schümberg, Sabine; Zha, Tingting; Maurer, Thomas; Hinz, Christoph

    2018-07-01

    To simulate the impacts of within-storm rainfall variabilities on fast hydrological processes, long precipitation time series with high temporal resolution are required. Due to the limited availability of observed data, such time series are typically obtained from stochastic models. However, most existing rainfall models are limited in their ability to conserve the rainfall event statistics which are relevant for hydrological processes. Poisson rectangular pulse models are widely applied to generate long time series of alternating precipitation event durations and mean intensities as well as interstorm period durations. Multiplicative microcanonical random cascade (MRC) models are used to disaggregate precipitation time series from coarse to fine temporal resolution. To overcome the inconsistencies between the temporal structure of the Poisson rectangular pulse model and the MRC model, we developed a new coupling approach by introducing two modifications to the MRC model. These modifications comprise (a) a modified cascade model ("constrained cascade") which preserves the event durations generated by the Poisson rectangular pulse model by constraining the first and last interval of a precipitation event to contain precipitation and (b) continuous sigmoid functions of the multiplicative weights to consider the scale-dependency in the disaggregation of precipitation events of different durations. The constrained cascade model was evaluated in its ability to disaggregate observed precipitation events in comparison to existing MRC models. For that, we used a 20-year record of hourly precipitation at six stations across Germany. The constrained cascade model showed a pronounced better agreement with the observed data in terms of both the temporal pattern of the precipitation time series (e.g. the dry and wet spell durations and autocorrelations) and event characteristics (e.g. intra-event intermittency and intensity fluctuation within events). The constrained cascade model also slightly outperformed the other MRC models with respect to the intensity-frequency relationship. To assess the performance of the coupled Poisson rectangular pulse and constrained cascade model, precipitation events were stochastically generated by the Poisson rectangular pulse model and then disaggregated by the constrained cascade model. We found that the coupled model performs satisfactorily in terms of the temporal pattern of the precipitation time series, event characteristics and the intensity-frequency relationship.
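
    A minimal sketch of a plain microcanonical cascade step (each wet interval split in two with mass-conserving random weights), without the constrained-cascade and sigmoid-weight modifications introduced in the paper; parameters and data are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

def cascade_step(series, p_dry=0.3):
    """One microcanonical cascade level: split every interval into two,
    conserving mass exactly. With probability p_dry all rain goes to one
    half; otherwise it is divided by a random weight W ~ Uniform(0, 1)."""
    out = np.empty(series.size * 2)
    for i, v in enumerate(series):
        if v == 0:
            out[2 * i:2 * i + 2] = 0.0
        elif rng.random() < p_dry:             # intermittency: one dry half
            side = rng.integers(2)
            out[2 * i + side], out[2 * i + 1 - side] = v, 0.0
        else:
            w = rng.random()
            out[2 * i], out[2 * i + 1] = w * v, (1 - w) * v
    return out

# Disaggregate daily totals (mm) through 4 cascade levels: 24 h -> 1.5 h intervals
daily = np.array([12.0, 0.0, 3.5, 20.1])
fine = daily.copy()
for _ in range(4):
    fine = cascade_step(fine)

print(fine.round(2))
print("mass conserved:", np.isclose(fine.sum(), daily.sum()))
```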

  4. Using BiSON to detect solar internal g-modes

    NASA Astrophysics Data System (ADS)

    Kuszlewicz, J.; Davies, G. R.; Chaplin, W. J.

    2015-09-01

    The unambiguous detection of individual solar internal g modes continues to elude us. With the aid of new additions to calibration procedures, as well as updated methods to combine multi-site time series more effectively, the noise and signal detection threshold levels in the low-frequency domain (where the g modes are expected to be found) have been greatly improved. In the BiSON 23-year dataset these levels now rival those of GOLF, and with much greater frequency resolution available, due to the long time series, there is an opportunity to place more constraints on the upper limits of individual g mode amplitudes. Here we detail recent work dedicated to the challenges of observing low-frequency oscillations using a ground-based network, including the role of the window function as well as the effect of calibration on the low frequency domain.

  5. Lunisolar tidal force and its relationship to chlorophyll fluorescence in Arabidopsis thaliana.

    PubMed

    Fisahn, Joachim; Klingelé, Emile; Barlow, Peter

    2015-01-01

    The yield of chlorophyll fluorescence Ft was measured in leaves of Arabidopsis thaliana over periods of several days under conditions of continuous illumination (LL) without the application of saturating light pulses. After linearization of the time series of the chlorophyll fluorescence yield (ΔFt), oscillations became apparent with periodicities in the circatidal range. Alignments of these linearized time series ΔFt with the lunisolar tidal acceleration revealed high degrees of synchrony and phase congruence. Similar congruence with the lunisolar tide was obtained with the linearized quantum yield of PSII (ΔФII), recorded after application of saturating light pulses. These findings strongly suggest that there is an exogenous timekeeper which is a stimulus for the oscillations detected in both the linearized yield of chlorophyll fluorescence (ΔFt) and the linearized quantum yield of PSII (ΔФII).

  6. Seasonality in ocean microbial communities.

    PubMed

    Giovannoni, Stephen J; Vergin, Kevin L

    2012-02-10

    Ocean warming occurs every year in seasonal cycles that can help us to understand long-term responses of plankton to climate change. Rhythmic seasonal patterns of microbial community turnover are revealed when high-resolution measurements of microbial plankton diversity are applied to samples collected in lengthy time series. Seasonal cycles in microbial plankton are complex, but the expansion of fixed ocean stations monitoring long-term change and the development of automated instrumentation are providing the time-series data needed to understand how these cycles vary across broad geographical scales. By accumulating data and using predictive modeling, we gain insights into changes that will occur as the ocean surface continues to warm and as the extent and duration of ocean stratification increase. These developments will enable marine scientists to predict changes in geochemical cycles mediated by microbial communities and to gauge their broader impacts.

  7. Predicting Flood in Perlis Using Ant Colony Optimization

    NASA Astrophysics Data System (ADS)

    Nadia Sabri, Syaidatul; Saian, Rizauddin

    2017-06-01

    Flood forecasting is widely studied in order to reduce the effects of floods such as loss of property, loss of life and contamination of water supply. Usually flooding occurs due to continuous heavy rainfall. This study used a variant of the Ant Colony Optimization (ACO) algorithm, Ant-Miner, to develop a classification model to predict floods. However, since Ant-Miner only accepts discrete data, while rainfall data is a time series, a pre-processing step is needed to discretize the rainfall data first. This study used a technique called Symbolic Aggregate approXimation (SAX) to convert the rainfall time series data into discrete data. In addition, the simple k-means algorithm was used to cluster the data produced by SAX. The findings show that the predictive accuracy of the classification model is more than 80%.
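
    A minimal sketch of the SAX discretization step mentioned above (z-normalization, Piecewise Aggregate Approximation, symbol assignment by standard-normal breakpoints); the rainfall values and segment count are invented.

```python
import numpy as np
from scipy.stats import norm

def sax(series, n_segments, alphabet="abcd"):
    """Symbolic Aggregate approXimation of a 1-D time series."""
    x = np.asarray(series, dtype=float)
    x = (x - x.mean()) / x.std()                      # z-normalization

    # Piecewise Aggregate Approximation: mean of each (nearly) equal segment
    paa = np.array([seg.mean() for seg in np.array_split(x, n_segments)])

    # Breakpoints dividing the standard normal into equiprobable regions
    # (for a 4-letter alphabet these are the quartiles of N(0, 1))
    k = len(alphabet)
    breakpoints = norm.ppf(np.arange(1, k) / k)

    symbols = np.searchsorted(breakpoints, paa)
    return "".join(alphabet[i] for i in symbols)

# Hypothetical weekly rainfall totals (mm) reduced to an 8-symbol word
rainfall = [2, 0, 5, 40, 120, 80, 15, 3, 0, 0, 7, 60, 95, 30, 10, 1]
print(sax(rainfall, n_segments=8))
```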

  8. An agreement coefficient for image comparison

    USGS Publications Warehouse

    Ji, Lei; Gallo, Kevin

    2006-01-01

    Combination of datasets acquired from different sensor systems is necessary to construct a long time-series dataset for remotely sensed land-surface variables. Assessment of the agreement of the data derived from various sources is an important issue in understanding the data continuity through the time-series. Some traditional measures, including the correlation coefficient, coefficient of determination, mean absolute error, and root mean square error, are not always optimal for evaluating data agreement. For this reason, we developed a new agreement coefficient for comparing two different images. The agreement coefficient has the following properties: it is non-dimensional, bounded, symmetric, and able to distinguish between systematic and unsystematic differences. The paper provides examples of agreement analyses for hypothetical data and actual remotely sensed data. The results demonstrate that the agreement coefficient does include the above properties, and therefore is a useful tool for image comparison.

  9. Raising the legal drinking age in Maine: impact on traffic accidents among young drivers.

    PubMed

    Wagenaar, A C

    1983-04-01

    The minimum legal age for purchase and consumption of alcoholic beverages continues to be a controversial issue in North America as numerous jurisdictions that lowered the legal age in the early 1970s are returning to higher drinking ages. Monthly frequencies of motor vehicle crashes among drivers aged 18-45 in the states of Maine and Pennsylvania from 1972 through 1979 were examined using a multiple time series design. Controlling for the effects of long-term trends, seasonal cycles, and other factors with Box-Jenkins time series models, a significant 17-21% reduction in alcohol-related property damage crash involvement among drivers aged 18-19 is attributable to Maine's increase in drinking age. No demonstrable effect of the raised drinking age on the incidence of injury and fatal crashes was found.

  10. Lunisolar tidal force and its relationship to chlorophyll fluorescence in Arabidopsis thaliana

    PubMed Central

    Fisahn, Joachim; Klingelé, Emile; Barlow, Peter

    2015-01-01

    The yield of chlorophyll fluorescence Ft was measured in leaves of Arabidopsis thaliana over periods of several days under conditions of continuous illumination (LL) without the application of saturating light pulses. After linearization of the time series of the chlorophyll fluorescence yield (ΔFt), oscillations became apparent with periodicities in the circatidal range. Alignments of these linearized time series ΔFt with the lunisolar tidal acceleration revealed high degrees of synchrony and phase congruence. Similar congruence with the lunisolar tide was obtained with the linearized quantum yield of PSII (ΔФII), recorded after application of saturating light pulses. These findings strongly suggest that there is an exogenous timekeeper which is a stimulus for the oscillations detected in both the linearized yield of chlorophyll fluorescence (ΔFt) and the linearized quantum yield of PSII (ΔФII). PMID:26376108

  11. Field Performance of ISFET based Deep Ocean pH Sensors

    NASA Astrophysics Data System (ADS)

    Branham, C. W.; Murphy, D. J.

    2017-12-01

    Historically, ocean pH time series data were acquired from infrequent shipboard grab samples and measured using labor-intensive spectrophotometry methods. However, with the introduction of robust and stable ISFET pH sensors for use in ocean applications, a paradigm shift in the methods used to acquire long-term pH time series data has occurred. Sea-Bird Scientific played a critical role in the adoption of this new technology by commercializing the SeaFET pH sensor and float pH sensor developed by the MBARI chemical sensor group. Sea-Bird Scientific continues to advance this technology through a concerted effort to improve pH sensor accuracy and reliability by characterizing their performance in the laboratory and field. This presentation will focus on calibration of the ISFET pH sensor, evaluate its analytical performance, and validate performance using recent field data.

  12. Continuous relaxation and retardation spectrum method for viscoelastic characterization of asphalt concrete

    NASA Astrophysics Data System (ADS)

    Bhattacharjee, Sudip; Swamy, Aravind Krishna; Daniel, Jo S.

    2012-08-01

    This paper presents a simple and practical approach to obtain the continuous relaxation and retardation spectra of asphalt concrete directly from the complex (dynamic) modulus test data. The spectra thus obtained are continuous functions of relaxation and retardation time. The major advantage of this method is that the continuous form is directly obtained from the master curves, which are readily available from the standard characterization tests of linearly viscoelastic behavior of asphalt concrete. The continuous spectrum method offers an efficient alternative to the numerical computation of discrete spectra and can be easily used for modeling viscoelastic behavior. In this research, asphalt concrete specimens have been tested for linearly viscoelastic characterization. The linearly viscoelastic test data have been used to develop storage modulus and storage compliance master curves. The continuous spectra are obtained from the fitted sigmoid function of the master curves via the inverse integral transform. The continuous spectra are shown to be the limiting case of the discrete distributions. The continuous spectra and the time-domain viscoelastic functions (relaxation modulus and creep compliance) computed from the spectra matched very well with the approximate solutions. It is observed that the shape of the spectra is dependent on the master curve parameters. The continuous spectra thus obtained can easily be implemented in the material mix design process. Prony-series coefficients can be easily obtained from the continuous spectra and used in numerical analysis such as finite element analysis.
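
    A minimal sketch of the discrete-spectrum route mentioned at the end (Prony-series coefficients obtained by non-negative least squares on a fixed grid of relaxation times), with synthetic relaxation-modulus data; it does not implement the continuous-spectrum inversion of the paper.

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic relaxation modulus data E(t) = E_inf + sum_i E_i * exp(-t / tau_i)
t = np.logspace(-3, 4, 60)                     # time (s)
E_true = 50.0 + 800.0 * np.exp(-t / 0.05) + 300.0 * np.exp(-t / 20.0)
E_data = E_true * (1 + 0.01 * np.random.default_rng(6).standard_normal(t.size))

# Fixed grid of relaxation times (one per decade) plus the equilibrium term
taus = np.logspace(-3, 3, 7)
A = np.column_stack([np.ones_like(t)] + [np.exp(-t / tau) for tau in taus])

# Non-negative least squares keeps the Prony coefficients physically admissible
coef, resid = nnls(A, E_data)
E_inf, E_i = coef[0], coef[1:]

print("equilibrium modulus:", round(E_inf, 1))
for tau, Ei in zip(taus, E_i):
    if Ei > 1e-6:
        print(f"  tau = {tau:g} s   E_i = {Ei:.1f}")
```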

  13. Earthquake forecasting studies using radon time series data in Taiwan

    NASA Astrophysics Data System (ADS)

    Walia, Vivek; Kumar, Arvind; Fu, Ching-Chou; Lin, Shih-Jung; Chou, Kuang-Wu; Wen, Kuo-Liang; Chen, Cheng-Hong

    2017-04-01

    For a few decades, a growing number of studies have shown the usefulness of seismogeochemical data as precursory signals for impending earthquakes, and radon is identified as one of the most reliable geochemical precursors. Radon is recognized as a short-term precursor and is being monitored in many countries. This study is aimed at developing an effective earthquake forecasting system by inspecting long-term radon time series data. The data are obtained from a network of radon monitoring stations established along different faults of Taiwan. The continuous time series radon data for earthquake studies have been recorded and some significant variations associated with strong earthquakes have been observed. The data are also examined to evaluate earthquake precursory signals against environmental factors. An automated real-time database operating system has been developed recently to improve the data processing for earthquake precursory studies. In addition, the study is aimed at the appraisal and filtering of these environmental parameters, in order to create a real-time database that helps our earthquake precursory study. In recent years, an automatically operating real-time database has been developed using R, an open source programming language, to carry out statistical computation on the data. To integrate our data with our working procedure, we use the popular open source web application stack AMP (Apache, MySQL, and PHP) to create a website that effectively displays and helps us manage the real-time database.

  14. Optimal Reorganization of NASA Earth Science Data for Enhanced Accessibility and Usability for the Hydrology Community

    NASA Technical Reports Server (NTRS)

    Teng, William; Rui, Hualan; Strub, Richard; Vollmer, Bruce

    2016-01-01

    A long-standing "Digital Divide" in data representation exists between the preferred way of data access by the hydrology community and the common way of data archival by earth science data centers. Typically, in hydrology, earth surface features are expressed as discrete spatial objects (e.g., watersheds), and time-varying data are contained in associated time series. Data in earth science archives, although stored as discrete values (of satellite swath pixels or geographical grids), represent continuous spatial fields, one file per time step. This Divide has been an obstacle, specifically, between the Consortium of Universities for the Advancement of Hydrologic Science, Inc. and NASA earth science data systems. In essence, the way data are archived is conceptually orthogonal to the desired method of access. Our recent work has shown an optimal method of bridging the Divide, by enabling operational access to long-time series (e.g., 36 years of hourly data) of selected NASA datasets. These time series, which we have termed "data rods," are pre-generated or generated on-the-fly. This optimal solution was arrived at after extensive investigations of various approaches, including one based on "data curtains." The on-the-fly generation of data rods uses "data cubes," NASA Giovanni, and parallel processing. The optimal reorganization of NASA earth science data has significantly enhanced the access to and use of the data for the hydrology user community.

  15. Automated classification of Permanent Scatterers time-series based on statistical characterization tests

    NASA Astrophysics Data System (ADS)

    Berti, Matteo; Corsini, Alessandro; Franceschini, Silvia; Iannacone, Jean Pascal

    2013-04-01

    The application of space-borne synthetic aperture radar interferometry has progressed, over the last two decades, from the pioneering use of single interferograms for analyzing changes on the earth's surface to the development of advanced multi-interferogram techniques to analyze any sort of natural phenomenon which involves movements of the ground. The success of multi-interferogram techniques in the analysis of natural hazards such as landslides and subsidence is widely documented in the scientific literature and demonstrated by the consensus among end-users. Despite the great potential of this technique, radar interpretation of slope movements is generally based on the sole analysis of average displacement velocities, while the information embraced in multi-interferogram time series is often overlooked if not completely neglected. The underuse of PS time series is probably due to the detrimental effect of residual atmospheric errors, which leave the PS time series with erratic, irregular fluctuations that are often difficult to interpret, and also to the difficulty of performing a visual, supervised analysis of the time series for a large dataset. In this work we present a procedure for automatic classification of PS time series based on a series of statistical characterization tests. The procedure classifies the time series into six distinctive target trends (0=uncorrelated; 1=linear; 2=quadratic; 3=bilinear; 4=discontinuous without constant velocity; 5=discontinuous with change in velocity) and retrieves for each trend a series of descriptive parameters which can be efficiently used to characterize the temporal changes of ground motion. The classification algorithms were developed and tested using an ENVISAT dataset available in the frame of the EPRS-E project (Extraordinary Plan of Environmental Remote Sensing) of the Italian Ministry of Environment (track "Modena", Northern Apennines). This dataset was generated using standard processing, so the time series are typically affected by a low signal-to-noise ratio. The results of the analysis show that even with such a rough-quality dataset, our automated classification procedure can greatly improve radar interpretation of mass movements. In general, uncorrelated PS (type 0) are concentrated in flat areas such as fluvial terraces and valley bottoms, and along stable watershed divides; linear PS (type 1) are mainly located on slopes (both inside and outside mapped landslides) or near the edge of scarps or steep slopes; non-linear PS (types 2 to 5) typically fall inside landslide deposits or in the surrounding areas. The spatial distribution of classified PS makes it possible to detect deformation phenomena that are not visible when considering the average velocity alone, and provides important information on the temporal evolution of the phenomena, such as acceleration, deceleration, seasonal fluctuations, and abrupt or continuous changes of the displacement rate. Based on these encouraging results we integrated all the classification algorithms into a Graphical User Interface (called PSTime), which is freely available as a standalone application.
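
    A toy version of the classification idea, reduced to three of the six target trends (constant, linear, quadratic) selected by an information criterion; the full PSTime procedure uses additional trend types and statistical tests, so this is only illustrative.

```python
import numpy as np

def classify_trend(t, y):
    """Pick the best of three nested models by the Bayesian information
    criterion: 0 = uncorrelated (constant), 1 = linear, 2 = quadratic."""
    designs = {
        0: np.column_stack([np.ones_like(t)]),
        1: np.column_stack([np.ones_like(t), t]),
        2: np.column_stack([np.ones_like(t), t, t**2]),
    }
    best, best_bic = None, np.inf
    n = t.size
    for label, G in designs.items():
        coef, *_ = np.linalg.lstsq(G, y, rcond=None)
        rss = np.sum((y - G @ coef) ** 2)
        bic = n * np.log(rss / n) + G.shape[1] * np.log(n)
        if bic < best_bic:
            best, best_bic = label, bic
    return best

rng = np.random.default_rng(7)
t = np.linspace(0, 5, 120)                              # 5 years, ~biweekly
flat  = rng.normal(0, 2, t.size)                        # noise only
creep = -8.0 * t + rng.normal(0, 2, t.size)             # steady motion
accel = -1.0 * t - 2.0 * t**2 + rng.normal(0, 2, t.size)
print([classify_trend(t, y) for y in (flat, creep, accel)])   # expect [0, 1, 2]
```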

  16. Measurements of liquid phase residence time distributions in a pilot-scale continuous leaching reactor using radiotracer technique.

    PubMed

    Pant, H J; Sharma, V K; Shenoy, K T; Sreenivas, T

    2015-03-01

    An alkaline-based continuous leaching process is commonly used for extraction of uranium from uranium ore. The reactor in which the leaching process is carried out is called a continuous leaching reactor (CLR) and is expected to behave as a continuously stirred tank reactor (CSTR) for the liquid phase. A pilot-scale CLR used in a Technology Demonstration Pilot Plant (TDPP) was designed, installed and operated, and thus needed to be tested for its hydrodynamic behavior. A radiotracer investigation was carried out in the CLR for measurement of the residence time distribution (RTD) of the liquid phase, with specific objectives to characterize the flow behavior of the reactor and validate its design. Bromine-82 as ammonium bromide was used as a radiotracer and about 40-60 MBq of activity was used in each run. The measured RTD curves were treated and mean residence times were determined and simulated using a tanks-in-series model. The results of the simulation indicated no flow abnormality and the reactor behaved as an ideal CSTR for the range of operating conditions used in the investigation. Copyright © 2014 Elsevier Ltd. All rights reserved.
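
    A minimal sketch of fitting the tanks-in-series model, E(θ) = N(Nθ)^(N−1) e^(−Nθ) / Γ(N), to a measured RTD to recover the effective number of tanks N (N close to 1 indicates ideal CSTR behavior); the tracer data here are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import gamma

def tanks_in_series(theta, N):
    """Normalized RTD for N equal CSTRs in series, theta = t / tau_mean."""
    return N * (N * theta) ** (N - 1) * np.exp(-N * theta) / gamma(N)

# Synthetic tracer response from a nearly ideal single CSTR (N close to 1)
rng = np.random.default_rng(8)
theta = np.linspace(0.05, 4, 80)
E_meas = tanks_in_series(theta, 1.1) + rng.normal(0, 0.01, theta.size)

N_fit, _ = curve_fit(tanks_in_series, theta, E_meas, p0=[2.0])
print(f"fitted number of tanks N = {N_fit[0]:.2f}  (N ~ 1 -> ideal CSTR)")
```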

  17. Individualistic and Time-Varying Tree-Ring Growth to Climate Sensitivity

    PubMed Central

    Carrer, Marco

    2011-01-01

    The development of dendrochronological time series in order to analyze climate-growth relationships usually involves first a rigorous selection of trees and then the computation of the mean tree-growth measurement series. This study suggests a change in the perspective, passing from an analysis of climate-growth relationships that typically focuses on the mean response of a species to investigating the whole range of individual responses among sample trees. Results highlight that this new approach, tested on a larch and stone pine tree-ring dataset, outperforms, in terms of information obtained, the classical one, with significant improvements regarding the strength, distribution and time-variability of the individual tree-ring growth response to climate. Moreover, a significant change over time of the tree sensitivity to climatic variability has been detected. Accordingly, the best-responder trees at any one time may not always have been the best-responders and may not continue to be so. With minor adjustments to current dendroecological protocol and adopting an individualistic approach, we can improve the quality and reliability of the ecological inferences derived from the climate-growth relationships. PMID:21829523

  18. [Epidemiologic surveillance of contact allergens. The "monitoring series" of the IVDK (Information Network of Dermatologic Clinics) for Detection and Scientific Evaluation of Contact Allergy].

    PubMed

    Aberer, W; Komericki, P; Uter, W; Hausen, B M; Lessmann, H; Kränke, B; Geier, J; Schnuch, A

    2003-08-01

    The selection of the most important contact allergens is subject to continuous change. Several factors may influence the sensitization rates and thus the decision as to which substances to include in the standard series of the most frequent allergens. The Information Network of Departments of Dermatology adds substances of interest for a certain time period to the standard series in order to evaluate parameters such as sensitization rate, grade of reaction, and clinical relevance of positive reactions. In 6 testing periods starting in 1996, 13 test substances were evaluated. Based on the results, propolis, compositae mix, and bufexamac were included in the standard series in 1999, while Lyral was added in 2002. Sorbitan sesquioleate, disperse blue mix, and iodopropynyl butylcarbamate are under further discussion. Substances such as glutaraldehyde and p-aminoazobenzene should be tested in certain risk groups only, whereas the steroids budesonide and tixocortol should be tested when clinically suspected.

  19. Processing arctic eddy-flux data using a simple carbon-exchange model embedded in the ensemble Kalman filter.

    PubMed

    Rastetter, Edward B; Williams, Mathew; Griffin, Kevin L; Kwiatkowski, Bonnie L; Tomasky, Gabrielle; Potosnak, Mark J; Stoy, Paul C; Shaver, Gaius R; Stieglitz, Marc; Hobbie, John E; Kling, George W

    2010-07-01

    Continuous time-series estimates of net ecosystem carbon exchange (NEE) are routinely made using eddy covariance techniques. Identifying and compensating for errors in the NEE time series can be automated using a signal processing filter like the ensemble Kalman filter (EnKF). The EnKF compares each measurement in the time series to a model prediction and updates the NEE estimate by weighting the measurement and model prediction relative to a specified measurement error estimate and an estimate of the model-prediction error that is continuously updated based on model predictions of earlier measurements in the time series. Because of the covariance among model variables, the EnKF can also update estimates of variables for which there is no direct measurement. The resulting estimates evolve through time, enabling the EnKF to be used to estimate dynamic variables like changes in leaf phenology. The evolving estimates can also serve as a means to test the embedded model and reconcile persistent deviations between observations and model predictions. We embedded a simple arctic NEE model into the EnKF and filtered data from an eddy covariance tower located in tussock tundra on the northern foothills of the Brooks Range in northern Alaska, USA. The model predicts NEE based only on leaf area, irradiance, and temperature and has been well corroborated for all the major vegetation types in the Low Arctic using chamber-based data. This is the first application of the model to eddy covariance data. We modified the EnKF by adding an adaptive noise estimator that provides a feedback between persistent model-data deviations and the noise added to the ensemble of Monte Carlo simulations in the EnKF. We also ran the EnKF both with a specified leaf-area trajectory and with the EnKF sequentially recalibrating leaf-area estimates to compensate for persistent model-data deviations. When used together, adaptive noise estimation and sequential recalibration substantially improved filter performance, but neither improved performance when used individually. The EnKF estimates of leaf area followed the expected springtime canopy phenology. However, there were also diel fluctuations in the leaf-area estimates; these are a clear indication of a model deficiency possibly related to vapor pressure effects on canopy conductance.
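
    As an illustration of the generic filtering step described above, the following is a minimal sketch of a single stochastic EnKF analysis (perturbed-observation) update. It is not the authors' arctic NEE model, adaptive noise estimator, or sequential recalibration scheme; all names and shapes are assumptions for a scalar NEE observation.

    ```python
    import numpy as np

    def enkf_analysis(ensemble, obs, H, obs_err_var, rng=None):
        """One stochastic EnKF analysis step for a single scalar observation.

        ensemble    : (n_state, n_members) array of model-state realisations.
        obs         : observed value (e.g. one NEE measurement).
        H           : (1, n_state) observation operator mapping state to observation space.
        obs_err_var : measurement-error variance used to weight data against the model.
        """
        rng = np.random.default_rng() if rng is None else rng
        n_state, n_members = ensemble.shape
        x_mean = ensemble.mean(axis=1, keepdims=True)
        X = ensemble - x_mean                               # state anomalies
        HX = H @ ensemble
        HX_anom = HX - HX.mean(axis=1, keepdims=True)
        P_HT = X @ HX_anom.T / (n_members - 1)              # state-observation cross covariance
        S = HX_anom @ HX_anom.T / (n_members - 1) + obs_err_var
        K = P_HT / S                                        # Kalman gain (scalar observation)
        perturbed_obs = obs + rng.normal(0.0, np.sqrt(obs_err_var), size=(1, n_members))
        return ensemble + K @ (perturbed_obs - HX)          # updated (analysis) ensemble
    ```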

  20. Smoothing data series by means of cubic splines: quality of approximation and introduction of a repeating spline approach

    NASA Astrophysics Data System (ADS)

    Wüst, Sabine; Wendt, Verena; Linz, Ricarda; Bittner, Michael

    2017-09-01

    Cubic splines with equidistant spline sampling points are a common method in atmospheric science, used for the approximation of background conditions by means of filtering superimposed fluctuations from a data series. What is defined as background or superimposed fluctuation depends on the specific research question. The latter also determines whether the spline or the residuals - the subtraction of the spline from the original time series - are further analysed. Based on test data sets, we show that the quality of approximation of the background state does not increase continuously with an increasing number of spline sampling points and/or decreasing distance between two spline sampling points. Splines can generate considerable artificial oscillations in the background and the residuals. We introduce a repeating spline approach which is able to significantly reduce this phenomenon. We apply it not only to the test data but also to TIMED-SABER temperature data and choose the distance between two spline sampling points in a way that is sensitive to a large spectrum of gravity waves.
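
    A minimal sketch of the baseline technique discussed above, a least-squares cubic spline with equidistant interior sampling points, with the residuals taken as the superimposed fluctuations, is shown below. It does not implement the repeating-spline approach introduced in the paper; the knot count and the synthetic data are illustrative only.

    ```python
    import numpy as np
    from scipy.interpolate import LSQUnivariateSpline

    def equidistant_spline(t, y, n_knots):
        """Least-squares cubic spline with equidistant interior knots.

        The spline approximates the background; y - spline(t) gives the residuals
        (the superimposed fluctuations).
        """
        interior = np.linspace(t[0], t[-1], n_knots + 2)[1:-1]   # exclude the end points
        return LSQUnivariateSpline(t, y, interior, k=3)

    # Synthetic demonstration: a smooth background plus a superimposed wave
    t = np.linspace(0.0, 10.0, 500)
    y = np.sin(0.3 * t) + 0.2 * np.sin(5.0 * t)
    spline = equidistant_spline(t, y, n_knots=8)
    background, residuals = spline(t), y - spline(t)
    ```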

  1. Quantifying ecosystem carbon losses and gains following development in New England: A combined field, modeling, and remote sensing approach

    NASA Astrophysics Data System (ADS)

    Raciti, S. M.; Hutyra, L.; Briber, B. M.; Dunn, A. L.; Friedl, M. A.; Woodcock, C.; Zhu, Z.; Olofsson, P.

    2013-12-01

    If current trends continue, the world's urban population may double and urban land area may quadruple over the next 50 years. Despite the rapid expansion of urban areas, the trajectories of carbon losses and gains following development remain poorly quantified. We are using a combination of field measurements, modeling, and remote sensing to advance our ability to measure and monitor trajectories of ecosystem carbon over space and time. To characterize how carbon stocks change across urban-to-rural gradients, we previously established field plots to survey live and dead tree biomass, tree canopy, soil and foliar carbon and nitrogen concentrations, and a range of landscape characteristics (Raciti et al. 2012). In 2013, we extended our field sampling to focus specifically on places that experienced land use and land cover change over the past 35 years. This chronosequence approach was informed by Landsat time series (1982-present) and property records (before 1982). The Landsat time series approach differs from traditional remote-sensing-based land use change detection methods because it leverages the entire Landsat archive of imagery using a Fourier fitting approach (Zhu et al. 2012). The result is a temporally and spatially continuous map of land use and land cover change across the study region. We used these field and remote sensing data to inform a carbon bookkeeping model that estimates changes in past and potential future carbon stocks over time. Here we present preliminary results of this work for eastern Massachusetts.

  2. Robust Flood Monitoring Using Sentinel-1 SAR Time Series

    NASA Astrophysics Data System (ADS)

    DeVries, B.; Huang, C.; Armston, J.; Huang, W.

    2017-12-01

    The 2017 hurricane season in North and Central America has resulted in unprecedented levels of flooding that have affected millions of people and continue to impact communities across the region. The extent of casualties and damage to property incurred by these floods underscores the need for reliable systems to track flood location, timing and duration to aid response and recovery efforts. While a diverse range of data sources provide vital information on flood status in near real-time, only spaceborne Synthetic Aperture Radar (SAR) sensors can ensure wall-to-wall coverage over large areas, mostly independently of weather conditions or site accessibility. The European Space Agency's Sentinel-1 constellation represents the only SAR mission currently providing open access and systematic global coverage, allowing for a consistent stream of observations over flood-prone regions. Importantly, both the data and pre-processing software are freely available, enabling the development of improved methods, tools and data products to monitor floods in near real-time. We tracked flood onset and progression in Southeastern Texas, Southern Florida, and Puerto Rico using a novel approach based on temporal backscatter anomalies derived from time series of Sentinel-1 observations and historic baselines defined for each of the three sites. This approach was shown to provide a more objective measure of flood occurrence than the simple backscatter thresholds often employed in operational flood monitoring systems. Additionally, the use of temporal anomaly measures allowed us to partially overcome biases introduced by varying sensor view angles and image acquisition modes, allowing increased temporal resolution in areas where additional targeted observations are available. Our results demonstrate the distinct advantages offered by data from operational SAR missions such as Sentinel-1 and NASA's planned NISAR mission, and call attention to the continuing need for SAR Earth Observation missions that provide systematic repeat observations to facilitate continuous monitoring of flood-affected regions.
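
    The temporal backscatter anomaly idea can be sketched as a per-pixel standardised anomaly against a historic baseline. This is only a hedged illustration of the general concept, not the authors' algorithm; the threshold and variable names are assumptions.

    ```python
    import numpy as np

    def backscatter_anomaly(baseline_db, new_db):
        """Per-pixel standardised anomaly of a new SAR acquisition against a baseline.

        baseline_db : (n_dates, h, w) historic backscatter stack in dB.
        new_db      : (h, w) newly acquired backscatter image in dB.
        Strongly negative anomalies over land are candidate open-water/flood pixels,
        since smooth water surfaces usually darken the backscatter.
        """
        mu = np.nanmean(baseline_db, axis=0)
        sigma = np.nanstd(baseline_db, axis=0)
        return (new_db - mu) / np.where(sigma > 0, sigma, np.nan)

    # Illustrative thresholding (the threshold value is a placeholder, not from the study):
    # flood_mask = backscatter_anomaly(baseline, new_image) < -3.0
    ```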

  3. 31 CFR 359.4 - In what form are Series I savings bonds issued?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (Continued) FISCAL SERVICE, DEPARTMENT OF THE TREASURY BUREAU OF THE PUBLIC DEBT OFFERING OF UNITED STATES SAVINGS BONDS, SERIES I General Information § 359.4 In what form are Series I savings bonds issued? Series...

  4. Increasing the quality, comparability and accessibility of phytoplankton species composition time-series data

    NASA Astrophysics Data System (ADS)

    Zingone, Adriana; Harrison, Paul J.; Kraberg, Alexandra; Lehtinen, Sirpa; McQuatters-Gollop, Abigail; O'Brien, Todd; Sun, Jun; Jakobsen, Hans H.

    2015-09-01

    Phytoplankton diversity and its variation over an extended time scale can provide answers to a wide range of questions relevant to societal needs. These include human health, the safe and sustained use of marine resources and the ecological status of the marine environment, including long-term changes under the impact of multiple stressors. The analysis of phytoplankton data collected at the same place over time, as well as the comparison among different sampling sites, provide key information for assessing environmental change, and evaluating new actions that must be made to reduce human induced pressures on the environment. To achieve these aims, phytoplankton data may be used several decades later by users that have not participated in their production, including automatic data retrieval and analysis. The methods used in phytoplankton species analysis vary widely among research and monitoring groups, while quality control procedures have not been implemented in most cases. Here we highlight some of the main differences in the sampling and analytical procedures applied to phytoplankton analysis and identify critical steps that are required to improve the quality and inter-comparability of data obtained at different sites and/or times. Harmonization of methods may not be a realistic goal, considering the wide range of purposes of phytoplankton time-series data collection. However, we propose that more consistent and detailed metadata and complementary information be recorded and made available along with phytoplankton time-series datasets, including description of the procedures and elements allowing for a quality control of the data. To keep up with the progress in taxonomic research, there is a need for continued training of taxonomists, and for supporting and complementing existing web resources, in order to allow a constant upgrade of knowledge in phytoplankton classification and identification. Efforts towards the improvement of metadata recording, data annotation and quality control procedures will ensure the internal consistency of phytoplankton time series and facilitate their comparability and accessibility, thus strongly increasing the value of the precious information they provide. Ultimately, the sharing of quality controlled data will allow one to recoup the high cost of obtaining the data through the multiple use of the time-series data in various projects over many decades.

  5. Nonequilibrium transitions driven by external dichotomous noise

    NASA Astrophysics Data System (ADS)

    Behn, U.; Schiele, K.; Teubel, A.; Kühnel, A.

    1987-06-01

    The stationary probability density P_s for a class of nonlinear one-dimensional models driven by a dichotomous Markovian process (DMP) I_t can be calculated explicitly. For the specific case of the Stratonovich model, ẋ = a·x − x³ + I_t·x, the qualitative shape of P_s and its support is discussed in the whole parameter region. The location of the maxima of P_s shows a behavior similar to order parameters in continuous phase transitions. The possibility of a noise-induced change from a continuous to a discontinuous transition in an extended model, in which the DMP also couples to the cubic term, is discussed. The time-dependent moments can be represented as an infinite series of terms, which are determined by a recursion formula. For negative even moments the series terminates and the long-time behavior can be obtained analytically. As a function of the physical parameters, qualitative changes of this behavior may occur which can be partially related to the behavior of P_s. All results reproduce those for Gaussian white noise in the corresponding limit. The influence of the finite correlation time and the discreteness of the space of states of the DMP are discussed. An extensive list of references is contained in U. Behn, K. Schiele, and A. Teubel, Wiss. Z. Karl-Marx-Univ. Leipzig, Mathem.-Naturwiss. R. 34:602 (1985).

  6. Development of synchronized, autonomous, and self-regulated oscillations in transpiration rate of a whole tomato plant under water stress.

    PubMed

    Wallach, Rony; Da-Costa, Noam; Raviv, Michael; Moshelion, Menachem

    2010-07-01

    Plants respond to many environmental changes by rapidly adjusting their hydraulic conductivity and transpiration rate, thereby optimizing water-use efficiency and preventing damage due to low water potential. A multiple-load-cell apparatus, time-series analysis of the measured data, and residual low-pass filtering methods were used to monitor continuously and analyse transpiration of potted tomato plants (Solanum lycopersicum cv. Ailsa Craig) grown in a temperature-controlled greenhouse during well-irrigated and drought periods. A time derivative of the filtered residual time series yielded oscillatory behaviour of the whole plant's transpiration (WPT) rate. A subsequent cross-correlation analysis between the WPT oscillatory pattern and wet-wick evaporation rates (vertical cotton fabric, 0.14 m(2) partly submerged in water in a container placed on an adjacent load cell) revealed that autonomous oscillations in WPT rate develop under a continuous increase in water stress, whereas these oscillations correspond with the fluctuations in evaporation rate when water is fully available. The relative amplitude of these autonomous oscillations increased with water stress as transpiration rate decreased. These results support the recent finding that an increase in xylem tension triggers hydraulic signals that spread instantaneously via the plant vascular system and control leaf conductance. The regulatory role of synchronized oscillations in WPT rate in eliminating critical xylem tension points and preventing embolism is discussed.

  7. Effects of Assuming Independent Component Failure Times, If They Are Actually Dependent, in a Series System.

    DTIC Science & Technology

    1985-11-26

    Major decisions involving reliability studies, based on competing risk methodology, have been made in the past and will continue to be made... censoring mechanism. In such instances, the methodology for estimating relevant reliability probabilities has received considerable attention (cf. David... proposal for a discussion of the general methodology).

  8. Mass Communication: Abstracts of Doctoral Dissertations Published in "Dissertation Abstracts International," January through June 1980 (Vol. 40 Nos. 7 through 12).

    ERIC Educational Resources Information Center

    ERIC Clearinghouse on Reading and Communication Skills, Urbana, IL.

    This collection of abstracts is part of a continuing series providing information on recent doctoral dissertations. The 55 titles deal with a variety of topics, including the following: (1) the prime time access rule; (2) media education; (3) magazine and children's advertising; (4) Irish national and Third World cinema; (5) international radio…

  9. Seeing the Solar System through Two Perspectives. Part 1 of a Series Focusing on Learning Progressions

    ERIC Educational Resources Information Center

    Thornburgh, Bill R.; Tretter, Tom R.; Duckwall, Mark

    2015-01-01

    Space has fascinated and intrigued humans of all ages since time immemorial, and continues to do so today. Natural curiosity is engaged when we look up into the sky, notice patterns among celestial objects such as the Sun, Moon, and stars, and wonder. Scientific understanding of those patterns has progressed immensely over the span of human…

  10. The Missing Treatment Design Element: Continuity of Treatment When Multiple Postobservations Are Used in Time-Series and Repeated Measures Study Designs

    ERIC Educational Resources Information Center

    Barnette, J. Jackson; Wallis, Anne Baber

    2005-01-01

    We rely a great deal on the schematic descriptions that represent experimental and quasi-experimental design arrangements, as well as the discussions of threats to validity associated with these, provided by Campbell and his associates: Stanley, Cook, and Shadish. Some of these designs include descriptions of treatments removed, removed and then…

  11. Essential Study Skills: The Complete Guide to Success at University. Second Edition. Sage Study Skills Series

    ERIC Educational Resources Information Center

    Burns, Tom; Sinfield, Sandra

    2008-01-01

    The eagerly-awaited new edition of the successful "Essential Study Skills" continues to provide a truly practical guide to achieving success at university. Whether you are going to university straight from school, a mature student, or an overseas student studying in the UK for the first time, this is the book that will help you to better…

  12. Detecting mortality induced structural and functional changes in a pinon-juniper woodland using Landsat and RapidEye time series

    Treesearch

    Dan J. Krofcheck; Jan U. H. Eitel; Lee A. Vierling; Urs Schulthess; Timothy M. Hilton; Eva Dettweiler-Robinson; Rosemary Pendleton; Marcy E. Litvak

    2014-01-01

    Pinon-juniper (PJ) woodlands have recently undergone dramatic drought-induced mortality, triggering broad scale structural changes in this extensive Southwestern US biome. Given that climate projections for the region suggest widespread conifer mortality is likely to continue into the next century, it is critical to better understand how this climate-induced change in...

  13. The 1983 tail-era series. Volume 1: ISEE 3 plasma

    NASA Technical Reports Server (NTRS)

    Fairfield, D. H.; Phillips, J. L.

    1991-01-01

    Observations from the ISEE 3 electron analyzer are presented in plots. Electrons were measured in 15 continuous energy levels between 8.5 and 1140 eV during individual 3-sec spacecraft spins. Times associated with each data point are the beginning time of the 3 sec data collection interval. Moments calculated from the measured distribution function are shown as density, temperature, velocity, and velocity azimuthal angle. Spacecraft ephemeris is shown at the bottom in GSE and GSM coordinates in units of Earth radii, with vertical ticks on the time axis corresponding to the printed positions.

  14. Perceptual multistability in figure-ground segregation using motion stimuli.

    PubMed

    Gori, Simone; Giora, Enrico; Pedersini, Riccardo

    2008-11-01

    In a series of experiments using ambiguous stimuli, we investigate the effects of displaying ordered, discrete series of images on the dynamics of figure-ground segregation. For low frame presentation speeds, the series were perceived as a sequence of discontinuous, static images, while for high speeds they were perceived as continuous. We conclude that using stimuli varying continuously along one parameter results in stronger hysteresis and reduces spontaneous switching compared to matched static stimuli with discontinuous parameter changes. The additional evidence that the size of the hysteresis effects depended on trial duration is consistent with the stochastic nature of the dynamics governing figure-ground segregation. The results showed that for continuously changing stimuli, alternative figure-ground organizations are resolved via low-level, dynamical competition. A second series of experiments confirmed these results with an ambiguous stimulus based on Petter's effect.

  15. Assessing spatial coupling in complex population dynamics using mutual prediction and continuity statistics

    USGS Publications Warehouse

    Nichols, J.M.; Moniz, L.; Nichols, J.D.; Pecora, L.M.; Cooch, E.

    2005-01-01

    A number of important questions in ecology involve the possibility of interactions or "coupling" among potential components of ecological systems. The basic question of whether two components are coupled (exhibit dynamical interdependence) is relevant to investigations of movement of animals over space, population regulation, food webs and trophic interactions, and is also useful in the design of monitoring programs. For example, in spatially extended systems, coupling among populations in different locations implies the existence of redundant information in the system and the possibility of exploiting this redundancy in the development of spatial sampling designs. One approach to the identification of coupling involves study of the purported mechanisms linking system components. Another approach is based on time series of two potential components of the same system and, in previous ecological work, has relied on linear cross-correlation analysis. Here we present two different attractor-based approaches, continuity and mutual prediction, for determining the degree to which two population time series (e.g., at different spatial locations) are coupled. Both approaches are demonstrated on a one-dimensional predator-prey model system exhibiting complex dynamics. Of particular interest is the spatial asymmetry introduced into the model as linearly declining resource for the prey over the domain of the spatial coordinate. Results from these approaches are then compared to the more standard cross-correlation analysis. In contrast to cross-correlation, both continuity and mutual prediction are clearly able to discern the asymmetry in the flow of information through this system.

  16. Reducing waiting time and raising outpatient satisfaction in a Chinese public tertiary general hospital-an interrupted time series study.

    PubMed

    Sun, Jing; Lin, Qian; Zhao, Pengyu; Zhang, Qiongyao; Xu, Kai; Chen, Huiying; Hu, Cecile Jia; Stuntz, Mark; Li, Hong; Liu, Yuanli

    2017-08-22

    It is globally agreed that a well-designed health system delivers timely and convenient access to health services for all patients. Many interventions aiming to reduce waiting times have been implemented in Chinese public tertiary hospitals to improve patients' satisfaction. However, few were well documented, and the effects were rarely measured with robust methods. We conducted a longitudinal study of the length of waiting times in a public tertiary hospital in Southern China which developed comprehensive data collection systems. On average, around 60,000 outpatients and 70,000 outpatients with prescriptions per month were targeted for the study during October 2014-February 2017. We analyzed longitudinal time series data using a segmented linear regression model to assess changes in levels and trends of waiting times before and after the introduction of waiting time reduction interventions. Pearson correlation analysis was conducted to indicate the strength of association between waiting times and patient satisfaction. The statistical significance level was set at 0.05. The monthly average length of waiting time decreased by 3.49 min (P = 0.003) for consultations and by 8.70 min (P = 0.02) for filling prescriptions in the month when the respective interventions were introduced. For filling prescriptions, the trend shifted from a slight increase at baseline to a significant decrease afterwards (P = 0.003). There was a significant negative correlation between the waiting time for filling prescriptions and outpatient satisfaction with pharmacy services (r = -0.71, P = 0.004). The interventions aimed at reducing waiting time and raising patient satisfaction in Fujian Provincial Hospital are effective. A long-lasting reduction effect on waiting time for filling prescriptions was observed because of carefully designed continuous efforts, rather than a one-time campaign, and appropriate incentives implemented by a taskforce authorized by the hospital managers. This case provides a model of carrying out continuous quality improvement and optimizing management processes with the support of relevant evidence.
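
    The segmented (interrupted time-series) regression described above can be sketched with a standard level-and-trend-change parameterisation. This is a generic illustration, not the study's actual model specification or data; statsmodels is assumed to be available and the synthetic series is purely illustrative.

    ```python
    import numpy as np
    import statsmodels.api as sm

    def segmented_regression(y, intervention_index):
        """Interrupted time-series (segmented) OLS: level and trend change at an intervention.

        y                  : monthly mean waiting times (1-D array).
        intervention_index : index of the first month after the intervention started.
        Returns a fitted statsmodels results object; the coefficients are
        [intercept, baseline trend, immediate level change, change in trend].
        """
        n = len(y)
        time = np.arange(n)
        after = (time >= intervention_index).astype(float)
        time_after = np.where(after == 1, time - intervention_index, 0)
        X = sm.add_constant(np.column_stack([time, after, time_after]))
        return sm.OLS(y, X).fit()

    # Example with synthetic data: a slight rise, then a drop and a downward trend
    rng = np.random.default_rng(1)
    y = np.r_[50 + 0.2 * np.arange(15), 42 - 0.5 * np.arange(14)] + rng.normal(0, 1, 29)
    print(segmented_regression(y, 15).params)
    ```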

  17. Application of automated measurement stations for continuous water quality monitoring of the Dender river in Flanders, Belgium.

    PubMed

    Vandenberghe, V; Goethals, P L M; Van Griensven, A; Meirlaen, J; De Pauw, N; Vanrolleghem, P; Bauwens, W

    2005-09-01

    During the summer of 1999, two automated water quality measurement stations were installed along the Dender river in Belgium. The variables dissolved oxygen, temperature, conductivity, pH, rain intensity, flow and solar radiation were measured continuously. In this paper, these on-line measurement series are presented and interpreted, drawing also on additional measurements and ecological expert knowledge. The purpose was to demonstrate the variability in time and space of the aquatic processes and the consequences of conducting and interpreting discrete measurements for river quality assessment and management. The large fluctuations of the data illustrated the importance of continuous measurements for the complete description and modelling of the biological processes in the river.

  18. Time-Frequency Analyses of Tide-Gauge Sensor Data

    PubMed Central

    Erol, Serdar

    2011-01-01

    The real world phenomena being observed by sensors are generally non-stationary in nature. The classical linear techniques for the analysis and modeling of natural time-series observations are inefficient and should be replaced by non-linear techniques, whose theoretical aspects and performances vary. Accordingly, adopting the most appropriate technique and strategy is essential in evaluating sensors' data. In this study, two different time-series analysis approaches, namely least squares spectral analysis (LSSA) and wavelet analysis (continuous wavelet transform, cross wavelet transform and wavelet coherence algorithms as extensions of wavelet analysis), are applied to sea-level observations recorded by tide-gauge sensors, and the advantages and drawbacks of these methods are reviewed. The analyses were carried out using sea-level observations recorded at the Antalya-II and Erdek tide-gauge stations of the Turkish National Sea-Level Monitoring System. In the analyses, the useful information hidden in the noisy signals was detected, and the common features between the two sea-level time series were clarified. The tide-gauge records have data gaps in time because of issues such as instrumental shortcomings and power outages. Concerning the difficulties of time-frequency analysis of data with voids, the sea-level observations were preprocessed, and the missing parts were predicted using the neural network method prior to the analysis. In conclusion, the merits and limitations of the techniques in evaluating non-stationary observations by means of tide-gauge sensor records were documented, and an analysis strategy for sequential sensor observations was presented. PMID:22163829

  20. Evaluation of Growing Season Milestones, Using Eddy Covariance Time-Series of Net Ecosystem Exchange

    NASA Astrophysics Data System (ADS)

    Pastorello, G.; Faybishenko, B.; Poindexter, C.; Menzer, O.; Agarwal, D.; Papale, D.; Baldocchi, D. D.

    2014-12-01

    Common methods for determining timing of plants' developmental events, such as direct observation and remote sensing of NDVI, usually produce data of temporal resolution on the order of one week or more. This limitation can make observing subtle trends across years difficult. The goal of this presentation is to demonstrate a conceptual approach and a computational technique to quantify seasonal, annual and long-term phenological indices and patterns, based on continuous eddy covariance measurements of net ecosystem exchange (NEE) measured at eddy covariance towers in the AmeriFlux network. Using a comprehensive time series analysis of NEE fluxes in different climatic zones, we determined multiple characteristics (and their confidence intervals) of the growing season including: the initiation day—the day when canopy photosynthesis development starts, the photosynthesis stabilization day - the day when the development process of canopy photosynthesis starts to slow down and gradually moves toward stabilization, and the growing season effective termination day. We also determined the spring photosynthetic development velocity and the fall photosynthetic development velocity. The results of calculations using NEE were compared with those from temperature and precipitation data measured at the same AmeriFlux tower stations, as well as with the in-situ directly observed phenological records. The results of calculations of phenological indices from the NEE time-series collected at AmeriFlux sites can be used to constrain the application of other time- and labor-intensive sensing methods and to reduce the uncertainty in identifying trends in the timing of phenological indices.

  1. Evaluating four gap-filling methods for eddy covariance measurements of evapotranspiration over hilly crop fields

    NASA Astrophysics Data System (ADS)

    Boudhina, Nissaf; Zitouna-Chebbi, Rim; Mekki, Insaf; Jacob, Frédéric; Ben Mechlia, Nétij; Masmoudi, Moncef; Prévot, Laurent

    2018-06-01

    Estimating evapotranspiration in hilly watersheds is paramount for managing water resources, especially in semiarid/subhumid regions. The eddy covariance (EC) technique allows continuous measurements of latent heat flux (LE). However, time series of EC measurements often experience large portions of missing data because of instrumental malfunctions or quality filtering. Existing gap-filling methods are questionable over hilly crop fields because of changes in airflow inclination and subsequent aerodynamic properties. We evaluated the performances of different gap-filling methods before and after tailoring to conditions of hilly crop fields. The tailoring consisted of splitting the LE time series beforehand on the basis of upslope and downslope winds. The experiment was setup within an agricultural hilly watershed in northeastern Tunisia. EC measurements were collected throughout the growth cycle of three wheat crops, two of them located in adjacent fields on opposite hillslopes, and the third one located in a flat field. We considered four gap-filling methods: the REddyProc method, the linear regression between LE and net radiation (Rn), the multi-linear regression of LE against the other energy fluxes, and the use of evaporative fraction (EF). Regardless of the method, the splitting of the LE time series did not impact the gap-filling rate, and it might improve the accuracies on LE retrievals in some cases. Regardless of the method, the obtained accuracies on LE estimates after gap filling were close to instrumental accuracies, and they were comparable to those reported in previous studies over flat and mountainous terrains. Overall, REddyProc was the most appropriate method, for both gap-filling rate and retrieval accuracy. Thus, it seems possible to conduct gap filling for LE time series collected over hilly crop fields, provided the LE time series are split beforehand on the basis of upslope-downslope winds. Future works should address consecutive vegetation growth cycles for a larger panel of conditions in terms of climate, vegetation, and water status.
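
    One of the four methods compared above, gap filling via the evaporative fraction (EF), can be sketched as follows. This is a generic EF gap filler, not the authors' implementation; in their setting the LE series would first be split by upslope and downslope winds, and the averaging window here is an arbitrary illustrative choice.

    ```python
    import numpy as np

    def gap_fill_le_with_ef(le, rn, g, window=48):
        """Fill gaps in latent heat flux (LE) using the evaporative fraction (EF) method.

        le, rn, g : aligned numpy arrays of LE, net radiation and ground heat flux (W m-2),
                    with np.nan marking missing LE values.
        window    : number of surrounding records used to estimate a local mean EF
                    (e.g. 48 half-hours, roughly one day); purely illustrative default.
        """
        le_filled = le.copy()
        available_energy = rn - g
        with np.errstate(invalid="ignore", divide="ignore"):
            ef = le / available_energy
        for i in np.flatnonzero(np.isnan(le)):
            lo, hi = max(0, i - window), min(len(le), i + window)
            local_ef = np.nanmean(ef[lo:hi])
            if np.isfinite(local_ef) and np.isfinite(available_energy[i]):
                le_filled[i] = local_ef * available_energy[i]
        return le_filled
    ```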

  2. Analyses of GIMMS NDVI Time Series in Kogi State, Nigeria

    NASA Astrophysics Data System (ADS)

    Palka, Jessica; Wessollek, Christine; Karrasch, Pierre

    2017-10-01

    The value of remote sensing data is particularly evident where an areal monitoring is needed to provide information on the earth's surface development. The use of temporal high resolution time series data allows for detecting short-term changes. In Kogi State in Nigeria different vegetation types can be found. As the major population in this region is living in rural communities with crop farming the existing vegetation is slowly being altered. The expansion of agricultural land causes loss of natural vegetation, especially in the regions close to the rivers which are suitable for crop production. With regard to these facts, two questions can be dealt with covering different aspects of the development of vegetation in the Kogi state, the determination and evaluation of the general development of the vegetation in the study area (trend estimation) and analyses on a short-term behavior of vegetation conditions, which can provide information about seasonal effects in vegetation development. For this purpose, the GIMMS-NDVI data set, provided by the NOAA, provides information on the normalized difference vegetation index (NDVI) in a geometric resolution of approx. 8 km. The temporal resolution of 15 days allows the already described analyses. For the presented analysis data for the period 1981-2012 (31 years) were used. The implemented workflow mainly applies methods of time series analysis. The results show that in addition to the classical seasonal development, artefacts of different vegetation periods (several NDVI maxima) can be found in the data. The trend component of the time series shows a consistently positive development in the entire study area considering the full investigation period of 31 years. However, the results also show that this development has not been continuous and a simple linear modeling of the NDVI increase is only possible to a limited extent. For this reason, the trend modeling was extended by procedures for detecting structural breaks in the time series.

  3. Progress Report on the Airborne Metadata and Time Series Working Groups of the 2016 ESDSWG

    NASA Astrophysics Data System (ADS)

    Evans, K. D.; Northup, E. A.; Chen, G.; Conover, H.; Ames, D. P.; Teng, W. L.; Olding, S. W.; Krotkov, N. A.

    2016-12-01

    NASA's Earth Science Data Systems Working Groups (ESDSWG) was created over 10 years ago. The role of the ESDSWG is to make recommendations relevant to NASA's Earth science data systems from users' experiences. Each group works independently focusing on a unique topic. Participation in ESDSWG groups comes from a variety of NASA-funded science and technology projects, including MEaSUREs and ROSS. Participants include NASA information technology experts, affiliated contractor staff and other interested community members from academia and industry. Recommendations from the ESDSWG groups will enhance NASA's efforts to develop long term data products. The Airborne Metadata Working Group is evaluating the suitability of the current Common Metadata Repository (CMR) and Unified Metadata Model (UMM) for airborne data sets and to develop new recommendations as necessary. The overarching goal is to enhance the usability, interoperability, discovery and distribution of airborne observational data sets. This will be done by assessing the suitability (gaps) of the current UMM model for airborne data using lessons learned from current and past field campaigns, listening to user needs and community recommendations and assessing the suitability of ISO metadata and other standards to fill the gaps. The Time Series Working Group (TSWG) is a continuation of the 2015 Time Series/WaterML2 Working Group. The TSWG is using a case study-driven approach to test the new Open Geospatial Consortium (OGC) TimeseriesML standard to determine any deficiencies with respect to its ability to fully describe and encode NASA earth observation-derived time series data. To do this, the time series working group is engaging with the OGC TimeseriesML Standards Working Group (SWG) regarding unsatisfied needs and possible solutions. The effort will end with the drafting of an OGC Engineering Report based on the use cases and interactions with the OGC TimeseriesML SWG. Progress towards finalizing recommendations will be presented at the meeting.

  4. Consistent Long-Time Series of GPS Satellite Antenna Phase Center Corrections

    NASA Astrophysics Data System (ADS)

    Steigenberger, P.; Schmid, R.; Rothacher, M.

    2004-12-01

    The current IGS processing strategy disregards satellite antenna phase center variations (pcvs) depending on the nadir angle and applies block-specific phase center offsets only. However, the transition from relative to absolute receiver antenna corrections presently under discussion necessitates the consideration of satellite antenna pcvs. Moreover, studies of several groups have shown that the offsets are not homogeneous within a satellite block. Manufacturer specifications seem to confirm this assumption. In order to get best possible antenna corrections, consistent ten-year time series (1994-2004) of satellite-specific pcvs and offsets were generated. This challenging effort became possible as part of the reprocessing of a global GPS network currently performed by the Technical Universities of Munich and Dresden. The data of about 160 stations since the official start of the IGS in 1994 have been reprocessed, as today's GPS time series are mostly inhomogeneous and inconsistent due to continuous improvements in the processing strategies and modeling of global GPS solutions. An analysis of the signals contained in the time series of the phase center offsets demonstrates amplitudes on the decimeter level, at least one order of magnitude worse than the desired accuracy. The periods partly arise from the GPS orbit configuration, as the orientation of the orbit planes with regard to the inertial system repeats after about 350 days due to the rotation of the ascending nodes. In addition, the rms values of the X- and Y-offsets show a high correlation with the angle between the orbit plane and the direction to the sun. The time series of the pcvs mainly point at the correlation with the global terrestrial scale. Solutions with relative and absolute phase center corrections, with block- and satellite-specific satellite antenna corrections demonstrate the effect of this parameter group on other global GPS parameters such as the terrestrial scale, station velocities, the geocenter position or the tropospheric delays. Thus, deeper insight into the so-called `Bermuda triangle' of several highly correlated parameters is given.

  5. Change Detection Processing Chain Dedicated to Sentinel Data Time Series. Application to Forest and Water Bodies Monitoring

    NASA Astrophysics Data System (ADS)

    Perez Saavedra, L.-M.; Mercier, G.; Yesou, H.; Liege, F.; Pasero, G.

    2016-08-01

    The Copernicus program of ESA and the European Commission (six Sentinel missions, among them Sentinel-1 with a Synthetic Aperture Radar sensor and Sentinel-2 with a 13-band, 10 to 60 m resolution optical sensor) offers a new opportunity for Earth Observation, combining high temporal acquisition capability (12-day repeat cycle, and about 5 days in some geographic areas of the world) with high spatial resolution. These high temporal and spatial resolutions open new challenges in several fields, such as image processing, new algorithms for time series, and big data analysis. In addition, these missions will make it possible to analyse several aspects of the Earth's temporal evolution, such as crop vegetation, water bodies, land use and land cover (LULC), and sea and ice conditions. This is particularly useful for end users and policy makers who need to detect early signs of damage, vegetation illness, flooded areas, etc. From the state of the art, one can find algorithms and methods that use a bi-date comparison for change detection [1-3] or time series analysis. These methods are essentially used for target detection or for abrupt change detection, which require only two observations. A Hölder mean-based change detection technique has been proposed in [2,3] for high resolution radar images. This so-called MIMOSA technique has mainly been dedicated to man-made change detection in urban areas and to the CARABAS-II project, using a pair of SAR images. An extension to a multitemporal change detection technique has been investigated, but its application to land use and land cover changes still has to be validated. The Hölder mean Hp is a pixel-by-pixel time series feature defined by Hp[X] = [(1/n) Σᵢ₌₁ⁿ Xᵢᵖ]^(1/p), with p ∈ ℝ, where X is the time series stack (N images × S bands × t dates) and n is the number of images in the time series (N > 2). Hp[X] is continuous and monotonically increasing in p for −∞ < p < ∞.
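
    The Hölder (power) mean defined above is straightforward to compute per pixel over an image stack. The sketch below assumes a NumPy array of strictly positive values and is illustrative only; the change-indicator example at the end is a generic ratio of two Hölder means, not the MIMOSA algorithm itself.

    ```python
    import numpy as np

    def holder_mean(stack, p):
        """Per-pixel Hölder (power) mean of a time series stack.

        stack : ndarray of shape (n_dates, height, width), strictly positive values
                (e.g. calibrated SAR backscatter intensities).
        p     : real exponent; p=1 gives the arithmetic mean, p=-1 the harmonic mean,
                and p close to 0 approaches the geometric mean.
        """
        stack = np.asarray(stack, dtype=float)
        if np.isclose(p, 0.0):
            # Limit case: geometric mean
            return np.exp(np.mean(np.log(stack), axis=0))
        return np.mean(stack ** p, axis=0) ** (1.0 / p)

    # Example: contrast two Hölder means to highlight temporal change
    # (illustrative only, not the MIMOSA algorithm itself)
    rng = np.random.default_rng(0)
    series = rng.gamma(shape=2.0, scale=1.0, size=(12, 64, 64))   # synthetic stack
    h_low, h_high = holder_mean(series, -2.0), holder_mean(series, 2.0)
    change_indicator = h_high / h_low   # large where the temporal distribution is spread out
    print(change_indicator.mean())
    ```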

  6. Ichthyoplankton Time Series: A Potential Ocean Observing Network to Provide Indicators of Climate Impacts on Fish Communities along the West Coast of North America

    NASA Astrophysics Data System (ADS)

    Koslow, J. A.; Brodeur, R.; Duffy-Anderson, J. T.; Perry, I.; jimenez Rosenberg, S.; Aceves, G.

    2016-02-01

    Ichthyoplankton time series available from the Bering Sea, Gulf of Alaska and California Current (Oregon to Baja California) provide a potential ocean observing network to assess climate impacts on fish communities along the west coast of North America. Larval fish abundance reflects spawning stock biomass, so these data sets provide indicators of the status of a broad range of exploited and unexploited fish populations. Analyses to date have focused on individual time series, which generally exhibit significant change in relation to climate. Off California, a suite of 24 midwater fish taxa have declined > 60%, correlated with declining midwater oxygen concentrations, and overall larval fish abundance has declined 72% since 1969, a trend based on the decline of predominantly cool-water affinity taxa in response to warming ocean temperatures. Off Oregon, there were dramatic differences in community structure and abundance of larval fishes between warm and cool ocean conditions. Midwater deoxygenation and warming sea surface temperature trends are predicted to continue as a result of global climate change. US, Canadian, and Mexican fishery scientists are now collaborating in a virtual ocean observing network to synthesize available ichthyoplankton time series and compare patterns of change in relation to climate. This will provide regional indicators of populations and groups of taxa sensitive to warming, deoxygenation and potentially other stressors, establish the relevant scales of coherence among sub-regions and across Large Marine Ecosystems, and provide the basis for predicting future climate change impacts on these ecosystems.

  7. Burned area detection based on Landsat time series in savannas of southern Burkina Faso

    NASA Astrophysics Data System (ADS)

    Liu, Jinxiu; Heiskanen, Janne; Maeda, Eduardo Eiji; Pellikka, Petri K. E.

    2018-02-01

    West African savannas are subject to regular fires, which have impacts on vegetation structure, biodiversity and carbon balance. An efficient and accurate mapping of burned area associated with seasonal fires can greatly benefit decision making in land management. Since coarse resolution burned area products cannot meet the accuracy needed for fire management and climate modelling at local scales, the medium resolution Landsat data is a promising alternative for local scale studies. In this study, we developed an algorithm for continuous monitoring of annual burned areas using Landsat time series. The algorithm is based on burned pixel detection using harmonic model fitting with Landsat time series and breakpoint identification in the time series data. This approach was tested in a savanna area in southern Burkina Faso using 281 images acquired between October 2000 and April 2016. An overall accuracy of 79.2% was obtained with balanced omission and commission errors. This represents a significant improvement in comparison with MODIS burned area product (67.6%), which had more omission errors than commission errors, indicating underestimation of the total burned area. By observing the spatial distribution of burned areas, we found that the Landsat based method misclassified cropland and cloud shadows as burned areas due to the similar spectral response, and MODIS burned area product omitted small and fragmented burned areas. The proposed algorithm is flexible and robust against decreased data availability caused by clouds and Landsat 7 missing lines, therefore having a high potential for being applied in other landscapes in future studies.
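
    The core of the approach, harmonic model fitting to a Landsat time series followed by flagging of observations that depart strongly from the fitted seasonal model, can be sketched as below. This is not the published algorithm; the number of harmonics, the z-score rule and the threshold are assumptions for illustration.

    ```python
    import numpy as np

    def fit_harmonic_model(doy, values, n_harmonics=2):
        """Least-squares harmonic (Fourier) model of an annual cycle.

        doy    : day-of-year (or fractional year * 365.25) of each clear observation.
        values : corresponding surface reflectance or index values.
        Returns the design-matrix builder and the fitted coefficients.
        """
        def design(d):
            cols = [np.ones_like(d, dtype=float)]
            for k in range(1, n_harmonics + 1):
                w = 2.0 * np.pi * k * d / 365.25
                cols += [np.cos(w), np.sin(w)]
            return np.column_stack(cols)

        coeffs, *_ = np.linalg.lstsq(design(doy), values, rcond=None)
        return design, coeffs

    def flag_breaks(doy, values, design, coeffs, z=3.0):
        """Flag observations departing strongly from the harmonic prediction.

        Consecutive flagged observations are candidate burn-related breakpoints;
        the z-score threshold here is illustrative, not the published value.
        """
        residuals = values - design(doy) @ coeffs
        return np.abs(residuals) > z * np.std(residuals)
    ```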

  8. The Oceanic Flux Program: A three decade time-series of particle flux in the deep Sargasso Sea

    NASA Astrophysics Data System (ADS)

    Weber, J. C.; Conte, M. H.

    2010-12-01

    The Oceanic Flux Program (OFP), 75 km SE of Bermuda, is the longest running time-series of its kind. Initiated in 1978, the OFP has produced an unsurpassed, nearly continuous record of temporal variability in deep ocean fluxes, with a >90% temporal coverage at 3200m depth. The OFP, in conjunction with the co-located Bermuda-Atlantic Time Series (BATS) and the Bermuda Testbed Mooring (BTM) time-series, has provided key observations enabling detailed assessment of how seasonal and non-seasonal variability in the deep ocean is linked with the overlying physical and biogeochemical environment. This talk will focus on the short-term flux variability that overlies the seasonal flux pattern in the Sargasso Sea, emphasizing episodic extreme flux events. Extreme flux events are responsible for much of the year-to-year variability in mean annual flux and are most often observed during early winter and late spring when surface stratification is weak or transient. In addition to biological phenomena (e.g. salp blooms), passage of productive meso-scale features such as eddies, which alter surface water mixing characteristics and surface export fluxes, may initiate some extreme flux events. Yet other productive eddies show a minimal influence on the deep flux, underscoring the importance of upper ocean ecosystem structure and midwater processes on the coupling between the surface ocean environment and deep fluxes. Using key organic and inorganic tracers, causative processes that influence deep flux generation and the strength of the coupling with the surface ocean environment can be identified.

  9. Status and trends of the Lake Huron offshore demersal fish community, 1976-2015

    USGS Publications Warehouse

    Roseman, Edward; Chriscinske, Margret Ann; Castle, Dana Kristina; Prichard, Carson G.

    2016-01-01

    The USGS Great Lakes Science Center has conducted trawl surveys to assess annual changes in the offshore demersal fish community of Lake Huron since 1973. Sample sites include five ports in U.S. waters with less frequent sampling near Goderich, Ontario. The 2015 fall bottom trawl survey was carried out between 14 and 28 October and included all U.S. ports, as well as Goderich, ON. The 2015 main basin prey fish biomass estimate for Lake Huron was 19.4 kilotonnes, a decline of about 50 percent from 2014. This estimate is the second lowest in the time series, and is approximately 5 percent of the maximum estimate in the time series observed in 1987. No adult alewife were collected in 2015, and YOY alewife abundance was the second lowest in the time series, up slightly from the record low in 2014. The estimated biomass of yearling and older rainbow smelt also decreased and was the lowest observed in the time series. Estimated adult bloater biomass in Lake Huron declined to about half of the 2014 estimate. YOY alewife, rainbow smelt, and bloater abundance and biomass decreased over 2014. Biomass estimates for deepwater sculpins declined while trout-perch and ninespine stickleback increased over 2014 values, but all remained low compared to historic estimates. The 2015 biomass estimate for round goby increased from 2014 but remains at only 7 percent of the maximum observed in 2003. Wild juvenile lake trout were captured again in 2015, suggesting that natural reproduction by lake trout continues to occur.

  10. Dancing sprites: Detailed analysis of two case studies

    NASA Astrophysics Data System (ADS)

    Soula, Serge; Mlynarczyk, Janusz; Füllekrug, Martin; Pineda, Nicolau; Georgis, Jean-François; van der Velde, Oscar; Montanyà, Joan; Fabró, Ferran

    2017-03-01

    On 29-30 October 2013, a low-light video camera installed at Pic du Midi (2877 m) recorded transient luminous events above a very active storm over the Mediterranean Sea. The minimum cloud top temperature reached -73°C, while the cloud-to-ground (CG) flash rate exceeded 30 flashes min-1. Some sprite events had a long duration and resembled dancing sprites. We analyze in detail the temporal evolution and estimated location of two series of sprite sequences, as well as the cloud structure, the lightning activity, the electric field radiated in a broad range of low frequencies, and the current moment waveform of the lightning strokes. (i) In each series, successive sprite sequences reflect the time and location of the corresponding positive lightning strokes across the stratiform region. (ii) The longer time-delayed (>20 ms) sprite elements correspond to the lower impulsive charge moment changes (iCMC) of the parent strokes (<200 C km), and they are shifted by a few tens of kilometers from their SP+CG stroke. However, both short and long time-delayed sprite elements also occur after strokes that produce a large iCMC and that are followed by a continuing current. (iii) The long time-delayed sprite elements during the continuing current correspond to surges in the current moment waveform. They sometimes occur at an altitude apparently lower than the previous short time-delayed sprite elements, possibly because of changes in the local conductivity. (iv) The largest and brightest sprite elements produce significant current signatures, visible when their delay is not too short (about 3-5 ms).

  11. An Efficient Algorithm for Perturbed Orbit Integration Combining Analytical Continuation and Modified Chebyshev Picard Iteration

    NASA Astrophysics Data System (ADS)

    Elgohary, T.; Kim, D.; Turner, J.; Junkins, J.

    2014-09-01

    Several methods exist for integrating the motion in high order gravity fields. Some recent methods use an approximate starting orbit, and an efficient method is needed for generating warm starts that account for specific low order gravity approximations. By introducing two scalar Lagrange-like invariants and employing the Leibniz product rule, the perturbed motion is integrated by a novel recursive formulation. The Lagrange-like invariants allow exact arbitrary order time derivatives. Restricting attention to the perturbations due to the zonal harmonics J2 through J6, we illustrate the idea. The recursively generated vector-valued time derivatives for the trajectory are used to develop a continuation series-based solution for propagating position and velocity. Numerical comparisons indicate performance improvements of ~70X over existing explicit Runge-Kutta methods while maintaining mm accuracy for the orbit predictions. The Modified Chebyshev Picard Iteration (MCPI) is an iterative path approximation method for solving nonlinear ordinary differential equations. The MCPI utilizes Picard iteration with orthogonal Chebyshev polynomial basis functions to recursively update the states. The key advantages of the MCPI are as follows: 1) Large segments of a trajectory can be approximated by evaluating the forcing function at multiple nodes along the current approximation during each iteration. 2) It can readily handle general gravity perturbations as well as non-conservative forces. 3) Parallel applications are possible. The Picard sequence converges to the solution over large time intervals when the forces are continuous and differentiable. Depending on the accuracy of the starting solution, however, the MCPI may require a significant number of iterations and function evaluations compared to other integrators. In this work, we provide an efficient methodology to establish good starting solutions from the continuation series method; this warm start improves the performance of the MCPI significantly and will likely be useful for other applications where efficiently computed approximate orbit solutions are needed.
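
    The fixed-point idea underlying Picard iteration can be sketched on a simple scalar ODE. The sketch below is not MCPI: it uses a plain grid and trapezoidal quadrature rather than Chebyshev nodes, Chebyshev expansion of the iterates, or a warm start from the continuation series; it only illustrates the iterative update x_{k+1}(t) = x0 + integral of f(s, x_k(s)) from 0 to t.

    ```python
    import numpy as np

    def picard_iteration(f, t, x0, n_iter=20):
        """Basic Picard fixed-point iteration for dx/dt = f(t, x), x(t[0]) = x0.

        The running integral is approximated with a cumulative trapezoidal rule on the
        grid t. Purely an illustration of the fixed-point idea behind MCPI.
        """
        x = np.full_like(t, x0, dtype=float)
        for _ in range(n_iter):
            integrand = f(t, x)
            increments = 0.5 * (integrand[1:] + integrand[:-1]) * np.diff(t)
            x = x0 + np.concatenate(([0.0], np.cumsum(increments)))
        return x

    # Example: dx/dt = -x, x(0) = 1, compared against the exact solution exp(-t)
    t = np.linspace(0.0, 2.0, 201)
    x = picard_iteration(lambda s, y: -y, t, 1.0)
    print(np.max(np.abs(x - np.exp(-t))))   # should be small
    ```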

  12. The Reassuring Role of TV's Continuing Characters.

    ERIC Educational Resources Information Center

    Piltch, Charles N.

    1979-01-01

    The author analyzes the series form of television program, particularly the qualities and functions of the continuing characters and their relationship to the plot. He discusses the reassuring psychological effects of a TV series on the audience and the implications of a decline in this type of programing. (SJL)

  13. About well-posed definition of geophysical fields'

    NASA Astrophysics Data System (ADS)

    Ermokhine, Konstantin; Zhdanova, Ludmila; Litvinova, Tamara

    2013-04-01

    We introduce a new approach to the downward continuation of geophysical fields based on approximation of the observed data by continued fractions. Key words: downward continuation, continued fraction, Viskovatov's algorithm. Many papers in geophysics are devoted to the downward continuation of geophysical fields from the Earth's surface to the lower half-space. A known obstacle to the practical use of the method is the break-down of the field near the pole closest to the Earth's surface. This is explained by a discrepancy in the mathematical description of the studied fields: a linear representation of the field in polynomial form, as a Taylor or Fourier series, leads to an essential and unremovable instability of the inverse problem, since a field with singularities in the form of poles in the lower half-space cannot, in principle, be adequately described by such a linear construction. A description of the field by rational fractions is closer to reality; in this case the poles of the function in the lower half-space correspond to the zeros of the denominator. The method proposed below is based on continued fractions. Consider a function measured along a profile and represented as a Chebyshev series (after first reducing the argument to the interval [-1, 1]). There are many variants for representing a power series by a continued fraction, and the regions of convergence of the series and of the corresponding continued fraction may differ essentially. As our investigations have shown, the most suitable mathematical construction for the continuation of geophysical fields is the so-called general C-fraction, where z designates the depth. For the construction of the C-fraction corresponding to a power series there exists a rather effective and stable algorithm due to Viskovatov (Viskovatov B., "De la methode generale pour reduire toutes sortes des quantitees en fraction continues", Memoires de l'Academie Imperiale des Sciences de St. Petersburg, 1, 1805). This fundamentally new algorithm for the downward continuation (into the underground half-space) of a field measured at the surface makes it possible to interpret geophysical data, build a cross-section, and determine the depth, approximate shape and size of the sources of the geophysical fields measured at the surface. The method applies to any geophysical survey: magnetic, gravimetric, electrical, seismic, geochemical, etc. The method was tested on model examples and on practical data, and the results are confirmed by drilling.
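
    For orientation, a general C-fraction can be evaluated by a simple bottom-up recurrence once its coefficients are known (for example from Viskovatov's algorithm, which is not reproduced here). The convention below, with z multiplying each partial numerator, is one common form and is shown purely as an assumption-laden illustration, not the authors' formulation.

    ```python
    def eval_c_fraction(coeffs, z):
        """Evaluate a C-fraction  a0 / (1 + a1*z / (1 + a2*z / (1 + ...)))  bottom-up.

        coeffs : [a0, a1, ..., an] continued-fraction coefficients (e.g. obtained from
                 a power series via a Viskovatov-type algorithm).
        z      : evaluation point (for downward continuation this would encode depth).
        The exact convention (placement of z, signs) varies between texts; this is one
        common form, shown only to illustrate bottom-up evaluation of the truncation.
        """
        tail = 0.0
        for a in reversed(coeffs[1:]):
            tail = a * z / (1.0 + tail)
        return coeffs[0] / (1.0 + tail)

    # Example: the C-fraction 1/(1 + z/(1 + z/(1 + ...))) truncated after a few terms
    print(eval_c_fraction([1.0, 1.0, 1.0, 1.0, 1.0], 0.25))
    ```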

  14. Precursory diffuse CO2 and H2S emission signatures of the 2011-2012 El Hierro submarine eruption, Canary Islands

    NASA Astrophysics Data System (ADS)

    Pérez, Nemesio M.; Padilla, Germán D.; Padrón, Eleazar; Hernández, Pedro A.; Melián, Gladys V.; Barrancos, José; Dionis, Samara; Nolasco, Dácil; Rodríguez, Fátima; Calvo, David; Hernández, Íñigo

    2012-08-01

    On October 12, 2011, a submarine eruption began 2 km off the coast of La Restinga, south of El Hierro Island. CO2 and H2S soil efflux were continuously measured during the period of volcanic unrest by using the accumulation chamber method at two different geochemical stations, HIE01 and HIE07. Recorded CO2 and H2S effluxes showed precursory signals that preceded the submarine eruption. Beginning in late August, the CO2 efflux time series started increasing at a relatively constant rate over one month, reaching a maximum of 19 g m-2 d-1 one week before the onset of the submarine volcanic eruption. The H2S efflux time series at HIE07 showed a pulse in H2S emission just one day before the initiation of the submarine eruption, reaching peak values of 42 mg m-2 d-1, 10 times the average H2S efflux recorded during the observation period. Since CO2 and H2S effluxes are strongly influenced by external factors, we applied a multiple regression analysis to remove their contribution. A statistical analysis showed that the long-term trend of the filtered data is well correlated with the seismic energy. We find that these geochemical stations are important monitoring sites for evaluating the volcanic activity of El Hierro and that they demonstrate the potential of applying continuous monitoring of soil CO2 and H2S efflux to improve and optimize the detection of early warning signals of future volcanic unrest episodes at El Hierro. Continuous diffuse degassing studies would likely prove useful for monitoring other volcanoes during unrest episodes.
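
    A minimal sketch of the external-factor filtering step described above, using ordinary multiple linear regression; the choice of regressors (barometric pressure and soil moisture) is an assumption for illustration, not necessarily the study's actual covariates.

        import numpy as np

        # Fit efflux against external factors and keep the residuals (re-centred
        # on the original mean) as the "filtered" series.
        def filter_efflux(efflux, regressors):
            X = np.column_stack([np.ones(len(efflux))] + list(regressors))
            beta, *_ = np.linalg.lstsq(X, efflux, rcond=None)
            return efflux - X @ beta + efflux.mean()

        # Synthetic example with two assumed external drivers.
        rng = np.random.default_rng(0)
        pressure = rng.normal(1013.0, 5.0, 500)                # hPa (illustrative)
        soil_moisture = rng.normal(0.20, 0.05, 500)            # m3/m3 (illustrative)
        efflux = 10 + 0.3 * (pressure - 1013) - 20 * (soil_moisture - 0.2) + rng.normal(0, 1, 500)
        filtered = filter_efflux(efflux, [pressure - 1013, soil_moisture - 0.2])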

  15. Seventh international conference on time-resolved vibrational spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dyer, R.B.; Martinez, M.A.D.; Shreve, A.

    1997-04-01

    The International Conference on Time-Resolved Vibrational Spectroscopy (TRVS) is widely recognized as the major international forum for the discussion of advances in this rapidly growing field. The 1995 conference was the seventh in a series that began at Lake Placid, New York, in 1982. Santa Fe, New Mexico, was the site of the Seventh International Conference on Time-Resolved Vibrational Spectroscopy, held from June 11 to 16, 1995. TRVS-7 was attended by 157 participants from 16 countries and 85 institutions, and research ranging across the full breadth of the field of time-resolved vibrational spectroscopy was presented. Advances in both experimental capabilities for time-resolved vibrational measurements and in theoretical descriptions of time-resolved vibrational methods continue to occur, and several sessions of the conference were devoted to discussion of these advances and the associated new directions in TRVS. Continuing the interdisciplinary tradition of the TRVS meetings, applications of time-resolved vibrational methods to problems in physics, biology, materials science, and chemistry comprised a large portion of the papers presented at the conference.

  16. Ecological dynamics of continuous and categorical decision-making: the regatta start in sailing.

    PubMed

    Araújo, Duarte; Davids, Keith; Diniz, Ana; Rocha, Luis; Santos, João Coelho; Dias, Gonçalo; Fernandes, Orlando

    2015-01-01

    Ecological dynamics of decision-making in the sport of sailing exemplifies emergent, conditionally coupled, co-adaptive behaviours. In this study, observation of the coupling dynamics of paired boats during competitive sailing showed that decision-making can be modelled as a self-sustained, co-adapting system of informationally coupled oscillators (boats). By tracing the spatial-temporal displacements of the boats, time series analyses (autocorrelations, periodograms and running correlations) revealed that trajectories of match racing boats are coupled more than 88% of the time during a pre-start race, via continuous, competing co-adaptations between boats. Results showed that both the continuously selected trajectories of the sailors (12 years of age) and their categorical starting point locations were examples of emergent decisions. In this dynamical conception of decision-making behaviours, strategic positioning (categorical) and continuous displacement of a boat over the course in match-race sailing emerged as a function of interacting task, personal and environmental constraints. Results suggest how key interacting constraints could be manipulated in practice to enhance sailors' perceptual attunement to them in competition.

  17. Intrinsic vs. spurious long-range memory in high-frequency records of environmental radioactivity - Critical re-assessment and application to indoor 222Rn concentrations from Coimbra, Portugal

    NASA Astrophysics Data System (ADS)

    Donner, Reik V.; Potirakis, Stelios M.; Barbosa, Susana M.; Matos, Jose A. O.

    2015-04-01

    The presence or absence of long-range correlations in environmental radioactivity fluctuations has recently attracted considerable interest. Among a multiplicity of practically relevant applications, identifying and disentangling the environmental factors controlling the variable concentrations of the radioactive noble gas Radon is important for estimating its effect on human health and the efficiency of possible measures for reducing the corresponding exposure. In this work, we present a critical re-assessment of a multiplicity of complementary methods that have been previously applied for evaluating the presence of long-range correlations and fractal scaling in environmental Radon variations, with a particular focus on the specific properties of the underlying time series. As an illustrative case study, we subsequently re-analyze two high-frequency records of indoor Radon concentrations from Coimbra, Portugal, each of which spans several months of continuous measurements at a high temporal resolution of five minutes. Our results reveal that at the study site, Radon concentrations exhibit complex multi-scale dynamics with qualitatively different properties at different time scales: (i) essentially white noise in the high-frequency part (up to time scales of about one hour), (ii) spurious indications of a non-stationary, apparently long-range correlated process (at time scales between hours and one day) arising from marked periodic components probably related to tidal frequencies, and (iii) low-frequency variability indicating a true long-range dependent process, which might be dominated by a response to meteorological drivers. In the presence of such multi-scale variability, common estimators of long-range memory in time series are necessarily prone to fail if applied to the raw data without prior separation of the time scales with qualitatively different dynamics. We emphasize that similar properties can be found in other types of geophysical time series (for example, tide gauge records), calling for a careful application of time series analysis tools when studying such data.
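
    As a hedged illustration of one common long-range-memory estimator used in this class of studies, the sketch below implements basic detrended fluctuation analysis (DFA); the scale choices are illustrative, and, as the abstract stresses, periodic components should be removed before interpreting the exponent.

        import numpy as np

        # Basic DFA: integrate the demeaned series, detrend it linearly in
        # windows of each scale, and fit the log-log slope of the fluctuation
        # function.  An exponent near 0.5 indicates white noise; values
        # approaching 1 indicate long-range correlations.
        def dfa_exponent(x, scales):
            profile = np.cumsum(x - np.mean(x))
            flucts = []
            for s in scales:
                n_seg = len(profile) // s
                segs = profile[:n_seg * s].reshape(n_seg, s)
                t = np.arange(s)
                rms = [np.sqrt(np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2))
                       for seg in segs]
                flucts.append(np.mean(rms))
            slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
            return slope

        x = np.random.default_rng(1).normal(size=10000)
        print(dfa_exponent(x, scales=[16, 32, 64, 128, 256]))  # ~0.5 for white noise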

  18. Radiosonde Atmospheric Temperature Products for Assessing Climate (RATPAC): Towards a New Adjusted Radiosonde Dataset

    NASA Astrophysics Data System (ADS)

    Free, M. P.; Angell, J. K.; Durre, I.; Klein, S.; Lanzante, J.; Lawrimore, J.; Peterson, T.; Seidel, D.

    2002-05-01

    The objective of NOAA's RATPAC project is to develop climate-quality global, hemispheric and zonal upper-air temperature time series from the NCDC radiosonde database. Lanzante, Klein and Seidel (LKS) have produced an 87-station adjusted radiosonde dataset using a multifactor expert decision approach. Our goal is to extend this dataset spatially and temporally and to provide a method to update it routinely at NCDC. Since the LKS adjustment method is too labor-intensive for these purposes, we are investigating a first-difference method (Peterson et al., 1998) and an automated version of the LKS method. The first difference method (FD) can be used to combine large numbers of time series into spatial means, but also introduces a random error in the resulting large-scale averages. If the portions of the time series with suspect continuity are withheld from the calculations, it has the potential to reconstruct the real variability without the effects of the discontinuities. However, tests of FD on unadjusted radiosonde data and on reanalysis temperature data suggest that it must be used with caution when the number of stations is low and the number of data gaps is high. Because of these problems with the first difference approach, we are also considering an automated version of the LKS adjustment method using statistical change points, day-night temperature difference series, relationships between changes in adjacent atmospheric levels, and station histories to identify inhomogeneities in the temperature data.
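
    A minimal sketch of the first-difference (FD) idea described above, under simplifying assumptions (annual series, gaps or withheld segments marked as NaN); the actual RATPAC implementation details may differ.

        import numpy as np

        # First-difference combination: difference each station series in time,
        # average the differences across stations (NaNs mark gaps), then
        # integrate back to a large-scale anomaly series.
        def first_difference_mean(series):         # series: stations x years
            diffs = np.diff(series, axis=1)
            mean_diff = np.nanmean(diffs, axis=0)
            return np.concatenate(([0.0], np.cumsum(mean_diff)))

        data = np.array([[0.10, 0.20, np.nan, 0.40, 0.50],
                         [0.00, 0.10, 0.30, 0.35, np.nan],
                         [0.20, 0.25, 0.30, np.nan, 0.60]])
        print(first_difference_mean(data))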

  19. Constraining the physics of carbon crystallization through pulsations of a massive DAV BPM37093

    NASA Astrophysics Data System (ADS)

    Nitta, Atsuko; Kepler, S. O.; Chené, André-Nicolas; Koester, D.; Provencal, J. L.; Kleinmani, S. J.; Sullivan, D. J.; Chote, Paul; Sefako, Ramotholo; Kanaan, Antonio; Romero, Alejandra; Corti, Mariela; Kilic, Mukremin; Montgomery, M. H.; Winget, D. E.

    We are trying to reduce the largest uncertainties in using white dwarf stars as Galactic chronometers by understanding the details of carbon crystallization that currently result in a 1-2 Gyr uncertainty in the ages of the oldest white dwarf stars. We expect the coolest white dwarf stars to have crystallized interiors, but theory also predicts that hotter white dwarf stars, if they are massive enough, will have some core crystallization. BPM 37093 is the first discovered of only a handful of known massive white dwarf stars that are also pulsating DAV, or ZZ Ceti, variables. Our approach is to use the pulsations to constrain the core composition and amount of crystallization. Here we report our analysis of 4 hours of continuous time series spectroscopy of BPM 37093 with Gemini South combined with simultaneous time-series photometry from Mt. John (New Zealand), SAAO, PROMPT, and Complejo Astronomico El Leoncito (CASLEO, Argentina).

  20. A Surrogate Technique for Investigating Deterministic Dynamics in Discrete Human Movement.

    PubMed

    Taylor, Paul G; Small, Michael; Lee, Kwee-Yum; Landeo, Raul; O'Meara, Damien M; Millett, Emma L

    2016-10-01

    Entropy is an effective tool for investigation of human movement variability. However, before applying entropy, it can be beneficial to employ analyses to confirm that observed data are not solely the result of stochastic processes. This can be achieved by contrasting observed data with that produced using surrogate methods. Unlike continuous movement, no appropriate method has been applied to discrete human movement. This article proposes a novel surrogate method for discrete movement data, outlining the processes for determining its critical values. The proposed technique reliably generated surrogates for discrete joint angle time series, destroying fine-scale dynamics of the observed signal, while maintaining macro structural characteristics. Comparison of entropy estimates indicated observed signals had greater regularity than surrogates and were not only the result of stochastic but also deterministic processes. The proposed surrogate method is both a valid and reliable technique to investigate determinism in other discrete human movement time series.
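
    The article's novel surrogate for discrete movement data is not reproduced here; as a hedged point of reference, the sketch below shows the classical phase-randomized (Fourier) surrogate commonly used for continuous signals, which preserves the power spectrum while destroying fine-scale nonlinear structure.

        import numpy as np

        # Phase-randomised surrogate: keep the Fourier amplitudes (and hence the
        # linear correlation structure) of the signal, randomise the phases, and
        # invert back to the time domain.
        def phase_randomised_surrogate(x, rng=None):
            rng = np.random.default_rng() if rng is None else rng
            spectrum = np.fft.rfft(x)
            phases = rng.uniform(0.0, 2.0 * np.pi, spectrum.size)
            phases[0] = 0.0                        # keep the mean component real
            return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n=len(x))

        x = np.sin(np.linspace(0, 20 * np.pi, 1000)) + 0.1 * np.random.default_rng(2).normal(size=1000)
        surrogate = phase_randomised_surrogate(x)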

  1. Land Cover Analysis by Using Pixel-Based and Object-Based Image Classification Method in Bogor

    NASA Astrophysics Data System (ADS)

    Amalisana, Birohmatin; Rokhmatullah; Hernina, Revi

    2017-12-01

    The advantage of image classification is that it provides earth-surface information such as land cover and its time-series changes. Nowadays, pixel-based image classification is commonly performed with a variety of algorithms such as minimum distance, parallelepiped, maximum likelihood and Mahalanobis distance. On the other hand, land cover classification can also be obtained using object-based image classification, which uses image segmentation based on parameters such as scale, form, colour, smoothness and compactness. This research aims to compare land cover classification results and the detected changes between the parallelepiped pixel-based and the object-based classification methods. The study area is Bogor, with a 20-year observation range from 1996 to 2016. This region is known for urban areas that change continuously due to rapid development, so time-series land cover information for this region is of particular interest.

  2. Experimental nonlinear dynamical studies in cesium magneto-optical trap using time-series analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anwar, M., E-mail: mamalik2000@gmail.com; Islam, R.; Faisal, M.

    2015-03-30

    A magneto-optical trap of neutral atoms is essentially a dissipative quantum system. The fast thermal atoms continuously dissipate their energy to the environment via spontaneous emissions during the cooling. The atoms are, therefore, strongly coupled with the vacuum reservoir and the laser field. The vacuum fluctuations as well as the field fluctuations are imparted to the atoms as random photon recoils. Consequently, the external and internal dynamics of atoms becomes stochastic. In this paper, we have investigated the stochastic dynamics of the atoms in a magneto-optical trap during the loading process. The time series analysis of the fluorescence signal shows that the dynamics of the atoms evolves, like all dissipative systems, from the deterministic to the chaotic regime. The subsequent disappearance and revival of chaos was attributed to chaos synchronization between spatially different atoms in the magneto-optical trap.

  3. Product demand forecasts using wavelet kernel support vector machine and particle swarm optimization in manufacture system

    NASA Astrophysics Data System (ADS)

    Wu, Qi

    2010-03-01

    Demand forecasts play a crucial role in supply chain management; the future demand for a certain product is the basis for the respective replenishment systems. For demand series with small samples, seasonal character, nonlinearity, randomness and fuzziness, existing support vector kernels cannot adequately approximate the random curve of the sales time series in the quadratic continuous integral space. In this paper, we present a hybrid intelligent system combining a wavelet kernel support vector machine and particle swarm optimization for demand forecasting. Application to a car sales series shows that the forecasting approach based on the hybrid PSOWv-SVM model is effective and feasible. A comparison between the proposed method and others is also given, showing that, for the discussed example, this method outperforms the hybrid PSOv-SVM model and other traditional methods.
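
    A hedged sketch of using a Morlet-type wavelet kernel as a custom SVR kernel is shown below; the exact kernel form, the lag structure, the synthetic data, and the hyperparameters (which the paper tunes by particle swarm optimization) are all illustrative assumptions.

        import numpy as np
        from sklearn.svm import SVR

        # Morlet-type wavelet kernel used as a custom SVR kernel; `a` and C would
        # be tuned (by PSO in the paper's approach), here they are fixed guesses.
        def wavelet_kernel(X, Y, a=2.0):
            K = np.ones((X.shape[0], Y.shape[0]))
            for j in range(X.shape[1]):
                d = X[:, j][:, None] - Y[:, j][None, :]
                K *= np.cos(1.75 * d / a) * np.exp(-(d ** 2) / (2.0 * a ** 2))
            return K

        # One-step-ahead forecast of a synthetic seasonal demand series from 12 lags.
        rng = np.random.default_rng(3)
        t = np.arange(120)
        demand = 100 + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 2, t.size)
        X = np.array([demand[i:i + 12] for i in range(t.size - 12)])
        y = demand[12:]
        Xs = (X - X.mean()) / X.std()              # crude scaling for the kernel
        model = SVR(kernel=wavelet_kernel, C=10.0).fit(Xs[:-1], y[:-1])
        print(model.predict(Xs[-1:]), y[-1])       # forecast vs. held-out value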

  4. Observations and projections of visibility and aerosol optical thickness (1956-2100) in the Netherlands: impacts of time-varying aerosol composition and hygroscopicity

    NASA Astrophysics Data System (ADS)

    Boers, R.; van Weele, M.; van Meijgaard, E.; Savenije, M.; Siebesma, A. P.; Bosveld, F.; Stammes, P.

    2015-01-01

    Time series of visibility and aerosol optical thickness for the Netherlands have been constructed for 1956-2100 based on observations and aerosol mass scenarios. Aerosol optical thickness from 1956 to 2013 has been reconstructed by converting time series of visibility to visible extinction which in turn are converted to aerosol optical thickness using an appropriate scaling depth. The reconstruction compares closely with remote sensing observations of aerosol optical thickness between 1960 and 2013. It appears that aerosol optical thickness was relatively constant over the Netherlands in the years 1955-1985. After 1985, visibility has improved, while at the same time aerosol optical thickness has decreased. Based on aerosol emission scenarios for the Netherlands three aerosol types have been identified: (1) a constant background consisting of sea salt and mineral dust, (2) a hydrophilic anthropogenic inorganic mixture, and (3) a partly hydrophobic mixture of black carbon (BC) and organic aerosols (OAs). A reduction in overall aerosol concentration turns out to be the most influential factor in the reduction in aerosol optical thickness. But during 1956-1985, an upward trend in hydrophilic aerosols and associated upward trend in optical extinction has partly compensated the overall reduction in optical extinction due to the reduction in less hydrophilic BC and OAs. A constant optical thickness ensues. This feature highlights the influence of aerosol hygroscopicity on time-varying signatures of atmospheric optical properties. Within the hydrophilic inorganic aerosol mixture there is a gradual shift from sulfur-based (1956-1985) to a nitrogen-based water aerosol chemistry (1990 onwards) but always modulated by the continual input of sodium from sea salt. From 2013 to 2100, visibility is expected to continue its increase, while at the same time optical thickness is foreseen to continue to decrease. The contribution of the hydrophilic mixture to the aerosol optical thickness will increase from 30% to 35% in 1956 to more than 70% in 2100. At the same time the contribution of black and organic aerosols will decrease by more than 80%.
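
    A minimal sketch of the visibility-to-optical-thickness conversion chain described above, assuming the Koschmieder relation and a constant scaling depth; the study's actual constants and corrections may differ, and the molecular (Rayleigh) contribution is ignored here.

        import numpy as np

        # Visibility -> extinction (Koschmieder, 3.912/V) -> AOT via a constant
        # aerosol scaling depth H; both constants are assumptions.
        def visibility_to_aot(visibility_km, scale_height_km=1.0):
            extinction_per_km = 3.912 / np.asarray(visibility_km, dtype=float)
            return extinction_per_km * scale_height_km

        print(visibility_to_aot([10.0, 20.0, 40.0]))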

  5. Evaluating the impact of a mandatory pre-abortion ultrasound viewing law: A mixed methods study.

    PubMed

    Upadhyay, Ushma D; Kimport, Katrina; Belusa, Elise K O; Johns, Nicole E; Laube, Douglas W; Roberts, Sarah C M

    2017-01-01

    Since mid-2013, Wisconsin abortion providers have been legally required to display and describe pre-abortion ultrasound images. We aimed to understand the impact of this law. We used a mixed-methods study design at an abortion facility in Wisconsin. We abstracted data from medical charts one year before the law to one year after and used multivariable models, mediation/moderation analysis, and interrupted time series to assess the impact of the law, viewing, and decision certainty on likelihood of continuing the pregnancy. We conducted in-depth interviews with women in the post-law period about their ultrasound experience and analyzed them using elaborative and modified grounded theory. A total of 5342 charts were abstracted; 8.7% continued their pregnancies pre-law and 11.2% post-law (p = 0.002). A multivariable model confirmed the law was associated with higher odds of continuing pregnancy (aOR = 1.23, 95% CI: 1.01-1.50). Decision certainty (aOR = 6.39, 95% CI: 4.72-8.64) and having to pay fully out of pocket (aOR = 4.98, 95% CI: 3.86-6.41) were most strongly associated with continuing pregnancy. Ultrasound viewing fully mediated the relationship between the law and continuing pregnancy. Interrupted time series analyses found no significant effect of the law but may have been underpowered to detect such a small effect. Nineteen of twenty-three women interviewed viewed their ultrasound image. Most reported no impact on their abortion decision; five reported a temporary emotional impact or increased certainty about choosing abortion. Two women reported that viewing helped them decide to continue the pregnancy; both also described preexisting decision uncertainty. This law caused an increase in viewing rates and a statistically significant but small increase in continuing pregnancy rates. However, the majority of women were certain of their abortion decision and the law did not change their decision. Other factors were more significant in women's decision-making, suggesting evaluations of restrictive laws should take account of the broader social environment.

  6. Evaluating the impact of a mandatory pre-abortion ultrasound viewing law: A mixed methods study

    PubMed Central

    Kimport, Katrina; Belusa, Elise K. O.; Johns, Nicole E.; Laube, Douglas W.; Roberts, Sarah C. M.

    2017-01-01

    Background Since mid-2013, Wisconsin abortion providers have been legally required to display and describe pre-abortion ultrasound images. We aimed to understand the impact of this law. Methods We used a mixed-methods study design at an abortion facility in Wisconsin. We abstracted data from medical charts one year before the law to one year after and used multivariable models, mediation/moderation analysis, and interrupted time series to assess the impact of the law, viewing, and decision certainty on likelihood of continuing the pregnancy. We conducted in-depth interviews with women in the post-law period about their ultrasound experience and analyzed them using elaborative and modified grounded theory. Results A total of 5342 charts were abstracted; 8.7% continued their pregnancies pre-law and 11.2% post-law (p = 0.002). A multivariable model confirmed the law was associated with higher odds of continuing pregnancy (aOR = 1.23, 95% CI: 1.01–1.50). Decision certainty (aOR = 6.39, 95% CI: 4.72–8.64) and having to pay fully out of pocket (aOR = 4.98, 95% CI: 3.86–6.41) were most strongly associated with continuing pregnancy. Ultrasound viewing fully mediated the relationship between the law and continuing pregnancy. Interrupted time series analyses found no significant effect of the law but may have been underpowered to detect such a small effect. Nineteen of twenty-three women interviewed viewed their ultrasound image. Most reported no impact on their abortion decision; five reported a temporary emotional impact or increased certainty about choosing abortion. Two women reported that viewing helped them decide to continue the pregnancy; both also described preexisting decision uncertainty. Conclusions This law caused an increase in viewing rates and a statistically significant but small increase in continuing pregnancy rates. However, the majority of women were certain of their abortion decision and the law did not change their decision. Other factors were more significant in women’s decision-making, suggesting evaluations of restrictive laws should take account of the broader social environment. PMID:28746377

  7. Disentangling Time-series Spectra with Gaussian Processes: Applications to Radial Velocity Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Czekala, Ian; Mandel, Kaisey S.; Andrews, Sean M.

    Measurements of radial velocity variations from the spectroscopic monitoring of stars and their companions are essential for a broad swath of astrophysics; these measurements provide access to the fundamental physical properties that dictate all phases of stellar evolution and facilitate the quantitative study of planetary systems. The conversion of those measurements into both constraints on the orbital architecture and individual component spectra can be a serious challenge, however, especially for extreme flux ratio systems and observations with relatively low sensitivity. Gaussian processes define sampling distributions of flexible, continuous functions that are well-motivated for modeling stellar spectra, enabling proficient searches for companion lines in time-series spectra. We introduce a new technique for spectral disentangling, where the posterior distributions of the orbital parameters and intrinsic, rest-frame stellar spectra are explored simultaneously without needing to invoke cross-correlation templates. To demonstrate its potential, this technique is deployed on red-optical time-series spectra of the mid-M-dwarf binary LP661-13. We report orbital parameters with improved precision compared to traditional radial velocity analysis and successfully reconstruct the primary and secondary spectra. We discuss potential applications for other stellar and exoplanet radial velocity techniques and extensions to time-variable spectra. The code used in this analysis is freely available as an open-source Python package.

  8. Initial Validation of NDVI time series from AVHRR, VEGETATION, and MODIS

    NASA Technical Reports Server (NTRS)

    Morisette, Jeffrey T.; Pinzon, Jorge E.; Brown, Molly E.; Tucker, Jim; Justice, Christopher O.

    2004-01-01

    The paper will address Theme 7: Multi-sensor opportunities for VEGETATION. We present analysis of a long-term vegetation record derived from three moderate resolution sensors: AVHRR, VEGETATION, and MODIS. While empirically based manipulation can ensure agreement between the three data sets, there is a need to validate the series. This paper uses atmospherically corrected ETM+ data available over the EOS Land Validation Core Sites as an independent data set with which to compare the time series. We use ETM+ data from 15 globally distributed sites, 7 of which contain repeat coverage in time. These high-resolution data are compared to the values of each sensor by spatially aggregating the ETM+ to each specific sensor's spatial coverage. The aggregated ETM+ value provides a point estimate for a specific site on a specific date. The standard deviation of that point estimate is used to construct a confidence interval for that point estimate. The values from each moderate resolution sensor are then evaluated with respect to that confidence interval. Results show that AVHRR, VEGETATION, and MODIS data can be combined to assess temporal uncertainties and address data continuity issues and that the atmospherically corrected ETM+ data provide an independent source with which to compare that record. The final product is a consistent time series climate record that links historical observations to current and future measurements.
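
    A minimal sketch of the footprint-level comparison described above: aggregate the fine-resolution ETM+ values, form a confidence interval around the point estimate, and test whether the moderate-resolution value falls inside it. The interval formula and the synthetic pixel values are illustrative assumptions.

        import numpy as np

        # Aggregate ETM+ pixels in one coarse footprint, build a CI around the
        # point estimate, and test whether the coarse-sensor value falls inside.
        def footprint_check(etm_pixels, coarse_value, z=1.96):
            mean = etm_pixels.mean()
            half_width = z * etm_pixels.std(ddof=1) / np.sqrt(etm_pixels.size)
            return (mean - half_width) <= coarse_value <= (mean + half_width)

        etm_pixels = np.random.default_rng(4).normal(0.62, 0.04, 400)   # aggregated NDVI
        print(footprint_check(etm_pixels, coarse_value=0.60))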

  9. Real-Time Monitoring of Psychotherapeutic Processes: Concept and Compliance

    PubMed Central

    Schiepek, Günter; Aichhorn, Wolfgang; Gruber, Martin; Strunk, Guido; Bachler, Egon; Aas, Benjamin

    2016-01-01

    Objective: The feasibility of a high-frequency real-time monitoring approach to psychotherapy is outlined and tested for patients' compliance to evaluate its integration to everyday practice. Criteria concern the ecological momentary assessment, the assessment of therapy-related cognitions and emotions, equidistant time sampling, real-time nonlinear time series analysis, continuous participative process control by client and therapist, and the application of idiographic (person-specific) surveys. Methods: The process-outcome monitoring is technically realized by an internet-based device for data collection and data analysis, the Synergetic Navigation System. Its feasibility is documented by a compliance study on 151 clients treated in an inpatient and a day-treatment clinic. Results: We found high compliance rates (mean: 78.3%, median: 89.4%) amongst the respondents, independent of the severity of symptoms or the degree of impairment. Compared to other diagnoses, the compliance rate was lower in the group diagnosed with personality disorders. Conclusion: The results support the feasibility of high-frequency monitoring in routine psychotherapy settings. Daily collection of psychological surveys allows for the assessment of highly resolved, equidistant time series data which gives insight into the nonlinear qualities of therapeutic change processes (e.g., pattern transitions, critical instabilities). PMID:27199837

  10. Analysis of biomedical time signals for characterization of cutaneous diabetic micro-angiopathy

    NASA Astrophysics Data System (ADS)

    Kraitl, Jens; Ewald, Hartmut

    2007-02-01

    Photo-plethysmography (PPG) is frequently used in research on microcirculation of blood. It is a non-invasive procedure and takes minimal time to be carried out. Usually PPG time series are analyzed by conventional linear methods, mainly Fourier analysis. These methods may not be optimal for the investigation of nonlinear effects of the heart circulation system like vasomotion, autoregulation, thermoregulation, breathing, heartbeat and vessels. The wavelet analysis of the PPG time series is a specific, sensitive nonlinear method for the in vivo identification of heart circulation patterns and human health status. This nonlinear analysis of PPG signals provides additional information which cannot be detected using conventional approaches. The wavelet analysis has been used to study healthy subjects and to characterize the health status of patients with a functional cutaneous microangiopathy which was associated with diabetic neuropathy. The non-invasive in vivo method is based on the radiation of monochromatic light through an area of skin on the finger. A Photometrical Measurement Device (PMD) has been developed. The PMD is suitable for non-invasive continuous online monitoring of one or more biologic constituent values and blood circulation patterns.

  11. Chaotic examination

    NASA Astrophysics Data System (ADS)

    Bildirici, Melike; Sonustun, Fulya Ozaksoy; Sonustun, Bahri

    2018-01-01

    In the context of chaos theory, concepts such as complexity, determinism, quantum mechanics, relativity, multiple equilibria, (continuous) instability, nonlinearity, heterogeneous agents and irregularity have been widely discussed in economics. It has been noticed that linear models are insufficient for analyzing the unpredictable, irregular and noncyclical oscillations of economies, and for predicting bubbles, financial crises and business cycles in financial markets. Therefore, economists attach great importance to using appropriate tools for modelling the nonlinear dynamical structures and chaotic behaviors of economies, especially in macroeconomics and financial economics. In this paper, we aim to model the chaotic structure of exchange rates (USD-TL and EUR-TL). To determine nonlinear patterns in the selected time series, daily returns of the exchange rates were tested with the BDS test over the period from January 01, 2002 to May 11, 2017, which covers the era after the 2001 financial crisis. After establishing the nonlinear structure of the selected time series, the chaotic character of the series over the selected period was examined using Lyapunov exponents. The findings verify the existence of a chaotic structure in the exchange rate returns over the analyzed period.

  12. Improved detection of congestive heart failure via probabilistic symbolic pattern recognition and heart rate variability metrics.

    PubMed

    Mahajan, Ruhi; Viangteeravat, Teeradache; Akbilgic, Oguz

    2017-12-01

    A timely diagnosis of congestive heart failure (CHF) is crucial to evade a life-threatening event. This paper presents a novel probabilistic symbol pattern recognition (PSPR) approach to detect CHF in subjects from their cardiac interbeat (R-R) intervals. PSPR discretizes each continuous R-R interval time series by mapping them onto an eight-symbol alphabet and then models the pattern transition behavior in the symbolic representation of the series. The PSPR-based analysis of the discretized series from 107 subjects (69 normal and 38 CHF subjects) yielded discernible features to distinguish normal subjects and subjects with CHF. In addition to PSPR features, we also extracted features using the time-domain heart rate variability measures such as average and standard deviation of R-R intervals. An ensemble of bagged decision trees was used to classify two groups resulting in a five-fold cross-validation accuracy, specificity, and sensitivity of 98.1%, 100%, and 94.7%, respectively. However, a 20% holdout validation yielded an accuracy, specificity, and sensitivity of 99.5%, 100%, and 98.57%, respectively. Results from this study suggest that features obtained with the combination of PSPR and long-term heart rate variability measures can be used in developing automated CHF diagnosis tools. Copyright © 2017 Elsevier B.V. All rights reserved.
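
    A minimal sketch of the symbolization step described above: map R-R intervals onto an eight-symbol alphabet and estimate the symbol transition probability matrix. The equal-frequency binning rule is an assumption; the paper's exact mapping may differ.

        import numpy as np

        # Discretise R-R intervals into eight symbols (equal-frequency bins here)
        # and estimate the first-order symbol transition probability matrix.
        def symbol_transition_matrix(rr, n_symbols=8):
            edges = np.quantile(rr, np.linspace(0, 1, n_symbols + 1)[1:-1])
            symbols = np.digitize(rr, edges)                  # values in 0..n_symbols-1
            counts = np.zeros((n_symbols, n_symbols))
            for a, b in zip(symbols[:-1], symbols[1:]):
                counts[a, b] += 1
            row_sums = counts.sum(axis=1, keepdims=True)
            return np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

        rr = np.random.default_rng(5).normal(0.8, 0.05, 1000)  # synthetic R-R intervals (s)
        P = symbol_transition_matrix(rr)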

  13. Toward a comprehensive landscape vegetation monitoring framework

    NASA Astrophysics Data System (ADS)

    Kennedy, Robert; Hughes, Joseph; Neeti, Neeti; Larrue, Tara; Gregory, Matthew; Roberts, Heather; Ohmann, Janet; Kane, Van; Kane, Jonathan; Hooper, Sam; Nelson, Peder; Cohen, Warren; Yang, Zhiqiang

    2016-04-01

    Blossoming Earth observation resources provide great opportunity to better understand land vegetation dynamics, but also require new techniques and frameworks to exploit their potential. Here, I describe several parallel projects that leverage time-series Landsat imagery to describe vegetation dynamics at regional and continental scales. At the core of these projects are the LandTrendr algorithms, which distill time-series earth observation data into periods of consistent long or short-duration dynamics. In one approach, we built an integrated, empirical framework to blend these algorithmically-processed time-series data with field data and lidar data to ascribe yearly change in forest biomass across the US states of Washington, Oregon, and California. In a separate project, we expanded from forest-only monitoring to full landscape land cover monitoring over the same regional scale, including both categorical class labels and continuous-field estimates. In these and other projects, we apply machine-learning approaches to ascribe all changes in vegetation to driving processes such as harvest, fire, urbanization, etc., allowing full description of both disturbance and recovery processes and drivers. Finally, we are moving toward extension of these same techniques to continental and eventually global scales using Google Earth Engine. Taken together, these approaches provide one framework for describing and understanding processes of change in vegetation communities at broad scales.

  14. Time Series Spectroscopic and Photometric Observations of the Massive DAV BPM 37093

    NASA Astrophysics Data System (ADS)

    Nitta, Atsuko; Kepler, S. O.; Chene, Andre–Nicolas; Koester, D.; Provencal, J. L.; Sullivan, D. J.; Chote, Paul; Safeko, Ramotholo; Kanaan, Antonio; Romero, Alejandra; Corti, Mariela; Kilic, Mukremin; Winget, D. E.

    2015-06-01

    BPM 37093 was the first of only a handful of massive (1.05+/-0.05 M⊙; Bergeron 2004; Koester & Allard 2000) white dwarf pulsators discovered (Kanaan et al. 1992). These stars are particularly interesting because the crystallized mass-fraction as a function of mass and temperature is poorly constrained by observation, yet this process adds 1-2 Gyr uncertainty in ages of the oldest white dwarf stars observed and hence, in the ages of associations that contain them (Abrikosov 1960; Kirzhnits 1960; Salpeter 1961). Last year, we discovered that ESO uses BPM 37093 as a standard star and extracted corresponding spectra from the public archive. The data suggested a large variation in the observed hydrogen line profiles that could potentially be due to pulsations, but the measurement did not reach a detection-quality threshold. To further explore this possibility, though, we obtained 4 hrs of continuous time series spectroscopy of BPM 37093 with Gemini in the Northern Spring of 2014. We present our preliminary results from these data along with those from the accompanying time series photometric observations we gathered from Mt. John (New Zealand), South African Astronomical Observatory (SAAO), Panchromatic Robotic optical Monitoring and Polarimetry Telescopes (PROMPT) in Chile, and Complejo Astronomico El Leoncito (Argentina) to support the Gemini observations.

  15. [Effects of continuous cropping of vegetables on ammonia oxidizers community structure].

    PubMed

    Meng, De-Long; Yang, Yang; Wu, Yan-Zheng; Wu, Min-Na; Qin, Hong-Ling; Zhu, Yi-Jun; Wei, Wen-Xue

    2012-04-01

    Investigations were conducted on the effects of intensive application of chemical fertilizers in crop production on soil nitrifier communities and on the relationship between nitrifier communities and soil nitrification ability. Two series of vegetable soils were selected from Huangxing, Changsha, reflecting continuous vegetable cropping of about 20 years and a new vegetable field with only about 2 years of vegetable growing history. In each series five independent topsoils (0-20 cm) were sampled, and each soil was a mixture of 10 cores randomly taken in the same field. Terminal restriction fragment length polymorphism (T-RFLP) and quantitative PCR (Q-PCR) were used to determine the composition and abundance of ammonia-oxidizing bacteria (AOB) and ammonia-oxidizing archaea (AOA) communities. Results indicated that long-term, continuous vegetable cropping obviously changed the composition of both AOB and AOA amoA genes; soil pH and Olsen-P content were the dominant factors affecting the composition of AOB amoA. In the vegetable soils, although the copy number of the AOA amoA gene was about 5 times higher than that of the AOB amoA gene, no significant correlation was detected between AOA amoA gene abundance and soil nitrification rate. It was not clear whether long-term, continuous vegetable cropping could shift the abundance of AOB and AOA, but it resulted in the enrichment of some dominant AOB species and an increase in soil nitrification potential (PNF).

  16. Detection of deformation time-series in Miyake-jima using PALSAR/InSAR

    NASA Astrophysics Data System (ADS)

    Ozawa, T.; Ueda, H.

    2010-12-01

    Volcano deformation is often complicated temporally and spatially, so deformation mapping by InSAR is useful for understanding it in detail. However, InSAR is affected by atmospheric, ionospheric and other noise, and we therefore sometimes miss important temporal changes in deformation of a few cm. We thus want to develop an InSAR time-series analysis that detects volcano deformation precisely. Generally, a 10×10 km area, which covers a typical volcano, is included in several SAR scenes obtained from different orbits or observation modes. First, interferograms are generated for each orbit path. In the InSAR processing, atmospheric noise reduction using a simulation from a numerical weather model is applied. Long-wavelength noise due to orbit error and ionospheric disturbance is corrected by adjusting to GPS deformation time series, assuming it to be a plane. Next, we estimate the deformation time series from the obtained interferograms. The radar incidence directions for each orbit path are different, but those for observation modes with 34.3° and 41.5° off-nadir angles lie almost in one plane, so the slant-range change for all orbit paths can be described by the horizontal and vertical components within that plane. Inversely, we estimate these components for all epochs with the constraint that the temporal change of deformation is smooth; simultaneously, we estimate the DEM error. As a case study, we present an application to Miyake-jima. Miyake-jima is a volcanic island located 200 km south of Tokyo, and a large amount of volcanic gas has been ejected since the 2000 eruption. Crustal deformation associated with this volcanic activity has been observed by continuous GPS observations, but its distribution is complicated, and we therefore applied this method to detect a precise deformation time series. At most of the GPS sites, the obtained time series were in good agreement with the GPS time series, and the root-mean-square of the residuals was less than 1 cm. However, a temporal step of deformation was estimated in 2008 that is not consistent with the GPS time series; we think the effect of an orbit maneuver in 2008 has appeared, and improvement for such noise is one of the next subjects. In the obtained deformation map, contraction around the caldera and uplift along the north-west-south coast were found. It is obvious that this deformation pattern cannot be explained by a single simple inflation or deflation source, and its interpretation is also one of the next subjects. In the caldera bottom, subsidence of 14 cm/yr was found. Though the subsidence speed was constant until 2008, it decelerated to 20 cm/yr from 2009, and the subsidence speed in 2010 was 3 cm/yr. Around the same time, low-frequency earthquakes increased just under the caldera, and we speculate that the deceleration of subsidence may be directly related to the volcanic activity. Although the result shows the volcano deformation in detail, some mis-estimations were obtained. We believe that this InSAR time-series analysis is useful, but further improvements are necessary.

  17. An evaluation of grease-type ball bearing lubricants operation in various environments

    NASA Technical Reports Server (NTRS)

    Mcmurtrey, E. L.

    1983-01-01

    Because many future spacecraft or space stations will require mechanisms to operate for long periods of time in environments which are adverse to most bearing lubricants, a series of tests is continuing to evaluate 38 grease type lubricants in R-4 size bearings in five different environments for a 1 year period. Four repetitions of each test are made to provide statistical samples. These tests have also been used to select four lubricants for 5 year tests in selected environments with five repetitions of each test for statistical samples. At the present time, 142 test sets have been completed and 30 test sets are underway. The three 5 year tests in (1) continuous operation and (2) start stop operation, with both in vacuum at ambient temperatures, and (3) continuous vacuum operation at 93.3 C are now completed. To date, in both the 1 year and 5 year tests, the best results in all environments have been obtained with a high viscosity index perfluoroalkylpolyether (PFPE) grease.

  18. Dysart Unified School District: How One School District Used Collaborative Planning to Improve Outcomes for All Students. From the Field. Digital Learning Series

    ERIC Educational Resources Information Center

    Slaven, Chip; Hall, Sara; Schwartzbeck, Terri Duggan; Jones, Rachel; Wolf, Mary Ann

    2013-01-01

    The Dysart Unified School District (Dysart) in Arizona covers 140 square miles and serves numerous communities, including the cities of Surprise and El Mirage and some unincorporated areas of Maricopa County. At one time the fastest-growing school system in Arizona, Dysart has tripled in size since 2000. The district continues to grow, and in…

  19. The Conduct of Continuous Operations,

    DTIC Science & Technology

    1987-04-30

    INTRODUCTION The great campaigns of the twentieth century have one thing in common: at least one operational pause occurred between the commencement of...and General Staff College published a series of books on the conduct of war by armies and army groups. Common themes in these works were: the importance...be so located that it can be carried out without interruption from the enemy; sufficient time should be available for placing troops in relative

  20. Reading Achievement: Characteristics Associated with Success and Failure: Abstracts of Doctoral Dissertations Published in "Dissertation Abstracts International," April through June 1978 (Vol. 38 Nos. 10 through 12).

    ERIC Educational Resources Information Center

    ERIC Clearinghouse on Reading and Communication Skills, Urbana, IL.

    This collection of abstracts is part of a continuing series providing information on recent doctoral dissertations. The 20 titles deal with a variety of topics, including the following: the relationships between reading achievement and such factors as dependency, attitude toward reading, mastery of word attack skills, reaction time on selected…

  1. Automatic high throughput empty ISO container verification

    NASA Astrophysics Data System (ADS)

    Chalmers, Alex

    2007-04-01

    Encouraging results are presented for the automatic analysis of radiographic images of a continuous stream of ISO containers to confirm they are truly empty. A series of image processing algorithms are described that process real-time data acquired during the actual inspection of each container and assigns each to one of the classes "empty", "not empty" or "suspect threat". This research is one step towards achieving fully automated analysis of cargo containers.

  2. Solar-terrestrial research for the 1980's

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The solar-terrestrial system is described. Techniques for observations involving all relevant platforms: spacecraft, the Earth's surface, aircraft, balloons, and rockets are proposed. The need for interagency coordination of programs, efficient data management, theoretical studies and modeling, the continuity of long time series observations, and innovative instrument design is emphasized. Examples of the practical impact of interactions between solar terrestrial phenomena and the environment, including technological systems are presented.

  3. Mask industry assessment trend analysis: 2012

    NASA Astrophysics Data System (ADS)

    Chan, Y. David

    2012-02-01

    Microelectronics industry leaders consistently cite the cost and cycle time of mask technology and mask supply among the top critical issues for lithography. A survey was designed by SEMATECH with input from semiconductor company mask technologists and merchant mask suppliers to objectively assess the overall conditions of the mask industry. With the continued support of the industry, this year's assessment was the tenth in the current series of annual reports. This year's survey is basically the same as the 2005 through 2011 surveys. Questions are grouped into six categories: General Business Profile Information, Data Processing, Yields and Yield Loss Mechanisms, Delivery Times, Returns, and Services. Within each category is a multitude of questions that ultimately produce a detailed profile of both the business and technical status of the critical mask industry. We received data from 11 companies this year, which was a record high since the beginning of the series. The responding companies represented more than 96% of the volume shipped and about 90% of the 2011 revenue for the photomask industry. These survey reports are often used as a baseline to gain perspective on the technical and business status of the mask and microelectronics industries. They will continue to serve as a valuable reference to identify strengths and opportunities. Results can also be used to guide future investments in critical path issues.

  4. The use of segmented regression in analysing interrupted time series studies: an example in pre-hospital ambulance care.

    PubMed

    Taljaard, Monica; McKenzie, Joanne E; Ramsay, Craig R; Grimshaw, Jeremy M

    2014-06-19

    An interrupted time series design is a powerful quasi-experimental approach for evaluating effects of interventions introduced at a specific point in time. To utilize the strength of this design, a modification to standard regression analysis, such as segmented regression, is required. In segmented regression analysis, the change in intercept and/or slope from pre- to post-intervention is estimated and used to test causal hypotheses about the intervention. We illustrate segmented regression using data from a previously published study that evaluated the effectiveness of a collaborative intervention to improve quality in pre-hospital ambulance care for acute myocardial infarction (AMI) and stroke. In the original analysis, a standard regression model was used with time as a continuous variable. We contrast the results from this standard regression analysis with those from segmented regression analysis. We discuss the limitations of the former and advantages of the latter, as well as the challenges of using segmented regression in analysing complex quality improvement interventions. Based on the estimated change in intercept and slope from pre- to post-intervention using segmented regression, we found insufficient evidence of a statistically significant effect on quality of care for stroke, although potential clinically important effects for AMI cannot be ruled out. Segmented regression analysis is the recommended approach for analysing data from an interrupted time series study. Several modifications to the basic segmented regression analysis approach are available to deal with challenges arising in the evaluation of complex quality improvement interventions.
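
    A minimal sketch of segmented regression for an interrupted time series, with level-change and slope-change terms; the variable names and simulated data below are illustrative.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(6)
        n_pre, n_post = 24, 24
        t = np.arange(n_pre + n_post)                          # e.g. months
        post = (t >= n_pre).astype(float)                      # 1 after the intervention
        time_after = np.where(t >= n_pre, t - n_pre, 0.0)      # time since intervention
        y = 50 + 0.2 * t + 5 * post + 0.5 * time_after + rng.normal(0, 2, t.size)

        X = sm.add_constant(np.column_stack([t, post, time_after]))
        fit = sm.OLS(y, X).fit()
        print(fit.params)   # baseline level, baseline slope, level change, slope change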

  5. Ensemble Bayesian forecasting system Part I: Theory and algorithms

    NASA Astrophysics Data System (ADS)

    Herr, Henry D.; Krzysztofowicz, Roman

    2015-05-01

    The ensemble Bayesian forecasting system (EBFS), whose theory was published in 2001, is developed for the purpose of quantifying the total uncertainty about a discrete-time, continuous-state, non-stationary stochastic process such as a time series of stages, discharges, or volumes at a river gauge. The EBFS is built of three components: an input ensemble forecaster (IEF), which simulates the uncertainty associated with random inputs; a deterministic hydrologic model (of any complexity), which simulates physical processes within a river basin; and a hydrologic uncertainty processor (HUP), which simulates the hydrologic uncertainty (an aggregate of all uncertainties except input). It works as a Monte Carlo simulator: an ensemble of time series of inputs (e.g., precipitation amounts) generated by the IEF is transformed deterministically through a hydrologic model into an ensemble of time series of outputs, which is next transformed stochastically by the HUP into an ensemble of time series of predictands (e.g., river stages). Previous research indicated that in order to attain an acceptable sampling error, the ensemble size must be on the order of hundreds (for probabilistic river stage forecasts and probabilistic flood forecasts) or even thousands (for probabilistic stage transition forecasts). The computing time needed to run the hydrologic model this many times renders the straightforward simulations operationally infeasible. This motivates the development of the ensemble Bayesian forecasting system with randomization (EBFSR), which takes full advantage of the analytic meta-Gaussian HUP and generates multiple ensemble members after each run of the hydrologic model; this auxiliary randomization reduces the required size of the meteorological input ensemble and makes it operationally feasible to generate a Bayesian ensemble forecast of large size. Such a forecast quantifies the total uncertainty, is well calibrated against the prior (climatic) distribution of predictand, possesses a Bayesian coherence property, constitutes a random sample of the predictand, and has an acceptable sampling error, which makes it suitable for rational decision making under uncertainty.
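
    A conceptual sketch of the Monte Carlo chain described above (input ensemble, deterministic model, hydrologic uncertainty, auxiliary randomization); the toy model and noise distribution stand in for the IEF, hydrologic model, and meta-Gaussian HUP and are purely illustrative.

        import numpy as np

        rng = np.random.default_rng(7)

        def hydrologic_model(precip):                  # toy deterministic model
            return 0.6 * precip + 2.0                  # "stage" from "precipitation"

        precip_ensemble = rng.gamma(shape=2.0, scale=5.0, size=100)   # stands in for the IEF
        model_out = hydrologic_model(precip_ensemble)
        n_aux = 10                                     # auxiliary randomisation (EBFSR idea)
        hup_noise = rng.normal(0.0, 0.5, (model_out.size, n_aux))     # stands in for the HUP
        predictand_ensemble = (model_out[:, None] + hup_noise).ravel()
        print(predictand_ensemble.size)                # 100 x 10 forecast members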

  6. High-resolution (noble) gas time series for aquatic research

    NASA Astrophysics Data System (ADS)

    Popp, A. L.; Brennwald, M. S.; Weber, U.; Kipfer, R.

    2017-12-01

    We developed a portable mass spectrometer (miniRUEDI) for on-site quantification of gas concentrations (He, Ar, Kr, N2, O2, CO2, CH4, etc.) in terrestrial gases [1,2]. Using the gas-equilibrium membrane-inlet technique (GE-MIMS), the miniRUEDI for the first time also allows accurate on-site and long-term dissolved-gas analysis in water bodies. The miniRUEDI is designed for operation in the field and at remote locations, using battery power and ambient air as a calibration gas. In contrast to conventional sampling and subsequent lab analysis, the miniRUEDI provides real-time and continuous time series of gas concentrations with a time resolution of a few seconds. Such high-resolution time series and immediate data availability open up new opportunities for research in highly dynamic and heterogeneous environmental systems. In addition, the combined analysis of inert and reactive gas species provides direct information on the linkages of physical and biogeochemical processes, such as the air/water gas exchange, excess air formation, O2 turnover, or N2 production by denitrification [1,3,4]. We present the miniRUEDI instrument and discuss its use for environmental research based on recent applications of tracking gas dynamics related to rapid and short-term processes in aquatic systems. [1] Brennwald, M.S., Schmidt, M., Oser, J., and Kipfer, R. (2016). Environmental Science and Technology, 50(24):13455-13463, doi: 10.1021/acs.est.6b03669. [2] Gasometrix GmbH, gasometrix.com. [3] Mächler, L., Peter, S., Brennwald, M.S., and Kipfer, R. (2013). Excess air formation as a mechanism for delivering oxygen to groundwater. Water Resources Research, doi: 10.1002/wrcr.20547. [4] Mächler, L., Brennwald, M.S., and Kipfer, R. (2013). Argon Concentration Time-Series As a Tool to Study Gas Dynamics in the Hyporheic Zone. Environmental Science and Technology, doi: 10.1021/es305309b.

  7. Systematic identification of an integrative network module during senescence from time-series gene expression.

    PubMed

    Park, Chihyun; Yun, So Jeong; Ryu, Sung Jin; Lee, Soyoung; Lee, Young-Sam; Yoon, Youngmi; Park, Sang Chul

    2017-03-15

    Cellular senescence irreversibly arrests the growth of human diploid cells. In addition, recent studies have indicated that senescence is a multi-step, evolving process related to important, complex biological processes. Most studies analyzed only the genes, and their functions, representing each senescence phase, without considering gene-level interactions and continuously perturbed genes. It is necessary to reveal the genotypic mechanism inferred from the affected genes and their interactions underlying the senescence process. We suggested a novel computational approach to identify an integrative network which profiles an underlying genotypic signature from time-series gene expression data. Relatively perturbed genes were selected for each time point based on a proposed scoring measure termed the perturbation score. The selected genes were then integrated with protein-protein interactions to construct a time-point-specific network. From these networks, the edges conserved across time points were extracted as the common network, and a statistical test was performed to demonstrate that the network could explain the phenotypic alteration. As a result, it was confirmed that the difference in the average perturbation scores of the common network between the two time points could explain the phenotypic alteration. We also performed functional enrichment on the common network and identified a high association with the phenotypic alteration. Remarkably, we observed that the identified cell-cycle-specific common network played an important role in replicative senescence as a key regulator. To date, network analysis of time-series gene expression data has focused on which topological structures change over time. Conversely, we focused on the structure that is conserved while its context changes over the course of time, and showed that it can explain the phenotypic changes. We expect that the proposed method will help to elucidate the biological mechanisms not revealed by existing approaches.
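
    A conceptual sketch of the pipeline described above, using toy gene names, a toy protein-protein interaction edge list, and pre-selected perturbed gene sets; the perturbation scoring itself is not reproduced.

        import networkx as nx

        # Restrict a PPI network to the perturbed genes at each time point and
        # keep the edges conserved across all time points as the common network.
        ppi = nx.Graph([("A", "B"), ("B", "C"), ("C", "D"), ("A", "D"), ("D", "E")])
        perturbed_per_time = [{"A", "B", "C", "D"}, {"A", "B", "D", "E"}]

        time_networks = [ppi.subgraph(genes) for genes in perturbed_per_time]
        common_edges = set(time_networks[0].edges())
        for g in time_networks[1:]:
            common_edges &= set(g.edges())
        print(common_edges)                            # edges conserved across all time points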

  8. Prostatic cancer. Treated at a categorical center, 1980-1983.

    PubMed

    Slack, N H; Lane, W W; Priore, R L; Murphy, G P

    1986-03-01

    This report covers the experience from 537 patients with prostatic cancer seen at Roswell Park Memorial Institute (RPMI) from 1980 through 1983. This is a look at experiences in the early 1980s and is a continuation of the series covering the decades of the 1950s, 1960s, and 1970s. Referrals continue to dominate the series (85% of cases) but are now only slightly younger (65 years) than in-house diagnoses (66 years), of which one third were diagnosed at autopsy. Survival rates in this series, although limited in follow-up, were similar at two years to those in the 1970s and in the extensive series collected by the survey of the American College of Surgeons. Multiple primary tumors were observed in 22 per cent of this series, most frequently involving the bladder in addition to the prostate. Treatments continue to involve chemotherapy earlier in the course of disease as part of a succession of therapeutic modalities that include transurethral resection of prostate (TURP) or prostatectomy, lymph node dissection, external irradiation, castration, and hormones.

  9. A real-time monitoring system for the facial nerve.

    PubMed

    Prell, Julian; Rachinger, Jens; Scheller, Christian; Alfieri, Alex; Strauss, Christian; Rampp, Stefan

    2010-06-01

    Damage to the facial nerve during surgery in the cerebellopontine angle is indicated by A-trains, a specific electromyogram pattern. These A-trains can be quantified by the parameter "traintime," which is reliably correlated with postoperative functional outcome. The system presented was designed to monitor traintime in real-time. A dedicated hardware and software platform for automated continuous analysis of the intraoperative facial nerve electromyogram was specifically designed. The automatic detection of A-trains is performed by a software algorithm for real-time analysis of nonstationary biosignals. The system was evaluated in a series of 30 patients operated on for vestibular schwannoma. A-trains can be detected and measured automatically by the described method for real-time analysis. Traintime is monitored continuously via a graphic display and is shown as an absolute numeric value during the operation. It is an expression of overall, cumulated length of A-trains in a given channel; a high correlation between traintime as measured by real-time analysis and functional outcome immediately after the operation (Spearman correlation coefficient [rho] = 0.664, P < .001) and in long-term outcome (rho = 0.631, P < .001) was observed. Automated real-time analysis of the intraoperative facial nerve electromyogram is the first technique capable of reliable continuous real-time monitoring. It can critically contribute to the estimation of functional outcome during the course of the operative procedure.

  10. Real-Time Data Management, IP Telemetry, Data Integration, and Data Center Operations for the Source Physics Experiment (SPE), Nevada National Security Site

    NASA Astrophysics Data System (ADS)

    Plank, G.; Slater, D.; Torrisi, J.; Presser, R.; Williams, M.; Smith, K. D.

    2012-12-01

    The Nevada Seismological Laboratory (NSL) manages time-series data and high-throughput IP telemetry for the National Center for Nuclear Security (NCNS) Source Physics Experiment (SPE), underway on the Nevada National Security Site (NNSS). During active-source experiments, SPE's heterogeneous systems record over 350 channels of a variety of data types, including seismic, infrasound, acoustic, and electromagnetic. During the interim periods, broadband and short-period instruments record approximately 200 channels of continuous, high-sample-rate seismic data. Frequent changes in sensor and station configurations create a challenging meta-data environment. Meta-data account for complete operational histories, including sensor types, serial numbers, gains, sample rates, orientations, instrument responses, data-logger types, etc. To date, these catalogue 217 stations, over 40 different sensor types, and over 1000 unique recording configurations (epochs). Facilities for processing, backup, and distribution of time-series data currently span four Linux servers, 60 TB of disk capacity, and two data centers. Bandwidth, physical security, and redundant power and cooling systems for acquisition, processing, and backup servers are provided by NSL's Reno data center. The Nevada System of Higher Education (NSHE) System Computer Services (SCS) in Las Vegas provides similar facilities for the distribution server. NSL staff handle setup, maintenance, and security of all data management systems. SPE PIs have remote access to meta-data, raw data, and CSS3.0 compilations via SSL-based transfers such as rsync or secure-copy, as well as shell access for data browsing and limited processing. Meta-data are continuously updated and posted on the Las Vegas distribution server as station histories are better understood and errors are corrected. Raw time series and refined CSS3.0 data compilations with standardized formats are transferred to the Las Vegas data server as available. For better data availability and station monitoring, SPE is beginning to leverage NSL's wide-area digital IP network, with nine SPE stations and six Rock Valley area stations streaming continuous recordings in real time to the NSL Reno data center. These stations, in addition to eight regional legacy stations supported by National Security Technologies (NSTec), are integrated with NSL's regional monitoring network and constrain a high-quality local earthquake catalog for NNSS. The telemetered stations provide critical capabilities for SPE, as well as infrastructure for earthquake response on NNSS, in southern Nevada, and in the Las Vegas area.
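
    As a purely illustrative sketch of the remote-access path described above (the host name, user, and directories are placeholders, not actual SPE infrastructure), a PI-side mirror of a CSS3.0 compilation could be scripted as follows:

        # Hypothetical mirroring of a CSS3.0 data compilation with rsync over SSH.
        # Host, user, and paths are placeholders, not actual SPE server names.
        import subprocess

        REMOTE = "pi_user@data.example.edu:/spe/css3.0/"   # placeholder remote location
        LOCAL = "./spe_css3.0/"

        subprocess.run(
            ["rsync", "-avz", "-e", "ssh", REMOTE, LOCAL],  # archive mode, verbose, compressed
            check=True,
        )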

  11. The Temporal Dynamics of Spatial Patterns of Observed Soil Moisture Interpreted Using the Hydrus 1-D Model

    NASA Astrophysics Data System (ADS)

    Chen, M.; Willgoose, G. R.; Saco, P. M.

    2009-12-01

    This paper investigates the soil moisture dynamics over two subcatchments (Stanley and Krui) of the Goulburn River in NSW during a three-year period (2005-2007) using the Hydrus 1-D unsaturated soil water flow model. The model was calibrated to the seven sites of the Stanley microcatchment (a 1 km² site) using continuous-time measurements of near-surface (top 30 cm) and full-profile soil moisture. Soil type, leaf area index, and soil depth were found to be the key parameters controlling model fit to the soil moisture time series: respectively, they shifted the time series up or down, changed the steepness of the dry-down recessions, or determined the lowest point of the soil moisture dry-down. Good correlations were obtained between observed and simulated soil water storage (R = 0.8-0.9) when calibrated parameters for one site were applied to the other sites. Soil type was also found to be the main determinant (after rainfall) of the mean of the modelled soil moisture time series. Simulations of the top 30 cm were better than those of the whole soil profile. Within the Stanley microcatchment, excellent soil moisture matches could be generated simply by adjusting the mean of soil moisture up or down slightly, and only minor modifications of soil properties from site to site enabled good fits for all of the Stanley sites. We extended the soil moisture predictions to the larger spatial scale of the Krui catchment (sites up to 30 km distant from Stanley) using soil and vegetation parameters from Stanley but the locally recorded rainfall at each soil moisture measurement site. The results were encouraging (R = 0.7-0.8). These results show that it is possible to use a calibrated soil moisture model to extrapolate soil moisture to other sites in a catchment with an area of up to 1000 km². This paper demonstrates the potential usefulness of continuous-time, point-scale soil moisture data (typical of that measured by permanently installed TDR probes) in predicting the soil wetness status over a catchment of significant size.
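
    The observed-versus-simulated agreement reported above is a correlation coefficient; a minimal sketch, assuming R denotes the Pearson correlation and using hypothetical storage values, would be:

        # Pearson correlation between observed and Hydrus-simulated soil water storage
        # (the arrays are placeholders, not the Stanley or Krui measurements).
        import numpy as np

        observed = np.array([0.21, 0.24, 0.30, 0.28, 0.22, 0.19])
        simulated = np.array([0.20, 0.25, 0.29, 0.27, 0.23, 0.18])

        r = np.corrcoef(observed, simulated)[0, 1]
        print(f"R = {r:.2f}")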

  12. Why didn't Box-Jenkins win (again)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pack, D.J.; Downing, D.J.

    This paper focuses on the forecasting performance of the Box-Jenkins methodology applied to the 111 time series of the Makridakis competition. It considers the influence of the following factors: (1) time-series length, (2) time-series information (autocorrelation) content, (3) time-series outliers or structural changes, (4) averaging results over time series, and (5) forecast time origin choice. It is found that the 111 time series contain substantial numbers of very short series, series with obvious structural change, and series whose histories are relatively uninformative. If these series are typical of those that one must face in practice, the real message of the competition is that univariate time-series extrapolations will frequently fail regardless of the methodology employed to produce them.
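
    For context, a Box-Jenkins (ARIMA) fit-and-forecast for a single univariate series takes only a few lines. This is a generic sketch using statsmodels, not the competition's protocol, and the series and model order are purely illustrative:

        # Generic Box-Jenkins (ARIMA) fit and extrapolation; series and order are illustrative only.
        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(0)
        y = np.cumsum(rng.normal(size=200))      # placeholder series (a random walk)

        fit = ARIMA(y, order=(1, 1, 1)).fit()    # ARIMA(p, d, q) chosen for illustration
        print(fit.forecast(steps=12))            # 12-step-ahead extrapolation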

  13. Multifractal analysis of visibility graph-based Ito-related connectivity time series.

    PubMed

    Czechowski, Zbigniew; Lovallo, Michele; Telesca, Luciano

    2016-02-01

    In this study, we investigate the multifractal properties of connectivity time series resulting from the visibility graph applied to normally distributed time series generated by the Ito equations with multiplicative power-law noise. We show that the multifractality of the connectivity time series (i.e., the series of the number of links outgoing from each node) increases with the exponent of the power-law noise. The multifractality of the connectivity time series could be due to the width of the connectivity degree distribution, which can be related to the exit time of the associated Ito time series. Furthermore, the connectivity time series are characterized by persistence, although the original Ito time series are random; this is due to the visibility-graph procedure, which, by connecting the values of the time series, generates persistence but destroys most of the nonlinear correlations. Moreover, the visibility graph is sensitive in detecting wide "depressions" in input time series.
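
    For readers unfamiliar with the construction, the sketch below builds the natural visibility graph of a series (two samples are linked if the straight line between them passes above every intermediate sample) and returns the connectivity (degree) series; the O(n^2) double loop is kept for clarity, not efficiency.

        # Natural visibility graph: degree (connectivity) series of a time series.
        import numpy as np

        def visibility_degrees(y):
            """Return the number of visibility links of each sample of y."""
            y = np.asarray(y, dtype=float)
            n = len(y)
            degree = np.zeros(n, dtype=int)
            for a in range(n):
                for b in range(a + 1, n):
                    # (a, b) are linked if every intermediate sample lies strictly below
                    # the straight line joining (a, y[a]) and (b, y[b]).
                    visible = all(
                        y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                        for c in range(a + 1, b)
                    )
                    if visible:
                        degree[a] += 1
                        degree[b] += 1
            return degree

        # Example: connectivity series of a short random series.
        print(visibility_degrees(np.random.default_rng(1).normal(size=10)))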

  14. Progress Report on the Airborne Composition Standard Variable Name and Time Series Working Groups of the 2017 ESDSWG

    NASA Astrophysics Data System (ADS)

    Evans, K. D.; Early, A. B.; Northup, E. A.; Ames, D. P.; Teng, W. L.; Olding, S. W.; Krotkov, N. A.; Arctur, D. K.; Beach, A. L., III; Silverman, M. L.

    2017-12-01

    The role of NASA's Earth Science Data Systems Working Groups (ESDSWG) is to make recommendations relevant to NASA's Earth science data systems based on users' experiences and community insight. Each group works independently, focusing on a unique topic. Progress of two of the 2017 Working Groups will be presented. In a single airborne field campaign, several different instruments and techniques may measure the same parameter on one or more aircraft platforms, and many of the same parameters are measured during different airborne campaigns using similar or different instruments and techniques. The Airborne Composition Standard Variable Name Working Group is creating a list of standard variable names that can be used across all airborne field campaigns in order to assist in the transition to the ICARTT Version 2.0 file format. The overall goal is to enhance the usability of ICARTT files and the searchability of airborne field campaign data. The Time Series Working Group (TSWG) is a continuation of the 2015 and 2016 Time Series Working Groups. In 2015, we started the TSWG with the intention of exploring the new OGC (Open Geospatial Consortium) WaterML 2 standards as a means of encoding point-based time series data from NASA satellites. In that working group, we realized that WaterML 2 might not be the best solution for this type of data, for a number of reasons; our discussions with experts from other agencies, who have worked on similar issues, identified several challenges that would need to be addressed. As a result, we recommended studying the new TimeseriesML 1.0 standard of the OGC as a potential NASA time series standard. The 2016 TSWG closely examined TimeseriesML 1.0 and, in coordination with the OGC TimeseriesML Standards Working Group, identified certain gaps in TimeseriesML 1.0 that would need to be addressed for the standard to be applicable to NASA time series data. An engineering report describing the recommended changes to TimeseriesML 1.0, in the form of use cases, was drafted from the OGC Engineering Report template. In 2017, we are conducting interoperability experiments to implement the use cases and demonstrate the feasibility and suitability of these modifications for NASA and related user communities. The results will be incorporated into the existing draft engineering report.

  15. Progress Report on the Airborne Composition Standard Variable Name and Time Series Working Groups of the 2017 ESDSWG

    NASA Technical Reports Server (NTRS)

    Evans, Keith D.; Early, Amanda; Northup, Emily; Ames, Dan; Teng, William; Arctur, David; Beach, Aubrey; Olding, Steve; Krotkov, Nickolay A.

    2017-01-01

    The role of NASA's Earth Science Data Systems Working Groups (ESDSWG) is to make recommendations relevant to NASA's Earth science data systems based on users' experiences and community insight. Each group works independently, focusing on a unique topic. Progress of two of the 2017 Working Groups will be presented. In a single airborne field campaign, several different instruments and techniques may measure the same parameter on one or more aircraft platforms, and many of the same parameters are measured during different airborne campaigns using similar or different instruments and techniques. The Airborne Composition Standard Variable Name Working Group is creating a list of standard variable names that can be used across all airborne field campaigns in order to assist in the transition to the ICARTT Version 2.0 file format. The overall goal is to enhance the usability of ICARTT files and the searchability of airborne field campaign data. The Time Series Working Group (TSWG) is a continuation of the 2015 and 2016 Time Series Working Groups. In 2015, we started the TSWG with the intention of exploring the new OGC (Open Geospatial Consortium) WaterML 2 standards as a means of encoding point-based time series data from NASA satellites. In that working group, we realized that WaterML 2 might not be the best solution for this type of data, for a number of reasons; our discussions with experts from other agencies, who have worked on similar issues, identified several challenges that would need to be addressed. As a result, we recommended studying the new TimeseriesML 1.0 standard of the OGC as a potential NASA time series standard. The 2016 TSWG closely examined TimeseriesML 1.0 and, in coordination with the OGC TimeseriesML Standards Working Group, identified certain gaps in TimeseriesML 1.0 that would need to be addressed for the standard to be applicable to NASA time series data. An engineering report describing the recommended changes to TimeseriesML 1.0, in the form of use cases, was drafted from the OGC Engineering Report template. In 2017, we are conducting interoperability experiments to implement the use cases and demonstrate the feasibility and suitability of these modifications for NASA and related user communities. The results will be incorporated into the existing draft engineering report.

  16. Landsat Time-Series Analysis Opens New Approaches for Regional Glacier Mapping

    NASA Astrophysics Data System (ADS)

    Winsvold, S. H.; Kääb, A.; Nuth, C.; Altena, B.

    2016-12-01

    The archive of Landsat satellite scenes is important for the mapping of glaciers, especially as it represents the longest-running continuous satellite record of sufficient resolution to track glacier changes over time. Newly launched optical sensors (Landsat 8 and Sentinel-2A) and those upcoming in the near future (Sentinel-2B) will provide very high temporal resolution of optical satellite imagery, especially in high-latitude regions. Because of the potential that lies within such near-future dense time series, methods for mapping glaciers from space should be revisited. We present application scenarios that utilize and explore dense time series of optical data for automatic mapping of glacier outlines and glacier facies. Throughout the season, glaciers display a temporal sequence of properties in optical reflection as the seasonal snow melts away, glacier ice appears in the ablation area, and firn appears in the accumulation area. In one application scenario, we simulated potential future seasonal resolution using several years of Landsat 5 TM/7 ETM+ data and found a sinusoidal evolution of the spectral reflectance of on-glacier pixels throughout a year. We believe this is because of the shortwave infrared band and its sensitivity to snow grain size. The parameters retrieved from the fitted sine curve can be used for glacier mapping purposes; we found similar results using, e.g., the mean of summer band-ratio images. In individual optical mapping scenes, conditions (e.g., snow, ice, and clouds) will vary and will not be equally optimal over the entire scene. Using robust statistics on stacked pixels reveals a potential for synthesizing optimal mapping scenes from a temporal stack, as we present in a further application scenario. The dense time series available from satellite imagery will also promote multi-temporal and multi-sensor analyses. The seasonal pattern of snow and ice on a glacier seen in the optical time series can, in the summer season, also be observed using radar backscatter series. Optical sensors reveal the reflective properties at the surface, while radar sensors may penetrate the surface, revealing properties of a certain volume. In an outlook to this contribution, we explore how information from SAR and optical sensor systems can be combined for different purposes.
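
    The sinusoidal fit described above can be reproduced generically. In the sketch below, the acquisition days and reflectances are invented for a single hypothetical on-glacier pixel, and the model is a plain one-year sinusoid fitted by nonlinear least squares:

        # Fitting a one-year sinusoid to per-pixel reflectance samples (illustrative data only).
        import numpy as np
        from scipy.optimize import curve_fit

        def seasonal_model(doy, amplitude, phase, offset):
            """One-year sinusoid as a function of day of year."""
            return amplitude * np.sin(2 * np.pi * doy / 365.25 + phase) + offset

        # Hypothetical acquisition days and shortwave-infrared reflectances for one pixel.
        doy = np.array([30, 75, 120, 165, 210, 255, 300, 345], dtype=float)
        reflectance = np.array([0.55, 0.60, 0.48, 0.35, 0.22, 0.28, 0.45, 0.53])

        (amplitude, phase, offset), _ = curve_fit(seasonal_model, doy, reflectance,
                                                  p0=[0.2, 0.0, 0.4])
        print(f"amplitude={amplitude:.3f}, phase={phase:.3f}, offset={offset:.3f}")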

  17. Impacts of GNSS position offsets on global frame stability

    NASA Astrophysics Data System (ADS)

    Griffiths, Jake; Ray, Jim

    2015-04-01

    Positional offsets appear in Global Navigation Satellite System (GNSS) time series for a variety of reasons. Antenna or radome changes are the most common cause for these discontinuities. Many others are from earthquakes, receiver changes, and different anthropogenic modifications at or near the stations. Some jumps appear for unknown or undocumented reasons. Accurate determination of station velocities, and therefore geophysical parameters and terrestrial reference frames, requires that positional offsets be correctly found and compensated. Williams (2003) found that undetected offsets introduce a random walk error component in individual station time series. The topic of detecting positional offsets has received considerable attention in recent years (e.g., Detection of Offsets in GPS Experiment; DOGEx), and most research groups using GNSS have adopted a mix of manual and automated methods for finding them. The removal of a positional offset from a time series is usually handled by estimating the average station position on both sides of the discontinuity. Except for large earthquake events, the velocity is usually assumed constant and continuous across the positional jump. This approach is sufficient in the absence of time-correlated errors. However, GNSS time series contain periodic and power-law (flicker) errors. In this paper, we evaluate the impact to individual station results and the overall stability of the global reference frame from adding increasing numbers of positional discontinuities. We use the International GNSS Service (IGS) weekly SINEX files, and iteratively insert positional offset parameters. Each iteration includes a restacking of the modified SINEX files using the CATREF software from Institut National de l'Information Géographique et Forestière (IGN). Comparisons of successive stacked solutions are used to assess the impacts on the time series of x-pole and y-pole offsets, along with changes in regularized position and secular velocity for stations with more than 2.5 years of data. Our preliminary results indicate that the change in polar motion scatter is logarithmic with increasing numbers of discontinuities. The best-fit natural logarithm to the changes in scatter for x-pole has R2 = 0.58; the fit for the y-pole series has R2 = 0.99. From these empirical functions, we find that polar motion scatter increases from zero when the total rate of discontinuities exceeds 0.2 (x-pole) and 1.3 (y-pole) per station, on average (the IGS has 0.65 per station). Thus, the presence of position offsets in GNSS station time series is likely already a contributor to IGS polar motion inaccuracy and global frame instability. Impacts to station position and velocity estimates depend on noise features found in that station's positional time series. For instance, larger changes in velocity occur for stations with shorter and noisier data spans. This is because an added discontinuity parameter for an individual station time series can induce changes in average position on both sides of the break. We will expand on these results, and consider remaining questions about the role of velocity discontinuities and the effects caused by non-core reference frame stations.
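
    The logarithmic dependence described above (change in polar motion scatter versus the rate of added discontinuities) can be fitted generically as y = a*ln(x) + b; the data points below are placeholders, not the IGS results.

        # Fit y = a*ln(x) + b and report R^2 (placeholder data, not the IGS values).
        import numpy as np

        x = np.array([0.1, 0.2, 0.4, 0.8, 1.6, 3.2])        # discontinuities per station (hypothetical)
        y = np.array([0.00, 0.05, 0.11, 0.16, 0.20, 0.26])  # change in scatter (hypothetical)

        a, b = np.polyfit(np.log(x), y, 1)                  # linear least squares in ln(x)
        y_hat = a * np.log(x) + b
        r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
        print(f"y = {a:.3f}*ln(x) + {b:.3f}, R^2 = {r2:.2f}")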

  18. Viscoplastic Characterization of Ti-6-4: Experiments

    NASA Technical Reports Server (NTRS)

    Lerch, Bradley A.; Arnold, Steven M.

    2016-01-01

    As part of a continued effort to improve the understanding of material time-dependent response, a series of mechanical tests have been conducted on the titanium alloy, Ti-6Al-4V. Tensile, creep, and stress relaxation tests were performed over a wide range of temperatures and strain rates to engage various amounts of time-dependent behavior. Additional tests were conducted that involved loading steps, overloads, dwell periods, and block loading segments to characterize the interaction between plasticity and time-dependent behavior. These data will be used to characterize a recently developed, viscoelastoplastic constitutive model with a goal toward better estimates of aerospace component behavior, resulting in improved safety.

  19. Introduction to focus issue: Synchronization in large networks and continuous media—data, models, and supermodels

    NASA Astrophysics Data System (ADS)

    Duane, Gregory S.; Grabow, Carsten; Selten, Frank; Ghil, Michael

    2017-12-01

    The synchronization of loosely coupled chaotic systems has increasingly found applications to large networks of differential equations and to models of continuous media. These applications are at the core of the present Focus Issue. Synchronization between a system and its model, based on limited observations, gives a new perspective on data assimilation. Synchronization among different models of the same system defines a supermodel that can achieve partial consensus among models that otherwise disagree in several respects. Finally, novel methods of time series analysis permit a better description of synchronization in a system that is only observed partially and for a relatively short time. This Focus Issue discusses synchronization in extended systems or in components thereof, with particular attention to data assimilation, supermodeling, and their applications to various areas, from climate modeling to macroeconomics.
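
    As a toy illustration of synchronization-based data assimilation (not drawn from the Focus Issue itself), a "model" Lorenz-63 system can be nudged toward a "truth" system using only the observed x component; the coupling strength and integration scheme below are chosen for simplicity.

        # Toy synchronization / nudging example with two Lorenz-63 systems (illustrative settings).
        import numpy as np

        def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
            x, y, z = state
            return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

        dt, k = 0.01, 10.0                      # Euler time step and nudging (coupling) strength
        truth = np.array([1.0, 1.0, 1.0])
        model = np.array([-5.0, 5.0, 20.0])     # the model starts far from the truth

        for _ in range(5000):
            truth = truth + dt * lorenz(truth)
            # Nudge the model toward the truth using only the observed x component.
            nudge = np.array([k * (truth[0] - model[0]), 0.0, 0.0])
            model = model + dt * (lorenz(model) + nudge)

        print("final state difference:", np.abs(truth - model))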

  20. Introduction to focus issue: Synchronization in large networks and continuous media-data, models, and supermodels.

    PubMed

    Duane, Gregory S; Grabow, Carsten; Selten, Frank; Ghil, Michael

    2017-12-01

    The synchronization of loosely coupled chaotic systems has increasingly found applications to large networks of differential equations and to models of continuous media. These applications are at the core of the present Focus Issue. Synchronization between a system and its model, based on limited observations, gives a new perspective on data assimilation. Synchronization among different models of the same system defines a supermodel that can achieve partial consensus among models that otherwise disagree in several respects. Finally, novel methods of time series analysis permit a better description of synchronization in a system that is only observed partially and for a relatively short time. This Focus Issue discusses synchronization in extended systems or in components thereof, with particular attention to data assimilation, supermodeling, and their applications to various areas, from climate modeling to macroeconomics.
